Zihang Jiang
Verified email at u.nus.edu
Title · Cited by · Year
Tokens-to-Token ViT: Training Vision Transformers from Scratch on ImageNet
L Yuan, Y Chen, T Wang, W Yu, Y Shi, Z Jiang, FEH Tay, J Feng, S Yan
arXiv preprint arXiv:2101.11986, 2021
96 · 2021
Disentangled representation learning for 3D face shape
ZH Jiang, Q Wu, K Chen, J Zhang
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2019
38 · 2019
Joint 3D face reconstruction and dense face alignment from a single image with 2D-assisted self-supervised learning
X Tu, J Zhao, Z Jiang, Y Luo, M Xie, Y Zhao, L He, Z Ma, J Feng
arXiv preprint arXiv:1903.09359, 2019
31* · 2019
ReClor: A reading comprehension dataset requiring logical reasoning
W Yu, Z Jiang, Y Dong, J Feng
arXiv preprint arXiv:2002.04326, 2020
23 · 2020
DeepViT: Towards Deeper Vision Transformer
D Zhou, B Kang, X Jin, L Yang, X Lian, Z Jiang, Q Hou, J Feng
arXiv preprint arXiv:2103.11886, 2021
22 · 2021
ConvBERT: Improving BERT with span-based dynamic convolution
Z Jiang, W Yu, D Zhou, Y Chen, J Feng, S Yan
arXiv preprint arXiv:2008.02496, 2020
21 · 2020
All Tokens Matter: Token Labeling for Training Better Vision Transformers
Z Jiang, Q Hou, L Yuan, D Zhou, Y Shi, X Jin, A Wang, J Feng
arXiv preprint arXiv:2104.10858, 2021
11* · 2021
VOLO: Vision Outlooker for Visual Recognition
L Yuan, Q Hou, Z Jiang, J Feng, S Yan
arXiv preprint arXiv:2106.13112, 2021
2 · 2021
Vision Permutator: A permutable MLP-like architecture for visual recognition
Q Hou, Z Jiang, L Yuan, MM Cheng, S Yan, J Feng
arXiv preprint arXiv:2106.12368, 2021
2 · 2021
Refiner: Refining Self-attention for Vision Transformers
D Zhou, Y Shi, B Kang, W Yu, Z Jiang, Y Li, X Jin, Q Hou, J Feng
arXiv preprint arXiv:2106.03714, 2021
2 · 2021
Few-shot classification via adaptive attention
Z Jiang, B Kang, K Zhou, J Feng
arXiv preprint arXiv:2008.02465, 2020
2 · 2020
LV-BERT: Exploiting Layer Variety for BERT
W Yu, Z Jiang, F Chen, Q Hou, J Feng
arXiv preprint arXiv:2106.11740, 2021
2021