Huan Wang
Westlake University
Verified email at westlake.edu.cn - Homepage
Title · Cited by · Year
MNN: A universal and efficient inference engine
X Jiang, H Wang, Y Chen, Z Wu, L Wang, B Zou, Y Yang, Z Cui, Y Cai, ...
MLSys'20, Code: https://github.com/alibaba/MNN, 2020
177 · 2020
Neural pruning via growing regularization
H Wang, C Qin, Y Zhang, Y Fu
ICLR'21, Code: https://github.com/MingSun-Tse/Regularization-Pruning, 2021
162 · 2021
Collaborative distillation for ultra-resolution universal style transfer
H Wang, Y Li, Y Wang, H Hu, MH Yang
CVPR'20, Code: https://github.com/MingSun-Tse/Collaborative-Distillation, 2020
119 · 2020
Recent Advances on Neural Network Pruning at Initialization
H Wang, C Qin, Y Bai, Y Zhang, Y Fu
IJCAI'22, 2022
111* · 2022
SnapFusion: Text-to-Image Diffusion Model on Mobile Devices within Two Seconds
Y Li*, H Wang*, Q Jin*, J Hu, P Chemerys, Y Fu, Y Wang, S Tulyakov, ...
NeurIPS'23, Project: https://snap-research.github.io/SnapFusion, 2023
106 · 2023
Context reasoning attention network for image super-resolution
Y Zhang, D Wei, C Qin, H Wang, H Pfister, Y Fu
ICCV'21, 4278-4287, 2021
85 · 2021
Structured probabilistic pruning for deep convolutional neural network acceleration
H Wang, Q Zhang, Y Wang, H Hu
BMVC'18 (Oral), Code: https://github.com/MingSun-Tse/Caffe_IncReg, 2018
84* · 2018
R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
H Wang, J Ren, Z Huang, K Olszewski, M Chai, Y Fu, S Tulyakov
ECCV'22, Project: https://snap-research.github.io/R2L, 2022
79 · 2022
Structured Pruning for Efficient ConvNets via Incremental Regularization
H Wang, Q Zhang, Y Wang, L Yu, H Hu
NeurIPS Workshop'18, IJCNN'19 (Oral), Code: https://github.com/MingSun-Tse …, 2019
68* · 2019
Image as Set of Points
X Ma, Y Zhou, H Wang, C Qin, B Sun, C Liu, Y Fu
ICLR'23 (Oral, top 5%), Code: https://github.com/ma-xu/Context-Cluster, 2023
62 · 2023
Contradictory Structure Learning for Semi-supervised Domain Adaptation
C Qin, L Wang, Q Ma, Y Yin, H Wang, Y Fu
SIAM International Conference on Data Mining (SDM), 576-584, 2021
59 · 2021
What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective
H Wang, S Lohit, M Jones, Y Fu
NeurIPS'22, Code: https://github.com/MingSun-Tse/Good-DA-in-KD, 2022
58* · 2022
Aligned Structured Sparsity Learning for Efficient Image Super-Resolution
H Wang*, Y Zhang*, C Qin, Y Fu
NeurIPS'21 (Spotlight), Code: https://github.com/MingSun-Tse/ASSL, 2021
58 · 2021
Triplet distillation for deep face recognition
Y Feng, H Wang, R Hu, DT Yi
ICML'19 Workshop, 2019
56 · 2019
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning
H Wang*, Y Zhang*, C Qin, Y Fu
ICLR'22, Code: https://github.com/MingSun-Tse/SRP, 2022
54 · 2022
Real-Time Neural Light Field on Mobile Devices
J Cao, H Wang, P Chemerys, V Shakhrai, J Hu, Y Fu, D Makoviichuk, ...
CVPR'23, Project: https://snap-research.github.io/MobileR2L/, 2023
49 · 2023
Dual Lottery Ticket Hypothesis
Y Bai, H Wang, Z Tao, K Li, Y Fu
ICLR'22, Code: https://github.com/yueb17/DLTH, 2022
43 · 2022
Structured pruning for efficient convolutional neural networks via incremental regularization
H Wang, X Hu, Q Zhang, Y Wang, L Yu, H Hu
IEEE Journal of Selected Topics in Signal Processing 14 (4), 775-788, 2019
41 · 2019
Trainability preserving neural pruning
H Wang, Y Fu
ICLR'23, Code: https://github.com/MingSun-Tse/TPP, 2023
35 · 2023
Semi-supervised Domain Adaptive Structure Learning
C Qin, L Wang, Q Ma, Y Yin, H Wang, Y Fu
TIP'22, 2022
24 · 2022
Articles 1–20