Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

Other publications by persons with the same name

Learning Tensor Latent Features., , , , and . CoRR, (2018)

Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm., , , , , , , , and . CoRR, (2021)

FILM-QNN: Efficient FPGA Acceleration of Deep Neural Networks with Intra-Layer, Mixed-Precision Quantization., , , , , , , and . FPGA, pp. 134-145. ACM, (2022)

MixLasso: Generalized Mixed Regression via Convex Atomic-Norm Regularization., , , , , and . NeurIPS, pp. 10891-10899. (2018)

Hardware-efficient stochastic rounding unit design for DNN training: late breaking results., , , , , , , , , and 2 other author(s). DAC, pp. 1396-1397. ACM, (2022)

Latent Feature Lasso., , , , , and . ICML, vol. 70 of Proceedings of Machine Learning Research, pp. 3949-3957. PMLR, (2017)

MSP: An FPGA-Specific Mixed-Scheme, Multi-Precision Deep Neural Network Quantization Framework., , , , , , and . CoRR, (2020)

Mix and Match: A Novel FPGA-Centric Deep Neural Network Quantization Framework., , , , , , , and . CoRR, (2020)

ESRU: Extremely Low-Bit and Hardware-Efficient Stochastic Rounding Unit Design for Low-Bit DNN Training., , , , , , , , , and 2 other author(s). DATE, pp. 1-6. IEEE, (2023)

You Already Have It: A Generator-Free Low-Precision DNN Training Framework Using Stochastic Rounding., , , , , , , , , and 4 other author(s). ECCV (12), vol. 13672 of Lecture Notes in Computer Science, pp. 34-51. Springer, (2022)