Author of the publication

Modeling and Evaluation of Synchronous Stochastic Gradient Descent in Distributed Deep Learning on Multiple GPUs.

, , , and . CoRR, (2018)


Other publications of authors with the same name

Decoupling the All-Reduce Primitive for Accelerating Distributed Deep Learning., , , , , and . CoRR, (2023)
FADNet++: Real-Time and Accurate Disparity Estimation with Configurable Networks., , , , and . CoRR, (2021)
An Efficient Split Fine-tuning Framework for Edge and Cloud Collaborative Learning., , , , and . CoRR, (2022)
Scalable K-FAC Training for Deep Neural Networks with Distributed Preconditioning., , , and . CoRR, (2022)
A Distributed Synchronous SGD Algorithm with Global Top-k Sparsification for Low Bandwidth Networks., , , , , , and . ICDCS, page 2238-2247. IEEE, (2019)
Benchmarking the Performance and Energy Efficiency of AI Accelerators for AI Training., , , , , , and . CCGRID, page 744-751. IEEE, (2020)
Supervised Learning Based Algorithm Selection for Deep Neural Networks., , and . ICPADS, page 344-351. IEEE Computer Society, (2017)
MG-WFBP: Efficient Data Communication for Distributed Synchronous SGD Algorithms., , and . INFOCOM, page 172-180. IEEE, (2019)
MG-WFBP: Merging Gradients Wisely for Efficient Communication in Distributed Deep Learning., , and . IEEE Trans. Parallel Distributed Syst., 32 (8): 1903-1917 (2021)
MG-WFBP: Efficient Data Communication for Distributed Synchronous SGD Algorithms., and . CoRR, (2018)