Author of the publication

ROPUST: Improving Robustness through Fine-tuning with Photonic Processors and Synthetic Gradients. , , , , and . CoRR, (2021)


Other publications of authors with the same name

- What Language Model Architecture and Pretraining Objective Work Best for Zero-Shot Generalization?, , , , , , , and . CoRR, (2022)
- Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures., , , and . NeurIPS, (2020)
- Is the Number of Trainable Parameters All That Actually Matters?, , , and . ICBINB@NeurIPS, volume 163 of Proceedings of Machine Learning Research, pages 27-32. PMLR, (2021)
- Energy-Aware Resources in Digital Twin: The Case of Injection Moulding Machines., , , , , , , and . SOHOMA, volume 853 of Studies in Computational Intelligence, pages 183-194. Springer, (2019)
- Is the Number of Trainable Parameters All That Actually Matters?, , , , and . CoRR, (2021)
- Scaling Laws Beyond Backpropagation., , , and . CoRR, (2022)
- BLOOM: A 176B-Parameter Open-Access Multilingual Language Model., , , , , , , , , and 39 other author(s). CoRR, (2022)
- Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment., , , , , , , and . CoRR, (2020)
- What Language Model Architecture and Pretraining Objective Works Best for Zero-Shot Generalization?, , , , , , , and . ICML, volume 162 of Proceedings of Machine Learning Research, pages 22964-22984. PMLR, (2022)
- Light-in-the-loop: using a photonics co-processor for scalable training of neural networks., , , , , , and . CoRR, (2020)