Author of the publication

Human control redressed: Comparing AI and human predictability in a real-effort task.

, , , , , , and . CogSci, cognitivesciencesociety.org (2023)

Please choose a person to relate this publication to.

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed. You can also use the button next to a name to display some publications already assigned to that person.


Other publications of authors with the same name

Transformers are Sample Efficient World Models., , and . CoRR (2022)

Human control redressed: Comparing AI and human predictability in a real-effort task., , , , , , and . CogSci, cognitivesciencesociety.org (2023)

MineRL Diamond 2021 Competition: Overview, Results, and Lessons Learned., , , , , , , , , and 12 other author(s). NeurIPS (Competition and Demos), volume 176 of Proceedings of Machine Learning Research, pages 13-28. PMLR (2021)

Multi-task Reinforcement Learning with a Planning Quasi-Metric., , and . CoRR (2020)

MineRL Diamond 2021 Competition: Overview, Results, and Lessons Learned., , , , , , , , , and 12 other author(s). CoRR (2022)

Structural analysis of an all-purpose question answering model., , , and . CoRR (2021)

On the importance of pre-training data volume for compact language models., , and . EMNLP (1), pages 7853-7858. Association for Computational Linguistics (2020)

Language Models are Few-Shot Butlers., and . EMNLP (1), pages 9312-9318. Association for Computational Linguistics (2021)

Transformers are Sample-Efficient World Models., , and . ICLR, OpenReview.net (2023)