Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it deployed in practical applications? Besides the obvious ones (recommendation systems at Pinterest, Alibaba, and Twitter), a slightly nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. In this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how the two communities could work together to drive progress.