TL;DR: Have you ever wondered what is so special about convolution? In this post, I derive convolution from first principles and show that it emerges naturally from translational symmetry. During…
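The claimed link between convolution and translational symmetry can be checked numerically: circular convolution commutes with cyclic shifts (shift-equivariance). Below is a minimal NumPy sketch; the helper names `shift` and `circular_conv` are illustrative, not from the post.

```python
import numpy as np

def shift(x, s):
    # Cyclic translation of a signal by s positions.
    return np.roll(x, s)

def circular_conv(x, w):
    # Circular convolution of signal x with filter w (same length).
    n = len(x)
    return np.array([sum(x[(i - j) % n] * w[j] for j in range(n))
                     for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 0.0])
w = np.array([0.5, 0.25, 0.0, 0.0])

# Shift-then-convolve equals convolve-then-shift:
lhs = circular_conv(shift(x, 1), w)
rhs = shift(circular_conv(x, w), 1)
assert np.allclose(lhs, rhs)
```

Any operator with this equivariance property on a cyclic domain is (up to choice of filter) a circular convolution, which is the direction the post's derivation takes.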
Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it being deployed in practical applications? Besides the obvious ones (recommendation systems at Pinterest, Alibaba, and Twitter), a slightly more nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.
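One way to preview the connection: a self-attention layer can be read as message passing on a fully connected graph, with the attention matrix acting as a soft adjacency. A minimal single-head NumPy sketch (no masking, multi-head splitting, or positional encodings; all names are illustrative):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention_as_message_passing(H, Wq, Wk, Wv):
    # One self-attention layer viewed as a GNN update on a fully
    # connected graph: every token (node) receives messages from every
    # other token, weighted by normalized attention scores.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise "edge" logits
    A = softmax(scores)                      # soft adjacency: rows sum to 1
    return A @ V                             # weighted message aggregation

rng = np.random.default_rng(0)
n, d = 4, 8                                  # 4 tokens (nodes), hidden size 8
H = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = attention_as_message_passing(H, Wq, Wk, Wv)
```

Restricting the attention pattern to a sparse neighborhood recovers the usual GNN setting, which is the bridge the post develops.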
A full-stack GraphQL tutorial that takes you from zero to production, covering all the basic and advanced concepts. Includes tutorials for Apollo, Relay, React, and Node.js.
If you use the code, please cite the following paper:
Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu. Learning Entity and Relation Embeddings for Knowledge Graph Completion. The 29th AAAI Conference on Artificial Intelligence (AAAI'15).
P. Chapman, G. Stapleton, J. Howse, and I. Oliver. 2011 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), pages 87–94, September 2011.
X. Wang and M. Zhang. Proceedings of the 39th International Conference on Machine Learning, volume 162 of Proceedings of Machine Learning Research, pages 23341–23362. PMLR, 17–23 July 2022.
J. Feng, Y. Chen, F. Li, A. Sarkar, and M. Zhang. Advances in Neural Information Processing Systems, volume 35, pages 4776–4790. Curran Associates, Inc., 2022.