Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it being deployed in practical applications? Besides the obvious ones (recommendation systems at Pinterest, Alibaba, and Twitter), a slightly nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.