Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it being deployed in practical applications? Besides the obvious ones (recommendation systems at Pinterest, Alibaba, and Twitter), a slightly nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.
We introduce the Generative Query Network (GQN), a framework within which machines learn to perceive their surroundings by training only on data obtained by themselves as they move around scenes. Much like infants and animals, the GQN learns by trying to make sense of its observations of the world around it. In doing so, the GQN learns about plausible scenes and their geometrical properties, without any human labelling of the contents of scenes.
NOTE: These videos were recorded in Fall 2015 to update the Neural Nets portion of the class. MIT 6.034 Artificial Intelligence, Fall 2010. View the complete...
In this project, we provide our implementations of CNN [Zeng et al., 2014] and PCNN [Zeng et al., 2015] and their extended versions with the sentence-level attention scheme [Lin et al., 2016].
What are Convolutional Neural Networks and why are they important? Convolutional Neural Networks (ConvNets or CNNs) are a category of Neural Networks that have proven very effective in areas such as image recognition and classification. ConvNets have been successful in identifying faces, objects, and traffic signs, apart from powering vision in robots and self-driving cars.
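The core operation that gives ConvNets their name can be sketched in a few lines of plain Python: slide a small kernel over an image and sum the elementwise products at each position. The `conv2d` helper and the vertical-edge kernel below are illustrative, not taken from the article above.

```python
# Minimal 2-D convolution (valid padding, stride 1), the building
# block of a ConvNet layer. Images and kernels are lists of lists.

def conv2d(image, kernel):
    """Slide `kernel` over `image`, summing elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(
                image[i + m][j + n] * kernel[m][n]
                for m in range(kh)
                for n in range(kw)
            )
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A tiny image with a vertical edge: left half dark (0), right half bright (1).
image = [[0, 0, 0, 1, 1, 1] for _ in range(3)]

# A Sobel-like kernel that responds to vertical edges.
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]

feature_map = conv2d(image, kernel)  # peaks where the edge is: [[0, 3, 3, 0]]
```

In a real ConvNet, many such kernels are learned from data rather than hand-crafted, and each one produces its own feature map.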
TensorFlow™ is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well.
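The nodes-as-operations, edges-as-tensors model described above can be illustrated with a tiny hand-rolled data-flow graph. The `Node` class and `run` method below are a conceptual sketch of the idea, not the actual TensorFlow API; in the sketch the edges carry plain numbers rather than multidimensional arrays.

```python
# Conceptual sketch of a data-flow graph: nodes are operations,
# edges connect each node to the upstream nodes producing its inputs.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # function computing this node's value
        self.inputs = inputs  # edges from upstream nodes

    def run(self):
        # Evaluate upstream nodes first, then apply this node's op.
        return self.op(*(n.run() for n in self.inputs))

def const(value):
    """A leaf node that always produces `value`."""
    return Node(lambda: value)

# Build the graph for c = a * b + a; nothing is computed yet.
a = const(3.0)
b = const(4.0)
mul = Node(lambda x, y: x * y, a, b)
add = Node(lambda x, y: x + y, mul, a)

result = add.run()  # evaluates the graph: 3.0 * 4.0 + 3.0 = 15.0
```

Separating graph construction from execution is what lets a system like TensorFlow optimize the graph and dispatch the same computation to CPUs, GPUs, or mobile devices.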
X. Hu, W. Liu, J. Bian, and J. Pei. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1521--1531. (2020)
Q. Le and T. Mikolov. Proceedings of the 31st International Conference on Machine Learning, volume 32 of Proceedings of Machine Learning Research, pp. 1188--1196. Beijing, China, PMLR, (June 2014)
Y. Liu, A. Ganguly, and J. Dy. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 3145--3153. New York, NY, USA, ACM, (2020)
Z. Zhang, S. Liu, M. Li, M. Zhou, and E. Chen. Proceedings of the 22nd Conference on Computational Natural Language Learning, CoNLL 2018, Brussels, Belgium, October 31 - November 1, 2018, pp. 190--199. Association for Computational Linguistics, (2018)
M. Valdenegro-Toro, O. Arriaga, and P. Plöger. 27th European Symposium on Artificial Neural Networks, ESANN 2019, Bruges, Belgium, April 24-26, 2019, (2019)
A. Kadra, M. Lindauer, F. Hutter, and J. Grabocka. Proceedings of the International Conference on Neural Information Processing Systems (NeurIPS), (December 2021)
M. Benjak, Y. Samayoa, and J. Ostermann. Proceedings of the 28th IEEE International Conference on Image Processing (ICIP), (September 2021). Accepted for publication.
S. Wang, L. Hu, Y. Wang, X. He, Q. Sheng, M. Orgun, L. Cao, F. Ricci, and P. Yu. (2021). arXiv:2105.06339. Comment: Accepted by IJCAI 2021 Survey Track; copyright is owned by IJCAI. The first systematic survey on graph learning based recommender systems. arXiv admin note: text overlap with arXiv:2004.11718.
M. Paris and R. Jäschke. Proceedings of the 14th International Conference on Knowledge Science, Engineering and Management, volume 12816 of Lecture Notes in Artificial Intelligence, pp. 1--14. Springer, (2021)
M. Dacrema, P. Cremonesi, and D. Jannach. (2019). arXiv:1907.06902. Comment: Source code available at https://github.com/MaurizioFD/RecSys2019_DeepLearning_Evaluation.
X. He, L. Liao, H. Zhang, L. Nie, X. Hu, and T. Chua. Proceedings of the 26th International Conference on World Wide Web, pp. 173--182. Republic and Canton of Geneva, CHE, International World Wide Web Conferences Steering Committee, (2017)
P. Xia, S. Wu, and B. Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 7516--7533. Association for Computational Linguistics, (November 2020)