
Never-ending language learning

2014 IEEE International Conference on Big Data (Big Data), pp. 1–1, October 2014.
DOI: 10.1109/BigData.2014.7004203

Abstract

Summary form only given. We will never really understand learning until we can build machines that learn many different things, over years, and become better learners over time. We describe our research to build a Never-Ending Language Learner (NELL) that runs 24 hours per day, forever, learning to read the web. Each day NELL extracts (reads) more facts from the web into its growing knowledge base of beliefs. Each day NELL also learns to read better than the day before. NELL has been running 24 hours/day for over four years now. The result so far is a collection of 70 million interconnected beliefs (e.g., servedWith(coffee, applePie)) that NELL is considering at different levels of confidence, along with millions of learned phrasings, morphological features, and web page structures that NELL uses to extract beliefs from the web. NELL is also learning to reason over its extracted knowledge, and to automatically extend its ontology. Track NELL's progress at http://rtw.ml.cmu.edu, or follow it on Twitter at @CMUNELL.
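The abstract describes NELL's knowledge base as a collection of interconnected beliefs such as servedWith(coffee, applePie), each held at some level of confidence. The sketch below is a minimal, hypothetical illustration of that representation as predicate(subject, object) triples with confidence scores; the class names and the promotion threshold are assumptions made for illustration, not NELL's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of a NELL-style belief store: each belief is a
# predicate(subject, object) triple held at some confidence level.
# The names and the 0.9 promotion threshold are illustrative assumptions,
# not taken from NELL's actual code.

@dataclass(frozen=True)
class Belief:
    predicate: str   # e.g. "servedWith"
    subject: str     # e.g. "coffee"
    obj: str         # e.g. "applePie"

class KnowledgeBase:
    def __init__(self, promotion_threshold: float = 0.9):
        self.confidence: dict[Belief, float] = {}
        self.promotion_threshold = promotion_threshold

    def consider(self, belief: Belief, confidence: float) -> None:
        """Record a candidate belief, keeping the highest confidence seen so far."""
        current = self.confidence.get(belief, 0.0)
        self.confidence[belief] = max(current, confidence)

    def promoted(self) -> list[Belief]:
        """Return beliefs whose confidence meets the promotion threshold."""
        return [b for b, c in self.confidence.items()
                if c >= self.promotion_threshold]

if __name__ == "__main__":
    kb = KnowledgeBase()
    kb.consider(Belief("servedWith", "coffee", "applePie"), 0.95)
    kb.consider(Belief("cityInCountry", "pittsburgh", "usa"), 0.6)
    print(kb.promoted())  # only the high-confidence servedWith belief is promoted
```

In this toy version, low-confidence candidate beliefs are retained but only high-confidence ones are "promoted", loosely mirroring the abstract's point that NELL considers beliefs at different levels of confidence.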

