Abstract

When we were invited to write a retrospective article about our AAAI-99 paper on mutual bootstrapping, our first reaction was hesitation because, well, that algorithm seems old and clunky now. But upon reflection, it shaped a great deal of subsequent work on bootstrapped learning for natural language processing, both by ourselves and by others. So our second reaction was enthusiasm for the opportunity to think about the path from 1999 to 2017 and to share the lessons we learned about bootstrapped learning along the way. This article begins with a brief history of the related research that preceded and inspired the mutual bootstrapping work, to situate it in its time. We then describe the general ideas and approach behind the mutual bootstrapping algorithm. Next, we survey several lines of later research that share similar themes: multi-view learning, bootstrapped lexicon induction, and bootstrapped pattern learning. Finally, we discuss some of the general lessons we have learned about bootstrapping techniques for NLP, to offer guidance to researchers and practitioners who may be interested in exploring these techniques in their own work.
