A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks

Kazuma Hashimoto, Caiming Xiong, Yoshimasa Tsuruoka, and Richard Socher.
(2016). cite arxiv:1611.01587. Comment: Accepted as a full paper at the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017).

Abstract

Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks. Ideally, the linguistic levels of morphology, syntax, and semantics would benefit each other by being trained in a single model. We introduce a joint many-task model together with a strategy for successively growing its depth to solve increasingly complex tasks. Higher layers include shortcut connections to lower-level task predictions to reflect linguistic hierarchies. We use a simple regularization term that allows all model weights to be optimized for one task's loss without catastrophic interference with the other tasks. Our single end-to-end model obtains state-of-the-art or competitive results on five different tasks spanning tagging, parsing, relatedness, and entailment.
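To make the abstract's architecture concrete, below is a minimal PyTorch sketch of two of the five tasks (a lower tagging layer feeding a higher chunking layer). The class TwoTaskSketch, the helper successive_reg, the layer sizes, and the soft-label shortcut are illustrative assumptions, not the paper's exact implementation; the sketch only shows the two ideas named in the abstract: shortcut connections that pass lower-level task predictions upward, and an L2 "successive regularization" term that pulls weights toward their pre-update values so training one task does not catastrophically interfere with the others.

    import torch
    import torch.nn as nn

    class TwoTaskSketch(nn.Module):
        def __init__(self, vocab_size=1000, emb_dim=50, hidden=64,
                     n_pos=17, n_chunk=9):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Layer 1: bi-LSTM for the lower-level task (e.g. POS tagging).
            self.pos_lstm = nn.LSTM(emb_dim, hidden, bidirectional=True,
                                    batch_first=True)
            self.pos_head = nn.Linear(2 * hidden, n_pos)
            # Label embedding lets the next layer consume soft POS predictions.
            self.pos_label_emb = nn.Linear(n_pos, emb_dim, bias=False)
            # Layer 2 sees the word embedding again (shortcut connection),
            # the layer-1 hidden states, and the embedded POS prediction.
            self.chunk_lstm = nn.LSTM(emb_dim + 2 * hidden + emb_dim, hidden,
                                      bidirectional=True, batch_first=True)
            self.chunk_head = nn.Linear(2 * hidden, n_chunk)

        def forward(self, tokens):
            x = self.embed(tokens)                      # (B, T, emb_dim)
            h_pos, _ = self.pos_lstm(x)                 # (B, T, 2*hidden)
            pos_logits = self.pos_head(h_pos)
            pos_soft = torch.softmax(pos_logits, dim=-1)
            pos_feat = self.pos_label_emb(pos_soft)     # embedded prediction
            # Shortcut connections: concatenate the raw embeddings, the
            # lower layer's states, and the lower task's label embedding.
            h_chunk, _ = self.chunk_lstm(
                torch.cat([x, h_pos, pos_feat], dim=-1))
            return pos_logits, self.chunk_head(h_chunk)

    def successive_reg(model, prev_params, delta=1e-2):
        # L2 penalty toward the parameter values saved before the current
        # task's updates; this is the "simple regularization term" role.
        reg = 0.0
        for name, p in model.named_parameters():
            reg = reg + (p - prev_params[name]).pow(2).sum()
        return delta * reg

    model = TwoTaskSketch()
    prev = {n: p.detach().clone() for n, p in model.named_parameters()}
    tokens = torch.randint(0, 1000, (2, 7))             # toy batch
    pos_logits, chunk_logits = model(tokens)
    targets = torch.randint(0, 9, (2, 7))               # toy chunk labels
    loss = nn.functional.cross_entropy(
        chunk_logits.reshape(-1, 9), targets.reshape(-1))
    loss = loss + successive_reg(model, prev)
    loss.backward()

Training here updates all weights for the chunking loss while the penalty keeps them close to their earlier values, which is the mechanism the abstract credits with avoiding catastrophic interference when tasks are trained successively.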
