Joint Multi-Label Attention Networks for Social Text Annotation
H. Dong, W. Wang, K. Huang, and F. Coenen. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 1348--1354. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
Abstract
We propose a novel attention network for document annotation with user-generated tags. The network is designed according to the human reading and annotation behaviour. Usually, users try to digest the title and obtain a rough idea about the topic first, and then read the content of the document. Present research shows that the title metadata could largely affect the social annotation. To better utilise this information, we design a framework that separates the title from the content of a document and apply a title-guided attention mechanism over each sentence in the content. We also propose two semantic-based loss regularisers that enforce the output of the network to conform to label semantics, i.e. similarity and subsumption. We analyse each part of the proposed system with two real-world open datasets on publication and question annotation. The integrated approach, Joint Multi-label Attention Network (JMAN), significantly outperformed the Bidirectional Gated Recurrent Unit (Bi-GRU) by around 13%-26% and the Hierarchical Attention Network (HAN) by around 4%-12% on both datasets, with around 10%-30% reduction of training time.
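The two mechanisms the abstract names can be illustrated independently of the paper's implementation. Below is a minimal NumPy sketch — function names and shapes are hypothetical assumptions, not the authors' code — of (a) attending over one sentence's word vectors using a pooled title vector, and (b) a hinge-style penalty capturing the ordering constraint behind the subsumption regulariser (a parent label should score at least as high as its child):

```python
import numpy as np

def title_guided_attention(title_vec, sentence_mat):
    """Pool one sentence's word vectors, weighted by affinity to the title.

    title_vec    : shape (d,), pooled representation of the document title
    sentence_mat : shape (n, d), word representations of one content sentence
    Returns a (d,) attention-pooled sentence vector.
    """
    scores = sentence_mat @ title_vec      # (n,) title/word affinities
    scores = scores - scores.max()         # shift for numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum()      # softmax over the sentence's words
    return weights @ sentence_mat          # weighted sum of word vectors

def subsumption_penalty(p_child, p_parent):
    """Zero when P(parent) >= P(child); quadratic penalty otherwise."""
    return max(0.0, p_child - p_parent) ** 2
```

In the paper itself, the title representation and attention parameters are learned jointly with recurrent (GRU) encoders; this sketch only fixes the direction of information flow — the title conditions the attention over the content — and the inequality enforced by the subsumption regulariser.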
Description
Joint Multi-Label Attention Networks for Social Text Annotation - ACL Anthology
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
@inproceedings{dong-etal-2019-joint,
abstract = {We propose a novel attention network for document annotation with user-generated tags. The network is designed according to the human reading and annotation behaviour. Usually, users try to digest the title and obtain a rough idea about the topic first, and then read the content of the document. Present research shows that the title metadata could largely affect the social annotation. To better utilise this information, we design a framework that separates the title from the content of a document and apply a title-guided attention mechanism over each sentence in the content. We also propose two semantic-based loss regularisers that enforce the output of the network to conform to label semantics, i.e. similarity and subsumption. We analyse each part of the proposed system with two real-world open datasets on publication and question annotation. The integrated approach, Joint Multi-label Attention Network (JMAN), significantly outperformed the Bidirectional Gated Recurrent Unit (Bi-GRU) by around 13{\%}-26{\%} and the Hierarchical Attention Network (HAN) by around 4{\%}-12{\%} on both datasets, with around 10{\%}-30{\%} reduction of training time.},
added-at = {2019-07-19T06:45:26.000+0200},
address = {Minneapolis, Minnesota},
author = {Dong, Hang and Wang, Wei and Huang, Kaizhu and Coenen, Frans},
biburl = {https://www.bibsonomy.org/bibtex/2103ddc8134a55eec87a26bb61422a957/hangdong},
booktitle = {Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)},
description = {Joint Multi-Label Attention Networks for Social Text Annotation - ACL Anthology},
interhash = {e2f7a9180e9e3c969c2fcf0a50b0bf3a},
intrahash = {103ddc8134a55eec87a26bb61422a957},
keywords = {attention_mechanism attention_network folksonomy multi-label multi-label_classification myown regularisation regularisers semantics social_tag structured_knowledge tag},
month = jun,
pages = {1348--1354},
publisher = {Association for Computational Linguistics},
timestamp = {2019-07-19T06:47:37.000+0200},
title = {Joint Multi-Label Attention Networks for Social Text Annotation},
url = {https://www.aclweb.org/anthology/N19-1136},
year = 2019
}