Command line:
```bash
jupyter nbconvert --to latex --TagRemovePreprocessor.remove_cell_tags='{"skip"}' --TagRemovePreprocessor.enabled=True 'nb.ipynb'
```
Also, to use it via Python you need to enable the `TagRemovePreprocessor` manually (a sketch follows below).
See: [source](https://stackoverflow.com/q/58564376/991496)
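A minimal sketch of the Python route, assuming nbconvert's `LatexExporter` and a traitlets `Config`; the `skip` tag mirrors the command above, and the output path is illustrative:

```python
from traitlets.config import Config
from nbconvert.exporters import LatexExporter

# Configure the preprocessor explicitly; it is not enabled by default.
c = Config()
c.TagRemovePreprocessor.remove_cell_tags = ("skip",)
c.TagRemovePreprocessor.enabled = True

# Register the preprocessor with the exporter so it actually runs.
c.LatexExporter.preprocessors = ["nbconvert.preprocessors.TagRemovePreprocessor"]

exporter = LatexExporter(config=c)
body, resources = exporter.from_filename("nb.ipynb")

with open("nb.tex", "w") as f:
    f.write(body)
```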
We propose a novel attention network for document annotation with user-generated tags. The network is designed according to the human reading and annotation behaviour. Usually, users try to digest the title and obtain a rough idea about the topic first, and then read the content of the document. Present research shows that the title metadata could largely affect the social annotation. To better utilise this information, we design a framework that separates the title from the content of a document and applies a title-guided attention mechanism over each sentence in the content. We also propose two semantic-based loss regularisers that enforce the output of the network to conform to label semantics, i.e. similarity and subsumption. We analyse each part of the proposed system with two real-world open datasets on publication and question annotation. The integrated approach, Joint Multi-label Attention Network (JMAN), significantly outperformed the Bidirectional Gated Recurrent Unit (Bi-GRU) by around 13%-26% and the Hierarchical Attention Network (HAN) by around 4%-12% on both datasets, with around 10%-30% reduction of training time.
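To make the title-guided attention idea concrete, here is a minimal PyTorch sketch (an assumption; the paper does not prescribe this code): the pooled title representation scores each word state of a content sentence, and the weighted sum yields a title-aware sentence vector. The class name, shapes, and the `tanh` projection are illustrative, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TitleGuidedAttention(nn.Module):
    """Attend over the word states of a content sentence, scored against the title vector."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Project word states into the same space as the pooled title vector.
        self.proj = nn.Linear(2 * hidden_size, 2 * hidden_size)

    def forward(self, title_vec: torch.Tensor, sent_states: torch.Tensor) -> torch.Tensor:
        # title_vec:   (batch, 2*hidden)           pooled Bi-GRU state of the title
        # sent_states: (batch, seq_len, 2*hidden)  Bi-GRU word states of one content sentence
        scores = torch.bmm(torch.tanh(self.proj(sent_states)),   # (batch, seq_len, 2*hidden)
                           title_vec.unsqueeze(2)).squeeze(2)    # (batch, seq_len)
        weights = F.softmax(scores, dim=1)                       # attention weights over words
        # Weighted sum of word states -> one sentence vector guided by the title.
        return torch.bmm(weights.unsqueeze(1), sent_states).squeeze(1)

# Toy usage with random tensors (batch=2, seq_len=5, hidden=16).
att = TitleGuidedAttention(hidden_size=16)
title = torch.randn(2, 32)
sentence = torch.randn(2, 5, 32)
print(att(title, sentence).shape)  # torch.Size([2, 32])
```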
R. Gaina, M. Balla, A. Dockhorn, R. Montoliu, and D. Perez-Liebana. In Joint Proceedings of the AIIDE 2020 Workshops co-located with the 16th AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE 2020), CEUR Workshop Proceedings, pages 1--7 (2020).
S. Pandya, P. Virparia, and R. Chavda. International Journal on Soft Computing, Artificial Intelligence and Applications (IJSCAI), 5(1):9--15 (February 2016).