Page Manager: Webmaster
Last update: 9/11/2012 3:13 PM



Bigrams and BiLSTMs: Two neural networks for sequential metaphor detection

Conference paper
Authors Yuri Bizzoni
Mehdi Ghanimifard
Published in Proceedings of the Workshop on Figurative Language Processing at NAACL HLT 2018, 6 June 2018, New Orleans, Louisiana
ISBN 978-1-948087-15-5
Publisher Association for Computational Linguistics (ACL)
Place of publication New Orleans, Louisiana, USA
Publication year 2018
Published at Department of Philosophy, Linguistics and Theory of Science
Language English
Keywords language modeling, metaphor detection, recurrent neural networks
Subject categories Computational linguistics


We present and compare two alternative deep neural architectures for word-level metaphor detection on text: a bi-LSTM model and a new structure based on recursive feedforward concatenation of the input. We discuss different versions of these models and the effect that input manipulations (specifically, reducing sentence length and introducing concreteness scores for words) have on their performance.
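As an illustration of the general architecture the abstract describes, a word-level bi-LSTM tagger can be sketched as follows. This is a minimal sketch in PyTorch, not the authors' implementation: all hyperparameters (embedding size, hidden size, vocabulary size) are illustrative, and the concreteness-score input discussed in the paper is omitted.

```python
# Minimal sketch of a word-level metaphor tagger (assumed architecture,
# not the authors' code): word embeddings -> bidirectional LSTM ->
# per-token binary classifier (literal vs. metaphorical).
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim=50, hidden_dim=64, num_labels=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True concatenates forward and backward hidden
        # states, so each token's representation has size 2 * hidden_dim
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        outputs, _ = self.lstm(embedded)   # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(outputs)    # (batch, seq_len, num_labels)

model = BiLSTMTagger(vocab_size=1000)
batch = torch.randint(0, 1000, (2, 7))  # two toy sentences of 7 tokens each
logits = model(batch)
print(tuple(logits.shape))  # (2, 7, 2): one label-score pair per token
```

Because the LSTM runs in both directions, each token's prediction can draw on both its left and right sentence context, which is what makes the model *sequential* rather than a per-word classifier.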

