| Field | Value |
| --- | --- |
| Author | Zhu, Xiaodan; Sobhani, Parinaz; Guo, Hongyu |
| Affiliation | National Research Council of Canada, Information and Communication Technologies |
| Format | Text, Article |
| Conference | 32nd International Conference on Machine Learning, July 6-11, 2015, Lille, France |
| Subject | artificial intelligence; brain; computational linguistics; semantics; speech recognition; speech transmission; trees (mathematics); composition layers; long distance interactions; long short term memory; machine translations; natural language understanding; recursive modeling; recursive structure; semantic composition; learning systems |
| Abstract | The chain-structured long short-term memory (LSTM) has been shown to be effective in a wide range of problems such as speech recognition and machine translation. In this paper, we propose to extend it to tree structures, in which a memory cell can reflect the history memories of multiple child cells or multiple descendant cells in a recursive process. We call the model S-LSTM, which provides a principled way of considering long-distance interactions over hierarchies, e.g., language or image parse structures. We leverage the models for semantic composition to understand the meaning of text, a fundamental problem in natural language understanding, and show that it outperforms a state-of-the-art recursive model by replacing its composition layers with the S-LSTM memory blocks. We also show that utilizing the given structures helps achieve better performance than models that ignore them. |
| Publication date | 2016-03 |
| Publisher | International Machine Learning Society |
| Language | English |
| Peer reviewed | Yes |
| NPARC number | 23000279 |
| Record identifier | 3f52b3f9-330f-4ca8-95b8-1c3a68fc4fa7 |
| Record created | 2016-07-04 |
| Record modified | 2020-03-16 |
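The abstract's idea of a memory cell that reflects the histories of multiple child cells can be sketched as a single tree-composition step. The code below is a minimal, illustrative sketch of a binary tree-LSTM-style cell, assuming the common gating scheme (input and output gates plus one forget gate per child); the weight names, dimensions, and initialization are assumptions for the sketch, not the paper's exact formulation.

```python
# Sketch of an S-LSTM-style composition step over a binary parse tree.
# Assumed gating: input gate, output gate, and one forget gate per child,
# so the parent memory cell mixes both child memory cells.
import numpy as np

rng = np.random.default_rng(0)
D = 8  # hidden size (arbitrary for this sketch)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One weight matrix per gate, acting on the concatenated child hidden states.
W = {g: rng.normal(scale=0.1, size=(D, 2 * D)) for g in ("i", "fl", "fr", "o", "u")}
b = {g: np.zeros(D) for g in ("i", "fl", "fr", "o", "u")}

def compose(left, right):
    """Combine two child (h, c) pairs into the parent (h, c)."""
    h_l, c_l = left
    h_r, c_r = right
    x = np.concatenate([h_l, h_r])
    i   = sigmoid(W["i"]  @ x + b["i"])    # input gate
    f_l = sigmoid(W["fl"] @ x + b["fl"])   # forget gate for the left child cell
    f_r = sigmoid(W["fr"] @ x + b["fr"])   # forget gate for the right child cell
    o   = sigmoid(W["o"]  @ x + b["o"])    # output gate
    u   = np.tanh(W["u"]  @ x + b["u"])    # candidate update
    c = i * u + f_l * c_l + f_r * c_r      # parent cell keeps both child histories
    h = o * np.tanh(c)
    return h, c

# Leaves: (h, c) pairs, e.g. derived from word embeddings; the model is
# applied recursively, bottom-up, over the given parse structure.
leaf = lambda: (rng.normal(size=D), np.zeros(D))
root_h, root_c = compose(compose(leaf(), leaf()), leaf())
print(root_h.shape)  # (8,)
```

The recursive application over a parse tree is what distinguishes this from the chain LSTM: composition follows the hierarchy rather than left-to-right word order, letting distant-but-syntactically-close constituents interact directly.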