A modified long short term memory cell

Haralabopoulos, G. ORCID: https://orcid.org/0000-0002-2142-4975, Razis, G. and Anagnostopoulos, I. (2023) A modified long short term memory cell. International Journal of Neural Systems, 33 (7). 2350039. ISSN 1793-6462
It is advisable to refer to the publisher's version if you intend to cite from this work. See Guidance on citing.

To link to this item DOI: 10.1142/S0129065723500399

Abstract/Summary

Machine Learning (ML), among other things, facilitates Text Classification, the task of assigning classes to textual items. Classification performance in ML has improved significantly due to recent developments, including the rise of Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRUs) and Transformer models. These kinds of cells contain internal memory states with dynamic temporal behaviour. In the LSTM cell, this temporal behaviour is stored in two different states: “Current” and “Hidden”. In this work, we define a modification layer within the LSTM cell that allows us to perform additional adjustments to either state, or even to alter both simultaneously. We perform 17 single-state alteration experiments: twelve involve the Current state, whereas five involve the Hidden one. These alterations are evaluated on seven datasets related to sentiment analysis, document classification, hate speech detection and human-to-robot interaction. Our results show that the highest-performing alterations for the Current and Hidden states achieve an average F1 improvement of 0.5% and 0.3%, respectively. We also compare our modified cell to two Transformer models: the modified LSTM cell is outperformed on classification metrics in 4/6 datasets, but it improves upon the simple Transformer model and clearly has better cost-efficiency than both Transformer models.
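The abstract does not specify the exact form of the modification layer or of the 17 alterations. As a rough illustration only, the sketch below (in PyTorch, which the paper may or may not use) wraps a standard LSTM cell with hypothetical `alter_current` and `alter_hidden` hooks applied after each step, mirroring the stated idea of adjusting either state, or both simultaneously, inside the cell. All names here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ModifiedLSTMCell(nn.Module):
    """LSTM cell with a post-step state-modification hook.

    Hypothetical sketch: the paper's actual alterations are not given
    in the abstract; `alter_current` / `alter_hidden` stand in for
    whichever transformation is being evaluated.
    """

    def __init__(self, input_size, hidden_size,
                 alter_current=None, alter_hidden=None):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Identity by default; pass e.g. torch.tanh to alter a state.
        self.alter_current = alter_current or (lambda c: c)
        self.alter_hidden = alter_hidden or (lambda h: h)

    def forward(self, x, state):
        # Standard LSTM step produces the Hidden (h) and Current (c) states.
        h, c = self.cell(x, state)
        # Modification layer: adjust either state, or both at once.
        c = self.alter_current(c)
        h = self.alter_hidden(h)
        return h, c

# Example: apply a tanh squashing to the Current (cell) state only.
cell = ModifiedLSTMCell(16, 32, alter_current=torch.tanh)
x = torch.randn(4, 16)
h0 = torch.zeros(4, 32)
c0 = torch.zeros(4, 32)
h1, c1 = cell(x, (h0, c0))
```

Under this reading, each of the 17 evaluated alterations would simply correspond to a different function passed at construction time, leaving the rest of the cell unchanged.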