Posts by Collection

portfolio

Paper reimplementations

Reimplementations of deep learning and machine learning papers across various tasks and data types

publications

An Exploratory Comparison of LSTM and BiLSTM in Stock Price Prediction

Published in Inventive Communication and Computational Technologies, 2023

Forecasting stock prices is a challenging topic that has been the subject of many studies in finance. Modeling and predicting future stock prices with machine learning techniques, such as deep learning, is a promising approach. Long short-term memory (LSTM) and bidirectional long short-term memory (BiLSTM) are two common deep learning models. This work investigates which activation function and which optimization method influence model performance the most, and compares closely related models (vanilla RNN, LSTM, and BiLSTM) to determine the best model for stock price prediction. Experimental results indicated that BiLSTM with the ReLU activation and the Adam optimizer achieved the best performance in stock price prediction.
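As a rough illustration of the kind of model compared in this paper, the sketch below builds a BiLSTM regressor with a ReLU head trained with the Adam optimizer in PyTorch. The window length, hidden size, and learning rate are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch (assumed hyperparameters): BiLSTM regressor for next-step price prediction.
import torch
import torch.nn as nn

class BiLSTMRegressor(nn.Module):
    def __init__(self, n_features=1, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(2 * hidden, 1))

    def forward(self, x):                 # x: (batch, window, n_features)
        out, _ = self.lstm(x)             # (batch, window, 2 * hidden)
        return self.head(out[:, -1, :])   # predict the next closing price from the last step

model = BiLSTMRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # Adam, as in the best-performing setup
loss_fn = nn.MSELoss()
```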

Recommended citation: Viet, N.Q., Quang, N.N., King, N., Huu, D.T., Toan, N.D., Thanh, D.N.H. (2023). "An Exploratory Comparison of LSTM and BiLSTM in Stock Price Prediction." Inventive Communication and Computational Technologies. ICICCT 2023. Lecture Notes in Networks and Systems, vol 757. Springer, Singapore.

Performance Insights of Attention-free Language Models in Sentiment Analysis: A Case Study for E-commerce Platforms in Vietnam

Published in Inventive Communication and Computational Technologies, 2024

Transformer-based models have seen significant development over the last few years due to their efficiency and parallelizability in training on various data domains. However, one bottleneck of the Transformer architecture lies in the Attention mechanism and its high complexity. Consequently, training a Transformer network demands long training time and large computational resources. Although much research has addressed this challenge, it is worth investigating other language models without the Attention component. In this work, we focused on the effectiveness and efficiency of attention-free language models, namely BiLSTM, TextCNN, gMLP, and HyenaDNA, for the sentiment analysis problem based on reviews on popular e-commerce platforms in Vietnam. The findings showed that Bidirectional LSTM, TextCNN, HyenaDNA, and gMLP achieved accuracies of approximately 97.8%, 97%, 96.8%, and 97.5%, respectively, comparable to a popular attention-based model, RoBERTa, while using 36.7, 410, 9.3, and 98 times fewer parameters, respectively. In addition, among the considered attention-free models, Bidirectional LSTM obtained the highest accuracy, but its margin over gMLP is tiny; gMLP also achieved the highest F1 score within the attention-free model family.
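A minimal sketch of one of the attention-free baselines discussed above, a Bidirectional LSTM text classifier, in PyTorch. The vocabulary size, embedding and hidden dimensions, and number of sentiment classes are assumptions for illustration, not the settings used in the paper.

```python
# Hedged sketch: attention-free sentiment classifier over tokenized review text.
import torch
import torch.nn as nn

class BiLSTMSentiment(nn.Module):
    def __init__(self, vocab_size=30_000, embed_dim=128, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        emb = self.embed(token_ids)        # (batch, seq_len, embed_dim)
        out, _ = self.lstm(emb)            # (batch, seq_len, 2 * hidden)
        return self.fc(out[:, -1, :])      # sentiment class logits
```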

Recommended citation: Pending.

On Enhancing Deep Embedded Clustering for Intent Mining in Goal-Oriented Dialogue Understanding

Published in Journal of Uncertain Systems, 2024

Discovering user intents plays an indispensable role in natural language understanding and automated dialogue response. However, labeling intents for new domains from scratch is a daunting process that often requires extensive manual effort from domain experts. To this end, this paper proposes an unsupervised approach for discovering intents and automatically producing intent labels from a collection of unlabeled utterances in the banking domain. The proposed two-stage training procedure deploys Deep Embedded Clustering (DEC), significantly modified to use the Sophia optimizer and the Jensen-Shannon divergence measure, to simultaneously learn feature representations and cluster assignments. A set of intent labels for each cluster is then generated by a dependency parser in the second stage. We empirically show that the proposed unsupervised approach generates meaningful intent labels and coherent short-text clusters while achieving high evaluation scores.
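The clustering objective can be sketched as follows: DEC-style soft assignments via a Student's t kernel, a sharpened target distribution, and a Jensen-Shannon divergence loss in place of DEC's usual KL term. This is an assumed, minimal reading of the approach; the Sophia optimizer is external to PyTorch and omitted, and the encoder producing the embeddings z is not shown.

```python
# Sketch of a DEC-style clustering loss with Jensen-Shannon divergence (assumed details).
import torch

def soft_assignments(z, centroids, alpha=1.0):
    # Student's t kernel between embeddings z and cluster centroids (as in DEC)
    dist_sq = torch.cdist(z, centroids) ** 2
    q = (1.0 + dist_sq / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened auxiliary distribution from the original DEC formulation
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

def js_divergence(p, q, eps=1e-8):
    # Jensen-Shannon divergence, replacing DEC's usual KL(P || Q) objective
    m = 0.5 * (p + q)
    kl_pm = (p * (torch.log(p + eps) - torch.log(m + eps))).sum(dim=1)
    kl_qm = (q * (torch.log(q + eps) - torch.log(m + eps))).sum(dim=1)
    return (0.5 * (kl_pm + kl_qm)).mean()
```

In training, the loss would be computed as `js_divergence(target_distribution(q).detach(), q)` for each batch of encoder embeddings, with gradients updating both the encoder and the centroids.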

Recommended citation: Ha, N.Q.K., Huyen, N.T.T., Uyen, M.T.M., Viet, N.Q., Quang, N.N., Thanh, D.N.H. (2024). "Customer Intent Mining from Service Inquiries with Newly Improved Deep Embedded Clustering." Journal of Uncertain Systems (Scopus).

talks

teaching