Comparison of LSTM and GRU Methods for Predicting Gold Exchange Rate against US Dollar

Dušan Bohovic (University of Novi Pazar, Serbia), corresponding author

Abstract


This study compares the performance of Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models in predicting the gold exchange rate against the United States Dollar (USD). Using time series data from Yahoo Finance for the period 2017-2023, we evaluate the two models on a common set of metrics. The results show that the GRU model performs better on several important metrics, notably a lower Root Mean Square Error (RMSE) on the test data (26.41 versus 27.54 for LSTM) and a higher coefficient of determination (R²) on the test data (0.9004 versus 0.7825 for LSTM). These findings indicate that the GRU model generalizes better for gold-to-USD exchange rate prediction, although both models achieve accuracy above 98% on the test data.
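The two evaluation metrics reported above can be computed directly from a model's predictions. The sketch below, assuming NumPy, shows the standard definitions of RMSE and R²; the gold-price values are hypothetical placeholders, not the study's data.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Square Error: square root of the mean squared residual."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical daily gold prices (USD/oz) and one model's predictions:
actual = [1800.0, 1820.0, 1810.0, 1835.0, 1850.0]
predicted = [1795.0, 1825.0, 1805.0, 1840.0, 1845.0]

print(rmse(actual, predicted))  # 5.0
print(round(r2(actual, predicted), 4))  # 0.9209
```

A lower RMSE and an R² closer to 1 on held-out test data, as reported for the GRU model here, indicate better generalization.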


DOI

https://doi.org/10.33292/ijarlit.v3i1.43