Academic literature on the topic 'LSTM Neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference papers, reports, and other scholarly sources on the topic 'LSTM Neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "LSTM Neural networks"

1

Yu, Yong, Xiaosheng Si, Changhua Hu, and Jianxun Zhang. "A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures." Neural Computation 31, no. 7 (2019): 1235–70. http://dx.doi.org/10.1162/neco_a_01199.

Abstract:
Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input data when the input gap is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) could handle the problem of long-term dependencies well. Since its introduction, almost all the exciting results based on RNNs have been achieved by the LSTM. The LSTM has become the focus of deep learning. We review the LSTM cel
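The 'gate functions' mentioned in this abstract are what distinguish the LSTM cell from a plain sigma/tanh RNN cell. For reference, a standard LSTM cell without peepholes can be written as follows (notation is the common textbook one, with input x_t, hidden state h_t and cell state c_t; it is not copied from the review itself):

f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)            % forget gate
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)            % input gate
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)            % output gate
\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)     % candidate cell state
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t      % cell state update
h_t = o_t \odot \tanh(c_t)                           % hidden state / output

The additive cell-state update for c_t is what lets gradients flow across long input gaps, which is the 'long-term dependency' property the review discusses.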
2

Chen, Huimin, Liyong Wang, Yangyang Xu, et al. "State of Charge Estimation for Lithium-ion Battery Using Long Short-Term Memory Networks." Journal of Physics: Conference Series 2890, no. 1 (2024): 012024. http://dx.doi.org/10.1088/1742-6596/2890/1/012024.

Abstract:
Accurate estimation of the State of Charge (SOC) in lithium-ion batteries is crucial for enhancing performance and extending battery life, especially in applications like electric vehicles and energy storage systems. This study introduces a novel method for SOC estimation that utilizes Long Short-Term Memory (LSTM) neural networks. To evaluate the LSTM model’s effectiveness, we compared its performance with that of Backpropagation (BP) neural networks and Recurrent Neural Networks (RNN) using the Root Mean Square Error (RMSE) as the evaluation metric. The findings reveal that the LSTM
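To make the setup above concrete, here is a minimal sketch of an LSTM-based SOC regressor in Keras, evaluated with RMSE as in the comparison described in the abstract. It is not the authors' implementation: the 50-step window, the three input channels (e.g. voltage, current, temperature), the layer sizes and the random placeholder data are illustrative assumptions.

import numpy as np
import tensorflow as tf

# Placeholder data: 1000 windows of 50 time steps with 3 measurement channels;
# targets are SOC values in [0, 1]. Real data would come from battery cycling tests.
X = np.random.rand(1000, 50, 3).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(50, 3)),
    tf.keras.layers.LSTM(64),                          # recurrent feature extractor
    tf.keras.layers.Dense(1, activation="sigmoid"),    # SOC constrained to [0, 1]
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

rmse = model.evaluate(X, y, verbose=0)[1]              # RMSE, the metric used in the study
print(f"RMSE: {rmse:.4f}")

Swapping the LSTM layer for a SimpleRNN or a plain feedforward (BP) network gives the kind of baselines the study compares against.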
3

Bakir, Houda, Ghassen Chniti, and Hédi Zaher. "E-Commerce Price Forecasting Using LSTM Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 169–74. http://dx.doi.org/10.18178/ijmlc.2018.8.2.682.

4

Burges, Entesar T., Zakariya A. Oraibi, and Ali Wali. "Gait Recognition Using Hybrid LSTM-CNN Deep Neural Networks." Journal of Image and Graphics 12, no. 2 (2024): 168–75. http://dx.doi.org/10.18178/joig.12.2.168-175.

Abstract:
Identifying individuals based on their gait is a crucial aspect of biometric authentication. It is complicated by several factors, such as altering one’s walking posture, donning a coat, and wearing high heels. With the advent of artificial intelligence, deep learning, in particular, has made significant strides in this area. The conditional Generative Adversarial Network (cGAN), together with hybrid Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNNs), are used in this research to create images using a novel technique. The framework comprises three parts. The first involves
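A hybrid LSTM-CNN of the general kind named in the title usually applies a CNN to each frame and an LSTM across frames. The sketch below shows only that generic pattern, not the paper's cGAN-based framework; the frame size (64x64 grayscale), the sequence length and the number of gait classes are assumptions.

import tensorflow as tf

n_frames, height, width, channels, n_classes = 20, 64, 64, 1, 10   # assumed values

# Per-frame feature extractor (CNN)
frame_cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
])

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_frames, height, width, channels)),
    tf.keras.layers.TimeDistributed(frame_cnn),   # same CNN applied to every frame
    tf.keras.layers.LSTM(64),                     # temporal dynamics of the gait cycle
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()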
5

Liu, David, and An Wei. "Regulated LSTM Artificial Neural Networks for Option Risks." FinTech 1, no. 2 (2022): 180–90. http://dx.doi.org/10.3390/fintech1020014.

Abstract:
This research aims to study the pricing risks of options by using improved LSTM artificial neural network models and make direct comparisons with the Black–Scholes option pricing model based upon the option prices of 50 ETFs of the Shanghai Securities Exchange from 1 January 2018 to 31 December 2019. We study an LSTM model, a mathematical option pricing model (BS model), and an improved artificial neural network model—the regulated LSTM model. The method we adopted is first to price the options using the mathematical model—i.e., the BS model—and then to construct the LSTM neural network for tr
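The Black-Scholes benchmark that the regulated LSTM is compared against has a closed form. For reference, a textbook implementation of the European call price is sketched below; the example parameters are arbitrary and this is not the authors' code.

import math
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    # European call price under the Black-Scholes model
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm.cdf(d1) - K * math.exp(-r * T) * norm.cdf(d2)

# Arbitrary example: spot 2.8, strike 3.0, three months to expiry, 3% rate, 20% volatility
print(bs_call(S=2.8, K=3.0, T=0.25, r=0.03, sigma=0.20))

In the study, LSTM-based models are trained on observed option prices and their pricing errors are compared against this analytical benchmark.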
6

Wan, Yingliang, Hong Tao, and Li Ma. "Forecasting Zhejiang Province's GDP Using a CNN-LSTM Model." Frontiers in Business, Economics and Management 13, no. 3 (2024): 233–35. http://dx.doi.org/10.54097/bmq2dy63.

Abstract:
Zhejiang province has experienced notable economic growth in recent years. Despite this, achieving sustainable high-quality economic development presents complex challenges and uncertainties. This study employs advanced neural network methodologies, including Convolutional Neural Networks (CNN), Long Short-Term Memory networks (LSTM), and an integrated CNN-LSTM model, to predict Zhejiang's economic trajectory. Our empirical analysis demonstrates the proficiency of neural networks in delivering reasonably precise economic forecasts, despite inherent prediction residuals. A comparative assessmen
7

Kalinin, Maxim, Vasiliy Krundyshev, and Evgeny Zubkov. "Estimation of applicability of modern neural network methods for preventing cyberthreats to self-organizing network infrastructures of digital economy platforms." SHS Web of Conferences 44 (2018): 00044. http://dx.doi.org/10.1051/shsconf/20184400044.

Abstract:
The problems of applying neural network methods for solving problems of preventing cyberthreats to flexible self-organizing network infrastructures of digital economy platforms: vehicle adhoc networks, wireless sensor networks, industrial IoT, “smart buildings” and “smart cities” are considered. The applicability of the classic perceptron neural network, recurrent, deep, LSTM neural networks and neural networks ensembles in the restricting conditions of fast training and big data processing are estimated. The use of neural networks with a complex architecture– recurrent and LSTM neural network
8

Wan, Huaiyu, Shengnan Guo, Kang Yin, Xiaohui Liang, and Youfang Lin. "CTS-LSTM: LSTM-based neural networks for correlated time series prediction." Knowledge-Based Systems 191 (March 2020): 105239. http://dx.doi.org/10.1016/j.knosys.2019.105239.

9

Kande, Jayanth. "Twitter Sentiment Analysis with LSTM Neural Networks." REST Journal on Data Analytics and Artificial Intelligence 3, no. 3 (2024): 92–98. http://dx.doi.org/10.46632/jdaai/3/3/11.

Abstract:
This project delves into sentiment analysis on Twitter using Long Short-Term Memory (LSTM) Neural Networks in conjunction with Global Vectors for Word Representation (GloVe). The study explores the properties of tweets, preprocessing steps, and applying GloVe embeddings to map words to vectors. The classifier’s design and training parameters are detailed, and the results are compared with baselines, revealing the LSTM’s superiority in handling sequential language data. Furthermore, trials explore how changing the quantity of fully connected layers and LSTM time steps affects accuracy. The fin
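A minimal sketch of the GloVe-plus-LSTM classifier family described in this abstract, written in Keras. The vocabulary size, sequence length and the placeholder glove_matrix array are assumptions (in practice the matrix is filled from a downloaded GloVe vectors file); this is not the project's own code.

import numpy as np
import tensorflow as tf

vocab_size, embed_dim, max_len = 20000, 100, 50           # assumed values
glove_matrix = np.random.rand(vocab_size, embed_dim)      # placeholder for real GloVe vectors

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(max_len,), dtype="int32"),
    # Embedding layer initialized with (frozen) pre-trained GloVe vectors
    tf.keras.layers.Embedding(
        vocab_size, embed_dim,
        embeddings_initializer=tf.keras.initializers.Constant(glove_matrix),
        trainable=False),
    tf.keras.layers.LSTM(128),                             # sequential language model
    tf.keras.layers.Dense(64, activation="relu"),          # fully connected layer(s)
    tf.keras.layers.Dense(1, activation="sigmoid"),        # positive vs. negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

The trials mentioned in the abstract (varying the number of fully connected layers and LSTM time steps) correspond to changing the Dense stack and max_len here.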
10

Shewalkar, Apeksha, Deepika Nyavanandi, and Simone A. Ludwig. "Performance Evaluation of Deep Neural Networks Applied to Speech Recognition: RNN, LSTM and GRU." Journal of Artificial Intelligence and Soft Computing Research 9, no. 4 (2019): 235–45. http://dx.doi.org/10.2478/jaiscr-2019-0006.

Abstract:
Deep Neural Networks (DNN) are nothing but neural networks with many hidden layers. DNNs are becoming popular in automatic speech recognition tasks which combine a good acoustic model with a language model. Standard feedforward neural networks cannot handle speech data well since they do not have a way to feed information from a later layer back to an earlier layer. Thus, Recurrent Neural Networks (RNNs) have been introduced to take temporal dependencies into account. However, the shortcoming of RNNs is that long-term dependencies due to the vanishing/exploding gradient problem cannot be h
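In code, the comparison this paper performs amounts to swapping a single recurrent layer, since simple RNN, LSTM and GRU cells share the same interface. A schematic Keras sketch with generic input shapes (not the speech features or corpus used by the authors):

import tensorflow as tf

def make_model(recurrent_layer):
    # Same architecture, different recurrent cell type
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(100, 40)),   # assumed: 100 frames x 40 acoustic features
        recurrent_layer(128),
        tf.keras.layers.Dense(30, activation="softmax"),   # assumed label set size
    ])

for layer in (tf.keras.layers.SimpleRNN, tf.keras.layers.LSTM, tf.keras.layers.GRU):
    model = make_model(layer)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    print(layer.__name__, "parameters:", model.count_params())

The parameter counts already hint at the trade-off studied in the paper: GRU cells carry roughly three quarters of the recurrent weights of LSTM cells, while simple RNN cells are cheapest but suffer from vanishing/exploding gradients.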

Dissertations / Theses on the topic "LSTM Neural networks"

1

Paschou, Michail. "ASIC implementation of LSTM neural network algorithm." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254290.

Abstract:
LSTM neural networks have been used for speech recognition, image recognition and other artificial intelligence applications for many years. Most applications perform the LSTM algorithm and the required calculations on cloud computers. Off-line solutions include the use of FPGAs and GPUs but the most promising solutions include ASIC accelerators designed for this purpose only. This report presents an ASIC design capable of performing the multiple iterations of the LSTM algorithm on a unidirectional and without peepholes neural network architecture. The proposed design provides arithmetic level
2

Cavallie, Mester Jon William. "Using LSTM Neural Networks To Predict Daily Stock Returns." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-106124.

Abstract:
Long short-term memory (LSTM) neural networks have been proven to be effective for time series prediction, even in some instances where the data is non-stationary. This led us to examine their predictive ability on stock market returns, as the development of stock prices and returns tends to be a non-stationary time series. We used daily stock trading data to let an LSTM train models at predicting daily returns for 60 stocks from the OMX30 and Nasdaq-100 indices. Subsequently, we measured their accuracy, precision, and recall. The mean accuracy was 49.75 percent, meaning that the observed accu
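The evaluation described here treats each day as a binary up/down classification, so accuracy, precision and recall follow directly from the signs of predicted and realized returns. A small illustration with scikit-learn on made-up numbers (not the thesis data):

import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(0)
actual_returns = rng.normal(0.0, 0.01, size=250)      # placeholder realized daily returns
predicted_returns = rng.normal(0.0, 0.01, size=250)   # placeholder LSTM predictions

y_true = (actual_returns > 0).astype(int)             # 1 = positive daily return
y_pred = (predicted_returns > 0).astype(int)

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))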
3

Pokhrel, Abhishek <1996>. "Stock Returns Prediction using Recurrent Neural Networks with LSTM." Master's Degree Thesis, Università Ca' Foscari Venezia, 2022. http://hdl.handle.net/10579/22038.

Abstract:
Research in asset pricing has, until recently, side-stepped the high dimensionality problem by focusing on low-dimensional models. Work on cross-sectional stock return prediction, for example, has focused on regressions with a small number of characteristics. Given the background of an enormously large number of variables that could potentially be relevant for predicting returns, focusing on such a small number of factors effectively means that the researchers are imposing a very high degree of sparsity on these models. This research studies the use of the recurrent neural network (RNN) method
4

Ärlemalm, Filip. "Harbour Porpoise Click Train Classification with LSTM Recurrent Neural Networks." Thesis, KTH, Teknisk informationsvetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-215088.

Abstract:
The harbour porpoise is a toothed whale whose presence is threatened in Scandinavia. One step towards preserving the species in critical areas is to study and observe the harbour porpoise population growth or decline in these areas. Today this is done by using underwater audio recorders, so called hydrophones, and manual analyzing tools. This report describes a method that modernizes the process of harbour porpoise detection with machine learning. The detection method is based on data collected by the hydrophone AQUAclick 100. The data is processed and classified automatically with a stacked long sh
5

Li, Edwin. "LSTM Neural Network Models for Market Movement Prediction." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231627.

Abstract:
Interpreting time varying phenomena is a key challenge in the capital markets. Time series analysis using autoregressive methods has been carried out over the last couple of decades, often with reassuring results. However, such methods sometimes fail to explain trends and cyclical fluctuations, which may be characterized by long-range dependencies or even dependencies between the input features. The purpose of this thesis is to investigate whether recurrent neural networks with LSTM-cells can be used to capture these dependencies, and ultimately be used as a complement for index trading decisi
6

Zambezi, Samantha. "Predicting social unrest events in South Africa using LSTM neural networks." Master's thesis, Faculty of Science, 2021. http://hdl.handle.net/11427/33986.

Abstract:
This thesis demonstrates an approach to predict the count of social unrest events in South Africa. A comparison is made between traditional forecasting approaches and neural networks; the traditional forecasting method selected being the Autoregressive Integrated Moving Average (ARIMA) model. The type of neural network implemented was the Long Short-Term Memory (LSTM) neural network. The basic theoretical concepts of ARIMA and LSTM neural networks are explained and subsequently, the patterns of the social unrest time series were analysed using time series exploratory techniques. The social unrest
7

Holm, Noah, and Emil Plynning. "Spatio-temporal prediction of residential burglaries using convolutional LSTM neural networks." Thesis, KTH, Geoinformatik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229952.

Abstract:
The low number of solved residential burglary crimes calls for new and innovative methods in the prevention and investigation of the cases. There were 22 600 reported residential burglaries in Sweden in 2017, but only four to five percent of these will ever be solved. There are many initiatives in both Sweden and abroad for decreasing the number of occurring residential burglaries, and one of the areas being tested is the use of prediction methods for more efficient preventive actions. This thesis is an investigation of a potential method of prediction by using neural networks to identify are
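Convolutional LSTM layers act on sequences of spatial grids, which is what makes them a natural fit for the spatio-temporal crime data described above. A generic Keras sketch, not the thesis model; the grid size, the eight-week history and the single burglary-count channel are assumptions.

import tensorflow as tf

weeks, rows, cols, channels = 8, 32, 32, 1   # assumed: 8 weekly grids of 32x32 area cells

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(weeks, rows, cols, channels)),
    # Convolutional LSTM: recurrence over time, convolution over space
    tf.keras.layers.ConvLSTM2D(16, kernel_size=3, padding="same",
                               return_sequences=False),
    # Per-cell probability of a burglary in the next period
    tf.keras.layers.Conv2D(1, kernel_size=1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()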
8

Graffi, Giacomo. "A novel approach for Credit Scoring using Deep Neural Networks with bank transaction data." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Abstract:
With the PSD2 open banking revolution FinTechs obtained a key role in the financial industry. This role implies the inquiry and development of new techniques, products and solutions to compete with other players in this area. The aim of this thesis is to investigate the applicability of the state-of-the-art Deep Learning techniques for Credit Risk Modeling. In order to accomplish it, a PSD2-related synthetic and anonymized dataset has been used to simulate an application process with only one account per user. Firstly, a machine-readable representation of the bank accounts has been created, s
9

Xiang, Wenliang. "Anomaly detection by prediction for health monitoring of satellites using LSTM neural networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24695/.

Abstract:
Anomaly detection in satellites has not been well documented due to the unavailability of satellite data, while it becomes more and more important with the increasing popularity of satellite applications. Our work focuses on anomaly detection by prediction on the dataset from the satellite, where we try and compare performance among recurrent neural network (RNN), Long Short-Term Memory (LSTM) and conventional neural network (NN). We conclude that LSTM with input length p=16, dimensionality n=32, output length q=2, 128 neurons and without maximum overlap is the best in terms of balanced accur
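Prediction-based anomaly detection of this kind flags windows where the model's forecast deviates strongly from what is then actually observed. The sketch below follows the hyperparameters quoted in the abstract (input length p=16, output length q=2, 128 LSTM units); reading the dimensionality n=32 as the number of telemetry channels is an assumption, as is the simple three-sigma threshold.

import numpy as np
import tensorflow as tf

p, q, n_channels, units = 16, 2, 32, 128   # p, q, units from the abstract; n read as channel count

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(p, n_channels)),
    tf.keras.layers.LSTM(units),
    tf.keras.layers.Dense(q * n_channels),
    tf.keras.layers.Reshape((q, n_channels)),   # forecast of the next q telemetry steps
])
model.compile(optimizer="adam", loss="mse")
# ... fit on nominal (anomaly-free) telemetry windows ...

# Score new windows by their prediction error; large errors suggest anomalies.
X_new = np.random.rand(100, p, n_channels).astype("float32")   # placeholder input windows
Y_new = np.random.rand(100, q, n_channels).astype("float32")   # observed continuations
errors = np.mean((model.predict(X_new, verbose=0) - Y_new) ** 2, axis=(1, 2))
threshold = errors.mean() + 3 * errors.std()                   # assumed 3-sigma rule
print("flagged windows:", np.where(errors > threshold)[0])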
10

Lin, Alvin. "Video Based Automatic Speech Recognition Using Neural Networks." DigitalCommons@CalPoly, 2020. https://digitalcommons.calpoly.edu/theses/2343.

Abstract:
Neural network approaches have become popular in the field of automatic speech recognition (ASR). Most ASR methods use audio data to classify words. Lip reading ASR techniques utilize only video data, which compensates for noisy environments where audio may be compromised. A comprehensive approach, including the vetting of datasets and development of a preprocessing chain, to video-based ASR is developed. This approach will be based on neural networks, namely 3D convolutional neural networks (3D-CNN) and Long short-term memory (LSTM). These types of neural networks are designed to take in temp

Books on the topic "LSTM Neural networks"

1

Sangeetha, V., and S. Kevin Andrews. Introduction to Artificial Intelligence and Neural Networks. Magestic Technology Solutions (P) Ltd, Chennai, Tamil Nadu, India, 2023. http://dx.doi.org/10.47716/mts/978-93-92090-24-0.

Abstract:
Artificial Intelligence (AI) has emerged as a defining force in the current era, shaping the contours of technology and deeply permeating our everyday lives. From autonomous vehicles to predictive analytics and personalized recommendations, AI continues to revolutionize various facets of human existence, progressively becoming the invisible hand guiding our decisions. Simultaneously, its growing influence necessitates the need for a nuanced understanding of AI, thereby providing the impetus for this book, “Introduction to Artificial Intelligence and Neural Networks.” This book aims to equip it

Book chapters on the topic "LSTM Neural networks"

1

Wüthrich, Mario V., and Michael Merz. "Recurrent Neural Networks." In Springer Actuarial. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_8.

Abstract:
This chapter considers recurrent neural (RN) networks. These are special network architectures that are useful for time-series modeling, e.g., applied to time-series forecasting. We study the most popular RN networks which are the long short-term memory (LSTM) networks and the gated recurrent unit (GRU) networks. We apply these networks to mortality forecasting.
2

Salem, Fathi M. "Gated RNN: The Long Short-Term Memory (LSTM) RNN." In Recurrent Neural Networks. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89929-5_4.

3

Zhang, Nan, Wei-Long Zheng, Wei Liu, and Bao-Liang Lu. "Continuous Vigilance Estimation Using LSTM Neural Networks." In Neural Information Processing. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46672-9_59.

4

Alexandre, Luís A., and J. P. Marques de Sá. "Error Entropy Minimization for LSTM Training." In Artificial Neural Networks – ICANN 2006. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11840817_26.

5

Paul, Sarthak, and Sarvesh Tanwar. "Stock Forecasting Using LSTM Neural Networks." In Lecture Notes in Networks and Systems. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-97-8329-8_27.

6

Yu, Wen, Xiaoou Li, and Jesus Gonzalez. "Fast Training of Deep LSTM Networks." In Advances in Neural Networks – ISNN 2019. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-22796-8_1.

7

Klapper-Rybicka, Magdalena, Nicol N. Schraudolph, and Jürgen Schmidhuber. "Unsupervised Learning in LSTM Recurrent Neural Networks." In Artificial Neural Networks — ICANN 2001. Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_95.

8

Haralabopoulos, Giannis, and Ioannis Anagnostopoulos. "A Custom State LSTM Cell for Text Classification Tasks." In Engineering Applications of Neural Networks. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08223-8_40.

9

Li, SiLiang, Bin Xu, and Tong Lee Chung. "Definition Extraction with LSTM Recurrent Neural Networks." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-47674-2_16.

10

Gers, Felix A., Douglas Eck, and Jürgen Schmidhuber. "Applying LSTM to Time Series Predictable through Time-Window Approaches." In Artificial Neural Networks — ICANN 2001. Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_93.


Conference papers on the topic "LSTM Neural networks"

1

Kamal, Maisirreem A., and Fawziya M. Ramo. "Kidney Disease Prediction Using Multiple LSTM Neural Networks." In 2025 International Conference on Computer Science and Software Engineering (CSASE). IEEE, 2025. https://doi.org/10.1109/csase63707.2025.11054016.

2

Chen, Dingyu, Shaohua Liu, and Le Yuan. "Pioneering Industrial Anomaly Detection with a Hierarchical LSTM-Rola Framework." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650607.

3

Shan, Zhaowen, Xuanteng Huang, Zheng Zhou, and Xianwei Zhang. "openLG: A Tunable and Efficient Open-source LSTM on GPUs." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650733.

4

Zhou, Dianyi, Xi Du, Shiyi Liu, Qingyu Su, and Hongyang Guo. "Research on LSTM-driven UAV path planning." In Fourth International Conference on Advanced Algorithms and Neural Networks (AANN 2024), edited by Qinghua Lu and Weishan Zhang. SPIE, 2024. http://dx.doi.org/10.1117/12.3049651.

5

Medeiros, Thiago, and Alfredo Weitzenfeld. "A Place Cell Model for Spatio-Temporal Navigation Learning with LSTM." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650241.

6

Kaur, Arpanpreet, Kanwarpartap Singh Gill, Rahul Chauhan, and Hemant Singh Pokhariya. "Breaking Boundaries in Text Classification with LSTM Neural Networks." In 2024 4th International Conference on Advancement in Electronics & Communication Engineering (AECE). IEEE, 2024. https://doi.org/10.1109/aece62803.2024.10911774.

7

Goel, Akash, Alok Katiyar, Amit Kumar Goel, and Adesh Kumar. "LSTM Neural Networks for Brain Signals and Neuromorphic Chip." In 2024 2nd International Conference on Advances in Computation, Communication and Information Technology (ICAICCIT). IEEE, 2024. https://doi.org/10.1109/icaiccit64383.2024.10912358.

8

Sun, Qingnan, Marko V. Jankovic, Lia Bally, and Stavroula G. Mougiakakou. "Predicting Blood Glucose with an LSTM and Bi-LSTM Based Deep Neural Network." In 2018 14th Symposium on Neural Networks and Applications (NEUREL). IEEE, 2018. http://dx.doi.org/10.1109/neurel.2018.8586990.

9

Arshi, Sahar, Li Zhang, and Rebecca Strachan. "Prediction Using LSTM Networks." In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8852206.

10

Pérez, José, Rafael Baez, Jose Terrazas, et al. "Physics-Informed Long-Short Term Memory Neural Network Performance on Holloman High-Speed Test Track Sled Study." In ASME 2022 Fluids Engineering Division Summer Meeting. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/fedsm2022-86953.

Abstract:
Physics Informed Neural Networks (PINNs) incorporate known physics equations into a network to reduce training time and increase accuracy. Traditional PINNs approaches are based on dense networks that do not consider the fact that simulations are a type of sequential data. Long-Short Term Memory (LSTM) networks are a modified version of Recurrent Neural Networks (RNNs) which are used to analyze sequential datasets. We propose a Physics Informed LSTM network that leverages the power of LSTMs for sequential datasets that also incorporates the governing physics equations of 2D incompress
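The defining ingredient of a physics-informed network is a loss that penalizes violations of the governing equations alongside the usual data misfit. The sketch below shows that general pattern only, using a toy first-order ODE du/dt = -u enforced by finite differences over the predicted sequence; the paper's actual 2D incompressible-flow equations, inputs and weighting are not reproduced here.

import tensorflow as tf

dt, lam = 0.01, 0.1   # assumed time step and physics-loss weight

def pinn_loss(y_true, y_pred):
    # Data misfit plus a penalty on the residual of du/dt + u = 0 (toy physics term)
    data_loss = tf.reduce_mean(tf.square(y_true - y_pred))
    du_dt = (y_pred[:, 1:, :] - y_pred[:, :-1, :]) / dt   # finite difference in time
    residual = du_dt + y_pred[:, :-1, :]
    physics_loss = tf.reduce_mean(tf.square(residual))
    return data_loss + lam * physics_loss

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(50, 4)),                  # assumed sequence length and features
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])
model.compile(optimizer="adam", loss=pinn_loss)
model.summary()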

Reports on the topic "LSTM Neural networks"

1

Cárdenas-Cárdenas, Julián Alonso, Deicy J. Cristiano-Botia, and Nicolás Martínez-Cortés. Colombian inflation forecast using Long Short-Term Memory approach. Banco de la República, 2023. http://dx.doi.org/10.32468/be.1241.

Abstract:
We use Long Short Term Memory (LSTM) neural networks, a deep learning technique, to forecast Colombian headline inflation one year ahead through two approaches. The first one uses only information from the target variable, while the second one incorporates additional information from some relevant variables. We employ sample rolling to the traditional neural network construction process, selecting the hyperparameters with criteria for minimizing the forecast error. Our results show a better forecasting capacity of the network with information from additional variables, surpassing both the ot
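The 'sample rolling' described here re-estimates the network on a moving window and forecasts twelve months ahead from each cutoff. A schematic version of that evaluation loop with a univariate LSTM is sketched below; the placeholder series, the 24-month input window and the cutoff grid are assumptions, and the report's second approach additionally feeds in explanatory variables.

import numpy as np
import tensorflow as tf

series = np.random.rand(240).astype("float32")   # placeholder monthly inflation series
window, horizon = 24, 12                         # assumed: 24-month inputs, one-year-ahead target

def make_supervised(y):
    X, t = [], []
    for i in range(len(y) - window - horizon + 1):
        X.append(y[i:i + window])
        t.append(y[i + window + horizon - 1])    # value 12 months after the input window
    return np.array(X)[..., None], np.array(t)

errors = []
for cutoff in range(180, 240 - horizon, 12):     # rolling forecast origins
    X_train, y_train = make_supervised(series[:cutoff])
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, 1)),
        tf.keras.layers.LSTM(32),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=10, verbose=0)
    x_last = series[cutoff - window:cutoff][None, :, None]
    forecast = model.predict(x_last, verbose=0)[0, 0]
    errors.append(abs(forecast - series[cutoff + horizon - 1]))

print("mean absolute one-year-ahead error:", np.mean(errors))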
2

Ankel, Victoria, Stella Pantopoulou, Matthew Weathered, Darius Lisowski, Anthonie Cilliers, and Alexander Heifetz. One-Step Ahead Prediction of Thermal Mixing Tee Sensors with Long Short Term Memory (LSTM) Neural Networks. Office of Scientific and Technical Information (OSTI), 2020. http://dx.doi.org/10.2172/1760289.
