Academic literature on the topic 'Recurrent Neural Network architecture'
Generate an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Recurrent Neural Network architecture.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Recurrent Neural Network architecture"
Back, Andrew D., and Ah Chung Tsoi. "A Low-Sensitivity Recurrent Neural Network." Neural Computation 10, no. 1 (January 1, 1998): 165–88. http://dx.doi.org/10.1162/089976698300017935.
Паршин, А. И., М. Н. Аралов, В. Ф. Барабанов, and Н. И. Гребенникова. "RANDOM MULTI-MODAL DEEP LEARNING IN THE PROBLEM OF IMAGE RECOGNITION." ВЕСТНИК ВОРОНЕЖСКОГО ГОСУДАРСТВЕННОГО ТЕХНИЧЕСКОГО УНИВЕРСИТЕТА, no. 4 (October 20, 2021): 21–26. http://dx.doi.org/10.36622/vstu.2021.17.4.003.
Gallicchio, Claudio, and Alessio Micheli. "Fast and Deep Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3898–905. http://dx.doi.org/10.1609/aaai.v34i04.5803.
Vassiliadis, Stamatis, Gerald G. Pechanek, and José G. Delgado-Frias. "SPIN: THE SEQUENTIAL PIPELINED NEUROEMULATOR." International Journal on Artificial Intelligence Tools 02, no. 01 (March 1993): 117–32. http://dx.doi.org/10.1142/s0218213093000084.
Uzdiaev, M. Yu, R. N. Iakovlev, D. M. Dudarenko, and A. D. Zhebrun. "Identification of a Person by Gait in a Video Stream." Proceedings of the Southwest State University 24, no. 4 (February 4, 2021): 57–75. http://dx.doi.org/10.21869/2223-1560-2020-24-4-57-75.
Kalinin, Maxim, Vasiliy Krundyshev, and Evgeny Zubkov. "Estimation of applicability of modern neural network methods for preventing cyberthreats to self-organizing network infrastructures of digital economy platforms." SHS Web of Conferences 44 (2018): 00044. http://dx.doi.org/10.1051/shsconf/20184400044.
Caniago, Afif Ilham, Wilis Kaswidjanti, and Juwairiah Juwairiah. "Recurrent Neural Network With Gate Recurrent Unit For Stock Price Prediction." Telematika 18, no. 3 (October 31, 2021): 345. http://dx.doi.org/10.31315/telematika.v18i3.6650.
Telmoudi, Achraf Jabeur, Hatem Tlijani, Lotfi Nabli, Maaruf Ali, and Radhi M'hiri. "A NEW RBF NEURAL NETWORK FOR PREDICTION IN INDUSTRIAL CONTROL." International Journal of Information Technology & Decision Making 11, no. 04 (July 2012): 749–75. http://dx.doi.org/10.1142/s0219622012500198.
Ziemke, Tom. "Radar Image Segmentation Using Self-Adapting Recurrent Networks." International Journal of Neural Systems 08, no. 01 (February 1997): 47–54. http://dx.doi.org/10.1142/s0129065797000070.
K, Karthika, Tejashree K, Naveen Rajan M., and Namita R. "Towards Strong AI with Analog Neural Chips." International Journal of Innovative Research in Advanced Engineering 10, no. 06 (June 23, 2023): 394–99. http://dx.doi.org/10.26562/ijirae.2023.v1006.28.
Dissertations / Theses on the topic "Recurrent Neural Network architecture"
Bopaiah, Jeevith. "A recurrent neural network architecture for biomedical event trigger classification." UKnowledge, 2018. https://uknowledge.uky.edu/cs_etds/73.
Pan, YaDung. "Fuzzy adaptive recurrent counterpropagation neural networks: A neural network architecture for qualitative modeling and real-time simulation of dynamic processes." Diss., The University of Arizona, 1995. http://hdl.handle.net/10150/187101.
Hanson, Jack. "Protein Structure Prediction by Recurrent and Convolutional Deep Neural Network Architectures." Thesis, Griffith University, 2018. http://hdl.handle.net/10072/382722.
Full textThesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Eng & Built Env
Science, Environment, Engineering and Technology
Full Text
Silfa, Franyell. "Energy-efficient architectures for recurrent neural networks." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/671448.
Deep learning algorithms have achieved remarkable success in applications such as automatic speech recognition and machine translation. As a result, these applications are ubiquitous in our lives and run on a great number of devices. These algorithms are built from Deep Neural Networks (DNNs), such as Convolutional Neural Networks and Recurrent Neural Networks (RNNs), which involve a large number of parameters and computations; deploying DNNs on mobile devices and servers is therefore challenging due to their memory and energy requirements. RNNs are used to solve sequence-to-sequence problems such as machine translation. They contain data dependencies between the executions of consecutive time steps, so the available parallelism is limited, which makes energy-efficient RNN evaluation a challenge. This thesis studies RNNs in order to improve their energy efficiency on specialized architectures. To this end, we propose energy-saving techniques and highly efficient architectures tailored to RNN evaluation. First, we characterize a set of RNNs running on a SoC and identify that memory accesses to read the weights are the largest source of energy consumption, accounting for up to 80%. We therefore propose E-PUR, a processing unit for RNNs. E-PUR achieves a 6.8x speedup and improves energy consumption by 88x compared with the SoC; these gains come from maximizing the temporal locality of the weights. In E-PUR, reading the weights remains the largest source of energy consumption, so we next focus on reducing memory accesses and propose a scheme that reuses previously computed results.
The key observation is that, when evaluating the input sequences of an RNN, the output of a given neuron tends to change only slightly between consecutive evaluations. We therefore devise a scheme that caches the neurons' outputs and reuses them whenever it detects a small change between the current and previous output values, which avoids reading the weights. To decide when to reuse a previous computation, we employ a Binarized Neural Network (BNN) as a reuse predictor, since its output is highly correlated with the output of the RNN. This proposal avoids more than 24.2% of the computations and reduces average energy consumption by 18.5%. The memory footprint of RNN models is usually reduced by using low precision for evaluating and storing the weights. In that case, the minimum precision is identified statically and set so that the RNN maintains its accuracy, and the same precision is typically used for all computations. However, we observe that some computations can be evaluated at a lower precision without affecting accuracy, so we devise a technique that dynamically selects the precision used to compute each time step. One challenge of this proposal is how to choose a lower precision; we address it by recognizing that the result of a previous evaluation can be used to determine the precision required in the current time step. Our scheme evaluates 57% of the computations at a lower precision than the fixed precision employed by static methods. Finally, evaluation on E-PUR shows a 1.46x speedup with an average energy saving of 19.2%.
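The output-reuse idea described in this abstract can be sketched in a few lines of NumPy. Note that the tanh cell, the 0.05 threshold, and the simple magnitude test standing in for the thesis's BNN reuse predictor are illustrative assumptions, not details taken from the dissertation:

```python
import numpy as np

def cached_rnn_step(x, h_prev, W_x, W_h, b, cache, threshold=0.05):
    """One RNN time step with neuron-output reuse.

    If a neuron's pre-activation changes little relative to the cached
    value, its cached activation is reused and the costly recomputation
    is skipped. In the actual hardware scheme the reuse decision is made
    *before* fetching the weight rows, which is where the energy saving
    comes from; here a simple magnitude test stands in for the BNN
    reuse predictor described in the thesis.
    """
    z = W_x @ x + W_h @ h_prev + b
    reuse = np.abs(z - cache["z"]) < threshold   # small change -> reuse
    h = np.where(reuse, cache["h"], np.tanh(z))  # recompute only the rest
    # Update the cache only for recomputed neurons.
    cache["z"] = np.where(reuse, cache["z"], z)
    cache["h"] = h
    return h, reuse

rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W_x = rng.standard_normal((n_hid, n_in)) * 0.1
W_h = rng.standard_normal((n_hid, n_hid)) * 0.1
b = np.zeros(n_hid)
cache = {"z": np.zeros(n_hid), "h": np.zeros(n_hid)}

h = np.zeros(n_hid)
for t in range(5):
    x = rng.standard_normal(n_in) * 0.01  # slowly varying input
    h, reuse = cached_rnn_step(x, h, W_x, W_h, b, cache)
print("reused neurons at last step:", int(reuse.sum()), "of", n_hid)
```

With slowly varying inputs most neurons pass the reuse test, which mirrors the thesis's finding that a large fraction of computations can be skipped when consecutive outputs are close.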
Мельникова, І. К. "Інтелектуальна технологія прогнозу курсу криптовалют методом рекурентних нейромереж" [Intelligent technology for cryptocurrency exchange-rate forecasting using recurrent neural networks]. Master's thesis, Sumy State University, 2019. http://essuir.sumdu.edu.ua/handle/123456789/76753.
Tekin, Mim Kemal. "Vehicle Path Prediction Using Recurrent Neural Network." Thesis, Linköpings universitet, Statistik och maskininlärning, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-166134.
Sivakumar, Shyamala C. "Architectures and algorithms for stable and constructive learning in discrete time recurrent neural networks." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/NQ31533.pdf.
Wen, Tsung-Hsien. "Recurrent neural network language generation for dialogue systems." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/275648.
Melidis, Christos. "Adaptive neural architectures for intuitive robot control." Thesis, University of Plymouth, 2017. http://hdl.handle.net/10026.1/9998.
He, Jian. "Adaptive power system stabilizer based on recurrent neural network." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0008/NQ38471.pdf.
Full textBooks on the topic "Recurrent Neural Network architecture"
Dayhoff, Judith E. Neural network architectures: An introduction. New York, N.Y.: Van Nostrand Reinhold, 1990.
Leondes, Cornelius T., ed. Neural network systems, techniques, and applications. San Diego: Academic Press, 1998.
Jain, L. C., and R. P. Johnson, eds. Automatic generation of neural network architecture using evolutionary computation. Singapore: World Scientific, 1997.
Cios, Krzysztof J. Self-growing neural network architecture using crisp and fuzzy entropy. [Washington, DC]: National Aeronautics and Space Administration, 1992.
United States. National Aeronautics and Space Administration, ed. A neural network architecture for implementation of expert systems for real time monitoring. [Cincinnati, Ohio]: University of Cincinnati, College of Engineering, 1991.
Lim, Chee Peng. Probabilistic fuzzy ARTMAP: An autonomous neural network architecture for Bayesian probability estimation. Sheffield: University of Sheffield, Dept. of Automatic Control & Systems Engineering, 1995.
United States. National Aeronautics and Space Administration, ed. A novel approach to noise-filtering based on a gain-scheduling neural network architecture. [Washington, DC]: National Aeronautics and Space Administration, 1994.
Find full textBook chapters on the topic "Recurrent Neural Network architecture"
Salem, Fathi M. "Network Architectures." In Recurrent Neural Networks, 3–19. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89929-5_1.
Bianchi, Filippo Maria, Enrico Maiorino, Michael C. Kampffmeyer, Antonello Rizzi, and Robert Jenssen. "Recurrent Neural Network Architectures." In SpringerBriefs in Computer Science, 23–29. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70338-1_3.
Wüthrich, Mario V., and Michael Merz. "Recurrent Neural Networks." In Springer Actuarial, 381–406. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_8.
Selvanathan, N., and Mashkuri Hj Yaacob. "Canonical form of Recurrent Neural Network Architecture." In Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, 796. London: CRC Press, 2022. http://dx.doi.org/10.1201/9780429332111-154.
Rawal, Aditya, Jason Liang, and Risto Miikkulainen. "Discovering Gated Recurrent Neural Network Architectures." In Natural Computing Series, 233–51. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-3685-4_9.
Tsoi, Ah Chung. "Recurrent neural network architectures: An overview." In Adaptive Processing of Sequences and Data Structures, 1–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0053993.
Modrzejewski, Mateusz, Konrad Bereda, and Przemysław Rokita. "Efficient Recurrent Neural Network Architecture for Musical Style Transfer." In Artificial Intelligence and Soft Computing, 124–32. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-87986-0_11.
Liao, Yongbo, Hongmei Li, and Zongbo Wang. "FPGA Based Real-Time Processing Architecture for Recurrent Neural Network." In Advances in Intelligent Systems and Computing, 705–9. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-69096-4_99.
Cebrián, Pedro L., Alberto Martín-Pérez, Manuel Villa, Jaime Sancho, Gonzalo Rosa, Guillermo Vazquez, Pallab Sutradhar, et al. "Deep Recurrent Neural Network Performing Spectral Recurrence on Hyperspectral Images for Brain Tissue Classification." In Design and Architecture for Signal and Image Processing, 15–27. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-29970-4_2.
Full textConference papers on the topic "Recurrent Neural Network architecture"
Gao, Yang, Hong Yang, Peng Zhang, Chuan Zhou, and Yue Hu. "Graph Neural Architecture Search." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/195.
Fourati, Rahma, Chaouki Aouiti, and Adel M. Alimi. "Improved recurrent neural network architecture for SVM learning." In 2015 15th International Conference on Intelligent Systems Design and Applications (ISDA). IEEE, 2015. http://dx.doi.org/10.1109/isda.2015.7489221.
Maulik, Romit, Romain Egele, Bethany Lusch, and Prasanna Balaprakash. "Recurrent Neural Network Architecture Search for Geophysical Emulation." In SC20: International Conference for High Performance Computing, Networking, Storage and Analysis. IEEE, 2020. http://dx.doi.org/10.1109/sc41405.2020.00012.
Kumar, Swaraj, Vishal Murgai, Devashish Singh, and Issaac Kommeneni. "Recurrent Neural Network Architecture for Communication Log Analysis." In 2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC). IEEE, 2022. http://dx.doi.org/10.1109/pimrc54779.2022.9978048.
Khaliq, Shaik Abdul, and Mohamed Asan Basiri M. "An Efficient VLSI Architecture of Recurrent Neural Network." In 2022 IEEE 6th Conference on Information and Communication Technology (CICT). IEEE, 2022. http://dx.doi.org/10.1109/cict56698.2022.9997812.
Basu, Saikat, Jaybrata Chakraborty, and Md Aftabuddin. "Emotion recognition from speech using convolutional neural network with recurrent neural network architecture." In 2017 2nd International Conference on Communication and Electronics Systems (ICCES). IEEE, 2017. http://dx.doi.org/10.1109/cesys.2017.8321292.
Chakraborty, Bappaditya, Partha Sarathi Mukherjee, and Ujjwal Bhattacharya. "Bangla online handwriting recognition using recurrent neural network architecture." In Proceedings of the Tenth Indian Conference on Computer Vision, Graphics and Image Processing. New York, New York, USA: ACM Press, 2016. http://dx.doi.org/10.1145/3009977.3010072.
Song, Hee-Heon, Sun-Mee Kang, and Seong-Whan Lee. "A new recurrent neural network architecture for pattern recognition." In Proceedings of 13th International Conference on Pattern Recognition. IEEE, 1996. http://dx.doi.org/10.1109/icpr.1996.547658.
Elsayed, Nelly, Zag ElSayed, and Anthony S. Maida. "LiteLSTM Architecture for Deep Recurrent Neural Networks." In 2022 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2022. http://dx.doi.org/10.1109/iscas48785.2022.9937585.
Zhao, Yi, Yanyan Shen, and Junjie Yao. "Recurrent Neural Network for Text Classification with Hierarchical Multiscale Dense Connections." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/757.
Reports on the topic "Recurrent Neural Network architecture"
Barto, Andrew. Adaptive Neural Network Architecture. Fort Belvoir, VA: Defense Technical Information Center, October 1987. http://dx.doi.org/10.21236/ada190114.
McDonnell, John R., and Don Waagen. Evolving Neural Network Architecture. Fort Belvoir, VA: Defense Technical Information Center, March 1993. http://dx.doi.org/10.21236/ada264802.
Brabel, Michael J. Basin Sculpting a Hybrid Recurrent Feedforward Neural Network. Fort Belvoir, VA: Defense Technical Information Center, January 1998. http://dx.doi.org/10.21236/ada336386.
Basu, Anujit. Nuclear power plant status diagnostics using a neural network with dynamic node architecture. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/10139977.
Basu, A. Nuclear power plant status diagnostics using a neural network with dynamic node architecture. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/6649091.
Bodruzzaman, M., and M. A. Essawy. Iterative prediction of chaotic time series using a recurrent neural network. Quarterly progress report, January 1, 1995--March 31, 1995. Office of Scientific and Technical Information (OSTI), March 1996. http://dx.doi.org/10.2172/283610.
Kirichek, Galina, Vladyslav Harkusha, Artur Timenko, and Nataliia Kulykovska. System for detecting network anomalies using a hybrid of an uncontrolled and controlled neural network. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3743.
Pettit, Chris, and D. Wilson. A physics-informed neural network for sound propagation in the atmospheric boundary layer. Engineer Research and Development Center (U.S.), June 2021. http://dx.doi.org/10.21079/11681/41034.
Mohanty, Subhasish, and Joseph Listwan. Development of Digital Twin Predictive Model for PWR Components: Updates on Multi Times Series Temperature Prediction Using Recurrent Neural Network, DMW Fatigue Tests, System Level Thermal-Mechanical-Stress Analysis. Office of Scientific and Technical Information (OSTI), September 2021. http://dx.doi.org/10.2172/1822853.
Markova, Oksana, Serhiy Semerikov, and Maiia Popel. CoCalc as a Learning Tool for Neural Network Simulation in the Special Course "Foundations of Mathematic Informatics". Sun SITE Central Europe, May 2018. http://dx.doi.org/10.31812/0564/2250.