Academic literature on the topic 'Reservoir computing networks'
Below are lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'Reservoir computing networks.'
Journal articles on the topic "Reservoir computing networks"
Van der Sande, Guy, Daniel Brunner, and Miguel C. Soriano. "Advances in photonic reservoir computing." Nanophotonics 6, no. 3 (May 12, 2017): 561–76. http://dx.doi.org/10.1515/nanoph-2016-0132.
Ghosh, Sanjib, Kohei Nakajima, Tanjung Krisnanda, Keisuke Fujii, and Timothy C. H. Liew. "Quantum Neuromorphic Computing with Reservoir Computing Networks." Advanced Quantum Technologies 4, no. 9 (July 9, 2021): 2100053. http://dx.doi.org/10.1002/qute.202100053.
Röhm, André, Lina Jaurigue, and Kathy Lüdge. "Reservoir Computing Using Laser Networks." IEEE Journal of Selected Topics in Quantum Electronics 26, no. 1 (January 2020): 1–8. http://dx.doi.org/10.1109/jstqe.2019.2927578.
Damicelli, Fabrizio, Claus C. Hilgetag, and Alexandros Goulas. "Brain connectivity meets reservoir computing." PLOS Computational Biology 18, no. 11 (November 16, 2022): e1010639. http://dx.doi.org/10.1371/journal.pcbi.1010639.
Antonik, Piotr, Serge Massar, and Guy Van Der Sande. "Photonic reservoir computing using delay dynamical systems." Photoniques, no. 104 (September 2020): 45–48. http://dx.doi.org/10.1051/photon/202010445.
Tran, Dat, and Christof Teuscher. "Computational Capacity of Complex Memcapacitive Networks." ACM Journal on Emerging Technologies in Computing Systems 17, no. 2 (April 2021): 1–25. http://dx.doi.org/10.1145/3445795.
Senn, Christoph Walter, and Itsuo Kumazawa. "Abstract Reservoir Computing." AI 3, no. 1 (March 10, 2022): 194–210. http://dx.doi.org/10.3390/ai3010012.
Hart, Joseph D., Laurent Larger, Thomas E. Murphy, and Rajarshi Roy. "Delayed dynamical systems: networks, chimeras and reservoir computing." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 377, no. 2153 (July 22, 2019): 20180123. http://dx.doi.org/10.1098/rsta.2018.0123.
Pathak, Shantanu S., and D. Rajeswara Rao. "Reservoir Computing for Healthcare Analytics." International Journal of Engineering & Technology 7, no. 2.32 (May 31, 2018): 240. http://dx.doi.org/10.14419/ijet.v7i2.32.15576.
Andrecut, M. "Reservoir computing on the hypersphere." International Journal of Modern Physics C 28, no. 07 (July 2017): 1750095. http://dx.doi.org/10.1142/s0129183117500954.
Dissertations / Theses on the topic "Reservoir computing networks"
Fu, Kaiwei. "Reservoir Computing with Neuro-memristive Nanowire Networks." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/25900.
Kulkarni, Manjari S. "Memristor-based Reservoir Computing." PDXScholar, 2012. https://pdxscholar.library.pdx.edu/open_access_etds/899.
Canaday, Daniel M. "Modeling and Control of Dynamical Systems with Reservoir Computing." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu157469471458874.
Dai, Jing. "Reservoir-computing-based, biologically inspired artificial neural networks and their applications in power systems." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/47646.
Full textAlomar, Barceló Miquel Lleó. "Methodologies for hardware implementation of reservoir computing systems." Doctoral thesis, Universitat de les Illes Balears, 2017. http://hdl.handle.net/10803/565422.
Inspired by the way the brain processes information, artificial neural networks (ANNs) were created with the aim of reproducing human capabilities in tasks that are hard to solve using classical algorithmic programming. The ANN paradigm has been applied to numerous fields of science and engineering thanks to its ability to learn from examples, its adaptability, parallelism, and fault tolerance. Reservoir computing (RC), based on the use of a random recurrent neural network (RNN) as the processing core, is a powerful model that is highly suited to time-series processing. Hardware realizations of ANNs are crucial to exploit the parallel properties of these models, which favor higher speed and reliability. In addition, hardware neural networks (HNNs) may offer appreciable advantages in terms of power consumption and cost. Low-cost compact devices implementing HNNs are useful for supporting or replacing software in real-time applications such as control, medical monitoring, robotics, and sensor networks. However, the hardware realization of ANNs with large neuron counts, as in RC, is a challenging task due to the large resource requirements of the operations involved. Despite the potential benefits of digital hardware circuits for RC-based neural processing, most implementations are realized in software on sequential processors. In this thesis, I propose and analyze several methodologies for the digital implementation of RC systems using limited hardware resources. The neural network design is described in detail for both a conventional implementation and the various alternative approaches. The advantages and shortcomings of these techniques with regard to accuracy, computation speed, and required silicon area are discussed. Finally, the proposed approaches are applied to solving different real-life engineering problems.
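The abstract above describes the RC recipe: a fixed random recurrent reservoir driven by the input, plus a single trained linear readout. A minimal NumPy sketch of that idea follows; the reservoir size, spectral radius, washout length, and the sine-prediction task are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed random reservoir: input weights and recurrent weights are
# drawn once and never trained (assumed sizes, for illustration only).
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is below 1 (echo state property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u, collect states x[t]."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(500)
u = np.sin(0.1 * t)
X = run_reservoir(u)[:-1]   # reservoir states at time t
y = u[1:]                   # target: next input value
washout = 50                # discard the initial transient

# Only the linear readout is trained, via least squares.
W_out, *_ = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)
pred = X[washout:] @ W_out
mse = np.mean((pred - y[washout:]) ** 2)
```

Because only `W_out` is trained, training reduces to one linear regression, which is what makes RC attractive for the resource-limited hardware implementations the thesis targets.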
Vincent-Lamarre, Philippe. "Learning Long Temporal Sequences in Spiking Networks by Multiplexing Neural Oscillations." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39960.
Almassian, Amin. "Information Representation and Computation of Spike Trains in Reservoir Computing Systems with Spiking Neurons and Analog Neurons." PDXScholar, 2016. http://pdxscholar.library.pdx.edu/open_access_etds/2724.
Bazzanella, Davide. "Microring Based Neuromorphic Photonics." Doctoral thesis, Università degli studi di Trento, 2022. http://hdl.handle.net/11572/344624.
Full textRöhm, André [Verfasser], Kathy [Akademischer Betreuer] Lüdge, Kathy [Gutachter] Lügde, and Ingo [Gutachter] Fischer. "Symmetry-Breaking bifurcations and reservoir computing in regular oscillator networks / André Röhm ; Gutachter: Kathy Lügde, Ingo Fischer ; Betreuer: Kathy Lüdge." Berlin : Technische Universität Berlin, 2019. http://d-nb.info/1183789491/34.
Enel, Pierre. "Représentation dynamique dans le cortex préfrontal : comparaison entre reservoir computing et neurophysiologie du primate." PhD thesis, Université Claude Bernard - Lyon I, 2014. http://tel.archives-ouvertes.fr/tel-01056696.
Books on the topic "Reservoir computing networks"
Brunner, Daniel, Miguel C. Soriano, and Guy Van der Sande. Photonic Reservoir Computing: Optical Recurrent Neural Networks. Berlin: Walter de Gruyter, 2019.
Book chapters on the topic "Reservoir computing networks"
Sergio, Anderson Tenório, and Teresa B. Ludermir. "PSO for Reservoir Computing Optimization." In Artificial Neural Networks and Machine Learning – ICANN 2012, 685–92. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-33269-2_86.
Verstraeten, David, and Benjamin Schrauwen. "On the Quantification of Dynamics in Reservoir Computing." In Artificial Neural Networks – ICANN 2009, 985–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04274-4_101.
Atencia, Miguel, Claudio Gallicchio, Gonzalo Joya, and Alessio Micheli. "Time Series Clustering with Deep Reservoir Computing." In Artificial Neural Networks and Machine Learning – ICANN 2020, 482–93. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61616-8_39.
Colla, Valentina, Ismael Matino, Stefano Dettori, Silvia Cateni, and Ruben Matino. "Reservoir Computing Approaches Applied to Energy Management in Industry." In Engineering Applications of Neural Networks, 66–79. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20257-6_6.
Kobayashi, Taisuke. "Practical Fractional-Order Neuron Dynamics for Reservoir Computing." In Artificial Neural Networks and Machine Learning – ICANN 2018, 116–25. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01424-7_12.
Schliebs, Stefan, Nikola Kasabov, Dave Parry, and Doug Hunt. "Towards a Wearable Coach: Classifying Sports Activities with Reservoir Computing." In Engineering Applications of Neural Networks, 233–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41013-0_24.
Hinaut, Xavier, and Peter F. Dominey. "On-Line Processing of Grammatical Structure Using Reservoir Computing." In Artificial Neural Networks and Machine Learning – ICANN 2012, 596–603. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-33269-2_75.
Oyebode, Oluwaseun, and Josiah Adeyemo. "Reservoir Inflow Forecasting Using Differential Evolution Trained Neural Networks." In Advances in Intelligent Systems and Computing, 307–19. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07494-8_21.
Tietz, Stephan, Doreen Jirak, and Stefan Wermter. "A Reservoir Computing Framework for Continuous Gesture Recognition." In Artificial Neural Networks and Machine Learning – ICANN 2019: Workshop and Special Sessions, 7–18. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30493-5_1.
Locquet, Jean-Pierre. "Overview on the PHRESCO Project: PHotonic REServoir COmputing." In Artificial Neural Networks and Machine Learning – ICANN 2019: Workshop and Special Sessions, 149–55. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30493-5_14.
Conference papers on the topic "Reservoir computing networks"
Bacciu, Davide, Daniele Di Sarli, Pouria Faraji, Claudio Gallicchio, and Alessio Micheli. "Federated Reservoir Computing Neural Networks." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9534035.
Heroux, Jean Benoit, Hidetoshi Numata, Naoki Kanazawa, and Daiju Nakano. "Optoelectronic Reservoir Computing with VCSEL." In 2018 International Joint Conference on Neural Networks (IJCNN). IEEE, 2018. http://dx.doi.org/10.1109/ijcnn.2018.8489757.
Dong, Jonathan, Erik Borve, Mushegh Rafayelyan, and Michael Unser. "Asymptotic Stability in Reservoir Computing." In 2022 International Joint Conference on Neural Networks (IJCNN). IEEE, 2022. http://dx.doi.org/10.1109/ijcnn55064.2022.9892302.
Wyffels, Francis, Benjamin Schrauwen, David Verstraeten, and Dirk Stroobandt. "Band-pass Reservoir Computing." In 2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008 - Hong Kong). IEEE, 2008. http://dx.doi.org/10.1109/ijcnn.2008.4634252.
Fu, Kaiwei, Ruomin Zhu, Alon Loeffler, Joel Hochstetter, Adrian Diaz-Alvarez, Adam Stieg, James Gimzewski, Tomonobu Nakayama, and Zdenka Kuncic. "Reservoir Computing with Neuromemristive Nanowire Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9207727.
Mujal, Pere, Johannes Nokkala, Rodrigo Martinez-Peña, Jorge Garcia Beni, Gian Luca Giorgi, Miguel C. Cornelles-Soriano, and Roberta Zambrini. "Quantum reservoir computing in bosonic networks." In Emerging Topics in Artificial Intelligence (ETAI) 2021, edited by Giovanni Volpe, Joana B. Pereira, Daniel Brunner, and Aydogan Ozcan. SPIE, 2021. http://dx.doi.org/10.1117/12.2596177.
Gallicchio, Claudio. "Sparsity in Reservoir Computing Neural Networks." In 2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA). IEEE, 2020. http://dx.doi.org/10.1109/inista49547.2020.9194611.
Zhu, Ruomin, Sam Lilak, Alon Loeffler, Joseph Lizier, Adam Stieg, James Gimzewski, and Zdenka Kuncic. "Reservoir Computing with Neuromorphic Nanowire Networks." In Neuromorphic Materials, Devices, Circuits and Systems. València: FUNDACIO DE LA COMUNITAT VALENCIANA SCITO, 2023. http://dx.doi.org/10.29363/nanoge.neumatdecas.2023.055.
Röhm, André, and Kathy Lüdge. "Reservoir computing with delay in structured networks." In Neuro-inspired Photonic Computing, edited by Marc Sciamanna and Peter Bienstman. SPIE, 2018. http://dx.doi.org/10.1117/12.2307159.
Laporte, Floris, Joni Dambre, and Peter Bienstman. "Reservoir computing with signal-mixing cavities." In 2017 19th International Conference on Transparent Optical Networks (ICTON). IEEE, 2017. http://dx.doi.org/10.1109/icton.2017.8024990.