A selection of scientific literature on the topic "Neuromorphic computer systems"

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Choose a source type:

Browse the lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "Neuromorphic computer systems".

Next to every work in the reference list there is an "Add to bibliography" button. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Neuromorphic computer systems"

1

Dunham, Christopher S., Sam Lilak, Joel Hochstetter, Alon Loeffler, Ruomin Zhu, Charles Chase, Adam Z. Stieg, Zdenka Kuncic, and James K. Gimzewski. "Nanoscale neuromorphic networks and criticality: a perspective." Journal of Physics: Complexity 2, no. 4 (December 1, 2021): 042001. http://dx.doi.org/10.1088/2632-072x/ac3ad3.

Abstract:
Numerous studies suggest critical dynamics may play a role in information processing and task performance in biological systems. However, studying critical dynamics in these systems can be challenging due to many confounding biological variables that limit access to the physical processes underpinning critical dynamics. Here we offer a perspective on the use of abiotic, neuromorphic nanowire networks as a means to investigate critical dynamics in complex adaptive systems. Neuromorphic nanowire networks are composed of metallic nanowires and possess metal-insulator-metal junctions. These networks self-assemble into a highly interconnected, variable-density structure and exhibit nonlinear electrical switching properties and information processing capabilities. We highlight key dynamical characteristics observed in neuromorphic nanowire networks, including persistent fluctuations in conductivity with power law distributions, hysteresis, chaotic attractor dynamics, and avalanche criticality. We posit that neuromorphic nanowire networks can function effectively as tunable abiotic physical systems for studying critical dynamics and leveraging criticality for computation.
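Editor's note: the avalanche criticality mentioned in this abstract is usually assessed by fitting a power-law exponent to observed event sizes. The sketch below is not from the cited work; it uses synthetic avalanche sizes and the standard maximum-likelihood estimator for a continuous power law above an assumed cutoff s_min.

```python
import numpy as np

def powerlaw_exponent_mle(sizes, s_min=1.0):
    """Continuous MLE for the exponent of P(s) ~ s^(-alpha), s >= s_min."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    n = s.size
    alpha = 1.0 + n / np.sum(np.log(s / s_min))
    stderr = (alpha - 1.0) / np.sqrt(n)          # asymptotic standard error
    return alpha, stderr

# Synthetic "avalanche sizes" drawn from a power law via inverse-CDF sampling.
rng = np.random.default_rng(0)
true_alpha, s_min = 2.2, 1.0
u = rng.random(10_000)
sizes = s_min * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))

alpha_hat, err = powerlaw_exponent_mle(sizes, s_min)
print(f"estimated alpha = {alpha_hat:.3f} +/- {err:.3f} (true {true_alpha})")
```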
2

Ferreira de Lima, Thomas, Alexander N. Tait, Armin Mehrabian, Mitchell A. Nahmias, Chaoran Huang, Hsuan-Tung Peng, Bicky A. Marquez, et al. "Primer on silicon neuromorphic photonic processors: architecture and compiler." Nanophotonics 9, no. 13 (August 10, 2020): 4055–73. http://dx.doi.org/10.1515/nanoph-2020-0172.

Abstract:
Microelectronic computers have encountered challenges in meeting all of today’s demands for information processing. Meeting these demands will require the development of unconventional computers employing alternative processing models and new device physics. Neural network models have come to dominate modern machine learning algorithms, and specialized electronic hardware has been developed to implement them more efficiently. A silicon photonic integration industry promises to bring manufacturing ecosystems normally reserved for microelectronics to photonics. Photonic devices have already found simple analog signal processing niches where electronics cannot provide sufficient bandwidth and reconfigurability. In order to solve more complex information processing problems, they will have to adopt a processing model that generalizes and scales. Neuromorphic photonics aims to map physical models of optoelectronic systems to abstract models of neural networks. It represents a new opportunity for machine information processing on sub-nanosecond timescales, with application to mathematical programming, intelligent radio frequency signal processing, and real-time control. The strategy of neuromorphic engineering is to externalize the risk of developing computational theory alongside hardware. The strategy of remaining compatible with silicon photonics externalizes the risk of platform development. In this perspective article, we provide a rationale for a neuromorphic photonics processor, envisioning its architecture and a compiler. We also discuss how it can be interfaced with a general purpose computer, i.e. a CPU, as a coprocessor to target specific applications. This paper is intended for a wide audience and provides a roadmap for expanding research in the direction of transforming neuromorphic photonics into a viable and useful candidate for accelerating neuromorphic computing.
3

Jang, Taejin, Suhyeon Kim, Jeesoo Chang, Kyung Kyu Min, Sungmin Hwang, Kyungchul Park, Jong-Ho Lee, and Byung-Gook Park. "3D AND-Type Stacked Array for Neuromorphic Systems." Micromachines 11, no. 9 (August 31, 2020): 829. http://dx.doi.org/10.3390/mi11090829.

Abstract:
NOR/AND flash memory was studied in neuromorphic systems to perform vector-by-matrix multiplication (VMM) by summing the current. Because the size of NOR/AND cells exceeds that of other memristor synaptic devices, we proposed a 3D AND-type stacked array to reduce the cell size. Through a tilted implantation method, the conformal sources and drains of each cell could be formed, with confirmation by a technology computer-aided design (TCAD) simulation. In addition, the cell-to-cell variation due to the etch slope could be eliminated by controlling the deposition thickness of the cells. The suggested array can be beneficial in simple program/inhibit schemes given its use of Fowler–Nordheim (FN) tunneling because the drain lines and source lines are parallel. Therefore, the conductance of each synaptic device can be updated at a low power level.
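Editor's note: the vector-by-matrix multiplication described in this abstract reduces to Ohm's law and Kirchhoff's current law: each column current is the dot product of the input voltages with the programmed conductances. A minimal numerical sketch with assumed, purely illustrative conductance values (not from the cited paper):

```python
import numpy as np

# Input voltages applied to the rows (one per input line), in volts.
v_in = np.array([0.2, 0.0, 0.5, 0.3])

# Programmed cell conductances G[i, j] (siemens): synaptic weights stored
# as analog conductance states; the values here are illustrative only.
G = np.array([
    [1e-6, 5e-7, 2e-6],
    [8e-7, 1e-6, 1e-7],
    [3e-7, 2e-6, 9e-7],
    [1e-6, 1e-6, 5e-7],
])

# Each output line sums the currents of its column:
# I_j = sum_i G[i, j] * V_i  -- the analog dot product.
i_out = v_in @ G
print("column currents (A):", i_out)
```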
4

Liu, Te-Yuan, Ata Mahjoubfar, Daniel Prusinski, and Luis Stevens. "Neuromorphic computing for content-based image retrieval." PLOS ONE 17, no. 4 (April 6, 2022): e0264364. http://dx.doi.org/10.1371/journal.pone.0264364.

Abstract:
Neuromorphic computing mimics the neural activity of the brain through emulating spiking neural networks. In numerous machine learning tasks, neuromorphic chips are expected to provide superior solutions in terms of cost and power efficiency. Here, we explore the application of Loihi, a neuromorphic computing chip developed by Intel, for the computer vision task of image retrieval. We evaluated the functionalities and the performance metrics that are critical in content-based visual search and recommender systems using deep-learning embeddings. Our results show that the neuromorphic solution is about 2.5 times more energy-efficient compared with an ARM Cortex-A72 CPU and 12.5 times more energy-efficient compared with NVIDIA T4 GPU for inference by a lightweight convolutional neural network when batch size is 1 while maintaining the same level of matching accuracy. The study validates the potential of neuromorphic computing in low-power image retrieval, as a complementary paradigm to the existing von Neumann architectures.
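Editor's note: independently of the Loihi hardware, the retrieval task described above amounts to a nearest-neighbour search over deep-learning embeddings. A minimal sketch using cosine similarity and random placeholder embeddings (not the cited implementation):

```python
import numpy as np

def cosine_top_k(query, gallery, k=3):
    """Return indices and scores of the k gallery embeddings most similar to the query."""
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = g @ q                          # cosine similarity per gallery item
    order = np.argsort(-scores)[:k]
    return order, scores[order]

rng = np.random.default_rng(1)
gallery = rng.normal(size=(1000, 128))      # placeholder "image embeddings"
query = rng.normal(size=128)

idx, scores = cosine_top_k(query, gallery, k=3)
print("top matches:", idx, "similarities:", np.round(scores, 3))
```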
5

Bhat, Pranava. "Analysis of Neuromorphic Computing Systems and its Applications in Machine Learning." International Journal for Research in Applied Science and Engineering Technology 9, no. VI (June 30, 2021): 5309–12. http://dx.doi.org/10.22214/ijraset.2021.35601.

Abstract:
The domain of engineering has always taken inspiration from the biological world. Understanding the functionalities of the human brain has long been a key area of interest and has driven many advancements in the field of computing systems. The computational capability per unit power per unit volume of the human brain exceeds that of the current best supercomputers. Mimicking the physics of computation used by the nervous system and the brain can bring a paradigm shift to computing systems. The concept of bridging computing and neural systems can be termed neuromorphic computing, and it is bringing revolutionary changes to computing hardware. Neuromorphic computing systems have seen swift progress in the past decades. Many organizations have introduced a variety of designs, implementation methodologies, and prototype chips. This paper discusses the parameters that are considered in advanced neuromorphic computing systems and the tradeoffs between them. There have been attempts to build computer models of neurons, and advancements in hardware implementation are fuelling applications in the field of machine learning. This paper presents the applications of these modern computing systems in machine learning.
6

Choi, Hyun-Seok, Yu Jeong Park, Jong-Ho Lee, and Yoon Kim. "3-D Synapse Array Architecture Based on Charge-Trap Flash Memory for Neuromorphic Application." Electronics 9, no. 1 (December 30, 2019): 57. http://dx.doi.org/10.3390/electronics9010057.

Abstract:
In order to address a fundamental bottleneck of conventional digital computers, there has recently been a tremendous upsurge of investigations into hardware-based neuromorphic systems. To emulate the functionalities of artificial neural networks, various synaptic devices and their 2-D cross-point array structures have been proposed. In our previous work, we proposed a 3-D synapse array architecture based on a charge-trap flash (CTF) memory. It has the advantages of the high-density integration of 3-D stacking technology and the excellent reliability characteristics of mature CTF device technology. This paper examines some issues of the 3-D synapse array architecture. Also, we propose an improved structure and programming method compared to the previous work. The synaptic characteristics of the proposed method are closely examined and validated through a technology computer-aided design (TCAD) device simulation and a system-level simulation for a pattern recognition task. The proposed technology is a promising solution for high-performance, high-reliability neuromorphic hardware systems.
7

Varshika, M. Lakshmi, Federico Corradi, and Anup Das. "Nonvolatile Memories in Spiking Neural Network Architectures: Current and Emerging Trends." Electronics 11, no. 10 (May 18, 2022): 1610. http://dx.doi.org/10.3390/electronics11101610.

Abstract:
A sustainable computing scenario demands more energy-efficient processors. Neuromorphic systems mimic biological functions by employing spiking neural networks for achieving brain-like efficiency, speed, adaptability, and intelligence. Current trends in neuromorphic technologies address the challenges of investigating novel materials, systems, and architectures for enabling high-integration and extreme low-power brain-inspired computing. This review collects the most recent trends in exploiting the physical properties of nonvolatile memory technologies for implementing efficient in-memory and in-device computing with spike-based neuromorphic architectures.
8

Young, Aaron R., Mark E. Dean, James S. Plank, and Garrett S. Rose. "A Review of Spiking Neuromorphic Hardware Communication Systems." IEEE Access 7 (2019): 135606–20. http://dx.doi.org/10.1109/access.2019.2941772.

9

Chung, Jaeyong, Taehwan Shin, and Joon-Sung Yang. "Simplifying Deep Neural Networks for FPGA-Like Neuromorphic Systems." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 38, no. 11 (November 2019): 2032–42. http://dx.doi.org/10.1109/tcad.2018.2877016.

10

Kang, Yongshin, Joon-Sung Yang, and Jaeyong Chung. "Weight Partitioning for Dynamic Fixed-Point Neuromorphic Computing Systems." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 38, no. 11 (November 2019): 2167–71. http://dx.doi.org/10.1109/tcad.2018.2878167.


Dissertations on the topic "Neuromorphic computer systems"

1

Bieszczad, Andrzej. "Neuromorphic distributed general problem solvers." Dissertation, Department of Systems and Computer Engineering, Carleton University, Ottawa, 1996.

2

Nease, Stephen Howard. "Contributions to neuromorphic and reconfigurable circuits and systems." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/44923.

Abstract:
This thesis presents a body of work in the field of reconfigurable and neuromorphic circuits and systems. Three main projects were undertaken. The first was using a Field-Programmable Analog Array (FPAA) to model the cable behavior of dendrites using analog circuits. The second was to design, lay out, and test part of a new FPAA, the RASP 2.9v. The final project was to use floating-gate programming to remove offsets in a neuromorphic FPAA, the RASP Neuron 1D.
3

Azam, Md Ali. "Energy Efficient Spintronic Device for Neuromorphic Computation." VCU Scholars Compass, 2019. https://scholarscompass.vcu.edu/etd/6036.

Abstract:
Future computing will require significant development in new computing device paradigms. This is motivated by CMOS devices reaching their technological limits, the need for non-von Neumann architectures, as well as the energy constraints of wearable technologies and embedded processors. The first device proposal, an energy-efficient voltage-controlled domain wall device for implementing an artificial neuron and synapse, is analyzed using micromagnetic modeling. By controlling the domain wall motion utilizing spin transfer or spin orbit torques in association with voltage generated strain control of perpendicular magnetic anisotropy in the presence of Dzyaloshinskii-Moriya interaction (DMI), different positions of the domain wall are realized in the free layer of a magnetic tunnel junction to program different synaptic weights. Additionally, an artificial neuron can be realized by combining this DW device with a CMOS buffer. The second neuromorphic device proposal is inspired by the brain. The membrane potential of many neurons oscillates in a subthreshold damped fashion, and such neurons fire when excited by an input frequency that nearly equals their eigenfrequency. We investigate the theoretical implementation of such “resonate-and-fire” neurons by utilizing the magnetization dynamics of a fixed magnetic skyrmion-based free layer of a magnetic tunnel junction (MTJ). Voltage control of magnetic anisotropy or voltage generated strain results in expansion and shrinking of a skyrmion core that mimics the subthreshold oscillation. Finally, we show that such resonate-and-fire neurons have potential application in coupled nanomagnetic oscillator based associative memory arrays.
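Editor's note: the dissertation realizes "resonate-and-fire" behaviour with skyrmion magnetization dynamics; the sketch below only illustrates the abstract neuron model (a damped linear oscillator with a threshold, in the spirit of Izhikevich's resonate-and-fire neuron) with assumed parameters, not the device physics.

```python
import numpy as np

def resonate_and_fire(drive_freq_hz, f0_hz=10.0, damping=5.0,
                      dt=1e-4, t_end=2.0, amp=30.0, threshold=1.0):
    """Damped-oscillator neuron: spikes only when the drive is near its eigenfrequency f0."""
    omega0 = 2 * np.pi * f0_hz
    z = 0.0 + 0.0j                      # complex subthreshold state
    spikes = 0
    for n in range(int(t_end / dt)):
        t = n * dt
        current = amp * np.sin(2 * np.pi * drive_freq_hz * t)
        z += dt * ((-damping + 1j * omega0) * z + current)   # Euler step
        if z.imag > threshold:          # fire and reset on threshold crossing
            spikes += 1
            z = 0.0 + 0.0j
    return spikes

for f in (2.0, 10.0, 25.0):
    print(f"drive {f:5.1f} Hz -> {resonate_and_fire(f)} spikes")
```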
4

Smith, Paul Devon. "An Analog Architecture for Auditory Feature Extraction and Recognition." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/4839.

Abstract:
Speech recognition systems have been implemented using a wide range of signal processing techniques, including neuromorphic/biologically inspired and digital signal processing (DSP) techniques. Neuromorphic/biologically inspired techniques, such as silicon cochlea models, are based on fairly simple yet highly parallel computation and/or computational units, while the area of DSP is based on block transforms and statistical or error minimization methods. Essential to each of these techniques is the first stage of extracting meaningful information from the speech signal, which is known as feature extraction. This can be done using biologically inspired techniques such as silicon cochlea models, or techniques beginning with a model of speech production and then trying to separate the vocal tract response from an excitation signal. Even within each of these approaches, there are multiple techniques, including cepstrum filtering, which sits under the class of homomorphic signal processing, or techniques using FFT-based predictive approaches. The underlying reality is that multiple techniques have attacked the problem of speech recognition, but the problem is still far from being solved. The techniques that have shown the best recognition rates involve cepstrum coefficients for the feature extraction and hidden Markov models to perform the pattern recognition. The presented research develops an analog system based on programmable analog array technology that can perform the initial stages of auditory feature extraction and recognition before passing information to a digital signal processor. The goal is a low-power system that can be fully contained on one or more integrated circuit chips. Results show that it is possible to realize advanced filtering techniques such as cepstrum filtering and vector quantization in analog circuitry. Prior to this work, previous applications of analog signal processing have focused on vision, cochlea models, anti-aliasing filters, and other single-component uses. Furthermore, classic designs have relied heavily on op-amps as a basic core building block. This research also shows a novel design for a hidden Markov model (HMM) decoder utilizing circuits that take advantage of the inherent properties of subthreshold transistors and floating-gate technology to create low-power computational blocks.
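Editor's note: the cepstrum features mentioned above are conventionally computed digitally as the inverse transform of the log magnitude spectrum of a windowed frame; the dissertation realizes this processing in analog circuitry. A minimal digital sketch for comparison, with assumed frame length and sampling rate:

```python
import numpy as np

def real_cepstrum(frame, n_coeffs=13):
    """Real cepstrum of one windowed frame: IFFT of the log magnitude spectrum."""
    spectrum = np.fft.rfft(frame * np.hamming(len(frame)))
    log_mag = np.log(np.abs(spectrum) + 1e-12)    # small offset avoids log(0)
    cepstrum = np.fft.irfft(log_mag)
    return cepstrum[:n_coeffs]                    # keep the low-quefrency coefficients

fs = 8000
t = np.arange(400) / fs                           # one 50 ms frame
frame = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
print(np.round(real_cepstrum(frame), 3))
```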
5

Паржин, Юрій Володимирович. "Моделі і методи побудови архітектури і компонентів детекторних нейроморфних комп'ютерних систем". Thesis, НТУ "ХПІ", 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/34755.

Abstract:
Dissertation for the degree of Doctor of Technical Sciences in the specialty 05.13.05 – Computer systems and components. – National Technical University "Kharkiv Polytechnic Institute", Ministry of Education and Science of Ukraine, Kharkiv, 2018. The thesis is devoted to solving the problem of increasing the efficiency of building and using neuromorphic computer systems (NCS) as a result of developing models for constructing their components and a general architecture, as well as methods for their training based on the formalized detection principle. As a result of the analysis and classification of the architecture and components of the NCS, it is established that the connectionist paradigm for constructing artificial neural networks underlies all neural network implementations. The detector principle of constructing the architecture of the NCS and its components was substantiated and formalized, which is an alternative to the connectionist paradigm. This principle is based on the property of the binding of the elements of the input signal vector and the corresponding weighting coefficients of the NCS. On the basis of the detector principle, multi-segment threshold information models for the components of the detector NCS (DNCS): block-detectors, block-analyzers and a novelty block were developed. As a result of the developed method of counter training, these components form concepts that determine the necessary and sufficient conditions for the formation of reactions. The method of counter training of DNCS allows reducing the time of its training in solving practical problems of image recognition up to one epoch and reducing the dimension of the training sample. In addition, this method allows to solve the problem of stability-plasticity of DNCS memory and the problem of its overfitting based on self-organization of a map of block-detectors of a secondary level of information processing under the control of a novelty block. As a result of the research, a model of the network architecture of DNCS was developed, which consists of two layers of neuromorphic components of the primary and secondary levels of information processing, and which reduces the number of necessary components of the system. To substantiate the increase in the efficiency of constructing and using the NCS on the basis of the detector principle, software models were developed for automated monitoring and analysis of the external electromagnetic environment, as well as recognition of the manuscript figures of the MNIST database. The results of the study of these systems confirmed the correctness of the theoretical provisions of the dissertation and the high efficiency of the developed models and methods.
6

Ramakrishnan, Shubha. "A system design approach to neuromorphic classifiers." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/51718.

Abstract:
This work considers alternative strategies to mainstream digital approaches to signal processing - namely analog and neuromorphic solutions, for increased computing efficiency. In the context of a speech recognizer application, we use low-power analog approaches for the signal conditioning and basic auditory feature extraction, while using a neuromorphic IC for building a dendritic classifier that can be used as a low-power word spotter. In doing so, this work also aspires to posit the significance of dendrites in neural computation.
7

Паржин, Юрій Володимирович. "Моделі і методи побудови архітектури і компонентів детекторних нейроморфних комп'ютерних систем". Thesis, НТУ "ХПІ", 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/34756.

Abstract:
Dissertation for the degree of Doctor of Technical Sciences in the specialty 05.13.05 – Computer systems and components. – National Technical University "Kharkiv Polytechnic Institute", Ministry of Education and Science of Ukraine, Kharkiv, 2018. The thesis is devoted to solving the problem of increasing the efficiency of building and using neuromorphic computer systems (NCS) as a result of developing models for constructing their components and a general architecture, as well as methods for their training based on the formalized detection principle. As a result of the analysis and classification of the architecture and components of the NCS, it is established that the connectionist paradigm for constructing artificial neural networks underlies all neural network implementations. The detector principle of constructing the architecture of the NCS and its components was substantiated and formalized, which is an alternative to the connectionist paradigm. This principle is based on the property of the binding of the elements of the input signal vector and the corresponding weighting coefficients of the NCS. On the basis of the detector principle, multi-segment threshold information models for the components of the detector NCS (DNCS): block-detectors, block-analyzers and a novelty block were developed. As a result of the developed method of counter training, these components form concepts that determine the necessary and sufficient conditions for the formation of reactions. The method of counter training of DNCS allows reducing the time of its training in solving practical problems of image recognition up to one epoch and reducing the dimension of the training sample. In addition, this method allows to solve the problem of stability-plasticity of DNCS memory and the problem of its overfitting based on self-organization of a map of block-detectors of a secondary level of information processing under the control of a novelty block. As a result of the research, a model of the network architecture of DNCS was developed, which consists of two layers of neuromorphic components of the primary and secondary levels of information processing, and which reduces the number of necessary components of the system. To substantiate the increase in the efficiency of constructing and using the NCS on the basis of the detector principle, software models were developed for automated monitoring and analysis of the external electromagnetic environment, as well as recognition of the manuscript figures of the MNIST database. The results of the study of these systems confirmed the correctness of the theoretical provisions of the dissertation and the high efficiency of the developed models and methods.
8

Tully, Philip. "Spike-Based Bayesian-Hebbian Learning in Cortical and Subcortical Microcircuits." Doctoral thesis, KTH, Beräkningsvetenskap och beräkningsteknik (CST), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-205568.

Abstract:
Cortical and subcortical microcircuits are continuously modified throughout life. Despite ongoing changes these networks stubbornly maintain their functions, which persist although destabilizing synaptic and nonsynaptic mechanisms should ostensibly propel them towards runaway excitation or quiescence. What dynamical phenomena exist to act together to balance such learning with information processing? What types of activity patterns do they underpin, and how do these patterns relate to our perceptual experiences? What enables learning and memory operations to occur despite such massive and constant neural reorganization? Progress towards answering many of these questions can be pursued through large-scale neuronal simulations.    In this thesis, a Hebbian learning rule for spiking neurons inspired by statistical inference is introduced. The spike-based version of the Bayesian Confidence Propagation Neural Network (BCPNN) learning rule involves changes in both synaptic strengths and intrinsic neuronal currents. The model is motivated by molecular cascades whose functional outcomes are mapped onto biological mechanisms such as Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability. Temporally interacting memory traces enable spike-timing dependence, a stable learning regime that remains competitive, postsynaptic activity regulation, spike-based reinforcement learning and intrinsic graded persistent firing levels.    The thesis seeks to demonstrate how multiple interacting plasticity mechanisms can coordinate reinforcement, auto- and hetero-associative learning within large-scale, spiking, plastic neuronal networks. Spiking neural networks can represent information in the form of probability distributions, and a biophysical realization of Bayesian computation can help reconcile disparate experimental observations.


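Editor's note: the sketch below is a heavily simplified reading of the BCPNN idea described in this abstract (filtered activity and co-activity traces act as probability estimates whose log-ratio gives the synaptic weight and whose log gives the intrinsic bias); the trace structure and all parameters are assumptions for illustration, not the thesis implementation.

```python
import numpy as np

def bcpnn_weight(pre, post, dt=1e-3, tau_z=0.01, tau_p=1.0, eps=1e-4):
    """Simplified spike-based BCPNN: filtered traces -> probability estimates -> weight."""
    z_i = z_j = 0.0
    p_i = p_j = p_ij = eps
    for s_i, s_j in zip(pre, post):
        # fast synaptic traces of the 0/1 spike trains
        z_i += dt / tau_z * (s_i - z_i)
        z_j += dt / tau_z * (s_j - z_j)
        # slow probability traces of activity and co-activity
        p_i += dt / tau_p * (z_i - p_i)
        p_j += dt / tau_p * (z_j - p_j)
        p_ij += dt / tau_p * (z_i * z_j - p_ij)
    w = np.log((p_ij + eps**2) / ((p_i + eps) * (p_j + eps)))  # synaptic weight
    bias = np.log(p_j + eps)                                   # intrinsic excitability
    return w, bias

rng = np.random.default_rng(2)
T = 20000                                    # 20 s of 1 ms bins
pre = (rng.random(T) < 0.02).astype(float)
post = np.roll(pre, 3)                       # post reliably follows pre by 3 ms
w_corr, _ = bcpnn_weight(pre, post)
w_ind, _ = bcpnn_weight(pre, (rng.random(T) < 0.02).astype(float))
print(f"correlated pair: w = {w_corr:.2f}, independent pair: w = {w_ind:.2f}")
```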
9

Brink, Stephen Isaac. "Learning in silicon: a floating-gate based, biophysically inspired, neuromorphic hardware system with synaptic plasticity." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/50143.

Abstract:
The goal of neuromorphic engineering is to create electronic systems that model the behavior of biological neural systems. Neuromorphic systems can leverage a combination of analog and digital circuit design techniques to enable computational modeling, with orders of magnitude of reduction in size, weight, and power consumption compared to the traditional modeling approach based upon numerical integration. These benefits of neuromorphic modeling have the potential to facilitate neural modeling in resource-constrained research environments. Moreover, they will make it practical to use neural computation in the design of intelligent machines, including portable, battery-powered, and energy harvesting applications. Floating-gate transistor technology is a powerful tool for neuromorphic engineering because it allows dense implementation of synapses with nonvolatile storage of synaptic weights, cancellation of process mismatch, and reconfigurable system design. A novel neuromorphic hardware system, featuring compact and efficient channel-based model neurons and floating-gate transistor synapses, was developed. This system was used to model a variety of network topologies with up to 100 neurons. The networks were shown to possess computational capabilities such as spatio-temporal pattern generation and recognition, winner-take-all competition, bistable activity implementing a "volatile memory", and wavefront-based robotic path planning. Some canonical features of synaptic plasticity, such as potentiation of high frequency inputs and potentiation of correlated inputs in the presence of uncorrelated noise, were demonstrated. Preliminary results regarding formation of receptive fields were obtained. Several advances in enabling technologies, including methods for floating-gate transistor array programming, and the creation of a reconfigurable system for studying adaptation in floating-gate transistor circuits, were made.
10

Rajamanikkam, Chidhambaranathan. "Understanding Security Threats of Emerging Computing Architectures and Mitigating Performance Bottlenecks of On-Chip Interconnects in Manycore NTC System." DigitalCommons@USU, 2019. https://digitalcommons.usu.edu/etd/7453.

Abstract:
Emerging computing architectures, such as neuromorphic computing and third-party intellectual property (3PIP) cores, have attracted significant attention in the recent past. Neuromorphic computing introduces an unorthodox non-von Neumann architecture that mimics the abstract behavior of neuron activity in the human brain. Such architectures can execute complex applications, such as image processing and object recognition, more efficiently in terms of performance and energy than traditional microprocessors. However, the hardware security aspects of neuromorphic computing have received little attention at this nascent stage. 3PIP cores, on the other hand, can contain covertly inserted malicious functional behavior that can inflict a range of harms at the system/application levels. This dissertation examines the impact of various threat models that emerge from neuromorphic architectures and 3PIP cores. Near-threshold computing (NTC) serves as an energy-efficient paradigm by aggressively operating all computing resources with a supply voltage close to the threshold voltage, at the cost of performance. Therefore, a super-threshold computing (STC) system is scaled to a many-core NTC system to reclaim the lost performance. However, the interconnect performance in a many-core NTC system poses a significant bottleneck that hinders overall system performance. This dissertation analyzes the interconnect performance and, further, proposes a novel technique to boost the interconnect performance of many-core NTC systems.

Books on the topic "Neuromorphic computer systems"

1

Landolt, Oliver. Place Coding in Analog VLSI: A Neuromorphic Approach to Computation. Boston, MA: Springer US, 1998.

2

Liu, Shih-Chii, Giacomo Indiveri, Rodney Douglas, Tobi Delbruck, and Adrian Whatley. Event-Based Neuromorphic Systems. John Wiley & Sons, Incorporated, 2014.

3

Liu, Shih-Chii, Giacomo Indiveri, Rodney Douglas, Tobi Delbruck, and Adrian Whatley. Event-Based Neuromorphic Systems. John Wiley & Sons, Incorporated, 2014.

4

Smith, Leslie S., and Alister Hamilton, eds. Neuromorphic Systems: Engineering Silicon from Neurobiology. Proceedings of the 1st European Workshop on Neuromorphic Systems (University of Stirling, 1997). Singapore: World Scientific, 1998.

5

Lande, Tor Sverre. Neuromorphic Systems Engineering: Neural Networks In Silicon. Springer, 2013.

6

Lande, Tor Sverre, ed. Neuromorphic Systems Engineering: Neural Networks in Silicon. Boston: Kluwer Academic, 1998.

7

Liu, S. C. Neuromorphic and Bio-Inspired Engineered Systems. John Wiley and Sons Ltd, 2007.

8

European Workshop on Neuromorphic Systems 1997 (University of Stirling). Neuromorphic Systems: Engineering Silicon from Neurobiology (Progress in Neural Processing, 10). World Scientific Publishing Company, 1998.

9

Lande, Tor Sverre. Neuromorphic Systems Engineering: Neural Networks in Silicon (The International Series in Engineering and Computer Science). Springer, 1998.

10

The Making of a Neuromorphic Visual System. Springer, 2004.


Book chapters on the topic "Neuromorphic computer systems"

1

Carboni, Roberto. "Characterization and Modeling of Spin-Transfer Torque (STT) Magnetic Memory for Computing Applications." In Special Topics in Information Technology, 51–62. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-62476-7_5.

Abstract:
With the ubiquitous diffusion of mobile computing and Internet of Things (IoT), the amount of data exchanged and processed over the internet is increasing every day, demanding secure data communication/storage and new computing primitives. Although computing systems based on microelectronics steadily improved over the past 50 years thanks to the aggressive technological scaling, their improvement is now hindered by excessive power consumption and inherent performance limitation associated to the conventional computer architecture (von Neumann bottleneck). In this scenario, emerging memory technologies are gaining interest thanks to their non-volatility and low power/fast operation. In this chapter, experimental characterization and modeling of spin-transfer torque magnetic memory (STT-MRAM) are presented, with particular focus on cycling endurance and switching variability, which both present a challenge towards STT-based memory applications. Then, the switching variability in STT-MRAM is exploited for hardware security and computing primitives, such as true-random number generator (TRNG) and stochastic spiking neuron for neuromorphic and stochastic computing.
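Editor's note: a true-random number generator based on stochastic switching, as mentioned in this abstract, can be modelled at a high level as a biased Bernoulli bit source followed by debiasing; the switching probability below is an arbitrary assumption, and the chapter's actual circuit is not reproduced here.

```python
import numpy as np

def stochastic_switch_bits(n, p_switch=0.62, seed=None):
    """Raw bits from a (biased) stochastic switching event: 1 = the cell switched."""
    rng = np.random.default_rng(seed)
    return (rng.random(n) < p_switch).astype(np.uint8)

def von_neumann_debias(bits):
    """Pair up raw bits and keep 01 -> 0, 10 -> 1; discard 00 and 11 pairs."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]            # first bit of an unequal pair is unbiased

raw = stochastic_switch_bits(100_000, p_switch=0.62, seed=3)
rnd = von_neumann_debias(raw)
print(f"raw bias {raw.mean():.3f}  ->  debiased bias {rnd.mean():.3f} "
      f"({rnd.size} bits kept)")
```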
2

Malavena, Gerardo. "Modeling of GIDL–Assisted Erase in 3–D NAND Flash Memory Arrays and Its Employment in NOR Flash–Based Spiking Neural Networks." In Special Topics in Information Technology, 43–53. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85918-3_4.

Abstract:
Since the very first introduction of three-dimensional (3–D) vertical-channel (VC) NAND Flash memory arrays, gate-induced drain leakage (GIDL) current has been suggested as a solution to increase the string channel potential to trigger the erase operation. Thanks to that erase scheme, the memory array can be built directly on the top of an n⁺ plate, without requiring any p-doped region to contact the string channel and therefore allowing to simplify the manufacturing process and increase the array integration density. For those reasons, the understanding of the physical phenomena occurring in the string when GIDL is triggered is important for the proper design of the cell structure and of the voltage waveforms adopted during erase. Even though a detailed comprehension of the GIDL phenomenology can be achieved by means of technology computer-aided design (TCAD) simulations, they are usually time and resource consuming, especially when realistic string structures with many word-lines (WLs) are considered. In this chapter, an analysis of the GIDL-assisted erase in 3–D VC NAND memory arrays is presented. First, the evolution of the string potential and GIDL current during erase is investigated by means of TCAD simulations; then, a compact model able to reproduce both the string dynamics and the threshold voltage transients with reduced computational effort is presented. The developed compact model is proven to be a valuable tool for the optimization of the array performance during erase assisted by GIDL. Then, the idea of taking advantage of GIDL for the erase operation is exported to the context of spiking neural networks (SNNs) based on NOR Flash memory arrays, which require operational schemes that allow single-cell selectivity during both cell program and cell erase. To overcome the block erase typical of NOR Flash memory arrays based on Fowler-Nordheim tunneling, a new erase scheme that triggers GIDL in the NOR Flash cell and exploits hot-hole injection (HHI) at its drain side to accomplish the erase operation is presented. Using that scheme, spike-timing dependent plasticity (STDP) is implemented in a mainstream NOR Flash array and array learning is successfully demonstrated in a prototype SNN. The achieved results represent an important step for the development of large-scale neuromorphic systems based on mature and reliable memory technologies.
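Editor's note: the spike-timing dependent plasticity (STDP) demonstrated in this chapter is implemented with GIDL/HHI programming pulses; the sketch below only shows the generic pair-based STDP weight-update window, with assumed amplitudes and time constants, not the chapter's programming scheme.

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=0.02, tau_minus=0.02):
    """Pair-based STDP window. delta_t = t_post - t_pre (seconds)."""
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau_plus),    # pre before post: potentiation
                    -a_minus * np.exp(delta_t / tau_minus))  # post before pre: depression

for dt_ms in (-40, -10, 5, 20):
    dw = float(stdp_dw(dt_ms / 1000))
    print(f"t_post - t_pre = {dt_ms:+4d} ms  ->  dw = {dw:+.4f}")
```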

Conference papers on the topic "Neuromorphic computer systems"

1

Song, Chang, Beiye Liu, Chenchen Liu, Hai Li, and Yiran Chen. "Design techniques of eNVM-enabled neuromorphic computing systems." In 2016 IEEE 34th International Conference on Computer Design (ICCD). IEEE, 2016. http://dx.doi.org/10.1109/iccd.2016.7753356.

2

Rajasekharan, Dinesh, Amit Ranjan Trivedi, and Yogesh Singh Chauhan. "Neuromorphic Circuits on FDSOI Technology for Computer Vision Applications." In 2019 32nd International Conference on VLSI Design and 2019 18th International Conference on Embedded Systems (VLSID). IEEE, 2019. http://dx.doi.org/10.1109/vlsid.2019.00108.

3

Liu, Beiye, Hai Li, Yiran Chen, Xin Li, Tingwen Huang, Qing Wu, and Mark Barnell. "Reduction and IR-drop compensations techniques for reliable neuromorphic computing systems." In 2014 IEEE/ACM International Conference on Computer-Aided Design (ICCAD). IEEE, 2014. http://dx.doi.org/10.1109/iccad.2014.7001330.

4

Shahsavari, Mahyar, Pierre Boulet, Asadollah Shahbahrami, and Said Hamdioui. "Impact of increasing number of neurons on performance of neuromorphic architecture." In 2017 19th International Symposium on Computer Architecture and Digital Systems (CADS). IEEE, 2017. http://dx.doi.org/10.1109/cads.2017.8310732.

5

Mes, Johan, Ester Stienstra, Xuefei You, Sumeet S. Kumar, Amir Zjajo, Carlo Galuzzi, and Rene van Leuken. "Neuromorphic self-organizing map design for classification of bioelectric-timescale signals." In 2017 International Conference on Embedded Computer Systems: Architectures, Modeling, and Simulation (SAMOS). IEEE, 2017. http://dx.doi.org/10.1109/samos.2017.8344618.

6

Zjajo, Amir, Johan Mes, Eralp Kolagasioglu, Sumeet Kumar, and Rene van Leuken. "Uncertainty in Noise-Driven Steady-State Neuromorphic Network for ECG Data Classification." In 2018 IEEE 31st International Symposium on Computer-Based Medical Systems (CBMS). IEEE, 2018. http://dx.doi.org/10.1109/cbms.2018.00082.

7

Sugiarto, Indar, Delong Shang, Amit Kumar Singh, Bassem Ouni, Geoff Merrett, Bashir Al-Hashimi, and Steve Furber. "Software-defined PMC for runtime power management of a many-core neuromorphic platform." In 2017 12th International Conference on Computer Engineering and Systems (ICCES). IEEE, 2017. http://dx.doi.org/10.1109/icces.2017.8275383.

8

Tosson, Amr M. S., Shimeng Yu, Mohab H. Anis, and Lan Wei. "Analysis of RRAM Reliability Soft-Errors on the Performance of RRAM-Based Neuromorphic Systems." In 2017 IEEE Computer Society Annual Symposium on VLSI (ISVLSI). IEEE, 2017. http://dx.doi.org/10.1109/isvlsi.2017.20.

9

Kim, Yongtae, Yong Zhang, and Peng Li. "An energy efficient approximate adder with carry skip for error resilient neuromorphic VLSI systems." In 2013 IEEE/ACM International Conference on Computer-Aided Design (ICCAD). IEEE, 2013. http://dx.doi.org/10.1109/iccad.2013.6691108.

10

Sayyaparaju, Sagarvarma, Ryan Weiss, and Garrett S. Rose. "A Mixed-Mode Neuron with On-chip Tunability for Generic Use in Memristive Neuromorphic Systems." In 2018 IEEE Computer Society Annual Symposium on VLSI (ISVLSI). IEEE, 2018. http://dx.doi.org/10.1109/isvlsi.2018.00086.


Reports of organizations on the topic "Neuromorphic computer systems"

1

Gall, W. E. Brain-Based Devices for Neuromorphic Computer Systems. Fort Belvoir, VA: Defense Technical Information Center, July 2013. http://dx.doi.org/10.21236/ada587348.
