Journal articles on the topic "Potts Attractor Neural Network"
Below are the top 50 journal articles for research on the topic "Potts Attractor Neural Network".
Abdukhamidov, Eldor, Firuz Juraev, Mohammed Abuhamad, Shaker El-Sappagh, and Tamer AbuHmed. "Sentiment Analysis of Users’ Reactions on Social Media During the Pandemic." Electronics 11, no. 10 (May 22, 2022): 1648. http://dx.doi.org/10.3390/electronics11101648.
O'Kane, D., and D. Sherrington. "A feature retrieving attractor neural network." Journal of Physics A: Mathematical and General 26, no. 10 (May 21, 1993): 2333–42. http://dx.doi.org/10.1088/0305-4470/26/10/008.
Deng, Hanming, Yang Hua, Tao Song, Zhengui Xue, Ruhui Ma, Neil Robertson, and Haibing Guan. "Reinforcing Neural Network Stability with Attractor Dynamics." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3765–72. http://dx.doi.org/10.1609/aaai.v34i04.5787.
Tan, Z., and L. Schülke. "The Attractor Basin of Neural Network with Correlated Interactions." International Journal of Modern Physics B 10, no. 26 (November 30, 1996): 3549–60. http://dx.doi.org/10.1142/s0217979296001902.
Badoni, Davide, Roberto Riccardi, and Gaetano Salina. "Learning Attractor Neural Network: The Electronic Implementation." International Journal of Neural Systems 03, supp01 (January 1992): 13–24. http://dx.doi.org/10.1142/s0129065792000334.
Frolov, A. A., D. Husek, I. P. Muraviev, and P. Yu Polyakov. "Boolean Factor Analysis by Attractor Neural Network." IEEE Transactions on Neural Networks 18, no. 3 (May 2007): 698–707. http://dx.doi.org/10.1109/tnn.2007.891664.
Zou, Fan, and Josef A. Nossek. "An Autonomous Chaotic Cellular Neural Network and Chua's Circuit." Journal of Circuits, Systems and Computers 03, no. 02 (June 1993): 591–601. http://dx.doi.org/10.1142/s0218126693000368.
Dominguez, D. R. C., and D. Bollé. "Categorization by a three-state attractor neural network." Physical Review E 56, no. 6 (December 1, 1997): 7306–9. http://dx.doi.org/10.1103/physreve.56.7306.
Serulnik, Sergio D., and Moshe Gur. "An Attractor Neural Network Model of Classical Conditioning." International Journal of Neural Systems 07, no. 01 (March 1996): 1–18. http://dx.doi.org/10.1142/s0129065796000026.
Wong, K. Y. M., and C. Ho. "Attractor properties of dynamical systems: neural network models." Journal of Physics A: Mathematical and General 27, no. 15 (August 7, 1994): 5167–85. http://dx.doi.org/10.1088/0305-4470/27/15/017.
González, Mario, David Dominguez, Ángel Sánchez, and Francisco B. Rodríguez. "Increase attractor capacity using an ensembled neural network." Expert Systems with Applications 71 (April 2017): 206–15. http://dx.doi.org/10.1016/j.eswa.2016.11.035.
Steffan, Helmut, and Reimer Kühn. "Replica symmetry breaking in attractor neural network models." Zeitschrift für Physik B Condensed Matter 95, no. 2 (June 1994): 249–60. http://dx.doi.org/10.1007/bf01312198.
Funabashi, Masatoshi. "Synthetic Modeling of Autonomous Learning with a Chaotic Neural Network." International Journal of Bifurcation and Chaos 25, no. 04 (April 2015): 1550054. http://dx.doi.org/10.1142/s0218127415500546.
Dominguez, D., K. Koroutchev, E. Serrano, and F. B. Rodríguez. "Information and Topology in Attractor Neural Networks." Neural Computation 19, no. 4 (April 2007): 956–73. http://dx.doi.org/10.1162/neco.2007.19.4.956.
Gislén, Lars, Carsten Peterson, and Bo Söderberg. "Complex Scheduling with Potts Neural Networks." Neural Computation 4, no. 6 (November 1992): 805–31. http://dx.doi.org/10.1162/neco.1992.4.6.805.
Bollé, D., P. Dupont, and J. Huyghebaert. "Thermodynamic properties of the Q-state Potts-glass neural network." Physical Review A 45, no. 6 (March 1, 1992): 4194–97. http://dx.doi.org/10.1103/physreva.45.4194.
Irwin, J., H. Bohr, K. Mochizuki, and P. G. Wolynes. "Classification and Prediction of Protein Side-Chains by Neural Network Techniques." International Journal of Neural Systems 03, supp01 (January 1992): 177–82. http://dx.doi.org/10.1142/s0129065792000504.
Wu, Si, Kosuke Hamaguchi, and Shun-ichi Amari. "Dynamics and Computation of Continuous Attractors." Neural Computation 20, no. 4 (April 2008): 994–1025. http://dx.doi.org/10.1162/neco.2008.10-06-378.
Lattanzi, G., G. Nardulli, and S. Stramaglia. "A Neural Network with Permanent and Volatile Memory." Modern Physics Letters B 11, no. 24 (October 20, 1997): 1037–45. http://dx.doi.org/10.1142/s0217984997001250.
Ruppin, E., and M. Usher. "An attractor neural network model of semantic fact retrieval." Network: Computation in Neural Systems 1, no. 3 (January 1990): 325–44. http://dx.doi.org/10.1088/0954-898x_1_3_003.
Tsodyks, Misha. "Attractor neural network models of spatial maps in hippocampus." Hippocampus 9, no. 4 (1999): 481–89. http://dx.doi.org/10.1002/(sici)1098-1063(1999)9:4<481::aid-hipo14>3.0.co;2-s.
Solovyeva, K. P. "Self-Organized Maps on Continuous Bump Attractors." Mathematical Biology and Bioinformatics 8, no. 1 (May 27, 2013): 234–47. http://dx.doi.org/10.17537/2013.8.234.
Hoffman, Ralph E. "Additional tests of Amit's attractor neural networks." Behavioral and Brain Sciences 18, no. 4 (December 1995): 634–35. http://dx.doi.org/10.1017/s0140525x00040255.
Tang, Buzhou, Jianglu Hu, Xiaolong Wang, and Qingcai Chen. "Recognizing Continuous and Discontinuous Adverse Drug Reaction Mentions from Social Media Using LSTM-CRF." Wireless Communications and Mobile Computing 2018 (2018): 1–8. http://dx.doi.org/10.1155/2018/2379208.
Akın, H. "Phase diagrams of lattice models on Cayley tree and chandelier network: a review." Condensed Matter Physics 25, no. 3 (2022): 32501. http://dx.doi.org/10.5488/cmp.25.32501.
Brouwer, Roelof K. "An Integer Recurrent Artificial Neural Network for Classifying Feature Vectors." International Journal of Pattern Recognition and Artificial Intelligence 14, no. 03 (May 2000): 339–55. http://dx.doi.org/10.1142/s0218001400000222.
Ahissar, Ehud. "Are single-cell data sufficient for testing neural network models?" Behavioral and Brain Sciences 18, no. 4 (December 1995): 626–27. http://dx.doi.org/10.1017/s0140525x00040176.
Horn, D., and E. Ruppin. "Compensatory Mechanisms in an Attractor Neural Network Model of Schizophrenia." Neural Computation 7, no. 1 (January 1995): 182–205. http://dx.doi.org/10.1162/neco.1995.7.1.182.
Fink, Wolfgang. "Neural attractor network for application in visual field data classification." Physics in Medicine and Biology 49, no. 13 (June 12, 2004): 2799–809. http://dx.doi.org/10.1088/0031-9155/49/13/003.
Yu, Jiali, Huajin Tang, Haizhou Li, and Luping Shi. "Dynamical properties of continuous attractor neural network with background tuning." Neurocomputing 99 (January 2013): 439–47. http://dx.doi.org/10.1016/j.neucom.2012.06.029.
Igarashi, Yasuhiko, Masafumi Oizumi, Yosuke Otsubo, Kenji Nagata, and Masato Okada. "Statistical mechanics of attractor neural network models with synaptic depression." Journal of Physics: Conference Series 197 (December 1, 2009): 012018. http://dx.doi.org/10.1088/1742-6596/197/1/012018.
Lakshmi, C., K. Thenmozhi, John Bosco Balaguru Rayappan, and Rengarajan Amirtharajan. "Hopfield attractor-trusted neural network: an attack-resistant image encryption." Neural Computing and Applications 32, no. 15 (November 29, 2019): 11477–89. http://dx.doi.org/10.1007/s00521-019-04637-4.
Seow, M. J., and V. K. Asari. "Recurrent Neural Network as a Linear Attractor for Pattern Association." IEEE Transactions on Neural Networks 17, no. 1 (January 2006): 246–50. http://dx.doi.org/10.1109/tnn.2005.860869.
Deco, Gustavo, and Edmund T. Rolls. "Sequential Memory: A Putative Neural and Synaptic Dynamical Mechanism." Journal of Cognitive Neuroscience 17, no. 2 (February 2005): 294–307. http://dx.doi.org/10.1162/0898929053124875.
Abdi, H. "A Neural Network Primer." Journal of Biological Systems 02, no. 03 (September 1994): 247–81. http://dx.doi.org/10.1142/s0218339094000179.
Fiorelli, Eliana, Igor Lesanovsky, and Markus Müller. "Phase diagram of quantum generalized Potts-Hopfield neural networks." New Journal of Physics 24, no. 3 (March 1, 2022): 033012. http://dx.doi.org/10.1088/1367-2630/ac5490.
Xiong, Daxing, and Hong Zhao. "Estimates of storage capacity in the q-state Potts-glass neural network." Journal of Physics A: Mathematical and Theoretical 43, no. 44 (October 13, 2010): 445001. http://dx.doi.org/10.1088/1751-8113/43/44/445001.
Kang, Chol, Michelangelo Naim, Vezha Boboeva, and Alessandro Treves. "Life on the Edge: Latching Dynamics in a Potts Neural Network." Entropy 19, no. 9 (September 3, 2017): 468. http://dx.doi.org/10.3390/e19090468.
Marconi, Carlo, Pau Colomer Saus, María García Díaz, and Anna Sanpera. "The role of coherence theory in attractor quantum neural networks." Quantum 6 (September 8, 2022): 794. http://dx.doi.org/10.22331/q-2022-09-08-794.
Amit, Daniel J., and Nicolas Brunel. "Learning internal representations in an attractor neural network with analogue neurons." Network: Computation in Neural Systems 6, no. 3 (August 1, 1995): 359–88. http://dx.doi.org/10.1088/0954-898x/6/3/004.
Fassnacht, C., and A. Zippelius. "Recognition and categorization in a structured neural network with attractor dynamics." Network: Computation in Neural Systems 2, no. 1 (January 1991): 63–84. http://dx.doi.org/10.1088/0954-898x_2_1_004.
Brunel, Nicolas. "Dynamics of an attractor neural network converting temporal into spatial correlations." Network: Computation in Neural Systems 5, no. 4 (January 1994): 449–70. http://dx.doi.org/10.1088/0954-898x_5_4_003.
Badoni, Davide, Stefano Bertazzoni, Stefano Buglioni, Gaetano Salina, Daniel J. Amit, and Stefano Fusi. "Electronic implementation of an analogue attractor neural network with stochastic learning." Network: Computation in Neural Systems 6, no. 2 (January 1995): 125–57. http://dx.doi.org/10.1088/0954-898x_6_2_002.
Beňušková, Lubica. "Modelling transpositional invariancy of melody recognition with an attractor neural network." Network: Computation in Neural Systems 6, no. 3 (January 1995): 313–31. http://dx.doi.org/10.1088/0954-898x_6_3_001.
Frolov, Alexander A., Dusan Husek, Pavel Y. Polyakov, and Vaclav Snasel. "New BFA method based on attractor neural network and likelihood maximization." Neurocomputing 132 (May 2014): 14–29. http://dx.doi.org/10.1016/j.neucom.2013.07.047.
Torres, J. J., J. M. Cortes, J. Marro, and H. J. Kappen. "Competition Between Synaptic Depression and Facilitation in Attractor Neural Networks." Neural Computation 19, no. 10 (October 2007): 2739–55. http://dx.doi.org/10.1162/neco.2007.19.10.2739.
Brunel, Nicolas. "Hebbian Learning of Context in Recurrent Neural Networks." Neural Computation 8, no. 8 (November 1996): 1677–710. http://dx.doi.org/10.1162/neco.1996.8.8.1677.
Krawiecki, A., and R. A. Kosiński. "On–Off Intermittency in Small Neural Networks with Time-Dependent Synaptic Noise." International Journal of Bifurcation and Chaos 09, no. 01 (January 1999): 97–105. http://dx.doi.org/10.1142/s0218127499000055.
Akcan, Burcu, and Yiğit Gündüç. "A Monte Carlo Study of the Storage Capacity and Effects of the Correlations in q-State Potts Neuron System." International Journal of Modern Physics C 13, no. 02 (February 2002): 199–206. http://dx.doi.org/10.1142/s012918310200305x.