Journal articles on the topic "Complex-valued neural networks"


Create a correct citation in APA, MLA, Chicago, Harvard, and many other styles


Consult the top 50 journal articles on the topic "Complex-valued neural networks".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in ".pdf" format and read the abstract of the work online, if the relevant parameters are provided in the metadata.

Browse journal articles from a wide variety of disciplines and compile your bibliography accordingly.

1

Hirose, Akira. "Complex-valued Neural Networks". IEEJ Transactions on Electronics, Information and Systems 131, no. 1 (2011): 2–8. http://dx.doi.org/10.1541/ieejeiss.131.2.

2

Boonsatit, Nattakan, Santhakumari Rajendran, Chee Peng Lim, Anuwat Jirawattanapanit, and Praneesh Mohandas. "New Adaptive Finite-Time Cluster Synchronization of Neutral-Type Complex-Valued Coupled Neural Networks with Mixed Time Delays". Fractal and Fractional 6, no. 9 (September 13, 2022): 515. http://dx.doi.org/10.3390/fractalfract6090515.

Abstract:
The issue of adaptive finite-time cluster synchronization for neutral-type coupled complex-valued neural networks with mixed delays is examined in this research. A neutral-type coupled complex-valued neural network with mixed delays is more general than a traditional neural network, since it considers distributed delays, state delays and coupling delays. In this research, a new adaptive control technique is developed to synchronize neutral-type coupled complex-valued neural networks with mixed delays in finite time. To stabilize the resulting closed-loop system, the Lyapunov stability argument is leveraged to infer the necessary requirements on the control factors. The effectiveness of the proposed method is illustrated through simulation studies.
3

Nitta, Tohru. "Orthogonality of Decision Boundaries in Complex-Valued Neural Networks". Neural Computation 16, no. 1 (January 1, 2004): 73–97. http://dx.doi.org/10.1162/08997660460734001.

Abstract:
This letter presents some results of an analysis on the decision boundaries of complex-valued neural networks whose weights, threshold values, input and output signals are all complex numbers. The main results may be summarized as follows. (1) A decision boundary of a single complex-valued neuron consists of two hypersurfaces that intersect orthogonally, and divides a decision region into four equal sections. The XOR problem and the detection of symmetry problem, which cannot be solved with two-layered real-valued neural networks, can be solved by two-layered complex-valued neural networks with the orthogonal decision boundaries, which reveals a potent computational power of complex-valued neural nets. Furthermore, the fading equalization problem can be successfully solved by the two-layered complex-valued neural network with the highest generalization ability. (2) A decision boundary of a three-layered complex-valued neural network has the orthogonal property as a basic structure, and its two hypersurfaces approach orthogonality as all the net inputs to each hidden neuron grow. In particular, most of the decision boundaries in the three-layered complex-valued neural network intersect orthogonally when the network is trained using the Complex-BP algorithm. As a result, the orthogonality of the decision boundaries improves its generalization ability. (3) The average learning speed of the Complex-BP is several times faster than that of the Real-BP, and the standard deviation of the learning speed of the Complex-BP is smaller than that of the Real-BP. For these reasons, the complex-valued neural network and the related algorithm appear natural for learning complex-valued patterns.
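As a note on the orthogonality result summarized above: the net input of a single complex-valued neuron, z ↦ wz + θ, is a conformal (angle-preserving) map, so the two boundary lines Re(wz + θ) = 0 and Im(wz + θ) = 0 meet at a right angle. A minimal sketch with illustrative weight values (not taken from the paper):

```python
# A single complex-valued neuron with net input u = w*z + theta decides
# separately on Re(u) and Im(u), so its decision boundary is the pair of
# lines Re(u) = 0 and Im(u) = 0 in the z-plane. Because z -> w*z + theta
# is conformal, these lines are orthogonal and cut the plane into four
# regions. Illustrative values, not from the paper:
w, theta = 1 + 2j, 0.5 - 0.3j

# Tangent directions of the two boundary lines (preimages of the
# imaginary and real axes under u = w*z + theta):
d_re = 1j / w   # direction of the Re(u) = 0 line
d_im = 1 / w    # direction of the Im(u) = 0 line

# Their Euclidean dot product in the plane vanishes -> orthogonal.
dot = d_re.real * d_im.real + d_re.imag * d_im.imag
print(abs(dot) < 1e-12)  # True
```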
4

Nitta, Tohru. "Learning Transformations with Complex-Valued Neurocomputing". International Journal of Organizational and Collective Intelligence 3, no. 2 (April 2012): 81–116. http://dx.doi.org/10.4018/joci.2012040103.

Abstract:
The ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations has been applied to the estimation of optical flows and the generation of fractal images. The complex-valued neural network has adaptability and generalization ability as its inherent nature. This is the main point of difference between the ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations and standard techniques for 2D affine transformations such as the Fourier descriptor. It is important to clarify the properties of complex-valued neural networks in order to accelerate their practical application. In this paper, first, the generalization ability of the 1-n-1 complex-valued neural network which has learned complicated rotations on a 2D plane is examined experimentally and analytically. Next, the behavior of the 1-n-1 complex-valued neural network that has learned a transformation on the Steiner circles is demonstrated, and the relationship between the values of the complex-valued weights after training and a linear transformation related to the Steiner circles is clarified via computer simulations. Furthermore, the relationship between the weight values of the 1-n-1 complex-valued neural network that has learned 2D affine transformations and the learning patterns used is elucidated. These research results make it possible to solve complicated problems more simply and efficiently with 1-n-1 complex-valued neural networks. As a matter of fact, an application of the 1-n-1 type complex-valued neural network to an associative memory is presented.
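A brief aside on why complex weights suit 2D transformations (a generic sketch, not code from the paper): multiplying by a complex weight w = r·e^{iφ} rotates the plane by φ and scales it by r, so networks composing maps of the form z ↦ wz + b naturally represent 2D similarity transformations.

```python
import cmath
import math

# A complex weight acts on the plane as rotation-plus-scaling:
# w = r * e^{i*phi} sends z to w*z, i.e. scale by r, rotate by phi.
# Illustrative values only; the 1-n-1 network in the paper composes
# maps of the form z -> w*z + b.
w = cmath.rect(2.0, math.pi / 2)   # scale by 2, rotate by 90 degrees
z = 1 + 0j                         # the point (1, 0)
print(w * z)                       # close to 2j, i.e. the point (0, 2)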
5

Guo, Song, and Bo Du. "Global Exponential Stability of Periodic Solution for Neutral-Type Complex-Valued Neural Networks". Discrete Dynamics in Nature and Society 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/1267954.

Abstract:
This paper deals with a class of neutral-type complex-valued neural networks with delays. By means of Mawhin’s continuation theorem, some criteria on existence of periodic solutions are established for the neutral-type complex-valued neural networks. By constructing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are derived for the global exponential stability of periodic solutions to the neutral-type complex-valued neural networks. Finally, numerical examples are given to show the effectiveness and merits of the present results.
6

NITTA, TOHRU. "THE UNIQUENESS THEOREM FOR COMPLEX-VALUED NEURAL NETWORKS WITH THRESHOLD PARAMETERS AND THE REDUNDANCY OF THE PARAMETERS". International Journal of Neural Systems 18, no. 02 (April 2008): 123–34. http://dx.doi.org/10.1142/s0129065708001439.

Abstract:
This paper proves the uniqueness theorem for 3-layered complex-valued neural networks in which the threshold parameters of the hidden neurons can take nonzero values. That is, if a 3-layered complex-valued neural network is irreducible, the 3-layered complex-valued neural network that approximates a given complex-valued function is uniquely determined up to a finite group of transformations of the learnable parameters of the complex-valued neural network.
7

Valle, Marcos Eduardo. "Complex-Valued Recurrent Correlation Neural Networks". IEEE Transactions on Neural Networks and Learning Systems 25, no. 9 (September 2014): 1600–1612. http://dx.doi.org/10.1109/tnnls.2014.2341013.

8

Kobayashi, Masaki. "Symmetric Complex-Valued Hopfield Neural Networks". IEEE Transactions on Neural Networks and Learning Systems 28, no. 4 (April 2017): 1011–15. http://dx.doi.org/10.1109/tnnls.2016.2518672.

9

Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks". Neural Computation 32, no. 11 (November 2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Abstract:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. The weight parameters require a lot of memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks. By computer simulations, we find that the projection rule for hyperbolic-valued Hopfield neural networks in synchronous mode maintains a high noise tolerance.
10

Kobayashi, Masaki. "Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules". Computational Intelligence and Neuroscience 2017 (2017): 1–6. http://dx.doi.org/10.1155/2017/4894278.

Abstract:
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, so CHNNs are suitable for the storage of multilevel data, such as gray-scale images. CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. We show through computer simulations that the proposed recall algorithm not only accelerates recall but also improves noise tolerance.
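For readers unfamiliar with the multistate neurons mentioned above, a common choice (an illustrative sketch; exact definitions vary between papers) quantizes the phase of the net input to one of K states on the unit circle, so a single neuron can store a K-level value such as a gray level:

```python
import cmath
import math

def multistate_activation(z, K=4):
    """Snap a complex net input to the nearest of K phase states on the
    unit circle (illustrative multistate activation; details vary by paper)."""
    phase = cmath.phase(z) % (2 * math.pi)
    k = int(phase * K / (2 * math.pi) + 0.5) % K   # nearest sector index
    return cmath.exp(2j * math.pi * k / K)

# With K = 4 the states are 1, i, -1, -i; a level m in 0..K-1 is stored
# as the state exp(2*pi*i*m/K).
print(multistate_activation(0.2 + 0.9j, K=4))  # close to 1j
```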
11

Popa, Călin-Adrian. "Asymptotic and Mittag–Leffler Synchronization of Fractional-Order Octonion-Valued Neural Networks with Neutral-Type and Mixed Delays". Fractal and Fractional 7, no. 11 (November 20, 2023): 830. http://dx.doi.org/10.3390/fractalfract7110830.

Abstract:
Very recently, a different generalization of real-valued neural networks (RVNNs) to multidimensional domains besides the complex-valued neural networks (CVNNs), quaternion-valued neural networks (QVNNs), and Clifford-valued neural networks (ClVNNs) has appeared, namely octonion-valued neural networks (OVNNs), which are not a subset of ClVNNs. They are defined on the octonion algebra, which is an 8D algebra over the reals, and is also the only other normed division algebra that can be defined over the reals besides the complex and quaternion algebras. On the other hand, fractional-order neural networks (FONNs) have also been very intensively researched in the recent past. Thus, the present work combines FONNs and OVNNs and puts forward a fractional-order octonion-valued neural network (FOOVNN) with neutral-type, time-varying, and distributed delays, a very general model not yet discussed in the literature, to the best of our knowledge. Sufficient criteria expressed as linear matrix inequalities (LMIs) and algebraic inequalities are deduced, which ensure the asymptotic and Mittag–Leffler synchronization properties of the proposed model by decomposing the OVNN system of equations into a real-valued one, in order to avoid the non-associativity problem of the octonion algebra. To accomplish synchronization, we use two different state feedback controllers, two different types of Lyapunov-like functionals in conjunction with two Halanay-type lemmas for FONNs, the free-weighting matrix method, a classical lemma, and Young's inequality. The four theorems presented in the paper are each illustrated by a numerical example.
12

Lee, ChiYan, Hideyuki Hasegawa, and Shangce Gao. "Complex-Valued Neural Networks: A Comprehensive Survey". IEEE/CAA Journal of Automatica Sinica 9, no. 8 (August 2022): 1406–26. http://dx.doi.org/10.1109/jas.2022.105743.

13

Gorday, Paul E., Nurgun Erdol, and Hanqi Zhuang. "Complex-Valued Neural Networks for Noncoherent Demodulation". IEEE Open Journal of the Communications Society 1 (2020): 217–25. http://dx.doi.org/10.1109/ojcoms.2020.2970688.

14

Kobayashi, Masaki. "Exceptional Reducibility of Complex-Valued Neural Networks". IEEE Transactions on Neural Networks 21, no. 7 (July 2010): 1060–72. http://dx.doi.org/10.1109/tnn.2010.2048040.

15

Hirose, A. "Dynamics of fully complex-valued neural networks". Electronics Letters 28, no. 16 (1992): 1492. http://dx.doi.org/10.1049/el:19920948.

16

Zhang, Yongliang, and He Huang. "Adaptive complex-valued stepsize based fast learning of complex-valued neural networks". Neural Networks 124 (April 2020): 233–42. http://dx.doi.org/10.1016/j.neunet.2020.01.011.

17

Hirose, Akira. "Nature of complex number and complex-valued neural networks". Frontiers of Electrical and Electronic Engineering in China 6, no. 1 (January 14, 2011): 171–80. http://dx.doi.org/10.1007/s11460-011-0125-3.

18

Wang, Ruiting, Pengfei Wang, Chen Lyu, Guangzhen Luo, Hongyan Yu, Xuliang Zhou, Yejin Zhang, and Jiaoqing Pan. "Multicore Photonic Complex-Valued Neural Network with Transformation Layer". Photonics 9, no. 6 (May 28, 2022): 384. http://dx.doi.org/10.3390/photonics9060384.

Abstract:
Photonic neural network chips have been widely studied because of their low power consumption, high speed and large bandwidth. Using amplitude and phase to encode, photonic chips can accelerate complex-valued neural network computations. In this article, a photonic complex-valued neural network (PCNN) chip is designed. The scale of the single-core PCNN chip is limited by optical losses, and the multicore architecture of the chip is used to improve computing capability. Further, to improve the performance of the PCNN, we propose the transformation layer, which can be implemented by the designed photonic chip to transform real-valued encoding to complex-valued encoding, which carries richer information. Compared with real-valued input, the transformation layer can effectively improve the classification accuracy from 93.14% to 97.51% for a 64-dimensional input on the MNIST test set. Finally, we analyze the multicore computation of the PCNN. Compared with the single-core architecture, the multicore architecture can improve classification accuracy by implementing larger neural networks and has better phase noise robustness. The proposed architecture and algorithms are beneficial to promote the accelerated computing of photonic chips for complex-valued neural networks and are promising for use in many applications, such as image recognition and signal processing.
19

Xu, Zhaojing, Shunhu Hou, Shengliang Fang, Huachao Hu, and Zhao Ma. "A Novel Complex-Valued Hybrid Neural Network for Automatic Modulation Classification". Electronics 12, no. 20 (October 23, 2023): 4380. http://dx.doi.org/10.3390/electronics12204380.

Abstract:
Deep learning methods that deal directly with in-phase and quadrature time series data are now widely used in signal modulation classification. However, there is a relative lack of methods that consider the complex properties of signals. Therefore, to make full use of the inherent relationship between in-phase and quadrature time series data, a complex-valued hybrid neural network (CV-PET-CSGDNN) based on the existing PET-CGDNN network is proposed in this paper, which consists of phase parameter estimation, parameter transformation, and complex-valued signal feature extraction layers. The complex-valued signal feature extraction layers are composed of complex-valued convolutional neural networks (CNN), complex-valued gated recurrent units (GRU), squeeze-and-excitation (SE) blocks, and complex-valued dense neural networks (DNN). The proposed network can improve the extraction of the intrinsic relationship between in-phase and quadrature time series data with low capacity and thus improve the accuracy of modulation classification. Experiments are carried out on RML2016.10a and RML2018.01a. The results show that, compared with ResNet, CLDNN, MCLDNN, PET-CGDNN, and CV-ResNet models, our proposed complex-valued neural network (CVNN) achieves the highest average accuracy of 61.50% and 62.92%, respectively, for automatic modulation classification. In addition, the proposed CV-PET-CSGDNN shows a significant improvement over PET-CGDNN in the misjudgment between 64QAM, 128QAM, and 256QAM on RML2018.01a.
20

Goh, Su Lee, and Danilo P. Mandic. "A Complex-Valued RTRL Algorithm for Recurrent Neural Networks". Neural Computation 16, no. 12 (December 1, 2004): 2699–713. http://dx.doi.org/10.1162/0899766042321779.

Abstract:
A complex-valued real-time recurrent learning (CRTRL) algorithm for the class of nonlinear adaptive filters realized as fully connected recurrent neural networks is introduced. The proposed CRTRL is derived for a general complex activation function of a neuron, which makes it suitable for nonlinear adaptive filtering of complex-valued nonlinear and nonstationary signals and complex signals with strong component correlations. In addition, this algorithm is generic and represents a natural extension of the real-valued RTRL. Simulations on benchmark and real-world complex-valued signals support the approach.
21

Dong, Tao, and Tingwen Huang. "Neural Cryptography Based on Complex-Valued Neural Network". IEEE Transactions on Neural Networks and Learning Systems 31, no. 11 (November 2020): 4999–5004. http://dx.doi.org/10.1109/tnnls.2019.2955165.

22

Caragea, Andrei, Dae Gwan Lee, Johannes Maly, Götz Pfander, and Felix Voigtlaender. "Quantitative Approximation Results for Complex-Valued Neural Networks". SIAM Journal on Mathematics of Data Science 4, no. 2 (May 2, 2022): 553–80. http://dx.doi.org/10.1137/21m1429540.

23

Scardapane, Simone, Steven Van Vaerenbergh, Amir Hussain, and Aurelio Uncini. "Complex-Valued Neural Networks With Nonparametric Activation Functions". IEEE Transactions on Emerging Topics in Computational Intelligence 4, no. 2 (April 2020): 140–50. http://dx.doi.org/10.1109/tetci.2018.2872600.

24

HASHIMOTO, Naoki, Yasuaki KUROE, and Takehiro MORI. "On Energy Function for Complex-Valued Neural Networks". Transactions of the Institute of Systems, Control and Information Engineers 15, no. 10 (2002): 559–65. http://dx.doi.org/10.5687/iscie.15.559.

25

Kaslik, Eva, and Ileana Rodica Rădulescu. "Dynamics of complex-valued fractional-order neural networks". Neural Networks 89 (May 2017): 39–49. http://dx.doi.org/10.1016/j.neunet.2017.02.011.

26

Pandey, R. "Complex Valued Recurrent Neural Networks for Blind Equalization". International Journal of Modelling and Simulation 25, no. 3 (January 2005): 182–89. http://dx.doi.org/10.1080/02286203.2005.11442333.

27

Nitta, Tohru. "Proposal of fully augmented complex-valued neural networks". Nonlinear Theory and Its Applications, IEICE 14, no. 2 (2023): 175–92. http://dx.doi.org/10.1587/nolta.14.175.

28

Jayanthi, N., and R. Santhakumari. "Synchronization of time invariant uncertain delayed neural networks in finite time via improved sliding mode control". Mathematical Modeling and Computing 8, no. 2 (2021): 228–40. http://dx.doi.org/10.23939/mmc2021.02.228.

Abstract:
This paper explores the finite-time synchronization problem of delayed complex-valued neural networks with time-invariant uncertainty through improved integral sliding mode control. Firstly, the master-slave complex-valued neural networks are transformed into two real-valued neural networks by separating the complex-valued neural networks into real and imaginary parts. Also, the interval uncertainty terms of the delayed complex-valued neural networks are converted into real uncertainty terms. Secondly, a new integral sliding mode surface is designed by employing the master-slave concept and the synchronization error of the master-slave systems such that the error system can converge to zero in finite time along the constructed integral sliding mode surface. Next, a suitable sliding mode control is designed by using Lyapunov stability theory such that the state trajectories of the system can be driven onto the pre-set sliding mode surface in finite time. Finally, a numerical example is presented to illustrate the effectiveness of the theoretical results.
29

Rattan, Sanjay S. P., and William W. Hsieh. "Complex-valued neural networks for nonlinear complex principal component analysis". Neural Networks 18, no. 1 (January 2005): 61–69. http://dx.doi.org/10.1016/j.neunet.2004.08.002.

30

Dong, Wenjun, Yujiao Huang, Tingan Chen, Xinggang Fan, and Haixia Long. "Local Lagrange Exponential Stability Analysis of Quaternion-Valued Neural Networks with Time Delays". Mathematics 10, no. 13 (June 21, 2022): 2157. http://dx.doi.org/10.3390/math10132157.

Abstract:
This study on the local stability of quaternion-valued neural networks is of great significance to applications in associative memory and pattern recognition. In this research, we study the local Lagrange exponential stability of quaternion-valued neural networks with time delays. By separating the quaternion-valued neural networks into a real part and three imaginary parts, separating the quaternion field into 3^(4n) subregions, and using the intermediate value theorem, sufficient conditions are proposed to ensure that quaternion-valued neural networks have 3^(4n) equilibrium points. According to the Halanay inequality, the conditions for the existence of 2^(4n) local Lagrange exponentially stable equilibria of quaternion-valued neural networks are established. The obtained stability results improve and extend the existing ones. Under the same conditions, quaternion-valued neural networks have more stable equilibrium points than complex-valued neural networks and real-valued neural networks. The validity of the theoretical results was verified by an example.
31

Wang, Xue-Zhong, Yimin Wei, and Predrag S. Stanimirović. "Complex Neural Network Models for Time-Varying Drazin Inverse". Neural Computation 28, no. 12 (December 2016): 2790–824. http://dx.doi.org/10.1162/neco_a_00866.

Abstract:
Two complex Zhang neural network (ZNN) models for computing the Drazin inverse of an arbitrary time-varying complex square matrix are presented. The design of these neural networks is based on corresponding matrix-valued error functions arising from the limit representations of the Drazin inverse. Two types of activation functions, appropriate for handling complex matrices, are exploited to develop each of these networks. Theoretical convergence analysis is presented to show the desirable properties of the proposed complex-valued ZNN models. Numerical results further demonstrate the effectiveness of the proposed models.
32

Fiori, Simone. "Nonlinear Complex-Valued Extensions of Hebbian Learning: An Essay". Neural Computation 17, no. 4 (April 1, 2005): 779–838. http://dx.doi.org/10.1162/0899766053429381.

Abstract:
The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of neuron models. The plain Hebbian principle, however, also presents some inherent theoretical limitations that make it impractical in most cases. Therefore, modifications of the basic Hebbian learning paradigm have been proposed over the past 20 years in order to design profitable signal and data processing algorithms. Such modifications led to the principal component analysis type class of learning rules along with their nonlinear extensions. The aim of this review is primarily to present part of the existing fragmented material in the field of principal component learning within a unified view and contextually to motivate and present extensions of previous works on Hebbian learning to complex-weighted linear neural networks. This work benefits from previous studies on linear signal decomposition by artificial neural networks, nonquadratic component optimization and reconstruction error definition, neural parameters adaptation by constrained optimization of learning criteria of complex-valued arguments, and orthonormality expression via the insertion of topological elements in the networks or by modifying the network learning criterion. In particular, the learning principles considered here and their analysis concern complex-valued principal/minor component/subspace linear/nonlinear rules for complex-weighted neural structures, both feedforward and laterally connected.
33

Liu, Xin, Lili Chen, and Yanfeng Zhao. "Uniform Stability of a Class of Fractional-Order Fuzzy Complex-Valued Neural Networks in Infinite Dimensions". Fractal and Fractional 6, no. 5 (May 23, 2022): 281. http://dx.doi.org/10.3390/fractalfract6050281.

Abstract:
In this paper, the problem of the uniform stability of a class of fractional-order fuzzy impulsive complex-valued neural networks with mixed delays in infinite dimensions is discussed for the first time. By utilizing fixed-point theory, the theory of differential inclusions and set-valued mappings, the uniqueness of the solution of the above complex-valued neural networks is derived. Subsequently, criteria for the uniform stability of the above complex-valued neural networks are established. In comparison with related results, we do not need to construct a complex Lyapunov function, which reduces the computational complexity. Finally, an example is given to show the validity of the main results.
34

Wang, Zengyun, Jinde Cao, Zhenyuan Guo, and Lihong Huang. "Generalized stability for discontinuous complex-valued Hopfield neural networks via differential inclusions". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 474, no. 2220 (December 2018): 20180507. http://dx.doi.org/10.1098/rspa.2018.0507.

Abstract:
Some dynamical behaviours of discontinuous complex-valued Hopfield neural networks are discussed in this paper. First, we introduce a method to construct the complex-valued set-valued mapping and define some basic notions for discontinuous complex-valued differential equations. In addition, the Leray–Schauder alternative theorem is used to analyse the existence of an equilibrium of the networks. Lastly, we present the dynamical behaviours, including global stability and convergence in measure, for discontinuous complex-valued neural networks (CVNNs) via differential inclusions. The main contribution of this paper is that we extend previous studies on continuous CVNNs to discontinuous ones. Several simulations are given to substantiate the correctness of the proposed results.
35

Han, Xiaofang, Abdujelil Abdurahman, and Jingjing You. "Nonseparation Approach to General-Decay Synchronization of Quaternion-Valued Neural Networks with Mixed Time Delays". Axioms 12, no. 9 (August 30, 2023): 842. http://dx.doi.org/10.3390/axioms12090842.

Abstract:
In this paper, the general-decay synchronization issue of a class of quaternion-valued neural networks with mixed time delays is investigated. Firstly, unlike some previous works where the quaternion-valued model is separated into four real-valued networks or two complex-valued networks, we consider the mixed-delayed quaternion-valued neural network model as a whole and introduce a novel nonlinear feedback controller for the corresponding response system. Then, by introducing a suitable Lyapunov–Krasovskii functional and employing a novel inequality technique, some easily verifiable sufficient conditions are obtained to ensure the general-decay synchronization for the considered drive-response networks. Finally, the feasibility of the established theoretical results is verified by carrying out Matlab numerical simulations.
36

Pavlov, Stanislav, Dmitry Kozlov, Mikhail Bakulin, Aleksandr Zuev, Andrey Latyshev, and Alexander Beliaev. "Generalization of Neural Networks on Second-Order Hypercomplex Numbers". Mathematics 11, no. 18 (September 19, 2023): 3973. http://dx.doi.org/10.3390/math11183973.

Abstract:
The vast majority of existing neural networks operate by rules set within the algebra of real numbers. However, as theoretical understanding of the fundamentals of neural networks and their practical applications grow stronger, new problems arise, which require going beyond such algebra. Various tasks come to light when the original data naturally have complex-valued formats. This situation is encouraging researchers to explore whether neural networks based on complex numbers can provide benefits over the ones limited to real numbers. Multiple recent works have been dedicated to developing the architecture and building blocks of complex-valued neural networks. In this paper, we generalize models by considering other types of hypercomplex numbers of the second order: dual and double numbers. We developed basic operators for these algebras, such as convolution, activation functions, and batch normalization, and rebuilt several real-valued networks to use them with these new algebras. We developed a general methodology for dual and double-valued gradient calculations based on Wirtinger derivatives for complex-valued functions. For classical computer vision (CIFAR-10, CIFAR-100, SVHN) and signal processing (G2Net, MusicNet) classification problems, our benchmarks show that the transition to the hypercomplex domain can be helpful in reaching higher values of metrics, compared to the original real-valued models.
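As background for the abstract above (a hedged sketch, not the authors' code): the two second-order algebras in question differ from the complex numbers only in the square of the imaginary unit, and dual numbers in particular carry a derivative in their ε-part:

```python
# Second-order hypercomplex algebras besides the complex numbers:
# dual numbers (eps^2 = 0) and double (split-complex) numbers (j^2 = +1).
# Numbers are modeled here as (real_part, imaginary_part) tuples.

def dual_mul(a, b):
    # (a0 + a1*eps)(b0 + b1*eps) = a0*b0 + (a0*b1 + a1*b0)*eps
    return (a[0] * b[0], a[0] * b[1] + a[1] * b[0])

def double_mul(a, b):
    # (a0 + a1*j)(b0 + b1*j) = (a0*b0 + a1*b1) + (a0*b1 + a1*b0)*j
    return (a[0] * b[0] + a[1] * b[1], a[0] * b[1] + a[1] * b[0])

# Dual numbers give forward-mode automatic differentiation for free:
# the eps-part of f(x + eps) is f'(x) for polynomial f. Squaring at x = 3:
x = (3.0, 1.0)
print(dual_mul(x, x))   # (9.0, 6.0): value 3^2 = 9, derivative 2*3 = 6
```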
37

Lai, Wei, Jinjing Shi, and Yan Chang. "Quantum-Inspired Fully Complex-Valued Neutral Network for Sentiment Analysis". Axioms 12, no. 3 (March 19, 2023): 308. http://dx.doi.org/10.3390/axioms12030308.

Full text source
Abstract:
Most of the existing quantum-inspired models are based on amplitude-phase embedding to model natural language, which maps words into Hilbert space. In quantum-computing theory, the vectors corresponding to quantum states are all complex values, so there is a gap between these two areas. Presently, complex-valued neural networks have been studied, but their practical applications are few, let alone in the downstream tasks of natural language processing such as sentiment analysis and language modeling. In fact, the complex-valued neural network can use the imaginary part information to embed hidden information and can express more complex information, which is suitable for modeling complex natural language. Meanwhile, quantum-inspired models are defined in Hilbert space, which is also a complex space. So it is natural to construct quantum-inspired models based on complex-valued neural networks. Therefore, we propose a new quantum-inspired model for NLP, ComplexQNN, which contains a complex-valued embedding layer, a quantum encoding layer, and a measurement layer. The modules of ComplexQNN are fully based on complex-valued neural networks. It is more in line with quantum-computing theory and easier to transfer to quantum computers in the future to achieve exponential acceleration. We conducted experiments on six sentiment-classification datasets comparing with five classical models (TextCNN, GRU, ELMo, BERT, and RoBERTa). The results show that our model has improved by 10% in accuracy metric compared with TextCNN and GRU, and has competitive experimental results with ELMo, BERT, and RoBERTa.
APA, Harvard, Vancouver, ISO and other styles
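The amplitude-phase embedding this abstract refers to maps each word to a complex vector r·e^{iθ}, so the modulus carries amplitude and the argument carries phase. A minimal NumPy sketch with hypothetical random parameters (an illustration, not ComplexQNN's actual layer):

```python
import numpy as np

# Hypothetical amplitude-phase word embedding: each word w gets learnable
# amplitudes r >= 0 and phases theta; its vector is r * exp(i * theta).
rng = np.random.default_rng(0)
vocab, dim = 100, 8
amplitude = rng.random((vocab, dim))              # stand-in for learned r
phase = rng.uniform(0.0, 2.0 * np.pi, (vocab, dim))  # stand-in for learned theta
embedding = amplitude * np.exp(1j * phase)        # complex-valued word vectors

word = embedding[42]
print(word.dtype)                                  # complex128
print(np.allclose(np.abs(word), amplitude[42]))    # True: modulus recovers amplitude
```

Because the imaginary part is an extra degree of freedom per dimension, such embeddings can encode information a real-valued vector of the same width cannot, which is the motivation the abstract gives.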
38

Tanaka, Gouhei. "Complex-Valued Neural Networks: Advances and Applications [Book Review]". IEEE Computational Intelligence Magazine 8, no. 2 (May 2013): 77–79. http://dx.doi.org/10.1109/mci.2013.2247895.

Full text source
APA, Harvard, Vancouver, ISO and other styles
39

Goh, Su Lee, and Danilo P. Mandic. "An augmented CRTRL for complex-valued recurrent neural networks". Neural Networks 20, no. 10 (December 2007): 1061–66. http://dx.doi.org/10.1016/j.neunet.2007.09.015.

Full text source
APA, Harvard, Vancouver, ISO and other styles
40

Dghais, Wael, Vitor Ribeiro, Zhansheng Liu, Zoran Vujicic, Manuel Violas and António Teixeira. "Efficient RSOA modelling using polar complex-valued neural networks". Optics Communications 334 (January 2015): 129–32. http://dx.doi.org/10.1016/j.optcom.2014.08.031.

Full text source
APA, Harvard, Vancouver, ISO and other styles
41

Kobayashi, Masaki. "Fixed points of symmetric complex-valued Hopfield neural networks". Neurocomputing 275 (January 2018): 132–36. http://dx.doi.org/10.1016/j.neucom.2017.05.006.

Full text source
APA, Harvard, Vancouver, ISO and other styles
42

Kobayashi, Masaki. "Synthesis of complex- and hyperbolic-valued Hopfield neural networks". Neurocomputing 423 (January 2021): 80–88. http://dx.doi.org/10.1016/j.neucom.2020.10.002.

Full text source
APA, Harvard, Vancouver, ISO and other styles
43

Gong, Weiqiang, Jinling Liang, Xiu Kan and Xiaobing Nie. "Robust State Estimation for Delayed Complex-Valued Neural Networks". Neural Processing Letters 46, no. 3 (7.04.2017): 1009–29. http://dx.doi.org/10.1007/s11063-017-9626-2.

Full text source
APA, Harvard, Vancouver, ISO and other styles
44

Georgiou, G. "Complex-Valued Neural Networks (Hirose, A.; 2006) [Book review]". IEEE Transactions on Neural Networks 19, no. 3 (March 2008): 544. http://dx.doi.org/10.1109/tnn.2008.919145.

Full text source
APA, Harvard, Vancouver, ISO and other styles
45

Wang, Huamin, Shukai Duan, Tingwen Huang, Lidan Wang and Chuandong Li. "Exponential Stability of Complex-Valued Memristive Recurrent Neural Networks". IEEE Transactions on Neural Networks and Learning Systems 28, no. 3 (March 2017): 766–71. http://dx.doi.org/10.1109/tnnls.2015.2513001.

Full text source
APA, Harvard, Vancouver, ISO and other styles
46

Zhang, Weiwei, Jinde Cao, Dingyuan Chen and Fuad Alsaadi. "Synchronization in Fractional-Order Complex-Valued Delayed Neural Networks". Entropy 20, no. 1 (12.01.2018): 54. http://dx.doi.org/10.3390/e20010054.

Full text source
APA, Harvard, Vancouver, ISO and other styles
47

Hirose, Akira. "Fractal variation of attractors in complex-valued neural networks". Neural Processing Letters 1, no. 1 (March 1994): 6–8. http://dx.doi.org/10.1007/bf02312393.

Full text source
APA, Harvard, Vancouver, ISO and other styles
48

Nitta, Tohru. "Resolution of singularities via deep complex-valued neural networks". Mathematical Methods in the Applied Sciences 41, no. 11 (12.05.2017): 4170–78. http://dx.doi.org/10.1002/mma.4434.

Full text source
APA, Harvard, Vancouver, ISO and other styles
49

Gong, Weiqiang, Jinling Liang and Congjun Zhang. "Multistability of complex-valued neural networks with distributed delays". Neural Computing and Applications 28, S1 (18.04.2016): 1–14. http://dx.doi.org/10.1007/s00521-016-2305-9.

Full text source
APA, Harvard, Vancouver, ISO and other styles
50

Shu, Jinlong, Lianglin Xiong, Tao Wu and Zixin Liu. "Stability Analysis of Quaternion-Valued Neutral-Type Neural Networks with Time-Varying Delay". Mathematics 7, no. 1 (18.01.2019): 101. http://dx.doi.org/10.3390/math7010101.

Full text source
Abstract:
This paper addresses the problem of global μ-stability for quaternion-valued neutral-type neural networks (QVNTNNs) with time-varying delays. First, QVNTNNs are transformed into two complex-valued systems by using a transformation to reduce the complexity of the computation generated by the non-commutativity of quaternion multiplication. A new convex inequality in a complex field is introduced. In what follows, the condition for the existence and uniqueness of the equilibrium point is primarily obtained by the homeomorphism theory. Next, the global stability conditions of the complex-valued systems are provided by constructing a novel Lyapunov–Krasovskii functional, using an integral inequality technique, and a reciprocal convex combination approach. The gained global μ-stability conditions can be divided into three different kinds of stability forms by varying the positive continuous function μ(t). Finally, three reliable examples and a simulation are given to display the effectiveness of the proposed methods.
APA, Harvard, Vancouver, ISO and other styles
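The quaternion-to-complex transformation this abstract mentions rests on the standard decomposition of a quaternion into a pair of complex numbers, q = (q0 + q1·i) + (q2 + q3·i)·j, under which quaternion products reduce to complex arithmetic with conjugation. An illustrative sketch of that decomposition (an assumption about the construction, not the paper's code):

```python
# Quaternions as pairs of complex numbers via q = a + b*j (a, b complex).
# Using j*i = -i*j, one gets (a + b*j)(c + d*j) = (a*c - b*conj(d)) + (a*d + b*conj(c))*j.
def quat_mul_as_complex(a, b, c, d):
    """Multiply quaternions q1 = a + b*j and q2 = c + d*j; returns the pair (real-complex, j-complex)."""
    return (a * c - b * d.conjugate(), a * d + b * c.conjugate())

# Basis checks: i = (1j, 0), j = (0, 1), k = i*j = (0, 1j).
print(quat_mul_as_complex(1j, 0j, 0j, 1 + 0j))  # (0j, 1j)  -> i*j = k
print(quat_mul_as_complex(0j, 1 + 0j, 1j, 0j))  # (0j, -1j) -> j*i = -k
```

Splitting the non-commutative quaternion system into two coupled complex-valued systems in this way is what lets the paper apply complex-domain Lyapunov–Krasovskii machinery.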