Ready-made bibliography on the topic "Complex-valued neural networks"

Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles.

See lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Complex-valued neural networks".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a scholarly publication as a ".pdf" file and read its abstract online, provided the relevant parameters are available in the metadata.

Journal articles on the topic "Complex-valued neural networks"

1. Hirose, Akira. "Complex-valued Neural Networks". IEEJ Transactions on Electronics, Information and Systems 131, no. 1 (2011): 2–8. http://dx.doi.org/10.1541/ieejeiss.131.2.

2. Boonsatit, Nattakan, Santhakumari Rajendran, Chee Peng Lim, Anuwat Jirawattanapanit and Praneesh Mohandas. "New Adaptive Finite-Time Cluster Synchronization of Neutral-Type Complex-Valued Coupled Neural Networks with Mixed Time Delays". Fractal and Fractional 6, no. 9 (September 13, 2022): 515. http://dx.doi.org/10.3390/fractalfract6090515.

Abstract:
The issue of adaptive finite-time cluster synchronization corresponding to neutral-type coupled complex-valued neural networks with mixed delays is examined in this research. A neutral-type coupled complex-valued neural network with mixed delays is more general than that of a traditional neural network, since it considers distributed delays, state delays and coupling delays. In this research, a new adaptive control technique is developed to synchronize neutral-type coupled complex-valued neural networks with mixed delays in finite time. To stabilize the resulting closed-loop system, the Lyapunov stability argument is leveraged to infer the necessary requirements on the control factors. The effectiveness of the proposed method is illustrated through simulation studies.

3. Nitta, Tohru. "Orthogonality of Decision Boundaries in Complex-Valued Neural Networks". Neural Computation 16, no. 1 (January 1, 2004): 73–97. http://dx.doi.org/10.1162/08997660460734001.

Abstract:
This letter presents some results of an analysis of the decision boundaries of complex-valued neural networks whose weights, threshold values, and input and output signals are all complex numbers. The main results may be summarized as follows. (1) A decision boundary of a single complex-valued neuron consists of two hypersurfaces that intersect orthogonally and divide a decision region into four equal sections. The XOR problem and the detection-of-symmetry problem, which cannot be solved with two-layered real-valued neural networks, can be solved by two-layered complex-valued neural networks with orthogonal decision boundaries, which reveals the potent computational power of complex-valued neural nets. Furthermore, the fading equalization problem can be successfully solved by the two-layered complex-valued neural network with the highest generalization ability. (2) A decision boundary of a three-layered complex-valued neural network has the orthogonal property as a basic structure, and its two hypersurfaces approach orthogonality as all the net inputs to each hidden neuron grow. In particular, most of the decision boundaries in the three-layered complex-valued neural network intersect orthogonally when the network is trained using the Complex-BP algorithm. As a result, the orthogonality of the decision boundaries improves its generalization ability. (3) The average learning speed of the Complex-BP is several times faster than that of the Real-BP, and the standard deviation of the learning speed of the Complex-BP is smaller than that of the Real-BP. For these reasons, the complex-valued neural network and the related algorithm appear natural for learning complex-valued patterns.
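The orthogonality result in point (1) can be checked directly: for a single complex-valued neuron with net input u = w·z + b, the surfaces Re(u) = 0 and Im(u) = 0 always meet at a right angle, because the gradients of the real and imaginary parts of a complex-linear map are orthogonal (a Cauchy-Riemann consequence). A minimal numerical sketch, with illustrative values not taken from the paper:

```python
import numpy as np

def neuron_output(z, w, b):
    """Net input of a single complex-valued neuron: u = w*z + b."""
    return w * z + b

# Treat z = x + iy as a point in R^2 and look at the two decision
# surfaces Re(u) = 0 and Im(u) = 0.  For u = w*z + b with w = a + ic,
# Re(u) = a*x - c*y + Re(b) and Im(u) = c*x + a*y + Im(b), so the
# normal vectors of the two boundary lines are (a, -c) and (c, a).
w = 1.3 - 0.7j
b = 0.2 + 0.5j
n_re = np.array([w.real, -w.imag])   # gradient of Re(u) w.r.t. (x, y)
n_im = np.array([w.imag,  w.real])   # gradient of Im(u) w.r.t. (x, y)

# The dot product vanishes for every w, i.e. the two hypersurfaces
# always intersect at a right angle.
print(np.dot(n_re, n_im))            # -> 0.0

# The pair (sign Re(u), sign Im(u)) therefore splits the plane into
# four quadrant-like regions around the boundary intersection.
z = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])
u = neuron_output(z, w, 0)
print(np.sign(u.real), np.sign(u.imag))
```

This is the geometric structure the abstract exploits: one complex neuron already carves the plane into four sections, which two real-valued weights cannot do.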

4. Nitta, Tohru. "Learning Transformations with Complex-Valued Neurocomputing". International Journal of Organizational and Collective Intelligence 3, no. 2 (April 2012): 81–116. http://dx.doi.org/10.4018/joci.2012040103.

Abstract:
The ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations has been applied to the estimation of optical flows and the generation of fractal images. The complex-valued neural network has adaptability and generalization ability as its inherent nature; this is the key difference between the 1-n-1 complex-valued neural network's ability to learn 2D affine transformations and standard techniques for 2D affine transformations such as the Fourier descriptor. Clarifying the properties of complex-valued neural networks is important for accelerating their practical application. In this paper, the generalization ability of the 1-n-1 complex-valued neural network that has learned complicated rotations on a 2D plane is first examined experimentally and analytically. Next, the behavior of the 1-n-1 complex-valued neural network that has learned a transformation on the Steiner circles is demonstrated, and the relationship between the values of the complex-valued weights after training and a linear transformation related to the Steiner circles is clarified via computer simulations. Furthermore, the relationship between the weight values of a 1-n-1 complex-valued neural network that has learned 2D affine transformations and the learning patterns used is elucidated. These results make it possible to solve complicated problems more simply and efficiently with 1-n-1 complex-valued neural networks. As an example, an application of the 1-n-1 complex-valued neural network to an associative memory is presented.
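The mechanism behind learning 2D transformations is that multiplying by a complex number rotates and scales the plane, so even a single complex weight can recover a rotation from input/output pairs. A toy least-squares sketch of this idea (this is not the paper's 1-n-1 network; names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: rotation by 30 degrees, represented as multiplication
# by the unit complex number e^{i*pi/6}.
theta = np.pi / 6
target_w = np.exp(1j * theta)

# Training pairs (z_k, target_w * z_k): 2D points encoded as
# complex numbers, as in transformation-learning setups.
z = rng.standard_normal(64) + 1j * rng.standard_normal(64)
t = target_w * z

# Least-squares estimate of a single complex weight w minimizing
# sum_k |w*z_k - t_k|^2; closed form: w = <z, t> / <z, z>.
w = np.vdot(z, t) / np.vdot(z, z)
print(np.allclose(w, target_w))      # -> True
print(np.degrees(np.angle(w)))       # -> ~30.0
```

A real-valued network has to discover the rotation structure; in the complex domain it is a single multiplication, which is one intuition for the generalization results described above.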

5. Guo, Song and Bo Du. "Global Exponential Stability of Periodic Solution for Neutral-Type Complex-Valued Neural Networks". Discrete Dynamics in Nature and Society 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/1267954.

Abstract:
This paper deals with a class of neutral-type complex-valued neural networks with delays. By means of Mawhin’s continuation theorem, some criteria on existence of periodic solutions are established for the neutral-type complex-valued neural networks. By constructing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are derived for the global exponential stability of periodic solutions to the neutral-type complex-valued neural networks. Finally, numerical examples are given to show the effectiveness and merits of the present results.

6. Nitta, Tohru. "The Uniqueness Theorem for Complex-Valued Neural Networks with Threshold Parameters and the Redundancy of the Parameters". International Journal of Neural Systems 18, no. 02 (April 2008): 123–34. http://dx.doi.org/10.1142/s0129065708001439.

Abstract:
This paper proves the uniqueness theorem for 3-layered complex-valued neural networks in which the threshold parameters of the hidden neurons can take nonzero values. That is, if a 3-layered complex-valued neural network is irreducible, then the 3-layered complex-valued neural network that approximates a given complex-valued function is uniquely determined up to a finite group of transformations of the learnable parameters of the complex-valued neural network.

7. Valle, Marcos Eduardo. "Complex-Valued Recurrent Correlation Neural Networks". IEEE Transactions on Neural Networks and Learning Systems 25, no. 9 (September 2014): 1600–1612. http://dx.doi.org/10.1109/tnnls.2014.2341013.
8

Kobayashi, Masaki. "Symmetric Complex-Valued Hopfield Neural Networks". IEEE Transactions on Neural Networks and Learning Systems 28, nr 4 (kwiecień 2017): 1011–15. http://dx.doi.org/10.1109/tnnls.2016.2518672.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
9

Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks". Neural Computation 32, nr 11 (listopad 2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Pełny tekst źródła
Streszczenie:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory, but its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks; since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations show that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks. By computer simulations, we also find that the projection rule for hyperbolic-valued Hopfield neural networks in synchronous mode maintains a high noise tolerance.

10. Kobayashi, Masaki. "Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules". Computational Intelligence and Neuroscience 2017 (2017): 1–6. http://dx.doi.org/10.1155/2017/4894278.

Abstract:
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, so CHNNs can store multilevel data, such as gray-scale images. However, CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting local minima. In the present work, we propose a new recall algorithm that eliminates local minima. Computer simulations show that the proposed recall algorithm not only accelerates recall but also improves noise tolerance.
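The projection rule referred to in these CHNN entries can be sketched in a few lines: for stored patterns X with entries drawn from the K-th roots of unity, the weight matrix W = X(X^H X)^{-1} X^H satisfies WX = X, so every stored pattern is a fixed point of synchronous recall. A minimal illustration; the sizes, seed, and activation details below are assumptions, not taken from the paper:

```python
import numpy as np

def csign(u, K):
    """Quantize each entry to the nearest of the K phase states e^{i*2*pi*k/K}."""
    k = np.round(np.angle(u) / (2 * np.pi / K)) % K
    return np.exp(1j * 2 * np.pi * k / K)

rng = np.random.default_rng(1)
N, P, K = 16, 3, 8                      # neurons, stored patterns, phase states

# Random multistate patterns: entries are K-th roots of unity.
X = np.exp(1j * 2 * np.pi * rng.integers(0, K, size=(N, P)) / K)

# Projection rule: W = X (X^H X)^{-1} X^H, so that W X = X exactly.
W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T

# Synchronous recall: each stored pattern is a fixed point, because
# W x = x and csign maps each root of unity back to itself.
x = X[:, 0]
x_next = csign(W @ x, K)
print(np.allclose(x_next, x))           # -> True
```

The fast-recall and bicomplex variants in the entries above modify how W is built or how the update is iterated, but this fixed-point property is the common starting point.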

Doctoral dissertations on the topic "Complex-valued neural networks"

1. Barrachina, Jose Agustin. "Complex-valued neural networks for radar applications". Electronic thesis or dissertation, université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG094.

Abstract:
Radar signal and SAR image processing generally require complex-valued representations and operations, e.g., Fourier and wavelet transforms, Wiener and matched filters, etc. However, the vast majority of deep learning architectures are currently based on real-valued operations, which restricts their ability to learn from complex-valued features. Despite the emergence of Complex-Valued Neural Networks (CVNNs), their application to radar and SAR still lacks studies of their relevance and efficiency, and the comparison against an equivalent Real-Valued Neural Network (RVNN) is usually biased. In this thesis, we investigate the merits of CVNNs for classifying complex-valued data. We show that CVNNs achieve better performance than their real-valued counterparts for classifying non-circular Gaussian data. We also define a criterion of equivalence between feed-forward fully connected and convolutional CVNNs and RVNNs in terms of trainable parameters while keeping a similar architecture. We statistically compare the performance of equivalent Multi-Layer Perceptrons (MLPs), Convolutional Neural Networks (CNNs), and Fully Convolutional Neural Networks (FCNNs) for polarimetric SAR image segmentation. SAR image splitting and class balancing are also studied to avoid learning biases. In parallel, we propose an open-source toolbox to facilitate the implementation of CVNNs and the comparison with real-valued equivalent networks.
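The parameter-equivalence criterion mentioned in this abstract can be illustrated with a simple count: each complex weight contributes two real trainable parameters, so a real network fed with stacked (Re, Im) features and the same hidden width already matches the complex parameter budget, while naively doubling the hidden width doubles it. A sketch under these assumptions (the thesis's exact criterion, e.g. for convolutional layers and biases, may differ):

```python
def complex_mlp_params(n_in, hidden, n_out):
    """Real trainable parameters of a 1-hidden-layer complex MLP (no bias):
    each complex weight counts as 2 real numbers."""
    return 2 * (n_in * hidden + hidden * n_out)

def real_mlp_params(n_in, hidden, n_out):
    """Real trainable parameters of a 1-hidden-layer real MLP (no bias)."""
    return n_in * hidden + hidden * n_out

m, h, n = 64, 100, 10
cvnn = complex_mlp_params(m, h, n)

# A real network fed with stacked (Re, Im) inputs/outputs and the SAME
# hidden width already matches the complex parameter budget ...
rvnn_equiv = real_mlp_params(2 * m, h, 2 * n)

# ... whereas naively doubling the hidden width as well (a common but
# biased comparison) doubles the number of trainable parameters.
rvnn_double = real_mlp_params(2 * m, 2 * h, 2 * n)

print(cvnn == rvnn_equiv, rvnn_double == 2 * cvnn)   # -> True True
```

This is one way to see why comparisons against a "double-size" RVNN are biased: the real network is given twice the capacity in trainable parameters.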

2. Minin, Alexey. "Modeling of Dynamical Systems with Complex Valued Recurrent Neural Networks" (examiners: Alois Knoll, Mark J. Embrechts; supervisors: Alois Knoll, Hans-Georg Zimmermann). München: Universitätsbibliothek der TU München, 2012. http://d-nb.info/1024963985/34.

3. Hu, Qiong. "Statistical parametric speech synthesis based on sinusoidal models". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28719.

Abstract:
This study focuses on improving the quality of statistical speech synthesis based on sinusoidal models. Vocoders play a crucial role during the parametrisation and reconstruction process, so we first conduct an experimental comparison of a broad range of the leading vocoder types. Although our study shows that, for analysis/synthesis, sinusoidal models with complex amplitudes can generate higher-quality speech than source-filter models, the component sinusoids are correlated with each other, and the number of parameters is high and varies from frame to frame, which constrains their application to statistical speech synthesis. Therefore, we first propose a perceptually based dynamic sinusoidal model (PDM) to decrease and fix the number of components typically used in the standard sinusoidal model. Then, in order to apply the proposed vocoder within an HMM-based speech synthesis system (HTS), two strategies for modelling sinusoidal parameters are compared. In the first method (DIR parameterisation), features extracted from the fixed- and low-dimensional PDM are statistically modelled directly. In the second method (INT parameterisation), we convert both the static amplitude and dynamic slope of all the harmonics of a signal, which we term the Harmonic Dynamic Model (HDM), to intermediate parameters (regularised cepstral coefficients, RDC) for modelling. Our results show that HDM with intermediate parameters can generate quality comparable to STRAIGHT. As correlations between features in the dynamic model cannot be modelled satisfactorily by a typical HMM-based system with diagonal covariance, we apply and test a deep neural network (DNN) for modelling features from these two methods. To fully exploit DNN capabilities, we investigate ways to combine INT and DIR at the level of both DNN modelling and waveform generation. For DNN training, we propose multi-task learning to model cepstra (from INT) and log amplitudes (from DIR) as primary and secondary tasks. We conclude from our results that sinusoidal models are indeed highly suited for statistical parametric synthesis; the proposed method outperforms the state-of-the-art STRAIGHT-based equivalent when used in conjunction with DNNs. To further improve voice quality, phase features generated from the proposed vocoder also need to be parameterised and integrated into statistical modelling. Here, an alternative statistical model referred to as the complex-valued neural network (CVNN), which treats complex coefficients as a whole, is proposed to model complex amplitude explicitly. A complex-valued back-propagation algorithm using a logarithmic minimisation criterion that includes both amplitude and phase errors is used as a learning rule. Three parameterisation methods are studied for mapping text to acoustic features: RDC / real-valued log amplitude, complex-valued amplitude with minimum phase, and complex-valued amplitude with mixed phase. Our results show the potential of using CVNNs for modelling both real- and complex-valued acoustic features. Overall, this thesis establishes competitive alternative vocoders for speech parametrisation and reconstruction, and their use with various acoustic models (HMM / DNN / CVNN) clearly demonstrates that applying them to parametric statistical speech synthesis is compelling.
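A logarithmic minimisation criterion that includes both amplitude and phase errors can be sketched via the complex logarithm, whose real part is the log-amplitude error and whose imaginary part is the phase error. This is a hedged sketch of one common formulation (the thesis's exact criterion may differ, and phase wrap-around is ignored here):

```python
import numpy as np

def log_spectral_loss(y, t, eps=1e-12):
    """Squared difference of complex logarithms: the real part of the
    difference is the log-amplitude error, the imaginary part the
    (unwrapped) phase error."""
    d = np.log(y + eps) - np.log(t + eps)
    return np.mean(np.abs(d) ** 2)

# Example: target amplitude 1 at phase 0.2 rad, estimate amplitude 2
# at phase 0.5 rad -> loss = (ln 2)^2 + (0.3)^2.
y = 2.0 * np.exp(1j * 0.5)
t = 1.0 * np.exp(1j * 0.2)
loss = log_spectral_loss(y, t)
print(np.isclose(loss, np.log(2.0) ** 2 + 0.3 ** 2))   # -> True
```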

4. Hu, Jin. "Dynamical analysis of complex-valued recurrent neural networks with time-delays". Thesis (Ph.D.), Chinese University of Hong Kong, 2013. http://library.cuhk.edu.hk/record=b5884392.

Catalog notes: includes bibliographical references (leaves 140–153); electronic reproduction (Hong Kong: Chinese University of Hong Kong, [2012]) available via the World Wide Web; abstracts also in Chinese.

5. Wang, Shu-Fan. "Monaural Source Separation Based on Complex-valued Deep Neural Network". Master's thesis, Department of Computer Science and Information Engineering, National Central University, 2016. http://ndltd.ncl.edu.tw/handle/fyvr7y.

Abstract:
Deep neural networks (DNNs) have become a popular means of separating a target source from a mixed signal. Almost all DNN-based methods modify only the magnitude spectrum of the mixture. The phase spectrum is left unchanged, which is inherent in the short-time Fourier transform (STFT) coefficients of the input signal. However, recent studies have revealed that incorporating phase information can improve the perceptual quality of separated sources. Accordingly, in this work, estimating the STFT coefficients of target sources from an input mixture is regarded as a regression problem. A fully complex-valued deep neural network is developed herein to learn the nonlinear mapping from the complex-valued STFT coefficients of a mixture to those of the sources. The proposed method is applied to speech separation and singing separation.
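The regression view described here, mapping complex STFT coefficients of the mixture to those of a source, can be sketched as a single fully complex-valued layer with a split-type activation. The shapes, activation choice, and all values below are illustrative assumptions, not the thesis's architecture:

```python
import numpy as np

rng = np.random.default_rng(2)

def complex_dense(z, W, b):
    """Fully complex-valued affine layer: acts on STFT coefficients directly."""
    return z @ W + b

def split_tanh(z):
    """Split-type activation: the nonlinearity is applied to the real and
    imaginary parts separately (one common choice for complex networks)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

# Toy dimensions: 257 STFT bins of a mixture frame mapped to the STFT
# coefficients of one target source.
n_bins = 257
W = (rng.standard_normal((n_bins, n_bins))
     + 1j * rng.standard_normal((n_bins, n_bins))) * 0.01
b = np.zeros(n_bins, dtype=complex)

mixture_frame = rng.standard_normal(n_bins) + 1j * rng.standard_normal(n_bins)
source_estimate = split_tanh(complex_dense(mixture_frame, W, b))
print(source_estimate.shape, source_estimate.dtype)   # -> (257,) complex128
```

Because the weights and activations stay complex, magnitude and phase are estimated jointly rather than the phase being copied from the mixture.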

6. Yu, Kuo. "Complex-Valued Deep Recurrent Neural Network for Singing Voice Separation". Master's thesis, Department of Computer Science and Information Engineering, National Central University, 2017. http://ndltd.ncl.edu.tw/handle/4waab5.

Abstract:
Deep neural networks (DNNs) have performed impressively in the processing of multimedia signals. Most DNN-based approaches were developed to handle real-valued data; very few have been designed for complex-valued data, despite their being essential for processing various types of multimedia signals. Accordingly, this work presents a complex-valued deep recurrent neural network (C-DRNN) for singing voice separation. The C-DRNN operates on the complex-valued short-time discrete Fourier transform (STFT) domain. A key aspect of the C-DRNN is that the activations and weights are complex-valued. The goal herein is to reconstruct the singing voice and the background music from a mixed signal. For error back-propagation, CR-calculus is utilized to calculate the complex-valued gradients of the objective function. To reinforce model regularity, two constraints are incorporated into the cost function of the C-DRNN. The first is an additional masking layer that ensures the sum of the separated sources equals the input mixture. The second is a discriminative term that preserves the mutual difference between the two separated sources. Finally, the proposed method is evaluated using the MIR-1K dataset and a singing voice separation task. Experimental results demonstrate that the proposed method outperforms the state-of-the-art DNN-based methods.
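Two ingredients of the C-DRNN described above can be sketched compactly: a CR-calculus (Wirtinger) gradient step for a real-valued loss of a complex variable, and a masking layer that forces the separated sources to sum to the input mixture. The loss, step size, and mask form below are illustrative assumptions:

```python
import numpy as np

# CR-calculus: for the real-valued loss L(z) = |z - t|^2, the Wirtinger
# derivative dL/d(conj z) = z - t, and gradient descent updates
# z <- z - eta * dL/d(conj z).
t = 2.0 - 1.0j
z = 0.0 + 0.0j
eta = 0.1
for _ in range(200):
    z = z - eta * (z - t)
print(abs(z - t) < 1e-6)           # -> True

# Masking layer: rescale two raw source estimates so that they add up
# exactly to the input mixture (a soft time-frequency mask).
def masking_layer(y1, y2, mixture, eps=1e-12):
    m1 = np.abs(y1) / (np.abs(y1) + np.abs(y2) + eps)
    return m1 * mixture, (1 - m1) * mixture

y1 = np.array([1 + 1j, 0.5j])
y2 = np.array([0.5, 1 - 1j])
mix = np.array([1.5 + 0.5j, 0.3 - 0.2j])
s1, s2 = masking_layer(y1, y2, mix)
print(np.allclose(s1 + s2, mix))   # -> True
```

The mask makes the sum-to-mixture constraint hold by construction, so the network never has to learn it.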

Books on the topic "Complex-valued neural networks"

1. Hirose, Akira. Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27632-3.

2. Hirose, Akira. Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/978-3-540-33457-6.

3. Hirose, Akira, ed. Complex-Valued Neural Networks. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.

4. Hirose, Akira. Complex-Valued Neural Networks. 2nd ed. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.

5. Aizenberg, Igor. Complex-Valued Neural Networks with Multi-Valued Neurons. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20353-4.

6. SpringerLink (Online service), ed. Complex-Valued Neural Networks with Multi-Valued Neurons. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011.

7. Suresh, Sundaram, Narasimhan Sundararajan and Ramasamy Savitha. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4.

8. Suresh, Sundaram. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

9. Hirose, Akira, ed. Complex-valued neural networks: Theories and applications. River Edge, NJ: World Scientific, 2003.

10. Zhang, Ziye, Zhen Wang, Jian Chen and Chong Lin. Complex-Valued Neural Networks Systems with Time Delay. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-5450-4.

Book chapters on the topic "Complex-valued neural networks"

1. Hirose, Akira. "Application Fields and Fundamental Merits of Complex-Valued Neural Networks". In Complex-Valued Neural Networks, 1–31. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch1.

2. Wong, Wai Kit, Gin Chong Lee, Chu Kiong Loo, Way Soong Lim and Raymond Lock. "Quaternionic Fuzzy Neural Network for View-Invariant Color Face Image Recognition". In Complex-Valued Neural Networks, 235–78. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch10.

3. Fiori, Simone. "Neural System Learning on Complex-Valued Manifolds". In Complex-Valued Neural Networks, 33–57. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch2.

4. Nitta, Tohru. "N-Dimensional Vector Neuron and Its Application to the N-Bit Parity Problem". In Complex-Valued Neural Networks, 59–74. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch3.

5. Amin, Md Faijul and Kazuyuki Murase. "Learning Algorithms in Complex-Valued Neural Networks using Wirtinger Calculus". In Complex-Valued Neural Networks, 75–102. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch4.

6. Isokawa, Teijiro, Haruhiko Nishimura and Nobuyuki Matsui. "Quaternionic Neural Networks for Associative Memories". In Complex-Valued Neural Networks, 103–31. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch5.

7. Kuroe, Yasuaki. "Models of Recurrent Clifford Neural Networks and Their Dynamics". In Complex-Valued Neural Networks, 133–51. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch6.

8. Savitha, Ramasamy, Sundaram Suresh and Narasimhan Sundararajan. "Meta-Cognitive Complex-Valued Relaxation Network and Its Sequential Learning Algorithm". In Complex-Valued Neural Networks, 153–83. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch7.

9. Manyakov, Nikolay V., Igor Aizenberg, Nikolay Chumerin and Marc M. Van Hulle. "Multilayer Feedforward Neural Network with Multi-Valued Neurons for Brain-Computer Interfacing". In Complex-Valued Neural Networks, 185–208. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch8.

10. Hong, Xia, Sheng Chen and Chris J. Harris. "Complex-Valued B-Spline Neural Networks for Modeling and Inverse of Wiener Systems". In Complex-Valued Neural Networks, 209–34. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch9.

Conference abstracts on the topic "Complex-valued neural networks"

1. Ishikawa, Masaya and Kazuyuki Murase. "Complex-valued online classifier". In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727779.

2. Popa, Calin-Adrian. "Complex-Valued Deep Boltzmann Machines". In 2018 International Joint Conference on Neural Networks (IJCNN). IEEE, 2018. http://dx.doi.org/10.1109/ijcnn.2018.8489359.

3. Kagan, Evgeny, Alexander Rybalov and Ronald Yager. "Complex-Valued Logic for Neural Networks". In 2018 IEEE International Conference on the Science of Electrical Engineering in Israel (ICSEE). IEEE, 2018. http://dx.doi.org/10.1109/icsee.2018.8646029.

4. Popa, Calin-Adrian. "Complex-valued convolutional neural networks for real-valued image classification". In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965936.

5. Savitha, R., S. Suresh and N. Sundararajan. "Complex-valued function approximation using a Fully Complex-valued RBF (FC-RBF) learning algorithm". In 2009 International Joint Conference on Neural Networks (IJCNN 2009 - Atlanta). IEEE, 2009. http://dx.doi.org/10.1109/ijcnn.2009.5178624.

6. Mandic, Danilo P. "Complex valued recurrent neural networks for noncircular complex signals". In 2009 International Joint Conference on Neural Networks (IJCNN 2009 - Atlanta). IEEE, 2009. http://dx.doi.org/10.1109/ijcnn.2009.5178960.

7. Ouabi, Othmane-Latif, Radmila Pribic and Sorin Olaru. "Stochastic Complex-valued Neural Networks for Radar". In 2020 28th European Signal Processing Conference (EUSIPCO). IEEE, 2021. http://dx.doi.org/10.23919/eusipco47968.2020.9287425.

8. Lv, Yiqi, Xiang Meng, Yong Luo and Yan Pei. "Glucose Sensing Utilizing Complex-Valued Neural Networks". In CNIOT'23: 2023 4th International Conference on Computing, Networks and Internet of Things. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3603781.3603845.

9. Ren, Lei, Chunyang Liu and Ying Zhang. "Noise Benefits in Complex-Valued Neural Networks". In 2023 8th International Conference on Image, Vision and Computing (ICIVC). IEEE, 2023. http://dx.doi.org/10.1109/icivc58118.2023.10269993.

10. Masuyama, Naoki and Chu Kiong Loo. "Quantum-Inspired Complex-Valued Multidirectional Associative Memory". In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280403.