Academic literature on the topic 'Stochastic neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Stochastic neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Stochastic neural networks"

1

Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks." International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wong, Eugene. "Stochastic neural networks." Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix." Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Full text
Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays based on the M-matrix approach is studied. Firstly, we design a control law for stochastic synchronization of the neutral-type, multiple-time-delay neural network. Secondly, by making use of a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple-time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality, which can be easily solved by MATLAB. Finally, a numerical example illustrates the effectiveness of the method and the result obtained in this paper. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
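The criterion described in the abstract above rests on M-matrix theory. As a hedged illustration only, the following minimal Python sketch implements the standard test for a nonsingular M-matrix (non-positive off-diagonal entries and eigenvalues with positive real parts); the matrix A is a toy example, not one derived from the paper's network parameters.

```python
import numpy as np

def is_nonsingular_m_matrix(A: np.ndarray, tol: float = 1e-10) -> bool:
    """Check that off-diagonal entries are non-positive and that all
    eigenvalues have positive real parts (nonsingular M-matrix test)."""
    off_diag_ok = np.all(A - np.diag(np.diag(A)) <= tol)
    eig_ok = np.all(np.real(np.linalg.eigvals(A)) > tol)
    return bool(off_diag_ok and eig_ok)

# Toy example: diagonally dominant with non-positive off-diagonal entries.
A = np.array([[ 3.0, -1.0],
              [-0.5,  2.0]])
print(is_nonsingular_m_matrix(A))  # True
```

In synchronization criteria of this kind, such a test is typically applied to a matrix assembled from the connection weights, delay bounds, and noise intensities of the drive-response pair.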
4

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sakthivel, Rathinasamy, R. Samidurai, and S. Marshal Anthoni. "Exponential Stability for Stochastic Neural Networks of Neutral Type with Impulsive Effects." Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Full text
Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing a Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of a linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
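Stability conditions of the kind derived above are checked numerically as a Lyapunov-type certificate. The abstract mentions a linear matrix inequality; as a hedged sketch of the same flavor (a classical Lyapunov equation for a toy, delay-free linear system, not the paper's neutral-type impulsive criterion), one can compute the certificate with SciPy:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical linearized system matrix (toy example, not from the paper).
A = np.array([[-2.0,  0.5],
              [ 0.3, -1.5]])
Q = np.eye(2)

# Solve A^T P + P A = -Q; if P is positive definite, V(x) = x^T P x is a
# Lyapunov function certifying exponential stability of dx/dt = A x.
P = solve_continuous_lyapunov(A.T, -Q)
print("P =", P)
print("positive definite:", bool(np.all(np.linalg.eigvalsh(P) > 0)))
```

The LMI formulation in the paper plays the same role, with additional terms accounting for the stochastic perturbation, the neutral delay, and the impulses.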
7

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Full text
Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by means of a transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the added neutral terms and time-varying delays are smaller than the derived upper bounds, then the perturbed neural network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example is given to illustrate the presented criteria. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
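The delay and contraction-coefficient bounds mentioned above come from a scalar transcendental equation, which in practice is solved by a one-dimensional root search. The sketch below is generic and hedged: the function f and the constants a and b are hypothetical placeholders, not the paper's actual condition.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical transcendental condition: the admissible bound tau* solves
# f(tau) = tau * exp(a * tau) - b = 0 for problem-dependent constants a, b.
a, b = 0.8, 0.5
f = lambda tau: tau * np.exp(a * tau) - b
tau_star = brentq(f, 0.0, 10.0)  # root is bracketed on [0, 10]
print(tau_star)
```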
8

Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks." IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Peretto, Pierre, and Jean-jacques Niez. "Stochastic Dynamics of Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hurtado, Jorge E. "Neural networks in stochastic mechanics." Archives of Computational Methods in Engineering 8, no. 3 (September 2001): 303–42. http://dx.doi.org/10.1007/bf02736646.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Stochastic neural networks"

1

Pensuwon, Wanida. "Stochastic dynamic hierarchical neural networks." Thesis, University of Hertfordshire, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Campos, Luciana Conceicao Dias. "Periodic Stochastic Model Based on Neural Networks." Pontifícia Universidade Católica do Rio de Janeiro, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=17076@1.

Full text
Abstract:
A stochastic process is a branch of probability theory that defines a set of models for studying problems with random components. Many real problems exhibit complex characteristics, such as nonlinearity and chaotic behavior, that require models capable of capturing the true characteristics of the problem in order to treat it appropriately. Existing models, however, are either linear, and so poorly suited to such problems; require a complex formulation whose applicability is limited and problem-specific; or depend on a priori assumptions about the problem's behavior before they can be applied. This motivated the development of a new, generic, intrinsically nonlinear stochastic process model that can be applied to a range of nonlinear phenomena with highly stochastic, and even periodic, behavior. Because artificial neural networks are nonlinear parametric models that are simple to understand and implement and can capture the behavior of many kinds of problems, they were chosen as the basis of the new model proposed in this thesis, called the Neural Stochastic Process. The nonlinearity supplied by the neural networks allows the model to adequately capture the behavior of historical series from nonlinear phenomena with highly stochastic and even periodic characteristics. The goal is to use this model to generate synthetic time series, equally likely to the historical series, for problems involving, for example, climatological and economic phenomena. As a case study, the proposed model is applied to the treatment of monthly inflows in the context of operation planning for the Brazilian hydrothermal system. The results show that the Neural Stochastic Process can generate synthetic series with characteristics similar to the historical series of monthly inflows.
APA, Harvard, Vancouver, ISO, and other styles
3

Sutherland, Connie. "Spatio-temporal feedback in stochastic neural networks." Thesis, University of Ottawa (Canada), 2007. http://hdl.handle.net/10393/27559.

Full text
Abstract:
The mechanisms by which groups of neurons interact are an important facet of understanding how the brain functions. Here we study stochastic neural networks with delayed feedback. The first part of our study looks at how feedback and noise affect the mean firing rate of the network. Secondly, we look at how the spatial profile of the feedback affects the behavior of the network. Our numerical and theoretical results show that negative (inhibitory) feedback linearizes the frequency vs. input current (f-I) curve via the divisive gain effect it has on the network. The interaction of the inhibitory feedback and the input bias is what produces the divisive decrease in the slope (known as the gain) of the f-I curve. Our work predicts that an increase in noise is required along with an increase in inhibitory feedback to attain the divisive and subtractive shift of the gain seen in experiments [1]. Our results also show that, although the spatial profile of the feedback does not affect the mean activity of the network, it does influence the overall dynamics of the network. Local feedback generates a network oscillation that is more robust against disruption by noise, uncorrelated input, or network heterogeneity than the one generated by global feedback (all-to-all coupling). For example, uncorrelated input completely disrupts the network oscillation generated by global feedback, but only diminishes the network oscillation due to local feedback. This is characterized by first- and second-order spike-train statistics. Further, our theory agrees well with numerical simulations of network dynamics. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
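To give a concrete feel for the divisive gain effect described in the abstract, the toy simulation below estimates the firing rate of a single noisy leaky integrate-and-fire unit whose input is reduced by a delayed, low-pass-filtered copy of its own spiking activity. It is a heavily simplified, hedged stand-in for the network model studied in the thesis, with all parameter values chosen arbitrarily.

```python
import numpy as np

def mean_rate(I_bias, g_inh=0.0, noise=0.5, delay=5.0,
              T=2000.0, dt=0.1, tau=20.0, v_th=1.0, v_reset=0.0, seed=0):
    """Mean firing rate (spikes/s) of a noisy leaky integrate-and-fire unit
    with delayed inhibitory feedback from its own filtered activity
    (all times in milliseconds)."""
    rng = np.random.default_rng(seed)
    steps, d = int(T / dt), int(delay / dt)
    v, a, tau_a = 0.0, 0.0, 50.0
    fb_trace = np.zeros(steps)   # history of the low-pass filtered activity
    spikes = 0
    for i in range(steps):
        fb = fb_trace[i - d] if i >= d else 0.0
        v += (-v + I_bias - g_inh * fb) / tau * dt \
             + noise * np.sqrt(dt) * rng.standard_normal()
        if v >= v_th:
            v = v_reset
            spikes += 1
            a += 1.0             # unit jump per spike
        a -= a / tau_a * dt      # exponential decay of the filtered activity
        fb_trace[i] = a
    return spikes / (T / 1000.0)

# f-I points without and with inhibitory feedback.
for I in (0.5, 1.0, 1.5, 2.0):
    print(I, round(mean_rate(I, g_inh=0.0), 1), round(mean_rate(I, g_inh=2.0), 1))
```

Comparing the two printed columns gives a rough sense of how inhibitory feedback flattens (divisively scales) the f-I curve, which is the effect the thesis analyzes in a full spatio-temporal network.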
4

Ling, Hong. "Implementation of Stochastic Neural Networks for Approximating Random Processes." Master's thesis, Lincoln University. Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080108.124352/.

Full text
Abstract:
Artificial Neural Networks (ANNs) can be viewed as a mathematical model that simulates natural and biological systems by mimicking the information-processing methods of the human brain. Current ANNs focus only on approximating arbitrary deterministic input-output mappings. However, they do not adequately represent the variability observed in systems' natural settings, nor do they capture the complexity of whole-system behaviour. This thesis addresses the development of a new class of neural networks, called Stochastic Neural Networks (SNNs), to simulate the internal stochastic properties of systems. The development of a suitable mathematical model for SNNs is based on the canonical representation of stochastic processes or systems by means of the Karhunen-Loève theorem. Successful real examples, such as the analysis of the full displacement field of wood in compression, confirm the validity of the proposed neural networks. Furthermore, analysis of the internal workings of SNNs provides an in-depth view of their operation that helps in gaining a better understanding of how SNNs simulate stochastic processes. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
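The Karhunen-Loève representation referred to above expands a random process in the eigenfunctions of its covariance with uncorrelated random coefficients. The minimal numerical sketch below is hedged: it uses toy Brownian-motion-like paths rather than any data or architecture from the thesis, estimates the expansion empirically, and samples new paths from it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble of sample paths (assumption: Brownian-motion-like process).
n_paths, n_t = 500, 100
t = np.linspace(0.0, 1.0, n_t)
X = np.cumsum(rng.standard_normal((n_paths, n_t)) * np.sqrt(t[1] - t[0]), axis=1)

# Empirical Karhunen-Loeve expansion: eigen-decompose the path covariance.
mean = X.mean(axis=0)
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
idx = np.argsort(eigvals)[::-1][:10]        # keep the 10 leading modes
lam, phi = eigvals[idx], eigvecs[:, idx]

# Generate new, statistically similar paths by sampling the KL coefficients.
z = rng.standard_normal((5, len(lam)))
X_new = mean + z @ (np.sqrt(lam)[:, None] * phi.T)
print(X_new.shape)                          # (5, 100)
```

For a Gaussian process, drawing the coefficients independently as N(0, λ_k) reproduces the process's second-order statistics, which is the property a Karhunen-Loève-based stochastic network exploits.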
5

Zhao, Jieyu. "Stochastic bit stream neural networks : theory, simulations and applications." Thesis, Royal Holloway, University of London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338916.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hyland, P. "On the implementation of neural networks using stochastic arithmetic." Thesis, Bangor University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306224.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Todeschi, Tiziano. "Calibration of local-stochastic volatility models with neural networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23052/.

Full text
Abstract:
During the last twenty years several models have been proposed to improve the classic Black-Scholes framework for equity derivatives pricing. Recently a new model has been proposed: the Local-Stochastic Volatility (LSV) model. This model treats volatility as the product of a deterministic term and a stochastic term. So far, the choice of model has been driven not only by how well it captures empirically observed market features, but also by the computational tractability of the calibration process. This is now undergoing a big change, since machine learning technologies offer new perspectives on model calibration. In this thesis we consider the calibration problem as the search for a model which generates given market prices, where technology from generative adversarial networks can additionally be used. This means parametrizing the model pool in a way that is accessible to machine learning techniques and interpreting the inverse problem as the training task of a generative network whose quality is assessed by an adversary. The calibration algorithm proposed for LSV models uses so-called neural stochastic differential equations (SDEs) as generative models, which simply means parameterizing the drift and volatility of an Itô SDE by neural networks. (A brief illustrative sketch follows this entry.)
APA, Harvard, Vancouver, ISO, and other styles
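A neural SDE in the sense described above parameterizes the drift and volatility of an Itô SDE with neural networks and simulates paths by an Euler-Maruyama scheme. The PyTorch sketch below is a hedged, minimal illustration with hypothetical architecture choices and no calibration loop; it only shows how Monte Carlo option prices would be produced from such a model so that they could later be matched to market quotes.

```python
import torch
import torch.nn as nn

class NeuralSDE(nn.Module):
    """Toy neural SDE: dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t,
    with drift and volatility given by small MLPs."""
    def __init__(self, hidden=32):
        super().__init__()
        self.mu = nn.Sequential(nn.Linear(2, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.sigma = nn.Sequential(nn.Linear(2, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1), nn.Softplus())

    def forward(self, x0, n_steps=100, dt=1.0 / 100, n_paths=1000):
        x = torch.full((n_paths, 1), x0)
        for i in range(n_steps):                      # Euler-Maruyama steps
            t = torch.full((n_paths, 1), i * dt)
            inp = torch.cat([t, x], dim=1)
            dw = torch.randn(n_paths, 1) * dt ** 0.5
            x = x + self.mu(inp) * dt + self.sigma(inp) * dw
        return x

model = NeuralSDE()
x_T = model(x0=1.0)                                   # simulated terminal values
strikes = torch.tensor([0.9, 1.0, 1.1])
# Monte Carlo call prices (zero rates assumed for simplicity).
prices = torch.clamp(x_T - strikes, min=0.0).mean(dim=0)
print(prices)
```

In a full calibration, these simulated prices would enter a loss against observed market quotes (or an adversarial critic, as in the thesis), and gradients would flow back through the simulation into the drift and volatility networks.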
8

陳穎志 and Wing-chi Chan. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31241475.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chan, Wing-chi. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks." Hong Kong: University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22925843.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Rising, Barry John Paul. "Hardware architectures for stochastic bit-stream neural networks : design and implementation." Thesis, Royal Holloway, University of London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326219.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Stochastic neural networks"

1

Zhou, Wuneng, Jun Yang, Liuwei Zhou, and Dongbing Tong. Stability and Synchronization Control of Stochastic Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-47833-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

International Conference on Applied Stochastic Models and Data Analysis (12th : 2007 : Chania, Greece), ed. Advances in data analysis: Theory and applications to reliability and inference, data mining, bioinformatics, lifetime data, and neural networks. Boston: Birkhäuser, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Thathachar, Mandayam A. L. Networks of learning automata: Techniques for online stochastic optimization. Boston, MA: Kluwer Academic, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sastry, P. S., ed. Networks of learning automata: Techniques for online stochastic optimization. Boston: Kluwer Academic, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001: Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont.: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Curram, Stephen. Representing intelligent decision making in discrete event simulation: A stochastic neural network approach. [s.l.]: typescript, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Falmagne, Jean-Claude, David Eppstein, Christopher Doble, Dietrich Albert, and Xiangen Hu. Knowledge spaces: Applications in education. Heidelberg: Springer, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Falmagne, Jean-Claude. Knowledge Spaces: Applications in Education. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hangartner, Ricky Dale. Probabilistic computation in stochastic pulse neuromime networks. 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Stochastic neural networks"

1

Rojas, Raúl. "Stochastic Networks." In Neural Networks, 371–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Müller, Berndt, and Joachim Reinhardt. "Stochastic Neurons." In Neural Networks, 37–44. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. "Stochastic Neurons." In Neural Networks, 38–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hänggi, Martin, and George S. Moschytz. "Stochastic Optimization." In Cellular Neural Networks, 101–25. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4757-3220-7_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rennolls, Keith, Alan Soper, Phil Robbins, and Ray Guthrie. "Stochastic Neural Networks." In ICANN ’93, 481. London: Springer London, 1993. http://dx.doi.org/10.1007/978-1-4471-2063-6_122.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, Yumin, Lei Guo, Lingyao Wu, and Chunbo Feng. "On Stochastic Neutral Neural Networks." In Advances in Neural Networks — ISNN 2005, 69–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Siegelmann, Hava T. "Stochastic Dynamics." In Neural Networks and Analog Computation, 121–39. Boston, MA: Birkhäuser Boston, 1999. http://dx.doi.org/10.1007/978-1-4612-0707-8_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Golea, Mostefa, Masahiro Matsuoka, and Yasubumi Sakakibara. "Stochastic simple recurrent neural networks." In Grammatical Inference: Learning Syntax from Sentences, 262–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/bfb0033360.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hidalgo, Jorge, Luís F. Seoane, Jesús M. Cortés, and Miguel A. Muñoz. "Stochastic Amplification in Neural Networks." In Trends in Mathematics, 45–49. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08138-0_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Herve, Thierry, Olivier Francois, and Jacques Demongeot. "Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation?" In Neural Networks, 81–89. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Stochastic neural networks"

1

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020. http://dx.doi.org/10.1109/icassp40776.2020.9054424.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Zhao, J. "Stochastic connection neural networks." In 4th International Conference on Artificial Neural Networks. IEE, 1995. http://dx.doi.org/10.1049/cp:19950525.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Chien, Jen-Tzung, and Yu-Min Huang. "Stochastic Convolutional Recurrent Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206970.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Galan-Prado, Fabio, Alejandro Moran, Joan Font, Miquel Roca, and Josep L. Rossello. "Stochastic Radial Basis Neural Networks." In 2019 29th International Symposium on Power and Timing Modeling, Optimization and Simulation (PATMOS). IEEE, 2019. http://dx.doi.org/10.1109/patmos.2019.8862129.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Ramakrishnan, Swathika, and Dhireesha Kudithipudi. "On accelerating stochastic neural networks." In NANOCOM '17: ACM The Fourth Annual International Conference on Nanoscale Computing and Communication. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3109453.3123959.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Weller, Dennis D., Nathaniel Bleier, Michael Hefenbrock, Jasmin Aghassi-Hagmann, Michael Beigl, Rakesh Kumar, and Mehdi B. Tahoori. "Printed Stochastic Computing Neural Networks." In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2021. http://dx.doi.org/10.23919/date51398.2021.9474254.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Gulshad, Sadaf, Dick Sigmund, and Jong-Hwan Kim. "Learning to reproduce stochastic time series using stochastic LSTM." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965942.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nikolic, Konstantin P., and Ivan B. Scepanovic. "Stochastic search-based neural networks learning algorithms." In 2008 9th Symposium on Neural Network Applications in Electrical Engineering. IEEE, 2008. http://dx.doi.org/10.1109/neurel.2008.4685579.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Juuti, Mika, Francesco Corona, and Juha Karhunen. "Stochastic Discriminant Analysis." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280609.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kosko, B. "Stochastic competitive learning." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137718.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Stochastic neural networks"

1

Burton, Robert M., Jr. Topics in Stochastics, Symbolic Dynamics and Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1996. http://dx.doi.org/10.21236/ada336426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
