Selection of scientific literature on the topic "Neural network adaptation"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Table of contents
Consult the lists of relevant articles, books, theses, reports, and other scholarly sources on the topic "Neural network adaptation".
Next to every source in the list of references there is an "Add to bibliography" button. Use it, and the bibliographic reference to the chosen source will be formatted automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in PDF format and read its online annotation whenever the relevant parameters are available in the metadata.
Journal articles on the topic "Neural network adaptation"
Hylton, Todd. "Thermodynamic Neural Network". Entropy 22, no. 3 (February 25, 2020): 256. http://dx.doi.org/10.3390/e22030256.
Vreeswijk, C. van, and D. Hansel. "Patterns of Synchrony in Neural Networks with Spike Adaptation". Neural Computation 13, no. 5 (May 1, 2001): 959–92. http://dx.doi.org/10.1162/08997660151134280.
Xie, Xurong, Xunying Liu, Tan Lee, and Lan Wang. "Bayesian Learning for Deep Neural Network Adaptation". IEEE/ACM Transactions on Audio, Speech, and Language Processing 29 (2021): 2096–110. http://dx.doi.org/10.1109/taslp.2021.3084072.
Patre, P. M., S. Bhasin, Z. D. Wilcox, and W. E. Dixon. "Composite Adaptation for Neural Network-Based Controllers". IEEE Transactions on Automatic Control 55, no. 4 (April 2010): 944–50. http://dx.doi.org/10.1109/tac.2010.2041682.
Yu, D. L., and T. K. Chang. "Adaptation of diagonal recurrent neural network model". Neural Computing and Applications 14, no. 3 (March 23, 2005): 189–97. http://dx.doi.org/10.1007/s00521-004-0453-9.
Joty, Shafiq, Nadir Durrani, Hassan Sajjad, and Ahmed Abdelali. "Domain adaptation using neural network joint model". Computer Speech & Language 45 (September 2017): 161–79. http://dx.doi.org/10.1016/j.csl.2016.12.006.
Denker, John S. "Neural network models of learning and adaptation". Physica D: Nonlinear Phenomena 22, no. 1-3 (October 1986): 216–32. http://dx.doi.org/10.1016/0167-2789(86)90242-3.
Yaeger, Larry S. "Identifying Neural Network Topologies That Foster Dynamical Complexity". Advances in Complex Systems 16, no. 02n03 (May 2013): 1350032. http://dx.doi.org/10.1142/s021952591350032x.
Ziemke, Tom. "Radar Image Segmentation Using Self-Adapting Recurrent Networks". International Journal of Neural Systems 08, no. 01 (February 1997): 47–54. http://dx.doi.org/10.1142/s0129065797000070.
Li, Xiaofeng, Suying Xiang, Pengfei Zhu, and Min Wu. "Establishing a Dynamic Self-Adaptation Learning Algorithm of the BP Neural Network and Its Applications". International Journal of Bifurcation and Chaos 25, no. 14 (December 30, 2015): 1540030. http://dx.doi.org/10.1142/s0218127415400301.
Dissertations and theses on the topic "Neural network adaptation"
Donati, Lorenzo. "Domain Adaptation through Deep Neural Networks for Health Informatics". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14888/.
Haskey, Stephen. "A modified One-Class-One-Network ANN architecture for dynamic phoneme adaptation". Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/12099.
Wen, Tsung-Hsien. "Recurrent neural network language generation for dialogue systems". Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/275648.
Gangireddy, Siva Reddy. "Recurrent neural network language models for automatic speech recognition". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28990.
Tomashenko, Natalia. "Speaker adaptation of deep neural network acoustic models using Gaussian mixture model framework in automatic speech recognition systems". Thesis, Le Mans, 2017. http://www.theses.fr/2017LEMA1040/document.
Differences between training and testing conditions may significantly degrade recognition accuracy in automatic speech recognition (ASR) systems. Adaptation is an efficient way to reduce the mismatch between models and data from a particular speaker or channel. There are two dominant types of acoustic models (AMs) used in ASR: Gaussian mixture models (GMMs) and deep neural networks (DNNs). The GMM hidden Markov model (GMM-HMM) approach has been one of the most common techniques in ASR systems for many decades. Speaker adaptation is very effective for these AMs, and various adaptation techniques have been developed for them. On the other hand, DNN-HMM AMs have recently achieved great advances and outperformed GMM-HMM models on various ASR tasks. However, speaker adaptation remains very challenging for these AMs. Many adaptation algorithms that work well for GMM systems cannot be easily applied to DNNs because of the different nature of these models. The main purpose of this thesis is to develop a method for efficient transfer of adaptation algorithms from the GMM framework to DNN models. A novel approach for speaker adaptation of DNN AMs is proposed and investigated. The idea of this approach is to use so-called GMM-derived features as input to a DNN. The proposed technique provides a general framework for transferring adaptation algorithms developed for GMMs to DNN adaptation. It is explored for various state-of-the-art ASR systems and is shown to be effective in comparison with, and complementary to, other speaker adaptation techniques.
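The GMM-derived-features idea summarized in the abstract above can be sketched in a few lines. This is a toy illustration on synthetic data, not the thesis implementation: the data, model sizes, and the use of scikit-learn are all assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for acoustic feature frames (e.g. MFCC-like vectors)
# drawn from two well-separated classes.
X = np.vstack([rng.normal(0.0, 1.0, (200, 13)),
               rng.normal(2.0, 1.0, (200, 13))])
y = np.repeat([0, 1], 200)

# Fit a GMM on the raw frames; its per-component posteriors (the
# "GMM-derived features") become the input representation of the DNN.
gmm = GaussianMixture(n_components=8, random_state=0).fit(X)
gmm_features = gmm.predict_proba(X)  # shape (400, 8)

# A small neural network trained on the GMM-derived features.
dnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
dnn.fit(gmm_features, y)
print(dnn.score(gmm_features, y))
```

The appeal of this arrangement, as the abstract describes it, is that adaptation techniques designed for GMMs can be applied to the GMM front end, and the adapted posteriors then flow into the network without retraining it.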
Buttar, Sarpreet Singh. "Applying Artificial Neural Networks to Reduce the Adaptation Space in Self-Adaptive Systems: an exploratory work". Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-87117.
Palapelas Kantola, Philip. "Extreme Quantile Estimation of Downlink Radio Channel Quality". Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177657.
Fic, Miloslav. "Adaptace parametrů ve fuzzy systémech". Master's thesis, Vysoké učení technické v Brně, Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221163.
Vu, Hien Duc. "Adaptation des méthodes d'apprentissage automatique pour la détection de défauts d'arc électriques". Electronic thesis or dissertation, Université de Lorraine, 2019. http://docnum.univ-lorraine.fr/ulprive/DDOC_T_2019_0152_VU.pdf.
The detection of electric arcs occurring in an electrical network by machine learning approaches is at the heart of the work presented in this thesis. The problem was first treated as the classification of fixed-size time series with two classes: normal and fault. This first part builds on the literature, where detection algorithms are organized mainly around a transformation of the signals acquired on the network, followed by the extraction of descriptive features and finally a decision step. The multi-criteria approach adopted here aims to counter systematic classification errors. A methodology for selecting the best combinations of transformations and descriptors has been proposed using learning solutions. As the development of relevant descriptors is always difficult, different solutions offered by deep learning have also been studied. In a second phase, the study focused on the time-varying aspects of fault detection. Two statistical decision paths have been explored: one based on the sequential probability ratio test (SPRT) and the other on LSTM (Long Short-Term Memory) artificial neural networks. Each of these two methods exploits in its own way the output, between 0 and 1 (normal, fault), of a first classification step. The decision by SPRT uses an integration of the initial classification; the LSTM learns to classify data of variable duration. The results of the LSTM network are very promising, but a few aspects remain to be explored. All of this work is based on experiments with the most complete and broadest possible data on 230 V AC networks in domestic and industrial contexts. The accuracy obtained is close to 100% in the majority of situations.
Ainapure, Abhijeet Narhar. "Application and Performance Enhancement of Intelligent Cross-Domain Fault Diagnosis in Rotating Machinery". University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1623164772153736.
Books on the topic "Neural network adaptation"
Lee, Tsu-Chang. Structure level adaptation for artificial neural networks. Boston: Kluwer Academic Publishers, 1991.
Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Boston, MA: Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4.
Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Boston, MA: Springer US, 1991.
Stonier, Russel J., and Xing Huo Yu. Complex systems: Mechanism of adaptation. Amsterdam: IOS Press, 1994.
Neuronal adaptation theory: Including 29 exercises with solutions, 43 essential ideas, and 108 partially coloured figures, experiment explanations, and general theorems. Frankfurt am Main: Peter Lang, 1996.
Haykin, Simon S., ed. Kalman filtering and neural networks. New York: Wiley, 2001.
Stonier, Russel J., and Xing Huo Yu, eds. Complex systems: Mechanism of adaptation. Amsterdam: IOS Press, 1994.
Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001, Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont.: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.
Pucci, Marcello, and Gianpaolo Vitale, eds. Power converters and AC electrical drives with linear neural networks. Boca Raton: CRC Press, 2012.
Channel-Mismatch Compensation in Speaker Identification: Feature Selection and Adaptation with Artificial Neural Networks. Storming Media, 1998.
Book chapters on the topic "Neural network adaptation"
Ljung, L., J. Sjöberg, and H. Hjalmarsson. "On Neural Network Model Structures in System Identification". In Identification, Adaptation, Learning, 366–99. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_9.
Cai, ManJun, JinCun Liu, GuangJun Tian, XueJian Zhang, and TiHua Wu. "Hybrid Neural Network Controller Using Adaptation Algorithm". In Advances in Neural Networks – ISNN 2007, 148–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-72383-7_19.
Patil, Dipali Himmatrao, and Amit Gadekar. "Tuberculosis Detection Using a Deep Neural Network". In Proceedings in Adaptation, Learning and Optimization, 600–608. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-31164-2_51.
Hozjan, Tomaž, Goran Turk, and Iztok Fister. "Hybrid Artificial Neural Network for Fire Analysis of Steel Frames". In Adaptation, Learning, and Optimization, 149–69. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14400-9_7.
Kursin, Andrei. "Neural Network: Input Anticipation May Lead to Advanced Adaptation Properties". In Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003, 779–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-44989-2_93.
Lee, Tsu-Chang. "Application Example: An Adaptive Neural Network Source Coder". In Structure Level Adaptation for Artificial Neural Networks, 135–53. Boston, MA: Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4_5.
Vidyasagar, M. "An Overview of Computational Learning Theory and Its Applications to Neural Network Training". In Identification, Adaptation, Learning, 400–422. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_10.
Yang, Yongxin, and Timothy M. Hospedales. "Unifying Multi-domain Multitask Learning: Tensor and Neural Network Perspectives". In Domain Adaptation in Computer Vision Applications, 291–309. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-58347-1_16.
Zajíc, Zbyněk, Jan Zelinka, Jan Vaněk, and Luděk Müller. "Convolutional Neural Network for Refinement of Speaker Adaptation Transformation". In Speech and Computer, 161–68. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11581-8_20.
Bureš, Tomáš, Petr Hnětynka, Martin Kruliš, František Plášil, Danylo Khalyeyev, Sebastian Hahner, Stephan Seifermann, Maximilian Walter, and Robert Heinrich. "Attuning Adaptation Rules via a Rule-Specific Neural Network". In Leveraging Applications of Formal Methods, Verification and Validation. Adaptation and Learning, 215–30. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-19759-8_14.
Conference papers on the topic "Neural network adaptation"
Li, Jinyu, Jui-Ting Huang, and Yifan Gong. "Factorized adaptation for deep neural network". In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6854662.
Der volle Inhalt der QuelleJae Hoon Jeong und Soo-Young Lee. „Speaker adaptation based on judge network with small adaptation words“. In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium. IEEE, 2000. http://dx.doi.org/10.1109/ijcnn.2000.859377.
Steffens Henrique, Alisson, Vinicius Almeida dos Santos, and Rodrigo Lyra. "NEAT Snake: a both evolutionary and neural network adaptation approach". In Computer on the Beach. Itajaí: Universidade do Vale do Itajaí, 2020. http://dx.doi.org/10.14210/cotb.v11n1.p052-053.
Wu, Chunwei, Guitao Cao, Wenming Cao, Hong Wang, and He Ren. "Debiased Prototype Network for Adversarial Domain Adaptation". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533346.
Vesely, Karel, Shinji Watanabe, Katerina Zmolikova, Martin Karafiat, Lukas Burget, and Jan Honza Cernocky. "Sequence summarizing neural network for speaker adaptation". In 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016. http://dx.doi.org/10.1109/icassp.2016.7472692.
Patre, Parag M., Shubhendu Bhasin, Zachary D. Wilcox, and Warren E. Dixon. "Composite adaptation for neural network-based controllers". In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400453.
Ma, Min, Michael Nirschl, Fadi Biadsy, and Shankar Kumar. "Approaches for Neural-Network Language Model Adaptation". In Interspeech 2017. ISCA: ISCA, 2017. http://dx.doi.org/10.21437/interspeech.2017-1310.
Kimoto, T., Y. Yaginuma, S. Nagata, and K. Asakawa. "Inverse modeling of dynamical system-network architecture with identification network and adaptation network". In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170460.
Szekely, Geza, and Thomas Lindblad. "Parameter adaptation in a simplified pulse-coupled neural network". In Ninth Workshop on Virtual Intelligence/Dynamic Neural Networks: Neural Networks Fuzzy Systems, Evolutionary Systems and Virtual Re, edited by Thomas Lindblad, Mary Lou Padgett, and Jason M. Kinser. SPIE, 1999. http://dx.doi.org/10.1117/12.343046.
Muniz, L. F., C. N. Lintzmayer, C. Jutten, and D. G. Fantinato. "Neuroevolutive Strategies for Topology and Weights Adaptation of Artificial Neural Networks". In Symposium on Knowledge Discovery, Mining and Learning. Sociedade Brasileira de Computação - SBC, 2022. http://dx.doi.org/10.5753/kdmile.2022.227807.
Der volle Inhalt der QuelleBerichte der Organisationen zum Thema "Neural network adaptation"
Miles, Gaines E., Yael Edan, F. Tom Turpin, Avshalom Grinstein, Thomas N. Jordan, Amots Hetzroni, Stephen C. Weller, Marvin M. Schreiber, and Okan K. Ersoy. Expert Sensor for Site Specification Application of Agricultural Chemicals. United States Department of Agriculture, August 1995. http://dx.doi.org/10.32747/1995.7570567.bard.
Kosko, Bart. Stability and Adaptation of Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, November 1990. http://dx.doi.org/10.21236/ada230108.
Yatsymirska, Mariya. Key Impressions of 2020 in Journalistic Texts. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.
Seginer, Ido, Louis D. Albright, and Robert W. Langhans. On-line Fault Detection and Diagnosis for Greenhouse Environmental Control. United States Department of Agriculture, February 2001. http://dx.doi.org/10.32747/2001.7575271.bard.