Selected scientific literature on the topic "Neural network adaptation"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Contents
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Neural network adaptation".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read its abstract online, when available in the metadata.
Journal articles on the topic "Neural network adaptation"
Hylton, Todd. "Thermodynamic Neural Network". Entropy 22, no. 3 (February 25, 2020): 256. http://dx.doi.org/10.3390/e22030256.
Vreeswijk, C. van, and D. Hansel. "Patterns of Synchrony in Neural Networks with Spike Adaptation". Neural Computation 13, no. 5 (May 1, 2001): 959–92. http://dx.doi.org/10.1162/08997660151134280.
Xie, Xurong, Xunying Liu, Tan Lee, and Lan Wang. "Bayesian Learning for Deep Neural Network Adaptation". IEEE/ACM Transactions on Audio, Speech, and Language Processing 29 (2021): 2096–110. http://dx.doi.org/10.1109/taslp.2021.3084072.
Patre, P. M., S. Bhasin, Z. D. Wilcox, and W. E. Dixon. "Composite Adaptation for Neural Network-Based Controllers". IEEE Transactions on Automatic Control 55, no. 4 (April 2010): 944–50. http://dx.doi.org/10.1109/tac.2010.2041682.
Yu, D. L., and T. K. Chang. "Adaptation of diagonal recurrent neural network model". Neural Computing and Applications 14, no. 3 (March 23, 2005): 189–97. http://dx.doi.org/10.1007/s00521-004-0453-9.
Joty, Shafiq, Nadir Durrani, Hassan Sajjad, and Ahmed Abdelali. "Domain adaptation using neural network joint model". Computer Speech & Language 45 (September 2017): 161–79. http://dx.doi.org/10.1016/j.csl.2016.12.006.
Denker, John S. "Neural network models of learning and adaptation". Physica D: Nonlinear Phenomena 22, no. 1-3 (October 1986): 216–32. http://dx.doi.org/10.1016/0167-2789(86)90242-3.
Yaeger, Larry S. "Identifying Neural Network Topologies That Foster Dynamical Complexity". Advances in Complex Systems 16, no. 02n03 (May 2013): 1350032. http://dx.doi.org/10.1142/s021952591350032x.
Ziemke, Tom. "Radar Image Segmentation Using Self-Adapting Recurrent Networks". International Journal of Neural Systems 08, no. 01 (February 1997): 47–54. http://dx.doi.org/10.1142/s0129065797000070.
Li, Xiaofeng, Suying Xiang, Pengfei Zhu, and Min Wu. "Establishing a Dynamic Self-Adaptation Learning Algorithm of the BP Neural Network and Its Applications". International Journal of Bifurcation and Chaos 25, no. 14 (December 30, 2015): 1540030. http://dx.doi.org/10.1142/s0218127415400301.
Theses / dissertations on the topic "Neural network adaptation"
Donati, Lorenzo. "Domain Adaptation through Deep Neural Networks for Health Informatics". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14888/.
Haskey, Stephen. "A modified One-Class-One-Network ANN architecture for dynamic phoneme adaptation". Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/12099.
Wen, Tsung-Hsien. "Recurrent neural network language generation for dialogue systems". Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/275648.
Gangireddy, Siva Reddy. "Recurrent neural network language models for automatic speech recognition". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28990.
Tomashenko, Natalia. "Speaker adaptation of deep neural network acoustic models using Gaussian mixture model framework in automatic speech recognition systems". Thesis, Le Mans, 2017. http://www.theses.fr/2017LEMA1040/document.
Texto completo da fonteDifferences between training and testing conditions may significantly degrade recognition accuracy in automatic speech recognition (ASR) systems. Adaptation is an efficient way to reduce the mismatch between models and data from a particular speaker or channel. There are two dominant types of acoustic models (AMs) used in ASR: Gaussian mixture models (GMMs) and deep neural networks (DNNs). The GMM hidden Markov model (GMM-HMM) approach has been one of the most common technique in ASR systems for many decades. Speaker adaptation is very effective for these AMs and various adaptation techniques have been developed for them. On the other hand, DNN-HMM AMs have recently achieved big advances and outperformed GMM-HMM models for various ASR tasks. However, speaker adaptation is still very challenging for these AMs. Many adaptation algorithms that work well for GMMs systems cannot be easily applied to DNNs because of the different nature of these models. The main purpose of this thesis is to develop a method for efficient transfer of adaptation algorithms from the GMM framework to DNN models. A novel approach for speaker adaptation of DNN AMs is proposed and investigated. The idea of this approach is based on using so-called GMM-derived features as input to a DNN. The proposed technique provides a general framework for transferring adaptation algorithms, developed for GMMs, to DNN adaptation. It is explored for various state-of-the-art ASR systems and is shown to be effective in comparison with other speaker adaptation techniques and complementary to them
Buttar, Sarpreet Singh. "Applying Artificial Neural Networks to Reduce the Adaptation Space in Self-Adaptive Systems : an exploratory work". Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-87117.
Palapelas Kantola, Philip. "Extreme Quantile Estimation of Downlink Radio Channel Quality". Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177657.
Fic, Miloslav. "Adaptace parametrů ve fuzzy systémech". Master's thesis, Vysoké učení technické v Brně, Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221163.
Texto completo da fonteVu, Hien Duc. "Adaptation des méthodes d'apprentissage automatique pour la détection de défauts d'arc électriques". Electronic Thesis or Diss., Université de Lorraine, 2019. http://docnum.univ-lorraine.fr/ulprive/DDOC_T_2019_0152_VU.pdf.
Texto completo da fonteThe detection of electric arcs occurring in an electrical network by machine learning approaches represents the heart of the work presented in this thesis. The problem was first considered as a classification of fixed-size time series with two classes: normal and default. This first part is based on the work of the literature where the detection algorithms are organized mainly on a step of the transformation of the signals acquired on the network, followed by a step of extraction of descriptive characteristics and finally a step of decision. The multi-criteria approach adopted here aims to respond to systematic classification errors. A methodology for selecting the best combinations, transformation, and descriptors has been proposed by using learning solutions. As the development of relevant descriptors is always difficult, differents solutions offered by deep learning has also been studied. In a second phase, the study focused on the variable aspects in time of the fault detection. Two statistical decision paths have been explored, one based on the sequential probabilistic test (SPRT) and the other based on artificial neural networks LSTM (Long Short Time Memory Network). Each of these two methods exploits in its way the duration a first classification step between 0 and 1 (normal, default). The decision by SPRT uses an integration of the initial classification. LSTM learns to classify data with variable time. The results of the LSTM network are very promising, but there are a few things to explore. All of this work is based on experiments with the most complete and broadest possible data on the field of 230V alternative networks in a domestic and industrial context. The accuracy obtained is close to 100% in the majority of situations
Ainapure, Abhijeet Narhar. "Application and Performance Enhancement of Intelligent Cross-Domain Fault Diagnosis in Rotating Machinery". University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1623164772153736.
Books on the topic "Neural network adaptation"
Lee, Tsu-Chang. Structure level adaptation for artificial neural networks. Boston: Kluwer Academic Publishers, 1991.
Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Boston, MA: Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4.
Stonier, Russel J., and Xing Huo Yu, eds. Complex systems: Mechanism of adaptation. Amsterdam: IOS Press, 1994.
Neuronal adaptation theory: Including 29 exercises with solutions, 43 essential ideas, and 108 partially coloured figures, experiment explanations, and general theorems. Frankfurt am Main: Peter Lang, 1996.
Haykin, Simon S., ed. Kalman filtering and neural networks. New York: Wiley, 2001.
Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001, Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont.: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.
Pucci, Marcello, and Gianpaolo Vitale, eds. Power converters and AC electrical drives with linear neural networks. Boca Raton: CRC Press, 2012.
Channel-Mismatch Compensation in Speaker Identification: Feature Selection and Adaptation with Artificial Neural Networks. Storming Media, 1998.
Book chapters on the topic "Neural network adaptation"
Ljung, L., J. Sjöberg, and H. Hjalmarsson. "On Neural Network Model Structures in System Identification". In Identification, Adaptation, Learning, 366–99. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_9.
Cai, ManJun, JinCun Liu, GuangJun Tian, XueJian Zhang, and TiHua Wu. "Hybrid Neural Network Controller Using Adaptation Algorithm". In Advances in Neural Networks – ISNN 2007, 148–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-72383-7_19.
Patil, Dipali Himmatrao, and Amit Gadekar. "Tuberculosis Detection Using a Deep Neural Network". In Proceedings in Adaptation, Learning and Optimization, 600–608. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-31164-2_51.
Hozjan, Tomaž, Goran Turk, and Iztok Fister. "Hybrid Artificial Neural Network for Fire Analysis of Steel Frames". In Adaptation, Learning, and Optimization, 149–69. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14400-9_7.
Kursin, Andrei. "Neural Network: Input Anticipation May Lead to Advanced Adaptation Properties". In Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003, 779–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-44989-2_93.
Lee, Tsu-Chang. "Application Example: An Adaptive Neural Network Source Coder". In Structure Level Adaptation for Artificial Neural Networks, 135–53. Boston, MA: Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4_5.
Vidyasagar, M. "An Overview of Computational Learning Theory and Its Applications to Neural Network Training". In Identification, Adaptation, Learning, 400–422. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_10.
Yang, Yongxin, and Timothy M. Hospedales. "Unifying Multi-domain Multitask Learning: Tensor and Neural Network Perspectives". In Domain Adaptation in Computer Vision Applications, 291–309. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-58347-1_16.
Zajíc, Zbyněk, Jan Zelinka, Jan Vaněk, and Luděk Müller. "Convolutional Neural Network for Refinement of Speaker Adaptation Transformation". In Speech and Computer, 161–68. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11581-8_20.
Bureš, Tomáš, Petr Hnětynka, Martin Kruliš, František Plášil, Danylo Khalyeyev, Sebastian Hahner, Stephan Seifermann, Maximilian Walter, and Robert Heinrich. "Attuning Adaptation Rules via a Rule-Specific Neural Network". In Leveraging Applications of Formal Methods, Verification and Validation. Adaptation and Learning, 215–30. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-19759-8_14.
Conference papers on the topic "Neural network adaptation"
Li, Jinyu, Jui-Ting Huang, and Yifan Gong. "Factorized adaptation for deep neural network". In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6854662.
Jeong, Jae Hoon, and Soo-Young Lee. "Speaker adaptation based on judge network with small adaptation words". In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium. IEEE, 2000. http://dx.doi.org/10.1109/ijcnn.2000.859377.
Steffens Henrique, Alisson, Vinicius Almeida dos Santos, and Rodrigo Lyra. "NEAT Snake: a both evolutionary and neural network adaptation approach". In Computer on the Beach. Itajaí: Universidade do Vale do Itajaí, 2020. http://dx.doi.org/10.14210/cotb.v11n1.p052-053.
Wu, Chunwei, Guitao Cao, Wenming Cao, Hong Wang, and He Ren. "Debiased Prototype Network for Adversarial Domain Adaptation". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533346.
Vesely, Karel, Shinji Watanabe, Katerina Zmolikova, Martin Karafiat, Lukas Burget, and Jan Honza Cernocky. "Sequence summarizing neural network for speaker adaptation". In 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016. http://dx.doi.org/10.1109/icassp.2016.7472692.
Patre, Parag M., Shubhendu Bhasin, Zachary D. Wilcox, and Warren E. Dixon. "Composite adaptation for neural network-based controllers". In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400453.
Ma, Min, Michael Nirschl, Fadi Biadsy, and Shankar Kumar. "Approaches for Neural-Network Language Model Adaptation". In Interspeech 2017. ISCA, 2017. http://dx.doi.org/10.21437/interspeech.2017-1310.
Kimoto, T., Y. Yaginuma, S. Nagata, and K. Asakawa. "Inverse modeling of dynamical system-network architecture with identification network and adaptation network". In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170460.
Szekely, Geza, and Thomas Lindblad. "Parameter adaptation in a simplified pulse-coupled neural network". In Ninth Workshop on Virtual Intelligence/Dynamic Neural Networks: Neural Networks Fuzzy Systems, Evolutionary Systems and Virtual Re, edited by Thomas Lindblad, Mary Lou Padgett, and Jason M. Kinser. SPIE, 1999. http://dx.doi.org/10.1117/12.343046.
Muniz, L. F., C. N. Lintzmayer, C. Jutten, and D. G. Fantinato. "Neuroevolutive Strategies for Topology and Weights Adaptation of Artificial Neural Networks". In Symposium on Knowledge Discovery, Mining and Learning. Sociedade Brasileira de Computação - SBC, 2022. http://dx.doi.org/10.5753/kdmile.2022.227807.
Texto completo da fonteRelatórios de organizações sobre o assunto "Neural network adaptation"
Miles, Gaines E., Yael Edan, F. Tom Turpin, Avshalom Grinstein, Thomas N. Jordan, Amots Hetzroni, Stephen C. Weller, Marvin M. Schreiber, and Okan K. Ersoy. Expert Sensor for Site Specification Application of Agricultural Chemicals. United States Department of Agriculture, August 1995. http://dx.doi.org/10.32747/1995.7570567.bard.
Kosko, Bart. Stability and Adaptation of Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, November 1990. http://dx.doi.org/10.21236/ada230108.
Yatsymirska, Mariya. Key Impressions of 2020 in Journalistic Texts. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.
Seginer, Ido, Louis D. Albright, and Robert W. Langhans. On-line Fault Detection and Diagnosis for Greenhouse Environmental Control. United States Department of Agriculture, February 2001. http://dx.doi.org/10.32747/2001.7575271.bard.