A selection of scholarly literature on the topic "Combination of neural networks"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of relevant articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Combination of neural networks".
Next to every work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a publication as a .pdf file and read its abstract online, whenever these are available in the work's metadata.
Journal articles on the topic "Combination of neural networks"
Sun, Zengguo, Guodong Zhao, Rafał Scherer, Wei Wei, and Marcin Woźniak. "Overview of Capsule Neural Networks." 網際網路技術學刊 23, no. 1 (January 2022): 033–44. http://dx.doi.org/10.53106/160792642022012301004.
Aladag, Cagdas Hakan, Erol Egrioglu, and Ufuk Yolcu. "Forecast Combination by Using Artificial Neural Networks." Neural Processing Letters 32, no. 3 (October 30, 2010): 269–76. http://dx.doi.org/10.1007/s11063-010-9156-7.
Wang, Yunhong, Songde Ma, T. N. Tan, and Guosui Liu. "Combination of multiple classifiers with neural networks." IFAC Proceedings Volumes 32, no. 2 (July 1999): 5332–37. http://dx.doi.org/10.1016/s1474-6670(17)56908-6.
Meng, Ya Feng, Sai Zhu, and Rong Li Han. "A Fault Diagnosis Method Based on Combination of Neural Network and Fault Dictionary." Advanced Materials Research 765-767 (September 2013): 2078–81. http://dx.doi.org/10.4028/www.scientific.net/amr.765-767.2078.
Iosifidis, Alexandros, Anastasios Tefas, and Ioannis Pitas. "Human Action Recognition Based on Multi-View Regularized Extreme Learning Machine." International Journal on Artificial Intelligence Tools 24, no. 05 (October 2015): 1540020. http://dx.doi.org/10.1142/s0218213015400205.
Smith, Lauren C., and Adam Kimbrough. "Leveraging Neural Networks in Preclinical Alcohol Research." Brain Sciences 10, no. 9 (August 21, 2020): 578. http://dx.doi.org/10.3390/brainsci10090578.
Cherepanova, V. O., and I. V. Sylka. "Optimizing the Intellectual Property Management in Accordance with a Process-Functional Approach." Business Inform 9, no. 524 (2021): 41–51. http://dx.doi.org/10.32983/2222-4459-2021-9-41-51.
Foon, See Lee, Nazira Anisa Rahim, Ahmad Zainal, and Zhang Jie. "Selective combination in multiple neural networks prediction using independent component regression approach." Chemical Engineering Research Bulletin 19 (September 10, 2017): 12. http://dx.doi.org/10.3329/cerb.v19i0.33772.
Lézoray, Olivier, and Hubert Cardot. "Comparing Combination Rules of Pairwise Neural Networks Classifiers." Neural Processing Letters 27, no. 1 (November 4, 2007): 43–56. http://dx.doi.org/10.1007/s11063-007-9058-5.
Li, Yi Bing, and Fei Pan. "Study on the Combination of SOM and K-Means Algorithms in Manufacturing Process Quality Control." Applied Mechanics and Materials 427-429 (September 2013): 1315–18. http://dx.doi.org/10.4028/www.scientific.net/amm.427-429.1315.
Повний текст джерелаДисертації з теми "Combination of neural networks"
Morabito, David L. "Statistical mechanics of neural networks and combinatorial optimization problems." Online version of thesis, 1991. http://hdl.handle.net/1850/11089.
Korn, Stefan. "The combination of AI modelling techniques for the simulation of manufacturing processes." Thesis, Glasgow Caledonian University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.263139.
Amanzadi, Amirhossein. "Predicting safe drug combinations with Graph Neural Networks (GNN)." Thesis, Uppsala universitet, Institutionen för farmaceutisk biovetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446691.
Freitas, Paulo Sérgio Abreu. "The combination of neural estimates in prediction and decision problems." Doctoral thesis, Universidade de Lisboa: Faculdade de Ciências, 2008. http://hdl.handle.net/10400.13/98.
Повний текст джерелаOrientador: António José Lopes Rodrigues
Yang, Shuang. "Multistage neural network ensemble : adaptive combination of ensemble results." Thesis, London Metropolitan University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.425920.
Torres Sospedra, Joaquín. "Ensembles of Artificial Neural Networks: Analysis and Development of Design Methods." Doctoral thesis, Universitat Jaume I, 2011. http://hdl.handle.net/10803/48638.
This thesis is focused on the analysis and development of ensembles of neural networks. An ensemble is a system in which a set of heterogeneous artificial neural networks is generated and combined in order to outperform single-network classifiers. This thesis differs from other work on ensembles of neural networks [1, 2, 3, 4, 5, 6, 7] in how it is organized, as follows.
In this thesis, firstly, a comparison of ensemble methods is introduced in order to provide a ranked list of the best ensemble methods in the literature. This comparison is split into two studies, which correspond to two chapters of the thesis.
Moreover, another important step in working with ensembles of neural networks is how to combine the information provided by the networks in the ensemble. The literature offers several alternatives for accurately combining the outputs of a heterogeneous set of networks, so a comparison of combiners is also introduced in this thesis.
Furthermore, an ensemble of neural networks is only one kind of Multiple Classifier System (MCS) based on neural networks. There are other ways to build neural-network-based MCSs that are quite different from ensembles, the most important being Stacked Generalization and Mixture of Experts. These two systems are also analysed in this thesis, and new alternatives are proposed.
One result of this comparative research is a deeper understanding of the field of ensembles, which makes it possible to design new ensemble methods and combiners. Concretely, two new ensemble methods, a new ensemble methodology called Cross-Validated Boosting, and two reordering algorithms are proposed in this thesis. The best overall results are obtained by the proposed ensemble methods.
Finally, all the experiments were carried out on a common experimental setup: each experiment was repeated ten times on nineteen different datasets from the UCI repository in order to validate the results, and the procedure used to set specific parameters is very similar across all the experiments.
To conclude, the main contributions are:
1) An experimental setup that can be reused for further comparisons. 2) A guide for selecting the most appropriate methods to build and combine ensembles and multiple classifier systems. 3) New methods for building ensembles and other multiple classifier systems.
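For readers who want a concrete picture of the output-combination step discussed in this abstract, the following minimal sketch is not taken from the thesis: plain averaging and a weighted soft vote are simply two generic combiners of the kind such comparisons cover. It merges the class-probability outputs of several trained member networks:

```python
import numpy as np

def combine_average(member_probs):
    """Unweighted average of per-member class-probability matrices.

    member_probs: list of arrays, each of shape (n_samples, n_classes).
    Returns the ensemble's predicted class index for each sample.
    """
    mean_probs = np.mean(np.stack(member_probs), axis=0)
    return mean_probs.argmax(axis=1)

def combine_weighted_vote(member_probs, weights):
    """Weighted soft vote: members judged more reliable get larger weights."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()                    # normalize so the weights sum to 1
    stacked = np.stack(member_probs)            # (n_members, n_samples, n_classes)
    combined = np.tensordot(weights, stacked, axes=1)
    return combined.argmax(axis=1)

# Toy usage: three "networks" scoring two samples over three classes.
p1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.6, 0.3]])
p2 = np.array([[0.5, 0.4, 0.1], [0.2, 0.2, 0.6]])
p3 = np.array([[0.1, 0.8, 0.1], [0.3, 0.3, 0.4]])
print(combine_average([p1, p2, p3]))                    # [1 2]
print(combine_weighted_vote([p1, p2, p3], [5, 3, 2]))   # [0 1]
```

In practice the weights would come from each member's validation accuracy, which is exactly the kind of design choice a combiner comparison evaluates.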
Henry, Timothy G. "Generalization of deep neural networks to unseen attribute combinations." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/129905.
Cataloged from student-submitted PDF of thesis.
Includes bibliographical references (pages 71-73).
Visual understanding results from a combined understanding of primitive visual attributes such as color, texture, and shape. This allows humans and other primates to generalize their understanding of objects to new combinations of attributes. For instance, one can understand that a pink elephant is an elephant even if they have never seen this particular combination of color and shape before. However, is it the case that deep neural networks (DNNs) are able to generalize to such novel combinations in object recognition or other related vision tasks? This thesis demonstrates that (1) the ability of DNNs to generalize to unseen attribute combinations increases with the increased diversity of combinations seen in training as a percentage of the total combination space, (2) this effect is largely independent of the specifics of the DNN architecture used, (3) while single-task and multi-task formulations of supervised attribute classification problems may lead to similar performance on seen combinations, single-task formulations have a superior ability to generalize to unseen combinations, and (4) DNNs demonstrating the ability to generalize well in this setting learn to do so by leveraging emergent hidden units that exhibit properties of attribute selectivity and invariance.
by Timothy G. Henry. M.Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
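As a concrete illustration of the seen/unseen split that this abstract studies, the sketch below partitions a combination space so that only a fixed fraction of attribute combinations ever appears during training. The attribute values and the 60% diversity level are hypothetical and not taken from the thesis:

```python
import itertools
import random

# Hypothetical attribute values; the thesis's actual attributes and splits differ.
colors = ["red", "green", "blue", "pink", "gray"]
shapes = ["cube", "sphere", "elephant", "car"]

all_combos = list(itertools.product(colors, shapes))   # the full combination space
random.seed(0)
random.shuffle(all_combos)

diversity = 0.6                                        # fraction of combinations seen in training
n_seen = int(diversity * len(all_combos))
seen, unseen = all_combos[:n_seen], all_combos[n_seen:]

# Training images would be drawn only from `seen`; generalization is then measured
# on `unseen` combinations, e.g. a "pink elephant" never encountered during training.
print(f"{len(seen)} seen combinations, {len(unseen)} held-out combinations")
```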
Zhao, Yi. "Combination of Wireless sensor network and artificial neuronal network: a new approach of modeling." Thesis, Toulon, 2013. http://www.theses.fr/2013TOUL0013/document.
A Wireless Sensor Network (WSN) consisting of autonomous sensor nodes can provide a rich stream of sensor data representing physical measurements, while a well-built Artificial Neural Network (ANN) model needs sufficient training data. Facing the limitations of traditional parametric modeling, this thesis proposes a standard procedure for combining WSN sensor data and ANNs in modeling. Experiments on indoor thermal modeling demonstrated that a WSN together with an ANN can lead to accurate, fine-grained indoor thermal models. A new training method, "Multi-Pattern Cross Training" (MPCT), is also introduced; it makes it possible to merge knowledge from different independent training data sources (patterns) into a single ANN model. Further experiments demonstrated that models trained with MPCT showed better generalization performance and lower prediction errors in tests on different data sets. The MPCT-based neural network model also showed advantages in multi-variable Neural Network based Model Predictive Control (NNMPC): software simulation and application results indicate that NNMPC implemented with MPCT outperformed NNMPC based on multiple models in online control efficiency.
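The abstract does not spell out the Multi-Pattern Cross Training procedure, so the following sketch is only one plausible reading of the idea, not the author's algorithm: interleave small updates from several independent data sources (patterns) so that a single network absorbs all of them, here using scikit-learn's MLPRegressor as a stand-in for the thesis's ANN models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Two hypothetical, independent training sources ("patterns"), e.g. sensor logs
# from two rooms of a WSN, each mapping sensor readings to a temperature target.
def make_source(n_samples, offset):
    X = rng.uniform(0.0, 1.0, size=(n_samples, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + offset + rng.normal(0.0, 0.05, size=n_samples)
    return X, y

sources = [make_source(200, 0.0), make_source(200, 0.3)]

model = MLPRegressor(hidden_layer_sizes=(16,), solver="adam", random_state=0)

# Interleave mini-batch updates across the sources so one model merges knowledge
# from all of them, instead of fitting a separate model per source.
for _ in range(200):
    for X, y in sources:
        batch = rng.choice(len(X), size=32, replace=False)
        model.partial_fit(X[batch], y[batch])

for i, (X, y) in enumerate(sources):
    print(f"source {i}: R^2 = {model.score(X, y):.3f}")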
Alani, Shayma. "Design of intelligent ensembled classifiers combination methods." Thesis, Brunel University, 2015. http://bura.brunel.ac.uk/handle/2438/12793.
Huhtinen, J. (Jouni). "Utilization of neural network and agent technology combination for distributed intelligent applications and services." Doctoral thesis, University of Oulu, 2005. http://urn.fi/urn:isbn:9514278550.
Повний текст джерелаКниги з теми "Combination of neural networks"
IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks (1st 2000 San Antonio, Tex.). 2000 IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks: Proceedings of the First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks : May 11-13, 2000, the Gunter Hotel, San Antonio, TX, USA. Piscataway, N.J: IEEE, 2000.
COGANN-92 (1992 Baltimore, Md.). COGANN-92, International Workshop on Combinations of Genetic Algorithms and Neural Networks, June 6, 1992, Baltimore, Maryland. Edited by L. Darrell Whitley, J. David Schaffer, IEEE Neural Networks Council, and International Society for Genetic Algorithms. Los Alamitos, Calif: IEEE Computer Society Press, 1992.
Valentin, Dominique, and Betty Edelman, eds. Neural networks. Thousand Oaks, Calif: Sage Publications, 1999.
Rojas, Raúl. Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4.
Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4.
Almeida, Luis B., and Christian J. Wellekens, eds. Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7.
Davalo, Eric, and Patrick Naïm. Neural Networks. London: Macmillan Education UK, 1991. http://dx.doi.org/10.1007/978-1-349-12312-4.
Müller, Berndt, and Joachim Reinhardt. Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3.
Neural networks. New York: Palgrave, 2000.
Abdi, Hervé, Dominique Valentin, and Betty Edelman. Neural Networks. Thousand Oaks, Calif: SAGE Publications, Inc., 1999. http://dx.doi.org/10.4135/9781412985277.
Повний текст джерелаЧастини книг з теми "Combination of neural networks"
Bellot, Pau, and Patrick E. Meyer. "Efficient Combination of Pairwise Feature Networks." In Neural Connectomics Challenge, 85–93. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-53070-3_7.
Biggio, Battista, Giorgio Fumera, and Fabio Roli. "Bayesian Linear Combination of Neural Networks." In Innovations in Neural Information Paradigms and Applications, 201–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04003-0_9.
Englund, Cristofer, and Antanas Verikas. "A SOM Based Model Combination Strategy." In Advances in Neural Networks — ISNN 2005, 461–66. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_73.
Kobos, Mateusz, and Jacek Mańdziuk. "Classification Based on Combination of Kernel Density Estimators." In Artificial Neural Networks – ICANN 2009, 125–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04277-5_13.
Wersing, Heiko, and Edgar Körner. "Unsupervised Learning of Combination Features for Hierarchical Recognition Models." In Artificial Neural Networks — ICANN 2002, 1225–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_198.
Martín-Merino, Manuel. "Learning a Combination of Heterogeneous Dissimilarities from Incomplete Knowledge." In Artificial Neural Networks – ICANN 2010, 62–71. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15825-4_7.
Straszecka, Ewa, and Joanna Straszecka. "Membership Functions as Combination of Expert’s Knowledge with Population Information." In Neural Networks and Soft Computing, 322–27. Heidelberg: Physica-Verlag HD, 2003. http://dx.doi.org/10.1007/978-3-7908-1902-1_47.
Benmokhtar, Rachid, and Benoit Huet. "Classifier Fusion: Combination Methods For Semantic Indexing in Video Content." In Artificial Neural Networks – ICANN 2006, 65–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11840930_7.
Krawczyk, Bartosz, and Michał Woźniak. "Untrained Method for Ensemble Pruning and Weighted Combination." In Advances in Neural Networks – ISNN 2014, 358–65. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-12436-0_40.
Mohammed, Hussein Syed, James Leander, Matthew Marbach, and Robi Polikar. "Can AdaBoost.M1 Learn Incrementally? A Comparison to Learn++ Under Different Combination Rules." In Artificial Neural Networks – ICANN 2006, 254–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11840817_27.
Повний текст джерелаТези доповідей конференцій з теми "Combination of neural networks"
Freitas, Cinthia O. A., Joao M. Carvalho, Jose J. Oliveira, Simone B. K. Aires, and Robert Sabourin. "Distance-based Disagreement Classifiers Combination." In 2007 International Joint Conference on Neural Networks. IEEE, 2007. http://dx.doi.org/10.1109/ijcnn.2007.4371390.
Kassem, Ayman H., and Ihab G. Adam. "Optimizing Neural Networks for Leak Monitoring in Pipelines." In ASME/JSME 2004 Pressure Vessels and Piping Conference. ASMEDC, 2004. http://dx.doi.org/10.1115/pvp2004-3005.
Krawczyk, Bartosz, and Michal Wozniak. "New untrained aggregation methods for classifier combination." In 2014 International Joint Conference on Neural Networks (IJCNN). IEEE, 2014. http://dx.doi.org/10.1109/ijcnn.2014.6889810.
Breker, Sebastian, and Bernhard Sick. "Combination of uncertain ordinal expert statements: The combination rule EIDMR and its application to low-voltage grid classification with SVM." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727467.
Prudencio, Ricardo C., and Teresa Ludermir. "Learning Weights for Linear Combination of Forecasting Methods." In 2006 Ninth Brazilian Symposium on Neural Networks (SBRN'06). IEEE, 2006. http://dx.doi.org/10.1109/sbrn.2006.25.
Lahroodi, Mahmood, and A. A. Mozafari. "Combination of Neural Networks and State Vector Feedback Adaptive Control (SVFAC) Technique to Control the Gas Turbine Combustor." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-15764.
Wang, Xiuying, Changliang Li, Zhijun Zheng, and Bo Xu. "Paraphrase Recognition via Combination of Neural Classifier and Keywords." In 2018 International Joint Conference on Neural Networks (IJCNN). IEEE, 2018. http://dx.doi.org/10.1109/ijcnn.2018.8489222.
Gao, Yazhi, Wenge Rong, Yikang Shen, and Zhang Xiong. "Convolutional Neural Network based sentiment analysis using Adaboost combination." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727352.
Mayhua-Lopez, Efrain, Vanessa Gomez-Verdejo, and Anibal R. Figueiras-Vidal. "Improving boosting performance with a local combination of learners." In 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010. http://dx.doi.org/10.1109/ijcnn.2010.5596317.
Hu, Junlin, and Ping Guo. "Learning multiple pooling combination for image classification." In 2012 International Joint Conference on Neural Networks (IJCNN 2012 - Brisbane). IEEE, 2012. http://dx.doi.org/10.1109/ijcnn.2012.6252840.
Повний текст джерелаЗвіти організацій з теми "Combination of neural networks"
Engel, Bernard, Yael Edan, James Simon, Hanoch Pasternak, and Shimon Edelman. Neural Networks for Quality Sorting of Agricultural Produce. United States Department of Agriculture, July 1996. http://dx.doi.org/10.32747/1996.7613033.bard.
Kirichek, Galina, Vladyslav Harkusha, Artur Timenko, and Nataliia Kulykovska. System for detecting network anomalies using a hybrid of an uncontrolled and controlled neural network. [N.p.], February 2020. http://dx.doi.org/10.31812/123456789/3743.
Renz, Manuel. B-jet and c-jet identification with Neural Networks as well as combination of multivariate analyses for the search for single top-quark production. Office of Scientific and Technical Information (OSTI), June 2008. http://dx.doi.org/10.2172/957074.
Johnson, John L., and C. C. Sung. Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, January 1990. http://dx.doi.org/10.21236/ada222110.
Smith, Patrick I. Neural Networks. Office of Scientific and Technical Information (OSTI), September 2003. http://dx.doi.org/10.2172/815740.
Holder, Nanette S. Introduction to Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, March 1992. http://dx.doi.org/10.21236/ada248258.
Wiggins, Vince L., Larry T. Looper, and Sheree K. Engquist. Neural Networks: A Primer. Fort Belvoir, VA: Defense Technical Information Center, May 1991. http://dx.doi.org/10.21236/ada235920.
Abu-Mostafa, Yaser S., and Amir F. Atiya. Theory of Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, July 1991. http://dx.doi.org/10.21236/ada253187.
Alltop, W. O. Piecewise Linear Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, August 1992. http://dx.doi.org/10.21236/ada265031.
Yu, Haichao, Haoxiang Li, Honghui Shi, Thomas S. Huang, and Gang Hua. Any-Precision Deep Neural Networks. Web of Open Science, December 2020. http://dx.doi.org/10.37686/ejai.v1i1.82.