Academic literature on the topic 'Heterogeneous neural networks'
Journal articles on the topic "Heterogeneous neural networks"
Zeng, Wei, Ge Fan, Shan Sun, Biao Geng, Weiyi Wang, Jiacheng Li, and Weibo Liu. "Collaborative filtering via heterogeneous neural networks." Applied Soft Computing 109 (September 2021): 107516. http://dx.doi.org/10.1016/j.asoc.2021.107516.
Drakopoulos, John A., and Ahmad Abdulkader. "Training neural networks with heterogeneous data." Neural Networks 18, no. 5-6 (July 2005): 595–601. http://dx.doi.org/10.1016/j.neunet.2005.06.011.
Turner, Andrew James, and Julian Francis Miller. "NeuroEvolution: Evolving Heterogeneous Artificial Neural Networks." Evolutionary Intelligence 7, no. 3 (November 2014): 135–54. http://dx.doi.org/10.1007/s12065-014-0115-5.
Zhang, Chen, Zhouhua Tang, Bin Yu, Yu Xie, and Ke Pan. "Deep heterogeneous network embedding based on Siamese Neural Networks." Neurocomputing 388 (May 2020): 1–11. http://dx.doi.org/10.1016/j.neucom.2020.01.012.
Sun, Yizhou, Jiawei Han, Xifeng Yan, Philip S. Yu, and Tianyi Wu. "Heterogeneous information networks." Proceedings of the VLDB Endowment 15, no. 12 (August 2022): 3807–11. http://dx.doi.org/10.14778/3554821.3554901.
Iddianozie, Chidubem, and Gavin McArdle. "Towards Robust Representations of Spatial Networks Using Graph Neural Networks." Applied Sciences 11, no. 15 (July 27, 2021): 6918. http://dx.doi.org/10.3390/app11156918.
Gracious, Tony, Shubham Gupta, Arun Kanthali, Rui M. Castro, and Ambedkar Dukkipati. "Neural Latent Space Model for Dynamic Networks and Temporal Knowledge Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 5 (May 18, 2021): 4054–62. http://dx.doi.org/10.1609/aaai.v35i5.16526.
Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks." Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (June 12, 2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.
Son, Ha Min, Moon Hyun Kim, and Tai-Myoung Chung. "Comparisons Where It Matters: Using Layer-Wise Regularization to Improve Federated Learning on Heterogeneous Data." Applied Sciences 12, no. 19 (October 3, 2022): 9943. http://dx.doi.org/10.3390/app12199943.
Hosny, Khalid M., Marwa M. Khashaba, Walid I. Khedr, and Fathy A. Amer. "An Efficient Neural Network-Based Prediction Scheme for Heterogeneous Networks." International Journal of Sociotechnology and Knowledge Development 12, no. 2 (April 2020): 63–76. http://dx.doi.org/10.4018/ijskd.2020040104.
Dissertations / Theses on the topic "Heterogeneous neural networks"
Belanche Muñoz, Lluís Antoni. "Heterogeneous neural networks: theory and applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6660.
Full textLa similitud proporciona una marc conceptual i serveix de cobertura unificadora de molts models neuronals de la literatura i d'exploració de noves instàncies de models de neurona.
La visió basada en similitud porta amb naturalitat a integrar informació heterogènia, com ara quantitats contínues i discretes (nominals i ordinals), i difuses ó imprecises. Els valors perduts es tracten de manera explícita.
Una neurona d'aquesta classe s'anomena neurona heterogènia i qualsevol arquitectura neuronal que en faci ús serà una Xarxa Neuronal Heterogènia.
En aquest treball ens concentrem en xarxes neuronals endavant, com focus inicial d'estudi. Els algorismes d'aprenentatge són basats en algorisms evolutius, especialment extesos per treballar amb informació heterogènia.
En aquesta tesi es descriu com una certa classe de neurones heterogènies porten a xarxes neuronals que mostren un rendiment molt satisfactori, comparable o superior al de xarxes neuronals tradicionals (com el perceptró multicapa ó la xarxa de base radial), molt especialment en presència d'informació heterogènia, usual en les bases de dades actuals.
This work presents a class of functions serving as generalized neuron models to be used in artificial neural networks. They are cast into the common framework of computing a similarity function, a flexible definition of a neuron as a pattern recognizer. The similarity endows the model with a clear conceptual view and serves as a unification cover for many of the existing neural models, including those classically used for the MultiLayer Perceptron (MLP) and most of those used in Radial Basis Function Networks (RBF). These families of models are conceptually unified and their relation is clarified.
The possibilities of deriving new instances are explored and several neuron models --representative of their families-- are proposed.
The similarity view naturally leads to further extensions of the models to handle heterogeneous information, that is, information coming from sources radically different in character, including continuous and discrete (ordinal) numerical quantities, nominal (categorical) quantities, and fuzzy quantities. Missing data are also explicitly considered. A neuron of this class is called a heterogeneous neuron, and any neural structure making use of them is a Heterogeneous Neural Network (HNN), regardless of the specific architecture or learning algorithm. Among them, in this work we concentrate on feed-forward networks as the initial focus of study. The learning procedures may include a great variety of techniques, basically divided into derivative-based methods (such as conjugate gradient) and evolutionary ones (such as variants of genetic algorithms).
In this Thesis we also explore a number of directions towards the construction of better neuron models --within an integrant envelope-- more adapted to the problems they are meant to solve.
It is described how a certain generic class of heterogeneous models leads to satisfactory performance, comparable and often superior to that of classical neural models, especially in the presence of heterogeneous information and imprecise or incomplete data, across a wide range of domains, most of them corresponding to real-world problems.
Cabana, Tanguy. "Large deviations for the dynamics of heterogeneous neural networks." Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066551/document.
This thesis addresses the rigorous derivation of mean-field results for the continuous-time dynamics of heterogeneous large neural networks. In our models, we consider firing-rate neurons subject to additive noise. The network is fully connected, with highly random connectivity weights whose variance scales as the inverse of the network size and thus retains a non-trivial role in the thermodynamic limit. Moreover, another heterogeneity is considered at the level of each neuron, interpreted as a spatial location. For biological relevance, one model includes delays, with the mean and variance of connections depending on the distance between cells. A second model considers interactions depending on the states of both neurons at play; this last case notably applies to Kuramoto's model of coupled oscillators. When the weights are independent Gaussian random variables, we show that the empirical measure of the neurons' states satisfies a large deviations principle, with a good rate function achieving its minimum at a unique probability measure, implying averaged convergence of the empirical measure and propagation of chaos. In certain cases, we also obtain quenched results. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a non-local Gaussian process whose statistics depend on the solution over the whole neural field. We further demonstrate the universality of this limit, in the sense that neuronal networks with non-Gaussian interconnections but sub-Gaussian tails converge towards it. Finally, we present a few numerical applications and discuss possible perspectives.
Zhao, Qiwei. "Federated Learning with Heterogeneous Challenge." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/27399.
Schliebs, Stefan. "Heterogeneous probabilistic models for optimisation and modelling of evolving spiking neural networks." AUT University, 2010. http://hdl.handle.net/10292/963.
Antoniou, Christos Andrea. "Improving the acoustic modelling of speech using modular/ensemble combinations of heterogeneous neural networks." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340582.
Wilson, Daniel B. "Combining genetic algorithms and artificial neural networks to select heterogeneous dispatching rules for a job shop system." Ohio: Ohio University, 1996. http://www.ohiolink.edu/etd/view.cgi?ohiou1177701025.
Hobro, Mark. "Semantic Integration across Heterogeneous Databases: Finding Data Correspondences using Agglomerative Hierarchical Clustering and Artificial Neural Networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-226657.
Full textDataintegrering är en viktig del inom området databaser när det kommer till databasmigreringar och sammanslagning av data. Forskning inom området har ökat i takt med att maskininlärning blivit ett attraktivt tillvägagångssätt under de senaste 20 åren. På grund av komplexiteten av forskningsområdet, har inga optimala lösningar hittats. Istället har flera olika tekniker framställts, som tillsammans kan förbättra databasmigreringar. Denna avhandling undersöker hur bra en lösning baserad på maskininlärning presterar för dataintegreringsproblemet vid databasmigreringar. Två algoritmer har implementerats. En är baserad på informationssökningsteori, som främst används för att ha en prestandamässig utgångspunkt för algoritmen som är baserad på maskininlärning. Den algoritmen består av ett första steg, där data grupperas med hjälp av hierarkisk klustring. Sedan tränas ett artificiellt neuronnät att hitta mönster i dessa grupperingar, för att kunna göra förutsägelser huruvida olika datainstanser har ett samband mellan två databaser. Resultatet visar att agglomerativ hierarkisk klustring presterar väl i uppgiften att klassificera den data som använts. Resultatet av matchningsalgoritmen visar på att en stor mängd av de matchande tabellerna kan hittas. Men förbättringar behöver göras för att både ge hög en hög återkallelse av matchningar och hög precision för de matchningar som hittas. Slutsatsen är att ett inlärningsbaserat tillvägagångssätt, i detta fall att använda agglomerativ hierarkisk klustring och sedan träna ett artificiellt neuronnät, fungerar bra som en basis för att till viss del automatisera ett dataintegreringsproblem likt det som presenterats i denna avhandling. För att få bättre resultat, krävs att lösningen förbättras med mer situationsspecifika algoritmer och regler.
Tekleyohannes, Anteneh Tesfaye. "Unified and heterogeneous modeling of water vapour sorption in Douglas-fir wood with artificial neural networks." Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/23032.
Full textToledo, Testa Juan Ignacio. "Information extraction from heterogeneous handwritten documents." Doctoral thesis, Universitat Autònoma de Barcelona, 2019. http://hdl.handle.net/10803/667388.
The goal of this thesis is Information Extraction from totally or partially handwritten documents. Basically, we deal with two different application scenarios. The first scenario is modern, highly structured documents such as forms. In this kind of document, the semantic information is encoded in different fields with a pre-defined location in the document; therefore, information extraction becomes equivalent to transcription. The second application scenario is loosely structured, totally handwritten documents, where, besides transcribing them, we need to assign a semantic label, from a set of known values, to the handwritten words. In both scenarios, transcription is an important part of the information extraction. For that reason, in this thesis we present two methods based on Neural Networks to transcribe handwritten text. In order to tackle the challenge of loosely structured documents, we have produced a benchmark, consisting of a dataset, a defined set of tasks, and a metric, that was presented to the community as an international competition. We also propose different models based on Convolutional and Recurrent neural networks that are able to transcribe and assign different semantic labels to each handwritten word, that is, able to perform Information Extraction.
Books on the topic "Heterogeneous neural networks"
Russo, Mark A. Metric for the Application of Heterogeneous Datasets to Improve Neural Networks in Cybersecurity Defense: A Quantitative Experimental Research Study. Independently Published, 2021.
Book chapters on the topic "Heterogeneous neural networks"
Park, Dong-Chul, Duc-Hoai Nguyen, Song-Jae Lee, and Yunsik Lee. "Heterogeneous Centroid Neural Networks." In Advances in Neural Networks - ISNN 2006, 689–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_101.
Wang, Ruijia. "Heterogeneous Graph Neural Networks." In Advances in Graph Neural Networks, 61–85. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16174-2_4.
Shi, Chuan. "Heterogeneous Graph Neural Networks." In Graph Neural Networks: Foundations, Frontiers, and Applications, 351–69. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6054-2_16.
Grąbczewski, Krzysztof, and Włodzisław Duch. "Heterogeneous Forests of Decision Trees." In Artificial Neural Networks — ICANN 2002, 504–9. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_82.
Satizábal, Héctor F., Andres Pérez-Uribe, and Marco Tomassini. "Avoiding Prototype Proliferation in Incremental Vector Quantization of Large Heterogeneous Datasets." In Constructive Neural Networks, 243–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04512-7_13.
Jin, Sichen, Yijia Zhang, and Mingyu Lu. "Heterogeneous Adaptive Denoising Networks for Recommendation." In Neural Computing for Advanced Applications, 30–43. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-6142-7_3.
Wang, Zhengxin, Jingbo Fan, He Jiang, and Haibo He. "Pinning Synchronization in Heterogeneous Networks of Harmonic Oscillators." In Neural Information Processing, 836–45. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70090-8_85.
Kaizoji, Taisei. "Speculative Dynamics in a Heterogeneous-Agent Model." In Artificial Neural Networks — ICANN 2001, 775–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_108.
Xu, Hongyan, Wenjun Wang, Hongtao Liu, Mengxuan Zhang, Qiang Tian, and Pengfei Jiao. "Key Nodes Cluster Augmented Embedding for Heterogeneous Information Networks." In Neural Information Processing, 499–511. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63833-7_42.
Fu, Guoji, Bo Yuan, Qiqi Duan, and Xin Yao. "Representation Learning for Heterogeneous Information Networks via Embedding Events." In Neural Information Processing, 327–39. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36708-4_27.
Conference papers on the topic "Heterogeneous neural networks"
Hu, Ruiqi, Celina Ping Yu, Sai-Fu Fung, Shirui Pan, Haishuai Wang, and Guodong Long. "Universal network representation for heterogeneous information networks." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965880.
Alhubail, Ali, Xupeng He, Marwa AlSinan, Hyung Kwak, and Hussein Hoteit. "Extended Physics-Informed Neural Networks for Solving Fluid Flow Problems in Highly Heterogeneous Media." In International Petroleum Technology Conference. IPTC, 2022. http://dx.doi.org/10.2523/iptc-22163-ms.
Valdes, Julio J. "Heterogeneous extreme learning machines." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727400.
Cao, Meng, Xiying Ma, Kai Zhu, Ming Xu, and Chongjun Wang. "Heterogeneous Information Network Embedding with Convolutional Graph Attention Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206610.
Yang, Carl, Jieyu Zhang, and Jiawei Han. "Neural Embedding Propagation on Heterogeneous Networks." In 2019 IEEE International Conference on Data Mining (ICDM). IEEE, 2019. http://dx.doi.org/10.1109/icdm.2019.00080.
Bai, Tong, and Gary Overett. "Heterogeneous Image Stylization Using Neural Networks." In 2017 International Conference on Digital Image Computing: Techniques and Applications (DICTA). IEEE, 2017. http://dx.doi.org/10.1109/dicta.2017.8227439.
Zhang, Shanshan, Tiancheng Huang, and Donglin Wang. "Sequence Contained Heterogeneous Graph Neural Network." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533391.
Sublime, Jeremie, Nistor Grozavu, Younes Bennani, and Antoine Cornuejols. "Collaborative clustering with heterogeneous algorithms." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280351.
Ma, Shuai, Jian-Wei Liu, Xin Zuo, and Wei-Min Li. "Heterogeneous Graph Gated Attention Network." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533711.
Yang, Zheng, Bing Han, Weiming Chen, and Xinbo Gao. "Learn to Encode Heterogeneous Data: A Heterogeneous Aware Network for Multi-Future Trajectory Prediction." In 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023. http://dx.doi.org/10.1109/ijcnn54540.2023.10191508.
Reports on the topic "Heterogeneous neural networks"
Kase, Hanno, Leonardo Melosi, and Matthias Rottner. Estimating Nonlinear Heterogeneous Agents Models with Neural Networks. Federal Reserve Bank of Chicago, 2022. http://dx.doi.org/10.21033/wp-2022-26.
Comola, Margherita, Rokhaya Dieye, and Bernard Fortin. Heterogeneous peer effects and gender-based interventions for teenage obesity. CIRANO, September 2022. http://dx.doi.org/10.54932/tqag9043.