Scientific literature on the topic "Heterogeneous neural networks"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles
Contents
Browse the thematic lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Heterogeneous neural networks."
Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever this information is included in the metadata.
Journal articles on the topic "Heterogeneous neural networks"
Zeng, Wei, Ge Fan, Shan Sun, Biao Geng, Weiyi Wang, Jiacheng Li, and Weibo Liu. "Collaborative filtering via heterogeneous neural networks." Applied Soft Computing 109 (September 2021): 107516. http://dx.doi.org/10.1016/j.asoc.2021.107516.
Drakopoulos, John A., and Ahmad Abdulkader. "Training neural networks with heterogeneous data." Neural Networks 18, no. 5-6 (July 2005): 595–601. http://dx.doi.org/10.1016/j.neunet.2005.06.011.
Turner, Andrew James, and Julian Francis Miller. "NeuroEvolution: Evolving Heterogeneous Artificial Neural Networks." Evolutionary Intelligence 7, no. 3 (November 2014): 135–54. http://dx.doi.org/10.1007/s12065-014-0115-5.
Zhang, Chen, Zhouhua Tang, Bin Yu, Yu Xie, and Ke Pan. "Deep heterogeneous network embedding based on Siamese Neural Networks." Neurocomputing 388 (May 2020): 1–11. http://dx.doi.org/10.1016/j.neucom.2020.01.012.
Sun, Yizhou, Jiawei Han, Xifeng Yan, Philip S. Yu, and Tianyi Wu. "Heterogeneous information networks." Proceedings of the VLDB Endowment 15, no. 12 (August 2022): 3807–11. http://dx.doi.org/10.14778/3554821.3554901.
Iddianozie, Chidubem, and Gavin McArdle. "Towards Robust Representations of Spatial Networks Using Graph Neural Networks." Applied Sciences 11, no. 15 (July 27, 2021): 6918. http://dx.doi.org/10.3390/app11156918.
Gracious, Tony, Shubham Gupta, Arun Kanthali, Rui M. Castro, and Ambedkar Dukkipati. "Neural Latent Space Model for Dynamic Networks and Temporal Knowledge Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 5 (May 18, 2021): 4054–62. http://dx.doi.org/10.1609/aaai.v35i5.16526.
Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks." Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (June 12, 2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.
Son, Ha Min, Moon Hyun Kim, and Tai-Myoung Chung. "Comparisons Where It Matters: Using Layer-Wise Regularization to Improve Federated Learning on Heterogeneous Data." Applied Sciences 12, no. 19 (October 3, 2022): 9943. http://dx.doi.org/10.3390/app12199943.
Hosny, Khalid M., Marwa M. Khashaba, Walid I. Khedr, and Fathy A. Amer. "An Efficient Neural Network-Based Prediction Scheme for Heterogeneous Networks." International Journal of Sociotechnology and Knowledge Development 12, no. 2 (April 2020): 63–76. http://dx.doi.org/10.4018/ijskd.2020040104.
Theses on the topic "Heterogeneous neural networks"
Belanche Muñoz, Lluís A. (Lluís Antoni). "Heterogeneous neural networks: theory and applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6660.
This work presents a class of functions serving as generalized neuron models to be used in artificial neural networks. They are cast into the common framework of computing a similarity function, a flexible definition of a neuron as a pattern recognizer. Similarity endows the model with a clear conceptual view and serves as a unifying cover for many existing neural models, including those classically used in the MultiLayer Perceptron (MLP) and most of those used in Radial Basis Function (RBF) networks. These families of models are conceptually unified and their relationship is clarified.
The possibilities of deriving new instances are explored, and several neuron models (representative of their families) are proposed.
The similarity view naturally leads to further extensions of the models to handle heterogeneous information, that is, information coming from sources radically different in character, including continuous and discrete (ordinal) numerical quantities, nominal (categorical) quantities, and fuzzy quantities. Missing data are also handled explicitly. A neuron of this class is called a heterogeneous neuron, and any neural structure making use of them is a Heterogeneous Neural Network (HNN), regardless of the specific architecture or learning algorithm. Among these, this work concentrates on feed-forward networks as the initial focus of study. The learning procedures may include a great variety of techniques, broadly divided into derivative-based methods (such as the conjugate gradient) and evolutionary ones (such as variants of genetic algorithms).
This thesis also explores a number of directions towards the construction of better neuron models, within an integrative framework, more adapted to the problems they are meant to solve.
It is described how a certain generic class of heterogeneous models leads to satisfactory performance, comparable to, and often better than, that of classical neural models, especially in the presence of heterogeneous information and imprecise or incomplete data, in a wide range of domains, most of them corresponding to real-world problems.
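The similarity-based heterogeneous neuron summarized above can be illustrated with a small sketch. This is a loose, hypothetical interpretation rather than the thesis's actual model: it uses a Gower-style similarity over mixed continuous/nominal features, skips missing values (`None`) instead of imputing them, and feeds the result through an assumed logistic activation. All function names, the similarity choice, and the activation parameters are illustrative assumptions.

```python
import math

def heterogeneous_similarity(x, y, kinds, ranges=None):
    """Gower-style similarity between two mixed-type patterns.

    kinds[i] is "continuous", "nominal", or "ordinal"; a value of None
    marks missing data, which is skipped rather than imputed (explicit
    missing-value handling, as the abstract describes)."""
    ranges = ranges or {}
    total, used = 0.0, 0
    for i, (xi, yi) in enumerate(zip(x, y)):
        if xi is None or yi is None:      # missing data: ignore this feature
            continue
        if kinds[i] == "nominal":         # categorical: exact match only
            total += 1.0 if xi == yi else 0.0
        else:                             # continuous/ordinal: scaled gap
            r = ranges.get(i, 1.0)        # feature range for normalisation
            total += 1.0 - abs(xi - yi) / r
        used += 1
    return total / used if used else 0.0

def heterogeneous_neuron(x, prototype, kinds, ranges=None):
    """A neuron that responds to the similarity between its input and its
    prototype, squashed through a logistic activation centred at 0.5."""
    s = heterogeneous_similarity(x, prototype, kinds, ranges)
    return 1.0 / (1.0 + math.exp(-8.0 * (s - 0.5)))
```

For example, comparing `[2.0, "a", None]` against a prototype `[4.0, "a", 3.0]` uses only the two observed features, so one missing measurement degrades the estimate gracefully instead of invalidating the pattern.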
Cabana, Tanguy. "Large deviations for the dynamics of heterogeneous neural networks." Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066551/document.
This thesis addresses the rigorous derivation of mean-field results for the continuous-time dynamics of heterogeneous large neural networks. In our models, we consider firing-rate neurons subject to additive noise. The network is fully connected, with highly random connectivity weights. Their variance scales as the inverse of the network size, and thus retains a non-trivial role in the thermodynamic limit. Moreover, another heterogeneity is considered at the level of each neuron, interpreted as a spatial location. For biological relevance, one model includes delays, with the mean and variance of the connections depending on the distance between cells. A second model considers interactions depending on the states of both neurons at play; this last case notably applies to Kuramoto's model of coupled oscillators. When the weights are independent Gaussian random variables, we show that the empirical measure of the neurons' states satisfies a large deviations principle, with a good rate function achieving its minimum at a unique probability measure, implying averaged convergence of the empirical measure and propagation of chaos. In certain cases, we also obtain quenched results. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a non-local Gaussian process whose statistics depend on the solution over the whole neural field. We further demonstrate the universality of this limit, in the sense that neuronal networks with non-Gaussian interconnections but sub-Gaussian tails converge towards it. Moreover, we present a few numerical applications and discuss possible perspectives.
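The 1/N variance scaling of the couplings that this abstract emphasizes can be illustrated with a toy Euler simulation of a fully connected firing-rate network. This is only a sketch under stated assumptions: the thesis's treatment is analytic, and the noise, delays, and spatial structure it considers are omitted here; the `tanh` transfer function and all parameter values are illustrative choices.

```python
import math
import random

def simulate_network(n=200, g=1.5, steps=200, dt=0.05, seed=0):
    """Euler simulation of dx_i/dt = -x_i + sum_j J_ij * tanh(x_j),
    with independent Gaussian couplings J_ij of variance g^2 / n.
    The 1/n variance scaling keeps the interaction term of order one
    as n grows, which is why the disorder survives in the
    thermodynamic limit."""
    rng = random.Random(seed)
    sigma = g / math.sqrt(n)
    J = [[rng.gauss(0.0, sigma) for _ in range(n)] for _ in range(n)]
    x = [rng.gauss(0.0, 1.0) for _ in range(n)]   # random initial states
    for _ in range(steps):
        phi = [math.tanh(v) for v in x]           # firing rates
        x = [xi + dt * (-xi + sum(w * p for w, p in zip(row, phi)))
             for xi, row in zip(x, J)]
    return x
```

Averaging observables of such simulations over many independent draws of `J` gives an empirical picture of the averaged mean-field limit the thesis characterizes rigorously.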
Zhao, Qiwei. "Federated Learning with Heterogeneous Challenge." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/27399.
Schliebs, Stefan. "Heterogeneous probabilistic models for optimisation and modelling of evolving spiking neural networks." AUT University, 2010. http://hdl.handle.net/10292/963.
Antoniou, Christos Andrea. "Improving the acoustic modelling of speech using modular/ensemble combinations of heterogeneous neural networks." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340582.
Wilson, Daniel B. "Combining genetic algorithms and artificial neural networks to select heterogeneous dispatching rules for a job shop system." Ohio: Ohio University, 1996. http://www.ohiolink.edu/etd/view.cgi?ohiou1177701025.
Hobro, Mark. "Semantic Integration across Heterogeneous Databases: Finding Data Correspondences using Agglomerative Hierarchical Clustering and Artificial Neural Networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-226657.
Texte intégralDataintegrering är en viktig del inom området databaser när det kommer till databasmigreringar och sammanslagning av data. Forskning inom området har ökat i takt med att maskininlärning blivit ett attraktivt tillvägagångssätt under de senaste 20 åren. På grund av komplexiteten av forskningsområdet, har inga optimala lösningar hittats. Istället har flera olika tekniker framställts, som tillsammans kan förbättra databasmigreringar. Denna avhandling undersöker hur bra en lösning baserad på maskininlärning presterar för dataintegreringsproblemet vid databasmigreringar. Två algoritmer har implementerats. En är baserad på informationssökningsteori, som främst används för att ha en prestandamässig utgångspunkt för algoritmen som är baserad på maskininlärning. Den algoritmen består av ett första steg, där data grupperas med hjälp av hierarkisk klustring. Sedan tränas ett artificiellt neuronnät att hitta mönster i dessa grupperingar, för att kunna göra förutsägelser huruvida olika datainstanser har ett samband mellan två databaser. Resultatet visar att agglomerativ hierarkisk klustring presterar väl i uppgiften att klassificera den data som använts. Resultatet av matchningsalgoritmen visar på att en stor mängd av de matchande tabellerna kan hittas. Men förbättringar behöver göras för att både ge hög en hög återkallelse av matchningar och hög precision för de matchningar som hittas. Slutsatsen är att ett inlärningsbaserat tillvägagångssätt, i detta fall att använda agglomerativ hierarkisk klustring och sedan träna ett artificiellt neuronnät, fungerar bra som en basis för att till viss del automatisera ett dataintegreringsproblem likt det som presenterats i denna avhandling. För att få bättre resultat, krävs att lösningen förbättras med mer situationsspecifika algoritmer och regler.
Tekleyohannes, Anteneh Tesfaye. "Unified and heterogeneous modeling of water vapour sorption in Douglas-fir wood with artificial neural networks." Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/23032.
Toledo Testa, Juan Ignacio. "Information extraction from heterogeneous handwritten documents." Doctoral thesis, Universitat Autònoma de Barcelona, 2019. http://hdl.handle.net/10803/667388.
The goal of this thesis is information extraction from totally or partially handwritten documents with some degree of structure. Basically, we deal with two different application scenarios. The first scenario is modern, highly structured documents such as forms. In this kind of document, the semantic information is encoded in different fields with a pre-defined location, so information extraction becomes equivalent to transcription. The second application scenario is loosely structured, totally handwritten documents, where, besides transcribing them, we need to assign each handwritten word a semantic label from a set of known values. In both scenarios, transcription is an important part of information extraction; for that reason, in this thesis we present two methods based on neural networks to transcribe handwritten text. In order to tackle the challenge of loosely structured documents, we have produced a benchmark, consisting of a dataset, a defined set of tasks, and a metric, which was presented to the community as an international competition. We also propose different models based on convolutional and recurrent neural networks that are able to transcribe and assign different semantic labels to each handwritten word, that is, able to perform information extraction.
Books on the topic "Heterogeneous neural networks"
Metric for the Application of Heterogeneous Datasets to Improve Neural Networks in Cybersecurity Defense: A Quantitative Experimental Research Study. Independently Published, 2021.
Russo, Mark A. Metric for the Application of Heterogeneous Datasets to Improve Neural Networks in Cybersecurity Defense: A Quantitative Experimental Research Study. Independently Published, 2021.
Book chapters on the topic "Heterogeneous neural networks"
Park, Dong-Chul, Duc-Hoai Nguyen, Song-Jae Lee, and Yunsik Lee. "Heterogeneous Centroid Neural Networks." In Advances in Neural Networks - ISNN 2006, 689–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_101.
Wang, Ruijia. "Heterogeneous Graph Neural Networks." In Advances in Graph Neural Networks, 61–85. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16174-2_4.
Shi, Chuan. "Heterogeneous Graph Neural Networks." In Graph Neural Networks: Foundations, Frontiers, and Applications, 351–69. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6054-2_16.
Grąbczewski, Krzysztof, and Włodzisław Duch. "Heterogeneous Forests of Decision Trees." In Artificial Neural Networks - ICANN 2002, 504–9. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_82.
Satizábal, Héctor F., Andres Pérez-Uribe, and Marco Tomassini. "Avoiding Prototype Proliferation in Incremental Vector Quantization of Large Heterogeneous Datasets." In Constructive Neural Networks, 243–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04512-7_13.
Jin, Sichen, Yijia Zhang, and Mingyu Lu. "Heterogeneous Adaptive Denoising Networks for Recommendation." In Neural Computing for Advanced Applications, 30–43. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-6142-7_3.
Wang, Zhengxin, Jingbo Fan, He Jiang, and Haibo He. "Pinning Synchronization in Heterogeneous Networks of Harmonic Oscillators." In Neural Information Processing, 836–45. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70090-8_85.
Kaizoji, Taisei. "Speculative Dynamics in a Heterogeneous-Agent Model." In Artificial Neural Networks - ICANN 2001, 775–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_108.
Xu, Hongyan, Wenjun Wang, Hongtao Liu, Mengxuan Zhang, Qiang Tian, and Pengfei Jiao. "Key Nodes Cluster Augmented Embedding for Heterogeneous Information Networks." In Neural Information Processing, 499–511. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63833-7_42.
Fu, Guoji, Bo Yuan, Qiqi Duan, and Xin Yao. "Representation Learning for Heterogeneous Information Networks via Embedding Events." In Neural Information Processing, 327–39. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36708-4_27.
Conference papers on the topic "Heterogeneous neural networks"
Hu, Ruiqi, Celina Ping Yu, Sai-Fu Fung, Shirui Pan, Haishuai Wang, and Guodong Long. "Universal network representation for heterogeneous information networks." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965880.
Alhubail, Ali, Xupeng He, Marwa AlSinan, Hyung Kwak, and Hussein Hoteit. "Extended Physics-Informed Neural Networks for Solving Fluid Flow Problems in Highly Heterogeneous Media." In International Petroleum Technology Conference. IPTC, 2022. http://dx.doi.org/10.2523/iptc-22163-ms.
Valdes, Julio J. "Heterogeneous extreme learning machines." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727400.
Cao, Meng, Xiying Ma, Kai Zhu, Ming Xu, and Chongjun Wang. "Heterogeneous Information Network Embedding with Convolutional Graph Attention Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206610.
Yang, Carl, Jieyu Zhang, and Jiawei Han. "Neural Embedding Propagation on Heterogeneous Networks." In 2019 IEEE International Conference on Data Mining (ICDM). IEEE, 2019. http://dx.doi.org/10.1109/icdm.2019.00080.
Bai, Tong, and Gary Overett. "Heterogeneous Image Stylization Using Neural Networks." In 2017 International Conference on Digital Image Computing: Techniques and Applications (DICTA). IEEE, 2017. http://dx.doi.org/10.1109/dicta.2017.8227439.
Zhang, Shanshan, Tiancheng Huang, and Donglin Wang. "Sequence Contained Heterogeneous Graph Neural Network." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533391.
Sublime, Jeremie, Nistor Grozavu, Younes Bennani, and Antoine Cornuejols. "Collaborative clustering with heterogeneous algorithms." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280351.
Ma, Shuai, Jian-Wei Liu, Xin Zuo, and Wei-Min Li. "Heterogeneous Graph Gated Attention Network." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533711.
Yang, Zheng, Bing Han, Weiming Chen, and Xinbo Gao. "Learn to Encode Heterogeneous Data: A Heterogeneous Aware Network for Multi-Future Trajectory Prediction." In 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023. http://dx.doi.org/10.1109/ijcnn54540.2023.10191508.
Reports of organizations on the topic "Heterogeneous neural networks"
Kase, Hanno, Leonardo Melosi, and Matthias Rottner. Estimating Nonlinear Heterogeneous Agents Models with Neural Networks. Federal Reserve Bank of Chicago, 2022. http://dx.doi.org/10.21033/wp-2022-26.
Comola, Margherita, Rokhaya Dieye, and Bernard Fortin. Heterogeneous peer effects and gender-based interventions for teenage obesity. CIRANO, September 2022. http://dx.doi.org/10.54932/tqag9043.