Academic literature on the topic "Heterogeneous neural networks"
Create an accurate citation in the APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference papers, and other academic sources on the topic "Heterogeneous neural networks".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a pdf and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Heterogeneous neural networks"
Zeng, Wei, Ge Fan, Shan Sun, Biao Geng, Weiyi Wang, Jiacheng Li, and Weibo Liu. "Collaborative filtering via heterogeneous neural networks". Applied Soft Computing 109 (September 2021): 107516. http://dx.doi.org/10.1016/j.asoc.2021.107516.
Drakopoulos, John A., and Ahmad Abdulkader. "Training neural networks with heterogeneous data". Neural Networks 18, no. 5-6 (July 2005): 595–601. http://dx.doi.org/10.1016/j.neunet.2005.06.011.
Turner, Andrew James, and Julian Francis Miller. "NeuroEvolution: Evolving Heterogeneous Artificial Neural Networks". Evolutionary Intelligence 7, no. 3 (November 2014): 135–54. http://dx.doi.org/10.1007/s12065-014-0115-5.
Zhang, Chen, Zhouhua Tang, Bin Yu, Yu Xie, and Ke Pan. "Deep heterogeneous network embedding based on Siamese Neural Networks". Neurocomputing 388 (May 2020): 1–11. http://dx.doi.org/10.1016/j.neucom.2020.01.012.
Sun, Yizhou, Jiawei Han, Xifeng Yan, Philip S. Yu, and Tianyi Wu. "Heterogeneous information networks". Proceedings of the VLDB Endowment 15, no. 12 (August 2022): 3807–11. http://dx.doi.org/10.14778/3554821.3554901.
Iddianozie, Chidubem, and Gavin McArdle. "Towards Robust Representations of Spatial Networks Using Graph Neural Networks". Applied Sciences 11, no. 15 (July 27, 2021): 6918. http://dx.doi.org/10.3390/app11156918.
Gracious, Tony, Shubham Gupta, Arun Kanthali, Rui M. Castro, and Ambedkar Dukkipati. "Neural Latent Space Model for Dynamic Networks and Temporal Knowledge Graphs". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 5 (May 18, 2021): 4054–62. http://dx.doi.org/10.1609/aaai.v35i5.16526.
Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks". Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (June 12, 2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.
Son, Ha Min, Moon Hyun Kim, and Tai-Myoung Chung. "Comparisons Where It Matters: Using Layer-Wise Regularization to Improve Federated Learning on Heterogeneous Data". Applied Sciences 12, no. 19 (October 3, 2022): 9943. http://dx.doi.org/10.3390/app12199943.
Hosny, Khalid M., Marwa M. Khashaba, Walid I. Khedr, and Fathy A. Amer. "An Efficient Neural Network-Based Prediction Scheme for Heterogeneous Networks". International Journal of Sociotechnology and Knowledge Development 12, no. 2 (April 2020): 63–76. http://dx.doi.org/10.4018/ijskd.2020040104.
Dissertations and theses on the topic "Heterogeneous neural networks"
Belanche, Muñoz Lluís A. (Lluís Antoni). "Heterogeneous neural networks: theory and applications". Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6660.
Texto completoLa similitud proporciona una marc conceptual i serveix de cobertura unificadora de molts models neuronals de la literatura i d'exploració de noves instàncies de models de neurona.
La visió basada en similitud porta amb naturalitat a integrar informació heterogènia, com ara quantitats contínues i discretes (nominals i ordinals), i difuses ó imprecises. Els valors perduts es tracten de manera explícita.
Una neurona d'aquesta classe s'anomena neurona heterogènia i qualsevol arquitectura neuronal que en faci ús serà una Xarxa Neuronal Heterogènia.
En aquest treball ens concentrem en xarxes neuronals endavant, com focus inicial d'estudi. Els algorismes d'aprenentatge són basats en algorisms evolutius, especialment extesos per treballar amb informació heterogènia.
En aquesta tesi es descriu com una certa classe de neurones heterogènies porten a xarxes neuronals que mostren un rendiment molt satisfactori, comparable o superior al de xarxes neuronals tradicionals (com el perceptró multicapa ó la xarxa de base radial), molt especialment en presència d'informació heterogènia, usual en les bases de dades actuals.
This work presents a class of functions serving as generalized neuron models to be used in artificial neural networks. They are cast into the common framework of computing a similarity function, a flexible definition of a neuron as a pattern recognizer. The similarity endows the model with a clear conceptual view and serves as a unification cover for many of the existing neural models, including those classically used for the MultiLayer Perceptron (MLP) and most of those used in Radial Basis Function Networks (RBF). These families of models are conceptually unified and their relation is clarified.
The possibilities of deriving new instances are explored and several neuron models --representative of their families-- are proposed.
The similarity view naturally leads to further extensions of the models to handle heterogeneous information, that is to say, information coming from sources radically different in character, including continuous and discrete (ordinal) numerical quantities, nominal (categorical) quantities, and fuzzy quantities. Missing data are also explicitly considered. A neuron of this class is called a heterogeneous neuron, and any neural structure making use of them is a Heterogeneous Neural Network (HNN), regardless of the specific architecture or learning algorithm. Among them, in this work we concentrate on feed-forward networks as the initial focus of study. The learning procedures may include a great variety of techniques, broadly divided into derivative-based methods (such as the conjugate gradient) and evolutionary ones (such as variants of genetic algorithms).
In this Thesis we also explore a number of directions towards the construction of better neuron models --within an integrant envelope-- more adapted to the problems they are meant to solve.
It is described how a certain generic class of heterogeneous models leads to satisfactory performance, comparable to, and often better than, that of classical neural models, especially in the presence of heterogeneous information and imprecise or incomplete data, in a wide range of domains, most of them corresponding to real-world problems.
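The similarity-based heterogeneous neuron described in this abstract can be illustrated with a small sketch. The Gower-style partial similarities and the simple averaging aggregation below are illustrative assumptions, not the exact functions defined in the thesis:

```python
# Sketch of a similarity-based "heterogeneous neuron" (illustrative only;
# the partial similarities are Gower-style assumptions, not the thesis's
# exact definitions).

def partial_similarity(x, y, kind, value_range=1.0):
    """Similarity in [0, 1] for one attribute; None encodes a missing value."""
    if x is None or y is None:
        return None  # missing data are handled explicitly: the attribute is skipped
    if kind == "continuous":
        return 1.0 - abs(x - y) / value_range
    if kind == "nominal":
        return 1.0 if x == y else 0.0
    raise ValueError(f"unknown attribute kind: {kind}")

def heterogeneous_neuron(pattern, prototype, kinds, ranges):
    """Aggregate partial similarities over observed attributes; the neuron
    responds strongly to patterns similar to its prototype (a
    pattern-recognizer view of the neuron)."""
    sims = [partial_similarity(x, y, k, r)
            for x, y, k, r in zip(pattern, prototype, kinds, ranges)]
    observed = [s for s in sims if s is not None]
    return sum(observed) / len(observed) if observed else 0.0

# Mixed continuous / nominal input with one missing value:
kinds  = ["continuous", "nominal", "continuous"]
ranges = [10.0, 1.0, 5.0]
proto  = [3.0, "red", 2.0]
print(heterogeneous_neuron([4.0, "red", None], proto, kinds, ranges))  # → 0.95
```

The missing third attribute simply drops out of the average, rather than being imputed, which mirrors the explicit treatment of missing data mentioned in the abstract.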
Cabana, Tanguy. "Large deviations for the dynamics of heterogeneous neural networks". Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066551/document.
This thesis addresses the rigorous derivation of mean-field results for the continuous time dynamics of heterogeneous large neural networks. In our models, we consider firing-rate neurons subject to additive noise. The network is fully connected, with highly random connectivity weights. Their variance scales as the inverse of the network size, and thus conserves a non-trivial role in the thermodynamic limit. Moreover, another heterogeneity is considered at the level of each neuron. It is interpreted as a spatial location. For biological relevance, one model considered includes delays, with the mean and variance of the connections depending on the distance between cells. A second model considers interactions depending on the states of both neurons at play. This last case notably applies to Kuramoto's model of coupled oscillators. When the weights are independent Gaussian random variables, we show that the empirical measure of the neurons' states satisfies a large deviations principle, with a good rate function achieving its minimum at a unique probability measure, implying averaged convergence of the empirical measure and propagation of chaos. In certain cases, we also obtain quenched results. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a non-local Gaussian process whose statistics depend on the solution over the whole neural field. We further demonstrate the universality of this limit, in the sense that neuronal networks with non-Gaussian interconnections but sub-Gaussian tails converge towards it. Moreover, we present a few numerical applications and discuss possible perspectives.
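The setting of this abstract — fully connected firing-rate neurons with additive noise and Gaussian weights whose variance scales as the inverse of the network size — can be simulated directly. The sketch below is a toy Euler discretization; the nonlinearity, parameter values, and time step are illustrative assumptions, not the thesis's model:

```python
# Toy Euler simulation of a fully connected firing-rate network with
# additive noise and Gaussian weights of variance sigma_w^2 / N
# (all parameter values here are illustrative assumptions).
import math
import random

def simulate(n_neurons, t_steps=200, dt=0.01, sigma_w=1.0, sigma_noise=0.1, seed=0):
    rng = random.Random(seed)
    # J_ij ~ N(0, sigma_w^2 / N): the variance shrinks with network size
    # but keeps a non-trivial role in the thermodynamic limit.
    J = [[rng.gauss(0.0, sigma_w / math.sqrt(n_neurons))
          for _ in range(n_neurons)] for _ in range(n_neurons)]
    x = [rng.gauss(0.0, 1.0) for _ in range(n_neurons)]
    for _ in range(t_steps):
        rates = [math.tanh(v) for v in x]  # firing-rate nonlinearity
        x = [x[i]
             + dt * (-x[i] + sum(J[i][j] * rates[j] for j in range(n_neurons)))
             + math.sqrt(dt) * rng.gauss(0.0, sigma_noise)  # additive noise
             for i in range(n_neurons)]
    return x

states = simulate(50)
print(round(sum(states) / len(states), 3))
```

Averaging the empirical distribution of `states` over increasing `n_neurons` is the kind of experiment the mean-field limit describes: as N grows, the empirical measure concentrates around the unique limiting law.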
Zhao, Qiwei. "Federated Learning with Heterogeneous Challenge". Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/27399.
Schliebs, Stefan. "Heterogeneous probabilistic models for optimisation and modelling of evolving spiking neural networks". AUT University, 2010. http://hdl.handle.net/10292/963.
Antoniou, Christos Andrea. "Improving the acoustic modelling of speech using modular/ensemble combinations of heterogeneous neural networks". Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340582.
Wilson, Daniel B. "Combining genetic algorithms and artificial neural networks to select heterogeneous dispatching rules for a job shop system". Ohio: Ohio University, 1996. http://www.ohiolink.edu/etd/view.cgi?ohiou1177701025.
Hobro, Mark. "Semantic Integration across Heterogeneous Databases: Finding Data Correspondences using Agglomerative Hierarchical Clustering and Artificial Neural Networks". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-226657.
Texto completoDataintegrering är en viktig del inom området databaser när det kommer till databasmigreringar och sammanslagning av data. Forskning inom området har ökat i takt med att maskininlärning blivit ett attraktivt tillvägagångssätt under de senaste 20 åren. På grund av komplexiteten av forskningsområdet, har inga optimala lösningar hittats. Istället har flera olika tekniker framställts, som tillsammans kan förbättra databasmigreringar. Denna avhandling undersöker hur bra en lösning baserad på maskininlärning presterar för dataintegreringsproblemet vid databasmigreringar. Två algoritmer har implementerats. En är baserad på informationssökningsteori, som främst används för att ha en prestandamässig utgångspunkt för algoritmen som är baserad på maskininlärning. Den algoritmen består av ett första steg, där data grupperas med hjälp av hierarkisk klustring. Sedan tränas ett artificiellt neuronnät att hitta mönster i dessa grupperingar, för att kunna göra förutsägelser huruvida olika datainstanser har ett samband mellan två databaser. Resultatet visar att agglomerativ hierarkisk klustring presterar väl i uppgiften att klassificera den data som använts. Resultatet av matchningsalgoritmen visar på att en stor mängd av de matchande tabellerna kan hittas. Men förbättringar behöver göras för att både ge hög en hög återkallelse av matchningar och hög precision för de matchningar som hittas. Slutsatsen är att ett inlärningsbaserat tillvägagångssätt, i detta fall att använda agglomerativ hierarkisk klustring och sedan träna ett artificiellt neuronnät, fungerar bra som en basis för att till viss del automatisera ett dataintegreringsproblem likt det som presenterats i denna avhandling. För att få bättre resultat, krävs att lösningen förbättras med mer situationsspecifika algoritmer och regler.
Tekleyohannes, Anteneh Tesfaye. "Unified and heterogeneous modeling of water vapour sorption in Douglas-fir wood with artificial neural networks". Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/23032.
Texto completoToledo, Testa Juan Ignacio. "Information extraction from heterogeneous handwritten documents". Doctoral thesis, Universitat Autònoma de Barcelona, 2019. http://hdl.handle.net/10803/667388.
The goal of this thesis is information extraction from totally or partially handwritten documents. Basically, we are dealing with two different application scenarios. The first scenario is modern, highly structured documents such as forms. In this kind of document, the semantic information is encoded in different fields with a pre-defined location in the document; therefore, information extraction becomes equivalent to transcription. The second application scenario is loosely structured, totally handwritten documents, where, besides transcribing them, we need to assign a semantic label, from a set of known values, to the handwritten words. In both scenarios, transcription is an important part of the information extraction. For that reason, in this thesis we present two methods based on neural networks to transcribe handwritten text. In order to tackle the challenge of loosely structured documents, we have produced a benchmark, consisting of a dataset, a defined set of tasks, and a metric, that was presented to the community as an international competition. We also propose different models based on convolutional and recurrent neural networks that are able to transcribe and assign different semantic labels to each handwritten word, that is, able to perform information extraction.
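The task format of the second scenario — assigning each transcribed word a semantic label drawn from a set of known values — can be illustrated with a toy lookup. The thesis does this with convolutional and recurrent networks; the categories and lexicons below are hypothetical, only showing the shape of the input and output:

```python
# Toy illustration of the task format: each transcribed word receives a
# semantic label from a known set of values (the categories and lexicons
# here are hypothetical; the thesis learns this mapping with neural nets).
KNOWN_VALUES = {
    "name": {"maria", "josep", "anna"},
    "occupation": {"farmer", "teacher", "weaver"},
    "location": {"barcelona", "girona"},
}

def label_word(word):
    """Return the semantic label whose known-value set contains the word."""
    for label, values in KNOWN_VALUES.items():
        if word.lower() in values:
            return label
    return "other"

transcribed = ["Maria", "weaver", "Barcelona"]
print([(w, label_word(w)) for w in transcribed])
# → [('Maria', 'name'), ('weaver', 'occupation'), ('Barcelona', 'location')]
```

A learned model replaces this lookup precisely because handwritten transcriptions are noisy and the value sets are open-ended, but the input/output contract is the same.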
Books on the topic "Heterogeneous neural networks"
Russo, Mark A. Metric for the Application of Heterogeneous Datasets to Improve Neural Networks in Cybersecurity Defense: A Quantitative Experimental Research Study. Independently Published, 2021.
Book chapters on the topic "Heterogeneous neural networks"
Park, Dong-Chul, Duc-Hoai Nguyen, Song-Jae Lee, and Yunsik Lee. "Heterogeneous Centroid Neural Networks". In Advances in Neural Networks - ISNN 2006, 689–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_101.
Wang, Ruijia. "Heterogeneous Graph Neural Networks". In Advances in Graph Neural Networks, 61–85. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16174-2_4.
Shi, Chuan. "Heterogeneous Graph Neural Networks". In Graph Neural Networks: Foundations, Frontiers, and Applications, 351–69. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6054-2_16.
Grąbczewski, Krzysztof, and Włodzisław Duch. "Heterogeneous Forests of Decision Trees". In Artificial Neural Networks — ICANN 2002, 504–9. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_82.
Satizábal, Héctor F., Andres Pérez-Uribe, and Marco Tomassini. "Avoiding Prototype Proliferation in Incremental Vector Quantization of Large Heterogeneous Datasets". In Constructive Neural Networks, 243–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04512-7_13.
Jin, Sichen, Yijia Zhang, and Mingyu Lu. "Heterogeneous Adaptive Denoising Networks for Recommendation". In Neural Computing for Advanced Applications, 30–43. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-6142-7_3.
Wang, Zhengxin, Jingbo Fan, He Jiang, and Haibo He. "Pinning Synchronization in Heterogeneous Networks of Harmonic Oscillators". In Neural Information Processing, 836–45. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70090-8_85.
Kaizoji, Taisei. "Speculative Dynamics in a Heterogeneous-Agent Model". In Artificial Neural Networks — ICANN 2001, 775–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_108.
Xu, Hongyan, Wenjun Wang, Hongtao Liu, Mengxuan Zhang, Qiang Tian, and Pengfei Jiao. "Key Nodes Cluster Augmented Embedding for Heterogeneous Information Networks". In Neural Information Processing, 499–511. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63833-7_42.
Fu, Guoji, Bo Yuan, Qiqi Duan, and Xin Yao. "Representation Learning for Heterogeneous Information Networks via Embedding Events". In Neural Information Processing, 327–39. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36708-4_27.
Conference papers on the topic "Heterogeneous neural networks"
Hu, Ruiqi, Celina Ping Yu, Sai-Fu Fung, Shirui Pan, Haishuai Wang, and Guodong Long. "Universal network representation for heterogeneous information networks". In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965880.
Alhubail, Ali, Xupeng He, Marwa AlSinan, Hyung Kwak, and Hussein Hoteit. "Extended Physics-Informed Neural Networks for Solving Fluid Flow Problems in Highly Heterogeneous Media". In International Petroleum Technology Conference. IPTC, 2022. http://dx.doi.org/10.2523/iptc-22163-ms.
Valdes, Julio J. "Heterogeneous extreme learning machines". In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727400.
Cao, Meng, Xiying Ma, Kai Zhu, Ming Xu, and Chongjun Wang. "Heterogeneous Information Network Embedding with Convolutional Graph Attention Networks". In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206610.
Yang, Carl, Jieyu Zhang, and Jiawei Han. "Neural Embedding Propagation on Heterogeneous Networks". In 2019 IEEE International Conference on Data Mining (ICDM). IEEE, 2019. http://dx.doi.org/10.1109/icdm.2019.00080.
Bai, Tong, and Gary Overett. "Heterogeneous Image Stylization Using Neural Networks". In 2017 International Conference on Digital Image Computing: Techniques and Applications (DICTA). IEEE, 2017. http://dx.doi.org/10.1109/dicta.2017.8227439.
Zhang, Shanshan, Tiancheng Huang, and Donglin Wang. "Sequence Contained Heterogeneous Graph Neural Network". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533391.
Sublime, Jeremie, Nistor Grozavu, Younes Bennani, and Antoine Cornuejols. "Collaborative clustering with heterogeneous algorithms". In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280351.
Ma, Shuai, Jian-Wei Liu, Xin Zuo, and Wei-Min Li. "Heterogeneous Graph Gated Attention Network". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533711.
Yang, Zheng, Bing Han, Weiming Chen, and Xinbo Gao. "Learn to Encode Heterogeneous Data: A Heterogeneous Aware Network for Multi-Future Trajectory Prediction". In 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023. http://dx.doi.org/10.1109/ijcnn54540.2023.10191508.
Reports on the topic "Heterogeneous neural networks"
Kase, Hanno, Leonardo Melosi y Matthias Rottner. Estimating Nonlinear Heterogeneous Agents Models with Neural Networks. Federal Reserve Bank of Chicago, 2022. http://dx.doi.org/10.21033/wp-2022-26.
Texto completoComola, Margherita, Rokhaya Dieye y Bernard Fortin. Heterogeneous peer effects and gender-based interventions for teenage obesity. CIRANO, septiembre de 2022. http://dx.doi.org/10.54932/tqag9043.
Texto completo