A ready-made bibliography on the topic "Graph attention network (GAT)"
Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles.
See lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Graph attention network (GAT)".
Journal articles on the topic "Graph attention network (GAT)"
Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks". Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (June 12, 2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.
Verma, Atul Kumar, Rahul Saxena, Mahipal Jadeja, Vikrant Bhateja, and Jerry Chun-Wei Lin. "Bet-GAT: An Efficient Centrality-Based Graph Attention Model for Semi-Supervised Node Classification". Applied Sciences 13, no. 2 (January 7, 2023): 847. http://dx.doi.org/10.3390/app13020847.
Lu, Shengfu, Jiaming Kang, Jinyu Zhang, and Mi Li. "Assessment method of depressive disorder level based on graph attention network". ITM Web of Conferences 45 (2022): 01039. http://dx.doi.org/10.1051/itmconf/20224501039.
Xiang, Zhijie, Weijia Gong, Zehui Li, Xue Yang, Jihua Wang, and Hong Wang. "Predicting Protein–Protein Interactions via Gated Graph Attention Signed Network". Biomolecules 11, no. 6 (May 28, 2021): 799. http://dx.doi.org/10.3390/biom11060799.
Yuan, Hong, Jing Huang, and Jin Li. "Protein-ligand binding affinity prediction model based on graph attention network". Mathematical Biosciences and Engineering 18, no. 6 (2021): 9148–62. http://dx.doi.org/10.3934/mbe.2021451.
Jing, Weipeng, Xianyang Song, Donglin Di, and Houbing Song. "geoGAT: Graph Model Based on Attention Mechanism for Geographic Text Classification". ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 5 (September 30, 2021): 1–18. http://dx.doi.org/10.1145/3434239.
Liu, Yiwen, Tao Wen, and Zhenning Wu. "Motion Artifact Detection Based on Regional–Temporal Graph Attention Network from Head Computed Tomography Images". Electronics 13, no. 4 (February 10, 2024): 724. http://dx.doi.org/10.3390/electronics13040724.
Huang, Ling, Xing-Xing Liu, Shu-Qiang Huang, Chang-Dong Wang, Wei Tu, Jia-Meng Xie, Shuai Tang, and Wendi Xie. "Temporal Hierarchical Graph Attention Network for Traffic Prediction". ACM Transactions on Intelligent Systems and Technology 12, no. 6 (December 31, 2021): 1–21. http://dx.doi.org/10.1145/3446430.
Song, Kyungwoo, Yohan Jung, Dongjun Kim, and Il-Chul Moon. "Implicit Kernel Attention". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 11 (May 18, 2021): 9713–21. http://dx.doi.org/10.1609/aaai.v35i11.17168.
Zheng, Jing, Ziren Gao, Jingsong Ma, Jie Shen, and Kang Zhang. "Deep Graph Convolutional Networks for Accurate Automatic Road Network Selection". ISPRS International Journal of Geo-Information 10, no. 11 (November 11, 2021): 768. http://dx.doi.org/10.3390/ijgi10110768.
Doctoral dissertations on the topic "Graph attention network (GAT)"
Belhadj, Djedjiga. "Multi-GAT semi-supervisé pour l’extraction d’informations et son adaptation au chiffrement homomorphe". Electronic Thesis or Diss., Université de Lorraine, 2024. http://www.theses.fr/2024LORR0023.
This thesis was carried out as part of the BPI DeepTech project, in collaboration with the company Fair&Smart, which is primarily concerned with the protection of personal data in accordance with the General Data Protection Regulation (GDPR). In this context, we proposed a deep neural model for extracting information from semi-structured administrative documents (SSDs). Due to the lack of public training datasets, we proposed an artificial generator of SSDs that can produce several classes of documents with wide variation in content and layout. Documents are generated using random variables to control content and layout, while respecting constraints that ensure their similarity to real documents. Metrics were introduced to evaluate the content and layout diversity of the generated SSDs. The evaluation showed that the generated datasets for three SSD types (payslips, receipts and invoices) exhibit a high level of diversity, thus avoiding overfitting when training the information extraction systems. Based on the specific format of SSDs, which consist largely of word pairs (keyword-information) located in spatially close neighborhoods, the document is modeled as a graph in which nodes represent words and edges represent neighborhood connections. The graph is fed into a multi-layer graph attention network (Multi-GAT), which applies the multi-head attention mechanism to learn the importance of each word's neighbors in order to better classify it. A first version of this model was used in supervised mode and obtained an F1 score of 96% on two generated invoice and payslip datasets, and 89% on a real receipt dataset (SROIE). We then enriched the Multi-GAT with multimodal embeddings of word-level information (textual, visual and positional), and combined it with a variational graph auto-encoder (VGAE). This model operates in semi-supervised mode and can learn from labeled and unlabeled data simultaneously.
To further optimize graph node classification, we proposed a semi-VGAE whose encoder shares its first layers with the Multi-GAT classifier, reinforced by a VGAE loss function driven by the classification loss. Using a small unlabeled dataset, we improved the F1 score obtained on a generated invoice dataset by over 3%. Since the model is intended to operate in a protected environment, we adapted its architecture for homomorphic encryption. We studied a dimensionality reduction method for the Multi-GAT model and then proposed a polynomial approximation approach for its non-linear functions. To reduce the model's dimensionality, we proposed a multimodal feature fusion method that requires few additional parameters and shrinks the model while improving its performance. For the encryption adaptation, we studied low-degree polynomial approximations of the non-linear functions, using knowledge distillation and fine-tuning techniques to better adapt the model to the new approximations. We were able to limit the approximation loss to around 3% on two invoice datasets and one payslip dataset, and to 5% on SROIE.
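The neighbor-weighting step that the abstract above attributes to the Multi-GAT (attention over a graph whose nodes are words) can be illustrated with a minimal single-head sketch. This is plain NumPy under assumed toy shapes and random weights, not the thesis implementation; the function and variable names are hypothetical:

```python
import numpy as np

def gat_layer(H, A, W, a_src, a_dst, slope=0.2):
    """One single-head graph attention layer on a dense adjacency matrix.

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F') projection; a_src, a_dst: (F',) attention vectors.
    """
    Z = H @ W                                    # project node features
    # Raw score e_ij = LeakyReLU(a_src . z_i + a_dst . z_j)
    e = (Z @ a_src)[:, None] + (Z @ a_dst)[None, :]
    e = np.where(e > 0, e, slope * e)            # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask non-neighbors
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)        # row-wise softmax
    return att @ Z, att                          # weighted aggregation

# Toy word graph: a 3-node path 0-1-2, self-loops included.
A = np.array([[1., 1., 0.],
              [1., 1., 1.],
              [0., 1., 1.]])
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
out, att = gat_layer(H, A,
                     rng.normal(size=(4, 2)),
                     rng.normal(size=2),
                     rng.normal(size=2))
print(out.shape)  # (3, 2)
```

Stacking several such heads (concatenating their outputs) and several layers gives the Multi-GAT structure the abstract describes.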
Lee, John Boaz T. "Deep Learning on Graph-structured Data". Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-dissertations/570.
You, Di. "Attributed Multi-Relational Attention Network for Fact-checking URL Recommendation". Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-theses/1321.
Dronzeková, Michaela. "Analýza polygonálních modelů pomocí neuronových sítí". Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2020. http://www.nusl.cz/ntk/nusl-417253.
Blini, Elvio A. "Biases in Visuo-Spatial Attention: from Assessment to Experimental Induction". Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424480.
In this work I present a series of studies that may seem rather heterogeneous in their experimental questions and methodological approaches, but are nevertheless linked by a common thread: the constructs of spatial reasoning and attention. I address in particular aspects related to the assessment of attentional asymmetries, in healthy individuals as well as in patients with neurological disorders, their role in various aspects of human cognition, and their neural substrates, guided by the conviction that spatial attention plays an important role in a range of mental processes not necessarily limited to perception. What follows is organized into two distinct sections. In the first I focus on the assessment of visuospatial asymmetries, starting with the description of a new paradigm particularly suited to this purpose. In the first chapter I describe the effects of dual-tasking and attentional load on a spatial monitoring test; the main result shows a clear deterioration in performance on the spatial detection task as a function of the memory load introduced. In the second chapter I apply the same paradigm to a clinical population characterized by left-hemisphere brain lesions. Although a standard neuropsychological assessment revealed no lateralized attentional deficit, I show that exploiting an ancillary task can yield markedly greater sensitivity of the diagnostic tests, with clear benefits for patients' clinical and therapeutic course.
Finally, in the third chapter I suggest, through preliminary data, that attentional asymmetries can also be identified, in healthy individuals, along the sagittal axis; I argue, in particular, that more attentional resources appear to be generally concentrated around peripersonal space, and that the resulting benefits extend to tasks of various kinds (for example, discrimination tasks). I then move to the second section, in which, following a reverse logic, I induce shifts in the attentional focus in order to assess its role in tasks of various kinds. In the fourth and fifth chapters I exploit sensory stimulation: optokinetic visual stimulation and galvanic vestibular stimulation, respectively. In the fourth chapter I show that spatial attention is involved in numerical cognition, with which it maintains bidirectional relations. Specifically, I show on the one hand that optokinetic stimulation can modulate the occurrence of procedural errors in mental calculation, and on the other that calculation itself affects spatial attention, and in particular oculomotor behavior. In the fifth chapter I examine the effects of galvanic vestibular stimulation, a particularly promising technique for the rehabilitation of lateralized attentional disorders, on mental representations of space. I critically discuss a recent model of unilateral spatial neglect, suggesting that vestibular stimulation and disorders may indeed affect metric representations of space, but without necessarily entailing inattention to space itself. Finally, in the sixth chapter I describe the visuospatial attentional capture that intrinsically motivating distractor stimuli can exert in healthy adults.
In particular, I attempt to predict the magnitude of this attentional capture from resting-state functional magnetic resonance images: I report preliminary data focused on the importance of the cingulo-opercular network, drawing a parallel with clinical populations characterized by addictive behaviors.
Huang, Wei-Chia. "A Question Answering System for Financial Time-Series Correlation Based on Improved Gated Graph Sequence Neural Network with Attention Mechanism". Thesis, 2019. http://ndltd.ncl.edu.tw/handle/hu4b8r.
National Chiao Tung University, Institute of Information Management, 2019 (ROC year 108).
With the rise of financial technology (FinTech) in recent years, the financial industry has sought to make its services more efficient through technology; one important topic in FinTech is how to analyze big data and build prediction models based on artificial intelligence. We hope to discover rules and implicit correlations in these data through algorithms and to forecast the future state of the market. We believe there are complex correlation patterns between different financial commodities: a change in one commodity may trigger a chain reaction through the financial market via this complex network. We may therefore be able to build a relational model over these commodities, represented as a graph structure, that captures the real state of the market, and learn the correlations with the help of deep neural networks. We thus focus on graph neural networks and apply them to the financial domain. In this work, we propose a deep learning model based on graph structure and an attention mechanism, applied to the study of interaction relationships in financial time-series data. Traditional deep learning models perform well when the input data lie in a Euclidean space, such as images and sequences, but they easily lose the structural information of a graph; a deep learning model designed specifically for graph-structured data is therefore necessary. In this study, we formulate the various relationships between financial commodities as a graph and learn its representation through graph neural networks. Moreover, we highlight the importance of each commodity through the attention mechanism, and finally forecast the future trend of the market with the proposed model.
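The first step this abstract suggests, representing relationships between financial commodities as a graph, can be sketched under simple assumptions (synthetic return series and a plain correlation threshold, neither taken from the thesis):

```python
import numpy as np

def correlation_graph(X, tau=0.5):
    """Build an adjacency matrix from pairwise correlations of time series.

    X: (num_series, T) array; an edge links series whose |correlation| >= tau.
    """
    C = np.corrcoef(X)                    # pairwise Pearson correlations
    A = (np.abs(C) >= tau).astype(float)  # keep strong links only
    np.fill_diagonal(A, 1.0)              # self-loops, as GNN layers expect
    return A, C

# Toy data: two strongly correlated series plus one independent series.
rng = np.random.default_rng(1)
base = rng.normal(size=100)
X = np.stack([
    base + 0.1 * rng.normal(size=100),
    base + 0.1 * rng.normal(size=100),
    rng.normal(size=100),
])
A, C = correlation_graph(X)
print(A.shape)  # (3, 3)
```

A graph attention model could then operate on `A` to weight each commodity's neighbors, in the spirit of the attention mechanism the abstract describes.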
Book chapters on the topic "Graph attention network (GAT)"
Peng, Mi, Buqing Cao, Junjie Chen, Jianxun Liu, and Bing Li. "SC-GAT: Web Services Classification Based on Graph Attention Network". In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 513–29. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-67537-0_31.
Sui, Jianan, Yuehui Chen, Baitong Chen, Yi Cao, Jiazi Chen, and Hanhan Cong. "SeqVec-GAT: A Golgi Classification Model Based on Multi-headed Graph Attention Network". In Intelligent Computing Theories and Application, 697–704. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13829-4_61.
Xu, Hongbo, Shuhao Li, Zhenyu Cheng, Rui Qin, Jiang Xie, and Peishuai Sun. "VT-GAT: A Novel VPN Encrypted Traffic Classification Model Based on Graph Attention Neural Network". In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 437–56. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-24386-8_24.
Wu, Xinhui, Weizhi An, Shiqi Yu, Weiyu Guo, and Edel B. García. "Spatial-Temporal Graph Attention Network for Video-Based Gait Recognition". In Lecture Notes in Computer Science, 274–86. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41299-9_22.
Liu, Lu, Xiao Song, Bingli Sun, Guanghong Gong, and Wenxin Li. "MMoE-GAT: A Multi-Gate Mixture-of-Experts Boosted Graph Attention Network for Aircraft Engine Remaining Useful Life Prediction". In Communications in Computer and Information Science, 451–65. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-7240-1_36.
Nerrise, Favour, Qingyu Zhao, Kathleen L. Poston, Kilian M. Pohl, and Ehsan Adeli. "An Explainable Geometric-Weighted Graph Attention Network for Identifying Functional Networks Associated with Gait Impairment". In Lecture Notes in Computer Science, 723–33. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43895-0_68.
Samanta, Aritra, Shyamali Mitra, Biplab Banerjee, and Nibaran Das. "Cluster—GAT: Mixing Convolutional and Self-Attended Feature Maps Using Graph Attention Networks for Cervical Cell Classification". In Proceedings of 4th International Conference on Frontiers in Computing and Systems, 409–18. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-2611-0_28.
Zhang, Xueya, Tong Zhang, Wenting Zhao, Zhen Cui, and Jian Yang. "Dual-Attention Graph Convolutional Network". In Lecture Notes in Computer Science, 238–51. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41299-9_19.
Long, Yunfei, Huosheng Xu, Pengyuan Qi, Liguo Zhang, and Jun Li. "Graph Attention Network for Word Embeddings". In Lecture Notes in Computer Science, 191–201. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78612-0_16.
Wang, Ziming, Jun Chen, and Haopeng Chen. "EGAT: Edge-Featured Graph Attention Network". In Lecture Notes in Computer Science, 253–64. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-86362-3_21.
Pełny tekst źródłaStreszczenia konferencji na temat "Graph attention network (GAT)"
Yuan, Xiaosong, Ke Chen, Wanli Zuo, and Yijia Zhang. "TC-GAT: Graph Attention Network for Temporal Causality Discovery". In 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023. http://dx.doi.org/10.1109/ijcnn54540.2023.10191712.
Huang, Deling, Xialong Tong, and Haodong Yang. "Web Service Recommendation based on Graph Attention Network (GAT-WSR)". In 2022 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2022. http://dx.doi.org/10.1109/iccci54379.2022.9740941.
Ahmed, Ahmed N., Ali Anwar, Siegfried Mercelis, Steven Latre, and Peter Hellinckx. "FF-GAT: Feature Fusion Using Graph Attention Networks". In IECON 2021 - 47th Annual Conference of the IEEE Industrial Electronics Society. IEEE, 2021. http://dx.doi.org/10.1109/iecon48115.2021.9589579.
Fang, Yujie, Jie Jiang, and Yixiang He. "Traffic Speed Prediction Based on LSTM-Graph Attention Network (L-GAT)". In 2021 4th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE). IEEE, 2021. http://dx.doi.org/10.1109/aemcse51986.2021.00163.
Zheng, Zhaohua, Jianfang Li, Lingjie Zhu, Honghua Li, Frank Petzold, and Ping Tan. "GAT-CADNet: Graph Attention Network for Panoptic Symbol Spotting in CAD Drawings". In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2022. http://dx.doi.org/10.1109/cvpr52688.2022.01145.
Lu, Xiaofeng, Xiaoyu Zhang, and Pietro Lio. "GAT-DNS: DNS Multivariate Time Series Prediction Model Based on Graph Attention Network". In WWW '23: The ACM Web Conference 2023. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3543873.3587329.
Liang, Shuo, Wei Wei, Xian-Ling Mao, Fei Wang, and Zhiyong He. "BiSyn-GAT+: Bi-Syntax Aware Graph Attention Network for Aspect-based Sentiment Analysis". In Findings of the Association for Computational Linguistics: ACL 2022. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.findings-acl.144.
Mahto, Dinesh Kumar, Vikash Kumar Saini, Akhilesh Mathur, Rajesh Kumar, and Rupesh Yadav. "GAT-DNet: High Fidelity Graph Attention Network for Distribution Optimal Power Flow Pursuit". In 2023 9th IEEE India International Conference on Power Electronics (IICPE). IEEE, 2023. http://dx.doi.org/10.1109/iicpe60303.2023.10474885.
Demir, Andac, Toshiaki Koike-Akino, Ye Wang, and Deniz Erdogmus. "EEG-GAT: Graph Attention Networks for Classification of Electroencephalogram (EEG) Signals". In 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). IEEE, 2022. http://dx.doi.org/10.1109/embc48229.2022.9871984.
Zhang, Tianqi. "VP-GAT: vector prior graph attention network for automated segment labeling of coronary arteries". In Fourteenth International Conference on Graphics and Image Processing (ICGIP 2022), edited by Liang Xiao and Jianru Xue. SPIE, 2023. http://dx.doi.org/10.1117/12.2680419.