Journal articles on the topic "SELF-ATTENTION MECHANISM"
Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles
Consult the top 50 journal articles for your research on the topic "SELF-ATTENTION MECHANISM".
An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a ".pdf" file and read its abstract online, whenever the relevant parameters are provided in the work's metadata.
Browse journal articles from many disciplines and compile your bibliography correctly.
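All of the works listed below build on the self-attention mechanism. For orientation only, here is a minimal sketch of the standard scaled dot-product formulation (a textbook version, not taken from any particular paper below; the NumPy implementation and all variable names are illustrative assumptions):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X of shape (n, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                 # project inputs to queries, keys, values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax: each token's weights sum to 1
    return weights @ V                               # attention-weighted mixture of value vectors

# toy example: 4 tokens, model width 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one output vector per input token
```

The papers below vary this core idea (multi-head, sparse, compressed, global-local, relative-position variants, and so on) across application domains.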
Yang, Kehua, Yaodong Wang, Wei Zhang, Jiqing Yao, and Yuquan Le. "Keyphrase Generation Based on Self-Attention Mechanism". Computers, Materials & Continua 61, no. 2 (2019): 569–81. http://dx.doi.org/10.32604/cmc.2019.05952.
Liu, Siqi, Jiangshu Wei, Gang Liu, and Bei Zhou. "Image classification model based on large kernel attention mechanism and relative position self-attention mechanism". PeerJ Computer Science 9 (April 21, 2023): e1344. http://dx.doi.org/10.7717/peerj-cs.1344.
Zhu, Hu, Ze Wang, Yu Shi, Yingying Hua, Guoxia Xu, and Lizhen Deng. "Multimodal Fusion Method Based on Self-Attention Mechanism". Wireless Communications and Mobile Computing 2020 (September 23, 2020): 1–8. http://dx.doi.org/10.1155/2020/8843186.
Cao, Fude, Chunguang Zheng, Limin Huang, Aihua Wang, Jiong Zhang, Feng Zhou, Haoxue Ju, Haitao Guo, and Yuxia Du. "Research of Self-Attention in Image Segmentation". Journal of Information Technology Research 15, no. 1 (January 2022): 1–12. http://dx.doi.org/10.4018/jitr.298619.
Wu, Hongqiu, Ruixue Ding, Hai Zhao, Pengjun Xie, Fei Huang, and Min Zhang. "Adversarial Self-Attention for Language Understanding". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13727–35. http://dx.doi.org/10.1609/aaai.v37i11.26608.
Xie, Fei, Dalong Zhang, and Chengming Liu. "Global–Local Self-Attention Based Transformer for Speaker Verification". Applied Sciences 12, no. 19 (October 10, 2022): 10154. http://dx.doi.org/10.3390/app121910154.
Wang, Duofeng, Haifeng Hu, and Dihu Chen. "Transformer with sparse self-attention mechanism for image captioning". Electronics Letters 56, no. 15 (July 2020): 764–66. http://dx.doi.org/10.1049/el.2020.0635.
Li, Yujie, and Jintong Cai. "Point cloud classification network based on self-attention mechanism". Computers and Electrical Engineering 104 (December 2022): 108451. http://dx.doi.org/10.1016/j.compeleceng.2022.108451.
Brotchie, James, Wei Shao, Wenchao Li, and Allison Kealy. "Leveraging Self-Attention Mechanism for Attitude Estimation in Smartphones". Sensors 22, no. 22 (November 21, 2022): 9011. http://dx.doi.org/10.3390/s22229011.
Fan, Zhongkui, and Ye-Peng Guan. "Pedestrian attribute recognition based on dual self-attention mechanism". Computer Science and Information Systems, no. 00 (2023): 16. http://dx.doi.org/10.2298/csis220815016f.
Luo, Youtao, and Xiaoming Gao. "Lightweight Human Pose Estimation Based on Self-Attention Mechanism". Advances in Engineering Technology Research 4, no. 1 (March 21, 2023): 253. http://dx.doi.org/10.56028/aetr.4.1.253.2023.
Rendón-Segador, Fernando J., Juan A. Álvarez-García, and Angel Jesús Varela-Vaca. "Paying attention to cyber-attacks: A multi-layer perceptron with self-attention mechanism". Computers & Security 132 (September 2023): 103318. http://dx.doi.org/10.1016/j.cose.2023.103318.
Dai, Biyun, Jinlong Li, and Ruoyi Xu. "Multiple Positional Self-Attention Network for Text Classification". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7610–17. http://dx.doi.org/10.1609/aaai.v34i05.6261.
Lin, Zhihui, Maomao Li, Zhuobin Zheng, Yangyang Cheng, and Chun Yuan. "Self-Attention ConvLSTM for Spatiotemporal Prediction". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 07 (April 3, 2020): 11531–38. http://dx.doi.org/10.1609/aaai.v34i07.6819.
Nakata, Haruki, Kanji Tanaka, and Koji Takeda. "Exploring Self-Attention for Visual Intersection Classification". Journal of Advanced Computational Intelligence and Intelligent Informatics 27, no. 3 (May 20, 2023): 386–93. http://dx.doi.org/10.20965/jaciii.2023.p0386.
Bae, Ara, and Wooil Kim. "Speaker Verification Employing Combinations of Self-Attention Mechanisms". Electronics 9, no. 12 (December 21, 2020): 2201. http://dx.doi.org/10.3390/electronics9122201.
Wang, Yu, Liang Hu, Yang Wu, and Wanfu Gao. "Graph Multihead Attention Pooling with Self-Supervised Learning". Entropy 24, no. 12 (November 29, 2022): 1745. http://dx.doi.org/10.3390/e24121745.
Zheng, Jianming, Fei Cai, Taihua Shao, and Honghui Chen. "Self-Interaction Attention Mechanism-Based Text Representation for Document Classification". Applied Sciences 8, no. 4 (April 12, 2018): 613. http://dx.doi.org/10.3390/app8040613.
Wang, Yue, Guanci Yang, Shaobo Li, Yang Li, Ling He, and Dan Liu. "Arrhythmia classification algorithm based on multi-head self-attention mechanism". Biomedical Signal Processing and Control 79 (January 2023): 104206. http://dx.doi.org/10.1016/j.bspc.2022.104206.
Chun, Yutong, Chuansheng Wang, and Mingke He. "A Novel Clothing Attribute Representation Network-Based Self-Attention Mechanism". IEEE Access 8 (2020): 201762–69. http://dx.doi.org/10.1109/access.2020.3035781.
Hu, Wanting, Lu Cao, Qunsheng Ruan, and Qingfeng Wu. "Research on Anomaly Network Detection Based on Self-Attention Mechanism". Sensors 23, no. 11 (May 25, 2023): 5059. http://dx.doi.org/10.3390/s23115059.
Chen, Ziye, Mingming Gong, Yanwu Xu, Chaohui Wang, Kun Zhang, and Bo Du. "Compressed Self-Attention for Deep Metric Learning". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3561–68. http://dx.doi.org/10.1609/aaai.v34i04.5762.
Zhang, Zhiqin, Bo Zhang, Fen Li, and Dehua Kong. "Multihead Self Attention Hand Pose Estimation". E3S Web of Conferences 218 (2020): 03023. http://dx.doi.org/10.1051/e3sconf/202021803023.
Ji, Mingi, Weonyoung Joo, Kyungwoo Song, Yoon-Yeong Kim, and Il-Chul Moon. "Sequential Recommendation with Relation-Aware Kernelized Self-Attention". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4304–11. http://dx.doi.org/10.1609/aaai.v34i04.5854.
Zhao Ning, 赵宁, and 刘立波 Liu Libo. "融合自注意力机制的人物姿态迁移生成模型". Laser & Optoelectronics Progress 59, no. 4 (2022): 0410014. http://dx.doi.org/10.3788/lop202259.0410014.
Daihong, Jiang, Hu Yuanzheng, Dai Lei, and Peng Jin. "Facial Expression Recognition Based on Attention Mechanism". Scientific Programming 2021 (March 2, 2021): 1–10. http://dx.doi.org/10.1155/2021/6624251.
Tiwari, Prayag, Amit Kumar Jaiswal, Sahil Garg, and Ilsun You. "SANTM: Efficient Self-attention-driven Network for Text Matching". ACM Transactions on Internet Technology 22, no. 3 (August 31, 2022): 1–21. http://dx.doi.org/10.1145/3426971.
Yang, Zuoxi, and Shoubin Dong. "HSRec: Hierarchical self-attention incorporating knowledge graph for sequential recommendation". Journal of Intelligent & Fuzzy Systems 42, no. 4 (March 4, 2022): 3749–60. http://dx.doi.org/10.3233/jifs-211953.
Ma, Xin, Zhanzhan Liu, Mingxing Zheng, and Youqing Wang. "Application and exploration of self-attention mechanism in dynamic process monitoring". IFAC-PapersOnLine 55, no. 6 (2022): 139–44. http://dx.doi.org/10.1016/j.ifacol.2022.07.119.
Liang, Hong, Hui Zhou, Qian Zhang, and Ting Wu. "Object Detection Algorithm Based on Context Information and Self-Attention Mechanism". Symmetry 14, no. 5 (April 28, 2022): 904. http://dx.doi.org/10.3390/sym14050904.
Zhang, Ru, Xinjian Zhao, Jiaqi Li, Song Zhang, and Zhijie Shang. "A malicious code family classification method based on self-attention mechanism". Journal of Physics: Conference Series 2010, no. 1 (September 1, 2021): 012066. http://dx.doi.org/10.1088/1742-6596/2010/1/012066.
Ye, Rui-da, Wei-jie Wang, Liang He, Xiao-cen Chen, and Yue Xue. "RUL prediction of aero-engine based on residual self-attention mechanism". Optics and Precision Engineering 29, no. 6 (2021): 1482–90. http://dx.doi.org/10.37188/ope.20212906.1482.
Li, Jinsong, Jianhua Peng, Shuxin Liu, Lintianran Weng, and Cong Li. "Temporal link prediction in directed networks based on self-attention mechanism". Intelligent Data Analysis 26, no. 1 (January 13, 2022): 173–88. http://dx.doi.org/10.3233/ida-205524.
Cheng, Kefei, Yanan Yue, and Zhiwen Song. "Sentiment Classification Based on Part-of-Speech and Self-Attention Mechanism". IEEE Access 8 (2020): 16387–96. http://dx.doi.org/10.1109/access.2020.2967103.
Liao, Fei, Liangli Ma, Jingjing Pei, and Linshan Tan. "Combined Self-Attention Mechanism for Chinese Named Entity Recognition in Military". Future Internet 11, no. 8 (August 18, 2019): 180. http://dx.doi.org/10.3390/fi11080180.
Peng, Dunlu, Weiwei Yuan, and Cong Liu. "HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism". IEEE Access 7 (2019): 12620–29. http://dx.doi.org/10.1109/access.2019.2892565.
Zhou, Yuhang, Xiaoli Huo, Zhiqun Gu, Jiawei Zhang, Yi Ding, Rentao Gu, and Yuefeng Ji. "Self-Attention Mechanism-Based Multi-Channel QoT Estimation in Optical Networks". Photonics 10, no. 1 (January 6, 2023): 63. http://dx.doi.org/10.3390/photonics10010063.
Wu, Peiyang, Zongxu Pan, Hairong Tang, and Yuxin Hu. "Cloudformer: A Cloud-Removal Network Combining Self-Attention Mechanism and Convolution". Remote Sensing 14, no. 23 (December 3, 2022): 6132. http://dx.doi.org/10.3390/rs14236132.
Wang, Mei, Yu Yao, Hongbin Qiu, and Xiyu Song. "Adaptive Memory-Controlled Self-Attention for Polyphonic Sound Event Detection". Symmetry 14, no. 2 (February 12, 2022): 366. http://dx.doi.org/10.3390/sym14020366.
Liu, Guangjie, Xin Ma, Jinlong Zhu, Yu Zhang, Danyang Yang, Jianfeng Wang, and Yi Wang. "Individualized tourism recommendation based on self-attention". PLOS ONE 17, no. 8 (August 25, 2022): e0272319. http://dx.doi.org/10.1371/journal.pone.0272319.
Wei, Yupeng, Dazhong Wu, and Janis Terpenny. "Bearing remaining useful life prediction using self-adaptive graph convolutional networks with self-attention mechanism". Mechanical Systems and Signal Processing 188 (April 2023): 110010. http://dx.doi.org/10.1016/j.ymssp.2022.110010.
Ma, Suling. "A Study of Two-Way Short- and Long-Term Memory Network Intelligent Computing IoT Model-Assisted Home Education Attention Mechanism". Computational Intelligence and Neuroscience 2021 (December 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/3587884.
Jiang, Cheng, Yuanxi Peng, Xuebin Tang, Chunchao Li, and Teng Li. "PointSwin: Modeling Self-Attention with Shifted Window on Point Cloud". Applied Sciences 12, no. 24 (December 9, 2022): 12616. http://dx.doi.org/10.3390/app122412616.
Pan, Wenxia. "English Machine Translation Model Based on an Improved Self-Attention Technology". Scientific Programming 2021 (December 23, 2021): 1–11. http://dx.doi.org/10.1155/2021/2601480.
Ishizuka, Ryoto, Ryo Nishikimi, and Kazuyoshi Yoshii. "Global Structure-Aware Drum Transcription Based on Self-Attention Mechanisms". Signals 2, no. 3 (August 13, 2021): 508–26. http://dx.doi.org/10.3390/signals2030031.
Hu, Gensheng, Lidong Qian, Dong Liang, and Mingzhu Wan. "Self-adversarial Training and Attention for Multi-task Wheat Phenotyping". Applied Engineering in Agriculture 35, no. 6 (2019): 1009–14. http://dx.doi.org/10.13031/aea.13406.
Fang, Yong, Shaoshuai Yang, Bin Zhao, and Cheng Huang. "Cyberbullying Detection in Social Networks Using Bi-GRU with Self-Attention Mechanism". Information 12, no. 4 (April 16, 2021): 171. http://dx.doi.org/10.3390/info12040171.
Fernández-Llaneza, Daniel, Silas Ulander, Dea Gogishvili, Eva Nittinger, Hongtao Zhao, and Christian Tyrchan. "Siamese Recurrent Neural Network with a Self-Attention Mechanism for Bioactivity Prediction". ACS Omega 6, no. 16 (April 15, 2021): 11086–94. http://dx.doi.org/10.1021/acsomega.1c01266.
Chen, Shuai, Lin Luo, Qilei Xia, and Lunjie Wang. "Self-attention Mechanism based Dynamic Fault Diagnosis and Classification for Chemical Processes". Journal of Physics: Conference Series 1914, no. 1 (May 1, 2021): 012046. http://dx.doi.org/10.1088/1742-6596/1914/1/012046.
Nkabiti, Kabo Poloko, and Yueyun Chen. "Application of solely self-attention mechanism in CSI-fingerprinting-based indoor localization". Neural Computing and Applications 33, no. 15 (January 18, 2021): 9185–98. http://dx.doi.org/10.1007/s00521-020-05681-1.