Journal articles on the topic "SELF-ATTENTION MECHANISM"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the top 50 journal articles for research on the topic "SELF-ATTENTION MECHANISM".
Next to every work in the list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when these are available in the metadata.
Browse journal articles across many disciplines and organize your bibliography correctly.
Yang, Kehua, Yaodong Wang, Wei Zhang, Jiqing Yao, and Yuquan Le. "Keyphrase Generation Based on Self-Attention Mechanism." Computers, Materials & Continua 61, no. 2 (2019): 569–81. http://dx.doi.org/10.32604/cmc.2019.05952.
Liu, Siqi, Jiangshu Wei, Gang Liu, and Bei Zhou. "Image classification model based on large kernel attention mechanism and relative position self-attention mechanism." PeerJ Computer Science 9 (April 21, 2023): e1344. http://dx.doi.org/10.7717/peerj-cs.1344.
Zhu, Hu, Ze Wang, Yu Shi, Yingying Hua, Guoxia Xu, and Lizhen Deng. "Multimodal Fusion Method Based on Self-Attention Mechanism." Wireless Communications and Mobile Computing 2020 (September 23, 2020): 1–8. http://dx.doi.org/10.1155/2020/8843186.
Cao, Fude, Chunguang Zheng, Limin Huang, Aihua Wang, Jiong Zhang, Feng Zhou, Haoxue Ju, Haitao Guo, and Yuxia Du. "Research of Self-Attention in Image Segmentation." Journal of Information Technology Research 15, no. 1 (January 2022): 1–12. http://dx.doi.org/10.4018/jitr.298619.
Wu, Hongqiu, Ruixue Ding, Hai Zhao, Pengjun Xie, Fei Huang, and Min Zhang. "Adversarial Self-Attention for Language Understanding." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13727–35. http://dx.doi.org/10.1609/aaai.v37i11.26608.
Xie, Fei, Dalong Zhang, and Chengming Liu. "Global–Local Self-Attention Based Transformer for Speaker Verification." Applied Sciences 12, no. 19 (October 10, 2022): 10154. http://dx.doi.org/10.3390/app121910154.
Wang, Duofeng, Haifeng Hu, and Dihu Chen. "Transformer with sparse self-attention mechanism for image captioning." Electronics Letters 56, no. 15 (July 2020): 764–66. http://dx.doi.org/10.1049/el.2020.0635.
Li, Yujie, and Jintong Cai. "Point cloud classification network based on self-attention mechanism." Computers and Electrical Engineering 104 (December 2022): 108451. http://dx.doi.org/10.1016/j.compeleceng.2022.108451.
Brotchie, James, Wei Shao, Wenchao Li, and Allison Kealy. "Leveraging Self-Attention Mechanism for Attitude Estimation in Smartphones." Sensors 22, no. 22 (November 21, 2022): 9011. http://dx.doi.org/10.3390/s22229011.
Fan, Zhongkui, and Ye-Peng Guan. "Pedestrian attribute recognition based on dual self-attention mechanism." Computer Science and Information Systems, no. 00 (2023): 16. http://dx.doi.org/10.2298/csis220815016f.
Luo, Youtao, and Xiaoming Gao. "Lightweight Human Pose Estimation Based on Self-Attention Mechanism." Advances in Engineering Technology Research 4, no. 1 (March 21, 2023): 253. http://dx.doi.org/10.56028/aetr.4.1.253.2023.
Rendón-Segador, Fernando J., Juan A. Álvarez-García, and Angel Jesús Varela-Vaca. "Paying attention to cyber-attacks: A multi-layer perceptron with self-attention mechanism." Computers & Security 132 (September 2023): 103318. http://dx.doi.org/10.1016/j.cose.2023.103318.
Dai, Biyun, Jinlong Li, and Ruoyi Xu. "Multiple Positional Self-Attention Network for Text Classification." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7610–17. http://dx.doi.org/10.1609/aaai.v34i05.6261.
Lin, Zhihui, Maomao Li, Zhuobin Zheng, Yangyang Cheng, and Chun Yuan. "Self-Attention ConvLSTM for Spatiotemporal Prediction." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 07 (April 3, 2020): 11531–38. http://dx.doi.org/10.1609/aaai.v34i07.6819.
Nakata, Haruki, Kanji Tanaka, and Koji Takeda. "Exploring Self-Attention for Visual Intersection Classification." Journal of Advanced Computational Intelligence and Intelligent Informatics 27, no. 3 (May 20, 2023): 386–93. http://dx.doi.org/10.20965/jaciii.2023.p0386.
Bae, Ara, and Wooil Kim. "Speaker Verification Employing Combinations of Self-Attention Mechanisms." Electronics 9, no. 12 (December 21, 2020): 2201. http://dx.doi.org/10.3390/electronics9122201.
Wang, Yu, Liang Hu, Yang Wu, and Wanfu Gao. "Graph Multihead Attention Pooling with Self-Supervised Learning." Entropy 24, no. 12 (November 29, 2022): 1745. http://dx.doi.org/10.3390/e24121745.
Zheng, Jianming, Fei Cai, Taihua Shao, and Honghui Chen. "Self-Interaction Attention Mechanism-Based Text Representation for Document Classification." Applied Sciences 8, no. 4 (April 12, 2018): 613. http://dx.doi.org/10.3390/app8040613.
Wang, Yue, Guanci Yang, Shaobo Li, Yang Li, Ling He, and Dan Liu. "Arrhythmia classification algorithm based on multi-head self-attention mechanism." Biomedical Signal Processing and Control 79 (January 2023): 104206. http://dx.doi.org/10.1016/j.bspc.2022.104206.
Chun, Yutong, Chuansheng Wang, and Mingke He. "A Novel Clothing Attribute Representation Network-Based Self-Attention Mechanism." IEEE Access 8 (2020): 201762–69. http://dx.doi.org/10.1109/access.2020.3035781.
Hu, Wanting, Lu Cao, Qunsheng Ruan, and Qingfeng Wu. "Research on Anomaly Network Detection Based on Self-Attention Mechanism." Sensors 23, no. 11 (May 25, 2023): 5059. http://dx.doi.org/10.3390/s23115059.
Chen, Ziye, Mingming Gong, Yanwu Xu, Chaohui Wang, Kun Zhang, and Bo Du. "Compressed Self-Attention for Deep Metric Learning." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3561–68. http://dx.doi.org/10.1609/aaai.v34i04.5762.
Zhang, Zhiqin, Bo Zhang, Fen Li, and Dehua Kong. "Multihead Self Attention Hand Pose Estimation." E3S Web of Conferences 218 (2020): 03023. http://dx.doi.org/10.1051/e3sconf/202021803023.
Ji, Mingi, Weonyoung Joo, Kyungwoo Song, Yoon-Yeong Kim, and Il-Chul Moon. "Sequential Recommendation with Relation-Aware Kernelized Self-Attention." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4304–11. http://dx.doi.org/10.1609/aaai.v34i04.5854.
Zhao Ning, 赵宁, and 刘立波 Liu Libo. "融合自注意力机制的人物姿态迁移生成模型" [Person pose transfer generation model incorporating a self-attention mechanism]. Laser & Optoelectronics Progress 59, no. 4 (2022): 0410014. http://dx.doi.org/10.3788/lop202259.0410014.
Daihong, Jiang, Hu Yuanzheng, Dai Lei, and Peng Jin. "Facial Expression Recognition Based on Attention Mechanism." Scientific Programming 2021 (March 2, 2021): 1–10. http://dx.doi.org/10.1155/2021/6624251.
Tiwari, Prayag, Amit Kumar Jaiswal, Sahil Garg, and Ilsun You. "SANTM: Efficient Self-attention-driven Network for Text Matching." ACM Transactions on Internet Technology 22, no. 3 (August 31, 2022): 1–21. http://dx.doi.org/10.1145/3426971.
Yang, Zuoxi, and Shoubin Dong. "HSRec: Hierarchical self-attention incorporating knowledge graph for sequential recommendation." Journal of Intelligent & Fuzzy Systems 42, no. 4 (March 4, 2022): 3749–60. http://dx.doi.org/10.3233/jifs-211953.
Ma, Xin, Zhanzhan Liu, Mingxing Zheng, and Youqing Wang. "Application and exploration of self-attention mechanism in dynamic process monitoring." IFAC-PapersOnLine 55, no. 6 (2022): 139–44. http://dx.doi.org/10.1016/j.ifacol.2022.07.119.
Liang, Hong, Hui Zhou, Qian Zhang, and Ting Wu. "Object Detection Algorithm Based on Context Information and Self-Attention Mechanism." Symmetry 14, no. 5 (April 28, 2022): 904. http://dx.doi.org/10.3390/sym14050904.
Zhang, Ru, Xinjian Zhao, Jiaqi Li, Song Zhang, and Zhijie Shang. "A malicious code family classification method based on self-attention mechanism." Journal of Physics: Conference Series 2010, no. 1 (September 1, 2021): 012066. http://dx.doi.org/10.1088/1742-6596/2010/1/012066.
Ye, Rui-da, Wei-jie Wang, Liang He, Xiao-cen Chen, and Yue Xue. "RUL prediction of aero-engine based on residual self-attention mechanism." Optics and Precision Engineering 29, no. 6 (2021): 1482–90. http://dx.doi.org/10.37188/ope.20212906.1482.
Li, Jinsong, Jianhua Peng, Shuxin Liu, Lintianran Weng, and Cong Li. "Temporal link prediction in directed networks based on self-attention mechanism." Intelligent Data Analysis 26, no. 1 (January 13, 2022): 173–88. http://dx.doi.org/10.3233/ida-205524.
Cheng, Kefei, Yanan Yue, and Zhiwen Song. "Sentiment Classification Based on Part-of-Speech and Self-Attention Mechanism." IEEE Access 8 (2020): 16387–96. http://dx.doi.org/10.1109/access.2020.2967103.
Liao, Fei, Liangli Ma, Jingjing Pei, and Linshan Tan. "Combined Self-Attention Mechanism for Chinese Named Entity Recognition in Military." Future Internet 11, no. 8 (August 18, 2019): 180. http://dx.doi.org/10.3390/fi11080180.
Peng, Dunlu, Weiwei Yuan, and Cong Liu. "HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism." IEEE Access 7 (2019): 12620–29. http://dx.doi.org/10.1109/access.2019.2892565.
Zhou, Yuhang, Xiaoli Huo, Zhiqun Gu, Jiawei Zhang, Yi Ding, Rentao Gu, and Yuefeng Ji. "Self-Attention Mechanism-Based Multi-Channel QoT Estimation in Optical Networks." Photonics 10, no. 1 (January 6, 2023): 63. http://dx.doi.org/10.3390/photonics10010063.
Wu, Peiyang, Zongxu Pan, Hairong Tang, and Yuxin Hu. "Cloudformer: A Cloud-Removal Network Combining Self-Attention Mechanism and Convolution." Remote Sensing 14, no. 23 (December 3, 2022): 6132. http://dx.doi.org/10.3390/rs14236132.
Wang, Mei, Yu Yao, Hongbin Qiu, and Xiyu Song. "Adaptive Memory-Controlled Self-Attention for Polyphonic Sound Event Detection." Symmetry 14, no. 2 (February 12, 2022): 366. http://dx.doi.org/10.3390/sym14020366.
Liu, Guangjie, Xin Ma, Jinlong Zhu, Yu Zhang, Danyang Yang, Jianfeng Wang, and Yi Wang. "Individualized tourism recommendation based on self-attention." PLOS ONE 17, no. 8 (August 25, 2022): e0272319. http://dx.doi.org/10.1371/journal.pone.0272319.
Wei, Yupeng, Dazhong Wu, and Janis Terpenny. "Bearing remaining useful life prediction using self-adaptive graph convolutional networks with self-attention mechanism." Mechanical Systems and Signal Processing 188 (April 2023): 110010. http://dx.doi.org/10.1016/j.ymssp.2022.110010.
Ma, Suling. "A Study of Two-Way Short- and Long-Term Memory Network Intelligent Computing IoT Model-Assisted Home Education Attention Mechanism." Computational Intelligence and Neuroscience 2021 (December 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/3587884.
Jiang, Cheng, Yuanxi Peng, Xuebin Tang, Chunchao Li, and Teng Li. "PointSwin: Modeling Self-Attention with Shifted Window on Point Cloud." Applied Sciences 12, no. 24 (December 9, 2022): 12616. http://dx.doi.org/10.3390/app122412616.
Pan, Wenxia. "English Machine Translation Model Based on an Improved Self-Attention Technology." Scientific Programming 2021 (December 23, 2021): 1–11. http://dx.doi.org/10.1155/2021/2601480.
Ishizuka, Ryoto, Ryo Nishikimi, and Kazuyoshi Yoshii. "Global Structure-Aware Drum Transcription Based on Self-Attention Mechanisms." Signals 2, no. 3 (August 13, 2021): 508–26. http://dx.doi.org/10.3390/signals2030031.
Hu, Gensheng, Lidong Qian, Dong Liang, and Mingzhu Wan. "Self-adversarial Training and Attention for Multi-task Wheat Phenotyping." Applied Engineering in Agriculture 35, no. 6 (2019): 1009–14. http://dx.doi.org/10.13031/aea.13406.
Fang, Yong, Shaoshuai Yang, Bin Zhao, and Cheng Huang. "Cyberbullying Detection in Social Networks Using Bi-GRU with Self-Attention Mechanism." Information 12, no. 4 (April 16, 2021): 171. http://dx.doi.org/10.3390/info12040171.
Fernández-Llaneza, Daniel, Silas Ulander, Dea Gogishvili, Eva Nittinger, Hongtao Zhao, and Christian Tyrchan. "Siamese Recurrent Neural Network with a Self-Attention Mechanism for Bioactivity Prediction." ACS Omega 6, no. 16 (April 15, 2021): 11086–94. http://dx.doi.org/10.1021/acsomega.1c01266.
Chen, Shuai, Lin Luo, Qilei Xia, and Lunjie Wang. "Self-attention Mechanism based Dynamic Fault Diagnosis and Classification for Chemical Processes." Journal of Physics: Conference Series 1914, no. 1 (May 1, 2021): 012046. http://dx.doi.org/10.1088/1742-6596/1914/1/012046.
Nkabiti, Kabo Poloko, and Yueyun Chen. "Application of solely self-attention mechanism in CSI-fingerprinting-based indoor localization." Neural Computing and Applications 33, no. 15 (January 18, 2021): 9185–98. http://dx.doi.org/10.1007/s00521-020-05681-1.