Journal articles on the topic "Self-attention mechanisms"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 journal articles for your research on the topic "Self-attention mechanisms".
Next to every source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever these are available in the metadata.
Browse journal articles across a wide range of disciplines and compile your bibliography correctly.
Makarov, Ilya, Maria Bakhanova, Sergey Nikolenko, and Olga Gerasimova. "Self-supervised recurrent depth estimation with attention mechanisms." PeerJ Computer Science 8 (January 31, 2022): e865. http://dx.doi.org/10.7717/peerj-cs.865.
Bae, Ara, and Wooil Kim. "Speaker Verification Employing Combinations of Self-Attention Mechanisms." Electronics 9, no. 12 (December 21, 2020): 2201. http://dx.doi.org/10.3390/electronics9122201.
Cao, Fude, Chunguang Zheng, Limin Huang, Aihua Wang, Jiong Zhang, Feng Zhou, Haoxue Ju, Haitao Guo, and Yuxia Du. "Research of Self-Attention in Image Segmentation." Journal of Information Technology Research 15, no. 1 (January 2022): 1–12. http://dx.doi.org/10.4018/jitr.298619.
Dai, Biyun, Jinlong Li, and Ruoyi Xu. "Multiple Positional Self-Attention Network for Text Classification." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7610–17. http://dx.doi.org/10.1609/aaai.v34i05.6261.
Xie, Fei, Dalong Zhang, and Chengming Liu. "Global–Local Self-Attention Based Transformer for Speaker Verification." Applied Sciences 12, no. 19 (October 10, 2022): 10154. http://dx.doi.org/10.3390/app121910154.
Ishizuka, Ryoto, Ryo Nishikimi, and Kazuyoshi Yoshii. "Global Structure-Aware Drum Transcription Based on Self-Attention Mechanisms." Signals 2, no. 3 (August 13, 2021): 508–26. http://dx.doi.org/10.3390/signals2030031.
Zhu, Hu, Ze Wang, Yu Shi, Yingying Hua, Guoxia Xu, and Lizhen Deng. "Multimodal Fusion Method Based on Self-Attention Mechanism." Wireless Communications and Mobile Computing 2020 (September 23, 2020): 1–8. http://dx.doi.org/10.1155/2020/8843186.
Posner, Michael I., and Mary K. Rothbart. "Developing mechanisms of self-regulation." Development and Psychopathology 12, no. 3 (September 2000): 427–41. http://dx.doi.org/10.1017/s0954579400003096.
Tiwari, Prayag, Amit Kumar Jaiswal, Sahil Garg, and Ilsun You. "SANTM: Efficient Self-attention-driven Network for Text Matching." ACM Transactions on Internet Technology 22, no. 3 (August 31, 2022): 1–21. http://dx.doi.org/10.1145/3426971.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms." F1000Research 10 (May 17, 2022): 1001. http://dx.doi.org/10.12688/f1000research.73131.2.
Lin, Hung-Hsiang, Jiun-Da Lin, Jose Jaena Mari Ople, Jun-Cheng Chen, and Kai-Lung Hua. "Social Media Popularity Prediction Based on Multi-Modal Self-Attention Mechanisms." IEEE Access 10 (2022): 4448–55. http://dx.doi.org/10.1109/access.2021.3136552.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms." F1000Research 10 (October 4, 2021): 1001. http://dx.doi.org/10.12688/f1000research.73131.1.
Baer, Ruth A. "Self-Focused Attention and Mechanisms of Change in Mindfulness-Based Treatment." Cognitive Behaviour Therapy 38, sup1 (January 2009): 15–20. http://dx.doi.org/10.1080/16506070902980703.
Lo, Ronda F., Andy H. Ng, Adam S. Cohen, and Joni Y. Sasaki. "Does self-construal shape automatic social attention?" PLOS ONE 16, no. 2 (February 10, 2021): e0246577. http://dx.doi.org/10.1371/journal.pone.0246577.
Kaiser, Roselinde H., Hannah R. Snyder, Franziska Goer, Rachel Clegg, Manon Ironside, and Diego A. Pizzagalli. "Attention Bias in Rumination and Depression: Cognitive Mechanisms and Brain Networks." Clinical Psychological Science 6, no. 6 (September 21, 2018): 765–82. http://dx.doi.org/10.1177/2167702618797935.
Chen, Shouyan, Mingyan Zhang, Xiaofen Yang, Zhijia Zhao, Tao Zou, and Xinqi Sun. "The Impact of Attention Mechanisms on Speech Emotion Recognition." Sensors 21, no. 22 (November 12, 2021): 7530. http://dx.doi.org/10.3390/s21227530.
Springer, Anne, Juliane Beyer, Jan Derrfuss, Kirsten G. Volz, and Bettina Hannover. "Seeing You or the Scene? Self-Construals Modulate Inhibitory Mechanisms of Attention." Social Cognition 30, no. 2 (April 2012): 133–52. http://dx.doi.org/10.1521/soco.2012.30.2.133.
Sun, Yange, Meng Li, Huaping Guo, and Li Zhang. "MSGSA: Multi-Scale Guided Self-Attention Network for Crowd Counting." Electronics 12, no. 12 (June 11, 2023): 2631. http://dx.doi.org/10.3390/electronics12122631.
Wang, Mei, Yu Yao, Hongbin Qiu, and Xiyu Song. "Adaptive Memory-Controlled Self-Attention for Polyphonic Sound Event Detection." Symmetry 14, no. 2 (February 12, 2022): 366. http://dx.doi.org/10.3390/sym14020366.
Zhou, Qian, Hua Zou, and Huanhuan Wu. "LGViT: A Local and Global Vision Transformer with Dynamic Contextual Position Bias Using Overlapping Windows." Applied Sciences 13, no. 3 (February 3, 2023): 1993. http://dx.doi.org/10.3390/app13031993.
Posner, Michael I., Mary K. Rothbart, Brad E. Sheese, and Pascale Voelker. "Developing Attention: Behavioral and Brain Mechanisms." Advances in Neuroscience 2014 (May 8, 2014): 1–9. http://dx.doi.org/10.1155/2014/405094.
Lin, Zhicheng, and Shihui Han. "Self-construal priming modulates the scope of visual attention." Quarterly Journal of Experimental Psychology 62, no. 4 (April 2009): 802–13. http://dx.doi.org/10.1080/17470210802271650.
Han, Suk Won, and Cheol Hwan Kim. "Neurocognitive Mechanisms Underlying Internet/Smartphone Addiction: A Preliminary fMRI Study." Tomography 8, no. 4 (July 11, 2022): 1781–90. http://dx.doi.org/10.3390/tomography8040150.
Nagai, Yukie, Koh Hosoda, Akio Morita, and Minoru Asada. "Emergence of Joint Attention through Bootstrap Learning based on the Mechanisms of Visual Attention and Learning with Self-evaluation." Transactions of the Japanese Society for Artificial Intelligence 19 (2004): 10–19. http://dx.doi.org/10.1527/tjsai.19.10.
Ren, Xudie, Jialve Wang, and Shenghong Li. "MAM: Multiple Attention Mechanism Neural Networks for Cross-Age Face Recognition." Wireless Communications and Mobile Computing 2022 (April 30, 2022): 1–11. http://dx.doi.org/10.1155/2022/8546029.
Kuo, Yu-Chen, Ching-Bang Yao, and Chen-Yu Wu. "A Strategy for Enhancing English Learning Achievement, Based on the Eye-Tracking Technology with Self-Regulated Learning." Sustainability 14, no. 23 (December 6, 2022): 16286. http://dx.doi.org/10.3390/su142316286.
Zhu, Yuhua, Hang Li, Tong Zhen, and Zhihui Li. "Integrating Self-Attention Mechanisms and ResNet for Grain Storage Ventilation Decision Making: A Study." Applied Sciences 13, no. 13 (June 28, 2023): 7655. http://dx.doi.org/10.3390/app13137655.
Diao, Zhifeng, and Fanglei Sun. "Visual Object Tracking Based on Deep Neural Network." Mathematical Problems in Engineering 2022 (July 12, 2022): 1–9. http://dx.doi.org/10.1155/2022/2154463.
Gao, Yue, Di Li, Xiangjian Chen, and Junwu Zhu. "Attention-Based Mechanisms for Cognitive Reinforcement Learning." Applied Sciences 13, no. 13 (June 21, 2023): 7361. http://dx.doi.org/10.3390/app13137361.
Wang, Wanru, Yuwei Lv, Yonggang Wen, and Xuemei Sun. "Rumor Detection Based on Knowledge Enhancement and Graph Attention Network." Discrete Dynamics in Nature and Society 2022 (October 6, 2022): 1–12. http://dx.doi.org/10.1155/2022/6257658.
Niu, Jinxing, Shuo Liu, Hanbing Li, Tao Zhang, and Lijun Wang. "Grasp Detection Combining Self-Attention with CNN in Complex Scenes." Applied Sciences 13, no. 17 (August 25, 2023): 9655. http://dx.doi.org/10.3390/app13179655.
Ma, Suling. "A Study of Two-Way Short- and Long-Term Memory Network Intelligent Computing IoT Model-Assisted Home Education Attention Mechanism." Computational Intelligence and Neuroscience 2021 (December 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/3587884.
Kardakis, Spyridon, Isidoros Perikos, Foteini Grivokostopoulou, and Ioannis Hatzilygeroudis. "Examining Attention Mechanisms in Deep Learning Models for Sentiment Analysis." Applied Sciences 11, no. 9 (April 25, 2021): 3883. http://dx.doi.org/10.3390/app11093883.
Hendricks, Lisa Anne, John Mellor, Rosalia Schneider, Jean-Baptiste Alayrac, and Aida Nematzadeh. "Decoupling the Role of Data, Attention, and Losses in Multimodal Transformers." Transactions of the Association for Computational Linguistics 9 (2021): 570–85. http://dx.doi.org/10.1162/tacl_a_00385.
Zhang, Shugang, Mingjian Jiang, Shuang Wang, Xiaofeng Wang, Zhiqiang Wei, and Zhen Li. "SAG-DTA: Prediction of Drug–Target Affinity Using Self-Attention Graph Network." International Journal of Molecular Sciences 22, no. 16 (August 20, 2021): 8993. http://dx.doi.org/10.3390/ijms22168993.
Reimann, Jan Niclas, Andreas Schwung, and Steven X. Ding. "Adopting attention-mechanisms for Neural Logic Rule Layers." at - Automatisierungstechnik 70, no. 3 (March 1, 2022): 257–66. http://dx.doi.org/10.1515/auto-2021-0136.
Schäfer, Sarah, Dirk Wentura, and Christian Frings. "Creating a network of importance: The particular effects of self-relevance on stimulus processing." Attention, Perception, & Psychophysics 82, no. 7 (June 17, 2020): 3750–66. http://dx.doi.org/10.3758/s13414-020-02070-7.
Zhou, Wei, Zhongwei Qu, Lianen Qu, Xupeng Wang, Yilin Shi, De Zhang, and Zhenlin Hui. "Radar Echo Maps Prediction Using an Improved MIM with Self-Attention Memory Module." Journal of Sensors 2023 (July 18, 2023): 1–12. http://dx.doi.org/10.1155/2023/8876971.
Li, Yabei, Minjun Liang, Mingyang Wei, Ge Wang, and Yanan Li. "Mechanisms and Applications of Attention in Medical Image Segmentation: A Review." Academic Journal of Science and Technology 5, no. 3 (May 5, 2023): 237–43. http://dx.doi.org/10.54097/ajst.v5i3.8021.
Zhang, Rongkai, Ying Zeng, Li Tong, and Bin Yan. "Specific Neural Mechanisms of Self-Cognition and the Application of Brainprint Recognition." Biology 12, no. 3 (March 22, 2023): 486. http://dx.doi.org/10.3390/biology12030486.
Mörtberg, Ewa, Asle Hoffart, Benjamin Boecking, and David M. Clark. "Shifting the Focus of One's Attention Mediates Improvement in Cognitive Therapy for Social Anxiety Disorder." Behavioural and Cognitive Psychotherapy 43, no. 1 (August 28, 2013): 63–73. http://dx.doi.org/10.1017/s1352465813000738.
Fichten, Catherine S., Harriet Lennox, Kristen Robillard, John Wright, Stéphane Sabourin, and Rhonda Amsel. "Attentional Focus and Attitudes Toward Peers with Disabilities: Self Focusing and A Comparison of Modeling and Self-Disclosure." Journal of Applied Rehabilitation Counseling 27, no. 4 (December 1, 1996): 30–39. http://dx.doi.org/10.1891/0047-2220.27.4.30.
Zhijian, Lyu, Jiang Shaohua, and Tan Yonghao. "DSAGLSTM-DTA: Prediction of Drug-Target Affinity using Dual Self-Attention and LSTM." Machine Learning and Applications: An International Journal 9, no. 02 (June 30, 2022): 1–19. http://dx.doi.org/10.5121/mlaij.2022.9201.
Song, Jiang, Jianguo Qian, Zhengjun Liu, Yang Jiao, Jiahui Zhou, Yongrong Li, Yiming Chen, Jie Guo, and Zhiqiang Wang. "Research on Arc Sag Measurement Methods for Transmission Lines Based on Deep Learning and Photogrammetry Technology." Remote Sensing 15, no. 10 (May 11, 2023): 2533. http://dx.doi.org/10.3390/rs15102533.
Gilboa-Schechtman, E., and R. Azoulay. "Treatment of Social Anxiety Disorder: Mechanisms, Techniques, and Empirically Supported Interventions." Клиническая и специальная психология 11, no. 2 (2022): 1–21. http://dx.doi.org/10.17759/cpse.2022110201.
Wu, Sitong, Tianyi Wu, Haoru Tan, and Guodong Guo. "Pale Transformer: A General Vision Transformer Backbone with Pale-Shaped Attention." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 2731–39. http://dx.doi.org/10.1609/aaai.v36i3.20176.
Yan, Wenhui, Wending Tang, Lihua Wang, Yannan Bin, and Junfeng Xia. "PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization." PLOS Computational Biology 18, no. 9 (September 12, 2022): e1010511. http://dx.doi.org/10.1371/journal.pcbi.1010511.
You, Yujie, Le Zhang, Peng Tao, Suran Liu, and Luonan Chen. "Spatiotemporal Transformer Neural Network for Time-Series Forecasting." Entropy 24, no. 11 (November 14, 2022): 1651. http://dx.doi.org/10.3390/e24111651.
Elster, Jon. "Self-poisoning of the mind." Philosophical Transactions of the Royal Society B: Biological Sciences 365, no. 1538 (January 27, 2010): 221–26. http://dx.doi.org/10.1098/rstb.2009.0176.
Marmolejo-Martínez-Artesero, Sara, Caty Casas, and David Romeo-Guitart. "Endogenous Mechanisms of Neuroprotection: To Boost or Not to Be." Cells 10, no. 2 (February 10, 2021): 370. http://dx.doi.org/10.3390/cells10020370.