Journal articles on the topic "Self-attention mechanisms"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic "Self-attention mechanisms".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Explore journal articles from a wide variety of disciplines and organize your bibliography correctly.
Makarov, Ilya, Maria Bakhanova, Sergey Nikolenko, and Olga Gerasimova. "Self-supervised recurrent depth estimation with attention mechanisms". PeerJ Computer Science 8 (January 31, 2022): e865. http://dx.doi.org/10.7717/peerj-cs.865.
Bae, Ara, and Wooil Kim. "Speaker Verification Employing Combinations of Self-Attention Mechanisms". Electronics 9, no. 12 (December 21, 2020): 2201. http://dx.doi.org/10.3390/electronics9122201.
Cao, Fude, Chunguang Zheng, Limin Huang, Aihua Wang, Jiong Zhang, Feng Zhou, Haoxue Ju, Haitao Guo, and Yuxia Du. "Research of Self-Attention in Image Segmentation". Journal of Information Technology Research 15, no. 1 (January 2022): 1–12. http://dx.doi.org/10.4018/jitr.298619.
Dai, Biyun, Jinlong Li, and Ruoyi Xu. "Multiple Positional Self-Attention Network for Text Classification". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7610–17. http://dx.doi.org/10.1609/aaai.v34i05.6261.
Xie, Fei, Dalong Zhang, and Chengming Liu. "Global–Local Self-Attention Based Transformer for Speaker Verification". Applied Sciences 12, no. 19 (October 10, 2022): 10154. http://dx.doi.org/10.3390/app121910154.
Ishizuka, Ryoto, Ryo Nishikimi, and Kazuyoshi Yoshii. "Global Structure-Aware Drum Transcription Based on Self-Attention Mechanisms". Signals 2, no. 3 (August 13, 2021): 508–26. http://dx.doi.org/10.3390/signals2030031.
Zhu, Hu, Ze Wang, Yu Shi, Yingying Hua, Guoxia Xu, and Lizhen Deng. "Multimodal Fusion Method Based on Self-Attention Mechanism". Wireless Communications and Mobile Computing 2020 (September 23, 2020): 1–8. http://dx.doi.org/10.1155/2020/8843186.
Posner, Michael I., and Mary K. Rothbart. "Developing mechanisms of self-regulation". Development and Psychopathology 12, no. 3 (September 2000): 427–41. http://dx.doi.org/10.1017/s0954579400003096.
Tiwari, Prayag, Amit Kumar Jaiswal, Sahil Garg, and Ilsun You. "SANTM: Efficient Self-attention-driven Network for Text Matching". ACM Transactions on Internet Technology 22, no. 3 (August 31, 2022): 1–21. http://dx.doi.org/10.1145/3426971.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms". F1000Research 10 (May 17, 2022): 1001. http://dx.doi.org/10.12688/f1000research.73131.2.
Lin, Hung-Hsiang, Jiun-Da Lin, Jose Jaena Mari Ople, Jun-Cheng Chen, and Kai-Lung Hua. "Social Media Popularity Prediction Based on Multi-Modal Self-Attention Mechanisms". IEEE Access 10 (2022): 4448–55. http://dx.doi.org/10.1109/access.2021.3136552.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms". F1000Research 10 (October 4, 2021): 1001. http://dx.doi.org/10.12688/f1000research.73131.1.
Baer, Ruth A. "Self-Focused Attention and Mechanisms of Change in Mindfulness-Based Treatment". Cognitive Behaviour Therapy 38, sup1 (January 2009): 15–20. http://dx.doi.org/10.1080/16506070902980703.
Lo, Ronda F., Andy H. Ng, Adam S. Cohen, and Joni Y. Sasaki. "Does self-construal shape automatic social attention?" PLOS ONE 16, no. 2 (February 10, 2021): e0246577. http://dx.doi.org/10.1371/journal.pone.0246577.
Kaiser, Roselinde H., Hannah R. Snyder, Franziska Goer, Rachel Clegg, Manon Ironside, and Diego A. Pizzagalli. "Attention Bias in Rumination and Depression: Cognitive Mechanisms and Brain Networks". Clinical Psychological Science 6, no. 6 (September 21, 2018): 765–82. http://dx.doi.org/10.1177/2167702618797935.
Chen, Shouyan, Mingyan Zhang, Xiaofen Yang, Zhijia Zhao, Tao Zou, and Xinqi Sun. "The Impact of Attention Mechanisms on Speech Emotion Recognition". Sensors 21, no. 22 (November 12, 2021): 7530. http://dx.doi.org/10.3390/s21227530.
Springer, Anne, Juliane Beyer, Jan Derrfuss, Kirsten G. Volz, and Bettina Hannover. "Seeing You or the Scene? Self-Construals Modulate Inhibitory Mechanisms of Attention". Social Cognition 30, no. 2 (April 2012): 133–52. http://dx.doi.org/10.1521/soco.2012.30.2.133.
Sun, Yange, Meng Li, Huaping Guo, and Li Zhang. "MSGSA: Multi-Scale Guided Self-Attention Network for Crowd Counting". Electronics 12, no. 12 (June 11, 2023): 2631. http://dx.doi.org/10.3390/electronics12122631.
Wang, Mei, Yu Yao, Hongbin Qiu, and Xiyu Song. "Adaptive Memory-Controlled Self-Attention for Polyphonic Sound Event Detection". Symmetry 14, no. 2 (February 12, 2022): 366. http://dx.doi.org/10.3390/sym14020366.
Zhou, Qian, Hua Zou, and Huanhuan Wu. "LGViT: A Local and Global Vision Transformer with Dynamic Contextual Position Bias Using Overlapping Windows". Applied Sciences 13, no. 3 (February 3, 2023): 1993. http://dx.doi.org/10.3390/app13031993.
Posner, Michael I., Mary K. Rothbart, Brad E. Sheese, and Pascale Voelker. "Developing Attention: Behavioral and Brain Mechanisms". Advances in Neuroscience 2014 (May 8, 2014): 1–9. http://dx.doi.org/10.1155/2014/405094.
Lin, Zhicheng, and Shihui Han. "Self-construal priming modulates the scope of visual attention". Quarterly Journal of Experimental Psychology 62, no. 4 (April 2009): 802–13. http://dx.doi.org/10.1080/17470210802271650.
Han, Suk Won, and Cheol Hwan Kim. "Neurocognitive Mechanisms Underlying Internet/Smartphone Addiction: A Preliminary fMRI Study". Tomography 8, no. 4 (July 11, 2022): 1781–90. http://dx.doi.org/10.3390/tomography8040150.
Nagai, Yukie, Koh Hosoda, Akio Morita, and Minoru Asada. "Emergence of Joint Attention through Bootstrap Learning based on the Mechanisms of Visual Attention and Learning with Self-evaluation". Transactions of the Japanese Society for Artificial Intelligence 19 (2004): 10–19. http://dx.doi.org/10.1527/tjsai.19.10.
Ren, Xudie, Jialve Wang, and Shenghong Li. "MAM: Multiple Attention Mechanism Neural Networks for Cross-Age Face Recognition". Wireless Communications and Mobile Computing 2022 (April 30, 2022): 1–11. http://dx.doi.org/10.1155/2022/8546029.
Kuo, Yu-Chen, Ching-Bang Yao, and Chen-Yu Wu. "A Strategy for Enhancing English Learning Achievement, Based on the Eye-Tracking Technology with Self-Regulated Learning". Sustainability 14, no. 23 (December 6, 2022): 16286. http://dx.doi.org/10.3390/su142316286.
Zhu, Yuhua, Hang Li, Tong Zhen, and Zhihui Li. "Integrating Self-Attention Mechanisms and ResNet for Grain Storage Ventilation Decision Making: A Study". Applied Sciences 13, no. 13 (June 28, 2023): 7655. http://dx.doi.org/10.3390/app13137655.
Diao, Zhifeng, and Fanglei Sun. "Visual Object Tracking Based on Deep Neural Network". Mathematical Problems in Engineering 2022 (July 12, 2022): 1–9. http://dx.doi.org/10.1155/2022/2154463.
Gao, Yue, Di Li, Xiangjian Chen, and Junwu Zhu. "Attention-Based Mechanisms for Cognitive Reinforcement Learning". Applied Sciences 13, no. 13 (June 21, 2023): 7361. http://dx.doi.org/10.3390/app13137361.
Wang, Wanru, Yuwei Lv, Yonggang Wen, and Xuemei Sun. "Rumor Detection Based on Knowledge Enhancement and Graph Attention Network". Discrete Dynamics in Nature and Society 2022 (October 6, 2022): 1–12. http://dx.doi.org/10.1155/2022/6257658.
Niu, Jinxing, Shuo Liu, Hanbing Li, Tao Zhang, and Lijun Wang. "Grasp Detection Combining Self-Attention with CNN in Complex Scenes". Applied Sciences 13, no. 17 (August 25, 2023): 9655. http://dx.doi.org/10.3390/app13179655.
Ma, Suling. "A Study of Two-Way Short- and Long-Term Memory Network Intelligent Computing IoT Model-Assisted Home Education Attention Mechanism". Computational Intelligence and Neuroscience 2021 (December 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/3587884.
Kardakis, Spyridon, Isidoros Perikos, Foteini Grivokostopoulou, and Ioannis Hatzilygeroudis. "Examining Attention Mechanisms in Deep Learning Models for Sentiment Analysis". Applied Sciences 11, no. 9 (April 25, 2021): 3883. http://dx.doi.org/10.3390/app11093883.
Hendricks, Lisa Anne, John Mellor, Rosalia Schneider, Jean-Baptiste Alayrac, and Aida Nematzadeh. "Decoupling the Role of Data, Attention, and Losses in Multimodal Transformers". Transactions of the Association for Computational Linguistics 9 (2021): 570–85. http://dx.doi.org/10.1162/tacl_a_00385.
Zhang, Shugang, Mingjian Jiang, Shuang Wang, Xiaofeng Wang, Zhiqiang Wei, and Zhen Li. "SAG-DTA: Prediction of Drug–Target Affinity Using Self-Attention Graph Network". International Journal of Molecular Sciences 22, no. 16 (August 20, 2021): 8993. http://dx.doi.org/10.3390/ijms22168993.
Reimann, Jan Niclas, Andreas Schwung, and Steven X. Ding. "Adopting attention-mechanisms for Neural Logic Rule Layers". at - Automatisierungstechnik 70, no. 3 (March 1, 2022): 257–66. http://dx.doi.org/10.1515/auto-2021-0136.
Schäfer, Sarah, Dirk Wentura, and Christian Frings. "Creating a network of importance: The particular effects of self-relevance on stimulus processing". Attention, Perception, & Psychophysics 82, no. 7 (June 17, 2020): 3750–66. http://dx.doi.org/10.3758/s13414-020-02070-7.
Zhou, Wei, Zhongwei Qu, Lianen Qu, Xupeng Wang, Yilin Shi, De Zhang, and Zhenlin Hui. "Radar Echo Maps Prediction Using an Improved MIM with Self-Attention Memory Module". Journal of Sensors 2023 (July 18, 2023): 1–12. http://dx.doi.org/10.1155/2023/8876971.
Li, Yabei, Minjun Liang, Mingyang Wei, Ge Wang, and Yanan Li. "Mechanisms and Applications of Attention in Medical Image Segmentation: A Review". Academic Journal of Science and Technology 5, no. 3 (May 5, 2023): 237–43. http://dx.doi.org/10.54097/ajst.v5i3.8021.
Zhang, Rongkai, Ying Zeng, Li Tong, and Bin Yan. "Specific Neural Mechanisms of Self-Cognition and the Application of Brainprint Recognition". Biology 12, no. 3 (March 22, 2023): 486. http://dx.doi.org/10.3390/biology12030486.
Mörtberg, Ewa, Asle Hoffart, Benjamin Boecking, and David M. Clark. "Shifting the Focus of One's Attention Mediates Improvement in Cognitive Therapy for Social Anxiety Disorder". Behavioural and Cognitive Psychotherapy 43, no. 1 (August 28, 2013): 63–73. http://dx.doi.org/10.1017/s1352465813000738.
Fichten, Catherine S., Harriet Lennox, Kristen Robillard, John Wright, Stéphane Sabourin, and Rhonda Amsel. "Attentional Focus and Attitudes Toward Peers with Disabilities: Self Focusing and A Comparison of Modeling and Self-Disclosure". Journal of Applied Rehabilitation Counseling 27, no. 4 (December 1, 1996): 30–39. http://dx.doi.org/10.1891/0047-2220.27.4.30.
Zhijian, Lyu, Jiang Shaohua, and Tan Yonghao. "DSAGLSTM-DTA: Prediction of Drug-Target Affinity using Dual Self-Attention and LSTM". Machine Learning and Applications: An International Journal 9, no. 02 (June 30, 2022): 1–19. http://dx.doi.org/10.5121/mlaij.2022.9201.
Song, Jiang, Jianguo Qian, Zhengjun Liu, Yang Jiao, Jiahui Zhou, Yongrong Li, Yiming Chen, Jie Guo, and Zhiqiang Wang. "Research on Arc Sag Measurement Methods for Transmission Lines Based on Deep Learning and Photogrammetry Technology". Remote Sensing 15, no. 10 (May 11, 2023): 2533. http://dx.doi.org/10.3390/rs15102533.
Gilboa-Schechtman, E., and R. Azoulay. "Treatment of Social Anxiety Disorder: Mechanisms, Techniques, and Empirically Supported Interventions". Клиническая и специальная психология 11, no. 2 (2022): 1–21. http://dx.doi.org/10.17759/cpse.2022110201.
Wu, Sitong, Tianyi Wu, Haoru Tan, and Guodong Guo. "Pale Transformer: A General Vision Transformer Backbone with Pale-Shaped Attention". Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 2731–39. http://dx.doi.org/10.1609/aaai.v36i3.20176.
Yan, Wenhui, Wending Tang, Lihua Wang, Yannan Bin, and Junfeng Xia. "PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization". PLOS Computational Biology 18, no. 9 (September 12, 2022): e1010511. http://dx.doi.org/10.1371/journal.pcbi.1010511.
You, Yujie, Le Zhang, Peng Tao, Suran Liu, and Luonan Chen. "Spatiotemporal Transformer Neural Network for Time-Series Forecasting". Entropy 24, no. 11 (November 14, 2022): 1651. http://dx.doi.org/10.3390/e24111651.
Elster, Jon. "Self-poisoning of the mind". Philosophical Transactions of the Royal Society B: Biological Sciences 365, no. 1538 (January 27, 2010): 221–26. http://dx.doi.org/10.1098/rstb.2009.0176.
Marmolejo-Martínez-Artesero, Sara, Caty Casas, and David Romeo-Guitart. "Endogenous Mechanisms of Neuroprotection: To Boost or Not to Be". Cells 10, no. 2 (February 10, 2021): 370. http://dx.doi.org/10.3390/cells10020370.