Journal articles on the topic "Self-attention mechanisms"
Create accurate references in APA, MLA, Chicago, Harvard, and many other styles
Consult the top 50 journal articles on the topic "Self-attention mechanisms".
An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online, whenever the relevant data are available in the publication's metadata.
Browse journal articles from a wide range of disciplines and compile an accurate bibliography.
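The entries below approach self-attention from many angles, but most of the machine-learning papers build on the same core operation: scaled dot-product self-attention (Vaswani et al., 2017). As a quick orientation for readers comparing these works, here is a minimal, illustrative NumPy sketch of that operation; the function and variable names are our own and are not drawn from any of the cited papers.

# Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
# Names here are illustrative only, not taken from any cited work.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_k) projections."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # pairwise token similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v                        # weighted mixture of values

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))                  # 5 tokens, 16-dim embeddings
w = [rng.normal(size=(16, 8)) for _ in range(3)]
out = self_attention(x, *w)                   # output shape: (5, 8)

Multi-head projections, positional encodings, and masking, which several of the listed papers modify, are deliberately omitted from this sketch.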
Makarov, Ilya, Maria Bakhanova, Sergey Nikolenko, and Olga Gerasimova. "Self-supervised recurrent depth estimation with attention mechanisms". PeerJ Computer Science 8 (January 31, 2022): e865. http://dx.doi.org/10.7717/peerj-cs.865.
Bae, Ara, and Wooil Kim. "Speaker Verification Employing Combinations of Self-Attention Mechanisms". Electronics 9, no. 12 (December 21, 2020): 2201. http://dx.doi.org/10.3390/electronics9122201.
Cao, Fude, Chunguang Zheng, Limin Huang, Aihua Wang, Jiong Zhang, Feng Zhou, Haoxue Ju, Haitao Guo, and Yuxia Du. "Research of Self-Attention in Image Segmentation". Journal of Information Technology Research 15, no. 1 (January 2022): 1–12. http://dx.doi.org/10.4018/jitr.298619.
Dai, Biyun, Jinlong Li, and Ruoyi Xu. "Multiple Positional Self-Attention Network for Text Classification". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7610–17. http://dx.doi.org/10.1609/aaai.v34i05.6261.
Xie, Fei, Dalong Zhang, and Chengming Liu. "Global–Local Self-Attention Based Transformer for Speaker Verification". Applied Sciences 12, no. 19 (October 10, 2022): 10154. http://dx.doi.org/10.3390/app121910154.
Ishizuka, Ryoto, Ryo Nishikimi, and Kazuyoshi Yoshii. "Global Structure-Aware Drum Transcription Based on Self-Attention Mechanisms". Signals 2, no. 3 (August 13, 2021): 508–26. http://dx.doi.org/10.3390/signals2030031.
Zhu, Hu, Ze Wang, Yu Shi, Yingying Hua, Guoxia Xu, and Lizhen Deng. "Multimodal Fusion Method Based on Self-Attention Mechanism". Wireless Communications and Mobile Computing 2020 (September 23, 2020): 1–8. http://dx.doi.org/10.1155/2020/8843186.
Posner, Michael I., and Mary K. Rothbart. "Developing mechanisms of self-regulation". Development and Psychopathology 12, no. 3 (September 2000): 427–41. http://dx.doi.org/10.1017/s0954579400003096.
Tiwari, Prayag, Amit Kumar Jaiswal, Sahil Garg, and Ilsun You. "SANTM: Efficient Self-attention-driven Network for Text Matching". ACM Transactions on Internet Technology 22, no. 3 (August 31, 2022): 1–21. http://dx.doi.org/10.1145/3426971.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms". F1000Research 10 (May 17, 2022): 1001. http://dx.doi.org/10.12688/f1000research.73131.2.
Lin, Hung-Hsiang, Jiun-Da Lin, Jose Jaena Mari Ople, Jun-Cheng Chen, and Kai-Lung Hua. "Social Media Popularity Prediction Based on Multi-Modal Self-Attention Mechanisms". IEEE Access 10 (2022): 4448–55. http://dx.doi.org/10.1109/access.2021.3136552.
Ng, Hu, Glenn Jun Weng Chia, Timothy Tzen Vun Yap, and Vik Tor Goh. "Modelling sentiments based on objectivity and subjectivity with self-attention mechanisms". F1000Research 10 (October 4, 2021): 1001. http://dx.doi.org/10.12688/f1000research.73131.1.
Baer, Ruth A. "Self-Focused Attention and Mechanisms of Change in Mindfulness-Based Treatment". Cognitive Behaviour Therapy 38, sup1 (January 2009): 15–20. http://dx.doi.org/10.1080/16506070902980703.
Lo, Ronda F., Andy H. Ng, Adam S. Cohen, and Joni Y. Sasaki. "Does self-construal shape automatic social attention?" PLOS ONE 16, no. 2 (February 10, 2021): e0246577. http://dx.doi.org/10.1371/journal.pone.0246577.
Kaiser, Roselinde H., Hannah R. Snyder, Franziska Goer, Rachel Clegg, Manon Ironside, and Diego A. Pizzagalli. "Attention Bias in Rumination and Depression: Cognitive Mechanisms and Brain Networks". Clinical Psychological Science 6, no. 6 (September 21, 2018): 765–82. http://dx.doi.org/10.1177/2167702618797935.
Chen, Shouyan, Mingyan Zhang, Xiaofen Yang, Zhijia Zhao, Tao Zou, and Xinqi Sun. "The Impact of Attention Mechanisms on Speech Emotion Recognition". Sensors 21, no. 22 (November 12, 2021): 7530. http://dx.doi.org/10.3390/s21227530.
Springer, Anne, Juliane Beyer, Jan Derrfuss, Kirsten G. Volz, and Bettina Hannover. "Seeing You or the Scene? Self-Construals Modulate Inhibitory Mechanisms of Attention". Social Cognition 30, no. 2 (April 2012): 133–52. http://dx.doi.org/10.1521/soco.2012.30.2.133.
Sun, Yange, Meng Li, Huaping Guo, and Li Zhang. "MSGSA: Multi-Scale Guided Self-Attention Network for Crowd Counting". Electronics 12, no. 12 (June 11, 2023): 2631. http://dx.doi.org/10.3390/electronics12122631.
Wang, Mei, Yu Yao, Hongbin Qiu, and Xiyu Song. "Adaptive Memory-Controlled Self-Attention for Polyphonic Sound Event Detection". Symmetry 14, no. 2 (February 12, 2022): 366. http://dx.doi.org/10.3390/sym14020366.
Zhou, Qian, Hua Zou, and Huanhuan Wu. "LGViT: A Local and Global Vision Transformer with Dynamic Contextual Position Bias Using Overlapping Windows". Applied Sciences 13, no. 3 (February 3, 2023): 1993. http://dx.doi.org/10.3390/app13031993.
Posner, Michael I., Mary K. Rothbart, Brad E. Sheese, and Pascale Voelker. "Developing Attention: Behavioral and Brain Mechanisms". Advances in Neuroscience 2014 (May 8, 2014): 1–9. http://dx.doi.org/10.1155/2014/405094.
Lin, Zhicheng, and Shihui Han. "Self-construal priming modulates the scope of visual attention". Quarterly Journal of Experimental Psychology 62, no. 4 (April 2009): 802–13. http://dx.doi.org/10.1080/17470210802271650.
Han, Suk Won, and Cheol Hwan Kim. "Neurocognitive Mechanisms Underlying Internet/Smartphone Addiction: A Preliminary fMRI Study". Tomography 8, no. 4 (July 11, 2022): 1781–90. http://dx.doi.org/10.3390/tomography8040150.
Nagai, Yukie, Koh Hosoda, Akio Morita, and Minoru Asada. "Emergence of Joint Attention through Bootstrap Learning based on the Mechanisms of Visual Attention and Learning with Self-evaluation". Transactions of the Japanese Society for Artificial Intelligence 19 (2004): 10–19. http://dx.doi.org/10.1527/tjsai.19.10.
Ren, Xudie, Jialve Wang, and Shenghong Li. "MAM: Multiple Attention Mechanism Neural Networks for Cross-Age Face Recognition". Wireless Communications and Mobile Computing 2022 (April 30, 2022): 1–11. http://dx.doi.org/10.1155/2022/8546029.
Kuo, Yu-Chen, Ching-Bang Yao, and Chen-Yu Wu. "A Strategy for Enhancing English Learning Achievement, Based on the Eye-Tracking Technology with Self-Regulated Learning". Sustainability 14, no. 23 (December 6, 2022): 16286. http://dx.doi.org/10.3390/su142316286.
Zhu, Yuhua, Hang Li, Tong Zhen, and Zhihui Li. "Integrating Self-Attention Mechanisms and ResNet for Grain Storage Ventilation Decision Making: A Study". Applied Sciences 13, no. 13 (June 28, 2023): 7655. http://dx.doi.org/10.3390/app13137655.
Diao, Zhifeng, and Fanglei Sun. "Visual Object Tracking Based on Deep Neural Network". Mathematical Problems in Engineering 2022 (July 12, 2022): 1–9. http://dx.doi.org/10.1155/2022/2154463.
Gao, Yue, Di Li, Xiangjian Chen, and Junwu Zhu. "Attention-Based Mechanisms for Cognitive Reinforcement Learning". Applied Sciences 13, no. 13 (June 21, 2023): 7361. http://dx.doi.org/10.3390/app13137361.
Wang, Wanru, Yuwei Lv, Yonggang Wen, and Xuemei Sun. "Rumor Detection Based on Knowledge Enhancement and Graph Attention Network". Discrete Dynamics in Nature and Society 2022 (October 6, 2022): 1–12. http://dx.doi.org/10.1155/2022/6257658.
Niu, Jinxing, Shuo Liu, Hanbing Li, Tao Zhang, and Lijun Wang. "Grasp Detection Combining Self-Attention with CNN in Complex Scenes". Applied Sciences 13, no. 17 (August 25, 2023): 9655. http://dx.doi.org/10.3390/app13179655.
Ma, Suling. "A Study of Two-Way Short- and Long-Term Memory Network Intelligent Computing IoT Model-Assisted Home Education Attention Mechanism". Computational Intelligence and Neuroscience 2021 (December 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/3587884.
Kardakis, Spyridon, Isidoros Perikos, Foteini Grivokostopoulou, and Ioannis Hatzilygeroudis. "Examining Attention Mechanisms in Deep Learning Models for Sentiment Analysis". Applied Sciences 11, no. 9 (April 25, 2021): 3883. http://dx.doi.org/10.3390/app11093883.
Hendricks, Lisa Anne, John Mellor, Rosalia Schneider, Jean-Baptiste Alayrac, and Aida Nematzadeh. "Decoupling the Role of Data, Attention, and Losses in Multimodal Transformers". Transactions of the Association for Computational Linguistics 9 (2021): 570–85. http://dx.doi.org/10.1162/tacl_a_00385.
Zhang, Shugang, Mingjian Jiang, Shuang Wang, Xiaofeng Wang, Zhiqiang Wei, and Zhen Li. "SAG-DTA: Prediction of Drug–Target Affinity Using Self-Attention Graph Network". International Journal of Molecular Sciences 22, no. 16 (August 20, 2021): 8993. http://dx.doi.org/10.3390/ijms22168993.
Reimann, Jan Niclas, Andreas Schwung, and Steven X. Ding. "Adopting attention-mechanisms for Neural Logic Rule Layers". at - Automatisierungstechnik 70, no. 3 (March 1, 2022): 257–66. http://dx.doi.org/10.1515/auto-2021-0136.
Schäfer, Sarah, Dirk Wentura, and Christian Frings. "Creating a network of importance: The particular effects of self-relevance on stimulus processing". Attention, Perception, & Psychophysics 82, no. 7 (June 17, 2020): 3750–66. http://dx.doi.org/10.3758/s13414-020-02070-7.
Zhou, Wei, Zhongwei Qu, Lianen Qu, Xupeng Wang, Yilin Shi, De Zhang, and Zhenlin Hui. "Radar Echo Maps Prediction Using an Improved MIM with Self-Attention Memory Module". Journal of Sensors 2023 (July 18, 2023): 1–12. http://dx.doi.org/10.1155/2023/8876971.
Li, Yabei, Minjun Liang, Mingyang Wei, Ge Wang, and Yanan Li. "Mechanisms and Applications of Attention in Medical Image Segmentation: A Review". Academic Journal of Science and Technology 5, no. 3 (May 5, 2023): 237–43. http://dx.doi.org/10.54097/ajst.v5i3.8021.
Zhang, Rongkai, Ying Zeng, Li Tong, and Bin Yan. "Specific Neural Mechanisms of Self-Cognition and the Application of Brainprint Recognition". Biology 12, no. 3 (March 22, 2023): 486. http://dx.doi.org/10.3390/biology12030486.
Mörtberg, Ewa, Asle Hoffart, Benjamin Boecking, and David M. Clark. "Shifting the Focus of One's Attention Mediates Improvement in Cognitive Therapy for Social Anxiety Disorder". Behavioural and Cognitive Psychotherapy 43, no. 1 (August 28, 2013): 63–73. http://dx.doi.org/10.1017/s1352465813000738.
Fichten, Catherine S., Harriet Lennox, Kristen Robillard, John Wright, Stéphane Sabourin, and Rhonda Amsel. "Attentional Focus and Attitudes Toward Peers with Disabilities: Self Focusing and A Comparison of Modeling and Self-Disclosure". Journal of Applied Rehabilitation Counseling 27, no. 4 (December 1, 1996): 30–39. http://dx.doi.org/10.1891/0047-2220.27.4.30.
Zhijian, Lyu, Jiang Shaohua, and Tan Yonghao. "DSAGLSTM-DTA: Prediction of Drug-Target Affinity using Dual Self-Attention and LSTM". Machine Learning and Applications: An International Journal 9, no. 02 (June 30, 2022): 1–19. http://dx.doi.org/10.5121/mlaij.2022.9201.
Song, Jiang, Jianguo Qian, Zhengjun Liu, Yang Jiao, Jiahui Zhou, Yongrong Li, Yiming Chen, Jie Guo, and Zhiqiang Wang. "Research on Arc Sag Measurement Methods for Transmission Lines Based on Deep Learning and Photogrammetry Technology". Remote Sensing 15, no. 10 (May 11, 2023): 2533. http://dx.doi.org/10.3390/rs15102533.
Gilboa-Schechtman, E., and R. Azoulay. "Treatment of Social Anxiety Disorder: Mechanisms, Techniques, and Empirically Supported Interventions". Клиническая и специальная психология 11, no. 2 (2022): 1–21. http://dx.doi.org/10.17759/cpse.2022110201.
Wu, Sitong, Tianyi Wu, Haoru Tan, and Guodong Guo. "Pale Transformer: A General Vision Transformer Backbone with Pale-Shaped Attention". Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 2731–39. http://dx.doi.org/10.1609/aaai.v36i3.20176.
Yan, Wenhui, Wending Tang, Lihua Wang, Yannan Bin, and Junfeng Xia. "PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization". PLOS Computational Biology 18, no. 9 (September 12, 2022): e1010511. http://dx.doi.org/10.1371/journal.pcbi.1010511.
You, Yujie, Le Zhang, Peng Tao, Suran Liu, and Luonan Chen. "Spatiotemporal Transformer Neural Network for Time-Series Forecasting". Entropy 24, no. 11 (November 14, 2022): 1651. http://dx.doi.org/10.3390/e24111651.
Elster, Jon. "Self-poisoning of the mind". Philosophical Transactions of the Royal Society B: Biological Sciences 365, no. 1538 (January 27, 2010): 221–26. http://dx.doi.org/10.1098/rstb.2009.0176.
Marmolejo-Martínez-Artesero, Sara, Caty Casas, and David Romeo-Guitart. "Endogenous Mechanisms of Neuroprotection: To Boost or Not to Be". Cells 10, no. 2 (February 10, 2021): 370. http://dx.doi.org/10.3390/cells10020370.