Journal articles on the topic "Transformer Architecture"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the top 50 journal articles for research on the topic "Transformer Architecture".
Next to every work in the reference list you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic citation for the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a publication as a .pdf file and read its abstract online, whenever these are available in the metadata.
Browse journal articles across many disciplines and compile your bibliography correctly.
Rahali, Abir, and Moulay A. Akhloufi. "End-to-End Transformer-Based Models in Textual-Based NLP." AI 4, no. 1 (January 5, 2023): 54–110. http://dx.doi.org/10.3390/ai4010004.
Chi, Ye, Haikun Liu, Ganwei Peng, Xiaofei Liao, and Hai Jin. "Transformer: An OS-Supported Reconfigurable Hybrid Memory Architecture." Applied Sciences 12, no. 24 (December 18, 2022): 12995. http://dx.doi.org/10.3390/app122412995.
Cui, Liyuan, Guoqiang Zhong, Xiang Liu, and Hongwei Xu. "A Compact Object Detection Architecture with Transformer Enhancing." Journal of Physics: Conference Series 2278, no. 1 (May 1, 2022): 012034. http://dx.doi.org/10.1088/1742-6596/2278/1/012034.
Lorenzo, Javier, Ignacio Parra Alonso, Rubén Izquierdo, Augusto Luis Ballardini, Álvaro Hernández Saz, David Fernández Llorca, and Miguel Ángel Sotelo. "CAPformer: Pedestrian Crossing Action Prediction Using Transformer." Sensors 21, no. 17 (August 24, 2021): 5694. http://dx.doi.org/10.3390/s21175694.
Shao, Ran, Xiao-Jun Bi, and Zheng Chen. "A novel hybrid transformer-CNN architecture for environmental microorganism classification." PLOS ONE 17, no. 11 (November 11, 2022): e0277557. http://dx.doi.org/10.1371/journal.pone.0277557.
Ibrahem, Hatem, Ahmed Salem, and Hyun-Soo Kang. "RT-ViT: Real-Time Monocular Depth Estimation Using Lightweight Vision Transformers." Sensors 22, no. 10 (May 19, 2022): 3849. http://dx.doi.org/10.3390/s22103849.
Lee, Jaewoo, Sungjun Lee, Wonki Cho, Zahid Ali Siddiqui, and Unsang Park. "Vision Transformer-Based Tailing Detection in Videos." Applied Sciences 11, no. 24 (December 7, 2021): 11591. http://dx.doi.org/10.3390/app112411591.
He, Ju, Jie-Neng Chen, Shuai Liu, Adam Kortylewski, Cheng Yang, Yutong Bai, and Changhu Wang. "TransFG: A Transformer Architecture for Fine-Grained Recognition." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 1 (June 28, 2022): 852–60. http://dx.doi.org/10.1609/aaai.v36i1.19967.
Gao, Shuguo, Jun Zhao, Yunpeng Liu, Ziqiang Xu, Zhe Li, Lu Sun, and Yuan Tian. "Research into Power Transformer Health Assessment Technology Based on Uncertainty of Information and Deep Architecture Design." Mathematical Problems in Engineering 2021 (April 2, 2021): 1–12. http://dx.doi.org/10.1155/2021/8831872.
Xu, Zhen, David R. So, and Andrew M. Dai. "MUFASA: Multimodal Fusion Architecture Search for Electronic Health Records." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10532–40. http://dx.doi.org/10.1609/aaai.v35i12.17260.
Wei, Lixing. "A Transformer Network Architecture for Dermoscopy Image Segmentation." Journal of Physics: Conference Series 2303, no. 1 (July 1, 2022): 012043. http://dx.doi.org/10.1088/1742-6596/2303/1/012043.
Wang, Zhixue, Yu Zhang, Lin Luo, and Nan Wang. "TransCD: scene change detection via transformer-based architecture." Optics Express 29, no. 25 (November 30, 2021): 41409. http://dx.doi.org/10.1364/oe.440720.
Choi, Yong-Seok, Yo-Han Park, and Kong Joo Lee. "Building a Korean morphological analyzer using two Korean BERT models." PeerJ Computer Science 8 (May 2, 2022): e968. http://dx.doi.org/10.7717/peerj-cs.968.
Shi, Hao, Bingqian Chai, Yupei Wang, and Liang Chen. "A Local-Sparse-Information-Aggregation Transformer with Explicit Contour Guidance for SAR Ship Detection." Remote Sensing 14, no. 20 (October 20, 2022): 5247. http://dx.doi.org/10.3390/rs14205247.
Young, Paul, Nima Ebadi, Arun Das, Mazal Bethany, Kevin Desai, and Peyman Najafirad. "Can Hierarchical Transformers Learn Facial Geometry?" Sensors 23, no. 2 (January 13, 2023): 929. http://dx.doi.org/10.3390/s23020929.
Bai, He, Peng Shi, Jimmy Lin, Yuqing Xie, Luchen Tan, Kun Xiong, Wen Gao, and Ming Li. "Segatron: Segment-Aware Transformer for Language Modeling and Understanding." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 14 (May 18, 2021): 12526–34. http://dx.doi.org/10.1609/aaai.v35i14.17485.
Burn, G. L. "Implementing the evaluation transformer model of reduction on parallel machines." Journal of Functional Programming 1, no. 3 (July 1991): 329–66. http://dx.doi.org/10.1017/s0956796800000137.
Iliadis, Lazaros, Spyridon Nikolaidis, Panagiotis Sarigiannidis, Shaohua Wan, and Sotirios Goudos. "Artwork Style Recognition Using Vision Transformers and MLP Mixer." Technologies 10, no. 1 (December 28, 2021): 2. http://dx.doi.org/10.3390/technologies10010002.
Obuchowski, Aleksander, and Michał Lew. "Transformer-Capsule Model for Intent Detection (Student Abstract)." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 10 (April 3, 2020): 13885–86. http://dx.doi.org/10.1609/aaai.v34i10.7215.
Vasilevskij, V. V., and M. O. Poliakov. "Reproducing of the humidity curve of power transformers oil using adaptive neuro-fuzzy systems." Electrical Engineering & Electromechanics, no. 1 (February 23, 2021): 10–14. http://dx.doi.org/10.20998/2074-272x.2021.1.02.
Kim, Ki Jin, Tae Ho Lim, S. H. Park, and K. H. Ahn. "A High Efficiency CMOS Power Amplfieir with a Diode Linearizer and Voltage Combining Transformers." Applied Mechanics and Materials 110-116 (October 2011): 5500–5504. http://dx.doi.org/10.4028/www.scientific.net/amm.110-116.5500.
Han, Jianhua, Xiajun Deng, Xinyue Cai, Zhen Yang, Hang Xu, Chunjing Xu, and Xiaodan Liang. "Laneformer: Object-Aware Row-Column Transformers for Lane Detection." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 1 (June 28, 2022): 799–807. http://dx.doi.org/10.1609/aaai.v36i1.19961.
Chen, Shichuan, Kunfeng Qiu, Shilian Zheng, Qi Xuan, and Xiaoniu Yang. "Radio–Image Transformer: Bridging Radio Modulation Classification and ImageNet Classification." Electronics 9, no. 10 (October 9, 2020): 1646. http://dx.doi.org/10.3390/electronics9101646.
Sun, Zeyu, Qihao Zhu, Yingfei Xiong, Yican Sun, Lili Mou, and Lu Zhang. "TreeGen: A Tree-Based Transformer Architecture for Code Generation." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8984–91. http://dx.doi.org/10.1609/aaai.v34i05.6430.
Zhao, Qian, Hao Yang, Dongming Zhou, and Jinde Cao. "Rethinking Image Deblurring via CNN-Transformer Multiscale Hybrid Architecture." IEEE Transactions on Instrumentation and Measurement 72 (2023): 1–15. http://dx.doi.org/10.1109/tim.2022.3230482.
Wu, Sitong, Tianyi Wu, Haoru Tan, and Guodong Guo. "Pale Transformer: A General Vision Transformer Backbone with Pale-Shaped Attention." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 2731–39. http://dx.doi.org/10.1609/aaai.v36i3.20176.
Osolo, Raymond Ian, Zhan Yang, and Jun Long. "An Attentive Fourier-Augmented Image-Captioning Transformer." Applied Sciences 11, no. 18 (September 9, 2021): 8354. http://dx.doi.org/10.3390/app11188354.
Paik, Incheon, and Jun-Wei Wang. "Improving Text-to-Code Generation with Features of Code Graph on GPT-2." Electronics 10, no. 21 (November 5, 2021): 2706. http://dx.doi.org/10.3390/electronics10212706.
Özdemir, Özgür, Emre Salih Akın, Rıza Velioğlu, and Tuğba Dalyan. "A comparative study of neural machine translation models for Turkish language." Journal of Intelligent & Fuzzy Systems 42, no. 3 (February 2, 2022): 2103–13. http://dx.doi.org/10.3233/jifs-211453.
Reva, I. V., O. V. Bialobrzheskyi, O. V. Todorov, and M. A. Bezzub. "Review of electric methods and systems for monitoring power transformers in the SMART GRID environment." Electrical Engineering and Power Engineering, no. 1 (March 30, 2022): 30–41. http://dx.doi.org/10.15588/1607-6761-2022-1-3.
Sun, Tao, and Hai Bo Liu. "Design of Fault Diagnosis Expert System of Transformer." Applied Mechanics and Materials 291-294 (February 2013): 2557–61. http://dx.doi.org/10.4028/www.scientific.net/amm.291-294.2557.
Lu, Kevin, Aditya Grover, Pieter Abbeel, and Igor Mordatch. "Frozen Pretrained Transformers as Universal Computation Engines." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 7 (June 28, 2022): 7628–36. http://dx.doi.org/10.1609/aaai.v36i7.20729.
Zhang, Zizhao, Han Zhang, Long Zhao, Ting Chen, Sercan Ö. Arik, and Tomas Pfister. "Nested Hierarchical Transformer: Towards Accurate, Data-Efficient and Interpretable Visual Understanding." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 3417–25. http://dx.doi.org/10.1609/aaai.v36i3.20252.
Yang, Xin, and Tao Su. "EFA-Trans: An Efficient and Flexible Acceleration Architecture for Transformers." Electronics 11, no. 21 (October 31, 2022): 3550. http://dx.doi.org/10.3390/electronics11213550.
Bacco, Luca, Andrea Cimino, Felice Dell’Orletta, and Mario Merone. "Explainable Sentiment Analysis: A Hierarchical Transformer-Based Extractive Summarization Approach." Electronics 10, no. 18 (September 8, 2021): 2195. http://dx.doi.org/10.3390/electronics10182195.
Dai, Yaonan, Jiuyang Yu, Dean Zhang, Tianhao Hu, and Xiaotao Zheng. "RODFormer: High-Precision Design for Rotating Object Detection with Transformers." Sensors 22, no. 7 (March 29, 2022): 2633. http://dx.doi.org/10.3390/s22072633.
Zhu, Xiaoning, Yannan Jia, Sun Jian, Lize Gu, and Zhang Pu. "ViTT: Vision Transformer Tracker." Sensors 21, no. 16 (August 20, 2021): 5608. http://dx.doi.org/10.3390/s21165608.
Bedair, Sarah S., Jeffrey S. Pulskamp, Ryan Rudy, Ronald Polcawich, Ryan Cable, and Lee Griffin. "Boosting MEMS Piezoelectric Transformer Figures of Merit via Architecture Optimization." IEEE Electron Device Letters 39, no. 3 (March 2018): 428–31. http://dx.doi.org/10.1109/led.2018.2799864.
Ramos-Pérez, Eduardo, Pablo J. Alonso-González, and José Javier Núñez-Velázquez. "Multi-Transformer: A New Neural Network-Based Architecture for Forecasting S&P Volatility." Mathematics 9, no. 15 (July 28, 2021): 1794. http://dx.doi.org/10.3390/math9151794.
Meng, Fandong, and Jinchao Zhang. "DTMT: A Novel Deep Transition Architecture for Neural Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 224–31. http://dx.doi.org/10.1609/aaai.v33i01.3301224.
Chaudhry, Parinnay. "Bidirectional Encoder Representations from Transformers for Modelling Stock Prices." International Journal for Research in Applied Science and Engineering Technology 10, no. 2 (February 28, 2022): 896–901. http://dx.doi.org/10.22214/ijraset.2022.40406.
Sykiotis, Stavros, Maria Kaselimi, Anastasios Doulamis, and Nikolaos Doulamis. "ELECTRIcity: An Efficient Transformer for Non-Intrusive Load Monitoring." Sensors 22, no. 8 (April 11, 2022): 2926. http://dx.doi.org/10.3390/s22082926.
Xu, Yuxin, Yuyao Yan, Yiming Lin, Xi Yang, and Kaizhu Huang. "Sketch Based Image Retrieval for Architecture Images with Siamese Swin Transformer." Journal of Physics: Conference Series 2278, no. 1 (May 1, 2022): 012035. http://dx.doi.org/10.1088/1742-6596/2278/1/012035.
Wang, Bingting, Ziping Cao, Zhen Luan, and Bo Zhou. "Design and Evaluation of Band-Pass Matching Coupler for Narrow-Band DC Power Line Communications." Journal of Circuits, Systems and Computers 28, no. 07 (June 27, 2019): 1950119. http://dx.doi.org/10.1142/s0218126619501196.
杨, 靖翔. "Research on Chinese Text Error Correction Based on Transformer Enhanced Architecture." Computer Science and Application 12, no. 03 (2022): 565–71. http://dx.doi.org/10.12677/csa.2022.123057.
Raisi, Zobeir, Mohamed A. Naiel, Paul Fieguth, Steven Wardell, and John Zelek. "2D Positional Embedding-based Transformer for Scene Text Recognition." Journal of Computational Vision and Imaging Systems 6, no. 1 (January 15, 2021): 1–4. http://dx.doi.org/10.15353/jcvis.v6i1.3533.
Wang, Zeji, Xiaowei He, Yi Li, and Qinliang Chuai. "EmbedFormer: Embedded Depth-Wise Convolution Layer for Token Mixing." Sensors 22, no. 24 (December 15, 2022): 9854. http://dx.doi.org/10.3390/s22249854.
Pan, Hang, Lun Xie, and Zhiliang Wang. "Plant and Animal Species Recognition Based on Dynamic Vision Transformer Architecture." Remote Sensing 14, no. 20 (October 20, 2022): 5242. http://dx.doi.org/10.3390/rs14205242.
Prakash, PKS, Srinivas Chilukuri, Nikhil Ranade, and Shankar Viswanathan. "RareBERT: Transformer Architecture for Rare Disease Patient Identification using Administrative Claims." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 1 (May 18, 2021): 453–60. http://dx.doi.org/10.1609/aaai.v35i1.16122.
Chernyshov, Artem, Valentin Klimov, Anita Balandina, and Boris Shchukin. "The Application of Transformer Model Architecture for the Dependency Parsing Task." Procedia Computer Science 190 (2021): 142–45. http://dx.doi.org/10.1016/j.procs.2021.06.018.