Journal articles on the topic "Pretrained models"
Below are the top 50 journal articles for research on the topic "Pretrained models".
Hofmann, Valentin, Goran Glavaš, Nikola Ljubešić, Janet B. Pierrehumbert, and Hinrich Schütze. "Geographic Adaptation of Pretrained Language Models". Transactions of the Association for Computational Linguistics 12 (2024): 411–31. http://dx.doi.org/10.1162/tacl_a_00652.
Bear Don’t Walk IV, Oliver J., Tony Sun, Adler Perotte, and Noémie Elhadad. "Clinically relevant pretraining is all you need". Journal of the American Medical Informatics Association 28, no. 9 (June 21, 2021): 1970–76. http://dx.doi.org/10.1093/jamia/ocab086.
Basu, Sourya, Prasanna Sattigeri, Karthikeyan Natesan Ramamurthy, Vijil Chenthamarakshan, Kush R. Varshney, Lav R. Varshney, and Payel Das. "Equi-Tuning: Group Equivariant Fine-Tuning of Pretrained Models". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 6788–96. http://dx.doi.org/10.1609/aaai.v37i6.25832.
Wang, Canjun, Zhao Li, Tong Chen, Ruishuang Wang, and Zhengyu Ju. "Research on the Application of Prompt Learning Pretrained Language Model in Machine Translation Task with Reinforcement Learning". Electronics 12, no. 16 (August 9, 2023): 3391. http://dx.doi.org/10.3390/electronics12163391.
Parmonangan, Ivan Halim, Marsella Marsella, Doharfen Frans Rino Pardede, Katarina Prisca Rijanto, Stephanie Stephanie, Kreshna Adhitya Chandra Kesuma, Valentina Tiara Cahyaningtyas, and Maria Susan Anggreainy. "Training CNN-based Model on Low Resource Hardware and Small Dataset for Early Prediction of Melanoma from Skin Lesion Images". Engineering, MAthematics and Computer Science (EMACS) Journal 5, no. 2 (May 31, 2023): 41–46. http://dx.doi.org/10.21512/emacsjournal.v5i2.9904.
Edman, Lukas, Gabriele Sarti, Antonio Toral, Gertjan van Noord, and Arianna Bisazza. "Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation". Transactions of the Association for Computational Linguistics 12 (2024): 392–410. http://dx.doi.org/10.1162/tacl_a_00651.
Won, Hyun-Sik, Min-Ji Kim, Dohyun Kim, Hee-Soo Kim, and Kang-Min Kim. "University Student Dropout Prediction Using Pretrained Language Models". Applied Sciences 13, no. 12 (June 13, 2023): 7073. http://dx.doi.org/10.3390/app13127073.
Zhou, Shengchao, Gaofeng Meng, Zhaoxiang Zhang, Richard Yi Da Xu, and Shiming Xiang. "Robust Feature Rectification of Pretrained Vision Models for Object Recognition". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 3 (June 26, 2023): 3796–804. http://dx.doi.org/10.1609/aaai.v37i3.25492.
Elazar, Yanai, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Eduard Hovy, Hinrich Schütze, and Yoav Goldberg. "Measuring and Improving Consistency in Pretrained Language Models". Transactions of the Association for Computational Linguistics 9 (2021): 1012–31. http://dx.doi.org/10.1162/tacl_a_00410.
Takeoka, Kunihiro. "Low-resource Taxonomy Enrichment with Pretrained Language Models". Journal of Natural Language Processing 29, no. 1 (2022): 259–63. http://dx.doi.org/10.5715/jnlp.29.259.
Si, Chenglei, Zhengyan Zhang, Yingfa Chen, Fanchao Qi, Xiaozhi Wang, Zhiyuan Liu, Yasheng Wang, Qun Liu, and Maosong Sun. "Sub-Character Tokenization for Chinese Pretrained Language Models". Transactions of the Association for Computational Linguistics 11 (May 18, 2023): 469–87. http://dx.doi.org/10.1162/tacl_a_00560.
Ren, Guanyu. "Monkeypox Disease Detection with Pretrained Deep Learning Models". Information Technology and Control 52, no. 2 (July 15, 2023): 288–96. http://dx.doi.org/10.5755/j01.itc.52.2.32803.
Chen, Zhi, Yuncong Liu, Lu Chen, Su Zhu, Mengyue Wu, and Kai Yu. "OPAL: Ontology-Aware Pretrained Language Model for End-to-End Task-Oriented Dialogue". Transactions of the Association for Computational Linguistics 11 (2023): 68–84. http://dx.doi.org/10.1162/tacl_a_00534.
Choi, Yong-Seok, Yo-Han Park, and Kong Joo Lee. "Building a Korean morphological analyzer using two Korean BERT models". PeerJ Computer Science 8 (May 2, 2022): e968. http://dx.doi.org/10.7717/peerj-cs.968.
Kim, Hyunil, Tae-Yeong Kwak, Hyeyoon Chang, Sun Woo Kim, and Injung Kim. "RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis". Bioengineering 10, no. 11 (November 2, 2023): 1279. http://dx.doi.org/10.3390/bioengineering10111279.
Ivgi, Maor, Uri Shaham, and Jonathan Berant. "Efficient Long-Text Understanding with Short-Text Models". Transactions of the Association for Computational Linguistics 11 (2023): 284–99. http://dx.doi.org/10.1162/tacl_a_00547.
Almonacid-Olleros, Guillermo, Gabino Almonacid, David Gil, and Javier Medina-Quero. "Evaluation of Transfer Learning and Fine-Tuning to Nowcast Energy Generation of Photovoltaic Systems in Different Climates". Sustainability 14, no. 5 (March 7, 2022): 3092. http://dx.doi.org/10.3390/su14053092.
Lee, Eunchan, Changhyeon Lee, and Sangtae Ahn. "Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models". Applied Sciences 12, no. 9 (April 29, 2022): 4522. http://dx.doi.org/10.3390/app12094522.
Mutreja, G., and K. Bittner. "EVALUATING CONVNET AND TRANSFORMER BASED SELF-SUPERVISED ALGORITHMS FOR BUILDING ROOF FORM CLASSIFICATION". International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W2-2023 (December 13, 2023): 315–21. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w2-2023-315-2023.
Malyala, Sohith Sai, Janardhan Reddy Guntaka, Sai Vignesh Chintala, Lohith Vattikuti, and SrinivasaRao Tummalapalli. "Exploring How AI Answering Models Understand and Respond in Context". International Journal for Research in Applied Science and Engineering Technology 11, no. 9 (September 30, 2023): 224–28. http://dx.doi.org/10.22214/ijraset.2023.55597.
Demircioğlu, Aydin. "Deep Features from Pretrained Networks Do Not Outperform Hand-Crafted Features in Radiomics". Diagnostics 13, no. 20 (October 20, 2023): 3266. http://dx.doi.org/10.3390/diagnostics13203266.
Kotei, Evans, and Ramkumar Thirunavukarasu. "A Systematic Review of Transformer-Based Pre-Trained Language Models through Self-Supervised Learning". Information 14, no. 3 (March 16, 2023): 187. http://dx.doi.org/10.3390/info14030187.
Jackson, Richard G., Erik Jansson, Aron Lagerberg, Elliot Ford, Vladimir Poroshin, Timothy Scrivener, Mats Axelsson, Martin Johansson, Lesly Arun Franco, and Eliseo Papa. "Ablations over transformer models for biomedical relationship extraction". F1000Research 9 (July 16, 2020): 710. http://dx.doi.org/10.12688/f1000research.24552.1.
Sahel, S., M. Alsahafi, M. Alghamdi, and T. Alsubait. "Logo Detection Using Deep Learning with Pretrained CNN Models". Engineering, Technology & Applied Science Research 11, no. 1 (February 6, 2021): 6724–29. http://dx.doi.org/10.48084/etasr.3919.
Jiang, Shengyi, Sihui Fu, Nankai Lin, and Yingwen Fu. "Pretrained models and evaluation data for the Khmer language". Tsinghua Science and Technology 27, no. 4 (August 2022): 709–18. http://dx.doi.org/10.26599/tst.2021.9010060.
Zeng, Zhiyuan, and Deyi Xiong. "Unsupervised and few-shot parsing from pretrained language models". Artificial Intelligence 305 (April 2022): 103665. http://dx.doi.org/10.1016/j.artint.2022.103665.
Saravagi, Deepika, Shweta Agrawal, Manisha Saravagi, Jyotir Moy Chatterjee, and Mohit Agarwal. "Diagnosis of Lumbar Spondylolisthesis Using Optimized Pretrained CNN Models". Computational Intelligence and Neuroscience 2022 (April 13, 2022): 1–12. http://dx.doi.org/10.1155/2022/7459260.
Elazar, Yanai, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Eduard Hovy, Hinrich Schütze, and Yoav Goldberg. "Erratum: Measuring and Improving Consistency in Pretrained Language Models". Transactions of the Association for Computational Linguistics 9 (2021): 1407. http://dx.doi.org/10.1162/tacl_x_00455.
Al-Sarem, Mohammed, Mohammed Al-Asali, Ahmed Yaseen Alqutaibi, and Faisal Saeed. "Enhanced Tooth Region Detection Using Pretrained Deep Learning Models". International Journal of Environmental Research and Public Health 19, no. 22 (November 21, 2022): 15414. http://dx.doi.org/10.3390/ijerph192215414.
Xu, Canwen, and Julian McAuley. "A Survey on Model Compression and Acceleration for Pretrained Language Models". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10566–75. http://dx.doi.org/10.1609/aaai.v37i9.26255.
Lee, Chanhee, Kisu Yang, Taesun Whang, Chanjun Park, Andrew Matteson, and Heuiseok Lim. "Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models". Applied Sciences 11, no. 5 (February 24, 2021): 1974. http://dx.doi.org/10.3390/app11051974.
Zhang, Wenbo, Xiao Li, Yating Yang, Rui Dong, and Gongxu Luo. "Keeping Models Consistent between Pretraining and Translation for Low-Resource Neural Machine Translation". Future Internet 12, no. 12 (November 27, 2020): 215. http://dx.doi.org/10.3390/fi12120215.
Lobo, Fernando, Maily Selena González, Alicia Boto, and José Manuel Pérez de la Lastra. "Prediction of Antifungal Activity of Antimicrobial Peptides by Transfer Learning from Protein Pretrained Models". International Journal of Molecular Sciences 24, no. 12 (June 17, 2023): 10270. http://dx.doi.org/10.3390/ijms241210270.
Zhang, Tianyu, Jake Gu, Omid Ardakanian, and Joyce Kim. "Addressing data inadequacy challenges in personal comfort models by combining pretrained comfort models". Energy and Buildings 264 (June 2022): 112068. http://dx.doi.org/10.1016/j.enbuild.2022.112068.
Yang, Xi, Jiang Bian, William R. Hogan, and Yonghui Wu. "Clinical concept extraction using transformers". Journal of the American Medical Informatics Association 27, no. 12 (October 29, 2020): 1935–42. http://dx.doi.org/10.1093/jamia/ocaa189.
De Coster, Mathieu, and Joni Dambre. "Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation". Information 13, no. 5 (April 23, 2022): 220. http://dx.doi.org/10.3390/info13050220.
AlOyaynaa, Sarah, and Yasser Kotb. "Arabic Grammatical Error Detection Using Transformers-based Pretrained Language Models". ITM Web of Conferences 56 (2023): 04009. http://dx.doi.org/10.1051/itmconf/20235604009.
Kalyan, Katikapalli Subramanyam, Ajit Rajasekharan, and Sivanesan Sangeetha. "AMMU: A survey of transformer-based biomedical pretrained language models". Journal of Biomedical Informatics 126 (February 2022): 103982. http://dx.doi.org/10.1016/j.jbi.2021.103982.
Silver, Tom, Soham Dan, Kavitha Srinivas, Joshua B. Tenenbaum, Leslie Kaelbling, and Michael Katz. "Generalized Planning in PDDL Domains with Pretrained Large Language Models". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 18 (March 24, 2024): 20256–64. http://dx.doi.org/10.1609/aaai.v38i18.30006.
Ahmad, Muhammad Shahrul Zaim, Nor Azlina Ab. Aziz, and Anith Khairunnisa Ghazali. "Development of Automated Attendance System Using Pretrained Deep Learning Models". International Journal on Robotics, Automation and Sciences 6, no. 1 (April 30, 2024): 6–12. http://dx.doi.org/10.33093/ijoras.2024.6.1.2.
Yulianto, Rudy, Faqihudin, Meika Syahbana Rusli, Adhitio Satyo Bayangkari Karno, Widi Hastomo, Aqwam Rosadi Kardian, Vany Terisia, and Tri Surawan. "Innovative UNET-Based Steel Defect Detection Using 5 Pretrained Models". Evergreen 10, no. 4 (December 2023): 2365–78. http://dx.doi.org/10.5109/7160923.
Yin, Yi, Weiming Zhang, Nenghai Yu, and Kejiang Chen. "Steganalysis of neural networks based on parameter statistical bias". Journal of University of Science and Technology of China 52, no. 1 (2022): 1. http://dx.doi.org/10.52396/justc-2021-0197.
AlZahrani, Fetoun Mansour, and Maha Al-Yahya. "A Transformer-Based Approach to Authorship Attribution in Classical Arabic Texts". Applied Sciences 13, no. 12 (June 18, 2023): 7255. http://dx.doi.org/10.3390/app13127255.
Albashish, Dheeb. "Ensemble of adapted convolutional neural networks (CNN) methods for classifying colon histopathological images". PeerJ Computer Science 8 (July 5, 2022): e1031. http://dx.doi.org/10.7717/peerj-cs.1031.
Pan, Yu, Ye Yuan, Yichun Yin, Jiaxin Shi, Zenglin Xu, Ming Zhang, Lifeng Shang, Xin Jiang, and Qun Liu. "Preparing Lessons for Progressive Training on Language Models". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 17 (March 24, 2024): 18860–68. http://dx.doi.org/10.1609/aaai.v38i17.29851.
Anupriya, Anupriya. "Fine-tuning Pretrained Transformers for Sentiment Analysis on Twitter Data". Mathematical Statistician and Engineering Applications 70, no. 2 (February 26, 2021): 1344–52. http://dx.doi.org/10.17762/msea.v70i2.2326.
Zhang, Zhanhao. "The transferability of transfer learning model based on ImageNet for medical image classification tasks". Applied and Computational Engineering 18, no. 1 (October 23, 2023): 143–51. http://dx.doi.org/10.54254/2755-2721/18/20230980.
Anton, Jonah, Liam Castelli, Mun Fai Chan, Mathilde Outters, Wan Hee Tang, Venus Cheung, Pancham Shukla, Rahee Walambe, and Ketan Kotecha. "How Well Do Self-Supervised Models Transfer to Medical Imaging?" Journal of Imaging 8, no. 12 (December 1, 2022): 320. http://dx.doi.org/10.3390/jimaging8120320.
Siahkoohi, Ali, Mathias Louboutin, and Felix J. Herrmann. "The importance of transfer learning in seismic modeling and imaging". GEOPHYSICS 84, no. 6 (November 1, 2019): A47–A52. http://dx.doi.org/10.1190/geo2019-0056.1.
Chen, Die, Hua Zhang, Zeqi Chen, Bo Xie, and Ye Wang. "Comparative Analysis on Alignment-Based and Pretrained Feature Representations for the Identification of DNA-Binding Proteins". Computational and Mathematical Methods in Medicine 2022 (June 28, 2022): 1–14. http://dx.doi.org/10.1155/2022/5847242.