Journal articles on the topic "Low-rank adaptation"
Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles
Consult the top 50 journal articles for your research on the topic "Low-rank adaptation".
An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its online abstract, whenever these are available in the metadata.
Browse journal articles from many different disciplines and compile an accurate bibliography from them.
Yang, Weiqi, and Michael Spece. "Implicit Adaptation to Low Rank Structure in Online Learning". International Journal of Machine Learning and Computing 11, no. 5 (September 2021): 339–44. http://dx.doi.org/10.18178/ijmlc.2021.11.5.1058.
Chen, Yanran. "A concise analysis of low-rank adaptation". Applied and Computational Engineering 42, no. 1 (February 23, 2024): 76–82. http://dx.doi.org/10.54254/2755-2721/42/20230688.
Filatov, N., and M. Kindulov. "Low Rank Adaptation for Stable Domain Adaptation of Vision Transformers". Optical Memory and Neural Networks 32, S2 (November 28, 2023): S277–S283. http://dx.doi.org/10.3103/s1060992x2306005x.
Xu, Bingrong, Jianhua Yin, Cheng Lian, Yixin Su, and Zhigang Zeng. "Low-Rank Optimal Transport for Robust Domain Adaptation". IEEE/CAA Journal of Automatica Sinica 11, no. 7 (July 2024): 1667–80. http://dx.doi.org/10.1109/jas.2024.124344.
Hu, Yahao, Yifei Xie, Tianfeng Wang, Man Chen, and Zhisong Pan. "Structure-Aware Low-Rank Adaptation for Parameter-Efficient Fine-Tuning". Mathematics 11, no. 20 (October 17, 2023): 4317. http://dx.doi.org/10.3390/math11204317.
Li, Wen, Zheng Xu, Dong Xu, Dengxin Dai, and Luc Van Gool. "Domain Generalization and Adaptation Using Low Rank Exemplar SVMs". IEEE Transactions on Pattern Analysis and Machine Intelligence 40, no. 5 (May 1, 2018): 1114–27. http://dx.doi.org/10.1109/tpami.2017.2704624.
Jaech, Aaron, and Mari Ostendorf. "Low-Rank RNN Adaptation for Context-Aware Language Modeling". Transactions of the Association for Computational Linguistics 6 (December 2018): 497–510. http://dx.doi.org/10.1162/tacl_a_00035.
Ruff, Douglas A., Cheng Xue, Lily E. Kramer, Faisal Baqai, and Marlene R. Cohen. "Low rank mechanisms underlying flexible visual representations". Proceedings of the National Academy of Sciences 117, no. 47 (November 23, 2020): 29321–29. http://dx.doi.org/10.1073/pnas.2005797117.
Jeong, Y., and H. S. Kim. "Speaker adaptation using generalised low rank approximations of training matrices". Electronics Letters 46, no. 10 (2010): 724. http://dx.doi.org/10.1049/el.2010.0466.
Kim, Juhyeong, Gyunyeop Kim, and Sangwoo Kang. "Lottery Rank-Pruning Adaptation Parameter Efficient Fine-Tuning". Mathematics 12, no. 23 (November 28, 2024): 3744. http://dx.doi.org/10.3390/math12233744.
Tao, JianWen, Dawei Song, Shiting Wen, and Wenjun Hu. "Robust multi-source adaptation visual classification using supervised low-rank representation". Pattern Recognition 61 (January 2017): 47–65. http://dx.doi.org/10.1016/j.patcog.2016.07.006.
Tao, JianWen, Shiting Wen, and Wenjun Hu. "Robust domain adaptation image classification via sparse and low rank representation". Journal of Visual Communication and Image Representation 33 (November 2015): 134–48. http://dx.doi.org/10.1016/j.jvcir.2015.09.005.
Ren, Chuan-Xian, Xiao-Lin Xu, and Hong Yan. "Generalized Conditional Domain Adaptation: A Causal Perspective With Low-Rank Translators". IEEE Transactions on Cybernetics 50, no. 2 (February 2020): 821–34. http://dx.doi.org/10.1109/tcyb.2018.2874219.
Wu, Hanrui, and Michael K. Ng. "Multiple Graphs and Low-Rank Embedding for Multi-Source Heterogeneous Domain Adaptation". ACM Transactions on Knowledge Discovery from Data 16, no. 4 (August 31, 2022): 1–25. http://dx.doi.org/10.1145/3492804.
Hong, Chaoqun, Zhiqiang Zeng, Rongsheng Xie, Weiwei Zhuang, and Xiaodong Wang. "Domain adaptation with low-rank alignment for weakly supervised hand pose recovery". Signal Processing 142 (January 2018): 223–30. http://dx.doi.org/10.1016/j.sigpro.2017.07.032.
Yang, Liran, Min Men, Yiming Xue, and Ping Zhong. "Low-rank representation-based regularized subspace learning method for unsupervised domain adaptation". Multimedia Tools and Applications 79, no. 3-4 (December 5, 2019): 3031–47. http://dx.doi.org/10.1007/s11042-019-08474-4.
Tao, Jianwen, Haote Xu, and Jianjing Fu. "Low-Rank Constrained Latent Domain Adaptation Co-Regression for Robust Depression Recognition". IEEE Access 7 (2019): 145406–25. http://dx.doi.org/10.1109/access.2019.2944211.
Xiao, Ting, Cangning Fan, Peng Liu, and Hongwei Liu. "Simultaneously Improve Transferability and Discriminability for Adversarial Domain Adaptation". Entropy 24, no. 1 (December 27, 2021): 44. http://dx.doi.org/10.3390/e24010044.
Wang, Mingliang, Daoqiang Zhang, Jiashuang Huang, Pew-Thian Yap, Dinggang Shen, and Mingxia Liu. "Identifying Autism Spectrum Disorder With Multi-Site fMRI via Low-Rank Domain Adaptation". IEEE Transactions on Medical Imaging 39, no. 3 (March 2020): 644–55. http://dx.doi.org/10.1109/tmi.2019.2933160.
Zhu, Chenyang, Lanlan Zhang, Weibin Luo, Guangqi Jiang, and Qian Wang. "Tensorial multiview low-rank high-order graph learning for context-enhanced domain adaptation". Neural Networks 181 (January 2025): 106859. http://dx.doi.org/10.1016/j.neunet.2024.106859.
Trust, Paul, and Rosane Minghim. "A Study on Text Classification in the Age of Large Language Models". Machine Learning and Knowledge Extraction 6, no. 4 (November 21, 2024): 2688–721. http://dx.doi.org/10.3390/make6040129.
Le, Khoi M., Trinh Pham, Tho Quan, and Anh Tuan Luu. "LAMPAT: Low-Rank Adaption for Multilingual Paraphrasing Using Adversarial Training". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 16 (March 24, 2024): 18435–43. http://dx.doi.org/10.1609/aaai.v38i16.29804.
Zdunek, Rafał, and Tomasz Sadowski. "Image Completion with Hybrid Interpolation in Tensor Representation". Applied Sciences 10, no. 3 (January 22, 2020): 797. http://dx.doi.org/10.3390/app10030797.
Mavaddaty, Samira, Seyed Mohammad Ahadi, and Sanaz Seyedin. "A novel speech enhancement method by learnable sparse and low-rank decomposition and domain adaptation". Speech Communication 76 (February 2016): 42–60. http://dx.doi.org/10.1016/j.specom.2015.11.003.
Hong, Zhenchen, Jingwei Xiong, Han Yang, and Yu K. Mo. "Lightweight Low-Rank Adaptation Vision Transformer Framework for Cervical Cancer Detection and Cervix Type Classification". Bioengineering 11, no. 5 (May 8, 2024): 468. http://dx.doi.org/10.3390/bioengineering11050468.
Hu, Yaopeng. "Optimizing e-commerce recommendation systems through conditional image generation: Merging LoRA and cGANs for improved performance". Applied and Computational Engineering 32, no. 1 (January 22, 2024): 177–84. http://dx.doi.org/10.54254/2755-2721/32/20230207.
Tatianchenko, Natalia Petrovna. "Psychological conditions for the formation of adaptation potential of an individual in the learning process". Психология и Психотехника, no. 1 (January 2021): 62–77. http://dx.doi.org/10.7256/2454-0722.2021.1.32485.
Yan, Chaokun, Haicao Yan, Wenjuan Liang, Menghan Yin, Huimin Luo, and Junwei Luo. "DP-SSLoRA: A privacy-preserving medical classification model combining differential privacy with self-supervised low-rank adaptation". Computers in Biology and Medicine 179 (September 2024): 108792. http://dx.doi.org/10.1016/j.compbiomed.2024.108792.
Hong, Yang, Xiaowei Zhou, Ruzhuang Hua, Qingxuan Lv, and Junyu Dong. "WaterSAM: Adapting SAM for Underwater Object Segmentation". Journal of Marine Science and Engineering 12, no. 9 (September 11, 2024): 1616. http://dx.doi.org/10.3390/jmse12091616.
Ica Wahyuni, Nonok Karlina, and Citra Setyo Dwi Andhini. "Correlation Of Self Efficacy With Stress Adaptation On Chronic Kidney Failure Patients Hemodialysis In Waled General Hospital Cirebon District". Jurnal Kesehatan Mahardika 6, no. 2 (September 1, 2019): 12–16. http://dx.doi.org/10.54867/jkm.v6i2.41.
Tian, Qing, and Canyu Sun. "Structure preserved ordinal unsupervised domain adaptation". Electronic Research Archive 32, no. 11 (2024): 6338–63. http://dx.doi.org/10.3934/era.2024295.
Yashchenko, Elena Fedorovna, Ekaterina Galiulovna Shchelokova, and Olga Vasilievna Lazorak. "PERSONAL FEATURES OF FOREIGN STUDENTS WITH A HIGH AND LOW LEVEL OF SELF-ACTUALIZATION DURING SOCIO-PSYCHOLOGICAL ADAPTATION". Психология. Психофизиология 13, no. 2 (July 20, 2020): 62–75. http://dx.doi.org/10.14529/jpps200206.
Hou, Zejiang, Julian Salazar, and George Polovets. "Meta-Learning the Difference: Preparing Large Language Models for Efficient Adaptation". Transactions of the Association for Computational Linguistics 10 (2022): 1249–65. http://dx.doi.org/10.1162/tacl_a_00517.
Shi, Qian, Bo Du, and Liangpei Zhang. "Domain Adaptation for Remote Sensing Image Classification: A Low-Rank Reconstruction and Instance Weighting Label Propagation Inspired Algorithm". IEEE Transactions on Geoscience and Remote Sensing 53, no. 10 (October 2015): 5677–89. http://dx.doi.org/10.1109/tgrs.2015.2427791.
Utomo, Hanung Addi Chandra, Yuris Mulya Saputra, and Agi Prasetiadi. "Implementasi Sistem Konfigurasi Router Berbasis Natural Language Processing dengan Pendekatan Low Rank Adaptation Finetuning dan 8-Bit Quantization". Journal of Internet and Software Engineering 4, no. 2 (December 1, 2023): 1–7. http://dx.doi.org/10.22146/jise.v4i2.9093.
Kashina, Yuliya V., Irina L. Cherednik, and Svetlana V. Polishchuk. "Students’ index of adaptation to the educational process depending on the personality type". Journal of Medical and Biological Research, no. 3 (October 10, 2022): 213–20. http://dx.doi.org/10.37482/2687-1491-z108.
Shumakov, Vadim Anatolevich, Darya Aleksandrovna Dubrovina, and Anna Vladimirovna Platonova. "SOCIAL AND PSYCHOLOGICAL ADAPTATION OF YOUNGER SCHOOLCHILDREN TO THE LEARNING ENVIRONMENT AS A FACTOR OF THEIR EMOTIONAL WELL-BEING". Психология. Психофизиология 12, no. 4 (January 15, 2020): 63–70. http://dx.doi.org/10.14529/jpps190407.
Martini, Luca, Saverio Iacono, Daniele Zolezzi, and Gianni Viardo Vercelli. "Advancing Persistent Character Generation: Comparative Analysis of Fine-Tuning Techniques for Diffusion Models". AI 5, no. 4 (September 29, 2024): 1779–92. http://dx.doi.org/10.3390/ai5040088.
Mahendra, Anton, and Styawati Styawati. "Implementasi Lowk-Rank Adaptation of Large Langauage Model (LoRA) Untuk Effisiensi Large Language Model". JIPI (Jurnal Ilmiah Penelitian dan Pembelajaran Informatika) 9, no. 4 (November 19, 2024): 1881–90. https://doi.org/10.29100/jipi.v9i4.5519.
Arian, Md Sahadul Hasan, Faisal Ahmed Sifat, Saif Ahmed, Nabeel Mohammed, Taseef Hasan Farook, and James Dudley. "Dental Loop Chatbot: A Prototype Large Language Model Framework for Dentistry". Software 3, no. 4 (December 17, 2024): 587–94. https://doi.org/10.3390/software3040029.
Wu, Haokun. "Large language models capsule: A research analysis of In-Context Learning (ICL) and Parameter-Efficient Fine-Tuning (PEFT) methods". Applied and Computational Engineering 43, no. 1 (February 26, 2024): 327–31. http://dx.doi.org/10.54254/2755-2721/43/20230858.
Adams, Henry, Lara Kassab, and Deanna Needell. "An adaptation for iterative structured matrix completion". Foundations of Data Science 3, no. 4 (2021): 769. http://dx.doi.org/10.3934/fods.2021028.
Eker, Oktay, Murat Avcı, Selen Çiğdem, Oğuzhan Özdemir, Fatih Nar, and Dmitry Kudinov. "Integrating SAM and LoRA for DSM-Based Planar Region Extraction in Building Footprints". International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-4/W10-2024 (May 31, 2024): 57–64. http://dx.doi.org/10.5194/isprs-archives-xlviii-4-w10-2024-57-2024.
Shvyrov, V. V., D. A. Kapustin, R. N. Sentyay, and T. I. Shulika. "Using Large Language Models to Classify Some Vulnerabilities in Program Code". Programmnaya Ingeneria 15, no. 9 (September 9, 2024): 465–75. http://dx.doi.org/10.17587/prin.15.465-475.
Kim, Sanghyeon, Hyunmo Yang, Younghyun Kim, Youngjoon Hong, and Eunbyung Park. "Corrigendum to “Hydra: Multi-head Low-rank Adaptation for Parameter Efficient Fine-tuning” [Neural Networks Volume 178, October (2024), 1-11/106414]". Neural Networks 181 (January 2025): 106878. http://dx.doi.org/10.1016/j.neunet.2024.106878.
Cheng, Yuxi, Yang Song, Yi Liu, Hui Zhang, and Feng Liu. "High-Performance Binocular Disparity Prediction Algorithm for Edge Computing". Sensors 24, no. 14 (July 14, 2024): 4563. http://dx.doi.org/10.3390/s24144563.
Mitrofanov, Igor. "SOCIO-PSYCHOLOGICAL ADAPTATION IN ADOLESCENTS WITH INTERNET-DEPENDENT BEHAVIOR". Child in a Digital World 1, no. 1 (2023): 64. http://dx.doi.org/10.61365/forum.2023.049.
Bazi, Yakoub, Laila Bashmal, Mohamad Mahmoud Al Rahhal, Riccardo Ricci, and Farid Melgani. "RS-LLaVA: A Large Vision-Language Model for Joint Captioning and Question Answering in Remote Sensing Imagery". Remote Sensing 16, no. 9 (April 23, 2024): 1477. http://dx.doi.org/10.3390/rs16091477.
Hu, Haotian, Alex Jie Yang, Sanhong Deng, Dongbo Wang, Min Song, and Si Shen. "A Generative Drug–Drug Interaction Triplets Extraction Framework Based on Large Language Models". Proceedings of the Association for Information Science and Technology 60, no. 1 (October 2023): 980–82. http://dx.doi.org/10.1002/pra2.918.
Makaricheva, Elvira V., and Maria S. Burguvan. "Specificity and dynamics of psychological adaptation during the COVID-19 pandemic". Neurology Bulletin LIV, no. 2 (July 19, 2022): 23–32. http://dx.doi.org/10.17816/nb106247.