Academic literature on the topic 'Learning with Limited Data'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Learning with Limited Data.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Learning with Limited Data"
Oh, Se Eun, Nate Mathews, Mohammad Saidur Rahman, Matthew Wright, and Nicholas Hopper. "GANDaLF: GAN for Data-Limited Fingerprinting." Proceedings on Privacy Enhancing Technologies 2021, no. 2 (January 29, 2021): 305–22. http://dx.doi.org/10.2478/popets-2021-0029.
Triantafillou, Sofia, and Greg Cooper. "Learning Adjustment Sets from Observational and Limited Experimental Data." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 11 (May 18, 2021): 9940–48. http://dx.doi.org/10.1609/aaai.v35i11.17194.
Zhao, Yao, Dong Joo Rhee, Carlos Cardenas, Laurence E. Court, and Jinzhong Yang. "Training deep-learning segmentation models from severely limited data." Medical Physics 48, no. 4 (February 19, 2021): 1697–706. http://dx.doi.org/10.1002/mp.14728.
Kim, Minjeong, Yujung Gil, Yuyeon Kim, and Jihie Kim. "Deep-Learning-Based Scalp Image Analysis Using Limited Data." Electronics 12, no. 6 (March 14, 2023): 1380. http://dx.doi.org/10.3390/electronics12061380.
Chen, Jiaao, Derek Tam, Colin Raffel, Mohit Bansal, and Diyi Yang. "An Empirical Survey of Data Augmentation for Limited Data Learning in NLP." Transactions of the Association for Computational Linguistics 11 (2023): 191–211. http://dx.doi.org/10.1162/tacl_a_00542.
Han, Te, Chao Liu, Rui Wu, and Dongxiang Jiang. "Deep transfer learning with limited data for machinery fault diagnosis." Applied Soft Computing 103 (May 2021): 107150. http://dx.doi.org/10.1016/j.asoc.2021.107150.
Ji, Xuefei, Jue Wang, Ye Li, Qiang Sun, Shi Jin, and Tony Q. S. Quek. "Data-Limited Modulation Classification With a CVAE-Enhanced Learning Model." IEEE Communications Letters 24, no. 10 (October 2020): 2191–95. http://dx.doi.org/10.1109/lcomm.2020.3004877.
Forestier, Germain, and Cédric Wemmert. "Semi-supervised learning using multiple clusterings with limited labeled data." Information Sciences 361-362 (September 2016): 48–65. http://dx.doi.org/10.1016/j.ins.2016.04.040.
Wen, Jiahui, and Zhiying Wang. "Learning general model for activity recognition with limited labelled data." Expert Systems with Applications 74 (May 2017): 19–28. http://dx.doi.org/10.1016/j.eswa.2017.01.002.
Zhang, Ansi, Shaobo Li, Yuxin Cui, Wanli Yang, Rongzhi Dong, and Jianjun Hu. "Limited Data Rolling Bearing Fault Diagnosis With Few-Shot Learning." IEEE Access 7 (2019): 110895–904. http://dx.doi.org/10.1109/access.2019.2934233.
Dissertations / Theses on the topic "Learning with Limited Data"
Chen, Si. "Active Learning Under Limited Interaction with Data Labeler." Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/104894.
M.S.
Machine Learning (ML) has achieved huge success in recent years. ML technologies such as recommendation systems, speech recognition, and image recognition play an important role in daily life. This success is mainly built upon the use of large amounts of labeled data: compared with traditional programming, an ML algorithm does not rely on explicit instructions from a human; instead, it takes data along with labels as input and aims to learn, by itself, a function that correctly maps data to the label space. However, data labeling requires human effort and can be time-consuming and expensive, especially for datasets that require domain-specific knowledge (e.g., disease prediction). Active Learning (AL) is one solution for reducing the data labeling effort. Specifically, the learning algorithm actively selects the data points that provide the most information to the model, so a better model can be achieved with less labeled data. While traditional AL strategies do achieve good performance, they require a small amount of labeled data for initialization and perform data selection over multiple rounds, which poses a great challenge to their application: there is often no platform that provides timely online interaction with the data labeler, and the interaction is frequently time-inefficient. To deal with these limitations, we first propose DULO, in which a new AL setting is studied: data selection is allowed to be performed only once. To further broaden the applicability of our method, we propose D²ULO, which builds upon DULO and domain adaptation techniques to avoid the use of initial labeled data. Our experiments show that both proposed frameworks achieve better performance than state-of-the-art baselines.
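For readers new to active learning, the core idea sketched in this abstract, selecting the most informative points to label under a constrained interaction budget, can be illustrated with a minimal one-round example. The entropy criterion, synthetic data, and scikit-learn model below are assumptions chosen only for illustration; this is not the DULO or D²ULO algorithm from the thesis.

```python
# Illustrative sketch of one-round active data selection by predictive uncertainty.
# NOT the DULO/D2ULO algorithm from the cited thesis; the entropy criterion,
# synthetic data, and scikit-learn model are assumptions made for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic two-class data: a small labeled seed set and a large unlabeled pool.
X_seed = rng.normal(size=(20, 5))
y_seed = (X_seed[:, 0] + 0.5 * X_seed[:, 1] > 0).astype(int)
X_pool = rng.normal(size=(1000, 5))

# Fit a cheap proxy model on the seed data.
model = LogisticRegression().fit(X_seed, y_seed)

# Score every pool point by predictive entropy (higher = more uncertain).
proba = model.predict_proba(X_pool)
entropy = -np.sum(proba * np.log(proba + 1e-12), axis=1)

# Select the k most uncertain points in a single round and send them for labeling.
k = 50
query_idx = np.argsort(entropy)[-k:]
print("indices to label:", query_idx[:10], "...")
```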
Dvornik, Mikita. "Learning with Limited Annotated Data for Visual Understanding." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM050.
The ability of deep-learning methods to excel in computer vision depends heavily on the amount of annotated data available for training. For some tasks, annotation may be too costly and labor intensive, thus becoming the main obstacle to better accuracy. Algorithms that learn from data automatically, without human supervision, perform substantially worse than their fully-supervised counterparts. Thus, there is a strong motivation to work on effective methods for learning with limited annotations. This thesis proposes to exploit prior knowledge about the task and develops more effective solutions for scene understanding and few-shot image classification. The main challenges of scene understanding include object detection, semantic and instance segmentation. All these tasks aim at recognizing and localizing objects, at the region level or the more precise pixel level, which makes the annotation process difficult. The first contribution of this manuscript is a Convolutional Neural Network (CNN) that performs both object detection and semantic segmentation. We design a specialized network architecture that is trained to solve both problems in one forward pass and operates in real time. Thanks to the multi-task training procedure, both tasks benefit from each other in terms of accuracy, with no extra labeled data. The second contribution introduces a new technique for data augmentation, i.e., artificially increasing the amount of training data. It aims at creating new scenes by copy-pasting objects from one image to another within a given dataset. Placing an object in the right context was found to be crucial in order to improve scene understanding performance. We propose to model visual context explicitly using a CNN that discovers correlations between object categories and their typical neighborhood, and then proposes realistic locations for augmentation. Overall, pasting objects in "right" locations improves object detection and segmentation performance, with higher gains in limited annotation scenarios. For some problems, the data is extremely scarce, and an algorithm has to learn new concepts from a handful of examples. Few-shot classification consists of learning a predictive model that is able to adapt effectively to a new class, given only a few annotated samples. While most current methods concentrate on the adaptation mechanism, few works have tackled the problem of scarce training data explicitly. In our third contribution, we show that by addressing the fundamental high-variance issue of few-shot learning classifiers, it is possible to significantly outperform more sophisticated existing techniques. Our approach consists of designing an ensemble of deep networks to leverage the variance of the classifiers, and introducing new strategies to encourage the networks to cooperate while encouraging prediction diversity. By matching different networks' outputs on similar input images, we improve model accuracy and robustness compared to classical ensemble training. Moreover, a single network obtained by distillation shows performance similar to that of the full ensemble and yields state-of-the-art results with no computational overhead at test time.
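The copy-paste augmentation idea summarized in this abstract can be sketched in a few lines: an object crop and its binary mask are blended into a target scene at a chosen position. The random placement and toy arrays below are assumptions for brevity; the thesis instead ranks candidate locations with a context CNN.

```python
# Illustrative sketch of copy-paste data augmentation: an object crop is blended into a
# target image where its binary mask is set. The random placement below is an assumption
# for brevity; the cited thesis predicts plausible locations with a context model instead.
import numpy as np

rng = np.random.default_rng(0)

def paste_object(target, crop, mask, top, left):
    """Paste `crop` (h, w, 3) into `target` at (top, left) where `mask` (h, w) is 1."""
    h, w = mask.shape
    region = target[top:top + h, left:left + w]
    region[mask.astype(bool)] = crop[mask.astype(bool)]
    return target

# Toy images: a 128x128 target scene and a 32x32 object crop with a circular mask.
target = rng.integers(0, 255, size=(128, 128, 3), dtype=np.uint8)
crop = rng.integers(0, 255, size=(32, 32, 3), dtype=np.uint8)
yy, xx = np.mgrid[:32, :32]
mask = ((yy - 16) ** 2 + (xx - 16) ** 2 < 14 ** 2).astype(np.uint8)

# Context-free random placement; a context model would score candidate positions instead.
top = rng.integers(0, 128 - 32)
left = rng.integers(0, 128 - 32)
augmented = paste_object(target.copy(), crop, mask, top, left)
print("augmented image shape:", augmented.shape)
```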
Moskvyak, Olga. "Learning from limited annotated data for re-identification problem." Thesis, Queensland University of Technology, 2021. https://eprints.qut.edu.au/226866/1/Olga_Moskvyak_Thesis.pdf.
Xian, Yongqin [Verfasser]. "Learning from limited labeled data - Zero-Shot and Few-Shot Learning / Yongqin Xian." Saarbrücken: Saarländische Universitäts- und Landesbibliothek, 2020. http://d-nb.info/1219904457/34.
Eriksson, Håkan. "Clustering Generic Log Files Under Limited Data Assumptions." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-189642.
Complex computer systems are often prone to anomalous or erroneous behavior, which can lead to costly downtime while the systems are diagnosed and repaired. One source of information for fault diagnosis is log files, which are often generated in large quantities and of various types. Given the size and semi-structured appearance of the log files, a manual analysis is unreasonable to carry out. Some automation is desirable to sift through the log files so that the source of the faults and anomalies becomes easier to discover. This project aimed to develop a general algorithm that can cluster heterogeneous log files in accordance with domain expertise. The results show that the algorithm performs well in agreement with manual clustering, even with fewer assumptions about the data.
Boman, Jimmy. "A deep learning approach to defect detection with limited data availability." Thesis, Umeå universitet, Institutionen för fysik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-173207.
Guo, Zhenyu. "Data famine in big data era: machine learning algorithms for visual object recognition with limited training data." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/46412.
Ayllon Clemente, Irene [Verfasser]. "Towards natural speech acquisition: incremental word learning with limited data / Irene Ayllon Clemente." Bielefeld: Universitätsbibliothek Bielefeld, 2013. http://d-nb.info/1077063458/34.
Chang, Fengming. "Learning accuracy from limited data using mega-fuzzification method to improve small data set learning accuracy for early flexible manufacturing system scheduling." Saarbrücken: VDM Verlag Dr. Müller, 2005. http://d-nb.info/989267156/04.
Tania, Zannatun Nayem. "Machine Learning with Reconfigurable Privacy on Resource-Limited Edge Computing Devices." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292105.
Distributed computing enables efficient data storage, processing, and retrieval, but it introduces security and privacy concerns. Sensors are the cornerstone of IoT-based pipelines, as they constantly collect data until it can be analyzed on central cloud resources. These sensor nodes, however, are often constrained by limited resources. Ideally, it is desirable to make all collected data features private, but due to resource limitations this may not always be possible. Making all features private can cause over-utilization of resources, which in turn would affect the performance of the whole system. In this thesis, we design and implement a system that can find the optimal set of data features to make private, given the constraints of the device resources and the desired performance or accuracy of the system. Using generalization techniques for data anonymization, we create user-defined injectable privacy-encoding functions to make each feature in the dataset private. Regardless of resource availability, some data features are defined by the user as essential features to make private. All other data features that may pose a privacy threat are called the non-essential features. We propose Dynamic Iterative Greedy Search (DIGS), a greedy search algorithm that takes the resource consumption of each non-essential feature as input and provides the most optimal set of non-essential features that can be made private given the available resources. The most optimal set contains the features that consume the least resources. We evaluate our system on a Fitbit dataset containing 17 data features, of which 4 are essential private features for a given classification application. Our results show that we can offer 9 additional private features besides the 4 essential features of the Fitbit dataset, which contains 1663 records. Furthermore, we can save 26.21% memory compared to making all features private. We also test our method on a larger dataset generated with a Generative Adversarial Network (GAN). The chosen edge device, a Raspberry Pi, cannot however accommodate the size of the large dataset due to insufficient resources. Our evaluations with 1/8th of the GAN dataset result in 3 additional private features with up to 62.74% memory savings compared to making all data features private. Maintaining privacy not only requires additional resources but also has consequences for the performance of the designed applications. However, we find that privacy encoding has a positive impact on the accuracy of the classification model for our chosen classification application.
Books on the topic "Learning with Limited Data"
Zamzmi, Ghada, Sameer Antani, Ulas Bagci, Marius George Linguraru, Sivaramakrishnan Rajaraman, and Zhiyun Xue, eds. Medical Image Learning with Limited and Noisy Data. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-16760-7.
Xue, Zhiyun, Sameer Antani, Ghada Zamzmi, Feng Yang, Sivaramakrishnan Rajaraman, Sharon Xiaolei Huang, Marius George Linguraru, and Zhaohui Liang, eds. Medical Image Learning with Limited and Noisy Data. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44917-8.
Fisher, Doug, and Hans-J. Lenz, eds. Learning from Data. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-2404-4.
Big learning data. Alexandria, VA: ASTD Press, 2014.
Hartemink, Alfred E., Alex McBratney, and Maria de Lourdes Mendonça-Santos, eds. Digital Soil Mapping with Limited Data. Dordrecht: Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-8592-5.
Hartemink, Alfred E., A. B. McBratney, and Maria de Lourdes Mendonça-Santos, eds. Digital soil mapping with limited data. Dordrecht: Springer, 2008.
Velleman, Paul F. Learning data analysis with Data desk. New York: W.H. Freeman, 1993.
Learning data analysis with Data desk. New York: W.H. Freeman, 1989.
VNU Entertainment Media UK Limited and Book Data Limited: A report on the acquisition by VNU Entertainment Media UK Limited of Book Data Limited. London: Stationery Office, 2003.
Dean, Jared. Big Data, Data Mining, and Machine Learning. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2014. http://dx.doi.org/10.1002/9781118691786.
Book chapters on the topic "Learning with Limited Data"
Bennett, James, Kitty Kautzer, and Leila Casteel. "Analyzing question items with limited data." In Data Analytics and Adaptive Learning, 230–41. New York: Routledge, 2023. http://dx.doi.org/10.4324/9781003244271-16.
He, Xiangyu, and Jian Cheng. "Learning Compression from Limited Unlabeled Data." In Computer Vision – ECCV 2018, 778–95. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01246-5_46.
Chen, Shurui, Yufu Chen, Yuyin Lu, Yanghui Rao, Haoran Xie, and Qing Li. "Chinese Word Embedding Learning with Limited Data." In Web and Big Data, 211–26. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-85896-4_18.
Wang, Li-C. "Learning from Limited Data in VLSI CAD." In Machine Learning in VLSI Computer-Aided Design, 375–99. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-04666-8_13.
Chan, Yung-Chieh, Jerry Zhang, Katie Frizzi, Nigel Calcutt, and Garrison Cottrell. "Automated Skin Biopsy Analysis with Limited Data." In Medical Image Learning with Limited and Noisy Data, 229–38. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-16760-7_22.
Ramlan, Fitria Wulandari, and James McDermott. "Genetic Programming with Synthetic Data for Interpretable Regression Modelling and Limited Data." In Machine Learning, Optimization, and Data Science, 142–57. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-53969-5_12.
Vu, Tu Thanh, Giang Binh Tran, and Son Bao Pham. "Learning to Simplify Children Stories with Limited Data." In Intelligent Information and Database Systems, 31–41. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-05476-6_4.
Nguyen, Minh-Tien, Viet-Anh Phan, Le Thai Linh, Nguyen Hong Son, Le Tien Dung, Miku Hirano, and Hajime Hotta. "Transfer Learning for Information Extraction with Limited Data." In Communications in Computer and Information Science, 469–82. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-6168-9_38.
Jain, Sanjay, and Efim Kinber. "On Learning Languages from Positive Data and a Limited Number of Short Counterexamples." In Learning Theory, 259–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11776420_21.
Liu, Alex X., and Rui Li. "Differentially Private and Budget Limited Bandit Learning over Matroids." In Algorithms for Data and Computation Privacy, 347–82. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58896-0_13.
Conference papers on the topic "Learning with Limited Data"
Malaviya, Maya, Ilia Sucholutsky, and Thomas L. Griffiths. "Pushing the Limits of Learning from Limited Data." In 2023 Conference on Cognitive Computational Neuroscience. Oxford, United Kingdom: Cognitive Computational Neuroscience, 2023. http://dx.doi.org/10.32470/ccn.2023.1583-0.
Yang, Diyi, Ankur Parikh, and Colin Raffel. "Learning with Limited Text Data." In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.acl-tutorials.5.
Khoshgoftaar, Taghi M., Chris Seiffert, Jason Van Hulse, Amri Napolitano, and Andres Folleco. "Learning with limited minority class data." In Sixth International Conference on Machine Learning and Applications (ICMLA 2007). IEEE, 2007. http://dx.doi.org/10.1109/icmla.2007.76.
Chang, Shiyu, Charu C. Aggarwal, and Thomas S. Huang. "Learning Local Semantic Distances with Limited Supervision." In 2014 IEEE International Conference on Data Mining (ICDM). IEEE, 2014. http://dx.doi.org/10.1109/icdm.2014.114.
Self, Ryan, S. M. Nahid Mahmud, Katrine Hareland, and Rushikesh Kamalapurkar. "Online inverse reinforcement learning with limited data." In 2020 59th IEEE Conference on Decision and Control (CDC). IEEE, 2020. http://dx.doi.org/10.1109/cdc42340.2020.9303883.
Chen, Hanlin, and Peng Cao. "Deep Learning Based Data Augmentation and Classification for Limited Medical Data Learning." In 2019 IEEE International Conference on Power, Intelligent Computing and Systems (ICPICS). IEEE, 2019. http://dx.doi.org/10.1109/icpics47731.2019.8942411.
Lee, Kyungjae, Sunghyun Park, Hojae Han, Jinyoung Yeo, Seung-won Hwang, and Juho Lee. "Learning with Limited Data for Multilingual Reading Comprehension." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-1283.
Liu, Feng, Fengzhan Tian, and Qiliang Zhu. "Ensembling Bayesian network structure learning on limited data." In the sixteenth ACM conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1321440.1321577.
Mitchell, Frost, Aniqua Baset, Neal Patwari, Sneha Kumar Kasera, and Aditya Bhaskara. "Deep Learning-based Localization in Limited Data Regimes." In WiSec '22: 15th ACM Conference on Security and Privacy in Wireless and Mobile Networks. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3522783.3529529.
Iosifidis, Vasileios, and Eirini Ntoutsi. "Large Scale Sentiment Learning with Limited Labels." In KDD '17: The 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3097983.3098159.
Reports on the topic "Learning with Limited Data"
Safta, Cosmin, Kookjin Lee, and Jaideep Ray. Predictive Skill of Deep Learning Models Trained on Limited Sequence Data. Office of Scientific and Technical Information (OSTI), October 2020. http://dx.doi.org/10.2172/1688570.
Rhoades, Alan, and Ankur Mahesh. Machine learning to generate gridded extreme precipitation data sets for global land areas with limited in situ measurements. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769784.
Cassity, Elizabeth, and Debbie Wong. Teacher development multi-year studies. Insights on the challenges of data availability for measuring and reporting on student learning outcomes. Australian Council for Educational Research, 2022. http://dx.doi.org/10.37517/978-1-74286-677-2.
Sukumar, Sreenivas R., and Carlos Emilio Del-Castillo-Negrete. Machine Learning for Big Data: A Study to Understand Limits at Scale. Office of Scientific and Technical Information (OSTI), December 2015. http://dx.doi.org/10.2172/1234336.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Ogenyi, Moses. Looking back on Nigeria's COVID-19 School Closures: Effects of Parental Investments on Learning Outcomes and Avoidance of Hysteresis in Education. Research on Improving Systems of Education (RISE), March 2022. http://dx.doi.org/10.35489/bsg-rise-ri_2022/040.
Asgedom, Amare, Shelby Carvalho, and Pauline Rose. Negotiating Equity: Examining Priorities, Ownership, and Politics Shaping Ethiopia's Large-Scale Education Reforms for Equitable Learning. Research on Improving Systems of Education (RISE), March 2020. http://dx.doi.org/10.35489/bsg-rise-wp_2021/067.
Bergeron, Augustin, Arnaud Fournier, John Kabeya Kabeya, Gabriel Tourek, and Jonathan L. Weigel. Using Machine Learning to Create a Property Tax Roll: Evidence from the City of Kananga, DR Congo. Institute of Development Studies, October 2023. http://dx.doi.org/10.19088/ictd.2023.053.
Mahat, Marian, Vivienne Awad, Christopher Bradbeer, Chengxin Guo, Wesley Imms, and Julia Morris. Furniture for Engagement. University of Melbourne, February 2023. http://dx.doi.org/10.46580/124374.
Quak, Evert-Jan. K4D's Work on the Indirect Impacts of COVID-19 in Low- and Middle-Income Countries. Institute of Development Studies (IDS), June 2021. http://dx.doi.org/10.19088/k4d.2021.093.