Academic literature on the topic 'SOFTWARE DEFECT REPORTS'
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'SOFTWARE DEFECT REPORTS.'
Journal articles on the topic "SOFTWARE DEFECT REPORTS"
Jindal, Rajni, Ruchika Malhotra, and Abha Jain. "Predicting Software Maintenance Effort by Mining Software Project Reports Using Inter-Version Validation." International Journal of Reliability, Quality and Safety Engineering 23, no. 06 (December 2016): 1640009. http://dx.doi.org/10.1142/s021853931640009x.
Malhotra, Ruchika, Nidhi Kapoor, Rishabh Jain, and Sahaj Biyani. "Severity Assessment of Software Defect Reports using Text Classification." International Journal of Computer Applications 83, no. 11 (December 18, 2013): 13–16. http://dx.doi.org/10.5120/14492-2622.
Jindal, Rajni, Ruchika Malhotra, and Abha Jain. "Prediction of defect severity by mining software project reports." International Journal of System Assurance Engineering and Management 8, no. 2 (March 10, 2016): 334–51. http://dx.doi.org/10.1007/s13198-016-0438-y.
Marappan, Shanmugasundaram, Archana Kollu, Ismail Keshta, Shehab Mohamed Beram, Sahil Bhende, and Karthikeyan Kaliyaperumal. "An Optimized Systematic Approach to Identify Bugs in Cloud-Based Software." Scientific Programming 2022 (September 15, 2022): 1–10. http://dx.doi.org/10.1155/2022/2302027.
Mellegard, Niklas, Hakan Burden, Daniel Levin, Kenneth Lind, and Ana Magazinius. "Contrasting Big Bang With Continuous Integration Through Defect Reports." IEEE Software 37, no. 3 (May 2020): 14–20. http://dx.doi.org/10.1109/ms.2018.2880822.
Sultan, Torky, Ayman E. Khedr, and Mostafa Sayed. "A Proposed Defect Tracking Model for Classifying the Inserted Defect Reports to Enhance Software Quality Control." International Journal of Computer Applications 67, no. 14 (April 18, 2013): 1–7. http://dx.doi.org/10.5120/11460-7068.
Sultan, Torky, Ayman Khedr, and Mostafa Sayed. "A Proposed Defect Tracking Model for Classifying the Inserted Defect Reports to Enhance Software Quality Control." Acta Informatica Medica 21, no. 2 (2013): 103. http://dx.doi.org/10.5455/aim.2013.21.103-108.
Yadla, Suresh, Jane Huffman Hayes, and Alex Dekhtyar. "Tracing requirements to defect reports: an application of information retrieval techniques." Innovations in Systems and Software Engineering 1, no. 2 (July 29, 2005): 116–24. http://dx.doi.org/10.1007/s11334-005-0011-3.
Pipitone, J., and S. Easterbrook. "Assessing climate model software quality: a defect density analysis of three models." Geoscientific Model Development 5, no. 4 (August 9, 2012): 1009–22. http://dx.doi.org/10.5194/gmd-5-1009-2012.
Pipitone, J., and S. Easterbrook. "Assessing climate model software quality: a defect density analysis of three models." Geoscientific Model Development Discussions 5, no. 1 (February 15, 2012): 347–82. http://dx.doi.org/10.5194/gmdd-5-347-2012.
Dissertations / Theses on the topic "SOFTWARE DEFECT REPORTS"
Ye, Xin. "Automated Software Defect Localization." Ohio University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1462374079.
Full textCAVALCANTI, Diego Tavares. "Estudo do uso de vocabulários para analisar o impacto de relatórios de defeitos a código-fonte." Universidade Federal de Campina Grande, 2012. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1839.
Full textMade available in DSpace on 2018-09-28T14:01:43Z (GMT). No. of bitstreams: 1 DIEGO TAVARES CAVALCANTI - DISSERTAÇÃO PPGCC 2012..pdf: 11733349 bytes, checksum: 59909ce95d6ea71dea6e9686d3d20c33 (MD5) Previous issue date: 2012-11-26
Locating and fixing bugs described in bug reports are routine tasks in software development processes. A major effort must be undertaken to successfully locate the (possibly faulty) entities in the code that must be worked on. Generally, developers map bug reports to code through manual reading and inspection of both bug reports and the code itself. In practice, they must rely on their knowledge about the software architecture and design to perform the mapping in an efficient and effective way. However, it is well known that architectural and design knowledge is spread out among developers. Hence, the success of such a task depends directly on choosing the right developer. In this work, we present results of an empirical study we performed to evaluate whether the automated analysis of bug report and software vocabularies can be helpful in the task of locating bugs. We conducted our study on eight versions of six mature Java open-source projects that use Bugzilla and JIRA as bug tracking systems. In our study, we used Information Retrieval techniques to assess the similarity of bug report and code entity vocabularies. For each bug report, we ranked all code entities according to the measured similarity. Our results indicate that vocabularies are indeed a valuable source of information that can be used to narrow down the bug-locating task. For all the studied systems, considering vocabulary similarity only, a Top 8% list of entities contains about 75% of the target entities. We conclude that while vocabularies cannot be the sole source of information, they can certainly improve results if combined with other techniques.
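The vocabulary-similarity ranking described in the abstract above can be sketched as a small Information Retrieval pipeline: build TF-IDF vectors for each class's vocabulary and for the bug report, then rank classes by cosine similarity. This is a minimal illustration, not the dissertation's actual tooling; the class names, vocabularies, and bug report text are invented, and real studies additionally split identifiers (e.g. camelCase) and stem terms.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split on non-alphanumeric runs."""
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

def tfidf(docs):
    """Map {name: tokens} to {name: {term: tf-idf weight}} using a smoothed idf."""
    n = len(docs)
    df = Counter()
    for tokens in docs.values():
        df.update(set(tokens))  # document frequency per term
    return {name: {t: c * (1.0 + math.log((1 + n) / (1 + df[t])))
                   for t, c in Counter(tokens).items()}
            for name, tokens in docs.items()}

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_classes(bug_report, class_vocabularies):
    """Rank classes by vocabulary similarity to the bug report, most similar first."""
    docs = {name: tokenize(text) for name, text in class_vocabularies.items()}
    docs["__bug_report__"] = tokenize(bug_report)
    vectors = tfidf(docs)
    bug_vec = vectors.pop("__bug_report__")
    return sorted(((name, cosine(bug_vec, vec)) for name, vec in vectors.items()),
                  key=lambda pair: pair[1], reverse=True)

# Hypothetical class vocabularies; in the study they are mined from the
# identifiers and comments of each source class.
classes = {
    "LoginController": "login authenticate password session user validate credentials",
    "ReportRenderer": "render chart export pdf layout table",
    "CacheManager": "cache evict expire memory store entry",
}
bug = "User cannot authenticate: password validation fails after session expires"
ranking = rank_classes(bug, classes)
print(ranking[0][0])  # LoginController shares the most vocabulary with the report
```

A developer would then inspect only the top few percent of the ranked list, which is the effect the study measures (about 75% of target classes found within the top 8%).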
JALAN, ADITYA HRIDAY. "ASSESSING SEVERITY OF SOFTWARE DEFECT REPORTS USING MACHINE LEARNING TECHNIQUES." Thesis, 2014. http://dspace.dtu.ac.in:8080/jspui/handle/repository/15606.
Full textIvanov, E. S., and Е. С. Иванов. "Разработка методики тестирования программного обеспечения : магистерская диссертация." Master's thesis, 2014. http://hdl.handle.net/10995/28187.
Full textТема выпускной квалификационное работы: разработка методики тестирования программного обеспечения. Цель работы: изучение процесса тестирования, видов дефектов в ПО и их отслеживание, способов создания и применения тест кейсов, и, на основе полученных знаний, разработка проекта авто-тестов для веб-сервиса "Эксперт". Дополнительной целью является проведение нагрузочного тестирования для веб-сервиса "Эксперт". Первая часть работы посвящена теоретическим основам тестирования: место тестирования в разработке ПО, процесс тестирования в IT-компаниях, обзор дефектов, способов их отслеживания, а также техник создания тестов и их применение. Вторая часть посвящена обзору ПО для нагрузочного тестирования и его практическое использование для тестирования веб-сервиса «Эксперт». Третья часть посвящена изучению процесса автоматизации функционального тестирования и разработке авто-тестов для веб-сервиса «Эксперт». Выпускная работа состоит из введения, 12 глав и заключения, изложенных на 106 страницах, а также списка литературы и приложений. В работе имеется 55 рисунков. Список литературы содержит 15 наименований.
Book chapters on the topic "SOFTWARE DEFECT REPORTS"
Gromova, Anna. "Using Cluster Analysis for Characteristics Detection in Software Defect Reports." In Lecture Notes in Computer Science, 152–63. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-73013-4_14.
Wang, Han, Min Zhou, Xi Cheng, Guang Chen, and Ming Gu. "Which Defect Should Be Fixed First? Semantic Prioritization of Static Analysis Report." In Software Analysis, Testing, and Evolution, 3–19. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04272-1_1.
Gou, Lang, Qing Wang, Jun Yuan, Ye Yang, Mingshu Li, and Nan Jiang. "Quantitatively Managing Defects for Iterative Projects: An Industrial Experience Report in China." In Making Globally Distributed Software Development a Success Story, 369–80. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-79588-9_32.
Wee Land, Lesley Pek, Chris Sauer, and Ross Jeffery. "Validating the defect detection performance advantage of group designs for software reviews: Report of a laboratory experiment using program code." In Lecture Notes in Computer Science, 294–309. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/3-540-63531-9_21.
Jarzabek, Stanislaw, and Cezary Boldak. "Prioritizing Defects for Debugging with Requirement-to-Test-Case Mappings." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2022. http://dx.doi.org/10.3233/faia220254.
Vimaladevi M. and Zayaraz G. "A Game Theoretic Approach for Quality Assurance in Software Systems Using Antifragility-Based Learning Hooks." In Research Anthology on Agile Software, Software Development, and Testing, 1701–19. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-3702-5.ch081.
Sapna, P. G., Hrushikesha Mohanty, and Arunkumar Balakrishnan. "Consistency Checking of Specification in UML." In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 300–316. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4494-6.ch014.
Conference papers on the topic "SOFTWARE DEFECT REPORTS"
Menzies, Tim, and Andrian Marcus. "Automated severity assessment of software defect reports." In 2008 IEEE International Conference on Software Maintenance (ICSM). IEEE, 2008. http://dx.doi.org/10.1109/icsm.2008.4658083.
Patil, Sangameshwar. "Concept-Based Classification of Software Defect Reports." In 2017 IEEE/ACM 14th International Conference on Mining Software Repositories (MSR). IEEE, 2017. http://dx.doi.org/10.1109/msr.2017.20.
Jindal, Rajni, Ruchika Malhotra, and Abha Jain. "Mining defect reports for predicting software maintenance effort." In 2015 International Conference on Advances in Computing, Communications and Informatics (ICACCI). IEEE, 2015. http://dx.doi.org/10.1109/icacci.2015.7275620.
Garousi, Vahid, Ebru Göçmen Ergezer, and Kadir Herkiloğlu. "Usage, usefulness and quality of defect reports." In EASE '16: 20th International Conference on Evaluation and Assessment in Software Engineering. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2915970.2916009.
Runeson, Per, Magnus Alexandersson, and Oskar Nyholm. "Detection of Duplicate Defect Reports Using Natural Language Processing." In 29th International Conference on Software Engineering. IEEE, 2007. http://dx.doi.org/10.1109/icse.2007.32.
Lai, Tuan Dung. "Towards the generation of machine learning defect reports." In 2021 36th IEEE/ACM International Conference on Automated Software Engineering (ASE). IEEE, 2021. http://dx.doi.org/10.1109/ase51524.2021.9678592.
Mellegard, Niklas, Hakan Burden, Daniel Levin, Kenneth Lind, and Ana Magazinius. "Contrasting Big Bang with Continuous Integration through Defect Reports." In 2021 IEEE 18th International Conference on Software Architecture Companion (ICSA-C). IEEE, 2021. http://dx.doi.org/10.1109/icsa-c52384.2021.00010.
Gromova, Anna, Iosif Itkin, Sergey Pavlov, and Alexander Korovayev. "Raising the Quality of Bug Reports by Predicting Software Defect Indicators." In 2019 IEEE 19th International Conference on Software Quality, Reliability and Security Companion (QRS-C). IEEE, 2019. http://dx.doi.org/10.1109/qrs-c.2019.00048.
Yusop, Nor Shahida Mohamad, Jean-Guy Schneider, John Grundy, and Rajesh Vasa. "Analysis of the Textual Content of Mined Open Source Usability Defect Reports." In 2017 24th Asia-Pacific Software Engineering Conference (APSEC). IEEE, 2017. http://dx.doi.org/10.1109/apsec.2017.42.
Mellegård, Niklas. "Using weekly open defect reports as an indicator for software process efficiency." In IWSM/Mensura '17: 27th International Workshop on Software Measurement and 12th International Conference on Software Process and Product Measurement. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3143434.3143463.
Reports on the topic "SOFTWARE DEFECT REPORTS"
Leis, Brian. L51794A Failure Criterion for Residual Strength of Corrosion Defects in Moderate to High Toughness Pipe. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), January 2000. http://dx.doi.org/10.55274/r0011253.
Lane, Lerose, and DingXin Cheng. Pavement Condition Survey using Drone Technology. Mineta Transportation Institute, June 2023. http://dx.doi.org/10.31979/mti.2023.2202.