Academic literature on the topic 'Software evaluation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Software evaluation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Software evaluation"

1

Flood, Stephen. "Software evaluation." Aslib Proceedings 38, no. 2 (February 1986): 65–69. http://dx.doi.org/10.1108/eb050999.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Sobhy, Dalia, Rami Bahsoon, Leandro Minku, and Rick Kazman. "Evaluation of Software Architectures under Uncertainty." ACM Transactions on Software Engineering and Methodology 30, no. 4 (July 2021): 1–50. http://dx.doi.org/10.1145/3464305.

Full text
Abstract:
Context: Evaluating software architectures in uncertain environments raises new challenges, which require continuous approaches. We define continuous evaluation as multiple evaluations of the software architecture that begin at the early stages of development and are periodically and repeatedly performed throughout the lifetime of the software system. Although numerous approaches have been developed for architecture evaluation over the past years, approaches that handle dynamics and uncertainties at run-time are still very few, limited, and lack maturity. Objective: This review surveys efforts on architecture evaluation and provides a unified terminology and perspective on the subject. Method: We conducted a systematic literature review to identify and analyse architecture evaluation approaches for uncertainty, both continuous and non-continuous, covering work published between 1990 and 2020. We examined each approach and provide a classification framework for this field. We present an analysis of the results and provide insights regarding open challenges. Major results and conclusions: The survey reveals that most existing architecture evaluation approaches lack an explicit linkage between design-time and run-time. Additionally, there is a general lack of systematic approaches on how continuous architecture evaluation can be realised or conducted. To remedy this, we present a set of necessary requirements for continuous evaluation and describe some examples.
APA, Harvard, Vancouver, ISO, and other styles
3

Wellek, S., J. L. Willems, and J. Michaelis. "Reference Standards for Software Evaluation." Methods of Information in Medicine 29, no. 04 (1990): 289–97. http://dx.doi.org/10.1055/s-0038-1634806.

Full text
Abstract:
The field of automated ECG analysis was one of the earliest topics in Medical Informatics and may be regarded as a model both for computer-assisted medical diagnosis and for evaluating medical diagnostic programs. The CSE project has set reference standards of two kinds: in a broad sense, a standard for how to perform a comprehensive evaluation study; in a narrow sense, standards as specific references for evaluating computer ECG programs. The evaluation methodology used within the CSE project is described as a basis for the presentation of results published elsewhere in this issue.
APA, Harvard, Vancouver, ISO, and other styles
4

Masood Butt, Saad, Shahid Masood Butt, Azura Onn, Nadra Tabassam, and Mazlina Abdul Majid. "Usability Evaluation Techniques for Agile Software Model." Journal of Software 10, no. 1 (January 2015): 32–41. http://dx.doi.org/10.17706/jsw.10.1.32-41.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Masood, Zafar, Xuequn Shang, and Jamal Yousaf. "Usability Evaluation Framework for Software Engineering Methodologies." Lecture Notes on Software Engineering 2, no. 3 (2014): 225–32. http://dx.doi.org/10.7763/lnse.2014.v2.127.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

ZHANG, Li. "Software Architecture Evaluation." Journal of Software 19, no. 6 (October 21, 2008): 1328–39. http://dx.doi.org/10.3724/sp.j.1001.2008.01328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Saputri, Theresia Ratih Dewi, and Seok-Won Lee. "Software Analysis Method for Assessing Software Sustainability." International Journal of Software Engineering and Knowledge Engineering 30, no. 01 (January 2020): 67–95. http://dx.doi.org/10.1142/s0218194020500047.

Full text
Abstract:
Software sustainability evaluation has become an essential component of software engineering (SE) owing to sustainability considerations that must be incorporated into software development. Several studies have been performed to address the issues associated with sustainability concerns in the SE process. However, current practices extensively rely on participant experiences to evaluate sustainability achievement. Moreover, there exist limited quantifiable methods for supporting software sustainability evaluation. Our primary objective is to present a methodology that can assist software engineers in evaluating a software system based on well-defined sustainability metrics and measurements. We propose a novel approach that combines machine learning (ML) and software analysis methods. To simplify the application of the proposed approach, we present a semi-automated tool that supports engineers in assessing the sustainability achievement of a software system. The results of our study demonstrate that the proposed approach determines sustainability criteria and defines sustainability achievement in terms of a traceable matrix. Our theoretical evaluation and empirical study demonstrate that the proposed support tool can help engineers identify sustainability limitations in a particular feature of a software system. Our semi-automated tool can identify features that must be revised to enhance sustainability achievement.
APA, Harvard, Vancouver, ISO, and other styles
8

Chopra, Kunal, and Monika Sachdeva. "EVALUATION OF SOFTWARE METRICS FOR SOFTWARE PROJECTS." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 14, no. 6 (April 30, 2015): 5845–53. http://dx.doi.org/10.24297/ijct.v14i6.1915.

Full text
Abstract:
Software metrics are developed and used by many software organizations for the evaluation and confirmation of good code, and for the operation and maintenance of the software product. Software metrics measure and identify various types of software complexity, such as size metrics, control flow metrics and data flow metrics. One significant property of software metrics is that they are applicable both to a process and to a product. NDepend is the most advanced and flexible tool available in the market; we have assessed the quality of the project using NDepend metrics. We have concluded that software metrics are easy to understand and to apply to software, and so are favoured among software professionals; they are among the most prevalent and important testing metrics used in organizations. Metrics are used to improve software productivity and quality. This thesis introduces the most commonly used software metrics and reviews their use in constructing models of the software development process.
APA, Harvard, Vancouver, ISO, and other styles
9

Hassani, Mohammad, and Mehran Mirshams. "Remote sensing satellites evaluation software (RSSE software)." Aircraft Engineering and Aerospace Technology 81, no. 4 (July 3, 2009): 323–33. http://dx.doi.org/10.1108/00022660910967318.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Huber, J. T., and N. B. Giuse. "Educational Software Evaluation Process." Journal of the American Medical Informatics Association 2, no. 5 (September 1, 1995): 295–96. http://dx.doi.org/10.1136/jamia.1995.96073831.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Software evaluation"

1

Kumar, Nadella Navin. "Evaluation of ISDS software." Master's thesis, This resource online, 1991. http://scholar.lib.vt.edu/theses/available/etd-01262010-020122/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jah, Muzamil. "Software metrics : usability and evaluation of software quality." Thesis, University West, Department of Economics and IT, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-548.

Full text
Abstract:

It is difficult to understand, let alone improve, the quality of software without knowledge of its development process and its products. There must be some measurement process to predict software development and to evaluate software products. This thesis provides a brief view of Software Quality, Software Metrics, and Software Metrics methods that predict and measure specified quality factors of software. It further discusses Quality as defined by standards such as ISO, the principal elements required for Software Quality, and Software Metrics as the measurement technique for predicting Software Quality. The work was performed by evaluating source code developed in Java using Software Metrics such as Size Metrics, Complexity Metrics, and Defect Metrics. Results show that the quality of software can be analyzed, studied and improved through the use of software metrics.

APA, Harvard, Vancouver, ISO, and other styles
3

Powale, Kalkin. "Automotive Powertrain Software Evaluation Tool." Master's thesis, Universitätsbibliothek Chemnitz, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-233186.

Full text
Abstract:
Software is a key differentiator and driver of innovation in the automotive industry. The major challenges for software development are increasing complexity, shorter time-to-market, rising development cost and the demand for quality assurance. Complexity is increasing due to emission legislation, product variants and new communication technologies being interfaced with the vehicle. The shorter development time is due to competition in the market, which requires faster feedback loops of verification and validation of developed functionalities. Two factors contribute to the increase in development cost: the first is pre-launch cost, which involves the cost of error correction during the development stages; the other is post-launch cost, which involves warranty and guarantee costs. As development time passes, the cost of error correction also increases; hence it is important to detect errors as early as possible. All these factors affect software quality, and there are several cases where Original Equipment Manufacturers (OEMs) have recalled their products because of quality defects. Hence, the requirement for software quality assurance has increased. A solution to these challenges can be early quality evaluation in a continuous integration framework environment. The AUTomotive Open System ARchitecture (AUTOSAR) reference architecture, the most prominent in today's automotive industry, is used to describe software components and interfaces. AUTOSAR provides standardised software component architecture elements and was created to address the issues of growing complexity. The existing AUTOSAR environment does have software quality measures, such as schema validations and protocols for acceptance tests; however, it lacks quality specifications for non-functional qualities such as maintainability, modularity, etc.
A tool is therefore required that evaluates AUTOSAR-based software architectures and gives objective feedback regarding quality. This thesis aims to provide such a quality measurement tool. The tool reads the architecture information from an AUTOSAR Extensible Markup Language (ARXML) file, and provides configurability, continuous evaluation and objective feedback regarding software quality characteristics. The tool was applied to a transmission control project, and the results were validated by industry experts.
APA, Harvard, Vancouver, ISO, and other styles
4

Gabriel, Pedro Hugo do Nascimento. "Software languages engineering: experimental evaluation." Master's thesis, Faculdade de Ciências e Tecnologia, 2010. http://hdl.handle.net/10362/4854.

Full text
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in fulfilment of the requirements for the degree of Master in Engenharia Informática (Informatics Engineering).
Domain-Specific Languages (DSLs) are programming languages that, through appropriate notation and abstraction, offer expressive control over a particular problem domain for more restricted use. They are expected to contribute enhanced productivity, reliability, maintainability and portability when compared with General Purpose Programming Languages (GPLs). However, as with any software product, without passing through all development stages, namely Domain Analysis, Design, Implementation and Evaluation, some of the DSLs' alleged advantages may be impossible to achieve with a significant level of satisfaction. This may lead to the production of inadequate or inefficient languages. This dissertation focuses on the Evaluation phase. To characterize the DSL community's commitment to Evaluation, we conducted a systematic review. The review covered publications in the main fora dedicated to DSLs from 2001 to 2008, and allowed us to analyse and classify papers with respect to the validation efforts conducted by DSL producers, where a reduced concern for this matter was observed. Another important outcome is the absence of a concrete approach to the evaluation of DSLs that would allow a sound assessment of the actual improvements brought by their usage. Therefore, the main goal of this dissertation is the production of a Systematic Evaluation Methodology for DSLs. To achieve this objective, we drew on the major techniques used in Experimental Software Engineering and Usability Engineering. The proposed methodology was validated through its use in several case studies, in which DSL evaluation was performed in accordance with it.
APA, Harvard, Vancouver, ISO, and other styles
5

Dillon, Andrew. "The Evaluation of software usability." London: Taylor and Francis, 2001. http://hdl.handle.net/10150/105344.

Full text
Abstract:
This item is not the definitive copy. Please use the following citation when referencing this material: Dillon, A. (2001) Usability evaluation. In W. Karwowski (ed.) Encyclopedia of Human Factors and Ergonomics, London: Taylor and Francis. Introduction: Usability is a measure of interface quality that refers to the effectiveness, efficiency and satisfaction with which users can perform tasks with a tool. Evaluating usability is now considered an essential part of the system development process, and a variety of methods have been developed to support the human factors professional in this work.
APA, Harvard, Vancouver, ISO, and other styles
6

Brophy, Dennis J., and James D. O'Leary. "Software evaluation for developing software reliability engineering and metrics models." Monterey, Calif.: Naval Postgraduate School; Springfield, Va.: National Technical Information Service, 1999. http://handle.dtic.mil/100.2/ADA361889.

Full text
Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, March 1999.
Thesis advisor(s): Norman F. Schneidewind, Douglas Brinkley. Includes bibliographical references (p. 59-60). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
7

Brophy, Dennis J., and James D. O'Leary. "Software evaluation for developing software reliability engineering and metrics models." Thesis, Monterey, California ; Naval Postgraduate School, 1999. http://hdl.handle.net/10945/13581.

Full text
Abstract:
Today's software is extremely complex, often constituting millions of lines of instructions. Programs are expected to operate smoothly on a wide variety of platforms. There are continuous attempts to try to assess what the reliability of a software package is and to predict what the reliability of software under development will be. The quantitative aspects of these assessments deal with evaluating, characterizing and predicting how well software will operate. Experience has shown that it is extremely difficult to make something as large and complex as modern software and predict with any accuracy how it is going to behave in the field. This thesis proposes to create an integrated system to predict software reliability for mission critical systems. This will be accomplished by developing a flexible DBMS to track failures and to integrate the DBMS with statistical analysis programs and software reliability prediction tools that are used to make calculations and display trend analysis. It further proposes a software metrics model for fault prediction by determining and manipulating metrics extracted from the code.
APA, Harvard, Vancouver, ISO, and other styles
8

Barney, Sebastian. "Software Quality Alignment : Evaluation and Understanding." Doctoral thesis, Karlskrona : Blekinge Institute of Technology, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00492.

Full text
Abstract:
Background: The software development environment is growing increasingly complex, with a greater diversity of stakeholders involved in product development. Moves towards global software development with onshoring, offshoring, insourcing and outsourcing have seen a range of stakeholders introduced to the software development process, each with their own incentives and understanding of their product. These differences between the stakeholders can be especially problematic with regard to aspects of software quality. The aspects are often not clearly and explicitly defined for a product, but still essential for its long-term sustainability. Research shows that software projects are more likely to succeed when the stakeholders share a common understanding of software quality. Objectives: This thesis has two main objectives. The first is to develop a method to determine the level of alignment between stakeholders with regard to the priority given to aspects of software quality. Given the ability to understand the levels of alignment between stakeholders, the second objective is to identify factors that support and impair this alignment. Both the method and the identified factors will help software development organisations create work environments that are better able to foster a common set of priorities with respect to software quality. Method: The primary research method employed throughout this thesis is case study research. In total, six case studies are presented, all conducted in large or multinational companies. A range of data collection techniques have been used, including questionnaires, semi-structured interviews and workshops. Results: A method to determine the level of alignment between stakeholders on the priority given to aspects of software quality is presented—the Stakeholder Alignment Assessment Method for Software Quality (SAAM-SQ). It is developed by drawing upon a systematic literature review and the experience of conducting a related case study. 
The method is then refined and extended through the experience gained from its repeated application in a series of case studies. These case studies are further used to identify factors that support and impair alignment in a range of different software development contexts. The contexts studied include onshore insourcing, onshore outsourcing, offshore insourcing and offshore outsourcing. Conclusion: SAAM-SQ is found to be robust, being successfully applied to case studies covering a range of different software development contexts. The factors identified from the case studies as supporting or impairing alignment confirm and extend research in the global software development domain.
APA, Harvard, Vancouver, ISO, and other styles
9

SHAUGHNESSY, MICHAEL RYAN. "EDUCATIONAL SOFTWARE EVALUATION: A CONTEXTUAL APPROACH." University of Cincinnati / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1021653053.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Clemens, Ronald F. "TEMPO software modification for SEVER evaluation." Thesis, Monterey, California: Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Sep/09Sep_Clemens.pdf.

Full text
Abstract:
Thesis (M.S. in Systems Engineering)--Naval Postgraduate School, September 2009.
Thesis Advisor(s): Langford, Gary O. "September 2009." Description based on title screen as viewed on November 4, 2009. Author(s) subject terms: Decision, decision analysis, decision process, system engineering tool, SEVER, resource allocation, military planning, software tool, strategy evaluation. Includes bibliographical references (p. 111-113). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Software evaluation"

1

Karasulu, Bahadir, and Serdar Korukoglu. Performance Evaluation Software. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Johnston, D. J. PHIGS software evaluation. Loughborough: Advisory Group on Computer Graphics, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Software metrics: Measurement for software process improvement. Oxford, UK: NCC Blackwell, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Owston, Ronald Davis. York educational software evaluation scales. North York, Ont: Faculty of Education, York University, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dobbins, James H. Software quality assurance and evaluation. Milwaukee, Wis: ASQC Quality Press, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Q.E.D. Information Sciences, ed. Management evaluation of software packages. Wellesley, Mass: QED Information Sciences, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Adler, Carolyn. Software evaluation, a training manual. [Tampa, Fla.]: Florida Center for Instructional Computing, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Inter University Software Committee Graphics Working Party. Image processing: Software evaluation report. Loughborough: AGOCG, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Knodel, Jens, and Matthias Naab. Pragmatic Evaluation of Software Architectures. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-34177-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Smith, Shirley C. Software review center. Philadelphia, PA: Drexel University, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Software evaluation"

1

Mainini, Maria Teresa. "Reliability Evaluation." In Software Fault Tolerance, 177–97. Berlin, Heidelberg: Springer Berlin Heidelberg, 1992. http://dx.doi.org/10.1007/978-3-642-84725-7_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Karasulu, Bahadir, and Serdar Korukoglu. "Introduction." In Performance Evaluation Software, 1–5. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Karasulu, Bahadir, and Serdar Korukoglu. "Moving Object Detection and Tracking in Videos." In Performance Evaluation Software, 7–30. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Karasulu, Bahadir, and Serdar Korukoglu. "A Software Approach to Performance Evaluation." In Performance Evaluation Software, 31–37. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Karasulu, Bahadir, and Serdar Korukoglu. "Performance Measures and Evaluation." In Performance Evaluation Software, 39–49. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Karasulu, Bahadir, and Serdar Korukoglu. "A Case Study: People Detection and Tracking in Videos." In Performance Evaluation Software, 51–63. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Karasulu, Bahadir, and Serdar Korukoglu. "Conclusion." In Performance Evaluation Software, 65–66. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6534-8_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Staron, Miroslaw. "Evaluation." In Action Research in Software Engineering, 93–122. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-32610-4_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhu, Joe. "DEAFrontier Software." In Quantitative Models for Performance Evaluation and Benchmarking, 399–407. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-06647-9_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

van der Linden, Frank, Jan Bosch, Erik Kamsties, Kari Känsälä, and Henk Obbink. "Software Product Family Evaluation." In Software Product Lines, 110–29. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-28630-1_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Software evaluation"

1

Abrahão, Silvia, and Emilio Insfran. "Evaluating Software Architecture Evaluation Methods." In EASE'17: Evaluation and Assessment in Software Engineering. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3084226.3084253.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Preston, Jon Anderson. "Evaluation software." In The supplemental proceedings of the conference. New York, New York, USA: ACM Press, 1997. http://dx.doi.org/10.1145/266057.266143.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

McNab, D. "A Software System for Inspection Qualification." In QUANTITATIVE NONDESTRUCTIVE EVALUATION. AIP, 2004. http://dx.doi.org/10.1063/1.1711806.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sherief, Nada, Nan Jiang, Mahmood Hosseini, Keith Phalp, and Raian Ali. "Crowdsourcing software evaluation." In the 18th International Conference. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2601248.2601300.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kurbatov, Pavel A. "Method and software for eddy currents NDE analysis." In QUANTITATIVE NONDESTRUCTIVE EVALUATION. AIP, 2002. http://dx.doi.org/10.1063/1.1472826.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Eisty, Nasir U., George K. Thiruvathukal, and Jeffrey C. Carver. "Use of Software Process in Research Software Development." In EASE '19: Evaluation and Assessment in Software Engineering. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3319008.3319351.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

"SOFTWARE SEMANTIC PROVISIONING - Actually Reusing Software." In 3rd International Conference on Evaluation of Novel Approaches to Software Engineering. SciTePress - Science and Technology Publications, 2008. http://dx.doi.org/10.5220/0001763501850188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ram, Prabhat, Pilar Rodríguez, Markku Oivo, Silverio Martínez-Fernández, Alessandra Bagnato, Michał Choraś, Rafał Kozik, Sanja Aaramaa, and Milla Ahola. "Actionable Software Metrics." In EASE '20: Evaluation and Assessment in Software Engineering. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3383219.3383244.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Brombach, Ronald P. "Automotive Software Development Evaluation." In SAE 2000 World Congress. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2000. http://dx.doi.org/10.4271/2000-01-0712.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

CLOUGH, A. "Software component quality evaluation." In 8th Computing in Aerospace Conference. Reston, Virginia: American Institute of Aeronautics and Astronautics, 1991. http://dx.doi.org/10.2514/6.1991-3760.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Software evaluation"

1

Heinen, Beth A., Ed Meiman, Daniel A. Fien-Helfman, Sydney K. Ayine, and Asad A. Khan. Survey Software Evaluation. Fort Belvoir, VA: Defense Technical Information Center, January 2009. http://dx.doi.org/10.21236/ada495855.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Edwards, T. L., and H. W. Allen. Programming software for usability evaluation. Office of Scientific and Technical Information (OSTI), January 1997. http://dx.doi.org/10.2172/446361.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Blum, Bruce I. Evaluation Methodology for Software Engineering. Fort Belvoir, VA: Defense Technical Information Center, May 1988. http://dx.doi.org/10.21236/ada198398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Budlong, Faye, and Judi Peterson. Software Metrics Capability Evaluation Guide. Fort Belvoir, VA: Defense Technical Information Center, October 1995. http://dx.doi.org/10.21236/ada325385.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Basili, V. R. Quantitative Evaluation of Software Methodology. Fort Belvoir, VA: Defense Technical Information Center, July 1985. http://dx.doi.org/10.21236/ada160202.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Blum, Bruce I. Evaluation Methodology for Software Engineering. Fort Belvoir, VA: Defense Technical Information Center, December 1989. http://dx.doi.org/10.21236/ada218474.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Middendorf, Matthew S., Nicole M. Guilliams, and Annie B. McLaughlin. A Software Testbed for Display Evaluation. Fort Belvoir, VA: Defense Technical Information Center, August 2004. http://dx.doi.org/10.21236/ada428492.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Huber, Hartmut G. Higher Order Software - Evaluation and Critique. Fort Belvoir, VA: Defense Technical Information Center, August 1987. http://dx.doi.org/10.21236/ada198753.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sisti, Frank J., and Sujoe Joseph. Software Risk Evaluation Method Version 1.0. Fort Belvoir, VA: Defense Technical Information Center, December 1994. http://dx.doi.org/10.21236/ada290597.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Mclean, Thomas, and Timothy Gildea. An Evaluation of the ASM2000 software. Office of Scientific and Technical Information (OSTI), July 2021. http://dx.doi.org/10.2172/1807812.

Full text
APA, Harvard, Vancouver, ISO, and other styles