Journal articles on the topic 'Computer software - Evaluation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Computer software - Evaluation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Jingu, Hideo. "Sensory Evaluation of Computer Software." Japanese Journal of Ergonomics 30, Supplement (1994): 32–33. http://dx.doi.org/10.5100/jje.30.supplement_32.

2

Schueckler, Linda M., and Thomas J. Shuell. "A Comparison of Software Evaluation Forms and Reviews." Journal of Educational Computing Research 5, no. 1 (February 1989): 17–33. http://dx.doi.org/10.2190/ae0l-kt93-0u4n-mk59.

Abstract:
Various evaluation forms and reviews used to evaluate instructional software are compared with regard to the criteria employed in their assessments, and the usefulness and appropriateness of these criteria for making instructional decisions are discussed. Among the evaluation forms considered are those developed by the National Council of Teachers of Mathematics, the National Science Teachers Association, and the Software Evaluation Project at the State University of New York at Buffalo. Software awards such as those sponsored by Classroom Computer Learning and reviews by monthly publications such as Teaching and Computers and Classroom Computer Learning are also considered. Although certain criteria are represented on nearly all of the evaluation forms and reviews, other criteria appear on a more limited basis. Differences between evaluation forms and reviews are discussed, and limitations in current approaches to software evaluation are identified — e.g., concern for valid principles of learning and teaching.
3

Emy, Saleh Rosana. "Designing Computer Assisted Language Learning Software Evaluation." IJOLTL: Indonesian Journal of Language Teaching and Linguistics 3, no. 2 (May 30, 2018): 159–66. http://dx.doi.org/10.30957/ijoltl.v3i2.454.

Abstract:
The use of computers to assist learning has increased significantly over more than three decades. However, using such tools is still a problem for many teachers and educators. This paper discusses the evaluation criteria for selecting Computer Assisted Language Learning (CALL) software for language and skill development in ESL/EFL. The CALL evaluation criteria proposed in this paper are intended to help English language teachers determine good-quality CALL software for classroom use. CALL has proven its benefits in three aspects: programming considerations, educational design, and ease of use. A checklist describing the evaluation aspects of CALL software is provided in this paper.
4

Wellek, S., J. L. Willems, and J. Michaelis. "Reference Standards for Software Evaluation." Methods of Information in Medicine 29, no. 04 (1990): 289–97. http://dx.doi.org/10.1055/s-0038-1634806.

Abstract:
The field of automated ECG analysis was one of the earliest topics in Medical Informatics and may be regarded as a model both for computer-assisted medical diagnosis and for evaluating medical diagnostic programs. The CSE project has set reference standards of two kinds: in a broad sense, a standard for how to perform a comprehensive evaluation study; in a narrow sense, standards as specific references for evaluating computer ECG programs. The evaluation methodology used within the CSE project is described as a basis for the presentation of results, which are published elsewhere in this issue.
5

Balkı, Mustafa, and Mehmet Doğru. "Evaluation of two different imaging software programs in planning orthognathic surgery cases." International Dental Research 12, no. 2 (August 31, 2022): 70–81. http://dx.doi.org/10.5577/intdentres.2022.vol12.no2.5.

Abstract:
Aim: In this study, we aimed to compare the two-dimensional predictions made by two computer software packages with the postoperative values, and thus to evaluate the clinical reliability of digital orthognathic surgery planning. Methodology: Orthodontic treatment was performed before orthognathic surgery, and the same surgical team performed double-jaw orthognathic surgeries. We included 20 individuals (10 females, 10 males) with skeletal Class III malocclusion. The average age of the individuals was 21.5 years. In our study, the amount of movement was determined using reference lines on lateral cephalometric radiographs obtained from the preoperative and postoperative Cone-Beam Computed Tomography (CBCT) records of the 20 individuals. Prediction profiles were generated using the Dolphin Imaging (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA) and NemoFAB 2D (Software Nemotec, S.L, Spain) software packages. In this way, the reliability and consistency of two-dimensional prediction software were examined. Results: The prediction profiles obtained from the computer software were compared with lateral cephalometric radiographs of the postoperative results for 37 cephalometric parameters. There were no significant differences between the software predictions and the postoperative results for any cephalometric parameter. Conclusion: The plans and predictions made with the two computer software packages were reliable and can be used clinically.
6

Globus, Al, and Sam Uselton. "Evaluation of visualization software." ACM SIGGRAPH Computer Graphics 29, no. 2 (May 1995): 41–44. http://dx.doi.org/10.1145/204362.204372.

7

Li, Lin Lin, and Liang Xu Sun. "Online Examination System for Microsoft Office Software Operations." Advanced Materials Research 756-759 (September 2013): 911–15. http://dx.doi.org/10.4028/www.scientific.net/amr.756-759.911.

Abstract:
Online examination is an effective solution to the level-evaluation problem for basic computer operations. This paper proposes an online examination system for basic computer operations, in particular Microsoft Office software operations. The system mainly implements functions including intelligent exam generation and the automatic collection and marking of documents submitted during the exam, using database, socket, ADO, and VBA programming methods. The system supports several Microsoft Office applications, including Word, Excel, PowerPoint, and Access. Actual operation showed that the system helps teachers improve work efficiency and helps students improve their software skills through hands-on online practice on computers. The system has been running in the USTL university computer lab center for some time, which has demonstrated its validity for solving the level-evaluation problem for Microsoft Office software operations.
8

Gilbert, Catherine. "Computer Software Evaluation: Balancing User’s Needs & Wants." Journal of the Australian Library and Information Association 67, no. 1 (January 2, 2018): 65. http://dx.doi.org/10.1080/24750158.2018.1429796.

9

Bei-Yang, Wang. "Software Reliability Evaluation Approaches using Weighted Software Network." Information Technology Journal 12, no. 21 (October 15, 2013): 6288–94. http://dx.doi.org/10.3923/itj.2013.6288.6294.

10

Luo, Shi Yong, Wen Cai Xu, Li Xia Huo, Xin Lin Zhang, and Jia Yun Zhang. "A Computer Software on Diffusion in Solid." Advanced Materials Research 267 (June 2011): 410–13. http://dx.doi.org/10.4028/www.scientific.net/amr.267.410.

Abstract:
KinPreSSR, a computer program for solid/solid reaction kinetics, is one of the subsystems of the Intellectualized Database Management System on Kinetics of Metallurgy (IDMSKM). KinPreSSR is a Windows application developed using Visual C++ and FoxPro and includes two main modules, "DIFFUSION" and "REACTION". The "DIFFUSION" module includes two sub-modules, a "database management system (DBMS)" and "Evaluation & prediction". The DBMS handles diffusion coefficients gathered from reported documents and data evaluated according to certain rules; in addition, it provides users with retrieval of diffusion coefficients. Based on solutions to Fick's first and second laws under the four typical critical conditions, the "Evaluation & prediction" sub-module predicts the concentration distribution after a diffusion process in solids or computes the diffusion coefficient.
11

Yang, D., J. Tan, L. Appenzoller, H. Li, B. Sun, and S. Mutic. "A Computer Software Tool for Comprehensive Plan Quality Evaluation." International Journal of Radiation Oncology*Biology*Physics 87, no. 2 (October 2013): S620. http://dx.doi.org/10.1016/j.ijrobp.2013.06.1640.

12

Pickens, David R., Yong Li, Victoria L. Morgan, and Benoit M. Dawant. "Development of computer-generated phantoms for FMRI software evaluation." Magnetic Resonance Imaging 23, no. 5 (June 2005): 653–63. http://dx.doi.org/10.1016/j.mri.2005.04.007.

13

Baker, Alan M., Rowan Faludi, and David R. Green. "An Evaluation of SAS/GRAPH Software for Computer Cartography." Professional Geographer 37, no. 2 (May 1985): 204–14. http://dx.doi.org/10.1111/j.0033-0124.1985.00204.x.

14

Krishnamurthi, Shriram. "Artifact evaluation for software conferences." ACM SIGPLAN Notices 48, no. 4S (July 9, 2013): 17–21. http://dx.doi.org/10.1145/2502508.2502518.

15

Hashemi Farzaneh, Helena, and Lorenz Neuner. "Usability Evaluation of Software Tools for Engineering Design." Proceedings of the Design Society: International Conference on Engineering Design 1, no. 1 (July 2019): 1303–12. http://dx.doi.org/10.1017/dsi.2019.136.

Abstract:
Much of the work in design research focusses on the development of methods and tools to support engineering designers. Many of these tools are nowadays implemented in software. Due to the strongly growing use of computers and smart devices in the last two decades, users' expectations have increased dramatically. In particular, users expect good usability, for example little effort to learn to use the software. Therefore, the usability evaluation of design software tools is crucial; a software tool with bad usability will not be used in industrial practice. Recommendations for the usability evaluation of software often stem from the field of Human Computer Interaction. The aim of this paper is to tailor these general approaches to the specific needs of engineering design. In addition, we propose a method to analyse the results of the evaluation and to derive suggestions for improving the design software tool. We apply the usability evaluation method to a use case, the KoMBi software tool for bio-inspired design. The case study provides additional insights with regard to problem, cause, and improvement categories.
16

Parnas, David L., A. John van Schouwen, and Shu Po Kwan. "Evaluation of safety-critical software." Communications of the ACM 33, no. 6 (June 1990): 636–48. http://dx.doi.org/10.1145/78973.78974.

17

Zhang, Z. Y., Y. Zhang, Y. He, and J. G. An. "Evaluation of measuring the globe proptosis by using computed tomography-data based computer software." International Journal of Oral and Maxillofacial Surgery 38, no. 5 (May 2009): 487. http://dx.doi.org/10.1016/j.ijom.2009.03.321.

18

Barrett, BW. "Software testing and evaluation." Information and Software Technology 32, no. 10 (December 1990): 699. http://dx.doi.org/10.1016/0950-5849(90)90106-2.

19

Saputri, Theresia Ratih Dewi, and Seok-Won Lee. "Software Analysis Method for Assessing Software Sustainability." International Journal of Software Engineering and Knowledge Engineering 30, no. 01 (January 2020): 67–95. http://dx.doi.org/10.1142/s0218194020500047.

Abstract:
Software sustainability evaluation has become an essential component of software engineering (SE) owing to sustainability considerations that must be incorporated into software development. Several studies have been performed to address the issues associated with sustainability concerns in the SE process. However, current practices extensively rely on participant experiences to evaluate sustainability achievement. Moreover, there exist limited quantifiable methods for supporting software sustainability evaluation. Our primary objective is to present a methodology that can assist software engineers in evaluating a software system based on well-defined sustainability metrics and measurements. We propose a novel approach that combines machine learning (ML) and software analysis methods. To simplify the application of the proposed approach, we present a semi-automated tool that supports engineers in assessing the sustainability achievement of a software system. The results of our study demonstrate that the proposed approach determines sustainability criteria and defines sustainability achievement in terms of a traceable matrix. Our theoretical evaluation and empirical study demonstrate that the proposed support tool can help engineers identify sustainability limitations in a particular feature of a software system. Our semi-automated tool can identify features that must be revised to enhance sustainability achievement.
20

Difazio, Martha Hunt. "Graphics Software Side by Side." Mathematics Teacher 83, no. 6 (September 1990): 436–46. http://dx.doi.org/10.5951/mt.83.6.0436.

Abstract:
NCTM's Curriculum and Evaluation Standards for School Mathematics (Standards) (1989) makes some very specific recommendations concerning the use of microcomputers in secondary school mathematics. This document recommends that “a computer will be available at all times in every classroom for demonstration purposes, and all students will have access to computers for individual and group work.” It strongly advocates the use of graphics calculators and interactive graphics computer packages as a principal means for investigating the behavior of functions. Recently, Waits and Demana (1988a) have presented convincing arguments for the use of graphics technology as a supplement to traditional mathematics instruction. Interactive graphics software can be used to give quick, accurate plots of functions, conic sections, and parametric equations. The use of such software can foster a geometric approach to problem solving and is especially valuable for those students for whom visual images work better than symbolic manipulations. Estimation skills can be sharpened as students begin to search for roots and critical points visually even before they are taught the symbolic manipulations needed for an algebraic solution to their problem.
21

Hardin, Paul C., and Janet Reis. "Interactive Multimedia Software Design: Concepts, Process, and Evaluation." Health Education & Behavior 24, no. 1 (February 1997): 35–53. http://dx.doi.org/10.1177/109019819702400106.

Abstract:
This article provides the health educator with a review of the design and construction of computer-based health education materials. Specifically, this review considers questions of instructional objectives as defined in the field of instructional design, a body of expertise not often deployed in the area of health education. It discusses in some detail the computer materials development process, associated documentation, and the range of personnel required. This article also briefly updates discussion of the costing process, and current costs for multimedia hardware and authoring software. The review touches on, but does not synthesize, emerging possibilities with the Internet.
22

Squires, D., and A. McDougall. "Software evaluation: a situated approach." Journal of Computer Assisted Learning 12, no. 3 (September 1996): 146–61. http://dx.doi.org/10.1111/j.1365-2729.1996.tb00047.x.

23

Yahlali, Mebarka, and Abdellah Chouarfia. "Towards a software component assembly evaluation." IET Software 9, no. 1 (February 2015): 1–6. http://dx.doi.org/10.1049/iet-sen.2014.0001.

24

Wu, Wenjun, Wei-Tek Tsai, and Wei Li. "An evaluation framework for software crowdsourcing." Frontiers of Computer Science 7, no. 5 (August 14, 2013): 694–709. http://dx.doi.org/10.1007/s11704-013-2320-2.

25

Zastinceanu, Liubov. "Evaluation in Mathematics Using Software: Methodological Reflections." Acta et commentationes: Științe ale Educației 26, no. 4 (January 2022): 86–94. http://dx.doi.org/10.36120/2587-3636.v26i4.86-94.

Abstract:
Computer-assisted evaluation in mathematics has been and remains a rather difficult activity. The experience gained by mathematics teachers in the Republic of Moldova over the last two years in this regard highlights aspects specific to the local educational process. The article analyzes data from a survey conducted among mathematics teachers in the country, the experience of the teachers themselves, and the author's experience in using computer-assisted assessment. Finally, methodological conclusions and recommendations are formulated for the use of software-based evaluation both in distance learning and in face-to-face teaching.
26

Mohammed Hasan, Kawthar A., Ali H. Kadhum, and Ameer H. Morad. "Evaluation and Improvement of Manufacturing System Using Computer Software Arena." Al-Khwarizmi Engineering Journal 15, no. 4 (December 1, 2019): 71–78. http://dx.doi.org/10.22153/kej.2019.10.003.

Abstract:
The main purpose of the paper is to assess the controllability of an existing production system: the yogurt production line at the Abu Ghraib Dairy Factory, which comprises several food-processing and packing machines. Initial analysis revealed instability in the factory's production. The analysis is based on experimental observation and data collection for the different processing times of the machines, and statistical analysis was conducted to model the production system. Arena software is applied to simulate and analyze the current state of the production system, and the results are used to improve system production and efficiency. The research method contributes to understanding and predicting the future operation of the system in order to enhance the controllability of production and improve production-system and machine efficiency. Moreover, an experimental model of the real system was built in Arena to control the system in terms of production and process. The first step is to collect the statistical input and output data required for the analysis. The second is to track production problems in terms of process bottlenecks in order to improve system utilization. The third is to validate the model in order to meet product demand despite the system's uncontrollability. Analysis of waiting times and production rates clearly shows that the system is stable, with an opportunity for improvement by resetting capacity with regard to resource utilization. Re-planning resource capacity positively enhances the production and profitability of the system.
27

Hedley, Carolyn. "What's New in Software? Computer Programs for Unobtrusive, Informal Evaluation." Journal of Reading, Writing, and Learning Disabilities International 1, no. 4 (July 1985): 105–11. http://dx.doi.org/10.1080/0748763850010414.

28

Akahori, Kanji. "Evaluation of Educational Computer Software in Japan (II): Practical Problems." PLET: Programmed Learning & Educational Technology 25, no. 1 (February 1988): 57–66. http://dx.doi.org/10.1080/1355800880250107.

29

STALKER, RUTH, and IAN F. C. SMITH. "Structural monitoring using engineer–computer interaction." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 16, no. 3 (June 2002): 203–18. http://dx.doi.org/10.1017/s0890060402163062.

Abstract:
Engineer–computer interaction (ECI) is a new subdomain of human–computer interaction that is specifically tailored to engineers' needs. ECI uses an information classification schema, provides a modular approach to task decomposition, and integrates standard engineering characteristics and working procedures into software. A software tool kit that interprets monitoring data taken from bridges was developed according to ECI guidelines. This tool kit was given to engineers for testing and evaluation. An empirical evaluation using questionnaires was performed. The results show that this ECI software corresponds to engineers' needs and the ECI approach has potential applications to other engineering tasks.
30

Lee, Chong Hyung, Seung Min Lee, and Dong Ho Park. "Evaluation of software availability for the imperfect software debugging model." International Journal of Systems Science 36, no. 11 (September 15, 2005): 671–78. http://dx.doi.org/10.1080/00207720500159904.

31

Fertalj, Kresimir, and Damir Kalpic. "ERP Software Evaluation and Comparative Analysis." Journal of Computing and Information Technology 12, no. 3 (2004): 195. http://dx.doi.org/10.2498/cit.2004.03.02.

32

Tsai, Jeffrey J. P., Bing Li, and Eric Y. T. Juan. "Parallel evaluation of software architecture specifications." Communications of the ACM 40, no. 1 (January 1997): 83–86. http://dx.doi.org/10.1145/242857.242881.

33

Smith, Eric, and Antonio Siochi. "Software Usability: Requirements by Evaluation." Proceedings of the Human Factors Society Annual Meeting 32, no. 5 (October 1988): 264–66. http://dx.doi.org/10.1177/154193128803200503.

Abstract:
Recent research has established the importance of defining usability requirements as part of the total requirements for a system. Instead of deciding in an ad hoc manner whether or not a human-computer interface is usable, measurable usability requirements are established at the outset. It is common to state such requirements in an operational manner: U% of a sample of the intended user population should accomplish T% of the benchmark tasks within M minutes and with no more than E errors. The formal experiments needed to test compliance with the requirements make this method costly. This paper presents an alternative method of specifying usability requirements currently being developed and tested on a large software project at Virginia Tech. Briefly, usability requirements are specified by having every member of the software design team and the user interface design team specify the ease of use desired for each proposed functional requirement of the system under development. The individual ratings are then compared in order to arrive at a consensus. It is this consensus that leads to the formal usability requirements which the interface must meet or exceed. As the interface is built, it is rated in the same manner as that used originally to specify the requirements. This method thus provides a structured means of specifying measurable usability requirements and a means of determining whether or not the interface satisfies those requirements. Several other benefits of this method are presented as well.
34

Lew, Roger, Thomas A. Ulrich, and Ronald L. Boring. "Open Source Software Architecture for Computer Based Procedure Systems." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 66, no. 1 (September 2022): 1503–7. http://dx.doi.org/10.1177/1071181322661347.

Abstract:
In the US and globally, advanced reactors are being developed. These nuclear power plants (NPPs) will utilize modern engineering and simulation tools, as well as modern control systems. New plants are safer and more flexible than previous generations, and new opportunities exist for how plants could manage and operate reactors. Specifically, future plants could utilize fleet management and reduced staffing to decrease operational expenses. Tools are needed to support the development of computer-based procedure systems for advanced reactors. Here we draw inspiration from scientific open-source software tools (Python, Jupyter Notebook, Jupytext, Jupyter Book, R Markdown, RStudio, Markdown, MyST, Sphinx, Git) for the rapid prototyping and evaluation of computer-based procedure systems, in order to support the design and evaluation of novel concepts of operations for new reactors, conduct engineering tests, and validate and optimize operations using machine learning.
35

Scott, Wayne J. "Evaluation: Westcap MacIntosh Computer Reading Program." Aboriginal Child at School 18, no. 2 (May 1990): 44–50. http://dx.doi.org/10.1017/s1326011100600753.

Abstract:
In early 1989 a proposal was submitted for the purchase of three Macintosh computers and software for the purpose of assisting lower-stream students with their reading difficulties. A condition of the purchase was that some form of evaluation would be implemented in order to gauge the worth of the program. The computers were installed in the schools, and a program of instruction was written for the commencement of Term IV, which ran for 10 weeks.
36

Card, David N. "A software technology evaluation program." Information and Software Technology 29, no. 6 (July 1987): 291–300. http://dx.doi.org/10.1016/0950-5849(87)90028-0.

37

Iglesias, Omar A., Carmen N. Paniagua, and Raúl A. Pessacq. "Evaluation of university educational software." Computer Applications in Engineering Education 5, no. 3 (1997): 181–88. http://dx.doi.org/10.1002/(sici)1099-0542(1997)5:3<181::aid-cae5>3.0.co;2-9.

38

Lourenço, Miguel L., Fátima Lanhoso, and Denis A. Coelho. "Usability Evaluation of Slanted Computer Mice." International Journal of Environmental Research and Public Health 18, no. 8 (April 7, 2021): 3854. http://dx.doi.org/10.3390/ijerph18083854.

Abstract:
Prevention of musculoskeletal disorders is supported by the use of slanted rather than horizontal pointing devices, but user acceptance of the former may be compromised due to lower perceived ease of use. This study compares subjectively rated usability (N = 37) for three sizes of slanted computer mice and includes a small horizontal conventional device as a reference. For a random subset of the sample (n = 10), objective usability parameters were also elicited. Participants followed a standard protocol based on executing graphical pointing, steering, and dragging tasks generated by purpose-built software. Subjective ratings were collected for each of the four pointing devices tested. The three slanted devices differed in size but were chosen because of an approximately similar slant angle (around 50–60 degrees relative to the horizontal plane). Additionally, effectiveness and efficiency were objectively calculated from the data recorded by the graphical tasks software for a random subset of the participants (n = 10). The results reveal small differences in preference for some of the subjective usability parameters across hand-size groups. This notwithstanding, the objective efficiency results are aligned with the subjective results, indicating consistency with the hypothesis that slanted devices that are smaller relative to the user's hand size are easier to use than larger ones. Mean values of weighted efficiency recorded in the study range from 68% to 75%, with differences across devices coherent with preference rank orders.
39

Agarwal, Jyoti, Sanjay Kumar Dubey, and Rajdev Tiwari. "Usability evaluation of component based software system using software metrics." Intelligent Decision Technologies 14, no. 3 (September 29, 2020): 281–89. http://dx.doi.org/10.3233/idt-190021.

Abstract:
Component Based Software Engineering (CBSE) provides a way to create a new Component Based Software System (CBSS) by utilizing existing components. The primary reason for this is to minimize software development time, cost, and effort; CBSS also increases component reusability. Because of these advantages, software companies are working on CBSS and continuously trying to deliver quality products. Usability is one of the major quality factors for CBSS. It should be measured before the software product is delivered to the customer, so that any usability flaws can be removed by the development team. In this paper, the usability of CBSS is evaluated based on major usability sub-factors (learnability, operability, understandability, and configurability). For this purpose, software metrics are first identified for each usability sub-factor, and the value of each sub-factor is evaluated for a component-based software project. Second, the overall usability of the software project is evaluated using the calculated value of each usability sub-factor. Usability for the same project was also evaluated using a fuzzy approach in MATLAB to validate the experimental work of this paper. The usability values obtained from the software metrics and the fuzzy model were found to be very similar. This research will be useful for software developers to evaluate the usability of any CBSS and to compare different versions of a CBSS in terms of usability.
40

Hasty, Ronald W., Anthony F. Herbst, and Mo A. Mahmood. "Microcomputer Software Evaluation and Selection Strategies." Journal of Organizational and End User Computing 1, no. 1 (January 1989): 8–21. http://dx.doi.org/10.4018/joeuc.1989010102.

41

Klingholz, F. "Computer aided evaluation of phonetograms." Computer Methods and Programs in Biomedicine 37, no. 2 (March 1992): 127–35. http://dx.doi.org/10.1016/0169-2607(92)90094-n.

42

Doughty, Ken. "The Evaluation of Software Packages." EDPACS 15, no. 7 (January 1988): 5–9. http://dx.doi.org/10.1080/07366988809450460.

43

Jia, Xinyang. "Research on Computer Software Security Testing Techniques and Applications." Journal of Intelligence and Knowledge Engineering 1, no. 4 (December 2023): 82–87. http://dx.doi.org/10.62517/jike.202304415.

Abstract:
Computer software security testing is a critical step in ensuring the security of computer systems. This research focuses on the techniques and applications of computer software security testing. Firstly, an analysis is conducted on three common testing methods: black-box testing, white-box testing, and grey-box testing. Next, the analysis and evaluation of security testing technology based on vulnerability mining, static analysis, and dynamic analysis are discussed. Lastly, the application practice of computer software security testing technology in web applications and mobile applications is explored. This research provides insight and guidance for computer software security testing.
44

Taylor, Bruce H., and Scott A. Weisgerber. "SUE: A Usability Evaluation Tool for Operational Software." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 41, no. 2 (October 1997): 1107–10. http://dx.doi.org/10.1177/107118139704100284.

Abstract:
In response to the proliferation of computer processing in military systems, the Air Force Operational Test and Evaluation Center (AFOTEC) has developed the Software Usability Evaluator, or SUE. SUE is a software tool for the generation, administration, and analysis of questionnaires that assess software usability in operational systems. To date, SUE has proven to be a powerful source of descriptive and narrative information on software interface deficiencies and potential means of remediation.
45

Antonov, Anton. "SIMULATION SOFTWARE FOR MODELING THE MOVEMENT OF MATERIAL FLOWS." Journal Scientific and Applied Research 14, no. 1 (December 1, 2018): 17–22. http://dx.doi.org/10.46687/jsar.v14i1.244.

Abstract:
Computer modeling is one of the best tools for the development of a real automated warehouse system. It aims to explore and define the behavior of the system and to evaluate its performance. In this paper, software for the design of logistic systems is analyzed and simulated.
46

Quevedo, Lluïsa, José Antonio Aznar-Casanova, Dolores Merindano-Encina, Genís Cardona, and Joan Solé-Fortó. "A novel computer software for the evaluation of dynamic visual acuity." Journal of Optometry 5, no. 3 (July 2012): 131–38. http://dx.doi.org/10.1016/j.optom.2012.05.003.

47

Chiecchio, Andrea, Andrea Bo, Paolo Manzone, and Francesca Giglioli. "DECIDE: a software for computer-assisted evaluation of diagnostic test performance." Computer Methods and Programs in Biomedicine 40, no. 1 (May 1993): 55–65. http://dx.doi.org/10.1016/0169-2607(93)90049-q.

48

Selamat, M. H., and M. M. Rahim. "Evaluation of computer-aided software engineering tools: Experience from Malaysian organisations." International Journal of Information Management 16, no. 4 (August 1996): 299–313. http://dx.doi.org/10.1016/0268-4012(96)00015-1.

49

Akahori, Kanji. "Evaluation of Educational Computer Software in Japan (I): Methods and Results." PLET: Programmed Learning & Educational Technology 25, no. 1 (February 1988): 46–56. http://dx.doi.org/10.1080/1355800880250106.

50

Ratheal, S., and F. Lombardi. "Software testbed for the design and evaluation of distributed computer systems." Microprocessing and Microprogramming 19, no. 1 (January 1987): 49–58. http://dx.doi.org/10.1016/0165-6074(87)90233-x.
