Journal articles on the topic 'Computers Reliability Testing'



Consult the top 50 journal articles for your research on the topic 'Computers Reliability Testing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of each publication as a PDF and read its abstract online whenever they are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Jacobs, Ronald L., David M. Byrd, and William R. High. "Computerized Testing: The Hidden Figures Test." Journal of Educational Computing Research 1, no. 2 (May 1985): 173–78. http://dx.doi.org/10.2190/8v89-af7r-urea-degf.

Abstract:
The adaptation of paper-and-pencil tests to computers may be confounded by differences between the medium of paper-and-pencil and computers. The purpose of this study was to adapt the Hidden Figures Test for use on PLATO and determine the alternate-form reliability of the computerized version as compared to the paper-and-pencil version. The HFT is one of the most commonly used tests to measure the cognitive style known as field independence-field dependence. The results showed that the test could be adapted with some modifications and that a significant relationship was found between scores on the two versions of the test, though the total amount of variance explained between the tests was low. The results suggest that the computerized version may be measuring computer anxiety, perception of computer generated figures, and previous experience with a keyboard, as well as the construct under study.
2

Briš, Radim, and Simona Domesová. "New Computing Technology in Reliability Engineering." Mathematical Problems in Engineering 2014 (2014): 1–7. http://dx.doi.org/10.1155/2014/187362.

Abstract:
Reliability engineering is a relatively new scientific discipline that develops in close connection with computers. The rapid development of computer technology requires correspondingly updated source codes and software. This paper demonstrates a new parallel computing technology for availability calculation based on HPC (high-performance computing). The technology is particularly effective in combination with simulation methods; nevertheless, analytical methods are taken into account as well. In general, the basic algorithms for reliability calculations must be appropriately modified and improved to achieve better computational efficiency. Parallel processing is executed in two ways: first with the MATLAB function parfor, and second with CUDA technology. The computational efficiency was significantly improved, as clearly demonstrated in numerical experiments performed on selected testing examples as well as on an industrial example. Scalability graphs demonstrate the reduction in computation time achieved by parallel computing.
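
The abstract names parfor and CUDA but shows no code; the following is a minimal sketch, in Python rather than MATLAB, of the same pattern: independent Monte Carlo batches of an availability simulation distributed across worker processes (the role parfor plays in the paper). The failure rate, repair rate and mission horizon are illustrative assumptions, not values from the paper.

```python
# Monte Carlo availability of a repairable component, parallelized over
# independent batches (analogous to MATLAB's parfor). Illustrative values only.
import numpy as np
from multiprocessing import Pool

LAMBDA, MU, T_MISSION = 1e-3, 1e-1, 1000.0  # failure rate, repair rate, horizon

def batch_availability(args):
    seed, n_runs = args
    rng = np.random.default_rng(seed)
    up_time = 0.0
    for _ in range(n_runs):
        t, up = 0.0, True
        while t < T_MISSION:
            dwell = rng.exponential(1 / LAMBDA if up else 1 / MU)
            dwell = min(dwell, T_MISSION - t)   # truncate at mission end
            if up:
                up_time += dwell
            t += dwell
            up = not up
    return up_time / (n_runs * T_MISSION)

if __name__ == "__main__":
    jobs = [(seed, 2500) for seed in range(8)]  # eight independent batches
    with Pool() as pool:
        estimates = pool.map(batch_availability, jobs)
    print("mean availability ~", np.mean(estimates))
```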
3

Tang, Tian, Mu-Chuan Zhou, Yi Quan, Jun-Liang Guo, V. S. Balaji, V. Gomathi, and V. Elamaran. "Penetration Testing and Security Assessment of Healthcare Records on Hospital Websites." Journal of Medical Imaging and Health Informatics 10, no. 9 (August 1, 2020): 2242–46. http://dx.doi.org/10.1166/jmihi.2020.3138.

Abstract:
At present, computer security is a flourishing field in the IT industry. The use of computers and the Internet is growing drastically, and computers have therefore become vehicles for attackers to spread viruses and worms, to distribute spam and spyware, and to perform denial-of-service attacks. IT engineers (and even users) should know about network security threats and, at the same time, should know techniques to overcome these issues to some extent. The reliability and privacy of patients' healthcare records are among the most critical issues in the healthcare business sector. Security safeguards, such as physical, technical, and administrative safeguards, are crucial for protecting the information in all aspects. This article examines forty popular hospital portals in India with respect to network security issues such as operating-system guesses, the number of open/closed/filtered ports, and the name of the web server. The Nmap (network mapper) tool is used to gather and analyze these security-related results.
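
The abstract names the Nmap checks but not how they were scripted; below is a hedged Python sketch of that kind of survey, shelling out to the nmap binary for service-version detection (-sV) and an OS guess (-O). It assumes nmap is on the PATH; -O usually needs root privileges, and you should scan only hosts you are authorized to test.

```python
# Run an Nmap service/version scan and OS-detection pass against one host
# and return the text report. Illustrative sketch, not the paper's tooling.
import subprocess

def scan(host: str) -> str:
    result = subprocess.run(
        ["nmap", "-sV", "-O", host],   # -sV: service banners, -O: OS guess
        capture_output=True, text=True, timeout=600,
    )
    return result.stdout

if __name__ == "__main__":
    # Nmap's official test host, which explicitly permits scanning
    print(scan("scanme.nmap.org"))
```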
4

Kaur, Jasmine, Adarsh Anand, Ompal Singh, and Vijay Kumar. "Measuring software reliability under the influence of an infected patch." Yugoslav Journal of Operations Research 31, no. 2 (2021): 249–64. http://dx.doi.org/10.2298/yjor200117005k.

Abstract:
Patching services give software firms an option to deal with leftover bugs and thereby help them keep track of their product. More and more software firms are making use of this concept of prolonged testing. But this framework of releasing not-fully-prepared software in the market involves a huge risk. The hastiness of vendors in releasing a software patch can at times be dangerous, as there is a chance that a firm releases an infected patch. An infected patch may lead to a hike in bug occurrence and error count and may make the software more vulnerable. The current work presents an understanding of such a situation through a mathematical modeling framework wherein the distinct behavior of testers (during in-house testing and field testing) and users is described. The proposed model has been validated on two software failure data sets, from Tandem Computers and from the Brazilian Electronic Switching System TROPICO R-1500.
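
The authors' infected-patch model is not given in the abstract and is not reproduced here; as background on the model family it extends, this sketch fits the classic Goel-Okumoto software reliability growth model m(t) = a(1 - e^(-bt)) to synthetic cumulative failure counts. All data and starting values are illustrative.

```python
# Fit the Goel-Okumoto mean value function to cumulative failure counts.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    # a: expected total number of faults; b: fault detection rate
    return a * (1.0 - np.exp(-b * t))

t = np.arange(1, 21, dtype=float)  # test weeks (illustrative)
m_obs = goel_okumoto(t, 100, 0.15) + np.random.default_rng(0).normal(0, 2, t.size)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, m_obs, p0=(50, 0.1))
print(f"expected total faults a ~ {a_hat:.1f}, detection rate b ~ {b_hat:.3f}")
```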
5

Muhammad, Dicky, Gita Indah Hapsari, and Giva Andriana Mutiara. "An Experimental Connectivity Performance of Simple Wireless Mesh Implementation Using Wireless Distribution System (WDS)." IJAIT (International Journal of Applied Information Technology) 1, no. 02 (August 14, 2017): 18–27. http://dx.doi.org/10.25124/ijait.v1i02.871.

Abstract:
Wireless technology is growing rapidly today, especially in the field of telecommunications and communications, and computer networks now make wide use of wireless links. A wireless mesh network is one method for connecting computers wirelessly. One important factor in deploying a wireless network is how to extend the wireless signal coverage. A Wireless Distribution System (WDS) is one way to expand a wireless network by wirelessly interconnecting access points in an IEEE 802.11 network. This study shows how to build a simple wireless computer network using WDS technology and describes its connectivity performance and signal coverage. The connectivity tests show that the connection between two computers works properly in the reliability and multi-SSID tests; however, connectivity was not successful in the multichannel test. Furthermore, the coverage tests show that the wireless signal reaches 39 meters, varying with room conditions.
6

Martin, Nicholas, John Capman, Anthony Boyce, Kyle Morgan, Manuel Francisco Gonzalez, and Seymour Adler. "New frontiers in cognitive ability testing: working memory." Journal of Managerial Psychology 35, no. 4 (January 18, 2020): 193–208. http://dx.doi.org/10.1108/jmp-09-2018-0422.

Abstract:
Purpose: Cognitive ability tests demonstrate strong relationships with job performance, but have several limitations; notably, subgroup differences based on race/ethnicity. As an alternative, the purpose of this paper is to develop a working memory assessment for personnel selection contexts. Design/methodology/approach: The authors describe the development of Global Adaptive Memory Evaluation (G.A.M.E.) – a working memory assessment – along with three studies focused on refining and validating G.A.M.E., including examining test-taker reactions, reliability, subgroup differences, construct and criterion-related validity, and measurement equivalence across computer and mobile devices. Findings: Evidence suggests that G.A.M.E. is a reliable and valid tool for employee selection. G.A.M.E. exhibited convergent validity with other cognitive assessments, predicted job performance, yielded smaller subgroup differences than traditional cognitive ability tests, was engaging for test-takers, and upheld equivalent measurement across computers and mobile devices. Research limitations/implications: Additional research is needed on the use of working memory assessments as an alternative to traditional cognitive ability testing, including its advantages and disadvantages, relative to other constructs and methods. Practical implications: The findings illustrate working memory's potential as an alternative to traditional cognitive ability assessments and highlight the need for cognitive ability tests that rely on modern theories of intelligence and leverage burgeoning mobile technology. Originality/value: This paper highlights an alternative to traditional cognitive ability tests, namely, working memory assessments, and demonstrates how to design reliable, valid, engaging and mobile-compatible versions.
7

Umar, Rusydi, Imam Riadi, and Ridho Surya Kusuma. "Mitigating Sodinokibi Ransomware Attack on Cloud Network Using Software-Defined Networking (SDN)." International Journal of Safety and Security Engineering 11, no. 3 (June 30, 2021): 239–46. http://dx.doi.org/10.18280/ijsse.110304.

Abstract:
The Sodinokibi ransomware has become a severe threat by encrypting data on a server, and the infection continues to spread and encrypt data on other computers. This study aims to mitigate such attacks by experimentally building a prevention system based on computer network management. The mitigation process is carried out through static analysis, dynamic analysis, and Software-Defined Networking (SDN) to prevent the impact of attacks through programmatic network management. The SDN implementation consists of two main components: the Ryu controller and Open vSwitch (OVS). Testing the mitigation system on infected networks shows that cutting TCP access can reduce virus spread by 17.13% and suppress Sodinokibi traffic logs by up to 73.97%. Based on these figures, the SDN-based mitigation in this study meets its objective of mitigating ransomware attacks on computer network traffic.
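
The abstract names the Ryu controller and OVS but shows no code; below is a minimal sketch (not the authors' application) of the blocking idea: a Ryu OpenFlow 1.3 app that installs a high-priority drop rule for all TCP traffic from a hypothetical infected host. The address 10.0.0.99 and the priority value are assumptions for illustration.

```python
# Run with: ryu-manager tcp_blocker.py (Ryu controller managing an OVS switch)
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class TcpBlocker(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        parser = dp.ofproto_parser
        # Match IPv4 (eth_type 0x0800) TCP (ip_proto 6) from the suspect host
        match = parser.OFPMatch(eth_type=0x0800, ip_proto=6,
                                ipv4_src="10.0.0.99")  # hypothetical infected host
        # An empty instruction list means the switch drops matching packets
        mod = parser.OFPFlowMod(datapath=dp, priority=100,
                                match=match, instructions=[])
        dp.send_msg(mod)
```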
8

Saxena, Sharad. "A Decision Theoretic Estimation in Exponential Product Life Testing Model Using Guesstimate." Vikalpa: The Journal for Decision Makers 31, no. 4 (October 2006): 31–46. http://dx.doi.org/10.1177/0256090920060403.

Abstract:
In today's technologically advanced world, everyone depends on the continued working of a wide variety of products like complex machinery and equipment for their everyday requirements pertaining to health, safety, mobility, and economic welfare. We expect our cars, computers, electrical appliances, lights, televisions, etc., to function whenever we need them, day after day, year after year. When they fail, the results can be catastrophic: injury, loss of life, or maybe costly lawsuits. More often, repeated failure leads to annoyance, inconvenience, and a lasting customer dissatisfaction that can have an adverse impact on the company's position in the marketplace. It takes a long time for a company to build up a reputation for reliability and only a short time to be branded as ‘unreliable’ after shipping a flawed product. Continual assessment of new product reliability and ongoing control of the reliability of everything shipped are critical necessities in today's competitive marketplace. The mean and variance of an exponentially distributed random variable are of interest in numerous life and reliability testing problems as measures of various characteristics like expected life, failure rate, reliability, future mean, etc., and, thus, it is of practical importance to estimate these unknown parameters. Situations frequently arise where one has an initial estimate that combines reasoning with guessing, called a guesstimate. The guesstimate may be based on a calculation of sample observations, or on a conjecture that comes from past experience about similar situations involving a similar parameter, or from association with the experimental material, or from any reliable source. It is in the form of a point value arising from either statistical investigation or other sources, and thus should also be incorporated in the estimation procedure. This paper proposes some improved estimators by considering the guesstimate, and employs the results in the estimation of mean and variance in an accelerated life testing experiment on the breaking strength of transformer oil at different voltage levels. The approach is very generic, and the suggested estimators are shown to be better than the other estimators in the sense of having a smaller mean squared error as well as a smaller bias.
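
The paper's improved estimators are not specified in the abstract; the sketch below illustrates the guesstimate idea in its simplest shrinkage form, pulling the sample mean of exponential lifetimes toward a prior guess and comparing mean squared errors by simulation. The weight k, the guess value and the sample size are illustrative assumptions.

```python
# Compare the plain sample mean with a shrinkage estimator that incorporates
# a prior guess of the mean life. Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(1)
theta, guess, k, n = 100.0, 90.0, 0.7, 10  # true mean life, guesstimate, weight, n

def mse(estimates):
    return np.mean((estimates - theta) ** 2)

samples = rng.exponential(theta, size=(100_000, n))
xbar = samples.mean(axis=1)              # ordinary estimator
shrunk = k * xbar + (1 - k) * guess      # estimator incorporating the guess

print("MSE(sample mean):", round(mse(xbar), 1))
print("MSE(shrinkage)  :", round(mse(shrunk), 1))  # smaller when the guess is good
```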
9

Aslam, Tariq M., Ian J. Murray, Michael Y. T. Lai, Emma Linton, Humza J. Tahir, and Neil R. A. Parry. "An assessment of a modern touch-screen tablet computer with reference to core physical characteristics necessary for clinical vision testing." Journal of The Royal Society Interface 10, no. 84 (July 6, 2013): 20130239. http://dx.doi.org/10.1098/rsif.2013.0239.

Abstract:
There are a multitude of applications using modern tablet computers for vision testing that are accessible to ophthalmology patients. While these may be of potential future benefit, they are often unsupported by scientific assessment. This report investigates the pertinent physical characteristics behind one of the most common, highest-specification tablet computers with regard to its capacity for vision testing. We demonstrate through plotting of a gamma curve that it is feasible to produce a precise programmable range of central luminance levels on the device, even with varying background luminance levels. It may not be possible to display very low levels of contrast, but carefully using the gamma curve information allows a reasonable range of contrast sensitivity to be tested. When the screen is first powered on, it may require up to 15 min for the luminance values to stabilize. Finally, luminance of objects varies towards the edge of the screen and when viewed at an angle. However, the resulting effective contrast of objects is less variable. Details of our assessments are important to developers, users and prescribers of tablet clinical vision tests. Without awareness of such findings, these tests may never reach satisfactory levels of clinical validity and reliability.
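
The report's calibration data are not in the abstract; the sketch below shows the generic gamma-curve workflow it describes: fit a display model L(v) = Lmin + (Lmax - Lmin) * (v/255)^gamma to measured luminances, then invert the fit to find the gray level for a target luminance. The "measurements" are invented for illustration.

```python
# Fit a display gamma curve and invert it to request a target luminance.
import numpy as np
from scipy.optimize import curve_fit

def display_model(v, L_min, L_max, gamma):
    return L_min + (L_max - L_min) * (v / 255.0) ** gamma

v = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255], dtype=float)
L = np.array([0.5, 2.1, 8.0, 19.0, 39.0, 63.0, 95.0, 135.0, 180.0])  # cd/m^2 (invented)

(L_min, L_max, gamma), _ = curve_fit(display_model, v, L, p0=(0.5, 180.0, 2.2))
print(f"fitted gamma ~ {gamma:.2f}")

def gray_level_for(L_target):
    # Invert the fitted curve to find the 8-bit value producing L_target
    frac = (L_target - L_min) / (L_max - L_min)
    return int(round(255 * frac ** (1 / gamma)))

print("gray level for 50 cd/m^2:", gray_level_for(50.0))
```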
10

Megantoro, Prisma, Hafidz Faqih Aldi Kusuma, Sinta Adisti Reina, Abdul Abror, Lilik Jamilatul Awalin, and Yusrizal Afif. "Reliability and performance analysis of a mini solar home system installed in Indonesian household." Bulletin of Electrical Engineering and Informatics 11, no. 1 (February 1, 2022): 20–28. http://dx.doi.org/10.11591/eei.v11i1.3335.

Abstract:
During the COVID-19 pandemic, since early 2020 in Indonesia, the demand for electrical energy in the housing sector has increased significantly. This is due to the government's recommendation to reduce outside activities and work from home; educational and entertainment activities in particular are recommended to be done online. Many people complain about the increase in monthly electricity payments compared to before the pandemic. The construction of solar home systems (SHS) will reduce electricity consumption from the public grid. An SHS installation can supply some household electricity needs, such as computers, televisions, internet facilities, lighting, et cetera. In this article, the researchers discuss the performance testing of a 300 Wp SHS installed in a residential building, accompanied by the design and measurement of the solar energy potential.
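
The abstract does not give the system's measured yield; as a rough, first-order illustration of what a 300 Wp system can supply, the sketch below applies the standard yield estimate E = P * PSH * PR with typical (assumed, not measured) peak-sun-hours and performance-ratio values.

```python
# First-order daily energy estimate for a small solar home system.
P_RATED_W = 300   # installed capacity (Wp)
PSH = 4.5         # peak sun hours per day (typical assumption for Indonesia)
PR = 0.75         # performance ratio (wiring, inverter, temperature losses)

daily_wh = P_RATED_W * PSH * PR
print(f"estimated daily yield ~ {daily_wh:.0f} Wh "
      f"(~{daily_wh / 1000 * 30:.0f} kWh per month)")
```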
11

Lytton, Robert L. "Characterizing Asphalt Pavements for Performance." Transportation Research Record: Journal of the Transportation Research Board 1723, no. 1 (January 2000): 5–16. http://dx.doi.org/10.3141/1723-02.

Abstract:
Having fast computers with lots of memory makes it possible to make performance predictions using mechanics, which simply could not have been done even as recently as 3 or 4 years ago. This trend can be expected to continue. Pavement materials, whether asphalt concrete, base course, or subgrade, can now be characterized in realistic ways that were simply unavailable to pavement engineers. The dependence of these materials on stress state, moisture, temperature, strain rate, and damage is what has made their characterization difficult, if not impossible. This complexity has posed an almost insuperable problem for computational mechanics and for the linkage of these properties to construction specifications on the one hand and performance of in-service pavements on the other. Although these tasks will never be simple, the tools to work on them are becoming more adequate. Some examples are given of how one can proceed to simplicity through complexity in several areas that are important to pavement performance. In every case, one must be able to measure a property of the material that can be used as input to a computer program, principally a finite element program, that allows material properties to change from point to point. Several examples are given in the categories of material properties, material behavior, materials testing in the laboratory, nondestructive testing in the field, and performance prediction models. Laboratory testing, materials characterization and properties, nondestructive testing of pavements in service, construction specifications, pavement variability and reliability, field data collection, performance prediction modeling, and prediction of pavement response and distress with modern computer methods—taken individually, these subjects look complex. Combined with the glue of mechanics, they are simplicity itself. And they confirm Alfred North Whitehead’s principle, “The only simplicity that can be trusted is that which lies beyond complexity.”
12

Revel, Gian Marco. "A New Vibration Measurement Procedure for On-Line Quality Control of Electronic Devices." Shock and Vibration 9, no. 1-2 (2002): 3–10. http://dx.doi.org/10.1155/2002/401407.

Abstract:
In this paper the problem of experimentally testing the mechanical reliability of electronic components for quality control is approached. In general, many tests are performed on electronic devices (personal computers, power supply units, lamps, etc.), according to the relevant international standards (IEC), in order to verify their resistance to shock and vibrations, but these are mainly “go no-go” experiments, performed on few samples taken from the production batches. The idea proposed here is to improve the efficiency of these tests by using electro-optic techniques for the measurement of the vibration behaviour of the components under known excitation. This would allow the on-line testing of a high percentage of the production and would be useful to give important feedback to the design process. Scanning laser Doppler vibrometry seems to be a valuable solution for this problem, thanks to its capabilities of measuring several spatially-defined points on a vibrating object with reduced testing time for on-line application, with high sensitivity and accuracy, non-intrusivity and with any kind of excitation signal. Experimental tests are performed on a power supply: the results show the effectiveness of the proposed approach. The metrological problems connected with the on-line implementation are also discussed.
13

Lorenzetti, Robert, Janet Williamson, Larry Hoffman, Tim Beck, and Frank Maguire. "A Case Study in large Scale Computer Human Interaction Design." Proceedings of the Human Factors Society Annual Meeting 32, no. 16 (October 1988): 1041–45. http://dx.doi.org/10.1177/154193128803201610.

Abstract:
The Reliability & Maintainability Information System (REMIS) is being developed by a team of Litton Computer Services, Tandem Computers, and SofTech under contract to the Air Force Logistics Command (AFLC) Logistics Management Support Center. Its purpose is to make accessible to AF managers worldwide, the data necessary to keep weapon systems combat ready in peace and sustain them in war. REMIS will modernize the collection and use of inventory, status, utilization, operational reliability, maintenance, configuration, mission capability, and awaiting parts data for aircraft, trainers, automatic test equipment, Communications-Electronics (C-E) equipment, and support equipment. The Government procurement request for REMIS envisioned the need for human engineering of the Human Computer Interface (HCI), but there were no requirements to deliver human factors analysis documentation nor to conduct any formal testing. The HCI design task was eventually assigned to an ad hoc team (Messrs. Lorenzetti, Beck, and Maguire) with no formal human engineering experience, and with severe time constraints. Design of a user-friendly system under these constraints (using available human factors data sources) proved to be a challenging exercise! This paper presents a description of the informal user surveys and qualitative evaluations that were used as a surrogate to the more formal approaches normally recommended. After the fact, the over 900 guidelines in Smith and Mosier were reviewed, with 75 found to be specifically applicable to REMIS. REMIS conformance to each of the 75 guidelines was then assessed. Although the REMIS design was evaluated as reasonably good, we concluded that specific human engineering requirements, schedule, budget, and documentation should be provided. The accessibility of human factors data supporting design should be substantially improved if better quality HCIs are to be assured.
14

Kuznetsov, Vitaly, Galina Polekhina, and Yulia Shaposhnikova. "Computerized testing as efficient form of objective knowledge monitoring in studying of technical subjects." Stroitel stvo nauka i obrazovanie [Construction Science and Education], no. 2 (June 30, 2019): 11. http://dx.doi.org/10.22227/2305-5502.2019.2.11.

Abstract:
Introduction. Objective and regular monitoring of students' knowledge in technical subjects can be implemented by means of special tests that allow for the required mastering level of the material and reliable consolidation of the acquired knowledge. Various aspects of the application of tests in academic activity are considered. Materials and methods. Tests used in practical studies should meet specific requirements: validity, definiteness, simplicity, unambiguity, reliability. The identification of mastering levels makes it possible to “troubleshoot” and to improve the academic activity and the degree to which students master the competences. Based on an assessment of the pattern of the forthcoming activity, four mastering levels of the subject matter can be distinguished. Level I tests include recognition, discrimination and classification. Level II tests monitor mastery at the level of “reproduction”, calling for retrieval of information from memory and its analysis for routine assignments. Level III tests pose special assignments for which no ready algorithms are provided, and the solutions found lead to subjectively new information. Level IV tests reveal students' capability to take decisions in new problematic situations; the solutions found, being a result of creative activity, yield objectively new information. Results. An efficient system of monitoring tests in a given subject requires a number of basic prerequisites: a database with the required number of assignments (at least 30 and at most 70), a time limit in accordance with the required labor intensity, assessment criteria for the assignments, and output of the results. Conclusions. Given at least one PC per two students, correctly arranged computer testing considerably reduces the time demanded by a monitoring event, increases the responsibility and progress of the students, guarantees the objectiveness of knowledge monitoring, and helps to avoid conflicts.
15

Seter, Zehava, and Cristian Stan. "Educational Change - Easy to Say, Hard to Do: Teachers’ Perceptions towards Introducing Innovative Teaching Approaches." Educatia 21, no. 19 (December 19, 2020): 127–36. http://dx.doi.org/10.24193/ed21.2020.19.16.

Abstract:
"Computers, communication, and internet technologies have led to significant changes in learning and teaching. The constructivist approach in education puts learners at the center of the teaching process and actively makes them construct their knowledge, developing 21st century skills required for tomorrow’s world. Despite advances in the process of integrating technology into teaching, a significant gap still exists between promise and actual reality. Implementation of computer technologies depends on many complex factors, one of which is teachers’ perceptions of assimilating computerization into their teaching methods. This research is part of a broader study examining techno-pedagogical change in a high school in Israel. The current study focuses on the process of constructing, testing, and validating a questionnaire examining teachers’ attitudes toward pedagogical innovation and assimilating technological skills into teaching. The validation process was done by an exploratory factor analysis to detect cases with low variability and explore the dimensionality of each survey instrument. This was complemented by a confirmatory phase. The results showed high reliability and stable dimensions in the instruments. This study's importance is in constructing an original instrument that examines the extent to which high school teachers adopt innovative pedagogies assimilating technological tools. This study may have a universal contribution because the instrument can be used across countries and cultures."
16

A.V., Chistyakov. "On improving the efficiency of mathematical modeling of the problem of stability of construction." Artificial Intelligence 25, no. 3 (October 10, 2020): 27–36. http://dx.doi.org/10.15407/jai2020.03.027.

Abstract:
Algorithmic software for the mathematical modeling of structural stability is considered. The problem reduces to solving a partial generalized eigenvalue problem for sparse matrices of various structures and large orders, with automatic parallelization of the calculations on modern parallel computers with graphics processors. Such algebraic eigenvalue problems arise in many areas of physical and technical modeling, in particular in strength analysis for civil and industrial construction, aircraft building, and electric welding. The continuous growth of task sizes demands computing power well beyond traditional parallel computing, even with multicore processors; today this demand is met by hybrid supercomputers that combine multicore processors (CPUs) and graphics processors (GPUs), i.e., MIMD and SIMD architectures. This potential can be exploited fully only by algorithmic software that accounts for both the properties of the task and the features of the hybrid architecture: efficient use of computing resources, storage and processing schemes for sparse matrices, and analysis of the reliability of computed results. The paper presents the main methodological principles and implementation features of parallel algorithms for different sparse-matrix structures, which realize the multilevel parallelism of a hybrid system and reduce data-exchange time during the computational process. As an example, a hybrid subspace-iteration algorithm is given for band and block-diagonal matrices with a frame, and data decomposition for matrices of profile structure is considered. The proposed approach automatically determines the required topology of the hybrid computer and the optimal amount of resources for organizing an efficient computational process, freeing users from the problems of parallelizing complex tasks. The developed algorithmic software automatically implements all stages of parallel computing and sparse-matrix processing on a hybrid computer; it was tested on problems from the University of Florida sparse matrix collection and was used at the S.P. Tymoshenko Institute of Mechanics of the NAS of Ukraine to model the strength of composite materials with a three-dimensional "finite size fibers" model. A significant improvement in the time characteristics of the mathematical modeling was obtained on computers of different architectures.
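
The paper's hybrid CPU/GPU solvers are not shown in the abstract; as a small-scale illustration of the underlying task, this sketch solves a partial generalized eigenvalue problem K x = lambda M x for sparse symmetric matrices with SciPy. The matrices are random stand-ins, not a real finite-element model.

```python
# Partial generalized eigenvalue problem for sparse symmetric matrices.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
rng = np.random.default_rng(0)
# Symmetric positive-definite "stiffness" and "mass" stand-ins (tridiagonal)
main = 2.0 + rng.random(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], [-1, 0, 1], format="csc")
M = sp.diags(1.0 + rng.random(n), 0, format="csc")

# Shift-invert around sigma=0 finds the smallest eigenvalues, which
# correspond to the critical modes in a stability analysis.
vals, vecs = eigsh(K, k=5, M=M, sigma=0, which="LM")
print("five smallest eigenvalues:", np.round(vals, 4))
```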
17

Rofatto, Vinicius Francisco, Marcelo Tomio Matsuoka, and Ivandro Klein. "DESIGN OF GEODETIC NETWORKS BASED ON OUTLIER IDENTIFICATION CRITERIA: AN EXAMPLE APPLIED TO THE LEVELING NETWORK." Boletim de Ciências Geodésicas 24, no. 2 (June 2018): 152–70. http://dx.doi.org/10.1590/s1982-21702018000200011.

Abstract:
We present a numerical simulation method for designing geodetic networks. The quality criterion considered is based on the power of the test of the data snooping testing procedure, i.e., the probability that data snooping correctly identifies an outlier. In general, the power of the test is defined theoretically; however, with the advent of fast computers and large data storage systems, it can be estimated by numerical simulation. Here, the number of experiments in which the data snooping procedure correctly identifies the outlier is counted using Monte Carlo simulation. If the network configuration does not meet the reliability criterion in some part, it can be improved by adding the required observations to the surveying plan. The method does not use real observations; thus, it depends only on the geometrical configuration of the network, the uncertainty of the observations, and the size of the outlier. The proposed method is demonstrated by practical application to a simulated leveling network. Results showed the need for five additional observations between adjacent stations; adding these new observations improved the internal reliability by approximately 18%. The final designed network is therefore able to identify and resist outliers in accordance with the specified probability levels.
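
The sketch below illustrates the Monte Carlo idea on a toy one-dimensional adjustment rather than the paper's leveling network: inject a known gross error, apply Baarda's w-test to the normalized residuals, and count how often the right observation is flagged. The outlier size, sigma and critical value are illustrative assumptions.

```python
# Estimate the power of data snooping by simulation on a toy adjustment.
import numpy as np

rng = np.random.default_rng(42)
A = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])  # five repeated observations
sigma, outlier, k_crit = 1.0, 4.0, 3.29            # mm, mm, critical value (alpha ~ 0.001)

def snooping_power(n_trials=20_000):
    hits = 0
    Qv = np.eye(5) - A @ np.linalg.inv(A.T @ A) @ A.T  # residual cofactor matrix
    r_std = np.sqrt(np.diag(Qv)) * sigma
    for _ in range(n_trials):
        e = rng.normal(0, sigma, 5)
        e[2] += outlier                    # gross error injected into observation 3
        v = Qv @ e                         # least-squares residuals
        w = np.abs(v) / r_std              # normalized residuals (w-test statistics)
        if w.argmax() == 2 and w.max() > k_crit:
            hits += 1                      # correct identification
    return hits / n_trials

print("estimated power of data snooping:", snooping_power())
```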
18

Kristiyani, Erni. "Evaluasi Pelaksanaan Program Ujian Online di SMK NEGERI 4 JAKARTA." Jurnal PenSil 3, no. 1 (February 27, 2014): 62–75. http://dx.doi.org/10.21009/jpensil.v3i1.9885.

Abstract:
The objective of this study is to determine whether the online testing program for Class XII students of the Stone Concrete Construction Engineering Department at SMKN 4 Jakarta has been successful. The research was conducted in the Stone Concrete Construction Engineering Department of SMKN 4 Jakarta from July 2013 to November 2013. The method used is evaluative research with the CIPP evaluation model (Context, Input, Process, Product), with a sample of 26 Class XII students of the department, one teacher of the department, and the Head of IT of SMKN 4 Jakarta. The research instruments were the online test itself, interviews, and questionnaires. Validity testing found all 10 questionnaire items valid, while the reliability test gave an R11 value of 0.904. To run the online testing program, the Head of IT requires Moodle, a server, an internet network, and a laptop; the teachers need a laptop, Hot Potatoes, and the internet network; the students need a laptop and the internet network. The curriculum used is the Kurikulum Tingkat Satuan Pendidikan (KTSP). The average score students obtained in online testing was 72, lower than the Kriteria Ketuntasan Minimal (KKM), which had been set at 75. From the questionnaire results, the online testing program achieved only 56.69% of what respondents expected. During the implementation of the program there were issues such as the school wifi network being unusable, the server often going down, students having to bring their own laptops and modems, and hot classrooms. A positive impact of the program is that teachers and students can operate computers and the Internet as a means of learning evaluation. Overall, the implementation of the online testing program at SMKN 4 Jakarta cannot yet be called successful, and improvement is needed, especially in the school's internet network and the facilities supporting the program.
19

Minardi, Alberto, Silvio B. Giger, Russell T. Ewy, Rudy Stankovic, Jørn Stenebråten, Magnus Soldal, Marco Rosone, Alessio Ferrari, and Lyesse Laloui. "Benchmark study of undrained triaxial testing of Opalinus Clay shale: Results and implications for robust testing." Geomechanics for Energy and the Environment 25 (March 2021): 100210. http://dx.doi.org/10.1016/j.gete.2020.100210.

20

Hadi, Samsul, Haryanto Haryanto, Muh Asriadi AM, Marlina Marlina, and Abdul Rahim. "Developing Classroom Assessment Tool using Learning Management System-based Computerized Adaptive Test in Vocational High Schools." Journal of Education Research and Evaluation 6, no. 1 (February 22, 2022): 143–55. http://dx.doi.org/10.23887/jere.v6i1.35630.

Abstract:
Computers have taken on a large role in education, including testing and evaluation. Traditional tests that are not comprehensive and do not distinguish between students' starting abilities lead to measurement findings that are not representative of their true abilities. This study aims to develop, and test the eligibility of, a classroom assessment tool implemented as an LMS-based computerized adaptive test. The study is development research. The respondents were expert validators and vocational high school students of the Electrical Power Installation Engineering competency. The data analysis techniques used were item response theory, classical test theory, and descriptive statistics. Item analysis using the Rasch model showed 10 items not fitting and the remaining 50 items fitting. Under classical test theory, 0 items showed low validity, 55 moderate, and 5 high, with an Alpha reliability of 0.934. The attitude questionnaire developed consists of 8 items: 0 of low validity, 6 moderate, and 2 high, with an Alpha reliability of 0.731. The observation guide developed contains 16 items: 0 of low validity, 15 moderate, and 1 high. The use of a CAT-assisted learning management system as an assessment tool greatly helps teachers conduct assessment accurately and practically.
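
The Alpha reliabilities quoted above (0.934, 0.731) are Cronbach's alpha coefficients; for reference, the sketch below shows the standard computation on a made-up matrix of dichotomous item scores (rows are students, columns are items).

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                        # number of items
    item_vars = scores.var(axis=0, ddof=1)     # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1) # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
ability = rng.normal(0, 1, size=(200, 1))
# 10 dichotomous items driven by a common ability factor (synthetic data)
items = (ability + rng.normal(0, 0.8, size=(200, 10)) > 0).astype(float)
print("Cronbach's alpha ~", round(cronbach_alpha(items), 3))
```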
21

Tokhi, Hamayon, Gang Ren, and Yi Min Xie. "Improving the Predictability of Hiley Pile Driving Formula in the Quality Control of the Pile Foundation." Advanced Materials Research 261-263 (May 2011): 1292–96. http://dx.doi.org/10.4028/www.scientific.net/amr.261-263.1292.

Abstract:
Pile dynamic formulas are the oldest and most frequently used method to determine the bearing capacity of piles. More recent methods are based on wave equation analysis and formulations such as the Case Method, TNO, CAPWAP and TEPWAP, which were developed for pre-driving analysis and post-driving measurement applications. The major factors behind the common use of the dynamic formulas have been their simplicity, cost effectiveness and applicability in various piling situations. However, in some literature the energy approach has been given an unfair reputation as being unreliable and less accurate than the more analytical or dynamic testing methods. One issue behind the poor performance of the dynamic formulas is that, historically, the hammer energy and the energy transferred to the pile had to be assumed. Nevertheless, with the advent of computers, new technologies are emerging with the advancement of the construction industry, producing gradual improvements that have allowed the dynamic method to be used on many projects with greater reliability. In this paper, the different testing methods as well as the pros and cons of the pile driving formulas are reviewed, and an approach to improving the widely used Hiley dynamic equation is presented. This approach enables the pile capacity to be evaluated more accurately.
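
The paper's improved equation is not given in the abstract; for reference, the sketch below evaluates one common statement of the original Hiley formula with illustrative numbers (not data from the paper).

```python
# One common statement of the Hiley formula (consistent units required):
#   R_u = (e_f * W * h) / (s + c/2) * (W + e**2 * P) / (W + P)
# e_f: hammer efficiency, W: hammer weight, h: drop height, s: set per blow,
# c: temporary elastic compression, P: pile weight, e: coefficient of restitution.
def hiley_capacity(e_f, W, h, s, c, P, e):
    return (e_f * W * h) / (s + c / 2.0) * (W + e**2 * P) / (W + P)

# Illustrative numbers: 50 kN hammer dropping 1.2 m, 5 mm set, 10 mm compression
R = hiley_capacity(e_f=0.8, W=50.0, h=1.2, s=0.005, c=0.010, P=30.0, e=0.4)
print(f"ultimate driving resistance ~ {R:.0f} kN")
```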
22

Correia, Sérgio D., João Fé, Slavisa Tomic, and Marko Beko. "Development of a Test-Bench for Evaluating the Embedded Implementation of the Improved Elephant Herding Optimization Algorithm Applied to Energy-Based Acoustic Localization." Computers 9, no. 4 (November 3, 2020): 87. http://dx.doi.org/10.3390/computers9040087.

Abstract:
The present work addresses the development of a test-bench for the embedded implementation, validation, and testing of the recently proposed Improved Elephant Herding Optimization (iEHO) algorithm, applied to the acoustic localization problem. The implemented methodology aims to corroborate the feasibility of applying iEHO in real-time applications on low-complexity and low-power devices, where three different electronic modules are used and tested. Swarm-based metaheuristic methods are usually examined by employing high-level languages on centralized computers, demonstrating their capability in finding global or good local solutions. This work considers an iEHO implementation in C running on an embedded processor. Several random scenarios are generated, uploaded, and processed by the embedded processor to demonstrate the algorithm's effectiveness and the test-bench's usability, low complexity, and high reliability. On the one hand, the results obtained on our test-bench are concordant, in terms of accuracy, with high-level implementations using MatLab®. On the other hand, concerning processing time and as a breakthrough, the results obtained on the test-bench demonstrate the high suitability of the embedded iEHO implementation for real-time applications due to its low latency.
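
Neither iEHO nor the authors' C code is reproduced in the abstract; the sketch below only illustrates the energy-based acoustic localization objective that such an optimizer minimizes, with a crude random search standing in for the metaheuristic. The sensor layout, source power and path-loss exponent are illustrative assumptions.

```python
# Energy-based acoustic localization: each sensor i measures y_i ~ P / d_i^beta;
# the source position is the minimizer of the squared residuals.
import numpy as np

rng = np.random.default_rng(3)
sensors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
true_x, P, beta = np.array([3.0, 7.0]), 100.0, 2.0

def energies(x):
    d = np.linalg.norm(sensors - x, axis=1)
    return P / d**beta

y = energies(true_x) * (1 + rng.normal(0, 0.02, len(sensors)))  # noisy readings

# Random search as a placeholder for the iEHO metaheuristic
candidates = rng.uniform(0, 10, size=(50_000, 2))
d = np.linalg.norm(candidates[:, None, :] - sensors, axis=2)
costs = ((P / d**beta - y) ** 2).sum(axis=1)
print("estimated source position:", np.round(candidates[costs.argmin()], 2))
```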
23

Hu, Juan, and Jianwei Wu. "5G Network Slicing: Methods to Support Blockchain and Reinforcement Learning." Computational Intelligence and Neuroscience 2022 (March 24, 2022): 1–10. http://dx.doi.org/10.1155/2022/1164273.

Abstract:
With the advent of the 5G era, the limited network resources and methods of earlier generations can no longer guarantee that all services can be carried out. In the 5G era, network services are not limited to mobile phones and computers but support the normal operation of equipment in all walks of life. Scenarios are becoming more numerous and more complex, and more convenient and faster methods are needed to assist network services. In order to better offload business traffic, refine services, and assist the development of 5G network technology, this article proposes 5G network slicing methods that support blockchain and reinforcement learning, aiming to improve the efficiency of network services. The results show the following: (1) In model testing, the delay increases with the number of slices, but the blockchain + reinforcement learning method maintains the lowest delay; with 3 slices the delay is 155 ms. (2) Comparing slice types, the latency of 5G network slicing is lower than that of 4G, 3G, and 2G slicing, and the minimum latency of 5G slicing using blockchain and reinforcement learning is only 15 ms. (3) In the system reliability tests, reliability decreases as the number of users increases, because reliability is tied to delay: the greater the transmission delay, the lower the reliability. The blockchain + reinforcement learning method has the highest reliability, 0.95. (4) Resource-utilization experiments on different slices show that the blockchain + reinforcement learning method has the highest resource utilization: all four slices are above 0.8, with a maximum of 1. (5) In the simulation tests, the average received throughput of video stream 1 is higher than that of video stream 2, IoT devices and mobile devices, and the average cumulative received throughput is highest under the blockchain + reinforcement learning method, at 1450 kbps. The average QoE of video stream 1 is likewise higher than that of video stream 2, IoT devices and mobile devices, and the average QoE is highest under the blockchain + reinforcement learning method, reaching 0.83.
24

Giger, Silvio B., Russell T. Ewy, Valentina Favero, Rudy Stankovic, and Lukas M. Keller. "Consolidated-undrained triaxial testing of Opalinus Clay: Results and method validation." Geomechanics for Energy and the Environment 14 (June 2018): 16–28. http://dx.doi.org/10.1016/j.gete.2018.01.003.

25

Korshunov, Gennady I. "Creation and development of cyber-physical systems under condition of uncertainty interaction of subsystems." Journal of Physics: Conference Series 2373, no. 6 (December 1, 2022): 062011. http://dx.doi.org/10.1088/1742-6596/2373/6/062011.

Abstract:
The tasks of creating and developing cyber-physical systems are considered. Cybernetic and physical subsystems can be presented in digital and analog form; the physical subsystem that simulates nature is presented in analog form. Physical subsystems and processes, both natural and technogenic, can behave like “things in themselves,” so synchronizing analog processes with the cybernetic ones is often difficult to formalize or is not performed correctly enough. This applies both to obvious cases in problem areas such as ecology, mining and energy, and to computers whose programs behave inappropriately, which undermines the reliability of testing. It leads to uncertainties accompanying the integration of subsystems and the control of the system as a whole. The nature of analog-to-digital and digital-to-analog conversion, as commonly represented, requires taking these features into account at the level of this class of systems. The problems are considered at the deterministic, stochastic, and uncertainty levels. Well-known models for monitoring a physical subsystem cannot be used directly for control; the development of correct control actions is considered from the standpoint of transforming such models. For dynamical systems these models are represented by differential equations, and input-output models must be developed for the subsequent synthesis of controls.
26

Ewy, Russell T. "Practical approaches for addressing shale testing challenges associated with permeability, capillarity and brine interactions." Geomechanics for Energy and the Environment 14 (June 2018): 3–15. http://dx.doi.org/10.1016/j.gete.2018.01.001.

27

Loveridge, Fleur, John S. McCartney, Guillermo A. Narsilio, and Marcelo Sanchez. "Energy geostructures: A review of analysis approaches, in situ testing and model scale experiments." Geomechanics for Energy and the Environment 22 (May 2020): 100173. http://dx.doi.org/10.1016/j.gete.2019.100173.

28

Álvarez, José Luis, Juan Daniel Mozo, and Eladio Durán. "Analysis of Single Board Architectures Integrating Sensors Technologies." Sensors 21, no. 18 (September 21, 2021): 6303. http://dx.doi.org/10.3390/s21186303.

Abstract:
Development boards, Single-Board Computers (SBCs) and Single-Board Microcontrollers (SBMs) integrating sensors and communication technologies have become a very popular and interesting solution in the last decade. They are of interest for their simplicity, versatility, adaptability, and ease of use and prototyping, which allow them to serve as a starting point for projects and as a reference for all kinds of designs. There are innumerable applications integrating sensors and communication technologies where they are increasingly used, including robotics, domotics, testing and measurement, Do-It-Yourself (DIY) projects, Internet of Things (IoT) devices in the home or workplace, and the educational and academic worlds for STEAM (Science, Technology, Engineering, Arts and Mathematics) skills. The interest in single-board architectures and their applications has led all electronics manufacturers to develop low-cost single-board platform solutions. In this paper we analyze the most important topics related to single-board architectures integrating sensors. We compare the most popular platforms on characteristics such as cost, processing capacity, integrated processing technology and open-source licensing, as well as power consumption (mA@V), reliability (%), programming flexibility, support availability and electronics utilities. For the evaluation, an experimental framework was designed and implemented with six sensors (temperature, humidity, CO2/TVOC, pressure, ambient light and CO) and different data storage and monitoring options: locally on a μSD (Micro Secure Digital) card, on a cloud server, on a web server, or in a mobile application.
29

Smith, Laura, David Elwood, S. Lee Barbour, and M. Jim Hendry. "Profiling the in situ compressibility of cretaceous shale using grouted-in piezometers and laboratory testing." Geomechanics for Energy and the Environment 14 (June 2018): 29–37. http://dx.doi.org/10.1016/j.gete.2018.04.003.

30

Namiki, Norikazu, Akihiro Komtasu, Keiji Watanabe, and Naoki Kagi. "Surface Corrosion of HDD Media and Subsidiary Particle Formation Due to SO2 Gas Adsorption." Journal of the IEST 50, no. 2 (October 1, 2007): 38–51. http://dx.doi.org/10.17764/jiet.50.2.wh12q7h78l005042.

Abstract:
The storage capacity of hard disk drives (HDDs) for personal computers has increased more than 10,000 times in the past decade. Meanwhile, the gap between the disk and the magnetic head (flying height) has decreased from sub-micrometers to a few nanometers. The lower flying height leads to more sensitive disk-to-surface contamination linked to fatal failures. Many studies have reported that disk surface contamination is related to the adsorption of volatile organic compounds (VOCs) and the corrosion of the magnetic layer of the disk. However, surface contamination derived from sulfur dioxide (SO2) gas in the surrounding area has been discussed insufficiently. In this work, to ascertain the mechanism of disk surface contamination and the subsidiary particle generation, HDD disks were intentionally exposed to SO2 gas, followed by an evaluation of disk operation performance during endurance testing. After the series of endurance tests, a large quantity of sulfate and nickel, which is a main component of the disk substrate layer, was detected on the SO2-contaminated disk surface, as well as a small quantity of cobalt, which is a main component of the magnetic layer. From these findings, the mechanism of surface contamination of HDD disks in the coexistence of SO2 and water was inferred. Nickel is supplied from the substrate layer to the top layer through holes and cracks in the films of disks because of the corrosion triggered by adsorbed water. High temperature and humidity causes the desorption of SO2 from SO2-contaminated disk surfaces to react with nickel. Eventually, the products of nickel sulfate are precipitated on the surface to be released in the form of particles by contact with the head.
31

Romanov, Aleksey M. "A review on control systems hardware and software for robots of various scale and purpose. Part 2. Service robotics." Russian Technological Journal 7, no. 6 (January 10, 2020): 68–86. http://dx.doi.org/10.32362/2500-316x-2019-7-6-68-86.

Abstract:
A review of robotic control systems was carried out. The paper analyzes applied hardware and software solutions and summarizes the most common block diagrams of control systems. Approaches to control system scaling, the use of intelligent control, the achievement of fault tolerance, and the reduction of the weight and size of control system elements are analyzed across various classes of robotic systems. The goal of the review is to find common approaches used in different areas of robotics in order to build, on their basis, a uniform methodology for designing scalable intelligent control systems for robots with a given level of fault tolerance on a unified component base. This part is dedicated to service robotics. The following conclusions are drawn from the review: the key technology in service robotics from the point of view of scalability is the Robot Operating System (ROS); service robotics is today the main proving ground for intelligent algorithms at the tactical and strategic control levels, integrated into a common system based on ROS; the problem of ensuring fault tolerance in service robotics is practically neglected, with the exception of increasing reliability by changing behavioral algorithms; and in areas of service robotics where reducing mass and dimensions is especially important, robot control systems are implemented on a single computing device, while in other cases a multi-level architecture implemented on Linux-based embedded computers with ROS is used.
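
ROS is identified above as the key scalability technology; as a reference point (not code from any reviewed system), this is the canonical minimal ROS 1 (rospy) node publishing a heartbeat topic. It assumes a ROS 1 environment with roscore running; package setup (catkin, package.xml) is omitted.

```python
# Minimal ROS 1 node: publish a "status" heartbeat at 1 Hz.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("heartbeat")
    pub = rospy.Publisher("status", String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data="alive"))
        rate.sleep()

if __name__ == "__main__":
    main()
```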
32

Gurevych, Roman, Myroslav Koval, Galyna Gordiichuk, Iryna Shakhina, Svitlana Genkal, and Viktor Romanenko. "Improving the Training of Skilled Workers for Professional Activities in Educational Institutions of Ukraine." Revista Romaneasca pentru Educatie Multidimensionala 14, no. 1 (February 9, 2022): 440–64. http://dx.doi.org/10.18662/rrem/14.1/528.

Abstract:
The article discusses the authors' improvement of the training of skilled workers for professional activity and its effectiveness. The purpose of the research is a theoretical study of the problems of organizing the training of skilled workers, together with experimental verification of the organization proposed by the authors within the framework of the experiment. The research methods are theoretical (analysis, synthesis, modeling, comparison, systematization, generalization) and empirical (a wide range of diagnostic techniques), as well as methods of mathematical statistics, with the null hypothesis tested by the chi-squared test. New information and digital technologies based on personal computers and multimedia tools were widely used during the training. Extracurricular work in general, general-technical and special disciplines, carried out according to our methodology, was systematic, and its content was focused on the professions mastered by students of vocational education schools (VES). The percentages of correct answers at the higher levels (III-IV) in the experimental groups exceed the corresponding indicators for the control groups with a reliability of 95-99.9%. Thus, the examination of the pedagogical capabilities of the proposed organization of training confirmed that it produces strong results in the training of skilled workers in vocational education institutions. International significance of the article: it proposes an effective (experimentally verified) organization of training for transitional education systems in developing countries, which can become the basis for developing national systems of vocational education in post-totalitarian countries.
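
The null hypothesis was tested with the chi-squared test; as a generic illustration of that step (with made-up counts, not the study's data), the sketch below compares the distribution of answer levels I-IV between a control and an experimental group.

```python
# Chi-squared test of independence on a 2x4 contingency table.
from scipy.stats import chi2_contingency

# Rows: groups; columns: counts of answers at levels I-IV (illustrative)
control = [40, 35, 15, 10]
experimental = [20, 30, 30, 20]

chi2, p, dof, expected = chi2_contingency([control, experimental])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```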
33

Noijons, José. "Testing Computer Assisted Language Testing." CALICO Journal 12, no. 1 (January 14, 2013): 37–58. http://dx.doi.org/10.1558/cj.v12i1.37-58.

Abstract:
Much computer assisted language learning (CALL) material that includes tests and exercises looks attractive enough but is clearly lacking in terms of validation: the possibilities of the computer and the inventiveness of the programmers mainly determine the format of tests and exercises, causing possible harm to a fair assessment of pupils' language abilities. This article begins with a definition of computer assisted language testing (CALT), followed by a discussion of the various processes involved. Both advantages and disadvantages of CALT are outlined. Psychometric aspects of computer adaptive testing are then discussed. Issues of validity and reliability in CALT are acknowledged. A table of factors in CALT distinguishes between test content and the mechanics of taking a test, before, during and after a test. The various factors are examined and comprise a table for developing a CALT checklist. The article ends with a call for professional testers and developers of educational software to work together in developing CALT.
APA, Harvard, Vancouver, ISO, and other styles
34

Xiang, Wei, Rui Zhang, Guoxiang Liu, Xiaowen Wang, Wenfei Mao, Bo Zhang, Yin Fu, and Tingting Wu. "Saline-Soil Deformation Extraction Based on an Improved Time-Series InSAR Approach." ISPRS International Journal of Geo-Information 10, no. 3 (February 27, 2021): 112. http://dx.doi.org/10.3390/ijgi10030112.

Full text
Abstract:
Significant seasonal fluctuations can occur in the regional scattering characteristics and surface deformation of saline soil and cause decorrelation, which limits the application of conventional time-series InSAR (TS-InSAR). To extend the capability of saline-soil deformation monitoring, this paper presents an improved TS-InSAR approach based on interferometric coherence statistics and high-coherence interferogram refinement. By constructing a network of the refined interferograms, high-accuracy ground deformation can be extracted through weighted least-squares estimation and coherent-target refinement. To extract high-accuracy deformation of a representative saline-soil area in the Qarhan Salt Lake, 119 C-band Sentinel-1A images collected between May 2015 and May 2020 were selected as the data source. Subsequently, 845 refined interferograms were selected from all possible interferograms to conduct the network inversion, based on the related thresholds (temporal baseline < 49 days and average spatial coherence > 0.5). Compared with conventional TS-InSAR measurements, both the accuracy and the reliability of the extracted deformation results of the saline soil increased dramatically. Furthermore, the testing results indicate that the improved TS-InSAR method has advantages for deformation extraction in saline-soil regions and is capable of reflecting the typical seasonal variations of the saline soil.
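A schematic of the interferogram-selection and inversion steps is sketched below; the two thresholds come from the abstract, while the data structures and the coherence-weighted least-squares solve are simplified assumptions rather than the authors' implementation.

```python
# Sketch: select refined interferograms by the thresholds quoted in the
# abstract, then invert for deformation by weighted least squares.
# Data layout and weighting scheme are simplifying assumptions.
import numpy as np

def select_interferograms(pairs):
    """pairs: list of dicts with 'dt_days' and 'coherence' keys (assumed)."""
    return [p for p in pairs
            if p['dt_days'] < 49 and p['coherence'] > 0.5]

def wls_deformation(A, obs, coherence):
    """Solve A x = obs, weighting each observation by its coherence.

    A: design matrix linking deformation parameters to interferograms,
    obs: unwrapped phase observations, coherence: per-interferogram weights.
    """
    W = np.diag(coherence)
    # Weighted normal equations: (A^T W A) x = A^T W obs
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ obs)
```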
APA, Harvard, Vancouver, ISO, and other styles
35

Kaziuchyts, V. O., S. M. Borovikov, and E. N. Shneiderov. "Model for Prediction of Testing Time of a Computer Program for Automated Reliability Evaluation of Semiconductor Devices." Doklady BGUIR 20, no. 7 (December 12, 2022): 72–80. http://dx.doi.org/10.35596/1729-7648-2022-20-7-72-80.

Full text
Abstract:
The KLASS computer program planned for development is intended to work as part of the ARION-plus software package and to perform automated assessment of the reliability of electronic products, including semiconductor devices. At the planning stage of work on creating the KLASS program as a module of the ARION-plus complex, the question arose of how much working time to allot to the procedure of testing the computer program. The approaches described in the scientific literature for assessing the operational reliability of computer programs with allowance for their testing assume that the program code has already been written and debugged and that certain data on the testing results are available. Software developers, however, would like to know the predicted testing time that ensures a given operational reliability of a computer program even before work on writing the code begins. Based on an analysis of experimental data on the reliability of computer programs in various fields of application, a model is proposed for determining the testing time required to ensure the operational reliability of programs. The model was applied to the planned KLASS program and takes into account the programming language, the amount of program code, the speed of the computer processor, and the application area of the program. Based on the obtained model, a nomogram with two binary fields was constructed, which makes it possible to quickly determine the predicted testing time of computer programs.
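The published model is only summarized in the abstract; purely to illustrate the kind of calculation involved, the sketch below solves for the testing time that drives the failure intensity of a Goel-Okumoto reliability-growth model down to a target value. The model choice and all parameter values are assumptions, not the authors'.

```python
# Illustrative only: testing time needed to reach a target failure
# intensity under a Goel-Okumoto NHPP model, lambda(t) = a*b*exp(-b*t).
# The published model additionally accounts for programming language,
# code size, processor speed and application area; those effects are
# folded here into the assumed parameters a and b.
import math

def testing_time(a, b, target_rate):
    """a: expected total faults, b: fault-detection rate, target_rate: goal."""
    if target_rate >= a * b:
        return 0.0  # already below the initial failure intensity
    return math.log(a * b / target_rate) / b

print(testing_time(a=120.0, b=0.05, target_rate=0.01))  # hours, assumed units
```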
APA, Harvard, Vancouver, ISO, and other styles
36

Zou, Xiao Wei, Xiao Li Wang, and Yan Wang. "Analysis of Reliability and Validity in Computer Assisted English Teaching Test." Advanced Materials Research 989-994 (July 2014): 5029–32. http://dx.doi.org/10.4028/www.scientific.net/amr.989-994.5029.

Full text
Abstract:
With the rapid development of computer technology, the computer has entered daily life. People now use computers widely for scientific research, production, entertainment, information retrieval and other activities, which has given rise to computer-aided design, computer-aided manufacturing and computer-aided instruction, all intended to improve efficiency and quality of life. Using computer technology in language testing has become the trend of language testing both at home and abroad. Communicative language teaching has become the guiding ideology and theoretical basis of modern language testing. Reliability and validity in computerized language assessment have become the two most important criteria for evaluating test quality. The computer-assisted English test, with its unique advantages, has opened up new possibilities for English testing.
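Test reliability of the kind discussed here is commonly quantified with an internal-consistency coefficient such as Cronbach's alpha; the sketch below computes it from a hypothetical examinee-by-item score matrix (the formula is standard, the data invented).

```python
# Cronbach's alpha from an examinees x items score matrix (hypothetical data).
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                      # number of items
    item_vars = scores.var(axis=0, ddof=1)   # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = [[1, 0, 1, 1], [1, 1, 1, 0], [0, 0, 1, 0], [1, 1, 1, 1], [0, 1, 0, 0]]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```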
APA, Harvard, Vancouver, ISO, and other styles
37

Joslin, Philip R. "Software for Reliability Testing of Computer Peripherals: A Case History." IEEE Transactions on Reliability 35, no. 3 (1986): 279–84. http://dx.doi.org/10.1109/tr.1986.4335433.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Berber, Aslihan, and Jesus Garcia Laborda. "Turkish teachers’ and students’ perceptions towards computer assisted testing in comparison with Spanish teachers’ and students’ perceptions." World Journal on Educational Technology 7, no. 2 (August 13, 2015): 99. http://dx.doi.org/10.18844/wjet.v7i2.42.

Full text
Abstract:
Opinions differ about the use of technology in educational assessment, and people have concerns about computer-assisted assessments, such as their application and reliability. Those concerns may decrease as technology develops in the coming years, since computer-based testing programs are gradually improving in terms of reliability and utility. This research aims to determine Turkish teachers' and students' perceptions towards computer-assisted testing, comparing the results with Spanish students' and teachers' perceptions. In this study, testing and assessment are used interchangeably even though some researchers treat these terms separately. The results of this study are important for educators in Turkey because computer-assisted assessment is currently being introduced in Turkish schools, and it is crucial to be aware of educators' and students' perceptions towards it. Keywords: computer assisted testing, Spanish, Turkish, teachers, students.
APA, Harvard, Vancouver, ISO, and other styles
39

E. Fluck, Andrew, Olawale Surajudeen Adebayo, and Shafi'i Muhammad Abdulhamid. "Secure E-Examination Systems Compared: Case Studies from Two Countries." Journal of Information Technology Education: Innovations in Practice 16 (2017): 107–25. http://dx.doi.org/10.28945/3705.

Full text
Abstract:
Aim/Purpose: Electronic examinations have some inherent problems. Students have expressed negative opinions about electronic examinations (e-examinations) due to a fear of, or unfamiliarity with, the technology of assessment, and a lack of knowledge about the methods of e-examinations. Background: Electronic examinations are now a viable alternative method of assessing student learning. They provide freedom of choice, in terms of the location of the examination, and can provide immediate feedback; students and institutions can be assured of the integrity of knowledge testing. This in turn motivates students to strive for deeper learning and better results, in a higher quality and more rigorous educational process. Methodology: This paper compares an e-examination system at FUT Minna Nigeria with one in Australia, at the University of Tasmania, using case study analysis. The functions supported, or inhibited, by each of the two e-examination systems, with different approaches to question types, cohort size, technology used, and security features, are compared. Contribution: The researchers’ aim is to assist stakeholders (including lecturers, invigilators, candidates, computer instructors, and server operators) to identify ways of improving the process. The relative convenience for students, administrators, and lecturer/assessors and the reliability and security of the two systems are considered. Challenges in conducting e-examinations in both countries are revealed by juxtaposing the systems. The authors propose ways of developing more effective e-examination systems. Findings: The comparison of the two institutions in Nigeria and Australia shows e-examinations have been implemented for the purpose of selecting students for university courses, and for their assessment once enrolled. In Nigeria, there is widespread systemic adoption for university entrance merit selection. In Australia this has been limited to one subject in one state, rather than being adopted nationally. Within undergraduate courses, the Nigerian scenario is quite extensive; in Australia this adoption has been slower, but has penetrated a wide variety of disciplines. Recommendations for Practitioners: Assessment integrity and equipment reliability were common issues across the two case studies, although the delivery of e-examinations is different in each country. As with any procedural process, a particular solution is only as good as its weakest attribute. Technical differences highlight the link between e-examination system approaches and pedagogical implications. It is clear that social, cultural, and environmental factors affect the success of e-examinations. For example, an interrupted electrical power supply and limited technical know-how are two of the challenges affecting the conduct of e-examinations in Nigeria. In Tasmania, the challenge with the “bring your own device” (BYOD) approach is to make the system operate on an increasing variety of user equipment, including tablets. Recommendation for Researchers: The comparisons between the two universities indicate there will be a productive convergence of the approaches in future. One key proposal, which arose from the analysis of the existing e-examination systems in Nigeria and Australia, is to design a form of “live” operating system that is deployable over the Internet. This method would use public key cryptography for lecturers to encrypt their questions online.
Impact on Society: If institutions are to transition to e-examinations, one way of facilitating this move is by using computers to imitate other assessment techniques. However, higher order thinking is usually demonstrated through open-ended or creative tasks. In this respect the Australian system shows promise by providing the same full operating system and software application suite to all candidates, thereby supporting assessment of such creative higher order thinking. The two cases illustrate the potential tension between “online” or networked reticulation of questions and answers, as opposed to “offline” methods. Future Research: A future design proposition is a web-based strategy for a virtual machine, which is launched into candidates’ computers at the start of each e-examination. The new system is a form of BYOD externally booted e-examination (as in Australia) that is deployable over the Internet with encryption and decryption features using public key cryptography (Nigeria). This will allow lecturers to encrypt their questions and post them online, while the questions are decrypted by the administrator or students are given the key. The system will support both objective and open-ended questions (possibly essays and creative design tasks). The authors believe this can re-define e-examinations as the “gold standard” of assessment.
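The proposed use of public key cryptography for distributing examination questions can be sketched as conventional hybrid encryption: a symmetric key encrypts the questions, and the recipient's RSA public key wraps that symmetric key. The library choice, key sizes, and data below are illustrative assumptions, not the actual code of either system.

```python
# Hybrid public-key protection of exam questions (illustrative sketch).
# A symmetric Fernet key encrypts the payload; RSA-OAEP protects the key.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Administrator's key pair (in practice, generated and stored securely).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Lecturer side: encrypt the questions and wrap the symmetric key.
questions = b"Q1: Define reliability testing. Q2: ..."
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(questions)
wrapped_key = public_key.encrypt(
    session_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

# Administrator side: unwrap the key and decrypt at examination time.
recovered_key = private_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))
assert Fernet(recovered_key).decrypt(ciphertext) == questions
```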
APA, Harvard, Vancouver, ISO, and other styles
40

Hwang, S., and R. Rajsuman. "VLSI Testing for High Reliability: Mixing IDDQ Testing With Logic Testing." VLSI Design 5, no. 3 (January 1, 1997): 299–311. http://dx.doi.org/10.1155/1997/59329.

Full text
Abstract:
In this paper, we examine the effectiveness of combined logic and IDDQ testing to detect stuck-at and bridging faults. The stuck-at faults are detected by the logic test, and IDDQ testing detects bridging faults. Near-minimal stuck-at test sets are used for this combined logic and IDDQ test environment. These near-minimal stuck-at test sets are generated using standard test programs, while using collapsed fault lists. We examined ISCAS '85 and ISCAS '89 benchmark circuits under this combined test environment. A comparison is given between the fault coverage obtained under this combined test environment and other studies based on pure logic test and IDDQ test. Also, the results of IDDQ-based test sets (vectors generated specifically for IDDQ testing) are compared with those of stuck-at test sets. Finally, we present a case study on a microprogrammed processor using a functional test set to detect bridging faults in IDDQ testing.
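To make the logic-test half of the combined environment concrete, the toy sketch below enumerates single stuck-at faults on a two-gate circuit and checks which ones a small vector set detects; the circuit and vectors are invented, and IDDQ measurement itself (an analog supply-current test) is not modeled.

```python
# Toy single stuck-at fault simulation: c = (a AND b) OR NOT a.
# Nets: 'a', 'b', 'n1' (AND output). Circuit and vectors are invented.
def evaluate(a, b, fault=None):
    def v(net, value):
        if fault and fault[0] == net:
            return fault[1]          # stuck-at value overrides the real one
        return value
    a_v, b_v = v('a', a), v('b', b)
    n1 = v('n1', a_v & b_v)          # AND gate output
    return n1 | (1 - a_v)            # OR with NOT a

tests = [(0, 0), (0, 1), (1, 0), (1, 1)]
faults = [(net, sa) for net in ('a', 'b', 'n1') for sa in (0, 1)]

for fault in faults:
    detected = any(evaluate(a, b) != evaluate(a, b, fault) for a, b in tests)
    print(f"stuck-at-{fault[1]} on {fault[0]}: "
          f"{'detected' if detected else 'undetected'}")
```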
APA, Harvard, Vancouver, ISO, and other styles
41

Borovikov, S. M., V. O. Kaziuchyts,, V. V. Khoroshko, S. S. Dick, and K. I. Klinov. "Assessment of expected reliability of applied software for computer-based information systems." Informatics 18, no. 1 (March 29, 2021): 84–95. http://dx.doi.org/10.37661/1816-0301-2021-18-1-84-95.

Full text
Abstract:
The reliability of computer-based information systems is largely determined by the reliability of the developed application software. The failure rate of a computer program is considered as an indicator of the reliability of the application software. To determine the expected reliability of application software planned for development (before the program code is written), a model is proposed that uses some parameters of the future computer program, data on the influence of various factors on its reliability, and data from subsequent testing of the program. The model takes into account the field of software application and computer processor performance. The process of obtaining the model parameters is analyzed. The proposed model makes it possible to determine the predicted failure rate of the planned application program, and then the reliability of the computer-based information system as a whole. If necessary, measures can be developed to ensure the required level of reliability of the computer-based information system.
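As a downstream illustration (not the paper's model), once a predicted software failure rate is available, a constant-rate series model can translate it into system-level reliability; all rates below are assumed.

```python
# Series model: the system works only if software and hardware both work.
# R(t) = exp(-lambda * t) per component; all failure rates are illustrative.
import math

def series_reliability(failure_rates, hours):
    """Constant-rate components in series: R_sys = exp(-sum(lambda_i) * t)."""
    return math.exp(-sum(failure_rates) * hours)

lam_software = 2e-5   # predicted failure rate of the application, 1/h (assumed)
lam_hardware = 5e-6   # hardware failure rate, 1/h (assumed)
print(f"R(1000 h) = {series_reliability([lam_software, lam_hardware], 1000):.4f}")
```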
APA, Harvard, Vancouver, ISO, and other styles
42

Schneidewind, Norman. "Software Testing and Reliability Strategies." Journal of Aerospace Computing, Information, and Communication 7, no. 9 (September 2010): 294–307. http://dx.doi.org/10.2514/1.49220.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

GUPTA, ANSHU, REECHA KAPUR, and P. C. JHA. "CONSIDERING TESTING EFFICIENCY AND TESTING RESOURCE CONSUMPTION VARIATIONS IN ESTIMATING SOFTWARE RELIABILITY." International Journal of Reliability, Quality and Safety Engineering 15, no. 02 (April 2008): 77–91. http://dx.doi.org/10.1142/s0218539308002940.

Full text
Abstract:
Advances in software technologies have promoted the growth of computer-related applications to a great extent. Building quality, in terms of reliability, into software has become one of the main issues for software developers, and software testing is necessary to build highly reliable software. Monitoring and controlling resource utilization, measuring and controlling the progress of testing, the efficiency of testing and debugging personnel, and reliability growth are important for effective management of the testing phase and for meeting quality objectives. Over the past 35 years many software reliability growth models (SRGM) have been proposed to support the above-mentioned activities related to software testing. From the literature it appears that most SRGM do not account for changes in testing-effort consumption. During the testing process, especially at the beginning and towards the end of testing, frequent changes are observed in testing-resource consumption due to changes in testing strategy, team constitution, schedule pressures, etc. Apart from this, testing efficiency plays a major role in determining the progress of the testing process. In this paper we incorporate the important concepts of testing-resource consumption variations, for Weibull-type testing-effort functions, and of testing efficiency into software reliability growth modeling. The performance of the proposed models is demonstrated through two real-life data sets from the literature. The experimental results show fairly accurate estimation capabilities of the proposed models.
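A common way to write such a model is m(t) = a(1 − e^{−rW(t)}) with a Weibull-type cumulative testing-effort function W(t); the sketch below fits this generic form to synthetic data with scipy. The functional form is standard SRGM practice rather than the paper's exact model, and the data and starting values are invented.

```python
# Fit an SRGM with a Weibull-type testing-effort function (TEF):
#   W(t) = alpha * (1 - exp(-beta * t**gamma))   cumulative effort
#   m(t) = a * (1 - exp(-r * W(t)))              expected faults removed
# Synthetic data; real parameters would come from project records.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, r, alpha, beta, gamma):
    W = alpha * (1 - np.exp(-beta * t**gamma))
    return a * (1 - np.exp(-r * W))

t = np.arange(1, 21, dtype=float)                        # weeks of testing
faults = mean_value(t, 100, 0.05, 60, 0.02, 1.5)
faults += np.random.default_rng(1).normal(0, 2, t.size)  # noisy observations

p0 = [90, 0.1, 50, 0.05, 1.0]  # invented starting guesses
params, _ = curve_fit(mean_value, t, faults, p0=p0, maxfev=10000)
print("a, r, alpha, beta, gamma =", np.round(params, 3))
```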
APA, Harvard, Vancouver, ISO, and other styles
44

Jha, P. C., Deepali Gupta, Bo Yang, and P. K. Kapur. "Optimal testing resource allocation during module testing considering cost, testing effort and reliability." Computers & Industrial Engineering 57, no. 3 (October 2009): 1122–30. http://dx.doi.org/10.1016/j.cie.2009.05.001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

KAPUR, P. K., ANU G. AGGARWAL, KANICA KAPOOR, and GURJEET KAUR. "OPTIMAL TESTING RESOURCE ALLOCATION FOR MODULAR SOFTWARE CONSIDERING COST, TESTING EFFORT AND RELIABILITY USING GENETIC ALGORITHM." International Journal of Reliability, Quality and Safety Engineering 16, no. 06 (December 2009): 495–508. http://dx.doi.org/10.1142/s0218539309003538.

Full text
Abstract:
The demand for complex and large-scale software systems is increasing rapidly. Therefore, the development of high-quality, reliable and low-cost computer software has become a critical issue in the enormous worldwide computer technology market. To develop such large and complex software, small independent modules are integrated, and these are tested independently during the module-testing phase of software development. In the process, testing resources such as time and testing personnel are used, and these resources are not infinitely large. Consequently, it is an important matter for the project manager to allocate these limited resources among the modules optimally during the testing process. Another major concern in software development is cost; it is in fact profitable for management if the software costs less while still meeting customer requirements. In this paper, we investigate an optimal resource-allocation problem of minimizing the cost of software testing under a limited amount of available resources, given a reliability constraint. To solve the optimization problem we present a genetic algorithm, which stands out as a powerful tool for solving search and optimization problems. A key reason for using a genetic algorithm in the field of software reliability is its capability to produce optimal results by learning from historical data. A numerical example is discussed to illustrate the applicability of the approach.
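A compact sketch of such a genetic algorithm follows: chromosomes are per-module effort allocations scaled to a fixed budget, module reliability is assumed to grow as 1 − e^{−k_i x_i}, and a penalty term enforces the system-reliability constraint. All constants and the fitness form are illustrative assumptions, not the paper's formulation.

```python
# Minimal GA for allocating a testing budget among modules (illustrative).
# Module reliability model R_i(x) = 1 - exp(-k_i * x); all values assumed.
import numpy as np

rng = np.random.default_rng(0)
k = np.array([0.10, 0.06, 0.08])   # per-module testing effectiveness
cost = np.array([1.0, 1.5, 1.2])   # cost per unit of testing effort
BUDGET, R_MIN = 150.0, 0.90        # total effort and reliability target

def fitness(x):
    r_sys = np.prod(1 - np.exp(-k * x))          # series system reliability
    penalty = 1e3 * max(0.0, R_MIN - r_sys)      # constraint violation
    return -(cost @ x) - penalty                 # maximize = minimize cost

def scale(pop):                                   # enforce the effort budget
    return BUDGET * pop / pop.sum(axis=1, keepdims=True)

pop = scale(rng.random((40, 3)))
for _ in range(200):
    scores = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(scores)][-20:]       # truncation selection
    children = (parents[rng.integers(0, 20, 20)] +
                parents[rng.integers(0, 20, 20)]) / 2           # crossover
    children += rng.normal(0, 1.0, children.shape).clip(-3, 3)  # mutation
    pop = scale(np.abs(np.vstack([parents, children])))

best = pop[np.argmax([fitness(x) for x in pop])]
print("allocation:", best.round(2), "cost:", (cost @ best).round(2))
```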
APA, Harvard, Vancouver, ISO, and other styles
46

Kozlyuk, Iryna, and Yuliia Kovalenko. "Reliability of computer structures of integrated modular avionics for hardware configurations." System research and information technologies, no. 2 (September 14, 2021): 84–93. http://dx.doi.org/10.20535/srit.2308-8893.2021.2.07.

Full text
Abstract:
The problem of designing advanced computing systems in the class of integrated modular avionics structures is considered. A unified topology for the computer's internal network based on SpaceWire exchange channels is proposed, along with variants of its implementation for various onboard applications. Equivalent reliability schemes for each of the specific structures are introduced, and the probabilities of trouble-free operation of each structure are analyzed. Families of graphic dependencies are given. The existing principles and algorithms for testing multiprocessor multimodal onboard digital computer systems are analyzed, and a new testing algorithm for multiprocessor systems is proposed that follows the software design standards for integrated modular avionics products. The structure of a unified automated workstation for checking the functional modules of integrated modular avionics is considered. Specific requirements inherent in workstations for testing integrated avionics are identified: an increased level of control of the hardware component of products; the ability to simulate the failure state of individual avionics components to check the reconfiguration mode of the computer system; modular construction of software, with verification tests divided into components performed at the level of each CPM and of the computer as a whole in single-task and multitasking modes; openness of the workstation architecture, which provides the ability to change the level of complexity of product checking within one complexity class; and intra-project unification of both the hardware and the software of the inspection workstation.
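The probabilities of trouble-free operation mentioned above are classically computed from reliability block diagrams; the sketch below evaluates series, parallel, and 2-out-of-3 voting structures (a typical avionics arrangement) for an assumed module reliability. The numbers are illustrative, not taken from the paper.

```python
# Reliability of basic structures from element reliabilities (assumed values).
from math import comb

def series(rs):
    """All elements must work: R = product of element reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    """At least one element must work: R = 1 - product of failure probs."""
    out = 1.0
    for r in rs:
        out *= (1 - r)
    return 1 - out

def k_of_n(k, n, r):
    """n identical elements; system works if at least k do (e.g. 2-of-3 voting)."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

r = 0.98  # assumed probability of trouble-free operation of one module
print(f"series of 3:   {series([r] * 3):.4f}")
print(f"parallel of 2: {parallel([r] * 2):.6f}")
print(f"2-of-3 voting: {k_of_n(2, 3, r):.6f}")
```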
APA, Harvard, Vancouver, ISO, and other styles
47

Johnson, L. A. "Laser diode burn-in and reliability testing." IEEE Communications Magazine 44, no. 2 (February 2006): 4–7. http://dx.doi.org/10.1109/mcom.2006.1593543.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nayak, Tapan K. "Minimum Variance Unbiased Estimation of Software Reliability." Probability in the Engineering and Informational Sciences 3, no. 3 (July 1989): 335–40. http://dx.doi.org/10.1017/s0269964800001200.

Full text
Abstract:
As the formal methods of proving correctness of a computer program are still very inadequate, in practice when a new piece of software is developed and all obvious errors are removed, it is tested with different (random) inputs in order to detect the remaining errors and assess its quality. We suppose that whenever the program fails the error causing the failure can be detected and removed correctly. Thus, the quality of the software increases as testing goes on. In this paper, we consider two different models and present the minimum variance unbiased estimators of the expected failure rate of the revised software at any time of testing t, based on the data generated up to that point.
APA, Harvard, Vancouver, ISO, and other styles
49

Shatnawi, Omar. "Testing-Effort Dependent Software Reliability Model for Distributed Systems." International Journal of Distributed Systems and Technologies 4, no. 2 (April 2013): 1–14. http://dx.doi.org/10.4018/jdst.2013040101.

Full text
Abstract:
Distributed systems are being developed in the context of the client-server architecture, and client-server architectures dominate the landscape of computer-based systems. Client-server systems are developed using the classical software engineering activities. Developing distributed systems is an activity that consumes time and resources, and even as the degree of automation of software development activities has increased, resources remain an important limitation. Reusability is widely believed to be a key direction for improving software development productivity and quality. Software metrics are needed to identify where resources are needed; they are an extremely important source of information for decision making. In this paper, an attempt has been made to describe the relationship between calendar time, the fault-removal process and testing-effort consumption in a distributed development environment. Software fault-removal phenomena and testing-effort expenditures are described by a non-homogeneous Poisson process (NHPP) and testing-effort curves, respectively. Actual software reliability data cited in the literature have been used to demonstrate the proposed model. The results are fairly encouraging.
APA, Harvard, Vancouver, ISO, and other styles
50

Chen, Wenhua. "FURTHER ESTIMATION FOR RELIABILITY IN DEMONSTRATION TESTING." Chinese Journal of Mechanical Engineering 41, no. 08 (2005): 159. http://dx.doi.org/10.3901/jme.2005.08.159.

Full text
APA, Harvard, Vancouver, ISO, and other styles