Journal articles on the topic "OPTIMIZED TEST CASES"




Consult the top 50 journal articles for your research on the topic "OPTIMIZED TEST CASES".




1

Singh, Rajvir, Anita Singhrova, and Rajesh Bhatia. "Optimized Test Case Generation for Object Oriented Systems Using Weka Open Source Software." International Journal of Open Source Software and Processes 9, no. 3 (July 2018): 15–35. http://dx.doi.org/10.4018/ijossp.2018070102.

Abstract:
Detection of fault-prone classes helps software testers to generate effective class-level test cases. In this article, a novel technique is presented for optimized test case generation for the ant-1.7 open source software. Class-level object oriented (OO) metrics are considered an effective means of finding fault-prone classes. The open source software ant-1.7 is considered as a case study for the evaluation of the proposed technique. The proposed mathematical model is the first of its kind generated using the Weka open source software to select effective OO metrics. Effective and ineffective OO metrics are identified using feature selection techniques for generating test cases to cover fault-prone classes. In this methodology, only effective metrics are considered for assigning weights to test paths. The results indicate that the proposed methodology is effective and efficient, as the average fault exposition potential of the generated test cases is 90.16% and the saving in test case execution time is 45.11%.
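To make the path-weighting idea concrete, here is a minimal Python sketch; the metric names (WMC, CBO), weights, and classes are hypothetical illustrations, not values from the paper:

```python
# Hypothetical sketch: weight test paths by the fault-proneness of the
# classes they touch, using only the "effective" OO metrics retained by
# feature selection. Metric names, weights, and classes are illustrative.
effective_metric_weights = {"WMC": 0.6, "CBO": 0.4}  # from feature selection

class_metrics = {
    "OrderService": {"WMC": 25, "CBO": 9},
    "Invoice":      {"WMC": 7,  "CBO": 2},
}

def fault_proneness(cls: str) -> float:
    """Weighted sum of the effective metrics for one class."""
    m = class_metrics[cls]
    return sum(w * m[name] for name, w in effective_metric_weights.items())

def path_weight(path: list[str]) -> float:
    """A test path covering more fault-prone classes gets a higher weight."""
    return sum(fault_proneness(c) for c in path)

paths = [["OrderService", "Invoice"], ["Invoice"]]
# Execute the highest-weight paths first to target fault-prone classes.
for p in sorted(paths, key=path_weight, reverse=True):
    print(p, path_weight(p))
```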
2

Domínguez-Muñoz, J. Enrique, and Peter Malfertheiner. "Optimized serum pancreolauryl test for differentiating patients with and without chronic pancreatitis." Clinical Chemistry 44, no. 4 (April 1, 1998): 869–75. http://dx.doi.org/10.1093/clinchem/44.4.869.

Abstract:
The serum pancreolauryl test has limited sensitivity for detecting mild pancreatic insufficiency. The aim of this study was to optimize the serum pancreolauryl test so as to increase the probability of positive results in patients with chronic pancreatitis. The study had three parts. First, the sampling time was optimized by analyzing retrospectively the frequency of fluorescein peaks at different times from 0 to 240 min in 560 consecutive patients. Second, the calculation of serum fluorescein concentrations by means of a standard calibration factor was prospectively compared in 271 consecutive patients with a modification involving a specimen-specific calibration factor for each patient. Third, the clinical utility of the intravenous injection of secretin before ingestion of the test meal was prospectively evaluated in a further 32 patients. As a result, the optimized serum pancreolauryl test developed differs from the former version of the test in utilizing intravenous administration of secretin before the test meal, calculation of serum fluorescein based on specimen-specific calibration factors, and blood samples taken only at 0 (basal), 120, 150, 180, and 240 min. This optimized pancreolauryl test was abnormal more frequently in patients with chronic pancreatitis than was the formerly used test, especially for cases of mild and moderate disease.
3

P Mahapatra, R., Aparna Ranjith, and A. Kulothungan. "Framework for Optimizing Test Cases in Regression Testing." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 444. http://dx.doi.org/10.14419/ijet.v7i3.12.16126.

Abstract:
Software, once developed, is subject to continuous change. Software regression testing can thus be used to reduce the effort of testing the software by selecting only the required number of test cases and ordering them to test the software after changes have been made to it. To improve the fault detection rate, the selection of efficient test cases and the order of their execution are important. This is where test case selection comes into the picture: the fault detection rate during the operation of any software has to be improved. The test case selection process finds the most efficient test cases that can fully functionally test the software that has been modified. This contributes to an improved fault detection rate, which can provide faster feedback on the system under test and let software engineers begin correcting faults as early as possible. In this paper, an approach for test case selection is proposed which takes into consideration the effect of three parameters, History, Coverage, and Requirement, together, in order to improve the selection process. This also ensures that the rejection of efficient test cases is reduced compared to the selection process in conventional methods, most of which make use of a single parameter for test case selection. These test cases are further optimized using a Genetic Algorithm. Results indicate that the proposed technique is much more efficient in selecting test cases than conventional techniques, thereby improving the fault detection rate.
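A minimal sketch of how the three parameters named above could be combined into a single selection score before GA optimization; the weights and field values are assumptions, not the authors':

```python
# Illustrative sketch: combine the three selection parameters from the
# abstract (history, coverage, requirement) into one score. Weights and
# sample values are invented for illustration.
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    history: float      # past fault-detection record, normalized to [0, 1]
    coverage: float     # fraction of modified code covered
    requirement: float  # weight of the requirements it exercises

def score(tc: TestCase, w=(0.4, 0.4, 0.2)) -> float:
    return w[0] * tc.history + w[1] * tc.coverage + w[2] * tc.requirement

suite = [TestCase("t1", 0.9, 0.3, 0.5), TestCase("t2", 0.2, 0.8, 0.9)]
selected = sorted(suite, key=score, reverse=True)  # then refine with a GA
print([t.name for t in selected])
```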
4

Nelasov, N. J., R. V. Sidorov, M. N. Morgunov, N. S. Doltmurzieva, O. L. Eroshenko, E. A. Arzumanjan, A. G. Nechaeva, and S. V. Shluik. "Echocardiographic Stress Test with Adenosine Triphosphate: Optimization of the Algorithm." Kardiologiia 59, no. 11 (December 15, 2019): 39–47. http://dx.doi.org/10.18087/cardio.2019.11.2665.

Abstract:
Purpose. To: 1) optimize the algorithm of stress echocardiography (s-Echo) with intravenous adenosine triphosphate (ATP) infusion, taking into account the pharmacokinetics and pharmacodynamics of ATP in the human body; 2) test the new algorithm in patients with coronary and other heart diseases. Materials and methods. To determine the spectrum of factors influencing the results of the stress test with ATP, we inspected the main scientific databases and found 48 publications on ATP application for diagnostic purposes. Analysis of these publications allowed us to optimize the algorithm of ATP s-Echo. The optimized algorithm was tested on 26 subjects, who underwent ATP 4D strain stress echocardiography of the left ventricle. Results and discussion. The optimized algorithm has three stages: registration of Echo data sets before ATP infusion, at the time of infusion, and 5 min after its termination. Registration of Echo parameters at the second stage must begin no earlier than 3 min after the onset of ATP infusion and only in the presence of signs of coronary vasodilation. We consider the main indirect criterion of submaximal coronary vasodilation to be a decrease in systolic blood pressure (SBP) of 5 mm Hg or more, but not below an SBP level of 90 mm Hg. The initial dose of ATP is 140 µg/kg/min. If after 2 min of infusion SBP does not diminish, we increase the infusion rate first to 175 and then to 210 µg/kg/min. While testing the new algorithm, we achieved the criteria of effective vasodilation in all cases. The mean SBP decrease was 16.4±13.7 mm Hg, and the heart rate increase was 12.7±8.1 bpm. In all patients we obtained interpretable 4D LV Echo data sets for visual analysis of local contractility and automatic strain analysis. Conclusion. Optimization of the ATP s-Echo algorithm was performed. The safety and efficacy of the optimized algorithm for registration of Echo data were demonstrated. The new ATP infusion algorithm can also be recommended for testing with other cardiac imaging modalities in the evaluation of myocardial perfusion and contractility (SPECT, CT, MRI, PET).
5

Horlenko, Olesya M., Vasyl I. Rusyn, Viktoriya M. Studenyak, Nataliia V. Sochka, Fedir V. Horlenko, Ivan I. Kopolovets, and Lyubomyra B. Prylypko. "INTEGRATIVE MORPHOMETRIC CHARACTERISTIC OF ENDOTHELIAL DYSFUNCTION IN THE CASES OF CHILDREN WITH ESSENTIAL ARTERIAL HYPERTENSION." Wiadomości Lekarskie 74, no. 4 (2021): 948–53. http://dx.doi.org/10.36740/wlek202104125.

Abstract:
The aim: To optimize the treatment of children with Essential Arterial Hypertension (EAH) in association with Endothelial Dysfunction (ED) by studying the clinical and morphofunctional characteristics of cardiovascular system disorders and correcting endothelial dysfunction with the use of essential phospholipids. Materials and methods: The study group consisted of 80 children, with 30 in a control group. The next stage included the division of the 80 children into 2 subgroups. Patients in the first subgroup received basic treatment (a third-generation angiotensin-converting enzyme inhibitor); the second received optimized treatment (basic treatment with the addition of the certified drug lecithin). Doses were determined according to the instructions and age, over 2 months. The study used ECG, echocardiography, ultrasonography, and morphofunctional studies of the endothelium. Results: There was a dynamic decrease in the left ventricular myocardial mass index (LV MMI), a reduction of end-diastolic volume (EDV), and an increase in the absolute values of stroke volume (SV) and ejection fraction (EF) under the influence of the optimized treatment, due to the inclusion of lecithin in the treatment of children with EAH and ED. The Ve/Va ratio had a tendency to increase. Vasoconstriction after the reactive hyperemia test was significantly reduced, but the degree of vasodilation varied depending on the method of therapy. The intima-media thickness (IMT) decreased 1.12-fold in the children given the optimized treatment, accompanied by a 2-fold decrease in DEC. The aortic stiffness index tended to decrease (from 0.88 ± 0.02 to 0.71 ± 0.01 and to 0.63 ± 0.01, respectively, by groups, compared with the control group value of 0.55 ± 0.01), which reflects the improvement of hemodynamic parameters. The dynamic parameters obtained in patients with EAH in association with ED, taking into account the impact of the optimized treatment, showed a positive effect on the total risk of cardiovascular complications, changes in the profile of LV diastolic filling, and dysfunction of the arterial endothelium. Conclusions: The inclusion of essential phospholipids in the treatment of children with EAH and ED helps to optimize the profile of LV diastolic filling and counteract vascular endothelial dysfunction, and indicates a positive effect of the optimized treatment on the overall risk of cardiovascular complications.
6

Khari, Manju, and Prabhat Kumar. "Empirical Evaluation of Hill Climbing Algorithm." International Journal of Applied Metaheuristic Computing 8, no. 4 (October 2017): 27–40. http://dx.doi.org/10.4018/ijamc.2017100102.

Abstract:
Software is growing in size and complexity every day, creating a strong need in the research community for techniques that can optimize test cases effectively. The current study is inspired by the collective path-finding behavior of colonies foraging for food, and uses different versions of the Hill Climbing Algorithm (HCA), namely Stochastic and Steepest Ascent HCA, for the purpose of finding a good optimal solution. The performance of the proposed algorithm is verified on the basis of three parameters: the optimized test cases, the time taken during the optimization process, and the percentage of optimization achieved. The results suggest that the proposed Stochastic HCA achieves a significantly better average optimization percentage than Steepest Ascent HCA in reducing the number of test cases needed to accomplish the optimization target.
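A compact sketch of the two HCA variants applied to test-suite minimization; the coverage data and fitness function are invented for illustration:

```python
# Minimal sketch of test-suite minimization with the two HCA variants the
# paper compares. A state is a subset of test cases; fitness prefers full
# branch coverage with fewer tests. All data is illustrative.
import random

coverage = {               # test case -> branches it covers (invented)
    "t1": {1, 2}, "t2": {2, 3}, "t3": {1, 2, 3}, "t4": {4},
}

def fitness(state: frozenset) -> float:
    covered = set().union(*(coverage[t] for t in state)) if state else set()
    return len(covered) - 0.1 * len(state)   # full coverage, few tests

def neighbors(state: frozenset):
    for t in coverage:                        # add or drop one test case
        yield state ^ {t}

def hill_climb(steepest: bool) -> frozenset:
    state = frozenset(coverage)               # start from the full suite
    while True:
        nbrs = list(neighbors(state))
        if steepest:                          # steepest-ascent HCA
            best = max(nbrs, key=fitness)
            better = best if fitness(best) > fitness(state) else None
        else:                                 # stochastic HCA
            random.shuffle(nbrs)
            better = next((n for n in nbrs if fitness(n) > fitness(state)), None)
        if better is None:
            return state
        state = better

print(sorted(hill_climb(steepest=True)))      # e.g. ['t3', 't4']
```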
7

Cioara, Tudor, Marcel Antal, Claudia Daniela Antal (Pop), Ionut Anghel, Massimo Bertoncini, Diego Arnone, Marilena Lazzaro, et al. "Data Centers Optimized Integration with Multi-Energy Grids: Test Cases and Results in Operational Environment." Sustainability 12, no. 23 (November 26, 2020): 9893. http://dx.doi.org/10.3390/su12239893.

Abstract:
In this paper, we address the management of Data Centers (DCs) by considering their optimal integration with the electrical, thermal, and IT (Information Technology) networks helping them to meet sustainability objectives and gain primary energy savings. Innovative scenarios are defined for exploiting the DCs electrical, thermal, and workload flexibility as a commodity and Information and Communication Technologies (ICT) are proposed and used as enablers for the scenarios’ implementation. The technology and scenarios were evaluated in the context of two operational DCs: a micro DC in Poznan which has on-site renewable sources and a DC in Point Saint Martin. The test cases’ results validate the possibility of using renewable energy sources (RES) for exploiting DCs’ energy flexibility and the potential of combining IT load migration with the availability of RES to increase the amount of energy flexibility by finding a trade-off between the flexibility level, IT load Quality of Service (QoS), and the RES production level. Moreover, the experiments conducted show that the DCs can successfully adapt their thermal energy profile for heat re-use as well as the combined electrical and thermal energy profiles to match specific flexibility requests.
8

Sahoo, Rashmi Rekha, and Mitrabinda Ray. "Metaheuristic Techniques for Test Case Generation." Journal of Information Technology Research 11, no. 1 (January 2018): 158–71. http://dx.doi.org/10.4018/jitr.2018010110.

Abstract:
The primary objective of software testing is to locate as many bugs as possible in software by using an optimum set of test cases. An optimum set of test cases is obtained by a selection procedure, which can be viewed as an optimization problem. Thus, metaheuristic optimization (search) techniques have been used extensively to automate the software testing task. The application of metaheuristic search techniques in software testing is termed Search Based Testing. Non-redundant, reliable, and optimized test cases can be generated by search based testing with less effort and time. This article presents a systematic review of several metaheuristic techniques, such as Genetic Algorithms, Particle Swarm Optimization, Ant Colony Optimization, Bee Colony Optimization, Cuckoo Search, Tabu Search, and some modified versions of these algorithms, used for test case generation. The authors also provide a framework showing the advantages, limitations, and future scope or gaps of these research works, which will help further research in this area.
9

Gladston, Angelin, and Niranjana Devi N. "Optimal Test Case Selection Using Ant Colony and Rough Sets." International Journal of Applied Evolutionary Computation 11, no. 2 (April 2020): 1–14. http://dx.doi.org/10.4018/ijaec.2020040101.

Abstract:
Test case selection helps improve the quality of test suites by removing ambiguous and redundant test cases, thereby reducing the cost of software testing. Various existing works have chosen test cases based on a single parameter and optimized the test cases using a single objective and a single strategy. In this article, a parameter selection technique is combined with an optimization technique for optimizing the selection of test cases. A two-step approach has been employed. In the first step, fuzzy entropy-based filtration is used for test case fitness evaluation and selection. In the second step, improvised ant colony optimization is employed to select test cases from the previously reduced test suite. The experimental evaluation, using the coverage parameters average percentage statement coverage and average percentage decision coverage along with suite size reduction, demonstrates that with the proposed approach the test suite size can be reduced, further reducing the computational effort incurred.
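A simplified sketch of the two-step idea; for brevity, the fuzzy-entropy filter is replaced here by a plain fitness threshold and the improvised ant colony optimization by a greedy statement cover, so this only illustrates the structure of the approach:

```python
# Step 1: filter test cases by a fitness threshold (stand-in for the
# fuzzy-entropy filtration). Step 2: select from the reduced suite for
# statement coverage (greedy stand-in for the improvised ACO).
stmts = {"t1": {1, 2, 5}, "t2": {2, 3}, "t3": {3, 4, 5}, "t4": {1}}
fit = {"t1": 0.8, "t2": 0.6, "t3": 0.7, "t4": 0.2}   # step 1 fitness values

reduced = {t: s for t, s in stmts.items() if fit[t] >= 0.5}   # step 1

selected, covered = [], set()
goal = set().union(*reduced.values())
while covered < goal:                                  # step 2: greedy cover
    t = max(reduced, key=lambda x: len(reduced[x] - covered))
    selected.append(t)
    covered |= reduced[t]
print(selected)   # e.g. ['t1', 't3'] -> suite reduced from 4 to 2 tests
```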
10

Almutairi, Saleh Ateeq. "DL-MDF-OH2: Optimized Deep Learning-Based Monkeypox Diagnostic Framework Using the Metaheuristic Harris Hawks Optimizer Algorithm." Electronics 11, no. 24 (December 8, 2022): 4077. http://dx.doi.org/10.3390/electronics11244077.

Abstract:
At a time when the world is attempting to recover from the damage caused by the COVID-19 spread, the monkeypox virus threatens to evolve into a global pandemic. Human monkeypox was first recognized in Africa and has recently emerged in 103 countries outside Africa. However, monkeypox diagnosis at an early stage is difficult because of its similarity to chickenpox, cowpox, and measles. In some cases, computer-assisted detection of monkeypox lesions can be helpful for quick identification of suspected cases. Infected and uninfected cases have been added to a growing, publicly accessible dataset that may be utilized by machine and deep learning to predict suspected cases at an early stage. Motivated by this, a diagnostic framework to categorize patients into four categories (normal, monkeypox, chickenpox, and measles) is proposed. The diagnostic framework is a hybridization of pre-trained Convolutional Neural Network (CNN) models, machine learning classifiers, and a metaheuristic optimization algorithm. The hyperparameters of the five pre-trained models (VGG19, VGG16, Xception, MobileNet, and MobileNetV2) are optimized using the Harris Hawks Optimizer (HHO) metaheuristic algorithm. After that, features are extracted from the feature extraction and reduction layers. These features are classified using seven machine learning models (Random Forest, AdaBoost, Histogram Gradient Boosting, Gradient Boosting, Support Vector Machine, Extra Trees, and KNN). For each classifier, 10-fold cross-validation is used to train and test on the features, and the weighted average performance metrics are reported. The predictions from the pre-trained model and machine learning classifiers are then processed using majority voting. The experiments were conducted on two datasets: the Monkeypox Skin Images Dataset (MSID) and the Monkeypox Images Dataset (MPID). For the MSID dataset, values of 97.67%, 95.19%, 97.96%, 95.11%, 96.58%, 95.10%, 90.93%, and 96.65% are achieved for accuracy, sensitivity, specificity, PPV, BAC, F1, IoU, and ROC, respectively. For the MPID dataset, values of 97.51%, 94.84%, 94.48%, 94.96%, 96.66%, 94.88%, 90.45%, and 96.69% are achieved for accuracy, sensitivity, specificity, PPV, BAC, F1, IoU, and ROC, respectively.
11

Zhao, Jian Ping, Xiao Yang Liu, Hong Ming Xi, Li Ya Xu, Jian Hui Zhao, and Huan Ming Liu. "A Lightweight-Leveled Software Automated Test Framework." Advanced Materials Research 834-836 (October 2013): 1919–24. http://dx.doi.org/10.4028/www.scientific.net/amr.834-836.1919.

Abstract:
To resolve the problem of a large number of automated test scripts and test data files, a test automation framework based on a three-layer data-driven mechanism is designed using the test tool QTP together with data-driven and keyword-driven testing mechanisms. It includes the design of the TestSet managing test case files, the TestCase storing test cases, and the TestData storing test data. By controlling the test scale and applying a test data pool, reconfigurable and optimized test scripts are designed. These methods decouple test design from script development, give test cases and data a more human-oriented design, make test scripts and test data optimized and reusable at the business level, and reduce the number of script files and test data files to a minimum, which reduces the occupied space.
12

Bruce. E, John, and T. Sasi Prabha. "A Survey on Techniques Adopted in the Prioritization of Test Cases for Regression Testing." International Journal of Engineering & Technology 7, no. 4.5 (September 22, 2018): 242. http://dx.doi.org/10.14419/ijet.v7i4.5.20078.

Abstract:
Regression testing is testing the software with the intention of confirming that changes made to part of a module do not adversely affect other parts of the module. Test case prioritization helps to reduce regression testing cost by ordering the test cases in such a way that optimized results are produced. Code coverage and fault detection, the factors behind prioritization, are dealt with using techniques such as heuristic methods, metaheuristic methods, and data mining techniques. The effectiveness of the applied techniques can be evaluated with metrics such as Average Percentage of Fault Detection (APFD), Average Percentage Block Coverage (APBC), and Average Percentage Decision Coverage (APDC). In this paper, a detailed survey of the various techniques adopted for the prioritization of test cases is presented.
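The APFD metric mentioned above has a standard closed form: for a prioritized suite of n test cases and m faults, APFD = 1 − (TF1 + … + TFm)/(nm) + 1/(2n), where TFi is the 1-based position of the first test that reveals fault i. A small sketch (fault data invented; assumes every fault is revealed by some test in the order):

```python
# Compute APFD for a given test order and a fault -> revealing-tests map.
def apfd(order: list[str], faults: dict[str, set[str]]) -> float:
    n, m = len(order), len(faults)
    first = []
    for fault, revealing in faults.items():
        # 1-based position of the first test case that reveals this fault.
        pos = next(i + 1 for i, t in enumerate(order) if t in revealing)
        first.append(pos)
    return 1 - sum(first) / (n * m) + 1 / (2 * n)

# Two faults: t2 reveals f1, and t3 or t1 reveal f2 (illustrative data).
print(apfd(["t2", "t3", "t1"], {"f1": {"t2"}, "f2": {"t3", "t1"}}))  # ~0.667
```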
13

Liu, Zhenpeng, Xianwei Yang, Shichen Zhang, Yi Liu, Yonggang Zhao, and Weihua Zheng. "Automatic Generation of Test Cases Based on Genetic Algorithm and RBF Neural Network." Mobile Information Systems 2022 (May 23, 2022): 1–9. http://dx.doi.org/10.1155/2022/1489063.

Abstract:
Software testing plays an important role in improving the quality of software, but the design of test cases requires a great deal of manpower, material resources, and time, and designers tend to be subjective when designing test cases. To solve this problem and make the test cases objective with greater coverage, a branch coverage test case generation method based on a genetic algorithm and an RBF neural network (GAR) is proposed. For test case generation, based on the genetic algorithm optimized in this paper, a certain number of test case samples are randomly selected to train the RBF neural network to simulate the fitness function and calculate individual fitness values. The experiment uses seven C-language programs to automatically generate test cases and compares the experimental data generated by the branch coverage test case generation methods based on an adaptive genetic algorithm (PDGA), a traditional genetic algorithm (SGA), and random test generation (random) to evaluate the proposed algorithm. The experimental results show that the method is feasible and effective: branch coverage is increased in the generated test cases, and fewer population iterations are needed.
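A hedged sketch of the surrogate idea: an RBF model is trained on a sample of inputs with known (expensive) fitness, and a simple GA then ranks candidates with the cheap RBF prediction. The fitness function and GA settings below are invented, not the paper's:

```python
# GA with an RBF surrogate standing in for an expensive fitness function.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def true_fitness(x):                     # expensive oracle, e.g. branch
    return -np.abs(x[:, 0] - 3.0)        # distance measured in real runs

sample = rng.uniform(0, 10, (30, 1))     # sampled test cases
surrogate = RBFInterpolator(sample, true_fitness(sample))

pop = rng.uniform(0, 10, (20, 1))
for _ in range(50):                      # simple GA: select + mutate
    scores = surrogate(pop)              # cheap fitness from the RBF model
    parents = pop[np.argsort(scores)[-10:]]
    children = parents + rng.normal(0, 0.3, parents.shape)
    pop = np.vstack([parents, np.clip(children, 0, 10)])

print(pop[np.argmax(surrogate(pop))])    # best input found, near x = 3
```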
14

Mudarakola, Lakshmi Prasad, and J. K. R. Sastry. "A Neural Network Based Strategy (NNBS) for Automated Construction of Test Cases for Testing an Embedded System using Combinatorial Techniques." International Journal of Engineering & Technology 7, no. 1.3 (December 31, 2017): 74. http://dx.doi.org/10.14419/ijet.v7i1.3.9271.

Abstract:
Testing an embedded system is required to locate bugs in the software, reduce risk and development and repair costs, and improve performance for both users and the company. Embedded software testing tools are useful for catching defects during unit, integration, and system testing. Embedded systems in many cases must be tested by exercising crucial areas of the system while considering all factors of the input domain. The most important concern is to build a set of test cases, based on the design requirements, that can recognize more faults at least cost and time in the major sections of an embedded system. This paper proposes a Neural Network Based Strategy (NNBS) to generate optimized test cases based on considerations of the system. A tool called NNTCG (Neural Network Test Case Generator) has been built based on the method proposed in this paper. Test cases are generated for testing an embedded system using NNTCG, and the same are used to determine the expected output through the neural network and the output generated from the actual firmware. The faulty paths within the firmware are determined when the output generated by the neural network differs from the output generated by the firmware.
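The oracle idea in the last sentence can be sketched as follows; both functions are stand-ins, not the paper's NNTCG:

```python
# Run each test case through the trained network and the firmware;
# disagreement flags the exercised path as suspect. Both functions are
# illustrative stubs.
def nn_expected(inputs):            # trained model's prediction (stub)
    return sum(inputs) % 7

def firmware_output(inputs):        # actual embedded build (stub)
    return sum(inputs) % 7 if inputs[0] < 100 else 0   # seeded fault

suspect = [tc for tc in [(1, 2), (150, 3)]
           if nn_expected(tc) != firmware_output(tc)]
print(suspect)                      # [(150, 3)] -> faulty path candidates
```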
15

Senthil Kumar, K., and A. Muthukumaravel. "A Hybrid Approach for Test Case Prioritization using PSO Based on Software Quality Metrics." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 300. http://dx.doi.org/10.14419/ijet.v7i3.12.16046.

Abstract:
Effective functionality checking of any software application is the crucial activity that determines the quality of the outcome obtained. Generally, checking scenarios that involve multiple test cases in combination with multiple components is time consuming and also increases the quality assurance cost. Selection of a suitable method or approach for the optimization and prioritization of test cases, as well as appropriate evaluation of the application, results in a reduction of fault detection effort without appreciable information loss, and further significantly decreases the cleanup cost. In the proposed method, test cases are optimized and then prioritized by a Particle Swarm Optimization algorithm (PSO) and an Improved Cuckoo Search algorithm (ICSA), respectively. Finally, the result is evaluated against software quality measures.
16

Park, Jongwon. "An Optimized Colorimetric Readout Method for Lateral Flow Immunoassays." Sensors 18, no. 12 (November 22, 2018): 4084. http://dx.doi.org/10.3390/s18124084.

Abstract:
Despite its broad penetration of various markets, the quantitative lateral flow immunoassay (LFIA) suffers from sensitivity issues in some cases. To solve this problem, an optimized colorimetric readout method for LFIA quantification is proposed in this study. An assay reader device utilizing a color camera and an analysis method using a Bayer filtered image were developed. Spectrometric measurements of the assay test line were performed to determine the color channel that contains the test line information and effectively minimizes noise. The change in the intensity ratio with increasing concentration of the target substance in the sample was largest in the green channel. The linear range of the output curve ranged from 0 to 10 ng/mL, and the detection limit was 2 ng/mL. The suggested instrumentation and analysis methods are expected to effectively resolve the low-sensitivity problems of the former LFIA systems and to offer other prospective functionalities for LFIA quantification.
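A sketch of reading green-channel pixels from a Bayer mosaic and forming a test-line-to-background intensity ratio, in the spirit of the readout described above; the RGGB layout and the region-of-interest split are assumptions, not the paper's configuration:

```python
# Extract the green channel from an RGGB Bayer frame and form the
# intensity ratio used for quantification. Frame and ROI are invented.
import numpy as np

raw = np.random.default_rng(2).integers(0, 1024, (8, 8))   # fake Bayer frame
green = np.concatenate([raw[0::2, 1::2].ravel(),           # G at (even, odd)
                        raw[1::2, 0::2].ravel()])          # G at (odd, even)
test_line = green[:8].mean()        # pixels over the test line (assumed ROI)
background = green[8:].mean()       # nearby membrane pixels
print(test_line / background)       # intensity ratio vs. analyte level
```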
17

Maniatopoulos, Andreas, and Nikolaos Mitianoudis. "Learnable Leaky ReLU (LeLeLU): An Alternative Accuracy-Optimized Activation Function." Information 12, no. 12 (December 9, 2021): 513. http://dx.doi.org/10.3390/info12120513.

Abstract:
In neural networks, a vital component in the learning and inference process is the activation function. There are many different approaches, but only nonlinear activation functions allow such networks to compute non-trivial problems by using only a small number of nodes, and such activation functions are called nonlinearities. With the emergence of deep learning, the need for competent activation functions that can enable or expedite learning in deeper layers has emerged. In this paper, we propose a novel activation function, combining many features of successful activation functions, achieving 2.53% higher accuracy than the industry standard ReLU in a variety of test cases.
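A minimal PyTorch sketch of an activation with a learnable slope in the spirit of LeLeLU; the paper's exact parameterization may differ (here a single trainable α scales only the negative part, PReLU-style):

```python
# Leaky ReLU with a trainable negative-slope parameter alpha.
import torch
import torch.nn as nn

class LearnableLeakyReLU(nn.Module):
    def __init__(self, init_alpha: float = 0.1):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x for x >= 0, alpha * x otherwise; alpha is learned by backprop.
        return torch.where(x >= 0, x, self.alpha * x)

act = LearnableLeakyReLU()
print(act(torch.tensor([-2.0, 3.0])))   # tensor([-0.2000, 3.0000], ...)
```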
18

Tatale, Subhash, and Vudatha Chandra Prakash. "Automatic Generation and Optimization of Combinatorial Test Cases from UML Activity Diagram Using Particle Swarm Optimization." Ingénierie des systèmes d'information 27, no. 1 (February 28, 2022): 49–59. http://dx.doi.org/10.18280/isi.270106.

Abstract:
Generation of test cases is one of the essential activities of the software testing process. The process of executing a program to identify defects and improve the system's quality is known as software testing. Manually writing test cases takes time, effort, and money; generating test cases automatically is the solution to this problem. For this automation process, a model-based test case generation technique is suitable: a model is usually required to generate test cases, and researchers commonly rely on the activity diagram. Test cases for combinatorial logic systems are also required, as combinatorial testing is essential for producing a small number of test cases and identifying errors caused by interactions between system input parameters. Information about constraints, parameters, and their values is required for the generation of test cases, but it is difficult to extract this information from an Activity Diagram, so a novel approach is proposed to extract it. The authors created a tool that automatically generates combinatorial test cases using UML Activity Diagrams. The proposed tool has two main parts: first, a combinatorial test design model is developed for the extraction of input parameters; second, an optimized number of combinatorial test cases is generated using the Particle Swarm Optimization algorithm. Finally, the authors experimented on a real-world case study, namely Railway Reservation, and showed that the proposed tool generates an optimal number of combinatorial test cases.
19

Rathee, Nisha, and Rajender Singh Chhillar. "Model Driven Approach to Secure Optimized Test Paths for Smart Samsung Pay using Hybrid Genetic Tabu Search Algorithm." International Journal of Information System Modeling and Design 9, no. 1 (January 2018): 77–91. http://dx.doi.org/10.4018/ijismd.2018010104.

Abstract:
Smart mobile payment applications on smart devices have been considered the most efficient and secure mode of contactless payment. To safeguard customer credit/debit card details, testing of mobile pay solutions like Samsung Pay is a most important and critical task for testers. Testing all the test cases is a very tedious and time-consuming task; therefore, optimization techniques have been used to identify the most optimized test paths. In this article, a hybrid genetic and tabu search optimization (HGTO) algorithm is proposed to secure optimized test paths using the activity diagram of the smart Samsung Pay application. The proposed approach has been implemented in C++ on the case study of Smart Samsung Pay and an online airline reservation system. The experimental results show that the proposed technique is more effective in the automatic generation and optimization of test paths than a simple genetic algorithm.
20

Chennappan, R., and Vidyaa Thulasiraman. "Multicriteria Cuckoo search optimized latent Dirichlet allocation based Ruzchika indexive regression for software quality management." Indonesian Journal of Electrical Engineering and Computer Science 24, no. 3 (December 1, 2021): 1804. http://dx.doi.org/10.11591/ijeecs.v24.i3.pp1804-1813.

Abstract:
Software quality management is highly significant for ensuring the quality and reviewing the reliability of software products. To improve software quality by predicting software failures and enhancing scalability, this paper presents a novel reinforced Cuckoo search optimized latent Dirichlet allocation based Ruzchika indexive regression (RCSOLDA-RIR) technique. First, multicriteria reinforced Cuckoo search optimization is used to perform test case selection and find the most optimal solution, considering multiple criteria when selecting the optimal test cases for testing software quality. Next, the generative latent Dirichlet allocation model is applied to predict the software failure density with the selected optimal test cases in minimal time. Finally, Ruzchika indexive regression is applied to measure the similarity between preceding versions and the new version of software products. Based on this similarity estimation, the software failure density of the new version is also predicted. With this, software error prediction is made in a significant manner; the reliability of software code is therefore improved, and service provisioning time between software versions in software systems is also minimized. An experimental assessment of the RCSOLDA-RIR technique shows better reliability and scalability than existing methods.
21

Lee, Chang-Whan, Dong-Jun Kim, Sung-Kwon Kim, and Kyuho Sim. "Design Optimization of Flexure Springs for Free-Piston Stirling Engines and Experimental Evaluations with Fatigue Testing." Energies 14, no. 16 (August 20, 2021): 5156. http://dx.doi.org/10.3390/en14165156.

Abstract:
The free-piston Stirling engine is a closed-cycle regenerative heat engine that converts heat energy into mechanical work, and requires a spring element for vibratory operations of the displacer and power pistons. In this study, the geometry of the flexural spring design was optimized through structural finite element analyses and fatigue test evaluations. First, we constructed a target design space considering the required natural frequency of the displacer spring assembly under the geometric constraints of total mass and module height. The design of experiments was employed to construct simulation cases for design factors such as the outer diameter, thickness, and number of spirals in the spring sheet. As a result, the optimized design values were obtained to satisfy the design requirements. We also fabricated a test spring specimen and conducted fatigue tests using a linear actuator system developed to have the same motion as the engine. The test results indicated that the optimized spiral spring had no fracture under operating conditions with the design piston amplitude, revealing the effectiveness of the design method.
22

Schoemig-Markiefka, Birgid, Jana Eschbach, Andreas H. Scheel, Aylin Pamuk, Josef Rueschoff, Thomas Zander, Reinhard Buettner, et al. "Optimized PD-L1 scoring of gastric cancer." Gastric Cancer 24, no. 5 (May 5, 2021): 1115–22. http://dx.doi.org/10.1007/s10120-021-01195-4.

Abstract:
Background: PD-1/PD-L1 immunotherapy has been approved for gastric carcinoma, and PD-L1 assessment by immunohistochemistry is the principal biomarker. Are biopsies able to map the actual PD-L1 status of the entire tumor? Methods: Whole tumor slides of 56 gastric carcinomas were analyzed to determine the distribution of PD-L1 positive cells over the entire tumor areas. Tissue microarrays with four cores of the tumor surface, which represents the endoscopically accessible biopsy zone, were built from the same tumors. The PD-L1 CPS value was determined separately for each core. Preoperative diagnostic biopsies were available for 22 of the tumors. PD-L1 prevalence, sensitivity, and specificity were analyzed using the whole tumor slides as reference scores. Molecular subtyping was performed and related to the PD-L1 status. Results: 27.3% of cases were PD-L1 negative (CPS < 1), 43.6% showed low PD-L1 expression (CPS ≥ 1 to < 5), 12.7% moderate (CPS ≥ 5 to < 10), and 16.4% strong expression (CPS ≥ 10). The biopsies showed the best test characteristics when four surface biopsies were analyzed combined, i.e., the CPS was calculated across all four biopsies. The prevalence showed a distribution similar to the resection specimens; sensitivity was 0.73 and specificity 1.0. Using fewer surface biopsies decreased sensitivity and specificity and caused false-negative classifications. Compared to the TMAs, the preoperative biopsies showed reduced sensitivity (0.412). Conclusions: This is the first comprehensive study to optimize PD-L1 assessment in gastric cancer using endoscopically available tissue. The obtained PD-L1 prevalence is consistent with data from current clinical studies. Calculation of the test characteristics shows that surface biopsies can be indicative of the true PD-L1 status based on the resection specimen; however, an adequate number of biopsies is required. In this study, n = 4 biopsies yielded the best results.
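The combined-CPS calculation described in the results can be expressed by pooling counts across cores before forming the score; the cell counts below are illustrative, not the study's data:

```python
# Combined PD-L1 CPS across several biopsies: pool the counts first.
# CPS = 100 * PD-L1 positive cells / viable tumor cells.
def combined_cps(biopsies):
    pos = sum(b["pd_l1_positive_cells"] for b in biopsies)
    tumor = sum(b["viable_tumor_cells"] for b in biopsies)
    return 100.0 * pos / tumor

cores = [{"pd_l1_positive_cells": 12, "viable_tumor_cells": 400},
         {"pd_l1_positive_cells": 3,  "viable_tumor_cells": 350}]
print(combined_cps(cores))   # 2.0 -> CPS >= 1, "low expression" band
```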
23

Mala, D. Jeya, and V. Mohan. "ON THE USE OF INTELLIGENT AGENTS TO GUIDE TEST SEQUENCE SELECTION AND OPTIMIZATION." International Journal of Computational Intelligence and Applications 8, no. 2 (June 2009): 155–79. http://dx.doi.org/10.1142/s1469026809002515.

Abstract:
Many automated testing tools concentrate on the automatic generation of test cases but do not address their optimization. In our paper, we analyzed the system model, which is represented as a state diagram, and selected a very limited set of test sequences to be executed from the extremely large number (usually infinitely many) of potential ones. This paper demonstrates a way to generate optimal test sequences that are guaranteed to take the least possible time to execute and that satisfy both state and branch coverage criteria using intelligent agents. In the proposed approach, the Intelligent Search Agent (ISA) decides the optimized test sequences by searching through the SUT, which is represented as a graph in which each node is associated with a heuristic value and each edge is associated with an edge weight. The intelligent agent finds the best sequence by following the nodes that satisfy the fitness criteria and generates the optimized test sequences for the SUT. Finally, we compared our approach with an existing Ant Colony Optimization (ACO)-based test sequence optimization approach and showed that our approach produces better results.
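An illustrative sketch of an agent walking a state graph whose nodes carry heuristic values and whose edges carry weights, stopping at full state coverage; the graph and fitness rule are invented and far simpler than the paper's ISA:

```python
# Greedy walk over a state graph: prefer unvisited, high-heuristic,
# low-cost neighbors until every state has been covered.
graph = {                      # node -> {neighbor: edge_weight (cost)}
    "S0": {"S1": 1.0, "S2": 2.0},
    "S1": {"S2": 1.0, "S0": 2.0},
    "S2": {"S0": 1.0, "S1": 3.0},
}
heuristic = {"S0": 0.2, "S1": 0.9, "S2": 0.6}   # node desirability

def next_state(current, visited):
    def gain(n):
        bonus = 1.0 if n not in visited else 0.0
        return bonus + heuristic[n] - 0.1 * graph[current][n]
    return max(graph[current], key=gain)

state, sequence, visited = "S0", ["S0"], {"S0"}
while visited != set(graph):                 # stop at full state coverage
    state = next_state(state, visited)
    sequence.append(state)
    visited.add(state)
print(sequence)                              # e.g. ['S0', 'S1', 'S2']
```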
24

Amattouch, M. R., N. Nagid, and H. Belhadj. "Optimized Domain Decomposition Method for Non Linear Reaction Advection Diffusion Equation." European Scientific Journal, ESJ 12, no. 27 (September 30, 2016): 63. http://dx.doi.org/10.19044/esj.2016.v12n27p63.

Abstract:
This work is devoted to an optimized domain decomposition method applied to a nonlinear reaction advection diffusion equation. The proposed method is based on the idea of the optimized of order 2 (OO2) method developed over the last two decades. We first apply a modified fixed-point technique to linearize the problem, and then we generalize the OO2 method and modify it to obtain a new, more optimized convergence rate for the Schwarz algorithm. To compute the new convergence rate we used Fourier analysis. For the numerical computation, we minimize this convergence rate using a global optimization algorithm. Several test cases on analytical problems illustrate this approach and show the efficiency of the proposed new method.
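For orientation, optimized Schwarz methods of this family replace Dirichlet interface conditions with second-order (Robin-type) transmission conditions whose free parameters minimize the worst-case convergence factor. A generic, hedged sketch of the OO2 form follows; the paper's exact operators and symbols may differ:

```latex
% Second-order (OO2-type) transmission conditions on the interface \Gamma,
% with normal derivative \partial_n, tangential derivative \partial_\tau,
% and free parameters p, q:
\left(\partial_{n} + p + q\,\partial_{\tau}^{2}\right) u_{1}
  = \left(\partial_{n} + p + q\,\partial_{\tau}^{2}\right) u_{2}
  \quad \text{on } \Gamma ,
% where p, q solve a min-max problem over the Fourier frequencies k
% resolved by the discretization, \rho being the convergence factor:
\min_{p,\,q}\ \max_{k_{\min}\le |k| \le k_{\max}} \left|\rho(k;\,p,\,q)\right| .
```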
25

Dubey, Yogita, Divakar Singh, and Anju Singh. "A Parallel Early Binding Recursive Ant Colony Optimization (PEB-RAC) Approach for Generating Optimized Auto Test Cases from Programming Inputs." International Journal of Computer Applications 136, no. 3 (February 17, 2016): 11–17. http://dx.doi.org/10.5120/ijca2016908420.

26

Hotchen, Andrew J., Maria Dudareva, Jamie Y. Ferguson, Parham Sendi, and Martin A. McNally. "The BACH classification of long bone osteomyelitis." Bone & Joint Research 8, no. 10 (October 2019): 459–68. http://dx.doi.org/10.1302/2046-3758.810.bjr-2019-0050.r1.

Abstract:
Objectives: The aim of this study was to assess the clinical application of, and optimize the variables used in, the BACH classification of long-bone osteomyelitis. Methods: A total of 30 clinicians from a variety of specialities classified 20 anonymized cases of long-bone osteomyelitis using BACH. Cases were derived from patients who presented to specialist centres in the United Kingdom between October 2016 and April 2017. Accuracy and Fleiss' kappa (Fκ) were calculated for each variable. Bone involvement (the B-variable) was assessed further by nine clinicians who classified ten additional cases of long bone osteomyelitis using a 3D clinical imaging package. Thresholds for defining multidrug-resistant (MDR) isolates were optimized using results from a further analysis of 253 long bone osteomyelitis cases. Results: The B-variable had a classification accuracy of 77.0%, which improved to 95.7% when using a 3D clinical imaging package (p < 0.01). The A-variable demonstrated difficulty in the accuracy of classification for increasingly resistant isolates (A1 (non-resistant), 94.4%; A2 (MDR), 46.7%; A3 (extensively or pan-drug-resistant), 10.0%). Further analysis demonstrated that isolates with four or more resistant test results, or with less than 80% sensitive susceptibility test results, had a 98.1% (95% confidence interval (CI) 96.6 to 99.6) and 98.8% (95% CI 98.1 to 100.0) correlation with MDR status, respectively. The coverage of the soft tissues (C-variable) and the host status (H-variable) both had substantial agreement between users and classification accuracies of 92.5% and 91.2%, respectively. Conclusions: The BACH classification system can be applied accurately by users with a variety of clinical backgrounds. Accuracy of B-classification was improved using 3D imaging, and the use of the A-variable has been optimized based on susceptibility testing results. Cite this article: A. J. Hotchen, M. Dudareva, J. Y. Ferguson, P. Sendi, M. A. McNally. The BACH classification of long bone osteomyelitis. Bone Joint Res 2019;8:459–468. DOI: 10.1302/2046-3758.810.BJR-2019-0050.R1
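The optimized A-variable thresholds reported above can be written as a simple rule; the function below encodes exactly the two cut-offs stated in the results:

```python
# MDR rule from the study's optimized thresholds: >= 4 resistant test
# results, or < 80% of susceptibility results reported as sensitive.
def is_mdr(n_resistant: int, n_sensitive: int, n_total: int) -> bool:
    return n_resistant >= 4 or (n_sensitive / n_total) < 0.80

print(is_mdr(4, 10, 16), is_mdr(1, 15, 16))   # True, False
```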
27

Mena, Natmir, Polina G. Marinova-Kichikova, and Kiril G. Kirov. "Optimized Drainage of Pancreatic-Digestive Anastomosis in Patients with Pancreatoduodenal Resection." Journal of Biomedical and Clinical Research 15, no. 2 (December 1, 2022): 135–41. http://dx.doi.org/10.2478/jbcr-2022-0019.

Abstract:
The study compared early post-surgical complications between two groups of patients with pancreatoduodenal resection for pancreatic head carcinoma: patients with pancreatic-gastric anastomosis with mixed drainage, and controls with pancreatic-jejunal anastomosis with external drainage. The present study was a cohort study; the patient group was selected prospectively and the control group retrospectively. Patients were randomized by sex, age, primary tumor location, pancreatic parenchyma density, clinical symptoms, tumor-node-metastasis (TNM), and grade (G). We used the IBM SPSS Statistics software with the following tests: Fisher's exact test, Pearson's chi-squared test, and the Mann-Whitney U test. The optimized reconstruction approach with mixed drainage reduced early complications: early mortality by 2.5%, overall morbidity by 7.5%, pancreatic-digestive anastomosis insufficiency by 2.5%, intra-abdominal bleeding by 2.5%, intra-abdominal infection by 2.5%, gastroparesis by 5.0%, wound infection by 2.5%, and biliary leakage by 2.5%. There were no cases of clinically significant pancreatic fistula. The control group was associated with an average 9-fold higher relative risk of early complications. The passage was restored between the 4th and 7th day. Patients had a shorter average hospital stay (11 days) compared to controls (22 days). Reconstruction of the digestive anastomoses on a single loop and mixed intraluminal drainage through a modified nasogastric tube led to a 7-fold reduction in early post-surgical complications and a 2-fold shorter hospital stay.
28

Uervirojnangkoorn, Monarin, Rolf Hilgenfeld, Thomas C. Terwilliger, and Randy J. Read. "Improving experimental phases for strong reflections prior to density modification." Acta Crystallographica Section D Biological Crystallography 69, no. 10 (September 20, 2013): 2039–49. http://dx.doi.org/10.1107/s0907444913018167.

Abstract:
Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.
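The target function described above, map skewness, is straightforward to compute; a hedged sketch with a random array standing in for a real density map derived from the phased structure factors:

```python
# GA fitness: skewness of the electron-density map, which tends to be
# higher for correct phases (density concentrated at atomic positions).
import numpy as np
from scipy.stats import skew

def map_skewness(density: np.ndarray) -> float:
    """Skewness of the flattened density histogram."""
    return float(skew(density.ravel()))

rho = np.random.default_rng(1).gamma(2.0, size=(32, 32, 32))  # stand-in map
print(map_skewness(rho))   # positive for peaky, protein-like density
```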
29

Kriegsmann, Mark, Christian Haag, Cleo-Aron Weis, Georg Steinbuss, Arne Warth, Christiane Zgorzelski, Thomas Muley, et al. "Deep Learning for the Classification of Small-Cell and Non-Small-Cell Lung Cancer." Cancers 12, no. 6 (June 17, 2020): 1604. http://dx.doi.org/10.3390/cancers12061604.

Abstract:
Reliable entity subtyping is paramount for therapy stratification in lung cancer. Morphological evaluation remains the basis for entity subtyping and directs the application of additional methods such as immunohistochemistry (IHC). The decision of whether to perform IHC for subtyping is subjective, and access to IHC is not available worldwide. Thus, the application of additional methods to support morphological entity subtyping is desirable. Therefore, the ability of convolutional neuronal networks (CNNs) to classify the most common lung cancer subtypes, pulmonary adenocarcinoma (ADC), pulmonary squamous cell carcinoma (SqCC), and small-cell lung cancer (SCLC), was evaluated. A cohort of 80 ADC, 80 SqCC, 80 SCLC, and 30 skeletal muscle specimens was assembled; slides were scanned; tumor areas were annotated; image patches were extracted; and cases were randomly assigned to a training, validation or test set. Multiple CNN architectures (VGG16, InceptionV3, and InceptionResNetV2) were trained and optimized to classify the four entities. A quality control (QC) metric was established. An optimized InceptionV3 CNN architecture yielded the highest classification accuracy and was used for the classification of the test set. Image patch and patient-based CNN classification results were 95% and 100% in the test set after the application of strict QC. Misclassified cases mainly included ADC and SqCC. The QC metric identified cases that needed further IHC for definite entity subtyping. The study highlights the potential and limitations of CNN image classification models for tumor differentiation.
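A sketch of the patch-classification setup with a pretrained InceptionV3 backbone and a four-class head (ADC, SqCC, SCLC, muscle); the input size and training details are assumptions, not the study's exact configuration:

```python
# Transfer-learning sketch: frozen InceptionV3 backbone + small head.
import tensorflow as tf

base = tf.keras.applications.InceptionV3(weights="imagenet",
                                         include_top=False,
                                         input_shape=(299, 299, 3))
base.trainable = False                       # fine-tune later if desired
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),   # four entities
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```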
30

Kim, Jisun, Jaewoong Kim, In-ju Kim, Sungwook Kang, and Kwangsan Chun. "An Analysis of Mechanical Properties for Ultrasonically Welded Multiple C1220-Al1050 Layers." Applied Sciences 9, no. 19 (October 8, 2019): 4188. http://dx.doi.org/10.3390/app9194188.

Abstract:
This study analyzed the characteristics of aluminum and copper sheets under multi-layer ultrasonic welding, and observed the strength, fracture type, and interface of the weld zone according to location. In addition, an experimental plan was developed using the Taguchi method to optimize the quadruple-lap ultrasonic welding process conditions for 0.4t aluminum and copper sheets, and the experiment was performed for each of 25 welding conditions. For strength evaluation, the ultrasonic welding performance was evaluated by measuring the tensile strength as a composite material and the shear force at the weld interface through two types of tensile tests: simultaneous tensile and individual tensile. To identify the individual shear strengths of the multi-layer dissimilar ultrasonic welds, three types of tensile tests were performed for each specimen depending on the location of the weld; as the distance from the horn increased, each shear strength decreased while the difference in strength values increased. For quadruple-lap welding of pure aluminum and copper sheets, the S/N (signal-to-noise) ratio was highest at 64.48 with a coarse-grain pattern under the optimal welding conditions, and this was selected as the optimal condition. To evaluate the optimized welding condition, additional tests were conducted using the welding conditions that showed the maximum strength values and the welding conditions optimized using the Taguchi method through simple tests. A strength evaluation of the optimized weldment was performed, and for the simultaneous tensile test it was found that the strength of the optimized weldment improved by 45% compared to the other cases.
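The Taguchi larger-the-better signal-to-noise ratio used to rank welding conditions has the standard form S/N = −10·log10((1/n)·Σ 1/yᵢ²); the sample strengths below are illustrative, not the paper's measurements:

```python
# Larger-the-better Taguchi S/N ratio over repeated strength readings.
import math

def sn_larger_is_better(values):
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

strengths = [1620.0, 1655.0, 1590.0]              # e.g. shear strengths (N)
print(round(sn_larger_is_better(strengths), 2))   # ~64.2 dB
```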
31

Kiran, Ayesha, Wasi Haider Butt, Arslan Shaukat, Muhammad Umar Farooq, Urooj Fatima, Farooque Azam, and Zeeshan Anwar. "Multi-objective regression test suite optimization using three variants of adaptive neuro fuzzy inference system." PLOS ONE 15, no. 12 (December 3, 2020): e0242708. http://dx.doi.org/10.1371/journal.pone.0242708.

Abstract:
In the process of software development, regression testing is one of the major activities performed after modifications are made to the current system or whenever a software system evolves. However, the test suite size increases with the addition of new test cases, and the suite becomes inefficient because of the occurrence of redundant, broken, and obsolete test cases; consequently, additional time and budget are required to run all these test cases. Many researchers have proposed computational intelligence and conventional approaches for dealing with this problem, achieving an optimized test suite by selecting, minimizing or reducing, and prioritizing test cases. Currently, most of these optimization approaches are single-objective and static in nature, but multi-objective dynamic approaches are needed due to advancements in information technology and associated market challenges. Therefore, we propose three variants of a self-tunable Adaptive Neuro-Fuzzy Inference System, i.e., TLBO-ANFIS, FA-ANFIS, and HS-ANFIS, for multi-objective regression test suite optimization. Two benchmark test suites are used to evaluate the proposed ANFIS variants, and their performance is measured using Standard Deviation and Root Mean Square Error. A comparison of experimental results with six existing methods, i.e., GA-ANFIS, PSO-ANFIS, MOGA, NSGA-II, MOPSO, and TOPSIS, shows that the proposed method effectively reduces the size of the regression test suite without a reduction in the fault detection rate.
32

Lu, Suying, David Duplat, Paula Benitez-Bolivar, Cielo León, Stephany D. Villota, Eliana Veloz-Villavicencio, Valentina Arévalo, et al. "Multicenter international assessment of a SARS-CoV-2 RT-LAMP test for point of care clinical application." PLOS ONE 17, no. 5 (May 11, 2022): e0268340. http://dx.doi.org/10.1371/journal.pone.0268340.

Abstract:
Continued waves, new variants, and limited vaccine deployment mean that SARS-CoV-2 tests remain vital to constrain the coronavirus disease 2019 (COVID-19) pandemic. Affordable, point-of-care (PoC) tests allow rapid screening in non-medical settings. Reverse-transcription loop-mediated isothermal amplification (RT-LAMP) is an appealing approach. A crucial step is to optimize testing in low/medium resource settings. Here, we optimized RT-LAMP for SARS-CoV-2 and human β-actin, and tested clinical samples in multiple countries. “TTTT” linker primers did not improve performance, and while guanidine hydrochloride, betaine and/or Igepal-CA-630 enhanced detection of synthetic RNA, only the latter two improved direct assays on nasopharyngeal samples. With extracted clinical RNA, a 20 min RT-LAMP assay was essentially as sensitive as RT-PCR. With raw Canadian nasopharyngeal samples, sensitivity was 100% (95% CI: 67.6% - 100%) for those with RT-qPCR Ct values ≤ 25, and 80% (95% CI: 58.4% - 91.9%) for those with 25 < Ct ≤ 27.2. Highly infectious, high titer cases were also detected in Colombian and Ecuadorian labs. We further demonstrate the utility of replacing thermocyclers with a portable PoC device (FluoroPLUM). These combined PoC molecular and hardware tools may help to limit community transmission of SARS-CoV-2.
33

Ojugo, Arnold, and Oghenevwede Debby Otakore. "Forging An Optimized Bayesian Network Model With Selected Parameters For Detection of The Coronavirus In Delta State of Nigeria." Journal of Applied Science, Engineering, Technology, and Education 3, no. 1 (June 30, 2020): 37–45. http://dx.doi.org/10.35877/454ri.asci2163.

Abstract:
Machine learning algorithms have become veritable tools for effective decision support in the construction of systems that assist experts in their fields of endeavor with problematic tasks. They are best suited for tasks where data is explored and exploited, and for cases where the dataset contains noise, partial truths, and ambiguities, or where there is a shortage of data. For this study, we employ a Bayesian network to construct a model trained towards a target system that can help predict the best parameters for classification of the novel coronavirus (COVID-19). Data was collected from the Federal Medical Center epidemiology laboratory (a centralized databank for all cases of COVID-19 in Delta State) and split into training and investigation (test) datasets for the target system. Results show high predictive capability.
34

Simões, Maria, Jorge Antunes, José Fernandes, and Nataliya Sakharova. "Numerical Simulation of the Depth-Sensing Indentation Test with Knoop Indenter." Metals 8, no. 11 (October 31, 2018): 885. http://dx.doi.org/10.3390/met8110885.

Abstract:
The depth-sensing indentation (DSI) technique allows easy and reliable determination of two mechanical properties of materials: hardness and Young's modulus. Most studies focus on the Vickers, Berkovich, and conical indenter geometries; in the case of the Knoop indenter, the existing experimental and numerical studies are scarce. The goal of the current study is to contribute to the understanding of the mechanical phenomena that occur in the material under Knoop indentation, enhancing and facilitating the analysis of results obtained in DSI tests. For this purpose, a finite element code, DD3IMP, was used to numerically simulate the Knoop indentation test. A finite element mesh was developed and optimized in order to attain accurate values of the mechanical properties. Also, careful modeling of the Knoop indenter was performed to take into account the geometry and size of the imperfection (offset) of the indenter tip, as in real cases.
35

Okotie, S., et A. E. Fasanya. « PARTICLE SWARM OPTIMIZATION OF SOLUTION GAS OIL RATIO ». Open Journal of Engineering Science (ISSN : 2734-2115) 1, no 1 (10 mars 2020) : 14–25. http://dx.doi.org/10.52417/ojes.v1i1.81.

Texte intégral
Résumé :
Reservoir fluid properties are very important in reservoir engineering computations such as material balance calculations, well test analysis, reserves estimation and numerical reservoir simulation. Ideally, these properties should be obtained from laboratory pressure-volume-temperature (PVT) analysis. Quite often, however, these measurements are either not available or very costly to obtain. In such cases, empirically derived correlations are used to estimate the needed properties, and all computations therefore depend on the accuracy of the correlations used for estimating the fluid properties. Hence, in this study, Standing’s correlation for estimating the solution gas-oil ratio was optimized using a particle swarm optimization (PSO) algorithm to minimize the error associated with estimating the solution gas-oil ratio from the correlation at various depletion pressures. The optimized correlation was taken as a function of bubble point pressure, API gravity, gas gravity and reservoir temperature. PVT data from a differential liberation test was used to validate this study’s correlation, and the results show that the optimized correlation matches the experimental values closely; validated against other models, it gave the least average relative error, 3.34, and a correlation coefficient of 0.998 after 216 successive iterations of the particle swarm optimization algorithm.
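
A minimal sketch of the approach described above, assuming Standing's usual correlation form Rs = γg·[(p/18.2 + 1.4)·10^(0.0125·API − 0.00091·T)]^1.2048 and retuning its five constants with a small PSO against measured PVT points. The data arrays and swarm settings are invented placeholders, not the paper's values.

    import numpy as np

    # Placeholder differential-liberation data (pressure in psia, Rs in scf/STB).
    p = np.array([500.0, 1000.0, 1500.0, 2000.0, 2500.0])
    rs_lab = np.array([90.0, 210.0, 350.0, 500.0, 660.0])
    API, T, gas_grav = 35.0, 180.0, 0.75          # deg API, deg F, gas specific gravity

    base = np.array([18.2, 1.4, 0.0125, 0.00091, 1.2048])   # Standing's original constants

    def standing_rs(c, p):
        # c holds the five tunable constants of the correlation.
        return gas_grav * ((p / c[0] + c[1]) * 10 ** (c[2] * API - c[3] * T)) ** c[4]

    def error(c):
        return np.mean(np.abs(standing_rs(c, p) - rs_lab) / rs_lab)

    rng = np.random.default_rng(0)
    n, w, c1, c2 = 30, 0.7, 1.5, 1.5              # swarm size and standard PSO weights
    x = base * rng.uniform(0.8, 1.2, (n, 5))      # particles start near the original constants
    v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.array([error(xi) for xi in x])
    gbest = pbest[pcost.argmin()]

    for _ in range(216):                          # iteration count taken from the abstract
        r1, r2 = rng.random((n, 5)), rng.random((n, 5))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.2 * base, 5 * base)  # keep all constants positive and sane
        cost = np.array([error(xi) for xi in x])
        better = cost < pcost
        pbest[better], pcost[better] = x[better], cost[better]
        gbest = pbest[pcost.argmin()]

    print("tuned constants:", gbest, "mean relative error:", error(gbest))
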
Styles APA, Harvard, Vancouver, ISO, etc.
36

Yang, Wei, Hong Ji Meng, Ya Juan Huang et Zhi Xie. « Prediction on Molten Steel End Temperature during Tapping in BOF Based on LS-SVM and PSO ». Advanced Materials Research 508 (avril 2012) : 233–36. http://dx.doi.org/10.4028/www.scientific.net/amr.508.233.

Texte intégral
Résumé :
A new molten steel end temperature prediction model is built employing LS-SVM. To seek the optimal values of the regularization parameter γ and kernel parameter σ in LS-SVM, a PSO algorithm is also proposed. To test the proposed predictor, the prediction model is applied to practical data collected from a 100 t BOF at Fujian Sangang steelmaking, and validation is carried out on the performance of the prediction. The model overcomes the blindness and time burden of the cross-validation method while inheriting LS-SVM's strong learning ability from small samples and its simple calculation. In the test cases of the LS-SVM optimized by PSO, the Maximum Absolute Error (MAE) and the Root Mean Square Error (RMSE) are the lowest, and the Pearson Relative Coefficient (PRC) is the highest. The results suggest that the PSO-optimized LS-SVM model can be extended to end-point judgment applications, achieving greater forecasting accuracy and quality.
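
LS-SVM reduces training to a single linear system, which is what makes a PSO search over (γ, σ) cheap. Below is a compact sketch of that core on synthetic data; the PSO loop around it and the real BOF features are omitted.

    import numpy as np

    def rbf(XA, XB, sigma):
        d2 = ((XA[:, None, :] - XB[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def lssvm_fit(X, y, gamma, sigma):
        # LS-SVM regression dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]                    # bias b, support values alpha

    def lssvm_predict(Xq, X, alpha, b, sigma):
        return rbf(Xq, X, sigma) @ alpha + b

    # Synthetic stand-in for heat-balance features vs. end temperature (deg C).
    rng = np.random.default_rng(1)
    X = rng.random((50, 3))
    y = X @ np.array([30.0, -12.0, 8.0]) + 1650 + rng.normal(0, 2, 50)
    b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=0.5)
    print(lssvm_predict(X[:3], X, alpha, b, 0.5))
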
Styles APA, Harvard, Vancouver, ISO, etc.
37

Munawar, Hafiz Suliman, Hina Inam, Fahim Ullah, Siddra Qayyum, Abbas Z. Kouzani et M. A. Parvez Mahmud. « Towards Smart Healthcare : UAV-Based Optimized Path Planning for Delivering COVID-19 Self-Testing Kits Using Cutting Edge Technologies ». Sustainability 13, no 18 (18 septembre 2021) : 10426. http://dx.doi.org/10.3390/su131810426.

Texte intégral
Résumé :
Coronavirus Disease 2019 (COVID-19) emerged as a global pandemic in late 2019 and has affected all forms of human life and economic development. Various techniques are used to collect samples from infected patients, which carries the risk of transferring the infection to others. The current study proposes an AI-powered, UAV-based sample collection procedure in which self-collection kits are delivered to potential patients and the samples are brought back for testing. Using a hypothetical case study of Islamabad, Pakistan, various test cases are run in which the UAVs' paths are optimized using four key algorithms, greedy, intra-route, inter-route, and tabu, to save time and reduce the carbon emissions associated with alternate transportation methods. Four cases with 30, 50, 100, and 500 patients are investigated for delivering the self-testing kits. The results show that the tabu algorithm provides the best-optimized paths, covering 31.85, 51.35, 85, and 349.15 km for the different numbers of patients. In addition, the algorithms optimize the number of UAVs used in each case, serving the patients in the studied cases with 5, 8, 14, and 71 UAVs, respectively. The current study provides a first step towards the practical handling of COVID-19 and other pandemics in developing countries, where the risk of spreading infection can be minimized by reducing person-to-person contact. Furthermore, the reduced carbon footprint of these UAVs is an added advantage for developing countries that struggle to control such emissions. The proposed system is equally applicable to developed and developing countries and can help reduce the spread of COVID-19 by minimizing person-to-person contact, thus supporting the transformation of healthcare into smart healthcare.
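
A toy illustration of two of the four heuristics named above, greedy route construction followed by intra-route (2-opt) improvement, assuming straight-line distances and invented depot and patient coordinates:

    import math, random

    random.seed(0)
    depot = (0.0, 0.0)
    patients = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(30)]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def tour_length(route):
        stops = [depot] + route + [depot]
        return sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))

    # Greedy construction: always fly to the nearest unserved patient.
    unserved, route, here = set(range(len(patients))), [], depot
    while unserved:
        nxt = min(unserved, key=lambda i: dist(here, patients[i]))
        unserved.remove(nxt)
        route.append(patients[nxt])
        here = patients[nxt]

    # Intra-route improvement: 2-opt reverses a segment whenever it shortens the tour.
    improved = True
    while improved:
        improved = False
        for i in range(len(route) - 1):
            for j in range(i + 2, len(route)):
                cand = route[:i] + route[i:j][::-1] + route[j:]
                if tour_length(cand) < tour_length(route):
                    route, improved = cand, True

    print("tour length:", round(tour_length(route), 2), "(toy distance units)")
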
Styles APA, Harvard, Vancouver, ISO, etc.
38

Rathee, Nisha, et Rajender Singh Chhillar. « Optimization of Favourable Test Path Sequences Using Bio-Inspired Natural River System Algorithm ». Journal of Information Technology Research 14, no 2 (avril 2021) : 85–105. http://dx.doi.org/10.4018/jitr.2021040105.

Texte intégral
Résumé :
Testing software requires a great amount of time and effort. The tester's main aim is to design optimized test sequences in a minimum amount of time and effort and with less redundancy. Testers have used artificial-intelligence meta-heuristic algorithms for the optimization of test sequences. A model-driven approach is helpful for generating test sequences as early as the design phase: it uses UML diagrams to represent the system's behavior and to design test cases at the design stage of the software development life cycle. The proposed approach mimics a natural river system to optimize favourable non-redundant test path sequences from UML activity diagrams and sequence diagrams. The proposed approach has been implemented in Python, and the results show that it provides full coverage of test paths with fewer redundant test nodes than other meta-heuristic algorithms.
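
The river-system heuristic itself is not detailed in the abstract. As a stand-in for the underlying task, the sketch below enumerates the paths of a small activity-diagram-like graph and keeps only paths that cover at least one new node; both the graph and the redundancy criterion are my illustrative assumptions.

    # Activity diagram as an adjacency list (hypothetical structure).
    graph = {"start": ["a"], "a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["end"], "end": []}

    def all_paths(node, path=None):
        path = (path or []) + [node]
        if node == "end":
            yield path
        for nxt in graph[node]:
            yield from all_paths(nxt, path)

    # Keep only paths that cover at least one node no earlier path covered.
    covered, selected = set(), []
    for p in sorted(all_paths("start"), key=len):
        if set(p) - covered:
            selected.append(p)
            covered |= set(p)

    for p in selected:
        print(" -> ".join(p))
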
Styles APA, Harvard, Vancouver, ISO, etc.
39

Hakamipour, Nooshin. « Time and cost constrained optimal designs of multiple step stress tests under progressive censoring ». International Journal of Quality & Reliability Management 36, no 10 (4 novembre 2019) : 1721–33. http://dx.doi.org/10.1108/ijqrm-09-2018-0239.

Texte intégral
Résumé :
Purpose The purpose of this paper is to consider the general k-level step-stress accelerated life test with the Rayleigh lifetime distribution for units subjected to stress under progressive Type-I censoring. Design/methodology/approach The parameter of this distribution is assumed to be a log-linear function of the stress, and a tampered failure rate model holds. Progressive Type-I censoring reduces the cost of testing. Due to constrained resources in practice, the test design must be optimized carefully. A numerical study is conducted to illustrate the optimum test design based on four optimality criteria under the constraint that the total experimental cost does not exceed a pre-specified budget. Findings This paper compares the unconstrained and constrained optimal k-level step-stress tests. Based on the results of the simulation study, the cost constraint reduces the cost and time of the test and, in most cases, also increases its efficiency. The T-optimal design has the lowest testing cost and time and is found to be more efficient under both conditions. Originality/value In this paper, various optimization criteria for selecting the stress durations are used and compared. Also, because the stress durations affect the experimental cost, the optimization is performed under the constraint that the total experimental cost does not exceed a pre-specified budget. The efficiency of the unconstrained test in comparison with the constrained test is discussed.
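
Schematically (my notation; the abstract gives only the ingredients), the lifetime at stress level s_i follows a Rayleigh law whose scale parameter is log-linear in the stress:

    \[ f(t;\theta_i) = \frac{t}{\theta_i^{2}} \exp\!\left(-\frac{t^{2}}{2\theta_i^{2}}\right), \qquad
       \log \theta_i = a + b\,s_i, \quad i = 1,\dots,k, \]

so the design problem is to choose the stress-change times that optimize a criterion (e.g., T-optimality) computed from the Fisher information of (a, b), subject to the budget constraint.
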
Styles APA, Harvard, Vancouver, ISO, etc.
40

Cao, Wei, Peng Zhang, Nana Dong, Anwen Hu, Jiwen Xiao, Dexin Zou, Shulin Xiang et Yanxia Qi. « Efficacy Evaluation of Zoledronic Acid Combined with Chemotherapy in the Treatment of Lung Cancer Spinal Metastases on Computed Tomography Images on Intelligent Algorithms ». Computational and Mathematical Methods in Medicine 2022 (6 mai 2022) : 1–11. http://dx.doi.org/10.1155/2022/6431852.

Texte intégral
Résumé :
To analyze the effectiveness and safety of zoledronic acid combined with chemotherapy for lung cancer spinal metastases, 96 patients with lung cancer spinal metastases were evenly divided into an experimental group (gemcitabine, cisplatin, and zoledronic acid) and a control group (gemcitabine and cisplatin). An optimized noise variance estimation algorithm (OMAPB) based on the maximum a posteriori Bayesian method (MAPB) was proposed and applied to the patients' computed tomography (CT) scans. The results indicated that, in terms of curative effect, the numbers of complete remission (CR) and partial remission (PR) cases, the effective rate, and the clinical benefit rate of the test group were significantly higher than those of the control group, and the number of progressive disease (PD) cases was significantly lower (P < 0.05). The disease progression time was 6.2 months for the test group and 3.7 months for the control group (P < 0.05). After treatment, the test group had 8 cases of bone marrow suppression, 9 cases of gastrointestinal reaction, 3 cases of fever, 4 cases of pain, and 2 cases of hair loss, whereas the control group had 14 cases of bone marrow suppression, 17 cases of gastrointestinal reaction, 5 cases of fever, 4 cases of pain, and 6 cases of hair loss; the difference was statistically significant (P < 0.05). This shows that zoledronic acid combined with chemotherapy can effectively improve the treatment efficiency and clinical benefit rate of patients with lung cancer spinal metastases, delay the progression of the disease, and reduce the degree of bone tissue damage without increasing chemotherapy adverse events.
Styles APA, Harvard, Vancouver, ISO, etc.
41

Furukawa, Daisuke, Brian Kim et Arthur Jeng. « 2002. BioFire® Filmarray® Pneumonia Panel : A Powerful Rapid Diagnostic Test for Antimicrobial Stewardship ». Open Forum Infectious Diseases 6, Supplement_2 (octobre 2019) : S671. http://dx.doi.org/10.1093/ofid/ofz360.1682.

Texte intégral
Résumé :
Abstract Background The BioFire® Filmarray® Pneumonia Panel (BFPP) is a multiplex PCR panel that identifies 33 common bacterial and viral pathogens seen in community- and hospital-acquired pneumonias. It rapidly identifies these pathogens, in addition to 7 antibiotic resistance genes, on sputum and bronchoalveolar lavage samples in 1 hour. As one of the test centers for this panel, our institution utilized it for clinical and laboratory use. We reviewed the impact of BFPP on antimicrobial stewardship, particularly its role in early discontinuation of empiric antibiotics and prompt initiation of optimized targeted therapy. Methods We retrospectively reviewed all cases for which BFPP was ordered. We reviewed the medical records of each case to identify the results of the panel, culture data, antibiotics used, and subsequent clinical interventions. Results 43 tests were ordered in total: 17 for clinical use by an infectious disease specialist and 26 obtained at random by the microbiology lab. All 17 clinical cases were intervened upon, with the following interventions: discontinuation of anti-pseudomonal antibiotics (8 cases), discontinuation of anti-MRSA antibiotics (5 cases), discontinuation of azithromycin (4 cases), discontinuation of a carbapenem (1 case), prevention of inappropriate antibiotic escalation or initiation of inappropriate antibiotics (2 cases), and early IV-to-PO transition (3 cases). Of the 26 random samples ordered by the lab, 13 presented opportunities for antibiotic de-escalation had a physician been notified of the results. Viruses were identified in 15 samples, with coronavirus being the most common; a virus was the sole pathogen in 9 of the 15 samples. Bacterial pathogens were identified in 20 samples that were reported as normal flora by conventional culture; none of these cases led to, or potentially could have led to, antibiotic escalation as the sole intervention. Conclusion Clinical use of BFPP had a 100% intervention rate, with all interventions leading to de-escalation of antibiotics or prevention of inappropriate antibiotic use. Though over-identification of colonizers is a potential limitation, BFPP is a powerful tool for antibiotic stewardship that enables rapid interventions to achieve optimal targeted therapy. Disclosures All authors: No reported disclosures.
Styles APA, Harvard, Vancouver, ISO, etc.
42

Naz, Komal, Fasiha Zainab, Khawaja Khalid Mehmood, Syed Basit Ali Bukhari, Hassan Abdullah Khalid et Chul-Hwan Kim. « An Optimized Framework for Energy Management of Multi-Microgrid Systems ». Energies 14, no 19 (22 septembre 2021) : 6012. http://dx.doi.org/10.3390/en14196012.

Texte intégral
Résumé :
To address challenges such as the integration of green energy and the autonomy of each microgrid (MG) in a multi-microgrid (MMG) system, this paper presents an optimized and coordinated strategy for the energy management of MMG systems that considers multiple MG scenarios. The proposed strategy operates at two optimization levels: local and global. At the MG level, each energy management system satisfies its local demand by utilizing all available resources via local optimization and only sends surplus/deficit energy data signals to the MMG level, which enhances customer privacy. Thereafter, at the MMG level, a central energy management system performs global optimization and selects the best options from the available resources, which include charging/discharging energy to/from the community battery energy storage system, selling/buying power to/from other MGs, and trading with the grid. Two types of loads are considered in this model: sensitive and non-sensitive. The algorithm aims to make the system reliable by avoiding excessive load curtailment and prefers to shed non-sensitive loads over sensitive loads when load shedding is required. To verify the robustness of the proposed scheme, several test cases are generated by Monte Carlo simulation and simulated on the IEEE 33-bus distribution system. The results show the effectiveness of the proposed model.
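
A much-simplified sketch of the two-level idea: each MG publishes only its surplus or deficit, and a naive matching rule stands in for the paper's global optimizer (the MG figures are invented, and battery storage and load shedding are omitted).

    # Local level: each MG reports only its surplus (+) or deficit (-) in kW.
    def local_balance(generation, demand):
        return generation - demand          # internal details stay private to the MG

    mgs = {"MG1": local_balance(120, 90),   # +30 kW surplus
           "MG2": local_balance(60, 100),   # -40 kW deficit
           "MG3": local_balance(80, 75)}    # +5 kW surplus

    # Global level: match surpluses to deficits; the main grid covers any remainder.
    surplus = {k: v for k, v in mgs.items() if v > 0}
    deficits = {k: -v for k, v in mgs.items() if v < 0}
    for buyer, need in deficits.items():
        for seller in list(surplus):
            traded = min(need, surplus[seller])
            if traded > 0:
                print(f"{seller} -> {buyer}: {traded} kW")
                surplus[seller] -= traded
                need -= traded
        if need > 0:
            print(f"grid -> {buyer}: {need} kW")
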
Styles APA, Harvard, Vancouver, ISO, etc.
43

Dong, Sihui, Jinxiao Lv, Kang Wang, Wanjing Li et Yining Tian. « Design and Optimization for a New Locomotive Power Battery Box ». Sustainability 14, no 19 (7 octobre 2022) : 12810. http://dx.doi.org/10.3390/su141912810.

Texte intégral
Résumé :
To address the disadvantages of the low protection grade, high weight, and high cost of existing locomotive power battery systems, this study optimizes the existing scheme and introduces a design concept of two-stage protection. The purpose of the research is to improve the protection level of the battery pack to IP68, to optimize the sheet-metal power battery box structure into a more lightweight frame structure, to simplify the cooling mode of the battery pack to natural air cooling, and to improve the battery protection level while maintaining heat exchange capability. In the course of the study, a design scheme with a two-stage protection function is proposed. The numerical model analyzes the self-weight load, transverse load, longitudinal load, modes, and fatigue, and the cell layout of the power battery box is optimized. The optimized box model was physically tested and compared economically. The results show that: (1) The maximum load stress is 128.4 MPa, which is lower than 235 MPa, the ultimate stress of the box material, and the fatigue factor of the frame box structure is 3.75, which is higher than 1.0, so it is not prone to fatigue damage. (2) Under the low-temperature heating condition, the overall temperature rise of the battery pack is 4.3 °C, which is greater than the 2.3 °C under the air-conditioning heat dissipation scheme. Under the high-temperature charging condition, the overall temperature rise of the battery pack is 2.0 °C, the same as the temperature rise under the air-conditioning cooling scheme. Under the high-temperature discharge condition, the overall temperature rise of the battery pack is 3.0 °C, which is greater than the 2.1 °C under the air-conditioning heat dissipation scheme. At the same time, the temperature rise under all three working conditions is less than the 15 °C stipulated in the JS175-201805 standard. The simulation results show that natural airflow and the two-stage protection structure can provide a good temperature environment for the power battery to work. (3) The optimized box prototype can effectively maintain the structural integrity of the battery cells in the box in extreme test cases, reducing the probability of battery fire caused by cell deformation. (4) The power battery adopts a two-stage protection design at the battery pack level, which can simultaneously achieve battery protection and prevent thermal runaway while reducing costs. The research results provide a new concept for the design of locomotive power battery systems. (5) The weight of the optimized scheme is 2020 kg versus 2470 kg for the original scheme, a reduction of 450 kg; meanwhile, the volume of the optimized scheme is 1.49 m³ versus 1.93 m³, a reduction of 0.44 m³.
Styles APA, Harvard, Vancouver, ISO, etc.
44

Amattouch, M. R., H. Belhadj et N. Nagid. « New optimized domain decomposition order 4 method (OO4) applied to reaction advection diffusion equation ». Journal of Modern Methods in Numerical Mathematics 9, no 1-2 (28 mars 2018) : 28–41. http://dx.doi.org/10.20454/jmmnm.2018.1296.

Texte intégral
Résumé :
The purpose of this work is the study of a new approach to the domain decomposition method, the optimized order 4 (OO4) method, to solve a reaction-advection-diffusion equation. This method is a Schwarz waveform relaxation approach extending the known OO2 idea. The OO4 method is a reformulation of the Schwarz algorithm with specific conditions at the interface. These conditions are a differential equation of order 1 in the normal direction and of order 4 in the tangential direction to the interface, resulting from artificial boundary conditions. The obtained scheme is solved by a Krylov-type algorithm. The main result of this paper is that the proposed OO4 algorithm is more robust and faster than the classical OO2 method. To confirm the performance of the method, several numerical test cases are given.
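
Schematically (my notation; the abstract states only the derivative orders), the optimized transmission condition imposed on the interface Γ has the form

    \[ \left( \partial_n + \alpha + \beta\,\partial_\tau^{2} + \gamma\,\partial_\tau^{4} \right) u_1
       = \left( \partial_n + \alpha + \beta\,\partial_\tau^{2} + \gamma\,\partial_\tau^{4} \right) u_2
       \quad \text{on } \Gamma, \]

where ∂_n and ∂_τ are the normal and tangential derivatives and the coefficients α, β, γ are chosen to minimize the convergence rate of the Schwarz iteration; the OO2 variant keeps only the terms up to ∂_τ².
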
Styles APA, Harvard, Vancouver, ISO, etc.
45

Malaby, Andrew W., Srinivas Chakravarthy, Thomas C. Irving, Sagar V. Kathuria, Osman Bilsel et David G. Lambright. « Methods for analysis of size-exclusion chromatography–small-angle X-ray scattering and reconstruction of protein scattering ». Journal of Applied Crystallography 48, no 4 (8 juillet 2015) : 1102–13. http://dx.doi.org/10.1107/s1600576715010420.

Texte intégral
Résumé :
Size-exclusion chromatography in line with small-angle X-ray scattering (SEC–SAXS) has emerged as an important method for investigation of heterogeneous and self-associating systems, but presents specific challenges for data processing including buffer subtraction and analysis of overlapping peaks. This paper presents novel methods based on singular value decomposition (SVD) and Guinier-optimized linear combination (LC) to facilitate analysis of SEC–SAXS data sets and high-quality reconstruction of protein scattering directly from peak regions. It is shown that Guinier-optimized buffer subtraction can reduce common subtraction artifacts and that Guinier-optimized linear combination of significant SVD basis components improves signal-to-noise and allows reconstruction of protein scattering, even in the absence of matching buffer regions. In test cases with conventional SAXS data sets for cytochrome c and SEC–SAXS data sets for the small GTPase Arf6 and the Arf GTPase exchange factors Grp1 and cytohesin-1, SVD–LC consistently provided higher quality reconstruction of protein scattering than either direct or Guinier-optimized buffer subtraction. These methods have been implemented in the context of a Python-extensible Mac OS X application known as Data Evaluation and Likelihood Analysis (DELA), which provides convenient tools for data-set selection, beam intensity normalization, SVD, and other relevant processing and analytical procedures, as well as automated Python scripts for common SAXS analyses and Guinier-optimized reconstruction of protein scattering.
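
A bare-bones sketch of the SVD step on a synthetic SEC-SAXS frame matrix, with a low-q Guinier-linearity score standing in for the paper's Guinier-optimized linear combination; DELA's actual interface and algorithms are not reproduced here.

    import numpy as np

    # Synthetic SEC-SAXS data: rows = q bins, columns = elution frames.
    rng = np.random.default_rng(2)
    q = np.linspace(0.01, 0.25, 200)
    protein = np.exp(-(q * 28.0) ** 2 / 3)               # Guinier-like profile, Rg ~ 28 A
    buffer_ = 0.05 + 0.02 * q                            # slowly varying background
    elution = np.exp(-0.5 * ((np.arange(60) - 30) / 6) ** 2)
    frames = np.outer(protein, elution) + buffer_[:, None] + rng.normal(0, 1e-3, (200, 60))

    # SVD: the significant left singular vectors span the scattering components.
    U, s, Vt = np.linalg.svd(frames, full_matrices=False)
    basis = U[:, :2] * np.sign(U[:, :2].sum(axis=0))     # 2 components, signs fixed

    def guinier_residual(c):
        # Deviation of ln I vs q^2 from a straight line in the low-q (Guinier) region.
        I = basis @ c
        lo = q < 0.05
        if np.any(I[lo] <= 0):
            return np.inf
        coef = np.polyfit(q[lo] ** 2, np.log(I[lo]), 1)
        return np.sum((np.log(I[lo]) - np.polyval(coef, q[lo] ** 2)) ** 2)

    # Crude scan over combinations of the two basis vectors.
    best = min((np.array([1.0, t]) for t in np.linspace(-2, 2, 401)), key=guinier_residual)
    print("best linear combination:", best)
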
Styles APA, Harvard, Vancouver, ISO, etc.
46

Eleshaky, Mohamed E., et Oktay Baysal. « Airfoil Shape Optimization Using Sensitivity Analysis on Viscous Flow Equations ». Journal of Fluids Engineering 115, no 1 (1 mars 1993) : 75–84. http://dx.doi.org/10.1115/1.2910117.

Texte intégral
Résumé :
An aerodynamic shape optimization method has previously been developed by the authors using the Euler equations and has been applied to supersonic-hypersonic nozzle designs. This method also includes a flowfield extrapolation (or flow prediction) method based on the Taylor series expansion of an existing CFD solution. The present paper reports on the extension of this method to the thin-layer Navier-Stokes equations in order to account for viscous effects. Also, to test the method under highly nonlinear conditions, it is applied to transonic flows. Initially, the success of the flow prediction method is tested. Then, the overall method is demonstrated by optimizing the shapes of two supercritical transonic airfoils at zero angle of attack. The first one is shape-optimized to achieve minimum drag while obtaining a lift above a specified value, whereas the second one is shape-optimized for maximum lift while keeping drag below a specified value. The results of these two cases indicate that the present method can successfully produce optimized aerodynamic shapes.
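
The flow prediction step mentioned above is a first-order Taylor expansion of the flow state Q about the current design variables D (notation mine):

    \[ Q(D + \Delta D) \approx Q(D) + \sum_{j} \frac{\partial Q}{\partial D_j}\, \Delta D_j, \]

where the sensitivity derivatives ∂Q/∂D_j are obtained from the discretized flow equations, so candidate shapes can be screened without a full CFD re-solve.
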
Styles APA, Harvard, Vancouver, ISO, etc.
47

Rani, Shweta, Bharti Suri et Rinkaj Goyal. « On the Effectiveness of Using Elitist Genetic Algorithm in Mutation Testing ». Symmetry 11, no 9 (9 septembre 2019) : 1145. http://dx.doi.org/10.3390/sym11091145.

Texte intégral
Résumé :
Manual test case generation is an exhaustive and time-consuming process. However, automated test data generation may reduce the effort and assist in creating an adequate test suite embracing predefined goals. The quality of a test suite depends on its fault-finding behavior. Mutants have been widely accepted for simulating artificial faults that behave similarly to realistic ones for test data generation. In prior studies, the use of search-based techniques has been extensively reported to enhance the quality of test suites. Symmetry, however, can have a detrimental impact on the dynamics of a search-based algorithm, whose performance strongly depends on breaking the “symmetry” of the search space by the evolving population. This study implements an elitist Genetic Algorithm (GA) with an improved fitness function to expose maximum faults while also minimizing the cost of testing by generating less complex and asymmetric test cases. It uses a selective mutation strategy to create low-cost artificial faults that result in a smaller number of redundant and equivalent mutants. During evolution, reproduction operator selection is repeatedly guided by the traces of test execution and mutant detection, which decide whether to diversify or intensify the previous population of test cases. An iterative elimination of redundant test cases further minimizes the size of the test suite. This study uses 14 Java programs of significant sizes to validate the efficacy of the proposed approach in comparison to Initial Random tests and a widely used evolutionary framework in academia, namely Evosuite. Empirically, our approach is found to be more stable, with a significant improvement in the test case efficiency of the optimized test suite.
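
A minimal sketch of an elitist GA loop for test suite evolution. The mutant-killing oracle and the fitness (mutants killed minus a size penalty) are toy stand-ins for the paper's improved fitness function and execution traces.

    import random

    random.seed(3)
    N_MUTANTS, POP, GENS, ELITE = 40, 20, 50, 2

    def kills(test):
        # Toy oracle: each test input "kills" a deterministic subset of mutant ids.
        return {m for m in range(N_MUTANTS) if (test * (m + 7)) % 11 < 3}

    def fitness(suite):
        killed = set().union(*(kills(t) for t in suite))
        return len(killed) - 0.1 * len(suite)    # reward kills, penalize suite size

    def mutate(suite):
        s = [t + random.randint(-5, 5) for t in suite]
        if random.random() < 0.3:
            s.append(random.randint(0, 1000))    # occasionally diversify with a new test
        return s

    pop = [[random.randint(0, 1000) for _ in range(5)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:ELITE]                      # elitism: best suites survive unchanged
        children = [mutate(random.choice(pop[:POP // 2])) for _ in range(POP - ELITE)]
        pop = elite + children

    best = max(pop, key=fitness)
    print("mutants killed:", len(set().union(*(kills(t) for t in best))), "of", N_MUTANTS)
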
Styles APA, Harvard, Vancouver, ISO, etc.
48

Nguyen, Hung Thanh. « VALUATION OF WATER AND SCARCE WATER RESOURCES OPTIMAL ALLOCATION AT THE RIVER BASIN LEVEL A CASE STUDY IN DOWNSTREAM AREA OF THE DONG NAI RIVER SYSTEM BASIN ». Science and Technology Development Journal 15, no 4 (30 décembre 2012) : 87–101. http://dx.doi.org/10.32508/stdj.v15i4.1826.

Texte intégral
Résumé :
Water scarcity is an ongoing reality in many river basins due to increasing water use combined with water pollution and climate change. Faced with this situation, it is necessary to know the true value of scarce water resources in order to contribute to effective water allocation. In this paper, a model to optimize the allocation of water resources under hydrological constraints is developed based on the principle of equalizing the marginal net benefits of water use across sectors, and it is applied on a test basis to the optimal allocation of water sources in the downstream area of the Dong Nai river system basin under several water scarcity scenarios. The results show that the model can simulate reasonably good optimized water allocations for competing water use demands in cases of water shortage, and can also determine the equilibrium marginal net values/benefits of raw water corresponding to different levels of water shortage.
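
The equimarginal principle invoked above can be stated (notation mine) as:

    \[ \max_{w_1,\dots,w_n} \sum_{i=1}^{n} NB_i(w_i)
       \quad \text{s.t.} \quad \sum_{i=1}^{n} w_i \le W
       \qquad \Longrightarrow \qquad
       \frac{\partial NB_i}{\partial w_i} = \lambda \quad \text{for all } i, \]

where w_i is the water allocated to use sector i, NB_i its net benefit function, W the available supply, and the common multiplier λ is the scarcity value (shadow price) of raw water.
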
Styles APA, Harvard, Vancouver, ISO, etc.
49

Reddy T, Chandrasekhara, Srivani V, A. Mallikarjuna Reddy et G. Vishnu Murthy. « Test Case Optimization and Prioritization Using Improved Cuckoo Search and Particle Swarm Optimization Algorithm ». International Journal of Engineering & Technology 7, no 4.6 (25 septembre 2018) : 275. http://dx.doi.org/10.14419/ijet.v7i4.6.20489.

Texte intégral
Résumé :
For minimized t-way test suite generation (where t indicates the interaction strength), many meta-heuristic, hybrid, and hyper-heuristic algorithms have recently been proposed, including Artificial Bee Colony (ABC), Ant Colony Optimization (ACO), Genetic Algorithms (GA), Simulated Annealing (SA), Cuckoo Search (CS), Harmony Elements Algorithm (HE), Exponential Monte Carlo with counter (EMCQ), Particle Swarm Optimization (PSO), and Choice Function (CF). Although useful, these strategies require specific domain knowledge to allow effective tuning before good quality solutions can be obtained. In our proposed technique, test cases are optimized using an Improved Cuckoo Search Algorithm (ICSA), and the optimized test cases are then prioritized using the Particle Swarm Optimization (PSO) algorithm. The combined Particle Swarm Optimization and Improved Cuckoo Search Algorithm (PSOICSA) blends ICSA and PSO, and coordinating the two yields a better outcome in terms of test case improvement than either algorithm achieves individually.
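
To make the prioritization half concrete: orderings of test cases are commonly scored with APFD (Average Percentage of Faults Detected). The sketch below evaluates APFD over a hypothetical fault matrix, searching exhaustively where PSOICSA would search with PSO.

    from itertools import permutations

    # fault_matrix[t][f] is True if test t detects fault f (hypothetical data).
    fault_matrix = [
        [True,  False, False, True ],    # test 0
        [False, True,  False, False],    # test 1
        [True,  True,  True,  False],    # test 2
    ]
    n_tests, n_faults = len(fault_matrix), len(fault_matrix[0])

    def apfd(order):
        # TF_f = 1-based position of the first test in `order` that detects fault f.
        tf = [next(i + 1 for i, t in enumerate(order) if fault_matrix[t][f])
              for f in range(n_faults)]
        return 1 - sum(tf) / (n_tests * n_faults) + 1 / (2 * n_tests)

    best = max(permutations(range(n_tests)), key=apfd)
    print("best order:", best, "APFD:", round(apfd(best), 3))
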
Styles APA, Harvard, Vancouver, ISO, etc.
50

Gupta, Varun, Durg Singh Chauhan et Kamlesh Dutta. « Hybrid Regression Testing Based on Path Pruning ». International Journal of Systems and Service-Oriented Engineering 5, no 1 (janvier 2015) : 35–55. http://dx.doi.org/10.4018/ijssoe.2015010103.

Texte intégral
Résumé :
Regression testing has been studied by various researchers as a means of developing and assuring the quality of software. It aims at re-execution of evolved software code to ensure that no new errors have been introduced during the process of modification. Since re-execution of all test cases is not feasible, selecting a manageable number of test cases that exercise the modified code with a good fault detection rate is a problem. In the past few years, various hybrid regression testing approaches, based on sequences of selection, prioritization, and minimization of the test suite, have been proposed and successfully employed, aiming at a reduction in the number of test cases and higher fault detection capability. However, these techniques suffer from major drawbacks such as improper consideration of control dependencies and the neglect of unaffected fragments of code for testing purposes. Further, these techniques have been employed on hypothetical or simple programs with small test suites. The present paper proposes hybrid regression testing, a combination of test case selection, test case prioritization, and test suite minimization. The technique works at the statement level and is based on finding the paths containing statements that affect or are affected by the addition, deletion, or modification (both control and data dependency) of variables in statements. A modification in the code may cause a ripple effect, resulting in faulty execution of the code. The hybrid regression testing approach aims at detecting such faults with a smaller number of test cases; the reduction in the number of test cases is possible because of the decreased number of paths to be tested. A web-based framework to automate and parallelize this testing technique to the maximum extent, making it well suited for globally distributed environments, is also proposed in the present paper. The framework, when implemented as a tool, can handle a large pool of test cases and make use of parallel MIMD architectures such as multicore systems. The technique is applied to a prototype live system and the results are compared with a recently proposed hybrid regression testing approach against the parameters of interest. The optimized results obtained indicate the effectiveness of the approach in terms of reduction in effort, cost, and testing time in general, and increment delivery time in particular.
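
An illustrative pruning step in the spirit of the approach: keep only the test cases whose paths touch a statement affected by the change. The paths, the changed statements, and the dependency map are all invented, and transitive dependency closure is reduced to a single pass for brevity.

    # Control-flow paths exercised by each test case, as statement ids (hypothetical).
    test_paths = {
        "T1": [1, 2, 3, 7],
        "T2": [1, 2, 4, 5, 7],
        "T3": [1, 6, 7],
    }
    changed = {4}                                # statements modified in this revision
    depends_on = {3: set(), 5: {4}, 7: set()}    # data/control dependencies (partial)

    # A statement is affected if it was changed or depends on a changed statement.
    affected = set(changed)
    for stmt, deps in depends_on.items():
        if deps & affected:
            affected.add(stmt)                   # single pass; full closure would iterate

    # Prune: only paths through an affected statement need re-execution.
    selected = [t for t, path in test_paths.items() if set(path) & affected]
    print("re-run:", selected)                   # -> ['T2']
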
Styles APA, Harvard, Vancouver, ISO, etc.