Journal articles on the topic 'OPTIMAL TEST SUITES'

Consult the top 50 journal articles for your research on the topic 'OPTIMAL TEST SUITES.'

You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Gladston, Angelin, and Niranjana Devi N. "Optimal Test Case Selection Using Ant Colony and Rough Sets." International Journal of Applied Evolutionary Computation 11, no. 2 (April 2020): 1–14. http://dx.doi.org/10.4018/ijaec.2020040101.

Abstract:
Test case selection helps improve the quality of test suites by removing ambiguous and redundant test cases, thereby reducing the cost of software testing. Most prior work has selected test cases based on a single parameter and optimized them for a single objective using a single strategy. In this article, a parameter selection technique is combined with an optimization technique to optimize the selection of test cases. A two-step approach is employed. In the first step, fuzzy entropy-based filtration is used for test case fitness evaluation and selection. In the second step, an improved ant colony optimization is employed to select test cases from the previously reduced test suite. The experimental evaluation, using the coverage parameters of average percentage statement coverage and average percentage decision coverage along with suite size reduction, demonstrates that the proposed approach reduces test suite size and, in turn, the computational effort incurred.
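A compact way to picture the two-step scheme: filter tests by fuzzy entropy, then search the reduced suite for a small subset with maximal coverage. The sketch below is illustrative only; it substitutes a greedy pass for the authors' ant colony stage, and the coverage sets, fitness scores, and 0.5 threshold are invented.

```python
# Illustrative two-step selection: fuzzy-entropy filter, then a greedy
# stand-in for the ACO search. All inputs are assumed/invented.
import math

def fuzzy_entropy(p):
    """Entropy of a fuzzy membership value p in [0, 1]."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def select_tests(coverage, fitness, threshold=0.5):
    """coverage: {test: set of statements}; fitness: {test: [0, 1]}."""
    # Step 1: keep tests whose fitness is favourable and unambiguous
    # (low fuzzy entropy), mimicking the entropy-based filtration.
    kept = {t: s for t, s in coverage.items()
            if fitness[t] > 0.5 and fuzzy_entropy(fitness[t]) <= threshold}
    # Step 2: greedily pick the test adding the most uncovered statements.
    covered, selected = set(), []
    while kept:
        best = max(kept, key=lambda t: len(kept[t] - covered))
        if not (kept[best] - covered):
            break                      # no remaining test adds coverage
        selected.append(best)
        covered |= kept.pop(best)
    return selected
```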
2

Hasan, Imad H., Bestoun S. Ahmed, Moayad Y. Potrus, and Kamal Z. Zamli. "Generation and Application of Constrained Interaction Test Suites Using Base Forbidden Tuples with a Mixed Neighborhood Tabu Search." International Journal of Software Engineering and Knowledge Engineering 30, no. 03 (March 2020): 363–98. http://dx.doi.org/10.1142/s0218194020500151.

Abstract:
To ensure the quality of current highly configurable software systems, intensive testing is needed to test all the configuration combinations and detect all the possible faults. This task becomes more challenging for most modern software systems when constraints are given for the configurations. Here, intensive testing is almost impossible, especially considering the additional computation required to resolve the constraints during the test generation process. In addition, this testing process is exhaustive and time-consuming. Combinatorial interaction strategies can systematically reduce the number of test cases to construct a minimal test suite without affecting the effectiveness of the tests. This paper presents a new efficient search-based strategy to generate constrained interaction test suites to cover all possible combinations. The paper also shows a new application of constrained interaction testing in software fault searches. The proposed strategy initially generates the set of all possible t-tuple combinations; then, it filters the set by removing the forbidden t-tuples using the Base Forbidden Tuple (BFT) approach. The strategy also utilizes a mixed neighborhood tabu search (TS) to construct optimal or near-optimal constrained test suites. The efficiency of the proposed method is evaluated through a comparison against two well-known state-of-the-art tools. The evaluation consists of three sets of experiments for 35 standard benchmarks. Additionally, the effectiveness and quality of the results are assessed using a real-world case study. Experimental results show that the proposed strategy outperforms one of the competitive strategies, ACTS, for approximately 83% of the benchmarks and achieves similar results to CASA for 65% of the benchmarks when the interaction strength is 2. For an interaction strength of 3, the proposed method outperforms other competitive strategies for approximately 60% and 42% of the benchmarks. The proposed strategy can also generate constrained interaction test suites for an interaction strength of 4, which is not possible for many strategies. The real-world case study shows that the generated test suites can effectively detect injected faults using mutation testing.
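As a rough illustration of the BFT-style filtering step (assumed data layout, not the authors' implementation), the sketch enumerates all t-way value combinations and discards any containing a forbidden tuple; the parameter domains and constraint are invented.

```python
# Enumerate t-way combinations and drop those hitting a forbidden tuple.
from itertools import combinations, product

def t_way_tuples(domains, t):
    """Yield all t-way combinations ((param, value), ...) to cover."""
    for params in combinations(sorted(domains), t):
        for values in product(*(domains[p] for p in params)):
            yield tuple(zip(params, values))

def filter_forbidden(tuples_, forbidden):
    """Keep only combinations containing no forbidden sub-tuple."""
    return [tw for tw in tuples_
            if not any(f <= set(tw) for f in forbidden)]

domains = {"os": ["linux", "win"], "db": ["pg", "mysql"], "net": ["v4", "v6"]}
forbidden = [{("os", "win"), ("db", "pg")}]     # assumed constraint
targets = filter_forbidden(t_way_tuples(domains, 2), forbidden)
```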
3

Sugave, Shounak Rushikesh, Yogesh R. Kulkarni, and Balaso. "Multi-Objective Optimization Model and Hierarchical Attention Networks for Mutation Testing." International Journal of Swarm Intelligence Research 14, no. 1 (March 9, 2023): 1–23. http://dx.doi.org/10.4018/ijsir.319714.

Abstract:
Mutation testing is devised to measure test suite adequacy by identifying artificially induced faults in software. This paper presents a novel approach based on multi-objective optimization. Here, optimal test suite generation is performed using the proposed water cycle water wave optimization (WCWWO). The best test suites are generated by satisfying multi-objective factors such as execution time, test suite size, mutant score, and mutant reduction rate. WCWWO is devised as a combination of the water cycle algorithm (WCA) and water wave optimization (WWO). A hierarchical attention network (HAN) is used to classify equivalent mutants, utilizing the MutPy tool. Furthermore, the performance of the developed WCWWO+HAN is evaluated in terms of three metrics: mutant score (MS), mutant reduction rate (MRR), and fitness, achieving a maximal MS of 0.585, an MRR of 0.397, and a maximal fitness of 0.652.
4

Diniz, Thomaz, Everton L. G. Alves, Anderson G. F. Silva, and Wilkerson L. Andrade. "Reducing the Discard of MBT Test Cases." Journal of Software Engineering Research and Development 8 (August 15, 2020): 4–1. http://dx.doi.org/10.5753/jserd.2020.602.

Abstract:
Model-Based Testing (MBT) is used to generate test suites from system models. However, as software evolves, its models tend to be updated, which may lead to obsolete test cases that are often discarded. Discarding test cases can be very costly, since essential data, such as execution history, are lost. In this paper, we investigate the use of distance functions and machine learning to help reduce the discard of MBT tests. First, we assess the problem of managing MBT suites in the context of agile industrial projects. Then, we propose two strategies to cope with this problem: (i) a purely distance-function-based strategy. An empirical study using industrial data and ten different distance functions showed that distance functions can be effective for identifying low-impact edits that lead to test cases that can be updated with little effort. We also found the optimal configuration for each function. Moreover, we showed that, by using this strategy, one can reduce the discard of test cases by 9.53%; (ii) a strategy that combines machine learning with distance values. This strategy can classify the impact of edits in use case documents with accuracy above 80%; it was able to reduce the discard of test cases by 10.4% and to identify test cases that should, in fact, be discarded.
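To make the distance-function strategy concrete, here is a minimal sketch that scores a use-case step edit with a normalized Levenshtein distance and flags low-impact edits; the 0.2 threshold is an illustrative placeholder, not the paper's calibrated configuration.

```python
# Normalized edit distance between an old and a new use-case step.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def edit_impact(old_step: str, new_step: str) -> float:
    """0.0 = identical, 1.0 = completely rewritten."""
    d = levenshtein(old_step, new_step)
    return d / max(len(old_step), len(new_step), 1)

def classify(old_step, new_step, threshold=0.2):
    """Low-impact edits suggest the test can be updated, not discarded."""
    return "update" if edit_impact(old_step, new_step) <= threshold else "review"
```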
5

Fadhil, Heba Mohammed, Mohammed Najm Abdullah, and Mohammed Issam Younis. "TWGH: A Tripartite Whale–Gray Wolf–Harmony Algorithm to Minimize Combinatorial Test Suite Problem." Electronics 11, no. 18 (September 12, 2022): 2885. http://dx.doi.org/10.3390/electronics11182885.

Abstract:
Solving real-world combinatorial problems remains a major hurdle for today's researchers. It is nevertheless possible to use optimization techniques to find, design, and solve a genuinely near-optimal solution to a particular problem, despite the limitations of the applied approach. A surge of interest in population-based optimization methodologies has spawned a plethora of new and improved approaches to a wide range of engineering problems. Optimizing test suites is a combinatorial testing challenge that has been demonstrated to be an extremely difficult combinatorial optimization problem. The authors propose a highly reliable method for selecting combinatorial test cases. It uses a hybrid whale–gray wolf optimization algorithm in conjunction with harmony search techniques. Test suite size was significantly reduced using the proposed approach, as shown by the analysis of the results. In order to assess the quality, speed, and scalability of TWGH, experiments were carried out on a set of well-known benchmarks. Tests showed that the proposed strategy achieves strong test suite size reduction and could be used to improve performance. Compared with well-known optimization-based strategies, TWGH gives competitive results and supports high interaction strengths (2 ≤ t ≤ 12).
6

He, Hongmei, Ana Sălăgean, Erkki Mäkinen, and Imrich Vrt’o. "Various heuristic algorithms to minimise the two-page crossing numbers of graphs." Open Computer Science 5, no. 1 (August 13, 2015): 22–40. http://dx.doi.org/10.1515/comp-2015-0004.

Abstract:
We propose several new heuristics for the two-page book crossing problem, which are based on recent algorithms for the corresponding one-page problem. In particular, the neural network model for edge allocation is combined for the first time with various one-page algorithms. We investigate the performance of the new heuristics by testing them on various benchmark test suites. We find that the new heuristics outperform the previously known heuristics and produce good approximations of the planar crossing number for several well-known graph families. We conjecture that the optimal two-page drawing of a graph represents the planar drawing of the graph.
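For intuition, a two-page drawing fixes the vertex order along a spine and assigns each edge to one of two half-planes; two same-page edges cross exactly when their endpoints interleave. A small counting sketch under these conventions (not one of the paper's heuristics):

```python
# Count crossings of a two-page book drawing.
from itertools import combinations

def crossings(order, edges, page_of):
    """order: spine order of vertices; page_of: {edge: 0 or 1}."""
    pos = {v: i for i, v in enumerate(order)}
    norm = lambda e: tuple(sorted((pos[e[0]], pos[e[1]])))
    total = 0
    for e, f in combinations(edges, 2):
        if page_of[e] != page_of[f]:
            continue                       # different pages never cross
        (a, b), (c, d) = sorted((norm(e), norm(f)))
        total += a < c < b < d             # endpoints interleave
    return total

order = ["a", "b", "c", "d"]
edges = [("a", "c"), ("b", "d")]
pages = {("a", "c"): 0, ("b", "d"): 0}
print(crossings(order, edges, pages))      # 1: the two arcs interleave
```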
7

De Groff, Dolores, Roxana Melendez, Perambur Neelakanta, and Hajar Akif. "Optimal Electric-Power Distribution and Load-Sharing on Smart-Grids: Analysis by Artificial Neural Network." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 18 (January 24, 2019): 7431–39. http://dx.doi.org/10.24297/ijct.v18i0.8059.

Abstract:
This study develops an electric-power distribution system with optimal/suboptimal load-sharing in the complex and expanding metro power-grid infrastructure. That is, the relevant exercise is to indicate a smart forecasting strategy for optimal/suboptimal power distribution to consumers served by a smart-grid utility. An artificial neural network (ANN) is employed to model the optimal power distribution between generating sources and distribution centers. A compatible architecture of the test ANN, with ad hoc suites of training/prediction schedules, is indicated. The pertinent exercise is to smartly determine the power carried on each transmission line between generating and distribution nodes. Further, a “smart” decision protocol prescribes the constraint that no transmission line carries in excess of a desired load. An algorithm is developed to implement the prescribed constraint via the test ANN, and the value of the load shared by each distribution line (meeting the power demand of the consumers) is elucidated from the ANN output. The test ANN uses a traditional multilayer architecture with feed-forward and backpropagation techniques, and a fast-convergence algorithm (deduced in terms of the eigenvalues of a Hessian matrix associated with the input data) is adopted. Further, a novel method based on information-theoretic heuristics (in Shannon’s sense) is invoked for model specification. Lastly, the study results are discussed with exemplified computations using appropriate field data.
8

Wang, Zhendong, Jianlan Wang, Dahai Li, and Donglin Zhu. "A Multi-Strategy Sparrow Search Algorithm with Selective Ensemble." Electronics 12, no. 11 (June 1, 2023): 2505. http://dx.doi.org/10.3390/electronics12112505.

Abstract:
Aiming at the deficiencies of the sparrow search algorithm (SSA), such as being easily trapped in local optima and limited optimization accuracy, a multi-strategy sparrow search algorithm with selective ensemble (MSESSA) is proposed. Firstly, three novel strategies in the strategy pool are proposed: variable logarithmic spiral saltation learning enhances global search capability, neighborhood-guided learning accelerates local search convergence, and adaptive Gaussian random walk coordinates exploration and exploitation. Secondly, the idea of selective ensemble is adopted to select an appropriate strategy at the current stage with the aid of the priority roulette selection method. In addition, a modified boundary processing mechanism adjusts the locations of transgressing sparrows: a random relocation method lets discoverers and alerters conduct a wide-ranging global search, while a relocation method based on the optimal and suboptimal members of the population gives scroungers a better local search. Finally, MSESSA is tested on the CEC 2017 suites. The function test, Wilcoxon test, and ablation experiment results show that MSESSA achieves better comprehensive performance than 13 other advanced algorithms. In four engineering optimization problems, the stability, effectiveness, and superiority of MSESSA are systematically verified; it has significant advantages and can reduce design cost.
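The priority roulette selection mentioned above is fitness-proportionate sampling over the strategy pool. A minimal sketch, assuming the caller tracks per-strategy success weights; the strategy names and weights below are invented:

```python
# Roulette-wheel selection of a search strategy from a pool.
import random

def roulette_select(strategies, weights):
    """Pick one strategy with probability proportional to its weight."""
    total = sum(weights)
    r = random.uniform(0, total)
    acc = 0.0
    for s, w in zip(strategies, weights):
        acc += w
        if r <= acc:
            return s
    return strategies[-1]          # guard against float rounding

pool = ["spiral_saltation", "neighborhood_guided", "gaussian_walk"]
weights = [0.5, 0.3, 0.2]          # illustrative success-rate priorities
chosen = roulette_select(pool, weights)
```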
9

Sung, Tien-Wen, Baohua Zhao, and Xin Zhang. "Quasi-Affine Transformation Evolutionary with Double Excellent Guidance." Wireless Communications and Mobile Computing 2021 (April 17, 2021): 1–15. http://dx.doi.org/10.1155/2021/5591543.

Abstract:
The Quasi-Affine Transformation Evolutionary (QUATRE) algorithm is a swarm-based collaborative optimization algorithm which has drawn attention from researchers due to its simple structure, easy implementation, and powerful performance. However, its exploration needs improvement, especially in the late stage of evolution, and it can easily fall into local optima. This paper proposes an improved algorithm named Quasi-Affine Transformation Evolutionary with double excellent guidance (QUATRE-DEG). The algorithm uses not only the global optimal solution but also the global suboptimal solution to guide individual evolution. We establish a model that determines the guiding force from the distance between the global optimal and suboptimal positions, and we propose a new mutation strategy based on a double-population structure. The optimized population structure and improved operation mechanisms bring more exploration to the algorithm. To tune the algorithm, experiments on parameter settings were made to determine the size of the subpopulation and to achieve a balance between exploration and exploitation. The performance of the QUATRE-DEG algorithm is evaluated on the CEC2013 and CEC2014 test suites. Through comparison and analysis with ABC variants known for their strong exploration ability and with advanced QUATRE variants, the competitiveness of the proposed QUATRE-DEG algorithm is validated.
10

Aggarwal, Sakshi, and Krishn K. Mishra. "Multi-operator Differential Evolution with MOEA/D for Solving Multi-objective Optimization Problems." Journal of Telecommunications and Information Technology 3, no. 2022 (September 29, 2022): 85–95. http://dx.doi.org/10.26636/jtit.2022.161822.

Abstract:
In this paper, we propose a multi-operator differential evolution variant that incorporates three diverse mutation strategies in MOEA/D. Instead of exploiting the local region, the proposed approach continues to search for optimal solutions in the entire objective space. It explicitly maintains the diversity of the population by relying on clustering. To promote convergence, solutions close to the ideal position in the objective space are given preference in the evolutionary process. The core idea is to ensure population diversity by applying multiple mutation schemes, and a faster convergence rate by giving preference to solutions based on their proximity to the ideal position in the MOEA/D paradigm. The performance of the proposed algorithm is evaluated on two popular test suites. The experimental results demonstrate that the proposed approach outperforms other MOEA/D algorithms.
11

Wang, Xilu, and Yaochu Jin. "Knowledge Transfer Based on Particle Filters for Multi-Objective Optimization." Mathematical and Computational Applications 28, no. 1 (January 18, 2023): 14. http://dx.doi.org/10.3390/mca28010014.

Abstract:
Particle filters, also known as sequential Monte Carlo (SMC) methods, constitute a class of importance sampling and resampling techniques designed to use simulations to perform on-line filtering. Recently, particle filters have been extended for optimization by utilizing the ability to track a sequence of distributions. In this work, we incorporate transfer learning capabilities into the optimizer by using particle filters. To achieve this, we propose a novel particle-filter-based multi-objective optimization algorithm (PF-MOA) by transferring knowledge acquired from the search experience. The key insight adopted here is that, if we can construct a sequence of target distributions that can balance the multiple objectives and make the degree of the balance controllable, we can approximate the Pareto optimal solutions by simulating each target distribution via particle filters. As the importance weight updating step takes the previous target distribution as the proposal distribution and takes the current target distribution as the target distribution, the knowledge acquired from the previous run can be utilized in the current run by carefully designing the set of target distributions. The experimental results on the DTLZ and WFG test suites show that the proposed PF-MOA achieves competitive performance compared with state-of-the-art multi-objective evolutionary algorithms on most test instances.
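The importance-weight update described here has a compact form: for particles drawn from the previous target p_{t-1}, the new weights are proportional to p_t(x)/p_{t-1}(x). A schematic sketch with placeholder densities (not the paper's PF-MOA code):

```python
# Reweight particles from the previous target toward the current one,
# then resample. prev_pdf and cur_pdf are assumed vectorized callables.
import numpy as np

def reweight(particles, prev_pdf, cur_pdf):
    """Particles were drawn from p_{t-1}; weight by p_t / p_{t-1}."""
    w = cur_pdf(particles) / np.maximum(prev_pdf(particles), 1e-300)
    return w / w.sum()

def resample(particles, weights, rng=None):
    """Multinomial resampling proportional to the normalized weights."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```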
12

Darvish, Arman, and Shahryar Rahnamayan. "Optimal Parameter Setting of Active-Contours Using Differential Evolution and Expert-Segmented Sample Image." Journal of Advanced Computational Intelligence and Intelligent Informatics 16, no. 6 (September 20, 2012): 677–86. http://dx.doi.org/10.20965/jaciii.2012.p0677.

Abstract:
Generally, tissue extraction (segmentation) is one of the most challenging tasks in medical image processing. Inaccurate segmentation propagates errors to the subsequent steps in the image processing chain. Thus, in any image processing chain, the role of segmentation is in fact critical because it has a significant impact on the accuracy of the final results, such as those of feature extraction. The appearance of variant noise types makes medical image segmentation a more complicated task. Thus far, many approaches for image segmentation have been proposed, including the well-known active contour (snake) model. This method minimizes the energy associated with the target’s contour, which is the sum of the internal and external energy. Although this model has strong characteristics, it suffers from sensitivity to its control parameters. Finding the optimal parameter values is not a trivial task, because the parameters are correlated and problem-dependent. To overcome this problem, this paper proposes a new approach for setting snake’s optimal parameters, which utilizes an expert-segmented gold (ground-truth) image and an optimization algorithm to determine the optimal values for snake’s seven control parameters. The proposed approach was tested on three different medical image test suites: prostate ultrasound (33 images), breast ultrasound (30 images), and lung X-ray images (48 images). In the current approach, the DE algorithm is employed as a global optimizer. The scheme introduced in this paper is general enough to allow snake to be replaced by any other segmentation algorithm, such as the level set method. For experimental verification, 111 images were utilized. In a comparison with the prepared gold images, the overall error rate is shown to be less than 3%. We explain the proposed approach and the experiments in detail.
13

Wang, Yule, Wanliang Wang, Ijaz Ahmad, and Elsayed Tag-Eldin. "Multi-Objective Quantum-Inspired Seagull Optimization Algorithm." Electronics 11, no. 12 (June 9, 2022): 1834. http://dx.doi.org/10.3390/electronics11121834.

Abstract:
Solutions of multi-objective optimization problems (MOPs) are required to balance convergence and distribution to the Pareto front. This paper proposes a multi-objective quantum-inspired seagull optimization algorithm (MOQSOA) to optimize the convergence and distribution of solutions in multi-objective optimization problems. The proposed algorithm adopts opposition-based learning, the migration and attacking behavior of seagulls, grid ranking, and the superposition principles of quantum computing. To obtain a better initialized population in the absence of a priori knowledge, an opposition-based learning mechanism is used for initialization. The proposed algorithm uses nonlinear migration and attacking operations, simulating the behavior of seagulls, for exploration and exploitation. Moreover, a real-coded quantum representation of the current optimal solution and a quantum rotation gate are adopted to update the seagull population. In addition, a grid mechanism including global grid ranking and grid density ranking provides a criterion for leader selection and archive control. The experimental results of the IGD and Spacing metrics on the ZDT, DTLZ, and UF test suites demonstrate the superiority of MOQSOA over NSGA-II, MOEA/D, MOPSO, IMMOEA, RVEA, and LMEA in enhancing the distribution and convergence performance of MOPs.
14

Jin, Xin, and Fulin Wang. "A Multioffspring Genetic Algorithm Based on Sorting Grouping Selection and Combination Pairing Crossover." Mathematical Problems in Engineering 2022 (March 28, 2022): 1–20. http://dx.doi.org/10.1155/2022/4203082.

Abstract:
A multi-offspring genetic algorithm based on sorting grouping selection and combination pairing crossover is proposed in this paper. First, individuals in the population are sorted according to their objective function values and, after the worst are eliminated, the remaining individuals are divided into several groups. Then the selection operation, which has the advantage of a simplified calculation that can be easily performed, is implemented. Second, a combination pairing crossover operator is designed: individuals from different groups are selected to form new combinations. Within a combination, if a crossover condition is met, individuals are paired and a transposon crossover operation is performed; otherwise, offspring copies of the individuals in the combination are used, which increases the chance of generating better offspring by producing multiple offspring. Finally, a new population is formed by adopting a basic bit mutation operator, an elitism policy, and a multi-offspring population competition strategy. Moreover, a catastrophe mechanism is introduced into the improved algorithm to avoid premature convergence. Test results on the CEC 2017 test suite functions show that the proposed algorithm has better search performance and stability and converges faster to the global optimal solution. These results verify the effectiveness and feasibility of the proposed algorithm. Applying the improved algorithm to the facility location problem for agricultural product logistics shows that it is an effective method for location optimization.
15

BAER, CHARLES F., and MICHAEL LYNCH. "Correlated evolution of life-history with size at maturity in Daphnia pulicaria: patterns within and between populations." Genetical Research 81, no. 2 (April 2003): 123–32. http://dx.doi.org/10.1017/s0016672303006098.

Abstract:
Explaining the repeated evolution of similar sets of traits under similar environmental conditions is an important issue in evolutionary biology. The extreme alternative classes of explanations for correlated suites of traits are optimal adaptation and genetic constraint resulting from pleiotropy. Adaptive explanations presume that individual traits are free to evolve to their local optima and that convergent evolution represents particularly adaptive combinations of traits. Alternatively, if pleiotropy is strong and difficult to break, strong selection on one or a few particularly important characters would be expected to result in consistent correlated evolution of associated traits. If pleiotropy is common, we predict that the pattern of divergence among populations will consistently reflect the within-population genetic architecture. To test the idea that the multivariate life-history phenotype is largely a byproduct of strong selection on body size, we imposed divergent artificial selection on size at maturity upon two populations of the cladoceran Daphnia pulicaria, chosen on the basis of their extreme divergence in body size. Overall, the trajectory of divergence between the two natural populations did not differ from that predicted by the genetic architecture within each population. However, the pattern of correlated responses suggested the presence of strong pleiotropic constraints only for adult body size and not for other life-history traits. One trait, offspring size, appears to have evolved in a way different from that expected from the within-population genetic architecture and may be under stabilizing selection.
16

Bandi, Chaithanya, and Diwakar Gupta. "Operating Room Staffing and Scheduling." Manufacturing & Service Operations Management 22, no. 5 (September 2020): 958–74. http://dx.doi.org/10.1287/msom.2019.0781.

Abstract:
Problem definition: We consider two problems faced by an operating room (OR) manager: (1) how many baseline (core) staff to hire for OR suites, and (2) how to schedule surgery requests that arrive one by one. The OR manager has access to historical case count and case length data, and needs to balance the fixed cost of baseline staff and variable cost of overtime, while satisfying surgeons’ preferences. Academic/practical relevance: ORs are costly to operate and generate about 70% of hospitals’ revenues from surgical operations and subsequent hospitalizations. Because hospitals are increasingly under pressure to reduce costs, it is important to make staffing and scheduling decisions in an optimal manner. Also, hospitals need to leverage data when developing algorithmic solutions, and model tradeoffs between staffing costs and surgeons’ preferences. We present a methodology for doing so, and test it on real data from a hospital. Methodology: We propose a new criterion called the robust competitive ratio for designing online algorithms. Using this criterion and a robust optimization approach to model the uncertainty in case mix and case lengths, we develop tractable optimization formulations to solve the staffing and scheduling problems. Results: For the staffing problem, we show that algorithms belonging to the class of interval classification algorithms achieve the best robust competitive ratio, and develop a tractable approach for calculating the optimal parameters of our proposed algorithm. For the scheduling phase, which occurs one or two days before each surgery day, we demonstrate how a robust optimization framework may be used to find implementable schedules while taking into account surgeons’ preferences such as back-to-back and same-OR scheduling of cases. We also perform numerical experiments with real and synthetic data, which show that our approach can significantly reduce total staffing cost. Managerial implications: We present algorithms that are easy to implement and tractable. These algorithms also allow the OR manager to specify the size of the uncertainty set and to control overtime costs while meeting surgeons’ preferences.
17

Cheng, Christine T. "The test suite generation problem: Optimal instances and their implications." Discrete Applied Mathematics 155, no. 15 (September 2007): 1943–57. http://dx.doi.org/10.1016/j.dam.2007.04.010.

18

Chen, Hao, Shu-Yan Wang, and Xiao-Ying Pan. "Construction Optimal Combination Test Suite Based on Ethnic Group Evolution Algorithm." Research Journal of Applied Sciences, Engineering and Technology 6, no. 2 (June 10, 2013): 309–15. http://dx.doi.org/10.19026/rjaset.6.4078.

19

Bharathi, M. "Optimum Test Suite Using Fault-Type Coverage-Based Ant Colony Optimization Algorithm." International Journal of Applied Metaheuristic Computing 13, no. 1 (January 2022): 1–23. http://dx.doi.org/10.4018/ijamc.2022010106.

Abstract:
Software Product Lines (SPLs) cover a mixture of features for testing a Software Application Program (SPA). Reducing testing cost is a major metric of software testing. In combinatorial testing (CT), maximizing fault-type coverage and reducing the test suite play a key role in reducing the testing cost of an SPA. The metaheuristic Genetic Algorithm (GA) does not offer the best outcome for the test suite optimization problem, owing to its mutation operation, and requires more computational time. A Fault-Type Coverage Based Ant Colony Optimization (FTCBACO) algorithm is therefore offered for test suite reduction in CT. The FTCBACO algorithm starts with the test cases in the test suite and assigns a separate ant to each test case. Ants elect the best test cases by updating pheromone trails and selecting higher-probability trails. The best test case path of the ant with the least time is taken as the optimal solution for performing CT. The FTCBACO technique thus enriches the reduction rate of the test suite and efficiently minimizes the computational time of reducing test cases for CT.
20

Sheoran, Snehlata, Neetu Mittal, and Alexander Gelbukh. "Artificial bee colony algorithm in data flow testing for optimal test suite generation." International Journal of System Assurance Engineering and Management 11, no. 2 (September 3, 2019): 340–49. http://dx.doi.org/10.1007/s13198-019-00862-1.

21

M, Bharathi, and Sangeetha V. "Fault-type coverage based ant colony optimization algorithm for attaining smaller test suite." IAES International Journal of Artificial Intelligence (IJ-AI) 9, no. 3 (September 1, 2020): 507. http://dx.doi.org/10.11591/ijai.v9.i3.pp507-519.

Abstract:
In this paper, we propose the Fault-Type Coverage Based Ant Colony Optimization (FTCBACO) technique for test suite optimization. The algorithm starts by initializing the FTCBACO factors using the test cases in the test suite and then assigns a separate ant to each test case, called a vertex. Each ant chooses the best vertices to reach the food source, i.e., the objective of the problem, by updating pheromone trails and following higher-probability trails. This procedure is repeated until the ant reaches the food source. In the FTCBACO algorithm, the minimal set of test cases with the least execution time chosen by an ant to cover all fault types (the objective) is taken as the optimal solution. We measured the performance of FTCBACO against the Greedy approach and the Additional Greedy approach in terms of fault type coverage, test suite size, and execution time. The heuristic Greedy and Additional Greedy approaches required more execution time and a larger test suite to provide the best resolution of the test suite optimization problem. Statistical investigations of the performance significance of FTCBACO against the other approaches conclude that the FTCBACO technique enriches the reduction rate of the test suite and efficiently minimizes the execution time of reducing test cases.
22

Chen, Lu Lu, and Ling Zhang. "Study of Test Cases Prioritization Based on Ant Colony Algorithm." Applied Mechanics and Materials 263-266 (December 2012): 2168–72. http://dx.doi.org/10.4028/www.scientific.net/amm.263-266.2168.

Abstract:
Regression testing is an important activity to ensure the quality of software. In order to improve the efficiency of regression testing, this paper proposes reordering the test suite in regression testing using an ant colony algorithm and compares the result with other common orderings. Experiments show that the method can obtain the optimal sequence of test cases under a time limit and that it is superior in both effectiveness and efficiency for test case prioritization.
23

Geckeler, Ralf D. "ESAD Shearing Deflectometry: A Primary Flatness Standard with Sub-Nanometer Uncertainty." Key Engineering Materials 381-382 (June 2008): 543–46. http://dx.doi.org/10.4028/www.scientific.net/kem.381-382.543.

Abstract:
To overcome the limitations of conventional interferometry, a technique has been developed which allows the absolute topography measurement of near-plane and slightly curved optical surfaces of arbitrary size with low measurement uncertainty. The Extended Shear Angle Difference (ESAD) method combines deflectometric and shearing techniques in a unique way to minimize measurement errors and to optimize measurand traceability. A device for the topography measurement of optical surfaces up to 500 mm in diameter, achieving sub-nanometer repeatability, reproducibility and uncertainty, was built at the Physikalisch-Technische Bundesanstalt (PTB). The ESAD method is optimally suited for creating a primary standard for straightness and flatness with highest accuracy by which the three-flat test or liquid mirrors can be replaced as starting points of the traceability chain in flatness measurement. In the following, the improved ESAD device which uses optimized opto-mechanical components is presented. Central aspects of the proper design and use of deflectometric systems are highlighted, including the optimal use of pentaprisms.
24

Younis, Mohammed I., and Kamal Z. Zamli. "A Strategy for Automatic Quality Signing and Verification Processes for Hardware and Software Testing." Advances in Software Engineering 2010 (February 2, 2010): 1–7. http://dx.doi.org/10.1155/2010/323429.

Abstract:
We propose a novel strategy to optimize the test suite required for testing both hardware and software in a production line. The strategy is based on two processes: a Quality Signing Process and a Quality Verification Process. Unlike earlier work, the proposed strategy integrates black box and white box techniques in order to derive an optimum test suite during the Quality Signing Process. In this case, the generated optimal test suite significantly improves the Quality Verification Process. Considering both processes, the novelty of the proposed strategy is that the optimization and reduction of the test suite is performed by selecting only mutant-killing test cases from the accumulated t-way test cases. As such, the proposed strategy can potentially enhance product quality with minimal cost in terms of overall resource usage and execution time. As a case study, this paper describes the step-by-step application of the strategy to testing a 4-bit Magnitude Comparator integrated circuit in a production line. Our results demonstrate that the proposed strategy outperforms the traditional block partitioning strategy, with mutant scores of 100% versus 90%, respectively, with the same number of test cases.
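The mutant-killing selection step can be pictured as a greedy cover of the mutant set by accumulated t-way test cases. A hedged sketch, assuming a kill matrix produced by some mutation tool; all inputs are invented:

```python
# Keep only test cases that kill at least one not-yet-killed mutant.
def reduce_by_mutants(kills):
    """kills: {test_case: set(mutant_ids)} -> ordered reduced suite."""
    remaining = set().union(*kills.values())
    suite = []
    while remaining:
        # pick the test case killing the most still-live mutants
        best = max(kills, key=lambda t: len(kills[t] & remaining))
        gain = kills[best] & remaining
        if not gain:
            break        # leftover mutants are unkillable by this pool
        suite.append(best)
        remaining -= gain
    return suite
```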
25

Harikarthik, S. K., V. Palanisamy, and P. Ramanathan. "Optimal test suite selection in regression testing with testcase prioritization using modified ANN and Whale optimization algorithm." Cluster Computing 22, S5 (November 30, 2017): 11425–34. http://dx.doi.org/10.1007/s10586-017-1401-7.

26

Fisal, Nagwa R., Abeer Hamdy, and Essam A. Rashed. "Search-Based Regression Testing Optimization." International Journal of Open Source Software and Processes 12, no. 2 (April 2021): 1–20. http://dx.doi.org/10.4018/ijossp.2021040101.

Abstract:
Regression testing is one of the essential activities during the maintenance phase of software projects. It is executed to ensure the validity of altered software. However, as the software evolves, regression testing becomes prohibitively expensive. In order to reduce the cost of regression testing, it is mandatory to reduce the size of the test suite by selecting the most representative test cases without compromising the effectiveness of the regression testing in terms of fault-detection capability. This problem is known as the test suite reduction (TSR) problem, and it is known to be NP-complete. The paper proposes a multi-objective adapted binary bat algorithm (ABBA) to solve the TSR problem. The original binary bat algorithm (OBBA) was adapted to enhance its exploration capabilities during the search for a Pareto-optimal surface. The effectiveness of the ABBA was evaluated using six Java programs of different sizes. Experimental results showed that, for the same fault discovery rate, the ABBA is capable of reducing the test suite size more than the OBBA and the BPSO.
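Since the search targets a Pareto-optimal surface, a dominance check underlies archive maintenance. A small helper sketch, assuming the two TSR objectives are maximizing fault-detection rate and minimizing suite size:

```python
# Pareto dominance over (fault_detection_rate, suite_size) pairs.
def dominates(a, b):
    """True if a is no worse than b in both objectives and better in one."""
    no_worse = a[0] >= b[0] and a[1] <= b[1]
    better = a[0] > b[0] or a[1] < b[1]
    return no_worse and better

def pareto_front(points):
    """Keep the non-dominated candidate suites."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

print(pareto_front([(0.9, 40), (0.9, 55), (0.7, 20)]))
# [(0.9, 40), (0.7, 20)]: the (0.9, 55) suite is dominated
```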
27

Ping, Ping, Feng Xu, and Zhi Jian Wang. "Generating High-Quality Random Numbers by Next Nearest-Neighbor Cellular Automata." Advanced Materials Research 765-767 (September 2013): 1200–1204. http://dx.doi.org/10.4028/www.scientific.net/amr.765-767.1200.

Abstract:
Cellular automata (CAs) have been widely investigated as random number generators (RNGs). However, the CA rule and the number of neighbors must be chosen carefully for good randomness. In Ref. [11], a non-uniform CA with a next-nearest neighborhood was applied to generate a pseudo-random sequence. However, a non-uniform CA has a more complex hardware implementation and needs larger memory to store different rules than a uniform CA. In this paper, we propose new RNGs based on a uniform CA with a next-nearest neighborhood. The time spacing technique and the NIST statistical test suite are used to find optimal rules for the uniform CA. Experimental results show that the sequences generated by the uniform CA with optimal rules successfully passed all tests in the NIST suite.
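A next-nearest (radius-2) neighborhood means each cell update reads five cells, so a uniform rule is a 32-entry truth table. An illustrative step function on a circular register follows; the rule constant is made up, not one of the paper's optimal rules.

```python
# One synchronous step of a uniform radius-2 CA on a ring of cells.
def ca_step(cells, rule):
    """cells: list of 0/1; rule: int encoding a 32-entry truth table."""
    n = len(cells)
    out = []
    for i in range(n):
        idx = 0
        for k in range(-2, 3):               # neighborhood x[i-2..i+2]
            idx = (idx << 1) | cells[(i + k) % n]
        out.append((rule >> idx) & 1)        # bit idx gives the new state
    return out

state = [0, 1, 0, 0, 1, 1, 0, 1]
state = ca_step(state, rule=0x9A6556A9)      # hypothetical 32-bit rule
```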
28

Rossi, Mauro, Txomin Bornaetxea, and Paola Reichenbach. "LAND-SUITE V1.0: a suite of tools for statistically based landslide susceptibility zonation." Geoscientific Model Development 15, no. 14 (July 21, 2022): 5651–66. http://dx.doi.org/10.5194/gmd-15-5651-2022.

Abstract:
In the past 50 years, a large variety of statistically based models and methods for landslide susceptibility mapping and zonation have been proposed in the literature. The methods, which are applicable to a large range of spatial scales, use a large variety of input thematic data, different model combinations, and several approaches to evaluate the models' performance. Despite the numerous applications available in the literature, a standard approach for susceptibility modeling and zonation is still missing. The literature search revealed that several software programs and tools are available to evaluate regional slope stability using physically based analysis, but only a few use statistically based approaches. Among them, LAND-SE (LANDslide Susceptibility Evaluation) provides the possibility to perform and combine different statistical susceptibility models and to evaluate their performances and associated uncertainties. This paper describes the structure and the functionalities of LAND-SUITE, a suite of tools for statistically based landslide susceptibility modeling which integrates LAND-SE. LAND-SUITE completes and extends LAND-SE, adding functionalities to (i) facilitate input data preparation, (ii) perform preliminary and exploratory analysis of the available data, and (iii) test different combinations of variables and select the optimal thematic/explanatory set. LAND-SUITE provides a tool to assist the user during the data preparatory phase and to perform diversified statistically based landslide susceptibility applications.
29

Khaleel, Shahbaa I., and Raghda Anan. "A review paper: optimal test cases for regression testing using artificial intelligent techniques." International Journal of Electrical and Computer Engineering (IJECE) 13, no. 2 (April 1, 2023): 1803. http://dx.doi.org/10.11591/ijece.v13i2.pp1803-1816.

Abstract:
The goal of the testing process is to find errors and defects in the software being developed so that they can be fixed and corrected before delivery to the customer. Regression testing is an essential quality testing technique during the maintenance phase of the program, as it is performed to ensure the integrity of the program after modifications have been made. As software develops, the test suite becomes too large to be fully executed within the given test cost in terms of budget and time. Therefore, the cost of regression testing should be reduced using different techniques; here we discuss methods such as the retest-all technique, the regression test selection (RTS) technique, and the test case prioritization (TCP) technique. The efficiency of these techniques is evaluated through metrics such as the average percentage of faults detected (APFD), the average percentage block coverage (APBC), and the average percentage decision coverage (APDC). In this paper we review these techniques for test case selection and prioritization and the metrics used to evaluate their efficiency under different artificial intelligence techniques, and we describe the best of them.
30

Santo, D. D., A. O’Neill, L. Franconi, I. Mateu, R. Müller, L. Halser, S. Juillerat, M. Weber, F. Munoz Sanchez, and F. Rizatdinova. "Test of the Optosystem for the ATLAS ITk data transmission chain." Journal of Instrumentation 18, no. 03 (March 1, 2023): C03021. http://dx.doi.org/10.1088/1748-0221/18/03/c03021.

Abstract:
After Run III, the ATLAS detector will undergo a series of upgrades to cope with the harsher radiation environment and the increased number of proton interactions in the High Luminosity LHC. One of the key projects in this suite of upgrades is the ATLAS Inner Tracker (ITk). The pixel detector of the ITk must be read out accurately and at an extremely high rate. The Optosystem performs the electrical-to-optical conversion of signals from the pixel modules. We present a general overview of the design of the Optosystem and recent results on the performance of the data transmission chain, which pivots on the Optoboards, and on the radiation hardness of the ASICs housed on it.
31

Panda, S., D. Munjal, and D. P. Mohapatra. "A Slice-Based Change Impact Analysis for Regression Test Case Prioritization of Object-Oriented Programs." Advances in Software Engineering 2016 (May 8, 2016): 1–20. http://dx.doi.org/10.1155/2016/7132404.

Abstract:
Test case prioritization focuses on finding a suitable order of execution of the test cases in a test suite to meet some performance goals, such as detecting faults early. It is likely that some test cases execute the program parts that are more prone to errors and will detect more errors if executed early during the testing process. Finding an optimal order of execution for the selected regression test cases saves the time and cost of retesting. This paper presents a static approach to prioritizing the test cases by computing the affected component coupling (ACC) of the affected parts of object-oriented programs. We construct a graph named the affected slice graph (ASG) to represent these affected program parts. We determine the fault-proneness of the nodes of the ASG by computing their respective ACC values, and we assign higher priority to those test cases that cover the nodes with higher ACC values. Our analysis with mutation faults shows that the test cases executing the fault-prone program parts have a higher chance of revealing faults earlier than other test cases in the test suite. The results obtained from seven case studies show that our approach is feasible and gives acceptable performance in comparison to some existing techniques.
32

Mitchell, Rory, Daniel Stokes, Eibe Frank, and Geoffrey Holmes. "Bandwidth-Optimal Random Shuffling for GPUs." ACM Transactions on Parallel Computing 9, no. 1 (March 31, 2022): 1–20. http://dx.doi.org/10.1145/3505287.

Abstract:
Linear-time algorithms that are traditionally used to shuffle data on CPUs, such as the method of Fisher-Yates, are not well suited to implementation on GPUs due to inherent sequential dependencies, and existing parallel shuffling algorithms are unsuitable for GPU architectures because they incur a large number of read/write operations to high-latency global memory. To address this, we provide a method of generating pseudo-random permutations in parallel by fusing suitable pseudo-random bijective functions with stream compaction operations. Our algorithm, termed “bijective shuffle”, trades increased per-thread arithmetic operations for reduced global memory transactions. It is work-efficient, deterministic, and only requires a single global memory read and write per shuffle input, thus maximising use of global memory bandwidth. To empirically demonstrate the correctness of the algorithm, we develop a statistical test for the quality of pseudo-random permutations based on kernel space embeddings. Experimental results show that the bijective shuffle algorithm outperforms competing algorithms on GPUs, showing improvements of between one and two orders of magnitude and approaching peak device bandwidth.
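The fusion of a pseudo-random bijection with stream compaction can be sketched in a few lines. Here a Feistel-style mixer stands in for the paper's bijective function; the constants are arbitrary, and the real implementation is a GPU kernel rather than Python.

```python
# Permute [0, n) by applying a bijection on a power-of-two superset of
# the domain, then compacting the in-range outputs (keeps order).
def feistel_bijection(x, bits, keys):
    """Feistel network: bijective on [0, 2**bits) for any round function."""
    half = bits // 2
    mask = (1 << half) - 1
    left, right = x >> half, x & mask
    for k in keys:
        left, right = right, left ^ (((right * 2654435761) ^ k) & mask)
    return (left << half) | right

def bijective_shuffle(n, keys=(0x9E37, 0x85EB, 0xC2B2)):
    b = max(2, (n - 1).bit_length())
    bits = b + (b % 2)                         # Feistel needs even width
    perm = (feistel_bijection(x, bits, keys) for x in range(1 << bits))
    return [y for y in perm if y < n]          # stream compaction

print(bijective_shuffle(10))                   # a permutation of 0..9
```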
33

Grandison, Alexandra, Michael Franjieh, Lily Greene, and Greville G. Corbett. "Optimal categorisation." Cadernos de Linguística 2, no. 1 (December 3, 2021): e393. http://dx.doi.org/10.25189/2675-4916.2021.v2.n1.id393.

Abstract:
The debate as to whether language influences cognition has been long standing but has yielded conflicting findings across domains such as colour and kinship categories. Fewer studies have investigated systems such as nominal classification (gender, classifiers) across different languages to examine the effects of linguistic categorisation on cognition. Effective categorisation needs to be informative to maximise communicative efficiency but also simple to minimise cognitive load. It therefore seems plausible to suggest that different systems of nominal classification have implications for the way speakers conceptualise relevant entities. A suite of seven experiments was designed to test this; here we focus on our card sorting experiment, which contains two sub-tasks — a free sort and a structured sort. Participants were 119 adults across six Oceanic languages from Vanuatu and New Caledonia, with classifier inventories ranging from two to 23. The results of the card sorting experiment reveal that classifiers appear to provide structure for cognition in tasks where they are explicit and salient. The free sort task did not incite categorisation through classifiers, arguably as it required subjective judgement, rather than explicit instruction. This was evident from our quantitative and qualitative analyses. Furthermore, the languages employing more extreme categorisation systems displayed smaller variation in comparison to more moderate systems. Thus, systems that are more informative or more rigid appear to be more efficient. The study implies that the influence of language on cognition may vary across languages, and that not all nominal classification systems employ this optimal trade-off between simplicity and informativeness. These novel data provide a new perspective on the origin and nature of nominal classification.
34

Guan, Gui Xia, Yu Zhang Han, Min Hua Wu, and Jian Guo Han. "On Anti-Noise-Algorithm of Optical-Polarization-Sensing Localization." Applied Mechanics and Materials 190-191 (July 2012): 126–32. http://dx.doi.org/10.4028/www.scientific.net/amm.190-191.126.

Abstract:
The noise problem encountered during sensing measurement has not been adequately discussed. In this paper, the calculation errors caused by the noises are demonstrated. A practical optical sensing-localization algorithm based on a specially organized Genetic Algorithm is proposed, and application-test experiments of this algorithm are described. Moreover, a parallel-operation scheme suited to Cloud-Computing support is introduced. The developments of the algorithm mentioned above provide an effective approach for the rapidly growing field of independent localization, which would benefit fields such as geographical mapping and navigation, especially when GPS is not readily available due to industrial or natural disturbances such as radio interference or thunder.
35

Mishra, Deepti Bala, Biswaranjan Acharya, Dharashree Rath, Vassilis C. Gerogiannis, and Andreas Kanavos. "A Novel Real Coded Genetic Algorithm for Software Mutation Testing." Symmetry 14, no. 8 (July 26, 2022): 1525. http://dx.doi.org/10.3390/sym14081525.

Abstract:
Information Technology has developed rapidly in recent years, and software systems can play a critical role in the symmetry of the technology. In the field of software testing, white-box unit-level testing constitutes the backbone of all other testing techniques, as testing can be entirely implemented by considering the source code of each System Under Test (SUT). In unit-level white-box testing, mutants can be used; these mutants are artificially generated faults seeded in each SUT that behave similarly to realistic ones. Executing test cases against mutants yields the adequacy (mutation) score of each test case. Efficient Genetic Algorithm (GA)-based methods have been proposed to address different problems in white-box unit testing, and in particular issues of mutation testing techniques. In this research paper, a new approach is proposed that integrates the path coverage-based testing method with the novel idea of tracing a Fault Detection Matrix (FDM) to achieve maximum mutation coverage. The proposed real-coded GA for mutation testing is designed to achieve the highest Mutation Score, and it is thus named RGA-MS. The approach is implemented in two phases: path coverage-based test data are initially generated and stored in an optimized test suite; in the next phase, the test suite is executed to kill the mutants present in the SUT. The proposed method aims to achieve the minimum test dataset with the highest Mutation Score by removing duplicate test data covering the same mutants. The proposed approach is implemented on the same SUTs as were used for path testing. We show that the RGA-MS approach can cover the maximum number of mutants with a minimum number of test cases. Furthermore, the proposed method can generate a maximum path coverage-based test suite with minimal test data generation compared to other algorithms. In addition, all mutants in the SUT can be covered by fewer test data with no duplicates. Ultimately, the generated optimal test suite is tuned to achieve the highest Mutation Score. The GA is used to find the maximum mutation coverage as well as to delete redundant test cases.
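The FDM-driven duplicate removal can be sketched as discarding any test case whose killed-mutant set is contained in that of a kept one; the matrix below is an invented example, not data from the paper.

```python
# Drop test cases whose kills are a subset of an already-kept case.
def drop_redundant(fdm):
    """fdm: {test_case: frozenset(killed_mutants)} -> kept test cases."""
    kept = {}
    # consider stronger test cases first so subsets get discarded
    for tc, killed in sorted(fdm.items(), key=lambda kv: -len(kv[1])):
        if not any(killed <= other for other in kept.values()):
            kept[tc] = killed
    return list(kept)

fdm = {"t1": frozenset({1, 2, 3}), "t2": frozenset({2, 3}),
       "t3": frozenset({4})}
print(drop_redundant(fdm))   # ['t1', 't3']: t2 duplicates t1's kills
```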
36

Trojovská, Eva, Mohammad Dehghani, and Víctor Leiva. "Drawer Algorithm: A New Metaheuristic Approach for Solving Optimization Problems in Engineering." Biomimetics 8, no. 2 (June 6, 2023): 239. http://dx.doi.org/10.3390/biomimetics8020239.

Abstract:
Metaheuristic optimization algorithms play an essential role in solving optimization problems. In this article, a new metaheuristic approach called the drawer algorithm (DA) is developed to provide quasi-optimal solutions to optimization problems. The main inspiration for the DA is to simulate the selection of objects from different drawers to create an optimal combination. The optimization process involves a dresser with a given number of drawers, where similar items are placed in each drawer. The optimization is based on selecting suitable items, discarding unsuitable ones from different drawers, and assembling them into an appropriate combination. The DA is described, and its mathematical modeling is presented. The performance of the DA in optimization is tested by solving fifty-two objective functions of various unimodal and multimodal types and the CEC 2017 test suite. The results of the DA are compared to the performance of twelve well-known algorithms. The simulation results demonstrate that the DA, with a proper balance between exploration and exploitation, produces suitable solutions. Furthermore, the comparison shows that the DA is an effective approach for solving optimization problems and is much more competitive than the twelve algorithms against which it was compared. Additionally, the implementation of the DA on twenty-two constrained problems from the CEC 2011 test suite demonstrates its high efficiency in handling real-world optimization problems.
37

Bartram, Peter, and Alexander Wittig. "Terrestrial exoplanet simulator: an error optimal planetary system integrator that permits close encounters." Monthly Notices of the Royal Astronomical Society 504, no. 1 (March 27, 2021): 678–91. http://dx.doi.org/10.1093/mnras/stab896.

Abstract:
We present the Terrestrial Exoplanet Simulator (tes), a new n-body integration code for the accurate and rapid propagation of planetary systems in the presence of close encounters. tes builds upon the classic Encke method and integrates only the perturbations to Keplerian trajectories to reduce both the error and runtime of simulations. A variable step size is used throughout to enable close encounters to be handled precisely. A suite of numerical improvements is presented that together make tes optimal in terms of energy error. Lower runtimes are found in the majority of test problems considered when compared to direct integration using ias15. tes is freely available.
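For reference, the classic Encke formulation (a textbook statement, not quoted from the paper) propagates only the deviation from an osculating Keplerian reference orbit r_k:

```latex
% Encke's method: integrate the deviation delta rather than r itself;
% delta stays small between rectifications of the reference orbit.
\[
\mathbf{r}(t) = \mathbf{r}_k(t) + \boldsymbol{\delta}(t),
\qquad
\ddot{\boldsymbol{\delta}}
  = \mathbf{a}_{\mathrm{pert}}
  + \mu\left(\frac{\mathbf{r}_k}{r_k^{3}} - \frac{\mathbf{r}}{r^{3}}\right)
\]
```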
38

Wolfe, Stephen A., David J. Huntley, and Jeff Ollerhead. "Optical Dating of Modern and Late Holocene Dune Sands in the Brandon Sand Hills, Southwestern Manitoba*." Géographie physique et Quaternaire 56, no. 2-3 (October 7, 2004): 203–14. http://dx.doi.org/10.7202/009106ar.

Abstract:
For any suite of optical dating samples, two issues must be considered: do zero-age samples yield an optical age of zero, and are the optical ages consistent with independent stratigraphic and chronologic information? A test of the zero age of dune sands was performed by dating samples from the crest, lee slope and stoss slope of an active dune in southwestern Manitoba. Three surface samples showed that, using 1.4 eV (infrared) excitation of K-feldspars, the equivalent dose, and hence “age”, depended on whether the bleach used for the thermal transfer correction was infrared/red or sunlight, leading to an age uncertainty of about ±40 years. Optical ages for samples 50 cm below these, and calculated relative to them, were 8 ± 8, 1 ± 7, and 38 ± 7 years, independent of the bleach used. These ages are consistent with expectations for the crest, lee slope and stoss slope, respectively. Optical ages of late Holocene dune sand units at the Brookdale Road section, southwestern Manitoba, were consistent with radiocarbon ages from organic matter within intervening buried soils. The suite of optical and radiocarbon ages from the Brandon Sand Hills provides a record of dune activity and stability for the region, and tentatively identifies periods of eolian activity at about 2 ka, 3.1 to 4.0 ka, and prior to 5.2 ka.
APA, Harvard, Vancouver, ISO, and other styles
39

Pegau, W. Scott, Jessica Garron, Leonard Zabilansky, Christopher Bassett, Job Bello, John Bradford, Regina Carns, et al. "Detection of oil in and under ice." International Oil Spill Conference Proceedings 2017, no. 1 (May 1, 2017): 1857–76. http://dx.doi.org/10.7901/2169-3358-2017.1.1857.

Full text
Abstract:
ABSTRACT (2017-147) In 2014, researchers from ten organizations came to the U.S. Army Corps of Engineers Cold Regions Research and Engineering Laboratory (CRREL) in New Hampshire to conduct a first-of-its-kind large-scale experiment aimed at determining current sensor capabilities for detecting oil in and under sea ice. This project was the second phase of the Oil Spill Detection in Low Visibility and Ice research project of the International Association of Oil and Gas Producers (IOGP), Arctic Oil Spill Response Technology - Joint Industry Programme. The objectives of the project were to: (1) acquire acoustic, thermal, optical, and radar signatures of oil on, within, and underneath a level sheet of laboratory sea ice; (2) determine the capabilities of various sensors to detect oil in specific ice environments created in a test tank, including freeze-up, growth, and melt; (3) model the potential performance of the sensors under realistic field conditions using the test data for validation; and (4) recommend the most effective suite of existing sensors for detecting oil in the ice environment. The sensor testing spanned a two-month ice-growth phase and a one-month decay/melt period. The growth phase produced an 80-centimeter-thick level sheet of salt-water ice representative of natural sea ice grown under quiescent conditions. Above-ice sensors included frequency-modulated continuous-wave radar, ground-penetrating radar, a laser fluorescence polarization sensor, a spectral radiometer, and visible and infrared cameras. Below-ice sensors included acoustics (broadband, narrowband, and multibeam sonars), spectral radiometers, cameras, and fluorescence polarization. Measurements of the physical and electrical properties of the ice and of the oil within the ice were provided to optical, acoustic, and radar modelers as inputs into their models. The models were then used to extrapolate the sensors’ laboratory performance to potential performance over a range of field conditions. All selected sensors detected oil under some conditions. The radar systems were the only above-ice sensors capable of detecting oil below or trapped within the ice. Cameras below the ice detected oil at all stages of ice growth, and the acoustic and fluorescence systems detected encapsulated oil through limited amounts of new ice growth beneath the oil. No single sensor detected oil in and below ice under all conditions tested. However, we used the test results to identify suites of sensors that could be deployed today, both above and below the ice, to detect and map an oil spill within ice-covered waters.
APA, Harvard, Vancouver, ISO, and other styles
40

Xu, Gaowei, Huimin Fang, Yumin Song, and Wensheng Du. "Optimal Design and Analysis of Cavitating Law for Well-Cellar Cavitating Mechanism Based on MBD-DEM Bidirectional Coupling Model." Agriculture 13, no. 1 (January 5, 2023): 142. http://dx.doi.org/10.3390/agriculture13010142.

Full text
Abstract:
A variable-velocity parallel four-bar cavitating mechanism for well-cellars can form the well-cellar cavities suited to well-cellar transplanting under continuous operation. In order to improve the cavitating quality, this paper analyzed the structural composition and working principle of the cavitating mechanism and established a bidirectional coupling model of multi-body dynamics and the discrete element method between the cavitating mechanism and the soil using the RecurDyn and EDEM software. Based on this model, a three-factor, five-level quadratic orthogonal rotational combination design test was conducted, with the parameters of the cavitating mechanism as the experimental factors and the parameters of the cavity as the response indices, to obtain the optimal parameter combination; a virtual simulation test was then conducted for the optimal parameter combination in order to study the cavitating law of the cavitating mechanism and the soil. The test results showed that the depth of the cavity was 188.6 mm, the vertical angle of the cavity was 90.4°, the maximum diameter of the cavity was 76.1 mm, the minimum diameter of the cavity was 68.5 mm, and the variance of the cavity diameters was 5.42 mm². The cavitating mechanism with the optimal parameters based on the RecurDyn–EDEM bidirectional coupling model could further improve the cavitating quality.
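The quadratic orthogonal rotational combination design amounts to fitting a full second-order response surface to the simulated responses and optimizing it. A minimal sketch is given below; the design matrix, run count, and response values are synthetic placeholders, not the paper's data.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import minimize

def quad_features(x):
    """Full quadratic model terms for a 3-factor design:
    1, x1..x3, x1^2..x3^2, x1*x2, x1*x3, x2*x3."""
    x = np.atleast_2d(x)
    cross = [x[:, i] * x[:, j] for i, j in combinations(range(x.shape[1]), 2)]
    return np.column_stack([np.ones(len(x)), x, x**2, *cross])

# X: coded factor settings of a rotatable design; y: a measured response
# (e.g. cavity depth). Both are hypothetical stand-ins here.
X = np.random.uniform(-1.682, 1.682, (23, 3))
y = (180 - 5*(X[:, 0] - 0.5)**2 - 3*(X[:, 1] + 0.2)**2 - 2*X[:, 2]**2
     + np.random.normal(0, 0.5, 23))

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)    # fit surface
surface = lambda x: float(-(quad_features(x) @ beta))          # negate to maximize
opt = minimize(surface, x0=np.zeros(3), bounds=[(-1.682, 1.682)] * 3)
print("optimal coded factor levels:", opt.x)
```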
APA, Harvard, Vancouver, ISO, and other styles
41

Singh, Lakshman, and Sachin Singh. "Mathematical Modeling of Thermal Lensing effect on Nd-Yag Laser Rod in Trans-Receiver Optoelectronics Avionics Assembly and its Interferometric Measurement." International Journal of Computational Physics Series 1, no. 1 (February 27, 2018): 65–76. http://dx.doi.org/10.29167/a1i1p65-76.

Full text
Abstract:
The Nd-Yag laser rod is a commonly used laser-generation element in the optical detection and ranging systems of military aircraft. This research paper points out different issues affecting the Nd-Yag laser rod during laboratory tests. The paper discusses the thermal lensing phenomenon in the Nd-Yag laser rod, which causes thermal fracture of the rod, variation in laser power, measurement error, and a cumulative loss of accuracy in the higher-level optical assembly. The research identifies the major unavoidable factors in the optical assembly that cause the thermal lensing phenomenon and formulates a generic mathematical model for it. The lab observations of thermal lensing were analyzed by interferometric measurement, and the best-suited optical assembly tuning and alignment procedure for laser-based avionics systems is recommended based on the research findings.
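The abstract does not reproduce the model itself. For orientation, the widely used first-order expression for the thermal focal length of a pumped laser rod is given below (a Koechner-style textbook formula; the paper's generic model may differ):

```latex
% First-order thermal lens focal length of a laser rod (textbook form)
f = \frac{K A}{P_h}
    \left[
      \frac{1}{2}\frac{dn}{dT}
      + \alpha\, C_{r,\phi}\, n_0^{3}
      + \frac{\alpha\, r_0\, (n_0 - 1)}{L}
    \right]^{-1}
% K: thermal conductivity;  A: rod cross-section;  P_h: dissipated heat;
% dn/dT: thermo-optic coefficient;  \alpha: thermal expansion coefficient;
% C_{r,\phi}: photoelastic coefficients;  n_0: refractive index;
% r_0, L: rod radius and length.
```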
APA, Harvard, Vancouver, ISO, and other styles
42

Zitzler, Eckart, Kalyanmoy Deb, and Lothar Thiele. "Comparison of Multiobjective Evolutionary Algorithms: Empirical Results." Evolutionary Computation 8, no. 2 (June 2000): 173–95. http://dx.doi.org/10.1162/106365600568202.

Full text
Abstract:
In this paper, we provide a systematic comparison of various evolutionary approaches to multiobjective optimization using six carefully chosen test functions. Each test function involves a particular feature that is known to cause difficulty in the evolutionary optimization process, mainly in converging to the Pareto-optimal front (e.g., multimodality and deception). By investigating these different problem features separately, it is possible to predict the kind of problems to which a certain technique is or is not well suited. However, in contrast to what was suspected beforehand, the experimental results indicate a hierarchy of the algorithms under consideration. Furthermore, the emerging effects are evidence that the suggested test functions provide sufficient complexity to compare multiobjective optimizers. Finally, elitism is shown to be an important factor for improving evolutionary multiobjective search.
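The six functions introduced here became the widely used ZDT suite. As an example, the first of them (T1/ZDT1, with a convex Pareto-optimal front) is reproduced below from its standard definition:

```python
import numpy as np

def zdt1(x):
    """T1/ZDT1 from Zitzler, Deb & Thiele (2000): x in [0, 1]^n
    (n = 30 in the paper); convex Pareto-optimal front."""
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return f1, f2   # Pareto-optimal exactly when x[1:] == 0, i.e. g == 1
```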
APA, Harvard, Vancouver, ISO, and other styles
43

Coppa, Paolo, and Antonio La Malfa. "Light Extinction by Fire Smokes." Journal of Fire Sciences 15, no. 3 (May 1997): 180–202. http://dx.doi.org/10.1177/073490419701500302.

Full text
Abstract:
Visibility of escape routes is most important to guarantee the safety of people indoors during fires. In the case of a fire, visibility is strongly influenced by the smoke present in the room. In the present work, the extinction coefficient of visible light has been experimentally measured in the presence of the smoke produced by four different test fires, named TF2, TF3, TF4, and TF5. Tests have been carried out in a specially suited test room in the Italian Fire Department Research Center, provided with instruments to measure the optical density of smoke and the spectral extinction coefficient.
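The measured quantity is defined through the Beer–Lambert law; the following summarizes the standard fire-safety conventions (a reminder, not values from the paper):

```latex
% Beer--Lambert attenuation over a path of length L through smoke
I = I_0\, e^{-K L}
\quad\Rightarrow\quad
K = \frac{1}{L}\ln\frac{I_0}{I},
\qquad
D = \frac{1}{L}\log_{10}\frac{I_0}{I} = \frac{K}{\ln 10}
% K: extinction coefficient (m^{-1});  D: optical density per metre.
% Visibility of escape routes is commonly estimated as S \approx C/K,
% with C around 3 for light-reflecting and around 8 for light-emitting
% signs.
```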
APA, Harvard, Vancouver, ISO, and other styles
44

Zhao, Nai Zhi, Chang Tie Huang, and Xin Chen. "Structural Health Monitoring of Plate Structures Using Lamb Wave Methods." Advanced Materials Research 368-373 (October 2011): 2417–20. http://dx.doi.org/10.4028/www.scientific.net/amr.368-373.2417.

Full text
Abstract:
In this paper, a waveform is used to excite Lamb waves in the test plates during experimental testing. The optimal excitation frequency depends on the test configuration. In order to select an excitation frequency, dispersion curves are first created to show which frequency range is best suited for Lamb wave excitation. Dispersion curves were created for the aluminum test plates, and it was concluded that a frequency below 1 MHz should be used in order to excite only the fundamental S0 and A0 modes. Experiments are performed on the aluminum test plates described. Experimental testing is first performed on undamaged plates in order to determine the path-to-path and test-to-test variability in Lamb wave measurements in the absence of damage. The admittance data is analyzed in order to determine whether any of the PZT transducers need to be replaced because of poor bonding or mechanical failure. The sensor diagnostic technique is applied to each plate and used to ensure proper consistency between PZT transducers.
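Dispersion curves of the kind used to choose the sub-1 MHz excitation band are computed from the Rayleigh–Lamb frequency equations, reproduced below from standard plate-wave theory (not from this paper):

```latex
% Rayleigh--Lamb frequency equations for a traction-free plate of
% thickness 2h; k: wavenumber, \omega: angular frequency,
% c_L, c_T: bulk longitudinal and transverse wave speeds,
% p^2 = \omega^2/c_L^2 - k^2,  q^2 = \omega^2/c_T^2 - k^2.
\frac{\tan(qh)}{\tan(ph)} = -\frac{4 k^{2} p q}{(q^{2}-k^{2})^{2}}
\quad \text{(symmetric modes } S_0, S_1, \dots\text{)}
\qquad
\frac{\tan(qh)}{\tan(ph)} = -\frac{(q^{2}-k^{2})^{2}}{4 k^{2} p q}
\quad \text{(antisymmetric modes } A_0, A_1, \dots\text{)}
```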
APA, Harvard, Vancouver, ISO, and other styles
45

Gavlina, A. E., D. A. Novikov, and M. V. Askerko. "Orthogonal ray scheme: a method for processing interference patterns and reconstructing the shape of a test convex mirror." Journal of Physics: Conference Series 2127, no. 1 (November 1, 2021): 012066. http://dx.doi.org/10.1088/1742-6596/2127/1/012066.

Full text
Abstract:
Abstract This report is devoted to the processing of the interference pattern of a tested mirror, obtained using the orthogonal ray scheme, in which the convex surface under test is illuminated by a collimated beam perpendicular to the optical axis of the surface. The interference pattern is created by two wavefronts, one of which is reflected from the mirror, while the other bypasses the mirror and travels directly to the detector plane. The result of interference pattern processing is a topography map formed by several tangential profiles. The proposed method is suited for large-diameter convex spherical and aspherical mirrors and does not require a priori information about the surface under test, such as the vertex radius of curvature and the conic constant. The theoretical foundations of the data-processing method are presented.
APA, Harvard, Vancouver, ISO, and other styles
46

Ye, Chunling, Zhengyan Mao, and Mandan Liu. "A Novel Multi-Objective Five-Elements Cycle Optimization Algorithm." Algorithms 12, no. 11 (November 14, 2019): 244. http://dx.doi.org/10.3390/a12110244.

Full text
Abstract:
Inspired by the mechanism of generation and restriction among the five elements in Chinese traditional culture, we present a novel Multi-Objective Five-Elements Cycle Optimization algorithm (MOFECO). During the optimization process of MOFECO, individuals represent the elements. At each iteration, we first divide the population into several cycles, each of which contains several individuals. Secondly, for every individual in each cycle, we judge whether to update it according to the force exerted on it by the other individuals in the cycle. In the case of an update, a local or global update is selected by a dynamically adjustable probability Ps; otherwise, the individual is retained. Next, we perform combined mutation operations on the updated individuals, so that the new population contains both the retained and updated individuals for the selection operation. Finally, the fast non-dominated sorting method is applied to the current population to obtain an optimal Pareto solution set. A comparison of MOFECO's parameters is given by experiment, and the performance of MOFECO is compared with that of three classic evolutionary algorithms, Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), and Pareto Envelope-based Selection Algorithm II (PESA-II), and two recent algorithms, the Knee-point-driven Evolutionary Algorithm (KnEA) and Non-dominated Sorting and Local Search (NSLS), on the test function sets ZDT (Zitzler et al.'s test suite), DTLZ (Deb et al.'s test suite), Walking Fish Group (WFG), and Many-objective Functions (MaF). The experimental results indicate that the proposed MOFECO can approach the true Pareto-optimal front with both better diversity and convergence compared to the five other algorithms.
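The abstract specifies the control flow of MOFECO but not its operators. The skeleton below mirrors that flow (division into cycles, force-triggered updates, a Ps-switched local/global step, combined mutation, and non-dominated selection on the merged population); the "force" trigger and the update and mutation operators are illustrative placeholders, not the published definitions.

```python
import numpy as np

def pareto_ranks(F):
    """Rank 0 = non-dominated; plain O(n^2) non-dominated sorting."""
    remaining = list(range(len(F)))
    ranks = np.zeros(len(F), dtype=int)
    r = 0
    while remaining:
        front = [i for i in remaining
                 if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                            for j in remaining if j != i)]
        for i in front:
            ranks[i] = r
        remaining = [i for i in remaining if i not in front]
        r += 1
    return ranks

def mofeco_skeleton(f, dim, pop=60, cycle=5, iters=100, ps=0.5, seed=1):
    """Structural sketch of the loop described in the abstract; the
    operators below are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    X = rng.random((pop, dim))
    for _ in range(iters):
        newX = X.copy()
        for c in range(0, pop, cycle):                     # divide into cycles
            group = X[c:c + cycle]
            centroid = group.mean(axis=0)
            for i, xi in enumerate(group):
                if np.linalg.norm(xi - centroid) > 1e-3:   # "force" trigger
                    if rng.random() < ps:                  # local update
                        step = centroid - xi
                    else:                                  # global update
                        step = X[rng.integers(pop)] - xi
                    cand = xi + rng.random() * step
                    cand += rng.normal(0.0, 0.01, dim)     # combined mutation
                    newX[c + i] = np.clip(cand, 0.0, 1.0)
        merged = np.vstack([X, newX])                      # retained + updated
        F = np.array([f(x) for x in merged])
        order = np.argsort(pareto_ranks(F), kind="stable")
        X = merged[order[:pop]]                            # non-dominated selection
    return X

# usage with a ZDT1-style bi-objective function
def f(x):
    g = 1.0 + 9.0 * np.sum(x[1:]) / (len(x) - 1)
    return (x[0], g * (1.0 - np.sqrt(x[0] / g)))

approx_front = mofeco_skeleton(f, dim=30)
```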
APA, Harvard, Vancouver, ISO, and other styles
47

Chittka, Lars. "BEE COLOR VISION IS OPTIMAL FOR CODING FLOWER COLOR, BUT FLOWER COLORS ARE NOT OPTIMAL FOR BEING CODED—WHY?" Israel Journal of Plant Sciences 45, no. 2-3 (May 13, 1997): 115–27. http://dx.doi.org/10.1080/07929978.1997.10676678.

Full text
Abstract:
Model calculations are used to determine an optimal color coding system for identifying flower colors, and to see whether flower colors are well suited for being encoded. It is shown that the trichromatic color vision of bees comprises UV, blue, and green receptors whose wavelength positions are optimal for identifying flower colors. But did flower colors actually drive the evolution of bee color vision? A phylogenetic analysis reveals that UV, blue, and green receptors were probably present in the ancestors of crustaceans and insects 570 million years ago, and thus predate the evolution of flower color by at least 400 million years. In what ways did flower colors adapt to insect color vision? The variability of flower color is subject to constraint. Flowers are clustered in the bee color space (probably because of biochemical constraints), and different plant families differ strongly in their variation of color (which points to phylogenetic constraint). However, flower colors occupy areas of color space that are significantly different from those occupied by common background materials, such as green foliage. Finally, models are developed to test whether the colors of flowers of sympatric and simultaneously blooming species diverge or converge to a higher degree than expected by chance. Such effects are indeed found in some habitats.
APA, Harvard, Vancouver, ISO, and other styles
48

Dahlke, D., M. Geßner, H. Meißner, K. Stebner, D. Grießbach, R. Berger, and A. Börner. "CALIBRATING PHOTOGRAMMETRIC AIRBORNE CAMERA SYSTEMS WITH DIFFRACTIVE OPTICAL ELEMENTS." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W13 (June 5, 2019): 1637–42. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w13-1637-2019.

Full text
Abstract:
Abstract. This paper presents a laboratory approach for the geometric calibration of airborne camera systems. The setup uses an incoming laser beam, which is split by Diffractive Optical Elements (DOE) into a number of beams with precisely known propagation directions. Each point of the diffraction pattern represents a point at infinity and is invariant against translation. A single image is sufficient to allow a complete camera calibration in accordance with classical camera calibration methods using the pinhole camera model and a distortion model. The presented method is time-saving, since complex bundle adjustment procedures with several images are not necessary. It is well suited for use with frame camera systems, but it works in principle also for pushbroom scanners. In order to prove the reliability, a conventional test-field calibration is compared against the presented approach, showing that all estimated camera parameters differ only insignificantly. Furthermore, a test flight over the Zeche Zollern reference target has been conducted. The aerotriangulation results show that calibrating an airborne camera system with DOE is a feasible solution.
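Because every DOE beam is a point at infinity with a precisely known direction, the calibration reduces to fitting a pinhole-plus-distortion model to the observed diffraction spots in a single image. A minimal sketch of such a projection model follows (a generic Brown-style parameterization assumed here for illustration, not the paper's exact model):

```python
import numpy as np

def project(direction, fx, fy, cx, cy, k1, k2):
    """Pinhole projection of a point at infinity (a DOE beam direction
    in the camera frame) with two radial distortion terms."""
    x, y = direction[0] / direction[2], direction[1] / direction[2]
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # Brown radial distortion factor
    return np.array([fx * x * d + cx, fy * y * d + cy])
```

Stacking the residuals between project(d_i, ...) and the measured spot centroids over all diffraction orders and handing them to a nonlinear least-squares solver (e.g. scipy.optimize.least_squares) recovers the intrinsics from one image, which is what makes the method faster than multi-image bundle adjustment.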
APA, Harvard, Vancouver, ISO, and other styles
49

Sahin, Omur, Bahriye Akay, and Dervis Karaboga. "Archive-based multi-criteria Artificial Bee Colony algorithm for whole test suite generation." Engineering Science and Technology, an International Journal 24, no. 3 (June 2021): 806–17. http://dx.doi.org/10.1016/j.jestch.2020.12.011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Tuloli, Mohamad Syafri, Benhard Sitohang, and Bayu Hendradjaya. "Coevolution of Second-order-mutant." International Journal of Electrical and Computer Engineering (IJECE) 8, no. 5 (October 1, 2018): 3238. http://dx.doi.org/10.11591/ijece.v8i5.pp3238-3249.

Full text
Abstract:
One of the obstacles that hinders the usage of mutation testing is its impracticality; two main contributors to this are the large number of mutants and the large number of test cases involved in the process. Researchers usually try to address this problem by optimizing the mutants and the test cases separately. In this research, we tackle the optimization of mutants and of test cases simultaneously using a coevolution optimization method. The coevolution method is chosen for the mutation testing problem because it works by optimizing multiple collections (populations) of solutions. This research found that coevolution is better suited to multi-problem optimization than other single-population methods (e.g., the Genetic Algorithm); we also propose a new indicator to determine the optimal coevolution cycle. Experiments were conducted on an artificial case, a laboratory case, and a real case.
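As a structural illustration of coevolving the two populations (the encoding, operators, and selection pressures below are simplistic placeholders; the paper's actual method is not detailed in the abstract):

```python
import random

def coevolve(mutants, tests, kills, vary, gens=50, keep=0.5):
    """Toy competitive coevolution of a mutant population against a
    test-case population. kills(m, t) -> True if test t kills mutant m;
    vary(x) -> a perturbed copy of an individual. Both are supplied by
    the caller; selection and replacement here are placeholders."""
    def rank(pop, score):
        order = sorted(range(len(pop)), key=score, reverse=True)
        return [pop[i] for i in order]

    for _ in range(gens):
        # score each population against the other one
        mutants = rank(mutants, lambda i: sum(not kills(mutants[i], t) for t in tests))
        tests = rank(tests, lambda i: sum(kills(m, tests[i]) for m in mutants))
        nm, nt = int(len(mutants) * keep), int(len(tests) * keep)
        # cull the weak halves, refill with variations of the survivors
        mutants[nm:] = [vary(m) for m in random.choices(mutants[:nm], k=len(mutants) - nm)]
        tests[nt:] = [vary(t) for t in random.choices(tests[:nt], k=len(tests) - nt)]
    return mutants, tests
```

Hard-to-kill mutants and high-killing test cases survive each generation, so each population supplies the selection pressure for the other, which is the essence of the coevolutionary approach.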
APA, Harvard, Vancouver, ISO, and other styles