
Journal articles on the topic 'Data-driven test case design'

Consult the top 50 journal articles for your research on the topic 'Data-driven test case design.'


1

Prakash Shrivastava, Divya. "Unit Test Case Design Metrics in Test Driven Development." International Journal of Software Engineering 2, no. 3 (August 31, 2012): 43–48. http://dx.doi.org/10.5923/j.se.20120203.01.

2

Zhao, Jian Ping, Xiao Yang Liu, Hong Ming Xi, Li Ya Xu, Jian Hui Zhao, and Huan Ming Liu. "A Lightweight-Leveled Software Automated Test Framework." Advanced Materials Research 834-836 (October 2013): 1919–24. http://dx.doi.org/10.4028/www.scientific.net/amr.834-836.1919.

Abstract:
To address the proliferation of automated test scripts and test data files, a test automation framework based on a three-layer data-driven mechanism is designed using the test tool QTP together with data-driven and keyword-driven testing. The framework comprises the design of the TestSet, which manages test case files; the TestCase, which stores test cases; and the TestData, which stores test data. By controlling the test scale and applying a test data pool, the test scripts are made reconfigurable and optimized. These methods decouple test design from script development, make test cases and test data easier to read and maintain, optimize test scripts and test data at the business level so that they are reusable, and reduce the number of script files and test data files to a minimum, which reduces the storage space occupied.
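The core idea of a data-driven test framework like the one described above — test logic written once in a driver script, test cases kept in a separate data layer — can be sketched in a few lines. This is a generic illustration, not the paper's QTP framework: the `login()` function and the test rows are invented for the example.

```python
def login(username, password):
    """Toy system under test: accepts one known credential pair."""
    return username == "alice" and password == "s3cret"

# TestData layer: each row is one test case (inputs plus expected outcome),
# maintainable separately from the driver script below.
TEST_DATA = [
    {"username": "alice", "password": "s3cret", "expected": True},
    {"username": "alice", "password": "wrong",  "expected": False},
    {"username": "",      "password": "s3cret", "expected": False},
]

def run_data_driven_tests(data):
    """Single driver applied to every data row; returns indices of failures."""
    failures = []
    for i, row in enumerate(data):
        actual = login(row["username"], row["password"])
        if actual != row["expected"]:
            failures.append(i)
    return failures

failures = run_data_driven_tests(TEST_DATA)
print(f"{len(TEST_DATA)} cases run, {len(failures)} failures")
```

Adding a new test case is then a data change, not a script change, which is the decoupling the abstract describes.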
3

Hanhela, Matti, Olli Gröhn, Mikko Kettunen, Kati Niinimäki, Marko Vauhkonen, and Ville Kolehmainen. "Data-Driven Regularization Parameter Selection in Dynamic MRI." Journal of Imaging 7, no. 2 (February 20, 2021): 38. http://dx.doi.org/10.3390/jimaging7020038.

Abstract:
In dynamic MRI, sufficient temporal resolution can often only be obtained using imaging protocols which produce undersampled data for each image in the time series. This has led to the popularity of compressed sensing (CS) based reconstructions. One problem in CS approaches is determining the regularization parameters, which control the balance between data fidelity and regularization. We propose a data-driven approach for the total variation regularization parameter selection, where reconstructions yield expected sparsity levels in the regularization domains. The expected sparsity levels are obtained from the measurement data for temporal regularization and from a reference image for spatial regularization. Two formulations are proposed: a simultaneous search for a parameter pair yielding the expected sparsity in both domains (S-surface), and a sequential parameter selection using the S-curve method (Sequential S-curve). The approaches are evaluated using simulated and experimental DCE-MRI. In the simulated test case, both methods produce a parameter pair and reconstruction that are close to the root mean square error (RMSE) optimal pair and reconstruction. In the experimental test case, the methods produce almost identical parameter selections, and the reconstructions are of high perceived quality. Both methods lead to a highly feasible selection of the regularization parameters in both test cases, while the sequential method is computationally more efficient.
4

Lyu, Zhi-Jun, Qi Lu, YiMing Song, Qian Xiang, and Guanghui Yang. "Data-Driven Decision-Making in the Design Optimization of Thin-Walled Steel Perforated Sections: A Case Study." Advances in Civil Engineering 2018 (2018): 1–14. http://dx.doi.org/10.1155/2018/6326049.

Abstract:
Rack columns have distinctive design characteristics: they carry regular perforations to facilitate installation of the rack system, which makes them more difficult to analyze with traditional cold-formed steel design theory or standards. The emergence of industrial "big data" has created new opportunities for innovative thinking in science, engineering, and business. The main contribution of this paper is a novel data-driven model (DDM), built with artificial neural network technology on engineering data from finite element simulation and physical tests, for the design optimization of thin-walled steel perforated members. The data-driven model, based on machine learning, provides effective support for decision-making in the innovative design of steel members. The results of the case study indicate that, compared with traditional finite element simulation and physical testing, the DDM is very promising for the difficult problem of designing complicated perforated steel columns.
5

Robertson, P. K., R. G. Campanella, P. T. Brown, I. Grof, and J. M. O. Hughes. "Design of axially and laterally loaded piles using in situ tests: A case history." Canadian Geotechnical Journal 22, no. 4 (November 1, 1985): 518–27. http://dx.doi.org/10.1139/t85-072.

Abstract:
A 915 mm diameter steel pipe pile was driven and tested by the B.C. Ministry of Transportation and Highways as part of their foundation studies for the proposed Annacis channel crossing of the Fraser River. The pile was driven open ended to a maximum depth of 94 m. The pile was tested axially to failure when the pile tip was at depths of 67, 78, and 94 m below ground surface. Following the final axial load test, the pile was loaded laterally to a total deflection at the ground surface of 150 mm. A slope indicator casing was installed in the pile to monitor the deflected shape during lateral loading. Adjacent to the pile, a piezometer-friction cone penetration test (CPT) and a full-displacement pressuremeter profile were made. Results of the axial and lateral load tests are presented along with the data from the CPT and the full-displacement pressuremeter tests. Results of several analyses using the data from the CPT and pressuremeter tests to predict the axial and lateral performance of the pile are presented. A comparison and discussion of the predicted and measured axial and lateral behaviour of the pile are presented; excellent agreement was found. Key words: pile load test, cone penetration test, pressuremeter test.
6

SADEGHI, ALIREZA, and SEYED-HASSAN MIRIAN-HOSSEINABADI. "MBTDD: MODEL BASED TEST DRIVEN DEVELOPMENT." International Journal of Software Engineering and Knowledge Engineering 22, no. 08 (December 2012): 1085–102. http://dx.doi.org/10.1142/s0218194012500295.

Abstract:
Test Driven Development (TDD), as a quality promotion approach, suffers from shortcomings that discourage its use. One of the most challenging is its low level of granularity and abstraction, which may lead to software that is not acceptable to end users; in addition, TDD is difficult to apply in enterprise systems development. To overcome these defects, we have merged TDD with Model Based Testing (MBT) into a framework named Model Based Test Driven Development (MBTDD). In TDD, writing test cases comes before programming; in our improved method, modeling precedes writing test cases. To validate the applicability of the proposed framework, we implemented a use case of a Human Resource Management (HRM) system by means of MBTDD. The empirical results of using MBTDD show that our proposed method overcomes the existing deficiencies of TDD.
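The test-first discipline that MBTDD builds on can be illustrated with a minimal sketch: the test below is specified against the desired behaviour before the implementation is written, then the implementation is grown until it passes. The `vacation_days()` rule is a hypothetical HR-style example, not taken from the paper's HRM case study.

```python
def vacation_days(years_of_service):
    """20 base days plus one day per year of service, capped at 10 extra."""
    if years_of_service < 0:
        raise ValueError("years_of_service must be non-negative")
    return 20 + min(years_of_service, 10)

# The test, specified before the implementation above existed (in TDD it
# initially fails, driving the implementation):
def test_vacation_days():
    assert vacation_days(0) == 20    # new hire gets the base allowance
    assert vacation_days(5) == 25    # allowance grows with service
    assert vacation_days(30) == 30   # and is capped after ten years

test_vacation_days()
print("all tests passed")
```

In MBTDD, a model of the desired behaviour would precede even this test, with the test cases derived from the model rather than written by hand.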
7

Lerro, Angelo, Alberto Brandl, Manuela Battipede, and Piero Gili. "A Data-Driven Approach to Identify Flight Test Data Suitable to Design Angle of Attack Synthetic Sensor for Flight Control Systems." Aerospace 7, no. 5 (May 23, 2020): 63. http://dx.doi.org/10.3390/aerospace7050063.

Abstract:
Digital avionic solutions enable advanced flight control systems to be available also on smaller aircraft. One of the safety-critical segments is the air data system. Innovative architectures allow the use of synthetic sensors, which can introduce significant technological and safety advances. The application to aerodynamic angles seems the most promising route towards certified applications. In this area, the best procedures for designing synthetic sensors are still an open question within the field; an example is given by the MIDAS project, funded in the frame of Clean Sky 2. This paper proposes two data-driven methods that allow performance to be improved over the entire flight envelope, with particular attention to steady-state flight conditions. The resulting training set is considerably smaller, with a consequent reduction in computational cost. The methods are validated on a real case and will be used as part of the MIDAS life cycle. The first method, Data-Driven Identification and Generation of Quasi-Steady States (DIGS), is based on (i) identification of the lift curve of the aircraft and (ii) augmentation of the training set with artificial flight data points; its main aim is to reduce the problem of an unbalanced training set. The second method, Similar Flight Test Data Pruning (SFDP), performs data reduction based on the isolation of quasi-unique points. The results give evidence of the validity of the methods for the MIDAS project, and they can easily be adopted for generic synthetic sensor design for flight control system applications.
8

Pahwa, Payal, and Renu Miglani. "Test Case Design using Black Box Testing Techniques for Data Mart." International Journal of Computer Applications 109, no. 3 (January 16, 2015): 18–22. http://dx.doi.org/10.5120/19169-0636.

9

Naddeo, Alessandro, and Nicola Cappetti. "Comfort driven design of innovative products: A personalized mattress case study." Work 68, s1 (January 8, 2021): S139—S150. http://dx.doi.org/10.3233/wor-208013.

Abstract:
BACKGROUND: Human-centred design calls for the wellbeing and comfort of the customer/worker when interacting with a product. A good perception model and an objective method to evaluate the (dis)comfort experienced by the product user are needed for performing a preventive comfort evaluation as early as possible in the product development plan. The mattress of a bed is a typical product whose relevance in people's everyday life is under-appreciated. This is quickly changing: customers want to understand the product they buy and ask for more comfortable, scientifically assessed products. No guidelines for designing a personalized mattress are available in the literature. OBJECTIVES: This study describes the experience of designing an innovative product whose product development plan is focused on the customer's perceived comfort: a personalized mattress. The research question is: which method can be used to innovate or create a comfort-driven, human-centred product? METHODS: Virtual prototyping was used to develop a correlated numerical model of the mattress. A comfort model for preventively assessing perceived comfort was proposed and experimentally tested. Mattress testing sessions with subjects were organized, and the collected data were compared with already tested mattresses. Brainstorming and multi-expert methods were used to propose, realize, and test an archetype of a new mattress for final comfort assessment. RESULTS: A new reconfigurable mattress was developed, resulting in two patents. The mattress design shows that personalized products can be tuned according to the anthropometric data of the customer in order to improve the comfort experience during sleep. CONCLUSIONS: A "comfort-driven design guideline" based on virtual prototyping, virtual optimization, and physical prototyping and testing was proposed. It made it possible to improve an existing product and to bring innovation to it.
10

CHO, YONGSUN, WOOJIN LEE, and KIWON CHONG. "THE TECHNIQUE OF BUSINESS MODEL DRIVEN ANALYSIS AND TEST DESIGN FOR DEVELOPMENT OF WEB APPLICATIONS." International Journal of Software Engineering and Knowledge Engineering 15, no. 04 (August 2005): 587–605. http://dx.doi.org/10.1142/s0218194005002452.

Abstract:
A technique for creating analysis models of a web application from a business model is proposed for easy and effective development. Moreover, a technique for generating test cases from the sequence diagrams for a web application is proposed. The use case diagram and web page list are generated from the business model which is depicted using the notations of the UML (Unified Modeling Language) activity diagram. The page diagram and logical/physical database models are created based on the web page list and extracted data fields. Test cases for the web application are generated from call messages (including self-call messages) of the UML sequence diagram. The effectiveness of these techniques is shown using a practical case study which is a development project of a web application for RMS (Research material Management System).
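The idea of deriving one test case per call message of a sequence diagram, including self-call messages, can be sketched as follows. The diagram representation and the message names are invented for illustration; the paper works on full UML models rather than this simplified structure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Message:
    """One call message of a sequence diagram: sender -> receiver.operation()."""
    sender: str
    receiver: str
    operation: str

# Simplified sequence diagram for a hypothetical "submit search" web page:
SEQUENCE = [
    Message("SearchPage", "SearchController", "submitQuery"),
    Message("SearchController", "SearchController", "validateQuery"),  # self-call
    Message("SearchController", "MaterialDAO", "findMaterials"),
    Message("SearchController", "ResultPage", "render"),
]

def generate_test_cases(messages):
    """One test case per call message, self-call messages included."""
    cases = []
    for i, m in enumerate(messages, start=1):
        kind = "self-call" if m.sender == m.receiver else "call"
        cases.append(f"TC{i}: verify {kind} {m.sender} -> {m.receiver}.{m.operation}()")
    return cases

for case in generate_test_cases(SEQUENCE):
    print(case)
```

In a fuller implementation, each generated case would also carry the inputs and expected page transitions extracted from the model.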
11

Georgiou, Aris, George Haritos, Moyra Fowler, and Yasmin Imani. "Advanced phase powertrain design attribute and technology value mapping." Journal of Engineering, Design and Technology 14, no. 1 (March 7, 2016): 115–33. http://dx.doi.org/10.1108/jedt-05-2014-0031.

Abstract:
Purpose – The purpose of this paper is to focus on how the concept design stage of a powertrain system can be improved by using a purely objective driven approach in selecting a final concept design to progress further. This research investigation will examine the development of a novel test-bed to assist in the selection of powertrain technologies during the concept design phase at Ford Motor Company Ltd, serving as the main contribution to knowledge. Design/methodology/approach – The objectives of this research were achieved by carrying out a literature review of external published work related to concept design evaluation methods within product development and value engineering techniques. Empirical studies were conducted with a supporting case study used to test the application of a new test-bed to improve the concept design decision process. Findings – A quantitative new tool “Product Optimisation Value Engineering” (PROVEN) is presented to critically assess new and evolving powertrain technologies at the concept design phase. Research limitations/implications – This research improves the concept design selection process, hence increasing the product value to the customer. Practical implications – The new test-bed “PROVEN” incorporates a data-driven objective approach to assist in assessing concept design alternatives in providing the net value in terms of function and cost as perceived by the customer. Originality/value – A mathematical new test-bed that incorporates a highly adaptable, data-driven and multi-attribute value approach to product specification and conceptual design is developed, novel to the automotive concept design process. This will create a substantially optimised product offering to the market, reducing overall development costs while achieving customer satisfaction. 
The new tool has the ability to define a technology value map to assess multiple technical options as a function of its attributes, whose precise values can be determined at a given cost.
12

Cai, Hui, and Jun Jia. "Using Discrete Event Simulation (DES) To Support Performance-Driven Healthcare Design." HERD: Health Environments Research & Design Journal 12, no. 3 (September 25, 2018): 89–106. http://dx.doi.org/10.1177/1937586718801910.

Abstract:
Aim: This article aims to provide a description of fundamental elements of discrete event simulation (DES), the process and the values of applying DES in assisting healthcare design and planning decision-making. More importantly, it explores how new technology such as electronic medical records, real-time location services (RTLS), and other simulation methods such as space syntax analysis (SSA) can facilitate and complement DES. Background: Healthcare administrators increasingly recognize DES as an effective tool for allocating resources and process improvement. However, limited studies have described specifically how DES can facilitate healthcare design. Method: Three case studies were provided to illustrate the value of DES in supporting healthcare design. The first case study used DES to validate a surgical suite design for shorter surgeon walking distance. The second case study used DES to facilitate capacity planning in a clinic by testing the utilization of exam rooms under various growth scenarios. The detailed process data for the current clinic were captured through RTLS tracking. The third case study applied DES in an emergency department both for site selection in master planning and for capacity testing under various growth scenarios with seasonal volume swings. In addition, SSA was conducted to evaluate the impacts of design on visual surveillance, team communication, and co-awareness. Conclusions: The DES analysis is an effective tool to address the process aspects of healthcare environments and should be combined with post-occupancy evaluation and SSA to address behavioral aspects of operations, providing more solid evidence for future design.
13

MEDDERS, STEPHEN C., EDWARD B. ALLEN, and EDWARD A. LUKE. "USING RULE STRUCTURE TO EVALUATE THE COMPLETENESS OF RULE-BASED SYSTEM TESTING: A CASE STUDY." International Journal of Software Engineering and Knowledge Engineering 20, no. 07 (November 2010): 975–86. http://dx.doi.org/10.1142/s0218194010005006.

Abstract:
Rule-based systems are typically tested using a set of inputs which will produce known outputs. However, one does not know how thoroughly the software has been exercised. Traditional test-coverage metrics do not account for the dynamic data-driven flow of control in rule-based systems. Our literature review found that there has been little prior work on coverage metrics for rule-based systems. This paper proposes test-coverage metrics for rule-based systems derived from metrics defined by prior work, and presents an industrial scale case study. We conducted a case study to evaluate the practicality and usefulness of the proposed metrics. The case study applied the metrics to a system for computational fluid-dynamics models based on a rule-based application framework. These models were tested using a regression-test suite. The data-flow structure built by the application framework, along with the regression-test suite, provided case-study data. The test suite was evaluated against three kinds of coverage. The measurements indicated that complete coverage was not achieved, even at the lowest level definition. Lists of rules not covered provided insight into how to improve the test suite. The case study illustrated that structural coverage measures can be utilized to measure the completeness of rule-based system testing.
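A rule-coverage measure in the spirit of the metrics described above can be sketched as follows: record which rules fire while a test suite runs, then report the fraction covered and the rules never exercised. The rule base and the inputs are toy examples, not the paper's computational fluid-dynamics framework.

```python
# Toy rule base: each rule is a named firing condition over the input.
RULES = {
    "negative_input": lambda x: x < 0,
    "zero_input":     lambda x: x == 0,
    "small_input":    lambda x: 0 < x <= 10,
    "large_input":    lambda x: x > 10,
}

def run_suite(inputs):
    """Apply every rule to every input; return the set of rules that fired."""
    fired = set()
    for x in inputs:
        for name, condition in RULES.items():
            if condition(x):
                fired.add(name)
    return fired

def rule_coverage(inputs):
    """Fraction of rules exercised by the suite, plus the uncovered rules."""
    fired = run_suite(inputs)
    uncovered = sorted(set(RULES) - fired)
    return len(fired) / len(RULES), uncovered

coverage, uncovered = rule_coverage([3, 42, -7])   # this suite never tests x == 0
print(f"rule coverage: {coverage:.0%}, uncovered: {uncovered}")
```

As in the case study, the list of uncovered rules is the actionable output: it tells the tester which inputs to add to the regression suite.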
14

Eilouti, Buthayna. "A language-driven reverse-engineering tool for the analysis of architectural precedents: A Palladian case study." Spatium, no. 45 (2021): 21–33. http://dx.doi.org/10.2298/spat2145021e.

Abstract:
The analysis of precedents represents a significant point of departure for design processing. By applying a language/design analogy, this research introduces a reverse-engineering tool that helps guide the systematic analysis of architectural precedents. The visual tool consists of four main layers: the morphological, the semantic, the semiotic, and the pragmatic. To test the tool's applicability, a prominent precedent from the Palladian designs is analyzed as a case study. By developing the tool and demonstrating its applicability for the analysis of the underlying regulatory and formative principles of the Palladian design, this paper aims to contribute to the knowledge of architectural design by introducing an analytical tool for decoding and externalizing the design language. This tool can be added to the existing toolbox of designers, and it can help reveal new insights into the multi-layered compositional language of precedents and their underlying architectonics. The findings related to the tool's applicability and the compositional language of the Palladian design and its associative meanings and connotations are explained, discussed and illustrated by diagrams.
15

Kwak, Kiseok, Kyung Jun Kim, Jungwon Huh, Ju Hyung Lee, and Jae Hyun Park. "Reliability-based calibration of resistance factors for static bearing capacity of driven steel pipe piles." Canadian Geotechnical Journal 47, no. 5 (May 2010): 528–38. http://dx.doi.org/10.1139/t09-119.

Abstract:
As part of a study to develop load and resistance factor design (LRFD) codes for foundation structures in South Korea, resistance factors for the static bearing capacity of driven steel pipe piles were calibrated in the framework of the reliability theory. A database of 52 static load test results was compiled, and the data from these load test piles were sorted into two cases: a standard penetration test (SPT) N-value at pile tip (i) less than 50 and (ii) equal to or more than 50. Reliability analyses and resistance factor calibration for the two static bearing capacity analysis methods adopted in the Korean Design standards for foundation structures were performed using the first-order reliability method (FORM) and the Monte Carlo simulation (MCS). Reliability indices and resistance factors computed by the MCS are statistically identical to those computed by FORM. Target reliability indices were selected as 2.0 and 2.33 for the group pile case and 2.5 for the single pile case. The resistance factors recommended from this study are specific for the pile foundation design and construction practice and the subsurface conditions in South Korea.
16

Huynh, Quyet-Thang, Le-Trinh Pham, Nhu-Hang Ha, and Duc-Man Nguyen. "An Effective Approach for Context Driven Testing in Practice — A Case Study." International Journal of Software Engineering and Knowledge Engineering 30, no. 09 (September 2020): 1245–62. http://dx.doi.org/10.1142/s0218194020500333.

Abstract:
Software testing is a continuous process throughout the software development stages to ensure quality software products. Researchers, experts, and software engineers keep studying new testing techniques, methods, and approaches to accommodate changes in software development driven by flexible requirements and changing technology. Developers and testers therefore need effective methods, tools, and approaches to create a high-quality product at an efficient cost. This paper provides an effective approach for context-driven testing (CDT) in an agile software development process. CDT is a testing approach that lets the tester choose testing techniques and test objectives based on the specific context. The aim of this paper is to propose an effective approach for implementing CDT in practice, called CDTiP. Through an analysis of two case studies using an agile development process in different contexts, we validate the effectiveness of the approach in terms of test coverage, error detection, and test effort. The empirical results show that CDTiP is well suited to the agile development process and can help the tester detect defects faster at minimum cost. The method has been applied at Enclave, an ODC software engineering company, on real projects.
17

Zhang, Hui Nan, Shi Hai Wang, Xiao Xu Diao, and Bin Liu. "Test Case Generating for Integrated Modular Avionics Software Health Monitoring." Applied Mechanics and Materials 494-495 (February 2014): 873–80. http://dx.doi.org/10.4028/www.scientific.net/amm.494-495.873.

Abstract:
Avionics software is a safety-critical embedded system, and its architecture is evolving from the traditional federated architecture to Integrated Modular Avionics (IMA) to improve resource usability. IMA, an architecture widely employed in the avionics industry, supports partitioning concepts. To ensure that avionics software built on an IMA operating system is developed with high reliability and efficiency, Health Monitoring (HM) has been shown to be a key step in reducing life cycle costs for structural maintenance and inspection. In this paper, we propose a model-driven test methodology using the Architecture Analysis & Design Language (AADL). It introduces modeling patterns for IMA errors to support the test case generating mechanisms of the HM module, proposes three kinds of test cases that can be injected into the HM to stimulate these kinds of errors, and presents preliminary, satisfactory results from an ongoing project based on an IMA system.
18

Parente, Jéssica, Tiago Martins, João Bicker, and Penousal Machado. "Designing Dynamic Logotypes to Represent Data." International Journal of Art, Culture and Design Technologies 8, no. 1 (January 2019): 16–30. http://dx.doi.org/10.4018/ijacdt.2019010102.

Abstract:
This work explores how data can influence the design of logotypes and how they can convey information. The authors use the University of Coimbra, in Portugal, as a case study to develop data-driven logotypes for its faculties and, subsequently, for its students. The proposed logotypes are influenced by the current number of students in each faculty, the number of male and female students, and the nationality of the students. The resulting logotypes are able to portray the diversity of students in each faculty. The authors also test this design approach in the creation of logotypes for the students according to their academic information, namely the course and number of credits done. The resulting logotypes are able to adapt to the current students, evolving over time with the departure of students and admission of new ones.
19

Abdurohman, Maman, and Arif Sasongko. "Software for Simplifying Embedded System Design Based on Event-Driven Method." International Journal of Electrical and Computer Engineering (IJECE) 5, no. 3 (June 1, 2015): 491. http://dx.doi.org/10.11591/ijece.v5i3.pp491-502.

Abstract:
The complexity of embedded system applications increases along with the escalation of market demand, so the embedded system design process must be enhanced to cope with the problem of design complexity. Among the challenges in designing embedded systems are speed, accuracy, and flexibility. The design process is usually conducted recursively to fulfill user requirements and optimization goals, so it needs a system design that is flexible enough to adapt. One solution is to optimize all or some of the design steps. This paper proposes a design framework with an automatic framework code generator based on an event-driven approach. The software is part of a design flow that is flexible and fast. A Tron game and a simple calculator are presented as case studies. Test results show that the framework generator can increase the speed and flexibility of the design process.
20

Shaufiah, Shaufiah. "Pengembangan Rules-Driven Workflow Management System (RDWfMS) dengan Menggunakan Teknik Data Mining untuk Sistem Informasi Research Center." Indonesian Journal on Computing (Indo-JC) 2, no. 1 (September 14, 2017): 39. http://dx.doi.org/10.21108/indojc.2017.2.1.93.

Abstract:
Today's tight competition in business and technology means that organisations must be able to adapt to change. Changes can affect not only the organisation's business process model but also its information system, which leads to high cost and time consumption; therefore, the design and implementation of an information system should be handled with care. Furthermore, the organisation also needs to process data into information and knowledge to support its business. This research addresses those problems by developing an information system using a hybrid approach of a workflow system and data mining techniques, with a research centre organisation as a case study. The system, named SIMPLER, applies rules-driven workflow and data mining to turn research data into information and knowledge. SIMPLER was tested by measuring its ability as a workflow management system as well as by measuring its quality based on end-user satisfaction and perception using WebQual. The test results indicate that SIMPLER is able to satisfy the business process execution needs of the XYZ research centre, can adapt to changes without reconstructing the source code, and attained end-user satisfaction of 81.81% and accuracy of 85%.
21

Ivanov, Valentin, Klaus Augsburg, Carlos Bernad, Miguel Dhaens, Mathieu Dutré, Sebastian Gramstat, Pacôme Magnin, Viktor Schreiber, Urška Skrt, and Nick Van Kelecom. "Connected and Shared X-in-the-Loop Technologies for Electric Vehicle Design." World Electric Vehicle Journal 10, no. 4 (November 21, 2019): 83. http://dx.doi.org/10.3390/wevj10040083.

Abstract:
The presented paper introduces a new methodology of experimental testing procedures required by the complex systems of electric vehicles (EV). This methodology is based on real-time connection of test setups and platforms, which may be situated in different geographical locations, belong to various cyber-physical domains, and are united in a global X-in-the-loop (XIL) experimental environment. The proposed concept, called XILforEV, allows exploring interdependencies between various physical processes that can be identified or investigated in the process of EV development. The paper discusses the following relevant topics: global XILforEV architecture; realization of required high-confidence models using dynamic data driven application systems (DDDAS) and multi fidelity models (MFM) approaches; and formulation of case studies to illustrate XILforEV applications.
22

Krinitskiy, Mikhail, Marina Aleksandrova, Polina Verezemskaya, Sergey Gulev, Alexey Sinitsyn, Nadezhda Kovaleva, and Alexander Gavrikov. "On the Generalization Ability of Data-Driven Models in the Problem of Total Cloud Cover Retrieval." Remote Sensing 13, no. 2 (January 19, 2021): 326. http://dx.doi.org/10.3390/rs13020326.

Abstract:
Total Cloud Cover (TCC) retrieval from ground-based optical imagery is a problem that has been tackled by several generations of researchers. The number of human-designed algorithms for the estimation of TCC grows every year. However, there has been no considerable progress in terms of quality, mostly due to the lack of systematic approach to the design of the algorithms, to the assessment of their generalization ability, and to the assessment of the TCC retrieval quality. In this study, we discuss the optimization nature of data-driven schemes for TCC retrieval. In order to compare the algorithms, we propose a framework for the assessment of the algorithms’ characteristics. We present several new algorithms that are based on deep learning techniques: A model for outliers filtering, and a few models for TCC retrieval from all-sky imagery. For training and assessment of data-driven algorithms of this study, we present the Dataset of All-Sky Imagery over the Ocean (DASIO) containing over one million all-sky optical images of the visible sky dome taken in various regions of the world ocean. The research campaigns that contributed to the DASIO collection took place in the Atlantic ocean, the Indian ocean, the Red and Mediterranean seas, and the Arctic ocean. Optical imagery collected during these missions are accompanied by standard meteorological observations of cloudiness characteristics made by experienced observers. We assess the generalization ability of the presented models in several scenarios that differ in terms of the regions selected for the train and test subsets. As a result, we demonstrate that our models based on convolutional neural networks deliver a superior quality compared to all previously published approaches. 
As a key result, we demonstrate a considerable drop in the ability to generalize the training data in the case of a strong covariate shift between the training and test subsets of imagery which may occur in the case of region-aware subsampling.
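The region-aware subsampling scenario described in the abstract can be sketched minimally: hold one expedition region out entirely, so the test subset is covariate-shifted relative to training. The region names and sample ids below are hypothetical, not DASIO records.

```python
def region_aware_split(samples, held_out_region):
    """Split (region, sample) pairs so one region is held out entirely,
    inducing the covariate shift the abstract warns about."""
    train = [s for region, s in samples if region != held_out_region]
    test = [s for region, s in samples if region == held_out_region]
    return train, test

data = [("atlantic", 1), ("arctic", 2), ("indian", 3), ("arctic", 4)]
train_set, test_set = region_aware_split(data, "arctic")
```

Comparing a model's score on such a held-out region against a random split quantifies the generalization drop the authors report.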
APA, Harvard, Vancouver, ISO, and other styles
23

Xu, Li Chao. "The Improvement Design for Vehicle Speed Detection System." Applied Mechanics and Materials 602-605 (August 2014): 1567–70. http://dx.doi.org/10.4028/www.scientific.net/amm.602-605.1567.

Full text
Abstract:
The precision of a vehicle speedometer directly influences traffic safety, so to ensure reliable operation the speedometer's precision must be tested regularly. In current vehicle speed detection practice, the tested wheel is driven by a roller and the slippage between the roller and the wheel is not accounted for; this introduces a large error in the detection result for vehicles with a rear-mounted, rear-drive layout whose speed signal is taken from the front wheel. To address this, and based on an analysis of the vehicle speed test principle, a speed sensor and a data acquisition card were chosen and an improved vehicle speed detection system was built with NI LabVIEW software, overcoming the shortcomings of the traditional test principle. When applied in detection practice, it can improve the accuracy and pass rate of vehicle speed detection.
APA, Harvard, Vancouver, ISO, and other styles
24

Meneghetti, Giovanni, Carlo Dengo, and Fulvio Lo Conte. "Bending fatigue design of case-hardened gears based on test specimens." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 232, no. 11 (May 30, 2017): 1953–69. http://dx.doi.org/10.1177/0954406217712278.

Full text
Abstract:
Different design methods against bending fatigue are reported in the ISO 6336 standard. The standard primarily suggests the method based on reference test gears and provides the relevant fatigue curves. Additionally, the standard suggests the use of specimens (instead of gears) to generate the reference fatigue curves, but it also advises that specimen-based methods be used when gears are not available and that specimens are particularly useful for comparing fatigue performances of gear materials relative to one another. The purpose of the present paper is to evaluate the accuracy of the specimen-based methods mentioned in the ISO standard when applied to design gears against bending fatigue. Experimental data were generated by means of pulsator fatigue tests on case-hardened gears used in off-highway vehicles. Afterwards, experimental results were compared with theoretical estimations according to the approaches based on reference test gears (as suggested by the ISO standard) and test specimens. Concerning the latter approach, the relevant fatigue design curves were generated by testing smooth as well as notched specimens made of the same case-hardened gear steel. It was found that the specimen-based methods are as accurate as the reference-gear-based method, provided that the material notch sensitivity factor is properly calibrated on the experimental results obtained from specimens.
APA, Harvard, Vancouver, ISO, and other styles
25

Costa, N., V. Branco, R. Costa, A. Borges, A. Modesto, C. Silva, and R. Cunca. "TOWARDS A DESIGN OBSERVATORY: THE CASE OF SCHOLARLY DESIGN RESEARCH IN PORTUGAL." Proceedings of the Design Society: DESIGN Conference 1 (May 2020): 827–36. http://dx.doi.org/10.1017/dsd.2020.327.

Full text
Abstract:
The DesignOBS project was created to collect, map and interpret data about the Portuguese Design Ecosystem, providing supportive information for decision making. This study takes advantage of a participative Design perspective to define and test an observation process via a case based on Design doctorates undertaken in Portugal. It emphasises the need for additional participatory analysis and curation by experts to evaluate and develop more reliable information about the discipline. Moreover, it develops recommendations that can enhance the communicability of Design doctorates.
APA, Harvard, Vancouver, ISO, and other styles
26

Rangnes, Hege, and Aslaug Fodstad Gourvennec. "Læreres samtale om pedagogisk bruk av flerspråklige elevers prøveresultater." Acta Didactica Norge 12, no. 4 (December 18, 2018): 6. http://dx.doi.org/10.5617/adno.6282.

Full text
Abstract:
Siden begynnelsen av 2000-tallet har det vært et utdanningspolitisk ønske i Norge om å kvalitetsvurdere opplæringen i skolen, og det er i den forbindelse innført obligatoriske kartleggingsprøver og nasjonale prøver. Vi vet at lærere er usikre på oppfølgingen av prøveresultatene. Som et ledd i å styrke underveisvurderingen, har Utdanningsdirektoratet lansert digitale læringsstøttende prøver med veiledninger for mellomtrinnet. En av disse prøvene, en vokabularprøve, er særlig innrettet mot den konkrete oppfølgingen av flerspråklige elever. I denne studien undersøker vi, inspirert av et kritisk case-design, hvordan åtte lærere forstått som deltakere med fullt medlemskap i et praksisfellesskap (Lave & Wenger, 1991), reflekterer over flerspråklige elevers prøveresultater. Ved å anvende Breiter og Lights (2006) begreper knyttet til læreres beslutningstaking basert på data – såkalt data-driven decision making – analyserer vi hvordan lærerne i løpet av en samtale beveger seg fra å forklare elevenes resultater på vokabularprøven til å bygge på disse forklaringene når de skal ta målrettede valg om fremtidig undervisning. Studien viser at deltakerne i løpet av samtalen utvikler felles kunnskap om flerspråklige elevers opplæringsbehov som potensielt vil kunne føre til endring i praksis. Samtidig avdekker studien at lærernes nyervervede kunnskap bare delvis er forankret i elevresultatene. Studien peker mot et behov for at skolene gir rom for strukturerte samtaler om prøveresultater hvor også flerspråklige elever blir tematisert. Siktemålet for slike samtaler må være å skape anvendbar kunnskap basert på alle elevers prøveresultater.
Nøkkelord: pedagogisk bruk av prøver, data-driven decision making, praksisfellesskap, flerspråklige elever, akademisk vokabular
Teachers in conversation about the use of second language learners' test results for instructional purposes
Abstract: Over the last two decades, the Norwegian Government has focused strongly on assessing the quality of education in schools. This focus has led to the introduction of both compulsory screening tests and national tests. Current research indicates uncertainty on the part of teachers about how to use and follow up the test results. In order to strengthen formative assessment practice and teachers’ competence in using test results to guide instruction, the Norwegian Directorate for Education and Training has introduced a series of tests in digital format, with accompanying teacher manuals, for use in grades 5-7. One of these tests, a vocabulary test, focuses particularly on supporting second language learners’ (SLLs) learning. In this study, we employ a critical case design to investigate how eight teachers, understood as full participants in a community of practice (Lave & Wenger, 1991), engage in a collective learning process around SLLs’ test results. Using Breiter and Light’s (2006) concepts related to research on teachers’ data-driven decision making, we analyze how these teachers during a conversation move from explaining SLLs test results from the vocabulary test, to making targeted decisions for future instruction. The analysis reveals that during the conversation, the participants develop shared knowledge about the academic needs of SLLs, which could potentially lead to instructional improvement. However, this knowledge is only partly grounded in the test results. The findings suggest that the schools make room for structured conversations about SLLs test results to guide future decisions. Further research should continue to investigate how teachers could improve their instruction based on test results.
Keywords: instructional use of tests, data-driven decision making, community of practice, second language learners, academic vocabulary
APA, Harvard, Vancouver, ISO, and other styles
27

Böhmer, Kristof, and Stefanie Rinderle-Ma. "Automatic Business Process Test Case Selection: Coverage Metrics, Algorithms, and Performance Optimizations." International Journal of Cooperative Information Systems 25, no. 04 (December 2016): 1740002. http://dx.doi.org/10.1142/s0218843017400020.

Full text
Abstract:
Business processes describe and implement the business logic of companies, control human interaction, and invoke heterogeneous services during runtime. Therefore, ensuring the correct execution of processes is crucial. Existing work addresses this challenge through process verification. However, the highly dynamic aspects of current processes and the deep integration and frequent invocation of third-party services limit the use of static verification approaches. Today, one frequently utilized approach to address this limitation is to apply process tests. However, the complexity of process models is steadily increasing, so more and more test cases are required to assure process model correctness and stability during design and maintenance. But executing hundreds or even thousands of process model test cases leads to excessive test suite execution times and, therefore, high costs. Hence, this paper presents novel coverage metrics along with a genetic test case selection algorithm. Both enable the incorporation of user-driven test case selection requirements and the integration of different knowledge sources. In addition, techniques for test case selection computation performance optimization are provided and evaluated. The effectiveness of the presented genetic test case selection algorithm is evaluated against five alternative test case selection algorithms.
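A genetic test case selection of the kind the abstract describes can be sketched as follows. The coverage map, penalty weight, and GA parameters are illustrative assumptions, not the paper's actual coverage metrics or algorithm.

```python
import random

# Hypothetical branch coverage per process-model test case; the GA picks
# a cheap subset that still covers every branch.
COVERAGE = {
    "t1": {"a", "b"}, "t2": {"b"}, "t3": {"c"}, "t4": {"a"}, "t5": {"d"},
}
TESTS = sorted(COVERAGE)
ALL_BRANCHES = set().union(*COVERAGE.values())

def fitness(mask, penalty=0.1):
    """Reward covered branches, penalize suite size."""
    covered = set().union(*[COVERAGE[t] for t, m in zip(TESTS, mask) if m])
    return len(covered) - penalty * sum(mask)

def select_tests(generations=60, pop_size=20, seed=1):
    rng = random.Random(seed)
    n = len(TESTS)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size - 1)]
    pop.append([1] * n)  # seed the full suite so elitism never loses full coverage
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]           # elitist survival
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # point mutation
                child[rng.randrange(n)] ^= 1
            children.append(child)
        pop = elite + children
    best = max(pop, key=fitness)
    return [t for t, m in zip(TESTS, best) if m]

selected = select_tests()
```

Because the full suite is in the initial population and the top half always survives, the returned subset can never cover fewer branches than running everything.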
APA, Harvard, Vancouver, ISO, and other styles
28

Hue, Chu Thi Minh, Duc-Hanh Dang, Nguyen Ngoc Binh, and Anh-Hoang Truong. "USLTG: Test Case Automatic Generation by Transforming Use Cases." International Journal of Software Engineering and Knowledge Engineering 29, no. 09 (September 2019): 1313–45. http://dx.doi.org/10.1142/s0218194019500414.

Full text
Abstract:
This paper proposes a transformation-based method named USLTG (USL-based Test Generation) to automatically generate functional test cases from use cases. We first focus on developing a modeling language named Test Case Specification Language (TCSL) in order to express test cases. Test cases in TCSL can contain detailed information including test steps, test objects within steps, actions of test objects, and test data. Such information is often ignored in currently available test case specifications. We then aim to generate test cases in a TCSL model by a transformation from use cases that are represented in the Use case Specification Language (USL). The USLTG transformation includes three main steps, generating (1) scenarios, (2) test data, and (3) a TCSL model. Within our transformation, an OCL solver is employed in order to build system snapshots as part of test cases and to identify other test data. We applied our method to two case studies and evaluated it by comparing it with other recent works.
APA, Harvard, Vancouver, ISO, and other styles
29

Sargent, D. W., R. D. Beckie, and G. Smith. "Design and performance of deep well dewatering: a case study." Canadian Geotechnical Journal 35, no. 1 (February 1, 1998): 81–95. http://dx.doi.org/10.1139/t97-077.

Full text
Abstract:
This paper reviews the process used to design the construction dewatering system at the Influent Pumping Station at Annacis Island Wastewater Treatment Plant. The design process followed the "observational method," as applied to soil mechanics by K. Terzaghi and set out by R.B. Peck in the Ninth Rankine Lecture. The design was based on a working hypothesis of behaviour anticipated under the most probable conditions identified in the data gathering and assessment program. The sensitivity of the design was evaluated by considering potentially unfavourable conditions evident in the available data. The design development included a review of monitoring feedback obtained during the pumping-well installation, a pumping test, and the dewatering system start-up. The monitoring program and review process are presented.
Key words: dewatering, observational method, case study, pumping test.
APA, Harvard, Vancouver, ISO, and other styles
30

Berg, Maya, Manu Vanaerschot, Andris Jankevics, Bart Cuypers, Rainer Breitling, and Jean-Claude Dujardin. "LC-MS METABOLOMICS FROM STUDY DESIGN TO DATA-ANALYSIS – USING A VERSATILE PATHOGEN AS A TEST CASE." Computational and Structural Biotechnology Journal 4, no. 5 (January 2013): e201301002. http://dx.doi.org/10.5936/csbj.201301002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Susan-Resiga, R. F., C. Popescu, R. Szakal, S. Muntean, and A. Stuparu. "A benchmark test case for swirling flows: design of the swirl apparatus, experimental data, and numerical challenges." IOP Conference Series: Earth and Environmental Science 240 (March 27, 2019): 072004. http://dx.doi.org/10.1088/1755-1315/240/7/072004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

HU, WEI, TIANZHOU CHEN, QINGSONG SHI, and SHA LIU. "CRITICAL-PATH DRIVEN ROUTERS FOR ON-CHIP NETWORKS." Journal of Circuits, Systems and Computers 19, no. 07 (November 2010): 1543–57. http://dx.doi.org/10.1142/s021812661000689x.

Full text
Abstract:
Multithreaded programming has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors. The performance bottleneck of a multithreaded program is its critical path, whose length is its total execution time. As the number of cores within a processor increases, Network-on-Chip (NoC) has been proposed as a promising approach for inter-core communication. In order to optimize the performance of a multithreaded program running on an NoC based multi-core platform, we design and implement the critical-path driven router, which prioritizes inter-thread communication on the critical path when routing packets. The experimental results show that the critical-path driven router improves the execution time of the test case by 14.8% compared to the ordinary router.
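The idea of a critical-path driven router can be illustrated with a toy arbitration sketch. It assumes packets arrive already tagged as critical-path or ordinary traffic; how packets get tagged is outside this sketch, and the class is illustrative, not the paper's router design.

```python
import heapq
from itertools import count

class CriticalPathRouter:
    """Toy output arbitration: packets flagged as critical-path traffic
    win arbitration over ordinary packets; ties keep FIFO order."""

    def __init__(self):
        self._queue = []
        self._seq = count()  # tie-breaker preserving arrival order

    def enqueue(self, packet, critical=False):
        priority = 0 if critical else 1
        heapq.heappush(self._queue, (priority, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

router = CriticalPathRouter()
router.enqueue("p1")
router.enqueue("p2", critical=True)
router.enqueue("p3")
order = [router.dequeue() for _ in range(3)]
```

The critical packet "p2" is forwarded first even though it arrived second; the remaining packets keep their arrival order.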
APA, Harvard, Vancouver, ISO, and other styles
33

Park, Chunjong, Hung Ngo, Libby Rose Lavitt, Vincent Karuri, Shiven Bhatt, Peter Lubell-Doughtie, Anuraj H. Shankar, et al. "The Design and Evaluation of a Mobile System for Rapid Diagnostic Test Interpretation." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 1 (March 19, 2021): 1–26. http://dx.doi.org/10.1145/3448106.

Full text
Abstract:
Rapid diagnostic tests (RDTs) provide point-of-care medical screening without the need for expensive laboratory equipment. RDTs are theoretically straightforward to use, yet their analog colorimetric output leaves room for diagnostic uncertainty and error. Furthermore, RDT results within a community are kept isolated unless they are aggregated by healthcare workers, limiting the potential that RDTs can have in supporting public health efforts. In light of these issues, we present a system called RDTScan for detecting and interpreting lateral flow RDTs with a smartphone. RDTScan provides real-time guidance for clear RDT image capture and automatic interpretation for accurate diagnostic decisions. RDTScan is structured to be quickly configurable to new RDT designs by requiring only a template image and some metadata about how the RDT is supposed to be read, making it easier to extend than a data-driven approach. Through a controlled lab study, we demonstrate that RDTScan's limit-of-detection can match, and even exceed, the performance of expert readers who are interpreting the physical RDTs themselves. We then present two field evaluations of smartphone apps built on the RDTScan system: (1) at-home influenza testing in Australia and (2) malaria testing by community healthcare workers in Kenya. RDTScan achieved 97.5% and 96.3% accuracy compared to RDT interpretation by experts in the Australia Flu Study and the Kenya Malaria Study, respectively.
APA, Harvard, Vancouver, ISO, and other styles
34

Zhang, Ju Liang, Qing Chuang Deng, and Yun Hua Hu. "Design and Implementation of Automatic Test System of Digital Device." Advanced Materials Research 960-961 (June 2014): 1168–73. http://dx.doi.org/10.4028/www.scientific.net/amr.960-961.1168.

Full text
Abstract:
Aiming at digital device testing, an automatic test system is designed; its concept, architecture, and technology are introduced. The PC sends SMV frames to the digital device per IEC 61850-9-2, then receives real data from the device over the PC serial port per IEC 103, forming a closed loop that completes the test. The software's static structure adopts modular programming, and dynamic data exchange uses multithreading. According to the configured test cases, test items are completed automatically. The system is PC-based, saving hardware cost compared with traditional test instruments. With the calculation and control capabilities of the PC, testing is automated, freeing people from repetitive work to focus on analyzing test results and improving test quality and efficiency.
APA, Harvard, Vancouver, ISO, and other styles
35

Brembilla, Eleonora, Christina J. Hopfe, John Mardaljevic, Anastasia Mylona, and Eirini Mantesi. "Balancing daylight and overheating in low-energy design using CIBSE improved weather files." Building Services Engineering Research and Technology 41, no. 2 (November 14, 2019): 210–24. http://dx.doi.org/10.1177/0143624419889057.

Full text
Abstract:
A new set of CIBSE weather files for building performance simulation was recently developed to address the need for better quality solar data. These are essential for most building performance simulation applications, particularly for daylighting studies and low-energy building design, which requires detailed irradiation data for passive solar design and overheating risk analysis. The reliability of weather data becomes paramount when building performance is pushed to its limits. Findings illustrate how principles of good window design can be applied to a case study building, built to the Passivhaus standard, and how its expected performance is affected by the quality of solar irradiation data. Analyses using test reference years were most affected by changes in the solar radiation model (up to 8.3% points), whereas for design summer years the maximum difference was 1.7% points. Adopting the new model caused overheating risk to be classified as more severe using test reference years than design summer years, prompting a discussion on the design summer year selection method. Irradiance data measured on-site were used as a benchmark to evaluate the new solar radiation model, which was found to significantly improve the accuracy of irradiance data within weather files and so the reliability of overheating assessments. Practical application: CIBSE weather files are widely used for compliance verification of building performance in the UK context. This paper tests how the introduction of a new solar radiation model in weather files will affect daylighting and overheating simulation results. Examples are given on how low-energy building design considerations driven by advanced simulation techniques can help reaching indoor visual and thermal comfort requirements.
APA, Harvard, Vancouver, ISO, and other styles
36

ZHANG, YANCHUN, MARIA E. ORLOWSKA, and ROBERT COLOMB. "AN EFFICIENT TEST FOR THE VALIDITY OF UNBIASED HYBRID KNOWLEDGE FRAGMENTATION IN DISTRIBUTED DATABASES." International Journal of Software Engineering and Knowledge Engineering 02, no. 04 (December 1992): 589–609. http://dx.doi.org/10.1142/s0218194092000270.

Full text
Abstract:
Knowledge bases contain specific and general knowledge. In relational database systems, specific knowledge is often represented as a set of relations. The conventional methodologies for centralized database design can be applied to develop a normalized, redundancy-free global schema. Distributed database design involves redundancy removal as well as the distribution design which allows replicated data segments. Thus, distribution design can be viewed as a process on a normalized global schema which produces a collection of fragments of relations from a global database. Clearly, not every fragment of data can be permitted as a relation. In this paper, we clarify and formally discuss three kinds of fragmentations of relational databases, and characterize their features as valid designs, and we introduce a hybrid knowledge fragmentation as the general case. For completeness of presentation, we first show an algorithm for the validity test of vertical fragmentations of normalized relations, and then extend it to the more general case of unbiased fragmentations.
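A minimal sketch of a vertical-fragmentation validity check, using the textbook completeness and lossless-join (key replication) conditions rather than the paper's specific algorithm:

```python
def vertical_fragmentation_valid(attributes, key, fragments):
    """Textbook validity check for a vertical fragmentation:
    completeness (every attribute lands in some fragment) and the
    lossless-join property enforced by replicating the key in each
    fragment. A simplified stand-in for the paper's test."""
    attributes, key = set(attributes), set(key)
    complete = set().union(*fragments) == attributes
    lossless = all(key <= set(f) for f in fragments)
    return complete and lossless

R = {"id", "name", "dept", "salary"}
ok = vertical_fragmentation_valid(R, {"id"}, [{"id", "name"}, {"id", "dept", "salary"}])
bad = vertical_fragmentation_valid(R, {"id"}, [{"id", "name"}, {"dept", "salary"}])
```

The second fragmentation fails because the second fragment lacks the key, so the global relation cannot be losslessly reconstructed by joining the fragments.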
APA, Harvard, Vancouver, ISO, and other styles
37

d’Adamo, Alessandro, Clara Iacovano, and Stefano Fontanesi. "A Data-Driven Methodology for the Simulation of Turbulent Flame Speed across Engine-Relevant Combustion Regimes." Energies 14, no. 14 (July 12, 2021): 4210. http://dx.doi.org/10.3390/en14144210.

Full text
Abstract:
Turbulent combustion modelling in internal combustion engines (ICEs) is a challenging task. It is commonly synthesized by incorporating the interaction between chemical reactions and turbulent eddies into a unique term, namely the turbulent flame speed sT. The task is very complex considering the variety of turbulent and chemical scales resulting from engine load/speed variations. In this scenario, advanced turbulent combustion models are required to predict accurate burn rates under a wide range of turbulence–flame interaction regimes. The framework is further complicated by the difficulty in unambiguously evaluating in-cylinder turbulence and by the poor coherence of turbulent flame speed (sT) measurements in the literature. Finally, the simulated sT from combustion models is rarely assessed in a rigorous manner. A methodology is presented to objectively measure the simulated sT by a generic combustion model over a range of engine-relevant combustion regimes, from Da = 0.5 to Da = 75 (i.e., from the thin reaction regime to wrinkled flamelets). A test case is proposed to assess steady-state burn rates under specified turbulence in a RANS modelling framework. The methodology is applied to a widely adopted combustion model (ECFM-3Z), and the comparison of the simulated sT with experimental datasets allows modelling improvement areas to be identified. Dynamic functions are proposed based on turbulence intensity and Damköhler number. Finally, simulations using the improved flame speed are carried out, and a satisfactory agreement of the simulation results with the experimental/theoretical correlations is found. This confirms the effectiveness and the general applicability of the methodology to any model. The use of grid/time resolution typical of ICE combustion simulations strengthens the relevance of the proposed dynamic functions.
The presented analysis improves the adherence of the simulated burn rate to that of literature turbulent flames, and it opens the possibility to objectively test combustion models under any prescribed turbulence/flame interaction regime. The solid data-driven representation of turbulent combustion physics is expected to reduce the tuning effort in ICE combustion simulations, providing modelling robustness in a very critical area for the virtual design of innovative combustion systems.
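For reference, the Damköhler number used to delimit the combustion regimes above is conventionally defined as the ratio of the integral turbulent time scale to the chemical time scale (standard notation assumed; the paper may use a variant formulation):

```latex
\mathrm{Da} \;=\; \frac{\tau_{\mathrm{T}}}{\tau_{\mathrm{c}}}
\;=\; \frac{l_{\mathrm{t}}/u'}{\delta_{\mathrm{L}}/s_{\mathrm{L}}}
```

Here $l_{\mathrm{t}}$ and $u'$ are the integral length scale and turbulence intensity, while $\delta_{\mathrm{L}}$ and $s_{\mathrm{L}}$ are the laminar flame thickness and speed; large Da corresponds to wrinkled flamelets, small Da to the thin reaction regime.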
APA, Harvard, Vancouver, ISO, and other styles
38

Rohani, Munzilah Md, Rosnawati Buhari, Basil David Daniel, Joewono Prestijo, Kamaruddin Ambak, Norsabahiah Abd Sukor, and Sitti Asmah Hasan. "Car Driving Behaviour on Road Curves: A Study Case in Universiti Tun Hussein Onn Malaysia." Applied Mechanics and Materials 773-774 (July 2015): 990–95. http://dx.doi.org/10.4028/www.scientific.net/amm.773-774.990.

Full text
Abstract:
The World Health Organization (WHO) predicted that by 2020, road accidents would become the third leading cause of death in the world. Several factors contribute to road accidents, among them human error, speeding, irregularities in road design, and period of driving (nighttime or daytime). In road design, horizontal curves are of particular interest to the designer, given that accidents are very likely to occur at such locations if drivers lose control of their vehicles due to inappropriate speed choices. This study was conducted to investigate the variation of driving behaviour on horizontal curves. The test car was fitted with a Global Positioning System (GPS) device and driven by 30 participants. The research findings show that drivers' choice of speed varies while approaching a horizontal curve, on the curve, and just after leaving the curve. Although drivers were found to drive at slightly higher speeds during the daytime than in the evening, the difference was not significant. A comparison between genders also revealed that female and male drivers exhibit similar speed behaviour.
APA, Harvard, Vancouver, ISO, and other styles
39

Tham, Duong Hong, and Truong Nhu Manh. "Predicting the bearing capacity of pile installed into cohesive soil concerning the spatial variability of SPT data (A case study)." ENGINEERING AND TECHNOLOGY 11, no. 1 (March 24, 2021): 45–64. http://dx.doi.org/10.46223/hcmcoujs.tech.en.11.1.1405.2021.

Full text
Abstract:
Nowadays, in situ tests play a vital role in geotechnical engineering and construction technology. Besides lab tests conducted on undisturbed soil samples, many different kinds of in situ tests are used and have proved more efficient for foundation design, such as the pressuremeter test (PMT), the cone penetration test (CPT), and the standard penetration test (SPT). Among them, the standard penetration test (SPT for short) is easy to carry out on site. For decades it has proved reliable for sandy soil, but many viewpoints argued that the test is not appropriately applicable to cohesive soil because of the scattered and dispersed SPT blow counts through different layers. This paper firstly studies how reliably the SPT data can predict physical and mechanical properties; secondly, the soil strength is determined in terms of corrected N-SPT values; and finally, the bearing capacity of a pile penetrating cohesive soil is evaluated. By analyzing data from 40 boreholes located in 18 projects in Ho Chi Minh City, South Vietnam, the coefficients of determination between SPT numbers and physical and mechanical properties differ by soil kind: R2 = 0.623 for sand, 0.363 for sandy clay, and 0.189 for clay. The spatial variability of soil properties is taken into account by calculating the scale of fluctuation θ = 4.65 m alongside the statistically-based data in horizontal directions. Finally, the results from two theoretical approaches to predicting pile bearing capacity were compared to those of the finite element program Plaxis 3D and a static load test at the site. The correlation between the capacity computed using corrected N-values instead of soil strength and the results of static load tests has proved well suited to evaluating the bearing capacity of driven and jack-in piles, particularly those installed in cohesive soil.
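The per-soil-type coefficient of determination the abstract reports can be computed as in this sketch; the (N-SPT, strength) pairs below are hypothetical illustrations, not the borehole data.

```python
def r_squared(x, y):
    """Coefficient of determination R^2 of a least-squares line y ~ a + b*x,
    the statistic the abstract reports per soil type (e.g. R2 = 0.623 for sand)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical (N-SPT blow count, undrained shear strength in kPa) pairs
n_spt = [4, 8, 15, 23, 30]
strength = [25, 50, 90, 140, 185]
r2 = r_squared(n_spt, strength)
```

A value near 1 indicates a usable linear correlation, as the abstract finds for sand; the much lower values for clays motivate the paper's caution about applying SPT to cohesive soil.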
APA, Harvard, Vancouver, ISO, and other styles
40

Ankitha, Ankitha, and Dr H. V. Ravish Aradhya. "A Python based Design Verification Methodology." Journal of University of Shanghai for Science and Technology 23, no. 06 (June 18, 2021): 901–11. http://dx.doi.org/10.51201/jusst/21/05358.

Full text
Abstract:
While the UVM constrained-random and coverage-driven verification methodology revolutionized IP- and unit-level testing, it falls short of SoC-level verification needs. A solution must extend from UVM and enable vertical (IP to SoC) and horizontal (verification engine portability) reuse to completely handle SoC-level verification. To expedite test-case generation and use rapid verification engines, it must also provide a method to collect, distribute, and automatically amplify use cases. Opting for a Python-based design verification approach opens the door to such merits, and cocotb is a useful, growing methodology that can serve this purpose. This paper elaborates on the application of cocotb, an open-source framework hosted on GitHub that provides a coroutine-based cosimulation testbench environment for verifying VHDL/Verilog RTL using Python. It employs design-reuse and functional verification concepts equivalent to UVM's, but is implemented in Python, which is much simpler to understand; this leads to faster development and reduces turnaround time.
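A minimal cocotb test of the kind described might look as follows. The DUT and its ports (clk, a, b, sum, here a registered adder) are assumed for illustration, and the file runs under a cocotb-supported simulator via cocotb's Makefile flow, not as a standalone Python script.

```python
# Assumed DUT: a registered adder with ports clk, a, b, sum.
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge

@cocotb.test()
async def adder_basic_test(dut):
    """Drive two operands and check the registered sum one cycle later."""
    # Start a free-running 10 ns clock in the background.
    cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())
    dut.a.value = 3
    dut.b.value = 4
    await RisingEdge(dut.clk)   # operands sampled here
    await RisingEdge(dut.clk)   # registered result valid here
    assert dut.sum.value == 7, f"sum was {dut.sum.value}, expected 7"
```

The plain-Python test body, async/await coroutines, and ordinary `assert` statements are what give cocotb its low barrier to entry compared with a UVM class hierarchy.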
APA, Harvard, Vancouver, ISO, and other styles
41

CHEN, T. Y., P. L. POON, and T. H. TSE. "AN INTEGRATED CLASSIFICATION-TREE METHODOLOGY FOR TEST CASE GENERATION." International Journal of Software Engineering and Knowledge Engineering 10, no. 06 (December 2000): 647–79. http://dx.doi.org/10.1142/s0218194000000353.

Full text
Abstract:
This paper describes an integrated methodology for the construction of test cases from functional specifications using the classification-tree method. It is an integration of our extensions to the classification-hierarchy table, the classification tree construction algorithm, and the classification tree restructuring technique. Based on the methodology, a prototype system ADDICT, which stands for AutomateD test Data generation system using the Integrated Classification-Tree method, has been built.
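The classification-tree idea of combining one class per classification into abstract test cases can be sketched as an unpruned cartesian product; the classifications below are hypothetical, and a real tree would add constraints to prune infeasible combinations.

```python
from itertools import product

# Hypothetical classifications for a function under test; the
# classification-tree method picks one class from each classification
# to form an abstract test case.
CLASSIFICATIONS = {
    "payment": ["cash", "card"],
    "customer": ["new", "returning"],
    "amount": ["zero", "small", "large"],
}

def test_frames(classifications):
    """Yield every combination of one class per classification."""
    names = sorted(classifications)
    for combo in product(*(classifications[n] for n in names)):
        yield dict(zip(names, combo))

frames = list(test_frames(CLASSIFICATIONS))  # 2 * 2 * 3 = 12 combinations
```

Each resulting dict is an abstract test case that still needs concrete test data, which is where a table like the paper's classification-hierarchy table comes in.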
APA, Harvard, Vancouver, ISO, and other styles
42

Soleimani, Morteza, Mohammad Pourgol-Mohammad, Ali Rostami, and Ahmad Ghanbari. "Design for Reliability of Complex System: Case Study of Horizontal Drilling Equipment with Limited Failure Data." Journal of Quality and Reliability Engineering 2014 (November 30, 2014): 1–13. http://dx.doi.org/10.1155/2014/524742.

Full text
Abstract:
Reliability is an important phase in durable system design, specifically in the early phase of product development. In this paper, a new methodology is proposed for design for reliability of complex systems. The scarcity of product-specific test and field failure data is examined as a challenge to implementing design for reliability for a new product. In the developed approach, modeling and simulation of the system are accomplished using the reliability block diagram (RBD) method. Generic data are corrected to account for the effects of the design and environment on the application. The integral methodology evaluates the reliability of the system and assesses the importance of each component. In addition, the availability of the system is evaluated using Monte Carlo simulation. Available design alternatives with different components are analyzed for reliability optimization. Evaluating the reliability of complex systems in competitive design attempts is one application of this method. The advantage of this method is that it is applicable in the early design phase, where only limited failure data are available. As a case study, horizontal drilling equipment is used to assess the proposed method. Benchmarking the results against a system with more available failure and maintenance data verifies the effectiveness and performance quality of the presented method.
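The RBD evaluation plus Monte Carlo check described above can be sketched for a toy system: a pump in series with a redundant motor pair. The structure and reliability figures are illustrative, and the Monte Carlo here cross-checks the analytic reliability (a simplified stand-in for the paper's availability simulation).

```python
import random

def series(*rel):
    """Reliability of components in series: all must work."""
    p = 1.0
    for r in rel:
        p *= r
    return p

def parallel(*rel):
    """Reliability of redundant components: at least one must work."""
    q = 1.0
    for r in rel:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical RBD: a pump in series with a redundant pair of motors
def system_reliability(r_pump, r_motor):
    return series(r_pump, parallel(r_motor, r_motor))

analytic = system_reliability(0.95, 0.90)  # 0.95 * (1 - 0.1**2) = 0.9405

def monte_carlo(r_pump, r_motor, n=100_000, seed=7):
    """Estimate system reliability by sampling component states."""
    rng = random.Random(seed)
    up = sum(
        1 for _ in range(n)
        if rng.random() < r_pump and (rng.random() < r_motor or rng.random() < r_motor)
    )
    return up / n

estimate = monte_carlo(0.95, 0.90)
```

Component importance can then be assessed by perturbing one reliability at a time and observing the change in the system figure.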
APA, Harvard, Vancouver, ISO, and other styles
43

Brooks, Amy C., Manousos Foudoulakis, Hanna S. Schuster, and James R. Wheeler. "Historical control data for the interpretation of ecotoxicity data: are we missing a trick?" Ecotoxicology 28, no. 10 (November 6, 2019): 1198–209. http://dx.doi.org/10.1007/s10646-019-02128-9.

Full text
Abstract:
Wildlife can be exposed to chemicals in the environment from various anthropogenic sources. Ecotoxicity studies, undertaken to address the risks from potential exposure to chemicals, vary in their design e.g. duration of exposure, effect types and endpoints measured. Ecotoxicity studies measure biological responses to test item exposure. Responses can be highly variable, with limited opportunity for control of extrinsic sources of variability. It is critical to distinguish between treatment-related effects and background ‘normal variability’ when interpreting results. Historical control data (HCD) can be a valuable tool in contextualising results from single studies against previous studies performed under similar conditions. This paper discusses the case for better use of HCD in ecotoxicology assessments, illustrating with case studies the value and difficulties of using HCD in interpretation of results of standard and higher-tier study designs. HCD are routinely used in mammalian toxicology for human health assessments, but not directly in ecotoxicology. The possible reasons for this are discussed e.g., different data types, the potential to mask effects, and the lack of guidance. These concerns are real but not insurmountable and we would like to see organisations such as OECD, EFSA and USEPA develop guidance on the principles of HCD collection. Hopefully, this would lead to greater use of HCD and regulatory acceptance. We believe this is not only a scientifically valid approach but also an ethical issue that is in line with societally driven legal mandates to minimise the use of vertebrate testing in chemical regulatory decision making.
APA, Harvard, Vancouver, ISO, and other styles
44

Schmidt, D., M. Krämer, T. Kuhn, and N. Wehn. "Energy modelling in sensor networks." Advances in Radio Science 5 (June 13, 2007): 347–51. http://dx.doi.org/10.5194/ars-5-347-2007.

Full text
Abstract:
Abstract. Wireless sensor networks are one of the key enabling technologies for the vision of ambient intelligence. Energy resources for sensor nodes are very scarce. A key challenge is the design of energy-efficient communication protocols. Models of energy consumption are needed to accurately simulate the efficiency of a protocol or application design, and can also be used for automatic energy optimizations in a model-driven design process. We propose a novel methodology to create models for sensor nodes based on a few simple measurements. In a case study, the methodology was used to create models for MICAz nodes. The models were integrated into a simulation environment as well as into an SDL runtime framework of a model-driven design process. Measurements on a test application that was created automatically from an SDL specification showed an 80% reduction in energy consumption compared to an implementation without power-saving strategies.
APA, Harvard, Vancouver, ISO, and other styles
45

Phadnis, Milind A., and Matthew S. Mayo. "Group sequential design for time-to-event data using the concept of proportional time." Statistical Methods in Medical Research 29, no. 7 (October 1, 2019): 1867–90. http://dx.doi.org/10.1177/0962280219876313.

Full text
Abstract:
Sequential monitoring of efficacy and safety is an important part of clinical trials. A group sequential design allows researchers to perform interim monitoring after groups of patients have completed the study. The statistical literature is well developed for continuous and binary outcomes and relies on asymptotic normality of the test statistic. However, in the case of time-to-event data, existing methods of sample size calculation assume either proportional hazards or exponentially distributed lifetimes. In scenarios where these assumptions do not hold, as evidenced from historical data, these traditional methods are restrictive and cannot always be used. As interim monitoring is driven by ethical, financial, and administrative considerations, it is imperative that sample size calculations be done in an efficient manner, keeping in mind the specific needs of a clinical trial with a time-to-event outcome. To address these issues, a novel group sequential design is proposed using the concept of Proportional Time. This method utilizes the generalized gamma ratio distribution to calculate the efficacy and safety boundaries and can be used for all distributions that are members of the generalized gamma family, using an error spending approach. The design incorporates features specific to survival data such as loss to follow-up, administrative censoring, varying accrual times and patterns, binding or non-binding futility rules with or without skips, and flexible alpha and beta spending mechanisms. Three practical examples are discussed, followed by a discussion of the important aspects of the proposed design.
APA, Harvard, Vancouver, ISO, and other styles
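The error spending idea underlying this abstract can be illustrated with a minimal sketch. Note this is not the paper's method: the boundaries here use a Lan–DeMets O'Brien–Fleming-type spending function as a common stand-in, whereas the paper derives boundaries from the generalized gamma ratio distribution. The spending function computes the cumulative type-I error allowed at each interim look.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p: float, lo: float = -10.0, hi: float = 10.0) -> float:
    """Inverse standard normal CDF by bisection (accurate enough for a sketch)."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def obf_alpha_spent(t: float, alpha: float = 0.05) -> float:
    """Cumulative type-I error spent at information fraction t, using the
    O'Brien-Fleming-type spending function of Lan & DeMets:
        alpha*(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t)))
    so very little alpha is spent at early looks and alpha*(1) = alpha."""
    z = norm_ppf(1.0 - alpha / 2.0)
    return 2.0 * (1.0 - norm_cdf(z / math.sqrt(t)))

# Cumulative alpha spent at three equally spaced interim looks:
for t in (1 / 3, 2 / 3, 1.0):
    print(f"t={t:.2f}  cumulative alpha spent = {obf_alpha_spent(t):.4f}")
```

The monitoring boundary at each look is then chosen so that the probability of crossing it under the null equals the increment in alpha spent since the previous look; that step requires the joint distribution of the test statistics and is omitted here.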
46

Wang, Xuan, Qiang Yi Luo, and Ling Ling Li. "Research on Automatic Test Method of VMF Codecs." Applied Mechanics and Materials 511-512 (February 2014): 1189–95. http://dx.doi.org/10.4028/www.scientific.net/amm.511-512.1189.

Full text
Abstract:
Testing of VMF codecs aims to find errors. This paper proposes an automatic test method for VMF codecs that uses the VMF Data Element Dictionary as input data. First, the test process for the pack & unpack functions is designed using legal and illegal test cases. The paper then discusses which test cases are suitable for automatic generation and proposes an automatic test case generation method consisting of the following steps: design of the VMF Data Element Dictionary, message format analysis, generation of the message storage struct, generation of the message pack & unpack functions, generation of legal test cases, and generation of illegal test cases. Finally, the method was implemented and test experiments were conducted; the results show that it is feasible and effective.
APA, Harvard, Vancouver, ISO, and other styles
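The legal/illegal test case generation described in this abstract can be sketched in miniature. Everything concrete below is invented for illustration: the dictionary fragment, field names, and the range-checking `pack` stand-in are not taken from the paper, which generates its pack & unpack functions from the actual VMF Data Element Dictionary.

```python
def generate_cases(dictionary):
    """For each field in a data element dictionary (name -> (min, max)),
    emit legal values (both bounds and the midpoint) and illegal values
    (just outside each bound), tagged with their expected validity."""
    cases = []
    for name, (lo, hi) in dictionary.items():
        for v in (lo, (lo + hi) // 2, hi):
            cases.append((name, v, True))   # legal: inside the range
        for v in (lo - 1, hi + 1):
            cases.append((name, v, False))  # illegal: outside the range
    return cases

# An invented fragment of a data element dictionary: field -> (min, max)
vmf_dict = {"unit_id": (0, 1023), "latitude_deg": (-90, 90)}
cases = generate_cases(vmf_dict)

def pack(name, value):
    """Stand-in for a generated pack function: rejects out-of-range values."""
    lo, hi = vmf_dict[name]
    if not lo <= value <= hi:
        raise ValueError(f"{name}={value} out of range")
    return value

# Drive the codec under test with every generated case and check that
# legal cases are accepted and illegal cases are rejected.
for name, value, legal in cases:
    try:
        pack(name, value)
        assert legal, f"illegal case accepted: {name}={value}"
    except ValueError:
        assert not legal, f"legal case rejected: {name}={value}"
```

The point of the technique is that the test data is derived mechanically from the dictionary, so new or changed fields get boundary coverage without hand-written cases.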
47

Wiemer, Hajo, Lucas Drowatzky, and Steffen Ihlenfeldt. "Data Mining Methodology for Engineering Applications (DMME)—A Holistic Extension to the CRISP-DM Model." Applied Sciences 9, no. 12 (June 13, 2019): 2407. http://dx.doi.org/10.3390/app9122407.

Full text
Abstract:
The value of data analytics is fundamental in cyber-physical production systems for tasks like optimization and predictive maintenance. The de facto standard for conducting data analytics in industrial applications is the CRISP-DM methodology. However, CRISP-DM does not specify a data acquisition phase within production scenarios. With this paper, we present DMME as an extension to the CRISP-DM methodology specifically tailored for engineering applications. It provides a communication and planning foundation for data analytics within the manufacturing domain, including the design and evaluation of the infrastructure for process-integrated data acquisition. In addition, the methodology includes design-of-experiments capabilities to systematically and efficiently identify relevant interactions. The procedure of the DMME methodology is presented in detail, and an example project illustrates the workflow. This case study was part of a collaborative project with an industrial partner who wanted an application to detect marginal lubrication in the linear guideways of a servo-driven axle based only on data from the drive controller. Decision trees, trained with experimental data, detect the lubrication state. Several experiments, taking the lubrication state, velocity, and load on the slide into account, provide the training and test datasets.
APA, Harvard, Vancouver, ISO, and other styles
48

Hwang, Chao Lung, Chun Tsun Chen, Hsiu Lung Huang, Sheng Szu Peng, Le Anh Tuan Bui, and Yuan Yi Yan. "The Design and Case Study of Pervious Concrete Materials." Advanced Materials Research 287-290 (July 2011): 781–84. http://dx.doi.org/10.4028/www.scientific.net/amr.287-290.781.

Full text
Abstract:
This study focuses mainly on the design of pervious concrete material and its application to pavement and plant growth. The test results show: 1. The addition of fine aggregate yields better binder quality. 2. Both small aggregate size and good aggregate gradation improve strength, but reduce void ratio and permeability. 3. A data bank of the relationships among the strength, void ratio, and permeability of pervious concrete was built. 4. A practical pavement case study demonstrates excellent permeability compared to the local code, as well as good quality and normal grass growth.
APA, Harvard, Vancouver, ISO, and other styles
49

Baer, Atar, Meaghan S. Fagalde, Curtis D. Drake, Elisabeth H. Sohlberg, Elizabeth Barash, Sara Glick, Alexander J. Millman, and Jeffrey S. Duchin. "Design of an Enhanced Public Health Surveillance System for Hepatitis C Virus Elimination in King County, Washington." Public Health Reports 135, no. 1 (December 13, 2019): 33–39. http://dx.doi.org/10.1177/0033354919889981.

Full text
Abstract:
Introduction: With the goal of eliminating hepatitis C virus (HCV) as a public health problem in Washington State, Public Health–Seattle & King County (PHSKC) designed a Hepatitis C Virus Test and Cure (HCV-TAC) data system to integrate surveillance, clinical, and laboratory data into a comprehensive database. The intent of the system was to promote identification, treatment, and cure of HCV-infected persons (ie, the HCV care cascade) using a population health approach. Materials and Methods: The data system automatically integrated case reports received via telephone and fax from health care providers and laboratories, hepatitis test results reported via electronic laboratory reporting, and data on laboratory and clinic visits reported by 6 regional health care systems. PHSKC examined patient-level laboratory test results and established HCV case classification using Council of State and Territorial Epidemiologists criteria, classifying patients as confirmed if they had detectable HCV RNA. Results: The data enabled PHSKC to report the number of patients at various stages along the HCV care cascade. Of 7747 HCV RNA-positive patients seen by a partner site, 5377 (69%) were assessed for severity of liver fibrosis, 3932 (51%) were treated, and 2592 (33%) were cured. Practice Implications: Data supported local public health surveillance and HCV program activities. The data system could serve as a foundation for monitoring future HCV prevention and control programs.
APA, Harvard, Vancouver, ISO, and other styles
50

Celikovic, Milan, Ivan Lukovic, Slavica Aleksic, and Vladimir Ivancevic. "A MOF based meta-model and a concrete DSL syntax of IIS*Case PIM concepts." Computer Science and Information Systems 9, no. 3 (2012): 1075–103. http://dx.doi.org/10.2298/csis120203034c.

Full text
Abstract:
In this paper, we present a platform independent model (PIM) of the IIS*Case tool for information system (IS) design. IIS*Case is a model driven software tool that provides generation of executable application prototypes. The concepts are described by the Meta Object Facility (MOF) specification, one of the commonly used approaches for describing meta-models. One of the main reasons for having the IIS*Case PIM concepts specified through the meta-model is to provide software documentation in a formal way, as well as a domain analysis aimed at creating a domain-specific language to support IS design. Using the PIM meta-model, we can generate test cases that may assist in software tool verification. The meta-model may also be a good basis for generating a concrete syntax for a domain-specific language.
APA, Harvard, Vancouver, ISO, and other styles