A selection of scholarly literature on the topic "Data-driven test case design"

Cite a source in APA, MLA, Chicago, Harvard, or another citation style

Choose a type of source:

Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Data-driven test case design".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and a bibliographic reference for the chosen work will be generated automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read an online annotation of the work, provided the relevant parameters are available in its metadata.

Journal articles on the topic "Data-driven test case design"

1

Prakash Shrivastava, Divya. "Unit Test Case Design Metrics in Test Driven Development". International Journal of Software Engineering 2, no. 3 (August 31, 2012): 43–48. http://dx.doi.org/10.5923/j.se.20120203.01.

2

Zhao, Jian Ping, Xiao Yang Liu, Hong Ming Xi, Li Ya Xu, Jian Hui Zhao, and Huan Ming Liu. "A Lightweight-Leveled Software Automated Test Framework". Advanced Materials Research 834-836 (October 2013): 1919–24. http://dx.doi.org/10.4028/www.scientific.net/amr.834-836.1919.

Annotation:
To address the problem of large numbers of automated test scripts and test data files, a test automation framework based on a three-layer data-driven mechanism is designed around the test tool QTP and its data-driven and keyword-driven testing mechanisms. The framework comprises the TestSet, which manages test case files; the TestCase, which stores test cases; and the TestData, which stores test data. By controlling the test scale and applying a test data pool, the test scripts are made reconfigurable and optimized. These methods decouple test design from script development, make test cases and test data easier to read, allow test scripts and test data to be optimized and reused at the business level, and reduce the number of script files and test data files to a minimum, which lowers the storage required.
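The three-layer separation described in this abstract (test-set management, test cases, test data) is the general pattern of data-driven test design. A minimal, framework-agnostic sketch in Python (all names are hypothetical; this is not the QTP-based framework from the paper) might look like this:

```python
# Minimal data-driven testing sketch: one generic test case is executed
# once per row of a shared data pool, so test logic and test data stay
# decoupled.

def login(username, password):
    """Toy system under test: accepts exactly one credential pair."""
    return username == "admin" and password == "secret"

# TestData layer: a pool of input rows plus expected outcomes.
TEST_DATA = [
    {"username": "admin", "password": "secret", "expected": True},
    {"username": "admin", "password": "wrong",  "expected": False},
    {"username": "guest", "password": "secret", "expected": False},
]

# TestCase layer: one parameterized case, reused for every data row.
def run_login_case(row):
    return login(row["username"], row["password"]) == row["expected"]

# TestSet layer: drives the case over the whole data pool.
def run_test_set(data):
    return [run_login_case(row) for row in data]

results = run_test_set(TEST_DATA)
```

Adding a scenario then means adding a data row rather than another script, which is the kind of decoupling the authors aim for.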
3

Hanhela, Matti, Olli Gröhn, Mikko Kettunen, Kati Niinimäki, Marko Vauhkonen, and Ville Kolehmainen. "Data-Driven Regularization Parameter Selection in Dynamic MRI". Journal of Imaging 7, no. 2 (February 20, 2021): 38. http://dx.doi.org/10.3390/jimaging7020038.

Annotation:
In dynamic MRI, sufficient temporal resolution can often only be obtained using imaging protocols which produce undersampled data for each image in the time series. This has led to the popularity of compressed sensing (CS) based reconstructions. One problem in CS approaches is determining the regularization parameters, which control the balance between data fidelity and regularization. We propose a data-driven approach for total variation regularization parameter selection, in which the reconstructions yield expected sparsity levels in the regularization domains. The expected sparsity levels are obtained from the measurement data for temporal regularization and from a reference image for spatial regularization. Two formulations are proposed: a simultaneous search for a parameter pair yielding the expected sparsity in both domains (S-surface), and a sequential parameter selection using the S-curve method (Sequential S-curve). The approaches are evaluated using simulated and experimental DCE-MRI. In the simulated test case, both methods produce a parameter pair and reconstruction close to the root mean square error (RMSE) optimal ones. In the experimental test case, the methods produce almost identical parameter selections, and the reconstructions are of high perceived quality. Both methods lead to a highly feasible selection of the regularization parameters in both test cases, while the sequential method is computationally more efficient.
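The S-curve idea, tuning the regularization weight until the solution reaches a prescribed sparsity level, can be illustrated on a toy soft-thresholding problem. This sketch is not the authors' DCE-MRI code; the signal, target, and all names are invented. For plain soft-thresholding the sparsity of the solution grows monotonically with the weight, so a bisection suffices:

```python
# S-curve-style parameter selection (illustrative): pick lam so that the
# regularized solution has the expected fraction of zero coefficients.

def soft_threshold(y, lam):
    # Proximal operator of lam * ||x||_1, applied entrywise.
    return [max(abs(v) - lam, 0.0) * (1 if v >= 0 else -1) for v in y]

def sparsity(x, tol=1e-12):
    """Fraction of (near-)zero coefficients."""
    return sum(1 for v in x if abs(v) <= tol) / len(x)

def select_lambda(y, target_sparsity, lo=0.0, hi=None, iters=60):
    # Bisection on lam: sparsity(soft_threshold(y, lam)) is monotone in lam.
    hi = hi if hi is not None else max(abs(v) for v in y)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sparsity(soft_threshold(y, mid)) < target_sparsity:
            lo = mid          # not sparse enough yet -> increase lam
        else:
            hi = mid          # sparse enough -> try a smaller lam
    return hi

signal = [0.1, -0.2, 3.0, 0.05, -4.0, 0.15, 2.5, -0.1]
lam = select_lambda(signal, target_sparsity=5 / 8)
solution = soft_threshold(signal, lam)
```

The paper's S-surface variant searches a parameter pair in two domains simultaneously; the sequential variant above mirrors the cheaper one-parameter-at-a-time S-curve strategy.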
4

Lyu, Zhi-Jun, Qi Lu, YiMing Song, Qian Xiang, and Guanghui Yang. "Data-Driven Decision-Making in the Design Optimization of Thin-Walled Steel Perforated Sections: A Case Study". Advances in Civil Engineering 2018 (2018): 1–14. http://dx.doi.org/10.1155/2018/6326049.

Annotation:
Rack columns have distinctive design characteristics: they contain regular perforations to facilitate installation of the rack system, which makes them more difficult to analyze with traditional cold-formed steel design theory or standards. The emergence of industrial "big data" has inspired innovative thinking across fields including science, engineering, and business. The main contribution of this paper is that, using engineering data from finite element simulation and physical tests, a novel data-driven model (DDM) based on artificial neural network technology is proposed for the design optimization of thin-walled steel perforated members. The data-driven model, built on machine learning, can provide more effective support for decision-making in the innovative design of steel members. The results of the case study indicate that, compared with traditional finite element simulation and physical testing, the DDM appears very promising for the hard problem of designing complicated perforated steel columns.
5

Robertson, P. K., R. G. Campanella, P. T. Brown, I. Grof, and J. M. O. Hughes. "Design of axially and laterally loaded piles using in situ tests: A case history". Canadian Geotechnical Journal 22, no. 4 (November 1, 1985): 518–27. http://dx.doi.org/10.1139/t85-072.

Annotation:
A 915 mm diameter steel pipe pile was driven and tested by the B.C. Ministry of Transportation and Highways as part of their foundation studies for the proposed Annacis channel crossing of the Fraser River. The pile was driven open ended to a maximum depth of 94 m and was tested axially to failure when the pile tip was at depths of 67, 78, and 94 m below ground surface. Following the final axial load test, the pile was loaded laterally to a total deflection at the ground surface of 150 mm. A slope indicator casing was installed in the pile to monitor the deflected shape during lateral loading. Adjacent to the pile, a piezometer friction cone penetration test (CPT) and a full-displacement pressuremeter profile were made. Results of the axial and lateral load tests are presented along with the data from the CPT and the full-displacement pressuremeter tests, together with several analyses that use these data to predict the axial and lateral performance of the pile. The predicted and measured axial and lateral behaviour of the pile are compared and discussed, and excellent agreement was found. Key words: pile load test, cone penetration test, pressuremeter test.
6

SADEGHI, ALIREZA, and SEYED-HASSAN MIRIAN-HOSSEINABADI. "MBTDD: MODEL BASED TEST DRIVEN DEVELOPMENT". International Journal of Software Engineering and Knowledge Engineering 22, no. 08 (December 2012): 1085–102. http://dx.doi.org/10.1142/s0218194012500295.

Annotation:
Test Driven Development (TDD), as a quality promotion approach, suffers from some shortcomings that discourage its use. One of the most challenging is its low level of granularity and abstraction, which may lead to software that is not acceptable to end users; moreover, TDD is difficult to apply in enterprise systems development. To overcome these defects, we have merged TDD with Model Based Testing (MBT) and propose a framework named Model Based Test Driven Development (MBTDD). In TDD, writing test cases comes before programming; in our improved method, modeling precedes writing test cases. To validate the applicability of the proposed framework, we implemented a use case of a Human Resource Management (HRM) system by means of MBTDD. The empirical results of using MBTDD show that our proposed method overcomes the existing deficiencies of TDD.
7

Lerro, Angelo, Alberto Brandl, Manuela Battipede, and Piero Gili. "A Data-Driven Approach to Identify Flight Test Data Suitable to Design Angle of Attack Synthetic Sensor for Flight Control Systems". Aerospace 7, no. 5 (May 23, 2020): 63. http://dx.doi.org/10.3390/aerospace7050063.

Annotation:
Digital avionic solutions make advanced flight control systems available also on smaller aircraft. One of the safety-critical segments is the air data system. Innovative architectures allow the use of synthetic sensors, which can introduce significant technological and safety advances; their application to aerodynamic angles seems the most promising route towards certified applications. In this area, the best procedures for designing synthetic sensors are still an open question in the field, as exemplified by the MIDAS project funded in the frame of Clean Sky 2. This paper proposes two data-driven methods that allow performance to be improved over the entire flight envelope, with particular attention to steady-state flight conditions. The resulting training set is considerably undersized, with a consequent reduction in computational cost. The methods are validated on a real case and will be used as part of the MIDAS life cycle. The first method, called Data-Driven Identification and Generation of Quasi-Steady States (DIGS), is based on (i) identification of the lift curve of the aircraft and (ii) augmentation of the training set with artificial flight data points; its main aim is to reduce the problem of an unbalanced training set. The second method, called Similar Flight Test Data Pruning (SFDP), performs data reduction based on the isolation of quasi-unique points. The results give evidence of the validity of the methods for the MIDAS project, and they can easily be adopted for generic synthetic sensor design for flight control system applications.
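The pruning step behind SFDP, keeping only quasi-unique data points, can be sketched as follows. This is an illustrative reimplementation of the general idea, not the authors' code; the tolerance and the flight-data tuples are invented:

```python
# SFDP-style pruning sketch: drop any point that lies within a tolerance
# (Chebyshev distance) of a point already kept, so the training set
# retains only quasi-unique samples.

def prune(points, tol):
    kept = []
    for p in points:
        # keep p only if it is farther than tol from every kept point
        if all(max(abs(a - b) for a, b in zip(p, q)) > tol for q in kept):
            kept.append(p)
    return kept

# toy (angle_of_attack, airspeed) samples; the first two are near-duplicates
flight_data = [(0.10, 5.0), (0.11, 5.0), (0.10, 5.1), (0.40, 8.0)]
reduced = prune(flight_data, tol=0.05)
```

A greedy first-come-first-kept scan like this is order dependent; the actual method would choose its distance measure and tolerance per signal.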
8

Pahwa, Payal, and Renu Miglani. "Test Case Design using Black Box Testing Techniques for Data Mart". International Journal of Computer Applications 109, no. 3 (January 16, 2015): 18–22. http://dx.doi.org/10.5120/19169-0636.

9

Naddeo, Alessandro, and Nicola Cappetti. "Comfort driven design of innovative products: A personalized mattress case study". Work 68, s1 (January 8, 2021): S139–S150. http://dx.doi.org/10.3233/wor-208013.

Annotation:
BACKGROUND: Human-centred design asks for the wellbeing and comfort of the customer/worker when interacting with a product. A good perception model and an objective method for evaluating the (dis)comfort experienced by the product's user are needed to perform a preventive comfort evaluation as early as possible in the product development plan. The mattress of a bed is a typical product whose relevance in people's everyday life is underestimated. Fortunately, this attitude is changing quickly: customers want to understand the product they buy and ask for more comfortable, scientifically assessed products. No guidelines for designing a personalized mattress are available in the literature. OBJECTIVES: This study describes the experience of designing an innovative product whose product development plan is focused on the customer's perceived comfort: a personalized mattress. The research question is: which method can be used to innovate or create a comfort-driven, human-centred product? METHODS: Virtual prototyping was used to develop a correlated numerical model of the mattress. A comfort model for preventively assessing perceived comfort was proposed and experimentally tested. Mattress testing sessions with subjects were organized, and the collected data were compared with already-tested mattresses. Brainstorming and multi-expert methods were used to propose, realize, and test an archetype of a new mattress for final comfort assessment. RESULTS: A new reconfigurable mattress was developed, resulting in two patents. The mattress design shows that personalized products can be tuned according to the anthropometric data of the customer in order to improve the comfort experience during sleep. CONCLUSIONS: A "comfort-driven design guideline" was proposed; the method is based on the use of virtual prototyping, virtual optimization, and physical prototyping and testing. It made it possible to improve an existing product and to bring innovation to it.
10

CHO, YONGSUN, WOOJIN LEE, and KIWON CHONG. "THE TECHNIQUE OF BUSINESS MODEL DRIVEN ANALYSIS AND TEST DESIGN FOR DEVELOPMENT OF WEB APPLICATIONS". International Journal of Software Engineering and Knowledge Engineering 15, no. 04 (August 2005): 587–605. http://dx.doi.org/10.1142/s0218194005002452.

Annotation:
A technique for creating analysis models of a web application from a business model is proposed for easy and effective development. Moreover, a technique for generating test cases from the sequence diagrams for a web application is proposed. The use case diagram and web page list are generated from the business model which is depicted using the notations of the UML (Unified Modeling Language) activity diagram. The page diagram and logical/physical database models are created based on the web page list and extracted data fields. Test cases for the web application are generated from call messages (including self-call messages) of the UML sequence diagram. The effectiveness of these techniques is shown using a practical case study which is a development project of a web application for RMS (Research material Management System).
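The mapping from sequence-diagram call messages to test cases can be sketched generically in Python. The class and message names below are invented for illustration; this is not the authors' tool:

```python
# Sketch: derive one test case per call message of a UML sequence
# diagram, including self-call messages, as the paper proposes.

# A sequence diagram reduced to its ordered call messages:
# (caller, callee, message).
SEQUENCE = [
    ("LoginPage", "AuthController", "submitCredentials"),
    ("AuthController", "AuthController", "validateInput"),  # self-call
    ("AuthController", "UserDao", "findUser"),
    ("AuthController", "LoginPage", "showResult"),
]

def generate_test_cases(sequence):
    cases = []
    for step, (caller, callee, message) in enumerate(sequence, start=1):
        kind = "self-call" if caller == callee else "call"
        cases.append({
            "id": f"TC{step:02d}",
            "description": f"Step {step}: verify {kind} "
                           f"{caller} -> {callee}.{message}()",
        })
    return cases

test_cases = generate_test_cases(SEQUENCE)
```

In practice each generated case would also carry input data and an expected page transition, but the one-case-per-message skeleton is the core of the technique.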

Dissertations and theses on the topic "Data-driven test case design"

1

Lindahl, John, and Douglas Persson. "Data-driven test case design of automatic test cases using Markov chains and a Markov chain Monte Carlo method". Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-43498.

Annotation:
Large and complex software that is frequently changed leads to testing challenges. It is well established that the later a fault is detected in software development, the more it costs to fix. This thesis aims to research and develop a method of generating relevant and non-redundant test cases for a regression test suite, to catch bugs as early in the development process as possible. The research was executed at Axis Communications AB with their products and systems in mind. The approach utilizes user data to dynamically generate a Markov chain model and with a Markov chain Monte Carlo method, strengthen that model. The model generates test case proposals, detects test gaps, and identifies redundant test cases based on the user data and data from a test suite. The sampling in the Markov chain Monte Carlo method can be modified to bias the model for test coverage or relevancy. The model is generated generically and can therefore be implemented in other API-driven systems. The model was designed with scalability in mind and further implementations can be made to increase the complexity and further specialize the model for individual needs.
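The core idea, estimating a Markov chain over API calls from recorded user data and sampling call sequences from it as test-case proposals, can be sketched as follows. This is an illustrative reimplementation, not the Axis implementation; the session data and call names are invented:

```python
# Sketch: build a Markov chain from user sessions, then sample a call
# sequence from it as a test-case proposal.

import random
from collections import defaultdict

SESSIONS = [  # recorded user data: sequences of API calls
    ["login", "list_cameras", "get_stream", "logout"],
    ["login", "list_cameras", "logout"],
    ["login", "get_stream", "get_stream", "logout"],
]

def build_chain(sessions):
    # Count observed transitions, then normalize into probabilities.
    counts = defaultdict(lambda: defaultdict(int))
    for s in sessions:
        for a, b in zip(s, s[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def propose_test_case(chain, start="login", stop="logout", rng=None):
    # Random walk weighted by the learned transition probabilities.
    rng = rng or random.Random(0)
    seq, state = [start], start
    while state != stop and state in chain and len(seq) < 50:
        nxt = chain[state]
        state = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        seq.append(state)
    return seq

chain = build_chain(SESSIONS)
case = propose_test_case(chain, rng=random.Random(42))
```

The thesis additionally biases the sampling (via MCMC) toward coverage or relevancy and compares proposals against an existing suite to flag gaps and redundancy; the sketch shows only the chain-building and sampling skeleton.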
2

Milanez, Marcus Vinícius. "Test Driven Development: uma abordagem baseada em use cases". Pontifícia Universidade Católica de São Paulo, 2014. https://tede2.pucsp.br/handle/handle/18157.

Annotation:
The development of computer programs is a complex activity, characterized by costs and deadlines that are difficult to estimate. Requirements change frequently, resulting in products of variable reliability. Currently, there are no formal theories that completely address these underlying challenges. Several approaches have been used over time to achieve incremental progress, leading to a significant number of programming languages, development processes, and techniques. Test Driven Development (TDD) is a recently developed approach which extends the human capability to develop computer programs by providing tools to mitigate the difficulties mentioned. Although TDD aggregates a set of support and control elements, it does not include mechanisms that directly help developers derive implementations from a set of requirements previously captured and analyzed. As a result of this absence, difficulties in understanding its nature and in structuring the software into independent modules can be observed, ultimately limiting the impact of TDD on the reliability of software. The objective of this research is to overcome these shortcomings, complementing Kent Beck's TDD proposal by introducing a modeling stage guided by Use Cases, following the ideas of Ivar Jacobson and Wirfs-Brock. Through this approach, assessed by a case study conducted together with industry professionals, enhancements in the TDD usage experience could be observed, altering the manner in which this proposal is commonly understood, used, and evaluated.
3

Gu, Sisi. "INTERFACE-DRIVEN DESIGN: A CASE STUDY IN DEEP BRAIN STIMULATION DATA MANAGEMENT". Case Western Reserve University School of Graduate Studies / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=case1417727936.

4

Ljungren, Joakim. "Data-driven design for sustainable behavior : A case study in using data and conversational interfaces to influence corporate settlement". Thesis, Umeå universitet, Institutionen för tillämpad fysik och elektronik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-139199.

Annotation:
Interaction with digital products and interfaces accounts for more and more of human decision-making, and the problems of environmental, financial, and social sustainability are largely consequences of our behavior. The issues and goals of sustainable development therefore imply that we have to think differently about digital design. In this paper, we examine the adequacy of influencing sustainable behavior with a data-driven design approach, applying a conversational user interface. A case study regarding the United Nations' goals of technological development and economic distribution was conducted to see whether a hypothetical business with a proof-of-concept digital product could be effective in influencing where companies base their operations. The test results showed a lack of usability and influence, but still suggested a potential for language-based interfaces. Even though the results are not conclusive, we argue that leveraging data analysis to design for sustainable behavior can be a very valuable strategy: a data-driven approach can enable ambitions of profit and user experience to coincide with those of sustainability within a business organization.
5

Ogallo, Godfrey G. "IoT – Enhancing Data-driven Decision-making in Higher Education. Case Study of Ohio University". Ohio University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1516193584144817.

6

Hoyt, Kristin. "Teacher voice and participation in shaping large-scale standards-driven testing : the case of teacher involvement in the design and construction of a third year high school French end-of-course exam, based on The Indiana Academic Standards for Foreign Languages". [Bloomington, Ind.] : Indiana University, 2005. http://wwwlib.umi.com/dissertations/fullcit/3202896.

7

Abler, Daniel Jakob Silvester. "Software architecture for capturing clinical information in hadron therapy and the design of an ion beam for radiobiology". Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:c2d9cf79-7b2d-4feb-bb17-53f003a8557c.

Annotation:
Hadron Therapy (HT) exploits properties of ion radiation to gain therapeutic advantages over existing photon-based forms of external radiation therapy. However, its relative superiority and cost-effectiveness have not been proven for all clinical situations. Establishing a robust evidence base for the development of best treatment practices is one of the major challenges for the field. This thesis investigates two research infrastructures for building this essential evidence. First, the thesis develops main components of a metadata-driven software architecture for the collection of clinical information and its analysis. This architecture acknowledges the diversity in the domain and supports data interoperability by sharing information models. Their compliance to common metamodels guarantees that primary data and analysis results can be interpreted outside of the immediate production context. This is a fundamental necessity for all aspects of the evidence creation process. A metamodel of data capture forms is developed with unique properties to support data collection and documentation in this architecture. The architecture's potential to support complex analysis processes is demonstrated with the help of a novel metamodel for Markov model based simulations, as used for the synthesis of evidence in health-economic assessments. The application of both metamodels is illustrated on the example of HT. Since the biological effect of particle radiation is a major source of uncertainty in HT, in its second part, this thesis undertakes first investigations towards a new research facility for bio-medical experiments with ion beams. It examines the feasibility of upgrading LEIR, an existing accelerator at the European Organisation for Nuclear Research (CERN), with a new slow extraction and investigates transport of the extracted beam to future experiments. 
Possible configurations for the slow-resonant extraction process are identified, and designs for horizontal and vertical beam transport lines developed. The results of these studies indicate future research directions towards a new ion beam facility for biomedical research.
8

Gstalter, Étienne. "Réduction d’ordre de modèle de crash automobile pour l’optimisation masse / prestations". Thesis, Compiègne, 2020. http://www.theses.fr/2020COMP2576.

Annotation:
This thesis is part of a global research effort dedicated to reduced-order modelling applications in the Renault engineering division. The research topic was developed in the IRT SystemX project on Reduced Order Models and Multidisciplinary Optimization (ROM); the earlier CIFRE theses [Vuong] and [Charrier] help to situate the context. The main industrial application of the research theme is the tuning of a car body structure under crash loading; research on acoustics, combustion, and aerodynamics is currently ongoing. This thesis is both a contribution to the generic ReCUR method and its application to the optimization of a car body structure for crash loadings. Engineering teams at Renault rely heavily on optimization to tune the body for crash, using numerical optimization software based on design of experiments. This approach requires many crash simulations, because each simulation is treated as a black box of which only the inputs and outputs are used. The ReCUR method takes the opposite approach: it extracts as much information as possible from every crash simulation, with the aim of strongly reducing the number of simulations required to fit a model.
9

Surendran, Sudhakar. "A Systematic Approach To Synthesis Of Verification Test-Suites For Modular SoC Designs". Thesis, 2006. http://hdl.handle.net/2005/397.

Annotation:
SoCs (Systems on Chip) are complex designs with heterogeneous modules (CPU, memory, etc.) integrated into them. Verification is one of the important stages in designing an SoC: it is the process of checking that the transformation from architectural specification to design implementation is correct. Verification involves creating the following components: (i) a testplan that identifies the conditions to be verified, (ii) a testcase that generates the stimuli to verify the conditions identified, and (iii) a test-bench that applies the stimuli and monitors the output from the design. Verification consumes up to 70% of the total design time, largely due to the complex and manual nature of the verification task. To reduce the time spent verifying the design, the components used for verification can be generated automatically or created at an abstract level (to reduce complexity) and reused. In this work we present a methodology to synthesize testcases from reusable code segments and abstract specifications. Our methodology consists of the following major steps: (i) identifying the structure of testcases, (ii) identifying code segments of testcases that can be reused from one SoC to another, (iii) identifying properties of an SoC and its modules that can be used to synthesize the SoC-specific code segments of the testcase, and (iv) proposing a synthesizer that uses the code segments, the properties, and the abstract specification to synthesize testcases. We discuss two specific classes of testcases, those for verifying memory modules and those for verifying data transfer modules, since together they form a significantly large subset of the device functionality. We implement a prototype testcase generator and present an example to illustrate the use of the methodology for each of these classes. The use of our methodology enables (i) the automatic creation of testcases that are correct by construction and (ii) the reuse of testcase code segments from one SoC to another. Some of the properties (of the modules and the SoC) presented in our work can easily be made part of the architectural specification and hence can further reduce the effort needed to create them.
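Step (iv), the synthesizer, can be pictured as template instantiation: a reusable code segment plus SoC-specific properties yields a concrete testcase. A toy Python sketch (invented names, addresses, and template; not the thesis' generator):

```python
# Sketch: synthesize a memory-verification testcase by filling a reusable
# code segment with properties taken from an abstract SoC specification.

REUSABLE_SEGMENT = """\
# testcase: walk the {name} address space
for addr in range({base:#x}, {base:#x} + {size:#x}, {word:#x}):
    write(addr, 0xA5A5A5A5)
    assert read(addr) == 0xA5A5A5A5
"""

# properties of one memory module, as they might appear in the spec
SOC_PROPERTIES = {"name": "sram0", "base": 0x2000_0000,
                  "size": 0x100, "word": 0x4}

def synthesize_testcase(template, properties):
    # Fill every placeholder in the segment with a module property.
    return template.format(**properties)

testcase = synthesize_testcase(REUSABLE_SEGMENT, SOC_PROPERTIES)
```

Because the segment is generic and the properties come from the specification, the same segment can be reused across SoCs, which is the reuse the thesis argues for.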
10

Menninghaus, Mathias. "Automated Performance Test Generation and Comparison for Complex Data Structures - Exemplified on High-Dimensional Spatio-Temporal Indices". Doctoral thesis, 2018. https://repositorium.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-20180823528.

Annotation:
Numerous approaches exist to index either spatio-temporal or high-dimensional data. None of them can efficiently index hybrid data types, i.e., data that is both spatio-temporal and high-dimensional. As the best high-dimensional indexing techniques are only able to index point data and not now-relative data, and the best spatio-temporal indexing techniques suffer from the curse of dimensionality, this thesis introduces the Spatio-Temporal Pyramid Adapter (STPA). The STPA maps spatio-temporal data to points, maps now-values to the median of the data set, and indexes them with the pyramid technique. For high-dimensional and spatio-temporal index structures, no generally accepted benchmark exists. Most index structures are evaluated only with custom benchmarks and compared to a tiny set of competitors. Benchmarks may be biased, as a structure may be designed to perform well in a certain benchmark, or a benchmark may not cover a certain speciality of the investigated structures. In this thesis, the Interface-Based Performance Comparison (IBPC) technique is introduced. It automatically generates test sets with high code coverage on the system under test (SUT) on the basis of all functions defined by a common interface that all competitors support. Every test set is run on every SUT, and the performance results are weighted by the achieved coverage and summed up. These weighted performance results are then used to compare the structures. An implementation of the IBPC, the Performance Test Automation Framework (PTAF), is compared to a classic custom benchmark, a workload generator whose parameters are optimized by a genetic algorithm, and a specific PTAF alternative that incorporates the specific behavior of the systems under test. This is done for a set of two high-dimensional spatio-temporal indices and twelve variants of the R-tree. The evaluation indicates that PTAF performs at least as well as the other approaches in terms of minimal test cases with maximized coverage.
Several case studies on PTAF demonstrate its broad applicability.
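The coverage-weighted scoring idea described in this abstract can be sketched in a few lines. This is only an illustrative reconstruction of the aggregation step, not the actual PTAF implementation; the function name, data, and numbers are hypothetical:

```python
# Illustrative sketch of the IBPC scoring idea (not the actual PTAF code):
# each generated test set is run on every system under test (SUT), each
# performance result is weighted by the code coverage that test set achieved,
# and the weighted results are summed into one comparable score per SUT.

def ibpc_score(results):
    """results: (performance, coverage) pairs for one SUT, one per test set.
    Higher performance is better; coverage lies in [0, 1]."""
    return sum(perf * cov for perf, cov in results)

# Hypothetical measurements for two index structures on three test sets.
sut_a = [(0.9, 0.8), (0.7, 0.9), (0.8, 0.6)]
sut_b = [(0.95, 0.5), (0.6, 0.7), (0.9, 0.4)]

scores = {"A": ibpc_score(sut_a), "B": ibpc_score(sut_b)}
ranking = sorted(scores, key=scores.get, reverse=True)
```

Under this weighting, a structure that excels only on low-coverage test sets (SUT B above) ranks below one with solid results on test sets that cover it well.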
APA, Harvard, Vancouver, ISO, and other citation styles

Books on the topic "Data-driven test case design"

1

Quackenbush, Stephen L. Empirical Analyses of Deterrence. Oxford University Press, 2017. http://dx.doi.org/10.1093/acrefore/9780190228637.013.313.

Full text of the source
Abstract:
Deterrence is an important subject, and its study has spanned more than seven decades. Much research on deterrence has focused on a theoretical understanding of the subject. Particularly important is the distinction between classical deterrence theory and perfect deterrence theory. Other studies have employed empirical analyses. The empirical literature on deterrence developed at different times and took different approaches. The early empirical deterrence literature was highly limited for varying reasons. Much of the early case study literature did not seek to test deterrence theory. Early quantitative studies did seek to do so, but they were hampered by rudimentary methods, poor research design, and/or a disconnect between quantitative studies and formal theories of deterrence. Modern empirical research on deterrence has made great strides toward bridging the formal-quantitative divide in the study of deterrence and conducting theoretically driven case studies. Further, researchers have explored the effect of specific variables on deterrence, such as alliances, reputations and credibility, and nuclear weapons. Future empirical studies of deterrence should build on these modern developments. In addition, they should build on perfect deterrence theory, given its logical consistency and empirical support.
APA, Harvard, Vancouver, ISO, and other citation styles
2

Davidson, Judith. Research Design in Team-Based Qualitative Research. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190648138.003.0003.

Full text of the source
Abstract:
In the introduction to this chapter and interwoven throughout the text is the message that qualitative research begins and ends in writing, which in this case means that research design is a beginning point for that writing. This chapter is composed of three major sections that illustrate how team start-up is critical to how the writing will proceed down the line. The first section—Team Formation—provides detailed information on issues to consider in establishing the team in a manner that will be most beneficial to the conduct of qualitative research. The second section—Research Design and Project Organization—discusses early writing tasks, establishing a project management system, and the importance of linking all of this to a data archiving plan. Digital tools are discussed in some depth. The third section—Caring: Internalized and Externalized—suggests a novel approach to the issue of ethics and team management.
APA, Harvard, Vancouver, ISO, and other citation styles
3

Thorlakson, Lori. Multi-Level Democracy. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198833505.001.0001.

Full text of the source
Abstract:
All federal systems face an internal tension between divisive and integrative political forces, striking a balance between providing local autonomy and representation on one hand and maintaining an integrated political community on the other hand. How multi-level systems strike this balance depends on the development of styles of either integrated politics, which creates a shared framework for political competition across the units of a federation, or independent politics, preserving highly autonomous arenas of political life. This book argues that the long-term development of integrated or independent styles of politics in multi-level systems can be shaped by two key elements of federal institutional design: the degree of fiscal decentralization, or how much is ‘at stake’ at each level of government, and the degree to which the allocation of policy jurisdiction creates legislative or administrative interdependence or autonomy. These elements of federal institutional design shape integrated and independent politics at the level of party organizations, party systems, and voter behaviour. This book tests these arguments using a mixed-method approach, drawing on original survey data from 250 subnational party leaders and aggregate electoral data from over 2,200 subnational elections in seven multi-level systems: Canada, the United States, Australia, Austria, Germany, Switzerland, and Spain. It supplements this with configurational analysis and qualitative case studies.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Bhopal, Raj S. Concepts of Epidemiology. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198739685.001.0001.

Full text of the source
Abstract:
Epidemiology is a population science that underpins health improvement and health care, and is concerned with the pattern, frequency, trends, and causes of disease. This book teaches its applications to population health research, policy-making, health service planning, health promotion, and clinical care. The book emphasizes concepts and principles. In 10 chapters, the book explains what epidemiology is; illustrates the basis of epidemiology in populations; provides a framework for analysing diseases by time, place, and person; introduces error, bias, and confounding; explains how we move from association to causation; considers the natural history, spectrum, and iceberg of disease in relation to medical screening; discusses the acquisition and analysis of data on incidence and prevalence of risk factors and diseases; shows the ways in which epidemiological data are presented, including relative and absolute risks; provides an integrated overview of study designs and the principles of data analysis; and considers the theoretical and ethical basis of epidemiology both in the past and the future. The emphasis is on interactive learning, with each chapter including learning objectives, theoretical and numerical exercises, questions and answers, and a summary. The text is illustrated, with detailed material in tables. The book is written in plain English, and the necessary technical and specialized terminology is explained and defined in a glossary. The book is for postgraduate courses in epidemiology, public health, and health policy. It is also suitable for clinicians, undergraduate students in medicine, nursing and other health disciplines, and researchers.
APA, Harvard, Vancouver, ISO, and other citation styles
5

Li, Quan. Using R for Data Analysis in Social Sciences. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190656218.001.0001.

Full text of the source
Abstract:
This book seeks to teach undergraduate and graduate students in social sciences how to use R to manage, visualize, and analyze data in order to answer substantive questions and replicate published findings. This book distinguishes itself from other introductory R or statistics books in three ways. First, targeting an audience rarely exposed to statistical programming, it adopts a minimalist approach and covers only the most important functions and skills in R that one will need for conducting reproducible research projects. Second, it emphasizes meeting the practical needs of students using R in research projects. Specifically, it teaches students how to import, inspect, and manage data; understand the logic of statistical inference; visualize data and findings via histograms, boxplots, scatterplots, and diagnostic plots; and analyze data using one-sample t-tests, difference-of-means tests, covariance, correlation, ordinary least squares (OLS) regression, and model assumption diagnostics. Third, it teaches students how to replicate the findings in published journal articles and diagnose model assumption violations. The principle behind this book is to teach students to learn as little R as possible but to do as much reproducible, substance-driven data analysis at the beginner or intermediate level as possible. The minimalist approach dramatically reduces the learning cost but still provides adequate information for meeting the practical research needs of senior undergraduate and beginning graduate students. Having completed this book, students can use R and statistical analysis to answer questions regarding some substantively interesting continuous outcome variable in a cross-sectional design.
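The book's workflow is taught in R; purely as a language-neutral illustration of the two simplest pieces of the toolkit it describes (the data and function names below are made up), a one-sample t statistic and an OLS fit with one predictor can be computed from first principles:

```python
import math

def t_statistic(sample, mu0):
    """One-sample t statistic: (mean - mu0) / (s / sqrt(n))."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    return (mean - mu0) / math.sqrt(var / n)

def ols_slope_intercept(x, y):
    """Ordinary least squares fit y = a + b*x for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b
```

In R these correspond to one-liners (`t.test`, `lm`); spelling the arithmetic out is only meant to make the underlying logic of the inference visible.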
APA, Harvard, Vancouver, ISO, and other citation styles
6

Measuring Quality of Life in Health. Churchill Livingstone, 2004.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
7

Cebreros, Alfonso, Alejandrina Salcedo, Daniel Chiquiar und Aldo Heffner-Rodríguez. Trade Policy Uncertainty and its Effect on Foreign Direct Investment: Evidence from Mexico. Banco de México, 2020. http://dx.doi.org/10.36095/banxico/di.2020.14.

Full text of the source
Abstract:
This paper investigates whether "trade policy uncertainty" (TPU), even absent changes in actual policy, may have an adverse effect on foreign direct investment. The paper focuses on the case of Mexico, where we observe a plausibly sharp and exogenous increase in TPU vis-à-vis a large trading partner beginning in the second half of 2016. To test this hypothesis, we use data from Google Trends to construct a TPU index and argue that this index adequately captures both time-series and cross-sectional variation in TPU across states in Mexico. We exploit this variation to identify the effect of increased uncertainty on FDI flows. We find that the increase in TPU was associated with a negative effect on FDI inflows, with the effect being driven by the negative impact that TPU had on FDI in export-oriented states.
APA, Harvard, Vancouver, ISO, and other citation styles

Book chapters on the topic "Data-driven test case design"

1

Erdogan, Gencer, Atle Refsdal, and Ketil Stølen. "A Systematic Method for Risk-Driven Test Case Design Using Annotated Sequence Diagrams". In Risk Assessment and Risk-Driven Testing, 93–108. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07076-6_7.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
2

Erdogan, Gencer, Atle Refsdal, and Ketil Stølen. "A Systematic Method for Risk-Driven Test Case Design Using Annotated Sequence Diagrams". In Risk Assessment and Risk-Driven Testing, 93–108. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-14114-5_7.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
3

Mondal, Sayani, and Partha Pratim Das. "Effectiveness of Test-Driven Development as an SDLC Model: A Case Study of an Elevator Controller Design". In Lecture Notes in Electrical Engineering, 225–33. New Delhi: Springer India, 2014. http://dx.doi.org/10.1007/978-81-322-1817-3_24.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
4

Qi, Ming, Zekun Yang, Jinghui Liu, Xuefeng Li, and Dengyun Wu. "Research on Optimization Design Method of Reliability Validation Test in the Case of Zero-Failure Data". In Lecture Notes in Electrical Engineering, 627–34. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-33-4102-9_76.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
5

Amin, K. F., and Henry Fonbeyin Abanda. "Developing a Business Case for BIM for a Design and Build Project in Egypt". In Data-Driven Modeling for Sustainable Engineering, 149–59. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13697-0_11.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
6

Varga, Stefan, Joel Brynielsson, Andreas Horndahl, and Magnus Rosell. "Automated Text Analysis for Intelligence Purposes: A Psychological Operations Case Study". In Lecture Notes in Social Networks, 221–51. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41251-7_9.

Full text of the source
Abstract:
Abstract With the availability of an abundance of data through the Internet, the premises to solve some intelligence analysis tasks have changed for the better. The study presented herein sets out to examine whether and how a data-driven approach can contribute to solve intelligence tasks. During a full day observational study, an ordinary military intelligence unit was divided into two uniform teams. Each team was independently asked to solve the same realistic intelligence analysis task. Both teams were allowed to use their ordinary set of tools, but in addition one team was also given access to a novel text analysis prototype tool specifically designed to support data-driven intelligence analysis of social media data. The results, obtained from the case study with a high ecological validity, suggest that the prototype tool provided valuable insights by bringing forth information from a more diverse set of sources, specifically from private citizens that would not have been easily discovered otherwise. Also, regardless of its objective contribution, the capabilities and the usage of the tool were embraced and subjectively perceived as useful by all involved analysts.
APA, Harvard, Vancouver, ISO, and other citation styles
7

Kim, Yoon Jeon, and Jose A. Ruipérez-Valiente. "Data-Driven Game Design: The Case of Difficulty in Educational Games". In Addressing Global Challenges and Quality Education, 449–54. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-57717-9_43.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Simani, Silvio, Saverio Farsoni, and Paolo Castaldi. "Application of Data-Driven Fault Diagnosis Design Techniques to a Wind Turbine Test-Rig". In Lecture Notes in Networks and Systems, 23–38. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-80126-7_3.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
9

Alonso, Luis, Yan Ryan Zhang, Arnaud Grignard, Ariel Noyman, Yasushi Sakai, Markus ElKatsha, Ronan Doorley, and Kent Larson. "CityScope: A Data-Driven Interactive Simulation Tool for Urban Design. Use Case Volpe". In Unifying Themes in Complex Systems IX, 253–61. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-96661-8_27.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
10

Arboretti, Rosa, Riccardo Ceccato, and Luigi Salmaso. "Nonparametric methods for stratified C-sample designs: a case study". In Proceedings e report, 17–22. Florence: Firenze University Press, 2021. http://dx.doi.org/10.36253/978-88-5518-304-8.05.

Full text of the source
Abstract:
Several parametric and nonparametric methods have been proposed to deal with stratified C-sample problems, where the main interest lies in evaluating the presence of a certain treatment effect but the strata effects cannot be overlooked. Stratified scenarios can be found in several different fields. In this paper we focus on a particular case study from the field of education, addressing a typical stochastic ordering problem in the presence of stratification. We are interested in assessing how the performance of students from different degree programs at the University of Padova changes, in terms of university credits and grades, when compared with their entry test results. To address this problem, we propose an extension of the Non-Parametric Combination (NPC) methodology, a permutation-based technique (see Pesarin and Salmaso, 2010), as a valuable tool to improve the data analytics for monitoring University students' careers at the School of Engineering of the University of Padova. This new procedure indeed allows us to assess the efficacy of the University of Padova's entry tests in evaluating and selecting future students.
APA, Harvard, Vancouver, ISO, and other citation styles

Conference papers on the topic "Data-driven test case design"

1

Divya Prakash Shrivastava and R. C. Jain. "Unit test case design metrics in test driven development". In 2011 International Conference on Communications, Computing and Control Applications (CCCA). IEEE, 2011. http://dx.doi.org/10.1109/ccca.2011.6031205.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
2

Causevic, A., D. Sundmark, and S. Punnekkat. "Test case quality in test driven development: a study design and a pilot experiment". In 16th International Conference on Evaluation & Assessment in Software Engineering (EASE 2012). IET, 2012. http://dx.doi.org/10.1049/ic.2012.0029.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
3

Gerking, Christopher, Jan Ladleif, and Wilhelm Schäfer. "Model-driven test case design for model-to-model semantics preservation". In ESEC/FSE'15: Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2804322.2804323.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
4

Ghosh, Dipanjan D., Junghan Kim, Andrew Olewnik, Arun Lakshmanan, and Kemper E. Lewis. "Cyber-Empathic Design: A Data Driven Framework for Product Design". In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59642.

Full text of the source
Abstract:
One of the critical tasks in product design is to map information from the consumer space to the design space. Currently, this process is largely dependent on the designer to identify and map how psychological and consumer-level factors relate to engineered product attributes. In this way, current methodologies lack provision to test a designer's cognitive reasoning and could therefore introduce bias while mapping from consumer to design space. Also, current dominant frameworks do not include user-product interaction data in design decision making, nor do they assist designers in understanding why a consumer has a particular perception about a product. This paper proposes a new framework, Cyber-Empathic Design, in which user-product interaction data is acquired via sensors embedded in the products. To understand the motivations behind consumer perceptions, a network of latent constructs is used, which forms a causal model framework. Structural Equation Modeling is used as the parameter estimation and hypothesis testing technique, making the framework falsifiable in nature. To demonstrate the framework and its effectiveness, a case study of sensor-integrated shoes is presented, in which two models are compared: one based on surveys alone and one using the Cyber-Empathic framework. It is shown that the Cyber-Empathic framework results in improved model fit. The case study also demonstrates the technique for testing a designer's cognitive hypotheses.
APA, Harvard, Vancouver, ISO, and other citation styles
5

Fellner, Andreas, Willibald Krenn, Rupert Schlick, Thorsten Tarrach, and Georg Weissenbacher. "Model-based, mutation-driven test case generation via heuristic-guided branching search". In MEMOCODE '17: 15th ACM-IEEE International Conference on Formal Methods and Models for System Design. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3127041.3127049.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
6

Siniaalto, Maria, and Pekka Abrahamsson. "A Comparative Case Study on the Impact of Test-Driven Development on Program Design and Test Coverage". In First International Symposium on Empirical Software Engineering and Measurement (ESEM 2007). IEEE, 2007. http://dx.doi.org/10.1109/esem.2007.2.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
7

Siniaalto, Maria, and Pekka Abrahamsson. "A Comparative Case Study on the Impact of Test-Driven Development on Program Design and Test Coverage". In First International Symposium on Empirical Software Engineering and Measurement (ESEM 2007). IEEE, 2007. http://dx.doi.org/10.1109/esem.2007.35.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Barrois, Benjamin, Olivier Sentieys, and Daniel Menard. "The hidden cost of functional approximation against careful data sizing — A case study". In 2017 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2017. http://dx.doi.org/10.23919/date.2017.7926979.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
9

Marijan, Dusica, Vladimir Zlokolica, Nikola Teslic, Tarkan Tekcan, and Vukota Pekovic. "User-driven optimized test case design and modeling for end-user device quality inspection". In 2011 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 2011. http://dx.doi.org/10.1109/icce.2011.5722906.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
10

Liu, Kang, Benjamin Tan, Ramesh Karri, and Siddharth Garg. "Poisoning the (Data) Well in ML-Based CAD: A Case Study of Hiding Lithographic Hotspots". In 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2020. http://dx.doi.org/10.23919/date48585.2020.9116489.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles

Reports by organizations on the topic "Data-driven test case design"

1

Rincón-Torres, Andrey Duván, Kimberly Rojas-Silva, and Juan Manuel Julio-Román. The Interdependence of FX and Treasury Bonds Markets: The Case of Colombia. Banco de la República, September 2021. http://dx.doi.org/10.32468/be.1171.

Full text of the source
Abstract:
We study the interdependence of the FX and Treasury bond (TES) markets in Colombia. To do this, we estimate a heteroskedasticity-identified VAR model on the returns of the COP/USD exchange rate (TRM) and bond prices, as well as event-analysis models for return volatilities, number of quotes, quote volume, and bid/ask spreads. The data under analysis consist of 5-minute intraday bid/ask US dollar prices and bond quotes for an assortment of bond species. For these species we also have the number of bid/ask quotes as well as their volume. We found that the exchange rate conveys information to the TES market, but the opposite does not completely hold: a one percent COP depreciation leads to a persistent reduction of TES prices of between 0.05% and 0.22%, whereas a 1% TES price increase has a very small and not entirely significant effect on the exchange rate, i.e., a COP appreciation of between 0.001% and 0.009%. Furthermore, increases in TRM return volatility do not affect bond return volatility but do affect bond liquidity, i.e., the bid/ask quote number and volume. These results are consistent with the fact that the FX market reflects the effect of shocks more efficiently than the TES market, which may be due to the latter's low liquidity and concentration in a specific habitat. These results have implications for the design of financial stability policies as well as for private portfolio design, rebalancing, and hedging.
APA, Harvard, Vancouver, ISO, and other citation styles
2

DeJaeghere, Joan, Vu Dao, Bich-Hang Duong, and Phuong Luong. Inequalities in Learning in Vietnam: Teachers’ Beliefs About and Classroom Practices for Ethnic Minorities. Research on Improving Systems of Education (RISE), February 2021. http://dx.doi.org/10.35489/bsg-rise-wp_2021/061.

Full text of the source
Abstract:
Global and national education agendas are concerned with improving quality and equality of learning outcomes. This paper provides an analysis of the case of Vietnam, which is regarded as having high learning outcomes and less inequality in learning. But national data and international test outcomes may mask the hidden inequalities that exist between minoritized groups and majority (Kinh) students. Drawing on data from qualitative videos and interviews of secondary teachers across 10 provinces, we examine the role of teachers’ beliefs, curricular design and actions in the classroom (Gale et al., 2017). We show that teachers hold different beliefs and engage in curricular design – or the use of hegemonic curriculum and instructional practices that produce different learning outcomes for minoritized students compared to Kinh students. It suggests that policies need to focus on the social-cultural aspects of teaching in addition to the material and technical aspects.
APA, Harvard, Vancouver, ISO, and other citation styles
3

Thompson, Marshall, and Ramez Hajj. Flexible Pavement Recycling Techniques: A Summary of Activities. Illinois Center for Transportation, July 2021. http://dx.doi.org/10.36501/0197-9191/21-022.

Full text of the source
Abstract:
Cold in-place recycling (CIR) involves the recycling of the asphalt portions (including hot-mix asphalt and chip, slurry, and cape seals, as well as others) of a flexible or composite pavement with asphalt emulsion or foamed asphalt as the binding agent. Full-depth reclamation (FDR) includes the recycling of the entire depth of the pavement and, in some cases, a portion of the subgrade with asphalt, cement, or lime products as binding agents. Both processes are extensively utilized in Illinois. This project reviewed CIR and FDR projects identified by the Illinois Department of Transportation (IDOT) from the Transportation Bulletin and provided comments on pavement designs and special provisions. The researchers evaluated the performance of existing CIR/FDR projects through pavement condition surveys and analysis of falling weight deflectometer data collected by IDOT. They also reviewed CIR/FDR literature and updated/modified (as appropriate) previously provided inputs concerning mix design, testing procedures, thickness design, construction, and performance as well as cold central plant recycling (CCPR) literature related to design and construction. The team monitored the performance of test sections at the National Center for Asphalt Technology and Virginia Department of Transportation. The researchers assisted IDOT in the development of a CCPR special provision as well as responded to IDOT inquiries and questions concerning issues related to CIR, FDR, and CCPR. They attended meetings of IDOT’s FDR with the Cement Working Group and provided input in the development of a special provision for FDR with cement. The project’s activities confirmed that CIR, FDR, and CCPR techniques are successfully utilized in Illinois. Recommendations for improving the above-discussed techniques are provided.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Terzic, Vesna, and William Pasco. Novel Method for Probabilistic Evaluation of the Post-Earthquake Functionality of a Bridge. Mineta Transportation Institute, April 2021. http://dx.doi.org/10.31979/mti.2021.1916.

Full text of the source
Abstract:
While modern overpass bridges are safe against collapse, their functionality will likely be compromised in the case of a design-level or beyond-design-level earthquake, which may generate excessive residual displacements of the bridge deck. Presently, there is no validated quantitative approach for estimating the operational level of a bridge after an earthquake, due to the difficulty of accurately simulating residual displacements. This research develops a novel method for probabilistic evaluation of the post-earthquake functionality state of a bridge; the approach is founded on an explicit evaluation of bridge residual displacements and the associated traffic capacity under realistic traffic load scenarios. This research proposes a high-fidelity finite-element model for bridge columns, developed and calibrated using existing experimental data from shake table tests of a full-scale bridge column. This finite-element model of the bridge column is further expanded to enable evaluation of the axial load-carrying capacity of damaged columns, which is critical for an accurate evaluation of the traffic capacity of the bridge. Existing experimental data from crushing tests on columns with earthquake-induced damage support this phase of the finite-element model development. To properly evaluate the bridge's post-earthquake functionality state, realistic traffic loadings representative of different bridge conditions (e.g., immediate access, emergency traffic only, closed) are applied in the proposed model following an earthquake simulation. The traffic loadings in the finite-element model consider the distribution of vehicles on the bridge causing the largest forces in the bridge columns.
APA, Harvard, Vancouver, ISO, and other citation styles
5

McKenna, Patrick, and Mark Evans. Emergency Relief and complex service delivery: Towards better outcomes. Queensland University of Technology, June 2021. http://dx.doi.org/10.5204/rep.eprints.211133.

Full text of the source
Abstract:
Emergency Relief (ER) is a Department of Social Services (DSS) funded program, delivered by 197 community organisations (ER Providers) across Australia, to assist people facing a financial crisis with financial/material aid and referrals to other support programs. ER has been playing this important role in Australian communities since 1979. Without ER, more people living in Australia who experience a financial crisis might face further harm such as crippling debt or homelessness. The Emergency Relief National Coordination Group (NCG) was established in April 2020 at the start of the COVID-19 pandemic to advise the Minister for Families and Social Services on the implementation of ER. To inform its advice to the Minister, the NCG partnered with the Institute for Governance at the University of Canberra to conduct research to understand the issues and challenges faced by ER Providers and Service Users in local contexts across Australia. The research involved a desktop review of the existing literature on ER service provision, a large survey which all Commonwealth ER Providers were invited to participate in (and 122 responses were received), interviews with a purposive sample of 18 ER Providers, and the development of a program logic and theory of change for the Commonwealth ER program to assess progress. The surveys and interviews focussed on ER Provider perceptions of the strengths, weaknesses, future challenges, and areas of improvement for current ER provision. The trend of increasing case complexity, the effectiveness of ER service delivery models in achieving outcomes for Service Users, and the significance of volunteering in the sector were investigated. Separately, an evaluation of the performance of the NCG was conducted and a summary of the evaluation is provided as an appendix to this report. 
Several themes emerged from the review of the existing literature such as service delivery shortcomings in dealing with case complexity, the effectiveness of case management, and repeat requests for service. Interviews with ER workers and Service Users found that an uplift in workforce capability was required to deal with increasing case complexity, leading to recommendations for more training and service standards. Several service evaluations found that ER delivered with case management led to high Service User satisfaction, played an integral role in transforming the lives of people with complex needs, and lowered repeat requests for service. A large longitudinal quantitative study revealed that more time spent with participants substantially decreased the number of repeat requests for service; and, given that repeat requests for service can be an indicator of entrenched poverty, not accessing further services is likely to suggest improvement. The interviews identified the main strengths of ER to be the rapid response and flexible use of funds to stabilise crisis situations and connect people to other supports through strong local networks. Service Users trusted the system because of these strengths, and ER was often an access point to holistic support. There were three main weaknesses identified. First, funding contracts were too short and did not cover the full costs of the program—in particular, case management for complex cases. Second, many Service Users were dependent on ER which was inconsistent with the definition and intent of the program. Third, there was inconsistency in the level of service received by Service Users in different geographic locations. These weaknesses can be improved upon with a joined-up approach featuring co-design and collaborative governance, leading to the successful commissioning of social services. 
The survey confirmed that volunteers were significant for ER, making up 92% of all workers and contributing 51% of all hours worked in respondent ER programs. Across the 122 respondents, volunteers amounted to 554 full-time equivalents, a contribution valued at $39.4 million. In total there were 8,316 volunteers working in the 122 respondent ER programs. The sector can support and upskill these volunteers (as well as employees) by developing scalable training solutions such as online training modules, updating ER service standards, and engaging in collaborative learning arrangements where large and small ER Providers share resources. More engagement with peak bodies such as Volunteering Australia might also help the sector sharpen its focus on volunteer engagement. Integrated services achieve better outcomes for complex ER cases—97% of survey respondents either agreed or strongly agreed this was the case. The research identified the dimensions of service integration most relevant to ER Providers to be case management, referrals, the breadth of services offered internally, co-location with interrelated service providers, an established network of support, workforce capability, and Service User engagement. Providers can individually increase the level of service integration in their ER program to improve their ability to deal with complex cases, which are clearly on the rise. At the system level, a more joined-up approach can also improve service integration across Australia. The key dimensions of this finding are discussed next in more detail. Case management is key to achieving Service User outcomes for complex cases—89% of survey respondents either agreed or strongly agreed this was the case. Interviewees most frequently said they would provide more case management if they could change their service model.
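As a back-of-the-envelope check of the survey figures above, the implied replacement value per full-time-equivalent volunteer can be derived from the reported totals. This is an illustrative calculation only; the per-FTE figure below is derived, not stated in the report.

```python
# Figures reported in the survey summary above.
total_value_aud = 39_400_000  # valuation of the volunteer contribution ($)
volunteer_fte = 554           # full-time-equivalent volunteers

# Derived (not stated in the report): implied value per FTE.
value_per_fte = total_value_aud / volunteer_fte
print(f"Implied value per volunteer FTE: ${value_per_fte:,.0f}")
```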
Case management allows more time to be spent with the Service User, follow-up with referral partners, and a higher level of expertise in service delivery to support complex cases. It is, however, a costly model and not currently funded for all Service Users through ER. Where case management is not available as part of ER, it might be available through a related service that is part of a network of support. Where possible, ER Providers should facilitate access to case management for Service Users who would benefit. At a system level, ER models with a greater component of case management could be implemented as test cases. Referral systems are also key to achieving Service User outcomes, which is reflected in the ER Program Logic presented on page 31. The survey and interview data show that referrals within an integrated service (internal) or in a service hub (co-located) are most effective. Where this is not possible, warm referrals within a trusted network of support are more effective than cold referrals, leading to higher take-up and better Service User outcomes. However, cold referrals are most common, pointing to a weakness in ER referral systems: in many cases ER Providers neither operate nor co-locate with interrelated services, and in many others they lack the case management capacity to provide warm referrals. For mental illness support, which interviewees identified as one of the most difficult issues to deal with, ER Providers offer an integrated service only 23% of the time, warm referrals 34% of the time, and cold referrals 43% of the time. A focus on referral systems at the individual ER Provider level, and at the system level through a joined-up approach, might lead to better outcomes for Service Users. The program logic and theory of change for ER have been documented with input from the research findings and are included in Section 4.3 on page 31.
These show that ER helps people facing a financial crisis to meet their immediate needs, avoid further harm, and access a path to recovery. The research demonstrates that ER is fundamental to supporting vulnerable people in Australia and should therefore continue to be funded by government.
APA, Harvard, Vancouver, ISO and other citation styles
6

Deb, Robin, Paramita Mondal and Ardavan Ardeshirilajimi. Bridge Decks: Mitigation of Cracking and Increased Durability—Materials Solution (Phase III). Illinois Center for Transportation, December 2020. http://dx.doi.org/10.36501/0197-9191/20-023.

The full text of the source
Annotation:
Concrete made with Type K cement offers a lower slump than conventional concrete, even at a higher water-to-cement ratio. Therefore, a suitable chemical admixture should be added to the Type K concrete mix design at a feasible dosage to achieve and retain the target slump. In this project, a compatibility study was performed for Type K concrete with commercially available water-reducing and air-entraining admixtures. Slump and air content losses were measured over a period of 60 minutes after mixing, and a particular mid-range water-reducing admixture was found to retain slump effectively. Furthermore, no significant difference in admixture interaction between conventional and Type K concrete was observed. Another concern regarding the use of Type K concrete is that its higher water-to-cement ratio can potentially lead to higher permeability and durability issues. This study also explored the effectiveness of presoaked lightweight aggregates in providing extra water for Type K hydration without increasing the water-to-cement ratio. Permeability of concrete was measured to validate that the use of presoaked lightweight aggregates can lower water absorption in Type K concrete, enhancing its durability. Extensive data analysis was performed to link the small-scale material test results with a structural test performed at Saint Louis University. A consistent relation was established in most cases, validating the effectiveness of both testing methods in understanding the performance of the proposed shrinkage-mitigation strategies. Stress analysis was performed to rank the mitigation strategies. Type K incorporation is reported to be the most effective method for shrinkage-related crack mitigation among the mixes tested in this study. The second-best choice is the use of Type K in combination with either presoaked lightweight aggregates or shrinkage-reducing admixtures. All mitigation strategies tested in this work proved to be significantly better than using no mitigation strategy.
APA, Harvard, Vancouver, ISO and other citation styles
7

Coulson, Saskia, Melanie Woods, Drew Hemment und Michelle Scott. Report and Assessment of Impact and Policy Outcomes Using Community Level Indicators: H2020 Making Sense Report. University of Dundee, 2017. http://dx.doi.org/10.20933/100001192.

The full text of the source
Annotation:
Making Sense is a European Commission H2020 funded project which aims to support participatory sensing initiatives that address environmental challenges in areas such as noise and air pollution. The development of Making Sense was informed by previous research on a crowdfunded open source platform for environmental sensing, SmartCitizen.me, developed at the Fab Lab Barcelona. Insights from this research identified several social and technical deterrents to a wider uptake of participatory sensing initiatives. For example, participants struggled with a lack of social interaction, a lack of consensus and shared purpose amongst the group, and a limited understanding of the relevance the data had in their daily lives (Balestrini et al., 2014; Balestrini et al., 2015). As such, Making Sense seeks to explore whether open source hardware, open source software, and open design can be used to enhance data literacy and maker practices in participatory sensing. Further to this, Making Sense tests methodologies aimed at empowering individuals and communities through developing a greater understanding of their environments and by supporting a culture of grassroots initiatives for action and change. To do this, Making Sense identified a need to underpin sensing with community building activities and to develop strategies that inform and enable those participating in data collection with appropriate tools and skills. As Fetterman, Kaftarian and Wanderman (1996) state, citizens are empowered when they understand evaluation and connect it to their lives in a way that has relevance. Therefore, this report examines the role that these activities play in participatory sensing. Specifically, we discuss the opportunities and challenges of using Community Level Indicators (CLIs), which are measurable and objective sources of information gathered to complement sensor data.
We describe how CLIs are used to develop a more in-depth understanding of the environmental problem at hand, and to record, monitor and evaluate the progress of change during initiatives. We propose that CLIs provide one way to move participatory sensing beyond a primarily technological practice and towards a social and environmental practice. This is achieved through an increased focus on the participants' interests and concerns, and with an emphasis on collective problem solving and action. We position our claims against the following four challenge areas in participatory sensing: 1) generating and communicating information and understanding (c.f. Loreto, 2017), 2) analysing and finding relevance in data (c.f. Becker et al., 2013), 3) building community around participatory sensing (c.f. Fraser et al., 2005), and 4) achieving or monitoring change and impact (c.f. Cheadle et al., 2000). We discuss how the use of CLIs can help meet these challenges. Furthermore, we report and assess six ways in which CLIs can address these challenges and thereby support participatory sensing initiatives: (i) accountability, (ii) community assessment, (iii) short-term evaluation, (iv) long-term evaluation, (v) policy change, and (vi) capability. The report then returns to the challenge areas and reflects on the learnings and recommendations gleaned from three Making Sense case studies. After this, there is an exposition of approaches and tools developed by Making Sense for the purposes of advancing participatory sensing in this way. Lastly, the authors speak to some of the policy outcomes that have been realised as a result of this research.
APA, Harvard, Vancouver, ISO and other citation styles
8

CAE Correlation of Sealing Pressure of a Press-in-Place Gasket. SAE International, April 2021. http://dx.doi.org/10.4271/2021-01-0299.

The full text of the source
Annotation:
The Press-in-Place (PIP) gasket is a static face seal with a self-retaining feature, used on the mating surfaces of engine components to maintain the reliability of the closed system under various operating conditions. Its design allows it to provide enough contact pressure to seal the internal fluid as well as prevent mechanical failures. Insufficient sealing pressure will lead to fluid leakage and, consequently, engine failures. A test fixture was designed to simulate the clamp load and internal pressure conditions on a gasketed bolted joint. A sensor pad using TEKSCAN equipment was used to capture the overall and local pressure distribution of the PIP gasket under various engine loading conditions. The sensor pad test results were then compared with simulated CAE results from computer models. Through these comparisons, the measured and CAE-predicted gasket sealing pressures were found to correlate well for the 500 N bolt load condition under internal side pressures of 0.138 MPa and 0.276 MPa. Moreover, the gasket cross-sectional pressure distributions obtained from the experimental tests and the CAE models correlated very well, with R2 values ranging from 90% to 99% for all load cases. Both the CAE and sensor pad test results show an increase in sealing pressure when internal side pressure is applied to the gasket seal.
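The R2 (coefficient of determination) metric cited above to quantify test-versus-CAE agreement can be computed as follows. This is an illustrative sketch only: the pressure values below are made-up placeholders, not data from the report.

```python
# Coefficient of determination (R^2) between measured and predicted values.
def r_squared(measured, predicted):
    mean_m = sum(measured) / len(measured)
    # Residual sum of squares: measurement vs. model prediction.
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    # Total sum of squares: measurement vs. its own mean.
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1 - ss_res / ss_tot

# Placeholder sealing pressures (MPa), NOT data from the report.
measured = [0.10, 0.25, 0.40, 0.55, 0.70]
predicted = [0.12, 0.24, 0.41, 0.53, 0.69]
print(f"R^2 = {r_squared(measured, predicted):.3f}")  # close to 1 = good fit
```

An R2 of 0.90 to 0.99, as reported for the cross-sectional pressure distributions, means the CAE model explains 90% to 99% of the variance in the measured data.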
APA, Harvard, Vancouver, ISO and other citation styles