Journal articles on the topic 'Validation approach'

To see the other types of publications on this topic, follow the link: Validation approach.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Validation approach.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Rahman, Mohammad Azizur, Qian Zhao, Helga Wiederhold, Nico Skibbe, Eva González, Nico Deus, Bernhard Siemon, Reinhard Kirsch, and Jörg Elbracht. "Coastal groundwater systems: mapping chloride distribution from borehole and geophysical data." Grundwasser 26, no. 2 (January 21, 2021): 191–206. http://dx.doi.org/10.1007/s00767-021-00475-1.

Full text
Abstract:
Information on chloride (Cl) distribution in aquifers is essential for planning and management of coastal zone groundwater resources as well as for simulation and validation of density-driven groundwater models. We developed a method to derive chloride concentrations from borehole information and helicopter-borne electromagnetic (HEM) data for the coastal aquifer in the Elbe-Weser region, where observed chloride and electrical conductivity data reveal that the horizontal distribution of salinity is not uniform and does not correlate with the coastline. The integrated approach uses HEM resistivity data, borehole petrography information, grain size analysis of borehole samples as well as observed chloride and electrical conductivity to estimate Cl distribution. The approach is not straightforward due to the complex nature of the geology where clay and silt are present. Possible errors and uncertainties involved at different steps of the method are discussed.
APA, Harvard, Vancouver, ISO, and other styles
2

Annisa, Putri, and Ahmad Nizar Rangkuti. "LINTASAN BELAJAR MATERI ARITMATIKA SOSIAL DENGAN PENDEKATAN PENDIDIKAN MATEMATIKA REALISTIK DI SMP NEGERI 1 BATANG ANGKOLA KABUPATEN TAPANULI SELATAN." Math Educa Journal 3, no. 2 (November 11, 2019): 109–17. http://dx.doi.org/10.15548/mej.v3i2.676.

Full text
Abstract:
The purpose of this research is to determine the validity and practicality of a learning trajectory based on the realistic mathematics education approach for social arithmetic at SMP Negeri 1 Batang Angkola. The study is a design research of the validation-study type, aiming to develop a local instruction theory (LIT) in collaboration between researcher and teacher to improve the quality of learning. The research was conducted at SMP Negeri 1 Batang Angkola with grade VII-A as the trial subjects, comprising 32 students. Data were collected using validation sheets and questionnaires and analysed with validity and practicality techniques. The results show that the learning trajectory based on the realistic mathematics education approach is valid and practical. Validation of the learning resource yielded a score of 76% from the analysis of two validators, and its practicality scored 88% based on student response questionnaires covering all components. The realistic mathematics education approach engaged students in learning the social arithmetic material, and the resulting learning resource supports activities directed at the learning objective of understanding social arithmetic concepts.
APA, Harvard, Vancouver, ISO, and other styles
3

Tietbohl, Caroline K. "Empathic Validation in Physician–Patient Communication: An Approach to Conveying Empathy for Problems With Uncertain Solutions." Qualitative Health Research 32, no. 3 (December 11, 2021): 413–25. http://dx.doi.org/10.1177/10497323211056312.

Full text
Abstract:
Interest in systematic approaches to improving clinical empathy has increased. However, conceptualizations of empathy are inconsistent and difficult to operationalize. Drawing on video recordings of primary care visits with older adults, I describe one particular communication strategy for conveying empathy—empathic validation. Using conversation analysis, I show that the design of empathic validations and the context in which they are delivered are critical to positive patient responses. Effective empathic validations must (a) demonstrate shared understanding and (b) support the patient’s position. Physicians provided empathic validation when there was no medical solution to offer and within this context, for three purposes: (1) normalizing changes in health, (2) acknowledging individual difficulty, and (3) recognizing actions or choices. Empathic validation is a useful approach because it does not rely on patients’ ability to create an “empathic opportunity” and has particular relevance for older adults.
APA, Harvard, Vancouver, ISO, and other styles
4

Bartak, Roman, Adrien Maillard, and Rafael Cardoso. "Validation of Hierarchical Plans via Parsing of Attribute Grammars." Proceedings of the International Conference on Automated Planning and Scheduling 28 (June 15, 2018): 11–19. http://dx.doi.org/10.1609/icaps.v28i1.13892.

Full text
Abstract:
An important problem of automated planning is validating if a plan complies with the domain model. Such validation is straightforward for classical sequential planning but until recently there was no plan validation approach for Hierarchical Task Networks (HTN). In this paper we propose a novel technique for validating HTN plans by parsing of attribute grammars with the timeline constraint.
APA, Harvard, Vancouver, ISO, and other styles
5

Francis, R. I. C. Chris, Steven E. Campana, and Helen L. Neil. "Validation of fish ageing methods should involve bias estimation rather than hypothesis testing: a proposed approach for bomb radiocarbon validations." Canadian Journal of Fisheries and Aquatic Sciences 67, no. 9 (September 2010): 1398–408. http://dx.doi.org/10.1139/f10-068.

Full text
Abstract:
The need to validate methods of ageing fish is widely accepted and several approaches to validation have been used. Most validations are essentially informal tests, using graphical methods, of the null hypothesis of zero bias in the age estimates. It is argued that it would be more useful to estimate a confidence interval for this bias. This would provide both a quantitative measure of the strength of the validation and a means of formalising the hypothesis test. A method of estimating this confidence interval is proposed for validations based on bomb radiocarbon, and this is illustrated using data for bluenose ( Hyperoglyphe antarctica ) and haddock ( Melanogrammus aeglefinus ).
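A minimal sketch of the statistical idea in this entry, assuming a generic percentile-bootstrap confidence interval for mean ageing bias (estimated minus reference age). The paper derives a specific interval tailored to bomb-radiocarbon reference data, which is not reproduced here; the ages below are hypothetical.

```python
# Illustrative sketch only: a generic bootstrap confidence interval for mean ageing
# bias (estimated age minus reference age). The paper proposes a specific estimator
# for bomb-radiocarbon reference data; this is not that method.
import numpy as np

rng = np.random.default_rng(seed=1)

def bias_confidence_interval(estimated_ages, reference_ages, n_boot=10_000, level=0.95):
    """Percentile bootstrap CI for the mean bias of an ageing method."""
    bias = np.asarray(estimated_ages, float) - np.asarray(reference_ages, float)
    means = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(bias, size=bias.size, replace=True)
        means[b] = sample.mean()
    lo, hi = np.quantile(means, [(1 - level) / 2, 1 - (1 - level) / 2])
    return bias.mean(), (lo, hi)

# Hypothetical example data (years): age estimates vs. reference ages.
est = [12, 15, 22, 30, 41, 18, 25, 33]
ref = [11, 16, 20, 29, 43, 18, 24, 31]
mean_bias, ci = bias_confidence_interval(est, ref)
print(f"mean bias = {mean_bias:.2f} y, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}) y")
```

A narrow interval around zero would then constitute a quantitative validation, while the interval width itself measures the strength of that validation.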
APA, Harvard, Vancouver, ISO, and other styles
6

Ginting, Herson, M. Oky Gafari, and Malan Lubis. "Development of Exposition Text Writing Teaching Materials With Genre Approach to Students of Grade X Vocational High School Brigjen Katamso Medan." Budapest International Research and Critics in Linguistics and Education (BirLE) Journal 2, no. 3 (July 29, 2019): 372–97. http://dx.doi.org/10.33258/birle.v2i3.377.

Full text
Abstract:
The aim of this study is to describe the validation conducted by the material expert validator of the exposition text writing module under development. The research used a development method with a population of all grade X students of Brigjen Katamso Vocational High School Medan, totalling 92 students, sampled by simple random sampling. The validation carried out by the material expert validator of the exposition text writing module with the genre approach yielded 89.11%, in the "Very Good" category. The validation carried out by the design expert validator of the module gave an overall average value of 90.80%, also in the "Very Good" category. The teachers' assessment of the exposition text writing module with the genre approach for grade X students obtained an average value of 80.70%, in the "Good" category.
APA, Harvard, Vancouver, ISO, and other styles
7

Tazkiyah, Fida, Helli Ihsan, and Muhammad Ariez Musthofa. "Prophetic Leadership Scale’s Validation and the Tendency of Normative Response." Jurnal Psikologi Islam dan Budaya 3, no. 2 (November 1, 2020): 147–58. http://dx.doi.org/10.15575/jpib.v3i2.5770.

Full text
Abstract:
This study aims to validate the prophetic leadership scale using a quantitative approach. 202 leaders were involved in this study. The data analysis techniques used for construct validation were factorial validation with confirmatory factor analysis (CFA), convergent validation, discriminant validation, and assessment of social desirability bias with Pearson correlation. Four instruments were used in this study: the prophetic leadership measure, an authentic leadership measure as the convergent validator, a religiosity measure as the discriminant validator, and a social desirability measure as the validator of social appropriateness bias. The prophetic leadership instrument measures the same construct as the authentic leadership instrument, measures a different construct from the religiosity instrument, and shows a social appropriateness bias, i.e. a tendency of respondents to give answers in accordance with norms. The findings raise the prospect that social desirability bias influences fit indices in a scale's validation.
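The correlational part of such a scale validation can be illustrated as follows. The confirmatory factor analysis itself would require an SEM package (e.g. semopy) and is omitted; all column names and values below are hypothetical.

```python
# Minimal sketch of the convergent, discriminant and social-desirability checks
# via Pearson correlations. Data and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "prophetic_leadership": [3.2, 4.1, 3.8, 2.9, 4.5, 3.7],
    "authentic_leadership": [3.0, 4.3, 3.6, 3.1, 4.4, 3.5],   # convergent validator
    "religiosity":          [4.8, 4.9, 4.7, 4.6, 5.0, 4.8],   # discriminant validator
    "social_desirability":  [2.1, 3.9, 2.8, 2.4, 4.2, 3.0],   # bias check
})

target = df["prophetic_leadership"]
for other in ["authentic_leadership", "religiosity", "social_desirability"]:
    r, p = pearsonr(target, df[other])
    print(f"{other:>22}: r = {r:+.2f}, p = {p:.3f}")
```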
APA, Harvard, Vancouver, ISO, and other styles
8

Giraldo, Frank D. "Validity and Classroom Language Testing: A Practical Approach." Colombian Applied Linguistics Journal 22, no. 2 (December 22, 2020): 194–206. http://dx.doi.org/10.14483/22487085.15998.

Full text
Abstract:
Validity and validation are common in large-scale language testing. These topics are fundamental because they help stakeholders in testing systems make accurate interpretations of individuals’ language ability and related ensuing decisions. However, there is limited information on validity and validation for classroom language testing, for which interpretations and decisions based on curriculum objectives are paramount, too. In this reflection article, I provide a critical account of these two issues as they are applied in large-scale testing. Next, I use this background to discuss and provide possible applications for classroom language education through a proposed approach for validating classroom language tests. The approach comprises the analyses of curriculum objectives, design of test specifications, analysis of test items, professional design of instruments, statistical calculations, cognitive validation and consequential analyses. I close the article with implications and recommendations for such endeavours and highlight why they are fundamental for high-quality language testing systems in classroom contexts.
APA, Harvard, Vancouver, ISO, and other styles
9

FELLNER, Andrzej, Radosław FELLNER, and Eugeniusz PIECHOCZEK. "Pre-flight validation RNAV GNSS approach procedures for EPKT in „EGNOS APV Mielec Project”." Scientific Journal of Silesian University of Technology. Series Transport 90 (March 1, 2016): 37–46. http://dx.doi.org/10.20858/sjsutst.2016.90.4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Fonseca i Casas, Pau, Joan Garcia i Subirana, Víctor García i Carrasco, and Xavier Pi i Palomés. "SARS-CoV-2 Spread Forecast Dynamic Model Validation through Digital Twin Approach, Catalonia Case Study." Mathematics 9, no. 14 (July 14, 2021): 1660. http://dx.doi.org/10.3390/math9141660.

Full text
Abstract:
Modeling the spread of SARS-CoV-2 is a challenging problem because of its complex nature and the lack of information regarding certain aspects. In this paper, we explore a Digital Twin approach to model the pandemic situation in Catalonia. The Digital Twin is composed of three different dynamic models used to perform the validations by a Model Comparison approach. We detail how we use this approach to obtain knowledge regarding the effects of the nonpharmaceutical interventions and the problems we faced during the modeling process. We use Specification and Description Language (SDL) to represent the compartmental forecasting model for SARS-CoV-2. Its graphical notation simplifies the different specialists' understanding of the model hypotheses, which must be validated continuously following a Solution Validation approach. This model allows the successful forecasting of different scenarios for Catalonia. We present some formalization details, discuss the validation process and present some results obtained from the validation model discussion, which becomes a digital twin of the pandemic in Catalonia.
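For orientation only, a toy compartmental (SEIR) forecast integrated with SciPy. The authors' model is specified in SDL and validated against two other dynamic models, so the structure, initial conditions and parameters below are illustrative assumptions, not the paper's model.

```python
# Toy SEIR compartmental model as a stand-in for the kind of forecasting model the
# paper formalizes in SDL; all numbers are assumptions for illustration.
import numpy as np
from scipy.integrate import odeint

def seir(y, t, beta, sigma, gamma, N):
    S, E, I, R = y
    dS = -beta * S * I / N
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    return dS, dE, dI, dR

N = 7.7e6                      # rough population of Catalonia (assumption)
y0 = (N - 10, 0, 10, 0)        # 10 initial infectious cases (assumption)
t = np.linspace(0, 180, 181)   # days
S, E, I, R = odeint(seir, y0, t, args=(0.35, 1 / 5.2, 1 / 7.0, N)).T
print(f"peak infectious: {I.max():.0f} on day {t[int(I.argmax())]:.0f}")
```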
APA, Harvard, Vancouver, ISO, and other styles
11

Emmert-Streib, Frank, and Matthias Dehmer. "Inference of Genome-Scale Gene Regulatory Networks: Are There Differences in Biological and Clinical Validations?" Machine Learning and Knowledge Extraction 1, no. 1 (August 22, 2018): 138–48. http://dx.doi.org/10.3390/make1010008.

Full text
Abstract:
Causal networks, e.g., gene regulatory networks (GRNs) inferred from gene expression data, contain a wealth of information but are defying simple, straightforward and low-budget experimental validations. In this paper, we elaborate on this problem and discuss distinctions between biological and clinical validations. As a result, validation differences for GRNs reflect known differences between basic biological and clinical research questions making the validations context specific. Hence, the meaning of biologically and clinically meaningful GRNs can be very different. For a concerted approach to a problem of this size, we suggest the establishment of the HUMAN GENE REGULATORY NETWORK PROJECT which provides the information required for biological and clinical validations alike.
APA, Harvard, Vancouver, ISO, and other styles
12

Guerbai, Yasmine, Youcef Chibani, and Yassine Meraihi. "Techniques for Selecting the Optimal Parameters of One-Class Support Vector Machine Classifier for Reduced Samples." International Journal of Applied Metaheuristic Computing 13, no. 1 (January 2022): 1–15. http://dx.doi.org/10.4018/ijamc.290533.

Full text
Abstract:
Usually, the One-Class Support Vector Machine (OC-SVM) requires a large dataset to model the target class effectively, independently of other classes. To find the OC-SVM model, the available dataset is subdivided into two subsets, namely training and validation, which are used for training the model and validating the optimal parameters, respectively. This approach is effective when a large dataset is available. However, when training samples are reduced, the parameters of the OC-SVM are difficult to find in the absence of a validation subset. Hence, this paper proposes various techniques for selecting the optimal parameters using only a training subset. The experimental evaluation conducted on several real-world benchmarks proves the effective use of the new parameter selection techniques for validating the model of OC-SVM classifiers versus the standard validation techniques.
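A rough sketch of the setting the paper addresses, selecting OC-SVM hyper-parameters from the training subset alone. The grid-search criterion used here (training acceptance rate penalized by nu) is an assumption for illustration and is not one of the selection techniques proposed in the paper.

```python
# Sketch of picking OC-SVM hyper-parameters without a validation subset, via a
# simple self-consistency score on the training data. Illustrative only.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(60, 4))   # small target-class sample

best = None
for nu in (0.01, 0.05, 0.1, 0.2):
    for gamma in (0.01, 0.1, 1.0):
        model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        accept_rate = float(np.mean(model.predict(X_train) == 1))
        # prefer boundaries that keep most training points without inflating nu
        score = accept_rate - nu
        if best is None or score > best[0]:
            best = (score, nu, gamma)

print(f"selected nu={best[1]}, gamma={best[2]} (score={best[0]:.3f})")
```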
APA, Harvard, Vancouver, ISO, and other styles
13

Otoluwa, Yemima, Sunarty Eraku, and Daud Yusuf. "PENGEMBANGAN MEDIA PEMBELAJARAN BERBASIS LECTORA INSPIRE YANG DIINTEGRASIKAN DENGAN CAMTASIA STUDIO PADA MATA PELAJARAN GEOGRAFI MATERI SISTEM INFORMASI GEOGRAFI." JAMBURA GEO EDUCATION JOURNAL 1, no. 1 (December 7, 2019): 01–08. http://dx.doi.org/10.34312/jgej.v1i1.4041.

Full text
Abstract:
This research aims to develop Lectora Inspire-based learning media for Geographic Information System material. Lectora Inspire is an application for creating learning media, here integrated with Camtasia Studio. The study applies a development research method with the ADDIE model as the approach to compiling the learning media, and validation is conducted by a material expert, a media expert, and a subject expert. The learning media was tried out with a limited group of grade XII high school students through a questionnaire. The collected data comprise findings on the quality of the learning media and suggestions for product revisions. The validation results show that the media validation score given by validator I (the material/content expert) is 83.75%, in the valid category; by validator II (the product design expert) 88%, in the very valid category; and by validator III (the subject expert, a geography teacher) 98%, in the very valid category. The average student response to the learning media is 88.8%.
APA, Harvard, Vancouver, ISO, and other styles
14

de Jesus, Kelly, Karla de Jesus, Pedro Figueiredo, João Paulo Vilas-Boas, Ricardo Jorge Fernandes, and Leandro José Machado. "Reconstruction Accuracy Assessment of Surface and Underwater 3D Motion Analysis: A New Approach." Computational and Mathematical Methods in Medicine 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/269264.

Full text
Abstract:
This study assessed accuracy of surface and underwater 3D reconstruction of a calibration volume with and without homography. A calibration volume (6000 × 2000 × 2500 mm) with 236 markers (64 above and 88 underwater control points—with 8 common points at water surface—and 92 validation points) was positioned on a 25 m swimming pool and recorded with two surface and four underwater cameras. Planar homography estimation for each calibration plane was computed to perform image rectification. Direct linear transformation algorithm for 3D reconstruction was applied, using 1600000 different combinations of 32 and 44 points out of the 64 and 88 control points for surface and underwater markers (resp.). Root Mean Square (RMS) error with homography of control and validations points was lower than without it for surface and underwater cameras (P≤0.03). With homography, RMS errors of control and validation points were similar between surface and underwater cameras (P≥0.47). Without homography, RMS error of control points was greater for underwater than surface cameras (P≤0.04) and the opposite was observed for validation points (P≤0.04). It is recommended that future studies using 3D reconstruction should include homography to improve swimming movement analysis accuracy.
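The accuracy metric reported above (RMS error over control and validation markers) reduces to a short computation. The DLT reconstruction and homography-based image rectification are not reproduced here, and the marker coordinates below are hypothetical.

```python
# Sketch of the accuracy metric: RMS error between reconstructed and known 3D
# coordinates of calibration markers (illustrative values, in mm).
import numpy as np

def rms_error(reconstructed_mm, reference_mm):
    """Root-mean-square Euclidean error over a set of 3D points (in mm)."""
    diff = np.asarray(reconstructed_mm, float) - np.asarray(reference_mm, float)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# Hypothetical coordinates of three validation markers (mm).
reference     = np.array([[0, 0, 0], [1000, 0, 0], [0, 2000, 500]])
reconstructed = np.array([[2, -1, 3], [998, 4, -2], [-3, 2004, 497]])
print(f"RMS error = {rms_error(reconstructed, reference):.1f} mm")
```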
APA, Harvard, Vancouver, ISO, and other styles
15

Adegbohun, Feyijimi, Annette von Jouanne, Ben Phillips, Emmanuel Agamloh, and Alex Yokochi. "High Performance Electric Vehicle Powertrain Modeling, Simulation and Validation." Energies 14, no. 5 (March 9, 2021): 1493. http://dx.doi.org/10.3390/en14051493.

Full text
Abstract:
Accurate electric vehicle (EV) powertrain modeling, simulation and validation is paramount for critical design and control decisions in high performance vehicle designs. Described in this paper is a methodology for the design and development of EV powertrain through modeling, simulation and validation on a real-world vehicle system with detailed analysis of the results. Although simulation of EV powertrains in software simulation environments plays a significant role in the design and development of EVs, validating these models on the real-world vehicle systems plays an equally important role in improving the overall vehicle reliability, safety and performance. This modeling approach leverages the use of MATLAB/Simulink software for the modeling and simulation of an EV powertrain, augmented by simultaneously validating the modeling results on a real-world vehicle which is performance tested on a chassis dynamometer. The combination of these modeling techniques and real-world validation demonstrates a methodology for a cost effective means of rapidly developing and validating high performance EV powertrains, filling the literature gaps in how these modeling methodologies can be carried out in a research framework.
APA, Harvard, Vancouver, ISO, and other styles
16

Pfeifer, Monika, Wenchieh Yen, Michael Baldauf, George Craig, Susanne Crewell, Jürgen Fischer, Martin Hagen, et al. "Validating precipitation forecasts using remote sensor synergy: A case study approach." Meteorologische Zeitschrift 19, no. 6 (December 1, 2010): 601–17. http://dx.doi.org/10.1127/0941-2948/2010/0487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Saxe, Debra F., Diane L. Persons, Daynna J. Wolff, and Karl S. Theil. "Validation of Fluorescence In Situ Hybridization Using an Analyte-Specific Reagent for Detection of Abnormalities Involving the Mixed Lineage Leukemia Gene." Archives of Pathology & Laboratory Medicine 136, no. 1 (January 1, 2012): 47–52. http://dx.doi.org/10.5858/arpa.2010-0645-sa.

Full text
Abstract:
Context.—Fluorescence in situ hybridization (FISH) is a molecular cytogenetic assay that is commonly used in laboratory medicine. Most FISH assays are not approved by the US Food and Drug Administration but instead are laboratory-developed tests that use analyte-specific reagents. Although several guidelines exist for validation of FISH assays, few specific examples of FISH test validations are available in the literature. Objective.—To provide an example of how a FISH assay, using an analyte-specific reagent probe, may be validated in a clinical laboratory. Design.—We describe the approach used by an individual laboratory for validation of a FISH assay for mixed lineage leukemia (MLL) gene. Results.—Specific validation data are provided illustrating how initial assay performance characteristics in a FISH assay for MLL may be established. Conclusions.—Protocols for initial validation of FISH assays may vary between laboratories. However, all laboratories must establish several defined performance specifications prior to implementation of FISH assays for clinical use. We describe an approach used for assessing performance specifications and validation of an analyte-specific reagent FISH assay using probes for MLL rearrangement in interphase nuclei.
APA, Harvard, Vancouver, ISO, and other styles
18

Janzek-Hawlat, S., S. Sibinovic, G. Duftschmid, and C. Rinner. "Semantic Validation of Standard-based Electronic Health Record Documents with W3C XML Schema." Methods of Information in Medicine 49, no. 03 (2010): 271–80. http://dx.doi.org/10.3414/me09-02-0027.

Full text
Abstract:
Summary Objectives: The goal of this article is to examine whether W3C XML Schema provides a practicable solution for the semantic validation of standard-based electronic health record (EHR) documents. With semantic validation we mean that the EHR documents are checked for conformance with the underlying archetypes and reference model. Methods: We describe an approach that allows XML Schemas to be derived from archetypes based on a specific naming convention. The archetype constraints are augmented with additional components of the reference model within the XML Schema representation. A copy of the EHR document that is transformed according to the before-mentioned naming convention is used for the actual validation against the XML Schema. Results: We tested our approach by semantically validating EHR documents conformant to three different ISO/EN 13606 archetypes respective to three sections of the CDA implementation guide “Continuity of Care Document (CCD)” and an implementation guide for diabetes therapy data. We further developed a tool to automate the different steps of our semantic validation approach. Conclusions: For two particular kinds of archetype prescriptions, individual transformations are required for the corresponding EHR documents. Otherwise, a fully generic validation is possible. In general, we consider W3C XML Schema as a practicable solution for the semantic validation of standard-based EHR documents.
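The final step of such an approach, validating a (transformed) EHR document against a W3C XML Schema, can be done with lxml as sketched below. The schema and document are hypothetical, much-simplified stand-ins; the archetype-to-schema derivation and naming-convention transformation described in the paper are not shown.

```python
# Minimal sketch of XML Schema validation with lxml; the real schemas would be
# derived from ISO/EN 13606 archetypes as described in the paper.
from lxml import etree

xsd = etree.XMLSchema(etree.XML("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="bloodPressure">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="systolic" type="xs:integer"/>
        <xs:element name="diastolic" type="xs:integer"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
"""))

doc = etree.XML(
    "<bloodPressure><systolic>120</systolic><diastolic>80</diastolic></bloodPressure>")

if xsd.validate(doc):
    print("document conforms to the schema")
else:
    for error in xsd.error_log:
        print(f"line {error.line}: {error.message}")
```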
APA, Harvard, Vancouver, ISO, and other styles
19

Campbell, Kaitlyn, Carissa Coleman, and Kristine Williams. "RESPONSES OF PERSONS LIVING WITH DEMENTIA TO VALIDATING COMMUNICATION BY FAMILY CAREGIVERS: A SECONDARY ANALYSIS." Innovation in Aging 6, Supplement_1 (November 1, 2022): 133. http://dx.doi.org/10.1093/geroni/igac059.531.

Full text
Abstract:
Abstract Validation is a person-centered approach for communicating with people living with dementia (PLWD). This study evaluated how PLWD respond to caregiver communication that includes or fails to include validation. This secondary analysis of homecare videos (N=41) of family caregivers interacting with a PLWD during daily care used behavioral coding of validating communication (affirmation, acknowledging emotions, encouraging emotional expression, verbalizing understanding) in relation to responses of PLWD (resistiveness to care, apathy, or cooperation). A 10 second time-lag sequential analysis identified an 11% probability of a cooperative response when caregivers communicated using affirmations. Caregiver verbalizations of understanding resulted in a 6% probability and silence had an 8% probability of cooperative responses. Use of validating communication did not result in a significant probability of negative PLWD responses (apathy or resistiveness to care). Use of validating communication strategies may assist caregivers to achieve goals of care without negative responses from PLWD.
APA, Harvard, Vancouver, ISO, and other styles
20

Crobeddu, Éric, and Saad Bennis. "Suivi et validation des mesures pour un déversoir d'orage latéral à seuil court." Canadian Journal of Civil Engineering 33, no. 3 (March 1, 2006): 326–35. http://dx.doi.org/10.1139/l05-123.

Full text
Abstract:
This work proposes an original approach for monitoring and validating discharge measurements on a lateral storm weir. The approach is based on the combined use of a measuring device and a discharge formula. The developed validation procedure first identifies the erroneous measurements and, consequently, the faulty sensor, using information redundancy on the discharged flow. The erroneous measurements are identified using statistical tests on the relative gap between the measured and simulated discharged flows. Secondly, the erroneous discharge flow measurements are corrected by following the correction rules stated for each type of error. The validation procedure has been tested in a laboratory on a test bench equipped with sensors currently used in wastewater systems. The tests helped demonstrate the validity and robustness of the Dominguez formula used to simulate discharge flows. These tests also helped substantiate the validation procedure developed through identification and correction of erroneous discharged flows introduced in the measurement samples.Key words: storm overflow, lateral storm weir, discharge formula, instrumentation, validation, measurements.[Journal translation]
APA, Harvard, Vancouver, ISO, and other styles
21

Rahmi, Mulia, Saminan Saminan, Muhammad Syukri, Yusrizal Yusrizal, Ibnu Khaldun, Wiwit Artika, and Ismul Huda. "Development of a Virtual Lab in Science-Physics Learning Based on the STEM Approach." Jurnal Penelitian Pendidikan IPA 8, no. 4 (October 31, 2022): 2351–55. http://dx.doi.org/10.29303/jppipa.v8i4.1660.

Full text
Abstract:
Learning innovation in the laboratory has experienced very rapid development in the current era of technological development. One of these innovations is through the use of virtual laboratory classes in science learning, as is the case with temperature and heat materials. The aims of this research are to develop a virtual lab based on the STEM approach on the concepts of temperature and heat, to test the validity and practicality of virtual lab products based on the STEM approach. The development method used is 4-D (four-D). The data obtained in the form of quantitative and qualitative data. Qualitative data in the form of suggestions, input, feedback and criticism from the validator and quantitative data in the form of a validation questionnaire. The validation results show that the validation is 87.89% (very feasible). In addition, the practical test of student responses was obtained at 96, 15 (very practical) and the teacher at 3.31 (very practical). Practically speaking, the virtual lab based on the STEM approach that was developed is very useful to support practical activities.
APA, Harvard, Vancouver, ISO, and other styles
22

Wang, Qing, Ming Zhang, Tyler Tomita, Joshua T. Vogelstein, Shibin Zhou, Nickolas Papadopoulos, Kenneth W. Kinzler, and Bert Vogelstein. "Selected reaction monitoring approach for validating peptide biomarkers." Proceedings of the National Academy of Sciences 114, no. 51 (December 4, 2017): 13519–24. http://dx.doi.org/10.1073/pnas.1712731114.

Full text
Abstract:
We here describe a selected reaction monitoring (SRM)-based approach for the discovery and validation of peptide biomarkers for cancer. The first stage of this approach is the direct identification of candidate peptides through comparison of proteolytic peptides derived from the plasma of cancer patients or healthy individuals. Several hundred candidate peptides were identified through this method, providing challenges for choosing and validating the small number of peptides that might prove diagnostically useful. To accomplish this validation, we used 2D chromatography coupled with SRM of candidate peptides. We applied this approach, called sequential analysis of fractionated eluates by SRM (SAFE-SRM), to plasma from cancer patients and discovered two peptides encoded by the peptidyl-prolyl cis–trans isomerase A (PPIA) gene whose abundance was increased in the plasma of ovarian cancer patients. At optimal thresholds, elevated levels of at least one of these two peptides was detected in 43 (68.3%) of 63 women with ovarian cancer but in none of 50 healthy controls. In addition to providing a potential biomarker for ovarian cancer, this approach is generally applicable to the discovery of peptides characteristic of various disease states.
APA, Harvard, Vancouver, ISO, and other styles
23

Benin, Andrea L., Ada Fenick, Jeph Herrin, Grace Vitkauskas, John Chen, and Cynthia Brandt. "How Good Are the Data? Feasible Approach to Validation of Metrics of Quality Derived From an Outpatient Electronic Health Record." American Journal of Medical Quality 26, no. 6 (September 16, 2011): 441–51. http://dx.doi.org/10.1177/1062860611403136.

Full text
Abstract:
Although electronic health records (EHRs) promise to be efficient resources for measuring metrics of quality, they are not designed for such population-based analyses. Thus, extracting meaningful clinical data from them is not straightforward. To avoid poorly executed measurements, standardized methods to measure and to validate metrics of quality are needed. This study provides and evaluates a use case for a generally applicable approach to validating quality metrics measured electronically from EHR-based data. The authors iteratively refined and validated 4 outpatient quality metrics and classified errors in measurement. Multiple iterations of validation and measurement resulted in high levels of sensitivity and agreement versus the “gold standard” of manual review. In contrast, substantial differences remained for measurement based on coded billing data. Measuring quality metrics using an EHR-based electronic process requires validation to ensure accuracy; approaches to validation such as those described in this study should be used by organizations measuring quality from EHR-based information.
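The comparison against the manual-review gold standard boils down to sensitivity and raw agreement, as in this sketch with hypothetical 0/1 flags for whether each patient met the quality metric.

```python
# Sketch of comparing an EHR-derived quality metric with manual chart review.
ehr_flags    = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # electronic measurement
manual_flags = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1]   # manual review (gold standard)

tp = sum(e == 1 and m == 1 for e, m in zip(ehr_flags, manual_flags))
fn = sum(e == 0 and m == 1 for e, m in zip(ehr_flags, manual_flags))
agree = sum(e == m for e, m in zip(ehr_flags, manual_flags))

sensitivity = tp / (tp + fn)
agreement = agree / len(manual_flags)
print(f"sensitivity = {sensitivity:.2f}, agreement = {agreement:.2f}")
```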
APA, Harvard, Vancouver, ISO, and other styles
24

Nakamura, Akira, and Toru Oumaya. "ICONE15-10424 AN APPROACH OF INTEGRATED EVALUATION METHOD FOR THERMAL FATIGUE AND ITS VALIDATION APPLYING SPECTRA TEST." Proceedings of the International Conference on Nuclear Engineering (ICONE) 2007.15 (2007): _ICONE1510. http://dx.doi.org/10.1299/jsmeicone.2007.15._icone1510_222.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Wan, Li, and Ying Jin. "Assessment of model validation outcomes of a new recursive spatial equilibrium model for the Greater Beijing." Environment and Planning B: Urban Analytics and City Science 46, no. 5 (September 27, 2017): 805–25. http://dx.doi.org/10.1177/2399808317732575.

Full text
Abstract:
Robust calibration and validation of applied urban models are prerequisites for their successful, policy-cogent use. This is particularly important today when expert assessment is questioned and closely scrutinized. This paper proposes a new model calibration-validation strategy based on a spatial equilibrium model that incorporates multiple time horizons, such that the predictive capabilities of the model can be empirically tested. The model is implemented for the Greater Beijing city region and the model validation strategy is demonstrated over the Census years 2000 to 2010. Through forward/backward forecasting, the model validation helps to verify the stability of the model parameters as well as the predictive capabilities of the recursive equilibrium framework. The proposed modelling strategy sets a new standard for verifying and validating recursive equilibrium models. We also consider the wider implications of the approach.
APA, Harvard, Vancouver, ISO, and other styles
26

Palumbo, Diego, Martina Mori, Francesco Prato, Stefano Crippa, Giulio Belfiori, Michele Reni, Junaid Mushtaq, et al. "Prediction of Early Distant Recurrence in Upfront Resectable Pancreatic Adenocarcinoma: A Multidisciplinary, Machine Learning-Based Approach." Cancers 13, no. 19 (September 30, 2021): 4938. http://dx.doi.org/10.3390/cancers13194938.

Full text
Abstract:
Despite careful selection, the recurrence rate after upfront surgery for pancreatic adenocarcinoma can be very high. We aimed to construct and validate a model for the prediction of early distant recurrence (<12 months from index surgery) after upfront pancreaticoduodenectomy. After exclusions, 147 patients were retrospectively enrolled. Preoperative clinical and radiological (CT-based) data were systematically evaluated; moreover, 182 radiomics features (RFs) were extracted. Most significant RFs were selected using minimum redundancy, robustness against delineation uncertainty and an original machine learning bootstrap-based method. Patients were split into training (n = 94) and validation cohort (n = 53). Multivariable Cox regression analysis was first applied on the training cohort; the resulting prognostic index was then tested in the validation cohort. Clinical (serum level of CA19.9), radiological (necrosis), and radiomic (SurfAreaToVolumeRatio) features were significantly associated with the early resurge of distant recurrence. The model combining these three variables performed well in the training cohort (p = 0.0015, HR = 3.58, 95%CI = 1.98–6.71) and was then confirmed in the validation cohort (p = 0.0178, HR = 5.06, 95%CI = 1.75–14.58). The comparison of survival curves between low and high-risk patients showed a p-value <0.0001. Our model may help to better define resectability status, thus providing an actual aid for pancreatic adenocarcinoma patients’ management (upfront surgery vs. neoadjuvant chemotherapy). Independent validations are warranted.
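The train-then-validate step for such a prognostic model can be sketched with the lifelines package: fit a Cox model on the training cohort, then evaluate the resulting prognostic index on the held-out cohort via the concordance index. The bundled Rossi dataset is only a runnable stand-in; in the study the covariates would be CA19.9, necrosis and SurfAreaToVolumeRatio, and the radiomics feature selection is not reproduced.

```python
# Sketch of Cox-model training and external validation with lifelines, using a
# bundled example dataset as a placeholder for the study's cohorts.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.utils import concordance_index

df = load_rossi()                                   # stand-in survival data
train_df, valid_df = df.iloc[:300], df.iloc[300:]   # stand-in cohorts

cph = CoxPHFitter()
cph.fit(train_df, duration_col="week", event_col="arrest")

risk = cph.predict_partial_hazard(valid_df)          # prognostic index per subject
c_index = concordance_index(valid_df["week"], -risk, valid_df["arrest"])
print(f"validation concordance index = {c_index:.2f}")
```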
APA, Harvard, Vancouver, ISO, and other styles
27

Tong, Jian-Bo, Jia Chang, Shu-Ling Liu, and Min Bai. "A quantitative structure–activity relationship (QSAR) study of peptide drugs based on a new descriptor of amino acids." Journal of the Serbian Chemical Society 80, no. 3 (2015): 343–53. http://dx.doi.org/10.2298/jsc140604069t.

Full text
Abstract:
The quantitative structure-activity relationship (QSAR) approach is used to find the relationship between molecular structures and the activity of peptide drugs. In this work, stepwise multiple regression was employed to select an optimal subset of descriptors that contribute significantly to the drug activity of 21 oxytocin analogues, 48 bitter-taste thresholds, and 58 angiotensin-converting enzyme inhibitors. A new set of descriptors, SVWGM, was used to predict the activity of the peptide drugs and to build models with the partial least squares method; the estimation stability and generalization ability of the models were strictly analyzed by both internal and external validation, using the cross-validation correlation coefficient, the correlation coefficient, and the correlation coefficient of external validation.
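The modelling-and-validation loop (PLS regression with internal cross-validation and an external hold-out set) can be sketched with scikit-learn. The SVWGM descriptors are not reproduced, so the data below are synthetic placeholders.

```python
# Sketch of PLS modelling with internal (Q^2 via cross-validation) and external
# (R^2 on a hold-out set) validation. X and y are synthetic placeholders for the
# SVWGM descriptors and peptide activities.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(58, 12))                                        # descriptors
y = X[:, 0] * 0.8 - X[:, 3] * 0.5 + rng.normal(scale=0.3, size=58)   # activities

X_tr, X_ext, y_tr, y_ext = train_test_split(X, y, test_size=0.25, random_state=1)

pls = PLSRegression(n_components=3)
q2 = r2_score(y_tr, cross_val_predict(pls, X_tr, y_tr, cv=5).ravel())  # internal
pls.fit(X_tr, y_tr)
r2_ext = r2_score(y_ext, pls.predict(X_ext).ravel())                   # external
print(f"Q2 (5-fold CV) = {q2:.2f}, external R2 = {r2_ext:.2f}")
```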
APA, Harvard, Vancouver, ISO, and other styles
28

Dennis, Kristine, Brian Carter, Susan Gapstur, and Victoria Stevens. "Metabolomics Approach for Validation of Self-Reported Ibuprofen and Acetaminophen Use." Metabolites 8, no. 4 (September 21, 2018): 55. http://dx.doi.org/10.3390/metabo8040055.

Full text
Abstract:
Over-the-counter analgesic use is common and is typically assessed through self-report; therefore, it is subject to misclassification. Detection of drug metabolites in biofluids offers a viable tool for validating self-reported analgesic use. Thus, the aim of this study was to determine the utility of a metabolomics approach for the validation of acetaminophen and ibuprofen use in blood samples. Untargeted mass spectrometry-based metabolomics analysis was conducted in serum samples from 1547 women and plasma samples from 556 men. The presence of two metabolites each for acetaminophen and ibuprofen at levels at or above a defined cutoff value was used to determine concordance with self-reported use. For acetaminophen use based on the presence of both acetaminophen and acetamidophenylglucuronide, concordance was 98.5–100% among individuals reporting use today, and 79.8–91.4% for those reporting never or rare use. Ibuprofen use based on the presence of both carboxyibuprofen and hydroxyibuprofen resulted in concordance of 51.3–52.5% for individuals reporting use today and 99.4–100% for those reporting never or rare use. Our findings suggest that an untargeted metabolomics approach in blood samples may be useful for validating self-reported acetaminophen use. However, this approach appears unlikely to be suitable for validating ibuprofen use.
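The decision rule and concordance check can be illustrated as follows, assuming a sample is called a "user" when both drug metabolites are at or above a cutoff; the metabolite values and the cutoff below are hypothetical placeholders.

```python
# Sketch of the two-metabolite detection rule and its concordance with self-report.
import pandas as pd

cutoff = 1.0   # assumed normalized peak-intensity cutoff (placeholder)
samples = pd.DataFrame({
    "acetaminophen":              [2.3, 0.1, 1.5, 0.0, 3.1],
    "acetamidophenylglucuronide": [1.8, 0.0, 1.2, 0.2, 2.7],
    "self_reported_use_today":    [True, False, True, False, True],
})

detected = (samples["acetaminophen"] >= cutoff) & \
           (samples["acetamidophenylglucuronide"] >= cutoff)
concordance = (detected == samples["self_reported_use_today"]).mean()
print(f"concordance with self-report: {concordance:.0%}")
```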
APA, Harvard, Vancouver, ISO, and other styles
29

Dixon, Mark B. "A Graphical Based Approach to the Conceptual Modeling, Validation and Generation of XML Schema Definitions." International Journal of Information Technology and Web Engineering 8, no. 1 (January 2013): 1–22. http://dx.doi.org/10.4018/jitwe.2013010101.

Full text
Abstract:
This paper discusses the research and development of a modeling tool that provides a graphical approach to the definition, validation and generation of XML schemas. Although XML has had a ubiquitous web presence for a number of years the strength of its underlying validation framework is often not leveraged to its maximum potential. Additionally the design process followed when developing XML data formats is often rather ad-hoc and driven by technical requirements of the application rather than a conceptual level analysis of the problem domain. This work contributes to research knowledge by proposing and validating a mechanism for allowing non-programmers to easily visualise and design the rules to which XML documents should comply. The use of an underlying meta-case platform provides a unique opportunity to allow highly customisable support and automatic code generation for any number of schema definition languages, thus providing a test-bed for future research activities.
APA, Harvard, Vancouver, ISO, and other styles
30

Raviasta, Sukma, Zulhelmi Zulhelmi, and Zuhdi Ma’aruf. "Development of Learning Tools with Learning Videos Using the Flipped Classroom Approach on Gas Kinetic Theory Material in Class XI SMA." Jurnal Geliga Sains: Jurnal Pendidikan Fisika 8, no. 2 (January 15, 2021): 150. http://dx.doi.org/10.31258/jgs.8.2.150-158.

Full text
Abstract:
This research aimed to develop valid learning tools with learning videos using a flipped-classroom approach for the kinetic theory of gases for students in class XI MIPA SMA. The research uses a research and development method with the ADDIE development model (analysis, design, development, implementation, and evaluation), which in this study is limited to the development phase. The research instruments were a problem-fulfillment questionnaire completed by students and a validation assessment sheet completed by validators. Data were collected from the questionnaires filled out by students at SMA N 1 Pekanbaru and from the validators' validation analysis sheets. The data analysis technique used the percentage method, followed by data validation using descriptive analysis. This research produced learning tools in the form of lesson plans, student worksheets, and learning videos. The assessment of the learning tools using the flipped classroom approach obtained average values of 4.36 for the lesson plans (very high category), 4.56 for the learning videos (very high category), and 4.2 for the student worksheets (LKPD, very high category). Each assessment indicator has a value greater than 3, and the validity is declared valid. Thus, the learning tools with video media using the flipped-classroom approach for the kinetic theory of gases for students of class XI MIPA SMA are declared valid and suitable for use.
APA, Harvard, Vancouver, ISO, and other styles
31

Har-even, M., and V. L. Brailovsky. "Probabilistic validation approach for clustering." Pattern Recognition Letters 16, no. 11 (November 1995): 1189–96. http://dx.doi.org/10.1016/0167-8655(95)00073-p.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Magdon-Ismail, Malik, and Konstantin Mertsalov. "A permutation approach to validation*." Statistical Analysis and Data Mining: The ASA Data Science Journal 3, no. 6 (November 16, 2010): 361–80. http://dx.doi.org/10.1002/sam.10096.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Mhadhbi, Zeineb, Sajeh Zairi, Cedric Gueguen, and Belhassen Zouari. "Validation of a Distributed Energy Management Approach for Smart Grid Based on a Generic Colored Petri Nets Model." Journal of Clean Energy Technologies 6, no. 1 (January 2018): 20–25. http://dx.doi.org/10.18178/jocet.2018.6.1.430.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Inra, Azwar, Sukamto Sukamto, and Z. Mawardi Effendi. "Developing a model of soft-skill teaching for civil engineering students." Research and Evaluation in Education 2, no. 2 (December 28, 2016): 122. http://dx.doi.org/10.21831/reid.v2i2.8220.

Full text
Abstract:
An observation shows that civil engineering students' soft skills are still low, characterized by poor sociability and a weak positive attitude toward dealing with problems and challenges of life. For students to acquire various aspects of soft skills, a study was conducted to develop a model of soft-skill teaching that would meet the criteria of validity, effectiveness, and practicality. The research and development followed the Richey & Klein approach, which consists of: (1) developing a model; and (2) validating the model. The validation process was conducted in two phases, namely internal validation and external validation. The internal validation of the model and its components was conducted by experts in educational technology, vocational education, and evaluation. The external validation was conducted in small groups and in large groups, involving 41 students and a member of the Department of Civil Engineering Board. The data analysis was performed in two phases: the developmental phase and the testing phase. The analysis in the development phase used a qualitative approach, while the analysis in the testing phase used a quantitative approach; both were applied to the model validation and the trial results and were also undertaken by the experts. The results of the study show that the model of soft-skill teaching developed for the students of the Department of Civil Engineering met the criteria of validity, effectiveness, and practicality.
APA, Harvard, Vancouver, ISO, and other styles
35

Chevalier, Luc, Heba Makhlouf, Benoît Jacquet-Faucillon, and Eric Launay. "Modeling the influence of connecting elements in wood products behavior: a numerical multi-scale approach." Mechanics & Industry 19, no. 3 (2018): 301. http://dx.doi.org/10.1051/meca/2018004.

Full text
Abstract:
Wood furniture is often composed of simple parts that may be modeled as beams or plates. These particularities allow simplified approaches that reduce the number of degrees of freedom (dof for short) in a finite element simulation of the furniture's behavior. Generally, connections are not taken into account in such simulations, but these connections are critical in the failure process of the furniture and are worth studying precisely. Using a multi-scale approach, this paper introduces a numerical procedure to identify the connection contribution to the furniture's stiffness. Comparing 3D finite element calculations with a Timoshenko beam calculation on a corner of two wooden parts, we identify the specific behavior of the connection elements (pins, nut, screw… and local 3D effects) to introduce it as a punctual 0D element in the beam code. Two validations of the approach are presented here: (i) a numerical validation, by comparing the result of the beam code with a complete 3D finite element simulation on a plane structure representative of wooden furniture; (ii) an experimental validation, by performing a global bending test and measuring the displacement field using digital image correlation (DIC for short).
APA, Harvard, Vancouver, ISO, and other styles
36

Islam, Rafiq, Jennifer Vance, Martin Poirier, Jennifer Zimmer, Ardeshir Khadang, Dave Williams, Jennifer Zemo, et al. "Recommendations on ELISpot assay validation by the GCC." Bioanalysis 14, no. 4 (February 2022): 187–93. http://dx.doi.org/10.4155/bio-2022-0010.

Full text
Abstract:
Gene therapy, cell therapy and vaccine research have led to an increased need to perform cellular immunity testing in a regulated environment to ensure the safety and efficacy of these treatments. The most common method for the measurement of cellular immunity has been Enzyme-Linked Immunospot assays. However, there is a lack of regulatory guidance available discussing the recommendations for developing and validating these types of assays. Hence, the Global CRO Council has issued this white paper to provide a consensus on the different validation parameters required to support Enzyme-Linked Immunospot assays and a harmonized and consistent approach to Enzyme-Linked Immunospot validation among contract research organizations.
APA, Harvard, Vancouver, ISO, and other styles
37

Akolkar, Dadasaheb B., Darshana Patil, Pradip Fulmali, Pooja Fulmali, Revati Patil, Kiran Bendale, Archana Adhav, et al. "Analytical and clinical validation of the TruCheck platform for diagnostic triaging of symptomatic cases suspected of colorectal cancer." Journal of Clinical Oncology 39, no. 3_suppl (January 20, 2021): 24. http://dx.doi.org/10.1200/jco.2021.39.3_suppl.24.

Full text
Abstract:
Background: Trucheck is a non-invasive approach for diagnostic triaging of symptomatic individuals who are suspected of Colorectal Cancer (CRC) and have been advised an invasive biopsy. Trucheck evaluates blood samples for presence of Circulating Ensembles of Tumor Associated Cells (C-ETACs: EpCAM+, Pan-CK+, CD45±) originating from a primary Colorectal Adenocarcinoma (CR-AD: CDX-2, CK-20, Muc2); such C-ETACs are unexpected in asymptomatic individuals as well as in individuals with benign Colorectal conditions. Methods: For Analytical Validation, spike-recovery analysis was performed using control cells / cell lines for EpCAM (SKBR-3), Pan-CK (SKBR-3), CD45 (PBMCs), CDX2 (CaCO2), CK20 (HCT15) and MUC2 (SW620) to determine the Sensitivity, Specificity, Accuracy, Limit of Detection, Linearity and Precision. Clinical Validations were performed using 15 mL blood samples from 587 participants. An initial Retrospective Clinical Pre-validation was based on blood samples from 350 known cases of CR-AD and 20 cases of non-CRC. The Prospective Clinical Validation was performed on blood samples collected prior to any invasive procedures from 217 symptomatic cases suspected of CRC. Results: Analytical Validation indicated 100% Sensitivity, 100% Specificity, 100% Accuracy, 90% Precision and significant linearity (R2≥0.98) for all ICC markers. Clinical Pre-validation indicated 84.9% Sensitivity and 100% Specificity for CR-AD v/s non-CRC. In the Prospective Clinical Validation cohort, histopathological evaluation (HPE) of biopsied tumor tissue indicated benign Colorectal conditions in 10 cases and CR-AD in 207 of the 217 suspected cases. Based on HPE as the standard, the Trucheck approach had 90.3% Sensitivity, 100% Specificity and 95.2% Accuracy. Conclusions: Symptomatic individuals suspected of CRC may benefit from the sensitive and specific non-invasive Trucheck approach. Individuals positive for CR-AD-specific C-ETACs can be prioritized for further clinical procedures while C-ETAC negative individuals can be considered for alternate diagnoses. The Trucheck approach can eliminate the need for (and risks associated with) invasive colon biopsies in a significant proportion of suspected cases and is especially useful where
APA, Harvard, Vancouver, ISO, and other styles
38

Ahmetaj, Shqiponja, Bianca Löhnert, Magdalena Ortiz, and Mantas Šimkus. "Magic shapes for SHACL validation." Proceedings of the VLDB Endowment 15, no. 10 (June 2022): 2284–96. http://dx.doi.org/10.14778/3547305.3547329.

Full text
Abstract:
A key prerequisite for the successful adoption of the Shapes Constraint Language (SHACL)---the W3C standardized constraint language for RDF graphs---is the availability of automated tools that efficiently validate targeted constraints (known as shapes graphs ) over possibly very large RDF graphs. There are already significant efforts to produce optimized engines for SHACL validation, but they focus on restricted fragments of SHACL. For unrestricted SHACL, that is SHACL with unrestricted recursion and negation, there is no validator beyond a proof-of-concept prototype, and existing techniques are inherently incompatible with the goal-driven approaches being pursued by existing validators. Instead they require a global computation on the entire data graph that is not only computationally very costly, but also brittle, and can easily result in validation failures due to conflicts that are irrelevant to the validation targets. To address these challenges, we present a 'magic' transformation---based on Magic Sets as known from Logic Programming---that transforms a SHACL shapes graph S into a new shapes graph S' whose validation considers only the relevant neighbourhood of the targeted nodes. The new S' is equivalent to S whenever there are no conflicts between the constraints and the data, and in case the validation of S fails due to conflicts that are irrelevant to the target, S' may still admit a lazy, target-oriented validation. We implement the algorithm and run preliminary experiments, showing our approach can be a stepping stone towards validators for full SHACL, and that it can significantly improve the performance of the only prototype validator that currently supports full recursion and negation.
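Plain (non-magic) SHACL validation, the baseline the paper builds on, looks like this with the pySHACL library. The magic-sets rewriting of the shapes graph that constitutes the paper's contribution is not implemented here; the data and shapes graphs are tiny inline examples.

```python
# Baseline SHACL validation with pySHACL: checks a data graph against a shapes
# graph and prints the validation report. Illustrative graphs only.
from pyshacl import validate

data_graph = """
@prefix ex: <http://example.org/> .
ex:alice a ex:Person ; ex:name "Alice" .
ex:bob   a ex:Person .
"""

shapes_graph = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .
ex:PersonShape a sh:NodeShape ;
    sh:targetClass ex:Person ;
    sh:property [ sh:path ex:name ; sh:minCount 1 ] .
"""

conforms, report_graph, report_text = validate(
    data_graph, shacl_graph=shapes_graph,
    data_graph_format="turtle", shacl_graph_format="turtle")
print(conforms)        # False: ex:bob has no ex:name
print(report_text)
```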
APA, Harvard, Vancouver, ISO, and other styles
39

Cheng, Kwok Sun, Pei-Chi Huang, Tae-Hyuk Ahn, and Myoungkyu Song. "Tool Support for Improving Software Quality in Machine Learning Programs." Information 14, no. 1 (January 16, 2023): 53. http://dx.doi.org/10.3390/info14010053.

Full text
Abstract:
Machine learning (ML) techniques discover knowledge from large amounts of data. Modeling in ML is becoming essential to software systems in practice. ML research communities have focused on the accuracy and efficiency of ML models, while less attention has been paid to validating the qualities of ML models. Validating ML applications is a challenging and time-consuming process for developers since prediction accuracy heavily relies on generated models. ML applications are written in a relatively more data-driven programming style on top of the black box of ML frameworks. All of the datasets and the ML application need to be individually investigated. Thus, the ML validation tasks take a lot of time and effort. To address this limitation, we present a novel quality validation technique that increases the reliability of ML models and applications, called MLVal. Our approach helps developers inspect the training data and the generated features for the ML model. A data validation technique is important and beneficial to software quality since the quality of the input data affects the speed and accuracy of training and inference. Inspired by software debugging/validation for reproducing reported bugs, MLVal takes as input an ML application and its training datasets to build the ML models, helping ML application developers easily reproduce and understand anomalies in the ML application. We have implemented an Eclipse plugin for MLVal that allows developers to validate the prediction behavior of their ML applications, the ML model, and the training data in the Eclipse IDE. In our evaluation, we used 23,500 documents in the bioengineering research domain. We assessed the ability of the MLVal validation technique to effectively help ML application developers: (1) investigate the connection between the produced features and the labels in the training model, and (2) detect errors early to secure the quality of models from better data. Our approach reduces the cost of the engineering effort required to validate problems, improving data-centric workflows of ML application development.
APA, Harvard, Vancouver, ISO, and other styles
40

Klein, Devorah E., and Matthew J. Jordan. "Methods of Assessing Medical Devices." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 46, no. 23 (September 2002): 1890–94. http://dx.doi.org/10.1177/154193120204602305.

Full text
Abstract:
While designing and validating any complex system has challenges, the medical domain has specific requirements which must be considered for a system or device to be successful. The environments, communities of use, and interactions are varied, unpredictable, uncontrolled, and ever-changing. Given the environments, communities of use, and interactions involved with medical devices, successful early and late validation of the device must be informed by the context of use itself. Building "frameworks" which represent the context of use for the device can focus validation goals, methods, and criteria and ensure that validation is directed and appropriate. In this paper we present a process and associated methods for defining the frameworks in which medical devices can be successfully assessed. The process has two phases. In Phase 1: Definition, a framework of understanding is built which represents the environment of use, community of users, and the interactions between systems and users for the medical device in development. In Phase 2: Validation, the framework which defines the environment of use, community of users, and the interactions between systems and users is used to develop a validation approach and criteria. The developing device is then validated against the framework itself.
APA, Harvard, Vancouver, ISO, and other styles
41

Gusfitri, Winda, Abdurrahman Abdurrahman, Dedek Andrian, Nofriyandi Nofriyandi, and Sri Rezeki. "Development of Mathematics Learning Tools Based on Ethnomathematics on Rectangular and Triangles in Junior High School." Prisma Sains : Jurnal Pengkajian Ilmu dan Pembelajaran Matematika dan IPA IKIP Mataram 10, no. 3 (July 6, 2022): 609. http://dx.doi.org/10.33394/j-ps.v10i3.5310.

Full text
Abstract:
Ethnomathematics is mathematics that appears in a particular culture and is considered a learning approach that aims to enable students to solve mathematical problems related to their daily activities, which include grouping, counting, measuring, designing buildings or tools, playing, and determining location. This study aims to produce ethnomathematics-based mathematics learning tools on quadrilaterals and triangles that have been tested for feasibility. The learning tools are in the form of learning implementation plans (LIP) and student worksheets (SAS). This study uses the R&D method with the following steps: potential and problems, data collection, product design, design validation, design revision, and final product. The pilot stage was not used because school learning was being carried out online. The data collected are validation data and validator response data from two Mathematics Education lecturers and one mathematics teacher, and the analysis techniques are validation data analysis and validator response data analysis of the mathematics learning tools. The results show LIP validation of 89.72% with very valid criteria and SAS validation of 92.25% with very valid criteria, while the validators' response to the LIP was 80% with good criteria and to the SAS 79.86% with good criteria. Based on these results, it can be concluded that the ethnomathematics-based mathematics learning tools on quadrilateral and triangle material are feasible to be used and tested in junior high schools.
APA, Harvard, Vancouver, ISO, and other styles
42

De Meester, Ben, Pieter Heyvaert, Dörthe Arndt, Anastasia Dimou, and Ruben Verborgh. "RDF graph validation using rule-based reasoning." Semantic Web 12, no. 1 (November 19, 2020): 117–42. http://dx.doi.org/10.3233/sw-200384.

Full text
Abstract:
The correct functioning of Semantic Web applications requires that given RDF graphs adhere to an expected shape. This shape depends on the RDF graph and the application’s supported entailments of that graph. During validation, RDF graphs are assessed against sets of constraints, and the found violations help refine the RDF graphs. However, existing validation approaches cannot always explain the root causes of violations (inhibiting refinement), and cannot fully match the entailments supported during validation with those supported by the application. As a result, these approaches either cannot accurately validate RDF graphs or must combine multiple systems, which deteriorates the validator’s performance. In this paper, we present an alternative validation approach using rule-based reasoning, capable of fully customizing the inferencing steps used. We compare it to existing approaches, and present a formal grounding and a practical implementation, “Validatrr”, based on N3Logic and the EYE reasoner. Our approach, which supports an equivalent number of constraint types compared to the state of the art, better explains the root cause of violations thanks to the reasoner’s generated logical proof, and returns an accurate number of violations thanks to the customizable inferencing rule set. Performance evaluation shows that Validatrr is performant for smaller datasets and scales linearly with the RDF graph size. The detailed root-cause explanations can guide future validation report description specifications, and the fine-grained level of configuration can be employed to support different constraint languages. This foundation allows further research into handling recursion, validating RDF graphs based on their generation description, and providing automatic refinement suggestions.
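
As a rough illustration of the rule-based idea (a minimal sketch in plain Python, not the Validatrr/N3Logic implementation), the snippet below first applies an inference rule to derive implicit triples and then applies a constraint rule that reports which resource violates a disjointness constraint, mirroring how a reasoner's derivation can expose the root cause of a violation.

triples = {
    ("ex:alice", "ex:worksFor", "ex:acme"),
    ("ex:worksFor", "rdfs:domain", "ex:Person"),
    ("ex:alice", "rdf:type", "ex:Robot"),
}

def infer_domain_types(graph):
    """Inference rule: if (p rdfs:domain C) and (s p o) hold, derive (s rdf:type C)."""
    derived = set()
    for s, p, o in graph:
        for s2, p2, cls in graph:
            if s2 == p and p2 == "rdfs:domain":
                derived.add((s, "rdf:type", cls))
    return graph | derived

def check_disjoint(graph, class_a, class_b):
    """Constraint rule: report resources typed as both class_a and class_b."""
    violations = []
    for s, p, o in graph:
        if p == "rdf:type" and o == class_a and (s, "rdf:type", class_b) in graph:
            violations.append((s, f"is both {class_a} and {class_b}"))
    return violations

enriched = infer_domain_types(triples)
print(check_disjoint(enriched, "ex:Person", "ex:Robot"))
# [('ex:alice', 'is both ex:Person and ex:Robot')]
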
APA, Harvard, Vancouver, ISO, and other styles
43

Loew, A., and F. Schlenz. "A dynamic approach for evaluating coarse scale satellite soil moisture products." Hydrology and Earth System Sciences 15, no. 1 (January 11, 2011): 75–90. http://dx.doi.org/10.5194/hess-15-75-2011.

Full text
Abstract:
Validating coarse scale remote sensing soil moisture products requires a comparison of gridded data to point-like ground measurements. The necessary aggregation of in situ measurements to the footprint scale of a satellite sensor (>100 km²) introduces uncertainties in the validation of the satellite soil moisture product. Observed differences between the satellite product and in situ data are therefore partly attributable to these aggregation uncertainties. The present paper investigates different approaches to disentangle the error of the satellite product from the uncertainties associated to the up-scaling of the reference data. A novel approach is proposed, which allows for the quantification of the remote sensing soil moisture error using a temporally adaptive technique. It is shown that the point-to-area sampling error can be estimated within 0.0084 m³/m³.
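
A minimal numerical sketch of the disentangling idea, under the simplifying assumption that the satellite error and the point-to-area sampling error are independent and unbiased so that their variances add; the synthetic data and the estimator below are our illustration, not the paper's temporally adaptive technique.

import numpy as np

def satellite_error_std(sat, insitu_stations):
    """sat: (T,) satellite series; insitu_stations: (T, N) station series, both in m3/m3."""
    insitu_mean = insitu_stations.mean(axis=1)             # footprint-scale reference
    diff_var = np.var(sat - insitu_mean)                    # total difference variance
    # variance of the station-mean sampling error, estimated from the station spread
    sampling_var = np.mean(insitu_stations.var(axis=1)) / insitu_stations.shape[1]
    return np.sqrt(max(diff_var - sampling_var, 0.0))

rng = np.random.default_rng(0)
truth = 0.25 + 0.05 * np.sin(np.linspace(0, 6, 365))        # synthetic soil moisture
stations = truth[:, None] + rng.normal(0.0, 0.04, (365, 8))  # 8 noisy point measurements
sat = truth + rng.normal(0.0, 0.03, 365)                     # satellite error std = 0.03
print(round(satellite_error_std(sat, stations), 4))          # recovers roughly 0.03
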
APA, Harvard, Vancouver, ISO, and other styles
44

Loew, A., and F. Schlenz. "A dynamic approach for evaluating coarse scale satellite soil moisture products." Hydrology and Earth System Sciences Discussions 7, no. 5 (September 24, 2010): 7263–303. http://dx.doi.org/10.5194/hessd-7-7263-2010.

Full text
Abstract:
Validating coarse scale remote sensing soil moisture products requires a comparison of gridded data to point-like ground measurements. The necessary aggregation of in situ measurements to the footprint scale of a satellite sensor (>100 km²) introduces uncertainties in the validation of the satellite soil moisture product. Observed differences between the satellite product and in situ data are therefore partly attributable to these aggregation uncertainties. The present paper investigates different approaches to disentangle the error of the satellite product from the uncertainties associated to the up-scaling of the reference data. A novel approach is proposed, which allows for the quantification of the remote sensing soil moisture error using a temporally adaptive technique. It is shown that the point-to-area sampling error can be estimated within 0.0084 m³/m³.
APA, Harvard, Vancouver, ISO, and other styles
45

Sharma, S. K. "Validation of Spallation Models: An Approach." Acta Physica Polonica B Proceedings Supplement 6, no. 4 (2013): 1161. http://dx.doi.org/10.5506/aphyspolbsupp.6.1161.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Chafouk, Houcine, Ghaleb Hoblos, Nicolas Langlois, Serge Le Gonidec, and José Ragot. "Soft Computing Approach for Data Validation." Journal of Aerospace Computing, Information, and Communication 4, no. 1 (January 2007): 628–35. http://dx.doi.org/10.2514/1.20742.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Avros, Renata, Mati Golani, and Zeev Volkovich. "A Comparative Approach to Cluster Validation." Journal of Pattern Recognition Research 6, no. 2 (2011): 230–43. http://dx.doi.org/10.13176/11.208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Rizk, N. K., and H. C. Mongia. "Calculation Approach Validation for Airblast Atomizers." Journal of Engineering for Gas Turbines and Power 114, no. 2 (April 1, 1992): 386–94. http://dx.doi.org/10.1115/1.2906603.

Full text
Abstract:
In order to formulate a common approach that could provide the spray parameters of airblast atomizers, various processes of liquid preparation, breakup, and secondary atomization have been included in a semi-analytical calculation procedure. The air velocity components in the atomizer flow field are provided by mathematical expressions, and the spray droplets are considered to form at ligament breakup through a disturbance wave growth concept. The validation of the developed approach included the application to six atomizers that significantly varied in concept, design, and size. They represented both prefilming and plain-jet types, and the data utilized in the present effort were obtained with six different liquids. Satisfactory agreement between the measurements and the predictions has been achieved under wide ranges of air/fuel ratio and air pressure drop for various test liquids. The results of this investigation indicate the potential of using such an approach in the early phases of airblast atomizer design, and may be followed by more detailed calculations using analytical tools.
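
The kind of prediction-versus-measurement comparison such a validation entails can be sketched as below; predict_smd is a hypothetical stand-in for the semi-analytical procedure (the paper's actual correlations are not reproduced), and the operating points and measured values are made up for illustration.

def predict_smd(air_fuel_ratio, delta_p_air):
    """Placeholder model: Sauter mean diameter [micron] for a given operating point."""
    return 80.0 / (air_fuel_ratio ** 0.5 * (delta_p_air / 3.0) ** 0.4)

measurements = [      # (air/fuel ratio, air pressure drop [%], measured SMD [micron])
    (2.0, 3.0, 57.0),
    (4.0, 4.0, 36.0),
    (6.0, 5.0, 27.0),
]

relative_errors = [
    abs(predict_smd(afr, dp) - smd) / smd for afr, dp, smd in measurements
]
print(f"mean relative error: {100 * sum(relative_errors) / len(relative_errors):.1f}%")
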
APA, Harvard, Vancouver, ISO, and other styles
49

Bentivegna, Marco, Nicolò Spagnolo, Chiara Vitelli, Daniel J. Brod, Andrea Crespi, Fulvio Flamini, Roberta Ramponi, et al. "Bayesian approach to Boson sampling validation." International Journal of Quantum Information 12, no. 07n08 (November 2014): 1560028. http://dx.doi.org/10.1142/s021974991560028x.

Full text
Abstract:
The Boson sampling problem consists in sampling from the output probability distribution of a bosonic Fock state after it evolves through a linear interferometer. There is strong evidence that Boson sampling is computationally hard for classical computers, while it can be solved naturally by bosons. This has drawn increasing attention to Boson sampling as a possible way to provide experimental evidence of quantum computational supremacy. Nevertheless, the very complexity of the problem makes it hard to exclude the hypothesis that the experimental data are sampled from a different probability distribution. By exploiting integrated quantum photonics, we have carried out a set of three-photon Boson sampling experiments and analyzed the results using a Bayesian approach, showing that it represents a valid alternative to currently used methods. We adopt this approach to provide evidence that the experimental data correspond to genuine three-photon interference, validating the results against the hypotheses of fully and partially distinguishable photons.
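
The underlying Bayesian model comparison can be sketched in a few lines of Python: given observed output events, the posterior odds that the samples come from the quantum (indistinguishable-photon) distribution rather than an alternative distribution are updated by the per-sample likelihood ratio. The example probabilities below are invented; in a real experiment they would be computed from permanents of submatrices of the interferometer's unitary.

def posterior_quantum(events, p_quantum, p_alt, prior_quantum=0.5):
    """P(quantum | events), assuming samples are i.i.d. under each hypothesis."""
    odds = prior_quantum / (1.0 - prior_quantum)
    for e in events:
        odds *= p_quantum[e] / p_alt[e]          # likelihood ratio per observed sample
    return odds / (1.0 + odds)

p_q   = {"112": 0.40, "123": 0.35, "223": 0.25}   # hypothetical output-mode probabilities
p_alt = {"112": 0.30, "123": 0.30, "223": 0.40}   # e.g. distinguishable-photon hypothesis
observed = ["112", "123", "112", "223", "123", "112"]
print(round(posterior_quantum(observed, p_q, p_alt), 3))
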
APA, Harvard, Vancouver, ISO, and other styles
50

Kane, Michael. "The Argument-Based Approach to Validation." School Psychology Review 42, no. 4 (December 1, 2013): 448–57. http://dx.doi.org/10.1080/02796015.2013.12087465.

Full text
APA, Harvard, Vancouver, ISO, and other styles