Dissertations / Theses on the topic 'Validation approach'

Consult the top 50 dissertations / theses for your research on the topic 'Validation approach.'

1

Pang, Lee Yick. "A quantitative approach to linguistic model validation." Thesis, University College London (University of London), 1990. http://discovery.ucl.ac.uk/10018475/.

Abstract:
The thesis is an attempt to identify a method of statistical analysis whereby theoretical linguistic models can be validated, to some degree, via analysis of language user perception of text structure. Such a tool of validation is indispensable but has yet to be identified. There are two areas of linguistic model validation where the proposed method of analysis can make a substantial contribution: a - in validating linguistic models, qua descriptive models, as explanatory models, and b - in establishing grounds for comparison among competing and/or conflicting linguistic models in the same area of linguistic investigation. The study has a clear methodological emphasis and explores new empirical procedures of text analysis. The statistical technique for such a validation study is repertory grid analysis (Kelly 1955, Slater 1977). This technique is widely used in psychotherapy but is used for the first time in linguistic investigation. Repertory grid analysis offers two very important contributions. It is, on the one hand, one of the most rigorous quantitative methods for the study of human perception; at the same time, it allows for qualitative analysis of the data, which is very desirable in the study proposed. The area of linguistics to be studied is the signalling approach to text analysis proposed by Winter (1977, 1982) and Hoey (1979, 1983). An informal pre-pilot was first carried out to examine broad features and potential problems of the application of repertory grid analysis to the investigation planned. A proper pilot was then carried out to investigate closely the feasibility of the study. Results from the pilot indicated that the proposed approach was usable. The main study was then performed on a representative sample of a target population (i.e. a sub-population of undergraduate students in Hong Kong). Besides analyses associated with the repertory grid technique, an ANOVA design was used for the investigation of aspects within the experimental situation that may be of relevance. The independent variables include relative English language proficiency and the major academic disciplines of the experimental subjects, different methods of grid elicitation, and variation in text structure. The data were analysed first on individual perception of text structure and then on the agreement between the theoretical model and subject perception both as individuals and as a group. In the analyses, both a quantitative and a qualitative approach were used. The results of the study indicated very clearly that repertory grid analysis was able to make interesting and informative comparisons between the theoretical model and subject perception of text structure and should be a usable technique for linguistic model validation as first hypothesized. In particular, individual characteristics of perception were uncovered; and the consensus view of the sample was captured. Furthermore, the present application of repertory grid analysis also enabled a qualitative analysis of the data which threw additional light on and provided much needed details for the research. The study has important implications for linguistics. Firstly, an objective and statistically based technique for rendering linguistic models susceptible to validation procedures, so far unavailable, has now been identified. Moreover, the study certainly helps to establish applied linguistics as an academic discipline at once independent from and contributing to theoretical linguistics.
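
A minimal sketch of the quantitative core of repertory grid analysis (the ratings below are made up; this is not the thesis's actual analysis): subjects rate elements, here text segments, on bipolar constructs, and a principal-component decomposition of the resulting grid summarises the perceived structure, which can then be compared with the groupings a theoretical text model predicts.

```python
import numpy as np

# rows = constructs (e.g. "signals contrast ... signals continuation"),
# cols = elements (text segments); ratings on a 1..7 scale (made-up data)
grid = np.array([
    [1, 2, 6, 7, 5],
    [2, 1, 7, 6, 6],
    [5, 6, 2, 1, 3],
], dtype=float)

centered = grid - grid.mean(axis=1, keepdims=True)  # centre each construct
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)
print("variance explained per component:", np.round(explained, 3))
# A dominant first component suggests the subject perceives the elements
# along one main dimension of construal.
```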
2

Weekley, Christopher D. "Aircraft simulation validation using an instrumental variable approach." Thesis, Virginia Tech, 1992. http://hdl.handle.net/10919/41517.

Abstract:
A procedure is developed which offers the potential to validate aircraft simulation models using noisy flight test measurements. The proposed validation procedure is based on the instrumental variable parameter identification method. The instrumental variable method requires a choice of "instruments." For this research, the "instruments" are chosen using the response predicted by an available simulation model. With the "instruments" chosen from the predicted response, it is shown that the parameter estimates are correlated with only the measured input noise vector. In contrast, the generally used least-squares approach is shown to be correlated with both the state and input noise vectors. Several studies are presented to demonstrate the utility of the validation procedure. These studies include input variations and noise variations. The method is demonstrated using longitudinal and lateral/directional axis cases derived from a nonlinear simulation of a high performance fighter aircraft. The results are presented using time response comparisons, eigenvalue comparisons, and identified stability derivative comparisons. The case study results confirm that the instrumental variable method performs better than the least-squares technique when the state noise level is high and the input noise level is relatively low.
Master of Science
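
A minimal numerical sketch of the instrumental-variable idea described above (the scalar model, noise levels and a-priori simulation model are illustrative assumptions, not taken from the thesis): instruments built from a simulated response are correlated with the true state but not with the state measurement noise, so the IV estimate avoids the bias that afflicts least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
a_true, b_true = -0.8, 0.5            # toy model: x[k+1] = a*x[k] + b*u[k]

u = rng.standard_normal(n)            # input sequence
x = np.zeros(n + 1)
for k in range(n):
    x[k + 1] = a_true * x[k] + b_true * u[k]

xm = x + 0.3 * rng.standard_normal(n + 1)   # noisy state measurements

Phi = np.column_stack([xm[:-1], u])   # regressors built from noisy states
y = xm[1:]
theta_ls = np.linalg.lstsq(Phi, y, rcond=None)[0]   # biased by state noise

# Instruments: response of an (approximate) a-priori simulation model,
# driven by the same input, hence independent of the measurement noise.
xs = np.zeros(n + 1)
for k in range(n):
    xs[k + 1] = -0.7 * xs[k] + 0.4 * u[k]
Z = np.column_stack([xs[:-1], u])

theta_iv = np.linalg.solve(Z.T @ Phi, Z.T @ y)
print("least squares:", theta_ls)     # 'a' estimate attenuated toward zero
print("instrumental :", theta_iv)     # close to (-0.8, 0.5)
```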
3

Yeo, Sheau-yuen. "Measuring organizational climate for diversity: a construct validation approach." Columbus, Ohio: Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1141677667.

4

Hachey, Krystal. "Examining Thinking Skills in the Context of Large-scale Assessments Using a Validation Approach." Thèse, Université d'Ottawa / University of Ottawa, 2014. http://hdl.handle.net/10393/30974.

Abstract:
Large Scale Assessments (LSAs) of student achievement in education serve a variety of purposes, such as comparing educational programs, providing accountability measures, and assessing achievement on a broad range of curriculum standards. In addition to measuring content-related processes such as mathematics or reading, LSAs also focus on thinking-related skills such as lower-level thinking (e.g., understanding concepts) and problem solving. The purpose of the current study was to deconstruct and clarify the mechanisms that make up an LSA, including thinking skills and assessment perspectives, from a validation approach based on the work of Messick (1995) and Kane (1990). Two questions were therefore addressed when examining the design and student data of two LSAs in reading: (a) what common thinking skills are assessed? and (b) what are the LSAs' underlying assessment perspectives? Content analyses were carried out on two LSAs that purported to assess thinking skills in reading: the Pan-Canadian Assessment Program (PCAP) and the Educational Quality and Accountability Office (EQAO). As the two LSAs evaluated reading, the link between reading and thinking was also addressed. Conceptual models were developed and used to examine the assessment framework, test booklets, and scoring guide of the two assessments. In addition, a nonlinear factor analysis was conducted on the EQAO item-level data from the test booklets to examine the dimensionality of the LSA. The most prominent thinking skill referenced after qualitatively analyzing the assessment frameworks, test booklets, and scoring guides was critical thinking, while results from the quantitative analysis revealed that two factors best represented the item-level EQAO data. Overall, the tools provided in the current study can help inform both researchers and practitioners about the interaction between the assessment approach and related thinking skills.
5

Essien, Joe. "Model driven validation approach for enterprise architecture and motivation extensions." Thesis, University of West London, 2015. https://repository.uwl.ac.uk/id/eprint/1269/.

Abstract:
As the endorsement of Enterprise Architecture (EA) modelling continues to grow in diversity and complexity, management of its schema, artefacts, semantics and relationships has become an important business concern. To maintain agility and flexibility within competitive markets, organizations have also been compelled to explore ways of adjusting proactively to innovations, changes and complex events, including the use of EA concepts to model business processes and strategies. The need to ensure appropriate validation of EA taxonomies has therefore been widely considered an essential requirement for these processes, in order to express business motivation and relate information systems to technological infrastructure. However, since many taxonomies deployed today use widespread and disparate modelling methodologies, adopting a generic validation approach remains a challenge. The proliferation of EA methodologies and perspectives has also led to intricacies in the formalization and validation of EA constructs, as models oftentimes have variant schematic interpretations. Thus, disparate implementations and inconsistent simulation of alignment between business architectures and heterogeneous application systems are common within the EA domain (Jonkers et al., 2003). In this research, the Model Driven Validation Approach (MDVA) is introduced. MDVA allows modelling of EA with validation attributes, formalization of the validation concepts and transformation of model artefacts to ontologies. The transformation simplifies querying based on motivation and constraints. As the extended methodology is grounded on the semiotics of existing tools, validation is executed using a ubiquitous query language. The major contributions of this work are the extension of the Business Layer metamodel of an EA framework (EAF) with a Validation Element, and the development of an EAF-model-to-ontology transformation approach. With this innovation, domain-driven design and object-oriented analysis concepts are applied to achieve validation of EAF models using an ontology querying methodology. Additionally, the MDVA facilitates the traceability of EA artefacts using ontology graph patterns.
6

Reeves, Stanley J. "A cross-validation approach to image restoration and blur identification." Diss., Georgia Institute of Technology, 1990. http://hdl.handle.net/1853/13414.

7

Baduel, Ronan. "An integrated model-based early validation approach for railway systems." Thesis, Toulouse 2, 2019. http://www.theses.fr/2019TOU20083.

Abstract:
Systems engineering is the domain that studies the design of complex systems. A system is a solution to be developed, such as a train or a satellite network, and a complex system is composed of several independent elements. Engineers work from lists of individual expectations of what the system should be or should do, which they use to create a system and check that it meets those expectations. To save time and money, one would like to check that the system-to-be meets expectations before building it. This requires integrating the different expectations and specifying how they fit together, yielding an expected system that can be validated. The goal of this PhD is to provide a method for integrating the information that characterises a train system during its design, enabling the specification, representation and validation of its behaviour.
8

Xiong, Jingwei. "A Penalized Approach to Mixed Model Selection Via Cross Validation." Bowling Green State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1510965832174342.

APA, Harvard, Vancouver, ISO, and other styles
9

Jaoua, Ali. "Recouvrement avant de programmes sous les hypothèses de spécifications déterministes et non déterministes." Toulouse 3, 1987. http://www.theses.fr/1987TOU30227.

Abstract:
A study of the specification and functional abstraction of programs. Program coherence is defined in relational terms, along with the different levels of coherence of program states and characterisations of these levels. A practical forward-recovery methodology is proposed, based on the idea of preserving a given level of coherence by means of executable assertions. Finally, a hybrid program-validation methodology is presented, combining formal verification of certain critical properties of the program with forward recovery for the non-critical properties.
10

Fernandez, Charles. "Modélisation et validation expérimentale des complexes insonorisants pour la prévision vibroacoustique numérique basse et moyenne fréquences des automobiles." PhD thesis, Université Paris-Est, 2008. http://tel.archives-ouvertes.fr/tel-00470535.

Abstract:
In this research, a simplified low- and medium-frequency model of the sound-insulation packages (trim) used in the automotive industry is constructed from a stochastic elastoacoustic element. The mean simplified model derives from an extension of fuzzy structure theory and depends on three physical parameters: modal density, damping and participating mass. The stochastic simplified model, which takes model and data uncertainties into account, is constructed using a nonparametric probabilistic approach and depends on three dispersion parameters. The simplified trim model is implemented in an industrial stochastic vibroacoustic model of a car. Two inverse problems are solved using an experimental database built in parallel on vehicles, allowing the parameters of the complete model to be identified. The analysis of the results validates the theoretical developments and the proposed methodology.
11

Faye-Dumanget, Christine. "L'épuisement estudiantin : approche clinique, psychopathologique, épidémiologique et psychothérapeutique TCC du syndrome du burnout académique." Thesis, Nantes, 2018. http://www.theses.fr/2018NANT2050/document.

Abstract:
The student population is a particularly vulnerable group in terms of distress and psychological suffering. Anxiety-depressive and addictive risks are often identified in these young adults. Stress is also particularly prevalent, and exhaustion is the leading cause of psychological fragility in this age group. International research on students' mental health refers to this phenomenon as academic burnout syndrome, or academic burnout (ABO). Burnout, widely recognised in professional environments, is transferable to other contexts such as higher education and training. It reflects a three-dimensional process comprising mental exhaustion, cynicism and a reduced or lost sense of efficacy in one's studies. Although many international studies have investigated ABO, no validated French-language tool is available, which complicates the possibility of conducting systematic studies in French-speaking countries. The objective of this work is to describe the phenomenon from a clinical point of view (clinical cases), to empirically validate an assessment tool for academic burnout (psychometric validation of the Maslach Burnout Inventory-Student Survey (MBI-SS) on a sample of 667 students), to carry out an epidemiological exploration of ABO and its links with certain psychopathological factors (anxiety, depression) and adaptive processes (emotional regulation, mental flexibility) in a sample of 2,260 students from French-speaking areas, and finally to propose a treatment of ABO based on Cognitive and Behavioural Therapies. These studies are discussed in order to highlight the particularity of ABO in this population and the characteristics of one ABO dimension, emotional exhaustion, as a transdiagnostic process.
12

Abidelah, Anis. "Analyse numérique du comportement d'assemblages métalliques. Approche numérique et validation expérimentale." PhD thesis, Université Blaise Pascal - Clermont-Ferrand II, 2009. http://tel.archives-ouvertes.fr/tel-00725229.

Abstract:
The work presented in this thesis concerns the behaviour of bolted steel end-plate connections. The first part describes two series of tests on steel connections with bolted end-plates, with and without a stiffener. The second part is devoted to the development of a nonlinear 3D numerical model validated against the test results; the results are also compared with the analytical formulations of Eurocode 3. The model made it possible to analyse the evolution of the stress fields and of plasticity in different zones of the connection, and confirmed the experimental observations concerning the influence of the end-plate stiffener on the force-transfer mechanism through its end. In a third part, the influence of bolt bending on the behaviour of T-stubs is modelled. A parametric study evaluates the effects of parameters such as the T-stub dimensions, the washer stiffness, the bolt position and the flange thickness on the bending of a bolt subjected to an axial tensile force.
13

Walia, Gursimran Singh. "Empirical Validation of Requirement Error Abstraction and Classification: A Multidisciplinary Approach." MSSTATE, 2006. http://sun.library.msstate.edu/ETD-db/theses/available/etd-05152006-151903/.

Abstract:
Software quality and reliability is a primary concern for successful development organizations. Over the years, researchers have focused on monitoring and controlling quality throughout the software process by helping developers to detect as many faults as possible using different fault-based techniques. This thesis analyzed the software quality problem from a different perspective by taking a step back from faults to abstract the fundamental causes of faults. The first step in this direction is developing a process of abstracting errors from faults throughout the software process. I have described the error abstraction process (EAP) and used it to develop an error taxonomy for the requirements stage. This thesis presents the results of a study which uses techniques based on the error abstraction process and investigates its application to requirements documents. The initial results show promise and provide some useful insights. These results are important for our further investigation.
14

Tomiyama, Hiroyuki, Shin-ichiro Chikada, Shinya Honda, and Hiroyuki Takada. "An RTOS-based approach to design and validation of embedded systems." IEEE, 2005. http://hdl.handle.net/2237/6840.

15

Engsner, Hampus. "A PIT-Based Approach to Validation of Electricity Spot Price Models." Thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-172996.

Abstract:
The modeling of electricity spot prices is still in its early stages, with various competing models being proposed by different researchers. This makes model evaluation and comparison an important research area, for practitioners and researchers alike. However, there is a distinct lack of consensus in the literature regarding tools to assess model validity, with different researchers using methods of varying suitability for validation. In this thesis the current landscape of electricity spot price models, and how they are currently evaluated, is mapped out. Then, as the main contribution this research aims to make, a general and flexible framework for model validation is proposed, based on the Probability Integral Transform (PIT). The probability integral transform, which can be seen as a generalization of analyzing residuals in simple time series and regression models, transforms the realizations of a time series into independent and identically distributed U(0,1) variables using the conditional distributions of the time series. Testing model validity is with this method reduced to testing whether the PIT values are independent and identically distributed U(0,1) variables. The thesis concludes by testing spot price models of varying validity (according to previous research) against actual spot price data using this framework. These empirical tests suggest that PIT-based model testing does indeed point us toward the more suitable models, with especially unsuitable models being rejected by a large margin.
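
A minimal sketch of the PIT framework (a synthetic Gaussian AR(1) series and candidate models are assumed here, not the thesis's spot-price models): each observation is pushed through the candidate model's conditional CDF, and model validity testing reduces to testing the transformed values for the U(0,1) distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1000
x = np.zeros(n)
for t in range(1, n):                       # data-generating AR(1)
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()

def pit_values(series, phi, sigma):
    # candidate model: x_t | x_{t-1} ~ N(phi * x_{t-1}, sigma^2)
    return stats.norm.cdf(series[1:], loc=phi * series[:-1], scale=sigma)

u_good = pit_values(x, 0.7, 1.0)            # correctly specified model
u_bad = pit_values(x, 0.0, 1.0)             # ignores the autocorrelation

print("correct model KS p-value:", stats.kstest(u_good, "uniform").pvalue)
print("wrong model   KS p-value:", stats.kstest(u_bad, "uniform").pvalue)
# Uniformity is only half the check: independence of the PIT sequence
# should also be examined, e.g. via autocorrelations of the PIT values.
```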
16

Yoo, Min-Jung. "Une approche componentielle pour la modélisation d'agents coopératifs et leur validation." Paris 6, 1999. http://www.theses.fr/1999PA066652.

17

Neloy, Md Naim Ud Dwla. "Validation of theoretical approach to measure biodiversity using plant species data." Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-19431.

Abstract:
Measuring biodiversity is important for serving our ecology well and keeping the environment sound. Biodiversity refers to the variety of life at different levels, such as an ecosystem, the life forms on a site, or a landscape. It is measured as a combination of species richness and evenness, and separate formulas, indices and equations are widely used to measure it at each level. The Swedish Environmental Protection Agency aimed to establish an index consisting of landscape functionality and landscape heterogeneity. For the assessment of landscape functionality, the Biotope Biodiversity Capacity Index (BBCI) is to be used; a high BBCI indicates high biodiversity for each biotope. However, how well empirically estimated species richness matches the BBCI has not been evaluated. The aim of this paper is to examine the relationship between empirically estimated biodiversity and the BBCI. The relationship between the Shannon diversity index and the BBCI was also analysed. Empirical data for 15 selected landscapes were collected using Artportalen.se and sorted for further calculation. The results showed a strong positive relationship between empirically estimated biodiversity and the BBCI, and the Shannon diversity index and the BBCI were also positively correlated. The BBCI could explain 60%-69% of the species richness data and 17%-22% of the Shannon diversity index. This indicates acceptance of the theoretical approach to measuring biodiversity.
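
For reference, the two empirical quantities the study compares against the BBCI can be computed in a few lines; the abundance data below are made up for illustration.

```python
import math

abundances = {"oak": 40, "birch": 25, "pine": 20, "aspen": 10, "rowan": 5}

richness = len(abundances)          # number of species observed
total = sum(abundances.values())
# Shannon diversity index: H' = -sum(p_i * ln(p_i))
shannon = -sum((n / total) * math.log(n / total) for n in abundances.values())

print("species richness:", richness)       # 5
print("Shannon H':", round(shannon, 3))    # about 1.41
```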
18

Guibert, Nicolas. "Validation d'une approche basée sur l'exemple pour l'initiation à la programmation." Poitiers, 2006. http://www.theses.fr/2006POIT2341.

Abstract:
Although computers and programs have become established in many scientific disciplines as analysis or measurement tools, the skills required to design programs are not easily acquired. Many studies have characterised the errors and difficulties encountered by novice programmers. The environments currently used for learning programming are tools built purely from a development perspective, not within an explicitly pedagogical framework. This "industrial" approach contrasts with a genuinely pedagogical approach, in which the primary goal is the discovery and construction of knowledge rather than the completion of technical tasks. This thesis investigates the use of an alternative programming paradigm, graphical programming by example, to support the student's active construction of viable knowledge, drawing on experimental studies in real settings with an adapted environment designed explicitly for learning.
19

Vachon, Éric. "Une nouvelle approche de la validation de requête vidéo par l'utilisateur." Paris 6, 2003. http://www.theses.fr/2003PA066329.

20

Pediaditakis, Michael. "Presenting multi-language XML documents : an adaptive transformation and validation approach." Thesis, University of Kent, 2006. https://kar.kent.ac.uk/24021/.

21

Li, Zheng. "A pattern-based approach to the specification and validation of web services interactions." Swinburne University of Technology, 2007. http://adt.lib.swin.edu.au./public/adt-VSWT20070618.115228.

Abstract:
Web services are designed for composition and use by third parties through dynamic discovery. As such, the issue of interoperability between services is of great importance to ensure that the services can work together towards the overall application goals. In particular, the interaction protocols of a service need to be implemented and used properly so that the service composition can conduct itself in an orderly fashion. There have been significant research efforts in providing rich descriptions for Web services, which include their behaviour properties. When describing the interaction process/protocols of a service, most of these adopt a procedural or programming-style approach. We argue that this style of description for service interactions is not natural to publishing service behaviour properties from the viewpoint of facilitating third-party service composition and analysis. Especially when dealing with services with diverse behaviour, the limits of these procedural approaches become apparent. In this thesis, we introduce a lightweight, pattern/constraint-based declarative approach that better supports the specification and use of service interaction properties in the service description and composition process. This approach uses patterns to describe the interaction behaviour of a service as a set of constraints. As such, it supports the incremental description of a service's interaction behaviour from the service developer's perspective, and the easy understanding and analysis of the interaction properties from the service user's perspective. It has been incorporated into OWL-S for service developers to describe service interaction constraints. We also present a framework and the related tool support for monitoring and checking the conformance of the service's runtime interactions against its specified interaction properties, to test whether the service is used properly and whether the service fulfils its behavioural obligations. The tool involves interception of service interactions/messages, representation of interaction constraints using finite state automata and finite state machines, and conformance checking of service interactions against interaction constraints. As such, we provide a useful tool for validating the implementation and use of services regarding their interaction behaviour.
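
A minimal sketch of the conformance-checking idea (the constraint and message names are hypothetical, not taken from the thesis's tool): an interaction constraint such as "every reservation must be acknowledged before payment" is encoded as a finite state machine and run over an intercepted message trace.

```python
ACCEPTING = {"idle", "done"}
TRANSITIONS = {
    ("idle", "reserve"): "awaiting_ack",
    ("awaiting_ack", "ack"): "reserved",
    ("reserved", "pay"): "done",
}

def conforms(trace):
    """Replay a message trace against the constraint automaton."""
    state = "idle"
    for msg in trace:
        key = (state, msg)
        if key not in TRANSITIONS:
            return False, f"unexpected '{msg}' in state '{state}'"
        state = TRANSITIONS[key]
    return state in ACCEPTING, state

print(conforms(["reserve", "ack", "pay"]))  # (True, 'done')
print(conforms(["reserve", "pay"]))         # violation: pay before ack
```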
22

Paisant, Jean-Francois. "Modélisation numérique et validation expérimentale de l'hydrodynamique d'une émulsion dans une colonne d'extraction." Thesis, Paris 6, 2014. http://www.theses.fr/2014PA066453/document.

Abstract:
At the core of spent-fuel reprocessing operations, pulsed columns with packing are the liquid-liquid extraction apparatuses mainly used. In a context of economic competitiveness and scarce resources, industry is driven to improve the efficiency of these processes. Pulsed-column efficiency is tied to the amount of available exchange surface, which depends on the geometrical parameters of the column and on the operating conditions; a better design would improve efficiency, and to this end knowledge of the interphase slip velocity is necessary. The work presented in this thesis revolves around the physical and numerical modelling of the hydrodynamics of the emulsion and its experimental characterisation. A Eulerian approach based on the work of D. Lhuillier yields a two-fluid model coupled with an evolution equation for the exchange surface (interfacial area). The model is solved with the finite element method using the CAST3M software. Numerical simulations have shown the model's ability to reproduce the emulsion behaviour correctly and to provide the slip velocities. To validate the model experimentally, experiments on two installations were carried out, notably coupling particle image velocimetry with laser-induced fluorescence to obtain the velocities of each phase and the dispersed-phase volume fraction (hold-up). A droplet detection and tracking algorithm was developed to obtain the dispersed-phase velocity and hold-up. Comparing these measurements with the numerical results provides an encouraging first qualification of the model.
23

Belt, P. (Pekka). "Improving verification and validation activities in ICT companies—product development management approach." Doctoral thesis, University of Oulu, 2009. http://urn.fi/urn:isbn:9789514291487.

Abstract:
The main motive for this research arises from the fact that research on verification and validation (V&V) activities from the management viewpoint has been scarce, even though V&V has been covered from the technical viewpoint. There was a clear need to study the management aspects due to the development of the information and communications technology (ICT) sector and the increased significance of V&V activities. ICT has developed into a turbulent, high clock-speed sector, and the importance of V&V activities has increased significantly. As a consequence, companies in the ICT sector require ideas for improving their verification and validation activities from the product development management viewpoint. This study approaches the above-mentioned goal from four perspectives: current V&V management challenges, organisational and V&V maturities, benchmarking another sector, and uncertainty during new product development (NPD). This dissertation is qualitative in nature and is based on interviews with experienced industrial managers, reflecting their views against the scientific literature; the researcher has analysed the obtained material and drawn conclusions. The main implications of this doctoral dissertation can be summarised as a need to overcome the current tendency to organise through functional silos, and the low maturity of V&V activities. Verification and validation activities should be viewed and managed over the entire NPD process, which requires new means of cross-functional integration. The maturity of the overall management system needs to be adequate to enable higher efficiency and effectiveness of V&V activities. There are pressures to shift the emphasis of V&V to early NPD and simultaneously delay decision-making in NPD projects to a stage where enough information is available. Understanding-enhancing V&V methods are a potential way to advance towards these goals.
24

Yeoh, Terence Eng Siong. "Validation of the Facet Satisfaction Scale (FSS): An Evaluative Approach to Assessing Facet Job Satisfaction." Thesis, University of North Texas, 2011. https://digital.library.unt.edu/ark:/67531/metadc103414/.

Abstract:
Job satisfaction has been, and continues to be, an important construct of interest to researchers and practitioners alike. However, conflicting operational definitions and inconsistent measurement systems have reduced the efficacy of the construct in predicting important job-related outcomes for organizations and their employees. The Facet Satisfaction Scale (FSS) was designed to overcome these deficiencies by creating a facet-based measure that assesses job satisfaction in accordance with recent definitions of the construct. Reliability and validity analyses were conducted on both the complete and shortened versions of the scale. The FSS exhibited evidence of reliability (coefficients ranging from .52 to .93 for the shortened FSS, and from .53 to .96 for the complete FSS). Evidence of scale validity was also obtained through the use of construct, content, and criterion-related validity measures. Implications of the study for future research on job satisfaction are discussed.
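
As an illustration of the kind of internal-consistency statistic behind coefficient ranges like those above, here is a minimal sketch of Cronbach's alpha on synthetic ratings (the data, item count and noise level are assumptions; the thesis's own computations are not reproduced here).

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))                  # shared satisfaction factor
scores = latent + 0.8 * rng.normal(size=(200, 4))   # four noisy facet items
print("alpha:", round(cronbach_alpha(scores), 2))   # about .86 in expectation
```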
25

Zhang, Zhiqi. "Blood-Brain Barrier in vitro Model: A Tissue Engineering Approach and Validation." FIU Digital Commons, 2010. http://digitalcommons.fiu.edu/etd/246.

Abstract:
This dissertation evaluated the feasibility of using commercially available immortalized cell lines in building a tissue-engineered in vitro blood-brain barrier (BBB) co-culture model for preliminary drug development studies. A mouse endothelial cell line and a rat astrocyte cell line purchased from the American Type Culture Collection (ATCC) were the building blocks of the co-culture model. An astrocyte-derived acellular extracellular matrix (aECM) was introduced in the co-culture model to provide a novel in vitro biomimetic basement membrane for the endothelial cells to form endothelial tight junctions. Trans-endothelial electrical resistance (TEER) and solute mass transport studies were used to quantitatively evaluate tight junction formation in the in vitro BBB models. Immunofluorescence microscopy and Western blot analysis were used to qualitatively verify the in vitro expression of occludin, one of the earliest discovered tight junction proteins. Experimental data from a total of 12 experiments conclusively showed that the novel BBB in vitro co-culture model with the astrocyte-derived aECM (CO+aECM) was promising in terms of establishing tight junction formation, as represented by TEER values, transport profiles and tight junction protein expression, when compared with traditional co-culture (CO) model setups and endothelial cells cultured alone. Experimental data were also found to be comparable with several existing in vitro BBB models built by various methods. In vitro colorimetric sulforhodamine B (SRB) assays revealed that the co-cultured samples with aECM resulted in less cell loss on the basal sides of the insert membranes than the traditional co-culture samples. The novel tissue engineering approach using immortalized cell lines with the addition of aECM was proven to be a relevant alternative to traditional BBB in vitro modeling.
26

Pellegri, Matteo, Andrea Vacca, Ram S. Devendran, Etienne Dautry, and Benjamin Ginsberg. "A Lumped Parameter Approach for GEROTOR Pumps: Model Formulation and Experimental Validation." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-199845.

Abstract:
This paper describes a high fidelity simulation model for GEROTOR pumps. The simulation approach is based on the coupling of different models: a geometric model used to evaluate the instantaneous volumes and flow areas inside the unit, a lumped parameter fluid dynamic model for the evaluation of the displacing action inside the unit, and mechanical models for the evaluation of the internal micro-motions of the rotor axes. This paper particularly details the geometrical approach, which takes into account the actual geometry of the rotors, given as input as CAD files. This model can take into account the actual location of the points of contact between the rotors as well as the actual clearances between them. The potential of the model is shown by considering a particular GEROTOR design. A specific test set-up was developed within this research for the model validation, and comparisons in terms of steady-state pressure versus flow curves and instantaneous pressure ripples are shown for the reference pump.
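
The displacing action in such lumped-parameter models is typically governed by a pressure build-up equation for each tooth-space chamber; the sketch below (toy volume profile and coefficients, not the paper's implementation) shows its core form, dp/dt = (B / V(theta)) * (q_in - q_out - dV/dt).

```python
import math

B = 1.5e9  # fluid bulk modulus [Pa] (assumed value)

def chamber_volume(theta):
    # toy tooth-space volume profile vs shaft angle [m^3]
    return 1e-6 * (1.1 + math.cos(theta))

def dV_dtheta(theta):
    return -1e-6 * math.sin(theta)

def pressure_rate(theta, omega, q_in, q_out):
    """Chamber continuity: dp/dt = (B/V) * (q_in - q_out - dV/dtheta * omega).
    In a full model q_in and q_out would depend on pressure through the
    orifice equations at the ports and leakage gaps."""
    return (B / chamber_volume(theta)) * (q_in - q_out - dV_dtheta(theta) * omega)

omega = 2 * math.pi * 1500 / 60          # shaft speed, 1500 rpm [rad/s]
# a shrinking chamber (theta in (0, pi)) with both ports closed pressurises:
print("%.2e Pa/s" % pressure_rate(1.0, omega, 0.0, 0.0))
```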
27

Bryant, Nathan J. "EXPERIMENTAL VALIDATION OF THE CALPHAD APPROACH APPLIED TO MULTI-PRINCIPLE ELEMENT ALLOYS." Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1433176902.

28

Marche, Mikaël. "Une approche par simulation d'environnement pour la validation de code d'agents mobiles." Besançon, 2002. http://www.theses.fr/2002BESA2057.

Abstract:
This thesis proposes ways to validate a mobile application using techniques that simulate its execution environment, and thereby contributes to the problem of test execution, which is difficult to implement for such systems. In particular, the mobility of the tested entities makes it difficult to maintain contact and observation points on them. To overcome these difficulties, we define an architecture that simulates several configurations of the execution environment of an agent application. The simulation approach places no a priori limits on the definition of test environments: complex and heterogeneous environments can be described without real constraints, where setting up a real test architecture would have required a large instrumentation effort. Around the simulation tool (SAM), we propose means to express the checking of behaviour during execution, by associating with the simulator an execution semantics for active and passive behaviour verification. To this end, we provide a control and observation language (JOR) that expresses the tests to be run in parallel with the application under test. The proposed language and observation semantics are deliberately decoupled from the initial testing goal: the objective is to take advantage of the simulation context to allow any type of activity definable on the tool. The proposed simulation approach is thus resolutely open, allowing both the validation of applications through testing and the design and verification of prototypes.
29

Filomena, Melissa, and Protik Sarkar. "Can dinosaurs generate unicorns? A corporate approach for early stage idea validation." Thesis, KTH, Industriell ekonomi och organisation (Inst.), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-246021.

Abstract:
Companies are always trying to increase their sales and revenue, but nowadays, due to technological advancement, competition and a fast-moving economy, this task becomes more difficult as time goes by. This is where innovation comes in: finding new ways to add value for customers, increasing profit and even finding new potential markets. As part of implementing innovative practices, many companies have added corporate entrepreneurship to their structure to look for new business models that reach diverse customers within the same industry. Getting into the minds of customers, trying to decipher unspoken needs and matching problems to new solutions is part of the insighting process that has to be done when attempting idea incubation. This research seeks to provide a methodology to make the insighting process for early idea validation in a corporate environment less manual and more mechanical. For this purpose, the Stockholm division of Telia Company was used as a case study and the main source of data collection, which brings the research results to practical use and analysis.
30

Mariani, Lucia. "Driving Simulator for HMI development and validation: definition of a dedicated approach and test protocol." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022.

Abstract:
This work is a technical feasibility study for a novel and alternative process in automotive Human Machine Interface (HMI) development and validation. The main goal is to integrate the standard approach for concept design and partially substitute the standard physical approach for validation, both using an integrated and fully controllable virtual workflow provided by driving simulation. This work focuses on an Instrument Panel Cluster (IPC) User Interface (UI), the graphic interface where relevant driving and ADAS information and warnings are shown. I explore the basics of a static driving simulator, from its refurbishment to full application and usage, in order to simulate specific scenarios. The focus is on simulations that cannot be recreated in real life for physical testing because they are too dangerous, such as FCW/AEB (Forward Collision Warning / Autonomous Emergency Braking) warning and mitigation validation. Before this activity, simulation had already been used to support the development of HMI concepts, but only as multidisciplinary internal cooperation with few standards available, not as a truly integrated and autonomous workflow. Given this background, and since Maserati decided to invest in a dedicated environment, the first step of the activity focused on giving a structure to the functional part, and the second step concentrated on the operational part linked to the creation of validation test protocols. All the activities related to this study were done entirely at the Maserati Innovation Lab headquarters during my Master's internship in Modena.
31

Baron, Mickaël. "Vers une approche sûre du développement des Interfaces Homme-Machine." Poitiers, 2003. http://www.theses.fr/2003POIT2323.

Abstract:
Human-computer interfaces (HCI) are an essential part of most computing systems. Specification, development, verification and validation models are becoming necessary to ensure that a system meets the HCI's properties. Nowadays, properties can be checked following two approaches: one based on formal development and the other based on tools. In spite of great progress, neither has yet prevailed. In this context, we propose two new approaches allowing safe HCI development, both based on a single formal method (the B method). The first approach, based on formal development, permits the integration of heterogeneous HCI techniques in order to formally express, check and validate interactive-system properties. The second, based on a tool (SUIDT), interactively builds the dialogue between a formally developed functional core and a graphical presentation; this latter approach ensures properties expressed both in the functional core and by the user's needs.
32

Khalgui, Mohamed. "Validation temporelle et déploiement d'une application de contrôle industrielle à base de composants." Thesis, Vandoeuvre-les-Nancy, INPL, 2007. http://www.theses.fr/2007INPL009N/document.

Abstract:
This thesis deals with the temporal validation and the deployment of component-based industrial control applications. We are interested in the Function Blocks approach, defined in the IEC 61499 standard, as a component technology well known in industry. A Function Block is an event-triggered component owning data to support the application's functionalities. The advantage of this standard is its static description of both the application and its execution support. The first contribution of the thesis is an interpretation of the different concepts defined in the standard; in particular, we propose a policy defining a deterministic behaviour of a Function Block. To apply an exhaustive temporal validation of the application, we propose a behavioural model of a block as timed automata, and we provide a semantics for the concept of Function Block networks, used to describe an application as a composition of blocks. The second contribution deals with the deployment of such networks on a distributed multi-tasking architecture while respecting end-to-end response-time bounds as temporal constraints. We transform a network of Function Blocks into a set of dependent elementary tasks, called actions; this transformation allows classical real-time scheduling results to be exploited to validate the temporal correctness of the application. To deploy the blocks of an application into feasible OS tasks, we propose a hybrid scheduling approach combining off-line non-preemptive scheduling with on-line preemptive scheduling. The off-line scheduling constructs the tasks executing on each processor; these tasks, viewed as static sequences of actions, are then scheduled dynamically according to the preemptive EDF (Earliest Deadline First) policy. Thanks to this approach, we reduce context switching at run-time by grouping actions within tasks, while the on-line preemptive policy increases the feasibility of the system. Finally, the last contribution extends the second: we propose an approach, based on a list heuristic, for allocating networks of Function Blocks on a distributed execution support, using the hybrid method to ensure a feasible deployment. The allocation problem is to find, for each Function Block, a processor able to execute it while respecting functional, temporal and execution-support constraints; the heuristic uses backtracking to enlarge the solution space.
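
A minimal sketch of the on-line half of the hybrid approach (a toy job set, not the thesis's implementation): preemptive EDF dispatching, where at each time unit the ready job with the earliest absolute deadline runs, preempting later-deadline work.

```python
import heapq

# (release_time, execution_time, absolute_deadline, name)
jobs = [(0, 3, 7, "seq_A"), (1, 2, 4, "seq_B"), (2, 1, 9, "seq_C")]

time, ready, schedule = 0, [], []
pending = sorted(jobs)                       # ordered by release time
while pending or ready:
    while pending and pending[0][0] <= time:
        _, c, d, name = pending.pop(0)
        heapq.heappush(ready, (d, name, c))  # min-heap keyed on deadline
    if not ready:
        time = pending[0][0]                 # idle until next release
        continue
    d, name, c = heapq.heappop(ready)
    schedule.append((time, name))            # run earliest-deadline job 1 unit
    if c > 1:
        heapq.heappush(ready, (d, name, c - 1))
    time += 1

print(schedule)
# seq_A starts, is preempted by seq_B (earlier deadline), then resumes.
```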
APA, Harvard, Vancouver, ISO, and other styles
33

Barbu, Andreea. "Developing mobile agents through a formal approach." Paris 12, 2005. https://athena.u-pec.fr/primo-explore/search?query=any,exact,990002282380204611&vid=upec.

Full text
Abstract:
This thesis deals with the modelling and validation of mobile agent systems. The development of a support structure for mobile agents demands solutions for a set of specific problems that arise from mobility. A basic question in software development is whether the proposed program really is a solution to the considered problem. One way to answer this question is through the use of formal methods. In our approach, the first step is to build a model of the solution (the specification) using the higher-order Pi-calculus. With this formal model as a base, we can verify that the model possesses the required properties, validate the model through simulations, and prove that the implementation is correct with respect to the specification. Making use of these results, we have implemented a prototype, HOPiTool, which allows the validation of mobile agent systems conceived with the higher-order Pi-calculus.
APA, Harvard, Vancouver, ISO, and other styles
34

Li, Zheng. "A pattern-based approach to the specification and validation of web services interactions." Australasian Digital Thesis Program, 2007. http://adt.lib.swin.edu.au/public/adt-VSWT20070618.115228/index.html.

Full text
Abstract:
Thesis (MSc) - Swinburne University of Technology, Faculty of Information & Communication Technologies, 2006.
A thesis submitted to the Faculty of Information and Communication Technologies, Swinburne University of Technology, for the degree of Master of Science by Research, 2007. Typescript. Bibliography: p. 107-112.
APA, Harvard, Vancouver, ISO, and other styles
35

Usher, John S. "Subjective evaluation and electroacoustic theoretical validation of a new approach to audio upmixing." Thesis, McGill University, 2006. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102741.

Full text
Abstract:
Audio signal processing systems for converting two-channel (stereo) recordings to four or five channels are increasingly relevant. These audio upmixers can be used with conventional stereo sound recordings and reproduced with multichannel home theatre or automotive loudspeaker audio systems to create a more engaging and natural-sounding listening experience. This dissertation discusses existing approaches to audio upmixing for recordings of musical performances and presents specific design criteria for a system to enhance spatial sound quality. A new upmixing system is proposed and evaluated according to these criteria and a theoretical model for its behavior is validated using empirical measurements.
The new system removes short-term correlated components from two electronic audio signals using a pair of adaptive filters, updated according to a frequency-domain implementation of the normalized least-mean-square (NLMS) algorithm. The major difference between the new system and all extant audio upmixers is that unsupervised time-alignment of the input signals (typically by up to +/-10 ms) as a function of frequency (typically using a 1024-band equalizer) is accomplished thanks to the non-minimum-phase adaptive filter. Two new signals are created from the weighted difference of the inputs, and are then radiated with two loudspeakers behind the listener. According to the consensus in the literature on the effect of interaural correlation on auditory image formation, the self-orthogonalizing properties of the algorithm ensure minimal distortion of the frontal source imagery and natural-sounding, enveloping reverberance (ambiance) imagery.
Performance evaluation of the new upmix system was accomplished in two ways: Firstly, using empirical electroacoustic measurements which validate a theoretical model of the system; and secondly, with formal listening tests which investigated auditory spatial imagery with a graphical mapping tool and a preference experiment. Both electroacoustic and subjective methods investigated system performance with a variety of test stimuli for solo musical performances reproduced using a loudspeaker in an orchestral concert-hall and recorded using different microphone techniques.
The objective and subjective evaluations combined with a comparative study with two commercial systems demonstrate that the proposed system provides a new, computationally practical, high sound quality solution to upmixing.
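The dissertation's system uses a frequency-domain NLMS implementation; the sketch below shows only the core time-domain NLMS update that strips the short-term correlated component shared by two signals, leaving the decorrelated residue. All signals and parameters here are invented for illustration.

```python
# Minimal time-domain NLMS sketch: adapt w so that w*x tracks d;
# the error e is the decorrelated ("ambience") residue.
import numpy as np

def nlms(x, d, order=32, mu=0.5, eps=1e-8):
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]                  # most recent samples first
        y = w @ u                                 # filter output (estimate of d[n])
        e[n] = d[n] - y                           # residual = uncorrelated part
        w += (mu / (eps + u @ u)) * e[n] * u      # normalized LMS update
    return e

rng = np.random.default_rng(0)
common = rng.standard_normal(4000)                # shared (direct-sound) component
left = common + 0.1 * rng.standard_normal(4000)
right = np.roll(common, 5) + 0.1 * rng.standard_normal(4000)  # delayed copy
ambience = nlms(left, right)                      # removes the shared component
print(float(np.var(ambience[64:]) / np.var(right[64:])))      # residual energy ratio
```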
APA, Harvard, Vancouver, ISO, and other styles
36

Mac, Garrigle Ellen F. "A validation of the enterprise management engineering approach to knowledge management systems engineering." Thesis, The George Washington University, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3614805.

Full text
Abstract:

Knowledge management is one of the current "buzzwords" gaining popularity on an almost-daily basis within the business world. Much attention has been paid to the theory and justification of knowledge management (KM) as an effective business and organizational practice. However, much less attention has been paid to the more specific issues of effective implementation of knowledge management, or to the potential financial benefit or payoff that could potentially result from an effective system implementation. As the concept of KM becomes more generally accepted, knowledge management systems (KMS) are becoming more prevalent. A KMS is often considered simply another information system to be designed, built, and supported by the IT department. In actual implementation, many KM system development efforts are not successful. There is frequently a perception that strict adherence to development processes produces an excessive time lag, rigor, and formality which will "disrupt" the desired free flow of knowledge. Professor Michael Stankosky of GWU has posited a more flexible variation of the usual systems engineering (SE) approach, tailored specifically to the KM domain and known as Enterprise Management Engineering© (EME). This approach takes the four major pillars of KM as identified by GWU research in this area—Leadership, Organization, Technology, and Learning—and adapts eighteen key SE steps to accommodate the more flexible and imprecise nature of "knowledge".

Anecdotal study of successful KMS developments has shown that many of the more formal processes imposed by systems engineering (such as defining strategic objectives before beginning system development) serve a useful purpose. Consequently, an integrated systems engineering process tailored specifically to the KM domain should lead to more successful implementations of KM systems. If this is so, organizations that have followed some or all of the steps in this process will have designed and deployed more "successful" KMS than those organizations that have not done so. To support and refine this approach, a survey was developed to determine the usage of the 18 steps identified in EME. These results were then analyzed against an objective financial measurement of organizational KM to determine whether a correlation exists. This study is intended to test the efficacy of the EME approach to KM implementation.

For the financial measurement data, the subject list of organizations for this study used a measure of intangible valuation developed by Professor Baruch Lev of NYU called Knowledge Capital Earnings© (KCE). This is the amount of earnings that a company with good "knowledge" has left over once its earnings based on tangible financial and physical assets have been subtracted from overall earnings. KCE can then be used to determine the Knowledge Capital (KC) of an organization. This in turn provides two quantitative measures (one relative, one absolute) that can be used to define a successful knowledge company.
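A hedged sketch of the arithmetic just described: subtract the expected earnings on tangible (physical and financial) assets from overall earnings, then capitalize the residual. The rates of return below are illustrative placeholders, not the parameters used in the study.

```python
# Illustrative Knowledge Capital arithmetic in the spirit of Lev's KCE;
# all rates and figures are invented for the example.
def knowledge_capital(earnings, physical_assets, financial_assets,
                      r_physical=0.07, r_financial=0.045, r_knowledge=0.105):
    kce = (earnings
           - r_physical * physical_assets      # expected return on physical assets
           - r_financial * financial_assets)   # expected return on financial assets
    return kce / r_knowledge                   # capitalize the residual earnings

print(knowledge_capital(earnings=500.0, physical_assets=2000.0,
                        financial_assets=1500.0))  # -> ~2785.7
```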

For this study, Lev's research from 2001 was updated, using more recent financial data. Several of these organizations completed a survey instrument based upon the 18 points of the EME approach. The results for the 18 steps were compared against each other and against each organization's KC scores. The results show that there is a significant correlation between EME and the relative KC measurement, and select EME steps do correlate significantly with a high KC value. Although this study, being the first validation effort, does not show provable causation, it does demonstrate a quantifiable correlation and association between EME and successful KM implementation. This in turn should contribute to the slim body of objective knowledge on the design, deployment, and measurement of KM systems.

APA, Harvard, Vancouver, ISO, and other styles
37

Sheard, Michael. "A construct validation approach to mental toughness in sport : a positive psychological perspective." Thesis, Teesside University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.425977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Mouawia, Hussein. "Validation clinique d'une nouvelle approche "ISET" du diagnostic prénatal non invasif d'Amyotrophie spinale." Paris 5, 2008. http://www.theses.fr/2008PA05T006.

Full text
Abstract:
Our team has developed ISET (Isolation by Size of Epithelial Tumor/Trophoblastic cells), a promising non-invasive approach for the prenatal diagnosis of spinal muscular atrophy (SMA). A prospective clinical validation study of the ISET method for the prenatal diagnosis of SMA was carried out at the Necker-Enfants Malades Hospital in Paris, following a protocol defined by the site's statisticians and methodologists and performed fully blinded with respect to the invasive method carried out in parallel. It targeted 160 genetic diagnoses of SMA through the study of 160 fetal cells isolated from the blood (20 ml, at 10-11 weeks of amenorrhea) of 16 mothers at risk (1/4) of having an affected child. The results validate the ISET method for the prenatal diagnosis of SMA, constitute the first complete validation of a fully non-invasive method of prenatal diagnosis of a genetic disease, and should have implications for the implementation of a safe prenatal diagnosis of this disease in clinical practice.
APA, Harvard, Vancouver, ISO, and other styles
39

Gurav, Hardik. "Experimental Validation of the Global Transmissibility (Direct Method) Approach to Transfer Path Analysis." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1563273082454307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Fonseca, Pedro Nicolau Faria Da. "Modélisation et validation des algorithmes non-déterministes de synchronisation des horloges." Vandoeuvre-les-Nancy, INPL, 1999. http://docnum.univ-lorraine.fr/public/INPL_T_1999_FARIA_DA_FONSECA_P_N.pdf.

Full text
Abstract:
This thesis addresses the problem of analyzing and designing non-deterministic clock synchronization algorithms in distributed systems. Non-deterministic algorithms are a promising solution to the clock synchronization problem, as witnessed by the attention they have received in recent years. They use statistical or probabilistic techniques and achieve a better precision than deterministic algorithms; the price to pay is a small probability that the system will fail to synchronize to the desired precision. This probability of failure tends to zero as the number of messages exchanged grows, and it can be made as small as desired by sending a sufficiently large number of messages. Unfortunately, assessing the different algorithms proposed in the literature is difficult, mainly because a common ground for comparison is lacking. We propose an analytical model of the behaviour of non-deterministic clock synchronization algorithms. The aim is an expression that the distributed-systems designer can use to compute the number of messages a given algorithm requires in order to synchronize with the specified precision and probability of success. The result is a Sufficient Condition for Synchronization, which guarantees the success of non-deterministic synchronization under the specified conditions. This condition is stated in terms of local parameters of a site and easily computed system parameters, such as the number of sites and the parameters describing the message delay as a random variable. The assumptions underlying the model are verified experimentally on a test-bed we developed, based on the Controller Area Network (CAN); the experiments validate the hypotheses associated with the model.
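As a minimal illustration of the kind of message-count bound such a model yields (not the thesis's Sufficient Condition itself): if a single message exchange reads the remote clock to the target precision with probability p, then n independent attempts all fail with probability (1 - p)^n, and the designer can size n against a failure budget.

```python
# Illustrative sizing of the number of synchronization messages;
# p_success and the failure budget are invented example values.
import math

def messages_needed(p_success, max_failure_prob):
    """Smallest n with (1 - p_success)**n <= max_failure_prob."""
    return math.ceil(math.log(max_failure_prob) / math.log(1.0 - p_success))

print(messages_needed(p_success=0.2, max_failure_prob=1e-6))  # -> 62
```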
APA, Harvard, Vancouver, ISO, and other styles
41

Abdollahzadeh, Ali Akbar. "Validation de données par équilibrage de bilans : synthèse et nouvelles approches." Vandoeuvre-les-Nancy, INPL, 1997. http://www.theses.fr/1997INPL118N.

Full text
Abstract:
Operating and monitoring a process requires reliable data that pertinently represent its operating state. Unfortunately, the set of measurements collected on a process is not an exact representation of its operation, because measurements are subject to errors of various kinds. It is therefore necessary to test the consistency of the acquired measurements before using them. Data validation by balancing material or energy balances is one of the procedures available for detecting inconsistent measurements, locating faults, estimating the true values and deducing the values of unmeasured quantities. In the first chapter of this dissertation, we briefly present the theory of data reconciliation based on the maximum-likelihood method, then the global resolution of the estimation problem and a resolution using penalty functions. Part of this chapter treats incompletely measured systems through the concepts of observability and redundancy; fault detection and isolation algorithms based on redundancy equations and corrective terms are also covered. The second chapter extends the estimation problem to static non-linear systems and to dynamic systems, and presents a new measurement-reconciliation technique based on inequality constraints characterizing the domain in which the probable state of the system lies. The third chapter studies the applicability of the data-validation method to the monitoring of a drinking-water distribution network. The essential objective of this monitoring is to detect and locate faulty sensors, or to detect events occurring on the network (leaks, pipe breaks), from the analysis of the measured quantities (closure residuals of the balance equations) and of the estimated quantities (normalized corrective terms).
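A minimal sketch of the weighted least-squares reconciliation that underlies this kind of balance-equation data validation, assuming a linear balance A x = 0 and a diagonal variance matrix V; the closed form is the standard projection x̂ = x − V Aᵀ(A V Aᵀ)⁻¹ A x. The single-node splitter example is invented for illustration.

```python
# Linear data reconciliation: minimize (xh - x)' V^-1 (xh - x) s.t. A xh = 0.
import numpy as np

def reconcile(x, V, A):
    """Closed-form projection of measurements x onto the balance A xh = 0."""
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x)
    return x - correction

A = np.array([[1.0, -1.0, -1.0]])   # node balance: x1 = x2 + x3
x = np.array([10.3, 6.1, 4.4])      # raw flow measurements (inconsistent: residual -0.2)
V = np.diag([0.2, 0.1, 0.1])        # measurement variances
print(reconcile(x, V, A))           # -> [10.4, 6.05, 4.35], which balances exactly
```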
APA, Harvard, Vancouver, ISO, and other styles
42

Taher, Akar. "Approche coopérative et non supervisée de partitionnement d’images hyperspectrales pour l’aide à la décision." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S094/document.

Full text
Abstract:
Hyperspectral and, more generally, multi-component images are complex images which cannot be successfully partitioned using a single classification method. Existing non-cooperative classification methods, parametric or non-parametric, fall into three categories: supervised, semi-supervised and unsupervised. Supervised parametric methods require a priori information and hypotheses on the distribution of the data to be partitioned; semi-supervised methods require limited a priori knowledge (e.g. the number of classes and/or of iterations), while unsupervised non-parametric methods require none. In this thesis, an unsupervised, cooperative and adaptive partitioning system for hyperspectral images is developed. Its originality relies on (i) the characterization of pixels according to the textured or non-textured nature of regions, (ii) several levels of evaluation and validation of intermediate results, and (iii) the absence of any required a priori information. The system is composed of four modules. The first module partitions the image into textured and non-textured regions and characterizes pixels accordingly: texture features for pixels in textured regions, and the local mean for pixels in non-textured regions. The second module runs two classifiers cooperatively and in parallel on each component, Fuzzy C-Means (FCM) and the Adaptive Incremental Linde-Buzo-Gray algorithm (AILBG); to make these algorithms unsupervised, the number of classes is estimated with a criterion based on the weighted average dispersion of the classes. The third module evaluates and manages, on two levels, the conflicts between the classification results of the two optimized classifiers: the first level identifies the pixels assigned to the same class by both algorithms and reports them directly into the final result of a component; the second level uses a genetic algorithm (GA) to resolve the conflicts between the remaining pixels. The fourth module is dedicated to multi-component images: the first three modules are applied to each component independently, adjacent components with highly similar classification results are grouped into subsets whose results are fused by the same GA, and the final partitioning is obtained after evaluation and fusion of the subset results, again by the GA. The system is successfully tested on a large database of synthetic images (mono- and multi-component) and on two real applications: the classification of invasive plants and the detection of pine trees.
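A minimal sketch of the first conflict-evaluation level described above, with two invented label maps standing in for the FCM and AILBG outputs: consensus pixels are accepted directly into the final result, and the remaining conflicting pixels are flagged for the genetic-algorithm stage.

```python
# Level-1 consensus between two classifiers (illustrative label maps).
import numpy as np

labels_fcm = np.array([[0, 0, 1],
                       [2, 1, 1],
                       [2, 2, 1]])
labels_lbg = np.array([[0, 1, 1],
                       [2, 1, 0],
                       [2, 2, 1]])

agreed = labels_fcm == labels_lbg          # pixels classified identically
final = np.where(agreed, labels_fcm, -1)   # -1 marks pixels left to the GA stage
print(final)
```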
APA, Harvard, Vancouver, ISO, and other styles
43

Komari, Prabanjan. "A Novel Simulation Based Approach for Trace Signal Selection in Silicon Debug." University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1468512478.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Falcone, Jessica Dominique. "Validation of high density electrode arrays for cochlear implants: a computational and structural approach." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39563.

Full text
Abstract:
Creating high-resolution, or high-density, electrode arrays may be the key to improving cochlear implant users' speech perception in noise, comprehension of lexical languages, and music appreciation. Contemporary electrode arrays use multipolar stimulation techniques such as current steering (shifting the spread of neural excitation in between two physical electrodes) and current focusing (narrowing of the neural spread of excitation) to increase resolution and more specifically target the neural population. Another approach to increasing resolution incorporates microelectromechanical systems (MEMS) fabrication to create a thin-film microelectrode (TFM) array with a series of high-density electrodes. Validating the benefits of high-density electrode arrays requires a systems-level approach: the hypothesis is tested computationally via cochlea and auditory nerve simulations, while in vitro studies provide structural proof of concept. By employing Rattay's activating function and entering it into Litvak's neural probability model, a first-order estimation model of the auditory nerve's response to electrical stimulation was obtained. Two different stimulation scenarios were evaluated: current steering vs. a high-density electrode, and current focusing of contemporary electrodes vs. current focusing of high-density electrodes. The results revealed that a high-density electrode is more localized than current steering and requires less current. A second-order estimation model was also created in COMSOL, which provided the resulting potential and current flow when the electrodes were electrically stimulated. The structural tests were conducted to provide a proof of concept for the TFM arrays' ability to contour to the shape of the cochlea. The TFM arrays were integrated with a standard insertion platform (IP), and in vitro tests were performed on human cadaver cochleae using the TFM/IP devices. Fluoroscopic images recorded the insertion, and post-analysis 3D CT scans and histology were conducted on the specimens. Only three of the ten implanted TFM/IPs suffered severe delamination. This statistic for scala vestibuli excursion is not an outlier when compared with previous data recorded for contemporary cochlear electrode arrays.
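As a minimal illustration of Rattay's activating function mentioned above (not the thesis's model), the sketch below evaluates the second spatial difference of the extracellular potential of a point-source electrode along a straight fiber; the conductivity, current and geometry are invented example values.

```python
# Activating function ~ d2V/dz2 along a fiber for a point source (illustrative).
import numpy as np

z = np.linspace(-5e-3, 5e-3, 201)        # node positions along the fiber (m)
dz = z[1] - z[0]
dist = np.hypot(z, 1e-3)                 # distance to an electrode 1 mm away
V = 1e-3 / (4 * np.pi * 0.3 * dist)      # V = I/(4*pi*sigma*r), I=1 mA, sigma=0.3 S/m
f = np.diff(V, 2) / dz**2                # second spatial difference of the potential
print(z[1 + f.argmax()], f.max())        # location and size of the strongest
                                         # depolarizing (f > 0) lobe
```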
APA, Harvard, Vancouver, ISO, and other styles
45

Kulkarni, Shashank D. "Development and validation of a Method of Moments approach for modeling planar antenna structures." Worcester, Mass. : Worcester Polytechnic Institute, 2007. http://www.wpi.edu/Pubs/ETD/Available/etd-042007-151741/.

Full text
Abstract:
Dissertation (Ph.D.)--Worcester Polytechnic Institute.
Keywords: patch antennas; volume integral equation (VIE); method of moments (MoM); low order basis functions; convergence. Includes bibliographical references (leaves 169-186).
APA, Harvard, Vancouver, ISO, and other styles
46

Barati, Hossein. "Test-taking strategies and the assessment of reading skills : an approach to construct validation." Thesis, University of Bristol, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.420924.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Kitchaev, Daniil A. "Development and validation of a computational approach to predicting the synthesis of inorganic materials." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/115604.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Materials Science and Engineering, 2018.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 159-191).
The concept of computational materials design envisions the identification of chemistries and structures with desirable properties through first-principles calculations, and the downselection of these candidates to those experimentally accessible using available synthesis methods. While first-principles property screening has become routine, the present lack of a robust method for the identification of synthetically accessible materials is an obstacle to true materials design. In this thesis, I develop a general approach for evaluating synthesizeability, and where possible, identifying synthesis routes towards the realization of target materials. This approach is based on a quasi-thermodynamic analysis of synthesis methods, relying on the assumption that phase selection is guided by transient thermodynamic stability under the conditions relevant to phase formation. By selecting the thermodynamic handles relevant to a growth procedure and evaluating the evolution of thermodynamic boundary conditions throughout the reaction, I identify potential metastable end-products as the set of ground state phases stabilized at various stages of the synthesis. To validate this approach, I derive the quasi-thermodynamic influence of adsorption-controlled finite-size stability and bulk off-stoichiometry on phase selection in the aqueous synthesis of polymorphic FeS2 and MnO2 systems, rationalizing the results of a range of synthesis experiments. To enable this analysis, I develop and benchmark the methodology necessary for the reliable first-principles evaluation of structure-sensitive bulk and interfacial stability in aqueous media. Finally, I describe a manganese oxide oxygen evolution catalyst, whose high activity is controlled by metastable, tetrahedrally-coordinated Mn3+ ions as an example of materials functionality enabled by structural metastability. The framework for the first-principles analysis of synthesis proposed and validated in this thesis lays the groundwork for the development of computational synthesis prediction and holds the potential to greatly accelerate the design and realization of new functional materials.
by Daniil A. Kitchaev.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
48

Chambin, Odile. "Validation d'un modele d'absorption percutanee ex vivo : approche correlative avec des parametres in vivo." Dijon, 1995. http://www.theses.fr/1995DIJOPE02.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Ovidiu, Parvu. "Computational model validation using a novel multiscale multidimensional spatio-temporal meta model checking approach." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/11863.

Full text
Abstract:
Computational models of complex biological systems can provide a better understanding of how living systems function but need to be validated before they are employed for real-life (e.g. clinical) applications. One of the most frequently employed in silico approaches for validating such models is model checking. Traditional model checking approaches are limited to uniscale non-spatial computational models because they do not explicitly distinguish between different scales, and do not take properties of (emergent) spatial structures (e.g. density of multicellular population) into account. This thesis defines a novel multiscale multidimensional spatio-temporal meta model checking methodology which enables validating multiscale (spatial) computational models of biological systems relative to how both numeric (e.g. concentrations) and spatial system properties are expected to change over time and across multiple scales. The methodology has two important advantages. First, it supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to produce them. Secondly, the methodology is generic because it can be automatically reconfigured according to case study specific types of spatial structures and properties using the meta model checking approach. In addition, the methodology could be employed for multiple domains of science, but we illustrate its applicability here only against biological case studies. To automate the computational model validation process, the approach was implemented in software tools, which are made freely available online. Their efficacy is illustrated against two uniscale and four multiscale quantitative computational models: the former encode phase variation in bacterial colonies and the chemotactic aggregation of cells; the latter encode the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle, and the acute inflammation of the gut and lung. This novel model checking approach will enable the efficient construction of reliable multiscale computational models of complex systems.
APA, Harvard, Vancouver, ISO, and other styles
50

Rosinski, Jenny M. "Derivation and validation of alcohol phenotypes in a college population a motivational/developmental approach /." Diss., Columbia, Mo. : University of Missouri-Columbia, 2008. http://hdl.handle.net/10355/5535.

Full text
Abstract:
Thesis (Ph. D.)--University of Missouri-Columbia, 2008.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on July 29, 2009). Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles