Dissertations / Theses on the topic 'Meta-modelling'

Consult the top 50 dissertations / theses for your research on the topic 'Meta-modelling.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Petrov, Ilia. "Meta-data, Meta-Modelling and Query Processing in Meta-data Repository Systems." Aachen: Shaker, 2006. http://d-nb.info/1170532349/34.

Full text
2

Petrov, Ilia P. "Meta-data, Meta-Modelling and Query Processing in Meta-data Repository Systems." Aachen: Shaker, 2006. http://nbn-resolving.de/urn:nbn:de:101:1-2018110406170194301809.

Full text
3

Bashar, Hasanain. "Meta-modelling of intensive computational models." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/13667/.

Full text
Abstract:
Engineering process design for applications that use computationally intensive nonlinear dynamical systems can be expensive in time and resources. The work presented here reviews the concept of a meta-model as a way to improve the efficiency of this process. The proposed meta-model has a computational advantage in implementation over the computationally intensive model, thereby reducing the time and resources required to design an engineering process. This work proposes to meta-model a computationally intensive nonlinear dynamical system using a reduced-order linear parameter-varying (LPV) modelling approach, with local linear models in velocity-based linearization form. The parameters of the linear time-varying meta-model are blended using Gaussian process regression models. The meta-model structure is transparent and relates directly to the dynamics of the computationally intensive model, while the velocity-based local linear models faithfully reproduce the original system dynamics anywhere in the operating space of the system. The non-parametric blending of the meta-model's local linear models by Gaussian process regression is well suited to dealing with data sparsity and provides uncertainty information about the meta-model predictions. The proposed meta-model structure has been applied to second-order nonlinear dynamical systems, a small nonlinear transmission line model, a medium-sized fluid dynamics problem, and a computationally intensive nonlinear transmission line model of order 5000.
4

Sadeghi, Sara. "Meta Modelling in the Vehicle Industry." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-119806.

Full text
Abstract:
The advance of electronics and information technology in recent years has made possible the proliferation of embedded systems in all fields. Accordingly, embedded systems have become increasingly common in automotive products. These systems bring improvements in functionality, increased system complexity, and more interaction between hardware and software components. Their development requires that engineers from multiple disciplines cooperate closely in order to efficiently develop such complex products. However, there are often disagreements among these engineers about design concepts such as requirements, functions, and specifications. Although there have been various attempts to provide information models for automotive embedded system design, in practice there is a lack of a consistent structure that represents and describes the relationships between these concepts. Moreover, such a structure has never been implemented in an industrial modelling language for complex physical systems such as Modelica. The objectives of this thesis are to provide a multi-level structure, representing different design abstraction levels, and a meta-model for automotive embedded system design. The thesis work was done at Scania, Södertälje, where these models were used for designing a fuel level display embedded system for a truck. The multi-level structure was designed and developed using a real case, the fuel display system, from high-level abstraction (customer requirements) down to component/block-level specifications. A meta-model was then proposed and evaluated on the basis of nine interviews with experts in information modelling and development from both industry and academia, against the following criteria: correctness, comprehensibility, expressiveness, generality, and usefulness. Six experts confirmed that the proposed meta-model was correct, while two commented that, in general, models cannot be said to be correct or incorrect; one expert considered that the model required more details. The model was comprehensible for the majority of the experts. Discussions regarding semantics and expressiveness resulted in some model refinements, after which the experts acknowledged the expressiveness of the meta-model. The experts agreed that the meta-model was general for automotive system design, and six of them confirmed that it was useful; the remaining experts offered recommendations.
5

Mason, Paul Andrew James. "MATrA: meta-modelling approach to traceability for avionics." Thesis, University of Newcastle upon Tyne, 2002. http://hdl.handle.net/10443/582.

Full text
Abstract:
Traceability is the common term for mechanisms to record and navigate relationships between artifacts produced by development and assessment processes. Effective management of these relationships is critical to the success of projects involving the development of complex aerospace products. Practitioners use a range of notations to model aerospace products (often as part of a defined technique or methodology). Those appropriate to electrical and electronic systems (avionics) include Use Cases for requirements, Ada for development and Fault Trees for assessment (others, such as PERT networks, support product management). Most notations used within the industry have tool support, although a lack of well-defined approaches to integration leads to inconsistencies and limits traceability between their respective data sets (internal models). Conceptually, the artifacts produced using such notations populate four traceability dimensions. Of these, three record links between project artifacts (describing the same product), while the fourth relates artifacts across different projects (and hence products), and across product families within the same project. The scope of this thesis is to define a meta-framework that characterises traceability dimensions for aerospace projects, and then to propose a concrete framework capturing the syntax and semantics of notations used in developing avionics for such projects, enabling traceability across the four dimensions. The concrete framework is achieved by exporting information from the internal models of tools supporting these notations to an integrated environment consisting of: i) a Workspace comprising a set of structures or meta-models (models describing models) expressed in a common modelling language representing selected notations (including appropriate extensions reflecting the application domain); ii) well-formedness constraints over these structures capturing properties of the notations (again reflecting the domain); and iii) associations between the structures. To maintain consistency and identify conflicts, elements of the structures are verified against a system model that defines common building blocks underlying the various notations. The approach is evaluated by (partial) tool implementation of the structures, which are populated using case study material derived from actual commercial specifications and industry standards.
6

Jackson, Daniel. "Modelling publication and reporting bias in meta-analysis." Thesis, University of Warwick, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.403117.

Full text
7

Liang, Zhihong. "A meta-modelling language definition for specific domain." Thesis, De Montfort University, 2009. http://hdl.handle.net/2086/3539.

Full text
Abstract:
Model Driven software development has been considered a further software construction technology, following object-oriented software development methods, with the potential to bring new breakthroughs in software development research. With deepening research, a growing number of Model Driven software development methods have been proposed, and models are now widely used in all aspects of software development. One key element determining progress in Model Driven software development research is how to better express and describe the models required for various software components. From a study of current Model Driven development technologies and methods, Domain-Specific Modelling is suggested in the thesis as a Model Driven method that can better realise the potential of Model Driven software development. For domain-specific modelling methods to be successfully applied to actual software development projects, a flexible and easily extensible meta-modelling language is needed to provide support. There is a particular requirement for meta-modelling languages suited to domain-specific modelling methods, as most general modelling languages are not suitable. The thesis focuses on the implementation of domain-specific modelling methods. The 'domain' is stressed as a keystone of software design and development, and this is what most differentiates the approach from general software development processes and methods. Concerning the design of meta-modelling languages, a meta-modelling language based on XML is defined, including its abstract syntax, concrete syntax and semantics. It supports the description and construction of the domain meta-model and the domain application model, and can effectively realise visual descriptions, descriptions of domain objects, descriptions of relationships, and rule relationships of the domain model. In the area of supporting tools, a meta-meta model is given. The meta-meta model provides a group of general basic component meta-model elements, together with the relationships between elements, for the construction of the domain meta-model, and supports multi-view, multi-level description of the domain model. Developers or domain experts can complete the design and construction of the domain-specific meta-model and the domain application model in the integrated modelling environment. The thesis lays the foundation necessary for research in descriptive languages through further study of the key technologies of meta-modelling languages based on Model Driven development.
8

Scheidgen, Markus. "Description of languages based on object-oriented meta-modelling." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2009. http://dx.doi.org/10.18452/15913.

Full text
Abstract:
In this thesis, I look into object-oriented meta-modelling and how it can be used to describe computer languages. I focus not only on describing languages, but also on utilising the language descriptions to automatically create language tools. I use the notion of meta-languages and meta-tools: meta-languages are used to describe certain language aspects, such as notation or semantics, and meta-tools are used to create language tools, such as editors or interpreters, from the corresponding descriptions. This combination of description and automated tool development is known as domain-specific modelling (DSM). I use DSM based on object-oriented meta-modelling to describe all important aspects of executable computer languages. I examine existing meta-languages and meta-tools for describing language utterances, their concrete representation, and their semantics. Furthermore, I develop a new platform for defining languages based on the CMOF model of the OMG MOF 2.x recommendations, a meta-language and meta-tool for textual language notations, and a new graphical meta-language and meta-tool for describing the operational semantics of computer languages. To prove the applicability of the presented techniques, I take SDL, the Specification and Description Language, as an archetype for textually notated languages with executable instances. For this archetype, I show that the presented meta-languages and meta-tools make it possible to describe such computer languages and to automatically create tools for those languages.
9

Zhou, Tao. "Meta-heuristic based Construction Supply Chain Modelling and Optimization." Thesis, Curtin University, 2018. http://hdl.handle.net/20.500.11937/75225.

Full text
Abstract:
Driven by the severe competition within the construction industry, the need to improve and optimize the performance of the construction supply chain has arisen. This thesis proposes three problems in construction supply chain optimization, from three perspectives: deterministic single-objective optimization, stochastic optimization, and multi-objective optimization. Mathematical models for each problem are constructed accordingly, and meta-heuristic algorithms are developed and applied to solve the three problems.
10

Samartsidis, Pantelis. "Point process modelling of coordinate-based meta-analysis neuroimaging data." Thesis, University of Warwick, 2016. http://wrap.warwick.ac.uk/87635/.

Full text
Abstract:
Now over 25 years old, functional magnetic resonance imaging (fMRI) has made significant contributions to improving our understanding of human brain function. However, some limitations of fMRI studies, including those associated with the small sample sizes that are typically employed, raise concerns about the validity of the technique. Lately, growing interest has been observed in combining the results of multiple fMRI studies in a meta-analysis. This can potentially address the limitations of single experiments and raise opportunities for reaching safer conclusions. Coordinate-based meta-analyses (CBMA) use the peak activation locations from multiple studies to find areas of consistent activation across experiments. CBMA presents statisticians with many interesting challenges. Several issues have been solved, but there are also many open problems. In this thesis, we review the literature on the topic and, after describing the unsolved problems, attempt to address some of the most important. The first problem that we approach is the incorporation of study-specific characteristics in the meta-analysis model, known as meta-regression. We propose a novel meta-regression model based on log-Gaussian Cox processes and develop a parameter estimation algorithm using the Hamiltonian Monte Carlo method. The second problem that we address is the use of CBMA data as a prior in small, underpowered fMRI studies. Building on existing work on the topic, we develop a hierarchical model for fMRI studies that uses previous CBMA findings as a prior for the location of the effects. Finally, we discuss a classical problem of meta-analysis, the file drawer problem, where studies are suppressed from the literature because they fail to report any significant finding. We use truncated models to infer the total number of non-significant studies that are missing from a database. All our methods are tested on both simulated and real data.
11

Neoh, Jun. "Application of meta-analysis, multidimensional scaling, structural equation modelling, and multilevel modelling in analysing carpooling behaviour." Thesis, University of Southampton, 2016. https://eprints.soton.ac.uk/395306/.

Full text
12

Wijesekera, Dhammika Harindra. "A form based meta-schema for information and knowledge elicitation." Swinburne University of Technology, 2006. http://adt.lib.swin.edu.au./public/adt-VSWT20060904.123024.

Full text
Abstract:
Knowledge is considered important for the survival and growth of an enterprise. Currently, knowledge is stored in various places, including the bottom drawers of employees. The human being is considered to be the most important knowledge provider. Over the years, knowledge based systems (KBS) have been developed to capture and nurture the knowledge of domain experts. However, such systems were considered to be separate and different from traditional information systems development, and many KBS development projects have failed. The main causes for such failures have been recognised as the difficulties associated with the process of knowledge elicitation, in particular the techniques and methods employed. On the other hand, the main emphasis of information systems development has been in the areas of data and information capture relating to transaction based systems. For knowledge to be effectively captured and nurtured, it is necessary for knowledge to be part of the information systems development activity. This thesis reports on a process of investigation and analysis conducted into the areas of information, knowledge and their overlap. This research advocates a hybrid approach, where knowledge and information capture are considered as one in a unified environment. A meta-schema design based on Formal Object Role Modelling (FORM), independent of implementation details, is introduced for this purpose. This is considered to be a key contribution of this research activity. Both information and knowledge are expected to be captured through this approach. Meta data types are provided for the capture of business rules, and they form part of the knowledge base of an organisation. The integration of knowledge with data and information is also described. XML is recognised by many as the preferred data interchange language, and it is investigated for the purpose of rule interchange. This approach is expected to enable organisations to interchange business rules and their meta-data, in addition to data and their schema. During interchange, rules can be interpreted and applied by receiving systems, thus providing a basis for intelligent behaviour. With the emergence of new technologies such as the Internet, the modelling of an enterprise as a series of business processes has gained prominence. Enterprises are moving towards integration, establishing well-described business processes within and across enterprises, to include their customers and suppliers. The purpose is to derive a common set of objectives and benefit from potential economic efficiencies. The suggested meta-schema design can be used in the early phases of requirements elicitation to specify, communicate, comprehend and refine various artefacts. This is expected to encourage domain experts and knowledge analysts to work towards describing each business process and their interactions. Existing business processes can be documented, and business efficiencies can be achieved through a process of refinement. The meta-schema design allows for a 'systems view' and the sharing of such views, thus enabling domain experts to focus on their area of specialisation whilst having an understanding of other business areas and their facts. The design also allows for synchronisation of the mental models of experts and the knowledge analyst. This has been a major issue with KBS development and one of the main reasons for the failure of such projects; the intention of this research is to provide a facility to overcome this issue. The natural language based FORM encourages verbalisation of the domain, hence increasing the understanding and comprehension of available business facts.
13

Wood, Michael James. "An exploration of building design and optimisation methods using Kriging meta-modelling." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/24974.

Full text
Abstract:
This thesis investigates the application of Kriging meta-modelling techniques in the field of building design and optimisation. Two key factors motivated this research. The first is the need for building designers to have tools that allow low energy buildings to be designed in a fast and efficient manner. The second is the need for optimisation tools that account, or help account, for the wide variety of uses that a building might have; so-called Robust Optimisation (RO). This thesis therefore includes an analysis of Kriging meta-modelling, first applied to simple building problems. I then use this simple building model to determine the effect of the updated UK Test Reference Years (TRYs) on energy consumption. Second, I examine Kriging-based optimisation techniques for a single objective. I then revisit the single-building meta-model to examine the effect of uncertainty on a neighbourhood of buildings and compare the results to the output of a brute-force analysis with a full building simulator. The results show that Kriging emulation is an effective tool for creating a meta-model of a building. The subsequent analysis of the effect of TRYs on buildings shows that UK buildings are likely to use less heating in the future but are likely to overheat more. In the final two chapters I use the techniques developed to create a robust building optimisation algorithm, as well as using Kriging to improve the optimisation efficiency of the well-known NSGA-II algorithm. I show that the Kriging-based robust optimiser finds more robust solutions than traditional global optimisation, and that Kriging techniques can be used to augment NSGA-II so that it finds more diverse solutions to some types of multi-objective optimisation problems and converges significantly faster. Although further work is required to verify the results for a wider variety of building applications, the initial results are promising, and many potential areas of future research are revealed.
14

Randall, Marcus Christian. "A General Modelling System and Meta-Heuristic Based Solver for Combinatorial Optimisation Problems." Griffith University, School of Environmental and Applied Science, 1999. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20051116.120133.

Full text
Abstract:
There are many real world assignment, scheduling and planning tasks which can be classified as combinatorial optimisation problems (COPs). These are usually formulated as a mathematical problem of minimising or maximising some cost function subject to a number of constraints. Usually, such problems are NP-hard, and thus, whilst it is possible to find exact solutions to specific problems, in general only approximate solutions can be found. Many algorithms have been proposed for finding approximate solutions to COPs, ranging from special purpose heuristics to general search meta-heuristics such as simulated annealing and tabu search. General meta-heuristic algorithms like simulated annealing have been applied to a wide range of problems. In most cases, the designer must choose an appropriate data structure and a set of local operators that define a search neighbourhood. The variability in representation techniques, and in suitable neighbourhood transition operators, has meant that it is usually necessary to develop new code for each problem. Toolkits like Ingber's Adaptive Simulated Annealing (Ingber 1993, 1996) have been applied to assist rapid prototyping of simulated annealing codes; however, these still require the development of new programs for each type of problem. There have been very few attempts to develop a general meta-heuristic solver, a notable exception being Connolly's General Purpose Simulated Annealing (Connolly 1992). In this research, a general meta-heuristic based system is presented that is suitable for a wide range of COPs. The main goal of this work is to build an environment in which it is possible to specify a range of COPs using an algebraic formulation, and to produce a tailored solver automatically. This removes the need for the development of specific software, allowing very rapid prototyping. Similar techniques have been available for linear programming based solvers for some years in the form of the GAMS (General Algebraic Modelling System) (Brooke, Kendrick, Meeraus and Raman 1997) and AMPL (Fourer, Gay and Kernighan 1993) interfaces. The new system is based on a novel linked list data structure rather than the more conventional vector notation, owing to the natural mapping between COPs and lists. In addition, the modelling system is found to be very suitable for processing by meta-heuristic search algorithms, as it allows the direct application of common local search operators. A general solver is built that is based on the linked list modelling system. This system is capable of using meta-heuristic search engines such as greedy search, tabu search and simulated annealing. A number of implementation issues, such as generating initial solutions, choosing and invoking appropriate local search transition operators, and producing suitable incremental cost expressions, are considered. As such, the system can be seen as a good test-bench for model prototypers and those who wish to test various meta-heuristic implementations in a standard way. However, it is not meant as a replacement or substitute for efficient special purpose search algorithms. The solver shows good performance on a wide range of problems, frequently reaching the optimal and best-known solutions. Where this is not the case, solutions within a few percent deviation are produced. Performance is dependent on the chosen transition operators and the frequency with which each is applied. To a lesser extent, the performance of this implementation is influenced by runtime parameters of the meta-heuristic search engine.
15

Gogolin, Sarah. "Diagnosing Students’ Meta-Modelling Knowledge: Gathering Validity Evidence during Test Development." Berlin: Freie Universität Berlin, 2017. http://d-nb.info/1149513047/34.

Full text
16

Randall, Marcus. "A General Modelling System and Meta-Heuristic Based Solver for Combinatorial Optimisation Problems." Thesis, Griffith University, 1999. http://hdl.handle.net/10072/367399.

Full text
Abstract:
There are many real world assignment, scheduling and planning tasks which can be classified as combinatorial optimisation problems (COPs). These are usually formulated as a mathematical problem of minimising or maximising some cost function subject to a number of constraints. Usually, such problems are NP-hard, and thus, whilst it is possible to find exact solutions to specific problems, in general only approximate solutions can be found. Many algorithms have been proposed for finding approximate solutions to COPs, ranging from special purpose heuristics to general search meta-heuristics such as simulated annealing and tabu search. General meta-heuristic algorithms like simulated annealing have been applied to a wide range of problems. In most cases, the designer must choose an appropriate data structure and a set of local operators that define a search neighbourhood. The variability in representation techniques, and in suitable neighbourhood transition operators, has meant that it is usually necessary to develop new code for each problem. Toolkits like Ingber's Adaptive Simulated Annealing (Ingber 1993, 1996) have been applied to assist rapid prototyping of simulated annealing codes; however, these still require the development of new programs for each type of problem. There have been very few attempts to develop a general meta-heuristic solver, a notable exception being Connolly's General Purpose Simulated Annealing (Connolly 1992). In this research, a general meta-heuristic based system is presented that is suitable for a wide range of COPs. The main goal of this work is to build an environment in which it is possible to specify a range of COPs using an algebraic formulation, and to produce a tailored solver automatically. This removes the need for the development of specific software, allowing very rapid prototyping. Similar techniques have been available for linear programming based solvers for some years in the form of the GAMS (General Algebraic Modelling System) (Brooke, Kendrick, Meeraus and Raman 1997) and AMPL (Fourer, Gay and Kernighan 1993) interfaces. The new system is based on a novel linked list data structure rather than the more conventional vector notation, owing to the natural mapping between COPs and lists. In addition, the modelling system is found to be very suitable for processing by meta-heuristic search algorithms, as it allows the direct application of common local search operators. A general solver is built that is based on the linked list modelling system. This system is capable of using meta-heuristic search engines such as greedy search, tabu search and simulated annealing. A number of implementation issues, such as generating initial solutions, choosing and invoking appropriate local search transition operators, and producing suitable incremental cost expressions, are considered. As such, the system can be seen as a good test-bench for model prototypers and those who wish to test various meta-heuristic implementations in a standard way. However, it is not meant as a replacement or substitute for efficient special purpose search algorithms. The solver shows good performance on a wide range of problems, frequently reaching the optimal and best-known solutions. Where this is not the case, solutions within a few percent deviation are produced. Performance is dependent on the chosen transition operators and the frequency with which each is applied. To a lesser extent, the performance of this implementation is influenced by runtime parameters of the meta-heuristic search engine.
17

Ashford, Derek George. "Constraints on skill acquisitions: a meta-analysis of the movement based observational modelling literature." Thesis, Manchester Metropolitan University, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.404545.

Full text
Abstract:
The program of work reported within the thesis constitutes the first quantitative analysis of the observational modelling (OM) literature. It was motivated by the major limitations that restrict qualitative reviews, which have been common in the skill acquisition literature (McCullagh et al., 1989; Williams et al., 1999; Williams, 1993). Qualitative reviews typically involve only a limited sample of independent investigations, and the selection and subsequent interpretation processes are highly susceptible to various types of bias (Cooper and Hedges, 1994). An extensive search of the literature was conducted, including (n = 293) sources associated with all types of behaviour modification involving OM. Because the thesis was aimed at understanding the effect of constraints on movement skill acquisition, only modelling effects associated with movement behaviour modification were considered (n = 65). In chapter 1, a qualitative review of the movement based OM literature was included. This revealed that experiments on behaviour modification associated with OM have used various experimental designs (e.g., between- and within-groups), and typically movement effect (ME) and/or movement dynamics (MD) outcomes as dependent measures. These qualitative findings provided the rationale for the meta-analyses that followed. In chapter 2, current meta-analytic procedures were reviewed to clarify the protocols required to synthesize overall mean effects of OM treatments from diverse designs. Effect size estimates derived from (n = 69) primary investigations were used within two major meta-analytic summaries. The first review (chap. 3) clarified the overall mean treatment effect of OM for ME (δ = 0.27) and MD (δ = 0.77) measures over and above that gained through practice-only / discovery learning conditions. Both treatment effects represent significant (p < 0.01) modelling benefits over control conditions, with additional benefits clearly evident for MD outcomes. These results are consistent with the Visual Perception Perspective (Scully and Newell, 1985) for OM, and suggest that, primarily, demonstrations convey the MD (i.e., relative motions) required to approximate modelled movement skills. Although ME (i.e., performance outcomes) can benefit, modelling treatment effects are typically more modest, suggesting an increased role of practice in skill acquisition. To quantify task constraint influences during OM, a new task classification measure was developed (chap. 4). The classification used two difficulty components, novelty and complexity, which were defined using 3 and 7 descriptive variables respectively. The inter-rater reliability and test-retest objectivity of each descriptive variable rating produced intraclass correlation coefficients ranging from r = 0.81 to 1.00 and r = 0.87 to 1.00 respectively. The second review (chap. 5) reported the mean treatment effects for high and low movement novelty and complexity for MD and ME modelling outcomes. MD results indicated a marked difference in overall treatment effects gained for high (δ = 1.02) and low (δ = 0.57) novelty. Similar, yet more modest, novelty effects were obtained for ME outcomes (δ_high = 0.42 and δ_low = 0.11). These results were in direct contrast to previous predictions and conclusions (Gould, 1978), and suggest that facilitative MD modelling outcomes occurred with increased task novelty. The complexity analyses highlighted no discernible difference in MD treatment effects for high (δ = 0.72) and low (δ = 0.74) movement complexity. ME measures were generally more trivial, but also showed little difference resulting from high (δ = 0.07) or low (δ = 0.12) task complexity. Comparable estimates were obtained for an overall difficulty analysis which combined the novelty and complexity components. These results indicated that whilst complexity might be expected to influence OM outcomes, further analysis and refinement of the current complexity classification may be warranted within future research efforts.
18

Bergström, Sofia. "Modelling Business Capabilities with Enterprise Architecture: A Case Study at a Swedish Pension Managing Company." Thesis, KTH, Industriella informations- och styrsystem, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-175378.

Full text
Abstract:
This master thesis looks at the use of business capabilities within enterprise architecture, and investigates how the concept is used within the Swedish pension managing company Folksam. Based on interviews with stakeholders, an enterprise architecture meta-model centred on the business capability is constructed. The meta-model is then edited and revised according to a questionnaire aimed at removing irrelevant elements, and a second set of interviews discussing a capability's health status and well-being. This second set of interviews resulted in the removal of elements not affecting the well-being of a capability. The final meta-model has the business capability and the capability health status at its core. It consists of the Capability element, with two attributes, surrounded by nine other elements connected by eleven relations in total.
19

Travagin, Gabriele. "Scrittura espressiva in adolescenza: Dalla meta-analisi ad un test sperimentale di un nuovo intervento di scrittura." Doctoral thesis, Università Cattolica del Sacro Cuore, 2012. http://hdl.handle.net/10280/1256.

Full text
Abstract:
The present research program evaluates the use of Expressive Writing (EW) with adolescents through three studies, organized in a progressive fashion. Study 1 investigates the efficacy and moderators of EW with adolescents through a meta-analysis. The study performed a quantitative review of EW interventions with adolescent samples, according to the following steps: systematic literature search and coding of the studies; calculation of the effect sizes; and analysis of the moderators. Study 2 experimentally compares the short- and long-term effects of the traditional writing condition (EW) with a cognitively-oriented EW condition (CEW), developed on the basis of the meta-analysis results, in a sample of adolescents. The analyses tested the effects of altering the writing instructions on the social and emotional adjustment of participants. Study 3 consists of a secondary analysis of the written essays collected in Study 2, examining the effects of the intervention as a function of the cognitive processes ("self-distancing") observed during the writing sessions, by means of Group-Based Trajectory Modeling. The findings are discussed on the basis of their theoretical and practical implications.
20

Saeedi, Kawther Abdulelah. "QRMF : a multi-perspective framework for quality requirements modelling." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/qrmf-a-multiperspective-framework-for-quality-requirements-modelling(7e02e8f6-7abb-4179-84f0-8ea1581fadb2).html.

Full text
Abstract:
In recent years, a considerable amount of research has been conducted into modelling non-functional requirements (NFR), or quality requirements (QR). However, in comparison with functional requirements (FR) modelling, QR models are still immature and have not been widely adopted. The fundamental reason for this shortfall, outlined in this thesis, is that existing QR modelling approaches have not adequately considered the challenging nature of QRs. In this thesis, this limitation is addressed by integrating QR modelling with FR modelling in a multi-perspective modelling framework. This framework, called QRMF (Quality Requirements Modelling Framework), offers a process-oriented approach to modelling QRs from different views and at different requirements phases. These models are brought together in a descriptive representation schema, which provides a logical structure to guide the construction of requirement models comprehensively and consistently. The research presented in the thesis introduces a generic meta-meta model for QRMF to aid understanding of the abstract concepts and to further guide the modelling process; it offers a reference blueprint for developing a modelling tool applicable to the framework. QRMF is supported by a modelling process, which guides requirement engineers to capture a set of complete, traceable and comprehensible QR models for a software system. The thesis presents a case study, which evaluates the practicality and applicability of QRMF. Finally, the framework is evaluated theoretically, by comparing and contrasting related approaches found in the literature.
21

O'Mara, Alison Jane. "Methodological and Substantive Applications of Meta-Analysis: Multilevel Modelling, Simulation, and the Construct Validation of Self-Concept." Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.508589.

Full text
22

Almgren, Love, and Johan Holm Åström. "Probabilistic modelling and attack simulations on AWS Connected Vehicle Solution: An Application of the Meta Attack Language." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259298.

Full text
Abstract:
This work investigates whether the Meta Attack Language (MAL) can be used to create an integrating layer between two different applications of MAL, and thus to model a new domain. In this case, vehicleLang and awsLang were chosen as candidate applications of MAL, while the chosen domain was the AWS Connected Vehicle Solution infrastructure, a service that is quickly becoming popular among car manufacturers. The two languages were successfully compiled into one language using MAL, which was then used to model a leak within AWS that could potentially lead to greater exposure of the infrastructure as a whole. Some limitations in the MAL compiler also led to suggestions on how to improve it for better support of integrating different MAL applications.
23

Florin, Madeleine Jill. "Towards Precision Agriculture for whole farms using a combination of simulation modelling and spatially dense soil and crop information." Thesis, The University of Sydney, 2008. http://hdl.handle.net/2123/3169.

Full text
Abstract:
Precision Agriculture (PA) strives towards holistic production and environmental management. A fundamental research challenge is the continuous expansion of ideas about how PA can contribute to sustainable agriculture. Associated pragmatic research challenges include quantification of the spatio-temporal variation of crop yield; crop growth simulation modelling within a PA context; and evaluating long-term financial and environmental outcomes from site-specific crop management (SSCM). In Chapter 1, literature about managing whole farms with a mind towards sustainability was reviewed. Alternative agricultural systems and concepts, including systems thinking, agro-ecology, mosaic farming and PA, were investigated. With respect to environmental outcomes, it was found that PA research is relatively immature; there is scope to thoroughly evaluate PA from a long-term, whole-farm environmental and financial perspective. Comparatively, the emphasis of PA research on managing spatial variability offers promising and innovative ways forward, particularly in terms of designing new farming systems. It was found that using crop growth simulation modelling in a PA context is potentially very useful, although modelling high-resolution spatial and temporal variability with current simulation models poses a number of immediate research issues. This research focused on three whole farms located in Australia that grow predominantly grains without irrigation. These study sites represent three important grain growing regions within Australia: northern NSW, north-east Victoria and South Australia. Noteworthy environmental and climatic differences between these regions, such as rainfall timing, soil type and topographic features, were outlined in Chapter 2. When considering adoption of SSCM, it is essential to understand the impact of temporal variation on the potential value of managing spatial variation. Quantifying spatio-temporal variation of crop yield serves this purpose; however, this is a conceptually and practically challenging undertaking. A small number of previous studies have found that the magnitude of temporal variation far exceeds that of spatial variation. Chapter 3 of this thesis dealt with existing and new approaches to quantifying the relationship between spatial and temporal variability in crop yield. It was found that using pseudo cross-variography to obtain spatial and temporal variation 'equivalents' is a promising approach to quantitatively comparing spatial and temporal variation. The results from this research indicate that more data in the temporal dimension are required to enable thorough analysis using this approach; this is particularly relevant when questioning the suitability of SSCM. Crop growth simulation modelling offers PA a number of benefits, such as the ability to simulate a considerable volume of data in the temporal dimension. A dominant challenge recognised within the PA/modelling literature is the mismatch between the spatial resolution of point-based model output (and therefore input) and the spatial resolution of information demanded by PA. This culminates in questions about the conceptual model underpinning the simulation model and the practicality of using point-based models to simulate spatial variability. The ability of point-based models to simulate appropriate spatial and temporal variability of crop yield, and the importance of soil available water capacity (AWC) for these simulations, were investigated in Chapter 4. The results indicated that simulated spatial variation is low compared to some previously reported spatial variability of real yield data for some climate years. It was found that the structure of spatial yield variation was directly related to the structure of the AWC and interactions between AWC and climate. It is apparent that varying AWC spatially is a reasonable starting point for modelling spatial variation of crop yield. A trade-off between capturing adequate spatio-temporal variation of crop yield and the inclusion of realistically obtainable model inputs is identified. A number of practical solutions to model parameterisation for PA purposes are identified in the literature. A popular approach is to minimise the number of simulations required. Another approach, which enables modelling at every desired point across a study area, involves taking advantage of high-resolution yield information from a number of years to estimate site-specific soil properties through the inverse use of a crop growth simulation model. Inverse meta-modelling was undertaken in Chapter 5 to estimate AWC on 10-metre grids across each of the study farms. This proved to be an efficient approach to obtaining high-resolution AWC information at the spatial extent of whole farms. The AWC estimates proved useful for yield prediction using simple linear regression, as opposed to application within a complex crop growth simulation model. The ability of point-based models to simulate spatial variation was revisited in Chapter 6 with respect to the exclusion of lateral water movement. The addition of a topographic component to the simple point-based yield prediction models substantially improved yield predictions. The value of these additions was interpreted using coefficients of determination and by comparing variograms for each of the yield prediction components. A result consistent with the preceding chapter is the importance of further validating the yield prediction models with additional yield data when they become available. Finally, some whole-farm management scenarios using SSCM were synthesised in Chapter 7. A framework was established that enables evaluation of the long-term (50-year) farm outcomes of soil carbon sequestration, nitrogen leaching and crop yield. The suitability of SSCM across whole farms over the long term was investigated, and it was found that the suitability of SSCM is confined to certain fields. This analysis also enabled identification of the parts of the farms that are the least financially and environmentally viable. SSCM, in conjunction with other PA management strategies, is identified as a promising approach to long-term, whole-farm integrated management.
25

Florin, Madeleine Jill. "Towards Precision Agriculture for whole farms using a combination of simulation modelling and spatially dense soil and crop information." University of Sydney, 2008. http://hdl.handle.net/2123/3169.

Full text
Abstract:
Doctor of Philosophy
Precision Agriculture (PA) strives towards holistic production and environmental management. A fundamental research challenge is the continuous expansion of ideas about how PA can contribute to sustainable agriculture. Some associated pragmatic research challenges include quantification of spatio-temporal variation of crop yield; crop growth simulation modelling within a PA context and; evaluating long-term financial and environmental outcomes from site-specific crop management (SSCM). In Chapter 1 literature about managing whole farms with a mind towards sustainability was reviewed. Alternative agricultural systems and concepts including systems thinking, agro-ecology, mosaic farming and PA were investigated. With respect to environmental outcomes it was found that PA research is relatively immature. There is scope to thoroughly evaluate PA from a long-term, whole-farm environmental and financial perspective. Comparatively, the emphasis of PA research on managing spatial variability offers promising and innovative ways forward, particularly in terms of designing new farming systems. It was found that using crop growth simulation modelling in a PA context is potentially very useful. Modelling high-resolution spatial and temporal variability with current simulation models poses a number of immediate research issues. This research focused on three whole farms located in Australia that grow predominantly grains without irrigation. These study sites represent three important grain growing regions within Australia. These are northern NSW, north-east Victoria and South Australia. Note-worthy environmental and climatic differences between these regions such as rainfall timing, soil type and topographic features were outlined in Chapter 2. When considering adoption of SSCM, it is essential to understand the impact of temporal variation on the potential value of managing spatial variation. Quantifying spatiotemporal variation of crop yield serves this purpose; however, this is a conceptually and practically challenging undertaking. A small number of previous studies have found that the magnitude of temporal variation far exceeds that of spatial variation. Chapter 3 of this thesis dealt with existing and new approaches quantifying the relationship between spatial and temporal variability in crop yield. It was found that using pseudo cross variography to obtain spatial and temporal variation ‘equivalents’ is a promising approach to quantitatively comparing spatial and temporal variation. The results from this research indicate that more data in the temporal dimension is required to enable thorough analysis using this approach. This is particularly relevant when questioning the suitability of SSCM. Crop growth simulation modelling offers PA a number of benefits such as the ability to simulate a considerable volume of data in the temporal dimension. A dominant challenge recognised within the PA/modelling literature is the mismatch between the spatial resolution of point-based model output (and therefore input) and the spatial resolution of information demanded by PA. This culminates into questions about the conceptual model underpinning the simulation model and the practicality of using point-based models to simulate spatial variability. iii The ability of point-based models to simulate appropriate spatial and temporal variability of crop yield and the importance of soil available water capacity (AWC) for these simulations were investigated in Chapter 4. 
The results indicated that simulated spatial variation is low compared to some previously reported spatial variability of real yield data for some climate years. It was found that the structure of spatial yield variation was directly related to the structure of the AWC and interactions between AWC and climate. It is apparent that varying AWC spatially is a reasonable starting point for modelling spatial variation of crop yield. A trade-off between capturing adequate spatio-temporal variation of crop yield and the inclusion of realistically obtainable model inputs is identified. A number of practical solutions to model parameterisation for PA purposes are identified in the literature. A popular approach is to minimise the number of simulations required. Another approach, which enables modelling at every desired point across a study area, involves taking advantage of high-resolution yield information from a number of years to estimate site-specific soil properties with the inverse use of a crop growth simulation model. Inverse meta-modelling was undertaken in Chapter 5 to estimate AWC on 10-metre grids across each of the study farms. This proved to be an efficient approach to obtaining high-resolution AWC information at the spatial extent of whole farms. The AWC estimates proved useful for yield prediction using simple linear regression as opposed to application within a complex crop growth simulation model. The ability of point-based models to simulate spatial variation was revisited in Chapter 6 with respect to the exclusion of lateral water movement. The addition of a topographic component into the simple point-based yield prediction models substantially improved yield predictions. The value of these additions was interpreted using coefficients of determination and by comparing variograms for each of the yield prediction components. A result consistent with the preceding chapter is the importance of validating the yield prediction models with additional yield data as it becomes available. Finally, some whole-farm management scenarios using SSCM were synthesised in Chapter 7. A framework that enables evaluation of the long-term (50 years) farm outcomes of soil carbon sequestration, nitrogen leaching and crop yield was established. The suitability of SSCM across whole farms over the long term was investigated, and it was found that the suitability of SSCM is confined to certain fields. This analysis also enabled identification of the parts of the farms that are the least financially and environmentally viable. SSCM in conjunction with other PA management strategies is identified as a promising approach to long-term, whole-farm integrated management.
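As an illustration of the inverse meta-modelling idea in Chapter 5 (not the thesis's actual code; the emulator, parameter ranges and variable names below are invented), one minimal formulation is to approximate the crop model with a cheap emulator and estimate AWC at each grid cell by minimising the mismatch between emulated and observed yields across seasons:

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical emulator of a point-based crop model: predicted yield as a
    # function of available water capacity (AWC, mm) and seasonal rainfall (mm).
    # A real study would fit this response surface to crop-model runs.
    def yield_emulator(awc, rainfall):
        water = np.minimum(awc, rainfall)            # crude water-limitation term
        return 4.0 * (1.0 - np.exp(-water / 150.0))  # t/ha, saturating response

    def estimate_awc(observed_yields, rainfall_by_season):
        """Invert the emulator: find the AWC that best reproduces the observed
        multi-season yields at one grid cell (least-squares mismatch)."""
        def mismatch(awc):
            pred = yield_emulator(awc, rainfall_by_season)
            return np.sum((pred - observed_yields) ** 2)
        result = minimize_scalar(mismatch, bounds=(20.0, 300.0), method="bounded")
        return result.x

    rainfall = np.array([210.0, 340.0, 150.0])   # three seasons of rainfall
    observed = np.array([2.9, 3.6, 2.3])         # observed yields at one cell
    print(f"estimated AWC: {estimate_awc(observed, rainfall):.0f} mm")

Repeating this inversion cell by cell is what yields the farm-wide AWC maps described above.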
APA, Harvard, Vancouver, ISO, and other styles
26

Charoensawat, Supada. "A likelihood approach based upon the proportional hazards model for SROC modelling in meta-analysis of diagnostic studies." Thesis, University of Reading, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.627926.

Full text
Abstract:
The number of meta-analyses of diagnostic studies is increasing, and models which deal with the summary receiver operating characteristic (SROC) curve have become quite popular. Many of these models have reached considerable statistical complexity and require expertise and knowledge. Here, a model named the proportional hazards model (PHM) is developed. The PHM has a simple form and is easy to interpret. There is only one parameter of interest, θ, which is called the diagnostic accuracy and has the interpretation that the smaller θ is, the higher the diagnostic accuracy.
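For orientation, the proportional hazards (Lehmann-family) SROC model is commonly written as a power relationship between the true positive rate (TPR, sensitivity) and the false positive rate (FPR, 1 - specificity); whether the thesis uses exactly this parameterisation is an assumption, but it matches the interpretation given above:

    \mathrm{TPR} = \mathrm{FPR}^{\theta}, \qquad \theta > 0,

so that, for a fixed FPR, a smaller \theta pushes the TPR towards 1 (higher accuracy), while \theta = 1 collapses the SROC curve onto the uninformative diagonal \mathrm{TPR} = \mathrm{FPR}.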
APA, Harvard, Vancouver, ISO, and other styles
27

ADAMO, GRETA. "Investigating business process elements: a journey from the field of Business Process Management to ontological analysis, and back." Doctoral thesis, Università degli studi di Genova, 2020. http://hdl.handle.net/11567/1010191.

Full text
Abstract:
Business process modelling languages (BPMLs) typically enable the representation of business processes via the creation of process models, which are constructed using the elements and graphical symbols of the BPML itself. Despite the wide literature on business process modelling languages, on the comparison between graphical components of different languages, on the development and enrichment of new and existing notations, and the numerous definitions of what a business process is, the BPM community still lacks a robust (ontological) characterisation of the elements involved in business process models and, even more importantly, of the very notion of business process. While some efforts have been made in this direction, the majority of works in this area focus on the analysis of the behavioural (control flow) aspects of process models only, thus neglecting other central modelling elements, such as those denoting process participants (e.g., data objects, actors), relationships among activities, goals, values, and so on. The overall purpose of this PhD thesis is to provide a systematic study of the elements that constitute a business process, based on ontological analysis, and to apply these results back to the Business Process Management field. The major contributions that were achieved in pursuing our overall purpose are: (i) a first comprehensive and systematic investigation of what constitutes a business process meta-model in the literature, and a definition of what we call a literature-based business process meta-model starting from the different business process meta-models proposed in the literature; (ii) the ontological analysis of four business process elements (event, participant, relationship among activities, and goal), which were identified as missing or problematic in the literature and in the literature-based meta-model; (iii) the revision of the literature-based business process meta-model that incorporates the analysis of the four investigated business process elements - event, participant, relationship among activities and goal; and (iv) the definition and evaluation of a notation that enriches the relationships between activities by including the notions of occurrence dependences and rationales.
APA, Harvard, Vancouver, ISO, and other styles
28

Jakimavičiūtė, Vilma. "Organizacinės struktūros modeliavimas naudojant UML profilį." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2008. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2008~D_20080612_094109-22866.

Full text
Abstract:
The goal of this thesis is to create a specialized modeling environment for defining organization structures. A typical organization is divided into organization units and has a number of employees, each assigned to a specific organization unit, holding a specific position, and possibly involved in projects where they play different roles. A review of modeling tools and languages shows that there is no single dominant approach to modeling organization structure; in most cases it is modeled with informal diagrams, which are difficult to analyze and maintain. This thesis proposes a simple organization-structure meta-model covering the most important concepts, together with its realization as a UML profile. A specialized environment for modeling organization structures was created using MagicDraw UML tool features: stereotype customizations, validation rules, and report generation. The practical value of the created environment is demonstrated by modeling the structure of a real organization, UAB „Baltijos programinė įranga“.
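The UML profile itself cannot be reproduced here, but the core concepts the abstract lists can be sketched as a plain-code meta-model; the class and attribute names below are illustrative only, not taken from the thesis:

    from dataclasses import dataclass, field

    # Illustrative rendering of the organization-structure meta-model described
    # above: units contain sub-units, employees belong to a unit and hold a
    # position, and employees may play roles in projects.
    @dataclass
    class OrgUnit:
        name: str
        sub_units: list["OrgUnit"] = field(default_factory=list)

    @dataclass
    class Employee:
        name: str
        position: str          # e.g. "analyst", "developer" (hypothetical)
        unit: OrgUnit

    @dataclass
    class ProjectRole:
        project: str
        role: str              # e.g. "architect" on a given project
        employee: Employee

    it_dept = OrgUnit("IT department")
    alice = Employee("Alice", "developer", it_dept)
    assignment = ProjectRole("ERP migration", "architect", alice)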
APA, Harvard, Vancouver, ISO, and other styles
29

Strawbridge, Alexander Daniel. "Modelling non-linear exposure-disease relationships in a large individual participant meta-analysis allowing for the effects of exposure measurement error." Thesis, University of Cambridge, 2012. https://www.repository.cam.ac.uk/handle/1810/243941.

Full text
Abstract:
This thesis was motivated by data from the Emerging Risk Factors Collaboration (ERFC), a large individual participant data (IPD) meta-analysis of risk factors for coronary heart disease (CHD). Cardiovascular disease is the largest cause of death in almost all countries in the world; it is therefore important to be able to characterise the shape of risk factor–CHD relationships. Many of the risk factors for CHD considered by the ERFC are subject to substantial measurement error, and their relationship with CHD is non-linear. We firstly consider issues associated with modelling the risk factor–disease relationship in a single study, before using meta-analysis to combine relationships across studies. It is well known that classical measurement error generally attenuates linear exposure–disease relationships; however, its precise effect on non-linear relationships is less well understood. We investigate the effect of classical measurement error on the shapes of exposure–disease relationships that are commonly encountered in epidemiological studies, and then consider methods for correcting for classical measurement error. We propose the application of a widely used correction method, regression calibration, to fractional polynomial models. We also consider the effects of non-classical error on the observed exposure–disease relationship, and the impact on our correction methods when we erroneously assume classical measurement error. Analyses performed using categorised continuous exposures are common in epidemiology. We show that MacMahon's method for correcting for measurement error in analyses that use categorised continuous exposures, although simple, does not provide the correct shape for non-linear exposure–disease relationships. We perform a simulation study to compare alternative methods for categorised continuous exposures. Meta-analysis is the statistical synthesis of results from a number of studies addressing similar research hypotheses. The use of IPD is the gold-standard approach because it allows for consistent analysis of the exposure–disease relationship across studies. Methods have recently been proposed for combining non-linear relationships across studies. We discuss these methods, extend them to P-spline models, and consider alternative methods of combining relationships across studies. We apply the methods developed to the relationships of fasting blood glucose and lipoprotein(a) with CHD, using data from the ERFC.
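As background to the regression calibration method mentioned above: under the classical error model the observed exposure W is the true exposure X plus independent noise U, and regression calibration replaces X in the disease model by its expectation given W. In the simple normal linear case this gives the textbook attenuation by the reliability ratio (whether the thesis uses exactly this variant is not stated in the abstract):

    W = X + U, \qquad
    E[X \mid W] = \mu_X + \lambda\,(W - \mu_X), \qquad
    \lambda = \frac{\sigma_X^2}{\sigma_X^2 + \sigma_U^2},

so a linear slope estimated on W is attenuated by the factor \lambda, and the correction fits the outcome model on E[X \mid W] (or, for non-linear shapes, on fractional polynomial terms of it) instead of W.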
APA, Harvard, Vancouver, ISO, and other styles
30

Malin, Gemma. "The diagnostic/prognostic value of neonatal findings for predicting childhood and adult morbidity : systematic reviews, meta-analysis and decision analytic modelling." Thesis, University of Birmingham, 2013. http://etheses.bham.ac.uk//id/eprint/4156/.

Full text
Abstract:
Events in utero have been linked with diseases throughout life; however, there is a lack of consensus regarding the ability of neonatal tests to predict these outcomes. Systematic reviews and meta-analyses were performed, assessing umbilical cord pH and base excess at birth, standards of low birth weight, and the Apgar score, including a total of 218 papers and 26,704,980 individuals. The prognostic association and predictive accuracy of these tests for adverse outcomes, including neonatal mortality and morbidity, childhood morbidity including cerebral palsy, and adult outcomes, were determined. A decision-analytic model-based analysis assessed the cost-effectiveness of varying the umbilical cord pH threshold for treatment with neonatal hypothermia. This thesis determined that all of the tests examined had a strong association with neonatal mortality, and a significant but smaller association with neonatal morbidity and childhood cerebral palsy. In general, where the association was strong, tests had a high specificity and positive likelihood ratio for adverse outcome, but poor sensitivity and negative likelihood ratio, indicating that negative tests do not reduce the risk. The cost-effectiveness analysis showed that the pH threshold used in current practice to recommend neonatal hypothermia is more effective and less costly than a higher threshold.
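For reference, the likelihood ratios referred to here are the standard test-accuracy summaries:

    \mathrm{LR}^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}}, \qquad
    \mathrm{LR}^{-} = \frac{1 - \text{sensitivity}}{\text{specificity}},

so a large LR+ means a positive test substantially raises the probability of the adverse outcome, while an LR- close to 1 means a negative test barely lowers it, which is exactly the pattern the abstract reports for these neonatal tests.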
APA, Harvard, Vancouver, ISO, and other styles
31

Pirathiban, Ramethaa. "Improving species distribution modelling: Selecting absences and eliciting variable usefulness for input into standard algorithms or a Bayesian hierarchical meta-factor model." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/134401/1/Ramethaa_Pirathiban_Thesis.pdf.

Full text
Abstract:
This thesis explores and proposes methods to improve species distribution models. Throughout this thesis, a rich class of statistical modelling techniques has been developed to address crucial and interesting issues related to the data input into these models. The overall contribution of this research is the advancement of knowledge on species distribution modelling through an increased understanding of extraneous zeros, quality of the ecological data, variable selection that incorporates ecological theory and evaluating performance of the fitted models. Though motivated by the challenge of species distribution modelling from ecology, this research is broadly relevant to many fields, including bio-security and medicine. Specifically, this research is of potential significance to researchers seeking to: identify and explain extraneous zeros; assess the quality of their data; or employ expert-informed variable selection.
APA, Harvard, Vancouver, ISO, and other styles
32

Nyblom, Per. "Dynamic Abstraction for Interleaved Task Planning and Execution." Licentiate thesis, Linköping : Department of Computer and Information Science, Linköpings universitet, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11924.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Callow, Glenn. "Extending relational model transformations to better support the verification of increasingly autonomous systems." Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/13435.

Full text
Abstract:
Over the past decade the capabilities of autonomous systems have been steadily increasing. Unmanned systems are moving from systems that are predominantly remotely operated to systems that include a basic decision-making capability. This is a trend that is expected to continue, with autonomous systems making decisions in increasingly complex environments, based on more abstract, higher-level missions and goals. These changes have significant implications for how these systems should be designed and engineered. Indeed, as the goals and tasks these systems are to achieve become more abstract, and the environments they operate in become more complex, are current approaches to verification and validation sufficient? Domain-Specific Modelling is a key technology for the verification of autonomous systems. Verifying these systems will ultimately involve understanding a significant number of domains, including goals/tasks, environments, system functions and their associated performance. Relational Model Transformations provide a means to utilise, combine and check models for consistency across these domains. In this thesis an approach that utilises relational model transformation technologies for systems verification, Systems MDD, is presented along with the results of a series of trials conducted with an existing relational model transformation language (QVT-Relations). These trials identified a number of problems with existing model transformation languages, including poorly or loosely defined semantics, differing interpretations of specifications across different tools, and the lack of a guarantee that a model transformation would generate a model that was compliant with its associated meta-model. To address these problems, two related solvers were developed to assist with realising the Systems MDD approach. The first solver, MMCS, is concerned with partial model completion, where a partial model is defined as a model that does not fully conform to its associated meta-model. It identifies appropriate modifications to be made to a partial model in order to bring it into full compliance. The second solver, TMPT, is a relational model transformation engine that prioritises target models. It considers multiple interpretations of a relational transformation specification, chooses an interpretation that results in a compliant target model (if one exists) and, optionally, maximises some other attribute associated with the model. A series of experiments were conducted that applied this approach to common transformation problems in the published literature.
APA, Harvard, Vancouver, ISO, and other styles
34

Doyle, Katherine Mary. "Mathematical modelling through top-level structure." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16391/1/Katherine_Doyle_Thesis.pdf.

Full text
Abstract:
Mathematical modelling problems are embedded in written, representational, and graphic text. For students to actively engage in the mathematical-modelling process, they require literacy. Of critical importance is the comprehension of the problems' text information, data, and goals. This design-research study investigated the application of top-level structuring, a literary organisational structuring strategy, to mathematical-modelling problems. The research documents how students' mathematical modelling was changed when two classes of Year 4 students were shown, through a series of lessons, how to apply top-level structure to two scientifically-based, mathematical-modelling problems. The methodology used a design-based research approach, which included five phases. During Phase One, consultations took place with the principal and participant teachers. As well, information on student numeracy and literacy skills was gathered from the Queensland Year 3 'Aspects of Numeracy' and 'Aspects of Literacy' tests. Phase Two was the initial implementation of top-level structure with one class of students. In Phase Three, the first mathematical-modelling problem was implemented with the two Year 4 classes. Data was collected through video and audio taping, student work samples, teacher and researcher observations, and student presentations. During Phase Four, the top-level structure strategy was implemented with the second Year 4 class. In Phase Five, the second mathematical-modelling problem was investigated by both classes, and data was again collected through video and audio taping, student work samples, teacher and researcher observations, and student presentations. The key finding was that top-level structure had a positive impact on students' mathematical modelling. Students were more focussed on mathematising, acquired key mathematical knowledge, and used high-level, mathematically-based peer questioning and responses after top-level structure instruction. This research is timely and pertinent to the needs of mathematics education today because of its recognition of the need for mathematical literacy. It reflects international concerns on the need for more research in problem solving. It is applicable to real-world problem solving because mathematical-modelling problems are focussed in real-world situations. Finally, it investigates the role literacy plays in the problem-solving process.
APA, Harvard, Vancouver, ISO, and other styles
35

Doyle, Katherine Mary. "Mathematical modelling through top-level structure." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16391/.

Full text
Abstract:
Mathematical modelling problems are embedded in written, representational, and graphic text. For students to actively engage in the mathematical-modelling process, they require literacy. Of critical importance is the comprehension of the problems' text information, data, and goals. This design-research study investigated the application of top-level structuring, a literary organisational structuring strategy, to mathematical-modelling problems. The research documents how students' mathematical modelling was changed when two classes of Year 4 students were shown, through a series of lessons, how to apply top-level structure to two scientifically-based, mathematical-modelling problems. The methodology used a design-based research approach, which included five phases. During Phase One, consultations took place with the principal and participant teachers. As well, information on student numeracy and literacy skills was gathered from the Queensland Year 3 'Aspects of Numeracy' and 'Aspects of Literacy' tests. Phase Two was the initial implementation of top-level structure with one class of students. In Phase Three, the first mathematical-modelling problem was implemented with the two Year 4 classes. Data was collected through video and audio taping, student work samples, teacher and researcher observations, and student presentations. During Phase Four, the top-level structure strategy was implemented with the second Year 4 class. In Phase Five, the second mathematical-modelling problem was investigated by both classes, and data was again collected through video and audio taping, student work samples, teacher and researcher observations, and student presentations. The key finding was that top-level structure had a positive impact on students' mathematical modelling. Students were more focussed on mathematising, acquired key mathematical knowledge, and used high-level, mathematically-based peer questioning and responses after top-level structure instruction. This research is timely and pertinent to the needs of mathematics education today because of its recognition of the need for mathematical literacy. It reflects international concerns on the need for more research in problem solving. It is applicable to real-world problem solving because mathematical-modelling problems are focussed in real-world situations. Finally, it investigates the role literacy plays in the problem-solving process.
APA, Harvard, Vancouver, ISO, and other styles
36

Idiahi, Innocent. "Defining a Formalized Representation for Information Demand." Thesis, Tekniska Högskolan, Högskolan i Jönköping, JTH. Forskningsområde Informationsteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-16018.

Full text
Abstract:
Information demand is a part of comprehensive business logistics that encompasses the logistics of information. The demand for information has provided a unifying framework for different needs in enterprise modeling. The problems organizations face relating to information flow and distribution have led to the development of various frameworks for analyzing information demand, guided by a set of rules, methods and even a unified representation. This thesis work defines a specification for an enterprise Information Demand Context model using XPDL as the language of construct. The paper gives reasons why XPDL was preferred for such a representation and shows how mapping is carried out from the constructs of notations to their associated XPDL specifications, so that in defining a representation we are also defining its meta-model. The resulting specification is presented in such a way that it should be able to give a flexible, logical and more defined structure.
APA, Harvard, Vancouver, ISO, and other styles
37

Ndiaye, Ndèye Fatma. "Algorithmes d'optimisation pour la résolution du problème de stockage de conteneurs dans un terminal portuaire." Thesis, Le Havre, 2015. http://www.theses.fr/2015LEHA0002/document.

Full text
Abstract:
In this thesis, we address the container storage problem at a port terminal. First, we present a literature review analysing the work previously done in this field. We then present an analytical study, followed by a mathematical model and numerical resolution methods comprising efficient algorithms. We give a proof of the complexity of the container storage problem by considering different cases of storage. Since the problem is NP-hard, it can hardly be solved with the optimization software ILOG CPLEX, which is why we propose a branch-and-cut algorithm, an exact resolution method that allowed us to push back the limits of ILOG CPLEX. We also propose meta-heuristic algorithms and hybridizations that provide very satisfactory results with a clear advantage in computation time.
APA, Harvard, Vancouver, ISO, and other styles
38

Thivolle, Damien. "Langages modernes pour la modélisation et la vérification des systèmes asynchrones." Phd thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00685209.

Full text
Abstract:
This thesis lies at the intersection of two key domains: model-driven engineering (MDE) and formal methods, with various fields of application. It deals with the formal verification of parallel applications modelled according to the MDE approach. In this approach, models play a central role and allow an application to be developed through successive transformations (automated or not) between intermediate models at different levels of abstraction, down to the production of executable code. When the models have a formal semantics, automated or semi-automated verification of the application becomes possible. These principles are implemented in TOPCASED, an ECLIPSE-based development environment for critical embedded applications, which enables formal verification by connecting to existing toolboxes. This thesis implements the TOPCASED approach, relying on the CADP toolbox for verification and on its most recent input formalism, LOTOS NT. It addresses the formal verification of MDE applications through two concrete problems: 1) For GALS (Globally Asynchronous Locally Synchronous) systems, a generic verification method based on a transformation into LOTOS NT is proposed, then illustrated on an industrial case study provided by AIRBUS: a protocol for communications between an aircraft and the ground, described in the synchronous language SAM designed by AIRBUS. 2) For Web services described with the BPEL (Business Process Execution Language) standard, a verification method based on a transformation of BPEL models into LOTOS NT is proposed, taking into account the XML Schema, XPath and WSDL sub-languages on which the BPEL standard relies.
APA, Harvard, Vancouver, ISO, and other styles
39

Čepenko, Dmitrijus. "Įmonės metaduomenų saugykla." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2011. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2011~D_20110831_144356-22597.

Full text
Abstract:
This paper analyses the creation of a metadata repository by applying the meta-model of an enterprise modelling method. It includes an overview of metadata repositories and of the UEML, ARIS, GRAI Grid and BPMN enterprise modelling methods and notations, and of their applicability to designing metadata repositories. The ARIS method was selected as the most suitable enterprise modelling method because it covers the largest number of aspects of enterprise activity. Using the ARIS meta-model, a metadata repository was created, adapted to manage the data flows of the company UAB „Klaipėdos Konteinerių Terminalas“, and integrated with the company's information system. With the help of this repository, metadata extracted from various company documents is grouped according to different aspects of the company's activity: workplaces, departments, job functions, etc. The paper also investigates the creation of metadata repositories on the basis of the UEML, GRAI Grid and BPMN meta-models and compares their qualities with those of the repository created on the basis of the ARIS meta-model. It was established that applying the ARIS enterprise modelling method makes it possible to create a metadata repository encompassing more aspects of the enterprise's activity, which allows better identification of changes in the enterprise's data and their place in the activity management process.
APA, Harvard, Vancouver, ISO, and other styles
40

Diaz, Leiva Juan Esteban. "Simulation-based optimization for production planning : integrating meta-heuristics, simulation and exact techniques to address the uncertainty and complexity of manufacturing systems." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/simulationbased-optimization-for-production-planning-integrating-metaheuristics-simulation-and-exact-techniques-to-address-the-uncertainty-and-complexity-of-manufacturing-systems(9ef8cb33-99ba-4eb7-aa06-67c9271a50d0).html.

Full text
Abstract:
This doctoral thesis investigates the application of simulation-based optimization (SBO) as an alternative to conventional optimization techniques when the inherent uncertainty and complex features of real manufacturing systems need to be considered. Inspired by a real-world production planning setting, we provide a general formulation of the situation as an extended knapsack problem. We proceed by proposing a solution approach based on single and multi-objective SBO models, which use simulation to capture the uncertainty and complexity of the manufacturing system and employ meta-heuristic optimizers to search for near-optimal solutions. Moreover, we consider the design of matheuristic approaches that combine the advantages of population-based meta-heuristics with mathematical programming techniques. More specifically, we consider the integration of mathematical programming techniques during the initialization stage of the single and multi-objective approaches as well as during the actual search process. Using data collected from a manufacturing company, we provide evidence for the advantages of our approaches over conventional methods (integer linear programming and chance-constrained programming) and highlight the synergies resulting from the combination of simulation, meta-heuristics and mathematical programming methods. In the context of the same real-world problem, we also analyse different single and multi-objective SBO models for robust optimization. We demonstrate that the choice of robustness measure and the sample size used during fitness evaluation are crucial considerations in designing an effective multi-objective model.
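A minimal sketch of the single-objective SBO loop described above (the simulator, cost figures and the toy random-search optimizer are invented stand-ins, not the thesis's models):

    import random

    def simulate_profit(plan, rng):
        """Hypothetical stochastic simulator: profit of a production plan under
        random demand. Stands in for a detailed manufacturing simulation."""
        demand = [max(0, int(rng.gauss(100, 25))) for _ in plan]
        return sum(min(q, d) * 10 - q * 4 for q, d in zip(plan, demand))

    def fitness(plan, replications=20, seed=0):
        # Fixed seed gives common random numbers across candidates, which
        # reduces noise when comparing plans.
        rng = random.Random(seed)
        return sum(simulate_profit(plan, rng) for _ in range(replications)) / replications

    def random_search(n_products=5, capacity=400, iterations=500):
        """Toy stand-in for the meta-heuristic optimizer: keep the best plan
        found subject to a knapsack-style capacity constraint."""
        rng, best, best_fit = random.Random(42), None, float("-inf")
        for _ in range(iterations):
            plan = [rng.randint(0, 150) for _ in range(n_products)]
            if sum(plan) > capacity:       # respect shared capacity
                continue
            f = fitness(plan)
            if f > best_fit:
                best, best_fit = plan, f
        return best, best_fit

    plan, value = random_search()
    print(plan, round(value, 1))

In the thesis the optimizer is a genuine meta-heuristic (and matheuristic variants seed or steer it with mathematical programming), but the structure of the loop, simulation-based fitness inside a search heuristic, is as above.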
APA, Harvard, Vancouver, ISO, and other styles
41

Nourmohammadzadeh, Abtin [Verfasser], Sven [Akademischer Betreuer] Hartmann, and Stefan [Akademischer Betreuer] Westphal. "Mathematical modelling with exact, heuristic and meta-heuristic solution methodologies for the fuel-efficient platooning of heavy duty vehicles on road networks / Abtin Nourmohammadzadeh ; Sven Hartmann, Stefan Westphal." Clausthal-Zellerfeld : Technische Universität Clausthal, 2019. http://d-nb.info/1231362839/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Chakroun, Chedlia. "Contribution à la définition d'une méthode de conception de bases de données à base ontologique." Phd thesis, ISAE-ENSMA Ecole Nationale Supérieure de Mécanique et d'Aérotechique - Poitiers, 2013. http://tel.archives-ouvertes.fr/tel-00904117.

Full text
Abstract:
Recently, ontologies have been widely adopted by companies in various domains and have become central components of many applications. These models conceptualize a universe of discourse by means of primitive concepts and sometimes redundant concepts (computed from the primitive ones). Initially, the relationship between ontologies and databases was loosely coupled. With the explosion of semantic data, persistence solutions ensuring high application performance were proposed, giving rise to a new type of database called the ontology-based database (OBDB). Several types of OBDB have been proposed, using different DBMSs; each OBDB has its own architecture and storage models dedicated to the persistence of ontologies and their instances. At this stage, the relationship between databases and ontologies became tightly coupled, and several research studies have addressed the physical design phase of OBDBs, while the conceptual and logical phases have only been partially treated. To guarantee a success similar to that of relational databases, OBDBs must be accompanied by design methodologies and tools covering the different stages of the database life cycle, and such a methodology should identify the redundancy embedded in the ontology. Our work proposes a design methodology dedicated to ontology-based databases that includes the main phases of the database development life cycle: conceptual, logical and physical design, as well as deployment. The logical design phase exploits dependencies between ontological concepts, similar in principle to the functional dependencies defined for relational databases. Because of the diversity of OBDB architectures and the variety of storage models used to store and manage ontological data, we propose an "à la carte" deployment approach. To validate our proposal, an implementation of our approach in an OBDB environment based on OntoDB is provided. Finally, to support the user during the design process, a tool for designing databases from a conceptual ontology is presented.
APA, Harvard, Vancouver, ISO, and other styles
43

Rodat, Damien. "Simulation opérationnelle en contrôle non destructif." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS535.

Full text
Abstract:
Several fields have already adopted the concept of operational simulation to limit risks and costs. For instance, part of the training of airline transport pilots or surgeons can now rely on simulations instead of real-life situations. Non-Destructive Testing (NDT) assesses the integrity of structural and mechanical components without damaging them. Operational simulation has drawn the attention of the NDT community only recently, through an Airbus patent. In this field, operational simulation can be used to simulate the presence of a defect in a component without actually inserting the defect. For expensive parts such as aeronautical structures, this approach can reduce the costs of training operators, evaluating NDT method performances, or testing new procedures in real conditions. This thesis work aims to apply the concept of operational simulation to NDT. Three main scientific and technological challenges are to be tackled: the realism of the simulation, the computation speed, and the instrumentation. We chose to focus this study on the ultrasound NDT technique applied to composite materials. Classical simulation approaches based on physical equations are not fast enough for real-time synthesis of ultrasound signals. Moreover, realism is often limited by the fidelity of the description of the inspection set-up; for instance, the material properties are not always well known, which reduces realism. We therefore investigate an alternative way: models are built directly from experimental data. This strategy is applied to model the effect of several phenomena such as impact damage, flat-bottom holes, and perturbations of the material micro-structure. Hardware and software solutions are also studied, and a first prototype of an operational simulator is developed. This system exploits the developed models and shows that synthetic signals can seem as realistic as reality. This thesis work thus runs from concept to prototype.
APA, Harvard, Vancouver, ISO, and other styles
44

Somoray, Klaire. "Beyond compliance: An exploratory investigation of proactive safety behaviours within the context of work driving." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/131136/2/Klaire_Somoray_Thesis.pdf.

Full text
Abstract:
This research program is an exploratory investigation on the concept of proactive safety behaviours within the context of work driving. A measure for proactive safety was developed and could be used as a complementary behaviour-based safety performance measure within the work driving context. The research program also provided a model on how organisations can engage their work drivers and management to be more proactive in managing risks while driving for work.
APA, Harvard, Vancouver, ISO, and other styles
45

Cattinelli, I. "INVESTIGATIONS ON COGNITIVE COMPUTATION AND COMPUTATIONAL COGNITION." Doctoral thesis, Università degli Studi di Milano, 2011. http://hdl.handle.net/2434/155482.

Full text
Abstract:
This Thesis describes our work at the boundary between Computer Science and Cognitive (Neuro)Science. In particular, (1) we have worked on methodological improvements to clustering-based meta-analysis of neuroimaging data, a technique that makes it possible to assess collectively, in a quantitative way, activation peaks from several functional imaging studies, in order to extract the most robust results in the cognitive domain of interest. Hierarchical clustering is often used in this context, yet it is prone to the problem of non-uniqueness of the solution: a different permutation of the same input data might result in a different clustering result. In this Thesis, we propose a new version of hierarchical clustering that solves this problem. We also show the results of a meta-analysis, carried out using this algorithm, aimed at identifying specific cerebral circuits involved in single word reading. Moreover, (2) we describe preliminary work on a new connectionist model of single word reading, named the two-component model because it postulates a cascaded information flow from a more cognitive component, which computes a distributed internal representation for the input word, to an articulatory component, which translates this code into the corresponding sequence of phonemes. Output production is started when the internal code, which evolves in time, reaches a sufficient degree of clarity; this mechanism has been advanced as a possible explanation for behavioral effects consistently reported in the literature on reading, with a specific focus on the so-called serial effects. This model is here discussed in its strengths and weaknesses. Finally, (3) we have turned to consider how features that are typical of human cognition can inform the design of improved artificial agents; here, we have focused on modelling concepts inspired by emotion theory. A model of emotional interaction between artificial agents, based on probabilistic finite state automata, is presented: in this model, agents have personalities and attitudes that can change through the course of interaction (e.g. by reinforcement learning) to achieve autonomous adaptation to the interaction partner. Markov chain properties are then applied to derive reliable predictions of the outcome of an interaction. Taken together, these works show how the interplay between Cognitive Science and Computer Science can be fruitful, both for advancing our knowledge of the human brain and for designing more and more intelligent artificial systems.
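The non-uniqueness problem mentioned above stems from ties: when two pairs of clusters sit at exactly the same distance, a greedy agglomerative algorithm must pick one, and implementations typically break the tie by input order, so permuting the data can change the result. The toy single-linkage step below illustrates this; it is a generic demonstration, not the thesis's algorithm:

    from itertools import combinations

    def single_linkage_step(points):
        """One greedy agglomeration step: merge the first pair of singleton
        clusters at minimal distance, scanning pairs in input order (the
        usual implicit tie-breaking rule)."""
        clusters = [[p] for p in points]
        dist = lambda c1, c2: min(abs(x - y) for x in c1 for y in c2)
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
        return clusters

    # Three collinear points with a tie: d(0,1) == d(1,2) == 1.
    print(single_linkage_step([0, 1, 2]))  # merges {0,1} first -> [[0, 1], [2]]
    print(single_linkage_step([2, 1, 0]))  # merges {2,1} first -> [[2, 1], [0]]

The same points, presented in a different order, yield different partitions; removing this order dependence is what the proposed variant of hierarchical clustering addresses.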
APA, Harvard, Vancouver, ISO, and other styles
46

Sienou, Amadou. "Proposition d'un cadre méthodologique pour le management intégré des risques et des processus d'entreprise." Thesis, Toulouse, INPT, 2009. http://www.theses.fr/2009INPT018G/document.

Full text
Abstract:
Enterprise engineering designs and implements projects for improving the structure and operation of organisations producing goods or services. It develops approaches based on modelling, in particular business process modelling, to ensure the quality and overall consistency of projects. Today, taking risk into account in enterprise engineering is the subject of numerous developments, linked to a business environment perceived as increasingly aggressive and unpredictable, and reference frameworks have been published to guide enterprises in managing their organisations around risk. Our study is devoted to risk-driven business process design as a full component of enterprise engineering. After a synthesis of knowledge about the worlds of risk and of processes, a problem of integrating this knowledge is formulated. A methodological framework for the integrated management of risks and business processes is then designed and described. It rests on the coordination of the life cycles of risk management and business process management, on the definition of a unified conceptual framework that identifies and controls the information exchanged between them, and finally on a modelling language suited to describing risky situations, which extends the capabilities of a commercial modelling tool (ARIS Business Architect). A case study from the health sector illustrates the application of this methodological framework to a concrete case.
APA, Harvard, Vancouver, ISO, and other styles
47

Lin, Zin-Rong. "Developing a model of quality of life for people with coronary heart disease." Thesis, Loughborough University, 2001. https://dspace.lboro.ac.uk/2134/12309.

Full text
Abstract:
Quality of life (QOL) is an extremely important concept in the promotion of appropriate and successful health care programmes. However, there is a need for conceptual clarity to unravel the complexities of terminology in different medical conditions and the underlying factors that have a direct influence on the quality of life for people with coronary heart disease. The primary objective of this thesis is to propose a theoretical model which specifies the domains of QOL and the interrelationships among these domains. The objectives of the study are four-fold: (1) to examine whether a cardiac rehabilitation programme has a beneficial effect on cardiac heart disease patients; (2) to evaluate the primary components of generic health-related quality of life assessment tools for people with coronary heart disease; (3) to identify the main factors governing disease-specific health-related quality of life assessment tools amongst patients with coronary heart disease; (4) to examine a variety of conceptual models of QOL and to determine their relevance to cardiac patients. First, in order to provide conceptual clarity, a comprehensive review of QOL measures was undertaken. Second, data was collected on a cardiac rehabilitation programme in a county hospital using the Short Form-36 (SF-36) and Quality of Life for Myocardial Infarction (QLMI) instruments. This data was analysed using a number of techniques, including (1) meta-analysis, (2) discriminant analysis, (3) factor analysis and (4) structural equation modelling. Analysing the data in this way enabled the development and clarification of the specific domains of the quality of life model. Meta-analysis involved pooling the results of several studies, which were then analysed to provide a systematic, quantitative review of the data. The results found that the related studies did not have consistent outcomes to support the positive effects of a cardiac exercise rehabilitation programme on quality of life in coronary patients. Findings from the SF-36 indicate that older people with coronary heart disease gain more pain relief than their younger counterparts. After a cardiac exercise rehabilitation programme, statistically significant improvements occurred in physical function, social function, role limitation/physical, energy/vitality, body pain, and change in health-related dimensions of quality of life. The first-order five-domain model includes the symptom domain, the restriction domain, the confidence domain, the self-esteem domain and the emotion domain. This model represents an appropriate model of quality of life for people with coronary heart disease compared to the three-domain model and the four-domain model. In terms of the second-order QOL model, the five-domain model also has an adequate fit to the data. According to the results of the structural equation modelling, three models, including the null model, alternative model I and alternative model II, did not fit the data perfectly. However, the constructed full latent variable models gradually improved the fit statistics from the null model to alternative model I and from the null model to alternative model II. Therefore, it can be concluded that the paths and indicators of the three models need to be further adjusted in order to provide a more appropriate model. Nevertheless, this is a first trial to examine a full model of quality of life for people with coronary heart disease using structural equation analyses.
As such, this study provides a new approach to examining the difference between empirical studies and theoretical approaches.
APA, Harvard, Vancouver, ISO, and other styles
48

Masinde, Brian. "Birds' Flight Range. : Sensitivity Analysis." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-166248.

Full text
Abstract:
'Flight' is a program that uses flight mechanics to estimate the flight range of birds. This program, used by ornithologists, is only available for Windows OS. It requires manual input of body measurements and constants (one observation at a time), which is time-consuming. Therefore, the first task is to implement the methods in R, a programming language that runs on various platforms. The resulting package, named flying, has three advantages: first, it can estimate the flight range of multiple bird observations; second, it makes it easier to experiment with different settings (e.g. constants) in comparison to Flight; and third, it is open-source, making contribution relatively easy. Uncertainty and global sensitivity analyses are carried out on body measurements separately and with various constants. In doing so, the most influential body variables and constants are discovered. This task would have been near impossible to undertake using 'Flight'. A comparison is made amongst the results from a crude partitioning method, a generalized additive model, gradient boosting machines and a quasi-Monte Carlo method. All of these are based on Sobol's method for variance decomposition. The results show that fat mass drives the simulations, with other inputs playing a secondary role (for example, mechanical conversion efficiency and body drag coefficient).
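For concreteness, the sketch below implements the standard Monte Carlo estimator of Sobol' first-order indices, the quantity that the methods compared above ultimately target, on a stand-in flight-range function; the toy model, bounds and variable names are invented, not those of the flying package:

    import numpy as np

    def toy_range(x):
        """Hypothetical stand-in for a flight-range model: columns are
        fat mass, conversion efficiency and drag (already in physical units)."""
        fat, eff, drag = x[:, 0], x[:, 1], x[:, 2]
        return 2000.0 * fat * eff / drag

    def sobol_first_order(model, bounds, n=100_000, seed=1):
        """Saltelli-style estimator: S_i = E[f(B) * (f(A_B^i) - f(A))] / Var(f)."""
        rng = np.random.default_rng(seed)
        d = len(bounds)
        lo, hi = np.array(bounds).T
        A = lo + (hi - lo) * rng.random((n, d))
        B = lo + (hi - lo) * rng.random((n, d))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))
        S = []
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]            # A with column i taken from B
            S.append(np.mean(fB * (model(ABi) - fA)) / var)
        return S

    bounds = [(0.05, 0.5), (0.18, 0.28), (0.05, 0.15)]  # fat, efficiency, drag
    print([round(s, 2) for s in sobol_first_order(toy_range, bounds)])

With the wide fat-mass range above, the first index dominates, mirroring the thesis's finding that fat mass drives the simulations.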
APA, Harvard, Vancouver, ISO, and other styles
49

Bhandaram, Abhinav. "Detecting Component Failures and Critical Components in Safety Critical Embedded Systems using Fault Tree Analysis." Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1157555/.

Full text
Abstract:
Component failures can result in catastrophic behaviors in safety-critical embedded systems, sometimes resulting in loss of life. Component failures can be treated as off-nominal behaviors (ONBs) with respect to the components and subsystems involved in an embedded system. A lot of research is being carried out to tackle the problem of ONBs; these approaches mainly focus on system states (i.e., the desired and undesired states of a system at a given point in time) to detect ONBs. In this paper, an approach is discussed to detect component failures and critical components of an embedded system. The approach is based on fault tree analysis (FTA), applied to the requirements specification of embedded systems at design time to find out the relationship between individual component failures and overall system failure. FTA helps in determining both the qualitative and the quantitative relationship between component failures and system failure. Analyzing the system at design time helps in detecting component failures and critical components, and in devising strategies to mitigate component failures at design time and improve the overall safety and reliability of a system.
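As a concrete illustration of the quantitative side of FTA: assuming independent basic events, an AND gate's output probability is the product of its inputs and an OR gate's is the complement of the product of complements. The small evaluator below computes a top-event probability for a hypothetical tree; the gate structure and probabilities are invented for illustration:

    from functools import reduce

    def and_gate(probs):
        """All inputs must fail: P = p1 * p2 * ... (independence assumed)."""
        return reduce(lambda a, b: a * b, probs)

    def or_gate(probs):
        """Any input failing suffices: P = 1 - (1-p1)(1-p2)... (independence)."""
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # Hypothetical fault tree: the system fails if the sensor fails OR both
    # redundant controllers fail.
    p_sensor = 1e-3
    p_controller = 1e-2
    p_top = or_gate([p_sensor, and_gate([p_controller, p_controller])])
    print(f"top event probability: {p_top:.6f}")   # ~0.0011

Ranking basic events by how much the top-event probability changes when each is set to 1 or 0 is one common way to identify the critical components the abstract refers to.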
APA, Harvard, Vancouver, ISO, and other styles
50

Ferreira, Denzil Sócrates Teixeira. "Meta, tracer - MOF with traceability." Master's thesis, Universidade da Madeira, 2009. http://hdl.handle.net/10400.13/80.

Full text
Abstract:
The following document proposes a traceability solution for model-driven development. There has already been previous work done in this area, but so far there is not yet any standardized way of exchanging traceability information. Thus the goal of the project developed and documented here is not to automate the traceability process but to provide an approach to achieving traceability that follows OMG standards, making traceability information exchangeable between tools that follow the same standards. As such, we propose a traceability meta-model as an extension of the MetaObject Facility (MOF). Using the MetaSketch modeling language workbench, we present a modeling language for traceability information. This traceability information can then be used for tool cooperation. Using Meta.Tracer (our tool developed for this thesis), we enable the users to establish traceability relationships between different traceability elements and offer a visualization for the traceability information. We then demonstrate the benefits of using a traceability tool on a software development life cycle using a case study. We finalize by commenting on the work developed.
Supervisor: Leonel Nóbrega
APA, Harvard, Vancouver, ISO, and other styles