Dissertations / Theses on the topic 'Conceptual models'

Consult the top 50 dissertations / theses for your research on the topic 'Conceptual models.'


1

Marins, Andre Luiz Almeida. "Provenance Conceptual Models." Pontifícia Universidade Católica do Rio de Janeiro, 2008. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=11880@1.

Full text
Abstract:
Information systems developed for diverse economic sectors increasingly require data traceability. To enable this capability, data provenance must be modelled. Provenance makes it possible to test compliance with legislation, repeat experiments, and perform quality control, among other uses. It also enables the identification of agents (people, organisations, or software agents) and can be used to establish levels of trust for data transformations. This dissertation proposes a generic provenance model built by aligning fragments of upper-level ontologies, international standards, and proposed standards that deal directly or indirectly with provenance-related concepts. The dissertation therefore contributes in two directions: a well-founded conceptual model for provenance, and the application of a conceptual design strategy based on ontology alignment.
Information systems developed for several economic segments increasingly demand data traceability functionality. To endow information systems with such a capacity, we depend on data provenance modeling. Provenance enables legal compliance checks, experiment validation, and quality control, among other uses. Provenance also helps identify participants (determinants or immanents) such as people, organizations, and software agents, as well as their association with activities, events or processes. It can also be used to establish levels of trust for data transformations. This dissertation proposes a generic conceptual model for provenance, designed by aligning fragments of upper ontologies, international standards and broadly recognized projects. The contributions are in two directions: a provenance conceptual model - extensively documented - that facilitates interoperability, and the application of a design methodology based on ontology alignment.
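For readers unfamiliar with the idea, the sketch below illustrates what a provenance record of the kind discussed in this abstract might capture: agents (people, organisations or software), the transformations they perform on data, and a trust level per step. It is a generic, minimal illustration, not the dissertation's model; all class and field names are invented for this example.

```python
# Minimal sketch (not the dissertation's model): record which agent performed
# which transformation on which data item, so that a derivation chain can be traced.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Agent:               # person, organization, or software agent
    name: str
    kind: str              # e.g. "person", "organization", "software"

@dataclass
class DataItem:
    identifier: str
    derived_from: Optional["DataItem"] = None     # link to the source item, if any

@dataclass
class Transformation:      # an activity/event that produced a data item
    description: str
    performed_by: Agent
    inputs: List[DataItem] = field(default_factory=list)
    output: Optional[DataItem] = None
    trust_level: float = 1.0                      # confidence assigned to this step

def lineage(item: DataItem) -> List[str]:
    """Walk back through derived_from links to reconstruct an item's provenance."""
    chain = []
    while item is not None:
        chain.append(item.identifier)
        item = item.derived_from
    return chain

# Example: a cleaned dataset derived from a raw survey file.
raw = DataItem("survey_raw.csv")
clean = DataItem("survey_clean.csv", derived_from=raw)
step = Transformation("remove duplicate rows", Agent("etl-script", "software"),
                      inputs=[raw], output=clean, trust_level=0.9)
print(lineage(clean))   # ['survey_clean.csv', 'survey_raw.csv']
```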
APA, Harvard, Vancouver, ISO, and other styles
2

Solomons, Stanley Nicholls. "Conceptual models in industrial design." Thesis, De Montfort University, 1988. http://hdl.handle.net/2086/4144.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wilkie, Ormond L. "Modification models of conceptual combination." Thesis, Massachusetts Institute of Technology, 1992. http://hdl.handle.net/1721.1/13100.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Delamore, David George. "Conceptual models for dynamic systems." Thesis, University of Cambridge, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608827.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Oliver, Ian. "Animating object oriented conceptual models." Thesis, University of Kent, 2001. https://kar.kent.ac.uk/13637/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Eeles, Charles William Owen. "Parameter optimization of conceptual hydrological models." Thesis, Open University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261674.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Purdy, Luke D. "Conceptual Models for Virtual High Schools." Thesis, University of Louisiana at Lafayette, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10163355.

Full text
Abstract:

This study is a hermeneutic phenomenology focused on a virtual high school in Central Louisiana. Because virtual education is growing quickly and student performance in virtual high schools is inconsistent, the study seeks to provide a conceptual model from which to design virtual school curricula and develop virtual school teachers. The proposed model is grounded in three theoretical frameworks and validated through the experience of virtual school teachers. The theoretical frameworks informing the study are the Community of Inquiry model, Sense of Community theory, and the Theory of Transactional Distance. The research participants' experiences are used to validate the proposed conceptual model for virtual high school course development. The result is a conceptual model that can be used by virtual high school course designers to guide the development of virtual school curricula.

The study can also be used to guide the development of strategies for delivering online courses and conducting professional development in a virtual learning environment. The study makes four major findings. The study finds that virtual high school students vary in their motivation and autonomy. The study finds the teacher-student relationship to be more individualized in the virtual school than in the traditional school. The study finds that virtual high school students do not perceive value in virtual learning communities. Finally, the study finds that virtual school teachers' experience with technology is positive, but their students often experience trouble with technology. The study uses these findings to suggest a conceptual model from which to develop virtual high school curricula and teach virtual high school classes.

APA, Harvard, Vancouver, ISO, and other styles
8

Dixon, Diane. "Conceptual and measurement models of disability." Thesis, University of Aberdeen, 2006. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU206100.

Full text
Abstract:
Numerous theoretical frameworks have been applied to further our understanding of the correlates, causes and consequences of disability, and each is accompanied by a particular conceptualisation of disability. In this thesis, disability was conceptualised as behaviour. The concept of disability as behaviour is accompanied by psychological theories of behaviour and behaviour change within which behaviour is viewed as a product of motivational factors. A repertory grid study indicated that people with mobility disability use motivational concepts to distinguish between physical activities typically used to index their disability. These data lent support to the suitability of the disability as behaviour concept in this population. Consequently, psychological theory, in the form of the theory of planned behaviour (TPB) and social cognitive theory (SCT), was used to account for walking disability in an orthopaedic sample. The empirical application of psychological theories requires the operationalisation of the constructs within those theories. An investigation of the content validity of existing measures of the perceived control constructs from the TPB and SCT indicated current measurement items do not show discriminant content validity. These data were used to select perceived control items into the study of disability in an orthopaedic sample. The content validity of the perceived control items was investigated using confirmatory factor analyses applied to the responses of the orthopaedic sample. Structural equation modelling indicated that both the TPB and SCT could account for walking disability in the orthopaedic sample. Finally, the ability of the TPB and SCT to mediate between the central constructs of the main medical model of disability, namely the International Classification of Functioning Disability and Health (ICF), was assessed using structural modelling. This integrated model accounted for a greater proportion of the variance in walking disability than did the ICF alone, suggesting psychological theory can be used to improve the ICF model.
APA, Harvard, Vancouver, ISO, and other styles
9

Damljanovic, Danica. "Natural language interfaces to conceptual models." Thesis, University of Sheffield, 2011. http://etheses.whiterose.ac.uk/1630/.

Full text
Abstract:
Accessing structured data in the form of ontologies currently requires the use of formal query languages (e.g., SeRQL or SPARQL) which pose significant difficulties for non-expert users. One way to lower the learning overhead and make ontology queries more straightforward is through a Natural Language Interface (NLI). While there are existing NLIs to structured data with reasonable performance, they tend to require expensive customisation to each new domain. Additionally, they often require specific adherence to a pre-defined syntax which, in turn, means that users still have to undergo training. In this thesis, we study the usability of NLIs from two perspectives: that of the developer who is customising the NLI system, and that of the end-user who uses it for querying. We investigate whether usability methods such as feedback and clarification dialogs can increase the usability for end users and reduce the customisation effort for the developers. To that end, we have developed two systems, QuestIO and FREyA, whose design, evaluation and comparison with similar systems form the core of the contribution of this thesis.
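To make the barrier concrete, the snippet below shows the kind of formal query (here SPARQL, one of the languages named above) that a non-expert would otherwise have to write to ask a simple question such as "which cities are located in France?". It is only an illustration: the ontology file, namespace and property names are hypothetical, and it uses the rdflib library rather than any tool from the thesis.

```python
# The plain-English question "Which cities are in France?" must be phrased as a
# formal SPARQL query before an ontology can answer it.
# The ontology file and the vocabulary URIs below are hypothetical placeholders.
from rdflib import Graph

g = Graph()
g.parse("geography.ttl", format="turtle")   # hypothetical ontology file

query = """
PREFIX ex: <http://example.org/geo#>
SELECT ?city
WHERE {
    ?city a ex:City ;
          ex:locatedIn ex:France .
}
"""

for row in g.query(query):
    print(row.city)
```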
APA, Harvard, Vancouver, ISO, and other styles
10

Grau Vázquez, Antonio José. "Computer-aided validation of formal conceptual models." [S.l. : s.n.], 2001. http://deposit.ddb.de/cgi-bin/dokserv?idn=961616598.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Hughes, Richard Sylvester. "The conceptual structure of product semantic models." Thesis, Brunel University, 1999. http://bura.brunel.ac.uk/handle/2438/4969.

Full text
Abstract:
The study is concerned with the conceptual structure and content of the framework for characterising user-product interaction, proposed under the title – ‘Product Semantics’. The sources for the critique of design, from which the framework is derived, are identified and analysed, and the substantive theoretical and methodological content given initial consideration in terms of the deployment of the central concept of ‘meaning’, and the principal theoretical approaches adopted in the analysis of meaning and semantic concepts generally. The commitment to a cognitive and experiential approach to user-interaction is established and the concepts central to the framework, and requiring more detailed analysis, are identified. The core of the study consists in an analysis of the sequence of concepts and contexts that are chiefly used in the theoretical articulation of the framework, including - function, affordance, categorisation, artefacts, meaning and expression - of which the concept of affordance is central to the structure. On the basis of the initial consideration of the structure and content of the scheme, and in the light of the analysis of concepts, the explanatory structure of the framework is established. It is argued that the core commitment to an experiential and cognitive account, and the form of the explanatory structure, are jointly incompatible with the conceptual content of the framework, particularly in respect of the pivotal role of the concept of affordance. Proposals are advanced for an alternative interpretation which addresses the central issues of consistency and coherence, and which suggests an alternative approach to the conceptual characterisation of the framework and the form of the explanatory hierarchy. The implications of the framework, and the proposed alternative interpretation, are considered in respect of their application in shaping approaches to the development of design theory and methodology, and the experiential aspect of semantics and cognition.
APA, Harvard, Vancouver, ISO, and other styles
12

Maehle, Valerie A. "Conceptual models in the transfer of learning." Thesis, University of Aberdeen, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261454.

Full text
Abstract:
In order to attain clinical competence, student physiotherapists apply knowledge from a range of cognitive domains in the assessment and treatment of patients with a variety of conditions. Current research indicates that the ability to transfer knowledge to a wide variety of conditions requires a cognitive structure in which concepts are embedded in a rich network of interconnections (Faletti, 1990; Spiro, 1987). A concept mapping technique was selected as a means of eliciting a representation of the knowledge the student possessed and would access in order to underpin the assessment and treatment of a specific peripheral joint condition. Twenty second- and third-year physiotherapy students currently on clinical placement in an Out-Patient Department each produced a concept map prior to assessing the patient. A modification of the 'Student Teacher Dialogue' (Hammond et al, 1989) was the methodology selected for identification of the transfer of learning. Analysis of the transcription of this interaction provided evidence of the domain-specific and procedural knowledge transferred to the patient assessment. Weak correlations were found to exist between the degree of complexity of the concept map the student produced and the amount and level of transfer achieved in the clinical setting. There was also evidence to suggest that abstract subject areas, or those which involved practical or clinical applications, facilitated the development of more concentrated conceptual networks. However, contrary to expectation, third-year students failed to produce higher quality maps than second-year students, despite having greater academic and clinical experience.
APA, Harvard, Vancouver, ISO, and other styles
13

Neto, Elvidio Gavassoni. "Application of Nonlinear Vibration Modes to Conceptual Models of Offshore Structures." Pontifícia Universidade Católica do Rio de Janeiro, 2012. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=21272@1.

Full text
Abstract:
Owing to increasing water depths and more severe environments, offshore structures demand ever more reliable vibration analyses. In the presence of large-amplitude oscillations, a nonlinear analysis of these structures becomes indispensable. Numerical methods such as finite elements are computationally expensive, since modal coupling requires models with many degrees of freedom. This hampers parametric analyses and lengthens design cycles for offshore structures. An alternative is the use of reduced-order models. Nonlinear normal modes have proved to be an efficient tool for deriving reduced-order models for nonlinear vibration analysis, because fewer nonlinear modes are needed than linear modes to reach the same level of accuracy in a reduced model. This work uses reduced-order models, obtained through nonlinear modal analysis, to study the vibration of simplified models of offshore structures. Three application examples are used: an inverted pendulum, an articulated tower, and a spar platform. In addition to the Galerkin-based and asymptotic methods, an alternative numerical procedure for obtaining the modes is proposed, which can also be used to construct essentially nonlinear modes. Free and forced vibrations are studied. The stability of the solutions is analysed using Floquet theory, bifurcation and Mathieu diagrams, and Poincaré sections. The Poincaré sections are also used to identify the multiplicity of the nonlinear modes and the existence of multimodes. The results are compared with the solution obtained by numerical integration of the original system of equations, showing good accuracy of the reduced models.
The increasing water depth and the ocean adverse environment demand more accurate vibration analysis of offshore structures. Due to large amplitude oscillations, a nonlinear vibration analysis becomes necessary. Numerical methods such as finite element constitute a computationally expensive task when applied to these problems, since the occurrence of modal coupling demands a high number of degrees-of-freedom. A feasible possibility to overcome these difficulties is the use of low order models. The nonlinear normal modes have been shown to be an effective tool in the derivation of reduced order models in nonlinear dynamics. In the use of nonlinear modal analysis fewer modes are required to achieve a given level of accuracy in comparison to the use of linear modes. This work uses the nonlinear normal modes to derive low dimensional models to study the vibration of simplified models of offshore structures. Three examples are considered: an inverted pendulum, an articulated tower and a spar platform. Both free and forced vibrations are studied. The asymptotic and Galerkin-based methods are used to derive the normal modes. In addition, an alternative numerical procedure to construct such modes is proposed, which can be used to derive coupled modes. The solution stability is determined by the use of the Floquet theory, bifurcation and Mathieu diagrams, and Poincaré sections. The Poincaré sections are also used to investigate the multiplicity of modes and multimodes. The results obtained from the numerical integration of the original system are favourably compared with those of the reduced order models, showing the accuracy of the reduced models.
APA, Harvard, Vancouver, ISO, and other styles
14

Kucukyavuz, Fatih. "Transforming Conceptual Models Of The Mission Space Into Simulation Space Models." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613158/index.pdf.

Full text
Abstract:
Conceptual modeling, which helps abstract a valid model from the real system, is an essential phase in the simulation development lifecycle. With the development of the KAMA framework, a new methodology was presented for developing mission space conceptual models for simulation systems. It provides metamodel elements, represented by graphical diagrams, for developing conceptual models of the mission space. BOM (Base Object Model), developed by SISO (Simulation Interoperability Standards Organization), is another conceptual modeling approach, serving the simulation space. KAMA models are very close to the problem domain and are intended to model real-world concepts in the requirement analysis and development phase, whereas BOM models, being vital inputs for the simulation design phase, are closer to the solution domain. Since there is no defined way of using the captured mission space knowledge in the simulation space, a problem arises when moving from requirement analysis to the design phase. In this study, to solve this problem, we propose a method for transforming mission space conceptual models into simulation space models. Our approach maps KAMA mission space models to BOM simulation space models in order to automatically carry real-world analysis results over to simulation designers.
APA, Harvard, Vancouver, ISO, and other styles
15

Jabbari, Sabegh Mohammad Ali. "A study of the combined use of conceptual models." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/115490/1/Mohammad_Ali_Jabbari_Sabegh_Thesis.pdf.

Full text
Abstract:
This thesis investigates the use of representations of the relevant features of a system domain, often called conceptual models, during information systems analysis and design. Conceptual models play a significant role in the early detection and correction of information systems development errors. However, understanding of the use of different types of models in practice remains outstanding. Through multiple studies, this thesis provides both an empirical understanding of the combined use of conceptual models in practice and extensions and contingencies to existing theories that explain how and why practitioners use model combinations.
APA, Harvard, Vancouver, ISO, and other styles
16

Thompson, David Charles. "Feasibility of a skeletal modeler for conceptual mechanical design." Full text (PDF) from UMI/Dissertation Abstracts International, 2000. http://wwwlib.umi.com/cr/utexas/fullcit?p3004386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Seibert, Jan. "Conceptual runoff models - fiction or representation of reality?" Doctoral thesis, Uppsala University, Department of Earth Sciences, 1999. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-290.

Full text
Abstract:

Available observations are often not sufficient as a basis for decision making in water management. Conceptual runoff models are frequently used as tools for a wide range of tasks to compensate for the lack of measurements, e.g., to extend runoff series, compute design floods and predict the leakage of nutrients or the effects of a climatic change. Conceptual runoff models are practical tools, especially if the reliability of their predictions can be assessed. Testing of these models is usually based solely on comparison of simulated and observed runoff, although most models also simulate other fluxes and states. Such tests do not allow thorough assessment of model-prediction reliability. In this thesis, two widespread conceptual models, the HBV model and TOPMODEL, were tested using a catalogue of methods for model validation (defined as estimation of confidence in model simulations). The worth of multi-criteria validation for evaluating model consistency was emphasised. Both models were capable of simulating runoff adequately after calibration, whereas the performance on some of the other validation tests was less satisfactory. The impossibility of identifying unique parameter values caused large uncertainties in model predictions for the HBV model. The parameter uncertainty was reduced when groundwater levels were included in the calibration, whereas groundwater-level simulations were in weak agreement with observations when the model was calibrated against runoff only. The agreement of TOPMODEL simulations with spatially distributed data was weak for both groundwater levels and the distribution of saturated areas. Furthermore, validation against hydrological common sense revealed weaknesses in the TOPMODEL approach. In summary, these results indicated limitations of conceptual runoff models and highlighted the need for powerful validation methods. The use of such methods enables assessment of the reliability of model predictions. It also supports the further development of models by identification of weak parts and evaluation of improvements.
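As a rough illustration of the multi-criteria idea discussed above (not the thesis's actual validation catalogue), the sketch below scores a model both on runoff and on groundwater levels using the Nash-Sutcliffe efficiency, a criterion commonly used for runoff models; the data are synthetic.

```python
# Illustrative multi-criteria check: score a model on runoff *and* on groundwater
# levels, rather than on runoff alone. All series below are synthetic.
import numpy as np

def nash_sutcliffe(simulated: np.ndarray, observed: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the mean."""
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

rng = np.random.default_rng(0)
obs_runoff = rng.gamma(2.0, 1.5, size=365)
sim_runoff = obs_runoff + rng.normal(0, 0.3, size=365)      # good runoff fit
obs_gw     = rng.normal(-2.0, 0.5, size=365)                # groundwater levels (m)
sim_gw     = obs_gw + rng.normal(0.4, 0.6, size=365)        # biased groundwater fit

print("runoff NSE:      ", round(nash_sutcliffe(sim_runoff, obs_runoff), 3))
print("groundwater NSE: ", round(nash_sutcliffe(sim_gw, obs_gw), 3))
# A model may look acceptable on runoff alone while failing the second criterion,
# which is the kind of inconsistency multi-criteria validation is meant to expose.
```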

APA, Harvard, Vancouver, ISO, and other styles
18

Freer, James E. "Uncertainty and calibration of conceptual rainfall runoff models." Thesis, Lancaster University, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266810.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

El-Ghalayini, Haya Ahmed. "Reverse engineering domain ontologies to conceptual data models." Thesis, University of the West of England, Bristol, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.438785.

Full text
Abstract:
The nature of information systems is changing as they become more sophisticated and increasingly address more complex application domains. This in turn complicates the representation of the functional requirements of a particular problem domain in terms of its basic entities and their relationships, as expressed in conceptual data models. Conceptual data models aim at establishing a link between user and domain requirements. The process of developing conceptual data models seems to be quite straightforward; however, it is often a lengthy and iterative process, and the output models can have a significant impact on the quality of the final system. As ontologies can capture consensual commitment about domain knowledge, the goal of this thesis is to study the extent to which consensual knowledge about a certain domain that has been captured in domain ontologies can participate in developing conceptual data models. This thesis therefore introduces a framework in the form of a so-called Transformation-Engine to generate a possible conceptual data model from a given domain ontology. The functionality of the Transformation-Engine consists of two main activities: (1) generating a suggested conceptual data model from a given domain ontology using a novel set of mapping rules between an ontology language and conceptual data modelling constructs; (2) improving the quality of the generated conceptual data model elements using the newly introduced ontological quality graph. The results of this thesis show that conceptual data models that are developed from domain ontologies are comparable to the models that are traditionally developed during the elicitation stage. The approach does not generate a comprehensive conceptual data model automatically, but suggests relevant alternatives to modellers as they capture the basic entities and their relations for the problem domain identified by a domain community. The reverse-engineered conceptual data models aid understanding and communication, and facilitate eventual system integration.
APA, Harvard, Vancouver, ISO, and other styles
20

Wenren, Cheng. "Mixed Model Selection Based on the Conceptual Predictive Statistic." Bowling Green State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1403735738.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Sorooshian, Soroosh, and Vijai Kumar Gupta. "Improving the Reliability of Compartmental Models: Case of Conceptual Hydrologic Rainfall-Runoff Models." Department of Hydrology and Water Resources, University of Arizona (Tucson, AZ), 1986. http://hdl.handle.net/10150/614011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Nicholson, Michael David. "Applications of branching processes to cancer evolution and initiation." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33034.

Full text
Abstract:
There is a growing appreciation for the insight mathematical models can yield on biological systems. In particular, due to the challenges inherent in experimental observation of disease progression, models describing the genesis, growth and evolution of cancer have been developed. Many of these models possess the common feature that one particular type of cellular population initiates a further, distinct population. This thesis explores two models containing this feature, which also employ branching processes to describe population growth. Firstly, we consider a deterministically growing wild type population which seeds stochastically developing mutant clones. This generalises the classic Luria-Delbruck model of bacterial evolution. We focus on how differing wild type growth manifests itself in the distribution of clone sizes. In our main result we prove that for a large class of wild type growth, the long-time limit of the clone size distribution has a general two-parameter form, whose tail decays as a power-law. In the second model, we consider a fully stochastic system of cells in a growing population that can undergo birth, death and transitions. New cellular types appear via transitions, examples of which are genetic mutations or migrations bringing cells into a new environment. We concentrate on the scenario where the original cell type has the largest net growth rate, which is relevant for modelling drug resistance, due to fitness costs of resistance, or cells migrating into contact with a toxin. Two questions are considered in our main results. First, how long do we wait until a cell with a specific target type, an arbitrary number of transitions from the original population, exists. Second, which particular sequence of transitions initiated the target population. In the limit of small final transition rates, simple, explicit formulas are given to answer these questions.
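The first model described above lends itself to a simple Monte Carlo illustration: a deterministically growing wild type seeds mutant clones at random times, and each clone then grows as a stochastic birth-death process. The sketch below is only such an illustration with arbitrary parameter values; the thesis derives the clone-size distribution analytically rather than by simulation.

```python
# Monte Carlo sketch: a deterministically growing wild type seeds mutant clones at
# random times; each clone then grows as a linear birth-death branching process.
import random
import math

def simulate_clone_sizes(t_end=8.0, mutation_rate=0.01, birth=1.0, death=0.1, seed=1):
    rng = random.Random(seed)
    growth = 1.0                          # wild type grows as n(t) = exp(growth * t)
    # Mutations arrive as an inhomogeneous Poisson process with rate mutation_rate*n(t);
    # draw arrival times by thinning against the maximum rate on [0, t_end].
    max_rate = mutation_rate * math.exp(growth * t_end)
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(max_rate)
        if t > t_end:
            break
        if rng.random() < math.exp(growth * t) / math.exp(growth * t_end):
            arrivals.append(t)
    # Each clone evolves as a birth-death process from size 1 until t_end.
    sizes = []
    for t0 in arrivals:
        n, t = 1, t0
        while n > 0:
            t += rng.expovariate((birth + death) * n)
            if t > t_end:
                break
            n += 1 if rng.random() < birth / (birth + death) else -1
        sizes.append(n)
    return sizes

sizes = simulate_clone_sizes()
surviving = [s for s in sizes if s > 0]
print(len(sizes), "clones seeded,", len(surviving), "surviving; largest:",
      max(surviving) if surviving else 0)
```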
APA, Harvard, Vancouver, ISO, and other styles
23

Chioasca, Erol-Valeriu. "Automatic construction of conceptual models to support early stages of software development : a semantic object model approach." Thesis, University of Manchester, 2015. https://www.research.manchester.ac.uk/portal/en/theses/automatic-construction-of-conceptual-models-to-support-early-stages-of-software-development--a-semantic-object-model-approach(ded4f2e1-2614-4a6c-89c3-f259112b30cb).html.

Full text
Abstract:
The earliest stage of software development almost always involves converting requirements descriptions written in natural language (NLRs) into initial conceptual models, represented by some formal notation. This stage is time-consuming and demanding, as initial models are often constructed manually, requiring human modellers to have appropriate modelling knowledge and skills. Furthermore, this stage is critical, as errors made in initial models are costly to correct if left undetected until the later stages. Consequently, the need for automated tool support is desirable at this stage. There are many approaches that support the modelling process in the early stages of software development. The majority of approaches employ linguistic-driven analysis to extract essential information from input NLRs in order to create different types of conceptual models. However, the main difficulty to overcome is the ambiguous and incomplete nature of NLRs. Semantic-driven approaches have the potential to address the difficulties of NLRs, however, the current state of the art methods have not been designed to address the incomplete nature of NLRs. This thesis presents a semantic-driven automatic model construction approach which addresses the limitations of current semantic-driven NLR transformation approaches. Central to this approach is a set of primitive conceptual patterns called Semantic Object Models (SOMs), which superimpose a layer of semantics and structure on top of NLRs. These patterns serve as intermediate models to bridge the gap between NLRs and their initial conceptual models. The proposed approach first translates a given NLR into a set of individual SOM instances (SOMi) and then composes them into a knowledge representation network called Semantic Object Network (SON). The proposed approach is embodied in a software tool called TRAM. The validation results show that the proposed semantic-driven approach aids users in creating improved conceptual models. Moreover, practical evaluation of TRAM indicates that the proposed approach performs better than its peers and has the potential for use in real world software development.
APA, Harvard, Vancouver, ISO, and other styles
24

Malakhoff, Lev A. "Combat aircraft mission tradeoff models for conceptual design evaluation." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53583.

Full text
Abstract:
A methodology is developed to address the analysis of combat aircraft attrition. The operations of an aircraft carrier task force are modeled using the systems dynamics simulation language DYNAMO. The three mission roles include: surface attack, fighter escort, and carrier defense. The analysis is performed over the entire campaign, going beyond the traditional single-sortie analysis level. These analyses are performed by determining several measures of effectiveness (MOEs) for whatever constraints are applied to the model. The derived MOEs include: Campaign Survivability (CS), Fraction of Force Lost (FFL), Exchange Ratio (ER), Relative Exchange Ratio (RER), Possible Crew Loss (PCL), and Replacement Cost (RC). RER is felt to be the most useful MOE since it considers the initial inventory levels of both friendly and enemy forces, and its magnitude is easy for the analyst to relate to (an RER greater than one is a prediction of a friendly force's victory). The simulation model developed in this research is run for several experiments. The effect of force size on the MOEs is studied, as well as a hypothetical multimission aircraft deployed to perform any of the three missions (albeit at lower effectiveness than the specialized aircraft in their given roles, but nonetheless with a higher availability). Evaluation of specific technological improvements, such as smaller radar cross section, higher thrust/weight ratio, and improved weapon ranges, is made using the MOEs. Also, a cost-effectiveness tradeoff methodology is developed by determining the acquisition cost ratio (ACR) for certain modified alternatives to the baseline, that is, the required initial inventory of modified aircraft needed to produce the same total effectiveness as the baseline aircraft.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
25

Cabrera, Antonio Trani. "Combat aircraft scenario tradeoff models for conceptual design evaluation." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53920.

Full text
Abstract:
The purpose of this research is to apply engineering-based knowledge to the field of combat aircraft survivability, and to create scenario-specific models in order to estimate the tradeoff between aircraft survivability and lethality metrics at the encounter and sortie levels. The development of scenario-specific models serves to identify and quantify technological changes that have leverage on the overall performance of the aircraft from a survivability point of view. Also, the models focus on the fighter aircraft susceptibility assessment and are capable of incorporating outputs from offline studies as inputs, such as in the area of vulnerability assessment where extensive databases are available. The mission scenario models are microscopic in nature and relate important conceptual aircraft design parameters such as thrust-to-mass ratio, wing loading, empty mass, maneuverability, etc. and operational parameters (e.g., weapon payload, range, loiter time, flight profiles, etc.) to the aircraft sortie survivability and lethality under various threat scenarios. This research proposes a methodology to estimate survivability and lethality aircraft performance at the sortie level, where aircraft parameters can be implemented into scenario-specific models to assess their impact upon survivability-related metrics. While the project was conceived with naval aircraft in mind, the methodology, to the extent possible, is not aircraft-specific and thus could be applied to any particular design at the conceptual stage.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
26

Guerson, J. O. "Representing Dynamic Invariants in Ontologically Well-Founded Conceptual Models." Universidade Federal do Espírito Santo, 2015. http://repositorio.ufes.br/handle/10/4283.

Full text
Abstract:
Conceptual models often capture the invariant aspects of the phenomena we perceive. These invariants may be considered static when they refer to structures we perceive in phenomena at a particular point in time or dynamic/temporal when they refer to regularities across different points in time. While static invariants have received significant attention, dynamics enjoy marginal support in widely-employed techniques such as UML and OCL. This thesis aims at addressing this gap by proposing a technique for the representation of dynamic invariants of subject domains in UML-based conceptual models. For that purpose, a temporal extension of OCL is proposed. It enriches the ontologically well-founded OntoUML profile and enables the expression of a variety of (arbitrary) temporal constraints. The extension is fully implemented in the tool for specification, verification and simulation of enriched OntoUML models.
APA, Harvard, Vancouver, ISO, and other styles
27

Moolla, Ahmed Ismail. "A conceptual framework to measure brand loyalty." Thesis, North-West University, 2010. http://hdl.handle.net/10394/4442.

Full text
Abstract:
Since the emergence of branding as an approach to marketing, the concept has been received with a great deal of interest and has stimulated ever increasing research in the area. Businesses have realized the importance of retaining existing customers and have begun to identify and apply ways to build long-term relationships with customers. These relationships with customers require an understanding of customer needs, business requirements and the influences that create a long-term relation which is more commonly known as brand loyalty. Several research studies including this one present the results of brand loyalty research in the form of a conceptual framework. From an academic viewpoint, the identification and application of all the relevant influences are essential in the construction of a framework that can guide the promotion of brand loyalty. The aim of this study was to identify the influences that are most important in creating and measuring brand loyalty in the fast moving consumer goods (FMCG) sector. The study builds a conceptual framework using the identified influences and also presents the interrelationships between the influences. The primary theoretical background and concepts in brand loyalty for this study ranged from the history of branding to the results of brand loyalty studies conducted over the past five years. The extensive review of literature and previously tested brand loyalty models resulted in the identification of 12 influences that impact directly on brand loyalty. Reducing the identified set of influences into a manageable set for this thesis involved selecting the most commonly used reliable and valid brand loyalty influences. The empirical study which followed was conducted among a sample of 550 customers who had access to a wide range of FMCG. The empirical study based on the selected 12 brand loyalty influences yielded results that measured the strength of each influence and the interrelationship of influences. The results were analysed by the process of factor analysis, and were presented in the form of a conceptual framework that could be applied in the FMCG segment to measure the strength of brand loyalty influences and determine if the same influences apply to all FMCG. The results of the study confirmed that different influences have different effects on brand loyalty in the FMCG segment. The study revealed that the psychological influences such as brand commitment, brand affect, perceived value and relationship proneness had a far stronger effect on brand loyalty than the brand performance influences such as customer satisfaction or brand performance. Furthermore, the study found an extremely close relationship between influences as far as the specific products were concerned. This study confirmed that FMCG could all be treated as a single entity when evaluating the influences of brand loyalty. The uniqueness and value of the study lies in the evaluation of each brand loyalty influence that is collectively assembled in one framework. The most important contribution of the study is therefore the construction of this conceptual framework through which brand loyalty could be measured and strategically managed.
Thesis (Ph.D. (Business Administration))--North-West University, Potchefstroom Campus, 2011.
APA, Harvard, Vancouver, ISO, and other styles
28

Kleissen, Franciscus Maria. "Uncertainty and identifiability in conceptual models of surface water acidification." Thesis, Imperial College London, 1991. http://hdl.handle.net/10044/1/46868.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Johansson, Henrik. "Conceptual information models to integrate data management in engineering simulation." Luleå, 2002. http://epubl.luth.se/1402-1544/2002/33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Orellana, Bobadilla Barbara A. "Identification of lumped and semi-distributed conceptual rainfall runoff models." Thesis, Imperial College London, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590036.

Full text
Abstract:
Conceptual rainfall runoff (CRR) models usually require calibration to identify their parameter values, whereas their model structure is selected prior to modelling. Considerable efforts have been directed into calibration of lumped CRR models. Identification of the model structure on the basis of available data still remains unclear. The data-based mechanistic (DBM) approach makes minimal assumptions about the model structure, which is identified using powerful statistical techniques. Moreover, there is a similarity between the CRR and DBM model formulations. Based on this similarity, an integration of CRR and DBM models is proposed and evaluated. Two calibration strategies are investigated in the Upper Illinois river catchment (USA) for lumped modelling. Results show that the identified TF model improves the simulated flow, especially in the time to peak, in comparison to the modelled flow of the conceptual model. It is suggested that this improvement is directly related to the lag time parameter considered in the TF model between the effective rainfall and the flow. Semi-distributed rainfall runoff models provide advantages over lumped models in representing the effect of spatially variable inputs, outputs and catchment properties. However, they are affected by parameter identifiability. Four calibration strategies are considered to analyse the ability to meaningfully simulate flow at interior locations. Results show that there are no significant improvements at the catchment outlet when internal gauges are included. The behavioural parameter sets defined at the catchment outlet tend to be non-behavioural at the internal gauges. This tendency increases with the distance from the catchment outlet to the internal gauges. Considering only spatial variability of rainfall, rather than also of parameter values, did not improve the simulations at the outlet or at the internal gauges, compared to lumped modelling results. Calibration only at the catchment outlet using independent sampling of the internal subcatchments achieved similar results. Identification of lumped and semi-distributed CRR models is carried out using the RRTMSD modelling toolbox developed for this work.
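For readers unfamiliar with transfer-function (TF) models of the kind identified in the DBM approach, the sketch below shows a first-order discrete-time TF with a pure time delay between effective rainfall and flow, which is the structural feature the abstract credits for the improved time to peak. The coefficients, delay and rainfall series are invented for illustration and are not identified from the Upper Illinois data.

```python
# Minimal sketch of a first-order transfer-function model with a pure time delay:
#     q[t] = a * q[t-1] + b * r[t - delay]
# Coefficients, delay and the rainfall series are invented for illustration only.
import numpy as np

def tf_flow(rain: np.ndarray, a: float = 0.85, b: float = 0.12, delay: int = 2) -> np.ndarray:
    """Route effective rainfall through a first-order linear store with a lag."""
    flow = np.zeros_like(rain)
    for t in range(1, len(rain)):
        lagged_rain = rain[t - delay] if t >= delay else 0.0
        flow[t] = a * flow[t - 1] + b * lagged_rain
    return flow

rng = np.random.default_rng(42)
effective_rain = rng.gamma(shape=0.8, scale=3.0, size=30)   # synthetic daily input
simulated_flow = tf_flow(effective_rain)
print(np.round(simulated_flow[:10], 2))
# The delay parameter shifts the flow response in time, which is why an identified
# TF model can improve the timing of the simulated peak.
```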
APA, Harvard, Vancouver, ISO, and other styles
31

Harmain, H. M. "Building object-oriented conceptual models using natural language processing techniques." Thesis, University of Sheffield, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.312740.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Omar, Mussa. "Semi-automated development of conceptual models from natural language text." Thesis, University of Huddersfield, 2018. http://eprints.hud.ac.uk/id/eprint/34665/.

Full text
Abstract:
The process of converting natural language specifications into conceptual models requires detailed analysis of natural language text, and designers frequently make mistakes when undertaking this transformation manually. Although many approaches have been used to help designers translate natural language text into conceptual models, each approach has its limitations. One of the main limitations is the lack of a domain-independent ontology that can be used as a repository for entities and relationships, thus guiding the transition from natural language processing into a conceptual model. Such an ontology is not currently available because it would be very difficult and time consuming to produce. In this thesis, a semi-automated system for mapping natural language text into conceptual models is proposed. The model, which is called SACMES, combines a linguistic approach with an ontological approach and human intervention to achieve the task. The model learns from the natural language specifications that it processes, and stores the information that is learnt in a conceptual model ontology and a user history knowledge database. It then uses the stored information to improve performance and reduce the need for human intervention. The evaluation conducted on SACMES demonstrates (1) that designers' creation of conceptual models is improved when using the system, compared with not using any system, and (2) that the performance of the system improves as more natural language requirements are processed, and thus the need for human intervention decreases. However, these advantages may be improved further through development of the learning and retrieval techniques used by the system.
APA, Harvard, Vancouver, ISO, and other styles
33

Howell, Dennis H. "Japan's Security Decisions: Allison's Conceptual Models and Missile Defense Policy." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/42780.

Full text
Abstract:
This research project assesses the continued utility of Allison's three policy-making models in analyzing contemporary foreign policy problems. It also explores the effect of cultural considerations on Allison's concepts by delving into the unique themes of Japanese politics. The climate in which this policy decision is made is framed through a discussion of the strategic environment and Japanese defense policy following the Cold War and 9/11. The rational actor, organizational process, and bureaucratic politics models are applied to Japan's 2003 decision to field a missile defense system through a qualitative analysis of English-language secondary hard-copy and online sources. Some Japanese government materials are reviewed as well; the Japanese language, however, presented challenges to research. Despite the expectation that the rational actor model best describes the Japanese approach to missile defense, this project shows the true value of Allison's theories lies in their capacity to expose issues relevant to policy problems from varying perspectives. Japan's missile defense policy likely resulted from a combination of the three models, each influenced in varying degrees by the cultural aspects of Japanese politics.
Master of Arts
APA, Harvard, Vancouver, ISO, and other styles
34

Collins, Sean E. "Comparing hypotheses proposed by two conceptual models for stream ecology." University of Cincinnati / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1396532770.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Derrick, Emory Joseph. "Conceptual frameworks for discrete event simulation modeling." Thesis, Virginia Tech, 1988. http://hdl.handle.net/10919/43840.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

De, Alwis Brian. "Supporting conceptual queries over integrated sources of program information." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/695.

Full text
Abstract:
A software developer explores a software system by asking and answering a series of questions. To answer these questions, a developer may need to consult various sources providing information about the program, such as the static relationships expressed directly in the source code, the run-time behaviour of a program recorded in a dynamic trace, or evolution history as recorded in a source management system. Despite the support afforded by software exploration tools, developers often struggle to find the necessary information to answer their questions and may even become disoriented, where they feel mentally lost and are uncertain of what they were trying to accomplish. This dissertation advances a thesis that a developer's questions, which we refer to as conceptual queries, can be better supported through a model to represent and compose different sources of information about a program. The basis of this model is the sphere, which serves as a simple abstraction of a source of information about a program. Many of the software exploration tools used by a developer can be represented as a sphere. Spheres can be composed in a principled fashion such that information from a sphere may replace or supplement information from a different sphere. Using our sphere model, for example, a developer can use dynamic runtime information from an execution trace to replace information from the static source code to see what actually occurred. We have implemented this model in a configurable tool, called Ferret. We have used the facilities provided by the model to implement 36 conceptual queries identified from the literature, blogs, and our own experience, and to support the integration of four different sources of program information. Establishing correspondences between similar elements from different spheres allows a query to bridge across different spheres, in addition to allowing a tool's user interface to drive queries from other sources of information. Through this effort we show that the sphere model broadens the set of possible conceptual queries answerable by software exploration tools. Through a small diary study and a controlled experiment, both involving professional software developers, we found the developers used the conceptual queries that were available to them and reported finding Ferret useful.
APA, Harvard, Vancouver, ISO, and other styles
37

Clowes, Darren. "Hybrid semantic-document models." Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/14736.

Full text
Abstract:
This thesis presents the concept of hybrid semantic-document models to aid information management when using standards for complex technical domains such as military data communication. These standards are traditionally text-based documents for human interpretation, but prose sections can often be ambiguous and can lead to discrepancies and subsequent implementation problems. Many organisations produce semantic representations of the material to ensure common understanding and to exploit computer-aided development. In developing these semantic representations, no relationship is maintained to the original prose. Maintaining relationships between the original prose and the semantic model has key benefits, including assessing conformance at a semantic level, and enabling original content authors to explicitly define their intentions, thus reducing ambiguity and facilitating computer-aided functionality. Through the use of a case study method based on the military standard MIL-STD-6016C, a framework of relationships is proposed. These relationships can integrate with common document modelling techniques and provide the necessary functionality to allow semantic content to be mapped into document views. These relationships are then generalised for applicability to a wider context. Additionally, this framework is coupled with a templating approach which, for repeating sections, can improve consistency and further enhance quality. A reflective approach to model-driven web rendering is presented and evaluated. This reflective approach uses self-inspection at runtime to read directly from the model, thus eliminating the need for any generative processes, which result in data duplication across sources used for different purposes.
APA, Harvard, Vancouver, ISO, and other styles
38

Ferreira, Juliana dos Santos. "A conceptualização de bandido em expressões bandido de x: uma perspectiva cognitivista." Universidade do Estado do Rio de Janeiro, 2012. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=4079.

Full text
Abstract:
Focusing on the conceptualization of the concept of BANDIT in 32 expressions with the structure bandido de x ('bandit of x'), this dissertation describes the idealized cognitive models underlying the construction of meaning of such expressions, treating them as a complex cognitive model, in Lakoff's (1987) terms, that is productive in the language. The theoretical framework of the study also includes Conceptual Blending Theory (FAUCONNIER and TURNER, 2002) and Conceptual Metaphor Theory (LAKOFF and JOHNSON, 1980). The analysis of the bandido de x constructions was carried out on 137 comments taken from the internet and on definitions produced by 15 primary school students, 18 secondary school students and 20 university students; the students who collaborated in the research defined 24 bandido de x expressions. The research followed a qualitative data analysis procedure, in which we observed the different interpretations given to the expressions and grounded them in the cognitive processes involved in their meaning. Thus, based on the analysis of the internet users' comments and the students' definitions, we propose four conceptualization processes for bandido de x expressions: (a) conceptualization based on propositional cognitive models, in which x is a locative interpreted as the bandit's place of origin or of action (bandido de morro 'hillside bandit', bandido de rua 'street bandit', bandido de cadeia 'jailhouse bandit'); (b) conceptualization based on image-schematic models, in which a kind of scale is attributed to the meaning of the construction, yielding different statuses for the category BANDIT OF X, underlying expressions such as bandido de primeira/segunda/quinta categoria/linha ('first/second/fifth-class bandit'); (c) conceptualization of BANDIT OF X based on metonymic models, in which x is an item of clothing, footwear or an accessory, so that the BANDIT is interpreted as belonging to a category that typically wears that item (bandido de colarinho branco 'white-collar bandit', bandidos de farda 'uniformed bandits', bandido de chinelo 'flip-flop bandit'); (d) conceptualization of BANDIT OF X based on metaphorical models, in which x is an abstract concept that can be understood as an object possessed by the bandit, characterizing him by his way of acting or his expertise (bandido de conceito, bandido de atitude, bandido de fé). We therefore believe it is possible to describe the patterns governing the conceptualization of BANDIT OF X, whose meanings, achieved through modifiers, reveal the productivity and complexity of the cognitive model BANDIT.
The theme of this study is the concept of BANDIT. We investigate, analyze and describe the idealized cognitive models of 32 expressions resulting from the construction bandido de x ('bandit of x'). We organized a corpus of 137 comments taken from the internet that contain bandido de x expressions and provide a description of the relevant idealized cognitive models. The study draws on Conceptual Metaphor Theory (LAKOFF and JOHNSON, 1980), Conceptual Blending Theory (FAUCONNIER and TURNER, 2002) and the theory of Idealized Cognitive Models (LAKOFF, 1987). The other part of the corpus was built by analyzing the responses of 15 elementary school students, 18 middle school students and 20 college students, who defined 24 bandido de x expressions. The research followed a qualitative data-analysis procedure in which we examined the different interpretations given to the expressions in terms of the cognitive processes involved. Based on the analysis of the internet comments and the students' definitions, we propose four conceptualization processes for bandido de x expressions: (a) conceptualization based on propositional cognitive models, where x is a locative interpreted as the bandit's place of origin or of action (hillside bandit, street thug, jailhouse bandit); (b) conceptualization based on image-schematic models, where a kind of scale is attributed to the meaning of the construction, yielding different statuses for the category BANDIT OF X, as in first/second/fifth-class bandit; (c) conceptualization based on metonymic models, where x is an item of clothing, footwear or an accessory, so that the bandit is interpreted as belonging to a category that typically wears that item (white-collar crook, uniformed bandits, flip-flop bandit); (d) conceptualization based on metaphorical models, where x is an abstract concept understood as an object possessed by the bandit, characterizing him by his way of acting or his expertise (bandido de conceito, bandido de atitude, bandido de fé). We therefore believe it is possible to describe the patterns governing the conceptualization of BANDIT OF X, whose meanings, achieved through modifiers, reveal the productivity and complexity of the cognitive model BANDIT.
APA, Harvard, Vancouver, ISO, and other styles
39

Eryilmaz, Utkan. "A Verification Approach For Dynamics Of Metamodel Based Conceptual Models Of The Mission Space." PhD thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613144/index.pdf.

Full text
Abstract:
Conceptual models were introduced in the simulation world in order to describe the problem domain in detail before any implementation is attempted. One of the recent approaches for conceptual modeling of the military mission space is the KAMA approach, which provides a process description, a UML-based notation, and a supporting tool for developing conceptual models. The prominence of the approach stems from the availability of guidance and applications in real-life case studies. Although the credibility of a conceptual model can be enhanced through the use of a structured notation and tools, verification and validation activities must still be performed to arrive at more credible conceptual models. A conceptual model includes two categories of information: static and dynamic. The dynamic information describes the changes that occur over time. In this study, the dynamic characteristics of conceptual models described in the KAMA notation are explored and a verification approach based on them is proposed. The dynamic aspects of the KAMA notation and example conceptual models provide the information necessary to characterize the dynamic properties of conceptual models. Using these characteristics as a basis, an approach is formulated that consists of formal and semiformal techniques as well as supporting tools. For describing additional properties for dynamic verification, an extended form of KAMA is developed, called the KAMA-DV notation. The approach is applied to two different real-life case studies and its effectiveness is compared with earlier verification studies.
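The abstract does not spell out the verification machinery, but the core idea behind this kind of dynamic verification — exhaustively exploring the states a model can reach and checking a required property in each — can be illustrated with a minimal sketch. The transition system and the property below are invented for illustration and are not taken from KAMA or KAMA-DV.

```python
from collections import deque

# Minimal illustration of one kind of dynamic check: explore every state
# reachable from the initial state of a small labelled transition system
# and verify that a required property holds in each of them.
# The states, transitions and property are purely illustrative.
transitions = {
    "idle":      ["planning"],
    "planning":  ["executing", "idle"],
    "executing": ["done", "planning"],
    "done":      [],
}

def violates(state):
    # Example property: the model must never reach an undefined state.
    return state not in transitions

def check(initial="idle"):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if violates(state):
            return f"property violated in state {state!r}"
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return f"property holds in all {len(seen)} reachable states"

print(check())
```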
APA, Harvard, Vancouver, ISO, and other styles
40

Helvaci, Aziz. "Comparison Of Parametric Models For Conceptual Duration Estimation Of Building Projects." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/2/12609759/index.pdf.

Full text
Abstract:
Estimation of construction durations is a crucial part of project planning, as several key decisions are based on the estimated durations. In general, construction durations are estimated using planning and scheduling techniques such as the Gantt or bar chart, the Critical Path Method (CPM), and the Program Evaluation and Review Technique (PERT). However, these techniques usually require detailed design information for estimating activity durations and determining the sequencing of activities. Pre-design duration estimates may in some cases be performed with these techniques, but the accuracy of such estimates depends mainly on the experience of the planning engineer. This study aims to develop and compare alternative methods for conceptual duration estimation of building constructions using the basic project information available at the early stages of a project. Five parametric duration estimation models are developed with data from 17 building projects constructed by a contractor in the United States. Regression analysis and artificial neural networks are used in the development of these five duration estimation models. A parametric cost estimation model is also developed using regression analysis, to supply the cost estimates needed to evaluate the prediction performance of the cost-based duration estimation models. Finally, the prediction performances of all parametric duration estimation models are determined and compared. The models provided reasonably accurate estimates of construction durations. The results also indicated that construction durations can be predicted accurately without making an estimate of project cost.
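As a rough illustration of the kind of parametric model the abstract describes, the sketch below fits an ordinary least-squares regression of construction duration on a few conceptual-stage project descriptors. The feature names, the data values and the use of scikit-learn are assumptions for illustration only; they are not the thesis's actual variables, data or models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical conceptual-stage descriptors: gross floor area (m2),
# number of storeys, and estimated cost (millions). All values invented.
X = np.array([
    [12000, 4, 18.0],
    [ 8500, 3, 11.5],
    [20000, 6, 30.2],
    [ 5000, 2,  7.1],
    [15000, 5, 22.4],
])
y = np.array([420, 330, 560, 270, 480])  # observed durations in days

model = LinearRegression().fit(X, y)

# Predict the duration of a new project from its conceptual-stage data.
new_project = np.array([[10000, 4, 15.0]])
print(f"estimated duration: {model.predict(new_project)[0]:.0f} days")
```

An artificial neural network variant of the same idea would keep the identical fit/predict interface, for example by swapping LinearRegression for scikit-learn's MLPRegressor.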
APA, Harvard, Vancouver, ISO, and other styles
41

Mylopoulos, Maria. "Conceptual models for knowledge management, an empirical study using Knowledge Forum." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/MQ63065.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Cooper, Vincent A. "On automatic calibration of conceptual rainfall runoff models using optimisation techniques." Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=38473.

Full text
Abstract:
Conceptual rainfall runoff (CRR) models are popular among hydrologists owing to their modest input data requirements, their simple structure and, consequently, their modest computational requirements. The realisation that CRR model calibration is a global optimisation problem rather than a local one was a significant development, and the subsequent application of global optimisation methods (GOMs) has resolved a major difficulty in estimating model parameters. Further improvement in parameter estimation may follow if constraints that help to limit the number of feasible parameter values can be developed. This study proposes a methodology for formulating constraints related to the CRR model structure and the hydrologic data available for calibration.
In developing this methodology, the research examined the calibration capabilities of three GOMs, namely the Shuffled Complex Evolution (SCE) method, a genetic algorithm (GA) and a simulated annealing procedure (SA), and one local optimisation method, the downhill simplex method (DSM). All of the GOMs performed better than the DSM. The SCE displayed superior accuracy and robustness in synthetic data applications, finding all five selected sets of parameter values with almost 100% accuracy. However, the GA performed better than the SCE method with real data, perhaps reflecting some weakness of the SCE in finding globally optimal points under difficult calibration conditions. The SA was inferior to the others with both types of data.
The selection of parameter ranges currently receives little attention in the calibration process, yet even with the superior search capability of GOMs, inflated search spaces can frustrate the search and lead to inferior parameter estimates. Several inequalities relating model parameters to the hydrologic data were developed which, when coupled with the SCE method, significantly improved its performance. This modified SCE method appeared less sensitive to the problem of parameter range specification. A method for formulating these constraints was demonstrated on synthetic data, and a procedure for applying it to real data was illustrated using data from two tropical watersheds.
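To make the calibration setup concrete, here is a minimal sketch of calibrating a toy one-bucket rainfall-runoff model by global optimisation. SciPy's differential evolution stands in for the SCE, GA and SA methods discussed in the thesis, and the penalty term shows how an inequality constraint relating a parameter to the hydrologic data can shrink the effective search space. The model structure, the data and the constraint are invented for illustration and are not the thesis's models or inequalities.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Invented forcing and "observed" discharge series.
rain  = np.array([5.0, 0.0, 12.0, 3.0, 0.0, 8.0, 0.0, 0.0, 6.0, 1.0])
q_obs = np.array([1.2, 0.9, 2.8, 1.9, 1.1, 2.3, 1.4, 0.9, 1.8, 1.2])

def simulate(params, rain, s0=10.0):
    k, capacity = params          # recession coefficient, storage capacity
    s, flows = s0, []
    for p in rain:
        s = min(s + p, capacity)  # storage filled by rain, capped at capacity
        q = k * s                 # linear outflow from the store
        s -= q
        flows.append(q)
    return np.array(flows)

def objective(params):
    q_sim = simulate(params, rain)
    sse = np.sum((q_sim - q_obs) ** 2)
    # Illustrative constraint: the store must be able to hold the largest event;
    # violations are discouraged with a simple penalty.
    penalty = 100.0 * max(0.0, rain.max() - params[1])
    return sse + penalty

bounds = [(0.01, 0.9), (5.0, 100.0)]   # ranges for k and capacity
result = differential_evolution(objective, bounds, seed=1)
print(result.x, result.fun)
```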
APA, Harvard, Vancouver, ISO, and other styles
43

Brown, Stephen Anthony. "Models for automatic differentiation: a conceptual framework for exploiting program transformation." Thesis, University of Hertfordshire, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.263028.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Birkel, Christian. "Integrating high-resolution tracer data into lumped conceptual rainfall-runoff models." Thesis, University of Aberdeen, 2010. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=202563.

Full text
Abstract:
Environmental change is currently regarded as one of the greatest threats to water resources. Limited knowledge of hydrological processes prevents adequate characterization of how these systems will respond to future changes. Geochemical and isotope tracers are considered reliable tools for studying physical processes, but tracer studies are often constrained by the temporal and spatial variability of the tracer response and by coarse data sets. Novel automatic sampling devices and inexpensive laser spectroscopy have made higher-resolution stable isotope tracer data feasible. This thesis presents approaches for integrating geochemical tracer data, high-resolution stable isotope tracer data and process dynamics observed in the field into lumped conceptual rainfall-runoff models in order to study catchment hydrological processes at different scales. The use of such process-based data successfully aided model conceptualization and calibration in the quest for simple water and solute transport models with improved representation of process dynamics. In particular, high-resolution isotope data could identify temporally and spatially variable flow pathways relevant to assessing diffuse pollution transport, information that would otherwise have been lost. This work showed that pollutants in some catchments are likely to discharge rapidly into the stream and, owing to geological properties, to reside over longer periods in deeper groundwater systems. In other words, changes to these systems today are likely to show an immediate effect that fades only slowly over decadal time periods. Such knowledge is important if catchment remediation and recovery are to be assessed from a management point of view, for example when targeting measures and cost-effective land management to improve water quality status.
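A minimal sketch of the basic idea — a lumped conceptual store simulated for both water and a conservative tracer, so that tracer observations can constrain calibration alongside discharge — is given below. The single fully mixed store, its parameters and the data are illustrative assumptions, not the models developed in the thesis.

```python
import numpy as np

# One lumped store tracking water volume S and a tracer "mass" M
# (here simply storage times the isotopic delta value, a common
# simplification when mixing delta values linearly).
def run(rain, c_rain, k=0.2, s0=50.0, c0=-8.0):
    s, m = s0, s0 * c0
    q_sim, c_sim = [], []
    for p, c_in in zip(rain, c_rain):
        s += p                    # rainfall enters the store
        m += p * c_in             # ...carrying its tracer signature
        q = k * s                 # linear outflow
        c_out = m / s             # fully mixed store: outflow concentration
        s -= q
        m -= q * c_out
        q_sim.append(q)
        c_sim.append(c_out)
    return np.array(q_sim), np.array(c_sim)

rain   = np.array([4.0, 0.0, 10.0, 2.0, 0.0, 6.0])
c_rain = np.array([-5.0, -5.0, -12.0, -6.0, -6.0, -9.0])  # e.g. delta-18O of rainfall
q_sim, c_sim = run(rain, c_rain)
print(np.round(q_sim, 2), np.round(c_sim, 2))
```

In a calibration, the simulated stream tracer signal c_sim would be compared with observed isotope data in addition to comparing q_sim with discharge, which is what allows the tracer data to reject parameter sets that reproduce flow but not flow pathways.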
APA, Harvard, Vancouver, ISO, and other styles
45

Williams, Gbolahan. "Architecting tacit information in conceptual data models for requirements process improvement." Thesis, King's College London (University of London), 2013. https://kclpure.kcl.ac.uk/portal/en/theses/architecting-tacit-information-in-conceptual-data-models-for-requirements-process-improvement(2d3369c6-4387-4b69-b625-c9d36705bfac).html.

Full text
Abstract:
Despite extensive work in the field of Requirements Engineering, ineffective requirements remain a major antecedent of project failure. Requirements Engineering (RE) refers to the body of methods associated with elucidating the needs of a client when considering the development of a new system or product. In the literature, challenges in RE have mainly been attributed to insufficient client input, incomplete requirements, evolving requirements and a lack of understanding of the domain. Accordingly, this has raised the need for methods of effectively eliciting, analysing and recording requirements. In the literature, promising methods have been proposed for using ethnography to improve elicitation because of its strong qualitative and quantitative qualities in understanding human activities. There has also been success with the use of Model Driven Engineering techniques for analysing, recording and communicating requirements through the use of Conceptual Data Models (CDMs), which provide a shared understanding of the domain of a system. However, there has been little work that attempts to integrate these two areas from either an empirical or a theoretical perspective. In this thesis, we investigate how ethnographic research methods contribute to a method for data analysis in RE. Specifically, we consider the proposition that a CDM based on explicit and implicit information derived from ethnographic elicitation will lead to design solutions that more closely match the expectations of clients. As a result of our investigation, this thesis presents the following key contributions: (i) the introduction of an ethnographic approach to RE for elicitation and verification; (ii) a rich CDM metamodel and modeling language necessary for defining and recording ethnographic analyses based on implicit and explicit information; (iii) a method for mapping CDMs to high-level architectural abstractions called ecologies. To complement this work, an evaluation case study is provided that demonstrates a real-world application of this work.
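The thesis's CDM metamodel is not reproduced in the abstract, so the sketch below is only a hypothetical illustration of the underlying idea: annotating conceptual-model elements with whether they came from explicit client statements or from tacit, ethnographically observed practice, so that the tacit elements can be surfaced for verification with the client. All names and fields are invented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Attribute:
    name: str
    evidence: str  # "explicit" (stated by the client) or "tacit" (observed in the field)

@dataclass
class Entity:
    name: str
    attributes: List[Attribute] = field(default_factory=list)

order = Entity("Order", [
    Attribute("order_id", "explicit"),
    Attribute("informal_priority", "tacit"),  # used in practice, never written down
])

# Tacit information can then be listed and taken back to the client for verification.
tacit = [a.name for a in order.attributes if a.evidence == "tacit"]
print(tacit)
```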
APA, Harvard, Vancouver, ISO, and other styles
46

Dillon, Andrew. "Knowledge acquisition and conceptual models: A Cognitive analysis of the interface." Cambridge: Cambridge University Press, 1987. http://hdl.handle.net/10150/106468.

Full text
Abstract:
This item is not the definitive copy. Please use the following citation when referencing this material: Dillon, A. (1987) Knowledge acquisition and conceptual models: a cognitive analysis of the interface. In: D. Diaper and R. Winder (eds.) People and Computers III. Cambridge: Cambridge University Press, 371-379. Understanding how users process the information available to them through the computer interface can greatly enhance our abilities to design usable systems. This paper details the results of a longitudinal psychological experiment investigating the effect of interface style on user performance, knowledge acquisition and conceptual model development. Through the use of standard performance measures, interactive error scoring and protocol analysis techniques it becomes possible to identify crucial psychological factors in successful human-computer use. Results indicate that a distinction between "deep" and "shallow" knowledge of system functioning can be drawn, where both types of user appear to interact identically with the machine although significant differences in their respective knowledge exist. The effect of these differences on user ability to perform under stress and to transfer to similar systems is noted. Implications for the design of usable systems are discussed.
APA, Harvard, Vancouver, ISO, and other styles
47

Famiglietti, James Stephen 1960. "Threshold structures in conceptual rainfall-runoff models : potential problems with calibration." Thesis, The University of Arizona, 1986. http://hdl.handle.net/10150/191912.

Full text
Abstract:
One of the most frequently reported problems in the automatic calibration of conceptual rainfall-runoff (CRR) models is the inability to identify a unique and conceptually realistic parameter set for a model in use on a particular watershed. Sorooshian and Gupta [1983,1985] and Gupta and Sorooshian [1983] have shown that model structure can be largely responsible for this problem. This thesis extends this work by further investigating calibration problems resulting from model structure, particularly when the CRR model contains one or more threshold elements. Two aspects of this problem were addressed. In the first, the practice of modeling nonlinear processes with linear models is analyzed. In the second, reparameterization techniques to improve parameter identifiability (Gupta and Sorooshian [1983]; Sorooshian and Gupta [1985]) are demonstrated on a four-parameter simplified CRR model. The results of these studies show that the parameter identification procedure can in fact be hindered by model structure and that the interdependence of model structure and parameter estimation methodologies must be considered by CRR model users and developers if automatic calibration is to be successful.
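The calibration difficulty the abstract points to can be seen in a toy threshold store: runoff is generated only once storage exceeds a threshold, so the simulated response — and hence the objective function — changes abruptly or goes flat as the threshold parameter moves relative to the data. The sketch below is illustrative only and is not the simplified four-parameter model used in the thesis.

```python
import numpy as np

rain = np.array([3.0, 0.0, 9.0, 4.0, 0.0, 7.0, 1.0])

def threshold_runoff(rain, threshold, k=0.5, s0=0.0):
    """Runoff is generated only when storage exceeds the threshold."""
    s, q_out = s0, []
    for p in rain:
        s += p
        q = k * (s - threshold) if s > threshold else 0.0
        s -= q
        q_out.append(q)
    return np.array(q_out)

# Synthetic "observations" generated with a true threshold of 10.
q_obs = threshold_runoff(rain, threshold=10.0)

# Sum-of-squares error as a function of the threshold parameter alone:
# the curve is flat wherever the threshold is never exceeded, which is
# exactly the kind of structure that hampers automatic calibration.
for t in (2.0, 6.0, 10.0, 14.0, 18.0, 24.0, 30.0):
    sse = np.sum((threshold_runoff(rain, t) - q_obs) ** 2)
    print(f"threshold={t:5.1f}  SSE={sse:8.3f}")
```

Running the loop shows the error reaching zero at the true value and becoming completely insensitive to the parameter once the threshold exceeds the total rainfall, i.e. a region where the data contain no information about the parameter.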
APA, Harvard, Vancouver, ISO, and other styles
48

Chabert, Maxime. "Constraint programming models for conceptual clustering : Application to an erp configuration problem." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEI118/document.

Full text
Abstract:
Enterprise Resource Planning (ERP) systems are essential for industrial companies to automate and monitor their business processes in order to boost their competitiveness. ERP systems are generic software designed to serve a large variety of companies with different business processes, and they therefore have many configuration options to support the various business processes used in different companies. The implementation process of an ERP system consists in assigning values to ERP parameters according to the company's requirements: it determines the exact operations and processes supported by the system in that specific company. Infologic is a French company that develops and integrates its own ERP system, called Copilote. Copilote has thousands of parameters that are used to adapt it as precisely as possible to customer requirements. However, this flexibility makes the implementation of Copilote a time-consuming task that requires deep knowledge of its functionalities and parameters. Reducing the complexity of implementing Copilote is a critical issue for Infologic, which needs to bring new system integrators up to speed efficiently in order to meet the demand of new customers. In this thesis, we study the implementation process of Copilote in order to understand the main issues encountered by Infologic. We propose a new approach for extracting a catalog of configuration parts from existing configurations of Copilote, where each configuration part is associated with the business requirement it fulfils so that it can be reused in subsequent implementations of Copilote. To this end, we propose to use constraint programming (CP), which makes it easy to integrate expert feedback by means of new constraints or optimisation criteria. We introduce new CP models for solving conceptual clustering problems and a new global constraint for the exact cover problem, together with several propagation algorithms. We show that this allows conceptual clustering problems to be modelled easily and solved more efficiently than with existing declarative approaches.
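The thesis's global constraint and propagators are its own contribution, but the exact cover structure at the heart of conceptual clustering — every object must belong to exactly one selected cluster — can be sketched with an off-the-shelf CP solver. The sketch below uses Google OR-Tools CP-SAT purely as an illustration, with invented objects and candidate clusters, and it omits the conceptual (pattern-based) description attached to each cluster.

```python
from ortools.sat.python import cp_model

objects = ["o1", "o2", "o3", "o4", "o5"]
# Candidate clusters (e.g. formal concepts): each is a subset of the objects.
candidates = {
    "c1": {"o1", "o2"},
    "c2": {"o3", "o4", "o5"},
    "c3": {"o1", "o2", "o3"},
    "c4": {"o4", "o5"},
}

model = cp_model.CpModel()
chosen = {name: model.NewBoolVar(name) for name in candidates}

# Exact cover: every object is covered by exactly one chosen cluster.
for obj in objects:
    model.Add(sum(chosen[name] for name, members in candidates.items()
                  if obj in members) == 1)

# Optional criterion, easy to swap for another: minimise the number of clusters.
model.Minimize(sum(chosen.values()))

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print([name for name in candidates if solver.Value(chosen[name])])
```

Because the clustering is expressed declaratively, adding an expert's requirement (a size limit, a forbidden pair of objects, a different objective) is a matter of adding one more constraint rather than redesigning an algorithm, which is the flexibility the abstract emphasises.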
APA, Harvard, Vancouver, ISO, and other styles
49

Wijns, Christopher P. "Exploring conceptual geodynamic models : numerical method and application to tectonics and fluid flow." University of Western Australia. School of Earth and Geographical Sciences, 2005. http://theses.library.uwa.edu.au/adt-WU2005.0068.

Full text
Abstract:
Geodynamic modelling, via computer simulations, offers an easily controllable method for investigating the behaviour of an Earth system and providing feedback to conceptual models of geological evolution. However, most available computer codes have been developed for engineering or hydrological applications, where strains are small and post-failure deformation is not studied. Such codes cannot simultaneously model large deformation and porous fluid flow. To remedy this situation in the face of tectonic modelling, a numerical approach was developed to incorporate porous fluid flow into an existing high-deformation code called Ellipsis. The resulting software, with these twin capabilities, simulates the evolution of highly deformed tectonic regimes where fluid flow is important, such as in mineral provinces. A realistic description of deformation depends on the accurate characterisation of material properties and the laws governing material behaviour. Aside from the development of appropriate physics, it can be a difficult task to find a set of model parameters, including material properties and initial geometries, that can reproduce some conceptual target. In this context, an interactive system for the rapid exploration of model parameter space, and for the evaluation of all model results, replaces the traditional but time-consuming approach of finding a result via trial and error. The visualisation of all solutions in such a search of parameter space, through simple graphical tools, adds a new degree of understanding to the effects of variations in the parameters, the importance of each parameter in controlling a solution, and the degree of coverage of the parameter space. Two final applications of the software code and interactive parameter search illustrate the power of numerical modelling within the feedback loop to field observations. In the first example, vertical rheological contrasts between the upper and lower crust, most easily related to thermal profiles and mineralogy, exert a greater control over the mode of crustal extension than any other parameters. A weak lower crust promotes large fault spacing with high displacements, often overriding initial close fault spacing, to lead eventually to metamorphic core complex formation. In the second case, specifically tied to the history of compressional orogenies in northern Nevada, exploration of model parameters shows that the natural reactivation of early normal faults in the Proterozoic basement, regardless of basement topography or rheological contrasts, would explain the subsequent elevation and gravitationally-induced thrusting of sedimentary layers over the Carlin gold trend, providing pathways and ponding sites for mineral-bearing fluids.
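The interactive parameter-space exploration described here can be approximated, in spirit, by an exhaustive sweep that records the misfit of every parameter combination so the whole space can be inspected, ranked or plotted at once. The two parameters, the stand-in model and the "target" below are invented placeholders; the thesis couples this idea to full geodynamic simulations.

```python
import itertools
import numpy as np

# Stand-in for an expensive geodynamic simulation: a cheap analytic misfit
# over two hypothetical parameters (e.g. lower-crust viscosity and fault friction).
def misfit(viscosity_exp, friction):
    target = (20.5, 0.12)          # invented "field observation" to reproduce
    return (viscosity_exp - target[0]) ** 2 + 50.0 * (friction - target[1]) ** 2

viscosities = np.linspace(19.0, 22.0, 13)   # log10(Pa s)
frictions   = np.linspace(0.02, 0.30, 15)

results = [(v, f, misfit(v, f)) for v, f in itertools.product(viscosities, frictions)]
results.sort(key=lambda r: r[2])

# All runs are kept, so the coverage of the space and the sensitivity to each
# parameter can be examined; here we just print the best few combinations.
for v, f, m in results[:3]:
    print(f"viscosity=10^{v:.2f}  friction={f:.2f}  misfit={m:.4f}")
```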
APA, Harvard, Vancouver, ISO, and other styles
50

BENEVIDES, A. B. "A Model-Based graphical editor for supporting the creation, verification and validation of OntoUML conceptual models." Universidade Federal do Espírito Santo, 2010. http://repositorio.ufes.br/handle/10/4211.

Full text
Abstract:
This thesis presents a model-based graphical editor to support the creation, verification and validation of conceptual models and domain ontologies in a philosophically and cognitively well-founded modeling language named OntoUML. The editor is designed so that, on the one hand, it shields the user from the complexity of the ontological principles underlying the language. On the other hand, it enforces these principles in the models produced by providing a mechanism for automatic formal verification of constraints, thus ensuring that the models created are syntactically correct. Moreover, assessing the quality of conceptual models is a key point in ensuring that they can be used effectively as a basis for understanding, agreement and the construction of information systems. For this reason, the editor is also able to generate model instances automatically by transforming the models into specifications in the logic-based language Alloy. Since the generated Alloy specifications include the modal axioms of the foundational ontology underlying OntoUML, the Unified Foundational Ontology (UFO), the automatically generated instances exhibit modal behaviour while being dynamically classified, thereby supporting the validation of the modal meta-properties of the types provided by the OntoUML language.
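To give a feel for the model-to-Alloy transformation mentioned in the abstract, here is a deliberately tiny, hypothetical sketch that emits an Alloy specification from a toy model description. The class names and the naive mapping (everything as a static signature) are assumptions for illustration only; the actual OntoUML-to-Alloy transformation in the thesis also encodes UFO's modal axioms and the dynamic classification of anti-rigid types, which this sketch ignores.

```python
# Toy "conceptual model": name -> (stereotype, parent). Values are invented.
model = {
    "Person":  {"stereotype": "kind", "parent": None},
    "Student": {"stereotype": "role", "parent": "Person"},
}

def to_alloy(model):
    lines = []
    for name, info in model.items():
        if info["parent"] is None:
            lines.append(f"sig {name} {{}}")
        else:
            # Naive mapping: subtyping as a static Alloy subsignature.
            lines.append(f"sig {name} extends {info['parent']} {{}}")
    # A trivial command asking the Alloy Analyzer to search for instances.
    lines.append("run {} for 4")
    return "\n".join(lines)

print(to_alloy(model))
```

Feeding the generated text to the Alloy Analyzer would then produce example instances of the model, which is the validation-by-instantiation workflow the abstract describes.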
APA, Harvard, Vancouver, ISO, and other styles
