To see the other types of publications on this topic, follow the link: Toom model.

Dissertations / Theses on the topic 'Toom model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Toom model.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Jones, Thomas Carroll Jr. "JigCell Model Connector: Building Large Molecular Network Models from Components." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/78277.

Full text
Abstract:
The ever-growing size and complexity of molecular network models makes them difficult to construct and understand. Modifying a model that consists of tens of reactions is no easy task. Attempting the same on a model containing hundreds of reactions can seem nearly impossible. We present the JigCell Model Connector, a software tool that supports large-scale molecular network modeling. Our approach to developing large models is to combine smaller models, making the result easier to comprehend. At the base, the smaller models (called modules) are defined by small collections of reactions. Modules connect together to form larger modules through clearly defined interfaces, called ports. In this work, we enhance the port concept by defining different types of ports. Not all modules connect together the same way; therefore, multiple connection options need to exist.
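The module-and-port composition the abstract describes can be illustrated with a minimal sketch (class and port names are hypothetical, not the actual JigCell Model Connector API):

```python
# Toy sketch of composing reaction modules through typed ports
# (hypothetical names; not the JigCell Model Connector API).

class Port:
    def __init__(self, name, kind):
        self.name = name  # species or signal the module exposes
        self.kind = kind  # port type, e.g. "input" or "output"

class Module:
    def __init__(self, name, reactions=None):
        self.name = name
        self.reactions = reactions or []  # small collection of reactions
        self.ports = {}

    def add_port(self, port):
        self.ports[port.name] = port

def connect(src, src_port, dst, dst_port):
    """Join an output port of one module to an input port of another,
    producing a larger composite module."""
    a, b = src.ports[src_port], dst.ports[dst_port]
    if (a.kind, b.kind) != ("output", "input"):
        raise ValueError("incompatible ports: %s -> %s" % (a.kind, b.kind))
    return Module(src.name + "+" + dst.name,
                  reactions=src.reactions + dst.reactions)

cycle = Module("cell_cycle", ["CycB -> CycB*"])
cycle.add_port(Port("CycB", "output"))
checkpoint = Module("checkpoint", ["CycB* + Wee1 -> inactive"])
checkpoint.add_port(Port("CycB", "input"))
combined = connect(cycle, "CycB", checkpoint, "CycB")
```

Different port kinds would correspond to the multiple connection options the thesis introduces.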
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
2

Carmona, Bardella Ana. "Combining Discrete Element and Process-based sedimentary models: a new tool to model syntectonic sedimentation." Doctoral thesis, Universitat de Barcelona, 2016. http://hdl.handle.net/10803/401652.

Full text
Abstract:
In order to understand the current state of the natural rocky environment and its heterogeneity, we need to study the interaction and time evolution of the numerous geological processes that have contributed to the geological reality we observe today. Accordingly, this thesis concerns the numerical modelling of geological processes. The forward numerical model developed during this thesis is able to simulate deformation and sedimentation in a single setting. To do this, the model uses a novel approach that couples a Discrete Element Model (DEM) and a process-based sedimentary model, Simsafadim (SFM), to link the two processes, deformation and sedimentation. The discrete element model (DEM) handles the simulation of deformation in different materials in 2D and 3D. It is primarily used to investigate the propagation and evolution of deformation in the upper crust caused by tectonic movements. Simsafadim (SFM) is a process-based numerical forward model that simulates subaqueous clastic transport and sedimentation in three dimensions, including processes of carbonate interaction, production, transport and sedimentation. It can efficiently model the distribution of facies and the depositional architectures in a sedimentary basin, and it is a powerful tool for the 3D prediction of stratigraphic structures. Merging the two codes provides a new tool for geological modelling in which deformation is influenced by the presence of syntectonic sediment dispersal and deposition. In addition, the tectonic processes change the topographic surface, which influences fluid flow, transport and, consequently, sedimentation in the process-based sedimentary model. The interaction of tectonic and sedimentary processes allows us to study the propagation of deformation in the syntectonic materials, as well as how these new sediments influence the propagation of deformation in the pretectonic unit.
The model is applied to two case studies, both to test the viability of the new model and to gain new insight into the respective topics: 1) First case study: the effect of normal faulting and a relay ramp on sediment dispersal. The model is used to study the sedimentary infill of an extensional basin, specifically one related to a relay-ramp system. Two configurations are designed: one with a single normal fault, and one with two overlapping normal faults linked by a relay ramp. The results show that the location of the source area relative to the available accommodation space plays the major role in the distribution of the different sediment types within the basin. Nonetheless, when the source area for water and sediment is defined as regional and parallel to the fault, the grain-size distribution obtained with the two overlapping faults linked by a relay ramp shows clear asymmetries compared with that obtained with the one-fault configuration. The extensional experiments therefore allow us to conclude that a relay-ramp configuration can play an important role in the distribution of sediments within the basin. 2) Second case study: the effect of syntectonic sedimentation on fold geometry. The numerical model is used to investigate the effect of syntectonic sedimentation on fold geometry, specifically in relation to a prograding delta flanked by two growing anticlines. For an initial tectonic configuration that reproduces the growth of two faults, two cases of the sedimentary model are considered: without sediments, and with syntectonic sedimentation. In summary, the main results of these experiments show that syntectonic sedimentation controls the mechanism of fold growth and the final fold geometry: the left-side fold develops into a left-vergent asymmetric anticline.
Moreover, the strain suggests that this anticline evolves from a detachment fold (without sediments) into a basinward fault-propagation fold (with sediments). As a consequence, the inner syncline and the related sedimentary basin also change in the transverse and longitudinal directions, becoming wider with syntectonic sedimentation.
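The two-way coupling described above, where deformation reshapes the topography and the syntectonic sediment load in turn influences deformation, can be caricatured in a few lines (a toy sketch with invented update rules, not the DEM or Simsafadim numerics):

```python
# Toy sketch of alternating tectonic and sedimentary steps over a 1D basin
# profile. The update rules are invented for illustration only.

def dem_step(topography, sediment_load):
    """Tectonic step: one side uplifts, the other subsides; the syntectonic
    sediment load slightly damps the motion (deformation 'feels' sediments)."""
    return [h + (0.5 if i < len(topography) // 2 else -0.5) - 0.01 * s
            for i, (h, s) in enumerate(zip(topography, sediment_load))]

def sfm_step(topography, sediment_load):
    """Sedimentation step: deposition concentrates in the topographic low
    (the deformed surface steers transport and deposition)."""
    lowest = min(topography)
    return [s + (1.0 if h == lowest else 0.1)
            for h, s in zip(topography, sediment_load)]

def run_coupled(topography, n_steps):
    sediment = [0.0] * len(topography)
    for _ in range(n_steps):
        topography = dem_step(topography, sediment)  # deformation changes topography
        sediment = sfm_step(topography, sediment)    # new topography steers deposition
    return topography, sediment

topo, sed = run_coupled([0.0, 0.0, 0.0, 0.0], 10)
```

In the real model each step is a full 3D simulation, but the alternation of the two codes follows the same pattern.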
APA, Harvard, Vancouver, ISO, and other styles
3

Sarrazin, Pierre. "Reengineering a process model generalisation tool." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq29777.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sarrazin, Pierre 1971. "Reengineering a process model generalisation tool." Thesis, McGill University, 1996. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=27402.

Full text
Abstract:
A large organization that has many projects to manage may want to build a model that gives an overview of the common and variable parts of its projects' processes. This action is process model generalisation. The McGill Software Engineering Laboratory has developed a technique and a tool to achieve that. The work described in this thesis consisted of reengineering the tool to give it a longer life expectancy and to make it part of a future client-server architecture suitable for developing a suite of process management tools. The tool was effectively reengineered and this helped the laboratory define the architecture better. Also, some lessons about software maintenance were learned.
APA, Harvard, Vancouver, ISO, and other styles
5

SILVA, THAIS ROSA DA. "A SUPPLY CHAIN MATURITY MODEL TOOL." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2018. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=36265@1.

Full text
Abstract:
The competitive marketplace systematically demands reduced costs and more flexibility from contemporary organizations, which, in turn, can achieve better results by focusing on the development of their supply chains. Thus, supply chain maturity models present themselves as allies in the process of leveraging organizations. Despite the increasing number of models in this field of knowledge, the literature points to a gap in the creation of hybrid, adaptive models designed to measure the supply chain in a broad way. From this gap, the objective of this dissertation was to develop a maturity model in supply chain management with an associated measurement tool. To this end, a methodology was developed in three phases, covering the introductory studies, the bibliographical research and the actual development of the model and tool. The result of this research was the creation of a technological product, registered with the National Institute of Industrial Property (INPI): the Supply Chain Management Maturity Model (SCM3). SCM3 is interdisciplinary, multidimensional and composed of six application dimensions: supplier management, operations and customer management, logistics activities management, human resources, technological systems and performance measurement system. SCM3 uses the concept of key transition points for changing maturity level, which gives the model greater pragmatism. The operationalization of this model also required the development of a method to support organizations in its implementation. The computational tool prototype associated with SCM3 generates compatible and comparable results, gathering 153 questions and 5 functionalities. The validation stage of the model involved specialists from both industry and academia, who evaluated aspects of the model's interface, content, logical linkage, viability and acceptability.
The model was applied in an organizational environment in two large companies, leaders of the Brazilian telecommunications sector, with the participation of the companies' management layer. The application proved relevant for organizations and academics, enabling numerous analyses that help organizations leverage their chains and supporting future benchmarking studies in supply chain maturity.
APA, Harvard, Vancouver, ISO, and other styles
6

Buscaroli, Nicolò. "Il tool PRISM+." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2022.

Find full text
Abstract:
Blockchain is one of the most innovative and important technologies to have appeared recently in computer science. Prism+ is a tool that makes it possible to perform probabilistic analyses and to study the behaviour of models of blockchain protocols. After a brief introduction, the thesis first presents the topic of blockchain in general, describing the most important characteristics and uses of this technology. It then covers Prism, a tool for analysing probabilistic models that is used for systems in a variety of application domains. This section is divided into several parts: a description of the PRISM language, used to build a probabilistic model that the tool can analyse; an explanation of the property specification language that Prism needs in order to evaluate various properties of the model; and a worked example of using Prism, useful for a direct understanding of how the tool works. The discussion then turns to Prism+, an evolution of Prism whose main purpose is to enable the probabilistic analysis of models of blockchain protocols. This section has two parts: a presentation of the data structures added in Prism+ with respect to Prism, and a description of the operations available on these new data structures. An example of using Prism+ follows. A brief conclusion closes the thesis.
APA, Harvard, Vancouver, ISO, and other styles
7

CIRILO, ELDER JOSE REIOLI. "GENARCH: A MODEL-BASED PRODUCT DERIVATION TOOL." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2008. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=12424@1.

Full text
Abstract:
This work presents a model-based tool for product derivation, called GenArch, which aims to enable the mainstream software developer community to use the concepts and foundations of the SPL approach without the need to understand complex concepts or models. The tool's approach is built on top of model-driven development techniques. It is centered on the definition of three models (feature, implementation and configuration models), which enable the automatic instantiation of software product lines (SPLs) or frameworks. A set of specific Java annotations is also defined to allow many of the models to be generated automatically from the existing implementation elements of SPL architectures. The Eclipse platform and the EMF and openArchitectureWare technologies are used as the basis for the implementation of the tool. The dissertation also presents a GenArch extension that addresses the new abstractions provided by the Spring and OSGi component models. Different kinds of customization are provided by this extension, ranging from fine-grained configuration of component properties to the automatic selection of the components that will compose the final product. As part of the validation of the approach, the tool was used in the derivation of three case studies: (i) the JUnit framework; (ii) a J2ME games SPL; and (iii) a service-oriented web application. Several lessons learned and discussions resulting from the use of the tool are also described.
APA, Harvard, Vancouver, ISO, and other styles
8

Newby, Christopher. "Tool for Physics Beyond the Standard Model." Thesis, University of Oregon, 2016. http://hdl.handle.net/1794/20472.

Full text
Abstract:
The standard model (SM) of particle physics is a well-studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where, instead of mixing between two Abelian groups, one of the groups is non-Abelian. Using this we then see that there is an inherent mass scale in the mixing strength, something that is absent in the Abelian-Abelian case. Furthermore, if the non-Abelian symmetry is the SU(2)_L of the SM, then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC.
This dissertation includes previously published and unpublished co-authored material.
APA, Harvard, Vancouver, ISO, and other styles
9

Gu, Wenqing. "Tool Integration: Model-based Tool Adapter Construction and Discovery Conforming to OSLC." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-169430.

Full text
Abstract:
Tool integration is a vital part of the modern IT industry. With ever-increasing complexity, more tools from different domains are needed in the research and development process. However, no vendor currently offers a complete solution for the whole process, and there is no mature solution for integrating different tools, so tools are still used separately in industry. Due to this separation, the same information is created more than once for different tools, which is both time-wasting and error-prone. This thesis is part of research to deliver a model-based tool integration framework that helps end users design their own tool-integration scenarios and implement them with less effort by generating the most common parts automatically. The thesis itself focuses mainly on tool adapters, covering both model-based tool adapter construction and discovery. In the first part, a model-based tool adapter construction platform conforming to OSLC is designed and implemented, and the construction process for a tool adapter is presented with an example. With this platform, most of the code and configuration files can be generated, with the exception of the tool-specific functionality. The tool adapter is constructed as a separate SCA component and can be included in an SCA-based tool chain with minor configuration. With SCA, the deployment of the tool adapter and its future management can be greatly eased. In the second part, the model-based discovery process for an unknown tool adapter conforming to OSLC and our assumptions is presented in detail. With the discovery tool, the sharing of tool adapters is made possible, and the integration of different tools is greatly eased. An example of discovering an unknown tool adapter is also included for a clearer explanation.
Finally, alongside the design and implementation of the construction platform and the discovery process, the existing Matlab/Simulink tool adapter was extended and refined to make it fully compatible with the standard and with our tool chain.
APA, Harvard, Vancouver, ISO, and other styles
10

Naswa, Sudhir. "Representation of Biochemical Pathway Models : Issues relating conversion of model representation from SBML to a commercial tool." Thesis, University of Skövde, School of Humanities and Informatics, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-28.

Full text
Abstract:

Background: Computational simulation of complex biological networks lies at the heart of systems biology, since it can confirm the conclusions drawn by experimental studies of biological networks and guide researchers to produce fresh hypotheses for further experimental validation. Since this iterative process helps in the development of more realistic system models, a variety of computational tools have been developed. In the absence of a common format for representing models, these tools were developed using different formats. As a result, they were unable to exchange models with one another, leading to the development of SBML, a standard exchange format for computational models of biochemical networks. Here the formats of SBML and of one of the commercial tools of systems biology are compared to study the issues that may arise during conversion between their respective formats. A tool, StoP, has been developed to convert the SBML format to the format of the selected tool.

Results: The basic SBML representation, which takes the form of listings of the various elements of a biochemical reaction system, differs from the representation of the selected tool, which is location-oriented. In spite of this difference, the various components of biochemical pathways, including multiple compartments, global parameters, reactants, products, modifiers, reactions, kinetic formulas and reaction parameters, could be converted from the SBML representation to the representation of the selected tool. The MathML representation of the kinetic formula in an SBML model can be converted to the string format of the selected tool. Some features of SBML are not present in the selected tool. Similarly, the ability of the selected tool to declare parameters for locations, which are global to those locations and their children, is not present in SBML.

Conclusions: Differences between representations of pathway models may include differences in terminology, basic architecture and software capabilities, and the adoption of different standards for similar things. However, the overall similarity of the domain of pathway models enables us to interconvert these representations. The selected tool should develop support for unit definitions, events and rules. Development of a facility for parameter declaration at the compartment level in SBML, and of a facility for function declaration in the selected tool, is recommended.
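The MathML-to-string conversion mentioned in the results can be illustrated with a minimal sketch in the spirit of the StoP tool (illustrative only; real SBML kinetic laws use a much larger MathML subset):

```python
import xml.etree.ElementTree as ET

# Minimal MathML-to-infix conversion for a kinetic formula, in the spirit
# of the StoP tool described above (illustrative only).

OPS = {"times": "*", "plus": "+", "minus": "-", "divide": "/"}

def mathml_to_infix(node):
    tag = node.tag.split("}")[-1]  # strip the MathML namespace
    if tag == "math":
        return mathml_to_infix(node[0])
    if tag == "ci":                # identifier (species or parameter)
        return node.text.strip()
    if tag == "cn":                # numeric constant
        return node.text.strip()
    if tag == "apply":             # operator applied to its arguments
        op = OPS[node[0].tag.split("}")[-1]]
        args = [mathml_to_infix(child) for child in node[1:]]
        return "(" + (" %s " % op).join(args) + ")"
    raise ValueError("unsupported MathML element: " + tag)

# Mass-action kinetic law k * S1 * S2, as SBML would encode it:
xml = """<math xmlns="http://www.w3.org/1998/Math/MathML">
  <apply><times/><ci> k </ci><ci> S1 </ci><ci> S2 </ci></apply>
</math>"""
formula = mathml_to_infix(ET.fromstring(xml))
```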

APA, Harvard, Vancouver, ISO, and other styles
11

Gómez, Rodríguez Laura. "A Tool-Supported Method for Fallacies Detection in Process-Based Argumentation." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-40940.

Full text
Abstract:
Process-based arguments aim at demonstrating that a process, compliant with a standard, has been followed during the development of a safety-critical system. Compliance with these processes is mandatory for certification purposes, so the generation of process-based arguments is essential, but also a very costly and time-consuming task. In addition, inappropriate reasoning in the argumentation, such as insufficient evidence (i.e. a fallacious argumentation), may result in a loss of quality of the system, leading to safety-related failures. Therefore, avoiding or detecting fallacies in process-based arguments is crucial. However, the process of reviewing such arguments is currently carried out manually and relies on expert knowledge, making it a very laborious and error-prone task. In this thesis, an approach to automatically generate fallacy-free process-based arguments is proposed and implemented. The solution is composed of two parts: (i) detecting omission-of-key-evidence fallacies in the modelled processes, and (ii) transforming the processes into process-based safety arguments. The former automatically checks whether the process model, compliant with the Software & Systems Process Engineering Metamodel (SPEM) 2.0, contains sufficient information to avoid committing an omission of key evidence fallacy. If fallacies are detected, the functionality provides appropriate recommendations to resolve them. Once the safety engineers/process engineers have modified the process model following the provided recommendations, the second part of the solution can be applied. It automatically generates the process-based argument, compliant with the Structured Assurance Case Metamodel (SACM), and displays it, rendered via Goal Structuring Notation (GSN), in the OpenCert assurance case editor within the AMASS platform. The applicability of the solution is validated in the context of the ECSS-E-ST-40C standard.
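The omission-of-key-evidence check described above can be caricatured as a completeness check over a process model (field names and the evidence rule are invented for illustration; the actual tool operates on SPEM 2.0 models):

```python
# Toy version of the omission-of-key-evidence check: every task in the
# process model must reference input artefacts, output artefacts and a
# responsible role, or a recommendation is emitted. Field names are
# hypothetical, not SPEM 2.0.

REQUIRED = ("inputs", "outputs", "role")

def find_fallacies(process_model):
    recommendations = []
    for task in process_model["tasks"]:
        for field in REQUIRED:
            if not task.get(field):
                recommendations.append(
                    "Task '%s': add missing %s to avoid an omission of "
                    "key evidence fallacy" % (task["name"], field))
    return recommendations

model = {"tasks": [
    {"name": "unit testing", "inputs": ["code"],
     "outputs": ["test report"], "role": "tester"},
    {"name": "code review", "inputs": ["code"], "outputs": [], "role": None},
]}
issues = find_fallacies(model)
```

A fallacy-free model (empty recommendation list) would then be handed to the second part of the solution, the argument generator.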
APA, Harvard, Vancouver, ISO, and other styles
12

Zarev, Veselin [Verfasser], Tom [Gutachter] Schanz, and Maria [Gutachter] Datcheva. "Model identification for the adaption of numerical simulation models / Veselin Zarev ; Gutachter: Tom Schanz, Maria Datcheva." Bochum : Ruhr-Universität Bochum, 2016. http://d-nb.info/1116709708/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Barger, Lynne F. "The model generator: a tool for simulation model definition, specification, and documentation." Thesis, Virginia Polytechnic Institute and State University, 1986. http://hdl.handle.net/10919/74519.

Full text
Abstract:
The Model Generator, one of the automated tools of the Model Development Environment, supports the process of discrete event simulation model development by guiding a modeler through model definition and model specification. This research focuses on the specification process within the Model Generator. A design is proposed and requirements are established for extending an existing generator prototype to incorporate model specification. The specification is obtained interactively by engaging a modeler in a series of dialogues. The modeler's responses are stored in a database that is structured to represent a model specification in the notation prescribed by Condition Specifications. The dialogue has been designed to solicit the specific information required for a Condition Specification. Furthermore, the dialogue has been organized according to levels with each dialogue at a given level responsible for completing the database elements prescribed for that level. The results of initial experimentation with an implementation of the design are positive. The prototype appears capable of producing a Condition Specification while offering broader support to the modeling task in concert with utilization and enforcement of the underlying philosophy imparted by the Conical Methodology.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
14

Obregón, Christian, and Julio Lara. "Landslide Susceptibility Map: A tool for sustainable land management." Pontificia Universidad Católica del Perú. Centro de Investigación en Geografía Aplicada, 2014. http://repositorio.pucp.edu.pe/index/handle/123456789/119875.

Full text
Abstract:
This study aims to show the importance of the Landslide Susceptibility Map as a tool for land-use planning and for risk prevention and mitigation. This is shown through the evaluation of mass movement (MM) processes affecting the upper sector of the El Paraíso gorge, Villa María del Triunfo (Lima, Peru). The work consisted of two phases: in the first (field) phase, the intrinsic geological and geomorphological characteristics were identified. The second phase comprised the generation of the Landslide Susceptibility Map using a multivariate heuristic model based on overlaying variable maps (Carrara et al. 1995; Laín et al. 2005), developed in a GIS environment through map algebra (geoprocessing operations). Overall, the results of the Landslide Susceptibility Map provide geoscience information that will contribute to land management and, in a timely manner, through the development of specific studies, to prevention and/or mitigation measures ensuring the physical stability of the identified critical areas.
El presente estudio tiene por objetivo mostrar la importancia del Mapa de Susceptibilidad a MM, como herramienta para la planificación territorial, prevención y mitigación de riesgos. Para ello, se muestra como ejemplo la evaluación geodinámica del sector alto de la quebrada El Paraíso – Villa María del Triunfo (Lima – Perú).El trabajo consistió de dos fases: en la primera (campo) se identificaron los características intrínsecas de geología y geomorfología. La segunda (gabinete), comprendió la elaboración del mapa de susceptibilidad aplicando el modelo heurístico multivariado que consiste en la superposición de mapas de variables (Carrara et al. 1995; Laín et al. 2005), desarrollado en un entorno SIG a través del álgebra de capas (operaciones de geoprocesamiento).Los resultados del mapa de susceptibilidad de manera general, nos presentan información geocientífica que contribuirá con el ordenamiento territorial (OT); y de manera puntual, con el desarrollo de estudios específicos, medidas de prevención y/o mitigación para asegurar la estabilidad física de las áreas críticas identificadas.
APA, Harvard, Vancouver, ISO, and other styles
15

Atterer, Richard. "Usability Tool Support for Model-Based Web Development." Diss., lmu, 2008. http://nbn-resolving.de/urn:nbn:de:bvb:19-92963.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Shafaghi, Mohammad. "Computer aided tool management system : an implementation model." Thesis, Sheffield Hallam University, 1994. http://shura.shu.ac.uk/3186/.

Full text
Abstract:
In recent years considerable attention has been directed towards devising new strategies to deal with the competitive nature of manufacturing environments. Such strategies are often influenced by the costs and quality of the manufactured products. An effective tool management and control system can significantly contribute to the efficiency of manufacturing facilities by maintaining the flow of production, reducing manufacturing costs, and being instrumental to the quality of finished goods. Most companies, however, have consistently overlooked the importance of tooling and its impact on the efficiency of their manufacturing facilities; consequently it has become a major production bottleneck. Hence the need for uncovering the nature, extent, and underlying causes of tooling problems. Having recognised the importance of a Computer Aided Tool Management And Control System (CATMACS) as a partial solution to the efficient management of tooling resources, the study then looks at the implementation of CATMACS in fourteen manufacturing companies in the UK, developing some 40 propositions. Based on these propositions, a framework for the implementation methodology is constructed. The framework consists of five phases: Tool audit, Strategy, Design, Action, and Review. The framework has been evaluated and the inputs and outputs of the phases have been identified. It represents a significant step in the understanding of CATMACS implementation; in particular:
  • It addresses the need for such system.
  • It provides the basis of an implementation toolkit.
  • It provides guidance for the best way of implementing a CATMACS.
  • It is constructed using hard data.
APA, Harvard, Vancouver, ISO, and other styles
17

Shi, Jianlin. "Model and tool integration in high level design of embedded systems." Licentiate thesis, KTH, Machine Design (Div.), 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4589.

Full text
Abstract:

The development of advanced embedded systems requires a systematic approach as well as advanced tool support in dealing with their increasing complexity. This complexity is due to the increasing functionality that is implemented in embedded systems and stringent (and conflicting) requirements placed upon such systems from various stakeholders. The corresponding system development involves several specialists employing different modeling languages and tools. Integrating their work and the results thereof then becomes a challenge. In order to facilitate system architecting and design integration of different models, an approach that provides dedicated workspaces/views supported by structured information management and information exchange between domain models and tools is required.

This work is delimited to the context of embedded systems design and takes a model-based approach. The goal of the work is to study possible technical solutions for integrating different models and tools, and to develop knowledge, support methods and a prototype tool platform.

To this end, this thesis examines a number of approaches that focus on the integration of multiple models and tools. Selected approaches are compared and characterized, and the basic mechanisms for integration are identified. Several scenarios are identified and further investigated in case studies. Two case studies have been performed with model transformations as focus. In the first one, integration of Matlab/Simulink® and UML2 are discussed with respect to the motivations, technical possibilities, and challenges. A preliminary mapping strategy, connecting a subset of concepts and constructs of Matlab/Simulink® and UML2, is presented together with a prototype implementation in the Eclipse environment. The second case study aims to enable safety analysis based on system design models in a UML description. A safety analysis tool, HiP-HOPS (Hierarchically Performed Hazard Origin and Propagation Studies), is partially integrated with a UML tool where an EAST-ADL2 based architecture model is developed. The experience and lessons learned from the experiments are reported in this thesis.

Multiple specific views are involved in the development of embedded systems. This thesis has studied the integration between system architecture design, function development and safety analysis through using UML tools, Matlab/Simulink, and HiP-HOPS. The results indicate that model transformations provide a feasible and promising solution for integrating multiple models and tools. The contributions are believed to be valid for a large class of advanced embedded systems. However, the developed transformations so far are not really scalable. A systematic approach for efficient development of model transformations is desired to standardize the design process and reuse developed transformations. To this end, future studies will be carried out to develop guidelines for model and tool integration and to provide support for structured information at both meta level and instance level.

APA, Harvard, Vancouver, ISO, and other styles
18

Juriga, Jakub. "Virtuální model části obráběcího stroje v ADAMS." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2012. http://www.nusl.cz/ntk/nusl-230395.

Full text
Abstract:
In its theoretical part, this master's thesis deals with vibrations in cutting machines and describes the theory of the formation of self-excited vibrations. In the practical part, the problem of chatter in a cutting machine is solved using the simulation program Adams and the computing program MATLAB. A multibody system model of the cutting machine and a model of the cutting tool with flexible-body properties are analyzed in turn. Finally, both models are combined into a complex model of the cutting machine.
APA, Harvard, Vancouver, ISO, and other styles
19

Sud, Rajat. "A Synergistic Approach to Software Requirements Generation: The Synergistic Requirements Generation Model (SRGM) and, An Interactive Tool for Modeling SRGM (itSRGM)." Thesis, Virginia Tech, 2003. http://hdl.handle.net/10919/32942.

Full text
Abstract:
The importance of a well-formulated set of software requirements to a successful software development effort has been underscored in recent times. However, the software industry still faces a dearth of process models and artifacts populating the requirements generation process. Those that do exist are often disconnected and narrowly focused, providing little structure to the requirements generation phase. Current methodologies advocate useful guidelines, but do not enforce them. To address these concerns, we introduce the "Synergistic Approach to Software Requirements Generation", an approach composed of two components: a model and an interactive tool implementing the model. The first component is the Synergistic Requirements Generation Model (SRGM). The SRGM is complete with detailed processes spanning the entire software requirements generation phase. These processes have been identified and decomposed into low-level activities with the intent of improving clarity and understanding. The second component is An Interactive Tool for Modeling the SRGM (itSRGM), which codifies the structure dictated by the model. The tool enforces guidelines and provides visual representations of the progression of activities involved in the requirements generation process.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
20

Ball, Kerry Louise. "Exploring professionalism in medical educators : from model to tool." Thesis, University of Winchester, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.503839.

Full text
Abstract:
The aim of this research was to explore professionalism in medical educators, mostly from a primary care background, in a mixed-method research design. Previous research on professionalism has focused on medical students and doctors; however, medical educators are responsible for teaching professionalism to medical students. Professionalism is a complex and developed state, which must be explored in the context of a specific role. This study was an exploratory sequential mixed-method research design with two distinct phases. The first was a qualitative phase involving exploration of the concept of professionalism within the doctor's role as a medical educator. This exploration included a literature review and an open-ended survey on professionalism, which led to the development of a model of professionalism for medical educators. The second phase involved the design and piloting of a tool, the Professional Reflective Enrichment Tool (PRET), that could be used to enhance professionalism in medical educators, using the model developed in phase one to structure the tool's development. The model of professionalism offered a unique insight into the medical educator's role. In this research, a resource to encourage reflection was used to enhance aspects of professionalism. Reflection was encouraged by developing a series of scenarios, based on the model, designed to pose professional dilemmas. Formative feedback was provided based on this reflection. The PRET was piloted using both assessors and users. A high multi-rater reliability was found. The pilot testing used 53 medical educators, 75% of whom were from primary care. A three-stage model of reflective thinking was developed using existing, tested models of reflection to structure formative feedback to the PRET. Qualitative comments indicated that the PRET did promote a state of reflection and that the formative feedback was useful.
This research offers a unique resource to encourage reflective thought and professional development in medical educators. By providing a structure to this thought the educator is able to apply the resource to their own practice, in personal reflection and implicit or explicit teaching methods.
APA, Harvard, Vancouver, ISO, and other styles
21

Rossel, Cid Pedro Osvaldo. "Software product line model for the meshing tool domain." Tesis, Universidad de Chile, 2013. http://www.repositorio.uchile.cl/handle/2250/113113.

Full text
Abstract:
Doctor en Ciencias, Mención Computación
A mesh is a discretization of the geometry of a given domain. Meshes may be composed of different elements: triangles, quadrilaterals, tetrahedra, etc. A meshing tool is an application that allows creating, refining, derefining, improving, smoothing, visualizing, and post-processing meshes and/or particular regions of them, as well as assigning physical values (temperature, concentration, etc.) to the mesh elements. Meshing tools are complex and sophisticated, and building a new one from scratch, or maintaining an existing one, demands enormous effort. There is a need and an opportunity to use new approaches in the development of these tools, so as to reduce both development time and cost without compromising quality. Experience in developing these tools provides the motivation for building new ones by reusing the work done in previous developments. These tools share several features and their variations can be managed systematically. This makes the development of meshing tools a good opportunity to apply the Software Product Line (SPL) approach. Existing SPL processes are general and usually require a series of steps and documentation that are unnecessary in the meshing tool domain. Thus, this thesis proposes an SPL process model specific to this kind of tool. An SPL development process is centered on software reuse and mainly involves two phases: domain engineering (DE) and application engineering (AE). The process presented in this work focuses on two stages of DE: domain analysis (DA) and domain design (DD). In DA, the domain model and the scope of the SPL are defined.
In DD, the product line architecture (PLA) is created; this architecture is valid for and shared by all products in the SPL. A feature model is commonly used to model the domain. In this work, DA also uses a dictionary, scenarios, actions, and goals to provide the rationale behind the construction of the feature model. This thesis presents a rigorous process for obtaining the domain model, which is formalized through consistency and completeness conditions. The scoping process is presented through an activity diagram. In addition, the approach explicitly presents the different products of the SPL, establishing relationships between the products and the features of the SPL, which enables managing product development. The DD stage focuses on creating the PLA, an essential artifact for building SPL products. To that end, this work provides both a deductive and a transformational process. In the former, an explicit PLA is developed using the artifacts produced in DA, and both the structural and the behavioral architectural views are established. Both views are general and allow the representation of any product within the scope of the SPL. In the transformational process, an implicit PLA is developed using transformation rules created from artifacts produced in DA. This process produces the architecture for specific products, and the PLA is defined as the sum of all the product architectures. Both DA and DD are described in detail, and the application of the SPL model is illustrated through a well-documented example in the meshing tool domain, which has a relatively high degree of complexity.
In this example, a formalized domain model is introduced, and the architecture is defined for both the deductive and the transformational processes.
APA, Harvard, Vancouver, ISO, and other styles
22

Hutto, John. "Risk management in law enforcement : a model assessment tool /." View online, 2009. http://ecommons.txstate.edu/arp/301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Khanse, Karan Rajiv. "Development and Validation of a Tool for In-Plane Antilock Braking System (ABS) Simulations." Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/56567.

Full text
Abstract:
Automotive and Tire companies spend extensive amounts of time and money to tune their products through prototype testing at dedicated test facilities. This is mainly due to the limitations in the simulation capabilities that exist today. With greater competence in simulation, comes more control over designs in the initial stages, which in turn lowers the demand on the expensive stage of tuning. The work presented, aims at taking today's simulation capabilities a step forward by integrating models that are best developed in different software interfaces. An in-plane rigid ring model is used to understand the transient response of tires to various high frequency events such as Anti-Lock Braking and short wavelength road disturbances. A rule based ABS model performs the high frequency braking operation. The tire and ABS models have been created in the Matlab-Simulink environment. The vehicle model has been developed in CarSim. The models developed in Simulink have been integrated with the vehicle model in CarSim, in the form of a design tool that can be used by tire as well as vehicle designers for further tuning of the vehicle functional performances as they relate to in-line braking scenarios. Outdoor validation tests were performed to obtain data from a vehicle that was measured on a suspension parameter measuring machine (SPMM) in order to complement this design tool. The results of the objective tests performed have been discussed and the correlations and variations with respect to the simulation results have been analyzed.
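The "rule based ABS model" mentioned above can be illustrated with a minimal sketch. This is an assumption: the thesis's Simulink controller is certainly richer, and the threshold values and function names here are invented for illustration. The classic rule, however, is to cycle brake pressure based on wheel slip:

```python
# Illustrative sketch (not the thesis model): a minimal rule-based ABS
# step that releases brake pressure when wheel slip exceeds a threshold
# and reapplies it when the wheel spins back up.
def slip_ratio(vehicle_speed, wheel_speed):
    """Longitudinal slip: 0 when rolling freely, 1 when fully locked."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def abs_command(vehicle_speed, wheel_speed, release_at=0.2, reapply_at=0.1):
    s = slip_ratio(vehicle_speed, wheel_speed)
    if s > release_at:
        return "release"   # wheel close to lock-up: dump pressure
    if s < reapply_at:
        return "apply"     # wheel rolling freely: build pressure
    return "hold"          # keep current pressure

print(abs_command(30.0, 20.0))   # slip ~0.33 -> 'release'
print(abs_command(30.0, 28.5))   # slip 0.05  -> 'apply'
```

A high-frequency controller like this is exactly the kind of component that excites the transient tire dynamics a rigid ring model is needed to capture.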
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
24

Pensawat, Taweewit. "Real-Time Ethernet Networks Simulation Model." Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-877.

Full text
Abstract:

Real-time networks are traditionally built on proprietary standards, resulting in interoperability issues between different real-time network implementations and the traditional data networks mainly used in back-office operations. Continuity and supplier independence are a cause of concern with current proprietary real-time networks. This project evaluates the capability of providing real-time traffic over switched Ethernet with the EDF scheduling algorithm implemented at both the switch and the node. Using the OMNET simulation tool at packet level, it is shown that the EDF implementation in switched Ethernet can guarantee real-time traffic over the network while at the same time supporting non-real-time traffic.
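The earliest-deadline-first (EDF) policy at the heart of this evaluation is simple to state: whenever the link is free, transmit the queued frame whose absolute deadline is nearest. A minimal sketch, assuming a priority-queue implementation (the class and names are illustrative, not taken from the thesis):

```python
import heapq

class EdfQueue:
    """Minimal earliest-deadline-first (EDF) frame queue: the frame
    whose absolute deadline is closest is always dequeued first."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # insertion counter breaks deadline ties

    def enqueue(self, deadline, frame):
        heapq.heappush(self._heap, (deadline, self._seq, frame))
        self._seq += 1

    def dequeue(self):
        _, _, frame = heapq.heappop(self._heap)
        return frame

q = EdfQueue()
q.enqueue(30, "bulk transfer")    # non-real-time, loose deadline
q.enqueue(5, "control message")   # hard real-time, tight deadline
q.enqueue(12, "sensor sample")
order = [q.dequeue() for _ in range(3)]
print(order)  # ['control message', 'sensor sample', 'bulk transfer']
```

Because best-effort frames can simply be given distant deadlines, the same queue serves real-time and non-real-time traffic, which is the coexistence property the simulation study demonstrates.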

APA, Harvard, Vancouver, ISO, and other styles
25

Beisheim, Maja, and Charline Langner. "Lean Startup as a Tool for Digital Business Model Innovation : Enablers and Barriers for Established Companies." Thesis, Jönköping University, IHH, Företagsekonomi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-52579.

Full text
Abstract:
Background: The rapidly changing world of digital technologies forces many companies to undertake a digital shift by transforming existing business models into digital business models to achieve sustainable value creation and value capture. The need for this digital shift is especially acute for established companies that were successful leaders before the dot-com bubble (1995-2000) and whose business models have since been threatened by the emergence of digital technologies. We refer to this digitization of business models as digital business model innovation. However, the adoption and implementation of digital technologies often require tremendous changes and can thus be challenging for established companies. Therefore, agile methods and business experimentation have become important strategic elements and are used to generate and test novel business models quickly. We introduce lean startup as an agile method for digital business model innovation, which has proven successful in digital entrepreneurship; how to use lean startup in established companies for successful digital business model innovation, however, requires further empirical investigation. Purpose: The purpose of our study is to identify enablers and barriers of lean startup as a tool for digital BMI in established companies. Thus, we propose a framework showing how established companies can be successful in digital business model innovation by using lean startup. Method: We conducted exploratory, qualitative research based on grounded theory following an abductive approach. Using a non-probability, purposive sampling strategy, we gathered our empirical data through ten semi-structured interviews with experts in lean startup and digital business model innovation, working in or with established companies shifting their business models towards digital business models.
By using grounded analysis, we gained an in-depth understanding of how lean startup is used in practice as well as occurring barriers and enablers for established companies. Conclusion: We emphasize that successful use of lean startup for digital business model innovation is based on an effective (1) lean startup management, appropriate (2) organizational structures, fitting (3) culture, and dedicated (4) corporate governance, which all require and are based on solid (5) methodical competence of the entire organization. Furthermore, (6) external influences such as market conditions, role of competition, or governance rules indirectly affect using lean startup as a tool for digital business model innovation.
APA, Harvard, Vancouver, ISO, and other styles
26

Kunert, Sebastian. "Wirkungsanalyse kognitiver Lernwerkzeuge." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2011. http://dx.doi.org/10.18452/16431.

Full text
Abstract:
Ziel dieser Arbeit ist es, die Wirkung von Lernsoftware durch eine Kausalkette vom verstehenden Lernen (Edelmann, 1996) über generative Aktivitäten (Wittrock, 1990) hin zu computergestützten kognitiven Lernwerkzeugen (Jonassen, 1992)in seiner Gesamtheit empirisch nachzuweisen. Dazu wurde eine digitale Lern- und Testumgebung geschaffen, in deren Mittelpunkt ein Beschriftungswerkzeug, ein Operatorentool sowie eine Simulation standen. Mit ihrer Hilfe sollten Aufbau, Funktionsweise und Bedienung einer einfachen technischen Anlage erlernt werden. Erhoben wurden Maße des Lernprozesses (Zeit, Eingaben) und des Lernerfolges (Wissenstest, Handlungsaufgaben). Im Rahmen eines klassischen Lernexperiments im Labor wurde in einer ersten Testreihe kein oder eins der 3 Tools zum freiwilligen Gebrauch angeboten. Die Ergebnisse zeigen, dass ein jedes Lernwerkzeug seine Nutzer in zusätzliche generative Aktivitäten verwickelt. Darüber hinaus bewirken sie auf Grund der Interaktionsmöglichkeiten, der repräsentierbaren Inhalte sowie der Visualisierungsform eine Aufmerksamkeitsfokussierung auf einzelne Lerninhalte. Demzufolge verbessert sich die Güte des mentalen Modells toolbedingt vor allem in entsprechenden Teilfacetten. Der statistische Vergleich dieser Ergebnisse mit älteren Daten einer parallelen Testreihe (Wipper, 2004) ergab keine bedeutenden Unterschiede, was auf eine Allgemeingültigkeit dieses Effekts hinweist. Im Rahmen einer zweiten Studie wurden die 3 Werkzeuge in einer Lernumgebung kombiniert angeboten. Im Resultat ergänzten sich die fokussierenden Wirkungen der Einzelapplikationen additiv zu einem ausgeglichenen mentalen Modell. Darüber hinaus war das Leistungsniveau jener Nutzer in allen Maßen des Lernprozesses und -erfolgs konstant höher als das aller anderen Probanden. Die Ergebnisse bestätigen die bereits existierenden Hinweise auf die eingangs erwähnte Kausalkette bzgl. des verstehenden Lernens mit Hilfe eines Computers.
In the light of instructional psychology, the outcome of computer-based learning aids can be described as a chain of causation consisting of constructivist forms of knowledge acquisition (Edelmann, 1996), generative processes (Wittrock, 1990) and digital mind tools (Jonassen, 1992). The aim of the present work is to prove this causal assumption empirically. Therefore, a computer-based learning and testing environment was set up. It is based on three digital mind tools, which are supposed to help in learning the construction, functionality and handling of a simple deterministic plant for soap production. Additionally, paper & pencil were provided. The measured variables cover the learning process (e.g. time, inputs) and the learning outcome (multiple-choice test, operating test). In a first series of 4 laboratory experiments, 109 students were given none or one of the digital mind tools. As a result, all three tools initiate new generative processes, but the mental models of their users are not more sophisticated as a whole. There was improvement in specific areas, depending on characteristics of the tools (interaction modes, representable content, and visualisation). The findings were statistically compared to older data from a parallel test series (Wipper, 2004), but only marginal differences occurred. In a further laboratory experiment, 98 students were given a combination of the three digital mind tools mentioned above. This hybrid learning environment initiates new generative processes as well. Because the focusing effects caused by the individual tools combine additively, the mental models of its users are comprehensively more sophisticated. Moreover, those participants consistently reach much higher scores in all measures than the other subjects. The findings confirm the causal chain mentioned at the beginning.
APA, Harvard, Vancouver, ISO, and other styles
27

Cziulik, Carlos. "Development of a computer evaluation model for assessing mechanical systems conceptual design alternatives." Thesis, University of Surrey, 1998. http://epubs.surrey.ac.uk/843915/.

Full text
Abstract:
The focus of this thesis is the development of a conceptual design evaluation model that can be used in engineering design and implemented as a computer tool. A prerequisite to achieving this objective is a proper understanding of the initial phases of the design process, using an adequate framework. Hence, a brief examination of the Theory of Technical Systems is presented, associated with a comprehensive study of the conceptual design stage based on academic design methodologies and a survey amongst British industries. Additionally, evaluation issues at the early phases of design and a number of approaches for evaluating alternative solutions are investigated, and relevant characteristics to be included in a prospective conceptual design evaluation model are compiled. A novel evaluation model based on function metrics is proposed. The approach provides an intermediate evaluation, indicating which solutions have the potential to progress further in the design process. The core of the model is the composition of evaluation matrices and the computation of partial indices, which yield an overall index used to classify the alternatives. The model assumes the existence of an explicit function structure on which the development of the organ/component structure is going to be based. A unique feature of this model is that it does not depend on designers' preferences or judgement in assigning values. From the formalised solution the designer has to identify which organ/component implements which function. An initial prototype of a computer tool (LiberSolutio), which embodies the above model, is presented. In addition to being an evaluation system, LiberSolutio can record the design history of the set of solutions generated for a particular functional decomposition/structure. A preliminary evaluation of the model and computer system is also presented, with conclusions drawn from the results obtained.
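The idea of turning a function/component mapping into a ranking index can be sketched in a few lines. This is a heavily hedged illustration: the thesis's actual matrices and partial indices are not specified here, so the binary matrix, the coverage-based `overall_index`, and the two alternatives below are invented purely to show the shape of such a computation.

```python
# Hypothetical sketch (not the thesis's metric): matrix[i][j] = 1 if
# component j of an alternative implements required function i. The
# overall index is the fraction of functions covered by at least one
# component; alternatives are ranked by that index.
def overall_index(matrix):
    covered = sum(1 for row in matrix if any(row))
    return covered / len(matrix)

alt_a = [[1, 0], [0, 1], [0, 0]]  # implements 2 of 3 functions
alt_b = [[1, 1], [0, 1], [1, 0]]  # implements all 3 functions
ranking = sorted({"A": alt_a, "B": alt_b}.items(),
                 key=lambda kv: overall_index(kv[1]), reverse=True)
print([name for name, _ in ranking])  # ['B', 'A']
```

Note that, like the model described above, such an index depends only on which component implements which function, not on designer-assigned preference weights.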
APA, Harvard, Vancouver, ISO, and other styles
28

Zikra, Iyad. "Integration of Enterprise Modeling and Model Driven Development : A Meta-Model and a Tool Prototype." Licentiate thesis, Stockholms universitet, Institutionen för data- och systemvetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-103698.

Full text
Abstract:
The use of models for designing and developing Information Systems (IS) has changed in recent years. Models are no longer considered a peripheral documentation medium that is poorly maintained and often neglected. Rather, models are increasingly seen as essential parts of the final product—as central artifacts that drive and guide the development efforts. The knowledge that modelers rely on when designing models is represented as formal models and clearly defined rules for transforming the models. The flexibility, reliability, and effectiveness offered by the formal models and the transformations are making Model Driven Development (MDD) a popular choice for building IS. Models also serve in describing enterprise design, where enterprise-level models capture organizational knowledge and aid in understanding, improving, and growing the enterprise. Enterprise Modeling (EM) offers a structured and unified view of the enterprise, thereby enabling more informed and accurate decisions to be made. Many MDD approaches have been proposed to tackle a wide range of IS-related issues, but little attention is being paid to the source of the knowledge captured by the IS models. EM approaches capture organizational knowledge and provide the necessary input and underlying context for designing IS. However, the results produced by EM approaches need to be manually analyzed by modelers to create the initial MDD model. This interruption of the MDD process represents a gap between enterprise models and MDD models. Limited research has been done to connect EM to MDD in a systematic and structured manner based on the principles of model-driven development. This thesis proposes a unifying meta-model for integrating EM and MDD. The meta-model captures the inherent links that exist between organizational knowledge and IS design. This helps to improve the alignment between organizational goals and the IS that are created to support them. 
The research presented herein follows the guidelines of the design science research methodology. It starts with a state-of-the-art survey of the current relationship between MDD and prior stages of development. The findings of the survey are used to elicit a set of necessary properties for integrating EM and MDD. The unifying meta-model is then proposed as the basis for an integrated IS development approach that applies the principles of MDD and starts on the enterprise level by considering enterprise models in the development process. The design of the meta-model supports the elicited integration properties. The unifying meta-model is based on the Enterprise Knowledge Development (EKD) approach to EM. A prototype tool is developed to support the unifying meta-model, following a study to choose a suitable implementation environment. The use of the unifying meta-model is demonstrated through the implemented tool platform using an example case study, revealing its advantages and highlighting the potential for improvement and future development.
APA, Harvard, Vancouver, ISO, and other styles
29

Oliveira, Perla Novais de. "Transformação genética de tomate 'Micro-Tom' com o gene enhanced disease susceptibility 5 (EDS5) isolado de Citrus sinensis." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/11/11144/tde-02022016-155536/.

Full text
Abstract:
In recent years, citrus growing has faced serious phytosanitary problems, particularly regarding the economic viability of disease control. The bacterium Candidatus Liberibacter spp. is associated with HLB, the main disease limiting citrus production. Many researchers have therefore turned their attention to identifying target genes in the host response to this pathogen for use in genetic improvement. Methods for the genetic transformation of citrus plants are essential to this effort, but characteristics inherent to the species limit its in vitro cultivation and require long periods for growth and propagation. This makes studies in model plants important, especially for gene-validation protocols. Accordingly, the EDS5 gene isolated from Citrus sinensis, associated with the Systemic Acquired Resistance (SAR) mechanism, was overexpressed through genetic transformation in tomato (Solanum lycopersicum L. 'Micro-Tom'). After growth of the regenerated shoots, positive plants were identified by GUS and PCR analyses. Homozygous transgenic lines were obtained by evaluating resistance to the antibiotic kanamycin.
30

Moraes, Tatiana de Souza. "Transformação genética de tomateiro (Solanum lycopersicum cv. 'Micro-Tom') e de laranja doce (Citrus sinensis L. Osbeck) com o gene Csd1 (superóxido dismutase do cobre e do zinco), isolado de Poncirus trifoliata." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/11/11144/tde-14122015-103829/.

Full text
Abstract:
Although the citrus industry is an important economic activity in Brazil, in recent years there has been a significant reduction in national production. The low profitability the citrus sector faces, driven by high production costs, is mainly attributed to phytosanitary problems, particularly diseases that directly affect orchard productivity. Currently, huanglongbing (HLB) is the most serious disease affecting the global citrus industry, causing severe damage in all citrus varieties. Genetic transformation is an alternative for obtaining transgenic plants carrying genes that stimulate the plant defense system, making plants resistant to disease. Despite the effectiveness of existing protocols for the genetic transformation of citrus, a characteristic disadvantage of perennial plants is their slow reproductive cycle, which makes the validation of new genes of interest difficult and time-consuming. An important strategy is therefore the use of model plants, such as tomato, which has a short cycle and good genetic transformation efficiency. The objective of this study was to obtain transgenic plants of Solanum lycopersicum cv. 'Micro-Tom' and Citrus sinensis containing a gene construct with the Csd1 gene (copper/zinc superoxide dismutase), isolated from Poncirus trifoliata, for validation and future study of this gene with respect to HLB resistance. The protein encoded by Csd1, also known as SOD1 (superoxide dismutase 1), is among the most powerful antioxidants in nature and is an important constituent of cellular defense against the oxidative stress caused by bacterial infection. 'Micro-Tom' tomato was used as a pathogenicity model for gene validation; however, due to its low genetic transformation efficiency, the inoculation experiments with the pathogen were not performed. The characterization of Csd1 function in relation to HLB will later be carried out with the transgenic citrus plants. 
Transgenic plants of tomato and sweet orange were identified by PCR analysis using primers for detection of the Csd1 gene. The PCR-positive plants were acclimatized and transferred to a greenhouse. The genetic transformation efficiencies of 'Micro-Tom' tomato and the sweet orange cultivars 'Hamlin' and 'Pineapple' were 0.34%, 4.74%, and 3.65%, respectively. Molecular characterization by Southern blot and RT-qPCR analyses was performed only on the citrus plants. Transgene integration was confirmed in 32 events, with the number of insertions ranging from 1 to 5; the endogenous Csd1 gene was located at 3 distinct sites in the plant genome. The transgene mRNA level was verified in 21 plants carrying a single transgene insertion. The results show that the Csd1 gene was transcribed in the transgenic plants as well as in the non-transgenic control. The relation between transgene transcript level and plant resistance to the pathogen will be determined after inoculation with Candidatus Liberibacter.
31

Kim, Taejung 1969. "Time-optimal CNC tool paths : a mathematical model of machining." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/8861.

Full text
Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2001.
Includes bibliographical references (p. 181-188).
Free-form surface machining is a fundamental but time-consuming process in modern manufacturing. The central question of this thesis is how to reduce the time a 5-axis CNC (Computer Numerical Control) milling machine needs to sweep an entire free-form surface in its finishing stage. We formulate a non-classical variational time-optimization problem defined on a 2-dimensional manifold, subject to both equality and inequality constraints, with machining time as the cost functional. We seek a preferable vector field on the surface to obtain skeletal information about the tool paths. This framework is more amenable to the techniques of continuum mechanics and differential geometry than to path generation and conventional CAD/CAM (Computer Aided Design and Manufacturing) theory. After the formulation, the thesis derives the necessary conditions for optimality. We decompose the problem into a series of optimization problems defined on 1-dimensional streamlines of the vector field, which simplifies the problem significantly. Anisotropy in kinematic performance has practical importance in high-speed machining; the greedy scheme, implemented here for a parallel hexapod machine tool, exploits this anisotropy to find a preferable vector field.
Numerical integration places tool paths along the integral curves of this field. The gaps between neighboring tool paths are controlled so that the surface can be machined within a specified tolerance. A conservation law, together with the characteristic theory of partial differential equations, is used to find appropriately spaced tool paths while avoiding unnecessarily overlapping areas. Since the greedy scheme is based on a local approximation and does not search for the global optimum, it is necessary to judge how well the greedy paths perform. We develop an approximation theory and use it to economically evaluate the performance advantage of the greedy paths over other standard schemes. This thesis achieves two objectives: laying down a theoretical basis for surface machining and finding a practical solution to the machining problem. Future work will address solving the optimization problem in a stricter sense.
by Taejung Kim.
Ph.D.
32

Fay, TH, and JC Greeff. "A three species competition model as a decision support tool." Elsevier, 2007. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1000167.

Full text
Abstract:
An overcrowding problem of nyala, and lately also of impala, in the Ndumo Game Reserve, South Africa, has been detrimental to other species and to vegetation structures over a period of two decades. In the present study a deterministic model for three competing species (where two species tend to be overpopulated while the third faces probable localized extinction) is constructed, and future trends coupled with their coexistence are projected. On a mathematical basis, we seek reasons for the failure of the cropping strategies implemented by management over the last two decades, and suggest alternative, scientifically based approaches to calculating cropping quotas that ensure the future coexistence of all three species. A system of three first-order nonlinear differential equations is used, with parameter values based on field data and the opinions of specialist ecologists. The effect of various cropping strategies, and of introducing a fourth species (man as a predator) into the system, is investigated mathematically. This model was implemented as a harvesting strategy in 2002 and is being tested continuously. Final assessment can only be made over a 10–15-year period, but so far indications are promising.
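The kind of system this abstract describes can be sketched as three coupled logistic equations with per-species cropping terms. The code below is a minimal illustrative stand-in, not the authors' calibrated Ndumo model: all parameter values are invented, and a forward-Euler integrator stands in for whatever solver the study used.

```python
def simulate(pops, r, K, a, harvest, dt=0.01, steps=5000):
    """Forward-Euler integration of a three-species competition system
    with per-species harvesting (cropping) rates.

    pops: initial populations; r: growth rates; K: carrying capacities;
    a[i][j]: competitive effect of species j on species i;
    harvest: proportional cropping rate per species (illustrative values)."""
    x = list(pops)
    for _ in range(steps):
        dx = []
        for i in range(3):
            # logistic growth reduced by interspecific competition, minus cropping
            comp = sum(a[i][j] * x[j] for j in range(3) if j != i)
            dx.append(r[i] * x[i] * (1.0 - (x[i] + comp) / K[i]) - harvest[i] * x[i])
        # clamp at zero: populations cannot go negative
        x = [max(x[i] + dt * dx[i], 0.0) for i in range(3)]
    return x
```

With strong competition pressure on the third species, the simulation reproduces the qualitative situation in the abstract: two species thrive while the third declines toward localized extinction unless cropping quotas relieve the pressure.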
33

Vaterlaus, Austin C. "Development of a 3D Computational Vocal Fold Model Optimization Tool." BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8468.

Full text
Abstract:
One of the primary objectives of voice research is to better understand the biomechanics of voice production and how changes in the properties of the vocal folds (VFs) affect voice ability and quality. Synthetic VF models provide a way to observe how changes in geometry and material properties affect voice biomechanics. This thesis evaluates an approach that uses a genetic algorithm to design synthetic VF models in three ways: first, through the development of a computationally cost-effective 3D vocal fold model; second, by creating and optimizing a variation of this model; and third, by validating the approach. To reduce computation times, a user-defined function (UDF) was implemented in low-fidelity 2D and 3D computational VF models. The UDF replaced the conventional meshed fluid domain with the mechanical energy equation. The UDF was implemented in the commercial finite element code ADINA and verified to produce results similar to those of 2D and 3D VF models with meshed fluid domains. Computation times were reduced by 86% for 2D VF models and 74% for 3D VF models, while changes in core vibratory characteristics were less than 5%. The results from using the UDF demonstrate that computation times could be reduced while still producing acceptable results. A genetic algorithm optimizer was developed to study the effects of altering geometry and material elasticity on frequency, closed quotient (CQ), and maximum flow declination rate (MFDR). The objective was to achieve frequency and CQ values within the normal human physiological range while maximizing MFDR. The resulting models enabled an exploration of trends between objective and design variables. Significant trends and aspects of model variability are discussed. The results demonstrate the benefit of using a structured model exploration method to create models with desirable characteristics. Two synthetic VF models were fabricated to validate predictions made by models produced by the genetic algorithm. 
Fabricated models were subjected to tests where frequency, CQ, and sound pressure level were measured. Trends between computational and synthetic VF model responses are discussed. The results show that predicted frequency trends between computational and synthetic models were similar, trends for closed quotient were inconclusive, and relationships between MFDR and sound pressure level remained consistent. Overall, while discrepancies between computational and synthetic VF model results were observed and areas in need of further study are noted, the study results provide evidence of potential for using the present optimization method to design synthetic VF models.
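The genetic-algorithm machinery mentioned in this abstract can be sketched in a few lines. The code below is a generic real-coded GA (tournament-free elitist selection, blend crossover, Gaussian mutation) minimising an arbitrary objective; it is not the thesis's optimizer, and the objective in the test is a stand-in for the penalty form of the frequency/CQ/MFDR goals.

```python
import random

def genetic_search(objective, bounds, pop_size=30, gens=40, seed=1):
    """Minimal elitist real-coded genetic algorithm (illustrative only).

    objective: callable to minimise; bounds: list of (lo, hi) per variable."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        # keep the better half of the population (elitism)
        elite = sorted(pop, key=objective)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            child = []
            for (lo, hi), x1, x2 in zip(bounds, p1, p2):
                x = x1 + rng.random() * (x2 - x1)        # blend crossover
                x += rng.gauss(0.0, 0.05 * (hi - lo))    # Gaussian mutation
                child.append(min(hi, max(lo, x)))        # respect design bounds
            children.append(child)
        pop = elite + children
    return min(pop, key=objective)
```

Because the elite survive each generation unchanged, the best design never degrades, which is the property that makes such a search a usable structured model-exploration method.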
34

Vázquez, Fernandez Javier. "BPEL with explicit data flow model, editor, and partitioning tool /." [S.l. : s.n.], 2007. http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-32286.

Full text
35

Busch, Benjamin C. "Cognitive bargaining model an analysis tool for third party incentives?" Thesis, Monterey, California : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Dec/09Dec%5FBusch.pdf.

Full text
Abstract:
Thesis (M.A. in Security Studies (Defense Decision-Making))--Naval Postgraduate School, December 2009.
Thesis Advisor(s): Looney, Robert. Second Reader: Tsypkin, Mikhail. "December 2009." Description based on title screen as viewed on January 29, 2010. Author(s) subject terms: Inducements, bargaining, war, Ukraine, Russia, denuclearization, Prospect Theory, rational choice, cognitive, model, bargaining and war. Includes bibliographical references (p. 75-80). Also available in print.
36

Zheng, Yue. "Driver model for a software in the loop simulation tool." Thesis, KTH, Skolan för industriell teknik och management (ITM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-265668.

Full text
Abstract:
For this project, a Software-In-the-Loop (SIL) simulation tool used at Scania ("VTAB" – Virtual Truck and Bus) simulates the submodels of the mechanical vehicle components together with the real control units. The simulation tool contains the following submodels: engine model, drivetrain model, drive cycle model, restbus model, and driver model. The simulated human-driver submodel in the restbus model outputs two pedal control signals to the control unit, namely the gas and brake pedals. From these two pedal signals, the control unit decides the modes of the mechanical vehicle components. The driver model needed to be reworked to obtain better velocity-following performance. Two controllers, fuzzy PI anti-windup and backward calculation, are implemented in the driver model and compared on velocity-tracking accuracy and pedal switching frequency. In the comparison and analysis section, two different drive cycles and two payload weights are simulated. The simulation results demonstrate that both controllers improve the driver model's velocity-tracking accuracy. Furthermore, the fuzzy PI anti-windup controller is better when considering pedal-signal fluctuation frequency and implementation complexity.
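The anti-windup idea being compared can be illustrated with a plain (non-fuzzy) PI controller using back-calculation: when the commanded pedal signal saturates, the saturation excess is fed back to bleed off the integrator. Everything below is a simplified stand-in — the plant is a toy first-order model and all gains are invented, not Scania's VTAB submodels.

```python
def pi_antiwindup(setpoints, kp=0.8, ki=0.5, kb=0.3, dt=0.1):
    """PI controller with back-calculation anti-windup driving a toy
    first-order plant; returns the plant output trajectory."""
    integ, y = 0.0, 0.0
    out = []
    for sp in setpoints:
        e = sp - y
        u = kp * e + ki * integ
        u_sat = max(0.0, min(1.0, u))       # actuator limit: pedal in [0, 1]
        # back-calculation: the term kb*(u_sat - u) bleeds the integrator
        # whenever the command saturates, preventing windup
        integ += dt * (e + kb * (u_sat - u))
        y += dt * (u_sat - 0.2 * y)         # toy first-order vehicle response
        out.append(y)
    return out
```

Without the `kb` term the integrator keeps accumulating while the pedal is pinned at its limit, producing the large overshoot and pedal chatter that anti-windup schemes are meant to suppress.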
37

Pavlík, Jan. "Problematika rychlé automatické výměny nástrojů u obráběcích strojů." Doctoral thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2011. http://www.nusl.cz/ntk/nusl-233983.

Full text
Abstract:
This dissertation addresses the problem of Automatic Tool Change (ATC) for milling centres, a very current topic. It comprises comprehensive research into the various types of ATC by machine type, together with a survey of the current tool interfaces closely related to this problem. A comprehensive analysis of the collected data identified both the main factors affecting the topic and the areas developed in this work. The most problematic part of the machinery currently used for automatic tool change was found to be the manipulator that performs the exchange between spindle and tool storage. Its kinematic structure and the design of its individual elements significantly affect the dynamics of the entire tool-change process, so the work focuses on the manipulator. The main results of this thesis are the designs of several types of manipulators, some of which were realised as test stands within project 1.2.4 of the Brno RCMT department. Simulation models were developed for selected manipulator types and verified by measurement on the real stand. On this basis, another key issue was identified, related not only to the exchange process but to the broader problem of spindle service life: the gripping force during tool pick-up. The work outlines possibilities for further research into the effect of these forces on spindle service life (especially the life of the spindle bearing groups).
38

Inforzato, Diego José. "Estudo do comportamento dos aços ferramenta Thyrotherm 2999 EFS supra e H13 sob fadiga de baixo ciclo a altas temperaturas." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/88/88131/tde-04012011-141307/.

Full text
Abstract:
This work presents a comparative investigation of the behavior of the tool steels H13 and THYROTHERM 2999 EFS SUPRA, intended for hot-forming dies, when subjected to high-temperature low-cycle fatigue (HTLCF). From their tempering curves, three working hardnesses were defined for each material (42, 52, and 58 HRC), corresponding to three distinct tempering temperatures and three study conditions, in search of the optimum condition for this type of application and to assess the influence of the initial hardness on component life. The isothermal fatigue test temperature was set at 400°C, corresponding to the die's service temperature, i.e., a typical critical temperature that forging dies reach in service. Tensile tests were then performed for each material at room temperature and at the defined working temperature. These tests yielded the first monotonic parameters of the materials, among them a prediction of the strain levels to be used in the fatigue tests (0.5, 0.6, 0.7, 0.8, 0.9, 1.0, and 1.1%) and the parameters E, k, n, σe, σ'f, ε'f, b, and c, which allowed ε−N curves to be drawn with an existing estimation model. Finally, isothermal low-cycle fatigue tests were performed at 400°C, and the results were used to construct ε−N curves, leading to a proposed fatigue-strength prediction model specific to the materials investigated.
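The parameters σ'f, ε'f, b, and c listed in this abstract are those of the standard Basquin–Coffin–Manson strain-life relation, Δε/2 = (σ'f/E)(2Nf)^b + ε'f(2Nf)^c, which splits the total strain amplitude into elastic and plastic parts. The sketch below uses illustrative parameter values only; the fitted constants for H13 and THYROTHERM 2999 are in the thesis, not reproduced here.

```python
def strain_amplitude(n_f, E=210e3, sf=1800.0, b=-0.09, ef=0.7, c=-0.6):
    """Basquin-Coffin-Manson strain-life relation: total strain amplitude
    at n_f cycles (2*n_f reversals). Units: stresses in MPa, strain
    dimensionless. All parameter defaults are invented for illustration."""
    elastic = (sf / E) * (2.0 * n_f) ** b   # Basquin (elastic) term
    plastic = ef * (2.0 * n_f) ** c         # Coffin-Manson (plastic) term
    return elastic + plastic
```

At low lives the plastic term dominates (the low-cycle regime tested at 400°C), while at long lives the curve flattens toward the elastic term — the familiar two-slope shape of an ε−N plot on log axes.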
39

Costa, Carlos Alberto. "Product range models in injection mould tool design." Thesis, Loughborough University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.327657.

Full text
40

Statham, Craig G. "An open CNC interface for intelligent control of grinding." Thesis, Liverpool John Moores University, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313100.

Full text
41

Zhu, Qinwei. "5SGraph: A Modeling Tool for Digital Libraries." Thesis, Virginia Tech, 2002. http://hdl.handle.net/10919/35832.

Full text
Abstract:
The high demand for building digital libraries by non-experts requires a simplified modeling process and rapid generation of digital libraries. To enable rapid generation, digital libraries should be modeled with descriptive languages. A visual modeling tool would be helpful to non-experts so they may model a digital library without knowing the theoretical foundations and the syntactical details of the descriptive language. In this thesis, we describe the design and implementation of a domain-specific visual modeling tool, 5SGraph, aimed at modeling digital libraries. 5SGraph is based on a metamodel that describes digital libraries using the 5S theory. The output from 5SGraph is a digital library model that is an instance of the metamodel, expressed in the 5S description language (5SL). 5SGraph presents the metamodel in a structured toolbox, and provides a top-down visual building environment for designers. The visual proximity of the metamodel and instance model facilitates requirements gathering and simplifies the modeling process. Furthermore, 5SGraph maintains semantic constraints specified by the 5S metamodel and enforces these constraints over the instance model to ensure semantic consistency and correctness. 5SGraph enables component reuse to reduce the time and efforts of designers. The results from a pilot usability test confirm the usefulness of 5SGraph.
Master of Science
42

Miller, Lee Norris. "Design and Analysis of a Compact, Economical, Multi-axis, Multi-tasking, Small Part Machine Tool." VCU Scholars Compass, 2005. http://scholarscompass.vcu.edu/etd_retro/157.

Full text
Abstract:
Manufacturers around the world are increasingly challenged to make components that are becoming smaller, more precise, more complex, and comprised of many more features. When manufactured components require precision, when they are very complex, or when they have multiple features, especially three dimensional features, manufacturers must often resort to machining. Machining smaller parts, however, can be particularly challenging. In this thesis, the issues associated with machining small parts are examined and a brief overview of the equipment available to machine small parts is considered. A machine tool design is developed which addresses many of the limitations associated with the Swiss-type screw machine. The proposed design is then "virtually prototyped" as a solid model in SolidWorksTM, machining forces are calculated, and the effects of the machining forces on the machine tool design are analyzed utilizing COSMOSWORKSTM FEA software and standard industrial formulas for calculating machine tool component service lives and safety factors. Values for all of the design metrics as well as all of the component service lives were found to meet or exceed their target values, thus the machine tool, if manufactured, is expected to perform robustly and to function as desired.
43

Almamy, Jeehan. "An evaluation of Altman's Z score using cash flow ratio as analytical tool to predict corporate failure amid the recent financial crisis in the UK." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13735.

Full text
Abstract:
One of the most important threats to firms today, regardless of their nature of operation, size, or longevity, is insolvency. Empirical evidence shows that in the past two decades business failures have occurred at a higher rate than at any time since the 1930s. Many business failure studies have been conducted over time using financial ratios as inputs and traditional statistical techniques, and some examined whether cash flow information improves the prediction of business failure. Most recently, researchers have employed discriminant analysis for business failure prediction. Given recent changes in the world, in which unstable environments cause more firms than ever to fail, there is an increasing need to predict business failure. To date there have been few failure-prediction studies of UK firms, and even in other countries relatively little research has been done in this field. This study therefore investigates an extension of Altman's (1968) original model for predicting the health of UK firms, using discriminant analysis and performance ratios to test which ratios are statistically significant in predicting firm health, on a sample of 90 failed and 1,000 non-failed UK industrial firms from 2000 to 2013. The main purpose of this study is to contribute to Altman's (1968) original Z-score model by adding a new variable (a cash flow ratio). The study found that cash flow, when combined with Altman's original variables, is highly significant in predicting the health of UK general firms. A J-UK model was developed to test the health of UK firms. Compared with the re-estimated Altman original model in the UK context, the predictive power of the model was 82.9%, which is consistent with Taffler's (1982) UK model. 
Furthermore, the predictive power of the model was tested before, during, and after the financial crisis; the results show that the J-UK model had higher accuracy in predicting the health of UK firms than the re-estimated Altman original model. Finally, the study shows that liquidity, profitability, leverage, and capital turnover ratios are significant in predicting failure, with liquidity and profitability contributing most to the results of both the re-estimated Altman original model and the J-UK model. This study has implications for decision-makers: regulatory bodies and practitioners should take into account the ratios that contributed most to the model, as these can serve as early warning signals for corrective action.
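For reference, Altman's original (1968) discriminant function for public manufacturing firms — the model this thesis extends — can be computed as below. The coefficients and cut-offs (distress below 1.81, safe above 2.99) are Altman's published values; the cash-flow-augmented J-UK coefficients estimated in the thesis are not reproduced here, so this sketch covers only the classic five-ratio form.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman's (1968) Z-score. Inputs are the five ratios:
    working capital / total assets, retained earnings / total assets,
    EBIT / total assets, market value of equity / total liabilities,
    and sales / total assets."""
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

def zone(z):
    """Classify a score using Altman's original cut-off values."""
    if z < 1.81:
        return "distress"
    if z > 2.99:
        return "safe"
    return "grey"
```

The thesis's J-UK extension adds a cash flow ratio as a sixth discriminating variable; its weight would come from re-estimating the discriminant function on the UK sample, not from the numbers above.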
APA, Harvard, Vancouver, ISO, and other styles
44

Hui, Ben Bunny Chun Bun Graduate School of Biomedical Engineering Faculty of Engineering UNSW. "A parameter optimisation tool for excitable cell mathematical models based on CellML." Awarded By:University of New South Wales. Graduate School of Biomedical Engineering, 2009. http://handle.unsw.edu.au/1959.4/41447.

Full text
Abstract:
Mathematical models are often used to describe and, in some cases, predict excitable cellular behaviour based on observed experimental results. With the increase in computational power, it is now possible to solve such models in a relatively short time. This, along with increasing knowledge of cellular and subcellular processes, has led to the development of a large number of complex cellular models capable of describing a broad range of excitable cell behaviour. But the use of complex models can also lead to problems. Most models can accurately reproduce results associated with the data on which they are based. However, results from complicated models, with large numbers of variables and parameters, are less reliable if the model is not placed under the same physiological conditions as those defined by the model author. In order to test a model's suitability and robustness over a range of physiological conditions, one needs to fit model parameters against experimental data observed under those conditions. By using the modelling standard and repository offered by CellML, model users can easily select and adapt a large number of models to set up their own applications to fit model parameters against user-supplied experimental data. However, there is currently a lack of software that can utilise CellML models for parameter fitting. In this thesis, a Java-based utility has been developed, capable of performing least-squares parameter optimisation for a wide range of CellML models. Using the developed software, a number of parameter fits and identifiability analyses were performed on a selected group of CellML models. It was found that most of the models were ill-formed, with larger numbers of parameters worsening model identifiability. In some cases, the use of multiple datasets and different objective functions can improve model identifiability.
Finally, the developed software was used to perform parameter optimisation against two sets of action potentials from a sinoatrial node experiment, in the absence and presence of E9031, a specific ion channel blocker.
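The least-squares parameter fitting described above can be illustrated independently of CellML. The sketch below fits a hypothetical two-parameter exponential model to synthetic "experimental" data by minimising the sum of squared residuals with a naive shrinking-step coordinate search; the thesis's Java utility and its actual optimiser are not reproduced here.

```python
import math

def sse(params, t_data, y_data):
    """Sum of squared residuals for the model y = a * exp(-k * t)."""
    a, k = params
    return sum((y - a * math.exp(-k * t)) ** 2 for t, y in zip(t_data, y_data))

def fit_least_squares(t_data, y_data, a0=1.0, k0=1.0, rounds=60):
    """Naive least-squares fit: coordinate search over (a, k) with a shrinking step."""
    best = [a0, k0]
    step = 1.0
    for _ in range(rounds):
        improved = False
        for i in (0, 1):
            for delta in (step, -step):
                trial = list(best)
                trial[i] += delta
                if sse(trial, t_data, y_data) < sse(best, t_data, y_data):
                    best = trial
                    improved = True
        if not improved:
            step /= 2.0  # refine the search once no neighbouring point improves
    return best

# Synthetic "experimental" trace generated from a = 2.0, k = 0.5.
ts = [0.1 * i for i in range(50)]
ys = [2.0 * math.exp(-0.5 * t) for t in ts]
a_fit, k_fit = fit_least_squares(ts, ys)
```

Identifiability problems of the kind the thesis reports show up in such a fit as flat directions in the objective: widely different parameter pairs yielding nearly identical residuals.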
APA, Harvard, Vancouver, ISO, and other styles
45

Thomas, Kerry J. "Teaching Mathematical Modelling to Tomorrow's Mathematicians or, You too can make a million dollars predicting football results." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-83131.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Josimovic, Aleksandra. "AI as a Radical or Incremental Technology Tool Innovation." Thesis, KTH, Industriell Management, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230603.

Full text
Abstract:
Researchers have found that, throughout history, a common challenge for companies across different industries in leveraging and capturing value from a technological innovation is the strong influence of the company's dominant business model, the established framework through which assessment takes place. The overall purpose of this study is to provide a deeper understanding of the role that a company's dominant business model plays in assessing the impact that a new technological innovation, in this case AI, will have on the company and on the market in which it operates. This thesis is partially exploratory and partially descriptive, with a qualitative and deductive nature. To fulfil this purpose, a case-study research strategy was used, in which empirical data was collected from interviews held with 47 of the company's top executives from different hierarchical levels and business units, from Sweden, Switzerland, the USA, Germany, and Finland. A theoretical framework was created that describes how AI as a new technology tool is perceived from the Company X perspective, either as a radical, game-changing innovation tool or as an incremental one, and examines the role the dominant business model plays in this perception. The developed implementation framework had its foundation in previous research on innovation business model theories. The data collected from the company's executives was then analyzed and compared to the model. The most significant findings suggest that AI as a new technology tool is perceived as a game-changing, radical innovation tool for some areas within Company X, and that the company's dominant business model profoundly influences this perception.
APA, Harvard, Vancouver, ISO, and other styles
47

Silva, Ricardo João Besteiro e. "A behavioral analysis tool for models of software systems." Master's thesis, Faculdade de Ciências e Tecnologia, 2010. http://hdl.handle.net/10362/4023.

Full text
Abstract:
Work presented within the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering
Process calculi are simple languages which permit modeling of concurrent systems so that they can be verified for correctness. We can analyze concurrent systems based on process calculi either by comparing a representation of the actual implementation with a simpler specification for equivalence, or by verifying whether desired properties described in an adequate logic hold. Strong bisimulation equivalence is one of many equivalence relations defined on process calculi to aid in the verification of concurrent software. This equivalence relation treats processes which exhibit the same behavior, i.e. perform the same transitions, as equivalent, regardless of internal implementation details. Logics to reason about processes cover temporal properties (how properties evolve during the course of a process's life), behavioral properties (which actions a process is capable of performing), and spatial properties (what components compose a process and how they are connected). Model checking consists of verifying whether a model, in our case a process, satisfies a given property. Model checking techniques are quite popular in conjunction with process calculi to aid in the verification of the correctness of concurrent systems. In this thesis we address the problem of checking bisimilarity between processes using characteristic formulae, which are formulae that fully describe a process's behavior. We implement facilities for bisimilarity verification in the Spatial Logic Model Checker (SLMC) tool. As a result of adding these facilities, we also extend the SLMC tool with an extra modality in the logic it uses to reason about processes. We have also added the possibility of defining mutually recursive properties in the tool, and enhanced the model checking algorithm with a cache to prevent redundant, time-consuming checks from being performed.
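Strong bisimulation, as used above, can be computed for finite labelled transition systems by partition refinement. The sketch below is a minimal illustration, not the characteristic-formulae approach the thesis implements in SLMC: in the hypothetical system at the bottom, p0 and q0 both perform action a and then b, while r0 performs only a, so p0 and q0 end up in the same bisimilarity class and r0 does not.

```python
def bisimulation_classes(states, transitions):
    """Partition the states of a finite LTS into strong-bisimilarity classes.

    `transitions` maps (state, action) -> set of successor states.
    A block is split whenever two of its states reach different sets of
    current blocks under some action; repeat until the partition is stable.
    """
    actions = {a for (_, a) in transitions}
    partition = [set(states)]  # start with one block holding every state
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                # Signature of s: for each action, the set of blocks reachable.
                sig = tuple(
                    (a, frozenset(i for i, b in enumerate(partition)
                                  if transitions.get((s, a), set()) & b))
                    for a in sorted(actions)
                )
                groups.setdefault(sig, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

# p and q both perform 'a' then 'b'; r performs only 'a'.
lts = {('p0', 'a'): {'p1'}, ('p1', 'b'): {'p2'},
       ('q0', 'a'): {'q1'}, ('q1', 'b'): {'q2'},
       ('r0', 'a'): {'r1'}}
classes = bisimulation_classes(
    ['p0', 'p1', 'p2', 'q0', 'q1', 'q2', 'r0', 'r1'], lts)
```

A characteristic formula, by contrast, encodes one process's signature as a logical formula, reducing the bisimilarity check to a model checking problem, which is what makes the approach fit naturally inside a model checker like SLMC.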
APA, Harvard, Vancouver, ISO, and other styles
48

Findley, Stephen Holt. "Hydrologic modeling as a decision-making tool in wildlife management." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-11242009-020314/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Noda, Fernanda Sousa dos Santos. "Avaliação do modelo Soil and Water Assessment tool (SWAT) na bacia hidrográfica do Ribeirão Taquaruçu." Universidade Federal do Tocantins, 2018. http://hdl.handle.net/11612/954.

Full text
Abstract:
Due to the process of city growth and agricultural activity, there are impacts that affect water resources, such as scarcity, degradation of water quality, and the resulting conflicts over use. In this context, hydrological models are important tools to evaluate the hydrological behavior of watersheds, and can also be used to predict scenarios in order to verify the impact of land use and occupation. Thus, the present study aimed to evaluate the performance of the Soil and Water Assessment Tool (SWAT) model in simulating streamflow in the Ribeirão Taquaruçu watershed. Automatic calibration was performed with the SWAT-CUP program, at a daily time step, with data from two BRK Ambiental fluviometric stations from April 2013 to July 2014, while data from August 2014 to August 2015 were used in the validation period. The sensitivity analysis was performed with 14 parameters chosen from among those most significant for flow simulation in the Cerrado biome region. The results of the sensitivity analysis indicated that the most influential parameters are SOL_K (saturated hydraulic conductivity of the soil) and CN2 (curve number for moisture condition II). The Nash-Sutcliffe efficiency (NSE) and the coefficient of determination (R2) were used as objective functions to evaluate the performance of the model, which indicated, in the calibration period, the following values: -0.05 and 0.55 (sub-basin 1); 0.51 (sub-basin 3), respectively. In the validation period, the following results were obtained for NSE and R2: 0.44 and 0.54 (sub-basin 1); 0.24 and 0.29 (sub-basin 3), in that order. Considering that the study area is important because it is responsible for a large part of the water supply of the municipality of Palmas – TO, and that the software was not able to simulate the minimum flows adequately, the results are considered unsatisfactory.
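The two goodness-of-fit measures used above have standard definitions: NSE compares the model's squared error against the variance of the observations (1 is a perfect fit, values at or below 0 mean the model is no better than the observed mean), and R2 here is taken as the squared Pearson correlation. A minimal sketch of both, assuming plain lists of observed and simulated flows:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / variance

def r_squared(observed, simulated):
    """Coefficient of determination as the squared Pearson correlation."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    vo = sum((o - mo) ** 2 for o in observed)
    vs = sum((s - ms) ** 2 for s in simulated)
    return cov ** 2 / (vo * vs)
```

Note the asymmetry the study's results illustrate: a simulation with a constant bias can still score a high R2 (perfect correlation) while its NSE drops, which is why both metrics are reported together.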
APA, Harvard, Vancouver, ISO, and other styles
50

Cucchiella, Stefano. "Horizontal transformations for models reuse and tool chaining." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-11819.

Full text
Abstract:
Nowadays, industrial software development faces increasing system complexity together with the necessity to significantly decrease the costs and time of the development process while at the same time releasing high-quality products. As a consequence, companies typically adopt a constellation of proprietary tools, each of which deals with particular stages of the overall development process, namely design, testing, and deployment, to mention a few. Model-Driven Engineering techniques are gaining growing interest as an efficient approach to tackle the current software intricacy. However, the use of a multitude of proprietary tools requires the redundant specification of characteristics of the system and hampers their chaining. This thesis work is founded on the necessity to provide a horizontal transformation between concepts defined in Executable and Translatable UML (xtUML) models and semantically equivalent concepts defined in Rational Software Architect (RSA), through a pivot profile developed on the Papyrus platform. The proposed profile is defined according to the Foundational Subset for Executable UML Models (fUML), an Object Management Group (OMG) standard that provides a virtual machine for model execution purposes. Since the fUML restriction forces models to be defined slightly differently from the UML standard, all necessary stratagems have been analyzed to provide a compliant pivot profile. Finally, a well-known case study is taken into consideration in order to evaluate the modeling choices adopted during the profile's specification.
APA, Harvard, Vancouver, ISO, and other styles
