Theses on the topic "Data Modeling and Design"

Below are the top 50 dissertations (graduate and doctoral theses) for research on the topic "Data Modeling and Design".

You can also download the full text of each publication in .pdf format and read the abstract online when it is included in the metadata.

Browse theses from many scientific fields and compile an accurate bibliography.

1

Chang, Johnny T. "Data management in a distributed design modeling environment". Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/88878.

Full text
2

Jeng, Taysehng. "Design coordination modeling: a distributed computer environment for managing design activities". Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/23162.

Full text
3

Zeigler, Patrick Scott. "A solid modeling program for designers". Thesis, Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/74531.

Full text
Abstract:
The personal computer is rapidly finding its way into the architectural working environment; at this time, however, it is used mostly as a drafting tool. This thesis is an investigation into programming that would allow the computer to become a design tool. The issues dealt with in this thesis include an easy-to-use interface that will not inhibit the design process, and the development of a system that allows changes and additions to a model with a three-dimensional, sketch-like facility. Three-dimensional models of paper, clay, wood and other materials have been used to create designs and to aid the designer in making decisions. It is difficult to make changes to this type of medium, and because of the scale of such materials it becomes difficult to work on interior spaces, so more attention is usually placed on the exterior design. With the use of the computer these limitations may be eliminated, and the designer may create a design from any perspective or viewpoint.
Master of Architecture
4

Choobineh, Joobin. "Form driven conceptual data modeling (database design, expert systems, conceptual)". Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/188043.

Full text
Abstract:
A conceptual data schema is constructed from the analysis of the business forms used in an enterprise. In order to perform the analysis, a data model, a forms model, and heuristics to map from the forms model to the data model are developed. The data model we use is an extended version of the Entity-Relationship Model. Extensions include the addition of min-max cardinalities and a generalization hierarchy. By extending the min-max cardinalities to attributes we capture a number of significant characteristics of the entities in a concise manner. We introduce a hierarchical model of forms. The model specifies various properties of each form field, such as its origin, hierarchical structure, and cardinalities. The interconnection of the forms is expressed by specifying which form fields flow from one form to another. The Expert Database Design System creates a conceptual schema by incrementally integrating related collections of forms. The rules of the expert system are divided into six groups: (1) Form Selection, (2) Entity Identification, (3) Attribute Attachment, (4) Relationship Identification, (5) Cardinality Identification, and (6) Integrity Constraints. The rules of the first group use knowledge about the form flow to determine the order in which forms are analyzed. The rules in the other groups are used in conjunction with a designer dialogue to identify the entities, relationships, and attributes of a schema that represents the collection of forms.
5

Phillips, Shawn M. "Improving Marine Corps Total Life Cycle Management by connecting collected data and simulation". Thesis, Monterey, Calif. : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Jun/09Jun%5FPhillips.pdf.

Full text
Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, June 2009.
Thesis Advisor(s): Lucas, Thomas W. "June 2009." Description based on title screen as viewed on July 13, 2009. Author(s) subject terms: Simulation, Design of Experiments, Life Cycle Management, VBA, Modeling. Includes bibliographical references (p. 57). Also available in print.
6

Srinivasan, K. "Design and development of an enterprise modeling framework". Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/8285.

Full text
7

Snyder, Scott Alan. "Design and Modeling of a Three-Dimensional Workspace". Case Western Reserve University School of Graduate Studies / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=case1112875843.

Full text
8

Swischuk, Renee C. (Renee Copland). "Physics-based machine learning and data-driven reduced-order modeling". Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122682.

Full text
Abstract:
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2019
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 123-128).
This thesis considers the task of learning efficient low-dimensional models for dynamical systems. To be effective in an engineering setting, these models must be predictive -- that is, they must yield reliable predictions for conditions outside the data used to train them. These models must also be able to make predictions that enforce physical constraints. Achieving these tasks is particularly challenging for the case of systems governed by partial differential equations, where generating data (either from high-fidelity simulations or from physical experiments) is expensive. We address this challenge by developing learning approaches that embed physical constraints. We propose two physics-based approaches for generating low-dimensional predictive models. The first leverages the proper orthogonal decomposition (POD) to represent high-dimensional simulation data with a low-dimensional physics-based parameterization in combination with machine learning methods to construct a map from model inputs to POD coefficients. A comparison of four machine learning methods is provided through an application of predicting flow around an airfoil. This framework also provides a way to enforce a number of linear constraints by modifying the data with a particular solution. The results help to highlight the importance of including physics knowledge when learning from small amounts of data. We also apply a data-driven approach to learning the operators of low-dimensional models. This method provides an avenue for constructing low-dimensional models of systems where the operators of discretized governing equations are unknown or too complex, while also having the ability to enforce physical constraints. The methodology is applied to a two-dimensional combustion problem, where discretized model operators are unavailable. The results show that the method is able to accurately make predictions and enforce important physical constraints.
by Renee C. Swischuk.
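The proper-orthogonal-decomposition (POD) plus machine-learning workflow summarized in the abstract above can be illustrated with a short sketch. The snippet below is not taken from the thesis: the synthetic snapshot data, the dimensions, and the choice of ridge regression are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Illustrative POD-plus-regression reduced-order model (not from the thesis):
# high-dimensional snapshots are compressed with the proper orthogonal
# decomposition (an SVD), and a regressor maps model inputs to POD coefficients.
rng = np.random.default_rng(0)
n_dof, n_snap, n_inputs, r = 500, 80, 3, 5             # state size, snapshots, input dim, POD rank

X = rng.normal(size=(n_snap, n_inputs))                 # hypothetical model inputs (e.g. flow conditions)
snapshots = np.tanh(X @ rng.normal(size=(n_inputs, n_dof)))  # stand-in for simulation data

# POD basis from the mean-centered snapshot matrix
mean = snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
basis = Vt[:r].T                                        # first r POD modes (n_dof x r)

# Project snapshots onto the basis to obtain training targets (POD coefficients)
coeffs = (snapshots - mean) @ basis

# Learn the map from inputs to POD coefficients (any regressor could be used here)
model = Ridge(alpha=1e-3).fit(X, coeffs)

# Predict the full field for a new input by lifting predicted coefficients back
x_new = rng.normal(size=(1, n_inputs))
field_pred = mean + model.predict(x_new) @ basis.T
print(field_pred.shape)                                 # (1, n_dof)
```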
9

Totapally, Hara. "Virtual design office: A collaborative unified modeling language tool". CSUSB ScholarWorks, 2001. https://scholarworks.lib.csusb.edu/etd-project/1994.

Full text
Abstract:
Real-time conferencing and collaborative computing are a great way to make developers more effective. This project develops a collaborative framework comprising configurable client and server components.
10

Verstak, Alexandre. "Data and Computation Modeling for Scientific Problem Solving Environments". Thesis, Virginia Tech, 2002. http://hdl.handle.net/10919/35299.

Full text
Abstract:
This thesis investigates several issues in data and computation modeling for scientific problem solving environments (PSEs). A PSE is viewed as a software system that provides (i) a library of simulation components, (ii) experiment management, (iii) reasoning about simulations and data, and (iv) problem solving abstractions. Three specific ideas, in functionalities (ii)-(iv), form the contributions of this thesis. These include the EMDAG system for experiment management, the BSML markup language for data interchange, and the use of data mining for conducting non-trivial parameter studies. This work emphasizes data modeling and management, two important aspects that have been largely neglected in modern PSE research. All studies are performed in the context of S4W, a sophisticated PSE for wireless system design.
Master of Science
11

Wang, Changling. "Sketch based 3D freeform object modeling with non-manifold data structure". View Abstract or Full-Text, 2002. http://library.ust.hk/cgi/db/thesis.pl?MECH%202002%20WANGC.

Full text
Abstract:
Thesis (Ph. D.)--Hong Kong University of Science and Technology, 2002.
Includes bibliographical references (leaves 143-152). Also available in electronic version. Access restricted to campus users.
12

Kang, Lulu. "Computer and physical experiments: design, modeling, and multivariate interpolation". Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/34805.

Full text
Abstract:
Many problems in science and engineering are solved through experimental investigations. Because experiments can be costly and time consuming, it is important to efficiently design the experiment so that maximum information about the problem can be obtained. It is also important to devise efficient statistical methods to analyze the experimental data so that none of the information is lost. This thesis makes contributions on several aspects in the field of design and analysis of experiments. It consists of two parts. The first part focuses on physical experiments, and the second part on computer experiments. The first part on physical experiments contains three works. The first work develops Bayesian experimental designs for robustness studies, which can be applied in industries for quality improvement. The existing methods rely on modifying effect hierarchy principle to give more importance to control-by-noise interactions, which can violate the true effect order of a system because the order should not depend on the objective of an experiment. The proposed Bayesian approach uses a prior distribution to capture the effect hierarchy property and then uses an optimal design criterion to satisfy the robustness objectives. The second work extends the above Bayesian approach to blocked experimental designs. The third work proposes a new modeling and design strategy for mixture-of-mixtures experiments and applies it in the optimization of Pringles potato crisps. The proposed model substantially reduces the number of parameters in the existing multiple-Scheffé model and thus, helps the engineers to design much smaller experiments. The second part on computer experiments introduces two new methods for analyzing the data. The first is an interpolation method called regression-based inverse distance weighting (RIDW) method, which is shown to overcome some of the computational and numerical problems associated with kriging, particularly in dealing with large data and/or high dimensional problems. In the second work, we introduce a general nonparametric regression method, called kernel sum regression. More importantly, we make an interesting discovery by showing that a particular form of this regression method becomes an interpolation method, which can be used to analyze computer experiments with deterministic outputs.
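The regression-based inverse distance weighting (RIDW) interpolation mentioned in the abstract above builds on classic inverse distance weighting. The sketch below shows only the classic IDW idea, not the thesis's RIDW formulation, and the toy design points and response are invented.

```python
import numpy as np

def idw_interpolate(x_train, y_train, x_query, power=2.0, eps=1e-12):
    """Predict y at x_query as a distance-weighted average of training responses."""
    d = np.linalg.norm(x_train - x_query, axis=1)   # distances to the design points
    if np.any(d < eps):                             # exact hit: return that response
        return y_train[np.argmin(d)]
    w = 1.0 / d**power                              # closer points receive larger weights
    return np.sum(w * y_train) / np.sum(w)

# Toy computer experiment: 30 design points in 2D with a deterministic response
rng = np.random.default_rng(1)
x_train = rng.uniform(size=(30, 2))
y_train = np.sin(3 * x_train[:, 0]) + x_train[:, 1] ** 2

print(idw_interpolate(x_train, y_train, np.array([0.5, 0.5])))
```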
13

Alhasoun, Fahad. "Understanding and modeling human movement in cities using phone data". Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/107058.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2016.
S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 83-88).
Cities today are strained by exponential population growth and are home to the majority of the world's population. Understanding the complexities underlying the emerging behaviors of human travel patterns at the city level is essential for making informed decisions pertaining to urban transportation infrastructure. This thesis includes several attempts towards modeling and understanding human mobility at the scale of individuals and the scale of aggregate population movement. The second chapter includes the development of a browser delivering visual insights into the aggregate behavior of populations in cities. The third chapter provides a computational framework for clustering regions in cities based on their attraction behavior and, in doing so, aids a predictive model in predicting inflows to newly developed regions. The fourth chapter investigates the patterns of individuals' movement at the city scale towards developing a predictive model for a person's next visited location. The predictive accuracy is then increased by adding movement information of the population. The motivation behind the work of this thesis is derived from the demand for tools that provide fine-grained analysis of the complexity of human travel within cities. The approach takes advantage of existing built infrastructure to sense the mobility of people, eliminating the financial and temporal burdens of traditional methods. The outcomes of this work will assist both planners and the public in understanding the complexities of human mobility within their cities.
by Fahad Alhasoun.
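A common baseline for the next-visited-location prediction described in the abstract above is a first-order Markov model over the visit sequence. The sketch below illustrates that baseline only; it does not reproduce the thesis's model, and the visit sequence is invented.

```python
from collections import Counter, defaultdict

# Toy next-location prediction: count transitions between consecutive visits
# and predict the most frequent successor of the current location.
visits = ["home", "work", "cafe", "work", "home", "gym", "home", "work", "cafe", "home"]

transitions = defaultdict(Counter)
for current, nxt in zip(visits, visits[1:]):
    transitions[current][nxt] += 1

def predict_next(location):
    """Return the most frequently observed successor of the given location."""
    counts = transitions[location]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("work"))   # "cafe" is the most common successor in this toy sequence
```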
14

Hennig, Christian. "From Data Modeling to Knowledge Engineering in Space System Design". Karlsruhe : KIT Scientific Publishing, 2018. http://www.ksp.kit.edu.

Full text
15

Hall, Neil Scott. "Impact of data modeling and database implementation methods on the optimization of conceptual aircraft design". Thesis, Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/16847.

Full text
16

Massaro, Evan K. "Modeling exascale data generation and storage for the large hadron collider computing network". Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/126990.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, May, 2020
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 85-86).
The Large Hadron Collider (LHC) is the world's largest and highest energy particle accelerator. With the particle collisions produced at the LHC and measured with the Compact Muon Solenoid (CMS) detector, the CMS experimental group performs precision measurements and general searches for new physics. Year-round CMS operations produce 100 Petabytes of physics data per year, which is stored within a globally distributed grid network of 70 scientific institutions. By 2027, upgrades to the LHC and CMS detector will allow unprecedented probes of microscopic physics, but in doing so generate 2,000 Petabytes (2 Exabytes) of physics data per year. To address the computational requirements of CMS, the cost of CPU resources, disk and tape storage, and tape drives were modeled. These resources were then used in a model of the major CMS computing processes and required infrastructure.
In addition to estimating budget requirements, this model produced bandwidth requirements, for which the transatlantic network cable was explicitly addressed. Given discrete or continuously parameterized policy decisions, the system cost and required network bandwidth could be modeled as a function of the policy. This sensitivity analysis was coupled to an uncertainty quantification of the model outputs, which were functions of the estimated system parameters. The expected value of the system cost and maximum transatlantic network activity were modeled to increase 40 times in 2027 relative to 2018. In 2027 the required transatlantic network capacity was modeled to have an expected value of 210 Gbps, with a 95% confidence interval that reaches 330 Gbps, just under the current bandwidth of 340 Gbps. By changing specific computing policies, the system cost and network load were shown to decrease. Specific policies can reduce the network load to an expected value of 150 Gbps, with a 95% confidence interval that reaches 260 Gbps. Given the unprecedented volume of data, such policy changes can allow CMS to meet its future physics goals.
by Evan K. Massaro.
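The abstract above describes modeling an output such as peak network load as a function of policy choices and uncertain system parameters, then reporting its expected value and a 95% bound. The sketch below is a generic Monte Carlo illustration of that idea; every formula and number in it is invented and does not come from the CMS computing model.

```python
import numpy as np

rng = np.random.default_rng(3)

def peak_load_gbps(replica_fraction, data_eb, overhead):
    """Toy model: load grows with data volume and replication, scaled by overhead."""
    return 100.0 * data_eb * (0.5 + replica_fraction) * overhead

def summarize(replica_fraction, n_samples=100_000):
    data_eb = rng.normal(2.0, 0.2, n_samples)       # uncertain yearly data volume (EB)
    overhead = rng.lognormal(0.0, 0.1, n_samples)   # uncertain transfer overhead factor
    load = peak_load_gbps(replica_fraction, data_eb, overhead)
    return load.mean(), np.percentile(load, 95)

for policy in (0.6, 0.3):                           # e.g. fewer transatlantic replicas
    mean, p95 = summarize(policy)
    print(f"replica fraction {policy}: mean {mean:.0f} Gbps, 95th percentile {p95:.0f} Gbps")
```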
17

Lopes Siqueira, Thiago Luis. "The Design of Vague Spatial Data Warehouses". Doctoral thesis, Universite Libre de Bruxelles, 2015. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/221701.

Full text
Abstract:
Spatial data warehouses (SDW) and spatial online analytical processing (SOLAP) enhance decision making by enabling spatial analysis combined with multidimensional analytical queries. A SDW is an integrated and voluminous multidimensional database containing both conventional and spatial data. SOLAP allows querying SDWs with multidimensional queries that select spatial data that satisfy a given topological relationship and that aggregate spatial data. Existing SDW and SOLAP applications mostly consider phenomena represented by spatial data having exact locations and sharp boundaries. They neglect the fact that spatial data may be affected by imperfections, such as spatial vagueness, which prevents distinguishing an object from its neighborhood. A vague spatial object does not have a precisely defined boundary and/or interior. Thus, it may have a broad boundary and a blurred interior, and is composed of parts that certainly belong to it and parts that possibly belong to it. Although several real-world phenomena are characterized by spatial vagueness, no approach in the literature addresses both spatial vagueness and the design of SDWs nor provides multidimensional analysis over vague spatial data. These shortcomings motivated the elaboration of this doctoral thesis, which addresses both vague spatial data warehouses (vague SDWs) and vague spatial online analytical processing (vague SOLAP). A vague SDW is a SDW that comprises vague spatial data, while vague SOLAP allows querying vague SDWs. The major contributions of this doctoral thesis are: (i) the Vague Spatial Cube (VSCube) conceptual model, which enables the creation of conceptual schemata for vague SDWs using data cubes; (ii) the Vague Spatial MultiDim (VSMultiDim) conceptual model, which enables the creation of conceptual schemata for vague SDWs using diagrams; (iii) guidelines for designing relational schemata and integrity constraints for vague SDWs, and for extending the SQL language to enable vague SOLAP; (iv) the Vague Spatial Bitmap Index (VSB-index), which improves the performance to process queries against vague SDWs. The applicability of these contributions is demonstrated in two applications of the agricultural domain, by creating conceptual schemata for vague SDWs, transforming these conceptual schemata into logical schemata for vague SDWs, and efficiently processing queries over vague SDWs.
Doctorate in Engineering Sciences and Technology
Location of the public defense: Universidade Federal de São Carlos, São Carlos, SP, Brazil.
18

Farias, Arthur José Rodrigues. "Integrating data mining into contextual goal modeling to tackle context uncertainties at design time". Repositório Institucional da UnB, 2017. http://repositorio.unb.br/handle/10482/31637.

Full text
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2017.
Understanding and predicting all context conditions that self-adaptive systems will be exposed to during their lifetime, and implementing appropriate adaptation techniques, is a very challenging mission. If the system cannot recognize and adapt to unexpected contexts, this can be the cause of failures in self-adaptive systems, with possible implications of not being able to fulfill user requirements or even resulting in undesired behaviors. Especially for dependability attributes, this would have fatal implications. The earlier the broad range of high-level context conditions can be specified, the better adaptation strategies can be implemented and validated in the self-adaptive system. The objective of this work is to provide (automated) support to unveil context sets at early stages of the software development life cycle and verify how the contexts impact the system's dependability attributes. This task will increase the number of potential issues identified that might threaten the dependability of self-adaptive systems. This work provides an approach for the automated detection and analysis of context conditions and their correlations at design time. Our approach employs a data mining process to suitably elicit context sets and relies on the constructs of a contextual goal model (CGM) for mapping contexts to the system's behavior from a design perspective. We experimentally evaluated our proposal on a Body Sensor Network (BSN) system, by simulating a myriad of resources that could lead to a variability space of 4096 possible context conditions. Our results show that our approach is able to elicit contexts that would significantly affect a high percentage of BSN-assisted patients with high health-risk profiles in fulfilling their goals within the required reliability level. Additionally, we explored the scalability of the mining process in the BSN context, showing it is able to perform in under a minute even for simulated data at a size of over five orders of magnitude. This research supports the development of self-adaptive systems by anticipating at design time contexts that might restrain the achievability of system goals by means of a sound and efficient data mining process.
19

Zhao, He. "Systematic data-driven modeling of cellular systems for experimental design and hypothesis evaluation". Philadelphia, Pa. : Drexel University, 2009. http://hdl.handle.net/1860/3133.

Full text
20

林德華 and Tak-wah Lam. "Topological data structure and algorithms for cell-complex based non-manifold form feature modeling". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B3121244X.

Full text
21

Lam, Tak-wah. "Topological data structure and algorithms for cell-complex based non-manifold form feature modeling". Hong Kong : University of Hong Kong, 1994. http://sunzi.lib.hku.hk/hkuto/record.jsp?B19672214.

Full text
22

Aydogdu, Ali. "Advanced modeling and data assimilation methods for the design of sustained marine monitoring networks". Doctoral thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/10343.

Full text
Abstract:
An impact assessment of a Fishery Observing System (FOS) network in the Adriatic Sea was carried out with an ocean circulation model fully-coupled with a data assimilation system. The FOS data are single point vertical values of temperature collected in 2007. In this study, we used the Observing System Experiment (OSE) and Observing System Simulation Experiment (OSSE) methodologies to estimate the impact of different FOS design and sensors implementation. OSEs were conducted to evaluate real observations and they show that the FOS network improves the analysis significantly, especially during the stratification season. Root mean square (RMS) of temperature errors are reduced by about 44% and 36% in the upper and lower layers respectively. We also demonstrated that a similar impact can be obtained with a reduced number of vessels if the spatial coverage of the data points does not change significantly. In the OSSE, the impact of the implementation of a CTD (conductivity-temperature-depth) sensor in place of the existing temperature sensor was tested with identical twin approaches between January and April 2007. The results imply that the assimilation of salinity does not improve the analysis significantly during the winter and spring seasons.
23

Jones, Mary Elizabeth. "Dimensional modeling : identifying patterns, classifying patterns, and evaluating pattern impact on the design process". Philadelphia, Pa. : Drexel University, 2006. http://dspace.library.drexel.edu/handle/1860/743.

Full text
24

Markina-Khusid, Aleksandra. "Effect of learning on stakeholder negotiation outcomes : modeling and analysis of game-generated data". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100390.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 75-81).
A design negotiation game based on a stakeholder salience framework was created by the APACE research team to explore negotiation dynamics between stakeholders with individual attributes and agendas. Experimental data was collected anonymously during games played by groups of human participants through a web interface. It was found that the negotiation process takes a non-zero number of iterations even under conditions that strongly favor agreement. A realistic scenario was created based on extensive interviews with the major stakeholders involved in a real negotiation of a plan for a new government information technology system. Solution space exploration of this scenario demonstrated that the experimentally obtained solutions lie far from the optimality frontier. Performance differed significantly in two groups of participants with dissimilar professional experience; games played by interns achieved higher scores than those played by senior staff. An agent-based model was built to simulate multi-stage design negotiation. Utility functions of individual players were based on their private agendas. Players voted for a design according to the relative attractiveness of the design as established by the individual utility function. The negotiation process helps players discover other players' agendas. It was hypothesized that knowledge of each other's private objectives would enable groups of players to achieve design solutions that are closer to optimal. Effects of learning were introduced into the model by adding a fraction of the sum of all players' utility functions to each individual utility function. Simulated games with learning effects yielded solutions with higher total player scores than simulated games without learning did. Results of simulated games with a substantial level of learning effects were similar to average experimental results from groups of interns. Results of simulated games without learning were close to the average results of games played by senior staff.
by Aleksandra Markina-Khusid.
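The learning effect described in the abstract above, adding a fraction of the sum of all players' utilities to each player's own utility, can be illustrated with a toy sketch. The utilities, the numbers of players and designs, and the learning fraction below are invented and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
n_players, n_designs = 5, 8
private_utility = rng.uniform(size=(n_players, n_designs))    # rows: players, columns: candidate designs

def votes(learning_fraction):
    """Each player votes for the design maximizing its effective utility."""
    shared = private_utility.sum(axis=0)                       # sum of all players' utilities per design
    effective = private_utility + learning_fraction * shared   # add a fraction of the shared term
    return np.argmax(effective, axis=1)

print("no learning:  ", votes(0.0))   # players vote purely on private agendas
print("with learning:", votes(0.5))   # votes drift toward designs with high total utility
```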
25

Zinnecker, Alicia M. "Modeling for Control Design of an Axisymmetric Scramjet Engine Isolator". The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1354215841.

Full text
26

Decker, Gero. "Design and analysis of process choreographies". Phd thesis, Universität Potsdam, 2009. http://opus.kobv.de/ubp/volltexte/2010/4076/.

Full text
Abstract:
With the rise of electronic integration between organizations, the need for a precise specification of interaction behavior increases. Information systems, replacing interaction previously carried out by humans via phone, faxes and emails, require a precise specification for handling all possible situations. Such interaction behavior is described in process choreographies. Choreographies enumerate the roles involved, the allowed interactions, the message contents and the behavioral dependencies between interactions. Choreographies serve as interaction contract and are the starting point for adapting existing business processes and systems or for implementing new software components. As a thorough analysis and comparison of choreography modeling languages is missing in the literature, this thesis introduces a requirements framework for choreography languages and uses it for comparing current choreography languages. Language proposals for overcoming the limitations are given for choreography modeling on the conceptual and on the technical level. Using an interconnection modeling style, behavioral dependencies are defined on a per-role basis and different roles are interconnected using message flow. This thesis reveals a number of modeling "anti-patterns" for interconnection modeling, motivating further investigations on choreography languages following the interaction modeling style. Here, interactions are seen as atomic building blocks and the behavioral dependencies between them are defined globally. Two novel language proposals are put forward for this modeling style which have already influenced industrial standardization initiatives. While avoiding many of the pitfalls of interconnection modeling, new anomalies can arise in interaction models. A choreography might not be realizable, i.e. there does not exist a set of interacting roles that collectively realize the specified behavior. This thesis investigates different dimensions of realizability.
27

Siqueira, Thiago Luís Lopes. "The design of vague spatial data warehouses". Universidade Federal de São Carlos, 2015. https://repositorio.ufscar.br/handle/ufscar/298.

Full text
Abstract:
Spatial data warehouses (SDW) and spatial online analytical processing (SOLAP) enhance decision making by enabling spatial analysis combined with multidimensional analytical queries. A SDW is an integrated and voluminous multidimensional database containing both conventional and spatial data. SOLAP allows querying SDWs with multidimensional queries that select spatial data that satisfy a given topological relationship and that aggregate spatial data. Existing SDW and SOLAP applications mostly consider phenomena represented by spatial data having exact locations and sharp boundaries. They neglect the fact that spatial data may be affected by imperfections, such as spatial vagueness, which prevents distinguishing an object from its neighborhood. A vague spatial object does not have a precisely defined boundary and/or interior. Thus, it may have a broad boundary and a blurred interior, and is composed of parts that certainly belong to it and parts that possibly belong to it. Although several real-world phenomena are characterized by spatial vagueness, no approach in the literature addresses both spatial vagueness and the design of SDWs nor provides multidimensional analysis over vague spatial data. These shortcomings motivated the elaboration of this doctoral thesis, which addresses both vague spatial data warehouses (vague SDWs) and vague spatial online analytical processing (vague SOLAP). A vague SDW is a SDW that comprises vague spatial data, while vague SOLAP allows querying vague SDWs. The major contributions of this doctoral thesis are: (i) the Vague Spatial Cube (VSCube) conceptual model, which enables the creation of conceptual schemata for vague SDWs using data cubes; (ii) the Vague Spatial MultiDim (VSMultiDim) conceptual model, which enables the creation of conceptual schemata for vague SDWs using diagrams; (iii) guidelines for designing relational schemata and integrity constraints for vague SDWs, and for extending the SQL language to enable vague SOLAP; (iv) the Vague Spatial Bitmap Index (VSB-index), which improves the performance to process queries against vague SDWs. The applicability of these contributions is demonstrated in two applications of the agricultural domain, by creating conceptual schemata for vague SDWs, transforming these conceptual schemata into logical schemata for vague SDWs, and efficiently processing queries over vague SDWs.
28

Sun, Ye. "Peer-assisted semi-persistent online storage and distribution : design, analysis and modeling". View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?CSED%202009%20SUN.

Full text
29

Jenson, Justin Michael. "Design of selective peptide inhibitors of anti-apoptotic Bfl-1 using experimental screening, structure-based design, and data-driven modeling". Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/120631.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Biology, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references.
Protein-protein interactions are central to all biological processes. Designer reagents that selectively bind to proteins and inhibit their interactions can be used to probe protein interaction networks, discover druggable targets, and generate potential therapeutic leads. Current technology makes it possible to engineer proteins and peptides with desirable interaction profiles using carefully selected sets of experiments that are customized for each design objective. There is great interest in improving the protein design pipeline to create protein binders more efficiently and against a wider array of targets. In this thesis, I describe the design and development of selective peptide inhibitors of anti-apoptotic Bcl-2 family proteins, with an emphasis on targeting Bfl-1. Anti-apoptotic Bcl-2 family proteins bind to short, pro-apoptotic BH3 motifs to support cellular survival. Overexpression of Bfl-1 has been shown to promote cancer cell survival and the development of chemoresistance. Prior work suggests that selective inhibition of Bfl-1 can induce cell death in Bfl-1 overexpressing cancer cells without compromising healthy cells that also rely on anti-apoptotic Bcl-2 proteins for survival. Thus, Bfl-1-selective BH3 mimetic peptides are potentially valuable for diagnosing Bfl-1 dependence and can serve as leads for therapeutic development. In this thesis, I describe three distinct approaches to designing potent and selective Bfl-1 inhibitors. First, I describe the design and screening of libraries of variants of BH3 peptides. I show that peptides from this screen bind in a previously unobserved BH3 binding mode and have large margins of specificity for Bfl-1 when tested in vitro and in cultured cells. Second, I describe a computational model of the specificity landscape of three anti-apoptotic Bcl-2 proteins including Bfl-1. This model was derived from high-throughput affinity measurement of thousands of peptides from BH3 libraries. I show that this model is useful for designing peptides with desirable interaction profiles within a family of related proteins. Third, I describe the use of a scoring potential built on the amino acid frequencies from well-defined structural motifs compiled from the Protein Data Bank to design novel BH3 peptides targeting Bfl-1.
by Justin Michael Jenson.
30

Netterberg, Max, and Simon Wahlström. "Design and training of a recommender system on an educational domain using Topic & Term-Frequency modeling". Thesis, Uppsala universitet, Avdelningen för visuell information och interaktion, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-445986.

Full text
Abstract:
This thesis investigates the possibility of creating a machine-learning-powered recommender system from educational material supplied by a media provider company. By limiting the investigation to a single company's data, the thesis provides insights into how a limited data supply can be utilized in creating a first-iteration recommender system. The methods include semi-structured interviews with system experts, constructing a model-building pipeline, and testing the models on system experts via a web interface. The study paints a good picture of what kind of actions you can take when designing a content-based filtering recommender system and what actions to take when moving on to further iterations. The study showed that user preferences may be decisive for the relevancy of the provided recommendations for a specific media content. Furthermore, the study showed that Term Frequency-Inverse Document Frequency modeling was significantly better than using an Elasticsearch database to serve recommendations. Testing also indicated that using term frequency-inverse document frequency created a better model than using topic modeling techniques such as latent Dirichlet allocation. However, as testing was only conducted on system experts in a controlled environment, further iterations of testing are necessary to statistically conclude that these models would lead to an increase in user experience.
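The term frequency-inverse document frequency approach described in the abstract above can be illustrated with a minimal content-based recommender sketch. The catalogue texts below are invented rather than the media provider's data, and scikit-learn's TfidfVectorizer stands in for whatever pipeline the thesis actually used.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented catalogue of educational items; each value is the item's description text.
catalogue = {
    "algebra-basics": "introduction to algebra equations and variables for beginners",
    "geometry-shapes": "geometry lesson on triangles circles and area calculations",
    "algebra-advanced": "solving systems of equations and quadratic functions in algebra",
    "biology-cells": "cell structure organelles and photosynthesis for biology students",
}

titles = list(catalogue)
vectors = TfidfVectorizer(stop_words="english").fit_transform(catalogue.values())

def recommend(title, k=2):
    """Return the k catalogue items most similar to the given item."""
    idx = titles.index(title)
    scores = cosine_similarity(vectors[idx], vectors).ravel()
    ranked = scores.argsort()[::-1]
    return [titles[i] for i in ranked if i != idx][:k]

print(recommend("algebra-basics"))   # expected to rank the other algebra item first
```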
31

Xu, Zichen. "Energy Modeling and Management for Data Services in Multi-Tier Mobile Cloud Architectures". The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1468272637.

Full text
32

Hutton, David. "Data modelling techniques to improve student's admission criteria". Thesis, Nelson Mandela Metropolitan University, 2015. http://hdl.handle.net/10948/11036.

Full text
Abstract:
Education is commonly seen as an escape from poverty and a critical path to securing a better standard of living. This is especially relevant in the South African context, where the need is so great that in one instance people were trampled to death at the gates of a higher educational institution whilst attempting to register for this opportunity. The root cause of this great need is a limited capacity and a demand which outstrips the supply. This is not a problem specific to South Africa; it is, however, exaggerated in the South African context due to the country's lack of infrastructure and the opening of facilities to all people. Tertiary educational institutions are faced with ever-increasing applications for a limited number of available positions. This study focuses on a dataset from the Nelson Mandela Metropolitan University's Faculty of Engineering, the Built Environment and Information Technology, with the aim of establishing guidelines for the use of data modelling techniques to improve student admissions criteria. The importance of data preprocessing was highlighted, and generalized linear regression, decision trees and neural networks were proposed and motivated for modelling. Experimentation was carried out, resulting in a number of recommended guidelines focusing on the tremendous value of feature engineering coupled with the use of generalized linear regression as a baseline. Adding multiple models was highly recommended, since it allows greater opportunities for added insight.
33

Pliuskuvienė, Birutė. "Adaptive data models in design". Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2008. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2008~D_20080627_143940-41525.

Full text
Abstract:
The dissertation examines the problem of adapting software whose instability is caused by changes in the content and structure of primary data and in the algorithms that implement solutions to applied problems. The solution to the problem is based on a methodology for adapting models of data expressed as relational sets.
34

Telikapalli, Surya. "Collaborative design (COLLDESIGN): A real-time interactive unified modeling language tool". CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2669.

Full text
Abstract:
This project extended COLLDESIGN, an interactive collaborative modeling tool that was developed by Mr. Hara Totapally. The initial version included a collaborative framework comprised of configurable client and server components. This project accomplished a complete implementation of the Class Diagram view. In addition, extending the framework, text messaging and audio conferencing features have been implemented to allow for real-time textual and audio communication between team members working on a particular project. VideoClient is the GUI of the application.
35

Krisnadhi, Adila Alfa. "Ontology Pattern-Based Data Integration". Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1453177798.

Full text
36

Gemesi, Hafize Gunsu. "Food traceability information modeling and data exchange and GIS based farm traceability model design and application". [Ames, Iowa : Iowa State University], 2010. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1476294.

Full text
37

Assaad, Maher. "Design and modeling of clock and data recovery integrated circuit in 130 nm CMOS technology for 10 Gb/s serial data communications". Thesis, University of Glasgow, 2009. http://theses.gla.ac.uk/707/.

Full text
Abstract:
This thesis describes the design and implementation of a fully monolithic 10 Gb/s phase and frequency-locked loop based clock and data recovery (PFLL-CDR) integrated circuit, as well as the Verilog-A modeling of an asynchronous serial link based chip-to-chip communication system incorporating the proposed concept. The proposed design was implemented and fabricated using the 130 nm CMOS technology offered by UMC (United Microelectronics Corporation). Different PLL-based CDR circuit topologies were investigated in terms of architecture and speed. Based on the investigation, we proposed a new concept of a quarter-rate (i.e. the clocking speed in the circuit is 2.5 GHz for a 10 Gb/s data rate) and dual-loop topology which consists of a phase-locked and a frequency-locked loop. The frequency-locked loop (FLL) operates independently from the phase-locked loop (PLL), and has the highly desired feature that once the proper frequency has been acquired, the FLL is automatically disabled and the PLL takes over to adjust the clock edges approximately in the middle of the incoming data bits for proper sampling. Another important feature of the proposed quarter-rate concept is the inherent 1-to-4 demultiplexing of the input serial data stream. A new quarter-rate phase detector based on the non-linear early-late phase detector concept has been used to achieve the multi-gigabit/s speed and to eliminate the need for the front-end data pre-processing (edge detecting) units usually associated with conventional CDR circuits. An eight-stage differential ring oscillator running at a 2.5 GHz center frequency was used for the voltage-controlled oscillator (VCO) to generate low-jitter multi-phase clock signals. The transistor-level simulation results demonstrated excellent performance in terms of locking speed and power consumption. In order to verify the accuracy of the proposed quarter-rate concept, a clockless asynchronous serial link incorporating the proposed concept and connecting two chips at 10 Gb/s has been modelled at gate level using the Verilog-A language and simulated in the time domain.
38

Vadoudi, Kiyan. "Data Model Proposal to Integrate GIS with PLM for DfS". Thesis, Troyes, 2017. http://www.theses.fr/2017TROY0014/document.

Full text
Abstract:
The deployment of sustainable development involves societal and technical transition challenges that Design for Sustainability (DfS) seeks to address. In the design of production systems, and in particular of manufactured products, environmental impacts, whether in terms of resource consumption or of releases (waste, emissions), must be integrated as design parameters. The assessment of environmental impacts (for example through Life Cycle Assessment, LCA) must therefore be articulated with Product Lifecycle Management (PLM). The Life Cycle Inventory (LCI) is a central element of the link between the production system and its environment, characterized by geographic and spatial information about the ecosphere. The thesis work proposed here holds that the environmental impacts of designed systems depend on this geographic characterization. Ecodesign and DfS approaches must therefore integrate this geographic information, which they currently do only to a very limited extent, since such information is not integrated into design tools. The thesis therefore proposes a modeling approach to bring together, in design, information about the product and its production system (via PLM), the assessment of its potential environmental impact (via LCA and in particular the LCI), and geographic information. To this end, the geographic information to be associated is identified, and illustrative case studies are built to show the impact of this information on product definition.
There are different approaches to implementing sustainability, and Design for Sustainability (DfS) is the one that gives the most accurate results by considering both global and regional scales. Integration of Life Cycle Assessment (LCA) into Product Lifecycle Management (PLM) is one example of tool integration to support sustainability. In the LCA framework, the Life Cycle Inventory (LCI) is the quantified and classified list of input and output flows of the LCA model, which is a model of the product system linking the technological system to the ecosphere (the environmental system). Since each region has a unique environmental system, the design characteristics and specifications of the technological system should be adapted to these differences. Implementing this approach requires geographic information about the environmental systems involved, which is a relatively new strategy in DfS. We therefore investigated the integration of Geographical Information Systems (GIS) with PLM to support geographic considerations during product development activities. The main research question of this work is how to propose such a PLM-GIS integration for DfS. To that end, we conducted a literature review of existing data models for products, the environment, and geography; combining these models is key to establishing the link among them.
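As a rough illustration of what linking product data (PLM), inventory flows (LCI), and geographic context (GIS) can mean at the data-model level, here is a minimal Python sketch. The class and field names are assumptions for illustration only, not the schema proposed in the thesis.

    from dataclasses import dataclass, field

    @dataclass
    class GeoLocation:            # GIS side: where a lifecycle process physically takes place
        name: str
        latitude: float
        longitude: float

    @dataclass
    class ElementaryFlow:         # LCI side: an exchange with the ecosphere
        substance: str
        amount: float             # quantity per functional unit
        unit: str
        compartment: str          # e.g. "air", "water", "soil"

    @dataclass
    class ProcessStep:            # PLM side: one lifecycle process of the product
        name: str
        location: GeoLocation
        flows: list = field(default_factory=list)

    @dataclass
    class Product:
        name: str
        lifecycle: list = field(default_factory=list)

        def regional_inventory(self, region_name):
            """Aggregate the elementary flows released in a given region."""
            totals = {}
            for step in self.lifecycle:
                if step.location.name == region_name:
                    for f in step.flows:
                        totals[f.substance] = totals.get(f.substance, 0.0) + f.amount
            return totals

    steel = ProcessStep(
        name="steel_supply",
        location=GeoLocation("Lorraine", 49.1, 6.2),
        flows=[ElementaryFlow("CO2", 1.8, "kg", "air")],
    )
    chair = Product("office_chair", lifecycle=[steel])
    print(chair.regional_inventory("Lorraine"))
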
Gli stili APA, Harvard, Vancouver, ISO e altri
39

Shrestha, Chandra R. "Advanced technology in a low technology setting : the application of building information modeling in the rural settings of Nepal". Virtual Press, 2007. http://liblink.bsu.edu/uhtbin/catkey/1379441.

Testo completo
Abstract (sommario):
Advanced technology is often considered to be the property of those individuals or corporations based in the richest and most powerful nations, or of the rich and powerful individuals or corporations that can be found in the world's poorest countries. It is typically the case that the most disadvantaged populations do not have equal access to the internet, laptops or personal workstations, fabricating machines, or many of the software programs and hardware components that are available to much of the world's workers and citizens. There is a tremendous gap between those with access to advanced technology and those without. The aim of this thesis is to address this gap by speculating on the potential of implementing an advanced software application, Building Information Modeling (BIM), in several low technology settings that exist in the author's home country of Nepal. To advance this question, a number of research tactics were engaged: field research was conducted in Nepal during the summer of 2006; advanced digital applications, including Revit and BIM software, were studied and used both as a graduate student and as an intern in an architectural firm in Atlanta, Georgia; and the non-governmental organization READNepal was contacted and studied, as was the architectural office with which it collaborates in Kathmandu. Three different sites in Nepal were selected to highlight the differences present in the rural areas in terms of cultural, social, and physical qualities, as well as to consider the different "cultures of building" and levels of technological sophistication in the three settings. In time, "trajectories" were developed that link the digital tools readily available to the READNepal offices in New York and Nepal with the architect's office in Kathmandu and with the contractors and day laborers (who are often illiterate) in the countryside. In this sense, it is suggested that through the introduction of basic software and hand-held devices, information can flow not only from the funding organization's office in New York to the architect, or from the architect to the job site, but from the job site to the architect and the READNepal office as well. In this way, communication is improved and the architecture has the potential to be more localized. Power is distributed and knowledge flows not only from top to bottom but from bottom to top, from day laborer in rural Nepal to architect in Kathmandu to office worker in New York. The thesis offers not a concrete solution but rather an understanding that, while potential exists, there is much more to be done to bridge the gap that exists between clients, architects, and constructors in societies where gaps exist between those who have access to high technology and those who do not.
Department of Architecture
Gli stili APA, Harvard, Vancouver, ISO e altri
40

Rentzsch, Walter Herbert Werner. "Data capture and modelling for material processing and design". Thesis, University of Cambridge, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.613904.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
41

Qian, Zhiguang. "Computer experiments [electronic resource] : design, modeling and integration /". Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11480.

Testo completo
Abstract (sommario):
The use of computer modeling is increasing rapidly in almost every scientific, engineering, and business arena. This dissertation investigates some challenging issues in the design, modeling, and analysis of computer experiments and consists of four major parts. In the first part, a new approach is developed to combine data from approximate and detailed simulations to build a surrogate model based on stochastic models. In the second part, we propose Bayesian hierarchical Gaussian process models to integrate data from different types of experiments. The third part concerns the development of latent variable models for computer experiments with multivariate responses, with an application to data center temperature modeling. The last part is devoted to the development of nested space-filling designs for multiple experiments with different levels of accuracy.
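Since the dissertation's modeling parts build on Gaussian process surrogates for computer experiments, a bare-bones GP regression sketch may help fix ideas. The code below is a generic GP with a fixed RBF kernel; the kernel, length scale, noise level, and toy "simulator" are illustrative assumptions, not the author's hierarchical models.

    import numpy as np

    def rbf_kernel(x1, x2, length=0.2, variance=1.0):
        d = x1[:, None] - x2[None, :]
        return variance * np.exp(-0.5 * (d / length) ** 2)

    def gp_predict(x_train, y_train, x_test, noise=1e-6):
        """Posterior mean and pointwise standard deviation of a zero-mean GP."""
        K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
        K_s = rbf_kernel(x_test, x_train)
        alpha = np.linalg.solve(K, y_train)
        mean = K_s @ alpha
        cov = rbf_kernel(x_test, x_test) - K_s @ np.linalg.solve(K, K_s.T)
        return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

    # A cheap stand-in "computer experiment": a deterministic simulator evaluated
    # at a handful of design points, then emulated on a dense test grid.
    x_train = np.linspace(0.0, 1.0, 8)
    y_train = np.sin(2 * np.pi * x_train)
    x_test = np.linspace(0.0, 1.0, 50)
    mean, std = gp_predict(x_train, y_train, x_test)
    print("max predictive std on the test grid:", round(float(std.max()), 4))
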
Gli stili APA, Harvard, Vancouver, ISO e altri
42

Menking, Ricky Arnold. "The Effects of Team Dynamics Training on Conceptual Data Modeling Task Performance". Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5448/.

Testo completo
Abstract (sommario):
Database modeling is a complex conceptual topic that is often taught through project-based teams. One problem with using project-based teams in university courses is determining whether this is the most effective use of instructor and student time and effort. This study therefore investigated the impact of providing team dynamics training prior to the start of short-duration, project-based conceptual data modeling projects on individual data modeling task performance (DMTP) outcomes and team cohesiveness. The literature review encompassed conceptual data design modeling, the project-based team approach, team dynamics and cohesion, self-efficacy, gender, and diversity. The research population consisted of 75 university students at a North American (Canadian) university pursuing a business program that requires an information systems course in which database design is taught. Analysis of the collected data revealed a statistically significant inverse relationship between the provision of team dynamics training and individual DMTP. However, no statistically significant relationship was found between team dynamics training and team cohesion. This study therefore calls into question the value of team dynamics training for learning outcomes in the case of very short duration project-based teams involved in conceptual data modeling tasks. Additional research would be needed to clarify what about this particular experiment might have contributed to these results.
Gli stili APA, Harvard, Vancouver, ISO e altri
43

Rajab, Khairan. "Knowledge Guided Non-Uniform Rational B-Spline (NURBS) for Supporting Design Intent in Computer Aided Design (CAD) Modeling". Scholar Commons, 2011. http://scholarcommons.usf.edu/etd/3302.

Testo completo
Abstract (sommario):
For many years, incompatible computer-aided design (CAD) packages based on Non-Uniform Rational B-Spline (NURBS) technology have exchanged models and data through either neutral file formats (IGES or STEP) or proprietary formats that have been accepted as quasi industry standards. Although this is the only solution available at present, the exchange process most often produces unsatisfactory results. Models that are impeccable in the original modeling system usually end up with gaps or intersections between surfaces on another, incompatible system. Loss of information, changes in data accuracy, inconsistent tolerances, and misinterpretation of the original design intent are a few examples of the problems associated with migrating models between different CAD systems. While these issues are well known and cost industry billions of dollars every year, a solution that eradicates the problems at their source has not been developed. Meanwhile, researchers and the affected industries have been trying to resolve such problems by repairing migrated models either manually or with specialized software. Design in recent years has become more knowledge intensive, and it is essential for NURBS to take its share of this ever increasing use of knowledge. NURBS are very powerful modeling tools and have become the de facto standard in modeling; if we extend their strength and make them knowledge driven, benefits beyond current expectations can be achieved easily. This dissertation introduces knowledge-guided NURBS, with theoretical and practical foundations for supporting design intent capture, retrieval, and exchange among dissimilar CAD systems. It shows that if NURBS entities are tagged with some knowledge, we can achieve seamless data exchange, increase robustness, and obtain more reliable computations, all of which are objectives that researchers in the field of CAD have been trying to accomplish for decades. Establishing relationships between a NURBS entity and its origin and destinations can aid seamless CAD model migration. Knowing the type of a NURBS entity and being aware of any irregularities can lead to more intelligent decisions about how to proceed with many computations, increasing robustness and achieving a high level of reliability. As a result, instead of models that are hardly modifiable because raw numerical data were migrated in isolation, the knowledge-driven migration process produces models that are editable and preserve design intent. We have addressed these issues not only theoretically but also by developing a prototype system that serves as a test bed. The developed system shows that a click of a button can regenerate a migrated model instead of repairing it, avoiding the delays and corrective processes that limit the effective use of such models.
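To illustrate the core idea of carrying design knowledge alongside raw NURBS data so that a receiving system can regenerate an entity instead of repairing imported geometry, here is a small Python sketch. It is not the dissertation's prototype; every class and field name below is an assumption made for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class NurbsCurve:
        degree: int
        control_points: list
        knots: list
        weights: list

    @dataclass
    class KnowledgeTag:
        entity_kind: str            # e.g. "circular_arc", "fillet_edge", "loft_section"
        source_system: str          # originating CAD package
        modeling_tolerance: float   # tolerance the geometry was built to
        construction_recipe: dict = field(default_factory=dict)  # parameters to rebuild the entity

    @dataclass
    class TaggedEntity:
        geometry: NurbsCurve
        knowledge: KnowledgeTag

        def can_regenerate(self):
            """A receiver regenerates from the recipe when one is present,
            instead of repairing migrated control-point data."""
            return bool(self.knowledge.construction_recipe)

    arc = TaggedEntity(
        geometry=NurbsCurve(
            degree=2,
            control_points=[(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)],
            knots=[0, 0, 0, 1, 1, 1],
            weights=[1.0, 0.7071, 1.0],     # rational weights of a 90-degree circular arc
        ),
        knowledge=KnowledgeTag(
            entity_kind="circular_arc",
            source_system="SystemA",
            modeling_tolerance=1e-6,
            construction_recipe={"center": (0, 0, 0), "radius": 1.0, "sweep_deg": 90},
        ),
    )
    print("regenerate instead of repair:", arc.can_regenerate())
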
Gli stili APA, Harvard, Vancouver, ISO e altri
44

Lin, Wenhsyong. "An object-oriented software development environment for geometric modeling in intelligent computer aided design". Diss., Virginia Tech, 1992. http://hdl.handle.net/10919/40409.

Testo completo
Abstract (sommario):
The concept of intelligent CAD systems to assist a designer in automating the design process has been discussed for years. It has been recognized that knowledge engineering techniques and the study of design theory can provide certain solutions to this approach. A major issue in developing intelligent CAD systems for geometric modeling is the integration of the design geometry with the representation of the design constraints. Current commercial computer aided design (CAD) systems are used primarily for recording the results of the design process. Using conventional CAD systems, a design engineer either must create the geometry of the design object with precise coordinates and dimensions, or start his design from an existing geometry of a previous design. It is difficult to propagate a dimensional change throughout an entire model -- especially solid models. This rigidity imposed by conventional CAD systems discourages a designer from exploring different approaches in creating a novel product.
Ph. D.
Gli stili APA, Harvard, Vancouver, ISO e altri
45

Berhe, Leakemariam. "Statistical modeling and design in forestry : The case of single tree models". Doctoral thesis, Umeå : Umeå University, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1663.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
46

Rambo, Jeffrey D. "Reduced-Order Modeling of Multiscale Turbulent Convection: Application to Data Center Thermal Management". Diss., Available online, Georgia Institute of Technology, 2006, 2006. http://etd.gatech.edu/theses/available/etd-03272006-080024/.

Testo completo
Abstract (sommario):
Thesis (Ph. D.)--Mechanical Engineering, Georgia Institute of Technology, 2006.
Marc Smith, Committee Member ; P.K. Yeung, Committee Member ; Benjamin Shapiro, Committee Member ; Sheldon Jeter, Committee Member ; Yogendra Joshi, Committee Chair.
Gli stili APA, Harvard, Vancouver, ISO e altri
47

Lee, Ghang. "A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry". Diss., Available online, Georgia Institute of Technology, 2004, 2004. http://etd.gatech.edu/theses/available/etd-10262004-191554/unrestricted/lee%5Fghang%5F200412%5Fphd.pdf.

Testo completo
Abstract (sommario):
Thesis (Ph. D.)--Architecture, Georgia Institute of Technology, 2005.
Eastman, Charles M., Committee Chair ; Augenbroe, Godfried, Committee Co-Chair ; Navathe, Shamkant B., Committee Co-Chair ; Hardwick, Martin, Committee Member ; Sacks, Rafael, Committee Member. Vita. Includes bibliographical references.
Gli stili APA, Harvard, Vancouver, ISO e altri
48

Tashmukhambetov, Arslan. "Experimental Design, Data Analysis, and Modeling for Characterizing the Three-Dimensional Acoustic Field of a Seismic Airgun Array". ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/1084.

Testo completo
Abstract (sommario):
In June 2003, the Littoral Acoustic Demonstration Center conducted an acoustic characterization experiment for a standard seismic exploration array. Two moorings with Environmental Acoustic Recording Systems (EARS) were deployed in the northern Gulf of Mexico to measure ambient noise and collect shot information. A 21-element seismic airgun array was towed along five parallel linear tracks with horizontal closest-approach distances to the EARS buoy position of 63, 500, 1000, 2000, and 5000 m. Calibrated acoustic pressure measurements collected during the experiment were analyzed to obtain zero-to-peak sound pressures, sound exposure levels, and pressure levels in 1/3-octave frequency bands. In addition, the experimental data were modeled with a modified underwater acoustic propagation model to fill in missing measurements. The resulting modeling procedure showed good agreement between measured and modeled data in absolute pressure amplitudes and frequency interference patterns for frequencies up to 1000 Hz. The analysis is important for investigating the potential impact on marine mammals and fish and for predicting exposure levels for newly planned seismic surveys in other geographic areas. Based on the results of the experiment and the data analysis performed, a new experimental design was proposed to maximize the amount of collected data using the available equipment while minimizing the time needed for the source ship. The design uses three patches: one with 3° angular spacing between the lines at a reference depth, within it a smaller patch with 1° spacing, and within that a still smaller patch with 0.5° spacing. This arrangement gives a reasonably uniform distribution of shots versus solid angle, with a large variety of emission and azimuthal angles for different ranges. Because of the uncertainty of positioning systems, the angular space is divided into solid-angle bins. Simulations predicted more than 200 shots per bin for emission angles greater than 13 degrees. Statistical analysis of the collected data will be performed on the proposed bin basis. An experiment based on the proposed design was conducted in fall 2007. The measurements collected during that experiment are currently being analyzed and will be reported in the near future.
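To make the solid-angle bin idea concrete, here is a small generic sketch of binning shot directions so that each bin subtends roughly the same solid angle. The bin counts, angular resolutions, and synthetic shot directions are assumptions; this is not the authors' analysis code.

    import math
    import random

    def solid_angle_bin(emission_deg, azimuth_deg, n_polar=12, n_azimuth=36):
        """Return (polar_bin, azimuth_bin) indices for a shot direction.

        Polar bins are uniform in cos(emission angle), so every (polar, azimuth)
        cell subtends approximately the same solid angle."""
        cos_t = math.cos(math.radians(emission_deg))
        polar_bin = min(int((1.0 - cos_t) / (2.0 / n_polar)), n_polar - 1)
        azimuth_bin = int((azimuth_deg % 360.0) / (360.0 / n_azimuth)) % n_azimuth
        return polar_bin, azimuth_bin

    random.seed(0)
    counts = {}
    for _ in range(5000):                           # synthetic shot directions
        emission = math.degrees(math.acos(random.uniform(-1.0, 1.0)))
        azimuth = random.uniform(0.0, 360.0)
        key = solid_angle_bin(emission, azimuth)
        counts[key] = counts.get(key, 0) + 1

    print("occupied bins:", len(counts))
    print("shots in the fullest bin:", max(counts.values()))
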
Gli stili APA, Harvard, Vancouver, ISO e altri
49

Samadiani, Emad. "Energy efficient thermal management of data centers via open multi-scale design". Diss., Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/37218.

Testo completo
Abstract (sommario):
Data centers are computing infrastructure facilities that house arrays of electronic racks containing high-power-dissipation data processing and storage equipment whose temperature must be maintained within allowable limits. In this research, sustainable and reliable operation of the electronic equipment in data centers is shown to be possible through the Open Engineering Systems paradigm. A design approach is developed to bring adaptability and robustness, two main features of open systems, to multi-scale convective systems such as data centers. The presented approach is centered on the integration of three constructs: (a) Proper Orthogonal Decomposition (POD) based multi-scale modeling, (b) the compromise Decision Support Problem (cDSP), and (c) robust design, which respectively address the challenges of thermal-fluid modeling, multiple objectives, and inherent variability management. Two new POD-based reduced-order thermal modeling methods are presented to simulate the multi-parameter-dependent temperature field in multi-scale thermal/fluid systems such as data centers. The methods are verified to achieve an adaptable, robust, and energy-efficient thermal design of an air-cooled data center cell subject to an annual increase in power consumption over the next ten years. A simpler reduced-order modeling approach, centered on the POD technique with modal coefficient interpolation, is also validated against experimental measurements in an operational data center facility.
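A minimal proper orthogonal decomposition (POD) sketch via the singular value decomposition shows the reduced-order idea the abstract refers to: approximate a field from a few dominant modes plus interpolated modal coefficients. The snapshots below are synthetic one-dimensional profiles, not data-center measurements, and the parameterization is an assumption for illustration.

    import numpy as np

    x = np.linspace(0.0, 1.0, 200)
    params = np.linspace(0.5, 2.0, 12)                      # e.g. a rack heat-load parameter
    snapshots = np.column_stack([np.sin(p * np.pi * x) + 0.3 * p * x for p in params])

    # POD modes are the left singular vectors of the snapshot matrix.
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    k = 3                                                    # keep the dominant modes
    modes = U[:, :k]

    # Modal coefficients for a new parameter value, by linear interpolation between
    # the coefficients of the stored snapshots (one simple coefficient-interpolation choice).
    coeffs = modes.T @ snapshots                             # k x n_params
    p_new = 1.3
    new_coeffs = np.array([np.interp(p_new, params, coeffs[i]) for i in range(k)])
    reconstruction = modes @ new_coeffs

    truth = np.sin(p_new * np.pi * x) + 0.3 * p_new * x
    rel_err = np.linalg.norm(reconstruction - truth) / np.linalg.norm(truth)
    print("relative reconstruction error:", round(float(rel_err), 4))
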
Gli stili APA, Harvard, Vancouver, ISO e altri
50

Capdevila, Ibañez Bruno. "Serious game architecture and design : modular component-based data-driven entity system framework to support systemic modeling and design in agile serious game developments". Paris 6, 2013. http://www.theses.fr/2013PA066727.

Testo completo
Abstract (sommario):
Over the past ten years or so, the learning properties inherent in videogames have led many developers to explore their potential as a medium of expression for diverse and innovative (serious) purposes. Learning is at the heart of the play experience, but it normally takes place in the affective and psychomotor domains. When learning targets serious content, cognitive/instructional designers must ensure its effectiveness in the cognitive domain. In such eminently multidisciplinary teams (game, technology, cognition, and art), understanding and communication are indispensable for effective collaboration from the very first design stage. Taking a software engineering approach, we focus on the (multidisciplinary) activities of the development process rather than on the disciplines themselves, with the aim of unifying and clarifying the field. We then propose a software foundation that reinforces this multidisciplinary model through an underdesign approach that favors the creation of collaborative design spaces. Genome Engine can thus be seen as a data-driven sociotechnical infrastructure that allows non-programmer developers, such as the game designer and possibly the cognitive designer, to participate actively in constructing the product design rather than evaluating it only at usage time. Its architecture is founded on an entity system-of-systems style, which contributes to its modularity, reusability, and adaptability, and provides abstractions that foster communication. Several real projects allowed us to test our approach.
For the last ten years, we have witnessed how the inherent learning properties of videogames entice many creators into exploring their potential as a medium of expression for diverse and innovative (serious) purposes. Learning is at the core of the play experience, but it usually takes place in the affective and psychomotor domains. When learning targets serious content, cognitive/instructional designers must ensure its effectiveness in the cognitive domain. In such eminently multidisciplinary teams (game, technology, cognition, art), understanding and communication are essential for effective collaboration from the inception stage onward. Taking a software engineering approach, we focus on the (multidisciplinary) activities of the development process rather than on the disciplines themselves, with the intent of unifying and clarifying the field. We then propose a software foundation that reinforces this multidisciplinary model through an underdesign approach that favors the creation of collaborative design workspaces. Genome Engine can thereby be considered a data-driven sociotechnical infrastructure that provides non-programmer developers, such as game designers and potentially cognitive designers, with a means to actively participate in the construction of the product design, rather than evaluating it only at usage time. Its architecture is based on a component-based application framework with an entity system-of-systems runtime object model, which contributes to modularity, reuse, and adaptability, and provides familiar abstractions that ease communication. Our approach has been extensively evaluated through the development of several serious game projects.
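Because the thesis's architecture rests on an entity-system runtime object model, a stripped-down entity-component-system (ECS) sketch may clarify the style it refers to: entities are plain identifiers, data live in components, and behavior lives in systems that iterate over matching components. This is a generic Python illustration, not Genome Engine's API; all class and method names are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Position:
        x: float
        y: float

    @dataclass
    class Velocity:
        dx: float
        dy: float

    class World:
        def __init__(self):
            self._next_id = 0
            self.components = {}          # component type -> {entity id -> component}

        def create_entity(self, *components):
            entity = self._next_id
            self._next_id += 1
            for c in components:
                self.components.setdefault(type(c), {})[entity] = c
            return entity

        def query(self, *types):
            """Yield (entity, comp1, comp2, ...) for entities owning all requested types."""
            if not types:
                return
            for entity, first in self.components.get(types[0], {}).items():
                rest = [self.components.get(t, {}).get(entity) for t in types[1:]]
                if all(r is not None for r in rest):
                    yield (entity, first, *rest)

    def movement_system(world, dt):
        # A "system": pure behavior operating on the data stored in components.
        for _, pos, vel in world.query(Position, Velocity):
            pos.x += vel.dx * dt
            pos.y += vel.dy * dt

    world = World()
    player = world.create_entity(Position(0.0, 0.0), Velocity(1.0, 0.5))
    movement_system(world, dt=0.016)
    print(world.components[Position][player])
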
Gli stili APA, Harvard, Vancouver, ISO e altri