To see the other types of publications on this topic, follow the link: Modeling of processor design.

Dissertations / Theses on the topic 'Modeling of processor design'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Modeling of processor design.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Chen, Yuan. "High level modelling and design of a low powered event processor." Thesis, University of Newcastle Upon Tyne, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.500939.

Full text
Abstract:
With the fast development of semiconductor technology, more and more Intellectual Property (IP) cores can be integrated into one chip under the Globally Asynchronous and Locally Synchronous (GALS) architecture. Power becomes the main restriction on System-on-Chip (SOC) performance, especially when the chip is used in a portable device. Many low-power technologies have been proposed and studied for IP core design. However, there is a shortage of system-level power management schemes (policies) for the GALS architecture. In particular, the area of using Dynamic Power Management (DPM) to optimize SOC power dissipation under latency restrictions remains relatively unexplored. This thesis describes the modelling and design of an asynchronous event coprocessor to control the operations of an IP core in the GALS architecture.
APA, Harvard, Vancouver, ISO, and other styles
2

Popescu, Catalin Nicolae. "Modeling and control of extrusion coating." Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/13700.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yuan, Fangfang. "Assessing the impact of processor design decisions on simulation based verification complexity using formal modeling with experiments at instruction set architecture level." Thesis, University of Bristol, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.566838.

Full text
Abstract:
The Instruction Set Architecture (ISA) describes the key functionalities of a processor design and is the most comprehensible format for enabling humans to understand the structure of the entire processor design. This thesis first introduces the construction of a generic ISA formal model with mathematical notations rather than programming languages, and demonstrates its extensions towards specific ISA designs. The stepwise refinement modeling technique gives rise to a hierarchically structured model, which improves the overall comprehensibility of the ISA and reduces the effort required for modeling similar designs. The ISA models serve as self-consistent, complete, and unambiguous specifications for coding, while helping engineers explore different design options beforehand. In the design phase, a selection of features is available to architects in order for the design to be trimmed towards a particular optimization target, e.g. low power consumption or fast computation, which can be assessed before implementation. However, taking verification into consideration, there is to my knowledge no way to estimate the difficulty of verifying a design before coding it. There needs to be a platform and a metric from which both functional and non-functional properties can be quantitatively represented and then compared before implementation. Hence, this thesis secondly proposes a metric, based on the formally reasoned extension of the generic ISA models, as an estimator of a non-functional property, i.e. the verification complexity for achieving verification goals. The main claim of this thesis is that the verification complexity in simulation-based verification can be accurately retrieved from a hierarchically constructed ISA formal model in which the functionalities are fully specified with the correctness preserved. The modeling structure allows relative comparisons at a reasonably high level of abstraction brought by the hierarchically constructed formalization.
The analysis of the experimental ISA emulator assesses the quality of the metric and confirms the applicability of the proposed metric.
APA, Harvard, Vancouver, ISO, and other styles
4

Prasai, Anish. "Methodologies for Design-Oriented Electromagnetic Modeling of Planar Passive Power Processors." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/34164.

Full text
Abstract:
The advent and proliferation of planar technologies for power converters are driven in part by the overall trends in analog and digital electronics. These trends, coupled with the demands for increasingly higher power quality and tighter regulation, raise various design challenges. Because inductors and transformers constitute a rather large part of the overall converter volume, size and performance improvement of these structures can subsequently enhance the capability of power converters to meet these application-driven demands. Increasing the switching frequency has been the traditional approach to reducing converter size and improving performance. However, the increase in switching frequency leads to increased power loss density in windings and core, with a subsequent increase in device temperature, parasitics and electromagnetic radiation. An accurate set of reduced-order modeling methodologies is presented in this work in order to predict the high-frequency behavior of inductors and transformers. Analytical frequency-dependent expressions to predict losses in planar, foil windings and cores are given. The losses in the core and windings raise the temperature of the structure. In order to ensure the temperature limit of the structure is not exceeded, 1-D thermal modeling is undertaken. Based on the losses and the temperature limit, a methodology to optimize the performance of magnetics is outlined. Both numerical and analytical means are employed in the extraction of transformer parasitics and cross-coupling. The results are compared against experimental measurements and are found to be in good accord. A simple near-field electromagnetic shield design is presented in order to mitigate the amount of radiation. Due to the inadequacy of existing winding technology in forming suitable planar windings for PCB applications, an alternate winding scheme is proposed that relies on depositing windings directly onto the core.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
5

Qian, Zhiguang. "Computer experiments: design, modeling and integration." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11480.

Full text
Abstract:
The use of computer modeling is fast increasing in almost every scientific, engineering and business arena. This dissertation investigates some challenging issues in the design, modeling and analysis of computer experiments, and consists of four major parts. In the first part, a new approach is developed to combine data from approximate and detailed simulations to build a surrogate model based on some stochastic models. In the second part, we propose some Bayesian hierarchical Gaussian process models to integrate data from different types of experiments. The third part concerns the development of latent variable models for computer experiments with multivariate response, with application to data center temperature modeling. The last chapter is devoted to the development of nested space-filling designs for multiple experiments with different levels of accuracy.
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, Qiang. "Process modeling of innovative design using systems engineering." Thesis, Strasbourg, 2014. http://www.theses.fr/2014STRAD007/document.

Full text
Abstract:
We develop a series of process models to comprehensively describe and effectively manage innovative design, in order to achieve an adequate balance between innovation and control, following the design research methodology (DRM). Firstly, we introduce a descriptive model of innovative design. This model reflects the actual process and pattern of innovative design, locates innovation opportunities in the process and supports a systematic perspective whose focus is the external and internal factors affecting the success of innovative design. Secondly, we perform an empirical study to investigate how control and flexibility can be balanced to manage uncertainty in innovative design. After identifying project practices that cope with these uncertainties in terms of control and flexibility, a case-study sample based on five innovative design projects from an automotive company is analyzed and shows that control and flexibility can coexist. Based on the managerial insights of the empirical study, we develop the procedural process model and the activity-based adaptive model of innovative design. The former provides the conceptual framework to balance innovation and control through process structuration at the project level and the integration of flexible practices at the operation level. The latter considers innovative design as a complex adaptive system, and thereby proposes a method of process design that dynamically constructs the process architecture of innovative design. Finally, the two models are verified through a number of process analyses and simulations within a series of innovative design projects.
APA, Harvard, Vancouver, ISO, and other styles
7

Satyanarayana, Srinath. "Fixture-workpiece contact modeling for a compliant workpiece." Thesis, Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/17874.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Panchal, Jitesh H. "A framework for simulation-based integrated design of multiscale products and design processes." Diss., Available online, Georgia Institute of Technology, 2005. http://etd.gatech.edu/theses/available/etd-11232005-112626/.

Full text
Abstract:
Thesis (Ph. D.)--Mechanical Engineering, Georgia Institute of Technology, 2006.
Eastman, Chuck, Committee Member ; Paredis, Chris, Committee Co-Chair ; Allen, Janet, Committee Member ; Rosen, David, Committee Member ; Tsui, Kwok, Committee Member ; McDowell, David, Committee Member ; Mistree, Farrokh, Committee Chair. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
9

Cui, Song. "Hardware mapping of critical paths of a GaAs core processor for solid modelling accelerator /." Title page, contents and abstract only, 1996. http://web4.library.adelaide.edu.au/theses/09PH/09phc9661.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Perdikaki, Olga. "Modeling the work center design problems for thermal processes in semiconductor manufacturing." [Gainesville, Fla.] : University of Florida, 2003. http://purl.fcla.edu/fcla/etd/UFE0001451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Storm, Sandra [Verfasser]. "Molecular Modeling and Experimental Design of Surfactant-Based Extraction Processes / Sandra Storm." München : Verlag Dr. Hut, 2015. http://d-nb.info/1071513109/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Funk, Suzana. "Processo criativo para o design virtual de embalagens." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2010. http://hdl.handle.net/10183/29059.

Full text
Abstract:
The aim of this work is to systematize the virtual creative process for packaging, using 3D modeling software as a support tool for the creation and materialization of ideas. Packaging nowadays is reformulated ever more often and in less time, and is important for the success of products and companies. It is therefore essential to study how to develop packaging quickly and effectively, so that it meets strategic business objectives, interacts with people in the best possible way and is less aggressive to the environment in the context in which it is consumed. With the technological resources available, packaging is normally created with the help of software, but with little or no systematization of the creative process. To address the virtual creative process for packaging, theoretical aspects of packaging design, creativity and creative processes, the three-dimensional aspects of form and 3D virtual modeling were studied. In addition, professionals with experience in 3D virtual modeling were surveyed. A qualitative analysis of these data was carried out, and the findings were then interconnected to elaborate the ten guidelines proposed here as a guide for the virtual creative process of packaging design. These guidelines were organized within a methodology consisting of four stages: Search, Connect, Create and Present. Its differential, compared to other methodologies, lies in the Create and Present phases, which involve creating packaging using the possibilities of 3D modeling software. In formulating the guidelines for the software part of the creative process, generalized procedures were sought, since different software can be used in the creation and materialization of an idea.
To put the methodology into practice, a perfume bottle was designed following the ten proposed guidelines. The guidelines contributed significantly to guiding, organizing and even inspiring the generation of ideas in the creative process, and to conducting the activities up to the final presentation of the work. It was also observed that computer graphics can contribute significantly to design, enhancing the creation and materialization of packaging. In this way, creation time and prototyping costs can be reduced, and the final visualization of the product improved.
APA, Harvard, Vancouver, ISO, and other styles
13

Zeng, Yong. "Axiomatic approach to the modeling of product conceptual design processes using set theory." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/nq64894.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Bare, Marshall Edwin. "Structuring Empirical Methods for Reuse and Efficiency in Product Development Processes." BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/1032.

Full text
Abstract:
Product development requires that engineers have the ability to predict product performance. When product performance involves complex physics and natural phenomena, mathematical models are often insufficient to provide accurate predictions. Engineering companies compensate for this deficiency by testing prototypes to obtain empirical data that can be used in place of predictive models. The purpose of this work is to provide techniques and methods for the efficient use of empirical methods in product development processes. Empirical methods involve the design and creation of prototype hardware and the testing of that hardware in controlled environments. Empirical methods represent a complete product development sub-cycle within the overall product development process. Empirical product development cycles can be expensive in both time and resources. Global economic pressures have caused companies to focus on improving the productivity of their product development cycles. A variety of techniques for improving the productivity of product development processes have been developed. These methods focus on structuring process steps and product artifacts for reuse and efficiency. However, these methods have, to this point, largely ignored the product development sub-cycle of empirical design. The same techniques used on the overall product development process can and should be applied to the empirical product development sub-cycle. This thesis focuses on applying methods of efficient and reusable product development processes to the empirical development sub-cycle. It also identifies how to efficiently link the empirical product development sub-cycle into the overall product development process.
Specifically, empirical product development sub-cycles can be characterized by their purposes into three specific types: first, obtaining data for predictive model coefficients, boundary conditions and driving functions; second, validating an existing predictive model; and third, providing the basis for predictions using interpolation and extrapolation of the empirical data when a predictive model does not exist. These three types of sub-cycles are structured as reusable processes in a standard form that can be used generally in product development. The roles of these three types of sub-cycles in the overall product development process are also established and the linkages defined. Finally, the techniques and methods provided for improving the efficiency of empirical methods in product development processes are demonstrated in a form that shows their benefits.
APA, Harvard, Vancouver, ISO, and other styles
15

Fu, Yee-tien. "Simulation modeling of information flows in decision making processes for design-to-manufacturing strategies." Thesis, Virginia Tech, 1989. http://hdl.handle.net/10919/45938.

Full text
Abstract:

Most successful manufacturing companies were initially formed around a unique or superior product design. As a result of this trend, many companies, especially in high-technology industries, considered design and marketing the company's primary functions. When the United States had superior manufacturing technological capabilities in the 1950's and 1960's, corporate management could systematically neglect manufacturing and still be successful. Manufacturing was treated as a service organization and evaluated in the negative terms of poor quality, low productivity, high wage rates, and so on. Manufacturing was not expected to make a positive contribution to a company's success. Recent successes of the Japanese in building higher quality, lower cost products show the critical error in this philosophy. Manufacturing is now a major factor in a company's competitive position (Priest, 1988).

Manufacturing strategies are the framework for accomplishing the long-term corporate goals for the manufacturing function. This framework helps to focus manufacturing goals and provides plans for integrating the necessary functions and resources into a coordinated effort to improve production. Communication of this strategy sets the right climate for the teamwork and long-term planning that are necessary in developing improved manufacturing capabilities. The strategy should be well publicized throughout the company, with regularly scheduled reviews to monitor progress toward the goals.

Recently, emphasis on manufacturing strategy, as advocated by leading scholars in the 1970's and 1980's, has been recognized as one way to regain the competitive advantage for American manufacturers. Work carried out recently at Harvard and Stanford by Porter (1985), Skinner (1985), Wheelwright and Hayes (1984) has given more attention to the central role and potential importance of manufacturing; this work helps to explain the relative success of Japanese and German companies. Their work also sets a sound background for further research in this area. However, current studies in manufacturing strategy have delved into technology and financial considerations. But technology, capital, and work force are all planned by the people (managers) who are always located in some kind of organizational structure. Based on the experimental results proposed by industrial psychologists, this research is one step toward a quantitative study of organizational efficiency. Two major types of organizations, hierarchical (serial) and egalitarian (parallel), are investigated by applying simulation techniques. The variables controlled comprise organizational type, number of levels in a hierarchical structure, and number of participants. The research results are also applied to investigate the applicability of current design-to-manufacturing strategies, such as simultaneous engineering and concurrent design in firms. Suggestions on how to reduce the design-to-manufacturing time through appropriate organizational structures are presented following analysis of the simulation results.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
16

Kim, Sangwook. "A multi-disciplinary approach for ontological modeling of enterprise business processes: case-based approach." Free to MU campus, to others for purchase, 2002. http://wwwlib.umi.com/cr/mo/fullcit?p3052188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Souza, Patrícia de Mello [UNESP]. "A modelagem tridimensional como implemento do processo de desenvolvimento do produto de moda." Universidade Estadual Paulista (UNESP), 2006. http://hdl.handle.net/11449/96266.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
This research verifies the efficiency of three-dimensional modeling, draping, as a means to optimize the development process of fashion/clothing products. To that end, it focuses on the creation and materialization steps in which the technique is embedded, revealing a dichotomy between the creation and modeling areas. It addresses the technical, constructive, ergonomic and esthetic qualities involved in the product modeling project, emphasizing comfort, fit and formal innovation. This qualitative research collected its data through systematic observation in an academic setting, across a variety of problem situations, at different moments and with varied information sources - scenarios created to reproduce, with due proportions and specificities, analogous industrial situations. It establishes the following lines of observation: to create and materialize; to materialize another's creation; the creation consists in the materialization itself. Indicators defined in the research design - dimensional fit, wearability, formal innovation, time, rework, raw-material consumption, assembly solutions - lead to the results, in a comparison of the data obtained when the three-dimensional modeling technique is and is not inserted in the fashion product development process. The efficiency of the technique in the process is confirmed.
APA, Harvard, Vancouver, ISO, and other styles
18

Lee, Ghang. "A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry." Diss., Available online, Georgia Institute of Technology, 2004. http://etd.gatech.edu/theses/available/etd-10262004-191554/unrestricted/lee%5Fghang%5F200412%5Fphd.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Architecture, Georgia Institute of Technology, 2005.
Eastman, Charles M., Committee Chair ; Augenbroe, Godfried, Committee Co-Chair ; Navathe, Shamkant B., Committee Co-Chair ; Hardwick, Martin, Committee Member ; Sacks, Rafael, Committee Member. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
19

Sriwiriyarat, Tongchai. "Computer Program Development for the Design of IFAS Wastewater Treatment Processes." Thesis, Virginia Tech, 1999. http://hdl.handle.net/10919/32065.

Full text
Abstract:
The Integrated Film Activated Sludge Process (IFAS) was developed to reduce the cost of additional facilities required to complete year round nitrification in the design of new or retrofit wastewater treatment plants. The purpose of this project was to develop a computer-based mechanistic model, called IFAS, which can be used as a tool by scientists and engineers to optimize their designs and to troubleshoot a full-scale treatment plant. The program also can be employed to assist researchers conducting their studies of IFAS wastewater treatment processes. IFAS enables the steady-state simulation of nitrification-denitrification processes as well as carbonaceous removal in systems utilizing integrated media, but this current version supports only sponge type media. The IFAS program was developed by incorporating empirical equations for integrated biofilm carbonaceous uptake and nitrification developed by Sen and Randall (1995) into the general activated sludge model, developed by the International Association on Water Quality (IAWQ, previously known as IAWRC), plus the biological phosphorus removal model of Wentzel et al (1989). The calibration and evaluation of the IFAS model was performed using existing data from both an IFAS system and a conventional activated sludge bench-scale plant operated over a wide range of Aerobic Mean Cell Residence Times (Aerobic MCRT's). The model developed provides a good fit and a reasonable prediction of the experimental data for both the IFAS and the conventional pilot-scale systems. The phosphorus removal component of the model has not yet been calibrated because of insufficient data and the lack of adequately defined parameters.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
20

Narayanan, Sundaram. "Design and development of an object-oriented architecture for modeling and simulation of discrete-part manufacturing systems." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/24374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Dubos, Gregory Florent. "Stochastic modeling of responsiveness, schedule risk and obsolescence of space systems, and implications for design choices." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/43656.

Full text
Abstract:
The U.S. Department of Defense and the National Aeronautics and Space Administration continue to face common challenges in the development and acquisition of their space systems. In particular, space programs repeatedly experience significant schedule slippages, and spacecraft are often delivered on-orbit several months, sometimes years, after the initially planned delivery date. The repeated pattern of these schedule slippages suggests deep-seated flaws in managing spacecraft delivery and schedule risk, and an inadequate understanding of the drivers of schedule slippages. Furthermore, due to their long development time and physical inaccessibility after launch, space systems are exposed to a particular and acute risk of obsolescence, resulting in loss of value or competitive advantage over time. The perception of this particular risk has driven some government agencies to promote design choices that may ultimately be contributing to these schedule slippages, and jeopardizing what is increasingly recognized as critical, namely space responsiveness. The overall research objective of this work is twofold: (1) to identify and develop a thorough understanding of the fundamental causes of the risk of schedule slippage and obsolescence of space systems; and in so doing, (2) to guide spacecraft design choices that would result in better control of spacecraft delivery schedule and mitigate the impact of these "temporal risks" (schedule and obsolescence risks). To lay the groundwork for this thesis, first, the levers of responsiveness, or means to influence schedule slippage and impact space responsiveness, are identified and analyzed, including design, organizational, and launch levers. Second, a multidisciplinary review of obsolescence is conducted, and the main drivers of system obsolescence are identified.
This thesis then adapts the concept of a technology portfolio from the macro- or company level to the micro-level of a single complex engineering system, and it analyzes a space system as a portfolio of technologies and instruments, each technology with its distinct stochastic maturation path and exposure to obsolescence. The selection of the spacecraft portfolio is captured by parameters such as the number of instruments, the initial technology maturity of each technology/instrument, the resulting heterogeneity of the technology maturity of the whole system, and the spacecraft design lifetime. Building on the abstraction of a spacecraft as a portfolio of technologies, this thesis then develops a stochastic framework that provides a powerful capability to simultaneously explore the impact of design decisions on spacecraft schedule, on-orbit obsolescence, and cumulative utility delivered by the spacecraft. Specifically, this thesis shows how the choice of the portfolio size and the instruments' Technology Readiness Levels (TRLs) impact the Mean-Time-To-Delivery (MTTD) of the spacecraft and mitigate (or exacerbate) schedule risk. This work also demonstrates that specific combinations/choices of the spacecraft design lifetime and the TRLs can reduce the risk of on-orbit obsolescence. This thesis then advocates for a paradigm shift towards a calendar-based design mindset, in which the delivery time of the spacecraft is accounted for, as opposed to the traditional clock-based design mindset. The calendar-based paradigm is shown to lead to different design choices, which are more likely to prevent schedule slippage and/or enhance responsiveness and ultimately result in a larger cumulative utility delivered. Finally, mission scenarios are presented to illustrate how the framework and analyses proposed here can help identify system design choices that satisfy various mission objectives and constraints (temporal as well as utility-based).
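The portfolio abstraction described above can be illustrated with a toy Monte Carlo: if each instrument's maturation time is random, with lower initial TRLs implying longer and more uncertain paths, and delivery waits for the slowest instrument, then Mean-Time-To-Delivery grows with portfolio size and low-TRL content. The Python sketch below uses an invented TRL-to-months mapping purely for illustration; it is not the thesis's actual model.

```python
import random
import statistics

# Hypothetical mapping: lower initial TRL -> longer mean maturation time.
# These values are illustrative assumptions, not figures from the thesis.
TRL_MEAN_MONTHS = {3: 48, 5: 30, 7: 18, 9: 6}

def delivery_time(trls, rng):
    """Spacecraft delivery waits for the slowest instrument to mature."""
    return max(rng.gauss(TRL_MEAN_MONTHS[t], 0.25 * TRL_MEAN_MONTHS[t])
               for t in trls)

def mean_time_to_delivery(trls, n=20000, seed=1):
    """Monte Carlo estimate of MTTD for a portfolio of initial TRLs."""
    rng = random.Random(seed)
    return statistics.mean(delivery_time(trls, rng) for _ in range(n))

# A large portfolio with low-TRL instruments slips more than a small, mature one.
mttd_small_mature = mean_time_to_delivery([7, 9])
mttd_large_mixed = mean_time_to_delivery([3, 5, 7, 9])
```

The max-of-random-times structure is why adding one immature instrument can dominate the whole schedule risk of the portfolio.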
APA, Harvard, Vancouver, ISO, and other styles
22

Ghosh, Tapajyoti. "Integrated sustainability assessment and design of processes, supply chains, ecosystems and economy using life cycle modeling methods." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1563480013206943.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Rasmusson, Kristina. "Modeling of geohydrological processes in geological CO2 storage – with focus on residual trapping." Doctoral thesis, Uppsala universitet, Luft-, vatten och landskapslära, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-327994.

Full text
Abstract:
Geological storage of carbon dioxide (CO2) in deep saline aquifers is one approach to mitigate release from large point sources to the atmosphere. Understanding of in-situ processes providing trapping is important to the development of realistic models and the planning of future storage projects. This thesis covers both field- and pore-scale numerical modeling studies of such geohydrological processes, with focus on residual trapping. The setting is a CO2-injection experiment at the Heletz test site, conducted within the frame of the EU FP7 MUSTANG and TRUST projects. The objectives of the thesis are to develop and analyze alternative experimental characterization test sequences for determining in-situ residual CO2 saturation (Sgr), as well as to analyze the impact of the injection strategy on trapping, the effect of model assumptions (coupled wellbore-reservoir flow, geological heterogeneity, trapping model) on the predicted trapping, and to develop a pore-network model (PNM) for simulating and analyzing pore-scale mechanisms. The results include a comparison of alternative characterization test sequences for estimating Sgr. The estimates were retrieved through parameter estimation. The effect on the estimate of including various data sets was determined. A new method, using withdrawal and an indicator-tracer, for obtaining a residual zone in-situ was also introduced. Simulations were made of the CO2 partitioning between layers in a multi-layered formation, and parameters influencing this were identified. The results showed the importance of accounting for coupled wellbore-reservoir flow in simulations of such scenarios. Simulations also showed that adding chase-fluid stages after a conventional CO2 injection enhances the (residual and dissolution) trapping. Including geological heterogeneity generally decreased the estimated trapping. 
The choice of trapping model may largely affect the amount of predicted residual trapping (although most of the models produced similar results). The use of an appropriate trapping model and description of geological heterogeneity for a site when simulating CO2 sequestration is vital, as different assumptions may give significant discrepancies in predicted trapping. The results also include a PNM code for multiphase quasi-static flow and trapping in porous materials. It was used to investigate trapping and to obtain an estimated initial-residual (IR) trapping curve for Heletz sandstone.
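Among trapping models of the kind compared here, Land's (1968) model is a common choice; its initial-residual (IR) curve follows directly from Sgr = Sgi / (1 + C·Sgi). The sketch below uses illustrative saturation endpoints, not Heletz-specific values.

```python
def land_residual_saturation(s_gi, c):
    """Land (1968) trapping model: residual gas saturation after imbibition
    as a function of the initial gas saturation s_gi and Land coefficient c."""
    return s_gi / (1.0 + c * s_gi)

def land_coefficient(s_gr_max, s_g_max):
    """Land coefficient from the maximum residual and maximum initial saturations."""
    return 1.0 / s_gr_max - 1.0 / s_g_max

# Illustrative endpoints (assumed, not site data): s_g_max = 0.8, s_gr_max = 0.35.
c = land_coefficient(0.35, 0.8)

# IR curve: residual saturation as a function of initial saturation.
ir_curve = [(s / 10.0, land_residual_saturation(s / 10.0, c)) for s in range(1, 9)]
```

By construction the curve is monotonically increasing and passes through the maximum-residual endpoint, which is the qualitative behavior a site-specific IR curve is fitted to reproduce.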
APA, Harvard, Vancouver, ISO, and other styles
24

Agarwal, Kuldeep. "Physics Based Hierarchical Decomposition of Processes for Design of Complex Engineered Systems." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1322152146.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Cingi, Guney. "The Influence Of Digital Technologies On The Interaction Of Design And Manufacturing Processes." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12606944/index.pdf.

Full text
Abstract:
This study aims to analyze and evaluate the influence of digital technologies on the interaction of design and manufacturing processes by presenting an outlook of digital technologies through developments in modeling capabilities, manufacturing techniques, material science, and design strategies. The digital era reached through technological developments in different fields of science has influenced the field of architecture, just like the others. Thus, a new kind of spatial and tectonic quality in architecture is emerging with recently introduced design tools and materials that are novel to the building industry, while redefining the role of the architect in this contemporary medium. The evolutionary process of Frank O. Gehry and his office, a pioneer in using digital design and manufacturing tools in architecture, is represented with realized examples that point out the previously discussed developments in the realm of architecture and visualize the tectonics of digitally designed and produced buildings, culminating with the case study of the Guggenheim Museum, Bilbao.
APA, Harvard, Vancouver, ISO, and other styles
26

Souza, Patrícia de Mello. "A modelagem tridimensional como implemento do processo de desenvolvimento do produto de moda /." Bauru : [s.l.], 2006. http://hdl.handle.net/11449/96266.

Full text
Abstract:
Advisor: Ivan De Domenico Valarelli
Committee: Luis Carlos Pachoarelli
Committee: Suzana Barreto Martins
Resumo: A presente pesquisa verifica a eficiência da modelagem tridimensional, moulage, como instrumento de otimização do processo de desenvolvimento do produto de moda/vestuário. Para tanto, enfoca as etapas de criação e materialização nas quais a referida técnica encontra-se inserida, onde constata a dicotomia entre as áreas de criação e modelagem. Aborda as qualidades técnicas, construtivas, ergonômicas e estéticas envolvidas no projeto da modelagem do produto, enfatizando os aspectos de conforto, caimento e inovação formal. De abordagem qualitativa, tem seus dados coletados por meio de observações sistemáticas, no âmbito acadêmico, numa variedade de situações-problemas, em momentos diversos, com variadas fontes de informação - cenários criados para reproduzir, considerando as devidas proporções e especificidades - situações industriais análogas. Estabelece as seguintes linhas guias de observação: criar e materializar; materializar a criação do outro; a criação constitui-se na própria materialização. Indicadores previstos na estruturação da pesquisa - adequação dimensional, vestibilidade, inovação formal, tempo, retrabalho, consumo de matéria-prima, soluções de montagem - conduzem aos resultados, numa comparação dos dados obtidos quando a técnica da modelagem tridimensional encontra-se ou não inserida no processo de desenvolvimento do produto de moda. É constatada a eficiência da técnica no processo.
Abstract: The purpose of this research is to verify the efficiency of three-dimensional modeling, draping, as a way to improve the development of fashion/clothing products. To that end, it focuses on the creation and materialization steps in which the draping technique is used, where a dichotomy between the creation and modeling areas is observed. It also addresses the technical, constructive, ergonomic and esthetic qualities involved in the product modeling project, emphasizing comfort, fit and innovation of shape. This qualitative study collected its data through systematic observation in an academic setting, covering a variety of problem situations at different moments and with distinct information sources - scenarios created to reproduce, with due proportion and specificity, analogous industrial situations. It establishes the following lines of observation: to create and materialize; to materialize another's creation; the creation consists in the materialization itself. The indicators defined in the research design - dimensional fit, wearability, innovation of shape, time, rework, raw material consumption and assembly solutions - lead to the findings by comparing data obtained with and without the three-dimensional modeling technique in the development of fashion/clothing products. The efficiency of the technique in the process is confirmed.
Master's
APA, Harvard, Vancouver, ISO, and other styles
27

Sun, Furong. "Some Advances in Local Approximate Gaussian Processes." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/97245.

Full text
Abstract:
Nowadays, the Gaussian Process (GP) has been recognized as an indispensable statistical tool in computer experiments. Due to its computational complexity and storage demands, its application to real-world problems, especially in "big data" settings, is quite limited. Among many strategies to tailor GP to such settings, Gramacy and Apley (2015) proposed the local approximate GP (laGP), which constructs approximate predictive equations from small local designs built around the predictive location under a chosen criterion. In this dissertation, several methodological extensions based upon laGP are proposed. One methodological contribution is multilevel global/local modeling, which deploys global hyper-parameter estimates to perform local prediction. The second contribution comes from extending the laGP notion of "locale" to a set of predictive locations, along paths in the input space. These two contributions have been applied to satellite drag emulation, which is illustrated in Chapter 3. Furthermore, the multilevel GP modeling strategy has also been applied to synthesize field data and computer model outputs of solar irradiance across the continental United States, combined with inverse-variance weighting, which is detailed in Chapter 4. Last but not least, in Chapter 5, laGP's performance has been tested on emulating daytime land surface temperatures estimated via satellites, in settings with irregular grid locations.
Doctor of Philosophy
In many real-life settings, we want to understand a physical relationship/phenomenon. Due to limited resources and/or ethical reasons, it is impossible to perform physical experiments to collect data, and therefore, we have to rely upon computer experiments, whose evaluation usually requires expensive simulation, involving complex mathematical equations. To reduce computational efforts, we are looking for a relatively cheap alternative, which is called an emulator, to serve as a surrogate model. Gaussian process (GP) is such an emulator, and has been very popular due to fabulous out-of-sample predictive performance and appropriate uncertainty quantification. However, due to computational complexity, full GP modeling is not suitable for “big data” settings. Gramacy and Apley (2015) proposed local approximate GP (laGP), the core idea of which is to use a subset of the data for inference and further prediction at unobserved inputs. This dissertation provides several extensions of laGP, which are applied to several real-life “big data” settings. The first application, detailed in Chapter 3, is to emulate satellite drag from large simulation experiments. A smart way is figured out to capture global input information in a comprehensive way by using a small subset of the data, and local prediction is performed subsequently. This method is called “multilevel GP modeling”, which is also deployed to synthesize field measurements and computational outputs of solar irradiance across the continental United States, illustrated in Chapter 4, and to emulate daytime land surface temperatures estimated by satellites, discussed in Chapter 5.
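The core laGP idea summarized above can be illustrated in a few lines: predict at a location using only a small local design of nearby training points, so the cubic cost of GP inference applies to n local points rather than the full data set. The Python sketch below uses a plain nearest-neighbor design and a fixed squared-exponential kernel, a simplification of the greedy design criteria (ALC/MSPE) used by Gramacy and Apley (2015).

```python
import numpy as np

def local_gp_predict(X, y, xstar, n=30, lengthscale=0.1, nugget=1e-6):
    """Predict at xstar using a GP fit only to the n nearest training points
    (a simple nearest-neighbor local design, illustrating the laGP idea)."""
    d = np.sum((X - xstar) ** 2, axis=1)
    idx = np.argsort(d)[:n]                      # local design: n closest inputs
    Xl, yl = X[idx], y[idx]

    def k(A, B):                                 # squared-exponential kernel
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2 * lengthscale**2))

    K = k(Xl, Xl) + nugget * np.eye(n)           # local covariance, jittered
    kstar = k(Xl, xstar[None, :])
    return float(kstar.T @ np.linalg.solve(K, yl))

# Toy demo: emulate f(x) = sin(2*pi*x1) from 2000 "simulation" runs.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(2000, 2))
y = np.sin(2 * np.pi * X[:, 0])
pred = local_gp_predict(X, y, np.array([0.25, 0.5]))  # true value is 1.0
```

Each prediction touches only an n x n covariance matrix, which is what makes the approach tractable for the large designs discussed in the dissertation.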
APA, Harvard, Vancouver, ISO, and other styles
28

Janakiraman, Vijayakumar. "DESIGN, FABRICATION AND CHARACTERIZATION OF BIFURCATING MICROFLUIDIC NETWORKS FOR TISSUE-ENGINEERED PRODUCTS WITH BUILT-IN MICROVASCULATURE." Case Western Reserve University School of Graduate Studies / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=case1196457966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Kim, Jieun. "Modeling cognitive and affective processes of designers in the early stages of design : mental categorization of information processing." Paris, ENSAM, 2011. http://www.theses.fr/2011ENAM0011.

Full text
Abstract:
Cette thèse a pour finalité l'analyse de l'activité cognitive des designers en vue de formaliser les processus informationnels inhérents aux activités de design industriel et plus particulièrement les activités de catégorisation mentale de l'information implicite qui précède la génération d'esquisses. Un tel modèle formalisé est nécessaire pour, à partir d'une extraction de la connaissance et des règles de design, élaborer de nouveaux outils numériques qui supporteront les processus informationnels amont en conception innovante. Nous avons mis en place deux expérimentations avec des designers experts et novices, afin de mieux décrire les processus cognitifs et affectifs des designers. Nous avons combiné la méthode cognitive (Protocole d'étude selon la méthode de verbalisation spontanée, questionnaire) et physiologique (Activité Electrodermale et eye tracking). La démarche a bénéficié comme terrain expérimental du projet national GENIUS. Ce projet visait à développer un système support à l'activité des designers dans la phase de catégorisation et de génération en conception amont. Notre apport s'appuie sur la formalisation d'un modèle des processus cognitif et affectif des designers. Ce modèle a permis d'extraire des spécifications pour le développement du système ''GENIUS''
The aim of this thesis is to explore how designers mentally categorize design information during the early sketching performed in the generative phase. In conjunction with cognitive aspects of design, we proposed that the cognitive and affective processes involved in this specific phase should be modeled by understanding the designer's mental process together with its relationship to early representations (sketches). A combination of an action-research approach and laboratory-based experiments was particularly appropriate for our study. Thus, first, a descriptive model of information processing involving memory theories drawn from cognitive psychology was developed. This model was refined and enriched via empirical studies with experts and novices in the product design domain. In order to formalize the cognitive and affective processes of designers, we combined cognitive (concurrent verbalization protocol and questionnaires) and physiological (galvanic skin conductance and eye tracking) methods. Subsequent analysis finally yielded a model depicting the cognitive and affective processes of designers in the generative phase. As an application, based on our model, a list of specifications for developing computational tools dedicated to the generative phase was applied and validated in the "GENIUS" project, which aimed to develop a system supporting designers' activities in the early stages of design.
APA, Harvard, Vancouver, ISO, and other styles
30

Demir, Ersin. "Executable business process modeling as a tool for increasing the understanding of business processes in an organization." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177381.

Full text
Abstract:
Understanding of business processes is becoming a key factor for successful businesses, and today many organizations face a lack of knowledge about the business processes they work on. Since interaction between different business processes and different actors is becoming more common, it is not enough for employees to know only the business processes they are directly involved in; they also need to know about the other business processes in the organization. Unfortunately, there is not enough research on this topic in the literature, and the goal of this thesis is to propose a method for solving the indicated problem by increasing employees' understanding of business processes in an organization. The proposed method essentially consists of the design and execution of process models based on a real scenario. A case study was conducted at an IT company whose employees have no or limited knowledge about the business processes their organization works on. Even though the method has been tested in only one organization, it is generic and can be applied to any similar organization. The design science approach is used to develop the method and build the process models as artifacts. The process models were verified using an executable business process modeling tool, iPB, and presented to the employees in a seminar in order to help them understand the business processes better. The employees' knowledge of the business processes was analyzed before and after the presentation, so the results could be compared to determine how much their knowledge had increased. The results showed that the employees' knowledge increased significantly. In conclusion, the method of designing and presenting executable business process models proved to be a solution to the problem of insufficient understanding of business processes in an organization.
APA, Harvard, Vancouver, ISO, and other styles
31

Mororó, Bruno Oliveira. "Modelagem sistêmica do processo de melhoria contínua de processos industriais utilizando o método seis sigma e redes de Petri." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/3/3152/tde-29012009-103220/.

Full text
Abstract:
A globalização reforça a necessidade das empresas aprimorarem seus processos e produtos continuamente para se manterem competitivas e atenderem às expectativas de um mercado dinâmico e de acionistas cada vez mais exigentes, que buscam maximizar seus lucros. A melhoria contínua acima mencionada não se refere apenas à qualidade percebida pelo consumidor final, mas também à qualidade e confiabilidade dos processos de produção. Desta forma, quando as empresas têm melhores processos, melhores são os produtos originados e também os seus custos. Porém, a questão é como esses processos são modelados na fase de projeto e como ferramentas de qualidade, sobretudo o Seis Sigma metodologia mais em voga na atualidade podem utilizar tais modelos para obter melhores resultados. Essa dissertação propõe a utilização de ferramentas de modelagem e simulação tais como as Redes de Petri para modelagem de processos produtivos fornecem um modelo formal para a representação de sistemas de produção, capturando aspectos inerentes a tais sistemas como concorrência, paralelismo e sincronização suportando a aplicação da metodologia Seis Sigma, a qual geralmente atua somente no nível de melhoria do processo produtivo e não do projeto que o originou. Dessa forma, essa dissertação trás uma proposta de integração entre os times que projetam o processo e os que executam a produção, demonstrando tal viabilidade por meio da análise dos projetos/modelos durante a aplicação da metodologia Seis Sigma. É realizado um estudo de caso na estamparia de uma indústria automotiva que ilustra a aplicação da metodologia proposta.
Globalization reinforces the need for companies to improve their processes and products continuously in order to remain competitive and to meet the expectations of a dynamic market and of increasingly demanding shareholders eager to maximize their profits. The continuous improvement mentioned above relates not only to the quality perceived by the final consumer, but also to the quality and reliability of the production processes. Therefore, the better a company's processes, the better its final products and the lower its costs. The question, however, is whether a continuous improvement effort using Six Sigma could benefit from the design documentation of the target process, returning improved documentation once the cycle is completed. This work considers the use of Petri Nets for production process modeling - even though other design and modeling representations would yield similar results - to support the application of the Six Sigma methodology. The main result is the proposition of a continuous improvement life cycle that keeps the design documentation consistent and up to date. Thus, this work pursues the integration between the teams that design processes and those who implement them in the manufacturing plant. To show the potential of model analysis during Six Sigma projects, a case study of the press shop area in an automotive industry is analyzed.
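As a minimal illustration of the kind of model the thesis employs, a place/transition Petri net for a two-station press-shop line can be sketched in a few lines; the place and transition names here are invented for the example, not taken from the case study. Tokens in places represent blanks, free resources, and parts between stations; firing a transition consumes input tokens and produces output tokens, capturing concurrency and synchronization.

```python
class PetriNet:
    """Minimal place/transition Petri net with integer markings."""

    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n              # consume input tokens
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n  # produce outputs

# Hypothetical two-station line: blanks are stamped, then inspected.
net = PetriNet({"blank": 2, "press_free": 1, "stamped": 0, "inspected": 0})
net.add_transition("stamp", {"blank": 1, "press_free": 1},
                   {"stamped": 1, "press_free": 1})
net.add_transition("inspect", {"stamped": 1}, {"inspected": 1})
net.fire("stamp"); net.fire("inspect"); net.fire("stamp")
```

In a Six Sigma project, a model of this kind gives the improvement team and the process design team a shared, analyzable artifact rather than informal flowcharts.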
APA, Harvard, Vancouver, ISO, and other styles
32

Sohn, SugJe. "Modeling and Analysis of Production and Capacity Planning Considering Profits, Throughputs, Cycle Times, and Investment." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5083.

Full text
Abstract:
This research focuses on large-scale manufacturing systems having a number of stations with multiple tools and product types with different and deterministic processing steps. The objective is to determine the production quantities of multiple products and the tool requirements of each station that maximizes net profit while satisfying strategic constraints such as cycle times, required throughputs, and investment. The formulation of the problem, named OptiProfit, is a mixed-integer nonlinear programming (MINLP) with the stochastic issues addressed by mean-value analysis (MVA) and queuing network models. Observing that OptiProfit is an NP-complete, nonconvex, and nonmonotonic problem, the research develops a heuristic method, Differential Coefficient Based Search (DCBS). It also performs an upper-bound analysis and a performance comparison with six variations of Greedy Ascent Procedure (GAP) heuristics and Modified Simulated Annealing (MSA) in a number of randomized cases. An example problem based on a semiconductor manufacturing minifab is modeled as an OptiProfit problem and numerically analyzed. The proposed methodology provides a very good quality solution for the high-level design and operation of manufacturing facilities.
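A hedged sketch of the queueing side of such a formulation: if each tool group is approximated as an M/M/c station (a deliberate simplification of the MVA/queueing-network treatment described above), the Erlang-C formula gives the station cycle time for a candidate tool count, and the smallest count meeting a cycle-time constraint can be found by search.

```python
import math

def erlang_c(c, a):
    """Probability of waiting in an M/M/c queue with offered load a = lambda/mu
    (requires a < c for stability)."""
    s = sum(a**k / math.factorial(k) for k in range(c))
    last = a**c / (math.factorial(c) * (1 - a / c))
    return last / (s + last)

def station_cycle_time(lam, mu, c):
    """Mean time in system (queueing + processing) for one tool group,
    modeled as an M/M/c station."""
    a = lam / mu
    if a >= c:
        return float("inf")                   # unstable: not enough tools
    wq = erlang_c(c, a) / (c * mu - lam)      # mean wait in queue
    return wq + 1.0 / mu

def min_tools_for_cycle_time(lam, mu, target, cmax=50):
    """Smallest tool count meeting a cycle-time constraint, by linear search."""
    for c in range(1, cmax + 1):
        if station_cycle_time(lam, mu, c) <= target:
            return c
    return None
```

Embedding such cycle-time expressions in the objective and constraints is what makes the resulting capacity-planning problem nonlinear (and nonconvex), motivating heuristics like the DCBS method mentioned above.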
APA, Harvard, Vancouver, ISO, and other styles
33

Möller, Johannes [Verfasser]. "Modeling and experimental analysis of antibody-producing cell culture processes: from metabolism over population to design and scale-up / Johannes Möller." München : Verlag Dr. Hut, 2020. http://d-nb.info/1222351978/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Tavares, Felipe Souza. "AO-BPM 2.0 : modelação de processos orientada a aspetos." Master's thesis, Instituto Superior de Economia e Gestão, 2018. http://hdl.handle.net/10400.5/16944.

Full text
Abstract:
Master's in Information Systems Management
Interesses transversais são aqueles que se interpõem entre diversos itens, mas ao mesmo tempo não fazem parte deles. Métodos tradicionais de modelação de processos não possuem um tratamento especial para cuidar de interesses transversais, fazendo com que eles permaneçam espalhados por todo o modelo do processo, dificultando seu entendimento e manutenção. Nesse sentido, a orientação a aspetos é um paradigma que oferece mecanismos para modularizar interesses transversais. Nessa dissertação são propostas melhorias a uma notação para modelação de processos que utiliza a orientação a aspetos, com o objetivo de gerar modelos de mais fácil entendimento e facilitar a sua posterior manutenção. A metodologia utilizada foi o design science, que foi mais detalhado na sua respetiva secção. Por fim, foi realizado um estudo de caso para avaliar se a notação proposta é capaz de produzir um modelo de processo válido. No capítulo de conclusões, pode ser visto que o resultado foi satisfatório e que mais trabalhos posteriores poderão ser realizados no futuro.
Crosscutting concerns are those that cut across several processes while not being part of any of them. Traditional process modeling methods have no special treatment for crosscutting concerns, so these remain scattered throughout the process model, making it difficult to understand and maintain. In this sense, aspect orientation is a paradigm that offers mechanisms to modularize crosscutting concerns. This dissertation proposes improvements to an aspect-oriented process modeling notation, in order to generate models that are easier to understand and to facilitate their later maintenance. The methodology used was design science, which is detailed in its respective section. Finally, a case study was carried out to evaluate whether the proposed notation can produce a valid process model. As described in the conclusions chapter, the result was satisfactory, and further work can be carried out in the future.
APA, Harvard, Vancouver, ISO, and other styles
35

Tomba, Emanuele. "Latent variable modeling approaches to assist the implementation of quality-by-design paradigms in pharmaceutical development and manufacturing." Doctoral thesis, Università degli studi di Padova, 2013. http://hdl.handle.net/11577/3423108.

Full text
Abstract:
With the introduction of the Quality-by-Design (QbD) initiative, the American Food and Drug Administration and the other pharmaceutical regulatory Agencies aimed to change the traditional approaches to pharmaceutical development and manufacturing. Pharmaceutical companies have been encouraged to use systematic and science-based tools for the design and control of their processes, in order to demonstrate a full understanding of the driving forces acting on them. From an engineering perspective, this initiative can be seen as the need to apply modeling tools in pharmaceutical development and manufacturing activities. The aim of this Dissertation is to show how statistical modeling, and in particular latent variable models (LVMs), can be used to assist the practical implementation of QbD paradigms to streamline and accelerate product and process design activities in pharmaceutical industries, and to provide a better understanding and control of pharmaceutical manufacturing processes. Three main research areas are explored, wherein LVMs can be applied to support the practical implementation of the QbD paradigms: process understanding, product and process design, and process monitoring and control. General methodologies are proposed to guide the use of LVMs in different applications, and their effectiveness is demonstrated by applying them to industrial, laboratory and simulated case studies. With respect to process understanding, a general methodology for the use of LVMs is proposed to aid the development of continuous manufacturing systems. The methodology is tested on an industrial process for the continuous manufacturing of tablets. It is shown how LVMs can jointly model data referring to different raw materials and different units in the production line, making it possible to identify the most important driving forces in each unit and the most critical units in the line.
Results demonstrate how raw materials and process parameters impact the intermediate and final product quality, making it possible to identify paths along which the process moves depending on its settings. This provides a tool to assist quality risk assessment activities and to develop the control strategy for the process. In the area of product and process design, a general framework is proposed for the use of LVM inversion to support the development of new products and processes. The objective of model inversion is to estimate the best set of inputs (e.g., raw material properties, process parameters) that ensure a desired set of outputs (e.g., product quality attributes). Since the inversion of an LVM may have infinitely many solutions, generating the so-called null space, an optimization framework in which the most suitable objectives and constraints can be assigned is used to select the optimal solution. The effectiveness of the framework is demonstrated in an industrial particle engineering problem to design the raw material properties that are needed to produce granules with desired characteristics from a high-shear wet granulation process. Results show how the framework can be used to design experiments for new product design. The analogy between the null space and the Agencies' definition of design space is also demonstrated, and a strategy to estimate the uncertainties in the design and in the null space determination is provided. The proposed framework for LVM inversion is also applied to assist the design of the formulation for a new product, namely the selection of the best excipient type and amount to mix with a given active pharmaceutical ingredient (API) to obtain a blend of desired properties. The optimization framework is extended to include constraints on the material selection, the API dose or the final tablet weight. A user-friendly interface is developed to aid formulators in providing the constraints and objectives of the problem.
Experiments performed industrially on the formulation designed in silico confirm that model predictions are in good agreement with the experimental values. LVM inversion is also shown to be useful for addressing product transfer problems, namely the problem of transferring the manufacturing of a product from a source plant, wherein most of the experimentation has been carried out, to a target plant which may differ in size, layout or the units involved. An experimental process for pharmaceutical nanoparticle production is used as a test bed. An LVM built on data from different plants is inverted to estimate the most suitable process conditions in a target plant to produce nanoparticles of desired mean size. Experiments designed on the basis of the proposed LVM inversion procedure demonstrate that the desired nanoparticle sizes are obtained, within experimental uncertainty. Furthermore, the null space concept is validated experimentally. Finally, with respect to the process monitoring and control area, the problem of transferring monitoring models between different plants is studied. The objective is to monitor a process in a target plant where production is being started (e.g., a production plant) by exploiting the data available from a source plant (e.g., a pilot plant). A general framework is proposed to use LVMs to solve this problem. Several scenarios are identified on the basis of the available information, the source of the data and the type of variables to include in the model. Data from the different plants are related through subsets of variables (common variables) measured in both plants, or through plant-independent variables obtained from conservation balances (e.g., dimensionless numbers). The framework is applied to define the process monitoring model for an industrial large-scale spray-drying process, using data available from a pilot-scale process.
The effectiveness of the transfer is evaluated in terms of monitoring performance in the detection of a real fault occurring in the target process. The proposed methodologies are then extended to batch systems, considering a simulated penicillin fermentation process. In both cases, results demonstrate that transferring knowledge from the source plant enables better monitoring performance than considering only the data available from the target plant.
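The monitoring-transfer idea above can be illustrated with a toy sketch (not the thesis' actual models: the data, the number of components and the fault magnitude are all invented): a PCA monitor is built on source-plant data over the common variables, and its Hotelling T² statistic is then evaluated on target-plant observations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented "source plant" history over the variables common to both plants
X_source = rng.normal(size=(200, 4))

# PCA monitor: center/scale with source-plant statistics, keep A components
mu, sd = X_source.mean(axis=0), X_source.std(axis=0)
Z = (X_source - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
A = 2
P = Vt[:A].T                        # loadings (orthonormal columns)
lam = s[:A] ** 2 / (len(Z) - 1)     # variance captured by each component

def hotelling_t2(x):
    """Hotelling T^2 of one target-plant observation in the source-plant model."""
    t = ((x - mu) / sd) @ P         # project onto the latent space
    return float(np.sum(t ** 2 / lam))

# An in-control point at the source-plant mean scores 0; a 10-sigma
# excursion along the first component scores exactly 10^2 = 100.
x_fault = mu + sd * (10.0 * np.sqrt(lam[0]) * P[:, 0])
```

In practice the control limit would come from an F-distribution fitted on the source data; the point here is only that the target plant can be monitored with a model estimated entirely from source-plant data over the common variables.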
The recent introduction of the Quality-by-Design (QbD) concept by the Food and Drug Administration and the other pharmaceutical regulatory agencies aims to improve and modernize the approaches traditionally followed by pharmaceutical companies in the development of new products and of the related manufacturing processes. The purpose of the initiative is to encourage companies to adopt systematic, science-based procedures both during product and process development and during the operation of the manufacturing process itself. To this end, the Agencies have defined paradigms and guidelines to ease the industrial implementation of these procedures, promoting a deeper understanding of the phenomena underlying manufacturing processes, so as to ensure tight control of final product quality in terms of physical properties and, above all, of efficacy and safety for patients. From an engineering standpoint, Quality-by-Design can be seen as an attempt to introduce modeling principles into pharmaceutical development and manufacturing. This offers enormous opportunities to the pharmaceutical industry, which can benefit from mature methodologies and tools already proven in other industrial sectors more inclined to technological innovation. At the same time, the pharmaceutical industry has unique features, such as product complexity, typically discontinuous, diversified, low-volume production and, above all, strict regulatory oversight, which require dedicated tools to tackle the specific problems that may arise in such an environment. For these reasons, methodologies are needed that suit the peculiarities of the pharmaceutical industry while remaining general enough to be applicable to a wide range of situations. 
The objective of this Dissertation is to show how statistical modeling, and latent variable models (LVMs) in particular, can be used to drive the practical implementation of the core Quality-by-Design principles in pharmaceutical product and process development and in manufacturing. General methodologies are proposed for the use of latent variable models in the three main areas on which the Quality-by-Design initiative rests: the improvement of process understanding, the design of new products and manufacturing processes, and process monitoring and control. For each of these areas, the effectiveness of latent variable modeling is demonstrated by applying the models to several industrial, laboratory-scale or simulated case studies. Regarding the improvement of process understanding, Chapter 3 proposes a general strategy for applying LVMs to the development of continuous manufacturing systems. The analysis is applied to support the development of a pilot-scale industrial continuous tableting process. The procedure rests on three main steps: i) a data management step; ii) an exploratory analysis step; iii) a global analysis step. It is shown how the parameters of models built from process data can be interpreted on physical grounds, making it possible to identify the main driving forces acting on the system and to rank them by importance. This can support the risk assessment needed to define a control strategy for the process and can guide experimentation from the earliest development stages. 
In particular, in the case study considered, the proposed methodology identifies the process used to mill the active ingredient particles, and the section in which the active ingredient is formulated, as the main incoming sources of variability affecting the physical properties of the final product. In the global analysis, multi-block latent variable models are shown to single out the most critical process units and, within each of them, the variables most critical to product quality. These models also prove particularly useful in identifying the trajectories along which the process moves, depending on the raw material properties and the process parameters used, thus providing a tool to ensure that the operation follows the designated trajectory. In the area of new product and process design, the effectiveness of latent variable models is demonstrated in Chapter 4, where a general procedure based on LVM inversion is proposed to support the development of new products and the determination of the operating conditions of the related manufacturing processes. The aim of the proposed procedure is to provide a sound mathematical formalization, in terms of LVM inversion, of the design problem, according to the objectives and constraints the problem may present. Since LVM inversion may have multiple solutions, four possible optimization problems through which the inversion can be carried out are identified. The objective of model inversion is to estimate the optimal system inputs (for example, raw material characteristics or process parameters) that ensure the desired quality of the outgoing product. 
The procedure is successfully applied in an industrial case study to determine the properties of the raw materials entering a wet granulation process, with the objective of obtaining granules with given quality characteristics. The concept of null space is also examined, i.e., the space of all solutions of an LVM inversion problem that correspond to the same set of desired output variables (product properties). In particular, it is shown that the definition of the null space shares several features with the definition of the design space of a process established by the regulatory Agencies' guidelines, and that the null space can be used for a preliminary identification of the design space. To provide a measure of the reliability of the solutions of the inversion problem, a strategy to estimate their uncertainties is proposed. Solutions to specific issues related to LVM inversion are also presented. In particular, a new statistic is proposed for selecting the number of latent variables to include in a model used for inversion, so that the regressor set is adequately described in addition to the output variable set. Moreover, since model uncertainty does not guarantee that inversion will return a solution achieving the desired product properties, a strategy is proposed that exploits the covariance structure of the historical data to select new product quality profiles that ease the model inversion. The proposed approaches exploit the model parameters and the constraints imposed on product quality to estimate new sets of properties for which the model prediction error is minimal. 
This eases the model inversion in delivering the desired product properties, since these can be assigned as hard constraints to the optimization problem. In Chapter 5, the LVM inversion procedure of Chapter 4 is applied to design the formulation of new pharmaceutical products, where the objective is to estimate the best excipients to blend with a given active ingredient, and their amounts, so as to obtain a blend with properties adequate for the compression stage. The procedure of Chapter 4 is extended to include material selection constraints and to account for the specific objectives a formulation problem may present (for example, maximizing the dose of active ingredient, or minimizing the weight of the final tablet). The model inversion is solved as a mixed-integer nonlinear programming problem, for which a user interface is developed that lets formulators specify the objectives and constraints of the formulation problem at hand. The proposed methodology is tested in an industrial case study to design new formulations for a given active ingredient. The formulations designed in-silico are prepared and verified experimentally, giving results in line with the model predictions. Chapter 6 presents a different application of the general LVM inversion procedure of Chapter 4. The case study concerns a product transfer problem, in which the objective is to obtain nanoparticles of a given mean diameter through an anti-solvent precipitation process in a target device. The methodology exploits the historical data available from experiments carried out on a reference device of different size from the target one, and on the target device itself but with a different experimental configuration. 
A joint-Y PLS (JY-PLS) model is first used to correlate data of different origin (by device and experimental configuration). The procedure of Chapter 4 is then employed to invert the JY-PLS model and determine the operating conditions in the target device that ensure nanoparticles of the desired mean diameter are obtained. Experimental validation confirms the results of the model inversion. The experiments also validate the null space concept experimentally, showing that different process conditions estimated along the null space do indeed yield nanoparticles with the same mean size. The final section of this Dissertation proposes the application of LVMs to support process monitoring and control in pharmaceutical operations. In particular, Chapter 7 addresses the problem of transferring process monitoring models between different plants. Here the problem is to ensure that the operation in a target plant is under statistical control from the very first moments of plant operation, exploiting the knowledge available (in terms of data) from other plants. A general LVM-based procedure is proposed to deal with this kind of problem. The procedure identifies five different scenarios, depending on the type of information available (process data only, or both process data and background process knowledge), on the provenance of the available data (from the source plant only, or from both the source and the target plant) and on the type of process variables considered for model building (only variables common to both plants, or both common variables and other variables). 
To model jointly the data available from different plants, principal component analysis (PCA) or JY-PLS models are used, depending on whether the monitoring model considers only the variables common to the plants (PCA), or both common and other variables (JY-PLS). The proposed methodologies are tested in the transfer of a monitoring model for an industrial spray-drying process, where the source is a pilot-scale plant and the target is an industrial-scale production unit. The monitoring performance on the industrial-scale process is satisfactory for all the proposed scenarios. In particular, it is shown that transferring information from the source plant improves the performance of the model monitoring the target plant. The proposed procedures are also applied in a preliminary study on the transfer of monitoring systems for batch processes, considering as a case study a simulated penicillin fermentation process in which two plants differing in scale and configuration are simulated. The monitoring performance indicates that, in this case too, including in the model the data available from operation of the source plant makes the system more effective in detecting the faults simulated in the target plant than using only the (few) data available from the target plant itself.
APA, Harvard, Vancouver, ISO, and other styles
36

Amziani, Mourad. "Modeling, evaluation and provisioning of elastic service-based business processes in the cloud." Thesis, Evry, Institut national des télécommunications, 2015. http://www.theses.fr/2015TELE0016/document.

Full text
Abstract:
Cloud computing is increasingly used for deploying and executing business processes, and particularly Service-based Business Processes (SBPs). Among other properties, Cloud environments provide elasticity at different scopes. The principle of elasticity is to ensure the provisioning of necessary and sufficient resources so that a Cloud service continues running smoothly even when the number or quantity of its utilizations scales up or down, thereby avoiding both under-utilization and over-utilization of resources. Obviously, provisioning elastic infrastructures and/or platforms is not sufficient to make the deployed business processes elastic. In fact, elasticity must also be considered at the application scope. This allows deployed applications to adapt during their execution according to demand variation. Therefore, business processes should be provided with elasticity mechanisms allowing their adaptation to workload changes while ensuring the desired functional and non-functional properties. In our work, we were interested in providing a holistic approach for the modeling, evaluation and provisioning of elastic SBPs in the Cloud. We started by proposing a formal model for SBP elasticity. To do this, we modeled SBPs using Petri nets and defined two elasticity operations (duplication and consolidation). In addition, we proposed to intertwine these elasticity operations with an elasticity controller that monitors SBP execution, analyzes monitoring information and executes the appropriate elasticity operation (duplication or consolidation) in order to enforce the elasticity of SBPs. After facing the challenge of defining a model and mechanisms for SBP elasticity, we turned to the evaluation of elasticity before implementing it in real environments. To this end, we proposed to use our elasticity controller as a framework for the validation and evaluation of elasticity using verification and simulation techniques. 
Finally, we were interested in provisioning elasticity mechanisms for SBPs in real Cloud environments. To this end, we proposed two approaches. The first approach packages non-elastic SBPs in micro-containers, extended with our elasticity mechanisms, before deploying them on Cloud infrastructures. The second approach integrates our elasticity controller into an autonomic infrastructure to dynamically add elasticity facilities to SBPs deployed on Cloud platforms.
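The duplication/consolidation loop can be caricatured in a few lines (the thresholds, workload trace and names are invented; the thesis formalizes this with Petri nets, which this sketch deliberately omits):

```python
# Minimal elasticity-controller sketch: duplicate a service instance when the
# per-instance load exceeds an upper threshold, consolidate when it falls
# below a lower one. All numbers are illustrative.

HIGH, LOW = 8.0, 2.0   # requests per instance

def elasticity_controller(instances, pending_requests):
    """Return the updated instance count after one control decision."""
    load = pending_requests / instances
    if load > HIGH:                       # overload -> duplicate
        return instances + 1
    if load < LOW and instances > 1:      # under-utilization -> consolidate
        return instances - 1
    return instances                      # load acceptable -> no operation

# Drive the controller with a varying workload
workload = [5, 20, 40, 40, 10, 3, 1]
instances = 1
history = []
for pending in workload:
    instances = elasticity_controller(instances, pending)
    history.append(instances)

# history -> [1, 2, 3, 4, 4, 3, 2]: capacity follows the demand up and down
```

A real controller would of course act on the monitored Petri-net marking rather than a scalar queue length, but the control decision has this shape.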
APA, Harvard, Vancouver, ISO, and other styles
37

Loures, Carla Cristina Almeida [UNESP]. "Otimização do processo de cultivo da microalga Chlorella minutissima como fonte de matéria-prima para a produção de biodiesel." Universidade Estadual Paulista (UNESP), 2016. http://hdl.handle.net/11449/141995.

Full text
Abstract:
Made available in DSpace on 2016-08-03 (GMT). Previous issue date: 2016-07-05
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
The present work was developed with the microalga Chlorella minutissima with the aim of determining the best operating conditions of discontinuous tubular photobioreactors of the bubble-column type for cell growth and lipid production, as a function of the process variables: CO2 feed flow rate, initial nitrate concentration, initial phosphate concentration, supplementation (metals and vitamins), temperature and salinity. The experiments were designed using the Taguchi methodology. The results showed that the configuration of the discontinuous tubular bubble-column photobioreactors was adequate for cultivating the microalga Chlorella minutissima, giving good lipid contents, on the order of 37.08 ± 1.50% of the dry biomass at the end of 7 days of cultivation. Once the optimal cultivation conditions were established, a reaction with the obtained oil was carried out using a chemical catalyst (H2SO4). The results showed that the catalyst acted efficiently, converting the fatty acids into their respective ethyl esters. Another important point worth noting was the absence of pigments in the lipid material, since the extracted oil presented a yellowish color similar to traditional vegetable oils, such as soybean oil. Pigments present in the lipid material may compromise the conversion of triglycerides into alkyl esters (biodiesel). Chlorella minutissima therefore offers additional advantages over the many pigment-containing lipid raw materials. The logistic growth model was used as the kinetic model to evaluate the population growth rate of the microalgae. From this analysis, it was verified that cell growth and lipid productivity depend significantly on the initial nitrate concentration, the carbon dioxide flow rate, the phosphate concentration, the supplementation of the medium and the temperature. 
The best values, for growth as well as for lipid productivity, were obtained with the nitrate concentration at its high level and the other significant parameters at their low levels. Thus, the methodology and results presented in this work can be useful in the pursuit of economically feasible production of microalgal biodiesel, since producing biodiesel from microalgae is currently not viable due to the high costs involved.
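For reference, the logistic kinetic model mentioned above has the closed form N(t) = K / (1 + ((K − N0)/N0)·e^(−μt)); a small sketch with purely illustrative parameter values (not the fitted values from this work):

```python
import math

def logistic_growth(t, n0, k, mu):
    """Biomass concentration at time t under the logistic growth model.

    n0: initial biomass, k: carrying capacity, mu: specific growth rate.
    """
    return k / (1.0 + ((k - n0) / n0) * math.exp(-mu * t))

# Illustrative values only (g/L and 1/day are assumed units)
n0, k, mu = 0.05, 2.0, 1.5
curve = [logistic_growth(t, n0, k, mu) for t in range(8)]  # a 7-day cultivation

# The curve starts at n0, increases monotonically and saturates near k
assert abs(curve[0] - n0) < 1e-12
assert all(b > a for a, b in zip(curve, curve[1:]))
assert 0.99 * k < curve[-1] < k
```

Fitting μ and K to measured cell-density data is then an ordinary nonlinear least-squares problem.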
APA, Harvard, Vancouver, ISO, and other styles
38

Charpentier, Frédéric. "Maîtrise du processus de modélisation géométrique et physique en conception mécanique." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2014. http://tel.archives-ouvertes.fr/tel-01016665.

Full text
Abstract:
Product design aims to define a product technically while satisfying the needs of all the customers across the product's life cycle. Industrial stakes drive the development of models and design-support tools that meet customer needs while optimizing the cost-quality-lead-time triptych. The objective of this thesis is to propose a global vision encompassing the different types of modeling. Reaching this objective requires a global analysis of these notions, so as to obtain a common representation of the system for the various design and simulation activities. The interest of this approach is its ability to highlight the dependencies and relations between these activities. The approach must make it possible to handle the different levels of detail (systemic view) during the functional and structural decomposition of the product. It must also make it possible to follow the elaboration of the physical models used for simulation. We propose a traceability of the design process and of the modeling process that allows design choices and modeling assumptions to be revisited when needed. This work is founded on GeoSpelling concepts such as the "skin model", operations and properties, enriched with further concepts such as finite and infinite models and primitive and simulation models.
APA, Harvard, Vancouver, ISO, and other styles
39

Commissariat, Hormazd P. "Performance Modeling of Single Processor and Multi-Processor Computer Architectures." Thesis, Virginia Tech, 1995. http://hdl.handle.net/10919/31377.

Full text
Abstract:
Determining the optimum computer architecture configuration for a specific application or a generic algorithm is a difficult task. The complexity of today's computer architectures and systems makes it more difficult and expensive to easily and economically implement and test fully functional prototypes of computer architectures. High-level VHDL performance modeling of architectures is an efficient way to rapidly prototype and evaluate computer architectures. Once the architecture configuration is fixed, one would like to know the tolerance and expected performance of individual/critical components and also the best way to map the software tasks onto the processor(s). Trade-offs and engineering compromises can be analyzed, and the effects of certain component failures and communication bottlenecks can be studied. A part of the research work done for the RASSP (Rapid Prototyping of Application Specific Signal Processors) project, funded by Department of Defense contracts, is documented in this thesis. The architectures modeled include a single-processor, single-global-bus system; a four-processor, single-global-bus system; a four-processor, multiple-local-bus, single-global-bus system; and finally, a four-processor, multiple-local-bus system interconnected by a crossbar interconnection switch. The hardware models used are mostly legacy models inherited from an earlier project; they were upgraded, modified and customized to suit the current research needs and requirements. The software tasks run on the processors are pieces of the signal and image processing algorithm used for Synthetic Aperture Radar (SAR). Communication between components/devices is achieved in the form of tokens, which are record structures. The output is a trace file which tracks the passage of the tokens through the various components of the architecture. 
The output trace file is post-processed to obtain activity plots and latency plots for individual components of the architecture.
Master of Science
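The token-based modeling style described here can be caricatured in software (a toy Python event loop with invented latencies, not the legacy VHDL models): each token is a record structure that accumulates a trace as it traverses the architecture, and the trace is then post-processed for latency.

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    """A record-structure token carrying its own trace through the architecture."""
    task_id: int
    time: float = 0.0
    trace: list = field(default_factory=list)

# Invented per-component service latencies (time units are arbitrary)
PIPELINE = [("local_bus", 1.0), ("processor", 5.0), ("global_bus", 2.0)]

def run(token, pipeline):
    """Pass a token through each component, logging (component, entry, exit)."""
    for name, latency in pipeline:
        entry = token.time
        token.time += latency
        token.trace.append((name, entry, token.time))
    return token

tok = run(Token(task_id=0), PIPELINE)
latency = tok.time   # end-to-end latency: 1.0 + 5.0 + 2.0 = 8.0
```

Post-processing the `trace` list for every token is the analogue of turning the trace file into the activity and latency plots described above.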
APA, Harvard, Vancouver, ISO, and other styles
40

Moore, Simon William. "Multithreaded processor design." Thesis, University of Cambridge, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338268.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Sirin, Göknur. "Supporting multidisciplinary vehicle modeling : towards an ontology-based knowledge sharing in collaborative model based systems engineering environment." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2015. http://www.theses.fr/2015ECAP0024/document.

Full text
Abstract:
Simulation models are widely used by industry as an aid for decision making to explore and optimize a broad range of complex industrial systems' architectures. The increased complexity of industrial systems (cars, airplanes, etc.) and ecological and economic concerns imply a need to explore and analyse innovative system architectures efficiently and effectively by using simulation models. However, simulation designers currently face limitations that make simulation models difficult to design and develop in a collaborative, multidisciplinary design environment. The multidisciplinary nature of simulation models requires a specific understanding of each phenomenon to simulate and a thorough description of the system architecture, its components and the connections between components. To accomplish these objectives, Model-Based Systems Engineering (MBSE) and Information Systems (IS) methodologies were used to support the simulation designer's analysis capabilities in terms of methods, processes and design tool solutions. The objective of this thesis is twofold. The first concerns the development of a methodology and tools to build accurate simulation models. The second focuses on the introduction of an innovative approach to design, produce and integrate simulation models in a "plug and play" manner while ensuring the expected model fidelity. Today, one of the major challenges in full-vehicle simulation model creation is obtaining domain-level simulation models from different domain experts while detecting any potential inconsistency before the IVVQ (Integration, Verification, Validation, and Qualification) phase. In the current simulation model development process, most defects, such as interface mismatches and interoperability problems, are discovered late, during the IVVQ phase.
This can create multiple forms of waste, including rework and, perhaps most harmful of all, incorrect simulation models that are subsequently used as the basis for design decisions. To address this problem, this work aims to reduce late inconsistency detection by ensuring early-stage collaboration between the different suppliers and the OEM. Thus, this work first integrates a Detailed Model Design Phase into the current model development process and, second, re-organizes and delegates roles among the design actors. Finally, an alternative architecture design tool is supported by an ontology-based DSL (Domain Specific Language) called the Model Identity Card (MIC). The design tools and the aforementioned activity perspectives (e.g. decisions, views and viewpoints) are structured by drawing on Enterprise Architecture Frameworks. To demonstrate the applicability of the proposed solution, engine after-treatment, hybrid parallel propulsion and electric transmission models are tested across the automotive and aeronautic industries.
APA, Harvard, Vancouver, ISO, and other styles
42

Barbosa, Neto Wilson 1983. "Do projeto à fabricação : um estudo de aplicação da fabricação digital no processo de produção arquitetônica." [s.n.], 2013. http://repositorio.unicamp.br/jspui/handle/REPOSIP/258032.

Full text
Abstract:
Advisor: Maria Gabriela Caffarena Celani
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Civil, Arquitetura e Urbanismo
Made available in DSpace on 2018-08-24T11:31:11Z (GMT). No. of bitstreams: 1 BarbosaNeto_Wilson_M.pdf: 18612146 bytes, checksum: d8fbaaa3aaa7ff8e9008c22519dc03e7 (MD5) Previous issue date: 2013
Abstract: The use of Digital Fabrication techniques is increasingly present in the field of architecture and construction throughout the world, owing to the technological advances that CAD (Computer-aided Design) and CAM (Computer-aided Manufacturing) systems have brought to the design and production processes of architectural work. The possibilities these tools offer architects and designers open the way to new design approaches, which allow the use of automated production without the need for standardization as rigid as that imposed by the industrial system. However, the adoption of Digital Fabrication methods in the production of the built environment in Brazil is slow compared with other countries where the necessary technology is already widespread. This research investigates the application of Digital Fabrication, through file-to-factory processes, in the production of architectural elements for the construction industry, specifically using 2D subtractive cutting techniques. After a survey of the state of the art and the development of two case studies, the method used was action research, through an exercise applying the file-to-factory concept. The design process was documented in detail and analysed in order to systematize the procedures, which will serve as a reference for future applications in architecture. The aim is thereby to contribute to the dissemination of these new technologies in architectural production in the Brazilian scenario
Abstract: The use of Digital Fabrication techniques is increasingly present in the field of architecture and construction throughout the world. Systems such as CAD (Computer-aided Design) and CAM (Computer-aided Manufacturing) have provided technological advances to the architectural design and production process. The possibilities that these tools provide to architects and designers introduce new design approaches, which allow the use of automated production without the rigid standardization imposed by the industrial system. However, it can be noticed that the use of Digital Fabrication methods in the built environment production process in Brazil is slow when compared to other countries where the technology is widely incorporated. This research focuses on investigating the application of Digital Fabrication, through file-to-factory processes in the production of architectural elements for the construction industry, specifically with the use of 2D subtractive cutting techniques. After a survey on the state of the art in the field and two case studies, the method used was an action research through a file-to-factory exercise. The design process was documented in detail and analyzed in order to systematize the procedures as a reference for future applications in architecture. As a result we expect to contribute to the dissemination of these new technologies in architectural production in the Brazilian scenario
Master's
Arquitetura, Tecnologia e Cidade
Master in Arquitetura, Tecnologia e Cidade
APA, Harvard, Vancouver, ISO, and other styles
43

Zhou, Yu, Computer Science & Engineering, Faculty of Engineering, UNSW. "Low power processor design." Publisher: University of New South Wales, Computer Science & Engineering, 2008. http://handle.unsw.edu.au/1959.4/42608.

Full text
Abstract:
Power consumption is a critical design issue in embedded processors. As part of our low power processor design project, this thesis work aims to reduce power consumption in two typical processor components: the Register File (RF) and the Arithmetic and Logic Unit (ALU). The register file is one of the most power hungry components in the processor, consuming about 20% of the processor power. The ALU is the workhorse of the processor, responsible for almost all basic computing operations. Although the ALU does not consume as much power as the register file, we observe that it can be power intensive in terms of power dissipation per unit of silicon area and may result in a thermal hot spot in the processor. Existing approaches to reducing power in the register file and ALU are effective. However, most of them either entail extensive hardware design effort or require a significant amount of post-compilation software code modification. The approaches proposed in this thesis avoid such problems. We only customize the internal structure of the processor components and keep the components' interfaces to other system parts intact, so that the customization of a component is transparent to its external hardware design and no modification to other hardware components or to the software code is required. This customization strategy is well suited to our whole low power processor design project and can be applied to any customization of an existing system for a given application. We have applied our customization approaches to a set of benchmarks in a variety of application domains. Our experimental results show that the power savings on the register file range from 18.8% to 45.5%, with an average of 29.7% of the register file power saved. For the arithmetic and logic unit, the power savings range from 43.5% to 49.6%, with an average saving of 46.9% compared to the original designs. We also combine the customization of both the ALU and the register file.
When the ALU and the register file are customized simultaneously, processor power consumption is reduced by 3.9% to 10.1%; on average, 6.44% of processor power can be saved. Most importantly, these power savings come at the cost of neither hardware complexity nor processor performance, and the implementation is extremely straightforward and can be easily incorporated into a processor design environment, such as ASIPMeister (a design tool that automatically generates a VHDL model for application-specific instruction set processors) used in our research.
APA, Harvard, Vancouver, ISO, and other styles
44

Herrera, Valencia Rodrigo Fernando. "Impact of BIM/LEAN on the interaction of construction project design teams." Doctoral thesis, Universitat Politècnica de València, 2020. http://hdl.handle.net/10251/158718.

Full text
Abstract:
[ES] Design teams in construction projects are composed of different stakeholders; this can make interactions difficult. BIM and Lean methodologies have a positive impact on construction projects. There is also evidence of the joint application of BIM and Lean; however, the empirical relationship between Lean practices and BIM uses in the design phase is unknown. Nor is there a deeper understanding of the social phenomena generated among design teams when BIM-Lean management methodologies are applied. Therefore, the objective of this research is to understand the impact of Lean design management (LDM) practices and BIM uses on the interaction of construction project design teams. The research method has two phases: (1) the creation of tools to assess the level of application of LDM practices and BIM uses, and to understand the interactions in a design team; and (2) the analysis of the relationships between BIM, Lean and interaction, based on empirical information from construction projects in the design phase. The results present a BIM uses assessment instrument and an LDM practices questionnaire to measure design management, and a method to understand the different types of interaction in a design team. Based on data from 64 projects, a chi-square analysis revealed 33 empirical relationships between BIM uses and LDM practices; moreover, the application of BIM uses implies a greater application of LDM practices. The project that applies BIM-Lean management achieves numerous interactions in its design team; transparent, orderly and standardized information flows; an environment of collaboration, trust and learning; and commitment management. None of these interaction elements are visible in the project where BIM-Lean management was not applied.
[CAT] Design teams in construction projects are composed of different stakeholders; this can make interactions difficult. BIM and Lean methodologies have a positive impact on construction projects. There is also evidence of the joint application of BIM and Lean; however, the empirical relationship between Lean practices and BIM uses in the design phase is unknown. Nor is there a deeper understanding of the social phenomena generated among design teams when BIM-Lean management methodologies are applied. Therefore, the objective of this research is to understand the impact of Lean design management (LDM) practices and BIM uses on the interaction of construction project design teams. The research method has two phases: (1) the creation of tools to assess the level of application of LDM practices and BIM uses, and to understand the interactions in a design team; and (2) the analysis of the relationships between BIM, Lean and interaction, based on empirical information from construction projects in the design phase. The results present a BIM uses assessment instrument and an LDM practices questionnaire to measure design management, and a method to understand the different types of interaction in a design team. Based on data from 64 projects, a chi-square analysis revealed 33 empirical relationships between BIM uses and LDM practices; moreover, the application of BIM uses implies a greater application of LDM practices. The project that applies BIM-Lean management obtains numerous interactions in its design team; transparent, orderly and standardized information flows; an environment of collaboration, trust and learning; and commitment management. None of these interaction elements are visible in the project where BIM-Lean management was not applied.
[EN] Design teams of construction projects are composed of different stakeholders, which can make interactions difficult. BIM and Lean methodologies have a positive impact on construction projects. Besides, there is evidence of the combined implementation of BIM and Lean; however, the empirical relationship between Lean practices and BIM uses in the design phase is not known, and there is no deeper understanding of the social phenomena generated among design teams when BIM-Lean management methodologies are implemented. Therefore, the objective of this research is to understand the impact of Lean design management (LDM) practices and BIM uses on the interaction of construction project design teams. The research method has two phases: (1) the creation of tools to assess the level of implementation of LDM practices and BIM uses and to understand the interactions in a design team; and (2) the analysis of the relationships between BIM, Lean, and interaction, based on empirical information from construction projects in the design phase. The results present a BIM uses assessment tool and an LDM practices questionnaire to measure design management, and a method to understand the different types of interaction in a design team. Based on data from 64 projects, a chi-square analysis revealed 33 empirical relationships between BIM uses and LDM practices; moreover, the application of BIM uses implies a greater application of LDM practices. The project that applies BIM-Lean management achieves many interactions among its design team; transparent, orderly, and standardized information flows; a collaborative, trusting, and learning environment; and commitment management. None of these interaction elements are visible in the project where BIM-Lean management was not applied.
Herrera Valencia, RF. (2020). Impact of BIM/LEAN on the interaction of construction project design teams [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/158718
TESIS
APA, Harvard, Vancouver, ISO, and other styles
45

Anastasiadis, P. T. "The influences on optimal structural designs of the modelling processes and design concepts." Thesis, Cranfield University, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.267196.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

FERREIRA, Alexandre Alves. "Gestão de processos na análise da execução orçamentária da Universidade Federal de Pernambuco." Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/17617.

Full text
Abstract:
Submitted by Irene Nascimento (irene.kessia@ufpe.br) on 2016-08-04T17:59:38Z No. of bitstreams: 2 license_rdf: 1232 bytes, checksum: 66e71c371cc565284e70f40736c94386 (MD5) Alexandre Alves Ferreira.pdf: 1532776 bytes, checksum: cd416fd163761609e1e6e758b7633ee6 (MD5)
Made available in DSpace on 2016-08-04T17:59:38Z (GMT). No. of bitstreams: 2 license_rdf: 1232 bytes, checksum: 66e71c371cc565284e70f40736c94386 (MD5) Alexandre Alves Ferreira.pdf: 1532776 bytes, checksum: cd416fd163761609e1e6e758b7633ee6 (MD5) Previous issue date: 2016-04-19
One of the great missions of a government is to provide welfare to the population. Achieving this goal requires developing actions, which entail earmarking public funds to cover the expenses those actions will incur. Thus, in public bodies, such as federal universities, the annual budget is prepared under the Annual Budget Law (LOA), which establishes the credits allocated to the various government agencies so that they can carry out their activities. It is against this horizon that this study applies the principle of Process Management, as a management tool, to monitor the performance of budget execution, especially in procedures that require actions between departmental units. The general objective of this study was therefore to analyse the budget execution management of the Universidade Federal de Pernambuco (UFPE) and to propose improvements to the process. Specifically, the study identified, described and graphically represented UFPE's budget execution process; analysed and outlined its current and desired scenarios with respect to its stages, the time spent on tasks and the human resources involved; and proposed process performance indicators expected to serve as resources for monitoring the budget execution of the budget-managing units and of UFPE in general. An applied research study, its data collection combined different sources of evidence: document research, participant observation and semi-structured questionnaires. A qualitative and quantitative approach was adopted for data analysis, and the Business Process Management (BPM) methodology was used to describe, represent and analyse the process. The proposition of indicators followed the methodological model of Trzesniak (2014), and the indicators were therefore formulated along five dimensions: denomination, purpose, concept, means of verification and metadata.
The research showed that UFPE's budget execution process has the potential to become more agile, efficient and user-focused, and that the proposed monitoring indicators can support budget management.
One of the major missions of government is to provide welfare to the population. In order to achieve this goal it is necessary to develop actions, which involve earmarking public funds to ensure resources for their implementation. Thus, public agencies - such as federal universities - prepare the annual budget in accordance with the Annual Budget Law (LOA), which establishes the credits allocated to the various government agencies so that they may carry out their activities. This is the horizon to which this study applies the principle of process management as a management tool to monitor the performance of budget execution, especially in procedures that require actions between departmental units. Therefore, this study aims to analyse the budget execution management performed by the Federal University of Pernambuco (UFPE) and to propose improvements to the process. Specifically, this study identified, described and graphically represented the budget execution process of UFPE; analysed and pointed out the current and desired scenarios regarding its stages, the time spent on tasks and the human resources involved; and presented a proposal for performance indicators intended to serve as resources for monitoring the budget execution of UFPE's budget management units. This study is applied research. Data collection combined different sources of evidence: document research, participant observation, and semi-structured questionnaires. A qualitative and quantitative approach to data analysis was adopted, and the Business Process Management (BPM) methodology was used for the description, representation and analysis of the process. The proposition of indicators followed the methodological model developed by Trzesniak (2014); the indicators were therefore formulated along five dimensions: denomination, purpose, concept, means of verification and metadata.
The research indicated that UFPE's budget execution process has the potential to become more agile, efficient and user-focused, and that the proposed monitoring indicators can support the follow-up of budget management.
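The five-dimension indicator model attributed to Trzesniak (2014) can be pictured as a simple record type. This is only a hypothetical sketch: the field values below are invented examples, not indicators proposed by the dissertation.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    denomination: str            # short name identifying the indicator
    purpose: str                 # what decision or goal it is monitored for
    concept: str                 # what it measures, conceptually
    means_of_verification: str   # how the value is computed, and from which data
    metadata: dict = field(default_factory=dict)  # unit, frequency, owner, ...

# Invented example for illustration only
lead_time = Indicator(
    denomination="Average processing time",
    purpose="Monitor the agility of budget execution between units",
    concept="Mean elapsed days between request and completion of a budget step",
    means_of_verification="completion_date - request_date, averaged per month",
    metadata={"unit": "days", "frequency": "monthly"},
)
```

Each indicator is then fully specified before any value is collected, which is what makes it usable for monitoring across departmental units.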
APA, Harvard, Vancouver, ISO, and other styles
47

O'Donovan, Brendan Donal. "Modelling and simulation of engineering design processes." Thesis, University of Cambridge, 2004. https://www.repository.cam.ac.uk/handle/1810/251938.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Xin, Chen Hilario Lorenzo. "Modelling resources in simulation engineering design processes." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/269709.

Full text
Abstract:
The planning and scheduling of appropriate resources is essential in engineering design for delivering quality products on time, within cost and at acceptable risk. There is an inherent complexity in deciding which resources should perform which tasks, taking into account their effectiveness in completing each task while adjusting to their availability. The right resources must be applied to the right tasks in the correct order. In this context, process modelling and simulation can aid resource management decision making. However, most approaches define resources as elements needed to perform the activities without defining their characteristics, or use a single classification such as human designers. Other resources, such as computational and testing resources, have been overlooked during process planning stages. In order to achieve this, literature and empirical investigations were conducted. Firstly, the literature investigation focused on which elements are considered design resources by current modelling approaches. Secondly, empirical studies characterised key design resources, including designers and computational, testing and prototyping resources. The findings advocated an approach that allows allocation flexibility to balance different resource instances within the process, together with the capability to diagnose the impact of attaining specific performance levels when searching for a preferred resource allocation. Therefore, the thesis presents a new method to model different resource types with their attributes, and studies the impact of using different instances of those resources by simulating the model and analysing the results. The method, which extends a task network model, the Applied Signposting Model (ASM), with Bayesian Networks (BN), allows testing the influence of different resource combinations on process performance.
The model uses BN within each task to model the different instances of resources that carry out the design activities (computational, designers and testing), along with their configurable attributes (time, risk, learning curve, etc.) and the task requirements. The model was embedded in an approach and evaluated by applying it to two aerospace case studies. The results identified insights for improving process performance, such as the best-performing resource combinations, resource utilisation, resource-sensitive activities, the impact of different variables, and the probability that the different resource instances reach set performance targets.
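The kind of trade-off such a method explores can be sketched with a toy Monte Carlo loop. This is not the actual ASM/BN implementation: the task names, resource instances and their time/risk attributes below are invented for illustration.

```python
import itertools
import random

# Hypothetical design tasks and their nominal effort (e.g. person-days)
TASKS = {"spec": 10.0, "model": 20.0, "test": 15.0}

# Hypothetical resource instances: (speed factor, probability of one rework pass)
RESOURCES = {
    "senior_designer": (1.0, 0.05),
    "junior_designer": (1.6, 0.20),
    "hpc_cluster":     (0.5, 0.02),
}

def simulate(allocation, runs=1000, seed=0):
    """Return the mean total duration of the task sequence under an allocation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        duration = 0.0
        for task, effort in TASKS.items():
            speed, rework_p = RESOURCES[allocation[task]]
            t = effort * speed
            if rng.random() < rework_p:  # task fails once and is redone
                t *= 2
            duration += t
        total += duration
    return total / runs

# Exhaustively compare every allocation of resource instances to tasks
best = min(
    (dict(zip(TASKS, combo)) for combo in itertools.product(RESOURCES, repeat=len(TASKS))),
    key=simulate,
)
```

Replacing the per-task `(speed, rework_p)` pair with a Bayesian Network over attributes such as learning curve and availability is the step the thesis takes beyond this kind of flat model.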
APA, Harvard, Vancouver, ISO, and other styles
49

Orr, Rodney Alister. "Language extensions for array processor and multi-processor configurations." Thesis, Queen's University Belfast, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.254319.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Gnatyuk, Vladimir, and Christian Runesson. "A Multimedia DSP Processor Design." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2269.

Full text
Abstract:

This Master's Thesis presents the design of the core of a fixed-point, general-purpose multimedia DSP processor (MDSP) and its instruction set. The processor employs parallel processing techniques and specialized addressing modes to speed up the processing of multimedia applications.

The MDSP has a dual-MAC structure in which one enhanced MAC provides a SIMD (Single Instruction, Multiple Data) unit consisting of four parallel data paths optimized for accelerating multimedia applications. The SIMD unit performs four multimedia-oriented 16-bit operations every clock cycle, accelerating computationally intensive procedures such as video and audio decoding. The MDSP uses a bank of four memories to provide multiple accesses to source data each clock cycle.
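The four-lane, 16-bit SIMD behaviour described above can be illustrated in software. The thesis's actual hardware is an RTL design not reproduced here; the function below is only a hypothetical model of one packed operation, showing how four independent 16-bit lanes of a 64-bit word are added with per-lane wraparound.

```python
def simd_add16(a: int, b: int) -> int:
    """Add four packed 16-bit lanes of two 64-bit words, wrapping per lane."""
    result = 0
    for lane in range(4):
        shift = 16 * lane
        x = (a >> shift) & 0xFFFF
        y = (b >> shift) & 0xFFFF
        # A carry out of one lane never propagates into the next lane
        result |= ((x + y) & 0xFFFF) << shift
    return result
```

For example, `simd_add16(0xFFFF, 0x0001)` wraps lane 0 to zero without disturbing lane 1, which is exactly the lane isolation a hardware SIMD adder must enforce.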

APA, Harvard, Vancouver, ISO, and other styles
