Dissertations / Theses on the topic 'Lifecycle modeling'

Consult the top 28 dissertations / theses for your research on the topic 'Lifecycle modeling.'


1

Hefnawy, Ahmed. "Lifecycle-based Modeling of Smart City Ecosystem." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE2014.

Full text
Abstract:
Smart city system development, operation, and maintenance are highly complex tasks involving numerous stakeholders from different disciplines and domains. In most cases, these systems are at different phases of design, deployment, and operation, i.e. at different phases of the lifecycle. Lifecycle management concepts are therefore essential for managing smart city development as a complete ecosystem across lifecycle phases. This argument is supported by the findings of our smart city survey, in which information gathered from interviewed stakeholders confirms the relevance of a lifecycle approach for addressing the nine identified concerns: non-alignment with strategic objectives, regulatory failure at different phases, delays in time to market, disjointed processes, difficult knowledge sharing and data traceability, inefficient and delayed exchange of data/information, and inefficient and ineffective use of infrastructure. To address these concerns, this thesis proposes applying lifecycle management concepts to smart cities, which requires introducing the notion of time into smart city modeling by adding the lifecycle viewpoint as a new dimension of the multi-layered architecture. The proposed smart city lifecycle-based approach consists of two components. First, a three-dimensional model that enables smart city developers to consider three viewpoints: architecture layers, time (lifecycle phases), and domains. Second, an interaction approach that enables integration between lifecycle management systems and IoT platforms. The approach is validated through a use case of a smart parking system, proposed as part of the FIFA World Cup™ 2022. The proposed smart parking system is strategically aligned with Smart Qatar objectives and connects all relevant stakeholders across the different lifecycle phases. To ensure semantic interoperability, the system uses the DATEX II standards for static and dynamic parking-related data.
Finally, the use case focuses on the integration between lifecycle-related data and IoT data through the interaction between the Aras Innovator® lifecycle system (BoM construction, configuration management, etc.) and the O-MI/O-DF IoT Reference Implementation Platform (peer-to-peer publication and discovery of parking-related information in an aggregated form).
APA, Harvard, Vancouver, ISO, and other styles
2

Rao, Vijay D. "A Unified Approach to Quantitative Software Lifecycle Modeling." Thesis, Indian Institute of Science, 2001. http://hdl.handle.net/2005/94.

Full text
Abstract:
An evolutionary shift currently taking place in engineering systems is the move from hardware to software, with software engineering becoming more central to developing large engineering systems. This shift represents a trend from a piecemeal vision of software development to a holistic, system-wide vision. The term "software crisis" of the 1960s and 1970s captured the observation that most software development projects ended with massive cost overruns and schedule delays. The growing complexity of software projects led to the Waterfall, Spiral, and other models for depicting the software development lifecycle. These models are qualitative, study product, process, and project issues in isolation, and do not provide a quantitative framework covering the various facets of development, testing, maintenance, and reuse. In this thesis, a generic, unified lifecycle model (ULM) integrating the product, process, and project views of software development, based on re-entrant lines, is proposed. A re-entrant line is a multi-class queueing network in which several types of artifacts visit a set of development teams more than once. An artifact is a general term for any object of information created, produced, changed, or used by development teams; it includes items such as requirements specification documents, preliminary and detailed module designs and design documents, code, components, test plans, and test suites. The artifacts visit the development teams several times before exiting the system, making the flow of artifacts non-acyclic. The main consequence of the re-entrant flow is that several artifacts at different stages of completion compete with each other for service by a development team. The ULM output is obtained using the criticality, complexity, and usage of artifacts, and the model is solved using linear programming and simulation methods.
The software development process in a software organisation is represented by the proposed re-entrant line model. The model is used to predict project metrics such as development time, cost, and product quality for any new project taken up by the organisation. The routing matrix of the artifacts in the ULM can be modified to derive different types of lifecycle models, such as the Waterfall, Prototyping, Spiral, and Hybrid models, and the ULM may be modified to include software reuse and component-based development. We investigate certain issues involved in software reuse. Reuse of components is modeled as an external arrival of artifacts at different stages in the ULM. Two distinct lifecycles in component-based software development, namely 'development for reuse' and 'development with reuse', are distinguished, and the development time and cost for projects are estimated using LP bounds and simulation. The 'development for reuse' lifecycle involves developing reusable components that are stored in a reuse library. As the number of components in the reuse library grows over time and across projects, effective and efficient retrieval of candidate components to facilitate systematic reuse becomes the bottleneck. A novel approach in which components are stored in a case-base is proposed. The retrieval process is based on a reasoning approach that relies on similar cases (components) from the past to find solutions to the current problem (new software requirements in projects). The selection of candidate components for decisions pertaining to four levels of reuse {reuse as-is, reuse with minor code modifications, reuse of specifications, no reuse or develop afresh} in the current application is modeled using Rough and Fuzzy sets. These methodologies are illustrated with suitable case studies. Maintenance of legacy systems, representing a massive, long-term business investment, is an important but relatively new research area.
The ULM is modified to depict the complex set of activities associated with software maintenance. Quantitative metrics such as version release times and the cost, time, and effort of maintenance are estimated using this model. Some of the specific contributions of this thesis are:

1. A unified quantitative lifecycle model (ULM) depicting the software development process is used to obtain project metrics such as development cost, development time, and quality, based on product and process attributes, for the Waterfall, Prototyping, Spiral, and Hybrid lifecycle models.

2. The Analytic Hierarchy Process (AHP) methodology is used to rank the suitability of different lifecycle models for a new development project, based on the metrics obtained from the ULM.

3. The ULM is modified to depict component-based software development and to integrate reuse as an important basis for software development. Two distinct lifecycles, development for reuse and development with reuse, are studied. The 'development for reuse' strategy generates reusable components that are organized and stored in a reuse library. The selection decision regarding candidate components from this library for reuse in the current application is derived using a Rough and Fuzzy set methodology.

4. The ULM is adapted to represent the various activities associated with software maintenance, and estimates of maintenance metrics for different strategies of maintaining legacy systems are obtained.
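The re-entrant flow underlying the ULM can be illustrated with a toy discrete-event simulation. This is a minimal sketch under invented assumptions (the team names, deterministic service times, and single-server teams are illustrative placeholders); the thesis itself uses multi-class queueing networks solved with LP bounds and simulation.

```python
import heapq

def simulate_reentrant_line(routes, service):
    """Toy re-entrant line: each artifact follows its route of team visits
    (a team may appear more than once, making the flow re-entrant); each
    team is a single server that takes jobs in the order they become ready.
    Returns (finish time per artifact, makespan)."""
    # Event = (ready_time, tie_breaker, artifact_id, next_step_index)
    events = [(0.0, i, i, 0) for i in range(len(routes))]
    heapq.heapify(events)
    team_free = {}   # team -> time at which that team is next idle
    done = {}
    tie = len(routes)
    while events:
        t, _, a, k = heapq.heappop(events)
        if k == len(routes[a]):                      # route finished: artifact exits
            done[a] = t
            continue
        team = routes[a][k]
        start = max(t, team_free.get(team, 0.0))     # wait if the team is busy
        finish = start + service[team]
        team_free[team] = finish
        heapq.heappush(events, (finish, tie, a, k + 1))
        tie += 1
    return done, max(done.values())
```

With two identical artifacts routed design → code → design (review) → test, the second artifact queues behind the first at each re-entrant visit, which is exactly the contention the ULM models.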
3

Zhu, Wenhua. "3D modeling of city building and lifecycle simulation." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2344/document.

Full text
Abstract:
With the construction and development of smart cities, how to build realistic 3D models of large-scale city buildings quickly and efficiently has become a research hotspot. In this thesis, a novel 3D modeling approach based on shape grammar and facade-rule modeling is proposed to build 3D models of large-scale city buildings quickly and efficiently. Building Information Modeling (BIM) is an important technical means of advancing the construction industry, and the key for city building design and construction is how best to research and apply BIM technology; in this thesis, a City Building Integrated Information Model (CBIIM) is specified to manage building lifecycle information effectively and to realize information sharing and exchange. The thesis studies the acquisition and processing of modeling data. Google Earth and the ArcGIS software are used to acquire and process image-map and elevation-map data of the target area; matching and overlaying these two kinds of data generates 3D city terrain data with geographic location information. OpenStreetMap is then used to acquire road data of the target area, which can be optimized into the necessary road network with the JOSM software. 3D laser scanning technology is used to collect building surface texture images and to create a point-cloud model of the target architecture so that modeling dimensions can be obtained by measurement. On this basis, the thesis studies the principle and process of CGA rules for creating building models, and a method that separates architectural elements using image segmentation in order to generate CGA rules automatically and then create the building model. 3D building models are thus established in the CityEngine software using CGA rules and facade modeling technology. The thesis specifies the City Building Integrated Information Model (CBIIM) based on BIM.
City building information is classified and integrated, and buildings and components are described with the IFC standard in order to manage building lifecycle information effectively. The thesis studies integrated-information association model technology, which realizes standardized component design with associated features and intelligent building design with associated parameters in knowledge rules combined with IFC. Construction simulation technology is also studied: the knowledge rules in the integrated information model provide a reliable reference for the construction simulation, the simulation scene is created by invoking the integrated information model, and the construction simulation process is completed by the program. Taking the Baoshan campus of Shanghai University as an example, the modeling process of the whole scene is illustrated, and the modeling steps for all kinds of 3D objects are described in detail to solve specific problems in the actual modeling process; the feasibility and validity of the procedural intelligent modeling approach are thereby verified. Taking a Shanghai University dormitory as an example, a simulation scene and simulation model were created from the integrated information; combined with the relevant construction information, the construction simulation was completed by the program, verifying the feasibility and validity of the CBIIM.
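The core CGA idea of splitting a facade by rule can be sketched in a few lines. This toy example is not real CityEngine CGA syntax; the floor height, tile width, and door-placement rule are invented placeholders standing in for the far richer rules the thesis derives from image segmentation.

```python
def split_facade(width, height, floor_h=3.0, tile_w=2.0):
    """Toy CGA-style split grammar: a vertical split of the facade into
    floors, then a horizontal split of each floor into tiles; the middle
    ground-floor tile becomes a door, all other tiles become windows.
    Returns (label, x, y, w, h) rectangles."""
    tiles = []
    n_floors = int(height // floor_h)
    n_tiles = int(width // tile_w)
    for f in range(n_floors):
        for t in range(n_tiles):
            label = "door" if f == 0 and t == n_tiles // 2 else "window"
            tiles.append((label, t * tile_w, f * floor_h, tile_w, floor_h))
    return tiles
```

A 10 m × 9 m facade yields three floors of five tiles each; automating the choice of such split parameters per building is what the image-segmentation step in the thesis provides.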
4

Blanchard, Robert D. "Nose fairing modeling and simulation to support Trident II D5 lifecycle extension." Thesis, Monterey, California: Naval Postgraduate School, 2013. http://hdl.handle.net/10945/37588.

Full text
Abstract:
Approved for public release; distribution is unlimited
The objective of this thesis is to evaluate a modeling and simulation tool for the analysis of the Trident II D5 missile nose fairing to determine the limitations of serviceability through the extended service life of the D5 missile. The benefit of this analysis is a means to evaluate and manage the remaining nose fairing supply and serve as a baseline for future production of nose fairings. Constructed of a Sitka spruce and fiberglass laminate, the nose fairing is designed as the lifting point of the missile for submarine onloads and offloads and supports the entire weight of the missile. A computer model of the nose fairing was used to evaluate the nose fairing under tensile and compressive loading conditions to simulate the lifting evolution and closure segment impact at time of launch. Changes in the material properties of the model allow for a simulation of aging in the nose fairing to estimate the performance degradation over time, as well as exploration of the applicability of new materials to any future design of nose fairings.
5

Johnston, Reuben Aaron. "A Multivariate Bayesian Approach to Modeling Vulnerability Discovery in the Software Security Lifecycle." Thesis, The George Washington University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10828524.

Full text
Abstract:

Software vulnerabilities that enable well-known exploit techniques for committing computer crimes are preventable, but they continue to be present in releases. When Blackhats (i.e., malicious researchers) discover these vulnerabilities they oftentimes release corresponding exploit software and malware. If vulnerabilities—or discoveries of them—are not prevented, mitigated, or addressed, customer confidence could be reduced. In addressing the issue, software-makers must choose which mitigation alternatives will provide maximal impact and use vulnerability discovery modeling (VDM) techniques to support their decision-making process. In the literature, applications of these techniques have used traditional approaches to analysis and, despite the dearth of data, have not included information from experts and do not include influential variables describing the software release (SR) (e.g., code size and complexity characteristics) and security assessment profile (SAP) (e.g., security team size or skill). Consequently, they have been limited to modeling discoveries over time for SR and SAP scenarios of unique products, whose results are not readily comparable without making assumptions that equate all SR and SAP combinations under study. This research takes an alternative approach, applying Bayesian methods to modeling the vulnerability-discovery phenomenon. Relevant data were obtained from expert judgment (i.e., information elicited from security experts in structured workshops) and from public databases. The open-source framework, MCMCBayes, was developed to perform Bayesian model averaging (BMA). It combines predictions of interval-grouped discoveries by performance-weighting results from six variants of the non-homogeneous Poisson process, two regression models, and two growth-curve models. 
Utilizing expert judgment also enables forecasting expected discoveries over time for arbitrary SR and SAP combinations, thus helping software-makers to better understand the effects of influential variables they control on the phenomenon. This requires defining variables that describe arbitrary SR and SAP combinations as well as constructing VDM extensions that parametrically scale results from a defined baseline SR and SAP to the arbitrary SR and SAP of interest. Scaling parameters were estimated using elicited multivariate data gathered with a novel paired comparison approach. MCMCBayes uses the multivariate data with the BMA model for the baseline to perform predictions for desired SR and SAP combinations and to demonstrate how multivariate VDM techniques could be used. The research is applicable to software-makers and persons interested in applications of expert-judgment elicitation or those using Bayesian analysis techniques with phenomena having non-decreasing counts over time.
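The performance-weighted averaging at the heart of BMA can be shown in a short sketch. This is a generic illustration, not the MCMCBayes implementation: the log scores and per-interval predictions below are made-up numbers, and real weights would come from each model's estimated marginal likelihood.

```python
import math

def bma_weights(log_scores):
    """Model weights from per-model log evidence/performance scores,
    computed as a numerically stabilized softmax."""
    m = max(log_scores)
    raw = [math.exp(s - m) for s in log_scores]
    z = sum(raw)
    return [r / z for r in raw]

def bma_predict(predictions, log_scores):
    """Performance-weighted average of each model's predicted
    interval-grouped discovery counts."""
    w = bma_weights(log_scores)
    n_intervals = len(predictions[0])
    return [sum(wi * p[k] for wi, p in zip(w, predictions))
            for k in range(n_intervals)]
```

A model whose score is three times more likely gets three times the weight, so the averaged forecast leans toward the better-performing model while still hedging across all of them.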

6

Richard, Deepak. "Lifecycle Performance Model for Composite Materials in Civil Engineering." University of Cincinnati / OhioLINK, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1069787827.

Full text
7

Brooks, Brad Walton. "Automated Data Import and Revision Management in a Product Lifecycle Management Environment." Diss., CLICK HERE for online access, 2009. http://contentdm.lib.byu.edu/ETD/image/etd3182.pdf.

Full text
8

Zhang, Sumei. "Product structure modeling for ETO system product considering the product lifecycle : A case study of ABB Mine Hoist." Thesis, Uppsala universitet, Industriell teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385770.

Full text
Abstract:
To gain competitive advantage, companies provide a variety of customized products to satisfy customer-specific requirements, leading not only to a large amount of product data but also to high cost, long lead times, and complex quality control. Efficient product data management throughout the product lifecycle has become increasingly crucial, and product structure management is regarded as its most important constituent. The study took the ABB Mine Hoist system as a case to investigate how to construct a generic product structure model suited to engineer-to-order system offerings, considering their sales-delivery product lifecycle. The aim of the model is to facilitate the sharing and reuse of product-related information across a company, as well as the integration of different business operations throughout the entire product lifecycle. Based on an analysis of the current state of product data management for ABB Mine Hoist, three major issues were identified that the formulation of a generic structure model must address: the integration of requirements from multiple disciplines; the consistency of product information throughout the product lifecycle; and the constant updating of the product repository. By illustrating the formulation of the ABB Mine Hoist generic structure model, the study presents a method for constructing a generic product structure model for engineer-to-order system products. The model applies the framework of the STEP-based product model and is regarded as a result of integrating domain-specific requirements. The adaptive generic product structure model is then employed to display the role of this generic model in the different phases of a sales-delivery lifecycle; the model can serve as a "master concept" for transferring common product information across the product lifecycle.
It is expected to benefit the business of engineer-to-order system products by improving the integration of different disciplines and enhancing information exchange and reuse. It could also provide an abstract, conceptual basis for a potential product repository to reinforce data consistency and completeness.
9

Mondini, Leonardo. "BIM Lifecycle e Facility Management: il caso di studio del BIM per Gio Ponti." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020.

Find full text
Abstract:
As a contribution to the current need for BIM adoption in our country, this work documents how Building Information Modeling, thanks to its holistic methodology, does not conflict with traditional methods of designing, constructing, and managing a building, but instead updates a legacy of the engineering tradition: the relationship between the design act and the constructed fact. The biggest evolutionary step must come from Facility Management, since this is where 70% of the total cost over a building's useful life is concentrated. The obstacle to overcome is data management: we have an effective database (the IFC model) that allows us to store a large amount of data (ACDat and ACDoc); now we must be able to read and decipher it. The other problem to solve is the bidirectionality of information, which is read in one direction (from IFC to Facility Management programs) but finds no real counterpart in the reverse direction. It is time to optimize every aspect of the construction process through a change of approach to the problem, a generational change of thinking that is finding fertile ground in the AEC sector to fully develop its potential.
10

Lucas, Jason David. "An Integrated BIM Framework to Support Facility Management in Healthcare Environments." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/28564.

Full text
Abstract:
The quality of healthcare environments has been linked to patient safety, patient and staff stress, clinical output, and patient outcomes. As part of maintaining the physical environment in healthcare settings, facility managers need to ensure that complex systems are working properly, and facility management tasks must be completed with minimal interference with clinical services. This is often difficult because facility information is stored in multiple systems and may be inadequate or incomplete, and the communication and exchange of information throughout the lifecycle and the operational phase of the building are fragmented. Relevant information and effective facility information management are important for efficient operation and maintenance of the facility, all the more so when systems are constantly upgraded and renovated due to new technologies and facility managers must do more work with fewer resources. This research examines the link between facility management and clinical activities, especially in terms of information exchange and management. A framework is proposed to help facility managers manage healthcare facility information more efficiently. A case analysis was completed on facility-related patient safety events to determine the types of information needed and exchanged during facility personnel's response to each event. The information was then organized into a product model and ontology to help capture, manage, and retrieve it. The goal of the research is to offer a method of storing healthcare facility information efficiently and effectively to support facility managers in their response to patient safety events. This dissertation outlines the objectives of the research and the methodologies used in the case analysis; the development of the product model and the information exchanges identified are also discussed.
Lastly, a conceptual model for a prototype was developed and is presented to demonstrate how the product model and ontology can be used to let the user query information and interact with the system.
Ph. D.
11

Tindall, Nathaniel W. "Analyses of sustainability goals: Applying statistical models to socio-economic and environmental data." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/54259.

Full text
Abstract:
This research investigates the environment and development issues of three stakeholders at multiple scales: global, national, regional, and local. Through the analysis of financial, social, and environmental metrics, the potential benefits and risks of each case study are estimated and their implications considered. In the first case study, the relationship between manufacturing and environmental performance is investigated. Over 700 facilities of a global manufacturer, producing 11 products on six continents, were investigated to understand global variations and determinants of environmental performance. Water, energy, carbon dioxide emission, and production data from these facilities were analyzed to assess environmental performance, and the relationship between production composition at individual facilities and environmental performance was investigated. Location-independent environmental performance metrics were combined to provide both global and local measures of environmental performance. These models were extended to estimate future water use, energy use, and greenhouse gas emissions under potential demand shifts. Natural resource depletion risks were investigated, and mitigation strategies related to vulnerabilities and exposure were discussed. The case study demonstrated how data from multiple facilities can be used to characterize variability among facilities and to preview how changes in production may affect overall corporate environmental metrics. The developed framework adds a new approach for accounting for environmental performance and degradation and for assessing risk in locations where climate change may affect the availability of production resources (i.e., water and energy), and it is thus a tool for understanding risk and maintaining competitive advantage. The second case study was designed to address the issue of delivering affordable and sustainable energy.
Energy pricing was evaluated by modeling individual energy consumption behaviors. The analysis simulated a heterogeneous set of residential households in both urban and rural environments in order to understand demand shifts in the residential energy end-use sector due to the effects of electricity pricing. An agent-based model (ABM) was created to investigate the interactions of energy policy and individual household behaviors; the model incorporated empirical data on beliefs and perceptions of energy, integrating environmental beliefs, energy pricing grievances, and social networking dynamics into the ABM structure. The model projected aggregate residential-sector electricity demand over a 30-year period and distinguished the respective numbers of households that use only electricity, that rely solely on indigenous fuels, and that combine indigenous fuels with electricity. The model is one of the first characterizations of household electricity demand response and fuel transitions related to energy pricing at the individual household level, and one of the first approaches to evaluating consumer grievance and rioting responses to energy service delivery. The model framework is suggested as an innovative tool for energy policy analysis and can easily be revised to assist policy makers in other developing countries. In the final case study, a framework was developed for a broad cost-benefit and greenhouse gas evaluation of transit systems and their associated developments, with a case study of the Atlanta BeltLine. The net greenhouse gas emissions of the BeltLine light rail system will depend on the energy efficiency of the streetcars themselves, the greenhouse gas emissions from the electricity used to power them, the extent to which people use the BeltLine instead of driving personal vehicles, and the efficiency of those vehicles. 
The effects of ridership, residential densities, and housing mix on environmental performance were investigated and used to estimate overall system efficacy. The range of the net present value of the system was estimated considering health, congestion, per capita greenhouse gas emissions, and societal costs and benefits on a time-varying scale, as well as construction and operational costs. The 95% confidence interval ranged from a potential loss of $860 million to a benefit of $2.3 billion; the mean net present value was $610 million. It is estimated that the system will generate a savings of $220 per ton of emitted CO2, with a 95% confidence interval bounded by a potential social cost of $86 per ton CO2 and a savings of $595 per ton CO2.
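The time-varying cost-benefit analysis summarized above can be illustrated with a minimal Monte Carlo sketch. All distributions, dollar figures, and parameter names below are hypothetical placeholders, not values or methods taken from the dissertation.

```python
import random
import statistics

def npv(cashflows, rate):
    """Discount a list of annual net cashflows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(n_trials=10_000, years=30, seed=42):
    """Monte Carlo NPV: sample uncertain capital costs and annual net benefits.

    Returns the mean NPV and an empirical 95% confidence interval ($M).
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        capital = -rng.uniform(400, 600)                    # construction cost, $M (hypothetical)
        annual = [rng.gauss(45, 25) for _ in range(years)]  # net benefit per year, $M (hypothetical)
        results.append(npv([capital] + annual, rate=0.03))
    results.sort()
    mean = statistics.mean(results)
    lo, hi = results[int(0.025 * n_trials)], results[int(0.975 * n_trials)]
    return mean, (lo, hi)

mean, ci = simulate_npv()
```

Sampling uncertain annual benefits and discounting them yields both a mean NPV and an empirical 95% confidence interval, the same form of result reported for the BeltLine case study.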
APA, Harvard, Vancouver, ISO, and other styles
12

Huo, Ming Computer Science & Engineering Faculty of Engineering UNSW. "A systematic framework of recovering process patterns from project enactment data as inputs to software process improvement." Publisher: University of New South Wales. Computer Science & Engineering, 2009. http://handle.unsw.edu.au/1959.4/43683.

Full text
Abstract:
The study of the software development process is a relatively new research area, but it is growing rapidly. This development process, also called 'the software life cycle' or 'the software process', is the methodology used throughout the industry for the planning, design, implementation, testing and maintenance that take place during the creation of a software product. Over the years a variety of process models have been developed, and from the numerous models now available project managers need validation of the choice they have made of the software development model they believe will provide the best results. The quality software so sought after by project managers can be enhanced by improving the development process through which it is delivered. Well-tested, reliable evidence is needed to assist project managers in choosing and planning a superior software process, as well as in improving the adopted one. While some guidelines for software process validation and improvement have been provided, such as CMMI, quantitative evidence is in fact scarce, and it sometimes cannot be obtained from high-level processes that refer to a planned process model, such as a waterfall model. Furthermore, there has been little analysis of low-level processes, that is, of how a development team actually follows a high-level software process model to develop a software product. We describe these low-level processes as project enactment. Normally there is a gap between the high-level software process and the project enactment; in order to improve the software development process, this gap needs to be identified, measured and analyzed. In this dissertation, we propose an approach that examines the deviation between a planned process model and the project enactment of that plan. We measure the discrepancy from two aspects: consistency and inconsistency. 
The analytical results of the proposed approach, which include both qualitative and quantitative data, provide powerful and precise evidence for tailoring, planning and selecting any software process model. The entire approach is composed of four major phases: 1) representation of the planned process model, 2) pre-processing of the low-level process data, 3) process mining, and 4) analysis and comparison of the recovered process model and the planned process model. We evaluated the proposed approach in three case studies: a small, a medium, and a large-sized project obtained from an industrial software development organization. The appropriate data on low-level processes was collected and our approach applied to each project individually. From each case study we then performed a detailed analysis of the inconsistencies that had surfaced, as well as of the consistencies between the plan and the enactment models. The analysis of the inconsistencies revealed that several 'agile' practices were introduced during the projects' development even though the planned process model was initially based on ISO 12207 rather than an 'agile' method. In addition, our analysis identified the patterns in the process that are frequently repeated. The outcome of the case studies shows that our approach is applicable to a range of software projects, and the conclusions derived from them confirmed that it could be used to enhance the entire software development process, including tailoring and assessment.
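As a rough illustration of the plan-versus-enactment comparison described in this abstract, the sketch below measures consistency as the share of observed direct-succession pairs that a planned activity sequence allows. The activity names and traces are invented, and real process mining over project event logs is considerably more involved than this.

```python
def transitions(seq):
    """Set of direct-succession pairs (a, b) observed in an activity sequence."""
    return set(zip(seq, seq[1:]))

def conformance(planned, enacted_traces):
    """Return (consistency ratio, deviating transitions) of enactment vs. plan."""
    allowed = transitions(planned)
    observed = set()
    for trace in enacted_traces:
        observed |= transitions(trace)
    return len(observed & allowed) / len(observed), observed - allowed

# Hypothetical planned model and two enacted traces recovered from project data.
planned = ["spec", "design", "code", "test", "release"]
enacted = [
    ["spec", "design", "code", "test", "code", "test", "release"],  # rework loop
    ["spec", "code", "design", "code", "test", "release"],          # reordered steps
]
ratio, deviations = conformance(planned, enacted)
```

Here the rework loop and the swapped design/code steps surface as inconsistencies, analogous to the unplanned practices the case studies uncovered in projects planned around ISO 12207.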
APA, Harvard, Vancouver, ISO, and other styles
13

Nabil, A. "P.A.L.M. (Physical Asset Lifecycle Modelling) in the healthcare sector." Thesis, University College London (University of London), 2016. http://discovery.ucl.ac.uk/1503379/.

Full text
Abstract:
A Private Finance Initiative (PFI) is a way of establishing Public-Private Partnerships (PPP) by funding public infrastructure projects with private capital investment. The election in 1979 of a Conservative government under Margaret Thatcher marked the start of a still-continuing shift of activities away from the UK public sector, and PFI was implemented in the UK for the first time in 1992. HCP (Healthcare Projects) is an award-winning PFI asset-management company and, as part of the EngD course, the researcher spent a large amount of time based at HCP. This thesis presents an alternative, combined-methods research approach to one of the most mechanically complex asset types under HCP's management, in its largest healthcare facility. The research presents a risk-based approach to the operational lifecycle planning of 113 air-handling units at a central London hospital. The two components of the project are engineering risk (how likely is the asset to fail?) and contractual risk (what are the financial implications of such a failure?). Currently, these assets are modelled by HCP at a 'strategic' level, but using CIBSE-recommended guidance and part-failure data collected from six other UK-based hospitals, the Physical Asset Lifecycle Model (PALM) produces a funding profile for the replacement of the 1,247 internal components, as opposed to 113 bulk assets. The numerical model has also been visualised through the extraction of 3D BIM geometry into a geometrical-modelling tool (Rhino5) and computational plug-in (Grasshopper) to connect to the lifecycle model and visualise the proposed replacement strategy. The qualitative part of the combined-methods approach involved interviewing HCP management board members about their views on the models. The current profile adopted by HCP for the management of the air-handling units involves a £6.045m spend during the remaining 33-year concession period. 
The main finding of the PALM lifecycle model is that, with a component-level replacement approach, this figure can be reduced by more than £1m, to a recommended replacement profile of £4.709m. Such a reduction depends on how HCP currently manages its assets: the engineering survey conducted showed that three air-handling units currently being life-cycled by HCP either had no components or were decommissioned prior to construction. The main findings on the PALM geometrical model (based on thematic-interview analysis) are that such a tool has largely been unseen in the industry before and that it displays major translatability to other complex mechanical assets with component parts. It can also be integrated into HCP business propositions for new and existing clients because of its clarity and its ability to produce transparent lifecycle modelling from a decision-maker's point of view. The research concludes that while the PALM model provides a glimpse of how lifecycle modelling may be conducted in the future, a number of barriers to its implementation remain (namely data availability in a competitive environment, the time-versus-income-generated business-case paradigm, and a generational ability among senior decision-makers to change and accept technological advancements).
APA, Harvard, Vancouver, ISO, and other styles
14

Figay, Nicolas. "Interoperabilité des applications d'entreprise dans le domaine technique." Thesis, Lyon 1, 2009. http://www.theses.fr/2009LYO10242/document.

Full text
Abstract:
Within the current economic context, enterprises face new interoperability issues due to the growing need for eBusiness collaboration within the emerging digital ecosystems they belong to. They also need to leverage and evolve their heterogeneous internal legacy applications. In addition, establishing a fast, time-limited digital collaboration with a new partner of their ecosystem should not require modifying their communication infrastructures or applications in order to exchange information and knowledge. On the one hand, current solutions are less and less suited to the growing interoperability needs of increasingly complex environments. On the other hand, the interoperability standards and frameworks currently in use cannot simply be replaced by disruptive new approaches and technologies. The research work undertaken for this thesis on the interoperability of technical enterprise applications consists in proposing an innovative approach that allows a given, mature ecosystem to build an enterprise application interoperability framework based on the simultaneous and coherent usage of the ecosystem's interoperability standards, combining the different relevant frameworks that support these standards. The goal is 'pragmatic' interoperability. The proposed approach draws on the combined contributions of model-driven engineering, enterprise modelling, ontologies and service-oriented architecture, and promotes the systematic use of web commodities based on open, governed standards. In doing so, semantic preservation between the ecosystem's standards, the application engineering artifacts and the communication infrastructures is crucial. To support this semantic preservation, the proposed approach includes the concept of 'extended hypermodel', developed in this thesis and demonstrated in the particular context of Product Lifecycle Management applications within the extended, networked enterprise.
APA, Harvard, Vancouver, ISO, and other styles
15

Vadoudi, Kiyan. "Data Model Proposal to Integrate GIS with PLM for DfS." Thesis, Troyes, 2017. http://www.theses.fr/2017TROY0014/document.

Full text
Abstract:
There are different approaches to implementing sustainability; Design for Sustainability (DfS) is the one that gives the most accurate results by considering both global and regional scales. The integration of Life Cycle Assessment (LCA) into Product Lifecycle Management (PLM) is an example of tool integration to support sustainability. In the LCA framework, the Life Cycle Inventory (LCI) is the quantified and classified list of input and output flows of the LCA model, a model of the product system that links the technological system to the ecosphere (the environmental system), characterized by geographical and spatial information. Because each region has a unique environmental system, the design characteristics and specifications of the technological system should be adapted to these differences; this work argues that the environmental impacts of designed systems depend on this geographical characterization. Eco-design and DfS approaches, however, integrate such geographical information only marginally, since it is absent from design tools. Implementing this approach therefore requires geographical information about the environmental systems involved, which is a new strategy in DfS. We tested the value of integrating Geographical Information Systems (GIS) with PLM to support geographical considerations during product development activities; the main research question of this work is how to propose this PLM-GIS integration for DfS. A literature review of existing data models for the product, the environment and geography, and of their combination, proved key to establishing the link between them; the geographical information to be associated was identified, and illustrative case studies were built to show the impact of this information on product definition.
APA, Harvard, Vancouver, ISO, and other styles
16

Huang, Chengxue, and Hampus Wranér. "Lifecycle management and smart manufacturing: Modelling and implementation to utilize the digital twin." Thesis, KTH, Industriell produktion, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232369.

Full text
Abstract:
Smart manufacturing – smart factories creating smart products – is a topic which has arisen in the academic as well as the business community. This thesis covers smart manufacturing in the context of lifecycle management. It investigated how the standard Product Life Cycle Support (PLCS) could be used to support smart manufacturing, and mainly how to develop the underlying system and information infrastructure. Standards, reports and specifications for smart manufacturing were investigated, and several information models were created from these publications which could be used for implementing a proposed solution for the infrastructure. The implementation concerned a use case in the ongoing research project DigIn, and used the developed models to implement a proposed solution in the product lifecycle management software ShareAspace. This was done in order to evaluate how the functionality of PLCS and ShareAspace could be used to support smart manufacturing and update the digital twin. In parallel to this thesis, a sub-project of DigIn connected the database to other software in the system as well as to the factory shop floor, using the plant service bus Kafka and REST APIs to establish the connection. The functionality of the system was then assessed against the requirements specified in the publications, and the solution was found to meet most of them regarding, among others, lifecycle management, service-oriented architecture, non-hierarchical structures and communication capabilities.
APA, Harvard, Vancouver, ISO, and other styles
17

Hall, Adam J. "Structural and statistical aspects in joint modelling of artesunate pharmacometrics and malarial parasite lifecycle." Thesis, University of Warwick, 2015. http://wrap.warwick.ac.uk/81913/.

Full text
Abstract:
Malaria is a parasite with a complex lifecycle, and commonly used antimalarial agents from the artemisinin family have varied effectiveness over different stages of this lifecycle. The pharmacokinetic profile of the artemisinins is also strongly influenced by the parasite burden and lifecycle stage. This work introduces a new pharmacokinetic and pharmacodynamic model incorporating these interdependent drug and lifecycle features, for orally administered artesunate and its principal metabolite dihydroartemisinin. This model, like the underlying system whose features it attempts to capture, is quite complex and cannot be solved analytically like standard linear first-order compartmental models previously used for pharmacokinetic modelling of these drugs. Therefore, understanding, inference and validity are explored through use of the modern statistical technique of a Sequential Monte Carlo sampler. Structural, numerical and practical identifiability are important concepts for all models, the latter two especially so in this case as the model structure does not admit an algebraic structural identifiability analysis. Motivated by this, the above identifiability concepts are also investigated in connection with the Sequential Monte Carlo technique. Sequential Monte Carlo is demonstrated to be a useful tool for gaining insight into models whose structural identifiability is not known, just as it is also shown to have significant advantages in parameter inference over the classical approach. The coupled parasite lifecycle and artemisinin-derivative model is built in stages, starting with an in vitro submodel capturing the dynamics of uptake of artemisinins into parasitised and non-parasitised red blood cells. Next, the parasite lifecycle, or ‘ageing’ model, is introduced, which uses a new concept of shadow compartments to achieve its aims of describing ageing in continuous time and to exhibit sufficient control over the parasite population. 
Finally, these models are integrated together into the full coupled pharmacokinetic and pharmacodynamic model. More work is needed to fully assess the resultant model on clinical datasets, but the building blocks upon which it was constructed appear to fulfil their aims reasonably well.
APA, Harvard, Vancouver, ISO, and other styles
18

Lai, Yat Yin. "Lifecycle greenhouse gas & water resource inventory modelling for Swedish small and medium enterprises." Thesis, KTH, Hållbar utveckling, miljövetenskap och teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280473.

Full text
Abstract:
Traditionally, environmental work has not been common practice among Swedish small and medium enterprises (SME). This can be attributed to the non-inclusive Swedish sustainability reporting legislation, which primarily targets large corporations, but also to shortages of resources, demand, prioritization and competences in the SME. The study aims to stimulate the SME's interest and confidence in taking the first step towards incorporating sustainability into their business strategies by means of an environmental inventory. Through the development of an easy-to-use inventory model in MS Excel and automatically generated inventory reports, the study provides the SME with a tool to account for lifecycle greenhouse gas (GHG) emissions and water resource utilization. The availability of Snacksbolaget, a wholesaling actor, as a case study company refines the research scope to focus on the wholesale and retail trading sector. Carried out in two phases, the study begins by developing a generic theoretical model to portray typical GHG emissions and water resource utilization in the wholesale and retail trading sector, and then applies the theoretical model to Snacksbolaget as a case study. The application involves system boundary definitions at the company and product levels, data collection, calculations, inventory model building and verification by Snacksbolaget. The usability of the inventory model was verified by Snacksbolaget, and the company was satisfied with the information the overarching inventory reports provided. The inventory results based on Snacksbolaget's data demonstrate that Scope 3, indirect GHG emissions, contributes the majority of the company's overall emissions. 
Within Scope 3, the largest contributions originate from the purchased products' cradle-to-gate stages and their upstream transportation, while sensitivity analysis shows that the energy production mix applied to product manufacturing and the modes of goods transport could play a vital role in GHG reduction. Correspondingly, indirect water use in electricity generation, particularly in the product manufacturing stage, is shown to be the largest water input and output source of the studied products. Recommendations to Snacksbolaget are provided in the report, aimed at minimizing the indirect GHG emissions and the water withdrawn or discharged by the products and services they purchase. To reach a larger SME audience, it is recommended to further develop the current customized spreadsheet model into sector-specific inventory models. The interest in and uptake of the models will likely increase if they are well developed and maintained by expertise independent of the SME and made readily available for the SME to use.
Compared with larger companies, Swedish SME less commonly work with environmental issues, partly because sustainability reporting requirements do not apply to them and partly because they lack resources, time, money and competence, and because customers do not always demand such work. By developing models for calculating and reporting a company's greenhouse gas (GHG) emissions and water use, this thesis intends to facilitate the SME's environmental work and attract more SME to address these environmental challenges. The thesis is divided into two parts. First, a theoretical model is developed for the wholesale and retail trading sector, with boundaries drawn in accordance with the GHG Protocol standard and water accounting principles. The model is then implemented as a tool in MS Excel, with built-in calculation and reporting functions, and verified by applying it to Snacksbolaget, a case study company running a wholesale business in the hotel, restaurant and catering sector. The inventory results show that Scope 3 GHG emissions dominate Snacksbolaget's total emissions profile for 2019, which is fairly typical for the wholesale trade. Within Scope 3, the material extraction and manufacturing phases of the purchased products contribute the most, while the emissions from transport categories such as Scope 1, Scope 3.4 and Scope 3.9 are also significant. The water inventory shows that the product manufacturing phase requires the largest water volume, a relatively large share of which is indirect water used in electricity generation. The inventory results and the sensitivity analysis indicate potential measures the company can implement to influence the size of the Scope 3 emissions and the indirect water use. The company is recommended to reconsider its product choices by evaluating environmental performance across several environmental parameters, to obtain the underlying product information from suppliers so that a correct product lifecycle is accounted for and supplier-claimed environmental benefits are validated, and to choose products whose materials are extracted, and whose manufacturing takes place, in countries or regions that use renewable energy and are not affected by water scarcity. The company can also motivate suppliers to ship goods by green transport alternatives, and encourage lower consumption by offering services such as rental and reuse of products. Further development of the model within the wholesale and retail trading sector will require additional data, time and competence, while extension to other business sectors can be achieved by developing separate models following a similar approach; data quality can be improved over time through regular maintenance and updating. To attract more SME, adapted models should be developed and maintained by external environmental experts and made readily available for SME use.
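A scope-based GHG inventory of the kind the spreadsheet model automates can be sketched as follows. The activity data and emission factors below are illustrative placeholders, not figures from the thesis or from any published factor database.

```python
# Each entry: (description, quantity, unit, kg CO2e per unit, GHG Protocol scope).
# All quantities and factors are hypothetical, for illustration only.
ACTIVITIES = [
    ("diesel for delivery vans", 4_000, "litre", 2.7, 1),
    ("purchased electricity", 30_000, "kWh", 0.09, 2),
    ("purchased goods, cradle-to-gate", 120_000, "kg", 1.8, 3),
    ("upstream transport of goods", 55_000, "tonne-km", 0.11, 3),
]

def inventory(activities):
    """Aggregate emissions (kg CO2e) per GHG Protocol scope."""
    totals = {1: 0.0, 2: 0.0, 3: 0.0}
    for _desc, qty, _unit, factor, scope in activities:
        totals[scope] += qty * factor
    return totals

totals = inventory(ACTIVITIES)
share_scope3 = totals[3] / sum(totals.values())
```

With these invented inputs, Scope 3 dominates the total, mirroring the pattern the inventory found for the case company, where purchased products' cradle-to-gate stages and upstream transport drive the bulk of emissions.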
APA, Harvard, Vancouver, ISO, and other styles
19

Zhang, Cen. "Modelling the Demand Evolution of New Shared Mobility Services." Kyoto University, 2019. http://hdl.handle.net/2433/242485.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Pinel, Muriel. "L'introduction de la gestion du cycle de vie produit dans les entreprises de sous-traitance comme vecteur d'agilité opérationnelle et de maîtrise du produit." PhD thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-00852852.

Full text
Abstract:
To cope with a constantly changing environment, companies must change, sometimes profoundly. These evolutions, in principle intentional and controlled, are carried out through so-called 'enterprise' projects. Among the goals to be achieved through these projects, two major ones can be identified: developing operational agility and mastering the product. In this thesis, we focus on the PLM (Product Lifecycle Management) project and, more particularly, on the implementation of the information system needed to manage the product lifecycle: the PLM system. This information system coordinates all or part of the information related to the definition, production, use and retirement of products. After a state of the art precisely defining the concepts related to product lifecycle management, a method is proposed for defining the requirements specification of the PLM system. The definition of this method shows the need to ensure consistency between the different models of the PLM system on the one hand, and between the different representations of the product used throughout its lifecycle on the other. A modelling framework based on the systemic paradigm, the ambivalence paradigm and metamodelling concepts is then proposed. This framework addresses the identified consistency needs, and also supports the adoption of the synergistic reasoning essential to developing the operational agility sought by the company. An experiment is carried out to illustrate the concepts introduced in our modelling framework.
APA, Harvard, Vancouver, ISO, and other styles
21

Parsanezhad, Pouriya. "A Lifecycle Approach towards Building Information Management : Technical and procedural implications for the facility management and operations sector." Licentiate thesis, KTH, Projektkommunikation, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-164954.

Full text
Abstract:
A well-structured and coordinated information management practice is central to promoting efficiency in construction. Building information management encompasses the authoring, interpretation, communication, coordination and storage of building information. The benefits envisioned from utilizing IT developments such as Building Information Modelling (BIM) in the facility management and operations (FM&O) sector are estimated to be far greater than in other sectors. There is, however, a gap between the knowledge available in the field of building information management and the actual demands of the architectural, engineering, construction and operation (AECO) industry, especially the FM&O sector. The overall aim of this qualitative research is to develop knowledge that conceptualizes the lifecycle-supporting implementation of BIM in the AECO industry, with a focus on its implications for BIM-enabled FM&O practice. This applied research comprises a number of summative and formative components: paper 1 investigates existing and emerging information management systems for the FM&O sector and their characteristics. The focus of paper 2 is narrowed down to the technical requirements on building information management systems, while its temporal scope spans the entire lifecycle of buildings. Paper 3 further elaborates on the findings of paper 1 and covers the technical requirements of BIM implementation in the FM&O sector. Paper 4 investigates workflows, another category of the issues identified in paper 1. Paper 1 aims to provide a general understanding of the importance and implications of implementing BIM-enabled systems in the FM&O sector and also identifies the main categories of issues associated with this approach. This literature-based paper reports on basic research with a descriptive approach and builds upon information from a non-exhaustive set of literature.
In this paper, workflows, contracts and information technology are identified as three categories of issues associated with implementing BIM-enabled systems in the FM&O sector. Paper 2 is also a literature-based study which draws on the notion of BIM repositories and aims to clarify the technical requirements for a more collaborative building industry, as well as to depict the current status of building knowledge management technologies, recent trends and future prospects. Open-format BIM repositories are suggested as the cornerstones of an integrated information management system for AECO firms. The aim of paper 3 is twofold: firstly, to summarize the current status of the building information management technologies applied in facility operation activities and identify prevailing issues; secondly, to devise technical solutions for those issues based on a case project. In the first part of that study, a summarized description of the information management configurations in eleven projects was extracted from the literature and the technical issues within those systems were identified. Moreover, five major categories of contemporary technical solutions for enhancing information transfer from BIM to FM&O software were designated. Then, a narrative and illustrative representation and reconstruction of an IT-implementation project was developed. Paper 4 is another literature-based study which aims to provide the theoretical basis for more focused studies on existing and desired processes in the FM&O sector and their associated information transactions. In this paper, the more common definitions of the key concepts are first revisited and discussed; then the generic types of processes, activities and organizational roles common to FM&O firms, the types of information required by each actor, and how such information is acquired are presented.

QC 20150423

APA, Harvard, Vancouver, ISO, and other styles
22

Nicquevert, Bertrand. "Manager l'interface. Approche par la complexité du processus collaboratif de conception, d'intégration et de réalisation : modèle transactionnel de l'acteur d'interface et dynamique des espaces d'échanges." PhD thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00789791.

Full text
Abstract:
In large projects such as particle accelerators or detectors, interfaces and boundaries prove to be both critical and underestimated. The technical manager, one actor among others, is placed at network nodes where he must set up exchange spaces in order to foster collaborative behavior. Based on case studies drawn from CERN, the thesis adopts three principles from the complexity literature: the dialogic, hologrammatic and self-eco-organization principles. It proposes an original matrix-based methodological construction leading to a transactional model of the interface actor. The collaborative exchange space becomes the place where the dynamic transforming the interface actor into a boundary actor unfolds. The intermediary objects produced during the design/integration process are simultaneously transformed there into boundary objects, which are mobilized for the realization of the product within the recursively determined framework of the project. The value of a global, coupled approach to this dynamic of exchange spaces leads to proposing a "hypercompass" to orient the technical manager's "acting ↔ thinking".
APA, Harvard, Vancouver, ISO, and other styles
23

Tchana, De Tchana Yvan. "Proposition d’un jumeau numérique pour soutenir la gestion de l'exploitation d'une infrastructure linéaire." Thesis, Troyes, 2021. http://www.theses.fr/2021TROY0012.

Full text
Abstract:
The digital growth of the construction industry has led to BIM (Building Information Modeling). Developed for buildings, BIM was later applied to linear infrastructure projects. Such projects require end-to-end control of the information chain. PLM (Product Lifecycle Management) supports this kind of digital continuity in the manufacturing industry. Studies have evaluated the relevance of a complementary use of the BIM and PLM approaches for linear infrastructure projects. Adapting methods developed for building construction, those studies are mostly restricted to the implementation of data repositories. This makes it difficult to manage the infrastructure in its post-construction phase, where the 3D model is no longer a digital model but a digital twin. This research work develops a strategy for the design, implementation, and operation and maintenance of a linear infrastructure. The digital twin of the infrastructure is the target of our approach: it integrates not only the BIM and PLM methodologies but also any other data source that situates the infrastructure in its geographical environment. As a data aggregator, the digital twin should make it possible to manage the lifecycle of a linear infrastructure. The system is tested on a specific linear infrastructure, a level crossing. Digital continuity and data traceability are important factors for such structures. Our proposal makes it possible to follow the evolution of the data, and thus to link operational data to the design and construction data of the infrastructure.
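The aggregator idea described in this abstract can be illustrated with a toy sketch: one twin record per asset, keyed by a stable identifier, links design-time (BIM/PLM) records with time-stamped operational events, so that usage data stays traceable to design data. All class and field names below are invented for illustration and do not come from the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """One record per physical asset; names here are illustrative only."""
    asset_id: str
    design: dict = field(default_factory=dict)      # e.g. BIM/PLM references
    operations: list = field(default_factory=list)  # time-stamped events

class TwinRegistry:
    def __init__(self):
        self._twins = {}

    def record_design(self, asset_id, **attrs):
        twin = self._twins.setdefault(asset_id, AssetTwin(asset_id))
        twin.design.update(attrs)

    def record_event(self, asset_id, event):
        twin = self._twins.setdefault(asset_id, AssetTwin(asset_id))
        twin.operations.append(event)

    def trace(self, asset_id):
        """Return operational events together with their design context."""
        twin = self._twins[asset_id]
        return twin.design, twin.operations
```

A level-crossing twin, for instance, could link a barrier-closure event back to the design record of the barrier it concerns.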
APA, Harvard, Vancouver, ISO, and other styles
24

LIU, CHUN-HSIN, and 劉駿馨. "Rapid Modeling of Software Lifecycle Management." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/66881851698389766724.

Full text
Abstract:
Master's thesis
Tunghai University
Department of Computer Science and Information Engineering
Academic year 96 (2007-2008)
In the domain of software engineering, software lifecycle management is an important issue to study. However, it is not easy for a software development team, especially a small or medium-sized one, to build an adequate lifecycle management solution. Nor is it trivial to implement such a solution simply by adopting available commercial or open-source management tools. Therefore, in this thesis we propose a rapid software lifecycle management modeling and implementation method called RALM (Rapid Application Lifecycle management Modeling) to help software development teams implement their required management solutions as easily as possible. Based on templates and the ALM-XML standard format, a software team can rapidly define its lifecycle management solution. To demonstrate the benefits of RALM, we further implement a tool called ALM2VSTS that automatically transforms the solution definition into a Microsoft® Visual Studio 2005 Team System (VSTS) process template, configuration guidance, and an engineering practice portal.
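As a rough illustration of the template-based definition idea, the sketch below parses a hypothetical XML lifecycle definition into phases and artifacts. The element names are invented for illustration and are not the actual ALM-XML schema from the thesis.

```python
import xml.etree.ElementTree as ET

# Hypothetical lifecycle definition; this is NOT the real ALM-XML format,
# only a stand-in showing how a declarative definition can drive tooling.
ALM_XML = """
<lifecycle name="small-team-process">
  <phase name="requirements"><artifact>backlog</artifact></phase>
  <phase name="implementation"><artifact>source</artifact></phase>
  <phase name="verification"><artifact>test-report</artifact></phase>
</lifecycle>
"""

def load_phases(xml_text):
    """Return the ordered (phase name, artifact list) pairs of a definition."""
    root = ET.fromstring(xml_text)
    return [(p.get("name"), [a.text for a in p.findall("artifact")])
            for p in root.findall("phase")]
```

A transformer in the spirit of ALM2VSTS would then walk this structure and emit a process template for the target tool.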
APA, Harvard, Vancouver, ISO, and other styles
25

Khosroshahy, Masood. "Analytical Lifecycle Modeling and Threat Analysis of Botnets." Thesis, 2013. http://spectrum.library.concordia.ca/976958/1/Khosroshahy_PhD_S2013.pdf.

Full text
Abstract:
A botnet, an overlay network of compromised computers built by cybercriminals known as botmasters, is a new phenomenon that has caused deep concern among the security professionals responsible for governmental, academic, and private sector networks. Botmasters use a plethora of methods to infect network-accessible devices (nodes). The initial malware residing on these nodes then either connects to a central Command & Control (C&C) server or joins a Peer-to-Peer (P2P) botnet. At this point, the nodes can receive the commands of the botmaster and proceed to engage in illicit activities such as Distributed Denial-of-Service (DDoS) attacks and massive e-mail spam campaigns. Being able to reliably estimate the size of a botnet is an important task which allows the adequate deployment of mitigation strategies against it. In this thesis, we develop analytical models that capture botnet expansion and size evolution in sufficient detail to accomplish this crucial estimation and analysis task. We develop four Continuous-Time Markov Chain (CTMC) botnet models: the first two, SComI and SComF, allow the prediction of initial unhindered botnet expansion in the case of infinite and finite population sizes, respectively. The third, the SIC model, is a botnet lifecycle model which accounts for all important node stages and allows botnet size estimates as well as evaluation of botnet mitigation strategies such as disinfection of nodes and attacks on the botnet's C&C mechanism. Finally, the fourth, the SIC-P2P model, extends the SIC model to P2P botnets, allowing fine-grained analysis of mitigation strategies such as index poisoning and sybil attacks. As the convergence of Internet and traditional telecommunication services is underway, the threat of botnets looms over essential basic communication services.
As the last contribution presented in this thesis, we analyze the threat of botnets in 4G cellular wireless networks. We identify a vulnerability of the air interface, i.e. Long Term Evolution (LTE), that allows a successful botnet-launched DDoS attack against it. Through simulation using an LTE simulator, we determine the number of botnet nodes per cell that can significantly degrade the service availability of such cellular networks.
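Lifecycle models of the kind described above are continuous-time Markov chains. The sketch below is a minimal Gillespie-style simulation of a hypothetical three-compartment model (susceptible, infected, C&C-connected); the states, transitions and rates are illustrative assumptions, not the thesis's actual SComI/SComF/SIC parameterizations.

```python
import random

def simulate_sic(n=1000, beta=0.3, gamma=0.2, delta=0.05, t_max=100.0, seed=42):
    """Gillespie-style run of a toy S -> I -> C botnet model.

    S: susceptible nodes, I: infected (malware resident, not yet connected),
    C: connected to the C&C channel. All rates are illustrative only.
    """
    rng = random.Random(seed)
    s, i, c = n - 1, 1, 0
    t = 0.0
    while t < t_max and (i + c) > 0:
        r_infect = beta * s * (i + c) / n   # S -> I via contact with active bots
        r_connect = gamma * i               # I -> C (node joins the C&C channel)
        r_disinfect = delta * c             # C -> S (node cleaned by mitigation)
        total = r_infect + r_connect + r_disinfect
        if total == 0:
            break
        t += rng.expovariate(total)         # time to the next transition
        u = rng.random() * total            # pick which transition fires
        if u < r_infect:
            s, i = s - 1, i + 1
        elif u < r_infect + r_connect:
            i, c = i - 1, c + 1
        else:
            c, s = c - 1, s + 1
    return s, i, c
```

Averaging many such runs approximates the mean botnet size trajectory that an analytical CTMC solution would give directly.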
APA, Harvard, Vancouver, ISO, and other styles
26

Mao, I.-shiang, and 毛一祥. "Lifecycle Assessment of Maintenance, Repair and Rehabilitation Costs: A Reliability Based Deterioration Modeling Approach for Bridge Components." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/73995106422298931773.

Full text
Abstract:
Doctoral dissertation
National Central University
Graduate Institute of Construction Engineering and Management
Academic year 103 (2014-2015)
Bridge MR&R (maintenance, repair and rehabilitation) costs can be estimated more objectively if adequate historical data are collected and well processed, and a proper deterioration model is adopted. This study first demonstrates a systematic approach to exploring the key factors leading to bridge deterioration. Representative samples of bridges with similar deterioration behavior are then classified by the identified factors. A new condition index (NCI) is defined and the reliability index (beta) is introduced to measure deterioration quantitatively. The deterioration trend of the representative samples can then be determined. Finally, the maintenance cost of a bridge with the same deterioration behavior can be estimated. This study achieved the following objectives: 1. identifying the key factors leading to deterioration of bridge elements; 2. developing a bridge matching system to retrieve bridge samples by their attributes; 3. developing a reliability-based deterioration model of bridge elements; 4. demonstrating a systematic approach to estimating the MR&R cost of bridges.
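As a rough illustration of the reliability-based idea (not the thesis's NCI model), the sketch below uses the standard mapping between a failure probability p_f and the reliability index beta = -Phi^{-1}(p_f): as estimated failure probability rises over time, beta falls, which expresses a deterioration trend.

```python
from statistics import NormalDist

def reliability_index(failure_probability: float) -> float:
    """Reliability index beta corresponding to a failure probability p_f."""
    return -NormalDist().inv_cdf(failure_probability)

def deterioration_trend(failure_probabilities):
    """Convert a time series of estimated failure probabilities into betas.

    A monotonically decreasing beta sequence indicates progressive
    deterioration of the component.
    """
    return [reliability_index(p) for p in failure_probabilities]
```

For example, inspection-derived failure probabilities of 0.001, 0.01 and 0.1 over three periods map to decreasing beta values of roughly 3.1, 2.3 and 1.3.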
APA, Harvard, Vancouver, ISO, and other styles
27

Bernardino, Ricardo Jorge Rocha. "State-Based Programming in PQL." Master's thesis, 2014. http://hdl.handle.net/10316/35712.

Full text
Abstract:
Dissertação de Mestrado em Engenharia Informática apresentada à Faculdade de Ciências e Tecnologia da Universidade de Coimbra
Complex Event Processing (CEP) has become increasingly popular within organizations. The financial industry used to be the sole beneficiary of CEP capabilities; nowadays, however, it is increasingly being adopted by non-financial companies, especially for Business Activity Monitoring (BAM) software. One of the most distinctive features of CEP is pattern matching. In this field, pattern matching should not be mistaken for pattern matching in character strings, nor for pattern matching in the functional programming paradigm. Rather, it specifies temporal relationships between events, such as sequences and restrictions on event occurrences, to name a few. On the other hand, one feature which has yet to be widely adopted among CEP engines is entity state and lifecycle modeling, i.e., the ability to correlate events into instances of an entity, where each entity has different states and transitions representing its lifecycle. The objective of this work is to investigate and implement this type of functionality in Feedzai Pulse's query language - PQL.
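The entity state and lifecycle modeling described here can be illustrated with a toy sketch: events carrying an entity key are correlated into per-entity instances, each advancing through a small state machine. The states, event types and transition table below are invented for illustration; PQL's actual syntax and semantics are not shown.

```python
# Illustrative transition table: (current state, event type) -> next state.
TRANSITIONS = {
    ("created", "approve"): "active",
    ("active", "suspend"): "suspended",
    ("suspended", "approve"): "active",
    ("active", "close"): "closed",
}

def correlate(events):
    """Fold a stream of (entity_id, event_type) pairs into per-entity states."""
    states = {}
    for entity_id, event_type in events:
        current = states.get(entity_id, "created")
        # Events with no matching transition leave the entity unchanged.
        states[entity_id] = TRANSITIONS.get((current, event_type), current)
    return states
```

A CEP engine with lifecycle support does essentially this correlation continuously over an unbounded event stream, rather than over a finished list.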
APA, Harvard, Vancouver, ISO, and other styles
28

Venugopal, Vishnu. "Modelling How Refractoriness to Interferon Compromises Interferon-Free Treatment of Hepatitis C Virus Infection." Thesis, 2017. http://etd.iisc.ernet.in/2005/3612.

Full text
Abstract:
Hepatitis C virus (HCV) infection affects 130-150 million people globally, causing both acute and chronic infections. Due to the severe side effects and low success rates of interferon (IFN) based treatments, which formed the standard of care for HCV, the treatment paradigm shifted to direct-acting antivirals (DAAs). DAAs have revolutionized the treatment of HCV infection. Clinical trials with combinations of DAAs have recorded >90% response with shorter treatment durations and fewer side effects than earlier IFN-based treatments. Outside the controlled setting of a clinical trial, however, response rates with DAA combinations are much lower (<70%). DAAs can fail if HCV accumulates mutations that confer drug resistance. Interestingly, the pre-existing frequency of mutants in the virus appears not to influence treatment outcome; a better predictor of DAA treatment outcome has yet to be identified. Surprisingly, individuals who respond poorly to IFN appear more likely to fail DAA treatment. IFN is a generic antiviral that boosts immune responses and would be expected to have no bearing on DAA treatment outcomes, so why individuals with poor IFN sensitivity fail DAA treatment remains a mystery. In a recent study of the IFN signalling network, HCV has been shown to compromise IFN activity: it induces bistability in the network, leading to distinct phenotypic responses of cells to IFN exposure. In particular, individuals who respond poorly to IFN tend to have a higher percentage of cells that are refractory to IFN; these cells allow viral persistence despite IFN exposure. We hypothesized that in such individuals, greater ongoing replication would allow increased development of resistance and thus lead to the failure of DAAs. We constructed a model of viral dynamics that accounts for the distinct phenotypic responses of cells to IFN, viral replication and mutation, and the development of resistance to DAAs.
Our model predicted that although the relative prevalence of pre-existing mutants is unaffected by IFN sensitivity, in agreement with observations, the growth of drug-resistant mutants is accelerated in individuals with poor IFN sensitivity. Based on a distribution of IFN sensitivity across individuals, our model accurately described clinical observations of the response rates to different current treatment protocols. With this model, we predict that the common strategy of increasing the genetic barrier by adding more drugs to the combination is not necessary to avert the development of drug resistance. Instead, an optimized increase in the dosage of DAAs alone, or of DAA+PR or PR depending on the patient's IFN sensitivity, could help achieve success.
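As a rough illustration of the kind of reasoning involved, the sketch below integrates a deliberately simplified two-strain model in which the fraction of IFN-refractory cells scales net viral replication, so a resistant strain grows faster when IFN sensitivity is poor. All equations, parameter values and names here are illustrative assumptions, not the thesis's actual model.

```python
def simulate(phi=0.5, r=1.0, c=0.5, eps=0.99, eps_r=0.2, mu=1e-4,
             v0=1000.0, dt=0.01, t_max=50.0):
    """Euler integration of a toy two-strain model.

    phi   : fraction of IFN-refractory cells (scales replication; assumption)
    eps   : DAA efficacy against wild-type virus
    eps_r : (lower) DAA efficacy against the resistant strain
    mu    : mutation rate seeding the resistant strain from wild-type growth
    c     : clearance rate of free virus
    """
    vw, vr = v0, 0.0  # wild-type and resistant viral loads
    t = 0.0
    while t < t_max:
        growth_w = phi * r * (1 - eps) * vw
        growth_r = phi * r * (1 - eps_r) * vr + mu * growth_w
        vw += dt * (growth_w - c * vw)
        vr += dt * (growth_r - c * vr)
        t += dt
    return vw, vr
```

With these illustrative parameters, a high refractory fraction (phi = 0.9, poor IFN sensitivity) lets the resistant strain outgrow clearance, while a low fraction (phi = 0.3) does not, mirroring the qualitative prediction of accelerated resistance in poor IFN responders.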
APA, Harvard, Vancouver, ISO, and other styles