Dissertations / Theses on the topic 'Cycle de vie de la donnée'
Consult the top 50 dissertations / theses for your research on the topic 'Cycle de vie de la donnée.'
Delalandre, Léo. "Relations traits-environnement chez les végétaux : du cycle de vie des organismes au cycle de vie des données." Electronic Thesis or Diss., Université de Montpellier (2022-....), 2024. http://www.theses.fr/2024UMONG001.
Comparative ecology has highlighted recurring associations between plant functional traits and their environment. These relationships may vary depending on the level of organization considered – within species, between species, and among groups of species – but this dependency remains poorly studied. A fundamental distinction in life history theories is made between annual species (completing their life cycle in one year) and perennial species (life cycle over more than one year, usually with multiple reproductive events). Annual and perennial herbaceous plants differ in their functioning (growth rate, investment in seed production, allocation to roots, etc.). However, despite their frequent coexistence, few studies have considered potential differences in trait-environment relationships between these two groups. The objective of this thesis is to understand the specific variations in the traits of annual plants depending on resource availability, based on in situ measurements and in a common garden setting. We studied herbaceous communities in the Grands Causses, where annuals and perennials coexist in two contrasting environmental conditions: i) fertilization and high disturbance, and ii) poor soil and less intense disturbance. We show that variations in traits related to growth rate and leaf tissue density are lower in annuals than in perennials. This is explained by (a) a higher species turnover in perennials, and (b) the presence of species with larger differences in trait values between environments in perennials. Intraspecific variations are identical between the two groups of species. Measurements made during this first part were used to complete a trait database under development.
On this occasion, I contributed to the structuring of this database through data management work, aiming to propose modalities for sharing functional trait data and associated environmental variables; a synthesis of this work is proposed. Secondly, we analyzed intraspecific variability in annuals from these communities, in order to test its origin (genetic or plastic), to identify the most variable traits in response to fertilization, and to compare this variability between species. Thirty populations were grown in a common garden, with low or high fertilization. The results indicate that i) the observed trait variations in situ are likely of plastic origin; ii) plasticity is low in morphological leaf and root traits but high in biomass allocation and nitrogen content; iii) species preferring nutrient-rich environments are more plastic in their nitrogen content. Finally, a literature review was undertaken to determine which traits are determinant for annual and perennial herbaceous plants, reasoning on demographic components (reproduction, growth, survival), the importance of which differs according to the life cycle. We propose an opinion article aiming to better integrate life cycle and commonly measured morpho-physio-phenological traits. This thesis proposes a study of the relationships between functional traits and the environment at different levels of organization: between life cycles, between species, and within species. It highlights that trait-environment relationships can vary between these levels, in line with a renewed interest in context dependency in comparative ecology.
Bertin, Jean-Marie. "Modélisation sémantique des bases de données d'inventaires en cycle de vie." Phd thesis, INSA de Lyon, 2013. http://tel.archives-ouvertes.fr/tel-00876636.
Bertin, Benjamin. "Modélisation sémantique des bases de données d'inventaires en cycle de vie." Thesis, Lyon, INSA, 2013. http://www.theses.fr/2013ISAL0049/document.
Environmental impact assessment of goods and services is nowadays a major challenge for both economic and ethical reasons. Life Cycle Assessment (LCA) provides a well-accepted methodology for modeling the environmental impacts of human activities. This methodology relies on the decomposition of a studied system into interdependent processes in a step called Life Cycle Inventory. Every process has several environmental impacts, and the composition of those processes provides the cumulated environmental impact of the studied human activities. Several organizations provide process databases containing thousands of processes and their interdependency links, which LCA practitioners use to carry out an LCA study. Understanding and auditing those databases requires analyzing a huge number of processes and their dependency relations, since the databases can contain thousands of processes linked together. We identified two problems that experts face when using those databases: organizing the processes and their dependency relations to improve comprehensibility; and calculating the impacts or, when that is not possible, finding out why the calculation is infeasible. In this thesis, we: - show that there are semantic similarities between the processes and their dependency relations, and propose a new way to model the dependency relations in an inventory database. In our approach, we semantically index the processes using an ontology and we use a multi-layer model of the dependency relations. We also study a declarative version of this multi-layer approach; - propose a method to calculate the environmental impacts of the processes based on linear algebra and graph theory, and study the conditions under which this calculation is feasible when the model is cyclic. We developed a prototype based on this approach that showed convincing results on different use cases.
We tested our prototype on a case study based on a data set extracted from the National Renewable Energy Laboratory (NREL) database, restricted to electricity production in the United States.
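For readers unfamiliar with matrix-based LCI computation, the impact calculation this abstract alludes to is, in standard LCA practice, the solution of the technology-matrix system followed by application of the intervention matrix. A minimal sketch with invented figures (the thesis's own multi-layer and graph-theoretic formulation is not reproduced here):

```python
import numpy as np

# Toy life-cycle inventory: 2 interdependent processes, 2 elementary flows.
# A: technology matrix (process inputs/outputs), f: final demand,
# B: intervention matrix (emissions per unit of process output).
A = np.array([[1.0, -0.2],    # process 1 output, partly consumed by process 2
              [-0.5, 1.0]])   # process 2 output, partly consumed by process 1
B = np.array([[0.3, 0.1],     # kg CO2 per unit of each process (invented)
              [0.02, 0.4]])   # kg SO2 per unit of each process (invented)
f = np.array([1.0, 0.0])      # functional unit: 1 unit of process 1

# Scaling vector: how much each process must run. Feasibility requires A to
# be invertible, which is where cyclic dependencies can cause trouble.
s = np.linalg.solve(A, f)
g = B @ s                     # cumulated elementary flows (the inventory)
print(s, g)
```

The cyclic dependency between the two processes is handled naturally by the matrix inversion; the abstract's feasibility question corresponds to cases where this system has no (or no meaningful) solution.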
Beloin-Saint-Pierre, Didier. "Vers une caractérisation spatiotemporelle pour l'analyse du cycle de vie." Phd thesis, Ecole Nationale Supérieure des Mines de Paris, 2012. http://pastel.archives-ouvertes.fr/pastel-00857936.
Khouri, Selma. "Cycle de vie sémantique de conception de systèmes de stockage et manipulation de données." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2013. http://www.theses.fr/2013ESMA0016/document.
Data Warehouses (DWs) have become essential components for companies and organizations. The DW design field has been actively researched in recent years. The main limitation of the proposed approaches is the lack of an overall vision covering the DW design cycle. Our main contribution in this thesis is to propose a method adapted to recent evolutions of the DW design cycle and covering all its phases. These evolutions have given rise to new data storage models and new deployment architectures, which offer different design choices to designers and administrators. The DW literature recognizes the importance of user requirements in the design process, and the importance of accessing and representing data semantics. We propose an ontology-driven design method that valorizes users' requirements by giving them a persistent view in the DW structure. This view allows anticipating diverse design tasks and simulating different design choices. Our second proposal revisits the design cycle by executing the ETL phase (extraction, transformation and loading of data) at the conceptual stage. This proposal allows an à la carte deployment of the DW using the different deployment platforms available.
Khouri, Selma. "Cycle de vie sémantique de conception de systèmes de stockage et manipulation de données." Phd thesis, ISAE-ENSMA Ecole Nationale Supérieure de Mécanique et d'Aérotechnique - Poitiers, 2013. http://tel.archives-ouvertes.fr/tel-00926657.
Lasvaux, Sébastien. "Étude d'un modèle simplifié pour l'analyse de cycle de vie des bâtiments." Phd thesis, Paris, ENMP, 2010. https://pastel.hal.science/pastel-00712043.
Energy and environmental aspects are more and more integrated into the design process of buildings. Life Cycle Assessment (LCA) is generally used to assess the environmental performance of buildings. To date, this method requires a large amount of data, which can limit its application. For instance, it can be difficult for some manufacturers to provide life cycle inventories (LCI) gathering several hundred flows. In addition, interpreting the results through about ten environmental indicators can be complex for building practitioners. In this context, the aim of this research is to study a simplified model for the LCA of buildings. A simplified life cycle inventory (LCI) database gathering building materials, products and processes is first developed. It is composed of data from the Ecoinvent and INIES databases, harmonized through a homogeneous nomenclature. Statistical methods are then used to assess the relevance of simplifying the LCA model. Simplified life cycle impact assessment (LCIA) models are built between the LCI flows and the LCIA indicators. Prior to the applications using the database, they make it possible to identify the flows that contribute most to the environmental impact of a building. Understanding the consequences of simplifying the LCA model, together with the statistical methods used in this work, makes it possible to better assess the reliability of simplified LCA applied to building products and to buildings as a whole.
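One simple way to picture the kind of simplification described above is contribution analysis: rank the LCI flows by their share of an LCIA indicator and keep only the few that dominate it. A toy sketch, in which the flow names, amounts and characterization factors are all invented and the thesis's actual statistical treatment is far more elaborate:

```python
# Hypothetical flows (kg) and characterization factors for one indicator.
flows = {"CO2": 120.0, "CH4": 0.8, "SO2": 0.05, "NOx": 0.3, "CO": 0.01}
cf = {"CO2": 1.0, "CH4": 28.0, "SO2": 0.0, "NOx": 0.0, "CO": 1.6}

contributions = {k: flows[k] * cf[k] for k in flows}
total = sum(contributions.values())

# Keep the smallest set of flows covering 99 % of the indicator; the
# simplified model would then only track these flows.
kept, acc = [], 0.0
for name, c in sorted(contributions.items(), key=lambda kv: -kv[1]):
    if acc >= 0.99 * total:
        break
    kept.append(name)
    acc += c
print(kept)
```

With these made-up numbers, two flows out of five suffice to explain over 99 % of the indicator, which is the kind of reduction that makes simplified LCI data collection tractable for manufacturers.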
Lasvaux, Sébastien. "Étude d'un modèle simplifié pour l'analyse de cycle de vie des bâtiments." Phd thesis, École Nationale Supérieure des Mines de Paris, 2010. http://pastel.archives-ouvertes.fr/pastel-00712043.
Kibamba, Yannick Privat. "Spécification et développement d'un environnement collaboratif de gestion du cycle de vie des données de simulation numérique." Compiègne, 2011. http://www.theses.fr/2011COMP1997.
The proposed research work deals with the issues of Simulation Lifecycle Management (SLM). Nowadays numerical simulation plays a major role in the product development process. Indeed, by reducing the need for physical prototypes and providing a relevant analysis of system behavior, numerical simulation has become a major lever for improving the development process. Faced with increased competition, manufacturing companies rely heavily on numerical simulation to improve the technical performance of their products. With this increasing use of numerical simulation, issues related to data management and information sharing between simulation disciplines, and with other phases of the development process, have emerged. This PhD thesis presents a study of the improvement of simulation activities based on an application of the PLM approach. This study suggests two main areas of improvement. The first one concerns the definition of the product structure for a better integration of the needs of numerical simulation, specifically in relation to the definition of fluid and structure domains and their interactions. The second area is related to the management of dependencies between simulation data, for better traceability and easier capitalization. This research work finally resulted in the implementation of an SLM prototype based on the SmarTeam solution of Dassault Systèmes, coupled with two expert applications: CATIA, a CAD solution from Dassault Systèmes, and Workbench, a pre/post-processing solution from Ansys.
Royer, Kevin. "Vers un entrepôt de données et des processus : le cas de la mobilité électrique chez EDF." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2015. http://www.theses.fr/2015ESMA0001/document.
Nowadays, the electric vehicle (EV) market is undergoing a rapid expansion and has become of great importance for utility companies such as EDF. In order to fulfill its objectives (demand optimization, pricing, etc.), EDF has to extract and analyze heterogeneous data from EVs and charging spots. To tackle this, we used data warehousing (DW) technology as a basis for business processes (BP). To avoid the garbage in/garbage out phenomenon, data had to be formatted and standardized. We chose to rely on an ontology in order to deal with the heterogeneity of the data sources. Because the construction of an ontology can be a slow process, we proposed a modular and incremental construction of the ontology based on bricks. We based our DW on the ontology, which makes its construction an incremental process as well. To upload data to this particular DW, we defined the ETL (Extract, Transform & Load) process at the semantic level. We then designed recurrent BPs with BPMN (Business Process Model and Notation) specifications to extract the knowledge EDF requires. The assembled DW holds data and BPs that are both described in a semantic context. We implemented our solution on the OntoDB platform, developed at the ISAE-ENSMA Laboratory of Computer Science and Automatic Control for Systems. The solution has allowed us to homogeneously manipulate the ontology, the data and the BPs through the OntoQL language. Furthermore, we added to the proposed platform the capacity to automatically execute any BP described with BPMN. Ultimately, we were able to provide EDF with a tailor-made platform based on declarative elements adapted to their needs.
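The role the ontology plays in such a semantic ETL step can be pictured as mapping heterogeneous source labels onto a shared vocabulary before loading. A deliberately simplified sketch; the class names, aliases and records below are invented stand-ins, and OntoDB/OntoQL operate at a much richer level than a lookup table:

```python
# Toy "semantic ETL": heterogeneous charging-spot records are normalized
# to a shared vocabulary (a stand-in for the ontology) before loading.
ONTOLOGY = {  # hypothetical ontology class -> accepted source labels
    "Energy_kWh": ["energy", "kwh", "consommation"],
    "SpotId": ["spot", "station_id", "borne"],
}

def to_canonical(record):
    """Map one raw record's keys onto the canonical ontology classes."""
    out = {}
    for cls, aliases in ONTOLOGY.items():
        for key, value in record.items():
            if key.lower() in aliases:
                out[cls] = value
    return out

rows = [{"Energy": 7.2, "station_id": "A12"},
        {"kWh": 3.5, "borne": "B07"}]
canonical = [to_canonical(r) for r in rows]
print(canonical)
```

Once every source speaks the ontology's vocabulary, the warehouse load and the business processes downstream can be written once, against the canonical classes only.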
Benkrid, Soumia. "Le déploiement, une phase à part entière dans le cycle de vie des entrepôts de données : application aux plateformes parallèles." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2014. http://www.theses.fr/2014ESMA0027/document.
Designing a parallel data warehouse consists of choosing the hardware architecture, fragmenting the data warehouse schema, allocating the generated fragments, replicating fragments to ensure high system performance, and defining the processing and load-balancing strategy. The major drawback of this design cycle is its ignorance of the interdependence between the subproblems of parallel data warehouse (PDW) design, and the use of heterogeneous metrics to achieve the same goal. Our first proposal defines an analytical cost model for the parallel processing of OLAP queries in a cluster environment. Our second proposal takes into account the interdependence between fragmentation and allocation. In this context, we proposed a new approach to design a PDW on a cluster machine. During the fragmentation process, our approach determines whether the generated fragmentation pattern is relevant to the allocation process or not. The results are very encouraging, and validation is done on Teradata. For our third proposition, we presented a design method which extends this work: an original replication method, based on fuzzy logic, is integrated.
Pichon, Noémie. "Méthode de génération de données d’inventaire du génie des procédés textiles : contribution à l’écoconception des vêtements." Electronic Thesis or Diss., Centrale Lille Institut, 2023. http://www.theses.fr/2023CLIL0039.
The fashion and textile industry is a complex, highly fragmented, and globalized value chain, requiring a wide range of professions with specific expertise, and with a highly heterogeneous level of knowledge regarding the sector's environmental burdens. Given that climate and environmental issues have never been so high on the agenda, scientific literature has been growing in recent years to assess the environmental and human health impacts of this sector, which has been identified as the fourth most polluting industry in Europe, all impact categories combined. The eco-design of products is today a central approach to achieve the sector's impact reduction targets; the challenge is to extend its use to as many players as possible. The main aim of this research was to develop a method for generating textile Life Cycle Inventory (LCI) data, in order to promote eco-design and continuous improvement in the production stage of a garment's life cycle. The research work was carried out at the finest scale of textile process engineering, i.e. at the unit process scale. An illustration of this method for a specific transformation stage in textile engineering, from fiber to yarn (also known as spinning), was therefore carried out, including the calculation of uncertainties. Finally, the analysis of the contributions to the results highlighted eco-design levers.
Bertrand, Christian. "Ateliers de génie logiciel : études, modèles de bases de données, contribution du modèle entité-association au cycle de vie du logiciel." Mulhouse, 1989. http://www.theses.fr/1989MULH0115.
Le, Van-Thao. "Proposition d'une stratégie soutenable pour donner une nouvelle vie à une pièce en s’appuyant sur les techniques de fabrication additive." Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAI038/document.
Currently, materials collected from end-of-life (EoL) products are recycled into raw material for reuse in a new production cycle. However, the energy consumption of recycling sectors remains significant, and the added value and energy used in the manufacture of the original parts are lost during the material recycling process. Nowadays, additive manufacturing techniques are sufficiently efficient and allow the manufacture of products with a material compatible with the intended use. Taking the performance of these techniques into account in a sustainable strategy can open the way to modifying parts and reusing them directly, without returning to the raw material level. This thesis aims to develop a sustainable strategy which allows giving a new life to an EoL part (or an existing part) by transforming it directly into a new part intended for another product. In order to develop such a strategy, the work of the thesis addresses the following scientific issues. The first issue is the technological feasibility: is it possible to deposit material on an existing part using additive manufacturing technologies and obtain a new part with good material health? This question is addressed through an experimental study of the microstructures and mechanical properties of samples manufactured by adding new features to an existing part using EBM. The second issue concerns the complete manufacturing chain from a technological point of view: how should process planning be designed for a combination of additive and subtractive manufacturing, in order to produce the expected part from the existing part?
To answer this question, a methodology for designing the process planning that combines these manufacturing processes has been proposed, based on the concept of additive manufacturing and machining features. The third scientific issue is linked to sustainability: does the new strategy have advantages over the conventional strategy in terms of sustainability? An approach based on the Life Cycle Assessment (LCA) method has also been developed to assess the environmental impacts. The criteria for qualifying the domain of the proposed strategy vis-à-vis the conventional strategy were also identified.
Bertin, Ingrid. "Conception des bâtiments assurant leur réversibilité, leur déconstruction et leur réemploi, méthodologie de suivi et évaluation environnementale sur les cycles de vie." Thesis, Paris Est, 2020. http://www.theses.fr/2020PESC1041.
In a context of strong environmental pressure, in which the construction sector has the greatest impact, the reuse of load-bearing elements is the most promising option, as it significantly avoids waste production, preserves natural resources and reduces greenhouse gas emissions by cutting down on embodied energy. This thesis consequently covers three main areas of research:
1. Improvement of structural design through expedient typologies, by defining DfReu (Design for Reuse) in order to anticipate the use of load-bearing elements (vertical and horizontal) that can be dismantled and reused at the end of their service life to extend their lifespan, ultimately increasing the stock of elements available for reuse.
2. Development of a methodology for the implementation of reinforced and long-lasting traceability centered on a materials bank, with the use of BIM, in order to secure all the characteristics of the load-bearing elements (in particular physico-mechanical ones), to facilitate the reuse processes, and to establish the new responsibility of the reuse engineer.
3. Identification of the key parameters influencing the environmental impacts of reuse and development of a sensitivity study, allowing a better comprehension of the consequences of this process and its consideration in design to support decision making.
An experiment based on reinforced concrete demonstration portal frames has enabled corroboration of these three lines of research by generating data missing from the literature.
This practical analysis of column-beam assemblies has generated technical data on the structural behavior after reuse, but also environmental data on implementation and deconstruction. This research subsequently offers a methodology based on a chain of tools to enable engineers to design reversible construction assemblies within a reusable structure, to secure the necessary information in the BIM model coupled with physical traceability, to build a bank of materials, and to enhance design through a stock of load-bearing elements. The study thus distinguishes "design with a stock", which aims to combine as many available elements as possible, from "design from a stock", which leads to the reuse of 100% of the elements and thus presents a new paradigm for the designer. At the same time, the environmental impacts of the reuse process are studied using a life cycle assessment (LCA). A sensitivity study, based among other things on the number of uses and the lifespan, in comparison with equivalent new constructions, provides a better understanding of the areas of interest of the DfReu. Consideration of criteria specific to the circular economy in buildings completes the definition of the reuse criteria. In the end, the environmental studies establish under which conditions reuse reduces the impact of a building and identify the key parameters. The results obtained are primarily intended for structural engineers, but more broadly for designers involved in project management: architects, engineers and environmental design offices, in order to offer and encourage the study of variants anticipating the reusability of newly designed buildings. By extension, the results can also be used in projects involving existing buildings.
Moalla, Néjib. "Amélioration de la qualité des données du produit dans le contexte du cycle de vie d’un vaccin : une approche d’interopérabilité dirigée par les modèles." Lyon 2, 2007. http://theses.univ-lyon2.fr/sdx/theses/lyon2/2007/moalla_n.
To reach industrial excellence, data quality is one of the essential pillars to address in any improvement or optimization approach. Data quality is thus a paramount need to ensure that the product meets customer requirements. In the pharmaceutical industry, and more particularly in the vaccine industry, the definition of a vaccine product is very complex given its molecular structure. Data quality proves to be a priority across the many product definitions (biological, pharmaceutical, industrial, etc.), especially in the face of the many restrictions and regulatory recommendations imposed by customers such as health authorities. In this context, and given the multitude of business activities supported by disconnected information systems, ensuring interoperability between these heterogeneous systems makes it possible to handle the specifications of the various business scopes during information exchanges. The deployment of model-driven architecture makes it possible to transform a functional description of processes into data models expressed on various platforms. Within the logistic perimeter of the vaccine industry, we are interested in ensuring the quality of some critical data in our ERP by deploying the concepts of model-driven interoperability. The definition of various levels of reference frames enables us to structure the models thus generated and to share them with the actors of the logistic perimeter. In the long run, our approach aims at reducing the cost of the product.
Mathon, Vincent. "Etude climatologique des systèmes convectifs de meso-échelle en Afrique de l'Ouest." Paris 7, 2001. http://www.theses.fr/2001PA077099.
Prinçaud, Marion. "Développement d'un outil d'aide à la décision environnementale basé sur l'analyse de cycle de vie intégré au processus de conception." Phd thesis, Paris, ENSAM, 2011. http://pastel.archives-ouvertes.fr/pastel-00589315.
Pham, Cong Cuong. "Multi-utilisation de données complexes et hétérogènes : application au domaine du PLM pour l’imagerie biomédicale." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2365/document.
The emergence of Information and Communication Technologies (ICT) in the early 1990s, especially the Internet, made it easy to produce data and disseminate it to the rest of the world. The power of new Database Management Systems (DBMS) and the reduction of storage costs have led to an exponential increase of data volumes within enterprise information systems. The large number of correlations (visible or hidden) between data makes them more intertwined and complex. The data are also heterogeneous, as they can come from many sources and exist in many formats (text, image, audio, video, etc.) or at different levels of structuring (structured, semi-structured, unstructured). All companies now have to face data sources that are more and more massive, complex and heterogeneous. The data may either have different denominations or may not have verifiable provenances. Consequently, these data are difficult to interpret and to access for other actors, and they remain unexploited, or not fully exploited, for the purposes of sharing and reuse. Data access (or data querying) is, by definition, the process of extracting information from a database using queries to answer a specific question. Extracting information is an indispensable function for any information system. However, it is never easy and often represents a major bottleneck for organizations (Soylu et al. 2013). In an environment of multi-use of complex and heterogeneous data, providing all users with easy and simple access to data becomes more difficult for two reasons: - Lack of technical skills: in order to correctly formulate a query, a user must know the structure of the data, i.e. how the data is organized and stored in the database. When data is large and complex, it is not easy to have a thorough understanding of all the dependencies and interrelationships between data, even for information system technicians.
Moreover, this understanding is not necessarily linked to domain competences, and it is therefore very rare that end users have such skills. - Different user perspectives: in a multi-use environment, each user introduces their own point of view when adding new data and technical information. Data can be named in very different ways, and data provenances are not sufficiently recorded. Consequently, the data become difficult to interpret and to access for other actors, who do not have a sufficient understanding of the data semantics. The thesis work presented in this manuscript aims to improve the multi-use of complex and heterogeneous data by expert business actors, by providing them with semantic and visual access to the data. We find that, although the initial design of a database may take the logic of the domain into account (using the entity-association model, for example), it is common practice to modify this design in order to adapt it to specific technical needs. As a result, the final design often diverges from the original conceptual structure, and there is a clear distinction between the technical knowledge needed to extract data and the knowledge that the expert actors have to interpret, process and produce data (Soylu et al. 2013). Based on bibliographical studies of data management tools, knowledge representation, visualization techniques and Semantic Web technologies (Berners-Lee et al. 2001), and in order to provide easy data access to the different expert actors, we propose to use a comprehensive and declarative representation of the data that is semantic, conceptual, and integrates domain knowledge close to the expert actors.
Verriez, Quentin. "Rationaliser les pratiques numériques en archéologie : l'exemple des chantiers de fouilles de Bibracte." Electronic Thesis or Diss., Bourgogne Franche-Comté, 2023. http://www.theses.fr/2023UBFCH035.
The thesis examines the move to open, digital archaeology. It investigates the feasibility of free software-based archaeological excavations producing transparent, structured data in open formats. Over the past two decades, the integration of digital technologies in the field of archaeology has increased considerably, affecting the collection, processing, management, preservation and dissemination of data. The open science approach offers solutions for the management, use and protection of this new type of data. This study uses a four-year excavation of the Bibracte Oppidum as a framework to test how open science principles can guide data production during fieldwork. Moreover, it aims to offer insights into the impacts of digital archaeology on its users and surroundings beyond technical concerns. The project aims to modernise archaeological methods by developing digital practices that consider fieldworkers' objectives and integrate their approach into a process of mastering, sharing and preserving archaeological knowledge
Lifran, Robert. "La contrainte de liquidité et l'accumulation du patrimoine professionnel dans une perspective de cycle de vie : modèles et tests empiriques sur les données du RICA (réseau d'information comptable agricole)." Montpellier 1, 1992. http://www.theses.fr/1992MON10044.
The aim of this work is to build a life-cycle model for the self-employed, with endogenous income. First, we define the types of constraints facing a self-employed worker who is willing to maximize his discounted lifetime utility. Second, assuming separability between saving and borrowing for the firm, we derive an optimality condition for the firm's debt from an equation which maximizes the resources (disposable income and equity) of the next period. Third, we demonstrate that the Euler equation is the same as in the case of exogenous income, but the level of initial consumption is different. If the borrowing decision is exogenously constrained, the rate of growth of consumption will be lower, and it depends on the technology and the initial equity. We test a model of the optimal debt of farmers on panel data from the RICA. The heterogeneity of farmers facing a borrowing constraint is studied by estimating a borrowing equation by age group.
Kone, Joël-Louis. "Modélisation et suivi du vieillissement d’accumulateurs Li-ions par couplage avec modèle Dual-tank." Thesis, Université Grenoble Alpes, 2021. http://www.theses.fr/2021GRALI003.
The battery models used in system studies are generally based on electrical models with a single tank ("one-tank model"), coupled with semi-empirical aging models predicting the evolution of the capacity of this single tank. In these models, the state of health of a cell is therefore represented by a single value, which is too limiting. Moreover, these approaches make it difficult to understand the capacity slope failures observed experimentally. In this thesis, empirical or physics-based aging models are coupled to a dual-tank model. The first, empirical approach aims to directly predict the evolution of the capacity of each electrode and the offset between electrode potential signals. The second, inspired by the physical phenomena that can occur within the battery, introduces the notion of a parasitic current at the origin of the loss of cyclable lithium. These different approaches are implemented using experimental calendar-aging results from the MOBICUS project.
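As a toy illustration of the dual-tank idea above, the usable capacity of a cell can be viewed as limited by the smallest of several "tanks": each electrode's capacity and the cyclable-lithium inventory. The model and every parameter value below are invented for illustration and are not those of the thesis:

```python
# Minimal sketch (not the thesis model): cell capacity limited by the weakest
# of the two electrode "tanks" and the cyclable-lithium inventory.
# All fade rates and initial capacities are illustrative assumptions.

def cell_capacity(cycle, q_pos0=1.05, q_neg0=1.10, q_li0=1.00,
                  k_pos=1e-5, k_neg=2e-4, k_li=4e-4):
    """Normalised remaining capacity after `cycle` cycles."""
    q_pos = q_pos0 - k_pos * cycle       # positive-electrode tank (linear fade)
    q_neg = q_neg0 - k_neg * cycle       # negative-electrode tank (linear fade)
    q_li = q_li0 - k_li * cycle**0.5     # cyclable lithium lost to parasitic current
    # The cell is limited by whichever "tank" is smallest.
    return max(0.0, min(q_pos, q_neg, q_li))

fade = [cell_capacity(n) for n in range(0, 2001, 100)]
```

With these invented numbers the cell is lithium-inventory-limited at first (slow, square-root fade) and becomes negative-electrode-limited later (faster, linear fade), producing the slope change, or "knee", that single-tank models struggle to reproduce.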
Hernane, Soumeya-Leila. "Modèles et algorithmes de partage de données cohérents pour le calcul parallèle distribué à haut débit." Thesis, Université de Lorraine, 2013. http://www.theses.fr/2013LORR0042/document.
Data Handover (Dho) is a library of functions adapted to large-scale distributed systems. It provides routines for acquiring resources in read or write mode in a way that is coherent and transparent for users. We modelled the life cycle of Dho with a finite state automaton and found experimentally that our approach produces an overlap between the application's computation and the control of the data. These experiments were conducted both in simulated mode and in a real environment (Grid'5000), using the GRAS library of the SimGrid toolkit. Several clients attempt to access the resource concurrently according to the client-server paradigm. Using queueing theory, the stability of the model was demonstrated in a centralized environment. We improved the distributed mutual exclusion algorithm of Naimi and Trehel by introducing the following features: (1) allowing the mobility of processes (ADEMLE), (2) introducing shared locks (AEMLEP) and finally (3) merging both properties into a single algorithm (ADEMLEP). We proved the safety and liveness properties theoretically for all extended algorithms. The proposed peer-to-peer system combines our extended algorithms with the original Data Handover model. Lock and resource managers operate and interact with each other in a three-level architecture. Following the experimental study of the system on Grid'5000, the results demonstrate the performance and stability of the Dho model over a wide range of parameters.
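The abstract models the Dho life cycle as a finite state automaton. The sketch below illustrates that general idea; the state names, events and transitions are hypothetical, since the abstract does not enumerate them:

```python
# Illustrative resource life cycle as a finite state automaton, in the spirit
# of the Dho model described above. States and events are invented examples.

TRANSITIONS = {
    ("idle", "request_read"): "waiting",
    ("idle", "request_write"): "waiting",
    ("waiting", "grant"): "locked",
    ("locked", "release"): "idle",
}

class ResourceHandle:
    def __init__(self):
        self.state = "idle"

    def fire(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

h = ResourceHandle()
h.fire("request_write")   # "waiting": lock requested, computation may continue
h.fire("grant")           # "locked": exclusive access granted
h.fire("release")         # back to "idle"
```

In this reading, the computation/data-control overlap mentioned in the abstract corresponds to the "waiting" state: the application keeps computing while the lock request travels through the distributed mutual exclusion algorithm.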
Leroy, Yann. "Développement d'une méthodologie de fiabilisation des prises de décisions environnementales dans le cadre d'analyses de cycle de vie basée sur l'analyse et la gestion des incertitudes sur les données d'inventaires." Phd thesis, Paris, ENSAM, 2009. http://pastel.archives-ouvertes.fr/pastel-00005830.
Hadji, Rezai Sara. "Méthode d’évaluation de l’impact des composants de construction sur la performance globale (énergétique & environnementale, économique et sociale) d’un bâtiment tout au long de son cycle de vie." Thesis, La Rochelle, 2017. http://www.theses.fr/2017LAROS014/document.
Current expectations in terms of energy, environmental, economic and social performance are increasingly high, making the act of building more complex and complicating decision-making in the design phase. Tomorrow's challenge therefore includes supporting construction stakeholders in choosing the least risky and most appropriate construction components for a given construction project. This PhD was born of the desire to respond to this challenge, the objective being to develop a method for assessing the impact of the integration of a construction component on the overall performance (energy, environmental, economic and social) of a building throughout its life cycle. The appropriation of the risk culture, its assessment (identification, analysis and evaluation) and its management in the field of construction is a recent phenomenon, still mainly at the research stage. Other fields, however, such as nuclear, aeronautics and finance, have already acquired this knowledge: tools, methods and techniques were developed there, highlighting good practices and precautions to follow. This robust basis was used to develop our method; in particular, the classical method of Cooke (1991) was used to weight expert judgments in order to ensure a better representativeness of reality. The proposed method is composed of three steps. The first is a preliminary step that defines the performance objectives to be achieved for an efficient building by building up a family of performance indicators. It also determines which of the two following processes applies: either the construction component is widely distributed and the main process is implemented, or it is an innovative component and the secondary process is implemented.
Ultimately, the method makes it possible: 1) to identify the 'hazard/risk' pairs associated with the component studied (widely distributed or innovative component); 2) to highlight, by weighting the performance indicators representing the overall performance to be achieved for a particular building: a) the indicators most sensitive to this component (widely distributed component); b) the impact of the integration of this component through the 'hazard/risk' pairs identified throughout the life cycle of the building, at the component scale and at the building scale (widely distributed component). It was tested on two construction components, one widely distributed, the other innovative, which highlighted the method's advantages, limits, prospects and possible improvements from a theoretical point of view, and also identified good practices, risks and lessons learned associated with its implementation. Its strength lies in its large-scale deployment: the more it is implemented on a large number of construction components of the same family, the more useful it becomes as a design aid, by comparing these components to choose the most appropriate and least risky for a given construction project.
Tchana, De Tchana Yvan. "Proposition d’un jumeau numérique pour soutenir la gestion de l'exploitation d'une infrastructure linéaire." Thesis, Troyes, 2021. http://www.theses.fr/2021TROY0012.
The digital growth of the construction industry led to BIM (Building Information Modeling). Developed for buildings, BIM was later applied to linear infrastructure projects. Such projects require end-to-end control of information. PLM (Product Lifecycle Management) supports digital continuity in the manufacturing industry, and studies have evaluated the relevance of a complementary use of the BIM and PLM approaches for linear infrastructure projects. Adapting methods used for building construction, those studies are mostly restricted to the implementation of data repositories. This makes it difficult to address the infrastructure's post-construction phase, where the 3D model is no longer a digital model but a digital twin. This research work develops a strategy for the design, implementation, and operation and maintenance of a linear infrastructure. The digital twin of the infrastructure is the target of our approach. It takes into consideration not only BIM and PLM methodologies, but also any other data source positioning the infrastructure in its geographical environment. As a data aggregator, our digital twin should make it possible to manage the life cycle of a linear infrastructure. The system is tested on a specific linear infrastructure, a level crossing. Digital continuity and data traceability are important factors for such constructions. Through the digital twin, our proposal helps to follow the data, and thus to link operational data to the design and construction data of the linear infrastructure.
Shahzad, Muhammad Kashif. "Exploitation dynamique des données de production pour améliorer les méthodes DFM dans l'industrie Microélectronique." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00771672.
Kubler, Sylvain. "Premiers travaux relatifs au concept de matière communicante : Processus de dissémination des informations relatives au produit." Phd thesis, Université Henri Poincaré - Nancy I, 2012. http://tel.archives-ouvertes.fr/tel-00759600.
Скворчевський, Олександр Євгенович. "Організація систем управління баз даних в логістичній підтримці життєвого циклу озброєння та військової техніки." Thesis, Луцький національний технічний університет, 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/39152.
Bicalho, Tereza. "Les limites de l'ACV. Etude de la soutenabilité d'un biodiesel issu de l'huile de palme brésilienne." Phd thesis, Université Paris Dauphine - Paris IX, 2013. http://tel.archives-ouvertes.fr/tel-01002055.
Zina, Souheil. "Proposition d’un cadre de modélisation pour les applications PLM : application à la gestion de configurations." Thesis, Nancy 1, 2007. http://www.theses.fr/2007NAN10138/document.
Managing technical information is one of the main preoccupations of companies. Indeed, increasingly constraining regulations and a higher level of competition require them to be more rigorous and more reactive to customers' requests. Improving product quality and reducing costs and cycle times require applying rules and means of technical data management. However, installing a PLM (Product Lifecycle Management) solution remains a difficult exercise, given the complexity and diversity of customer requirements. The issues faced by both editors and integrators of PLM applications arise from the specific nature of customers' projects, even though most functional needs are often generic. The increasing complexity of data and the need for flexible and scalable systems require PLM application editors to capitalize knowledge related to their products' engineering and to rationalize the working methods of design and development teams. This research work defines a modeling framework and deployment methods to help the integrator specify, design, implement and evolve PLM applications quickly while taking into account customers' specificities. The study carried out at LASCOM on the Advitium software package consisted in a reverse-engineering step that made it possible to formalize the concepts needed for technical data management and the specifics related to product configuration management. This fundamental stage made it possible to validate the concepts handled.
Hernane, Soumeya. "Modèles et algorithmes de partage de données cohérents pour le calcul parallèle et distribué à haut débit." Phd thesis, Université de Lorraine, 2013. http://tel.archives-ouvertes.fr/tel-00919272.
Gong, Xin. "Gestion de patrimoine immobilier et transition numérique : modélisation des flux de données et mesure des impacts du BIM." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSET005.
The facility management (FM) phase accounts for 75% of the total cost of a building over its entire life cycle. While new construction can nowadays benefit from many services resulting from the digital transition, particularly BIM (Building Information Modeling), the integration of such services in FM is still problematic: the offer of tools remains confidential, practical feedback and therefore objective evaluations are lacking, and there are no tools to measure the impacts of this digital transition on FM. The problem is compounded for existing buildings, given the break in data flows between design-construction and FM. This thesis proposes a tool for quantifying the effects of the introduction of BIM into the FM strategy of real estate asset managers, based on the modeling of data flows according to a V-model system development life cycle. First, the needs of the managers were identified and formalized from real situations; the information needed for modeling was obtained from data collection, interviews and in situ observations. The organizational structure and operation of the managers were then translated into a schematic model ordering the real estate asset management activities, the functional relationships between entities and the work processes. This structuring led to the construction of a multi-criteria metric based on KPIs (Key Performance Indicators), combining three indices of importance, competence and performance. The resulting MIB (Measurement of Impacts of BIM) model uses data from two real case studies to establish the current diagnosis and evaluate the impacts of the digital transition. From the estimated impacts, optimization solutions are identified to guide managers in their transition strategy.
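The abstract describes a KPI metric built on three indices (importance, competence, performance). The sketch below shows one plausible way such a score could be aggregated; the aggregation rule and all figures are assumptions, not the thesis's actual MIB formula:

```python
# Hypothetical sketch of a KPI-based score in the spirit of the MIB model:
# each indicator carries an importance weight and two ratings (competence,
# performance). The weighted-mean aggregation below is an assumption.

def mib_score(indicators):
    """indicators: list of (importance, competence, performance), each in [0, 1]."""
    total_weight = sum(imp for imp, _, _ in indicators)
    weighted = sum(imp * (comp + perf) / 2 for imp, comp, perf in indicators)
    return weighted / total_weight

# Invented ratings for the same three indicators before and after adopting BIM:
before_bim = mib_score([(0.5, 0.6, 0.4), (0.3, 0.5, 0.5), (0.2, 0.7, 0.3)])
after_bim = mib_score([(0.5, 0.8, 0.7), (0.3, 0.6, 0.6), (0.2, 0.8, 0.5)])
impact = after_bim - before_bim   # estimated effect of the digital transition
```

Comparing the score before and after the transition mirrors the "current diagnosis vs. estimated impacts" use of the model described above.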
Houriet-Segard, Geneviève. "Logement, cycles démographiques et cycle de vie." Paris, Institut d'études politiques, 1998. http://www.theses.fr/1998IEPP0029.
This thesis comprises three parts; its objective is a better understanding of the interactions between demography and housing. First, an analysis of the demographic cycles influencing housing demand and housing market equilibrium: existing housing market models with a demographic determinant are reviewed, and such a model is then built on French economic data. The model's main hypothesis is that there is a systematic excess of housing supply over demand, whose level has a direct incidence on the price of housing; the model is simulated under various demographic and economic scenarios. Second, an analysis of residential choices over the life cycle: the focus is households' housing behavior within the life-cycle theory of saving (e.g. people buy a house while working and sell it when they retire, to compensate the loss of revenue and maintain the same level of consumption). These behaviors, such as housing consumption, tenure choice and the mobility of older householders, are tested on data from the French survey "Enquête Logement 92". Third, a microsimulation of the housing market that allows the study of the interactions between population aging, the housing market and intergenerational transfers (bequests and social security programs): at the microeconomic level, this model takes into account a heterogeneous population, simulating events such as birth and death dates, salary, retirement pension and, in some cases, inheritances. Each individual lives two periods and optimizes consumption following the generalized life-cycle theory of saving. Finally, individual behaviors are aggregated and compiled at the macroeconomic level. This model is also simulated for different demographic and economic evolutions.
Glade, Mathieu Lyonnet Patrick. "Modélisation des coûts de cycle de vie." [S.l.] : [s.n.], 2005. http://bibli.ec-lyon.fr/exl-doc/mglade.pdf.
Bouzaiene-Marle, Leïla. "AVISE, une démarche d'Anticipation du Vieillissement par Interrogation et Stimulation d'Experts, application à un matériel passif d'une centrale nucléaire : le pressuriseur." Phd thesis, Ecole Centrale Paris, 2005. http://tel.archives-ouvertes.fr/tel-00271619.
Kubler, Sylvain. "Premiers travaux relatifs au concept de matière communicante : Processus de dissémination des informations relatives au produit." Electronic Thesis or Diss., Université de Lorraine, 2012. http://www.theses.fr/2012LORR0130.
Over the last decade, communities involved with intelligent manufacturing systems (IMS - Intelligent Manufacturing Systems, HMS - Holonic Manufacturing System) have demonstrated that systems integrating intelligent products can be more efficient, flexible and adaptable. Intelligent products may prove economically beneficial for dealing with product traceability and information sharing along the product life cycle. Nevertheless, some questions remain open, such as the specification of what information should be gathered, stored and distributed, and how it should be managed during the life cycle of the product. The contribution of this thesis is to define a process for disseminating information related to the product over its life cycle. This process is combined with a new paradigm which drastically changes the way we view the material: the concept aims to give the material the ability to be intrinsically and wholly "communicating". The data dissemination process allows users to store context-sensitive information on the communicating product. In addition, this thesis gives insight into the technological and scientific research fields inherent to the concept of "communicating material" which remain to be explored.
Virginillo, Martin Gustavo. "Méthode d'analyse du cycle de vie des emballages." Thesis, Université Laval, 2011. http://www.theses.ulaval.ca/2011/28009/28009.pdf.
Charpin, Françoise. "Théorie du cycle de vie, croissance et endettement." Paris 10, 1990. http://www.theses.fr/1990PA100093.
LALANDE, MULLER SYLVIE. "Migraine et vie genitale chez la femme." Rennes 1, 1993. http://www.theses.fr/1993REN1M147.
Naumchev, Alexandr. "Exigences orientées objets dans un cycle de vie continu." Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30132.
Constantly changing customer and user needs require a fast response from software teams, creating strong demand for seamlessness in software processes. Continuous integration, delivery and deployment, also known as DevOps, made huge progress in making software processes responsive to change. This progress had little effect on software requirements, however. Specifying requirements still relies on natural language, which has enormous expressive power but inhibits requirements' traceability, verifiability, reusability and understandability. Improving these problematic qualities without inhibiting expressiveness too much is a challenge. Bertrand Meyer, in his multirequirements method, accepts the challenge and proposes to express individual requirements on three layers: a declarative subset of an object-oriented programming language, natural language and a graphical notation. This approach motivated and inspired the work in the present thesis. While multirequirements focus on traceability and understandability, the Seamless Object-Oriented Requirements approach presented in the dissertation addresses verifiability, reusability and understandability. The dissertation explores Martin Glinz's hypothesis that software requirements should be objects in order to support seamlessness. The exploration confirms the hypothesis and results in a collection of tool-supported methods for specifying, validating, verifying and reusing object-oriented requirements. The most significant reusable technical contribution of the dissertation is a ready-to-use Eiffel library of template classes that capture recurring software requirement patterns. Concrete seamless object-oriented requirements inherit from these templates and become clients of the specified software.
Object-oriented software construction becomes the method for requirements specification, validation and reuse; Design by Contract becomes the method for verifying the correctness of implementations against the requirements. The dissertation reflects on several experiments and shows that the new approach promotes requirements' verifiability, reusability and understandability while keeping expressiveness at an acceptable level. The experiments rely on several examples, some of which are used as benchmarks in the requirements literature. Each experiment illustrates a problem through an example, proposes a general solution, and shows how the solution fixes the problem. While the experimentation relies on Eiffel and its advanced tool support, such as automated proving and testing, each idea underpinning the approach scales conceptually to any statically typed object-oriented programming language with genericity and elementary support for contracts.
Popovici, Emil. "Contribution à l'analyse du cycle de vie des quartiers." Paris, ENMP, 2005. http://www.theses.fr/2005ENMP1340.
Studies of the environmental quality of buildings have shown the importance of decisions at the settlement scale and of assessing the relevance of various architectural and technical solutions at this level. A settlement life cycle assessment (LCA) software tool was developed, providing decision support to project developers, designers, contractors and owners/residents in order to design and manage a settlement in a sustainable way. This was achieved by linking a thermal simulation tool with a building-level LCA tool and complementing them with supplementary elements (e.g. networks, open spaces). Several environmental indicators are evaluated, allowing comparisons of alternative designs. First evaluations of three European settlements are presented to illustrate the application of the tool. This work aims to contribute to linking urban, architectural and technical design, according to an integrated design approach.
Arrondel, Luc. "Hypothèse du cycle de vie et composition du patrimoine." Paris 10, 1988. http://www.theses.fr/1988PA100166.
This thesis presents a general model of portfolio choices which is tested on the French CREP 1980 survey of 3,000 households. The model extends the framework of Modigliani's life cycle hypothesis to wealth composition with the help of existing partial theories (a generalised form of the life cycle hypothesis allowing for bequests, Merton's intertemporal portfolio choice model, and models of acquisition of durable goods and housing). It finally assumes that household accumulation behaviour can be described by a three-stage sequential procedure: (1) the consumption-saving decision; (2) the discrete choice of the combination of assets held; (3) the continuous choice of conditional asset demands, given the combination held. The empirical econometric and statistical analysis deals both with the number of assets held, as an indicator of wealth diversification, and with portfolio composition. It shows the importance of the size of wealth and of age on asset demands and reveals the key role played by the discrete choice, results which seem to vindicate the hypothesis of three-stage budgeting. From the explanatory variables of each asset's ownership and conditional demand it is also possible to elaborate a typology of the 14 assets distinguished by the survey.
Arrondel, Luc. "Hypothèse du cycle de vie et composition du patrimoine." Grenoble 2 : ANRT, 1988. http://catalogue.bnf.fr/ark:/12148/cb376114018.
Milette, Diane. "L'intérêt social dans une perspective du cycle de vie." Thèse, Université du Québec à Trois-Rivières, 1993. http://depot-e.uqtr.ca/5280/1/000606511.pdf.
De, Lavergne Casimir. "Eléments du cycle de vie de l'Eau Antarctique de Fond." Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066373/document.
Antarctic Bottom Water is the most voluminous water mass of the World Ocean, and it feeds the deepest and slowest component of the ocean circulation. The processes that govern its life cycle are therefore key to the ocean's carbon and heat storage capacity on centennial to multi-millennial timescales. This thesis aims at characterizing and quantifying the processes responsible for the destruction (synonymous with lightening and upwelling) of Antarctic Bottom Water in the abyssal ocean. Using an observational estimate of the global ocean thermohaline structure and diagnostics based on the density budget of deep waters, we explore the roles of basin geometry, geothermal heating and mixing by breaking internal waves in the abyssal circulation. We show that the shape of ocean basins largely controls the structure of abyssal upwelling. The contribution of mixing powered by breaking internal waves, though poorly constrained, is estimated to be insufficient to destroy Antarctic Bottom Water at a rate comparable to that of its formation. Geothermal heating plays an important role in the upwelling of waters covering large seafloor areas. The results suggest a reappraisal of the role of mixing in deep straits and sills, but also of the fundamental role of basin geometry, in the lightening and transport of abyssal waters.
Lelièvre, Luc. "Cycle de vie et mobilité résidentielle dans l'agglomération de Québec." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0007/MQ44695.pdf.
Sallez, Yves. "Produit " actif " tout au long de son cycle de vie." Habilitation à diriger des recherches, Université de Valenciennes et du Hainaut-Cambresis, 2012. http://tel.archives-ouvertes.fr/tel-00768771.
El, Faiz Meryem. "Analyse du cycle de vie à l'aide du logiciel SimaPro." Master's thesis, Université Laval, 2020. http://hdl.handle.net/20.500.11794/40144.
Life cycle assessment (LCA) is a standardized method for assessing the environmental impacts of a product, defined by the ISO 14040:2006 and ISO 14044:2006 standards. It is a recognized approach for assessing the environmental impact of products across their entire life cycle, from raw materials extraction through manufacturing, transportation, usage and disposal, based on a set of indicators representative of the product's environmental issues (climate change, natural resources, ozone, toxicity, ecotoxicity). Performing a life cycle assessment requires processing, calculating and analyzing a lot of information. The use of LCA software facilitates these different phases and ensures transparency and traceability. This thesis presents a state of the art of the tools and methods available for carrying out an LCA based on the principles of the ISO 14040 series. SimaPro, one of the main commercial software packages available to LCA practitioners, is presented in detail through a case study, in order to explore the basic functions, databases and impact calculation methods made available with the software. Keywords: LCA, environmental footprint, databases, SimaPro.
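The bookkeeping at the heart of an LCA can be sketched very simply: sum an impact indicator over the life-cycle phases listed above and identify the dominant contributor. The figures below are invented for illustration; a real study would draw them from an LCI database such as those bundled with SimaPro:

```python
# Minimal sketch of LCA aggregation: one impact indicator (here, a
# climate-change score in kg CO2-eq) summed over life-cycle phases.
# All values are illustrative, not real inventory data.

inventory = {
    "raw materials": 12.4,
    "manufacturing": 30.1,
    "transportation": 5.7,
    "usage": 48.2,
    "disposal": 3.6,
}

total = sum(inventory.values())                              # whole-life impact
shares = {phase: value / total for phase, value in inventory.items()}
dominant = max(shares, key=shares.get)   # phase contributing most to the impact
```

In practice an LCA software tool repeats this aggregation for each indicator (toxicity, resource depletion, etc.) and documents the data sources, which is precisely the transparency and traceability role the abstract attributes to such tools.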