Theses on the topic "Traçabilité des données"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 17 theses for your research on the topic "Traçabilité des données".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse theses on a wide variety of disciplines and organize your bibliography correctly.
Bazin, Cyril. "Tatouage de données géographiques et généralisation aux données devant préserver des contraintes". Caen, 2010. http://www.theses.fr/2010CAEN2006.
Full text
Digital watermarking is a fundamental process for intellectual property protection. It consists in inserting a mark into a digital document through slight modifications. The presence of this mark allows the owner of a document to prove the priority of his rights. The originality of our work is twofold. On the one hand, we use a local approach to ensure a priori that the quality of constrained documents is preserved during watermark insertion. On the other hand, we propose a generic watermarking scheme. The manuscript is divided into three parts. First, we introduce the basic concepts of digital watermarking for constrained data and the state of the art of geographical data watermarking. Second, we present our watermarking scheme for the digital vector maps often used in geographic information systems. This scheme preserves certain topological and metric qualities of the document. The watermark is robust: it is resilient against geometric transformations and cropping. We give an efficient implementation that is validated by many experiments. Finally, we propose a generalization of the scheme for constrained data. This generic scheme will facilitate the design of watermarking schemes for new data types. We give a concrete example of applying the generic scheme to relational databases. In order to prove that it is possible to work directly on the generic scheme, we propose two detection protocols directly applicable to any implementation of the generic scheme.
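As a rough illustration of the kind of constraint-preserving mark insertion this abstract describes (a hypothetical sketch, not Bazin's actual scheme), a watermark bit can be embedded in the parity of a low-order decimal digit of each map coordinate, so the geometric perturbation never exceeds a fixed bound:

```python
# Hypothetical sketch: each mark bit is encoded in the parity of the 4th
# decimal digit of a coordinate, so the perturbation is at most 1e-4
# (a stand-in for the topological/metric constraint to preserve).

def embed_bit(value: float, bit: int, digit: int = 4) -> float:
    """Force the parity of the `digit`-th decimal to equal `bit`."""
    scaled = round(value * 10**digit)
    if scaled % 2 != bit:
        scaled += 1  # perturb by one unit in the last kept place
    return scaled / 10**digit

def extract_bit(value: float, digit: int = 4) -> int:
    return round(value * 10**digit) % 2

coords = [48.85341, 2.34880, 45.76404]   # toy vector-map vertices
mark = [1, 0, 1]
marked = [embed_bit(c, b) for c, b in zip(coords, mark)]

assert [extract_bit(c) for c in marked] == mark
assert all(abs(m - c) <= 1e-4 for m, c in zip(marked, coords))
```

A real scheme would of course also address robustness to transformations and cropping, which this toy embedding does not.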
Bossy, Robert. "Édition coopérative de bases de données scientifiques". Paris 6, 2002. http://www.theses.fr/2002PA066047.
Full text
Hadjipavlou, Elena. "Big data, surveillance et confiance : la question de la traçabilité dans le milieu aéroportuaire". Thesis, Université Côte d'Azur (ComUE), 2016. http://www.theses.fr/2016AZUR2044/document.
Full text
This research project questions, in a comprehensive and critical way, the presence of digital traces in the era of Big Data. This reflection opens onto the relation between surveillance and trust. In recent years, “Big Data” has been used massively and repeatedly to describe a new societal dynamic characterized by the production of massive quantities of data. Furthermore, enormous benefits are expected from using new statistical tools to analyze the data generated by connected objects and tools involved in more and more human actions. The airport sector is currently facing a major transformation, fueled by the explosion of data within its structure. The data generated during a passenger's journey are now extremely massive. There is no doubt that the management of these data is an important lever for safety, the improvement of services and the comfort of the passenger. However, the expected benefits raise a great question: where do these data go? We do not know. And as long as we do not know, how can we trust? These considerations are examined at Larnaca airport in Cyprus. The different angles of approach, as well as the diversity of the actors, required the creation of a multidimensional corpus, resulting from a mixed methodology, in order to take a comprehensive approach to the subject. This corpus includes interviews, questionnaires and life stories of passengers and professionals. The qualitative and quantitative analysis that followed was based on a previously elaborated theoretical framework, in order to cross the actors' representations of surveillance and trust and, finally, to highlight the different visions inherent to this issue.
Maléchaux, Astrid. "Traçabilité et authentification d'huiles d'olive monovariétales par des outils chimiométriques". Electronic Thesis or Diss., Aix-Marseille, 2020. http://theses.univ-amu.fr.lama.univ-amu.fr/200110_MALECHAUX_773gwajs656jpq766z43jurz_TH.pdf.
Full text
Ensuring the authenticity and traceability of food products is important to address consumer concerns and the economic losses generated by food fraud. Moreover, olive oils, especially those whose origin gives them extra added value, are among the products most affected by fraud. However, current official analyses focus only on purity and quality criteria of the oils, not on the confirmation of their varietal origin. Therefore, this thesis proposes chemometric tools to facilitate varietal recognition of olive oils through routine analyses. On the one hand, by adding a control-chart decision rule to the PLS1-DA model, samples whose characteristics deviate from the reference profile can be identified easily and quickly, even when the numbers of samples in the predicted classes are unbalanced. On the other hand, the development of data fusion strategies makes it possible to benefit from the synergy between complementary sources of information, such as the specific analysis of fatty acids by gas chromatography and global analyses by near- and mid-infrared spectroscopy.
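The control-chart decision rule mentioned above can be sketched in its simplest form (an illustrative toy, not Maléchaux's actual model): a sample is accepted as the target variety only if its classification score lies within limits computed from reference samples.

```python
# Illustrative 3-sigma control-chart rule: `reference` stands for the
# (hypothetical) PLS-DA prediction scores of authentic reference samples;
# a new sample is flagged when its score falls outside the control limits.
from statistics import mean, stdev

def control_limits(reference_scores, k=3.0):
    m, s = mean(reference_scores), stdev(reference_scores)
    return m - k * s, m + k * s

def is_authentic(score, limits):
    low, high = limits
    return low <= score <= high

reference = [0.98, 1.02, 0.97, 1.01, 1.00, 0.99, 1.03, 0.98]
limits = control_limits(reference)

print(is_authentic(1.00, limits))  # in-profile sample: True
print(is_authentic(0.55, limits))  # deviating sample: False
```

One advantage of such a rule, as the abstract notes, is that it does not depend on the relative sizes of the predicted classes, only on the reference profile.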
Bendriss, Sabri. "Contribution à l'analyse et la conception d'un système d'information pour la gestion de la traçabilité des marchandises dans un contexte de transport multimodal". Le Havre, 2009. http://www.theses.fr/2009LEHA0024.
Full text
One of the solutions to regulate and rationalize the physical movement of goods is to synchronize the physical flow with its informational counterpart throughout the various links constituting the transport chain. In this context, a goods tracking and tracing solution can contribute to a better mastery of flows. In this thesis, we propose a data modeling approach for goods traceability based on innovative research approaches (PLM, intelligent products, product-centered systems) and taking into account the possibilities offered by NICT in terms of data sharing, auto-identification and geolocation. Then, in order to integrate our traceability data with the other transport chain data, and also to facilitate sharing and exchanging these data, we propose the modeling and development of an intermediation platform based on web services. Meeting interoperability and integrability criteria, the result makes it possible, through mechanisms for exchanging and saving data, to follow and restore the goods' lifecycle in its entirety.
Diallo, Thierno M. L. "Approche de diagnostic des défauts d’un produit par intégration des données de traçabilité unitaire produit/process et des connaissances expertes". Thesis, Lyon 1, 2015. http://www.theses.fr/2015LYO10345.
Full text
This thesis, which is part of the Traçaverre project, aims to optimize product recalls when the production process is not batch-based and produced items have unit-level traceability. The objective is to minimize the number of recalled items while ensuring that all defective items are recalled. We propose an efficient recall procedure that incorporates the possibilities offered by unitary traceability and uses a diagnostic function. For complex industrial systems for which human expertise is not sufficient and no physical model is available, unitary traceability provides opportunities to better understand and analyse the manufacturing process by re-enacting the life of the product through the traceability data. The integration of product and process unitary traceability data represents a potential source of knowledge to be implemented and exploited. This thesis proposes a data model for coupling these data, based on two standards, one dedicated to production and the other to traceability. Having identified and integrated the necessary data, we developed a data-driven diagnostic function, built by a learning approach and incorporating knowledge about the system to reduce the complexity of the learning algorithm. In the proposed recall procedure, once the equipment causing the fault is identified, the health status of this equipment in the neighbourhood of the defective product's manufacturing time is evaluated in order to identify other products likely to present the same defect. The overall approach was applied to two case studies: the first in the glass industry, the second on the Tennessee Eastman process benchmark.
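The recall-targeting idea above can be sketched as a query over unit traceability records (a minimal illustration under assumed field names and an assumed time window, not the thesis's actual model):

```python
# Illustrative targeted recall: once the faulty equipment and the defective
# item's manufacturing time are known, only items produced on that equipment
# within a time window around the fault are recalled, instead of everything.
# The record fields and the window size are hypothetical.

def items_to_recall(records, equipment, defect_time, window=2.0):
    return [r["item"] for r in records
            if r["equipment"] == equipment
            and abs(r["time"] - defect_time) <= window]

records = [
    {"item": "b1", "equipment": "furnace-2", "time": 10.0},
    {"item": "b2", "equipment": "furnace-2", "time": 11.5},
    {"item": "b3", "equipment": "furnace-1", "time": 11.0},
    {"item": "b4", "equipment": "furnace-2", "time": 15.0},
]

# Fault diagnosed on furnace-2 at t = 11.0: b3 (other equipment) and
# b4 (outside the window) are spared from the recall.
assert items_to_recall(records, "furnace-2", 11.0) == ["b1", "b2"]
```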
Le, roch Yann. "De l'intérêt de la mise en réseau de données logistiques standardisées : des technologies supports aux business models". Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEM059.
Full text
New logistics models, such as the Physical Internet, offer a breakthrough in the way goods are routed inside networks that are more and more interconnected. This interconnection requires innovative, Internet-of-Things-like information systems for tracking and routing those goods within open loops. This double redefinition of physical and digital flows is bound to impact logistics service providers' professions and their business models. Through action research on behalf of an industrial consortium, we study this question in three steps: by experimenting with standardized track-and-trace technologies used to manage returnable transport items, by identifying the new value logics induced by these new technological means, and finally by observing how these new technologies and businesses impact the logistics services in place. Our results validate the benefit of these new information systems for tracing logistical units within the Physical Internet. A range of new business tactics induced by these new digital and physical models is proposed. However, within the scope of our research, the logistics network's business models are not yet impacted by these new digital logics.
Toussaint, Marion. "Une contribution à l'industrie 4.0 : un cadre pour sécuriser l'échange de données standardisées". Electronic Thesis or Diss., Université de Lorraine, 2022. http://www.theses.fr/2022LORR0121.
Full text
The recent digital transformation of the manufacturing world has resulted in numerous benefits, from higher-quality products to enhanced productivity and shorter time to market. In this digital world, data has become a critical element in many key decisions and processes within and across organizations. Data exchange is now a key process for organizations' communication, collaboration, and efficiency. Industry 4.0's adoption of modern communication technologies has made this data available and shareable at a quicker rate than we can consume or track it. This speed brings significant challenges, such as data interoperability and data traceability, two interdependent challenges that manufacturers face and must understand in order to address them. On the one hand, data interoperability challenges delay innovation and collaboration. The growing volume of data exchange is accompanied by an increasing number of heterogeneous systems that need to communicate with and understand each other. Information standards are a proven solution, yet their long and complex development process keeps them from keeping pace with the fast-moving environment they need to support and provide interoperability for, slowing down their adoption. This thesis proposes a transition from predictive to adaptive project management, using Agile methods to shorten development iterations and increase delivery velocity, thereby increasing standards adoption. While adaptive environments have proven a viable way to align standards with the fast pace of industry innovation, most project requirements management solutions have not evolved to accommodate this change. This thesis also introduces a model to support better requirements elicitation during standards development, with increased traceability and visibility.
On the other hand, data-driven decisions are exposed to the speed at which tampered data can propagate through organizations and corrupt those decisions. With the mean time to identify (MTTI) and mean time to contain (MTTC) such a threat already close to 300 days, the constant growth of data produced and exchanged will only push the MTTI and MTTC upwards. While digital signatures have already proven their use in identifying such corruption, there is still a need for a formal data traceability framework to track data exchange across large and complex networks of organizations, in order to identify and contain the propagation of corrupted data. This thesis analyses existing cybersecurity frameworks and their limitations, and introduces a new standards-based framework, in the form of an extended NIST CSF profile, to prepare against, mitigate, manage, and track data manipulation attacks. The framework is accompanied by implementation guidance to facilitate its adoption by organizations of all sizes.
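The signature-based tamper detection this abstract builds on can be sketched minimally (an illustration using an HMAC over a shared key, not the specific mechanism or profile proposed in the thesis; key and record contents are made up):

```python
# Illustrative tamper detection: each exchanged record travels with an
# authentication tag; a downstream consumer recomputes the tag to detect
# manipulation before the data feeds a decision.
import hashlib
import hmac

SHARED_KEY = b"hypothetical-shared-secret"

def sign(data: bytes) -> str:
    return hmac.new(SHARED_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(sign(data), tag)

record = b'{"batch": 42, "status": "released"}'
tag = sign(record)

assert verify(record, tag)                                      # intact data
assert not verify(b'{"batch": 42, "status": "recalled"}', tag)  # tampered
```

A traceability framework of the kind described would track such tags across the network of organizations, so a failed verification can be traced back to the point where corruption entered.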
Chichignoud, Aurélien. "Approche méthodologique pour le maintien de la cohérence des données de conception des systèmes sur puce". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS069/document.
Full text
The development of highly complex products requires maintaining a huge set of interdependent documents in various formats. Unfortunately, no tool or methodology is available today to systematically maintain consistency between all these documents. Therefore, according to observations made at STMicroelectronics, when a document changes, stakeholders must manually propagate the change to the impacted set of dependent documents. For various reasons, they may propagate the change poorly, or not at all. Related documents thereby diverge more and more over time. Realigning documents and making the very wide-ranging corpus consistent dramatically impacts productivity. This thesis proposes a methodology to help stakeholders systematically maintain consistency between documents, based on the Architecture Description concept introduced by ISO 42010. First, a model is defined to describe formally and completely the correspondences between the Architecture Description Elements of documents. This model is designed to be independent of document formats, of the selected system development lifecycle and of the industry's working methods. Second, these correspondences are analyzed when a document is modified, in order to help stakeholders maintain global corpus consistency. A prototype implementing the proposed approach was developed to evaluate the methodology. 18 volunteer subjects performed two tests (with and without our methodology) involving the correction of inconsistencies introduced into a set of documents. These tests allowed us to measure two variables: the number of inconsistencies corrected and the average time to correct them. According to our study, the approach helps correct 5.5% more inconsistencies in 3.3% less time.
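The change-propagation analysis described above can be sketched as a reachability query over a correspondence graph (a generic illustration with made-up document names, not the thesis's ISO 42010-based model):

```python
# Illustrative change propagation: when one document changes, every
# document reachable through the correspondence graph is flagged for
# review, instead of relying on stakeholders to remember the links.
from collections import deque

correspondences = {            # document -> documents that depend on it
    "spec": ["design", "testplan"],
    "design": ["rtl"],
    "testplan": [],
    "rtl": [],
}

def impacted(doc):
    """Breadth-first traversal collecting all transitively dependent docs."""
    seen, queue = set(), deque([doc])
    while queue:
        for nxt in correspondences.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

assert impacted("spec") == {"design", "testplan", "rtl"}
```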
Ouertani, Mohamed Zied. "DEPNET : une approche support au processus de gestion de conflits basée sur la gestion des dépendances de données de conception". Phd thesis, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00163113.
Full text
It is the management of this phenomenon, conflict, that we address in the work presented in this thesis, and more particularly conflict management through negotiation. We propose the DEPNET approach (product Data dEPendencies NETwork identification and qualification) to support the conflict management process based on managing the dependencies between data. These data, exchanged and shared between the various stakeholders, are the very essence of the design activity and play a key role in the progress of the design process.
This approach provides methodological elements to (1) identify the negotiation team that will be responsible for resolving the conflict, and (2) manage the impacts of the solution adopted to resolve it. The contributions of this research work are implemented in the DEPNET software prototype, which we validate on an industrial case study drawn from the design of a turbocharger.
Azzi, Rita. "Blockchain Adoption in Healthcare : Toward a Patient Centric Ecosystem". Electronic Thesis or Diss., Institut polytechnique de Paris, 2023. http://www.theses.fr/2023IPPAT053.
Full text
The healthcare sector evolves constantly, driven by technological advancement and innovative solutions. From remote patient monitoring to the Internet of Things (IoT), Artificial Intelligence (AI), personalized medicine, mobile health, and electronic record systems, technology has improved patient outcomes and enhanced care delivery. These technologies have shifted the healthcare ecosystem toward being patient-centered, focusing on meeting the patient's needs rather than those of the individual organizations within it. However, this transformative shift is associated with multiple challenges due to the inherent complexity and fragmentation of the healthcare ecosystem. This dissertation addresses three healthcare ecosystem challenges that significantly impact patients. The first is the problem of counterfeit or falsified drugs, a threat to public health resulting from vulnerabilities in the pharmaceutical supply chain, notably centralized data management and a lack of transparency. The second is the problem of healthcare data fragmentation, which thwarts care coordination and impacts clinical efficiency. This problem results from patients' dynamic and complex journeys through the healthcare system, shaped by their unique health needs and preferences. Patient data are scattered across multiple healthcare organizations in centralized databases and are governed by policies that hinder data sharing and patients' control over their data. The third is the confidentiality and privacy of healthcare data which, if compromised, shatter the trust relationship between patients and healthcare stakeholders.
This challenge results from healthcare organizations' poor data governance, which increases the risk of data breaches and unauthorized access to patient information. The blockchain has emerged as a promising solution to these critical challenges. It was introduced into the healthcare ecosystem with the promise of enforcing transparency, authentication, security, and trustworthiness. Through comprehensive analysis and case studies, this dissertation assesses the opportunities and addresses the challenges of adopting the blockchain in the healthcare industry. We start with a thorough review of the state of the art, covering the blockchain's role in improving supply chain management and enhancing the healthcare delivery chain. Second, we combine theoretical and real-world application studies to develop a guideline outlining the requirements for building a blockchain-based supply chain. Third, we propose a patient-centric framework that combines blockchain technology with semantic technologies to help patients manage their health data. Our fourth contribution presents a novel approach to data governance: a blockchain-based framework that improves data security and empowers patients to participate actively in their healthcare decisions. In this final contribution, we widen the scope of the proposed framework to include a roadmap for its adoption across diverse domains (banking, education, transportation, logistics, etc.).
Derville, Alexandre. "Développement d'algorithmes de métrologie dédiés à la caractérisation de nano-objets à partir d'informations hétérogènes". Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAM082/document.
Full text
This thesis is set in the technical and economic context of nanomaterials, more specifically nanoparticles and block copolymers. Today, we are witnessing a technological revolution with the introduction of these materials into more or less complex matrices present in our daily lives (health, cosmetics, buildings, food, ...). These materials give the resulting products unique properties (mechanical, electrical, chemical, thermal, ...). This omnipresence, combined with the economic stakes, generates two problems related to process control and the associated metrology. The first is to ensure the traceability of these nanomaterials in order to prevent health and environmental risks; the second is to optimize process development in order to sustain profitable economic sectors. The two most common metrology techniques used for this are scanning electron microscopy (SEM) and atomic force microscopy (AFM). The first phase of the work is devoted to the development of a data fusion methodology that automatically analyzes data from each microscope and uses their respective strengths to reduce measurement uncertainties in three dimensions. A first part was dedicated to correcting a major defect of the AFM, which generates drifts and/or jumps in the signals. We present a data-driven methodology, fast to implement, that accurately corrects these deviations. The proposed methodology makes no assumption about object locations and can therefore be used as an efficient preprocessing routine for signal enhancement before object analysis. The second part is dedicated to the development of a method for the automatic analysis of spherical nanoparticle images from an AFM or an SEM. In order to develop 3D traceability, it is mandatory to identify and measure the same nanoparticles on both the AFM and the SEM.
In order to obtain two estimates of the diameter of the same physical particle, we developed a technique for matching the particles. Starting from estimates for both types of microscopy, whether particles are present in both kinds of images or not, we present a technique for aggregating estimators over diameter populations in order to obtain a more reliable value of the particle diameter. The second phase of this thesis is dedicated to the optimization of a block copolymer process (lamellar structures) in order to capitalize on all the characteristic quantities used to validate the process (line width, period, roughness, defect rate), in particular from SEM images, for the purpose of matching them with a set of process parameters. Indeed, during the development of a new process, an experimental plan is carried out. Its analysis makes it possible to manually estimate a more or less precise process window (an estimate dependent on the materials engineer's expertise). The step is repeated until the desired characteristics are obtained. In order to accelerate development, we studied ways of predicting the result of the process over the parameter space. To this end, we studied different regression techniques, which we present in order to propose an automatic methodology for optimizing the parameters of a process, fed by AFM and/or SEM image characteristics. This work on estimator aggregation and process window optimization paves the way for standardizing the automatic analysis of SEM and AFM data, in support of a standard for the traceability of nanomaterials.
Zwolinska, Monika. "Sécurité et libertés fondamentales des communications électroniques en droit français, européen et international". Thesis, Nice, 2015. http://www.theses.fr/2015NICE0038/document.
Full text
The impact of today's information and communication technologies is essential for the exercise of human rights, particularly concerning freedom of expression and privacy protection. With the massive use of the Internet, mobile phones and, more recently, other smart objects and digital services, tension mounts over establishing the limit between public and private space online. Likewise, the freedoms of expression, communication and information are at risk as, under the pretext of fighting cybercrime and cyberterrorism, as well as maintaining public order, public authorities interfere with online content by controlling, monitoring, restricting or prohibiting it. This is all the more true as both States' and private companies' capacities to create extremely precise databases identifying persons' consumption habits, itineraries, thoughts and opinions gradually increase. Therefore, the need to redefine how the respect of fundamental freedoms is taken into consideration in the digital environment becomes urgent.
Schmidt, Loïc. "Passage à l'échelle des intergiciels RFID pour l'informatique diffuse". Thesis, Lille 1, 2010. http://www.theses.fr/2010LIL10155/document.
Full text
Following the Internet of Things (IoT) concept, each manufactured object is associated with an RFID (Radio Frequency IDentification) tag embedding a unique identifier. The arrival of RFID in goods management raises technical issues such as integration into the Internet, allowing access to object-related information gathered throughout the object's life. Indeed, with the spread of this technology, the architecture of this IoT must be scalable and must provide an efficient way of managing this information. In order to facilitate the development and deployment of RFID solutions, EPCGlobal has ratified standards specifying the interfaces of an RFID middleware: a component filtering and collecting events coming from readers (ALE), a component storing this information (EPCIS), a component for locating these databases (ONS), etc. We propose a distributed solution for these different components using distributed hash table (DHT) technology. We present a distributed ALE component that provides a way to interrogate every shared reader in the network, and a solution to query and retrieve information from EPCIS repositories efficiently, without overloading the network. Considering the importance of the ONS system, which is to the IoT what DNS is to the Internet, we explored an alternative to the DNS-based solution in order to distribute this component. Our solutions give the EPCGlobal middleware efficient scalability, providing a generic middleware for ubiquitous computing.
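The distribution mechanism underlying such DHT-based designs can be sketched with consistent hashing (a generic illustration of the technique, not the thesis's actual ONS/EPCIS architecture; node names are made up):

```python
# Generic consistent-hashing sketch: identifiers (e.g. EPC codes) are
# mapped onto a hash ring of nodes, so each record has a well-defined
# home node and adding a node only remaps a fraction of the keys.
import hashlib
from bisect import bisect

def h(key: str) -> int:
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)

    def node_for(self, key: str) -> str:
        positions = [pos for pos, _ in self.ring]
        i = bisect(positions, h(key)) % len(self.ring)  # wrap around the ring
        return self.ring[i][1]

ring = HashRing(["epcis-node-a", "epcis-node-b", "epcis-node-c"])
owner = ring.node_for("urn:epc:id:sgtin:0614141.107346.2017")
assert owner in {"epcis-node-a", "epcis-node-b", "epcis-node-c"}
```

Production DHTs add virtual nodes and replication on top of this basic placement rule, but the lookup stays deterministic: the same identifier always resolves to the same responsible node.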
Es, soufi Widad. "Modélisation et fouille des processus en vue d'assister la prise de décisions dans le contexte de la conception et la supervision des systèmes". Thesis, Paris, ENSAM, 2018. http://www.theses.fr/2018ENAM0067/document.
Full text
Data sets are growing rapidly for two reasons: first, the fourth industrial revolution, which aims to transform factories into smart entities in which cyber-physical systems monitor the factory's physical processes; second, the need to innovate in order to achieve and maintain competitiveness. Due to this huge volume of data (Big Data), (i) design and supervision processes are becoming chaotic, (ii) data within organizations is increasingly difficult to exploit and (iii) engineers are increasingly lost when making decisions. Indeed, several issues are identified in industry: (i) when researching, visualizing and exchanging information, (ii) when making decisions and (iii) when managing contextual changes. Through this research work, we propose an intelligent and modular Decision Support System (IDSS), in which each of the four modules solves one of the identified issues. The process modelling and traceability modules model processes and capture how they are actually executed. The decision support module proposes the process patterns that best fit the decision context, as well as their most significant activity parameters. The contextual change management module continuously updates the decision-making module in order to handle the dynamic aspect of the decision context. The proposed system is fully verified and partially validated in the context of the Gontrand project, which aims at intelligent, real-time supervision of gas networks favoring the injection of green gas. To be fully validated, the system's performance must be analyzed after integrating and exploiting it in a real industrial environment.
Bertin, Ingrid. "Conception des bâtiments assurant leur réversibilité, leur déconstruction et leur réemploi, méthodologie de suivi et évaluation environnementale sur les cycles de vie". Thesis, Paris Est, 2020. http://www.theses.fr/2020PESC1041.
Full text
In a context of strong environmental pressure in which the construction sector has the greatest impact, the reuse of load-bearing elements is the most promising approach, as it significantly avoids waste production, preserves natural resources and reduces greenhouse gas emissions by cutting down on embodied energy. This thesis consequently covers three main areas of research:
1. Improvement of structural design through expedient typologies by defining DfReu (Design for Reuse), in order to anticipate the use of load-bearing elements (vertical and horizontal) that can be dismantled and reused at the end of their service life to extend their lifespan, ultimately increasing the stock of elements available for reuse.
2. Development of a methodology for implementing reinforced, long-lasting traceability centered on a materials bank, using BIM in order to secure all the characteristics, in particular physico-mechanical, of the load-bearing elements, to facilitate the reuse processes, and to support the new responsibility taken on by the reuse engineer.
3. Identification of the key parameters influencing the environmental impacts of reuse, and development of a sensitivity study allowing a better comprehension of the consequences of this process and its consideration in design to support decision making.
An experiment based on reinforced concrete demonstration portal frames corroborated these three lines of research by generating data missing from the literature. This practical analysis of column-beam assemblies generated technical data on structural behavior after reuse, as well as environmental data for implementation and deconstruction. This research subsequently offers a methodology based on a chain of tools enabling engineers to design reversible construction assemblies within a reusable structure, to secure the necessary information in the BIM model coupled with physical traceability, to build a bank of materials and to enhance design through a stock of load-bearing elements. The study thus distinguishes "design with a stock", which aims to combine as many available elements as possible, from "design from a stock", which leads to the reuse of 100% of the elements and thus presents a new paradigm for the designer. At the same time, the environmental impacts of the reuse process are studied using life cycle assessment (LCA). A sensitivity study, based among other things on the number of uses and the lifespan, in comparison with equivalent new constructions, provides a better understanding of the areas of interest of DfReu. Consideration of criteria specific to the circular economy in buildings completes the definition of the reuse criteria. In the end, the environmental studies establish under which conditions reuse reduces the impact of a building and identify the key parameters. The results obtained are primarily intended for structural engineers, but more broadly for designers involved in project management (architects, engineers and environmental design offices), in order to offer and encourage the study of variants anticipating the reusability of newly designed buildings. By extension, the results can also be used in projects involving existing buildings.
Lepage, Yves. "Traçabilité modulée pour la conformité à Sarbanes-Oxley". Mémoire, 2009. http://www.archipel.uqam.ca/2484/1/M11144.pdf.
Full text