Selected scientific literature on the topic "Traçabilité des données"
Journal articles on the topic "Traçabilité des données":
Jay, Nicolas, and Marc Koehler. "Sécurité et confidentialité des données de traçabilité en milieu de soins". Interbloc 36, no. 3 (July 2017): 153–56. http://dx.doi.org/10.1016/j.bloc.2017.07.005.
Ferrand, J. F., F. Mérat, C. Lemarquand, M. Sarthou-Moutengou, and P. Viance. "Les documents, supports de traçabilité des expositions professionnelles du personnel militaire". Médecine et Armées 40, no. 1 (February 1, 2012): 61–70. http://dx.doi.org/10.17184/eac.6586.
Belleville, Arnaud, Federico Garavaglia, Damien Sevrez, Véronique Mary, Aloïs Tilloy, Didier Scopel, and Hélène Combes. "Réanalyse des chroniques patrimoniales de débit. Évaluation de l'impact et valorisation". La Houille Blanche, no. 5-6 (October 2018): 29–35. http://dx.doi.org/10.1051/lhb/2018048.
Ramousse, O., R. Melingui, and D. Hatsch. "Traçabilité des expositions professionnelles aux produits chimiques grâce à une base de données". Archives des Maladies Professionnelles et de l'Environnement 73, no. 3 (June 2012): 552. http://dx.doi.org/10.1016/j.admp.2012.03.334.
Calixte, Xaviera, Samia Ben Rajeb, and Pierre Leclercq. "Traçabilité de l’usage des outils de conception dans un processus collaboratif". SHS Web of Conferences 47 (2018): 01004. http://dx.doi.org/10.1051/shsconf/20184701004.
Bourret, Christian, and Claudie Meyer. "La culture de l’information comme levier de changement dans le système de santé français. Le rôle des nouvelles organisations d’interface ou une approche coopérative autour de dynamiques de proximité". Articles 44, no. 1 (March 31, 2015): 21–47. http://dx.doi.org/10.7202/1029301ar.
Petit, V., J. J. Huart, G. Quenette, and T. Fargeau. "Traçabilité des données lors de dons de plasmaphérèse–Utilisation pratique du logiciel iTrace à l’EFS Nord de France". Transfusion Clinique et Biologique 20, no. 3 (June 2013): 366–67. http://dx.doi.org/10.1016/j.tracli.2013.03.269.
Beaude, Boris. "(re)Médiations numériques et perturbations des sciences sociales contemporaines". Sociologie et sociétés 49, no. 2 (December 4, 2018): 83–111. http://dx.doi.org/10.7202/1054275ar.
Brochard, M., K. Duhem, T. Geslain, P. L. Gastinel, and J. L. Peyraud. "Phénotypage et génotypage de la composition fine du lait : les filières laitières et la recherche française investissent pour l’avenir". INRAE Productions Animales 27, no. 4 (October 21, 2014): 299–302. http://dx.doi.org/10.20870/productions-animales.2014.27.4.3075.
Sombié, Charles, Kouka Luc Delma, Moussa Ouédraogo, and Rasmané Semdé. "Audit de l’activité de stérilisation des dispositifs médicaux réutilisables dans le centre hospitalier universitaire pédiatrique Charles de Gaulle au Burkina Faso". Journal Africain de Technologie Pharmaceutique et Biopharmacie (JATPB) 1, no. 1 (October 21, 2022): 1–15. http://dx.doi.org/10.57220/jatpb.v1i1.2.
Theses on the topic "Traçabilité des données":
Bazin, Cyril. "Tatouage de données géographiques et généralisation aux données devant préserver des contraintes". Caen, 2010. http://www.theses.fr/2010CAEN2006.
Digital watermarking is a fundamental process for intellectual property protection. It consists in inserting a mark into a digital document through slight modifications. The presence of this mark allows the owner of a document to prove the priority of his rights. The originality of our work is twofold. On the one hand, we use a local approach to ensure a priori that the quality of constrained documents is preserved during watermark insertion. On the other hand, we propose a generic watermarking scheme. The manuscript is divided into three parts. First, we introduce the basic concepts of digital watermarking for constrained data and the state of the art of geographical data watermarking. Second, we present our watermarking scheme for the digital vector maps often used in geographic information systems. This scheme preserves certain topological and metric qualities of the document. The watermark is robust: it is resilient against geometric transformations and cropping. We give an efficient implementation that is validated by many experiments. Finally, we propose a generalization of the scheme to constrained data. This generic scheme will facilitate the design of watermarking schemes for new data types. We give a particular example of its application to relational databases. In order to prove that it is possible to work directly on the generic scheme, we propose two detection protocols directly applicable to any implementation of the generic scheme.
Bossy, Robert. "Édition coopérative de bases de données scientifiques". Paris 6, 2002. http://www.theses.fr/2002PA066047.
Hadjipavlou, Elena. "Big data, surveillance et confiance : la question de la traçabilité dans le milieu aéroportuaire". Thesis, Université Côte d'Azur (ComUE), 2016. http://www.theses.fr/2016AZUR2044/document.
This research project questions, in a comprehensive and critical way, the presence of digital traces in the era of Big Data. This reflection unfolds in the relation between surveillance and trust. In recent years, "Big Data" has been used massively and repeatedly to describe a new societal dynamic characterized by the production of massive quantities of data. Furthermore, enormous potential benefits are expected from new statistical tools for analyzing the data generated by the connected objects and tools involved in more and more human actions. The airport sector is currently facing a major transformation, fueled by the explosion of data within its structure. The data generated during a passenger's journey are now extremely massive. There is no doubt that the management of these data is an important lever for safety, the improvement of services, and the comfort of the passenger. However, the expected benefits raise a great question: where do these data go? We do not know. And as long as we do not know, how can we trust? These considerations are examined at Larnaca airport in Cyprus. The different angles of approach as well as the diversity of the actors required the creation of a multidimensional corpus, resulting from a mixed methodology, in order to take a comprehensive approach to the subject. This corpus includes interviews, questionnaires, and life stories of passengers and professionals. The qualitative and quantitative analysis that followed was based on a previously elaborated theoretical framework, in order to cross-reference the actors' representations of surveillance and trust and, finally, to highlight the different visions inherent in this issue.
Maléchaux, Astrid. "Traçabilité et authentification d'huiles d'olive monovariétales par des outils chimiométriques". Electronic Thesis or Diss., Aix-Marseille, 2020. http://theses.univ-amu.fr.lama.univ-amu.fr/200110_MALECHAUX_773gwajs656jpq766z43jurz_TH.pdf.
Ensuring the authenticity and traceability of food products is important to address consumer concerns and the economic losses generated by food fraud. Moreover, olive oils, especially those whose origin gives them extra added value, are among the products most affected by fraud. However, current official analyses focus only on the purity and quality criteria of the oils, not on the confirmation of their varietal origin. Therefore, this thesis proposes chemometric tools to facilitate varietal recognition of olive oils through routine analyses. On the one hand, by adding a control-chart decision rule to the PLS1-DA model, samples whose characteristics deviate from the reference profile can be identified easily and quickly, even when the predicted classes are unbalanced in size. On the other hand, the development of data fusion strategies makes it possible to benefit from the synergy between complementary sources of information, such as the specific analysis of fatty acids by gas chromatography and global analyses by near- and mid-infrared spectroscopy.
Bendriss, Sabri. "Contribution à l'analyse et la conception d'un système d'information pour la gestion de la traçabilité des marchandises dans un contexte de transport multimodal". Le Havre, 2009. http://www.theses.fr/2009LEHA0024.
One solution for regulating and rationalizing the physical movement of goods is to synchronize the physical flow with its informational counterpart throughout the various links of the transport chain. In this context, a goods tracking and tracing solution can contribute to better mastery of flows. In this thesis, we propose a data modeling approach for goods traceability based on innovative research approaches (PLM, intelligent products, product-centered systems) and taking into account the possibilities offered by NICT in terms of data sharing, auto-identification, and geolocation. Then, in order to integrate our traceability data with the other transport chain data, and also to facilitate the sharing and exchange of these data, we propose the modeling and development of an intermediation platform based on web services. Meeting interoperability and integrability criteria, the result makes it possible, through data exchange and storage mechanisms, to follow and reconstruct the goods' lifecycle in its entirety.
Diallo, Thierno M. L. "Approche de diagnostic des défauts d’un produit par intégration des données de traçabilité unitaire produit/process et des connaissances expertes". Thesis, Lyon 1, 2015. http://www.theses.fr/2015LYO10345.
This thesis, part of the Traçaverre project, aims to optimize product recalls when the production process is not batch-based and produced items have unit-level traceability. The objective is to minimize the number of recalled items while ensuring that all defective items are recalled. We propose an efficient recall procedure that incorporates the possibilities offered by unit traceability and uses a diagnostic function. For complex industrial systems for which human expertise is not sufficient and no physical model is available, unit traceability provides opportunities to better understand and analyse the manufacturing process by re-enacting the life of the product through traceability data. The integration of product and process unit traceability data represents a potential source of knowledge to be implemented and exploited. This thesis proposes a data model for coupling these data, based on two standards, one dedicated to production and the other dealing with traceability. After identifying and integrating the necessary data, we developed a data-driven diagnostic function. This function was built with a learning approach and integrates knowledge about the system to reduce the complexity of the learning algorithm. In the proposed recall procedure, when the equipment causing the fault is identified, the health status of this equipment around the manufacturing time of the defective product is evaluated in order to identify other products likely to present the same defect. The overall approach was applied to two case studies: the first in the glass industry, the second on the Tennessee Eastman process benchmark.
Le, roch Yann. "De l'intérêt de la mise en réseau de données logistiques standardisées : des technologies supports aux business models". Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEM059.
New logistics models, such as the Physical Internet, offer a breakthrough in the way goods are routed inside networks that are more and more interconnected. This interconnection requires innovative, Internet-of-Things-like information systems for tracking and routing those goods within open loops. This double redefinition of physical and digital flows is bound to impact the professions of logistics service providers and their business models. Through action research on behalf of an industrial consortium, we study this question in three steps: by experimenting with standardized track-and-trace technologies used to manage returnable transport items, by identifying the new value logics induced by these new technological means, and finally by observing how these new technologies and businesses impact the logistics services in place. Our results validate the benefit of these new information systems for tracing logistical units within the Physical Internet. A range of new business tactics induced by these new digital and physical models is proposed. However, within the scope of our research, the business models of the logistics networks are not yet impacted by these new digital logics.
Toussaint, Marion. "Une contribution à l'industrie 4.0 : un cadre pour sécuriser l'échange de données standardisées". Electronic Thesis or Diss., Université de Lorraine, 2022. http://www.theses.fr/2022LORR0121.
The recent digital transformation of the manufacturing world has brought numerous benefits, from higher quality products to enhanced productivity and shorter time to market. In this digital world, data has become a critical element in many important decisions and processes within and across organizations. Data exchange is now a key process for organizations' communication, collaboration, and efficiency. Industry 4.0's adoption of modern communication technologies has made this data available and shareable at a quicker rate than we can consume or track it. This speed brings significant challenges such as data interoperability and data traceability, two interdependent challenges that manufacturers face and must understand in order to address them. On the one hand, data interoperability challenges delay innovation and collaboration. The growing volume of data exchange is associated with an increased number of heterogeneous systems that need to communicate with and understand each other. Information standards are a proven solution, yet their long and complex development process prevents them from keeping up with the fast-paced environment they need to support and provide interoperability for, slowing down their adoption. This thesis proposes a transition from predictive to adaptive project management, using Agile methods to shorten development iterations and increase delivery velocity, thereby increasing standards adoption. While adaptive environments have been shown to be a viable way to align standards with the fast pace of industry innovation, most project requirements management solutions have not evolved to accommodate this change. This thesis also introduces a model to support better requirements elicitation during standards development, with increased traceability and visibility. On the other hand, data-driven decisions are exposed to the speed at which tampered data can propagate through organizations and corrupt those decisions.
With the mean time to identify (MTTI) and mean time to contain (MTTC) such a threat already close to 300 days, the constant growth of data produced and exchanged will only push the MTTI and MTTC upwards. While digital signatures have already proven their use in identifying such corruption, there is still a need for a formal data traceability framework to track data exchange across large and complex networks of organizations, in order to identify and contain the propagation of corrupted data. This thesis analyses existing cybersecurity frameworks and their limitations, and introduces a new standards-based framework, in the form of an extended NIST CSF profile, to prepare against, mitigate, manage, and track data manipulation attacks. The framework is accompanied by implementation guidance to facilitate its adoption by organizations of all sizes.
Chichignoud, Aurélien. "Approche méthodologique pour le maintien de la cohérence des données de conception des systèmes sur puce". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS069/document.
The development of highly complex products requires maintaining a huge set of interdependent documents in various formats. Unfortunately, no tool or methodology is available today to systematically maintain consistency between all these documents. Therefore, according to observations made at STMicroelectronics, when a document changes, stakeholders must manually propagate the changes to the impacted set of dependent documents. For various reasons, they may propagate the change poorly, or even not at all. Related documents thereby diverge more and more over time, and realigning this very wide-ranging corpus of documents dramatically impacts productivity. This thesis proposes a methodology to help stakeholders systematically maintain consistency between documents, based on the Architecture Description concept introduced by ISO 42010. First, a model is defined to formally and completely describe correspondences between the Architecture Description Elements of documents. This model is designed to be independent of document formats, the selected system development lifecycle, and industry working methods. Second, these correspondences are analyzed when a document is modified, in order to help stakeholders maintain the consistency of the global corpus. A prototype implementing the proposed approach was developed to evaluate the methodology. 18 subjects volunteered to evaluate the approach, performing two tests (with and without our methodology) involving the correction of inconsistencies introduced into a set of documents. These tests allowed us to measure two variables: the number of inconsistencies corrected and the average time to correct them. According to our study, using the approach helps correct 5.5% more inconsistencies in 3.3% less time.
Ouertani, Mohamed Zied. "DEPNET : une approche support au processus de gestion de conflits basée sur la gestion des dépendances de données de conception". Phd thesis, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00163113.
It is the management of this phenomenon, conflict, that the work presented in this thesis addresses, and more particularly conflict management through negotiation. We propose the DEPNET approach (product Data dEPendencies NETwork identification and qualification) to support the conflict management process based on the management of dependencies between data. The data exchanged and shared between the various stakeholders are the very essence of the design activity and play a key role in the progress of the design process.
This approach provides methodological elements to: (1) identify the negotiation team that will be responsible for resolving the conflict, and (2) manage the impacts of the solution adopted following conflict resolution. The contributions of this research are implemented in the DEPNET software prototype, which we validate on an industrial case study drawn from the design of a turbocharger.
Book chapters on the topic "Traçabilité des données":
Dejean de la Bâtie, Alice. "A l’aube de l’ère pénale sanitaire". In Les épidémies au prisme des SHS, 251–55. Editions des archives contemporaines, 2022. http://dx.doi.org/10.17184/eac.6011.
Julien, H., A. Allonneau, O. Bon, and H. Lefort. "Aspects actuels du triage, du combat à la catastrophe, essai de synthèse". In Médecine et Armées Vol. 46 No. 3, 197–206. Editions des archives contemporaines, 2018. http://dx.doi.org/10.17184/eac.7334.