Dissertations / Theses on the topic 'Indicateurs de qualité des données'
Consult the top 50 dissertations / theses for your research on the topic 'Indicateurs de qualité des données.'
Maddi, Abdelghani. "La quantification de la recherche scientifique et ses enjeux : bases de données, indicateurs et cartographie des données bibliométriques." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCD020/document.
The productivity and "quality" of scientific research are among the central issues of the 21st century in the economic and social world. Scientific research, a source of innovation in all fields, is considered the key to economic development and competitiveness. Science must also contribute to the societal challenges defined, for example, in the Framework Programmes for Research and Technological Development (H2020), such as health, demography and well-being. In order to rationalize public spending on research and innovation and to guide the investment strategies of funders, several indicators have been developed to measure the performance of research entities. Today, no one escapes evaluation, starting with research articles, researchers, institutions and countries (Pansu, 2013; Gingras, 2016). For lack of methodological understanding, quantitative indicators are sometimes misused, neglecting aspects related to their method of calculation or normalization, what they actually represent, or the inadequacies of the databases from which they are calculated. This situation can have disastrous scientific and social consequences. Our work examines the tools of evaluative bibliometrics (indicators and databases) in order to assess the issues at stake in the quantitative evaluation of scientific performance. We show through this research that quantitative indicators can never be used alone to measure the quality of research entities, given the disparities in results across analysis perimeters, the ex-ante problems related to the individual characteristics of researchers that directly affect quantitative indicators, and the shortcomings of the databases from which they are calculated. For a responsible evaluation, it is imperative to accompany quantitative measures with qualitative peer assessment.
In addition, we examined the effectiveness of quantitative measures for understanding the evolution of science and the formation of scientific communities. Our analysis, applied to a corpus of publications dealing with the economic crisis, allowed us to identify the dominant authors and currents of thought, as well as the temporal evolution of the terms used in this field.
Pekovic, Sanja. "Les déterminants et les effets des normes de qualité et d'environnement : analyses microéconométriques à partir de données françaises d'entreprises et de salariés." Phd thesis, Université Paris-Est, 2010. http://tel.archives-ouvertes.fr/tel-00592480.
Pekovic, Sanja. "Les déterminants et les effets des normes de qualité et d’environnement : analyses microéconométriques à partir de données françaises d’entreprises et de salariés." Thesis, Paris Est, 2010. http://www.theses.fr/2010PEST3007/document.
The scope and magnitude of changes occurring in business today have led to great interest in, and widespread adoption of, Quality and Environmental Management approaches. However, despite their prevalence, efforts at understanding the motives for their adoption, as well as their effects on firm and employee outcomes, are still in their infancy. This PhD dissertation provides theoretical and empirical contributions to three research topics dealing with Quality and Environmental Management approaches in the French institutional framework: (1) the determinants of their adoption, (2) their impact on firm outcomes and (3) their impact on employee outcomes. These three aspects make up the three parts of this PhD thesis. In part I, we define and characterise quality and environmental approaches, with a special focus on the ISO 9000 Quality Management standards and the ISO 14000 Environmental Management standards. Furthermore, we empirically examine the determinants of quality and environmental standards adoption. Our findings reveal that the determinants of quality standards differ significantly between manufacturing and service firms, particularly with respect to features of their internal strategy (quality improvement, cost reduction and innovation). However, we also find that firm characteristics (firm size, corporate status and previous experience with similar standards) and features of external strategy (export levels and customer satisfaction) play a significant role in quality standards adoption across both the manufacturing and service sectors. Moreover, we empirically investigate the determinants of chemical firms' registration for the ISO 14001 standard or the Responsible Care program.
We show that most determinants differ between the two systems analysed: while firm size, previous experience with similar standards, information disclosure requirements and customer location are major determinants of ISO 14001 registration, regulatory pressure, past environmental problems and future risks are the main drivers of Responsible Care registration. In part II, we empirically investigate whether quality and environmental standards are related to better firm performance, using various sets of performance measures. The evidence indicates that quality standards positively influence turnover and specific indicators of innovation performance and productivity, but have no impact on profit and some other innovation performance measures. Based on our empirical findings, we conclude that while environmental standards improve turnover and the recruitment of both professional and non-professional employees, they have no effect on profit. Moreover, the research shows that implementing both quality and environmental standards is likely to enhance firm outcomes more than implementing only one standard. Part III focuses on the effect of quality and environmental standards on employee outcomes. The estimation results show that quality standards increase the risk of employee accidents, although they have no effect on working accidents that lead to sick leave. On the other hand, our results lead to the conclusion that environmental standards contribute significantly to the improvement of working conditions, via the reduction of accidents. Furthermore, the evidence shows that environmental standards seem to improve employee well-being. More precisely, employees working for firms certified for an environmental standard report greater feelings of usefulness about their job and declare that they are more often fairly valued in their jobs.
The evidence also shows that employees working for environmentally certified firms do not claim to be significantly more involved in their job, but they are more likely, ceteris paribus, to work uncompensated supplementary hours than "non-green workers".
Januel, Jean-Marie. "Les données de routine des séjours d'hospitalisation pour évaluer la sécurité des patients : études de la qualité des données et perspectives de validation d'indicateurs de la sécurité des patients." Phd thesis, Université Claude Bernard - Lyon I, 2011. http://tel.archives-ouvertes.fr/tel-00690802.
Richa, Jean Raphaël. "Amélioration de l'ingénierie des données dans les environnements connectés grâce à la détection de la propagation et de l'obsolescence des données." Electronic Thesis or Diss., Pau, 2024. http://www.theses.fr/2024PAUU3077.
The relentless expansion of digital technologies has enhanced our capabilities to collect, visualize, analyze, and autonomously manage the environment surrounding us. Intelligent, increasingly connected environments are the prevailing trend within the rising metaverse, where they are driven to gather a plethora of data. These environments contribute to a better understanding of human activities, improve quality of life, and ease paths toward sustainability. This thesis introduces two pivotal concepts, zone continuum and data obsolescence, furthering awareness of connected environments by explaining how data crosses between zones and detecting when data becomes irrelevant with respect to its environment. By addressing these overlooked aspects, we aim to improve data interpretation and management within connected environments. This thesis addresses four key challenges: (1) designing a deep and expressive data model that effectively captures the core components of connected environments, such as hosting zones, smart devices, and their corresponding sensor networks; (2) establishing a mechanism to discover and trace how data propagates between different zones; (3) proposing a large-scale approach to detect data obsolescence while considering the entire environment; and (4) handling the dynamic nature of connected environments, illustrated by the mobility and failure of devices, as well as modifications in zone configurations (e.g., combining multiple zones into one or dividing a single zone into several). To address these challenges, we first propose the HSSN+ data model, which underpins a resilient representation of the elements in connected environments. In light of this model, we put forward two interconnected frameworks. The first is the Zone Continuum Computation for Connected Environments (ZCCCE) framework, which formalizes the zone continuum concept and makes it possible to understand how data moves between zones.
The second is the Data Obsolescence Detection in Connected Environments (DODCE) framework, which offers a structured approach to identifying and computing data obsolescence based on predefined quality indicators. Together, these two frameworks provide opportunities for effective control and understanding of connected environments.
Januel, Jean-Marie. "Les données de routine des séjours d’hospitalisation pour évaluer la sécurité des patients : études de la qualité des données et perspectives de validation d’indicateurs de la sécurité des patients." Thesis, Lyon 1, 2011. http://www.theses.fr/2011LYO10355/document.
Assessing safety among hospitalized patients is a major issue for health services. The development of indicators to measure adverse events related to health care (HAE) is a crucial step, and the main challenge lies in the quality of the data used for this approach. Given the limited reproducibility of measurements and the high cost of studies based on medical record audits, the Patient Safety Indicators (PSI) developed by the Agency for Healthcare Research and Quality (AHRQ) in the United States, using codes from the 9th revision of the International Classification of Diseases, Clinical Modification (ICD-9-CM), offer interesting prospects. Our work addressed five key issues related to the development of these indicators: nosological definition; feasibility and validity of code-based algorithms; quality of medical diagnosis coding using ICD codes; comparability across countries; and the possibility of establishing a benchmark to compare these indicators. Some questions remain, and we suggest several research pathways for improving PSI, based on better-defined PSI algorithms and the use of other data sources to validate PSI (i.e., registry data). In addition, adjustment models including the Charlson index, the average number of coded diagnoses and a positive-predictive-value variable should be considered to control for case-mix variations and differences in coding quality when comparing hospitals or countries.
Chelil, Samy. "Assimilation de données pour améliorer les modèles de qualité de l'eau : vers un indicateur de pression azotée." Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS440.
Non-point-source nitrogen pollution in the agricultural context is mainly due to the over-fertilization of agricultural fields to improve crop yields. Despite considerable efforts at various levels to optimize cultural management practices, the nitrogen surplus in the soil can create a high potential for nitrate leaching toward freshwater resources, especially at the beginning of the winter season. Indeed, the lack of catch crops during this period, high soil mineralization levels (soils often being left bare in summer and fall), and uncontrolled nitrogen management practices increase the soil nitrogen pool, leading to extreme nitrate exports following the first winter precipitations. This thesis develops a new nitrate index based on conceptual modeling of nitrate transfer through agricultural subsurface-drained soils. First, the conceptual NIT-DRAIN model was developed, optimized, and validated to simulate nitrate leaching through agricultural subsurface-drained catchments. Its spatiotemporal robustness was then assessed against nitrate concentrations observed at different time steps (hours, days) at three agricultural sites (La Jaillière, 1 ha; Chantemerle, 36 ha; Rampillon, 355 ha). Only seven input parameters and drainage discharge data are needed to estimate the remaining pool of nitrate at the beginning of winter (RNBW) as a function of the nitrate concentrations measured at the subsurface drainage outlet. Two adjoint sensitivity analyses (local and global) were implemented to determine the influence of the model parameters on the simulated nitrate concentrations. Results indicate that the nitrate transfer velocity (vl) and the nitrogen-sharing factor (α) have a significant impact on the model output.
The variational data assimilation method (4D-Var) was implemented on the NIT-DRAIN model to improve the RNBW estimates and the temporal evolution of the soil nitrogen pool. In addition, an ensemble of sampling frequencies for the nitrate concentration observations (e.g., hourly, daily, monthly, quarterly) was considered to assess their impact on the RNBW estimates; estimation errors become substantial from a monthly sampling frequency onwards. Finally, the performance of the drainage simulation model (SIDRA-RU) was evaluated with a view to a future coupling with the nitrate model (NIT-DRAIN). The adjoint code of the SIDRA-RU model was generated using the automatic differentiation tool TAPENADE to facilitate the implementation of the variational data assimilation method. In the near future, we aim to replace the observed drainage discharge used as input to NIT-DRAIN with the drainage data simulated by SIDRA-RU.
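The variational (4D-Var) idea described in this abstract, fitting an initial state by minimizing a background-misfit term plus observation-misfit terms accumulated over an assimilation window, can be illustrated with a minimal sketch. The toy decay model, weights and variable names below are illustrative assumptions for exposition, not the NIT-DRAIN model itself:

```python
import numpy as np
from scipy.optimize import minimize

def forward_model(x0, n_steps, decay=0.9):
    """Toy linear dynamics standing in for the nitrate transfer model."""
    return np.array([x0 * decay**t for t in range(n_steps)])

def cost_4dvar(x, xb, b_weight, obs, r_weight):
    """J(x0) = weighted background misfit + sum of observation misfits."""
    sim = forward_model(x[0], len(obs))
    jb = b_weight * (x[0] - xb) ** 2          # distance to first guess
    jo = r_weight * np.sum((sim - obs) ** 2)  # distance to observations
    return jb + jo

true_x0 = 5.0
obs = forward_model(true_x0, 10)  # synthetic "observed concentrations"
xb = 3.0                          # imperfect background (first guess)
res = minimize(cost_4dvar, x0=[xb], args=(xb, 0.01, obs, 1.0))
print(res.x[0])  # recovered initial state, close to 5.0
```

With exact observations and a weak background weight, the minimizer pulls the estimate from the first guess (3.0) back to the true initial state; degrading the sampling frequency of `obs` degrades the estimate, which is the effect studied in the thesis.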
Labenne, Amaury. "Méthodes de réduction de dimension pour la construction d'indicateurs de qualité de vie." Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0239/document.
The purpose of this thesis is to develop and propose new dimension-reduction methods to construct composite indicators at the municipal scale. The statistical methodology developed emphasizes the multi-dimensionality of the quality-of-life concept, with particular attention to the treatment of mixed data (quantitative and qualitative variables) and the introduction of environmental conditions. We opt for a variable clustering approach and for a multi-table method (multiple factor analysis for mixed data). These two methods allow us to build composite indicators that we propose as a measure of living conditions at the municipal scale. In order to ease the interpretation of the resulting composite indicators, we introduce a variable selection method based on a bootstrap approach. Finally, we propose an observation clustering method, named hclustgeo, which integrates geographical proximity constraints into the clustering procedure in order to better capture spatial specificities.
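The geographically constrained clustering idea behind hclustgeo can be sketched, under assumptions, as follows: blend a feature-based dissimilarity with a geographic one through a convex combination controlled by a weight alpha, then apply ordinary hierarchical clustering to the blended dissimilarity. This is a simplified Python stand-in, not the actual hclustgeo algorithm (which uses Ward-type linkage on two dissimilarity matrices); all names and the toy data are illustrative:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def geo_constrained_clusters(features, coords, alpha, n_clusters):
    """Hierarchical clustering on a convex mix of feature and
    geographic dissimilarities (alpha = weight given to geography)."""
    d_feat = pdist(features)
    d_geo = pdist(coords)
    # normalize so neither source of dissimilarity dominates
    d_feat = d_feat / d_feat.max()
    d_geo = d_geo / d_geo.max()
    d_mix = (1 - alpha) * d_feat + alpha * d_geo
    tree = linkage(d_mix, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")

# toy data: two pairs of "municipalities", close in both feature
# space and geography within each pair
features = np.array([[0.0], [0.1], [5.0], [5.1]])
coords = np.array([[0, 0], [0, 1], [10, 10], [10, 11]])
labels = geo_constrained_clusters(features, coords, alpha=0.5, n_clusters=2)
print(labels)
```

Raising alpha toward 1 forces spatially compact clusters even when the indicator values disagree, which is the trade-off the method is designed to expose.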
Autreaux-Noppe, Karine. "Contribution méthodologique à la mise en place d'un réseau de surveillance des peuplements phytoplanctoniques des eaux courantes." Lille 1, 2000. https://pepite-depot.univ-lille.fr/RESTREINT/Th_Num/2000/50376-2000-143.pdf.
These variations show that it is at least necessary to standardize, and if possible normalize, the counting and pigment analysis methods. Moreover, to limit pigment degradation between transport and analysis, on-site filtration is not essential, but storage at 4°C in the dark is necessary. Variability at the micro and small spatial scales is low and negligible, indicating that a sample taken in the zone of maximum flow is representative of the sampling site. Conversely, variability at the small temporal scale is high, showing that a higher sampling frequency is needed to obtain a better estimate of the phytoplankton community.
Serrano, Balderas Eva Carmina. "Preprocessing and analysis of environmental data : Application to the water quality assessment of Mexican rivers." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS082/document.
Data obtained from environmental surveys are prone to various anomalies (i.e., incomplete, inconsistent, inaccurate or outlying data). These anomalies affect the quality of environmental data and can have considerable consequences when assessing environmental ecosystems. The selection of data preprocessing procedures is crucial to validate the results of statistical analyses; however, this selection is often ill-defined. To address this question, the thesis focused on data acquisition and data preprocessing protocols in order to ensure the validity of data analysis results and, mainly, to recommend the most suitable sequence of preprocessing tasks. We propose to control every step in the data production process, from collection in the field to analysis. In the case of water quality assessment, this covers the chemical and hydrobiological analysis of samples, producing data that were subsequently analyzed by a set of statistical and data mining methods. The multidisciplinary contributions of the thesis are: (1) in environmental chemistry, a methodological procedure to determine the content of organochlorine pesticides in water samples using SPE-GC-ECD (Solid Phase Extraction – Gas Chromatography – Electron Capture Detector) techniques; (2) in hydrobiology, a methodological procedure to assess water quality in four Mexican rivers using macroinvertebrate-based biological indices; (3) in data science, a method to assess and guide the selection of preprocessing procedures for the data produced in the two previous steps, as well as their analysis; and (4) the development of a fully integrated analytics environment in R for the statistical analysis of environmental data in general, and for water quality data analytics in particular.
Finally, within the context of this thesis, developed between Mexico and France, we applied our methodological approaches to the specific case of water quality assessment of the Mexican rivers Tula, Tamazula, Humaya and Culiacan.
Mechinaud, Lamarche Vadel Agathe. "Elaboration d'indicateurs de mortalité post-hospitalière à différents délais avec prise en compte des causes médicales de décès." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA11T073/document.
The main objective of this PhD work was to investigate different methodological options for the elaboration of post-hospital mortality indicators aiming to reflect quality of care, in particular to identify the most relevant timeframes and to assess the contribution of causes-of-death information. In a first phase, the hospital discharge data of French General health insurance scheme beneficiaries who died during the year following a hospital stay in 2008 or 2009 were linked to the causes-of-death register, with a matching rate of 96.4%. In a second phase, the hospital stays for which the underlying cause of death could be qualified as independent from the main diagnosis were identified with an algorithm and a software program relying on international standards. In a third phase, the method most widely used to assess in-hospital mortality (the Dr Foster Unit method) was reproduced and used to construct hospital mortality indicators at 30, 60, 90, 180 and 365 days post-admission for the year 2009 (12 322 831 acute-care stays). As in other countries, in-hospital mortality proved biased by discharge patterns in the French data: hospitals with short lengths of stay or high transfer-out rates for a comparable case mix tend to have lower in-hospital mortality. The 60-day and 90-day indicators should be preferred to the 30-day indicator, because they reflect a larger share of in-hospital mortality and are less subject to incentives either to keep patients alive until the end of the follow-up window or to shift resources away once that length of stay is reached.
The contribution of the causes of death seems negligible in the context of hospital-wide indicators, but it could prove useful in future health services research on specific indicators limited to selected conditions or procedures. However, there are reservations about the relevance of hospital-wide mortality indicators for assessing quality of care (limits of the statistical model and of the available adjustment variables, heterogeneity of coding quality between hospitals). Further research is needed, in particular on the capacity of these indicators to reflect quality of care and on the impact of their public reporting. To date, the use of hospital-wide mortality indicators must be extremely cautious.
Grac, Corinne. "Fouille temporelle des indicateurs physico-chimiques et biologiques pour l'évaluation de l'état, des pressions et de la capacité de résilience des rivières." Thesis, Strasbourg, 2019. http://www.theses.fr/2019STRAH015.
Data from river assessment are big data with complex relationships. Unsupervised data mining methods can be applied to them and give relevant results for river management, provided a close collaboration exists between hydroecologists and computer scientists. We extracted partially ordered patterns from temporal sequences of physico-chemical pressures preceding a biological state. These temporal patterns make it possible to identify some of the pressures involved (or not) in a degraded ecological status, to specify the importance of the length of the sequences preceding a biological assessment, and to identify the characteristic pressure categories at a regional scale. To go further, we plan to extend these patterns to hydromorphological pressures.
Nesvijevskaia, Anna. "Phénomène Big Data en entreprise : processus projet, génération de valeur et Médiation Homme-Données." Thesis, Paris, CNAM, 2019. http://www.theses.fr/2019CNAM1247.
Big Data, a sociotechnical phenomenon carrying myths, is reflected in companies by the implementation of first projects, especially Data Science projects. However, they do not seem to generate the expected value. The action research carried out over 3 years in the field, through an in-depth qualitative study of multiple cases, points to key factors that limit this generation of value, including overly self-contained project process models. The result is (1) an open, usage-oriented data project model (Brizo_DS), including knowledge capitalization, intended to reduce the uncertainties inherent in these exploratory projects and transferable to the scale of corporate data project portfolio management. It is complemented by (2) a tool for documenting the quality of the processed data, the Databook, and (3) a Human-Data Mediation device, which together guarantee the alignment of the actors toward an optimal result.
Ferhat, Fouad. "Une analyse économique de la qualité et de l'efficience des universités et des systèmes universitaires : une comparaison au niveau international." Thesis, Paris 1, 2016. http://www.theses.fr/2016PA01E040/document.
This thesis aims to analyze economically the quality and efficiency of universities and university systems in an international comparison, using input/output indicators and the Data Envelopment Analysis (DEA) method. The thesis is composed of four chapters. The first chapter, "University rankings: a critical perspective", presents and evaluates the relevance of the input/output indicators used by most university rankings. It presents a number of criticisms found in the literature and focuses on a methodological problem common to the rankings: the use of inputs as measures of university quality. This practice confuses means and results and ignores the basic concepts of accounting models in terms of production functions and efficiency. The second chapter, "Characteristics and rankings of universities: around some factors that can explain the differences in performance between universities", compares the results of two rankings, QS-Times and Shanghai, and offers a list of factors that may explain the differences in quality between universities according to these rankings. [...] The third chapter, "Performance and efficiency of universities and their determinants: an evaluation using world university rankings and DEA methodology", evaluates, on the basis of a DEA methodology, the efficiency of 214 universities from 13 different countries, in order to determine whether the top-ranked universities in traditional rankings are also those that best utilize their financial and human resources. [...] The fourth chapter, "Efficiency of university systems in 35 countries and its determinants: an assessment by DEA methodology and the calculation of Malmquist indices (2006-2012)", assesses the efficiency and performance of the university systems of 35 countries.
It offers new overall efficiency scores that complement the first two studies on this topic in the literature, by Agasisti (2011) and St. Aubyn et al. (2009). Compared to Agasisti (2011), we identify five new developments in our study: the sample is larger (35 countries instead of 18), the observation period is updated, the evolution of efficiency between two periods is calculated, the number of inputs and outputs incorporated into each model is higher, and a specific model for evaluating research efficiency is proposed. Our study confirms the thesis that the university systems of Switzerland and the United Kingdom are the most efficient. Based on the calculation of Malmquist indices between 2006 and 2012, it also shows that the teaching efficiency of the 35 university systems reviewed tends to decline, while research and attractiveness-reputation efficiency tend to increase. This allows a better assessment of the impact on university systems of reforms inspired by the Shanghai ranking. These reforms have led the academic staff of universities to abandon their focus on teaching in favor of research activities.
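The core DEA computation behind such efficiency scores can be sketched as a linear program: for each decision-making unit, find the smallest factor theta by which its inputs can be scaled while a non-negative combination of all units still matches its outputs (the input-oriented, constant-returns CCR model). The sketch below, with hypothetical toy data, is a minimal illustration of that generic model, not the thesis's actual input/output specification:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (m, n) inputs, Y: (s, n) outputs, columns = decision units."""
    m, n = X.shape
    s = Y.shape[0]
    # variables: [theta, lambda_1..lambda_n]; objective: minimize theta
    c = np.zeros(n + 1)
    c[0] = 1.0
    # input constraints:  X @ lam - theta * x_o <= 0
    a_in = np.hstack([-X[:, [o]], X])
    # output constraints: -Y @ lam <= -y_o  (i.e. Y @ lam >= y_o)
    a_out = np.hstack([np.zeros((s, 1)), -Y])
    a_ub = np.vstack([a_in, a_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=a_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

# toy data: 3 "universities", 1 input (budget), 1 output (publications)
X = np.array([[2.0, 4.0, 3.0]])
Y = np.array([[2.0, 2.0, 3.0]])
print([round(dea_ccr_efficiency(X, Y, o), 2) for o in range(3)])
```

A score of 1 marks a unit on the efficient frontier; here the second unit produces the same output as the first from twice the input, so its efficiency is 0.5.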
Bertillot, Hugo. "La rationalisation en douceur : sociologie des indicateurs qualité à l’hôpital." Thesis, Paris, Institut d'études politiques, 2014. http://www.theses.fr/2014IEPP0042/document.
Since the late 1990s, the French hospital sector has faced neomanagerial reforms aimed at strengthening state control over care organizations and medical activities. The thesis examines these contemporary transformations through the deployment of indicators measuring the quality of medical care, in order to enhance transparency and improve quality. How do these evaluation devices affect the institutional regulation of healthcare, the organization of hospitals and the professional autonomy of medical practitioners? The thesis describes the organizational, instrumental and cognitive dimensions of public action, through a qualitative methodology crossing interviews with institutional leaders, analysis of written sources and investigations in four hospitals. It traces the career of these instruments, analyzes their technical and cognitive properties and characterizes their diverse social uses. These indicators emerged in an institutional context deeply affected by nosocomial infections and hospital rankings. They are made of hybrid knowledge. Multiple intermediaries between state and professional actors shaped and legitimized them, and they were carefully but massively disseminated in the 2000s. Through their institutionalization, they instill formalization, control, traceability and auditability in hospitals. By moving the focus from the visible and constraining effects of payment instruments to these more discreet mechanisms, the thesis shows how quality evaluation softly rationalizes professional bureaucracies.
Brocolini, Laurent. "Caractérisation de l'environnement sonore urbain : Proposition de nouveaux indicateurs de qualité." Phd thesis, Université de Cergy Pontoise, 2012. http://tel.archives-ouvertes.fr/tel-00855265.
Full textNajjar-Pellet, Josette. "Les infections nosocomiales : indicateurs de qualité en réanimation et en chirurgie ?" Lyon 1, 2008. http://www.theses.fr/2008LYO10164.
Nosocomial infections (NI) are infections that occur during or after patient care. With the development of evaluation and accreditation initiatives, these infections started to be used as indicators of quality of care. However, the validity of these rates is the subject of scientific debate. This work was carried out in the framework of the clinical research project Noso.Qual, which aims to study the link between structure and process indicators and NI as performance indicators in intensive care and in surgery. The design of the scores required a literature review, expert consultations and field surveys. The scores were organized into seven dimensions: Human Resources, Architecture, Safety and Environment, Management of documentation, Patient care management, Risk management of infections, and Evaluation and Surveillance. Scores were compared to surveillance data using Poisson regression. Only two dimensions, Architecture and Management of documentation, were significantly associated with NI, with a protective effect. The overall scores were significantly associated with NI in intensive care units for all the infection sites considered, but not in surgery. The results obtained in intensive care show that NI would be a valid indicator, which could support the construction of indicators already initiated in France and other countries. The results in surgery should not call into question either the principle of NI surveillance or the necessity of evaluating prevention actions and improving professional practice.
Choquet, Rémy. "Partage de données biomédicales : modèles, sémantique et qualité." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2011. http://tel.archives-ouvertes.fr/tel-00824931.
Full textBen, salem Aïcha. "Qualité contextuelle des données : détection et nettoyage guidés par la sémantique des données." Thesis, Sorbonne Paris Cité, 2015. http://www.theses.fr/2015USPCD054/document.
Nowadays, complex applications such as knowledge extraction, data mining, e-learning or web applications use heterogeneous and distributed data. The quality of any decision depends on the quality of the data used. The absence of rich, accurate and reliable data can potentially lead an organization to make bad decisions. The subject covered in this thesis aims at assisting the user in his quality approach. The goal is to better extract, mix, interpret and reuse data. For this, the data must be related to its semantic meaning, data types, constraints and comments. The first part deals with the semantic schema recognition of a data source. This enables the extraction of data semantics from all the available information, including the data and the metadata. It consists, firstly, of categorizing the data by assigning it to a category and possibly a sub-category and, secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies. The second part is data cleansing, using the reports on anomalies returned by the first part. It allows corrections to be made within a column itself (data homogenization), between columns (semantic dependencies), and between lines (eliminating duplicates and similar data). Throughout this process, recommendations and analyses are provided to the user.
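As a rough illustration of the first part's idea (attaching a semantic category to a column, then flagging non-conforming values as anomalies), here is a minimal sketch. The category names, the regular expressions and the majority-vote rule are assumptions made for the example, not the method of the thesis.

```python
import re

# Hypothetical semantic categories, each defined by a validation pattern.
CATEGORIES = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "date_iso": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone_fr": re.compile(r"^0\d([ .-]?\d{2}){4}$"),
}

def infer_category(column):
    """Assign the column to the category matched by most of its values."""
    best, best_hits = None, 0
    for name, pattern in CATEGORIES.items():
        hits = sum(1 for v in column if pattern.match(v))
        if hits > best_hits:
            best, best_hits = name, hits
    return best

def report_anomalies(column, category):
    """Values that do not conform to the inferred category are anomalies."""
    pattern = CATEGORIES[category]
    return [v for v in column if not pattern.match(v)]

phones = ["06 12 34 56 78", "0612345678", "not-a-phone", "01-22-33-44-55"]
cat = infer_category(phones)          # majority of values look like phones
bad = report_anomalies(phones, cat)   # the remaining values are reported
```

The anomaly report (`bad`) is what a cleansing step would then try to correct or homogenize.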
Bard, Sylvain. "Méthode d'évaluation de la qualité de données géographiques généralisées : application aux données urbaines." Paris 6, 2004. http://www.theses.fr/2004PA066004.
Marzin, Anahita. "Indicateurs biologiques de la qualité écologique des cours d'eau : variabilités et incertitudes associées." Phd thesis, AgroParisTech, 2013. http://pastel.archives-ouvertes.fr/pastel-00879788.
Marzin, Anahita. "Indicateurs biologiques de la qualité écologique des cours d’eau : variabilités et incertitudes associées." Thesis, Paris, AgroParisTech, 2013. http://www.theses.fr/2013AGPT0002/document.
Sensitive biological measures of ecosystem quality are needed to assess, maintain or restore the ecological conditions of rivers. Since our understanding of these complex systems is imperfect, river management requires recognizing the variability and uncertainty of bio-assessment for decision-making. Based on the analysis of national data sets (~1654 sites), the main goals of this work were (1) to test some of the assumptions that shape bio-indicators and (2) to address the temporal variability and the uncertainty associated with the prediction of reference conditions. (1) This thesis highlights (i) the predominant role of physiographic factors in shaping biological communities in comparison to human pressures (defined at catchment, riparian-corridor and reach scales), (ii) the differences in the responses of biological indicators to the different types of human pressures (water quality, hydrological and morphological degradations) and (iii) more generally, the greater biological impacts of water-quality alterations and impoundments. (2) A Bayesian method was developed to estimate the uncertainty associated with the reference-condition predictions of a fish-based bio-indicator (IPR+). IPR+ predictive uncertainty was site-dependent but showed no clear trend related to the environmental gradient. By comparison, IPR+ temporal variability was lower and sensitive to an increase in human pressure intensity. This work confirmed the advantages of multi-metric indexes based on functional metrics in comparison to compositional metrics. The different sensitivities of macrophytes, fish, diatoms and macroinvertebrates to human pressures emphasize their complementarity in assessing river ecosystems. Nevertheless, future research is needed to better understand the effects of interactions between pressures, and between pressures and the environment.
Fall, Samba. "Economie du droit et indicateurs de qualité dans le domaine de la justice." Paris 10, 2011. http://www.theses.fr/2011PA100106.
Michel, Philippe. "Approche métrologique de l'utilisation des indicateurs de performance en santé." Bordeaux 2, 2001. http://www.theses.fr/2001BOR28896.
Health care performance indicators are used to identify deficiencies in health care, to compare structures or activities and to follow performance over time. Assessing indicators is difficult, partly because there are conceptual ambiguities concerning the classical measurement properties of validity and reliability. The objectives of this thesis are to propose a revised measurement conceptual framework, to include it in guidelines for the utilisation of performance indicators, and to illustrate the application of these guidelines. We defined four measurement properties (validity, stability, homogeneity and coherence) relevant for the assessment of all possible sources of variability. Our strategy for assessing indicators, which is coherent and logical for the development and evaluation of indicators, is theoretically applicable to all measurement tools. Our integrative guidelines, based on ten steps, explore the measurement properties as well as suitability and feasibility issues. The appropriateness of the proposed framework is illustrated using our work on indicators of performance of pain management, cardiac surgery, preoperative prescription appropriateness and ambulatory surgery wards. The framework was operational, although the four kinds of variability, separated for didactic reasons, were not always analyzed separately. We believe that the wide array of indicators studied argues for the applicability of our framework to all performance indicators in health care.
Zelasco, José Francisco. "Gestion des données : contrôle de qualité des modèles numériques des bases de données géographiques." Thesis, Montpellier 2, 2010. http://www.theses.fr/2010MON20232.
A Digital Surface Model (DSM) is a numerical surface model formed by a set of points, arranged as a grid, used to study some physical surface: Digital Elevation Models (DEM), or other possible applications such as a face or some anatomical organ. The study of the precision of these models, which is of particular interest for DEMs, has been the object of several studies in the last decades. Measuring the precision of a DSM, in relation to another model of the same physical surface, consists in estimating the expectation of the squares of differences between pairs of points, called homologous points, one in each model, corresponding to the same feature of the physical surface. But these pairs are not easily discernible: the grids may not be coincident, and the differences between homologous points corresponding to benchmarks on the physical surface might be subject to special conditions, such as more careful measurements than on ordinary points, which imply a different precision. The generally used procedure to avoid these inconveniences has been to use the squares of vertical distances between the models, which only addresses the vertical component of the error, thus giving a biased estimate when the surface is not horizontal. The Perpendicular Distance Evaluation Method (PDEM), which avoids this bias, provides estimates for the vertical and horizontal components of errors, and is thus a useful tool for detecting discrepancies in Digital Surface Models such as DEMs. The solution includes a special reference to the simplification which arises when the error does not vary in all horizontal directions. The PDEM is also assessed with DEMs obtained by means of the SAR interferometry technique.
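The bias the abstract attributes to vertical distances can be shown numerically on a synthetic sloping profile. The 1-D grid and the simple cosine projection used to remove the bias are assumptions of this sketch, not the PDEM itself.

```python
import math

def vertical_rmse(z1, z2):
    """Classical estimate: RMS of vertical differences at grid nodes."""
    n = len(z1)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(z1, z2)) / n)

# Two hypothetical 1-D profiles of the same sloping surface (slope m),
# the second displaced by a true perpendicular error d.
m, d = 1.0, 0.5                       # 45-degree slope, 0.5 m perpendicular error
xs = [0.1 * i for i in range(100)]
z1 = [m * x for x in xs]
# A perpendicular shift of d appears vertically as d / cos(theta) = d * sqrt(1 + m^2).
z2 = [m * x + d * math.sqrt(1 + m * m) for x in xs]

est = vertical_rmse(z1, z2)               # about 0.707: overestimates d = 0.5
bias_free = est * math.cos(math.atan(m))  # project back onto the normal: 0.5
```

On horizontal ground (`m = 0`) the two estimates coincide; the steeper the terrain, the larger the overestimation of the vertical-distance approach.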
Le Pape, Cécile. "Contrôle de qualité des données répliquées dans un Cluster." Paris 6, 2005. http://www.theses.fr/2005PA066433.
Aubé, Lucien. "Identification des indicateurs de qualité des interventions en formation sur mesure auprès des organisations." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ32571.pdf.
Lassoued, Yassine. "Médiation de qualité dans les systèmes d'information géographique." Aix-Marseille 1, 2005. http://www.theses.fr/2005AIX11027.
Muhlenbach, Fabrice. "Evaluation de la qualité de la représentation en fouille de données." Lyon 2, 2002. http://demeter.univ-lyon2.fr:8080/sdx/theses/lyon2/2002/muhlenbach_f.
Knowledge discovery tries to produce novel and usable knowledge from databases. In this whole process, data mining is the crucial machine learning step, but we must first ask some questions: how can we have an a priori idea of whether the labels of the class attribute are separable or not? How can we deal with databases where some examples are mislabeled? How can we transform continuous predictive attributes into discrete ones in a supervised way, taking into account the global information of the data? We propose some responses to these problems. Our solutions take advantage of the properties of geometrical tools: neighbourhood graphs. The neighbourhood between examples projected in a multidimensional space gives us a way of characterising the likeness between the examples to learn. We develop a statistical test based on the weight of the edges that must be suppressed from a neighbourhood graph so that only single-class subgraphs remain. This gives information about the a priori class separability. This work is carried on in the context of the detection of examples in a database that have doubtful labels: we propose a strategy for removing these doubtful examples from the learning set, or relabeling them, to improve the quality of the resulting predictive model. These researches are extended to the special case of a continuous class to learn: we present a structure test to predict this kind of variable. Finally, we present a supervised polythetic discretization method based on neighbourhood graphs and we show its performance by using it with a new supervised machine learning algorithm.
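The core idea (count the edges of a neighbourhood graph that join differently labeled examples; few such edges suggest separable classes) can be sketched as follows. The k-nearest-neighbour graph and the two-cluster toy data are illustrative assumptions; the thesis's statistical test on edge weights is not reproduced here.

```python
import math

def neighbourhood_graph(points, k=2):
    """Undirected edges from each point to its k nearest neighbours
    (a toy stand-in for the geometric neighbourhood graphs of the thesis)."""
    edges = set()
    for i, p in enumerate(points):
        dists = sorted((math.dist(p, q), j) for j, q in enumerate(points) if j != i)
        for _, j in dists[:k]:
            edges.add((min(i, j), max(i, j)))
    return edges

def cut_edges(edges, labels):
    """Edges that must be suppressed so only single-class subgraphs remain."""
    return [e for e in edges if labels[e[0]] != labels[e[1]]]

# Hypothetical 2-D data: two well-separated classes.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
lab = ["a", "a", "a", "b", "b", "b"]
g = neighbourhood_graph(pts)
cut = cut_edges(g, lab)   # empty here: the classes look separable a priori
```

Mislabeled examples show up as the endpoints of many cut edges, which is what the relabeling strategy described above exploits.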
Sall, Serigne Touba. "Approche par les données de panel dans la théorie asymptotique des indicateurs de pauvreté." Paris 6, 2012. http://www.theses.fr/2012PA066286.
The aim of the thesis is to study the limit distributions of estimators that arise in the analysis of poverty, called poverty indices. We are especially concerned with the asymptotic theory of the time-dependent general poverty index (GPI), which includes all the usual indices in the literature. As a first step, we consider the indices at a fixed time. Using extreme value theory and Hungarian approximations, we derive asymptotic laws for the GPI, and entirely describe the asymptotic normality of this class. These results have natural applications in deriving asymptotic confidence intervals for indices based on data collected within developing countries. However, we still need to handle the case where the poverty situation is analysed over a continuous period of time. We are then faced with longitudinal data, called panel data, and led to consider the time-dependent general poverty index. Based on the weak convergence theory for empirical processes developed by van der Vaart and Wellner (1995), we establish the uniform weak convergence of such statistics and obtain uniform asymptotic laws of the GPI. Our results yield tools to handle discrete and continuous longitudinal data. As an application, we used the Senegalese household survey data ESAM (enquête sénégalaise auprès des ménages), available for two periods, 1996 and 2001. In this way, comparisons can be made between different regions at a fixed moment in time, or between the situations of the same region at different time instants.
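Among the "usual indices" covered by the GPI is the Foster-Greer-Thorbecke family. A minimal sketch, with a made-up income sample and poverty line, is:

```python
def fgt_index(incomes, z, alpha):
    """Foster-Greer-Thorbecke poverty index, one member of the general class
    studied in the thesis: alpha = 0 gives the headcount ratio, alpha = 1 the
    poverty gap, alpha = 2 the severity of poverty."""
    n = len(incomes)
    return sum(((z - y) / z) ** alpha for y in incomes if y < z) / n

# Hypothetical income sample with poverty line z = 100.
sample = [40, 80, 120, 150, 60, 90]
headcount = fgt_index(sample, 100, 0)   # share of poor: 4/6
gap = fgt_index(sample, 100, 1)         # mean relative shortfall: 1.3/6
```

The asymptotic results of the thesis concern exactly such sample statistics: confidence intervals for the index as the sample grows, uniformly over time for panel data.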
Duchesne, Karine. "La qualité des emplois au sein d'entreprises d'économie sociale indicateurs objectifs et perceptions des acteurs." Mémoire, Université de Sherbrooke, 2006. http://savoirs.usherbrooke.ca/handle/11143/2479.
Cécillon, Lauric. "Quels indicateurs pour évaluer la qualité de sols forestiers soumis à des contraintes environnementales fortes ?" Phd thesis, Grenoble 1, 2008. http://www.theses.fr/2008GRE10149.
Soils in mountain regions face strong climatic and land-use changes that make them particularly sensitive to three threats: erosion, loss of organic matter and loss of biodiversity. This thesis addresses the question of soil quality from the angle of key processes (decomposition and aggregation of organic matter) within different compartments of the superficial part of the living soil (epipedon). These key processes govern the intensity of the services provided by soils, such as carbon sequestration, soil fertility and the maintenance of biological activity. Among the indicators used in soil quality assessment, we chose to test simple and composite indicators related to the key processes of two types of ecosystems under strong environmental constraints. Two study sites, located in the Dévoluy (Isère) and Maures (Var) massifs, were used. Both are marked by a strong vegetation gradient induced by natural constraints (permafrost soils at the Dévoluy site) or anthropogenic ones (wildfires at the Maures site). In this work, we formulate three hypotheses: (i) soils carry an original signature of the environmental changes influencing the functioning of forest ecosystems; (ii) this signature is linked to soil quality and includes a largely underestimated biological component; (iii) soil reflectance summarizes this signature and allows its characterization under contrasting conditions. The results of the thesis show that the constraints linked to pedoclimate and wildfires induce strong modifications of the variables describing decomposition and aggregation processes in soils. The biological compartment of the soil (microflora or microfauna) is particularly affected by these constraints, revealing modifications in processes linked to soil food webs (preferential pathways of decomposition and biological aggregation, potential nitrification and denitrification). Near-infrared spectroscopy proves to be a relevant tool for reporting modifications not only in the physico-chemical but also in the biological quality of soils subjected to strong constraints.
Cécillon, Lauric. "Quels indicateurs pour évaluer la qualité de sols forestiers soumis à des contraintes environnementales fortes ?" Phd thesis, Université Joseph Fourier (Grenoble), 2008. http://tel.archives-ouvertes.fr/tel-00396553.
Al Chami, Zahi. "Estimation de la qualité des données multimedia en temps réel." Thesis, Pau, 2021. http://www.theses.fr/2021PAUU3066.
Over the past decade, data providers have been generating and streaming a large amount of data, including images, videos, audio, etc. In this thesis, we focus on processing images, since they are the most commonly shared by users on the global inter-network. In particular, treating images containing faces has received great attention due to its numerous applications, such as entertainment and social media apps. However, several challenges can arise during the processing and transmission phases: firstly, the enormous number of images shared and produced at a rapid pace requires a significant amount of time to be processed and delivered; secondly, images are subject to a wide range of distortions during processing, transmission, or a combination of many factors that could damage the images’ content. Two main contributions are developed. First, we introduce a Full-Reference Image Quality Assessment Framework in Real-Time, capable of: 1) preserving the images’ content by ensuring that some useful visual information can still be extracted from the output, and 2) providing a way to process the images in real time in order to cope with the huge amount of images that are being received at a rapid pace. The framework described here is limited to processing those images that have access to their reference version (a.k.a. Full-Reference). Secondly, we present a No-Reference Image Quality Assessment Framework in Real-Time. It has the following abilities: a) assessing the distorted image without having its distortion-free version, b) preserving the most useful visual information in the images before publishing, and c) processing the images in real time, even though No-Reference image quality assessment models are considered very complex. Our framework offers several advantages over the existing approaches, in particular: i. it locates the distortion in an image in order to directly assess the distorted parts instead of processing the whole image; ii. it has an acceptable trade-off between quality prediction accuracy and execution latency; and iii. it can be used in several applications, especially those that work in real time. The architecture of each framework is presented in the corresponding chapters, detailing its modules and components. Then, a number of simulations are made to show the effectiveness of our approaches in solving these challenges, in comparison with the existing approaches.
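A classical example of the full-reference scores such a framework relies on is the peak signal-to-noise ratio. This sketch uses flat pixel lists and made-up values; it stands in for, and is not, the metrics actually used in the thesis.

```python
import math

def psnr(reference, distorted, peak=255):
    """Peak signal-to-noise ratio: a simple full-reference quality score.
    Higher values mean the distorted image is closer to the reference."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, distorted)) / len(reference)
    if mse == 0:
        return float("inf")   # identical images
    return 10 * math.log10(peak * peak / mse)

ref = [52, 55, 61, 59, 79, 61, 76, 41]                    # hypothetical pixel row
noisy = [v + e for v, e in zip(ref, [2, -1, 0, 3, -2, 1, 0, -1])]
score = psnr(ref, noisy)   # around 44 dB for this mild distortion
```

The full-reference setting described in the abstract is exactly this situation: the score can only be computed when the distortion-free version is available.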
Mahmoud, Salwa. "Les indicateurs de la recherche scientifique en Tunisie. Etude biblio-scientométrique du secteur médical." Aix-Marseille 3, 1998. http://www.theses.fr/1998AIX30053.
Masselot, Gérard. "La synécoparcimonie : un outil d'évaluation biologique de la qualité des eaux courantes : Théorie et applications." Paris, Muséum national d'histoire naturelle, 2005. http://www.theses.fr/2002MNHN0027.
A new freshwater biomonitoring tool, the synecoparsimony method, is proposed and tested. The methodological bases are explained. Its validity is tested on several real cases of various geographical origins (Nearctic and West Palearctic). It is shown that this method can be used to analyze faunistic as well as microfloristic data. The new tool makes it possible to minimize ad hoc hypotheses. It enables direct and rigorous confrontation between biological data and the mesological characteristics of the rivers. The concept of “bio-indicator” taxa is discussed, and the concept of “significant taxa” is proposed. The new European freshwater biomonitoring tools are studied and criticized. It is shown that the newly suggested method can allow a relevant approach to the quality of water and/or of aquatic environments. Its field of validity is specified, and the complementary studies necessary to improve this new tool are presented. The need for “total evidence” matrices of qualitative biological data, necessarily including “rare” taxa, is shown. The species level of taxon determination is confirmed as being the most informative. The method we propose could be integrated as a complementary tool available to freshwater managers in Europe.
Beauger, Davy. "Le retransqol : une échelle de mesure de la qualité de vie spécifique aux patients porteurs d'un greffon rénal fonctionnel. : Développement, adaptation et application." Thesis, Aix-Marseille, 2014. http://www.theses.fr/2014AIXM5057.
The inclusion of the concept of quality of life (QOL) is indicative of a profound change in the way of practicing medicine, particularly in the field of nephrology for patients with end-stage renal disease (ESRD). Given the prevalence, incidence and mortality of this disease in France, it seemed important, even essential, to measure properly, appropriately and consistently the QOL of patients with ESRD. Health-related quality of life (HRQOL) is therefore an important outcome indicator for evaluating the consequences of this disease, the effect of medical procedures, treatment effects, or the impact of health policies. In 2007, a literature review of scales assessing the QOL of patients with ESRD revealed a certain lack, quantitative and qualitative, of specific QOL questionnaires validated in French, especially for patients with a functioning kidney transplant. In 2008, a specific scale was developed and validated to measure the QOL of renal transplant recipients: the ReTransQol (Renal Transplant Quality of Life questionnaire). After 5 years of use and application of the ReTransQol in different national studies, this tool was improved and a new version was created: the ReTransQol V2 (or RTQ V2). After extensive analyses, this scale currently has good psychometric properties and has been validated in various populations. The RTQ V2 is also used in international studies (Brazil, Germany, Canada, etc.), and a cross-cultural validation of the scale is planned. The ReTransQol V2 is a specific tool for assessing HRQOL and is suitable for routine use among renal transplant recipients.
Gilliers, Camille. "Recherche d'indicateurs de la qualité des écosystèmes côtiers : application aux nourriceries côtières et estuariennes des poissons plats." Littoral, 2004. http://www.theses.fr/2004DUNK0145.
Inshore shallow waters and estuaries along much of the western coasts of France provide nursery areas, identified as Essential Fish Habitat, for a wide variety of fish species and especially commercial flatfish. Worldwide, increasing industrial development and human population growth have led to rising environmental damage in the marine environment, and human pressure on marine habitats remains strong. This is especially true of coastal marine areas, which are particularly subject to the adverse effects of anthropogenic disturbances. As damage to habitat quality is one of the most harmful means of slowing or preventing stocks from recovering, one of the main areas of emphasis in current fisheries ecological research is the conservation and enhancement of Essential Fish Habitats. The main goal of the study was to find suitable bioindicators, measured on fish, to assess the habitat quality of fish nurseries along the Eastern English Channel and Bay of Biscay coasts, with a focus on sites heavily impacted by human activities such as the major estuaries and harbour areas. Subsequently, biological indicators measured both at the individual (growth, size, condition) and population (density) levels were used to assess the biological performances of juvenile fishes and the habitat quality of different nursery areas. The results demonstrate that it is necessary to use a pool of bioindicators to assess the quality of habitats, as no single measure can provide an overall description of habitat quality. Concerning the habitat quality of the fish nurseries along the western coasts of France, the analyses point to the estuaries of the Seine and Gironde, where levels of contamination of coastal waters are especially high, and where the combination of fish growth performance and density is significantly lower than in other nursery areas. Nevertheless, the results also suggest that it is more pertinent to work on a small spatial scale to increase the sensitivity and the usefulness of the bioindicators, especially to avoid problems related to geographical gradients and to estimate eventual differences due to anthropogenic perturbations of habitat at a local scale. The combination of several bioindicators measured on juvenile fishes appears to provide reliable quantitative indicators of the quality of habitat on the nursery grounds, and notably to point out disturbances in survival and growth in especially polluted areas. Hence, such indicators may contribute to improving the assessment of the environmental quality of essential fish habitats, with the aim of a sustainable management of fisheries resources.
Beryouni, Khadija. "Variabilité des données environnementales : exemple des sédiments marins." Perpignan, 2012. http://www.theses.fr/2012PERP1088.
The work presented in this thesis aims to optimize the study of marine sediments with a quality assurance objective. To illustrate this approach, I focused on the fine fraction of sediment. In sedimentology, the fine fraction is the fraction of the sediment made up of particles whose size is less than a given threshold (2, 10, 20, 40, 50, 63, 80… µm). Its percentage is a fundamental parameter for many studies of sediments because it is used in geochemistry, geophysics, soil science and archeology alike. Today, two competing and complementary approaches are used in laboratories to determine the fine-fraction rate: sieving, an old method but one which has proven itself, and laser diffraction, a new method. Each one is praised or vilified by many technicians and/or researchers. To better understand this "casus belli" between sedimentologists, I first questioned all the steps of obtaining the fine-fraction rate, to identify the sources of approximation and errors that could possibly taint the result. This work was carried out both for the wet-sieving method and for the laser diffraction method. In a second step, I tried to highlight all the approximations that are made in the chain that links the design of the offshore sampling plan to the interpretation. Thus, the influence of the sampling scheme, the sampling system, and the spatial variability at different scales (from meters to hundreds of meters) have been studied. For universality, several sedimentary facies were studied at two sites with very different environmental characteristics: the Oualidia lagoon, studied in the framework of a Franco-Moroccan cooperation (MA/07/179), and the eastern Bay of Seine, within the program "Colmatage" funded by the GIP "Seine Aval". The results will help to optimize the structure of the sedimentological databases implemented, by specifying the importance of certain fields that until now were never filled in. Only the inclusion of this additional information allows as good an interpretation of the "quality" as possible. For example, the study of temporal variability can be optimized when one can distinguish between the natural variability, which is what one is looking for, and the variability, in a broad sense, related to the "measurement".
Devillers, Rodolphe. "Conception d'un système multidimensionnel d'information sur la qualité des données géospatiales." Phd thesis, Université de Marne la Vallée, 2004. http://tel.archives-ouvertes.fr/tel-00008930.
Weber-Baghdiguian, Lexane. "Santé, genre et qualité de l'emploi : une analyse sur données microéconomiques." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLED014/document.
This thesis studies the influence of work on job and life quality, the latter being considered through the perception that individuals have of their own health. The first chapter focuses on the long-term effects of job losses due to plant closure on job quality. We show that job loss negatively affects wages, perceived job insecurity, the quality of the working environment and job satisfaction, including in the long run. The last two chapters investigate gender differences in self-reported health. The second chapter provides descriptive evidence on the relationships between self-assessed health, gender and mental health problems, i.e. depression and/or affective pains. Finally, in the last chapter, we study the influence of social norms, as proxied by the gender structure of the workplace environment, on gender differences in self-reported health. We show that both women and men working in female-dominated environments report more specific health problems than those who work in male-dominated environments. The overall findings of this thesis are twofold. First, losing a job has a negative impact on several dimensions of job quality and satisfaction in the long run. Second, mental diseases and social norms at work are important for understanding gender-related differences in health perceptions.
Puricelli, Alain. "Réingénierie et Contrôle Qualité des Données en vue d'une Migration Technologique." Lyon, INSA, 2000. http://theses.insa-lyon.fr/publication/2000ISAL0092/these.pdf.
The purpose of this thesis is to develop a methodology for checking logical consistency in a Geographical Information System (GIS), in order to ensure the migration of the data in the case of a technological change of system and restructuring. This methodology is then applied to a real GIS installed in the Urban Community of Lyon (the SUR). Logical consistency is one of the quality criteria commonly accepted within the community of producers and users of geographical data, as are precision and exhaustiveness, for instance. After a presentation of the elements of quality and metadata in GIS, a state of the art is given concerning the various standardization efforts in these fields. The different standards under development (those of the CEN, the ISO and the FGDC, among others) are analyzed and commented on. A methodology for detecting and correcting geometrical and topological errors is then detailed, within the framework of existing geographical vector databases. Three types of errors are identified, namely structural, geometrical and semantic errors. For each of these families of errors, methods of detection based on established theories (integrity constraints, topology and computational geometry) are proposed, and ideas for corrections are detailed. This approach is then implemented in the context of the SUR databases. To complete this application, a specific mechanism was developed to also deal with errors in tessellations, which were not taken into account by the methodology (which uses binary topological relations). Finally, to ensure the consistency of the corrections, a method was set up to propagate the corrections in the neighborhood of the objects under correction. Those objects can be located inside a single layer of data, between different layers, or in different databases of the system.
Bazin, Cyril. "Tatouage de données géographiques et généralisation aux données devant préserver des contraintes." Caen, 2010. http://www.theses.fr/2010CAEN2006.
Digital watermarking is a fundamental process for intellectual property protection. It consists in inserting a mark into a digital document through slight modifications. The presence of this mark allows the owner of a document to prove the priority of his rights. The originality of our work is twofold. On the one hand, we use a local approach to ensure a priori that the quality of constrained documents is preserved during the watermark insertion. On the other hand, we propose a generic watermarking scheme. The manuscript is divided into three parts. Firstly, we introduce the basic concepts of digital watermarking for constrained data and the state of the art of geographical data watermarking. Secondly, we present our watermarking scheme for the digital vector maps often used in geographic information systems. This scheme preserves certain topological and metric qualities of the document. The watermark is robust: it is resilient against geometric transformations and cropping. We give an efficient implementation that is validated by many experiments. Finally, we propose a generalization of the scheme for constrained data. This generic scheme will facilitate the design of watermarking schemes for new data types. We give a particular example of the application of the generic scheme to relational databases. In order to prove that it is possible to work directly on the generic scheme, we propose two detection protocols directly applicable to any implementation of the generic scheme.
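A crude way to see how a mark can be embedded through "slight modifications" while keeping each vertex's displacement bounded is quantization-parity embedding. This toy scheme, with its step size and coordinates, is only an illustration of the general idea, not the constraint-preserving scheme developed in the thesis.

```python
def embed_bit(value, bit, step=0.01):
    """Embed one bit in a coordinate via the parity of its quantised value.
    The point moves by less than 1.5 * step, so a metric constraint with a
    tolerance larger than that is preserved by construction."""
    q = round(value / step)
    if q % 2 != bit:
        q += 1
    return q * step

def read_bit(value, step=0.01):
    """Recover the embedded bit from the (unmodified) marked coordinate."""
    return round(value / step) % 2

x = 12.3456                   # hypothetical map coordinate
marked = embed_bit(x, 1)      # displaced by less than 0.015
recovered = read_bit(marked)  # the bit survives as long as parity survives
```

Repeating this over many vertices spreads a whole mark through the map; the local, bounded displacement is what lets quality be guaranteed a priori.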
Alili, Hiba. "Intégration de données basée sur la qualité pour l'enrichissement des sources de données locales dans le Service Lake." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLED019.
In the Big Data era, companies are moving away from traditional data-warehouse solutions, in which expensive and time-consuming ETL (Extract, Transform, Load) processes are used, towards data lakes, in order to manage their ever-growing data. Yet the knowledge stored in companies' databases, even in the constructed data lakes, can never be complete and up to date, because of the continuous production of data. Local data sources often need to be augmented and enriched with information coming from external data sources. Unfortunately, the data enrichment process is one of the manual labors undertaken by experts, who enrich data by adding information based on their expertise, or select relevant data sources to complete missing information. Such work can be tedious, expensive and time-consuming, making it very promising for automation. We present in this work an active, user-centric data integration approach to automatically enrich local data sources, in which the missing information is leveraged on the fly from web sources using data services. Accordingly, our approach enables users to query for information about concepts that are not defined in the data source schema. In doing so, we take into consideration a set of user preferences, such as the cost threshold and the response time necessary to compute the desired answers, while ensuring a good quality of the obtained results.
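The preference-aware selection of data services that the abstract mentions can be caricatured as a filter-and-rank step. The service catalogue, its attributes and the quality-first ranking are invented for the sketch; the thesis's actual selection model is certainly richer.

```python
# Hypothetical catalogue of web data services able to fill a missing attribute.
services = [
    {"name": "svcA", "cost": 0.05, "latency_ms": 120, "quality": 0.90},
    {"name": "svcB", "cost": 0.00, "latency_ms": 900, "quality": 0.70},
    {"name": "svcC", "cost": 0.20, "latency_ms": 60,  "quality": 0.95},
]

def select_services(services, max_cost, max_latency_ms):
    """Keep only the services meeting the user's thresholds, best quality first."""
    ok = [s for s in services
          if s["cost"] <= max_cost and s["latency_ms"] <= max_latency_ms]
    return sorted(ok, key=lambda s: -s["quality"])

chosen = select_services(services, max_cost=0.10, max_latency_ms=500)
```

With these thresholds only one service survives the filter: the cheapest high-quality one that also answers fast enough.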
Feno, Daniel Rajaonasy. "Mesures de qualité des règles d'association : normalisation et caractérisation des bases." Phd thesis, Université de la Réunion, 2007. http://tel.archives-ouvertes.fr/tel-00462506.
Full textEchairi, Abdelwahad. "Effets du cuivre sur quelques indicateurs de la qualité biologique des sols viticoles : étude à différentes échelles." Dijon, 2008. http://www.theses.fr/2008DIJOS008.
Full textCopper-based fungicides have been used for more than a century by vine growers to fight mildew (and other diseases). As a result, copper accumulates in soil, reaching high concentrations, with potentially harmful effects on the soil biocenosis. In spite of this threat, copper-based products are still in use, especially in organic farming. In this work, we tried to clarify the effects of copper on some aspects of the biological quality of vineyards under "real" conditions, in the short, medium and long term. Long-term effects were studied in one region (Champagne) through soil samples representing a large range of copper concentrations. Two different sites, in Burgundy and Champagne, were used to study the medium-term effects (decade). Finally, to study the short-term effects (1-4 years) in detail, we used an experimental approach in three different locations, copper additions being the only source of variation. Biological indicators (microbial biomass, C & N mineralization, nitrification) were used for routine analysis. In addition, two fungal populations of interest for vine growing and wine making were studied: arbuscular-mycorrhizal (AM) fungi and yeasts able to grow on grape juice. The main characteristics of the soil samples were measured simultaneously, including total and EDTA-extractable copper. Microbial biomass is a reliable indicator of soil quality but, for low levels of Cu, spatio-temporal variations were higher than the effects of copper addition. The same observations were made for C and N mineralization activities. Nitrification activity (ammonium oxidation) turned out to be less affected by copper additions than by previous nitrogen additions (as reduced forms: organic N and ammonium N). Therefore, nitrification is not a reliable indicator of Cu contamination.
Our results also showed that AM fungal populations are of potential interest for assessing the effects of cultural practices, including copper additions, provided the other sources of variation are under control. These populations can be characterized both quantitatively (spore numbers) and qualitatively (diversity of morphological types). Significant differences between treatments were observed in our experiments. However, the efficiency of these populations (for P uptake) is not addressed by these tests. Populations of yeasts are also potentially interesting for studying the effects of Cu in vineyards. We developed a protocol to assess both the genotypic and the phenotypic diversity of these fungal populations. Genotypic characterization was based on 18S rDNA PCR-RFLP and polymorphism of the D1-D2 region of 26S rDNA. Phenotypic characterization was limited to the assessment of copper tolerance, by measuring growth rate on media containing increasing Cu concentrations. The results showed no correlation between genotypic and phenotypic characterization. Many strains were able to grow on media containing high concentrations of copper, even when they were isolated from soil samples without previous Cu application.
Sakr, Riad. "Quality assurance in higher education institutions : contingent assessment system." Thesis, Lille 1, 2018. http://www.theses.fr/2018LIL1A008/document.
Full textThe thesis discusses quality assurance in higher education institutions and the contingent quality assessment systems, and deliberates on the different factors that have impacted the higher education system and transformed the role of its institutions. The researcher then presents and discusses the different dimensions and variables that are interwoven into the quality assessment of these institutions. The Lebanese higher education structure is also discussed and analyzed. To assess quality issues in higher education institutions operating in Lebanon, a template/model is proposed. Six areas of quality dimensions are considered. Standards, criteria, and indicators are developed for each of these areas, and different coefficients are assigned to the indicators. A quality scale is established; the judgment is based on qualitative and quantitative evaluation justified by on-site observation and proofs. The template/model is tested in a Lebanese private university. The assessment has led to the determination of scores within each area, an average score for each area, and a score for the university as a whole. The results of this assessment have revealed many strong and weak aspects of said university. The proposed template/model could be considered a practical assessment tool for higher education institutions, particularly for "young" institutions wishing to evaluate the quality level of their different components. It could also be considered a national assessment step that precedes the acquisition of international accreditation.
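The scoring template described above, with weighted indicators rolled up into area scores and an overall score, can be sketched as follows; the area names, weights and indicator scores are invented for illustration, not taken from the thesis:

```python
# Hedged sketch of weighted indicator scoring: each quality area holds
# (indicator, coefficient, score) triples; an area score is the weighted
# average, and the overall score averages the areas. Values are invented.

def area_score(indicators):
    """Weighted average of indicator scores (0-100) for one quality area."""
    total_w = sum(w for _, w, _ in indicators)
    return sum(w * s for _, w, s in indicators) / total_w

areas = {
    "governance": [("mission", 2, 80), ("strategy", 1, 60)],
    "teaching":   [("curriculum", 3, 70), ("staff", 2, 90)],
}
scores = {name: area_score(ind) for name, ind in areas.items()}
overall = sum(scores.values()) / len(scores)
```

A judgment on the quality scale would then be made per area and for the institution as a whole, complemented by the on-site qualitative evaluation the abstract mentions.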
Philippe, Romain. "Outils automatiques d'évaluation de la qualité des données pour le suivi en continu de la qualité des eaux usées." Master's thesis, Université Laval, 2019. http://hdl.handle.net/20.500.11794/37160.
Full textNowadays, in the wastewater field (sewers, water resource recovery facilities (WRRFs), rivers), the monitoring and control of wastewater quality is performed with several on-line sensors. However, a good monitoring strategy should be reliable and provide good data quality. Current fault detection methods have shown that problems such as fouling lead to 10-60 % of the data being discarded. Helping users in understanding, analysing and processing detected faults (sensor clogging, faulty calibration, suboptimal installation and maintenance) will allow reducing the percentage of data loss and obtaining good data on wastewater quality. In this Master's thesis, we propose two full workflows allowing the collection of raw data and their transformation into actionable information (i.e. for sensor fault detection, control or process monitoring). The two modular frameworks were applied to time series data from the pilEAUte, bordEAUx and kamEAU projects, collected in sewers and WRRFs. The methods have been made easier to apply by writing Standard Operating Procedures (SOPs) for their use. In addition, the Matlab scripts are written in a modular way, built from function blocks compiled into a toolbox. The first method is a univariate tool composed of two main steps: data filtering (outlier detection and smoothing) and fault detection. The second method is a multivariate tool using Principal Component Analysis (PCA), also composed of two steps: (i) the development of the PCA model and (ii) fault detection by the PCA. Finally, for the three aforementioned projects, data treatment led to only 0.1-12 % of the data being discarded.
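The univariate tool's two steps (data filtering, then fault detection) can be sketched in a few lines. This is an illustrative Python approximation under assumed choices (rolling-median outlier test, moving-average smoothing, out-of-range fault flags); the thesis's toolbox is written in Matlab and is more elaborate:

```python
# Minimal sketch of a two-step univariate sensor-data tool:
# (1) filtering = outlier replacement + smoothing, (2) fault detection.
# The specific tests below are assumptions, not the thesis's exact methods.
import statistics

def filter_series(values, window=5, k=3.0):
    """Replace outliers far from the rolling median, then smooth
    with a centred moving average."""
    n, half = len(values), window // 2
    cleaned = list(values)
    for i in range(n):
        w = values[max(0, i - half):i + half + 1]
        med = statistics.median(w)
        mad = statistics.median(abs(v - med) for v in w) or 1e-9
        if abs(values[i] - med) > k * 1.4826 * mad:   # robust z-score test
            cleaned[i] = med
    return [statistics.mean(cleaned[max(0, i - half):i + half + 1])
            for i in range(n)]

def detect_faults(values, lo, hi):
    """Flag samples outside the physically plausible range [lo, hi]."""
    return [not (lo <= v <= hi) for v in values]

smoothed = filter_series([10, 10, 10, 100, 10, 10, 10])  # spike removed
flags = detect_faults([10.0, -1.0, 200.0], lo=0.0, hi=100.0)
```

The multivariate counterpart would replace the range check with a PCA model and monitor residual statistics, as described in the abstract.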
Plassart, Pierre. "Pertinence des indicateurs microbiens dans l'évaluation de l'état des sols agricoles." Rouen, 2010. http://www.theses.fr/2010ROUES048.
Full textMost soils worldwide undergo modifications caused by human activities. For numerous agricultural soils, intensive practices, as well as the chronic spreading of chemical inputs, lead to a decrease in these soils' fertility. Yet, while these soils are in danger, only a few relevant soil status evaluation tools exist. The biological component has long been neglected, even though soil accommodates a high diversity of organisms involved in biogeochemical cycles. Among these organisms, microorganisms are likely to be potential bioindicators of soil status. Thus, the aim of this work is to identify biological variables describing bacterial diversity that are sensitive to soil status. Bacterial community structure (abundance, genetic and functional diversity) was assessed as a whole, but also through the analysis of Pseudomonas populations, a bacterial genus widely found in soils and sensitive to environmental disturbances. The experimental approach, from the field plot level to the microcosm level, consisted in confronting the impacts on bacterial communities of (1) soil copper contamination (agronomical dose x100), (2) natural disturbances (spatial and climatic), and (3) human-caused disturbances (agricultural practices). The first part of this work was performed in situ on typical silty Haute-Normandie soils managed as grasslands and cultures, in order to define the natural spatial and temporal variation limits of different bacterial descriptors. Besides, this first part enabled understanding each variable's sensitivity and validated the different methodological approaches. Thus, the measured variables can be organized in two categories: those relying on microorganism cultivation reflect recent soil disturbances, while those based on molecular approaches reflect the agricultural history of field plots.
In the second part, the impact of a copper contamination on the previously described variables was studied in microcosms. Results showed that variations in bacterial community structure caused by a single copper contamination were minor compared with variations caused by season or agricultural practices. At the same time, a comparative analysis was performed between sieved-soil and intact-soil microcosms. This revealed that soil status can influence the response of bacterial communities to a copper contamination; indeed, physical disturbance of the soil (sieving) induced a transient response of bacterial communities to the copper contamination. This work led to (1) building, for an agricultural silty soil, a microbiological reference system that includes spatial and temporal fluctuations of the measured variables, (2) determining each variable's sensitivity to natural and human impacts, and identifying the limits and complementarities of the cellular and molecular approaches, (3) organizing into a hierarchy the factors determining the genetic and functional structure of bacterial communities, and (4) validating the relevance of using the Pseudomonas genus in evaluating the impact of copper on soils.
Gutierrez, Rodriguez Claudia. "Qualité des données capteurs pour les systèmes de surveillance de phénomènes environnementaux." Lyon, INSA, 2010. http://theses.insa-lyon.fr/publication/2010ISAL0032/these.pdf.
Full textNowadays, emerging applications in the geographic domain increasingly exploit geolocalized information provided by sensors, especially for crisis management, real-time vehicle management, urban or environmental risk management, etc. More particularly, the use of sensors in the surveillance domain, especially environmental surveillance (i.e., floods, avalanches, volcanoes...), allows a simpler interpretation of the real world. Meanwhile, the great volume of data coming from these sensors, at variable frequencies and positions, acquired in hostile environments with limited battery and communication power, makes the data imprecise and uncertain. We thus have a problem regarding the quality of the data provided by these sensors. The research work proposed in this thesis takes up the challenge of real-time sensor data quality, in order to help users in decision making during critical situations. To this end, we propose a methodology for the definition, evaluation and communication of sensor data quality. This methodology is inspired by existing methodologies dealing with data quality in different domains and is structured in three phases (definition, evaluation and communication). Unlike existing methodologies in the information systems domain, it required the specification of the sensor data characteristics linked to data quality. The evaluation of quality, at the core of the methodology, takes into account the acquisition and processing context factors that can impact data quality, as well as metadata management and multi-criteria quality evaluation in real time. The proposed methodology is supported by a visualization prototype called MoSDaQ (Monitoring Sensor Data Quality) that allows visualizing via the web the data provided by environmental observations, together with information related to data quality in real time.
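The multi-criteria, real-time quality evaluation described above can be illustrated with a small sketch; the criteria (accuracy, timeliness, completeness), their weights and the linear decay model are assumptions for illustration, not the thesis's exact definitions:

```python
# Speculative sketch of multi-criteria quality scoring for one sensor
# reading: combine accuracy (from calibration metadata), timeliness
# (decaying with the reading's age) and completeness into one score.

def quality_score(reading, now, weights=(0.5, 0.3, 0.2), max_age=60.0):
    """Return a 0..1 quality score for a sensor reading at time `now`."""
    accuracy = reading["accuracy"]                 # 0..1, assumed metadata
    age = now - reading["timestamp"]               # seconds since acquisition
    timeliness = max(0.0, 1.0 - age / max_age)     # linear decay with age
    completeness = 1.0 if reading["value"] is not None else 0.0
    wa, wt, wc = weights
    return wa * accuracy + wt * timeliness + wc * completeness

reading = {"accuracy": 0.9, "timestamp": 100.0, "value": 3.2}
q = quality_score(reading, now=130.0)   # 30 s old -> timeliness 0.5
```

A display layer such as the MoSDaQ prototype could then render this score alongside the observation itself, so that users see data and quality together in real time.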