Dissertations / Theses on the topic 'Estimation du terme inconnu'
Consult the top 25 dissertations / theses for your research on the topic 'Estimation du terme inconnu.'
Wang, Zhibo. "Estimations non-asymptotiques et robustes basées sur des fonctions modulatrices pour les systèmes d'ordre fractionnaire." Electronic Thesis or Diss., Bourges, INSA Centre Val de Loire, 2023. http://www.theses.fr/2023ISAB0003.
This thesis develops the modulating functions method for non-asymptotic and robust estimation for fractional-order nonlinear systems, fractional-order linear systems with accelerations as output, and fractional-order time-delay systems. The designed estimators are given by algebraic integral formulas, which ensure non-asymptotic convergence. As an essential feature of the designed estimation algorithms, noisy output measurements appear only inside integral terms, which endows the estimators with robustness against corrupting noises. First, for partially unknown fractional-order nonlinear systems, fractional derivative estimation of the pseudo-state is addressed via the modulating functions method. Thanks to the additive index law of fractional derivatives, the estimation is decomposed into estimating fractional derivatives of the output and estimating fractional initial values, while the unknown part is fitted via an innovative sliding-window strategy. Second, for fractional-order linear systems with accelerations as output, fractional integral estimation of the acceleration is first considered for fractional-order mechanical vibration systems, where only noisy acceleration measurements are available. Building on existing numerical approaches for the proper fractional integrals of accelerations, attention is primarily restricted to estimating the unknown initial values using the modulating functions method. On this basis, the result is generalized to more general fractional-order linear systems. In particular, the behaviour of fractional derivatives at zero is studied for absolutely continuous functions, which is quite different from the integer-order case. Third, for fractional-order time-delay systems, pseudo-state estimation is studied by designing a fractional-order auxiliary modulating dynamical system, which provides a more general framework for generating the required modulating functions. With the introduction of the delay operator and the bicausal generalized change of coordinates, the pseudo-state estimation of the considered system can be reduced to that of the corresponding observer normal form. In contrast to previous work, the presented scheme enables direct estimation of the pseudo-state rather than estimating the fractional derivatives of the output and a set of fractional initial values. In addition, the efficiency and robustness of the proposed estimators are verified by numerical simulations. Finally, a summary of this work and an outlook on future work are given.
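The algebraic, integral nature of such estimators can be illustrated on an integer-order toy problem (a minimal sketch of the general idea only, not the thesis's fractional-order constructions): for y' + a·y = u with unknown a, multiplying by a modulating function g that vanishes at both endpoints and integrating by parts moves the derivative from the measured output y onto the known, smooth g.

```python
import numpy as np

# Integer-order sketch of the modulating-functions idea.  For y' + a*y = u,
# multiply by g with g(0) = g(T) = 0 and integrate by parts:
#   a = ( ∫ g*u dt + ∫ g'*y dt ) / ∫ g*y dt
# so only integrals of the output y appear, never its derivative.
a_true = 0.8
T = 5.0
t = np.linspace(0.0, T, 2001)
u = np.sin(t)
# closed-form solution of y' + a*y = sin(t), y(0) = 0
y = (a_true * np.sin(t) - np.cos(t) + np.exp(-a_true * t)) / (1 + a_true**2)

g = t**2 * (T - t)**2                          # modulating function, zero at 0 and T
gp = 2 * t * (T - t)**2 - 2 * t**2 * (T - t)   # its derivative

def integ(f):
    # trapezoidal rule on the uniform grid t
    return float(np.sum((f[1:] + f[:-1]) * 0.5) * (t[1] - t[0]))

a_est = (integ(g * u) + integ(gp * y)) / integ(g * y)
```

Because y enters only through integrals, replacing y by a noisy measurement averages the noise out instead of amplifying it, which is the robustness property emphasized in the abstract.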
Mechhoud, Sarah. "Estimation de la diffusion thermique et du terme source du modèle de transport de la chaleur dans les plasmas de tokamaks." Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-00954183.
Chafouk, Houcine. "Validation de données à variance inconnu." Nancy 1, 1990. http://docnum.univ-lorraine.fr/public/SCD_T_1990_0379_CHAFOUK.pdf.
Clément, Thierry. "Estimation d'incertitude paramétrique dans un contexte de bruit de sortie inconnu mais borné." Grenoble INPG, 1987. http://www.theses.fr/1987INPG0095.
Clément, Thierry. "Estimation d'incertitude paramétrique dans un contexte de bruit de sortie inconnu mais borné." Grenoble 2 : ANRT, 1987. http://catalogue.bnf.fr/ark:/12148/cb376039984.
Scaillet, Olivier. "Modélisation et estimation de la structure par terme des taux d'intérêt." Paris 9, 1996. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1996PA090003.
This work is divided into five chapters and examines two aspects of term-structure modeling: the pricing of derivative assets on interest rates, and the inference of the models generally used in this pricing. The first chapter introduces continuous-time models and arbitrage pricing. The second chapter examines option pricing in an affine model. The estimation part concerning state-variable dynamics is dealt with in chapter III; it consists in studying a simulation-based inference method in the framework of diffusion processes. In chapter IV, tests of non-nested hypotheses using simulations and an indirect encompassing notion are described. Two applications are proposed in chapter V. The first examines a method for estimating term-structure models from bond data. The second concerns the estimation and comparison of several usual models of the instantaneous interest rate.
Huillery, Julien. "Support temps-fréquence d'un signal inconnu en présence de bruit additif gaussien." Phd thesis, Grenoble INPG, 2008. http://tel.archives-ouvertes.fr/tel-00301324.
Full textNous choisissons de résoudre ce problème de localisation au moyen d'un test binaire d'hypothèses, formulé en chaque point du plan temps-fréquence. Le seuil de détection correspondant à ce test doit alors être déterminé : d'après les lois de probabilité des coefficients du spectrogramme d'une part, en lien avec la puissance du bruit d'autre part et, enfin, selon un critère de détection approprié.
La première étude concerne le comportement statistique des coefficients du spectrogramme. Dans le contexte d'un bruit non blanc et non stationnaire, la densité de probabilité des observations est ainsi formulée.
La densité specrale de puissance du bruit apparaît naturellement comme l'un des paramètres de cette densité de probabilité. Dans une seconde étude, une méthode d'estimation de ce bruit est proposée. Elle se base sur le comportement statistique des plus petits coefficients du spectrogramme.
Cet ensemble de connaissances nous permet finalement de résoudre le test d'hypothèses dont la solution naturelle au sens du maximum de vraisemblance fait apparaître le rapport d'énergie entre le signal et le bruit en chaque point du plan temps-fréquence. Ce rapport signal sur bruit local permet dès lors de préciser la condition "au moins supérieure" relative au support temps-fréquence accessible du signal.
L'algorithme de localisation temps-fréquence qui résulte de ce travail permet finalement de retenir le support temps-fréquence du signal d'intérêt sur l'ensemble duquel le rapport signal sur bruit est supérieur à une valeur choisie a priori.
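The detection scheme can be sketched as follows (a simplified illustration with assumed parameter values, not the thesis's exact test): threshold the spectrogram coefficients against a noise floor estimated from the bulk of small coefficients.

```python
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(0)
fs = 1024
t = np.arange(2 * fs) / fs
tone = np.sin(2 * np.pi * 96 * t)             # signal of interest at 96 Hz
noise = 0.3 * rng.standard_normal(t.size)     # additive Gaussian noise

f, tt, Sxx = spectrogram(tone + noise, fs=fs, nperseg=128)

# Estimate the noise floor from the small coefficients (the median is robust
# to the few large signal coefficients), then apply a binary test at each
# time-frequency point with an a-priori SNR factor (10x here, an assumption).
noise_floor = np.median(Sxx)
mask = Sxx > 10 * noise_floor                 # estimated time-frequency support
```

The mask keeps the tone's frequency bin across all frames while rejecting most noise-only points, mirroring the "SNR above an a-priori value" criterion of the abstract.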
Kassai, Koupai Behrouz. "Estimation du bénéfice clinique à long terme, les critères intermédiaires et leur application en pédiatrie." Lyon 1, 2006. http://www.theses.fr/2006LYO10028.
Biomarkers and diagnostic test results are commonly used as substitutes for evaluating treatment benefit. The use of substitutes raises a series of methodological problems. Using a numerical approach, we have contributed to a better understanding of the influence of the performance of diagnostic tests and biomarkers on the measurement of treatment benefit. In paediatrics, we developed a clinical trial evaluating the efficacy of minoxidil on intima-media thickness, measured by echography, in children with Williams-Beuren syndrome. Concerning the estimation of long-term treatment benefit: in the prevention of chronic diseases, the follow-up duration rarely exceeds 5 years, yet preventive treatments are usually prescribed for the long term. It is therefore important to know the approximations made when extrapolating the results of short-term clinical trials. We developed an approach for estimating the clinical benefit of preventive treatment, in terms of gain in life expectancy, from the results of clinical trials, and applied it to antihypertensive drugs using a meta-analysis of individual patient data and life tables. We have contributed to a better understanding of the behaviour of efficacy indices (relative risk, risk difference) as a function of treatment duration. In children treated for cancer, we explored the long-term benefit in terms of gain in life expectancy.
Bellier-Delienne, Annie. "Évaluation des contrats notionnels MATIF : estimation de la volatilité et de l'option de livraison." Paris 9, 1993. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1993PA090028.
Notional bond futures volatility, which measures the sensitivity of the contract to movements in the financial market, is estimated using different valuation models for options on long-term bond futures. The Whaley model (an American-option model under the classical Black-Scholes assumptions) gives the best results among the models tested. The quality option gives the futures seller the choice of which bond to deliver at the expiry date. Valued by arbitrage, it always seems to be undervalued by the market. This can be explained by the fact that the market restricts the official list to two or three deliverable bonds, and when a bond leaves the official list for the next delivery month, it is almost always the cheapest to deliver.
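For context, the European Black-Scholes call formula underlying the compared models can be sketched as follows (Whaley's approach, evaluated in the thesis, adds an early-exercise premium for American options on futures; the input values below are purely illustrative):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """European call price under Black-Scholes assumptions."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

Volatility estimation as described in the abstract amounts to inverting such a formula: finding the sigma that matches an observed market price.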
Champagne, Julie. "Traitement d'ordre en mémoire à court terme et estimation du temps : effet d'interférence spécifique au traitement d'ordre temporel." Master's thesis, Université Laval, 2000. http://hdl.handle.net/20.500.11794/51881.
Full textOuattara, Aboudou. "La structure par terme du taux d'escompte psychologique : estimation et incidences sur les préférences face au risque et sociales." Thesis, Paris 9, 2015. http://www.theses.fr/2015PA090026/document.
The discounted-utility theory proposed by Samuelson (1937) is one of the dominant paradigms in economics and management, especially in finance, where it underlies the Intertemporal Capital Asset Pricing Model (ICAPM) and its consumption-based version (ICCAPM). Despite this position, its validity as a framework for explaining individuals' time preferences has been questioned in recent research, paving the way for amendments and challenges to this framework. Among other contributions, this research introduces the concept of a term structure of the psychological discount rate. Thus, apart from the exponential discount function, the literature offers several alternative discount functions: Herrnstein, proportional, Laibson, Rachlin, hyperbolic and generalized hyperbolic. Following this work, we initiated a research project to characterize the term structure of psychological discount rates and the driving factors that explain its heterogeneity at the individual level; its relationship with other dimensions of individual preferences (risk attitudes and social interaction behavior) is then explored. The purpose is to identify, among these candidates, the function that is consistent with observed time preferences, to estimate the underlying parameters and to study the consistency of individual time preferences. This research is based on data collected in an experimental study using eighteen time trade-offs, four lottery trade-offs, a dictator game, an ultimatum game and a trust game. The data analysis confirms previous results on the violation of the hypothesis that the psychological value of time is time-invariant, and establishes that the studied population is heterogeneous with respect to the form of the term structure of the psychological discount rate: individuals are characterized by a Herrnstein, generalized hyperbolic or Laibson psychological discount function.
We find that demographic and social characteristics and temporal orientation are only weakly linked to individual differences in the term structure of the psychological discount rate. Application (a dimension of personality traits) is the most important driver of the heterogeneity of its forms. Finally, we establish that there is only a weak relationship between the parameters of time, risk and social preferences.
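The discount functions named above can be written compactly; the parameterizations below are common textbook forms and are assumptions, not the exact specifications estimated in the thesis.

```python
import numpy as np

def exponential(t, r):
    """Samuelson's discounted-utility benchmark."""
    return np.exp(-r * t)

def hyperbolic(t, k):
    """Simple hyperbolic discounting."""
    return 1.0 / (1.0 + k * t)

def generalized_hyperbolic(t, alpha, beta):
    """Generalized hyperbolic form; tends to exp(-beta * t) as alpha -> 0."""
    return (1.0 + alpha * t) ** (-beta / alpha)

def quasi_hyperbolic(t, beta, delta):
    """Laibson's beta-delta form: an immediate drop (present bias) for t > 0."""
    return np.where(t == 0, 1.0, beta * delta**t)
```

Comparing these curves at a few horizons shows the present bias that distinguishes the Laibson form from the exponential benchmark, which is the kind of shape difference the experimental trade-offs are designed to detect.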
Rajaona, Harizo. "Inférence bayésienne adaptative pour la reconstruction de source en dispersion atmosphérique." Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10120/document.
In atmospheric physics, reconstructing a pollution source is a challenging but important problem: it provides better input parameters to dispersion models and gives useful information to first-responder teams in case of an accidental toxic release. Various methods already exist, but using them requires a large amount of computational resources, especially as the accuracy of the dispersion model increases. A minimal degree of precision remains necessary, particularly in urban scenarios where obstacles and non-stationary meteorology have to be taken into account. One must also account for all sources of uncertainty, both in the observations and in the estimation. The topic of this thesis is the construction of a source-term estimation method based on adaptive Bayesian inference and Monte Carlo methods. First, we describe the context of the problem and the existing methods. Next, we detail the Bayesian formulation, focusing on adaptive importance sampling methods, especially the AMIS algorithm. The third chapter presents an application of AMIS to an experimental case study and illustrates the mechanisms of the estimation process, which provides the posterior density of the source parameters. Finally, the fourth chapter presents an improvement in how the dispersion computations are processed, allowing a considerable gain in computation time and leaving room for a more complex dispersion model in both rural and urban use cases.
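The adaptive importance sampling mechanism can be sketched on a deliberately simple source-estimation toy problem (scalar source strength, linear sensor model, all values assumed; the AMIS algorithm proper additionally reweights past samples with a deterministic mixture):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy setup: sensors respond linearly to a scalar source strength q.
h = np.array([0.5, 1.0, 1.5])                 # assumed sensor sensitivities
q_true = 2.0
y = q_true * h + 0.1 * rng.standard_normal(3)  # noisy observations

def log_post(q):
    """Log-posterior: flat prior on q > 0, Gaussian likelihood (sigma = 0.1)."""
    r = y[None, :] - q[:, None] * h[None, :]
    return np.where(q > 0, -0.5 * np.sum(r**2, axis=1) / 0.1**2, -np.inf)

# Adaptive importance sampling: refit the Gaussian proposal to the weighted
# sample at each iteration (a simplified stand-in for AMIS).
mu, sig = 1.0, 2.0
for _ in range(5):
    q = rng.normal(mu, sig, 2000)
    logw = log_post(q) - (-0.5 * ((q - mu) / sig) ** 2 - np.log(sig))
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mu = float(np.sum(w * q))                          # posterior mean estimate
    sig = max(float(np.sqrt(np.sum(w * (q - mu) ** 2))), 1e-3)
```

Each pass concentrates the proposal around the posterior, so later samples are spent where the likelihood is informative, which is the source of the computational savings the abstract describes.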
Lamlih, Achraf. "Conception d’un système intégré de mesure de bioimpédance pour le suivi long terme de la composition des tissus biologiques." Thesis, Montpellier, 2018. http://www.theses.fr/2018MONTS070/document.
Tissue-composition assessment techniques help to better understand physiological processes and their overall impact on the biological state of experimental subjects. The research presented in this manuscript aims to design a bioimpedance-spectroscopy integrated measurement system capable of measuring a wide range of biomarkers over long periods of time (up to one year). The measurement system can be used for long-term monitoring of physiological variables in general; nevertheless, the presented solutions target fish species in particular, in the context of the POPSTAR project, which aims to enhance our understanding of fish behavior by analyzing not only the environment in which fish travel and live but also the fish themselves. After identifying the design challenges of a bioimpedance-spectroscopy integrated measurement system, we propose a novel hybrid architecture providing fast bioimpedance spectroscopy while maximizing measurement precision. As the signal-generation blocks are critical and their performance affects that of the whole architecture, the second part of this research focuses on the design and optimization of the signal-generation part. We enhance the quality of the stimulus signals for single-tone and multi-tone excitations while proposing low-complexity on-chip implementations for these blocks. The last part of our work discusses the current driver that transforms the voltage stimulus into an excitation current; a novel analog topology using an improved regulated cascode and a common-mode feedback compensation independent of process variations is presented. A first chip prototype implementing the critical blocks of the bioimpedance measurement architecture has been designed and simulated in a 0.18 µm AMS (Austria MicroSystems) CMOS process with a 1.8 V supply.
Ladino, lopez Andrés. "Traffic state estimation and prediction in freeways and urban networks." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAT016/document.
Centralization of work, population and economic growth, alongside continued urbanization, are the main causes of congestion. As cities strive to update or expand aging infrastructure, applying big data, new models and analytics to better understand and combat traffic congestion is crucial to the health and development of the smart cities of the 21st century. Traffic support tools specifically designed to detect, forecast and signal these conditions are in high demand. This dissertation studies techniques to estimate and forecast conditions in a traffic network. First, we consider the problem of short-term Dynamic Travel Time (DTT) forecasting based on data-driven methods. We propose two fusion techniques to compute short-term forecasts from clustered time series. The first considers the error covariance matrix and uses its information to fuse individual forecasts based on best-linear-unbiased-estimation principles. The second exploits similarity measurements between the signal to be predicted and clusters detected in historical data, and performs a fusion as a weighted average of individual forecasts. Tests on real data were carried out on the case study of the Grenoble South Ring, a 10.5 km highway monitored through the Grenoble Traffic Lab (GTL); a real-time application was implemented and opened to the public. Building on this study, we then consider the problem of simultaneous density/flow reconstruction in urban networks based on heterogeneous sources of information. The traffic network is modeled within the framework of macroscopic traffic models, where we adopt the Lighthill-Whitham-Richards (LWR) conservation equation and a piecewise-linear fundamental diagram. The estimation problem rests on two key principles: first, the minimization of the error between measured and reconstructed flows and densities; and second, the equilibrium state of the network, which establishes flow propagation within the network. Both principles are integrated with the traffic-model constraints established by the supply/demand paradigm. Finally, the problem is cast as a quadratic optimization with equality constraints that shrink the feasible region of the estimated variables. Simulation scenarios based on synthetic data for a Manhattan grid network are provided to validate the performance of the proposed algorithm.
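A minimal instance of such an equality-constrained quadratic estimation problem can be solved directly through the KKT linear system (toy data standing in for flows and densities, not the LWR-based formulation itself):

```python
import numpy as np

# Minimize ||A x - b||^2 subject to C x = d via the KKT system
#   [A^T A  C^T] [x]   [A^T b]
#   [C      0  ] [λ] = [d    ]
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # measurement model (toy)
b = np.array([1.0, 2.0, 2.0])                        # noisy measurements (toy)
C = np.array([[1.0, 1.0]])                           # e.g. a conservation constraint
d = np.array([3.0])

n, m = A.shape[1], C.shape[0]
KKT = np.block([[A.T @ A, C.T], [C, np.zeros((m, m))]])
rhs = np.concatenate([A.T @ b, d])
x = np.linalg.solve(KKT, rhs)[:n]                    # constrained estimate
```

The equality constraint plays the role of the equilibrium/conservation conditions in the abstract: it removes reconstructions that fit the data but violate flow propagation.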
Chen, Xu. "New formulation of optical flow for turbulence estimation." Thesis, Ecully, Ecole centrale de Lyon, 2015. http://www.theses.fr/2015ECDL0025/document.
The optical-flow method is a powerful tool for motion estimation, able to extract a dense velocity field from an image sequence. In this study, we employ it to retrieve incompressible turbulent motions precisely. For 2D turbulence estimation, it consists in minimizing an objective function composed of an observation term and a regularization term. The observation term is based on the transport equation of a passive scalar field; for non-fully-resolved scalar images, we propose to use the mixed model of large eddy simulation (LES) to determine the interaction between large-scale motions and unresolved ones. The regularization term is based on the continuity equation of 2D incompressible flows. The proposed formulation is evaluated on synthetic and experimental images. In addition, we extend optical flow to three dimensions, and multiple scalar databases are generated with direct numerical simulation (DNS) to evaluate its performance in the 3D context. We propose two formulations differing by the order of the regularizer; numerical results show that the formulation with a second-order regularizer outperforms its first-order counterpart. We also pay special attention to the effect of the Schmidt number, which characterizes the ratio between the molecular diffusion of the scalar and the dissipation of the turbulence: the precision of the estimation increases as the Schmidt number increases. Overall, optical flow has demonstrated its capability to reconstruct turbulent flows with excellent accuracy, and it has the potential and attributes to become an effective flow-measurement approach in the fluid mechanics community.
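The core of the observation term, the linearized scalar-transport (brightness-constancy) equation, can be sketched in one dimension with an assumed uniform displacement; real optical-flow estimators add a regularizer and solve for a dense field.

```python
import numpy as np

# Toy sketch: recover a uniform 1-D displacement from two scalar "images"
# via the linearized transport equation  I_t + u * I_x = 0, which for a
# single global u gives the least-squares solution
#   u = -sum(I_x * I_t) / sum(I_x**2)
x = np.linspace(0, 2 * np.pi, 400)
u_true = 0.05
I0 = np.sin(3 * x)                 # scalar field at time 0
I1 = np.sin(3 * (x - u_true))      # same field advected by u_true

Ix = np.gradient(I0, x)            # spatial derivative
It = I1 - I0                       # temporal derivative (dt = 1)
u_est = -np.sum(Ix * It) / np.sum(Ix**2)
```

Replacing the single scalar u by a per-pixel field, and adding the continuity-based regularizer from the abstract, turns this closed-form step into the variational minimization studied in the thesis.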
Conze, Pierre-Henri. "Estimation de mouvement dense long-terme et évaluation de qualité de la synthèse de vues. Application à la coopération stéréo-mouvement." Phd thesis, INSA de Rennes, 2014. http://tel.archives-ouvertes.fr/tel-00992940.
Full textBrunel, Julien. "Prévoir la demande de transport de marchandises à long terme : estimation économétrique du couplage transport/économie, le cas des traversées alpines." Lyon 2, 2007. http://theses.univ-lyon2.fr/documents/lyon2/2007/brunel_j.
The current research aims to produce long-term forecasts of freight transport demand across the Alps. The first part reviews the literature on long-term forecasting of freight transport demand, highlighting the role of economic activity as the main determinant of freight transport demand, and then discusses this issue using the concept of coupling between transport and the economy. The second part estimates the relationship between freight transport demand across the Alps and Italian industrial activity. We apply two alternative econometric specifications: a growth-rate model following the quin-quin fret models (Gabella-Latreille, 1997), and an error-correction model following the Engle and Granger (1987) procedure, owing to the co-integrated nature of the time series. The results of the two models are globally coherent. In the third part, these estimates are combined, following an idea proposed by Bates and Granger (1969), to produce long-term forecasts of freight transport demand across the Alps. These forecasts differ from those obtained by previous models, which generally estimate log-linear models using standard econometric tools despite a high risk of spurious regression (Granger and Newbold, 1974). More precisely, this research argues that the estimation of standard models, rather than more advanced techniques, is likely to over-estimate traffic forecasts by twenty percent.
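The Engle and Granger (1987) two-step procedure mentioned here can be sketched on simulated cointegrated series (synthetic data; the thesis applies it to Alpine freight traffic and Italian industrial activity):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate cointegrated series: x is a random walk, y is tied to x.
n = 500
x = np.cumsum(rng.standard_normal(n))
y = 2.0 * x + rng.standard_normal(n)       # long-run relation y ≈ 2x

# Step 1: estimate the long-run (cointegrating) relation by OLS in levels.
beta = np.polyfit(x, y, 1)[0]
ect = y - beta * x                          # error-correction term

# Step 2: regress the differences on the lagged equilibrium error.
dy, dx, z = np.diff(y), np.diff(x), ect[:-1]
X = np.column_stack([dx, z])
coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
gamma = coef[1]                             # speed of adjustment, expected < 0
```

A negative and significant gamma is the error-correction signature: deviations from the long-run transport/economy relation are pulled back toward equilibrium rather than drifting, which is what distinguishes this specification from a spurious levels regression.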
Brunel, Julien Bonnafous Alain. "Prévoir la demande de transport de marchandises à long terme estimation économétrique du couplage transport/économie, le cas des traversées alpines /." Lyon : Université Lyon 2, 2007. http://theses.univ-lyon2.fr/sdx/theses/lyon2/2007/brunel_j.
Laharotte, Pierre-Antoine. "Contributions à la prévision court-terme, multi-échelle et multi-variée, par apprentissage statistique du trafic routier." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSET013/document.
The maturity of information and communication technologies and the advent of Big Data have led to substantial developments in intelligent transportation systems (ITS), from data collection to innovative processing solutions. Knowledge of current traffic states is available over most of the network without intrusive infrastructure-side collection devices, relying instead on wireless transmission of multi-source data. The increasing use of huge databases has had a strong influence on traffic management, including forecasting methods, and these approaches have followed the recent trend towards statistical learning. However, the prediction problem remains mainly focused on the local scale: the prediction for each road link relies on a dedicated, optimized and adapted model. Our work introduces a traffic-forecasting framework able to tackle network-scale problems. The study conducted in this thesis presents and evaluates this new “global” approach in comparison with the most widely used existing methods, and then analyzes its sensitivity to several factors. The traffic-forecasting framework, based on multivariate learning methods, is detailed after a review of the literature on traffic flow theory. A multi-dimensional version of the k-nearest-neighbors method, a simple and sparse model, is evaluated through several use cases. The originality of the work lies in the processing approach, applied to data collected through new measurement processes (e.g. Bluetooth, floating car data, connected vehicles). The performance of this primary approach is then compared with other learning-based methods, and we propose an adaptation of kernel-based methods to the global prediction framework. The results show that global approaches perform as well as the usual local ones. The spatial and temporal specificities of the methods are highlighted according to prediction accuracy.
To improve forecasting accuracy and reduce computation time, we propose an identification and selection method targeting critical links. The results demonstrate that a restricted subset of links is sufficient to ensure acceptable performance in validation tests. Finally, the resilience of the prediction framework is evaluated with respect to non-recurrent events, such as incidents or adverse weather conditions, affecting nominal network operations. The results highlight the impact of these non-recurrent conditions on real-time forecasting of short-term network dynamics, enabling the design of a more operational and resilient prediction framework. This perspective on forecasting matches current applications relying on embedded systems and addresses the traffic network supervisor's expectations.
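A network-wide k-nearest-neighbors forecast of the kind evaluated in the thesis can be sketched as follows (synthetic speed data; the distance metric, window length and k are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

# Each historical pattern is the speed profile of all links over a time
# window; the forecast is the average "future" of the k nearest patterns.
def knn_forecast(history, future, query, k=3):
    d = np.linalg.norm(history - query, axis=(1, 2))  # distance per pattern
    idx = np.argsort(d)[:k]
    return future[idx].mean(axis=0)

n_patterns, n_links, window = 200, 5, 4
history = rng.uniform(20, 90, (n_patterns, n_links, window))   # km/h profiles
future = history[:, :, -1] + rng.normal(0, 1, (n_patterns, n_links))
query = history[7] + 0.1                                       # near pattern 7
pred = knn_forecast(history, future, query)
```

Because the pattern covers every link at once, a single neighbor search yields the whole network's forecast, which is the "global" property the abstract contrasts with per-link models.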
Chautru, Emilie. "Statistiques multivariées pour l'analyse du risque alimentaire." Thesis, Paris, ENST, 2013. http://www.theses.fr/2013ENST0045/document.
At a crossroads of economic, sociological, cultural and sanitary issues, dietary analysis is of major importance for public-health institutes. When international trade facilitates the transportation of foodstuffs produced under very different environmental conditions, and when conspicuous consumption encourages profit-driven strategies (GMOs, pesticides, etc.), it is necessary to quantify the sanitary risks engendered by such economic behaviors. We are interested in the evaluation of chronic exposure (at a yearly scale) to food contaminants whose long-term toxicity is already well documented. Because dietary risk and benefit are not limited to the abuse or avoidance of toxic substances, nutritional intakes are also considered. Our work is organized along three main lines of research. We first consider the statistical analysis of very high levels of long-term exposure to one or more chemical elements present in food, adopting approaches from extreme value theory. We then adapt classical minimum-volume-set estimation techniques from statistical learning to identify dietary habits that realize a compromise between toxicological risk and nutritional benefit. Finally, we study the asymptotic properties of statistics that assess characteristics of the distribution of individual exposure, taking into account the possible survey scheme from which the data originate.
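The extreme-value approach to very high exposures can be sketched with a peaks-over-threshold fit (an assumed exponential exposure distribution is used here because its excesses are exactly generalized Pareto with shape 0; the threshold choice is also an assumption):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Toy exposures; model the excesses above a high threshold with a
# generalized Pareto distribution (peaks-over-threshold).
exposure = rng.exponential(scale=2.0, size=20000)
u = np.quantile(exposure, 0.95)                # high threshold (assumption)
excess = exposure[exposure > u] - u

shape, loc, scale = genpareto.fit(excess, floc=0.0)
# For exponential data the fitted shape should be near 0 and scale near 2.
```

The fitted tail then provides quantiles far beyond the observed maximum, which is precisely what chronic-risk assessment of rare, very high exposures requires.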
Boukrami, Othmane. "Les effets de la diversification sur le risque de change non couvert par les marchés financiers : estimation de la rentabilité du portefeuille dans un système d'information optimal." Thesis, Lyon 3, 2011. http://www.theses.fr/2011LYO30024.
In current market conditions, companies in emerging markets have the choice between short-term debt in local currency and long-term hard-currency financing from international sources to fund their long-term investments, a practice that creates either an interest-rate gap or a currency gap. As an extension of previous research on currency-risk diversification in mature financial markets, this thesis is distinct from the existing literature in that it focuses on emerging-market currencies, for which there are few or no instruments for hedging currency and interest-rate risks. The proposed model is based on a fundamentally different approach from existing risk models, seeking to mitigate risks internally through portfolio diversification rather than by matching supply and demand. It does so by analyzing both the correlations between emerging-market currencies, in a portfolio composed of African, Asian, South American and Eastern European currencies, and the effect of diversification on market-risk reduction. The main objective of this thesis is to contribute to the specification and identification of a risk-diversification model, while demonstrating that a diversified portfolio of emerging-market currencies not covered by commercial banks is a lucrative business over the long term. With an efficient information system, the proposed model attempts to demonstrate the effect such hedging products would have on reducing the credit risk of borrowers and hence of lenders. To this end, the different risks associated with these activities are identified, methods for their effective management are chosen, and the hypothetical exposures created by this activity are modeled. The impact of reducing market-risk exposure, through the use of interest-rate and currency hedging products, on the credit-risk rating of companies in emerging countries has also been modeled.
The research finds that the choice of currencies does not significantly affect the results as long as the proposed regional limits are respected. The simulation results show that managing a diversified currency portfolio under optimal risk-management guidelines can be a lucrative business for banks, as risk mitigation can be achieved effectively through portfolio diversification.
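The variance-reduction effect of diversification can be sketched with an assumed uniform pairwise correlation among n currencies (illustrative figures, not the thesis's estimates):

```python
import numpy as np

# Equally weighted basket of n currencies, each with volatility sigma and
# a uniform pairwise correlation rho (assumed values).
n, sigma, rho = 10, 0.12, 0.3
corr = np.full((n, n), rho) + (1 - rho) * np.eye(n)
cov = (sigma**2) * corr
w = np.full(n, 1.0 / n)

basket_vol = np.sqrt(w @ cov @ w)
# Closed form: basket variance = sigma^2 * (1/n + (1 - 1/n) * rho)
```

As n grows the basket volatility falls toward sigma·sqrt(rho) rather than zero: imperfect correlation is what makes the diversified emerging-currency portfolio less risky than any single exposure, while the common component bounds the gain.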
Nguyen, Thanh Don. "Impact de la résolution et de la précision de la topographie sur la modélisation de la dynamique d’invasion d’une crue en plaine inondable." Thesis, Toulouse, INPT, 2012. http://www.theses.fr/2012INPT0093/document.
Full text
We analyze in this thesis various aspects of the modeling of free surface flows in the shallow water approximation. We first study the two-dimensional system of Saint-Venant equations and its resolution with finite volume numerical methods, focusing in particular on hyperbolic and conservative aspects. These schemes can handle stationary equilibria and wet-dry interfaces, and can model subcritical, transcritical and supercritical flows. We then present the theory of variational data assimilation fitted to this kind of flow; its application through sensitivity studies is fully discussed in the context of free surface flows. After this theoretical part, we assess the numerical methods implemented in the Dassflow code, developed at the University of Toulouse, mainly at IMT but also at IMFT. This code solves the shallow water equations by a finite volume method and is validated by comparison with analytical solutions for standard test cases. The results are compared with those of Telemac2D, another free surface hydraulic code using two-dimensional finite elements. A significant feature of the Dassflow code is that it allows variational data assimilation, using the adjoint method to compute the gradient of the cost function. The adjoint code was obtained with the automatic differentiation tool Tapenade (INRIA). The approach is then tested on a real, hydraulically complex case using Digital Elevation Models (DEM) and river-bed bathymetry of different qualities, provided either by a conventional IGN-type database or by very high resolution LIDAR data. The respective influences of bathymetry, mesh size and choice of code on the dynamics of flooding are explored in detail. Finally, we perform sensitivity mapping studies on the parameters of the Dassflow model. These maps show the respective influence of the different parameters and of the location of virtual measurement points; the optimal location of these points is necessary for efficient data assimilation in the future.
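The adjoint-based gradient computation mentioned in this abstract can be illustrated on a toy problem. The sketch below is an illustrative assumption, not the Dassflow/Tapenade code: it estimates an unknown decay parameter k in a scalar model u_{n+1} = u_n(1 - k dt) by minimizing a quadratic observation misfit, with the gradient dJ/dk obtained from a single backward adjoint sweep instead of finite differences.

```python
import numpy as np

def forward(k, u0, dt, N):
    """Integrate the toy decay model u_{n+1} = u_n * (1 - k*dt)."""
    u = np.empty(N + 1)
    u[0] = u0
    for n in range(N):
        u[n + 1] = u[n] * (1.0 - k * dt)
    return u

def cost_and_adjoint_gradient(k, u0, dt, obs):
    """Cost J(k) = 0.5 * sum_n (u_n - y_n)^2 and dJ/dk via an adjoint sweep.

    The adjoint variable p_n accumulates dJ/du_n through all later steps;
    the gradient collects the explicit k-dependence of each model step.
    """
    N = len(obs) - 1
    u = forward(k, u0, dt, N)
    misfit = u - obs
    J = 0.5 * np.sum(misfit ** 2)
    p = np.empty(N + 1)
    p[N] = misfit[N]
    grad = 0.0
    for n in range(N - 1, -1, -1):
        # d(u_{n+1})/dk at fixed u_n is -dt * u_n.
        grad += p[n + 1] * (-dt * u[n])
        p[n] = misfit[n] + p[n + 1] * (1.0 - k * dt)
    return J, grad
```

In a twin experiment, observations generated with a known "true" k can be used to check that the adjoint gradient matches a finite-difference estimate before plugging it into a descent method.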
Chaumont, Marc. "Représentation en objets vidéo pour un codage progressif et concurrentiel des séquences d'images." PhD thesis, Université Rennes 1, 2003. http://tel.archives-ouvertes.fr/tel-00004146.
Full text
Gardes, Thomas. "Reconstruction temporelle des contaminations métalliques et organiques particulaires dans le bassin versant de l'Eure et devenir des sédiments suite à l'arasement d'un barrage. Reconstruction of anthropogenic activities in legacy sediments from the Eure River, a major tributary of the Seine Estuary (France) Flux estimation, temporal trends and source determination of trace metal contamination in a major tributary of the Seine estuary, France Temporal trends, sources, and relationships between sediment characteristics and polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) in sediment cores from the major Seine estuary tributary, France Impacts à court-terme de l'arasement d'un barrage sur la morphologie du cours d'eau et la remobilisation de sédiments contaminés par les métaux traces Bioaccessibility of polycyclic aromatic compounds (PAHs, PCBs) and trace elements: Influencing factors and determination in a river sediment core." Thesis, Normandie, 2020. http://www.theses.fr/2020NORMR038.
Full text
The anthropogenic impact on rivers has increased significantly since the industrial revolution initiated by Western countries. Changes in the geomorphology of rivers for water storage and navigation, and the conversion of land for agriculture, industry and urbanization, illustrate this environmental pressure, which results, among other things, in increased discharges of various contaminants into environmental compartments, particularly rivers. Part of these discharges can end up in the suspended particulate matter transiting in rivers, which then acts as a storage sink. River development, particularly the construction of dams, encourages the sedimentation of these contaminated particles over time. These sediments of anthropogenic origin, also called legacy sediments, are therefore witnesses to human activities and make it possible to reconstruct the temporal trajectories of contamination within watersheds. The Eure River, a major tributary of the Seine estuary, has experienced significant anthropogenic pressure since the twentieth century. The temporal reconstruction of these pressures required the combination of different methodological approaches: (i) a diachronic analysis of the morphological modifications of the river, carried out in conjunction with (ii) an analysis of the sedimentary dynamics and the nature of the sediment deposits, coupling geophysical, sedimentological and geochemical methods, and (iii) the setting up of a network monitoring the hydro-sedimentary behaviour, with continuous sampling of suspended particulate matter. Significant geomorphological changes have occurred in the lower reaches of the watershed, the main consequences being an outlet moved some ten kilometres towards a dam and the formation of hydraulic annexes favouring the accumulation of sediments as early as the 1940s.
These deposits made it possible to show that the Eure River watershed had experienced significant contamination, whose consequences are still being recorded despite the cessation of the activities or uses involved. The temporal trends of trace metal and metalloid elements showed strong As contamination in the 1940s, contamination of industrial origin in Cr, Co, Ni, Cu, Zn, Ag and Cd in the 1960s and 1970s, and contamination in Sb and Pb in 1990–2000. The latter are still recorded despite the cessation of the activities responsible for the discharges, as evidenced by results from the suspended particulate matter currently collected in the river. Like most trace metals, organic contaminants such as PAHs showed significant contamination during the 1940s–1960s, with signatures indicating a predominantly pyrogenic origin. PCBs showed significant contamination during the period 1950–1970, in connection with the production and national use of mixtures composed mainly of low-chlorinated congeners. Finally, the study of a third family of persistent organic contaminants, the organochlorine pesticides, showed the use of lindane and DDT, particularly during 1940–1970, and highlighted the post-ban use of lindane and the presence of a DDT metabolite several decades after the cessation of its use, in connection with increased erosion of cultivated soils.
Roy-Brunelle, Raphaël. "Estimation de la mortalité attendue à long terme des maladies chroniques : une comparaison de différentes méthodes d’analyse." Thesis, 2021. http://hdl.handle.net/1866/25671.
Full text
Long-term mortality associated with medical conditions is rarely known with accuracy. Despite this, mortality assessment is essential in certain fields of activity, such as life insurance and medical expertise. The constant Relative Risk (RR) methodology is the most widely used method, although it often leads to superficial and conservative estimates. We therefore compared it with two other mortality estimation methods, the constant Excess Death Rate (EDR) and the Proportional Life Expectancy (PLE). We analyzed the long-term mortality of several chronic medical conditions, such as cancers and coronary artery disease, and compared the results obtained with the three estimation methods, which allowed us to determine which methodology is the most accurate. Our results show that constant EDR and PLE are superior to constant RR for estimating mortality, and that the longer the follow-up, the greater their advantage. Finally, factors such as age and the type of medical condition do not appear to have an important impact on identifying the most suitable method. Constant EDR and PLE should be recommended for mortality assessment in medical insurance underwriting and for life expectancy evaluation in medical expertise.
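The three adjustments compared in this abstract can be sketched on a toy mortality table. In the sketch below, the Gompertz parameters and the RR, EDR and PLE values are illustrative assumptions, not the thesis's data: constant RR multiplies each annual death probability, constant EDR adds a flat excess to it, and PLE scales the resulting life expectancy directly.

```python
import numpy as np

def gompertz_qx(ages, a=5e-5, b=0.09):
    """Annual death probabilities from a toy Gompertz law (illustrative parameters)."""
    return np.minimum(1.0, a * np.exp(b * ages))

def life_expectancy(qx):
    """Curtate life expectancy: expected number of whole years survived."""
    return np.cumprod(1.0 - qx).sum()

ages = np.arange(40, 111)
qx = gompertz_qx(ages)
le_standard = life_expectancy(qx)

# Constant relative risk: multiply every annual death probability by RR.
le_rr = life_expectancy(np.minimum(1.0, 2.0 * qx))
# Constant excess death rate: add a flat extra annual death probability.
le_edr = life_expectancy(np.minimum(1.0, qx + 0.01))
# Proportional life expectancy: scale the standard life expectancy by a factor.
le_ple = 0.8 * le_standard
```

The design difference is visible in where the extra mortality lands: constant RR concentrates it at old ages, where baseline q_x is already large, while constant EDR spreads it evenly over the follow-up, which is one way to picture why the methods diverge as follow-up lengthens.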