Dissertations / Theses on the topic 'Méthodes élémentaires'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Méthodes élémentaires.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Poyet, Patrice. "Méthodes de discrimination des anomalies géochimiques multi élémentaires significatives." Nice, 1986. http://www.theses.fr/1986NICE4005.
Reignier, Denis. "Contributions théoriques à l'étude de quelques réactions élémentaires à trois atomes." Bordeaux 1, 2001. http://www.theses.fr/2001BOR12314.
Hitti, Karim. "Simulation numérique de Volumes Élémentaires Représentatifs (VERs) complexes : Génération, Résolution et Homogénéisation." Phd thesis, École Nationale Supérieure des Mines de Paris, 2011. http://pastel.archives-ouvertes.fr/pastel-00667428.
Maerschalk, Thierry. "Study of Triple-GEM detector for the upgrade of the CMS muon spectrometer at LHC." Doctoral thesis, Universite Libre de Bruxelles, 2016. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/233187.
This PhD thesis takes place within the upgrade of the CMS experiment at CERN's large proton collider, the LHC. CMS, together with the ATLAS experiment, enabled the discovery of the Brout-Englert-Higgs boson in 2012. But the LHC research programme is far from over: the machine is scheduled to run for at least another 20 years. Over that period, the luminosity will progressively increase until it reaches, by 2025, about five times the nominal value of 10^34 cm^-2 s^-1 initially foreseen. This increase in luminosity pushes the LHC experiments, such as CMS, to upgrade their detectors and their data acquisition systems. One of the next major CMS upgrades is the addition of a new detection layer in the forward muon spectrometer. The detection technology chosen by the CMS collaboration is the Triple Gas Electron Multiplier (Triple-GEM). This upgrade aims to maintain the performance of the trigger system despite the increase in particle rate (> 1 kHz/cm^2) and, thanks to the very good spatial resolution of the Triple-GEMs (~250 μm), to improve muon track reconstruction. The study of the characteristics of this technology is the subject of this thesis. The characterisation of Triple-GEM detectors begins with a detailed study of the time resolution, carried out with various Monte Carlo simulations such as GARFIELD, which showed that Triple-GEMs equipped with the new VFAT3 electronics (developed specifically for Triple-GEMs) meet the requirements of the CMS upgrade. We then studied several prototypes.
We first built two small (10 × 10 cm^2) Triple-GEM prototypes and developed a test bench in the ULB laboratory. This test bench allowed us to study another important parameter of Triple-GEM detectors: the gain. During this thesis we also took part in the installation and data taking of several beam tests at CERN; the analysis of the October 2014 beam-test data is presented in detail. The last part of this work concerns the study of the spatial resolution. We estimated the spatial resolution by Monte Carlo simulation as well as analytically for GEM detectors equipped with binary electronics. This study was also generalised to other detectors such as Micromegas and silicon sensors.
Doctorat en Sciences
info:eu-repo/semantics/nonPublished
Alegret, Cyril. "Développement de méthodes génériques de corrélation entre les mesures électriques & physiques des composants et les étapes élémentaires de fabrication." Phd thesis, Grenoble 1, 2006. http://www.theses.fr/2006GRE10195.
For sub-90 nm technologies, the complexity of the structures is such that process control has become an essential activity for semiconductor fabs. In this context, engineers face two major challenges: the first is to characterise and reduce the variability of the electrical properties of the structures; the second is to optimise the control of this variability in order to guarantee circuit performance. This PhD thesis proposes a global analysis methodology, based on the development of advanced statistical methods (multivariate and neural algorithms), to correlate the electrical and physical measurements of the components with the equipment parameters collected during processing. The resulting models lead to the optimisation of each component of process control (Statistical Process Control, Fault Detection and Classification, and Run to Run).
Alegret, Cyril. "Développement de méthodes génériques de corrélation entre les mesures électriques & physiques des composants et les étapes élémentaires de fabrication." Phd thesis, Université Joseph Fourier (Grenoble), 2006. http://tel.archives-ouvertes.fr/tel-00122893.
This thesis implements a global analysis methodology, based on the development of advanced statistical methods (multivariate and neural tools), to correlate the electrical and physical measurements of the components with the equipment parameters. The modelling results lead to the optimisation of all components of process control (Statistical Process Control, Fault Detection and Classification, Run to Run).
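As an illustration of the kind of correlation analysis this abstract describes, the following Python fragment computes a standardized cross-correlation matrix between equipment parameters and electrical measurements. This is a minimal sketch under assumed data layouts, not Alegret's actual multivariate or neural methodology; all names are illustrative.

```python
import numpy as np

def cross_correlations(equipment, electrical):
    """Correlation matrix between equipment parameters (columns of `equipment`)
    and electrical measurements (columns of `electrical`), one row per lot."""
    def zscore(a):
        # standardize each column to zero mean and unit variance
        return (a - a.mean(axis=0)) / a.std(axis=0)
    E, Y = zscore(equipment), zscore(electrical)
    # for standardized columns, the cross-correlation is E^T Y / n
    return E.T @ Y / len(E)

# hypothetical data: 200 lots, 3 equipment parameters, 2 electrical measurements
rng = np.random.default_rng(0)
equipment = rng.normal(size=(200, 3))
electrical = np.column_stack([
    2.0 * equipment[:, 0] + 0.1 * rng.normal(size=200),  # driven by parameter 0
    rng.normal(size=200),                                # independent noise
])
C = cross_correlations(equipment, electrical)
```

In a real fab setting, such a matrix would only be a screening step before building multivariate (e.g. PLS) or neural models of the kind the thesis develops.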
Geneste, Grégory. "Mécanismes élémentaires de la croissance oxyde sur oxyde : étude ab initio et semi-empirique de MgO/MgO(001)." Toulouse 3, 2003. http://www.theses.fr/2003TOU30156.
Tahri, Aomar. "Contribution à l'étude de la méthanation des oxydes de carbone catalysée par du nickel dispersé sur silice : Étude en régime permanent de la cinétique réactionnelle et application de méthodes transitoires à l'identification de processus élémentaires." Nancy 1, 1987. http://www.theses.fr/1987NAN10148.
Full textMazet, Vincent. "Développement de méthodes de traitement de signaux spectroscopiques : estimation de la ligne de base et du spectre de raies." Phd thesis, Université Henri Poincaré - Nancy I, 2005. http://tel.archives-ouvertes.fr/tel-00011477.
A deterministic method is first proposed to estimate the baseline of the spectra by the polynomial that minimises a non-quadratic cost function (Huber function or truncated parabola). In particular, the asymmetric versions are well suited to spectra whose peaks are positive. The minimisation uses the half-quadratic algorithm LEGEND.
We then estimate the line spectrum: the Bayesian approach coupled with MCMC techniques provides a very effective framework. A first approach formalises the problem as unsupervised myopic sparse-spike deconvolution. In particular, the spike train is modelled by a Bernoulli-Gaussian process with positive support; a mixed acceptance-rejection algorithm simulates positive truncated normal distributions. An interesting alternative is to treat the problem as a decomposition into elementary patterns. An original model is then introduced, whose advantage is to keep the order of the system fixed. The label-switching problem is also studied and a relabelling algorithm is proposed.
The algorithms are validated on simulated spectra, then on real infrared and Raman spectra.
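The baseline estimation step described in this abstract can be sketched as an iteratively reweighted polynomial fit with an asymmetric Huber-type cost. The fragment below is a minimal illustration of that idea only; it is not the LEGEND algorithm itself, and the function names, degree and thresholds are illustrative assumptions.

```python
import numpy as np

def huber_weights(r, delta):
    # IRLS weights derived from the Huber cost: 1 inside [-delta, delta],
    # delta/|r| outside (quadratic centre, linear tails)
    a = np.abs(r)
    w = np.ones_like(a)
    big = a > delta
    w[big] = delta / a[big]
    return w

def estimate_baseline(x, y, degree=3, delta=0.05, asym=0.05, n_iter=60):
    """Polynomial baseline fit: positive residuals (peaks rising above the
    baseline) are down-weighted further, which suits positive-peak spectra."""
    V = np.vander(x, degree + 1)
    w = np.ones_like(y)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        # weighted least-squares fit of the polynomial coefficients
        coef, *_ = np.linalg.lstsq(V * sw[:, None], y * sw, rcond=None)
        r = y - V @ coef
        w = huber_weights(r, delta)
        w[r > 0] *= asym  # asymmetry: distrust points lying above the fit
    return V @ coef

# synthetic spectrum: smooth baseline plus two positive peaks
x = np.linspace(0.0, 1.0, 300)
true_base = 1.0 + 0.5 * x - 0.3 * x**2
peaks = (2.0 * np.exp(-0.5 * ((x - 0.3) / 0.010) ** 2)
         + 1.5 * np.exp(-0.5 * ((x - 0.7) / 0.015) ** 2))
baseline = estimate_baseline(x, true_base + peaks)
```

On this noise-free example the asymmetric reweighting makes the fit hug the true baseline and ignore the peaks.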
Zenoni, Florian. "Study of Triple-GEM detectors for the CMS muon spectrometer upgrade at LHC and study of the forward-backward charge asymmetry for the search of extra neutral gauge bosons." Doctoral thesis, Universite Libre de Bruxelles, 2016. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/229354.
This PhD thesis takes place within the CMS experiment at CERN's Large Hadron Collider (LHC). The LHC enabled the discovery of the Brout-Englert-Higgs boson in 2012, and is designed to run for at least 20 years, with an increasing luminosity that will reach by 2025 a value of 7.5 x 10^34 cm^-2 s^-1, a yield five times greater than the one initially intended. As a consequence, the experiments must adapt and upgrade many of their components and particle detectors. One of the foreseen upgrades of the CMS experiment concerns the Triple Gas Electron Multiplier (GEM) detectors, currently in development for the forward muon spectrometer. These detectors will be installed in CMS during the second long LHC shutdown (LS2), in 2018-2019. The aim of this upgrade is to better control the Level-1 trigger rate for muon detection, thanks to the high performance of these Triple GEM detectors in the presence of very high particle rates (> 1 kHz/cm^2). Moreover, thanks to its excellent spatial resolution (~250 um), the GEM technology can improve the muon track reconstruction and the identification capability of the forward detector. The goal of my research is to estimate the sensitivity of Triple GEMs to the hostile background radiation in CMS, essentially made of neutrons and photons generated by the interaction between the particles and the CMS detectors. The accurate evaluation of this sensitivity is very important, as an underestimation could have ruinous effects on the Triple GEM efficiency once they are installed in CMS. To validate my simulations, I have reproduced experimental results obtained with similar detectors already installed in CMS, such as the Resistive Plate Chambers (RPC). The second part of my work concerns the study of the CMS experiment's capability to discriminate between different models of new physics predicting the existence of neutral vector bosons called Z'. These models belong to plausible extensions of the Standard Model.
In particular, the analysis focuses on simulated samples in which the Z' decays into two muons, and on the impact that the Triple GEM detector upgrades will bring to these measurements during the high-luminosity phase of the LHC, called Phase II. My simulations show that more than 20% of the simulated events have at least one muon in the CMS pseudo-rapidity (eta) region covered by Triple GEM detectors. Preliminary results show that, in the case of 3 TeV/c^2 models, it will be possible already at the end of Phase I to discriminate a Z'I from a Z'SSM with a significance level alpha > 3 sigma.
Doctorat en Sciences
info:eu-repo/semantics/nonPublished
Petuya-Poublan, Rémi. "Contribution à la description théorique de la dynamique des processus élémentaires hétérogènes : collisions de l'azote moléculaire et de l'hydrogène atomique avec des surfaces de tungstène." Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0134/document.
Heterogeneous elementary processes at the gas-solid interface are of great interest in many domains, such as heterogeneous catalysis, the chemistry of atmospheric and interstellar media, spacecraft atmospheric re-entry, and the description of plasma-wall interactions. This thesis focuses on the dynamics of the non-reactive scattering of nitrogen, N2, on a tungsten W(100) surface, and of hydrogen, H2, recombination processes on the tungsten surfaces W(100) and W(110). The quasiclassical dynamics of these processes is simulated using potential energy surfaces based on density functional theory calculations. In particular, a multi-adsorbate potential is developed to include surface coverage in the dynamics simulations, in order to scrutinize the interplay between direct abstraction, the so-called Eley-Rideal recombination, and the Hot-Atom recombination process following hyperthermal diffusion on the surface.
Joly, Philippe. "Pour une éducation esthétique à l'école élémentaire." Université Marc Bloch (Strasbourg) (1971-2008), 1992. http://www.theses.fr/1992STR20035.
The specific aim of "aesthetic" education (limited here to the pictorial arts), as distinct from technical training in art, is to develop and sharpen children's perception and curiosity. The point is not to "train the pupils' taste" and inculcate in them "what to like", but to teach them how to tell why they like, or do not like, or like less, a picture, and to help them find the exact words and reasons to express it; in short, to educate them so that they can truly have a taste of their own. Working on a set of reproductions of paintings, pupils can note down their judgements by means of a questionnaire. An evaluation chart of children's abilities in art appreciation over the school years is thus provided. A practical approach with an experimental class aims at giving teachers some directions for building a structured curriculum in this particular field, mostly concerning the reading of paintings and appreciative vocabulary, without forgetting collaboration with cultural partners such as fine arts museums.
Gozé, Vincent. "Une version effective du théorème des nombres premiers de Wen Chao Lu." Electronic Thesis or Diss., Littoral, 2024. http://www.theses.fr/2024DUNK0725.
The prime number theorem, first proved in 1896 using complex analysis, gives the main term for the asymptotic distribution of prime numbers. It was not until 1949 that the first so-called "elementary" proof was published: it rests strictly on real analysis. In 1999, Wen Chao Lu obtained by an elementary method an error term in the prime number theorem very close to the one provided by the zero-free region of the Riemann zeta function given by La Vallée Poussin at the end of the 19th century. In this thesis, we make Lu's result explicit in order, firstly, to give the best error term obtained by elementary methods so far, and secondly, to explore the limits of his method.
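As a quick numerical illustration of the theorem (not taken from the thesis), one can compare the prime-counting function π(x), computed with a sieve, against the main term x/log x:

```python
import math

def prime_pi(n):
    """Count primes <= n with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # cross out multiples of p starting at p*p
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sum(sieve)

for x in (10**4, 10**5, 10**6):
    approx = x / math.log(x)
    print(x, prime_pi(x), round(approx), f"{prime_pi(x) / approx - 1:+.3%}")
```

The relative deviation of π(x) from x/log x shrinks only slowly; quantifying that error term precisely is exactly what separates the various elementary proofs.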
Wallet, Jacques. "Images animées et enseignement de la géographie pour les élèves de l'école élémentaire et du collège." Paris 7, 1994. http://www.theses.fr/1994PA070009.
The subject of this research is didactical. The movie has never become a traditional document in the French way of teaching geography. The research tries to explain the place of a movie in a didactical project; the history of the production of movies, from early documentaries ("school movies") to new technologies, is the main issue.
Belyanovskaya, Alexandra. "Composition élémentaire de mammifères dans les zones naturelles et anthropiques et impacts potentiels avec la méthode USEtox." Thesis, Paris, ENSAM, 2019. http://www.theses.fr/2019ENAM0049.
The geochemical heterogeneity of the biosphere, due to different natural and anthropogenic conditions, is changing significantly as a result of the development of man and society. Modern geo-ecological studies of different territories demonstrate the close connections of living organisms with their environment. However, little attention is paid to complex approaches, for example the use of impact assessment models for living organisms. The Life Cycle Assessment (LCA) method allows the magnitude and significance of the impact on the environment and the human organism to be monitored. The characterisation factor (CF) is a tabular value proposed by the model, depending on the region's location. In this thesis, it is proposed to modify this coefficient by introducing the results of biogeochemical analysis of territories with different ecological situations, in order to rank them more precisely. It is this modification which determines the relevance of the study. The PhD thesis aims to assess the geo-ecological state of local areas of Russia and Kazakhstan using indicators of the elemental composition of organs and tissues of mammals, and to rank the toxicity of individual elements using the USEtox model. The modified USEtox impact assessment model, fed with the results of the chemical analysis, can be used as a local supplement in the assessment of toxic effects on the population.
Bédard, Jean-François. "La mesure de l'expression : physiognomonie et caractère dans la Nouvelle méthode de Jean-Jacques Lequeu." Thesis, McGill University, 1992. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=61158.
Lauze, François. "Sur la résolution minimale des idéaux d'arrangement de points génériques dans les espaces projectifs." Phd thesis, Université de Nice Sophia-Antipolis, 1994. http://tel.archives-ouvertes.fr/tel-00465375.
Full textGuimarães, Thomazi Aurea Regina. "L'enseignant de l'école élémentaire et le curriculum de la lecture : enquête à Belo Horizonte (Brésil)." Paris 5, 2005. http://www.theses.fr/2005PA05H012.
This research examines the reading practices developed by elementary school teachers in Brazil. We try to understand the possibilities of forming readers at school, and we focus our analysis on the reading curriculum each teacher constructs, based on their statements concerning the types of texts they use and the activities carried out in the classroom and in the class or school library. We then analyse other aspects concerning the teacher: his background, his beliefs about shaping a student reader, his workday, his planning, and above all his personal relationship with reading, in childhood, as a teenager, and today. Our theoretical framework is based on the sociology of teachers, the sociology of the curriculum and the sociology of reading. The methodology is grounded on interviews, answers to written questions, and documents, handled through content analysis.
Demesmay-Guilhin, Claire. "Spéciation de l'arsenic par couplages chromatographie en phase liquide et méthodes spectrales d'analyse élémentaire spécifiques : application à des sols et sédiments." Lyon 1, 1992. http://www.theses.fr/1992LYO10270.
Full textPansart, Lucie. "Algorithmes de chemin élémentaire : application aux échanges de reins." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALM039.
This thesis deals with elementary path problems and their application to the kidney exchange problem. We focus on kidney exchange programs including altruistic donors, which are crucial for patients with renal disease and challenging for operations research methods. The goal of this work is to develop an efficient algorithm that can be used to solve future instances, which are likely to involve a large number of donors and patients. While we progress on this topic, we encounter closely related problems on packing, vehicle routing and stable set. For this last problem, we introduce a new extended formulation and prove it is ideal and compact for claw-free perfect graphs by characterizing its polytope. We then concentrate on the design of a column generation dedicated to the kidney exchange problem and confront its NP-hard pricing problem. The specific problem that we address is the elementary path problem with length constraint, which models the search for interesting chains of donation to add during the pricing step. We investigate dynamic approaches, in particular the NG-route relaxation and the color coding heuristic, and improve them by exploiting the length constraint and sparsity of graphs. We study the color coding in a more general context, providing a guaranteed improvement by proposing new randomized strategies. They are based on ordering the graph before coloring it and introduce a bias in the probability distribution to increase the probability of finding an optimal solution
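The color coding heuristic mentioned in this abstract can be sketched as follows. This is a generic Alon-Yuster-Zwick style check for the existence of an elementary (simple) path on k vertices, not the thesis's dedicated pricing algorithm; the graph encoding and trial count are illustrative assumptions.

```python
import random

def colorful_path_exists(adj, colors, k):
    """DP over (vertex, colour subset): reachable[v] holds bitmasks of colour
    sets used by some walk ending at v; distinct colours force distinct
    vertices, so any full set corresponds to an elementary path."""
    n = len(adj)
    reachable = [{1 << colors[v]} for v in range(n)]
    for _ in range(k - 1):
        for v in range(n):
            for s in list(reachable[v]):
                for u in adj[v]:
                    if not (s >> colors[u]) & 1:  # colour of u not used yet
                        reachable[u].add(s | (1 << colors[u]))
    full = (1 << k) - 1
    return any(full in r for r in reachable)

def has_elementary_path(adj, k, trials=400):
    """Colour coding: a fixed k-vertex simple path is colourful with
    probability k!/k**k, so enough random colourings find it w.h.p."""
    for _ in range(trials):
        colors = [random.randrange(k) for _ in range(len(adj))]
        if colorful_path_exists(adj, colors, k):
            return True
    return False
```

The thesis's randomized strategies refine exactly this step: ordering the vertices before colouring and biasing the colour distribution both raise the probability that a single trial succeeds.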
Bloch-Michel, Valérie. "Logiciel d'estimation de paramètres cinétiques de processus élémentaires en phase gazeuse." Vandoeuvre-les-Nancy, INPL, 1995. http://www.theses.fr/1995INPL010N.
Full textArnal, Mathieu. "Développement d'une évaluation génomique pour l'analyse de données longitudinales : application aux contrôles élémentaires chez les caprins laitiers." Thesis, Toulouse, INPT, 2019. http://www.theses.fr/2019INPT0124.
Genetic improvement of dairy goats is based on the measurement of the quantity and quality of the milk production of females on farms, at intervals of 4 to 5 weeks during lactation, according to strict protocols. The genetic evaluation of milk quantity is based on the estimation of the total quantity of milk produced per lactation. Selection on this total tends to favour animals with increasingly high production at the peak of lactation, and high production at the beginning of lactation can cause metabolic problems for females. In addition, in the seasonal production context of dairy goats, a milk production that is maintained after the peak, i.e. persistent, would allow a more spread-out milk supply, in line with market expectations. There is therefore a zootechnical and economic interest in selecting more persistent dairy goats. In our study, the approach consists in modelling the shape of the lactation curve from the information collected at each farm test. Models allowing the analysis of such longitudinal data are generally called test-day models. One of their main advantages is a better accounting of the environmental effects on test-day production, through a herd-test-day effect that depends only on the animals present at the test. The second advantage of this type of model is that most genetic and environmental effects are modelled as curves, so it becomes possible to select animals with the best genetic value for persistency. The development of these models requires a prior study of the environmental effects affecting milk production over time. Following a detailed descriptive analysis of the lactation curves of the two main French goat breeds (Alpine and Saanen), we showed that there was variability in the shape of the lactation curves; in particular, the month of kidding was involved in the different curve shapes.
We then proposed a random regression model, similar to the one developed for French dairy cattle. The proposed modelling directly yields two genetic indexes: one corresponding to the genetic value of the animal for the total quantity of milk during lactation, and a second corresponding to the genetic value for milk persistency, with no correlation between the two. The proposed model is more relevant than the current one because it treats primiparous and multiparous goats separately. We also studied correlations between indexes of different traits during lactation, and correlations between persistency and AI fertility or longevity. In the last part of the thesis, we extended the test-day genetic evaluation to a genomic evaluation model (single-step GBLUP) exploiting all available molecular information (50K SNP genotyping). This model was validated and compared with the current one. The main difference between the single-step GBLUP RRM and the single-step GBLUP lactation model currently in use lay in the averages of the estimated indexes per year of birth of the bucks. Finally, from the single-step GBLUP model we identified some regions of interest of the genome linked to milk persistency.
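The test-day modelling idea can be illustrated by fitting a lactation curve with Legendre polynomials of days in milk, the usual covariates of random regression models. The sketch below only fits a mean curve to hypothetical records; the actual evaluation estimates genetic random-regression coefficients in a (single-step G)BLUP framework, and all data values here are invented.

```python
import numpy as np
from numpy.polynomial import legendre

def fit_lactation_curve(days, yields, order=2):
    """Rescale days in milk to [-1, 1], then fit Legendre polynomials,
    as commonly done in random-regression test-day models."""
    t = 2.0 * (days - days.min()) / (days.max() - days.min()) - 1.0
    coefs = legendre.legfit(t, yields, order)
    return t, coefs

# hypothetical test-day records: days in milk and milk yield (kg/day)
days = np.array([10.0, 40.0, 70.0, 100.0, 150.0, 200.0, 250.0])
yields = np.array([3.2, 3.8, 3.6, 3.3, 3.0, 2.6, 2.2])
t, coefs = fit_lactation_curve(days, yields)
fitted = legendre.legval(t, coefs)
```

In a genetic model, each animal would get its own vector of such coefficients as random effects, from which a total-yield index and a persistency index can be derived.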
Borchardt, Gaëlle. "L’influence des connaissances graphotactiques sur l’acquisition de l’orthographe lexicale : étude chez l’enfant d’école élémentaire et chez l’adulte." Thesis, Paris 5, 2012. http://www.theses.fr/2012PA05H114/document.
No abstract available in English.
Georgiou, Aubin. "Modélisation à l'échelle mésoscopique de la cellule élémentaire d'un renfort composite 3D interlock à partir de tomographies à rayon X." Master's thesis, Université Laval, 2021. http://hdl.handle.net/20.500.11794/70370.
This thesis presents the work conducted toward the development of a finite element model of the representative unit cell of a 3D interlock composite, based on tomographies. This material is composed of yarns woven in the warp and weft directions, as well as through the thickness. This kind of 3D interlock has been very little studied to date, which makes numerical modelling of its unit cell necessary. It nevertheless presents advantages over other types of composites, in particular the absence of delamination, the main cause of composite degradation. This work presents the approach and the means employed to obtain the numerical unit cell of the material. Tomographies of different states of the dry fabric, uncompacted and compacted, were carried out in order to generate a finite element model using the image processing software AVIZO and the textile generation software TEXGEN. A simplified model of a unit cell slice was chosen in order to simulate and validate the numerical compaction process with the Abaqus finite element software. The results of the compaction model were compared with tomographies of the dry textile. The results of this simplified simulation are promising: the deformations and displacements of the yarns obtained on the unit cell of the simulated textile are very close to those observed in tomographies of the compacted textile.
Dimitriou, Konstantinos. "Etude des collisions d'ions et d'électrons avec atomes et molécules par la méthode des trajectoires classiques." Paris 11, 2001. http://www.theses.fr/2001PA112333.
The aim of this work is the evaluation of cross sections for the reactions observed in collisions between electrons, ions and simple molecules. The classical trajectory Monte Carlo (CTMC) method, which has frequently been used for calculating such cross sections, is here extended to treat four- and five-body collisions, and applied to the study of H-H, H+-H2+, He++-H2+ and H+-H2 collisions. In order to extend the field of application, quantum calculations of the linear combination of atomic orbitals (LCAO) type and an evaluation of electron - rare gas ionization cross sections have also been carried out. The ionization and charge transfer cross sections obtained in this work have been compared with those from calculations available in the literature, as well as with experimental results. It has thus been verified that the CTMC method yields reliable results over a wide range of collision energies. During this work, a detailed study of the categories of trajectories which constitute the solution of the few-body dynamical systems has been necessary.
Blanchet, Florent. "Etude de la coupe en perçage par le biais d'essais élémentaires en coupe orthogonale : application aux composites carbone-époxy." Thesis, Toulouse 3, 2015. http://www.theses.fr/2015TOU30164/document.
In the aeronautical sector, the assembly of CFRP composite structures requires the drilling of fastener holes. Requirements of reliability, productivity and machining quality are imposed on the drilling process. The operation must be mastered, which requires a better understanding of the phenomena occurring during drilling. But the study of these phenomena faces several major challenges, such as the confined nature of the operation, the complex and variable geometry of the cutting tool, and the speed variation along the main cutting edge. To overcome these obstacles, elementary tests representative of the drilling cutting phenomena can be implemented. This is the purpose of this work, which is organised along three axes. The first is a study of the geometrical and kinematic representativeness of the elementary tests with regard to the drilling operation. A tool geometry identification program is developed: it identifies the evolution of the local geometry of cutting tools along the main cutting edges and allows the definition of elementary tests that are geometrically and kinematically representative of drilling. The second axis proposes a study of the phenomena occurring in quasi-static orthogonal cutting, so that effects related to the cutting speed are not considered. In this context, digital image correlation tests during orthogonal cutting are conducted, leading to the analysis of the displacement and strain fields. The generated forces, chip morphology and machined surface texture are also investigated, in relation to the rake angle and the angle between the cutting speed and the fibre direction. For this configuration, two types of numerical models, a macro-mechanical and a micro-mechanical one, are developed and compared with the experimental results. The last axis of this work concerns the analysis of phenomena related to the cutting speed during orthogonal cutting.
The macro-mechanical model is modified to include such phenomena as the variation of the matrix breaking stress as a function of strain rate, or the evolution of the friction coefficient with the sliding velocity. The model results are compared with experimental results.
Ackerer, Julien. "Mécanismes et taux de dénudation d'un bassin versant élémentaire (Strengbach, France) : apport de l'étude couplée des méthodes de datation isotopique (déséquilibres U-Th-Ra, 10Be in situ) et des méthodes de modélisation hydrogéochimique (KIRMAT)." Thesis, Strasbourg, 2017. http://www.theses.fr/2017STRAH002/document.
In this PhD work, the combination of geochemical analytical and isotopic approaches with modeling approaches has brought new insights into the understanding of the critical zone and the regolith. Concerning the regolith, this work presents a methodology for analysing the elemental geochemistry, the mineralogy, the U-Th-Ra isotopes and the in situ cosmogenic 10Be along a single weathering profile. The results highlight the importance of the spatial resolution of sampling for an exhaustive interpretation of the U-Th-Ra and in situ 10Be data, especially to determine independently the key parameters of long-term regolith production and denudation rates. The two weathering profiles realized in this study furthermore show that (1) the regolith structure is relatively simple on ridge-tops and allows a continuous interpretation of the geochemical and mineralogical data, and (2) slope processes tend to increase the spatial heterogeneity of the regolith and of the weathering processes. In addition, the monitoring and modeling of surface waters make it possible to investigate present-day weathering processes and to understand the mechanisms involved in their recent variability. This work shows the relationship that can exist between the modifications recorded at the surface in the soil solutions and the temporal evolution of some chemical properties of the spring waters (pH, calcium concentration). The simulations also help to explain the weak variability of the global weathering fluxes exported by the springs, in relation with the relative stability of the sodium and dissolved silica concentrations over the period 1990-2010.
This study also demonstrates the interest of coupling methods that provide information on weathering and erosion processes at different time and space scales, in particular to correctly evaluate the regolith dynamics and to position the present-day functioning of a watershed with respect to its long-term evolution.
Ragheb, Diana. "Développement de la méthode PIXE à haute énergie auprès du cyclotron ARRONAX." Phd thesis, Ecole des Mines de Nantes, 2014. http://tel.archives-ouvertes.fr/tel-01062444.
Full textWaksman, Sylvie Yona. "Les céramiques byzantines des fouilles de Pergame : caractérisation des productions locales et importées par analyse élémentaire par les méthodes PIXE et INAA et par pétrographie." Université Louis Pasteur (Strasbourg) (1971-2008), 1995. http://www.theses.fr/1995STR13153.
Full textEl, Hajjar Ragheb Diana. "Développement de la méthode PIXE à haute énergie auprès du cyclotron ARRONAX." Thesis, Nantes, Ecole des Mines, 2014. http://www.theses.fr/2014EMNA0137/document.
Full textParticle Induced X-ray Emission (PIXE) is a fast, non-destructive, multi-elemental analysis technique. It is based on the detection of the characteristic X-rays produced by the interaction of accelerated charged particles with matter. This method is successfully used in various fields of application with low-energy protons (energies of a few MeV), reaching limits of detection of the order of μg/g (ppm). At such low energies, the depth of analysis is limited. At the ARRONAX cyclotron, protons and alpha particles are delivered with energies up to 70 MeV, allowing the development of the high-energy PIXE technique. Thanks to these beams, we mainly produce K X-rays, which are more energetic than the L X-rays used in standard PIXE for heavy element analysis; in-depth analysis of thick materials is therefore achievable. For light element analysis, the PIGE technique, based on the detection of gamma rays emitted by excited nuclei, may be used in combination with PIXE. First of all, we introduce the characteristics and principles of the high-energy PIXE analysis that we have developed at ARRONAX. Then we detail the performance achieved, particularly in terms of detection limits under various experimental conditions. Finally, we present the results obtained for the analysis of multilayer samples and the quantification of trace elements in thick samples.
Serrero, Françoise. "Vitruve 1982-1995, les années Jean Marc : regard sur une école publique qui réfléchit à la construction des savoirs et à l'apprentissage de la citoyenneté." Paris 7, 1999. http://www.theses.fr/1999PA070105.
Full textPerrot, Camille. "Microstructure et Macro-Comportement Acoustique : Approche par reconstruction d'une Cellule Élémentaire Représentative." Phd thesis, INSA de Lyon, 2006. http://tel.archives-ouvertes.fr/tel-00123464.
Full textGarcia, Sanchez David. "Modélisation de la demande énergétique des bâtiments à l'échelle urbaine : contribution de l'analyse de sensibilité à l'élaboration de modèles flexibles." Phd thesis, Ecole des Mines de Nantes, 2012. http://tel.archives-ouvertes.fr/tel-00800205.
Full textGlanowski, Thomas. "Compréhension et modélisation des mécanismes élémentaires d’endommagement en fatigue d’élastomères renforcés au noir de carbone." Thesis, Brest, École nationale supérieure de techniques avancées Bretagne, 2019. http://www.theses.fr/2019ENTA0009.
Full textThe fatigue properties of carbon black filled elastomers are strongly related to the inclusion population induced by complex recipes and the successive stages of the manufacturing process (mixing, injection and curing). The improvement of these properties requires, first, the ability to describe the statistical features of this inclusion population in terms of nature, size, geometry, orientation and spatial distribution. Then, a detailed understanding of the damage mechanisms is required in order to define the mechanical criticality of inclusions according to their characteristics under cyclic loading. This study first presents the tools developed, based on a detailed analysis of X-ray micro-tomography data. The results obtained on the inclusion populations and the induced damage highlight the potential of these tools and their current limits for the studied materials. Atypical inclusions, unknown in the literature, have been discovered. The cavitation mechanism appears to be the most critical regarding fatigue because it leads to micro-cracks that propagate in the matrix. A comparison of the criticality of the inclusion parameters with respect to a cavitation criterion is carried out through a parametric study using finite element simulations. Finally, thermographic measurements at the inclusion scale show the additional investigations needed for a better understanding of the damage mechanisms at this scale.
Mesradi, Mohammed Reda. "Mesures expérimentales et simulation Monte Carlo de la fonction de réponse d'un détecteur Si(Li) : application à l'analyse quantitative multi-élémentaire par XRF et PIXE." Strasbourg, 2009. http://www.theses.fr/2009STRA6188.
Full textIn order to diversify the analytical capabilities of the RaMsEs group with the X-ray fluorescence methods XRF and PIXE, quantitative analyses are being developed for environmental samples. Monte Carlo simulations are used to validate some of the results. The XRF experiments are performed using 241Am as the primary source of electromagnetic radiation. PIXE experiments are done with protons delivered by the 4 MV Van de Graaff accelerator of the Institut d'Electronique du Solide et des Systèmes (InESS) laboratory of Strasbourg. Determination of the elemental composition of a sample is based on the measurement of the X-rays emitted by the elements present in the sample, in conjunction with the detector response and, for all except thin targets, the self-absorption of X-rays in the target (matrix effect). In XRF, the self-absorption was experimentally determined by a method described in [Leroux et al., 1966; Tertian and Claisse, 1982] and by simulations with the MCNP code. For PIXE, the self-absorption was determined by MCNP. The intrinsic efficiency of the Si(Li) detector has been determined by three methods: with calibrated radioactive sources, by XRF of metal foils with 241Am, and by PIXE with 2 and 3.8 MeV protons. The experimental results were supported by MCNP and GEANT simulations.
Martiano, Sandie. "Pratiques et représentations de l'écrit chez les élèves de cycle III de l'école élémentaire au regard de différents modèles pédagogiques." Paris 12, 2005. https://athena.u-pec.fr/primo-explore/search?query=any,exact,990003940910204611&vid=upec.
Full textThe purpose of the study is to grasp the influence of the implementation of various pedagogical models on the practices and representations of writing among pupils. The point is to identify the correlation between the characteristic parameters of these models and the evolution of pupils' practices and representations of writing. Our research was carried out on five groups of third-year pupils in cycle III. Various data have been analyzed: class observations, audio recordings of activities with the whole group or with smaller groups, written questionnaires filled in by the pupils to collect their representations of writing and of group work, pupils' texts, and semi-structured interviews conducted with the collaboration of the teachers. The results of this study show that the evolution of practices and representations of writing is conditioned by the presence of some parameters, or by the combination of several parameters, that are typical of the pedagogical models.
Khdir, Younis Khalid. "Non-linear numerical homogenization : application to elasto-plastic materials." Thesis, Lille 1, 2014. http://www.theses.fr/2014LIL10023/document.
Full textThis PhD dissertation deals with the numerical homogenization of heterogeneous elastic-plastic random media via large-volume computations. The dissertation is presented in two main parts. The first part is dedicated to the effective elastic-plastic response of random two-phase composites stretched under uniaxial loading. The second part is focused on the effective yield response of random porous media over a wide range of stress triaxialities. In the first part, we describe a computational homogenization methodology to estimate the effective elastic-plastic response of random two-phase composite media. The method is based on finite element simulations using three-dimensional cubic cells of different sizes, all smaller than the deterministic representative volume element of the microstructure. In the second part, we describe a computational homogenization study, using the finite element method, of three-dimensional cubic cells in order to estimate the effective yield surface of random porous media containing one or two populations of voids. The representativity of the overall yield surface estimates is examined using cubic cells containing randomly distributed and oriented spheroidal voids. The computational results are compared with some existing Gurson-type yield criteria.
El, Moumen Ahmed. "Prévision du comportement des matériaux hétérogènes basée sur l’homogénéisation numérique : modélisation, visualisation et étude morphologique." Thesis, Lille 1, 2014. http://www.theses.fr/2014LIL10077/document.
Full textHomogenization is a micro-macro transition technique taking into account the influence of the morphological, mechanical and statistical parameters of the representative microstructure of a heterogeneous material. Numerical modeling has contributed significantly to the development of this technique for determining the physical and mechanical properties of bi- and multi-phase heterogeneous materials. The main objective of this work is the prediction of the macroscopic elastic and thermal behavior of heterogeneous materials. The mechanical and thermal behaviors were determined numerically and compared with experimental and analytical results. The variation of the representative volume element (RVE) size versus volume fraction and contrast was analyzed. This study showed the importance of a rigorous determination of the optimal RVE size: it must take into account several parameters, such as volume fraction, contrast, type of property and the morphology of the heterogeneities. A new concept of equivalent morphology was proposed. This concept establishes the equivalence between the elastic and thermal characteristics of a microstructure of heterogeneous materials with complex morphology and those of a microstructure containing spherical particles. This work led us to the development of a comprehensive approach to microstructural design integrating the real morphology of the phases of heterogeneous microstructures and combining image visualization, morphological study, and geometric and numerical modeling.
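The RVE-size analyses described in the two abstracts above rest on a common statistical idea: average an apparent property over several microstructure realizations of a given cell size, and accept that size as representative once the sampling error on the mean is small enough. A minimal sketch of that acceptance test follows; the 2-sigma error estimate and the 5% tolerance are illustrative choices, not values taken from these theses:

```python
import statistics

def relative_sampling_error(apparent_values):
    """Relative ~95% (2-sigma) error on the ensemble mean of apparent
    property values obtained from independent microstructure realizations."""
    m = statistics.mean(apparent_values)
    s = statistics.stdev(apparent_values)
    return 2.0 * s / (m * len(apparent_values) ** 0.5)

def is_rve(apparent_values, rel_tol=0.05):
    """Accept a cell size as an RVE when the sampling error on the mean
    apparent property falls below the requested relative tolerance."""
    return relative_sampling_error(apparent_values) < rel_tol
```

In practice one repeats this check for increasing cell sizes (and for each property of interest, since the minimum RVE size depends on the property, the contrast and the volume fraction) until the criterion is met.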
Ouaouicha, Hassan. "Simulation du fonctionnement logique de FELIN : algorithmes de calcul simultané de racines de polynômes." Phd thesis, Grenoble INPG, 1987. http://tel.archives-ouvertes.fr/tel-00324430.
Full textGuan, Shian. "Étude de la synthèse de dioxyde de vanadium : propriétés physico-chimiques, dopages élémentaire et applications sous forme de films." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0443.
Full textVO2 oxide is a promising material for energy-saving smart windows due to its reversible metal-to-insulator transition at 68 °C, accompanied by large optical changes in the near-infrared region, from a transparent state at low temperature to a more blocking state at high temperature. From a crystallographic point of view, the VO2 transition occurs between two crystal phases: the monoclinic (M) and rutile (R) phases. The limiting factors for commercial use are the high transition temperature, the unpopular yellow color (film) and the poor thermal and chemical stability. Most importantly, there is still a lack of a reproducible and low-cost method for the synthesis of VO2 particles. Two synthesis methods for VO2 powder are proposed. Firstly, VO2 (M) powders with tuned crystallinity were prepared through a two-step thermal treatment: a fast VEG decomposition in air leads to the formation of poorly crystallized VO2, followed by post-annealing at higher temperatures in vacuum for full crystallization. The second method, named the carbothermal reduction method, is based on the reducing role of carbon; here, soot is chosen for a direct reduction of home-made V2O5 nano-powder. The comparison of sealed and dynamic vacuum systems allows further understanding of the reduction mechanism. The obtained VO2 particle size was varied from 5.3 μm to 415 nm by adjusting the annealing temperature and time. In addition, Al3+, Ti4+ and Nb5+ metal ions were successfully doped into VO2; Nb5+ is the most effective at decreasing the transition temperature, down to Tc = 25 °C. The doping effects of Nb on VO2 are fully investigated in several respects, including the morphology, crystal distortion, thermal stability, heat capacity and resistivity. Finally, a deep investigation of the magnetic properties with respect to the impact of the doping concentration and the particle size is proposed. The shaping into thin films is performed using a dip-coating process from VO2 suspensions.
This step is investigated to address the trade-off between luminous transmittance (Tlum) and solar-energy modulation ability (ΔTsol) in VO2-based thin films. Zero-dimensional island-structured VO2 films are fabricated by taking advantage of the reductive strategy using carbon developed in the previous part. PVP is used as a surface stabilizer and reducing agent. To tune the surface coverage of the functional VO2 material on the substrate, three elaboration parameters were varied: (i) the suspension concentration; (ii) the V2O5@PVP film layer thickness; (iii) the number of VO2 layers, through repeating the spin-coating/annealing loop. Finally, the driving force of the phase transition of VO2-type oxides is still unclear: some authors explain this transition by a distortion-induced structural origin (Peierls model), while others state that the transition proceeds through complex electron-electron correlations (Mott model). We discuss our point of view on this metal-insulator transition of VO2 oxides. Within the limits of a simple geometrical approach, we describe here, with a new approach based on the well-established bond valence model, the main driving forces involved in the MIT transition of VO2 oxides. This approach has allowed us to show that the phase transition is consistent with a maximization of the lattice energy due to a maximization of the unit-cell volume of the VO2 oxides; in other words, the most stable allotropic form is the one presenting, at a given temperature, the largest unit-cell volume.
Rozanski, Adrian. "Sur la représentativité, la taille minimale du VER et les propriétés effectives de transport des matériaux composites aléatoires." Thesis, Lille 1, 2010. http://www.theses.fr/2010LIL10137/document.
Full textThe thesis focuses on random composites and some specific features such as the minimum size of a representative volume element (RVE) and the determination of effective transport properties. The main objective is to formulate a computationally efficient method allowing the quick determination of effective properties; effective transport properties are considered. It is shown that this class of properties can be estimated either by performing calculations over one large sample or by averaging over a sufficient number of smaller microstructure realizations. However, for a given type of microstructure, the size of such smaller realizations cannot be below some critical minimum size. It is shown that this critical RVE size is strongly affected by several parameters: microstructure type, volume fractions of the constituents, contrast in the mechanical properties of the composite phases, the number of performed realizations, as well as the desired accuracy. Furthermore, two separate types of representativity are introduced: geometrical representativity and representativity with respect to overall transport properties. Two distinct criteria for the minimum RVE size are therefore formulated, based on the properties of the two-point correlation function. Compared to other methods proposed in the literature, the criterion formulated in the thesis has an advantage: the proposed condition includes the microstructure morphology, so no numerical calculations, such as finite element computations, are necessary to determine the minimum RVE size. A validation of the proposed methodology is performed on several examples of 2D microstructures.
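The two-point correlation function underlying such criteria is cheap to compute. A generic illustration (not the thesis's implementation): for a periodic binary image of the microstructure, the autocorrelation obtained by FFT gives S2(r); S2 at zero lag equals the phase volume fraction, and for a spatially uncorrelated medium it decays towards the squared volume fraction, the decay distance giving a morphological length scale against which a minimum RVE size can be judged:

```python
import numpy as np

def two_point_correlation(img):
    """Periodic two-point correlation S2(r) of a binary image via FFT.

    S2 at zero lag equals the phase volume fraction; for an uncorrelated
    medium it tends to (volume fraction)**2 at large separations."""
    f = np.fft.fftn(img.astype(float))
    s2 = np.fft.ifftn(f * np.conj(f)).real / img.size
    return np.fft.fftshift(s2)  # zero lag moved to the array centre

rng = np.random.default_rng(0)
img = rng.random((128, 128)) < 0.3   # synthetic microstructure, ~30% phase
s2 = two_point_correlation(img)
vf = img.mean()
print(s2[64, 64], vf)                # S2(0) reproduces the volume fraction
```

For a real microstructure one would read off the lag at which S2 reaches its asymptote; the cell must be several times that correlation length.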
Wali, Abderrahmen. "Analyse expérimentale et modélisation multi-échelle du comportement mécanique de mélanges Polycarbonate/Polypropylène : effet de la morphologie." Thesis, Lille 1, 2017. http://www.theses.fr/2017LIL10077.
Full textThe objective of this work is to perform an experimental characterization and to model the mechanical behaviour of immiscible PC/PP blends. The predominantly spherical microstructure found in most PC/PP blends reveals low adhesion, due to the high interfacial tension between the two phases, as observed under a scanning electron microscope (SEM). This results in a negative deviation of the mechanical tensile properties as a function of the PP content. One possible solution is to add a third component that can improve the adhesion between the two phases; in this work PP-g-MA was chosen. Despite its low rigidity and brittleness, it partially improved the mechanical properties of the blends. A multi-scale approach was applied to model the homogenized behaviour of the PC/PP blends using two different types of models: the first is based on analytical homogenization, while the second is defined in the context of numerical homogenization. The statistical distribution law for the size of the dispersed phase was determined from the SEM images and applied to the generation of representative volume elements (RVE). The behaviour of the constituents was defined as elastoplastic. The initially assumed hypothesis of a perfect interface did not describe the mechanical behaviour of the blends in a satisfactory manner. To improve this, a model introducing cohesive surfaces to simulate interfacial damage was developed using a traction-separation law; this model is in good agreement with the experimental results. Finally, a parametric study was carried out to highlight the effect of the shape, number and orientation of the dispersed phase on the nonlinear response of the blends.
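The traction-separation idea behind such cohesive surfaces can be sketched with the standard bilinear law: linear loading up to a damage-onset opening, then linear softening to complete decohesion. This is a generic illustration with made-up parameter values, not the thesis's finite element implementation:

```python
def damage(delta, delta0=0.01, delta_f=0.1):
    """Scalar damage of the interface for a (monotonic) opening delta:
    0 below the onset opening delta0, 1 at the failure opening delta_f."""
    if delta <= delta0:
        return 0.0
    if delta >= delta_f:
        return 1.0
    return delta_f * (delta - delta0) / (delta * (delta_f - delta0))

def bilinear_traction(delta, k0=1.0e3, delta0=0.01, delta_f=0.1):
    """Bilinear cohesive law: traction = (1 - d) * k0 * delta, i.e. the
    initial stiffness k0 degraded by the damage variable d."""
    return (1.0 - damage(delta, delta0, delta_f)) * k0 * delta
```

The peak traction k0*delta0 and the area under the curve (the fracture energy) are the two physical inputs usually identified from experiments; the viscous regularization mentioned in other entries of this list smooths the softening branch to ease convergence.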
Fournier, Léonie. "Stratégies directes et indirectes d'usage de la langue : l'exemple d'élèves d'école élémentaire en filière franco-allemande." Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAC010.
Full textThis thesis reports on an empirical study of 3rd and 4th grade students in a German-French bilingual school. The study consisted of research subjects solving exercises while using the think-aloud method, after which film and transcription analysis allowed for a detailed examination of the strategies used during the exercises. Such strategies have been termed "language usage strategies" and are distinguished from learning and communication strategies. This distinction is important because when students use their L2 in this context, their goal is neither to learn the language nor to communicate; rather, it is to solve the exercise. The students used and combined many cognitive, compensatory, metacognitive, social and affective strategies during the think-aloud protocols. The individual interviews also revealed that the students manifested a large number of language usage strategies, suggesting that they are conscious of strategy usage when using their L2.
Ismaël, Amina. "Une évaluation des performances analytiques de la spectroscopie sur plasma induit par laser (LIBS)." Thesis, Bordeaux 1, 2011. http://www.theses.fr/2011BOR14357/document.
Full textLaser-Induced Breakdown Spectroscopy (LIBS) is an elemental analytical technique which combines laser ablation with atomic emission spectroscopy. LIBS spectroscopy has many advantages but is not recognized as a fully quantitative method. Indeed, sample heterogeneity, matrix effects, self-absorption of emission lines and the lack of repeatability deteriorate the analytical performance of LIBS. In order to improve this technique, the work presented in this thesis includes an example of analytical performance evaluation of a laboratory LIBS system using quality-assurance concepts. The method is applied here to the analysis of certified steel samples. A first study deals with the optimization of the LIBS system for quantitative analysis. As the effect of the different experimental parameters on the LIBS signal is complex, a methodical protocol is necessary; a parametric study is proposed to determine the experimental conditions suitable for quantitative analysis. Once optimized, the LIBS method is then characterized using the basics of method validation. The trueness and the precision of the method are evaluated under conditions of repeatability and intermediate precision. This study shows promising results for the LIBS technique. The application of a control chart nevertheless reveals an instability of the laboratory system and makes it possible to introduce corrective actions to improve its analytical performance.
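The control-chart check mentioned above can be sketched as a basic Shewhart chart: limits are set from a reference measurement series on a control sample, and later measurements falling outside them signal drift and trigger corrective action. The data and the 3-sigma rule below are a generic quality-control illustration, not values from this thesis:

```python
import statistics

def control_limits(reference_series):
    """Shewhart-type limits: centre line +/- 3 standard deviations,
    estimated from a reference series measured on a control sample."""
    m = statistics.mean(reference_series)
    s = statistics.stdev(reference_series)
    return m - 3.0 * s, m + 3.0 * s

def out_of_control(series, lcl, ucl):
    """Indices of measurements outside the control limits, i.e. points
    where the instrument response is considered unstable."""
    return [i for i, x in enumerate(series) if not lcl <= x <= ucl]
```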
Knoop, Ludvig de. "Development of quantitative in situ transmission electron microscopy for nanoindentation and cold-field emission." Toulouse 3, 2014. http://thesesups.ups-tlse.fr/3041/.
Full textThis thesis has focused on the development of quantitative in situ transmission electron microscopy (TEM) techniques. We have used a special nano-probe sample holder which allows local electrical biasing and micro-mechanical testing. The finite element method (FEM) was used to compare models with the experimental results. In addition to conventional imaging techniques, electron holography has been used to measure electric fields and strains. The first part addresses cold-field emission from a carbon cone nanotip (CCnT). This novel type of carbon structure may present an alternative to the W-based cold-field emission sources used in the most advanced electron guns today. When a sufficiently strong electric field is applied to the CCnT, electrons can tunnel through the energy barrier with the vacuum, which corresponds to the phenomenon of cold-field emission. Using electron holography and FEM, a quantified value of the local electric field at the onset of field emission was found (2.5 V/nm). Combining this with one of the Fowler-Nordheim equations, the exit work function of the CCnT was determined to be 4.8 ± 0.3 eV. The number of charges on the CCnT before and after the onset of field emission was also measured. The second part focuses on the plastic deformation of Al thin films to test dislocation-interface interactions. A dislocation close to an interface with a stiffer material should be repelled by it; here, on the contrary, we find that dislocations moving towards the oxidized interface are absorbed, even at room temperature. The stress was derived from a combination of load-cell measurements and FEM calculations. Finally, preliminary experiments combining in situ indentation and dark-field electron holography are reported.
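The Fowler-Nordheim step can be illustrated numerically. In its simplest form (no image-charge correction) the FN current density is J = (A/φ) E² exp(−B φ^{3/2} / E), so a measured local field and an assumed work function fix J, and conversely the work function can be fitted from the field dependence. The constants below are the standard elementary FN constants; the field and work-function values merely echo the magnitudes quoted above as an order-of-magnitude illustration, not a reproduction of the thesis's analysis:

```python
import math

A_FN = 1.541434e-6   # A eV V^-2, elementary Fowler-Nordheim constant
B_FN = 6.830890e9    # eV^(-3/2) V/m, elementary Fowler-Nordheim constant

def fn_current_density(field_v_per_m, phi_ev):
    """Simplified Fowler-Nordheim current density (A/m^2):
    J = (A/phi) * E^2 * exp(-B * phi^1.5 / E), no image-charge correction."""
    return (A_FN / phi_ev) * field_v_per_m ** 2 * math.exp(
        -B_FN * phi_ev ** 1.5 / field_v_per_m)

# order-of-magnitude check near the values quoted in the abstract
j = fn_current_density(2.5e9, 4.8)   # 2.5 V/nm, phi = 4.8 eV
print(j)
```

The strong exponential sensitivity to both E and φ is what makes the combination of a holography-measured field with an FN plot a workable route to the work function.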
Holste, Angela Sarah. "Développement des méthodes bio analytique pour l’analyse quantitative et qualitative des peptides et protéines marqués par le couplage de la chromatographie et la spectrométrie de masse." Thesis, Pau, 2014. http://www.theses.fr/2014PAUU3004/document.
Full textThis PhD thesis was a cotutelle between the Université de Pau et des Pays de l'Adour (UPPA) in Pau, France and the Christian-Albrechts University (CAU) in Kiel, Germany. In the course of this international collaboration, bio-analytical methods for the quantitative and qualitative analysis of labelled peptides and proteins were developed, based on the hyphenation of chromatography with mass spectrometry. Peptides and protein digests were lanthanide-labelled using DOTA-based compounds according to an optimized protocol. Separation at the peptide level was performed using IP-RP-nanoHPLC. Complementary data sets were acquired using MALDI-MS for identification and ICP-MS for quantification. In this context, an online pre-cleaning step was developed and implemented in the nanoHPLC separation routine, which allowed for the effective removal of excess reagents. This led to lower metal backgrounds during ICP-MS measurements, and thus better data interpretability, while keeping peptide recovery at a maximum. An alternative offline purification using solid-phase extraction (SPE) resulted in significant peptide losses and can be considered unsuitable for quantitative analysis. Additives to the nanoHPLC eluents, such as HFBA and EDTA, were tested and not deemed beneficial for the analysis of normal peptide samples; HFBA may be reconsidered for special applications on very hydrophilic peptide species. A set of labelled peptides was developed which, owing to the known quantities applied, could be employed for the quick and simple quantification of a low-complexity digest sample. In addition, this peptide set allowed for the reliable superposition of chromatograms, enabling sample comparability, especially for complementary ICP-MS and MALDI-MS data. Experiments applying fsLA-ICP-MS to MALDI-MS target plates were conducted and showed very promising results.
For this purpose, samples already identified using MALDI-MS were to be re-measured using fsLA-ICP-MS. First quantification attempts on the modified steel target plate were successful and within the expected range. Adjusted parameters for MALDI-MS allowed for proper peptide identification.
Dirrenberger, Justin. "Propriétés effectives de matériaux architecturés." Phd thesis, Ecole Nationale Supérieure des Mines de Paris, 2012. http://pastel.archives-ouvertes.fr/pastel-00797363.
Full textDe, francqueville Foucault. "Etude micromécanique du lien entre endommagement local et comportement macroscopique de propergols solides." Thesis, Institut polytechnique de Paris, 2019. http://www.theses.fr/2019IPPAX004.
Full textThe goal of the present project is the development of numerical tools for simulating damage in solid propellants, which are used for anaerobic propulsion; these tools should make it possible to identify which properties affect their behavior. To study the effect of the shape of the energetic particles, 3D microstructures are generated with a random dispersion of monosized spheres or polyhedra at a high volume fraction (55%). In the case of spheres, the elastic properties of the representative volume elements (RVE) are compared with an analytical model and with experimental characterizations of model composites, with a remarkable coherence between the three approaches. Then, the linear behavior of RVEs filled with polyhedra is compared to that obtained with spheres, highlighting only a limited effect of particle shape. As damage in these materials is mostly due to matrix/filler debonding, a bilinear cohesive zone model with viscous regularization and visualization of the interface damage state is implemented. The first-order influence of the cohesive zone parameters on both the mechanical response and the local damage is demonstrated. Although convergence issues prevent any quantitative comparison with experimental data, the characteristic trends are well reproduced at both the particle and the global scales. A parametric study also highlights the impact of each cohesive zone parameter on the global behavior. The study of the damaged behavior as a function of particle shape again reveals only a second-order impact. Finally, analyses of quasi-propellants, representative of common propellants, are proposed. Following the industrial characterization process, the interface properties are identified qualitatively based on the trends of the simulations. This analysis is completed by non-conventional characterization techniques to validate its coherence and to provide exhaustive information on the adhesive properties.
Sellier, Virginie. "Développement de méthodes de traçage sédimentaire pour quantifier l'impact des mines de nickel sur l’hyper-sédimentation des rivières et l'envasement des lagons de Nouvelle-Calédonie Investigating the use of fallout and geogenic radionuclides as potential tracing properties to quantify the sources of suspended sediment in a mining catchment in New Caledonia, South Pacific Combining visible-based-colour parameters and geochemical tracers to improve sediment source discrimination in a mining catchment (New Caledonia, South Pacific Islands) Reconstructing the impact of nickel mining activities on sediment supply to the rivers and the lagoon of South Pacific Islands: lessons learnt from the Thio early mining site (New Caledonia)." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASV013.
Full textNew Caledonia, an island located in the south-western Pacific Ocean and currently the world's sixth largest producer of nickel, is facing unprecedented sedimentary pollution of its river systems. Indeed, nickel mining, which started in the 1880s, accelerated soil erosion and sediment transport processes, and hyper-sedimentation of the Caledonian hydro-systems has been observed since the deployment of mining activities on the archipelago. Although this phenomenon exacerbates the flooding problems experienced in these tropical regions, the sediment contributions generated by nickel mining remain unquantified, even though such estimates are required to guide the implementation of control measures to reduce these sediment inputs. To this end, a sediment fingerprinting study was carried out in a "pilot" catchment: the Thio River catchment (397 km²), considered one of the first areas exploited for nickel mining in New Caledonia. Different tracers, such as radionuclides, elemental geochemistry or colour properties, were tested to trace and quantify the mining source contributions to the sediment inputs generated during two recent cyclonic flood events (a tropical depression in 2015, cyclone Cook in 2017). A sediment core was also collected in the floodplain of the Thio River catchment to reconstruct the temporal evolution of these mining source contributions. The results of this study show that mining sources dominated sediment inputs, with an average contribution ranging from 65-68% for the 2015 flood event to 83-88% for the 2017 flood event. The impact of the spatial variability of precipitation was highlighted to explain the variations in source contributions across the catchment. The temporal variations in the contributions of the mining sources deduced from the analysis of the sediment core were interpreted in the light of the mining history of the Thio River catchment (pre-mechanization, mechanization and post-mechanization of the mining activity).
The contributions of mining sources were again dominant, with an average contribution along the sedimentary profile of 74%. Once validated, this tracing method was tested in four other catchments of New Caledonia in order to evaluate the validity of the approach in other contexts.
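At the core of such a fingerprinting approach is a mixing model. For a single conservative tracer and two end-members (mining vs. non-mining sources), the mining contribution follows from simple linear un-mixing; real multi-tracer studies instead minimize a mixing residual over all tracers. The sketch below uses hypothetical concentrations, not data from this thesis:

```python
def two_source_contribution(c_sample, c_mining, c_other):
    """Fraction of sediment from the mining source for one conservative
    tracer, assuming a linear two-end-member mixture:
        c_sample = f * c_mining + (1 - f) * c_other
    The result is clipped to the physically meaningful range [0, 1]."""
    f = (c_sample - c_other) / (c_mining - c_other)
    return min(1.0, max(0.0, f))
```

Averaging such estimates over several tracers (or solving the corresponding least-squares problem) and over depth increments of a core yields contribution profiles like the 74% average reported above.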
Azi, Nabila. "Méthodes exactes et heuristiques pour le problème de tournées de véhicules avec fenêtres de temps et réutilisation de véhicules." Thèse, 2010. http://hdl.handle.net/1866/4876.
Full textThis thesis studies vehicle routing problems with time windows, where a gain is associated with each customer and where the objective is to maximize the total gain collected minus the routing costs. Furthermore, the same vehicle might be assigned to different routes during the planning horizon. This problem has received little attention in the literature in spite of its practical importance. For example, in the home delivery of perishable goods (like food), routes of short duration must be combined to form complete workdays. We believe that this type of problem will become increasingly important in the future with the advent of electronic services, like e-groceries, where customers can order goods through the Internet and have them delivered at home. In the first chapter of this thesis, we present a review of vehicle routing problems with gains, as well as vehicle routing problems with multiple use of vehicles. We discuss the general classes of problem-solving approaches for these problems, namely exact methods, heuristics and metaheuristics. We also introduce dynamic vehicle routing problems, where new information is revealed as the routes are executed. In the second chapter, we describe an exact algorithm for a vehicle routing problem with time windows and multiple use of vehicles, where the first objective is to maximize the number of served customers. To this end, the problem is modeled as a vehicle routing problem with gains. The exact algorithm is based on column generation, coupled with an elementary shortest path algorithm with resource constraints. To solve realistic instances in reasonable computation times, a heuristic approach is required. The third chapter proposes an adaptive large neighborhood search in which the various hierarchical levels of the problem are exploited (i.e., complete vehicle workdays, routes within workdays and customers within routes). The fourth chapter deals with the dynamic case.
In this chapter, a strategy for accepting or rejecting new customer requests is proposed. This strategy is based on the generation of multiple scenarios for different realizations of the requests in the future. An opportunity cost for serving a new request is then computed, based on an evaluation of the scenarios with and without the new request. Finally, the last chapter summarizes the contributions of this thesis and proposes future research avenues.
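The scenario-based accept/reject rule can be sketched generically: sample scenarios of plausible future requests, evaluate the best attainable profit with and without the candidate request, and average the difference. The `evaluate` function below is a deliberately naive stand-in (greedy gain collection under a capacity limit), not the routing machinery of the thesis; the request format and threshold are illustrative assumptions:

```python
def evaluate(requests, capacity=10):
    """Toy profit oracle: serve the highest-gain requests first, one unit
    of capacity each; stands in for solving the routing problem."""
    gains = sorted((r["gain"] for r in requests), reverse=True)
    return sum(gains[:capacity])

def opportunity_cost(new_request, scenarios):
    """Average profit difference between serving and rejecting the new
    request over all sampled future-request scenarios."""
    diffs = [evaluate(s + [new_request]) - evaluate(s) for s in scenarios]
    return sum(diffs) / len(diffs)

def accept(new_request, scenarios, threshold=0.0):
    """Accept the request only if its expected marginal profit is positive."""
    return opportunity_cost(new_request, scenarios) > threshold
```

A request is thus rejected when the sampled futures suggest the capacity it would consume is worth more when kept free for later, more profitable, requests.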