Dissertations / Theses on the topic 'Prédiction de la charge'
Consult the top 50 dissertations / theses for your research on the topic 'Prédiction de la charge.'
Delille, Florent. "Recherche d'une prédiction de fragmentation charge par charge pour les tirs à ciel ouvert." Phd thesis, Ecole Nationale Supérieure des Mines de Paris, 2012. http://pastel.archives-ouvertes.fr/pastel-00798090.
Delille, Florent. "Recherche d'une prédiction de fragmentation charge par charge pour les tirs à ciel ouvert." Electronic Thesis or Diss., Paris, ENMP, 2012. http://www.theses.fr/2012ENMP0046.
To contribute to the understanding of rock breakage and fragmentation processes in open-pit blasting, the research presented here aims at refining existing empirical fragmentation prediction techniques. A comprehensive full-scale experimental programme was conducted in an open-pit mine and analysed. The experiments yield data as well as insights with respect to the existing literature on blasting experiments. In particular, single-hole and multiple-hole blasting results are compared. In the test site's rock/explosives context, the results demonstrate that the industrial benefits of a hole-by-hole prediction are limited. A numerical approach was developed in parallel to the experimental work; it takes advantage of a damage behaviour law specifically designed for fragmentation by explosives (Rouabhi, 2004). 2D calculations with scale reduction are made; the use of such a behaviour law is shown to be essential, and it is outlined that the coupling of shock-wave and explosive-gas effects should be sought in future modelling work. Moreover, results observed in a laboratory-scale experimental study (Miklautsch, 2002) are explained in an original way. At the end of the thesis, several hole-by-hole prediction methods, which can easily be reproduced, are tested and fitted with results from the full-scale blasting experiments. In the end, it is shown that the most accurate method obtained, when used with mean blast pattern parameters instead of hole-by-hole information, actually provides an even more accurate prediction.
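As context for the empirical techniques this abstract refers to: blast fragmentation models conventionally describe the muck-pile size distribution with a Rosin-Rammler curve. A minimal sketch of that generic textbook form (not the thesis's fitted model; the median size and uniformity index below are made up):

```python
import math

def rosin_rammler_passing(x_mm, x50_mm, n):
    """Cumulative fraction of fragments finer than size x_mm, for a
    Rosin-Rammler distribution with median size x50_mm and uniformity
    index n (higher n = more uniform fragmentation)."""
    return 1.0 - math.exp(-math.log(2.0) * (x_mm / x50_mm) ** n)

# example: with a hypothetical median size of 200 mm and n = 1.5,
# exactly half of the material passes a 200 mm sieve by construction
half = rosin_rammler_passing(200.0, 200.0, 1.5)
```

Fitting the two parameters hole by hole versus with mean blast pattern parameters is the kind of comparison the abstract describes.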
Ragot, Sébastien. "Matrices densité : modélisation des densités de charge et d'impulsion : prédiction des propriétés solides." Châtenay-Malabry, Ecole centrale de Paris, 2001. http://www.theses.fr/2001ECAP0709.
Fournier, Edouard. "Méthodes d'apprentissage statistique pour la prédiction de charges et de contraintes aéronautiques." Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30123.
This thesis focuses on machine learning and information extraction for aeronautical loads and stress data. First, we carry out a study for the prediction of aeronautical load curves. We compare regression-tree-based models and quantify the influence of dimension-reduction techniques on regression performance in an extrapolation context. Second, we develop a deformation model acting simultaneously on the input and the output space of the curves. We study the asymptotic properties of the estimators of the deformation parameters. This deformation model is associated with the modelling and prediction of aeronautical loads. Finally, we give a simple and efficient method for predicting critical loads.
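The pipeline sketched in this abstract — compress the output load curves with a dimension-reduction step, then regress the reduced coordinates with trees — can be illustrated as follows. The data are synthetic, and the component count and model are arbitrary choices, not those of the thesis:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 5))          # synthetic flight-condition inputs
t = np.linspace(0.0, 1.0, 50)           # abscissa of the load "curves"
# each output curve depends on the first two inputs only (toy structure)
Y = X[:, [0]] * np.sin(2 * np.pi * t) + X[:, [1]] * np.cos(2 * np.pi * t)

# 1) reduce the 50-point curves to 3 principal components
pca = PCA(n_components=3).fit(Y)
Z = pca.transform(Y)
# 2) regress the reduced coordinates with a tree ensemble
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, Z)
# 3) reconstruct full curves from the predicted components (in-sample fit)
Y_hat = pca.inverse_transform(model.predict(X))
rmse = float(np.sqrt(np.mean((Y - Y_hat) ** 2)))
```

Regressing a handful of components instead of 50 correlated curve points is what makes the tree models tractable; extrapolation quality, the thesis's real concern, would be assessed on inputs outside the training range.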
Coutant, Charles. "Optimisation de la prise en charge des cancers du sein : développement de prédicteurs clinicopathologiques et génomiques." Paris 6, 2009. http://www.theses.fr/2009PA066582.
Dossou, Dossouvi. "Fatigue thermique d'un polycarbonate : modèle de prédiction de durée de vie d'éprouvettes entaillées." Vandoeuvre-les-Nancy, INPL, 1996. http://www.theses.fr/1996INPL135N.
Dray, Bensahkoun Delphine. "Prédiction des propriétés thermo-élastiques d'un composite injecté et chargé de fibres courtes." Phd thesis, Paris, ENSAM, 2006. http://pastel.archives-ouvertes.fr/pastel-00001776.
Fortier, Alexandre. "Utilisation de modèles analytiques et numériques pour la prédiction du comportement sous charge statique de dispositifs de retenue de ponts routiers de niveau PL-2 en béton armé de barre d'armature de polymères renforcés de fibres de verre." Mémoire, Université de Sherbrooke, 2014. http://savoirs.usherbrooke.ca/handle/11143/5308.
Pigenet, Nazim. "Mise en place des outils de suivi de prédiction de la demande électrique à l'échelle d'un territoire, application au département du Lots." Toulouse 3, 2009. http://thesesups.ups-tlse.fr/617/.
In order to delay the construction of high-voltage lines, the French district of the Lot (one of the 98 French "departments") has set up a local programme of demand-side management. The effectiveness of such a programme can be measured by its impact on peak power demand, and its assessment requires a model of the district's power load. Some such models are available already, but they rely on a detailed knowledge of the existing equipment and consumer habits. In the Lot, this type of information is scarce, which renders the existing models inappropriate. The model proposed here was constructed using a limited set of criteria defined via a thorough analysis of the Lot's electrical consumption and its similarities with national consumption. The applications of this model are then tested according to various scenarios concerning the evolution of these criteria.
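A large share of such territorial load models reduces to the temperature sensitivity of heating demand: below a threshold temperature, consumption rises roughly linearly as temperature drops. A toy illustration of that single criterion (all coefficients invented, not the thesis's calibration):

```python
def power_demand_mw(temp_c, base_mw=180.0, heating_threshold_c=15.0,
                    gradient_mw_per_deg=6.5):
    """Toy territorial load model: a flat base demand plus a linear
    heating term that switches on below the temperature threshold."""
    heating = max(0.0, heating_threshold_c - temp_c) * gradient_mw_per_deg
    return base_mw + heating
```

Scenario testing as described in the abstract then amounts to varying the base, threshold, and gradient over time and re-evaluating the peak.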
Chouman, Mehdi. "Plasticité cyclique et phénomène de rochet : amélioration de la prédiction de l’ovalisation des tubes en acier durant le cintrage cyclique par modélisation de la distorsion de la surface de charge." Paris 13, 2010. http://www.theses.fr/2010PA132015.
The reel-lay installation process of offshore pipelines requires their transport in the form of reels. The cyclic reeling and straightening of these pipes induces an accumulation of ovalisation which decreases their resistance to collapse. This accumulation being a ratcheting phenomenon, two modelling ingredients were explored to improve its prediction: the kinematic hardening law and the yield criterion. The capacity to predict the ovalisation was evaluated for several models of kinematic hardening coupled with the von Mises yield criterion; these models largely overpredict the ovalisation of the tubes. The influence of the yield criterion was evaluated by using the non-quadratic yield function of Bron and Besson, which has a set of "shape parameters". The implementation of this model in the Abaqus code highlighted a larger influence of the yield surface on the ovalisation. The use of this criterion improves the prediction of the ovalisation, but insufficiently, mainly because of the invariant shape of the yield surface. We have therefore proposed a phenomenological elastoplastic model describing the distortion of the yield surface, based on the Bron and Besson criterion: the distortion is modelled by varying the most influential parameter on the ovality according to evolution laws depending on the strain path. The identification of the parameters of this model for the X60 and St52 steels shows its great capacity to predict the evolution of the ovality of the tube, as well as the strain (ratcheting) measured during the cyclic reeling test.
Rizzetto, Simon. "Prédiction de la sensibilité biogéochimique et écologique des écosystèmes forestiers français aux dépôts atmosphériques azotés dans un contexte de changement global." Phd thesis, Toulouse, INPT, 2017. http://oatao.univ-toulouse.fr/19558/1/Rizzetto.pdf.
Basu, Kaustav. "Techniques avancées de classification pour l'identification et la prédiction non intrusive de l'état des charges dans le bâtiment." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENT089/document.
Smart metering is one of the fundamental units of a smart grid, as many further applications depend on the availability of fine-grained information on energy consumption and production. Demand-response techniques can be substantially improved by processing smart-meter data to extract relevant knowledge of the appliances within a residence. The thesis aims at finding generic solutions for the non-intrusive load monitoring and future-usage prediction of residential loads at a low sampling rate. Load monitoring refers to the disaggregation of individual loads from the total consumption at the smart meter; future-usage prediction of appliances is important from the energy-management point of view. In this work, state-of-the-art multi-label temporal classification techniques are implemented using a novel set of features. Moreover, multi-label classifiers are able to take inter-appliance correlation into account. The methods are validated using a dataset of residential loads in 100 houses monitored over a duration of one year.
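The disaggregation task can be phrased as multi-label classification: from features of the aggregate meter signal, predict the on/off state of each appliance. A self-contained sketch on simulated data (the appliance set, rated powers, and features are hypothetical; the thesis's feature set and classifiers are richer):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(1)
n = 500
# hypothetical on/off states of three appliances and their rated powers (W)
states = rng.integers(0, 2, size=(n, 3))           # fridge, heater, oven
rated = np.array([150.0, 2000.0, 1200.0])
total = states @ rated + rng.normal(0.0, 30.0, n)  # noisy aggregate reading
hour = rng.integers(0, 24, n)                      # coarse time-of-day feature
X = np.column_stack([total, hour])

# one binary classifier per appliance label, sharing the same input features
clf = MultiOutputClassifier(
    RandomForestClassifier(n_estimators=50, random_state=0))
clf.fit(X[:400], states[:400])
preds = clf.predict(X[400:])
accuracy = float((preds == states[400:]).mean())
```

Chained or structured multi-label variants additionally exploit the inter-appliance correlation mentioned in the abstract; this independent-output version is the simplest baseline.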
Briquet, Laurent. "Explorations psychométriques et psychoprojectives chez les auteurs d'infractions à caractère sexuel en psychologie légale : contribution sémiologique à l'identification de nouvelles composantes intrapsychiques et de nouvelles prises en charge psychothérapiques." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCC006.
The management of sexual offenders unsettles care professionals at the limits of their therapeutic skills and consequently raises the recurring question of these offenders' chances of reintegration into society. The lack of understanding of the intrapsychic functioning of these patients, the lack of training of professionals in normal and pathological sexuality, the reinforcing effect on partial drives produced by the internet, and the specificity of the court-ordered therapy framework are all elements which definitely put us outside the usual mental-health-care field. By attempting to take all these specificities into account and by exploring in a standardized way the psychosocial, psychometric and psycho-projective dimensions of these sex offenders, this research attempts to highlight, on the one hand, the intrapsychic variables that would improve scales of sexual dangerousness and, on the other, the psychotherapeutic characteristics likely to respond to the specific psychological dysfunctions of this population.
Jouan, Alexandre Alain. "Influence du vieillissement en atmosphère confinée sur la prédiction de la durée de vie des joints adhésifs." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLX002.
The substitution of soldering by electrically conductive adhesive bonding in electronic assemblies is of growing interest in the oil-and-gas industry. These electronic assemblies must nowadays work at high temperature and pressure levels during their service lives, so the long-term reliability of adhesive bonding becomes a critical issue. The study of the fatigue strength of adhesive joints, as well as the influence of ageing on this strength, is therefore crucial for the final deployment of adhesive bonding. This work aims at addressing fatigue-lifetime prediction for two different commercial adhesives used in electronic tools, and at studying the influence of ageing on the lifetime prediction. Two types of ageing were implemented: open-air ageing and confined inert ageing (under argon). An experimental characterization of the viscoelastic behaviour of the two adhesive joints was first carried out. In particular, the non-linearities due to each loading parameter (simple shear and hydrostatic pressure) were highlighted and modelled. The microstructural heterogeneity brought by the metallic conductive fillers was then studied numerically. The numerical model introduces several approximations that make the results qualitative, but they highlight the influence of the fillers on the global macroscopic behaviour of the adhesive joint. The question of the fatigue strength of adhesive joints and its evolution with ageing is finally addressed experimentally. The influence of the atmosphere and confinement of ageing on the lifetime prediction is clearly noticeable for the two joints studied in this work. Fatigue criteria in which some parameters evolve with this ageing are proposed to fit the experimentally obtained results.
Artola, Laurent. "Étude et modélisation des mécanismes de transport et de collection de charges dédiées à la prédiction de SEE dans les technologies fortement intégrées." Toulouse, ISAE, 2011. http://www.theses.fr/2011ESAE0025.
The natural radiative environment is known to induce functional errors in electronics. Particles such as neutrons, protons or heavy ions induce these errors, so-called SEE (Single Event Effects), in electronic devices. Evaluating this risk is critical for industry and space agencies. In this context, a new prediction methodology is proposed in order to estimate the operational error rate for integrated devices on board spacecraft or aircraft. Preliminary TCAD simulations allowed the physical mechanisms of charge transport and collection to be identified, and the critical technological parameters impacting the behaviour of the MOS transistor to be determined. This work emphasizes the necessity of taking into account the dynamic modelling of the ambipolar diffusion velocity. On this basis, the ADDICT model (Advanced Dynamic Diffusion Collection Transient) has been proposed. The main part of this work focuses on the validation of the transient current model by comparison with experimental results and TCAD simulations. In order to study SEU and MBU events, a new upset criterion used with ADDICT leads to a new prediction methodology. This methodology has been compared with experimental cross-sections from the literature and from test campaigns (for various SRAM memories from 0.25 µm to 65 nm). Finally, the ADDICT model has been evaluated in operational experiments: commercial SRAM memories (90 nm, Cypress) were flown on stratospheric balloons. This last work shows the relevance of the operational predictions provided by ADDICT.
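For reference, transient collection currents of this kind are commonly approximated in the SEE literature by a double-exponential pulse. The sketch below is that generic textbook form, not the ADDICT model itself, and the time constants are typical orders of magnitude rather than fitted values:

```python
import math

def see_current_pulse(t_s, q_c, tau_fall_s=2e-10, tau_rise_s=5e-11):
    """Generic double-exponential single-event transient current (A):
    the pulse integrates to the total collected charge q_c (coulombs)."""
    norm = q_c / (tau_fall_s - tau_rise_s)
    return norm * (math.exp(-t_s / tau_fall_s) - math.exp(-t_s / tau_rise_s))

# numerically check that the pulse carries the requested charge (100 fC)
dt = 1e-12
collected = sum(see_current_pulse(i * dt, 1e-13) * dt for i in range(5000))
```

An upset criterion like the one mentioned above then compares the charge (or the transient shape) against what the struck cell can tolerate.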
Dolbeault, Sylvie. "La détresse des patients atteints de cancer : prévalence, facteurs prédictifs, modalités de repérage et de prise en charge." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2009. http://tel.archives-ouvertes.fr/tel-00432252.
Dolbeault, Sylvie. "La détresse des patients atteints de cancer : prévalence, facteurs prédictifs, modalités de répérage et de prise en charge." Paris 6, 2009. http://www.theses.fr/2009PA066207.
Jourdan, Véronique. "Prise en charge des traumatismes crâniens bénins dans un service d'urgences : analyse prospective des facteurs prédictifs de lésions." Montpellier 1, 2000. http://www.theses.fr/2000MON11058.
Bitu, Fabien. "L'apport des technologies interactives dans l'étude des composantes sensorielles de la créativité." Electronic Thesis or Diss., Normandie, 2022. http://www.theses.fr/2022NORMC027.
Digital technologies are omnipresent in children's and adolescents' daily lives. Interactive technologies such as touchscreens seem particularly attractive and easy to use, bringing benefits to school learning. The question arises whether these benefits can also be observed in other types of activities that are not subject to learning. In this work, we focus specifically on creativity. Although creativity is conceived as a multifactorial phenomenon, sensorimotor components have rarely been integrated into models of the creative process. Yet Dietrich and Haider (2015) recently suggested that the process of generating a creative idea uses the same mechanism as the one controlling a real or imagined action through sensorimotor prediction. Thereby, sensory afferences, being central to sensorimotor action control, could be considered one of the constitutive factors of creativity. In this thesis, we question this relationship between creativity and sensorimotricity. Through four experimental studies, we varied the available sensory afferences in a creative drawing task by asking children and adolescents aged 6 to 15 to draw on a touchscreen with a finger, with a stylus, and on paper with a pen. The results show that enhancing sensory afferences with the finger on a touchscreen enhances originality at all ages. However, lowering sensory afferences with a stylus does not have the same effect on originality at every age. In children aged 6-7, using a stylus on a tablet does not modify originality performance. After 8 years of age, children produce more original drawings with a stylus on a touchscreen than with a pen on paper. The benefits observed from this age could be explained by the acquired capacity to compensate for a sensory loss, which allows sensory information to be maximized, thus leading to originality benefits.
These benefits with finger and stylus are also observed in adolescents who, because of disruptive behaviour disorder, have difficulties mobilizing cognitive capacities. Qualitative data acquired from adolescents with this disorder reveal a marked preference for the tactile rather than the paper support, which would be linked to a higher sensory mobilization in the production of graphic gestures on the interface. We discuss the implications of these results for the nature of the creative process and its development, as well as for using sensory afferences to help typical and atypical children and adolescents mobilize cognitive capacities more efficiently.
Garnier, Pierre. "Etude des propriétés viscoélastiques d'un nitrile chargé sous sollicitations cycliques : application à la prédiction de la durée de vie des pompes à rotor excentré." Thesis, Clermont-Ferrand 2, 2012. http://www.theses.fr/2012CLF22274.
The study of the durability of the rubber that is the main constitutive material of the Moineau pump stator was carried out using finite-element analyses, thermomechanical experiments and physico-chemical analyses. The numerical simulations were performed to identify the range and location of the maximal stress and strain within the pump stator. The experimental tests characterized the mechanical behaviour of the rubber under cyclic loading at several temperatures. The first part of the study focused on the Payne effect during fatigue loading, while the second dealt with the effect of a thermomechanical loading on the evolution of the rubber microstructure.
Daviet, Jean-Christophe. "Facteurs prédictifs du devenir vital et fonctionnel d’une cohorte d’hémiplégiques vasculaires : conséquences sur les modalités de prise en charge." Limoges, 2004. http://aurore.unilim.fr/theses/nxfile/default/79737b65-cfe6-4b35-bc58-b9a5316fa823/blobholder:0/2004LIMO310E.pdf.
For several years, stroke care has been a public-health priority in France, and the Ministry of Health has issued recommendations for the national management and care of stroke patients. In this context, we performed studies to determine early predictive factors of functional outcome and to evaluate their impact on the organisation of stroke care. We conducted an observational cohort study including 156 first hemispheric strokes. This work concluded, in accordance with previous studies, that the early predictive factors of poor functional outcome at day 360 were: severity of motor deficit, severity of disability evaluated by the Barthel index, initial loss of consciousness, neuropsychological disorders, urinary incontinence, prestroke myocardial infarction and prestroke disabilities. Age itself did not seem to be a predictive factor. We studied more specifically the impact of urinary disorders. To organise post-acute stroke care, we used criteria from a national guideline recommended by the French Society of Physical Medicine and Rehabilitation and by the Ministry of Health. We observed that the impact of early predictive factors of poor functional outcome on discharge must be considered in relation to family status, which had an important influence on discharge modalities and home return. Complementary work was conducted on individual treatments, especially complex regional pain syndrome type 1 (CRPS I). A first prospective case-series study, including 71 hemiplegic patients, showed that CRPS I severity and its progression were strongly correlated with hemiplegia severity, and that CRPS I progression was influenced more by the evolution of the hemiplegia than by specific treatments. Two other studies, including 60 patients, showed that measurement of transcutaneous oxygen tension does not seem to be sufficiently reproducible for application to a pathology such as CRPS I.
At present, there is no complementary examination to evaluate CRPS I after stroke. The field of rehabilitation has seen several stroke rehabilitation guidelines published in recent years, with very little work done on describing the implementation phase or the impact of using these guidelines on clinical practice.
Jeanson, Lisa. "Vers un modèle prédictif de la charge mentale et des performances sur postes répétitifs de fabrication dans l'industrie automobile." Thesis, Université de Lorraine, 2020. http://www.theses.fr/2020LORR0190.
In 2004, PSA developed the PSA Excellent System, an organizational system based on Lean Manufacturing principles to optimize vehicle production. One of its pillars is the following of a "work standard" (SoW) designed by the methods engineers. In theory, the SoW allows for the balancing of shifts, i.e. the organization of the tasks that operators can perform within a given period, the objective being to maintain the operators' performance and health. Despite this approach, errors and complaints still emerge on the assembly lines. To understand these phenomena, we constructed a new approach based on the comparison between prescribed and real tasks. We found that discrepancies between the SoW and workers' behaviour correspond to anticipated/collective and individual regulations. These regulations are strategies used by workers to cope with production constraints, such as mental workload, that are not taken into account during the design of workstations. Thus, regulations are a symptom of a dichotomy between the rules shaping workstation design and real production constraints. In particular, while certain constraints, such as time pressure, call for automatic behaviours (which are efficient and mostly unconscious), other constraints, such as spatio-temporal restrictions, work variations, task complexity, high levels of risk and information load, bring the need for reflexive behaviours (which are slow and require a high level of attention). Our final results showed that when we reduce the level of constraints and allow workers to perform their task automatically, we minimize the risk of errors and the need for regulations.
Mazouni, Chafika. "Evaluation et validation de marqueurs pronostiques et prédictifs dans la prise en charge des patientes présentant un cancer du sein." Thesis, Aix-Marseille 2, 2010. http://www.theses.fr/2010AIX20701.
The identification of prognostic and predictive markers is important for a better understanding of the evolutionary process and for the development of targeted therapies. Estrogen receptors (ER) represent both an important prognostic marker and a predictive marker for therapies using tamoxifen or aromatase inhibitors. However, a number of patients will progress despite hormone therapy. The objective of our work was to evaluate the method for measuring ER and the contribution of proteases to distinguishing prognostic and predictive tumour profiles. In our work, we demonstrated the influence of the mode of measurement of ER, in particular its quantitative expression, on prognostic interpretation and on a better determination of the benefit of treatment depending on the level of ER expression. We show the interest of evaluating the tissue proteases uPA, PAI-1 and cathepsin D to characterize the heterogeneity of tumours in addition to ER. Specifically, in ER-positive patients, high levels of cathepsin D and PAI-1 are an indicator of poor prognosis. We developed a nomogram combining ER and nodal status with three proteases (PAI-1, cathepsin D and thymidine kinase) to determine the probability of survival at 2 and 5 years. In addition, these proteases were evaluated in tumours infected with the Epstein-Barr virus (EBV), showing high levels of thymidine kinase in EBV-positive breast cancers, reflecting biologically aggressive tumours. Our work has helped to improve the identification of tumour profiles according to ER and proteases and to characterize virus-associated tumours.
Soret, Perrine. "Régression pénalisée de type Lasso pour l’analyse de données biologiques de grande dimension : application à la charge virale du VIH censurée par une limite de quantification et aux données compositionnelles du microbiote." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0254.
In clinical studies, thanks to technological progress, the amount of information collected on a single patient continues to grow, leading to situations where the number of explanatory variables is greater than the number of individuals. The Lasso method has proved appropriate for circumventing overfitting problems in high-dimensional settings. This thesis is devoted to the application and development of Lasso-penalized regression for clinical data presenting particular structures. First, in patients with the human immunodeficiency virus, mutations in the virus's genetic structure may be related to the development of drug resistance, and predicting the viral load from the (potentially numerous) mutations helps guide treatment choice. Below a threshold, the viral load is undetectable: the data are left-censored. We propose two new Lasso approaches based on the Buckley-James algorithm, which imputes censored values by a conditional expectation. By reversing the response, we obtain a right-censored problem, for which non-parametric estimates of the conditional expectation have been proposed in survival analysis. We also propose a parametric estimation based on a Gaussian hypothesis. Secondly, we are interested in the role of the microbiota in the deterioration of respiratory health. Microbiota data are presented as relative abundances (the proportion of each species per individual, called compositional data) and have a phylogenetic structure. We established a state of the art of statistical methods for analysing microbiota data. Because of their novelty, few recommendations exist on the applicability and effectiveness of the proposed methods; a simulation study allowed us to compare the selection capacity of penalization methods proposed specifically for this type of data. We then applied this research to the analysis of the association between bacteria/fungi and the decline of pulmonary function in patients with cystic fibrosis from the MucoFong project.
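The parametric (Gaussian) variant described — impute each left-censored response by its conditional expectation given the current fit, then re-estimate the Lasso — can be sketched as below. The data, penalty level, and iteration count are illustrative, and this is a simplified reading of the approach, not the thesis's implementation:

```python
import math

import numpy as np
from sklearn.linear_model import Lasso

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

rng = np.random.default_rng(2)
n, p = 300, 20
X = rng.normal(size=(n, p))              # e.g. coded viral mutations
beta = np.zeros(p)
beta[:3] = [1.5, -1.0, 0.8]              # only a few predictors matter
y_true = X @ beta + rng.normal(0.0, 0.5, n)
loq = -1.0                               # limit of quantification
censored = y_true < loq
y_obs = np.where(censored, loq, y_true)  # left-censored observations

y_work = y_obs.copy()
model = Lasso(alpha=0.05)
for _ in range(10):
    model.fit(X, y_work)
    mu = model.predict(X)
    sigma = float(np.std(y_work - mu)) + 1e-8
    a = (loq - mu[censored]) / sigma
    # E[Y | Y < loq] for N(mu, sigma^2): mu - sigma * pdf(a) / cdf(a)
    tail = np.array([norm_pdf(z) / max(norm_cdf(z), 1e-12) for z in a])
    y_work = y_obs.copy()
    y_work[censored] = mu[censored] - sigma * tail
support = sorted(np.flatnonzero(model.coef_ != 0.0).tolist())
```

Despite substantial censoring, the iterative imputation lets the Lasso recover the informative predictors rather than being biased by values clamped at the quantification limit.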
Bizollon, Thierry. "Contribution à l'étude des facteurs prédictifs de l'évolution et de la prise en charge thérapeutique de la récidive virale C après transplantation hépatique." Lyon 1, 2002. http://www.theses.fr/2002LYO1T126.
Crombé, Amandine. "Développement des approches radiomics à visées diagnostique et pronostique pour la prise en charge de patients atteints des sarcomes des tissus mous." Thesis, Bordeaux, 2020. http://www.theses.fr/2020BORD0059.
Soft-tissue sarcomas (STS) are malignant, ubiquitous mesenchymal tumours characterized by their heterogeneity at several levels: clinical presentation, radiological presentation, histology, molecular features and prognosis. Contrast-enhanced magnetic resonance imaging (MRI) is the reference imaging modality for these tumours. MRI enables local staging, evaluation of the response to treatment, surgical planning and the detection of local relapse. Furthermore, MRI gives non-invasive access to the whole tumour in situ and in vivo, complementing histopathological and molecular analyses, which require invasive biopsy samples at risk of sampling bias. However, no imaging biomarker dedicated to STS has been validated so far. Meanwhile, technical innovations have been developed, namely: (i) alternative imaging modalities or MRI sequences that can quantify intratumoral physiopathological phenomena; (ii) image-analysis tools that can quantify radiological phenotypes better than the human eye, through hundreds of textural and shape quantitative features (named radiomics features); and (iii) mathematical algorithms that can integrate all this information into predictive models (machine learning). Radiomics approaches build predictive models based on machine-learning algorithms and radiomics features, eventually combined with other clinical, pathological and molecular features. The aim of this thesis was to put these innovations into practice and to optimize them in order to improve the diagnostic and therapeutic management of patients with STS. In the first part, we combined radiological and radiomics features extracted from the baseline structural MRIs of patients with a locally advanced subtype of STS in order to build a radiomics signature that could help identify patients at higher risk of metastatic relapse who may benefit from neoadjuvant treatments.
In the second part, we elaborated a model based on the early changes in intratumoral heterogeneity (delta-radiomics) on structural MRIs of patients with locally advanced high-grade STS treated with neoadjuvant chemotherapy, in order to rapidly identify patients who do not respond to treatment and would benefit from early therapeutic adjustments. In the last part, we sought to better identify and control potential biases in radiomics approaches in order to optimize the predictive models based on radiomics features.
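In its simplest form, a delta-radiomics feature is the relative change of a quantitative image descriptor between baseline and early follow-up. A first-order illustration on synthetic arrays standing in for MRI crops (real pipelines extract hundreds of shape and texture features; the intensity values here are invented):

```python
import numpy as np

def histogram_entropy(image, bins=32, value_range=(0.0, 200.0)):
    """First-order radiomics feature: Shannon entropy (bits) of the
    intensity histogram, a simple heterogeneity measure."""
    counts, _ = np.histogram(image, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def delta_feature(before, after):
    """Relative change of a feature between two timepoints."""
    return (after - before) / before

rng = np.random.default_rng(0)
baseline = rng.normal(100.0, 25.0, size=(64, 64))   # heterogeneous tumour
followup = rng.normal(100.0, 5.0, size=(64, 64))    # more homogeneous later
delta = delta_feature(histogram_entropy(baseline), histogram_entropy(followup))
```

A strongly negative `delta` here flags a drop in intensity heterogeneity between the two scans; whether such a drop predicts response to chemotherapy is the kind of question the abstract's models address.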
Cluze, Monique. "Facteurs prédictifs de la réussite ou de l'échec de la prise en charge de malades alcooliques dans le service d'alcoologie du Centre hospitalier général de Valence (Drôme)." Lyon 1, 1992. http://www.theses.fr/1992LYO1M010.
Abdennadher, Mohamed Karim. "Étude et élaboration d'un système de surveillance et de maintenance prédictive pour les condensateurs et les batteries utilisés dans les Alimentations Sans Interruptions (ASI)." Thesis, Lyon 1, 2010. http://www.theses.fr/2010LYO10101/document.
To ensure power quality permanently, backup supplies exist: Uninterruptible Power Supplies (UPS). A UPS, like any other system, may fail, which can cause a loss of redundancy; such a loss causes maintenance downtime which may represent a high cost. We propose in this thesis to work on two of the most sensitive components in a UPS, namely electrolytic capacitors and lead-acid batteries. In a first phase, we present the existing surveillance systems for these two components, highlighting their main drawbacks; this allows us to propose the specifications that such a system has to implement. For the electrolytic capacitors, we detail the different stages of characterization, the standard accelerated-ageing experimental procedure and the associated results. We then present the simulation results of the monitoring and failure-prediction system retained, and discuss its experimental validation, describing the developed system: the electronic boards designed, the algorithms implemented and their respective constraints for a real-time implementation. Finally, for the lead-acid batteries, we present the simulation results of the monitoring system adopted to obtain the SOC (state of charge) and SOH (state of health). We describe the ageing experimental procedure of charging and discharging cycles needed to find a simple and accurate electric model, explain the ageing experimental results and, in the end, give suggestions for improving our system to obtain a more accurate SOH.
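Two rules of thumb behind this kind of monitoring can be sketched compactly: an electrolytic capacitor is commonly considered worn out once its equivalent series resistance (ESR) has roughly doubled, and a battery's SOC can be tracked by Coulomb counting. The thresholds below are illustrative conventions, not the thesis's calibrated values:

```python
def capacitor_health(esr_initial_ohm, esr_now_ohm, esr_fail_ratio=2.0):
    """Map measured ESR drift to a 1.0 (new) .. 0.0 (end of life)
    health indicator, using the common 'ESR doubled' failure rule."""
    ratio = esr_now_ohm / esr_initial_ohm
    health = (esr_fail_ratio - ratio) / (esr_fail_ratio - 1.0)
    return max(0.0, min(1.0, health))

def soc_coulomb_counting(soc, current_a, dt_s, capacity_ah):
    """Update the state of charge by integrating current over dt_s
    seconds (positive current = discharge); clamped to [0, 1]."""
    soc = soc - (current_a * dt_s) / (capacity_ah * 3600.0)
    return max(0.0, min(1.0, soc))
```

An online monitoring system of the kind described estimates the ESR and the current integral from in-circuit measurements, which is where the real difficulty lies.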
Broglio, Annie. "Prédiction par automates." Aix-Marseille 1, 1991. http://www.theses.fr/1991AIX11385.
Full text
Roy, Samir Chandra. "Analyse et modélisation du comportement de divers matériaux en érosion de cavitation." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAI081/document.
Full text
Numerical prediction of cavitation erosion requires knowledge of the flow aggressiveness, which has remained a challenging issue to date. This thesis proposes an inverse method to estimate the aggressiveness of the flow from the observation of the pits printed on the surface in the first moments of cavitation erosion. Three materials were tested under the same experimental conditions in the PREVERO cavitation tunnel available at LEGI Grenoble. The geometry of the pits left on the surface is precisely measured using a systematic method that overcomes roughness effects. Assuming that each pit was generated by a single bubble collapse whose pressure field has a Gaussian shape, finite element calculations are run to estimate the load that created each residual imprint. It is shown that the load distribution falls on a master curve independent of the tested material; the softest material (aluminum alloy) measures the lowest impact loads, while the most resistant material (duplex stainless steel) gives access to the largest impact pressures. It is concluded that the material itself can be used as a pressure sensor measuring the level of aggressiveness of the flow. The inverse method relies on a material characterization that takes strain-rate effects into account. It is shown that nanoindentation tests are more suitable than compression tests for determining the parameters of the behaviour law, particularly for the aluminum alloy, whose microstructure is very heterogeneous. High-speed compression tests with split Hopkinson pressure bars complete the constitutive law by providing its strain-rate sensitivity. Simulations considering dynamic loading show that impacts of strong amplitude applied over a short time leave no residual pit if the loading frequency is higher than the natural frequency of the material treated as a damped oscillator.
A dynamic mechanism of plastic strain accumulation that could eventually lead to fatigue failure is proposed. Finally, the mass-loss curve of cavitation erosion is simulated by randomly applying, on a 3D mesh, the impact force population estimated by the inverse method.
Heeke, Simon. "Développement et implémentation de nouveaux biomarqueurs prédictifs dans le cancer du poumon non à petites cellules - du tissu à la biopsie liquide." Electronic Thesis or Diss., Université Côte d'Azur (ComUE), 2019. http://www.theses.fr/2019AZUR6015.
Full text
Lung cancer is the leading cause of cancer-related deaths worldwide for both men and women. However, the treatment of lung cancer has changed radically in recent years with the introduction of more effective chemotherapies and, above all, the development of targeted treatments that allow a personalized therapeutic approach, as well as the introduction of immunotherapy, which has considerably prolonged the survival of some patients with non-small cell lung cancer (NSCLC). Although these new therapeutic approaches have made sometimes spectacular responses possible, a fairly large number of patients are resistant to these treatments. In this context, the development of new biomarkers to select the best treatment for the right patient at the right time is crucial to improving clinical outcomes for NSCLC patients. Nevertheless, not all biomarkers currently under study are able to improve this prediction; in particular, the implementation of some biomarkers in clinical routine is often difficult, even when preliminary results obtained in vitro or in initial clinical trials were promising. The objective of the thesis was to evaluate and implement new biomarkers that predict the response to immunotherapy and targeted therapies for the therapeutic selection of NSCLC patients. The first part of the thesis discusses the importance of biobanks and the control of biological resources as a cornerstone for the development of these new biomarkers. We have implemented an operating procedure that allows us to safely store biological collections of interest and use them for biomarker research studies, and we describe how a biobank dedicated to a single pathology can be established and used for research purposes. Additionally, the genomic evaluation of cell-free DNA (cfDNA) for the detection of specific mutations of the Epidermal Growth Factor Receptor (EGFR) is studied and evaluated.
We retrospectively analyzed 324 patients over a three-year period across three biological tests used in routine clinical practice and were able to demonstrate that these tests are very robust but must be closely controlled to avoid false positive or negative results. We then evaluated next-generation sequencing (NGS) of plasma DNA using an internal test developed in the laboratory and an external test, and demonstrated that both were reliable for the detection of genomic alterations in plasma in clinical routine. In the last part of the thesis, I describe how large targeted sequencing panels capable of assessing tumor mutational burden can be used to select patients for anti-tumor immunotherapy, and which pitfalls should be avoided in order to use this biomarker in clinical routine. In summary, this thesis demonstrates the importance of novel biomarkers for the stratification of patients undergoing therapy in NSCLC and contributed to the implementation of tissue- and liquid-biopsy-based biomarkers in routine clinical care.
Nguyen, Thi Thu Tam. "Learning techniques for the load forecasting of parcel pick-up points." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG034.
Full text
Pick-Up Points (PUP) represent an alternative delivery option for purchases from online retailers (Business-to-Customer, B2C) or online Customer-to-Customer (C2C) marketplaces. Parcels are delivered at a reduced cost to a PUP and wait until they are picked up by customers, or are returned to the original warehouse if their sojourn time expires. When the chosen PUP is overloaded, a parcel may be refused and delivered to the next available PUP on the carrier tour. PUP load forecasting is an efficient way for the PUP management company (PMC) to better balance the load of each PUP and reduce the number of rerouted parcels. This thesis aims to describe the parcel flows in a PUP and to propose models for forecasting the evolution of the load. For the PUP load associated with the B2C business, the parcel life cycle is taken into account in the forecasting process via models of the flow of parcel orders, the delivery delays, and the pick-up process. Model-driven and data-driven approaches are compared in terms of load-prediction accuracy. For the PUP load associated with the C2C business, the daily number of parcels dropped off with a given PUP as target is described by a Markov-switching autoregressive model to account for the non-stationarity of second-hand shopping activity. The life cycle of each parcel is modeled by a Markov jump process. Model parameters are evaluated from previous parcel drop-off, delivery, and pick-up records. The probability mass function of the future load of a PUP is then evaluated using all information available on parcels with this PUP as target. In both cases, the proposed model-driven approaches give, in most cases, better forecasting performance than the data-driven models (LSTM, random forest, Holt-Winters, and SARIMA), up to four days ahead in the B2C case and up to six days ahead in the C2C case.
The first approach, applied to the B2C parcel load, yields an MAE of 3 parcels for the one-day-ahead prediction and 8 parcels for the four-day-ahead prediction. The second approach, applied to the C2C parcel load, yields an MAE of 5 parcels for the one-day-ahead prediction and 8 parcels for the seven-day-ahead prediction. These prediction horizons are consistent with the delivery delays associated with these parcels (1-3 days for a B2C parcel and 4-5 days for a C2C parcel). Future research directions aim at optimizing the prediction accuracy, especially in predicting future orders, and at studying a load-balancing approach to better share the load between PUPs.
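The MAE figures quoted above are the average absolute gap, in parcels, between forecast and observed daily loads. A minimal sketch of the metric, with made-up numbers (not data from the thesis):

```python
def mean_absolute_error(actual, predicted):
    """Average absolute deviation between observed and forecast values."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical daily PUP loads (parcels) vs. a one-day-ahead forecast
observed = [42, 38, 51, 47, 60, 33, 29]
forecast = [40, 41, 48, 47, 55, 36, 31]
print(mean_absolute_error(observed, forecast))  # ≈ 2.57 parcels
```

An MAE of 3 parcels at one day ahead, as reported for the B2C case, thus means the forecast misses the true daily load by about three parcels on average.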
Doorgah, Naraindranath. "Contribution à la modélisation prédictive CEM d'une chaine d'entrainement." Phd thesis, Ecole Centrale de Lyon, 2012. http://tel.archives-ouvertes.fr/tel-00838823.
Full text
Abdennadher, Mohamed. "Étude et élaboration d'un système de surveillance et de maintenance prédictive pour les condensateurs et les batteries utilisés dans les Alimentations Sans Interruptions (ASI)." Phd thesis, Université Claude Bernard - Lyon I, 2010. http://tel.archives-ouvertes.fr/tel-00806065.
Full text
Gerth, Lutz Martin. "Électrodéposition de cuivre à partir de solutions sulfuriques : mesures locales de la densité de courant dans des cellules à hydrodynamique complexe." Vandoeuvre-les-Nancy, INPL, 1995. http://www.theses.fr/1995INPL153N.
Full text
Bello, Léa. "Prédiction des structures convectives terrestres." Thesis, Lyon, École normale supérieure, 2015. http://www.theses.fr/2015ENSL0977/document.
Full text
Since its formation, the Earth has been slowly cooling. The heat produced by the core and by radioactive decay in the mantle is evacuated toward the surface by convection. The evolving convective structures thereby created control a diversity of surface phenomena, such as vertical motion of continents or sea-level variation. The study presented here attempts to determine which convective structures can be predicted, to what extent, and over what timescale. Because of the chaotic nature of convection in the Earth's mantle, uncertainties in initial conditions grow exponentially with time and limit forecasting and hindcasting abilities. Following the twin-experiments method initially developed by Lorenz [1965] for weather forecasting, we estimate for the first time the Lyapunov time and the limit of predictability of Earth's mantle convection. Our numerical solutions for 3D spherical convection in the fully chaotic regime, with diverse rheologies, suggest that a 5% error on initial conditions limits the prediction of Earth's mantle convection to 95 million years. The quality of the forecast of convective structures also depends on our ability to describe the mantle properties in a realistic way. In 3D numerical convection experiments, pseudo-plastic rheology can generate self-consistent plate tectonics compatible to first order with the Earth's surface behavior [Tackley, 2008]. We assessed the roles of the temperature dependence of viscosity and of pseudo-plasticity in reconstructing slab evolution, studying a variety of mantle thermal states obtained by imposing 200 million years of surface velocities extracted from tectonic reconstructions [Seton et al., 2012; Shephard et al., 2013]. The morphology and position of the reconstructed slabs vary largely when the viscosity contrast increases and when pseudo-plasticity is introduced.
The errors introduced by the choices made in the rheological description of the mantle are even larger than the errors created by the uncertainties in initial conditions and surface velocities. This work shows the significant role of initial conditions and rheology in the quality of predicted convective structures, and identifies pseudo-plasticity and large viscosity contrasts as key ingredients for producing the coherent and flat slabs that are notable features of Earth's mantle convection.
Vekemans, Denis. "Algorithmes pour méthodes de prédiction." Lille 1, 1995. http://www.theses.fr/1995LIL10176.
Full text
Carré, Clément. "Prédiction génétique des caractères complexes." Toulouse 3, 2014. http://thesesups.ups-tlse.fr/2912/.
Full text
The objective of this thesis is the development of statistical approaches for the genetic prediction of complex traits. Four approaches are developed, adapted to different genetic contexts. Under additivity, a linear model combining transmission and SNP marker data is useful when the SNPs do not capture all the genetic effects influencing the trait. Under nonlinear genetic effects, a kernel regression method (Nadaraya-Watson) yields more precise predictions than the standard method (BLUP). After comparing parametric and nonparametric methods, we propose to combine them: a statistical aggregation method is an efficient and robust way to mix several predictors. Finally, an original algorithm based on random projections of linear models allows rapid recovery of parsimonious model parameters.
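The Nadaraya-Watson estimator mentioned above predicts a response as a kernel-weighted average of the observed responses. A minimal sketch with a Gaussian kernel (illustrative only, not the thesis implementation):

```python
import math

def nadaraya_watson(x_train, y_train, x, bandwidth=1.0):
    """Kernel-weighted average of y_train, weights from a Gaussian kernel in x."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * bandwidth ** 2)) for xi in x_train]
    total = sum(weights)
    return sum(w * yi for w, yi in zip(weights, y_train)) / total

# Toy nonlinear data (y = x^2): the estimate at x = 0 stays close to y = 0,
# which a purely linear predictor could not achieve on this symmetric sample.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [4.0, 1.0, 0.0, 1.0, 4.0]
print(round(nadaraya_watson(xs, ys, 0.0, bandwidth=0.5), 3))
```

The bandwidth controls the bias-variance trade-off: small values track local nonlinearities, large values flatten the estimate toward the global mean.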
Hmamouche, Youssef. "Prédiction des séries temporelles larges." Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0480.
Full text
Nowadays, storage and data processing systems are expected to store and process large time series. As the number of observed variables increases very rapidly, their prediction becomes more and more complicated, and using all the variables poses problems for classical prediction models. Univariate prediction models were among the first prediction models; to improve on them, the use of multiple variables became common, and multivariate models are increasingly used because they take more information into account. With the growth of interrelated data, however, the application of multivariate models is also questionable, because using all the available information does not necessarily lead to the best predictions. The challenge in this situation is therefore to find the most relevant factors, among all the available data, relative to a target variable. In this thesis, we study this problem by presenting a detailed analysis of the approaches proposed in the literature. We address the problem of prediction and dimensionality reduction for massive data, and discuss these approaches in the context of Big Data. The proposed approaches show promising and very competitive results compared to well-known algorithms, and lead to an improvement in the accuracy of the predictions on the data used. We then present our contributions and propose a complete methodology for the prediction of wide time series. We also extend this methodology to big data via distributed computing and parallelism, with an implementation of the proposed prediction process in the Hadoop/Spark environment.
Hmamouche, Youssef. "Prédiction des séries temporelles larges." Electronic Thesis or Diss., Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0480.
Full text
Wandeto, John Mwangi. "Self-organizing map quantization error approach for detecting temporal variations in image sets." Thesis, Strasbourg, 2018. http://www.theses.fr/2018STRAD025/document.
Full text
A new approach for image processing, dubbed SOM-QE, that exploits the quantization error (QE) of self-organizing maps (SOM) is proposed in this thesis. SOMs produce low-dimensional discrete representations of high-dimensional input data. QE is determined from the results of the unsupervised learning process of the SOM and the input data. The SOM-QE of a time series of images can then be used as an indicator of changes in the time series. To set up the SOM, a map size, the neighbourhood distance, the learning rate, and the number of iterations in the learning process are determined; the combination of these parameters that gives the lowest value of QE is taken to be the optimal parameter set and is used to transform the dataset. This has been the conventional use of QE. The novelty of the SOM-QE technique is fourfold. First, in the usage: SOM-QE employs one SOM to determine the QE of different images, typically in a time-series dataset, unlike the traditional usage where different SOMs are applied to one dataset. Secondly, the SOM-QE value is introduced as a measure of uniformity within the image. Thirdly, the SOM-QE value becomes a special, unique label for the image within the dataset, and fourthly, this label is used to track changes that occur in subsequent images of the same scene. Thus, SOM-QE provides a measure of variations within the image at an instant in time, and when compared with the values from subsequent images of the same scene, it reveals a transient visualization of changes in the scene of study. In this research, the approach was applied to artificial, medical and geographic imagery to demonstrate its performance. Changes that occur in geographic scenes of interest, such as new buildings being put up in a city or lesions receding in medical images, are of interest to scientists and engineers.
The SOM-QE technique provides a new way of automatically detecting growth in urban spaces or the progression of disease, giving timely information for appropriate planning or treatment. In this work, it is demonstrated that SOM-QE can capture very small changes in images. Results also confirm it to be fast and computationally inexpensive in discriminating between changed and unchanged contents in large image datasets. Pearson's correlation confirmed statistically significant correlations between SOM-QE values and the actual ground-truth data. On evaluation, this technique performed better than other existing approaches. This work is important as it introduces a new way of looking at fast, automatic change detection, even when dealing with small local changes within images. It also introduces a new method of determining QE, and the data it generates can be used to predict changes in a time-series dataset.
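The quantization error at the heart of SOM-QE is simply the mean distance between each input vector and its best-matching unit on the trained map. A minimal sketch over a toy codebook (hypothetical vectors, not the thesis code):

```python
import math

def quantization_error(inputs, codebook):
    """Mean Euclidean distance from each input vector to its best-matching unit."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return sum(min(dist(x, w) for w in codebook) for x in inputs) / len(inputs)

# Toy 2-unit "map": data matching the codebook gives QE 0; a changed
# image (shifted feature vectors) raises the QE, signalling the change.
codebook = [(0.0, 0.0), (1.0, 1.0)]
unchanged = [(0.0, 0.0), (1.0, 1.0)]
changed = [(0.5, 0.0), (1.0, 1.5)]
print(quantization_error(unchanged, codebook))        # 0.0
print(quantization_error(changed, codebook) > 0.0)    # True
```

This mirrors the SOM-QE idea: train one SOM once, then score every image of the series against the same codebook and watch the QE value move.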
Hoen, Bruno. "Pronostic et prédiction en pathologie infectieuse." Nancy 1, 1995. http://www.theses.fr/1995NAN10445.
Full text
Maucort-Boulch, Delphine. "Modélisation et prédiction : application en cancérologie." Lyon 1, 2007. http://www.theses.fr/2007LYO10067.
Full text
When used for prediction, statistical models have to fit certain criteria, and the requirements differ between marginal predictions and individual predictions. Based on two examples in oncology, the particularities and criteria of prognostic models are explored. The first example deals with the marginal effect of breast cancer chemoprevention on overall survival for women at high risk. The second example deals with individual prediction using a marginal model for Hodgkin's lymphoma patients. Measures of predictive quality are studied, in particular in the survival context, especially for the Cox proportional hazards model.
Onzon, Emmanuel. "Prédiction statistique efficace et asymptotiquement efficace." Paris 6, 2012. http://www.theses.fr/2012PA066531.
Full text
The subject of this thesis is, on the one hand, to complete previous work on the generalization of the Cramér-Rao inequality to predictors and, on the other hand, to generalize the concept of asymptotic efficiency to statistical prediction. Chapter 1 recalls results from point estimation theory that are essential in the following chapters. The Cramér-Rao inequality for predictors is the subject of Chapter 2. Chapter 3 deals with the asymptotic efficiency of estimators of the regression function for a particular risk, as well as asymptotic efficiency for prediction under the condition that the prediction risk is asymptotically equivalent to the estimation risk of the regression function. In Chapter 4 we give conditions under which the asymptotic equivalence of risks used in the previous chapter is satisfied. In Chapter 5 we prove a convergence-in-distribution result and use it to give an alternative definition of asymptotic efficiency for predictors. Chapter 6 presents simulation results for a forecasting problem on the Ornstein-Uhlenbeck process.
Hublart, Paul. "Exploring the use of conceptual catchment models in assessing irrigation water availability for grape growing in the semi-arid Andes." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS181.
Full text
This thesis investigates the use of lumped catchment models to assess water availability for irrigation in the upland areas of northern-central Chile (30°S). Here, most of the annual water supply falls as snow in the high Cordillera during a few winter storms. Seasonal snowpacks serve as natural reservoirs, accumulating water during the winter and sustaining streams and aquifers during the summer, when irrigation demand in the cultivated valleys is at its peak. At the inter-annual timescale, the influence of the ENSO and PDO phenomena results in the occurrence of extremely wet and dry years. Irrigated areas and grape growing have also increased dramatically since the early 1980s. To evaluate the usefulness of explicitly accounting for changes in irrigation water use in lumped catchment models, an integrated modeling framework was developed and different ways of quantifying and reducing model uncertainty were explored. Natural streamflow was simulated using an empirical hydrological model and a snowmelt routine. In parallel, seasonal and inter-annual variations in irrigation requirements were estimated using several process-based phenological models and a simple soil-water balance approach. Overall, this resulted in a low-dimensional, holistic approach based on the same level of mathematical abstraction and process representation as most commonly used catchment models. To improve model reliability and usefulness under varying or changing climate conditions, particular attention was paid to the effects of extreme temperatures on crop phenology and to the contribution of sublimation losses to the water balance at high elevations. This conceptual framework was tested in a typical semi-arid Andean catchment (1512 km2, 820–5500 m a.s.l.) over a 20-year simulation period encompassing a wide range of climate and water-use conditions (changes in grape varieties, irrigated areas, and irrigation techniques).
Model evaluation was performed from a Bayesian perspective assuming auto-correlated, heteroscedastic and non-Gaussian residuals. Different criteria and data sources were used to verify model assumptions in terms of efficiency, internal consistency, statistical reliability and sharpness of the predictive uncertainty bands. Alternatively, a multiple-hypothesis, multi-criteria modeling framework was also developed to quantify the importance of model non-uniqueness and structural inadequacy from a non-probabilistic perspective. On the whole, incorporating the effects of irrigation water use led to new interactions between the hydrological parameters of the modeling framework and improved the reliability of streamflow predictions during low-flow periods. Finally, a sensitivity analysis to changes in climate conditions was conducted to evaluate the potential impacts of increasing temperatures and atmospheric CO2 on the hydrological behavior of the catchment and on the capacity to meet future water demands.
Tan, Ewe Hong. "Effets des interactions polymère - charge et charge - charge sur les propriétés dynamiques." Mulhouse, 1992. http://www.theses.fr/1992MULH0230.
Full text
Yadegari, Iraj. "Prédiction, inférence sélective et quelques problèmes connexes." Thèse, Université de Sherbrooke, 2017. http://hdl.handle.net/11143/10167.
Full text
Abstract: We study the problem of point estimation and predictive density estimation of the mean of a selected population, obtaining novel developments which include bias analysis, decomposition of risk, and problems with restricted parameters (Chapter 2). We propose efficient predictive density estimators in terms of Kullback-Leibler and Hellinger losses (Chapter 3), improving on plug-in procedures via a dual loss and via a variance expansion scheme. Finally (Chapter 4), we present findings on improving on the maximum likelihood estimator (MLE) of a bounded normal mean under a class of loss functions, including reflected normal loss, with implications for predictive density estimation. Namely, we give conditions on the loss and the width of the parameter space for which the Bayes estimator with respect to the boundary uniform prior dominates the MLE.
Kawala, François. "Prédiction de l'activité dans les réseaux sociaux." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAM021/document.
Full text
This dissertation is devoted to a social-media mining problem named the activity-prediction problem, in which one aims to predict the number of user-generated contents that will be created about a topic in the near future. The user-generated contents that belong to a topic are not necessarily related to each other. In order to study the activity-prediction problem without referring directly to a particular social medium, a generic framework is proposed. This framework allows various social media to be described in a unified way, and with it the activity-prediction problem is defined independently of any actual social medium. Three examples are provided to illustrate how the generic framework describes social media. Three definitions of the activity-prediction problem are proposed. First, the magnitude-prediction problem defines activity prediction as a regression problem, where one aims to predict the exact activity of a topic. Second, the buzz-classification problem defines activity prediction as a binary classification problem, where one aims to predict whether a topic will have an activity burst of a predefined amplitude. Third, the rank-prediction problem defines activity prediction as a learning-to-rank problem, where one aims to rank topics according to their future activity levels. These three definitions are tackled with state-of-the-art machine learning approaches applied to generic features, defined with the help of the generic framework and therefore easily adaptable to various social media. There are two types of features: features that describe a single topic, and features that describe the interplay between two topics. Our ability to predict the activity is tested against an industrial-size multilingual dataset. The data were collected during 51 weeks.
Two sources of data were used: Twitter and a bulletin-board system. The collected data contain three languages: English, French and German. More than five hundred million user-generated contents were captured, most of them related to computer hardware, video games, and mobile telephony. The data collection required the implementation of a daily routine, and the data were prepared so that commercial contents and technical failures are not sources of noise. A cross-validation method that takes the time of the observations into account is used. In addition, an unsupervised method to extract buzz candidates is proposed; indeed, the training sets are highly imbalanced for the buzz-classification problem, and it is necessary to preselect buzz candidates. The activity-prediction problems are studied within two different experimental settings. The first includes data from Twitter and the bulletin-board system, on a long timescale, and in three different languages. The second is dedicated specifically to Twitter and aims to make the experiments as reproducible as possible: it includes user-generated contents collected with respect to a list of unambiguous English terms, and the observations are restricted to ten consecutive weeks, which minimizes the risk of unannounced changes in the public API of Twitter.
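The time-aware cross-validation mentioned above can be sketched as a rolling-origin split, where each fold trains only on observations that precede its test window, so that no future information leaks into training (an illustrative sketch, not the dissertation's exact procedure):

```python
def rolling_origin_splits(n_obs, n_folds, test_size):
    """Yield (train_indices, test_indices) pairs whose test windows come
    strictly after all training observations, fold after fold."""
    splits = []
    for k in range(n_folds):
        test_end = n_obs - (n_folds - 1 - k) * test_size
        test_start = test_end - test_size
        splits.append((list(range(0, test_start)),
                       list(range(test_start, test_end))))
    return splits

# 10 weekly observations, 3 folds of 2 test points each
for train, test in rolling_origin_splits(10, 3, 2):
    print(len(train), test)
# 4 [4, 5] / 6 [6, 7] / 8 [8, 9]
```

Unlike shuffled k-fold, the training set grows with each fold while the test window slides forward, matching how a deployed predictor would actually be retrained over time.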
Sohm, Juliette. "Prédiction des déformations permanentes des matériaux bitumeux." Ecole centrale de Nantes, 2011. https://tel.archives-ouvertes.fr/tel-00907046.
Full text
In order to study permanent strains in bituminous mixtures, the SMIT division of LCPC has developed a temperature-controlled triaxial testing device operating at constant confining pressure. Substantial work on test development and measurement verification was carried out. The first experimental programme, consisting of creep tests at constant stress, allowed the respective influences of the confining pressure, the deviatoric stress and the temperature on the behaviour of our bituminous mixture to be studied. These tests also allowed a partial validation of the time-temperature superposition principle in the non-linear field, under triaxial stress. An elasto-visco-plastic model was developed to simulate creep tests under confining pressure; this model aims at reproducing our experimental observations. These results may be applicable to aeronautic and storage bituminous platforms. A second experimental programme, consisting of cyclic sinusoidal compression tests, was then performed. During these tests, two scales of strain were measured: strains of about 10⁻⁴ m/m, related to the viscoelastic behaviour of asphalt concretes, and strains of about 10⁻² m/m, related to the appearance of permanent strains in asphalt concretes. The respective influences of the confining pressure, the temperature and the frequency were also studied in this programme. Noteworthy differences between the behaviour of our asphalt concrete in creep tests and in cyclic tests were observed.
Gringoz, Florian. "Prédiction de la conformité géométrique d'assemblages aéronautiques." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASN012.
Full text
The assembly geometry is defined through the geometry of its components described in their nominal configuration, in other words without geometrical deviations and with exact relative positions. In reality, the geometries of the components have geometrical deviations, and their positions are not exact. The doctoral work consists of predicting the geometrical conformity of an aeronautical assembly from the geometries of its components. From the knowledge of the component geometries, a second objective is to simulate the assembly of these components (propagation of geometrical deviations and finite element coupling) in order to evaluate the geometrical conformity of the assembly and to determine the operations required to reach this conformity. The entire process will be applied to aeronautical nacelles.