Dissertations / Theses on the topic 'Plans d'expérience optimaux (Statistique)'
Consult the top 23 dissertations / theses for your research on the topic 'Plans d'expérience optimaux (Statistique).'
Koné, Mamadou. "Optimalité des plans d'expériences équilibrés pour les périodes." Paris 6, 2011. http://www.theses.fr/2011PA066509.
Joutard, Cyrille. "Grandes déviations en statistique asymptotique." Toulouse 3, 2004. http://www.theses.fr/2004TOU30071.
Mahfouz, Mariam. "Plans d'expériences optimaux et compétition entre traitements." Pau, 1988. http://www.theses.fr/1988PAUU3030.
Belouni, Mohamad. "Plans d'expérience optimaux en régression appliquée à la pharmacocinétique." Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENM056/document.
The problem of interest is to estimate the concentration curve and the area under the curve (AUC) by estimating the parameters of a linear regression model with an autocorrelated error process. We construct a simple linear unbiased estimator of the concentration curve and the AUC. We show that this estimator, constructed from a sampling design generated by an appropriate density, is asymptotically optimal in the sense that it has exactly the same asymptotic performance as the best linear unbiased estimator (BLUE). Moreover, we prove that the optimal design is robust with respect to misspecification of the autocovariance function, according to a minimax criterion. When repeated observations are available, this estimator is consistent and has an asymptotically normal distribution. All these results are extended to Hölder error processes with index between 0 and 2. Finally, for small sample sizes, a simulated annealing algorithm is applied to a pharmacokinetic model with correlated errors.
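The simulated annealing step mentioned for small sample sizes can be sketched as follows. This is a minimal illustration, not the thesis's algorithm: the design criterion below is a hypothetical stand-in that merely penalizes clustered sampling times over a 24-hour window, rather than the variance of the AUC estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def criterion(times, horizon=24.0):
    """Hypothetical design criterion: penalize clustered sampling times
    (a stand-in for the variance of the AUC estimator)."""
    t = np.sort(times)
    gaps = np.diff(np.concatenate(([0.0], t, [horizon])))
    return float(np.sum(1.0 / (gaps + 1e-6)))

def anneal_design(n_times=5, n_iter=2000, t0=1.0, horizon=24.0):
    """Simulated annealing over the sampling times in [0, horizon]."""
    cur = rng.uniform(0.0, horizon, size=n_times)
    cur_val = criterion(cur, horizon)
    best, best_val = cur.copy(), cur_val
    for k in range(n_iter):
        temp = t0 * (1.0 - k / n_iter) + 1e-9            # linear cooling
        cand = np.clip(cur + rng.normal(0.0, 1.0, n_times), 0.0, horizon)
        cand_val = criterion(cand, horizon)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cand_val < cur_val or rng.random() < np.exp((cur_val - cand_val) / temp):
            cur, cur_val = cand, cand_val
            if cur_val < best_val:
                best, best_val = cur.copy(), cur_val
    return np.sort(best), best_val

design, value = anneal_design()
```

The accept/reject rule is the classical Metropolis criterion with a linearly decreasing temperature; any cooling schedule, and any pharmacokinetic variance criterion, could be substituted.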
Bousbia-Salah, Ryad. "Optimisation dynamique en temps-réel d’un procédé de polymérisation par greffage." Electronic Thesis or Diss., Université de Lorraine, 2018. http://www.theses.fr/2018LORR0242.
In a schematic way, process optimization consists of three basic steps: (i) modeling, in which a (phenomenological) model of the process is developed; (ii) problem formulation, in which the performance criterion, constraints and decision variables are defined; and (iii) resolution of the optimization problem, in which the optimal profiles of the decision variables are determined. It is important to emphasize that these profiles are optimal only for the model used. When applied to the process, they remain optimal only if the model perfectly describes the behavior of the process, which is very rarely the case in practice. Indeed, uncertainties about model parameters, process disturbances and structural model errors mean that the optimal profiles of the model-based decision variables will probably not be optimal for the process. The objective of this thesis is to develop a conceptual strategy for using experimental measurements online so that the process satisfies not only the necessary conditions but also the optimality conditions. This development relies in particular on recent advances in deterministic process optimization (stochastic methods are not dealt with in this work) based on the estimation of unmeasured state variables by a moving horizon observer. A dynamic real-time optimization (D-RTO) methodology has been developed and applied to a batch reactor where polymer grafting reactions take place. The objective is to determine online the reactor temperature profile that minimizes the batch time while meeting terminal constraints on the overall conversion rate and grafting efficiency.
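Step (iii), the resolution of the optimization problem, can be illustrated with a toy sketch. Everything here is hypothetical: crude first-order batch kinetics, a penalty formulation of the terminal conversion constraint, and a naive exhaustive search in place of the D-RTO scheme with moving horizon observer used in the thesis.

```python
import numpy as np

def terminal_conversion(temps, dt=0.5):
    """Toy first-order batch kinetics; the rate constant grows with temperature."""
    x = 0.0
    for T in temps:
        k = 0.05 * np.exp(0.04 * (T - 60.0))   # hypothetical Arrhenius-like law
        x += k * (1.0 - x) * dt
    return x

def cost(n_steps, level, target=0.9):
    """Batch time (0.5 h per step) plus a penalty if conversion misses target."""
    miss = max(0.0, target - terminal_conversion([level] * n_steps))
    return n_steps * 0.5 + 1000.0 * miss

# Naive search over horizon length and a constant temperature level,
# standing in for the actual optimizer.
best_n, best_T = min(((n, T) for n in range(1, 100) for T in range(60, 91, 5)),
                     key=lambda p: cost(*p))
```

With a large enough penalty weight, the minimizer satisfies the terminal constraint while trading off batch time, which is the essence of the formulation described above.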
Yousfi, Elqasyr Khadija. "MODÉLISATION ET ANALYSE STATISTIQUE DES PLANS D'EXPÉRIENCE SÉQUENTIELS." Phd thesis, Université de Rouen, 2008. http://tel.archives-ouvertes.fr/tel-00377114.
Tinsson, Walter. "Plans d'expérience à facteurs quantitatifs et à effets de blocs aléatoires." Pau, 1998. http://www.theses.fr/1998PAUU3021.
Druilhet, Pierre. "Optimalités des plans d'expériences équilibrés pour les voisinages." Toulouse 3, 1995. http://www.theses.fr/1995TOU30283.
Full textTlalolini, Romero David. "Génération de mouvements optimaux de marche pour des robots bipèdes 3D." Nantes, 2008. http://www.theses.fr/2008NANT2112.
This work is devoted to the generation of optimal walking cycle motions for anthropomorphic bipedal robots walking in three dimensions. These robots are assumed to consist of a kinematic structure that retains only the basic human locomotion mobilities of the hip-knee-ankle-foot kinematic chain. The decomposition of the human gait cycle, the robot models and the constraints linked to the contact between the foot and the ground are presented. The Newton-Euler method is used to define the dynamic model and the impact model. The generation of walking motions is posed as a constrained parameter optimization problem, solved by Sequential Quadratic Programming (SQP) methods. The criterion is optimized in order to increase the energy autonomy of the biped robot. To achieve and improve the convergence of the optimization algorithm, the gradient of the criterion is computed analytically. In a preliminary study limited to the sagittal plane, different gaits are compared. These are defined by instantaneous or finite-time double support phases, and by single support phases that may or may not include an under-actuated sub-phase in which the stance foot rotates about the stance toe. These results are then generalized to three-dimensional walking, comparing two gaits that include impacts and single support phases with or without foot rolling. These studies show that introducing a foot-rolling phase during single support allows much higher walking velocities with less energy consumption.
Lasserre, Virginie. "Determination de plans optimaux pour des essais en crossover dans le cadre du modele lineaire." Paris 5, 1992. http://www.theses.fr/1992PA05S012.
Varet, Suzanne. "Développement de méthodes statistiques pour la prédiction d'un gabarit de signature infrarouge." Phd thesis, Université Paul Sabatier - Toulouse III, 2010. http://tel.archives-ouvertes.fr/tel-00511385.
Full textWahdame, Bouchra. "Analyse et optimisation du fonctionnement de piles à combustible par la méthode des plans d’expériences." Besançon, 2006. http://www.theses.fr/2006BESA2034.
The Design of Experiments (DoE) methodology can be a suitable medium to characterise, analyse and also improve a complex system such as a fuel cell (FC). After a brief overview of PEMFC technology and its operating constraints, the author provides some essential elements of the DoE methodology. It is then applied to various cases. Some experimental designs are first proposed to evaluate the influence of gas pressures and flow rates on the maximal power reached by a PEMFC. The analyses made for the full factorial and fractional designs lead to similar results: a major influence of the air flow. Next, the tools proposed in this work are developed to analyse data collected from a 5 kW PEMFC. The impacts of current, temperature, gas pressure and stoichiometry rate on the FC voltage are estimated, and possible interactions between the factors are highlighted as well. The FC operating parameters are then optimised for high performance using statistical models. A durability test performed on a PEMFC stack operated for 1000 hours is analysed using the Response Surface Methodology; this study notably shows the interest of varying the stoichiometry rates over ageing time in order to obtain both high voltages and lower cell-voltage variability. Finally, the author shows how the DoE methodology can contribute to a deeper understanding of FC physical phenomena in three domains: the pressure losses in the bipolar plates, the impact of humidification on the stack internal resistance, and the ageing of a stack operated at a high temperature level.
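The full factorial and fractional designs compared in this abstract can be sketched as follows, with hypothetical factor names taken from the text and the standard -1/+1 level coding.

```python
from itertools import product

# Hypothetical factor names taken from the abstract; levels coded -1/+1.
factors = ["current", "temperature", "gas_pressure", "stoichiometry"]

# Full 2^4 factorial design: every combination of levels, 16 runs.
full = list(product([-1, 1], repeat=len(factors)))

# Half-fraction 2^(4-1) with defining relation I = ABCD: the fourth column
# is generated as the product of the first three, giving 8 runs.
fraction = [(a, b, c, a * b * c) for a, b, c in product([-1, 1], repeat=3)]

def main_effect(design, responses, col):
    """Main-effect estimate: mean response at level +1 minus mean at level -1."""
    plus = [y for run, y in zip(design, responses) if run[col] == 1]
    minus = [y for run, y in zip(design, responses) if run[col] == -1]
    return sum(plus) / len(plus) - sum(minus) / len(minus)
```

With a toy response dominated by the first factor, `main_effect(full, responses, 0)` recovers its coefficient; in the half-fraction, each main effect is aliased with a three-factor interaction, which is why the two designs can still give similar screening conclusions.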
Gimeno, Anthony. "Contribution à l'étude d'alternateurs automobiles : caractérisation des pertes en vue d'un dimensionnement optimal." Compiègne, 2011. http://www.theses.fr/2011COMP1923.
The advent of more stringent anti-pollution standards and rising oil prices lead car manufacturers and automotive suppliers to seek efficient solutions for the future. This thesis is a contribution to improving the performance of the electrical generating function in a thermal powertrain. Two approaches are considered. In the first, a study is done on the current machine (the Lundell structure, or claw-pole machine). In the second, we design a structure able to replace the claw-pole alternator. To identify and understand the evolution of the different losses, a characterization of losses is made and a study on the VDA cycle is completed. We stress the importance of iron losses in the claw-pole structure and study the influence of various factors on their evolution. The interest of a delta connection in terms of stator iron losses is emphasized, and the impact of the resulting current on copper losses is quantified. A study combining experimental and finite-element approaches examines the distribution of iron losses between the stator and the rotor of this structure. Finally, the impact of the rectifier on the evolution of iron losses is studied. During this first approach, we also propose an analytical model of the machine and its efficiency. The second approach leads to the design of a hybrid structure based on a wound-rotor synchronous machine with interpolar magnets. In this study, we propose a coupling between an analytical design and a finite-element one through the use of experimental designs. This study yields an efficiency map of the hybrid structure, highlighting its value in terms of performance relative to a claw-pole machine.
Scheidt, Céline. "Analyse statistique d'expériences simulées : Modélisation adaptative de réponses non régulières par krigeage et plans d'expériences, Application à la quantification des incertitudes en ingénierie des réservoirs pétroliers." Phd thesis, Université Louis Pasteur (Strasbourg) (1971-2008), 2006. https://publication-theses.unistra.fr/public/theses_doctorat/2006/SCHEIDT_Celine_2006.pdf.
Quantification of uncertainty in reservoir performance is an essential phase of oil field evaluation and production. Due to the large number of parameters and the physical complexity of the reservoir, fluid flow models can be computationally time consuming. Traditional uncertainty management is thus routinely performed using proxy models of the fluid flow simulator, following experimental design methodology. However, this approach often ignores the irregularity of the response. The objective of the thesis is to construct non-linear proxy models of the fluid flow simulator. Contrary to classical experimental designs, which assume a polynomial behavior of the response, we build evolutive experimental designs to fit gradually the potentially non-linear shape of the uncertainty. This methodology combines the advantages of experimental design with geostatistical methods. Starting from an initial trend of the uncertainty, the method iteratively determines new simulations that might bring crucial information to update the estimation of the uncertainty. Four criteria for adding new simulations are proposed. We suggest performing simulations at the extremes and at the null-derivative points of the approximation in order to better characterize irregularity. In addition, we propose an original way to increase the prior predictivity of the approximation using pilot points, which are also good candidates for simulation. This methodology allows for efficient modeling of highly non-linear responses while reducing the number of simulations compared to Latin hypercubes. This work can potentially improve the efficiency of decision making under uncertainty.
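The idea of enriching a design where the proxy model is least certain can be sketched with a minimal kriging loop. This is an illustration under simplifying assumptions (1D input, unit-variance Gaussian covariance, a maximum-variance enrichment criterion), not the four criteria or pilot-point construction developed in the thesis.

```python
import numpy as np

def kriging_predict(X, y, Xnew, length=0.3):
    """Simple-kriging predictor with a unit-variance Gaussian covariance."""
    k = lambda a, b: np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length ** 2))
    K = k(X, X) + 1e-10 * np.eye(len(X))        # jitter for numerical stability
    Ks = k(Xnew, X)
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

simulator = lambda x: np.sin(8 * x)             # stand-in for the flow simulator
grid = np.linspace(0.0, 1.0, 201)
X = np.array([0.0, 0.5, 1.0])                   # coarse initial design

# Enrichment loop: simulate where the predictive variance is largest
# (one possible criterion among several).
for _ in range(5):
    _, var = kriging_predict(X, simulator(X), grid)
    X = np.append(X, grid[np.argmax(var)])
```

Because the predictive variance vanishes at existing design points, the criterion naturally spreads new simulations into the unexplored regions of the domain.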
Barbillon, Pierre. "Méthodes d'interpolation à noyaux pour l'approximation de fonctions type boîte noire coûteuses." Phd thesis, Université Paris Sud - Paris XI, 2010. http://tel.archives-ouvertes.fr/tel-00559502.
Full textSantiago, Jenny. "Développement de nouveaux plans d'expériences uniformes adaptés à la simulation numérique en grande dimension." Electronic Thesis or Diss., Aix-Marseille, 2013. http://www.theses.fr/2013AIXM4302.
This thesis proposes a methodology for studies based on numerical simulation in high dimensions. The methodology has several steps: setting up an experimental design, performing sensitivity analysis, then using a response surface for modelling. In numerical simulation, we use a space-filling design that scatters the points over the entire domain. The construction of an experimental design in high dimensions must be efficient, with good uniformity properties, and must also be fast. We propose using the WSP algorithm to construct such a design. This design is then used in all steps of the methodology, making it versatile, from sensitivity analysis to modelling. A sensitivity analysis identifies the influential factors. Adapting the principle of the Morris method, this approach classifies the inputs into three groups according to their effects. The experimental design is then folded over into the subspace of the influential inputs. This operation can degrade the uniformity properties of the design by creating gaps and clusters, so it is necessary to repair it by removing clusters and filling gaps. We propose a step-by-step approach to provide a suitable repair for each design. The repaired design is then used for the final step: modelling from the response surface. We consider a Support Vector Machines method because its construction is not affected by dimension. Easy to construct and giving good results, similar to those obtained by kriging, the Support Vector Regression method is an alternative for the study of complex phenomena in high dimensions.
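The Morris-style screening step can be sketched as follows. The toy model is hypothetical, and this computes only the standard elementary-effect statistics (mu* for overall influence, sigma for nonlinearity and interactions) from which the inputs would then be classified into groups.

```python
import numpy as np

rng = np.random.default_rng(1)

def elementary_effects(f, dim, n_traj=20, delta=0.1):
    """One-at-a-time Morris screening: perturb each input in turn and
    record the elementary effect; summarize with mu* and sigma."""
    effects = np.empty((dim, n_traj))
    for j in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)
        fx = f(x)
        for i in range(dim):
            xp = x.copy()
            xp[i] += delta
            effects[i, j] = (f(xp) - fx) / delta
    mu_star = np.abs(effects).mean(axis=1)   # overall influence
    sigma = effects.std(axis=1)              # nonlinearity / interactions
    return mu_star, sigma

# Toy model: x0 linear and strong, x1 nonlinear, x2 inert.
model = lambda x: 5.0 * x[0] + np.sin(6.0 * x[1])
mu_star, sigma = elementary_effects(model, dim=3)
```

Inputs with both statistics near zero (here x2) would be screened out before folding the design into the subspace of the remaining inputs.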
Coutu, Arnaud. "Conception de réacteurs de laboratoire et développement d’approches numériques pour l’optimisation du procédé de méthanisation en voie solide et discontinu : plans d’expériences mixtes et bootstrapping, modélisation couplée hydrodynamique et biochimique." Thesis, Compiègne, 2021. http://www.theses.fr/2021COMP2616.
This work aims to provide innovative solutions and perspectives for current research on solid-state anaerobic digestion, using digital tools. The deliverables of this study are organized into a "digital toolbox" for engineers and researchers. Like a worker's toolbox, this solution consists of several sections representing the three main digital applications for anaerobic digestion: optimization, hydrodynamics and modeling. Each part is linked with the others in a systemic approach identifying the exchanges between them, in order to form a complete solution exceeding the sum of its parts. This work was carried out on two substrates, straw cattle manure and damp grass, but was designed to be transposable to any substrate. The first step was the design of the reactors and gas counters needed to perform each of these steps at lower cost; this equipment was designed with computer-aided tools after its characteristics were determined by calculation. The second step was to develop an approach, distinct from classical experimental designs, that increases optimization efficiency. This method optimizes both the composition of the substrate mix and the values of the operating parameters in a single experimental design combining a factorial design and a mixture design. Bootstrapping is used to minimize the number of experiments while maintaining the significance of the results. In this study, the composition of the two substrates and two operating parameters, the immersion of the substrates and the recirculation frequency of the liquid phase, were studied to maximize methane yield. Percolation is also a subject of study: what is the purpose of optimizing the operating parameters if the liquid phase cannot percolate through the solid part?
The third step of this work therefore studies the flow of the liquid phase through the solid part under codigestion conditions, according to its composition and stratification. This step provides the characteristic flow parameter, supports a new stratification approach, and highlights the effect of codigestion on the evolution of microporosity and macroporosity. Finally, the impact of this porosity evolution on the biology was modeled in a single-substrate model, providing an understanding tool and a first step toward a predictive tool integrating these phenomena. The study as a whole makes it possible to optimize the operating parameters, ensure the functional soundness of an experiment, and put forward a model for understanding the evolution of porosities. It is not a definitive solution but one to be built upon, just as the toolbox is continuously renewed with innovative and more efficient tools.
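The bootstrapping idea used to limit the number of experiments can be sketched as follows; the methane yields below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical methane yields (mL CH4 per g VS) from a small set of runs.
yields = np.array([180.0, 210.0, 195.0, 205.0, 188.0, 199.0])

# Nonparametric bootstrap: resample runs with replacement to gauge the
# uncertainty of the mean yield without performing extra experiments.
boot_means = np.array([
    rng.choice(yields, size=len(yields), replace=True).mean()
    for _ in range(5000)
])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```

The resulting percentile interval quantifies how reliable the mean-yield estimate is for a design of only six runs, which is the kind of significance check the combined factorial/mixture design relies on.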
Salameh, Farah. "Méthodes de modélisation statistique de la durée de vie des composants en génie électrique." Phd thesis, Toulouse, INPT, 2016. http://oatao.univ-toulouse.fr/16622/1/Salameh_Farah.pdf.
Full textFu, Shuai. "Inverse problems occurring in uncertainty analysis." Thesis, Paris 11, 2012. http://www.theses.fr/2012PA112208/document.
This thesis provides a probabilistic solution to inverse problems through Bayesian techniques. The inverse problem considered here is to estimate the distribution of a non-observed random variable X from some noisy observed data Y explained by a time-consuming physical model H. In general, such inverse problems are encountered when treating uncertainty in industrial applications. Bayesian inference is favored as it accounts for prior expert knowledge on X in a small sample size setting. A Metropolis-Hastings-within-Gibbs algorithm is proposed to compute the posterior distribution of the parameters of X through a data augmentation process. Since it requires a high number of calls to the expensive function H, the model is replaced by a kriging meta-model. This approach involves several errors of different natures, and we focus on measuring and reducing their possible impact. A DAC criterion has been proposed to assess the relevance of the numerical design of experiments and the prior assumption, taking into account the observed data. Another contribution is the construction of adaptive designs of experiments suited to our particular purpose in the Bayesian framework. The main methodology presented in this thesis has been applied to a real hydraulic engineering case study.
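The Metropolis-Hastings-within-Gibbs idea can be sketched on a toy problem; the model below (Gaussian observations, flat priors, two scalar parameters updated one coordinate at a time) is purely illustrative and involves nothing like the thesis's expensive model H or its kriging meta-model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: noisy observations of an unknown mean.
y = rng.normal(2.0, 1.0, size=50)

def log_post(mu, log_s2):
    """Log-posterior under flat priors (i.e., the Gaussian log-likelihood)."""
    s2 = np.exp(log_s2)
    return -0.5 * len(y) * np.log(s2) - 0.5 * np.sum((y - mu) ** 2) / s2

mu, log_s2 = 0.0, 0.0
mu_draws = []
for _ in range(5000):
    for idx in range(2):                     # one MH step per coordinate
        prop = [mu, log_s2]
        prop[idx] += rng.normal(0.0, 0.3)    # random-walk proposal
        if np.log(rng.random()) < log_post(*prop) - log_post(mu, log_s2):
            mu, log_s2 = prop
    mu_draws.append(mu)
post_mean = float(np.mean(mu_draws[1000:]))
```

Updating one parameter at a time with its own Metropolis accept/reject step, rather than jointly, is what "within Gibbs" refers to; it is convenient when the full conditionals cannot be sampled directly.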
Blondet, Gaëtan. "Système à base de connaissances pour le processus de plan d'expériences numériques." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2363/document.
In order to improve industrial competitiveness, product design relies more and more on numerical tools, such as numerical simulation, to develop better and cheaper products faster. Numerical Design of Experiments (NDoE) is increasingly used to account for variabilities during simulation processes and to design more robust, reliable and optimized products earlier in the product development process. Nevertheless, an NDoE process may be too expensive to apply to a complex product, because of the high computational cost of the model and the high number of required experiments. Several methods exist to decrease this computational cost, but they require expert knowledge to be applied efficiently. In addition, an NDoE process produces a large amount of data which must be managed. The aim of this research is to propose a solution for defining, as fast as possible, an efficient NDoE process which produces as much useful information as possible with a minimal number of simulations, for complex products. The objective is to shorten both the process definition and execution steps. A knowledge-based system is proposed, based on a specific ontology and a Bayesian network, to capitalise, share and reuse knowledge and data in order to predict the best NDoE process definition for a new product. This system is validated on a product from the automotive industry.
El-Habti, Ahmed. "Estimation bayésienne empirique pour les plans d'expérience non équilibrés." Mémoire, 2007. http://www.archipel.uqam.ca/5029/1/M9893.pdf.