Theses on the topic "Optimisation probabiliste"
Consult the top 50 dissertations (master's or doctoral theses) on the research topic "Optimisation probabiliste".
Scherrer, Bruno. "Application et optimisation de l'échantillonnage probabiliste en écologie continentale". Montpellier 2, 1987. http://www.theses.fr/1987MON20115.
Bouillard, Anne. "Optimisation et analyse probabiliste de systèmes à évènements discrets". Lyon, École normale supérieure (sciences), 2005. http://www.theses.fr/2005ENSL0337.
This thesis deals with the study of discrete event systems. Three different models are considered. In the first part, we are interested in trace groups. After giving a simple Möbius-like formula for the generating series of the trace groups, we show the existence and the algebraicity of the asymptotic growth rate of the height of the traces. The second part is devoted to timed free-choice nets, an important subclass of Petri nets. We define the notion of throughput in those nets and study the variation of the throughput as a function of the conflict resolution policies. First, we show how to compute the throughput; then we are interested in the policies that maximize or minimize it. Finally, we give an efficient method to generate a marking according to its exact distribution in order to numerically evaluate the throughput. In the last part, we study the computation of performance guarantees in networks using Network Calculus techniques. We show the stability of the ultimately pseudo-periodic functions under the operations of Network Calculus and give algorithms to compute these functions. These techniques are then applied to the study of performance guarantees in graphs with turn prohibition.
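As an illustration of the kind of guarantee Network Calculus provides, the classical delay bound for a token-bucket arrival curve crossing a rate-latency service curve can be sketched in a few lines. This is textbook Network Calculus material, not code from the thesis, and the parameter names are illustrative:

```python
def delay_bound(burst, arrival_rate, service_rate, latency):
    """Network Calculus delay bound for an arrival curve
    alpha(t) = burst + arrival_rate * t and a service curve
    beta(t) = service_rate * max(t - latency, 0).

    The bound is the maximal horizontal deviation between the two
    curves: burst / service_rate + latency.  Stability requires the
    arrival rate not to exceed the service rate.
    """
    assert arrival_rate <= service_rate, "unstable system"
    return burst / service_rate + latency
```

For example, a flow with a 10-bit burst served at rate 2 with latency 0.5 has a worst-case delay of 5.5 time units.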
Scherrer, Bruno. "Application et optimisation de l'échantillonnage probabiliste en écologie continentale". Grenoble 2 : ANRT, 1987. http://catalogue.bnf.fr/ark:/12148/cb37609751c.
Schmitt, Lucie. "Durabilité des ouvrages en béton soumis à la corrosion : optimisation par une approche probabiliste". Thesis, Toulouse, INSA, 2019. http://www.theses.fr/2019ISAT0009/document.
Mastering the durability of new structures and the need to extend the lifespan of existing constructions correspond to social issues of the highest order and are part of the principles of a circular economy. The durability of concrete structures thus occupies a central position in the normative context. This thesis follows the work of J. Mai-Nhu* and aims at extending the field of application of the SDReaM-crete model by integrating concretes based on mineral additions and by defining a limit state criterion based on a quantity of corroded products. An approach based on numerical optimization of predictive computations is set up to perform reliability analyses considering the main mechanisms related to the corrosion of reinforcement: carbonation and chlorides. This model enables the optimization of concrete covers and performances by further integrating the environmental conditions as defined by the standards.
Bérard, Jean. "Contributions à l'étude probabiliste des algorithmes d'évolution". Lyon 1, 2001. http://www.theses.fr/2001LYO10223.
Belkora, Samir. "Les méthodes d'optimisation multiobjectif : synthèse et considérations théoriques : application d'un modèle probabiliste de choix au problème d'optimisation multiobjectif". Aix-Marseille 2, 1986. http://www.theses.fr/1986AIX24012.
This work deals with multiple objective optimization methods. In a first, large part, we present a synthesis of multiple objective methods following a traditional classification which separates the methods into three major classes: methods which require an "a priori" weighting of the objectives; methods which require a "progressive" weighting of the objectives, i.e. the interactive methods; and methods which lead to an "a posteriori" weighting of the objectives. In a second part, we provide a theoretical contribution to improve a method belonging to the class of interactive methods by integrating a qualitative-choice probabilistic model which allows sufficiently versatile specifications: the conditional probit model.
Souissi, Salma. "Problème du Bin Packing probabiliste à une dimension". Versailles-St Quentin en Yvelines, 2006. http://www.theses.fr/2006VERS0052.
The Probabilistic Bin Packing Problem (PBPP) considers the random deletion of some items once they have been placed into bins. The problem is to rearrange the residual items using the a priori solution, the initial arrangement being obtained with the Next Fit Decreasing (NFD) heuristic. We propose two resolution methodologies: the redistribution strategy according to NFD and the a priori strategy. In the first one, the Next Fit algorithm is applied to the new list. In the second one, successive groups of bins are optimally rearranged. In both cases, we develop an average-case analysis for the PBPP. We prove the law of large numbers and the central limit theorem for the number of occupied bins as the initial number of items tends to infinity. We verify these theoretical results by simulation.
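The Next Fit Decreasing heuristic used above for the initial arrangement is simple enough to sketch directly. This is an illustrative Python rendering of the textbook heuristic, not code from the thesis:

```python
def next_fit_decreasing(items, capacity):
    """Next Fit Decreasing (NFD): sort item sizes in decreasing order,
    then place each item in the currently open bin if it fits,
    otherwise close it and open a new bin (Next Fit rule)."""
    bins = []
    current, load = [], 0.0
    for size in sorted(items, reverse=True):
        if current and load + size > capacity:
            bins.append(current)          # close the current bin
            current, load = [], 0.0
        current.append(size)
        load += size
    if current:
        bins.append(current)
    return bins
```

On the item list [0.5, 0.6, 0.4, 0.3] with unit capacity, NFD opens three bins: [0.6], [0.5, 0.4] and [0.3].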
Bahloul, Khaled. "Optimisation combinée des coûts de transport et de stockage dans un réseau logistique dyadique, multi-produits avec demande probabiliste". Phd thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00695275.
Royer, Clément. "Algorithmes d'optimisation sans dérivées à caractère probabiliste ou déterministe : analyse de complexité et importance en pratique". Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30207/document.
Randomization has had a major impact on the latest developments in the field of numerical optimization, partly due to the outbreak of machine learning applications. In this increasingly popular context, classical nonlinear programming algorithms have indeed been outperformed by variants relying on randomness. The cost of these variants is usually lower than for the traditional schemes; however, theoretical guarantees may not be straightforward to carry over from the deterministic to the randomized setting. Complexity analysis is a useful tool in the latter case, as it helps in providing estimates on the convergence speed of a given scheme, which implies some form of convergence. Such a technique has also gained attention from the deterministic optimization community thanks to recent findings in the nonconvex case, as it brings supplementary indicators on the behavior of an algorithm. In this thesis, we investigate the practical enhancement of deterministic optimization algorithms through the introduction of random elements within those frameworks, as well as the numerical impact of their complexity results. We focus on direct-search methods, one of the main classes of derivative-free algorithms, yet our analysis applies to a wide range of derivative-free methods. We propose probabilistic variants of classical properties required to ensure convergence of the studied methods, then highlight the practical efficiency induced by their lower consumption of function evaluations. First-order concerns form the basis of our analysis, which we apply to address unconstrained and linearly constrained problems. The observed gains incite us to additionally take second-order considerations into account. Using complexity properties of derivative-free schemes, we develop several frameworks in which second-order information is exploited. Both a deterministic and a probabilistic analysis can be performed on these schemes. The latter is an opportunity to introduce supplementary probabilistic properties, together with their impact on numerical efficiency and robustness.
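A toy direct-search method with randomly drawn polling directions, in the spirit of the probabilistic variants studied above, can be sketched as follows. The step-size update constants and the sufficient-decrease threshold are illustrative choices, not the thesis's actual algorithm:

```python
import math
import random

def direct_search(f, x0, alpha=1.0, tol=1e-6, max_iter=2000, seed=0):
    """Direct search with random polling: at each iteration, draw one
    random unit direction d and poll x + alpha*d and x - alpha*d.
    Accept a poll point on sufficient decrease (expand the step),
    otherwise contract the step size."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        if alpha < tol:
            break
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(v * v for v in d)) or 1.0
        d = [v / norm for v in d]
        improved = False
        for sgn in (1.0, -1.0):
            y = [xi + sgn * alpha * di for xi, di in zip(x, d)]
            fy = f(y)
            if fy < fx - 1e-4 * alpha * alpha:   # sufficient decrease
                x, fx, improved = y, fy, True
                alpha *= 2.0                      # expand on success
                break
        if not improved:
            alpha *= 0.5                          # contract on failure
    return x, fx
```

On a smooth convex function such as the squared norm, this sketch drives the objective close to its minimum using only function evaluations, which is the point of the derivative-free setting.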
Bonnard, Cécile. "Optimisation de potentiels statistiques pour un modèle d'évolution soumis à des contraintes structurales". Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2010. http://tel.archives-ouvertes.fr/tel-00495973.
Odeh, Khaled. "Nouveaux algorithmes pour le traitement probabiliste et logique des arbres de défaillance". Compiègne, 1995. http://www.theses.fr/1995COMPD846.
Matoiri, Chaibati Nadare. "Méthode probabiliste générique de qualification de la durabilité / fiabilité du béton dans son environnement". Thesis, Angers, 2020. http://www.theses.fr/2020ANGE0021.
The evolution of construction techniques, innovation and environmental standards are driving construction players to offer efficient and innovative materials for the construction of buildings and various infrastructures around the world. Concrete remains the most widely used material in the construction industry today. Concrete standardization is generally based on prescriptive rules that take into account the water/cement ratio, the minimum resistance class, the minimum cement content, the minimum air content, etc. This prescriptive method remains satisfactory but must be supplemented by an alternative that accounts for concrete compositions not mentioned in the standard. Thus, the performance-based approach to concrete durability has been proposed. The aim is to allow the manufacture of concrete with a formula different from the standard while showing that it is as efficient as that of the standard. This performance demonstration involves the use of durability indicators and performance tests. The results of the tests carried out on the new concrete formulation will show whether its performance is equal to or above that of the standard. The problem arises in the number of tests to be carried out in order to guarantee the performance of the concrete. In this thesis, a generic method for qualifying the durability/reliability of concrete with an optimized test plan is proposed for different types of environments. This method allows the durability/reliability of concrete in its environment to be estimated with a given guarantee and a statistic of the tests to be carried out. The Wiener process is used to model degradation processes in concrete. An application of the method is carried out for the case of a concrete subject to carbonation.
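A Wiener degradation process like the one used above can be simulated directly. The sketch below uses a simple Euler discretisation with illustrative drift, diffusion and threshold values (not the thesis's calibrated model); for this model the mean first-passage time to the failure threshold is threshold/drift:

```python
import random

def wiener_first_passage(mu, sigma, threshold, dt=0.01, seed=0, max_t=1000.0):
    """Simulate a Wiener degradation process X(t) = mu*t + sigma*B(t)
    by Euler steps and return the first time X crosses the failure
    threshold (capped at max_t)."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while x < threshold and t < max_t:
        x += mu * dt + sigma * rng.gauss(0.0, 1.0) * dt ** 0.5
        t += dt
    return t
```

Averaging over many simulated trajectories, the empirical mean failure time approaches threshold/mu, which is the inverse-Gaussian mean of the first-passage time.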
Goka, Edoh. "Analyse des tolérances des systèmes complexes – Modélisation des imperfections de fabrication pour une analyse réaliste et robuste du comportement des systèmes". Thesis, Paris, ENSAM, 2019. http://www.theses.fr/2019ENAM0019/document.
Tolerance analysis aims to verify the impact of individual tolerances on the assembly and functional requirements of a mechanical system. Manufactured products have several types of contacts and their geometry is imperfect, which may lead to non-assembly and non-functioning. Traditional methods for tolerance analysis do not consider form defects. This thesis proposes a new procedure for tolerance analysis which considers form defects and the different types of contact in its geometrical behavior modeling. A method is first proposed to model form defects, enabling a realistic analysis. Thereafter, form defects are integrated into the geometrical behavior modeling of a mechanical system, also considering the different types of contacts; indeed, these contacts behave differently once the imperfections are considered. Monte Carlo simulation coupled with an optimization technique is chosen as the method to perform the tolerance analysis. Nonetheless, this method requires excessive numerical effort. To overcome this problem, probabilistic models using the Kernel Density Estimation method are proposed.
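The Monte Carlo side of such a tolerance analysis can be sketched on a simple linear stack-up. This is an illustrative one-dimensional example (normally distributed dimensions, no form defects or contacts, unlike the thesis's model):

```python
import random

def tolerance_monte_carlo(nominals, sigmas, spec_lo, spec_hi,
                          n=100_000, seed=1):
    """Monte Carlo tolerance analysis of a linear stack-up: sample
    each dimension from a normal distribution, sum them, and estimate
    the probability that the stack stays within the functional
    requirement [spec_lo, spec_hi]."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        stack = sum(rng.gauss(m, s) for m, s in zip(nominals, sigmas))
        if spec_lo <= stack <= spec_hi:
            ok += 1
    return ok / n
```

For two dimensions of nominal 10 with standard deviation 0.01 and a requirement of 20 +/- 0.05, the conformity rate is close to 1, since the spec limits sit about 3.5 combined standard deviations from the mean.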
Macherey, Arthur. "Approximation et réduction de modèle pour les équations aux dérivées partielles avec interprétation probabiliste". Thesis, Ecole centrale de Nantes, 2021. http://www.theses.fr/2021ECDN0026.
In this thesis, we are interested in the numerical solution of models governed by partial differential equations that admit a probabilistic interpretation. In a first part, we consider partial differential equations in high dimension. Based on a probabilistic interpretation of the solution which allows to obtain pointwise evaluations of the solution using Monte-Carlo methods, we propose an algorithm combining an adaptive interpolation method and a variance reduction method to approximate the global solution. In a second part, we focus on reduced basis methods for parametric partial differential equations. We propose two greedy algorithms based on a probabilistic interpretation of the error. We also propose a discrete optimization algorithm probably approximately correct in relative precision which allows us, for these two greedy algorithms, to judiciously select a snapshot to add to the reduced basis based on the probabilistic representation of the approximation error
Hadj-Ahmed, Réda. "Modélisation des assemblages collés : Application à l'optimisation du transfert des efforts par cisaillement". Marne-la-vallée, ENPC, 1999. http://www.theses.fr/1999ENPC9926.
The use of adhesives to manufacture in-plane force-transmitting joints between various structural materials is now widespread, because bonding presents several advantages over other traditional assembly techniques. However, from a mechanical point of view, classical joints (layers with constant thickness) present stress concentrations near the ends of the bond. Several joint shapes have been proposed in the literature to decrease these stress concentrations, but the analysis of stresses in such joints often requires a finite element method. The initial part of this work consists in rewriting a multiparticle model for an adhesive bonded joint having layers with non-constant thickness. The model equations are solved numerically by establishing a variational formulation. The model is validated by comparison with finite element results on classical joints, scarfed joints, tapered joints and joints with uniform shear stress. The second part of this study deals with scale effects on the shear strength of adhesive joints. We propose a scale law giving the critical shear stress as a function of the adhesive thickness. This allows us to establish an optimization procedure taking the scale effect into account. This scale effect is often related to the presence of defects in the adhesive. A law giving the probability of adhesive joint resistance is established. It makes it possible to find an optimal adhesive thickness and a limiting overlap length. This analysis also allows us to clearly define the problem of the probabilistic optimization of an adhesive joint: indeed, it comes down to the minimization of a functional of the adhesive thickness.
Piegay, Nicolas. "Optimisation multi-objectif et aide à la décision pour la conception robuste. : Application à une structure industrielle sur fondations superficielles". Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0393/document.
Design in civil engineering is usually performed in a semi-probabilistic way using characteristic values associated with partial safety factors. However, this approach does not guarantee the robustness of the structure with regard to uncertainties that could affect its performance during construction and operation. In this thesis, we propose a decision-aid methodology for the robust design of a steel frame on spread foundations. Soil-structure interaction is taken into consideration in the design process, implying that the design choices on foundations influence the design choices on the steel frame (and vice versa). The proposed design approach uses multi-objective optimization and decision-aid methods in order to obtain the best solution with respect to the decision-maker's preferences on each criterion. Furthermore, sensitivity analyses are performed in order to identify and quantify the uncertainty sources most influencing the variability of the structure's performances. These uncertainties are modeled as random variables and propagated through the design process using Latin hypercube sampling. A part of this dissertation is devoted to the effects of uncertainties in soil properties on the structural responses and on the global design approach.
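The Latin hypercube sampling used above to propagate uncertainties can be sketched in a few lines on the unit hypercube (an illustrative textbook implementation, not the thesis's code):

```python
import random

def latin_hypercube(n, dim, seed=0):
    """Latin hypercube sampling on [0, 1]^dim: for each dimension,
    place exactly one sample in each of the n equal-width strata,
    then randomly pair the strata across dimensions via a shuffle."""
    rng = random.Random(seed)
    samples = [[0.0] * dim for _ in range(n)]
    for d in range(dim):
        perm = list(range(n))
        rng.shuffle(perm)
        for i in range(n):
            # one uniform draw inside stratum perm[i]
            samples[i][d] = (perm[i] + rng.random()) / n
    return samples
```

Unlike plain Monte Carlo, every one-dimensional stratum is hit exactly once, which typically reduces the variance of propagated statistics for the same number of model evaluations.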
Mbaye, Moustapha. "Conception robuste en vibration et aéroélasticité des roues aubagées de turbomachines". Phd thesis, Université Paris-Est, 2009. http://tel.archives-ouvertes.fr/tel-00529002.
Nelakanti, Anil Kumar. "Modélisation du langage à l'aide de pénalités structurées". Phd thesis, Université Pierre et Marie Curie - Paris VI, 2014. http://tel.archives-ouvertes.fr/tel-01001634.
Vidal, Vincent. "Développement de modèles graphiques probabilistes pour analyser et remailler les maillages triangulaires 2-variétés". Phd thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00708530.
Mazigh, Mourad. "Solutions probabilistes de problèmes en optimisation". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ60944.pdf.
Dubourg, Vincent. "Méta-modèles adaptatifs pour l'analyse de fiabilité et l'optimisation sous contrainte fiabiliste". Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00697026.
Haddad, Marcel Adonis. "Nouveaux modèles robustes et probabilistes pour la localisation d'abris dans un contexte de feux de forêt". Electronic Thesis or Diss., Université Paris sciences et lettres, 2020. http://www.theses.fr/2020UPSLD021.
The location of shelters in different areas threatened by wildfires is one of the possible ways to reduce fatalities in a context of an increasing number of catastrophic and severe forest fires. The problem is basically to locate p shelters minimizing the maximum distance people will have to cover to reach the closest accessible shelter in case of fire. The landscape is divided into zones and is modeled as an edge-weighted graph, with vertices corresponding to zones and edges corresponding to direct connections between two adjacent zones. Each scenario corresponds to a fire outbreak on a single zone (i.e., on a vertex), with the main consequence of modifying evacuation paths in two ways. First, an evacuation path cannot pass through the vertex on fire. Second, the fact that someone close to the fire may have limited choice, or may not take rational decisions, when selecting a direction to escape is modeled using a new kind of evacuation strategy. This evacuation strategy, called Under Pressure, induces particular evacuation distances which render our model specific. We propose two problems with this model: the Robust p-Center Under Pressure problem and the Probabilistic p-Center Under Pressure problem. First, we prove hardness results for both problems on classes of graphs relevant to our context. In addition, we propose polynomial exact algorithms on simple classes of graphs and we develop mathematical algorithms based on integer linear programming.
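A much simpler relative of the p-center variants above is Gonzalez's farthest-first heuristic, a classical 2-approximation for the metric p-center problem. The sketch below works on points in the plane rather than on the fire-scenario graphs of the thesis, purely for illustration:

```python
import math

def farthest_first_centers(points, p):
    """Gonzalez farthest-first heuristic for metric p-center:
    start from an arbitrary point, then repeatedly add the point
    farthest from the current set of centers."""
    centers = [points[0]]
    while len(centers) < p:
        far = max(points,
                  key=lambda q: min(math.dist(q, c) for c in centers))
        centers.append(far)
    return centers
```

With two well-separated clusters of points and p = 2, the heuristic picks one center in each cluster, so the covering radius stays small.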
Benhida, Soufia. "De l'optimisation pour l'aide à la décision : applications au problème du voyageur de commerce probabiliste et à l'approximation de données". Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR27.
The first part of this work deals with route optimization in the form of an optimization problem: the Traveling Salesman Problem (TSP). In this part we give a detailed presentation of the Traveling Salesman Problem and its variants, then we propose a constraint-generation strategy for the resolution of the TSP. We then treat its stochastic version: the Probabilistic Traveling Salesman Problem (PTSP). We propose a mathematical formulation of the PTSP and present numerical results obtained by exact resolution for a series of small instances. In the second part, we propose a general approximation method for different types of data: first we treat the approximation of a wind signal (simple case, 1D), then the approximation of a vector field taking the topography into account, which is the main contribution of this part.
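Exact resolution on small instances, as reported above, can be done by plain enumeration. This is an illustrative brute-force sketch of the deterministic TSP (the thesis uses mathematical programming, not enumeration); it fixes city 0 as the tour start to avoid counting rotations:

```python
import itertools
import math

def tsp_exact(points):
    """Exact TSP by enumerating all tours starting at city 0.
    Only viable for small instances: (n-1)! permutations."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    best_len, best_tour = float("inf"), None
    for perm in itertools.permutations(range(1, n)):
        tour = (0,) + perm
        length = sum(dist(tour[i], tour[(i + 1) % n]) for i in range(n))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_tour, best_len
```

On the four corners of the unit square the optimal tour is the perimeter, of length 4.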
Andrieu, Laetitia. "Optimisation sous contrainte en probabilité". Phd thesis, Ecole des Ponts ParisTech, 2004. http://pastel.archives-ouvertes.fr/pastel-00001239.
Ducamp, Gaspard. "PROCOP : probabilistic rules compilation and optimisation". Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS090.
Widely adopted for more than 20 years in industrial fields, business rules offer non-IT users the opportunity to define decision-making policies in a simple and intuitive way. To facilitate their use, rule-based systems, known as business rule management systems, have been developed, separating the business logic from the computer application. While they are suitable for processing structured and complete data, they do not easily allow working with probabilistic data. PROCOP (Probabilistic Rules Compilation and Optimisation) is a thesis proposing a new approach for the integration of probabilistic reasoning in IBM Operational Decision Manager (ODM), IBM's business rule management system, in particular through the introduction of a concept of global risk on the evaluation of the execution conditions of an action, complicating the compilation phase of the system but increasing the expressiveness of the business rules. Various methods are explored, implemented and compared in order to allow the use of such a powerful reasoning capacity on a large scale, in particular to address the problems linked to the use of probabilistic graphical models in complex networks.
Ayadi, Inès. "Optimisation des politiques de maintenance préventive dans un cadre de modélisation par modèles graphiques probabilistes". Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST1072/document.
At present, equipment used in industrial settings is increasingly complex. It requires enhanced maintenance to guarantee an optimal service level in terms of reliability and availability. Moreover, this guarantee of optimality often comes at a very high cost, which is constraining. Faced with these requirements, the management of equipment maintenance has become a major challenge: finding a maintenance policy that achieves an acceptable compromise between availability and the costs associated with maintaining the system. The work of this thesis also starts from the observation that, in several industrial applications, the need for maintenance strategies ensuring both optimal safety and maximal profitability is ever more present.
Serra, Romain. "Opérations de proximité en orbite : évaluation du risque de collision et calcul de manoeuvres optimales pour l'évitement et le rendez-vous". Thesis, Toulouse, INSA, 2015. http://www.theses.fr/2015ISAT0035/document.
This thesis is about collision avoidance for a pair of spherical orbiting objects. The primary object, the operational satellite, is active in the sense that it can use its thrusters to change its trajectory, while the secondary object is a space debris that cannot be controlled in any way. Ground radars or other means make it possible to foresee a conjunction involving an operational spacecraft, leading to the production of a collision alert. The latter contains statistical data on the position and velocity of the two objects, enabling the construction of a probabilistic collision model. The work is divided in two parts: the computation of collision probabilities and the design of maneuvers to lower the collision risk. In the first part, two kinds of probabilities, which can be written as integrals of a Gaussian distribution over a Euclidean ball in 2 and 3 dimensions, are expanded in convergent power series with positive terms. This is done using the theories of Laplace transforms and definite functions. In the second part, the question of collision avoidance is formulated as a chance-constrained optimization problem. Depending on the collision model, namely short or long-term encounters, it is respectively tackled via the scenario approach or relaxed using polyhedral collision sets. For the latter, two methods are proposed. The first one directly tackles the joint chance constraints, while the second uses another relaxation, called risk selection, to obtain a mixed-integer program. Additionally, the solution to the problem of fixed-time fuel-minimizing out-of-plane proximity maneuvers is derived. This optimal control problem is solved via primer vector theory.
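The quantity expanded in power series above, the integral of a Gaussian distribution over a Euclidean disk, can also be estimated crudely by Monte Carlo. This illustrative sketch handles the 2-D isotropic case only (the thesis treats general covariances and gives analytic series, which are far more accurate):

```python
import random

def collision_probability_mc(mean, sigma, radius, n=200_000, seed=2):
    """Monte Carlo estimate of P(|X| <= radius) for a 2-D Gaussian
    X ~ N(mean, sigma^2 * I): sample positions and count how many
    fall inside the disk of the given radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(mean[0], sigma)
        y = rng.gauss(mean[1], sigma)
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n
```

For the centered unit-variance case the exact value is 1 - exp(-radius^2/2), about 0.3935 for a unit radius, which the estimate reproduces to Monte Carlo accuracy.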
Karri, Senanayak Sesh Kumar. "On the Links between Probabilistic Graphical Models and Submodular Optimisation". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEE047/document.
The entropy of a probability distribution on a set of discrete random variables is always bounded by the entropy of its factorisable counterpart. This is due to the submodularity of entropy on the set of discrete random variables. Submodular functions are also a generalisation of matroid rank functions; therefore, linear functions may be optimised exactly on the associated polytopes using a greedy algorithm. In this manuscript, we exploit these links between the structures of graphical models and submodular functions: we use greedy algorithms to optimise linear functions on the polytopes related to graphic and hypergraphic matroids for learning the structures of graphical models, while we use inference algorithms on graphs to optimise submodular functions. The first main contribution of the thesis aims at approximating a probability distribution with a tractable factorisable distribution under the maximum likelihood framework. Since the tractability of exact inference is exponential in the treewidth of the decomposable graph, our goal is to learn bounded-treewidth decomposable graphs, which is known to be NP-hard. We pose this as a combinatorial optimisation problem and provide convex relaxations based on graphic and hypergraphic matroids. This leads to an approximate solution with good empirical performance. In the second main contribution, we use the fact that the entropy of a probability distribution is always bounded by the entropy of its factorisable counterpart, mainly as a consequence of submodularity. This property of entropy is generalised to all submodular functions and bounds based on graphical models are proposed. We refer to them as graph-based bounds. An algorithm is developed to maximise submodular functions, which is NP-hard, by maximising the graph-based bound using variational inference algorithms on graphs. As a third contribution, we propose and analyse algorithms aiming at minimising submodular functions that can be written as a sum of simple functions. Our algorithms only make use of submodular function minimisation and total variation oracles of simple functions.
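On the graphic matroid, the greedy optimisation of a linear function mentioned above is exactly Kruskal-style edge selection: scan edges by decreasing weight and keep an edge whenever it preserves independence (no cycle). An illustrative sketch with a union-find independence test:

```python
def greedy_max_weight_forest(n, edges):
    """Greedy algorithm on the graphic matroid of an n-vertex graph:
    edges are (weight, u, v) triples; keep an edge iff it does not
    close a cycle, which is the matroid independence test.  The
    result maximises total weight over all forests."""
    parent = list(range(n))

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]   # path halving
            u = parent[u]
        return u

    chosen = []
    for w, u, v in sorted(edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru != rv:                        # independent: no cycle
            parent[ru] = rv
            chosen.append((w, u, v))
    return chosen
```

On a triangle with edge weights 3, 2, 1, the greedy keeps the two heaviest edges, for a maximum forest weight of 5; matroid theory guarantees this greedy choice is optimal.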
Murat, Cécile. "Les problèmes d'optimisation combinatoire probabilistes dans les graphes". Paris 9, 1997. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1997PA090054.
Besson, Rémi. "Decision making strategy for antenatal echographic screening of foetal abnormalities using statistical learning". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLX037/document.
In this thesis, we propose a method to build a decision-support tool for the diagnosis of rare diseases. We aim to minimize the number of medical tests necessary to achieve a state where the uncertainty regarding the patient's disease is less than a predetermined threshold. In doing so, we take into account the need, in many medical applications, to avoid any misdiagnosis as much as possible. To solve this optimization task, we investigate several reinforcement learning algorithms and make them operable in our high-dimensional setting. To do this, we break down the initial problem into several sub-problems and show that it is possible to take advantage of the intersections between these sub-tasks to accelerate the learning phase. The strategies learned are much more effective than classic greedy strategies. We also present a way to combine expert knowledge, expressed as conditional probabilities, with clinical data. This is crucial because the scarcity of data in the field of rare diseases prevents any approach based solely on clinical data. We show, both empirically and theoretically, that our proposed estimator is always more efficient than the best of the two models (expert or data) up to a constant. Finally, we show that it is possible to effectively integrate reasoning taking into account the level of granularity of the reported symptoms while remaining within the probabilistic framework developed throughout this work.
Bellalouna, Monia. "Problèmes d'optimisation combinatoires probabilistes". Phd thesis, Ecole Nationale des Ponts et Chaussées, 1993. http://pastel.archives-ouvertes.fr/pastel-00568759.
Faimun, Faimun. "Probability based optimisation of non-linear multi-layered orthotropic plates". Thesis, University of Newcastle Upon Tyne, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.545776.
Laamiri, Hassan. "Optimisation methods in structural systems reliability". Thesis, Imperial College London, 1991. http://hdl.handle.net/10044/1/46878.
Schliebs, Stefan. "Heterogeneous probabilistic models for optimisation and modelling of evolving spiking neural networks". AUT University, 2010. http://hdl.handle.net/10292/963.
Nelakanti, Anil Kumar. "Modélisation du langage à l'aide de pénalités structurées". Electronic Thesis or Diss., Paris 6, 2014. http://www.theses.fr/2014PA066033.
Modeling natural language is among the fundamental challenges of artificial intelligence and the design of interactive machines, with applications spanning various domains, such as dialogue systems, text generation and machine translation. We propose a discriminatively trained log-linear model to learn the distribution of words following a given context. Due to data sparsity, it is necessary to appropriately regularize the model using a penalty term. We design a penalty term that properly encodes the structure of the feature space to avoid overfitting and improve generalization while appropriately capturing long-range dependencies. Some nice properties of specific structured penalties can be used to reduce the number of parameters required to encode the model. The outcome is an efficient model that suitably captures long dependencies in language without a significant increase in time or space requirements. In a log-linear model, both training and testing become increasingly expensive with a growing number of classes. The number of classes in a language model is the size of the vocabulary, which is typically very large. A common trick is to cluster classes and apply the model in two steps: the first step picks the most probable cluster and the second picks the most probable word from the chosen cluster. This idea can be generalized to a hierarchy of larger depth with multiple levels of clustering. However, the performance of the resulting hierarchical classifier depends on the suitability of the clustering to the problem. We study different strategies to build the hierarchy of categories from their observations.
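The two-step cluster-then-word factorisation described above can be sketched with plain softmaxes. The scores and cluster assignment below are toy values; the thesis's model is a structured-penalty log-linear model, and this sketch only illustrates the factorisation P(w) = P(cluster(w)) * P(w | cluster(w)):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a dict of scores."""
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    z = sum(exps.values())
    return {k: e / z for k, e in exps.items()}

def two_step_word_probability(word, word2cluster, cluster_scores, word_scores):
    """Two-step prediction: a softmax over clusters, then a softmax
    restricted to the words of the chosen word's cluster, so each
    normalisation runs over far fewer classes than the vocabulary."""
    c = word2cluster[word]
    p_cluster = softmax(cluster_scores)[c]
    members = {w: s for w, s in word_scores.items()
               if word2cluster[w] == c}
    return p_cluster * softmax(members)[word]
```

The factorised probabilities still sum to one over the vocabulary, while each normalisation sum is only over one cluster (or over the clusters themselves) instead of the full vocabulary.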
Moussa, Kaouther. "Estimation de domaines d'attraction et contrôle basé sur l'optimisation : application à des modèles de croissance tumorale". Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALT078.
The main objective of this thesis is to propose frameworks and algorithms that are based on advanced control approaches, in order to guide cancer treatments scheduling. It also aims at pointing out the importance of taking into account the problem of stochastic uncertainties handling in the drug scheduling design, since cancer dynamical systems are considered to be highly uncertain phenomena. Cancer dynamical interactions are still an open research topic which is not fully understood yet. The complexity of such dynamics comes from their partially unknown behavior and their uncertain nature. Additionally, they are often described by nonlinear complex dynamics and require taking into consideration many constraints related to physiology as well as biology. In terms of control design, this topic gathers many complexity ingredients such as nonlinear dynamics, constraints handling and optimality issues. Therefore, in this thesis, we propose to use a recent optimal control approach that is based on moment optimization. This framework has the advantage of considering all the state and input variables as probability densities, allowing therefore to explicitly consider parametric as well as initial state uncertainties in the optimal control problem. We use this framework in Part II, in order to design robust optimal control schedules that represent cancer drugs injection profiles. The second problem that we address in Part III consists in the estimation of regions of attraction for cancer interactions models. This problem is interesting in the context of cancer treatment design, since it provides the set of all possible initial conditions (tumor and patient health indicators), that can be driven to a desired targeted safe region, where the patient is considered to be healed. Furthermore, we focus on the assessment of methodologies that take into consideration the parametric uncertainties that can affect the dynamical model.
Klopfenstein, Olivier. "Optimisation robuste des réseaux de télécommunications". Phd thesis, Université de Technologie de Compiègne, 2008. http://tel.archives-ouvertes.fr/tel-00321868.
To solve such combinatorial problems under probabilistic constraints, we first rely on robust optimization. The theoretical links between these two families of methods are highlighted. Based on appropriate robust models, heuristic solution algorithms are defined. We then address the optimal solution of combinatorial problems under probabilistic constraints. Numerical tests illustrate the presented methods and demonstrate their practical efficiency. Finally, two applications to the telecommunications domain are developed. Both concern the location of functions in a network.
Nierhoff, Till. "The k-center-problem and r-independent sets : a study in probabilistic analysis /". Aachen : Shaker, 1999. http://catalogue.bnf.fr/ark:/12148/cb37323988q.
Ghribi, Dhafer. "Optimisation des corrections de forme dans les engrenages droits et hélicoïdaux : Approches déterministes et probabilistes". Phd thesis, INSA de Lyon, 2013. http://tel.archives-ouvertes.fr/tel-00873973.
Favier, Aurélie. "Décompositions fonctionnelles et structurelles dans les modèles graphiques probabilistes appliquées à la reconstruction d'haplotypes". Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1527/.
This thesis is based on two topics: decomposition in graphical models, which include, among others, Bayesian networks and cost function networks (WCSP), and haplotype reconstruction in pedigrees. We apply WCSP techniques to treat Bayesian networks. We exploit structural and functional properties, in both exact and approximate methods. In particular, we define a decomposition of functions that produces functions with fewer variables. An application example in optimization is haplotype reconstruction. It is essential for better prediction of disease severity or for understanding particular physical traits. Haplotype reconstruction is represented with a Bayesian network. The functional decomposition allows this Bayesian network to be reduced to a WCSP optimization problem (Max-2SAT)
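A cost function network (WCSP) of the kind mentioned above can be stated very compactly: variables, local cost functions on small scopes, and a global minimum of their sum. The brute-force solver below is only a didactic sketch (real WCSP solvers use branch-and-bound with local consistencies), and the instance is invented:

```python
from itertools import product

def solve_wcsp(n_vars, cost_functions):
    """Brute-force a weighted CSP: minimise the sum of local cost functions
    over all 0/1 assignments. Each cost function is (scope, table) where
    table maps a tuple of values for the scope to a non-negative cost."""
    best, best_assign = float("inf"), None
    for assign in product((0, 1), repeat=n_vars):
        cost = sum(table[tuple(assign[i] for i in scope)]
                   for scope, table in cost_functions)
        if cost < best:
            best, best_assign = cost, assign
    return best, best_assign

# Two binary cost functions on variables (0,1) and (1,2): soft "not equal".
neq = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
cost, assign = solve_wcsp(3, [((0, 1), neq), ((1, 2), neq)])
print(cost, assign)
```

With binary variables and pairwise tables like this one, each cost function is exactly a weighted 2-clause, which is why the reduction to Max-2SAT mentioned in the abstract is natural.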
Rawlik, Konrad Cyrus. "On probabilistic inference approaches to stochastic optimal control". Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/8293.
Ngah, Mohammad Fahmi. "Statistical process optimisation and probabilistic performance assessment of resin infused carbon-epoxy composite laminates". Thesis, Imperial College London, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.418251.
Hobbs, Dale. "Probabilistic estimation of hosting capacity and operating reserve including optimisation of PV installation capacity". Honours thesis, Murdoch University, 2019. https://researchrepository.murdoch.edu.au/id/eprint/52466/.
Marceau, Caron Gaetan. "Optimization and uncertainty handling in air traffic management". Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112183/document.
In this thesis, we investigate the issue of matching the aircraft operators' demand with the airspace capacity by taking uncertainty into account in air traffic management. In the first part of the work, we identify the main causes of uncertainty in trajectory prediction (TP), the core component underlying automation in ATM systems. We study the problem of online parameter tuning of the TP during the climb phase with the optimization algorithm CMA-ES. The main conclusion, corroborated by other works in the literature, is that ground TP is not currently accurate enough to support fully automated safety-critical applications. Hence, with the current data-sharing limitations, any centralized optimization system in air traffic control should consider the human-in-the-loop factor, as well as other uncertainties. Consequently, in the second part of the thesis, we develop models and algorithms from a global network perspective and describe a generic uncertainty model that captures flight trajectory uncertainties and infers their impact on the occupancy count of the air traffic control sectors. This standard indicator coarsely quantifies, in terms of number of flights, the complexity managed by air traffic controllers. In the third part of the thesis, we formulate a variant of the Air Traffic Flow and Capacity Management problem in the tactical phase, bridging the gap between the network manager and air traffic controllers. The optimization problem consists in jointly minimizing the cost of delays and the cost of congestion while meeting sequencing constraints. In order to cope with the high dimensionality of the problem, evolutionary multi-objective optimization algorithms are used with an indirect representation and some greedy schedulers to optimize flight plans. An additional uncertainty model is added on top of the network model, allowing us to study the performance and robustness of the proposed optimization algorithm when facing a noisy context.
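The online parameter-tuning loop described above can be caricatured with a much simpler stochastic search. The snippet below fits two parameters of a toy linear climb model to noisy observations with a (1+1) evolution strategy using a 1/5th-success step-size rule, standing in for CMA-ES; the model, rates and noise level are all invented:

```python
import random

def climb_error(params, observations):
    """Mean squared error of a toy linear climb model h(t) = rate*t + bias
    against observed (time, altitude) pairs."""
    rate, bias = params
    return sum((rate * t + bias - h) ** 2 for t, h in observations) / len(observations)

def one_plus_one_es(objective, x0, sigma=0.5, iters=500, seed=1):
    """(1+1) evolution strategy with a 1/5th-success step-size rule,
    a minimal stand-in for CMA-ES."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    for _ in range(iters):
        cand = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fc = objective(cand)
        if fc <= fx:
            x, fx = cand, fc
            sigma *= 1.22      # grow the step size on success
        else:
            sigma *= 0.95      # shrink it on failure
    return x, fx

# Noisy synthetic observations from rate=12.0, bias=300 (illustrative units).
data_rng = random.Random(0)
obs = [(t, 12.0 * t + 300.0 + data_rng.gauss(0.0, 5.0)) for t in range(0, 60, 5)]
params, err = one_plus_one_es(lambda p: climb_error(p, obs), [1.0, 0.0])
print(params, err)
```

CMA-ES additionally adapts a full covariance matrix of the sampling distribution, which is what makes it suitable for the correlated TP parameters studied in the thesis.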
We validate our approach on real-world and artificially densified instances obtained from the Central Flow Management Unit in Europe
Klopfenstein, Olivier. "Optimisation robuste des réseaux de télécommunication". Compiègne, 2008. http://www.theses.fr/2008COMP1740.
This thesis deals with taking uncertain data into account in optimization problems. Our focus is on mathematical programs with chance constraints. In this approach, the goal is to find the best solution given an infeasibility probability tolerance. Furthermore, we concentrate on integer variables, since they are often required in practical applications. To solve such chance-constrained combinatorial problems, we first rely on robust optimization. The theoretical links between these two families of approaches are studied. Based on appropriate robust models, solution algorithms for chance-constrained combinatorial problems are designed. The optimal solution of such problems is then also investigated, with specific theoretical results and algorithms given to reach optimality. Finally, two telecommunication applications are developed. Both of them deal with location in a network
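A minimal way to see the chance-constrained combinatorial setting is a toy knapsack with uncertain item weights: enumerate the 0/1 solutions and keep only those whose estimated violation probability stays below the tolerance. The data and the Monte Carlo feasibility check below are illustrative; the thesis works with robust counterparts and exact algorithms rather than enumeration:

```python
import random
from itertools import product

def chance_feasible(x, capacity, weight_dists, eps, n=2000, seed=0):
    """Monte Carlo check of the chance constraint P(total weight > capacity) <= eps."""
    rng = random.Random(seed)
    violations = 0
    for _ in range(n):
        total = sum(xi * rng.uniform(lo, hi)
                    for xi, (lo, hi) in zip(x, weight_dists))
        if total > capacity:
            violations += 1
    return violations / n <= eps

values = [10, 7, 5]
weight_dists = [(3, 5), (2, 6), (1, 3)]  # uncertain item weights (uniform bounds)
capacity, eps = 8, 0.05

best_val, best_x = -1, None
for x in product((0, 1), repeat=3):
    if chance_feasible(x, capacity, weight_dists, eps):
        v = sum(vi * xi for vi, xi in zip(values, x))
        if v > best_val:
            best_val, best_x = v, x
print(best_x, best_val)
```

Here the most valuable subset (all three items) is rejected because it overflows the capacity too often, and the chance-feasible optimum keeps items 1 and 3 only.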
Dantan, Jérôme. "Une approche systémique unifiée pour l’optimisation durable des systèmes socio-environnementaux : ingénierie des systèmes de décision en univers incertain". Thesis, Paris, CNAM, 2016. http://www.theses.fr/2016CNAM1045/document.
Nowadays, the sustainability of human activities is a major worldwide concern. The challenge is to evaluate such activities not only in terms of efficiency and productivity, but also in terms of their economic, social, environmental, etc. durability. For this, the experts of these areas need to work collaboratively. In this context, human societies face several major challenges, such as: (1) processing a large amount of information whose volume increases exponentially (“big data”), (2) coping with a real world that is both dynamic and imperfect, (3) predicting and assessing future states of their activities. The research conducted in this thesis contributes in particular to the domain of decision systems engineering under uncertainty. We have chosen the field of socio-environmental systems as our subject of study, particularly the multidisciplinary field of agriculture. We propose a systemic approach for the sustainable optimization of socio-environmental systems: (1) the meta-modeling of socio-environmental systems, (2) the generic representation of the data imperfection flowing in such systems, associated with a decision model in an uncertain environment, and finally (3) the simulation and assessment of such systems in a dynamic environment for the purpose of decision making by experts, which we have illustrated by both a service-oriented architecture model and case studies applied to the agriculture domain
Alekseychuk, Oleksandr. "Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2006. http://nbn-resolving.de/urn:nbn:de:swb:14-1154450084263-67485.
In this work, a new algorithm for the detection of crack-like indications in digital radiography was developed. Classical local detection methods fail because of the low signal-to-noise ratio (of about 1) of crack indications in radiographs. The necessary robustness against noise is achieved by using global features of these indications, but this entails an infeasible computational effort as well as problems in formally describing the crack shape. Usually, excessive computational effort in comparable problems is reduced by applying heuristics. Such heuristics are found by trial and error, are strongly problem-specific and cannot guarantee the optimal solution. The distinctive feature of this work is a different approach that avoids any heuristics in the search for crack indications. A global probabilistic feature, here called the estimation function, is constructed whose maximum over all possible shapes, lengths and positions of the crack indication can be found exactly (i.e. without using any heuristics). This estimation function is defined as the sum of the a posteriori information gain concerning the presence of a crack at each point along the hypothetical crack indication. The information gain arises from testing the hypothesis of crack presence against the available image information. An estimation function defined in this way is theoretically justified and has the desired properties under varying indication intensity. The algorithm was implemented in the C++ programming language. Its detection properties were investigated with both simulated and real images. The algorithm delivers good results (a high detection rate at a given false-alarm rate) that are comparable to those of trained human evaluators
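The idea of exactly maximising a summed per-point information gain over candidate crack paths can be sketched with a Viterbi-style dynamic program over image columns. The search space used here (monotone left-to-right paths) and the gain values are simplifications; the thesis's estimation function and admissible crack shapes are more general:

```python
def best_crack_path(gain):
    """Find the connected left-to-right path maximising the summed per-pixel
    gain, exactly, by dynamic programming over image columns."""
    rows, cols = len(gain), len(gain[0])
    score = [gain[r][0] for r in range(rows)]
    back = [[0] * cols for _ in range(rows)]
    for c in range(1, cols):
        new = [0.0] * rows
        for r in range(rows):
            # the path may come from row r-1, r or r+1 in the previous column
            prev = max(range(max(0, r - 1), min(rows, r + 2)),
                       key=lambda q: score[q])
            new[r] = score[prev] + gain[r][c]
            back[r][c] = prev
        score = new
    r = max(range(rows), key=lambda q: score[q])
    best, path = score[r], []
    for c in range(cols - 1, -1, -1):
        path.append((r, c))
        r = back[r][c]
    return best, path[::-1]

# Per-pixel information gain (positive where a crack is likely; illustrative).
gain = [
    [-1, -1, -1, -1],
    [ 2,  2, -1, -1],
    [-1, -1,  2,  2],
    [-1, -1, -1, -1],
]
best, path = best_crack_path(gain)
print(best, path)
```

Because every monotone path is scored, the returned maximum is exact over that search space, which mirrors the abstract's point that no heuristic pruning is needed.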
Alekseychuk, Oleksandr. "Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function". Doctoral thesis, Technische Universität Dresden, 2005. https://tud.qucosa.de/id/qucosa%3A24919.
Birman, Jessie. "Quantification et propagation d'incertitude dans les phases amont de projets de conception d'avions : de l'optimisation déterministe à l'optimisation sous contraintes probabilistes". Toulouse 3, 2013. http://thesesups.ups-tlse.fr/2269/.
Conceptual aircraft sizing is the first step in the development of a passenger transport aircraft. Classically, in this phase, many aircraft configurations are compared after being globally sized by solving a deterministic, multidisciplinary, constrained optimisation problem. The purpose is to determine the main characteristics of the airplane according to a set of Top Level Requirements. At this preliminary stage, designers have to deal with limited knowledge and high uncertainty when solving the problem. Managing that uncertainty is a major issue: assessing its impact on the design at an early stage saves time and cost. This PhD thesis introduces a new methodology to solve aircraft design optimisation problems affected by uncertainty. First, the main source of uncertainty involved at this stage is identified as predictive model uncertainty, which is part of epistemic uncertainty. This uncertainty is quantified within a probabilistic framework. For that purpose, building on the Beta distribution, we create a new generic distribution able to assume a wide range of shapes, called the Beta-Mystique distribution. Second, we carry out uncertainty propagation studies with Monte Carlo and moment propagation methods, in order to analyse the robustness of an aircraft configuration with respect to a set of uncertainties. Finally, a chance-constrained optimisation problem is solved to produce a robust aircraft configuration. Two strategies are considered: the use of surrogate models to approximate the probabilities, and the resolution of the optimisation problem using the moment propagation method
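The uncertainty-propagation step can be sketched as follows: relative predictive-model errors drawn from a scaled Beta distribution are pushed through a toy sizing relation, and the probability of meeting a mass constraint is estimated by Monte Carlo. The sizing relation, the Beta(2, 5) shape and all numbers are assumptions, not the thesis's Beta-Mystique distribution or its aircraft model:

```python
import random

def required_mass(mtow_nominal, drag_err, sfc_err):
    """Toy sizing relation: relative predictive-model errors on drag and
    specific fuel consumption inflate the required takeoff mass."""
    return mtow_nominal * (1.0 + 0.6 * drag_err + 0.4 * sfc_err)

def constraint_satisfaction_prob(mtow_limit, n=5000, seed=0):
    """Monte Carlo propagation of Beta-distributed relative errors;
    returns an estimate of P(required mass <= mtow_limit)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        # Beta(2, 5) rescaled to the interval [-2.5%, +7.5%] (assumed shape)
        drag_err = -0.025 + 0.10 * rng.betavariate(2, 5)
        sfc_err = -0.025 + 0.10 * rng.betavariate(2, 5)
        if required_mass(75000.0, drag_err, sfc_err) <= mtow_limit:
            ok += 1
    return ok / n

p = constraint_satisfaction_prob(mtow_limit=76000.0)
print(p)
```

A chance constraint of the kind the abstract mentions would then require this probability to exceed a prescribed level (say 0.95), turning the estimate into a feasibility test inside the optimisation loop.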
Ben, Dbabis Makram. "Modèles et méthodes actuarielles pour l'évaluation quantitative des risques en environnement solvabilité II". Phd thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00833856.