Academic literature on the topic 'Optimisation probabiliste'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Optimisation probabiliste.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Optimisation probabiliste":

1. Martinet, P., L. Lanfranco, D. Tandé, L. Picard, P. Danneels, S. Jamard, B. Gaborit, C. Danthu, C. Loheac, and S. Rézig. "Pyélonéphrite aiguë du greffon : vers une optimisation de l'antibiothérapie probabiliste." Médecine et Maladies Infectieuses Formation 1, no. 2 (June 2022): S9. http://dx.doi.org/10.1016/j.mmifmc.2022.03.044.
2. Pitner, P., H. Procaccia, T. Riffard, B. Granger, and B. Flesch. "Optimisation du contrôle et de la maintenance des faisceaux tubulaires des générateurs de vapeur grâce à l'analyse probabiliste." Revue Générale Nucléaire, no. 3 (May 1993): 187–94. http://dx.doi.org/10.1051/rgn/19933187.
3. Carrié, Cédric, Noémie Sauvage, and Matthieu Biais. "Optimisation du traitement par β-Lactamines chez le patient de réanimation en hyperclairance rénale." Médecine Intensive Réanimation 30, no. 2 (May 18, 2021): 157–64. http://dx.doi.org/10.37051/mir-00059.
Abstract:
Optimising β-lactam therapy remains a complex challenge in critically ill patients, given the wide variability of antibiotic concentrations related to major pharmacokinetic/pharmacodynamic (PK/PD) interactions. On the one hand, it is commonly accepted that patients with renal impairment warrant reduced doses to limit the risk of toxicity. On the other hand, some patients may also present augmented renal clearance (ARC), now recognised as one of the main risk factors for underdosing and treatment failure of renally eliminated anti-infective agents. Augmented clearance is frequent in surgical intensive care and probably underdiagnosed in the absence of measured urinary creatinine clearance (CLCR). For some empirically prescribed β-lactams, several pharmacokinetic studies suggest increasing the recommended doses in order to reach PK/PD targets in patients with ARC. When plasma concentrations cannot be monitored promptly, adjusting β-lactam doses according to daily CLCR monitoring is a safe and effective strategy to improve therapeutic success rates. Since ARC is a fluctuating phenomenon, this strategy requires daily monitoring of CLCR to adapt doses and limit the risk of overdosing.
4. Pasalodos-Tato, María, Timo Pukkala, and Alberto Rojo Alboreca. "Optimal management of Pinus pinaster in Galicia (Spain) under risk of fire." International Journal of Wildland Fire 19, no. 7 (2010): 937. http://dx.doi.org/10.1071/wf08150.
Abstract:
Pinus pinaster is the most important conifer in Galicia in terms of volume and production, and occurs mainly in plantations. Forest fires are the main threat to forest plantations, affecting optimal stand management. The aim of this study was to develop management prescriptions for P. pinaster based on growth and yield models and optimisations. The objective function was soil expectation value, calculated taking the expected fire losses into account. Fire risk was assumed to consist of two components, probability of occurrence and damage. As the main cause of forest fires in Galicia is arson, the manager cannot significantly influence fire occurrence, which was assumed to be exogenous. Salvage was treated as an endogenous factor depending on the management schedule followed in the stand. Optimisations were done for different initial stands, timber assortments, discount rates and probabilities of fire occurrence. Based on the optimisation results, regression models were developed for the optimal rotation length as well as the timing and intensity of thinnings. The results show that when fire risk is partly endogenous, optimal rotation lengths become shorter with increasing probability of fire occurrence, and optimal thinning becomes heavier and earlier. However, without a price reduction for burned timber, the optimal rotation length increases with increasing probability of fire.
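The rotation-shortening effect described in this abstract can be illustrated with a deliberately simplified Faustmann-type calculation. Following Reed's classic result (total loss, no salvage), a Poisson fire-arrival rate lam acts approximately as an addition to the discount rate; the growth curve, price, and rates below are invented for illustration and are not taken from the cited study:

```python
import math

def soil_expectation_value(T, r, lam, price=50.0):
    """Faustmann-type soil expectation value for rotation age T, discount rate r
    and Poisson fire-arrival rate lam. With total loss and no salvage, the fire
    hazard acts approximately as an addition to the discount rate (Reed, 1984).
    The logistic stand-volume curve and the price are invented for illustration."""
    volume = 600.0 / (1.0 + math.exp(-0.12 * (T - 35.0)))  # m3/ha, illustrative
    rho = r + lam
    return price * volume * math.exp(-rho * T) / (1.0 - math.exp(-rho * T))

def optimal_rotation(r, lam):
    # Grid search over rotation ages, starting well above age 0, where the
    # illustrative growth curve's nonzero intercept would distort the value.
    ages = [5.0 + 0.5 * k for k in range(190)]
    return max(ages, key=lambda T: soil_expectation_value(T, r, lam))

t_no_fire = optimal_rotation(r=0.03, lam=0.0)
t_fire = optimal_rotation(r=0.03, lam=0.02)
assert t_fire <= t_no_fire  # a higher fire hazard shortens the optimal rotation
print(t_no_fire, t_fire)
```

With an interior optimum, a larger effective discount rate both lowers the soil expectation value and moves the optimal harvest age earlier, which is the qualitative behaviour the abstract reports for exogenous fire risk.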
5. Tayal, Shilpy. "Analysis of Information Geometry for Optimization and Inference Applications." Mathematical Statistician and Engineering Applications 70, no. 1 (January 31, 2021): 621–27. http://dx.doi.org/10.17762/msea.v70i1.2516.
Abstract:
A mathematical framework called "information geometry" investigates the geometrical attributes of probability distributions and statistical models. In a variety of domains, such as machine learning, optimisation, and inference, it offers a potent toolkit for analysing and optimising complicated systems. This work examines information geometry and its uses in optimisation and inference. First, we give a general overview of information geometry's foundational ideas and principles, including the Fisher information metric, divergence measures, and exponential families. We discuss how to quantify the geometric links between probability distributions and derive practical geometric structures using these ideas. Next, we investigate the use of information geometry in optimisation problems. We show how the Fisher information metric, by virtue of its geometric characteristics, can direct effective search strategies and convergence analysis in optimisation algorithms. We discuss the benefits of applying information geometry to a variety of optimisation tasks, including parameter estimation, model choice, and neural network training. We also look into how information geometry affects statistical inference, emphasising how the geometric structures of exponential families enable the development of effective and reliable inference algorithms, and discussing the use of divergence measures to quantify differences between distributions, which eases tasks such as model comparison and hypothesis testing. We further review current developments in information geometry, especially its application to probabilistic programming and deep learning, and how it can improve deep neural networks' capacity for generalisation, interpretation, and uncertainty estimation.
Information geometry provides useful insights and methods for resolving challenging issues in a variety of fields by taking advantage of the geometric aspects of probability distributions.
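As a minimal illustration of one concept this abstract mentions, the sketch below computes the Fisher information of the Bernoulli family two ways: from the closed form I(p) = 1/(p(1-p)) and as the variance of the score. The example is generic and not drawn from the cited paper:

```python
import math

def fisher_information_bernoulli(p):
    """Analytic Fisher information of the Bernoulli family: I(p) = 1 / (p (1 - p))."""
    return 1.0 / (p * (1.0 - p))

def fisher_information_from_score(p):
    """Fisher information as the variance of the score d/dp log f(x; p),
    computed exactly over the two outcomes x in {0, 1}."""
    score1 = 1.0 / p             # score at x = 1, which has probability p
    score0 = -1.0 / (1.0 - p)    # score at x = 0, which has probability 1 - p
    mean = p * score1 + (1.0 - p) * score0   # the score has mean zero
    return p * (score1 - mean) ** 2 + (1.0 - p) * (score0 - mean) ** 2

p = 0.3
analytic = fisher_information_bernoulli(p)
from_score = fisher_information_from_score(p)
assert math.isclose(analytic, from_score, rel_tol=1e-12)
print(analytic)  # 1 / (0.3 * 0.7) ≈ 4.7619
```

The agreement of the two routes is exactly the identity that makes the Fisher information a natural metric on the parameter space.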
6. Lypchuk, Vasyl, and Vasyl Dmytriv. "Management of technological process optimisation." Engineering Management in Production and Services 12, no. 3 (September 1, 2020): 103–15. http://dx.doi.org/10.2478/emj-2020-0022.
Abstract:
The research aims to characterise the optimisation of a technological process depending on the main time parameters of production. The optimisation does not require correcting the technical parameters of a system, but rather the organisational and managerial factors of the technological process. The workload is taken as an evaluation criterion, which factors in the probability distribution of the time characteristics of process operations. Time characteristics that represent the performance of an operation influence the workloads of an operator and equipment, determining the productivity of the technological process. Analytical models were developed for the operational control of production-line efficiency, considering the probabilistic-statistical parameters pertaining to the performance of operations and the peculiarities of the technological equipment. The article presents research results that characterise the dependence of production-line efficiency on the type of equipment and on the duration of preparatory and final operations, considering their probability. Under an optimal workload of the operator, the duration of the complete program changes linearly, regardless of the time required for the performance of operations by a computer without the involvement of the operator, and depending on the type of equipment. A managerial decision can be optimal under the condition that the factor of technological process efficiency (K_TP) tends to its maximum. The developed method of analytical determination can be used to calculate the workload of both an operator and technological equipment. The calculations of the duration of production-line operation resulted in a methodology for the consideration of probability characteristics pertaining to the time distribution of the period required to perform operations, which influences the unequal efficiency of the production line.
The probabilistic character of the time distribution of intervals between performed operations serves as a parameter in the management of technological process optimisation, which can be achieved using simulators of technological processes optimised in terms of their efficiency.
7. Shariatmadar, Keivan, and Mark Versteyhe. "Numerical Linear Programming under Non-Probabilistic Uncertainty Models — Interval and Fuzzy Sets." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 28, no. 03 (May 21, 2020): 469–95. http://dx.doi.org/10.1142/s0218488520500191.
Abstract:
This paper considers a linear optimisation problem under uncertainty with at least one element modelled as a non-probabilistic uncertainty. The uncertainty is expressed in the coefficient matrices of constraints and/or coefficients of goal function. Previous work converts such problems to classical (linear) optimisation problems and eliminates uncertainty by converting the linear programming under uncertainty problem to a decision problem using imprecise probability and imprecise decision theory. Our aim here is to generalise this approach numerically and present three methods to calculate the solution. We investigate what numerical results can be obtained for interval and fuzzy types of uncertainty models and compare them to classical probabilistic cases — for two different optimality criteria: maximinity and maximality. We also provide an efficient method to calculate the maximal solutions in the fuzzy set model. A numerical example is considered for illustration of the results.
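The maximinity criterion discussed above can be illustrated on a deliberately tiny linear program with one interval-valued constraint coefficient. The numbers and the closed-form solution below are illustrative assumptions, not the paper's method:

```python
def maximin_interval_lp(b, a_lo, a_hi):
    """maximize x  subject to  a*x <= b, x >= 0, where a is only known to lie
    in the interval [a_lo, a_hi]. Under the maximinity criterion we optimise
    against the worst realisation of a, which for this constraint is the upper
    endpoint a_hi; the optimistic (best-case) solution uses a_lo instead."""
    assert 0 < a_lo <= a_hi
    worst_case_x = b / a_hi   # feasible for every a in the interval
    best_case_x = b / a_lo    # feasible only if a turns out to equal a_lo
    return worst_case_x, best_case_x

robust, optimistic = maximin_interval_lp(b=12.0, a_lo=2.0, a_hi=3.0)
print(robust, optimistic)  # 4.0 6.0
```

The gap between the two solutions is the price of robustness: the maximin answer is guaranteed feasible for every coefficient in the interval, at the cost of a smaller objective value.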
8. Van Nguyen, N., J. W. Lee, Y. D. Lee, and H. U. Park. "A multidisciplinary robust optimisation framework for UAV conceptual design." Aeronautical Journal 118, no. 1200 (February 2014): 123–42. http://dx.doi.org/10.1017/s0001924000009027.
Abstract:
This paper describes a multidisciplinary robust optimisation framework for UAV conceptual design. An in-house configuration designer system is implemented to generate full sets of configuration data for a well-developed advanced UAV analysis tool. A fully integrated configuration designer along with the UAV analysis tool ensures that full sets of configuration data are provided simultaneously as the UAV configuration changes during optimisation. The computational strategy for probabilistic analysis is proposed by implementing a central difference method and fitting a distribution to a reduced number of Monte Carlo simulation sampling points. The minimisation of a new robust design objective function helps to enhance reliability while other UAV performance criteria are satisfied. In addition, the fully integrated process and the probabilistic analysis strategy demonstrate a reduction in the probability of failure under noise factors without any noticeable increase in design turnaround time. The proposed robust optimisation framework for the UAV conceptual design case study yields a more trustworthy prediction of the optimal configuration and is preferable to the traditional deterministic design approach. A high-fidelity analysis in ANSYS Fluent 13 is performed to demonstrate the accuracy of the proposed framework on the baseline, deterministic, and RDO configurations.
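The abstract does not give the paper's robust objective function, so the sketch below uses a generic mean-plus-k-standard-deviations objective estimated by Monte Carlo. The performance function and every number in it are invented purely to show why a robust optimum can differ from a deterministic one:

```python
import random
import statistics

def robust_objective(perf, design, noise_sd, k=2.0, samples=2000, seed=42):
    """Generic robust-design objective: mean performance plus k standard
    deviations, estimated by Monte Carlo under Gaussian noise on the design
    variables (smaller is better for a minimisation problem)."""
    rng = random.Random(seed)
    vals = [perf([x + rng.gauss(0.0, noise_sd) for x in design])
            for _ in range(samples)]
    return statistics.mean(vals) + k * statistics.pstdev(vals)

# An invented landscape: a sharp optimum at x = 0 and a slightly worse
# but much flatter region around x = 3.
perf = lambda v: min(25.0 * v[0] ** 2, 0.2 + 0.5 * (v[0] - 3.0) ** 2)

assert perf([0.0]) < perf([3.0])                  # deterministic choice: x = 0
sharp = robust_objective(perf, [0.0], noise_sd=0.3)
flat = robust_objective(perf, [3.0], noise_sd=0.3)
assert flat < sharp                               # robust choice flips to x = 3
```

Penalising variability makes the flat region preferable once noise is taken into account, which is the qualitative point of robust design under noise factors.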
9. Holický, Milan. "Fuzzy probabilistic optimisation of building performance." Automation in Construction 8, no. 4 (April 1999): 437–43. http://dx.doi.org/10.1016/s0926-5805(98)00090-9.
10. al-Rifaie, Mohammad Majid, and Tim Blackwell. "Cognitive Bare Bones Particle Swarm Optimisation with Jumps." International Journal of Swarm Intelligence Research 7, no. 1 (January 2016): 1–31. http://dx.doi.org/10.4018/ijsir.2016010101.
Abstract:
The 'bare bones' (BB) formulation of particle swarm optimisation (PSO) was originally advanced as a model of PSO dynamics. The idea was to model the forces between particles with sampling from a probability distribution, in the hope of understanding swarm behaviour with a conceptually simpler particle update rule. 'Bare bones with jumps' (BBJ) proposes three significant extensions to the BB algorithm: (i) two social neighbourhoods, (ii) a tuneable parameter that can advantageously bring the swarm to the 'edge of collapse' and (iii) a component-by-component probabilistic jump to anywhere in the search space. The purpose of this paper is to investigate the role of jumping within a specific BBJ algorithm, cognitive BBJ (cBBJ). After confirming the effectiveness of cBBJ, this paper finds that jumping in one component only is optimal over the 30-dimensional benchmarks of this study; that a small per-particle jump probability of 1/30 works well for these benchmarks; and that jumps are chiefly beneficial during the early stages of optimisation. Finally, this work supplies evidence that jumping provides escape from regions surrounding sub-optimal minima.
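A minimal sketch in the spirit of the bare-bones update with jumps described above. This is not the authors' exact cBBJ: the search range, swarm size, improvement-only update, and per-component jump handling are simplifying assumptions made for illustration:

```python
import random

def bb_jumps_sketch(f, dim=2, particles=10, iters=200, jump_prob=1 / 30, seed=1):
    """Minimise f with a bare-bones PSO sketch: each coordinate is resampled
    from a Gaussian centred midway between the particle's personal best and the
    global best, with std equal to their separation; with small probability a
    coordinate instead 'jumps' uniformly anywhere in the search range."""
    rng = random.Random(seed)
    lo, hi = -5.0, 5.0
    pbest = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    pval = [f(p) for p in pbest]
    g = min(range(particles), key=lambda i: pval[i])   # index of global best
    for _ in range(iters):
        for i in range(particles):
            x = []
            for d in range(dim):
                if rng.random() < jump_prob:
                    x.append(rng.uniform(lo, hi))      # probabilistic jump
                else:
                    mu = 0.5 * (pbest[i][d] + pbest[g][d])
                    sigma = abs(pbest[i][d] - pbest[g][d]) or 1e-12
                    x.append(rng.gauss(mu, sigma))
            v = f(x)
            if v < pval[i]:                            # improvement-only update
                pbest[i], pval[i] = x, v
                if v < pval[g]:
                    g = i
    return pbest[g], pval[g]

sphere = lambda x: sum(c * c for c in x)
best, val = bb_jumps_sketch(sphere)
print(val)  # approaches the sphere optimum at 0 with this seed
```

As the personal bests collapse toward the global best, the sampling variance shrinks and the swarm converges; the rare jumps are what reintroduce the possibility of escaping a sub-optimal basin.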

Dissertations / Theses on the topic "Optimisation probabiliste":

1. Scherrer, Bruno. "Application et optimisation de l'échantillonnage probabiliste en écologie continentale." Montpellier 2, 1987. http://www.theses.fr/1987MON20115.
2. Bouillard, Anne. "Optimisation et analyse probabiliste de systèmes à évènements discrets." Lyon, École normale supérieure (sciences), 2005. http://www.theses.fr/2005ENSL0337.
Abstract:
This thesis deals with the performance analysis of discrete event systems. Three different models are considered. In the first part, we are interested in trace groups. After giving a simple Möbius-like formula for the generating series of trace groups, we show the existence and algebraicity of the asymptotic growth rate of the height of traces. The second part is devoted to timed free-choice nets, an important subclass of Petri nets. We define a notion of throughput in these nets and study its variation as a function of the conflict-resolution policy. First, we show how to compute the throughput; then we study the policies that maximise or minimise it. Finally, we give an efficient method to generate a marking according to its exact distribution in order to evaluate the throughput numerically. In the last part, we study the computation of performance guarantees in networks using Network Calculus techniques. We show that ultimately pseudo-periodic functions are stable under the operations of Network Calculus and give algorithms to compute these functions. These techniques are then applied to the study of performance guarantees in graphs with forbidden turns.
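One staple Network Calculus result in the area this abstract covers is the closed-form delay bound for token-bucket arrivals served at a rate-latency curve: with arrival curve b + rt and service curve R(t - T) (for t > T), the worst-case delay is T + b/R, provided r <= R. A one-function sketch with illustrative numbers (not taken from the thesis):

```python
def delay_bound(r, b, R, T):
    """Network-calculus delay bound: token-bucket arrivals (burst b, rate r)
    served by a rate-latency curve (rate R, latency T) give worst-case delay
    T + b / R, under the stability condition r <= R."""
    assert r <= R, "stability requires the service rate to dominate the arrival rate"
    return T + b / R

# burst of 10 units, service rate 5 units/s, latency 0.1 s:
print(delay_bound(r=2.0, b=10.0, R=5.0, T=0.1))  # worst-case delay 0.1 + 10/5 = 2.1
```

The bound is the horizontal deviation between the arrival and service curves, which is the kind of performance guarantee the thesis's algorithms compute for more general (ultimately pseudo-periodic) curves.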
3. Scherrer, Bruno. "Application et optimisation de l'échantillonnage probabiliste en écologie continentale." Grenoble 2 : ANRT, 1987. http://catalogue.bnf.fr/ark:/12148/cb37609751c.
4. Schmitt, Lucie. "Durabilité des ouvrages en béton soumis à la corrosion : optimisation par une approche probabiliste." Thesis, Toulouse, INSA, 2019. http://www.theses.fr/2019ISAT0009/document.
Abstract:
Mastering the durability of new structures and the need to extend the lifespan of existing ones are societal issues of the highest order and are part of the principles of a circular economy. The durability of concrete structures thus occupies a central position in the normative context. This thesis follows the work of J. Mai-Nhu* and aims at extending the field of application of the SDReaM-crete model by integrating concretes with mineral additions and by defining a limit-state criterion based on a quantity of corroded products. An approach based on numerical optimisation of predictive computations is set up to perform reliability analyses considering the main mechanisms related to reinforcement corrosion, carbonation and chlorides. This model enables the optimisation of concrete cover sizing and concrete performance by further integrating the environmental conditions as defined in the standards.
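The reliability-index computation this abstract mentions can be illustrated on the textbook limit state g = R - S with independent normal resistance and load effect, where Cornell's index has a closed form. The sketch below, with invented parameters unrelated to SDReaM-crete, checks a Monte Carlo estimate against it:

```python
import random
from statistics import NormalDist

def reliability_index_mc(mu_r, sd_r, mu_s, sd_s, n=100_000, seed=7):
    """Estimate the reliability index beta = -Phi^{-1}(Pf) by Monte Carlo for
    the limit state g = R - S with independent normal R (resistance) and S
    (load effect); failure is the event g < 0."""
    rng = random.Random(seed)
    failures = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0
                   for _ in range(n))
    pf = failures / n
    return -NormalDist().inv_cdf(pf)

def reliability_index_exact(mu_r, sd_r, mu_s, sd_s):
    """For normal R and S the index is available in closed form (Cornell)."""
    return (mu_r - mu_s) / (sd_r ** 2 + sd_s ** 2) ** 0.5

beta_mc = reliability_index_mc(mu_r=40.0, sd_r=5.0, mu_s=25.0, sd_s=4.0)
beta_exact = reliability_index_exact(40.0, 5.0, 25.0, 4.0)
print(beta_exact)  # 15 / sqrt(41) ≈ 2.34
```

Real durability models replace the normal R and S with mechanistic carbonation and chloride-ingress models, but the index being computed is the same quantity.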
5. Bérard, Jean. "Contributions à l'étude probabiliste des algorithmes d'évolution." Lyon 1, 2001. http://www.theses.fr/2001LYO10223.
Abstract:
Our main objective is to study precisely the behaviour of simplified evolutionary algorithms in order to reach, on specific examples, a detailed understanding of the effects of the mutation and selection steps and of population size. Because of their simplicity, the models studied are of limited interest from an optimisation standpoint, but they have been used in biology. Their mathematical study presents serious difficulties, and specific techniques must be developed. The first two models studied belong to the category of fitness-space models. The third model addresses the case where the fitness landscape contains random irregularities. We also present experimental results on the use of fitness-sharing methods in the context of particle filtering.
6. Belkora, Samir. "Les méthodes d'optimisation multiobjectif : synthèse et considérations théoriques : application d'un modèle probabiliste de choix au problème d'optimisation multiobjectif." Aix-Marseille 2, 1986. http://www.theses.fr/1986AIX24012.
Abstract:
This work deals with multiple-objective optimisation methods. In a first large part, we carry out a synthesis of multiple-objective methods following a traditional classification that divides the methods into three major classes: methods requiring an 'a priori' weighting of the objectives; methods requiring a 'progressive' weighting of the objectives, or interactive methods; and methods leading to an 'a posteriori' weighting of the objectives. In a second part, we attempt to make a theoretical contribution to improving a method belonging to the class of interactive methods by integrating a probabilistic qualitative-choice model that allows fairly flexible specifications, the conditional probit model.
7. Souissi, Salma. "Problème du Bin Packing probabiliste à une dimension." Versailles-St Quentin en Yvelines, 2006. http://www.theses.fr/2006VERS0052.
Abstract:
The Probabilistic Bin Packing Problem (PBPP) accounts for the random deletion of some items once they have been placed into bins. The problem is to rearrange the residual items using the a priori solution, the initial arrangement being done with the Next Fit Decreasing (NFD) heuristic. We propose two resolution strategies: the redistribution strategy according to NFD and the a priori strategy. In the first, the Next Fit algorithm is applied to the new list. In the second, successive groups of bins are optimally rearranged. In both cases, we develop an average-case analysis for the PBPP. We prove the law of large numbers and the central limit theorem for the number of occupied bins as the initial number of items tends to infinity, and we verify these theoretical results by simulation.
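The Next Fit Decreasing heuristic used for the initial arrangement is simple enough to sketch; the capacity and item sizes below are illustrative:

```python
def next_fit_decreasing(items, capacity=1.0):
    """Next Fit Decreasing: sort the items by non-increasing size, then keep a
    single open bin; whenever the next item does not fit, close the bin and
    open a new one (closed bins are never revisited)."""
    bins = []
    current, load = [], 0.0
    for size in sorted(items, reverse=True):
        if load + size > capacity:
            bins.append(current)          # close the current bin
            current, load = [], 0.0
        current.append(size)
        load += size
    if current:
        bins.append(current)
    return bins

packed = next_fit_decreasing([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1])
print(len(packed))  # 4 bins for this list
```

Because closed bins are never reopened, Next Fit runs in linear time after the sort, which is what makes it attractive as the a priori solution that the thesis's strategies then repair after random deletions.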
8. Bahloul, Khaled. "Optimisation combinée des coûts de transport et de stockage dans un réseau logistique dyadique, multi-produits avec demande probabiliste." PhD thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00695275.
Abstract:
The aim of this thesis is to propose replenishment management methods adapted to particular contexts in order to minimise the logistics costs incurred in a multi-product, multi-level logistics network facing probabilistic demand. In this thesis we set out to:
- Propose inventory and transport management methods for product families in different contexts. A first replenishment policy is proposed for a family of products characterised by random, repetitive demand; it is defined by an order level and an order-up-to level for each product and by a replenishment period, and as soon as any product reaches its order level, a replenishment of all products in the family is triggered. A second replenishment policy is proposed for a family of products characterised by highly random, sporadic demand; it is based on stock-outs, and each stock-out of a product held in stock triggers a replenishment of all products in the family.
- Propose a multi-criteria classification method to form groups of products governed by a given policy, each class or family grouping products that react identically. This classification of products into homogeneous families identifies the characteristics that are decisive in the choice of inventory and transport management methods.
- Analyse and compare the performance of these two replenishment policies against reference policies, as well as their sensitivity to a few discriminating parameters: demand variability, product cost, cost of urgent orders, etc.
9. Royer, Clément. "Algorithmes d'optimisation sans dérivées à caractère probabiliste ou déterministe : analyse de complexité et importance en pratique." Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30207/document.
Abstract:
L'utilisation d'aspects aléatoires a contribué de façon majeure aux dernières avancées dans le domaine de l'optimisation numérique; cela est dû en partie à la recrudescence de problèmes issus de l'apprentissage automatique (machine learning). Dans un tel contexte, les algorithmes classiques d'optimisation non linéaire, reposant sur des principes déterministes, se révèlent en effet bien moins performants que des variantes incorporant de l'aléatoire. Le coût de ces dernières est souvent inférieur à celui de leurs équivalents déterministes; en revanche, il peut s'avérer difficile de maintenir les propriétés théoriques d'un algorithme déterministe lorsque de l'aléatoire y est introduit. Effectuer une analyse de complexité d'une telle méthode est un procédé très répandu dans ce contexte. Cette technique permet déstimer la vitesse de convergence du schéma considéré et par là même d'établir une forme de convergence de celui-ci. Les récents travaux sur ce sujet, en particulier pour des problèmes d'optimisation non convexes, ont également contribué au développement de ces aspects dans le cadre déterministe, ceux-ci apportant en effet un éclairage nouveau sur le comportement des algorithmes. Dans cette thèse, on s'intéresse à l'amélioration pratique d'algorithmes d'optimisation sans dérivées à travers l'introduction d'aléatoire, ainsi qu'à l'impact numérique des analyses de complexité. L'étude se concentre essentiellement sur les méthodes de recherche directe, qui comptent parmi les principales catégories d'algorithmes sans dérivées; cependant, l'analyse sous-jacente est applicable à un large éventail de ces classes de méthodes. On propose des variantes probabilistes des propriétés requises pour assurer la convergence des algorithmes étudiés, en mettant en avant le gain en efficacité induit par ces variantes: un tel gain séxplique principalement par leur coût très faible en évaluations de fonction. 
Le cadre de base de notre analyse est celui de méthodes convergentes au premier ordre, que nous appliquons à des problèmes sans ou avec contraintes linéaires. Les bonnes performances obtenues dans ce contexte nous incitent par la suite à prendre en compte des aspects d'ordre deux. A partir des propriétés de complexité des algorithmes sans dérivées, on développe de nouvelles méthodes qui exploitent de l'information du second ordre. L'analyse de ces procédures peut être réalisée sur un plan déterministe ou probabiliste: la deuxième solution nous permet d'étudier de nouveaux aspects aléatoires ainsi que leurs conséquences sur l'éfficacité et la robustesse des algorithmes considérés
Randomization has had a major impact on the latest developments in the field of numerical optimization, partly due to the outbreak of machine learning applications. In this increasingly popular context, classical nonlinear programming algorithms have indeed been outperformed by variants relying on randomness. The cost of these variants is usually lower than for the traditional schemes; however, theoretical guarantees may not be straightforward to carry over from the deterministic to the randomized setting. Complexity analysis is a useful tool in the latter case, as it helps in providing estimates on the convergence speed of a given scheme, which implies some form of convergence. Such a technique has also gained attention from the deterministic optimization community thanks to recent findings in the nonconvex case, as it brings supplementary indicators on the behavior of an algorithm. In this thesis, we investigate the practical enhancement of deterministic optimization algorithms through the introduction of random elements within those frameworks, as well as the numerical impact of their complexity results. We focus on direct-search methods, one of the main classes of derivative-free algorithms, yet our analysis applies to a wide range of derivative-free methods. We propose probabilistic variants of classical properties required to ensure convergence of the studied methods, then demonstrate the practical efficiency afforded by their lower consumption of function evaluations. First-order concerns form the basis of our analysis, which we apply to address unconstrained and linearly-constrained problems. The observed gains lead us to additionally take second-order considerations into account. Using complexity properties of derivative-free schemes, we develop several frameworks in which information of order two is exploited. Both a deterministic and a probabilistic analysis can be performed on these schemes. The latter is an opportunity to introduce supplementary probabilistic properties, together with their impact on numerical efficiency and robustness.
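A minimal sketch of the probabilistic direct-search idea summarised in this abstract: poll only a few randomly drawn directions per iteration instead of a full positive spanning set, expanding the step size on success and shrinking it on failure. All names and constants are illustrative, not taken from the thesis.

```python
import numpy as np

def probabilistic_direct_search(f, x0, alpha0=1.0, max_evals=500, n_dirs=2, seed=0):
    """Direct search polling a few random unit directions per iteration
    (a sketch of probabilistic descent with a sufficient-decrease test)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    alpha, evals = alpha0, 1
    while evals < max_evals and alpha > 1e-10:
        # Draw random unit directions; poll +d and -d for each of them.
        dirs = rng.standard_normal((n_dirs, x.size))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        success = False
        for d in np.vstack([dirs, -dirs]):
            trial = x + alpha * d
            ft = f(trial)
            evals += 1
            if ft < fx - 1e-4 * alpha ** 2:   # sufficient decrease
                x, fx, success = trial, ft, True
                break
        alpha = 2.0 * alpha if success else 0.5 * alpha
    return x, fx
```

Polling fewer directions lowers the per-iteration evaluation cost, which is precisely the practical gain in function evaluations that the abstract highlights.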
10

Bonnard, Cécile. "Optimisation de potentiels statistiques pour un modèle d'évolution soumis à des contraintes structurales." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2010. http://tel.archives-ouvertes.fr/tel-00495973.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In recent years, several models of molecular evolution have been developed based on the assumption that protein sequences evolve under the constraint of a well-defined structure that remains constant over the course of evolution. Such a model, however, relies on expressing the function that represents the link between a structure and its sequence. Statistical potentials offer an attractive solution, but among the many existing statistical potentials, which would be the most appropriate for these evolutionary models? This thesis develops a probabilistic framework for optimising statistical potentials, in a maximum-likelihood setting and with protein design in mind. The statistical potential used here combines a contact term between two amino acids and a solvent-accessibility term, but the statistical framework generalises very easily to more complex forms of potential. The framework integrates different optimisation methods, including the use of alternative structures (decoys) for optimising the potentials, and relies on an algorithmic improvement that quickly yields statistical potentials adapted to the context. All of this provides a robust framework and statistical tests (both in the context of potential optimisation and in the context of molecular evolution), allowing different methods for optimising statistical potentials to be compared for models subject to structural constraints.
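The decoy-based maximum-likelihood optimisation this abstract describes can be sketched in toy form: a linear potential E(s) = w · f(s) is fitted by gradient ascent so that the native structure receives low energy relative to its decoys under a Boltzmann-style probability. The feature vectors, learning rate, and step count below are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def fit_potential(native_f, decoy_f, lr=0.5, steps=500):
    """Maximum-likelihood fit of a linear statistical potential E(s) = w @ f(s).
    The native structure competes against decoys through
    P(native) = exp(-E_nat) / sum_i exp(-E_i); gradient ascent on
    log P(native) lowers the native energy relative to the decoys."""
    feats = np.vstack([native_f, decoy_f])      # row 0 is the native structure
    w = np.zeros(feats.shape[1])
    for _ in range(steps):
        e = feats @ w
        p = np.exp(-(e - e.min()))              # shift for numerical stability
        p /= p.sum()
        grad = -feats[0] + p @ feats            # gradient of log P(native)
        w += lr * grad
    e = feats @ w
    p = np.exp(-(e - e.min()))
    p /= p.sum()
    return w, float(p[0])
```

With more competitive feature vectors the likelihood stays bounded away from one, which is where the thesis's statistical tests on optimised potentials become relevant.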

Books on the topic "Optimisation probabiliste":

1

Peter, Whittle, and Kelly F. P, eds. Probability, statistics, and optimisation: A tribute to Peter Whittle. Chichester: Wiley, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ross, Sheldon M. Applied probability models with optimization applications. New York: London, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ross, Sheldon M. Applied probability models with optimization applications. New York: Dover Publications, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sironi, Paolo. Modern portfolio management: From Markowitz to probabilistic scenario optimisation : goal-based and long-term portfolio choice. London: Risk Books, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Arora, Rajesh Kumar. Optimization: Algorithms and applications. Boca Raton: Taylor & Francis Group, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

L, Aarts E. H., and Lenstra J. K, eds. Local search in combinatorial optimization. Chichester [England]: Wiley, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Onn, Shmuel. Nonlinear discrete optimization: An algorithmic theory. Zürich, Switzerland: European Mathematical Society Publishing House, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hansen, Eldon R. Global optimization using interval analysis. New York: M. Dekker, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Davis, M. H. A. Markov models and optimization. London: Chapman & Hall, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Whittle, Peter. Networks: Optimisation and Evolution (Cambridge Series in Statistical and Probabilistic Mathematics). Cambridge University Press, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Optimisation probabiliste":

1

Illingworth, John, and Josef Kittler. "Optimisation Algorithms in Probabilistic Relaxation Labelling." In Pattern Recognition Theory and Applications, 109–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 1987. http://dx.doi.org/10.1007/978-3-642-83069-3_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Khajwal, Basim, C. H. Luke Ong, and Dominik Wagner. "Fast and Correct Gradient-Based Optimisation for Probabilistic Programming via Smoothing." In Programming Languages and Systems, 479–506. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-30044-8_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
We study the foundations of variational inference, which frames posterior inference as an optimisation problem, for probabilistic programming. The dominant approach for optimisation in practice is stochastic gradient descent. In particular, a variant using the so-called reparameterisation gradient estimator exhibits fast convergence in a traditional statistics setting. Unfortunately, discontinuities, which are readily expressible in programming languages, can compromise the correctness of this approach. We consider a simple (higher-order, probabilistic) programming language with conditionals, and we endow our language with both a measurable and a smoothed (approximate) value semantics. We present type systems which establish technical pre-conditions. Thus we can prove stochastic gradient descent with the reparameterisation gradient estimator to be correct when applied to the smoothed problem. Besides, we can solve the original problem up to any error tolerance by choosing an accuracy coefficient suitably. Empirically we demonstrate that our approach has a similar convergence as a key competitor, but is simpler, faster, and attains orders of magnitude reduction in work-normalised variance.
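The reparameterisation gradient estimator central to this abstract can be sketched in its simplest setting, a one-dimensional Gaussian variational family q = N(mu, sigma²): samples are written as z = mu + sigma·eps with eps ~ N(0, 1), so gradients flow through z to the variational parameters. The target, step size, and names below are illustrative, not from the paper.

```python
import numpy as np

def elbo_step(mu, log_sigma, grad_logp, lr=0.05, n_samples=64, rng=None):
    """One stochastic-gradient ascent step on the ELBO for q = N(mu, sigma^2)
    using the reparameterisation trick. `grad_logp` is the gradient of the
    (unnormalised) log target density; the entropy of q contributes
    d/d(log sigma) = 1 in closed form."""
    rng = rng if rng is not None else np.random.default_rng(0)
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps                       # reparameterised samples
    g = grad_logp(z)
    mu = mu + lr * np.mean(g)                  # pathwise gradient w.r.t. mu
    log_sigma = log_sigma + lr * (np.mean(g * eps * sigma) + 1.0)
    return mu, log_sigma
```

A discontinuous `grad_logp` is exactly the situation where this estimator breaks down, which is what motivates the paper's smoothed semantics.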
3

de Vries, G. B., P. H. A. J. M. van Gelder, and J. K. Vrijling. "Probabilistic Cost Optimisation of Soil Improvement Strategies." In Probabilistic Safety Assessment and Management, 3317–23. London: Springer London, 2004. http://dx.doi.org/10.1007/978-0-85729-410-4_530.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Stamp, Mark. "Model-based optimisation of probability sampling designs." In Introduction to Machine Learning with Applications in Information Security, 231–62. 2nd ed. Boca Raton: Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003264873-13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Brus, Dick J. "Model-based optimisation of probability sampling designs." In Spatial Sampling with R, 231–62. Boca Raton: Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003258940-13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Savić, Robert, and Uwe K. Rakowsky. "A Neuro-Fuzzy Reliability Optimisation Method Considering Life Cycle Costs." In Probabilistic Safety Assessment and Management, 1388–94. London: Springer London, 2004. http://dx.doi.org/10.1007/978-0-85729-410-4_224.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Camarinopoulos, L., G. Zioutas, and E. Bora-Senta. "An Optimisation Technique For Robust Autoregressive Estimates." In Athens Conference on Applied Probability and Time Series Analysis, 102–14. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-2412-9_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Barros, Anne, Christophe Bérenguer, and Antoine Grall. "Effect of false alarms on the optimisation of the maintenance decisions." In Probabilistic Safety Assessment and Management, 2833–39. London: Springer London, 2004. http://dx.doi.org/10.1007/978-0-85729-410-4_454.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bacharoudis, Konstantinos, Atanas Popov, and Svetan Ratchev. "Application of Advanced Simulation Methods for the Tolerance Analysis of Mechanical Assemblies." In IFIP Advances in Information and Communication Technology, 153–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72632-4_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In the frame of a statistical tolerance analysis of complex assemblies, for example an aircraft wing, the capability to predict accurately and quickly specified, very small quantiles of the distribution of the assembly key characteristic becomes crucial. The problem is significantly magnified when the tolerance synthesis problem is considered, in which several tolerance analyses are performed and thus a reliability analysis problem is nested inside an optimisation one in a fully probabilistic approach. The need to reduce the computational time and accurately estimate the specified probabilities is critical. Therefore, herein, a systematic study on several state-of-the-art simulation methods is performed whilst they are critically evaluated with respect to their efficiency to deal with tolerance analysis problems. It is demonstrated that tolerance analysis problems are characterised by high dimensionality, high non-linearity of the state functions, disconnected failure domains, implicit state functions and small probability estimations. Therefore, the successful implementation of reliability methods becomes a formidable task. Herein, advanced simulation methods are combined with in-house developed assembly models based on the Homogeneous Transformation Matrix method as well as off-the-shelf Computer Aided Tolerance tools. The main outcome of the work is that by using an appropriate reliability method, computational time can be reduced whilst the probability of defective products can be accurately predicted. Furthermore, the connection of advanced mathematical toolboxes with off-the-shelf 3D tolerance tools into a process integration framework introduces benefits to successfully deal with the tolerance allocation problem in the future using dedicated and powerful computational tools.
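The difficulty this abstract emphasises, cheaply estimating very small failure probabilities, can be illustrated with importance sampling centred at the design point, a standard reliability technique. The sketch below uses a toy linear limit state with a single standard normal input; all names are illustrative and not from the paper.

```python
import math
import numpy as np

def failure_prob_is(beta, n=20_000, seed=0):
    """Importance-sampling estimate of p = P(X > beta) for X ~ N(0, 1):
    draw from the shifted proposal N(beta, 1) centred at the design point
    and reweight by the density ratio phi(x) / phi(x - beta)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(beta, 1.0, n)
    w = np.exp(-beta * x + 0.5 * beta ** 2)   # phi(x) / phi(x - beta)
    return float(np.mean(w * (x > beta)))

# Exact tail probability for comparison: p = 0.5 * erfc(beta / sqrt(2)).
```

For beta = 4 the exact probability is about 3.2e-5, so a crude Monte Carlo run with the same 20,000-sample budget would typically observe zero failures, while the shifted estimator resolves it with a few percent relative error.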
10

Zanin, Massimiliano, Marco Correia, Pedro A. C. Sousa, and Jorge Cruz. "Probabilistic Constraint Programming for Parameters Optimisation of Generative Models." In Progress in Artificial Intelligence, 376–87. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23485-4_38.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Optimisation probabiliste":

1

Stefanini, L., and F. J. Blom. "Safety Margin Optimisation by Probabilistic Analysis." In ASME 2017 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/pvp2017-65141.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Deterministic assessment codes can contain large safety factors that give very conservative results. By applying probabilistic analysis to these deterministic assessments, an implicitly accepted probability of failure can be determined. The probability of failure is implicit because it is calculated with the parameter values resulting in a state that is deterministically accepted by the code [2]. When these probabilities are compared for similar deterministic assessments, the excess conservatism can be shown and possibly reduced. During the present study a probabilistic analysis of the critical crack length was performed. This analysis led to a proposal for a corrective action to the Master Curve approach given in BS7910:2013 Annex J. First, a deterministic calculation was performed with the Kr-Lr method to define the critical length of a through-wall circumferential crack present in a nuclear reactor's piping. The value of Kmat used in the Kr-Lr method was calculated for a probability of 0.05 and with T0 directly measured (T0 a unique value). The second step was to move to a probabilistic calculation. Here Kmat was calculated from both T0 directly measured and T0 estimated by Charpy-V tests (T0 as a distribution). The results from these calculations gave the probability of a crack being equal to the critical crack length. Moreover, these results showed that the Tk safety margin introduced in BS7910:2013 Annex J introduces excess conservatism. Results from the probabilistic calculations were then compared to the implicitly accepted failure probability Pf (5%) that results from deterministic analysis (T0 considered as a single value) to account for the effects of the T0 distribution. An optimized Tk was then found to account for the real uncertainty of the statistical distribution. Finally, excluding a dependency on the yield stress, the Tk optimization method was generalized. A new correlation for the Tk safety margin is proposed.
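The Master Curve approach referred to in this abstract treats fracture toughness as a three-parameter Weibull variable with shape 4 and threshold 20 MPa·sqrt(m) (the form used in ASTM E1921), so the failure probability at a given applied K can be checked by direct Monte Carlo. The values below are illustrative, not from the paper.

```python
import math
import numpy as np

def pf_master_curve(k_applied, k0, n=100_000, seed=1):
    """Monte Carlo estimate of P(Kmat < K_applied) when Kmat follows the
    Master Curve toughness distribution: a three-parameter Weibull with
    shape 4, threshold 20 MPa*sqrt(m), and scale K0. Closed form:
    Pf = 1 - exp(-((K - 20) / (K0 - 20))**4)."""
    rng = np.random.default_rng(seed)
    kmat = 20.0 + (k0 - 20.0) * rng.weibull(4.0, size=n)
    return float(np.mean(kmat < k_applied))
```

At K_applied = K0 the closed form gives Pf = 1 - e^(-1), a convenient sanity check; comparing such probabilities against the implicitly accepted 5% level is the mechanism by which the paper exposes the Tk margin's excess conservatism.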
2

Puisa, R., and D. Vassalos. "Surrogate Optimisation of Probabilistic Subdivision Index." In Design and Operation of Passenger Ships 2011. RINA, 2011. http://dx.doi.org/10.3940/rina.pass.2011.12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Franke, Björn, Michael O'Boyle, John Thomson, and Grigori Fursin. "Probabilistic source-level optimisation of embedded programs." In the 2005 ACM SIGPLAN/SIGBED conference. New York, New York, USA: ACM Press, 2005. http://dx.doi.org/10.1145/1065910.1065922.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Nakiganda, Agnes M., Shahab Dehghan, and Petros Aristidou. "A Data-Driven Optimisation Model for Designing Islanded Microgrids." In 2022 17th International Conference on Probabilistic Methods Applied to Power Systems (PMAPS). IEEE, 2022. http://dx.doi.org/10.1109/pmaps53380.2022.9810598.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Holický, M. "Probabilistic optimisation of concrete cover exposed to carbonation." In ConcreteLife'06 - International RILEM-JCI Seminar on Concrete Durability and Service Life Planning: Curing, Crack Control, Performance in Harsh Environments. RILEM Publications SARL, 2006. http://dx.doi.org/10.1617/291214390x.040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Shyam, RB Ashith, Peter Lightbody, Gautham Das, Pengcheng Liu, Sebastian Gomez-Gonzalez, and Gerhard Neumann. "Improving Local Trajectory Optimisation using Probabilistic Movement Primitives." In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019. http://dx.doi.org/10.1109/iros40897.2019.8967980.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

El Yafrani, Mohamed, Marcella Scoczynski, Myriam Delgado, Ricardo Luders, Inkyung Sung, Markus Wagner, and Diego Oliva. "On Updating Probabilistic Graphical Models in Bayesian Optimisation Algorithm." In 2019 8th Brazilian Conference on Intelligent Systems (BRACIS). IEEE, 2019. http://dx.doi.org/10.1109/bracis.2019.00062.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Pepper, Nick, Francesco Montomoli, Francesco Giacomel, Giovanna Cavazzini, Michele Pinelli, Nicola Casari, and Sanjiv Sharma. "Uncertainty Quantification and Missing Data for Turbomachinery With Probabilistic Equivalence and Arbitrary Polynomial Chaos, Applied to Scroll Compressors." In ASME Turbo Expo 2020: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/gt2020-16139.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This work presents a framework for predicting unknown input distributions for turbomachinery applications starting from scarce experimental measurements. The problem is relevant to turbomachinery, where important parameters are obtained using indirect measurements. In this paper a scroll compressor is used as an example, but the suggested framework is completely general and can be used to infer missing data on material composition (carbon fiber properties, laser-melted specimens for additive manufacturing, etc.) or input data (such as the turbine inlet temperature). Scroll compressors are small devices with a very complex geometry that is difficult to measure. Moreover, these compressors are highly sensitive to manufacturing errors and clearances. For these reasons we have chosen this example as an ideal candidate to prove the effectiveness of the framework. An input probability distribution for the scroll height is recovered based on a scarce, synthetic data set. The scroll height is used as an example of a missing distribution for a geometric parameter, as it has the highest variance and is challenging to measure experimentally. The framework consists of two main building blocks: an equivalence in a probabilistic sense and a Non-Intrusive Polynomial Chaos formulation able to deal with scarce data. The probabilistic equivalence is defined by a Probability Density Function (PDF) matching approach in which the statistical distance between probability distributions is quantified by either the Kolmogorov-Smirnov (KS) distance or the Kullback-Leibler (KL) divergence. By representing the missing inputs with a generalised Polynomial Chaos Expansion (gPCE), the back-calculation problem can be recast as an optimisation problem in which an arbitrary Polynomial Chaos (aPC) formulation is used to propagate the uncertain input distributions through a computational model of the system and generate a probability distribution for the Quantity of Interest (QoI).
The framework has been tested with multiple non-Askey scheme distributions to prove the generality of the proposed approach.
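The PDF-matching step this abstract describes, choosing the input distribution whose propagated output best matches the observed samples under the Kolmogorov-Smirnov distance, can be illustrated with a toy grid search. A plain function stands in for the polynomial-chaos surrogate, and all names are illustrative assumptions.

```python
import numpy as np

def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

def match_input_spread(observed_qoi, model, candidate_sigmas, n=5000, seed=0):
    """Pick the input standard deviation whose propagated output samples
    are closest, in KS distance, to the observed quantity of interest."""
    rng = np.random.default_rng(seed)
    return min(candidate_sigmas,
               key=lambda s: ks_distance(observed_qoi,
                                         model(rng.normal(0.0, s, n))))
```

The paper replaces the grid search with a proper optimiser over gPCE coefficients and optionally swaps KS for KL divergence, but the matching principle is the same.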
9

Kim, Hyunsun A., and Robert A. Guyer. "Robust Topology Optimisation with Generalised Probability Distribution of Loading." In 54th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2013. http://dx.doi.org/10.2514/6.2013-1870.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Qureshi, Marij, Kwangkyu Yoo, and M. H. Ferri Aliabadi. "Assessment of algorithms for the probabilistic optimisation of composite panels." In FRACTURE AND DAMAGE MECHANICS: Theory, Simulation and Experiment. AIP Publishing, 2020. http://dx.doi.org/10.1063/5.0034767.

Full text
APA, Harvard, Vancouver, ISO, and other styles
