Dissertations / Theses on the topic 'RBDO (Reliability Based Design Optimisation)'


Consult the top 22 dissertations / theses for your research on the topic 'RBDO (Reliability Based Design Optimisation).'


1

Chakchouk, Mohamed. "Conception d'un détecteur de système mécatronique mobile intelligent pour observer des molécules en phase gazeuse en IR." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMIR06.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This work anticipates that, in an ever-expanding digital technology world, technological breakthroughs in the analysis of data collected by spectroscopic devices will allow almost instantaneous identification of known species observed in situ in a specific environment, leaving the necessary in-depth analysis for unobserved species. A method derived from RBDO (Reliability Based Design Optimization) technology will be used to implement an artificial intelligence procedure to identify observed species from a mobile IR sensor. To successfully analyze the obtained data, it is necessary to correctly assign molecular species from the observed IR data using appropriate theoretical models. This work focuses on observation from mobile devices equipped with suitable sensors, antennas, and electronics to capture and send raw or analyzed data from an IR spectroscopic environment of interest. It is therefore useful, if not essential, to focus on symmetry-based theoretical tools for the spectroscopic analysis of molecules, which allow identification of the IR windows to be chosen for observation in the design of the device. Then, by fitting the theoretical spectroscopic parameters to the observed frequencies, the spectrum of a molecular species can be reconstructed. A deconvolution of the observed spectra is necessary before the analysis in terms of intensity, width, and line center characterizing a line shape. An adequate strategy is therefore needed in the design to include data analysis during the observation phase, which can benefit from an artificial intelligence algorithm to account for differences in the IR spectral signature. In this regard, the analytical power of the instrument data can be improved by using the reliability-based design optimization (RBDO) methodology. Based on the multi-physics behavior of uncertainty propagation in the hierarchical system tree, RBDO uses probabilistic modeling to analyze deviations from the desired output as feedback parameters for optimizing the initial design. The goal of this thesis is to treat the IR observation window parameters so as to address reliability issues beyond the mechatronic design itself, extending to species identification through analysis of the collected data.
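The uncertainty propagation that RBDO builds on can be sketched with a plain Monte Carlo loop: sample the uncertain inputs, push them through a performance (limit-state) function, and count how often the output deviates into the failure region. The performance function and the distributions below are invented for illustration; they are not from the thesis.

```python
import random

random.seed(0)

def performance(thickness_mm, load_n):
    # Hypothetical limit state: capacity minus demand; g < 0 means failure.
    capacity = 4.0 * thickness_mm          # assumed capacity model
    return capacity - load_n / 1000.0      # demand scaled to the same units

n = 100_000
failures = sum(
    performance(random.gauss(2.0, 0.1), random.gauss(6000.0, 500.0)) < 0
    for _ in range(n)
)
pf = failures / n
print(f"estimated probability of failure: {pf:.4f}")
```

Feeding this back into design then amounts to adjusting the design variables (here, the nominal thickness) until the estimated failure probability falls below a target.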
2

Zhang, Peipei. "Diffuse response surface model based on advancing latin hypercube patterns for reliability-based design optimization of ultrahigh strength steel NC milling parameters." Compiègne, 2011. http://www.theses.fr/2011COMP1949.

Abstract:
Since variances in the input parameters of engineering systems cause variations in product performance, deterministic optimum designs obtained without taking uncertainties into consideration can be unreliable. Reliability-Based Design Optimization (RBDO) has therefore been getting a lot of attention recently. However, RBDO is computationally expensive, so the Response Surface Methodology (RSM) is often used to improve computational efficiency in the solution of RBDO problems. In this work, we focus on an RSM adapted to RBDO. The Diffuse Approximation (DA), a variant of the well-known Moving Least Squares (MLS) approximation based on a progressive sampling pattern, is used within a variant of the First Order Reliability Method (FORM). The proposed method simultaneously uses points in the standard normal space (U-space) as well as in the physical space (X-space). The two grids form a "virtual design of experiments" defined by two sets of points in the two design spaces, which are evaluated only when needed in order to minimize the number of 'exact', and thus computationally expensive, function evaluations. In each new iteration, the pattern of points is updated with points appropriately selected from the "virtual design of experiments" in order to perform the approximation. As an original contribution, we introduce the concept of "advancing Latin Hypercube Sampling (LHS)", which extends the idea of LHS to maximally reuse previously computed points while adding, at each step, the minimal number of new neighboring points necessary for the approximation in the vicinity of the current design. We propose panning, expanding, and shrinking Latin hypercube patterns of sampling points, and we analyze the influence of this kind of pattern on the quality of the approximation. Next, we calculate the minimal number of data points required to obtain a well-conditioned approximation system. In the application part of this work, we investigate the optimization of the process parameters for Numerical Control (NC) milling of ultrahigh strength steel. The success of the machining operation depends on the selection of machining parameters such as the feed rate, cutting speed, and the axial and radial depths of cut. A variant of FORM is chosen to calculate the reliability index, and the optimization constraints are expressed as functions of the computed reliability indices.
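FORM, which supplies the reliability indices used in the constraints above, locates the most probable failure point in standard normal (U) space. A minimal sketch of the classical Hasofer-Lind-Rackwitz-Fiessler iteration, with an assumed limit-state function standing in for the milling model:

```python
import math

def g(u):
    # Assumed limit state in U-space; g <= 0 is the failure region.
    return 2.0 + 0.1 * u[0] ** 2 - u[1]

def gradient(f, u, h=1e-6):
    # Central finite-difference gradient.
    out = []
    for i in range(len(u)):
        up, um = list(u), list(u)
        up[i] += h
        um[i] -= h
        out.append((f(up) - f(um)) / (2 * h))
    return out

def form_beta(g, ndim=2, iters=50):
    # HL-RF iteration: repeatedly project onto the linearized limit state.
    u = [0.0] * ndim
    for _ in range(iters):
        dg = gradient(g, u)
        norm2 = sum(d * d for d in dg)
        scale = (sum(d * ui for d, ui in zip(dg, u)) - g(u)) / norm2
        u = [scale * d for d in dg]
    return math.sqrt(sum(ui * ui for ui in u))  # beta = distance to origin

beta = form_beta(g)
print(f"reliability index beta ~ {beta:.3f}")
```

For this particular limit state the most probable point is at (0, 2), so the iteration converges to beta = 2.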
3

Cho, Hyunkyoo. "Efficient variable screening method and confidence-based method for reliability-based design optimization." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/4594.

Abstract:
The objectives of this study are (1) to develop an efficient variable screening method for reliability-based design optimization (RBDO) and (2) to develop a new RBDO method that incorporates a confidence level for problems with limited input data. The current research effort involves: (1) development of a partial output variance concept for variable screening; (2) development of an effective variable screening sequence; (3) development of an estimation method for the confidence level of a reliability output; and (4) development of a design sensitivity method for the confidence level. In the RBDO process, surrogate models are frequently used to reduce the number of simulations because analysis of a simulation model takes a great deal of computational time. On the other hand, to obtain accurate surrogate models, we have to limit the dimension of the RBDO problem and thus mitigate the curse of dimensionality. Therefore, it is desirable to develop an efficient and effective variable screening method for reducing the dimension of the RBDO problem. In this study, it is found that output variance is critical for identifying important variables in the RBDO process. A partial output variance, an efficient approximation based on the univariate dimension reduction method (DRM), is proposed to calculate the output variance efficiently. For variable screening, the variables that have larger partial output variances are selected as important variables. To determine important variables, hypothesis testing is used so that possible errors are contained at a user-specified error level. Also, an appropriate number of samples is proposed for calculating the partial output variance. Moreover, a quadratic interpolation method is studied in detail to calculate output variance efficiently. Using numerical examples, the performance of the proposed variable screening method is verified: the proposed method finds important variables efficiently and effectively.
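The partial output variance idea can be mimicked crudely: freeze all inputs at their means, vary one input at a time along its distribution (the univariate cut that DRM-style approximations use), and rank inputs by the resulting output variance. The function and distributions here are made up for illustration:

```python
import random
import statistics

random.seed(0)

def f(x):
    # Hypothetical performance function of three random inputs.
    return x[0] ** 2 + 0.1 * x[1] + 5.0 * x[2]

means = [1.0, 2.0, 0.5]
sds   = [0.2, 0.5, 0.1]

def partial_variance(i, n=20_000):
    # Vary only input i about the mean point (univariate cut).
    vals = []
    for _ in range(n):
        x = means[:]
        x[i] = random.gauss(means[i], sds[i])
        vals.append(f(x))
    return statistics.variance(vals)

pv = [partial_variance(i) for i in range(3)]
ranked = sorted(range(3), key=lambda i: -pv[i])
print("importance order:", ranked)
```

Inputs whose partial variance falls below a threshold (or fails the hypothesis test described above) would be screened out and fixed at their means, shrinking the dimension of the RBDO problem.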
Reliability analysis and RBDO require an exact input probabilistic model to obtain an accurate reliability output and RBDO optimum design. However, in practical engineering problems, often only limited input data are available to generate the input probabilistic model. Insufficient input data induce uncertainty in the input probabilistic model, and this uncertainty causes the RBDO optimum to lose its confidence level. Therefore, it is necessary to treat the reliability output, defined as the probability of failure, as following a probability distribution. The probability distribution of the reliability output is obtained from consecutive conditional probabilities of the input distribution type and parameters using a Bayesian approach. The approximate conditional probabilities are obtained under reasonable assumptions, and Monte Carlo simulation is applied to practically calculate the probability of the reliability output. A confidence-based RBDO (C-RBDO) problem is formulated using the derived probability of the reliability output. In the C-RBDO formulation, the probabilistic constraint is modified to include both the target reliability output and the target confidence level. Finally, the design sensitivity of the confidence level, which is the new probabilistic constraint, is derived to support an efficient optimization process. Using numerical examples, the accuracy of the developed design sensitivity is verified, and it is confirmed that C-RBDO optimum designs incorporate appropriate conservativeness according to the given input data.
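One elementary way to see the reliability output as a random quantity: if a Monte Carlo run observes k failures in n samples, then under a uniform prior the failure probability has a Beta(k+1, n-k+1) posterior, and a confidence-based design constrains an upper quantile of that posterior rather than its mean. This generic sketch is not the thesis's conditional-probability construction; the counts are hypothetical:

```python
import random

random.seed(0)

n, k = 10_000, 12  # hypothetical: 12 failures observed in 10,000 MC samples

# Posterior of the failure probability under a uniform prior: Beta(k+1, n-k+1).
draws = sorted(random.betavariate(k + 1, n - k + 1) for _ in range(20_000))
pf_mean = (k + 1) / (n + 2)            # posterior mean
pf_95 = draws[int(0.95 * len(draws))]  # 95% upper credible bound
print(f"posterior mean pf ~ {pf_mean:.5f}, 95% upper bound ~ {pf_95:.5f}")
```

Constraining pf_95 instead of pf_mean is what gives the optimum its conservativeness when input data are scarce.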
4

Gaul, Nicholas John. "Modified Bayesian Kriging for noisy response problems and Bayesian confidence-based reliability-based design optimization." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1322.

Abstract:
The objective of this study is to develop a new modified Bayesian Kriging (MBKG) surrogate modeling method that can be used to carry out confidence-based reliability-based design optimization (RBDO) for problems in which simulation analyses are inherently noisy and standard Kriging approaches fail. The formulation of the MBKG surrogate modeling method is presented, and the full conditional distributions of the unknown MBKG parameters are derived and coded into a Gibbs sampling algorithm. Using the coded Gibbs sampling algorithm, Markov chain Monte Carlo is used to fit the MBKG surrogate model. A sequential sampling method that uses the posterior credible sets for inserting new design of experiment (DoE) sample points is proposed. The sequential sampling method is developed in such a way that the newly added DoE sample points provide the maximum amount of information possible to the MBKG surrogate model, making it an efficient and effective way to reduce the number of DoE sample points needed; it therefore improves the posterior distribution of the probability of failure efficiently. Finally, a confidence-based RBDO method using the posterior distribution of the probability of failure is developed, so that the uncertainty of the MBKG surrogate model is included in the optimization process. A 2-D mathematical example is used to demonstrate fitting the MBKG surrogate model and the developed sequential sampling method, together with a detailed study of how the posterior distribution of the probability of failure changes as new DoE sample points are added. Confidence-based RBDO is carried out using the same 2-D mathematical example. Three different noise levels are used in the example to compare how the MBKG surrogate modeling method, the sequential sampling method, and the confidence-based RBDO method behave for different amounts of noise in the response, and the optimization results for the three noise levels are compared. A 3-D multibody dynamics (MBD) engineering block-car example is also presented, demonstrating the use of the developed methods to carry out confidence-based RBDO for an engineering problem with noise in the response. The MBD simulations for this example were done using the commercially available MBD software package RecurDyn. Deterministic design optimization (DDO) was first done using the MBKG surrogate model to obtain the mean response values, which were then used with standard Kriging methods to obtain the sensitivity of the responses. Confidence-based RBDO was then carried out using the DDO solution as the initial design point.
5

Ndashimye, Maurice. "Accounting for proof test data in Reliability Based Design Optimization." Thesis, Stellenbosch : Stellenbosch University, 2015. http://hdl.handle.net/10019.1/97108.

Abstract:
Thesis (MSc)--Stellenbosch University, 2015.
ENGLISH ABSTRACT: Recent studies have shown that considering proof test data in a Reliability Based Design Optimization (RBDO) environment can result in design improvement. Proof testing involves the physical testing of each and every component before it enters into service. Considering the proof test data as part of the RBDO process allows for improvement of the original design, such as weight savings, while preserving high reliability levels. Composite Over-Wrapped Pressure Vessels (COPVs) are used as an example application for achieving weight savings while maintaining high reliability levels. COPVs are light structures used to store pressurized fluids in space shuttles, the International Space Station and other applications where they are maintained at high pressure for extended periods of time. Given that each and every COPV used in spacecraft is proof tested before entering service, and that any weight saving on a spacecraft results in significant cost savings, this thesis puts forward an application of RBDO that accounts for proof test data in the design of a COPV. The method developed in this thesis shows that, while maintaining high levels of reliability, significant weight savings can be achieved by including proof test data in the design process. The method also gives the designer control over the magnitude of the proof test, making it possible to design the proof test itself depending on the desired level of reliability for passing it. The implementation of the method is discussed in detail. The evaluation of the reliability is based on the First Order Reliability Method (FORM) supported by Monte Carlo simulation. The method is implemented in a versatile way that allows the use of analytical as well as numerical (finite element) models. Results show that additional weight savings can be achieved by the inclusion of proof test data in the design process.
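The mechanism the thesis exploits can be seen in a toy Monte Carlo model: a proof test at load q removes every component whose strength falls below q, truncating the lower tail of the strength distribution, so the surviving population fails far less often in service. All distributions and the proof level below are hypothetical:

```python
import random

random.seed(0)

# Hypothetical model (units arbitrary): strength ~ N(10, 1.5),
# service load ~ N(6, 1.0), proof-test level q = 8.
n = 200_000
q = 8.0
strengths = [random.gauss(10.0, 1.5) for _ in range(n)]

def pf(samples):
    # Fraction of components whose random service load exceeds their strength.
    fails = sum(s < random.gauss(6.0, 1.0) for s in samples)
    return fails / len(samples)

pf_all = pf(strengths)
survivors = [s for s in strengths if s >= q]  # components passing the proof test
pf_proof = pf(survivors)
print(f"pf without proof test ~ {pf_all:.4f}, after proof test ~ {pf_proof:.4f}")
```

This reliability gain is what the RBDO formulation can trade for weight: a lighter (weaker) design may still meet the reliability target once proof-test survival is accounted for.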
6

Zhao, Liang. "Reliability-based design optimization using surrogate model with assessment of confidence level." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/1194.

Abstract:
The objective of this study is to develop an accurate surrogate modeling method for constructing surrogate models that represent the performance measures of compute-intensive simulation models in reliability-based design optimization (RBDO). In addition, an assessment method for the confidence level of the surrogate model, and a conservative surrogate model that accounts for the uncertainty of predictions in the untested design domain when the number of samples is limited, are developed and integrated into the RBDO process to ensure confidence in satisfying the probabilistic constraints at the optimal design. The effort involves: (1) developing a new surrogate modeling method that outperforms existing surrogate modeling methods in terms of accuracy for reliability analysis in RBDO; (2) developing a sampling method that efficiently and effectively inserts samples into the design domain for accurate surrogate modeling; (3) generating a surrogate model to approximate the probabilistic constraint and its sensitivity with respect to the design variables in most-probable-point-based RBDO; (4) using the sampling method with the surrogate model to approximate the performance function in sampling-based RBDO; and (5) generating a conservative surrogate model that conservatively approximates the performance function in sampling-based RBDO and ensures the obtained optimum satisfies the probabilistic constraints. In applying RBDO to a large-scale complex engineering application, a surrogate model is commonly used to represent the compute-intensive simulation model of the performance function. However, the accuracy of the surrogate model remains challenging for highly nonlinear and high-dimensional applications. In this work, a new method, the Dynamic Kriging (DKG) method, is proposed to construct the surrogate model accurately. In the DKG method, a generalized pattern search algorithm is used to find an accurate optimum for the correlation parameter, and the optimal mean structure is set using basis functions selected by a genetic algorithm from candidate basis functions based on a new accuracy criterion. In addition, a sequential sampling strategy based on the confidence interval of the surrogate model from the DKG method is proposed. By combining the sampling method with the DKG method, efficiency and accuracy can be rapidly achieved. Using the accurate surrogate model, both most-probable-point (MPP)-based RBDO and sampling-based RBDO can be carried out. In applying the surrogate models to MPP-based RBDO and sampling-based RBDO, several efficiency strategies are proposed: (1) using a local window for surrogate modeling; (2) adapting the window size for different design candidates; (3) reusing samples in the local window; (4) using violated constraints for the surrogate model accuracy check; and (5) using an adaptive initial point for correlation parameter estimation. To assure the accuracy of the surrogate model when the number of samples is limited, and to assure that the obtained optimum design satisfies the probabilistic constraints, a conservative surrogate model using the weighted Kriging variance is developed and implemented for sampling-based RBDO.
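Kriging surrogates such as the DKG variant interpolate the sampled responses exactly. A minimal one-dimensional ordinary-kriging sketch with a fixed Gaussian correlation parameter (no pattern-search or genetic-algorithm tuning as in the thesis) shows the core linear algebra:

```python
import math

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def kriging_predict(xs, ys, xq, theta=5.0):
    # Ordinary kriging: weights solve [R 1; 1' 0][w; mu] = [r(xq); 1].
    n = len(xs)
    R = [[math.exp(-theta * (xi - xj) ** 2) for xj in xs] for xi in xs]
    A = [row + [1.0] for row in R] + [[1.0] * n + [0.0]]
    rhs = [math.exp(-theta * (xq - xi) ** 2) for xi in xs] + [1.0]
    w = solve(A, rhs)
    return sum(wi * yi for wi, yi in zip(w[:n], ys))

xs = [0.0, 0.3, 0.6, 1.0]                     # DoE sample points (assumed)
ys = [math.sin(2 * math.pi * x) for x in xs]  # sampled responses
print(f"prediction at x=0.3: {kriging_predict(xs, ys, 0.3):.4f}")
```

At a sampled point the prediction reproduces the data exactly; this interpolation property is what sequential sampling strategies build on when they place new points where the predictive variance is large.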
7

Ezzati, Ghasem. "Reliability-based design optimisation methods in large scale systems." Thesis, Federation University Australia, 2015. http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/99881.

Abstract:
Doctor of Philosophy
Structural optimisation is an important field of applied mathematics that has proved useful in engineering projects. Reliability-based design optimisation (RBDO) can be considered a branch of structural optimisation. Different RBDO approaches have been applied to real-world problems (e.g. a vehicle side-impact model, short-column design, etc.). Double-loop, single-loop, and decoupled approaches are the three categories of RBDO. This research focuses on double-loop approaches, which solve reliability analysis problems in their inner loops and design optimisation calculations in their outer loops. In recent decades, double-loop approaches have been studied and modified to improve their stability and efficiency, but many shortcomings remain, particularly regarding reliability analysis methods. This thesis will concentrate on the development of new reliability analysis methods that can be applied to solve RBDO problems. As a local optimisation algorithm, the conjugate gradient method will be adopted. Furthermore, a new method will be introduced to solve a reliability analysis problem in the polar space; the reliability analysis problem must be transformed into an unconstrained optimisation problem before being solved in the polar space. Two methods will be introduced here, and their stability and efficiency will be compared with existing methods via numerical experiments. Next, we consider applications of RBDO models to electricity networks. Most current optimisation models of these networks are categorised as deterministic design optimisation models. A probabilistic constraint is introduced in this thesis for electricity networks. For this purpose, a performance function must be defined for a network in order to define safety and failure conditions. New non-deterministic design optimisation models will then be formulated for electricity networks using this probabilistic constraint. These models are designed to keep the failure probability of the network below a predetermined, accepted safety level.
8

Chen, Qing. "Reliability-based structural design: a case of aircraft floor grid layout optimization." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39630.

Abstract:
In this thesis, several Reliability-Based Design Optimization (RBDO) methods and algorithms for airplane floor grid layout optimization are proposed. A general RBDO process is proposed and validated by an example. Copulas are introduced as a mathematical tool for modeling correlations between random variables and for producing correlated data samples for Monte Carlo simulations (MCS). Based on the Hasofer-Lind (HL) method, a correlated HL method is proposed to evaluate the reliability index under correlation. As an alternative, the reliability index computation is interpreted as an optimization problem, and two nonlinear programming algorithms are introduced to evaluate the reliability index. To evaluate the reliability index by Monte Carlo simulation in a time-efficient way, a kriging-based surrogate model is proposed and compared to the original model in terms of computing time. Since the reliability constraint obtained by MCS does not have an analytical form in RBDO optimization models, a kriging-based response surface is built. Kriging-based response surface models are usually segmented functions that do not have a uniform expression over the design space, whereas most optimization algorithms require a uniform expression for constraints. To solve this problem, a heuristic gradient-based direct search algorithm is proposed. These methods and algorithms, together with the general RBDO process, are applied to the layout optimization of aircraft floor grid structural design.
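Gaussian-copula sampling, one common way to realize such correlated samples, draws correlated standard normals, maps them to uniforms through the normal CDF, and then pushes them through the inverse CDFs of the target marginals. The correlation value and marginals here are invented for illustration:

```python
import math
import random

random.seed(0)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

rho = 0.7  # copula correlation (assumed, for illustration)
xs, ys = [], []
for _ in range(50_000):
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    # Inverse-CDF maps to hypothetical marginals:
    xs.append(-math.log(1.0 - norm_cdf(z1)) / 0.5)  # Exponential(rate 0.5)
    ys.append(10.0 + 2.0 * z2)                      # Normal(10, 2)

mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / len(xs))
sy = math.sqrt(sum((b - my) ** 2 for b in ys) / len(ys))
r = cov / (sx * sy)
print(f"empirical Pearson correlation ~ {r:.2f}")
```

Note that the Pearson correlation of the transformed samples comes out somewhat lower than the copula parameter 0.7, since the exponential transform is nonlinear.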
9

Mansour, Rami. "Reliability Assessment and Probabilistic Optimization in Structural Design." Doctoral thesis, KTH, Hållfasthetslära (Avd.), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-183572.

Abstract:
Research in the field of reliability-based design is mainly focused on two sub-areas: the computation of the probability of failure and its integration in the reliability-based design optimization (RBDO) loop. Four papers are presented in this work, representing a contribution to both sub-areas. In the first paper, a new Second Order Reliability Method (SORM) is presented. As opposed to the most commonly used SORMs, the presented approach is not limited to a hyper-parabolic approximation of the performance function at the Most Probable Point (MPP) of failure. Instead, a full quadratic fit is used, leading to a better approximation of the real performance function and therefore more accurate values of the probability of failure. The second paper focuses on integrating the expression for the probability of failure for a general quadratic function, presented in the first paper, into RBDO. One important feature of the proposed approach is that it does not involve locating the MPP. In the third paper, the expressions for the probability of failure based on general quadratic limit-state functions presented in the first paper are applied to the special case of a hyper-parabola. The expression is reformulated and simplified so that the probability of failure is a function of only three statistical measures: the Cornell reliability index and the skewness and kurtosis of the hyper-parabola. These statistical measures are functions of the first-order reliability index and the curvatures at the MPP. In the last paper, an approximate and efficient reliability method is proposed. The focus is on computational efficiency as well as intuitiveness for practicing engineers, especially for probabilistic fatigue problems where volume methods are used. In the proposed method, the number of function evaluations needed to compute the probability of failure of a design under different types of uncertainties is known a priori to be 3n + 2, where n is the number of stochastic design variables.
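For context, the hyper-parabolic special case is classically handled with Breitung's asymptotic formula, pf ≈ Φ(-β) · ∏ᵢ (1 + β κᵢ)^(-1/2), which corrects the FORM estimate Φ(-β) for the curvatures κᵢ at the MPP. A two-dimensional sketch with an assumed parabolic limit state, checked against crude Monte Carlo:

```python
import math
import random

random.seed(0)

def phi_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

beta, kappa = 2.0, 0.3
# Limit state in U-space: failure when u2 > beta + (kappa/2) * u1**2.
pf_form = phi_cdf(-beta)
pf_sorm = phi_cdf(-beta) / math.sqrt(1.0 + beta * kappa)  # Breitung correction

n = 400_000
fails = sum(
    random.gauss(0, 1) > beta + 0.5 * kappa * random.gauss(0, 1) ** 2
    for _ in range(n)
)
pf_mc = fails / n
print(f"FORM {pf_form:.5f}  SORM {pf_sorm:.5f}  MC {pf_mc:.5f}")
```

SORM lands much closer to the Monte Carlo estimate than FORM here, because the curvature bends the limit-state surface away from the origin and removes probability mass that the linear FORM approximation counts as failure.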


10

Villanueva, Diane. "Reliability Based Design Including Future Tests and Multi-Agent Approaches." PhD thesis, Saint-Etienne, EMSE, 2013. http://tel.archives-ouvertes.fr/tel-00862355.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and the building of a model to estimate the reliability of the design with quantified uncertainties. However, even experienced designers often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost, and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes, forming a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs against performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require searching multiple candidate regions of the design space, expending most of the computation needed to define multiple alternate designs; thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima.
The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.
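The local-surrogate idea can be illustrated with a deliberately simple sketch (my construction, not the kriging-based agents of the thesis): fit one quadratic surrogate per sub-region of the design space and keep each convex fit's vertex as a candidate design.

```python
def fit_quadratic(points):
    # exact quadratic a*x^2 + b*x + c through three (x, y) samples
    (x1, y1), (x2, y2), (x3, y3) = points
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

def local_candidates(f, subregions, n_pts=3):
    # one local quadratic surrogate per sub-region; its vertex is the
    # candidate design for that region (kept only if the fit is convex)
    candidates = []
    for lo, hi in subregions:
        xs = [lo + i * (hi - lo) / (n_pts - 1) for i in range(n_pts)]
        a, b, _ = fit_quadratic([(x, f(x)) for x in xs])
        if a > 0:  # convex fit -> interior minimum at -b / (2a)
            candidates.append(min(max(-b / (2 * a), lo), hi))
    return candidates

f = lambda x: (x * x - 1.0) ** 2  # two optima, at x = -1 and x = +1
cands = local_candidates(f, [(-2.0, 0.0), (0.0, 2.0)])
print(cands)  # one candidate per basin
```

A real implementation would refine each local surrogate adaptively; here a single fit per sub-region is enough to show why searching sub-regions yields several alternate designs instead of one global optimum.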
11

Vishwanathan, Aditya. "Uncertainty Quantification for Topology Optimisation of Aerospace Structures." Thesis, University of Sydney, 2020. https://hdl.handle.net/2123/23922.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The design and optimisation of aerospace structures is non-trivial. There are several reasons for this, including, but not limited to: (1) complex problem instances (multiple objectives, constraints, loads, and boundary conditions), (2) the use of high-fidelity meshes, which impose a significant computational burden, and (3) dealing with uncertainties in the engineering modelling. The last few decades have seen a considerable increase in research output dedicated to solving these problems, and yet the majority of papers neglect the effect of uncertainties and assume deterministic conditions. This is particularly the case for topology optimisation, a promising method for aerospace design that has seen relatively little practical application to date. This thesis addresses notable gaps in the literature on topology optimisation under uncertainty. Firstly, an observation underpinning the field of uncertainty quantification (UQ) is the lack of experimental studies and of methods for dealing with non-parametric variability (e.g. model unknowns, experimental and human errors, etc.). Random Matrix Theory (RMT) is a method explored heavily in this thesis for the purpose of numerical and experimental UQ of aerospace structures under both parametric and non-parametric uncertainties. Next, a novel algorithm based on RMT is developed to increase the efficiency of reliability-based topology optimisation, a formulation that has historically been limited by computational runtime. This thesis also contributes to robust topology optimisation (RTO) by integrating uncertain boundary conditions and providing experimental validation of the results. The final chapter addresses uncertainties in multi-objective topology optimisation (MOTO), and also considers treating a single-objective RTO problem as a MOTO to provide a more consistent distribution of solutions.
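The non-parametric modelling that RMT enables can be sketched under the assumption of a Soize-type construction (an illustration, not the thesis's exact formulation): a nominal stiffness matrix, given through its Cholesky factor, is randomized by a normalized Wishart-like matrix whose mean is the identity.

```python
import random

def wishart_mean_identity(n, nu, rng):
    # G = (1/nu) * sum_j g_j g_j^T with g_j ~ N(0, I); E[G] = I and the
    # dispersion of G shrinks as nu grows
    G = [[0.0] * n for _ in range(n)]
    for _ in range(nu):
        g = [rng.gauss(0.0, 1.0) for _ in range(n)]
        for i in range(n):
            for j in range(n):
                G[i][j] += g[i] * g[j] / nu
    return G

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

# nominal 2x2 stiffness K0 = L L^T, given through its Cholesky factor L
L = [[2.0, 0.0], [0.5, 1.5]]
rng = random.Random(0)
G = wishart_mean_identity(2, nu=50, rng=rng)
K_rand = matmul(matmul(L, G), transpose(L))  # one random SPD stiffness sample
```

Each draw of `K_rand` is a symmetric positive matrix scattered around the nominal stiffness, which is the kind of ensemble that captures model (non-parametric) uncertainty rather than uncertainty in individual parameters.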
12

Yu, Hang. "Reliability-based design optimization of structures : methodologies and applications to vibration control." PhD thesis, Ecole Centrale de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00769937.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Deterministic design optimization is widely used to design products and systems. However, due to the inherent uncertainties in model parameters and operating processes, deterministic design optimization that ignores uncertainties may result in unreliable designs. It is then necessary to develop and implement optimization under uncertainty. One way to deal with this problem is reliability-based robust design optimization (RBRDO), in which additional uncertainty analysis (UA, comprising both reliability analysis and moment evaluations) is required. For most practical applications, however, UA is realized by Monte Carlo simulation (MCS) combined with structural analyses, which renders RBRDO computationally prohibitive. This work therefore focuses on the development of efficient and robust methodologies for RBRDO in the context of MCS. We present a polynomial chaos expansion (PCE) based MCS method for UA, in which the random response is approximated with the PCE; efficiency is mainly improved by avoiding repeated structural analyses. Unfortunately, this method is not well suited to high-dimensional problems, such as dynamic problems. To tackle this issue, we apply the convolution form to compute the dynamic response, using the PCE to approximate the modal properties (i.e. to solve the random eigenvalue problem), so that the dimension of uncertainties is reduced, since only structural random parameters are considered in the PCE model. Moreover, to avoid the modal intermixing problem when using MCS to solve the random eigenvalue problem, we adopt the MAC factor to quantify the intermixing and develop a univariable method to check which variable causes it, and thereafter to remove or reduce the problem. We propose a sequential RBRDO to improve efficiency and to overcome the non-convergence problem encountered in the framework of nested MCS-based RBRDO. In this sequential RBRDO, we extend the conventional sequential strategy, which mainly aims to decouple the reliability analysis from the optimization procedure, to also make the moment evaluations independent from the optimization procedure. A locally "first-order" exponential approximation around the current design is utilized to construct equivalently deterministic objective functions and probabilistic constraints. To efficiently calculate the coefficients, we develop an auxiliary-distribution-based reliability sensitivity analysis and a PCE-based moment sensitivity analysis. We investigate and demonstrate the effectiveness of the proposed methods for UA as well as RBRDO through several numerical examples. Finally, RBRDO is applied to the design of a tuned mass damper (TMD) in the context of passive vibration control, for both deterministic and uncertain structures. The optimal designs obtained by RBRDO not only reduce the variability of the response but also keep its amplitude below the prescribed threshold.
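The moment evaluations that PCE accelerates can be illustrated on a one-dimensional toy case (illustrative only; the thesis addresses the much harder high-dimensional and dynamic setting): project a response onto probabilists' Hermite polynomials and read the mean and variance directly off the coefficients, instead of re-running the structural model for every sample.

```python
import math
import random

def he(k, x):
    # probabilists' Hermite polynomials He_k (orthogonal for X ~ N(0,1))
    if k == 0:
        return 1.0
    h0, h1 = 1.0, x
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def pce_coefficients(f, order, n_samples=100000, seed=1):
    # c_k = E[f(X) He_k(X)] / k!  estimated by Monte Carlo, X ~ N(0,1)
    rng = random.Random(seed)
    sums = [0.0] * (order + 1)
    for _ in range(n_samples):
        x = rng.gauss(0.0, 1.0)
        fx = f(x)
        for k in range(order + 1):
            sums[k] += fx * he(k, x)
    return [sums[k] / n_samples / math.factorial(k) for k in range(order + 1)]

# toy response Y = X^2 + X: exactly He_2 + He_1 + He_0, so mean 1, variance 3
c = pce_coefficients(lambda x: x * x + x, order=3)
mean = c[0]                                                   # ~ 1
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))  # ~ 3
```

Once the coefficients are in hand, any number of further moment or response evaluations cost only polynomial arithmetic, which is the source of the efficiency gain described above.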
13

Price, Nathaniel Bouton. "Conception sous incertitudes de modèles avec prise en compte des tests futurs et des re-conceptions." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSEM012/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
At the initial design stage, engineers often rely on low-fidelity models that have high uncertainty. In a deterministic, safety-margin-based design approach, uncertainty is implicitly compensated for by using fixed conservative values in place of aleatory variables and by ensuring the design satisfies a safety margin with respect to the design constraints. After an initial design is selected, high-fidelity modeling is performed to reduce epistemic uncertainty and ensure the design achieves the targeted levels of safety. High-fidelity modeling is used to calibrate low-fidelity models and prescribe redesign when tests are not passed. After calibration, reduced epistemic model uncertainty can be leveraged through redesign to restore safety or improve design performance; however, redesign may be associated with substantial costs or delays. In this work, the possible effects of a future test and redesign are considered while the initial design is optimized using only a low-fidelity model. Chapters 1 and 2 of this manuscript present the context of the work and a literature review. Chapter 3 analyzes the dilemma of whether to start with a more conservative initial design and possibly redesign for performance, or to start with a less conservative initial design and risk redesigning to restore safety. Chapter 4 develops a generalized method for simulating a future test and possible redesign that accounts for spatial correlations in the epistemic model error. Chapter 5 discusses the application of the method to the design of a sounding rocket under mixed epistemic model uncertainty and aleatory parameter uncertainty. Chapter 6 concludes the work.
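The core simulation loop of such a methodology can be caricatured as follows (a toy sketch with an assumed Gaussian error model and a fixed redesign rule, not the thesis's calibrated procedure): sample the epistemic model error, let the future test reveal the true margin, and redesign whenever the test is not passed.

```python
import random

def simulate_future_test(margin_design, n_sim=10000, model_error_sd=0.1,
                         redesign_margin=0.3, seed=2):
    """Toy sketch: an epistemic model error e ~ N(0, sd) shifts the true
    safety margin; a future test reveals it, and we redesign (restoring the
    margin to redesign_margin) whenever the revealed margin is negative."""
    rng = random.Random(seed)
    n_redesign = n_fail = 0
    for _ in range(n_sim):
        true_margin = margin_design + rng.gauss(0.0, model_error_sd)
        if true_margin < 0.0:        # test not passed -> redesign
            n_redesign += 1
            true_margin = redesign_margin
        if true_margin < 0.0:        # failure after any redesign
            n_fail += 1
    return n_redesign / n_sim, n_fail / n_sim

p_redesign, p_fail = simulate_future_test(margin_design=0.15)
print(p_redesign, p_fail)
```

Varying `margin_design` in this loop reproduces, in miniature, the trade-off studied in Chapter 3: a less conservative initial margin lowers nominal cost but raises the probability that the future test triggers a redesign.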
14

Yaich, Ahmed. "Analyse de l’endommagement par fatigue et optimisation fiabiliste des structures soumises à des vibrations aléatoires." Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR05.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis deals with fatigue damage analysis and reliability-based design optimization (RBDO) of structures under random vibrations. The purpose of an RBDO method is to find the best compromise between cost and safety. Several methods, such as the hybrid method and the optimum safety factor (OSF) method, have been developed for this purpose; they have been applied to static cases and to some specific dynamic cases. In reality, however, structures are subjected to random vibrations that can cause fatigue damage. In this thesis, we present a numerical strategy, developed in our laboratory, for computing fatigue damage in the frequency domain based on the Sines criterion. We then propose an extension of RBDO methods to structures subjected to random vibrations, and an RHM method is also developed. Finally, we present an industrial application that models the mechanical part of a HALT chamber.
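For context, the classical narrow-band (Rayleigh) frequency-domain damage estimate, a common starting point for such analyses, can be sketched as follows (a uniaxial Miner's-rule illustration with made-up PSD and S-N values; the thesis itself builds on the multiaxial Sines criterion):

```python
import math

def spectral_moment(freqs, psd, k):
    # m_k = integral of f^k * S(f) df by the trapezoidal rule (f in Hz)
    m = 0.0
    for (f1, s1), (f2, s2) in zip(zip(freqs, psd), zip(freqs[1:], psd[1:])):
        m += 0.5 * (f1**k * s1 + f2**k * s2) * (f2 - f1)
    return m

def narrowband_damage(freqs, psd, T, b, C):
    """Narrow-band (Rayleigh) fatigue damage per Miner's rule:
    D = nu0 * T / C * (sqrt(2 * m0))**b * Gamma(1 + b/2)."""
    m0 = spectral_moment(freqs, psd, 0)
    m2 = spectral_moment(freqs, psd, 2)
    nu0 = math.sqrt(m2 / m0)  # mean upcrossing rate of the stress process
    return nu0 * T / C * (math.sqrt(2.0 * m0)) ** b * math.gamma(1.0 + b / 2.0)

# flat stress PSD between 10 and 20 Hz (illustrative values)
freqs = [10.0, 20.0]
psd = [4.0, 4.0]  # (MPa^2)/Hz
D = narrowband_damage(freqs, psd, T=3600.0, b=3.0, C=1e9)
print(D)  # cumulative damage over one hour; failure expected when D reaches 1
```

The appeal of such frequency-domain formulas in an RBDO loop is that the damage comes directly from the spectral moments of the response PSD, without simulating and rainflow-counting long time histories.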
15

Saad, Lara. "Optimisation du coût du cycle de vie des structures en béton armé." Thesis, Clermont-Ferrand 2, 2016. http://www.theses.fr/2016CLF22692/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Civil engineering structures, particularly reinforced concrete bridges, should be designed and managed to meet society's transport and communication needs. It is crucial to ensure that these structures function properly and safely, as damage during the service life can lead to transport disruption, catastrophic loss of property and casualties, as well as severe short- and long-term economic, social, and environmental impacts. Decision-makers undertake various activities to maintain adequate long-term performance and functionality while satisfying financial constraints. Ideally, they may employ optimization techniques to identify the trade-offs between minimizing the life-cycle cost (LCC) and maximizing the expected service life. This requires three challenging tasks: life-cycle analysis, reliability analysis, and structural optimization. Current approaches to the design and management of structures through life-cycle cost analysis (LCCA) highlight the following needs: (1) an integrated and systematic approach to model coherently the deterioration processes, the increasing traffic loads, aging, and the direct and indirect consequences of failure; (2) a joint consideration of the economic, structural, and stochastic dependencies between the elements of a structural system; (3) an adequate approach for deterioration dependencies and load redistribution between the elements; (4) an improved computation of system reliability, as a function of structural redundancy and configuration, that can take into account the dependencies between the elements; (5) design and maintenance optimization procedures that focus coherently on the robustness of the management decision and on the satisfaction of reliability requirements. The overall objective of this study is to provide improved LCCA procedures that can be applied to select optimal and robust design and maintenance decisions for new and existing reinforced concrete structures, minimizing both manager and user costs while providing the required safety over the structure's lifetime, and taking into account the most severe degradation processes and the dependencies between structural elements. In the first part of this thesis, a literature review of current probabilistic design and maintenance procedures is presented, and the LCC components are discussed. A new approach is then developed to evaluate user delay costs for a reinforced concrete bridge, based on the direct and indirect costs related to degradation and failure, and to integrate them into the life-cycle cost function in order to allow for probabilistic design. In addition, a coupled corrosion-fatigue model is considered in the design optimization. Afterward, a structural maintenance planning approach is developed that considers three types of interaction, namely economic, structural, and stochastic dependencies. The proposed model uses fault-tree analysis and conditional probabilities to reflect these dependencies in the maintenance planning. The consequences of degradation are evaluated, and a method is proposed to account for load redistribution. Moreover, a practical formulation is proposed for quantifying the reliability of a system of interrelated components, by means of a redundancy factor that can be computed by finite element analysis. Finally, a new optimization procedure is proposed that takes into account the uncertainties in the analysis and the structural ability to adapt to variability, unforeseen actions, or deterioration mechanisms. The proposed procedure accounts for uncertainty and variability in one consistent formulation, which is demonstrated through numerical applications. (...)
16

Moustapha, Maliki. "Métamodèles adaptatifs pour l'optimisation fiable multi-prestations de la masse de véhicules." Thesis, Clermont-Ferrand 2, 2016. http://www.theses.fr/2016CLF22670/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
One of the most challenging tasks in modern engineering is keeping the cost of manufactured goods low. With the advent of computational design, prototyping, for instance (a major source of expense), is reduced to its bare essentials: through the use of high-fidelity models, engineers can predict the behavior of the systems they design quite faithfully. To be fully realistic, such models must embed the uncertainties that may affect the physical properties or operating conditions of the system. This PhD thesis deals with the constrained optimization of structures under uncertainty in the context of automotive design, as part of the vehicle mass-reduction effort at PSA Peugeot Citroën. The constraints are assessed through expensive finite element models. For practical purposes, such models are conveniently substituted by so-called surrogate models, which stand as cheap, easy-to-evaluate proxies; in this thesis, Gaussian process modeling and support vector machines are considered. After reviewing state-of-the-art techniques for optimization under uncertainty, we propose a novel formulation for reliability-based design optimization that relies on quantiles, and we prove the formal equivalence of this formulation with the traditional ones. The approach is then coupled with surrogate modeling. Kriging is chosen for its built-in error estimate, which lends itself to adaptive sampling strategies: the computational budget is reduced by running the true model only in regions of interest to the optimization. We therefore propose a two-stage enrichment scheme. The first stage aims at globally reducing the Kriging epistemic uncertainty in the vicinity of the limit-state surface; the second is performed within the optimization iterations so as to locally improve the quantile accuracy. The efficiency of this approach is demonstrated through comparison with benchmark results. An industrial application featuring a car under frontal impact is considered; the crash behavior of a car is particularly affected by uncertainties. The proposed approach allows us to find a reliable solution within a reduced number of calls to the true finite element model. For the extreme case where uncertainties trigger distinct crash scenarios, support vector machines are used for classification to predict the possible scenarios before metamodeling each of them separately.
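The quantile reformulation can be illustrated with a crude Monte Carlo sketch (toy performance function and plain sampling; adaptive Kriging exists precisely to avoid this brute-force cost): the probabilistic constraint P[g(X) <= 0] >= alpha is replaced by the equivalent requirement that the alpha-quantile of g be non-positive.

```python
import random

def mc_quantile(sample_g, alpha, n=20000, seed=3):
    # empirical alpha-quantile of g(X) by plain Monte Carlo; the constraint
    # q_alpha(g) <= 0 stands in for P[g(X) <= 0] >= alpha
    rng = random.Random(seed)
    vals = sorted(sample_g(rng) for _ in range(n))
    return vals[min(int(alpha * n), n - 1)]

# toy performance function g = X - 3 with X ~ N(0,1); failure when g > 0
q = mc_quantile(lambda rng: rng.gauss(0.0, 1.0) - 3.0, alpha=0.99)
# q is about Phi^-1(0.99) - 3 = -0.67 < 0: the quantile constraint holds
print(q)
```

Because the quantile is an ordinary deterministic number for a given design, it can be treated by a standard constrained optimizer, which is the practical appeal of the formulation.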
17

Dammak, Khalil. "Prise en compte des incertitudes des problèmes en vibro-acoustiques (ou interaction fluide-structure)." Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR19/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This PhD thesis deals with the robust analysis and reliability-based design optimization of vibro-acoustic (fluid-structure interaction) problems, taking into account uncertainties in the input parameters. In the design and dimensioning phase, it is worthwhile to model vibro-acoustic systems together with their variability, which is mainly related to geometric imperfections and to the characteristics of the materials. It is therefore important, if not essential, to take into account the dispersion of these uncertain parameters in order to ensure a robust design. The purpose is thus to determine the capabilities and limitations, in terms of accuracy and computational cost, of methods based on polynomial chaos expansions in comparison with the reference Monte Carlo technique for studying the mechanical behavior of vibro-acoustic problems with uncertain parameters. Studying the propagation of these uncertainties allows their integration into the design phase. The goal of reliability-based design optimization (RBDO) is to find a compromise between minimum cost and a target reliability; several methods, such as the hybrid method (HM) and the optimum safety factor (OSF) method, have been developed to achieve this goal. To cope with the complexity of vibro-acoustic systems with uncertain parameters, we have developed methodologies specific to this problem, via meta-modeling methods, which allowed us to build a vibro-acoustic surrogate model that satisfies both efficiency and accuracy requirements. The objective of this thesis is to determine the best methodology to follow for the reliability-based design optimization of vibro-acoustic systems with uncertain parameters.
18

Lesobre, Romain. "Modélisation et optimisation de la maintenance et de la surveillance des systèmes multi-composants - Applications à la maintenance et à la conception de véhicules industriels." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAT022/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Ces travaux de thèse traitent des problèmes de maintenance associés aux véhicules industriels. Ils se concentrent sur la planification des opérations de maintenance et sur le développement d'une méthodologie de conception pour la maintenance. Le but est de proposer une offre de maintenance personnalisée en fonction de chaque véhicule et capable de s'adapter aux contraintes des utilisateurs. Dans l'industrie du transport, ces contraintes se caractérisent par un nombre d'opportunités de maintenance limité et des immobilisations à fortes conséquences financières. Cette offre a vocation à garantir un niveau de disponibilité élevé tout en réduisant l'impact de la maintenance sur les coûts globaux d'exploitation. Dans ce cadre, la politique de maintenance développée vise à assurer, moyennant un certain risque, l'autonomie d'un système multi-composant sur des périodes d'opérations données. Pendant ces périodes, aucune opération de maintenance et aucune défaillance du système ne doivent venir perturber la réalisation des missions. A la fin de chaque période, la politique considérée évalue la nécessité d'une intervention de maintenance pour assurer la prochaine période avec un niveau de confiance spécifié. Lorsque la maintenance est jugée indispensable, des critères intégrant les coûts et l'efficacité de la maintenance sont introduits pour sélectionner les opérations à réaliser. Cette forme originale de regroupement dynamique s'appuie à la fois sur les modèles de fiabilité des composants, sur la structure fiabiliste du système et sur les informations de surveillance disponibles en ligne. Celles-ci se composent d'informations liées à l'état de santé des composants mais également à leurs conditions d'utilisation. La flexibilité du processus permet d'intégrer, dans la décision, des niveaux d'informations différents suivant les composants. 
Les paramètres de cette politique, à savoir la longueur de la période et le niveau de confiance, sont optimisés en fonction du coût total de maintenance. Ce coût, évalué sur un horizon fini, intègre les coûts directs associés aux opérations de maintenance et les coûts indirects engendrés par les immobilisations. Pour envisager une réduction significative des coûts d'exploitation du système, l'optimisation de la politique de maintenance seule ne suffit pas. Il est primordial de mener une réflexion plus large associant le système et sa maintenance dès la conception. Pour diriger cette réflexion, la méthodologie de conception proposée hiérarchise, à l'aide d'un facteur d'importance original, l'impact des composants sur les coûts d'exploitation. Différentes options de conceptions sont ensuite évaluées, par simulation, sur les composants jugés prioritaires. Les options retenues conduisent à réduire les coûts globaux d'exploitation. Des résultats de simulation permettent d'illustrer les méthodes développées. Une application sur un sous-système du véhicule industriel est également réalisée
This thesis research work focuses on maintenance operations scheduling and the development of a design-for-maintenance methodology. The aim is to propose a maintenance service offer customized for each vehicle and able to adapt to user constraints. In the transport industry, these constraints are defined by a limited number of maintenance opportunities and by unplanned vehicle stops with significant financial consequences. This service offer should make it possible both to improve vehicle uptime and to reduce the impact of maintenance on operating costs. In this framework, the developed maintenance policy ensures, with a given risk probability, maintenance-free operating periods for a multi-component system. During these periods, the system should be able to carry out all its assigned missions without maintenance actions or system faults. At the end of each period, the considered policy evaluates whether a maintenance action is required to ensure maintenance-free and fault-free operation over the next period with a specified confidence level. When a maintenance action is mandatory, decision criteria considering maintenance costs and maintenance efficiency are used to select the operations to be performed. This form of dynamic clustering, called time-driven clustering, integrates the component reliability models, the system structure, and the available monitoring information. In our case, the monitoring information refers to component state information and information on the component operating conditions. The flexibility of the process makes it possible to base the maintenance decision on different information levels across system components. The policy parameters, namely the period length and the confidence level, are optimized based on the total maintenance cost. This cost, evaluated over a finite horizon, is composed of direct costs related to maintenance operations and indirect costs generated by system immobilizations. 
In order to reach a significant reduction in operating costs, optimizing the maintenance policy alone is not sufficient. A broader approach is essential, involving the system and its maintenance from the design stage. In this context, the developed design methodology prioritizes the components by their impact on operating costs. This prioritization is performed thanks to a purpose-built importance factor. Then, multiple design options are evaluated by simulation on the priority components. The selected options lead to reduced operating costs. This work contains simulation results that illustrate the methods mentioned above. Moreover, a heavy-vehicle subsystem is used as a test case.
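The period-by-period decision rule summarized in this abstract can be sketched minimally as follows (an illustration only, assuming Weibull component lifetimes, a series reliability structure, and hypothetical parameter values; the thesis itself works with richer component models and online monitoring information):

```python
import math

def component_reliability(age, period, beta=2.0, eta=1000.0):
    """Conditional Weibull reliability: survive the next period given current age."""
    weibull = lambda t: math.exp(-((t / eta) ** beta))
    return weibull(age + period) / weibull(age)

def maintenance_needed(ages, period, confidence, beta=2.0, eta=1000.0):
    """Series-system check: maintenance is required if the system cannot cover
    the next maintenance-free operating period at the required confidence level."""
    r_sys = 1.0
    for age in ages:
        r_sys *= component_reliability(age, period, beta, eta)
    return r_sys < confidence, r_sys

# Hypothetical fleet snapshot: three component ages (hours), a 300-hour period,
# and a 90 % confidence requirement.
needed, r_sys = maintenance_needed([200, 500, 800], 300, 0.9)
```

When the check triggers, cost and efficiency criteria would then select which components to group into the intervention, which is the clustering step described in the abstract.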
19

Abid, Fatma. "Contribution à la robustesse et à l'optimisation fiabiliste des structures. Uncertainty of shape memory alloy micro-actuator using generalized polynomial chaos method. Numerical modeling of shape memory alloy problem in presence of perturbation : application to Cu-Al-Zn-Mn specimen. An approach for the reliability-based design optimization of shape memory alloy structure. Surrogate models for uncertainty analysis of micro-actuator." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMIR24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The design of cost-effective structures has driven many advances in modeling and optimization, enabling the analysis of increasingly complex structures. However, designs optimized without considering parameter uncertainties may fail to meet certain reliability criteria. To ensure the proper functioning of the structure, it is important to take uncertainty into account from the design phase. Several theories exist in the literature for handling uncertainties. Structural reliability theory defines the failure probability of a structure as the probability that the proper operating conditions are not met; this study is called reliability analysis. Integrating reliability analysis into optimization problems constitutes a new discipline that introduces reliability criteria into the search for the optimal configuration of structures: this is the field of reliability-based design optimization (RBDO). The RBDO methodology thus aims to account for the propagation of uncertainties into the mechanical performance, relying on a probabilistic model of the fluctuations of the input parameters. In this framework, this thesis deals with the robust analysis and reliability-based optimization of complex mechanical problems. It is important to take the uncertain parameters of the system into account to ensure a robust design. The objective of the RBDO method is to design a structure that achieves a good compromise between cost and reliability assurance. Consequently, several methods, such as the hybrid method and the optimum safety factor method, have been developed to reach this goal.
To cope with the complexity of mechanical problems involving uncertain parameters, methodologies specific to this issue, such as meta-modeling methods, have been developed in order to build a mechanical surrogate model that satisfies both efficiency and accuracy requirements.
The design of economic systems has led to many advances in the fields of modeling and optimization, allowing the analysis of increasingly complex structures. However, designs optimized without considering parameter uncertainties may not meet certain reliability criteria. To ensure the proper functioning of the structure, it is important to take uncertainty into account from the design phase. Structural reliability theory defines the failure probability of a structure as the probability that the proper operating conditions are not met; this study is called reliability analysis. The integration of reliability analysis into optimization problems is a new discipline introducing reliability criteria into the search for the optimal configuration of structures: this is the domain of reliability-based design optimization (RBDO). This RBDO methodology aims to account for the propagation of uncertainties into the mechanical performance by relying on a probabilistic model of the input parameter fluctuations. In this context, this thesis focuses on robust analysis and reliability-based optimization of complex mechanical problems. It is important to consider the uncertain parameters of the system to ensure a robust design. The objective of the RBDO method is to design a structure that establishes a good compromise between cost and reliability assurance. As a result, several methods, such as the hybrid method and the optimum safety factor method, have been developed to achieve this goal. To address the complexity of mechanical problems with uncertain parameters, methodologies specific to this issue, such as meta-modeling methods, have been developed to build a mechanical surrogate model that satisfies both the efficiency and the accuracy requirements.
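As a concrete illustration of the reliability analysis underlying RBDO, the Hasofer-Lind reliability index can be computed with the classical HL-RF iteration (a generic textbook sketch in the standard normal space, not the specific hybrid or optimum-safety-factor formulations developed in the thesis; the limit state below is a made-up example):

```python
import numpy as np

def form_beta(g, grad_g, u0, tol=1e-6, max_iter=100):
    """Hasofer-Lind reliability index via the HL-RF iteration in standard normal space."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        grad = grad_g(u)
        # HL-RF update: project onto the linearized limit-state surface
        u_new = ((grad @ u - g(u)) / (grad @ grad)) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return float(np.linalg.norm(u))

# Linear limit state g(u) = 3 - u1 - u2 in the standard normal space:
beta = form_beta(lambda u: 3.0 - u[0] - u[1],
                 lambda u: np.array([-1.0, -1.0]),
                 np.zeros(2))
# For this linear case the exact index is 3 / sqrt(2).
```

The failure probability then follows as Φ(−β); the hybrid and optimum safety factor methods mentioned in the abstract replace or restructure this inner loop to cut its cost.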
20

Bouguila, Maissa. "Μοdélisatiοn numérique et οptimisatiοn des matériaux à changement de phase : applicatiοns aux systèmes cοmplexes." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMIR05.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Phase change materials (PCMs) show significant potential in the field of thermal management. These materials have a large thermal storage capacity. The excess heat dissipated by electronic components can cause serious failures. A cooling system based on phase change materials is one of the most recommended solutions to ensure the safe operation of these microelectronic components. However, the low conductivity of these materials is considered the major limitation to their use in thermal management applications. The main objective of this thesis is to improve the thermal conductivity of these materials and to optimize heat sinks. In the first chapters, numerical modeling is carried out to determine the optimal configuration of a heat sink based on the study of several parameters, such as fin insertion, nanoparticle dispersion, and the use of multiple phase change materials. The innovation of this study is the modeling of heat transfer in phase change materials with a nanoparticle concentration that is relatively high compared with the literature, and more specifically with experimental work. Interesting conclusions are drawn from this parametric study, which subsequently allows us to propose a new model based on multiple phase change materials enhanced with nanoparticles. Reliability-based optimization studies are then carried out. First, a single-objective reliability-based optimization study was conducted with the aim of proposing a reliable heat-sink model with multiple NANOMCPs and optimal dimensions.
The objective is thus to optimize (minimize) the total volume of the heat sink while accounting for the various geometric and functional constraints. The robust hybrid method (RHM) proves effective in proposing a reliable and optimal model compared with the deterministic design optimization (DDO) method and the various reliability-based design optimization (RBDO) methods considered. In addition to the novelty of the proposed model based on multiple NANOMCPs, the integration of a developed RBDO method (RHM) into a thermal management application is considered an innovation in the recent literature. Second, a multi-objective reliability-based optimization study is proposed. Two objectives are considered: the total volume of the heat sink and the discharge time needed to reach ambient temperature. The RHM optimization method and the non-dominated sorting genetic algorithm are adopted to search for the optimal and reliable model offering the best compromise between the two objectives. Furthermore, an advanced surrogate model is built in order to reduce the simulation time, given the large number of iterations required to reach an optimal model.
Phase-change materials exhibit considerable potential in the field of thermal management. These materials offer a significant thermal storage capacity. Excessive heat dissipated by miniature electronic components can lead to serious failures. A cooling system based on phase-change materials is among the most recommended solutions to guarantee the reliable performance of these microelectronic components. However, the low conductivity of these materials is considered a major limitation to their use in thermal management applications. The primary objective of this thesis is to address the challenge of improving the thermal conductivity of these materials. Numerical modeling is conducted in the first chapters to determine the optimal configuration of a heat sink, based on the study of several parameters such as fin insertion, nanoparticle dispersion, and the use of multiple phase-change materials. The innovation in this parametric study lies in the modeling of heat transfer in phase-change materials with a relatively high nanoparticle concentration compared to the low concentrations found in recent experimental research. Significant conclusions are deduced from this parametric study, enabling us to propose a new model based on multiple phase-change materials improved with nanoparticles (NANOMCP). Reliability-based optimization studies are then conducted. Initially, a single-objective reliability-based optimization study is carried out to propose a reliable and optimal model based on multiple NANOMCPs. The Robust Hybrid Method (RHM) yields a reliable and optimal model, compared with the Deterministic Design Optimization (DDO) method and various Reliability-Based Design Optimization (RBDO) methods. Furthermore, the integration of a developed RBDO method (RHM) into a thermal management application is considered an innovation in the recent literature. 
Additionally, a reliability-based multi-objective optimization study is proposed, considering two objectives: the total volume of the heat sink and the discharge time needed to reach ambient temperature. The RHM optimization method and a non-dominated sorting genetic algorithm (C-NSGA-II) are adopted to search for the optimal and reliable model that offers the best trade-off between the two objectives. Besides, an advanced metamodel is developed to reduce the simulation time, given the large number of iterations involved in finding the optimal model.
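The multi-objective trade-off described above rests on Pareto dominance. A minimal sketch of extracting the non-dominated front from candidate designs (illustrative only; the thesis uses a non-dominated sorting genetic algorithm together with the RHM, not this naive filter, and the candidate values below are hypothetical):

```python
def pareto_front(points):
    """Keep the non-dominated points when both objectives are minimized,
    e.g. (total volume, discharge time) pairs for candidate heat sinks."""
    front = []
    for p in points:
        # p is dominated if some q is no worse in both objectives and
        # strictly better in at least one
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (volume, discharge-time) candidates:
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(designs)
# (3.0, 4.0) is dominated by (2.0, 3.0); the rest form the trade-off front.
```

A genetic algorithm such as NSGA-II repeatedly applies this kind of dominance ranking, plus diversity preservation, to evolve the whole front at once.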
21

Dubourg, Vincent. "Méta-modèles adaptatifs pour l'analyse de fiabilité et l'optimisation sous contrainte fiabiliste." PhD thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00697026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis is a contribution to the solution of the reliability-based design optimization problem. This probabilistic design approach aims at taking into account the uncertainties inherent to the system to be designed, in order to propose solutions that are both optimal and safe. The safety level is quantified by a probability of failure. The optimization problem then consists in ensuring that this probability remains below a threshold set by the stakeholders. Solving this problem requires a large number of calls to the limit-state function characterizing the underlying reliability problem. Hence, this methodology becomes complex to apply as soon as the design relies on a numerical model that is expensive to evaluate (e.g. a finite element model). In this context, this manuscript proposes a strategy based on the adaptive substitution of the limit-state function by a Kriging meta-model. Particular attention is paid to quantifying, reducing, and finally eliminating the error made by using this meta-model in place of the original model. The proposed methodology is applied to the design of geometrically imperfect shells subject to buckling.
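The core idea of adaptively substituting an expensive limit-state function with a Kriging surrogate can be sketched as follows (illustrative only: a hand-rolled zero-mean Gaussian-process model with an AK-MCS-style U learning function and hypothetical parameters, not the error-quantification strategy developed in the thesis):

```python
import numpy as np

def gp_fit_predict(X, y, Xs, length=1.0, jitter=1e-8):
    """Minimal zero-mean Kriging with a Gaussian kernel: posterior mean/std at Xs."""
    k = lambda A, B: np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length ** 2)
    K = k(X, X) + jitter * np.eye(len(X))
    Ks = k(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.maximum(var, 0.0))

def adaptive_pf(g, n_mc=100_000, n_add=5, seed=0):
    """Estimate P[g(X) < 0], X ~ N(0,1), enriching the surrogate where it is least sure."""
    rng = np.random.default_rng(seed)
    pop = rng.standard_normal(n_mc)        # Monte Carlo population
    X = np.linspace(-4.0, 4.0, 12)         # initial design of experiments
    y = g(X)
    for _ in range(n_add):                 # add the most ambiguous sample
        mu, sd = gp_fit_predict(X, y, pop)
        best = np.argmin(np.abs(mu) / (sd + 1e-12))   # U learning function
        X = np.append(X, pop[best])
        y = np.append(y, g(pop[best]))     # one extra call to the true model
    mu, _ = gp_fit_predict(X, y, pop)
    return float(np.mean(mu < 0.0))

pf = adaptive_pf(lambda x: 2.0 - x)        # true P[X > 2] is about 0.0228
```

The expensive model is only evaluated at the initial design and at the handful of adaptively chosen points; all Monte Carlo classification runs on the cheap surrogate.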
22

Lin, Shu-Ping, and 林書平. "Parallelized Ensemble of Gaussian-based Reliability Analyses (PEoGRA) for Reliability-Based Design Optimization (RBDO)." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/04249081892380031251.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's thesis
Chung Yuan Christian University
Graduate Institute of Mechanical Engineering
104
Reliability-Based Design Optimization (RBDO) algorithms have been developed to solve design optimization problems in the presence of uncertainties. Traditionally, the original random design space is transformed into the standard normal design space, where the reliability index can be measured in a standardized unit. In the standard normal design space, the Modified Reliability Index Approach (MRIA) measures the minimum distance from the design point to the failure region to represent the reliability index; on the other hand, the Performance Measure Approach (PMA) performs an inverse reliability analysis to evaluate the target performance at a distance of one reliability index away from the design point. MRIA is able to provide stable and accurate reliability analyses, while PMA shows greater efficiency and is widely used in various engineering applications. However, the existing methods cannot properly perform reliability analysis in the standard normal design space if the transformation to that space does not exist or is difficult to determine. In particular, in image processing applications the transformation is hard to determine because of arbitrary distributions in the CIELAB space, and program speed is important when developing image processing algorithms. To this end, a new algorithm, the Parallelized Ensemble of Gaussian-based Reliability Analyses (PEoGRA), was developed to estimate the failure probability using Gaussian-based Kernel Density Estimation (KDE) in the original design space. The probabilistic constraints are formulated based on the reliability analysis of each kernel for the optimization process, and a multi-thread shared-memory framework, comprising a data access layer, a task assignment layer, and a reliability index estimation layer, is used to accelerate the program. This work proposes an efficient way to estimate the constraint gradient and linearly approximate the probabilistic constraints with fewer function evaluations. 
Several numerical examples with various random distributions are studied to investigate the numerical performance of the proposed method, and the program speed of PEoGRA is compared with that of EoGRA. Above all, the results show that PEoGRA is capable of finding correct solutions to some problems that cannot be solved by traditional methods, and that it can handle image processing applications at an acceptable speed.
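The kernel-based estimate at the heart of EoGRA/PEoGRA can be sketched as follows (a simplified illustration: a Gaussian KDE over observed samples, with one small Monte Carlo reliability estimate per kernel dispatched to a thread pool; the bandwidth and limit state are hypothetical, and the actual algorithm's constraint formulation and gradient estimation are not shown):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def kernel_pf(center, h, g, n_mc, seed):
    """Failure probability of one Gaussian KDE kernel, N(center, h^2 I)."""
    rng = np.random.default_rng(seed)
    x = center + h * rng.standard_normal((n_mc, center.size))
    return np.mean(g(x) < 0.0)

def peogra_pf(samples, h, g, n_mc=2000, workers=4):
    """Ensemble failure probability: the KDE density is a mixture of Gaussian
    kernels, so P_f is the average of the per-kernel estimates."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(kernel_pf, s, h, g, n_mc, i)
                   for i, s in enumerate(samples)]
        return float(np.mean([f.result() for f in futures]))

# Hypothetical example: 2-D samples of arbitrary origin, limit state g(x) = 3 - x1 - x2.
rng = np.random.default_rng(42)
samples = rng.standard_normal((500, 2))
pf = peogra_pf(samples, h=0.3, g=lambda x: 3.0 - x[:, 0] - x[:, 1])
```

Because each kernel's analysis is independent, the work parallelizes naturally, which is the point of the shared-memory multi-thread framework described in the abstract.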

To the bibliography