Dissertations / Theses on the topic 'Time-dependent reliability'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 17 dissertations / theses for your research on the topic 'Time-dependent reliability.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Buijs, Foekje Akke. "Time-dependent reliability analysis of flood defences." Thesis, University of Newcastle Upon Tyne, 2008. http://hdl.handle.net/10443/888.

Full text
Abstract:
The aim of this thesis is to investigate how the time-dependent behaviour of flood defence properties can be appropriately characterised and incorporated in a reliability-based approach. Such an approach is required in a maintenance optimisation framework for flood defence management. The first objective shows that existing structural reliability methods are suitable for the analysis and incorporation of asset time-dependent processes in flood defence (system) reliability. Recent progress on quantitative maintenance optimisation frameworks for flood defence management is drawn together and complemented by theory from other engineering disciplines. The second objective develops three types of importance measure to indicate the relevance of the time-dependent processes in the context of a rational maintenance optimisation approach. These importance measures support practical operational management as well as maintenance optimisation model design. The third objective develops a modelling methodology to describe asset time-dependent processes of flood defences by a statistical model. The first phase in the modelling methodology is problem formulation. The second, conceptualisation, phase is a five-step analysis of the asset time-dependent process. Firstly, existing field observations and scientific understanding are assembled. Secondly, the excitation, ancillary and affected features and the uncertainty types of the asset time-dependent process are analysed. The third step describes the character of the process conditional on the excitation. The fourth step analyses the dependencies between different asset time-dependent processes. The fifth step formulates alternative statistical models for the asset time-dependent process. The last phase in the modelling methodology is parameter estimation, calibration and model corroboration. Historical observations of asset time-dependent processes are scarce and can be used either for further extension of this phase or for Bayesian posterior updating. The fourth objective demonstrates the methods developed in this thesis in a (system) reliability model of the Dartford Creek to Swanscombe Marshes flood defence system along the Thames Estuary.
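As a rough illustration of the kind of time-dependent reliability calculation discussed above (not taken from the thesis), the following minimal Python sketch evaluates an assumed dike overtopping limit state by Monte Carlo, with the crest level settling over time as a stand-in for an asset time-dependent process; all distributions and numbers are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N = 200_000  # Monte Carlo samples per evaluated year

# Assumed overtopping limit state g = (crest level at time t) - (annual max water level)
crest0 = rng.normal(6.0, 0.10, N)             # initial crest level [m above datum]
settle = rng.lognormal(np.log(0.01), 0.4, N)  # uncertain settlement rate [m/year]

for t in (0, 10, 25, 50):
    water = rng.gumbel(4.2, 0.35, N)          # annual maximum water level [m above datum]
    g = (crest0 - settle * t) - water
    pf = np.mean(g < 0.0)
    beta = -norm.ppf(pf) if pf > 0 else float("inf")
    print(f"t = {t:2d} yr   annual Pf = {pf:.2e}   beta = {beta:.2f}")
```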
APA, Harvard, Vancouver, ISO, and other styles
2

Kerpicci, Kara Sibel. "Reliability-based Analysis Of Time-dependent Scouring At Bridge Abutments." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12610334/index.pdf.

Full text
Abstract:
Deterministic scour prediction equations for bridge abutments do not involve uncertainties coming from scouring parameters, and they only consider the effects of hydraulic parameters. However, in order to design bridge abutments safely, treatment of these uncertainties and evaluation of the possible risks are required. Two artificial neural network (ANN) models are constructed to describe the scouring phenomenon using the parameters of two different equations. The equation to be used in the reliability analysis is then determined according to the ANN modeling results. To conduct the reliability analysis, a Monte Carlo simulation technique is used in which different distributions and coefficients of variation are assigned to the random variables to examine their effects on reliability. It is observed that the probability distributions of the governing variables have no impact on reliability; however, the coefficients of variation of these variables do influence reliability.
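The Monte Carlo treatment of parameter uncertainty described above can be sketched as follows; the scour predictor, the distributions and the footing depth below are illustrative placeholders, not the equations or values used in the thesis.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
N = 100_000

# Illustrative random variables (distributions and coefficients of variation are assumptions)
flow_depth = rng.lognormal(np.log(2.0), 0.2, N)          # approach flow depth [m]
velocity   = np.clip(rng.normal(1.5, 0.3, N), 0, None)   # approach velocity [m/s]
model_err  = rng.normal(1.0, 0.15, N)                    # model uncertainty factor

# Hypothetical scour-depth predictor (a placeholder, not a real design equation)
scour_depth = model_err * 1.1 * flow_depth**0.4 * velocity**0.6

footing_depth = 2.5                                      # assumed foundation depth [m]
pf = np.mean(scour_depth > footing_depth)                # failure: scour exceeds footing depth
print(f"Pf = {pf:.4f}, reliability index beta = {-norm.ppf(pf):.2f}")
```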
APA, Harvard, Vancouver, ISO, and other styles
3

Hussin, Razaidi. "A statistical study of time dependent reliability degradation of nanoscale MOSFET devices." Thesis, University of Glasgow, 2017. http://theses.gla.ac.uk/8052/.

Full text
Abstract:
Charge trapping at the channel interface is a fundamental issue that adversely affects the reliability of metal-oxide semiconductor field effect transistor (MOSFET) devices. This effect represents a new source of statistical variability as these devices enter the nano-scale era. Recently, charge trapping has been identified as the dominant phenomenon leading to both random telegraph noise (RTN) and bias temperature instabilities (BTI). Thus, understanding the interplay between reliability and statistical variability in scaled transistors is essential to the implementation of a 'reliability-aware' complementary metal oxide semiconductor (CMOS) circuit design. In order to investigate statistical reliability issues, a methodology based on a simulation flow has been developed in this thesis that allows a comprehensive and multi-scale study of charge-trapping phenomena and their impact on transistor and circuit performance. The proposed methodology is implemented using the Gold Standard Simulations (GSS) technology computer-aided design (TCAD)-based design-technology co-optimization (DTCO) tool chain. The 70 nm bulk IMEC MOSFET and the 22 nm Intel fin-shaped field effect transistor (FinFET) have been selected as the targeted devices. The simulation flow starts by calibrating the device TCAD simulation decks against experimental measurements. This initial phase allows the identification of the physical structure and the doping distributions in the vertical and lateral directions, based on the modulation of the inversion layer's depth as well as the modulation of short channel effects. The calibration is further refined by taking statistical variability into account to match the statistical distributions of the transistors' figures of merit obtained by measurements. The TCAD simulation investigation of RTN and BTI phenomena is then carried out in the presence of several sources of statistical variability. The study extends further to the circuit simulation level by extracting compact models from the statistical TCAD simulation results. These compact models are collected in libraries, which are then utilised to investigate the impact of the BTI phenomenon, and its interaction with statistical variability, in a six-transistor static random access memory (6T-SRAM) cell. At the circuit level, figures of merit such as the static noise margin (SNM) and their statistical distributions are evaluated. The focus of this thesis is to highlight the importance of accounting for the interaction between statistical variability and statistical reliability in the simulation of advanced CMOS devices and circuits, in order to maintain predictivity and obtain quantitative agreement with measured data. The main findings of this thesis can be summarised as follows. Based on the analysis of the results, the dispersions of VT and ΔVT indicate that a change in device technology must be considered, from the planar MOSFET platform to a new device architecture such as FinFET or SOI; this result is due to the interplay between a single trapped charge and statistical variability, which has a significant impact on device operation and intrinsic parameters as transistor dimensions shrink further. The ageing process of transistors can be captured by using the trapped charge density at the interface and observing the VT shift; moreover, using statistical analysis one can highlight the extreme transistors and their probable effect on circuit or system operation. Finally, the influence of the pass-gate (PG) transistor in a 6T-SRAM cell gives a different trend of the mean static noise margin.
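A hedged toy sketch of the statistical picture described above, in which each device traps a Poisson-distributed number of charges and each trapped charge contributes an exponentially distributed threshold-voltage step; the trap count and per-trap step size are assumed values for illustration, not the thesis's TCAD results.

```python
import numpy as np

rng = np.random.default_rng(4)

n_devices  = 20_000
mean_traps = 3.0    # assumed average number of occupied traps per device
eta_mv     = 2.5    # assumed mean per-trap delta-VT contribution [mV]

n_traps = rng.poisson(mean_traps, n_devices)
# Total delta-VT per device = sum of its per-trap (exponential) contributions
dvt = np.array([rng.exponential(eta_mv, k).sum() for k in n_traps])

print(f"mean dVT = {dvt.mean():.2f} mV")
print(f"99.9th percentile dVT = {np.percentile(dvt, 99.9):.2f} mV  (the 'extreme' devices)")
```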
APA, Harvard, Vancouver, ISO, and other styles
4

Hilsmeier, Todd Andrew. "Characterization of time-dependent component reliability and availability effects due to aging /." The Ohio State University, 1998. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487950153601096.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Morita, Lia Hanna Martins. "Degradation modeling for reliability analysis with time-dependent structure based on the inverse gaussian distribution." Universidade Federal de São Carlos, 2017. https://repositorio.ufscar.br/handle/ufscar/9120.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Conventional reliability analysis techniques are focused on the occurrence of failures over time. However, in certain situations where the occurrence of failures is small or almost nil, the estimation of the quantities that describe the failure process is compromised. In this context, degradation models were developed, which take as experimental data not the failure itself but some measurable quality characteristic related to it. Degradation analysis can provide information about the components' lifetime distribution without actually observing failures. In this thesis we proposed different methodologies for degradation data based on the inverse Gaussian distribution. Initially, we introduced the inverse Gaussian deterioration rate model for degradation data and a study of its asymptotic properties with simulated data. We then proposed an inverse Gaussian process model with frailty as a feasible tool to explore the influence of unobserved covariates, and a comparative study with the traditional inverse Gaussian process based on simulated data was made. We also presented a mixture inverse Gaussian process model for burn-in tests, whose main interest is to determine the burn-in time and the optimal cutoff point that screens out the weak units from the normal ones in a production line, and a misspecification study was carried out with the Wiener and gamma processes. Finally, we considered a more flexible model with a set of cutoff points, wherein the misclassification probabilities are obtained by an exact method with the bivariate inverse Gaussian distribution or an approximate method based on copula theory. The application of the methodology was based on three real datasets from the literature: the degradation of LASER components, locomotive wheels and cracks in metals.
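A minimal sketch of the inverse Gaussian process idea: paths with independent inverse Gaussian increments are simulated and a failure time is recorded when a path first crosses an assumed critical degradation level. The linear mean function, shape parameter and threshold below are illustrative assumptions, not fitted values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ig_paths(n_paths=5000, t_max=10.0, n_steps=200, a=1.2, eta=4.0):
    """Simulate inverse Gaussian process paths with mean function Lambda(t) = a * t.
    An increment over [s, t] is IG-distributed with mean a*(t - s) and shape
    eta*(a*(t - s))**2, so increments are independent and non-negative."""
    t = np.linspace(0.0, t_max, n_steps + 1)
    d_lam = np.diff(a * t)                                         # mean of each increment
    incr = rng.wald(mean=d_lam, scale=eta * d_lam**2, size=(n_paths, n_steps))
    paths = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(incr, axis=1)], axis=1)
    return t, paths

threshold = 8.0                       # assumed critical degradation level
t, paths = simulate_ig_paths()
crossed = paths >= threshold
first = np.where(crossed.any(axis=1), crossed.argmax(axis=1), -1)
failure_time = np.where(first >= 0, t[first], np.inf)
print("Estimated P(failure by t = 10):", np.isfinite(failure_time).mean())
```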
APA, Harvard, Vancouver, ISO, and other styles
6

Baingo, Darek. "A Framework for Stochastic Finite Element Analysis of Reinforced Concrete Beams Affected by Reinforcement Corrosion." Thèse, Université d'Ottawa / University of Ottawa, 2012. http://hdl.handle.net/10393/23063.

Full text
Abstract:
Corrosion of reinforcing bars is the major cause of deterioration of reinforced concrete (RC) structures in North America, Europe, the Middle East, and many coastal regions around the world. This deterioration leads to a loss of serviceability and functionality and ultimately affects structural safety. The objective of this research is to formulate and implement a general stochastic finite element analysis (SFEA) framework for the time-dependent reliability analysis of RC beams with corroding flexural reinforcement. The framework is based on the integration of nonlinear finite element and reliability analyses through an iterative response surface methodology (RSM). Corrosion-induced damage is modelled through the combined effects of gradual loss of the cross-sectional area of the steel reinforcement and the reduction of bond between steel and concrete for increasing levels of corrosion. Uncertainties in corrosion rate, material properties, and imposed actions are modelled as random variables. Effective implementation of the framework is achieved by coupling commercial finite element and reliability software. Application of the software is demonstrated through a case study of a simply supported RC girder with tension reinforcement subjected to the effects of uniform (general) corrosion, in which two limit states are considered: (i) a deflection serviceability limit state and (ii) a flexural strength ultimate limit state. The results of the case study show that general corrosion leads to a very significant decrease in the reliability of the RC beam, both in terms of flexural strength and maximum deflections. The loss of strength and serviceability was shown to be predominantly caused by the loss of bond strength, whereas the gradual reduction of the cross-sectional area of the tension reinforcement was found to be insignificant. The load-deflection response is also significantly affected by the deterioration of bond strength (flexural strength and stiffness). The probability of failure at the end of service life, due to the effects of uniform corrosion-induced degradation, is observed to be approximately an order of magnitude higher than in the absence of corrosion. Furthermore, the results suggest that the flexural resistance of corroded RC beams is controlled by the anchorage (bond) of the bars and not by the yielding of fully bonded tensile reinforcement at failure. This is significant since the end regions can be severely corroded due to chloride, moisture, and oxygen access at connections and expansion joints. The research strongly suggests that bond damage must be considered in the assessment of the time-dependent reliability of RC beams subjected to general corrosion.
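A hedged stand-in for the time-dependent reliability calculation described above: a toy flexural limit state in which the tension-steel area decays with an uncertain corrosion rate, evaluated by plain Monte Carlo rather than the thesis's nonlinear finite element and response surface machinery; all variables and numbers are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N = 200_000

# Assumed random variables (illustrative values only)
fy     = rng.normal(460e6, 30e6, N)           # steel yield strength [Pa]
As0    = rng.normal(1.5e-3, 0.05e-3, N)       # initial tension-steel area [m^2]
icorr  = rng.lognormal(np.log(2.0), 0.5, N)   # corrosion rate [uA/cm^2]
M_load = rng.gumbel(180e3, 20e3, N)           # annual max bending moment [N*m]

d_eff = 0.45                                  # effective depth [m], assumed deterministic
for t in (0, 10, 20, 30, 40, 50):
    # Uniform corrosion: section loss roughly proportional to icorr * t (toy model)
    As_t = np.clip(As0 * (1.0 - 0.005 * icorr * t), 0.0, None)
    M_res = 0.9 * As_t * fy * d_eff           # simplified flexural resistance
    pf = np.mean(M_res < M_load)
    beta = -norm.ppf(pf) if pf > 0 else np.inf
    print(f"t = {t:2d} yr   Pf = {pf:.2e}   beta = {beta:.2f}")
```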
APA, Harvard, Vancouver, ISO, and other styles
7

Emam, Emam. "UTILIZING A REAL LIFE DATA WAREHOUSE TO DEVELOP FREEWAY TRAVEL TIME RELIABILITY STOCHASTIC MODELS." Doctoral diss., University of Central Florida, 2006. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3987.

Full text
Abstract:
During the 20th century, transportation programs were focused on the development of the basic infrastructure for transportation networks. In the 21st century, the focus has shifted to the management and operations of these networks. Transportation network reliability measures play an important role in judging the performance of the transportation system and in evaluating the impact of new Intelligent Transportation Systems (ITS) deployments. The measurement of transportation network travel time reliability is imperative for providing travelers with accurate route guidance information. It can be applied to generate the shortest path (or alternative paths) connecting origins and destinations, especially under conditions of varying demands and limited capacities. The measurement of transportation network reliability is a complex issue because it involves both the infrastructure and the behavioral responses of the users. The subject is also challenging because there is no single agreed-upon reliability measure. This dissertation developed a new method for estimating the effect of travel demand variation and link capacity degradation on the reliability of a roadway network. The method is applied to a hypothetical roadway network, and the results show that both travel time reliability and capacity reliability are consistent measures for the reliability of the road network, but each may have a different use. The capacity reliability measure is of special interest to transportation network planners and engineers because it addresses the issue of whether the available network capacity relative to the present or forecast demand is sufficient, whereas travel time reliability is especially interesting for network users. The new travel time reliability method is sensitive to the users' perspective since it reflects that an increase in segment travel time should always result in lower travel time reliability, and it is an indicator of the operational consistency of a facility over an extended period of time. This initial theoretical effort and basic research was followed by applying the new method to the I-4 corridor in Orlando, Florida. This dissertation utilized a real-life transportation data warehouse to estimate the travel time reliability of the I-4 corridor. Four different travel time stochastic models were tested: Weibull, exponential, lognormal, and normal. Lognormal was the best-fit model. Unlike mechanical equipment, no freeway segment can realistically be traversed in zero seconds, no matter how fast the vehicles are, so an adjustment of the location parameter of the developed best-fit statistical model (lognormal) was needed to estimate travel time reliability accurately. The adjusted model can be used to compute and predict the travel time reliability of freeway corridors and report this information in real time to the public through traffic management centers. Compared to the existing Florida Method and the California Buffer Time Method, the new reliability method showed higher sensitivity to geographical location, which reflects the level of congestion and bottlenecks. The major advantages of this new method to practitioners and researchers over the existing methods are its ability to estimate travel time reliability as a function of departure time, and that it treats travel time as a continuous variable that captures the variability experienced by individual travelers over an extended period of time. As such, the new method developed in this dissertation could be utilized in transportation planning and freeway operations for estimating the important travel time reliability measure of performance. The impact of segment length on travel time reliability calculations was then investigated utilizing the wealth of data available in the I-4 data warehouse. The developed travel time reliability models showed significant evidence of a relationship between segment length and the accuracy of the results: the longer the segment, the less accurate the travel time reliability estimates. Accordingly, long segments (e.g., 25 miles) are more appropriate for planning purposes as a macroscopic performance measure of the freeway corridor, while short segments (e.g., 5 miles) are more appropriate for the evaluation of freeway operations as a microscopic performance measure. Further, this dissertation explored the impact of relaxing an important assumption in reliability analysis: link independence. In real life, assuming that link failures on a road network are statistically independent is dubious. The failure of a link in one particular area does not necessarily result in the complete failure of a neighboring link, but may lead to deterioration of its performance. The "Cause-Based Multimode Model" (CBMM) has been used to address link dependency in communication networks. However, the transferability of this model to transportation networks had not been tested, and this approach had not previously been considered in the calculation of transportation network reliability. This dissertation presented the CBMM and applied it to predict the travel time reliability with which an origin demand can reach a specified destination under dependent, multimode link failure conditions. The new model addressed the multi-state system reliability analysis of transportation networks for which one cannot formulate an "all or nothing" type of failure criterion and in which dependent link failures are considered. The results demonstrated that the newly developed method has true potential and can easily be extended to large-scale networks as long as the data are available. More specifically, the analysis of a hypothetical network showed that the dependency assumption is very important for obtaining more reasonable travel time reliability estimates of links, paths, and the entire network. The results showed a large discrepancy between the dependency and independency analysis scenarios. Realistic scenarios that considered the dependency assumption were on the safe side, which is important for transportation network decision makers, and could also aid travelers in making better choices. In contrast, deceptive information caused by the independency assumption could add to travelers' anxiety associated with the unknown length of delay, which normally reflects negatively on highway agencies and the management of taxpayers' resources.
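A small sketch of the lognormal travel time reliability idea with a shifted location parameter; the travel time sample below is synthetic and the acceptable-time threshold is an assumption, so this only illustrates the general fitting step, not the I-4 results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic segment travel times [minutes]; a real analysis would use detector data
travel_times = 6.0 + rng.lognormal(mean=np.log(3.0), sigma=0.5, size=2000)

# Fit a three-parameter lognormal; the location (shift) keeps travel time above a
# physically meaningful minimum instead of allowing zero-second traversals
shape, loc, scale = stats.lognorm.fit(travel_times)

threshold = 12.0  # minutes; assumed acceptable travel time for this segment
reliability = stats.lognorm.cdf(threshold, shape, loc=loc, scale=scale)
print(f"Fitted location (minimum travel time) ~ {loc:.2f} min")
print(f"P(travel time <= {threshold} min) = {reliability:.3f}")
```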
Ph.D.
Department of Civil and Environmental Engineering
Engineering and Computer Science
Civil Engineering
APA, Harvard, Vancouver, ISO, and other styles
8

Cheng, Danling. "Integrated System Model Reliability Evaluation and Prediction for Electrical Power Systems: Graph Trace Analysis Based Solutions." Diss., Virginia Tech, 2009. http://hdl.handle.net/10919/28944.

Full text
Abstract:
A new approach to the evaluation of the reliability of electrical systems is presented. In this approach, Graph Trace Analysis is applied to integrated system models and reliability analysis. The analysis zones are extended from the traditional power system functional zones. The systems are modeled using containers with iterators, where the iterators manage graph edges and are used to traverse the topology of the graph. The analysis provides a means of computationally handling dependent outages and cascading failures. The effects of adverse weather, time-varying loads, equipment age, installation environment, and operating conditions are considered. Sequential Monte Carlo simulation is used to evaluate the reliability changes for different system configurations, including distributed generation and transmission lines. Historical weather records and loading are used to update the component failure rates on the fly. Simulation results are compared against historical reliability field measurements. Given a large and complex plant to operate, a real-time understanding of the networks and their situational reliability is important for operational decision support. This dissertation also introduces the use of an Integrated System Model to help operators minimize real-time problems. A real-time simulation architecture is described, which predicts where problems may occur, how serious they may be, and what the possible root cause is.
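A toy sequential Monte Carlo sketch in the spirit of the weather-adjusted failure rates mentioned above, for a single component; the base failure rate, storm scaling, storm probability and repair time are illustrative assumptions, not the dissertation's calibrated data.

```python
import numpy as np

rng = np.random.default_rng(3)

def sequential_mc_availability(years=1000, lam_base=0.1, storm_factor=5.0,
                               storm_prob=0.05, repair_hours=4.0):
    """Toy sequential Monte Carlo for one component: in each simulated year the
    failure rate is scaled up if a 'storm year' is drawn, mimicking on-the-fly
    weather-adjusted failure rates; repairs take an exponential time."""
    hours = 8760.0
    downtime = np.zeros(years)
    for y in range(years):
        lam = lam_base * (storm_factor if rng.random() < storm_prob else 1.0)  # failures/yr
        n_fail = rng.poisson(lam)
        downtime[y] = rng.exponential(repair_hours, size=n_fail).sum()
    return 1.0 - downtime.mean() / hours

print(f"Estimated availability: {sequential_mc_availability():.6f}")
```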
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
9

Miller, Ian Timothy. "Probabilistic finite element modeling of aerospace engine components incorporating time-dependent inelastic properties for ceramic matrix composite (CMC) materials." Akron, OH : University of Akron, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=akron1144941702.

Full text
Abstract:
Thesis (M.S.)--University of Akron, Dept. of Mathematics, 2006.
"May, 2006." Title from electronic thesis title page (viewed 11/29/2007) Advisor, Vinod Arya; Co-Advisor, Ali Hajjafar; Faculty reader, Shantaram S. Pai; Department Chair, Kevin Kreider; Dean of the College, Ronald F. Levant; Dean of the Graduate School, George R. Newkome. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
10

Zhu, Weiqi, and ycqq929@gmail com. "An Investigation into Reliability Based Methods to Include Risk of Failure in Life Cycle Cost Analysis of Reinforced Concrete Bridge Rehabilitation." RMIT University. Civil, Environmental and Chemical Engineering, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080822.140447.

Full text
Abstract:
Reliability-based life cycle cost analysis is becoming an important consideration for decision-making in relation to bridge design, maintenance and rehabilitation. An optimal solution should ensure reliability during service life while minimizing the life cycle cost. Risk of failure is an important component in the whole-of-life cycle cost for both new and existing structures. The research work presented here aimed to develop a methodology for evaluation of the risk of failure of reinforced concrete bridges to assist in decision making on rehabilitation. The methodology proposed here combines fault tree analysis and probabilistic time-dependent reliability analysis to achieve qualitative and quantitative assessment of the risk of failure. Various uncertainties are considered, including the degradation of resistance due to initiation of a particular distress mechanism, increasing load effects, changes in resistance as a result of rehabilitation, environmental variables, material properties and model errors. It was shown that the proposed methodology has the ability to provide users with two alternative approaches for qualitative or quantitative assessment of the risk of failure, depending on the availability of detailed data. This work will assist the managers of bridge infrastructure in making decisions in relation to the optimization of rehabilitation options for aging bridges.
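The combination of fault tree logic with time-dependent failure probabilities can be sketched as below, assuming independent basic events; the gate structure and the exponential-type probability curves are illustrative assumptions, not the thesis's bridge model.

```python
import numpy as np

def p_and(*ps):
    """Probability of an AND gate output, assuming independent basic events."""
    return np.prod(np.asarray(ps), axis=0)

def p_or(*ps):
    """Probability of an OR gate output, assuming independent inputs."""
    return 1.0 - np.prod(1.0 - np.asarray(ps), axis=0)

t = np.linspace(0, 50, 11)                     # years in service
p_corrosion = 1 - np.exp(-0.02 * t)            # assumed initiation of corrosion distress
p_overload  = 1 - np.exp(-0.005 * t)           # assumed load-effect exceedance
p_fatigue   = 1 - np.exp(-0.001 * t**1.5)      # assumed fatigue-type degradation

# Toy top event: (corrosion AND overload) OR fatigue
p_top = p_or(p_and(p_corrosion, p_overload), p_fatigue)
for ti, pi in zip(t, p_top):
    print(f"t = {ti:4.0f} yr   P(top event) = {pi:.4f}")
```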
APA, Harvard, Vancouver, ISO, and other styles
11

Deng, Yingjun. "Degradation modeling based on a time-dependent Ornstein-Uhlenbeck process and prognosis of system failures." Thesis, Troyes, 2015. http://www.theses.fr/2015TROY0004/document.

Full text
Abstract:
This thesis is dedicated to describing, predicting and preventing system failures. It consists of four issues: i) stochastic degradation modeling, ii) prognosis of system failures, iii) failure level estimation and iv) maintenance optimization. The time-dependent Ornstein-Uhlenbeck (OU) process is introduced for degradation modeling. The time-dependent OU process is attractive because of its statistical properties, namely its controllable mean, variance and correlation. Based on such a process, the first passage time to a pre-set failure level is considered as the system failure time. Different methods are then proposed for the prognosis of system failures, which can be classified into three categories: analytical approximations, numerical algorithms and Monte Carlo simulation methods. Moreover, the failure level is estimated from the lifetime distribution by solving inverse first passage problems. This bridges the potential gap between failure records and degradation records, reinforcing the prognosis process based on first passage problems. Building on the prognosis of system failures, maintenance optimization for a continuously monitored system is performed. By introducing first passage problems, the arrangement of preventive maintenance is simplified. The maintenance decision rule is based on a virtual failure level, which is the solution of an optimization problem for the proposed objective functions.
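A minimal Monte Carlo sketch (one of the three method categories mentioned above) for the first passage time of a time-dependent OU degradation process; the drift, volatility and failure level below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(11)

def ou_first_passage(n_paths=20_000, t_max=10.0, dt=0.01, level=2.5, x0=0.0):
    """Euler-Maruyama simulation of a time-dependent OU degradation process
        dX_t = theta(t) * (m(t) - X_t) dt + sigma(t) dB_t
    and Monte Carlo estimation of the first passage time past 'level'.
    The drift and volatility functions here are illustrative assumptions."""
    theta = lambda t: 0.5
    m     = lambda t: 0.4 * t            # rising mean degradation trend
    sigma = lambda t: 0.3 + 0.02 * t     # slowly growing volatility
    n_steps = int(t_max / dt)
    x = np.full(n_paths, x0)
    fpt = np.full(n_paths, np.inf)
    for i in range(n_steps):
        t = i * dt
        alive = ~np.isfinite(fpt)
        dW = rng.normal(0.0, np.sqrt(dt), alive.sum())
        x[alive] += theta(t) * (m(t) - x[alive]) * dt + sigma(t) * dW
        newly = alive & (x >= level)
        fpt[newly] = t + dt
    return fpt

fpt = ou_first_passage()
failed = fpt[np.isfinite(fpt)]
print("P(failure before t = 10):", failed.size / fpt.size)
print("Median first passage time of failed paths:", np.median(failed))
```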
APA, Harvard, Vancouver, ISO, and other styles
12

Chen, Chang-Chih. "System-level modeling and reliability analysis of microprocessor systems." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53033.

Full text
Abstract:
Frontend and backend wearout mechanisms are major reliability concerns for modern microprocessors. In this research, a framework which contains modules for negative bias temperature instability (NBTI), positive bias temperature instability (PBTI), hot carrier injection (HCI), gate-oxide breakdown (GOBD), backend time-dependent dielectric breakdown (BTDDB), electromigration (EM), and stress-induced voiding (SIV) is proposed to analyze the impact of each wearout mechanism on state-of-art microprocessors and to accurately estimate microprocessor lifetimes due to each wearout mechanism. Taking into account the detailed thermal profiles, electrical stress profiles and a variety of use scenarios, composed of a fraction of time in operation, a fraction of time in standby, and a fraction of time when the system is off, this work provides insight into lifetime-limiting wearout mechanisms, along with the reliability-critical microprocessor functional units for a system. This enables circuit designers to know if their designs will achieve an adequate lifetime and further make any updates in the designs to enhance reliability prior to committing the designs to manufacture.
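A hedged sketch of how per-mechanism lifetimes and a use scenario might be combined in a competing-risks fashion; the characteristic lives, Weibull slope, duty-cycle fractions and standby factors below are invented for illustration and are not the framework's calibrated models.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 50_000

# Assumed characteristic lives (years) per wearout mechanism under full-time operation,
# plus a use scenario: fractions of time in operation / standby / off.  Standby is
# assumed to age BTI-type mechanisms at a reduced rate; off time causes no ageing.
char_life = {"NBTI": 15.0, "PBTI": 20.0, "HCI": 25.0, "GOBD": 30.0, "EM": 35.0}
f_on, f_standby, f_off = 0.4, 0.5, 0.1
standby_factor = {"NBTI": 0.3, "PBTI": 0.3, "HCI": 0.0, "GOBD": 0.1, "EM": 0.0}

samples = []
for mech, eta in char_life.items():
    duty = f_on + standby_factor[mech] * f_standby      # effective ageing duty cycle
    t = eta / max(duty, 1e-9) * rng.weibull(2.0, N)     # Weibull(beta = 2) lifetimes
    samples.append(t)

system_life = np.min(samples, axis=0)                   # competing risks: first mechanism to fail
print(f"Median system lifetime under this use scenario: {np.median(system_life):.1f} yr")
```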
APA, Harvard, Vancouver, ISO, and other styles
13

Ahmadivala, Morteza. "Vers une planification optimale de la maintenance des structures existantes sur la base d'une analyse de fiabilité en fonction du temps." Thesis, Université Clermont Auvergne‎ (2017-2020), 2020. http://theses.bu.uca.fr/nondiff/2020CLFAC056_AHMADIVALA.pdf.

Full text
Abstract:
Civil engineering structures play an important role in any country in improving the economy together with social and environmental welfare. An unwanted failure can have significant impacts at different levels for the structure owner and for users. Fatigue is one of the main degradation processes in steel structures that causes structural failure before the end of the designed service life. To avoid unexpected failures due to fatigue, comprehensive structural Life Cycle Management (LCM) is required to minimize the life-cycle cost and maximize the structural service life. One of the main objectives within LCM relates to optimizing structural maintenance planning. Achieving this goal is a challenging task which requires addressing challenges such as predicting structural performance under uncertainty, employing Structural Health Monitoring (SHM) data to reduce uncertainties, taking into account the crack propagation behaviour of given components, reliability- and cost-informed decision making, and the effect of maintenance actions, among others. Accordingly, the following contributions are considered in this research to improve the capabilities of structural LCM: developing a new time-dependent reliability method for fatigue reliability analysis; investigating the effectiveness of advanced crack propagation tools to study unwanted fatigue cracking problems and characterizing some possible repair actions on a real case study; and introducing the assumptions and simplification steps required to integrate the proposed time-dependent reliability method with crack propagation models in order to approximate the time-dependent fatigue reliability. As the first contribution of this thesis, a new time-dependent reliability method called AK-SYS-t is proposed. This method provides an efficient and accurate tool to evaluate the time-dependent reliability of a component compared to other available methods. AK-SYS-t relates time-dependent reliability to system reliability problems and exploits efficient system reliability methods such as AK-SYS for time-dependent reliability analysis. It is worth mentioning that time-dependent reliability analysis is necessary in this context, since performance deterioration (such as fatigue) is a time-dependent process associated with time-dependent parameters such as fatigue loading. Another related topic is the study of the crack propagation phenomenon with advanced modelling tools such as the Finite Element Method (FEM) and the eXtended Finite Element Method (XFEM). For illustration purposes, a crack at the root of a fillet weld is considered (a common fatigue detail in bridges with orthotropic deck plates). One important issue investigated herein is the influence of the transversal tension in the deck plate on the direction of crack propagation. It is shown how increasing the transversal tension in the deck plate may steer crack propagation towards the deck plate. Such cracks are considered dangerous since they are hard to inspect and detect. In the end, XFEM is used to investigate the effectiveness of two possible repair solutions. A supplementary contribution introduces the steps required to integrate the newly developed time-dependent reliability method with crack propagation problems through application examples. This is a challenging task since performing the time-dependent reliability analysis for such problems requires a cycle-by-cycle calculation of stress intensity factors, which demands huge computational resources. Therefore, the aim here is to introduce the assumptions and simplification steps needed to adopt AK-SYS-t for fatigue reliability analysis. Accordingly, two examples are considered. (...)
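The core idea of treating time-dependent reliability as a system (first-excursion) problem over a discretized time grid can be sketched by brute-force Monte Carlo, as below; this is a crude stand-in for AK-SYS-t (which replaces the expensive limit state with an adaptively trained Kriging surrogate), and the limit state and distributions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Discretize the service period: the time-dependent problem becomes a series-system
# problem over instants t_1..t_m (failure occurs if g(X, t) < 0 at ANY instant).
t_grid = np.linspace(0.0, 30.0, 31)                 # years, assumed discretization
N = 50_000                                          # Monte Carlo samples

R0    = rng.normal(8.0, 0.8, N)                     # initial resistance (assumed)
decay = rng.lognormal(np.log(0.03), 0.3, N)         # degradation rate per year (assumed)
S     = rng.normal(5.0, 0.7, (N, t_grid.size))      # time-variant load effect (assumed)

g = (R0[:, None] - decay[:, None] * t_grid[None, :]) - S   # limit state at every instant
pf_cum = np.mean(g.min(axis=1) < 0.0)               # first-excursion (series-system) failure
print(f"Cumulative failure probability over 30 years: {pf_cum:.4f}")
```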
APA, Harvard, Vancouver, ISO, and other styles
14

Liang, Bin. "Estimation of Time-dependent Reliability of Suspension Bridge Cables." Thesis, 2016. https://doi.org/10.7916/D8V69JTN.

Full text
Abstract:
The reliability of the main cable of a suspension bridge is crucial to the reliability of the entire bridge. Throughout the life of a suspension bridge, its main cables are subject to corrosion due to various factors, and the deterioration of strength is a slowly evolving and dynamic process. The goal of this research is to find the pattern of how the strength of steel wires inside a suspension bridge cable changes with time. Two methodologies are proposed based on the analysis of five data sets which were collected by testing pristine wires, artificially corroded wires, and wires taken from three suspension bridges: Severn Bridge, Forth Road Bridge and Williamsburg Bridge. The first methodology is to model wire strength as a random process in space whose marginal probability distribution and power spectral density evolve with time. Both the marginal distribution and the power spectral density are parameterized with time-dependent parameters. This enables the use of Monte Carlo methods to estimate the failure probability of wires at any given time. An often encountered problem -- the incompatibility between the non-Gaussian marginal probability distribution and prescribed power spectral density -- which arises when simulating non-Gaussian random processes using translational field theory, is also studied. It is shown by copula theory that the selected marginal distribution imposes restrictions on the selection of power spectral density function. The second methodology is to model the deterioration rate of wire strength as a stochastic process in time, under Ito's stochastic calculus framework. The deterioration rate process is identified as a mean-reversion stochastic process taking non-negative values. It is proposed that the actual deterioration of wire strength depends on the deterioration rate, and may also depend on the state of the wire strength itself. The probability distribution of wire strength at any given time can be obtained by integrating the deterioration rate process. The model parameters are calibrated from the available data sets by matching moments or minimizing differences between probability distributions.
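The translation-process construction mentioned above (a Gaussian process mapped through Φ and an inverse target CDF) can be sketched as follows; an AR(1) correlation stands in for a prescribed power spectral density, and the Weibull marginal for wire strength is an illustrative assumption, not a distribution fitted in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

def translation_process(n=500, rho=0.9, dist=stats.weibull_min(c=6.0, scale=1600.0)):
    """Generate one realization of a stationary non-Gaussian (translation) process:
    an underlying Gaussian AR(1) sequence is mapped through the standard normal CDF
    and then through the inverse target CDF.  The AR(1) correlation stands in for a
    prescribed spectral density; the Weibull marginal (wire strength in MPa) is assumed."""
    g = np.empty(n)
    g[0] = rng.standard_normal()
    eps = rng.standard_normal(n) * np.sqrt(1.0 - rho**2)
    for i in range(1, n):
        g[i] = rho * g[i - 1] + eps[i]
    u = stats.norm.cdf(g)            # map to uniforms
    return dist.ppf(u)               # map to the target (non-Gaussian) marginal

strength = translation_process()
print(f"mean = {strength.mean():.1f} MPa, 1st percentile = {np.percentile(strength, 1):.1f} MPa")
```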
APA, Harvard, Vancouver, ISO, and other styles
15

Chang, Shih-Cheng (張仕政). "Cooperative particle swarm optimization for the time-dependent reliability redundancy allocation problems." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/2f7am9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Sudarsanam, Nandan. "Methods of estimating stress-strength interference reliability under a time dependent degradation analysis." 2005. http://digital.library.okstate.edu/etd/umi-okstate-1356.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Liu, Jen-Chieh (劉仁傑). "Study of RRAM reliability and switching mechanism using time-dependent dielectric breakdown methods." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/86962169833683697359.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Department of Electronics Engineering and Institute of Electronics
102 (ROC calendar, i.e., 2013)
In this thesis, the time-dependent dielectric breakdown (TDDB) characteristics of resistive switching random access memory (RRAM) have been thoroughly studied. Several applications have also been demonstrated, such as long-term reliability prediction and improved understanding of the resistive-switching (RS) mechanism. In gate dielectric breakdown theory, there are well-established methodologies for lifetime prediction. These methodologies were introduced in this work to study RRAM read disturb immunity. Furthermore, fast TDDB lifetime projection of the high-resistance state (HRS) was performed by a ramp-voltage test and verified by the constant-voltage TDDB test. Apart from advantages such as convenience and accuracy, the ramp-voltage-based method was also applicable to different RRAM cells, indicating its promising potential to become a general methodology for predicting read disturb immunity. Additionally, it is also shown that TDDB of the HRS is able to provide additional insight into the resistive switching mechanisms. In the same RRAM cell, two RS modes are obtained using different operation schemes. The two RS modes show distinctive electrical and statistical properties. Furthermore, their time-to-SET (tSET) distributions reveal different reliability concerns. Therefore, a comprehensive measurement methodology involving TDDB of the SET process and the electrical characteristics of the HRS is developed to clarify the origin of the two RS modes, which can be well explained by the different locations where RS takes place: in the bulk high-κ layer or in the interfacial layer (IL) adjacent to the silicon substrate. Moreover, changing the operation conditions leads to a gradual evolution of the RS location.
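A hedged sketch of the generic TDDB-style lifetime projection workflow referred to above: Weibull fits of time-to-breakdown at accelerated voltages, followed by extrapolation to an assumed use (read) voltage with an assumed exponential voltage-acceleration law. The data are synthetic and none of the numbers come from the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)

# Synthetic time-to-breakdown data at two accelerated read voltages (illustrative only)
data = {1.2: rng.weibull(2.0, 40) * 50.0,      # seconds at 1.2 V
        1.1: rng.weibull(2.0, 40) * 400.0}     # seconds at 1.1 V

t63 = {}
for v, tbd in data.items():
    shape, loc, scale = stats.weibull_min.fit(tbd, floc=0.0)   # 2-parameter Weibull fit
    t63[v] = scale                                             # characteristic life (63.2 %)

# Assumed exponential voltage-acceleration law: ln(t63) = intercept + slope * V
volts = sorted(t63)
slope, intercept = np.polyfit(volts, np.log([t63[v] for v in volts]), 1)
v_use = 0.5                                                    # assumed read voltage [V]
t63_use = np.exp(intercept + slope * v_use)
print(f"Projected characteristic read-disturb lifetime at {v_use} V: {t63_use:.3e} s")
```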
APA, Harvard, Vancouver, ISO, and other styles