Selection of scientific literature on the topic "Sequential decision making, modeling, risk, effort"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a type of source:

Consult the lists of current articles, books, dissertations, reports, and other scientific sources on the topic "Sequential decision making, modeling, risk, effort".

Next to every work in the bibliography, the option "Add to bibliography" is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scientific publication in PDF format and read an online annotation of the work, provided the relevant parameters are available in its metadata.

Journal articles on the topic "Sequential decision making, modeling, risk, effort"

1

Шаталова, Ольга Владимировна, Дмитрий Андреевич Медников, and Зейнаб Усама Протасова. "MULTI-AGENT INTELLIGENT SYSTEM FOR PREDICTION OF RISK OF CARDIOVASCULAR COMPLICATIONS WITH SYNERGY CHANNELS". СИСТЕМНЫЙ АНАЛИЗ И УПРАВЛЕНИЕ В БИОМЕДИЦИНСКИХ СИСТЕМАХ, No. 3 (30.09.2020): 177–88. http://dx.doi.org/10.36622/vstu.2020.19.3.023.

Abstract:
The aim of the study is to improve the quality of coronary heart disease prediction by accounting for the synergistic effect of concomitant diseases and occupational factors through multi-agent intelligent systems. Research methods. To predict coronary heart disease, a basic structure of a multi-agent intelligent system is proposed that contains "strong" and "weak" classifiers. The "weak" classifiers are divided into four groups: the first analyzes data obtained from traditional risk factors for coronary heart disease, the second analyzes electrocardiological studies, the third diagnoses concomitant diseases and syndromes from the predictors used by the first two groups of agents, and the fourth analyzes environmental risk factors. The multi-agent system makes it possible to manage the decision-making process through a combination of expert assessments, statistical data, and current information. Results. Experimental studies of various modifications of the proposed classifier model, consisting in the sequential exclusion of "weak" classifiers from the decision aggregator at various hierarchical levels, were carried out. Experimental evaluation and mathematical modeling showed that when all informative features are used, the confidence of a correct forecast of coronary heart disease risk exceeds 0.8. The prediction quality indicators are higher than those of the established SCORE system for coronary heart disease prediction, which they exceed by 14% on average. Conclusions. Classification quality was analyzed in an experimental group of subjects with varying ischemic risk and in a control group of electric locomotive drivers, for whom vibration sickness and exposure to electromagnetic fields are relevant ischemic risk factors. Accounting for these risk factors in the control group increased diagnostic efficiency by seven percent compared with the experimental group, which served as the background.
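The abstract describes an aggregator that fuses the decisions of several groups of "weak" classifiers. Purely as an illustrative sketch (the group names, the equal weights, and the weighted-average rule below are assumptions, not the aggregation scheme used in the paper), such a confidence fusion could look like this in Python:

```python
# Illustrative sketch only: fuse confidence scores from four groups of "weak"
# classifiers into a single coronary-risk estimate. Group names, weights, and
# the weighted-average rule are placeholder assumptions.

def aggregate_risk(group_confidences: dict, group_weights: dict) -> float:
    """Weighted average of per-group confidences, each in [0, 1]."""
    total_weight = sum(group_weights[g] for g in group_confidences)
    weighted_sum = sum(group_weights[g] * c for g, c in group_confidences.items())
    return weighted_sum / total_weight


if __name__ == "__main__":
    confidences = {
        "traditional_risk_factors": 0.72,   # group 1
        "electrocardiology": 0.65,          # group 2
        "comorbidities": 0.80,              # group 3
        "occupational_environment": 0.90,   # group 4
    }
    weights = {g: 1.0 for g in confidences}  # placeholder: equal weights
    print(f"aggregated CHD risk confidence: {aggregate_risk(confidences, weights):.2f}")
```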
2

Roos, Patrick, and Dana Nau. "Risk Preference and Sequential Choice in Evolutionary Games". Advances in Complex Systems 13, No. 04 (August 2010): 559–78. http://dx.doi.org/10.1142/s0219525910002682.

Abstract:
There is much empirical evidence that human decision-making under risk does not coincide with expected value maximization, and much effort has been invested into the development of descriptive theories of human decision-making involving risk (e.g. Prospect Theory). An open question is how behavior corresponding to these descriptive models could have been learned or arisen evolutionarily, as the described behavior differs from expected value maximization. We believe that the answer to this question lies, at least in part, in the interplay between risk-taking, sequentiality of choice, and population dynamics in evolutionary environments. In this paper, we provide the results of several evolutionary game simulations designed to study the risk behavior of agents in evolutionary environments. These include several evolutionary lottery games where sequential decisions are made between risky and safe choices, and an evolutionary version of the well-known stag hunt game. Our results show how agents that are sometimes risk-prone and sometimes risk-averse can outperform agents that make decisions solely based on the maximization of the local expected values of the outcomes, and how this can facilitate the evolution of cooperation in situations where cooperation entails risk.
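As a rough illustration of the kind of evolutionary lottery simulation the abstract refers to, the sketch below evolves a population of agents whose single trait is their propensity to pick a risky over a safe option; the payoffs, probabilities, selection rule, and population parameters are invented and do not reproduce the authors' games:

```python
# Illustrative sketch: agents make repeated choices between a safe payoff and
# a risky lottery, then reproduce in proportion to accumulated payoff.
# All payoffs, probabilities, and population settings are placeholder assumptions.
import random

SAFE_PAYOFF = 1.0
RISKY_WIN, RISKY_LOSS = 3.0, 0.0
RISKY_P_WIN = 0.4                      # risky expected value = 1.2 > safe

def lifetime_payoff(p_risky: float, rounds: int = 10) -> float:
    total = 0.0
    for _ in range(rounds):
        if random.random() < p_risky:  # sequential risky/safe choice
            total += RISKY_WIN if random.random() < RISKY_P_WIN else RISKY_LOSS
        else:
            total += SAFE_PAYOFF
    return total

def evolve(pop_size: int = 200, generations: int = 50) -> float:
    # each agent is reduced to a single trait: its propensity to take the risk
    population = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [lifetime_payoff(p) for p in population]
        population = random.choices(population, weights=fitness, k=pop_size)
    return sum(population) / pop_size

if __name__ == "__main__":
    print(f"mean risky-choice propensity after selection: {evolve():.2f}")
```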
3

Freitas da Rocha, Armando, Marcelo Nascimento Burattini, Fábio Theoto Rocha, and Eduardo Massad. "A Neuroeconomic Modeling of Attention-Deficit/Hyperactivity Disorder (ADHD)". Journal of Biological Systems 17, No. 04 (December 2009): 597–622. http://dx.doi.org/10.1142/s021833900900306x.

Abstract:
In this paper we present a new neuroeconomic model of decision-making applied to Attention-Deficit/Hyperactivity Disorder (ADHD). The model is based on the hypothesis that decision-making depends on the evaluation of expected rewards and risks assessed simultaneously in two decision spaces: the personal (PDS) and the interpersonal emotional space (IDS). Motivation to act is triggered by necessities identified in PDS or IDS. The adequacy of an action in fulfilling a given necessity is assumed to depend on the expected reward and risk evaluated in the decision spaces. Conflict generated by expected reward and risk influences the easiness (cognitive effort) and the future perspective of the decision-making. Finally, the willingness (not) to act is proposed to be a function of the expected reward (or risk), adequacy, easiness, and future perspective. The two most frequent clinical forms are ADHD hyperactive (AD/HDhyp) and ADHD inattentive (AD/HDin). AD/HDhyp behavior is hypothesized to be a consequence of experiencing high rewarding expectancies for short periods of time, low risk evaluation, and a short future perspective for decision-making. AD/HDin is hypothesized to be a consequence of experiencing high rewarding expectancies for long periods of time, low risk evaluation, and a long future perspective for decision-making.
4

Jafarzadegan, Keighobad, Peyman Abbaszadeh, and Hamid Moradkhani. "Sequential data assimilation for real-time probabilistic flood inundation mapping". Hydrology and Earth System Sciences 25, No. 9 (16.09.2021): 4995–5011. http://dx.doi.org/10.5194/hess-25-4995-2021.

Abstract:
Real-time probabilistic flood inundation mapping is crucial for flood risk warning and decision-making during the emergency period before an upcoming flood event. Considering the high uncertainties involved in modeling a nonlinear and complex flood event, providing a deterministic flood inundation map can be erroneous and misleading for reliable and timely decision-making. Conventional flood hazard maps provided for different return periods also cannot represent the actual dynamics of flooding rivers. Therefore, a real-time modeling framework that forecasts the inundation areas before the onset of an upcoming flood is of paramount importance. Sequential data assimilation (DA) techniques are well known for real-time operation of physical models while accounting for existing uncertainties. In this study, we present a DA hydrodynamic modeling framework in which multiple gauge observations are integrated into the LISFLOOD-FP model to improve its performance. The study uses the ensemble Kalman filter (EnKF) in a multivariate fashion for dual estimation of model state variables and parameters, where the correlations among point source observations are taken into account. First, a synthetic experiment is designed to assess the performance of the proposed approach; the method is then used to simulate the 2017 Hurricane Harvey flood. Our results indicate that the multivariate assimilation of point source observations into hydrodynamic models can improve the accuracy and reliability of probabilistic flood inundation mapping by 5 %–7 %, while it also provides the basis for sequential updating and real-time flood inundation mapping.
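A minimal sketch of a stochastic ensemble Kalman filter analysis step on an augmented state vector (model states plus parameters), the general mechanism behind the dual estimation mentioned above, is shown below; the dimensions, observation operator, and error level are placeholder assumptions rather than the LISFLOOD-FP configuration used in the study:

```python
# Illustrative sketch of a stochastic EnKF analysis step for joint ("dual")
# state-parameter updating. Dimensions, the observation operator H, and the
# observation error are placeholder assumptions.
import numpy as np

def enkf_update(ensemble: np.ndarray, obs: np.ndarray,
                H: np.ndarray, obs_err_std: float) -> np.ndarray:
    """ensemble: (n_members, n_aug) rows of augmented states [states, params]."""
    n_members = ensemble.shape[0]
    n_obs = obs.size
    obs_pert = obs + obs_err_std * np.random.randn(n_members, n_obs)  # perturbed obs
    X = ensemble - ensemble.mean(axis=0)          # state anomalies
    HX = ensemble @ H.T                           # predicted observations
    HXp = HX - HX.mean(axis=0)
    P_xy = X.T @ HXp / (n_members - 1)            # state-observation covariance
    P_yy = HXp.T @ HXp / (n_members - 1) + (obs_err_std ** 2) * np.eye(n_obs)
    K = P_xy @ np.linalg.inv(P_yy)                # Kalman gain
    return ensemble + (obs_pert - HX) @ K.T       # analysis ensemble

if __name__ == "__main__":
    prior = np.random.randn(50, 4)                # 3 states + 1 parameter, 50 members
    H = np.array([[1.0, 0.0, 0.0, 0.0]])          # a single gauge observes state 1
    posterior = enkf_update(prior, np.array([0.5]), H, obs_err_std=0.1)
    print(posterior.mean(axis=0))
```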
5

Long, Xueqin, Chenxi Hou, Shanshan Liu, and Yuejiao Wang. "Sequential Route Choice Modeling Based on Dynamic Reference Points and Its Empirical Study". Discrete Dynamics in Nature and Society 2020 (27.03.2020): 1–11. http://dx.doi.org/10.1155/2020/8081576.

Abstract:
To examine the influence of information, this paper investigates sequential route choice behavior under dynamic reference points on the basis of cumulative prospect theory. An experimental platform with a client/server (C/S) structure is designed to collect sequential route choices, and four types of information are released to participants. Real-time travel time prediction methods are then proposed to support travelers' decision-making. Using nonlinear regression, the parameters of the value function and weight function of cumulative prospect theory are estimated under each type of information. Travelers show a clear risk-seeking tendency when real-time travel time information is released, whereas they tend to be more conservative when given descriptive information.
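For orientation, the sketch below shows the standard value and probability-weighting functions of cumulative prospect theory applied to a simple two-outcome route-choice prospect; the parameter values are the classic Tversky-Kahneman estimates, not the information-dependent estimates reported in the paper:

```python
# Illustrative sketch of cumulative prospect theory's value and weighting
# functions for a two-outcome prospect (one gain, one loss). Parameters are
# the classic Tversky-Kahneman estimates, used here only as placeholders.

def value(x: float, alpha: float = 0.88, beta: float = 0.88,
          loss_aversion: float = 2.25) -> float:
    """Value of an outcome x relative to the reference point at 0."""
    return x ** alpha if x >= 0 else -loss_aversion * ((-x) ** beta)

def weight(p: float, gamma: float = 0.61) -> float:
    """Inverse-S-shaped probability weighting."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

if __name__ == "__main__":
    # risky route: save 10 minutes with p = 0.3, lose 5 minutes with p = 0.7
    prospect_value = weight(0.3) * value(10.0) + weight(0.7) * value(-5.0)
    print(f"prospect value of the risky route: {prospect_value:.2f}")
```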
6

Polyakov, Yuri, Andrey Savchenko, Mikhail Savelyev, Denis Perevedentsev, and Anna Koscheeva. "Management-developing of an algorithm for the design decisions in the innovative activity of an organization". SHS Web of Conferences 116 (2021): 00070. http://dx.doi.org/10.1051/shsconf/202111600070.

Abstract:
This article proposes a new algorithm for developing conceptual project decisions in terms of an organization's or innovator's response to changing consumer demands through the development of and investment in innovations. The theoretical basis for choosing the type of algorithm was established by studying the available references on the applicability of different types of algorithms for developing management decisions. A multistage multicriteria algorithm (hereinafter referred to as MMA) was developed on the basis of the algorithms studied. In developing the stages of the MMA, a set of thinking methods (analysis and synthesis, deduction and induction), a decomposition method, and logical methods were used. The MMA is considered as a set of step-by-step sequential descriptions of local processes, formulations and solutions of management tasks, the decision methods used, and decision-making criteria. The SMART(ER) method was used to set the managerial tasks at each stage. For solving a management problem, a set of methods was used, including an expert method, factor analysis, brainstorming, and methods for calculating economic efficiency. For choosing solutions at each stage, a set of criteria was used: the weight of the negative impact on the result of the activity, the yes/no method, maximum effect, cost minimization, and risks. A schematic design was defined for the MMA; it is treated both as a modeling method and as a graphical-analytical way of displaying the stages of the process.
7

Shen, Yun, Michael J. Tobia, Tobias Sommer, and Klaus Obermayer. "Risk-Sensitive Reinforcement Learning". Neural Computation 26, No. 7 (July 2014): 1298–328. http://dx.doi.org/10.1162/neco_a_00600.

Abstract:
We derive a family of risk-sensitive reinforcement learning methods for agents who face sequential decision-making tasks in uncertain environments. By applying a utility function to the temporal difference (TD) error, nonlinear transformations are effectively applied not only to the received rewards but also to the true transition probabilities of the underlying Markov decision process. When appropriate utility functions are chosen, the agents' behaviors express key features of human behavior as predicted by prospect theory (Kahneman & Tversky, 1979), for example, different risk preferences for gains and losses, as well as the shape of subjective probability curves. We derive a risk-sensitive Q-learning algorithm, which is necessary for modeling human behavior when transition probabilities are unknown, and prove its convergence. As a proof of principle for the applicability of the new framework, we apply it to quantify human behavior in a sequential investment task. We find that the risk-sensitive variant provides a significantly better fit to the behavioral data and that it leads to an interpretation of the subjects' responses that is indeed consistent with prospect theory. The analysis of simultaneously measured fMRI signals shows a significant correlation of the risk-sensitive TD error with BOLD signal change in the ventral striatum. In addition, we find a significant correlation of the risk-sensitive Q-values with neural activity in the striatum, cingulate cortex, and insula that is not present if standard Q-values are used.
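A minimal sketch of the core idea, a utility function applied to the temporal-difference error before the Q-value update, follows; the piecewise-linear utility, hyperparameters, and toy bandit task are placeholder assumptions, not the algorithm specification or the investment task from the paper:

```python
# Illustrative sketch of risk-sensitive Q-learning: a nonlinear utility is
# applied to the TD error before updating Q. Utility shape, hyperparameters,
# and the toy task are placeholder assumptions.
import random
from collections import defaultdict

ALPHA, GAMMA = 0.1, 0.95
K_POS, K_NEG = 0.8, 1.2          # weight negative TD errors more than positive ones

def utility(td_error: float) -> float:
    return K_POS * td_error if td_error >= 0 else K_NEG * td_error

Q = defaultdict(float)

def update(state, action, reward, next_state, next_actions):
    best_next = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    td_error = reward + GAMMA * best_next - Q[(state, action)]
    Q[(state, action)] += ALPHA * utility(td_error)   # utility applied to TD error

if __name__ == "__main__":
    # toy bandit: the risky arm pays +1 or -1, the safe arm pays a sure 0.1
    for _ in range(5000):
        arm = random.choice(["risky", "safe"])
        r = random.choice([1.0, -1.0]) if arm == "risky" else 0.1
        update("s", arm, r, "terminal", [])            # episodic: no bootstrap
    print({a: round(Q[("s", a)], 3) for a in ("risky", "safe")})
```

With these invented settings the risky arm ends up with a lower Q-value than the safe arm even though its expected reward is zero, which is the qualitative risk-averse effect the utility function is meant to produce.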
8

Paasche, Hendrik, Katja Paasche, and Peter Dietrich. "Uncertainty as a Driving Force for Geoscientific Development". Nature and Culture 15, No. 1 (01.03.2020): 1–18. http://dx.doi.org/10.3167/nc.2020.150101.

Abstract:
Geoscientists invest significant effort to cope with uncertainty in Earth system observation and modeling. While general discussions exist about uncertainty and risk communication, judgment and decision-making, and science communication with regard to the Earth sciences, in this article we tackle uncertainty from the perspective of Earth science practitioners. We argue that different scientific methodologies must be used to recognize all types of uncertainty inherent to a scientific finding. Following a discovery-science methodology offers greater potential for quantifying the uncertainty associated with scientific findings than staying within a hypothesis-driven methodology, as is common practice. Enabling improved uncertainty quantification could ease debates about risk communication and decision-making, since it reduces the room for personality traits to shape how scientific findings are communicated.
9

Dittes, Beatrice, Maria Kaiser, Olga Špačková, Wolfgang Rieger, Markus Disse, and Daniel Straub. "Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study". Natural Hazards and Earth System Sciences 18, No. 5 (15.05.2018): 1327–47. http://dx.doi.org/10.5194/nhess-18-1327-2018.

Abstract:
Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost, and climate, the recommendation for the most conservative approach is robust. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.
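To give a flavor of risk-based comparison of protection alternatives under uncertainty, the sketch below scores each option by construction cost plus expected damage over a planning horizon, with the future change in flood frequency sampled from a prior; all numbers, the damage model, and the uniform prior are invented and are unrelated to the case-study values or to the full sequential Bayesian framework of the paper:

```python
# Illustrative sketch of a risk-based comparison of flood-protection options:
# score = construction cost + expected damage over the planning horizon, with
# the future change in flood frequency treated as uncertain. All numbers and
# distributions are invented placeholders.
import random

def expected_total_cost(build_cost: float, return_period: float,
                        damage_if_exceeded: float, horizon_years: int = 50,
                        n_samples: int = 10_000) -> float:
    total = 0.0
    for _ in range(n_samples):
        climate_factor = random.uniform(1.0, 2.0)        # uncertain future increase
        p_exceed = climate_factor / return_period        # annual exceedance probability
        total += build_cost + horizon_years * p_exceed * damage_if_exceeded
    return total / n_samples

if __name__ == "__main__":
    options = {
        "protect to the 100-year flood": (10.0, 100.0),
        "protect to the 300-year flood": (18.0, 300.0),
    }
    for name, (cost, level) in options.items():
        score = expected_total_cost(cost, level, damage_if_exceeded=120.0)
        print(f"{name}: expected total cost {score:.1f}")
```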
10

Martinelli, Gabriele, Jo Eidsvik, Ketil Hokstad, and Ragnar Hauge. "Strategies for Petroleum Exploration on the Basis of Bayesian Networks: A Case Study". SPE Journal 19, No. 04 (06.08.2013): 564–75. http://dx.doi.org/10.2118/159722-pa.

Abstract:
Summary. The paper presents a new approach for modeling important geological elements, such as reservoir, trap, and source, in a unified statistical model. Joint modeling of these geological variables is useful for reliable prospect evaluation and provides a framework for consistent decision making under uncertainty. A Bayesian network (BN), involving different kinds of dependency structures, is used to model the correlation within the various geological elements and to couple the elements. On the basis of the constructed network, an optimal sequential exploration strategy is established with dynamic programming (DP). This strategy is useful for selecting the first prospect to explore and for making the decisions that should follow, depending on the outcome of the first well. A risk-neutral decision maker will continue exploring new wells as long as the expected profit is positive. The model and choice of exploration strategy are tailored to a case study represented by five prospects in a salt basin, but they will also be useful in other contexts. For this particular case study, we show how the strategy clearly depends on the exploration and development costs and on the expected volumes and recovery factors. The most lucrative prospect tends to be selected first, but the sequential decisions depend on the outcome of the exploration well in this first prospect.
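The risk-neutral stopping rule mentioned in the abstract (keep exploring as long as the expected profit of the best remaining prospect is positive) can be sketched as follows; the probabilities, values, and costs are invented, and the Bayesian-network updating of probabilities after each well, which is central to the paper, is reduced to a placeholder comment:

```python
# Illustrative sketch of the risk-neutral sequential exploration rule: keep
# drilling the prospect with the highest expected profit while that
# expectation is positive. Probabilities, values, and costs are invented.

def plan_sequence(prospects: dict) -> list:
    """prospects: name -> (success probability, value if success, drilling cost)."""
    remaining = dict(prospects)
    order = []
    while remaining:
        name, (p, value, cost) = max(remaining.items(),
                                     key=lambda kv: kv[1][0] * kv[1][1] - kv[1][2])
        if p * value - cost <= 0:       # stop: best remaining prospect is unprofitable
            break
        order.append(name)
        del remaining[name]
        # placeholder: a real implementation would revise the remaining prospects'
        # probabilities through the Bayesian network, conditioned on the well outcome
    return order

if __name__ == "__main__":
    prospects = {
        "A": (0.30, 500.0, 80.0),
        "B": (0.15, 400.0, 90.0),
        "C": (0.10, 300.0, 60.0),
        "D": (0.05, 200.0, 50.0),
    }
    print(plan_sequence(prospects))     # expected: ["A"] with these invented numbers
```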

Dissertations on the topic "Sequential decision making, modeling, risk, effort"

1

Cuevas, Rivera Dario [author], Stefan [academic supervisor] Kiebel, Stefan [reviewer] Kiebel, and Michael [reviewer] Smolka. "Dynamic computational models of risk and effort discounting in sequential decision making / Dario Cuevas Rivera ; Gutachter: Stefan Kiebel, Michael Smolka ; Betreuer: Stefan Kiebel". Dresden: Technische Universität Dresden, 2021. http://d-nb.info/1236384024/34.

2

Cuevas, Rivera Dario. "Dynamic computational models of risk and effort discounting in sequential decision making". 2021. https://tud.qucosa.de/id/qucosa%3A75264.

Abstract:
Dissertation based on my publications in the field of risky behavior in dynamic, sequential decision-making tasks. Contents: 1. Introduction; 2. Context-dependent risk aversion: a model-based approach; 3. Modeling dynamic allocation of effort in a sequential task using discounting models; 4. General discussion.

Book chapters on the topic "Sequential decision making, modeling, risk, effort"

1

Engemann, Kurt J., Holmes E. Miller, and Ronald R. Yager. "Modeling Risk in Sequential Decision Making with Interval Probabilities". In Soft Computing for Risk Evaluation and Management, 17–30. Heidelberg: Physica-Verlag HD, 2001. http://dx.doi.org/10.1007/978-3-7908-1814-7_2.

2

"What Is the Question? — Most of the Time and Effort". In Risk Modeling for Determining Value and Decision Making, 235–42. Chapman and Hall/CRC, 2000. http://dx.doi.org/10.1201/9781420035940-18.

3

Sikirda, Yuliya, Mykola Kasatkin, and Dmytro Tkachenko. "Intelligent Automated System for Supporting the Collaborative Decision Making by Operators of the Air Navigation System During Flight Emergencies". In Handbook of Research on Artificial Intelligence Applications in the Aviation and Aerospace Industries, 66–90. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1415-3.ch003.

Abstract:
This chapter investigates pilot and air traffic controller collaborative decision making (CDM) during flight emergencies, aiming for maximum synchronization of the operators' technological procedures. Deterministic models of CDM by the Air Navigation System's human operators were obtained with network planning methods; their adequacy is confirmed by full-scale modeling on a complex flight simulator. For the sequential optimization of the collaborative two-channel network "Air traffic controller-Pilot", intended to achieve end-to-end effectiveness of joint solutions, a multi-criteria approach was used: ensuring the minimum time to parry a flight emergency with maximum safety and maximum consistency of the operators' actions over time. With the help of a multiplicative function, the influence of organizational risk factors on flight safety in air traffic control was evaluated. A conceptual model of a system for controlling and forecasting the development of flight emergencies was developed on the basis of an intelligent automated system for supporting CDM by operators.

Conference papers on the topic "Sequential decision making, modeling, risk, effort"

1

Ghosh, Dipanjan D., and Andrew Olewnik. "Computationally Efficient Imprecise Uncertainty Propagation in Engineering Design and Decision Making". In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70419.

Abstract:
Modeling uncertainty through probabilistic representation in engineering design is common and important for decision making that considers risk. However, representations of uncertainty often ignore elements of "imprecision" that may limit the robustness of decisions. Further, current approaches that incorporate imprecision suffer from computational expense and relatively high solution error. This work presents the Computationally Efficient Imprecise Uncertainty Propagation (CEIUP) method, which draws on existing approaches for the propagation of imprecision and integrates sparse-grid numerical integration to provide computational efficiency and low solution error for uncertainty propagation. The first part of the paper details the methodology and demonstrates improvements in both computational efficiency and solution accuracy compared to the Optimized Parameter Sampling (OPS) approach for a set of numerical case studies. The second half of the paper focuses on estimating non-dominated design parameter spaces using the decision policies of Interval Dominance and the Maximality Criterion in the context of set-based sequential design decision making. A gearbox design problem is presented and compared with OPS, demonstrating that CEIUP provides improved estimates of the non-dominated parameter range for satisfactory performance with faster solution times. Parameter estimates obtained for different risk attitudes are presented and analyzed from the perspective of Choice Theory, leading to questions for future research. The paper concludes with an overview of design problem scenarios in which CEIUP is the preferred method and offers opportunities for extending the method.
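As an informal illustration of what propagating "imprecise" uncertainty means (an interval-valued parameter combined with ordinary probabilistic uncertainty), the sketch below brackets the expected model response by sweeping the interval parameter over a grid; the toy model, the grid, and the plain Monte Carlo integration are assumptions standing in for the sparse-grid machinery of CEIUP:

```python
# Illustrative sketch of propagating an imprecise (interval-valued) parameter
# together with aleatory uncertainty: the interval parameter is swept over a
# coarse grid, and the aleatory variable is integrated by plain Monte Carlo.
# The model and all numbers are invented placeholders.
import random

def model(x_aleatory: float, theta: float) -> float:
    return theta * x_aleatory ** 2 + x_aleatory

def expected_response(theta: float, n: int = 20_000) -> float:
    return sum(model(random.gauss(0.0, 1.0), theta) for _ in range(n)) / n

if __name__ == "__main__":
    theta_lo, theta_hi = 0.5, 1.5        # imprecise parameter known only as an interval
    grid = [theta_lo + i * (theta_hi - theta_lo) / 10 for i in range(11)]
    bounds = [expected_response(t) for t in grid]
    print(f"expected response lies in [{min(bounds):.3f}, {max(bounds):.3f}]")
```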
2

Feng, Wenxing, Xiaoqiang Xiang, Guangming Jia, Lianshuang Dai, Yulei Gu, Xiaozheng Yang, Qingshang Feng, and Lijian Zhou. "Applying the Quantitative Risk Assessment (QRA) to Improve Safety Management of Oil and Gas Pipeline Stations in China". In 2012 9th International Pipeline Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/ipc2012-90130.

Abstract:
The oil and gas pipeline companies in China are facing unprecedented opportunities and challenges because of China's increasing demand for oil and gas energy, driven by rapid economic and social development. Limited land resources and fast urbanization mean that many pipelines have to pass through or run adjacent to highly populated areas such as cities or towns. Increasing Chinese government regulation and public concern about industrial safety and environmental protection push the pipeline companies to enhance their safety, health, and environmental protection management. In recent years, PetroChina Pipeline Company (PPC) has devoted considerable attention and effort to improving employee and public safety around its pipeline facilities. A comprehensive, integrated HSE management system is continuously improved and effectively implemented in PPC. PPC conducts hazard identification, risk assessment, risk control and mitigation, and risk monitoring. For oil and gas stations in highly populated areas or with numerous employees, PPC carries out quantitative risk assessment (QRA) to evaluate and manage the population risk. The assessments follow the "Guidelines for quantitative risk assessments" (the purple book) published by the Committee for the Prevention of Disasters of the Netherlands, together with a software package. The basic principles, process, and methods of QRA are introduced in this article. The process is to identify the station hazards, determine the failure scenarios of the facilities, estimate the likelihood of leakage failures, calculate the consequences of failures and the damage to the population, present the individual and societal risk, and evaluate whether the risk is acceptable. The process may involve the mathematical modeling of fluid and gas spills, dispersion, fire, and explosion. One QRA case at an oil pipeline station is described in this article to illustrate the application process and discuss several key issues in the assessment. Using the QRA technique, about 20 stations have been evaluated in PPC. On the basis of the results, managers have adopted prevention and mitigation plans to control the risk. QRA of pipeline stations can provide a quantitative basis and a valuable reference for the company's decision-making and land-use planning. QRA can also help build a better relationship between the pipeline companies, local regulators, and the public. Finally, this article describes the limitations of QRA at Chinese pipeline stations and discusses possible solutions.
3

Dupuis, Bruce, and Jason Humber. "The Evolution of Data Management to Empower Integrity Management Decisions: A Case Study of an Enterprise Implementation". In 2002 4th International Pipeline Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/ipc2002-27415.

Abstract:
BP's Natural Gas Liquids business unit (NGLBU) has conducted integrity investigation and mitigation activities on its pipelines and has been following this best practice for numerous years. In recent times, NGLBU's data management initiatives have focused on establishing an enterprise Geographic Information System (GIS) coupled tightly with a derivative of the Pipeline Open Data Standard (PODS) data model. During the successful implementation of the GIS, an analysis identified gaps in existing data management processes for pipeline integrity information. Consequently, the business unit adopted Baseline Technology's Pipeline Information Control System (PICS) and its modules to support the pipeline integrity decision-making process on its 9000 km of pipeline. The PICS implementation leverages the existing GIS implementation while addressing a number of unresolved data management and integration issues, including: integration of inline inspection with excavation results; migration of above-ground surveys to a common repository; integration of multiple inline inspections; facilitation of corrosion growth modeling; a structured process for prioritization of remediation; a structured process for integration of inline inspections with risk parameters; and defined data collection, storage, and integration standards. Data management solutions based solely on a GIS require pipeline surveys without explicit positional information to be converted into a common linear reference system (typically chainage or stationing) so that disparate data sets may be overlaid and compared. This conversion, or spatial normalization, is where much of the data management effort is spent, and it is often prone to introducing errors. Even when the errors introduced are small, the normalization process is often performed in a way that is not auditable. If the underlying spatial errors are not reported, addressed, and understood, the value of the data integration and of any subsequent analysis of the combined data set is questionable.