To see the other types of publications on this topic, follow the link: Sequential decision making, modeling, risk, effort.

Journal articles on the topic 'Sequential decision making, modeling, risk, effort'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 33 journal articles for your research on the topic 'Sequential decision making, modeling, risk, effort.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Шаталова, Ольга Владимировна, Дмитрий Андреевич Медников, and Зейнаб Усама Протасова. "MULTI-AGENT INTELLIGENT SYSTEM FOR PREDICTION OF RISK OF CARDIOVASCULAR COMPLICATIONS WITH SYNERGY CHANNELS." СИСТЕМНЫЙ АНАЛИЗ И УПРАВЛЕНИЕ В БИОМЕДИЦИНСКИХ СИСТЕМАХ, no. 3 (September 30, 2020): 177–88. http://dx.doi.org/10.36622/vstu.2020.19.3.023.

Full text
Abstract:
The aim of the study is to improve the quality of coronary heart disease prediction by accounting for the synergistic effect of concomitant diseases and occupational environmental factors through multi-agent intelligent systems. Research methods. To predict coronary heart disease, a basic structure of a multi-agent intelligent system is proposed that contains "strong" and "weak" classifiers. The "weak" classifiers are divided into four groups: the first analyzes data based on traditional risk factors for coronary heart disease; the second is based on the analysis of electrocardiological studies; the third diagnoses concomitant diseases and syndromes using the predictors employed by the first two groups of agents; and the fourth analyzes environmental risk factors. The multi-agent system makes it possible to manage the decision-making process through a combination of expert assessments, statistical data, and current information. Results. Experimental studies were carried out on various modifications of the proposed classifier model, consisting in the sequential exclusion of "weak" classifiers from the decision aggregator at various hierarchical levels. Experimental evaluation and mathematical modeling showed that when all informative features are used, the confidence in a correct forecast of coronary heart disease risk exceeds 0.8. The prediction quality indicators exceed those of the well-known SCORE system for coronary heart disease prediction by 14% on average. Conclusions. Analysis of the classification quality indicators in an experimental group of subjects with various ischemic risk indicators, and in a control group of electric locomotive drivers, for whom vibration disease and exposure to electromagnetic fields are relevant ischemic risk factors, showed that accounting for these risk factors in the control group increases diagnostic efficiency by seven percent compared with the experimental group, which served as a background.
APA, Harvard, Vancouver, ISO, and other styles
2

ROOS, PATRICK, and DANA NAU. "RISK PREFERENCE AND SEQUENTIAL CHOICE IN EVOLUTIONARY GAMES." Advances in Complex Systems 13, no. 04 (August 2010): 559–78. http://dx.doi.org/10.1142/s0219525910002682.

Full text
Abstract:
There is much empirical evidence that human decision-making under risk does not coincide with expected value maximization, and much effort has been invested into the development of descriptive theories of human decision-making involving risk (e.g. Prospect Theory). An open question is how behavior corresponding to these descriptive models could have been learned or arisen evolutionarily, as the described behavior differs from expected value maximization. We believe that the answer to this question lies, at least in part, in the interplay between risk-taking, sequentiality of choice, and population dynamics in evolutionary environments. In this paper, we provide the results of several evolutionary game simulations designed to study the risk behavior of agents in evolutionary environments. These include several evolutionary lottery games where sequential decisions are made between risky and safe choices, and an evolutionary version of the well-known stag hunt game. Our results show how agents that are sometimes risk-prone and sometimes risk-averse can outperform agents that make decisions solely based on the maximization of the local expected values of the outcomes, and how this can facilitate the evolution of cooperation in situations where cooperation entails risk.
3

FREITAS DA ROCHA, ARMANDO, MARCELO NASCIMENTO BURATTINI, FÁBIO THEOTO ROCHA, and EDUARDO MASSAD. "A NEUROECONOMIC MODELING OF ATTENTION-DEFICIT/HYPERACTIVITY DISORDER (ADHD)." Journal of Biological Systems 17, no. 04 (December 2009): 597–622. http://dx.doi.org/10.1142/s021833900900306x.

Full text
Abstract:
In this paper we present a new neuroeconomics model of decision-making applied to Attention-Deficit/Hyperactivity Disorder (ADHD). The model is based on the hypothesis that decision-making depends on the evaluation of expected rewards and risks assessed simultaneously in two decision spaces: the personal (PDS) and the interpersonal (IDS) emotional spaces. Motivation to act is triggered by necessities identified in the PDS or IDS. The adequacy of an action in fulfilling a given necessity is assumed to depend on the expected reward and risk evaluated in the decision spaces. Conflict generated by expected reward and risk influences the easiness (cognitive effort) and the future perspective of the decision-making. Finally, the willingness (not) to act is proposed to be a function of the expected reward (or risk), adequacy, easiness and future perspective. The two most frequent clinical forms are ADHD hyperactive (AD/HDhyp) and ADHD inattentive (AD/HDin). AD/HDhyp behavior is hypothesized to be a consequence of experiencing high rewarding expectancies for short periods of time, low risk evaluation, and a short future perspective for decision-making. AD/HDin is hypothesized to be a consequence of experiencing high rewarding expectancies for long periods of time, low risk evaluation, and a long future perspective for decision-making.
4

Jafarzadegan, Keighobad, Peyman Abbaszadeh, and Hamid Moradkhani. "Sequential data assimilation for real-time probabilistic flood inundation mapping." Hydrology and Earth System Sciences 25, no. 9 (September 16, 2021): 4995–5011. http://dx.doi.org/10.5194/hess-25-4995-2021.

Full text
Abstract:
Real-time probabilistic flood inundation mapping is crucial for flood risk warning and decision-making during the emergency period before an upcoming flood event. Considering the high uncertainties involved in the modeling of a nonlinear and complex flood event, providing a deterministic flood inundation map can be erroneous and misleading for reliable and timely decision-making. The conventional flood hazard maps provided for different return periods also cannot represent the actual dynamics of flooding rivers. Therefore, a real-time modeling framework that forecasts the inundation areas before the onset of an upcoming flood is of paramount importance. Sequential data assimilation (DA) techniques are well known for real-time operation of physical models while accounting for existing uncertainties. In this study, we present a DA hydrodynamic modeling framework where multiple gauge observations are integrated into the LISFLOOD-FP model to improve its performance. This study utilizes the ensemble Kalman filter (EnKF) in a multivariate fashion for dual estimation of model state variables and parameters where the correlations among point source observations are taken into account. First, a synthetic experiment is designed to assess the performance of the proposed approach; then the method is used to simulate the Hurricane Harvey flood in 2017. Our results indicate that the multivariate assimilation of point source observations into hydrodynamic models can improve the accuracy and reliability of probabilistic flood inundation mapping by 5 %–7 %, while it also provides the basis for sequential updating and real-time flood inundation mapping.
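The stochastic (perturbed-observation) EnKF analysis step at the heart of such a framework can be sketched in a few lines. The two-state toy system, gauge value, error levels, and observation operator below are illustrative stand-ins, not the LISFLOOD-FP configuration used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_std, H):
    """One stochastic EnKF analysis step with perturbed observations.

    ensemble : (n_ens, n_state) array of forecast states
    obs      : (n_obs,) observation vector
    obs_std  : observation error standard deviation
    H        : (n_obs, n_state) linear observation operator
    """
    n_ens = ensemble.shape[0]
    X = ensemble.T                          # (n_state, n_ens)
    A = X - X.mean(axis=1, keepdims=True)   # state anomalies
    Y = H @ X                               # predicted observations
    B = Y - Y.mean(axis=1, keepdims=True)   # observation anomalies
    R = (obs_std ** 2) * np.eye(len(obs))
    # Kalman gain estimated from ensemble covariances
    K = (A @ B.T / (n_ens - 1)) @ np.linalg.inv(B @ B.T / (n_ens - 1) + R)
    # one perturbed observation per ensemble member
    obs_pert = obs[:, None] + obs_std * rng.standard_normal((len(obs), n_ens))
    X_a = X + K @ (obs_pert - Y)
    return X_a.T

# toy example: two water-level states, only the first is gauged
ens = rng.normal(loc=[2.0, 3.0], scale=0.5, size=(100, 2))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, np.array([2.8]), obs_std=0.1, H=H)
```

With an observation error much smaller than the forecast spread, the analysis ensemble mean for the gauged state is pulled close to the observation, while the ungauged state is adjusted through the ensemble cross-covariance.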
5

Long, Xueqin, Chenxi Hou, Shanshan Liu, and Yuejiao Wang. "Sequential Route Choice Modeling Based on Dynamic Reference Points and Its Empirical Study." Discrete Dynamics in Nature and Society 2020 (March 27, 2020): 1–11. http://dx.doi.org/10.1155/2020/8081576.

Full text
Abstract:
To examine the influence of information, we investigate and analyze sequential route choice behavior under dynamic reference points, based on cumulative prospect theory. An experimental platform for collecting sequential route choices, built on a C/S (client/server) structure, is designed, and four types of information are released to participants. Real-time travel time prediction methods are then proposed for travelers' decision-making. Using nonlinear regression, the parameters of the value function and weighting function of cumulative prospect theory are estimated under each type of information. We find that travelers exhibit clear risk-seeking behavior when real-time travel time information is released; in contrast, when they have access to descriptive information, they tend to be more conservative.
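For reference, the value and probability-weighting functions whose parameters such studies estimate are commonly written in the Tversky-Kahneman (1992) form. The parameter values below are those classic estimates, not the ones fitted in this study:

```python
import math

# Tversky & Kahneman (1992) parameter estimates -- illustrative defaults,
# not the values estimated in the route choice experiment above.
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """S-shaped value function: concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

def weight(p, gamma=GAMMA):
    """Inverse-S probability weighting: overweights small p, underweights large p."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# a risky route: 10 minutes saved with prob 0.1, else 2 minutes lost
prospect = weight(0.1) * value(10.0) + weight(0.9) * value(-2.0)
```

The inverse-S weighting overweights the small probability of a large time saving, which is one mechanism behind risk-seeking choices under real-time information.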
6

Polyakov, Yuri, Andrey Savchenko, Mikhail Savelyev, Denis Perevedentsev, and Anna Koscheeva. "Management-developing of an algorithm for the design decisions in the innovative activity of an organization." SHS Web of Conferences 116 (2021): 00070. http://dx.doi.org/10.1051/shsconf/202111600070.

Full text
Abstract:
This article proposes a new algorithm for developing conceptual project decisions in terms of an organization's or innovator's response to changing consumer demands through the development of, and investment in, innovations. The theoretical basis for choosing the type of algorithm was elaborated by studying the available references on the applicability of different types of algorithms for developing management decisions. A multistage multicriteria algorithm (hereinafter MMA) was developed on the basis of the algorithms studied. In developing the stages of the MMA, a set of thinking methods (analysis-synthesis, deduction-induction), a decomposition method, and logical methods were used. The MMA is considered as a set of step-by-step sequential descriptions of local processes, formulations and solutions of management tasks, the decision methods used, and decision-making criteria. The SMART(ER) method was used to set the managerial tasks at each stage. In solving a management problem, a set of methods was used, including an expert method, factor analysis, brainstorming, and methods for calculating economic efficiency. For choosing solutions at each stage, a set of criteria was used: the weight of the negative impact on the result of the activity, the yes/no method, maximum effect, cost minimization, and risks. A schematic design was defined for the MMA to be developed, considered as a method of modeling and of graphical-analytical display of the stages of the process.
7

Shen, Yun, Michael J. Tobia, Tobias Sommer, and Klaus Obermayer. "Risk-Sensitive Reinforcement Learning." Neural Computation 26, no. 7 (July 2014): 1298–328. http://dx.doi.org/10.1162/neco_a_00600.

Full text
Abstract:
We derive a family of risk-sensitive reinforcement learning methods for agents, who face sequential decision-making tasks in uncertain environments. By applying a utility function to the temporal difference (TD) error, nonlinear transformations are effectively applied not only to the received rewards but also to the true transition probabilities of the underlying Markov decision process. When appropriate utility functions are chosen, the agents’ behaviors express key features of human behavior as predicted by prospect theory (Kahneman & Tversky, 1979), for example, different risk preferences for gains and losses, as well as the shape of subjective probability curves. We derive a risk-sensitive Q-learning algorithm, which is necessary for modeling human behavior when transition probabilities are unknown, and prove its convergence. As a proof of principle for the applicability of the new framework, we apply it to quantify human behavior in a sequential investment task. We find that the risk-sensitive variant provides a significantly better fit to the behavioral data and that it leads to an interpretation of the subjects' responses that is indeed consistent with prospect theory. The analysis of simultaneously measured fMRI signals shows a significant correlation of the risk-sensitive TD error with BOLD signal change in the ventral striatum. In addition we find a significant correlation of the risk-sensitive Q-values with neural activity in the striatum, cingulate cortex, and insula that is not present if standard Q-values are used.
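The paper's core device, a utility function applied to the TD error, can be sketched as a minimal one-step (bandit-style) variant. The asymmetric utility, payoffs, and learning rate below are illustrative choices of our own, not the authors' exact algorithm or investment task:

```python
import random

random.seed(7)

def utility(td_error, risk_aversion=2.0):
    """Asymmetric utility on the TD error: negative surprises are amplified,
    which induces risk-averse valuations (a modeling assumption here)."""
    return td_error if td_error >= 0 else risk_aversion * td_error

def payoff(arm):
    """Two one-step 'investments' with equal expected payoff of 1.0."""
    if arm == "safe":
        return 1.0
    return 5.0 if random.random() < 0.2 else 0.0   # risky: high variance

Q = {"safe": 0.0, "risky": 0.0}
alpha = 0.01
for _ in range(20000):
    arm = random.choice(["safe", "risky"])          # uniform exploration
    delta = payoff(arm) - Q[arm]                    # one-step TD error
    Q[arm] += alpha * utility(delta)                # utility-transformed update
```

Because negative TD errors are amplified, the learned value of the high-variance arm settles below that of the safe arm even though both have the same expected payoff, i.e., the agent values the risky option as if it were worth less.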
8

Paasche, Hendrik, Katja Paasche, and Peter Dietrich. "Uncertainty as a Driving Force for Geoscientific Development." Nature and Culture 15, no. 1 (March 1, 2020): 1–18. http://dx.doi.org/10.3167/nc.2020.150101.

Full text
Abstract:
Geoscientists invest significant effort in coping with uncertainty in Earth system observation and modeling. While general discussions exist about uncertainty and risk communication, judgment and decision-making, and science communication with regard to Earth sciences, in this article we tackle uncertainty from the perspective of Earth science practitioners. We argue that different scientific methodologies must be used to recognize all types of uncertainty inherent in a scientific finding. Following a discovery-science methodology offers greater potential for quantifying the uncertainty associated with scientific findings than staying inside a hypothesis-driven methodology, as is common practice. Improved uncertainty quantification could defuse debates about risk communication and decision-making, since it reduces the room for personality traits when scientific findings are communicated.
9

Dittes, Beatrice, Maria Kaiser, Olga Špačková, Wolfgang Rieger, Markus Disse, and Daniel Straub. "Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study." Natural Hazards and Earth System Sciences 18, no. 5 (May 15, 2018): 1327–47. http://dx.doi.org/10.5194/nhess-18-1327-2018.

Full text
Abstract:
Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.
10

Martinelli, Gabriele, Jo Eidsvik, Ketil Hokstad, and Ragnar Hauge. "Strategies for Petroleum Exploration on the Basis of Bayesian Networks: A Case Study." SPE Journal 19, no. 04 (August 6, 2013): 564–75. http://dx.doi.org/10.2118/159722-pa.

Full text
Abstract:
The paper presents a new approach for modeling important geological elements, such as reservoir, trap, and source, in a unified statistical model. The joint modeling of these geological variables is useful for reliable prospect evaluation, and provides a framework for consistent decision making under uncertainty. A Bayesian network (BN), involving different kinds of dependency structures, is used to model the correlation within the various geological elements and to couple the elements. On the basis of the constructed network, an optimal sequential exploration strategy is established with dynamic programming (DP). This strategy is useful for selecting the first prospect to explore and for making the decisions that should follow, depending on the outcome of the first well. A risk-neutral decision maker will continue exploring new wells as long as the expected profit is positive. The model and choice of exploration strategy are tailored to a case study represented by five prospects in a salt basin, but they will also be useful for other contexts. For the particular case study, we show how the strategy clearly depends on the exploration and development cost and the expected volumes and recovery factors. The most lucrative prospect tends to be selected first, but the sequential decisions depend on the outcome of the exploration well in this first prospect.
11

Bickel, J. Eric, James E. Smith, and Jennifer L. Meyer. "Modeling Dependence Among Geologic Risks in Sequential Exploration Decisions." SPE Reservoir Evaluation & Engineering 11, no. 02 (April 1, 2008): 352–61. http://dx.doi.org/10.2118/102369-pa.

Full text
Abstract:
Prospects in a common basin are likely to share geologic features. For example, if hydrocarbons are found at one location, they may be more likely to be found at other nearby locations. When making drilling decisions, we should be able to exploit this dependence and use drilling results from one location to make more informed decisions about other nearby prospects. Moreover, we should consider these informational synergies when evaluating multiprospect exploration opportunities. In this paper, we describe an approach for modeling the dependence among prospects and determining an optimal drilling strategy that takes this information into account. We demonstrate this approach using an example involving five prospects. This example demonstrates the value of modeling dependence and the value of learning about individual geologic risk factors (e.g., from doing a postmortem at a failed well) when choosing a drilling strategy. Introduction. When considering a new prospect, it is important to consider its probability of success. In practice, this assessment is often decomposed into success probabilities for a number of underlying geologic factors. For example, one might consider the probabilities that the hydrocarbons were generated, whether the reservoir rocks have the appropriate porosity and permeability, and whether the identified structural trap has an appropriate seal [see, e.g., Magoon and Dow (1994)]. The overall probability of success is the product of these individual probabilities. Although these assessments may be difficult, for a single prospect, this risk analysis process is straightforward. When considering multiple prospects in a common basin or multiple target zones in a single well, in addition to considering the probability of success for each prospect, we need to consider the dependence among prospects. For example, if hydrocarbons are found at one location, they may be much more likely to be found at another nearby location.
Conversely, if hydrocarbons are not found at the first location, they may be less likely to be found at the other. When evaluating opportunities with multiple prospects, we should consider decision processes and workflows that exploit this dependence and use results from early wells to make more informed decisions about other locations. For example, if a postmortem analysis of core samples from a failed well reveals that there were no hydrocarbons present, then we may not want to continue drilling at nearby sites. On the other hand, if the postmortem analysis reveals that hydrocarbons were present, but the reservoir lacked a seal, then we may want to continue to explore other nearby sites. In this paper, we describe an approach for modeling dependence among prospects and developing a drilling strategy that exploits the information provided by early drilling results.
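The informational synergy the authors describe can be illustrated with a two-prospect toy calculation in which a shared geologic factor couples the wells. All probabilities, costs, and payoffs below are hypothetical, not values from the paper:

```python
# Hypothetical numbers for illustration only.
P_SHARED = 0.6        # shared geologic factor (e.g. source charge)
P_LOCAL = 0.8         # local success given the shared factor holds
COST, PAYOFF = 10.0, 30.0

p_success = P_SHARED * P_LOCAL                       # marginal success prob

def posterior_shared(first_success):
    """P(shared factor | outcome of the first well), by Bayes' rule."""
    if first_success:
        return 1.0                                   # success implies the factor holds
    # P(shared | failure) = P(shared) * P(fail | shared) / P(fail)
    return P_SHARED * (1 - P_LOCAL) / (1 - p_success)

def ev_second(first_success):
    """Expected value of drilling the second prospect after observing the first."""
    p2 = posterior_shared(first_success) * P_LOCAL
    return p2 * PAYOFF - COST

# risk-neutral sequential strategy: drill the second well only if its EV > 0
ev_after_hit = ev_second(True)
ev_after_miss = ev_second(False)
ev_program = (p_success * PAYOFF - COST
              + p_success * max(ev_after_hit, 0.0)
              + (1 - p_success) * max(ev_after_miss, 0.0))
```

Here a success at the first well raises the second well's expected value above zero, while a failure drops it below zero, so the risk-neutral strategy continues only after a hit; the dependence is exactly what makes the first well's outcome informative.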
12

Aita, Rafael Carlesso, Daniela T. Pezzini, Eric C. Burkness, Christina D. DiFonzo, Deborah L. Finke, Thomas E. Hunt, Janet J. Knodel, et al. "Presence–Absence Sampling Plans for Stink Bugs (Hemiptera: Pentatomidae) in the Midwest Region of the United States." Journal of Economic Entomology 114, no. 3 (April 22, 2021): 1362–72. http://dx.doi.org/10.1093/jee/toab076.

Full text
Abstract:
Stink bugs represent an increasing risk to soybean production in the Midwest region of the United States. The current sampling protocol for stink bugs in this region is tailored for population density estimation and thus is more relevant to research purposes. A practical decision-making framework with more efficient sampling effort for management of herbivorous stink bugs is needed. Therefore, a binomial sequential sampling plan was developed for herbivorous stink bugs in the Midwest region. A total of 146 soybean fields were sampled across 11 states using sweep nets in 2016, 2017, and 2018. The binomial sequential sampling plans were developed using combinations of five tally thresholds at two proportion infested action thresholds to identify those that provided the best sampling outcomes. Final assessment of the operating characteristic curves for each plan indicated that a tally threshold of 3 stink bugs per 25 sweeps, and proportion infested action thresholds of 0.75 and 0.95 corresponding to the action thresholds of 5 and 10 stink bugs per 25 sweeps, provided the optimal balance between highest probability of correct decisions (≥ 99%) and lowest probability of incorrect decisions (≤ 1%). In addition, the average sample size for both plans (18 and 12 sets of 25 sweeps, respectively) was lower than that for the other proposed plans. The binomial sequential sampling plan can reduce the number of sample units required to achieve a management decision, which is important because it can potentially reduce risk/cost of management for stink bugs in soybean in this region.
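A binomial sequential classification rule of the general (Wald SPRT) kind that underlies such plans can be sketched as follows. The proportions and error rates below are illustrative, not the fitted plans from the study:

```python
import math

# Wald's sequential probability ratio test for a binomial proportion.
# P0/P1 bracket the proportion-infested action threshold; ALPHA/BETA are
# tolerated error rates. A sample unit (e.g. a 25-sweep set) counts as
# "infested" if it reaches the tally threshold. Illustrative values only.
P0, P1 = 0.55, 0.95
ALPHA, BETA = 0.05, 0.05

def sprt(infested_sequence):
    """Classify after each sample unit: 'treat', 'no_treat', or 'continue'."""
    upper = math.log((1 - BETA) / ALPHA)
    lower = math.log(BETA / (1 - ALPHA))
    llr = 0.0                               # cumulative log-likelihood ratio
    for infested in infested_sequence:
        if infested:
            llr += math.log(P1 / P0)
        else:
            llr += math.log((1 - P1) / (1 - P0))
        if llr >= upper:
            return "treat"
        if llr <= lower:
            return "no_treat"
    return "continue"
```

Sampling stops as soon as the log-likelihood ratio crosses either boundary, which is what reduces the expected number of sample units relative to fixed-size sampling.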
13

Thurtle, David, Sabrina H. Rossi, Brendan Berry, Paul Pharoah, and Vincent J. Gnanapragasam. "Models predicting survival to guide treatment decision-making in newly diagnosed primary non-metastatic prostate cancer: a systematic review." BMJ Open 9, no. 6 (June 2019): e029149. http://dx.doi.org/10.1136/bmjopen-2019-029149.

Full text
Abstract:
Objectives: Men diagnosed with non-metastatic prostate cancer require standardised and robust long-term prognostic information to help them decide on management. Most currently used tools rely on short-term and surrogate outcomes. We explored the evidence base in the literature on available pre-treatment prognostic models built around long-term survival and assessed the accuracy, generalisability and clinical availability of these models. Design: Systematic literature review, pre-specified and registered on PROSPERO (CRD42018086394). Data sources: MEDLINE, Embase and The Cochrane Library were searched from January 2000 through February 2018, using previously tested search terms. Eligibility criteria: Inclusion required a multivariable prognostic model for non-metastatic prostate cancer, using long-term survival data (defined as ≥5 years), which was not treatment-specific and was usable at the point of diagnosis. Data extraction and synthesis: Title, abstract and full-text screening were performed sequentially by three reviewers. Data extraction was performed for items in the CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies (CHARMS) checklist. Individual studies were assessed using the new Prediction model Risk Of Bias ASsessment Tool. Results: Database searches yielded 6581 studies after deduplication. Twelve studies were included in the final review. Nine were model development studies using data from over 231 888 men. However, only six of the nine studies included any conservatively managed cases, and only three of the nine included treatment as a predictor variable. Every included study had at least one parameter with a high risk of bias; failure to report accuracy and inadequate reporting of missing data were common failings. Three external validation studies were included, reporting two available models: the University of California San Francisco (UCSF) Cancer of the Prostate Risk Assessment score and the Cambridge Prognostic Groups. Neither included treatment effect, and both had potential flaws in design, but they represent the most robust and usable prognostic models currently available. Conclusion: Few long-term prognostic models exist to inform decision-making at diagnosis of non-metastatic prostate cancer. Improved models are required to inform management and avoid undertreatment and overtreatment of non-metastatic prostate cancer.
14

Dosi, Clio, Manuel Iori, Arthur Kramer, and Matteo Vignoli. "Computational Simulation as an Organizational Prototyping Tool." Proceedings of the Design Society: International Conference on Engineering Design 1, no. 1 (July 2019): 1105–14. http://dx.doi.org/10.1017/dsi.2019.116.

Full text
Abstract:
This case study deals with a redesign effort to address overcrowding in an Emergency Department (ED). A multidisciplinary group of healthcare professionals and engineers worked together to improve the existing processes. We integrate simulation modeling into a human-centered design method, using the simulation technique as a learning and experimentation tool within a design thinking process: computational discrete-event simulation helps explore the possible scenarios to be prototyped. We used the simulation to create a virtual prototyping environment, helping the group begin a safe ideation and prototyping effort. Virtual prototyping injected into the organizational context the possibility of experimenting; it represented a cognitively low-risk environment where professionals could explore possible alternative solutions. Upon those solutions, we developed organizational prototyping tools. Top management and head physicians gained confidence for more grounded decision-making, and important choices on change management and investments were made.
15

Ciarallo, Frank W., Raymond R. Hill, Sriram Mahadevan, Vikrant Chopra, Patrick J. Vincent, and Christoper S. Allen. "Building the Mobility Aircraft Availability Forecasting (MAAF) Simulation Model and Decision Support System." Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 2, no. 2 (April 2005): 57–69. http://dx.doi.org/10.1177/154851290500200202.

Full text
Abstract:
The Mobility Aircraft Availability Forecasting (MAAF) model prototype development and study effort was initiated to help the United States Air Force Air Mobility Command (AMC) answer the question, “How can we accurately predict mission capable (MC) rates?” While perfect prediction of aircraft MC rates is not possible, we investigate a simulation-based risk analysis approach. Current prediction methods utilize “after the fact” analyses and user opinion, making it difficult to perform quick, accurate, and effective analyses of potential limiting factors and policy changes, particularly in time-sensitive situations. This paper describes the MAAF proof-of-concept model and decision support system built to provide AMC managers the dynamic, predictive tools needed to better forecast aircraft availability. The simulation component featured new capabilities for mobility modeling to include dynamic definition of the configuration of a mobility system, dynamic definition of the capabilities of the individual airbases within a mobility system, improved representation of the aircraft objects within the model, and a new approach to modeling aircraft maintenance including the realistic consideration of partially mission capable aircraft. The development efforts and sample experimental results are recounted in this paper.
16

Aljohani, Naif Radi, Ayman Fayoumi, and Saeed-Ul Hassan. "Predicting At-Risk Students Using Clickstream Data in the Virtual Learning Environment." Sustainability 11, no. 24 (December 17, 2019): 7238. http://dx.doi.org/10.3390/su11247238.

Full text
Abstract:
In higher education, predicting the academic performance of students is associated with formulating optimal educational policies that strongly impact economic and financial development. In online educational platforms, the captured clickstream information of students can be exploited in ascertaining their performance. In the current study, the time-series sequential classification problem of students’ performance prediction is explored by deploying a deep long short-term memory (LSTM) model using the freely accessible Open University Learning Analytics dataset. In the pass/fail classification task, the deployed LSTM model outperformed the state-of-the-art approaches with 93.46% precision and 75.79% recall. Encouragingly, our model surpassed the baseline logistic regression and artificial neural networks by 18.48% and 12.31%, respectively, with 95.23% learning accuracy. We demonstrated that the clickstream data generated due to the students’ interaction with the online learning platforms can be evaluated at a week-wise granularity to improve the early prediction of at-risk students. Interestingly, our model can predict pass/fail class with around 90% accuracy within the first 10 weeks of student interaction in a virtual learning environment (VLE). A contribution of our research is an informed approach to advanced higher education decision-making towards sustainable education. It is a bold effort for student-centric policies, promoting the trust and the loyalty of students in courses and programs.
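The week-wise sequential scoring described above can be sketched with a single LSTM cell in pure Python. This is an illustrative toy, not the paper's trained deep model: every weight in `W` and the `weekly_clicks` inputs are invented for demonstration.

```python
# Minimal sketch (assumed toy weights, not the paper's model): an LSTM cell
# folds one feature per week of VLE activity into a pass/fail probability.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    # Each gate combines the current input x with the previous hidden state h.
    i = sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h + w["bg"])  # candidate cell state
    c = f * c + i * g
    h = o * math.tanh(c)
    return h, c

def pass_probability(weekly_clicks, w):
    h = c = 0.0
    for x in weekly_clicks:          # one LSTM step per week of clickstream
        h, c = lstm_step(x, h, c, w)
    return sigmoid(w["wy"] * h + w["by"])  # read out a pass probability

# Invented weights: positive input weights so more clicks raise the score.
W = {"wi": 0.5, "ui": 0.1, "bi": 0.0, "wf": 0.5, "uf": 0.1, "bf": 1.0,
     "wo": 0.5, "uo": 0.1, "bo": 0.0, "wg": 0.5, "ug": 0.1, "bg": 0.0,
     "wy": 4.0, "by": -1.0}

active = pass_probability([1.0] * 10, W)    # 10 weeks of high activity
inactive = pass_probability([0.0] * 10, W)  # 10 weeks of no activity
```

With these toy weights, a student showing ten weeks of activity scores higher than an inactive one, mirroring the week-wise early-warning idea.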
17

Chen, Nathan, David Rey, and Lauren Gardner. "Multiscale Network Model for Evaluating Global Outbreak Control Strategies." Transportation Research Record: Journal of the Transportation Research Board 2626, no. 1 (January 2017): 42–50. http://dx.doi.org/10.3141/2626-06.

Full text
Abstract:
High volumes of passenger air travel increase the risk of infectious disease epidemics and pandemics. Regional preparedness planning for large-scale outbreaks requires models that are able to capture outbreak dynamics within a control policy evaluation framework. Previous studies focused on either modeling outbreak dynamics or optimizing outbreak control decisions; this paper proposes an integrated approach that combines both aspects. A multiscale epidemic outbreak model is introduced that is designed to capture the infection dynamics at both the local (city) scale and the global (air travel) scale. A bilevel decision-making framework is then proposed to identify the optimal set of outbreak control policies, while accounting for local and global outbreak dynamics. The model is implemented for a case study in which a hypothetical epidemic outbreak is assumed to emerge from within the United States, and different control resource allocation strategies are explored and evaluated. The results highlight the importance of accounting for outbreak dynamics within the decision-making process and provide insight into the design and efficiency of a range of control strategies. This research is an initial effort to be followed by further research on the design of outbreak control strategies by using optimization algorithms under this framework.
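A minimal sketch of the multiscale idea, assuming a toy two-city SIR system coupled by a daily travel rate. All populations, rates, and the threshold are invented; the paper's actual model and its bilevel control framework are far richer.

```python
# Sketch (assumed parameters): local city-scale SIR dynamics coupled by a
# global travel term; lower travel rates delay the outbreak's arrival in
# the second, initially disease-free city.
def days_until_outbreak(travel_rate, threshold=100.0,
                        beta=0.3, gamma=0.1, horizon=365):
    """Day on which city B's infections first reach `threshold`, else None."""
    s = [9990.0, 10000.0]   # susceptibles: city A (seeded) and city B
    i = [10.0, 0.0]         # infected
    r = [0.0, 0.0]          # recovered
    for day in range(1, horizon + 1):
        new_i = []
        for k in range(2):
            n = s[k] + i[k] + r[k]
            infections = beta * s[k] * i[k] / n   # local (city) scale
            s[k] -= infections
            new_i.append(i[k] + infections - gamma * i[k])
            r[k] += gamma * i[k]
        # Global (air-travel) scale: a fraction of infected travelers
        # moves between the two cities each day.
        flow = travel_rate * (new_i[0] - new_i[1])
        i = [new_i[0] - flow, new_i[1] + flow]
        if i[1] >= threshold:
            return day
    return None
```

In this sketch, a travel-restriction "control policy" (a smaller `travel_rate`) pushes city B's outbreak later, which is the kind of effect the bilevel framework evaluates.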
18

Reichardt, Uta, Gudmundur F. Ulfarsson, and Gudrun Petursdottir. "Cooperation Between Science and Aviation-Sector Service Providers in Europe for Risk Management of Volcanic Ash." Transportation Research Record: Journal of the Transportation Research Board 2626, no. 1 (January 2017): 99–105. http://dx.doi.org/10.3141/2626-12.

Full text
Abstract:
The eruption of Eyjafjallajökull in April–May 2010 (hereafter E2010) revealed the fragility of air traffic in the case of an ash-producing volcanic eruption. This study examines developments since E2010 of cooperation between science and aviation-sector service providers toward efforts for improved resilience against a new volcanic eruption. The research builds on literature and interviews with representatives from research and regulatory institutes, air traffic managers, aircraft operators, and engine manufacturers across Europe. The article describes how scientific advice was requested to revise the regulatory precautionary approach and reopen airspace during E2010. The paper depicts the increased effort of scientific advancement in the understanding of ash characterization, modeling of the volcanic ash plume, and atmospheric environment. Furthermore, cross-disciplinary workshops and the memorandum of understanding between Icelandic and British institutions are examined to document increased cooperation between scientists and aviation-sector service providers to provide support to decision makers. However, the science needed for improved risk management is complex and depends on the effects of volcanic ash on jet engines. The concentration levels decided on over the course of a few days in 2010 have not been revised, and the aviation industry does not seem to prioritize research into these issues. A dialogue is needed between science, governance, and engine manufacturers, as well as more collective research funding to test jet engines to improve informed decision making, rather than leaving such research only to the manufacturers and internal political agendas.
19

Liu, Feng, Jian-Jun Wang, Haozhe Chen, and De-Li Yang. "Machine scheduling with outsourcing." International Journal of Logistics Management 25, no. 1 (May 6, 2014): 133–59. http://dx.doi.org/10.1108/ijlm-12-2012-0142.

Full text
Abstract:
Purpose – The purpose of this paper is to study the use of outsourcing as a mechanism to cope with supply chain uncertainty, more specifically, how to deal with the sudden arrival of higher-priority jobs that require immediate processing in an in-house manufacturer's facility, from the perspective of outsourcing. An operational-level schedule of production and distribution of outsourced jobs to the manufacturer's facility should be determined for the subcontractor in order to achieve overall optimality. Design/methodology/approach – The problem is bi-criteria in that it considers both the transportation cost, measured by the number of delivery vehicles, and the schedule performance, measured by the jobs' delivery times. In order to obtain the problem's Pareto front, we propose a dynamic programming (DP) heuristic solution procedure based on integrated decision making, and population-heuristic solution procedures using different encoding schemes based on sequential decision making. Computational studies are designed and carried out by randomly generating comparative variations of numerical problem instances. Findings – By comparing several existing performance metrics for the obtained Pareto fronts, it is found that the DP heuristic outperforms the population-heuristic in both solution diversity and proximity to the optimal Pareto front. Also, in the population-heuristic, sub-range keys representation appears to be a better encoding scheme for the problem than random keys representation. Originality/value – This study contributes to the limited yet important body of knowledge on using an outsourcing approach to cope with possible supply chain disruptions in production scheduling due to sudden customer orders. More specifically, we used a modeling methodology to confirm the importance of collaboration with subcontractors for effective supply chain risk management.
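Comparing heuristics on a bi-criteria problem rests on extracting the non-dominated solutions. A minimal sketch of that filtering step, with invented (vehicles, total delivery time) pairs rather than the paper's instances:

```python
# Sketch: non-dominated filtering for two minimisation criteria,
# e.g. (number of delivery vehicles, total delivery time).
def pareto_front(solutions):
    """Return the sorted set of non-dominated (cost1, cost2) pairs.

    A solution is dominated if some other solution is no worse in both
    criteria and differs from it (hence strictly better in at least one).
    """
    front = []
    for a in solutions:
        dominated = any(
            b[0] <= a[0] and b[1] <= a[1] and b != a
            for b in solutions
        )
        if not dominated:
            front.append(a)
    return sorted(set(front))
```

Performance metrics such as diversity and proximity, as used in the paper's comparisons, are then computed over fronts like the one this function returns.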
20

Tepa-Yotto, Ghislain T., Henri E. Z. Tonnang, Georg Goergen, Sevgan Subramanian, Emily Kimathi, Elfatih M. Abdel-Rahman, Daniel Flø, et al. "Global Habitat Suitability of Spodoptera frugiperda (JE Smith) (Lepidoptera, Noctuidae): Key Parasitoids Considered for Its Biological Control." Insects 12, no. 4 (March 24, 2021): 273. http://dx.doi.org/10.3390/insects12040273.

Full text
Abstract:
The present study is the first modeling effort at a global scale to predict habitat suitability of the fall armyworm (FAW), Spodoptera frugiperda, and its key parasitoids, namely Chelonus insularis, Cotesia marginiventris, Eiphosoma laphygmae, Telenomus remus and Trichogramma pretiosum, to be considered for biological control. An adjusted procedure of a machine-learning algorithm, the maximum entropy (Maxent) method, was applied for the modeling experiments. Model predictions showed particularly high establishment potential of the five hymenopteran parasitoids in areas that are heavily affected by FAW (like the coastal belt of West Africa from Côte d’Ivoire (Ivory Coast) to Nigeria, the Congo basin to Eastern Africa, Eastern, Southern and Southeastern Asia and some portions of Eastern Australia) and those at potential invasion risk (western and southern Europe). These habitats can be priority sites for scaling FAW biocontrol efforts. In the context of global warming and in the event of accidental FAW introduction, warmer parts of Europe are at high risk. The effect of winter on the survival and life cycle of the pest in Europe and other temperate regions of the world is discussed in this paper. Overall, the models provide pioneering information to guide decision making for biologically based medium- and long-term management of FAW across the globe.
21

Mikhel’kevich, V. N., L. P. Ovchinnikova, and L. V. Seryapina. "INFORMATION AND METHODOLOGICAL SUPPORT OF STUDENTS’ SELF-GOVERNING INDEPENDENT WORK." Izvestiya of the Samara Science Centre of the Russian Academy of Sciences. Social, Humanitarian, Medicobiological Sciences 22, no. 75 (2020): 58–63. http://dx.doi.org/10.37313/2413-9645-2020-22-75-58-63.

Full text
Abstract:
The author presents the results of scientific research on informational and educational support of students’ self-organizing independent work, which is complex and comprehensive, because it includes the sequential and consistent execution of large stages of educational and cognitive activity requiring a creative approach to decision making. The starting point of this activity is an assignment from the lead instructor to study and acquire given learning material. Having received the assignment, the student assesses the individual work effort required to complete it, sets time limits, draws up a schedule, arranges the workplace, gets acquainted with the information and training materials available and, if necessary, searches for additional required literature. After that, the student studies the theoretical material and performs practical exercises and tasks. Finally, the student carries out self-assessment of the acquired knowledge and skills. If the acquired knowledge and skills do not fully meet the assessment requirements, the student re-examines the learning material or corrects the mistakes in the practical exercise. The next stage is to submit the accomplished work to the lead instructor. For a holistic and figurative appreciation of this complex didactic process, pedagogical science uses system modeling. In the present article the author introduces a functional model of students’ self-organizing independent work adapted to distance education. The model under consideration is of practical interest because it can be applied to forecast the process of knowledge acquisition in students’ self-organizing independent work and to monitor the influence every link of this work has on the final result of independent work. In the long run, a decision is made as to which educational technology proves the best.
The efficiency of informational and educational support in the structure of the presented model has been confirmed by the results of the pilot study and final testing.
22

PERSONA, ALESSANDRO, DARIA BATTINI, MAURIZIO FACCIO, MAURIZIO BEVILACQUA, and FILIPPO EMANUELE CIARAPICA. "CLASSIFICATION OF OCCUPATIONAL INJURY CASES USING THE REGRESSION TREE APPROACH." International Journal of Reliability, Quality and Safety Engineering 13, no. 02 (April 2006): 171–91. http://dx.doi.org/10.1142/s0218539306002197.

Full text
Abstract:
Occupational safety and illness surveillance has made a great effort to spread a "safety culture" to all workplaces, and a great deal of progress has been made in finding solutions that guarantee safer working conditions. This paper analyses occupational injury data in order to identify specific risk groups and factors that in turn could be further analyzed to define prevention measures. A technique based on rule induction is put forward as a non-parametric alternative tool for analyzing occupational injury data which specifically uses the Classification And Regression Tree (CART) approach. Application of this technique to relevant work-related injury data collected in Italy has been encouraging. Data referring to 156 cases of injury in the period 2000–2002 were analyzed, leading to the identification of the factors that most affect work-related injuries. According to the literature, up to the time of writing, computer-intensive non-parametric modeling procedures had never been used to analyze occupational injuries. The aim of this paper is to use a real-world application to illustrate the advantages and flexibility of applying a typical non-parametric epidemiological tool, such as CART, to an occupational injury study. This application can provide more informative, flexible, and attractive models identifying potential risk areas in support of decision-making in safety management.
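One step of the CART approach, choosing the split on a numeric risk factor that minimises Gini impurity, can be sketched as follows. The feature (months of experience), labels, and data are invented illustrations, not the study's 156 injury cases.

```python
# Sketch (invented data): one CART split step minimising weighted Gini
# impurity of the two resulting groups.
def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n       # fraction of positive (e.g. severe-injury) cases
    return 2.0 * p * (1.0 - p)

def best_split(values, labels):
    """Threshold on one numeric risk factor giving the lowest weighted Gini."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best
```

A full regression tree applies this step recursively; here a single split already separates the toy "inexperienced" group from the rest.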
23

Joshi, Rushikesh S., Darryl Lau, Justin K. Scheer, Miquel Serra-Burriel, Alba Vila-Casademunt, Shay Bess, Justin S. Smith, Ferran Pellise, and Christopher P. Ames. "State-of-the-art reviews predictive modeling in adult spinal deformity: applications of advanced analytics." Spine Deformity 9, no. 5 (May 18, 2021): 1223–39. http://dx.doi.org/10.1007/s43390-021-00360-0.

Full text
Abstract:
Adult spinal deformity (ASD) is a complex and heterogeneous disease that can severely impact patients’ lives. While it is clear that surgical correction can achieve significant improvement of spinopelvic parameters and quality of life measures in adults with spinal deformity, there remains a high risk of complication associated with surgical approaches to adult deformity. Over the past decade, utilization of surgical correction for ASD has increased dramatically as deformity correction techniques have become more refined and widely adopted. Along with this increase in surgical utilization, there has been a massive undertaking by spine surgeons to develop more robust models to predict postoperative outcomes in an effort to mitigate the relatively high complication rates. A large part of this revolution within spine surgery has been the gradual adoption of predictive analytics harnessing artificial intelligence through the use of machine learning algorithms. The development of predictive models to accurately prognosticate patient outcomes following ASD surgery represents a dramatic improvement over prior statistical models which are better suited for finding associations between variables than for their predictive utility. Machine learning models, which offer the ability to make more accurate and reproducible predictions, provide surgeons with a wide array of practical applications from augmenting clinical decision making to more wide-spread public health implications. The inclusion of these advanced computational techniques in spine practices will be paramount for improving the care of patients, by empowering both patients and surgeons to more specifically tailor clinical decisions to address individual health profiles and needs.
24

Kahn, Rebecca, Ayesha S. Mahmud, Andrew Schroeder, Luis Hernando Aguilar Ramirez, John Crowley, Jennifer Chan, and Caroline O. Buckee. "Rapid Forecasting of Cholera Risk in Mozambique: Translational Challenges and Opportunities." Prehospital and Disaster Medicine 34, no. 05 (September 3, 2019): 557–62. http://dx.doi.org/10.1017/s1049023x19004783.

Full text
Abstract:
Disasters, such as cyclones, create conditions that increase the risk of infectious disease outbreaks. Epidemic forecasts can be valuable for targeting highest-risk populations before an outbreak. The two main barriers to routine use of real-time forecasts include scientific and operational challenges. First, accuracy may be limited by availability of data and the uncertainty associated with the inherently stochastic processes that determine when and where outbreaks happen and spread. Second, even if data are available, the lack of appropriate channels of communication may prevent their use for decision making. In April 2019, only six weeks after Cyclone Idai devastated Mozambique’s central region and sparked a cholera outbreak, Cyclone Kenneth severely damaged northern areas of the country. By June 10, a total of 267 cases of cholera were confirmed, prompting a vaccination campaign. Prior to Kenneth’s landfall, a team of academic researchers, humanitarian responders, and health agencies developed a simple model to forecast areas at highest risk of a cholera outbreak. The model created risk indices for each district using combinations of four metrics: (1) flooding data; (2) previous annual cholera incidence; (3) sensitivity of previous outbreaks to the El Niño-Southern Oscillation cycle; and (4) a diffusion (gravity) model to simulate movement of infected travelers. As information on cases became available, the risk model was continuously updated. A web-based tool was produced, which identified the highest-risk populations prior to the cyclone and the districts at risk following the start of the outbreak. The model prior to Kenneth’s arrival, using the metrics of previous incidence, projected flood, and El Niño sensitivity, accurately predicted areas at highest risk for cholera.
Despite this success, not all data were available at the scale at which the vaccination campaign took place, limiting the model’s utility, and the extent to which the forecasts were used remains unclear. Here, the science behind these forecasts and the organizational structure of this collaborative effort are discussed. The barriers to the routine use of forecasts in crisis settings are highlighted, as well as the potential for flexible teams to rapidly produce actionable insights for decision making using simple modeling tools, both before and during an outbreak.
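A hedged sketch of how the four metrics might be combined into a district risk index, with a simple gravity term for traveler movement. The weights, field names, and all numbers below are assumptions for illustration, not the team's actual model.

```python
# Sketch (assumed weights and fields): a weighted district risk index over
# the four metrics named in the abstract, plus a gravity-model flow term.
def gravity_flow(pop_a, pop_b, distance_km):
    # Gravity assumption: traveler flow scales with the product of
    # populations and decays with the square of distance.
    return pop_a * pop_b / distance_km ** 2

def risk_index(district, weights):
    # Linear combination of normalised metrics in [0, 1]:
    # flooding, previous incidence, ENSO sensitivity, and an
    # importation score derived from gravity-model flows.
    keys = ("flood", "past_incidence", "enso_sensitivity", "importation")
    return sum(weights[k] * district[k] for k in keys)
```

In a tool like the one described, districts would be ranked by this index and the weights updated as case reports arrive.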
25

Pramudya, Ikhsan, Abdul Rauf, and Asbar Asbar. "ANALISIS KERENTANAN PENGELOLAAN WILAYAH PESISIR DITINJAU DARI PRESPEKTIF MITIGASI BENCANA DI KABUPATEN BADUNG PROVINSI BALI." JOURNAL OF INDONESIAN TROPICAL FISHERIES (JOINT-FISH) : Jurnal Akuakultur, Teknologi Dan Manajemen Perikanan Tangkap, Ilmu Kelautan 2, no. 2 (December 29, 2019): 174–91. http://dx.doi.org/10.33096/joint-fish.v2i2.50.

Full text
Abstract:
The vulnerability identification of coastal areas in this study consisted of the level of danger (earthquake and tsunami), physical vulnerability, and non-physical vulnerability. The hazard level was determined using modeling based on the seismic history in the subduction zone south of Bali, while the levels of physical and non-physical vulnerability were analyzed using descriptive analysis and the Analytical Hierarchy Process (AHP). The analysis process determines the ranking used as an indicator for decision making to create a vulnerability level map of the research area through a Geographic Information System (GIS) with three levels, namely high, medium, and low. This study also formulated coastal area management policies from the perspective of disaster mitigation using SWOT analysis. The results of the analysis show that four kelurahan in Badung Regency have high levels of vulnerability: Kuta, Tuban, Kedonganan, and Tibubeneng. Seven kelurahan are at a medium level of vulnerability: Jimbaran, Benoa, Tanjung Benoa, Legian, Seminyak, Canggu, and Dalung; four kelurahan have low levels of vulnerability: Pecatu, Ungasan, Kutuh, and Kerobokan. In general, the research area is at a high to moderate level of vulnerability; this shows that a comprehensive disaster mitigation effort is needed by implementing several formulated strategies, including 1) developing disaster-resilient village programs by establishing disaster risk reduction forums and training village volunteers in disaster mitigation, 2) maximizing community knowledge of disasters and mitigation to cope with high earthquake and tsunami hazard levels, and 3) effectively implementing regional regulations on disaster-mitigation-based spatial planning to control disaster risk areas and utilizing green lines as evacuation routes and meeting points.
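The AHP ranking step can be illustrated with the geometric-mean approximation of priority weights from a pairwise comparison matrix. The matrix below is invented, not the study's expert judgments.

```python
# Sketch (invented comparison matrix): AHP priority weights via the
# geometric-mean method, a standard approximation of the principal
# eigenvector for small matrices.
import math

def ahp_weights(matrix):
    """Normalised priority weights from a square pairwise comparison matrix."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]
```

For a perfectly consistent 2x2 matrix stating criterion 1 is three times as important as criterion 2, the weights come out 0.75 and 0.25; these weights would then score each kelurahan's vulnerability indicators.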
26

Moon, Jukrin, Farzan Sasangohar, S. Camille Peres, Timothy J. Neville, and Changwon Son. "Modeling Team Cognition in Emergency Response via Naturalistic Observation of Team Interactions." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 62, no. 1 (September 2018): 1801–2. http://dx.doi.org/10.1177/1541931218621408.

Full text
Abstract:
Emergency responders work collectively as an ad hoc team to save lives and infrastructures at risk, despite their varying experience, knowledge, and cultural backgrounds, and difficult working conditions with high levels of uncertainty and time pressure. Cognition, in particular, has gained attention as a key construct to consider in collective response efforts in emergency management. Team cognition, however, has not been fully appreciated or adequately addressed in the field of emergency response (Bigley & Roberts, 2001). The interactionist perspective (or interactive team cognition) effectively captures team cognition in the heterogeneous and dynamic teams prevalent in the real world (Cooke & Gorman, 2009; Cooke, Gorman, Myers, & Duran, 2013). Although researchers in the emergency response discipline appreciate the value of viewing team cognition as interaction (Comfort, 2007; Bergeron & Cooren, 2012; Wolbers & Boersma, 2013), associated empirical or interventional attempts using this perspective remain scarce. Tracing the scarcity of literature back to a lack of context-specific theorizing efforts (Moon, Peres, & Sasangohar, 2017), an observation-based, theory-building approach is utilized here to address this gap. The naturalistic observational study presented here is an initial effort to explore team cognition for an incident management team (IMT) as an interactive system. An IMT is an ad hoc team of command-level responders. Interestingly, an IMT is a team of functional sub-teams or sections (i.e., Command, Planning, Operations, Logistics, and Finance/Administration). Within each sub-team there is also a team of functional units. This naturalistic observational study was conducted at a high-fidelity simulator replicating a generic IMT facility, i.e., the emergency operations training center (EOTC), College Station, TX.
Interactions were observed and coded in terms of who initiated the interaction and with whom, which technology was being used, and what was communicated and for what purpose. The purpose of this study is to develop a theoretical interactionist model of team cognition in emergency response, to inform future interventional attempts to improve team decision-making. To do so, this study views a Plans team as a cognitive system capable of managing information through interdependent, nonlinear, and dynamic interactive behaviors for perceiving (P), diagnosing (D), and adapting (A) to the changes in the status of critical elements (Adapted from Moon et al., 2017). The proposed P·D·A model posits the following three premises: (1) a Plans team is a cognitive system where its team cognition is interactions of team members to complete a cognitive task; (2) team cognition for each of the three sub-teams of a Plans team is tied to the context-specific cognitive tasks of perceiving (P), diagnosing (D), and adapting (A) to the changes in the status of critical elements; and (3) team cognition for a Plans team is manifested as nonlinear, interdependent, and dynamic interactions within and among P, D, and A of the three sub-teams of the Plans team. Preliminary results from a content analysis of transcribed and coded interactions suggest that an Info/Intel unit, a Situation unit, and a Section Chief unit can be hypothesized to be critical contributors of team cognition for a Plans team in terms of P, D, and A, respectively. These hypotheses can be represented with network centrality measures as follows: Hypothesis 1. An Info/Intel unit has high in-degree and out-degree centrality with non-Plans teams. Hypothesis 2. A Situation unit has high betweenness centrality within a Plans team. Hypothesis 3. A Section Chief unit has high in-degree and out-degree centrality within a Plans team, and high betweenness centrality between the Plans team and non-Plans teams. 
The proposed P·D·A model illustrates the benefits of viewing team cognition as interaction within and among a team of teams, for context-specific tasks of P, D, and A. Most importantly, the model effectively captures the nonlinear, interdependent, and dynamic nature of team cognition as interaction in a multiteam system, or MTS (Marks, DeChurch, Mathieu, Panzer, & Alonso, 2005; Bienefeld & Grote, 2014), embedded in complex socio-technical systems, STS (Vicente, 2002). As the information processing model views an individual as a cognitive system or a human information processing system (Wickens, 1992), the P·D·A model views a team as a cognitive system capable of managing information. The interactionist perspective on team cognition helps the P·D·A model to realize its potential to extend an individual cognition model to a team level. The interactionist perspective is “compatible with the view of human-machine system as a unitary system” (Cooke & Gorman, 2009, p. 28). In addition to the theoretical and practical implications, this study has methodological implications. Measuring interactive team cognition with network-based metrics (currently in progress) will open a new chapter. The need of incorporating a network perspective into team cognition in emergency response is in line with the literature (Wolbers & Boersma, 2013; Steigenberger, 2016). As a future work, the P·D·A model will be further developed with network and content analysis and validated through interviews with Subject Matter Experts (SMEs) involved in Hurricane Harvey.
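The network-centrality hypotheses can be made concrete with a brute-force computation on a toy interaction graph. The node names echo the abstract's units, but the edges are invented and the method is a naive enumeration suitable only for tiny networks, not a production centrality library.

```python
# Sketch (invented edges): degree and betweenness centrality on a small
# directed interaction network, by brute-force shortest-path enumeration.
from collections import deque

def degree_centrality(edges, node):
    """Return (in-degree, out-degree) of `node` in a directed edge list."""
    out_d = sum(1 for u, v in edges if u == node)
    in_d = sum(1 for u, v in edges if v == node)
    return in_d, out_d

def shortest_paths(adj, s, t):
    # BFS that keeps whole paths; returns every minimum-length s->t path.
    best, out = None, []
    queue = deque([[s]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            continue
        if path[-1] == t:
            best = len(path)
            out.append(path)
            continue
        for nxt in adj[path[-1]]:
            if nxt not in path:
                queue.append(path + [nxt])
    return out

def betweenness(edges, nodes):
    # Credit each interior node with its share of shortest paths per pair.
    adj = {n: [v for u, v in edges if u == n] for n in nodes}
    score = {n: 0.0 for n in nodes}
    for s in nodes:
        for t in nodes:
            if s == t:
                continue
            paths = shortest_paths(adj, s, t)
            for p in paths:
                for v in p[1:-1]:
                    score[v] += 1.0 / len(paths)
    return score
```

On a toy chain ops–info–situation–chief, the Situation unit sits between the Plans chain's ends (Hypothesis 2) while the Info/Intel unit bridges to the non-Plans node, in the spirit of Hypotheses 1 and 3.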
27

Li, Huiping, Hope S. Rugo, Jin Zhang, Zhimin Shao, Zhenzhou Shen, Binhe Xu, Jiong Wu, et al. "Interpreting Advanced Breast Cancer Consensus Guidelines for Use in China." Journal of Global Oncology 2, no. 3_suppl (June 2016): 36s—37s. http://dx.doi.org/10.1200/jgo.2016.004028.

Full text
Abstract:
Abstract 58 Background: In 2011, an international panel of breast cancer experts developed the first Advanced Breast Cancer (ABC) Consensus Guidelines to provide standards and improved care for the multidisciplinary care of patients with this common disease. We sought to adapt the ABC guidelines for China, incorporating cultural standards and available Chinese resources, and identifying a suitably adapted guideline. Methods: We organized the Chinese Consensus Guidelines Conference for ABC (CABC) yearly from 2013 through 2015 in Beijing as a joint effort between the China Medical Women's Association, the Organization of Beijing Sunshine Great Wall Oncology Program, and Peking University. The panel included 50 breast oncology and surgery experts from 20 provinces, as well as two external consultant oncologists from the U.S. and Singapore. Permission was obtained from the ABC Chair to use the guidelines as a basis for our discussion. All questions were presented and discussed in detail, including a review of current applicable data, and panel members voted on each question. Results: The main issues discussed included: 1. In China, patient treatment decisions are generally made by family members. 2. For standard-risk patients, most experts in China still prefer combination therapy over sequential single-agent chemotherapy. 3. Trastuzumab is not covered by health insurance in China, and pertuzumab is not yet available. 5. For hormone receptor-positive ABC, some physicians in China prefer to start with chemotherapy. 7. Not well accepted by Chinese patients. Details of the final voting and the Chinese consensus will be presented. Conclusions: Standard guidelines are critical, but must be tailored to be used effectively in specific countries. The CABC has effectively discussed, modified, and distributed guidelines for the treatment of ABC in China. AUTHORS' DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST: Huiping Li No relationship to disclose Hope S.
Rugo Honoraria: Genomic Health Speakers' Bureau: Genomic Health Research Funding: Plexxikon, Macrogenics, OBI Pharma, Eisai, Pfizer, Novartis, Eli Lilly, GlaxoSmithKline, Genentech, Celsion, Nektar, Merck, Amgen Travel, Accommodations, Expenses: Novartis, Nektar, Roche/Genentech, OBI Pharma, Mylan Jin Zhang No relationship to disclose Zhimin Shao No relationship to disclose Zhenzhou Shen No relationship to disclose Binhe Xu No relationship to disclose Jiong Wu No relationship to disclose Zefei Jiang No relationship to disclose Erwei Song No relationship to disclose Yinhua Liu No relationship to disclose Xichun Hu No relationship to disclose Cuizhi Geng No relationship to disclose Bo Li No relationship to disclose Jinhai Tang No relationship to disclose Jifeng Feng No relationship to disclose Pin Zhang No relationship to disclose Junlan Yang No relationship to disclose Qingyuan Zhang No relationship to disclose Jian Liu No relationship to disclose Yuee Teng No relationship to disclose Yongsheng Wang No relationship to disclose Zhongsheng Tong No relationship to disclose Guohong Song No relationship to disclose Peng Yuan No relationship to disclose Hongmei Zhao No relationship to disclose Wuyun Su No relationship to disclose Tao Sun No relationship to disclose Seng-Weng Wong Consulting or Advisory Role: MSD Oncology, Novartis, Roche, Pfizer Speakers' Bureau: MSD Oncology, Bayer, Novartis Travel, Accommodations, Expenses: Bayer, Roche, Merck Serono, Boehringer Ingelheim Yanshen Lu No relationship to disclose Yongchang Zhou No relationship to disclose
28

Sahni, Isha, and Roland N. Horne. "Multiresolution Wavelet Analysis for Improved Reservoir Description." SPE Reservoir Evaluation & Engineering 8, no. 01 (February 1, 2005): 53–69. http://dx.doi.org/10.2118/87820-pa.

Full text
Abstract:
Summary: It is well documented that history matching is a problem with possibly nonunique solutions. In the past few years, several automated or semiautomated history-matching algorithms have been proposed. Depending on the algorithm used, it is possible that the final estimated reservoir-property distribution that allows for a good history match may not be geologically realistic. Therefore, there is a need to include other constraints to generate multiple, geologically realistic history-matched realizations. These constraints might, for example, include the variogram, a training image, the distribution of net-to-gross, pore volume, or other geostatistical information about the reservoir. This inclusion is particularly useful because it introduces uncertainty information in the reservoir description when we have limited history from existing wells in the field and intend to drill infill wells or implement a secondary-recovery process. The algorithm proposed in this paper uses multiresolution wavelet analysis to integrate history data with the geostatistical information contained in the variogram proposed for the reservoir. Wavelets allow the representation and manipulation of property distributions at various resolutions at the same time. Using wavelets, information from different sources such as production history and seismic surveys (that would be at different resolutions) can be incorporated directly at the appropriate resolution level. In the first step, we fix the wavelet coefficients sensitive to the history-match data. This has the effect of fixing the field history without fixing individual gridblock properties. In the second step, the remaining free wavelet coefficients are modified to integrate variogram information into the reservoir description. Generating multiple realizations of only the second set of wavelet coefficients results in multiple history-matched, variogram-constrained descriptions of the reservoir.
The computational investment is very modest because the history match is done only once. In a number of example cases, different areal Gaussian fields with varying amounts of available production-history data were studied to test the algorithm. It was found that the wavelet coefficients constraining the history can be decoupled from those constraining the variogram. The implication of this observation is that the history data and variogram can be integrated sequentially into the reservoir model—that is, after the initial history match, new information can be added to the model without disturbing the original match to yield multiple history-matched and geostatistically constrained realizations. Introduction Reservoir modeling is essential for forecasting the performance of a reservoir, for reservoir management, for risk analysis, and for making key economic decisions. The purpose of reservoir modeling is to develop a model of the reservoir that closely resembles the actual reservoir based on available information. This model then can be used to forecast future performance and optimize reservoir-management decisions. The more accurate the reservoir model, the better the predictions will be. Therein lies the importance of generating a good reservoir model. History matching is but one step in this direction. Merely achieving a good history match does not ensure sound predictions from the reservoir model; it is therefore essential that all sources of information about the reservoir be used appropriately to come up with a good model. Early automated history-matching procedures were discussed by Jacquard and Jain,1 adapted from variational analysis in electric networking. Since then, there have been several developments of concepts and algorithms along similar lines. 
In general, the objective is to determine the spatial distribution of a set of gridded reservoir properties such as permeabilities and porosities, given the response of the field in terms of fluid flow to an external impulse such as drainage and injection of fluids, as well as geostatistical data. Production history from existing wells is an important source of information about the reservoir, in terms of the average permeabilities, spatial distribution of permeabilities, net-to-gross, etc. Production history could be in the form of the pressure or saturation distribution in the reservoir in response to injection or production impulses. A good reservoir model must therefore, when run through a flow simulator, give the same response to the same impulse as the real reservoir. Many studies have shown favorable results from integrating dynamic data into reservoir modeling using streamline simulators (e.g., Datta-Gupta et al.2). However, not only does history matching alone not ensure a sound production forecast, it also does not guarantee physical consistency and might produce artifacts based on the algorithm used. The results thus obtained might give a perfect history match, but if they are unphysical, use of the model will lead to further error in prediction of future performance because the model may not be close enough in a geological sense to the actual reservoir. This situation arises because there may be a number of different solutions to the history-matching problem. In other words, a number of different permeability distributions may be found, all of which give the same response to a given impulse. As such, we need to integrate geostatistical data that will constrain the problem and make the model more realistic. Landa and Horne3 and Landa4 investigated the impact of different data on reservoir characterization and uncertainty. 
Integration of static and dynamic data into reservoir models has been attempted in the Bayesian framework5-7 and with gradual deformation.8 Multiresolution wavelet analysis forms the basis for efficient representation of the field as well as a reduction in the number of parameters to be estimated. As described in the following section, the gridded reservoir-property distribution can be transformed linearly to give a unique set of wavelet coefficients. It has been found9,10 that a specific subset of these wavelet coefficients is sufficient to determine the response of the reservoir to production. The conjecture is that the remaining set can be modified subject to constraints based on geological, seismic, or other subjective information about the spatial distribution of the permeabilities. This study showed that the sets of wavelets constraining the history match and those constraining geostatistical parameters (variograms in particular) can indeed be decoupled and evaluated separately to yield a set of different permeability distributions stochastically. Most history-matching algorithms involve flow simulation at each iteration while minimizing the objective function. The advantage of our new algorithm is that instead of doing repeated history matches, it fixes a set of wavelet coefficients that constrain the history, thereby fixing the history up to some tolerance. The objective function endeavors to enforce a proposed variogram of spatial distribution of the permeabilities. As such, the algorithm takes orders-of-magnitude less time to yield permeability distributions that are constrained by both the history and the variogram of the field.
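The decoupling idea described above — fix the wavelet coefficients that constrain the history match, then resample the remaining free coefficients — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it uses a single-level 2D Haar transform, treats the coarse approximation band as the "history-sensitive" set, and draws the free detail coefficients independently from a Gaussian prior (the paper instead conditions them on a variogram).

```python
import numpy as np

def haar2d(field):
    """Single-level 2D Haar transform: one coarse band and three detail bands."""
    a = field[0::2, 0::2]; b = field[0::2, 1::2]
    c = field[1::2, 0::2]; d = field[1::2, 1::2]
    cA = (a + b + c + d) / 4.0   # coarse approximation ("history-sensitive" here)
    cH = (a + b - c - d) / 4.0   # horizontal detail
    cV = (a - b + c - d) / 4.0   # vertical detail
    cD = (a - b - c + d) / 4.0   # diagonal detail
    return cA, cH, cV, cD

def ihaar2d(cA, cH, cV, cD):
    """Exact inverse of haar2d."""
    rows, cols = cA.shape
    field = np.empty((2 * rows, 2 * cols))
    field[0::2, 0::2] = cA + cH + cV + cD
    field[0::2, 1::2] = cA + cH - cV - cD
    field[1::2, 0::2] = cA - cH + cV - cD
    field[1::2, 1::2] = cA - cH - cV + cD
    return field

def realizations(history_matched, n_real=5, detail_std=0.1, seed=0):
    """Hold the coarse coefficients fixed (preserving the history match) and
    resample the free detail coefficients from a prior to get new fields."""
    rng = np.random.default_rng(seed)
    cA, _, _, _ = haar2d(history_matched)
    fields = []
    for _ in range(n_real):
        details = [rng.normal(0.0, detail_std, cA.shape) for _ in range(3)]
        fields.append(ihaar2d(cA, *details))
    return fields
```

Every returned field shares the coarse-scale content of the matched model but differs at finer scales, which is why the expensive history match only has to be done once.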
APA, Harvard, Vancouver, ISO, and other styles
29

Siembieda, William. "Toward an Enhanced Concept of Disaster Resilience: A Commentary on Behalf of the Editorial Committee." Journal of Disaster Research 5, no. 5 (October 1, 2010): 487–93. http://dx.doi.org/10.20965/jdr.2010.p0487.

Full text
Abstract:
1. Introduction This Special Issue (Part 2) expands upon the theme “Building Local Capacity for Long-term Disaster Resilience” presented in Special Issue Part 1 (JDR Volume 5, Number 2, April 2010) by examining the evolving concept of disaster resilience and providing additional reflections upon various aspects of its meaning. Part 1 provided a mixed set of examples of resiliency efforts, ranging from administrative challenges of integrating resilience into recovery to the analysis of hazard mitigation plans directed toward guiding local capability for developing resiliency. Resilience was broadly defined in the opening editorial of Special Issue Part 1 as “the capacity of a community to: 1) survive a major disaster, 2) retain essential structure and functions, and 3) adapt to post-disaster opportunities for transforming community structure and functions to meet new challenges.” In this editorial essay we first explore in Section 2 the history of resilience and then locate it within current academic and policy debates. Section 3 presents summaries of the papers in this issue. 2. Why is Resilience a Contemporary Theme? There is growing scholarly and policy interest in disaster resilience. In recent years, engineers [1], sociologists [2], geographers [3], economists [4], public policy analysts [5, 6], urban planners [7], hazards researchers [8], governments [9], and international organizations [10] have all contributed to the literature about this concept. Some authors view resilience as a mechanism for mitigating disaster impacts, with framework objectives such as resistance, absorption, and restoration [5]. Others, who focus on resiliency indicators, see it as an early warning system to assess community resiliency status [3, 8]. Recently, it has emerged as a component of social risk management that seeks to minimize social welfare loss from catastrophic disasters [6]. 
Manyena [11] traces scholarly exploration of resilience as an operational concept back at least five decades. Interest in resilience began in the 1940s with studies of children and trauma in the family and in the 1970s in the ecology literature as a useful framework to examine and measure the impact of assault or trauma on a defined eco-system component [12]. This led to modeling resilience measures for a variety of components within a defined ecosystem, leading to the realization that the systems approach to resiliency is attractive as a cross-disciplinary construct. The ecosystem analogy, however, has limits when applied to disaster studies in that, historically, all catastrophic events have changed the place in which they occurred and a “return to normalcy” does not occur. This is true for modern urban societies as well as traditional agrarian societies. The adoption of “The Hyogo Framework for Action 2005-2015” (also known as The Hyogo Declaration) provides a global linkage and follows the United Nations 1990s International Decade for Natural Disaster Reduction effort. The 2005 Hyogo Declaration’s definition of resilience is: “The capacity of a system, community or society potentially exposed to hazards to adapt by resisting or changing in order to reach and maintain an acceptable level of functioning and structure.” The proposed measurement of resilience in the Hyogo Declaration is determined by “the degree to which the social system is capable of organizing itself to increase this capacity for learning from past disasters for better future protection and to improve risk reduction measures.” While very broad, this definition contains two key concepts: 1) adaptation, and 2) maintaining acceptable levels of functioning and structure. While adaptation requires certain capacities, maintaining acceptable levels of functioning and structure requires resources, forethought, and normative action. 
Some of these attributes are now reflected in the 2010 National Disaster Recovery Framework published by the U.S. Federal Emergency Management Agency (FEMA) [13]. With the emergence of this new thinking on resilience related to disasters, it is now a good time to reflect on the concept and assess what has recently been said in the literature. Bruneau et al. [1] offer an engineering sciences definition for community seismic resilience: “The ability of social units (e.g., organizations, communities) to mitigate hazards, contain the effects of disasters when they occur, and carry out recovery activities in ways that minimize social disruption and mitigate the effects of future earthquakes.” Rose [4] writes that resiliency is the ability of a system to recover from a severe shock. He distinguishes two types of resilience: (1) inherent – ability under normal circumstances and (2) adaptive – ability in crisis situations due to ingenuity or extra effort. By opening up resilience to categorization he provides a pathway to establish multi-disciplinary approaches, something that is presently lacking in practice. Rose is most concerned with business disruption which can take extensive periods of time to correct. In order to make resource decisions that lower overall societal costs (economic, social, governmental and physical), Rose calls for the establishment of measurements that function as resource decision allocation guides. This has been done in part through risk transfer tools such as private insurance. However, it has not been well-adopted by governments in deciding how to allocate mitigation resources. We need to ask why the interest in resilience has grown? Manyena [11] argues that the concept of resilience has gained currency without obtaining clarity of understanding, definition, substance, philosophical dimensions, or applicability to disaster management and sustainable development theory and practice. 
It is evident that the “emergency management model” does not itself provide sufficient guidance for policymakers since it is too command-and-control-oriented and does not adequately address mitigation and recovery. Also, large disasters are increasingly viewed as major disruptions of the economic and social conditions of a country, state/province, or city. Lowering post-disaster costs (human life, property loss, economic advancement and government disruption) is being taken more seriously by government and civil society. The lessening of costs is not something the traditional “preparedness” stage of emergency management has concerned itself with; this is an existing void in meeting the expanding interests of government and civil society. The concept of resilience helps further clarify the relationship between risk and vulnerability. If risk is defined as “the probability of an event or condition occurring” [14], then it can be reduced through physical, social, governmental, or economic means, thereby reducing the likelihood of damage and loss. Nothing can be done to stop an earthquake, volcanic eruption, cyclone, hurricane, or other natural event, but the probability of damage and loss from natural and technological hazards can be addressed through structural and non-structural strategies. Vulnerability is the absence of capacity to resist or absorb a disaster impact. Changes in vulnerability can then be achieved by changes in these capacities. In this regard, Franco and Siembieda describe in this issue how coastal cities in Chile had low resilience and high vulnerability to the tsunami generated by the February 2010 earthquake, whereas modern buildings had high resilience and, therefore, were much less vulnerable to the powerful earthquake. We also see how the framework for policy development can change through differing perspectives. 
Eisner discusses in this issue how local non-governmental social service agencies are building their resilience capabilities to serve target populations after a disaster occurs, becoming self-renewing social organizations and demonstrating what Leonard and Howett [6] term “social resilience.” All of the contributions to this issue illustrate the lowering of disaster impacts and strengthening of capacity (at the household, community or governmental level) for what Alesch [15] terms “post-event viability” – a term reflecting how well a person, business, community, or government functions after a disaster in addition to what they might do prior to a disaster to lessen its impact. Viability might become the definition of recovery if it can be measured or agreed upon. 3. Contents of This Issue The insights provided by the papers in this issue contribute greater clarity to an understanding of resilience, together with its applicability to disaster management. In these papers we find tools and methods, process strategies, and planning approaches. There are five papers focused on local experiences, three on state (prefecture) experiences, and two on national experiences. The papers in this issue reinforce the concept of resilience as a process, not a product, because it is the sum of many actions. The resiliency outcome is the result of multiple inputs from the level of the individual and, at times, continuing up to the national or international organizational level. Through this exploration we see that the “resiliency” concept accepts that people will come into conflict with natural or anthropogenic hazards. The policy question then becomes how to lower the impact(s) of the conflict through “hard or soft” measures (see the Special Issue Part 1 editorial for a discussion of “hard” vs. “soft” resilience). 
Local level Go Urakawa and Haruo Hayashi illustrate how post-disaster operations for public utilities can be problematic because many practitioners have no direct experience in such operations, noting that the formats and methods normally used in recovery depend on personal skills and effort. They describe how these problems are addressed by creating manuals on measures for effectively implementing post-disaster operations. They develop a method to extract priority operations using business impact analysis (BIA) and project management based business flow diagrams (BFD). Their article effectively illustrates the practical aspects of strengthening the resiliency of public organizations. Richard Eisner presents the framework used to initiate the development and implementation of a process to create disaster resilience in faith-based and community-based organizations that provide services to vulnerable populations in San Francisco, California. A major project outcome is the Disaster Resilience Standard for Community- and Faith-Based Service Providers. This “standard” has general applicability for use by social service agencies in the public and non-profit sectors. Alejandro Linayo addresses the growing issue of technological risk in cities. He argues for the need to understand an inherent conflict between how we occupy urban space and the technological risks created by hazardous chemicals, radiation, oil and gas, and other hazardous materials storage and movement. The paper points out that information and procedural gaps exist in terms of citizen knowledge (the right to know) and local administrative knowledge (missing expertise). Advances and experience accumulated by the Venezuela Disaster Risk Management Research Center in identifying and integrating technological risk treatment for the city of Merida, Venezuela, are highlighted as a way to move forward.
L. Teresa Guevara-Perez presents the case that certain urban zoning requirements in contemporary cities encourage and, in some cases, enforce the use of building configurations that have been long recognized by earthquake engineering as seismically vulnerable. Using Western Europe and the Modernist architectural movement, she develops the historical case for understanding discrepancies between urban zoning regulations and seismic codes that have led to vulnerable modern building configurations, and traces the international dissemination of architectural and urban planning concepts that have generated vulnerability in contemporary cities around the world. Jung Eun Kang, Walter Gillis Peacock, and Rahmawati Husein discuss an assessment protocol for Hazard Mitigation Plans applied to 12 coastal hazard zone plans in the state of Texas in the U.S. The components of these plans are systematically examined in order to highlight their respective strengths and weaknesses. The authors describe an assessment tool, the plan quality score (PQS), composed of seven primary components (vision statement, planning process, fact basis, goals and objectives, inter-organizational coordination, policies & actions, and implementation), as well as a component quality score (CQS). State (Prefecture) level Charles Real presents the Natural Hazard Zonation Policies for Land Use Planning and Development in California in the U.S. California has established state-level policies that utilize knowledge of where natural hazards are more likely to occur to enhance the effectiveness of land use planning as a tool for risk mitigation. Experience in California demonstrates that a combination of education, outreach, and mutually supporting policies that are linked to state-designated natural hazard zones can form an effective framework for enhancing the role of land use planning in reducing future losses from natural disasters. 
Norio Maki, Keiko Tamura, and Haruo Hayashi present a method for local government stakeholders involved in pre-disaster plan making to describe performance measures through the formulation of desired outcomes. Through a case study approach, Nara and Kyoto Prefectures’ separate experiences demonstrate how to conduct Strategic Earthquake Disaster Reduction Plans and Action Plans that have deep stakeholder buy-in and outcome measurability. Nara’s plan was prepared from 2,015 stakeholder ideas and Kyoto’s plan was prepared from 1,613 stakeholder ideas. Having a quantitative target for individual objectives ensures the measurability of plan progress. Both jurisdictions have undertaken evaluations of plan outcomes. Sandy Meyer, Eugene Henry, Roy E. Wright and Cynthia A. Palmer present the State of Florida in the U.S. and its experience with pre-disaster planning for post-disaster redevelopment. Drawing upon the lessons learned from the impacts of the 2004 and 2005 hurricane seasons, local governments and state leaders in Florida sought to find a way to encourage behavior that would create greater community resiliency in 2006. The paper presents initial efforts to develop a post-disaster redevelopment plan (PDRP), including the experience of a pilot county. National level Bo-Yao Lee provides a national perspective: New Zealand’s approach to emergency management, where all hazard risks are addressed through devolved accountability. This contemporary approach advocates collaboration and coordination, aiming to address all hazard risks through the “4Rs” – reduction, readiness, response, and recovery. Lee presents the impact of the Resource Management Act (1991), the Civil Defence Emergency Management Act (2002), and the Building Act (2004) that comprise the key legislation influencing and promoting integrated management for environment and hazard risk management. 
Guillermo Franco and William Siembieda provide a field assessment of the February 27, 2010, M8.8 earthquake and tsunami event in Chile. The paper presents an initial damage and life-loss review and assessment of seismic building resiliency and the country’s rapid updating of building codes that have undergone continuous improvement over the past 60 years. The country’s land use planning system and its emergency management system are also described. The role of insurance coverage reveals problems in seismic coverage for homeowners. The unique role of the Catholic Church in providing temporary shelter and the central government’s five-point housing recovery plan are presented. A weakness in the government’s emergency management system’s early tsunami response system is noted. Acknowledgements The Editorial Committee extends its sincere appreciation to both the contributors and the JDR staff for their patience and determination in making Part 2 of this special issue possible. Thanks also to the reviewers for their insightful analytic comments and suggestions. Finally, the Committee wishes to again thank Bayete Henderson for his keen and thorough editorial assistance and copy editing support.
APA, Harvard, Vancouver, ISO, and other styles
30

Giordano, Chris, Meghan Brennan, Basma Mohamed, Parisa Rashidi, François Modave, and Patrick Tighe. "Accessing Artificial Intelligence for Clinical Decision-Making." Frontiers in Digital Health 3 (June 25, 2021). http://dx.doi.org/10.3389/fdgth.2021.645232.

Full text
Abstract:
Advancements in computing and data from the near-universal acceptance and implementation of electronic health records have been formative for the growth of personalized, automated, and immediate patient care models that were not previously possible. Artificial intelligence (AI) and its subfields of machine learning, reinforcement learning, and deep learning are well-suited to deal with such data. The authors in this paper review current applications of AI in clinical medicine and discuss the most likely future contributions that AI will provide to the healthcare industry. For instance, in response to the need to risk stratify patients, appropriately cultivated and curated data can assist decision-makers in stratifying preoperative patients into risk categories, as well as categorizing the severity of ailments and health for non-operative patients admitted to hospitals. The overt, traditional vital signs and laboratory values currently used to signal alarms for an acutely decompensating patient may be replaced by continuously monitoring and updating AI tools that can pick up early, imperceptible patterns predicting subtle health deterioration. Furthermore, AI may help overcome challenges with multiple outcome optimization limitations or sequential decision-making protocols that limit individualized patient care. Despite these tremendously helpful advancements, the data sets that AI models train on and develop have the potential for misapplication and thereby create concerns for application bias. Subsequently, the mechanisms governing this disruptive innovation must be understood by clinical decision-makers to prevent unnecessary harm. This need will force physicians to change their educational infrastructure to facilitate understanding AI platforms, modeling, and limitations to best acclimate practice in the age of AI. 
By performing a thorough narrative review, this paper examines these specific AI applications, limitations, and requisites while reviewing a few examples of major data sets that are being cultivated and curated in the US.
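Among the AI subfields the review names, reinforcement learning is the one aimed at the sequential decision-making protocols mentioned above. A minimal sketch of the underlying machinery is value iteration on a tiny Markov decision process; the three patient states, two actions, and all transition probabilities and rewards below are invented for illustration and carry no clinical meaning.

```python
import numpy as np

# Hypothetical 3-state patient MDP: 0 = stable, 1 = deteriorating, 2 = critical.
# Actions: 0 = monitor, 1 = intervene. All probabilities and rewards are invented.
P = np.array([
    [[0.90, 0.08, 0.02],      # monitor: transition probabilities from each state
     [0.10, 0.70, 0.20],
     [0.00, 0.10, 0.90]],
    [[0.95, 0.04, 0.01],      # intervene: costlier, but stabilizing
     [0.50, 0.40, 0.10],
     [0.10, 0.40, 0.50]],
])
R = np.array([[0.0, -0.5],    # reward by (state, action); intervening adds a cost
              [-1.0, -1.5],
              [-10.0, -10.5]])

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Solve the MDP for a state-value function and a greedy action policy."""
    n_states = P.shape[1]
    V = np.zeros(n_states)
    while True:
        # Q[s, a] = R[s, a] + gamma * sum_p P[a, s, p] * V[p]
        Q = R + gamma * np.einsum('asp,p->sa', P, V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

V, policy = value_iteration(P, R)
```

The resulting policy simply names, per discretized state, the action with the highest long-run value; real clinical RL operates over far richer state spaces and must confront exactly the data-bias and misapplication concerns the authors raise.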
APA, Harvard, Vancouver, ISO, and other styles
31

G., Rejikumar, and Aswathy Asokan-Ajitha. "Role of impulsiveness in online purchase completion intentions: an empirical study among Indian customers." Journal of Indian Business Research ahead-of-print, ahead-of-print (November 2, 2020). http://dx.doi.org/10.1108/jibr-04-2018-0132.

Full text
Abstract:
Purpose Online cart abandonment is a severe issue posing challenges to e-commerce growth. Emerging economies such as India fascinate global marketing practitioners because of favorable demographics and high levels of internet penetration. This study aims to consider the role of certain exogenous factors in developing shopping motivations that sequentially mediate online purchase completion through impulsiveness under risk perceptions. The primary motivation behind this study is to understand the mental mechanism among online customers that develops purchase completion intentions, which prevent cart abandonment significantly. Design/methodology/approach The impact of exogenous factors related to e-commerce such as website attributes, product features, promotional excellence and decision-making easiness on shopping motivations, impulsiveness and purchase completion intentions under the moderating effect of risk was estimated from the perceptions of Indian online customers (n = 243) using variance-based structural equation modeling and the SPSS PROCESS macro v.3.0. Findings The most important exogenous variables that can influence purchase completion directly and sequentially through shopping motivations are decision easiness and promotions. Even though utility motivations are dominant in purchase completion intentions, hedonistic aspects are more critical in developing impulsiveness. The translation of impulsiveness to purchase completion is happening, but risk perception significantly moderates impulsiveness formation. Research limitations/implications Theoretically, this study examined online purchase completion as the most sought response by a customer to various stimuli in e-commerce. The study adopted a moderated mediation analysis in which shopping motivations and impulsiveness were mediators and risk was a moderator. The interaction effect of risk on purchase completions was significant even when the mediating effects were prominent. 
Practical implications The study contributes to the current knowledge of online buying behavior in virtual retail formats and helps marketers in streamlining their focus on using impulsiveness as a strategic tool for reducing cart abandonment. Originality/value This study helps in understanding emerging trends in online buying behavior in India.
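The moderated mediation design described in this abstract (a mediator path moderated by risk, as in the PROCESS macro) can be sketched with plain OLS regressions. The data below are simulated stand-ins with invented effect sizes, not the study's n = 243 sample or its estimates; variable names (X, W, M, Y) are illustrative labels only.

```python
import numpy as np

# Simulated stand-ins (NOT the study's data): X = decision easiness,
# W = perceived risk (moderator), M = impulsiveness (mediator),
# Y = purchase completion intention. Effect sizes are invented.
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=n)
W = rng.normal(size=n)
M = 0.6 * X - 0.4 * X * W + rng.normal(scale=0.5, size=n)  # risk dampens X -> M
Y = 0.5 * M + 0.3 * X + rng.normal(scale=0.5, size=n)

def ols(y, *cols):
    """Least-squares fit with an intercept; returns the coefficient vector."""
    A = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(A, y, rcond=None)[0]

# Mediator model: M ~ X + W + X*W (the a-path is moderated by W)
_, a1, a2, a3 = ols(M, X, W, X * W)
# Outcome model: Y ~ M + X (b-path plus the direct effect)
_, b_path, direct = ols(Y, M, X)

# Conditional indirect effect of X on Y through M at low/average/high risk
for w in (-1.0, 0.0, 1.0):
    print(f"risk={w:+.1f}: indirect effect = {(a1 + a3 * w) * b_path:.2f}")
```

The conditional indirect effect (a1 + a3·w)·b shrinking as w rises is the pattern the study reports: risk perception significantly moderates how impulsiveness forms and translates into purchase completion.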
APA, Harvard, Vancouver, ISO, and other styles
32

Hemmings, Brioch, Matthew J. Knowling, and Catherine R. Moore. "Early Uncertainty Quantification for an Improved Decision Support Modeling Workflow: A Streamflow Reliability and Water Quality Example." Frontiers in Earth Science 8 (November 27, 2020). http://dx.doi.org/10.3389/feart.2020.565613.

Full text
Abstract:
Effective decision making for resource management is often supported by combining predictive models with uncertainty analyses. This combination allows quantitative assessment of management strategy effectiveness and risk. Typically, history matching is undertaken to increase the reliability of model forecasts. However, the question of whether the potential benefit of history matching will be realized, or outweigh its cost, is seldom asked. History matching adds complexity to the modeling effort, as information from historical system observations must be appropriately blended with the prior characterization of the system. Consequently, the cost of history matching is often significant. When it is not implemented appropriately, history matching can corrupt model forecasts. Additionally, the available data may offer little decision-relevant information, particularly where data and forecasts are of different types, or represent very different stress regimes. In this paper, we present a decision support modeling workflow where early quantification of model uncertainty guides ongoing model design and deployment decisions. This includes providing justification for undertaking (or forgoing) history matching, so that unnecessary modeling costs can be avoided and model performance can be improved. The workflow is demonstrated using a regional-scale modeling case study in the Wairarapa Valley (New Zealand), where assessments of stream depletion and nitrate-nitrogen contamination risks are used to support water-use and land-use management decisions. The probability of management success/failure is assessed by comparing the proximity of model forecast probability distributions to ecologically motivated decision thresholds. 
This study highlights several important insights that can be gained by undertaking early uncertainty quantification, including: i) validation of the prior numerical characterization of the system, in terms of its consistency with historical observations; ii) validation of model design or indication of areas of model shortcomings; iii) evaluation of the relative proximity of management decision thresholds to forecast probability distributions, providing a justifiable basis for stopping modeling.
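The core of the early uncertainty quantification step described above is propagating the prior through a forward model and comparing the forecast distribution with a decision threshold, before any history matching. The sketch below uses an invented prior, a toy one-parameter depletion model, and an arbitrary threshold; none of these numbers come from the Wairarapa Valley study.

```python
import numpy as np

# Prior predictive forecast vs. an ecological decision threshold.
# The prior, forward model, and threshold below are invented for illustration,
# not taken from the case study.
rng = np.random.default_rng(42)
n_draws = 10_000

log_T = rng.normal(loc=2.0, scale=0.5, size=n_draws)     # prior parameter samples
depletion_frac = 1.0 / (1.0 + np.exp(-(log_T - 2.0)))    # toy forward model
pumping = 100.0                                          # assumed abstraction, L/s
forecast = depletion_frac * pumping                      # stream-depletion forecast

threshold = 60.0                                         # decision threshold, L/s
p_exceed = float(np.mean(forecast > threshold))
print(f"P(depletion > {threshold:.0f} L/s) = {p_exceed:.2f}")
```

If the prior predictive distribution already sits decisively on one side of the threshold, history matching may add little decision-relevant information; making that comparison cheaply and early is the justification logic the proposed workflow formalizes.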
APA, Harvard, Vancouver, ISO, and other styles
33

Morais, Maria C., Berta Gonçalves, and João A. Cabral. "A Dynamic Modeling Framework to Evaluate the Efficacy of Control Actions for a Woody Invasive Plant, Hakea sericea." Frontiers in Ecology and Evolution 9 (April 13, 2021). http://dx.doi.org/10.3389/fevo.2021.641686.

Full text
Abstract:
Invasive alien species (IAS) are a significant component of global changes, causing severe economic and biodiversity damage. In this regard, Hakea sericea is one of the most widespread IAS throughout the Mediterranean region, including Portugal. The difficulty surrounding its management is exacerbated by post-fire situations, signifying a challenging task for managers. To assist in this effort, we used a system dynamics approach to model the population dynamics of Hakea sericea under combinations of wildfire risk and control scenarios, which differ in periodicity, type of interventions, and cohort age. The ultimate goal of this study was to assess the effectiveness and costs of control efforts at reducing the abundance of this IAS. A Natura 2000 site, Alvão/Marão (code PTCON0003), in northern Portugal, severely invaded by Hakea sericea, served as the study site. The modeling results demonstrate that Hakea sericea is likely to continue spreading if left uncontrolled. Although it may not be possible to ensure eradication of Hakea sericea from the study site, repeated control actions aimed at the entire IAS population could be very effective in reducing its area. From a practical standpoint, removing all plants 24 months after each fire event followed by subsequent monitoring appears to be the most cost-effective strategy for managing Hakea sericea. Considering the modeling results, the dynamic modeling framework developed is a versatile, instructive tool that can support decision-making aimed at effective management of Hakea sericea.
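The scenario logic in this abstract — growth, fire-triggered recruitment pulses, and removal a fixed lag after each fire — can be caricatured in a few lines. All rates below (8% annual spread, a 5× post-fire pulse, 95% removal efficiency, a 10-year fire interval) are invented for illustration and are not calibrated to Hakea sericea or the study site.

```python
import numpy as np

def simulate(years=30, fire_interval=10, control_lag=2, removal_eff=0.95):
    """Toy annual stand-density model for a fire-adapted invader.
    All rates are invented for illustration, not calibrated to the study."""
    density = 1.0                            # relative stand density
    history = []
    for t in range(1, years + 1):
        density *= 1.08                      # assumed 8% annual spread
        if t % fire_interval == 0:
            density *= 5.0                   # post-fire recruitment pulse
        if t > fire_interval and (t - control_lag) % fire_interval == 0:
            density *= 1.0 - removal_eff     # removal N years after each fire
        history.append(density)
    return np.array(history)

uncontrolled = simulate(removal_eff=0.0)
controlled = simulate(removal_eff=0.95)      # e.g., remove ~95% of plants
```

Even this caricature reproduces the paper's qualitative finding: uncontrolled density grows without bound, while repeated whole-population removal shortly after each fire keeps it low, which is why the post-fire timing of control is the decision variable worth optimizing.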
APA, Harvard, Vancouver, ISO, and other styles