Academic literature on the topic 'PREDICTION MODELS APPLICATIONS'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'PREDICTION MODELS APPLICATIONS.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "PREDICTION MODELS APPLICATIONS"

1

Chung, Chang-Jo. "Spatial Prediction Models and Applications." Geoinformatics 12, no. 2 (2001): 58–59. http://dx.doi.org/10.6010/geoinformatics.12.58.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dammann, Maximilian Peter, Wolfgang Steger, and Kristin Paetzold-Byhain. "Optimised Models for AR/VR by Using Geometric Complexity Metrics to Control Tessellation." Proceedings of the Design Society 3 (June 19, 2023): 2855–64. http://dx.doi.org/10.1017/pds.2023.286.

Full text
Abstract:
AR/VR applications are a valuable tool in product design and the product lifecycle. But the integration of AR/VR is not seamless, as CAD models need to be prepared for AR/VR applications. One necessary data transformation is the tessellation of the analytically described geometry. To ensure the usability, visual quality, and evaluability of the AR/VR application, time-consuming optimisation is needed depending on the product complexity and the performance of the target device. Widespread approaches to this problem are based on iterative mesh decimation. This approach ignores the varying importance of geometries and the required visual quality in engineering applications. Our predictive approach is an alternative that enables optimisation without iterative process steps on the tessellated geometry. The contribution presents an approach that uses surface-based prediction and enables predictions of the perceived visual quality of the geometries. This includes the investigation of different geometric complexity metrics gathered from the literature as a basis for prediction models. The approach is implemented in a geometry preparation tool and the results are compared with other approaches.
3

Lei, Xiangdong, Changhui Peng, Haiyan Wang, and Xiaolu Zhou. "Individual height–diameter models for young black spruce (Picea mariana) and jack pine (Pinus banksiana) plantations in New Brunswick, Canada." Forestry Chronicle 85, no. 1 (January 1, 2009): 43–56. http://dx.doi.org/10.5558/tfc85043-1.

Full text
Abstract:
Historically, height–diameter models have mainly been developed for mature trees; consequently, few height–diameter models have been calibrated for young forest stands. In order to develop equations predicting the height of trees with small diameters, 46 individual height–diameter models were fitted and tested in young black spruce (Picea mariana) and jack pine (Pinus banksiana) plantations between the ages of 4 and 8 years, measured from 182 plots in New Brunswick, Canada. The models were divided into 2 groups: a diameter group and a second group applying both diameter and additional stand- or tree-level variables (composite models). There was little difference in predicting tree height among the former models (Group I), while the latter models (Group II) generally provided better predictions. Based on goodness of fit (R2 and MSE), prediction ability (the bias and its associated prediction and tolerance intervals in absolute and relative terms), and ease of application, 2 Group II models were recommended for predicting individual tree heights within young black spruce and jack pine forest stands. Mean stand height was required for application of these models. The resultant tolerance intervals indicated that most errors (95%) associated with height predictions would be within the following limits (at a 95% confidence level): [-0.54 m, 0.54 m] or [-14.7%, 15.9%] for black spruce and [-0.77 m, 0.77 m] or [-17.1%, 18.6%] for jack pine. The recommended models are statistically reliable for growth and yield applications, regeneration assessment and management planning. Key words: composite model, linear model, model calibration, model validation, prediction interval, tolerance interval
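Abstracts like this one describe fitting nonlinear height-diameter equations to tree data. As a generic illustration only (the Chapman-Richards form and all parameter values below are assumptions for the sketch, not one of the paper's 46 candidate models), such a curve can be fitted with `scipy.optimize.curve_fit` on synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Chapman-Richards height-diameter form (illustrative, not from the paper):
# H = 1.3 + a * (1 - exp(-b * D))**c, where 1.3 m is breast height.
def chapman_richards(d, a, b, c):
    return 1.3 + a * (1.0 - np.exp(-b * d)) ** c

rng = np.random.default_rng(0)
d = rng.uniform(0.5, 8.0, 200)                   # small diameters (cm), young stand
h_true = chapman_richards(d, 4.0, 0.35, 1.2)     # synthetic "true" heights (m)
h = h_true + rng.normal(0.0, 0.25, d.size)       # add measurement noise

params, _ = curve_fit(chapman_richards, d, h, p0=[3.0, 0.3, 1.0])
resid = h - chapman_richards(d, *params)
mse = float(np.mean(resid ** 2))
print(params, mse)
```

A composite (Group II style) variant would simply add a stand-level predictor such as mean stand height as an extra argument to the model function.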
4

Pintelas, Emmanuel, Meletis Liaskos, Ioannis E. Livieris, Sotiris Kotsiantis, and Panagiotis Pintelas. "Explainable Machine Learning Framework for Image Classification Problems: Case Study on Glioma Cancer Prediction." Journal of Imaging 6, no. 6 (May 28, 2020): 37. http://dx.doi.org/10.3390/jimaging6060037.

Full text
Abstract:
Image classification is a very popular machine learning domain, in which deep convolutional neural networks have mainly emerged for such applications. These networks manage to achieve remarkable performance in terms of prediction accuracy, but they are considered black box models since they lack the ability to interpret their inner working mechanism and explain the main reasoning behind their predictions. There is a variety of real-world tasks, such as medical applications, in which interpretability and explainability play a significant role. When making decisions on critical issues such as cancer prediction, utilizing black box models to achieve high prediction accuracy without providing any sort of explanation for their predictions cannot be considered sufficient or ethically acceptable. Reasoning and explanation are essential in order to trust these models and support such critical predictions. Nevertheless, the definition and the validation of the quality of a prediction model's explanation are in general extremely subjective and unclear. In this work, an accurate and interpretable machine learning framework for image classification problems is proposed that is able to produce high-quality explanations. For this task, a feature extraction and explanation extraction framework is developed, and three basic general conditions are proposed that validate the quality of any model's prediction explanation for any application domain. The feature extraction framework extracts and creates transparent and meaningful high-level features for images, while the explanation extraction framework is responsible for creating good explanations relying on these extracted features and the prediction model's inner function with respect to the proposed conditions. As a case study application, brain tumor magnetic resonance images were utilized for predicting glioma cancer. Our results demonstrate the efficiency of the proposed model, since it managed to achieve sufficient prediction accuracy while also being interpretable and explainable in simple human terms.
5

Moskolaï, Waytehad Rose, Wahabou Abdou, Albert Dipanda, and Kolyang. "Application of Deep Learning Architectures for Satellite Image Time Series Prediction: A Review." Remote Sensing 13, no. 23 (November 27, 2021): 4822. http://dx.doi.org/10.3390/rs13234822.

Full text
Abstract:
Satellite image time series (SITS) is a sequence of satellite images that record a given area at several consecutive times. The aim of such sequences is to use not only spatial information but also the temporal dimension of the data, which is used for multiple real-world applications, such as classification, segmentation, anomaly detection, and prediction. Several traditional machine learning algorithms have been developed and successfully applied to time series for predictions. However, these methods have limitations in some situations; thus, deep learning (DL) techniques have been introduced to achieve the best performance. Reviews of machine learning and DL methods for time series prediction problems have been conducted in previous studies. However, to the best of our knowledge, none of these surveys has addressed the specific case of works using DL techniques and satellite images as datasets for predictions. Therefore, this paper concentrates on DL applications for SITS prediction, giving an overview of the main elements used to design and evaluate the predictive models, namely the architectures, data, optimization functions, and evaluation metrics. The reviewed DL-based models are divided into three categories, namely recurrent neural network-based models, hybrid models, and feed-forward-based models (convolutional neural networks and multi-layer perceptrons). The main characteristics of satellite images and the major existing applications in the field of SITS prediction are also presented in this article. These applications include weather forecasting, precipitation nowcasting, spatio-temporal analysis, and missing data reconstruction. Finally, current limitations and proposed workable solutions related to the use of DL for SITS prediction are also highlighted.
6

Kim, Donghyun, Heechan Han, Wonjoon Wang, Yujin Kang, Hoyong Lee, and Hung Soo Kim. "Application of Deep Learning Models and Network Method for Comprehensive Air-Quality Index Prediction." Applied Sciences 12, no. 13 (July 1, 2022): 6699. http://dx.doi.org/10.3390/app12136699.

Full text
Abstract:
Accurate pollutant prediction is essential in fields such as meteorology, meteorological disasters, and climate change studies. In this study, long short-term memory (LSTM) and deep neural network (DNN) models were applied to six pollutants and comprehensive air-quality index (CAI) predictions from 2015 to 2020 in Korea. In addition, we used the network method to find the best data sources that provide factors affecting comprehensive air-quality index behaviors. This study had two steps: (1) predicting the six pollutants, including fine dust (PM10), fine particulate matter (PM2.5), ozone (O3), sulfur dioxide (SO2), nitrogen dioxide (NO2), and carbon monoxide (CO), using the LSTM model; (2) forecasting the CAI using the six predicted pollutants from the first step as predictors for the DNNs. The predictive ability of each model for the six pollutants and CAI prediction was evaluated by comparing it with the observed air-quality data. This study showed that combining a DNN model with the network method provided high predictive power, and this combination could be a remarkable strength in CAI prediction. As the need for disaster management increases, it is anticipated that the LSTM and DNN models with the network method have ample potential to track the dynamics of air pollution behaviors.
7

Colditz, Graham A., and Esther K. Wei. "Risk Prediction Models: Applications in Cancer Prevention." Current Epidemiology Reports 2, no. 4 (September 30, 2015): 245–50. http://dx.doi.org/10.1007/s40471-015-0057-1.

Full text
8

He, Jianqin, Yong Hu, Xiangzhou Zhang, Lijuan Wu, Lemuel R. Waitman, and Mei Liu. "Multi-perspective predictive modeling for acute kidney injury in general hospital populations using electronic medical records." JAMIA Open 2, no. 1 (November 15, 2018): 115–22. http://dx.doi.org/10.1093/jamiaopen/ooy043.

Full text
Abstract:
Objectives: Acute kidney injury (AKI) in hospitalized patients puts them at much higher risk of developing future health problems such as chronic kidney disease, stroke, and heart disease. Accurate AKI prediction would allow timely prevention and intervention. However, current AKI prediction research pays less attention to model-building strategies that meet complex clinical application scenarios. This study aims to build and evaluate AKI prediction models from multiple perspectives that reflect different clinical applications. Materials and Methods: A retrospective cohort of 76 957 encounters and relevant clinical variables were extracted from a tertiary care, academic hospital electronic medical record (EMR) system between November 2007 and December 2016. Five machine learning methods were used to build prediction models. Prediction tasks from 4 clinical perspectives with different modeling and evaluation strategies were designed to build and evaluate the models. Results: Experimental analysis of the AKI prediction models built from 4 different clinical perspectives suggests realistic prediction performance, with cross-validated area under the curve ranging from 0.720 to 0.764. Discussion: Results show that models built at admission are effective for predicting AKI events in the next day; models built using data with a fixed lead time to AKI onset are still effective in the dynamic clinical application scenario in which each patient's lead time to AKI onset is different. Conclusion: To our best knowledge, this is the first systematic study to explore multiple clinical perspectives in building predictive models for AKI in the general inpatient population to reflect real performance in clinical application.
9

Hong, Feng, Lu Tian, and Viswanath Devanarayan. "Improving the Robustness of Variable Selection and Predictive Performance of Regularized Generalized Linear Models and Cox Proportional Hazard Models." Mathematics 11, no. 3 (January 20, 2023): 557. http://dx.doi.org/10.3390/math11030557.

Full text
Abstract:
High-dimensional data applications often entail the use of various statistical and machine-learning algorithms to identify an optimal signature based on biomarkers and other patient characteristics that predicts the desired clinical outcome in biomedical research. Both the composition and predictive performance of such biomarker signatures are critical in various biomedical research applications. In the presence of a large number of features, however, a conventional regression analysis approach fails to yield a good prediction model. A widely used remedy is to introduce regularization in fitting the relevant regression model. In particular, an L1 penalty on the regression coefficients is extremely useful, and very efficient numerical algorithms have been developed for fitting such models with different types of responses. This L1-based regularization tends to generate a parsimonious prediction model with promising prediction performance, i.e., feature selection is achieved along with construction of the prediction model. The variable selection, and hence the composition of the signature, as well as the prediction performance of the model, depend on the choice of the penalty parameter used in the L1 regularization. The penalty parameter is often chosen by K-fold cross-validation. However, such an algorithm tends to be unstable and may yield very different choices of the penalty parameter across multiple runs on the same dataset. In addition, the predictive performance estimates from the internal cross-validation procedure in this algorithm tend to be inflated. In this paper, we propose a Monte Carlo approach to improve the robustness of regularization parameter selection, along with an additional cross-validation wrapper for objectively evaluating the predictive performance of the final model. We demonstrate the improvements via simulations and illustrate the application via a real dataset.
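The instability described in this abstract is easy to reproduce. The sketch below (an illustration of the underlying issue, not the authors' Monte Carlo procedure) reruns K-fold cross-validation for an L1-penalized model with different fold assignments on the same synthetic dataset and records the selected penalty parameter each time; averaging over repeats is one simple stabilizer:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold

# Synthetic high-dimensional data: 80 samples, 200 features, 5 informative.
X, y = make_regression(n_samples=80, n_features=200, n_informative=5,
                       noise=5.0, random_state=0)

chosen = []
for seed in range(10):
    # Different shuffles of the same data into 5 folds can pick
    # different penalty parameters (alpha is sklearn's name for lambda).
    cv = KFold(n_splits=5, shuffle=True, random_state=seed)
    model = LassoCV(cv=cv, random_state=seed).fit(X, y)
    chosen.append(model.alpha_)

print(min(chosen), max(chosen), float(np.mean(chosen)))
```

The spread between the smallest and largest selected alpha shows how fold assignment alone changes the signature; a repeated-resampling scheme like the paper's reduces that sensitivity.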
10

Zhao, Zeyuan, Ping Li, Yongjie Dai, Zhaoe Min, and Lei Chen. "Multi-Task Deep Evidential Sequence Learning for Trustworthy Alzheimer’s Disease Progression Prediction." Applied Sciences 13, no. 15 (August 3, 2023): 8953. http://dx.doi.org/10.3390/app13158953.

Full text
Abstract:
Alzheimer’s disease (AD) is an irreversible neurodegenerative disease. Providing trustworthy AD progression predictions for at-risk individuals contributes to the early identification of AD patients and holds significant value for discovering effective treatments and empowering patients to take proactive care. Recently, although numerous disease progression models based on machine learning have emerged, they often focus solely on enhancing predictive accuracy and ignore measuring the reliability of their results. Consequently, this oversight adversely affects the recognition and acceptance of these models in clinical applications. To address these problems, we propose a multi-task evidential sequence learning model for the trustworthy prediction of disease progression. Specifically, we incorporate evidential deep learning into a multi-task learning framework based on recurrent neural networks. We simultaneously perform AD clinical diagnosis and cognitive score predictions while quantifying the uncertainty of each prediction, without incurring additional computational costs, by leveraging the Dirichlet and Normal-Inverse-Gamma distributions. Moreover, an adaptive weighting scheme is introduced to automatically balance the tasks for more effective training. Finally, experimental results on the TADPOLE dataset validate that our model not only has predictive performance comparable to similar models but also offers reliable quantification of prediction uncertainties, providing a crucial supplementary factor for risk-sensitive AD progression prediction applications.
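The Normal-Inverse-Gamma (NIG) parameterization mentioned in the abstract admits a closed-form uncertainty decomposition in the standard deep evidential regression formulation that this line of work builds on. In the sketch below, a network would normally output (gamma, nu, alpha, beta) per prediction; the values used here are illustrative placeholders, not outputs of the paper's model:

```python
import numpy as np

# NIG uncertainty decomposition from deep evidential regression:
# prediction = gamma, aleatoric variance = E[sigma^2] = beta / (alpha - 1),
# epistemic variance = Var[mu] = beta / (nu * (alpha - 1)).
def nig_uncertainty(gamma, nu, alpha, beta):
    """Return (mean prediction, aleatoric variance, epistemic variance)."""
    aleatoric = beta / (alpha - 1.0)          # expected data noise
    epistemic = beta / (nu * (alpha - 1.0))   # uncertainty about the mean
    return gamma, aleatoric, epistemic

# Placeholder values, e.g. a predicted cognitive score of 27.5.
mean, alea, epis = nig_uncertainty(gamma=27.5, nu=2.0, alpha=3.0, beta=4.0)
print(mean, alea, epis)
```

Small nu or alpha (little "evidence") inflates the epistemic term, which is exactly the quantity a risk-sensitive clinical application would inspect before trusting a prediction.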

Dissertations / Theses on the topic "PREDICTION MODELS APPLICATIONS"

1

Foley, Kristen Madsen. "Multivariate Spatial Temporal Statistical Models for Applications in Coastal Ocean Prediction." NCSU, 2006. http://www.lib.ncsu.edu/theses/available/etd-07042006-110351/.

Full text
Abstract:
Estimating the spatial and temporal variation of surface wind fields plays an important role in modeling atmospheric and oceanic processes. This is particularly true for hurricane forecasting, where numerical ocean models are used to predict the height of the storm surge and the degree of coastal flooding. We use multivariate spatial-temporal statistical methods to improve coastal storm surge prediction using disparate sources of observation data. An Ensemble Kalman Filter is used to assimilate water elevation into a three-dimensional primitive equation ocean model. We find that data assimilation is able to improve the estimates of water elevation for a case study of Hurricane Charley of 2004. In addition, we investigate the impact of inaccuracies in the wind field inputs, which are the main forcing of the numerical model in storm surge applications. A new multivariate spatial statistical framework is developed to improve the estimation of these wind inputs. A spatial linear model of coregionalization (LMC) is used to account for the cross-dependency between the two orthogonal wind components. A Bayesian approach is used for estimation of the parameters of the multivariate spatial model and a physically based wind model while accounting for potential additive and multiplicative bias in the observed wind data. This spatial model consistently improves parameter estimation and prediction for surface wind data in the Hurricane Charley case study when compared to the original physical wind model. These methods are also shown to improve storm surge estimates when used as the forcing fields for the coastal ocean model. Finally, we describe a new framework for estimating multivariate nonstationary spatial-temporal processes based on an extension of the LMC model. We compare this approach to other multivariate spatial models and describe an application to surface wind fields from Hurricane Floyd of 1999.
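The Ensemble Kalman Filter assimilation step mentioned in this abstract has a compact textbook form. The following numpy sketch is a stochastic EnKF update on a toy 3-dimensional state with a single observed component (an illustration of the technique, not the thesis's ocean-model implementation):

```python
import numpy as np

# Stochastic EnKF analysis step. X: state ensemble (n x N), y: observation
# vector, H: observation operator, R: observation error covariance.
def enkf_update(X, y, H, R, rng):
    n, N = X.shape
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                  # ensemble anomalies
    P = A @ A.T / (N - 1)                       # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturb the observation for each member (stochastic EnKF).
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(1)
N = 50
X = rng.normal(0.0, 1.0, (3, N)) + np.array([[5.0], [2.0], [-1.0]])
H = np.array([[1.0, 0.0, 0.0]])                 # observe first component only
R = np.array([[0.1]])
y = np.array([6.0])                             # observation above the prior mean

Xa = enkf_update(X, y, H, R, rng)
print(X[0].mean(), Xa[0].mean())                # analysis mean moves toward y
```

The unobserved components are also updated through the sample cross-covariances in P, which is how assimilating water elevation can correct other model fields.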
2

Dolan, David M. "Spatial statistics using quasi-likelihood methods with applications." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0029/NQ66201.pdf.

Full text
3

Bean, Brennan L. "Interval-Valued Kriging Models with Applications in Design Ground Snow Load Prediction." DigitalCommons@USU, 2019. https://digitalcommons.usu.edu/etd/7579.

Full text
Abstract:
One critical consideration in the design of buildings constructed in the western United States is the weight of settled snow on the roof of the structure. Engineers are tasked with selecting a design snow load that ensures that the building is safe and reliable, without making the construction overly expensive. Western states use historical snow records at weather stations scattered throughout the region to estimate appropriate design snow loads. Various mapping techniques are then used to predict design snow loads between the weather stations. Each state uses different mapping techniques to create their snow load requirements, yet these different techniques have never been compared. In addition, none of the current mapping techniques can account for the uncertainty in the design snow load estimates. We address both issues by formally comparing the existing mapping techniques, as well as creating a new mapping technique that allows the estimated design snow loads to be represented as an interval of values, rather than a single value. In the process, we have improved upon existing methods for creating design snow load requirements and have produced a new tool capable of handling uncertain climate data.
4

Khondaker, Bidoura. "Transferability of community-based macro-level collision prediction models for use in road safety planning applications." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/2867.

Full text
Abstract:
This thesis proposes a methodology and guidelines for community-based, macro-level CPM transferability in road safety planning applications, with models developed in one spatial-temporal region being capable of being used in a different spatial-temporal region. In doing this, the macro-level CPMs developed for the Greater Vancouver Regional District (GVRD) by Lovegrove and Sayed (2006, 2007) were used in a model transferability study. Using those models from the GVRD and data from the Central Okanagan Regional District (CORD) in the Province of British Columbia, Canada, a transferability test was conducted that involved recalibration of the 1996 GVRD models to Kelowna in a 2003 context. The case study was carried out in three parts. First, macro-level CPMs for the City of Kelowna were developed using 2003 data, following the GVRD CPM development and use research. Next, the 1996 GVRD models were recalibrated to see whether they could yield reliable safety estimates for Kelowna in a 2003 context. Finally, a comparison between the results of Kelowna's own models and the transferred models was conducted to determine which models yielded better results. The results of the transferability study revealed that macro-level CPM transferability was possible and no more complicated than micro-level CPM transferability. To facilitate the development of reliable community-based, macro-level collision prediction models, it was recommended that CPMs be transferred rather than developed from scratch whenever and wherever communities lack sufficient data of adequate quality. The transferability guidelines in this research, together with their application in the case studies, are offered as a contribution towards model transferability for road safety planning applications, with models developed in one spatial-temporal region being capable of being used in a different spatial-temporal region.
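Recalibrating a prediction model to a new region is commonly done by rescaling its predictions with a calibration factor estimated from local observations. The sketch below illustrates that general idea with a hypothetical macro-level model form and made-up data (the model equation, parameters, and numbers are all assumptions for illustration; the thesis's actual recalibration procedure may differ):

```python
import numpy as np

# Hypothetical macro-level CPM form (illustrative only):
# collisions = a * vehicle_km^b1 * intersections^b2
def predict_collisions(veh_km, intersections, a=0.8, b1=0.6, b2=0.3):
    return a * veh_km ** b1 * intersections ** b2

# Made-up data for four zones in a "new" region.
observed = np.array([12.0, 30.0, 8.0, 21.0])     # locally observed collisions
veh_km = np.array([40.0, 120.0, 25.0, 80.0])
inters = np.array([10.0, 35.0, 6.0, 22.0])

predicted = predict_collisions(veh_km, inters)
# Calibration factor rescales the transferred model to local totals.
calibration_factor = observed.sum() / predicted.sum()
recalibrated = calibration_factor * predicted
print(calibration_factor, recalibrated.sum())
```

After rescaling, the recalibrated totals match the local observations by construction; whether the transferred model then outperforms a locally fitted one is exactly the comparison the thesis carries out.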
5

MacLellan, Christopher J. "Computational Models of Human Learning: Applications for Tutor Development, Behavior Prediction, and Theory Testing." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/1054.

Full text
Abstract:
Intelligent tutoring systems are effective for improving students’ learning outcomes (Bowen et al., 2013; Koedinger & Anderson, 1997; Pane et al., 2013). However, constructing tutoring systems that are pedagogically effective has been widely recognized as a challenging problem (Murray, 1999, 2003). In this thesis, I explore the use of computational models of apprentice learning, or computer models that learn interactively from examples and feedback, to support tutor development. In particular, I investigate their use for authoring expert models via demonstrations and feedback (Matsuda et al., 2014), predicting student behavior within tutors (VanLehn et al., 1994), and testing alternative learning theories (MacLellan, Harpstead, Patel, & Koedinger, 2016). To support these investigations, I present the Apprentice Learner Architecture, which posits the types of knowledge, performance, and learning components needed for apprentice learning and enables the generation and testing of alternative models. I use this architecture to create two models: the DECISION TREE model, which non-incrementally learns when to apply its skills, and the TRESTLE model, which instead learns incrementally. Both models draw on the same small set of prior knowledge for all simulations (six operators and three types of relational knowledge). Despite their limited prior knowledge, I demonstrate their use for efficiently authoring a novel experimental design tutor and show that they are capable of achieving human-level performance in seven additional tutoring systems that teach a wide range of knowledge types (associations, categories, and skills) across multiple domains (language, math, engineering, and science). I show that the models are capable of predicting which versions of a fraction arithmetic tutor and a box-and-arrows tutor are more effective for human students’ learning. Further, I use a mixed-effects regression analysis to evaluate the fit of the models to the available human data and show that across all seven domains the TRESTLE model fits the human data better than the DECISION TREE model, supporting the theory that humans learn the conditions under which skills apply incrementally, rather than non-incrementally as prior work has suggested (Li, 2013; Matsuda et al., 2009). This work lays the foundation for the development of a Model Human Learner, similar to Card, Moran, and Newell’s (1986) Model Human Processor, that encapsulates psychological and learning science findings in a format that researchers and instructional designers can use to create effective tutoring systems.
6

Al-Shammari, Dhahi Turki Jadah. "Remote sensing applications for crop type mapping and crop yield prediction for digital agriculture." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/29771.

Full text
Abstract:
This thesis addresses important topics in agricultural modelling research. Chapter 1 describes the importance of land productivity and the pressure on the agricultural sector to provide food. In Chapter 2, a summer crop type mapping model is developed to map major cotton fields in-season in the Murray Darling Basin (MDB) in Australia. In Chapter 3, a robust crop classification model is designed to classify two major crops (cereals and canola) in the MDB. Chapter 4 focuses on exploring changes in prediction quality with changes in the spatial resolution of the predictors and the predictions; more specifically, it investigates whether inputs should be resampled prior to modelling, or the modelling implemented first with the aggregation of predictions happening as a final step. In Chapter 5, a new vegetation index is proposed that exploits the three red-edge bands provided by the Sentinel-2 satellite to capture changes in the transition region between the photosynthetically affected region (the red region) and the near-infrared (NIR) region, which is affected by cell structure and leaf layers. Chapter 6 tests the potential of integrating two mechanistic-type model products (biomass and soil moisture) into the DDM models. Chapter 7 discusses each technique used in this thesis, the outcomes of each technique, and the relationships between these outcomes. This thesis addresses the topics and questions posed at the beginning of this research, and the outcomes are listed in each chapter.
7

Sobhani, Negin. "Applications, performance analysis, and optimization of weather and air quality models." Diss., University of Iowa, 2017. https://ir.uiowa.edu/etd/5996.

Full text
Abstract:
Atmospheric particulate matter (PM) is linked to various adverse environmental and health impacts. PM in the atmosphere reduces visibility, alters precipitation patterns by acting as cloud condensation nuclei (CCN), and changes the Earth’s radiative balance by absorbing or scattering solar radiation in the atmosphere. The long-range transport of pollutants leads to increased PM concentrations even in remote locations such as polar regions and mountain ranges. One significant effect of PM on the Earth’s climate occurs when light-absorbing PM, such as black carbon (BC), deposits over snow. In the Arctic, BC deposition on highly reflective surfaces (e.g., glaciers and sea ice) has very intense effects, causing snow to melt more quickly. Thus, characterizing PM sources, identifying long-range transport pathways, and quantifying the climate impacts of PM are crucial in order to inform emission abatement policies for reducing both the health and environmental impacts of PM. Chemical transport models provide mathematical tools for better understanding the atmospheric system, including chemical and particle transport, pollution diffusion, and deposition. The technological and computational advances of the past decades allow higher-resolution air quality and weather forecast simulations with more accurate representations of the physical and chemical mechanisms of the atmosphere. Due to the significant effect of air pollutants on public health and the environment, several countries and cities perform air quality forecasts to warn the population about future air pollution events and take local preventive measures, such as traffic regulations, to minimize the impacts of the forecasted episode. However, the costs associated with complex air quality forecast models, especially for higher-resolution simulations, make “forecasting” a challenge.
This dissertation also focuses on applications, performance analysis, and optimization of meteorology and air quality forecasting models. It presents several modeling studies at various scales to better understand the transport of aerosols from different geographical sources and economic sectors (i.e., transportation, residential, industry, biomass burning, and power) and to quantify their climate impacts. The simulations are evaluated using various observations, including ground-site measurements, field campaigns, and satellite data. The sector-based modeling studies elucidate the importance of various economic sectors and geographical regions for global air quality and the climatic impacts associated with BC. This dissertation provides policy makers with implications to inform emission mitigation policies targeting the source sectors and regions with the highest impacts. Furthermore, advances were made to better understand the impacts of light-absorbing particles on climate and surface albedo. Finally, to improve modeling speed, the performance of the models is analyzed and optimizations are proposed to improve their computational efficiency. These optimizations show a significant improvement in the performance of the Weather Research and Forecasting (WRF) and WRF-Chem models. The modified code was validated and incorporated back into the WRF source code to benefit all WRF users. Although weather and air quality models are shown to be an excellent means for forecasting applications at both local and hemispheric scales, further studies are needed to optimize the models and improve the performance of the simulations.
APA, Harvard, Vancouver, ISO, and other styles
8

Rose, Peter. "Prediction of Fish Assemblages in Eastern Australian Streams Using Species Distribution Models: Linking Ecological Theory, Statistical Advances and Management Applications." Thesis, Griffith University, 2018. http://hdl.handle.net/10072/384279.

Full text
Abstract:
Rivers and streams are among the most imperilled ecosystems on Earth owing to overexploitation, water quality impacts, altered flow regimes, habitat destruction, the proliferation of alien species and climate change. There is a pressing need to address these threats through stream bioassessment, stream rehabilitation and species conservation actions. Species distribution models (SDMs) offer a practical, spatially explicit means to assess the impact of these threats, prioritise stream rehabilitation and direct conservation decisions. However, applications of SDMs for stream bioassessment and real-world conservation outcomes in freshwater ecosystems are still in their infancy. This thesis set out to link conceptual advances in fish ecology with emerging statistical methods applied to stream bioassessment and species conservation issues facing eastern Australian freshwater fish species. One of the primary uses of SDMs in freshwater environments is bioassessment, or assessment of “river health”. A network of reference sites underpins most stream bioassessment programs; however, objectively selecting high-quality reference sites remains an ongoing challenge, particularly in highly modified assessment regions. To address the subjectivity associated with ‘best professional judgement’ and similar methods, I developed a novel, data-driven approach using species turnover modelling (generalised dissimilarity modelling) to increase objectivity and transparency in reference site selection. I also tested whether biogeographic legacies of fish assemblages among discrete coastal catchments limited the use of reference sites in southeast Queensland and northeast New South Wales. The data-driven approach was then used to select reference sites and sample fish assemblages to develop freshwater fish SDMs for subsequent data chapters. Another factor potentially limiting the accuracy of SDMs for bioassessment and conservation is the modelling strategy employed.
In particular, site-specific models for stream bioassessment usually still use ‘shortcut’ methods such as community classification and discriminant function analysis, despite growing evidence that machine learning algorithms provide greater predictive performance. I tested how reference coastal fish assemblages are structured in relation to different species assembly theories (e.g., species arrangement in discrete communities, species sorting independently across environmental gradients, or elements of both) by comparing modelling approaches reflective of these processes (community-level modelling, stacked ‘single species’ models and multi-species response models). Evaluation of the modelling was used to determine which of these paradigms best suits stream bioassessment and other conservation applications such as survey gap analysis, estimating range changes owing to climate or land use change, and estimating biodiversity. The taxonomic completeness index is the most commonly used site-specific index in stream bioassessment programs, despite several recognised limitations, including the use of an arbitrary threshold; the omission of rare taxa that may be responsive to subtle levels of disturbance; and the omission of potentially useful information on taxa gained at disturbed sites. I developed and tested an index that incorporates both native species losses and gains of tolerant and alien species in a unified index of assemblage change for stream bioassessment. This study used a single-species ensemble modelling approach to predict species occurrence and combined the predictions into an index akin to Bray-Curtis dissimilarity. The resultant index, ‘BCA’, markedly outperformed the widely used taxonomic completeness index derived from community classification (discriminant function analysis) models and has considerable potential for improving the sensitivity of stream bioassessment indices for a range of freshwater indicators (e.g.
diatoms, macroinvertebrates, macrophytes). Very few peer-reviewed SDM studies have ‘real world’ conservation applications; most are instead academic exercises concerned with methodological challenges, or hypothetical examples of how one might apply an SDM to a conservation problem. To address this gap between modelling and management, I used SDMs to inform a conservation plan for the declining southern pygmy perch (Murray-Darling Basin lineage) (Nannoperca australis) in northern Victoria. This study incorporated alien species abundance models as predictors in an ensemble SDM to identify remnant habitats of this declining species. The models indicated that ~70% of N. australis habitat has become unsuitable since European settlement owing to anthropogenic pressures and interactions with alien fish species, particularly brown trout (Salmo trutta). Model outputs were used for survey gap analysis and to identify stream segments suitable for targeted management and reintroduction of the species. This study formed the basis for a captive breeding and translocation plan for southern pygmy perch in northern Victoria. The thesis concludes with practical lessons from these modelling studies for freshwater bioassessment and conservation practitioners, namely: (1) machine learning multispecies response and ensemble models offer improved predictive performance compared with traditional approaches, and model choice depends on the intended use of the model; (2) the newly developed index, “BCA”, is more conceptually sound and sensitive than the traditionally used taxonomic completeness index for stream bioassessment; and (3) SDMs developed using readily available, high-quality stream bioassessment datasets provide an excellent foundation for applied freshwater fish species conservation and management. The thesis closes with future challenges and directions for freshwater fish SDM research.
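The ‘BCA’ index described above combines predicted native-species losses with gains of tolerant and alien taxa into a single dissimilarity-style score. As an illustrative sketch only (the thesis's exact formulation is not reproduced here; species names and values are made up), a Bray-Curtis-style comparison of model-predicted and observed assemblages might look like:

```python
# Hypothetical sketch of a Bray-Curtis-style assemblage-change index that
# compares reference-condition occurrence predictions with observations.
# The actual "BCA" formulation in the thesis may differ.
def bray_curtis_like(predicted, observed):
    """Dissimilarity between predicted occurrence probabilities and
    observed presence/absence (both dicts keyed by species)."""
    species = set(predicted) | set(observed)
    num = sum(abs(predicted.get(s, 0.0) - observed.get(s, 0)) for s in species)
    den = sum(predicted.get(s, 0.0) + observed.get(s, 0) for s in species)
    return num / den if den else 0.0

pred = {"sp_a": 0.9, "sp_b": 0.7, "sp_c": 0.2}   # reference-condition predictions
obs = {"sp_a": 1, "sp_c": 1, "alien_x": 1}       # observed at a test site
score = bray_curtis_like(pred, obs)              # 0 = as expected, 1 = maximal change
```

Because alien or tolerant taxa absent from the reference prediction still enter the numerator, gains at disturbed sites raise the score, which is the property the abstract highlights over taxonomic completeness.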
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Environment and Science
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
9

García, Durán Alberto. "Learning representations in multi-relational graphs : algorithms and applications." Thesis, Compiègne, 2016. http://www.theses.fr/2016COMP2271/document.

Full text
Abstract:
The Internet puts an enormous amount of information at our fingertips, on such a variety of topics that everyone is able to access a huge variety of knowledge. Such a large quantity of information could bring a leap forward in many areas (search engines, question answering, NLP-related tasks) if used properly. Accordingly, a crucial challenge for the artificial intelligence community has been to gather, organize and make intelligent use of this growing amount of available knowledge. Fortunately, significant efforts have been made for some time now in collecting and organizing knowledge, and a lot of structured information can be found in repositories called Knowledge Bases (KBs). Freebase, Facebook's Entity Graph and Google's Knowledge Graph are good examples of KBs. A major problem with KBs is that they are far from complete. For example, in Freebase only about 30% of people have information about their nationality. This thesis presents several methods for adding new links between existing entities of a KB, based on learning representations that optimize a defined energy function. These models can also be used to assign probabilities to triples extracted from the Web. We also propose a novel application that uses this structured information to generate unstructured information (specifically, questions in natural language). We frame this problem as a machine translation task, where the input is not a natural language but a structured one. We adapt the RNN encoder-decoder to this setting to make such translation possible
The Internet provides a huge amount of information at hand on such a variety of topics that everyone is now able to access any kind of knowledge. Such a large quantity of information could bring a leap forward in many areas if used properly. Accordingly, a crucial challenge of the Artificial Intelligence community has been to gather, organize and make intelligent use of this growing amount of available knowledge. Fortunately, important efforts have been made in gathering and organizing knowledge for some time now, and a lot of structured information can be found in repositories called Knowledge Bases (KBs). A main issue with KBs is that they are far from complete. This thesis proposes several methods to add new links between the existing entities of a KB based on learning representations that optimize a defined energy function. We also propose a novel application that makes use of this structured information to generate questions in natural language
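The link-prediction methods summarized above score candidate triples with a learned energy function. As an illustrative sketch in the spirit of translation-based embedding models such as TransE (the thesis's exact models and these toy vectors are assumptions), a lower energy marks a more plausible triple:

```python
import math

# Illustrative translation-style energy for KB triples: plausible triples
# should satisfy head + relation ~ tail in embedding space, so their
# energy (the residual norm) is low. Embeddings here are toy values.
def energy(head, relation, tail):
    """Lower energy = more plausible triple."""
    return math.sqrt(sum((h + r - t) ** 2 for h, r, t in zip(head, relation, tail)))

emb = {  # hypothetical 3-d embeddings
    "Paris": [0.9, 0.1, 0.0], "France": [1.0, 0.1, 0.5],
    "capital_of": [0.1, 0.0, 0.5], "Berlin": [0.0, 0.8, 0.2],
}
plausible = energy(emb["Paris"], emb["capital_of"], emb["France"])
implausible = energy(emb["Berlin"], emb["capital_of"], emb["France"])
# The model would rank (Paris, capital_of, France) above (Berlin, capital_of, France).
```

In training, embeddings are learned so that observed KB triples receive lower energy than corrupted ones, which is what allows scoring candidate links for KB completion.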
APA, Harvard, Vancouver, ISO, and other styles
10

Asiri, Aisha. "Applications of Game Theory, Tableau, Analytics, and R to Fashion Design." DigitalCommons@Robert W. Woodruff Library, Atlanta University Center, 2018. http://digitalcommons.auctr.edu/cauetds/146.

Full text
Abstract:
This thesis presents various models for the fashion industry to predict the profits of selected products. To determine the expected performance of each product in 2016, we used tools from game theory to identify expected values. We went further, performing simple linear regression and using scatter plots to further predict the performance of Prada's products. Tools from game theory, analytics, and statistics were thus combined to predict the performance of some of Prada's products, and the Tableau platform was used to visualize an overview of their performance. Together, these tools aided in finding better predictions of Prada's product performance.
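The simple-linear-regression step described above can be sketched as an ordinary least-squares fit. The predictor, response, and data below are hypothetical placeholders, not Prada's actual figures:

```python
# Minimal least-squares fit of y = b0 + b1*x, as used for simple linear
# regression forecasts. Data values are made up for illustration.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx          # (b1, b0)

xs = [1.0, 2.0, 3.0, 4.0]   # e.g., a product attribute (hypothetical)
ys = [2.1, 3.9, 6.2, 7.8]   # e.g., observed profits (hypothetical)
b1, b0 = fit_line(xs, ys)
forecast = b0 + b1 * 5.0    # predicted profit at x = 5
```

A scatter plot of `xs` against `ys` with this fitted line overlaid is the kind of visual check the abstract describes.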
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "PREDICTION MODELS APPLICATIONS"

1

International Workshop on Epileptic Seizure Prediction (3rd 2007 Freiburg im Breisgau, Germany). Seizure prediction in epilepsy: From basic mechanisms to clinical applications. Weinheim: Wiley-VCH, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

United States. National Telecommunications and Information Administration., ed. Medium frequency propagation prediction techniques and antenna modeling for Intelligent Transportation Systems (ITS) broadcast applications. [Boulder, Colo.]: U.S. Dept. of Commerce, National Telecommunications and Information Administration, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

DeMinco, N. Medium frequency propagation prediction techniques and antenna modeling for Intelligent Transportation Systems (ITS) broadcast applications. [Boulder, Colo.]: U.S. Dept. of Commerce, National Telecommunications and Information Administration, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

United States. National Telecommunications and Information Administration, ed. Medium frequency propagation prediction techniques and antenna modeling for Intelligent Transportation Systems (ITS) broadcast applications. [Boulder, Colo.]: U.S. Dept. of Commerce, National Telecommunications and Information Administration, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Center, Langley Research, ed. Empirical modeling of environment-enhanced fatigue crack propagation in structural alloys for component life prediction. Hampton, Va: National Aeronautics and Space Administration, Langley Research Center, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Automotive model predictive control: Models, methods and applications. Berlin: Springer, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

McMillan, Gregory K. Models unleashed: Virtual plant and model predictive control applications : a pocket guide. Research Triangle Park, NC: ISA, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Phillips, Peter C. B. Bayesian model selection and prediction with empirical applications. New Haven, CT: Yale University, Cowles Foundation, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

James, Judge W., Sebastian Lynne, and Altschul Jeffrey H, eds. Quantifying the present and predicting the past: Theory, method, and application of archeological predictive modeling. Denver, Colo: U.S. Dept. of the Interior, Bureau of Land Management, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

United States. National Aeronautics and Space Administration., ed. Modeling improvements and users manual for axial-flow turbine off-design computer code AXOD. [Washington, DC]: National Aeronautics and Space Administration, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "PREDICTION MODELS APPLICATIONS"

1

Steyerberg, E. W. "Applications of prediction models." In Statistics for Biology and Health, 11–31. New York, NY: Springer New York, 2008. http://dx.doi.org/10.1007/978-0-387-77244-8_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Steyerberg, Ewout W. "Applications of Prediction Models." In Statistics for Biology and Health, 15–36. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-16399-0_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gopakumar, Shivapratap, Truyen Tran, Dinh Phung, and Svetha Venkatesh. "Stabilizing Linear Prediction Models Using Autoencoder." In Advanced Data Mining and Applications, 651–63. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-49586-6_46.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Briggs, William M. "Testing, Prediction, and Cause in Econometric Models." In Econometrics for Financial Applications, 3–19. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-73150-6_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Agarwal, Shrey, Yashaswi Upmon, Riyan Pahuja, Ganesh Bhandarkar, and Suresh Chandra Satapathy. "Student Performance Prediction Using Classification Models." In Smart Intelligent Computing and Applications, Volume 1, 187–96. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9669-5_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, David, Dongmin Guo, and Ke Yan. "Improving the Transfer Ability of Prediction Models." In Breath Analysis for Medical Applications, 91–112. Singapore: Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-4322-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Skovranek, Tomas, and Vladimir Despotovic. "Signal prediction using fractional derivative models." In Applications in Engineering, Life and Social Sciences, Part B, edited by Dumitru Bǎleanu and António Mendes Lopes, 179–206. Berlin, Boston: De Gruyter, 2019. http://dx.doi.org/10.1515/9783110571929-007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Psathas, Anastasios Panagiotis, Lazaros Iliadis, Dimitra V. Achillopoulou, Antonios Papaleonidas, Nikoleta K. Stamataki, Dimitris Bountas, and Ioannis M. Dokas. "Autoregressive Deep Learning Models for Bridge Strain Prediction." In Engineering Applications of Neural Networks, 150–64. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08223-8_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Boudali, Imen, and Ines Belhadj Messaoud. "Machine Learning Models for Toxicity Prediction in Chemotherapy." In Intelligent Systems Design and Applications, 350–64. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-35510-3_34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Alsinglawi, Belal, Fady Alnajjar, Omar Mubin, Mauricio Novoa, Ola Karajeh, and Omar Darwish. "Benchmarking Predictive Models in Electronic Health Records: Sepsis Length of Stay Prediction." In Advanced Information Networking and Applications, 258–67. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44041-1_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "PREDICTION MODELS APPLICATIONS"

1

Camps, Octavia I. "Prediction models from CAD models of 3D objects." In Applications in Optical Science and Engineering, edited by David P. Casasent. SPIE, 1992. http://dx.doi.org/10.1117/12.131616.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Giang, Le Truong, Dongwon Kang, and Doo-Hwan Bae. "Software Fault Prediction Models for Web Applications." In 2010 IEEE 34th Annual Computer Software and Applications Conference Workshops (COMPSACW). IEEE, 2010. http://dx.doi.org/10.1109/compsacw.2010.19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Boni, Mohammad Al, and Matthew S. Gerber. "Area-Specific Crime Prediction Models." In 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA). IEEE, 2016. http://dx.doi.org/10.1109/icmla.2016.0118.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ostrowski, David Alfred. "Model segmentation for numerical prediction." In 2009 IEEE Workshop on Hybrid Intelligent Models and Applications (HIMA). IEEE, 2009. http://dx.doi.org/10.1109/hima.2009.4937821.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Nguyen, Tien M., Hien T. Tran, Zhonghai Wang, Amanda Coons, Charles C. Nguyen, Steven A. Lane, Khanh D. Pham, Genshe Chen, and Gang Wang. "RFI modeling and prediction approach for SATOP applications: RFI prediction models." In SPIE Defense + Security, edited by Khanh D. Pham and Genshe Chen. SPIE, 2016. http://dx.doi.org/10.1117/12.2223518.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kalajdjieski, Jovan, Georgina Mirceva, and Slobodan Kalajdziski. "Attention Models for PM2.5 Prediction." In 2020 IEEE/ACM International Conference on Big Data Computing, Applications and Technologies (BDCAT). IEEE, 2020. http://dx.doi.org/10.1109/bdcat50828.2020.00010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Boring, Matthew A., Wei Zhang, and William A. Bruce. "Improved Burnthrough Prediction Model for In-Service Welding Applications." In 2008 7th International Pipeline Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/ipc2008-64353.

Full text
Abstract:
When welding onto an in-service pipeline to repair a damaged section of pipe or to install a branch connection (i.e., hot tapping), there are two main concerns: burnthrough and hydrogen cracking. The risk of burnthrough is typically evaluated by predicting the inside surface temperature of the pipeline using industry-trusted computer models (e.g., the Battelle or PRCI models). The objective of this project was to evaluate alternatives to the burnthrough prediction approach currently used by the Battelle and PRCI models and to identify and validate an improved approach. The improved approach developed for burnthrough prediction is based on a two-dimensional (2-D) thermo-mechanical FEA model that uses ABAQUS and EWI-developed proprietary user subroutines (the 46345 model). The easy-to-use graphical user interface (GUI) is based on Microsoft Excel and allows the user to run the numerical analysis with a few mouse clicks. The 46345 model was based on circumferential and bead-on-pipe welds, which simulate the first layer of a temper-bead in-service welding procedure or a weld metal deposition repair. The effects of various parameters such as pressure, wall thickness, pipe diameter, and welding direction were quantitatively studied using the 46345 model and compared to cross sections of experimental welds made under the same conditions. The 46345 model's circumferential weld predictions were in good agreement with experimental weld cross sections and were able to reduce the over-conservatism of the PRCI model. The 46345 model's longitudinal weld predictions were in less-than-adequate agreement with the experimental weld cross sections and were not able to reduce the over-conservatism of the PRCI model. It is important to note that, even though the 46345 model predicts the inside surface temperature during the analysis, that temperature is not used in determining the burnthrough risk. The burnthrough risk is based solely on the magnitude of the radial displacement, which may be a better measure of burnthrough risk than the inside surface temperature.
APA, Harvard, Vancouver, ISO, and other styles
8

Syeed, Miah Mohammad Asif, Maisha Farzana, Ishadie Namir, Ipshita Ishrar, Meherin Hossain Nushra, and Tanvir Rahman. "Flood Prediction Using Machine Learning Models." In 2022 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA). IEEE, 2022. http://dx.doi.org/10.1109/hora55278.2022.9800023.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kerce, J. Clayton, and Francois Vandenberghe. "A lower bound for prediction uncertainty in nowcasting/forecasting models." In Optical Engineering + Applications, edited by Xiaolei Zou, Dale Barker, and Francois-Xavier Le Dimet. SPIE, 2007. http://dx.doi.org/10.1117/12.740665.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Lee, Yunli, Leslie Tiong Ching Ow, and David Ngo Chek Ling. "Hidden Markov Models for Forex Trends Prediction." In 2014 International Conference on Information Science and Applications (ICISA). IEEE, 2014. http://dx.doi.org/10.1109/icisa.2014.6847408.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "PREDICTION MODELS APPLICATIONS"

1

Chernozhukov, Victor, Alexandre Belloni, and Mingli Chen. Quantile graphical models: prediction and conditional independence with applications to systemic risk. The IFS, December 2017. http://dx.doi.org/10.1920/wp.cem.2017.5417.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Meier, David C. Operational Exploitation of Satellite-Based Sounding Data and Numerical Weather Prediction Models for Directed Energy Applications. Fort Belvoir, VA: Defense Technical Information Center, December 2015. http://dx.doi.org/10.21236/ad1003080.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bubenik, T. A., R. D. Fischer, G. R. Whitacre, D. J. Jones, J. F. Kiefner, M. Cola, and W. A. Bruce. API-WCR Investigation and Prediction of Cooling Rates During Pipeline Maintenance Welding. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), December 1991. http://dx.doi.org/10.55274/r0011852.

Full text
Abstract:
Investigates and improves methods of predicting cooling rates during pipeline maintenance welding. This project was funded by the American Petroleum Institute. The work was performed by Battelle Memorial Institute and Edison Welding Institute. The scope of work included (1) a review of three previous research efforts to develop satisfactory methods for welding appurtenances to in-service pipelines, (2) a review of pipeline leak and rupture incidents associated with appurtenances, (3) the enhancement of existing analytical models for predicting cooling rates and temperatures during welding on an in-service pipeline, and (4) validation of the thermal-analysis models by performing welds on pipelines carrying three different liquid-petroleum products. The thermal-analysis models can be used to help develop maintenance welding procedures for repair and hot-tap welding applications and to reassess the condition of existing installations. This work was co-funded by PRC.
APA, Harvard, Vancouver, ISO, and other styles
4

Cheng and Wang. L52025 Calibration of the PRCI Thermal Analysis Model for Hot Tap Welding. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), January 2004. http://dx.doi.org/10.55274/r0010298.

Full text
Abstract:
In-service welding is a common industrial practice for both maintenance and repair purposes. Its applications include, but are not limited to, repair of pipeline damage caused by construction or corrosion and hot tap welding used to add branch connections to existing pipelines. In-service welding makes it possible to maintain and repair pipelines without removing them from service. Such welding operations generate significant economic and environmental benefits, for example no interruption of pipeline operations and no venting of pipeline contents. One of the common problems associated with in-service welding is hydrogen cracking. Pipeline operating conditions combined with poorly chosen welding procedures can lead to high heat-affected zone (HAZ) hardness values, which in turn can cause hydrogen cracking. The risk of hydrogen cracking is particularly high for older pipeline materials with high carbon equivalents. The objective of the project was to produce a HAZ hardness prediction procedure significantly improved over the one in the current PRCI thermal analysis software by utilizing state-of-the-art phase transformation models for steels. Systematic validation of the prediction algorithms was conducted using extensive experimental data from actual welds. The hardness prediction model is expected to become the basis on which the hardness prediction module of the PRCI thermal analysis software will be upgraded and improved.
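HAZ hardness in these procedures is largely controlled by the cooling time from 800 to 500 °C (t8/5). As a rough illustration only, the classical thick-plate (3-D heat flow) Rosenthal approximation gives t8/5 from heat input and preheat; this textbook formula is not the project's calibrated model, and the conductivity value is an assumed typical figure for carbon steel:

```python
import math

# Classical thick-plate Rosenthal estimate of the 800-500 C cooling time
# t8/5 (seconds). Illustrative sketch only; the PRCI software's calibrated
# thermal model accounts for pipe wall thickness and flowing contents.
def t85_thick_plate(heat_input_j_per_mm, preheat_c=20.0, conductivity=0.041):
    """conductivity in J/(mm*s*C); ~0.041 is typical for carbon steel."""
    q = heat_input_j_per_mm
    return (q / (2 * math.pi * conductivity)) * (
        1.0 / (500.0 - preheat_c) - 1.0 / (800.0 - preheat_c))

t = t85_thick_plate(1000.0)   # ~3 s for 1 kJ/mm on a cold, thick section
```

Shorter t8/5 (faster cooling, as on a flowing in-service line) promotes harder HAZ microstructures, which is why cooling-rate prediction underpins the hardness model described above.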
APA, Harvard, Vancouver, ISO, and other styles
5

Harris, Aubrey, Nathan Richards, and S. McKay. Defining levels of effort for ecological models. Engineer Research and Development Center (U.S.), September 2023. http://dx.doi.org/10.21079/11681/47642.

Full text
Abstract:
While models are useful tools for decision-making in environmental management, the question arises of how much effort is required to develop an effective model for a given application. In some cases, it is unclear whether more analysis would lead to choosing a better course of action. This technical note (TN) examines the role of ecological model complexity in ecosystem management. First, model complexity is examined through the lens of risk-informed planning. Second, a framework is presented for categorizing five levels of effort, ranging from conceptual models to detailed predictive tools. This framework is proposed to enhance communication and provide consistency in ecological modeling applications. Third, the level-of-effort framework is applied to a set of models in the Middle Rio Grande River system to demonstrate its utility and application. Ultimately, this TN seeks to guide planners in determining an appropriate level of effort relative to the risks associated with uncertainty and the resources available for a given application.
APA, Harvard, Vancouver, ISO, and other styles
6

Wei, Dongmei, Yang Sun, and Rongtao Chen. Risk prediction model for ISR after coronary stenting-a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, April 2023. http://dx.doi.org/10.37766/inplasy2023.4.0014.

Full text
Abstract:
Review question / Objective: The efficacy of risk prediction models for ISR. Condition being studied: Coronary heart disease (CHD), with its high morbidity and mortality rates, remains a serious public health concern around the world. PCI is fast becoming a key instrument of revascularization for patients with CHD, as well as an important technology in the management of CHD patients. Although the clinical application of coronary stents brought about a dramatic improvement in patients’ clinical and procedural outcomes, the mid- and long-term outcomes of stent implantation remain significantly hampered by the risk of developing ISR, with a prevalence of 3–20% over time. Predictive models have the advantage of formally combining risk factors to allow more accurate risk estimation, and it is essential to establish such a model to predict ISR in patients with CAD who undergo drug-eluting stent (DES) implantation. However, predictive model performance needs further evaluation.
APA, Harvard, Vancouver, ISO, and other styles
7

Blanco, Roberto, Elena Fernández, Miguel García-Posada, and Sergio Mayordomo. An estimation of the default probabilities of Spanish non-financial corporations and their application to evaluate public policies. Madrid: Banco de España, September 2023. http://dx.doi.org/10.53479/33512.

Full text
Abstract:
We model the one-year-ahead probability of default of Spanish non-financial corporations using data for the period 1996-2019. While most previous literature considers a firm to be in default if it files for bankruptcy, we define default as having non-performing loans during at least three months of a given year. This broader definition allows us to predict firms’ financial distress at an earlier stage, one that generally cannot be observed by researchers, before financial conditions become too severe and firms have to file for bankruptcy or engage in private workouts with their creditors. We estimate, by means of logistic regressions, both a general model that uses all the firms in the sample and six models for different size-sector combinations. The selected explanatory variables are five accounting ratios, which summarise firms’ creditworthiness, and the growth rate of aggregate credit to non-financial corporations, which takes into account the role of credit availability in mitigating the risk of default. Finally, we carry out two applications of our prediction models: we construct credit rating transition matrices, and we evaluate a programme implemented by the Spanish government to provide direct aid to firms severely affected by the COVID-19 crisis.
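The logistic-regression setup described above maps firm-level ratios and a credit-growth term through a logit link to a default probability. A minimal sketch follows; the ratio names, coefficient values, and firm data are hypothetical placeholders, not the paper's estimates:

```python
import math

# Hypothetical one-year-ahead default-probability scorer with a logit link.
# Coefficients and variable names are illustrative, not the paper's estimates.
COEFS = {"intercept": -3.0, "leverage": 1.2, "liquidity": -0.8,
         "profitability": -1.5, "coverage": -0.6, "size": -0.2,
         "credit_growth": -0.9}

def default_probability(firm):
    """P(default within one year) = 1 / (1 + exp(-z)), z = b0 + sum(b_i * x_i)."""
    z = COEFS["intercept"] + sum(COEFS[k] * v for k, v in firm.items())
    return 1.0 / (1.0 + math.exp(-z))

firm = {"leverage": 0.7, "liquidity": 0.3, "profitability": 0.05,
        "coverage": 1.1, "size": 2.0, "credit_growth": 0.04}
p = default_probability(firm)   # a value in (0, 1)
```

Scoring every firm this way yields the probabilities from which credit-rating transition matrices and policy evaluations, as in the paper's two applications, can be built.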
APA, Harvard, Vancouver, ISO, and other styles
8

Zhu, Xian-Kui, Brian Leis, and Tom McGaughy. PR-185-173600-R01 Reference Stress for Metal-loss Assessment of Pipelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), August 2018. http://dx.doi.org/10.55274/r0011516.

Full text
Abstract:
This project focused on quantifying the reference stress to be used in predictive models for assessing the effects of metal loss on pipeline integrity. The results work in concert with the outcomes of project EC-2-7, which examined sources of scatter in metal-loss predictions with respect to the metal-loss defect geometry. The methodology for developing a new reference stress included empirical and finite element analyses, along with comparisons of full-scale experimental results, which indicate that the failure behavior of defect-free pipe depends on the strain hardening rate, n, of the pipe steel. Since the strain hardening rate is often unreported in qualification test records and mill certification reports, the development of the new reference stress sought to use the yield-to-tensile strength ratio (Y/T) as a surrogate for n. This approach would ideally be insensitive to pipe grade and thus allow broad application of the reference stress without increasing scatter or bias across grade levels. This work also compared the resulting metal-loss criterion, with the new reference stress, to the B31G and Modified B31G models using a dataset of approximately 75 full-scale burst test results for test vessels containing isolated defects. The comparison was performed by C-FER Technologies under subcontract to EWI and quantified the prediction bias and prediction variability of the new criterion relative to those of the models widely in use.
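For context on the baseline the new reference stress was compared against, the commonly cited form of the Modified B31G failure-stress estimate for an isolated metal-loss defect can be sketched as follows (check the standard before any real use; the example inputs are arbitrary):

```python
import math

# Commonly cited Modified B31G failure-stress estimate for metal loss
# (flow stress = SMYS + 10 ksi, 0.85 d/t area term, Folias bulging factor).
# Sketch for illustration; consult ASME B31G for the authoritative form.
def modified_b31g_failure_stress(smys_psi, d, t, L, D):
    """d: defect depth, t: wall thickness, L: axial defect length, D: diameter (inches)."""
    flow = smys_psi + 10_000.0
    z = L * L / (D * t)
    if z <= 50.0:
        M = math.sqrt(1 + 0.6275 * z - 0.003375 * z * z)   # Folias factor
    else:
        M = 0.032 * z + 3.3
    return flow * (1 - 0.85 * d / t) / (1 - 0.85 * d / (t * M))

# Example: X52 pipe, 50%-deep, 3-inch-long defect (arbitrary illustrative values)
sf = modified_b31g_failure_stress(52_000, d=0.15, t=0.30, L=3.0, D=24.0)
```

The project's contribution sits in the `flow` term: replacing a fixed flow-stress definition with a reference stress informed by strain hardening (via Y/T) is what the abstract describes.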
APA, Harvard, Vancouver, ISO, and other styles
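For context on the B31G-family criteria this report benchmarks against, here is a sketch of the widely published Modified B31G metal-loss failure-pressure calculation. This is the baseline criterion, not the new PRCI reference stress developed in the report, and the pipe dimensions below are illustrative values, not data from the report's burst-test set.

```python
import math

def modified_b31g_failure_pressure(D, t, d, L, smys):
    """Modified B31G failure pressure in consistent units (here mm and MPa).
    D: outside diameter, t: wall thickness, d: metal-loss depth,
    L: axial defect length, smys: specified minimum yield strength."""
    flow = smys + 69.0                    # flow stress = SMYS + 69 MPa (10 ksi)
    z = L * L / (D * t)
    if z <= 50.0:
        M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z * z)  # Folias bulging factor
    else:
        M = 0.032 * z + 3.3
    # Remaining-strength reduction for a parabolic (0.85 d/t) metal-loss area
    sf = flow * (1.0 - 0.85 * (d / t)) / (1.0 - 0.85 * (d / t) / M)
    return 2.0 * sf * t / D               # thin-wall hoop-stress inversion

# Illustrative 508 mm (20 in) X52 pipe with a 200 mm long, 38%-deep defect
pf_corroded = modified_b31g_failure_pressure(D=508.0, t=7.9, d=3.0, L=200.0, smys=359.0)
pf_intact = modified_b31g_failure_pressure(D=508.0, t=7.9, d=0.0, L=200.0, smys=359.0)
```

Setting d = 0 recovers the defect-free flow-stress limit, which is where a strain-hardening-sensitive (or Y/T-based) reference stress of the kind the report develops would replace the simple SMYS + 69 MPa term.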
9

Vecherin, Sergey, Stephen Ketcham, Aaron Meyer, Kyle Dunn, Jacob Desmond, and Michael Parker. Short-range near-surface seismic ensemble predictions and uncertainty quantification for layered medium. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45300.

Full text
Abstract:
To predict seismic signal propagation, one needs to specify the physical properties and subsurface ground structure of the site. This information is frequently unknown or estimated with significant uncertainty. This paper describes a methodology for probabilistic seismic ensemble prediction for vertically stratified soils and short ranges with no in situ site characterization. Instead of specifying viscoelastic site properties, the methodology operates with probability distribution functions of these properties, taking into account analytical and empirical relationships among viscoelastic variables. This yields ensemble realizations of signal arrivals at specified locations, from which statistical properties of the signals can be estimated. Such ensemble predictions can be useful for preliminary site characterization, for military applications, and for risk analysis of remote or inaccessible locations for which no data can be acquired. Comparison with experiments revealed that measured signals are not always within the predicted ranges of variability. Variance-based global sensitivity analysis has shown that the most significant parameters for signal amplitude predictions in the developed stochastic model are the uncertainties in the shear quality factor and the Poisson ratio above the water table depth.
APA, Harvard, Vancouver, ISO, and other styles
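The ensemble idea in this abstract, sampling uncertain viscoelastic properties from assumed distributions and reading off percentile bands of the predicted signal, can be sketched with a toy amplitude model. The amplitude formula, parameter ranges, and priors below are invented stand-ins for illustration, not the report's physics or its fitted distributions.

```python
import math
import random
import statistics

def amplitude_proxy(q_shear, poisson, distance_m=100.0):
    """Toy signal amplitude: geometric spreading times material damping.
    Higher shear quality factor Q means less attenuation; the
    Poisson-ratio term in the exponent is purely illustrative."""
    spreading = 1.0 / distance_m
    damping = math.exp(-distance_m / (q_shear * 50.0 * (1.0 - poisson)))
    return spreading * damping

# Monte Carlo ensemble: draw site properties from assumed priors
# instead of fixing a single in situ characterization.
random.seed(1)
ensemble = []
for _ in range(1000):
    q_shear = random.uniform(10.0, 60.0)  # shear quality factor (assumed range)
    poisson = random.uniform(0.2, 0.45)   # Poisson ratio above water table
    ensemble.append(amplitude_proxy(q_shear, poisson))

# Percentile band summarizing prediction uncertainty at the receiver
ensemble.sort()
p05 = ensemble[int(0.05 * len(ensemble))]
p95 = ensemble[int(0.95 * len(ensemble))]
median = statistics.median(ensemble)
```

A variance-based sensitivity analysis of the kind the abstract mentions would then ask how much of the spread between p05 and p95 is attributable to each sampled input.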
10

Oliver, Amanda, Catherine Murphy, Edmund Howe, and John Vest. Comparing methods for estimating water surface elevation between gages in the Lower Mississippi River. Engineer Research and Development Center (U.S.), April 2023. http://dx.doi.org/10.21079/11681/46915.

Full text
Abstract:
Predicting a water surface elevation (WSElev) at a particular location has a wide range of applications, such as determining whether a levee will overtop or how much a dike notch will increase water flow into a secondary channel. Five existing methods for predicting the water’s surface, (1) daily slope, (2) average slope, (3) River Analysis System (RAS) 1D, (4) RAS 2D, and (5) the Adaptive Hydraulics modeling system (AdH), were used to predict the Mississippi River’s daily water surface from 10 October 2014 to 31 May 2016 at the Friar’s Point, Greenville, and Natchez gages. The error, calculated as the model-predicted water surface minus the gage-observed water surface, was compared among the methods. The average slope method, using the Helena and Fair Landing gages, and the daily slope method, using either the Memphis and Helena or the Helena and Arkansas City gages, most closely estimated the observed WSElev. The RAS 1D predictions for Friar Point and Greenville produced more accurate estimates than the RAS 2D model and were the only estimates that did not show a pattern of over- or underestimation. When the daily slope method was applied to gages that were farther apart (Memphis and Arkansas City, Arkansas City and Vicksburg, or Vicksburg and Knoxville), the error became greater than most RAS 1D and 2D predictions. The low error and simple calculations of the daily slope and average slope methods using gages <110 river miles apart make these methods useful for calculating current and historic conditions. The lack of over- or underestimation in the RAS 1D predictions (for locations away from the edges of the model area) makes this method a better choice for predicting average WSElevs and a good choice for forecasting future WSElevs.
APA, Harvard, Vancouver, ISO, and other styles
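The daily slope method the abstract evaluates amounts to linearly interpolating a day's WSElev between an upstream and a downstream gage by river mile. The sketch below uses hypothetical river miles and elevations, not the actual Helena or Fair Landing gage data.

```python
def daily_slope_wse(rm_up, wse_up, rm_down, wse_down, rm_target):
    """Estimate WSElev at rm_target (river mile) by linear interpolation
    between an upstream and a downstream gage, using that day's slope
    computed from the two gage readings."""
    slope = (wse_up - wse_down) / (rm_up - rm_down)  # ft per river mile
    return wse_down + slope * (rm_target - rm_down)

# Hypothetical daily readings: (river mile, WSElev in ft).
upstream_gage = (663.0, 162.4)
downstream_gage = (602.0, 148.9)
predicted = daily_slope_wse(upstream_gage[0], upstream_gage[1],
                            downstream_gage[0], downstream_gage[1],
                            630.0)              # ≈ 155.1 ft
observed = 155.3                                # hypothetical gage reading
error = predicted - observed                    # model-predicted minus observed
```

The abstract's error metric is exactly this signed difference; the average slope variant replaces the single day's slope with a slope averaged over the period of record.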