Academic literature on the topic 'Prediction of survival; Probability; Time models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Prediction of survival; Probability; Time models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Prediction of survival; Probability; Time models"

1

Lan, Yu, and Daniel F. Heitjan. "Adaptive parametric prediction of event times in clinical trials." Clinical Trials 15, no. 2 (January 29, 2018): 159–68. http://dx.doi.org/10.1177/1740774517750633.

Full text
Abstract:
Background: In event-based clinical trials, it is common to conduct interim analyses at planned landmark event counts. Accurate prediction of the timing of these events can support logistical planning and the efficient allocation of resources. As the trial progresses, one may wish to use the accumulating data to refine predictions. Purpose: Available methods to predict event times include parametric cure and non-cure models and a nonparametric approach involving Bayesian bootstrap simulation. The parametric methods work well when their underlying assumptions are met, and the nonparametric method gives calibrated but inefficient predictions across a range of true models. In the early stages of a trial, when predictions have high marginal value, it is difficult to infer the form of the underlying model. We seek to develop a method that will adaptively identify the best-fitting model and use it to create robust predictions. Methods: At each prediction time, we repeat the following steps: (1) resample the data; (2) identify, from among a set of candidate models, the one with the highest posterior probability; and (3) sample from the predictive posterior of the data under the selected model. Results: A Monte Carlo study demonstrates that the adaptive method produces prediction intervals whose coverage is robust within the family of selected models. The intervals are generally wider than those produced assuming the correct model, but narrower than nonparametric prediction intervals. We demonstrate our method with applications to two completed trials: The International Chronic Granulomatous Disease study and Radiation Therapy Oncology Group trial 0129. Limitations: Intervals produced under any method can be badly calibrated when the sample size is small and unhelpfully wide when predicting the remote future. Early predictions can be inaccurate if there are changes in enrollment practices or trends in survival. Conclusions: An adaptive event-time prediction method that selects the model given the available data can give improved robustness compared to methods based on less flexible parametric models.
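The three-step loop in this abstract is easy to illustrate in miniature. The sketch below is a hypothetical Python reduction, not the authors' implementation: it bootstraps censored data, chooses between just two candidate families (exponential and Weibull) using BIC as a crude stand-in for posterior model probability, and then samples a prediction from the selected fit. All data and parameter choices are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def nll_exponential(params, t, d):
    lam = np.exp(params[0])                    # rate > 0 via log-parameterization
    return -np.sum(d * np.log(lam) - lam * t)  # events contribute f, censored S

def nll_weibull(params, t, d):
    k, lam = np.exp(params)                    # shape and scale > 0
    log_h = np.log(k / lam) + (k - 1.0) * np.log(t / lam)
    return -np.sum(d * log_h - (t / lam) ** k)

CANDIDATES = [(nll_exponential, 1, "exp"), (nll_weibull, 2, "weibull")]

def predict_next_event_time(t, d, n_boot=200):
    """95% prediction interval for a new subject's event time."""
    draws = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(t), len(t))            # step 1: resample
        tb, db = t[idx], d[idx]
        best_bic, sampler = np.inf, None
        for nll, n_par, name in CANDIDATES:              # step 2: select a model
            res = minimize(nll, np.zeros(n_par), args=(tb, db), method="Nelder-Mead")
            bic = 2.0 * res.fun + n_par * np.log(len(tb))
            if bic < best_bic:
                best_bic, theta = bic, np.exp(res.x)
                if name == "exp":
                    sampler = lambda lam=theta[0]: rng.exponential(1.0 / lam)
                else:
                    sampler = lambda k=theta[0], s=theta[1]: s * rng.weibull(k)
        draws[b] = sampler()                             # step 3: sample under it
    return np.percentile(draws, [2.5, 97.5])

t = rng.weibull(1.5, 80) * 12.0              # toy follow-up times (months);
d = (rng.random(80) < 0.7).astype(float)     # toy indicators, times reused as-is
print(predict_next_event_time(t, d))
```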
2

Gensheimer, Michael F., and Balasubramanian Narasimhan. "A scalable discrete-time survival model for neural networks." PeerJ 7 (January 25, 2019): e6257. http://dx.doi.org/10.7717/peerj.6257.

Full text
Abstract:
There is currently great interest in applying neural networks to prediction tasks in medicine. It is important for predictive models to be able to use survival data, where each patient has a known follow-up time and event/censoring indicator. This avoids information loss when training the model and enables generation of predicted survival curves. In this paper, we describe a discrete-time survival model that is designed to be used with neural networks, which we refer to as Nnet-survival. The model is trained with the maximum likelihood method using mini-batch stochastic gradient descent (SGD). The use of SGD enables rapid convergence and application to large datasets that do not fit in memory. The model is flexible, so that the baseline hazard rate and the effect of the input data on hazard probability can vary with follow-up time. It has been implemented in the Keras deep learning framework, and source code for the model and several examples is available online. We demonstrate the performance of the model on both simulated and real data and compare it to existing models Cox-nnet and Deepsurv.
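For readers unfamiliar with discrete-time survival likelihoods, a minimal NumPy sketch of the idea follows. It is not the authors' Keras implementation: the interval grid, the convention that a subject censored in interval j is treated as surviving it, and all names are assumptions.

```python
import numpy as np

def discrete_time_nll(h, last, event):
    """h: (n, J) conditional hazards in (0, 1), e.g. sigmoid outputs of a net;
    last: index of the interval holding each subject's event/censoring time;
    event: 1 if the event was observed, 0 if censored."""
    nll = 0.0
    for i in range(len(last)):
        j = last[i]
        nll -= np.sum(np.log1p(-h[i, :j]))    # survived intervals 0 .. j-1
        nll -= np.log(h[i, j]) if event[i] else np.log1p(-h[i, j])
    return nll

def survival_curve(h_i):
    """Predicted S(t_j) for one subject from its conditional hazards."""
    return np.cumprod(1.0 - h_i)

h = np.array([[0.05, 0.10, 0.20],
              [0.02, 0.04, 0.08]])
print(discrete_time_nll(h, last=np.array([2, 1]), event=np.array([1, 0])))
print(survival_curve(h[1]))
```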
3

Ren, Kan, Jiarui Qin, Lei Zheng, Zhengyu Yang, Weinan Zhang, Lin Qiu, and Yong Yu. "Deep Recurrent Survival Analysis." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4798–805. http://dx.doi.org/10.1609/aaai.v33i01.33014798.

Full text
Abstract:
Survival analysis is a hotspot in statistical research for modeling time-to-event information with handling of data censorship, and it has been widely used in many applications such as clinical research, information systems, and other fields with survivorship bias. Many works have been proposed for survival analysis, ranging from traditional statistical methods to machine learning models. However, the existing methodologies either utilize counting-based statistics on the segmented data, or have a pre-assumption on the event probability distribution w.r.t. time. Moreover, few works consider sequential patterns within the feature space. In this paper, we propose a Deep Recurrent Survival Analysis model which combines deep learning for conditional probability prediction at a fine-grained level of the data, and survival analysis for tackling the censorship. By capturing the time dependency through modeling the conditional probability of the event for each sample, our method predicts the likelihood of the true event occurrence and estimates the survival rate over time, i.e., the probability of the non-occurrence of the event, for the censored data. Meanwhile, without assuming any specific form of the event probability distribution, our model shows great advantages over previous works in fitting various sophisticated data distributions. In experiments on three real-world tasks from different fields, our model significantly outperforms the state-of-the-art solutions under various metrics.
4

Li, Kan, and Sheng Luo. "Dynamic predictions in Bayesian functional joint models for longitudinal and time-to-event data: An application to Alzheimer’s disease." Statistical Methods in Medical Research 28, no. 2 (July 28, 2017): 327–42. http://dx.doi.org/10.1177/0962280217722177.

Full text
Abstract:
In the study of Alzheimer’s disease, researchers often collect repeated measurements of clinical variables, event history, and functional data. If the health measurements deteriorate rapidly, patients may reach a level of cognitive impairment and are diagnosed as having dementia. An accurate prediction of the time to dementia based on the information collected is helpful for physicians to monitor patients’ disease progression and to make early informed medical decisions. In this article, we first propose a functional joint model to account for functional predictors in both longitudinal and survival submodels in the joint modeling framework. We then develop a Bayesian approach for parameter estimation and a dynamic prediction framework for predicting the subjects’ future outcome trajectories and risk of dementia, based on their scalar and functional measurements. The proposed Bayesian functional joint model provides a flexible framework to incorporate many features both in joint modeling of longitudinal and survival data and in functional data analysis. Our proposed model is evaluated by a simulation study and is applied to the motivating Alzheimer’s Disease Neuroimaging Initiative study.
5

Alemazkoor, Negin, Conrad J. Ruppert, and Hadi Meidani. "Survival analysis at multiple scales for the modeling of track geometry deterioration." Proceedings of the Institution of Mechanical Engineers, Part F: Journal of Rail and Rapid Transit 232, no. 3 (March 9, 2017): 842–50. http://dx.doi.org/10.1177/0954409717695650.

Full text
Abstract:
Defects in track geometry have a notable impact on the safety of rail transportation. In order to make the optimal maintenance decisions to ensure the safety and efficiency of railroads, it is necessary to analyze the track geometry defects and develop reliable defect deterioration models. In general, standard deterioration models are typically developed for a segment of track. As a result, these coarse-scale deterioration models may fail to predict whether the isolated defects in a segment will exceed the safety limits after a given time period or not. In this paper, survival analysis is used to model the probability of exceeding the safety limits of the isolated defects. These fine-scale models are then used to calculate the probability of whether each segment of the track will require maintenance after a given time period. The model validation results show that the prediction quality of the coarse-scale segment-based models can be improved by exploiting information from the fine-scale defect-based deterioration models.
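The fine-to-coarse aggregation step reduces to one line of probability: if the fine-scale survival models give each isolated defect in a segment a probability of exceeding the safety limit over the horizon, and the defects are, as assumed here purely for illustration, treated as independent, the segment-level maintenance probability follows directly:

```python
import numpy as np

def segment_maintenance_prob(p_defect):
    """Probability that at least one isolated defect in the segment exceeds
    the safety limit (independence across defects is assumed)."""
    p_defect = np.asarray(p_defect, dtype=float)
    return 1.0 - np.prod(1.0 - p_defect)

# three defects with fine-scale exceedance probabilities (invented values)
print(segment_maintenance_prob([0.05, 0.10, 0.02]))   # about 0.162
```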
6

Sun, Zhaohong, Wei Dong, Jinlong Shi, Kunlun He, and Zhengxing Huang. "Attention-Based Deep Recurrent Model for Survival Prediction." ACM Transactions on Computing for Healthcare 2, no. 4 (October 31, 2021): 1–18. http://dx.doi.org/10.1145/3466782.

Full text
Abstract:
Survival analysis exhibits profound effects on health service management. Traditional approaches for survival analysis have a pre-assumption on the time-to-event probability distribution and seldom consider sequential visits of patients on medical facilities. Although recent studies leverage the merits of deep learning techniques to capture non-linear features and long-term dependencies within multiple visits for survival analysis, the lack of interpretability prevents deep learning models from being applied to clinical practice. To address this challenge, this article proposes a novel attention-based deep recurrent model, named AttenSurv, for clinical survival analysis. Specifically, a global attention mechanism is proposed to extract essential/critical risk factors for interpretability improvement. Thereafter, Bi-directional Long Short-Term Memory is employed to capture the long-term dependency on data from a series of visits of patients. To further improve both the prediction performance and the interpretability of the proposed model, we propose another model, named GNNAttenSurv, by incorporating a graph neural network into AttenSurv, to extract the latent correlations between risk factors. We validated our solution on three public follow-up datasets and two electronic health record datasets. The results demonstrated that our proposed models yielded consistent improvement compared to the state-of-the-art baselines on survival analysis.
7

Tan, Ping, Lu Yang, Hang Xu, and Qiang Wei. "Novel perioperative parameters-based nomograms for survival outcomes in upper tract urothelial carcinoma after radical nephroureterectomy." Journal of Clinical Oncology 37, no. 7_suppl (March 1, 2019): 414. http://dx.doi.org/10.1200/jco.2019.37.7_suppl.414.

Full text
Abstract:
Background: Recently, several postoperative nomograms for cancer-specific survival (CSS) after radical nephroureterectomy (RNU) were proposed, but they did not incorporate the same variables; meanwhile, many preoperative blood-based parameters, which were recently reported to be related to survival, were not included in their models. In addition, to date no nomogram for overall survival (OS) has been available. Methods: The full data of 716 patients were available. The whole cohort was randomly divided into two cohorts: the training cohort for developing the nomograms (n = 508) and the validation cohort for validating the models (n = 208). Univariate and multivariate Cox proportional hazards regression models were used for establishing the prediction models. The discriminative accuracy of the nomograms was measured by Harrell’s concordance index (C-index). The clinical usefulness and net benefit of the predictive models were estimated and visualized by using decision curve analysis (DCA). Results: The median follow-up time was 42.0 months (IQR: 18.0-76.0). For CSS, tumor size, grade and pT stage, lymph node metastasis, NLR, PLR and fibrinogen level were identified as independent risk factors in the final model, while tumor grade and pT stage, lymph node metastasis, PLR, Cys-C and fibrinogen level were identified as independent predictors for the OS model. The C-index for CSS prediction was 0.82 (95%CI: 0.79-0.85), and the OS nomogram model had an accuracy of 0.83 (95%CI: 0.80-0.86). The results of bootstrapping showed no deviation from the ideal. The calibration plots for the probability of CSS and OS at 3 or 5 years after RNU showed a favorable agreement between the prediction by the nomograms and actual observation. In the external validation cohort, the C-indexes of the nomograms for predicting CSS and OS were 0.79 (95%CI: 0.74-0.84) and 0.80 (95%CI: 0.75-0.85), respectively. As indicated by calibration plots, optimal agreement was observed between prediction and observation in the external cohort. Conclusions: The nomograms developed and validated based on preoperative blood-based parameters were superior to any single variable for predicting CSS and OS after RNU.
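As a generic illustration of the pipeline this abstract describes (a multivariable Cox model followed by Harrell's C-index), here is a sketch using the lifelines Python package. The tiny data frame, its column names and values are invented and unrelated to the study's cohort; a small ridge penalty is added only to keep the toy fit stable.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

# Hypothetical records: follow-up (months), event (1 = cancer-specific death),
# and a few of the kinds of predictors the abstract mentions.
df = pd.DataFrame({
    "time":       [42, 18, 76, 30, 55, 12, 64, 25, 48, 9],
    "event":      [1, 0, 0, 1, 0, 1, 1, 0, 1, 1],
    "pT":         [2, 1, 1, 3, 2, 3, 3, 1, 2, 3],
    "NLR":        [3.1, 1.8, 2.0, 4.5, 2.7, 5.0, 4.1, 1.5, 3.3, 5.6],
    "fibrinogen": [4.2, 2.9, 3.0, 5.1, 3.6, 5.4, 4.8, 2.7, 4.0, 5.9],
})

cph = CoxPHFitter(penalizer=0.1)   # small penalty stabilizes this toy fit
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])

# Harrell's C-index: higher predicted hazard should mean shorter survival
c = concordance_index(df["time"], -cph.predict_partial_hazard(df), df["event"])
print(f"C-index: {c:.2f}")
```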
8

Liu, Xing-Rong, Yudi Pawitan, and Mark Clements. "Parametric and penalized generalized survival models." Statistical Methods in Medical Research 27, no. 5 (September 1, 2016): 1531–46. http://dx.doi.org/10.1177/0962280216664760.

Full text
Abstract:
We describe generalized survival models, where g(S(t|z)), for link function g, survival S, time t, and covariates z, is modeled by a linear predictor in terms of covariate effects and smooth time effects. These models include proportional hazards and proportional odds models, and extend the parametric Royston–Parmar models. Estimation is described for both fully parametric linear predictors and combinations of penalized smoothers and parametric effects. The penalized smoothing parameters can be selected automatically using several information criteria. The link function may be selected based on prior assumptions or using an information criterion. We have implemented the models in R. All of the penalized smoothers from the mgcv package are available for smooth time effects and smooth covariate effects. The generalized survival models perform well in a simulation study, compared with some existing models. The estimation of smooth covariate effects and smooth time-dependent hazard or odds ratios is simplified, compared with many non-parametric models. Applying these models to three cancer survival datasets, we find that the proportional odds model is better than the proportional hazards model for two of the datasets.
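To make the g(S(t|z)) notation concrete: with a complementary log-log link the linear predictor shifts the log cumulative hazard (a proportional hazards model), while a logit-type link gives a proportional odds model. A tiny sketch of inverting two such links; these conventions are one common choice and not necessarily the paper's exact parameterization:

```python
import numpy as np

def surv_from_eta(eta, link="cloglog"):
    """Invert g(S) = eta for two standard link functions."""
    if link == "cloglog":   # g(S) = log(-log S)   ->  S = exp(-exp(eta))  (PH)
        return np.exp(-np.exp(eta))
    if link == "logit":     # g(S) = log((1-S)/S)  ->  S = 1/(1+exp(eta))  (PO)
        return 1.0 / (1.0 + np.exp(eta))
    raise ValueError(link)

eta = -0.4 + 0.3            # e.g. a smooth time effect plus a covariate effect
print(surv_from_eta(eta, "cloglog"), surv_from_eta(eta, "logit"))
```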
9

Andrinopoulou, Eleni-Rosalina, D. Rizopoulos, Johanna JM Takkenberg, and E. Lesaffre. "Combined dynamic predictions using joint models of two longitudinal outcomes and competing risk data." Statistical Methods in Medical Research 26, no. 4 (June 9, 2015): 1787–801. http://dx.doi.org/10.1177/0962280215588340.

Full text
Abstract:
Nowadays there is an increased medical interest in personalized medicine and tailoring decision making to the needs of individual patients. Within this context our developments are motivated from a Dutch study at the Cardio-Thoracic Surgery Department of the Erasmus Medical Center, consisting of patients who received a human tissue valve in aortic position and who were thereafter monitored echocardiographically. Our aim is to utilize the available follow-up measurements of the current patients to produce dynamically updated predictions of both survival and freedom from re-intervention for future patients. In this paper, we propose to jointly model multiple longitudinal measurements combined with competing risk survival outcomes and derive the dynamically updated cumulative incidence functions. Moreover, we investigate whether different features of the longitudinal processes would change significantly the prediction for the events of interest by considering different types of association structures, such as time-dependent trajectory slopes and time-dependent cumulative effects. Our final contribution focuses on optimizing the quality of the derived predictions. In particular, instead of choosing one final model over a list of candidate models which ignores model uncertainty, we propose to suitably combine predictions from all considered models using Bayesian model averaging.
10

Liu, Chuchu, Anja J. Rueten-Budde, Andreas Ranft, Uta Dirksen, Hans Gelderblom, and Marta Fiocco. "Dynamic prediction of overall survival: a retrospective analysis on 979 patients with Ewing sarcoma from the German registry." BMJ Open 10, no. 10 (October 2020): e036376. http://dx.doi.org/10.1136/bmjopen-2019-036376.

Full text
Abstract:
Objectives: This study aimed at developing a dynamic prediction model for patients with Ewing sarcoma (ES) to provide predictions at different follow-up times. During follow-up, disease-related information becomes available, which has an impact on a patient’s prognosis. Many prediction models include predictors available at baseline and do not consider the evolution of disease over time. Setting: In the analysis, 979 patients with ES from the Gesellschaft für Pädiatrische Onkologie und Hämatologie registry, who underwent surgery and treatment between 1999 and 2009, were included. Design: A dynamic prediction model was developed to predict updated 5-year survival probabilities from different prediction time points during follow-up. Time-dependent variables, such as local recurrence (LR) and distant metastasis (DM), as well as covariates measured at baseline, were included in the model. The time effects of covariates were investigated by using interaction terms between each variable and time. Results: Developing LR, DM in the lungs (DMp) or extrapulmonary DM (DMo) has a strong effect on the probability of surviving an additional 5 years, with HRs and 95% CIs equal to 20.881 (14.365 to 30.353), 6.759 (4.465 to 10.230) and 17.532 (13.210 to 23.268), respectively. The effects of primary tumour location, postoperative radiotherapy (PORT), histological response and disease extent at diagnosis on survival were found to change over time. The HR of PORT versus no PORT at the time of surgery is equal to 0.774 (0.594 to 1.008). One year after surgery, the HR is equal to 1.091 (0.851 to 1.397). Conclusions: The time-varying effects of several baseline variables, as well as the strong impact of time-dependent variables, show the importance of including updated information collected during follow-up in the prediction model to provide accurate predictions of survival.

Dissertations / Theses on the topic "Prediction of survival; Probability; Time models"

1

Ripley, Ruth Mary. "Neural network models for breast cancer prognosis." Thesis, University of Oxford, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.244721.

Full text
2

Jones, Margaret. "Point prediction in survival time models." Thesis, University of Newcastle upon Tyne, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340616.

Full text
3

Kaponen, Martina. "Prediction of survival time of prostate cancer patients using Cox regression." Thesis, Uppsala universitet, Tillämpad matematik och statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-354482.

Full text
4

Rodrigo, Hansapani Sarasepa. "Bayesian Artificial Neural Networks in Health and Cybersecurity." Scholar Commons, 2017. http://scholarcommons.usf.edu/etd/6940.

Full text
Abstract:
Being in the era of Big Data, the applicability and importance of data-driven models like artificial neural networks (ANNs) in modern statistics have increased substantially. In this dissertation, our main goal is to contribute to the development and expansion of these ANN models by incorporating Bayesian learning techniques. We have demonstrated the applicability of these Bayesian ANN models in interdisciplinary research including health and cybersecurity. Breast cancer is one of the leading causes of death among females. Early and accurate diagnosis is a critical component which decides the survival of the patients. Including the well-known "Gail Model", numerous efforts are being made to quantify the risk of diagnosing malignant breast cancer. However, these models impose some limitations on their use for risk prediction. In this dissertation, we have developed a diagnosis model using an ANN to identify potential breast cancer patients from their demographic factors and previous mammogram results. While developing the model, we applied Bayesian regularization techniques (the evidence procedure), along with the automatic relevance determination (ARD) prior, to minimize network over-fitting. The optimal Bayesian network has 81% overall accuracy in correctly classifying the actual status of breast cancer patients, 59% sensitivity in accurately detecting malignancy and 83% specificity in correctly detecting non-malignancy. The area under the receiver operating characteristic curve (0.7940) shows that this is a moderate classification model. We then present a new Bayesian ANN model for developing a nonlinear Poisson regression model which can be used for count data modeling. Here, we have summarized all the important steps involved in developing the ANN model, including the forward-propagation, backward-propagation and error gradient calculations of the newly developed network. As a part of this, we have introduced a new activation function in the output layer of the ANN and an error-minimizing criterion using count data. Moreover, we have expanded our model to incorporate Bayesian learning techniques. The performance of our model is tested using simulation data. In addition, a piecewise constant hazard model is developed by extending the above nonlinear Poisson regression model under the Bayesian setting. This model can be utilized over other conventional methods for accurate survival time prediction. With this, we were able to significantly improve the prediction accuracies. We captured the uncertainties of our predictions by incorporating error bars, which could not be achieved with a linear Poisson model due to the overdispersion in the data. We have also proposed a new hybrid learning technique, and we evaluated the performance of these techniques with a varying number of hidden nodes and data sizes. Finally, we demonstrate the suitability of Bayesian ANN models for time series forecasting by using an online training algorithm. We have developed a vulnerability forecast model for the Linux operating system using this approach.
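One component of this dissertation that is simple to illustrate is the link between a piecewise constant hazard model and Poisson regression: follow-up is split at cut points and each subject contributes one pseudo-observation per interval at risk, with exposure time acting as the offset. A bare sketch of that person-period expansion (cut points and data invented; the Bayesian ANN layers are omitted):

```python
import numpy as np

def person_period(time, event, cuts):
    """One row per (subject, interval at risk): (interval, exposure, events)."""
    rows = []
    for t, d in zip(time, event):
        for j, (lo, hi) in enumerate(zip(cuts[:-1], cuts[1:])):
            if t <= lo:
                break                                     # no longer at risk
            rows.append((j, min(t, hi) - lo, int(d and lo < t <= hi)))
    return np.array(rows)

cuts = [0.0, 1.0, 2.0, 5.0]
pp = person_period(time=[0.5, 1.7, 4.0], event=[1, 0, 1], cuts=cuts)
for j in range(len(cuts) - 1):   # ML piecewise hazard = events / total exposure
    rows = pp[pp[:, 0] == j]
    print(f"interval {j}: hazard = {rows[:, 2].sum() / rows[:, 1].sum():.3f}")
```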
5

Begum, Mubeena. "Gene expression profiles and clinical parameters for survival prediction in stage II and III colorectal cancer." [Tampa, Fla] : University of South Florida, 2006. http://purl.fcla.edu/usf/dc/et/SFE0001554.

Full text
6

Yang, Lili. "Joint models for longitudinal and survival data." Thesis, 2014. http://hdl.handle.net/1805/4666.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Epidemiologic and clinical studies routinely collect longitudinal measures of multiple outcomes. These longitudinal outcomes can be used to establish the temporal order of relevant biological processes and their association with the onset of clinical symptoms. In the first part of this thesis, we proposed to use bivariate change point models for two longitudinal outcomes with a focus on estimating the correlation between the two change points. We adopted a Bayesian approach for parameter estimation and inference. In the second part, we considered the situation when a time-to-event outcome is also collected along with multiple longitudinal biomarkers measured until the occurrence of the event or censoring. Joint models for longitudinal and time-to-event data can be used to estimate the association between the characteristics of the longitudinal measures over time and survival time. We developed a maximum-likelihood method to jointly model multiple longitudinal biomarkers and a time-to-event outcome. In addition, we focused on predicting conditional survival probabilities and evaluating the predictive accuracy of multiple longitudinal biomarkers in the joint modeling framework. We assessed the performance of the proposed methods in simulation studies and applied the new methods to data sets from two cohort studies.
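The conditional survival probabilities mentioned here rest on a simple identity: P(T > t + dt | T > t) = S(t + dt) / S(t), with S itself repeatedly updated from the joint model as new biomarker values arrive. A toy sketch of just the identity; the exponential curve is a placeholder, not the dissertation's fitted model:

```python
import math

def conditional_survival(S, t, dt):
    """P(T > t + dt | T > t) for a survival function S(u)."""
    return S(t + dt) / S(t)

S = lambda u: math.exp(-0.1 * u)                 # toy survival curve
print(conditional_survival(S, t=2.0, dt=3.0))    # exp(-0.3), about 0.741
```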
National Institutes of Health (NIH) Grants R01 AG019181, R24 MH080827, P30 AG10133, R01 AG09956.
7

Yuan, Yan. "Prediction Performance of Survival Models." Thesis, 2008. http://hdl.handle.net/10012/3974.

Full text
Abstract:
Statistical models are often used for the prediction of future random variables. There are two types of prediction: point prediction and probabilistic prediction. The prediction accuracy is quantified by performance measures, which are typically based on loss functions. We study the estimators of these performance measures, the prediction error and performance scores, for point and probabilistic predictors, respectively. The focus of this thesis is to assess the prediction performance of survival models that analyze censored survival times. To accommodate censoring, we extend the inverse probability censoring weighting (IPCW) method so that arbitrary loss functions can be handled. We also develop confidence interval procedures for these performance measures. We compare model-based, apparent loss based and cross-validation estimators of prediction error under model misspecification and variable selection, for absolute relative error loss (in chapter 3) and misclassification error loss (in chapter 4). Simulation results indicate that cross-validation procedures typically produce reliable point estimates and confidence intervals, whereas model-based estimates are often sensitive to model misspecification. The methods are illustrated for two medical contexts in chapter 5. The apparent loss based and cross-validation estimators of performance scores for probabilistic predictors are discussed and illustrated with an example in chapter 6. We also make connections for performance.
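A bare-bones sketch of the IPCW idea described above: each uncensored observation is weighted by the inverse of the censoring survival function at its event time, which lets an arbitrary loss be averaged despite censoring. The known censoring distribution and the absolute relative error loss below are illustrative assumptions; in practice G is estimated, for example by Kaplan-Meier:

```python
import numpy as np

def ipcw_error(time, event, pred, loss, G):
    """IPCW estimate of expected prediction error under censoring."""
    w = event / np.array([G(t) for t in time])   # censored subjects get weight 0
    return np.sum(w * loss(time, pred)) / len(time)

G = lambda t: np.exp(-0.05 * t)                  # assumed censoring survival fn
time  = np.array([2.0, 5.0, 3.5])
event = np.array([1, 0, 1])
pred  = np.array([2.5, 4.0, 3.0])
abs_rel = lambda t, p: np.abs(t - p) / t         # absolute relative error loss
print(ipcw_error(time, event, pred, abs_rel, G))
```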
8

Chang, Li-Yuan (張瓈元). "An Empirical Study on Default Probability Models - Comparing Discrete-Time Survival Model and Merton's Model." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/65281870518653699527.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Graduate Institute of Finance, 2007 (ROC year 95).
Based on data for Taiwan corporations traded on the TSE and OTC, this study used factor analysis to select variables and constructed financial distress prediction models from stock data, namely a discrete-time survival model and Merton's model, and then estimated the probability of default when a company goes bankrupt. The accuracy of the two models was then compared. The study classified the variables into four categories: financial structure, ability to pay, efficiency of administration, and ability to profit. The models' prediction accuracy was analyzed with the K-S test, the ROC curve, and the AUC. The empirical results showed that both models can validate that the distribution of independent variables differs between the non-default and default groups, and that the discrete-time survival model predicts the default probability better than Merton's model. Keywords: Merton; Discrete-Time Survival; ROC Curve; AUC; Factor Analysis
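The ROC/AUC comparison the abstract reports is routine to reproduce in outline. A hypothetical scikit-learn sketch; the default labels and the two models' probabilities are made up:

```python
from sklearn.metrics import roc_auc_score

y = [0, 0, 1, 0, 1, 1, 0, 1]                             # 1 = defaulted
p_survival = [0.1, 0.3, 0.8, 0.2, 0.7, 0.9, 0.4, 0.6]    # discrete-time survival
p_merton   = [0.2, 0.4, 0.6, 0.3, 0.5, 0.8, 0.5, 0.4]    # Merton-style model

for name, p in [("discrete-time survival", p_survival), ("Merton", p_merton)]:
    print(name, "AUC =", round(roc_auc_score(y, p), 3))
```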
9

Kusiak, Caroline. "Real-Time Dengue Forecasting In Thailand: A Comparison Of Penalized Regression Approaches Using Internet Search Data." 2018. https://scholarworks.umass.edu/masters_theses_2/708.

Full text
Abstract:
Dengue fever affects over 390 million people annually worldwide and is of particular concern in Southeast Asia where it is one of the leading causes of hospitalization. Modeling trends in dengue occurrence can provide valuable information to Public Health officials, however many challenges arise depending on the data available. In Thailand, reporting of dengue cases is often delayed by more than 6 weeks, and a small fraction of cases may not be reported until over 11 months after they occurred. This study shows that incorporating data on Google Search trends can improve disease predictions in settings with severely underreported data. We compare penalized regression approaches to seasonal baseline models and illustrate that incorporation of search data can improve prediction error. This builds on previous research showing that search data and recent surveillance data together can be used to create accurate forecasts for diseases such as influenza and dengue fever. This work shows that even in settings where timely surveillance data is not available, using search data in real-time can produce more accurate short-term forecasts than a seasonal baseline prediction. However, forecast accuracy degrades the further into the future the forecasts go. The relative accuracy of these forecasts compared to a seasonal average forecast varies depending on location. Overall, these data and models can improve short-term public health situational awareness and should be incorporated into larger real-time forecasting efforts.
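A toy sketch of the penalized-regression setup described above: regress this week's case count on lagged counts and lagged search volumes with an L1 penalty. The simulated data, lag structure, and tuning value are placeholders, not the thesis's configuration:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n = 120
search = rng.gamma(2.0, 1.0, n)                             # weekly query volume
cases = 10 + 3 * np.roll(search, 2) + rng.normal(0, 1, n)   # search leads cases

lags = 4
X = np.column_stack([np.roll(cases, k) for k in range(1, lags + 1)] +
                    [np.roll(search, k) for k in range(1, lags + 1)])[lags:]
y = cases[lags:]

model = Lasso(alpha=0.1).fit(X[:-10], y[:-10])   # hold out the last 10 weeks
print("held-out predictions:", model.predict(X[-10:]).round(1))
```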
10

Στεφάνου, Παύλος. "Development of scale-bridging methodologies and algorithms founded on the outcome of detailed atomistic simulations for the reliable prediction of the viscoelastic properties of polymer melts." Thesis, 2011. http://nemertes.lis.upatras.gr/jspui/handle/10889/4563.

Full text
Abstract:
In this thesis we design and develop algorithms for predicting the rheological behavior of polymer melts based on the results of detailed atomistic simulations, guided by theories of polymer dynamics and fundamental principles of non-equilibrium thermodynamics. More specifically: 1) We propose a new rheological constitutive model for the time evolution of the conformation tensor C of chains in a polymer melt (and hence the stress tensor τ) using the generalized bracket formalism of Beris and Edwards. The new constitutive model includes terms that describe a whole range of phenomena and is successfully used to describe the rheological properties of commercial polyethylene resins. 2) We developed a new methodology that allows direct connection of the results of atomistic simulations with the molecular reptation theory for entangled polymers. The final result of the methodology is the calculation of the function ψ(s,t), which expresses the probability that the segment s along the contour of the primitive path remains in the original tube after time t. 3) We extended the Rouse theory to systems without polymer chain ends, such as polymer rings. While there has been previous theoretical work, a comprehensive analysis of the Rouse model for cyclic polymers was still lacking; here we develop the theory in its entirety.

Books on the topic "Prediction of survival; Probability; Time models"

1

Life time data: Statistical models and methods. Singapore: World Scientific, 2006.

Find full text
2

Box, George E. P. Time series analysis: Forecasting and control. 4th ed. Hoboken, N.J: John Wiley, 2008.

Find full text
3

Box, George E. P. Time series analysis: Forecasting and control. 3rd ed. Englewood Cliffs, N.J: Prentice Hall, 1994.

Find full text
4

Box, George E. P. Time series analysis: Forecasting and control. 4th ed. Hoboken, N.J: John Wiley, 2008.

Find full text
5

Lifetime Data: Statistical Models and Methods. World Scientific Publishing Co Pte Ltd, 2015.

Find full text
6

Box, George E. P. Time Series Analysis: Forecasting and Control (Wiley Series in Probability and Statistics). 4th ed. Wiley-Interscience, 2008.

Find full text
7

Deshpande, Jayant V., and Sudha G. Purohit. Life-Time Data: Statistical Models and Methods (Quality, Reliability and Engineering Statistics). World Scientific Publishing Company, 2006.

Find full text
8

Box, George E. P. Time Series Analysis: Forecasting & Control. Pearson Education Asia Limited, 2005.

Find full text
9

Jenkins, Gwilym M., Gregory C. Reinsel, and George E. P. Box. Time Series Analysis: Forecasting and Control. John Wiley & Sons, Incorporated, 2011.

Find full text
10

Jenkins, Gwilym M., Gregory C. Reinsel, and George E. P. Box. Time Series Analysis: Forecasting and Control. John Wiley & Sons, Incorporated, 2013.

Find full text

Book chapters on the topic "Prediction of survival; Probability; Time models"

1

"Identification of Prognostic Factors Related to Survival Time: Nonproportional Hazards Models." In Wiley Series in Probability and Statistics, 339–76. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2003. http://dx.doi.org/10.1002/0471458546.ch13.

Full text
2

"u = 1 u = 1.5u = 2 u = 3 u = 4 u = 5.5u = 6.5u = 7.9 u = 8.9u = 10.7 Survival Probability." In Joint Models for Longitudinal and Time-to-Event Data, 210–11. Chapman and Hall/CRC, 2012. http://dx.doi.org/10.1201/b12208-24.

Full text
3

Diao, Qian, Jianye Lu, Wei Hu, Yimin Zhang, and Gary Bradski. "DBN Models for Visual Tracking and Prediction." In Bayesian Network Technologies, 176–93. IGI Global, 2007. http://dx.doi.org/10.4018/978-1-59904-141-4.ch009.

Full text
Abstract:
In a visual tracking task, the object may exhibit rich dynamic behavior in complex environments that can corrupt target observations via background clutter and occlusion. Such dynamics and background induce nonlinear, non-Gaussian and multimodal observation densities. These densities are difficult to model with traditional methods such as Kalman filter models (KFMs) due to their Gaussian assumptions. Dynamic Bayesian networks (DBNs) provide a more general framework in which to solve these problems. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian ones. Under the DBN umbrella, a broad class of learning and inference algorithms for time-series models can be used in visual tracking. Furthermore, DBNs provide a natural way to combine multiple vision cues. In this chapter, we describe some DBN models for tracking in nonlinear, non-Gaussian and multimodal situations, and present a prediction method that assists the feature extraction part by making a hypothesis for the new observations.
4

"Patient 25 Patient 25 Patient 25 Extra 1 year Extra 2 years Extra 4 years Patient 2 Patient 2 Patient 2 Extra 1 year Extra 2 years Extra 4 years Survival Probability." In Joint Models for Longitudinal and Time-to-Event Data, 198–203. Chapman and Hall/CRC, 2012. http://dx.doi.org/10.1201/b12208-21.

Full text
5

Godara, Deepa, Amit Choudhary, and Rakesh Kumar Singh. "Predicting Change Prone Classes in Open Source Software." In Research Anthology on Usage and Development of Open Source Software, 653–75. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-9158-1.ch034.

Full text
Abstract:
In today's world, the heart of modern technology is software. In order to keep pace with new technology, changes in software are inevitable. This article examines the association between changes and object-oriented metrics using different versions of open source software. Change prediction models can detect the probability of change in a class earlier in the software life cycle, which would result in better effort allocation, more rigorous testing and easier maintenance of any software. Earlier, researchers have used various techniques such as statistical methods for the prediction of change-prone classes. In this article, some new metrics such as execution time, frequency, run time information, popularity and class dependency are proposed which can help in the prediction of change-prone classes. For evaluating the performance of the prediction model, the authors used sensitivity, specificity, and the ROC curve. Higher values of AUC indicate that the prediction model gives accurate results. The proposed metrics contribute to the accurate prediction of change-prone classes.
6

Klepac, Goran. "Data Mining Models as a Tool for Churn Reduction and Custom Product Development in Telecommunication Industries." In Handbook of Research on Novel Soft Computing Intelligent Algorithms, 511–37. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4450-2.ch017.

Full text
Abstract:
This chapter presents a business case from the telecommunication company Veza, in the domain of churn prediction and churn mitigation. The churn project was divided into a few stages. Due to a limited budget and cost optimization, stage one concentrated on a prospective customer value calculation model based on a fuzzy expert system. This helped the Veza company find its most valuable telecom subscribers and better understand the structure of its subscriber portfolio. The developed fuzzy expert system also helped the Veza company detect soft churn. Stage two was profiling and customer segmentation based on time series analysis, which provided potential predictors for the predictive churn model. The central stage concentrated on developing a traditional predictive churn model based on logistic regression, which calculated the probability that subscribers would churn in the next few months. The final stage was dedicated to developing an SNA (Social Network Analysis) model, which found the most valuable customers from the perspective of the existing subscriber network. This model identified the subscribers with the greatest influence on other subscribers, whose departure would be dangerous for the Veza company because they would motivate other subscribers to do the same. Together, the three stages form a complete churn detection/mitigation solution that takes into consideration subscribers' past behaviour, their prospective value, and the strength of their influence on other subscribers. This project helped the Veza company decrease its churn rate and gave directions for better understanding customer needs and behaviour, which were the basis for new product development.
7

Klepac, Goran. "Data Mining Models as a Tool for Churn Reduction and Custom Product Development in Telecommunication Industries." In Business Intelligence, 430–57. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9562-7.ch023.

Full text
Abstract:
This chapter presents a business case from the telecommunication company Veza, in the domain of churn prediction and churn mitigation. The churn project was divided into a few stages. Due to a limited budget and cost optimization, stage one concentrated on a prospective customer value calculation model based on a fuzzy expert system. This helped the Veza company find its most valuable telecom subscribers and better understand the structure of its subscriber portfolio. The developed fuzzy expert system also helped the Veza company detect soft churn. Stage two was profiling and customer segmentation based on time series analysis, which provided potential predictors for the predictive churn model. The central stage concentrated on developing a traditional predictive churn model based on logistic regression, which calculated the probability that subscribers would churn in the next few months. The final stage was dedicated to developing an SNA (Social Network Analysis) model, which found the most valuable customers from the perspective of the existing subscriber network. This model identified the subscribers with the greatest influence on other subscribers, whose departure would be dangerous for the Veza company because they would motivate other subscribers to do the same. Together, the three stages form a complete churn detection/mitigation solution that takes into consideration subscribers' past behaviour, their prospective value, and the strength of their influence on other subscribers. This project helped the Veza company decrease its churn rate and gave directions for better understanding customer needs and behaviour, which were the basis for new product development.
8

"T cu im rre e n tl Sycahleeasd ) qu aas rte wreeldlaatst he thLeammounltt -i D na oth io e n rt aylEIaR rt I h , odtrhoeurgm ht ajporrem di ocd ti eolnprw ob il llem re s q . u T ir hee the resolution of hOabvseearnva im to p ry o rt oafntCcooluupm le bdiamoUdneilvecrosm ity p . onTehnet, sea lt ehfo fo urgthsp ex hteernes , io onntaogfloorbeaclasdto in mga , in boatnhdth th eseeaorcee dva saonlnacn es diantcm lu odse ­ m in acn lu ydeodf ( t C he a rs toyn pe 1s9o 98 f ) m . ethods discussed above are uomciesamnatacnhdbaettmwoesepnhtehree . fl Fuo xe rsmaatntyhearbeoaus, n d th atr io ie nsoofftthhee rep F li o ca rtE in NgSaOn , d c , ur in re nstom co eupclaesdesm , oidmep ls roav re in cgapoanb le thoefo of frtehaelsie st iwcillalnrde -q suuirrfeacse ig coupling may be ess eenatd ia dli . tiA on ll tshue cc ecsusrroefnetmgpein ri ecraalt / isotn at i o st ficcaolumpe le th dom ds o . dFeo ls rirnesptlain ca ctee , a model parameterisatio nificant improvements in the SST anomaly patterns in the equatorial Pacific that th ry elraeyqeu rs ir , ecd lo m ud osd , erlad im inasp ti oonf , saun rf dacceonpv ro ecce ti sosn es, bound­ have many characteristics in common with observed to a quick solution, but, ro g v iv eemnetnhtesiam re p o li rktealny . N to onye ie o ld flEeN ss SsO uc cceosm sf puolsiin te tsh . eCm ur orreentdim ffi ocduelltspa ro re blceomnso id ferreapblliy ­ imp Iatcsthoofud ld ronuogthbte , they are worth pursuing. ce of the p ca hteirnigcc th ir ecuslpae ti c o if n ic peav tt oelruntsioinnoafgtihve en SESNTSaOndepaitsm od oes . ­ tehxe prospects for im forgotten, however, that not all of However, it is precisely this problem that must be no ctlufsuilv ly eluynodnersse ta a n so pnraolvteidmde ro sc uag le hst . p A re l dictions reside solved. Just as the ‘average’ daily weather is rarely of climate variabilit d y , th th eem re u l is ti aanmnpulaelteo th doeucgahdawles ca dloeo ce bpsteuravleda , idthteo ‘ ucnadneornsitcaanl’ diEnNgS th Oan id aeauissefm ul orceonastcroun ct ­ e2x .1 is c t ) e nc aend -e th .g e . , sien the time series o vidence for its for prediction. To reach their full potential, coupled distributions of rai cnuflaalrl ( cFhiagnugrees2i . n2ftrhae in f p al rlob (F ab ig il uir ty eim nd oidveildsun al eepdas to t E be N S ab O le etpoisroedpe li scaa te ndt he th eeivroleuv ti ooln vi nogfnoefw co duep velopments in data an ). Very recently, extratropical atmospheric and ocean interactions. There is lesdommeoedveildsehnacveeosftd ar etaeld ys t is oaonpdeinn the accuracy The most optimistic expectation is that once that may have a somewhat c ad d a if lfv er aern ia t t io unpstihnisEN fie S ld O . cEoNuSpO le , d th m ey odw el i s ll bheavaeb le cotnoqhueelrped id etnhtei fy chaanld le npg re edio ct ftmheeasiun red by the ocean s character, as other modes of climate variability. This may include Zhang te ertananl. ua1l99 ti 7 m , eFoslc la al neusr fa ( cKeleteemmapne ra et tures, from links between ENSO and the climate system not yet are now beginning to fin ddeatanlu . m1b9e9r8 ) o . M al. od1e9 ll 9e6 rs , m dis ocdoevlesremdaiyntahiediimnpienrv fe ecsttiogbaste io rv nast io onfaplodsastiab . lIemcplriomvaetdem ab e il cih ty anoin sm th seinde th ca edN al otrothmaun lt d i tropic f potential modes that link ocean basins, such as ENSO-and Barnett 1996). 
There is adlescoad ev aalltiPm ac eifsiccaf le o r ( vari­ related variations of SST in the tropical North Atlantic, ENSO links to rainfall may come an id dengcoed th ep aetnsLoam ti e f rece In n tl aydddiistc io u n ss etdoboycE ea n n fi -e altdmaonsdphMea re y er c o ( u1p9l9 in 7 g ). , new nointutdheeo se fcE ul N ar S O va riitas bility in the str ding generations of models need to include realistic land-southern Europe (R eolpfe -le wes .g k . i , a in ndneonrg Ha th th lp e e rn an dAfm ri acga/ ­ rae tm ali oss ti pchm er oedeclosuopflitnhge . la Snudch su rifm ac peroavnedmie ts ntvsegientvao ti lovneaThheeadp , r m ed aiyctaalbsio lity of ENS rt 1987). and adequate descriptions based on observed data of in Northern Hevm ar iyspohnerdeecOa sp d , rail on ntgiem ( e to s Ba c a ls a a le fse , w e sp se eacs ia oln ly strheep re isne it nitaal tio ve nge in ta t m io ondesltsa te is . c W ur orrekn tl oynbleainndg -s m ur afiancleym 19 e9a5n ) s . (i I . n e ., additio meda et al. driven by the development of coupled models for over several cdheacnagdenes , sis ) n ec a th u lso e la r ‘ itvnyfpairciaalbio li rty in the climate climate change projection over the next century conditional ENSO probability l u fo ernecceassetsxsi . m pe Fpcolteeds ’ e values (Dickinson et al. 1996). the Gulf Coast of the United States shows reaxaam sonal Significant advances in coupled model-based ENSO signal for both the first and second half s o tro p n le, f th g e." In Droughts, 65. Routledge, 2016. http://dx.doi.org/10.4324/9781315830896-45.

Full text

Conference papers on the topic "Prediction of survival; Probability; Time models"

1

Nemeth, Noel N., Osama M. Jadaan, Eric H. Baker, and John P. Gyekenyesi. "Lifetime Reliability Prediction of Ceramics Subjected to Thermal and Mechanical Cyclic Loads." In ASME Turbo Expo 2007: Power for Land, Sea, and Air. ASMEDC, 2007. http://dx.doi.org/10.1115/gt2007-27047.

Full text
Abstract:
A methodology is shown for predicting the time-dependent reliability (probability of survival) of ceramic components against catastrophic rupture when subjected to thermal and mechanical cyclic loads. This methodology is based on the Weibull distribution to model stochastic strength and a power law that models subcritical crack growth. Changes in material response that can occur with temperature or time (i.e. changing fatigue and Weibull parameters with temperature or time) are accommodated by segmenting a cycle into discrete time increments. Material properties are assumed to be constant within an increment, but can vary between increments. This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis codes such as ANSYS executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
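In heavily simplified form, the combination described here is a Weibull strength distribution whose argument is an equivalent stress inflated by power-law subcritical crack growth, with the load history summed over discrete increments so that parameters could differ between them. The sketch below shows only that general shape for a single uniaxial stress point with invented parameters; the real CARES/Life formulation is multiaxial and integrates over component volume or surface.

```python
import numpy as np

m, sigma0 = 10.0, 400.0      # assumed Weibull modulus and scale (MPa)
N, B = 20.0, 1.0e3           # assumed crack-growth exponent and parameter

def survival_probability(stress, dt):
    """stress: stress (MPa) in each time increment; dt: increment lengths (s)."""
    stress, dt = np.asarray(stress), np.asarray(dt)
    damage = np.sum(stress ** N * dt) / B            # accumulated SCG damage
    sigma_eq = (damage + stress[-1] ** (N - 2.0)) ** (1.0 / (N - 2.0))
    return np.exp(-((sigma_eq / sigma0) ** m))       # Weibull survival

print(survival_probability(stress=[150.0, 200.0, 180.0], dt=[600.0, 600.0, 600.0]))
```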
2

Yim, Solomon C., Tongchate Nakhata, and Erick T. Huang. "Coupled Nonlinear Barge Motions: Part II — Deterministic Models, Stochastic Models and Stability Analysis." In ASME 2004 23rd International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2004. http://dx.doi.org/10.1115/omae2004-51131.

Full text
Abstract:
A computationally efficient quasi-two-degree-of-freedom (Q2DOF) stochastic model and a stability analysis of barges in random seas are presented in this paper. Based on the deterministic 2DOF coupled Roll-Heave model with high-degree polynomial approximation of restoring forces and moments developed in Part I, an attempt is made to further reduce the DOF of the model for efficient stochastic stability analysis by decoupling the heave effects on roll motion, resulting in a one-degree-of-freedom (1DOF) roll-only model. Using the Markov assumption, stochastic differential equations governing the evolution of probability densities of roll-heave and roll responses for the two low-DOF models are derived via the Fokker-Planck formulation. Numerical results of roll responses for the 2DOF and 1DOF models, using direct simulation in the time domain and the path integral solution technique in the probability domain, are compared to determine the effects of neglecting the influence of heave on roll motion and assess the relative computational efforts required. It is observed that the 1DOF model is computationally very efficient and the 2DOF model response predictions are quite accurate. However, the nonlinear roll-heave coupling is found to be significant and needs to be directly taken into account rendering the 1DOF roll-only model inadequate for practical use. The 2DOF model is impractical for long-duration real time response computation due to the insurmountable computational effort required. By taking advantage of the observed strong correlation between measured heave and wave elevation in the experimental results, an accurate and efficient Q2DOF model is developed by expressing the heave response in the 2DOF model as a function of wave elevation, thus reducing the effective DOF to unity. This Q2DOF model is essential as it reduces the computational effort by a factor of 10^-5 compared to that of the 2DOF model, thus making practical stochastic analysis possible. A stochastic stability analysis of the barge under operational and survival sea states specified by the US Navy is presented using the Q2DOF model based on first passage time formulation.
3

Exarchos, Themis P., George Rigas, Yorgos Goletsis, Kostas Stefanou, Steven Jacobs, Maria-Giovanna Trivella, and Dimitrios I. Fotiadis. "A dynamic Bayesian network approach for time-specific survival probability prediction in patients after ventricular assist device implantation." In 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2014. http://dx.doi.org/10.1109/embc.2014.6944296.

Full text
4

Mak, Lawrence, Brian Farnworth, Eugene H. Wissler, Michel B. DuCharme, Wendell Uglene, Renee Boileau, Pete Hackett, and Andrew Kuczora. "Thermal Requirements for Surviving a Mass Rescue Incident in the Arctic: Preliminary Results." In ASME 2011 30th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2011. http://dx.doi.org/10.1115/omae2011-49471.

Full text
Abstract:
Maritime and air traffic through the Arctic has increased in recent years. Cruise ships and commercial jet liners carry a large number of passengers. With increased traffic, there is a higher probability that a major disaster could occur. Cruise ship and plane accidents could be catastrophic and may require mass rescue. Due to the remote location, limited search and rescue resources, the time for these resources to get to the accident location, and the large number of survivors, the retrieval time could be several days. Therefore, survivors may be required to survive on their own for days while they await rescue. Recognizing that the International Maritime Organization does not have specific thermal performance criteria for liferafts and lifeboats and personal and group survival kits, the Maritime and Arctic Survival Scientific and Engineering Research Team (MASSERT) initiated a research project to improve safety and provide input for advances to regulations. The objective of the project is to investigate if the current thermal protective equipment and preparedness available to people traveling in the Canadian Arctic are adequate for surviving a major air or cruise ship disaster and to identify the minimum thermal protection criteria for survival. This project builds on the results and tools developed in other research projects conducted by the team on thermal protection of liferafts, lifeboats and immersion suits. The project is divided into three major phases: clothing ensemble testing with thermal manikins, a physiology experiment on sustainable shivering duration and ensemble testing in Arctic conditions with human subjects. A numerical model uses these data to simulate survival scenarios. In the first phase of this project, the thermal resistance values of the protective clothing typically available to cruise ship and aircraft passengers were measured using two thermal manikins. The ensembles included Cabin Wear, Deck Wear, Expedition Wear, Abandonment Wear and protective clothing from the Canada Forces Major Air Disaster Kit (MAJAID). Tests were conducted on dry and wet ensembles at 5°C and −15°C with and without wind. There is very good agreement between the thermal resistances measured by the two manikins. The differences in thermal resistances observed are likely caused by variations in fit and wrinkles and folds in the ensembles from dressing. With no wind, the thermal resistance is lowest with Cabin Wear and highest with MAJAID clothing inside the down-filled casualty bag. The Expedition Wear, the Abandonment Wear and the MAJAID clothing have about the same thermal resistance. With 7 metre-per-second wind, the thermal resistance of all ensembles decreased significantly by 30% to 70%. These results highlight the importance of having a shelter as a windbreak. For wet clothing ensembles at 5°C, the initial wet thermal resistance was 2 to 2.5 times lower than the dry value, and drying times ranged up to 60 hours. This highlights the importance of staying dry. Preliminary predictions from the numerical model show that the survivors in Expedition Wear, even with sleeping bag and tent, can be mildly hypothermic and need to depend heavily on shivering to maintain thermal balance. In a shelter, the predicted metabolic rate is roughly double the resting rate; it is triple the resting rate without protection from the wind. Further research is required to study shivering fatigue and age effects. Research on mass rescue scenarios for cruise ship and airplane survivors should ideally involve subjects of both genders and the elderly.
5

Sokota, Samuel, Ryan D'Orazio, Khurram Javed, Humza Haider, and Russell Greiner. "Simultaneous Prediction Intervals for Patient-Specific Survival Curves." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/828.

Full text
Abstract:
Accurate models of patient survival probabilities provide important information to clinicians prescribing care for life-threatening and terminal ailments. A recently developed class of models, known as individual survival distributions (ISDs), produces patient-specific survival functions that offer greater descriptive power of patient outcomes than was previously possible. Unfortunately, at the time of writing, ISD models almost universally lack uncertainty quantification. In this paper, we demonstrate that an existing method for estimating simultaneous prediction intervals from samples can easily be adapted for patient-specific survival curve analysis and yields accurate results. Furthermore, we introduce both a modification to the existing method and a novel method for estimating simultaneous prediction intervals, and show that they offer competitive performance. It is worth emphasizing that these methods are not limited to survival analysis and can be applied in any context in which sampling the distribution of interest is tractable. Code is available at https://github.com/ssokota/spie.
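A generic version of the sample-based construction the paper adapts can be sketched in a few lines: draw many survival curves from the model, then widen pointwise quantile bands until the desired fraction of whole curves lies inside. This is an illustrative construction only, not the exact algorithm from the linked repository.

```python
import numpy as np

def simultaneous_band(curves, alpha=0.1):
    """curves: (n_samples, n_times) array of sampled survival curves.
    Returns (lo, hi) such that at least (1 - alpha) of the sampled curves
    lie entirely within the band -- simultaneous, not pointwise, coverage."""
    for q in np.linspace(alpha / 2, 1e-4, 200):  # shrink tail mass to widen the band
        lo = np.quantile(curves, q, axis=0)
        hi = np.quantile(curves, 1 - q, axis=0)
        if np.all((curves >= lo) & (curves <= hi), axis=1).mean() >= 1 - alpha:
            return lo, hi
    return curves.min(axis=0), curves.max(axis=0)  # fall back to the full envelope
```

A pointwise (1 - alpha) band would under-cover here, since a curve only needs to escape the band at a single time point to violate simultaneous coverage; the loop compensates by widening until whole-curve coverage is reached.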
APA, Harvard, Vancouver, ISO, and other styles
6

Chen, Haibo, Torgeir Moan, Sverre Haver, and Kjell Larsen. "Prediction of Relative Motion and Probability of Contact Between FPSO and Shuttle Tanker in Tandem Offloading Operation." In ASME 2002 21st International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2002. http://dx.doi.org/10.1115/omae2002-28101.

Full text
Abstract:
The safety of tandem offloading between an FPSO and a shuttle tanker is of concern: several collisions between the two vessels have occurred in the North Sea in recent years. In these incidents, excessive relative motions (termed surging and yawing in this paper) between the FPSO and the tanker are identified as "failure prone situations" that contributed to the initiation of most collision incidents. To quantitatively assess the probability of surging and yawing events and, more importantly, to effectively reduce their occurrence in tandem offloading operations, we present a simulation-based approach using the state-of-the-art time-domain simulation code SIMO. SIMO simulation models are set up and calibrated for a typical North Sea purpose-built FPSO and a DP shuttle tanker, and the motion of this two-vessel system during tandem offloading is simulated. The simulated relative distance and relative heading between the FPSO and the tanker are analyzed by fitting their extreme values to statistical models, which yields the probabilities of surging and yawing events. Sensitivity studies are performed to analyze the contributions of various technical and operational factors, and measures to minimize the occurrence of surging and yawing from design and operational points of view are proposed.
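The extreme-value step described in the abstract, fitting per-run maxima of the simulated relative motion to a statistical model and reading off an exceedance probability, can be sketched generically. The Gumbel family, the threshold, and the synthetic data below are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stand-in for per-run maxima of the relative surge excursion (m) collected
# from repeated time-domain simulations; synthetic so the sketch is runnable.
run_maxima = rng.gumbel(loc=40.0, scale=5.0, size=200)

# Fit an extreme-value (Gumbel) model to the maxima and estimate the
# probability of exceeding an assumed operational limit.
loc, scale = stats.gumbel_r.fit(run_maxima)
limit = 60.0  # illustrative surging threshold (m)
p_exceed = stats.gumbel_r.sf(limit, loc, scale)
print(f"P(max excursion > {limit} m in one offloading run) ~ {p_exceed:.2e}")
```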
APA, Harvard, Vancouver, ISO, and other styles
7

Rahman, Tahrima, Shasha Jin, and Vibhav Gogate. "Cutset Bayesian Networks: A New Representation for Learning Rao-Blackwellised Graphical Models." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/797.

Full text
Abstract:
Recently there has been growing interest in learning, from data, probabilistic models that admit poly-time inference, known as tractable probabilistic models. Although they generalize poorly compared with intractable models, they often yield more accurate estimates at prediction time. In this paper, we seek to further explore this trade-off between generalization performance and inference accuracy by proposing a novel, partially tractable representation called cutset Bayesian networks (CBNs). The main idea in CBNs is to partition the variables into two subsets X and Y, learn an (intractable) Bayesian network that represents P(X), and learn a tractable conditional model that represents P(Y|X). The hope is that the intractable model will help improve generalization, while the tractable model, by leveraging Rao-Blackwellised sampling, which combines exact inference and sampling, will help improve prediction accuracy. To compactly model P(Y|X), we introduce a novel tractable representation called conditional cutset networks (CCNs) in which all conditional probability distributions are represented using calibrated classifiers, which typically yield higher-quality probability estimates than conventional classifiers. We show via a rigorous experimental evaluation that CBNs and CCNs yield more accurate posterior estimates than their tractable as well as intractable counterparts.
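The Rao-Blackwellised estimator that CBNs are built around can be written in a few lines: sample the cutset variables X from the (intractable) model, evaluate the tractable conditional P(Y|X) exactly, and average. In the sketch below, the two learned models are stood in for by user-supplied callables.

```python
import random

def rao_blackwellised_marginal(sample_x, cond_prob, y, n=10_000):
    """Estimate P(Y=y) = E_x[P(Y=y | X=x)] by sampling x and evaluating the
    conditional exactly; averaging exact conditionals gives lower variance
    than sampling (x, y) jointly."""
    return sum(cond_prob(y, sample_x()) for _ in range(n)) / n

# Toy check: X ~ Bernoulli(0.3), P(Y=1|X=1)=0.9, P(Y=1|X=0)=0.2,
# so P(Y=1) = 0.3*0.9 + 0.7*0.2 = 0.41.
p = rao_blackwellised_marginal(
    sample_x=lambda: random.random() < 0.3,
    cond_prob=lambda y, x: (0.9 if x else 0.2) if y == 1 else (0.1 if x else 0.8),
    y=1)
print(p)  # converges to ~0.41
```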
APA, Harvard, Vancouver, ISO, and other styles
8

Kim, Taewung, Kyukwon Bang, Hyun-Yong Jeong, and Stephen Decker. "A Simple Vehicle Model for Path Prediction During Evasive Maneuvers and a Stochastic Analysis on the Crash Probability." In ASME 2007 International Mechanical Engineering Congress and Exposition. ASMEDC, 2007. http://dx.doi.org/10.1115/imece2007-43334.

Full text
Abstract:
Active safety systems are being developed in the automotive industry, and such systems need an analytical vehicle model to predict the vehicle path and assess the crash probability. However, the bicycle model does not correlate well with test data and ADAMS simulation results, and other analytical vehicle models, which have 8 or 14 degrees of freedom, require more computation time. Therefore, in this study a simple analytical vehicle model was proposed to predict the vehicle path, especially during evasive maneuvers. The model predicts a vehicle's path from the given vehicle speed and steering angle. In it, two different moment arms are used for the inboard and outboard wheels, lateral and longitudinal load transfers are taken into account, and the magic formula tire model is used to estimate the lateral force. The analytical vehicle model has been validated against a sophisticated ADAMS model, and it shows good correlation with test data. Using the simple analytical model, a stochastic analysis was conducted to analyze the effect of the initial offset amount and the heading angle on the crash probability, and another to analyze the effect of a sensing error on the false negative rate (FNR) and the false positive rate (FPR). It was found that the initial offset amount and the heading angle play a key role in the crash probability, and that only the FPR is affected noticeably by a sensing error.
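The magic formula tire model named in the abstract is Pacejka's well-known curve relating slip angle to lateral force. A minimal sketch follows, with illustrative coefficients; in practice B, C, D, and E are identified from tire test data.

```python
import math

def magic_formula_lateral_force(slip_angle_rad, B=10.0, C=1.9, D=4000.0, E=0.97):
    """Pacejka magic formula: B stiffness factor, C shape factor,
    D peak force (N), E curvature factor. Values here are illustrative only."""
    x = B * slip_angle_rad
    return D * math.sin(C * math.atan(x - E * (x - math.atan(x))))

print(magic_formula_lateral_force(0.05))  # lateral force (N) at ~2.9 deg slip
```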
APA, Harvard, Vancouver, ISO, and other styles
9

Arild, Øystein, Hans Petter Lohne, Hans Joakim Skadsem, Eric Patrick Ford, and Jon Tømmerås Selvik. "Time-to-Failure Estimation of Barrier Systems in Permanently Plugged and Abandoned Wells." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-96546.

Full text
Abstract:
With the increasing number of aging fields and wells worldwide, a large number of wells will have to be permanently plugged and abandoned in the coming decades. Today's technical solutions for P&A design are primarily driven by legislation or by recognized standards such as NORSOK D-010 or the Oil & Gas UK Well Decommissioning Guidelines. NORSOK D-010 states that the well should seal to "eternity" without providing any link between the recommended solution and time-to-failure. During the last few years, there has been a drive towards a risk-based approach to P&A design. With such an approach, the quality of a P&A design can be formulated in terms of the associated leakage risk, which consists of two components: (i) the time-to-failure of the barrier system, and (ii) the leakage rate given that the barrier system has failed. When failure data are available, a wide range of statistical tools exists for establishing the time-to-failure probability distribution. However, barrier failure data for permanently plugged and abandoned wells are scarce in the North Sea region. To estimate the time-to-failure for wells in this region, all relevant information should be taken into account: survival data, expert input, and physicochemical degradation models. In this paper, we show how this can be accomplished by means of a Bayesian reliability approach. The paper first describes the general framework for performing Bayesian time-to-failure estimation. Thereafter, information pertaining to barrier system lifetime for wells on the Norwegian Continental Shelf (NCS) and relevant assumptions are discussed. Finally, the methodology is applied to a synthetic case.
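The Bayesian reliability step can be illustrated with a toy version: a Weibull lifetime model whose likelihood combines observed barrier failures with right-censored survivors, multiplied by an expert-informed prior and evaluated on a grid. Every number below is invented for the sketch; none of it is NCS data.

```python
import numpy as np

times = np.array([12.0, 30.0, 45.0, 50.0, 50.0])  # years in service (toy data)
failed = np.array([1, 1, 0, 0, 0], dtype=bool)    # False = still sealing (censored)

shape = 2.0                                # Weibull shape, assumed known here
scales = np.linspace(20, 300, 500)         # candidate characteristic lives (years)

def log_likelihood(scale):
    # Failures contribute the Weibull density; censored wells contribute
    # the survival function.
    f, c = times[failed], times[~failed]
    return (np.sum(np.log(shape / scale) + (shape - 1) * np.log(f / scale)
                   - (f / scale) ** shape)
            - np.sum((c / scale) ** shape))

log_prior = -0.5 * ((scales - 150.0) / 80.0) ** 2  # expert input: long lives plausible
log_post = np.array([log_likelihood(s) for s in scales]) + log_prior
post = np.exp(log_post - log_post.max())
post /= post.sum()
print(f"posterior mean characteristic life: {(scales * post).sum():.0f} years")
```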
APA, Harvard, Vancouver, ISO, and other styles
10

Xia, Henian, Nathan Keeney, Brian J. Daley, Adam Petrie, and Xiaopeng Zhao. "Prediction of ICU In-Hospital Mortality Using Artificial Neural Networks." In ASME 2013 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/dscc2013-3768.

Full text
Abstract:
This work aims to predict in-hospital mortality in the open-source PhysioNet ICU database from features extracted from the time series of physiological variables, using neural network models and other machine learning techniques. We developed an effective and efficient greedy algorithm for feature selection, reducing the number of potential features from 205 to a best subset of only 47. The average of five trials of 10-fold cross-validation shows an accuracy of (86.23±0.14)%, a sensitivity of (50.29±0.22)%, a specificity of (92.01±0.21)%, a positive predictive value of (50.29±0.50)%, a negative predictive value of (92.01±0.00)%, and a Lemeshow score of 119.55±9.87. By calibrating the predicted mortality probability using an optimization approach, we can improve the Lemeshow score to 27.51±4.38. The developed model has the potential for application in ICU machines to improve the quality of care and to evaluate the effects of treatments or drugs.
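The greedy feature selection described above can be sketched as a standard forward search scored by cross-validation; the paper's exact algorithm, scoring metric, and stopping rule may differ, and the scikit-learn-style estimator interface is an assumption of this sketch.

```python
import numpy as np
from sklearn.base import clone
from sklearn.model_selection import cross_val_score

def greedy_forward_selection(model, X, y, max_features=47, cv=10):
    """Add, at each step, the feature whose inclusion most improves the
    cross-validated score; stop when nothing improves or the cap is hit."""
    selected, remaining = [], list(range(X.shape[1]))
    best = -np.inf
    while remaining and len(selected) < max_features:
        score, j = max((cross_val_score(clone(model), X[:, selected + [j]], y,
                                        cv=cv).mean(), j) for j in remaining)
        if score <= best:
            break
        best = score
        selected.append(j)
        remaining.remove(j)
    return selected
```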
APA, Harvard, Vancouver, ISO, and other styles