Academic literature on the topic 'Marginal Structural Cox models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Marginal Structural Cox models.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Marginal Structural Cox models"

1. Ali, R. Ayesha, M. Adnan Ali, and Zhe Wei. "On computing standard errors for marginal structural Cox models." Lifetime Data Analysis 20, no. 1 (April 18, 2013): 106–31. http://dx.doi.org/10.1007/s10985-013-9255-7.

2. Enders, Dirk, Susanne Engel, Roland Linder, and Iris Pigeot. "Robust versus consistent variance estimators in marginal structural Cox models." Statistics in Medicine 37, no. 24 (June 11, 2018): 3455–70. http://dx.doi.org/10.1002/sim.7823.

3. Xiao, Yongling, Erica E. M. Moodie, and Michal Abrahamowicz. "Comparison of Approaches to Weight Truncation for Marginal Structural Cox Models." Epidemiologic Methods 2, no. 1 (January 8, 2013): 1–20. http://dx.doi.org/10.1515/em-2012-0006.

4. Westreich, D., S. R. Cole, P. C. Tien, J. S. Chmiel, L. Kingsley, M. J. Funk, K. Anastos, and L. P. Jacobson. "Time Scale and Adjusted Survival Curves for Marginal Structural Cox Models." American Journal of Epidemiology 171, no. 6 (February 5, 2010): 691–700. http://dx.doi.org/10.1093/aje/kwp418.

5. Karim, Mohammad Ehsanul, John Petkau, Paul Gustafson, Robert W. Platt, and Helen Tremlett. "Comparison of statistical approaches dealing with time-dependent confounding in drug effectiveness studies." Statistical Methods in Medical Research 27, no. 6 (September 21, 2016): 1709–22. http://dx.doi.org/10.1177/0962280216668554.

Abstract:
In longitudinal studies, if the time-dependent covariates are affected by past treatment, time-dependent confounding may be present. For a time-to-event response, marginal structural Cox models are frequently used to deal with such confounding. To avoid some of the problems of fitting a marginal structural Cox model, the sequential Cox approach has been suggested as an alternative. Although the estimation mechanisms are different, both approaches claim to estimate the causal effect of treatment by appropriately adjusting for time-dependent confounding. We carry out simulation studies to assess the suitability of the sequential Cox approach for analyzing time-to-event data in the presence of a time-dependent covariate that may or may not be a time-dependent confounder. Results from these simulations revealed that the sequential Cox approach is not as effective as the marginal structural Cox model in addressing time-dependent confounding. The sequential Cox approach was also found to be inadequate in the presence of a time-dependent covariate. We propose a modified version of the sequential Cox approach that correctly estimates the treatment effect in both of the above scenarios. All approaches are applied to investigate the impact of beta-interferon treatment in delaying disability progression in the British Columbia Multiple Sclerosis cohort (1995–2008).
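The marginal structural Cox model discussed in this abstract is fitted with inverse-probability-of-treatment weights. A minimal sketch of how stabilized weights accumulate across follow-up visits, assuming toy treatment probabilities in place of the fitted numerator and denominator models:

```python
# Stabilized inverse-probability-of-treatment (IPT) weights for one
# subject, sketched with toy probabilities. In practice p_num (treatment
# model given treatment history and baseline covariates) and p_den (the
# same model plus time-varying confounders) come from fitted logistic
# regressions.

def stabilized_weights(p_num, p_den, treated):
    """Cumulative stabilized IPT weight at each follow-up visit."""
    weights, w = [], 1.0
    for pn, pd, a in zip(p_num, p_den, treated):
        num = pn if a else 1.0 - pn   # P(observed treatment | history)
        den = pd if a else 1.0 - pd   # ... additionally given confounders
        w *= num / den                # product over visits up to t
        weights.append(w)
    return weights

# three visits of one hypothetical subject
w = stabilized_weights([0.5, 0.6, 0.6], [0.4, 0.8, 0.6], [1, 1, 0])
```

Each subject-visit record then enters the weighted Cox (or pooled logistic) fit with its current weight; weights far from 1 signal strong time-varying confounding.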
6. Westreich, Daniel, Stephen R. Cole, Enrique F. Schisterman, and Robert W. Platt. "A simulation study of finite-sample properties of marginal structural Cox proportional hazards models." Statistics in Medicine 31, no. 19 (April 11, 2012): 2098–109. http://dx.doi.org/10.1002/sim.5317.

7. Burne, Rebecca M., and Michal Abrahamowicz. "Adjustment for time-dependent unmeasured confounders in marginal structural Cox models using validation sample data." Statistical Methods in Medical Research 28, no. 2 (August 24, 2017): 357–71. http://dx.doi.org/10.1177/0962280217726800.

Abstract:
Large databases used in observational studies of drug safety often lack information on important confounders. The resulting unmeasured confounding bias may be avoided by using additional confounder information, frequently available in smaller clinical “validation samples”. Yet, no existing method that uses such validation samples is able to deal with unmeasured time-varying variables acting as both confounders and possible mediators of the treatment effect. We propose and compare alternative methods which control for confounders measured only in a validation sample within marginal structural Cox models. Each method corrects the time-varying inverse probability of treatment weights for all subject-by-time observations using either regression calibration of the propensity score, or multiple imputation of unmeasured confounders. Two proposed methods rely on martingale residuals from a Cox model that includes only confounders fully measured in the large database, to correct inverse probability of treatment weight for imputed values of unmeasured confounders. Simulation demonstrates that martingale residual-based methods systematically reduce confounding bias over naïve methods, with multiple imputation including the martingale residual yielding, on average, the best overall accuracy. We apply martingale residual-based imputation to re-assess the potential risk of drug-induced hypoglycemia in diabetic patients, where an important laboratory test is repeatedly measured only in a small sub-cohort.
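The martingale residual that this abstract uses to correct the weights has a simple closed form once a Cox model is fitted: the observed event indicator minus the model's expected number of events by the observed time. A sketch with invented fitted quantities (names and values are illustrative only, not from the paper):

```python
import math

# Martingale residual per subject: M_i = delta_i - H0(t_i) * exp(lp_i),
# where delta_i is the event indicator, H0 the estimated baseline
# cumulative hazard at the subject's observed time, and lp_i the linear
# predictor from the fitted Cox model.

def martingale_residuals(events, H0_at_t, lin_pred):
    return [d - h * math.exp(lp)
            for d, h, lp in zip(events, H0_at_t, lin_pred)]

# hypothetical fitted values for three subjects
res = martingale_residuals([1, 0, 1], [0.2, 0.5, 0.1], [0.0, 0.3, -0.2])
```

A positive residual means the subject had the event sooner than the model expected; such residuals carry information about covariates omitted from the model, which is what the proposed imputation methods exploit.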
8. Bruneau, Pierre, Fahim Ashkar, and Bernard Bobée. "SMPLNORM : Un modèle simple pour obtenir les probabilités conjointes de deux débits et le niveau qui en dépend." Canadian Journal of Civil Engineering 21, no. 5 (October 1, 1994): 883–95. http://dx.doi.org/10.1139/l94-094.

Abstract:
Most bivariate models assume the same type of marginal distribution, with two parameters, for two variables (gamma, type I extreme values, and so forth). The disadvantage of these models is that it is often difficult to make adjustments for observed flows. This study shows the application flexibility of a program that calculates the joint probability of two variables, Q1 and Q2, with marginal distributions that have three parameters. The program can also provide the probability of nonexceedence of a third variable, H, mathematically related to the first two variables. Two applications are discussed, in which Q1 and Q2 are the flows of two rivers controlling the variable H, which is a level in both cases. Theoretically, this model could also be applied to other types of variables. The proposed model is based on the hypothesis that a Box–Cox type of power transformation could reduce the marginal distributions of Q1 and Q2 to a normal distribution. One of the main conclusions of the study addresses the importance of taking into account the correlation between Q1 and Q2 to obtain a valid estimate of H. Key words: hydrology, statistics, bivariate models, Box–Cox, flow, water level. [Translated by the journal]
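This hydrology entry surfaces in the search results mainly through the Box–Cox keyword. The power transformation it relies on is a one-liner; the λ value below is arbitrary, and λ = 0 reduces to the logarithm:

```python
import math

# Box-Cox power transformation used to bring a skewed marginal
# distribution (e.g. a river flow Q) toward normality.

def box_cox(x, lam):
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam
```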
9. Santacatterina, Michele, Celia García‐Pareja, Rino Bellocco, Anders Sönnerborg, Anna Mia Ekström, and Matteo Bottai. "Optimal probability weights for estimating causal effects of time‐varying treatments with marginal structural Cox models." Statistics in Medicine 38, no. 10 (December 27, 2018): 1891–902. http://dx.doi.org/10.1002/sim.8080.

10. Madden, Jamie M., Finbarr P. Leacy, Lina Zgaga, and Kathleen Bennett. "Fitting Marginal Structural and G-Estimation Models Under Complex Treatment Patterns: Investigating the Association Between De Novo Vitamin D Supplement Use After Breast Cancer Diagnosis and All-Cause Mortality Using Linked Pharmacy Claim and Registry Data." American Journal of Epidemiology 189, no. 3 (November 1, 2019): 224–34. http://dx.doi.org/10.1093/aje/kwz243.

Abstract:
Studies have shown that accounting for time-varying confounding through time-dependent Cox proportional hazards models may provide biased estimates of the causal effect of treatment when the confounder is also a mediator. We explore 2 alternative approaches to addressing this problem while examining the association between vitamin D supplementation initiated after breast cancer diagnosis and all-cause mortality. Women aged 50–80 years were identified in the National Cancer Registry Ireland (n = 5,417) between 2001 and 2011. Vitamin D use was identified from linked prescription data (n = 2,570). We sought to account for the time-varying nature of vitamin D use and time-varying confounding by bisphosphonate use using 1) marginal structural models (MSMs) and 2) G-estimation of structural nested accelerated failure-time models (SNAFTMs). Using standard adjusted Cox proportional hazards models, we found a reduction in all-cause mortality in de novo vitamin D users compared with nonusers (hazard ratio (HR) = 0.84, 95% confidence interval (CI): 0.73, 0.99). Additional adjustment for vitamin D and bisphosphonate use in the previous month reduced the hazard ratio (HR = 0.45, 95% CI: 0.33, 0.63). Results derived from MSMs (HR = 0.44, 95% CI: 0.32, 0.61) and SNAFTMs (HR = 0.45, 95% CI: 0.34, 0.52) were similar. Utilizing MSMs and SNAFTMs to account for time-varying bisphosphonate use did not alter conclusions in this example.

Dissertations / Theses on the topic "Marginal Structural Cox models"

1. Lusivika Nzinga, Clovis. "Estimation d'effets individuels de traitements pris en combinaison dans les études observationnelles." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUS218.

Abstract:
Randomized controlled trials cannot be implemented in all situations for estimating the effects of therapeutic strategies. Observational studies then constitute an alternative for evaluating treatment effects. We focus on four types of methodological difficulties in such studies: 1) confounding by indication; 2) the presence of time-dependent confounding; 3) the relationship between a given time-dependent treatment and its effect may vary over time; 4) in real life, patients often receive multiple treatments, sequentially or simultaneously. In this context, the evaluation of the individual effects of treatments is a methodological challenge. The overall objective of this thesis was to propose a methodological framework in which these difficulties are accommodated, allowing the individual effects of treatments to be correctly estimated within the context of multiple treatments in an observational study. We evaluated the performance of the marginal structural Cox model when estimating the individual and joint effects of two treatments and showed that it performed well in the presence of three different scenarios of time-dependent confounding. We also showed the importance of estimating the interaction term when exploring the treatment effect of combination therapy. Finally, we compared the performance of the weighted cumulative exposure (WCE) marginal structural Cox model with that of a conventional time-dependent WCE Cox model for estimating time-varying effects of treatments without bias in the presence of time-dependent confounding. Our results showed that the WCE Cox MSM performed better and can be applied to real data whatever the strength of the time-dependent confounding.
2. Xiao, Yongling. "Flexible marginal structural models for survival analysis." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=107571.

Abstract:
In longitudinal studies, both treatments and covariates may vary throughout the follow-up period. Time-dependent (TD) Cox proportional hazards (PH) models can be used to model the effect of time-varying treatments on the hazard. However, two challenges exist in such modeling. First, accurate modeling of the effects of TD treatments on the hazard requires resolving the uncertainty about the etiological relevance of treatments taken in different time periods. The second challenge arises in the presence of TD confounders affected by prior treatments. Two different methodologies, weighted cumulative exposure (WCE) and marginal structural models (MSM), have recently been proposed to address each challenge separately, each assuming the absence of the other. In this thesis, I propose the combination of these methodologies so as to address both challenges simultaneously, as both may commonly arise in combination in longitudinal studies.

In the first manuscript, I proposed and validated a novel approach to implement the marginal structural Cox proportional hazards model (referred to as Cox MSM) with inverse-probability-of-treatment weighting (IPTW) directly via a weighted time-dependent Cox PH model, rather than via a pooled logistic regression approximation. The simulations show that the IPTW estimator yields consistent estimates of the causal effect of treatment, but it may suffer from large variability due to some extremely high IPT weights. The precision of the IPTW estimator can be improved by normalizing the stabilized IPT weights. Simple weight truncation has been proposed, and is commonly used in practice, as another solution to reduce the large variability of IPTW estimators. However, truncation levels are typically chosen based on ad hoc criteria which have not been systematically evaluated.

Thus, in the second manuscript, I proposed a systematic data-adaptive approach to select the optimal truncation level which minimizes the estimated expected MSE of the IPTW estimates. In simulations, the new approach performed as well as approaches that simply truncate the stabilized weights at high percentiles of their distribution, such as the 99th or 99.5th, in terms of reducing the variance and improving the MSE of the estimates.

In the third manuscript, I proposed a new, flexible model to estimate the cumulative effect of time-varying treatment in the presence of time-dependent confounders/mediators. The model incorporates weighted cumulative exposure modeling in a marginal structural Cox model. Specifically, weighted cumulative exposure was used to summarize the treatment history, defined as the weighted sum of the past treatments. The function that assigns different weights to treatments received at different times was modeled with cubic regression splines. The stabilized IPT weights for each person at each visit were calculated to account for time-varying confounding and mediation. The weighted Cox MSM, using stabilized IPT weights, was fitted to estimate the total causal cumulative effect of the treatments on the hazard. Simulations demonstrate that the proposed model can estimate the total causal cumulative effect, i.e., capture both the direct and the indirect (mediated by the TD confounder) treatment effects. Bootstrap-based 95% confidence bounds for the estimated weight function were constructed, and the impact of some extreme IPT weights on the estimates of the causal cumulative effect was explored.

In the last manuscript, I applied the WCE MSM to the Swiss HIV Cohort Study (SHCS) to re-assess whether cumulative exposure to abacavir therapy may increase the potential risk of cardiovascular events, such as myocardial infarction or cardiovascular-related death.
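The weighted cumulative exposure summary at the heart of this thesis is a weighted sum of past treatment indicators, WCE(t) = Σ w(t−s)·A(s) over visits s up to t. The thesis estimates the weight function with cubic regression splines; the sketch below substitutes a toy linear decay to show the mechanics (all values hypothetical):

```python
# Weighted cumulative exposure at the latest visit: recent treatments
# count more than distant ones, according to the weight function.

def wce(treatment_history, weight_fn):
    t = len(treatment_history) - 1          # index of current visit
    return sum(weight_fn(t - s) * a
               for s, a in enumerate(treatment_history))

def decay(lag):
    # toy weight function: linear decay over four visits
    return max(0.0, 1.0 - 0.25 * lag)

exposure = wce([1, 0, 1, 1], decay)  # subject treated at visits 0, 2, 3
```

The WCE value then replaces the raw current-treatment indicator as the exposure term in the weighted Cox MSM.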
3. Havercroft, William G. "Exploration of marginal structural models for survival outcomes." Thesis, University of Bristol, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684750.

Abstract:
A marginal structural model parameterises the distribution of an outcome given a treatment intervention, where such a distribution is the fundamental probabilistic representation of the causal effect of treatment on the outcome. Causal inference methods are designed to consistently estimate aspects of these causal distributions, in the presence of interference from non-causal associations which typically occur in observational data. One such method, which involves the application of inverse probability of treatment weights, directly targets the parameters of marginal structural models. The asymptotic properties and practical applicability of this method are well established, but little attention has been paid to its finite-sample performance. This is because simulating data from known distributions which are entirely suitable for such investigations generally presents a significant challenge, especially in scenarios where the outcome is survival time. We illuminate these issues, and propose and implement certain solutions, considering separately the cases of static (pre-determined) and dynamic (tailored) treatment interventions. In so doing, we explore both theoretical and practical aspects of marginal structural models for survival outcomes, and the associated inference method.
4. Yang, Shibing. "Application of Marginal Structural Models in Pharmacoepidemiologic Studies." VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3471.

Abstract:
Background: Inverse-probability-of-treatment-weighted (IPTW) estimation of marginal structural models was proposed to adjust for time-varying confounders that are influenced by prior treatment use. It is unknown whether pharmacoepidemiologic studies that applied IPTW conformed to the recommendations proposed by methodological studies. In addition, no previous study has compared the performance of different analytic strategies adopted in IPTW analyses. Objectives: This project aims 1) to review the reporting practice of pharmacoepidemiologic studies that applied IPTW, 2) to compare the validity and precision of several approaches to constructing weights, and 3) to use IPTW to estimate the effectiveness of glucosamine and chondroitin in treating osteoarthritis. Methods: We systematically retrieved pharmacoepidemiologic studies that were published in 2012 and applied IPTW to estimate the effect of a time-varying treatment. Under a variety of simulated scenarios, we assessed the performance of four analytic approaches that are commonly used in studies conducting IPTW analyses. Finally, using data from the Osteoarthritis Initiative, we applied IPTW to estimate the long-term effectiveness of glucosamine and chondroitin in treating knee osteoarthritis. Results: The practice of reporting the use of IPTW in pharmacoepidemiologic studies was suboptimal. The majority of reviewed studies did not report that the positivity assumption was assessed, and several studies used unstabilized weights or did not report whether stabilized weights were used. With data simulation, we found that intention-to-treat analyses underestimated the actual treatment effect when there was a non-null treatment effect and treatment non-adherence. This underestimation was linearly correlated with adherence levels. As-treated analyses that took into account the complex mechanism of treatment use generated approximately unbiased estimates without sacrificing precision when the treatment effect was non-null. Finally, after adjustment for potential confounders with marginal structural models, we found no clinically meaningful benefits of glucosamine/chondroitin in relieving knee pain and stiffness, improving physical function, or slowing joint space narrowing. Conclusions: It may be prudent to develop best practices for reporting the use of IPTW. Studies performing intention-to-treat analyses should report the levels of adherence after treatment initiation, and studies performing as-treated analyses should take into account the complex mechanism of treatment use in weight construction.
5. Mojaverian, Nassim. "Effects of sparse follow-up on marginal structural models for time-to-event data." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110690.

Abstract:
Background: Survival time is a common parameter of interest that can be estimated using Cox proportional hazards models when time is measured continuously. An alternative way to estimate hazard ratios is to cut time into equal-length intervals and define the by-interval outcome as 0 if the person is alive during the interval and 1 otherwise. In this discrete-time approximation, instead of using a Cox model, one performs pooled logistic regression, which yields approximately unbiased estimates under the assumption of a low death rate per interval. This assumption is satisfied when shorter intervals are used, so that fewer events occur per interval; however, shorter intervals introduce missing values, because actual visits occur less frequently in a survival setting, and one must therefore account for the missing data. Objective: We investigate the effect of two methods of filling in missing data, last observation carried forward (LOCF) and multiple imputation (MI), as well as an available-case analysis. We compare these three approaches to complete-data analysis. Methods: Weighted pooled logistic regression is used to estimate the causal marginal treatment effect. Complete data were generated using Young's algorithm to obtain monthly information for all individuals, and from the complete data, observed data were selected by assuming follow-up visits occurred every six or three months. To analyze the observed data at a monthly level, we performed LOCF and MI to fill in the missing values and compared the results to those from a completely observed data analysis. We also included an analysis of the observed data without any imputation. We then applied these methods to the Canadian Co-infection Cohort to estimate the impact of alcohol consumption on liver fibrosis. Results: In most simulations, MI produced the least biased and least variable estimators, even outperforming analyses based on completely observed data. In the presence of stronger confounding, MI-based estimators were more biased but nevertheless less variable than the estimators based on completely observed data. Conclusion: Multiple imputation is superior to last observation carried forward and observed-data analysis when marginal structural models are used to adjust for time-varying exposures and covariates in the context of survival analysis and data are missing or infrequently measured.
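The discrete-time approximation described in this abstract replaces the Cox fit with pooled logistic regression on person-interval records. The expansion step can be sketched as follows (interval width and record layout are illustrative):

```python
# Expand one subject's survival record into person-interval rows:
# outcome 0 for every interval survived, 1 in the interval containing
# the event; censored subjects get all zeros.

def person_periods(subject_id, time, event, width=1):
    rows, t = [], 0
    while t + width < time:
        rows.append((subject_id, t, 0))   # survived interval [t, t + width)
        t += width
    rows.append((subject_id, t, 1 if event else 0))
    return rows

rows = person_periods("s1", 3.5, event=True)   # event during the 4th interval
```

Stacking these rows across subjects gives the pooled data set to which the (weighted) logistic regression is fitted; with rare per-interval events, the logistic coefficient for treatment approximates the log hazard ratio.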
6. Pang, Menglan. "A study of non-collapsibility of the odds ratio via marginal structural and logistic regression models." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110697.

Abstract:
Background: It has been noted in epidemiology and biostatistics that when the odds ratio (OR) is used to measure the causal effect of a treatment or exposure, there is a discrepancy between the marginal OR and the conditional OR even in the absence of confounding. This is known as non-collapsibility of the OR. It is sometimes described (incorrectly) as a bias in the estimated treatment effect from a logistic regression model if an important covariate is omitted. Objectives: Distinguish confounding bias from non-collapsibility and measure the non-collapsibility effect on the OR in different scenarios. Methods: We used marginal structural models and standard logistic regression to measure the non-collapsibility effect and confounding bias. An analytic approach is proposed to assess the non-collapsibility effect in a point-exposure study. This approach can be used to verify the conditions for the absence of non-collapsibility and to examine the phenomenon of confounding without non-collapsibility. A graphical approach is employed to show the relationship between the non-collapsibility effect and the baseline risk or the marginal outcome probability, and it reveals the non-collapsibility behaviour with a range of different exposure effects and different covariate effects. In order to explore the non-collapsibility effect of the OR in the presence of time-varying confounding, an observational cohort study was simulated. Results and Conclusion: The total difference between the conditional and crude effects can be decomposed into a sum of the non-collapsibility effect and the confounding bias. We provide a general formula for expressing the non-collapsibility effect under different scenarios. Our analytic approach provided similar results to related formulae in the literature. Various interesting observations about non-collapsibility can be made from the different scenarios with or without confounding using the graphical approach. 
Somewhat surprisingly, the effect of the covariate plays a more important role in the non-collapsibility effect than does the effect of the exposure. In the presence of time-varying confounding, the non-collapsibility is comparable to the effect in the point-exposure study.
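The non-collapsibility this abstract describes can be reproduced with a few lines of arithmetic. The sketch below uses an illustrative conditional model of our own choosing (not the thesis's): a covariate Z independent of treatment A, so there is no confounding, yet the marginal odds ratio is smaller than the common conditional odds ratio.

```python
import math

def expit(x):
    return 1 / (1 + math.exp(-x))

# Conditional model: logit P(Y=1 | A, Z) = -1 + 2*A + 2*Z,
# with Z ~ Bernoulli(0.5) independent of A (no confounding).
beta_A = 2.0
cond_or = math.exp(beta_A)   # conditional OR, identical in both Z strata

def marginal_risk(a):
    """Average the risk under treatment a over the distribution of Z."""
    return 0.5 * expit(-1 + beta_A * a) + 0.5 * expit(-1 + beta_A * a + 2)

p1, p0 = marginal_risk(1), marginal_risk(0)
marg_or = (p1 / (1 - p1)) / (p0 / (1 - p0))
# marg_or ~ 5.32 < cond_or ~ 7.39: non-collapsibility without confounding
```

The gap between the two odds ratios here is pure non-collapsibility, since Z is not a confounder; this is exactly the decomposition the abstract discusses.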
APA, Harvard, Vancouver, ISO, and other styles
7

Ewings, F. M. "Practical and theoretical considerations of the application of marginal structural models to estimate causal effects of treatment in HIV infection." Thesis, University College London (University of London), 2012. http://discovery.ucl.ac.uk/1346448/.

Full text
Abstract:
Standard marginal structural models (MSMs) are commonly applied to estimate causal effects in the presence of time-dependent confounding; these may be extended to history-adjusted MSMs to estimate effects conditional on time-updated covariates, and dynamic MSMs to estimate effects of pre-specified dynamic regimes (Cain et al., 2010). We address methods to assess the optimal time for treatment initiation with respect to CD4 count in HIV-infected persons, and apply these to CASCADE cohort data. We advocate the application of all three types of MSM to address such causal questions and investigate gaps in the literature concerning their application. Of importance is the construction of suitable inverse probability weights. We have structured this process as four key decisions, defining a range of strategies; all demonstrated a beneficial effect of ART in CASCADE. We found a trend towards greater treatment benefit at lower CD4 across a range of models. Via large simulated randomised trials based on CASCADE data, longer grace periods (permitted delay in treatment initiation) and in particular less-frequently observed CD4 indicated higher optimal regimes (earlier treatment initiation at higher CD4), although similar AIDS-free survival rates may be achieved at these higher optimal regimes. In realistically-sized observational simulations, the optimal regime estimates lacked precision, mainly due to broadly constant AIDS-free survival rates at higher CD4. Optimal regimes estimated from dynamic MSMs should be interpreted with regard to the shape of the outcome-by-regime curve and the precision. In our clinical setting, we found that allowing a 3-month grace period may increase precision with little bias under the interpretation of no grace period; under longer grace periods, the bias outweighed the efficiency gain. In our CASCADE population, immediate treatment was preferable to delay, although estimation was limited by relatively short follow-up. 
Comparison across the MSM approaches offers additional insights into the methodology and clinical results.
APA, Harvard, Vancouver, ISO, and other styles
8

Oba, Koji. "How to use marginal structural models in randomized trials to estimate the natural direct and indirect effects of therapies mediated by causal intermediates." 京都大学 (Kyoto University), 2011. http://hdl.handle.net/2433/152045.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

FORNARI, CARLA. "Metodi statistici per la valutazione di costo-efficacia in HTA." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2012. http://hdl.handle.net/10281/28475.

Full text
Abstract:
Health technology assessment (HTA) is a potential multidisciplinary tool for decision-making in healthcare policy. Scientific evidence for HTA processes usually comes from randomized clinical trials (RCTs), which are the gold standard, but observational studies can be used to overcome study design limitations of RCTs. Nowadays, the use of healthcare administrative databases (HADs) in HTA is not widespread, despite their apparent utility. The objective of this work was to use HADs to conduct a cost-effectiveness evaluation of a pharmacological therapy under real-world adherence to treatment. We analyzed the case of statin therapy in the secondary prevention of acute myocardial infarction (AMI). Data were derived from HADs of the healthcare authority of Lombardy, a region in Northern Italy. Patients hospitalized for a first AMI event in 2003 were followed until 31 December 2008, collecting data on healthcare services and vital status. We used regression methods to estimate the incremental net benefit of treatment or of adherence levels to treatment. HADs provide a rich source of information, but the analysis must account for the correlation between repeated measures on the same subject, possible selection bias and endogeneity of the data. In this situation classical regression methods are not appropriate, so we applied marginal structural models (MSMs) to the health economics framework. The cost-effectiveness analysis of statin treatment in the secondary prevention of AMI confirms that statins are cost-effective, while the analysis of adherence to treatment needs further assessment. The implementation of MSMs could be a future tool in economic healthcare evaluations, because MSMs make it possible to account for the time-varying confounding and selection bias typical of longitudinal observational studies. In this way the use of data from HADs or from observational studies could become more reliable.
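The incremental net benefit mentioned in the abstract is, at its core, a simple quantity; the sketch below uses made-up numbers purely to show the arithmetic, not the thesis's estimates.

```python
# Incremental net benefit (INB) of a treatment versus a comparator:
#   INB = wtp * delta_e - delta_c
# where delta_e is incremental effectiveness (e.g. life-years gained),
# delta_c is incremental cost, and wtp is the willingness-to-pay threshold.
# All values below are illustrative.
wtp = 30000.0        # willingness to pay per unit of effectiveness
delta_e = 0.15       # incremental effectiveness
delta_c = 2500.0     # incremental cost

inb = wtp * delta_e - delta_c
# inb > 0 means the treatment is cost-effective at this threshold
```

In the thesis's setting, `delta_e` and `delta_c` are estimated by regression (MSMs) from the administrative data rather than plugged in directly.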
APA, Harvard, Vancouver, ISO, and other styles
10

Farmer, R. E. "Application of marginal structural models with inverse probability of treatment weighting in electronic health records to investigate the benefits and risks of first line type II diabetes treatments." Thesis, London School of Hygiene and Tropical Medicine (University of London), 2017. http://researchonline.lshtm.ac.uk/4646129/.

Full text
Abstract:
Background: Electronic healthcare records (EHRs) provide opportunities to estimate the effects of type two diabetes (T2DM) treatments on outcomes such as cancer and cardiovascular disease. Marginal structural models (MSMs) with inverse probability of treatment weights (IPTW) can correctly estimate the causal effect of time-varying treatment in the presence of time-dependent confounders such as HbA1c. Dynamic MSMs can be used to compare dynamic treatment strategies. This thesis applies weighted MSMs and dynamic MSMs to explore risks and benefits of early-stage T2DM treatments, and considers the practicalities/impact of using these models in a complex clinical setting with a challenging data source. Methods and Findings: A cohort of patients with newly diagnosed T2DM was identified from the Clinical Practice Research Datalink. MSMs with IPTW were used to estimate the causal effect of metformin monotherapy on cancer risk, and the effects of metformin and sulfonylurea monotherapies on risks of MI, stroke, all-cause mortality, and HbA1c trajectory. Dynamic MSMs were implemented to compare HbA1c thresholds for treatment initiation on risks of MI, stroke, all-cause mortality (ACM) and glucose control. No association was found between metformin use and cancer risk. Metformin and sulfonylureas led to better HbA1c control than diet only, as expected, and there was some evidence of reduced MI risk with long-term metformin use. Changes in estimates between standard models and weighted models were generally in the expected direction given hypothesised time-dependent confounding. For stroke and ACM, results were less conclusive, with some suggestions of residual confounding. Higher HbA1c thresholds for treatment initiation reduced the likelihood of reaching target HbA1c, and there was a suggestion that higher initiation thresholds increased MI risk. Conclusions: Fitting weighted MSMs and dynamic MSMs was feasible using routine primary care data. 
The models appeared to work well in controlling for strong time-dependent confounding with short-term outcomes; results for longer-term outcomes were less conclusive.
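For the time-varying treatments studied here, stabilized inverse probability of treatment weights are cumulative products of per-visit probability ratios. The sketch below illustrates this construction with simulated stand-in probabilities; in a real MSM analysis the numerator and denominator would come from fitted pooled logistic models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical longitudinal data: per-visit probabilities of the treatment
# actually received, given full history (denominator model) and given
# treatment history only (numerator model). Values are simulated stand-ins.
n_subj, n_visits = 4, 3
p_den = rng.uniform(0.2, 0.8, size=(n_subj, n_visits))  # P(A_t | A_past, L_t)
p_num = rng.uniform(0.2, 0.8, size=(n_subj, n_visits))  # P(A_t | A_past)
A = rng.binomial(1, p_den)                              # observed treatment

# Probability of the observed treatment at each visit, then the running
# product over follow-up gives the stabilized weight at each visit
num_t = np.where(A == 1, p_num, 1 - p_num)
den_t = np.where(A == 1, p_den, 1 - p_den)
sw = np.cumprod(num_t / den_t, axis=1)
```

Each row of `sw` is one subject's weight trajectory; extreme values of these cumulative products are why weight truncation (as in Xiao et al., above) is often considered.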
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Marginal Structural Cox models"

1

Rosenblum, Michael. "Marginal Structural Models." In Targeted Learning, 145–60. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4419-9782-1_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bergsma, Wicher, Marcel Croon, and Jacques A. Hagenaars. "Causal Analyses: Structural Equation Models and (Quasi-)Experimental Designs." In Marginal Models, 155–90. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/b12532_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Robins, James M. "Marginal Structural Models versus Structural nested Models as Tools for Causal inference." In Statistical Models in Epidemiology, the Environment, and Clinical Trials, 95–133. New York, NY: Springer New York, 2000. http://dx.doi.org/10.1007/978-1-4612-1284-3_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Petersen, Maya, Joshua Schwab, Elvin Geng, and Mark van der Laan. "Chapter 10: Evaluation of longitudinal dynamic regimes with and without marginal structural working models." In Adaptive Treatment Strategies in Practice, 157–86. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2015. http://dx.doi.org/10.1137/1.9781611974188.ch10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gaudry, Marc, Ulrich Blum, and Tran Liem. "Turning Box-Cox including quadratic forms in regression." In Structural Road Accident Models, 335–46. Elsevier, 2000. http://dx.doi.org/10.1016/b978-008043061-4/50016-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gaudry, Marc, Ulrich Blum, and Tran Liem. "Turning Box-Cox Including Quadratic Forms in Regression." In Structural Road Accident Models, 335–46. Emerald Group Publishing Limited, 2000. http://dx.doi.org/10.1108/9780080518015-014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mittinty, Murthy N. "Structural Nested Mean Models or History-Adjusted Marginal Structural Models for Time-Varying Effect Modification: An Application to Dental Data." In Handbook of Statistics, 249–73. Elsevier, 2017. http://dx.doi.org/10.1016/bs.host.2017.08.009.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Marginal Structural Cox models"

1

Diouf, Ibrahima, Charles B. Malpas, Sifat Sharmin, Olga Skibina, Katherine Buzzard, Jeannette Lechner-Scott, Michael Barnett, et al. "006 Comparison of multiple disease modifying therapies in multiple sclerosis with marginal structural models." In ANZAN Annual Scientific Meeting 2021 Abstracts. BMJ Publishing Group Ltd, 2021. http://dx.doi.org/10.1136/bmjno-2021-anzan.6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dumas, Orianne, Nicole Le Moual, Valérie Siroux, Dick Heederik, Francine Kauffmann, and Xavier Basagana. "Marginal Structural Models To Quantify And Control For The Healthy Worker Effect In Asthma: Results From The EGEA Study." In American Thoracic Society 2012 International Conference, May 18-23, 2012 • San Francisco, California. American Thoracic Society, 2012. http://dx.doi.org/10.1164/ajrccm-conference.2012.185.1_meetingabstracts.a1175.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wordofa, Daniel Hailemichael. "Development of Track Geometry Degradation Model & Review of Recovery Models." In 2022 Joint Rail Conference. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/jrc2022-79370.

Full text
Abstract:
Abstract Former works on railway track geometry degradation are reviewed and their advantages and disadvantages discussed. Railway track degradation can be geometric or structural; this paper focuses on track geometry degradation and proposes additional degradation models based on real data from an existing operational line. The case study is a conventional railway line in Ethiopia, the Ethio-Djibouti standard gauge railway line. Track geometry data collected by a track geometry measuring vehicle over 156 weeks and a length of 323 kilometers are used for the analysis; 114 km is double track and the rest is single track. Models are developed for both single and double track lines. The degradation models developed are statistical probabilistic models that can be used for prediction with the required accuracy, and their probabilistic nature provides marginal values for any required confidence interval. Effects of several factors on track geometry degradation are also analyzed. Most results agree with former studies, but the effect of speed on track geometry degradation differs.
APA, Harvard, Vancouver, ISO, and other styles
4

Zhao, Yuliang, Sheng Dong, Zihao Yang, and Lance Manuel. "Estimating Design Loads for Floating Structures Using Environmental Contours." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18453.

Full text
Abstract:
Abstract To ensure acceptable operation and/or survival of floating structures in extreme conditions, nonlinear time-domain simulations are often used to predict the structural response at the design stage. An environmental contour (EC) is commonly employed to identify critical sea states that serve as input for numerical simulations to assess the safety and performance of marine structures. In many studies, marginal and conditional distributions are defined to construct bivariate joint probability distributions for variables such as significant wave height and zero-crossing period; then, environmental contours can be constructed using the inverse first-order reliability method (IFORM). This study adopts alternative models to describe the generalized dependence structure between the environmental variables using copulas; the Nataf transformation is also discussed as a special case. Environmental contours are constructed, making use of measured wave data from moored buoys. Derived design loads are applied on a semi-submersible platform to assess possible differences. In addition, the long-term extremes of the tension of the mooring lines are estimated, considering uncertainties in the structural response using a 3D model (that includes response variability, ignored with the EC approach) to help establish more accurate design loads using Monte Carlo simulation. Results offer a clear indication of the extreme response of the floating structure based on the different models.
APA, Harvard, Vancouver, ISO, and other styles
5

Soares, Catarina S., and C. Guedes Soares. "Comparison of Bivariate Models of the Distribution of Significant Wave Height and Peak Wave Period." In ASME 2007 26th International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2007. http://dx.doi.org/10.1115/omae2007-29740.

Full text
Abstract:
This paper presents the results of a comparison of the fit of three bivariate models to a set of 14 years of significant wave height and peak wave period data from the North Sea. One method defines the joint distribution from a marginal distribution of significant wave height and a set of distributions of peak period conditional on significant wave height. Another applies the Plackett model to the data, and the third applies the Box-Cox transformation to make the data approximately normal and then fits a bivariate normal distribution to the transformed data set. It is shown that all methods provide a good fit, but each has its own strengths and weaknesses, the choice depending on the data available and the applications in mind.
APA, Harvard, Vancouver, ISO, and other styles
6

Gueveneux, Hervé, and Philippe Le Buhan. "Hybrid Riser Tower Design: Evolution, Operational Efficiency and Compliance to Ultra Deep Offshore Challenge." In ASME 2014 33rd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/omae2014-23882.

Full text
Abstract:
Riser system selection and sizing is one of the most challenging design issues in the subsea world, specifically for deepwater applications. Bundle riser towers have been used for a few deepwater developments since the late 1990s and have demonstrated their capacity to meet operational constraints in terms of field layout congestion, flow assurance, mechanical design, fabrication, installation, etc. From the pioneering application in the Gulf of Mexico (Green Canyon 29 – 1988) to the 2nd generation systems in Western Africa (GIRASSOL to Rosa or CLOV), the design of Hybrid Riser Towers has evolved to improve mechanical robustness, installation efficiency and operability. This article reviews these past evolutions and gives an overall view of the performance achieved so far, mainly in West African sectors. The move of the offshore industry towards ultra-deep (WD > 1800 m) and/or marginal fields may trigger additional design evolutions, which are highlighted in the article, with a focus on the increase in water depth considering environmental conditions met on previous projects. Feasibility in harsher conditions is not addressed in this document. 
In addition, the challenges raised by the engineering of such deepwater Hybrid Riser Tower concepts are discussed: • Structural models of different scales shall be used to properly account for potential couplings between the global behavior of the whole tower with the local responses of each individual component; • Top and bottom assemblies as well as the rigid or flexible spools, the piping and connectors shall be introduced into the FE models to define correct boundary conditions to the HRT; • Slug or severe internal flows shall be considered for fatigue issues requiring complex models which combine CFD and structural calculations; • Risks of hydrodynamic instabilities shall be assessed like plunge and torsion galloping for non-cylindrical sections and VIV for bare peripherals; • A specific attention shall be paid to the fatigue budget spent from towing and Installation to the In-place configuration.
APA, Harvard, Vancouver, ISO, and other styles
7

Kiviluoma, Risto, and Atte Mikkonen. "Equivalent static wind load procedure for skew winds on large bridges." In IABSE Congress, New York, New York 2019: The Evolving Metropolis. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 2019. http://dx.doi.org/10.2749/newyork.2019.2573.

Full text
Abstract:
This paper describes a theoretical framework for forming equivalent static wind loads (ESWL) for large bridges. A method is proposed for efficient handling of a large number of load cases when vibration and structural analysis is extended to skew winds, i.e., to wind directions other than the principal ones. These appear increasingly important in many practical cases where complex bridge geometry is used for architectural uniqueness, or where the bridge is situated in city centres or hilly terrain, where local obstacles make the wind turbulence difficult to assess with standard models. The method uses a set of load cases for the principal wind directions, input and solved with the static Finite Element (FE) model; a combination matrix is then deduced for the results to assess skew winds. The method is similar to that frequently used in wind-tunnel studies of tall buildings. ESWL determination is done in co-operation between the wind engineer and the bridge engineer. The needed input for the wind engineer includes numeric vibration mode shape data, global nodal coordinates and mass distributions. ESWL are created in numeric form that can easily be input to the FE model. The method allows utilisation of various types of analysis results and experimental data available for the bridge, including section-model based analysis, full-model wind-tunnel tests and structural monitoring results. It facilitates examination and adjustment of an appropriate safety margin for wind loads that takes into account the methodological uncertainties in each. It is proposed that wind-tunnel laboratories, or other wind engineers with bridge analysis expertise, should more often include ESWL extraction in their services.
APA, Harvard, Vancouver, ISO, and other styles
8

Kruse, Benjamin, Clemens Münzer, Stefan Wölkl, Arquimedes Canedo, and Kristina Shea. "A Model-Based Functional Modeling and Library Approach for Mechatronic Systems in SysML." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70378.

Full text
Abstract:
Even though the concept development phase in product development is arguably the most important phase in mechanical and mechatronics design, the available computer-based support for this stage is marginal. This paper presents a new computational model-based method to improve the early phases of mechatronic product design and to facilitate the application from early designs to detailed designs. The paper focuses on model-based Function-Behavior-Structure (FBS) libraries in SysML to support both the manual and computational generation of standard and innovative concepts. In this paper, an approach to re-usable functional models in SysML is presented. The method uses an operator-flow formulation of functions, based on the NIST functional basis, and is validated against a model of an electric car. The generated functional models are validated with respect to the consistency of the flows and tested by associating the functional model directly to the target product component structure. The results of the research are a new modeling approach for function and component libraries in SysML, an associated workflow for modeling of mechatronic systems, and the necessary extensions of the NIST functional basis. The modeling approach provides means for formal functional decomposition followed by an allocation of the functions to structural components that form the target structure.
APA, Harvard, Vancouver, ISO, and other styles
9

Wu, Jun, Can Ma, Chunhui Dai, Zhenxing Zhao, Lu Dai, and Zhouyang Liu. "Research on Structural Design and Analysis of S-CO2 Turbine Impeller." In 2018 26th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/icone26-81267.

Full text
Abstract:
The Brayton cycle with supercritical carbon dioxide (S-CO2) as working medium is one of the most promising new nuclear power systems, and the turbine is the key device in the Brayton power cycle. Because of the special physical properties of S-CO2, the turbine structure is small and rotates at extremely high speed, which significantly increases the difficulty of structural design and strength safety. Based on the aerodynamic design and optimization results of a 200 kW S-CO2 radial inflow turbine, this paper proposes a detailed structural design and analysis method for the turbine impeller. From the three-dimensional blade profile data and meridional plane data, key structural design parameters are chosen and a parametric geometry model is established with CAD tools. On this basis, numerical simulation models of the turbine are established to analyze the structural strength in detail. The influence of the parameters on impeller strength is then studied through a series of finite element numerical procedures, and the influence mechanisms of key structural design parameters on impeller strength are discussed. The final impeller model is obtained by parameter comparison and selection. The results show that for the initial model, the maximum von Mises equivalent stress is 400.10 MPa, the maximum radial deformation is 0.0333 mm and the maximum axial deformation is 0.0770 mm; for the final model, the maximum von Mises equivalent stress is 294.26 MPa, the maximum radial deformation is 0.0279 mm and the maximum axial deformation is 0.0769 mm. The maximum von Mises equivalent stress and maximum radial deformation decrease by 26.45% and 16.22%, respectively, compared with the initial model. As a result, the impeller's structural strength safety margin is clearly improved by the parameter analysis.
APA, Harvard, Vancouver, ISO, and other styles
10

Vanem, Erik, and Arne Bang Huseby. "Environmental Contours Based on a Direct Sampling Approach and the IFORM Approach: Contribution to a Benchmark Study." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18041.

Full text
Abstract:
Abstract Environmental contours are often applied in probabilistic structural reliability analysis to identify extreme environmental conditions that may give rise to extreme loads and responses. It represents an approximate method for performing long-term extreme response analyses in cases where full long-term analyses are not feasible due to computationally heavy and time-demanding response calculations. There are various methods for deriving environmental contours given a set of metocean data. These relate to different approaches for modelling the joint behaviour of the metocean variables, i.e., a joint distribution function fitted to the data, but also different ways of establishing the environmental contour given a joint distribution for the environmental variables. In light of this, a benchmark exercise was announced at OMAE 2019 [1], asking for contributions from different practitioners involved with environmental contours. Various bivariate datasets are provided and two exercises are specified for which different solutions are elicited. The first part of the exercise concerns the estimation of the actual contours, whereas the second part relates to the uncertainty characterization of the contours in light of sampling variability. This paper is a response to this announcement and provides one contribution to these benchmark exercises; environmental contours based on a direct sampling approach as well as contours based on the IFORM approach will be presented. Both sets of contours are based on the same models for the joint distribution of the environmental variables, i.e., a conditional model where the joint distribution is modelled as a product of a marginal model for one variable and a conditional model for the other. Both the joint modelling of the environmental variables and the different approaches to estimate environmental contours are described in this paper and the results for the provided datasets are shown.
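The IFORM construction referred to in the abstract can be sketched in a few lines; the distribution families and parameter values below are illustrative assumptions of ours, not the paper's fitted model. Points on a circle of radius beta in standard normal space are mapped through the marginal and conditional distributions (a Rosenblatt transformation):

```python
import math
from statistics import NormalDist

nd = NormalDist()

# Illustrative joint model: Hs ~ Weibull(shape k, scale lam),
# Tp | Hs ~ lognormal with Hs-dependent location
k, lam = 1.5, 2.5
sigma = 0.2

def mu(hs):
    return 1.0 + 0.3 * math.log(hs + 1.0)

# Reliability index for a 25-year return period with hourly sea states
beta = nd.inv_cdf(1 - 1 / (365.25 * 24 * 25))

contour = []
for i in range(36):
    theta = 2 * math.pi * i / 36
    u1, u2 = beta * math.cos(theta), beta * math.sin(theta)
    # Map the standard-normal circle through the joint model
    hs = lam * (-math.log(1 - nd.cdf(u1))) ** (1 / k)   # Weibull quantile
    tp = math.exp(mu(hs) + sigma * u2)                  # conditional quantile
    contour.append((hs, tp))
```

The resulting `(hs, tp)` pairs trace the environmental contour; the direct sampling approach compared in the paper instead derives the contour from Monte Carlo samples of the joint distribution.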
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Marginal Structural Cox models"

1

Romero-Chamorro, José Vicente, and Sara Naranjo-Saldarriaga. Weather Shocks and Inflation Expectations in Semi-Structural Models. Banco de la República Colombia, November 2022. http://dx.doi.org/10.32468/be.1218.

Full text
Abstract:
Colombia is particularly affected by weather fluctuations linked to the El Niño Southern Oscillation (ENSO). In this context, this study explores how adverse weather events linked to ENSO affect inflation expectations in Colombia and how to incorporate these second-round effects into a small open economy New Keynesian model. Using BVARx models, we provide evidence that inflation expectations obtained from surveys and break-even inflation measures are affected by weather supply shocks. Then, using this stylised fact, we modify one of the core forecasting models of the Banco de la República by incorporating the mechanisms through which weather-related shocks affect marginal costs and inflation expectations. We find that ENSO shocks played an important role in both inflation and the dynamics of inflation expectations, and that policymakers should take this into account.
APA, Harvard, Vancouver, ISO, and other styles
2

Tanny, Josef, Gabriel Katul, Shabtai Cohen, and Meir Teitel. Application of Turbulent Transport Techniques for Quantifying Whole Canopy Evapotranspiration in Large Agricultural Structures: Measurement and Theory. United States Department of Agriculture, January 2011. http://dx.doi.org/10.32747/2011.7592121.bard.

Full text
Abstract:
Original objectives and revisions

The original objectives of this research, as stated in the approved proposal, were:
1. To establish guidelines for the use of turbulent transport techniques as an accurate and reliable tool for continuous measurements of whole canopy ET and other scalar fluxes (e.g. heat and CO2) in large agricultural structures.
2. To conduct a detailed experimental study of flow patterns and turbulence characteristics in agricultural structures.
3. To derive theoretical models of air flow and scalar fluxes in agricultural structures that can guide the interpretation of TT measurements for a wide range of conditions.

All the objectives have been successfully addressed within the project. The only modification was that the study focused on screenhouses only, while it was originally planned to study large greenhouses as well. This was decided due to the large amount of field and theoretical work required to meet the objectives within screenhouses.

Background

In agricultural structures such as screenhouses and greenhouses, evapotranspiration (ET) is currently measured using lysimeters or sap flow gauges. These measurements provide ET estimates at the single-plant scale that must then be extrapolated, often statistically or empirically, to the whole canopy for irrigation scheduling purposes. On the other hand, turbulent transport techniques, like the eddy covariance, have become the standard for measuring whole canopy evapotranspiration in the open, but their applicability to agricultural structures has not yet been established. The subject of this project is the application of turbulent transport techniques to estimate ET for irrigation scheduling within large agricultural structures.
Major conclusions and achievements

The major conclusions of this project are: (i) the eddy covariance technique is suitable for reliable measurements of scalar fluxes (e.g., evapotranspiration, sensible heat, CO2) in most types of large screenhouses under all climatic conditions tested; all studies resulted in fair energy balance closures; (ii) comparison between measurements and theory shows that the model is capable of reliably predicting the turbulent flow characteristics and surface fluxes within screenhouses; (iii) flow characteristics within the screenhouse, such as flux-variance similarity and turbulence intensity, supported the application of the eddy covariance technique in screenhouses with relatively dilute screens used for moderate shading and wind breaking; in denser screens, usually used for insect exclusion, development of turbulent conditions was marginal; (iv) installation of the sensors requires that the system's footprint lie within the limits of the screenhouse under study, as is the case in the open. A footprint model available in the literature was found to be reliable in assessing the footprint under screenhouse conditions.

Implications, both scientific and agricultural

The study established for the first time, both experimentally and theoretically, the use of the eddy covariance technique for flux measurements within agricultural screenhouses. Such measurements, along with reliable theoretical models, will enable more accurate assessments of crop water use, which may lead to improved crop water management and increased water use efficiency of screenhouse crops.
APA, Harvard, Vancouver, ISO, and other styles
