Academic literature on the topic 'Marginal structure model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Marginal structure model.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Marginal structure model"

1

Li, Zun, Yuanpei Zhu, and Yuping Wang. "A Criminisi-DnCNN Model-Based Image Inpainting Method." Mathematical Problems in Engineering 2022 (August 2, 2022): 1–8. http://dx.doi.org/10.1155/2022/9780668.

Abstract:
Existing image inpainting methods achieve unsatisfactory results in dealing with centralized inpainting areas. For this reason, this study proposes a Criminisi-DnCNN model-based image inpainting method. Inspired by manual inpainting technology, the pointwise mutual information (PMI) algorithm was adopted to obtain the marginal structure map of the images to be repaired. Then, the Criminisi algorithm was used to restore the marginal structure, guided by the superficial linear structure, to obtain a complete marginal structure image. Finally, the problem of texture inpainting was converted into one of image denoising through the separation of variables, using the denoising convolutional neural network image denoiser (DnCNN). Compared with existing inpainting methods, this model improves the clarity of the marginal structure and reduces blurring in the area to be repaired.
2

McNeish, Daniel, and Jeffrey R. Harring. "Improving convergence in growth mixture models without covariance structure constraints." Statistical Methods in Medical Research 30, no. 4 (January 12, 2021): 994–1012. http://dx.doi.org/10.1177/0962280220981747.

Abstract:
Growth mixture models are a popular method to uncover heterogeneity in growth trajectories. Harnessing the power of growth mixture models in applications is difficult given the prevalence of nonconvergence when fitting growth mixture models to empirical data. Growth mixture models are rooted in the random effect tradition, and nonconvergence often leads researchers to modify their intended model with constraints in the random effect covariance structure to facilitate estimation. While practical, doing so has been shown to adversely affect parameter estimates, class assignment, and class enumeration. Instead, we advocate specifying the models with a marginal approach to prevent the widespread practice of sacrificing class-specific covariance structures to appease nonconvergence. A simulation is provided to show the importance of modeling class-specific covariance structures and builds off existing literature showing that applying constraints to the covariance leads to poor performance. These results suggest that retaining class-specific covariance structures should be a top priority and that marginal models like covariance pattern growth mixture models that model the covariance structure without random effects are well-suited for such a purpose, particularly with modest sample sizes and attrition commonly found in applications. An application to PTSD data with such characteristics is provided to demonstrate (a) convergence difficulties with random effect models, (b) how covariance structure constraints improve convergence but to the detriment of performance, and (c) how covariance pattern growth mixture models may provide a path forward that improves convergence without forfeiting class-specific covariance structures.
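As a brief illustration of the distinction the abstract draws, the within-class covariance of the repeated measures can be written in two standard forms: the random-effects form requires estimating a class-specific random-effect covariance, while the covariance pattern form models the covariance directly without random effects. These are the textbook forms; the paper's exact specification may differ.

```latex
% Random-effects growth mixture model, class k:
\Sigma_k^{\mathrm{RE}} \;=\; Z \Psi_k Z^{\top} + \sigma_k^{2} I
% Covariance pattern growth mixture model, class k (no random effects):
\Sigma_k^{\mathrm{CP}} \;=\; \Sigma(\boldsymbol{\theta}_k), \quad \text{e.g.\ AR(1) or Toeplitz}
```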
3

Ando, Shuji. "A Bivariate Index for Visually Measuring Marginal Inhomogeneity in Square Tables." International Journal of Statistics and Probability 8, no. 5 (August 15, 2019): 58. http://dx.doi.org/10.5539/ijsp.v8n5p58.

Abstract:
For square contingency tables, the marginal homogeneity model, which has the structure that the row marginal distribution is equal to the column marginal distribution, was proposed. Thereafter, various extended models of marginal homogeneity have been proposed; these models can be classified into two types of marginal inhomogeneity. On the other hand, various indexes have been proposed that measure the degree of deviation from marginal homogeneity. However, these indexes cannot concurrently quantify the degrees of deviation from marginal homogeneity with respect to the two types of marginal inhomogeneity. This paper proposes a bivariate index that can quantify both concurrently. The proposed bivariate index would also be useful for visually comparing degrees of deviation from marginal homogeneity across several tables using confidence regions.
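For reference, the marginal homogeneity model for an r x r table with cell probabilities p_ij states that each row marginal equals the corresponding column marginal:

```latex
p_{i+} \;=\; \sum_{j=1}^{r} p_{ij} \;=\; \sum_{j=1}^{r} p_{ji} \;=\; p_{+i},
\qquad i = 1, \dots, r.
```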
4

Kertel, Maximilian, and Markus Pauly. "Estimating Gaussian Copulas with Missing Data with and without Expert Knowledge." Entropy 24, no. 12 (December 19, 2022): 1849. http://dx.doi.org/10.3390/e24121849.

Abstract:
In this work, we present a rigorous application of the Expectation Maximization algorithm to determine the marginal distributions and the dependence structure in a Gaussian copula model with missing data. We further show how to circumvent a priori assumptions on the marginals with semiparametric modeling. We also outline how expert knowledge of the marginals and the dependence structure can be included. A simulation study shows that the distribution learned through this algorithm is closer to the true distribution than that obtained with existing methods, and that the incorporation of domain knowledge provides benefits.
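A minimal sketch of the semiparametric idea for complete data (not the paper's EM algorithm, which additionally handles missing entries): estimate each marginal by its empirical CDF, map the pseudo-observations to the latent Gaussian scale, and estimate the copula correlation there.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_corr(X):
    """Semiparametric estimate of the Gaussian-copula correlation of an
    (n x d) array X: empirical CDFs for the marginals, then normal scores."""
    n, d = X.shape
    # pseudo-observations in (0, 1), one empirical CDF per column
    U = np.column_stack([rankdata(X[:, j]) / (n + 1) for j in range(d)])
    Z = norm.ppf(U)  # map to the latent Gaussian scale
    return np.corrcoef(Z, rowvar=False)
```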
5

Boháčik, Ján. "Financial shocks and their effects on velocity of money in agent-based model." Review of Economic Perspectives 22, no. 4 (December 1, 2022): 241–66. http://dx.doi.org/10.2478/revecp-2022-0011.

Abstract:
The interaction of debt and economic performance has been receiving more attention over the last few years. However, models making provision for debt are still outnumbered by models that ignore it completely. This paper is the first to analyze the relationship between household debt (in the form of bank loans) and economic performance (in terms of aggregate income), considering both the impact of the wealth and income distribution and the impact of the distribution of the marginal propensity to consume (MPC) under various financial shocks. The outcomes of the model are velocities, calculated as ratios of aggregate income to aggregate debt. The paper demonstrates how financial shocks affect the income velocity of money under different distributions of wealth/income and of the marginal propensity to consume across the population. For this purpose, an original agent-based simulation model with a limited loan supply was designed. The proposed model shocks are shocks to loan demand, loan supply, the marginal propensity to consume, macro-prudential regulatory ratios, real estate capital gains, repayment ratios, the structure of loans provided, and the structure of real estate property transactions. It is shown that the more equal the distributions of wealth/income and of the marginal propensity to consume, the higher the income velocity of money. Among the financial shocks, the marginal propensity to consume shock and the shock to the structure of new real estate property purchases have the largest impact on velocity; the shock to regulatory ratios generally has the lowest magnitude.
6

Yao, P. "Causal inference: Cognitive functioning and depressive symptoms by longitudinal marginal structure model." Value in Health 17, no. 3 (May 2014): A183–A184. http://dx.doi.org/10.1016/j.jval.2014.03.1072.

7

Wiese, Richard. "A Two-Level Approach to Morphological Structure." Journal of Germanic Linguistics 20, no. 3 (September 2008): 243–74. http://dx.doi.org/10.1017/s147054270800010x.

Abstract:
In morphological theory, various models have been developed with respect to the appropriate levels of abstraction for stating morphological generalizations. This paper addresses a class of seemingly marginal and/or problematic phenomena in morphology and proposes that morphological descriptions regularly refer to two distinct levels of description. One is the level of “morphosyntax,” and one is the level of “morphophonology.” Furthermore, morphology is considered to be marginal if and only if the degree of isomorphy between representations on these two levels is reduced. This basic proposal is illustrated and tested with several central phenomena of morphology found in German: synthetic compounds, conversion, empty morphs, and truncation. The analysis proposed here argues for the necessity of a two-level model of morphology as an approach in which both abstract morphosyntax as well as more concrete morphophonology have a place.
8

Niu, Yi, Xiaoguang Wang, Hui Cao, and Yingwei Peng. "Variable selection via penalized generalized estimating equations for a marginal survival model." Statistical Methods in Medical Research 29, no. 9 (January 29, 2020): 2493–506. http://dx.doi.org/10.1177/0962280220901728.

Abstract:
Clustered and multivariate survival times, such as times to recurrent events, commonly arise in biomedical and health research, and marginal survival models are often used to model such data. When a large number of predictors are available, variable selection is always an important issue when modeling such data with a survival model. We consider a Cox proportional hazards model for a marginal survival model. Under the sparsity assumption, we propose a penalized generalized estimating equation approach to select important variables and to estimate regression coefficients simultaneously in the marginal model. The proposed method explicitly models the correlation structure within clusters or correlated variables by using a prespecified working correlation matrix. The asymptotic properties of the estimators from the penalized generalized estimating equations are established, and the number of candidate covariates is allowed to increase in the same order as the number of clusters. We evaluate the performance of the proposed method through a simulation study and analyze two real datasets as applications.
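In the standard penalized-GEE literature, the estimating function the abstract refers to takes a form like the following (whether the paper uses exactly this penalty is an assumption):

```latex
U_{p}(\boldsymbol{\beta})
\;=\; \sum_{i=1}^{n} \mathbf{D}_i^{\top} \mathbf{V}_i^{-1}
\bigl(\mathbf{y}_i - \boldsymbol{\mu}_i(\boldsymbol{\beta})\bigr)
\;-\; n \, q_{\lambda}\bigl(|\boldsymbol{\beta}|\bigr)\operatorname{sgn}(\boldsymbol{\beta})
\;=\; \mathbf{0},
```

where D_i is the derivative of the mean with respect to the coefficients, V_i is built from the prespecified working correlation matrix, and q_lambda is the derivative of a sparsity-inducing penalty such as SCAD; solving it selects variables and estimates coefficients simultaneously.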
9

Koutoumanou, Eirini, Angie Wade, and Mario Cortina-Borja. "Local dependence in bivariate copulae with Beta marginals." Revista Colombiana de Estadística 40, no. 2 (July 1, 2017): 281–96. http://dx.doi.org/10.15446/rce.v40n2.59404.

Abstract:
The local dependence function (LDF) describes changes in the correlation structure of continuous bivariate random variables along their range. Bivariate density functions with Beta marginals can be used to model jointly a wide variety of data with bounded outcomes in the (0,1) range, e.g. proportions. In this paper we obtain expressions for the LDF of bivariate densities constructed using three different copula models (Frank, Gumbel and Joe) with Beta marginal distributions, present examples for each, and discuss an application of these models to analyse data collected in a study of marks obtained on a statistics exam by postgraduate students.
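A short sketch of the construction the abstract describes, for one of the three copulas (Frank); the parameter values below are illustrative, and the local dependence function itself is not reproduced here.

```python
import numpy as np
from scipy.stats import beta

def frank_density(u, v, theta):
    """Frank copula density c(u, v) for theta != 0."""
    num = theta * (1 - np.exp(-theta)) * np.exp(-theta * (u + v))
    den = ((1 - np.exp(-theta))
           - (1 - np.exp(-theta * u)) * (1 - np.exp(-theta * v))) ** 2
    return num / den

def joint_density(x, y, theta, a1, b1, a2, b2):
    """Bivariate density with Beta(a1, b1) and Beta(a2, b2) marginals
    coupled by a Frank copula: f(x, y) = c(F(x), G(y)) f(x) g(y)."""
    u, v = beta.cdf(x, a1, b1), beta.cdf(y, a2, b2)
    return frank_density(u, v, theta) * beta.pdf(x, a1, b1) * beta.pdf(y, a2, b2)

# e.g. joint density of two exam-mark proportions at (0.6, 0.7)
print(joint_density(0.6, 0.7, theta=4.0, a1=2, b1=2, a2=3, b2=2))
```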
10

Smirnov, Feodor A. "A New Set of Exact Form Factors." International Journal of Modern Physics A 9, no. 29 (November 20, 1994): 5121–43. http://dx.doi.org/10.1142/s0217751x94002077.

Abstract:
We present form factors for a wide range of integrable models which include marginal perturbations of the SU(2) WZNW model for arbitrary central charge and the principal chiral field model. The interesting structure of these form factors is discussed.

Dissertations / Theses on the topic "Marginal structure model"

1

Sekhi, Ikram. "Développement d'un alphabet structural intégrant la flexibilité des structures protéiques." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC084/document.

Abstract:
The purpose of this PhD is to provide a Structural Alphabet (SA) for more accurate characterization of protein three-dimensional (3D) structures, as well as to integrate the increasing protein 3D structure information currently available in the Protein Data Bank (PDB). The SA also takes into consideration the logic behind the sequence of structural fragments by using a hidden Markov Model (HMM). In this PhD, we describe a new structural alphabet, improving on the existing HMM-SA27 structural alphabet, called SAFlex (Structural Alphabet Flexibility), in order to take into account the uncertainty of data (missing data in PDB files) and the redundancy of protein structures. The new SAFlex structural alphabet therefore offers a rigorous and robust encoding model. This encoding takes the uncertainty into account by providing three encoding options: the maximum a posteriori (MAP), the marginal posterior distribution (POST), and the effective number of letters at each given position (NEFF). SAFlex also builds a consensus encoding from different replicates (multiple chains, monomers, and several homomers) of a single protein, thus allowing the detection of structural variability between different chains. The methodological advances and the achievement of the SAFlex alphabet are the main contributions of this PhD. We also present a new PDB parser (SAFlex-PDB) and demonstrate that it is of interest in both qualitative (detection of various errors) and quantitative (program optimization and parallelization) terms, by comparing it with two other parsers well known in bioinformatics (Biopython and BioJava). The SAFlex structural alphabet is being made available to the scientific community through a website; the web server represents the concrete contribution of this PhD, while the SAFlex-PDB parser represents an important contribution to the proper functioning of the proposed website. SAFlex can be used in various ways for a protein tertiary structure in a given PDB-format file: encoding the 3D structure and identifying and predicting missing data. It is, to date, the only alphabet able to encode and predict missing data in a 3D protein structure. Finally, these improvements are promising for exploring increasing protein redundancy data and obtaining useful quantifications of their flexibility.
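Of the three encoding options, NEFF has a compact standard form: the effective number of letters at a position is often computed as the exponential of the Shannon entropy of the posterior over letters. Whether SAFlex uses exactly this definition is an assumption; the sketch below only illustrates the concept.

```python
import numpy as np

def neff(posterior):
    """Effective number of letters at one position: exp of the Shannon
    entropy of the posterior distribution over structural letters
    (one common definition, assumed here for illustration)."""
    p = np.asarray(posterior, dtype=float)
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(np.exp(-np.sum(p * np.log(p))))

print(neff([1.0, 0.0, 0.0]))           # 1.0 (one certain letter)
print(neff([0.25, 0.25, 0.25, 0.25]))  # 4.0 (maximal ambiguity)
```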
2

Xiao, Yongling. "Flexible marginal structural models for survival analysis." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=107571.

Abstract:
In longitudinal studies, both treatments and covariates may vary throughout the follow-up period. Time-dependent (TD) Cox proportional hazards (PH) models can be used to model the effect of time-varying treatments on the hazard. However, two challenges exist in such modeling. First, accurate modeling of the effects of TD treatments on the hazard requires resolving the uncertainty about the etiological relevance of treatments taken in different time periods. The second challenge arises in the presence of TD confounders affected by prior treatments. Two different methodologies, weighted cumulative exposure (WCE) and marginal structural models (MSM), have recently been proposed to address these challenges separately, each assuming the absence of the other. In this thesis, I proposed combining these methodologies so as to address both challenges simultaneously, as both may commonly arise in combination in longitudinal studies. In the first manuscript, I proposed and validated a novel approach to implement the marginal structural Cox proportional hazards model (referred to as Cox MSM) with inverse-probability-of-treatment weighting (IPTW) directly via a weighted time-dependent Cox PH model, rather than via a pooled logistic regression approximation. The simulations show that the IPTW estimator yields consistent estimates of the causal effect of treatment, but it may suffer from large variability due to some extremely high IPT weights. The precision of the IPTW estimator can be improved by normalizing the stabilized IPT weights. Simple weight truncation has been proposed, and is commonly used in practice, as another solution to reduce the large variability of IPTW estimators. However, truncation levels are typically chosen based on ad hoc criteria that have not been systematically evaluated. Thus, in the second manuscript, I proposed a systematic data-adaptive approach to select the optimal truncation level that minimizes the estimated expected MSE of the IPTW estimates. In simulations, the new approach performed as well as approaches that simply truncate the stabilized weights at high percentiles, such as the 99th or 99.5th percentile of their distribution, in terms of reducing the variance and improving the MSE of the estimates. In the third manuscript, I proposed a new, flexible model to estimate the cumulative effect of a time-varying treatment in the presence of time-dependent confounders/mediators. The model incorporated weighted cumulative exposure modeling in a marginal structural Cox model. Specifically, weighted cumulative exposure was used to summarize the treatment history, defined as the weighted sum of past treatments. The function that assigns different weights to treatments received at different times was modeled with cubic regression splines. The stabilized IPT weights for each person at each visit were calculated to account for the time-varying confounding and mediation. The weighted Cox MSM, using stabilized IPT weights, was fitted to estimate the total causal cumulative effect of the treatments on the hazard. Simulations demonstrate that the proposed model can estimate the total causal cumulative effect, i.e., capture both the direct and the indirect (mediated by the TD confounder) treatment effects.
Bootstrap-based 95% confidence bounds for the estimated weight function were constructed and the impact of some extreme IPT weights on the estimates of the causal cumulative effect was explored.In the last manuscript, I applied the WCE MSM to the Swiss HIV Cohort Study (SHCS) to re-assess whether the cumulative exposure to abacavir therapy may increase the potential risk of cardiovascular events, such as myocardial infarction or the cardiovascular-related death.
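A minimal single-visit sketch of the stabilized IPT weights and percentile truncation discussed above (column names and the 99th-percentile cutoff are illustrative; the thesis multiplies such ratios over all visits per subject and selects the truncation level data-adaptively):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def stabilized_iptw(df, treat="A", baseline=["V"], confounders=["L"]):
    """sw = P(A | baseline) / P(A | baseline, time-varying confounders),
    evaluated at each subject's observed treatment, then truncated at
    the 99th percentile."""
    num = LogisticRegression().fit(df[baseline], df[treat])
    den = LogisticRegression().fit(df[baseline + confounders], df[treat])
    a = df[treat].to_numpy()
    p_num = np.where(a == 1, num.predict_proba(df[baseline])[:, 1],
                             num.predict_proba(df[baseline])[:, 0])
    p_den = np.where(a == 1, den.predict_proba(df[baseline + confounders])[:, 1],
                             den.predict_proba(df[baseline + confounders])[:, 0])
    sw = p_num / p_den
    return np.clip(sw, None, np.quantile(sw, 0.99))
```

The resulting weights would then be passed to a weighted time-dependent Cox fit, as in the first manuscript.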
3

Havercroft, William G. "Exploration of marginal structural models for survival outcomes." Thesis, University of Bristol, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684750.

Abstract:
A marginal structural model parameterises the distribution of an outcome given a treatment intervention, where such a distribution is the fundamental probabilistic representation of the causal effect of treatment on the outcome. Causal inference methods are designed to consistently estimate aspects of these causal distributions, in the presence of interference from non-causal associations which typically occur in observational data. One such method, which involves the application of inverse probability of treatment weights, directly targets the parameters of marginal structural models. The asymptotic properties and practical applicability of this method are well established, but little attention has been paid to its finite-sample performance. This is because simulating data from known distributions which are entirely suitable for such investigations generally presents a significant challenge, especially in scenarios where the outcome is survival time. We illuminate these issues, and propose and implement certain solutions, considering separately the cases of static (pre-determined) and dynamic (tailored) treatment interventions. In so doing, we explore both theoretical and practical aspects of marginal structural models for survival outcomes, and the associated inference method.
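For orientation, a marginal structural Cox model for a static treatment a parameterises the hazard of the counterfactual survival time T_a (the simplest form; the thesis also treats dynamic regimes):

```latex
\lambda_{T_a}(t) \;=\; \lambda_0(t)\,\exp(\psi a),
```

where psi is the causal log hazard ratio targeted by the inverse-probability-of-treatment weighted fit.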
4

Yang, Shibing. "Application of Marginal Structural Models in Pharmacoepidemiologic Studies." VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3471.

Abstract:
Background: Inverse-probability-of-treatment-weighted (IPTW) estimation of marginal structural models was proposed to adjust for time-varying confounders that are influenced by prior treatment use. It is unknown whether pharmacoepidemiologic studies that applied IPTW conformed to the recommendations proposed by methodological studies. In addition, no previous study has compared the performance of different analytic strategies adopted in IPTW analyses. Objectives: This project aims 1) to review the reporting practice of pharmacoepidemiologic studies that applied IPTW, 2) to compare the validity and precision of several approaches to constructing weights, and 3) to use IPTW to estimate the effectiveness of glucosamine and chondroitin in treating osteoarthritis. Methods: We systematically retrieved pharmacoepidemiologic studies that were published in 2012 and applied IPTW to estimate the effect of a time-varying treatment. Under a variety of simulated scenarios, we assessed the performance of four analytic approaches that are commonly used in studies conducting IPTW analyses. Finally, using data from the Osteoarthritis Initiative, we applied IPTW to estimate the long-term effectiveness of glucosamine and chondroitin in treating knee osteoarthritis. Results: The practice of reporting the use of IPTW in pharmacoepidemiologic studies was suboptimal. The majority of reviewed studies did not report that the positivity assumption was assessed, and several studies used unstabilized weights or did not report whether stabilized weights were used. In the simulations, we found that intention-to-treat analyses underestimated the actual treatment effect when there was a non-null treatment effect and treatment non-adherence; this underestimation was linearly correlated with adherence levels. As-treated analyses that took into account the complex mechanism of treatment use generated approximately unbiased estimates without sacrificing precision when the treatment effect was non-null. Finally, after adjustment for potential confounders with marginal structural models, we found no clinically meaningful benefits of glucosamine/chondroitin in relieving knee pain and stiffness, improving physical function, or slowing joint space narrowing. Conclusions: It may be prudent to develop best practices for reporting the use of IPTW. Studies performing intention-to-treat analyses should report the levels of adherence after treatment initiation, and studies performing as-treated analyses should take into account the complex mechanism of treatment use in weight construction.
5

Lusivika, Nzinga Clovis. "Estimation d’effets individuels de traitements pris en combinaison dans les études observationnelles." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUS218.

Abstract:
Randomized controlled trials cannot be implemented in all situations for estimating the effects of therapeutic strategies. Observational studies then constitute an alternative for evaluating treatment effects. We are specifically interested in four types of methodological difficulties in such studies: 1) confounding by indication; 2) the presence of time-dependent confounding; 3) the relationship between a given time-dependent treatment and its effect may vary over time; and 4) in real life, patients often receive multiple treatments, sequentially or simultaneously. In this context, the evaluation of individual treatment effects is a methodological challenge. The overall objective of this thesis was to propose a methodological framework in which these methodological difficulties are accommodated, allowing the individual effects of treatments to be correctly estimated within the context of multiple treatments in an observational study. We evaluated the performance of the marginal structural Cox model when estimating the individual and joint effects of two treatments and showed that it performed well in the presence of three different scenarios of time-dependent confounding. We also showed the importance of estimating the interaction term when exploring the treatment effect of combination therapy. We compared the performance of the weighted cumulative exposure (WCE) marginal structural Cox model with that of a conventional time-dependent WCE Cox model for estimating time-varying effects of treatments without bias in the presence of time-dependent confounding. Our results showed that the WCE Cox MSM performed better and can be applied to real data whatever the strength of the time-dependent confounding.
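One way to parameterise the joint effects of two treatments with an interaction term, as an illustrative form rather than the thesis's exact specification, is a marginal structural Cox model of the form:

```latex
\lambda_{T_{a_1, a_2}}(t) \;=\; \lambda_0(t)\,
\exp\bigl(\beta_1 a_1 + \beta_2 a_2 + \beta_3\, a_1 a_2\bigr),
```

so that the coefficient on the product term captures how the effect of one treatment is modified by the other.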
6

Hage, Fabio Sismotto El. "A estrutura tarifária de uso das redes de distribuição de energia elétrica no Brasil: análise crítica do modelo vigente e nova proposta metodológica." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3143/tde-04042011-122312/.

Abstract:
The present work discusses the question of efficient pricing in electric power distribution systems. The subject is approached from a discussion of the classical economic theory applied to energy production and transport models, through a critical evaluation of the current rate structure used in Brazil, to the description of a consistent and simplified proposal for electric power distribution rate design. The theory of natural monopolies is the background of an integrated discussion of classical rate design models for the electric energy transportation activity. In this analysis of the classical problem, the theories commonly considered are linear pricing, nonlinear pricing, and peak-load pricing. The current Brazilian methodology for the rate design of the usage of distribution networks, applied by the National Regulatory Agency (ANEEL), is revisited under a critical technical vision. As a result, some concepts are reassessed owing to their excessive operational complexity allied to a lack of economic and statistical foundation. Finally, a simplified methodology for the rate structure of the usage of electrical distribution networks is proposed. The methodology aims simultaneously at greater economic efficiency, simpler operational application, and a solid theoretical foundation, thereby reducing the arbitrariness and subjectivity found in the current methodology.
7

Mojaverian, Nassim. "Effects of sparse follow-up on marginal structural models for time-to-event data." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110690.

Abstract:
Background: Survival time is a common parameter of interest that can be estimated using Cox proportional hazards models when measured continuously. An alternative way to estimate hazard ratios is to divide time into equal-length intervals and define the by-interval outcome as 0 if the person is alive during the interval and 1 otherwise. In this discrete-time approximation, instead of using a Cox model, one should perform pooled logistic regression to obtain an unbiased estimate of survival time under the assumption of a low death rate per interval. This assumption is satisfied when shorter intervals are used so that fewer events occur in each; however, problems such as missing values then arise, because actual visits occur less frequently in a survival setting, and one must therefore account for the missing values. Objective: We investigate the effect of two methods of filling in missing data, Last Observation Carried Forward (LOCF) and Multiple Imputation (MI), as well as an available-case analysis. We compare these three approaches to complete-data analysis. Methods: Weighted pooled logistic regression is used to estimate the causal marginal treatment effect. Complete data were generated using Young's algorithm to obtain monthly information for all individuals, and from the complete data, observed data were selected by assuming follow-up visits occurred every six or three months. Thus, to analyze the observed data at a monthly level, we performed LOCF and MI to fill in the missing values and compared the results to those from a completely observed data analysis. We also included an analysis of the observed data without any imputation. We then applied these methods to the Canadian Co-infection Cohort to estimate the impact of alcohol consumption on liver fibrosis. Results: In most simulations, MI produced the least biased and least variable estimators, even outperforming analyses based on completely observed data. In the presence of stronger confounding, MI-based estimators were more biased but nevertheless less variable than the estimators based on completely observed data. Conclusion: Multiple imputation is superior to last observation carried forward and observed-data analysis when marginal structural models are used to adjust for time-varying exposure and covariates in the context of survival analysis and data are missing or infrequently measured.
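A compact sketch of the weighted pooled logistic regression at the core of the Methods, on hypothetical person-period data with one row per subject-month (column names are illustrative; with low per-interval event rates the fit approximates a weighted Cox MSM):

```python
import statsmodels.api as sm

def pooled_logistic(pp, outcome="y", covars=("treat", "month"), weights="sw"):
    """Weighted pooled logistic regression on person-period data pp:
    y = 1 in the interval of death, 0 otherwise; sw holds the stabilized
    inverse-probability weights (freq_weights and var_weights give the
    same point estimates here)."""
    X = sm.add_constant(pp[list(covars)])
    model = sm.GLM(pp[outcome], X, family=sm.families.Binomial(),
                   freq_weights=pp[weights])
    return model.fit()
```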
8

Fujii, Tomoko. "Human Atrial Natriuretic Peptide for Acute Kidney Injury in Adult Critically III Patients: A Multicenter Prospective Observational Study." Kyoto University, 2019. http://hdl.handle.net/2433/242410.

9

Pang, Menglan. "A study of non-collapsibility of the odds ratio via marginal structural and logistic regression models." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110697.

Abstract:
Background: It has been noted in epidemiology and biostatistics that when the odds ratio (OR) is used to measure the causal effect of a treatment or exposure, there is a discrepancy between the marginal OR and the conditional OR even in the absence of confounding. This is known as non-collapsibility of the OR. It is sometimes described (incorrectly) as a bias in the estimated treatment effect from a logistic regression model if an important covariate is omitted. Objectives: Distinguish confounding bias from non-collapsibility and measure the non-collapsibility effect on the OR in different scenarios. Methods: We used marginal structural models and standard logistic regression to measure the non-collapsibility effect and confounding bias. An analytic approach is proposed to assess the non-collapsibility effect in a point-exposure study. This approach can be used to verify the conditions for the absence of non-collapsibility and to examine the phenomenon of confounding without non-collapsibility. A graphical approach is employed to show the relationship between the non-collapsibility effect and the baseline risk or the marginal outcome probability, and it reveals the non-collapsibility behaviour with a range of different exposure effects and different covariate effects. In order to explore the non-collapsibility effect of the OR in the presence of time-varying confounding, an observational cohort study was simulated. Results and Conclusion: The total difference between the conditional and crude effects can be decomposed into a sum of the non-collapsibility effect and the confounding bias. We provide a general formula for expressing the non-collapsibility effect under different scenarios. Our analytic approach provided similar results to related formulae in the literature. Various interesting observations about non-collapsibility can be made from the different scenarios with or without confounding using the graphical approach. Somewhat surprisingly, the effect of the covariate plays a more important role in the non-collapsibility effect than does the effect of the exposure. In the presence of time-varying confounding, the non-collapsibility is comparable to the effect in the point-exposure study.
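The phenomenon is easy to reproduce numerically. In the sketch below, treatment A is independent of the covariate Z (so there is no confounding), yet the marginal OR is smaller than the conditional OR; the coefficients are illustrative, not taken from the thesis.

```python
from scipy.special import expit  # inverse logit

def odds(p):
    return p / (1 - p)

# Z ~ Bernoulli(0.5), independent of A; P(Y=1 | A, Z) = expit(-1 + A + 2Z),
# so the conditional OR is exp(1) ~= 2.72 in both strata of Z.
p = {(a, z): expit(-1 + a + 2 * z) for a in (0, 1) for z in (0, 1)}
conditional_or = odds(p[1, 0]) / odds(p[0, 0])

# Marginalize over Z (no confounding: same Z distribution in both arms).
marg = {a: 0.5 * p[a, 1] + 0.5 * p[a, 0] for a in (0, 1)}
marginal_or = odds(marg[1]) / odds(marg[0])

print(conditional_or)  # ~2.72
print(marginal_or)     # ~2.23 -- attenuated despite no confounding
```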
10

Börsum, Jakob. "Estimating Causal Effects Of Relapse Treatment On The Risk For Acute Myocardial Infarction Among Patients With Diffuse Large B-Cell Lymphoma." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-447241.

Abstract:
This empirical register study estimates average causal effects of relapse treatment on the risk of acute myocardial infarction (AMI) among patients with diffuse large B-cell lymphoma (DLBCL) within the potential-outcome framework. The report includes a brief introduction to causal inference and survival analysis and identifies the specific causal parameters of interest to be estimated. A cohort of 2887 Swedish DLBCL patients between 2007 and 2014 was included in the study, of whom 560 suffered a relapse. The relapse treatment is hypothesised to be cardiotoxic and to induce an increased risk of heart disease. The identifiability assumptions, which need to hold in order to estimate average causal effects, are assessed in this report. The patient cohort is weighted using inverse probability of treatment and censoring weights, and potential marginal survival curves are estimated from marginal structural Cox models. The resulting point estimate indicates a protective causal effect of relapse treatment on AMI, but estimated bootstrap confidence intervals suggest no significant effect at the 5% significance level.
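For reference, the subject-time weights in such an analysis are typically products of stabilized treatment and censoring weights; this is the standard construction, and the study's exact models may differ:

```latex
SW_i(t) \;=\;
\prod_{k \le t}
\frac{\Pr\bigl(A_k = a_{ik} \mid \bar{A}_{k-1}, V_i\bigr)}
     {\Pr\bigl(A_k = a_{ik} \mid \bar{A}_{k-1}, \bar{L}_{ik}\bigr)}
\;\times\;
\prod_{k \le t}
\frac{\Pr\bigl(C_k = 0 \mid \bar{A}_{k-1}, V_i, C_{k-1} = 0\bigr)}
     {\Pr\bigl(C_k = 0 \mid \bar{A}_{k-1}, \bar{L}_{ik}, C_{k-1} = 0\bigr)},
```

with A the treatment, C the censoring indicator, V baseline covariates, and L-bar the time-varying covariate history.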

Books on the topic "Marginal structure model"

1

Farber, Daniel A. Public Choice Theory and Legal Institutions. Edited by Francesco Parisi. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199684267.013.015.

Abstract:
This article asks what public choice can teach about legal institutions and their governing framework of public law. It begins with an overview and assessment of two important components of public choice: social choice theory (stemming from Arrow's Theorem) and interest group theory. It then considers the use of public choice models to explain the behaviour of legislatures, agencies, and courts. The core public choice insight is that institutional structures are responses to fundamental problems relating to collective action. However, the normative use of specific public choice models should be undertaken with caution. The models are likely to be most useful when they are informed by deep familiarity with specific institutional contexts; reforms are context-specific; and proposed changes are at the margin rather than involving major structural changes.
2

Sager, Jalel. National Energy Signatures. Edited by Debra J. Davidson and Matthias Gross. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780190633851.013.4.

Abstract:
Fossil fuels and their high yield of available energy regulate the global economy and structure its hierarchy of nations. When a “pulse” of energy—over months, years, decades, or centuries—enters the global industrial system, overshoot dynamics are often observed. The system enters a new mode of production, with new technical combinations. Once it does, it is extremely difficult to return to the old infrastructure, even though the energy resource that provided the pulse likely will yield less over the years (the US and its highway system provide one example of an infrastructural system conceived in a higher-yielding environment, the US oil boom of the early twentieth century). As the energy surplus, or marginal resource return, begins to diminish, output declines, slowing the rise of powerful nations, and transferring growth elsewhere. The effects of declining returns often show up in the monetary system.
3

Hodgkin, Kate. Autobiographical Writings. Edited by Andrew Hiscock and Helen Wilcox. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199672806.013.12.

Abstract:
Emerging out of the traditions of exemplary lives and self-analysis at the beginning of the seventeenth century, the genre of spiritual autobiography writing is fluid and unstable both textually and generically. The individualism that has often been taken to define the autobiographical project is problematized in these accounts, which tend to foreground self-transcendence over self-assertion, collective over individual identities, and exemplarity over uniqueness. The spiritual framework provides a language of self-narrative and self-analysis, structured around affliction and redemption, and privileging inward over outward experiences. As a mode which insists on the truth of experience, it allows marginal selves (including women and lower-class men) a public voice, above all in the gathered churches of the revolutionary decades and after, while also containing those voices within tight conventions. The simultaneous restrictions and liberations of these various frames offer important perspectives on debates about the early modern self.

Book chapters on the topic "Marginal structure model"

1

Rosenblum, Michael. "Marginal Structural Models." In Targeted Learning, 145–60. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4419-9782-1_9.

2

Bergsma, Wicher, Marcel Croon, and Jacques A. Hagenaars. "Causal Analyses: Structural Equation Models and (Quasi-)Experimental Designs." In Marginal Models, 155–90. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/b12532_5.

3

Bilmes, Jeffrey A. "Algorithms and Data Structures for Exact Computation of Marginals." In Handbook of Graphical Models, 83–116. Boca Raton, FL: CRC Press, 2018. http://dx.doi.org/10.1201/9780429463976-4.

4

Robins, James M. "Marginal Structural Models versus Structural Nested Models as Tools for Causal Inference." In Statistical Models in Epidemiology, the Environment, and Clinical Trials, 95–133. New York, NY: Springer New York, 2000. http://dx.doi.org/10.1007/978-1-4612-1284-3_2.

5

Shtessel, Yuri, Leonid Fridman, Antonio Rosales, and Chandrasekhara Bharath Panathula. "Practical Stability Phase and Gain Margins Concept." In Advances in Variable Structure Systems and Sliding Mode Control—Theory and Applications, 101–32. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-62896-7_4.

6

Boiko, Igor. "On Inherent Gain Margins of Sliding-Mode Control Systems." In Advances in Variable Structure Systems and Sliding Mode Control—Theory and Applications, 133–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-62896-7_5.

7

Larsson, Måns, Jennifer Alvén, and Fredrik Kahl. "Max-Margin Learning of Deep Structured Models for Semantic Segmentation." In Image Analysis, 28–40. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59129-2_3.

8

Petersen, Maya, Joshua Schwab, Elvin Geng, and Mark van der Laan. "Chapter 10: Evaluation of longitudinal dynamic regimes with and without marginal structural working models." In Adaptive Treatment Strategies in Practice, 157–86. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2015. http://dx.doi.org/10.1137/1.9781611974188.ch10.

9

Umhoefer, Paul J., Joe Dragovich, Jeff Cary, and David C. Engebretson. "Refinements of the “Baja British Columbia” Plate-Tectonic Model for Northward Translation Along the Margin of Western North America." In Deep Structure and Past Kinematics of Accreted Terranes, 101–11. Washington, D. C.: American Geophysical Union, 2013. http://dx.doi.org/10.1029/gm050p0101.

10

Chen, Haoran, Minghua Zhu, Xuesong Cai, Jufeng Luo, and Yunzhou Qiu. "Improved Model Structure with Cosine Margin OIM Loss for End-to-End Person Search." In MultiMedia Modeling, 419–30. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37731-1_34.


Conference papers on the topic "Marginal structure model"

1

Opara, R. O., K. C. Okafor, D. O. Dike, G. A. Chukwudebe, and R. M. Onoshakpor. "Towards Locational Marginal Pricing Model for Nigerian Electricity Tariff Structure using Optimal Power Flow Computation." In 2019 IEEE PES/IAS PowerAfrica. IEEE, 2019. http://dx.doi.org/10.1109/powerafrica.2019.8928813.

2

McDonald, Andrew, Pang-Ning Tan, and Lifeng Luo. "COMET Flows: Towards Generative Modeling of Multivariate Extremes and Tail Dependence." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/462.

Full text
Abstract:
Normalizing flows—a popular class of deep generative models—often fail to represent extreme phenomena observed in real-world processes. In particular, existing normalizing flow architectures struggle to model multivariate extremes, characterized by heavy-tailed marginal distributions and asymmetric tail dependence among variables. In light of this shortcoming, we propose COMET (COpula Multivariate ExTreme) Flows, which decompose the process of modeling a joint distribution into two parts: (i) modeling its marginal distributions, and (ii) modeling its copula distribution. COMET Flows capture heavy-tailed marginal distributions by combining a parametric tail belief at extreme quantiles of the marginals with an empirical kernel density function at mid-quantiles. In addition, COMET Flows capture asymmetric tail dependence among multivariate extremes by viewing such dependence as inducing a low-dimensional manifold structure in feature space. Experimental results on both synthetic and real-world datasets demonstrate the effectiveness of COMET Flows in capturing both heavy-tailed marginals and asymmetric tail dependence compared to other state-of-the-art baseline architectures. All code is available at https://github.com/andrewmcdonald27/COMETFlows.
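The two-step decomposition described in this abstract is easy to illustrate outside the flow setting. The sketch below is not the authors' COMET Flows implementation: a Gaussian copula stands in for their flow-based copula model, and empirical CDFs stand in for their kernel-density-plus-parametric-tail marginals. It only shows the general pattern of modeling marginals and dependence separately, then sampling by drawing from the copula and inverting the marginals.

```python
# Minimal sketch of the marginal/copula decomposition described above.
# NOT the authors' COMET Flows implementation: a Gaussian copula stands in
# for their flow-based copula, and empirical CDFs stand in for the
# kernel-density-plus-parametric-tail marginal model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_t(df=3, size=(5000, 2))          # heavy-tailed toy data
x[:, 1] += 0.8 * x[:, 0]                          # induce dependence

# (i) model the marginals: map each column to uniforms via its empirical CDF
u = np.column_stack([stats.rankdata(col) / (len(col) + 1) for col in x.T])

# (ii) model the copula: fit a Gaussian copula on the normal scores
z = stats.norm.ppf(u)
corr = np.corrcoef(z, rowvar=False)

# sample new data: draw from the copula, then invert the empirical marginals
z_new = rng.multivariate_normal(np.zeros(2), corr, size=1000)
u_new = stats.norm.cdf(z_new)
x_new = np.column_stack([np.quantile(x[:, j], u_new[:, j]) for j in range(2)])
print(x_new[:3])
```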
APA, Harvard, Vancouver, ISO, and other styles
3

Wang, Shengyong, and Qianjin Yue. "Ice Induced Vibration and Its Isolation of an Offshore Platform With Bucket Foundations in Marginal Oilfields." In ASME 2010 29th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/omae2010-20592.

Full text
Abstract:
The economic efficiency and structural characteristics of offshore platforms with bucket foundations are an important concern in the oil/gas exploration of marginal oil fields. The JZ9-3E riser platform, installed in the Bohai Sea, is such a structure; however, very severe ice-induced vibrations were observed on it. A finite element model was established to analyze the structural characteristics of this platform. Good agreement between modal analysis and full-scale observation was obtained, demonstrating the dynamic similarity between the numerical model and the prototype platform structure. To reduce the ice-induced vibrations, a fixed ice-breaking cone was installed on the platform. Both the FE model results and the observed data showed that the fixed ice-breaking cone can isolate the ice-induced vibrations effectively.
APA, Harvard, Vancouver, ISO, and other styles
4

Kruse, Benjamin, Clemens Münzer, Stefan Wölkl, Arquimedes Canedo, and Kristina Shea. "A Model-Based Functional Modeling and Library Approach for Mechatronic Systems in SysML." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70378.

Full text
Abstract:
Even though the concept development phase in product development is arguably the most important phase in mechanical and mechatronic design, the available computer-based support for this stage is marginal. This paper presents a new computational model-based method to improve the early phases of mechatronic product design and to facilitate the transition from early designs to detailed designs. The paper focuses on model-based Function-Behavior-Structure (FBS) libraries in SysML to support both the manual and computational generation of standard and innovative concepts. In this paper, an approach to re-usable functional models in SysML is presented. The method uses an operator-flow formulation of functions, based on the NIST functional basis, and is validated against a model of an electric car. The generated functional models are validated with respect to the consistency of the flows and tested by associating the functional model directly with the target product component structure. The results of the research are a new modeling approach for function and component libraries in SysML, an associated workflow for the modeling of mechatronic systems, and the necessary extensions of the NIST functional basis. The modeling approach provides the means for formal functional decomposition followed by an allocation of the functions to structural components that form the target structure.
APA, Harvard, Vancouver, ISO, and other styles
5

Zhao, Yuliang, Sheng Dong, Zihao Yang, and Lance Manuel. "Estimating Design Loads for Floating Structures Using Environmental Contours." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18453.

Full text
Abstract:
To ensure acceptable operation and/or survival of floating structures in extreme conditions, nonlinear time-domain simulations are often used to predict the structural response at the design stage. An environmental contour (EC) is commonly employed to identify critical sea states that serve as input for numerical simulations to assess the safety and performance of marine structures. In many studies, marginal and conditional distributions are defined to construct bivariate joint probability distributions for variables such as significant wave height and zero-crossing period; environmental contours can then be constructed using the inverse first-order reliability method (IFORM). This study adopts alternative models that describe the generalized dependence structure between the environmental variables using copulas; the Nataf transformation is also discussed as a special case. Environmental contours are constructed making use of measured wave data from moored buoys. Derived design loads are applied to a semi-submersible platform to assess possible differences. In addition, the long-term extremes of the mooring line tensions are estimated, accounting for uncertainties in the structural response with a 3D model (one that includes the response variability ignored by the EC approach), to help establish more accurate design loads using Monte Carlo simulation. The results offer a clear indication of the extreme response of the floating structure under the different models.
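As a rough illustration of the IFORM construction mentioned in the abstract, the sketch below traces a circle of radius beta in standard normal space and maps it to the physical variables through an assumed joint model. The Weibull marginal for significant wave height, the conditional lognormal for the zero-crossing period, and all parameter values are illustrative assumptions, not the paper's fitted distributions.

```python
# Hedged sketch of an IFORM environmental contour. The Weibull marginal for
# Hs, the conditional lognormal for Tz|Hs, and all parameter values are
# illustrative assumptions, not the paper's fitted model.
import numpy as np
from scipy import stats

T_return, t_ss = 50 * 365.25 * 24, 3.0        # 50-yr return period, 3-h sea states
p_exceed = 1.0 / (T_return / t_ss)
beta = stats.norm.ppf(1.0 - p_exceed)         # reliability index of the contour

theta = np.linspace(0.0, 2.0 * np.pi, 361)
u1, u2 = beta * np.cos(theta), beta * np.sin(theta)

# Rosenblatt inverse: U1 -> Hs via its Weibull marginal
hs = stats.weibull_min(c=1.5, scale=2.8).ppf(stats.norm.cdf(u1))

# U2 -> Tz via a conditional lognormal whose parameters depend on Hs
mu_tz = 1.1 + 0.3 * np.log(hs + 1.0)          # assumed dependence functions
sig_tz = 0.25 * np.exp(-0.1 * hs) + 0.05
tz = stats.lognorm(s=sig_tz, scale=np.exp(mu_tz)).ppf(stats.norm.cdf(u2))

for h, t in zip(hs[::60], tz[::60]):
    print(f"Hs = {h:5.2f} m, Tz = {t:5.2f} s")
```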
APA, Harvard, Vancouver, ISO, and other styles
6

Galtier, Thomas, Sayan Gupta, and Igor Rychlik. "Approximation of Crossing Intensities for Non Linear Responses Subjected to Non Gaussian Loadings." In ASME 2010 29th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/omae2010-20114.

Full text
Abstract:
Crossing intensities constitute an important response characteristic for randomly vibrating structures, especially if one is interested in estimating the risk of failure. This paper focuses on developing approximations by which estimates of the crossing intensities for the response of marine structures can be obtained in a computationally efficient manner when the loads are modeled as a special class of non-Gaussian processes, namely LMA processes. Ocean waves exhibit considerable non-Gaussianity, as marked by their skewed marginal distributions and heavy tails. Here, a new class of processes, the Laplace-driven Moving Average (LMA) processes, is used to model the ocean waves. LMA processes are non-Gaussian, strictly stationary, can in principle model any spectrum, and have the additional flexibility to model the skewness and kurtosis of the marginal distribution. The assumed structural behavior is limited to quadratic systems characterized by second-order kernels, which is common for marine structures. Thus, estimating the crossing intensities of the response involves studying the crossing characteristics of an LMA process passing through a second-order filter. A new computationally efficient hybrid method, which uses saddle-point approximations along with limited Monte Carlo simulations, is developed to compute the crossing intensity of the response. The proposed method is illustrated through numerical examples.
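A minimal numerical sketch of the ingredients in this abstract: a Laplace-driven moving average process built by convolving a kernel with skewed Laplace-type increments, a toy quadratic (second-order) response, and an empirical upcrossing-intensity estimate. The kernel, noise parameters, and quadratic filter are assumptions for illustration; the paper's hybrid saddle-point method is not reproduced here.

```python
# Hedged sketch of a Laplace-driven moving average (LMA) process and an
# empirical crossing-intensity estimate. Kernel shape, noise parameters and
# the quadratic filter are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.1, 200_000
t_kernel = np.arange(0, 20, dt)
g = np.exp(-0.3 * t_kernel) * np.sin(1.2 * t_kernel)   # assumed MA kernel

# Laplace-type increments: difference of two exponentials gives skewable noise
dL = rng.exponential(0.7, n) - rng.exponential(0.5, n)
x = np.convolve(dL, g, mode="same") * dt               # the LMA load process

y = x + 0.15 * x**2                                    # toy quadratic response

def upcrossing_intensity(z, level, dt):
    """Mean number of upcrossings of `level` per unit time."""
    up = (z[:-1] < level) & (z[1:] >= level)
    return up.sum() / (len(z) * dt)

for lev in (0.0, 0.5, 1.0):
    print(f"level {lev:+.1f}: {upcrossing_intensity(y, lev, dt):.4f} / time unit")
```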
APA, Harvard, Vancouver, ISO, and other styles
7

Mohamed Najib, Mohamed Aiman, Yong Ken Phoon, Wan Fatimah Wan Shamshudin, Shazana Sofia Mustapa, Aizuddin Khalid, and Yunus Alwi Yusof. "Limbayong: Decoding Industry Top Decile Reservoir Complexity for Marginal Deepwater Development." In SPE Annual Technical Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/210079-ms.

Full text
Abstract:
PETRONAS Carigali (PCSB) has developed a solution to monetize a deepwater field whose reservoir complexity ranks in the industry's top decile: the Limbayong field, Malaysia. The field's complexity is acknowledged by Independent Project Analysis (IPA) as industry top decile owing to a severely elongated structure (30 km long, 2.5 km wide) with varying fault frequency, vertical intercalation of thin-bed and thick sand reservoirs, and lateral compartmentalization, which impede effective well drainage and pressure maintenance. The results of the four (4) appraisal wells drilled since 2002 give a diverse subsurface understanding, indicating a possibly different depositional model and a greater degree of complication. This paper describes the key development challenges and the strategies that significantly improved the field's value proposition for FID. PCSB pivoted to focus its assessment on a low-realization case for development. It generated advanced reservoir mapping to simulate sand distribution and concentration through fault re-interpretation, refined grid resolution, and revised facies prediction, increasing the number of stratigraphic compartments. The team performed integrated subsurface-surface flow assurance modeling and validated turndown limits for production and operation. It subsequently iterated concepts for incremental recoverable reserves by high-grading producer-injector pairings and providing wells-facilities design provisions for a base case, an upgrade, or a future tie-in. The team formulated industry collaboration (IC) studies in each FEL phase, with drivers for implementing deepwater technology enablers in EPCIC, primarily via concept selection, engineering standardization, and design competition. Each distinct concept was ratified through the project economics group's value chain evaluation and stakeholder alignment. The breakthrough demonstrates the merit of the key strategies and templates for overcoming similar-scale project complexity with viable business cases. The IC affirmed a cost proposition 20 to 30% lower than the industry average for deepwater wells and facilities, positioning the project in the top quartile of project performance. It re-defined the minimum technical design and demonstrated a prominent value trade-off for scaling up concepts. This builds momentum to monetize highly complex reservoirs even further in the deepwater environment, which would otherwise remain undeveloped. There is potential for replication across nearly 800 MMboe in scattered fields within deepwater offshore Sabah, Malaysia. Deepwater offshore has a niche role in bridging the global transition between the energy mix offering and net-zero economy targets: it produces among the industry's smallest carbon footprints yet with high economic efficiency. Consolidated and efficient development strategies accelerate the decarbonization pathway. The paper advocates a hybrid capital project management model to manage extreme uncertainties with design thinking, lean-startup, and agile approaches.
APA, Harvard, Vancouver, ISO, and other styles
8

Chandrasekaran, Srinivasan, R. Sundaravadivelu, R. Pannerselvam, S. Madhuri, and D. Shyamala Varthini. "Experimental Investigations of Offshore Triceratops Under Regular Waves." In ASME 2011 30th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2011. http://dx.doi.org/10.1115/omae2011-49826.

Full text
Abstract:
For the development of marginal deepwater oil fields, many new platform concepts and designs are emerging. The offshore triceratops is a relatively new concept for ultra-deepwater oil exploration. The triceratops has desirable characteristics of both the TLP and the spar. It consists of a deck structure supported on three buoyant leg structures (BLS) connected through ball joints. These joints make the platform a stable, heave-restrained system; it does not pitch or roll but has limited surge and sway. The three BLS units are connected to the sea floor by a tethering system. The present study discusses detailed experimental investigations carried out to analyze the structural response of a 1:150 scale model of the triceratops under regular waves. The influence of the ball joints and of tether tension variation on the deck response is studied. The experiments show that the deck response of the triceratops is lower than that of the BLS and that the deck remains horizontal under the encountered wave loads.
APA, Harvard, Vancouver, ISO, and other styles
9

Vanem, Erik. "Analyzing Extreme Sea State Conditions by Time-Series Simulation." In ASME 2022 41st International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/omae2022-78795.

Full text
Abstract:
This paper presents an extreme value analysis of significant wave height data based on time-series simulation. A method to simulate time series with a given marginal distribution while preserving the autocorrelation structure in the data is applied to significant wave height data. Extreme value analysis is then performed by simulating from the fitted time-series model, which preserves both the marginal probability distribution and the autocorrelation. In this way, the effect of serial correlation on the extreme values can be taken into account without subsampling and de-clustering of the data. The effect of serial correlation on estimates of extreme wave conditions has previously been highlighted, and failure to account for it will typically lead to an overestimation of extreme conditions. This is demonstrated by the present study, which compares extreme value estimates from the simulated time-series model with estimates obtained directly from the marginal distribution under the assumption that 3-hourly significant wave heights are independent and identically distributed. A dataset of significant wave height, provided as part of a second benchmark exercise on environmental extremes presented at OMAE 2021, has been analysed.
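The translation-process idea in the abstract, simulating a series with a given marginal while preserving autocorrelation, can be sketched as follows. The Gaussian AR(1) driver, its coefficient, and the Weibull target marginal are illustrative assumptions rather than the paper's fitted model; the comparison at the end hints at why ignoring serial correlation inflates extreme-value estimates.

```python
# Hedged sketch of a translation process: simulate a Gaussian AR(1) series,
# then map it through norm.cdf and the target marginal's quantile function so
# both the marginal distribution and (approximately) the autocorrelation are
# preserved. All parameters are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, phi = 100_000, 0.9                        # series length, AR(1) coefficient
z = np.empty(n)
z[0] = rng.standard_normal()
for i in range(1, n):                        # Gaussian AR(1) driver, unit variance
    z[i] = phi * z[i - 1] + np.sqrt(1 - phi**2) * rng.standard_normal()

marginal = stats.weibull_min(c=1.6, scale=1.9)   # assumed Hs marginal
hs = marginal.ppf(stats.norm.cdf(z))             # translated (non-Gaussian) series

# Annual maxima from the dependent series vs. an i.i.d. assumption
per_year = 365 * 8                               # 3-hourly sea states
ann_max = hs[: (n // per_year) * per_year].reshape(-1, per_year).max(axis=1)
print("mean annual max (dependent):", ann_max.mean().round(2))
print("i.i.d. 1-yr quantile:       ", marginal.ppf(1 - 1 / per_year).round(2))
```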
APA, Harvard, Vancouver, ISO, and other styles
10

Hu, Zhen, and Sankaran Mahadevan. "Time-Dependent Reliability Analysis Using a New Multivariate Stochastic Load Model." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59185.

Full text
Abstract:
A common strategy for modeling stochastic loads in time-dependent reliability analysis is to describe the loads as independent Gaussian stochastic processes. This assumption does not hold for many engineering applications. This paper proposes a Vine-autoregressive-moving-average (Vine-ARMA) load model for time-dependent reliability analysis in problems with a vector of correlated non-Gaussian stochastic loads. The marginal stochastic processes are modeled as univariate ARMA models, and the correlations between the different univariate ARMA models are captured using a vine copula. The ARMA models maintain the correlation over time, while the vine copula represents not only the correlation between the different ARMA models but also their tail dependence. The developed Vine-ARMA model can therefore flexibly model a vector of high-dimensional correlated non-Gaussian stochastic processes while accounting for tail dependence. Owing to the complicated structure of the Vine-ARMA model, new challenges are introduced in time-dependent reliability analysis. To overcome these challenges, the Vine-ARMA model is integrated with a recently developed single-loop Kriging (SILK) surrogate modeling method. A hydrokinetic turbine blade subjected to a vector of correlated river flow loads is used to demonstrate the effectiveness of the proposed method.
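A minimal sketch of the Vine-ARMA construction described in the abstract: each load evolves by its own ARMA recursion, while cross-series dependence enters through the joint distribution of the innovations. For brevity, a Gaussian copula stands in for the vine copula (so the tail dependence the paper emphasizes is not captured here), and all parameters are made up for illustration.

```python
# Minimal sketch of the Vine-ARMA idea: per-load ARMA(1,1) recursions with
# cross-correlated innovations. A Gaussian copula stands in for the vine
# copula; all parameters are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, d = 10_000, 3
phi = np.array([0.7, 0.5, 0.8])              # AR(1) coefficients per load
theta = np.array([0.2, -0.1, 0.3])           # MA(1) coefficients per load

# Innovations: correlated uniforms from a Gaussian copula, mapped to
# heavy-tailed t marginals (the vine copula would replace this step).
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])
u = stats.norm.cdf(rng.multivariate_normal(np.zeros(d), R, size=n))
eps = stats.t(df=4).ppf(u)

x = np.zeros((n, d))
for i in range(1, n):                        # ARMA(1,1) recursion per load
    x[i] = phi * x[i - 1] + eps[i] + theta * eps[i - 1]

print("cross-correlations of the loads:\n", np.corrcoef(x, rowvar=False).round(2))
```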
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Marginal structure model"

1

Romero-Chamorro, José Vicente, and Sara Naranjo-Saldarriaga. Weather Shocks and Inflation Expectations in Semi-Structural Models. Banco de la República Colombia, November 2022. http://dx.doi.org/10.32468/be.1218.

Full text
Abstract:
Colombia is particularly affected by the El Niño Southern Oscillation (ENSO) weather fluctuations. In this context, this study explores how the adverse weather events linked to ENSO affect inflation expectations in Colombia and how to incorporate these second-round effects into a small open economy New Keynesian model. Using BVARx models, we provide evidence that inflation expectations obtained from surveys and break-even inflation measures are affected by weather supply shocks. We then use this stylised fact to modify one of the core forecasting models of the Banco de la República by incorporating the mechanisms through which weather-related shocks affect marginal costs and inflation expectations. We find that ENSO shocks played an important role in both inflation and the dynamics of inflation expectations, and that policymakers should take this into account.
APA, Harvard, Vancouver, ISO, and other styles
2

Nadal-Caraballo, Norberto C., Madison C. Yawn, Luke A. Aucoin, Meredith L. Carr, Jeffrey A. Melby, Efrain Ramos-Santiago, Victor M. Gonzalez, et al. Coastal Hazards System–Louisiana (CHS-LA). US Army Engineer Research and Development Center, August 2022. http://dx.doi.org/10.21079/11681/45286.

Full text
Abstract:
The US Army Engineer Research and Development Center (ERDC), Coastal and Hydraulics Laboratory (CHL) expanded the Coastal Hazards System (CHS) to quantify storm surge and wave hazards for coastal Louisiana. The CHS Louisiana (CHS-LA) coastal study was sponsored by the Louisiana Coastal Protection and Restoration Authority (CPRA) and the New Orleans District (MVN), US Army Corps of Engineers (USACE) to support Louisiana's critical coastal infrastructure and to ensure the effectiveness of coastal storm risk management projects. The CHS-LA applied the CHS Probabilistic Coastal Hazard Analysis (PCHA) framework to quantify tropical cyclone (TC) responses, leveraging new atmospheric and hydrodynamic numerical model simulations of synthetic TCs developed explicitly for the Louisiana region. This report focuses on documenting the PCHA conducted for the CHS-LA, including details related to the characterization of storm climate, storm sampling, storm recurrence rate estimation, marginal distributions, correlation and dependence structure of TC atmospheric-forcing parameters, development of augmented storm suites, and assignment of discrete storm weights to the synthetic TCs. As part of CHS-LA, coastal hazards were estimated within the study area for annual exceedance frequencies (AEFs) over the range of 10 yr⁻¹ to 1×10⁻⁴ yr⁻¹.
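As a rough illustration of how discrete storm weights translate into annual exceedance frequencies in a PCHA-style analysis, the sketch below assigns each synthetic storm a recurrence rate and sums the rates of the storms whose response exceeds a given level. The storm responses, the total storm rate, and the equal weighting are invented for illustration and are not taken from the CHS-LA study.

```python
# Hedged sketch: AEF of a response level as the summed recurrence rate of the
# synthetic storms exceeding it. Responses and weights are made-up values.
import numpy as np

rng = np.random.default_rng(5)
n_storms = 500
surge = rng.gumbel(2.0, 0.8, n_storms)        # peak surge per synthetic storm (m)
weights = np.full(n_storms, 0.3 / n_storms)   # storm rates summing to 0.3 / yr

levels = np.linspace(1.0, 7.0, 13)
aef = np.array([weights[surge > lv].sum() for lv in levels])
for lv, f in zip(levels, aef):
    print(f"surge > {lv:4.1f} m : AEF = {f:.2e} / yr")
```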
APA, Harvard, Vancouver, ISO, and other styles
3

Nadal-Caraballo, Norberto, Madison Yawn, Luke Aucoin, Meredith Carr, Jeffrey Melby, Efrain Ramos-Santiago, Fabian Garcia-Moreno, et al. Coastal Hazards System–Puerto Rico and US Virgin Islands (CHS-PR). Engineer Research and Development Center (U.S.), December 2022. http://dx.doi.org/10.21079/11681/46200.

Full text
Abstract:
The South Atlantic Coastal Study (SACS) was completed by the US Army Corps of Engineers to quantify storm surge and wave hazards allowing for the expansion of the Coastal Hazards System (CHS) to the South Atlantic Division (SAD) domain. The goal of the CHS-SACS was to quantify coastal storm hazards for present conditions and future sea level rise (SLR) scenarios to aid in reducing flooding risk and increasing resiliency in coastal environments. CHS-SACS was completed for three regions within the SAD domain, and this report focuses on the Coastal Hazards System–Puerto Rico and US Virgin Islands (CHS-PR). This study applied the CHS Probabilistic Coastal Hazard Analysis (PCHA) framework for quantifying tropical cyclone (TC) responses, leveraging new atmospheric and hydrodynamic numerical model simulations of synthetic TCs developed explicitly for the CHS-PR region. This report focuses on documenting the PCHA conducted for CHS-PR, including the characterization of storm climate, storm sampling, storm recurrence rate estimation, marginal distributions, correlation and dependence structure of TC atmospheric-forcing parameters, development of augmented storm suites, and assignment of discrete storm weights to the synthetic TCs. As part of CHS-PR, coastal hazards were estimated for annual exceedance frequencies over the range of 10 yr⁻¹ to 10⁻⁴ yr⁻¹.
APA, Harvard, Vancouver, ISO, and other styles
4

Iwasaki, T., and H. Shimamura. Velocity structure model determined from onshore-offshore seismic profiling across Vancouver Island and adjacent continental margin. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 1990. http://dx.doi.org/10.4095/129019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Hayward, N., and S. Paradis. Geophysical reassessment of the role of ancient lineaments on the development of the western margin of Laurentia and its sediment-hosted Zn-Pb deposits, Yukon and Northwest Territories. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/330038.

Full text
Abstract:
The role of crustal lineaments in the development of the western margin of Laurentia, the Selwyn basin, and the associated sediment-hosted Zn-Pb deposits (clastic-dominated, Mississippi Valley-type) in Yukon and NWT is reassessed through a new 3-D inversion strategy applied to new compilations of gravity and magnetic data. Regionally continuous, broadly NE-trending crustal lineaments, including the Liard line, Fort Norman structure, and Leith Ridge fault, were interpreted as having had a long-standing influence on craton, margin, and sedimentary basin development. However, multiple tectonic overprints, including terrane accretion, thrust faulting, and plutonism, obscure the region's history. The Liard line, related to a transfer fault that bounds the Macdonald Platform promontory, is refined through the integration of the new geophysical models with published geological data. The geophysical models support the continuity of the Fort Norman structure below the Selwyn basin, but the presence of the Leith Ridge fault is not supported in this area. The ENE-trending Mackenzie River lineament, traced from the Misty Creek Embayment to Great Bear Lake, is interpreted to mark the southern edge of a cratonic promontory. The North American craton is bounded by a NW-trending lineament interpreted as a crustal manifestation of lithospheric thinning of the Laurentian margin, as echoed by a change in the depth of the lithosphere-asthenosphere boundary. The structure is straddled by Mississippi Valley-type Zn-Pb occurrences, following their palinspastic restoration, and also defines the eastern limit of mid-Late Cretaceous granitic intrusions. Another NW-trending lineament, interpreted to be associated with a shallowing of lower crustal rocks, is coincident with clastic-dominated Zn-Pb occurrences.
APA, Harvard, Vancouver, ISO, and other styles
6

Harris, L. B., P. Adiban, and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.

Full text
Abstract:
Aeromagnetic and ground gravity data for the Canadian Superior Province, filtered to extract long-wavelength components and converted to pseudo-gravity, highlight deep, N-S trending, regional-scale, rectilinear faults and margins to discrete, competent mafic or felsic granulite blocks (i.e. at high angles to most regional mapped structures and sub-province boundaries) with little to no surface expression that are spatially associated with lode ('orogenic') Au and Ni-Cu-PGE-Cr occurrences. Statistical and machine learning analysis of the Red Lake-Stormy Lake region in the W Superior Province confirms the visual observation of a greater correlation between Au deposits and these deep N-S structures than with mapped surface to upper crustal, generally E-W trending, faults and shear zones. Porphyry Au, Ni, Mo and U-Th showings are also located above these deep transverse faults. Several well-defined concentric circular to elliptical structures identified in the Oxford Stull and Island Lake domains along the S boundary of the N Superior proto-craton, intersected by N- to NNW-striking extensional fractures and/or faults that transect the W Superior Province, again with little to no direct surface or upper crustal expression, are spatially associated with magmatic Ni-Cu-PGE-Cr and related mineralization and Au occurrences. The McFaulds Lake greenstone belt, aka the 'Ring of Fire', constitutes only a small, crescent-shaped belt within one of these concentric features, above which 2736-2733 Ma mafic-ultramafic intrusive bodies were emplaced. The Big Trout Lake igneous complex that hosts Cr-Pt-Pd-Rh mineralization west of the Ring of Fire lies within a smaller concentrically ringed feature at depth and, near the Ontario-Manitoba border, the Lingman Lake Au deposit, numerous Au occurrences and minor Ni showings are similarly located on concentric structures. Preliminary magnetotelluric (MT) interpretations suggest that these concentric structures also have an expression in the subcontinental lithospheric mantle (SCLM) and that lithospheric mantle resistivity features trend N-S as well as E-W. With diameters between ca. 90 km and 185 km, the elliptical structures are similar in size and internal geometry to coronae on Venus, which geomorphological, radar, and gravity interpretations suggest formed above mantle upwellings. The emplacement of mafic-ultramafic bodies hosting Ni-Cr-PGE mineralization along these ring-like structures at their intersection with coeval deep transverse, ca. N-S faults (viz. phi structures), along with their location along the margin of the N Superior proto-craton, is consistent with secondary mantle upwellings portrayed in numerical models of a mantle plume beneath a craton with a deep lithospheric keel within a regional N-S compressional regime. Early, regional ca. N-S faults in the W Superior were reactivated as dilatational antithetic (secondary Riedel/R') sinistral shears during dextral transpression and as extensional fractures and/or normal faults during N-S shortening. The Kapuskasing structural zone or uplift likely represents Proterozoic reactivation of a similar deep transverse structure. Preservation of discrete faults in the deep crust beneath zones of distributed Neoarchean dextral transcurrent to transpressional shear zones in the present-day upper crust suggests a 'millefeuille' lithospheric strength profile, with competent SCLM, mid- to deep, and upper crustal layers.
Mechanically strong deep crustal felsic and mafic granulite layers are attributed to dehydration and melt extraction. Intra-crustal decoupling along a ductile décollement in the W Superior led to the preservation of early-formed deep structures that acted as conduits for magma transport into the overlying crust and focused hydrothermal fluid flow during regional deformation. An increase in the thickness of semi-brittle layers in the lower crust during regional metamorphism would result in increased fracturing and faulting in the lower crust, facilitating hydrothermal and carbonic fluid flow along pathways linking the SCLM to the upper crust, a factor explaining the late timing of most orogenic Au. The results provide an important new dataset for regional prospectivity mapping, especially with machine learning, and for exploration targeting of Au and Ni-Cr-Cu-PGE mineralization. They also furnish evidence for the parautochthonous development of the S Superior Province during plume-related rifting, which cannot be explained by conventional subduction and arc-accretion models.
APA, Harvard, Vancouver, ISO, and other styles
7

Tanny, Josef, Gabriel Katul, Shabtai Cohen, and Meir Teitel. Application of Turbulent Transport Techniques for Quantifying Whole Canopy Evapotranspiration in Large Agricultural Structures: Measurement and Theory. United States Department of Agriculture, January 2011. http://dx.doi.org/10.32747/2011.7592121.bard.

Full text
Abstract:
Original objectives and revisions. The original objectives of this research, as stated in the approved proposal, were: 1. To establish guidelines for the use of turbulent transport techniques as an accurate and reliable tool for continuous measurements of whole-canopy ET and other scalar fluxes (e.g., heat and CO2) in large agricultural structures. 2. To conduct a detailed experimental study of flow patterns and turbulence characteristics in agricultural structures. 3. To derive theoretical models of air flow and scalar fluxes in agricultural structures that can guide the interpretation of TT measurements for a wide range of conditions. All the objectives were successfully addressed within the project. The only modification was that the study focused on screenhouses only, while it was originally planned to study large greenhouses as well. This was decided due to the large amount of field and theoretical work required to meet the objectives within screenhouses.
Background. In agricultural structures such as screenhouses and greenhouses, evapotranspiration (ET) is currently measured using lysimeters or sap flow gauges. These measurements provide ET estimates at the single-plant scale that must then be extrapolated, often statistically or empirically, to the whole canopy for irrigation scheduling purposes. On the other hand, turbulent transport techniques, like the eddy covariance, have become the standard for measuring whole-canopy evapotranspiration in the open, but their applicability to agricultural structures has not yet been established. The subject of this project is the application of turbulent transport techniques to estimate ET for irrigation scheduling within large agricultural structures.
Major conclusions and achievements. The major conclusions of this project are: (i) the eddy covariance technique is suitable for reliable measurements of scalar fluxes (e.g., evapotranspiration, sensible heat, CO2) in most types of large screenhouses under all climatic conditions tested, and all studies resulted in fair energy balance closures; (ii) comparison between measurements and theory shows that the model reliably predicts the turbulent flow characteristics and surface fluxes within screenhouses; (iii) flow characteristics within the screenhouse, like flux-variance similarity and turbulence intensity, were valid for the application of the eddy covariance technique in screenhouses with relatively dilute screens used for moderate shading and wind breaking, whereas in denser screens, usually used for insect exclusion, the development of turbulent conditions was marginal; (iv) installation of the sensors requires that the system's footprint lie within the limits of the screenhouse under study, as is the case in the open, and a footprint model available in the literature was found to be reliable in assessing the footprint under screenhouse conditions.
Implications, both scientific and agricultural. The study established for the first time, both experimentally and theoretically, the use of the eddy covariance technique for flux measurements within agricultural screenhouses. Such measurements, along with reliable theoretical models, will enable more accurate assessments of crop water use, which may lead to improved crop water management and increased water use efficiency of screenhouse crops.
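For readers unfamiliar with the eddy covariance technique evaluated in this project, the sketch below shows the core computation: the latent heat (evapotranspiration) flux over an averaging block is proportional to the covariance of the fluctuations of vertical wind speed and specific humidity. The synthetic 10 Hz series and all constants are illustrative assumptions, not measurements from the study.

```python
# Hedged sketch of the eddy covariance computation: LE = rho_a * Lv * mean(w'q')
# over an averaging block. The synthetic 10 Hz series is a made-up illustration.
import numpy as np

rng = np.random.default_rng(4)
fs, minutes = 10, 30                          # 10 Hz sampling, 30-min block
n = fs * 60 * minutes
w = 0.3 * rng.standard_normal(n)              # vertical wind (m/s), zero mean
q = 8e-3 + 0.4e-3 * (w / 0.3) + 0.2e-3 * rng.standard_normal(n)  # humidity (kg/kg)

rho_a = 1.2                                   # air density (kg/m^3)
Lv = 2.45e6                                   # latent heat of vaporization (J/kg)

# Reynolds decomposition: subtract the block means, then average the product
w_p, q_p = w - w.mean(), q - q.mean()
LE = rho_a * Lv * np.mean(w_p * q_p)          # latent heat flux (W/m^2)
ET = LE / Lv * 3600                           # water flux (kg/m^2/h, i.e. ~mm/h)
print(f"LE = {LE:.1f} W/m^2, ET = {ET:.3f} mm/h")
```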
APA, Harvard, Vancouver, ISO, and other styles
8

Lasiecka, I., and R. Triggiani. Increasing the Margin of Stability of Arbitrarily Finite Modes of Flexible Large Space Structures with Damping. Fort Belvoir, VA: Defense Technical Information Center, February 1991. http://dx.doi.org/10.21236/ada248555.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lasiecka, I., and R. Triggiani. Increasing the Margin of Stability of Arbitrarily Finite Modes of Flexible Large Space Structures with Damping. Fort Belvoir, VA: Defense Technical Information Center, August 1986. http://dx.doi.org/10.21236/ada172813.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Triggiani, R., and I. Lasiecka. Increasing the Margin of Stability of Arbitrarily Finite Modes of Flexible Large Space Structures with Damping. Fort Belvoir, VA: Defense Technical Information Center, February 1990. http://dx.doi.org/10.21236/ada221742.

Full text
APA, Harvard, Vancouver, ISO, and other styles