Theses on the topic "Weighted simulation"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 theses for your research on the topic "Weighted simulation".
Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Browse theses on a wide variety of disciplines and organize your bibliography correctly.
Shah, Sandeep R. "Perfect simulation of conditional and weighted models". Thesis, University of Warwick, 2004. http://wrap.warwick.ac.uk/59406/.
Graham, Mark. "The development and application of a simulation system for diffusion-weighted MRI". Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10047351/.
Giacalone, Marco. "Lambda_c detection using a weighted Bayesian PID approach". Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/11431/.
Potgieter, Andrew. "A Parallel Multidimensional Weighted Histogram Analysis Method". Thesis, University of Cape Town, 2014. http://pubs.cs.uct.ac.za/archive/00000986/.
Simmler, Urs. "Simulation-News in Creo 1.0 & 2.0 & 3.0 : weighted Links : "Tipps & Tricks"". Universitätsbibliothek Chemnitz, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-114511.
Kamunge, Daniel. "A non-linear weighted least squares gas turbine diagnostic approach and multi-fuel performance simulation". Thesis, Cranfield University, 2011. http://dspace.lib.cranfield.ac.uk/handle/1826/5612.
Landon, Colin Donald. "Weighted particle variance reduction of Direct Simulation Monte Carlo for the Bhatnagar-Gross-Krook collision operator". Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61882.
Direct Simulation Monte Carlo (DSMC), the prevalent stochastic particle method for high-speed rarefied gas flows, simulates the Boltzmann equation using distributions of representative particles. Although very efficient in producing samples of the distribution function, the slow convergence associated with statistical sampling makes DSMC simulation of low-signal situations problematic. In this thesis, we present a control-variate-based approach to obtain a variance-reduced DSMC method that dramatically enhances statistical convergence for low-signal problems. Here we focus on the Bhatnagar-Gross-Krook (BGK) approximation, which, as we show, exhibits special stability properties. The BGK collision operator, an approximation common in a variety of fields involving particle-mediated transport, drives the system towards a local equilibrium at a prescribed relaxation rate. Variance reduction is achieved by formulating the desired (non-equilibrium) simulation results in terms of the difference between a non-equilibrium and a correlated equilibrium simulation. Subtracting the two simulations results in substantial variance reduction, because the two simulations are correlated. Correlation is achieved using likelihood weights which relate the relative probability of occurrence of an equilibrium particle compared to a non-equilibrium particle. The BGK collision operator lends itself naturally to the development of unbiased, stable weight evaluation rules. Our variance-reduced solutions compare well with simple analytical solutions, and with solutions obtained using a variance-reduced BGK-based particle method that does not resemble DSMC as strongly. A number of algorithmic options are explored, and our final simulation method, (VR)²-BGK-DSMC, emerges as a simple and stable version of DSMC that can efficiently resolve arbitrarily low-signal flows.
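The control-variate idea at the heart of this abstract can be illustrated outside DSMC. The sketch below (plain Python, with invented toy distributions: an "equilibrium" N(0,1) with known mean and a weakly perturbed "non-equilibrium" N(0.1,1)) estimates a low-signal mean both directly and via likelihood weights; it is a minimal illustration of the mechanism, not the thesis's (VR)²-BGK-DSMC algorithm.

```python
import math
import random

random.seed(42)

EPS = 0.1   # small "signal": mean shift of the non-equilibrium distribution
N = 20000   # number of samples ("particles")

def weight(x):
    # Likelihood weight relating the equilibrium density N(0,1) to the
    # non-equilibrium density N(EPS,1) at a sampled point x:
    #   w(x) = p_eq(x) / p_noneq(x) = exp(EPS**2 / 2 - EPS * x)
    return math.exp(EPS * EPS / 2.0 - EPS * x)

samples = [random.gauss(EPS, 1.0) for _ in range(N)]

# Direct estimator of the non-equilibrium mean: average of x itself.
direct = list(samples)

# Variance-reduced estimator: known equilibrium mean (0 here) plus the
# correlated difference x*(1 - w(x)), which is small when EPS is small.
vr = [x * (1.0 - weight(x)) for x in samples]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / (len(v) - 1)

est_direct = mean(direct)
est_vr = 0.0 + mean(vr)   # equilibrium mean is exactly 0 in this toy setup

print(est_direct, est_vr, var(direct), var(vr))
```

Both estimators are unbiased for the true mean EPS, but the per-sample variance of the difference estimator is smaller by roughly a factor of EPS², which is the essence of the low-signal variance reduction described above.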
Xu, Zhouyi. "Stochastic Modeling and Simulation of Gene Networks". Scholarly Repository, 2010. http://scholarlyrepository.miami.edu/oa_dissertations/645.
Klann, Dirk. "The Role of Information Technology in the Airport Business: A Retail-Weighted Resource Management Approach for Capacity-Constrained Airports". Thesis, Cranfield University, 2009. http://hdl.handle.net/1826/4474.
Pant, Mohan Dev. "Simulating Univariate and Multivariate Burr Type III and Type XII Distributions Through the Method of L-Moments". OpenSIUC, 2011. https://opensiuc.lib.siu.edu/dissertations/401.
Can, Mutan Oya. "Comparison Of Regression Techniques Via Monte Carlo Simulation". Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605175/index.pdf.
Nagahara, Shizue. "Studies on Functional Magnetic Resonance Imaging with Higher Spatial and Temporal Resolutions". 京都大学 (Kyoto University), 2014. http://hdl.handle.net/2433/188540.
Ruiz, Fernández Guillermo. "3D reconstruction for plastic surgery simulation based on statistical shape models". Doctoral thesis, Universitat Pompeu Fabra, 2018. http://hdl.handle.net/10803/667049.
This thesis was carried out at Crisalix in collaboration with Universitat Pompeu Fabra under the Industrial Doctorates plan. Crisalix aims to improve communication between plastic surgery professionals and patients by providing an answer to the question that arises most frequently during the planning of a surgical operation: "How will I look after the surgery?". The solution proposed by Crisalix is based on 3D imaging technology. This technology generates a 3D reconstruction of the operated area of the patient, followed by the possibility of creating multiple simulations representing the possible outcomes of the surgery. This thesis presents a system capable of reconstructing faces and breasts of plastic surgery patients from 2D photos and scans. The 3D reconstruction of an object is a difficult problem due to the presence of ambiguities. Methods based on statistical models are well suited to mitigate them. In this work, we followed the intuition of maximizing the use of prior information, introducing it into the statistical model to improve its properties. First, we explore Active Shape Models (ASM), a well-known method used to align contours of 2D objects. However, once the shape corrections of the statistical model are applied, it is difficult to keep the available prior information (for example, a small given set of points) unaltered. We propose a new weighted projection with a regularization term, which yields shapes that satisfy the imposed shape constraints while remaining plausible according to the statistical model. Second, we extend the methodology to the so-called 3D Morphable Models (3DMM), a method widely used for 3D reconstruction. However, existing 3DMM methods present some limitations.
Some are based on non-linear optimizations that are computationally expensive and can get trapped in local minima. Another limitation is that not all methods provide the resolution needed to represent the details of the anatomy accurately. Given the medical use of the application, accuracy and robustness are very important factors to take into account. We show how the initialization and fitting of 3DMM can be improved using the proposed weighted projection with regularization. Finally, we present a system capable of reconstructing 3D models of plastic surgery patients from two possible types of input: 2D images and 3D scans. Our method is used at several stages of the reconstruction process: shape alignment in images, and 3DMM initialization and fitting. The developed methods have been integrated into Crisalix's production environment, proving their validity.
Pacifico, Claudia. "Comparison of propensity score based methods for estimating marginal hazard ratios with composite unweighted and weighted endpoints: simulation study and application to hepatocellular carcinoma". Doctoral thesis, Università degli Studi di Milano-Bicocca, 2021. http://hdl.handle.net/10281/306601.
Introduction: My research activity uses data from the HERCOLES study, a retrospective study on hepatocellular carcinoma, as an application example for the comparison of statistical methods for estimating the marginal effect of a given treatment on standard (unweighted) survival endpoints and weighted composite endpoints. This last approach, unexplored to date, is motivated by the need to take into account the different clinical relevance of cause-specific events. In particular, death is considered the worst event, but greater relevance is also given to local recurrence compared to non-local recurrence. To evaluate the statistical performance of these methods, two simulation protocols were developed. Methods: To remove or reduce the effect of confounders (characteristics of the subject and other baseline factors that determine systematic differences between treatment groups) in order to quantify a marginal effect, it is necessary to use appropriate statistical methods based on the propensity score (PS): the probability that a subject is assigned to a treatment conditional on the covariates measured at baseline. In my thesis I considered some of the PS-based methods available in the literature (Austin 2013): PS as a covariate with spline transformation; PS as a categorical covariate stratified with respect to quantiles; matching on the PS; and inverse probability weighting (IPW). The marginal effect on the unweighted composite endpoint is measured in terms of the marginal hazard ratio (HR) estimated using a Cox model. As regards the weighted composite endpoint, the estimator of the treatment effect is the non-parametric estimator of the ratio between cumulative hazards proposed by Ozga and Rauch (2019). Simulation protocol: In both simulation protocols, the data generation mechanism is similar to that used by Austin (2013).
Specifically, with regard to the unweighted endpoint (disease-free survival), I simulated three scenarios by considering three values for the marginal HR: HR=1 (scenario a), HR=1.5 (scenario b) and HR=2 (scenario c). In each scenario, I simulated 10,000 datasets of 1,000 subjects each, and for the estimation of the PS I generated 12 confounders. The simulation study for the weighted endpoint uses the same scenarios (a, b, c) combined with three sets of weights for the two single endpoints: (w1,w2)=(1,1); (w1,w2)=(1,0.5); (w1,w2)=(1,0.8). In each scenario I simulated 1,000 datasets of 1,000 subjects each, and for the estimation of the PS I generated 3 confounders. Furthermore, I considered only the two methods regarded in the literature as the most robust: IPW and PS matching (Austin 2016). Results: The results for the unweighted composite endpoint confirm what is already known in the literature: IPW is the most robust PS-based method, followed by PS matching. The innovative aspect of my thesis concerns the implementation of simulation studies for evaluating the performance of PS-based methods in estimating the marginal effect of a given treatment with respect to a weighted composite survival endpoint: IPW is confirmed as the most accurate and precise method.
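As a hedged illustration of the IPW idea discussed in this abstract: the toy Python sketch below estimates a marginal treatment effect on a simple continuous outcome (not a hazard ratio, and using the true propensity score rather than an estimated one; all parameters are invented for the demo), contrasting the confounded naive difference in means with the inverse-probability-weighted estimate.

```python
import math
import random

random.seed(7)

N = 20000

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Simulate a confounder X, treatment T assigned with propensity e(X),
# and outcome Y with a true marginal treatment effect of 2.0.
data = []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    e = sigmoid(0.5 * x)                 # true propensity score
    t = 1 if random.random() < e else 0
    y = 2.0 * t + x + random.gauss(0.0, 1.0)
    data.append((x, e, t, y))

# Naive comparison of treated vs. untreated means (confounded by X).
treated = [y for _, _, t, y in data if t == 1]
control = [y for _, _, t, y in data if t == 0]
naive = sum(treated) / len(treated) - sum(control) / len(control)

# IPW (Horvitz-Thompson) estimator: weight each subject by the inverse
# probability of the treatment actually received.
ipw = (sum(t * y / e for _, e, t, y in data) / N
       - sum((1 - t) * y / (1 - e) for _, e, t, y in data) / N)

print(naive, ipw)
```

The naive contrast absorbs the imbalance in X between treatment groups, while weighting by 1/e(X) and 1/(1-e(X)) recovers the marginal effect; the same reweighting logic underlies IPW estimation of marginal hazard ratios with a weighted Cox model.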
Fraga, Guilherme Crivelli. "Análise da influência das propriedades radiativas de um meio participante na interação turbulência-radiação em um escoamento interno não reativo". Biblioteca Digital de Teses e Dissertações da UFRGS, 2016. http://hdl.handle.net/10183/142495.
Turbulence-radiation interaction (TRI) results from the highly non-linear coupling between fluctuations of radiation intensity and fluctuations of temperature and chemical composition of the medium, and its relevance in a number of high-temperature problems, especially when chemical reactions are included, has been demonstrated experimentally, theoretically, and numerically. In the present study, the TRI is analyzed in a channel flow of a non-reactive participating gas for different turbulence intensities of the flow at the inlet and considering two distinct species for the medium composition (carbon dioxide and water vapor). The central objective is to evaluate how the inclusion or not of the spectral variation of the radiative properties of a participating gas in the radiative transfer calculations affects the turbulence-radiation interaction. With this purpose, numerical simulations are performed using the Fortran-based computational fluid dynamics code Fire Dynamics Simulator, which employs the finite volume method to solve a form of the fundamental equations (i.e., the mass, momentum and energy balances and the state equation) appropriate for low-Mach-number flows, through an explicit second-order (both in time and in space) core algorithm. Turbulence is modeled by the large eddy simulation (LES) approach, using the dynamic Smagorinsky model to close the subgrid-scale terms; for the thermal radiation part of the problem, the finite volume method is used for the discretization of the radiative transfer equation, and the gray gas and weighted-sum-of-gray-gases (WSGG) models are implemented as a way to omit and consider the spectral dependence of the radiative properties, respectively.
The TRI magnitude in the problem is evaluated by differences between values for the time-averaged heat fluxes at the wall (convective and radiative) and for the time-averaged radiative heat source calculated accounting for and neglecting the turbulence-radiation interaction effects. In general, TRI had little importance over all the considered cases, a conclusion that agrees with results of previous studies. When using the WSGG model, the contributions of the phenomenon were greater than with the gray gas hypothesis, demonstrating that the inclusion of the spectral variation in the solution of the radiative problem has an impact on the TRI effects. Furthermore, this work presents a discussion, partly unprecedented in the context of the turbulence-radiation interaction, about the different methodologies that can be used for the TRI analysis. Finally, a correction factor is proposed for the time-averaged radiative heat source in the WSGG model, which is then validated by its implementation in the simulated cases. In future studies, a sensitivity analysis on the terms that compose this factor can lead to a better understanding of how fluctuations of temperature correlate with the turbulence-radiation interaction phenomenon.
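The weighted-sum-of-gray-gases closure mentioned in this abstract writes the total emissivity of a gas column as a sum of gray-gas contributions, ε = Σᵢ aᵢ (1 − exp(−κᵢ p L)). A minimal sketch of that formula follows; the coefficients are invented for illustration, not a fitted WSGG correlation (real models fit the weights aᵢ as polynomials in temperature against spectral databases).

```python
import math

# Illustrative WSGG parameters: gray-gas pressure absorption coefficients
# (1/(atm*m)) and constant weights (sum of weights <= 1; the remainder
# represents the transparent "windows" of the spectrum). Invented values.
KAPPA = [0.4, 7.0, 120.0]   # absorption coefficient of each gray gas
A = [0.3, 0.2, 0.1]         # weight of each gray gas

def wsgg_emissivity(p_atm, path_m):
    """Total emissivity of a column: sum over gray gases of
    a_i * (1 - exp(-kappa_i * p * L))."""
    return sum(a * (1.0 - math.exp(-k * p_atm * path_m))
               for a, k in zip(A, KAPPA))

eps_short = wsgg_emissivity(0.2, 0.1)   # short optical path
eps_long = wsgg_emissivity(0.2, 5.0)    # long optical path
print(eps_short, eps_long)
```

Emissivity grows with path length and saturates below the sum of the weights, which is the behavior the WSGG model uses to mimic a real absorption spectrum with a handful of gray gases.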
Luo, Hao. "Some Aspects on Confirmatory Factor Analysis of Ordinal Variables and Generating Non-normal Data". Doctoral thesis, Uppsala universitet, Statistiska institutionen, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-149423.
Ofe, Hosea and Peter Okah. "Value at Risk: A Standard Tool in Measuring Risk : A Quantitative Study on Stock Portfolio". Thesis, Umeå universitet, Handelshögskolan vid Umeå universitet, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-45303.
Texto completoClément, Jean-Baptiste. "Simulation numérique des écoulements en milieu poreux non-saturés par une méthode de Galerkine discontinue adaptative : application aux plages sableuses". Electronic Thesis or Diss., Toulon, 2021. http://www.theses.fr/2021TOUL0022.
Flows in unsaturated porous media are modelled by the Richards equation, which is a degenerate parabolic nonlinear equation. Its limitations and the challenges raised by its numerical solution are laid out. Getting robust, accurate and cost-effective results is difficult, in particular because of moving sharp wetting fronts due to the nonlinear hydraulic properties. The Richards equation is discretized by a discontinuous Galerkin method in space and backward differentiation formulas in time. The resulting numerical scheme is conservative, high-order and very flexible. Thereby, complex boundary conditions are included easily, such as seepage conditions or dynamic forcing. Moreover, an adaptive strategy is proposed. Adaptive time stepping makes nonlinear convergence robust, and a block-based adaptive mesh refinement is used to reach the required accuracy cost-effectively. A suitable a posteriori error indicator helps the mesh capture sharp wetting fronts, which are also better approximated by a discontinuity introduced in the solution thanks to a weighted discontinuous Galerkin method. The approach is checked through various test cases and a 2D benchmark. Numerical simulations are compared with laboratory experiments of water table recharge/drainage and a large-scale wetting experiment, following reservoir impoundment of the multi-material La Verne dam. This demanding case shows the potential of the strategy developed in this thesis. Finally, applications are handled to simulate groundwater flows under the swash zone of sandy beaches, in comparison with experimental observations.
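The role of adaptive time stepping in keeping nonlinear (Newton) convergence robust, as described in this abstract, can be sketched on a scalar stand-in problem. The snippet below is not the Richards equation or a discontinuous Galerkin discretization; it only shows the step-control logic (halve the step when Newton fails, cautiously grow it after a success) on du/dt = −u² with backward Euler, whose exact solution is u(t) = 1/(1+t).

```python
def newton_step(u_old, dt, tol=1e-12, max_iter=8):
    """Solve the backward-Euler equation u - u_old + dt*u**2 = 0 for u.

    Returns (u, converged)."""
    u = u_old
    for _ in range(max_iter):
        f = u - u_old + dt * u * u
        fp = 1.0 + 2.0 * dt * u          # Jacobian of the residual
        du = f / fp
        u -= du
        if abs(du) < tol:
            return u, True
    return u, False

t, u = 0.0, 1.0
dt, dt_max, t_end = 0.005, 0.02, 1.0
while t < t_end - 1e-14:
    dt = min(dt, t_end - t)              # land exactly on t_end
    u_new, ok = newton_step(u, dt)
    if ok:
        t += dt
        u = u_new
        dt = min(1.2 * dt, dt_max)       # grow the step after success
    else:
        dt *= 0.5                        # robustness: retry smaller step

print(u)  # exact value at t = 1 is 1/(1+1) = 0.5
```

For this mild problem Newton rarely fails, so the halving branch is mostly idle; on strongly nonlinear problems like Richards' equation, that fallback is what keeps the integration from stalling at sharp wetting fronts.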
Daviaud, Bérangère. "Méthodes formelles pour les systèmes réactifs, applications au live coding". Electronic Thesis or Diss., Angers, 2024. http://www.theses.fr/2024ANGE0032.
The formalism of discrete event systems and reactive systems provides an effective abstract framework for representing and studying a wide range of systems. In this thesis, we leverage this formalism to model a live coding score whose interpretation is conditioned by the occurrence of specific events. This approach led us to investigate formal methods for discrete event systems that enable their modeling, analysis, and the design of appropriate control strategies. This study resulted in several contributions, particularly regarding the expressiveness of weighted automata, the formal verification of temporal properties, and the existence of weighted simulation. The final part of this dissertation introduces the formalism of the interactive score, as well as the Troop Interactive library, developed to make interactive score writing and the realization of interactive sound performances based on live coding practices more accessible.
Liu, Chunde. "Creation of hot summer years and evaluation of overheating risk at a high spatial resolution under a changing climate". Thesis, University of Bath, 2017. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.725405.
Massire, Aurélien. "Non-selective Refocusing Pulse Design in Parallel Transmission for Magnetic Resonance Imaging of the Human Brain at Ultra High Field". Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112180/document.
In Magnetic Resonance Imaging (MRI), increasing the static magnetic field strength provides, in theory, a higher signal-to-noise ratio, thereby improving overall image quality. The purpose of ultra-high-field MRI is to achieve a spatial resolution high enough to distinguish structures so fine that they are currently impossible to view non-invasively. However, at such static magnetic field strengths, the wavelength of the electromagnetic waves transmitted to flip the water proton spins is of the same order of magnitude as the scanned object. Wave interference phenomena are then observed, caused by the radiofrequency (RF) field inhomogeneity within the object. These generate signal and/or contrast artifacts in MR images, making their exploitation difficult, if not impossible, in certain areas of the body. It is therefore crucial to provide solutions that mitigate the non-uniformity of the spin excitation; failing this, these very-high-field imaging systems will not reach their full potential. For relevant high-field clinical diagnosis, it is therefore necessary to create RF pulses that homogenize the excitation of all spins (here, of the human brain), optimized for each individual to be imaged. For this, an 8-channel parallel transmission (pTX) system was installed in our 7 Tesla scanner. While most clinical MRI systems use only a single transmission channel, the pTX extension allows different RF pulse shapes to be played simultaneously on all channels. The resulting interference sum must be optimized to reduce the non-uniformity typically observed. The objective of this thesis is to synthesize such tailored RF pulses using parallel transmission. These pulses have an additional constraint: compliance with the international limits on radiofrequency exposure, which induces a temperature rise in the tissue.
To this end, many electromagnetic and temperature simulations were carried out at the outset of this thesis, in order to assess the relationship between the recommended RF exposure limits and the temperature rise actually predicted in tissues. This thesis focuses specifically on the design of the RF refocusing pulses used in non-selective MRI sequences based on the spin echo. Initially, a single RF pulse was generated for a simple application: the reversal of spin dephasing in the transverse plane, as part of a classic spin-echo sequence. Subsequently, sequences with very long refocusing echo trains applied to in vivo imaging are considered. In all cases, the mathematical operator acting on the magnetization, rather than its final state as is done conventionally, is optimized. The gain for high-field imaging is clearly visible, as the necessary mathematical operations (that is to say, the rotations of the spins) are performed with much greater fidelity than with state-of-the-art methods. For this, the generation of RF pulses combines a k-space-based spin excitation method, the kT-points, with an optimal-control optimization algorithm called Gradient Ascent Pulse Engineering (GRAPE). This design is relatively fast thanks to analytical calculations rather than finite-difference methods. The inclusion of a large number of parameters requires the use of GPUs (Graphics Processing Units) to achieve computation times compatible with clinical examinations. This RF pulse design method was successfully validated experimentally on the NeuroSpin 7 Tesla scanner, with a cohort of healthy volunteers. An imaging protocol was developed to assess the image-quality improvement obtained with these RF pulses compared to typically used non-optimized RF pulses.
All methodological developments made during this thesis have contributed to improving the performance of ultra-high-field MRI at NeuroSpin, while increasing the number of MRI sequences compatible with parallel transmission.
Parmar, Rajbir Singh. "Simulation of weight gain and feed consumption of turkeys". Diss., Virginia Polytechnic Institute and State University, 1989. http://hdl.handle.net/10919/54257.
Silva, Wesley Bertoli da. "Distribuição de Poisson bivariada aplicada à previsão de resultados esportivos". Universidade Federal de São Carlos, 2014. https://repositorio.ufscar.br/handle/ufscar/4586.
Financiadora de Estudos e Projetos
The modelling of paired count data is a topic that has been frequently discussed in several threads of research. In particular, we can cite bivariate counts, such as the analysis of sports scores. Accordingly, in this work we present the bivariate Poisson distribution for modelling positively correlated scores. The possible independence between counts is also addressed through the double Poisson model, which arises as a special case of the bivariate Poisson model. The main characteristics and properties of these models are presented, and a simulation study is conducted to evaluate the behavior of the estimates for different sample sizes. Considering the possibility of modelling the parameters through the insertion of predictor variables, we present the structure of the bivariate Poisson regression model as a general case, as well as the structure of an effects model for application to sports data. In particular, in this work we consider applications to data from the 2012 Brazilian Championship Série A, in which the effects are estimated by the double Poisson and bivariate Poisson models. Once the fits are obtained, the probabilities of score occurrence are estimated, and from these we obtain forecasts for the outcomes. In order to obtain more accurate forecasts, we present the weighted likelihood method, with which it is possible to weight the relevance of the data according to the time at which they were observed.
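The positive correlation structure described in this abstract is commonly obtained by trivariate reduction: X = A + C and Y = B + C with independent Poisson components, so that Cov(X, Y) equals the rate of the shared component. The hedged sketch below simulates such a bivariate Poisson pair with invented rates and checks its moments; it does not reproduce the thesis's regression or weighted-likelihood fits.

```python
import math
import random

random.seed(123)

def poisson(lam):
    """Knuth's multiplication algorithm; fine for the small rates used here."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Trivariate reduction: X = A + C, Y = B + C, so X ~ Poisson(l1 + l3),
# Y ~ Poisson(l2 + l3), and Cov(X, Y) = l3 > 0 -- the positive
# correlation used when modelling paired sports scores.
l1, l2, l3 = 1.5, 1.0, 1.0
N = 20000
pairs = []
for _ in range(N):
    c = poisson(l3)                      # shared component
    pairs.append((poisson(l1) + c, poisson(l2) + c))

mx = sum(x for x, _ in pairs) / N
my = sum(y for _, y in pairs) / N
cov = sum((x - mx) * (y - my) for x, y in pairs) / (N - 1)
print(mx, my, cov)
```

Setting l3 = 0 recovers two independent Poisson counts, which is the double Poisson special case mentioned above.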
Watanabe, Alexandre Hiroshi. "Comparações de populações discretas". Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-11062013-095657/.
One of the main problems in hypothesis testing for homogeneity of survival curves occurs when the failure rates (or intensity functions) are non-proportional. Although the log-rank test is the nonparametric test most commonly used to compare two or more populations subject to censored data, it presents two limitations. First, all the asymptotic theory involved in the log-rank test rests on the hypothesis that the individuals and populations involved have continuous, or at best mixed, distributions. Second, the log-rank test does not perform well when the intensity functions cross. The starting point for the analysis is to assume that the data are continuous, in which case suitable Gaussian processes may be used to test the hypothesis of homogeneity. Here, we cite the Rényi test, the Cramér-von Mises test for continuous data (CCVM), and that of Moeschberger (see Klein (1997) [15]). Although these nonparametric tests show good results for continuous data, they may have trouble with discrete or rounded data. In this work, we perform a simulation study of the Cramér-von Mises (CVM) statistic proposed by Leão and Ohashi [16], which allows us to detect non-proportional failure rates (crossing failure rates) subject to censoring for arbitrary discrete or rounded data. We also propose a modification of the classic log-rank test for data arranged in a contingency table. By applying the statistics proposed in this work to discrete or rounded data, the developed test shows a better power function than the usual tests.
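A generic two-sample Cramér-von Mises-type statistic can be computed directly from empirical CDFs evaluated on the pooled support, which remains well defined for discrete (tied) data. The sketch below uses this textbook form, without censoring; it is not the specific statistic of Leão and Ohashi [16].

```python
def ecdf(sample):
    """Return the empirical CDF of a sample as a callable."""
    n = len(sample)
    return lambda v: sum(1 for x in sample if x <= v) / n

def cvm_statistic(a, b):
    """Two-sample Cramér-von Mises-type statistic on the pooled support:
    (n*m)/(n+m)**2 * sum over distinct pooled values of (Fa - Fb)**2."""
    Fa, Fb = ecdf(a), ecdf(b)
    pooled = sorted(set(a) | set(b))
    n, m = len(a), len(b)
    return (n * m) / (n + m) ** 2 * sum((Fa(v) - Fb(v)) ** 2 for v in pooled)

# Heavily tied "discrete" samples, as in rounded survival times.
same = cvm_statistic([1, 2, 2, 3, 4] * 20, [1, 2, 2, 3, 4] * 20)
diff = cvm_statistic([1, 2, 2, 3, 4] * 20, [3, 4, 4, 5, 6] * 20)
print(same, diff)
```

Identical samples give a statistic of zero, while a location shift produces a clearly positive value, which is the discrepancy such tests accumulate even when hazard curves cross.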
Lindberg, Mattias and Peter Guban. "Auxiliary variables a weight against nonresponse bias : A simulation study". Thesis, Stockholms universitet, Statistiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-142977.
Texto completoRui, Yikang. "Urban Growth Modeling Based on Land-use Changes and Road Network Expansion". Doctoral thesis, KTH, Geodesi och geoinformatik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-122182.
Medhekar, Vinay Shantaram. "Modeling and simulation of oxidative degradation of Ultra-High Molecular Weight Polyethylene (UHMWPE)". Link to electronic thesis, 2001. http://www.wpi.edu/Pubs/ETD/Available/etd-0828101-135959.
Texto completoGassama, Malamine. "Estimation du risque attribuable et de la fraction préventive dans les études de cohorte". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLV131/document.
The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating the AR defined in terms of survival functions: two nonparametric methods based on the Kaplan-Meier estimator, one semiparametric based on the Cox model, and one parametric based on the piecewise constant hazards model. Our results suggest using the semiparametric or parametric approaches to estimate the AR if the proportional hazards assumption appears appropriate. These methods were applied to data from the E3N women's cohort to estimate the AR of breast cancer due to menopausal hormone therapy (MHT). We showed that about 9% of cases of breast cancer were attributable to MHT use at baseline. In the case of a protective exposure, an alternative to the AR is the prevented fraction (PF), which measures the proportion of disease cases that could be avoided in the presence of a protective exposure in the population. The definition and estimation of the PF had never been considered for cohort studies in the survival analysis context. We defined the PF in cohort studies with survival data and proposed two estimation methods: a semiparametric method based on the Cox proportional hazards model and a parametric method based on a piecewise constant hazards model, with an extension to competing risks. Using data from the Three-City (3C) cohort study, we found that approximately 9% of cases of stroke could be avoided using lipid-lowering drugs (statins or fibrates) in the elderly population. Our study shows that the PF can be estimated to evaluate the impact of beneficial drugs in observational cohort studies while taking potential confounding factors and competing risks into account.
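For intuition, the closed-form population versions of the two quantities discussed in this abstract can be computed directly. The sketch below uses Levin's formula for the attributable risk and the corresponding prevented fraction for a protective exposure, with invented prevalence and risk-ratio values; it is not the survival-function estimators of the thesis, nor the E3N/3C data.

```python
def attributable_risk(p_exposed, rr):
    """Levin's formula: fraction of all cases attributable to exposure,
    AR = p*(RR - 1) / (1 + p*(RR - 1)), for a harmful exposure (RR > 1)."""
    excess = p_exposed * (rr - 1.0)
    return excess / (1.0 + excess)

def prevented_fraction(p_exposed, rr):
    """Fraction of cases avoided thanks to a protective exposure (RR < 1):
    PF = (I_unexposed - I_population) / I_unexposed = p * (1 - RR)."""
    return p_exposed * (1.0 - rr)

ar = attributable_risk(0.30, 1.5)    # 30% exposed, risk ratio 1.5 (invented)
pf = prevented_fraction(0.40, 0.7)   # 40% treated, protective ratio 0.7 (invented)
print(round(ar, 4), round(pf, 4))
```

With these toy numbers, about 13% of cases would be attributable to the harmful exposure, and 12% of cases would be prevented by the protective one; the thesis replaces these crude risks with survival functions and handles confounding and competing risks.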
Karewar, Shivraj. "Atomistic Simulations of Deformation Mechanisms in Ultra-Light Weight Mg-Li Alloys". Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc801888/.
Texto completoDeMarco, James P. Jr. "Mechanical characterization and numerical simulation of a light-weight aluminum A359 metal-matrix composite". Master's thesis, University of Central Florida, 2011. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4933.
Edlund, Per-Olov. "Preliminary estimation of transfer function weights : a two-step regression approach". Doctoral thesis, Stockholm : Economic Research Institute, Stockholm School of Economics [Ekonomiska forskningsinstitutet vid Handelshögsk.] (EFI), 1989. http://www.hhs.se/efi/summary/291.htm.
Al-Nsour, Rawan. "Molecular Dynamics Simulations of Pure Polytetrafluoroethylene near Glassy Transition Temperature for Different Molecular Weights". VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3845.
Laughlin, Trevor William. "A parametric and physics-based approach to structural weight estimation of the hybrid wing body aircraft". Thesis, Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45829.
Ramakrishnan, Tyagi. "Asymmetric Unilateral Transfemoral Prosthetic Simulator". Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5111.
Robinson, Marc J. "Simulation of the vacuum assisted resin transfer molding (VARTM) process and the development of light-weight composite bridging". Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2008. http://wwwlib.umi.com/cr/ucsd/fullcit?p3336692.
Title from first page of PDF file (viewed January 9, 2009). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 482-492).
Auvray, Alexis. "Contributions à l'amélioration de la performance des conditions aux limites approchées pour des problèmes de couche mince en domaines non réguliers". Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEC018/document.
Transmission problems with thin layers are delicate to approximate numerically because of the need to build meshes at the scale of the thin layer. It is common to avoid these difficulties by using problems with approximate boundary conditions, also called impedance conditions. Whereas the approximation of transmission problems by impedance problems turns out to be successful in the case of smooth domains, the situation is less satisfactory in the presence of corners and edges. The goal of this thesis is to propose new, more efficient impedance conditions to correct this lack of performance. For that purpose, the asymptotic expansions of the various model problems are constructed and studied to locate exactly the origin of the loss, in connection with the singular profiles associated with corners and edges. New impedance conditions of multi-scale Robin or Ventcel type are constructed. First studied in dimension 2, they are then generalized to certain situations in dimension 3. Simulations have been carried out to confirm the efficiency of the theoretical methods to some extent.
Colliri, Tiago Santos. "Avaliação de preços de ações: proposta de um índice baseado nos preços históricos ponderados pelo volume, por meio do uso de modelagem computacional". Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/100/100132/tde-07072013-015903/.
The importance of considering volumes when analyzing stock price movements is well accepted in the financial area. However, looking at the scientific production in this field, we still cannot find a unified model that includes volume and price variations for stock price assessment purposes. In this paper we present a computational model that could fill this gap, proposing a new index to evaluate stock prices based on their historical prices and traded volumes. The aim of the model is to estimate the current proportions of the total volume of a stock's shares available in the market, distributed according to their respective prices traded in the past. To do so, we made use of dynamic financial modeling and applied it to real financial data from the Sao Paulo Stock Exchange (Bovespa) and to simulated data generated through an order book model. The value of our index varies based on the difference between the current proportion of shares traded in the past for a price above the current price of the stock and its counterpart, the proportion of shares traded in the past for a price below the current price. Although the model is mathematically very simple, it significantly improved the financial performance of agents operating with real market data and with simulated data, which helps demonstrate its rationale and applicability. Based on the results obtained, and on the very intuitive logic of our model, we believe that the index proposed here can be very useful in helping investors determine ideal price ranges for buying and selling stocks in the financial market.
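The core quantity described above, the share of historical volume traded above the current price minus the share traded below it, can be sketched in a few lines. This is an illustrative reading of the abstract, not the thesis's actual implementation; the function name and data are hypothetical.

```python
def price_volume_index(history, current_price):
    """Toy volume-weighted price index: the fraction of historical traded
    volume at prices above the current price minus the fraction traded
    below it. `history` is a list of (price, volume) pairs."""
    total = sum(volume for _, volume in history)
    above = sum(volume for price, volume in history if price > current_price)
    below = sum(volume for price, volume in history if price < current_price)
    return (above - below) / total

# Usage: most past volume sits above the current price, so the index is positive.
history = [(10.0, 500), (11.0, 300), (9.0, 200)]
print(price_volume_index(history, 9.5))  # (800 - 200) / 1000 = 0.6
```

A positive value means more past volume changed hands above the current price than below it, which is the kind of signal the abstract suggests agents can act on.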
Heimbigner, Stephen Matthew. "Implications in Using Monte Carlo Simulation in Predicting Cardiovascular Risk Factors among Overweight Children and Adolescents". Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/iph_theses/11.
Huang, Bing. "Understanding Operating Speed Variation of Multilane Highways with New Access Density Definition and Simulation Outputs". Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/4079.
Anderson, Abby Hodel A. Scottedward. "Design, testing, and simulation of a low-cost, light-weight, low-g IMU for the navigation of an indoor blimp". Auburn, Ala., 2006. http://repo.lib.auburn.edu/2006%20Spring/master's/ANDERSON_ABBY_43.pdf.
Shah, Manan Kanti. "Material Characterization and Forming of Light Weight Alloys at Elevated Temperature". The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1306939665.
Texto completoCHAKKALAKKAL, JOSEPH JUNIOR. "Design of a weight optimized casted ADI component using topology and shape optimization". Thesis, KTH, Maskin- och processteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-236518.
Structural optimization is often used in the product development process in modern industry to produce optimal designs with the least possible material use for the component. Conventional design methods usually generate oversized components with excess material and weight, which in turn increases the lifetime costs of machines, both in terms of material waste and usage. The thesis "Design of a weight optimized casted ADI component" deals with the redesign of a component from a welded steel-plate structure to a castable design with reduced manufacturing cost and weight. The component "Borrstöd" (drill support), mounted at the front end of the boom of a drift-drilling machine, is redesigned in this work. The main aim of the thesis is to produce an alternative design with lower weight that can be mounted on the existing machine layout without any change to the mounting interface. This thesis contains a detailed description of the procedure for achieving the weight reduction of the "Borrstöd" and presents the results as well as the methodology, which is based on both topology and parameter optimization.
Jakobi, Christoph. "Entwicklung und Evaluation eines Gewichtsfenstergenerators für das Strahlungstransportprogramm AMOS". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-234133.
The purpose of efficiency-increasing methods is to reduce the computing time required to solve radiation transport problems using Monte Carlo techniques. Besides geometry manipulation and source biasing, this includes in particular the weight windows technique, the most important variance reduction method, developed in the 1950s. To date, the difficulty of this technique is the calculation of appropriate weight windows. In this work a generator for space- and energy-dependent weight windows, based on the forward-adjoint generator by T. E. Booth and J. S. Hendricks, is developed and implemented in the radiation transport program AMOS. With this generator the weight windows are calculated iteratively and set automatically; furthermore, the generator is able to adapt the energy segmentation autonomously. Its functioning is demonstrated by means of the deep-penetration problem of photon radiation, where the efficiency can be increased by several orders of magnitude. With energy-dependent weight windows the computing time is decreased by approximately one additional order of magnitude. For a practice-oriented problem, the irradiation of a dosimeter for individual monitoring, the efficiency is only improved by a factor of four at best; source biasing and geometry manipulation result in an equivalent improvement, and the use of energy-dependent weight windows proved to be of no practical relevance.
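The weight-window mechanics the abstract builds on can be sketched briefly: a particle whose statistical weight is above the window is split into copies inside it, and one below the window plays Russian roulette. This is a minimal illustration of the standard technique, not the generator algorithm of the thesis; the function name and survival-weight choice are assumptions.

```python
import random

def apply_weight_window(weight, w_low, w_high):
    """Apply a weight window [w_low, w_high] to one particle weight.
    Returns the list of resulting particle weights; expected total
    weight is conserved."""
    if weight > w_high:
        n = int(weight / w_high) + 1           # split into n copies in the window
        return [weight / n] * n
    if weight < w_low:
        target = (w_low + w_high) / 2          # illustrative survival weight
        if random.random() < weight / target:  # survive with prob weight/target
            return [target]
        return []                              # particle killed by roulette
    return [weight]

# Usage: a weight of 5.0 above the window [0.5, 2.0] splits into 3 copies.
print(apply_weight_window(5.0, 0.5, 2.0))  # three copies of weight 5/3
```

Splitting and roulette together keep the estimator unbiased while concentrating computational effort on particles that matter, which is what makes choosing good windows, the subject of the generator, so important.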
Pires, dos Santos Rebecca. "The Application of Artificial Neural Networks for Prioritization of Independent Variables of a Discrete Event Simulation Model in a Manufacturing Environment". BYU ScholarsArchive, 2017. https://scholarsarchive.byu.edu/etd/6431.
Texto completoEsquincalha, Agnaldo da Conceição. "Estimação de parâmetros de sinais gerados por sistemas lineares invariantes no tempo". Universidade do Estado do Rio de Janeiro, 2009. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=1238.
A study on the recovery of signals modeled by weighted sums of complex exponentials is presented. For this, basic concepts of signals and systems theory are introduced, in particular linear time-invariant (LTI) systems, which can be mathematically represented by differential equations (for analog signals) or difference equations (for digital signals). The solution of these types of equations is a weighted sum of complex exponentials, which establishes the relationship between LTI systems and the model under study. Furthermore, two combinations of methods are used to recover the parameters of the signals: Prony's method with least squares, and Kung's method with least squares, where Prony's and Kung's methods recover the exponents of the exponentials and the least-squares method recovers the linear coefficients of the model. Finally, five signal-recovery simulations are performed, the last one being an application in the area of water quality models.
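The second step described above, recovering the linear coefficients once Prony's or Kung's method has produced the exponents, reduces to ordinary least squares on a Vandermonde-type matrix. A small sketch with hypothetical data:

```python
import numpy as np

def recover_coefficients(samples, exponents):
    """Given samples x[n] of a sum of exponentials with known exponents s_k,
    solve for the linear coefficients c_k in x[n] = sum_k c_k * exp(s_k * n)
    by least squares."""
    n = np.arange(len(samples))
    A = np.exp(np.outer(n, exponents))   # A[n, k] = exp(s_k * n)
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    return coeffs

# Usage: x[n] = 2*exp(-0.1 n) + 0.5*exp(-0.3 n), 20 samples.
s = np.array([-0.1, -0.3])
n = np.arange(20)
x = 2 * np.exp(s[0] * n) + 0.5 * np.exp(s[1] * n)
print(recover_coefficients(x, s))  # approximately [2.0, 0.5]
```

With noisy data the same least-squares step still applies; the hard part, estimating the exponents themselves, is what Prony's and Kung's methods address.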
Olsson, Jörgen. "Low Frequency Impact Sound in Timber Buildings : Simulations and Measurements". Licentiate thesis, Linnaeus University, Sweden; SP Technical Research Institute of Sweden, Sweden, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-58068.
ProWood
Silent Timber Build
Urban Tranquility
BioInnovation FBBB
Staffan, Paul. "Design of an ultra-wideband microstrip antenna array with low size, weight and power". Wright State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=wright1578437280799995.
Davies, G. J. "Towards an agent-based model for risk-based regulation". Thesis, Cranfield University, 2010. http://dspace.lib.cranfield.ac.uk/handle/1826/5662.
Castro, Jaime. "Influence of random formation on paper mechanics : experiments and theory". Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/7016.
Li, Qi. "Acoustic noise emitted from overhead line conductors". Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/acoustic-noise-emitted-from-overhead-line-conductors(90a5c23c-a7fc-4230-bbab-16b8737b2af2).html.