
Theses on the topic "Variability Models"



Consult the 50 best theses for your research on the topic "Variability Models".


You can also download the full text of each publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Ternité, Thomas. "Variability of Development Models". München: Verlag Dr. Hut, 2010. http://d-nb.info/1009972332/34.

2

Scutari, Marco. "Measures of Variability for Graphical Models". Doctoral thesis, Università degli studi di Padova, 2011. http://hdl.handle.net/11577/3422736.

Abstract
In recent years, graphical models have been successfully applied in several different disciplines, including medicine, biology and epidemiology. This has been made possible by the rapid evolution of structure learning algorithms, from constraint-based ones to score-based and hybrid ones. The main goal in the development of these algorithms has been the reduction of the number of either independence tests or score comparisons needed to learn the structure of the Bayesian network. In most cases the characteristics of the learned networks have been studied using a small number of reference data sets as benchmarks, and differences from the true structure have been measured with purely descriptive measures such as Hamming distance. This approach to model validation is not possible for real-world data sets, as the true structure of their probability distribution is not known. An alternative is provided by the use of either parametric or nonparametric bootstrap. By applying a learning algorithm to a sufficiently large number of bootstrap samples it is possible to obtain the empirical probability of any feature of the resulting network, such as the structure of the Markov blanket of a particular node. The fundamental limit in the interpretation of the results is that the “reasonable” level of confidence for thresholding depends on the data and the learning algorithm. In this thesis we extend the aforementioned bootstrap-based approach for the inference on the structure of a Bayesian or Markov network. The graph representing the network structure and its underlying undirected graph (in the case of Bayesian networks) are modelled using a multivariate extension of the Trinomial and Bernoulli distributions; each component is associated with an arc. These assumptions allow the derivation of exact and asymptotic measures of the variability of the network structure or any of its parts. These measures are then applied to some common learning strategies used in the literature, using the bnlearn R package implemented and maintained by the author.
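The bootstrap procedure described in the abstract (learn a structure from many resamples, then read off the empirical probability of each arc) can be sketched in a few lines. The Python sketch below uses a deliberately naive correlation-threshold "learner" on synthetic data as a stand-in for a real constraint- or score-based algorithm; it illustrates the idea only and is not the author's bnlearn implementation.

```python
import numpy as np

def learn_skeleton(data, threshold=0.3):
    """Toy structure 'learner': connect variable pairs whose absolute
    sample correlation exceeds a threshold. A stand-in for a real
    constraint- or score-based structure learning algorithm."""
    corr = np.corrcoef(data, rowvar=False)
    p = corr.shape[0]
    return {(i, j) for i in range(p) for j in range(i + 1, p)
            if abs(corr[i, j]) > threshold}

def arc_confidence(data, n_boot=200, seed=0):
    """Empirical probability of each arc across nonparametric
    bootstrap replicates of the data."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    counts = {}
    for _ in range(n_boot):
        sample = data[rng.integers(0, n, size=n)]
        for arc in learn_skeleton(sample):
            counts[arc] = counts.get(arc, 0) + 1
    return {arc: c / n_boot for arc, c in counts.items()}

# Synthetic data: X0 drives X1, X2 is independent of both.
rng = np.random.default_rng(42)
x0 = rng.normal(size=500)
x1 = x0 + 0.5 * rng.normal(size=500)
x2 = rng.normal(size=500)
data = np.column_stack([x0, x1, x2])

conf = arc_confidence(data)
print(conf)  # the (0, 1) arc should appear with probability near 1
```

In practice the same loop is run with a proper structure learner, and the thresholding problem discussed in the abstract is precisely the question of which empirical probability is "high enough" to keep an arc.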
3

Arzounian, Dorothée. "Sensory variability and brain state : models, psychophysics, electrophysiology". Thesis, Sorbonne Paris Cité, 2017. http://www.theses.fr/2017USPCB055/document.

Abstract
The same sensory input does not always trigger the same reaction. In laboratory experiments, a given stimulus may elicit a different response on each trial, particularly near the sensory threshold. This is usually attributed to an unspecific source of noise that affects the sensory representation of the stimulus or the decision process. In this thesis we explore the hypothesis that response variability can in part be attributed to measurable, spontaneous fluctuations of ongoing brain state. For this purpose, we develop and test two sets of tools. One is a set of models and psychophysical methods to follow variations of perceptual performance with good temporal resolution and accuracy on different time scales. These methods rely on the adaptive procedures that were developed for the efficient measurement of static sensory thresholds and are extended here for the purpose of tracking time-varying thresholds. The second set of tools we develop encompasses data analysis methods to extract from electroencephalography (EEG) signals a quantity that is predictive of behavioral performance on various time scales. We applied these tools to joint recordings of EEG and behavioral data acquired while normal-hearing listeners performed a frequency-discrimination task on near-threshold auditory stimuli. Unlike what was reported in the literature for visual stimuli, we did not find evidence for any effects of ongoing low-frequency EEG oscillations on auditory performance. However, we found that a substantial part of judgment variability can be accounted for by effects of recent stimulus-response history on an ongoing decision.
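As a rough illustration of tracking a time-varying threshold with an adaptive procedure, the sketch below runs a simple 1-up/1-down staircase against a simulated observer whose true threshold drifts slowly. The logistic psychometric function, the drift, and the step size are all invented for the example; they are not the thesis's models.

```python
import numpy as np

rng = np.random.default_rng(1)

n_trials = 2000
t = np.arange(n_trials)
true_thresh = 10 + 3 * np.sin(2 * np.pi * t / 500)  # slowly drifting threshold

# Simple 1-up/1-down staircase: step down after a correct response, up after
# an error, so the level equilibrates at the 50%-correct point of the
# assumed logistic psychometric function. Step size is illustrative.
level = 12.0
step = 0.3
track = np.empty(n_trials)
for i in range(n_trials):
    p_correct = 1.0 / (1.0 + np.exp(-(level - true_thresh[i])))
    correct = rng.random() < p_correct
    level += -step if correct else step
    track[i] = level

# After a burn-in, the staircase level follows the drifting threshold.
err = np.abs(track[200:] - true_thresh[200:]).mean()
print(f"mean absolute tracking error: {err:.2f}")
```

A static adaptive procedure measures one fixed threshold; the point made in the abstract is that the same machinery can be extended so that the track itself, as above, becomes an estimate of a threshold that changes over time.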
4

Byrne, Nicholas. "Deterministic models of Southern Hemisphere circulation variability". Thesis, University of Reading, 2017. http://centaur.reading.ac.uk/74253/.

Abstract
Statistical models of atmospheric variability typically attempt to account for deterministic seasonal variations by constructing a long-term average for each day or month of the year. Year-to-year variability can then be treated as some form of stochastic process about this long-term average. In general, the stochastic processes are assumed to be statistically stationary (invariant under time translation). However, for a non-linear system such as the Earth’s atmosphere, multiple seasonal evolutions may be possible for the same external forcing. In the presence of such a multiplicity of solutions, the identification of a seasonal cycle with a long-term average may not be the optimal procedure. Previous research has suggested that multiple evolutions of the seasonal cycle of the Southern Hemisphere mid-latitude circulation may be possible. The central goal of this thesis is to build on this work and to present evidence for different seasonal evolutions of the Southern Hemisphere mid-latitude circulation. This evidence is initially presented by highlighting a low-frequency peak in an aspect of the Southern Hemisphere mid-latitude circulation that is viewed as a harmonic of the annual cycle (quasi-two year). Statistically stationary models of variability about a long-term average are argued to be unable to account for the presence of this harmonic. Following this, an alternative model of circulation variability is proposed that explicitly references various stages of the seasonal cycle in a deterministic manner. In particular, explicit reference is made to the downward shift and to the final breakdown of the stratospheric polar vortex. A re-interpretation of several previous results in the literature including Southern Annular Mode persistence timescales, Southern Hemisphere mid-latitude climate change and the semi-annual oscillation of the mid-latitude jet is subsequently presented using this alternative perspective.
5

Strounine, Kirill. "Reduced models of extratropical low-frequency variability". Diss., Restricted to subscribing institutions, 2007. http://proquest.umi.com/pqdweb?did=1320974401&sid=1&Fmt=2&clientId=1564&RQT=309&VName=PQD.

6

Schenzinger, Verena. "Tropical stratosphere variability and extratropical teleconnections". Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:7f03dad9-8ef6-4586-8caa-314d9c3a15da.

Abstract
The Quasi-Biennial Oscillation (QBO) is the dominant pattern of variability in the tropical stratosphere. Despite a well-established theory regarding its generation in the atmosphere, its simulation in global climate models remains difficult. A set of metrics assessing the quality of model simulations is presented in this study. The QBO simulations in models submitted to the CMIP5 and CCMVal-2 intercomparison projects are characterised and compared to radiosonde observations and reanalysis datasets. Common model biases and their potential causes are addressed. As the QBO has a long intrinsic period, knowledge of its influence on other parts of the climate system can be used to improve long-range forecasts. These teleconnections of the QBO in observations are investigated using composite analysis, multilinear regression and a novel approach called causal effect networks (CEN). Findings from these analyses confirm previous results of the QBO modulating the stratospheric polar vortex and subsequently the North Atlantic Oscillation (NAO). They also suggest that it is important to take the equatorial zonal-mean zonal wind vertical profile into account when studying teleconnections, rather than the more traditional method of using just one single level. While QBO influences on the Northern Hemisphere winter polar vortex and the NAO are more clearly established, interactions within the tropics remain inconclusive. Regression analysis does not show a connection between the QBO and the Madden-Julian Oscillation (MJO), whereas the CEN analysis does. Further studies are needed to understand the interaction mechanisms near the equator. Finally, following the unprecedented disruption of the QBO cycle in the winter of 2015/16, the event is described and a model analogue from the MPI-ESM-MR historical simulation is presented. Future implications are unclear, although model projections indicate more frequent QBO irregularities in a warming climate.
7

Wengel, Christian. "Equatorial Pacific Variability in Climate Models". Kiel: Universitätsbibliothek Kiel, 2018. http://d-nb.info/1160235406/34.

8

Burrow, Jennifer. "Mechanistic models of recruitment variability in fish populations". Thesis, University of York, 2011. http://etheses.whiterose.ac.uk/1611/.

Abstract
There are serious concerns worldwide about the decline of exploited fish stocks. The number of fish larvae surviving to be recruited into the adult population each year is fundamental to the long-term stability of a fish stock. Monitoring and predicting recruitment is a crucial component of managing economically important fisheries worldwide. Fish recruitment can vary by an order of magnitude, or more, between years, and the larval stage is a key determining factor. Fish larvae are born into an extremely variable environment, with high mortality rates, and so it is not surprising that the number surviving to join the adult population is highly variable. This thesis presents simple stochastic, mechanistic larval growth models, developed and utilised to investigate recruitment probabilities and variability. The models are mechanistic in that they are based on consideration of the key ecological processes at work, and not on statistical regression analyses or similar techniques. At the heart of the thesis lies a stochastic drift-diffusion model for the growth of an individual larva. Further mathematical and ecological complexity is built up through consideration of both the temporal and spatial heterogeneity of larval food sources, primarily zooplankton. Results illustrate the impact of stochasticity in the timing of peak food abundance, and the patchiness of the prey, on recruitment variability. The idea of non-constant variance in recruitment is also investigated, with the aim of testing its practical relevance to fisheries management. It is demonstrated that the currently available stock-recruitment time series are at least one order of magnitude too short to reliably fit such models. Management implications are illustrated using simple models and published recruitment data for two exploited stocks. 
The work developed within this thesis highlights the importance of stochasticity in fish larval growth and recruitment, and the power of simple mechanistic models in examining these ideas.
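The stochastic drift-diffusion growth model at the heart of the thesis can be caricatured with an Euler-Maruyama simulation: larval size follows a drifted Brownian motion, and a larva is counted as recruited once it reaches a critical size. All parameter values below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def recruitment_fraction(n_larvae=10_000, t_max=60.0, dt=0.1,
                         mu=0.5, sigma=1.0, x0=1.0, x_crit=25.0, seed=0):
    """Euler-Maruyama simulation of larval size X following
    dX = mu*dt + sigma*dW; a larva is 'recruited' if it reaches the
    critical size x_crit by time t_max. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    x = np.full(n_larvae, x0)
    recruited = np.zeros(n_larvae, dtype=bool)
    for _ in range(n_steps):
        x += mu * dt + sigma * np.sqrt(dt) * rng.normal(size=n_larvae)
        recruited |= x >= x_crit
    return recruited.mean()

f1 = recruitment_fraction()
f2 = recruitment_fraction(sigma=2.0)
print(f"recruited fraction: {f1:.3f} (sigma=1), {f2:.3f} (sigma=2)")
```

Varying the diffusion coefficient, or making the drift depend on a stochastically timed food peak, is the kind of experiment through which such a model links environmental variability to recruitment variability.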
9

Manfredi, Paolo. "High-Speed Interconnect Models with Stochastic Parameter Variability". Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2513763.

Abstract
In the process of design and fabrication of electronic products, numerical simulation plays a fundamental role in the preliminary electromagnetic compatibility (EMC) assessment of devices in the early design phase. Direct EMC measurements impact both cost and time-to-market, as they require the purchase or hire of facilities and instruments, as well as fabrication of prototype devices, and therefore need to be minimized. Nowadays, designers can rely on several sophisticated modeling tools, helping them to perform right-the-first-time designs. Nonetheless, these simulation models are accurate only as long as we are able to assign accurate values to each system parameter. In modern high-speed and high-density designs, process variations and uncertainties in operating conditions result in parameters which are hard to control or partially unavailable. The device response is thus no longer regarded as deterministic, but is more suitably interpreted as a random process. In this framework, the assessment of signal integrity requires a statistical analysis, which is traditionally based on the so-called Monte Carlo or other sampling-based methods. Yet, for practical applications, these approaches are often too time-consuming, as they are known to require a large number of samples to converge. In this thesis, we extend available literature results to the efficient analysis of high-speed interconnects, such as avionic and industrial cables or printed circuit board traces, affected by uncertainties like process variations or unavailable operating conditions. Specifically, the framework of polynomial chaos theory is adopted to create stochastic models for transmission lines which are faster to simulate than repeated Monte Carlo runs. This methodology is based on the expansion of random quantities in series of orthogonal polynomials, and has already been successfully applied to the analysis of lumped circuits.
In this work, the modeling of distributed components, which are key elements for modern high-frequency designs, is addressed. The advocated approach is general and overcomes the limitations of available literature models for the statistical analysis of the signal propagation over interconnects, which are based on simplified structures and approximate assumptions. Also, a SPICE-compatible implementation is presented, thus allowing the convenient use of SPICE-like circuit analysis tools for the simulation of complex stochastic network topologies, avoiding the creation of customized, ad hoc implementations. This thesis provides a comprehensive theoretical discussion together with several tutorial application examples, thus complementing the published material.
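The polynomial chaos idea mentioned above, expanding a random quantity in orthogonal polynomials of a standard random variable, can be sketched for a single Gaussian parameter. The toy response g below is an invented example, not an interconnect model from the thesis; the point is that a handful of expansion coefficients reproduce statistics that Monte Carlo needs many samples to estimate.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi, exp

def pce_coefficients(g, order, n_quad=40):
    """Coefficients of g(xi), xi ~ N(0,1), in probabilists' Hermite
    polynomials He_k, computed by Gauss-Hermite quadrature:
    c_k = E[g(xi) * He_k(xi)] / k!."""
    x, w = He.hermegauss(n_quad)   # nodes/weights for weight exp(-x^2/2)
    w = w / sqrt(2 * pi)           # normalise to the Gaussian probability measure
    gx = g(x)
    return [np.sum(w * gx * He.hermeval(x, [0] * k + [1])) / factorial(k)
            for k in range(order + 1)]

# Toy stochastic response depending on one normally distributed parameter.
g = lambda xi: np.exp(0.3 * xi)

c = pce_coefficients(g, order=6)
mean_pce = c[0]
var_pce = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))

# Monte Carlo check: far more evaluations for comparable accuracy.
rng = np.random.default_rng(0)
samples = g(rng.normal(size=200_000))
print(mean_pce, samples.mean())
print(var_pce, samples.var())
```

The mean is the zeroth coefficient and the variance follows from orthogonality of the Hermite basis; for this g both are known in closed form, which makes the truncation error easy to check.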
10

Denis, Yvan. "Implémentation de PCM (Process Compact Models) pour l’étude et l’amélioration de la variabilité des technologies CMOS FDSOI avancées". Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAT045/document.

Abstract
Recently, the race for miniaturization has slowed because of the technological challenges it entails. These obstacles include the growing impact of local and process variability, stemming from the increasing complexity of the manufacturing process and from miniaturization itself, in addition to the difficulty of reducing the channel length. To address these challenges, new architectures, very different from the traditional (bulk) one, have been proposed. However, these new architectures require more effort to industrialize: increasing complexity and development time call for larger financial investments. There is therefore a real need to improve device development and optimization, and this work offers some avenues toward that goal. The idea is to reduce the number of trials required to find the optimal manufacturing process, that is, the process that yields a device whose performance and dispersion meet predefined targets. The approach developed in this thesis combines TCAD tools and compact models in order to build and calibrate what is called a PCM (Process Compact Model). A PCM is an analytical model that links the process parameters of a MOSFET to its electrical parameters. It draws on the benefits of both TCAD (since it relates process parameters directly to electrical parameters) and compact modeling (since the model is analytical and therefore fast to evaluate). A sufficiently predictive and robust PCM can be used to optimize the performance and overall variability of the transistor through an appropriate optimization algorithm. This approach differs from traditional development methods, which rely heavily on scientific expertise and successive trials to improve the device; instead, it provides a deterministic and robust mathematical framework for the problem. The concept was developed, tested and applied to 28 and 14 nm FD-SOI transistors and to TCAD simulations. The results are presented, along with the recommendations needed to implement the technique at industrial scale; some perspectives and applications are likewise suggested.
11

Saxon, David. "Investigating variability in multilevel models : going beyond therapist effects". Thesis, University of Sheffield, 2017. http://etheses.whiterose.ac.uk/19804/.

Abstract
Although psychological therapies can benefit many people, over half of the patients who receive therapy do not recover. Also, across services and therapists there is a great deal of variability in patient outcomes. Studies from the USA, using multilevel modelling (MLM), have indicated that the variability between therapists has a significant effect on patient outcomes, with some therapists over twice as effective as others. However, some of these findings were derived from data samples that did not meet the recommended size for reliably estimating therapist effects using MLM. This methodology-focused thesis discusses five studies, published between 2012 and 2017, that contain some of the largest samples of routinely collected service data to date. The initial aim was to replicate the USA studies with large UK samples. However in doing so, analytical methods were developed which utilised random slopes and residuals from multilevel models, to better understand therapist variability and ask research questions about ‘how’ and ‘why’ therapists vary in effectiveness. The five studies in this thesis produced some of the most reliable estimates of the size of the therapist effect. They also include the first estimates of therapist effects for patient drop-out and deterioration. In addition, the methods developed were applied to: reliably identify the most effective therapists controlling for case-mix; show how the effects of important patient variables, like intake severity and number of sessions attended, are moderated by therapists; identify therapist factors associated with better outcomes and, for the first time, consider therapist variability on two outcomes simultaneously. Collectively, the studies provide strong evidence of the importance of the therapist to patient outcomes and strong justification for focusing the research effort on therapists and therapist variability. This thesis provides some original methodologies which can contribute to such a research effort.
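The central multilevel-model quantity in these studies, the "therapist effect", is the share of outcome variance attributable to differences between therapists, i.e. an intraclass correlation. A minimal sketch with simulated nested data and method-of-moments (one-way ANOVA) variance components follows; the variances and sample sizes are illustrative, not results from the studies.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate outcomes for patients nested within therapists:
# y_ij = u_j + e_ij, with u_j ~ N(0, tau2) and e_ij ~ N(0, sigma2).
n_therapists, n_patients = 60, 40
tau2, sigma2 = 0.4, 3.6            # illustrative variances (true ICC = 0.10)
u = rng.normal(0, np.sqrt(tau2), n_therapists)
y = u[:, None] + rng.normal(0, np.sqrt(sigma2), (n_therapists, n_patients))

# One-way ANOVA (method-of-moments) estimates of the variance components.
group_means = y.mean(axis=1)
msb = n_patients * group_means.var(ddof=1)                 # between-therapist MS
msw = ((y - group_means[:, None]) ** 2).sum() / (n_therapists * (n_patients - 1))
tau2_hat = (msb - msw) / n_patients
icc = tau2_hat / (tau2_hat + msw)                          # "therapist effect"
print(f"estimated therapist effect (ICC): {icc:.3f}")
```

Real analyses fit the multilevel model by maximum likelihood with case-mix covariates, but the interpretation of the resulting ICC is the same as in this sketch.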
12

Wallhead, Philip John. "Accounting for unpredictable spatial variability in plankton ecosystem models". Thesis, University of Southampton, 2008. https://eprints.soton.ac.uk/63762/.

Abstract
Limitations on our ability to predict fine-scale spatial variability in plankton ecosystems can seriously compromise our ability to predict coarse-scale behaviour. Spatial variability which is deterministically unpredictable may distort parameter estimates when the ecosystem model is fitted to (or assimilates) ocean data, may compromise model validation, and may produce mean-field ecosystem behaviour discrepant with that predicted by the model. New statistical methods are investigated to mitigate these effects and thus improve understanding and prediction of coarse-scale behaviour e.g. in response to climate change. First, the standard model fitting technique is generalised to allow model-data ‘phase errors’ in the form of time lags, as has been observed to approximate mesoscale plankton variability in the open ocean. The resulting ‘variable lag fit’ is shown to enable ‘Lagrangian’ parameter recovery with artificial ecosystem data. A second approach employs spatiotemporal averaging, fitting a ‘weak prior’ box model to suitably-averaged data from Georges Bank (as an example), allowing liberal biological parameter adjustments to account for mean effects of unresolved variability. A novel skill assessment technique is used to show that the extrapolative skill of the box model fails to improve on a strictly empirical model. Third, plankton models where horizontal variability is resolved ‘implicitly’ are investigated as an alternative to coarse or higher explicit resolution. A simple simulation study suggests that the mean effects of fine-scale variability on coarse-scale plankton dynamics can be serious, and that ‘spatial moment closure’ and similar statistical modelling techniques may be profitably applied to account for them.
13

Kowalski, P. C. "Models of interannual mid-latitude sea surface temperature variability". Thesis, University College London (University of London), 2013. http://discovery.ucl.ac.uk/1394920/.

Abstract
Well established and novel simple mixed layer models are used to investigate some of the factors influencing mid-latitude sea surface temperature variability. This thesis focuses in particular on the re-emergence mechanism and the factors that influence it. The re-emergence mechanism describes the process whereby winter sea surface temperature anomalies can become sequestered below the mixed layer as it reforms in the spring/summer and are entrained into the mixed layer in the following winter, subsequently impacting the sea surface temperature. In chapter 2 the idealized mixed layer column model used in Stevenson [36] and the quasi-geostrophic wind driven ocean model are derived. Chapter 3 investigates how sea surface temperature anomalies are generated and decay through mixed layer processes and in the absence of atmospheric feedback. The effect of atmospheric feedback on the sea surface temperature and mixed layer is investigated in chapter 4. Two new models of the re-emergence mechanism are presented in chapter 5: the first is a stochastic two season model and the second is an entraining mixed layer model with a fixed mixed layer annual cycle. These models are used to investigate some of the factors, such as the difference between the summer and winter mixed layer depth, that influence the re-emergence mechanism. The impact of interannual mixed layer depth variability on the re-emergence mechanism is then investigated using the model of Stevenson [36]. In chapters 6 and 7, the impact of local Ekman pumping and associated Rossby wave induced vertical motion on the sea surface temperature, the mixed layer and the re-emergence mechanism are investigated.
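As a minimal example of the kind of simple stochastic mixed layer model used in such work, the sketch below integrates a Hasselmann-type slab model, in which the mixed-layer temperature anomaly is damped and forced by white noise; it is a generic stochastic climate model with invented parameters, not one of the thesis's models.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hasselmann-type slab mixed layer: dT/dt = -lambda*T + noise forcing.
lam = 1.0 / 90.0       # damping rate in 1/days (illustrative)
dt = 1.0               # time step in days
n = 40_000
noise_sd = 0.05
T = np.zeros(n)
for i in range(1, n):
    T[i] = T[i - 1] * (1 - lam * dt) + noise_sd * np.sqrt(dt) * rng.normal()

# The discretized model is an AR(1) process, so its lag-1 autocorrelation
# should be close to 1 - lam*dt, i.e. a decorrelation time of ~90 days.
r1 = np.corrcoef(T[:-1], T[1:])[0, 1]
print(f"lag-1 autocorrelation: {r1:.4f} (theory: {1 - lam * dt:.4f})")
```

The entraining and two-season models described in the abstract enrich exactly this kind of baseline with a seasonally varying mixed layer depth, which is what allows winter anomalies to re-emerge.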
14

Assy, Nour. "Automated support of the variability in configurable process models". Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLL001/document.

Abstract
Today's fast-changing environment imposes new challenges for the effective management of business processes. In such a highly dynamic environment, business process design becomes time-consuming, error-prone, and costly. Seeking reuse and adaptability is therefore a pressing need for successful business process design. Configurable process models, introduced recently, were a step toward enabling process design by reuse while providing flexibility. A configurable process model is a generic model that integrates multiple variants of the same business process in a given domain through variation points. These variation points are referred to as configurable elements and allow for multiple design options in the process model. A configurable process model needs to be configured according to a specific requirement by selecting one design option for each configurable element. Recent research on configurable process models has led to the specification of configurable process modeling notations, for example configurable Event-Driven Process Chain (C-EPC), which extends the EPC notation with configurable elements. Since then, the issue of building and configuring configurable process models has been investigated. On the one hand, as configurable process models tend to be very complex, with a large number of configurable elements, many automated approaches have been proposed to assist their design. However, existing approaches recommend entire configurable process models, which are difficult to reuse, computationally costly, and may confuse the process designer. On the other hand, research results on configurable process model design highlight the need for support in configuring the process. Many approaches have therefore proposed building a configuration support system to assist end users in selecting desirable configuration choices according to their requirements. However, such systems are currently created manually by domain experts, which is undoubtedly a tedious and error-prone task. In this thesis, we aim to automate the support of variability in configurable process models. Our objective is twofold: (i) to assist the design of configurable processes in a way that does not confuse designers with complex recommendations, and (ii) to assist the creation of configuration support systems, relieving process analysts of the burden of building them manually. To achieve the first objective, we propose to learn from the experience gained through past process modeling in order to assist process designers with configurable process fragments; the proposed fragments inspire the designer to complete the ongoing process design. To achieve the second objective, we observe that previously designed and configured process models contain implicit knowledge that is useful for process configuration. We therefore propose to exploit the experience gained through past process modeling and configuration to assist process analysts in building their configuration support systems.
However, these systems are currently manually created by domain experts which is undoubtedly a time-consuming and error-prone task.In this thesis, we aim at automating the support of the variability in configurable process models. Our objective is twofold: (i) assisting the configurable process design in a fin-grained way using configurable process fragments that are close to the designers interest and (ii) automating the creation of configuration support systems in order to release the process analysts from the burden of manually building them. In order to achieve the first objective, we propose to learn from the experience gained through past process modeling in order to assist the process designers with configurable process fragments. The proposed fragments inspire the process designer to complete the design of the ongoing process. To achieve the second objective, we realize that previously designed and configured process models contain implicit and useful knowledge for process configuration. Therefore, we propose to benefit from the experience gained through past process modeling and configuration in order to assist process analysts building their configuration support systems. Such systems assist end users interactively configuring the process by recommending suitable configuration decisions
Los estilos APA, Harvard, Vancouver, ISO, etc.
15

Assy, Nour. "Automated support of the variability in configurable process models". Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLL001.

Texto completo
Resumen
Today's fast-changing environment imposes new challenges for the effective management of business processes. In such a highly dynamic environment, business process design becomes time-consuming, error-prone, and costly. Therefore, seeking reuse and adaptability is a pressing need for successful business process design. The recently introduced configurable reference models were a step toward enabling process design by reuse while providing flexibility. A configurable process model is a generic model that integrates multiple process variants of the same business process in a given domain through variation points. These variation points are referred to as configurable elements and allow for multiple design options in the process model. A configurable process model needs to be configured according to a specific requirement by selecting one design option for each configurable element. Recent research activities on configurable process models have led to the specification of configurable process modeling notations, such as the configurable Event-Driven Process Chain (C-EPC), which extends the EPC notation with configurable elements. Since then, the issue of building and configuring configurable process models has been investigated. On the one hand, as configurable process models tend to be very complex, with a large number of configurable elements, many automated approaches have been proposed to assist their design. However, existing approaches recommend entire configurable process models, which are difficult to reuse, computationally expensive to produce, and liable to confuse the process designer. On the other hand, research results on configurable process model design highlight the need for means of support to configure the process. Therefore, many approaches have proposed building a configuration support system to assist end users in selecting desirable configuration choices according to their requirements.
However, such systems are currently created manually by domain experts, which is undoubtedly a time-consuming and error-prone task. In this thesis, we aim at automating the support of variability in configurable process models. Our objective is twofold: (i) assisting configurable process design in a fine-grained way, using configurable process fragments that are close to the designers' interests, and (ii) automating the creation of configuration support systems in order to relieve process analysts of the burden of building them manually. To achieve the first objective, we propose to learn from the experience gained through past process modeling in order to assist process designers with configurable process fragments. The proposed fragments inspire the process designer to complete the design of the ongoing process. To achieve the second objective, we observe that previously designed and configured process models contain implicit knowledge that is useful for process configuration. We therefore propose to exploit the experience gained through past process modeling and configuration in order to assist process analysts in building their configuration support systems. Such systems assist end users in interactively configuring the process by recommending suitable configuration decisions.
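The "learning from past process models" step can be sketched, very loosely, as frequency-based recommendation over previously modeled processes. The activity names, the edge representation, and the `recommend_edges` helper below are all invented for illustration; this is not the thesis's actual fragment-recommendation algorithm:

```python
from collections import Counter

def recommend_edges(past_models, current_activities, k=2):
    """Recommend the k most frequent edges (a -> b) seen in past process
    models whose source activity already appears in the current partial
    design -- a loose, frequency-based stand-in for recommending
    configurable process fragments. past_models is a list of edge lists."""
    counts = Counter(e for m in past_models for e in m)
    candidates = [(e, c) for e, c in counts.items()
                  if e[0] in current_activities and e[1] not in current_activities]
    candidates.sort(key=lambda ec: -ec[1])
    return [e for e, _ in candidates[:k]]

past = [
    [("receive order", "check stock"), ("check stock", "ship")],
    [("receive order", "check stock"), ("check stock", "back-order")],
    [("receive order", "check credit"), ("check credit", "ship")],
]
print(recommend_edges(past, {"receive order"}))
```

A real recommender would rank whole configurable fragments, not single edges, but the idea of reusing past modeling experience is the same.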
Los estilos APA, Harvard, Vancouver, ISO, etc.
16

Záhorovská, Zuzana. "Využití simulačních modelů a programů k analýze či zlepšení chodu podniku (reálná situace)". Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-15743.

Texto completo
Resumen
The aim of this thesis is to analyze the current situation, identify bottlenecks, and propose improvements for the acquisitions department of a selected financial company, an important player in the Czech market that wishes to remain unnamed. The first part provides the reader with the theoretical background needed to understand the following text. I then create simulation models for the individual teams within the department. Because these models are based only on average and total values, the next section describes the work done to bring them closer to reality by incorporating administrators' shifts and the variable number of entities processed throughout the day. Based on an analysis of the computed values, I propose two ways of redistributing the actions so as to reduce the number of employees and increase their efficiency.
Los estilos APA, Harvard, Vancouver, ISO, etc.
17

Kay, Gillian. "Mechanisms of southern African rainfall variability in coupled climate models". Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.496573.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
18

Anderson, Karen. "Temporal variability in calibration target reflectance : methods, models and applications". Thesis, University of Southampton, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.419019.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
19

Loeza-Serrano, Sergio Ivan. "Optimal statistical design for variance components in multistage variability models". Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/optimal-statistical-design-for-variance-components-in-multistage-variability-models(d407bb0e-cbb0-4ef8-ab6d-80cf3e4327cb).html.

Texto completo
Resumen
This thesis focuses on the construction of optimum designs for the estimation of variance components in multistage variability models. Variance components are the model parameters that represent the different sources of variability affecting the response of a system. A general and highly detailed way to define the linear mixed effects model is proposed. The extension considers the explicit definition of all the elements needed to construct a model. One key aspect of this formulation is that the random part is stated as a functional that individually determines the form of the design matrices for each random regressor, which gives significant flexibility. Further, the model is strictly divided into the treatment structure and the variability structure. This allows separate definitions of each structure while using the single rationale of combining, with few restrictions, simple design arrangements called factor layouts. To provide the flexibility to consider different models, methodology to find and select optimum designs for variance components is presented using MLE and REML estimators and an alternative method known as the dispersion-mean model. Different forms of information matrices for variance components were obtained, mainly for the cases in which the information matrix is a function of the ratios of variances. Closed-form expressions for balanced designs for random models with a 3-stage variability structure, in crossed and nested layouts, were found; the nested case was obtained when the information matrix is a function of the variance components. A general expression for the information matrix for the ratios under REML is presented, and an approach to using unbalanced models, which requires general formulae, is discussed. Additionally, the D-optimality and A-optimality criteria are restated for the case of variance components, and a specific version of pseudo-Bayesian criteria is introduced.
Algorithms to construct optimum designs for the variance components based on the aforementioned methodologies were defined and implemented in the R language. The results are communicated using a simple but highly informative graphical approach not seen before in this context. The proposed plots convey enough detail for the experimenter to make an informed decision about the design to use in practice. An industrial internship allowed some of the results herein to be put into practice; although no new research outcomes originated from it, this is evidence of the potential for applications. Equally valuable was the experience of providing statistical advice and reporting conclusions to a non-statistical audience.
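As a concrete illustration of what a D-optimality computation for variance components looks like (a sketch, not the thesis's own algorithms, which are in R): the balanced one-way random model y_ij = mu + a_i + e_ij has a standard closed-form REML information matrix for (sigma_a^2, sigma_e^2), so for a fixed total of N observations one can enumerate the balanced layouts and pick the one maximising the determinant. The choice N = 12 and the guessed variances are arbitrary:

```python
def reml_info(a, n, s2a, s2e):
    """REML information matrix for (sigma_a^2, sigma_e^2) in the balanced
    one-way random model with a groups of size n (standard closed form;
    lam = sigma_e^2 + n * sigma_a^2)."""
    lam = s2e + n * s2a
    Iaa = (a - 1) * n**2 / (2 * lam**2)
    Iae = (a - 1) * n / (2 * lam**2)
    Iee = (a - 1) / (2 * lam**2) + a * (n - 1) / (2 * s2e**2)
    return [[Iaa, Iae], [Iae, Iee]]

def d_criterion(a, n, s2a, s2e):
    M = reml_info(a, n, s2a, s2e)
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def d_optimal(N, s2a, s2e):
    """Among balanced layouts with a * n == N, return the (a, n) that
    maximises the determinant of the REML information matrix.  The result
    is only locally D-optimal: it depends on the guessed variances."""
    layouts = [(a, N // a) for a in range(2, N) if N % a == 0 and N // a >= 2]
    return max(layouts, key=lambda an: d_criterion(*an, s2a, s2e))

print(d_optimal(12, 1.0, 1.0))
```

The dependence on the guessed variance ratio is exactly what motivates the pseudo-Bayesian criteria mentioned in the abstract.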
Los estilos APA, Harvard, Vancouver, ISO, etc.
20

Tërnava, Xhevahire. "Gestion de la variabilité au niveau du code : modélisation, traçabilité et vérification de cohérence". Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4114/document.

Texto completo
Resumen
When large software product lines are engineered, a combined set of traditional techniques, such as inheritance or design patterns, is likely to be used for implementing variability. In these techniques, the concept of feature, as a reusable unit, does not have a first-class representation at the implementation level. Further, an inappropriate choice of techniques becomes the source of variability inconsistencies between the domain and the implemented variabilities. In this thesis, we study the diversity of the majority of variability implementation techniques and provide a catalog that covers an enriched set of them. Then, we propose a framework to explicitly capture and model, in a fragmented way, the variability implemented by several combined techniques into technical variability models. These models use variation points and variants, with their logical relations and binding times, to abstract the implementation techniques. We show how to extend the framework to trace features to their respective implementations. In addition, we use this framework to provide a tooled approach for checking the consistency of the implemented variability. Our method uses slicing to partially check the corresponding propositional formulas at the domain and implementation levels in the case of a 1-to-m mapping, offering early and automatic detection of inconsistencies. As validation, we report on the implementation, in Scala, of the framework as an internal domain-specific language, together with the consistency checking method. These implementations have been applied to a real feature-rich system and to three product line case studies, showing the feasibility of the proposed contributions.
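At toy scale, the kind of consistency being checked can be illustrated by brute-force enumeration instead of slicing: every feature configuration allowed at the domain level must be realizable by some selection of implementation variants under the 1-to-m mapping. The feature names, mapping, and `realizable` helper below are hypothetical, and the thesis's actual method works on propositional formulas rather than Python predicates:

```python
from itertools import product

def assignments(vars_):
    """Enumerate all true/false assignments over the given variables."""
    for bits in product([False, True], repeat=len(vars_)):
        yield dict(zip(vars_, bits))

def realizable(domain_vars, domain_ok, impl_vars, impl_ok, maps_to):
    """Return the domain configurations that the implementation cannot
    realize.  maps_to: feature -> list of implementation variables; a
    feature counts as implemented iff at least one mapped variable is
    selected (a 1-to-m mapping)."""
    bad = []
    for cfg in assignments(domain_vars):
        if not domain_ok(cfg):
            continue
        ok = any(impl_ok(sel) and all(cfg[f] == any(sel[v] for v in vs)
                                      for f, vs in maps_to.items())
                 for sel in assignments(impl_vars))
        if not ok:
            bad.append(cfg)
    return bad

domain = ["A", "B"]
impl = ["x1", "x2", "y"]
maps_to = {"A": ["x1", "x2"], "B": ["y"]}
domain_ok = lambda c: c["A"]                    # A mandatory, B optional
impl_ok = lambda s: not (s["x1"] and s["x2"])   # x1, x2 are alternatives
print(realizable(domain, domain_ok, impl, impl_ok, maps_to))      # consistent
broken_impl = lambda s: not s["y"]              # code can never enable y
print(realizable(domain, domain_ok, impl, broken_impl, maps_to))  # B lost
```

Brute force is exponential in the number of variables, which is precisely why the thesis resorts to slicing for partial checks.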
Los estilos APA, Harvard, Vancouver, ISO, etc.
21

Ponce, Alvarez Adrián. "Probabilistic models for studying variability in single-neuron and neuronal ensemble activity". Thesis, Aix-Marseille 2, 2010. http://www.theses.fr/2010AIX20706.

Texto completo
Resumen
A hallmark of cortical activity is its high degree of variability. The present work focused on (i) the variability of intervals between the spikes that single neurons emit, called spike time irregularity (STI), and (ii) the variability in the temporal evolution of collective neuronal activity. First, I studied the STI of macaque motor cortical neurons during time estimation and movement preparation. I found that although the firing rate of the neurons transmitted information about these processes, the STI of a neuron is not flexible and is determined by the balance of excitatory and inhibitory inputs. These results were obtained by means of an irregularity measure that I compared to other existing measures. Second, I analyzed the neuronal ensemble activity of several somatosensory and motor cortical areas of macaques during tactile discrimination. I showed that ensemble activity can be effectively described by the Hidden Markov Model (HMM). Both sensory and decision-making processes were distributed across many areas. Moreover, I showed that decision-related changes in neuronal activity rely on a noise-driven mechanism and that the maintenance of the decision relies on transient dynamics, subtending the conversion of a decision into an action. Third, I characterized the statistics of spontaneous UP and DOWN states in the prefrontal cortex of a rat, using the HMM. I showed that state alternation is stochastic and the activity during UP states is dynamic. Hence, variability is prominent both during active behavior and spontaneous activity and is determined by structural factors, thus rendering it inherent to cortical organization and shaping the function of neural networks.
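A minimal sketch of the kind of HMM analysis mentioned for UP/DOWN states (not the thesis's code): Viterbi decoding of a two-state chain with Poisson spike-count emissions. The firing rates, self-transition probability, and count sequence are invented for illustration:

```python
import math

def viterbi_poisson(counts, rates, p_stay=0.9):
    """Viterbi decoding of a 2-state (DOWN=0, UP=1) HMM with Poisson
    spike-count emissions.  rates = (rate_down, rate_up) per bin;
    symmetric transitions with self-transition probability p_stay."""
    def logem(c, lam):  # log of the Poisson pmf
        return c * math.log(lam) - lam - math.lgamma(c + 1)
    logA = [[math.log(p_stay), math.log(1 - p_stay)],
            [math.log(1 - p_stay), math.log(p_stay)]]
    V = [logem(counts[0], rates[s]) + math.log(0.5) for s in (0, 1)]
    back = []
    for c in counts[1:]:
        prev, V, ptr = V, [0.0, 0.0], [0, 0]
        for s in (0, 1):
            best = max((0, 1), key=lambda r: prev[r] + logA[r][s])
            ptr[s] = best
            V[s] = prev[best] + logA[best][s] + logem(c, rates[s])
        back.append(ptr)
    path = [max((0, 1), key=lambda s: V[s])]
    for ptr in reversed(back):        # backtrack the most likely path
        path.append(ptr[path[-1]])
    return path[::-1]

counts = [0, 1, 0, 6, 8, 7, 1, 0]     # spike counts per time bin
print(viterbi_poisson(counts, rates=(1.0, 7.0)))
```

In practice the rates and transition probabilities would themselves be fitted (e.g. by Baum-Welch), and the thesis applies HMMs to multi-neuron ensembles rather than a single count sequence.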
Los estilos APA, Harvard, Vancouver, ISO, etc.
22

Primeau, François W. (François William) 1966. "Multiple equilibria and low-frequency variability of wind-driven ocean models". Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/58512.

Texto completo
Resumen
Thesis (Ph. D.)--Joint Program in Physical Oceanography (Massachusetts Institute of Technology, Dept. of Earth, Atmospheric, and Planetary Sciences; and the Woods Hole Oceanographic Institution), 1998.
Includes bibliographical references (leaves 156-158).
by François W. Primeau.
Ph.D.
Los estilos APA, Harvard, Vancouver, ISO, etc.
23

Morin, Brice. "Leveraging models from design-time to runtime to support dynamic variability". Rennes 1, 2010. http://www.theses.fr/2010REN1S101.

Texto completo
Resumen
This thesis presents a model-driven and aspect-oriented approach to tame the complexity of Dynamically Adaptive Systems (DAS). At design-time, we capture the different facets of a DAS (variability, environment/context, reasoning and architecture) using dedicated metamodels. Each feature of the variability model describing a DAS is refined into an aspect model. We leverage these design models at runtime to drive the dynamic adaptation process. Both the running system and its execution context are abstracted as models. Depending on the current context (model), a reasoner interprets the reasoning model to determine a well-fitted selection of features. We then use Aspect-Oriented Modeling techniques to compose the aspect models (associated with the selected features) to derive the corresponding architecture. This way, there is no need to specify the whole set of possible configurations at design-time: each configuration is automatically built when needed. We finally rely on model comparison to fully automate the reconfiguration process and adapt the running system, with no need to write low-level reconfiguration scripts.
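The final model-comparison step can be sketched as a diff between two architecture models from which reconfiguration commands are derived. The component names, the model representation (component name mapped to a set of bound services), and the command vocabulary below are assumptions for illustration, not Morin's actual metamodels:

```python
def diff_configurations(current, target):
    """Derive ordered reconfiguration commands by comparing the current
    architecture model with the target one.  Models are dicts mapping a
    component name to its set of bound services; the add/remove/rebind
    vocabulary here is illustrative."""
    cmds = []
    for comp in sorted(current.keys() - target.keys()):
        cmds.append(("remove", comp))
    for comp in sorted(target.keys() - current.keys()):
        cmds.append(("add", comp))
    for comp in sorted(current.keys() & target.keys()):
        if current[comp] != target[comp]:
            cmds.append(("rebind", comp))
    return cmds

current = {"gps": {"raw"}, "screen": {"map"}}
target = {"wifi-loc": {"raw"}, "screen": {"map", "alerts"}}
print(diff_configurations(current, target))
```

Because the commands are computed from the model diff, no reconfiguration script has to be written by hand, which is the point made in the abstract.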
Los estilos APA, Harvard, Vancouver, ISO, etc.
24

Yuen, Wai-kee y 袁偉基. "A historical event analysis of the variability in the empirical uncovered interest parity (UIP) coefficient". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36424201.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
25

Crawford, Scott Daniel. "Sources of Variability in a Proteomic Experiment". Diss., CLICK HERE for online access, 2006. http://contentdm.lib.byu.edu/ETD/image/etd1534.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
26

Salazar-Ferrer, Olivier. "Climatic variability and aperiodic behaviour: low order climate models and dynamical reconstruction". Doctoral thesis, Universite Libre de Bruxelles, 1989. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/213250.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
27

Osborn, Timothy J. "Internally-generated variability in some ocean models on decadal to millennial timescales". Thesis, University of East Anglia, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.297045.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
28

Cetina, Englada Carlos. "Achieving Autonomic Computing through the Use of Variability Models at Run-time". Doctoral thesis, Universitat Politècnica de València, 2010. http://hdl.handle.net/10251/7484.

Texto completo
Resumen
Increasingly, software needs to dynamically adapt its behavior at run-time in response to changing conditions in the supporting computing infrastructure and in the surrounding physical environment. Adaptability is emerging as a necessary underlying capability, particularly for highly dynamic systems such as context-aware or ubiquitous systems. By automating tasks such as installation, adaptation, or healing, Autonomic Computing envisions computing environments that evolve without the need for human intervention. Even though there is a fair amount of work on architectures and their theoretical design, Autonomic Computing has been criticised as being a "hype topic" because very little of it has been implemented fully. Furthermore, given that the autonomic system must change states at runtime and that some of those states may emerge and are much less deterministic, there is a great challenge to provide new guidelines, techniques and tools to help autonomic system development. This thesis shows that building upon the central ideas of Model-Driven Development (models as first-order citizens) and Software Product Lines (variability management) can play a significant role as we move towards implementing the key self-management properties associated with autonomic computing. The presented approach encompasses systems that are capable of modifying their own behavior with respect to changes in their operating environment, by using variability models as if they were the policies that drive the system's autonomic reconfiguration at runtime. Under a set of reconfiguration commands, the components that make up the architecture dynamically cooperate to change the configuration of the architecture to a new configuration. This work also provides the implementation of a Model-Based Reconfiguration Engine (MoRE) to blend the above ideas. Given a context event, MoRE queries the variability models to determine how the system should evolve, and then it provides the mechanisms for modifying the system.
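The runtime query of a variability model can be caricatured as predicate evaluation over the current context: each feature carries an activation condition, and the set of features whose condition holds is the configuration the system should move to. The feature names and conditions below are invented, and MoRE's real variability models are far richer than a dictionary of lambdas:

```python
def resolve_features(variability_model, context):
    """Query a toy variability model at run time: the active
    configuration is the set of features whose context predicate holds."""
    return {f for f, cond in variability_model.items() if cond(context)}

# Hypothetical smart-home features with context-activation conditions.
model = {
    "night-display": lambda ctx: ctx["lux"] < 50,
    "presence-alarm": lambda ctx: ctx["motion"] and not ctx["occupants"],
    "auto-blinds": lambda ctx: ctx["lux"] > 800,
}
before = resolve_features(model, {"lux": 900, "motion": False, "occupants": True})
after = resolve_features(model, {"lux": 20, "motion": True, "occupants": False})
# Set differences give the features to activate and deactivate.
print(sorted(after - before), sorted(before - after))
```

The two set differences play the role of the reconfiguration commands under which the architecture's components cooperate to reach the new configuration.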
Cetina Englada, C. (2010). Achieving Autonomic Computing through the Use of Variability Models at Run-time [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/7484
Los estilos APA, Harvard, Vancouver, ISO, etc.
29

Cheung, Anson H., Michael E. Mann, Byron A. Steinman, Leela M. Frankcombe, Matthew H. England y Sonya K. Miller. "Comparison of Low-Frequency Internal Climate Variability in CMIP5 Models and Observations". AMER METEOROLOGICAL SOC, 2017. http://hdl.handle.net/10150/624456.

Texto completo
Resumen
Low-frequency internal climate variability (ICV) plays an important role in modulating global surface temperature, regional climate, and climate extremes. However, it has not been completely characterized in the instrumental record and in the Coupled Model Intercomparison Project phase 5 (CMIP5) model ensemble. In this study, the surface temperature ICV of the North Pacific (NP), North Atlantic (NA), and Northern Hemisphere (NH) in the instrumental record and historical CMIP5 all-forcing simulations is isolated using a semiempirical method wherein the CMIP5 ensemble mean is applied as the external forcing signal and removed from each time series. Comparison of ICV signals derived from this semiempirical method as well as from analysis of ICV in CMIP5 preindustrial control runs reveals disagreement in the spatial pattern and amplitude between models and instrumental data on multidecadal time scales (>20 yr). Analysis of the amplitude of total variability and the ICV in the models and instrumental data indicates that the models underestimate ICV amplitude on low-frequency time scales (>20 yr in the NA; >40 yr in the NP), while agreement is found in the NH variability. A multiple linear regression analysis of ICV in the instrumental record shows that variability in the NP drives decadal-to-interdecadal variability in the NH, whereas the NA drives multidecadal variability in the NH. Analysis of the CMIP5 historical simulations does not reveal such a relationship, indicating model limitations in simulating ICV. These findings demonstrate the need to better characterize low-frequency ICV, which may help improve attribution and decadal prediction.
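The semiempirical isolation step has a direct numerical reading: treat the CMIP5 multi-model ensemble mean as the externally forced signal and subtract it from each temperature series, leaving an estimate of internal variability. The anomaly values below are invented for illustration, and the paper's method additionally rescales the forced signal for each series before subtraction:

```python
def internal_variability(series, ensemble_mean):
    """Estimate internal climate variability (ICV) by removing the
    ensemble-mean (forced) signal from a temperature series, element by
    element -- the core of the semiempirical method described above."""
    return [t - f for t, f in zip(series, ensemble_mean)]

obs =    [0.10, 0.18, 0.05, 0.30, 0.22]   # invented annual anomalies (K)
forced = [0.08, 0.12, 0.14, 0.18, 0.24]   # invented ensemble-mean signal
icv = internal_variability(obs, forced)
print([round(x, 2) for x in icv])
```

The residual series is what gets compared, in spatial pattern and amplitude, against the ICV diagnosed from the preindustrial control runs.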
30

Forney, Karin A. "Patterns of variability and environmental models of relative abundance for California cetaceans /". Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1997. http://wwwlib.umi.com/cr/ucsd/fullcit?p9823699.

31

Pouder, Jessica Anne. "Using Human Footprint Models and Land-Cover Variability to Predict Ecological Processes". W&M ScholarWorks, 2014. https://scholarworks.wm.edu/etd/1539626953.

32

Brogan, Roisin. "The Variability of the R Magnitude in Dynamical Models of AGB Stars". Thesis, Uppsala universitet, Teoretisk astrofysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-392377.

This report will first give a brief background on asymptotic giant branch (AGB) stars and the characteristics that make them interesting to study. Some methods and tools used in the field are then introduced, before the photometric variability of these stars is investigated. This is achieved by using data from dynamical models of AGB stars with differing chemical abundances. The R, J and K bands of the UBVRI system are specifically investigated to explore whether these are good candidates for AGB photometric and spectroscopic research. Lastly, the molecular features at these wavelengths are investigated to understand the impact they have on the photometric variability during the pulsation cycle and which molecules are most prominent in it.
33

Vazquez, Heather. "Evaluating Changes to Natural Variability on a Warming Globe in CMIP5 Models". FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3737.

Global mean surface temperatures (GMST) warmed in the early 20th century, experienced a mid-century lull, and warmed again steadily until 1997. Observations at the turn of the 21st century have revealed another period of subdued warming of GMSTs from 1998 to 2012, thus prompting the notion of a global warming “hiatus”. The warming hiatus occurred concurrently with steadily increasing atmospheric greenhouse gas concentrations, sea level rise, and retreating Arctic sea ice. The occurrence of the warming hiatus suggests that natural variability continues to be a sizable contributor to modern climate change and implies that energy is rearranged or redistributed within the climate system. Much of the scientific research conducted over the last decade has attempted to identify which modes of natural variability may be contributing to the GMST signal in the presence of anthropogenic warming. Many of these studies concluded that modes of natural variability operating in the global oceans were the largest contributors to GMST. What remains unclear is how oceanic variability and its contribution to GMST may change on a warmer globe as greenhouse gas concentrations continue to rise. Our research includes diagnostic analyses of the available observational surface temperature estimates and novel state-of-the-art climate model experiments from the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Our analyses seek to understand how the natural modes of variability within the ocean will change under different warming scenarios. Utilizing simulations forced with observed pre-industrial and historical greenhouse gas emissions in combination with several future warming simulations, we quantify the probability of similar “hiatus-like” periods occurring on a warmer globe. To that end we employ various metrics and detrending techniques, including EOF decomposition, running climatologies, and linear and nonlinear trends, to elucidate how natural variability changes over time.
We also examine the changing influence of natural modes of variability with respect to the anthropogenic radiative forcing over different regions of the globe. Results suggest that natural variability for much of the global oceans decreases as the radiative forcing increases in the future warming scenarios.
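The EOF decomposition mentioned among the detrending techniques above can be illustrated with a minimal SVD-based sketch; the "SST" field below is synthetic (one planted slow mode plus noise), not CMIP5 output.

```python
import numpy as np

rng = np.random.default_rng(2)
n_time, n_space = 120, 500
pattern = rng.standard_normal(n_space)               # spatial mode (toy)
pc = np.sin(2 * np.pi * np.arange(n_time) / 60)      # slow principal component
field = np.outer(pc, pattern) + 0.5 * rng.standard_normal((n_time, n_space))

anom = field - field.mean(axis=0)                    # remove the time mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)  # EOFs are rows of vt
explained = s ** 2 / np.sum(s ** 2)                  # variance fraction per mode
print(round(float(explained[0]), 2))
```

The leading mode recovers the planted pattern and accounts for most of the variance; in practice the principal-component time series (columns of `u`) would be the objects compared across warming scenarios.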
34

Serra, Yolande L. and Kerrie Geil. "Historical and Projected Eastern Pacific and Intra-Americas Sea TD-Wave Activity in a Selection of IPCC AR5 Models". AMER METEOROLOGICAL SOC, 2017. http://hdl.handle.net/10150/624034.

The tracks of westward-propagating synoptic disturbances across the Intra-Americas Sea (IAS) and far-eastern Pacific, known as easterly waves or tropical depression (TD) waves, are an important feature of the region's climate. They are associated with heavy rainfall events, seed the majority of tropical cyclones, and contribute to the mean rainfall across the region. This study examines the ability of current climate models (CMIP5) to simulate TD-wave activity and associated environmental factors across the IAS and far-eastern Pacific as compared to reanalysis. Model projections for the future are then compared with the historical model experiment to investigate the southward shift in CMIP5 track density and the environmental factors that may contribute to it. While historical biases in TD-wave track-density patterns are well correlated with model biases in sea surface temperature and midlevel moisture, the projected southward shift of the TD track density by the end of the twenty-first century in CMIP5 models is best correlated with changes in deep wind shear and midlevel moisture. In addition, the genesis potential index is found to be a good indicator of both present and future regions of high TD-wave track density for the models in this region. This last result may be useful for understanding the more complex relationship between tropical cyclones and this index in models found in other studies.
35

Santoso, Agus, Mathematics & Statistics, Faculty of Science, UNSW. "Evolution of climate anomalies and variability of Southern Ocean water masses on interannual to centennial time scales". Awarded by: University of New South Wales. School of Mathematics and Statistics, 2005. http://handle.unsw.edu.au/1959.4/33355.

In this study the natural variability of Southern Ocean water masses on interannual to centennial time scales is investigated using a long-term integration of the Commonwealth Scientific and Industrial Research Organisation (CSIRO) coupled climate model. We focus our attention on analysing the variability of Antarctic Intermediate Water (AAIW), Circumpolar Deep Water (CDW), and Antarctic Bottom Water (AABW). We present an analysis of the dominant modes of temperature and salinity (T - S) variability within these water masses. Climate signals are detected and analysed as they get transmitted into the interior from the water mass formation regions. Eastward propagating wavenumber-1, -2, and -3 signals are identified using a complex empirical orthogonal function (CEOF) analysis along the core of the AAIW layer. Variability in air-sea heat fluxes and ice meltwater rates is shown by heat and salt budget analyses to control variability of Antarctic Surface Water where density surfaces associated with AAIW outcrop. The dominant mode in the CDW layer is found to exhibit an interbasin scale of variability originating from the North Atlantic, and propagating southward into the Southern Ocean. Salinity dipole anomalies appear to propagate around the Atlantic meridional overturning circulation with the strengthening and weakening of North Atlantic Deep Water formation. In the AABW layer, T - S anomalies are shown to originate from the southwestern Weddell Sea, driven by salinity variations and convective overturning in the region. It is also demonstrated that the model exhibits spatial patterns of T - S variability for the most part consistent with the limited observational record in the Southern Hemisphere. However, some observations of decadal T - S changes are found to be beyond that seen in the model in its unperturbed state. We further assess sea surface temperature (SST) variability modes in the Indian Ocean on interannual time scales in the CSIRO model and in reanalysis data. 
The emergence of a meridional SST dipole during years of southwest Western Australian rainfall extremes is shown to be connected to a large-scale mode of Indian Ocean climate variability. The evolution of the dipole is controlled by variations in atmospheric circulation driving anomalous latent heat fluxes, with wind-driven ocean transport moderating the impact of evaporation and setting the conditions favourable for the next generation phase of an opposite dipole.
36

Steele, Clint. "The prediction and management of the variability of manufacturing operations". Swinburne University of Technology, 2005. http://adt.lib.swin.edu.au./public/adt-VSWT20060815.151147.

Aim: To investigate methods that can be used to predict and manage the effects of manufacturing variability on product quality during the design process. Methodology: The preliminary investigation is a review and analysis of probabilistic methods and quality metrics. Based on this analysis, convenient robustification methods are developed. In addition, the nature of the flow of variability in a system is considered. This is then used to ascertain the information needed for an input variable when predicting the quality of a proposed design. The second, and major, part of the investigation is a case-by-case analysis of a collection of manufacturing operations and material properties. Each is initially analysed from first principles. On completion, the fundamental causes of variability of the key characteristic(s) are identified. Where possible, the expected variability for each of those characteristics has been determined. Where this determination was not possible, qualitative conclusions about the variability are made instead. In each case, findings on the prediction and management of manufacturing variability are made.
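One standard probabilistic method for predicting how manufacturing variability flows into product quality, in the spirit of the review described above, is Monte Carlo propagation of component tolerances to an assembly characteristic. A minimal sketch with invented shaft/bore tolerances (nothing here comes from the thesis):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
shaft = rng.normal(9.98, 0.010, n)     # shaft diameter, mm (assumed tolerance)
bore = rng.normal(10.03, 0.012, n)     # bore diameter, mm (assumed tolerance)
clearance = bore - shaft               # assembly characteristic of interest

mean, std = clearance.mean(), clearance.std()
defect_rate = np.mean(clearance <= 0)  # fraction of interference fits
print(round(mean, 4), round(std, 4), defect_rate)
```

For independent normal inputs the result agrees with the analytic root-sum-square combination (std ≈ sqrt(0.010² + 0.012²) ≈ 0.0156 mm); the Monte Carlo approach generalises to nonlinear characteristics where no closed form exists.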
37

Johnson, David. "The spatial and temporal variability of nearshore currents". University of Western Australia. Centre for Water Research, 2004. http://theses.library.uwa.edu.au/adt-WU2004.0067.

The nearshore current field, defined here as the residual horizontal flow after averaging over the incident wave period, exhibits variability at a range of time and space scales. Some of the variable currents are low frequency gravity wave motions. However, variable, rotational (in the sense of possessing vertical vorticity) flow can also exist as part of the overall nearshore current field. A field and numerical modelling investigation of these variable rotational currents has been carried out. Drifters, which were developed for surfzone use, enabled measurement of the nearshore current structure; the design and testing of these new instruments is described. Two sets of field measurements, using the new drifters and Eulerian instruments were carried out for conditions with swell perpendicular to a plane beach and in strong longshore currents. In the perpendicular swell conditions, an interesting and well-defined feature of the measured trajectories was the development of transient rip currents. Discrete vortices were also observed. In the longshore current case, trajectories with the longshore current displacement removed had complex meandering paths. Lagrangian data were used to make estimates of length scales and dispersion, both of which provide strong evidence that the current field cannot be due to low frequency gravity waves alone. Under the assumption of equipartition of kinetic and potential energy for low frequency gravity waves, Eulerian measurements of velocities and pressure show significant energy due to non-divergent, rotational flow in both the perpendicular swell and longshore current case. A numerical model that can simulate horizontal flow with a directionally spread, random wave field incident on a plane beach was implemented. The model developed transient rip currents that are qualitatively very similar to those seen in the drifter trajectories from the field. 
The number and intensity of rip currents in the model depended on the beach slope and incident wave spectra. The energy content and cross-shore flux (and hence transport of material) of the rotational current flow component in the simulated flow fields is comparable to that due to low frequency gravity waves. The modelling also provided some evidence that there may be universal characteristics of the rotational currents. The field results and modelling show that variable rotational currents are ubiquitous in the field even when longshore currents and hence shear waves are not present. The term “infragravity turbulence” is suggested to describe the general class of nearshore hydrodynamics not directly associated with shear waves, which is largely disorganised, but contains well defined features such as transient rip currents and large scale horizontal vortices. The results have important implications for the understanding of the transport of material, including sediment, biological material, pollution, and sometimes bathers, in the nearshore zone.
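The Lagrangian dispersion estimates mentioned above can be illustrated with a toy calculation; a plain random walk stands in for real drifter tracks, and for such a walk the mean-square displacement grows linearly in time (diffusive scaling), so doubling the elapsed time roughly doubles the dispersion.

```python
import numpy as np

rng = np.random.default_rng(8)
n_drifters, n_steps = 200, 400
steps = rng.standard_normal((n_drifters, n_steps, 2))   # 2-D displacement per step (toy units)
pos = np.cumsum(steps, axis=1)                          # trajectories

# Absolute (single-particle) dispersion: mean squared displacement from
# each drifter's starting position, averaged over the ensemble.
disp = np.mean(np.sum((pos - pos[:, :1, :]) ** 2, axis=2), axis=0)
ratio = float(disp[-1] / disp[n_steps // 2 - 1])        # ~2 for diffusive spreading
print(round(ratio, 2))
```

Departures from this linear growth (e.g., ballistic spreading at short times, or trapping in vortices) are exactly the kind of signature such dispersion statistics are used to detect.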
38

Struble, Nigel. "Measuring Glycemic Variability and Predicting Blood Glucose Levels Using Machine Learning Regression Models". Ohio University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1382664092.

39

Robinson, Emma Claire. "Characterising population variability in brain structure through models of whole-brain structural connectivity". Thesis, Imperial College London, 2010. http://hdl.handle.net/10044/1/5875.

Models of whole-brain connectivity are valuable for understanding neurological function. This thesis seeks to develop an optimal framework for extracting models of whole-brain connectivity from clinically acquired diffusion data. We propose new approaches for studying these models. The aim is to develop techniques which can take models of brain connectivity and use them to identify biomarkers or phenotypes of disease. The models of connectivity are extracted using a standard probabilistic tractography algorithm, modified to assess the structural integrity of tracts, through estimates of white matter anisotropy. Connections are traced between 77 regions of interest, automatically extracted by label propagation from multiple brain atlases followed by classifier fusion. The estimates of tissue integrity for each tract are input as indices in 77x77 “connectivity” matrices, extracted for large populations of clinical data. These are compared in subsequent studies. To date, most whole-brain connectivity studies have characterised population differences using graph theory techniques. However these can be limited in their ability to pinpoint the locations of differences in the underlying neural anatomy. Therefore, this thesis proposes new techniques. These include a spectral clustering approach for comparing population differences in the clustering properties of weighted brain networks. In addition, machine learning approaches are suggested for the first time. These are particularly advantageous as they allow classification of subjects and extraction of features which best represent the differences between groups. One limitation of the proposed approach is that errors propagate from segmentation and registration steps prior to tractography. This can culminate in the assignment of false positive connections, where the contribution of these factors may vary across populations, causing the appearance of population differences where there are none. 
The final contribution of this thesis is therefore to develop a common co-ordinate space approach. This combines probabilistic models of voxel-wise diffusion for each subject into a single probabilistic model of diffusion for the population. This allows tractography to be performed only once, ensuring that there is one model of connectivity. Cross-subject differences can then be identified by mapping individual subjects’ anisotropy data to this model. The approach is used to compare populations separated by age and gender.
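A minimal sketch of spectral clustering applied to a symmetric 77x77 weighted connectivity matrix, in the spirit of the approach described above. The weights here are random numbers rather than tract-integrity indices, and the normalised-Laplacian-plus-k-means pipeline is one common variant, not necessarily the thesis's exact formulation.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(42)
n = 77
w = rng.random((n, n))
w = (w + w.T) / 2.0                              # symmetrise the weight matrix
np.fill_diagonal(w, 0.0)

d = w.sum(axis=1)                                # node strengths
lap = np.eye(n) - w / np.sqrt(np.outer(d, d))    # normalised graph Laplacian
_, vecs = np.linalg.eigh(lap)                    # eigenvectors, ascending eigenvalues
embedding = vecs[:, :4]                          # spectral embedding (4 smallest modes)
_, labels = kmeans2(embedding, 4, minit="++", seed=0)
print(np.bincount(labels, minlength=4))          # cluster sizes
```

Comparing how such cluster assignments differ between subject groups is one way to localise population differences that global graph-theory summaries can miss.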
40

Bates, Rebecca Anne. "Speaker dynamics as a source of pronunciation variability for continuous speech recognition models /". Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/5858.

41

Galindo, Duarte José Ángel. "Evolution, testing and configuration of variability intensive systems". Thesis, Rennes 1, 2015. http://www.theses.fr/2015REN1S008/document.

An important characteristic of software is its ability to be adapted and configured for different scenarios. Recently, software variability has been studied as a first-class concept in domains ranging from software product lines to ubiquitous systems. Variability is the ability of a software product to vary according to different circumstances. Variability-intensive systems are software products in which variability management is a predominant engineering activity. The various parts of these systems are commonly modelled using different forms of “variability model”, a widely used modelling formalism. Feature models were introduced by Kang et al. in 1990 and are a compact representation of a set of configurations of a variability-intensive system. The large number of configurations of a feature model makes manual analysis infeasible, so computer-aided mechanisms emerged as a solution for extracting useful information from feature models. This process of extracting information from feature models is known in the scientific literature as “automated analysis of feature models” and has been one of the main research areas in recent years, with more than thirty analysis operations proposed during this period. In this thesis we identified several open questions in the automated analysis field and pursued several lines of research. Driven by real-world scenarios (e.g., mobile telephony and video surveillance), we contributed by applying, adapting or extending automated analysis operations for the evolution, testing and configuration of variability-intensive systems.
The large number of configurations that a feature model can encode makes the manual analysis of feature models an error-prone and costly task. Computer-aided mechanisms therefore appeared as a solution to extract useful information from feature models. This process of extracting information from feature models is known as “Automated Analysis of Feature Models” and has been one of the main areas of research in recent years, with more than thirty analysis operations proposed. In this dissertation we looked for different tendencies in the automated analysis field and found several research opportunities. Driven by real-world scenarios such as the smartphone and video-surveillance domains, we contributed by applying, adapting or extending automated analysis operations to the evolution, testing and configuration of variability-intensive systems.
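A deliberately tiny example of the kind of analysis operation discussed above: counting the valid configurations of an invented four-feature model by brute-force enumeration. Real automated-analysis tools delegate this to SAT, BDD or CSP solvers, since the configuration space grows exponentially with the number of features.

```python
from itertools import product

# Invented toy feature model: a root, a mandatory "gui" feature, and two
# optional features with a cross-tree exclusion constraint between them.
features = ["root", "gui", "ssl", "logging"]

def valid(sel):
    root, gui, ssl, logging = sel
    return (root and gui                 # root always selected, "gui" mandatory
            and not (ssl and logging))   # toy exclusion: ssl excludes logging

configs = [dict(zip(features, sel))
           for sel in product([False, True], repeat=len(features))
           if valid(sel)]
print(len(configs))  # 3 valid configurations
```

Other classic analysis operations (dead-feature detection, core features, commonality) reduce to similar queries over this set of valid configurations.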
42

Soukharev, B. E. and L. L. Hood. "Solar cycle variation of stratospheric ozone: Multiple regression analysis of long-term satellite data sets and comparisons with models". AMER GEOPHYSICAL UNION, 2006. http://hdl.handle.net/10150/623340.

Previous multiple regression analyses of the solar cycle variation of stratospheric ozone are improved by (1) analyzing three independent satellite ozone data sets with lengths extending up to 25 years and (2) comparing column ozone measurements with ozone profile data during the 1992–2003 period when no major volcanic eruptions occurred. Results show that the vertical structure of the tropical ozone solar cycle response has been consistently characterized by statistically significant positive responses in the upper and lower stratosphere and by statistically insignificant responses in the middle stratosphere (∼28–38 km altitude). This vertical structure differs from that predicted by most models. The similar vertical structure in the tropics obtained for separate time intervals (with minimum response invariably near 10 hPa) is difficult to explain by random interference from the QBO and volcanic eruptions in the statistical analysis. The observed increase in tropical total column ozone approaching the cycle 23 maximum during the late 1990s occurred primarily in the lower stratosphere below the 30 hPa level. A mainly dynamical origin for the solar cycle total ozone variation at low latitudes is therefore likely. The amplitude of the solar cycle ozone variation in the tropical upper stratosphere derived here is somewhat reduced in comparison to earlier results. Additional data are needed to determine whether this upper stratospheric response is or is not larger than model estimates.
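The multiple-regression setup described above can be sketched schematically. The series below are synthetic stand-ins (a sine for the solar cycle, another for the QBO, a linear trend, and noise), not the satellite ozone records analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(300)                                # months
solar = np.sin(2 * np.pi * t / 132)               # ~11-yr solar-cycle proxy
qbo = np.sin(2 * np.pi * t / 28)                  # ~28-month QBO proxy
ozone = -0.01 * t + 1.5 * solar + 0.5 * qbo + 0.3 * rng.standard_normal(t.size)

# Ordinary least squares: ozone anomalies on intercept, trend, solar, QBO.
X = np.column_stack([np.ones_like(t, dtype=float), t, solar, qbo])
coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)
print(coef)  # intercept, trend, solar and QBO coefficients
```

The fitted solar coefficient is the "solar cycle response" in this framework; the paper's point about interference from the QBO and volcanic terms corresponds to collinearity among the regressors inflating the uncertainty of that coefficient.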
43

Knowlton, Nicholas Scott. "Robust estimation of inter-chip variability to improve microarray sample size calculations". Oklahoma City : [s.n.], 2005.

44

Kuhlbrodt, Till. "Stability and variability of open-ocean deep convection in deterministic and stochastic simple models". Phd thesis, [S.l. : s.n.], 2002. http://pub.ub.uni-potsdam.de/2002/0033/kuhlb.pdf.

45

Myers, Timothy Albert. "Investigating the variability of subtropical marine boundary layer clouds in observations and climate models". Thesis, University of California, San Diego, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3714206.


Low-level clouds found over the eastern subtropical oceans have a substantial cooling effect on Earth’s climate since they strongly reflect solar radiation back to space, and their simulation in climate models contributes to large uncertainty in global warming projections. This thesis aims to increase understanding of these marine boundary layer clouds through observational analysis, theoretical considerations, and an evaluation of their simulation in climate models. Examination of statistical relationships between cloud properties and large-scale meteorological variables is a key method employed throughout the thesis. The meteorological environment of marine boundary layer clouds shapes their properties by affecting the boundary layer’s depth and structure.

It is found that enhanced subsidence, typically thought to promote boundary layer cloudiness, actually reduces cloudiness when the confounding effect of the strength of the temperature inversion capping the boundary layer is taken into account. A conceptual model is able to explain this result. Next, fundamental deficiencies in the simulation of subtropical clouds in two generations of climate models are identified. Remarkably, the newer generation of climate models is in some ways inferior to the older generation in terms of capturing key low-level cloud processes. Subtropical mid- and high-level clouds are also found to contribute more to variability in the radiation budget at the top of the atmosphere than previously thought. In the last portion of the thesis, large inter-model spread in subtropical cloud feedbacks is shown to arise primarily from differences in the simulation of the interannual relationship between shortwave cloud radiative effect and sea surface temperature. An observational constraint on this feedback suggests that subtropical marine boundary layer clouds will act as a positive feedback to global warming.

46

Das, Tapash. "The impact of spatial variability of precipitation on the predictive uncertainty of hydrological models". [S.l. : s.n.], 2006. http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-28827.

47

Mistry, Malcolm. "Impacts of climate change and variability on crop yields using emulators and empirical models". Doctoral thesis, Università Ca' Foscari Venezia, 2017. http://hdl.handle.net/10278/3716714.

The thesis assesses impacts of climate change and variability on regional and global crop yields using econometric approaches to analyze global gridded data. Using a large panel data set from six Global Gridded Crop Models (GGCMs) for four rainfed crops (maize, rice, soybeans and wheat), an emulator amenable to integration into Integrated Assessment Models (IAMs) is built. The performance of the emulator is evaluated against observation-based empirical models at regional scale by building a statistical model calibrated on historical observed crop yield data for United States (U.S.) counties. Chapter 1 provides the background of existing research methodologies in the agronomic literature. The gaps in existing research and the scope for further work are laid down as the motivation and objectives of the research in the subsequent chapters. Chapter 2 discusses the data, methodology and framework used in the construction of a simple statistical emulator of the response of crops to weather shocks simulated by crop models. To facilitate the integration of the emulator into IAMs, the simplest model, using a base specification of linear fixed effects with time-trend interactions, is developed. Chapter 3 investigates modifications to the base specification with a series of robustness checks exploring the suitability of an additional predictor variable, the stratification of coefficients geographically by groups of Agro-Ecological Zones (AEZs) and, most importantly, the role of spatial dependence in the variables by applying a spatial model. Chapter 4 compares the performance of the statistical emulator calibrated on crop model results with empirical models of crop responses based on historical data. The comparison focuses on U.S. counties. The base specification from Chapter 2, together with historical observed data from the U.S. Department of Agriculture (USDA), is utilized in an inter-comparison exercise examining divergence in results and its implications. 
Collectively, the three chapters (2-4) address several important questions: (1) What do reduced-form statistical response surfaces trained on crop model outputs from various simulation specifications look like? (2) Do model-based crop response functions vary systematically over space (e.g., crop suitability zones) and across crop models? (3) How do model-based crop response functions compare to crop responses estimated using historical observations? (4) What are the implications for the characterization of future climate risks? Chapter 5 concludes the thesis providing a summary of key contributions and suggestions for future work.
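The "linear fixed effects with time trend interactions" base specification described above can be illustrated, in heavily simplified form, with a within-transformation regression on made-up panel data: one weather regressor, no time trends, and an invented temperature coefficient of 0.08.

```python
import numpy as np

rng = np.random.default_rng(7)
n_cells, n_years = 50, 40
cell_effect = rng.standard_normal(n_cells)[:, None]      # grid-cell fixed effects
temp = 20 + rng.standard_normal((n_cells, n_years))      # weather regressor (toy)
log_yield = cell_effect + 0.08 * temp + 0.1 * rng.standard_normal((n_cells, n_years))

# The within transformation (demeaning by cell) sweeps out the fixed effects,
# leaving a simple OLS regression of demeaned yields on demeaned weather.
y = (log_yield - log_yield.mean(axis=1, keepdims=True)).ravel()
x = (temp - temp.mean(axis=1, keepdims=True)).ravel()
beta = float(x @ y) / float(x @ x)
print(round(beta, 3))  # recovers a response near the planted 0.08
```

The thesis's actual specification adds multiple weather terms and cell-specific time trends, but the demeaning logic that identifies the weather response is the same.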
48

Mistry, Malcolm Noshir <1977>. "Impacts of climate change and variability on crop yields using emulators and empirical models". Doctoral thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/10345.

The thesis assesses impacts of climate change and variability on regional and global crop yields using econometric approaches to analyze global gridded data. Using a large dimension panel data of six Global Gridded Crop Models (GGCMs) for four rainfed crops (maize, rice, soybeans and wheat) an emulator suitable/amenable of being integrated into Integrated Assessment Models (IAMs) is built. The performance of the emulator is evaluated against observational-based, empirical models at regional scale by building a statistical model calibrated on historical observed crop yields data for United States (U.S.) counties. Chapter 1 provides the background of existing research methodologies in agronomic literature. The gaps in existing research and scope for research are laid down as motivation and objectives of the research that follows in the subsequent chapters. Chapter 2 discusses the data, methodology and framework used in the construction of a simple statistical emulator of the response of crops to weather shocks simulated by crop models. To facilitate the integration of the emulator into IAMs, the simplest model using a base specification of linear fixed effect with time trend interactions is developed. Chapter 3 investigates modifications to the base specification with a series of robustness checks exploring the suitability of an additional predictor variable, the stratification of coefficients geographically by groups of Agro-Ecological Zones (AEZs); and most importantly, the role of spatial dependence in variables by applying a spatial model. Chapter 4 compares the performance of the statistical emulator calibrated on crop model results, with an empirical models of crop responses based on historical data. The comparison focuses on U.S. counties. The base specification from Chapter 2 together with historical observed data from the U.S. Department of Agriculture (USDA), are utilized in an inter-comparison exercise for divergence in results and subsequent implications. 
Collectively, the three chapters (2-4) address several important questions: (1) what do reduced-form statistical response surfaces trained on crop model outputs from various simulation specifications look like; (2) do model-based crop response functions vary systematically over space (e.g., crop suitability zones) and across crop models; (3) how do model-based crop response functions compare to crop responses estimated using historical observations; and (4) what are the implications for the characterization of future climate risks? Chapter 5 concludes the thesis, providing a summary of key contributions and suggestions for future work.
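The base specification described in the abstract, a linear fixed-effects panel regression of yields on weather variables with time-trend interactions, can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the thesis's actual emulator: the covariates, coefficients and the within-group demeaning approach are assumptions for the sake of the example.

```python
import numpy as np

# Sketch of a fixed-effects panel "emulator": grid-cell fixed effects are
# absorbed by within-group demeaning, then yields are regressed on weather
# covariates and a weather-by-trend interaction. All data are synthetic.
rng = np.random.default_rng(0)

n_cells, n_years = 50, 30
cell = np.repeat(np.arange(n_cells), n_years)   # grid-cell index
year = np.tile(np.arange(n_years), n_cells)     # time trend

temp = rng.normal(20, 2, n_cells * n_years)     # growing-season temperature
precip = rng.normal(500, 80, n_cells * n_years) # growing-season precipitation

# Synthetic log-yield with cell fixed effects and a trend interaction
alpha = rng.normal(0, 1, n_cells)
log_yield = (alpha[cell] + 0.05 * temp - 0.0004 * temp**2
             + 0.001 * precip + 0.0005 * temp * year
             + rng.normal(0, 0.1, len(cell)))

def within_demean(x, groups):
    """Subtract each group's mean (absorbs group fixed effects)."""
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, x)
    counts = np.bincount(groups)
    return x - (sums / counts)[groups]

X = np.column_stack([temp, temp**2, precip, temp * year])
Xd = np.column_stack([within_demean(col, cell) for col in X.T])
yd = within_demean(log_yield, cell)

beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
print(beta)  # estimated weather and trend-interaction coefficients
```

The within-transformation is one standard way to estimate such a specification; the thesis's actual estimator, covariates and functional form may differ.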
49

O'Hara, Jeffrey Keith. "Water resources planning under climate change and variability". Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2007. http://wwwlib.umi.com/cr/ucsd/fullcit?p3259069.

Full text
Abstract
Thesis (Ph. D.)--University of California, San Diego, 2007.
Title from first page of PDF file (viewed June 21, 2007). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references.
50

Chimidza, Oyapo. "The variability and predictability of the IRI shape parameters over Grahamstown, South Africa". Thesis, Rhodes University, 2008. http://hdl.handle.net/10962/d1005282.

Full text
Abstract
The International Reference Ionosphere (IRI) shape parameters B0, B1 and D1 provide a representation of the shape of the F2 layer, the thickness of the F2 layer and the shape of the F1 layer of the ionosphere, respectively. The aim of this study was to examine the variability of these parameters using Grahamstown, South Africa (33.3°S, 26.5°E) ionosonde data and to determine their predictability by the IRI-2001 model. A further aim of this study was to investigate developing an alternative model for predicting these parameters. These parameters can be determined from electron density profiles that are inverted from ionograms recorded with an ionosonde. Data representing the B0, B1 and D1 parameters, at half-hourly or hourly intervals, were scaled and deduced from the digital pulse sounder (DPS) ionosonde for the period April 1996 to December 2006. An analysis of the diurnal, seasonal and solar variations in the behaviour of these parameters was undertaken for the years 2000, 2004 and 2005 using monthly medians. Comparisons between the observational results and those of the IRI model (IRI-2001 version) indicate that the IRI-2001 model does not accurately represent the diurnal and seasonal variation of the parameters. A preliminary model was thus developed using the technique of Neural Networks (NNs). All available data from the Grahamstown ionosonde from 1996 to 2006 were used in the training of the NNs and the prediction of the variation of the shape parameters. Inputs to the model were the day number, the hour of day, the solar activity and the magnetic index. Comparisons between the preliminary NN model and the IRI-2001 model indicated that the preliminary model was more accurate at predicting the parameters than the IRI-2001 model. This analysis showed the need to improve the existing IRI model or to develop a new model for the South African region.
This thesis describes the results from this feasibility study, which show the variability and predictability of the IRI shape parameters.
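The NN model above takes day number, hour of day, solar activity and a magnetic index as inputs. A minimal sketch of such a model is shown below, on synthetic data. The cyclic encoding of day and hour, the choice of F10.7 and Ap as proxies, and the network size are all illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical sketch: predict one IRI shape parameter (B0, in km) from
# day number, hour of day, solar activity and a magnetic index. The data
# here are synthetic; the thesis trained on Grahamstown ionosonde
# measurements from 1996 to 2006.
rng = np.random.default_rng(1)
n = 2000

day = rng.integers(1, 366, n)   # day of year
hour = rng.uniform(0, 24, n)    # hour of day
f107 = rng.uniform(70, 250, n)  # solar activity proxy (F10.7 flux)
ap = rng.uniform(0, 50, n)      # magnetic index (Ap)

# Cyclic inputs (day, hour) are encoded as sine/cosine pairs so that,
# e.g., 23:00 and 01:00 are close in input space.
X = np.column_stack([
    np.sin(2 * np.pi * day / 365), np.cos(2 * np.pi * day / 365),
    np.sin(2 * np.pi * hour / 24), np.cos(2 * np.pi * hour / 24),
    f107 / 250, ap / 50,
])
# Synthetic B0 with diurnal, seasonal and solar-activity structure
b0 = (80 + 30 * np.sin(2 * np.pi * hour / 24)
      + 10 * np.cos(2 * np.pi * day / 365) + 0.1 * f107
      + rng.normal(0, 2, n))
y = (b0 - b0.mean()) / b0.std()  # standardize the target for training

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X[:1500], y[:1500])
print("held-out R^2:", model.score(X[1500:], y[1500:]))
```

The sine/cosine encoding is a common device in ionospheric NN modelling, since it lets the network treat the day-of-year and hour-of-day inputs as periodic.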
