Dissertations / Theses on the topic 'Variability'

Consult the top 50 dissertations / theses for your research on the topic 'Variability.'

1

Tërnava, Xhevahire. "Gestion de la variabilité au niveau du code : modélisation, traçabilité et vérification de cohérence." Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4114/document.

Full text
Abstract:
When large software product lines are engineered, a combined set of traditional techniques, such as inheritance or design patterns, is likely to be used for implementing variability. In these techniques, the concept of a feature, as a reusable unit, does not have a first-class representation at the implementation level. Further, an inappropriate choice of techniques becomes a source of inconsistencies between the domain and the implemented variability. In this thesis, we study the diversity of the majority of variability implementation techniques and provide a catalog that covers an enriched set of them. We then propose a framework to explicitly capture and model, in a fragmented way, the variability implemented by several combined techniques in technical variability models. These models use variation points and variants, with their logical relations and binding times, to abstract the implementation techniques. We show how to extend the framework to trace features to their respective implementations. In addition, we use this framework and provide a tooled approach to check the consistency of the implemented variability. Our method uses slicing to partially check the corresponding propositional formulas at the domain and implementation levels in the case of a 1-to-m mapping, offering early and automatic detection of inconsistencies. As validation, we report on the implementation in Scala of the framework as an internal domain-specific language, and of the consistency checking method. These implementations have been applied to a real feature-rich system and to three product line case studies, showing the feasibility of the proposed contributions.
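The thesis's own Scala DSL and slicing method are not reproduced here; as a rough, hypothetical illustration of the kind of domain-versus-implementation consistency check the abstract describes (ignoring the slicing step and the 1-to-m feature-to-variation-point mapping, and using shared variable names), a brute-force propositional check could look like this:

```python
from itertools import product

def assignments(variables):
    """Yield every truth assignment over the given variable names."""
    for values in product([False, True], repeat=len(variables)):
        yield dict(zip(variables, values))

def consistent(domain_formula, impl_formula, variables):
    """Check that every configuration allowed by the implementation-level
    formula is also allowed by the domain-level formula (impl => domain)."""
    return all(domain_formula(a) or not impl_formula(a)
               for a in assignments(variables))

# Toy example (illustrative assumption): feature B requires feature A at the
# domain level, and the implementation only enables variant B together with A.
variables = ["A", "B"]
domain = lambda a: (not a["B"]) or a["A"]   # B -> A in the feature model
impl   = lambda a: (not a["B"]) or a["A"]   # variant B guarded by variant A
print(consistent(domain, impl, variables))  # True: no inconsistency detected
```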
APA, Harvard, Vancouver, ISO, and other styles
2

Zemánek, Ladislav. "Analýza variability srdečního rytmu pomocí entropie." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-220014.

Full text
Abstract:
The analysis of HRV is an advanced, noninvasive method used to investigate the autonomic nervous system, and HRV is one of the important indicators of its proper function. Heart rate variability can also be analyzed with entropy measures, which quantify the irregularity of the RR intervals of the HRV signal and can thus be used to diagnose cardiac diseases.
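As a generic, minimal illustration of one entropy measure commonly applied to RR-interval series (sample entropy), and not the specific method of this thesis, a NumPy sketch might look as follows; the parameters m and r_factor and the toy series are illustrative assumptions.

```python
import numpy as np

def sample_entropy(rr, m=2, r_factor=0.2):
    """Sample entropy of an RR-interval series, a common HRV irregularity
    measure: SampEn = -ln(A / B), where B counts pairs of length-m templates
    within tolerance r (Chebyshev distance) and A does the same for m+1."""
    rr = np.asarray(rr, dtype=float)
    r = r_factor * rr.std()
    n_templates = len(rr) - m                     # same count for both lengths

    def count_matches(length):
        templates = np.array([rr[i:i + length] for i in range(n_templates)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]),
                      axis=-1)
        return np.sum(dist <= r) - n_templates    # drop self-matches

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# Toy RR series in milliseconds, standing in for intervals from an ECG.
rr = 800 + 50 * np.random.default_rng(0).standard_normal(300)
print(sample_entropy(rr, m=2, r_factor=0.2))
```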
APA, Harvard, Vancouver, ISO, and other styles
3

Storme, Martin. "Variabilité des évaluations de la créativité." Thesis, Paris 5, 2013. http://www.theses.fr/2013PA05H107.

Full text
Abstract:
Since the 1950s, divergent thinking has been the main measure used to estimate an individual's level of creativity. Although many studies have been devoted to it, few have attempted to describe it as a process. Building on the foundational work of Lubart and Gilhooly, this thesis develops both a static and a dynamic approach to the divergent thinking process. More specifically, we propose an extension of Lubart and Getz's emotional resonance model applied to original associations (for the static aspect), and a Markov chain model of different dimensions of divergent thinking, such as strategies or categories of ideas (for the dynamic aspect).
This dissertation is devoted to the study of the variability of creativity evaluations, focusing on training non-expert judges to enhance their expertise. In the theoretical part, various issues related to creativity, judgment and variability are explored and provide hypotheses for the empirical part. The first series of studies justifies the relevance of applying a simplified model of creativity judgment inspired by Besemer & O’Quin (1999) to the evaluation by non-expert judges of graphic products made by children. The rest of the empirical studies are devoted to the investigation of the effect of training on 1) the stability and 2) the expertise of creativity evaluations. The model of creative judgment provides the mechanism explaining the effect of the training on the stability and expertise of creativity evaluations, by emphasizing the mediating role of the stability and expertise of the evaluations of relevant predictors (originality and elaboration) and of the integration function by which the judge combines predictors to make a creativity judgment. A final study examines the long-term effect of the training. These results are discussed and future research and applications are suggested.
APA, Harvard, Vancouver, ISO, and other styles
4

Salan, Jefferson. "Is variability appropriate? Encoding Variability and Transfer-Appropriate Processing." Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/99414.

Full text
Abstract:
Transfer-appropriate processing (TAP) proposes that retrieval success is based on the match between processing at encoding and retrieval. We propose that the processing described by TAP determines the contextual cues that are encoded with an event. At retrieval, the presence or absence of contextual cues matching the encoding cues will influence success. To implement these principles as a strategy to improve memory, the nature of future retrieval processing or cues must be known during encoding. As this is unlikely in real-world memory function, we propose that increased encoding variability – increasing the range of encoded cues – increases the likelihood of TAP when the retrieval scenario is unknown. The larger the set of encoded cues, the more likely those cues will recur during retrieval and therefore achieve TAP. Preliminary research in our lab (Diana, unpublished data) has found that increased encoding variability improves memory for item information in a novel retrieval context. To test whether this benefit to memory is due to the increased likelihood of TAP, the current experiment compared the effects of encoding variability under conditions that emphasize TAP to conditions that reduce TAP. We found main effects of encoding variability and TAP, but no interaction between the two. Planned comparisons between high and low variability encoding contexts within matching and non-matching retrieval contexts did not produce a significant difference between high and low variability when encoding-retrieval processing matched. We conclude that further studies are necessary to determine whether encoding variability has mechanisms that benefit memory beyond TAP.
M.S.
It is well accepted within the episodic memory literature that successful memory retrieval is often driven by context cues, specifically the cues that are stored with the memory of the event. To develop a better understanding of how episodic memory works, we must understand how manipulating context cues changes memory performance. One way to investigate the effects of context manipulation is using encoding variability, which refers to the amount of variability (i.e., change) in context cues from one repetition of an item or event to the next. Preliminary research in our lab (Diana, unpublished data) has found that increased encoding variability improves memory retrieval in a novel context, but it is unclear why this is the case. We proposed that the mental processing described by transfer-appropriate processing (TAP) – a principle stating that memory retrieval success is determined by the match, or overlap, between the mental processing at encoding (i.e., memory formation) and memory retrieval – determines the contextual cues that are stored with the memory at encoding. We hypothesized that encoding variability works even when TAP has already been achieved by matching the processing and cues at encoding to those at retrieval. Alternatively, we hypothesized that encoding variability works by specifically achieving TAP, so that encoding variability is only helpful when the encoding and retrieval contexts do not match. Results indicated partial support for the alternative hypothesis, suggesting that encoding variability works by achieving TAP. However, these results were not sufficiently conclusive, and it is likely that there are other mechanisms that allow encoding variability to improve memory. This study establishes the groundwork for future work examining encoding variability and its effects on memory.
APA, Harvard, Vancouver, ISO, and other styles
5

Santagata, Carmen. "L'utilisation de roches autres que le silex au Paléolithique ancien et moyen : choix économiques, techniques et fonctionnels sur la base de l'étude des gisements de Sainte-Anne 1 (Haute-Loire, France) (MIS 5 et 6) et Notarchirico (Basilicata, Italie) (MIS 14 à 17)." Thesis, Bordeaux 1, 2012. http://www.theses.fr/2012BOR14532/document.

Full text
Abstract:
The use of geomaterials other than flint in the Early and Middle Palaeolithic: economic, technical and functional choices based on the study of two sites, Sainte-Anne I (Haute-Loire, France) (MIS 5-7) and Notarchirico (Basilicata, Italy) (MIS 14-17). We tried to answer the question: has the variability of raw materials (petrographic characteristics and morphological aspects of the blanks) influenced prehistoric knappers during production, both in time and space? The creation and use of specific descriptive records for each category of objects allowed us to analyze the technical characteristics of the products and to select the variables that played a major role in the assessment of constraints and guided the lithic production. Critical analysis of the litho-stratigraphic contexts and consideration of the technical, technological or chronological diversity of the industries make it possible to reconsider the paradigms underlying the differentiation of the techniques: products of bifacial shaping (façonnage): bifaces and roughouts (ébauches), the Levallois production system, and the discoid production system. These terms have for too long concealed the variability expressed in Palaeolithic lithic production. We now need to reconsider the aims of lithic studies and work towards deciphering the plurality of behaviours of Palaeolithic individuals and societies.
APA, Harvard, Vancouver, ISO, and other styles
6

Vo, Van Olivier. "Introduction of variability into pantograph-catenary dynamic simulations." Thesis, Paris, ENSAM, 2016. http://www.theses.fr/2016ENAM0021/document.

Full text
Abstract:
In railways, electrical current is generally collected by the train through a complex coupled mechanical system composed of a pantograph and a catenary. Dynamic phenomena that occur during their interaction are still not fully understood. Furthermore, the system behaviour is sensitive to numerous parameters and thus highly variable. The first contribution of this thesis is a detailed analysis of the pantograph-catenary dynamic interaction, separating phenomena due to the dynamic response of the pantograph to the catenary geometry from wave propagations, reflections and transmissions that occur in the catenary. The coincidence of frequencies or characteristic times is then shown to explain most variations in the quantities of interest. Moreover, droppers surrounding the mast are shown to be particularly important in the dynamic interaction, and the ratio of wire impedances and the sum of wave velocities appear to be dimensioning quantities for catenary design. The second contribution was to reduce the epistemic uncertainty linked with model parameters such as catenary damping, contact stiffness and element size. The final contribution was to use the model in a configuration with random parameters. An initial step was to statistically characterise physical catenary parameters using available measurements. From this random model, ranking of uncertainties using Sobol indices on static and dynamic criteria was shown to be possible. An absence of correlation between geometric and dynamic criteria was also found, which has notable implications for maintenance policies. The high number of sensitivity studies also gave the occasion to highlight the maturity of the simulation tool and to propose directions for further work on the design, maintenance or certification of pantographs and catenaries.
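Sobol indices, mentioned above for ranking uncertainties, measure the share of output variance attributable to each input. As a generic illustration, and not the thesis's simulation code, a minimal pick-and-freeze estimator of first-order indices on a toy model could be sketched as follows; the toy model and sample size are illustrative assumptions.

```python
import numpy as np

def first_order_sobol(model, n_samples, n_params, rng=None):
    """Estimate first-order Sobol indices with the pick-and-freeze (Saltelli)
    scheme for a model y = f(x), with x uniform on [0, 1]^d."""
    rng = np.random.default_rng(rng)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]), ddof=1)
    indices = []
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # vary only input i, freeze the rest
        yABi = model(ABi)
        # Estimator of Var(E[Y | X_i]) / Var(Y)
        indices.append(np.mean(yB * (yABi - yA)) / var_y)
    return np.array(indices)

# Toy stand-in for a simulated response: the first input dominates.
f = lambda x: 5.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2]
print(first_order_sobol(f, n_samples=20000, n_params=3, rng=0))
```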
APA, Harvard, Vancouver, ISO, and other styles
7

Velayudhan, Vikas. "TCAD study of interface traps-related variability in ultra-scaled MOSFETs." Doctoral thesis, Universitat Autònoma de Barcelona, 2016. http://hdl.handle.net/10803/400200.

Full text
Abstract:
The work developed in this thesis has focused on the analysis and study of the impact of the number and spatial distribution of interface traps on the variability of ultra-scaled MOSFETs. In the study, the number of locations where traps were placed was varied randomly, but the total charge in the device was always kept constant. Initially, 2D simulations of interface traps located along the channel of the transistor were carried out and their influence on Vth was analyzed. The analysis started with the case of a single location, examining the influence of channel length and drain voltage, and the cases of two and multiple locations were then analyzed. Subsequently, the analysis was extended to 3D simulations, with interface traps distributed across the transistor. Finally, the effect of interface traps was analyzed not only on Vth but also on Ion. For a more realistic view of the effect of interface-trap variability on ultra-scaled MOSFET transistors, the study was extended to 3D simulations of a device of WxL = 50nm x 20nm. The results showed that the location of traps along the channel length has more influence than their position across the channel width. In addition, when two traps were considered, their influence was smaller when they were very close together than when they were sufficiently separated. The results were interpreted in terms of changes in the area of the potential barrier created by the position of the traps. Devices with different numbers of locations at random positions were simulated and, compared to the 2D results, a 'turn-around' effect was observed in the dependence of Vth (mean value) and σVth. The initial increase in Vth was attributed to an increase in the area of the effective barrier with the number of locations. The subsequent decrease in Vth with a further increase in the number of locations was attributed to an increased likelihood of having traps very close to one another, resulting in a decrease in the effective barrier area, together with the scaling of the charge at each location. It was also noted that σVth follows Pelgrom's law and that the width of the device plays a dominant role in this dependence. Furthermore, the spatial distribution of the traps was also found to affect Ion. The results showed that the location of traps along the channel length fundamentally influences Vth, while the distribution of traps across the channel width mostly affects Ion. These dependencies explain the asymmetries found in the Id-Vg characteristics of the transistors. The work could be continued by analyzing the impact of trap distributions under dynamic conditions, as in the RTN or BTI mechanisms. The main applicability of the results of this thesis lies in the field of reliability of ultra-scaled MOSFETs. The contributions made in this thesis help to understand how the number and spatial distribution of interface traps, which can arise through reliability-degrading mechanisms such as Bias Temperature Instabilities (BTI), Hot Carrier Injection (HCI) or Random Telegraph Noise (RTN), result in the dispersion of the characteristics of MOSFETs.
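For context, Pelgrom's law, referred to above, states that the standard deviation of the threshold-voltage mismatch between two identically designed transistors scales inversely with the square root of the gate area; a common formulation, with A_VT the technology-dependent matching constant, W the gate width and L the gate length, is:

```latex
\sigma\left(\Delta V_{th}\right) = \frac{A_{VT}}{\sqrt{W \, L}}
```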
APA, Harvard, Vancouver, ISO, and other styles
8

Uhlig, Stefan. "Heart Rate Variability." Doctoral thesis, Universitätsbibliothek Chemnitz, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-233101.

Full text
Abstract:
A healthy heartbeat is not characterized by being particularly regular. Rather, a healthy heartbeat should be variable, even during phases of apparent physical inactivity (e.g., Appelhans & Luecken, 2006; Berntson et al., 1997; Shaffer, McCraty, & Zerr, 2014). Historically, this is not an entirely new insight: the phenomenon was already observed in early Chinese and Greek medicine (Billman, 2011, provides a good overview). The interplay of the sympathetic and parasympathetic branches of the autonomic nervous system, which is reflected among other things in heart rate variability (HRV), gives us insight not only into a person's physiological adaptability but also into their psychological flexibility and capacity for regulation, which allow appropriate responses to constantly changing environmental demands (e.g., Appelhans & Luecken, 2006; Beauchaine, 2001; ChuDuc, NguyenPhan, & NguyenViet, 2013; Porges, 1995b; Quintana & Heathers, 2014; Riganello, Garbarino, & Sannita, 2012; Shaffer et al., 2014; Stein & Kleiger, 1999; Thayer & Lane, 2000). Put very simply, the variability of our heartbeat constitutes a kind of interface that provides information about the interplay of physiological and psychological processes. In this monograph I deal intensively with the topic of HRV, in particular with the application and conduct of short-term HRV measurements (usually five minutes) in the context of (bio-)psychological research. While Chapter I gives a condensed introduction to the topic and an overview of the following chapters, Chapter II addresses the question of which methodological standards currently exist for short-term HRV measurements. The starting point is scattered evidence (e.g., from meta-analytic efforts) that the acquisition, reporting, and interpretation of HRV measurements are characterized by a considerable degree of diversity (e.g., de Vries, 2013; Ellis, Zhu, Koenig, Thayer, & Wang, 2015; Quintana & Heathers, 2014; Tak et al., 2009; Zahn et al., 2016). Furthermore, robust normative values for the most common HRV parameters that can typically be computed from short-term measurements are still lacking (cf. Nunan, Sandercock, & Brodie, 2010). Based on these observations, we present a systematic literature review. In a first step, we identified current standards for acquiring and analyzing HRV measurements and used them to construct a classification system for evaluating HRV studies. We then reviewed articles published between 2000 and 2013 (N = 457) against the extracted methodological standards. Our results suggest considerable methodological heterogeneity and a lack of important information (e.g., regarding the assessment of essential control variables or the reporting of HRV parameters), together with the fact that common recommendations and guidelines (e.g., Task Force, 1996) are only partially reflected in empirical practice. Based on our findings, we derive recommendations for further research in this area, with our "checklist" aimed in particular at research psychologists.
Finally, we discuss the limitations of our review and make suggestions for how this sometimes unsatisfactory situation can be improved. During our extensive literature search it quickly became apparent that short-term HRV measurements attract broad scientific interest, with a wide variety of concepts and research questions being linked to specific HRV patterns (cf. Beauchaine, 2001; Dong, 2016; Francesco et al., 2012; Makivić, Nikić, & Willis, 2013; Nunan et al., 2010; Pinna et al., 2007; Quintana & Heathers, 2014; Sammito et al., 2015; Sandercock, 2007). These include both rather trait-like constructs (e.g., trait anxiety; Miu, Heilman, & Miclea, 2009; Watkins, Grossman, Krishnan, & Sherwood, 1998) and strongly situation-dependent ones (e.g., acute emotional arousal; Lackner, Weiss, Hinghofer-Szalkay, & Papousek, 2013; Papousek, Schulter, & Premsberger, 2002). While the two most influential theories of HRV, the polyvagal theory (Porges, 1995b, 2001, 2007) and the model of neurovisceral integration (Thayer & Lane, 2000, 2009), suggest a dispositional character of HRV, numerous influencing factors are known to have immediate effects on the autonomic nervous system (Fatisson, Oswald, & Lalonde, 2016; Valentini & Parati, 2009). We therefore asked how temporally stable individual HRV measurements are (see Chapter III). Because the existing literature offers ambivalent findings on this question (Sandercock, 2007; Sandercock, Bromley, & Brodie, 2005), and because the temporal stability of HRV measurements has so far mainly been examined over very short periods with few measurement occasions (e.g., Cipryan & Litschmannova, 2013; Maestri et al., 2009; Pinna et al., 2007), we designed a longitudinal study with five measurement occasions spread over one year (N = 103 students). Depending on the participants' body position during the measurement (lying, sitting, standing), we then determined the retest reliability (absolute and relative reliability; see Atkinson & Nevill, 1998; Baumgartner, 1989; Weir, 2005) of the most common HRV parameters. Our results indicate a considerable degree of random fluctuation in the HRV parameters, largely independent of participants' body position and of the time interval between measurement occasions. Because these results suggest far-reaching consequences, we discuss them in detail, taking existing limitations into account. While Chapters II and III focus primarily on methodological questions, Chapter IV of this monograph presents a field study in which we examined the relationships between subjective stress, coping strategies, HRV, and school performance. Both the theories already mentioned (Porges, 1995b, 2001, 2007; Thayer & Lane, 2000, 2009) and a considerable body of research suggest associations between HRV and stress (e.g., Berntson & Cacioppo, 2004; Chandola, Heraclides, & Kumari, 2010; Krohne, 2017; Michels, Sioen, et al., 2013; Oken, Chamine, & Wakeland, 2015; Porges, 1995a; Pumprla, Howorka, Groves, Chester, & Nolan, 2002) as well as between HRV and cognitive performance (e.g., Duschek, Muckenthaler, Werner, & Reyes del Paso, 2009; Hansen, Johnsen, & Thayer, 2003; Luque-Casado, Perales, Cárdenas, & Sanabria, 2016; Shah et al., 2011).
However, studies examining the more complex relationships among all of these constructs are still lacking, particularly for children and adolescents. To help close this knowledge gap, we surveyed secondary-school students (N = 72, aged ten to 15 years) in a cross-sectional study on their experience of stress and their coping strategies (using the SSKJ 3-8; Lohaus, Eschenbeck, Kohlmann, & Klein-Heßling, 2006). In addition, HRV and grade point averages were collected for all of these students. Our results underscore the importance of constructive coping strategies for avoiding physical and psychological stress symptoms, which in turn have negative effects on school performance. By contrast, the expected associations between HRV and stress/coping (Berntson & Cacioppo, 2004; Dishman et al., 2000; Fabes & Eisenberg, 1997; Lucini, Di Fede, Parati, & Pagani, 2005; Michels, Sioen, et al., 2013; O'Connor, Allen, & Kaszniak, 2002; Porges, 1995a) and between HRV and cognitive performance (Hansen et al., 2003; Suess, Porges, & Plude, 1994; Thayer, Hansen, Saus-Rose, & Johnsen, 2009) could not be confirmed in our data. Possible reasons for this pattern of findings and requirements for future studies of this kind are discussed. Finally, I (a) concisely summarize the insights gained, (b) discuss their implications, (c) highlight their contribution to the state of scientific research, and (d) give a brief look at the most recent developments in HRV research (Chapter V). In closing the substantive part of this monograph, I share with the reader my ten most important lessons learned.
APA, Harvard, Vancouver, ISO, and other styles
9

Turner, John. "Antarctic climate variability." Thesis, University of East Anglia, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.396624.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sajjadi, Samad. "Variability in interlanguage." Thesis, University of Reading, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.359533.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Minns, Alan Ronald. "Low frequency variability." Thesis, University of Cambridge, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.624871.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Lee, Coral Em. "Order effects of variability-contingent and variability-independent point delivery: Effects on operant variability and target sequence acquisition." Thesis, University of North Texas, 2004. https://digital.library.unt.edu/ark:/67531/metadc4502/.

Full text
Abstract:
Previous research has shown that variability is a reinforceable dimension of operant behavior. Additionally, it has been demonstrated that learning is facilitated when variability in responding is high. In this research, variability was observed within an operant composed of any sequence of six left and right key presses. Variability was either a requirement for point delivery (VAR conditions) or points were delivered independent of variability (ANY conditions). Two groups of college undergraduates experienced different orders of conditions. One group began the experiment under VAR conditions, and the variability requirement was later removed. The other group began the experiment under ANY conditions, and the variability requirement was later added. A concurrently reinforced target sequence (i.e., an always-reinforced sequence of left and right key presses) was introduced to both groups after these orders of conditions had been experienced. A variety of outcomes resulted. Subjects learned the target sequence when variability was both high and low with non-target points concurrently available. Other subjects learned the target sequence after all non-target point deliveries had been suspended. One subject failed to acquire the target sequence at all. These results were compared to previous findings and possible explanations for the discrepancies were suggested.
APA, Harvard, Vancouver, ISO, and other styles
13

Neff, Bryon (Bryon R.). "Examining the Relationship between Variability in Acquisition and Variability in Extinction." Thesis, University of North Texas, 1997. https://digital.library.unt.edu/ark:/67531/metadc279279/.

Full text
Abstract:
Using the "revealed operant" technique, variability during acquisition and extinction was examined with measures of response rate and a detailed analysis of response topography. During acquisition, subjects learned to emit four response patterns. A continuous schedule of reinforcement (CRF) for 100 repetitions was used for each pattern and a 30 min extinction phase immediately followed. One group of subjects learned the response patterns via a "trial-and-error" method. This resulted in a wide range of variability during acquisition and extinction. Only one subject emitted a substantial amount of resurgent behavior. A second group of subjects was given instructions on what keys to press to earn reinforcers. This group had less variability in acquisition and extinction and resurgent responding was prevalent.
APA, Harvard, Vancouver, ISO, and other styles
14

Couso, Fontanillo Carlos. "Analysis of impact of nanoscale defects on variability in mos structures." Doctoral thesis, Universitat Autònoma de Barcelona, 2018. http://hdl.handle.net/10803/650408.

Full text
Abstract:
Over the last years, information and its analysis have become the cornerstone of the growth of our society, enabling the sharing economy, the globalization of products and knowledge, block-chain technology, etc. Large companies such as Amazon, Facebook and Google, aware of the potential of these resources, are developing vast infrastructures in order to extract as much information as possible about our environment (Internet of Things) or ourselves (social media, smartphones...), process this information (big data centers) and transmit it quickly all over the world. However, this challenge requires electronic devices with higher performance and lower power consumption, which cannot be developed using conventional scaling techniques because the dimensions of devices have reached the atomic range. In this range of dimensions, the discreteness of matter and charge inevitably increases the variability of devices. Among the different variability sources, interface traps (IT), random dopant distributions (RDD), line edge roughness (LER) and poly gate granularity (PGG) have been identified as the most prominent ones. Consequently, the scientific community is exploring new solutions, such as alternative device materials and/or structures, in order to overcome the different issues arising from scaling. In this context, this thesis, which is structured in 7 chapters, tries to contribute to solving this problem by analyzing the impact of interface traps and defects on device variability. To introduce the reader, chapter 1 explains the theory of charge transport through a semiconductor-metal junction (Schottky contact) and the Metal-Oxide-Semiconductor Field Effect Transistor (MOSFET). In addition, the concept of variability and different sources of variability are presented. In the second chapter, advanced characterization techniques used to obtain nanoscale information, such as Conductive Atomic Force Microscopy (CAFM) and Kelvin Probe Force Microscopy (KPFM), are described in detail. After that, the TCAD device simulator ATLAS is explained, and the models and their limitations for simulating electronic devices are discussed. The third chapter is devoted to describing the impact of threading dislocation (TD) defects on conduction through a Schottky contact formed by a III-V semiconductor material (InGaAs) and a metal. Here, different conduction mechanisms, Poole-Frenkel (PF) and thermionic emission (TE), have been associated with conduction through areas with and without TDs, respectively, showing that III-V materials with a high density of TDs exhibit higher leakage current. Chapter four explains the development of a simulator (NAnoscale MAp Simulator, NAMAS) that automatically generates topography and charge-density maps from inputs obtained from CAFM measurements (topography and current maps) of a given sample. From the generated maps, the impact of oxide-thickness and charge-density fluctuations on MOSFET variability is studied. In chapter five, the impact of interface traps in the gate oxide on device variability is analyzed. Firstly, the impact of discrete fixed interface charges on 65 nm technology MOSFET devices with different dimensions is studied (time-zero variability), where a deviation from Pelgrom's law is demonstrated with experimental and TCAD simulation data. Next, the dynamic behavior of traps is analyzed by TCAD transient simulation in order to estimate the physical parameters of the traps from empirical parameters.
Chapter six is devoted to studying the performance and power-consumption trade-off in Ultra-Thin Body and Buried Oxide Fully Depleted Silicon-On-Insulator (UTBB FDSOI) MOSFETs operated at near-threshold voltage. In addition, the impact of traps at the gate oxide/channel and buried oxide/channel interfaces on the performance and power consumption of the device is also analyzed. Finally, the most relevant conclusions are highlighted.
APA, Harvard, Vancouver, ISO, and other styles
15

Maggioni, Mezzomo Cecilia. "Caractérisation et modélisation des fluctuations aléatoires des paramètres électriques des dispositifs en technologies CMOS avancées." Thesis, Grenoble, 2011. http://www.theses.fr/2011GRENT044/document.

Full text
Abstract:
This research characterizes and models the mismatch of electrical parameters in advanced MOS transistors. All characterizations are made with a test structure that is experimentally validated using the Kelvin measurement method. A model, valid in the linear region, is proposed; it models the threshold-voltage fluctuations of transistors with pocket implants for any transistor length and gate voltage, and gives a deep understanding of the mismatch, especially for devices with a non-uniform channel. A further study analyzes the mismatch of the drain current by characterizing and modeling it as a function of the drain voltage. A second model is then proposed for transistors without pocket implants; to apply this model, the correlation between threshold-voltage fluctuations and mobility fluctuations must be considered. Characterizations are also performed on transistors with pocket implants, showing a new drain-current mismatch behavior for long transistors. Finally, characterizations are made to analyze the impact of gate-roughness fluctuations on mismatch.
APA, Harvard, Vancouver, ISO, and other styles
16

Mordel, Patrick. "Variabilité glycémique : exploration in vitro des fonctions cellulaires et mitochondriales sur la lignée de cardiomyocyte HL-1." Thesis, Normandie, 2017. http://www.theses.fr/2017NORMC415/document.

Full text
Abstract:
Diabetes mellitus is associated with a higher risk of cardiovascular disease and with metabolic dysregulation. Glycemic variability (GV) has been suggested as a risk factor for diabetic complications. In order to characterize the dysfunctions induced by GV, we developed an in vitro model that reproduces GV in the cardiac cell line HL-1. We exposed the cells to a 12-hour treatment mimicking hypoglycemia, normoglycemia, hyperglycemia and GV. The exploration of signaling pathways did not allow us to show a deleterious effect of glucose fluctuations. However, we were able to identify mitochondrial alterations under glucose fluctuations: the mitochondria of HL-1 cells exhibited a higher membrane potential and an increased production of superoxide anions. Although we did not observe any alteration of mitochondrial respiration after 12 hours of exposure, HL-1 cells showed a decrease in mitochondrial respiration after 72 hours of glucose fluctuations. We finally studied the impact of glucose fluctuations on the susceptibility to developing hypoxic injuries, and showed that after 36 hours of hypoxia the injuries were greater in cells exposed to glucose fluctuations. Our results indicate a deleterious effect of GV, but additional experiments are needed to better characterize the mechanisms involved.
APA, Harvard, Vancouver, ISO, and other styles
17

Barizien, Antoine. "Studying the variability of bacterial growth in microfluidic droplets." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLX020/document.

Full text
Abstract:
This thesis presents results on the variability of bacterial growth in microfluidic droplets. In the first chapter, the microfluidic chip used throughout the PhD is presented. It allows bacteria to be encapsulated in an array of 1,500 nanoliter-sized droplets and their growth in each droplet to be followed in parallel through fluorescence microscopy. The link between the measured fluorescence and the number of bacteria in a droplet is discussed, and other technical questions are addressed, such as the variability in droplet size and the cell-to-cell fluorescence variability. Next, we develop a stochastic model to account for the observed variability of population size in the droplets during the exponential phase of growth. A well-known stochastic model, the Bellman-Harris model, is adapted to take into account the external sources of randomness due to our experimental system (the initial distribution of bacteria per droplet and the different division times of the first generations). These are combined with the effects of the cell-to-cell variability of division times, and the resulting model successfully predicts the variability observed in the microfluidic experiments. We then tackle the inverse problem, which is to recover the cell-to-cell variability from the observation of growth in droplets. We propose an inference scheme based on following each droplet in time: the deviation from pure exponential growth is linked back to the cell-to-cell variability, and this inference scheme is shown to be successful on simulations that mimic the experimental constraints. However, we cannot fully apply it to our experiments because of a lack of accuracy in our fluorescence measurements. Finally, we demonstrate how our chip can save space and time, compared to classical susceptibility measurement methods, when quantifying the effect of antibiotics on a bacterial strain. We also show how it can be used to study the variability of the SOS response of bacteria, a stress response induced when the DNA of the cell is damaged, and to relate it to the ability to survive an antibiotic treatment.
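This is not the author's model code, but as a rough sketch of the kind of age-dependent branching simulation the abstract alludes to, droplet-to-droplet population variability could be drawn as follows; the Poisson loading, gamma-distributed division times and all parameter values are illustrative assumptions.

```python
import heapq
import numpy as np

def simulate_droplet(t_obs, mean_init=1.5, shape=20.0, mean_div=30.0, rng=None):
    """Population size in one droplet at time t_obs (minutes), assuming a
    Bellman-Harris-style process with i.i.d. gamma division times per cell."""
    if rng is None:
        rng = np.random.default_rng()
    scale = mean_div / shape                    # gamma scale so the mean is mean_div
    n0 = max(1, rng.poisson(mean_init))         # at least one founder cell per droplet
    events = [rng.gamma(shape, scale) for _ in range(n0)]   # next division per cell
    heapq.heapify(events)
    population = n0
    while events and events[0] <= t_obs:
        t = heapq.heappop(events)               # next division: one mother, two daughters
        population += 1
        for _ in range(2):
            heapq.heappush(events, t + rng.gamma(shape, scale))
    return population

rng = np.random.default_rng(0)
sizes = np.array([simulate_droplet(t_obs=300.0, rng=rng) for _ in range(1500)])
print("mean", sizes.mean(), "coefficient of variation", sizes.std() / sizes.mean())
```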
APA, Harvard, Vancouver, ISO, and other styles
18

Bartoň, Vojtěch. "Analýza genetické variability v sekvenačních datech treponemálních kmenů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2018. http://www.nusl.cz/ntk/nusl-378144.

Full text
Abstract:
This diploma thesis deals with methods for identifying genetic variability in sequencing data. The research targets bacterial strains of Treponema pallidum, sequenced on the Illumina platform. A method is proposed to identify variable positions in resequenced genomes, analyse them, and compare them across all processed genomes.
APA, Harvard, Vancouver, ISO, and other styles
19

Armshaw, Jared T. "Variability in the Natural World: An Analysis of Variability in Preschool Play." Thesis, University of North Texas, 2020. https://digital.library.unt.edu/ark:/67531/metadc1707331/.

Full text
Abstract:
Children acquire many skills through play. These range from fine and gross motor skills, social skills, problem-solving, to even creativity. Creativity or creative engagement is frequently a component in early preschool curricula. A pivotal repertoire to engage in behaviors deemed creative, such as art, storytelling, problem-solving, and the like, is the ability to vary one's responses regardless of the specific repertoire. Researchers have developed methods to produce response variability. However, notwithstanding the significant contributions from the literature for prompting response variability, it remains unclear how much variation in responding is socially appropriate. To fill this research gap, the purpose of this study is to characterize and understand the different ways preschool children commonly interact with the activities and materials present in a preschool classroom. In our study, we assessed children's repeat item interactions, novel item interactions, and time allocation across seven concurrently available activity centers. A multifarious pattern for item interactions emerged across children. Some children had restricted levels of novel item and center interactions, while other children had more varied novel item and center interactions. However, the variance in interactions was predominantly controlled by the center type. This study bolsters our understanding of variability and creativity within a school setting, but more importantly, it informs the task of selecting goals for applied practice with children who have restricted play or interests.
APA, Harvard, Vancouver, ISO, and other styles
20

Cassinelli, Joe P. "Discussion: Magnetic fields, variability." Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1819/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Meyer, Peter, and Rasmus Åström. "Eliminating Variability Through Standardization." Thesis, Örebro universitet, Institutionen för naturvetenskap och teknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-51841.

Full text
Abstract:
The purpose of this thesis was to review and investigate processes by analyzing the impact that logistics and handling had on the transportation units, with the goal of proposing a solution strategy for preventing and reducing breakage of these units. By creating and applying the standards and guidelines specified in this thesis, the cost of customer claims, the ecological impact, and the risk of injuries can be greatly reduced. To achieve this, process mapping and identification of risk factors, archival and literature studies, Ishikawa mapping, a case study, and interviews were performed. With these methods, two main problem areas were found and investigated. Analyzing these two problem areas led to recommendations of standards and the application of standardized work.
Syftet med det genomförda examensarbetet var att undersöka och analysera vilka effekter logistik och hantering haft på hållfastheten av lastningsbärare för att ta fram ett lösningsförslag på hur brott kan förebyggas. Från införda standarder och riktlinjer, presenterade i det här examensarbetet kan reklamationskostnader, ekologiska kostnader och skaderisker kraftigt minimeras. Processkartläggning och risk sökning, Arkiv- och litteraturstudier, Ishikawa- kartläggning, fallstudie och intervjuer genomfördes och användes för att få fram resultatet. Med de använda metoderna kunde två problemområden hittas och undersökas. Analysering av dessa problemområden ledde till en rekommendation om applicering av standardiserat arbete.
APA, Harvard, Vancouver, ISO, and other styles
22

Bashir, Furrukh, and Furrukh Bashir. "Hydrometeorological Variability over Pakistan." Diss., The University of Arizona, 2017. http://hdl.handle.net/10150/626357.

Full text
Abstract:
Pakistan, as an agriculture-based economy, is vulnerable to various hydrometeorological hazards, ranging from tropical cyclones, thunderstorms, tornadoes, drought, rain, hail, snow, lightning, fog, wind, and temperature extremes to air pollution and climatic change. Three of the most pressing challenges in terms of water resource availability, different in nature but inter-linked, are discussed here. We begin with the Karakoram Anomaly, considered one of the most mysterious and most speculated-about phenomena on Earth. It is confined to the glaciers of the eastern Hindukush, western Karakoram, and northwestern Himalayan mountain ranges of Northern Pakistan, which are not responding to global warming in the same manner as their counterparts elsewhere: their retreat rates are less than the global average, and some are either stable or growing. The Karakoram Anomaly has baffled the scientific community for more than a decade since its earliest description in 2005. Its causes have mainly been attributed to the physiography of the area, and the role of climate was until now considered marginal, since climate influences glaciers differently all over the globe. Here, for the first time, we present a hydro-meteorological perspective based on five decades of synoptic weather observations collected by the meteorological network of Pakistan. Analysis of this unique data set indicates that increased regional-scale humidity, cloud cover, and precipitation, along with decreased net radiation, near-surface wind speed, potential evapotranspiration, and river flow, especially during the summer season, represent a substantial change in the energy, mass, and momentum fluxes that facilitate the establishment of the Karakoram Anomaly. In turn, this influences the availability of glacier melt in the River Indus during the summer season. Secondly, we developed a hydrometeorological data set for Pakistan, as such data are extremely important for water-related impact studies and future climate change scenarios. Presently, the major sources of gridded temperature and precipitation data are in-situ observations, satellite-retrieved information, and outputs from numerical models, each with its own merits and demerits. Among them, gridded observed data sets are considered superior when the gauge density is sufficient. Unfortunately, the precipitation gauge network of Pakistan is poorly represented in prior gridded products. Therefore, a daily in-situ observation-based, 0.05º×0.05º gridded temperature and precipitation data set for Pakistan, for the period 1960-2013, is developed. It is named PAK-HYM-1.0, an abbreviation of Pakistan and Hydrometeorology, where 1.0 indicates the first version. This data set is built from 67 meteorological stations of Pakistan, 2 to 4 times more observation sites than used in prior similar products, and it can be adopted as an operational information product updated on a daily basis. Finally, we focused on meteorological and hydrological droughts in Pakistan. We reconstructed the history of drought in Pakistan from the in-situ observation-based high-resolution gridded data using the Standardized Precipitation Index (SPI) methodology on different time scales.
Furthermore, we explain the transition from meteorological to hydrological drought using inflow data for the large rivers of Pakistan, and the sensitivity of different rivers to rainfall and temperature in different seasons. On the basis of this analysis, we propose the construction of water reservoirs to tap water resources from the northern mountains, as inflows from these mountains have the potential to act as a buffer against droughts in the low-lying areas of Pakistan. In addition, we demonstrate the potential of the Palmer Drought Severity Index (PDSI) as an operational tool for drought monitoring in Pakistan.
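Since the drought reconstruction above relies on the Standardized Precipitation Index, the following sketch shows one simplified way an SPI series can be computed; it is not the thesis's code. It assumes monthly precipitation totals, fits a single gamma distribution to the aggregated series (a full implementation fits one distribution per calendar month), and uses SciPy; the synthetic rainfall and all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

def spi(precip, scale=3):
    """Simplified Standardized Precipitation Index.

    precip : 1-D array of monthly precipitation totals.
    scale  : aggregation window in months (e.g. 3 for SPI-3).
    """
    precip = np.asarray(precip, dtype=float)
    # rolling sum over the chosen time scale
    agg = np.convolve(precip, np.ones(scale), mode="valid")

    nonzero = agg[agg > 0]
    q = 1.0 - nonzero.size / agg.size            # probability of a zero total
    # fit a 2-parameter gamma distribution to the non-zero aggregated totals
    a, _, b = stats.gamma.fit(nonzero, floc=0)

    # mixed cumulative probability, then transform to a standard normal deviate
    cdf = q + (1.0 - q) * stats.gamma.cdf(agg, a, loc=0, scale=b)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

# toy usage with synthetic monthly rainfall (~54 years of data)
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 30.0, size=54 * 12)
print(spi(rain, scale=3)[:12].round(2))
```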
APA, Harvard, Vancouver, ISO, and other styles
23

Johnson, Kristien Paul. "Variability of ionic magnesium." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ60850.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Belokurov, Vasily. "Variability surveys in astronomy." Thesis, University of Oxford, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.401024.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Pickering, J. B. "Auditory vowel formant variability." Thesis, University of Oxford, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.375999.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Riffle, Travis Lee. "Variability in Auditory Distraction." Ohio University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1565870603158009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Bishop, Michele R. "Resurgence of operant variability /." abstract and full text PDF (UNR users only), 2008. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3329717.

Full text
Abstract:
Thesis (Ph. D.)--University of Nevada, Reno, 2008.
"August, 2008." Includes bibliographical references (leaves 92-100). Library also has microfilm. Ann Arbor, Mich. : ProQuest Information and Learning Company, [2009]. 1 microfilm reel ; 35 mm. Online version available on the World Wide Web.
APA, Harvard, Vancouver, ISO, and other styles
28

Hilder, James Alan. "Evolving variability tolerant logic." Thesis, University of York, 2010. http://etheses.whiterose.ac.uk/1334/.

Full text
Abstract:
Intrinsic variability between individual MOSFET transistors is caused by atomic-scale differences in the construction of the devices. The impact of this variability will become a major issue in future circuit design as devices scale below 50 nm. In this thesis, the background to the causes and effects of intrinsic variability, in particular random dopant placement and line-edge roughness, is discussed. A system is developed which uses a genetic algorithm to optimise the dimensions of transistors within standard-cell libraries, with the aim of improving performance and reducing the impact of intrinsic variability on circuit delay and power consumption. The genetic algorithm uses a multi-objective fitness function to allow a number of circuit characteristics to be considered in the evolution process. The system is tested using different standard-cell libraries from open-source and commercial providers, and the developments and alterations made to the system throughout the course of the experiments are discussed. Comparisons with other optimisation techniques, hill climbing and simulated annealing, are also presented. The optimisation process concludes with the use of e-Science techniques to allow detailed statistical analysis of the evolved designs on high-performance computing clusters. The observed results for two-input logic gates demonstrate that the technique can be effective in reducing the statistical spread in the delay and power consumption of circuits subject to intrinsic variability. The thesis finishes with the investigation of larger circuits assembled from the optimised cells. A design methodology is proposed in which the processes of logic design are broken into small blocks, each of which uses techniques from evolutionary computation to improve performance. This includes an investigation into the application of a multi-objective fitness function to improve the performance of logic circuits evolved using Cartesian Genetic Programming, which produces designs for logic multiplier and display driver circuits that are competitive with human-produced designs and other evolved designs. These designs are assessed for their variability tolerance, with the multiplier circuit demonstrating an improvement in delay variability.
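To make the optimisation loop concrete, here is a deliberately small genetic algorithm with a weighted-sum multi-objective fitness over transistor widths. It is not the system described in the thesis: the evaluate function is a made-up surrogate standing in for Monte Carlo SPICE simulation, and the selection, crossover, and mutation operators are generic textbook choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def evaluate(widths):
    """Toy surrogate for SPICE: returns (delay, power, delay_spread).
    Purely illustrative -- a real evaluation would run Monte Carlo SPICE
    simulations of the standard cell with atomistic device models."""
    w = np.asarray(widths)
    delay  = 1.0 / w.sum()                    # wider devices switch faster
    power  = w.sum()                          # ...but burn more power
    spread = 0.1 / np.sqrt(w).sum()           # mismatch shrinks with device area
    return delay, power, spread

def fitness(widths, weights=(1.0, 0.3, 5.0)):
    # weighted-sum scalarisation of the three objectives (lower is better)
    return -sum(wt * obj for wt, obj in zip(weights, evaluate(widths)))

def genetic_algorithm(n_devices=4, pop_size=40, generations=60):
    pop = rng.uniform(0.5, 4.0, size=(pop_size, n_devices))      # widths in um
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(n_devices) < 0.5, a, b)   # uniform crossover
            child += rng.normal(0, 0.05, n_devices)               # Gaussian mutation
            children.append(np.clip(child, 0.5, 4.0))
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return best, evaluate(best)

best, objs = genetic_algorithm()
print("best widths:", best.round(2), "-> delay, power, spread:", np.round(objs, 4))
```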
APA, Harvard, Vancouver, ISO, and other styles
29

Škrtel, Karol. "Analýza variability srdečního rytmu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2008. http://www.nusl.cz/ntk/nusl-217454.

Full text
Abstract:
The project describes methods for observing changes of heart rate in the ECG signal. Heart rate variability (HRV) has become the conventionally accepted term for the variation of NN intervals between consecutive heart beats; in general it is a function of the instantaneous heart rate or NN interval over time. HRV may be evaluated by time-domain or frequency-domain measures. An algorithm, implemented as a Matlab function, computes HRV parameters from ECG signal series. Time-domain analysis shows a high correlation between the statistical and geometric parameters, and similarly with the HRV signal. The frequency-domain results show the similarity of the power spectral density calculated in two different ways (from the interpolated and the non-interpolated HRV signal). The functionality of the developed algorithm was verified on each signal. The project results contribute to the development of ECG signal analysis methods aimed at observing pathological changes in heart rate.
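As an illustration of the kind of time-domain and frequency-domain HRV measures described above, here is a small Python sketch (the thesis's implementation is a Matlab function, so this is an analogous reconstruction, not the original code); the resampling rate, spectral bands, and test signal are conventional but assumed.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_measures(nn_ms, fs_resample=4.0):
    """Basic HRV descriptors from a series of NN intervals (milliseconds)."""
    nn = np.asarray(nn_ms, dtype=float)
    diffs = np.diff(nn)

    # time-domain parameters
    sdnn  = nn.std(ddof=1)                         # overall variability
    rmssd = np.sqrt(np.mean(diffs ** 2))           # short-term variability
    pnn50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # % successive diffs > 50 ms

    # frequency domain: resample the irregular NN series onto a uniform grid
    t = np.cumsum(nn) / 1000.0                     # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)
    nn_interp = interp1d(t, nn, kind="cubic")(grid)
    f, psd = welch(nn_interp - nn_interp.mean(), fs=fs_resample, nperseg=256)
    df = f[1] - f[0]
    lf = psd[(f >= 0.04) & (f < 0.15)].sum() * df  # low-frequency power
    hf = psd[(f >= 0.15) & (f < 0.40)].sum() * df  # high-frequency power

    return {"SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50, "LF/HF": lf / hf}

# toy usage: NN intervals around 800 ms with some modulation and noise
rng = np.random.default_rng(3)
nn = 800 + 40 * np.sin(0.1 * np.arange(600)) + rng.normal(0, 25, 600)
print(hrv_measures(nn))
```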
APA, Harvard, Vancouver, ISO, and other styles
30

Ghesini, Silvia <1974&gt. "Molecular Variability in Isoptera." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2930/1/ghesini_silvia_tesi.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Ghesini, Silvia <1974&gt. "Molecular Variability in Isoptera." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2930/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Hubeika, Valiantsina. "Intersession Variability Compensation in Language and Speaker Identification." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2008. http://www.nusl.cz/ntk/nusl-235432.

Full text
Abstract:
Channel and session variability is a very important problem in the task of speaker recognition. Several techniques for channel compensation have recently been described in a large number of scientific papers. Channel compensation can be implemented in the model domain as well as in the feature and score domains. A relatively new and powerful technique is so-called eigenchannel adaptation for GMMs (Gaussian Mixture Models). The disadvantage of this method is that it cannot be applied to other classifiers, such as SVMs (Support Vector Machines), GMMs with a different number of Gaussian components, or speech recognition based on hidden Markov models (HMMs). A possible solution is an approximation of this method, eigenchannel adaptation in the feature domain. Both techniques, eigenchannel adaptation in the model domain and in the feature domain for speaker recognition systems, are presented in this work. After good results were achieved in speaker recognition, the benefit of these techniques was examined for an acoustic language recognition system covering 14 languages. In this task, not only channel variability but also speaker variability has an adverse effect. Results are reported on the data defined for the 2006 speaker recognition evaluation and the 2007 language recognition evaluation, both organized by the U.S. National Institute of Standards and Technology (NIST)
APA, Harvard, Vancouver, ISO, and other styles
33

Kostečka, Petr. "Posouzení stability procesu." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2012. http://www.nusl.cz/ntk/nusl-230076.

Full text
Abstract:
This master's thesis focuses on the application of statistical process control in engineering production and on its comparison with the Beta correction method. The aim of the thesis is to find a suitable way of controlling the production process that ensures a sufficiently high level of product quality while keeping the costs connected with process monitoring adequate. The basic principles of both methods are treated in the theoretical part of the thesis. In the practical part, both methods are compared on a particular production process. Conclusions and suggestions are drawn on the basis of the data obtained.
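For the statistical-process-control side of the comparison, a minimal sketch of Shewhart X-bar/R control limits is given below; it is not taken from the thesis. The subgroup size of 5 and the corresponding constants A2, D3, D4 are standard textbook values, and the measurement data are synthetic.

```python
import numpy as np

# Shewhart constants for subgroups of size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(samples):
    """samples: 2-D array, one row per subgroup of 5 consecutive parts."""
    xbar = samples.mean(axis=1)          # subgroup means
    r    = np.ptp(samples, axis=1)       # subgroup ranges (max - min)
    xbarbar, rbar = xbar.mean(), r.mean()
    limits = {
        "Xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),
        "R":    (D3 * rbar, D4 * rbar),
    }
    return xbar, r, limits

# toy data: a dimension nominally 10.00 mm measured in 20 subgroups of 5
rng = np.random.default_rng(4)
data = rng.normal(10.00, 0.02, size=(20, 5))
xbar, r, limits = xbar_r_limits(data)
lcl, ucl = limits["Xbar"]
out_of_control = np.where((xbar < lcl) | (xbar > ucl))[0]
print("Xbar limits:", round(lcl, 3), round(ucl, 3), "| flagged subgroups:", out_of_control)
```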
APA, Harvard, Vancouver, ISO, and other styles
34

Dixon, Paul David. "Stability and variability of physiological control determined from heart rate variability in infants." Thesis, Imperial College London, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.368644.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Akyel, Kaya Can. "Statistical methodologies for modelling the impact of process variability in ultra-deep-submicron SRAMs." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENT080/document.

Full text
Abstract:
La miniaturisation des transistors vers ses ultimes limites physiques a exacerbé les effets négatifs qui sont liées à la granularité de la matière. Plusieurs nouvelles sources de variabilités affectent les transistors qui, bien qu'identiquement dessinés, montrent des caractéristiques électriques qui sont variables entre eux et entre différents moments de leur utilisation. Les circuits de mémoire SRAM, qui sont conçues avec des règles de dessin parmi le plus agressives et contiennent un nombre de transistors très élevé, sont menacés en particulier par ce phéomène de variabilité qui représente le plus grand obstacle non seulement pour la réduction de la surface d'un point mémoire SRAM, mais aussi pour la réduction de son tension d'alimentation. L'optimisation des circuits SRAM est devenue une tache cruciale afin de répondre à la fois aux demandes d'augmentation de densité et de la réduction de la consommation, donc une méthodologie statistique permettant de modéliser an amont l'impact de la variabilité à travers des simulations SPICE est devenue un besoin obligatoire. Les travaux de recherches présentés se concentrent sur le développement des nouvelles méthodologies pour la simulation des points mémoires sous l'impact de la variabilité, dans le but d'accomplir une modélisation précise de la tension d'alimentation minimale d'un SRAM quelques soit les conditions d'opérations. La variabilité dynamique liée au bruit RTS qui cause le changement des caractéristiques électrique des transistors au cours de leurs opérations est également étudiée avec un effort particulier de modélisation. Ce travail a donné lieu à de nombreuses publications internationales et à un brevet. Aujourd'hui cette méthodologie est retenue par STMicroelectronics et est utilisé dans la phase d'optimisation des plans mémoires SRAM
The downscaling of device geometry towards its physical limits exacerbates the impact of the inevitable atomistic phenomena tied to matter granularity. In this context, many different variability sources arise and affect the electrical characteristics of the manufactured devices. Variability-aware design methodology has therefore become a popular research topic in digital circuit design, since the increased number of transistors in modern integrated circuits has led to a large statistical variability that dramatically affects circuit functionality. Static Random Access Memory (SRAM) circuits, which are manufactured with the most aggressive design rules in a given technology node and contain billions of transistors, are severely impacted by process variability, which stands as the main obstacle to further reduction of the bitcell area and of its minimum operating voltage. The reduction of the latter is a very important parameter for low-power design, one of the most popular research fields of our era. The optimization of SRAM bitcell design has therefore become a crucial task to guarantee the good functionality of the design at an industrial manufacturing level, while at the same time answering the high-density and low-power demands. However, the long time required to develop each new technology node's process means a long wait before silicon results are obtained, in stark contrast with the fact that design optimization has to start as early as possible. An efficient SPICE characterization methodology for the minimum operating voltage of SRAM circuits is therefore a mandatory requirement for design optimization. This research work concentrates on the development of new simulation methodologies for modeling process variability in ultra-deep-submicron SRAMs, with the ultimate goal of accurately modeling the minimum operating voltage Vmin. Particular attention is also given to the time-dependent sub-class of process variability, which appears as a change in the electrical characteristics of a given transistor during its operation and over its lifetime. This research work has led to many publications and one patent application. The majority of the findings are retained by the STMicroelectronics SRAM development team for further use in their design optimization flow
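The following toy sketch is only meant to convey the flavour of the problem: an analytical response surface is fitted to a handful of stand-in "TCAD" samples linking process parameters to a bitcell noise-margin proxy, and is then reused for a cheap Monte Carlo estimate of the failure probability at several supply voltages. Every function, coefficient, and distribution here is invented for illustration and does not reproduce the methodology or data of the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# --- stand-in for a handful of expensive "TCAD" runs ----------------------
def tcad_noise_margin(dvth, lgate, vdd):
    """Hypothetical 'true' response of a bitcell noise-margin proxy (V) to
    threshold-voltage mismatch, gate-length offset and supply voltage."""
    return 0.25 * vdd - 2.5 * np.abs(dvth) - 0.4 * np.abs(lgate) + 0.02

n_train = 200
X = np.column_stack([rng.normal(0, 0.03, n_train),     # delta-Vth (V)
                     rng.normal(0, 0.002, n_train),    # delta-L (um)
                     rng.uniform(0.5, 1.0, n_train)])  # VDD (V)
y = tcad_noise_margin(*X.T)

# --- fit a simple analytical model linking process to electrical behaviour ---
def features(X):
    d, l, v = X.T
    return np.column_stack([np.ones_like(d), np.abs(d), np.abs(l), v])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# --- fast Monte Carlo on the fitted model to explore low-voltage failures ---
def fail_prob(vdd, n_mc=200_000):
    mc = np.column_stack([rng.normal(0, 0.03, n_mc),
                          rng.normal(0, 0.002, n_mc),
                          np.full(n_mc, vdd)])
    margin = features(mc) @ coef
    return np.mean(margin <= 0.0)

for vdd in (0.6, 0.7, 0.8):
    print(f"VDD = {vdd:.1f} V -> estimated bitcell failure prob = {fail_prob(vdd):.2e}")
```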
APA, Harvard, Vancouver, ISO, and other styles
36

Baum, David. "Variabilitätsextraktion aus makrobasierten Software-Generatoren." Master's thesis, Universitätsbibliothek Leipzig, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-132719.

Full text
Abstract:
This thesis addresses the question of how variability information can be extracted from the source code of generators. For this purpose, a classification of variables was developed which, compared to existing approaches, enables a more precise identification of features. This subdivision also forms the basis for detecting feature interactions and cross-tree constraints. Furthermore, it is shown how the extracted information can be represented by feature models. Since these are based on the generator source code, they provide insights into the solution space of the domain: it becomes visible which implementation components a feature consists of and which relationships exist between features. However, an automatically generated feature model on its own provides only limited insight into the problem space. In addition, a prototype was developed that automates the described extraction process.
APA, Harvard, Vancouver, ISO, and other styles
37

Denis, Yvan. "Implémentation de PCM (Process Compact Models) pour l’étude et l’amélioration de la variabilité des technologies CMOS FDSOI avancées." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAT045/document.

Full text
Abstract:
Récemment, la course à la miniaturisation a vue sa progression ralentir à cause des défis technologiques qu’elle implique. Parmi ces obstacles, on trouve l’impact croissant de la variabilité local et process émanant de la complexité croissante du processus de fabrication et de la miniaturisation, en plus de la difficulté à réduire la longueur du canal. Afin de relever ces défis, de nouvelles architectures, très différentes de celle traditionnelle (bulk), ont été proposées. Cependant ces nouvelles architectures demandent plus d’efforts pour être industrialisées. L’augmentation de la complexité et du temps de développement requièrent de plus gros investissements financier. De fait il existe un besoin réel d’améliorer le développement et l’optimisation des dispositifs. Ce travail donne quelques pistes dans le but d’atteindre ces objectifs. L’idée, pour répondre au problème, est de réduire le nombre d’essai nécessaire pour trouver le processus de fabrication optimal. Le processus optimal est celui qui conduit à un dispositif dont les performances et leur dispersion atteignent les objectifs prédéfinis. L’idée développée dans cette thèse est de combiner l’outil TCAD et les modèles compacts dans le but de construire et calibrer ce que l’on appelle un PCM (Process Compact Model). Un PCM est un modèle analytique qui établit les liens entre les paramètres process et électriques du MOSFET. Il tire à la fois les bénéfices de la TCAD (puisqu’il relie directement les paramètres process aux paramètres électriques) et du modèle compact (puisque le modèle est analytique et donc rapide à calculer). Un PCM suffisamment prédictif et robuste peut être utilisé pour optimiser les performances et la variabilité globale du transistor grâce à un algorithme d’optimisation approprié. Cette approche est différente des méthodes de développement classiques qui font largement appel à l’expertise scientifique et à des essais successifs dans le but d’améliorer le dispositif. En effet cette approche apporte un cadre mathématique déterministe et robuste au problème.Le concept a été développé, testé et appliqué aux transistors 28 et 14 nm FD-SOI ainsi qu’aux simulations TCAD. Les résultats sont exposés ainsi que les recommandations nécessaires pour implémenter la technique à échelle industrielle. Certaines perspectives et applications sont de même suggérées
Recently, the race for miniaturization has slowed because of the technological challenges it entails. These barriers include the increasing impact of local and process variability, stemming from the growing complexity of the manufacturing process and from miniaturization itself, in addition to the difficulty of reducing the channel length. To address these challenges, new architectures, very different from the traditional bulk one, have been proposed. However, these new architectures require more effort to be industrialized. Increasing complexity and development time require larger financial investments. There is thus a real need to improve the development and optimization of devices. This work offers some directions towards these goals. The idea for addressing the problem is to reduce the number of trials required to find the optimal manufacturing process. The optimal process is the one that results in a device whose performance and dispersion reach predefined targets. The idea developed in this thesis is to combine the TCAD tool and compact models in order to build and calibrate what is called a PCM (Process Compact Model). A PCM is an analytical model that establishes the links between the process and electrical parameters of the MOSFET. It combines the benefits of TCAD (since it connects process parameters directly to electrical parameters) and of compact models (since the model is analytical and therefore fast to compute). A sufficiently predictive and robust PCM can be used to optimize the performance and overall variability of the transistor through an appropriate optimization algorithm. This approach differs from traditional development methods, which rely heavily on scientific expertise and successive trials to improve the device. Indeed, this approach provides a deterministic and robust mathematical framework for the problem. The concept was developed, tested, and applied to 28 and 14 nm FD-SOI transistors and to TCAD simulations. The results are presented and recommendations for implementing the technique at industrial scale are provided. Some perspectives and applications are likewise suggested
APA, Harvard, Vancouver, ISO, and other styles
38

Číhal, Martin. "Analýza variability srdečního rytmu pomocí fraktální dimenze." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-220052.

Full text
Abstract:
This work focuses on the use of the fractal dimension for heart rate variability analysis. Both the theory of heart rate variability and the methods of HRV analysis in the time domain and using the fractal dimension are summarized. A short comparison of the time-domain and fractal-dimension methods is presented.
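The abstract does not state which fractal-dimension estimator is used, so the sketch below assumes Higuchi's method, a common choice for HRV series; the implementation and the white-noise sanity check are illustrative only.

```python
import numpy as np

def higuchi_fd(x, k_max=16):
    """Estimate the fractal dimension of a time series with Higuchi's method."""
    x = np.asarray(x, dtype=float)
    n = x.size
    ks, lengths = [], []
    for k in range(1, k_max + 1):
        lk = []
        for m in range(k):                       # k down-sampled sub-series
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # normalised curve length of the sub-series
            norm = (n - 1) / ((idx.size - 1) * k)
            lk.append(np.abs(np.diff(x[idx])).sum() * norm / k)
        if lk:
            ks.append(k)
            lengths.append(np.mean(lk))
    # slope of log(L(k)) versus log(1/k) gives the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(ks)), np.log(lengths), 1)
    return slope

# sanity check: white noise should give a dimension close to 2
rng = np.random.default_rng(6)
print(round(higuchi_fd(rng.normal(size=2000)), 2))
```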
APA, Harvard, Vancouver, ISO, and other styles
39

Clèries, Soler Ramon. "Geographic Variability in Liver Cancer." Doctoral thesis, Universitat Autònoma de Barcelona, 2006. http://hdl.handle.net/10803/4627.

Full text
Abstract:
At the beginning of the 21st century, primary liver cancer (PLC) remains the fifth most common malignancy in men worldwide, and the eighth in women. Central Africa and South-East Asia are high-risk geographic areas for PLC, whereas developed countries appear to be generally low risk. Infections with the hepatitis B (HBV) and C (HCV) viruses are the main risk factors for PLC, accounting for well over 80% of PLC cases detected worldwide. The recently detected increase in both incidence of and mortality from PLC in developed countries is strongly related to these viral infections. The evaluation of PLC time trends needs to take into consideration the geographic distribution and effect of these viruses. This thesis presents three studies whose aim is to describe PLC incidence and mortality in different geographic areas, each addressing several epidemiological and methodological issues. For each study, different statistical methods based on Bayesian inference have been proposed, evaluated and discussed in order to cope with extra-Poisson variability.
The first study, entitled "Meta-analysis of cohort studies of risk of liver cancer death among HBV carriers", evaluates the variability in PLC mortality reported in 11 cohort studies of male HBV carriers, taking into consideration the effects of geographic area and the choice of the general population versus a more comparable group such as HBV-negative workers or blood donors as the comparison group. The statistical methods of this study focus on mixtures of Poisson distributions. The "stick-breaking" method has been used to estimate the number of components of the mixture of Poisson distributions, and thus to obtain a pooled relative risk (RR) of death for PLC among male HBV carriers. The pooled RR of death by PLC related to HBV infection was 23.5 (95% Credibility Interval (CRI): 14.9 - 44.5). Studies carried out in high-risk areas for PLC (China and Taiwan) showed RRs 2- to 5-fold higher than those of studies carried out in Europe, Japan and the U.S. In low-risk areas for PLC, studies which used workers or blood donors as comparison groups had RRs 1.9-fold higher (95% CRI: 1.2 - 3.1) than studies which used the general population. However, in high-risk areas, the ratio of RRs was 5.3-fold (95% CRI: 3.4 - 7.9). This is the first time that a "healthy donor effect" has been quantified in longitudinal studies.
The second study, entitled "Geographic distribution of primary liver cancer in Europe in 2002", evaluates the effect of HBV and HCV seroprevalence in 38 European countries on PLC incidence and mortality. Mixed Poisson models based on Bayesian inference have been used to smooth Standardized Incidence (SIR) and Mortality (SMR) ratios for PLC, accounting for the effect of HBV and HCV prevalences. This approach enabled us both to examine the effect of different levels of HBV and HCV, and to identify remaining variability in PLC after accounting for infection rates. Bayesian inference allowed the determination of posterior probabilities for the smoothed SIRs and SMRs (hereafter RRs). The Deviance Information Criterion (DIC) and the "effective number of parameters" (pD) have been used as tools for model choice. The highest mortality and incidence PLC RRs were found in Southern European countries (RR range 0.9-2.4), whereas Northern European countries showed the lowest RRs (RR range: 0.3-0.9). The effect of HBV infection was not found to be statistically significant in the model which accounted for both HBV and HCV prevalence. Countries with a prevalence of HCV higher than 2% (e.g. Italy and Spain) had a higher risk of incidence and mortality (RR range: 1.28 - 1.78) than countries with HCV prevalence below 1%. Thus, the high risk of PLC detected in Southern Europe appears to be explained, in part, by HCV infection. The high HCV seroprevalence in this area could be associated with exposure 30-50 years ago. There may be an underestimation of PLC incidence and mortality rates in Eastern European countries given the low PLC RRs reported, despite the high HBV and HCV seroprevalences observed. The implementation of population-based cancer registries in Eastern European countries is warranted, as well as HCV prevalence studies across Europe, to better determine the distribution of PLC in Europe and its relationship with that virus.
The last study, entitled "Time trends in liver disease in Spain during the period 1983-97", describes incidence and mortality trends in hepatocellular carcinoma and cholangiocarcinoma as well as mortality trends in liver cirrhosis in Spain. Autoregressive age-period-cohort (APC) models have been used to evaluate the time trends. We found that APC models performed well for those liver diseases with a large number of cases, whereas age-period models performed well for those liver diseases with a low number of cases. We found an increase in incidence and mortality of hepatocellular carcinoma in Spain (annual percent change (APCH) in men's incidence: 6.6%, 95% CRI: 5.8%, 8.1%; APCH in women's incidence: 4.5%, 95% CRI: 1.4%, 7.3%; APCH in men's mortality: 6.8%, 95% CRI: 5.8%, 8.1%; APCH in women's mortality: 5.1%, 95% CRI: 3.5%, 6.3%), which appears to be related to HCV exposure 30 years ago, as described in other studies of PLC. We also found an increasing trend in cholangiocarcinoma mortality (APCH in men: 17.1%, 95% CRI: 13.5%, 21.2%; APCH in women: 15.0%, 95% CRI: 11.5%, 19.5%), similar to that found in some developed countries, which could be attributed to improvement in diagnosis resulting from better imaging and diagnostic techniques. However, we did not detect a significant increasing trend in cholangiocarcinoma incidence, perhaps due to the low number of cases reported by the Spanish cancer registries. We observed a decreasing trend in cirrhosis mortality in both sexes during the study period (APCH in men: -3.1%, 95% CRI: -5.1%, -1.9%; APCH in women: -2.9%, 95% CRI: -6.2%, -1.3%), although younger cohorts did not show this pattern. This cohort effect suggests the possibility that younger cohorts could be exposed to some additional risk factors besides alcohol consumption. HIV and HCV or HBV co-infection and intravenous drug addiction could explain the increase in liver cirrhosis mortality among younger cohorts.
The flexibility of the Bayesian approach allowed us to cope with extra-Poisson variability in three statistical analyses, applying different models, and addressing relevant methodological aspects specific to each problem. Challenging statistical issues in the framework of Bayesian applied modelling are: i) the selection of prior distributions for model parameters, which is related to convergence of the model; and ii) model selection procedures, and these remain important considerations for future research.
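The smoothing of area-level risks described in the second study can be caricatured with a simple empirical-Bayes gamma-Poisson model: raw SMRs (observed over expected counts) are shrunk toward a common prior. This sketch is far simpler than the fully Bayesian mixed models of the thesis (it has no covariates such as HBV/HCV prevalence), and the observed and expected counts are invented.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

# invented observed (O) and expected (E) counts for a handful of areas
O = np.array([12, 3, 45, 8, 60, 2, 20])
E = np.array([10.0, 5.5, 30.0, 9.0, 42.0, 4.0, 18.0])

def neg_marginal_loglik(params):
    """A Gamma(a, b) prior on the relative risk gives a negative-binomial marginal."""
    a, b = np.exp(params)                       # keep both parameters positive
    p = b / (b + E)                             # NB success probability per area
    return -np.sum(stats.nbinom.logpmf(O, a, p))

a_hat, b_hat = np.exp(minimize(neg_marginal_loglik, x0=[0.0, 0.0]).x)

raw_smr     = O / E
smoothed_rr = (O + a_hat) / (E + b_hat)         # posterior mean relative risks
for r, s in zip(raw_smr, smoothed_rr):
    print(f"raw SMR = {r:5.2f}  ->  smoothed RR = {s:5.2f}")
```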
APA, Harvard, Vancouver, ISO, and other styles
40

Marciniak, Jennifer Yuko. "Variability in eukaryotic gene expression /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2005. http://wwwlib.umi.com/cr/ucsd/fullcit?p3208639.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Ho, Kwan-wai Annie. "Variability of cleft palate speech." Click to view the E-thesis via HKUTO, 2001. http://sunzi.lib.hku.hk/hkuto/record/B36207883.

Full text
Abstract:
Thesis (B.Sc)--University of Hong Kong, 2001.
"A dissertation submitted in partial fulfilment of the requirements for the Bachelor of Science (Speech and Hearing Sciences), The University of Hong Kong, May 4, 2001." Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
42

Stemmer, Georg. "Modeling variability in speech recognition /." Berlin : Logos-Verl, 2005. http://deposit.ddb.de/cgi-bin/dokserv?id=2659313&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Akram, Asif, and Qammer Abbas. "COMPARISON OF VARIABILITY MODELING TECHNIQUES." Thesis, Jönköping University, JTH, Computer and Electrical Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-9643.

Full text
Abstract:

Variability in complex systems offering a rich set of features is a serious challenge to their users in terms of flexibility, with many possible variants for different application contexts, and maintainability. Over a long period of time, much effort has been made to deal with these issues. One effort in this regard is developing and implementing different variability modeling techniques. This thesis discusses three modeling techniques, named configurable components, feature models, and function-means trees. The main contributions of the research include:
• A comparison of the above-mentioned variability modeling techniques in a systematic way,
• An attempt to find integration possibilities for these modeling techniques based on a literature review, case studies, comparison, discussions, and brainstorming.
The comparison is based on three case studies, each of which is implemented in all three modeling techniques, and on a set of generic aspects of these techniques which are further divided into characteristics. At the end, a comprehensive discussion of the comparison is presented, and in the final sections some integration possibilities are proposed on the basis of the case studies, characteristics, commonalities, and experience gained through the implementation of the case studies and the literature review.

APA, Harvard, Vancouver, ISO, and other styles
44

Lee, Chien-Hsiu. "Microlensing and Variability towards M31." Diss., lmu, 2011. http://nbn-resolving.de/urn:nbn:de:bvb:19-132307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Klaube, Maximilian. "Spatial Variability of shotcrete thickness." Thesis, KTH, Jord- och bergmekanik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-224933.

Full text
Abstract:
An important task during the construction process is to validate the dimensions and properties of a given structure. Dimensions such as the thickness of a construction element should be measured after it has been built. The aim is to compare the measured value with the design value, to avoid elements that do not correspond to the input requirements. Moreover, the measurements are helpful for analysing the statistical distribution of the investigated geometrical property, for example by computing a histogram, which visualises the dispersion and enables the calculation of the probability of failure for a specific structure or element. In this work, a shotcrete layer has been analysed in order to provide information about the homogeneity of the shotcrete thickness in a pre-determined tunnel section. The calculation method is based on two laser scans, taken before and after applying the shotcrete. Due to the construction process, the shotcrete layer will not be entirely uniform, which might be a safety problem, especially when the shotcrete layer is thinner than required; hence, the actual variation of the shotcrete must be considered and verified. To determine the statistical distribution, correlograms and histograms have been computed for a wall area in a tunnel in southern Sweden. The correlogram shows the distance over which the values are correlated with each other, and this distance is usually called the scale of fluctuation. For the wall section, the scale of fluctuation has been calculated along the length (0.8 m) as well as the height (0.8 m). Compared to the original sample distance, e.g. the spacing of the rock bolts, the variance used in the calculation of the probability of failure might be reduced.
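A correlogram and scale-of-fluctuation estimate of the kind described above can be computed along a scan line as follows; this is an illustrative sketch, not the thesis's procedure. It assumes equally spaced thickness samples and uses one common estimator (twice the integral of the correlogram up to its first zero crossing); the synthetic data are tuned so that the answer comes out near 0.8 m.

```python
import numpy as np

def correlogram(values, max_lag):
    """Empirical autocorrelation of equally spaced thickness samples."""
    x = np.asarray(values, dtype=float) - np.mean(values)
    var = np.mean(x ** 2)
    return np.array([np.mean(x[:len(x) - lag] * x[lag:]) / var if lag else 1.0
                     for lag in range(max_lag + 1)])

def scale_of_fluctuation(values, spacing, max_lag):
    """theta = 2 * integral of rho(tau) dtau, truncated at the first zero crossing."""
    rho = correlogram(values, max_lag)
    first_neg = np.argmax(rho < 0) if np.any(rho < 0) else len(rho)
    r = rho[:first_neg]
    integral = spacing * (r.sum() - 0.5 * (r[0] + r[-1]))   # trapezoidal rule
    return 2.0 * integral

# toy data: thickness samples every 0.1 m with ~0.8 m correlated fluctuations
rng = np.random.default_rng(7)
n, spacing = 500, 0.1
noise = rng.normal(0, 10, n + 40)
thickness = 70 + np.convolve(noise, np.ones(8) / 8, mode="valid")[:n]  # mm
print("estimated scale of fluctuation [m]:",
      round(scale_of_fluctuation(thickness, spacing, max_lag=60), 2))
```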
APA, Harvard, Vancouver, ISO, and other styles
46

Taing, Nguonly. "Run-time Variability with Roles." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-234933.

Full text
Abstract:
Adaptability is an intrinsic property of software systems that require adaptation to cope with dynamically changing environments. Achieving adaptability is challenging. Variability is a key solution, as it enables a software system to change its behavior to match a specific need. The abstraction of variability is to manage variants, which are dynamic parts to be composed into the base system. Run-time variability realizes these variant compositions dynamically at run time to enable adaptation. Adaptation relying on variants specified at build time is called anticipated adaptation, which allows the system behavior to change with respect to a set of predefined execution environments. This implies the inability to solve practical problems in which the execution environment is not completely fixed and is often unknown until run time. Enabling unanticipated adaptation, which allows variants to be dynamically added at run time, alleviates this inability, but it has several implications that yield system instability, such as inconsistency and run-time failures. To avoid inconsistency, adaptation should be performed only when a system reaches a consistent state. Inconsistency is an effect of adaptation that happens when the system changes its state and behavior while a series of method invocations is still in progress. A software bug is another source of system instability. It often appears in a variant composition and is brought into the system during adaptation. The problem is even more critical for unanticipated adaptation, as the system has no prior knowledge of the new variants. This dissertation aims to achieve anticipated and unanticipated adaptation. In achieving adaptation, the issues of inconsistency and software failures, which may occur as a consequence of run-time adaptation, are addressed as well. Roles encapsulate dynamic behavior used to adapt players representing the base system, which is the rationale for selecting roles as the software system's variants. Based on the role concept, this dissertation presents three mechanisms to comprehensively address adaptation. First, a dynamic instance binding mechanism is proposed to loosely bind players and roles. Dynamic binding of roles enables anticipated and unanticipated adaptation. Second, an object-level tranquility mechanism is proposed to avoid inconsistency by allowing a player object to adapt only when its consistent state is reached. Last, a rollback recovery mechanism is proposed as a proactive mechanism to embrace and handle failures resulting from a defective composition of variants. A checkpoint of a system configuration is created before adaptation; if a specialized bug sensor detects a failure, the system rolls back to the most recent checkpoint. These mechanisms are integrated into a role-based runtime called LyRT. LyRT was validated with three case studies to demonstrate its practical feasibility. This validation showed that LyRT is more advanced than existing variability approaches with respect to adaptation, owing to its consistency control and failure handling. In addition, several benchmarks were set up to quantify the overhead of LyRT in terms of the execution time of adaptation. The results revealed the overhead introduced to achieve anticipated and unanticipated adaptation to be small enough for practical use in adaptive software systems. Thus, LyRT is suitable for adaptive software systems that frequently require the adaptation of large sets of objects.
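The role/player idea behind dynamic instance binding can be sketched in a few lines of plain Python; this is a conceptual illustration only and bears no relation to LyRT's actual implementation (which also provides tranquility-based consistency control and rollback recovery). All class and method names are invented.

```python
class Role:
    """A role wraps a player and adds or overrides behaviour at run time."""
    def __init__(self, player):
        self.player = player

    def __getattr__(self, name):
        # fall through to the player for everything the role does not define
        return getattr(self.player, name)


class Person:
    def __init__(self, name):
        self.name = name

    def greet(self):
        return f"Hi, I am {self.name}"


class CustomerRole(Role):
    def greet(self):                             # adapted behaviour
        return f"{self.player.greet()} and I am shopping"


class Registry:
    """Dynamic instance binding: roles are bound to players at run time."""
    def __init__(self):
        self._bindings = {}

    def bind(self, player, role_cls):
        self._bindings[id(player)] = role_cls(player)

    def unbind(self, player):
        self._bindings.pop(id(player), None)

    def dispatch(self, player):
        return self._bindings.get(id(player), player)


registry = Registry()
alice = Person("Alice")
print(registry.dispatch(alice).greet())          # base behaviour
registry.bind(alice, CustomerRole)               # adaptation at run time
print(registry.dispatch(alice).greet())          # adapted behaviour
registry.unbind(alice)
print(registry.dispatch(alice).greet())          # back to base behaviour
```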
APA, Harvard, Vancouver, ISO, and other styles
47

Groot, Paul Joseph. "Optical variability in compact sources." [Amsterdam] : Amsterdam : Sterrenkundig Instituut 'Anton Pannekoek' ; Universiteit van Amsterdam [Host], 1999. http://dare.uva.nl/document/92169.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Slauson, Leigh Victoria. "Students' conceptual understanding of variability." Columbus, Ohio : Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1199117318.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Berger, Thorsten. "Variability Modeling in the Real." Doctoral thesis, Universitätsbibliothek Leipzig, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-113623.

Full text
Abstract:
Variability modeling is one of the key disciplines to cope with complex variability in large software product lines. It aims at creating, evolving, and configuring variability models, which describe the common and variable characteristics, also known as features, of products in a product line. Since the introduction of feature models more than twenty years ago, many variability modeling languages and notations have been proposed both in academia and industry, followed by hundreds of publications on variability modeling techniques that have built upon these theoretical foundations. Surprisingly, there are relatively few empirical studies that aim at understanding the use of such languages. What variability modeling concepts are actually used in practice? Do variability models applied in real-world look similar to those published in literature? In what technical and organizational contexts are variability models applicable? We present an empirical study that addresses this research gap. Our goals are i) to verify existing theoretical research, and ii) to explore real-world variability modeling languages and models expressed in them. We study concepts and semantics of variability modeling languages conceived by practitioners, and the usage of these concepts in real, large-scale models. Our aim is to support variability modeling research by providing empirical data about the use of its core modeling concepts, by identifying and characterizing further concepts that have not been as widely addressed, and by providing realistic assumptions about scale, structure, content, and complexity of real-world variability models. We believe that our findings are of relevance to variability modeling researchers and tool designers, for example, those working on interactive product configurators or feature dependency checkers. Our extracted models provide realistic benchmarks that can be used to evaluate new techniques. Recognizing the recent trend in software engineering to open up software platforms to facilitate inter-organizational reuse of software, we extend our empirical discourse to the emerging field of software ecosystems. As natural successors of successful product lines, ecosystems manage huge variability among and within their software assets, thus, represent a highly interesting class of systems to study variability modeling concepts and mechanisms. Our studied systems comprise eleven highly configurable software systems, two ecosystems with closed platforms, and three ecosystems relying on open platforms. Some of our subjects are among the largest successful systems in existence today. Results from a survey on industrial variability modeling complement these subjects. Our overall results provide empirical evidence that the well-researched concepts of feature modeling are used in practice, but also that more advanced concepts are needed. We observe that assumptions about variability models in the literature do not hold. Our study also reveals that variability models work best in centralized variability management scenarios, and that they are fragile and have to be controlled by a small team. We also identify a particular type of dependencies that is increasingly used in open platforms and helps sustain the growth of ecosystems. Interestingly, while enabling distributed variability, these dependencies rely on a centralized and stable vocabulary. Finally, we formulate new hypotheses and research questions that provide direction for future research.
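As a concrete illustration of the core feature-modeling concepts surveyed above (parent-child relations, mandatory and optional features, or-groups, and cross-tree constraints), here is a minimal, hypothetical configuration checker; it is not taken from any of the studied languages or tools, and the feature names are invented.

```python
# A hypothetical feature model: feature -> (parent, kind)
MODEL = {
    "editor":     (None,        "root"),
    "rendering":  ("editor",    "mandatory"),
    "spellcheck": ("editor",    "optional"),
    "html":       ("rendering", "or"),        # or-group under "rendering"
    "pdf":        ("rendering", "or"),
}
REQUIRES = [("pdf", "spellcheck")]            # cross-tree constraint: "a requires b"

def valid(config):
    """Check a set of selected feature names against the model."""
    config = set(config) | {"editor"}                      # the root is always selected
    for feat, (parent, kind) in MODEL.items():
        if parent and feat in config and parent not in config:
            return False                                   # a child implies its parent
        if kind == "mandatory" and parent in config and feat not in config:
            return False                                   # mandatory child must follow parent
    # an or-group needs at least one selected member when its parent is selected
    or_groups = {}
    for feat, (parent, kind) in MODEL.items():
        if kind == "or":
            or_groups.setdefault(parent, []).append(feat)
    for parent, members in or_groups.items():
        if parent in config and not any(m in config for m in members):
            return False
    return all(a not in config or b in config for a, b in REQUIRES)

print(valid({"rendering", "html"}))          # True
print(valid({"rendering", "pdf"}))           # False: pdf requires spellcheck
print(valid({"spellcheck"}))                 # False: mandatory "rendering" is missing
```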
APA, Harvard, Vancouver, ISO, and other styles
50

Jewson, Stephen P. "Decadal and interdecadal climate variability." Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.308590.

Full text
APA, Harvard, Vancouver, ISO, and other styles
