Dissertations / Theses on the topic 'Quantitavie'

Consult the top 50 dissertations / theses for your research on the topic 'Quantitavie.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Albert, Elise. "Déterminants génétiques et génomiques de la réponse au déficit hydrique chez la tomate (Solanum lycopersicum) et impact sur la qualité des fruits." Thesis, Avignon, 2017. http://www.theses.fr/2017AVIG0688/document.

Abstract:
Water scarcity will constitute a crucial constraint for agricultural productivity in the near future. High-throughput approaches in model species have identified hundreds of genes potentially involved in survival under drought conditions, but very few have beneficial effects on quality and yield in crop plants. Nonetheless, controlled water deficits may improve fleshy fruit quality through weaker dilution and/or accumulation of nutritional compounds. In this context, the first part of the PhD aimed at deciphering the genetic determinants of the phenotypic response to water deficit in tomato by exploring the genotype by watering regime (G x W) and QTL by watering regime (QTL x W) interactions in two populations. The first population consisted of recombinant inbred lines (RILs) from a cross between two cultivated accessions, and the second was composed of diverse small-fruited tomato accessions mostly native to South America. Plants were phenotyped for major plant and fruit quality traits and genotyped for thousands of SNPs. Data were analyzed within the linkage and association mapping frameworks, allowing the identification of QTLs and putative candidate genes for response to water deficit in tomato. The second part of the PhD explored gene regulation in green fruit and leaves of tomato plants stressed by water deficit. For this purpose, RNA-Seq data were collected on the two parental genotypes of the RIL population and their F1 hybrid. Data were analyzed to identify differentially expressed genes and allele-specific expression (ASE). Then, the expression of 200 genes was measured in leaves and fruits of the whole RIL population by high-throughput microfluidic qPCR. eQTLs and eQTL by watering regime interactions were mapped for those genes using linkage mapping. Colocalisations with the phenotypic QTLs were analyzed. The knowledge produced during this PhD will contribute to a better understanding of the tomato plant's interaction with its environment and provide bases for the improvement of fruit quality under limited water supply.
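
To make the "QTL x W" analysis concrete, here is a minimal sketch of testing a genotype-by-watering-regime interaction at a single marker with an ordinary least-squares model; the long-format table and its column names (y, geno, regime) are hypothetical and not taken from the thesis:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format phenotype table: one row per plant,
# with trait value y, marker genotype geno and watering regime.
df = pd.read_csv("ril_phenotypes.csv")

# Linear model with a genotype-by-watering-regime interaction term;
# a significant geno:regime effect is the QTL x W signal at this marker.
fit = smf.ols("y ~ C(geno) * C(regime)", data=df).fit()
print(fit.summary())
```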
2

Jha, Ankita. "Quantitative control of GPCR organization and signaling by endocytosis in epithelial morphogenesis." Thesis, Aix-Marseille, 2017. http://www.theses.fr/2017AIXM0393/document.

Abstract:
During Drosophila gastrulation, apical activation of actomyosin networks drives apical constriction in the invaginating mesoderm and cell-cell intercalation in the extending ectoderm. Here, we show that the cell-surface G-protein-coupled receptor Smog activates G proteins, Rho1 and Rho-kinase, which are required for apical constriction and cell-cell intercalation. Quantitative control over GPCR activity, and thereby Rho1 activation, underlies differences in deformation of the mesoderm and ectoderm cells, but the mechanisms remain elusive. We show that Smog activity is concentrated on two different apical plasma membrane compartments: the surface and plasma membrane invaginations. Using FCS, we probe the surface of the plasma membrane (PM) and show that Smog forms homo-clusters in response to its activating ligand Fog. Endocytosis of Smog is facilitated by the kinase Gprk2 and the adaptor protein β-Arrestin-2, which clears active Smog from the surface of the PM. When Fog concentration is high or endocytosis is low, Smog arranges in homo-clusters and accumulates in plasma membrane invaginations (PMIs), which are hubs for Rho1 activation. Lastly, we find higher Smog homo-cluster concentrations and more numerous apical PMIs in the mesoderm than in the ectoderm. Dynamic partitioning of active Smog between the surface of the PM and PMIs directly impacts Rho1 signaling: PMIs accumulate high Rho1-GTP, suggesting that they form signaling centers. Fog concentration and Smog endocytosis are coupled regulatory processes that control the quantitative differential Rho1/MyoII activation in the Drosophila mesoderm and ectoderm.
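
For context, the textbook FCS autocorrelation model for free two-dimensional diffusion in a membrane, against which such surface measurements are commonly interpreted (a standard relation, not a formula quoted from the thesis), is

```latex
G(\tau) = \frac{1}{\langle N \rangle} \left( 1 + \frac{\tau}{\tau_D} \right)^{-1},
\qquad
\tau_D = \frac{w^2}{4D},
```

where ⟨N⟩ is the mean number of fluorescent molecules in a focal spot of waist w and D is the diffusion coefficient; an increase in molecular brightness at constant total intensity is the usual signature of homo-clustering.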
3

Nooriafshar, Mehryar. "Balancing the Use of Technology and Traditional Approaches in Teaching Mathematics within Business Courses." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-80771.

Abstract:
Technologies associated with modern computing are commonly used in education, and over the past few years their usage has increased considerably. This increase is also attributed to the availability of improved technology products and services at much lower costs. As a result, many successful educational multimedia products have been developed which have made significant contributions to learning and teaching mathematics at various levels. However, it is not always clear what exactly the position of technology in education is. In other words, to what extent do technology-aided means of learning enhance learning and add value over conventional materials? Are they supposed to supersede or exceed the learning effectiveness of traditional methods of teaching? This paper explores the possibilities of utilizing the latest technologies, such as Virtual Reality (VR) environments and Tablet PCs, in conjunction with traditional approaches and concepts to create balanced and more effective learning and teaching conditions. It also demonstrates how a situation where 'one cannot see the wood for the trees' can be avoided by striking the right balance.
4

Goldhahn, Dirk. "Quantitative Methoden in der Sprachtypologie: Nutzung korpusbasierter Statistiken." Doctoral thesis, Universitätsbibliothek Leipzig, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-130550.

Abstract:
This thesis examines various aspects of the use of corpus-based statistics in quantitative typological investigations. The individual parts of the work can be seen as stages of a language-independent processing pipeline, which thus enables comprehensive studies of the world's languages. It covers the steps from the automated creation of the underlying resources, through mathematically grounded methods, to the finished results of the various typological analyses. The investigations focus first on the text corpora underlying the analysis, in particular on their acquisition and processing from a technical point of view. This is followed by studies on the use of the corpora in the field of lexical language comparison, where a quantification of linguistic relationships is achieved by empirical means. In addition, the corpora serve as the basis for automated measurements of linguistic parameters. On the one hand, such measurable properties are presented; on the other, they are systematically examined with regard to their usability for typological studies. Finally, the relationships of these measurements to one another and to typological parameters are investigated using quantitative methods.
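
As an illustration of the kind of corpus-based measurement such a language-independent pipeline can produce, here is a minimal sketch computing two simple, automatically measurable parameters from a plain-text corpus; the file path and the choice of measures are illustrative assumptions, not the methods of the thesis:

```python
from collections import Counter

def corpus_statistics(path):
    """Compute simple corpus-based language parameters (illustrative)."""
    tokens = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            tokens.extend(line.split())
    counts = Counter(tokens)
    return {
        # Mean word length correlates with morphological typology.
        "avg_word_length": sum(len(t) for t in tokens) / len(tokens),
        # Type-token ratio is a crude measure of lexical diversity.
        "type_token_ratio": len(counts) / len(tokens),
    }

print(corpus_statistics("corpus_deu.txt"))
```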
5

Jouvanceau, Valentin. "Three essays on unconventional monetary policies." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE2061.

Abstract:
The thesis is organized in three research papers. The first chapter is "Quantitative Easing and Excess Reserves". Chapter 2 offers a paper named "New Evidence on the Effects of Quantitative Easing". Chapter 3 is a paper called "State-contingent Forward Guidance". "Quantitative Easing and Excess Reserves": What are the impacts of a flush of interest-bearing excess reserves on the real economy? Surprisingly, the theoretical literature remains silent about this question. We address this issue in a new-Keynesian model with various financial frictions and reserve requirements in the balance sheet of bankers. Modeling QE by the supply of excess reserves allows for endogenous changes in the relative supply of financial assets. We find that this mechanism is crucial to identify and disentangle between the portfolio balance, credit and asset price channels of QE. Further, we demonstrate that the macroeconomic effects of QE are rather weak and mainly transmitted through the asset price channel. "New Evidence on the Effects of Quantitative Easing": Have the macroeconomic effects of QE programs been empirically overestimated? Using a large set of model specifications that differ in the degree of time-variation in parameters, the answer is yes. Our forecasting exercise suggests that it is crucial to allow for time-variation in parameters, but not for stochastic volatility. In an analysis of structural QE shocks, we find that QE1 had larger macroeconomic effects than QE2 and QE3, but much smaller than usually found in the literature. "State-contingent Forward Guidance": In this paper, the impacts of a state-contingent forward guidance are assessed in a DSGE model with search and matching frictions in the labor market. Under an Odyssean perspective, the commitment is found to have relatively low effects on the economy. This result tackles the so-called forward guidance puzzle. In addition, the simulations suggest that a state-contingent forward guidance is mainly transmitted by shifts in inflation expectations.
6

Talouarn, Estelle. "Utilisation des données de séquence pour la cartographie fine et l'évaluation génomique des caractères d'intérêt des caprins laitiers français." Thesis, Toulouse, INPT, 2020. http://www.theses.fr/2020INPT0067.

Abstract:
French dairy goats recently entered the genomic era with the development of a DNA chip in the 2010s and the first QTL detections and genomic evaluations. The availability of sequence data for farm animals opens up new opportunities. The VarGoats project is an international 1,000-genomes resequencing program designed to provide sequence information for the Capra hircus species. Studying imputation quality to the sequence level is a necessary first step before using imputed sequences in association analyses and genomic evaluations. The main objective of this work was to study the possible integration of sequence data into the French dairy goat breeding programs. The set-up of a quality check represented a sizable part of this thesis. It was based on bibliographic research and the comparison between available 50k genotypes and sequence data. Out of the initial 97,889,899 SNPs and 12,304,043 indels, we eventually retained 23,338,436 variants, including 40,491 SNPs of the Illumina GoatSNP50 BeadChip. A preliminary study of imputation from 50k genotypes to sequence was then performed with the aim of obtaining a sufficient number of imputed sequences of good quality. Several software packages and methods were considered (family or population imputation) using the 829 sequenced animals available. Within-breed imputation led to genotype and allele concordances of 0.74 and 0.86 in Saanen and 0.76 and 0.87 in Alpine, respectively. Correlations were 0.26 and 0.24 in Alpine and Saanen, respectively. Imputed sequences of males confirmed signals previously identified using 50k genotypes and allowed the detection of new regions of interest. The density of sequence data represented an unprecedented opportunity to deepen our understanding of a QTL region on chromosome 19 in the Saanen breed. This region is associated with production, type and udder health traits as well as semen production traits. Our analysis did not identify any candidate mutation. However, we offer a simple way to identify genomic and phenotypic profiles in the Saanen breed using 50k genotypes. This method could be of use for early prediction in France and worldwide. Finally, using all previous results, we studied the impact of integrating imputed sequence data from chromosome 19 on the accuracy of evaluations in French Saanen. Several evaluation models were compared: single-step GBLUP (ssGBLUP) and weighted single-step GBLUP (WssGBLUP), using different panels of imputed variants. The best results were obtained using ssGBLUP with 50k genotypes and all variants in the QTL region of chromosome 19 (between 24.72 and 28.38 Mb), yielding +6.2% accuracy on average over all evaluated traits. The 50k chip update in which I participated represents an opportunity to improve genomic evaluations. Indeed, it significantly improved the accuracy of predictions (between 3.1 and 6.4% on average depending on the scenario) while limiting the computation time associated with imputation. This work confirms the benefits of using sequence data in the French dairy goat breeding programs and opens up the perspective of integrating them into routine genomic evaluations.
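
A minimal sketch of the genotype and allele concordance rates quoted above, computed between masked true genotypes and their imputed values; the 0/1/2 allele-count coding and array layout are assumptions:

```python
import numpy as np

def concordance(true_geno, imputed_geno):
    """Genotype and allele concordance for genotypes coded 0/1/2
    (counted alleles); rows are animals, columns are variants."""
    true_geno = np.asarray(true_geno)
    imputed_geno = np.asarray(imputed_geno)
    genotype_conc = np.mean(true_geno == imputed_geno)
    # Each genotype carries two alleles; a 0 <-> 2 error mismatches both.
    allele_errors = np.abs(true_geno - imputed_geno)  # 0, 1 or 2
    allele_conc = 1.0 - allele_errors.mean() / 2.0
    return genotype_conc, allele_conc
```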
7

Nooriafshar, Mehryar. "Balancing the Use of Technology and Traditional Approaches in Teaching Mathematics within Business Courses." Proceedings of the tenth International Conference Models in Developing Mathematics Education. - Dresden : Hochschule für Technik und Wirtschaft, 2009. - S. 450 - 453, 2012. https://slub.qucosa.de/id/qucosa%3A1795.

8

Wengert, Christian. "Quantitative endoscopy /." Konstanz : Hartung-Gorre Verlag, 2008. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17686.

9

Faustino, Rui Alexandre Rodrigues Veloso. "Quantitative easing." Master's thesis, NSBE - UNL, 2012. http://hdl.handle.net/10362/9552.

Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Economics from the NOVA – School of Business and Economics
Since November 2008, the Federal Reserve of the United States pursued a series of large-scale asset purchases, known as Quantitative Easing. In this Work Project, I describe the context, the objectives and the implementation of Quantitative Easing. Additionally, I discuss its expected effects. Finally, I present empirical evidence of the effects on interest rates, output and inflation. I conclude that the first round of purchases was effective in preventing deflation and depression, while the second had only a small impact on the economy.
10

Cleary, Maryanne Viola. "Quantitative HPTLC." Thesis, This resource online, 1995. http://scholar.lib.vt.edu/theses/available/etd-07112009-040558/.

11

Kleinschmidt, Heike. "Kompetenzaustausch als Basis einer inklusiven Beziehungskultur." Bachelor's thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-146437.

Abstract:
Abstract in English: Inclusion is a state in which the opportunity to understand everything is given naturally. In order to support this idea, and simply to be authentic in this academic work in Inclusion Studies, the following part is an English summary of my Bachelor thesis. Using the English language, understood and spoken by a huge number of today's world population, I would like interested people who do not know German to find access to the subject of my work. Exchange of competences and culture of relationship (abstract): "Exchange of competences as a basis for an inclusive culture of relationship" is the title of my Bachelor thesis. It means that an interpersonal exchange of abilities and understandings can be a starting point for relationships that are realised and experienced in a way that can be called inclusive (referring to "inclusion"). This work focuses especially on relationships between so-called clients and pedagogues. The thesis consists of four textual chapters, A to D. Part A describes my motivation for this work, which is personal experience, presents the leading question (does a mutual exchange of competences effect an inclusive relationship culture?) and states the hypothesis: if competences are exchanged, the relationship 'clientele ⇆ pedagogue' is positively perceived on both sides; this feeling, and in consequence the relationship itself, give inclusivity and sustainable effectiveness to the growing process taking place on both sides. The methodology is also described in this first chapter. The next chapter introduces definitions to ensure that readers fully understand the key terms and topics: relationships; culture and cultures; exchange and balance; competences, resources, potentials; communication; inclusion; science. Chapter C draws on two different kinds of sources to support and test the hypothesis as well as to illustrate the concept. The first part of this third chapter deals with models from the social sciences such as "inclusive learning culture" (A. Zimpel 2012), "cybernEthics" (H. von Foerster 1993), hermeneutics (Joedecke 2013), empowerment (W. Stark), children's and human rights (J. Korczak; United Nations) and cultural-historical psychology (L. S. Vygotsky, A. N. Leontev, A. R. Luria). These concepts are compared and woven together, and then illustrated by authentic examples in the chapter's second part: ten written surveys prepared in a narrative style. The interview subjects were asked to talk about living in a cultural surrounding that was at first "different" and "new" for them: which perceptions and emotions they had, what helped them to establish relationships with people there, and how it was to perform pedagogical tasks as a "not-yet-speaking" person. Chapter D sums up the connections between all the sources mentioned above. In addition, a metaphor is created, called the "cultural bag": representing the mutual growing process surrounded by the inclusive relationship culture through which the individuals' competences can expand, the "bag of culture" is a form of container and at the same time transportable, so each partner in the relationship can individually take it along and use or apply it in any new situation.
12

Merk, Sven. "Plasmadiagnostik und Quantifizierung für die laserinduzierte Plasmaspektroskopie." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2013. http://dx.doi.org/10.18452/16664.

Abstract:
In this work, new concepts for calibration-free analysis by laser-induced breakdown spectroscopy (LIBS) and for plasma diagnostics on laser-induced plasmas have been developed. In the course of this, already known techniques were analyzed for weaknesses and limitations, and new techniques based on these findings were presented. Using computer simulations of perfectly defined and entirely modifiable plasmas, the well-known CF-LIBS (calibration-free LIBS) technique was evaluated, and the influence of all plasma parameters on the analysis result was investigated. It was shown that this technique, in its current form, is capable of delivering semi-quantitative results only. An improvement of CF-LIBS was achieved by the use of spatially resolved measurements, which overcome the plasma heterogeneity. By combining this approach with a plasma simulation to determine the temperature and density, both important parameters in calibration-free analysis, much more stable results could be obtained. However, the established technique for spatially resolved plasma investigation, the Abel inversion, is limited to cylindrically symmetric plasmas; even weak disturbances of such symmetry can lead to major errors in the results. Using the inverse Radon transformation, which is introduced to plasma diagnostics on laser-induced plasmas by the present work, the first close-to-reality reconstructions and investigations of plasmas with disturbed symmetry and even intrinsic asymmetry were conducted. This opens a new field in plasma diagnostics and contributes to a better understanding of laser-induced plasmas in general. It forms the foundation for the further development of laser-induced breakdown spectroscopy into an absolute analytical method.
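
To make the geometry concrete, here is a minimal sketch of tomographic reconstruction with the inverse Radon transform, using the generic radon/iradon routines from scikit-image rather than the code developed in the thesis; the emissivity map, grid size and viewing angles are illustrative assumptions:

```python
import numpy as np
from skimage.transform import radon, iradon

# Hypothetical 2D plasma emissivity map, deliberately asymmetric.
size = 128
y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
emissivity = np.exp(-((x - 0.2) ** 2 + y ** 2) / 0.1)

# Line-of-sight projections recorded from several viewing angles...
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = radon(emissivity, theta=angles, circle=False)

# ...and filtered back-projection recovers the local emissivity
# without any cylindrical-symmetry assumption (unlike Abel inversion).
reconstruction = iradon(sinogram, theta=angles, circle=False)
```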
13

Ciardo, Diletta. "Quantitative analysis of the regulation of the DNA replication program by the intra-S phase checkpoint in Xenopus embryos Checkpoint control of the spatio-temporal regulation of DNA replication in Xenopus early embryos Polo-like kinase 1 (Plk1) is a positive regulator of DNA replication in the Xenopus in vitro system On the Interplay of the DNA Replication Program and the Intra-S Phase Checkpoint Pathway Genome wide decrease of DNA replication eye density at the midblastula transition of Xenopus laevis Polo like kinase 1 promotes dispersed replication origin firing during S phase." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS478.

Abstract:
The initiation of DNA replication in multicellular organisms starts from several thousand genomic loci called replication origins. They are grouped into domains which replicate early or late during S phase. The firing of a replication origin creates two diverging replication forks that replicate the flanking DNA. One of the mechanisms regulating the DNA replication program is the ATR/Chk1-dependent intra-S phase checkpoint. This pathway is activated by replicative stress due to stalled replication forks at early-firing origins and, in turn, inhibits the late firing of origins. It has been proposed that the checkpoint recovery kinase Plk1 (Polo-like kinase 1) could be responsible for allowing origin firing close to stalled forks in replication stress conditions. However, origin firing has not been analysed after Plk1 inhibition or depletion during an unperturbed S phase. To assemble a comprehensive and unified view of the DNA replication process, numerical and analytical models have been built in the past, but none of them integrates the role of checkpoint pathways. The goal of my thesis was to investigate experimentally and analytically how the checkpoint regulates the firing of origins in space and time and, in particular, whether Plk1 is implicated in the regulation of origin firing during unperturbed S phase. To this end, I used the Xenopus in vitro system. First, I integrated the checkpoint pathway into a numerical model describing the replication program in the Xenopus in vitro system. I tested different scenarios and used DNA combing data previously obtained by the laboratory after inhibition of the checkpoint kinase Chk1. Monte Carlo simulated data were fitted to the experimental data by optimizing the values of the free parameters of the models using a genetic algorithm. I found that two new hypotheses should be added to formerly built replication models: 1) a strong inhibition of origin firing by Chk1 from the beginning of S phase; 2) the presence of early replicating genomic domains that evade the origin firing inhibition. Second, I experimentally showed that active Plk1 is recruited to chromatin before the start of an unperturbed S phase and that, in the absence of Plk1, DNA replication is slowed down. Moreover, Plk1 depletion led to an increase in Chk1 phosphorylation (p-Chk1) and a decrease in Cdk2 activity, suggesting that Plk1 inhibits the intra-S phase checkpoint. Performing DNA combing, I demonstrated that Plk1 depletion leads to a decrease in the level of origin firing. Analysis of the combing data with the developed numerical model suggested that during unchallenged S phase Plk1 downregulates the global origin-firing inhibitory action of Chk1, consistent with the experimental observation of increased levels of p-Chk1 in Plk1-depleted Xenopus egg extract. However, Plk1 does not seem to act in the direct vicinity of replication forks, as was proposed earlier. Finally, by considering the replication process as a one-dimensional nucleation-and-growth process and using statistical methods, I developed a new quantitative approach to study the regulation of the replication program. This approach links the similarity between single-molecule replication patterns to the processes regulating DNA replication. By analyzing DNA combing data, I showed that the DNA replication program in Xenopus early embryos is governed by two spatially and temporally exclusive processes: one with a low frequency of origin firing and a high apparent fork speed, and a second, controlled by Plk1, with a high frequency of origin firing and a low apparent fork speed. Altogether, my results demonstrate that Plk1 positively regulates replication origin firing during normal S phase by downregulating the replication checkpoint. The numerical model predicts the existence of replication timing domains in the Xenopus model system. Future work will show whether Plk1 regulates the replication program at the level of genomic domains.
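
A minimal sketch of the one-dimensional nucleation-and-growth picture invoked above (a generic KJMA-style simulation with an assumed firing rate and fork speed, not the fitted model from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

def replicate(length=10_000, firing_rate=1e-4, fork_speed=5, dt=1.0):
    """1D nucleation and growth: origins fire stochastically on
    unreplicated DNA; replicated regions grow from both ends."""
    replicated = np.zeros(length, dtype=bool)
    steps = 0
    while not replicated.all():
        # Stochastic origin firing, probability firing_rate*dt per site.
        replicated |= rng.random(length) < firing_rate * dt
        # Bidirectional fork progression: dilate replicated regions
        # by fork_speed sites on each side per time step.
        for _ in range(int(fork_speed * dt)):
            grown = replicated.copy()
            grown[1:] |= replicated[:-1]
            grown[:-1] |= replicated[1:]
            replicated = grown
        steps += 1
    return steps

print("S-phase duration (steps):", replicate())
```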
14

L'Hôte, David. "Exploitation d’un modèle de souris interspécifiques, recombinantes et congéniques pour la cartographie de QTL de la fertilité mâle et pour l’étude de la régulation génique testiculaire dans le contexte d’un génome mosaïque." Limoges, 2009. https://aurore.unilim.fr/theses/nxfile/default/c71350b1-b2e7-41de-aa2f-64f46883036c/blobholder:0/2009LIMO4014.pdf.

Abstract:
In order to map new QTLs regulating male fertility parameters, we analysed a set of interspecific recombinant congenic strains (IRCS). We mapped 8 QTLs implicated in testis development, prostate growth, and sperm vitality and morphology. We performed a fine-mapping analysis of a QTL for reduced testis weight associated with teratozoospermia, localised on MMU11, and proposed a candidate locus encompassing four testis-expressed genes. Moreover, in order to understand gene expression regulation in an interspecific mosaic genome, we analysed the testis transcriptome of three IRC strains compared with the parental strains' testis transcriptomes. In this study, we describe how spretus genes are regulated when introgressed into a musculus background. This study gives some insight concerning gene-flow tolerance across the species barrier during the emergence of a mosaic genome.
15

Amin, Ali Rada. "BCR de classe IgA : signalisation de la cellule B normale et dans un contexte de lymphoprolifération." Limoges, 2009. http://www.theses.fr/2009LIMO4069.

16

Youle, Ian. "Quantitative tritium imaging." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0015/NQ45641.pdf.

17

Elder, A. D. "Quantitative fluorescence microscopy." Thesis, University of Cambridge, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598801.

Abstract:
The work presented here improves the level of quantification achievable with fluorescence microscopy by integrating novel technologies and developing new experimental and theoretical methodologies. Initial work focused on the use of fluorescence microscopy for the quantification of molecular interactions in living cells. This resulted in the development of an analysis routine for the quantification of Förster resonance energy transfer (FRET) by intensity-based sensitised acceptor emission measurements. The developed technique enabled quantification of the strength of interaction as well as the relative stoichiometry of free and bound fluorophores. The work culminated in the dynamic measurement of the cyclin – cyclin-dependent kinase interaction through the course of the cell cycle. To improve the flexibility of microscopy techniques, a confocal microscopy system was designed and built which used a novel fibre-based supercontinuum illumination technique and a prism-based spectrometer to provide wavelength-resolved measurements. The multiparametric imaging approach which this system enabled was shown to aid in the quantification of complex systems. The remainder of this thesis considers the development of new frequency-domain fluorescence lifetime imaging microscopy (FD-FLIM) techniques. The advantages of lifetime imaging techniques were illustrated through their application to quantitative chemical analysis in microfluidic devices. Novel illumination technology was integrated into FD-FLIM systems, both in the form of inexpensive light-emitting diodes and fibre-based supercontinuum technology. An in-depth theoretical analysis permitted the development of systems with much improved photon economy. Using extensions of the AB analysis technique, multicomponent lifetime data could be accurately quantified. Finally, a new experimental technique was implemented, termed ø2FLIM, which enabled the rapid acquisition of alias-free fluorescence lifetime data.
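
For context, the textbook relations on which such FRET quantification rests (standard definitions, not formulas quoted from the thesis) are

```latex
E = \frac{R_0^6}{R_0^6 + r^6},
\qquad
E = 1 - \frac{\tau_{DA}}{\tau_D},
```

where R_0 is the Förster radius, r the donor-acceptor distance, and τ_DA, τ_D the donor lifetimes in the presence and absence of the acceptor; sensitised-emission methods estimate E from intensity images instead of lifetimes.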
18

梁永雄 and Wing-hung Leung. "Quantitative coronary arteriography." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1991. http://hub.hku.hk/bib/B31981483.

19

Waszkiewicz, Pawel. "Quantitative continuous domains." Thesis, University of Birmingham, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.269779.

20

Schlachter, Simon Christopher. "Quantitative multidimensional microscopy." Thesis, University of Cambridge, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.609221.

21

Martins, Carlos Jose Amaro Parente. "Quantitative string evolution." Thesis, University of Cambridge, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.627371.

22

Sharma, Arvind Kumar. "Quantitative Stratigraphic Inversion." Diss., Virginia Tech, 2006. http://hdl.handle.net/10919/30172.

Abstract:
We develop a methodology for the systematic inversion of quantitative stratigraphic models. Quantitative stratigraphic modeling predicts stratigraphy using numerical simulations of geologic processes. Stratigraphic inversion methodically searches the parameter space in order to detect the models which best represent the observed stratigraphy. Model parameters include sea-level change, tectonic subsidence, sediment input rate, and transport coefficients. We successfully performed a fully automated process-based stratigraphic inversion of a geologically complex synthetic model. Several one- and two-parameter inversions were used to investigate the coupling of process parameters. Source location and the transport coefficient below base level showed significant coupling, while the rest of the parameters showed only minimal coupling. The influence of different observable data on the inversion was also tested. Inversion results using a misfit based on sparse but time-dependent sample points proved to be better than those using a misfit based on the final stratigraphy only, even when sampled densely. We tested several inversion schemes on the topography dataset obtained from the eXperimental EarthScape facility simulation. The clustering of model parameters in most of the inversion experiments showed the likelihood of obtaining a reasonable number of compatible models. We also observed the need for several different diffusion-coefficient parameterizations to emulate different erosional and depositional processes. The excellent results of the piecewise inversion, which used different parameterizations for different time intervals, demonstrate the need for the development or incorporation of time-variant parameterizations of the diffusion coefficients. We also present new methods for applying boundary conditions in simulations of diffusion processes using the finite-difference method. They are based on the straightforward idea that solutions at the boundaries are smooth. The new scheme achieves high accuracy when the initial conditions are non-vanishing at the boundaries, a case which is poorly handled by previous methods. Besides being easy to implement, the new method does not require any additional computation or memory.
Ph. D.
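
As a toy illustration of the inversion loop described above (not the dissertation's code), here is a minimal sketch that recovers a single transport coefficient of a 1D diffusive forward model by grid search over a misfit; the forward model, the misfit and all parameter values are illustrative assumptions:

```python
import numpy as np

def forward_model(diffusion_coeff, topo, steps=200, dt=0.1):
    """Toy 1D diffusive sediment-transport model, explicit scheme
    with unit grid spacing (stable while diffusion_coeff*dt <= 0.5)."""
    h = topo.copy()
    for _ in range(steps):
        h[1:-1] += diffusion_coeff * dt * (h[2:] - 2 * h[1:-1] + h[:-2])
    return h

# "Observed" stratigraphy generated with a known coefficient.
x = np.linspace(0.0, 1.0, 101)
initial = np.exp(-((x - 0.5) / 0.1) ** 2)
observed = forward_model(0.3, initial)

# Grid-search inversion: keep the coefficient minimizing the misfit.
candidates = np.linspace(0.05, 0.6, 56)
misfits = [np.sum((forward_model(c, initial) - observed) ** 2)
           for c in candidates]
print("recovered coefficient:", candidates[int(np.argmin(misfits))])
```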
23

Peace, Richard Aidan. "Quantitative cardiac SPECT." Thesis, University of Aberdeen, 2001. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU602292.

Abstract:
Myocardial perfusion SPECT imaging is a sensitive and specific indicator of coronary artery disease (Fleischman et al. 1998). The clinical value of coronary scintigraphy is now established, with a utilisation rate of eight procedures per 1000 population per year in the USA and two per 1000 in the EU (Pennell et al. 1998). While myocardial perfusion SPECT images are routinely interpreted by expert observers, the classification is inevitably subject to inter-observer and intra-observer variability. An optimised and validated quantitative index of the presence or absence of coronary artery disease (CAD) could improve reproducibility, accuracy and diagnostic confidence. There are segmental techniques to automatically detect CAD from myocardial perfusion SPECT studies, such as the CEqual quantitative analysis software (Van Train et al. 1994). However, they have not been shown to be significantly better than expert observers (Berman et al. 1998). The overall aim of this thesis was to develop, optimise and evaluate quantitative techniques for the detection of CAD in myocardial perfusion SPECT studies. This task was divided into three areas: quantification of transient ischaemic dilation (TID); quantitative detection and localisation of CAD; and count normalisation of patient studies. Transient ischaemic dilation is the transient dilation of the left ventricle on immediate post-stress images compared to resting technetium-99m imaging. Stolzenberg (1980) first noted TID as a specific marker for severe CAD. There are few published fully quantitative evaluations of TID. The first aim of this thesis was to compare the performance of methods for quantifying TID in myocardial perfusion SPECT. The second aim was to investigate the use of image registration in myocardial perfusion SPECT for quantitative detection and localisation of CAD. This thesis describes two studies comparing six count normalisation techniques: normalise to the maximum value; to the mean voxel value; to the mean of the top 10% or 20% of counts; minimise the sum of squares between studies; or minimise the sum of absolute differences. Ten normal myocardial perfusion SPECT studies, each with 300 different simulated perfusion defects, were count normalised to the original studies. The fractional count normalisation error was consistently lower when the sum of absolute differences was minimised. However, a more clinically applicable measure of count normalisation performance is the effect on quantitative CAD detection. The Z-score method of automatic detection of CAD was repeated using each count normalisation technique. There was no statistically significant difference between the methods, although the power of the ROC analysis was poor due to low patient numbers. The balance of evidence suggested that count normalisation by minimisation of the sum of absolute differences produced the best performance.
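
A minimal sketch of two of the six count-normalisation schemes compared above, scaling study b onto study a; the flattened arrays and the bounded search interval are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def normalise_to_mean(a, b):
    """Scale study b so its mean voxel value matches study a."""
    return b * (a.mean() / b.mean())

def normalise_min_abs_diff(a, b):
    """Scale study b to minimise the sum of absolute differences to a."""
    res = minimize_scalar(lambda s: np.abs(a - s * b).sum(),
                          bounds=(0.1, 10.0), method="bounded")
    return b * res.x
```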
APA, Harvard, Vancouver, ISO, and other styles
24

Grjasnow, Alexej. "Teilkohärente quantitative Phasenkontrastmikroskopie." Berlin mbv, 2009. http://d-nb.info/99459576X/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Tamski, Mika. "Quantitative electrochemical EPR." Thesis, University of Warwick, 2015. http://wrap.warwick.ac.uk/79963/.

Full text
Abstract:
Electron paramagnetic resonance (EPR) is a spectroscopic technique sensitive to unpaired electrons present in paramagnetic species such as free radicals and organometallic complexes. Electrochemistry (EC) is an interfacial science, where reduction and oxidation processes are studied. A single-electron reduction or oxidation generates a paramagnetic species with an unpaired electron, making EPR a valuable tool in the study of electrochemical systems. In this work a novel electrochemical cell was designed and developed to be used with a specific type of EPR resonator, called a loop gap resonator (LGR). After the EC-EPR setup was built and its performance characterised, it was adapted for quantitative measurements in electrochemical EPR (QEC-EPR). Thus, for the first time, the technique of EC-EPR has been fully characterised for analytical work, opening possibilities to study electrode reactions quantitatively with an accuracy and precision not obtained before, as demonstrated in Chapter 8 of this thesis.
APA, Harvard, Vancouver, ISO, and other styles
26

Bordas, Alexandre. "Homogénéisation stochastique quantitative." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEN053/document.

Full text
Abstract:
This thesis deals with quantitative stochastic homogenization of parabolic partial differential equations and of discrete elliptic problems. In the introduction, we show how such problems arise from random models even when the coefficients are deterministic, and then introduce homogenization: what happens when the coefficients themselves are random? Can an environment with microscopic random heterogeneities be considered to behave, at large scale, like a fictitious homogeneous deterministic environment? We then give an interpretation of this question in terms of random walks among random conductances, and sketch the tools used in the proofs of the two following chapters. In Chapter II, we prove a quantitative homogenization result for parabolic PDEs, such as the heat equation, in environments with time- and space-dependent random coefficients. The method consists in viewing solutions of such problems as minimizers of suitably defined functionals, and then exploiting the crucial subadditivity property of these quantities to deduce convergence and a concentration result, from which a rate of convergence of the actual solutions to the solution of the homogenized problem follows. In Chapter III, we adapt these methods to a discrete elliptic problem on the lattice Z^d.
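For orientation, the statement being quantified can be sketched as follows; this is a schematic form with the standard parabolic scaling, not the precise theorem of the thesis:

\[
\partial_t u^{\varepsilon} - \nabla\cdot\big(\mathbf{a}(x/\varepsilon,\,t/\varepsilon^{2})\,\nabla u^{\varepsilon}\big)=0
\quad\longrightarrow\quad
\partial_t u - \nabla\cdot\big(\bar{\mathbf{a}}\,\nabla u\big)=0
\quad (\varepsilon\to 0),
\]

where \(\bar{\mathbf{a}}\) is a constant, deterministic effective matrix. A quantitative result additionally provides a rate, for instance \(\|u^{\varepsilon}-u\|\le C\,\varepsilon^{\alpha}\) with high probability for some exponent \(\alpha>0\).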
APA, Harvard, Vancouver, ISO, and other styles
27

Chew, Serena Janine. "Comparison of quantitative precipitation forecast, a precipitation-based quantitative precipitation estimate and a radar-derived quantitative precipitation estimate." abstract and full text PDF (free order & download UNR users only), 2006. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1432997.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Siepmann, Martin, and Wilhelm Kirch. "Effects of Caffeine on Topographic Quantitative EEG." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-134674.

Full text
Abstract:
Despite the widespread use of caffeine as a central nervous stimulant, the central pharmacodynamic properties of the drug have not yet been conclusively evaluated in humans. The present study was undertaken to assess the acute effects of caffeine on measures of topographical quantitative electroencephalogram (EEG) in normal subjects. Ten healthy male volunteers (mean age ± SD 25 ± 4 years) received placebo and 200 mg of caffeine (the amount in about two cups of coffee) as a powder taken orally with water, under randomized, double-blind crossover conditions on two different occasions. Before administration and 30 min afterwards, a 17-channel quantitative EEG was recorded during relaxation with eyes open and closed (15 min each). Caffeine caused a significant reduction of total EEG power at fronto-parieto-occipital and central electrode positions of both hemispheres when the subjects kept their eyes open. Absolute power of the slow and fast alpha and slow beta activities was diminished in various regions of the brain (p < 0.05). The effect was more pronounced with the subjects keeping their eyes open than with eyes closed. It can be concluded that quantitative EEG is a sensitive method to assess the effects of psychostimulants on the human brain. Therefore, in pharmaco-EEG studies, environmental factors such as caffeine have to be excluded.
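Absolute band power of the kind reported here is typically obtained by integrating a power spectral density over a frequency band; a minimal sketch, with illustrative band limits and simulated data:

import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Absolute power of one EEG channel in a frequency band,
    via Welch's periodogram with 4 s segments."""
    f, pxx = welch(eeg, fs=fs, nperseg=int(4 * fs))
    lo, hi = band
    mask = (f >= lo) & (f <= hi)
    return np.trapz(pxx[mask], f[mask])

# Example: slow-alpha power of a simulated 60 s recording at 256 Hz.
fs = 256
t = np.arange(0, 60, 1 / fs)
eeg = np.sin(2 * np.pi * 9 * t) + 0.5 * np.random.randn(t.size)
print(band_power(eeg, fs, band=(7.5, 10.0)))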
This article is freely accessible with the consent of the rights holder under a (DFG-funded) Alliance or National Licence
APA, Harvard, Vancouver, ISO, and other styles
29

Hinoda, Takuya. "Quantitative assessment of gadolinium deposition in dentate nucleus using quantitative susceptibility mapping." Kyoto University, 2018. http://hdl.handle.net/2433/232091.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Przybilla, Norbert. "QUANTITATIVE SPECTROSCOPY OF SUPERGIANTS." Diss., lmu, 2002. http://nbn-resolving.de/urn:nbn:de:bvb:19-820.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Uebleis, Christopher. "Die quantitative "real-time"." Diss., lmu, 2007. http://nbn-resolving.de/urn:nbn:de:bvb:19-77470.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Krenning, Boudewijn Juriaan. "Quantitative Three-dimensional Echocardiography." [S.l.] : Rotterdam : [The Author] ; Erasmus University [Host], 2007. http://hdl.handle.net/1765/10695.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Brinca, Pedro Soares. "Essays in Quantitative Macroeconomics." Doctoral thesis, Stockholms universitet, Nationalekonomiska institutionen, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-92861.

Full text
Abstract:
In the first essay, Distortions in the Neoclassical Growth Model: A Cross Country Analysis, I show that shocks that express themselves as total factor productivity and labor income taxes are comparably more synchronized than shocks that resemble distortions to the ability of allocating resources across time and states of the world. These two shocks are also the most important to model. Lastly, I document the importance of international channels of transmission for the shocks, given that these are spatially correlated and that international trade variables, such as trade openness, correlate particularly well with them. The second essay is called Monetary Business Cycle Accounting for Sweden. Given that the analysis is focused on one country, I can extend the prototype economy to include a nominal interest rate setting rule and government bonds. As in the previous essay, distortions to the labor-leisure condition and total factor productivity are the most relevant margins to be modeled, now joined by deviations from the nominal interest rate setting rule. Also, distortions do not share a structural break during the Great Recession, but they do during the 1990s. Researchers aiming to model Swedish business cycles must take into account the structural changes the Swedish economy went through in the 1990s, though not so during the last recession. In the third essay, Consumer Confidence and Consumption Spending: Evidence for the United States and the Euro Area, we show that the consumer confidence index can, in certain circumstances, be a good predictor of consumption. In particular, out-of-sample evidence shows that the contribution of confidence in explaining consumption expenditures increases when household survey indicators feature large changes, so that confidence indicators can have some increasing predictive power during such episodes. Moreover, there is some evidence of a confidence channel in the international transmission of shocks, as U.S. confidence indices help predict consumer sentiment in the euro area.
APA, Harvard, Vancouver, ISO, and other styles
34

Datta, Neil Anirvan Sagomisa. "A quantitative combinatory logic." Thesis, Imperial College London, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.502442.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Enright, S. A. "Towards quantitative computed tomography." Thesis, University of Canterbury. Electrical and Electronic Engineering, 1992. http://hdl.handle.net/10092/6886.

Full text
Abstract:
Computed tomography is introduced along with an overview of its diverse applications in many scientific endeavours. A unified approach for the treatment of scattering from linear scalar wave motion is introduced. The assumptions under which wave motion within a medium can be characterised by concourses of rays are presented, along with comment on the validity of these assumptions. Early and conventional theory applied for modelling the behaviour of rays, within media for which ray assumptions are valid, is reviewed. A new computerised method is described for reconstruction of a refractive index distribution from time-of-flight measurements of radiation/waves passing through the distribution and taken on a known boundary surrounding it. The reconstruction method, aimed at solving the bent-ray computed tomography (CT) problem, is based on a novel ray description which does not require the ray paths to be known. This allows the refractive index to be found by iterative solution of a set of linear equations, rather than through the computationally intensive procedure of ray tracing, which normally accompanies iterative solutions to problems of this type. The preliminary results show that this method is capable of handling appreciable spatial refractive index variations in large bodies. A review containing theory and techniques for image reconstruction from projections is presented, along with their historical development. The mathematical derivation of a recently developed reconstruction technique, the method of linograms, is considered. An idea, termed the plethora of views idea, which aims to improve quantitative CT image reconstruction, is introduced. Its theoretical foundation is the idea that, when presented with a plethora of projections (that is, more than are required to reconstruct the known region of support of an image), the permissible reconstruction region can be extended, and the intensity of the reconstructed distribution should then be negligible throughout the extended region. Any reconstruction within the extended region that departs from what would be termed negligible is deduced to have been caused by imperfections of the projections. The implicit expectation of the novel schemes presented for improving CT image reconstruction is that contributions within the extended region can be utilised to ameliorate the effects of the imperfections on the reconstruction where the distribution is known to be contained. Preliminary experimental results are reported for an iterative algorithm proposed to correct a plethora of X-ray CT projection data containing imperfections. An extended definition is presented for the consistency of projections, termed spatial consistency, that incorporates the region with which the projection data is consistent. Using this definition and an associated definition, spatial inconsistency, an original technique is proposed and reported on for the recovery of inconsistencies that are contained in the projection data over a narrow range of angles.
APA, Harvard, Vancouver, ISO, and other styles
36

Williams, Geoffrey Alan. "Studies in quantitative macroeconomics." Thesis, University of East Anglia, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.267257.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Bradley, Michael Ian. "Quantitative bioprocess containment validation." Thesis, University College London (University of London), 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.395529.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Heusser, Jonathan. "Automating quantitative information flow." Thesis, Queen Mary, University of London, 2011. http://qmro.qmul.ac.uk/xmlui/handle/123456789/1260.

Full text
Abstract:
Unprecedented quantities of personal and business data are collected, stored, shared, and processed by countless institutions all over the world. Prominent examples include sharing personal data on social networking sites, storing credit card details in every store, tracking customer preferences of supermarket chains, and storing key personal data on biometric passports. Confidentiality issues naturally arise from this global data growth. There are continual reports of how private data is leaked from confidential sources, where the implications of the leaks range from embarrassment to serious personal privacy and business damages. This dissertation addresses the problem of automatically quantifying the amount of leaked information in programs. It presents multiple program analysis techniques of different degrees of automation and scalability. The contributions of this thesis are twofold: a theoretical result and two different methods for inferring and checking quantitative information flows are presented. The theoretical result relates the amount of possible leakage under any probability distribution back to the order relation in Landauer and Redmond’s lattice of partitions [35]. The practical results are split into two analyses: a first analysis precisely infers the information leakage using SAT solving and model counting; a second analysis defines quantitative policies which are reduced to checking a k-safety problem. A novel feature allows reasoning independent of the secret space. The presented tools are applied to real, existing leakage vulnerabilities in operating system code. This has to be understood and weighted within the context of the information flow literature, which suffers from an apparent lack of practical examples and applications. This thesis studies such “real leaks” which could influence future strategies for finding information leaks.
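A brute-force sketch of the quantity being computed; the thesis infers it symbolically with SAT solving and model counting, whereas here the secret space is simply enumerated (program and numbers are illustrative):

import math
from collections import Counter

def leakage_bits(program, secrets):
    """Leakage of a deterministic program over a uniformly distributed
    secret space: Shannon leakage is the entropy of the partition the
    program induces on the secrets; min-entropy leakage is log2 of the
    number of partition blocks."""
    blocks = Counter(program(s) for s in secrets)
    n = len(secrets)
    shannon = -sum((c / n) * math.log2(c / n) for c in blocks.values())
    min_entropy = math.log2(len(blocks))
    return shannon, min_entropy

# A password check leaks little; observing a residue leaks much more.
print(leakage_bits(lambda s: s == 1234, range(10000)))   # ~0.0015 and 1 bit
print(leakage_bits(lambda s: s % 64, range(10000)))      # ~6 bits by both measures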
APA, Harvard, Vancouver, ISO, and other styles
39

Yang, Y. "Essays in quantitative investments." Thesis, University of Liverpool, 2018. http://livrepository.liverpool.ac.uk/3021457/.

Full text
Abstract:
This thesis studies the characteristics of Chinese futures markets and quantitative investment strategies. The main objective of this thesis is to provide a comprehensive analysis of the performance of quantitative investment strategies in the Chinese market. Furthermore, with an econometric analysis, the stylised facts of the Chinese futures markets are documented. Extensive backtesting results on the performance of momentum, reversal and pairs-trading-type strategies are provided. In the case of pairs-trading-type strategies, the risk and return relationship is characterised by the length of the maximum holding periods, and thus reflected in the maximum drawdown risk. In line with the increasing holding periods, the profitability of pairs trading increases over longer holding periods. Therefore, the abnormal returns from pairs trading in the Chinese futures market do not necessarily reflect market inefficiency. Momentum and reversal strategies are compared by employing both high- and low-frequency time series with precise estimation of transaction costs. The comparison of momentum and reversal investment strategies at the intra- and inter-day scales shows that the portfolio rebalancing frequency significantly impacts the profitability of such strategies. Complementarily, the excess returns of inter-day momentum trading with the inclusion of precise estimates of transaction costs reflect that quantitative investment strategies consistently produce abnormal profits in the Chinese commodity futures markets. However, from a risk-adjusted view, the returns are obtained only by bearing additional drawdown risks. Finally, this thesis suggests that investors should choose quantitative trading strategies according to their investment horizon, tolerance for maximum drawdown and portfolio rebalancing costs.
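A minimal sketch of the pairs-trading mechanics described above, with the maximum holding period as an explicit parameter; the thresholds, window and variable names are illustrative assumptions, not the thesis's specification:

import numpy as np
import pandas as pd

def pairs_backtest(a, b, window=60, entry=2.0, exit_=0.5, max_hold=20):
    """Trade the z-score of the log-price spread of two aligned price
    series (pd.Series a, b); positions are forced flat after max_hold
    bars, the lever that trades profitability against drawdown risk."""
    spread = np.log(a) - np.log(b)
    z = (spread - spread.rolling(window).mean()) / spread.rolling(window).std()
    pos, held, positions = 0, 0, []
    for zt in z:
        if pos == 0 and abs(zt) > entry:
            pos, held = (-1 if zt > 0 else 1), 0     # short the rich leg
        elif pos != 0:
            held += 1
            if abs(zt) < exit_ or held >= max_hold:  # converged or timed out
                pos, held = 0, 0
        positions.append(pos)
    pnl = pd.Series(positions, index=spread.index).shift(1) * spread.diff()
    return pnl.fillna(0.0)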
APA, Harvard, Vancouver, ISO, and other styles
40

Louth, Richard James. "Essays in quantitative analytics." Thesis, University of Cambridge, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608849.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Rushworth, Philip John. "Quantitative asymmetric reaction kinetics." Thesis, University of Warwick, 2011. http://wrap.warwick.ac.uk/45827/.

Full text
Abstract:
The comparison of catalysts for producing chiral materials is of vital importance in the improvement of reaction scope and efficacy. Here we describe a new method of analysing the kinetics of the stereodetermining steps in asymmetric reactions by performing ligand/catalyst competition experiments against an internal standard and measuring the enantiomeric excess obtained at a variety of ratios of ligand/catalyst to internal standard. From these enantiomeric excess measurements, we can establish the relative rate of reaction between the ligand/catalyst systems and the internal standard, allowing us to make indirect comparisons of the rates at which the ligands/catalysts perform the reaction. We apply this method to three common synthetic procedures: the Sharpless asymmetric dihydroxylation, the asymmetric Michael addition of malonates to nitroalkenes and the palladium-catalysed asymmetric allylation reaction.
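One plausible reading of the arithmetic behind such competition experiments (the thesis's exact kinetic model may differ): if a fraction f of the product forms via the test ligand L, giving intrinsic enantiomeric excess e_L, and the remainder via the internal standard S with excess e_S, the observed excess is the product-weighted average

\[
ee_{\mathrm{obs}} = f\,e_{L} + (1-f)\,e_{S},
\qquad
\frac{f}{1-f} = k_{\mathrm{rel}}\,\frac{[L]}{[S]},
\]

so measuring ee_obs over a range of [L]/[S] ratios determines the relative rate constant k_rel, and hence an indirect rate comparison between ligands without measuring either rate directly.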
APA, Harvard, Vancouver, ISO, and other styles
42

Fredriksson, Ingemar. "Quantitative Laser Doppler Flowmetry." Doctoral thesis, Linköpings universitet, Biomedicinsk instrumentteknik, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-19947.

Full text
Abstract:
Laser Doppler flowmetry (LDF) is virtually the only non-invasive technique, except for other laser-speckle-based techniques, that enables estimation of the microcirculatory blood flow. The technique was introduced into the field of biomedical engineering in the 1970s, and rapid evolution followed during the 1980s with fiber-based systems and improved signal analysis. The first imaging systems were presented at the beginning of the 1990s. Conventional LDF, although unique in many aspects and elegant as a method, is accompanied by a number of limitations that may have reduced the clinical impact of the technique. The analysis model published by Bonner and Nossal in 1981, which is the basis for conventional LDF, is limited to measurements in arbitrary, relative units, an unknown and non-constant measurement volume, non-linearities at increased blood tissue fractions, and a relative average velocity estimate. In this thesis a new LDF analysis method, quantitative LDF, is presented. The method is based on recent models for light-tissue interaction, comprising the current knowledge of tissue structure and optical properties, making it fundamentally different from the Bonner and Nossal model. Furthermore and most importantly, the method eliminates or highly reduces the limitations mentioned above. Central to quantitative LDF are Monte Carlo (MC) simulations of light transport in tissue models, including multiple Doppler shifts by red blood cells (RBC). MC was used in the first proof-of-concept study where the principles of the quantitative LDF were tested using plastic flow phantoms. An optically and physiologically relevant skin model suitable for MC was then developed. MC simulations of that model as well as of homogeneous tissue relevant models were used to evaluate the measurement depth and volume of conventional LDF systems. Moreover, a variance reduction technique enabling the reduction of simulation times by orders of magnitude for imaging-based MC setups was presented. The principle of the quantitative LDF method is to solve the reverse engineering problem of matching measured and calculated Doppler power spectra at two different source-detector separations. The forward problem of calculating the Doppler power spectra from a model is solved by mixing optical Doppler spectra, based on the scattering phase functions and the velocity distribution of the RBC, from various layers in the model and for various amounts of Doppler shifts. The Doppler shift distribution is calculated based on the scattering coefficient of the RBCs and the path length distribution of the photons in the model, where the latter is given from a few basal MC simulations. When a proper spectral matching is found, via iterative model parameter updates, the absolute measurement data are given directly from the model. The concentration is given in g RBC/100 g tissue, velocities in mm/s, and perfusion in g RBC/100 g tissue × mm/s. The RBC perfusion is separated into three velocity regions: below 1 mm/s, between 1 and 10 mm/s, and above 10 mm/s. Furthermore, the measures are given for a constant output volume of a 3 mm³ half sphere, i.e. within 1.13 mm from the light emitting fiber of the measurement probe. The quantitative LDF method was used in a study on microcirculatory changes in type 2 diabetes.
It was concluded that the perfusion response to a local increase in skin temperature, a response that is reduced in diabetes, is a process involving only intermediate and high flow velocities and thus relatively large vessels in the microcirculation. The increased flow at higher velocities was expected, but could not previously be demonstrated with conventional LDF. The lack of increase in low-velocity flow indicates a normal metabolic demand during heating. Furthermore, a correlation between the perfusion at low and intermediate flow velocities and diabetes duration was found. Interestingly, these correlations were opposite in sign (negative for the low-velocity region and positive for the intermediate-velocity region). This finding is well in line with the increased shunt flow and reduced nutritive capillary flow that has previously been observed in diabetes.
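An illustrative sketch of the reverse-engineering step described above, with an invented two-parameter stand-in for the MC-derived forward model (the real model mixes per-layer optical Doppler spectra and shift orders; every name and number below is an assumption):

import numpy as np
from scipy.optimize import least_squares

def model_spectra(params, freqs):
    """Toy forward model: predicted Doppler power spectra at two
    source-detector separations for a given RBC concentration and
    mean velocity."""
    conc, vel = params
    near = conc * np.exp(-freqs / (50.0 * vel))
    far = 2.5 * conc * np.exp(-freqs / (80.0 * vel))   # more shifts far away
    return near, far

def fit_parameters(measured_near, measured_far, freqs):
    """Iterate model parameters until computed and measured spectra match
    at both separations, then read absolute values off the model."""
    def residuals(p):
        near, far = model_spectra(p, freqs)
        return np.concatenate([near - measured_near, far - measured_far])
    fit = least_squares(residuals, x0=[1.0, 1.0],
                        bounds=([1e-6, 1e-6], [np.inf, np.inf]))
    return fit.x

freqs = np.linspace(10, 20000, 200)
m_near, m_far = model_spectra([0.8, 2.0], freqs)   # synthetic "measurement"
print(fit_parameters(m_near, m_far, freqs))        # recovers [0.8, 2.0]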
APA, Harvard, Vancouver, ISO, and other styles
43

Graczyk, Alicja. "Development of quantitative dSTORM." Thesis, Heriot-Watt University, 2017. http://hdl.handle.net/10399/3334.

Full text
Abstract:
Direct stochastic optical reconstruction microscopy (dSTORM) is a single-molecule imaging technique which involves tagging molecular targets with fluorescently labelled antibodies. In this method, only a subset of fluorophores emit photons at the same time, while the majority of fluorescent tags are pushed into an optically inactive state. This powerful technique, where a resolution of 20 nm can be achieved, suffers from two major drawbacks which prevent quantitative analysis. The first problem lies with the labelling of proteins of interest, where a single protein is typically labelled by multiple secondary antibodies tagged with a variable number of fluorophores. To count the number of proteins, exactly one fluorophore per protein of interest must be ensured. To solve this problem, I aimed to develop a novel linker molecule which, together with Fab’, an antigen-binding fragment, would produce a detection agent for 1:1 fluorophore-to-protein labelling. An alternative approach was also employed, in which an anti-EGFP nanobody was homogeneously mono-labelled with Alexa Fluor 647. Binding to EGFP was analysed both qualitatively and quantitatively and an excellent nanomolar affinity was demonstrated. The degree-of-labelling investigation revealed a 1:1 nanobody-to-fluorophore ratio. The analysis of the nanobody was also performed using dSTORM, both on glass and in cells. The mono-labelled nanobody produced significantly fewer localisations per single target as compared to the commercially available F(ab’)2 fragment and showed excellent colocalisation with EGFP in EGFP-SNAP-25 and EGFP-Lifeact transfected cells. The second problem in dSTORM is connected with the photophysical process itself: the same fluorophore can enter light and dark cycles multiple times, so it is impossible to establish whether closely neighbouring signals originate from one or multiple sources. A polarisation-based method was developed allowing measurement of the polarisation of each fluorophore’s dipole. My strategy involved a change in the microscope pathway, employing a polarisation splitter to separate light coming from each fluorophore into two components with orthogonal polarisations. Finally, the single labelling was combined with the polarisation experiments to achieve quantitative dSTORM, where neighbouring signals could be assigned to the same or different targets, based on the polarisation value of each signal.
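A minimal sketch of the polarisation measurement underlying the second contribution, assuming background-corrected intensities of the two orthogonally polarised components of each localisation (names are illustrative):

import numpy as np

def dipole_polarisation(i_par, i_perp):
    """Polarisation of a fluorophore from the two orthogonally polarised
    image components produced by a polarisation splitter. Repeated
    localisations sharing the same value likely share one source."""
    i_par = np.asarray(i_par, dtype=float)
    i_perp = np.asarray(i_perp, dtype=float)
    return (i_par - i_perp) / (i_par + i_perp)

print(dipole_polarisation([900, 120], [100, 880]))   # -> [ 0.8  -0.76]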
APA, Harvard, Vancouver, ISO, and other styles
44

Von, Essen Christian. "Quantitative Verification and Synthesis." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM090/document.

Full text
Abstract:
This thesis contributes to the theoretical study and application of quantitative verification and synthesis. We first study strategies that optimize the ratio of two rewards in MDPs, the goal being the synthesis of efficient controllers in probabilistic environments. We prove that deterministic and memoryless strategies are sufficient. Based on these results we suggest three algorithms to treat explicitly encoded models. Our evaluation of these algorithms shows that one of them is clearly faster than the others. To extend its scope, we propose and implement a symbolic variant based on binary decision diagrams, and show that it copes with millions of states. Second, we study the problem of program repair from a quantitative perspective. This leads to a reformulation of program repair with the requirement that only faulty runs of the program be changed. We study the limitations of this approach and show how the new requirement can be relaxed. We devise and implement an algorithm to automatically find repairs, and show that it improves the changes made to programs. Third, we study a novel approach to a quantitative verification and synthesis framework, in which verification and synthesis work in tandem to analyze the quality of a controller with respect to, e.g., robustness against modeling errors. We also include the possibility of approximating the Pareto curve that emerges from combining the model with multiple rewards, which allows us both to study the trade-offs inherent in the system and to choose a configuration to our liking. We apply our framework to several case studies, the major one being the currently proposed next-generation airborne collision avoidance system (ACAS X). We use our framework to help analyze the design space of the system and to validate the controller as currently under investigation by the FAA; in particular, we contribute analysis via PCTL and stochastic model checking to add to the confidence in the controller.
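A minimal illustration of the ratio objective, assumed here rather than taken from the thesis's algorithms: once a memoryless strategy is fixed, the MDP induces a Markov chain, and the long-run ratio of the two rewards can be read off the stationary distribution (assuming an ergodic chain and a positive denominator reward):

import numpy as np

def long_run_ratio(P, r1, r2):
    """Long-run ratio of two state rewards on an ergodic Markov chain:
    expected r1 over expected r2 under the stationary distribution,
    obtained by solving pi P = pi, sum(pi) = 1 as a linear system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
    b = np.concatenate([np.zeros(n), [1.0]])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    return (pi @ r1) / (pi @ r2)

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(long_run_ratio(P, r1=np.array([1.0, 3.0]),
                        r2=np.array([1.0, 1.0])))   # 5/3 ~ 1.667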
APA, Harvard, Vancouver, ISO, and other styles
45

Yum, Minchul. "Essays in Quantitative Macroeconomics." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1429444230.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Kattenbelt, Mark Alex. "Automated quantitative software verification." Thesis, University of Oxford, 2010. http://ora.ox.ac.uk/objects/uuid:62430df4-7fdf-4c4f-b3cd-97ba8912c9f5.

Full text
Abstract:
Many software systems exhibit probabilistic behaviour, either added explicitly, to improve performance or to break symmetry, or implicitly, through interaction with unreliable networks or faulty hardware. When employed in safety-critical applications, it is important to rigorously analyse the behaviour of these systems. This can be done with a formal verification technique called model checking, which establishes properties of systems by algorithmically considering all execution scenarios. In the presence of probabilistic behaviour, we consider quantitative properties such as "the worst-case probability that the airbag fails to deploy within 10ms", instead of qualitative properties such as "the airbag eventually deploys". Although many model checking techniques exist to verify qualitative properties of software, quantitative model checking techniques typically focus on manually derived models of systems and cannot directly verify software. In this thesis, we present two quantitative model checking techniques for probabilistic software. The first is a quantitative adaptation of a successful model checking technique called counterexample-guided abstraction refinement, which uses stochastic two-player games as abstractions of probabilistic software. We show how to achieve abstraction and refinement in a probabilistic setting and investigate theoretical extensions of stochastic two-player game abstractions. Our second technique instruments probabilistic software in such a way that existing, non-probabilistic software verification methods can be used to compute bounds on quantitative properties of the original, uninstrumented software. Our techniques are the first to target real, compilable software in a probabilistic setting. We present an experimental evaluation of both approaches on a large range of case studies and evaluate several extensions and heuristics. We demonstrate that, with our methods, we can successfully compute quantitative properties of real network clients comprising approximately 1,000 lines of complex ANSI-C code; the verification of such software is far beyond the capabilities of existing quantitative model checking techniques.
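As a minimal illustration of the kind of quantitative property being checked, a textbook value-iteration sketch for maximum reachability probabilities in an MDP (not the thesis's abstraction-refinement machinery; the model below is invented):

import numpy as np

def max_reach_probability(transitions, target, n_states, iters=100):
    """Value iteration for the maximum probability of reaching a target
    set in an MDP. transitions[s] is a list of available distributions,
    each a dict {successor: probability}."""
    v = np.zeros(n_states)
    for s in target:
        v[s] = 1.0
    for _ in range(iters):
        for s in range(n_states):
            if s in target:
                continue
            v[s] = max(sum(p * v[t] for t, p in dist.items())
                       for dist in transitions[s])
    return v

# State 0 chooses between a risky direct jump and a safer route via
# state 1; state 3 is a failure sink. The maximising action gives 0.8.
transitions = {0: [{2: 0.6, 3: 0.4}, {1: 1.0}],
               1: [{2: 0.8, 3: 0.2}],
               2: [{2: 1.0}],
               3: [{3: 1.0}]}
print(max_reach_probability(transitions, target={2}, n_states=4))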
APA, Harvard, Vancouver, ISO, and other styles
47

Vari, Miklos. "The impact of central bank policies on money markets." Thesis, Paris 1, 2017. http://www.theses.fr/2017PA01E062.

Full text
Abstract:
This thesis is an attempt to better understand the impact of the various measures taken by central banks since 2008, particularly in the Euro area, focusing on the effects of the different unconventional policies on the money market. The first chapter shows how interbank market fragmentation disrupts the transmission of monetary policy. Fragmentation is the fact that banks, depending on their country of location, have different probabilities of default on their interbank borrowings. Once fragmentation is introduced into standard theoretical models of monetary policy implementation, excess liquidity arises endogenously. This leads short-term interest rates to depart from the central bank policy rates. Using data on cross-border financial flows and monetary policy operations, it is shown that this mechanism has been at work in the Euro area since 2008. The model is used to analyze conventional and unconventional monetary policy measures. The second chapter shows how Euro area money market rates have been standing, since 2015, below the deposit facility rate, which is nevertheless meant to be a floor; financial markets perceive this as a byproduct of the Eurosystem's public sector purchase programme (PSPP). The chapter explores empirically the interactions between the PSPP and short-term secured money market (repo) rates. We document different channels through which asset purchases may affect the various segments of the Euro area repo market. Using proprietary data from the PSPP and individual transactions made on the repo market for specific securities, our results show that the PSPP has contributed to pushing down repo rates: purchasing 1% of a bond's outstanding amount is associated with a decline of about 0.75 bps in its repo rate. The third chapter shows how regulations very close to those of Basel III were used by central banks in the three decades following the Second World War. At the time, these regulations were used to stabilize inflation and output, a role that would today typically be assigned to monetary policy rather than banking regulation. The historical experiences we describe clearly show that liquidity regulation has restrictive effects on activity.
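A hedged sketch of the kind of regression behind the headline estimate; the data frame, variable names and numbers below are invented for illustration and are not the paper's data:

import pandas as pd
import statsmodels.formula.api as smf

# Illustrative cross-section: share of each bond's outstanding amount
# bought under the PSPP vs. the change in its repo rate (bps).
df = pd.DataFrame({
    "purchase_share_pct": [0.5, 1.0, 1.5, 2.0, 3.0],
    "d_repo_bps":         [-0.3, -0.8, -1.2, -1.4, -2.3],
})
fit = smf.ols("d_repo_bps ~ purchase_share_pct", data=df).fit()
print(fit.params)   # slope near -0.75 bps per percentage point purchased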
APA, Harvard, Vancouver, ISO, and other styles
48

Podlipská, J. (Jana). "Non-invasive semi-quantitative and quantitative ultrasound imaging for diagnostics of knee osteoarthritis." Doctoral thesis, Oulun yliopisto, 2016. http://urn.fi/urn:isbn:9789526214351.

Full text
Abstract:
Osteoarthritis (OA) is a common degenerative disease of synovial joints that becomes more frequent with age. Pain, stiffness and functional disability caused by OA negatively affect the quality of individuals’ lives. In order to prevent the manifestation of symptoms and further OA progression, early diagnosis is essential. Ultrasonography has the potential to detect various abnormalities in the knee joint; however, its place in clinical practice remains uncertain. The present study aimed to determine the diagnostic performance of semi-quantitative wide-area ultrasound (US) scanning of knee femoral cartilage degeneration, osteophytes and meniscal extrusion, using magnetic resonance imaging as the reference tool. The diagnostic ability of conventional radiography (CR) was also determined and the performances of the two modalities compared. Subsequently, the association of structural US findings with knee pain and function was investigated. Finally, a quantitative US image analysis focusing on detection and evaluation of subchondral bone integrity in early OA was developed, and the quantitative US outcomes were compared with CR and arthroscopy. Tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration were identified by US with better, or at least comparable, accuracy than by CR, in which joint space narrowing was used as a composite measure of cartilage damage and meniscal extrusion. The global femoral cartilage grade was strongly associated with increased pain and disability. Site-specifically, medial cartilage degeneration and femoral lateral osteophytes in particular were associated with increased pain and disability. Regarding the quantitative outcomes, a significant increase in US intensity in the femoral subchondral bone at 0.35–0.7 mm depth and a decrease in the intensity slope up to 0.7 mm depth were observed with radiographic or arthroscopic OA progression. Novel wide-area US scanning provides relevant additional diagnostic information on tissue-specific OA pathology not depicted by CR, and US-detected changes of femoral cartilage and osteophytes are associated with clinical symptoms. Consequently, the use of US as a complementary imaging tool along with CR may enable more accurate diagnostics of knee OA. Furthermore, the developed quantitative US analysis is a promising tool for detection of femoral subchondral bone changes in knee OA.
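A minimal sketch of the two quantitative outcomes described above, assuming a depth-resolved intensity profile extracted below the cartilage-bone interface; the exact windowing and units in the thesis may differ:

import numpy as np

def subchondral_intensity_features(profile, depths_mm):
    """Mean US intensity in the 0.35-0.7 mm depth window below the
    cartilage-bone interface, and the least-squares slope of intensity
    vs. depth over 0-0.7 mm (intensity units per mm)."""
    window = (depths_mm >= 0.35) & (depths_mm <= 0.70)
    mean_intensity = profile[window].mean()
    upto = depths_mm <= 0.70
    slope = np.polyfit(depths_mm[upto], profile[upto], 1)[0]
    return mean_intensity, slope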
APA, Harvard, Vancouver, ISO, and other styles
49

Liukaitytė, Judita. "Quantitave Evaluation of Biometeorological Conditions in Lithuania." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2011. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2011~D_20110330_154958-12442.

Full text
Abstract:
Biometeorological information provides early warnings about health risks, that is, about how a complex of meteorological conditions will affect human health or comfort. The aim of the present research is to determine the impact of weather on human health in Lithuania and to evaluate biometeorological conditions across the country. A sociological evaluation of the weather sensitivity of the Lithuanian population was carried out. The impact of weather conditions on the recurrence of cardiovascular disease in Vilnius was measured, the most weather-sensitive diseases were identified, and their dependence on current weather conditions was established. Changes in the indices describing the biometeorological effects of cold (Wind Chill) and heat (Humidex) in Lithuania were analysed. The impact of heat on the mortality of Vilnius residents was evaluated, and indicators suitable for issuing heat warnings in Lithuania were determined. Ultraviolet radiation values measured at the Kaunas meteorological station were calibrated, and the suitability of the STAR model for forecasting UV radiation intensity in Lithuania was assessed. The results of the dissertation can be used to produce biometeorological forecasts; on the basis of this work, the existing system of early warnings about natural, catastrophic and other hazardous hydrometeorological events could be improved.
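As a concrete reference for the two indices named above, a sketch using their standard published formulations (the Environment Canada Wind Chill and Humidex formulas; validity ranges are approximate):

import math

def wind_chill(temp_c, wind_kmh):
    """Wind Chill index: air temperature in degC, wind speed in km/h
    at 10 m; intended roughly for T <= 10 degC and wind >= 4.8 km/h."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

def humidex(temp_c, dewpoint_c):
    """Humidex: air temperature plus a humidity term computed from the
    dew-point vapour pressure e (hPa)."""
    e = 6.11 * math.exp(5417.7530 * (1 / 273.16 - 1 / (273.15 + dewpoint_c)))
    return temp_c + 0.5555 * (e - 10.0)

print(wind_chill(-10.0, 30.0))   # approx -19.5
print(humidex(30.0, 22.0))       # approx 39.3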
APA, Harvard, Vancouver, ISO, and other styles
50

Babari, Parvaneh. "Quantitative Automata and Logic for Pictures and Data Words." Doctoral thesis, Universitätsbibliothek Leipzig, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-221165.

Full text
Abstract:
Mathematical logic and automata theory are two scientific disciplines with a close relationship that is not only fundamental for many theoretical results but also forms the basis of a coherent methodology for the verification and synthesis of computing systems. This connection goes back to the 1960s and the fundamental work of Büchi, Elgot and Trakhtenbrot, which shows the expressive equivalence of automata and logical systems such as monadic second-order logic on finite and infinite words. This allowed the handling of specifications (where global system properties are stated) and implementations (which involve the definition of the local steps in order to satisfy the global goals laid out in the specifications) in a single framework. This connection has been extended to and well-investigated for many other structures such as trees, finite pictures, timed words and data words. For many computer science applications, however, quantitative phenomena need to be modelled as well. Examples are vagueness and uncertainty of a statement, length of time periods, spatial information, and resource consumption. Weighted automata, introduced by Schützenberger, are prominent models for quantitative aspects of systems. The framework of weighted monadic second-order logic over words was first introduced by Droste and Gastin. They gave a characterization of the quantitative behavior of weighted finite automata as semantics of monadic second-order sentences within their logic. Meanwhile, the idea of weighted logics was also applied to devices recognizing more general structures, such as weighted tree automata and weighted automata on infinite words or traces. The main goal of this thesis is to give logical characterizations for weighted automata models on pictures and data words, as well as for Büchi-tiling systems, in the spirit of the classical Büchi-Elgot theorem. As the second goal, we deal with the synchronizing problem for data words. Below, we briefly summarize the contents of this thesis. Informally, a two-dimensional string is called a picture and is defined as a rectangular array of symbols taken from a finite alphabet. A two-dimensional language (or picture language) is a set of pictures. Picture languages have been intensively investigated by several research groups. In Chapter 1, we define weighted two-dimensional on-line tessellation automata (W2OTA) taking weights from a new weight structure called a picture valuation monoid. This new weighted picture automaton model can be used to model several applications, e.g. the average density of a picture. Such aspects could not be modelled by the semiring-weighted picture automaton model. The behavior of this automaton model is a picture series mapping pictures over an alphabet to elements of a picture valuation monoid. As one of our main results, we prove a Nivat theorem for W2OTA. It shows that recognizable picture series can be obtained precisely as projections of particularly simple unambiguously recognizable series restricted to unambiguous recognizable picture languages. In addition, we introduce a weighted monadic second-order logic (WMSO) which can model the average density of pictures. As the other main result, we show that W2OTA and a suitable fragment of our weighted MSO logic are expressively equivalent. In Chapter 2, we generalize the notion of finite pictures to +ω-pictures, i.e., pictures which have a finite number of rows and an infinite number of columns.
We extend conventional tiling systems with a Büchi acceptance condition in order to define the class of Büchi-tiling recognizable +ω-picture languages. The class of recognizable +ω-picture languages is indeed a natural generalization of ω-regular languages. We show that the class of all Büchi-tiling recognizable +ω-picture languages has similar closure properties to the class of tiling recognizable languages of finite pictures: it is closed under projection, union, and intersection, but not under complementation. While for languages of finite pictures, tiling recognizability and EMSO-definability coincide, the situation is quite different for languages of +ω-pictures. In this setting, the notion of tiling recognizability does not even cover the language of all +ω-pictures over Σ = {a, b} in which the letter a occurs at least once, a picture language that can easily be defined in first-order logic. As a consequence, EMSO is too strong for being captured by the class of tiling recognizable +ω-picture languages. On the other hand, EMSO is too weak for being captured by the class of all Büchi-tiling recognizable +ω-picture languages. To obtain a logical characterization of this class, we introduce the logic EMSO∞, which extends EMSO with existential quantification of infinite sets. Additionally, using combinatorial arguments, we show that the Büchi characterization theorem for ω-regular languages does not carry over to the Büchi-tiling recognizable +ω-picture languages. In Chapter 3, we consider the connection between weighted register automata and weighted logic on data words. Data words are sequences of pairs where the first element is taken from a finite alphabet (as in classical words) and the second element is taken from an infinite data domain. Register automata, introduced by Francez and Kaminski, provide a widely studied model for reasoning on data words. These automata can be considered as classical nondeterministic finite automata equipped with a finite set of registers which are used to store data in order to compare them with some data in the future. In this chapter, for quantitative reasoning on data words, we introduce weighted register automata over commutative data semirings equipped with a collection of binary data functions in the spirit of the classical theory of weighted automata. Whereas in the models of register automata known from the literature data are usually compared with respect to equality or a linear order, here we allow data comparison by means of an arbitrary collection of binary data relations. This approach makes it easy to incorporate timed automata and weighted timed automata into our framework. Motivated by the seminal Büchi-Elgot-Trakhtenbrot theorem about the expressive equivalence of finite automata and monadic second-order (MSO) logic, and by the weighted MSO logic of Droste and Gastin, we introduce weighted MSO logic on data words and give a logical characterization of weighted register automata. In Chapter 4, we study the concept of synchronizing data words in register automata. The synchronizing problem for data words asks whether there exists a data word that sends all states of the register automaton to a single state. The class of register automata that we consider here has a decidable non-emptiness problem, and the subclass of nondeterministic register automata with a single register has a decidable non-universality problem.
We provide complexity bounds for the synchronizing problem in the family of deterministic register automata with k registers (k-DRA) and in the family of nondeterministic register automata with a single register (1-NRA), and prove undecidability of the problem, in general, for the family of k-NRA. To this end, we prove that, for k-DRA, inputting data words with only 2k + 1 distinct data values, from the infinite data domain, is sufficient to synchronize. Then, we show that the synchronizing problem for k-DRA is in general PSPACE-complete, and is in NLOGSPACE for 1-DRA. For nondeterministic register automata (NRA), we show that Ackermann(n) distinct data values, where n is the number of states of the register automaton, might be necessary to synchronize. Then, by means of a construction proving that the synchronizing problem and the non-universality problem in 1-NRA are interreducible, we show the Ackermann-completeness of the problem for 1-NRA. For k-NRA, however, we prove that this problem is undecidable due to the unbounded length of synchronizing data words.
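The finite-state core of the synchronization question can be made concrete with a classical (registerless) sketch; the thesis's contribution is lifting this to data words, where the subset construction below no longer suffices:

from collections import deque

def synchronizing_word(states, alphabet, delta):
    """BFS over subsets of states for a shortest synchronizing word of a
    classical deterministic automaton. delta: dict (state, letter) -> state."""
    start = frozenset(states)
    queue, seen = deque([(start, "")]), {start}
    while queue:
        current, word = queue.popleft()
        if len(current) == 1:
            return word                     # all states collapsed
        for a in alphabet:
            nxt = frozenset(delta[(q, a)] for q in current)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, word + a))
    return None

# Cerny's 4-state automaton; a shortest synchronizing word has length (4-1)^2 = 9.
delta = {(0, 'a'): 1, (1, 'a'): 2, (2, 'a'): 3, (3, 'a'): 0,
         (0, 'b'): 0, (1, 'b'): 1, (2, 'b'): 2, (3, 'b'): 0}
print(synchronizing_word([0, 1, 2, 3], "ab", delta))   # e.g. 'baaabaaab'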
APA, Harvard, Vancouver, ISO, and other styles