Theses on the topic "Quantitative analysis of interactomes"

Consult the 50 best theses for your research on the topic "Quantitative analysis of interactomes".

Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Puchalska, Monika [Verfasser], and Gerhard [Akademischer Betreuer] Mittler. "Quantitative proteomic analysis of the interactome of mammalian S/MAR (scaffold/matrix attachment region) elements". Freiburg: Universität, 2018. http://d-nb.info/1216826447/34.

Full text
2

Jané, Palli Pau. "Quantification des affinités PBM/PDZ et de leurs sites modulateurs par des approches expérimentales et informatiques à haut débit". Electronic Thesis or Diss., Strasbourg, 2020. http://www.theses.fr/2020STRAJ051.

Full text
Abstract
This thesis focuses on PDZ domains, a family of globular domains that bind conserved PDZ-Binding Motifs (henceforth PBMs), generally situated at the extreme C-terminus of their partner proteins. Domain-motif networks are often modulated by reversible post-translational modifications (PTMs). We used synthesized PBMs to reproduce different conditions: wild-type motifs of various lengths, acetylated or phosphorylated motifs, motifs with additional exosites, and motifs carrying PTM-mimicking mutations described in the literature. These peptides were used for interaction studies with the holdup assay, an assay originally developed in our laboratory. We evaluated the impact of these diverse modifications on PBM/PDZ interactions, which lead to global changes in PDZ-binding capability. These results provide quantitative information on the biological effects that such modifications may have in the context of full-length proteins.
3

Franchin, Cinzia. "Mass Spectrometry-Based Quantitative Proteomics to Study Proteomes, Phosphoproteomes and Interactomes". Doctoral thesis, Università degli studi di Padova, 2012. http://hdl.handle.net/11577/3422169.

Full text
Abstract
Over the past few years, mass spectrometry-based proteomics has been widely applied to the most diverse fields of biochemistry, biomedicine and biology, and several approaches have been developed to allow absolute and relative quantification of proteins in very complex mixtures. During my PhD, I conducted three main studies, taking advantage of different quantitative proteomics techniques. In particular, SILAC and iTRAQ approaches have been exploited to investigate the calcification process induced by endotoxin in clonal interstitial aortic valve cells, while SILAC and label-free quantitative approaches have been exploited to identify new potential interacting partners and substrates of the protein kinase CK2. Calcific aortic valve disease represents the most common type of valvular disease and the first cause of surgical valve replacement in the industrialized world. No medical therapies are available to prevent or slow down calcium deposition within the valve leaflets; therefore, surgery is the only possible treatment. To investigate the molecular mechanisms underlying the calcification process, SILAC and iTRAQ proteomics approaches have been applied to bovine interstitial aortic valve cells, a cellular model that is able to acquire a pro-calcific profile and drive matrix mineralization upon treatment with an inflammatory stimulus such as the endotoxin lipopolysaccharide (LPS). The application of two different quantitative technologies to the same cellular model led to the identification and relative quantification of hundreds of proteins, many of which showed a significant alteration in response to LPS. The acquired data suggest that cellular oxidoreductase activity, cytoskeletal and spliceosome regulation, glycolysis/gluconeogenesis, and arginine metabolism are altered during the acquisition of the pro-calcific profile. These results represent a starting point for investigating in more detail the molecular mechanisms that seem to be strongly involved in the calcification process induced by LPS. The other projects described in this thesis focus on CK2, an essential, constitutively active and highly pleiotropic protein kinase. CK2, like many other kinases, is strongly involved in several cellular processes, and in particular it has been hypothesized that this enzyme plays a crucial role in the transduction of survival signals. However, a clear comprehension of the multiple roles played by this kinase within the cell has not been achieved. The aim of these projects was the identification of interacting partners and substrates of CK2 by means of proteomics approaches, to try to shed some light on the functions performed by this kinase. In particular, a combination of immunoprecipitation experiments and label-free quantitative analyses has been performed to identify new potential interacting partners of CK2, while the SILAC technology, in combination with the use of a specific and potent inhibitor of CK2, was exploited to identify new putative substrates of this kinase directly in a cellular system. The results obtained confirm the notion that CK2 plays a role in many fundamental cellular functions and clearly indicate a strong involvement of this kinase in the biological processes of protein biosynthesis and degradation. Moreover, interesting aspects linked to phosphorylation/dephosphorylation turnover rates emerged from these analyses. A detailed discussion, from a technical and biological point of view, of the data collected is presented.
Finally, during my PhD I also collaborated on a project aimed at identifying the primary molecular targets of antimicrobial photodynamic therapy. This work, not discussed in the thesis, has recently been submitted to the "Journal of Proteomics" under the title: "Molecular Targets of Antimicrobial Photodynamic Therapy Identified by a Proteomic Approach".
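The SILAC screens described above come down to comparing heavy and light channel intensities protein by protein. The following minimal sketch is illustrative only, not the thesis' actual pipeline: the protein names, intensities and fold-change cutoff are hypothetical, and interactor candidates are simply those enriched in the bait ("heavy") channel over the control ("light") channel.

```python
import math

def silac_candidates(intensities, fold_change=2.0):
    """intensities: {protein: (heavy, light)} channel intensities from a
    bait pull-down labeled heavy and a mock control labeled light."""
    cutoff = math.log2(fold_change)
    hits = {}
    for protein, (heavy, light) in intensities.items():
        if heavy > 0 and light > 0:            # skip one-channel-only IDs
            ratio = math.log2(heavy / light)   # log2 H/L enrichment
            if ratio >= cutoff:
                hits[protein] = ratio
    return hits

# Hypothetical toy data: a CK2 pull-down vs. a mock IP
print(silac_candidates({"CSNK2B": (9.5e6, 1.1e6), "ACTB": (2.0e6, 1.9e6)}))
```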
4

Kühnle, Tim. "Quantitative Analysis of Human Chronotypes". Diss., LMU, 2006. http://nbn-resolving.de/urn:nbn:de:bvb:19-51686.

Full text
5

Bhabuta, Madhu Darshan Kumar. "Quantitative analysis of ATM networks". Thesis, Imperial College London, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299444.

Full text
6

Owen, Christopher Grant. "Quantitative analysis of conjunctival vasculature". Thesis, Queen's University Belfast, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.287628.

Full text
7

Doupé, David Patrick. "Quantitative analysis of epithelial homeostasis". Thesis, University of Cambridge, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.611770.

Full text
8

Zeng, Wen. "Quantitative analysis of distributed systems". Thesis, University of Newcastle upon Tyne, 2014. http://hdl.handle.net/10443/2638.

Full text
Abstract
Computing Science addresses the security of real-life systems by using various security-oriented technologies (e.g., access control solutions and resource allocation strategies). These security technologies significantly increase the operational costs of the organizations in which systems are deployed, due to the highly dynamic, mobile and resource-constrained environments. As a result, the problem of designing user-friendly, secure and highly efficient information systems in such complex environments has become a major challenge for developers. In this thesis, firstly, new formal models are proposed to analyse the secure information flow in cloud computing systems. Then, the opacity of workflows in cloud computing systems is investigated, a threat model is built for cloud computing systems, and the information leakage in such systems is analysed. This study can help cloud service providers and cloud subscribers to analyse the risks they take with the security of their assets and to make security-related decisions. Secondly, a procedure is established to quantitatively evaluate the costs and benefits of implementing information security technologies. In this study, a formal system model for data resources in a dynamic environment is proposed, which focuses on the location of different classes of data resources as well as the users. Using such a model, the concurrent and probabilistic behaviour of the system can be analysed. Furthermore, efficient solutions are provided for the implementation of information security systems based on queueing theory and stochastic Petri nets. This part of the research can help information security officers to make well-judged information security investment decisions.
9

Dong, Zhiyuan. "Three Essays in Quantitative Analysis". University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1282048935.

Full text
10

Koneru, Narendra. "Quantitative analysis of domain testing effectiveness". Johnson City, Tenn.: East Tennessee State University, 2001. http://etd-submit.etsu.edu/etd/theses/available/etd-0404101-011933/unrestricted/koneru0427.pdf.

Full text
11

Nazar, Rogelio. "A quantitative approach to concept analysis". Doctoral thesis, Universitat Pompeu Fabra, 2010. http://hdl.handle.net/10803/7516.

Full text
Abstract
The present research focuses on the study of the distribution of lexis in corpora, and its aim is to inquire into the relations that exist between concepts through the occurrences of the terms that designate them. The initial hypothesis is that it is possible to analyze concepts by studying the contexts of occurrence of the terms, more precisely by taking into account the statistics of term co-occurrence in context windows of n words. The thesis presents a computational model in the form of graphs of term co-occurrence in which each node represents a single-word or multiword term. Given a query term, a graph for that term is derived from a given corpus. As texts are analyzed, every time two terms appear together in the same context window, the nodes that represent them are connected by an arc or, if they already had one, their connection is strengthened. This graph is presented as a model of learning, and as such it is evaluated with experiments in which a computer program solves tasks that involve some degree of concept analysis. Within the scope of concept analysis, one such task is to tell whether a word or a sequence of words in a given text is referring to a specific concept and to derive some of the basic properties of that concept, such as its taxonomic relations. Other tasks are to determine when the same word refers to more than one concept (cases of homonymy or polysemy) and when different words refer to the same concept (cases of synonymy or equivalence between languages or dialectal variations). As a linguistic interpretation of these phenomena, this thesis derives a generalization in the realm of discourse analysis: the properties of the co-occurrence graphs are possible because authors of argumentative texts tend to name some of the basic properties of the concepts that they introduce in discourse. This happens mainly at the beginning of texts, in order to ensure that premises are shared between reader and writer. Each author will predicate different information about a given concept, but authors that treat the same topic will tend to start from a common base, and this coincidence will be expressed in the selection of the vocabulary. This coincidence in the selection of the vocabulary, because of its cumulative effect, can be studied with statistical means.
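The windowed co-occurrence counting described in this abstract is straightforward to sketch. The following minimal example is an assumption-laden illustration, not the thesis' implementation: it uses naive whitespace tokenization and a placeholder window size n, whereas the thesis' actual term extraction handles multiword terms.

```python
from collections import defaultdict

def cooccurrence_graph(tokens, n=5):
    """Count co-occurrences of terms within windows of n tokens.
    Returns {(term_a, term_b): weight} with alphabetically ordered pairs."""
    graph = defaultdict(int)
    for i, a in enumerate(tokens):
        for b in tokens[i + 1:i + n]:       # partners within the same window
            if a != b:
                graph[tuple(sorted((a, b)))] += 1   # create or strengthen arc
    return graph

tokens = "concept analysis studies the occurrences of terms in context".split()
print(dict(cooccurrence_graph(tokens, n=3)))
```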

12

Ross, Kyla Turpin. "Quantitative Analysis of Feedback During Locomotion". Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/14110.

Full text
Abstract
It is known that muscles possess both intrinsic and reflexive responses to stretch, both of which have been studied extensively. While much is known about heterogenic and autogenic reflexes during the crossed-extension reflex (XER), these have not been well characterized during locomotion. In this study, we mapped the distribution of autogenic and heterogenic feedback in hindlimb extensor muscles using muscle stretch in the spontaneously locomoting premammillary decerebrate cat. We used natural stimulation and compared stretch-evoked force responses obtained during locomotion with those obtained during the XER. The goal was to ascertain whether feedback was modulated between the two states. We found that heterogenic feedback pathways, particularly those emanating from the medial gastrocnemius (MG), remained inhibitory during locomotion, while autogenic feedback specifically in MG increased in gain. Furthermore, increases in MG gain were due to force-dependent mechanisms. This suggests that rather than an abrupt transition from inhibition to excitation with changes in motor tasks, these pathways coexist and contribute to maintaining interjoint coordination. Increases in autogenic gain provide a localized loading reflex that contributes to the completion of the movement. The results of these experiments are clinically significant, particularly for the rehabilitation of spinal cord injured patients. To effectively administer treatment and therapy for patients with compromised spinal reflexes, a complete understanding of the circuitry is required.
13

Joh, In-Ho. "Quantitative analysis of biological decision switches". Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/43672.

Full text
Abstract
Cells switch phenotypes or behaviors to adapt to various environmental stimuli. Often there are multiple alternative phenotypes, hence a cell chooses one phenotype among them, a process which we term a "decision switch". At the cellular level, decision switches are governed by gene regulation, hence they are intrinsically stochastic. Here we investigate two aspects of decision switches: how the copy number of genetic components facilitates multiple phenotypes, and how temporal dynamics of gene regulation with stochastic fluctuations affect the switching of a cell fate. First, we demonstrate that gene expression can be sensitive to changes in the copy number of genes and promoters, and alternative phenotypes may arise due to bistability within gene regulatory networks. Our analysis in phage-lambda-infected E. coli cells exhibits drastic changes in gene expression upon changing the copy number of viral genes, suggesting phages can determine their fates collectively via sharing gene products. Second, we examine decision switches mediated by temporal dynamics of gene regulation. We consider a case when temporal gene expression triggers a corresponding cell fate, and apply it to the lysis-lysogeny decision switch of phage lambda. Our analysis recapitulates the systematic bias between lysis and lysogeny by the viral gene copy number. We also present a quantitative measure of cell fate predictability based on temporal gene expression. Analyses using our framework suggest that the future fate of a cell can be highly correlated with temporal gene expression, and predicted if the current gene expression is known.
14

Kurths, Jürgen, A. Voss, Annette Witt, P. Saparin, H. J. Kleiner and N. Wessel. "Quantitative analysis of heart rate variability". Universität Potsdam, 1994. http://opus.kobv.de/ubp/volltexte/2007/1347/.

Full text
Abstract
In the modern industrialized countries, every year several hundred thousand people die of sudden cardiac death. The individual risk for sudden cardiac death cannot be defined precisely by commonly available, non-invasive diagnostic tools like Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyse the HRV. In particular, some complexity measures based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to obtain a more precise definition of the individual risk. These findings have to be validated in a representative number of patients.
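For illustration, the sketch below shows a symbolic-dynamics complexity measure in the spirit the abstract describes, not the authors' exact definition: the alphabet, word length and the toy RR series are assumptions, and the renormalized entropy of the paper is a different quantity. RR intervals are coarse-grained into four symbols around the series mean, and the Shannon entropy of short symbol words serves as a complexity index.

```python
import math

def symbolic_entropy(rr, a=0.05, word_len=3):
    """Coarse-grain RR intervals into a 4-symbol alphabet around the mean,
    then return the Shannon entropy (bits) of overlapping symbol words."""
    mu = sum(rr) / len(rr)
    def sym(x):   # 0..3: far below / below / above / far above the mean
        if x > (1 + a) * mu: return 3
        if x > mu: return 2
        if x > (1 - a) * mu: return 1
        return 0
    s = [sym(x) for x in rr]
    words = [tuple(s[i:i + word_len]) for i in range(len(s) - word_len + 1)]
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(symbolic_entropy([800, 810, 790, 805, 950, 760, 800, 812]))  # toy series
```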
15

Lien, Tonje Gulbrandsen. "Statistical Analysis of Quantitative PCR Data". Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-13094.

Full text
Abstract
This thesis seeks to develop a better understanding of the analysis of gene expression to find the amount of transcript in a sample. The mainstream method used is called the Polymerase Chain Reaction (PCR), and it exploits DNA's ability to replicate. The comparative CT method estimates the starting fluorescence level f0 by assuming constant amplification in each PCR cycle, and it uses the cycle at which the fluorescence level has risen above a certain threshold. We present a generalization of this method, where different threshold values can be used. The main aim of this thesis is to evaluate a new method called the Enzymological method. It estimates f0 by considering a cycle-dependent amplification and uses a larger part of the fluorescence curves than the two CT methods. All methods are tested on dilution series, where the dilution factors are known. In one of the datasets studied, the Clusterin dilution dataset, we get better estimates from the Enzymological method than from the two CT methods.
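Concretely, the constant-amplification assumption behind the comparative CT method gives f_c = f0 * (1 + E)^c, so f0 follows from the threshold fluorescence and the threshold cycle Ct. A minimal sketch, with illustrative numbers rather than the thesis' data:

```python
def f0_from_ct(f_threshold, ct, efficiency=1.0):
    """Starting fluorescence under constant amplification:
    f_c = f0 * (1 + E)**c  =>  f0 = f_threshold / (1 + E)**Ct."""
    return f_threshold / (1.0 + efficiency) ** ct

# A 10-fold dilution should shift Ct by log2(10), about 3.32 cycles, at E = 1:
print(f0_from_ct(0.2, ct=20.0))    # undiluted sample
print(f0_from_ct(0.2, ct=23.32))   # roughly a 10x dilution
```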
16

Lee, Chae-Joon. "Quantitative analysis of oral phase dysphagia". Thesis, University of British Columbia, 1991. http://hdl.handle.net/2429/29995.

Full text
Abstract
Swallowing disorders (dysphagia) are becoming increasingly prevalent in the health care field. Oral phase dysphagia specifically refers to swallowing difficulties encountered during the initial, voluntary segment of the swallowing process, where the ingested food material is prepared into a manageable food mass (bolus) and then propelled from the mouth to the pharynx. Proper management (evaluation and treatment) of oral phase dysphagia poses a difficult task. The swallowing process is a complex sequence of highly coordinated events, while oral phase dysphagia can originate from a wide range of underlying causes. From a review of the literature, it is apparent that many of the current methods of management for oral phase dysphagia rely heavily on the subjective skills of the clinician in charge. In order to improve upon these methods as well as to gain a better understanding of the swallowing mechanism, the development of new, quantitative clinical methods is proposed. A computer-aided tongue force measurement system has been developed that is capable of quantifying individual tongue strength and coordination. This strain gauge-based transducer system measures upward and lateral tongue thrust levels during a "non-swallowing" test condition. Clinical testing has been performed on 15 able-bodied control subjects with no prior history of swallowing difficulty and several patient groups (25 post-polio, 8 stroke, 2 other neuromuscular disorders) with or without swallowing difficulties. The results have demonstrated the system's potential as a reliable clinical tool (repeatability: t-test, p < 0.01) to quantify variation in tongue strength and coordination between and within the control and patient groups. In order to investigate the possible existence and role of a suction mechanism in transporting the bolus during the oral phase, a pharyngeal manometry setup has been utilized for oral phase application. This novel approach (oral phase manometry) would determine whether the widely accepted tongue pressure mechanism is solely responsible for oral phase bolus transport. Based on extensive manometric data obtained from an able-bodied subject without any swallowing abnormality, no significant negative pressure or pressure gradients (suction) were found during the oral phase. The pressure generated by the tongue acting against the palate in propelling the bolus during the oral phase has been utilized in the development of an oral cavity pressure transducer system. This computer-aided measurement system employs a network of ten force sensing resistors (FSRs) attached to a custom-made mouthguard transducer unit to record the wave-like palatal pressure pattern during an actual swallow. Clinical testing has been performed on 3 able-bodied subjects with no swallowing abnormality, and their pressure data have been found to support the reliability of the device (repeatability: t-test, p < 0.01). The system has also been applied to study the effects of bolus size and consistency on the palatal pressure pattern during the oral phase. The results of this investigation have confirmed the importance of the tongue during the oral phase of swallowing. It has been shown that deficiency in tongue strength or coordination is a primary factor in oral phase dysphagia. The new, quantitative clinical methods developed for this research utilize these parameters to provide improved methods of evaluation and treatment for oral phase dysphagia.
17

Vijayaraj, Veeraraghavan. "A QUANTITATIVE ANALYSIS OF PANSHARPENED IMAGES". MSSTATE, 2004. http://sun.library.msstate.edu/ETD-db/theses/available/etd-07012004-173924/.

Full text
Abstract
There has been an exponential increase in satellite image data availability. Image data are now collected with different spatial, spectral, and temporal resolutions. Image fusion techniques are used extensively to combine different images having complementary information into one single composite. The fused image has rich information that will improve the performance of image analysis algorithms. Pansharpening is a pixel-level fusion technique used to increase the spatial resolution of the multispectral image using spatial information from the high-resolution panchromatic image while preserving the spectral information in the multispectral image. Resolution merge, image integration, and multisensor data fusion are some of the equivalent terms used for pansharpening. Pansharpening techniques are applied for enhancing certain features not visible in either of the single data sources alone, change detection using temporal data sets, improving geometric correction, and enhancing classification. Various pansharpening algorithms are available in the literature, and some have been incorporated in commercial remote sensing software packages such as ERDAS Imagine® and ENVI®. The performance of these algorithms varies both spectrally and spatially. Hence, evaluation of the spectral and spatial quality of the pansharpened images using objective quality metrics is necessary. In this thesis, quantitative metrics for evaluating the quality of pansharpened images have been developed. For this study, Intensity-Hue-Saturation (IHS) based sharpening, Brovey sharpening, Principal Component Analysis (PCA) based sharpening and a wavelet-based sharpening method are used.
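As an illustration of one of the evaluated methods, the Brovey transform scales each multispectral band by the ratio of the panchromatic band to the summed multispectral intensity. A minimal sketch under simplifying assumptions (toy random arrays, bands already co-registered and resampled; not the thesis' imagery or exact normalization):

```python
import numpy as np

def brovey(ms, pan, eps=1e-6):
    """Brovey pansharpening: scale each multispectral band by the ratio of
    the panchromatic band to the summed multispectral intensity.
    ms: (bands, H, W) resampled to the pan grid; pan: (H, W)."""
    intensity = ms.sum(axis=0) + eps              # avoid division by zero
    return ms * (pan / intensity)[None, :, :]

ms = np.random.rand(3, 4, 4)    # toy 3-band image, already co-registered
pan = np.random.rand(4, 4)
print(brovey(ms, pan).shape)    # (3, 4, 4)
```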
18

Li, Michael Kok Keng. "Quantitative analysis of chloroplast protein targeting". Thesis, University of Warwick, 2009. http://wrap.warwick.ac.uk/3917/.

Full text
Abstract
This thesis presents the first use of the Partition of Unity Method in quantifying the spatio-temporal dynamics of a fluorescent protein targeted to the chloroplast twin-arginine translocation pathway. The fluorescence loss in photobleaching technique is applied in a modified fashion to the measurement of substrate mobilities in the chloroplast stroma. Our in vivo results address the two suggested protein targeting mechanisms: membrane binding before lateral movement to the translocon, and direct binding to the translocon. A high-performance computing C/C++ implementation of the Partition of Unity Method is used to perform simulations of fluorescence loss in photobleaching and allow a compelling comparison to photobleaching data series. The implementation is both mesh-free and particle-less.
19

Kafatos, George. "Statistical analysis of quantitative seroepidemiological data". Thesis, Open University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.539408.

Full text
20

Zou, Ping. "Quantitative analysis of digital subtraction angiograms". Thesis, University of Sheffield, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.489076.

Full text
Abstract
Occlusive vascular disease (e.g. arteriosclerosis) is a major and growing health problem worldwide. New drug treatments - as opposed to conventional bypass surgery - have attracted much research interest, but a key obstacle here is the lack of any objective methodology for assessing the effectiveness of these treatments (before and after studies). The clinical data sources are digital subtraction angiograms (DSAs), which are a routine assessment of patients presenting with symptoms of occlusive vascular disease. Novel image processing algorithms are required to analyse these DSA images, which are of low quality though of high spatial resolution and likely to contain complex vascular networks. Ultimately, we aim to model the haemodynamic capacity of the patient's vascular system using the geometric information extracted from DSAs.
21

Tan, K.-W. "A quantitative analysis of trade flows". Thesis, University of East Anglia, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.377742.

Full text
22

Bennett, Graham. "Quantitative analysis using low resolution NMR". Thesis, University of Nottingham, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299565.

Full text
23

Gallagher, Raymond. "Uncertainty modelling in quantitative risk analysis". Thesis, University of Liverpool, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367676.

Full text
24

Barkshire, Ian Richard. "Quantitative Auger analysis of layered structures". Thesis, University of York, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306431.

Full text
25

Canales, Kelly E. 1969. "Volumetric quantitative analysis of receptor function". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/89342.

Full text
26

Bura, Cotiso Andrei. "Mathematical frameworks for quantitative network analysis". Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/95034.

Full text
Abstract
This thesis is comprised of three parts. The first part describes a novel framework for computing importance measures on graph vertices. The concept of a D-spectrum is introduced, based on vertex ranks within certain chains of nested subgraphs. We show that the D-spectrum integrates the degree distribution and coreness information of the graph as two particular such chains. We prove that these spectra are realized as fixed points of certain monotone and contractive SDSs we call t-systems. Finally, we give a vertex deletion algorithm that efficiently computes D-spectra, and we illustrate their correlation with stochastic SIR processes on real-world networks. The second part deals with the topology of the intersection nerve for a bi-secondary structure and its singular homology. A bi-secondary structure R is a combinatorial object that can be viewed as a collection of cycles (loops) of certain at most tetravalent planar graphs. Bi-secondary structures arise naturally in the study of RNA riboswitches - molecules that have an MFE binary structural degeneracy. We prove that this loop nerve complex has a Euclidean 3-space embedding characterized solely by H2(R), its second homology group. We show that this group is the only non-trivial one in the sequence and, furthermore, that it is free abelian. The third part further describes the features of the loop nerve. We identify certain disjoint objects in the structure of R which we call crossing components (CC). These are non-trivial connected components of a graph that captures a particular non-planar embedding of R. We show that each CC contributes a unique generator to H2(R), and thus the total number of these crossing components in fact equals the rank of the second homology group.
This thesis is divided into three parts. The first part describes a novel mathematical framework for decomposing a real-world network into layers. A network is comprised of interconnected nodes and can model anything from the transportation of goods to the way the internet is organized. Two key numbers describe the local and global features of a network: the number of neighbors a node has, and the number of neighbors it has in a certain layer. Our work shows that there are other numbers in between the two that better characterize a node, and we give explicit means of computing them. Finally, we show that these numbers are connected to the way information spreads on the network, uncovering a relation between the network's structure and dynamics on said network. The last two parts of the thesis have a common theme and study the same mathematical object. In the first of the two, we provide a new model for the way riboswitches organize themselves. Riboswitches are RNA molecules within a cell that can take two mutually opposite conformations, depending on what function they need to perform within said cell. They are important from an evolutionary standpoint and are actively studied within that context, usually being modeled as networks. Our model captures the shapes of the two possible conformations and encodes them within a mathematical object called a topological space. Once this is done, we prove that certain numbers that are attached to all topological spaces carry specific values for riboswitches. Namely, we show that the shapes of the two possible conformations of a riboswitch are always characterized by a single integer. In the last part of the thesis we identify what exactly in the structure of riboswitches contributes to this number being large or small. We prove that the more tangled the two conformations are, the larger the number. We can thus conclude that this number is directly proportional to how complex the riboswitch is.
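For reference, the coreness information that the abstract says the D-spectrum subsumes is itself computed by vertex deletion. The sketch below is the standard k-core peeling on a toy graph, not the thesis' D-spectrum algorithm:

```python
def coreness(adj):
    """adj: {vertex: set(neighbors)}. Returns {vertex: core number} via
    repeated deletion of a minimum-degree vertex (k-core peeling)."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    core, k = {}, 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))       # minimum-degree vertex
        k = max(k, len(adj[v]))                       # core level never drops
        core[v] = k
        for u in adj[v]:
            adj[u].discard(v)                         # delete v from the graph
        del adj[v]
    return core

# Triangle plus a pendant vertex: core numbers are 2, 2, 2 and 1
print(coreness({1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2}, 4: {1}}))
```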
27

Koneru, Narendra. "Quantitative analysis of domain testing effectiveness". [Johnson City, Tenn. : East Tennessee State University], 2001. http://etd-submit.etsu.edu/etd/theses/available/etd-0404101-011933/unrestricted/koneru0427.pdf.

Full text
28

Teltzrow, Maximilian. "A quantitative analysis of e-commerce". Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2005. http://dx.doi.org/10.18452/15297.

Full text
Abstract
The aim of this thesis is to explore the border between the competing interests of online consumers and companies. Privacy on the Internet is investigated from a consumer perspective, and recommendations for better privacy management by companies are suggested. The proposed solutions allow the resolution of conflicting goals between companies' data usage practices and consumers' privacy concerns. The research is carried out with special emphasis on retailers operating multiple distribution channels; these retailers have become the dominant players in e-commerce. The thesis presents a set of business analyses for measuring the online success of Web sites. New conversion metrics and customer segmentation approaches are introduced. The analysis framework has been tested on Web data from a large multi-channel retailer and an information site. The analysis of Web data requires that privacy restrictions be adhered to, so the impact of legislative and self-imposed privacy requirements on our analysis framework is also discussed. We propose a privacy-preserving Web analysis service that calculates our set of business analyses and indicates when an analysis is not compliant with privacy requirements. A syntactical extension of a privacy standard is proposed. Moreover, an overview of consumer privacy concerns and their particular impact on personalization systems is provided, summarized in a meta-study of 30 privacy surveys. A company must not only respect privacy requirements in its Web analysis and usage purposes but must also effectively communicate these privacy practices to its site visitors. A privacy communication design is presented which allows more efficient communication of a Web site's privacy practices to its users. Subjects who interacted with our new interface design were significantly more willing to share personal data with the Web site. They rated its privacy practices and the perceived benefit higher, and made considerably more purchases.
29

Laviano, Francesco. "Magneto-optics: Imaging and Quantitative Analysis". Doctoral thesis, Politecnico di Torino, 2005. http://hdl.handle.net/11583/2565567.

Full text
Abstract
The work presented in this thesis deals with the magneto-optical imaging (MOI) technique, which was developed during the candidate's PhD course with equipment at the international state of the art. The quantitative theoretical formalism was critically reproduced and verified, along with a careful calibration procedure. A major improvement was introduced for all MOI experiments aimed at the quantitative reconstruction of the current density, via an iterative procedure (Supercond. Sci. Technol. 16, 71 (2003)). The new MOI method was then applied to study a wide class of superconducting materials, and a thorough study of the electrodynamics of superconducting films was accompanied by research in the high-energy nuclear field for creating suitable nanometric-size defects. In particular: the anisotropic interaction of correlated nanostructures with vortices was verified for the first time (Phys. Rev. B 68, 014507 (2003)); controlled micro-modulations of the superconducting properties through confined high-energy heavy-ion irradiation were achieved, and nonlocal electrodynamics plus new vortex matter phases were observed; the measurement of hybrid heterostructures (superconducting/ferromagnetic bilayers) joined the studies of the macroscopic quantum physics of vortices and of the macroscopic (but localized) spin structures (interacting with vortices) in the ferromagnetic film.
30

Weith-Glushko, Seth A. "Quantitative analysis of infrared contrast enhancement algorithms". Online version of thesis, 2007. http://hdl.handle.net/1850/4208.

Full text
31

Folatelli, Gastón. "Type Ia Supernova Cosmology: Quantitative Spectral Analysis". Doctoral thesis, Stockholm University, Department of Physics, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-80.

Full text
Abstract

Type Ia supernovae have been successfully used as standardized candles to study the expansion history of the Universe. In the past few years, these studies led to the exciting result of an accelerated expansion caused by the repelling action of some sort of dark energy. This result has been confirmed by measurements of cosmic microwave background radiation, the large-scale structure, and the dynamics of galaxy clusters. The combination of all these experiments points to a “concordance model” of the Universe with flat large-scale geometry and a dominant component of dark energy.

However, there are several points related to supernova measurements which need careful analysis in order to establish beyond doubt the validity of the concordance model. As the amount and quality of data increase, the need to control possible systematic effects which may bias the results becomes crucial. Also important is the improvement of our knowledge of the physics of supernova events, to assure and possibly refine their calibration as standardized candles.

This thesis addresses some of those issues through the quantitative analysis of supernova spectra. The stress is put on a careful treatment of the data and on the definition of spectral measurement methods. The comparison of measurements for a large set of spectra from nearby supernovae is used to study their homogeneity and to search for spectral parameters which may further refine the calibration of the standardized candle. One such parameter is found to reduce the dispersion in the distance estimation of a sample of supernovae to below 6%, a precision comparable with the current lightcurve-based calibration, and obtained in an independent manner. Finally, the comparison of spectral measurements from nearby and distant objects is used to test the possibility of evolution with cosmic time of the intrinsic brightness of type Ia supernovae.

32

Kluth, Stephan. "Quantitative modeling and analysis with FMC-QE". PhD thesis, Universität Potsdam, 2011. http://opus.kobv.de/ubp/volltexte/2011/5298/.

Full text
Abstract
The modeling and evaluation calculus FMC-QE, the Fundamental Modeling Concepts for Quantitative Evaluation [1], extends the Fundamental Modeling Concepts (FMC) for performance modeling and prediction. In this new methodology, the hierarchical service requests are the main focus, because they are the origin of every service provisioning process. As in physics, these service requests are a tuple of value and unit, which enables hierarchical service request transformations at the hierarchical borders and therefore hierarchical modeling. By reducing model complexity through decomposing the system into different hierarchical views, distinguishing between operational and control states, and calculating the performance values under the steady-state assumption, FMC-QE is scalably applicable to complex systems. According to FMC, the system is modeled in a 3-dimensional hierarchical representation space, where system performance parameters are described in three arbitrarily fine-grained hierarchical bipartite diagrams. The hierarchical service request structures are modeled in Entity Relationship Diagrams. The static server structures, divided into logical and real servers, are described as Block Diagrams. The dynamic behavior and the control structures are specified as Petri Nets, more precisely Colored Time-Augmented Petri Nets. From the structures and parameters of the performance model, a hierarchical set of equations is derived. The calculation of the performance values is done on the assumption of stationary processes and is based on fundamental laws of performance analysis: Little's Law and the Forced Traffic Flow Law. Little's Law is used within the different hierarchical levels (horizontal), and the Forced Traffic Flow Law is the key to the dependencies among the hierarchical levels (vertical). This calculation is suitable for complex models and allows a fast (re-)calculation of different performance scenarios in order to support development and configuration decisions. Within the Research Group Zorn at the Hasso Plattner Institute, the work is embedded in broader research on the development of FMC-QE. While this work concentrates on the theoretical background, description and definition of the methodology, as well as the extension and validation of its applicability, other topics are the development of an FMC-QE modeling and evaluation tool and the usage of FMC-QE in the design of an adaptive transport layer in order to fulfill Quality of Service and Service Level Agreements in volatile service-based environments. This thesis contains a state-of-the-art survey, the description of FMC-QE, and extensions of FMC-QE in representative general models and case studies. In the state-of-the-art part of the thesis, in chapter 2, an overview of existing Queueing Theory and Time-Augmented Petri Net models and other quantitative modeling and evaluation languages and methodologies is given. Other hierarchical quantitative modeling frameworks are also considered. The description of FMC-QE in chapter 3 consists of a summary of the foundations of FMC-QE, basic definitions, the graphical notations, the FMC-QE Calculus, and the modeling of open queueing networks as an introductory example. The extensions of FMC-QE in chapter 4 consist of the integration of the summation method in order to support the handling of closed networks and the modeling of multiclass and semaphore scenarios.
Furthermore, FMC-QE is compared to other performance modeling and evaluation approaches. In the case study part in chapter 5, proof-of-concept examples are provided, such as the modeling of a service-based search portal, a service-based SAP NetWeaver application and the Axis2 Web service framework. Finally, conclusions are given in a summary of contributions and an outlook on future work in chapter 6. [1] Werner Zorn. FMC-QE - A New Approach in Quantitative Modeling. In Hamid R. Arabnia, editor, Proceedings of the International Conference on Modeling, Simulation and Visualization Methods (MSV 2007) within WorldComp '07, pages 280-287, Las Vegas, NV, USA, June 2007. CSREA Press. ISBN 1-60132-029-9.
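For readers unfamiliar with the two laws the calculus rests on: Little's Law states N = X * R within one hierarchical level, and the Forced Traffic Flow Law propagates throughput across levels via visit counts, X_i = V_i * X. A minimal sketch with illustrative numbers (not drawn from the thesis):

```python
def littles_law_population(throughput, response_time):
    return throughput * response_time        # Little's Law: N = X * R

def forced_flow_throughput(system_throughput, visits):
    return system_throughput * visits        # Forced Flow Law: X_i = V_i * X

X = 10.0                                     # top-level requests per second
X_db = forced_flow_throughput(X, visits=3)   # each request makes 3 DB calls
N_db = littles_law_population(X_db, 0.02)    # 20 ms per DB call
print(X_db, N_db)                            # 30.0 calls/s, 0.6 jobs resident
```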
FMC-QE (Fundamental Modeling Concepts for Quantitative Evaluation [1]) is a methodology, based on FMC, the Fundamental Modeling Concepts, for modeling the performance behavior of systems, together with a calculus for deriving performance predictions such as response times and throughput. At the center of this new methodology stands the modeling of hierarchical service requests, since they are the origin of all service-based systems. As in physics, service requests in FMC-QE are tuples of value and unit, which enables request transformations at hierarchy boundaries. Since complexity is reduced through decomposition into several views and different hierarchical layers, through the distinction between operational and control states, and through the corresponding calculations under the assumption of stationarity, the applicability of FMC-QE scales to complex systems. Following FMC, the system to be modeled is represented in a three-dimensional hierarchical description space. The quantitative characteristics of the systems are described in three hierarchical bipartite graphs of arbitrary granularity. The hierarchical structure of the service requests is described in entity-relationship diagrams. The static server structures, divided into logical and real servers, are depicted in block diagrams. In addition, Petri nets, more precisely colored timed Petri nets, are used to describe the dynamic behavior and the control flows of the system. A set of hierarchical equations is then derived from the structure and the parameters of the model. These equations, which presuppose the steady state of the system, are based on the two fundamental laws of performance analysis: Little's Law and the traffic flow law. Little's Law defines relationships within a hierarchical layer (horizontal), while the traffic flow law defines relationships between hierarchical layers (vertical). The calculations allow performance predictions for complex systems through an efficient computation of performance measures for a large selection of system and load configurations. Within the research group of Prof. Dr.-Ing. Werner Zorn at the Hasso Plattner Institute at the University of Potsdam, this work is embedded in a larger research context in the field of FMC-QE. While the focus here lies on the theoretical background, the description and definition of the methodology, as well as its applicability and extension, other work in this context addresses the development of an application for modeling and evaluating systems with FMC-QE, and the use of FMC-QE to develop an adaptive transport layer for maintaining quality of service and service level agreements in volatile service-based systems. This work comprises a survey of the state of the art, the description of FMC-QE, and the further development of FMC-QE through representative general models and case studies. Chapter 2, State of the Art, gives an overview of queueing theory, timed Petri nets, further performance description and prediction techniques, and the use of hierarchies in performance description techniques.

The description of FMC-QE in chapter 3 covers the foundations of FMC-QE, a number of basic assumptions, the graphical notation, the mathematical model, and an illustrative example. Chapter 4, Extensions of FMC-QE, describes the treatment of further general models, such as the model class of closed networks, synchronization, and multi-class models, and compares FMC-QE with the state of the art. Chapter 5 describes proof-of-concept case studies. Finally, chapter 6 gives a summary and an outlook.
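To make the two laws above concrete, here is a minimal sketch in Python. It is an illustration only, not the FMC-QE calculus: Little's Law (N = X · R) supplies the horizontal relation within one layer, and a flow law with assumed visit counts propagates arrival rates from a root service to its children. The hierarchy, visit counts, and service times are all invented.

```python
# Minimal illustration of the two laws named in the abstract; the service
# hierarchy, visit counts, and service times are invented for this sketch.

def littles_law_population(throughput: float, response_time: float) -> float:
    """Little's Law: N = X * R (horizontal relation within one layer)."""
    return throughput * response_time

# Traffic flow law (vertical): each top-level request fans out into
# child requests according to an assumed visit multiplicity.
top_level_rate = 10.0  # requests/s arriving at the root service (assumed)
children = {
    "parse":  {"visits": 1, "service_time": 0.020},  # seconds, assumed
    "query":  {"visits": 3, "service_time": 0.050},
    "render": {"visits": 1, "service_time": 0.030},
}

total_response = 0.0
for name, c in children.items():
    child_rate = top_level_rate * c["visits"]       # vertical: flow law
    demand = c["visits"] * c["service_time"]        # demand per root request
    total_response += demand
    print(f"{name}: arrival rate {child_rate:.1f}/s, demand {demand*1000:.0f} ms")

# Horizontal: Little's Law at the root (contention-free lower bound).
print("root population N =",
      littles_law_population(top_level_rate, total_response))
```

Summing the per-child demands gives a contention-free response-time bound; the full calculus layers queueing, control states, and multi-class effects on top of this bookkeeping.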
33

Holmgren, Åke J. "Quantitative vulnerability analysis of electric power networks". Doctoral thesis, KTH, Transporter och samhällsekonomi, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3969.

Abstract
Disturbances in the supply of electric power can have serious implications for everyday life as well as for national (homeland) security. A power outage can be initiated by natural disasters, adverse weather, technical failures, human errors, sabotage, terrorism, and acts of war. The vulnerability of a system is described as a sensitivity to threats and hazards, and is measured by P(Q(t) > q), i.e. the probability of at least one disturbance with negative societal consequences Q larger than some critical value q during a given period of time (0,t]. The aim of the thesis is to present methods for quantitative vulnerability analysis of electric power delivery networks, so that effective strategies for prevention, mitigation, response, and recovery can be developed. Paper I provides a framework for vulnerability assessment of infrastructure systems. The paper discusses concepts and perspectives for developing a methodology for vulnerability analysis, and gives examples related to power systems. Paper II analyzes the vulnerability of power delivery systems by means of statistical analysis of Swedish disturbance data. It is demonstrated that the size of large disturbances follows a power law, and that the occurrence of disturbances can be modeled as a Poisson process. Paper III models electric power delivery systems as graphs. Statistical measures for characterizing the structure of two empirical transmission systems are calculated, and a structural vulnerability analysis is performed, i.e. a study of the connectivity of the graph when vertices and edges are disabled. Paper IV discusses the origin of power laws in complex systems in terms of their structure and the dynamics of disturbance propagation. A branching process is used to model the structure of a power distribution system, and it is shown that the disturbance size in this analytical network model follows a power law. Paper V shows how the interaction between an antagonist and the defender of a power system can be modeled as a game. A numerical example is presented, and it is studied whether there exists a dominant defense strategy and whether there is an optimal allocation of resources between protection of components and recovery.
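As an illustration of the vulnerability measure defined above, the following hedged sketch estimates P(Q(t) > q) by Monte Carlo, assuming Poisson-distributed disturbance occurrences and power-law (Pareto) disturbance sizes as in Papers II and IV; the rate, exponent, and threshold are invented.

```python
# Hedged sketch: estimate P(Q(t) > q) when disturbances arrive as a Poisson
# process and sizes follow a power law (Pareto). All parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
rate = 12.0      # expected disturbances per year (assumed)
alpha = 1.8      # power-law tail exponent (assumed)
q = 100.0        # critical consequence, e.g. MWh unserved (assumed)
q_min = 1.0      # smallest recorded disturbance size (assumed)
years = 1.0
n_trials = 100_000

hits = 0
for _ in range(n_trials):
    n = rng.poisson(rate * years)                  # number of disturbances
    if n == 0:
        continue
    # Pareto sizes via inverse-CDF sampling: s = q_min * u**(-1/alpha)
    sizes = q_min * rng.random(n) ** (-1.0 / alpha)
    if sizes.max() > q:                            # at least one above q
        hits += 1

print(f"P(Q > {q}) over {years} year(s) ~ {hits / n_trials:.4f}")
```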
34

Rajaram, Sridhar. "Quantitative image analysis and polymer blend coalescence". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1996. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ45460.pdf.

35

Yale, Robert Nathan. "Instant Messaging Communication: A Quantitative Linguistic Analysis". Oxford, Ohio : Miami University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=miami1183663224.

36

Holmgren, Åke J. "Quantitative vulnerability analysis of electric power networks /". Stockholm : Division of Safety Research, Royal Institute of Technology, 2006. http://www.diva-portal.org/kth/theses/abstract.xsql?dbid=3969.

37

Folatelli, Gastón. "Type Ia supernova cosmology : quantitative spectral analysis /". Stockholm : Fysikum, Univ, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-80.

38

El-Ghobary, Ahmed Mohamed Eid. "Orwell's imagery : a structure and quantitative analysis". Thesis, Bangor University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306074.

39

Dean, Rebecca. "Analysis of triglycerides using quantitative NMR techniques". Thesis, Bangor University, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.440959.

40

McArt, Darragh Gerrard. "Systematic quantitative analysis of the comet assay". Thesis, University of Ulster, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.551559.

Abstract
The single cell gel electrophoresis assay (comet assay) has been widely used over the last 30 years. It is a robust method of estimating DNA movement under perturbation. In the assay, cells are embedded in agarose on a frosted slide. The amount of broken or fragmented DNA that is removed from the central body, after the cell has been lysed and electrophoresed, is estimated as the total volume of DNA damage in the form of strand breaks caused by the insult. The damage is quantified by fluorescence microscopy with analysis software. It is a simple technique that is efficient and relatively quick. Within the comet methodology there is much debate about the underlying mechanisms of cellular damage and repair, and about how strongly the comet shape reflects the DNA damage that has occurred. It is often used as a quick-fire detection method to back up other analysis methods, with very little thought given to the information it provides. The technique itself is open to experimenter bias at the microscopy level, where researchers have the freedom to select which comets to analyse. Addressing these issues, along with standardising the technique and developing novel methods of comet analysis, may enhance the information it provides. Using statistical and computational methods, it is proposed that a simulation of the comet would provide important information about DNA damage at a cellular level while offering impartial judgement and reproducibility. Some information on comet structure is available in the published literature, and new computational and experimental techniques have made it possible to examine intrinsic information on comet structure. Further information was established through novel and existing laboratory and microscopy techniques to examine comet architecture. A computational modelling and simulation approach can then be used to ask important biological questions on aspects of DNA damage and repair at a cellular level.
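As one concrete example of the comet metrics such analysis software commonly reports, the sketch below computes a tail moment (tail DNA fraction times tail length) from a one-dimensional intensity profile. The profile and the head/tail boundary are invented, and this is a generic illustration, not the thesis's simulation.

```python
# Hedged sketch of one standard comet metric (tail moment = tail DNA
# fraction x tail length); the profile and head/tail boundary are invented.
import numpy as np

profile = np.array([5, 40, 90, 100, 80, 30, 20, 15, 10, 6, 3], float)
head_end = 5                    # index where the comet head ends (assumed)

head, tail = profile[:head_end], profile[head_end:]
tail_dna_fraction = tail.sum() / profile.sum()
tail_length = len(tail)         # in pixels; scale comes from the microscope

tail_moment = tail_dna_fraction * tail_length
print(f"% DNA in tail: {100 * tail_dna_fraction:.1f}")
print(f"tail moment:   {tail_moment:.2f} px")
```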
41

Dwyer, V. M. "Elastic scattering in quantitative Auger/XPS analysis". Thesis, University of York, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.374164.

42

Waterhouse, Kaye Helen. "Quantitative palynofacies analysis of Jurassic climatic cycles". Thesis, University of Southampton, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.238879.

43

Harris, A. W. "Laser microprobe mass spectrometry - quantitative inorganic analysis". Thesis, University of Cambridge, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.233975.

Abstract
This thesis is concerned with the application of Laser Microprobe Mass Spectrometry (LAMMS) to the microanalysis of inorganic materials and in particular to the quantification of such analyses. The investigation consists of both an assessment of the capabilities of a LAMMS instrument, the Cambridge Mass Spectrometry LIMA 2A, and an attempt to correlate experimental results with theoretical predictions. The principles of the operation of a LAMMS instrument are discussed and a description of the LIMA 2A instrument is presented. A survey of the literature concerned with the interaction of a high-powered laser with a solid specimen to produce a plasma, the basis of the LAMMS technique, is included. Particular emphasis is given to the description of this interaction in terms of a model based on a local thermodynamic equilibrium (LTE) in the plasma. This model is applied to the conditions estimated to be produced in the LIMA instrument to make simple predictions of expected results. A discussion of the possible methods for converting LAMMS data into a quantitative analysis is given, along with a brief description of the statistical techniques used for data handling. Several model materials, principally chosen for their well-defined composition and known lateral homogeneity, are used in this work. These are single-crystal silicon, a binary copper-nickel alloy and three III-V semiconductors drawn from the Ga-In-As system. The effect of the instrument itself is investigated, with a view to establishing its contribution to the errors observed in the data. This is followed by an investigation of the variation of both the absolute and relative ion signals produced. The variations in the relative ion signals are then compared with the predictions of the LTE model in an attempt to establish its validity. This comparison is also used to estimate the conditions produced in the laser-induced plasma and their variation with specimen chemistry and laser power density. The general conclusions of the investigation are drawn together in a discussion of the preferred methods for the quantification of LAMMS data and the expected error in the resulting analysis. It is shown that, provided appropriate methods are used, the LAMMS technique can provide quantitative analyses with precisions of about 5-20%.
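The LTE description referred to above is commonly expressed through the Saha equation, which fixes ion-to-neutral ratios for a given plasma temperature and electron density. The sketch below evaluates that textbook relation under assumed conditions; it is illustrative only, not the thesis's model or its estimated LIMA conditions.

```python
# Hedged sketch: Saha equation for the ion/neutral ratio under LTE,
# a textbook relation often used in laser-plasma models. Conditions assumed.
import numpy as np
from scipy import constants as c

def saha_ratio(T, n_e, chi_eV, g_ratio=2.0):
    """Return n_ion / n_neutral from the Saha equation (SI units)."""
    lam_term = (2.0 * np.pi * c.m_e * c.k * T / c.h**2) ** 1.5
    rhs = 2.0 * g_ratio * lam_term * np.exp(-chi_eV * c.e / (c.k * T))
    return rhs / n_e

T = 8000.0        # plasma temperature in K (assumed)
n_e = 1e23        # electron density in m^-3 (assumed)
chi_Si = 8.15     # first ionization energy of Si in eV

print(f"n+/n0 for Si at {T:.0f} K: {saha_ratio(T, n_e, chi_Si):.3e}")
```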
44

Johnson, James. "Quantitative analysis of plant root system architecture". Thesis, University of Nottingham, 2018. http://eprints.nottingham.ac.uk/55601/.

Abstract
The root system of a plant is responsible for supplying it with essential nutrients. The plant's ability to explore the surrounding soil is largely determined by its root system architecture (RSA), which varies with both genetic and environmental conditions. X-ray micro computed tomography (µCT) is a powerful tool allowing the non-invasive study of the root system architecture of plants grown in natural soil environments, providing both 3D descriptions of root architecture and the ability to make multiple measurements over a period of time. Once volumetric µCT data is acquired, the root system must first be segmented from the surrounding soil environment and then described. Automated and semi-automated software tools can be used to extract roots from µCT images, but current methods for the recovery of RSA traits from the resulting volumetric descriptions are somewhat limited. This thesis presents a novel tool (RooTh) which, given a segmented µCT image, skeletonises the root system and quantifies global and local root traits with minimal user interaction. The computationally inexpensive method used takes advantage of curve-fitting and active contours to find the optimal skeleton and thus evaluate root traits objectively. A small-scale experiment was conducted to validate and compare root traits extracted using the method presented here alongside other 2D imaging tools. The results show a good degree of correlation between the two methods.
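The skeletonise-and-quantify step that such tools perform can be illustrated generically. The hedged sketch below applies scikit-image's skeletonize to a toy 2D root mask and derives two crude traits; it is not the RooTh implementation described in the thesis.

```python
# Hedged sketch of a skeletonise-and-measure step on a toy 2D root mask;
# this is generic scikit-image usage, not the RooTh tool from the thesis.
import numpy as np
from skimage.morphology import skeletonize

mask = np.zeros((40, 40), dtype=bool)
mask[5:35, 18:22] = True        # a thick vertical "primary root"
mask[20:24, 10:30] = True       # a thick horizontal "lateral root"

skeleton = skeletonize(mask)

# Crude global traits: skeleton pixel count approximates total root
# length (in pixels); mask area over skeleton length approximates width.
total_length_px = int(skeleton.sum())
mean_width_px = mask.sum() / max(total_length_px, 1)
print(f"total root length ~ {total_length_px} px")
print(f"mean root width  ~ {mean_width_px:.1f} px")
```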
45

Corts, Brandon (Brandon E. ). "Quantitative analysis of 200 meter track times". Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112577.

Abstract
Thesis: S.B., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 19).
Using male varsity Division 3 collegiate track and field results from the past decade, critical coaching decisions, such as optimizing meet scheduling, targeting efficient training programs to mimic, and identifying potential performance-influencing factors, can be made more easily. To reach these conclusions, 200-meter race times were normalized using seasonal athlete improvement factors and wind data to identify the facilities at which athletes tended to run faster times and the factors that make those facilities fast. It can be concluded that, given the variation among banked track and field facilities, the banked-to-flat track conversion factor implemented by the NCAA in 2012 is potentially too harsh for athletes competing on some banked indoor tracks compared to others. The data also have the potential for many other applications, such as identifying the highest-quality training programs, analyzing conversion factors and facility speed for races other than the 200-meter dash, and applying similar principles to variations in swimming facilities.
by Brandon Corts.
S.B.
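A minimal, hedged sketch of the normalisation idea described above: remove an assumed seasonal improvement trend and an assumed wind effect from each race time, then compare facility means. All factors and data below are invented for illustration and are not the thesis's fitted values.

```python
# Hedged sketch of the normalisation idea: remove a seasonal improvement
# trend and a wind effect, then compare facility means. Data are invented.
from collections import defaultdict

# (facility, week_of_season, wind_m_per_s, time_s)
races = [
    ("Track A", 2, +1.0, 22.40),
    ("Track A", 8, -0.5, 22.10),
    ("Track B", 3, +0.2, 22.65),
    ("Track B", 9, +1.5, 22.05),
]
improvement_per_week = 0.02   # s faster per week of training (assumed)
wind_coeff = 0.05             # s per m/s of tailwind (assumed)

by_facility = defaultdict(list)
for fac, week, wind, t in races:
    # Add back the training gain and the tailwind benefit to compare fairly.
    adjusted = t + improvement_per_week * week + wind_coeff * wind
    by_facility[fac].append(adjusted)

for fac, times in by_facility.items():
    print(f"{fac}: normalised mean {sum(times)/len(times):.2f} s")
```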
46

Feldstein, Michael J. (Michael Jordan). "Quantitative analysis and modeling of microembolic phenomena". Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/38327.

Abstract
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006.
Includes bibliographical references (p. 131-136).
This thesis explores parameters that govern microvascular occlusion secondary to embolic phenomena. Bulk and individual properties of microembolic particles were characterized using light microscopy, SEM and optically-based particle analysis. Particulate probability distributions were created from imaging data using Matlab. Size distribution, volume, morphology and chemical properties were quantified using in-vivo and model flow systems to correlate particulate characteristics and occlusive efficacy. This study focused on novel expandable/deformable polyacrylic acid microspheres (PAA-MS) for use as catheter-deliverable therapeutic emboli. These emboli expand in aqueous media such as blood and remain unexpanded in custom delivery media. The techniques developed to investigate therapeutic microemboli were applied to the analysis of clot dissolution byproducts. PAA-MS expand volumetrically in seconds when placed in aqueous environments. PAA-MS were modified to resist fragmentation based upon failure analysis. Degradation testing demonstrated PAA-MS chemical stability. Charge characteristics inherent to the PAA-MS acid matrix were leveraged to develop low-viscosity media that prevent expansion. Cationic dyes were found that bind the charged matrix within PAA-MS to enhance visualization.
Unexpanded PAA-MS were delivered through standard catheters and microcatheters at concentrations that induce durable occlusions. Non-expandable microspheres could not be delivered through microcatheters. PAA-MS required less embolic mass to occlude in-vitro flow systems at significantly higher pressures than non-expandable microspheres. Preliminary biocompatibility tests demonstrated safety and PAA-MS were able to occlude both porcine renal and coronary vasculature in-vivo. An ultrasonic clot dissolution device generated microemboli from synthetic acellular fibrin-only clots and whole blood clots. The average particle size for whole blood clots was less than 100 microns and acellular clots produced larger average emboli than whole blood clots, indicating that cellular components may limit thromboembolic size. The expandable/deformable properties of PAA-MS allow them to traverse microcatheters when unexpanded. Once in blood, PAA-MS expand 140-fold to create a space-filling, pressure resistant occlusion.
These results have implications for intravascular embolization procedures where smaller catheters minimize vasospasm and allow more precise targeting while stronger occlusions resist occlusive breakdown and associated distal embolization. These embolic improvements could reduce procedural complications while increasing efficacy. Future work will solidify correlations between microembolic properties, microvascular occlusion and tissue infarction.
by Michael J. Feldstein.
S.M.
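The step of turning particle images into size distributions can be sketched generically. The thesis used Matlab; a scikit-image analogue, with an invented binary image and calibration, might look like this:

```python
# Hedged sketch: particle size distribution from a binary image, a generic
# scikit-image analogue of the Matlab pipeline described in the abstract.
import numpy as np
from skimage.measure import label, regionprops

image = np.zeros((60, 60), dtype=bool)
image[5:15, 5:15] = True        # toy "particle" 1
image[30:34, 30:34] = True      # toy "particle" 2
image[45:47, 10:12] = True      # toy "particle" 3

labels = label(image)           # connected-component labelling
diameters = [r.equivalent_diameter for r in regionprops(labels)]

microns_per_px = 2.5            # pixel calibration (assumed)
sizes_um = sorted(d * microns_per_px for d in diameters)
print("equivalent diameters (um):", [f"{s:.1f}" for s in sizes_um])
print("mean size (um):", f"{np.mean(sizes_um):.1f}")
```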
47

Lammerding, Jan 1974. "Quantitative analysis of subcellular biomechanics and mechanotransduction". Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/18039.

Abstract
Thesis (Ph. D.)--Massachusetts Institute of Technology, Biological Engineering Division, 2004.
Includes bibliographical references.
Biological cells such as endothelial or muscle cells respond to mechanical stimulation with activation of specific intracellular and extracellular signaling pathways and cytoskeletal remodeling, a process termed mechanotransduction. Intracellular mechanosensors are thought to be activated by conformational changes induced by local cellular deformations. Since these mechanosensors have been speculated to be located in several cellular domains including the cell membrane, the cytoskeleton, and the nucleus, it is necessary to achieve a detailed understanding of subcellular mechanics. In this work, we present novel methods to independently quantify cytoskeletal displacements, mechanical coupling between the cytoskeleton and the extracellular matrix, and nuclear mechanics based on high resolution tracking of cellular structures and receptor bound magnetic beads in response to applied strain or microscopic forces. These methods were applied to study the effects of several human disease associated mutations on subcellular mechanics and to examine the interaction between known protein function and specific changes in cellular mechanical properties and mechanotransduction pathways. Initial experiments were targeted at the role of membrane adhesion receptors. Experiments with cells expressing a mutant form of the integrin-associated molecule tetraspanin CD151 revealed that CD151 plays a key role in selectively strengthening α6β1 integrin-mediated adhesion to laminin-1. We then studied cytoplasmic behavior using cells from mice with an αB-Crystallin mutation (R120G) that causes desmin-related myopathy. These studies showed impaired passive cytoskeletal mechanics in adult mouse cardiac myocytes. Finally, we studied cells deficient in the nuclear envelope protein lamin A/C and showed that lamin A/C deficient cells have increased nuclear deformation, defective mechanotransduction, and impaired viability under mechanical strain, suggesting that the tissue-specific effects observed in laminopathies such as Emery-Dreifuss muscular dystrophy or Hutchinson-Gilford progeria may arise from varying degrees of impaired nuclear mechanics and transcriptional regulation. In conclusion, our methods provide new and valuable tools to examine the role of subcellular biomechanics on mechanotransduction in normal and mutant cells, leading to improved understanding of disease mechanisms associated with altered cell mechanics.
by Jan Lammerding.
Ph.D.
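The bead-tracking idea underlying such measurements can be sketched with generic template matching: locate the same bright bead in a "before" and "after" frame and report its displacement. The frames below are synthetic, and this is only an illustration of the general approach, not the thesis's method.

```python
# Hedged sketch of template-matching bead tracking on synthetic frames;
# the real work used high-resolution microscopy, this only shows the idea.
import numpy as np
from skimage.feature import match_template

rng = np.random.default_rng(1)
frame0 = rng.normal(0.0, 0.05, (64, 64))
frame0[20:26, 30:36] += 1.0                           # bright "bead"
frame1 = np.roll(frame0, shift=(3, -2), axis=(0, 1))  # bead moves by (3, -2)

template = frame0[18:28, 28:38]              # patch around the bead
corr = match_template(frame1, template)
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Peak position minus the template's original top-left corner = displacement.
print("displacement (rows, cols):", (dy - 18, dx - 28))
```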
48

Aiuchi, Masaharu. "An empirical analysis of quantitative trading strategies". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/44439.

Abstract
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management, 2008.
Includes bibliographical references (p. 277-280).
Along with increasing computing power, the growing availability of various data streams, the introduction of electronic exchanges, decreasing trading costs, and heating-up competition in the financial investment industry, quantitative trading strategies, or quantitative trading rules, have evolved rapidly over the past few decades. They challenge the Efficient Market Hypothesis by trying to forecast future price movements of risky assets from historical market information in algorithmic or statistical ways. They try to find patterns or trends in the historical data and use them to beat the market benchmark. In this research, I introduce several quantitative trading strategies and investigate their performance empirically, i.e. by executing back-tests that treat the S&P 500 stock index as a risky asset to trade. The strategies utilize the historical data of the stock index itself, trading volume movement, risk-free rate movement and implied volatility movement in order to generate buy or sell trading signals. I then attempt to articulate and decompose the source of success of some strategies in the back-tests into several factors, such as trend patterns or relationships between market information variables, in an intuitive way. Some strategies recorded higher performance than the benchmark in the back-tests; however, it remains a problem how to distinguish these winning strategies from the losers beforehand, at the beginning of the investment horizon. Human discretion, such as a macro view of future market trends, is still considered to play an important role if quantitative trading is to be successful in the long run.
by Masaharu Aiuchi.
M.B.A.
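As a minimal example of the strategy family surveyed here, the hedged sketch below backtests a moving-average crossover rule on synthetic prices; it is not one of the thesis's strategies, and all parameters and data are invented.

```python
# Hedged sketch: moving-average crossover backtest on synthetic prices,
# one of the simplest examples of the strategy family the thesis surveys.
import numpy as np

rng = np.random.default_rng(42)
returns = rng.normal(0.0003, 0.01, 1000)      # synthetic daily log returns
prices = 100.0 * np.exp(np.cumsum(returns))

fast = np.convolve(prices, np.ones(20) / 20, mode="valid")
slow = np.convolve(prices, np.ones(50) / 50, mode="valid")
fast = fast[len(fast) - len(slow):]           # align the window end dates

# Long when the fast MA is above the slow MA, flat otherwise; trade the
# signal on the *next* day to avoid lookahead bias.
position = (fast > slow).astype(float)[:-1]
daily_ret = np.diff(np.log(prices[-len(slow):]))
strategy_ret = position * daily_ret

print(f"buy & hold: {np.exp(daily_ret.sum()) - 1:+.1%}")
print(f"strategy:   {np.exp(strategy_ret.sum()) - 1:+.1%}")
```

The one-day signal lag is the detail that separates an honest backtest from an inadvertent lookahead; the thesis's point about distinguishing winners from losers beforehand applies regardless.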
49

Macura, Tomasz Jakub. "Automating the quantitative analysis of microscopy images". Thesis, University of Cambridge, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.611330.

50

Starr, John Michael. "Quantitative analysis of the Aramaic Qumran texts". Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/8947.

Abstract
Hirbet-Qumran lies about 15 km south of Jericho. Between 1947 and 1956 eleven caves were discovered that contained thousands of fragments, mainly of prepared animal skin, representing approximately 900 texts, many of which are considered copies. Over 100 of these texts are in Aramaic, though many are short fragments. The provenance of these texts is uncertain, but the fact that copies of some of them are known from beyond Qumran indicates that not all the Aramaic texts found at Qumran are likely to have originated there, though Qumran may have been a site where they were copied. Hitherto, this Aramaic corpus has been referred to as a single linguistic entity, but it is increasingly recognized that its textual features are heterogeneous. Such heterogeneity may provide clues as to the origins of different texts. The current classification of Qumran texts is in terms of the cave in which they were found and the order in which they were found: it does not provide any information about textual relationships between texts. In this thesis I first review the literature to identify suitable quantitative criteria for classification of Aramaic Qumran texts. Second, I determine appropriate statistical methods to classify edited Aramaic Qumran texts according to quantitative textual criteria. Third, I establish ‘proof of principle’ by classifying Hebrew Bible books. Fourth, I classify the Aramaic Qumran texts as a corpus without reference to external reference texts. Fifth, I investigate the alignment of this internally derived classification with external reference texts. Sixth, I perform confirmatory analyses to determine the number of different text-type groups within the corpus. Seventh, I use this classification to allocate previously unclassified texts to one of these text-types. Finally, I examine the textual characteristics of these different text-types and discuss what this can tell scholars about individual texts and the linguistic development of Aramaic during Second Temple Judaism as reflected by Qumran.
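The classification idea, representing each text by normalised feature frequencies and grouping texts statistically, can be sketched in miniature. The features and counts below are invented, and Ward hierarchical clustering is only one plausible method, not necessarily the one used in the thesis.

```python
# Hedged sketch: cluster texts by normalised feature frequencies, a
# miniature of the corpus classification idea; all counts are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

texts = ["text_A", "text_B", "text_C", "text_D"]
# Rows: texts; columns: counts of some orthographic/lexical features.
counts = np.array([
    [30,  5, 12,  2],
    [28,  6, 11,  3],
    [ 4, 20,  2, 15],
    [ 5, 22,  3, 14],
], dtype=float)

freqs = counts / counts.sum(axis=1, keepdims=True)   # per-text frequencies
Z = linkage(freqs, method="ward")                    # hierarchical clustering
groups = fcluster(Z, t=2, criterion="maxclust")      # ask for two text-types

for name, g in zip(texts, groups):
    print(f"{name}: text-type {g}")
```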