Dissertations / Theses on the topic 'Correlation analysis'

To see the other types of publications on this topic, follow the link: Correlation analysis.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Correlation analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Bhatta, Sanjeev. "Conditional Correlation Analysis." Wright State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=wright1495963166600514.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Orvalho, André. "Botnet Detection by Correlation Analysis." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-105096.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
When a bot master uses a command and control (C&C) mechanism to assemble a large number of bots, infecting them through well-known vulnerabilities, it forms a botnet. Botnets can vary in C&C architecture (centralized C&C or P2P are the most common), in the communication protocols used (IRC, HTTP, or others such as P2P), and in observable botnet activities. They are nowadays one of the largest threats to cyber security, and it is very important to characterize botnets in order to detect them, just as a hunter needs to know its prey before devising methods to catch it. There are two important places to look for botnet activity: the network and the infected host. This project presents a study that correlates behaviour on the network with behaviour on the host in order to aid detection; studies such as [SLWL07] (based on network behaviour) and [SM07] (based on host behaviour) are two good starting points for the research. The architecture was chosen by looking at botnet characteristics, especially their capacity to change and evolve, which makes misuse-based detection methods obsolete. The system is designed to first look at four features of system calls on the host side: which system call it is, the name of the application issuing it, the time elapsed since the previous system call, and the sequence of the past three system calls. An unsupervised learning technique (the k-means algorithm) is used to calculate the threshold values from an unclassified training set. In real-world operation, the same features are computed from collected data and compared against the threshold. If the threshold is exceeded, the necessary information is passed to the network evaluation block. On the network side, before receiving any data from the host side, the system calculates a threshold for the flows in the training set. Using the data from the host to narrow down the number of flows to examine, it verifies whether their values exceed the threshold. The feature used to calculate the threshold is the time between flows. If the network evaluation block finds flows that exceed its threshold, it emits reports and alarms to the user. The small-scale experiments performed show some promising signs for real-world use, even though much further testing is needed, especially on the network side. The prototype shows some limitations that can be overcome by further testing and by using other techniques to evolve it.
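The host-side thresholding step described in the abstract above can be sketched numerically. This is an illustrative reconstruction, not the thesis's code: a deterministic 1-D k-means with k = 2 over one of the four features (the time between system calls), with the detection threshold placed midway between the two cluster centres. The feature values and the choice of k are assumptions made for this example.

```python
import numpy as np

# 1-D k-means (k = 2) over inter-call times from an unlabeled training set;
# the detection threshold is the midpoint between the two cluster centres.
def kmeans_threshold(values, iters=50):
    v = np.asarray(values, dtype=float)
    centres = np.array([v.min(), v.max()])  # deterministic initialisation
    for _ in range(iters):
        # assign each sample to its nearest centre, then recompute centres
        labels = np.abs(v[:, None] - centres[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centres[k] = v[labels == k].mean()
    return centres.mean()  # midpoint between the two centres

# unclassified training set: short gaps (typical) plus a few long gaps
train = [0.10, 0.20, 0.15, 0.12, 5.0, 5.2, 4.8]
threshold = kmeans_threshold(train)
print(0.20 < threshold < 4.8)  # the threshold separates the two groups
```

In deployment, any observed inter-call time above the learned threshold would be forwarded to the network evaluation block, as the abstract describes.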
3

Rossi, Claudio Alexander. "Empirical Analysis of Implied Equity Correlation." St. Gallen, 2009. http://www.biblio.unisg.ch/org/biblio/edoc.nsf/wwwDisplayIdentifier/01653419003/$FILE/01653419003.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Lai, Pei Ling. "Neural implementations of canonical correlation analysis." Thesis, University of the West of Scotland, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.311771.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Droop, Alastair Philip. "Correlation Analysis of Multivariate Biological Data." Thesis, University of York, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.507622.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Samarov, Daniel V. Marron James Stephen. "The analysis and advanced extensions of canonical correlation analysis." Chapel Hill, N.C. : University of North Carolina at Chapel Hill, 2009. http://dc.lib.unc.edu/u?/etd,2205.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2009.
Title from electronic title page (viewed Jun. 26, 2009). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Statistics and Operations Research." Discipline: Statistics and Operations Research; Department/School: Statistics and Operations Research.
7

Nicovich, Philip R. "Widefield fluorescence correlation spectroscopy." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33849.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Fluorescence correlation spectroscopy has become a standard technique in modern biophysics and single-molecule spectroscopy research. Presented here is a novel widefield extension of the established single-point technique. Flow in microfluidic devices was used as a model system for microscopic motion, and through widefield fluorescence correlation spectroscopy flow profiles were mapped in three dimensions. The technique is shown to be more tolerant of low signal strength, allowing image data with signal-to-noise values as low as 1.4 to produce accurate flow maps, and permitting dye-labeled single antibodies to be used as flow tracers. With proper instrumentation, flows along the axial direction can also be measured. Widefield fluorescence correlation spectroscopy has also been utilized to produce super-resolution confocal microscopic images, relying on the single-molecule microsecond blinking dynamics of fluorescent silver clusters. A method for fluorescence-modulation signal extraction, as well as the synthesis of several novel noble-metal fluorophores, is also presented.
8

Gou, Zhenkun. "Canonical correlation analysis and artificial neural networks." Thesis, University of the West of Scotland, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.269409.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lykou, Anastasia. "Sparse canonical correlation analysis using the Lasso." Thesis, Lancaster University, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.533099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Moore, Julie Carolyn. "Comparisons of correlation methods in risk analysis." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-06102009-063246/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Felix, Pascal Georges. "Characterization and correlation analysis of pharmaceutical gelatin." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000494.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Hotchkiss, Alastair Jeremy. "Generalised cross correlation functions for physical applications." Thesis, University of Exeter, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.262492.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

FRAZER, SCOTT RAYMOND. "INFORMATION EXTRACTION IN CHROMATOGRAPHY USING CORRELATION TECHNIQUES." Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/187978.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
While research into improving data quality from analytical instrumentation has gone on for decades, only recently has research been done to improve information-extraction methods. One of these methods, correlation analysis, is based upon shifting one function relative to another and determining a correlation value for each displacement. The cross-correlation algorithm allows one to compare two data sets and find the similarities that exist; the convolution operation combines two functions (e.g., any input into an analytical instrument convolves with that instrument's response to give the output); and deconvolution separates functions that have been convolved together. In correlation chromatography, multiple injections are made into a chromatograph at a rate that overlaps the instrument response to each injection. Injection intervals must be made as random as possible within limits set by peak widths and number. When the input pattern is deconvolved from the resulting output, the effect of that input is removed, giving the instrument response to a single injection. Since the operation averages all the information in the output, random noise is diminished and signal-to-noise ratios are enhanced. The most obvious application of correlation chromatography is in trace analysis. Signal-to-noise enhancements may be maximized by treating the output data (for example, with a baseline subtraction) before the deconvolution operation. System nonstationarities such as injector nonreproducibility and detector drift cause baseline or "correlation" noise, which limits attainable signal-to-noise enhancements to about half of what is theoretically possible. Correlation noise has been used to provide information about changes in system conditions. For example, a given concentration change that occurs over the course of a multiple-injection sequence causes a reproducible correlation noise pattern; doubling the concentration change will double the amplitude of each point in the noise pattern. This correlation noise is much more amenable to computer analysis and, since it is still the result of signal averaging, the effect of random fluctuations and noise is reduced. A method for simulating conventional coupled-column separations by means of time-domain convolution of chromatograms from single-column separations is presented.
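The core deconvolution idea in the abstract above can be sketched numerically. Here a random ±1 sequence stands in for the pseudo-random injection pattern, circular convolution stands in for the chromatograph, and circular cross-correlation recovers the single-injection response; the peak shape and sequence length are invented for illustration.

```python
import numpy as np

# Toy correlation chromatography: detector output = convolution of a
# pseudo-random injection pattern with the single-injection response;
# cross-correlating the output with the pattern recovers that response.
rng = np.random.default_rng(0)
n = 256
response = np.zeros(n)
response[20], response[21] = 1.0, 0.5        # single-injection "peak"
pattern = rng.choice([-1.0, 1.0], size=n)    # pseudo-random injection sequence

# circular convolution: what the detector records over the whole sequence
output = np.fft.ifft(np.fft.fft(pattern) * np.fft.fft(response)).real

# circular cross-correlation of output with pattern approximates response
recovered = np.fft.ifft(np.fft.fft(output) * np.conj(np.fft.fft(pattern))).real / n

print(int(np.argmax(recovered)))   # position of the recovered peak
```

Because the ±1 pattern's autocorrelation is sharply peaked at zero lag, the cross-correlation averages the whole record into one response estimate, which is why random noise is suppressed, as the abstract notes.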
14

Castleberry, Alissa. "Integrated Analysis of Multi-Omics Data Using Sparse Canonical Correlation Analysis." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu15544898045976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Jablonski, David A. "NDVI and panchromatic image correlation using texture analysis." Thesis, Monterey, California : Naval Postgraduate School, 2010. http://edocs.nps.edu/npspubs/scholarly/theses/2010/Mar/10Mar%5FJablonski.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Thesis (M.S. in Space Systems Operations)--Naval Postgraduate School, March 2010.
Thesis Advisor(s): Olsen, Richard C. Second Reader: Trask, David M. "March 2010." Description based on title screen as viewed on April 28, 2010. Author(s) subject terms: NDVI, Panchromatic, Texture analysis, Kelp Detection. Includes bibliographical references (p. 69-70). Also available in print.
16

Sergeev, Mikhail. "High order autocorrelation analysis in image correlation spectroscopy." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81437.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis studies optical-microscopy-based high-order autocorrelation approaches for measuring molecular aggregation of fluorescently labeled particles in fluid systems. As the particles randomly diffuse into and out of the volume defined by the focus of a confocal laser beam, the collected fluorescence intensity fluctuates. Fluorescence Correlation Spectroscopy (FCS) and Image Correlation Spectroscopy (ICS) have been used as methods that analyse these temporal and spatial intensity fluctuations and provide quantitative information on molecular transport processes. Theoretical expressions for the high-order autocorrelation function magnitudes for a non-interactive model are derived, as well as their fitting equations for single- and multicomponent diffusion. We present an experimental verification of the model applied to simple systems. Solutions of fluorescent microspheres of well-defined size were imaged using confocal laser scanning microscopy. It is shown that translational diffusion coefficients are not very sensitive to molecular size dispersion, which makes a first-order autocorrelation approach somewhat ineffective for dealing with multicomponent systems. We demonstrate that the number densities of a mixture of two fluorescent particles can be determined by analyzing the higher-order autocorrelation function magnitudes. Numerical simulations were analyzed to test the experimental tools we use. The technique outlined may be developed to detect and characterize aggregates of fluorescently labeled biological molecules such as membrane proteins and cell-surface receptors. Such quantitative aggregation measurements can therefore provide information about the mechanism of intercellular signaling, which is believed to depend on the oligomerization of cell-membrane protein receptors.
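The idea of using higher-order statistics to resolve a two-component mixture, as in the abstract above, can be illustrated with intensity moments of Poisson-distributed particle numbers: the mean alone cannot separate two species, but adding the second cumulant gives two equations for the two unknown number densities. The brightnesses and densities below are invented for this sketch.

```python
import numpy as np

# Two species with brightnesses q1, q2 and mean occupancies n1, n2 per
# observation volume. For independent Poisson numbers:
#   mean(I) = q1*n1 + q2*n2,   var(I) = q1**2*n1 + q2**2*n2
# so the first two cumulants determine both densities.
rng = np.random.default_rng(1)
q1, q2 = 1.0, 5.0            # molecular brightness of each species (assumed)
n1, n2 = 10.0, 2.0           # true mean particle numbers (assumed)
pixels = 200_000

counts = (q1 * rng.poisson(n1, pixels) + q2 * rng.poisson(n2, pixels)).astype(float)

mean = counts.mean()
var = counts.var()

# solve the 2x2 linear system for the two number densities
A = np.array([[q1, q2], [q1**2, q2**2]])
n1_hat, n2_hat = np.linalg.solve(A, np.array([mean, var]))
print(round(n1_hat, 1), round(n2_hat, 1))
```

The thesis works with the full high-order autocorrelation magnitudes rather than this zero-lag moment shortcut, but the leverage is the same: each additional order adds an independent constraint on the composition.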
17

A, Kuzmenko. "CORRELATION-EXTREME NAVIGATION SYSTEM BASED ON MORPHOLOGICAL ANALYSIS." Thesis, ПОЛІТ.Сучасні проблеми науки.Гуманітарні науки:тези доповідей XVII Міжнародної науково-практичної конференції молодих учених і студентів:[y 2-x т.].Т.2(м.Київ,4-7 квітня 2017 р.)/[ред.кол.:В.М.Ісаєнко та ін.]; Національний авіаційний університет.-К.:НАУ,2017.-374 с, 2017. http://er.nau.edu.ua/handle/NAU/27745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The advantage of the morphological method is therefore associated with the possibility of improving image registration under varying imaging conditions. The notion of «form» introduced by morphological analysis significantly enriches the radiometric properties of the reference image, making it possible to build more robust detection algorithms.
18

Liang, Yiming. "Analysis of Paperboard Performance using Digital Image Correlation." Thesis, KTH, Hållfasthetslära, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-277799.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The performance of paperboard materials in packaging applications has long been investigated and evaluated, because it plays a decisive role in product protection and decoration. Potential damage during transportation sometimes affects the consistency of this performance, so the capability of the material to resist such external disturbances is of interest. A multi-ply paperboard was chosen as the experimental material. The analysis conducted in this thesis aimed to reveal the tensile behaviour in the cross-machine direction (CD) of the material under various kinds of local or global changes. The changes included global and local climate variations, cutouts, and regional weakening and strengthening, which were applied during the intervals between preloading and reloading. The digital image correlation (DIC) analysis computed the time-varying strain fields from the grey-level information contained in the recorded videos of the loading processes. The generated strain fields were then imported for post-analysis. Comparison between comparable stages (two stages with the same average strain value from different loading sections) was used as the scheme for isolating the influences of the changes and investigating them individually. The cosine image similarity method and the eigenface algorithm were used to validate this scheme, while directional average calculation and the strain-field compensation method were introduced to realize the isolation. Differences between the front and back outer plies of the paperboard sheets were detected individually. Moreover, both global and local climate changes affected the strain distributions of the specimens proportionally, on account of the moisture ratio within the material. In addition, invisible mechanical weakening and strengthening were clearly captured by the analysis, causing strain concentrations due to the uneven distribution of expansion capability. Relaxation and bending during unloading were two of the primary disturbing factors in all deformed specimens, related to time and bending direction respectively.
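The cosine image similarity check mentioned in the abstract above amounts to treating two strain fields as flattened vectors and comparing their directions. A minimal sketch follows; the tiny 2×2 fields and their values are invented for illustration.

```python
import numpy as np

# Cosine similarity between two strain fields, each treated as a vector:
# 1 means the same strain pattern (up to scale), lower values mean the
# spatial distribution of strain has changed.
def cosine_similarity(field_a, field_b):
    a, b = np.ravel(field_a), np.ravel(field_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

base = np.array([[0.01, 0.02], [0.03, 0.04]])             # stage-1 strain field
scaled = 1.5 * base                                        # same pattern, higher load
perturbed = base + np.array([[0.05, 0.0], [0.0, -0.05]])  # local concentration added

print(cosine_similarity(base, scaled))      # ~1: identical pattern
print(cosine_similarity(base, perturbed))   # noticeably below 1
```

Scale invariance is what makes this measure suitable for comparing "comparable stages" at the same average strain: a uniform increase in strain leaves the similarity at 1, while a local redistribution lowers it.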
19

Marshall, David Fielding 1965. "Correlation based analysis of a generalized target description." Thesis, The University of Arizona, 1993. http://hdl.handle.net/10150/278292.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis presents a discrete time implementation and analysis of the generalized impulse response model proposed by Altes. This model is a generalization of the weighted sum of time delayed delta functions typically used to describe the impulse response of scattering targets. The target chosen for the analysis is the rigid acoustic sphere. The coefficients of the generalized model are used to calculate estimates of the radius of the sphere, which is known in advance for testing purposes. The accuracy of the radius estimates indicates the accuracy of the model coefficients. The generalized model is shown to be superior from the standpoint of radius estimation. An estimator for the time of arrival of a signal of unknown but deterministic form is derived. It is based upon a generalized likelihood ratio test whose structure accommodates the generalized model. This estimator performs well in high levels of noise.
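The time-of-arrival estimation step in the abstract above can be sketched with a simple matched-filter correlator: slide the known signal form across the noisy record and take the position of maximum correlation. The thesis's generalized likelihood ratio estimator is more elaborate; the template shape, true delay, and noise level here are invented for illustration.

```python
import numpy as np

# Matched-filter time-of-arrival estimate: correlate the known template
# against the record at every start position and pick the maximum.
rng = np.random.default_rng(2)
template = np.array([1.0, 2.0, 3.0, 2.0, 1.0])     # known signal form (assumed)
true_delay = 40

record = 0.1 * rng.standard_normal(100)            # measurement noise
record[true_delay:true_delay + len(template)] += template

# 'valid' mode evaluates the template at every possible start position
scores = np.correlate(record, template, mode="valid")
print(int(np.argmax(scores)))   # estimated arrival time
```

At high signal-to-noise ratio this picks out the true delay; the thesis's point is that a generalized target model keeps the estimator accurate when the echo shape itself is uncertain.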
20

Budimir, Iva <1992>. "Stochastic Modeling and Correlation Analysis of Omics Data." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amsdottorato.unibo.it/9792/1/Budimir_Iva_tesi.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
We studied the properties of three different types of omics data: protein domains in bacteria, gene length in metazoan genomes and methylation in humans. Gene elongation and protein domain diversification are some of the most important mechanisms in the evolution of functional complexity. For this reason, the investigation of the dynamic processes that led to their current configuration can highlight important aspects of genome and proteome evolution and consequently of the evolution of living organisms. The potential of methylation to regulate the expression of genes is usually attributed to groups of close CpG sites. We performed a correlation analysis to investigate the collaborative structure of all CpGs on chromosome 21. The long-tailed distributions of gene length and protein domain occurrences were successfully described by the stochastic evolutionary model and fitted with the Poisson log-normal distribution. This approach included both demographic and environmental stochasticity and Gompertzian density regulation. The parameters of the fitted distributions were compared at the evolutionary scale. This allowed us to define a novel protein-domain-based phylogenetic method for bacteria, which performed well at the intraspecies level. In the context of gene length distribution, we derived a new generalized population dynamics model for diverse subcommunities, which allowed us to jointly model both coding and non-coding genomic sequences. A possible application of this approach is a method for differentiation between protein-coding genes and pseudogenes based on their length. General properties of the methylation correlation structure were first analysed for a large data set of healthy controls and later compared to a Down syndrome (DS) data set. The CpGs demonstrated strong group behaviour even across large genomic distances. The detected differences in DS were surprisingly small, possibly because the small DS sample size reduced the power of the statistical analysis.
21

Habib, Farhat Abbas. "Genotype-phenotype correlation using phylogenetic trees." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1187297400.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Liu, Xiaoli. "Spatial Correlation Study on Hybrid Electric Vehicle Adoption." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1397646595.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Chen, Zhe Haykin Simon S. "Stochastic approaches for correlation-based learning." *McMaster only, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
24

Wiedemann, Eric A. "Reducing variance between two systems by inducing correlation." Thesis, Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/23345.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Yamaguchi, David K. "Interpretation of Cross Correlation Between Tree-Ring Series." Tree-Ring Society, 1986. http://hdl.handle.net/10150/261724.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Ginn, Patrick W. "Correlation analysis of fleet information warfare center network incidents." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA396275.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, June 2001.
Thesis advisors, Raymond Buettner, Dan C. Boger. Includes bibliographical references (p. 51). Also available online.
27

Nilsson, Erik. "Correlation Analysis of Calcium Signalling Networks in Living Cells." Thesis, KTH, Applied Physics, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-10882.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:

In living cells, calcium ions (Ca2+) play an important role as an intracellular second messenger. Ca2+ mediates the regulation of cellular processes such as gene expression, initiates vesicle fusion in synapses, drives muscle contraction, and is believed to play a fundamental role in synaptic plasticity as a molecular substrate for learning. Ca2+ signals arise because the concentration of Ca2+ in the cytosol is four orders of magnitude lower than in the extracellular fluid and in cytoplasmic compartments such as the endoplasmic reticulum (ER). This enables fast increases in the cytosolic concentration, which is regulated back to the normal concentration by different mechanisms. In this project, the connection between the Ca2+ signals of different cells was analysed using different correlation techniques: cross-correlation of continuous signals and of digitised signals. To this end, a software tool was developed in MATLAB that takes Ca2+ recordings from time-lapse fluorescence microscopy as input and calculates the pairwise correlation for all cells. The software was tested using previous data from experiments with embryonic stem cells from mouse (mES) and human (hES), as well as data from recordings done as part of the project. The study shows that the mathematical method of cross-correlation can successfully be applied to quantitative and qualitative analysis of Ca2+ signals. Furthermore, strongly correlated cells exist in colonies of mES and hES cells. We suggest that the synchronisation is achieved by physical coupling, implying a decrease of correlation as the distance increases for strong correlations. In addition, the lag used by the cross-correlation function (an effective phase shift) decreases as the correlation coefficient increases, and increases with intercellular distance for high correlation coefficients. Interestingly, the number of cells included in small-scale clusters of strongly correlated cells is significantly larger for differentiating mES cells than for proliferating mES cells. In a broader perspective, the developed software might be used, for instance, in the analysis of cellular electrical activity, and it shows the relevance of applying methods from the exact sciences to biology.
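The pairwise lagged cross-correlation described in the abstract above (the thesis tool is in MATLAB) can be sketched in Python: normalise both traces, scan a window of lags, and return the lag with the highest correlation coefficient. The sinusoidal traces and the 5-frame shift are invented for illustration.

```python
import numpy as np

# Normalised cross-correlation between two Ca2+ traces over a window of
# lags; returns (best lag, correlation coefficient at that lag).
def best_lag(x, y, max_lag):
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    best = (0, -np.inf)
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:            # y delayed relative to x by `lag` frames
            a, b = x[:-lag], y[lag:]
        elif lag < 0:
            a, b = x[-lag:], y[:lag]
        else:
            a, b = x, y
        r = float(np.mean(a * b))
        if r > best[1]:
            best = (lag, r)
    return best

t = np.arange(200)
cell_a = np.sin(t / 10.0)          # oscillating Ca2+ trace (invented)
cell_b = np.roll(cell_a, 5)        # cell_b follows cell_a with a 5-frame lag
lag, r = best_lag(cell_a, cell_b, 20)
print(lag, r)
```

Run over all cell pairs in a colony, the (lag, coefficient) pairs give exactly the quantities the abstract relates to intercellular distance.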
28

Lei, Song. "Informative correlation extraction from and for Forex market analysis." AUT University, 2010. http://hdl.handle.net/10292/899.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The forex market is a complex, evolving, non-linear dynamical system, and forecasting it is difficult due to high data intensity, noise/outliers, unstructured data and a high degree of uncertainty. However, the exchange rate of a currency is often found to be surprisingly similar to the history or the variation of an alternative currency, which implies that correlation knowledge is valuable for forex market trend analysis. In this research, we propose a computational correlation analysis for the intelligent extraction of correlations from all available economic data. The proposed correlation is a synthesis of channel correlation and weighted Pearson's correlation, where the channel correlation traces the trend similarity of time series and the weighted Pearson's correlation filters noise in correlation extraction. In the forex market analysis, we consider three particular aspects of correlation knowledge: (1) historical correlation, correlation to previous market data; (2) cross-currency correlation, correlation to relevant currencies; and (3) macro correlation, correlation to macroeconomic variables. To evaluate the validity of the extracted correlation knowledge, we compare Support Vector Regression (SVR) against correlation-aided SVR (cSVR) for forex time-series prediction, where correlation in addition to the observed forex time-series data is used for training the SVR. The experiments are carried out on five futures contracts (NZD/AUD, NZD/EUD, NZD/GBP, NZD/JPY and NZD/USD) within the period from January 2007 to December 2008. The comparison shows that the proposed correlation is computationally significant for forex market analysis, in that cSVR performs consistently better than pure SVR on exchange-rate prediction for all five contracts, in terms of the error functions MSE, RMSE, NMSE, MAE and MAPE. However, the cSVR prediction occasionally differs significantly from the actual price, which suggests that despite the significance of the proposed correlation, how to use correlation knowledge for market trend analysis remains a very challenging problem that in practice limits further understanding of the forex market. In addition, the selection of macroeconomic factors and the determination of the time period for analysis are two computationally essential points worth addressing in future forex market correlation analysis.
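The weighted Pearson's correlation used above to filter noise can be sketched as follows: each observation carries a weight, so a suspect tick can be down-weighted or excluded. The rate series, the outlier, and the weighting scheme are invented for illustration; the thesis combines this with a channel correlation not shown here.

```python
import numpy as np

# Weighted Pearson correlation: weights damp the influence of noisy or
# outlying observations on the estimated correlation.
def weighted_pearson(x, y, w):
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = np.asarray(w, float) / np.sum(w)
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

rate_a = np.arange(10.0)            # one exchange-rate series (invented)
rate_b = 2.0 * rate_a               # a perfectly co-moving series ...
rate_b[5] = 50.0                    # ... corrupted by a single outlier tick

equal = np.ones(10)                 # ordinary Pearson (all weights equal)
damped = equal.copy()
damped[5] = 0.0                     # filter the outlier out entirely

print(weighted_pearson(rate_a, rate_b, equal))   # dragged down by the outlier
print(weighted_pearson(rate_a, rate_b, damped))  # recovers the true co-movement
```

With the outlier's weight set to zero, the remaining points are exactly collinear and the coefficient returns to 1, which is the noise-filtering behaviour the abstract attributes to this component.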
29

Carr, James. "Error analysis of X-ray photon correlation spectroscopy measurements." Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=117183.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The use of a theory to extract parameters from experimental data requires a proper understanding of the statistical variation. Furthermore, the improvement of any experimental technique requires a sound understanding of the sources of error and an accurate model of how experimental parameters affect signal strength and noise. The second-order intensity-intensity correlation function is the standard measured quantity in dynamic light scattering and x-ray photon correlation spectroscopy (XPCS) experiments. In this thesis we compare the measured variances of the correlation function to a model based on the statistics of dynamic light scattering. Agreement between the dynamic light scattering model and the x-ray photon correlation spectroscopy data is shown. XPCS experiments are typically conducted with low photon flux and are used to study long time constants. To achieve sufficient statistics, area detectors are used. We show that there are appreciable correlations between near-neighbour pixels. These correlations reveal important features that must be included to draw accurate conclusions from XPCS experiments.
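The second-order intensity correlation function discussed in the abstract above, g2(tau) = ⟨I(t)I(t+tau)⟩ / ⟨I⟩², can be estimated from a single intensity trace. The blinking square-wave trace below is an invented stand-in for detector data, chosen so the expected behaviour is easy to verify.

```python
import numpy as np

# Estimator for the second-order intensity correlation function:
#   g2(tau) = <I(t) * I(t + tau)> / <I>**2
def g2(intensity, max_lag):
    I = np.asarray(intensity, float)
    mean_sq = I.mean() ** 2
    return np.array([np.mean(I[:len(I) - tau] * I[tau:]) / mean_sq
                     for tau in range(1, max_lag + 1)])

# emitter that is "on" for 50 frames, "off" for 50, repeated 10 times
trace = np.tile(np.concatenate([np.ones(50), np.zeros(50)]), 10)
curve = g2(trace, 50)
print(curve[0], curve[-1])   # bunched at tau=1, anticorrelated at tau=50
```

Per-pixel curves like this are what an XPCS area detector yields; the thesis's point is that the variance of this estimator, and correlations between neighbouring pixels, must be modelled before fitting time constants to it.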
30

Donnelly, Brendan P. "Correlation analysis of the n=3 states of helium." Thesis, Queen's University Belfast, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.335980.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Hennessey, James W. "Assistive visual content creation tools via multimodal correlation analysis." Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10046053/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Visual imagery is ubiquitous in society and can take various formats: from 2D sketches and photographs to photorealistic 3D renderings and animations. The creation processes for each of these mediums have their own unique challenges and methodologies that artists need to overcome and master. For example, for an artist to depict a 3D scene in a 2D drawing they need to understand foreshortening effects to position and scale objects accurately on the page; or, when modeling 3D scenes, artists need to understand how light interacts with objects and materials, to achieve a desired appearance. Many of these tasks can be complex, time-consuming, and repetitive for content creators. The goal of this thesis is to develop tools to alleviate artists from some of these issues and to assist them in the creation process. The key hypothesis is that understanding the relationships between multiple signals present in the scene being created enables such assistive tools. This thesis proposes three assistive tools. First, we present an image degradation model for depth-augmented image editing to help evaluate the quality of the image manipulation. Second, we address the problem of teaching novices to draw objects accurately by automatically generating easy-to-follow sketching tutorials for arbitrary 3D objects. Finally, we propose a method to automatically transfer 2D parametric user edits made to rendered 3D scenes to global variations of the original scene.
32

Zhao, Yuqian. "An analysis of the stability in multivariate correlation structures." Thesis, University of Birmingham, 2017. http://etheses.bham.ac.uk//id/eprint/7739/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Analysing instability in the multivariate correlation structure, the present thesis starts by assessing the in-sample and out-of-sample performance of multivariate GARCH models with or without a structural break. The results emphasize the importance of correlation change point detection for model fitting. We then propose semi-parametric CUSUM tests to detect a change point in the covariance structures of non-linear multivariate models with dynamically evolving volatilities and correlations. The asymptotic distributions of the proposed statistics are derived under mild conditions. Our simulations show that, even though the near-unit-root property distorts the size and power of the tests, standardizing the data with the conditional standard deviations from multivariate volatility models can correct such distortions. Lastly, concerning the classical trimming issue in change point tests, we extend the semi-parametric CUSUM to weighted CUSUM tests, which enhances power near either end of a sample. A Monte Carlo simulation study suggests that weighted CUSUM tests perform better than unweighted ones in finite samples. In empirical applications, we show that the absorption ratio is a leading indicator of financial fragility, study global financial contagion effects, and investigate unexpected events in the U.S. equity market.
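The CUSUM idea behind such tests can be illustrated with a generic mean-change statistic on a univariate series. This is a simplified sketch, not the thesis's semi-parametric covariance-structure test; the simulated series and the midsample break are invented for the example.

```python
import numpy as np

def cusum_statistic(x):
    """KS-type CUSUM statistic: sup_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    dev = s - (k / n) * s[-1]          # partial sums vs. their no-break expectation
    sigma = x.std(ddof=1)
    return np.abs(dev).max() / (sigma * np.sqrt(n))

rng = np.random.default_rng(1)
no_break = rng.normal(0.0, 1.0, 1000)
with_break = np.concatenate([rng.normal(0.0, 1.0, 500),
                             rng.normal(1.0, 1.0, 500)])   # mean shift at midsample

stat_null = cusum_statistic(no_break)    # fluctuates like a Brownian bridge supremum
stat_break = cusum_statistic(with_break) # inflated far beyond typical null values
```

Under the null, the statistic behaves like the supremum of a Brownian bridge, so large values signal a structural change; the thesis's contribution is extending this logic to dynamic covariance structures and weighted versions.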
33

Antonilli, Stefanie, and Lydia Embaie. "Charlson and Rx-Risk Comorbidity Indices – A Correlation Analysis." Thesis, Stockholms universitet, Statistiska institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-183469.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The objective of this study was to investigate the use of the diagnosis-based Charlson Comorbidity Index (CCI) and the medication-based Rx-Risk Comorbidity Index on Swedish administrative data. Data were collected over a ten-year period from the National Patient Register and the National Prescribed Medication Register for 3609 respondents to the national public health survey 2018, aged 16-84 and registered in Stockholm County. The overall aim was to identify comorbid conditions in the study population and to examine whether the identified comorbidities differ between indices, based on subject characteristics such as age and gender. The specific aim was to quantify correlation between the indices, as well as within indices over look-back periods of up to ten years. Among the study population, 13% were identified with at least one comorbid condition through the CCI, and 87% had medications indicative of at least one condition covered by Rx-Risk. Both the original Charlson weights and the updated weights by Quan were used to compute the CCI comorbidity scores. Results showed that where the CCI and Quan scores were low, Rx-Risk picked up more conditions. The Spearman rank correlation between the CCI and Quan scores was relatively high, with a coefficient of 0.82 (p-value < 0.05) over look-back periods of 2, 5 and 10 years. The correlation between the CCI and Rx-Risk was fairly low over all look-back periods, with a correlation coefficient of at most 0.34 (p-value < 0.05). The within-index correlations showed that the CCI identified much of the comorbidity between the one- and two-year look-back periods, whilst Rx-Risk identified much of the comorbidity within the one-year look-back period. The overall implication of these results is that using the Charlson index and Rx-Risk is likely to capture comorbid conditions in different health care settings, so the correlation between the two indices is expected to be modest.
The research question of interest should therefore determine which index is favorable when assessment of comorbidity is desired.
34

Nageswaran, Ashok R. "Deformation Analysis of Soft Tissues by Digital Image Correlation." University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1233614556.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Estep, Freddie Leon. "An analysis of the correlation between fear and motivation." Online full text .pdf document, available to Fuller patrons only, 2000. http://www.tren.com.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Kalaitzis, Angelos. "Bitcoin - Monero analysis: Pearson and Spearman correlation coefficients of cryptocurrencies." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-41402.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In this thesis, the prices and volatility of Bitcoin and Monero are analysed with respect to the S&P 500 and the VIX index. Using Python, we computed correlation coefficients of nine cryptocurrencies with two different approaches, Pearson and Spearman, over July 2016 - July 2018. The Pearson correlation coefficient was also computed separately for each year, July 2016 - July 2017 and July 2017 - July 2018. We conclude that in 2016 the correlation between the selected cryptocurrencies was very weak, almost none, but in 2017 the correlation increased and became moderately positive. In 2018, almost all of the cryptocurrencies were highly correlated. For example, from January until July of 2018, the Bitcoin - Monero correlation was 0.86 and the Bitcoin - Ethereum correlation was 0.82.
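The two coefficients compared in this thesis are straightforward to reproduce. The sketch below uses synthetic series standing in for the actual price data; the shared `common` factor and the `btc`/`xmr` names are invented for illustration, and Spearman is computed as the Pearson correlation of ranks.

```python
import numpy as np

def pearson(x, y):
    # Product-moment (linear) correlation
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    # Rank correlation: Pearson correlation of the ranks (no ties in continuous data)
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(x), rank(y))

rng = np.random.default_rng(42)
common = np.cumsum(rng.normal(0, 1, 500))          # shared market factor (synthetic)
btc = 100 + 2.0 * common + rng.normal(0, 2, 500)   # hypothetical BTC-like series
xmr = 50 + 1.5 * common + rng.normal(0, 2, 500)    # hypothetical XMR-like series

r_p = pearson(btc, xmr)
r_s = spearman(btc, xmr)
# Both coefficients are strongly positive because the series share a common driver
```

When the relationship is roughly linear, as here, the two measures agree closely; Spearman diverges from Pearson mainly under monotone but non-linear relationships or heavy outliers, which is why the thesis reports both.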
37

Leach, Lesley Ann Freeny. "Bias and Precision of the Squared Canonical Correlation Coefficient under Nonnormal Data Conditions." Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5361/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This dissertation: (a) investigated the degree to which the squared canonical correlation coefficient is biased in multivariate nonnormal distributions and (b) identified formulae that adjust the squared canonical correlation coefficient (Rc2) such that it most closely approximates the true population effect under normal and nonnormal data conditions. Five conditions were manipulated in a fully-crossed design to determine the degree of bias associated with Rc2: distribution shape, variable sets, sample size to variable ratios, and within- and between-set correlations. Very few of the condition combinations produced acceptable amounts of bias in Rc2, but those that did were all found with first function results. The sample size to variable ratio (n:v) was determined to have the greatest impact on the bias associated with Rc2 for the first, second, and third functions. The variable set condition also affected the accuracy of Rc2, but for the second and third functions only. The kurtosis levels of the marginal distributions (b2), and the between- and within-set correlations, demonstrated little or no impact on the bias associated with Rc2. Therefore, it is recommended that researchers use n:v ratios of at least 10:1 in canonical analyses, although greater n:v ratios have the potential to produce even less bias. Furthermore, because it was determined that b2 did not impact the accuracy of Rc2, one can be somewhat confident that, with marginal distributions possessing homogeneous kurtosis levels ranging anywhere from -1 to 8, Rc2 will likely be as accurate as that resulting from a normal distribution. Because the majority of Rc2 estimates were extremely biased, it is recommended that all Rc2 effects, regardless of the function from which they result, be adjusted using an appropriate adjustment formula.
If no rationale exists for the use of another formula, the Rozeboom-2 would likely be a safe choice given that it produced the greatest number of unbiased Rc2 estimates for the greatest number of condition combinations in this study.
38

陳志昌 and Chee-cheong Chan. "Compositional data analysis of voting patterns." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1993. http://hub.hku.hk/bib/B31977236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Parkhomenko, Elena. "Sparse Canonical Correlation Analysis." Thesis, 2008. http://hdl.handle.net/1807/11243.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Large scale genomic studies of the association of gene expression with multiple phenotypic or genotypic measures may require the identification of complex multivariate relationships. In multivariate analysis a common way to inspect the relationship between two sets of variables based on their correlation is Canonical Correlation Analysis, which determines linear combinations of all variables of each type with maximal correlation between the two linear combinations. However, in high dimensional data analysis, when the number of variables under consideration exceeds tens of thousands, linear combinations of the entire sets of features may lack biological plausibility and interpretability. In addition, insufficient sample size may lead to computational problems, inaccurate estimates of parameters and non-generalizable results. These problems may be solved by selecting sparse subsets of variables, i.e. obtaining sparse loadings in the linear combinations of variables of each type. However, available methods providing sparse solutions, such as Sparse Principal Component Analysis, consider each type of variables separately and focus on the correlation within each set of measurements rather than between sets. We introduce new methodology - Sparse Canonical Correlation Analysis (SCCA), which examines the relationships of many variables of different types simultaneously. It solves the problem of biological interpretability by providing sparse linear combinations that include only a small subset of variables. SCCA maximizes the correlation between the subsets of variables of different types while performing variable selection. In large scale genomic studies sparse solutions also comply with the belief that only a small proportion of genes are expressed under a certain set of conditions. In this thesis I present methodology for SCCA and evaluate its properties using simulated data. 
I illustrate practical use of SCCA by applying it to the study of natural variation in human gene expression for which the data have been provided as problem 1 for the fifteenth Genetic Analysis Workshop (GAW15). I also present two extensions of SCCA - adaptive SCCA and modified adaptive SCCA. Their performance is evaluated and compared using simulated data and adaptive SCCA is applied to the GAW15 data.
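As background for SCCA, classical (non-sparse) canonical correlation analysis can be sketched in a few lines: the first canonical correlation is the largest singular value of the whitened cross-covariance matrix. This is an illustrative baseline implementation under a simulated one-factor model, not the SCCA algorithm introduced in the thesis; the ridge term `eps` is an assumption added for numerical stability.

```python
import numpy as np

def first_canonical_correlation(X, Y, eps=1e-8):
    """Top singular value of Sxx^(-1/2) Sxy Syy^(-1/2): the first canonical correlation."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Sxx = Xc.T @ Xc / n + eps * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / n + eps * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        # Symmetric inverse square root via the eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)[0]

rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))                            # one shared factor
X = latent @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(300, 5))
Y = latent @ rng.normal(size=(1, 4)) + 0.5 * rng.normal(size=(300, 4))

rho1 = first_canonical_correlation(X, Y)   # high: both sets load on the same factor
```

SCCA replaces the dense singular vectors of `M` with sparse loading vectors, so that only a small subset of variables in each set enters the canonical combinations; that selection step is what this baseline omits.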
40

Xu, Dingbang. "Correlation analysis of intrusion alerts." 2006. http://www.lib.ncsu.edu/theses/available/etd-05112006-124934/unrestricted/etd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Wang, Hung-Chia, and 王鴻嘉. "Multivariate Functional Canonical Correlation Analysis." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/38266450678634060201.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's
National Chiao Tung University
Institute of Statistics
104
Nanotechnology has opened a new world in manufacturing processes. These more sophisticated processes are naturally accompanied by more stringent quality standards. Thus, maintaining or even improving product yield has become an important issue in the manufacturing industry. With the rapid development of nanometer processes, finding the key factors that affect process quality among possibly thousands of variables in order to increase yield is one of the most important concerns on the production line. With advances in computer technology, profile data, in which a functional variable records the change of a quantity over time, have gradually become a main type of data. However, for profile data, the existing "functional canonical correlation analysis" methods developed in the literature can only analyze the association between two functional variables. In this thesis, we propose a statistical method called "multivariate functional canonical correlation analysis" to explore the association between two groups of multiple functional variables based on functional data analysis. In addition, the proposed method can be easily modified to explore the correlation between multiple functional variables and one or more response variables. The method thus has the potential to identify quality-related variables among many functional variables. A simulation study demonstrates the effectiveness of the method. The proposed method is applicable in many areas. As an illustrative example, it is applied to a real dataset of medflies to investigate the association between the mortality of female medflies and the mortality of male medflies, along with two other functional variables.
42

Lin, Nancy Pei-ching, and 林丕靜. "Correlation Analysis of Fuzzy Sets." Thesis, 1998. http://ndltd.ncl.edu.tw/handle/04600114630994536322.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

TSAI, YI-RUNG, and 蔡宜蓉. "Lycopene Cognition and Correlation Analysis." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/7n8vcg.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's
China University of Science and Technology
Institute of Health Technology
107
With the increasing awareness of health in Taiwan, modern people's dietary concepts have also changed: vegetable oil consumption, adequate intake of fruits and vegetables, daily intake of health food, weekly exercise, and so on. The most important change is in diet. The food we eat every day is closely related to our health, so how to eat healthily is worth discovering and studying. Natural fruits and vegetables contain many beneficial elements, which not only let people eat healthily but can also prevent diseases and even cancer. The purpose of this paper is to investigate the general public's perception of lycopene, a natural substance in fruits and vegetables, and its effects on health. In this paper, 319 people were surveyed by Internet questionnaire. The questionnaire covers gender, age, marital status, dietary habits, career, health condition, exercise, and content related to lycopene, in order to understand the general public's perception of lycopene and its effects. The questionnaire found that 90.3% of the respondents knew about lycopene, while 9.7% did not. Of those asked about lycopene's effects, 78 (12.0%) said it protects the eyes, 106 (16.3%) believed it has anti-aging effects, 127 (19.5%) believed it reduces cardiovascular disease, 14 (2.2%) believed it promotes learning, 124 (19.0%) believed it is antioxidant and anti-inflammatory, 62 (9.5%) believed it lowers blood sugar and cholesterol, 43 (6.6%) believed it can improve prostate hypertrophy and inhibit prostate cancer, and 59 (9.1%) believed it can inhibit tumor cell growth and cancer cell metastasis. The results of the survey showed that the general public is not very aware of the many benefits of lycopene. Key words: lycopene, cardiovascular disease, hypoglycemia, lowering cholesterol
44

Chang, Chin-Min, and 張經旼. "fMRI Analysis Using Neighborhood Correlation and Autoregression Analysis." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/45726845264860621895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's
National Taiwan University
Institute of Biomedical Engineering
96
Functional magnetic resonance imaging (fMRI) has become an important tool for brain function studies. Statistical Parametric Mapping (SPM) provides a model-driven method for fMRI studies; in contrast, Independent Component Analysis (ICA) provides a data-driven method. Here we present a data-driven method that addresses the high computational complexity of clustering methods and ICA. Correlation coefficients have recently been used in many ways for fMRI analysis and have proved effective, and autoregression analysis is used in various kinds of time series analysis. We use the correlation coefficient and autoregression analysis as two filters to remove most of the voxels that are considered to be noise. After this data reduction, we apply a hierarchical clustering and correlation algorithm to the remaining voxels. Applying the method to the fMRI data from SPM's auditory experiment, we correctly identify the auditory cortex.
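The correlation-coefficient filtering step described here can be sketched as follows. The boxcar stimulus paradigm, the synthetic voxel time series, and the 0.4 threshold are all invented for illustration; the thesis's actual filter thresholds and data differ.

```python
import numpy as np

def correlation_filter(voxels, reference, threshold=0.4):
    """Keep indices of voxel time series whose Pearson r with the reference exceeds threshold."""
    ref = (reference - reference.mean()) / reference.std()
    V = voxels - voxels.mean(axis=1, keepdims=True)
    V = V / V.std(axis=1, keepdims=True)
    r = V @ ref / len(ref)                      # Pearson r of each voxel with the reference
    return np.where(np.abs(r) >= threshold)[0]

rng = np.random.default_rng(3)
t = np.arange(200)
paradigm = ((t // 20) % 2).astype(float)        # boxcar on/off stimulus, 20-scan blocks

active = paradigm + 0.5 * rng.normal(size=(10, 200))   # 10 task-driven voxels
noise = rng.normal(size=(90, 200))                     # 90 pure-noise voxels
voxels = np.vstack([active, noise])

kept = correlation_filter(voxels, paradigm)     # the 10 active voxels survive the filter
```

A second, analogous filter based on autoregression residuals would then prune the survivors further before the clustering stage.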
45

"Canonical correlation analysis of functional data." ARIZONA STATE UNIVERSITY, 2010. http://pqdtopen.proquest.com/#viewpdf?dispub=3380673.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Papageorgiou, Constantine P., Federico Girosi, and Tomaso Poggio. "Sparse Correlation Kernel Analysis and Reconstruction." 1998. http://hdl.handle.net/1721.1/7256.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), that is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare this to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that 1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution, 2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric that is used to evaluate the results, 3) in sparse representation techniques, L_1 is not a good proxy for the true measure of sparsity, L_0, and 4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account high order structure in images.
47

Tan, Qi Er. "Correlation adjusted penalization in regression analysis." 2012. http://hdl.handle.net/1993/9147.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This PhD thesis introduces two new types of correlation adjusted penalization methods to address the issue of multicollinearity in regression analysis. The main purpose is to achieve simultaneous shrinkage of parameter estimators and variable selection for multiple linear regression and logistic regression when the predictor variables are highly correlated. The motivation is that when there is a serious issue of multicollinearity, the variances of parameter estimators are significantly inflated. The new correlation adjusted penalization methods shrink the parameter estimators and their variances to alleviate the problem of multicollinearity. An important recent trend for dealing with multicollinearity is to apply penalization methods for simultaneous shrinkage and variable selection. In the literature, the following penalization methods are popular: ridge, bridge, LASSO, SCAD, and OSCAR. Few papers have used correlation-based penalization methods, and those in the literature fail when some correlations are either 1 or -1, that is, when at least two predictor variables are perfectly correlated. We introduce two new types of correlation adjusted penalization methods that work whether or not the predictor variables are perfectly correlated. The types of correlation adjusted penalization methods introduced in this thesis are intuitive and innovative. We investigate important theoretical properties of these new types of penalization methods, including bias, mean squared error, data augmentation and asymptotic properties, and plan to apply them to real data sets in the near future.
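The shrinkage effect that motivates such penalties is easy to demonstrate with ordinary ridge regression, one of the standard methods listed above rather than the thesis's new correlation adjusted penalty. The nearly collinear design below is synthetic, and the penalty weight `lam = 1.0` is an arbitrary choice for the example.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimator (X'X + lam*I)^(-1) X'y; lam = 0 recovers ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(7)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)      # nearly perfectly correlated with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(size=n)         # true coefficients are (1, 1)

beta_ols = ridge(X, y, 0.0)    # unstable: the near-singular design inflates its variance
beta_ridge = ridge(X, y, 1.0)  # shrunk toward a stable solution near (1, 1)
```

The penalty barely affects the well-determined direction (x1 + x2) but strongly shrinks the nearly unidentified direction (x1 - x2), which is exactly the variance-reduction mechanism the thesis builds on.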
48

Yang, Yi-Chung, and 楊宜中. "Sequence Alignment with XNOR Correlation Analysis." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/10809517763605135227.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's
National Taiwan University
Institute of Bio-Industrial Mechatronics Engineering
92
This study develops a novel method for sequence alignment based on a correlation approach. The approach can be broken down into three decoupled stages: correlation, searching, and scoring. A scoring process is incorporated into the alignment only in the final stage, so changing the scoring matrix does not affect the correlation and searching procedures. Moreover, the developed approach is guaranteed to obtain all feasible optimal alignments. The correlation approach is mainly based on the exclusive-NOR (XNOR) operation: if two symbols match, output 1; if they mismatch, output 0. Subsequently, all feasible paths of optimal alignments are searched and their corresponding sequences are found from the correlation matrix and the cumulative correlation matrix. Finally, the score of the optimal sequence is calculated with the specified scoring matrix. Because the first and second stages do not involve the scoring matrix, changing the scoring matrix cannot affect the aligned results obtained with the correlation approach. In contrast, dynamic programming, which is widely used in sequence alignment, often requires recalculating the whole alignment whenever the scoring matrix changes, so different optimal sequences are obtained. Compared with dynamic programming, the developed three-stage, decoupled alignment approach is shown to provide complete sets of optimal sequences and a more efficient and comprehensive way to align sequences.
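The XNOR correlation stage can be sketched as a binary match matrix. The sequences and the diagonal scan below are illustrative assumptions, not the thesis's full three-stage algorithm; summing diagonals stands in crudely for the cumulative-correlation search.

```python
import numpy as np

def xnor_match_matrix(seq_a, seq_b):
    """Binary correlation matrix: entry (i, j) is 1 if seq_a[i] == seq_b[j] (XNOR), else 0."""
    a = np.frombuffer(seq_a.encode(), dtype=np.uint8)
    b = np.frombuffer(seq_b.encode(), dtype=np.uint8)
    return (a[:, None] == b[None, :]).astype(int)

M = xnor_match_matrix("GATTACA", "GCATGCT")

# Runs of 1s along a diagonal mark candidate aligned stretches; summing each
# diagonal gives a rough per-offset match score for the searching stage.
diag_scores = {k: int(np.trace(M, offset=k)) for k in range(-6, 7)}
```

Because the matrix records only match/mismatch, any scoring matrix can be applied afterwards without recomputing `M`, which is the decoupling property the abstract emphasizes.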
49

Ferreira, Catarina Duarte Barros. "Digital Image Correlation for Vibration Analysis." Master's thesis, 2021. https://hdl.handle.net/10216/136673.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Hung, Po-Chih. "Strain analysis by digital image correlation /." Diss., 1998. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:9914246.

Full text
APA, Harvard, Vancouver, ISO, and other styles

To the bibliography