Dissertations / Theses on the topic 'Multivariate depth'

Consult the top 17 dissertations / theses for your research on the topic 'Multivariate depth.'

1

Beltran, Luis. "NONPARAMETRIC MULTIVARIATE STATISTICAL PROCESS CONTROL USING PRINCIPAL COMPONENT ANALYSIS AND SIMPLICIAL DEPTH." Doctoral diss., University of Central Florida, 2006. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4080.

Full text
Abstract:
Although there has been progress in the area of Multivariate Statistical Process Control (MSPC), there are numerous limitations as well as unanswered questions with the current techniques. MSPC charts plotting Hotelling's T² require the normality assumption for the joint distribution of the process variables, which is not tenable in many industrial settings, hence the motivation to investigate nonparametric techniques for multivariate data in quality control. In this research, the goal is to create a systematic distribution-free approach by extending current developments and focusing on dimensionality reduction using Principal Component Analysis. The proposed technique differs from current approaches in that it creates a nonparametric control chart using robust simplicial depth ranks of the first and last sets of principal components to improve signal detection in multivariate quality control with no distributional assumptions. The proposed technique has the advantages of ease of use and robustness in MSPC for monitoring variability and correlation shifts. By making the approach simple to use in an industrial setting, the probability of adoption is enhanced. Improved MSPC can result in cost savings and improved quality.
Ph.D.
Department of Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering and Management Systems
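
The simplicial depth at the core of this chart admits a compact illustration. The sketch below is a brute-force bivariate computation for illustration only (not the author's implementation): the depth of a point is the fraction of triangles spanned by sample points that contain it.

```python
import itertools


def in_triangle(p, a, b, c):
    # Point-in-triangle test via signed areas (boundary counts as inside).
    def sign(u, v, w):
        return (u[0] - w[0]) * (v[1] - w[1]) - (v[0] - w[0]) * (u[1] - w[1])

    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)


def simplicial_depth(point, sample):
    # Fraction of all triangles spanned by sample points containing `point`.
    triples = list(itertools.combinations(sample, 3))
    hits = sum(in_triangle(point, a, b, c) for a, b, c in triples)
    return hits / len(triples)
```

Points deep inside the cloud receive depth near the maximum, while points outside the convex hull of the sample receive depth zero, which is what makes depth ranks usable as distribution-free charting statistics.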
2

Lee, Joanna L. S. "Time-of-flight secondary ion mass spectrometry - fundamental issues for quantitative measurements and multivariate data analysis." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:f0e4b8ff-f563-429e-9e71-9c277a5139c4.

Full text
Abstract:
Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful technique for the analysis of organic surfaces and interfaces for many innovative technologies. However, despite recent developments, there are still many issues and challenges hindering the robust, validated use of ToF-SIMS for quantitative measurement. These include: the lack of metrology and fundamental understanding for the use of novel cluster primary ion beams such as C60n+ and Ar2000+; the need for validated and robust measurement protocols for difficult samples, such as those with significant micron-scale surface topography; the lack of guidance on novel data analysis methods, including multivariate analysis, which have the potential to simplify many time-consuming and intensive analyses in industry; and the need to establish best practice to improve the accuracy of measurements. This thesis describes research undertaken to address the above challenges. Sample topography and field effects were evaluated experimentally using model conducting and insulating fibres and compared with computer simulations to provide recommendations for diagnosing and reducing the effects. Two popular multivariate methods, principal component analysis (PCA) and multivariate curve resolution (MCR), were explored using mixed organic systems consisting of a simple polymer blend and complex hair fibres treated with a multi-component formulation, to evaluate different multivariate and data preprocessing methods for the optimal identification, localisation and quantification of the chemical components. Finally, cluster ion beams C60n+ and Ar500-2500+ were evaluated on an inorganic surface and an organic delta layer reference material, respectively, to elucidate the fundamental metrology of cluster ion sputtering and pave the way for their use in organic depth profiling.
These studies provide the essential metrological foundation to address frontier issues in surface and nanoanalysis and extend the measurement capabilities of ToF-SIMS.
3

Armaut, Elisabeth. "Estimation d'ensembles de niveau d'une fonction de profondeur pour des données fonctionnelles. Applications au clustering et à la théorie du risque" [Level-set estimation of a depth function for functional data: applications to clustering and risk theory]. Electronic Thesis or Diss., Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ5021.

Full text
Abstract:
Statistical depth functions play a fundamental role in analyzing and characterizing complex data structures. Depth functions provide a measure of centrality or outlyingness for individual observations or entire datasets, aiding in the understanding of their relative positions and underlying distributions. The concepts related to depth found in the literature originate from the notion of Tukey's depth, also known as the median depth, introduced by the statistician John W. Tukey in his article "Mathematics and the Picturing of Data," published in 1975 [170]. The fundamental idea underlying Tukey's depth is to generalize the univariate median of a one-dimensional dataset to higher dimensions. First, our interest focuses on multivariate depths followed by functional depths, for both of which we provide an overall review in Chapter 1. In the second part of this thesis, i.e. in Chapter 2, we undertake a rigorous study of multivariate depth level sets and establish several analytical and statistical properties. First, we show that, when the underlying multivariate depth is smooth enough, the symmetric difference between the estimated depth level set and its theoretical counterpart converges to zero in terms of d-dimensional volume and of probability under the unknown distribution. Apart from these contributions, the novelty of Chapter 2 is the introduction and study of a depth-based risk measure called the Covariate-Conditional-Tail-Expectation (CCTE), within a risk-theory setup. Roughly, the CCTE aims at computing an average cost given that at least one of the risk factors at hand is 'high' in some direction. The latter risk area is modelled by a level set of low depth value. In contrast to risk measures based on distribution tails, our definition of the CCTE is direction-free, owing to the involvement of depth level sets.
We establish that, as the sample size goes to infinity, the empirical depth-based CCTE is consistent for its theoretical version, and we provide rates of convergence for fixed risk levels as well as when the risk level goes to zero as the sample size goes to infinity. In this last case, we also analyze the behavior of the original CCTE definition based on a distribution function, a case that was not studied in [56]. On top of several simulations performed on the CCTE, we illustrate its usefulness on environmental data. The final part of this thesis, Chapter 3, wraps up our work: we contribute a new type of depth for functional data based on functional principal component analysis, built on a generic multivariate depth. In this view, we use the well-known Karhunen-Loève decomposition as a tool to project a centered square-integrable random process along a finite linear combination of orthogonal functions called the principal components; a multivariate depth function is then applied to the vector of projected principal components. To the best of our knowledge, this is a novel approach in the functional depth literature. Naturally, we provide an estimator of our functional depth for which we demonstrate uniform consistency with a rate of convergence. We complement our study with several simulations and real-data applications to functional classification, where our new depth equals or outperforms most conventional competitors.
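
Tukey's halfspace depth, the starting point of this thesis, admits a simple approximate computation in two dimensions. The sketch below is illustrative only: it scans a grid of directions and takes the smallest fraction of sample points in a closed halfplane through the evaluated point.

```python
import numpy as np


def tukey_depth_2d(point, sample, n_dirs=360):
    # Approximate halfspace (Tukey) depth: minimum, over a grid of
    # directions u, of the fraction of sample points in a closed
    # halfplane {y : <y - point, u> >= 0} through `point`.
    angles = np.linspace(0.0, np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])
    proj = (np.asarray(sample, float) - np.asarray(point, float)) @ dirs.T
    # Each grid direction and its opposite are handled together.
    counts = np.minimum((proj >= 0).sum(axis=0), (proj <= 0).sum(axis=0))
    return counts.min() / len(sample)
```

The deepest point generalizes the median; level sets of this function are the depth regions whose estimation is studied in Chapter 2.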
4

Baffoe, Nana Ama Appiaa. "Diagnostic Tools for Forecast Ensembles." Case Western Reserve University School of Graduate Studies / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=case1522964882574611.

Full text
5

Cantu, Alma. "Proposition de modes de visualisation et d'interaction innovants pour les grandes masses de données et/ou les données structurées complexes en prenant en compte les limitations perceptives des utilisateurs" [Proposal of innovative visualization and interaction modes for very large and/or complex structured data, taking into account users' perceptual limitations]. Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2018. http://www.theses.fr/2018IMTA0068/document.

Full text
Abstract:
As a result of improved data capture and storage, recent years have seen the amount of data to be processed increase dramatically. Many approaches, ranging from automatic processing to information visualization, have been developed, but some areas are still too specific to take advantage of them. This is the case of ELectromagnetic INTelligence (ELINT). This domain not only deals with huge amounts of data but must also handle complex data and usage, as well as populations of users with less and less experience. In this thesis we focus on the use of existing and new technologies applied to visualization to propose solutions to combinations of issues such as large amounts of data and complex data. We begin by presenting an analysis of the ELINT field, which made it possible to extract the issues that it must face. We then consider visual solutions handling combinations of such issues; since the existing work does not directly contain such solutions, we turn to the description of visual issues and propose a characterization of them. This characterization allows us to describe existing representations and to build a recommendation tool based on how existing work solves the issues. Finally, we focus on identifying new metaphors to complete the existing work and propose an immersive representation to solve the issues of ELINT. These contributions make it possible to analyze and use existing representations and deepen the use of immersive representations for information visualization.
6

Dogra, Jody A. Busch Kenneth W. Busch Marianna A. "Multivariate analyses of near-infrared and UV spectral data." Waco, Tex. : Baylor University, 2009. http://hdl.handle.net/2104/5347.

Full text
7

Carvalho, Aline Roberta de. "Atributos do solo associados às variações na vegetação em fragmento de cerrado, Assis, SP" [Soil attributes associated with vegetation variations in a cerrado fragment, Assis, SP]. Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/11/11140/tde-09022009-152200/.

Full text
Abstract:
The objective of the present work is to analyze the correlation between environmental variables and tree species distribution. The study was developed in a permanent plot of 320 x 320 m in a cerradão fragment located at the Assis Ecological Station, Assis County, São Paulo. Soil samples were collected at five sites along a pedosequence and submitted to chemical, physical, and micromorphological analyses. The vegetation was sampled within a 314 m² area around each site where a soil sample was collected. All living trees whose diameter at breast height was at least 4.8 cm and whose perimeter at breast height was at least 15 cm were considered. To analyze the dataset, three multivariate techniques were used: principal component analysis (PCA) for environmental variables; detrended correspondence analysis (DCA) for floristic variables; and canonical correspondence analysis (CCA) to verify a possible association between these two sets of variables. The principal component analysis demonstrated that the majority of variables presented similar correlations within the superficial layers (0-20 and 20-60 cm). This trend was not the same for the deeper layers (60-80 and > 80 cm), suggesting more changes in the soil profile with depth. The canonical correspondence analysis showed coherent species distribution patterns in relation to the environmental variables of the fragment, characterized by soil physical and chemical attributes, but chiefly by the soil water regime, suggesting that water availability has a strong influence on species distribution.
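
The principal component analysis used here is a standard technique; below is a minimal sketch via the singular value decomposition of the centered data matrix (illustrative only, not the software used in the study).

```python
import numpy as np


def principal_components(X, k=2):
    # PCA via SVD of the centered data matrix: returns the scores on the
    # first k components and the proportion of variance each explains.
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T
    explained = (s[:k] ** 2) / (s ** 2).sum()
    return scores, explained
```

For perfectly collinear measurements the first component absorbs all of the variance, which is the degenerate extreme of the "similar correlations" pattern reported for the superficial soil layers.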
8

Salawu, Emmanuel Oluwatobi. "Spatiotemporal Variations in Coexisting Multiple Causes of Death and the Associated Factors." ScholarWorks, 2018. https://scholarworks.waldenu.edu/dissertations/6108.

Full text
Abstract:
The study and practice of epidemiology and public health benefit from the use of mortality statistics, such as mortality rates, which are frequently used as key health indicators. Furthermore, multiple causes of death (MCOD) data offer important information that could not possibly be gathered from other mortality data. This study aimed to describe the interrelationships between various causes of death in the United States in order to improve the understanding of the coexistence of MCOD and thereby improve public health and enhance longevity. The social support theory was used as a framework, and multivariate linear regression analyses were conducted to examine the coexistence of MCOD in approximately 80 million death cases across the United States from 1959 to 2005. The findings showed that in the United States, there is a statistically significant relationship between the number of coexisting MCOD, race, education, and the state of residence. Furthermore, age, gender, and marital status statistically influence the average number of coexisting MCOD. The results offer insights into how the number of coexisting MCOD varies across the United States, races, education levels, gender, age, and marital status, and lay a foundation for further investigation into what people are dying from. The results have the long-term potential of helping public health practitioners identify individuals or communities that are at higher risk of death from a number of coexisting MCOD, such that actions could be taken to lower the risks to improve people's wellbeing, enhance longevity, and contribute to positive social change.
9

Hsieh, Chih-Ting (謝芝庭). "Multivariate Process Capability Index Based on Data Depth Concept." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/2cw2p6.

Full text
Abstract:
Master's thesis
National Cheng Kung University
Department of Statistics
105 (ROC academic year)
Generally, an industrial product has more than one quality characteristic. In order to establish performance measures for evaluating the capability of a multivariate manufacturing process, several multivariate process capability indices (MPCIs) have been developed in the past. Most of the MPCIs proposed in the literature are defined under the assumption that the process quality characteristics are normally distributed. However, this assumption may not hold in practice. In this research, based on the data depth concept, we propose two multivariate process capability indices that can be used regardless of the data distribution. Simulation results show that the proposed indices outperform an existing model, and a numerical example further demonstrates their usefulness.
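
A common way to turn depth values into distribution-free monitoring statistics, in the spirit of this line of work, is the rank statistic of Liu's r chart. The sketch below is a generic illustration of that rank, not the indices proposed in the thesis.

```python
import numpy as np


def depth_rank(new_depth, reference_depths):
    # Liu-style rank statistic: the proportion of reference observations
    # whose depth does not exceed the depth of the new observation.
    # Small ranks flag observations that sit near the edge of the
    # in-control reference cloud.
    reference_depths = np.asarray(reference_depths, dtype=float)
    return float((reference_depths <= new_depth).mean())
```

Because the rank is computed against an empirical reference sample, no distributional form is assumed, which is the property the proposed capability indices also rely on.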
10

Kong, Linglong. "On Multivariate Quantile Regression: Directional Approach and Application with Growth Charts." PhD thesis, 2009. http://hdl.handle.net/10048/462.

Full text
Abstract:
Thesis (Ph. D.)--University of Alberta, 2009.
Title from pdf file main screen (viewed on July 21, 2009). "A thesis submitted to the Faculty of Graduate Studies and Research in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Statistics, Department of Mathematical and Statistical Sciences, University of Alberta." Includes bibliographical references.
11

Vencálek, Ondřej. "Vážená hloubka dat a diskriminace založená na hloubce dat" [Weighted data depth and depth-based discrimination]. Doctoral thesis, 2011. http://www.nusl.cz/ntk/nusl-311583.

Full text
Abstract:
The concept of data depth provides a powerful nonparametric tool for multivariate data analysis. We propose a generalization of the well-known halfspace depth called the weighted data depth. The weighted data depth is not affine invariant in general, but it has useful properties, such as possibly nonconvex central regions. We further discuss the application of the data depth methodology to the discrimination problem. Several classifiers based on data depth are reviewed and one new classifier is proposed. The new classifier is a modification of the k-nearest-neighbour classifier. The classifiers are compared in a short simulation study. The advantage gained from using the weighted data depth for discrimination purposes is shown.
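
Depth-based classifiers of the kind reviewed here often follow the maximum-depth rule: assign an observation to the class within whose training sample it is deepest. The sketch below is illustrative and substitutes the easy-to-compute Mahalanobis depth for the halfspace depth; it is not the author's proposed classifier.

```python
import numpy as np


def mahalanobis_depth(x, sample):
    # Mahalanobis depth, used here as a computable stand-in:
    # D(x) = 1 / (1 + (x - mean)' S^{-1} (x - mean)).
    sample = np.asarray(sample, dtype=float)
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d = np.asarray(x, dtype=float) - mu
    return 1.0 / (1.0 + d @ cov_inv @ d)


def max_depth_classify(x, class_samples):
    # Maximum-depth rule: pick the class whose training sample gives x
    # the largest depth value.
    depths = {label: mahalanobis_depth(x, s)
              for label, s in class_samples.items()}
    return max(depths, key=depths.get)
```

A depth-weighted k-nearest-neighbour modification, as studied in the thesis, replaces this global rule with a local one, but the assign-to-the-deepest principle is the common starting point.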
12

Lee, Hyunsook. "Two topics: a Jackknife maximum likelihood approach to statistical model selection and a Convex hull peeling depth approach to nonparametric massive multivariate data analysis with applications." 2006. http://etda.libraries.psu.edu/theses/approved/WorldWideIndex/ETD-1427/index.html.

Full text
13

Lee, HyunSook. "Two topics, a Jackknife maximum likelihood approach to statistical model selection and a Convex hull peeling depth approach to nonparametric massive multivariate data analysis with applications /." 2006. http://etda.libraries.psu.edu/theses/approved/WorldWideIndex/ETD-1427/index.html.

Full text
14

Kotík, Lukáš. "Vážené poloprostorové hloubky a jejich vlastnosti" [Weighted halfspace depths and their properties]. Doctoral thesis, 2015. http://www.nusl.cz/ntk/nusl-336162.

Full text
Abstract:
Statistical depth functions have become a well-known nonparametric tool of multivariate data analysis. The best-known depth functions include the halfspace depth. Although the halfspace depth has many desirable properties, some of its properties may lead to biased and misleading results, especially when data are not elliptically symmetric. The thesis introduces two new classes of depth functions. Both classes generalize the halfspace depth. They keep some of its properties, and since they better respect the geometric structure of the data, they usually lead to better results when we deal with non-elliptically symmetric, multimodal, or mixed distributions. The idea presented in the thesis is based on replacing the indicator of a halfspace by a more general weight function. This provides us with a continuum, especially if conic-section weight functions are used, between a local view of the data (e.g. a kernel density estimate) and a global view of the data as provided e.g. by the halfspace depth. The rate of localization is determined by the choice of the weight functions and their parameters. Properties including the uniform strong consistency of the proposed depth functions are proved in the thesis. The limit distribution is also discussed, together with some other data-depth-related topics (regression depth, functional data depth)...
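
The core idea, replacing the halfspace indicator by a general weight function, can be sketched as follows. This is an illustrative approximation on a grid of directions, not the thesis's exact definition.

```python
import numpy as np


def weighted_halfspace_depth(point, sample, weight, n_dirs=360):
    # Halfspace depth with the indicator 1{t >= 0} replaced by a general
    # weight function w(t): minimize, over a grid of unit directions u,
    # the mean weight of the signed projections <x_i - point, u>.
    angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])
    proj = (np.asarray(sample, float) - np.asarray(point, float)) @ dirs.T
    return weight(proj).mean(axis=0).min()


# With the indicator weight this reduces to (approximate) halfspace depth;
# a smooth, localized weight yields the more local views discussed above.
indicator = lambda t: (t >= 0).astype(float)
```

Swapping in, say, a sigmoid or a compactly supported weight moves along the continuum between the kernel-density-like local view and the global halfspace depth.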
15

Belteton, Samuel (3322188). "Multivariate analysis of leaf tissue morphogenesis." Thesis, 2020.

Find full text
Abstract:
Leaf size and shape are strongly influenced by the growth patterns of the epidermal tissue. Pavement cells are the prevalent cell type in the epidermis, and during cell expansion they undergo a drastic shape change from simple polyhedral cells to puzzle-shaped cells. The role of these cell protrusions, more commonly referred to as lobes, remains unknown, but their formation has been proposed to help increase the structural integrity of the epidermal tissue. How the symmetry-breaking event that initiates a lobe is controlled also remains unknown; however, pharmacological and genetic disruption of the microtubule system has been shown to interfere not only with lobe initiation but also with lobe expansion. Additionally, the role of microtubules in the patterning of microfibril deposition, the load-bearing structure of the cell wall, makes the microtubule system a good candidate for evaluating its dynamics as a function of shape change. Two main mechanical models for lobe initiation are evaluated here: one where microtubules serve as stable features suppressing local expansion, and one where microtubules, similarly to the anisotropic expansion patterning in hypocotyl cells, promote local anisotropic expansion of the cell, resulting in lobe formation. The main method used to evaluate these models was long-term time-lapse image analysis, using a plasma-membrane marker for accurate shape-change quantification and a microtubule marker to quantify microtubule location, persistence, and density as a function of cell shape change. Using the junctions where three cells come together, cells were subdivided into segments, and the shapes of these segments were tracked using a new coordinate system that allowed the detection of new lobes, which can arise from ∼300 deflections. By mapping subcellular processes, such as microtubule persistence, to this coordinate system, correlation of microtubule organization with shape change was possible.
Additionally, a subset of microtubule bundles that splay across the anticlinal and periclinal walls, perpendicular and parallel to the leaf surface respectively, was identified as marking the location and direction of lobe formation. Disrupting the cell boundary by partially digesting pectin, a main component of the middle lamella, revealed the cell-autonomous morphogenesis mechanism in pavement cells. Under pectinase treatment, cell invaginations were produced, and similarly to lobes, their initiation was microtubule- and cellulose-dependent. Lastly, stress prediction using finite-element models based on live-cell images co-localized regions of high cell wall stress with both microtubule persistence and shape-change locations in both lobing and invaginated segments. Together, a model of cellular shape change is presented in which microtubules translate cell wall stresses into tissue morphogenesis.
16

Sandström, Sara. "Modellering av volym samt max- och medeldjup i svenska sjöar : en statistisk analys med hjälp av geografiska informationssystem" [Modelling of volume, maximum depth, and mean depth in Swedish lakes: a statistical analysis using geographic information systems]. Thesis, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-325822.

Full text
Abstract:
Lake volume and lake depth are important variables that define a lake and its ecosystem. Sweden has around 100 000 lakes, but only around 8000 of them have measured data for volume, maximum depth, and mean depth. Collecting data for the remaining lakes is presently too time-consuming and expensive; therefore, a predictive method is needed. Previous studies by Sobek et al. (2011) found a model predicting lake volume from map-derived parameters with a high degree of explanation for the mean volume of 15 or more lakes. However, the predictions for an individual lake, as well as for maximum and mean depth, were not accurate enough. The purpose of this study was to derive better models based on new map material with higher resolution. The variables used were derived using GIS-based calculations and then analyzed with multivariate statistical analysis using PCA, PLS regression, and multiple linear regression. A model predicting the volume of an individual lake with better accuracy than previous studies was found. The variables best explaining the variations in lake volume were lake area and the median slope of an individual zone around each lake (R² = 0.87, p < 0.00001). Also, the model predicting maximum depth from lake area, the median slope of an individual zone around each lake, and height differences in the area closest to each lake had a higher degree of explanation than in previous studies (R² = 0.42). Mean depth had no significant correlation with map-derived parameters but showed a strong correlation with maximum depth. Reference: Sobek, S., Nisell, J. & Fölster, J. (2011). Predicting the volume and depths of lakes from map-derived parameters. Inland Waters, vol. 1, pp. 177-184.
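
The multiple linear regression step described here can be sketched with ordinary least squares. The predictors and data below are synthetic placeholders for illustration, not the study's lake area and slope data.

```python
import numpy as np


def fit_linear_model(X, y):
    # Ordinary least squares with an intercept column, as in a multiple
    # linear regression of (log) lake volume on map-derived predictors
    # such as lake area and median zone slope.
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef  # [intercept, b1, b2, ...]


def predict(coef, X):
    # Apply the fitted coefficients to new predictor rows.
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    return X @ coef
```

In practice the response and predictors are often log-transformed first, so that the fitted linear model corresponds to a power law in the original units.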
17

Lercher, Hendrik. "Multivariate Vorhersagbarkeit von ICD-Schocks und Mortalität bei Patienten nach einer ICD-Neuimplantation" [Multivariate predictability of ICD shocks and mortality in patients after first-time ICD implantation]. Doctoral thesis, 2016. http://hdl.handle.net/11858/00-1735-0000-002B-7C62-5.

Full text