To view the other types of publications on this topic, follow this link: Non-invasive methods of analysis.

Dissertations on the topic "Non-invasive methods of analysis"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 50 dissertations for research on the topic "Non-invasive methods of analysis".

Next to every entry in the bibliography you will find an "Add to bibliography" option. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read an online annotation of the work, if the relevant parameters are available in the metadata.

Browse dissertations from a wide range of disciplines and compile your bibliography correctly.

1

Barberis, Elettra. „New non-invasive approaches for proteomics and metabolomics analyses“. Doctoral thesis, Università del Piemonte Orientale, 2020. http://hdl.handle.net/11579/115041.

Full text of the source
Annotation:
Recent technological developments in analytical chemistry have spurred the analysis of historical, archaeological and paleontological objects. The identification of proteins and small molecules from cultural heritage objects is crucial to characterizing the materials used by artists, and it can provide invaluable information for designing restoration interventions. Almost all of the analytical procedures developed require at least micro-sampling of the object; however, non-invasive techniques are always preferred for the analysis of precious and unique objects. Part of this PhD research focused on the development and application of new non-invasive methods for the analysis of cultural heritage: a new method for the non-invasive mass spectrometry analysis of proteins and small molecules from cultural heritage objects is discussed; the results obtained using a non-invasive imaging instrument on ancient Egyptian mural paintings are presented; and the development and application of non-invasive methods using portable infrared spectroscopy instrumentation are shown. The recent revolution in mass spectrometry technology, with the introduction of high-throughput instruments and techniques, has led to the widespread expansion of advanced analytical methods in health science. Today, the main target of modern mass spectrometry analysis in biomedical research can be summarized as the development of effective and reliable approaches capable of discriminating diseased conditions at their earliest stage, in a non- or minimally invasive manner. The aim of the second part of this PhD research was the development and application of non-invasive methods for the analysis of biological materials: a new method for the non-invasive analysis and characterization of adenoma in colorectal cancer is presented, and a combined two- and one-dimensional gas chromatography-mass spectrometry approach for the identification of new serum biomarkers for prostate cancer is discussed.
APA, Harvard, Vancouver, ISO, and other citation styles
2

Alonso-Caneiro, David. „Non-invasive assessment of tear film surface quality“. Thesis, Queensland University of Technology, 2010. https://eprints.qut.edu.au/41737/1/David_Alonso-Caneiro_Thesis.pdf.

Full text of the source
Annotation:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools to assess tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability, while the range of non-invasive methods of tear assessment that has been investigated also presents limitations. Hence, no "gold standard" test is currently available to assess tear film integrity, and improving techniques for the assessment of tear film quality is of clinical significance; this is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected from the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film; the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was developed specifically to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract a maximized area of analysis from each frame of the recording, and within this area a TFSQ metric is calculated. Initially, two metrics based on Gabor-filter and Gaussian-gradient techniques were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear, and to show a clear difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated, with receiver operating characteristic (ROC) curves calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified in this clinical study was its lack of sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-ring pattern into an image of quasi-straight lines from which a block-statistics value is extracted. This metric showed better sensitivity under low pattern disturbance and improved ROC performance. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument's potential limitations; of special interest was the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase offered insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing), but those techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria, with special attention given to a commonly used fit, the polynomial function, and to selecting the appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance over a broad range of conditions (i.e., contact lens, normal and dry eye subjects) and could become a useful clinical tool for assessing tear film surface quality.
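The Cartesian-to-polar transformation with block processing described above is compact enough to illustrate in code. The following Python sketch is a minimal illustration of that idea only; the function names, grid sizes and the choice of per-block variance as the statistic are assumptions for illustration, not the thesis's implementation:

```python
import numpy as np

def polar_unwrap(frame: np.ndarray, center: tuple[float, float],
                 n_r: int = 64, n_theta: int = 256, r_max: float = 100.0) -> np.ndarray:
    """Resample a Placido-disk image from Cartesian to polar coordinates,
    turning the concentric-ring pattern into quasi-straight horizontal lines."""
    rr = np.linspace(1.0, r_max, n_r)
    tt = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    r, t = np.meshgrid(rr, tt, indexing="ij")
    x = np.clip((center[0] + r * np.cos(t)).round().astype(int), 0, frame.shape[1] - 1)
    y = np.clip((center[1] + r * np.sin(t)).round().astype(int), 0, frame.shape[0] - 1)
    return frame[y, x]

def block_tfsq_metric(polar_img: np.ndarray, block: int = 16) -> float:
    """TFSQ proxy: mean per-block intensity variance. A smooth tear film gives
    regular stripes and hence low block variance; disruption raises it."""
    h, w = polar_img.shape
    tiles = polar_img[: h - h % block, : w - w % block]
    tiles = tiles.reshape(h // block, block, w // block, block)
    return float(tiles.var(axis=(1, 3)).mean())
```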
APA, Harvard, Vancouver, ISO, and other citation styles
3

Fontseré, Alemany Clàudia 1992. „Genomic analysis of wild and captive chimpanzee populations from non-invasive samples using target capture methods“. Doctoral thesis, Universitat Pompeu Fabra, 2020. http://hdl.handle.net/10803/670317.

Full text of the source
Annotation:
Wild chimpanzee populations are considered to be under threat of extinction due to the damaging consequences of human impact on their natural habitat and of illegal trade. Conservation genomics is an emerging field with the potential to guide conservation efforts not only in the wild (in situ) but also outside the species' natural range (ex situ). In this thesis, we explore to what extent target capture methods on specific genomic regions can provide insights into chimpanzee genetic diversity in captive and wild populations. Specifically, we characterized the ancestry and inbreeding of 136 European captive chimpanzees to aid their management in captivity, and inferred the origin of 31 individuals confiscated from the illegal trade by sequencing ancestry-informative SNPs. We also examined molecular strategies to maximize library complexity in target capture methods from fecal samples so that they can be applied in large-scale genomic studies. Finally, we captured chromosome 21 from 828 fecal samples collected across the entire extant chimpanzee range. As a result of our high-density sampling scheme, we found strong evidence of population stratification in chimpanzee populations and discovered new local genetic diversity linked to geographic origin. With this newly generated dataset and fine-grained geogenetic map, we implemented a strategy for the geolocalization of chimpanzees that has a direct conservation application.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Morioka, Hiroshi. „Analysis Methods toward Brain-Machine Interfaces in Real Environments“. 京都大学 (Kyoto University), 2015. http://hdl.handle.net/2433/199450.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
5

Gooch, Steven R. „A METHOD FOR NON-INVASIVE, AUTOMATED BEHAVIOR CLASSIFICATION IN MICE, USING PIEZOELECTRIC PRESSURE SENSORS“. UKnowledge, 2014. http://uknowledge.uky.edu/ece_etds/56.

Full text of the source
Annotation:
While all mammals sleep, the functions and implications of sleep are not well understood and are an active area of investigation in the research community. Mice are used in many sleep studies, with electroencephalography (EEG) signals widely used for data acquisition and analysis. However, since EEG electrodes must be surgically implanted in the mice, the method is costly and time-intensive. This work presents an extension of a previously researched high-throughput, low-cost, non-invasive method for mouse behavior detection and classification. A novel hierarchical classifier is presented that classifies behavior states, including NREM and REM sleep as well as active behavior states, using data acquired from a Signal Solutions (Lexington, KY) piezoelectric cage-floor system. The NREM/REM classification system showed 81% agreement with human EEG scorers, indicating a useful high-throughput alternative to the widely used EEG acquisition method.
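The hierarchical structure of such a classifier is easy to sketch: one stage separates sleep from active behavior, a second splits sleep into NREM and REM. The Python fragment below uses synthetic stand-in features and an assumed random-forest base learner; the thesis's actual features come from the piezoelectric cage-floor signal, and its classifier may differ:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))          # stand-in for per-epoch signal features
state = rng.integers(0, 3, size=600)   # 0 = active, 1 = NREM, 2 = REM (toy labels)

# Stage 1: sleep vs. active; Stage 2: NREM vs. REM, trained on sleep epochs only.
sleep_vs_active = RandomForestClassifier(random_state=0).fit(X, state > 0)
sleep_mask = state > 0
nrem_vs_rem = RandomForestClassifier(random_state=0).fit(X[sleep_mask], state[sleep_mask])

def classify(x: np.ndarray) -> int:
    """Return 0 (active), 1 (NREM) or 2 (REM) for one feature vector."""
    if not sleep_vs_active.predict(x[None])[0]:
        return 0
    return int(nrem_vs_rem.predict(x[None])[0])

print(classify(X[0]))
```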
APA, Harvard, Vancouver, ISO, and other citation styles
6

Stenström, Mats. „Computerised microtomography : non-invasive imaging and analysis of biological samples, with special reference to monitoring development of osteoporosis in small animals /“. Linköping : Univ, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5030.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
7

Elizalde, Siller Hugo Ramon. „Non-linear modal analysis methods for engineering structures“. Thesis, Imperial College London, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.419886.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Campbell, N. C. „Statistical methods for non-stationary time series analysis“. Thesis, University of Cambridge, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.597266.

Full text of the source
Annotation:
This dissertation is concerned with Bayesian methods for non-stationary time series analysis. Most of the algorithms developed use Markov chain Monte Carlo (MCMC) methods to sample from the required posteriors. A stochastic version of the Expectation Maximisation (EM) algorithm, the Expectation Sample (ES) algorithm, is developed. The performance of this algorithm is compared with EM and other stochastic EM algorithms for parameter estimation of locally stationary time series, and the ES algorithm is shown to overcome some of the well-documented limitations of the EM algorithm. Non-stationary time series are commonly modelled by segmenting them into a number of independent frames that can be considered stationary. An algorithm is developed whereby these individual segments can be treated as dependent on each other. This algorithm is used for noise reduction of a long audio signal, and it is shown to give improved results compared to existing techniques. The time-varying autoregressive (TVAR) model is introduced as a non-stationary time series model. Basis functions are used to model the TVAR coefficients, and an MCMC algorithm is developed to perform subset selection on the set of chosen basis functions. Results show that this algorithm is capable of reducing the number of basis functions used to model each TVAR coefficient. The subset selection algorithm is extended to deal with the problem of unknown TVAR model order, and two MCMC algorithms are developed: a reversible jump algorithm and a combined subset selection algorithm. An application to noise reduction of audio signals is considered, extending the techniques developed previously to account for the fact that the signal is now observed in noise. The algorithm is demonstrated using real audio with added white noise.
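The TVAR model referred to above is x(t) = a_1(t) x(t-1) + ... + a_p(t) x(t-p) + e(t), with each coefficient a_i(t) expanded on basis functions. Below is a minimal simulation sketch, assuming a polynomial basis; the thesis's MCMC inference and basis subset selection are omitted:

```python
import numpy as np

def tvar_simulate(c: np.ndarray, n: int, noise_std: float = 1.0, seed: int = 0) -> np.ndarray:
    """Simulate a TVAR(p) process whose coefficients a_i(t) = sum_k c[i, k] * (t/n)**k
    follow a polynomial basis in rescaled time."""
    rng = np.random.default_rng(seed)
    p, K = c.shape
    x = np.zeros(n)
    for t in range(p, n):
        basis = (t / n) ** np.arange(K)   # basis functions b_k(t)
        a = c @ basis                     # instantaneous AR coefficients
        x[t] = a @ x[t - p:t][::-1] + noise_std * rng.normal()
    return x

# Example: an AR(2) process whose first coefficient drifts from 0.5 towards 0.9.
coeffs = np.array([[0.5, 0.4],    # a_1(t) = 0.5 + 0.4 * (t/n)
                   [-0.2, 0.0]])  # a_2(t) = -0.2, constant
signal = tvar_simulate(coeffs, n=1000)
```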
APA, Harvard, Vancouver, ISO, and other citation styles
9

Xu, Song. „Non-interior path-following methods for complementarity problems /“. Thesis, Connect to this title online; UW restricted, 1998. http://hdl.handle.net/1773/5793.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
10

Faghidi, Hamid. „Non-parametric and Non-filtering Methods for Rolling Element Bearing Condition Monitoring“. Thèse, Université d'Ottawa / University of Ottawa, 2014. http://hdl.handle.net/10393/30689.

Full text of the source
Annotation:
Rolling element bearings are among the most significant and frequently used components in mechanical systems. Bearing fault detection and diagnosis is important for preventing productivity loss and averting catastrophic failures of mechanical systems. In industrial applications, bearing life is often difficult to predict due to different application conditions, load and speed variations, and maintenance practices. Reliable fault detection is therefore necessary to ensure productive and safe operations. Vibration analysis is the most widely used method for detection and diagnosis of bearing malfunctions, but a measured vibration signal from a sensor is often contaminated by noise and vibration interference components. Over the years, many methods have been developed to reveal fault signatures and to remove noise and vibration interference. Though many vibration-based methods have been proposed in the literature, the high-frequency resonance (HFR) technique is one of the very few that has received a degree of industrial acceptance. However, the effectiveness of HFR methods depends, to a great extent, on parameters such as the bandwidth and centre frequency of the fault-excited resonance and the window length, and proper selection of these parameters is often a knowledge-demanding and time-consuming process. In particular, a filter designed with an improperly selected bandwidth and centre frequency can filter out the true fault information and mislead detection and diagnosis decisions. In addition, even if these parameters are selected properly at the beginning of a process, they may become invalid in a time-varying environment after a certain period of time; they may then have to be re-calculated and updated, which is again a time-consuming and error-prone process. This undermines the practical significance of the above methods for online monitoring of bearing conditions. To overcome the shortcomings of existing methods, the following four non-parametric and non-filtering methods are proposed: 1. an amplitude demodulation differentiation (ADD) method; 2. a calculus enhanced energy operator (CEEO) method; 3. a higher order analytic energy operator (HO_AEO) approach; and 4. a higher order energy operator fusion (HOEO_F) technique. The proposed methods have been evaluated using both simulated and experimental data.
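The operators named above belong to the energy-operator family that descends from the discrete Teager-Kaiser energy operator, Psi[x](n) = x(n)^2 - x(n-1) x(n+1), which needs no band-pass filter design. The sketch below shows this classical operator on a simulated impulse train; the thesis's ADD, CEEO, HO_AEO and HOEO_F methods are refinements of this idea and are not reproduced here:

```python
import numpy as np

def teager_kaiser(x: np.ndarray) -> np.ndarray:
    """Discrete Teager-Kaiser energy operator: emphasizes the short, sharp
    impulses that a bearing defect excites in a vibration signal."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

# Example: periodic fault-like impacts buried in noise yield sharp peaks in psi.
rng = np.random.default_rng(1)
x = rng.normal(scale=0.3, size=4096)
x[::512] += 2.0                      # simulated bearing-fault impacts
envelope = teager_kaiser(x)
```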
APA, Harvard, Vancouver, ISO, and other citation styles
11

West, Martin Alan. „Methods of contact analysis of non-conforming rough surfaces“. Thesis, Imperial College London, 1991. http://hdl.handle.net/10044/1/8907.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
12

Zhang, Fan. „Geometric and probabilistic methods for non-Euclidean image analysis“. Thesis, University of York, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.445471.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
13

Blake, Kenneth William. „Moving mesh methods for non-linear parabolic partial differential equations“. Thesis, University of Reading, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.369545.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
14

Nilsson, Helen. „The change of haemoglobin during blood donation, and an assessment of a photometrical method for non-invasive haemoglobin analysis“. Thesis, Uppsala universitet, Institutionen för kvinnors och barns hälsa, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-210904.

Full text of the source
Annotation:
In Sweden, the lowest acceptable haemoglobin levels in blood donors are 125 g/L for women and 135 g/L for men for a sample taken at the beginning of the blood donation. Levels 10 g/L lower are accepted if the sample is taken after the donation. Earlier studies show that the haemoglobin level decreases in a person who is lying down. The two aims of this study were to examine how much haemoglobin levels change during blood donation, and to examine whether the photometric instrument Pronto-7™ gives results equivalent to those of the established method, Cell-Dyn Sapphire. In the study, 120 blood donors participated. Blood samples were taken at the beginning and at the end of the donation, and analyses with the Pronto-7™ were done before and after the donation. The haemoglobin level decreased significantly during blood donation: the mean difference was 5.9 g/L according to Cell-Dyn Sapphire, a decrease significantly smaller than 10 g/L. The Pronto-7™ gave levels that were on average 1.6 g/L higher than Cell-Dyn Sapphire, and its standard deviation was higher than that of Cell-Dyn Sapphire. In conclusion, the decrease in haemoglobin levels was significantly smaller than the expected difference of 10 g/L, and the Pronto-7™ gives results that differ slightly from those of the established method.
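The comparison reported here is essentially a paired analysis of pre- and post-donation values against the expected 10 g/L drop. Below is a minimal Python sketch with synthetic stand-in data; the study's raw measurements are not reproduced, and the test choice is an assumption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hb_before = rng.normal(140.0, 8.0, size=120)            # g/L, start of donation
hb_after = hb_before - rng.normal(5.9, 3.0, size=120)   # g/L, end of donation

diff = hb_before - hb_after
t, p = stats.ttest_1samp(diff, popmean=10.0)  # does the mean drop differ from 10 g/L?
print(f"mean decrease = {diff.mean():.1f} g/L, t = {t:.2f}, two-sided p = {p:.3g}")
```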
APA, Harvard, Vancouver, ISO, and other citation styles
15

Richardson, Thomas Stuart. „Continuation methods applied to non linear flight dynamics and control“. Thesis, University of Bristol, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.268783.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
16

Henderson, Daniel Adrian. „Modelling and analysis of non-coding DNA sequence data“. Thesis, University of Newcastle Upon Tyne, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299427.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
17

Trifonov, Trifon. „Analysis of methods for extraction of programs from non-constructive proofs“. Diss., lmu, 2012. http://nbn-resolving.de/urn:nbn:de:bvb:19-140308.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
18

Zhang, Shaojie. „Computational methods for genome-wide non-coding RNA discovery and analysis“. Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2007. http://wwwlib.umi.com/cr/ucsd/fullcit?p3271244.

Full text of the source
Annotation:
Thesis (Ph. D.)--University of California, San Diego, 2007.
Title from first page of PDF file (viewed August 13, 2007). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 98-108).
APA, Harvard, Vancouver, ISO, and other citation styles
19

Ashton, Triss A. „Accuracy and Interpretability Testing of Text Mining Methods“. Thesis, University of North Texas, 2013. https://digital.library.unt.edu/ark:/67531/metadc283791/.

Full text of the source
Annotation:
Extracting meaningful information from large collections of text data is problematic because of the sheer size of the database. However, automated analytic methods capable of processing such data have emerged. These methods, collectively called text mining, first began to appear in 1988. A number of additional text mining methods quickly developed in independent research silos, each based on unique mathematical algorithms. How good each of these methods is at analyzing text is unclear. Method development typically evolves from some silo-centric research requirement, with the success of the method measured by a custom, requirement-based metric; results of the new method are then compared to another method that was similarly developed. The proposed research introduces an experimentally designed testing method for text mining that eliminates research-silo bias and simultaneously evaluates methods from all of the major context-region text mining method families. The proposed research method follows a randomized block factorial design with two treatments consisting of three and five levels (RBF-35), with repeated measures. The contribution of the research is threefold. First, the users perceived a difference in the effectiveness of the various methods. Second, while still not fully clear, there are characteristics within the text collection that affect the algorithms' ability to extract meaningful results. Third, this research develops an experimental design process for testing the algorithms that is adaptable to other areas of software development and algorithm testing. This design eliminates the biased practices historically employed by algorithm developers.
APA, Harvard, Vancouver, ISO, and other citation styles
20

Stang, Jorgen. „Iterative methods for linear and geometrically non-linear parallel finite element analysis“. Thesis, Heriot-Watt University, 1995. http://hdl.handle.net/10399/743.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
21

Hassel, Beatriz Ivón. „ANALYSIS OF NATURAL MATERIALS AND STRUCTURES BY NON-CONTACT STRAIN MEASUREMENT METHODS“. Kyoto University, 2010. http://hdl.handle.net/2433/120467.

Full text of the source
Annotation:
Kyoto University (京都大学)
Doctor of Agricultural Science (博士(農学)), degree no. 甲第15424号, awarded under Article 4, Paragraph 1 of the Degree Regulations.
Graduate School of Agriculture, Division of Forest Science (京都大学大学院農学研究科森林科学専攻).
Examining committee: Prof. Kohei Komatsu (chair), Prof. Takato Nakano, Prof. Hiroyuki Yano.
APA, Harvard, Vancouver, ISO, and other citation styles
22

Gardner, Sugnet. „Extensions of biplot methodology to discriminant analysis with applications of non-parametric principal components“. Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52264.

Full text of the source
Annotation:
Dissertation (PhD)--Stellenbosch University, 2001.
Gower and Hand offer a new perspective on the traditional biplot. This perspective provides a unified approach to principal component analysis (PCA) biplots based on Pythagorean distance; canonical variate analysis (CVA) biplots based on Mahalanobis distance; non-linear biplots based on Euclidean-embeddable distances; and generalised biplots for use with both continuous and categorical variables. The biplot methodology of Gower and Hand is extended and applied to statistical discrimination and classification, leading to discriminant analysis by means of PCA biplots, CVA biplots, non-linear biplots and generalised biplots. Properties of these techniques are derived in detail. Classification regions defined for linear discriminant analysis (LDA) are applied in the CVA biplot, leading to discriminant analysis using biplot methodology. Situations where the assumptions of LDA are not met are considered, and various existing alternative discriminant analysis procedures are formulated in terms of biplots; apart from PCA biplots, QDA, FDA and DSM biplots are defined, constructed and their usage illustrated. It is demonstrated that biplot methodology naturally provides for managing categorical and continuous variables simultaneously, and a simulation study shows that techniques based on biplot methodology can be applied successfully to the reversal problem with categorical variables in discriminant analysis. Situations occurring in practice where existing discriminant analysis procedures based on distances from means fail are considered. After discussing self-consistency and principal curves (a form of non-parametric principal components), discriminant analysis based on distances from principal curves (a form of conditional mean) is proposed. This biplot classification procedure based on principal curves yields much better results. Bootstrapping is considered as a means of describing variability in biplots; variability in samples as well as of axes in biplot displays receives attention. Bootstrap α-regions are defined, and the ability of these regions to describe biplot variability and to detect outliers is demonstrated. Robust PCA and CVA biplots, restricting the role of influential observations on biplot displays, are also considered. An extensive library of S-PLUS computer programmes is provided for implementing the various discriminant analysis techniques that were developed using biplot methodology.
The application of the above theoretical developments and computer software is illustrated by analysing real-life data sets. Biplots are used to investigate the degree of capital intensity of companies and to serve as an aid in risk management for a financial institution. A particular application of the PCA biplot is the TQI biplot, used in industry to determine the degree to which manufactured items comply with multidimensional specifications. A further interesting application is to determine whether an Old-Cape furniture item is manufactured of stinkwood or embuia. A data set provided by the Western Cape Nature Conservation Board, consisting of measurements of tortoises of the species Homopus areolatus, is analysed by means of biplot methodology to determine whether morphological differences exist among tortoises from different geographical regions; allometric considerations need to be taken into account, and the resulting small sample sizes in some subgroups severely limit the use of conventional statistical procedures. Biplot methodology is also applied to classification in a diabetes data set, illustrating the combined advantage of using classification with principal curves in a robust biplot, or biplot classification where covariance matrices are unequal. A discriminant analysis problem where the foraging behaviour of deer might eventually result in a change in the dominant plant species is used to illustrate biplot classification of data sets containing both continuous and categorical variables. As an example of the use of biplots with large data sets, a data set consisting of 16828 lemons is analysed using biplot methodology to investigate differences in fruit from various areas of production, cultivars and rootstocks. The proposed α-bags also provide a means of quantifying the graphical overlap among classes; this method is successfully applied to a multidimensional socio-economic data set to quantify the degree of overlap among different race groups. The application of the proposed biplot methodology in practice has an important by-product: it provides the impetus for many a new idea. For example, applying a PCA biplot in industry led to the development of quality regions, and α-bags were constructed to represent thousands of observations in the lemons data set, in turn leading to means for quantifying the degree of overlap. This illustrates the enormous flexibility of biplots: biplot methodology provides an infrastructure for many novelties when applied in practice.
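As a point of reference for the machinery discussed above, a classical PCA biplot can be built from a singular value decomposition in a few lines. The Python sketch below shows only the basic scores-plus-variable-axes construction; the Gower-Hand calibrated axes and the α-bags of the thesis are considerably richer:

```python
import numpy as np
import matplotlib.pyplot as plt

def pca_biplot(X: np.ndarray, labels: list[str]) -> None:
    """Plot PCA scores (points) and variable loadings (arrows) in the first
    two principal components."""
    Xc = X - X.mean(axis=0)                      # centre the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :2] * s[:2]                    # sample coordinates
    plt.scatter(scores[:, 0], scores[:, 1], s=10)
    for j, name in enumerate(labels):            # one arrow per original variable
        plt.arrow(0, 0, Vt[0, j] * s[0], Vt[1, j] * s[1], color="red")
        plt.annotate(name, (Vt[0, j] * s[0], Vt[1, j] * s[1]))
    plt.xlabel("PC1")
    plt.ylabel("PC2")
    plt.show()

rng = np.random.default_rng(0)
pca_biplot(rng.normal(size=(50, 4)), ["x1", "x2", "x3", "x4"])
```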
APA, Harvard, Vancouver, ISO, and other citation styles
23

Piñal, Moctezuma Juan Fernando. „Characterization of damage evolution on metallic components using ultrasonic non-destructive methods“. Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/667641.

Full text of the source
Annotation:
When fatigue is considered, it is expected that structures and machinery will eventually fail. Still, when this damage is unexpected, besides the negative economic impact it produces, people's lives may be put at risk. It is therefore imperative that infrastructure managers schedule regular inspection and maintenance for their assets, and that designers and materials manufacturers have access to appropriate diagnostic tools in order to produce better and more reliable materials. In this regard, and for a number of applications, non-destructive evaluation techniques have proven to be an efficient and helpful alternative to traditional destructive assays of materials. In materials design in particular, researchers have recently exploited the acoustic emission (AE) phenomenon as an additional tool with which to characterize the mechanical properties of specimens. Nevertheless, several challenges arise in treating this phenomenon, since its intensity, duration and arrival behavior are essentially stochastic from the standpoint of traditional signal processing, leading to inaccurate assessments. This dissertation focuses on assisting the characterization of the mechanical properties of advanced high-strength steels under uniaxial tensile tests. Of particular interest is the ability to detect the nucleation and growth of a crack throughout such a test; the AE waves generated by the specimen during the test are therefore assessed with the aim of characterizing their evolution. To this end, the introduction gives a brief review of non-destructive methods with emphasis on the AE phenomenon. Next, an exhaustive analysis is presented of the challenges and deficiencies in detecting and segmenting each AE event in a continuous data stream with the traditional threshold detection method and with current state-of-the-art methods. A novel AE event detection method is then proposed to overcome these limitations. Evidence shows that the proposed method, which is based on short-time features of the AE signal waveform, exceeds the detection capabilities of current state-of-the-art methods in onset and end-time precision as well as in detection quality and computational speed. Finally, a methodology for analyzing the evolution of the frequency spectrum of the AE phenomenon during the tensile test is proposed. Results indicate that it is feasible to correlate the nucleation and growth of a crack with the evolution of the frequency content of AE events.
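To make the detection problem concrete, the sketch below segments a continuous stream with a running RMS envelope and an adaptive, noise-floor-based threshold. It illustrates the class of short-time waveform-feature detectors discussed above, not the thesis's algorithm; the window length, threshold factor and edge-pairing logic are illustrative assumptions:

```python
import numpy as np

def detect_ae_events(x: np.ndarray, win: int = 256, k: float = 4.0):
    """Return (onset, end) sample indices of candidate AE events, assuming
    the stream starts and ends in the quiescent (inactive) state."""
    rms = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))
    thr = k * np.median(rms)                   # adaptive noise-floor threshold
    edges = np.flatnonzero(np.diff((rms > thr).astype(int)))
    return list(zip(edges[::2], edges[1::2]))  # rising/falling edge pairs

# Example: one burst injected into Gaussian background noise.
rng = np.random.default_rng(0)
stream = rng.normal(scale=0.05, size=20000)
stream[5000:5300] += np.sin(np.linspace(0, 60 * np.pi, 300)) * np.hanning(300)
print(detect_ae_events(stream))
```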
APA, Harvard, Vancouver, ISO, and other citation styles
24

TAMAGNONE, GABRIELE. „Numerical modelling and design methods for CLT structures“. Doctoral thesis, Università degli Studi di Trieste, 2019. http://hdl.handle.net/11368/2991040.

Full text of the source
Annotation:
Since its conception in the mid-1990s, cross-laminated timber, also known as CLT or X-Lam, has achieved great popularity as a construction material thanks to its numerous intrinsic qualities, the worldwide effort to build reliable structures in seismic-prone areas, and the need for a more eco-friendly built environment. Many tests have been carried out in the last 15 years aimed at better understanding the behavior of connections in CLT buildings, CLT assemblies and CLT structures, in order to provide reliable rules for designing CLT structures under any loading condition. Based on these tests, many numerical models have been suggested over the years; they represent a fundamental tool for the design of CLT structures when specific design problems arise. Despite many years of effort, reliable design rules are still missing from almost every code worldwide, and many unknowns remain concerning the behavior of CLT structures at many levels (connections, assemblies, structures). This thesis summarizes three years of numerical investigations addressing different problems in understanding the behavior of CLT assemblies and structures under dynamic loading. The first part of this work continued a previous study, begun in the author's Master's thesis, on the formulation of a simplified method to obtain an axial-load/bending-moment limit domain for a CLT panel connected to the supporting surface through hold-down and angle-bracket connections. In the absence of test results of interest, the focus then returned to the formulation of simple methods for the design of CLT assemblies. The problem of panel-to-panel connections was investigated; in particular, the stiffness of such connections in relation to the rocking behavior of two-panel wall assemblies was studied through full-scale tests and finite element analyses. A formula for the design of these connections was first proposed and then, after further analyses, revised and corrected. To extend the analyses to more complex assemblies, the influence of diaphragm and wall-to-diaphragm connection stiffness on the rocking behavior of wall assemblies was numerically investigated, considering configurations with and without a diaphragm and varying several parameters to obtain statistically significant results. In the summer of 2017 the candidate actively participated in the NHERI TallWood Project, an American research project intended to test CLT structures in order to provide design rules for such structures in future US codes. Sponsored by Colorado State University, in the person of Professor John W. van de Lindt, the candidate collaborated on the setup of a 2-story CLT building that was tested on the UCSD shaking table in San Diego (California). To assess the most appropriate damping value for CLT structures under low-intensity seismic events, and to further investigate the potential of the component approach for modelling CLT structures, the 0.15 g shaking-table tests of the 3-story building of the SOFIE Project were reproduced and analyzed. Further considerations are made on the role of friction in this type of structure and on the problem of linear analyses for CLT structures (the non-symmetric response of connections loaded in tension and compression).
APA, Harvard, Vancouver, ISO, and other citation styles
25

Lu, Mingyang. „Forward and inverse analysis for non-destructive testing based on electromagnetic computation methods“. Thesis, University of Manchester, 2018. https://www.research.manchester.ac.uk/portal/en/theses/forward-and-inverse-analysis-for-nondestructive-testing-based-on-electromagnetic-computation-methods(c9b4f030-eb7d-42a9-b55c-07df6b96aa1f).html.

Full text of the source
Annotation:
Electromagnetic (EM) computation methods for the simulation and reconstruction of metallic plate properties are investigated in this thesis. Two major computational problems exist in EM NDT: the forward problem and the inverse problem. The forward problem is to calculate the frequency-dependent inductance for steel plates with arbitrary values of permeability, conductivity, thickness and lift-off (i.e. the distance between the sensor and the test sample). The inverse problem is to determine each of these parameters from frequency-dependent inductance measurements. The purpose of this dissertation is to develop advanced forward and inverse solvers, dealing mainly with metallic plate structures in the low-frequency induction regime. For the forward problem, both an edge-element FEM solver and the Dodd and Deeds analytical solution are developed to simulate eddy-current probe-coil problems. The feasibility and accuracy of the proposed forward solvers are verified by experiments and numerical solutions, and an example computation of eddy currents in metallic plates is carried out to test the solver's performance. The dissertation then considers the inverse problem of determining unique values for the four variables (permeability, conductivity, thickness and lift-off) from multi-frequency inductance spectra. In particular, novel methods to compensate for lift-off variations are proposed. In addition, CIP is explored to measure the permeability of ferrous plates. These methods are verified by measurement results from EM sensors.
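The inverse step described above amounts to fitting the four plate parameters to a measured multi-frequency inductance spectrum. The sketch below shows such a non-linear least-squares inversion around a crude placeholder forward model; in the thesis the forward model is the Dodd and Deeds solution or FEM, and the placeholder function here is purely an assumption for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def forward_inductance(freq, mu_r, sigma, thickness, liftoff):
    """Placeholder forward model mapping (mu_r, sigma, d, l0) to a complex
    inductance change; it stands in for the Dodd-Deeds integral solution."""
    mu0 = 4e-7 * np.pi
    skin = 1.0 / np.sqrt(np.pi * freq * mu0 * mu_r * sigma)   # skin depth
    return np.exp(-liftoff / 1e-3) * (1 - np.exp(-thickness / skin)) / (1 + 1j * freq * 1e-6)

freqs = np.logspace(2, 5, 30)             # 100 Hz to 100 kHz
true_params = (80.0, 4e6, 2e-3, 1e-3)     # mu_r, sigma (S/m), thickness (m), lift-off (m)
data = forward_inductance(freqs, *true_params)

def residual(p):
    model = forward_inductance(freqs, *p)
    return np.concatenate([(model - data).real, (model - data).imag])

fit = least_squares(residual, x0=(50.0, 1e6, 1e-3, 5e-4))
print(fit.x)   # recovered permeability, conductivity, thickness, lift-off
```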
APA, Harvard, Vancouver, ISO, and other citation styles
26

Kodewitz, Andreas. „Methods for large volume image analysis : applied to early detection of Alzheimer's disease by analysis of FDG-PET scans“. Thesis, Evry-Val d'Essonne, 2013. http://www.theses.fr/2013EVRY0005/document.

Full text of the source
Annotation:
In this thesis we explore novel image analysis methods for the early detection of metabolic changes in the human brain caused by Alzheimer's disease (AD). We present two methodological contributions and their application to a real-life data set. First, we present a machine-learning-based method to create a map of the local distribution of classification-relevant information in an image set, sampling blocks of voxels via a Monte Carlo algorithm. The method can be applied using different image characteristics, which makes it possible to adapt it to many kinds of images. The maps generated by this method are very localized and fully consistent with prior findings based on voxel-wise statistics. We further present an algorithm to draw a sample of patches according to a distribution represented by such a map; implementing a patch-based classification procedure that uses this algorithm for data reduction, we were able to significantly reduce the number of patches that must be analyzed in order to obtain good classification results, while extracting features whose importance is statistically quantifiable. Second, we present a novel non-negative tensor factorization (NTF) algorithm for the decomposition of large higher-order tensors. This algorithm considerably reduces memory consumption and avoids memory overhead, allowing the fast decomposition even of tensors with very unbalanced dimensions. We apply this algorithm as the feature extraction method in a computer-aided diagnosis (CAD) scheme designed to recognize early-stage AD and mild cognitive impairment (MCI) using fluorodeoxyglucose (FDG) positron emission tomography (PET) scans only. The classification rates obtained are often above state-of-the-art levels.
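The map-guided patch sampling described above reduces to drawing patch centres with probability proportional to the relevance-map values. Below is a minimal Python sketch, with a toy map and illustrative names:

```python
import numpy as np

def sample_patch_centres(relevance: np.ndarray, n: int, seed: int = 0) -> np.ndarray:
    """Draw n voxel coordinates with probability proportional to the map value."""
    rng = np.random.default_rng(seed)
    p = relevance.ravel().astype(float)
    p /= p.sum()
    flat_idx = rng.choice(p.size, size=n, replace=True, p=p)
    return np.stack(np.unravel_index(flat_idx, relevance.shape), axis=1)

toy_map = np.zeros((32, 32, 32))
toy_map[10:20, 10:20, 10:20] = 1.0          # one "informative" region
print(sample_patch_centres(toy_map, n=5))   # rows of (x, y, z) indices
```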
27

van Joolen, Vincent J. „Application of Higdon non-reflecting boundary conditions to shallow water models“. Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FvanJoolen%5FPhD.pdf.

Annotation:
Thesis (Ph. D. in Applied Mathematics)--Naval Postgraduate School, June 2003.
Dissertation supervisors: Beny Neta, Dan Givoli. Includes bibliographical references (p. 131-133). Also available online.
28

Rasekh, Ali. „Efficient methods for non-linear thermochemical analysis of composite structures undergoing autoclave processing“. Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/31050.

Annotation:
Composite structures are increasingly being used in different industries. Their manufacture poses challenges, most importantly predicting and controlling the process to specification. The usual numerical solutions, typically based on finite element analysis, are not well suited to large parts, especially when quick estimates are needed for preliminary design and optimization. There is therefore a need both for enhanced solutions that reduce the modelling effort in computer simulation of large and complex structures, and for simplified, easy-to-use methodologies that allow quick estimation from tables and charts. In the present work, a simple methodology is developed to estimate the temperature distribution in a thermoset polymer matrix composite slab placed on a tool and subjected to cycles of temperature ramp and hold, leading to curing of the composite and generation of heat by the internal chemical reactions. Supplementary diagrams are also generated to set limits on the method. A modified finite element solution for heat transfer is also introduced that reduces the mesh generation and computational effort: this "higher order shell element" uses enhanced shape functions and efficient methods for spatial and temporal integration to reduce computational run times. The developed methods provide the design engineer with efficient analysis tools for predicting the temperature in a composite part. The simple diagrams and tables can be used for preliminary estimation of the temperature distribution in the part at each stage of the material development, while the enhanced finite element methodology reduces the mesh generation and refinement effort needed to achieve accurate solutions for thermochemical modelling of complex composite structures during processing.
Faculty of Applied Science, Department of Civil Engineering, Graduate.
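As a rough illustration of the kind of coupled thermochemical problem the abstract describes, here is a minimal explicit finite-difference sketch of one-dimensional heat conduction in a curing slab with an exothermic source term. The material constants and the first-order Arrhenius cure law below are illustrative assumptions, not the thesis's model:

```python
import numpy as np

# Illustrative material constants (not from the thesis).
k, rho, cp = 0.6, 1600.0, 1200.0   # W/m-K, kg/m^3, J/kg-K
H_r = 5.0e5                        # total heat of reaction, J/kg
A, E = 1.0e5, 6.0e4                # Arrhenius prefactor (1/s), activation energy (J/mol)
R = 8.314

L, nx = 0.02, 41                   # 20 mm slab
dx = L / (nx - 1)
dt = 0.4 * dx**2 * rho * cp / k    # stable explicit time step
T = np.full(nx, 293.15)            # initial temperature, K
alpha = np.zeros(nx)               # degree of cure

for step in range(20000):
    # Tool/autoclave cycle: 2 K/min ramp followed by a hold at 450 K.
    T_wall = min(293.15 + (2.0 / 60.0) * (step * dt), 450.0)
    T[0] = T[-1] = T_wall
    # First-order cure kinetics: d(alpha)/dt = A * exp(-E/RT) * (1 - alpha).
    rate = A * np.exp(-E / (R * T)) * (1.0 - alpha)
    alpha += rate * dt
    # Explicit heat equation with the exothermic source rho * H_r * rate.
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (k * lap + rho * H_r * rate[1:-1]) / (rho * cp)

print(f"midpoint T = {T[nx//2]:.1f} K, cure = {alpha[nx//2]:.2f}")
```

The midpoint temperature overshoots the wall temperature once the exotherm kicks in, which is exactly the effect the thesis's design charts are meant to bound.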
29

Muralidharan, Abishek. „Evaluation of heavy-duty engine exhaust hydrocarbon and non-methane hydrocarbon analysis methods“. Morgantown, W. Va. : [West Virginia University Libraries], 2007. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=5520.

Annotation:
Thesis (M.S.)--West Virginia University, 2007.
Title from document title page. Document formatted into pages; contains viii, 87 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 72-73).
30

Uddin, Mohammad Moin. „Robust statistical methods for non-normal quality assurance data analysis in transportation projects“. UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/153.

Annotation:
The American Association of State Highway and Transportation Officials (AASHTO) and the Federal Highway Administration (FHWA) require the use of statistically based quality assurance (QA) specifications for construction materials, and many state highway agencies (SHAs) have accordingly implemented QA specifications for highway construction. In these statistically based QA specifications, the quality characteristics of most construction materials are assumed to be normally distributed; however, the normality assumption can be violated in several ways: the distribution of the data can be skewed, kurtosis-affected, or bimodal. If the process shows evidence of a significant departure from normality, the calculated quality measures may be erroneous. In this research study, an extended QA data analysis model is proposed which significantly improves the Type I error and power of the F-test and t-test, and removes bias from the Percent Within Limits (PWL) based pay factor calculation. For the F-test, three alternative tests are proposed when the sampling distribution is non-normal: 1) Levene's test; 2) Brown and Forsythe's test; and 3) O'Brien's test. One alternative is proposed for the t-test, the non-parametric Wilcoxon-Mann-Whitney rank-sum test. For PWL-based pay factor calculation when lot data suffer from non-normality, three schemes were investigated: 1) simple transformation methods; 2) the Clements method; and 3) a modified Box-Cox transformation using a golden-section search. The Monte Carlo simulation study revealed that both Levene's test and Brown and Forsythe's test are robust alternative tests of variances when the underlying population distribution is non-normal. Between the t-test and the Wilcoxon test, the t-test was found to be significantly robust even when the sample population distribution was severely non-normal. Among the data transformations for the PWL-based pay factor, the modified Box-Cox transformation using the golden-section search was found to be the most effective in minimizing or removing pay bias. Field QA data were analyzed to validate the model, and a Microsoft® Excel macro-based software tool was developed which can adjust any pay consequences due to non-normality.
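A minimal sketch of one ingredient above: choosing a Box-Cox exponent by golden-section search over the transform's profile log-likelihood. The function names and the likelihood criterion are illustrative assumptions, not the dissertation's implementation:

```python
import numpy as np

def boxcox_loglik(lmbda, x):
    # Profile log-likelihood of the Box-Cox transform for positive data x.
    y = np.log(x) if lmbda == 0 else (x**lmbda - 1.0) / lmbda
    n = len(x)
    return -0.5 * n * np.log(y.var()) + (lmbda - 1.0) * np.log(x).sum()

def golden_section_max(f, a, b, tol=1e-5):
    # Golden-section search for the maximum of a unimodal function on [a, b].
    invphi = (np.sqrt(5.0) - 1.0) / 2.0
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) > f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2.0

rng = np.random.default_rng(0)
x = rng.lognormal(mean=1.0, sigma=0.6, size=200)     # skewed "lot" data
lam = golden_section_max(lambda l: boxcox_loglik(l, x), -2.0, 2.0)
print(f"estimated Box-Cox lambda ~ {lam:.3f}")       # near 0 for lognormal data
```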
31

Kilic, Gokhan. „Application of advanced non-destructive testing methods on bridge health assessment and analysis“. Thesis, University of Greenwich, 2012. http://gala.gre.ac.uk/9811/.

Annotation:
Bridge structures play an important role in the economic, social and environmental aspects of society. Bridges are also subject to a natural process of deterioration of construction materials, as well as natural and environmental events such as flooding, freezing and thawing. Health monitoring and assessment of the structural integrity of bridges have been the focus of engineers and researchers for decades. Currently, the various aspects of bridge health are monitored separately; however, measuring these aspects independently does not give the overall health of the bridge, and crucial indicators of structural damage can be missed. Bridge health assessments generally take the form of individual non-destructive testing (NDT) techniques detecting individual defects, but value can be added by combining and comparing the findings of several different NDT surveys. In this way a more accurate assessment of bridge health is obtained, increasing confidence in the decision as to whether remedial action is necessary. In this thesis an integrated bridge health monitoring approach is proposed which applies several NDT techniques specifically chosen for bridge health assessment, thus achieving this added value. The approach can be used as part of a comprehensive bridge monitoring strategy, as an assessment tool to evaluate a bridge's structural health: it enables the user to obtain a detailed structural report on the bridge with all the necessary information pertaining to its health, allowing a fully informed decision on whether remedial action is necessary. This research presents the results of applying such methods in case studies utilising Ground Penetrating Radar (GPR), the IBIS-S system (a deflection and vibration detection sensor system with interferometric capability) and accelerometer sensors. It also evaluates the effectiveness of the adopted methods and technologies by comparing and validating the results against conventional methods (modelling and visual inspection). The research presents and discusses the processed data obtained by these methods in detail and reports on challenges encountered in setting up and carrying out the assessment process. The work also reports on Finite Element Modelling (FEM) of the main case study (Pentagon Road Bridge) using specialist software (SAP2000 and ANSYS) to simulate the perceived movement of the bridge under dynamic and static conditions; the analytical results were compared with those obtained from the non-destructive methods. The main aim of this thesis is thus to develop an integrated model and approach for the assessment and monitoring of the structural integrity and overall functionality of bridges. All the above methods were validated using preliminary case studies (GPR), additional equipment (accelerometers for IBIS-S validation) and additional techniques and information (SAP2000 and ANSYS were compared to one another and to IBIS-S results). All of these techniques were applied to the Pentagon Road Bridge, chosen because no information was available regarding its structural composition. Visual inspection showed the external defects of the structure: cracking, moisture ingress and concrete delamination were present in one of the spans.
The GPR surveys gave the position of the rebars and also signs of moisture ingress at depths of 20 cm (confirmed using velocity analysis). IBIS-S gave results for the deflection of the structure. FEM was used to model the behaviour of the bridge assuming no defects; to improve model accuracy, the rebar positions were input into the model and it was calibrated using IBIS-S data. The deflection results from the model were then compared with the measured deflection data to identify areas of deterioration. Excessive deflection was found on one of the spans; all NDT thus indicated that this span was an area of significant deterioration, and remedial action should be carried out on this section in the near future. Future prediction was also completed by running ANSYS simulations for increasing crack lengths under dynamic loading, which showed that without remedial action excessive beam bending moments and eventual collapse will occur. The results of this research demonstrated that GPR provided information on the extent of the internal structural defects of the bridge under study (moisture ingress and delamination), whilst the IBIS-S technology and accelerometer sensors permitted measurement of the magnitude of the vibration of the bridge under dynamic and static loading conditions. The results showed similarities between the FEM results and those of the adopted non-destructive methods in both location and pattern. This work can potentially contribute towards a better understanding of the mechanical and physical behaviour of bridge structures and ultimately help assess their life expectancy and functionality.
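The span-flagging logic described above can be pictured with a small sketch comparing measured deflections against a calibrated defect-free FEM prediction; all numbers and the tolerance below are invented for illustration:

```python
# Hypothetical span-level comparison of FEM-predicted and measured deflections;
# values and threshold are illustrative, not the thesis's data.
fem_deflection_mm = {"span1": 2.1, "span2": 2.3, "span3": 2.0}   # defect-free model
measured_mm       = {"span1": 2.2, "span2": 3.4, "span3": 2.1}   # e.g. from IBIS-S

TOLERANCE = 1.25  # flag spans deflecting >25% more than the calibrated model predicts
for span, fem in fem_deflection_mm.items():
    ratio = measured_mm[span] / fem
    status = "investigate: possible deterioration" if ratio > TOLERANCE else "ok"
    print(f"{span}: measured/model = {ratio:.2f} -> {status}")
```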
32

Ma, Jiajie. „Accuracy and reliability of non-linear finite element analysis for surgical simulation“. University of Western Australia. School of Mechanical Engineering, 2006. http://theses.library.uwa.edu.au/adt-WU2010.0089.

Annotation:
In this dissertation, the accuracy and reliability of non-linear finite element computations applied to surgical simulation are evaluated. The evaluation is performed by comparing experiments and finite element analyses of the indentation of a soft tissue phantom and a human brain phantom, in terms of the forces acting on the cylindrical aluminium indenter and the deformation of the phantoms due to these forces. The deformation of the phantoms is measured by tracking the 3D motion of X-ray opaque markers implanted directly beneath the indenter, using a custom-made biplane X-ray image intensifier (XRII) system. The phantoms are made of Sylgard® 527 gel to simulate the hyperelastic constitutive behaviour of brain tissue, and are prepared layer by layer to facilitate implantation of the markers. The modelling of the soft tissue phantom and human brain phantom indentation is performed using the ABAQUS™/Standard finite element solver. A realistic geometric model of the human brain phantom, obtained from magnetic resonance images, is used, together with constitutive properties of the phantom layers determined through uniaxial compression tests. The models accurately predict the indentation force-displacement relations and the marker displacements for both phantoms. The good agreement between experimental and modelling results verifies the reliability and accuracy of the finite element techniques used in this study and confirms their predictive power in application to surgical simulation.
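As a hedged illustration of the constitutive side of such a study: for an incompressible neo-Hookean solid under uniaxial loading the nominal stress is P = mu * (lambda - lambda^-2), so a shear modulus can be fitted to compression data by least squares. The data below are invented, and the thesis may well use a different hyperelastic model:

```python
import numpy as np

# Hypothetical uniaxial compression data for a soft gel (stretch, nominal stress in Pa);
# values are invented for illustration, not the thesis's measurements.
stretch = np.array([0.95, 0.90, 0.85, 0.80, 0.75])
nominal_stress = np.array([-420.0, -900.0, -1450.0, -2100.0, -2900.0])

# Incompressible neo-Hookean model, uniaxial loading: P = mu * (lambda - lambda^-2).
g = stretch - stretch**-2
mu = (nominal_stress @ g) / (g @ g)      # least-squares fit of the shear modulus
print(f"fitted shear modulus mu ~ {mu:.0f} Pa")
```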
33

Donmez, Ayca. „Adaptive Estimation And Hypothesis Testing Methods“. PhD thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/3/12611724/index.pdf.

Annotation:
For statistical estimation of population parameters, Fisher's maximum likelihood estimators (MLEs) are commonly used. They are consistent, unbiased and efficient, at any rate for large n. In most situations, however, MLEs are elusive because of computational difficulties. To alleviate these difficulties, Tiku's modified maximum likelihood estimators (MMLEs) are used. They are explicit functions of sample observations and easy to compute. They are asymptotically equivalent to MLEs and, for small n, are equally efficient. Moreover, MLEs and MMLEs are numerically very close to one another. For calculating MLEs and MMLEs, the functional form of the underlying distribution has to be known. For machine data processing, however, such is not the case. Instead, what is reasonable to assume for machine data processing is that the underlying distribution is a member of a broad class of distributions. Huber assumed that the underlying distribution is long-tailed symmetric and developed the so-called M-estimators. It is very desirable for an estimator to be robust and have a bounded influence function. M-estimators, however, implicitly censor certain sample observations, which most practitioners do not appreciate. Tiku and Surucu suggested a modification to Tiku's MMLEs. The new MMLEs are robust and have bounded influence functions. In fact, these new estimators are overall more efficient than M-estimators for long-tailed symmetric distributions. In this thesis, we have proposed a new modification to MMLEs. The resulting estimators are robust and have bounded influence functions. We have also shown that they can be used not only for long-tailed symmetric distributions but for skew distributions as well. We have used the proposed modification in the context of experimental design and linear regression. We have shown that the resulting estimators, and the hypothesis testing procedures based on them, are indeed superior to earlier such estimators and tests.
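The bounded-influence idea above can be pictured with a textbook Huber M-estimate of location computed by iteratively reweighted least squares. This is a generic sketch of M-estimation for comparison, not Tiku and Surucu's MMLE formulas; the tuning constant 1.345 is the conventional choice:

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted least squares."""
    mu = np.median(x)
    s = np.median(np.abs(x - mu)) / 0.6745          # robust scale from the MAD
    for _ in range(max_iter):
        r = (x - mu) / s
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(10, 1, 95), rng.normal(40, 1, 5)])  # 5% outliers
print(np.mean(sample), huber_location(sample))  # mean is dragged up; M-estimate is not
```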
34

Liesen, Jörg. „Construction and analysis of polynomial iterative methods for non-hermitian systems of linear equations“. [S.l. : s.n.], 1998. http://deposit.ddb.de/cgi-bin/dokserv?idn=955877776.

35

Wood, Jeffrey C. „An analysis of mixed finite element methods for Maxwell's equations on non-uniform meshes“. Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.282161.

36

Shikongo, Albert. „Numerical Treatment of Non-Linear singular perturbation problems“. Thesis, Online access, 2007. http://etd.uwc.ac.za/usrfiles/modules/etd/docs/etd_gen8Srv25Nme4_3831_1257936459.pdf.

37

Rydén, Patrik. „Statistical analysis and simulation methods related to load-sharing models“. Doctoral thesis, Umeå universitet, Matematisk statistik, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-46772.

Annotation:
We consider the problem of estimating the reliability of bundles constructed of several fibres, given a particular kind of censored data. The bundles consist of several fibres which have their own independent identically distributed failure stresses (i.e. the forces that destroy the fibres). The force applied to a bundle is distributed between the fibres in the bundle, according to a load-sharing model. A bundle with these properties is an example of a load-sharing system. Ropes constructed of twisted threads, composite materials constructed of parallel carbon fibres, and suspension cables constructed of steel wires are all examples of load-sharing systems. In particular, we consider bundles where load-sharing is described by either the Equal load-sharing model or the more general Local load-sharing model. In order to estimate the cumulative distribution function of failure stresses of bundles, we need some observed data. This data is obtained either by testing bundles or by testing individual fibres. In this thesis, we develop several theoretical testing methods for both fibres and bundles, and related methods of statistical inference. Non-parametric and parametric estimators of the cumulative distribution functions of failure stresses of fibres and bundles are obtained from different kinds of observed data. It is proved that most of these estimators are consistent, and that some are strongly consistent estimators. We show that resampling, in this case random sampling with replacement from statistically independent portions of data, can be used to assess the accuracy of these estimators. Several numerical examples illustrate the behavior of the obtained estimators. These examples suggest that the obtained estimators usually perform well when the number of observations is moderate.
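For equal load sharing, the classical Daniels construction gives the bundle failure load directly from the sorted fibre strengths, and resampling with replacement can assess estimator accuracy as described above. A minimal sketch with hypothetical fibre test data:

```python
import numpy as np

def bundle_strength(fibre_strengths):
    """Failure load of an equal load-sharing bundle (Daniels' model): with
    strengths sorted ascending, the load is shared equally among surviving
    fibres, so the bundle fails at max_k x_(k) * (n - k + 1)."""
    x = np.sort(fibre_strengths)
    n = len(x)
    return np.max(x * np.arange(n, 0, -1))

rng = np.random.default_rng(2)
fibres = rng.weibull(5.0, size=20) * 10.0        # hypothetical fibre test data

# Resampling with replacement to assess estimator accuracy, as in the thesis.
boot = [bundle_strength(rng.choice(fibres, size=20, replace=True))
        for _ in range(2000)]
print(f"bundle strength ~ {bundle_strength(fibres):.2f}, "
      f"bootstrap SE ~ {np.std(boot):.2f}")
```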
38

Persson, Jonas. „Bandwidth-reduced Linear Models of Non-continuous Power System Components“. Doctoral thesis, Stockholm : Electric Power Systems, School of Electrical Engineering, Royal Institute of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3984.

39

Stiefenhofer, Pascal Christian. „Stability analysis of non-smooth dynamical systems with an application to biomechanics“. Thesis, University of Sussex, 2016. http://sro.sussex.ac.uk/id/eprint/61481/.

Annotation:
This thesis discusses a two-dimensional non-smooth dynamical system described by an autonomous ordinary differential equation whose right-hand side is assumed to be discontinuous. We provide a local theory of existence, uniqueness and exponential asymptotic stability, and state a formula for the basin of attraction. Our conditions are sufficient. The theory generalizes smooth dynamical systems theory by providing contraction conditions for two nearby trajectories at a jump; such conditions have previously been studied only for a two-dimensional non-autonomous differential equation. We provide an example of the theory developed in this thesis and show that we can determine the stability of a periodic orbit without explicitly calculating it, which is the main advantage of our theory. Our conditions require a metric to be defined. This, however, can turn out to be a difficult task, and at present we do not have a method for finding such a metric systematically. The final part of this thesis considers an application of a non-smooth dynamical system to biomechanics: we model an elderly person stepping over an obstacle. Our model assumes stiff legs and suggests a gait strategy for overcoming the obstacle. This work is in collaboration with Professor Wagner's research group at the Institute for Sport Science at the University of Münster; however, we present only work developed independently in this thesis.
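A minimal sketch of simulating a system with a discontinuous right-hand side, using event detection to locate crossings of the switching line; the planar system below is a toy example, not the thesis's model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# A toy planar system whose right-hand side jumps across the line x2 = 0.
def f(t, x):
    if x[1] >= 0:
        return [-x[0] + 2.0, -5.0 * x[0]]
    return [-x[0] - 2.0, -0.5 * x[0]]

def switch(t, x):       # event: trajectory crosses the discontinuity surface
    return x[1]
switch.direction = 0

sol = solve_ivp(f, (0.0, 20.0), [1.0, 1.0], events=switch, max_step=0.01)
print(f"{len(sol.t_events[0])} crossings of the switching line detected")
```

Contraction conditions of the kind proved in the thesis would be checked precisely at these detected crossing times, where two nearby trajectories may jump.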
40

Wang, Suyi. „Analyzing data with 1D non-linear shapes using topological methods“. The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1524020976023345.

41

Gelmini, Angelo. „Advanced Analysis and Synthesis Methods for the Design of Next Generation Reflectarrays“. Doctoral thesis, Università degli studi di Trento, 2019. http://hdl.handle.net/11572/243312.

Annotation:
The design of reflectarray surface currents that satisfy both radiation and user-defined antenna feasibility constraints is addressed through a novel paradigm which takes advantage of the non-uniqueness of inverse source (IS) problems. To this end, the synthesis is formulated in the IS framework and its non-measurable solutions are employed as a design DoF. Thanks to the adopted framework, a closed-form expression for the design of reflectarray surface currents is derived which does not require any iterative local/global optimization procedure and which inherently satisfies both the radiation and the feasibility design constraints. The features and potentialities of the proposed strategy are assessed through selected numerical experiments dealing with different reflectarray aperture types/sizes and forbidden region definitions.
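A toy linear-algebra sketch of the non-uniqueness being exploited above: with fewer far-field samples than current unknowns, the radiation operator has a null space, and any null-space (non-measurable) component alters the current, the design degree of freedom, without altering the radiated pattern. The dimensions and the random operator are illustrative assumptions:

```python
import numpy as np

# Toy discrete inverse-source problem: A maps N current samples to M < N
# far-field samples, so A has a non-trivial null space.
rng = np.random.default_rng(3)
M, N = 20, 60
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
b = rng.standard_normal(M) + 1j * rng.standard_normal(M)    # desired pattern samples

j_min = np.linalg.pinv(A) @ b           # minimum-norm current radiating b
u, s, vh = np.linalg.svd(A)
null_basis = vh.conj().T[:, M:]         # N - M non-radiating directions
# Adding any null-space component changes the current (e.g. to meet feasibility
# constraints) but leaves the radiated pattern untouched.
j_alt = j_min + null_basis @ (0.1 * rng.standard_normal(N - M))
print(np.linalg.norm(A @ j_min - b), np.linalg.norm(A @ j_alt - b))  # both ~0
```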
42

Mäkikallio, T. (Timo). „Analysis of heart rate dynamics by methods derived from nonlinear mathematics: clinical applicability and prognostic significance“. Doctoral thesis, University of Oulu, 1998. http://urn.fi/urn:isbn:9514250133.

Annotation:
Abstract: The traditional methods of analysing heart rate variability, based on means and variance, are unable to detect subtle but potentially important changes in interbeat heart rate behaviour. This research was designed to evaluate the clinical applicability and prognostic significance of new dynamical methods of analysing heart rate behaviour derived from nonlinear mathematics. The study covered four different patient populations, their controls, and one general population of elderly people. The first patient group consisted of 38 patients with coronary artery disease without previous myocardial infarction, the second of 40 coronary artery disease patients with a prior Q-wave myocardial infarction, and the third of 45 patients with a history of ventricular tachyarrhythmia. The fourth group comprised 10 patients with a previous myocardial infarction who had experienced ventricular fibrillation during electrocardiographic recordings. The fifth group comprised a random sample of 347 community-living elderly people followed up for 10 years after electrocardiographic recordings. Heart rate variability was analysed by traditional time and frequency domain methods. The new dynamical measures derived from nonlinear dynamics were: 1) approximate entropy, which reflects the complexity of the data; 2) detrended fluctuation analysis, which describes the presence or absence of fractal correlation properties of time series data; and 3) power-law relationship analysis, which demonstrates the distribution of spectral characteristics of RR intervals but does not reflect the magnitude of spectral power in different spectral bands. Approximate entropy was higher in postinfarction patients (1.17 ± 0.22), but lower in coronary artery disease patients without myocardial infarction (0.93 ± 0.17), than in healthy controls (1.03 ± 0.14; p < 0.01 and p < 0.05, respectively). It did not differ between patients with and without ventricular arrhythmia. The short-term fractal-like scaling exponent of the detrended fluctuation analysis was higher in coronary artery disease patients without myocardial infarction (1.34 ± 0.15, p < 0.001), but not in postinfarction patients without arrhythmia (1.06 ± 0.13), compared with healthy controls (1.09 ± 0.13). The short-term exponent was markedly reduced in patients with life-threatening arrhythmia (0.85 ± 0.25 in ventricular tachycardia patients and 0.68 ± 0.18 in ventricular fibrillation patients; p < 0.001 for both). The long-term power-law slope of the power-law scaling analysis was lower in the ventricular fibrillation group than in postinfarction controls without arrhythmia risk (-1.63 ± 0.24 vs. -1.33 ± 0.23, p < 0.01) and predicted mortality in a general elderly population with an adjusted relative risk of 1.74 (95% CI 1.42–2.13). The present observations demonstrate that dynamic analysis of heart rate behaviour gives new insight into heart rate dynamics in various cardiovascular disorders. The breakdown of the normal fractal-like organising principle of heart rate variability is associated with an increased risk of mortality and vulnerability to life-threatening arrhythmias.
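Of the three dynamical measures, detrended fluctuation analysis is the most algorithmic. A minimal sketch of the short-term scaling exponent computation (box sizes and first-order detrending follow the common DFA-1 convention; this is not the study's code):

```python
import numpy as np

def dfa_alpha(rr, scales=range(4, 17)):
    """Short-term scaling exponent alpha_1 of detrended fluctuation analysis."""
    y = np.cumsum(rr - np.mean(rr))            # integrated, mean-centred series
    flucts = []
    for n in scales:
        n_boxes = len(y) // n
        f2 = 0.0
        for i in range(n_boxes):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f2 += np.mean((seg - trend) ** 2)
        flucts.append(np.sqrt(f2 / n_boxes))
    # The slope of log F(n) versus log n is the scaling exponent.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(4)
white = rng.standard_normal(3000)              # uncorrelated "RR intervals"
print(f"alpha_1 for white noise ~ {dfa_alpha(white):.2f}")   # theory: 0.5
```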
43

Cobb, Matthew. „Recoverable resources calculation using non-linear methods: a comparative study“. Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2016. https://ro.ecu.edu.au/theses/1809.

Annotation:
The prediction of recoverable resources at an operating manganese mine is currently undertaken using univariate ordinary kriging of the target variable, manganese, and five deleterious variables. Input data densities at the time of this calculation are considerably lower than at the time of final selection (grade control), and the potential for unacceptable conditional bias to be introduced through the use of linear geostatistical methods when estimating grades over a small support has led to an assessment of the potential benefit of employing the local change-of-support methods Localised Uniform Conditioning (LUC) and Conditional Simulation (CS). Allowance for the operating conditions, including time frames for estimation and simulation and likely software limitations, is made by also requiring decorrelation to be used in instances where the data are considered in a multivariate sense. A novel method for the decorrelation of geostatistical datasets, Independent Components Analysis (ICA), is compared against the more common method of Minimum-Maximum Autocorrelation Factorisation (MAF). ICA performs comparably to MAF in terms of its ability to diagonalise the variance-covariance matrix of the test dataset over multiple lags, for a variety of input data densities and treatments (log-ratio transformed and raw oxide data). Based on these results, ICA-decorrelated data were incorporated into a comparative study of LUC and CS against block ordinary kriging (BOK), using an input dataset of reduced density treated variously as raw univariate oxide data, decorrelated oxide data, and log-ratio transformed decorrelated data. The use of the log-ratio transform, designed to account for the 100% sum constraint inherent in the input data, proved impractical for LUC owing to difficulties associated with the discrete Gaussian change-of-support model employed by this technique; log-ratio transformation was therefore restricted to CS, where back-transformation to raw oxide space could take place on a pseudo-equivalent support to the input data, prior to change of support. While the log-ratio transformation guaranteed adherence to the sum constraint for CS results (the only method to do so), it distorted both the spatial and the grade distribution of the results. Decorrelation by ICA also posed difficulties, with biases introduced into the final back-transformed results by the decorrelation algorithm for both log-ratio transformed and oxide data, which in some instances produced impossible negative values for some variables. In a comparison of net profit calculations for each method, the distortions introduced by both log-ratio transformation and decorrelation became evident as either overly optimistic or overly conservative profit distributions. Of the results presented, only BOK, CS and LUC of non-decorrelated oxide data appear to produce results similar to those that would be used at the operation during final selection (based on ordinary kriging of a complete dataset). Based on the comparison of spatial grade distributions and of the spatial distribution and summary of net profit, the decision to employ a non-linear method of recoverable resource calculation at the operation in question would be questionable in terms of its reward for effort, given that the current method of BOK appears to produce equivalent results.
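As a generic illustration of ICA-based decorrelation, here is a sketch using scikit-learn's FastICA on synthetic correlated data. Note this checks zero-lag decorrelation only, whereas MAF additionally targets non-zero lags; this is not the thesis's workflow:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical correlated "grades" for three variables over 500 samples.
rng = np.random.default_rng(5)
S = rng.standard_normal((500, 3)) ** 3          # non-Gaussian sources
X = S @ np.array([[1.0, 0.5, 0.2],
                  [0.0, 1.0, 0.4],
                  [0.0, 0.0, 1.0]])              # mixed, hence cross-correlated

Z = FastICA(n_components=3, random_state=0).fit_transform(X)
# After decorrelation the off-diagonal correlations should be near zero, so
# each factor could be estimated independently and then back-transformed.
print(np.round(np.corrcoef(Z, rowvar=False), 3))
```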
44

Fok, Carlotta Ching Ting 1973. „Approximating periodic and non-periodic trends in time-series data“. Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79765.

Annotation:
Time-series data that reflect a periodic pattern are often used in psychology. In personality psychology, Brown and Moskowitz (1998) used spectral analysis to study whether fluctuations in the expression of four interpersonal behaviors show a cyclical pattern. Spline smoothing had also been used in the past to track the non-periodic trend, but no research has yet been done that combines spectral analysis and spline smoothing. The present thesis describes a new model which combines these two techniques to capture both periodic and non-periodic trends in the data.
The new model is then applied to Brown and Moskowitz's time-series data to investigate the long-term evolution of the four interpersonal behaviors, and to GDP data to examine the periodic and non-periodic patterns in the GDP values of 16 countries. Finally, the extent to which the model is accurate is tested using simulated data.
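A minimal sketch of combining the two ingredients named above: a least-squares sinusoid at a period suggested by spectral analysis captures the periodic part, and a smoothing spline fitted to the remainder tracks the non-periodic trend. The data and all parameters are illustrative assumptions, not the thesis's model:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical daily series: a weekly cycle superimposed on a slow drift.
rng = np.random.default_rng(6)
t = np.arange(200.0)
y = (2.0 * np.sin(2 * np.pi * t / 7.0)
     + (t - 100.0) ** 2 / 2000.0
     + rng.normal(0.0, 0.5, t.size))

period = 7.0                                   # e.g. identified by spectral analysis
X = np.column_stack([np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
periodic = X @ coef                            # fitted periodic component
trend = UnivariateSpline(t, y - periodic, s=60.0)(t)   # smooth non-periodic trend

residual = y - periodic - trend
print(f"residual std ~ {residual.std():.2f}")  # should approach the noise level, 0.5
```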
45

Thekedar, Bhushan. „Investigations on the use of breath gas analysis with Proton Transfer Reaction Mass Spectrometry (PTR-MS) for a non-invasive method of early lung cancer detection“. Open access, 2009. https://mediatum2.ub.tum.de/node?id=821780.

46

Brown, Andrew Michael. „Development of a probabilistic dynamic synthesis method for the analysis of non-deterministic structures“. Diss., Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/19065.

47

Grasso, Marica. „Transcutaneous laryngeal ultrasonography: a reliable, non-invasive and inexpensive preoperative method in the evaluation of vocal cords motility. A prospective multicentric analysis on a large series and an analysis of Literature“. Doctoral thesis, Università degli studi di Salerno, 2019. http://elea.unisa.it:8080/xmlui/handle/10556/4504.

Annotation:
2017 - 2018
INTRODUCTION: Benign and malignant thyroid diseases affect a large population worldwide, and total thyroidectomy is one of the most commonly performed interventions in general surgery. The most feared and dangerous complication of thyroidectomy is paresis or paralysis of the recurrent laryngeal nerve (RLN). Endocrine surgeons have therefore been prompted to include, among the preoperative examinations, evaluation of vocal cord function through flexible fiberoptic laryngoscopy (FFL). However, RLN injuries have a low incidence in referral centres with experienced surgeons, and routine FFL can be uncomfortable for patients and leads to an unjustifiable increase in health care costs. Transcutaneous laryngeal ultrasonography (TLUS) has been proposed as a non-invasive and painless indirect examination of vocal cord function and an alternative to direct FFL: it is easy and feasible, inexpensive, rapid, repeatable and well tolerated by the patient. The aim of this study is to assess the reliability of TLUS as an alternative to direct FFL in evaluating vocal fold function in candidates for thyroid surgery. MATERIALS AND METHODS: We conducted a prospective observational multicentric cohort study of 396 consecutive patients diagnosed with benign or malignant thyroid disease referred to the Thyroid Surgery Division of the University of Campania "Luigi Vanvitelli" and to the General and Specialistic Surgery Division of the "A. Cardarelli" Hospital. Patients were stratified into two groups according to BMI: non-overweight (BMI < 25) and overweight or obese (BMI ≥ 25). TLUS was performed on each patient by an experienced surgeon trained in ultrasound examination and, after TLUS, all patients underwent routine preoperative FFL by a blinded, experienced otolaryngologist. Findings were classified as normal or impaired vocal cord function. RESULTS: Sensitivity was 100% (98-100%), specificity 99.5% (98-99.9%), positive predictive value 66.7% (61.8-71.3%) and negative predictive value 100% (98-100%). The probability of a vocal cord alteration was 0% (0-10.4%) when TLUS was negative and 66.7% (60.7-72.3%) when it was positive. No false negatives were observed in our series, and the prevalence of vocal cord paralysis (VCP) was 1% (0.3-2.7%). The results showed a concordance between TLUS and FFL of 99.5%, with a Cohen's kappa of 0.798. DISCUSSION: Thanks to the standardization of the ultrasound technique, we achieved a high overall assessability rate of 96.46% alongside the sensitivity, specificity and predictive values above in the identification of vocal cord alterations. These encouraging data allowed us to consider TLUS as part of routine preoperative screening, as it is reliable in identifying healthy patients without paresis of the vocal cords; in doubtful cases of vocal cord motility, TLUS selects the patients who should be referred for FFL. Our study confirmed some difficulty in identifying the vocal cords in male patients with hypertrophy of the thyroid cartilage without calcification; this was overcome by adopting a different acoustic window with a lateral approach, our investigator having undergone specific training in ultrasound of the cervical region.
CONCLUSION: TLUS is a valid non-invasive and painless alternative for the preoperative assessment of the vocal cords in selected populations, such as pediatric patients, cardiopathic patients, patients who do not tolerate invasive examinations, patients with no diagnosis or suspicion of malignancy, and patients without voice changes. It could spare a high percentage of patients from FFL and at the same time accurately select candidates for second-level examinations. [edited by Author]
XVII n.s. (XXXI ciclo)
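For reference, the reported figures are mutually consistent with a single 2x2 table; the counts below (4 true positives, 2 false positives, 0 false negatives, 390 true negatives among 396 patients) are inferred for illustration, not taken from the thesis:

```python
# Hypothetical 2x2 agreement table between TLUS and FFL; counts chosen to be
# consistent with the reported sensitivity, specificity, PPV, NPV and kappa.
tp, fp, fn, tn = 4, 2, 0, 390        # "positive" = impaired vocal cord motility

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

n = tp + fp + fn + tn
p_obs = (tp + tn) / n                            # observed agreement
p_yes = ((tp + fp) / n) * ((tp + fn) / n)        # chance agreement on "positive"
p_no = ((fn + tn) / n) * ((fp + tn) / n)         # chance agreement on "negative"
kappa = (p_obs - (p_yes + p_no)) / (1 - (p_yes + p_no))

print(f"sens={sensitivity:.3f} spec={specificity:.3f} "
      f"PPV={ppv:.3f} NPV={npv:.3f} kappa={kappa:.3f}")   # kappa ~ 0.798
```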
48

Santasusana, Isach Miquel. „Numerical techniques for non-linear analysis of structures combining discrete element and finite element methods“. Doctoral thesis, Universitat Politècnica de Catalunya, 2016. http://hdl.handle.net/10803/404120.

Annotation:
This work encompasses a broad review of the basic aspects of the Discrete Element Method (DEM) for its application to general granular material handling problems, with special emphasis on particle-structure interaction and the modelling of cohesive materials. On the one hand, a special contact detection algorithm has been developed for the case of spherical particles, representing the granular medium, in contact with the finite elements that discretize the surface of rigid structures. The method, named the Double Hierarchy Method, improves the existing state of the art in the field by solving the problems posed by non-smooth contact regions and multi-contact situations. This topic is later extended to contact with deformable structures by means of a coupled DE-FE method; a special procedure is described for consistently transferring the contact forces, which are first calculated on the particles, to the nodes of the finite elements representing the solids or structures. On the other hand, a model developed by Oñate et al. for the modelling of cohesive materials with the DEM is numerically analysed to draw conclusions about its capabilities and limitations. In parallel to the theoretical developments, one of the objectives of the thesis is to provide the industrial partner of the doctoral programme, CITECHSA, with a computer code called DEMpack (www.cimne.com/dem/) that can apply the coupled DE-FE procedure to real engineering projects. One remarkable application of the developments in the framework of the thesis has been a project with the company Weatherford Ltd. involving the simulation of concrete-like material testing. The thesis is framed within the first graduation (2012-2013) of the Industrial Doctorate programme of the Generalitat de Catalunya; the thesis proposal comes out of the agreement between the company CITECHSA and the research centre CIMNE of the Polytechnic University of Catalonia (UPC).
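A minimal sketch of the kind of consistent force transfer described above: a particle contact force applied at a point on a surface triangle is distributed to the element's nodes using the shape-function (barycentric) values at that point, so the total force is preserved. This is a generic illustration, not the thesis's formulation:

```python
import numpy as np

def transfer_contact_force(force, contact_pt, tri_nodes):
    """Distribute a particle contact force to the three nodes of an FE surface
    triangle using the shape-function (barycentric) values at the contact point."""
    a, b, c = tri_nodes
    v0, v1, v2 = b - a, c - a, contact_pt - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    w1 = (d11 * d20 - d01 * d21) / denom
    w2 = (d00 * d21 - d01 * d20) / denom
    weights = np.array([1.0 - w1 - w2, w1, w2])      # N_a, N_b, N_c
    return weights[:, None] * force                   # per-node nodal forces

tri = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0]])
f = np.array([0.0, 0.0, -10.0])                      # contact force from a particle
print(transfer_contact_force(f, np.array([0.25, 0.25, 0.0]), tri))  # sums to f
```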
49

Lewandowski, Michal. „Advanced non linear dimensionality reduction methods for multidimensional time series : applications to human motion analysis“. Thesis, Kingston University, 2011. http://eprints.kingston.ac.uk/20313/.

Annotation:
This dissertation contributes to the state of the art in the field of pattern recognition and machine learning by advancing a family of nonlinear dimensionality reduction methods. We start with the automatisation of spectral dimensionality reduction approaches in order to facilitate the usage of these techniques by scientists in various domains wherever there is a need to explore large volumes of multivariate data. Then, we focus on the crucial and open problem of modelling the intrinsic structure of multidimensional time series. Solutions to this outstanding scientific challenge would advance various branches of science, from meteorology, biology and engineering to computer vision, wherever time is a key asset of high-dimensional data. We introduce two different approaches to this complex problem, both derived from the proposed concept of introducing spatio-temporal constraints between time series. The first algorithm allows for an efficient deterministic parameterisation of multidimensional time series spaces, even in the presence of data variations, whereas the second one approximates an underlying distribution of such spaces in a generative manner. We evaluate our original contributions in the area of visual human motion analysis, especially in two major computer vision tasks, i.e. human body pose estimation and human action recognition from video. In particular, we propose two variants of temporally constrained human motion descriptors, which become the foundation of view-independent action recognition frameworks and demonstrate excellent robustness against style, view and speed variability in the recognition of different kinds of motions. Performance analysis confirms the strength and potential of our contributions, which may benefit many domains beyond computer vision.
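One way to picture the spatio-temporal constraint idea is to add temporal-neighbour links to a feature-similarity graph before computing a spectral embedding. The following sketch (toy data, scikit-learn's SpectralEmbedding, all parameters illustrative) is an interpretation of that idea, not the dissertation's algorithms:

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

# Toy multidimensional time series: noisy observations of a 1D latent loop.
rng = np.random.default_rng(7)
T = 300
phase = np.linspace(0, 4 * np.pi, T)
X = np.column_stack([np.sin(phase), np.cos(phase)]) @ rng.standard_normal((2, 10))
X += 0.05 * rng.standard_normal(X.shape)

# Feature-similarity affinity plus explicit temporal-neighbour links, so that
# consecutive frames stay close in the embedding.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / d2.mean())
for i in range(T - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0     # maximal similarity for neighbours in time

emb = SpectralEmbedding(n_components=1, affinity="precomputed").fit_transform(W)
print(emb.shape)  # (300, 1): a temporally smooth one-dimensional parameterisation
```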
