Dissertations / Theses on the topic 'Evaluation of Measurement Uncertainty'



Consult the top 50 dissertations / theses for your research on the topic 'Evaluation of Measurement Uncertainty.'


1

Zakharov, I. P., O. A. Botsiura, I. Tsybina, and O. Zakharov. "Measurement uncertainty evaluation by kurtosis method at micrometer calibration." Thesis, "Софттрейд", 2020. https://openarchive.nure.ua/handle/document/18983.

Abstract:
A procedure for evaluating measurement uncertainty in micrometer calibration by the kurtosis method is considered. The measurement model is written as the deviation of the micrometer readings from the length of the reference gage block. The model takes into account corrections for the resolution of the micrometer being calibrated, for the lack of flatness and the departure from parallelism of its measuring faces, and for the temperature difference between the gage block and the calibrated micrometer. The input quantities and their standard uncertainties are estimated, and the combined standard uncertainty and expanded uncertainty are calculated taking into account the kurtosis of the input quantities. The report presents an uncertainty budget, which can serve as a basis for a software tool that facilitates the calculations. The proposed procedure was validated by the Monte Carlo method, which showed that it is adequate for the intended use.
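The combination of standard uncertainties and the Monte Carlo validation described in this abstract can be sketched in a few lines. This is a generic GUM-style sketch, not the authors' kurtosis method: the error model, sensitivity coefficients, and uncertainty values below are hypothetical, and the kurtosis correction to the coverage factor is omitted.

```python
import numpy as np

def combined_standard_uncertainty(sens, u):
    """GUM law of propagation for uncorrelated inputs: u_c = sqrt(sum (c_i*u_i)^2)."""
    sens, u = np.asarray(sens, float), np.asarray(u, float)
    return float(np.sqrt(np.sum((sens * u) ** 2)))

def monte_carlo_check(samplers, model, n=200_000, coverage=0.95, seed=0):
    """GUM-S1-style validation: propagate full distributions, report std and interval."""
    rng = np.random.default_rng(seed)
    draws = model(*[draw(rng, n) for draw in samplers])
    lo, hi = np.quantile(draws, [(1 - coverage) / 2, 1 - (1 - coverage) / 2])
    return float(np.std(draws, ddof=1)), (float(lo), float(hi))

# Hypothetical error model E = r - l + t (reading, gauge length, temperature term), in metres
u_c = combined_standard_uncertainty([1.0, -1.0, 1.0], [0.3e-6, 0.2e-6, 0.15e-6])
u_mc, interval = monte_carlo_check(
    [lambda rng, n: rng.normal(0.0, 0.3e-6, n),
     # rectangular distribution with half-width a has standard uncertainty a / sqrt(3)
     lambda rng, n: rng.uniform(-0.2e-6 * np.sqrt(3), 0.2e-6 * np.sqrt(3), n),
     lambda rng, n: rng.normal(0.0, 0.15e-6, n)],
    lambda r, l, t: r - l + t)
```

The closeness of `u_mc` to `u_c` plays the same role here that the Monte Carlo validation plays in the abstract.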
2

Wei, Peng. "Web and knowledge-based decision support system for measurement uncertainty evaluation." Thesis, Brunel University, 2009. http://bura.brunel.ac.uk/handle/2438/10114.

Abstract:
In metrology, measurement uncertainty is understood as a range within which the true value of the measurement is likely to fall. Recent years have seen rapid development in the evaluation of measurement uncertainty. The ISO Guide to the Expression of Uncertainty in Measurement (GUM 1995) is the primary guiding document for measurement uncertainty. More recently, Supplement 1 to the "Guide to the expression of uncertainty in measurement" – Propagation of distributions using a Monte Carlo method (GUM SP1) was published in November 2008. A number of software tools for measurement uncertainty have been developed and made available based on these two documents. The current software tools are mainly desktop applications utilising numeric computation with limited capacity for handling mathematical models. A novel and generic web-based Knowledge-Based Decision Support System (KB-DSS) has been proposed and developed in this research for measurement uncertainty evaluation. A Model-View-Controller architecture pattern is used for the proposed system. Under this general architecture, the web-based KB-DSS is developed as an integration of the Expert System and Decision Support System approaches. In the proposed uncertainty evaluation system, three knowledge bases are developed as sub-systems to implement the evaluation of measurement uncertainty. The first sub-system, the Measurement Modelling Knowledge Base (MMKB), assists the user in establishing the appropriate mathematical model for the measurand, a critical process for uncertainty evaluation. The second sub-system, the GUM Framework Knowledge Base, carries out the uncertainty evaluation process based on the GUM Uncertainty Framework using symbolic computation, whilst the third sub-system, the GUM SP1 MCM Framework Knowledge Base, conducts the uncertainty calculation numerically according to the GUM SP1 Framework, based on the Monte Carlo Method.
The design and implementation of the proposed system and sub-systems are discussed in the thesis, supported by elaboration of the implementation steps and examples. Discussions and justifications on the technologies and approaches used for the sub-systems and their components are also presented. These include Drools, Oracle database, Java, JSP, Java Transfer Object, AJAX and Matlab. The proposed web-based KB-DSS has been evaluated through case studies and the performance of the system has been validated by the example results. As an established methodology and practical tool, the research will make valuable contributions to the field of measurement uncertainty evaluation.
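The GUM Uncertainty Framework step that the second knowledge base automates can be illustrated with a minimal numerical stand-in. The thesis derives sensitivity coefficients symbolically; here they are approximated by central differences instead, and the measurand R = V/I with its input uncertainties is a hypothetical example, not one from the thesis.

```python
import math

def sensitivity(f, x, i, h=1e-6):
    """Central-difference estimate of the sensitivity coefficient df/dx_i."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(*xp) - f(*xm)) / (2 * h)

def gum_uncertainty(f, x, u):
    """Combined standard uncertainty for uncorrelated inputs (GUM law of propagation)."""
    return math.sqrt(sum((sensitivity(f, x, i) * ui) ** 2 for i, ui in enumerate(u)))

# Hypothetical measurand: resistance R = V / I, V = 10 V (u = 0.01 V), I = 2 A (u = 0.005 A)
u_R = gum_uncertainty(lambda V, I: V / I, [10.0, 2.0], [0.01, 0.005])
```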
3

Russo, Domenico. "Innovative procedure for measurement uncertainty evaluation of environmental noise accounting for sound pressure variability." Doctoral thesis, Universita degli studi di Salerno, 2017. http://hdl.handle.net/10556/2574.

Abstract:
2015 - 2016
This study aims to demonstrate the importance of uncertainty evaluation in the measurement of environmental noise in the context of Italian legislation on noise pollution. Attention is focused on the variability of the measurand as a source of uncertainty, and a procedure for evaluating the uncertainty of environmental noise measurement is proposed. First, drawing on several real noise datasets, a data-driven sampling strategy is proposed to determine suitable measurement time intervals for estimating environmental noise, taking into account the observed variability of the measured sound pressure levels. Second, outliers are eliminated from the noise measurements using a detection algorithm based on the K-neighbors distance. Third, the contribution of measurand variability to measurement uncertainty is determined using the normal bootstrap method. Experimental results on real data from environmental noise acquisition campaigns confirm the reliability of the proposal. It is shown to be very promising with regard to the prediction of expected values and uncertainty of traffic noise when a reduced dataset is considered. [edited by author]
In recent years, researchers and practitioners have focused their attention on the possible sources of uncertainty associated with this activity, seeking models that account for all the variables contributing to the uncertainty of sound pressure level measurements: the uncertainty due to the characteristics of the measuring instrumentation (sound level meters or multichannel analysers), the error arising from the positioning of the instrumentation and hence of the microphone transducers, the uncertainty due to the calibrator, as well as the uncertainty to be associated. In order to provide an adequate estimate of the indeterminacy associated with measuring the equivalent level of environmental noise, however, it is essential to consider the uncertainty arising from the intrinsic variability of the phenomenon under examination. The topic is of particular scientific interest and, in recent years, many authors have proposed different approaches to this problem: some have focused on eliminating unwanted sound signals, others on estimating the measurement time, and others directly on determining the uncertainty. In light of the above, I decided to integrate the different techniques studied into a single procedure based on the bootstrap method, a statistical technique of resampling with replacement from the initial dataset; it imposes no restrictions on the shape and properties of the statistical distributions considered and is therefore better suited to the analysis of environmental noise, whose population is not strictly Gaussian.
Initially, since the reliability of the estimate of environmental noise indicators depends significantly on the temporal variability of the noise, and it is therefore essential to choose carefully a measurement time that accounts for the statistical variability of the acoustic phenomenon under observation, the algorithm automatically identifies a minimum acquisition time, corresponding to the minimum number of sound pressure levels needed to guarantee the statistical significance of the initial dataset. In a second phase, any anomalous values (outliers) are identified and removed from the acquired signal and, finally, the uncertainty of the measurand is calculated by applying the bootstrap method. The results of this method were also compared with the estimate of the expected value of the short-term acoustic descriptor and its corresponding uncertainty obtained with the classical method (GUM ISO). Since the quantities calculated with the bootstrap method come very close to those determined with the classical method when a small number of samples is assumed, the procedure is also particularly suitable for predicting the environmental noise indicator when few measurement data are available. [edited by author]
XV n.s. (XXIX)
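The bootstrap step of the procedure described in this entry can be sketched as follows. This is a hedged illustration, not the thesis code: the equivalent-level formula is the standard energy average, and the synthetic traffic-noise record is invented for the example.

```python
import numpy as np

def leq(levels_db):
    """Equivalent continuous sound level: energy average of short-term levels."""
    return float(10 * np.log10(np.mean(10 ** (np.asarray(levels_db, float) / 10))))

def bootstrap_uncertainty(levels_db, n_boot=2000, seed=1):
    """Bootstrap spread of Leq: resample with replacement, take the std of the replicates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(levels_db, float)
    stats = [leq(rng.choice(x, size=x.size, replace=True)) for _ in range(n_boot)]
    return leq(x), float(np.std(stats, ddof=1))

# Invented traffic-noise record in dB (200 short-term samples), not thesis data
levels = 55 + 5 * np.random.default_rng(0).standard_normal(200)
L_eq, u_Leq = bootstrap_uncertainty(levels)
```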
4

Stellini, Marco. "Evaluation of Uncertainty and Repeatability in Measurement: two application studies in Synchronization and EMC Testing." Doctoral thesis, Università degli studi di Padova, 2009. http://hdl.handle.net/11577/3425620.

Abstract:
Efficient organization of measurement tasks requires knowledge and characterization of the parameters and effects that may affect the measurement itself. Uncertainty analysis is an example of how measurement accuracy is often difficult to quantify. Repeatability, the ability to replicate tests and related measurements at different times, also plays a key role. The research focused on this aspect of test-repeatability analysis. Specific case studies were considered in the field of measurements relating to synchronization between network nodes and measurements for Electromagnetic Compatibility. Synchronization between the components of a system is a particularly important requirement when considering distributed structures. The network nodes developed for this research are based both on PCs with a Real Time operating system (RTAI) and on Linux-based embedded systems (Acme Systems FOX Board) interfaced to an auxiliary module with a field-programmable gate array (FPGA). The aim of the tests is to measure and classify the uncertainty due to jitter in the Time Stamping mechanism, and consequently to evaluate the resolution and repeatability of the synchronization achieved under different traffic conditions using a standardized synchronization protocol (IEEE 1588-PTPd). The work in Electromagnetic Compatibility has likewise focused on the repeatability of measurements typical of some practical applications. Experiments involving LISN calibration have been carried out, and some improvements to reduce uncertainty are presented. A theoretical and experimental uncertainty analysis associated with ESD tests has been conducted and some possible solutions are proposed. A study on the performance of sites for radiated tests (anechoic chambers, open-area test sites) has been started using simulations and experimental testing in order to assess the capability of different sites.
The obtained results are compared with different reference sources. Finally, the results of a research project carried out at the University of Houston on the propagation of electromagnetic fields are reported.
Organising an efficient measurement campaign requires knowledge and characterisation of the parameters and effects that can influence the measurement itself. Uncertainty analysis is an example of how accuracy is often difficult to quantify. Besides uncertainty, however, repeatability plays a key role, that is, the possibility of replicating the test and the related measurements at different times. The research activity addressed precisely this aspect of test-repeatability analysis, considering specific case studies both in measurements relating to synchronisation between the nodes of a distributed system and in measurements for Electromagnetic Compatibility. Synchronisation is a particularly pressing need when distributed measurement structures are considered. The network nodes developed for this research are based both on PCs running a Real Time operating system (RTAI) and on Linux-based embedded systems (Acme Systems FOX Board) interfaced to an auxiliary module hosting a field-programmable gate array (FPGA). The tests carried out made it possible to measure and classify the uncertainty due to jitter in the Time Stamp mechanism and, consequently, to evaluate the resolution and repeatability of the synchronisation achieved under different traffic conditions using a synchronisation protocol standardised as IEEE 1588 (PTPd). In the field of electromagnetic compatibility, the work concentrated on the repeatability of measurements typical of some practical EMC applications. An in-depth analysis of the parasitic phenomena involved in the calibration of a LISN was carried out, and some constructive improvements were introduced to reduce the uncertainty contributions. A theoretical and experimental investigation was conducted into the uncertainty associated with immunity measurements using an electrostatic discharge generator, identifying possible solutions.
A study was also started on the performance of sites for radiated-disturbance measurements (anechoic chambers, open-area test sites), by means of theoretical simulations and field tests, in order to assess the limits of use of the different sites and to compare the results obtained with reference sources. Finally, the results of research carried out at the University of Houston on the propagation of electromagnetic fields are reported.
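The time-stamping jitter characterisation mentioned in this entry reduces, at its simplest, to a few summary statistics over the measured clock offsets. The offset samples below are hypothetical, not measured PTP data.

```python
import statistics

def jitter_stats(offsets_ns):
    """Summarise time-stamping offsets: mean offset, jitter (sample std), peak-to-peak."""
    mean = statistics.fmean(offsets_ns)
    jitter = statistics.stdev(offsets_ns)
    peak_to_peak = max(offsets_ns) - min(offsets_ns)
    return mean, jitter, peak_to_peak

# Hypothetical PTP offset samples in nanoseconds (invented for illustration)
mean, jitter, p2p = jitter_stats([120, 95, 130, 110, 105, 140, 90, 115])
```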
5

Celebioglu, Emrah Hasan. "Developing A Computer Program For Evaluating Uncertainty Of Some Typical Dimensional Measuring And Gauging Devices." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12605976/index.pdf.

Abstract:
In dimensional measurements, it is required to specify the uncertainty in the measurement as the range of possible deviation of the measurement result. In this thesis, a computer program is developed for evaluating uncertainty in measurement for commonly used dimensional measuring devices such as vernier callipers, micrometers, comparators, and gauge blocks. In the evaluation of the uncertainty in measurement, uncertainty sources such as the temperature difference between the measured part and the instrument, the uncertainty in the reference gauge block's dimension, mechanical effects, etc. are considered. The program developed employs the EAL, NIST, and GUM uncertainty evaluation equations as standard equations. However, the program can also be used for other measuring instruments, and users can define their own uncertainty equations. In the evaluations, symmetric distributions are used for the standard uncertainty of the variables considered. The program gives the uncertainty budget and, to compare the contribution of each variable to the overall measurement uncertainty, also reports the uncertainty effect ratio. In this thesis, the evaluation process for uncertainty in measurement, the difference between measurement error and uncertainty in measurement, and the structure of the program are discussed. Also, a set of experiments has been made to illustrate the application of the program for evaluating the measurement uncertainty of vernier callipers with 1/50 and 1/20 resolutions, a digital vernier calliper, and a 25 mm micrometer.
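The uncertainty budget and "uncertainty effect ratio" that the program reports can be sketched generically. The contribution names and values below are hypothetical, and the effect ratio is interpreted here as each component's share of the combined variance, an assumption, since the abstract does not define it.

```python
import math

def uncertainty_budget(contributions):
    """Combine standard-uncertainty contributions in quadrature and report, for each
    component, its share of the combined variance (the 'effect ratio' here)."""
    u_c = math.sqrt(sum(u ** 2 for _, u in contributions))
    ratios = {name: (u / u_c) ** 2 for name, u in contributions}
    return u_c, ratios

# Hypothetical calliper budget, standard uncertainties in micrometres (illustrative only)
u_c, ratios = uncertainty_budget([
    ("resolution", 0.0145),    # reading-resolution term
    ("temperature", 0.0060),   # part/instrument temperature-difference term
    ("gauge block", 0.0030),   # reference gauge block dimension
])
```

The ratios sum to one, which makes them convenient for spotting the dominant contribution.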
6

Lee, Kyutae. "Evaluation of methodologies for continuous discharge monitoring in unsteady open-channel flows." Diss., University of Iowa, 2013. https://ir.uiowa.edu/etd/5012.

Abstract:
Rating curves are the conventional means to provide continuous estimates of discharge in rivers. Among the most often adopted assumptions in building these curves are steady and uniform flow conditions for the open-channel flow, which in turn provide one-to-one relationships between the variables involved in discharge estimation. The steady-flow assumption is not applicable during the propagation of storm-generated waves; hence the question of the validity of steady rating curves during unsteady flow is of both scientific and practical interest. Scarce experimental evidence and analytical inferences substantiate that during unsteady flows the relationship between some of the variables is not unique, leading to looped rating curves (also labeled hysteresis). Neglecting the unsteadiness of the flow when it is large can significantly affect the accuracy of the flow estimation. Currently, the literature offers neither criteria for a comprehensive evaluation of methods for estimating the departure of looped rating curves from steady ones, nor means of identifying the most appropriate way to dynamically capture hysteresis for the range of possible river flow conditions. Therefore, the overarching goal of this study was to explore the uncertainty of the conventional approaches for constructing stage-discharge rating curves (hQRCs) and to evaluate methodologies for accurate and continuous discharge monitoring in unsteady open-channel flows using analytical inference, index velocity rating curves (VQRCs), and the continuous slope-area method (CSA), with consideration of discharge measurement uncertainty. The study demonstrates conceptual and experimental evidence to illustrate some of the unsteady-flow impacts on rating curves and suggests the development of a uniform end-to-end methodology to enhance the accuracy of current protocols for continuous streamflow estimation under both steady and unsteady river conditions.
Moreover, a hysteresis diagnostic method is presented to conveniently evaluate when and where hysteresis becomes significant as a function of site and storm-event characteristics. The measurement techniques and analysis methodologies proposed herein allow dynamic tracking of both the flood-wave propagation and the associated uncertainty in the conventional RCs.
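A steady stage-discharge rating curve of the kind this thesis evaluates is commonly parameterised as Q = a(h − h0)^b; a minimal least-squares fit in log space might look like the sketch below. The data are synthetic, not the study's measurements, and a looped (hysteretic) rating is precisely what departs from this one-to-one fit during unsteady flow.

```python
import math

def fit_rating_curve(stages, discharges, h0=0.0):
    """Least-squares fit of the steady rating Q = a * (h - h0)**b in log space."""
    xs = [math.log(h - h0) for h in stages]
    ys = [math.log(q) for q in discharges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic steady-flow pairs generated from Q = 2 * h**1.5 (invented, exact fit expected)
stages = [0.5, 1.0, 1.5, 2.0, 3.0]
a, b = fit_rating_curve(stages, [2 * h ** 1.5 for h in stages])
```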
7

Powell, Joanne. "Evaluating measurement uncertainty in amino acid racemization analysis : towards a new chronology." Thesis, University of York, 2012. http://etheses.whiterose.ac.uk/4465/.

Abstract:
Unlike other Quaternary dating methods, amino acid racemization (AAR) geochronology has the potential to provide age estimates that span the entire Quaternary period, a crucial period for understanding past climate change and human evolution. It has become a critical technique for Quaternary Science and uses the time- and temperature-dependent kinetics of protein decomposition to provide relative age estimates of fossil samples. The accuracy of age estimates relies heavily on the accuracy of analytical data and on accurate determinations of uncertainty estimates. This thesis takes internationally established principles of measurement uncertainty determination and applies them to AAR. Analytical uncertainty is considered in the context of intra- and inter-laboratory measurement results. A retrospective evaluation of intra-laboratory precision using ANOVA is given, and results from an inter-laboratory proficiency study, evaluated as estimates of bias, are summarised (paper submitted). The final sections look at uncertainty from existing archaeological site data, including sampling effects. A model is proposed that utilises decomposition correlations between amino acids to provide a priori uncertainty estimates. These are then used to update observed site data using a Bayesian approach to derive posterior uncertainty estimates and D/L values. A further model is tentatively presented which could potentially be used to derive quantitative age estimates once the uncertainty within the kinetic and temperature models has been characterised and accounted for.
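The retrospective ANOVA evaluation of intra-laboratory precision mentioned above can be sketched as a one-way variance-components decomposition. This is a generic sketch assuming a balanced design; the replicate D/L-style values are invented for illustration.

```python
import statistics

def anova_precision(groups):
    """One-way ANOVA variance components for a balanced design: within-group
    (repeatability) variance and the between-run component."""
    k, n = len(groups), len(groups[0])
    grand = statistics.fmean(x for g in groups for x in g)
    ms_within = statistics.fmean(statistics.variance(g) for g in groups)
    ms_between = n * sum((statistics.fmean(g) - grand) ** 2 for g in groups) / (k - 1)
    s_r2 = ms_within                                     # repeatability variance
    s_between2 = max((ms_between - ms_within) / n, 0.0)  # between-run component
    return s_r2, s_between2

# Invented replicate values: three analytical runs of three replicates each
s_r2, s_between2 = anova_precision([[1.0, 1.2, 1.1], [1.3, 1.5, 1.4], [0.9, 1.0, 1.1]])
```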
8

Mahowald, Jean [Verfasser]. "EVALUATION OF DYNAMIC DAMAGE INDICATORS ON REAL-LIFE CIVIL ENGINEERING STRUCTURES: MEASUREMENT UNCERTAINTY AND ENVIRONMENTAL INFLUENCES CONSIDERED / Jean Mahowald." Aachen : Shaker, 2014. http://d-nb.info/1050341740/34.

9

Mannschatz, Theresa. "Site evaluation approach for reforestations based on SVAT water balance modeling considering data scarcity and uncertainty analysis of model input parameters from geophysical data." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-175309.

Abstract:
Extensive deforestations, particularly in the (sub)tropics, have led to intense soil degradation and erosion with concomitant reduction in soil fertility. Reforestations or plantations on those degraded sites may provide effective measures to mitigate further soil degradation and erosion, and can lead to improved soil quality. However, a change in land use from, e.g., grassland to forest may have a crucial impact on water balance. This may affect water availability even under humid tropical climate conditions where water is normally not a limiting factor. In this context, it should also be considered that according to climate change projections rainfall may decrease in some of these regions. To mitigate climate change related problems (e.g. increases in erosion and drought), reforestations are often carried out. Unfortunately, those measures are seldom completely successful, because the environmental conditions and the plant specific requirements are not appropriately taken into account. This is often due to data-scarcity and limited financial resources in tropical regions. For this reason, innovative approaches are required that are able to measure environmental conditions quasi-continuously in a cost-effective manner. Simultaneously, reforestation measures should be accompanied by monitoring in order to evaluate reforestation success and to mitigate, or at least to reduce, potential problems associated with reforestation (e.g. water scarcity). To avoid reforestation failure and negative implications on ecosystem services, it is crucial to get insights into the water balance of the actual ecosystem, and potential changes resulting from reforestation. The identification and prediction of water balance changes as a result of reforestation under climate change requires the consideration of the complex feedback system of processes in the soil-vegetation-atmosphere continuum. 
Models that account for this feedback system are Soil-Vegetation-Atmosphere-Transfer (SVAT) models. For the aforementioned reasons, this study targeted two main objectives: (i) to develop and test a method combination for site evaluation under data scarcity (i.e. the study requirements) (Part I) and (ii) to investigate the consequences of the prediction uncertainty of SVAT model input parameters, which were derived using geophysical methods, on SVAT modeling (Part II). A water balance modeling approach was set at the center of the site evaluation approach. This study used the one-dimensional CoupModel, a SVAT model. CoupModel requires detailed spatial soil information for (i) model parameterization, (ii) upscaling of model results while accounting for local- to regional-scale soil heterogeneity, and (iii) monitoring of changes in soil properties and plant characteristics over time. Since traditional approaches to soil and vegetation sampling and monitoring are time-consuming and expensive (and therefore often limited to point information), geophysical methods were used to overcome this spatial limitation. For this reason, vis-NIR spectroscopy (visible to near-infrared wavelength range) was applied for the measurement of soil properties (physical and chemical), and remote sensing was used to derive vegetation characteristics (i.e. leaf area index (LAI)). Since the estimated soil properties (mainly texture) could be used to parameterize a SVAT model, this study investigated the whole processing chain and the related prediction uncertainty of soil texture and LAI, and their impact on CoupModel water balance prediction uncertainty. A greenhouse experiment with bamboo plants was carried out to determine the plant-physiological characteristics needed for CoupModel parameterization. Geoelectrics was used to investigate soil layering, with the intent of determining site-representative soil profiles for model parameterization.
Soil structure was investigated using image analysis techniques that allow the quantitative assessment and comparability of structural features. In order to meet the requirements of the selected study approach, the developed methodology was applied and tested at a site in NE Brazil (which has low data availability), with a bamboo plantation as the test site and a secondary forest as the reference site. Nevertheless, the objective of the thesis was not the concrete modeling of the case study site, but rather the evaluation of the suitability of the selected methods for evaluating sites for reforestation and for monitoring their influence on the water balance as well as on soil properties. The results (Part III) highlight that one needs to be aware of the measurement uncertainty related to SVAT model input parameters: for instance, the uncertainty of input parameters such as soil texture and leaf area index meaningfully influences the simulated water balance output. Furthermore, this work indicates that vis-NIR spectroscopy is a fast and cost-efficient method for soil measurement, mapping, and monitoring of soil physical (texture) and chemical (N, TOC, TIC, TC) properties, where the quality of the soil prediction depends on the instrument (e.g. sensor resolution), the sample properties (i.e. chemistry), and the site characteristics (i.e. climate). Additionally, the sensitivity of CoupModel outputs (surface runoff, transpiration, evaporation, evapotranspiration, and soil water content) to texture prediction uncertainty also depends on site conditions (i.e. climate and soil type). For this reason, it is recommended that a SVAT model sensitivity analysis be carried out prior to field spectroscopic measurements to account for site-specific climate and soil conditions. Nevertheless, mapping the soil properties estimated via spectroscopy using kriging resulted in poor interpolation results (i.e. weak variograms) as a consequence of the summation of uncertainty from field measurement through to mapping (i.e. spectroscopic soil prediction, kriging error) and site-specific 'small-scale' heterogeneity. The selected soil evaluation methods (vis-NIR spectroscopy, structure comparison using image analysis, traditional laboratory analysis) showed that there are significant differences between the bamboo soil and the adjacent secondary forest soil established on the same soil type (Vertisol). Reflecting on the major study results, it can be stated that the selected method combination is a way forward to a more detailed and efficient way of evaluating the suitability of a specific site for reforestation. The results of this study provide insights into where and when, during soil and vegetation measurements, a high measurement accuracy is required to minimize uncertainties in SVAT modeling.
Extensive deforestation, particularly in the (sub)tropics, has led to intense soil degradation and erosion with an accompanying loss of soil fertility. Reforestation on these areas is an effective measure against progressive soil degradation and erosion and can at times lead to improved soil quality. However, converting grassland to forest can have a decisive influence on the water balance. Even under humid tropical climate conditions, where water is not usually a limiting factor, reforestation can adversely affect water availability. In this context it must also be considered that climate models project a decrease in precipitation in some of these regions. To mitigate problems connected with climate change (e.g. increases in erosion and drought periods), extensive reforestation measures have been and are being carried out. Many of these measures were not always fully successful because the environmental conditions and the plant-specific requirements were not adequately taken into account. This is often due to the poor data basis and to the limited financial resources available in many developing and emerging countries. Innovative approaches are therefore needed that can record and evaluate site conditions quasi-continuously and cost-effectively. At the same time, the reforestation measure should be monitored in order to evaluate its success and to recognise potential negative effects (e.g. water scarcity), counteract them, or at least reduce them.
To avoid reforestation failing or having negative effects on ecosystem services, it is crucial to obtain knowledge of the actual water balance of the ecosystem and to be able to predict changes in the water balance caused by reforestation. Identifying and predicting water balance changes due to reforestation under climate change requires consideration of the complex, interlocking feedback processes in the soil-vegetation-atmosphere continuum. Hydrological models that explicitly examine the influence of vegetation on the water balance are Soil-Vegetation-Atmosphere-Transfer (SVAT) models. The present study pursued two main objectives: (i) the development and testing of a method combination for site evaluation under data scarcity (i.e. the basic requirement of the approach) (Part I), and (ii) the investigation of the influence of SVAT model input parameters predicted with geophysical methods (i.e. their prediction uncertainties) on the modelling (Part II). Water balance modelling was placed at the centre of the method combination. In this study, the 1D SVAT model CoupModel was used. CoupModel requires detailed spatial soil information (i) for model parameterisation, (ii) for upscaling model results while accounting for local and regional soil heterogeneity, and (iii) for monitoring temporal changes in soil and vegetation. Traditional approaches to measuring and monitoring soil and vegetation properties are, however, time-consuming and expensive, and are therefore often limited to point information. A promising approach to overcoming this spatial limitation is the use of geophysical methods.
For this reason, vis-NIR spectroscopy (visible to near-infrared wavelength range) was used for the quasi-continuous measurement of physical and chemical soil properties, and satellite-based remote sensing was used to derive vegetation characteristics (i.e. leaf area index (LAI)). Since the geophysically derived soil parameters (here soil texture) and plant parameters can be used to parameterise a SVAT model, the whole processing chain and the associated uncertainties, together with their potential effects on water balance modelling with CoupModel, were investigated. A greenhouse experiment with bamboo plants was carried out to determine the plant-physiological parameters required for CoupModel parameterisation. Geoelectrics was used to investigate the soil layering of the study area and to define a representative soil profile for modelling. The soil structure was evaluated using an image analysis technique that enables the qualitative assessment and comparability of structural features. To meet the requirements of the chosen site evaluation approach, the methodology was developed and tested at a site with a bamboo plantation and a secondary rainforest (as a reference area) in NE Brazil (i.e. low data availability). The aim of this work, however, was not the modelling of this specific site, but the evaluation of the suitability of the chosen method combination for site evaluation for reforestation and its monitoring over time, as well as the assessment of the influence of reforestation on the water balance and soil quality. The results (Part III) make clear that it is necessary to be aware of the potential influence of the measurement uncertainties of SVAT model input parameters on the modelling.
For example, the prediction uncertainties of soil texture and LAI had a considerable influence on water balance modelling with CoupModel. The work further shows that vis-NIR spectroscopy is suitable for the fast and cost-effective measurement, mapping, and monitoring of soil physical (texture) and chemical (N, TOC, TIC, TC) properties. The quality of the soil prediction depends on the instrument (e.g. sensor resolution), the sample properties (e.g. chemical composition), and the site characteristics (e.g. climate). The sensitivity analysis with CoupModel showed that the influence of the spectral texture prediction uncertainties on the simulated surface runoff, evaporation, transpiration, and evapotranspiration likewise depends on the site conditions (e.g. climate, soil type). It is therefore recommended to carry out a SVAT model sensitivity analysis before the spectroscopic field measurement of soil parameters in order to adequately account for the site-specific soil and climate conditions. Producing a soil map using kriging led to poor interpolation results as a consequence of the summation of measurement and estimation uncertainties (i.e. in spectroscopic field measurement, kriging error) and the small-scale soil heterogeneity. Using the chosen soil evaluation approach (vis-NIR spectroscopy, structure comparison with image analysis, traditional laboratory analyses), it could be shown that, for the same soil type (Vertisol), there are significant differences between the soils under bamboo and under secondary forest. On the basis of the most important results, it can be concluded that the chosen method combination can contribute to a more detailed and efficient site investigation and evaluation for reforestation.
The results of this study provide insight into where and when a particularly high measurement accuracy is required in soil and vegetation measurements in order to minimise uncertainties in SVAT modelling.
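A recurring finding in this entry is that the soil-texture prediction uncertainty from vis-NIR spectroscopy propagates into the simulated water balance. That propagation step can be sketched by Monte Carlo sampling through a toy runoff function (CoupModel itself is not used here; the bucket-style function and every number are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical vis-NIR prediction of the clay fraction, with its
# 1-sigma prediction uncertainty (illustrative values, not thesis data).
clay_mean, clay_sd = 0.35, 0.05

def runoff_mm(clay, rain=1200.0):
    """Toy stand-in for a SVAT water-balance output: annual surface
    runoff (mm) rising with clay content (lower infiltration)."""
    infiltration_capacity = 900.0 * (1.0 - clay)  # mm/yr, illustrative
    return max(rain - infiltration_capacity, 0.0)

# Monte Carlo propagation of the texture prediction uncertainty
clay_samples = np.clip(rng.normal(clay_mean, clay_sd, 10_000), 0.0, 1.0)
runoff = np.array([runoff_mm(c) for c in clay_samples])

print(f"runoff mean = {runoff.mean():.0f} mm, sd = {runoff.std():.0f} mm")
```

Replacing the toy function with an actual SVAT model run turns this into the kind of sensitivity analysis the thesis recommends performing before a spectroscopic field campaign.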
APA, Harvard, Vancouver, ISO, and other styles
10

Frazer, Robert Charles. "Measurement uncertainty in gear metrology." Thesis, University of Newcastle Upon Tyne, 2007. http://hdl.handle.net/10443/852.

Full text
Abstract:
Gears play an important role in mechanical power transmission systems. They enable the prime mover characteristic (a gas turbine, for example) to be matched to the characteristic of the driven load (say, a slow-speed propeller), thus reducing the cost of both manufacturing and operating the system. Customer requirements for higher power density and lower noise demand more accurate gears. This imposes more stringent requirements on the measuring equipment that controls the quality of the manufacturing machines. Many gears have flank form and tooth spacing tolerances that are less than 10 μm, and according to the so-called 'Golden rule', measuring equipment on the shop floor should have a measurement uncertainty of between 1 and 2 μm. These are stringent requirements that demand the highest standards of metrology. Thus the need to accurately quantify the measurement uncertainty of inspection machines is of paramount importance if costly mistakes are to be avoided. The work reported in this thesis was completed as part of the activities undertaken by the author in his role as head of the UK National Gear Metrology Laboratory (NGML). The laboratory is accredited by the United Kingdom Accreditation Service (UKAS) for gear measurement and on-site calibration of gear measuring machines. The work is mainly experimental in nature. In fact, much of what is reported is centred on work undertaken with two artefact sets: one set consisting of 100 mm diameter lead and profile artefacts and a second set of 200 mm diameter artefacts. These gear artefacts are probably the most valuable in the world because of the volume and quality of the calibration data associated with them.
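The 'Golden rule' quoted above (shop-floor measurement uncertainty of roughly one tenth to one fifth of the feature tolerance) reduces to a one-line budgeting check; the function name and the 10:1 default ratio below are illustrative conventions, not the thesis's own formulation:

```python
def required_uncertainty(tolerance_um, ratio=10):
    """'Golden rule' of metrology: the measuring process should consume
    only a small fraction of the tolerance -- 1/10 here by default,
    1/5 at the loose end (both ratios are the commonly quoted range)."""
    return tolerance_um / ratio

tol = 10.0  # flank-form tolerance in micrometres
print(required_uncertainty(tol))      # tight target: 1.0 um
print(required_uncertainty(tol, 5))   # loose bound: 2.0 um
```

For the sub-10 μm gear tolerances cited in the abstract, this immediately yields the 1–2 μm shop-floor uncertainty requirement the author mentions.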
11

Kim, Alisa. "Deep Learning for Uncertainty Measurement." Doctoral thesis, Humboldt-Universität zu Berlin, 2021. http://dx.doi.org/10.18452/22161.

Full text
Abstract:
This thesis focuses on solving the problem of uncertainty measurement and its impact on business decisions while pursuing two goals: first, to develop and validate accurate and robust models for uncertainty quantification, employing both well-established statistical models and newly developed machine learning tools, with a particular focus on deep learning. The second goal revolves around the industrial application of the proposed models, applying them to real-world cases where measuring volatility or making a risky decision entails a direct and substantial gain or loss. The thesis started with the exploration of implied volatility (IV) as a proxy for investors' perception of uncertainty for a new class of assets: crypto-currencies. The second paper focused on methods to identify risk-loving traders and employed the DNN infrastructure to investigate further the risk-taking behavior of market actors that both stems from and perpetuates uncertainty. The third paper addressed the challenging endeavor of fraud detection and offered a decision support model that allowed a more accurate and interpretable evaluation of financial reports submitted for audit. Following the importance of risk assessment and agents' expectations in economic development, and building on the existing work of Baker (2016) and their economic policy uncertainty (EPU) index, the fourth paper offered a novel DL-NLP-based method for the quantification of economic policy uncertainty. In summary, this thesis offers insights that are highly relevant to both researchers and practitioners. The new deep learning-based solutions exhibit superior performance over existing approaches to quantifying and explaining economic uncertainty, allowing for more accurate forecasting, enhanced planning capacities, and mitigated risks. The offered use-cases provide a road-map for further development of DL tools in practice and constitute a platform for further research.
12

Zhao, Lei. "Bench scale apparatus measurement uncertainty and uncertainty effects on measurement of fire characteristics of material systems." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-050105-182456/.

Full text
13

Hoa, Phan Le Phuong. "Uncertainty in measurement of piezoresistive sensors /." Dresden : W.e.b.-Univ.-Verl, 2005. http://deposit.ddb.de/cgi-bin/dokserv?id=2660800&prov=M&dok_var=1&dok_ext=htm.

Full text
14

Abbott, David Scot. "Assessing Student Understanding of Measurement and Uncertainty." NCSU, 2003. http://www.lib.ncsu.edu/theses/available/etd-06172003-143358/.

Full text
Abstract:
A test to assess student understanding of measurement and uncertainty has been developed and administered to more than 500 students at two large research universities. The aim is two-fold: 1) to assess what students learn in the first semester of introductory physics labs and 2) to uncover patterns in student reasoning and practice. The forty-minute, eleven-item test focuses on direct measurement and student attitudes toward multiple measurements. After one revision cycle using think-aloud interviews, the test was administered to three groups of students: students enrolled in traditional laboratory sections of first-semester physics at North Carolina State University (NCSU), students in an experimental (SCALE-UP) section of first-semester physics at NCSU, and students in first-semester physics at the University of North Carolina at Chapel Hill. The results were analyzed using a mixture of qualitative and quantitative methods. In the traditional NCSU labs, where students receive no instruction in uncertainty and measurement, students show no improvement in any of the areas examined by the test. In SCALE-UP and at UNC, students show statistically significant gains in most areas of the test. Gains on specific test items in SCALE-UP and at UNC correspond to areas of instructional emphasis. Test items were grouped into four main aspects of performance: 'point/set' reasoning, meaning of spread, ruler reading, and 'stacking'. Student performance on the pretest was examined to identify links between these aspects. Items within each aspect are correlated with one another, sometimes quite strongly, but items from different aspects rarely show statistically significant correlation. Taken together, these results suggest that student difficulties may not be linked to a single underlying cause.
The study shows that current instruction techniques improve student understanding, but that many students exit the introductory physics lab course without an appreciation or coherent understanding of the concept of measurement uncertainty.
15

Hainsworth, G. D. "Measurement uncertainty in water distribution telemetry systems." Thesis, Nottingham Trent University, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.383304.

Full text
16

Kim, Alisa [Verfasser]. "Deep Learning for Uncertainty Measurement / Alisa Kim." Berlin : Humboldt-Universität zu Berlin, 2021. http://d-nb.info/1227300824/34.

Full text
17

Placido, Rui. "Estimating measurement uncertainty in the medical laboratory." Thesis, Cranfield University, 2016. http://dspace.lib.cranfield.ac.uk/handle/1826/11258.

Full text
Abstract:
Medical laboratory accreditation is covered by ISO 15189:2012 - Medical Laboratories — Requirements for Quality and Competence. In Portugal, accreditation processes are held under the auspices of the Portuguese Accreditation Institute (IPAC), which applies the Portuguese edition (NP EN ISO 15189:2014). Accordingly, medical laboratory accreditation processes now require an estimate of the measurement uncertainty (MU) associated with the results. The Guide to the Expression of Uncertainty in Measurement (GUM) describes the calculation of MU but does not contemplate the specific aspects of medical laboratory testing. Several models have been advocated, yet without a final consensus. Given the lack of studies on MU in Portugal, especially on its application in the medical laboratory, the objective of this thesis is to arrive at a model that fulfils IPAC's accreditation regulations with regard to this specific requirement. The study was based on the implementation of two formulae (MU-A and MU-B), using the Quality Management System (QMS) data of an ISO 15189 accredited laboratory. Covering the laboratory's two Cobas® 6000–c501 (Roche®) analysers (C1 and C2), the work focused on three analytes: creatinine, glucose and total cholesterol. The MU-B model formula, combining the standard uncertainties of the method's imprecision, of the calibrator's assigned value and of the pre-analytical variation, was considered the one best fitting the laboratory's objectives and the study's purposes, representing well the dispersion of values reasonably attributable to the final result for the measurand. Expanded uncertainties were: creatinine - C1 = 9.60%, C2 = 5.80%; glucose - C1 = 8.32%, C2 = 8.34%; cholesterol - C1 = 4.00%, C2 = 3.54%. ...[cont.].
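The MU-B combination described above follows the GUM root-sum-of-squares pattern: combine the standard uncertainty components, then expand with a coverage factor. A minimal sketch, assuming relative standard uncertainties and the conventional k = 2 (~95 % coverage); the component values below are hypothetical, not the thesis's data:

```python
import math

def expanded_uncertainty(u_components, k=2.0):
    """GUM-style combination: root-sum-of-squares of the standard
    uncertainty components, expanded with coverage factor k
    (k = 2 gives roughly 95 % coverage for a normal distribution)."""
    u_combined = math.sqrt(sum(u ** 2 for u in u_components))
    return k * u_combined

# Hypothetical relative standard uncertainties (%) for one analyte:
u_imprecision = 2.1   # long-term CV from internal quality control
u_calibrator  = 1.5   # from the calibrator's assigned-value certificate
u_preanalytic = 0.8   # pre-analytical variation estimate

U = expanded_uncertainty([u_imprecision, u_calibrator, u_preanalytic])
print(f"U = {U:.2f} %")  # prints "U = 5.40 %"
```

Each analyser/analyte pair gets its own component values, which is why the abstract reports separate expanded uncertainties for C1 and C2.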
18

Machekhin, Yu P. "Uncertainty measurement and dynamic system chaotical behaviour." Thesis, France, 2008. http://openarchive.nure.ua/handle/document/8734.

Full text
Abstract:
The chaotic behaviour of a nonlinear dynamic system is discussed as a mechanism influencing measurement uncertainty. Using the logistic equation as a mathematical model, measurement uncertainty is investigated as a function of the state of the dynamic system. When dynamic chaos takes place in the system, measurement uncertainty increases and exhibits an abnormal distribution.
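The link between chaos and growing measurement uncertainty can be illustrated with the logistic map itself: two states separated by a tiny measurement error stay close in a stable regime but diverge in the chaotic one. A minimal sketch (the parameter values and initial error are illustrative, not taken from the paper):

```python
def logistic_spread(r, x0=0.4, dx=1e-6, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) from two initial states
    separated by a tiny 'measurement error' dx, and return the largest
    separation seen -- a proxy for how state uncertainty evolves."""
    a, b = x0, x0 + dx
    worst = dx
    for _ in range(steps):
        a, b = r * a * (1 - a), r * b * (1 - b)
        worst = max(worst, abs(b - a))
    return worst

print(logistic_spread(2.5))  # stable regime: the separation stays tiny
print(logistic_spread(3.9))  # chaotic regime: the separation blows up
```

In the chaotic regime the initial error grows until it is bounded only by the attractor itself, which is why uncertainty there can no longer be treated as a small perturbation of the measured value.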
19

デイビッド, ア., and David Ha. "Boundary uncertainty-based classifier evaluation." Thesis, https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB13128126/?lang=0, 2019. https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB13128126/?lang=0.

Full text
Abstract:
We propose a general method that enables accurate evaluation of any classifier model for realistic tasks, both in a theoretical sense, despite the finiteness of the available data, and in a practical sense, in terms of computation costs. The classifier evaluation challenge arises from the bias of the classification error estimate that is based on finite data alone. We bypass this existing difficulty by proposing a new classifier evaluation measure called "boundary uncertainty", whose estimate based on finite data can be considered a reliable representative of its expectation based on infinite data, and demonstrate the potential of our approach on three classifier models and thirteen datasets.
Doctor of Philosophy in Engineering
Doshisha University
20

Durisek, Nicholas Joseph. "Simultaneous overall measurement uncertainty reduction for multi-parameter macro-measurement system design /." The Ohio State University, 1996. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487942739808246.

Full text
21

Calkins, Joseph Matthew. "Quantifying Coordinate Uncertainty Fields in Coupled Spatial Measurement systems." Diss., Virginia Tech, 2002. http://hdl.handle.net/10919/28472.

Full text
Abstract:
Spatial coordinate measurement systems play an important role in manufacturing and certification processes. There are many types of coordinate measurement systems, including electronic theodolite networks, total station systems, video photogrammetry systems, laser tracking systems, laser scanning systems, and coordinate measuring machines. Each of these systems produces coordinate measurements containing some degree of uncertainty. Often, the results from several different types of measurement systems must be combined in order to provide useful measurement results. When these measurements are combined, the resulting coordinate data set contains uncertainties that are a function of the base data sets and complex interactions between the measurement sets. ISO standards, ANSI standards, and others require that estimates of uncertainty accompany all measurement data. This research presents methods for quantifying the uncertainty fields associated with coupled spatial measurement systems. The significant new developments and refinements presented in this dissertation are summarized as follows: 1) A geometrical representation of coordinate uncertainty fields. 2) An experimental method for characterizing instrument component uncertainty. 3) Coordinate uncertainty field computation for individual measurement systems. 4) Measurement system combination methods based on the relative uncertainty of each measurement's individual components. 5) Combined uncertainty field computation resulting from the interdependence of the measurements for coupled measurement systems. 6) Uncertainty statements for measurement analyses such as best-fit geometrical shapes and hidden-point measurement. 7) The implementation of these methods into commercial measurement software. 8) Case studies demonstrating the practical applications of this research. The specific focus of this research is portable measurement systems.
It is with these systems that uncertainty field combination issues are most prevalent. The results of this research are, however, general and therefore applicable to any instrument capable of measuring spatial coordinates.
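Combining measurement systems according to the relative uncertainty of each measurement, as in development 4) above, reduces in the simplest uncorrelated case to inverse-covariance weighting. A sketch of that special case only (the instrument names and covariances are illustrative; the dissertation's full treatment also handles the interdependence between coupled systems):

```python
import numpy as np

def fuse_points(p1, cov1, p2, cov2):
    """Minimum-variance fusion of two estimates of the same 3D point,
    each weighted by the inverse of its covariance. This assumes the
    two measurements are independent, which coupled systems generally
    violate -- hence the need for the fuller uncertainty-field analysis."""
    w1, w2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
    cov = np.linalg.inv(w1 + w2)
    p = cov @ (w1 @ p1 + w2 @ p2)
    return p, cov

# Illustrative: a laser-tracker point (tight) and a theodolite point (looser)
p_lt = np.array([1.000, 2.000, 0.500])
c_lt = np.diag([0.01, 0.01, 0.02]) ** 2   # covariance in mm^2, hypothetical
p_th = np.array([1.004, 1.996, 0.504])
c_th = np.diag([0.05, 0.05, 0.10]) ** 2

p, cov = fuse_points(p_lt, c_lt, p_th, c_th)
print(p)                       # fused point lies close to the tighter input
print(np.sqrt(np.diag(cov)))   # combined 1-sigma is below either input's
```

The fused covariance is what a coordinate uncertainty field stores per point; propagating it through best-fit or hidden-point analyses yields the uncertainty statements listed in item 6).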
Ph. D.
22

Haglind, Carl. "Evaluation and Implementation of Traceable Uncertainty for Threat Evaluation." Thesis, Uppsala universitet, Avdelningen för systemteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-228106.

Full text
Abstract:
Threat evaluation is used in various applications to find threatening objects or situations and neutralize them before they cause any damage. To make threat evaluation as user-friendly as possible, it is important to know where the uncertainties are. The method Traceable Uncertainty can make the threat evaluation process more transparent and hopefully easier to rely on. Traceable Uncertainty is used when different sources of information are combined to find support for the decision-making process. The uncertainty of the current information is measured before and after the combination. If the magnitude of uncertainty has changed by more than a threshold, a new branch is created which excludes the new information from the combination of evidence. Traceable Uncertainty had never been tested on any realistic scenario to investigate whether it is possible to implement the method on a large-scale system. The hypothesis of this thesis is that Traceable Uncertainty can be used on large-scale systems if its threshold parameter is tuned in the right way. Different threshold values were tested when recorded radar data were analyzed for threatening targets. Experiments combining randomly generated evidence were also analyzed for different threshold values. The results showed that a threshold value in the range [0.15, 0.25] generated a satisfactory number of interpretations that were not too similar to each other. The results could also be filtered to remove unnecessary interpretations. This shows that, in this respect and for this data set, Traceable Uncertainty can be used on large-scale systems.
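The branch-on-uncertainty-change idea can be sketched as follows, assuming Shannon entropy as the uncertainty measure and a normalized-product combination rule; both are illustrative stand-ins chosen for the sketch, since the abstract does not specify them:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def combine(p, q):
    """Normalized product of two hypothesis distributions -- a simple
    stand-in for the evidence-combination rule."""
    r = [a * b for a, b in zip(p, q)]
    s = sum(r)
    return [x / s for x in r]

def traceable_update(belief, evidence, threshold=0.2):
    """If combining the new evidence changes the measured uncertainty
    by more than `threshold`, also keep a branch that excludes the
    evidence. The 0.2 default sits in the [0.15, 0.25] range the
    thesis found workable."""
    combined = combine(belief, evidence)
    delta = abs(entropy(combined) - entropy(belief))
    branches = [combined]
    if delta > threshold:
        branches.append(belief)  # branch excluding the new information
    return branches

belief   = [0.5, 0.5]   # maximally uncertain threat hypothesis
evidence = [0.9, 0.1]   # strong new sensor indication -> large entropy drop
print(traceable_update(belief, evidence))  # two branches are kept
```

Weak evidence (e.g. `[0.55, 0.45]`) changes the entropy only marginally, so a single combined branch survives, which is how the threshold controls how many interpretations accumulate.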
23

Clouse, Randy Wayne. "Evaluation of GLEAMS considering parameter uncertainty." Thesis, Virginia Tech, 1996. http://hdl.handle.net/10919/44516.

Full text
24

Clouse, Randy W. "Evaluation of GLEAMS considering parameter uncertainty /." This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-09042008-063009/.

Full text
25

Nevin, Anne. "The Uncertainty of Decisions in Measurement Based Admission Control." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for telematikk, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11561.

Full text
Abstract:
Most real-time voice and video applications are delay/loss sensitive but relaxed in the sense that they can tolerate some packet loss/delay. Using this information, network utilization can be greatly improved by exploiting statistical multiplexing. To this end, Measurement Based Admission Control (MBAC) has long been recognized as a promising solution. MBAC algorithms do not require an a priori source characterization, which in many cases may be difficult or impossible to attain. Instead, MBAC uses measurements to capture the behavior of existing flows and uses this information, together with some coarse knowledge of a new flow, when making an admission decision for that requesting flow. The foremost requirement for MBAC to be successful is that it can robustly provide Quality of Service (QoS) to the accepted flows. Being robust means that MBAC must be able to withstand a sudden increase in the number of users trying to access the network, handle applications with various capacity requirements, and handle an aggregate rate that may change in a highly unpredictable manner. These robustness issues become challenging since MBAC relies on erroneous measurements. Measurements are unavoidably inaccurate. This imperfection creates uncertainties which affect the MBAC decision process. The degree of uncertainty depends on flow characteristics, the length of the observation window and the flow dynamics. Flows will be accepted when they should have been rejected (false acceptances), and rejected when they should have been accepted (false rejections). For the service provider, false rejections translate into a decrease in utilization; for the end user, false acceptance means that the QoS of the flow can no longer be guaranteed. Basing admissions on measurements clearly requires an understanding of the measurement error and how it impacts the performance of MBAC.
This thesis considers the uncertainty of the MBAC admission decision process and describes a methodology for analyzing measurement errors and the resulting performance of MBAC. When studying the performance of MBAC, the key is to focus on the time-scale over which measurements are collected and the admission decision is made. This is in contrast to the infinite time-scale used when evaluating the performance of MBAC with respect to utilization and loss/delay probabilities. We find how the uncertainty in the measurements varies with the length of the observation window. Non-homogeneous flows cause increased complexity for the MBAC decision algorithm and also for the estimation process. The concept of similar flows is introduced as a restriction to simplify the analytical expressions in a non-homogeneous flow environment. The probability of false acceptance can be reduced by adding a slack in bandwidth. When determining the size of this slack, the service provider is confronted with a trade-off between maximizing useful traffic and reducing useless traffic. We show how the system can be provisioned to meet predetermined performance criteria. This work is fundamentally different from any previous work concerning MBAC and opens the way for new thinking and methods for analyzing MBAC performance.
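A measured-sum admission test with a bandwidth slack, as discussed above, can be sketched as follows (the decision rule and every number are illustrative simplifications of generic MBAC, not the thesis's exact algorithm):

```python
def admit(measured_rate_mbps, new_flow_peak_mbps, capacity_mbps,
          slack_mbps=5.0):
    """Measured-sum MBAC sketch: admit a new flow if the measured
    aggregate rate plus the flow's declared peak fits under capacity
    minus a bandwidth slack. A larger slack reduces false acceptances
    (QoS violations caused by measurement error) at the price of more
    false rejections (lost utilization). The 5 Mbps default is purely
    illustrative."""
    return measured_rate_mbps + new_flow_peak_mbps <= capacity_mbps - slack_mbps

print(admit(80.0, 10.0, 100.0))   # True:  80 + 10 = 90 <= 95
print(admit(88.0, 10.0, 100.0))   # False: 88 + 10 = 98 >  95
```

Since `measured_rate_mbps` is itself an error-prone estimate over a finite observation window, sizing the slack against the distribution of that estimation error is exactly the provisioning problem the thesis analyzes.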
26

McCoy, Martin. "Uncertainty analysis in flow measurement by Monte Carlo simulation." Thesis, University of Strathclyde, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.405481.

Full text
27

Joubert, Daniek. "Adaptive occupancy grid mapping with measurement and pose uncertainty." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/71911.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: In this thesis we consider the problem of building a dense and consistent map of a mobile robot’s environment that is updated as the robot moves. Such maps are vital for safe and collision-free navigation. Measurements obtained from a range sensor mounted on the robot provide information on the structure of the environment, but are typically corrupted by noise. These measurements are also relative to the robot’s unknown pose (location and orientation) and, in order to combine them into a world-centric map, pose estimation is necessary at every time step. A SLAM system can be used for this task. However, since landmark measurements and robot motion are inherently noisy, the pose estimates are typically characterized by uncertainty. When building a map it is essential to deal with the uncertainties in range measurements and pose estimates in a principled manner to avoid overconfidence in the map. A literature review of robotic mapping algorithms reveals that the occupancy grid mapping algorithm is well suited for our goal. This algorithm divides the area to be mapped into a regular lattice of cells (squares for 2D maps or cubes for 3D maps) and maintains an occupancy probability for each cell. Although an inverse sensor model is often employed to incorporate measurement uncertainty into such a map, many authors merely state or depict their sensor models. We derive our model analytically and discuss ways to tailor it for sensor-specific uncertainty. One of the shortcomings of the original occupancy grid algorithm is its inability to convey uncertainty in the robot’s pose to the map. We address this problem by altering the occupancy grid update equation to include weighted samples from the pose uncertainty distribution (provided by the SLAM system). The occupancy grid algorithm has been criticized for its high memory requirements. 
Techniques have been proposed to represent the map as a region tree, allowing cells to have different sizes depending on the information received for them. Such an approach necessitates a set of rules for determining when a cell should be split (for higher resolution in a local region) and when groups of cells should be merged (for lower resolution). We identify some inconsistencies that can arise from existing rules, and adapt those rules so that such errors are avoided. We test our proposed adaptive occupancy grid algorithm, which incorporates both measurement and pose uncertainty, on simulated and real-world data. The results indicate that these uncertainties are included effectively, to provide a more informative map, without a loss in accuracy. Furthermore, our adaptive maps need far fewer cells than their regular counterparts, and our new set of rules for deciding when to split or merge cells significantly improves the ability of the adaptive grid map to mimic its regular counterpart.
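The altered occupancy update described in this abstract can be sketched in miniature. This is one plausible reading of the weighted-sample idea, not the thesis's actual code; the inverse sensor model, the pose samples, and all numbers below are hypothetical:

```python
import math

def logodds(p):
    return math.log(p / (1.0 - p))

def prob(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

def update_cell(prior_logodds, pose_samples, inverse_sensor):
    """Occupancy update for one cell, mixing the inverse-sensor evidence
    over weighted pose samples before the usual log-odds update.

    pose_samples: list of (weight, pose) with weights summing to 1.
    inverse_sensor: pose -> occupancy probability of this cell.
    """
    p_mixed = sum(w * inverse_sensor(pose) for w, pose in pose_samples)
    return prior_logodds + logodds(p_mixed)

# Hypothetical numbers: three pose samples for a cell hit by a range return.
samples = [(0.5, 0), (0.3, 1), (0.2, 2)]
sensor = {0: 0.9, 1: 0.7, 2: 0.6}.get   # toy inverse sensor model
l = update_cell(0.0, samples, sensor)
p = prob(l)   # posterior occupancy probability of the cell
```

Mixing before the log-odds update keeps a cell's occupancy from becoming overconfident when the pose distribution disagrees about which cells a measurement supports.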
APA, Harvard, Vancouver, ISO, and other styles
28

Bermuske, Mike, Lars Büttner, and Jürgen Czarske. "Measurement uncertainty budget of an interferometric flow velocity sensor." SPIE, 2017. https://tud.qucosa.de/id/qucosa%3A35151.

Full text
Abstract:
Flow rate measurements are a common topic for process monitoring in chemical engineering and the food industry. To achieve the requested low uncertainties of 0.1% for flow rate measurements, a precise measurement of the shear layers of such flows is necessary. The laser Doppler velocimeter (LDV) is an established method for measuring local flow velocities. For exact estimation of the flow rate, the flow profile in the shear layer is of importance. For standard LDV, the axial resolution, and therefore the number of measurement points in the shear layer, is defined by the length of the measurement volume. A decrease of this length is accompanied by a larger fringe distance variation along the measurement axis, which results in a rise of the measurement uncertainty for the flow velocity (an uncertainty relation between spatial resolution and velocity uncertainty). As a unique advantage, the laser Doppler profile sensor (LDV-PS) overcomes this problem by using two fan-like fringe systems to obtain the position of the measured particles along the measurement axis, and therefore achieves a high spatial resolution while still offering a low velocity uncertainty. With this technique, the flow rate can be estimated with one order of magnitude lower uncertainty, down to 0.05% statistical uncertainty, and flow profiles, especially in film flows, can be measured more accurately. The problem for this technique is that, in contrast to laboratory setups where the system is quite stable, for industrial applications the sensor needs a reliable and robust traceability to the SI units, meter and second. Small deviations in the calibration can, because of the highly position-dependent calibration function, cause large systematic errors in the measurement result. Therefore, a simple, stable and accurate tool is needed that can easily be used in industrial surroundings to check or recalibrate the sensor.
In this work, different calibration methods are presented and their influences on the measurement uncertainty budget of the sensor are discussed. Finally, measurement results for the film flow of an impinging-jet cleaning experiment are presented.
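The profile sensor's working principle, as described in the abstract, can be illustrated with a toy model. The linear fringe-spacing functions `d1` and `d2` and every number below are assumptions for illustration only: the quotient of the two Doppler frequencies encodes the axial position, after which either fringe spacing converts frequency to velocity.

```python
# Toy laser Doppler profile sensor: two fan-like fringe systems whose fringe
# spacings vary (here linearly, an assumption) along the measurement axis z.
def d1(z):
    return 2.0 + 0.5 * z   # um, fringe spacing of system 1 (hypothetical)

def d2(z):
    return 3.0 - 0.5 * z   # um, fringe spacing of system 2 (hypothetical)

def locate(f1, f2, lo=-1.0, hi=1.0):
    """Invert the frequency quotient q = f1/f2 = d2(z)/d1(z) by bisection."""
    q = f1 / f2
    def g(z):
        return d2(z) / d1(z) - q   # monotone in z for these spacings
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# A particle at z = 0.4 moving at v = 5 um/us yields f_i = v / d_i(z):
z_true, v_true = 0.4, 5.0
f1, f2 = v_true / d1(z_true), v_true / d2(z_true)
z = locate(f1, f2)      # axial position from the frequency quotient
v = f1 * d1(z)          # velocity recovered from either channel
```

The sketch also shows why calibration matters so much: any error in the assumed spacing functions propagates directly, and position-dependently, into both `z` and `v`.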
APA, Harvard, Vancouver, ISO, and other styles
29

Kim, Hyo Soo. "Periodic error in heterodyne interferometry: measurement, uncertainty, and elimination." [Gainesville, Fla.]: University of Florida, 2009. http://purl.fcla.edu/fcla/etd/UFE0041110.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Fan, Jiezhen S. "Measurement and communication uncertainty in automated aircraft separation management." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/102407/1/Jiezhen_Fan_Thesis.pdf.

Full text
Abstract:
This thesis proposed an automated separation management algorithm, based on inter-aircraft communication and a track file manager, for resolving aircraft conflicts during a period of central communication failure. It first investigated the impact of measurement uncertainty on the estimated risk of conflict and then compared the separation performance of several algorithms in an uncertain communication environment, thereby characterising separation performance in communication failure situations.
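The effect of measurement uncertainty on estimated conflict risk can be illustrated with a minimal Monte Carlo sketch. The separation minimum, noise levels, and geometry below are hypothetical, not taken from the thesis:

```python
import random

random.seed(3)

SEP_MIN = 5.0   # nm, required horizontal separation (assumed)

def conflict_risk(rel_pos, sigma, n=20000):
    """Monte Carlo estimate of P(true miss distance < SEP_MIN), given a
    relative position fix with Gaussian error sigma on each axis."""
    hits = 0
    for _ in range(n):
        dx = rel_pos[0] + random.gauss(0.0, sigma)
        dy = rel_pos[1] + random.gauss(0.0, sigma)
        hits += (dx * dx + dy * dy) ** 0.5 < SEP_MIN
    return hits / n

low = conflict_risk((6.0, 0.0), 0.1)    # accurate surveillance: safe pass
high = conflict_risk((6.0, 0.0), 3.0)   # degraded surveillance: risk inflates
```

Even when the nominal geometry is safe (6 nm apart with a 5 nm minimum), degraded measurements inflate the estimated conflict probability, which is why a separation manager must account for surveillance quality.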
APA, Harvard, Vancouver, ISO, and other styles
31

Menke, Robert A. "A FULL SYSTEM CHARACTERIZATION OF THE MEASUREMENT UNCERTAINTY OF A CONDUCTED EMISSIONS MEASUREMENT SYSTEM." UKnowledge, 2005. http://uknowledge.uky.edu/gradschool_theses/264.

Full text
Abstract:
Electromagnetic compatibility (EMC) standards for an accredited test laboratory require that the measurement uncertainty of the measuring instruments be characterized. The CISPR 16-4 standard gives guidance on the magnitude of this uncertainty, but no method of characterization. This thesis describes a method to perform this characterization on a conducted emissions measurement system, taking advantage of full system analysis techniques to reduce the uncertainty to exceptionally low levels. In addition, a framework is introduced whereby uncertainty can be decomposed into its constituent parts so that the laboratory operator can identify methods to improve the system's performance.
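Decomposing uncertainty into constituent parts is conventionally done GUM-style, combining contributions in quadrature. A minimal sketch with a hypothetical budget follows; the contribution names and values are illustrative, not from the thesis:

```python
import math

# Hypothetical conducted-emissions uncertainty budget, in dB.
# Each contribution: (standard uncertainty u_i, sensitivity coefficient c_i).
budget = {
    "receiver reading": (0.10, 1.0),
    "LISN impedance": (0.20, 1.0),
    "cable attenuation": (0.05, 1.0),
    "mismatch": (0.15, 1.0),
}

# GUM-style combination: uncorrelated contributions add in quadrature,
# then an expanded uncertainty is quoted with coverage factor k = 2.
u_c = math.sqrt(sum((u * c) ** 2 for u, c in budget.values()))
U = 2.0 * u_c
```

A budget laid out this way makes it obvious which contribution dominates (here the largest term), and therefore where an operator should invest effort to improve the system.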
APA, Harvard, Vancouver, ISO, and other styles
32

Wang, Jinkai. "Uncertainty evaluation of delayed neutron decay parameters." College Station, Tex.: Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-3210.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Meier, Helga. "Project evaluation and capital budgeting under uncertainty." Thesis, Imperial College London, 1995. http://hdl.handle.net/10044/1/7785.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Mohandas, Moginraj. "Evaluation of Expressions with Uncertainty in Databases." ScholarWorks@UNO, 2007. http://scholarworks.uno.edu/td/535.

Full text
Abstract:
Expressions are used in a range of applications such as publish/subscribe and e-commerce. Integrating support for expressions in a database management system (DBMS) provides an efficient and scalable platform for applications that use expressions. Support for uncertain data and expressions can be beneficial but is not currently provided. In this thesis, we investigate how expressions with uncertainty can be integrated in a DBMS like other data. We describe the underlying theory and implementation of UNXS (UNcertain eXpression System), a system that we have developed to handle uncertainty in expressions and data. We develop a theoretical model to compare and contrast previous work on supporting uncertainty in DBMSs and publish/subscribe systems. We extend the existing approaches to propose new techniques for matching uncertain expressions to uncertain data in UNXS. We then describe an implementation that integrates this support in the PostgreSQL DBMS, which to our knowledge is the first such implementation.
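One simple way to picture matching an expression against uncertain data — a generic model, not necessarily UNXS's actual semantics — is to treat an uncertain attribute as a discrete distribution over values and score the expression by the probability mass of the satisfying values:

```python
# Generic sketch: an uncertain attribute is a list of (value, probability)
# pairs; an expression "matches" with the total probability of the values
# that satisfy it. (Illustrative model only; the values are hypothetical.)
def match_probability(expr, uncertain_value):
    return sum(p for v, p in uncertain_value if expr(v))

price = [(90, 0.2), (100, 0.5), (110, 0.3)]   # hypothetical uncertain datum
subscription = lambda v: v >= 100             # e.g. a publish/subscribe filter
p_match = match_probability(subscription, price)
# An event would be delivered only if p_match clears some chosen threshold.
```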
APA, Harvard, Vancouver, ISO, and other styles
35

Moore, F. "Rethinking measurement in psychology and education: a quantum perspective." Thesis, Queen's University Belfast, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.368532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Suzuki, Satomi. "Integrated evaluation of structural uncertainty using history matching from seismic imaging uncertainty model." May be available electronically, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Argyraki, Ariadni. "Estimation of measurement uncertainty in the sampling of contaminated land." Thesis, Imperial College London, 1997. http://hdl.handle.net/10044/1/8489.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

MacAulay, Gavin. "Characterisation of structured surfaces and assessment of associated measurement uncertainty." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13473.

Full text
Abstract:
Recently, structured surfaces, consisting of deterministic features designed to produce a particular effect, have shown promise in providing superior functional performance for a range of applications including: low friction surfaces, hydrophobic surfaces and optical effects. Methods have been developed to characterise such structured surfaces. The most widely used characterisation methods are based on segmenting the surface in feature and background regions and then determining the geometrical properties of those features. However, further work is needed to refine these characterisation techniques and provide associated uncertainties. This thesis considers the effect of various segmentation control parameters such as thresholds on the final geometric parameters. The effect of varying filter size is also considered. These considerations should help in selecting a suitable characterisation method for future projects. Additionally, uncertainty in the characterisation should be estimated in order to give an indication of the accuracy of the assessment. However, no previous work has assessed uncertainty in the dimensional properties of structured surfaces. Therefore, this thesis presents two methods to characterise the uncertainty in the geometric characteristics of structured surfaces. First, the measurement reproducibility is used, which can be determined by repeated measurement of a feature. However, measurement reproducibility cannot account for all sources of uncertainty and cannot assess any bias in the measurements. Therefore, a second method based on assessment of the metrological characteristics of the instrument is considered. The metrological characteristics estimate errors produced by the instrument in a way that can easily be measured. Monte Carlo techniques are then used to propagate the effects of the metrological characteristics and their uncertainties into the final measurement uncertainty. 
For the example used, it was found that the results using the metrological characteristics were in good agreement with the reproducibility results. From these results, it is concluded that the choice of segmentation method, control parameters and filtering can all significantly affect the characterisation of features on a structured surface, often in unexpected ways. Therefore, care must be taken when selecting these values for a specific application. Additionally, two methods of determining the uncertainty of the structured surfaces were considered. Both methods are valid and produce similar results. Using the measurement reproducibility is simple to perform, but requires many measurements and cannot account for some uncertainty sources such as those due to the instrument amplification factors. On the other hand, the use of metrological characteristics can account for all significant sources of uncertainty in a measurement, but is mathematically more complex, requiring Monte Carlo simulations to propagate the uncertainties into the final characteristics. Additionally, artefacts other than the sample being measured are required to determine the metrological characteristics, which may be an issue in some cases.
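Monte Carlo propagation of metrological characteristics, as used above, can be sketched as follows. The characteristics chosen, their magnitudes, and the simple additive/multiplicative error model are all assumptions for illustration, not the thesis's actual instrument model:

```python
import random
import statistics

random.seed(1)

# Hypothetical metrological characteristics of a surface topography
# instrument (magnitudes invented): measurement noise, amplification
# (scale) error, and residual flatness deviation.
NOISE_SD = 0.005   # um, repeatability noise
AMP_SD = 0.001     # relative uncertainty of the amplification factor
FLAT_SD = 0.002    # um, residual flatness

def measure_height(true_height):
    # One simulated measurement under the assumed error model.
    amp = random.gauss(1.0, AMP_SD)
    return (amp * true_height
            + random.gauss(0.0, NOISE_SD)
            + random.gauss(0.0, FLAT_SD))

# Monte Carlo propagation into the feature-height characteristic.
true_h = 1.0   # um, nominal step height of a structured-surface feature
trials = [measure_height(true_h) for _ in range(20000)]
u_h = statistics.stdev(trials)   # standard uncertainty of the measured height
```

Unlike the reproducibility approach, this propagation naturally includes the amplification-factor contribution, since it is modelled explicitly rather than hidden inside repeated measurements of one sample.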
APA, Harvard, Vancouver, ISO, and other styles
39

Goudar, Devkuma Murugesh. "Quantifying uncertainty in residual stress measurement using hole drilling techniques." Thesis, University of Bristol, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.549436.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Williams, Braydon J. "Uncertainty of Stereo PIV Calibration and Self-Calibration." DigitalCommons@USU, 2017. https://digitalcommons.usu.edu/etd/5263.

Full text
Abstract:
Particle image velocimetry (PIV) is a widely used fluid measurement technique. Three-dimensional PIV data, or stereo PIV, are acquired using two cameras. The stereo cameras are calibrated from camera coordinates (pixels) to real-world units such as millimeters using calibration models. Stereo calibration is fundamental to the accuracy of a PIV measurement. In this thesis, the accuracy of the stereo calibration is assessed. The mean error of stereo calibration was found to be 0.23%.
APA, Harvard, Vancouver, ISO, and other styles
41

Lyn, Jennifer A. "Optimising uncertainty from sampling and analysis of foods and environmental samples." Thesis, University of Sussex, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Vestin, Albin, and Gustav Strandberg. "Evaluation of Target Tracking Using Multiple Sensors and Non-Causal Algorithms." Thesis, Linköpings universitet, Reglerteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-160020.

Full text
Abstract:
Today, the main research field for the automotive industry is to find solutions for active safety. In order to perceive the surrounding environment, tracking nearby traffic objects plays an important role. Validation of the tracking performance is often done in staged traffic scenarios, where additional sensors, mounted on the vehicles, are used to obtain their true positions and velocities. The difficulty of evaluating the tracking performance complicates its development. An alternative approach studied in this thesis is to record sequences and use non-causal algorithms, such as smoothing, instead of filtering to estimate the true target states. With this method, validation data for online, causal, target tracking algorithms can be obtained for all traffic scenarios without the need for extra sensors. We investigate how non-causal algorithms affect the target tracking performance using multiple sensors and dynamic models of different complexity. This is done to evaluate real-time methods against estimates obtained from non-causal filtering. Two different measurement units, a monocular camera and a LIDAR sensor, and two dynamic models are evaluated and compared using both causal and non-causal methods. The system is tested in two single object scenarios where ground truth is available and in three multi object scenarios without ground truth. Results from the two single object scenarios shows that tracking using only a monocular camera performs poorly since it is unable to measure the distance to objects. Here, a complementary LIDAR sensor improves the tracking performance significantly. The dynamic models are shown to have a small impact on the tracking performance, while the non-causal application gives a distinct improvement when tracking objects at large distances. Since the sequence can be reversed, the non-causal estimates are propagated from more certain states when the target is closer to the ego vehicle.
For multiple object tracking, we find that correct associations between measurements and tracks are crucial for improving the tracking performance with non-causal algorithms.
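The causal-versus-non-causal comparison can be reproduced in miniature with a scalar random-walk target: a causal Kalman filter followed by a Rauch-Tung-Striebel smoothing pass. This is a generic textbook sketch with invented noise levels; the thesis's sensors and dynamic models are far richer:

```python
import random

random.seed(0)

# Scalar random-walk target with noisy position measurements (toy model).
Q, R = 0.01, 0.25   # process and measurement noise variances (assumed)
truth, zs, x = [], [], 0.0
for _ in range(200):
    x += random.gauss(0.0, Q ** 0.5)
    truth.append(x)
    zs.append(x + random.gauss(0.0, R ** 0.5))

# Causal Kalman filter (the online estimate available in real time).
xf, Pf, xp, Pp = [], [], [], []
xe, Pe = 0.0, 1.0
for z in zs:
    xpred, Ppred = xe, Pe + Q            # predict (random-walk dynamics)
    K = Ppred / (Ppred + R)              # Kalman gain
    xe = xpred + K * (z - xpred)         # update with the measurement
    Pe = (1.0 - K) * Ppred
    xp.append(xpred); Pp.append(Ppred)
    xf.append(xe); Pf.append(Pe)

# Non-causal Rauch-Tung-Striebel smoother (backward pass over the record).
xs = xf[:]
for k in range(len(zs) - 2, -1, -1):
    C = Pf[k] / Pp[k + 1]                # smoother gain
    xs[k] = xf[k] + C * (xs[k + 1] - xp[k + 1])

def mse(est):
    return sum((a - b) ** 2 for a, b in zip(est, truth)) / len(truth)
```

Because the smoother also conditions on future measurements, `mse(xs)` comes out below `mse(xf)`, which is exactly why smoothed recordings can serve as reference "ground truth" for validating causal trackers.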
APA, Harvard, Vancouver, ISO, and other styles
43

Hingston, Egerton Daniel Christian. "Geotechnical uncertainty in the evaluation of landslide hazards." Thesis, University of Leeds, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.502775.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Tai, Tuck Leong. "Uncertainty in the economic evaluation of transportation projects." Thesis, University of Canterbury. Civil Engineering, 1987. http://hdl.handle.net/10092/4835.

Full text
Abstract:
This thesis describes the methods and results of an investigation into the nature and impacts of uncertainty on the current economic evaluation practices in New Zealand. The thesis consists of four parts. Part I introduces the issue of uncertainty and decision making within the context of transportation planning. Part II takes a critical look at the economic evaluation procedures for urban projects in New Zealand, with particular reference to the underlying concepts and the possible areas of deficiency; via a case study and with the use of a traffic model in conjunction with the formal evaluation procedures, the sources of error and uncertainty associated with the current evaluation practices in New Zealand are identified and discussed. Part III presents the results of risk analysis performed on some rural transport projects. The techniques of sensitivity analysis and Monte Carlo simulation have been utilised for estimating and quantifying the uncertainty in the projects. Various measures of risk and the concept of stochastic dominance have been put forward as aids to the process of decision making. The methodology of Monte Carlo simulation and the selection of probability distributions for various input parameters are also described. Lastly, Part IV records the conclusions, implications and recommendations arrived at for this study.
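The Monte Carlo risk-analysis technique described in Part III can be sketched for a hypothetical project. All cost, benefit, and discount-rate figures below are invented for illustration:

```python
import random
import statistics

random.seed(42)

def npv_sample(rate=0.08, years=20):
    """One Monte Carlo draw of a hypothetical road project's NPV ($M)."""
    cost = random.gauss(10.0, 1.5)      # capital cost (invented figures)
    benefit = random.gauss(1.2, 0.3)    # annual time/accident savings
    return -cost + sum(benefit / (1.0 + rate) ** t
                       for t in range(1, years + 1))

runs = [npv_sample() for _ in range(10000)]
mean_npv = statistics.mean(runs)
p_loss = sum(r < 0 for r in runs) / len(runs)   # one simple risk measure
```

A single-point NPV would hide the downside: here the expected NPV is positive, yet a substantial fraction of draws are losses, which is the kind of risk information the thesis argues should inform decision making.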
APA, Harvard, Vancouver, ISO, and other styles
45

Thomas, Wayne Robert. "An evaluation of educational decision problems under uncertainty." Thesis, University of Bristol, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.319396.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Alexandrino, Josemir da Cruz. "Metodologia para avaliação do desempenho metrológico em equipamentos médico-hospitalares." Universidade Federal da Bahia. Escola Politécnica, 2012. http://repositorio.ufba.br/ri/handle/ri/18453.

Full text
Abstract:
The massive use of technology in healthcare has increased the effectiveness of medical procedures, with a consequent improvement in patients' quality of life. However, such technologies carry intrinsic risks, which can result in harm when poorly used or poorly maintained. In this context are medical-hospital equipment (EMH), which can present problems that result in erroneous diagnoses or inappropriate therapies, or that compromise user safety. Periodic evaluation, in particular of metrological performance, is an important means of achieving the safe use of EMH. The solutions found for such evaluation are rigidly conditioned on the specificities of each type of EMH and can hardly be reused for others. Although no generalist solution applicable across types of EMH was found in the literature, the literature does indicate the feasibility of such an approach, given the constant technological advances in the sector. Accordingly, this work presents a methodology for evaluating the metrological performance of EMH; to that end, a system was developed that combines measurement instruments integrated with test-management software. This software acquires and monitors measurement data for the metrological variables of the equipment under test, evaluates the conformity of these data with pre-established metrological requirements, and issues a conformity report. The system allows the configuration of various parameters, enabling metrological performance evaluations on different types of EMH. For validation, the system was configured to evaluate electrosurgical units and neonatal incubators. The results show the effectiveness of the developed methodology and the efficiency of the computational platform for the metrological evaluation of EMH.
APA, Harvard, Vancouver, ISO, and other styles
47

Voßmann, Frank. "Decision weights in choice under risk and uncertainty: measurement and decomposition." [S.l.: s.n.], 2004. http://www.gbv.de/dms/zbw/490610218.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Taillefer, Chris. "Reducing measurement uncertainty in a DSP-based mixed-signal test environment." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=84104.

Full text
Abstract:
FFT-based tests (e.g. gain, distortion, SNR, etc.) from a device-under-test (DUT) exhibit normal distributions when the measurement is repeated many times. Hence, a statistical approach to evaluate the accuracy of these measurements is traditionally applied. The noise in a DSP-based mixed-signal test system severely limits its measurement accuracy. Moreover, in high-speed sampled-channel applications the jitter-induced noise from the DUT and test equipment can severely impede accurate measurements.
A new digitizer architecture and post-processing methodology is proposed to increase the measurement accuracy of the DUT and the test equipment. An optimal digitizer design is presented which removes any measurement bias due to noise and greatly improves measurement repeatability. Most importantly, the presented system improves accuracy in the same test time as any conventional test.
An integrated mixed-signal test core was implemented in TSMC's 0.18 µm mixed-signal process. Experimental results obtained from the mixed-signal integrated test core validate the proposed digitizer architecture and post-processing technique. Bias errors were successfully removed and measurement variance was improved by a factor of 5.
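The factor-of-5 variance improvement invites a statistical comparison with plain averaging. This toy sketch (all figures hypothetical) shows the baseline trade-off the thesis avoids: averaging 25 readings also cuts the standard deviation about 5-fold, but at 25 times the test time, whereas the proposed digitizer improves accuracy within the same test time:

```python
import random
import statistics

random.seed(7)

TRUE_GAIN_DB = 6.0   # hypothetical true gain of the device under test
NOISE_SD = 0.10      # dB, per-reading noise of the digitizer (assumed)

def gain_measurement(n_avg):
    # Report the mean of n_avg repeated FFT-based gain readings.
    readings = (random.gauss(TRUE_GAIN_DB, NOISE_SD) for _ in range(n_avg))
    return sum(readings) / n_avg

single = [gain_measurement(1) for _ in range(2000)]
averaged = [gain_measurement(25) for _ in range(2000)]
# Averaging 25 readings shrinks the standard deviation roughly 5-fold
# (variance 25-fold), but multiplies test time by 25.
ratio = statistics.stdev(single) / statistics.stdev(averaged)
```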
APA, Harvard, Vancouver, ISO, and other styles
49

Tyler, David Keith. "Improved estimation of uncertainty in flow measurement at sewage treatment works." Thesis, University of Hertfordshire, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.409476.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Vanslette, Kevin M. "Theoretical Study of Variable Measurement Uncertainty h_I and Infinite Unobservable Entropy." Digital WPI, 2013. https://digitalcommons.wpi.edu/etd-theses/289.

Full text
Abstract:
This paper examines the statistical mechanical and thermodynamical consequences of a variable phase-space volume element $h_I = \bigtriangleup x_i \bigtriangleup p_i$. Varying $h_I$ leads to variations in the amount of measured entropy of a system, but the maximum entropy remains constant due to the uncertainty principle. By taking $h_u \rightarrow 0^+$ an infinite unobservable entropy is attained, leading to an infinite unobservable energy per particle and an unobservable chemical equilibrium between all particles. The amount of heat fluxing through the measurement apparatus is formulated as a function of $h_I$ for systems in steady state equilibrium, as well as the number of measured particles or sub-particles, so any system can be described as unitary or composite in number. Some example systems are given using variable $h_I$.
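The scaling of measured entropy with the cell size can be illustrated with a standard textbook sketch (an assumed illustration, not taken from the thesis): for a classical system with accessible phase-space volume $\Gamma$, coarse-grained into cells of volume $h_I$ per degree of freedom over $N$ degrees of freedom,

```latex
S(h_I) = k_B \ln \Omega
       = k_B \ln \frac{\Gamma}{h_I^{N}}
       = k_B \ln \Gamma - N k_B \ln h_I ,
```

so the counted entropy grows without bound as the cell size shrinks toward zero, while the uncertainty principle ($\bigtriangleup x_i \bigtriangleup p_i \gtrsim \hbar$) bounds the physically resolvable cell size and hence caps the observable entropy, consistent with the abstract's claim that the maximum entropy remains constant.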
APA, Harvard, Vancouver, ISO, and other styles