Dissertations / Theses on the topic 'Targeted sampling'

Consult the top 15 dissertations / theses for your research on the topic 'Targeted sampling.'

You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

1

Isaac, Giorgis. "Development of Enhanced Analytical Methodology for Lipid Analysis from Sampling to Detection : A Targeted Lipidomics Approach." Doctoral thesis, Uppsala University, Analytical Chemistry, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-5810.

Full text
Abstract:

This thesis covers a wide range of analytical method development for lipid analysis in complex biological samples: from sample preparation using pressurized fluid extraction (PFE) and separation with reversed-phase capillary liquid chromatography (RP-LC) to detection by electrospray ionization mass spectrometry (ESI/MS) and tandem MS.

The requirements for fast, reliable and selective extraction methods with minimal solvent usage have accelerated the development of new extraction techniques. PFE is one of the new automated, fast and efficient liquid extraction techniques, which uses standard liquid solvents at elevated temperature and pressure. In this thesis, the reliability and efficiency of the PFE technique were investigated for the extraction of the total lipid content from cod and herring muscle and from human brain tissue, as well as for pesticides from fatty foodstuffs. Improved or comparable efficiencies were achieved with reduced time and solvent consumption compared to traditional methods.

An RP-LC method coupled online to ESI/MS for the analysis of phosphatidylcholine (PC) and sphingomyelin (SM) molecular species was developed and used to analyze brain lipids from eight groups of mice treated with vehicle and various neuroleptics. The effect of postnatal iron administration on lipid composition and behavior was investigated, and it was examined whether these effects could be altered by subchronic administration of the neuroleptics clozapine and haloperidol. The results support the hypothesis that an association exists between psychiatric disorders, behavioral abnormalities and lipid membrane constitution in the brain.

Finally, a tandem MS precursor ion scan was used to analyze the developmental profile of brain sulfatide accumulation in arylsulfatase A (ASA) deficient (ASA -/-) mice as compared to wild-type control (ASA +/+) mice. The ASA -/- mice were developed as a model of the monogenic disease metachromatic leukodystrophy, with an established deficiency of the lysosomal enzyme ASA. The results showed an alteration in the composition of sulfatide molecular species between the ASA -/- and ASA +/+ mice.

This thesis shows that modern analytical methods can provide new insights into the extraction and analysis of lipids from complex biological samples.

2

Holmberg, Edward A. IV. "Data Visualization to Evaluate and Facilitate Targeted Data Acquisitions in Support of a Real-time Ocean Forecasting System." ScholarWorks@UNO, 2014. http://scholarworks.uno.edu/td/1873.

Full text
Abstract:
A robust evaluation toolset has been designed for the Naval Research Laboratory's Real-Time Ocean Forecasting System (RELO) with the purpose of facilitating an adaptive sampling strategy and providing better-informed guidance for routing underwater gliders. The major challenges are to integrate with the existing operational system and to provide a bridge between the modeling and operational environments. Visualization is the selected approach, and the developed software is divided into three packages: the first verifies that the glider is actually following the waypoints and predicts the position of the glider for the next cycle's instructions; the second helps ensure that the delivered waypoints are both useful and feasible; the third provides confidence levels for the suggested path. The software is implemented in Python for portability and modularity, allowing easy expansion with new visuals.
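As a rough illustration of the first package's tasks (waypoint adherence and next-cycle position prediction), here is a minimal Python sketch; the function names, tolerance and toy fixes are assumptions for illustration and do not reflect the RELO toolset's actual interfaces.

import numpy as np

def point_to_segment(p, a, b):
    # Distance from fix p to the planned leg a->b (2-D positions in metres).
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def check_adherence(fixes, leg_start, leg_end, tol_m=500.0):
    # One boolean per surfacing fix: is the glider within tol_m of its planned leg?
    return np.array([point_to_segment(f, leg_start, leg_end) <= tol_m for f in fixes])

def predict_next_position(fixes, times, horizon_s):
    # Dead-reckon one cycle ahead from the last two surfacing fixes.
    v = (fixes[-1] - fixes[-2]) / (times[-1] - times[-2])
    return fixes[-1] + v * horizon_s

# Toy surfacing fixes (metres east/north of a local origin) and a planned leg.
fixes = np.array([[0.0, 0.0], [900.0, 150.0], [1900.0, 220.0]])
times = np.array([0.0, 3600.0, 7200.0])
leg_start, leg_end = np.array([0.0, 0.0]), np.array([4000.0, 400.0])
print(check_adherence(fixes, leg_start, leg_end))
print(predict_next_position(fixes, times, horizon_s=3600.0))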
3

Nguyen, Trang. "Comparison of Sampling-Based Algorithms for Multisensor Distributed Target Tracking." ScholarWorks@UNO, 2003. http://scholarworks.uno.edu/td/20.

Full text
Abstract:
Nonlinear filtering is central to estimation, since most real-world problems are nonlinear. Considerable progress has recently been made in nonlinear filtering theory in the area of sampling-based methods, including both random (Monte Carlo) and deterministic (quasi-Monte Carlo) sampling, and their combination. This work considers the problem of tracking a maneuvering target in a multisensor environment. A novel scheme for distributed tracking is employed that utilizes a nonlinear target model and estimates from local (sensor-based) estimators. The resulting estimation problem is highly nonlinear and thus quite challenging. In order to evaluate the performance capabilities of the architecture considered, advanced sampling-based nonlinear filters are implemented: the particle filter (PF), the unscented Kalman filter (UKF), and the unscented particle filter (UPF). Results from extensive Monte Carlo simulations using different configurations of these algorithms are obtained to compare their effectiveness for solving the distributed target tracking problem.
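For orientation, here is a minimal bootstrap (sequential importance resampling) particle filter in Python on a scalar toy model; the model and noise levels are illustrative assumptions and not the multisensor tracking problem studied in the thesis.

import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(z_seq, n_particles=500, q=0.5, r=1.0):
    # Minimal bootstrap (SIR) particle filter for a scalar toy model:
    #   x_k = 0.9 * x_{k-1} + q * w_k,   z_k = x_k**2 / 20 + r * v_k   (nonlinear measurement)
    x = rng.normal(0.0, 1.0, n_particles)                     # initial particle cloud
    estimates = []
    for z in z_seq:
        x = 0.9 * x + q * rng.standard_normal(n_particles)    # propagate through the dynamics
        w = np.exp(-0.5 * ((z - x**2 / 20.0) / r) ** 2)        # likelihood of each particle
        w /= w.sum()
        estimates.append(np.sum(w * x))                        # weighted posterior mean
        x = x[rng.choice(n_particles, n_particles, p=w)]       # multinomial resampling
    return np.array(estimates)

# Simulate a short measurement sequence from the same toy model, then filter it.
x_true, zs = 0.0, []
for _ in range(30):
    x_true = 0.9 * x_true + 0.5 * rng.standard_normal()
    zs.append(x_true**2 / 20.0 + rng.standard_normal())
print(bootstrap_pf(zs)[:5])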
4

Qian, Jiajie. "Nanofiber-enabled multi-target passive sampling device for legacy and emerging organic contaminants." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6487.

Full text
Abstract:
The widespread environmental occurrence of chemical pollutants presents an ongoing threat to human and ecosystem health. This challenge is compounded by the diversity of chemicals used in industry, commerce, agriculture and medicine, which results in a spectrum of potential fates and exposure profiles upon their inevitable release into the environment. This, in turn, confounds risk assessment, where challenges persist in the accurate determination of concentration levels, as well as spatial and temporal distributions, of pollutants in environmental media (e.g., water, air, soil and sediments). Passive sampling technologies continue to gain acceptance as a means of simplifying environmental occurrence studies and, ultimately, improving the quality of chemical risk assessment. Passive samplers rely on the accumulation of a target analyte into a matrix via molecular diffusion, which is driven by the difference in chemical potential between the analyte in the environment and in the sampling medium (e.g., a sorbent phase). After deployment, the target analyte can be extracted from the sampling medium and quantified, providing an integrated, time-weighted average pollutant concentration via a cost-effective platform that requires little energy or manpower compared to active (e.g., grab) sampling approaches. While it is a promising, maturing technology, limitations exist in current commercially available passive samplers; they are typically limited in the types of chemicals that can be targeted effectively, can require long deployment times to accumulate sufficient chemical for analysis, and struggle with charged analytes. In this dissertation, we have designed a next-generation nanofiber sorbent as a passive sampling device for routine monitoring of both legacy and emerging organic pollutant classes in water and sediment. The polymer nanofiber networks fabricated herein exhibit a high surface-area-to-volume ratio (SA/V), which shortens the deployment time. Uptake studies of these polymer nanofiber samplers suggest that field deployment could be shortened to less than one day for surface water analysis, effectively operating as an equilibrium passive sampling device, and to twenty days for pore water analysis in soil and sediment studies. By comparison, most commercially available passive sampler models generally require at least a month of deployment before comparable analyses can be made. Another highlight of the nanofiber materials produced herein is their broad target application range. We demonstrate that both hydrophobic (e.g., persistent organic pollutants, or POPs, like PCBs and dioxin) and hydrophilic (e.g., emerging pollutant classes including pesticides, pharmaceuticals and personal care products) targets can be rapidly accumulated with our optimal nanofiber formulations. This suggests that one of our devices could potentially replace multiple commercial passive sampling devices, which often exhibit a more limited range of analyte targets. We also present several approaches for tailoring nanofiber physical and chemical properties to specifically target particular high-priority pollutant classes (e.g., PFAS).
Three promising modification approaches validated herein include: (i) fabricating carbon nanotube-polymer composites to capture polar compounds; (ii) introducing surface-segregating cationic surfactants to target anionic pollutants (e.g., the pesticide 2,4-D and perfluorooctanoic acid, or PFOA); and (iii) using leachable surfactants as porogens to increase nanofiber pore volume and surface area, thereby increasing material capacity. Collectively, the outcomes of this work will guide the future development of next-generation passive samplers by establishing broadly generalizable structure-activity relationships. All told, we present data on how both physical (specific surface area, pore volume, and diameter) and chemical (e.g., bulk and surface composition, nanofiber wettability, surface charge) nanofiber properties influence the rate and extent of pollutant uptake in polymer nanofiber matrices. We also present modeling results describing sampler operation that can be used to assess and predict passive sampler performance prior to field deployment. The electrospun nanofiber mats (ENMs) developed as passive sampling devices herein provide greater functionality and allow for customizable products applicable to a wide range of chemically diverse organic pollutants. Combined with advances in and expansion of the nanotechnology sector, we envision this product could be made commercially available so as to expand the use and improve the performance of passive sampling technologies in environmental monitoring studies.
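The uptake and time-weighted-average ideas described above can be illustrated with a generic first-order passive sampler model; the Python sketch below uses made-up parameter values and is not the model developed in the dissertation.

import numpy as np

def sampler_uptake(c_w, k_sw, m_s, k_e, t):
    # Generic first-order uptake: analyte mass (ng) accumulated on the sorbent after t days.
    # c_w: water concentration (ng/L); k_sw: sorbent-water partition coefficient (L/g);
    # m_s: sorbent mass (g); k_e: elimination rate constant (1/day).
    return c_w * k_sw * m_s * (1.0 - np.exp(-k_e * t))

def twa_concentration(mass_ng, sampling_rate_l_per_day, t_days):
    # Time-weighted average water concentration, valid in the kinetic (linear-uptake) regime.
    return mass_ng / (sampling_rate_l_per_day * t_days)

# A sorbent with faster exchange (larger k_e, e.g. from higher surface area) equilibrates sooner.
for k_e in (0.05, 1.0):                       # 1/day
    t95 = np.log(20.0) / k_e                  # time to reach 95 % of equilibrium
    print(f"k_e = {k_e} 1/day  ->  ~{t95:.1f} days to 95 % of equilibrium")
print(sampler_uptake(c_w=2.0, k_sw=100.0, m_s=0.1, k_e=1.0, t=1.0), "ng accumulated after 1 day")
print(twa_concentration(mass_ng=40.0, sampling_rate_l_per_day=0.2, t_days=20.0), "ng/L (TWA estimate)")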
5

Fischell, Erin Marie. "Characterization of underwater target geometry from autonomous underwater vehicle sampling of bistatic acoustic scattered fields." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100161.

Full text
Abstract:
Thesis: Ph. D., Joint Program in Applied Ocean Science and Engineering (Massachusetts Institute of Technology, Department of Mechanical Engineering; and the Woods Hole Oceanographic Institution), 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 153-156).
One of the long-term goals of Autonomous Underwater Vehicle (AUV) minehunting is to have multiple inexpensive AUVs in a harbor autonomously classify hazards. Existing acoustic methods for target classification using AUV-based sensing, such as sidescan and synthetic aperture sonar, require an expensive payload on each outfitted vehicle and expert image interpretation. This thesis proposes a vehicle payload and machine learning classification methodology that uses the bistatic angle dependence of target scattering amplitudes between a fixed acoustic source and a target for lower cost-per-vehicle sensing and onboard, fully autonomous classification. The contributions of this thesis include the collection of novel high-quality bistatic data sets around spherical and cylindrical targets in situ during the BayEx'14 and Massachusetts Bay 2014 scattering experiments and the development of a machine learning methodology for classifying target shape and estimating orientation using bistatic amplitude data collected by an AUV. To achieve the high-quality, densely sampled 3D bistatic scattering data required by this research, vehicle broadside sampling behaviors and an acoustic payload with precision-timed data acquisition were developed. Classification was successfully demonstrated for spherical versus cylindrical targets using bistatic scattered field data collected by the AUV Unicorn as part of the BayEx'14 scattering experiment and compared to simulated scattering models. The same machine learning methodology was applied to the estimation of the orientation of aspect-dependent targets, and was demonstrated by training a model on simulated data and then successfully estimating the orientations of a steel pipe in the Massachusetts Bay 2014 experiment. The final models produced from real and simulated data sets were used for classification and parameter estimation of simulated targets in real time in the LAMSS MOOS-IvP simulation environment.
by Erin Marie Fischell.
Ph. D.
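To make the classification idea concrete (shape inferred from the bistatic-angle dependence of scattered amplitude), here is a toy Python sketch; the synthetic amplitude patterns and the support-vector classifier are illustrative assumptions, not the physics models or learning method used in the thesis.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def synth_pattern(kind, n_angles=36):
    # Synthetic scattered-amplitude pattern versus bistatic angle (toy physics only):
    # a sphere is roughly angle-independent, a cylinder shows strong angular lobing.
    theta = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    if kind == "sphere":
        amp = np.ones_like(theta)
    else:  # "cylinder"
        amp = 0.4 + np.abs(np.sinc(3.0 * np.cos(theta)))
    return amp + 0.05 * rng.standard_normal(n_angles)

# Build a labelled set of amplitude-vs-angle feature vectors and train a classifier.
X = np.array([synth_pattern(k) for k in ["sphere", "cylinder"] * 100])
y = np.array([0, 1] * 100)                      # 0 = sphere, 1 = cylinder
clf = SVC(kernel="rbf").fit(X[:150], y[:150])   # train on the first 150 patterns
print("held-out accuracy:", clf.score(X[150:], y[150:]))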
6

Moyer, Steven K. "Modeling challenges of advanced thermal imagers." Diss., Available online, Georgia Institute of Technology, 2006. http://etd.gatech.edu/theses/available/etd-02272006-144729/.

Full text
Abstract:
Thesis (Ph. D.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2007.
Dr. William T. Rhodes, Committee Co-Chair ; Dr. John Buck, Committee Member ; Dr. William Hunt, Committee Member ; Dr. Stephen P. DeWeerth, Committee Member ; Dr. Ronald G. Driggers, Committee Member ; Dr. Gisele Bennett, Committee Chair.
7

Lux, Johannes Thomas [Author], and Ingeborg [Academic supervisor] Levin. "A new target preparation facility for high precision AMS measurements and strategies for efficient 14CO2 sampling / Johannes Thomas Lux ; Supervisor: Ingeborg Levin." Heidelberg : Universitätsbibliothek Heidelberg, 2018. http://d-nb.info/1177252260/34.

Full text
8

Dixon, Wallace E. Jr, Robert M. Price, Michael Watkins, and Christine Brink. "Touchstat V. 3.00: A New and Improved Monte Carlo Adjunct for the Sequential Touching Task." Digital Commons @ East Tennessee State University, 2007. https://doi.org/10.3758/BF03193010.

Full text
Abstract:
The sequential-touching procedure is employed by researchers studying nonlinguistic categorization in toddlers. TouchStat 3.00 is introduced in this article as an adjunct to the sequential-touching procedure, allowing researchers to compare children's actual touching behavior to what might be expected by chance. Advantages over the Thomas and Dahlin (2000) framework include ease of use and fewer assumptive limitations. Improvements over TouchStat 1.00 include the calculation of chance probabilities for multiple "special cases" and for immediate intercategory alternations. A new feature for calculating mean run length is also included.
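The chance model behind such a Monte Carlo adjunct can be illustrated with a small Python sketch: simulate random touch sequences to obtain a chance distribution for mean run length. The numbers are made up, and TouchStat's special-case handling and intercategory-alternation statistics are not reproduced.

import numpy as np

rng = np.random.default_rng(2)

def mean_run_length(seq):
    # Mean length of runs of consecutive same-category touches in one session.
    runs, length = [], 1
    for prev, cur in zip(seq, seq[1:]):
        if cur == prev:
            length += 1
        else:
            runs.append(length)
            length = 1
    runs.append(length)
    return float(np.mean(runs))

def chance_distribution(n_touches=20, n_categories=2, n_sims=20_000):
    # Monte Carlo distribution of mean run length under purely random touching.
    return np.array([mean_run_length(rng.integers(n_categories, size=n_touches))
                     for _ in range(n_sims)])

sims = chance_distribution()
observed = 3.2                                  # a hypothetical child's observed mean run length
p_chance = np.mean(sims >= observed)            # one-sided chance probability
print(f"chance mean = {sims.mean():.2f}, P(mean run length >= {observed}) = {p_chance:.4f}")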
9

Lamberti, Roland. "Contributions aux méthodes de Monte Carlo et leur application au filtrage statistique." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLL007/document.

Full text
Abstract:
This thesis deals with integration calculus in the context of Bayesian inference and Bayesian statistical filtering. More precisely, we focus on Monte Carlo integration methods. We first revisit the importance sampling with resampling mechanism, then its extension to the dynamic setting known as particle filtering, and finally conclude our work with a multi-target tracking application. First, we consider the problem of estimating a moment of a probability density, known up to a constant, via Monte Carlo methodology. We begin by proposing a new estimator related to the normalized importance sampling estimator but using two proposal densities rather than a single one. We then revisit the importance sampling with resampling mechanism as a whole in order to produce Monte Carlo samples that are independent, contrary to the classical mechanism, which enables us to develop two new estimators. Second, we consider the dynamic aspect in the framework of sequential Bayesian inference. We adapt to this framework our new independent resampling technique, previously developed in a static setting. This yields the particle filtering with independent resampling mechanism, which we reinterpret as a special case of auxiliary particle filtering. Because of the additional sampling cost required by this technique, we then propose a semi-independent resampling procedure that makes it possible to control this cost. Lastly, we consider a multi-target tracking application within a sensor network using a new Bayesian model, and empirically analyze the results obtained in this application with our new particle filtering algorithm as well as with a sequential Markov chain Monte Carlo algorithm.
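For readers unfamiliar with the baseline the thesis builds on, here is a minimal Python sketch of the standard self-normalized (normalized) importance sampling estimator for a target density known only up to a constant; the two-proposal and independent-resampling estimators proposed in the thesis are not reproduced, and the densities below are illustrative.

import numpy as np

rng = np.random.default_rng(3)

def self_normalized_is(log_target, sample_proposal, log_proposal, phi, n=10_000):
    # Self-normalized importance sampling estimate of E_p[phi(X)], where the target
    # density p is known only up to a normalizing constant.
    x = sample_proposal(n)
    log_w = log_target(x) - log_proposal(x)
    w = np.exp(log_w - log_w.max())            # subtract the max for numerical stability
    w /= w.sum()
    return np.sum(w * phi(x))

# Toy check: unnormalized N(2, 1) target, N(0, 3) proposal; E[X] should come out near 2.
log_target = lambda x: -0.5 * (x - 2.0) ** 2           # normalizing constant deliberately omitted
sample_proposal = lambda n: rng.normal(0.0, 3.0, n)
log_proposal = lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0)
print(self_normalized_is(log_target, sample_proposal, log_proposal, phi=lambda x: x))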
10

Vestin, Albin, and Gustav Strandberg. "Evaluation of Target Tracking Using Multiple Sensors and Non-Causal Algorithms." Thesis, Linköpings universitet, Reglerteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-160020.

Full text
Abstract:
Today, the main research field for the automotive industry is to find solutions for active safety. In order to perceive the surrounding environment, tracking nearby traffic objects plays an important role. Validation of the tracking performance is often done in staged traffic scenarios, where additional sensors, mounted on the vehicles, are used to obtain their true positions and velocities. The difficulty of evaluating the tracking performance complicates its development. An alternative approach studied in this thesis is to record sequences and use non-causal algorithms, such as smoothing, instead of filtering to estimate the true target states. With this method, validation data for online, causal, target tracking algorithms can be obtained for all traffic scenarios without the need for extra sensors. We investigate how non-causal algorithms affect the target tracking performance using multiple sensors and dynamic models of different complexity. This is done to evaluate real-time methods against estimates obtained from non-causal filtering. Two different measurement units, a monocular camera and a LIDAR sensor, and two dynamic models are evaluated and compared using both causal and non-causal methods. The system is tested in two single-object scenarios where ground truth is available and in three multi-object scenarios without ground truth. Results from the two single-object scenarios show that tracking using only a monocular camera performs poorly, since it is unable to measure the distance to objects. Here, a complementary LIDAR sensor improves the tracking performance significantly. The dynamic models are shown to have a small impact on the tracking performance, while the non-causal application gives a distinct improvement when tracking objects at large distances. Since the sequence can be reversed, the non-causal estimates are propagated from more certain states when the target is closer to the ego vehicle. For multiple object tracking, we find that correct associations between measurements and tracks are crucial for improving the tracking performance with non-causal algorithms.
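The filtering-versus-smoothing comparison at the heart of this work can be illustrated on a much simpler problem. The Python sketch below runs a causal Kalman filter and a non-causal Rauch-Tung-Striebel smoother on a 1-D constant-velocity model with assumed noise parameters; it is not the multi-sensor tracker evaluated in the thesis.

import numpy as np

def kalman_rts(zs, dt=0.1, q=1.0, r=0.5):
    # Causal Kalman filter followed by a non-causal Rauch-Tung-Striebel smoother
    # for a 1-D constant-velocity model driven by position measurements zs.
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**3 / 3.0, dt**2 / 2.0], [dt**2 / 2.0, dt]])
    R = np.array([[r**2]])
    x, P = np.zeros(2), 10.0 * np.eye(2)
    xs_f, Ps_f, xs_p, Ps_p = [], [], [], []
    for z in zs:                                              # forward (filtering) pass
        xp, Pp = F @ x, F @ P @ F.T + Q
        xs_p.append(xp); Ps_p.append(Pp)
        K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)        # Kalman gain
        x = xp + K @ (np.atleast_1d(z) - H @ xp)
        P = (np.eye(2) - K @ H) @ Pp
        xs_f.append(x); Ps_f.append(P)
    xs_s = list(xs_f)                                         # backward (smoothing) pass
    for k in range(len(zs) - 2, -1, -1):
        C = Ps_f[k] @ F.T @ np.linalg.inv(Ps_p[k + 1])
        xs_s[k] = xs_f[k] + C @ (xs_s[k + 1] - xs_p[k + 1])
    return np.array(xs_f), np.array(xs_s)

rng = np.random.default_rng(4)
truth = np.arange(0.0, 5.0, 0.1)                              # target moving at ~1 m/s
filt, smooth = kalman_rts(truth + 0.5 * rng.standard_normal(truth.size))
print("filtered MAE:", np.abs(filt[:, 0] - truth).mean())
print("smoothed MAE:", np.abs(smooth[:, 0] - truth).mean())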
11

Wahlström, Dennis. "Probabilistic Multidisciplinary Design Optimization on a high-pressure sandwich wall in a rocket engine application." Thesis, Umeå universitet, Institutionen för fysik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-138480.

Full text
Abstract:
The space industry has always pushed for better performance. Advanced technologies are developed to meet the goals of space exploration and space missions, to find answers and widen knowledge. In that spirit, the aim of this thesis is to understand and reduce the mass of a space nozzle used in the upper stage of a space mission with an expander cycle engine. The study is carried out by creating a design of experiments using Latin Hypercube Sampling (LHS), with consideration given to the number of designs and the simulation cost. Surrogate-model-based optimization with two Multidisciplinary Design Optimization (MDO) approaches, Analytical Target Cascading (ATC) and Multidisciplinary Feasible (MDF), is used for comparison and to strengthen the conclusions. In the optimization, three different sets of limits are investigated: the design-space limit, the industrial limit, and the industrial limit with tolerance. The optimized results show a discrepancy between the two optimization approaches, ATC and MDF, which are expected to be similar; for two of the limitations, the design-space limit and the industrial limit, the agreement is poorer. The ATC formulation in this case is dictated by the main objective, where the children/subproblems only search for a solution that satisfies the main objective and its constraints. For MDF, the main objective is described as a single function and solved subject to all constraints; the problem is not divided into subproblems as in ATC. In surrogate-model-based optimization the solution is influenced by the accuracy of the model, and this is investigated with another DoE: a full-factorial design is created in a region near the optimal solution. In that region, the surrogate models prove to be quite accurate for almost all responses, except for maximum temperature, damage and strain at the hottest region, where the inner wall thickness of the space nozzle has the largest common impact. The new structure of the space nozzle shows a mass improvement of about 50%, 15% and -4% for the three limitations (design-space limit, industrial limit and industrial limit with tolerance) relative to a reference value, and is about 10%, 35% and 25% cheaper to manufacture according to the defined producibility model.
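The design-of-experiments step mentioned above can be illustrated with a short Latin Hypercube Sampling routine in Python; the design variables and bounds are made up, and the surrogate modelling and ATC/MDF optimization are not reproduced.

import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, bounds):
    # Latin Hypercube Sampling: each design variable is split into n_samples
    # equal-probability strata, one point per stratum, strata permuted per dimension.
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T   # (n_samples, d)
    u = (strata + rng.random((n_samples, d))) / n_samples                    # stratified uniforms in [0, 1)
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

# Three made-up design variables: wall thickness [mm], channel width [mm], chamber pressure [bar].
doe = latin_hypercube(10, [(1.0, 4.0), (0.5, 2.0), (50.0, 120.0)])
print(doe)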
12

Nguyen, Trang M. "Comparison of sampling based algorithms for multisensor distributed target tracking." 2003. http://louisdl.louislibraries.org/u?/NOD,37.

Full text
Abstract:
Thesis (M.S.)--University of New Orleans, 2003.
Title from electronic submission form. "A thesis ... in partial fulfillment of the requirements for the degree of Master of Science"--Thesis t.p. Vita. Includes bibliographical references.
13

Tzaw, Lin Ming, and 林明灶. "An Approach Using Adaptive Sampling Interval For a Multi-Target Tracking System." Thesis, 1994. http://ndltd.ncl.edu.tw/handle/49301662385681812455.

Full text
Abstract:
Master's thesis
Da-Yeh University
Institute of Electrical Engineering
Academic year 82 (ROC calendar)
This research designs an algorithm that uses an adaptive sampling interval for a radar tracking system. With this technique, the tracking system can scan and capture target information more effectively. The key development of the approach is a detection criterion for the target maneuvering situation and environment status, combined with an extended Kalman filter and an adaptive-interval procedure, designed for a tracking system. A computer simulation algorithm is also developed to analyze the approach. Finally, the thesis compares a conventional fixed sampling interval with the adaptive sampling interval for a tracking system. In addition to the situations above, multiple-target tracking problems are also considered in this research. According to the simulation results, the adaptive sampling interval procedure proposed in this thesis enhances the radar tracking capability and gives more accurate performance.
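A minimal Python sketch of the adaptive-interval idea is given below: the revisit interval is shortened when a maneuver indicator (here, the normalized innovation squared from a tracking filter) is large. The rule and thresholds are generic assumptions, not the detection criterion designed in the thesis.

import numpy as np

def next_revisit_interval(innovation, innovation_cov, t_min=0.5, t_max=4.0, gate=2.0):
    # Choose the next radar sampling (revisit) interval from the normalized innovation
    # squared of the tracking filter: a large value suggests a maneuver, so revisit sooner.
    nis = float(innovation @ np.linalg.inv(innovation_cov) @ innovation)
    if nis > gate:                              # likely maneuvering: tighten the interval
        return t_min
    return min(t_max, t_min + (t_max - t_min) * (1.0 - nis / gate))   # quiescent: stretch it

print(next_revisit_interval(np.array([5.0, 1.0]), 4.0 * np.eye(2)))   # large innovation -> 0.5 s
print(next_revisit_interval(np.array([0.5, 0.2]), 4.0 * np.eye(2)))   # small innovation -> ~3.9 s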
14

Hirschowitz, Steven. "The interaction of sampling ratio and modelling method in prediction of binary target with rare target class." Thesis, 2009. http://hdl.handle.net/10539/7248.

Full text
Abstract:
In many practical predictive data mining problems with a binary target, one of the target classes is rare. In such a situation it is common practice to decrease the ratio of common to rare class cases in the training set by under-sampling the common class. The relationship between the ratio of common to rare class cases in the training set and model performance was investigated empirically on three artificial and three real-world data sets. The results indicated that a flexible modelling method without regularisation benefits in both mean and variance of performance from a larger ratio when evaluated on a criterion sensitive to overfitting, and benefits in mean but not variance of performance when evaluated on a criterion less sensitive to overfitting. For an inflexible modelling method and a flexible method with regularisation, the effects of a larger ratio were less consistent. In no circumstances, however, was a larger ratio found to be detrimental to model performance, however measured.
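A minimal Python sketch of the under-sampling step studied in the thesis is shown below; treating label 1 as the rare class, the ratio argument and toy data are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(6)

def undersample_common(X, y, ratio):
    # Under-sample the common (majority) class so the training set holds roughly
    # `ratio` common-class cases per rare-class case; label 1 is assumed to be rare.
    rare = np.flatnonzero(y == 1)
    common = np.flatnonzero(y == 0)
    n_keep = min(len(common), int(round(ratio * len(rare))))
    keep = np.concatenate([rare, rng.choice(common, n_keep, replace=False)])
    rng.shuffle(keep)
    return X[keep], y[keep]

# Toy data with a ~2 % rare class; build training sets at common:rare ratios of 1:1 and 10:1.
X = rng.normal(size=(5000, 4))
y = (rng.random(5000) < 0.02).astype(int)
for ratio in (1, 10):
    X_train, y_train = undersample_common(X, y, ratio)
    print(f"ratio {ratio}:1 -> common: {np.sum(y_train == 0)}, rare: {np.sum(y_train == 1)}")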
15

Maake, Pauline Mmaletshabo. "Pre-operative patient education for patients undergoing kidney transplant as viewed by nephrology nurses." Diss., 2017. http://hdl.handle.net/10500/23708.

Full text
Abstract:
The purpose of this study was to determine the views of nephrology nurses regarding pre-operative education prior to kidney transplant. The study was conducted in the Nephrology Ward of King Abdulaziz Medical City, Riyadh, Kingdom of Saudi Arabia. A qualitative descriptive design was used. Purposive non-probability sampling was used until data saturation occurred. The target population was registered nurses working in the Nephrology Unit. Both male and female nurses aged between 25 and 59 years who had worked for at least one year in the Nephrology Unit were included in the study. Data saturation was reached after interviewing 15 nephrology nurses. Themes and categories emerged from adopting Creswell's (2013) "data analysis spiral". Some of the key findings were that pre-operative patient education is a multidisciplinary team approach and that the psychosocial aspects of the patients should be taken into consideration before educating them. Conclusions were drawn and recommendations were made from the findings of this study. Ultimately, the key recommendations were that there is a need to train and empower nurses in the importance of delivering pre-operative education, and that expatriate nurses should have access to Arabic speakers to overcome language barriers while educating the patients.
Health Studies
M.A. (Health Studies)
