Dissertations / Theses on the topic 'Sampling'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Sampling.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Malik, M. B. "Acceptance sampling : Robust alternatives for sampling by variables." Thesis, University of Essex, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.482618.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hörmann, Wolfgang, and Josef Leydold. "Monte Carlo Integration Using Importance Sampling and Gibbs Sampling." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1642/1/document.pdf.

Full text
Abstract:
To evaluate the expectation of a simple function with respect to a complicated multivariate density, Monte Carlo integration has become the main technique. Gibbs sampling and importance sampling are the most popular methods for this task. In this contribution we propose a new simple general-purpose importance sampling procedure. In a simulation study we compare the performance of this method with the performance of Gibbs sampling and of importance sampling using a vector of independent variates. It turns out that the new procedure is much better than independent importance sampling; up to dimension five it is also better than Gibbs sampling. The simulation results indicate that for higher dimensions Gibbs sampling is superior. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
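To make the comparison above concrete, here is a minimal self-normalised importance-sampling estimator in Python; the standard-normal target and the wider Gaussian proposal are our illustrative choices, not the paper's setup.

    import numpy as np

    def is_mean(f, log_p, log_q, draw_q, n=100_000, seed=0):
        """Self-normalised importance sampling: weight proposal draws by p/q."""
        rng = np.random.default_rng(seed)
        x = draw_q(rng, n)                  # draws from the proposal q
        logw = log_p(x) - log_q(x)          # unnormalised log-weights
        w = np.exp(logw - logw.max())       # numerically stabilised weights
        return float(np.sum(w * f(x)) / np.sum(w))

    # Illustrative: target p = N(0,1), proposal q = N(0,2), unnormalised log-densities.
    est = is_mean(lambda x: x**2,
                  log_p=lambda x: -0.5 * x**2,
                  log_q=lambda x: -x**2 / 8.0,
                  draw_q=lambda rng, n: rng.normal(0.0, 2.0, n))
    # est should be close to E_p[x^2] = 1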
APA, Harvard, Vancouver, ISO, and other styles
3

Meister, Kadri. "On Methods for Real Time Sampling and Distributions in Sampling." Doctoral thesis, Umeå : Univ, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-415.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Miljak, Marija. "Conformational sampling of intrinsically disordered peptides by enhanced sampling methods." Thesis, University of Southampton, 2017. https://eprints.soton.ac.uk/422232/.

Full text
Abstract:
The aim of this study was to explore the conformational equilibrium of four cyclic hormone peptides in order to investigate to what extent the bound conformational state can be observed from solution-phase simulations. The studied cyclic peptides share the same structural motif of a six-membered ring closed by a disulphide bridge between the cysteine residues. They also belong to the class of intrinsically disordered peptides, known to exist in an equilibrium of different conformations. Elucidating their conformational ensemble using traditional experimental techniques has proven hard due to the fast interconversion between conformational states, and thus molecular dynamics simulation may help in providing a detailed picture of the peptide's conformational ensemble. However, conventional molecular dynamics simulations are limited by the long time scales required to observe many conformational motions. Therefore, in this work Replica Exchange techniques were applied to test the rate of convergence in conformational sampling. Moreover, to predict the conformational equilibrium of the peptides, a combination of results from enhanced sampling methods, DFT calculations and NMR experiments was used. It was found that calculated chemical shifts weighted by the ensemble populations of each conformational state were better able to reproduce the experimental chemical shift data, over and above any single peptide conformation. This result supports the use of enhanced sampling molecular dynamics computer simulations to study intrinsically disordered peptides. The knowledge of the conformational equilibrium and the relative populations of the unbound states of the peptides obtained using this approach may help in predicting the structural and functional roles of the bound-state peptide. Another purpose of this work was to check the extent to which a difference in peptide sequence may contribute to their functional diversity. Finally, the performance of the Replica Exchange simulations was compared, indicating that Solute Tempering is to be preferred over temperature Replica Exchange for reasons of computational efficiency.
APA, Harvard, Vancouver, ISO, and other styles
5

Grepstad, Sigrid. "Sampling on Quasicrystals." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-12650.

Full text
Abstract:
We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.
APA, Harvard, Vancouver, ISO, and other styles
6

Hörmann, Wolfgang, and Josef Leydold. "Quasi Importance Sampling." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1394/1/document.pdf.

Full text
Abstract:
There arise two problems when the expectation of some function with respect to a nonuniform multivariate distribution has to be computed by (quasi-) Monte Carlo integration: the integrand can have singularities when the domain of the distribution is unbounded and it can be very expensive or even impossible to sample points from a general multivariate distribution. We show that importance sampling is a simple method to overcome both problems. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
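The combination of low-discrepancy point sets with importance weights can be sketched in a few lines; this one-dimensional example is ours (it assumes SciPy 1.7+ for scipy.stats.qmc), not the paper's construction.

    import numpy as np
    from scipy.stats import norm, qmc

    u = qmc.Sobol(d=1, scramble=True).random(2**12).ravel()  # low-discrepancy points
    x = norm.ppf(u, loc=0.0, scale=2.0)    # map to a Gaussian proposal q = N(0, 2)
    logw = -0.5 * x**2 + x**2 / 8.0        # log p - log q up to constants, p = N(0, 1)
    w = np.exp(logw - logw.max())
    print(np.sum(w * x**2) / np.sum(w))    # ~ E_p[x^2] = 1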
APA, Harvard, Vancouver, ISO, and other styles
7

Gatarić, Milana. "Nonuniform generalized sampling." Thesis, University of Cambridge, 2016. https://www.repository.cam.ac.uk/handle/1810/254971.

Full text
Abstract:
In this thesis we study a novel approach to stable recovery of unknown compactly supported L2 functions from finitely many nonuniform samples of their Fourier transform, so-called Nonuniform Generalized Sampling (NUGS). This framework is based on a recently introduced idea of generalized sampling for stable sampling and reconstruction in abstract Hilbert spaces, which allows one to tailor the reconstruction space to suit the function to be approximated and thereby obtain rapidly convergent approximations. While preserving this important hallmark, NUGS describes sampling by the use of weighted Fourier frames, thus allowing for highly nonuniform sampling schemes with the points taken arbitrarily close. The particular setting of NUGS directly corresponds to various image recovery models ubiquitous in applications such as magnetic resonance imaging, computed tomography and electron microscopy, where Fourier samples are often taken not necessarily on a Cartesian grid, but rather along spiral trajectories or radial lines. Specifically, NUGS provides stable recovery in a desired reconstruction space subject to sufficient sampling density and sufficient sampling bandwidth, where the latter depends solely on the particular reconstruction space. For univariate compactly supported wavelets, we show that only a linear scaling between the number of wavelets and the sampling bandwidth is both sufficient and necessary for stable recovery. Furthermore, in the wavelet case, we provide an efficient implementation of NUGS for recovery of wavelet coefficients from Fourier data. Additionally, the sufficient relation between the dimension of the reconstruction space and the bandwidth of the nonuniform samples is analysed for the reconstruction spaces of piecewise polynomials or splines with a nonequidistant sequence of knots, and it is shown that this relation is also linear for splines and piecewise polynomials of fixed degree, but quadratic for piecewise polynomials of varying degree. In order to derive explicit guarantees for stable recovery from nonuniform samples in terms of the sampling density, we also study conditions sufficient to ensure existence of a particular frame. Firstly, we establish the sharp and dimensionless sampling density that is sufficient to guarantee a weighted Fourier frame for the space of multivariate compactly supported L2 functions. Furthermore, subject to non-sharp densities, we improve existing estimates of the corresponding frame bounds. Secondly, we provide sampling densities sufficient to ensure a frame, as well as estimates of the corresponding frame bounds, when a multivariate bandlimited function and its derivatives are sampled at nonuniformly spaced points.
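In schematic terms (our simplified notation, not the thesis's), given nonuniform Fourier samples \(\hat f(\omega_j)\) and a finite-dimensional reconstruction space \(\mathcal{T}\), NUGS computes a weighted least-squares fit

\[
\tilde f \;=\; \arg\min_{g \in \mathcal{T}} \sum_{j} \mu_j \bigl|\hat g(\omega_j) - \hat f(\omega_j)\bigr|^2 ,
\]

where the weights \(\mu_j\) compensate for the local density of the sampling points, and the weighted Fourier frame condition guarantees that the fit is stable.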
APA, Harvard, Vancouver, ISO, and other styles
8

Fike, William H. "Lobster Sampling Trap." Fogler Library, University of Maine, 2007. http://www.library.umaine.edu/theses/pdf/FikeWH2007.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Fang, Jing. "Herded Gibbs sampling." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/43188.

Full text
Abstract:
The Gibbs sampler is one of the most popular algorithms for inference in statistical models. In this thesis, we introduce a herding variant of this algorithm that is entirely deterministic. We demonstrate, with simple examples, that herded Gibbs exhibits better convergence behavior for approximating the marginal distributions than Gibbs sampling. In particular, image denoising exemplifies the effectiveness of herded Gibbs as an inference technique for Markov Random Fields (MRFs). Also, we adopt herded Gibbs as the inference engine for Conditional Random Fields (CRFs) in Named Entity Recognition (NER) and show that it is competitive with the state of the art. The conclusion is that herded Gibbs, for graphical models with nodes of low degree, is very close to Gibbs sampling in terms of the complexity of the code and computation, but that it converges much faster.
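The herding idea at the heart of the method is easiest to see in one dimension; this sketch of the deterministic update for a single binary probability is ours, not code from the thesis.

    def herd_bernoulli(p, n):
        """Deterministic herding: the running frequency of emitted ones tracks p
        with O(1/n) error, versus O(1/sqrt(n)) for i.i.d. sampling."""
        w, out = 0.0, []
        for _ in range(n):
            w += p
            if w >= 1.0:
                out.append(1)
                w -= 1.0
            else:
                out.append(0)
        return out

    # herd_bernoulli(0.3, 10) -> [0, 0, 0, 1, 0, 0, 1, 0, 0, 1]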
APA, Harvard, Vancouver, ISO, and other styles
10

Raja, Yogesh. "Adaptive visual sampling." Thesis, Queen Mary, University of London, 2010. http://qmro.qmul.ac.uk/xmlui/handle/123456789/607.

Full text
Abstract:
Various visual tasks may be analysed in the context of sampling from the visual field. In visual psychophysics, human visual sampling strategies have often been shown at a high level to be driven by various information- and resource-related factors such as the limited capacity of the human cognitive system, the quality of information gathered, its relevance in context and the associated efficiency of recovering it. At a lower level, we interpret many computer vision tasks to be rooted in similar notions of contextually relevant, dynamic sampling strategies which are geared towards the filtering of pixel samples to perform reliable object association. In the context of object tracking, the reliability of such endeavours is fundamentally rooted in the continuing relevance of the object models used for such filtering, a requirement complicated by real-world conditions such as dynamic lighting that inconveniently and frequently cause their rapid obsolescence. In the context of recognition, performance can be hindered by the lack of learned context-dependent strategies that satisfactorily filter out samples that are irrelevant or blunt the potency of models used for discrimination. In this thesis we interpret the problems of visual tracking and recognition in terms of dynamic spatial and featural sampling strategies and, in this vein, present three frameworks that build on previous methods to provide a more flexible and effective approach. Firstly, we propose an adaptive spatial sampling strategy framework to maintain statistical object models for real-time robust tracking under changing lighting conditions. We employ colour features in experiments to demonstrate its effectiveness. The framework consists of five parts: (a) Gaussian mixture models for semi-parametric modelling of the colour distributions of multicolour objects; (b) a constructive algorithm that uses cross-validation for automatically determining the number of components for a Gaussian mixture given a sample set of object colours; (c) a sampling strategy for performing fast tracking using colour models; (d) a Bayesian formulation enabling models of the object and the environment to be employed together in filtering samples by discrimination; and (e) a selectively-adaptive mechanism to enable colour models to cope with changing conditions and permit more robust tracking. Secondly, we extend the concept to an adaptive spatial and featural sampling strategy to deal with very difficult conditions such as small target objects in cluttered environments undergoing severe lighting fluctuations and extreme occlusions. This builds on previous work on dynamic feature selection during tracking by reducing redundancy in the features selected at each stage as well as more naturally balancing short-term and long-term evidence, the latter to facilitate model rigidity under sharp, temporary changes such as occlusion whilst permitting model flexibility under slower, long-term changes such as varying lighting conditions. This framework consists of two parts: (a) Attribute-based Feature Ranking (AFR), which combines two attribute measures: discriminability and independence from other features; and (b) Multiple Selectively-adaptive Feature Models (MSFM), which involves maintaining a dynamic feature reference of target object appearance. We call this framework Adaptive Multi-feature Association (AMA).
Finally, we present an adaptive spatial and featural sampling strategy that extends established Local Binary Pattern (LBP) methods and overcomes many severe limitations of the traditional approach such as limited spatial support, restricted sample sets and ad hoc joint and disjoint statistical distributions that may fail to capture important structure. Our framework enables more compact, descriptive LBP type models to be constructed which may be employed in conjunction with many existing LBP techniques to improve their performance without modification. The framework consists of two parts: (a) a new LBP-type model known as Multiscale Selected Local Binary Features (MSLBF); and (b) a novel binary feature selection algorithm called Binary Histogram Intersection Minimisation (BHIM) which is shown to be more powerful than established methods used for binary feature selection such as Conditional Mutual Information Maximisation (CMIM) and AdaBoost.
APA, Harvard, Vancouver, ISO, and other styles
11

Kimby, Adam. "Sampling och Friktion." Thesis, Blekinge Tekniska Högskola, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16461.

Full text
Abstract:
This Bachelor's thesis examines sampling as a creative process in relation to friction between users and creators. The goal of this thesis is to create an understanding of the relations involved in creative processes and to explore sustainable ways of working with them. To explore creative processes, sampling is interpreted in relation to music production and the musical culture involved in the shaping of worlds. Sampling is the process of creating from what already exists, in a way that involves users and creators with the past, present and future. The friction of creative processes involves an understanding of innovation and originality that questions existing principles. Sampling in relation to friction is about boundaries: how do we involve ourselves in an accountable process, and what is appropriation in relation to creative engagement?
APA, Harvard, Vancouver, ISO, and other styles
12

Pollard, John. "Adaptive distance sampling." Thesis, University of St Andrews, 2002. http://hdl.handle.net/10023/15176.

Full text
Abstract:
We investigate mechanisms to improve efficiency for line and point transect surveys of clustered populations by combining the distance methods with adaptive sampling. In adaptive sampling, survey effort is increased when areas of high animal density are located, thereby increasing the number of observations. We begin by building on existing adaptive sampling techniques to create both point and line transect adaptive estimators; these are then extended to allow the inclusion of covariates in the detection function estimator. However, the methods are limited, as the total effort required cannot be forecast at the start of a survey, and so a new fixed-total-effort adaptive approach is developed. A key difference in the new method is that it does not require the calculation of the inclusion probabilities typically used by existing adaptive estimators. The fixed effort method is primarily aimed at line transect sampling, but point transect derivations are also provided. We evaluate the new methodology by computer simulation, and report on surveys of harbour porpoise in the Gulf of Maine, in which the approach was compared with conventional line transect sampling. Line transect simulation results for a clustered population showed up to a 6% improvement in the adaptive density variance estimate over the conventional one, whilst when there was no clustering the adaptive estimate was 1% less efficient than the conventional one. For the harbour porpoise survey, the adaptive density estimate CVs (coefficients of variation) showed improvements of 8% for individual porpoise density and 14% for school density over the conventional estimates. The primary benefit of the fixed effort method is the potential to improve survey coverage, allowing a survey to complete within a fixed time and effort; an important feature if expensive survey resources are involved, such as an aircraft, crew and observers.
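As background, the conventional (non-adaptive) line-transect estimator with a half-normal detection function is compact enough to sketch; this illustration is ours, not the thesis's adaptive estimators.

    import numpy as np

    def line_transect_density(perp_distances, total_line_length):
        """Conventional distance-sampling estimate assuming a half-normal
        detection function g(x) = exp(-x^2 / (2 sigma^2))."""
        x = np.asarray(perp_distances, dtype=float)
        sigma2 = np.mean(x**2)               # maximum-likelihood estimate of sigma^2
        mu = np.sqrt(np.pi * sigma2 / 2.0)   # effective strip half-width = integral of g
        return x.size / (2.0 * total_line_length * mu)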
APA, Harvard, Vancouver, ISO, and other styles
13

Wang, Yu. "Revisiting Network Sampling." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1546425835709593.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Jones, Philip William. "Asynchronous Sampling Wattmeter." Dissertation, Department of Electrical Engineering, Carleton University, Ottawa, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
15

Rummler, William August. "Sampling edge covers." Online version of thesis, 2009. http://hdl.handle.net/1850/10647.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Corrow, Allissa. "Double Sampling for Coarse Woody Debris Estimations Following Line Intersect Sampling." The University of Montana, 2010. http://etd.lib.umt.edu/theses/available/etd-06122010-133739/.

Full text
Abstract:
Coarse woody debris (CWD), an essential component of healthy forests, has typically been defined as dead and down, large woody material. Quantification of this resource provides a useful metric for assessing wildlife habitat, fuel loading, and more recently, carbon sequestration. Although many CWD sampling methods exist, accurate estimation is difficult and expensive. Double sampling incorporates auxiliary data that is positively correlated with the attribute of interest as a means of reducing sampling costs and/or increasing estimation precision. The present study investigated double sampling applications to the common CWD sampling technique, line intersect sampling (LIS). We identified aggregate length as a potential auxiliary variable for estimating aggregate volume and abundance of CWD. However, further analysis indicated that the cost difference of the sampling phases, coupled with the correlation of the variables was not sufficient to warrant double sampling in the study area. Further investigation is needed to develop accurate and efficient CWD sampling methods with widespread applicability.
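The estimator at work here is the classical two-phase (double sampling) ratio estimator; a minimal sketch of ours, assuming the costly variable y is roughly proportional to the cheap auxiliary x.

    import numpy as np

    def two_phase_ratio_total(x_phase1, x_phase2, y_phase2, N):
        """Phase 1 measures the auxiliary x on a large sample; phase 2 adds the
        costly y on a subsample.  The phase-2 ratio scales the phase-1 mean."""
        r = np.mean(y_phase2) / np.mean(x_phase2)   # sample ratio from phase 2
        return N * r * np.mean(x_phase1)            # scaled to a population total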
APA, Harvard, Vancouver, ISO, and other styles
17

Qasim, Muhammad. "Calibration Estimation under Nonresponse based on Simple Random Sampling Vs Pareto Sampling." Thesis, Örebro universitet, Handelshögskolan vid Örebro Universitet, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-35668.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Kerr, Kent. "Sampling for Beryllium Surface Contamination using Wet, Dry and Alcohol Wipe Sampling." Washington, D.C. : Oak Ridge, Tenn. : United States. National Nuclear Security Administration ; distributed by the Office of Scientific and Technical Information, U.S. Dept. of Energy, 2004. http://www.osti.gov/servlets/purl/837587-M4P95G/native/.

Full text
Abstract:
Thesis (M.S.); Submitted to Central Missouri State University, Warrensburg, MO (US); 17 Dec 2004.
Published through the Information Bridge: DOE Scientific and Technical Information. Kerr, Kent, NNSA Kansas City Site Office (US), 12/17/2004. The report is also available in paper and microfiche from NTIS.
APA, Harvard, Vancouver, ISO, and other styles
19

Zhang, Haizhang. "Sampling with reproducing kernels." Doctoral dissertation, Syracuse University, 2009. http://wwwlib.umi.com/cr/syr/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Forsgren, Niklas. "Sampling Oscilloscope On-Chip." Thesis, Linköping University, Department of Electrical Engineering, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1563.

Full text
Abstract:

Signal-integrity degradation from such factors as supply and substrate noise and crosstalk between interconnects restricts performance advances in Very Large Scale Integration (VLSI). To avoid this and to preserve signal integrity, accurate measurements of the on-chip signal must be performed to gain insight into how these physical phenomena affect the signals.

High-speed digital signals can be taken off chip through buffers that add delay. Propagating a signal through buffers restores the signal, which can be good if only the information is wanted. But if the waveform is of importance, or if an analog signal is to be measured, the restoration is unwanted. Analog buffers can be used, but they are limited to a few hundred MHz. Even if the high-speed signal is taken off chip, the bandwidth of on-chip signals is getting so high that using an external oscilloscope for reliable measurement becomes impossible. Therefore other alternatives must be used.

In this work, an on-chip measuring circuit is designed, which makes use of the principle of a sampling oscilloscope. Only one sample is taken each period, resulting in an output frequency much lower than the input frequency. A slower signal is easier to take off-chip and it can easily be processed with an ordinary oscilloscope.
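The one-sample-per-period principle is easy to verify numerically; the rates below are illustrative values, not those of the designed circuit.

    import numpy as np

    f_in = 1e9                           # repetitive 1 GHz input signal
    T = 1.0 / f_in
    dt = T / 100.0                       # advance of the sampling instant per period
    n = np.arange(200)
    t = n * (T + dt)                     # one sample per period, each slightly later
    x = np.sin(2 * np.pi * f_in * t)     # equals sin(2*pi*f_in*dt*n): the waveform
                                         # replayed about 100x slower per sample index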

APA, Harvard, Vancouver, ISO, and other styles
21

Schelin, Lina. "Spatial sampling and prediction." Doctoral thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-53286.

Full text
Abstract:
This thesis discusses two aspects of spatial statistics: sampling and prediction. In spatial statistics, we observe some phenomenon in space. Space is typically of two or three dimensions, but can be of higher dimension. Questions in mind could be: What is the total amount of gold in a gold-mine? How much precipitation could we expect in a specific unobserved location? What is the total tree volume in a forest area? In spatial sampling the aim is to estimate global quantities, such as population totals, based on samples of locations (papers III and IV). In spatial prediction the aim is to estimate local quantities, such as the value at a single unobserved location, with a measure of uncertainty (papers I, II and V). In papers III and IV, we propose sampling designs for selecting representative probability samples in the presence of auxiliary variables. If the phenomena under study have clear trends in the auxiliary space, estimation of population quantities can be improved by using representative samples. Such samples also enable estimation of population quantities in subspaces and are especially needed for multi-purpose surveys, when several target variables are of interest. In papers I and II, the objective is to construct valid prediction intervals for the value at a new location, given observed data. Prediction intervals typically rely on the kriging predictor having a Gaussian distribution. In paper I, we show that the distribution of the kriging predictor can be far from Gaussian, even asymptotically. This motivated us to propose a semiparametric method that does not require distributional assumptions. Prediction intervals are constructed from the plug-in ordinary kriging predictor. In paper V, we consider prediction in the presence of left-censoring, where observations falling below a minimum detection limit are not fully recorded. We review existing methods and propose a semi-naive method. The semi-naive method is compared to one model-based method and two naive methods, all based on variants of the kriging predictor.
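The plug-in ordinary kriging predictor mentioned above can be written compactly; this is a generic sketch with an assumed fitted variogram, not code from the papers.

    import numpy as np

    def ordinary_kriging(coords, values, target, gamma):
        """Ordinary kriging prediction at one unobserved location; gamma is a
        fitted variogram function, e.g. gamma = lambda h: 1.0 - np.exp(-h)."""
        n = len(values)
        h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = gamma(h)             # variogram among observed sites
        A[:n, n] = A[n, :n] = 1.0        # Lagrange row/column enforcing sum(weights) = 1
        b = np.append(gamma(np.linalg.norm(coords - target, axis=1)), 1.0)
        lam = np.linalg.solve(A, b)[:n]  # kriging weights
        return float(lam @ values)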
APA, Harvard, Vancouver, ISO, and other styles
22

Hörmann, Wolfgang. "New Importance Sampling Densities." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1066/1/document.pdf.

Full text
Abstract:
To compute the expectation of a function with respect to a multivariate distribution naive Monte Carlo is often not feasible. In such cases importance sampling leads to better estimates than the rejection method. A new importance sampling distribution, the product of one-dimensional table mountain distributions with exponential tails, turns out to be flexible and useful for Bayesian integration problems. To obtain a heavy-tailed importance sampling distribution a new radius transform for the above distribution is suggested. Together with a linear transform the new importance sampling distributions lead to simple and fast integration algorithms with reliable error bounds. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
APA, Harvard, Vancouver, ISO, and other styles
23

Aly, Hussein A. "Regularized image up-sampling." Thesis, University of Ottawa (Canada), 2004. http://hdl.handle.net/10393/29073.

Full text
Abstract:
This thesis addresses the problem of performing image magnification to achieve higher perceived resolution for grey-scale and color images. A new perspective on the problem is introduced through the new concept of a theoretical camera that can acquire an ideal high resolution image. A new formulation of the problem is then introduced using two ingredients: a newly designed observation model and the total-variation regularizer. An observation model, that establishes a generalized relation between the desired magnified image and the measured lower resolution image, has been newly designed based on careful study of the physical acquisition processes that have generated the images. The result is a major contribution of this thesis: a closed-form solution for obtaining the observation model. This closed form has been implemented and observation models were obtained for different typical scenarios, and their performance was shown to outperform observation models used in the literature. Two new theorems for designing the theoretical camera, adapted to the display device used, on arbitrary lattices have been developed. The thesis presents new analysis with a signal processing perspective that justifies the use of the total-variation regularizer as a priori knowledge for the magnified image; this analysis is defined on both the low and the high resolution lattices simultaneously. The resulting objective function has been minimized numerically using the level-set method with two new motions that interact simultaneously, leading to a solution scheme that is not trapped in constant-image solutions and converges to a unique solution regardless of the initial estimate. For color images, the human visual system characteristics were involved in the choice of the color space used in the implementation. It was found that a proper color space such as YCbCr that focuses on magnifying a better luminance channel provided the same result as a vectorial total-variation formulation, but at a reduced computational cost. The quality of the magnified images obtained by the new approaches of this thesis surpassed the quality of state-of-the-art methods from the literature.
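A schematic form of the variational problem described (generic notation, not the thesis's exact formulation) seeks the magnified image \(u\) as

\[
\hat u \;=\; \arg\min_u \int_\Omega \lvert \nabla u \rvert \, dx
\quad\text{subject to}\quad A u = b,
\]

where \(A\) is the observation model relating the high-resolution image to the measured low-resolution image \(b\), and the total-variation term favours sharp edges over ringing.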
APA, Harvard, Vancouver, ISO, and other styles
24

Hörmann, Wolfgang, and Josef Leydold. "Improved Perfect Slice Sampling." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 2003. http://epub.wu.ac.at/868/1/document.pdf.

Full text
Abstract:
Perfect slice sampling is a method to turn Markov Chain Monte Carlo (MCMC) samplers into exact generators for independent random variates. The originally proposed method is rather slow and thus several improvements have been suggested. However, two of them are erroneous. In this article we give a short introduction to perfect slice sampling, point out incorrect methods, and give a new improved version of the original algorithm. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
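For orientation, this is the ordinary (non-perfect) univariate slice sampler with stepping-out and shrinkage that such exact methods build upon; a generic sketch following Neal's formulation, not the algorithm of the paper.

    import math
    import random

    def slice_sample(f, x0, w, n, rng=random):
        """Univariate slice sampler: draw a height under the density, widen a
        bracket until it covers the slice, then shrink until a point is accepted."""
        xs, x = [], x0
        for _ in range(n):
            y = rng.uniform(0.0, f(x))       # auxiliary height under the density
            left = x - w * rng.random()
            right = left + w
            while f(left) > y:               # step out
                left -= w
            while f(right) > y:
                right += w
            while True:                      # shrink toward the current point
                x1 = rng.uniform(left, right)
                if f(x1) > y:
                    x = x1
                    break
                if x1 < x:
                    left = x1
                else:
                    right = x1
            xs.append(x)
        return xs

    # e.g. slice_sample(lambda v: math.exp(-v * v / 2.0), 0.0, 1.0, 1000)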
APA, Harvard, Vancouver, ISO, and other styles
25

Shanian, Sara. "Selective Sampling for Classification." Thesis, Université Laval, 2007. http://www.theses.ulaval.ca/2007/24857/24857.pdf.

Full text
Abstract:
One of the goals of machine learning research is to build accurate classifiers from a set of labeled examples. Some problems require gathering a large set of labeled examples, which can be costly and time-consuming. To reduce these expenses, one can use active learning algorithms. These algorithms exploit the possibility of performing a small number of label queries on a large set of unlabeled examples to build an accurate classifier. It should be mentioned that current active learning algorithms themselves have some known weak points which may lead them to perform unsuccessfully in certain situations. In this thesis, we propose a novel active learning algorithm. Our proposed algorithm not only mitigates the weak points of previous active learning algorithms, but also performs competitively with widely known active learning algorithms while being easy to implement.
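A minimal example of the kind of query rule active learners rely on, shown here as a generic uncertainty-sampling baseline rather than the algorithm proposed in the thesis:

    import numpy as np

    def uncertainty_query(proba):
        """Return the index of the unlabeled example whose predicted class
        distribution has maximum entropy (rows: examples, columns: classes)."""
        p = np.clip(proba, 1e-12, 1.0)
        entropy = -np.sum(p * np.log(p), axis=1)
        return int(np.argmax(entropy))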
APA, Harvard, Vancouver, ISO, and other styles
26

Said, Maya Rida 1976. "Discrete-time randomized sampling." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/86836.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Vul, Edward. "Sampling in human cognition." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62097.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 117-126).
Bayesian Decision Theory describes optimal methods for combining sparse, noisy data with prior knowledge to build models of an uncertain world and to use those models to plan actions and make novel decisions. Bayesian computational models correctly predict aspects of human behavior in cognitive domains ranging from perception to motor control and language. However, the predictive success of Bayesian models of cognition has highlighted long-standing challenges in bridging the computational and process levels of cognition. First, the computations required for exact Bayesian inference are incommensurate with the limited resources available to cognition (e.g., computational speed and memory). Second, Bayesian models describe computations but not the processes that carry out these computations, and fail to accurately predict human behavior under conditions of cognitive load or deficits. I suggest a resolution to both challenges: the mind approximates Bayesian inference by sampling. Experiments across a wide range of cognition demonstrate Monte-Carlo-like behavior by human observers; moreover, models of cognition based on specific Monte Carlo algorithms can describe previously elusive cognitive phenomena such as perceptual bistability and probability matching. When sampling algorithms are treated as process models of human cognition, the computational and process levels can be modeled jointly to shed light on new and old cognitive phenomena.
by Edward Vul.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
28

Bhandari, Ayush. "Sampling time-resolved phenomena." Thesis, Massachusetts Institute of Technology, 2018. https://hdl.handle.net/1721.1/128455.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2018
Cataloged from student-submitted PDF of thesis.
Includes bibliographical references (pages 193-210).
Broadly speaking, time-resolved phenomena refers to three-dimensional capture of a scene based on the time-of-flight principle. Since distance is the product of speed and time, knowing the time-of-flight allows one to estimate distances. This time-of-flight may be attributed to a pulse of light or a wave packet of sound. Depending on the sub-band of the electromagnetic spectrum, the interaction of waves or pulses with the scene of interest results in measurements, and based on this proxy of the physical world, one is interested in inferring physical properties of the scene. This may be something as simple as depth, or something more involved such as the fluorescence lifetime of a biological sample or the diffusion coefficient of a turbid/scattering medium. The goal of this work is to develop a unifying approach to study time-resolved phenomena across various sub-bands of the electromagnetic spectrum, devise algorithms to solve the corresponding inverse problems and provide fundamental limits. Sampling theory, which deals with the interplay between the discrete and the continuous realms, plays a critical role in this work due to the continuous nature of the physical world and the discrete nature of its proxy, that is, the time-resolved measurements.
by Ayush Bhandari.
Ph. D.
Ph.D. Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences
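The basic ranging relation behind all of these modalities is worth stating: a pulse travels to the scene and back, so with round-trip time \(\tau\) and propagation speed \(c\),

\[
d = \frac{c\,\tau}{2}, \qquad \text{e.g. } \tau = 10\ \mathrm{ns} \;\Rightarrow\; d = \frac{(3\times 10^{8}\ \mathrm{m/s})(10^{-8}\ \mathrm{s})}{2} = 1.5\ \mathrm{m}.
\]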
APA, Harvard, Vancouver, ISO, and other styles
29

Herrmann, Felix J. "Seismology meets compressive sampling." Institute for Pure & Applied Mathematics. University of California, Los Angeles, 2007. http://hdl.handle.net/2429/603.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Walworth, James. "Soil Sampling and Analysis." College of Agriculture and Life Sciences, University of Arizona (Tucson, AZ), 2006. http://hdl.handle.net/10150/144813.

Full text
Abstract:
5 pp.
Soil testing comprises four steps: collection of a representative soil sample, laboratory analyses of the soil sample, interpretation of analytical results, and management recommendations based on interpreted analytical results.
APA, Harvard, Vancouver, ISO, and other styles
31

Walworth, J. L. "Soil Sampling and Analysis." College of Agriculture and Life Sciences, University of Arizona (Tucson, AZ), 2011. http://hdl.handle.net/10150/239610.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Karlberg, Nicholas. "Fjärrkontrollstyrd sampling av tröghetssensorer." Thesis, Umeå universitet, Institutionen för tillämpad fysik och elektronik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-160737.

Full text
Abstract:
The department of Biomedical Engineering Research and Development at Norrland University Hospital (MT-FoU) is developing a motion measurement system (MoLab) that aims to provide quantitative data from individuals with impaired mobility. This information can serve as a basis for treatment and be used to evaluate treatment follow-up. The measurement system is currently limited by its dependence on an access point and an associated local network to communicate data to an evaluation computer via the TCP/IP protocol. This bachelor's thesis aims to give the measurement system greater freedom to measure without depending on a stationary access point and evaluation computer. With a more mobile system, data can be obtained in a more authentic, everyday environment for the individuals being measured. An internal memory has been connected to the inertial sensor devices, and the user interface is instead a remote control, which starts and stops measurements. The remote control acts as the access point to which the sensor devices connect, and start and stop signals are transmitted via UDP broadcast from the remote control to the sensor devices. On a given stop signal, sensor data are transferred to the internal memory card of the remote control. It is important that the measurements take place synchronously, where synchronization can be quantified as the time difference between the sensor devices at measurement start. With this method, start synchronization ends up well below a well-defined threshold value of 2 ms that was also implemented in MoLab, giving the system greater mobility.
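For flavour, a minimal version of the start-signal broadcast described above might look like this in Python; the port and payload are hypothetical placeholders, not values from the thesis.

    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(b"START", ("255.255.255.255", 5005))  # all listening sensor units react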
APA, Harvard, Vancouver, ISO, and other styles
33

Merkelov, Fedor, and Yaroslav Kodess. "Design and Implementation of Sampling Rate Converters for Conversions between Arbitrary Sampling Rates." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1151.

Full text
Abstract:

In many digital applications it is necessary to change the sampling rate by an arbitrary factor; an example is software radio, which must handle different conversion factors and standards.

This work focuses on the problem of designing and implementing sampling rate converters for conversions between arbitrary sampling rates.

The report presents an overview of different converter techniques and identifies a suitable scheme with low implementation cost. The main task of this work is a VHDL generator for the Farrow-based structure, created to speed up the design process. The report also presents the design methodology, which is the most important part of any design work.

The chosen scheme is produced by the VHDL generator and tested in MATLAB. The source code, together with results from tests of the implemented scheme, is attached to the report.
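For readers unfamiliar with the Farrow structure, here is a minimal software sketch using cubic Lagrange interpolation, one common choice of branch filters; the thesis's generator targets VHDL, and its filter design may differ.

    import numpy as np

    def farrow_cubic_resample(x, ratio):
        """Arbitrary-ratio resampling (ratio = f_out / f_in): four fixed branch
        coefficients per output sample, evaluated in the fractional delay mu by
        a Horner scheme, exactly as in a Farrow structure."""
        y, t = [], 1.0
        while t < len(x) - 2:
            k = int(t)
            mu = t - k
            xm1, x0, x1, x2 = x[k - 1], x[k], x[k + 1], x[k + 2]
            c0 = x0
            c1 = -xm1 / 3.0 - x0 / 2.0 + x1 - x2 / 6.0
            c2 = (xm1 - 2.0 * x0 + x1) / 2.0
            c3 = (-xm1 + 3.0 * x0 - 3.0 * x1 + x2) / 6.0
            y.append(((c3 * mu + c2) * mu + c1) * mu + c0)  # Horner in mu
            t += 1.0 / ratio
        return np.array(y)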

APA, Harvard, Vancouver, ISO, and other styles
34

Linder, Daniel Frederick. "Optimal and permissible sampling rates for first-order sampling of two-band signals." M.S. thesis, Georgia Southern University, 2008. http://www.georgiasouthern.edu/etd/archive/spring2008/daniel_f_linder/Linder_Daniel_F_200801_MS.pdf.

Full text
Abstract:
Thesis (M.S.)--Georgia Southern University, 2008.
"A dissertation submitted to the Graduate Faculty of Georgia Southern University in partial fulfillment of the requirements for the degree Master of Science." Under the direction of Yan Wu. ETD. Electronic version approved: May 2008. Includes bibliographical references (p. 36)
APA, Harvard, Vancouver, ISO, and other styles
35

Kannan, Srinivasaraghavan. "Applications of Advanced Sampling Methods for Enhanced Conformational Sampling of Biomolecules." Bremen : IRC-Library, Information Resource Center der Jacobs University Bremen, 2009. http://d-nb.info/1034722484/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Rahman, S. M. Rayhan. "Performance of local planners with respect to sampling strategies in sampling-based motion planning." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96891.

Full text
Abstract:
Automatically planning the motion of rigid bodies moving in 3D by translation and rotation in the presence of obstacles has long been a research challenge for mathematicians, algorithm designers and roboticists. The field made dramatic progress with the introduction of the probabilistic and sampling-based "roadmap" approach. However, motion planning when narrow passages are present has remained a challenge. This thesis presents a framework for experimenting with combinations of sampling strategies and local planners, and for comparing their performance on user-defined input problems. Our framework also allows parallel implementations on a variable number of processing cores. We present experimental results. In particular, our framework has allowed us to find combinations of sampling strategy and local planner that can solve difficult benchmark motion planning problems.
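The "local planner" role referred to above is easy to illustrate with its simplest variant, a straight-line planner; this generic sketch assumes a user-supplied collision_free predicate.

    import numpy as np

    def straight_line_planner(a, b, collision_free, steps=50):
        """Declare the roadmap edge (a, b) valid if densely sampled configurations
        along the straight segment between them are all collision-free."""
        for s in np.linspace(0.0, 1.0, steps):
            if not collision_free((1.0 - s) * a + s * b):
                return False
        return True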
APA, Harvard, Vancouver, ISO, and other styles
37

Messina, Carl J. "Labeled sampling consensus: a novel algorithm for robustly fitting multiple structures using compressed sampling." Master's thesis, University of Central Florida, 2011. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4983.

Full text
Abstract:
The ability to robustly fit structures in datasets that contain outliers is a very important task in Image Processing, Pattern Recognition and Computer Vision. Random Sample Consensus, or RANSAC, is a very popular method for this task, due to its ability to handle over 50% outliers. The problem with RANSAC is that it is only capable of finding a single structure. Therefore, if a dataset contains multiple structures, they must be found sequentially by finding the best fit, removing the points, and repeating the process. However, removing incorrect points from the dataset could prove disastrous. This thesis offers a novel approach to sampling consensus that extends its ability to discover multiple structures in a single iteration through the dataset. The process introduced is an unsupervised method, requiring no previous knowledge of the distribution of the input data. It uniquely assigns labels to different instances of similar structures; the algorithm is thus called Labeled Sampling Consensus, or L-SAC. These unique instances will tend to cluster around one another, allowing the individual structures to be extracted using simple clustering techniques. Since divisions instead of modes are analyzed, only a single instance of a structure need be recovered. This ability of L-SAC allows a novel sampling procedure to be presented, "compressing" the samples required compared to traditional sampling schemes while ensuring all structures have been found. L-SAC is a flexible framework that can be applied to many problem domains.
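For orientation, this is the single-structure RANSAC baseline that L-SAC extends; a minimal 2-D line-fitting sketch of ours, not the thesis's algorithm.

    import numpy as np

    def ransac_line(points, n_iter=500, tol=0.05, seed=0):
        """Repeatedly fit a line through a random minimal sample of two points
        and keep the hypothesis with the most inliers."""
        rng = np.random.default_rng(seed)
        best = np.zeros(len(points), dtype=bool)
        for _ in range(n_iter):
            p, q = points[rng.choice(len(points), size=2, replace=False)]
            d = q - p
            if np.allclose(d, 0.0):
                continue                                # degenerate sample, retry
            nrm = np.array([-d[1], d[0]]) / np.linalg.norm(d)   # unit normal
            inliers = np.abs((points - p) @ nrm) < tol  # point-to-line distances
            if inliers.sum() > best.sum():
                best = inliers
        return best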
ID: 030423298; System requirements: World Wide Web browser and PDF reader.; Mode of access: World Wide Web.; Thesis (M.S.E.E.)--University of Central Florida, 2011.; Includes bibliographical references (p. 70-72).
M.S.E.E.
Masters
Electrical Engineering and Computer Science
Engineering and Computer Science
Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
38

Akinola, A. A. "Automation of a sampling train for the in-duct sampling of dust-laden gases." Thesis, University of Strathclyde, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.371956.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Hiroz, D. "An Investigation into the Applicability of the Theory of Sampling to Environmental Sampling Programmes." Thesis, Queen's University Belfast, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.501275.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Li, Ping. "Stable random projections and conditional random sampling: two sampling techniques for modern massive datasets." Ph.D. diss., 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Nene, Rohit Ravindra. "Design of bioaerosol sampling inlets." Thesis, Texas A&M University, 2003. http://hdl.handle.net/1969.1/5736.

Full text
Abstract:
An experimental investigation involving the design, fabrication, and testing of an ambient sampling inlet and two additional Stokes-scaled inlets is presented here. Testing of each inlet was conducted at wind speeds of 2, 8, and 24 km/h (0.55, 2.22, and 6.67 m/s), and characterized for particle sizes between 5 and 20 µm AD. The base-line ambient sampling inlet, which operates at 100 L/min, was developed to interface with a Circumferential Slot Virtual Impactor aerosol concentrator. The inlet displays wind-speed-independent characteristics with a penetration above 90% for a nominal particle size of 10 µm AD at all wind speeds. Particles up to 11.5 µm AD are sampled through this inlet with a penetration above 80% at all wind speeds. In an effort to test the validity of Stokes scaling in the design of inlets, two additional inlets were designed to accommodate design flow rates of 400 L/min and 800 L/min, with the 100 L/min unit as the base inlet. Scaling was achieved by applying a Stokes scaling factor to selected parameters, such as the inlet aspiration gap, annular gap, window height, and the rise, which is the vertical distance extending from the lower flange to the base of the window. The scaled inlets display wind-independent penetration characteristics close to 95% for a nominal particle size of 10 µm AD. The scaled inlets also have the ability to sample particles up to a size of 13 µm AD with a penetration in excess of 80% at all wind speeds. Observations from the plots of penetration against the Stokes number based on the free-stream velocity suggest that it is insufficient to use only Stokes scaling for inlet design. A modified velocity ratio defined for omnidirectional inlets was incorporated into a summary of results obtained for all combinations of BSI units and wind speeds. Also, a correlation equation based on the Stokes number and a modified velocity ratio was developed as a model for predicting performance among the BSI family of inlets. This correlation, used in unison with Stokes scaling, provides promise for predicting performance and improving the overall design process of inlets.
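For context, inlet performance in studies of this kind is typically collapsed onto the particle Stokes number; one common definition, neglecting the slip correction, is

\[
\mathrm{Stk} = \frac{\rho_p\, d_p^{2}\, U}{18\, \mu\, D},
\]

where \(\rho_p\) and \(d_p\) are the particle density and diameter, \(U\) a characteristic gas velocity, \(\mu\) the gas viscosity and \(D\) a characteristic inlet dimension; Stokes scaling keeps this ratio fixed as geometry and flow rate change.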
APA, Harvard, Vancouver, ISO, and other styles
43

Gemulla, Rainer. "Sampling Algorithms for Evolving Datasets." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1224861856184-11644.

Full text
Abstract:
Perhaps the most flexible synopsis of a database is a uniform random sample of the data; such samples are widely used to speed up the processing of analytic queries and data-mining tasks, to enhance query optimization, and to facilitate information integration. Most of the existing work on database sampling focuses on how to create or exploit a random sample of a static database, that is, a database that does not change over time. The assumption of a static database, however, severely limits the applicability of these techniques in practice, where data is often not static but continuously evolving. In order to maintain the statistical validity of the sample, any changes to the database have to be appropriately reflected in the sample. In this thesis, we study efficient methods for incrementally maintaining a uniform random sample of the items in a dataset in the presence of an arbitrary sequence of insertions, updates, and deletions. We consider instances of the maintenance problem that arise when sampling from an evolving set, from an evolving multiset, from the distinct items in an evolving multiset, or from a sliding window over a data stream. Our algorithms completely avoid any accesses to the base data and can be several orders of magnitude faster than algorithms that do rely on such expensive accesses. The improved efficiency of our algorithms comes at virtually no cost: the resulting samples are provably uniform and only a small amount of auxiliary information is associated with the sample. We show that the auxiliary information not only facilitates efficient maintenance, but it can also be exploited to derive unbiased, low-variance estimators for counts, sums, averages, and the number of distinct items in the underlying dataset. In addition to sample maintenance, we discuss methods that greatly improve the flexibility of random sampling from a system's point of view. More specifically, we initiate the study of algorithms that resize a random sample upwards or downwards. Our resizing algorithms can be exploited to dynamically control the size of the sample when the dataset grows or shrinks; they facilitate resource management and help to avoid under- or oversized samples. Furthermore, in large-scale databases with data being distributed across several remote locations, it is usually infeasible to reconstruct the entire dataset for the purpose of sampling. To address this problem, we provide efficient algorithms that directly combine the local samples maintained at each location into a sample of the global dataset. We also consider a more general problem, where the global dataset is defined as an arbitrary set or multiset expression involving the local datasets, and provide efficient solutions based on hashing.
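The insertion-only special case of this maintenance problem is solved by the classic reservoir-sampling Algorithm R, sketched below; the thesis's algorithms generalise far beyond this.

    import random

    def reservoir_sample(stream, k, rng=random):
        """Maintain a uniform random sample of size k over an insertion-only
        stream, using O(k) memory and a single pass."""
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)
            else:
                j = rng.randrange(i + 1)   # uniform in {0, ..., i}
                if j < k:
                    sample[j] = item       # replace a random reservoir slot
        return sample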
APA, Harvard, Vancouver, ISO, and other styles
44

Grafström, Anton. "On unequal probability sampling designs." Doctoral thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-33701.

Full text
Abstract:
The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some interesting variable. When the units in the population do not have the same probability of being included in a sample, it is called unequal probability sampling. The inclusion probabilities are usually chosen to be proportional to some auxiliary variable that is known for all units in the population. When unequal probability sampling is applicable, it generally gives much better estimates than sampling with equal probabilities. This thesis consists of six papers that treat unequal probability sampling from a finite population of units. A random sample is selected according to some specified random mechanism called the sampling design. For unequal probability sampling there exist many different sampling designs. The choice of sampling design is important since it determines the properties of the estimator that is used. The main focus of this thesis is on evaluating and comparing different designs. Often it is preferable to select samples of a fixed size and hence the focus is on such designs. It is also important that a design has a simple and efficient implementation in order to be used in practice by statisticians. Some effort has been made to improve the implementation of some designs. In Paper II, two new implementations are presented for the Sampford design. In general a sampling design should also have a high level of randomization. A measure of the level of randomization is entropy. In Paper IV, eight designs are compared with respect to their entropy. A design called adjusted conditional Poisson has maximum entropy, but it is shown that several other designs are very close in terms of entropy. A specific situation called real time sampling is treated in Paper III, where a new design called correlated Poisson sampling is evaluated. In real time sampling the units pass the sampler one by one. Since each unit only passes once, the sampler must directly decide for each unit whether or not it should be sampled. The correlated Poisson design is shown to have much better properties than traditional methods such as Poisson sampling and systematic sampling.
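Two of the designs discussed are simple enough to sketch directly: Poisson sampling and the Horvitz-Thompson estimator it is paired with (a minimal illustration of ours, not the thesis's implementations).

    import numpy as np

    def poisson_sample(pi, seed=0):
        """Include unit i independently with probability pi[i]; the realised
        sample size is random, one reason fixed-size designs are often preferred."""
        rng = np.random.default_rng(seed)
        return np.flatnonzero(rng.random(len(pi)) < pi)

    def horvitz_thompson_total(y, pi, sample):
        """Design-unbiased estimator of the population total of y."""
        return float(np.sum(y[sample] / pi[sample]))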
APA, Harvard, Vancouver, ISO, and other styles
45

Karawatzki, Roman. "The Multivariate Ahrens Sampling Method." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 2006. http://epub.wu.ac.at/958/1/document.pdf.

Full text
Abstract:
The "Ahrens method" is a very simple method for sampling from univariate distributions. It is based on rejection from piecewise constant hat functions. It can be applied analogously to the multivariate case where hat functions are used that are constant on rectangular domains. In this paper we investigate the case of distributions with so called orthounimodal densities. Technical implementation details as well as their practical limitations are discussed. The application to more general distributions is considered. (author's abstract)
Series: Research Report Series / Department of Statistics and Mathematics
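A minimal univariate sketch of the rejection idea, assuming a vectorised density f that is monotone on each grid cell so that an endpoint value gives a valid constant hat:

    import numpy as np

    def ahrens_sample(f, breaks, n, seed=0):
        """Rejection from a piecewise-constant hat built on a user-chosen grid."""
        rng = np.random.default_rng(seed)
        b = np.asarray(breaks, dtype=float)
        heights = np.maximum(f(b[:-1]), f(b[1:]))    # dominating hat per cell
        probs = heights * np.diff(b)
        probs /= probs.sum()
        out = []
        while len(out) < n:
            i = rng.choice(len(probs), p=probs)      # pick a cell by hat area
            x = rng.uniform(b[i], b[i + 1])
            if rng.uniform(0.0, heights[i]) <= f(x): # accept with prob f(x)/hat(x)
                out.append(x)
        return np.array(out)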
APA, Harvard, Vancouver, ISO, and other styles
46

Abrahamson, Jeff. "Optimal matching and deterministic sampling." Thesis (advisor: Ali Shokoufandeh), Drexel University, Philadelphia, Pa., 2007. http://hdl.handle.net/1860/2526.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Williams, Robert Tristan. "Perfect Sampling of Vervaat Perpetuities." Scholarship @ Claremont, 2013. http://scholarship.claremont.edu/cmc_theses/628.

Full text
Abstract:
This paper focuses on the issue of sampling directly from the stationary distribution of Vervaat perpetuities. It improves upon an algorithm for perfect sampling first presented by Fill & Huber by implementing both a faster multigamma coupler and a moving value of Xmax to increase the chance of unification. For beta = 1 we are able to reduce the expected steps for a sample by 22%, and at just beta = 3 we lower the expected time by over 80%. These improvements allow us to sample in reasonable time from perpetuities with much higher values of beta than was previously possible.
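For contrast with perfect sampling, the naive approach forward-simulates the defining recursion Y' = U**(1/beta) * (1 + Y) and hopes the burn-in was long enough; that approximation error is exactly what the exact algorithms remove.

    import random

    def vervaat_forward(beta, steps=100_000, rng=random):
        """Approximate draw from the Vervaat perpetuity by iterating the
        distributional recursion; approximate only, unlike perfect sampling."""
        y = 0.0
        for _ in range(steps):
            y = rng.random() ** (1.0 / beta) * (1.0 + y)
        return y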
APA, Harvard, Vancouver, ISO, and other styles
48

Dodds, William C. "Sampling from the Hardcore Process." Scholarship @ Claremont, 2013. http://scholarship.claremont.edu/cmc_theses/681.

Full text
Abstract:
Partially Recursive Acceptance Rejection (PRAR) and bounding chains used in conjunction with coupling from the past (CFTP) are two perfect simulation protocols which can be used to sample from a variety of unnormalized target distributions. This paper first examines and then implements these two protocols to sample from the hardcore gas process. We empirically determine the subset of the hardcore process's parameters for which these two algorithms run in polynomial time. Comparing the efficiency of these two algorithms, we find that PRAR runs much faster for small values of the hardcore process's parameter whereas the bounding chain approach is vastly superior for large values of the process's parameter.
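Both protocols refine the naive acceptance-rejection sampler for the hardcore model, which is exact but exponentially slow on large graphs; for reference, our sketch:

    import random

    def hardcore_rejection(adj, lam, rng=random):
        """Sample an independent set I with probability proportional to lam**|I|
        from a graph given as an adjacency dict {vertex: neighbours}."""
        p = lam / (1.0 + lam)
        while True:
            s = {v for v in adj if rng.random() < p}   # proposal ~ lam**|S|
            if all(u not in s for v in s for u in adj[v]):
                return s                               # accept if independent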
APA, Harvard, Vancouver, ISO, and other styles
49

Khorvash, Massih. "On uniform sampling of cliques." Thesis, University of British Columbia, 2009. http://hdl.handle.net/2429/13923.

Full text
Abstract:
This thesis addresses the problem of uniformly sampling the cliques of a target size k in any given graph. As a natural approach to solving this problem, we used the already available state-of-the-art heuristic MAX-CLIQUE solvers. The heuristic MAX-CLIQUE algorithms used for this task have k-CLIQUE solvers as their subroutines. This thesis therefore examines how uniformly some of the state-of-the-art stochastic local search MAX-CLIQUE algorithms sample target cliques of graphs, and suggests various methods to improve their sampling performance. We also investigate the limitations of a generic method that uses already existing SAT samplers (such as XORSample and XORSample' [1]) to sample the solutions of SAT instances translated from k-CLIQUE instances. The main limitation of this method is the huge size of the encoded instances. Another important limitation is that sampling satisfying assignments of the encoded SAT instances is expensive. We found that some state-of-the-art heuristic MAX-CLIQUE algorithms (DLS-MC [2] and PLS [3]) are already able to sample near-uniformly the target cliques of many of the instances used in this thesis. Furthermore, we gained insights into various properties of these algorithms, which helped us to improve the sampling performance on the remaining instances.
APA, Harvard, Vancouver, ISO, and other styles
50

McShine, Lisa Maria. "Random sampling of combinatorial structures." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/28771.

Full text
APA, Harvard, Vancouver, ISO, and other styles