Theses on the topic "Full sampling"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 theses for your research on the topic "Full sampling".
You can also download the full text of each academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Explore theses across a wide variety of disciplines and organize your bibliography correctly.
Castorena, Juan. "Full-Waveform LIDAR Recovery at Sub-Nyquist Rates". International Foundation for Telemetering, 2013. http://hdl.handle.net/10150/579674.
Third generation LIDAR full-waveform (FW) based systems collect 1D FW signals of the echoes generated by laser pulses of wide bandwidth reflected at the intercepted objects to construct depth profiles along each pulse path. By emitting a series of pulses towards a scene using a predefined scanning pattern, a 3D image containing spatial-depth information can be constructed. Unfortunately, acquisition of a high number of wide bandwidth pulses is necessary to achieve high depth and spatial resolutions of the scene. This implies the collection of massive amounts of data which generate problems for the storage, processing and transmission of the FW signal set. In this research, we explore the recovery of individual continuous-time FW signals at sub-Nyquist rates. The key step to achieve this is to exploit the sparsity in FW signals. Doing this allows one to sub-sample and recover FW signals at rates much lower than that implied by Shannon's theorem. Here, we describe the theoretical framework supporting recovery and present the reader with examples using real LIDAR data.
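As a rough illustration of the sparsity-based, sub-Nyquist recovery idea summarized above (not Castorena's actual algorithm or data), the sketch below reduces a full-waveform return to a sparse spike train of echo locations and amplitudes on the Nyquist-rate grid, takes a small number of random projections, and recovers the return with orthogonal matching pursuit. The echo positions, sparsity level and measurement count are assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized full-waveform return on a Nyquist-rate grid: a few echoes,
# here reduced to a sparse spike train (echo positions and amplitudes).
n = 512
x = np.zeros(n)
x[[120, 260, 300]] = [1.0, 0.6, 0.8]          # assumed echo locations/amplitudes

# Sub-Nyquist acquisition: m << n random projections of the waveform.
m = 64
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x

# Orthogonal matching pursuit exploiting the sparsity of the return.
residual, support = y.copy(), []
for _ in range(3):                             # assumed sparsity level
    j = int(np.argmax(np.abs(Phi.T @ residual)))
    support.append(j)
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
print("recovered support:", sorted(support))
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```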
Rahman, S. M. Rayhan. "Performance of local planners with respect to sampling strategies in sampling-based motion planning". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96891.
Automatic motion planning for rigid bodies moving in 3D by translation and rotation in the presence of obstacles has long been a research challenge for mathematicians, algorithm designers and roboticists. The field made significant progress with the introduction of the sampling-based probabilistic roadmap method, but motion planning in the presence of narrow passages has remained a challenge. This thesis presents a framework for experimenting with combinations of sampling strategies and local planners, and for comparing their performance on user-defined problems. Our program can also be run in parallel on a variable number of processors. We present experimental results; in particular, our framework allowed us to find combinations of a sampling strategy with a local planner that can solve difficult benchmark problems.
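As a minimal sketch of the probabilistic-roadmap setting this abstract refers to (not the thesis's experimental framework), the following example samples collision-free configurations uniformly in a 2D scene, connects them with a straight-line local planner, and searches the resulting roadmap. The scene, the uniform sampler and the straight-line planner are placeholder choices where other sampling strategies and local planners could be swapped in.

```python
import heapq
import numpy as np

rng = np.random.default_rng(1)
obstacles = [((0.5, 0.5), 0.2)]                 # (center, radius), assumed scene

def collides(p):
    return any(np.hypot(p[0] - c[0], p[1] - c[1]) <= r for c, r in obstacles)

def segment_free(a, b, steps=20):
    # Straight-line local planner: densely check points along the segment.
    return all(not collides(a + s * (b - a)) for s in np.linspace(0.0, 1.0, steps))

# Uniform sampling strategy; Gaussian or bridge samplers would plug in here.
nodes = [np.array([0.05, 0.05]), np.array([0.95, 0.95])]   # start, goal
while len(nodes) < 150:
    p = rng.random(2)
    if not collides(p):
        nodes.append(p)

# Connect each node to its k nearest neighbours when the local planner succeeds.
k = 10
edges = {i: [] for i in range(len(nodes))}
for i, p in enumerate(nodes):
    d = [np.linalg.norm(p - q) for q in nodes]
    for j in np.argsort(d)[1:k + 1]:
        if segment_free(p, nodes[j]):
            edges[i].append((int(j), d[j]))
            edges[int(j)].append((i, d[j]))

# Dijkstra from start (node 0) to goal (node 1).
dist, pq = {0: 0.0}, [(0.0, 0)]
while pq:
    du, u = heapq.heappop(pq)
    if u == 1 or du > dist.get(u, np.inf):
        continue
    for v, w in edges[u]:
        if du + w < dist.get(v, np.inf):
            dist[v] = du + w
            heapq.heappush(pq, (du + w, v))
print("roadmap path length to goal:", dist.get(1, np.inf))
```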
Wong, Raymond Y. P. "Critical analysis of the existing food sampling programmes". Thesis, University of Strathclyde, 2000. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21140.
Amoako-Tuffour, Yaw. "Design of an automated ingestible gastrointestinal sampling device". Thesis, McGill University, 2014. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=123257.
An ingestible electromechanical capsule was designed to collect physical samples from the human gastrointestinal tract, with the aim of better localizing the source of gastrointestinal discomfort, exploring the microbiome and monitoring metabolic processes. A complete prototype was developed, including hardware, custom electronics, software and a novel sampling mechanism that takes advantage of the cylindrical shape of the device. Tests were carried out to assess the prototype's ability to collect samples and maintain their integrity, to withstand the environmental conditions and forces associated with normal clinical use, and to transit safely through the gastrointestinal tract. Cross-contamination was capped at 7.58% over a 12-hour period at 37 °C, and the device was able to collect heterogeneous samples. The device was shown to be an effective, non-invasive means of studying the physiology of the gastrointestinal tract and a platform for future development in personalized medicine, drug delivery and GI intervention.
Morin, Antoine. "Estimation and prediction of black fly abundance and productivity". Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=75447.
Payette, Francois. "Applications of a sampling strategy for the ERBE scanner data". Thesis, McGill University, 1988. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=61784.
Ertefaie, Ashkan. "Causal inference via propensity score regression and length-biased sampling". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=104784.
Adjusting for confounding factors is key to estimating treatment effects in observational studies. Two well-known causal adjustment techniques are the propensity score and inverse probability of treatment weighting. We compared the asymptotic properties of these two estimators and showed that the former is more efficient. Since ignoring important confounders biases the estimator, it may seem beneficial to adjust for all covariates; however, this can inflate the variance of the estimated parameters and also introduce bias. We therefore present a penalization technique, based jointly on the treatment probability and the response variables, to select the key covariates to be included in the treatment assignment model. Beyond the bias introduced by non-randomization, we discuss another source of bias introduced by a sample that is not representative of the target population. Specifically, we study the effect of length-biased sampling on the estimation of treatment outcomes. We introduce a weighted and doubly robust estimating equation to adjust for both biased sampling and non-randomization in the generalized accelerated failure time model setting, and the large-sample properties of the estimators are established. We conduct an extensive simulation study to examine the small-sample properties of the estimators. In each chapter, we apply our technique to real data sets and compare the results with those obtained by other methods.
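Below is a toy sketch of the two adjustment strategies the abstract compares, under an assumed data-generating model (one confounder, a logistic propensity model, a linear outcome with true effect 1.0). The "propensity-score regression" here is only a crude linear adjustment on the estimated score, standing in for, but not reproducing, the estimator studied in the thesis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.standard_normal(n)                        # confounder
a = rng.binomial(1, 1 / (1 + np.exp(-0.8 * x)))   # treatment, confounded by x
y = 1.0 * a + 1.5 * x + rng.standard_normal(n)    # outcome, true effect = 1.0

# Estimated propensity score from a logistic model.
ps = LogisticRegression(C=1e6).fit(x[:, None], a).predict_proba(x[:, None])[:, 1]

# Inverse-probability-of-treatment weighting (IPW) estimator of the effect.
ipw = np.mean(a * y / ps) - np.mean((1 - a) * y / (1 - ps))

# Crude propensity-score regression adjustment: linear outcome model in (a, ps).
X = np.column_stack([np.ones(n), a, ps])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"IPW estimate: {ipw:.3f}, PS-regression estimate: {beta[1]:.3f}")
```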
Sacher, William. "The effect of sampling noise in ensemble-based Kalman filters". Thesis, McGill University, 2009. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=66769.
The possibility for ensemble Kalman filters to be implemented at a non-prohibitive cost as a data assimilation tool in numerical weather prediction models, and hence in a highly nonlinear context, has attracted the attention of the scientific community in recent years. Many studies, however, have shown the practical limits of their implementation. As Monte Carlo methods, they rely on a limited sample of independent realizations of the process under study, and the inevitable sampling errors this introduces degrade the quality of the analysis. A theoretical expression for the accuracy and reliability of the analysis as a function of ensemble size is established in an idealized perfect-model context, restricted to the second-order moments of the error statistics. The general tendency of ensemble Kalman filters to underestimate, on average, the analysis variance, and hence their propensity to diverge, is proved theoretically here. The behaviour of alternative methods built to reduce or eliminate the effects of sampling error is also studied theoretically; the double ensemble Kalman filter and covariance inflation are examined. Among methods using perturbed observations, covariance inflation appears to be the simplest and cheapest to implement. The theoretical results agree with averages over a large number of realizations of experiments using a perfect, low-resolution, quasi-geostrophic barotropic model. These twin experiments were first carried out over a single analysis cycle, and then in a numerical forecast system…
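The following sketch illustrates, in a deliberately tiny scalar setting that is not the thesis's quasi-geostrophic experiment, one perturbed-observation EnKF analysis step with multiplicative covariance inflation. On average over many realizations, the un-inflated small ensemble underestimates the analysis variance relative to the exact Kalman value, which inflation partially compensates; a single run with one seed only hints at this.

```python
import numpy as np

rng = np.random.default_rng(3)

def enkf_analysis(ensemble, y_obs, obs_var, inflation=1.0):
    """One perturbed-observation EnKF analysis step for a scalar state (H = 1),
    with multiplicative covariance inflation applied to the prior ensemble."""
    mean = ensemble.mean()
    ens = mean + inflation * (ensemble - mean)        # inflate prior spread
    prior_var = ens.var(ddof=1)
    gain = prior_var / (prior_var + obs_var)          # Kalman gain
    perturbed_obs = y_obs + rng.normal(0.0, np.sqrt(obs_var), size=ens.shape)
    return ens + gain * (perturbed_obs - ens)

truth, obs_var, n_ens = 0.0, 1.0, 10                  # assumed toy setup
y = truth + rng.normal(0.0, np.sqrt(obs_var))
prior = rng.normal(0.0, 1.0, n_ens)                   # prior ensemble, variance ~ 1

for infl in (1.0, 1.1):
    post = enkf_analysis(prior, y, obs_var, inflation=infl)
    # Exact Kalman posterior variance for prior variance 1 and obs variance 1 is 0.5.
    print(f"inflation {infl}: analysis variance = {post.var(ddof=1):.3f} (exact: 0.500)")
```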
Foster, Kristina. "Using Distinct Sectors in Media Sampling and Full Media Analysis to Detect Presence of Documents from a Corpus". Thesis, Monterey, California. Naval Postgraduate School, 2012. http://hdl.handle.net/10945/17365.
Forensics examiners frequently search for known content by comparing each file from a target media to a known file hash database. We propose using sector hashing to rapidly identify content of interest. Using this method, we hash 512 B or 4 KiB disk sectors of the target media and compare those to a hash database of known file blocks, fixed-sized file fragments of the same size. Sector-level analysis is fast because it can be parallelized and we can sample a sufficient number of sectors to determine with high probability if a known file exists on the target. Sector hashing is also file system agnostic and allows us to identify evidence that a file once existed even if it is not fully recoverable. In this thesis we analyze the occurrence of distinct file blocks (blocks that only occur as a copy of the original file) in three multi-million file corpora and show that most files, including documents, legitimate and malicious software, consist of distinct blocks. We also determine the relative performance of several conventional SQL and NoSQL databases with a set of one billion file block hashes.
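A minimal sketch of the sector-hashing idea described above, assuming SHA-1 block hashes, a synthetic "known file" and synthetic media; the thesis's corpora, database back-ends and statistical sampling strategy are not reproduced here.

```python
import hashlib
import os

SECTOR = 512  # bytes; the thesis also considers 4 KiB sectors

def block_hashes(data: bytes, block_size: int = SECTOR):
    """Hash the fixed-size, non-overlapping blocks of a known file."""
    return {hashlib.sha1(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data) - block_size + 1, block_size)}

def sector_hashes(media: bytes, offsets, block_size: int = SECTOR):
    """Hash sampled sectors of the target media at the given byte offsets."""
    return [hashlib.sha1(media[o:o + block_size]).hexdigest() for o in offsets]

# Toy demonstration: a 16 KiB "known file" embedded, sector-aligned, in media.
known_file = os.urandom(16 * 1024)
media = b"\x00" * 4096 + known_file + b"\xff" * 4096

db = block_hashes(known_file)                      # known file-block hash database
sampled = sector_hashes(media, range(0, len(media), SECTOR))
hits = sum(h in db for h in sampled)
print(f"{hits} of {len(sampled)} sampled sectors match known file blocks")
```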
Turner, Barry John. "Spatial sampling and vertical variability effects on microwave radiometer rainfall estimates". Thesis, McGill University, 1991. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=59910.
The optimal conversion between microwave brightness temperature and rainfall rate was highly sensitive to the spatial resolution of observations. Retrievals were made from the simulated microwave measurements using rainfall retrieval functions optimized for each resolution and for each storm case.
There is potential for microwave radiometer measurements from the planned TRMM satellite to provide better 'snapshot' estimates than area-threshold VIS/IR methods. Variability of the vertical profile of precipitation did not seriously reduce accuracy. However, it is crucial that calibration of retrieval methods be done with ground truth of the same spatial resolution.
Makulsawatudom, Arun. "Construction productivity measurement and improvement in Thailand by improved work-sampling". Thesis, University of Strathclyde, 2003. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21527.
Castorena, Juan. "A Comparison of Compressive Sensing Approaches for LIDAR Return Pulse Capture, Transmission, and Storage". International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577483.
Massive amounts of data are typically acquired in third generation full-waveform (FW) LIDAR systems to generate image-like depthmaps of a scene of acceptable quality. The sampling systems acquiring this data, however, seldom take into account the low information rate generally present in the FW signals and, consequently, they sample very inefficiently. Our main goal here is to compare two efficient sampling models and processes for the individual time-resolved FW signals collected by a LIDAR system. Specifically, we compare two approaches of sub-Nyquist sampling of the continuous-time LIDAR FW return pulses: (i) modeling FW signals as short-duration pulses with multiple bandlimited echoes, and (ii) modeling them as signals with finite rates of innovation (FRI).
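A small worked example of the second model mentioned above, signals with a finite rate of innovation: assuming the 2K+1 lowest Fourier coefficients of a periodic stream of K Diracs are available, the classical annihilating-filter method recovers the echo locations from far fewer measurements than Nyquist-rate sampling would use. The period, K and amplitudes are arbitrary choices for the sketch, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)
tau, K = 1.0, 3                                    # period and number of echoes (assumed)
t_true = np.sort(rng.random(K)) * tau              # true echo locations
a_true = rng.uniform(0.5, 1.5, K)                  # true echo amplitudes

# The 2K+1 lowest Fourier coefficients of the Dirac stream: the minimal FRI
# measurement set (innovation rate 2K per period).
m = np.arange(-K, K + 1)
X = (a_true[None, :] * np.exp(-2j * np.pi * np.outer(m, t_true) / tau)).sum(axis=1)

# Annihilating filter h (K+1 taps): sum_i h[i] * X[n - i] = 0 for all valid n.
A = np.array([[X[n - i + K] for i in range(K + 1)] for n in range(K + 1)])
h = np.linalg.svd(A)[2][-1].conj()                 # null-space vector of A

# Echo locations from the roots of the annihilating filter.
u = np.roots(h)
t_est = np.sort((-np.angle(u) * tau / (2 * np.pi)) % tau)
print("true:     ", np.round(t_true, 4))
print("estimated:", np.round(t_est, 4))
```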
Kaharabata, Samuel K. "Moisture transfer behind windbreaks : laboratory simulations and conditional sampling in the field". Thesis, McGill University, 1991. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=60535.
Conditional sampling of the fluctuations w' and q' of wind and moisture, respectively, with a sonic anemometer and a fast-response krypton hygrometer behind solid and porous windbreaks in the field revealed the frequency of occurrence, duration and intensity of the turbulent structures primarily responsible for moisture transfer.
Bergeron, Pierre-Jérôme. "Covariates and length-biased sampling : is there more than meets the eye ?" Thesis, McGill University, 2006. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102958.
Texto completoMorrone, Dario. "Induced bias on measuring influence by length-biased sampling of failure times". Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=21931.
Influence measures have already been developed for various statistical models; however, an influence measure that handles right-censored data arising from prevalent cohorts has not yet been addressed. We present an influence measure that accounts for the length bias and right censoring present in data from a prevalent cohort. This measure uses a likelihood properly adjusted for length bias as well as the information potentially contained in the marginal distribution of the covariates. An approximation of this measure is developed. We illustrate the importance of properly incorporating length bias and covariate information by analyzing the differences in influence when the nature of prevalent cohort data is acknowledged and when it is ignored. The results are illustrated using data on survival with dementia among elderly Canadians provided by the Canadian Study of Health and Aging.
Matta, Marwan. "Spatiotemporal interpolation for sampling grid conversion with application to scalable video coding". Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=22663.
McCray, Robert B. "UTILIZATION OF A SMALL UNMANNED AIRCRAFT SYSTEM FOR DIRECT SAMPLING OF NITROGEN OXIDES PRODUCED BY FULL-SCALE SURFACE MINE BLASTING". UKnowledge, 2016. http://uknowledge.uky.edu/mng_etds/31.
Rabbath, Camille Alain. "Sensitivity of the discrete- to continuous-time pole transformation at fast sampling rates". Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=23267.
Azerêdo, Daniel Mendes. "Pesquisas sob amostragem informativa utilizando o FBST". Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/45/45132/tde-19082013-161233/.
Pfeffermann, Krieger and Rinott (1998) introduced a framework for modeling sampling processes that can be used to assess if a sampling process is informative. In this setting, sample selection probabilities are approximated by a polynomial function depending on outcome and auxiliary variables. Within this framework, our main purpose is to investigate the application of the Full Bayesian Significance Test (FBST), introduced by Pereira and Stern (1999), as a tool for testing sampling ignorability, that is, to detect a significant relation between the sample selection probabilities and the outcome variable. The performance of this statistical modelling framework is tested with some simulation experiments.
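A minimal numerical sketch of the FBST e-value computation for an ignorability hypothesis H0: gamma = 0, assuming the posterior for a selection-model coefficient gamma has already been summarized (here a normal approximation stands in for MCMC output). The selection model, the coefficient name and the numbers are invented for illustration and are not taken from the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Assumed toy setup: selection probabilities modelled as proportional to
# exp(gamma * y); under H0: gamma = 0 the sampling design is ignorable.
# Suppose the posterior for gamma is approximately N(post_mean, post_sd^2).
post_mean, post_sd = 0.35, 0.15
gamma_draws = rng.normal(post_mean, post_sd, 100_000)   # stand-in for MCMC draws

# FBST evidence value in favour of H0: posterior mass of the set where the
# posterior density does NOT exceed the density at the null point gamma = 0
# (the complement of the "tangential set").
dens = stats.norm.pdf(gamma_draws, post_mean, post_sd)
dens_at_null = stats.norm.pdf(0.0, post_mean, post_sd)
ev_H0 = np.mean(dens <= dens_at_null)
print(f"FBST evidence in favour of ignorability: {ev_H0:.3f}")
```

A small e-value (as produced by these assumed numbers) would point towards an informative, non-ignorable sampling process.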
Vafaee, Manouchehr S. "Evaluation and implementation of an automated blood sampling system for positron emission tomographic studies". Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=57009.
We have evaluated and implemented the "Scanditronix" automated blood sampling system and measured its external delay and dispersion. PET studies of cerebral blood flow and oxygen metabolism using simultaneous manual and automated blood sampling were analyzed and compared. We show that the results obtained with automated blood sampling are more reliable than those based on manual sampling. We also present suggestions to further improve the reliability of quantitative PET studies based on automated blood sampling.
Shu, Weihuan. "Optimal sampling rate assignment with dynamic route selection for real-time wireless sensor networks". Thesis, McGill University, 2009. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=32351.
Allocating computation and communication resources in a way that optimizes overall system performance is a crucial aspect of system management. Wireless sensor networks pose new challenges because of resource scarcity and real-time requirements. Existing work has addressed the real-time sampling-rate assignment problem in the single-processor case and in the network case with static routing. For wireless sensor networks, in order to achieve better overall network performance, routing should be considered together with the rate assignment of the individual flows. In this work we address the problem of optimal sampling rate assignment with dynamic route selection for wireless sensor networks. We model the problem as an optimization problem and solve it within the network utility maximization framework. Based on the primal-dual method and the dual decomposition technique, we design a distributed algorithm that achieves the best overall network utility with respect to dynamic route decisions and rate assignment. Simulations were carried out to demonstrate the efficiency and effectiveness of our proposed solutions.
Gibson, Keith W. "Time-concentrated sampling : a simple strategy for information gain at a novel, depleted patch". Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=78368.
Simon, Philippe 1964. "Long-term integrated sampling to characterize airborne volatile organic compounds in indoor and outdoor environments". Thesis, McGill University, 1997. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=34455.
A mathematical model was derived by modifications to the Hagen-Poiseuille and ideal gas laws. This model defines the relationship between container volume and capillary geometry (length/internal diameter) required to provide selected sampling times. Based on theoretical considerations, simulations were performed to study the effects of dimensional parameters. From these results, capillaries having 0.05 and 0.10 mm internal diameters were selected according to their ability to reduce sampling flow rates and to increase sampling times. Different capillary lengths were tested on various sampler prototypes. It was found that a constant sampling flow rate was delivered when a maximum discharge rate was established under the influence of a pressure gradient between a vacuum and ambient pressure. Experimental flow rates from 0.018 to 2.6 ml/min were obtained and compared with model predictions. From this comparison, empirical relationships between capillary geometry and maximum discharge rate given by the pressure gradient were defined. Essentially, based on these empirical relationships, capillary sampling flow controller specifications can be calculated to offer extended integrated sampling periods. On this basis, sampler prototypes were configured for stationary sampling and personal sampling.
Studies, based on theory, have indicated that factors such as temperature, humidity and longitudinal molecular diffusion are not likely to influence the passive sampling process. Subsequent experiments confirmed that temperature changes should not significantly affect flow rates delivered by controllers, and that molecular diffusion does not have any impact on the representativeness of long-term samples. Recovery tests provided acceptable results demonstrating that selected capillaries do not contribute to adsorption that could seriously affect the validity of this sampling approach.
Field demonstration studies were performed with both stationary and personal sampler prototypes in the indoor and outdoor environments. The performance of the sampler compared favorably, and in some instances, exceeded that of accepted methodology. These novel samplers were more reliable, had greater versatility and principally, allowed sampling periods extending from hours to a month. These inherent qualities will assist industrial hygienists and environmentalists in the study of emission sources, pollutant concentrations, dispersion, migration and control measures. This novel sampler is presently the only device available for the effective study of episodic events of VOC emission.
Selected capillary geometries acting as a restriction to the entry of ambient air into an evacuated sample container can provide a simple, versatile and reliable alternative for the collection of VOCs. This approach can contribute to a better understanding of VOC effects on human health and the environment.
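As a back-of-the-envelope sketch of the flow/volume/time relationship discussed in the first paragraph of this abstract, the snippet below uses the plain (incompressible) Hagen-Poiseuille law rather than the thesis's modified gas-flow model; the capillary geometry, air viscosity and canister volume are assumed values chosen only to land in the same order of magnitude as the reported flow rates.

```python
import math

def capillary_flow_mL_per_min(d_inner_m, length_m, dP_Pa=101_325.0, mu=1.8e-5):
    """Plain Hagen-Poiseuille volumetric flow (incompressible approximation).
    The thesis modifies this with the ideal gas law for flow into an evacuated
    container; this form is only an order-of-magnitude sketch."""
    r = d_inner_m / 2.0
    q_m3_per_s = math.pi * r**4 * dP_Pa / (8.0 * mu * length_m)
    return q_m3_per_s * 1e6 * 60.0            # m^3/s -> mL/min

# Assumed example geometry: 0.05 mm ID, 10 cm long capillary, air at ~20 C.
q = capillary_flow_mL_per_min(0.05e-3, 0.10)
canister_mL = 300.0                            # assumed container volume
print(f"flow ~ {q:.3f} mL/min -> ~{canister_mL / q / 60:.1f} h to fill the container")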
Rowan, David J. "The distribution, texture and trace element concentrations of lake sediments /". Thesis, McGill University, 1992. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=70361.
Beedell, David C. (David Charles). "The effect of sampling error on the interpretation of a least squares regression relating phosphorus and chlorophyll". Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=22720.
Texto completoCerigo, Helen. "HPV knowledge and self-sampling for the detection of HPV-DNA among Inuit women in Nunavik, Quebec". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96825.
The prevalence of human papillomavirus (HPV) is high in the Inuit population of Quebec. We therefore aimed to (1) document the level of knowledge about HPV and its link with cervical cancer, (2) evaluate the performance of self-sampling for HPV compared with sampling by a health care provider, and (3) determine which of the two methods Inuit women of Nunavik prefer. A questionnaire was used to assess knowledge and preference between the sampling methods. Agreement between the sampling methods was assessed using the results of a PCR test detecting 36 different HPV types. More than 31% of Inuit women had heard of HPV. General knowledge about HPV was low but similar to that reported for non-Indigenous populations. Agreement in HPV detection between the two methods was high. Self-sampling is potentially a collection method that could increase cervical cancer screening rates.
Rossner, Alan. "The development and evaluation of a novel personal air sampling canister for the collection of gases and vapors /". Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=84428.
A series of flow rate experiments were done to test the capillary flow capabilities with a 300 mL canister for sampling times ranging from a few minutes to over 40 hours. Flow rates ranging from 0.05 to 1.0 mL/min were experimentally tested and empirical formulae were developed to predict flow rates for given capillary geometries. The low flow rates allow for the collection of a long term air sample in a small personal canister.
Studies to examine the collection of air contaminants were conducted in laboratory and in field tests. Air samples for six volatile organic compounds were collected from a small exposure chamber using the capillary-canisters, charcoal tubes and diffusive badges at varied concentrations. The results from the three sampling devices were compared to each other and to concentration values obtained by on-line gas chromatography. The results indicate that the capillary-canister compares quite favorably to the sorbent methods and to the on-line GC values for the six compounds evaluated.
Personal air monitoring was conducted in a large exposure chamber to assess the effectiveness of the capillary-canister method to evaluate breathing zone samples. In addition, field testing was performed at a manufacturing facility to assess the long term monitoring capabilities of the capillary-canister. Precision and accuracy were found to parallel that of sorbent sampling methods.
The capillary-canister device displayed many positive attributes for occupational and community air sampling. Extended sampling times, greater capabilities to sample a broad range of chemicals simultaneously, ease of use, ease of analysis and the low relative cost of the flow controller should allow for improvements in exposure assessment.
De, la Chenelière Véronik. "The risks and benefits of an invasive technique, biopsy sampling, for an endangered population, the St. Lawrence beluga (Delphinapterus leucas) /". Thesis, McGill University, 1998. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=21536.
Grimm, Carsten. "Well-Being in its Natural Habitat: Orientations to Happiness and the Experience of Everyday Activities". Thesis, University of Canterbury. Psychology, 2013. http://hdl.handle.net/10092/8040.
Leong, Aaron. "Population prevalence of diabetes: validation of a case definition from health administrative data using a population-based survey and home blood glucose sampling". Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=116891.
Background: Studies validating diabetes cases identified from administrative data are limited and do not systematically assess undiagnosed diabetes. We conducted a systematic review to determine the validity of an algorithm used to identify diabetes cases (two physician billings or one hospitalization for diabetes within a two-year period; Canadian National Diabetes Surveillance System, NDSS). We also validated this algorithm in the Quebec databases against data obtained from a population-based survey. In addition, we estimated the prevalence of diagnosed diabetes from survey data and the prevalence of undiagnosed diabetes from blood glucose measured on samples sent by mail.
Methods: For the systematic review, we searched Medline (from 1950) and Embase (from 1980) for validation studies published up to August 2012 (keywords: "diabetes mellitus", "administrative databases" and "validation studies"). A bivariate random-effects regression model was used to pool estimates of sensitivity and specificity. For the validation of the algorithm in the Quebec databases, we obtained administrative data for a stratified random sample of 6,247 Quebec residents (2009). These individuals were also interviewed by telephone and invited to send fasting blood samples to a central laboratory. NDSS diabetes cases were compared with self-reported diabetes and with self-report combined with elevated fasting glucose (≥ 7 mmol/L), respectively. Sensitivity, specificity, positive and negative predictive values, and agreement statistics were computed. The population prevalence of diabetes was estimated using the case definition adjusted for the sensitivity and specificity estimates.
Results: In the systematic review, the search strategy identified 1,423 abstracts, from which 11 validation studies were selected for review; six studies were pooled in a meta-analysis. Compared with survey data and/or medical records, sensitivity was 82.3% (95% CI 75.8, 87.4) and specificity was 97.9% (95% CI 96.5, 98.8). For the validation of the definition in the Quebec databases, comparison with the survey data gave a sensitivity of 84.3% (95% CI 79.3, 88.5) and a specificity of 97.9% (95% CI 97.4, 98.4). Comparison with the survey data combined with glucose levels gave a much lower sensitivity of 58.2% (95% CI 52.2, 64.6) and a specificity of 98.7% (95% CI 98.0, 99.3). After adjustment for sampling weights, the prevalence of diagnosed diabetes was 7.2% (95% CI 6.3, 8.0) and the prevalence of diagnosed and undiagnosed diabetes was 13.4% (95% CI 11.7, 15.0).
Conclusion: Including the Quebec study in an updated meta-analysis of the seven studies, the sensitivity of the case definition for diagnosed diabetes was 82.6% (95% CI 77.1, 87.0) and the specificity was 97.9% (95% CI 96.8, 98.6). The definition appears to be accurate enough for public health surveillance, particularly for trend analyses. Undiagnosed individuals are likely to experience delayed treatment and to be at higher risk of diabetes-related complications. Diabetes prevalence estimated from administrative databases should be corrected for the sensitivity and specificity of the definition in order to better quantify year-to-year changes in prevalence and to account for undiagnosed cases.
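A small sketch of the kind of calculation the abstract describes: validation metrics from a 2x2 table and a sensitivity/specificity-corrected prevalence, using the standard Rogan-Gladen adjustment (which may differ from the exact correction used in the thesis). The cell counts below are invented only to roughly reproduce the reported 84.3% sensitivity and 97.9% specificity.

```python
def validation_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

def corrected_prevalence(apparent_prev, sens, spec):
    # Rogan-Gladen estimator: adjusts an administrative-data prevalence for the
    # imperfect sensitivity/specificity of the case definition.
    return (apparent_prev + spec - 1.0) / (sens + spec - 1.0)

# Illustrative cell counts only, not the study's data.
sens, spec, ppv, npv = validation_metrics(tp=422, fp=120, fn=79, tn=5626)
print(f"sens={sens:.1%} spec={spec:.1%} ppv={ppv:.1%} npv={npv:.1%}")
print(f"corrected prevalence: {corrected_prevalence(0.09, sens, spec):.1%}")
```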
Qin, Yulin. "Non-Parametric and Parametric Estimators of the Survival Function under Dependent Censorship". University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1368086293.
Di Salvo, Andrea. "CMOS distributed signal processing systems for radiation sensors". Doctoral thesis, Politecnico di Torino, 2022. http://hdl.handle.net/11583/2957742.
Texto completoBenamara, Tariq. "Full-field multi-fidelity surrogate models for optimal design of turbomachines". Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2368.
Optimizing turbomachinery components stands as a real challenge despite recent advances in theoretical, experimental and High-Performance Computing (HPC) domains. This thesis introduces and validates optimization techniques assisted by full-field Multi-Fidelity Surrogate Models (MFSMs) based on Proper Orthogonal Decomposition (POD). The combination of POD and Multi-Fidelity Modeling (MFM) techniques makes it possible to capture the evolution of dominant flow features with geometry modifications. Two POD-based multi-fidelity optimization methods are proposed. The first one consists in an enrichment strategy dedicated to Gappy-POD (GPOD) models. It is more suitable for instantaneous low-fidelity computations, which makes it hardly tractable for aerodynamic design of turbomachines. This method is demonstrated on the flight-domain study of a 2D airfoil from the literature. The second methodology is based on a multi-fidelity extension to Non-Intrusive POD (NIPOD) models. This extension starts with a re-interpretation of the Constrained POD (CPOD) concept and allows the reduced space definition to be enriched with abundant, albeit inaccurate, low-fidelity information. In the second part of the thesis, a benchmark test case is introduced to test full-field multi-fidelity optimization methodologies on an example presenting features representative of turbomachinery problems. The predictability of the proposed Multi-Fidelity NIPOD (MFNIPOD) surrogate models is compared to classical surrogates from the literature on both analytical and industrial-scale applications. Finally, we apply the proposed tool to the shape optimization of a 1.5-stage booster and compare the obtained results with standard state-of-the-art approaches.
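A generic sketch of the POD building block underlying the surrogate models described above (not the thesis's MFNIPOD implementation): snapshots are stacked column-wise, POD modes are obtained from an SVD of the centered snapshot matrix, and a truncation rank is chosen from the retained energy. The snapshot data here are synthetic stand-ins for CFD fields.

```python
import numpy as np

rng = np.random.default_rng(6)

# Snapshot matrix: each column is a full-field solution (e.g. a flow field)
# for one design; random smooth fields stand in for CFD results.
n_dof, n_snap = 2000, 30
modes_true = np.linalg.qr(rng.standard_normal((n_dof, 5)))[0]
S = modes_true @ rng.standard_normal((5, n_snap))
S += 0.001 * rng.standard_normal((n_dof, n_snap))          # small "noise" content

mean = S.mean(axis=1, keepdims=True)
U, sv, _ = np.linalg.svd(S - mean, full_matrices=False)    # POD modes = left singular vectors
energy = np.cumsum(sv**2) / np.sum(sv**2)
r = int(np.searchsorted(energy, 0.999) + 1)                # modes kept for 99.9% energy
print(f"{r} POD modes retain {energy[r - 1]:.4f} of the snapshot energy")

# A non-intrusive surrogate then maps design parameters to the r POD coefficients
# (e.g. with Gaussian-process regression); new fields are rebuilt as
# mean + U[:, :r] @ predicted_coefficients.
```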
Warren, Georgina. "Developing land management units using Geospatial technologies: An agricultural application". Curtin University of Technology, Department of Spatial Sciences, 2007. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=17509.
Vallelonga, Paul Travis. "Measurement of Lead Isotopes in Snow and Ice from Law Dome and other sites in Antarctica to characterize the Lead and seek evidence of its origin". Curtin University of Technology, School of Applied Science, 2002. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=14018.
Texto completoParticular attention was given to the quantity of Pb added to the samples during the decontamination and sample storage stages of the sample preparation process. These stages, including the use of a stainless steel chisel for the decontamination, contributed ~5.2 pg to the total sample analysed, amounting to a concentration increase of ~13 fg g-1. In comparison, the mass spectrometer ion source contributed typically 89 +/- 19 fg to the blank, however its influence depended upon the amount of Pb available for analysis and so had the greatest impact when small volumes of samples with a very low concentration were analysed. As a consequence of these careful investigations of the Pb blank contributions to the samples, the corrections made to the Pb isotopic ratios and concentrations measured are smaller than previously reported evaluations of Pb in Antarctica by thermal ionisation mass spectrometry. The data indicate that East Antarctica was relatively pristine until -1884 AD, after which the first influence of anthropogenic Pb in Law Dome is observed. "Natural", pre-industrial, background concentrations of Pb and Ba were - 0.4 pg/g and - 1.3 pg/g, respectively, with Pb isotopic compositions within the range 206Pb/207Pb = 1.20 - 1.25 and 208Pb/207Pb = 2.46 - 2.50 and an average rock and soil dust Pb contribution of 8-12%. A major pollution event was observed at Law Dome between 1884 and 1908 AD, elevating the Pb concentration fourfold and changing 206Pb/207Pb ratios in the ice to ~1.12. Based on Pb isotopic systematics and Pb emissions statistics, this was attributed to Pb mined at Broken Hill and smelted at Broken Hill and Port Pirie, Australia.
Anthropogenic Pb inputs to Law Dome were most significant from ~1900 to 1910 and from ~1960 to 1980. During the 20th century, Ba concentrations were consistently higher than "natural" levels. This was attributed to increased dust production, suggesting the influence of climate change and/or changes in land coverage with vegetation. Law Dome ice dated from 1814 AD to 1819 AD was analysed for Pb isotopes and Pb, Ba and Bismuth (Bi) concentrations to investigate the influence of the 1815 AD volcanic eruption of Tambora, Indonesia. The presence of volcanic debris in the core samples was observed from late-1816 AD to 1818 AD as an increase in sulphate concentrations and electrical conductivity of the ice. Barium concentrations were approximately three times higher than background levels from mid-1816 to mid-1818, consistent with increased atmospheric loading of rock and soil dust, while enhanced Pb/Ba and Bi/Ba ratios, associated with deposition of volcanic debris, were observed at mid-1814 and from early-1817 to mid-1818. From the results, it appeared likely that Pb emitted from Tambora was removed from the atmosphere within the 1.6 year period required to transport aerosols to Antarctica. Increased Pb and Bi concentrations observed in Law Dome ice ~1818 AD were attributed to either increased heavy metal emissions from Mount Erebus, or increased fluxes of heavy metals to the Antarctic ice sheet resulting from climate and meteorological modifications following the Tambora eruption.
A non-continuous series of Law Dome snow core samples dating from 1980 to 1985 AD were analysed to investigate seasonal variations in the deposition of Pb and Ba. It was found that Pb and Ba at Law Dome do exhibit seasonal variations in deposition, with higher concentrations of Pb and Ba usually observed during summer and lower concentrations usually observed during the autumn and spring seasons. At Law Dome, broad patterns of seasonal Pb and Ba deposition are evident; however, these appear to be punctuated by short-term deposition events or may even be composed of a continuum of short-term deposition events. This variability suggests that complex meteorological systems are responsible for the transport of Pb and Ba to Law Dome, and probably Antarctica in general.
Gioia, Dario <1988>. "Fully Flexible Binding of Taxane-Site Ligands to Tubulin via Enhanced Sampling MD Simulations". Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amsdottorato.unibo.it/8624/1/Gioia_Dario_Tesi.pdf.
Frühwirth-Schnatter, Sylvia. "Fully Bayesian Analysis of Switching Gaussian State Space Models". Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 2000. http://epub.wu.ac.at/812/1/document.pdf.
Series: Forschungsberichte / Institut für Statistik
Tongroon, Manida. "Combustion characteristics and in-cylinder process of CAI combustion with alcohol fuels". Thesis, Brunel University, 2010. http://bura.brunel.ac.uk/handle/2438/4501.
Chang, Meng-I. "A Comparison of Two MCMC Algorithms for Estimating the 2PL IRT Models". OpenSIUC, 2017. https://opensiuc.lib.siu.edu/dissertations/1446.
Wu, Xinying. "Reliability Assessment of a Continuous-state Fuel Cell Stack System with Multiple Degrading Components". Ohio University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1556794664723115.
Afrifa-Yamoah, Ebenezer. "Imputation, modelling and optimal sampling design for digital camera data in recreational fisheries monitoring". Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2021. https://ro.ecu.edu.au/theses/2387.
Zuazo, Pablo [Verfasser], Klaus [Akademischer Betreuer] Butterbach-Bahl and Heinz [Akademischer Betreuer] Rennenberg. "Development of a fully automated soil incubation and gas sampling system for quantifying trace gas emission pulses from soils at high temporal resolution". Freiburg : Universität, 2016. http://d-nb.info/1129080730/34.
Andersson, Oskar. "Avskiljning av inert material från avfallsbränsle : En fältstudie av förbättrad RDF-produktion på bränsleberedningen i Västerås". Thesis, Mälardalens högskola, Akademin för ekonomi, samhälle och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35780.
Energy recovery from waste has great potential to reduce greenhouse gas emissions worldwide. Combustion in fluidized bed boilers gives high resource efficiency but demands a comminuted fuel with a low content of inert (non-combustible) material, a so-called refuse derived fuel (RDF). A well-functioning separation process as part of the RDF production allows efficient combustion as well as efficient treatment of the separated materials. The purpose of this degree project is to investigate which factors influence the separation of inert material from waste for combustion in a fluidized bed boiler and how the separation can be improved. This is investigated through a field study of a fuel-preparation plant in Sweden. The separation process was examined visually and through experiments based on sampling and manual sorting of waste fractions. The results show five factors that are assumed to influence the sorting. Three of them could be addressed by simple constructions. One factor with a great impact is the input waste to the process, which varies to a large extent. One suggested measure for improved separation is a recurrent check of the RDF quality and the reject quality; combined with information about the input waste, this should form the basis for recurrent adjustments of the plant to achieve a more stable separation quality. Another suggested measure is to decrease the size of the material flow through the production line, since the size of the flow is assumed to have an important impact on the separation. The decrease can be achieved by distributing production more evenly over time and across the production lines, although this requires more active production planning and minimization of production stops. As part of the work a new wind sifter was also tested. The wind sifter shows good potential to improve the separation if installed to create a two-step wind sifting. However, since a new wind sifter implies a high investment, a study of the costs and saving potential is required before the investment can be recommended.
BARUA, SUKHENDU LAL. "APPLICATION OF CONDITIONAL SIMULATION MODEL TO RUN-OF-MINE COAL SAMPLING FREQUENCY DETERMINATION AND COAL QUALITY CONTROL AT THE POWER PLANT (BLENDING, GOAL PROGRAMMING, MICROCOMPUTER)". Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/187940.
Ricosset, Thomas. "Signature électronique basée sur les réseaux euclidiens et échantillonnage selon une loi normale discrète". Thesis, Toulouse, INPT, 2018. http://www.theses.fr/2018INPT0106/document.
Lattice-based cryptography has generated considerable interest in the last two decades due to attractive features, including conjectured security against quantum attacks, strong security guarantees from worst-case hardness assumptions and constructions of fully homomorphic encryption schemes. On the other hand, even though it is a crucial part of many lattice-based schemes, Gaussian sampling is still lagging and continues to limit the effectiveness of this new cryptography. The first goal of this thesis is to improve the efficiency of Gaussian sampling for lattice-based hash-and-sign signature schemes. We propose a non-centered algorithm, with a flexible time-memory tradeoff, as fast as its centered variant for practicable sizes of precomputed tables. We also use the Rényi divergence to bound the precision requirement to the standard double precision. Our second objective is to construct Falcon, a new hash-and-sign signature scheme, based on the theoretical framework of Gentry, Peikert and Vaikuntanathan for lattice-based signatures. We instantiate that framework over NTRU lattices with a new trapdoor sampler.
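For illustration only, here is the textbook rejection sampler for a (possibly non-centered) discrete Gaussian over the integers, the kind of distribution the abstract refers to. Falcon's actual sampler is far more optimized and designed with constant-time behaviour in mind, so this sketch only conveys the object being sampled, not the thesis's algorithm.

```python
import math
import random

def sample_discrete_gaussian(center: float, sigma: float, tail: float = 10.0) -> int:
    """Slow textbook rejection sampler for the discrete Gaussian D_{Z, sigma, center}:
    propose an integer uniformly in a wide window and accept with probability
    proportional to the Gaussian weight."""
    lo = math.floor(center - tail * sigma)
    hi = math.ceil(center + tail * sigma)
    while True:
        z = random.randint(lo, hi)
        rho = math.exp(-((z - center) ** 2) / (2.0 * sigma * sigma))
        if random.random() < rho:
            return z

random.seed(0)
samples = [sample_discrete_gaussian(center=0.3, sigma=1.5) for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(f"empirical mean {mean:.3f} (center was 0.3)")
```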
Křížek, Miroslav. "Měřicí modul s A/D převodníkem se současným vzorkováním". Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-217840.
Säll, Erik. "Design of a Low Power, High Performance Track-and-Hold Circuit in a 0.18µm CMOS Technology". Thesis, Linköping University, Department of Electrical Engineering, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1353.
Texto completoThis master thesis describes the design of a track-and-hold (T&H) circuit with 10bit resolution, 80MS/s and 30MHz bandwidth. It is designed in a 0.18µm CMOS process with a supply voltage of 1.8 Volt. The circuit is supposed to work together with a 10bit pipelined analog to digital converter.
A switched capacitor topology is used for the T&H circuit and the amplifier is a folded cascode OTA with regulated cascode. The switches used are of transmission gate type.
The thesis presents the design decisions, design phase and the theory needed to understand the design decisions and the considerations in the design phase.
The results are based on circuit level SPICE simulations in Cadence with foundry provided BSIM3 transistor models. They show that the circuit has 10-bit resolution and 7.6 mW power consumption for the worst-case frequency of 30 MHz. The requirements on the dynamic performance are all fulfilled, most of them with large margins.
Pardon, Gaspard. "From Macro to Nano : Electrokinetic Transport and Surface Control". Doctoral thesis, KTH, Mikro- och nanosystemteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-144994.
Hsieh, Hung-Chih and 謝鴻志. "The full-field heterodyne interferometry with novel sampling scheme". Thesis, 2011. http://ndltd.ncl.edu.tw/handle/00396438979331007005.
National Chiao Tung University
Department of Photonics Engineering
Academic year 99 (ROC calendar)
In order to apply heterodyne interferometry to full-field measurement with a digital camera, the relations between the camera sampling frequency and the heterodyne frequency are investigated based on the procedures used to derive the associated phases. We find that full-field heterodyne interferometry can be operated whether or not the sampling conditions meet the Nyquist sampling theorem. Measurements of a large step height and of a refractive index distribution are first performed under the conventional Nyquist sampling condition. Optimal conditions for a commonly used CCD camera are then proposed to reduce cost, and the full-field phase retardation distribution of a wave plate is measured to demonstrate their validity. To measure the height distribution, an alternative full-field interferometric profilometry is proposed by combining two-wavelength interferometry and heterodyne interferometry: collimated heterodyne light is introduced into a modified Twyman-Green interferometer, and its phase and profile can be obtained. To measure the full-field refractive index distribution, circularly polarized heterodyne light is incident on the sample obliquely; the reflected light passes through an analyzer, and the associated phases are derived from the interference signals. The estimated data are substituted into equations derived from the Fresnel equations, and the full-field refractive index distribution of the sample is obtained. The procedures for deriving the associated phases from a series of recorded frames are described, and two optimal sampling conditions for a commonly used CCD camera are proposed. The full-field phase retardation of a wave plate is measured using common-path heterodyne interferometry to demonstrate the validity of the method. These methods offer several merits, including easy operation, high resolution and rapid measurement.
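A small numerical sketch of per-pixel phase retrieval from a stack of camera frames, in the spirit of the abstract: each pixel sees a beat signal A + B*cos(2*pi*f_beat*t + phi), and synchronous detection at the known beat frequency recovers phi even when the frame rate is below the Nyquist rate for that beat. The frame rate, beat frequency and image size are assumptions for the example, not the thesis's operating conditions.

```python
import numpy as np

rng = np.random.default_rng(7)

f_beat, f_cam, n_frames = 60.0, 97.0, 97      # beat above f_cam/2: deliberately sub-Nyquist
t = np.arange(n_frames) / f_cam

# Synthetic stack of frames: every pixel sees A + B*cos(2*pi*f_beat*t + phi(x, y)).
h, w = 4, 4
phi_true = rng.uniform(-np.pi, np.pi, (h, w))
frames = 1.0 + 0.5 * np.cos(2 * np.pi * f_beat * t[:, None, None] + phi_true[None, :, :])
frames += 0.01 * rng.standard_normal(frames.shape)

# Synchronous (lock-in style) detection at the known beat frequency, per pixel.
c = np.cos(2 * np.pi * f_beat * t)[:, None, None]
s = np.sin(2 * np.pi * f_beat * t)[:, None, None]
phi_est = np.arctan2(-(frames * s).sum(axis=0), (frames * c).sum(axis=0))

err = np.angle(np.exp(1j * (phi_est - phi_true)))
print("max phase error (rad):", np.abs(err).max())
```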
"Glauber Dynamics for Sampling an Edge Colouring of Full Homogeneous Trees". 2016. http://repository.lib.cuhk.edu.hk/en/item/cuhk-1292280.
We study the problem of sampling a graph colouring using Glauber Dynamics. This is an interesting problem, as Jerrum showed that we can approximate the number of proper colourings if we can sample a colouring nearly uniformly. Therefore we want a sampler with polynomial running time. Glauber Dynamics is one natural Markov Chain for sampling a graph colouring and it has been studied extensively. Its state space is the set of proper colourings. In each step, we sample a colour c and a node u randomly. Then we update u to colour c if the new colouring is still proper; otherwise we stay at the current colouring. For a graph with maximum degree d, let q be the number of colours. One important goal on this problem is proving polynomial mixing time if q ≥ d + 2. For general graphs, the best result is polynomial mixing time if q ≥ 11d/6, by Vigoda. For some classes of graphs, we can sample a colouring with fewer colours. One example is graphs with large girth and large maximum degree, and there are many results for this kind of graph.
In the thesis, we focus on the mixing time of Glauber Dynamics for sampling an edge colouring of a full d-homogeneous tree. This is equivalent to sampling a proper vertex colouring of the line graph of the tree. We consider this special case as the line graph has small girth, so previous results and techniques do not apply directly. The best previous result is polynomial mixing time if q ≥ 11d/3, by Vigoda. Our main result is polynomial mixing time if q ≥ 2d. Our proof is based on the multicommodity flow argument by Sinclair.
Poon, Chun Yeung.
Thesis M.Phil. Chinese University of Hong Kong 2016.
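A minimal sketch of the Glauber dynamics described in the abstract above (pick a random vertex and colour, recolour only if the colouring stays proper), run here on a small cycle rather than the line graph of a full homogeneous tree; the graph, colour count and step count are assumptions for the demo, and mixing-time guarantees are of course not demonstrated by such a toy run.

```python
import random

def glauber_step(colouring, adj, q):
    """One step of Glauber dynamics for proper q-colourings: pick a random
    vertex and a random colour; recolour only if the result stays proper."""
    v = random.randrange(len(adj))
    c = random.randrange(q)
    if all(colouring[u] != c for u in adj[v]):
        colouring[v] = c

# Toy instance: the cycle C_8 (maximum degree d = 2), with q >= d + 2 colours,
# which is the standard sufficient condition for the chain to be irreducible.
n, q = 8, 4
adj = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
colouring = [(i % 2) * 2 for i in range(n)]      # a proper starting colouring
random.seed(1)
for _ in range(10_000):
    glauber_step(colouring, adj, q)
print("sampled proper colouring:", colouring)
```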