Follow this link to see other types of publications on the topic: Environmental health Data processing.

Theses / dissertations on the topic "Environmental health Data processing"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

See the top 50 works (theses / dissertations) for research on the topic "Environmental health Data processing".

Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online if it is available in the metadata.

Browse theses / dissertations from a wide range of scientific fields and compile an accurate bibliography.

1

Wilmot, Peter Nicholas. "Modelling cooling tower risk for Legionnaires' Disease using Bayesian Networks and Geographic Information Systems". Title page, contents and conclusion only, 1999. http://web4.library.adelaide.edu.au/theses/09SIS.M/09sismw744.pdf.

Abstract:
Includes bibliographical references (leaves 115-120). Establishes a Bayesian Belief Network (BBN) to model the uncertainty of aerosols released from cooling towers and uses Geographic Information Systems (GIS) to create a wind dispersal model and identify potential cooling towers as the source of infection. Demonstrates the use of GIS and BBN in environmental epidemiology and the power of spatial information in the area of health.
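The BBN-plus-GIS idea can be reduced to a minimal sketch: evidence produced by a GIS layer (for example, whether reported cases fall inside a modelled wind-dispersal plume) updates the probability that a particular cooling tower is the source. The prior, likelihoods and variable names below are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch of Bayesian updating for a cooling-tower source hypothesis.
# All probabilities and names are illustrative assumptions, not thesis values.

p_source = 0.05                    # prior: tower T is the outbreak source
p_plume_given_source = 0.80        # cases fall inside the GIS-modelled plume if T is the source
p_plume_given_not_source = 0.10    # cases fall inside the plume by coincidence

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule for a single piece of evidence."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1.0 - prior)
    return numerator / evidence

p_after_plume = posterior(p_source, p_plume_given_source, p_plume_given_not_source)
print(f"P(tower is source | cases inside plume) = {p_after_plume:.3f}")
```

A full BBN chains several such evidence nodes (maintenance history, water sampling results, distance bands) through the same update rule.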
2

Chitondo, Pepukayi David Junior. "Data policies for big health data and personal health data". Thesis, Cape Peninsula University of Technology, 2016. http://hdl.handle.net/20.500.11838/2479.

Abstract:
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2016.
Health information policies are constantly becoming a key feature in directing information usage in healthcare. After the passing of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009 and the Affordable Care Act (ACA) passed in 2010, in the United States, there has been an increase in health systems innovations. Coupling this health systems hype is the current buzz concept in Information Technology, „Big data‟. The prospects of big data are full of potential, even more so in the healthcare field where the accuracy of data is life critical. How big health data can be used to achieve improved health is now the goal of the current health informatics practitioner. Even more exciting is the amount of health data being generated by patients via personal handheld devices and other forms of technology that exclude the healthcare practitioner. This patient-generated data is also known as Personal Health Records, PHR. To achieve meaningful use of PHRs and healthcare data in general through big data, a couple of hurdles have to be overcome. First and foremost is the issue of privacy and confidentiality of the patients whose data is in concern. Secondly is the perceived trustworthiness of PHRs by healthcare practitioners. Other issues to take into context are data rights and ownership, data suppression, IP protection, data anonymisation and reidentification, information flow and regulations as well as consent biases. This study sought to understand the role of data policies in the process of data utilisation in the healthcare sector with added interest on PHRs utilisation as part of big health data.
3

Yang, Bin, and 杨彬. "A novel framework for binning environmental genomic fragments". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B45789344.

4

Gigandet, Katherine M. "Processing and Interpretation of Illinois Basin Seismic Reflection Data". Wright State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=wright1401309913.

5

Perovich, Laura J. (Laura Jones). "Data Experiences : novel interfaces for data engagement using environmental health data". Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/95612.

Abstract:
Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 71-81).
For the past twenty years, the data visualization movement has reworked the way we engage with information. It has brought fresh excitement to researchers and reached broad audiences. But what comes next for data? I seek to create example "Data Experiences" that will contribute to developing new spaces of information engagement. Using data from Silent Spring Institute's environmental health studies as a test case, I explore Data Experiences that are immersive, interactive, and aesthetic. Environmental health datasets are ideal for this application as they are highly relevant to the general population and have appropriate complexity. Dressed in Data will focus on the experience of an individual with her/his own environmental health data while BigBarChart focuses on the experience of the community with the overall dataset. Both projects seek to present opportunities for nontraditional learning, community relevance, and social impact.
by Laura J. Perovich.
S.M.
6

Ponsimaa, P. (Petteri). "Discovering value for health with grocery shopping data". Master's thesis, University of Oulu, 2016. http://urn.fi/URN:NBN:fi:oulu-201605221849.

Abstract:
Food retailers are taking more active role in the customer value creation process and shifting their attention from the sale of goods to support customer’s value-creation to discover more innovative service-based business models. From customer data consumers may develop more responsible consumption behaviour, make more economical choices, and raise awareness on food healthiness. This exploratory study sets out to answer the question what value if any does the use of grocery shopping data bring to the customers. Using design science research, the thesis makes use of grocery purchase data available to S-Group customers and presents ways of applying the data while making it meaningful for them. The aim was to construct visualization application prototypes for seeking value and benefits of purchase data experienced by the customers. To evaluate the application design, a study group of eight customers were invited to provide purchase data and feedback on the data visualizations. The focus was on building designs of the grocery consumption patterns based on customer interviews and then evaluating the impact on the study group via interviews and usage data. The visualization prototypes allowed the participants to discover something new of their shopping and food consumption behaviour, not known to them before the study and not visible from the mere purchase data. Interviews suggested that the visualizations of health data encourage reflection of consuming habits, and thus may be used as a tool for increasing awareness of one’s shopping behaviour. A number of limitations in the data utilization were met hindering inference-making and reflecting on the data. Lastly, the prototypes led the participants to envision new digital health services, some of which might have commercial value.
7

Adu-Prah, Samuel. "GEOGRAPHIC DATA MINING AND GEOVISUALIZATION FOR UNDERSTANDING ENVIRONMENTAL AND PUBLIC HEALTH DATA". OpenSIUC, 2013. https://opensiuc.lib.siu.edu/dissertations/657.

Abstract:
Within the theoretical framework of this study it is recognized that a very large amount of real-world facts and geospatial data are collected and stored. Decision makers cannot consider all the available disparate raw facts and data. Problem-specific variables, including complex geographic identifiers have to be selected from this data and be validated. The problems associated with environmental- and public-health data are that (1) geospatial components of the data are not considered in analysis and decision making process, (2) meaningful geospatial patterns and clusters are often overlooked, and (3) public health practitioners find it difficult to comprehend geospatial data. Inspired by the advent of geographic data mining and geovisualization in public and environmental health, the goal of this study is to unveil the spatiotemporal dynamics in the prevalence of overweight and obesity in United States youths at regional and local levels over a twelve-year study period. Specific objectives of this dissertation are to (1) apply regionalization algorithms effective for the identification of meaningful clusters that are in spatial uniformity to youth overweight and obesity, and (2) use Geographic Information System (GIS), spatial analysis techniques, and statistical methods to explore the data sets for health outcomes, and (3) explore geovisualization techniques to transform discovered patterns in the data sets for recognition, flexible interaction and improve interpretation. To achieve the goal and the specific objectives of this dissertation, we used data sets from the National Longitudinal Survey of Youth 1997 (NLSY'97) early release (1997-2004), NLSY'97 current release (2005 - 2008), census 2000 data and yearly population estimates from 2001 to 2008, and synthetic data sets. The NLSY97 Cohort database range varied from 6,923 to 8,565 individuals during the period. At the beginning of the cohort study the age of individuals participating in this study was between 12 and 17 years, and in 2008, they were between 24 and 28 years. For the data mining tool, we applied the Regionalization with Dynamically Constrained Agglomerative clustering and Partitioning (REDCAP) algorithms to identify hierarchical regions based on measures of weight metrics of the U.S. youths. The applied algorithms are the single linkage clustering (SLK), average linkage clustering (ALK), complete linkage clustering (CLK), and the Ward's method. Moreover, we used GIS, spatial analysis techniques, and statistical methods to analyze the spatial varying association of overweight and obesity prevalence in the youth and to geographically visualize the results. The methods used included the ordinary least square (OLS) model, the spatial generalized linear mixed model (GLMM), Kulldorff's Scan space-time analysis, and the spatial interpolation techniques (inverse distance weighting). The three main findings for this study are: first, among the four algorithms ALK, Ward and CLK identified regions effectively than SLK which performed very poorly. The ALK provided more promising regions than the rest of the algorithms by producing spatial uniformity effectively related to the weight variable (body mass index). The regionalization algorithm-ALK provided new insights about overweight and obesity, by detecting new spatial clusters with over 30% prevalence. 
New meaningful clusters were detected in 15 counties, including Yazoo, Holmes, Lincoln, and Attala, in Mississippi; Wise, Delta, Hunt, Liberty, and Hardin in Texas; St Charles, St James, and Calcasieu in Louisiana; Choctaw, Sumter, and Tuscaloosa in Alabama. Demographically, these counties have race/ethnic composition of about 75% White, 11.6% Black and 13.4% others. Second, results from this study indicated that there is an upward trend in the prevalence of overweight and obesity in United States youths both in males and in females. Male youth obesity increased from 10.3% (95% CI=9.0, 11.0) in 1999 to 27.0% (95% CI=26.0, 28.0) in 2008. Likewise, female obesity increased from 9.6% (95% CI=8.0, 11.0) in 1999 to 28.9% (95% CI=27.0, 30.0) during the same period. Youth obesity prevalence was higher among females than among males. Aging is a substantial factor that has statistically highly significant association (p < 0.001) with prevalence of overweight and obesity. Third, significant cluster years for high rates were detected in 2003-2008 (relative risk 1.92, 3.4 annual prevalence cases per 100000, p < 0.0001) and that of low rates in 1997-2002 (relative risk 0.39, annual prevalence cases per 100000, p < 0.0001). Three meaningful spatiotemporal clusters of obesity (p < 0.0001) were detected in counties located within the South, Lower North Eastern, and North Central regions. Counties identified as consistently experiencing high prevalence of obesity and with the potential of becoming an obesogenic environment in the future are Copiah, Holmes, and Hinds in Mississippi; Harris and Chamber, Texas; Oklahoma and McCain, Oklahoma; Jefferson, Louisiana; and Chicot and Jefferson, Arkansas. Surprisingly, there were mixed trends in youth obesity prevalence patterns in rural and urban areas. Finally, from a public health perspective, this research have shown that in-depth knowledge of whether and in what respect certain areas have worse health outcomes can be helpful in designing effective community interventions to promote healthy living. Furthermore, specific information obtained from this dissertation can help guide geographically-targeted programs, policies, and preventive initiatives for overweight and obesity prevalence in the United States.
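A minimal sketch of the regionalization step described above is given below, using scikit-learn's agglomerative clustering with a spatial contiguity constraint as a simple stand-in for the REDCAP algorithms; the county centroids and BMI values are synthetic assumptions, not the NLSY97 data.

```python
# Spatially constrained agglomerative clustering as a simple stand-in for
# REDCAP-style regionalization (synthetic data, not the NLSY97 cohort).
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))    # hypothetical county centroids
bmi = rng.normal(26, 3, size=(200, 1))         # hypothetical mean BMI per county

# Approximate contiguity with a k-nearest-neighbour graph on the centroids,
# so that merges can only join spatially adjacent units.
connectivity = kneighbors_graph(coords, n_neighbors=5, include_self=False)

model = AgglomerativeClustering(n_clusters=8, linkage="average",
                                connectivity=connectivity)
labels = model.fit_predict(bmi)                # regions homogeneous in BMI

for region in range(8):
    members = bmi[labels == region]
    print(f"region {region}: n={len(members):3d}, mean BMI={members.mean():.1f}")
```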
8

Kersten, Ellen Elisabeth. "Spatial Triage: Data, Methods, and Opportunities to Advance Health Equity". Thesis, University of California, Berkeley, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3686356.

Abstract:

This dissertation examines whether spatial measures of health determinants and health outcomes are being used appropriately and effectively to improve the health of marginalized populations in the United States. I concentrate on three spatial measures that have received significant policy and regulatory attention in California and nationally: access to healthful foods, climate change, and housing quality. I find that measures of these health determinants have both significant limitations and unrealized potential for addressing health disparities and promoting health equity.

I define spatial triage as a process of using spatial data to screen or select place-based communities for targeted investments, policy action, and/or regulatory attention. Chapter 1 describes the historical context of spatial triage and how it relates to ongoing health equity research and policy. In Chapter 2, I evaluate spatial measures of community nutrition environments by comparing data from in-person store surveys against data from a commercial database. I find that stores in neighborhoods with higher population density or higher percentage of people of color have lower availability of healthful foods and that inaccuracies in commercial databases may produce biased measures of healthful food availability.

Chapter 3 focuses on spatial measures of climate change vulnerability. I find that currently used spatial measures of "disadvantaged communities" ignore many important factors, such as community assets, region-specific risks, and occupation-based hazards that contribute to place-based vulnerability. I draw from examples of successful actions by community-based environmental justice organizations and reframe "disadvantaged" communities as sites of solutions where innovative programs are being used to simultaneously address climate mitigation, adaptation, and equity goals.

In Chapter 4, I combine electronic health records, public housing locations, and census data to evaluate patterns of healthcare utilization and health outcomes for low-income children in San Francisco. I find that children who live in redeveloped public housing are less likely to have more than one acute care hospital visit within a year than children who live in older, traditional public housing. These results demonstrate how integrating patient-level data across hospitals and with data from other sectors can identify new types of place-based health disparities. Chapter 5 details recommendations for analytic, participatory, and cross-sector approaches to guide the development and implementation of more effective health equity research and policy.

9

Ling, Meng-Chun. "Senior health care system". CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2785.

Abstract:
Senior Health Care System (SHCS) is created for users to enter participants' conditions and store information in a central database. When users are ready for quarterly assessments the system generates a simple summary that can be reviewed, modified, and saved as part of the summary assessments, which are required by Federal and California law.
10

Dulaney, D. R., Kurt J. Maier, and Phillip R. Scheuerman. "Data Requirements for Developing Effective Pathogen TMDLs". Digital Commons @ East Tennessee State University, 2005. https://dc.etsu.edu/etsu-works/2938.

11

Zhao, Hang. "Finding stable allocations in distributed real-time systems with multiple environmental parameters and replicable application". Ohio : Ohio University, 2005. http://www.ohiolink.edu/etd/view.cgi?ohiou1113854894.

12

Danna, Nigatu Mitiku, and Esayas Getachew Mekonnen. "Data Processing Algorithms in Wireless Sensor Networks for Structural Health Monitoring". Thesis, KTH, Bro- och stålbyggnad, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-72241.

Abstract:
The gradual deterioration and failure of old buildings, bridges and other civil engineering structures created the need for Structural Health Monitoring (SHM) systems as a means to monitor the health of structures. Dozens of sensing, processing and monitoring mechanisms have been implemented and widely deployed with wired sensors. Wireless sensor networks (WSNs), on the other hand, are networks of large numbers of low-cost wireless sensor nodes that communicate over a wireless medium. The complexity and high cost of the widely used traditional wired SHM systems have created the need to replace them with WSNs. However, wireless sensor nodes have memory and power-supply limitations, and many options have been proposed to address this constraint and preserve the long life of the network. This is why data processing algorithms in WSNs focus mainly on efficient utilization of these scarce resources. In this thesis, we design a low-power and memory-efficient data processing algorithm using an in-place radix-2 integer Fast Fourier Transform (FFT). The algorithm requires integer-valued inputs; it therefore increases memory efficiency by more than 40% and substantially reduces processor power consumption compared with the traditional floating-point implementation. A standard-deviation-based peak-picking algorithm is then applied to measure the natural frequency of the structure. The algorithms, together with Contiki, a lightweight open-source operating system for networked embedded systems, are loaded onto a Zolertia Z1 sensor node. The Analog Devices ADXL345 digital accelerometer on board is used to collect vibration data. The bridge model used to test the algorithm is a simply supported beam in the lab.
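The processing chain summarized above (FFT of the acceleration record followed by standard-deviation-based peak picking) can be sketched as follows. This floating-point NumPy version only illustrates the idea on a synthetic signal; the thesis implements an in-place radix-2 integer FFT suited to the sensor node's constraints.

```python
# Sketch of natural-frequency estimation: FFT magnitude spectrum plus a
# standard-deviation-based peak threshold (synthetic signal, floating point).
import numpy as np

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
accel = (np.sin(2 * np.pi * 3.2 * t)         # hypothetical 3.2 Hz beam mode
         + 0.3 * np.random.default_rng(1).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
freqs = np.fft.rfftfreq(accel.size, d=1 / fs)

# Peak picking: keep bins that rise well above the spectrum's own spread.
threshold = spectrum.mean() + 3.0 * spectrum.std()
peaks = freqs[spectrum > threshold]
print("candidate natural frequencies (Hz):", np.round(peaks, 2))
```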
13

Harley, Joel B. "Data-Driven, Sparsity-Based Matched Field Processing for Structural Health Monitoring". Research Showcase @ CMU, 2014. http://repository.cmu.edu/dissertations/392.

Abstract:
This dissertation develops a robust, data-driven localization methodology based on the integration of matched field processing with compressed sensing ℓ1 recovery techniques and scale transform signal processing. The localization methodology is applied to an ultrasonic guided wave structural health monitoring system for detecting, locating, and imaging damage in civil infrastructures. In these systems, the channels are characterized by complex, multi-modal, and frequency dispersive wave propagation, which severely distort propagating signals. Acquiring the characteristics of these propagation mediums from data represents a difficult inverse problem for which, in general, no readily available solution exists. In this dissertation, we build data-driven models of these complex mediums by integrating experimental guided wave measurements with theoretical wave propagation models and ℓ1 sparse recovery methods from compressed sensing. The data-driven models are combined with matched field processing, a localization framework extensively studied for underwater acoustics, to localize targets in complex, guided wave environments. The data-driven matched field processing methodology is then refined, through the use of the scale transform, to achieve robustness to environmental variations that distort guided waves. Data-driven matched field processing is experimentally applied to an ultrasound structural health monitoring system to detect and locate damage in aluminum plate structures.
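The ℓ1 recovery step at the core of this data-driven matched field approach can be illustrated with a generic iterative soft-thresholding (ISTA) solver for a sparse localization problem. The dictionary below is random rather than a guided-wave propagation model, so this is only a sketch of the sparse-recovery machinery, not of the dissertation's full framework.

```python
# Generic ISTA solver for min_x 0.5*||Ax - y||^2 + lam*||x||_1, used as a
# stand-in for the l1 recovery step in sparse localization. The columns of A
# would normally hold predicted wavefields for candidate source locations;
# here they are random for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_candidates = 60, 200
A = rng.normal(size=(n_meas, n_candidates))
x_true = np.zeros(n_candidates)
x_true[[17, 123]] = [1.0, -0.7]                  # two "active" source locations
y = A @ x_true + 0.01 * rng.normal(size=n_meas)

def ista(A, y, lam=0.1, n_iter=500):
    L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

x_hat = ista(A, y)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))
```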
14

Algire, Martin. "Distributed multi-processing for high performance computing". Thesis, McGill University, 2000. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=31180.

Abstract:
Parallel computing can take many forms. From a user's perspective, it is important to consider the advantages and disadvantages of each methodology. The following project attempts to provide some perspective on the methods of parallel computing and indicate where the tradeoffs lie along the continuum. Problems that are parallelizable enable researchers to maximize the computing resources available for a problem, and thus push the limits of the problems that can be solved. Solving any particular problem in parallel will require some very important design decisions to be made. These decisions may dramatically affect portability, performance, and cost of implementing a software solution to the problem. The results gained from this work indicate that although performance improvements are indeed possible---they are heavily dependent on the application in question and may require much more programming effort and expertise to implement.
15

Willner, Marjorie Rose. "Environmental Analysis at the Nanoscale: From Sensor Development to Full Scale Data Processing". Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/94644.

Abstract:
Raman spectroscopy is an extremely versatile technique with molecular sensitivity and fingerprint specificity. However, the translation of this tool into a deployable technology has been stymied by irreproducibility in sample preparation and the lack of complex data analysis tools. In this dissertation, a droplet microfluidic platform was prototyped to address both sample-to-sample variation and to introduce a level of quantitation to surface enhanced Raman spectroscopy (SERS). Shifting the SERS workflow from a cell-to-cell mapping routine to the mapping of tens to hundreds of cells demanded the development of an automated processing tool to perform basic SERS analyses such as baseline correction, peak feature selection, and SERS map generation. The analysis tool was subsequently expanded for use with a multitude of diverse SERS applications. Specifically, a two-dimensional SERS assay for the detection of sialic acid residues on the cell membrane was translated into a live cell assay by utilizing a droplet microfluidic device. Combining single-cell encapsulation with a chamber array to hold and immobilize droplets allowed for the interrogation of hundreds of droplets. Our novel application of computer vision algorithms to SERS maps revealed that sialic sugars on cancer cell membranes are found in small clusters, or islands, and that these islands typically occupy less than 30% of the cell surface area. Employing an opportunistic mindset for the application of the data processing platform, a number of smaller projects were pursued. Biodegradable aliphatic-aromatic copolyesters with varying aromatic content were characterized using Raman spectroscopy and principal component analysis (PCA). The six different samples could successfully be distinguished from one another and the tool was able to identify spectral feature changes resulting from an increasing number of aryl esters. Uniquely, PCA was performed on the 3,125 spectra collected from each sample to investigate point-to-point heterogeneities. A third set of projects evaluated the ability of the data processing tool to calculate spectral ratios in an automated fashion and were exploited for use with nano-pH probes and Rayleigh hot-spot normalization.
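A minimal sketch of the automated spectral preprocessing described above (baseline correction followed by peak feature selection) is shown below for a synthetic Raman-like spectrum. The polynomial baseline and peak parameters are assumptions; the dissertation's actual SERS pipeline is more elaborate.

```python
# Sketch of automated baseline correction and peak picking for a synthetic
# Raman-like spectrum (crude polynomial baseline; illustrative only).
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
wavenumber = np.linspace(400, 1800, 1400)
baseline = 0.02 * (wavenumber - 400) + 10.0                     # slowly varying background
bands = (50 * np.exp(-0.5 * ((wavenumber - 1080) / 6) ** 2)     # hypothetical SERS bands
         + 30 * np.exp(-0.5 * ((wavenumber - 1450) / 8) ** 2))
spectrum = baseline + bands + rng.normal(0, 0.5, wavenumber.size)

# Baseline estimate: low-order polynomial fit over the whole spectrum.
coeffs = np.polyfit(wavenumber, spectrum, deg=3)
corrected = spectrum - np.polyval(coeffs, wavenumber)

# Peak feature selection on the baseline-corrected spectrum.
idx, _ = find_peaks(corrected, height=10.0, prominence=10.0)
for i in idx:
    print(f"peak at {wavenumber[i]:.0f} cm^-1, height {corrected[i]:.1f}")
```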
Ph. D.
16

Korziuk, Kamil, and Tomasz Podbielski. "Engineering Requirements for platform, integrating health data". Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16089.

Abstract:
In a world where people are increasingly on the move and the population is ageing significantly, new technologies are trying to do their best to meet human expectations. The results of a survey carried out with elderly participants during a technology conference at Blekinge Institute of Technology showed that none of them had any kind of assistance in their home, although they would need it. This Master thesis presents human health state monitoring with a focus on fall detection. Health care systems will not completely eliminate falls, but further study of their causes can help prevent them. In this thesis, the integration of sensors for vital parameter measurements and human position, and the evaluation of the measured data, are presented. The work is based on specific technologies compatible with the Arduino Uno and Arduino Mega microcontrollers, measurement sensors, and data exchange between a database, MATLAB/Simulink and a web page. Sensors integrated in one common system make it possible to examine the patient's health state and call for assistance in case of declining health or a serious risk of injury. System performance was evaluated over many series of measurements. In the first phase, a comparison between different filters was carried out to choose the one with the best performance. Kalman filtering and a trim parameter for the accelerometer were used to obtain satisfactory results and the final human fall detection algorithm. The acquired measurements and data evaluation showed that Kalman filtering achieves high performance and gives the most reliable results. In the second phase, sensor placement was tested. The collected data showed that human falls are correctly recognized by the system with high accuracy. As a result, the designed system allows measurement of human health and vital signs such as temperature, heartbeat, position and activity. Additionally, the system provides an online overview of the current health state, historical data and an IP camera preview when an alarm has been raised due to a poor health condition.
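The filtering-plus-threshold approach described above can be sketched with a scalar Kalman filter applied to the acceleration magnitude, followed by a simple fall rule. The noise covariances, threshold and signal are illustrative assumptions; the thesis tunes these choices against real sensor recordings.

```python
# Sketch: scalar Kalman filter on acceleration magnitude plus a threshold rule
# for fall detection (all parameters are illustrative assumptions).
import numpy as np

def kalman_1d(z, q=0.05, r=0.5):
    """Filter a 1-D measurement sequence z (process noise q, sensor noise r)."""
    x, p = z[0], 1.0
    out = []
    for zk in z:
        p = p + q                      # predict
        k = p / (p + r)                # Kalman gain
        x = x + k * (zk - x)           # update with the new measurement
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(3)
t = np.arange(0, 10, 0.02)
accel_mag = np.full(t.size, 9.81) + rng.normal(0, 0.4, t.size)   # quiet standing
accel_mag[300:305] += 18.0                                       # simulated impact spike

smoothed = kalman_1d(accel_mag)
FALL_THRESHOLD = 20.0                  # m/s^2, hypothetical
if np.any(smoothed > FALL_THRESHOLD):
    first = np.argmax(smoothed > FALL_THRESHOLD)
    print(f"possible fall detected at t = {t[first]:.2f} s")
```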
17

Maas, Luis C. (Luis Carlos). "Processing strategies for functional magnetic resonance imaging data sets". Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/85262.

Abstract:
Thesis (Ph.D.)--Harvard--Massachusetts Institute of Technology Division of Health Sciences and Technology, 1999.
Includes bibliographical references (leaves 108-118).
by Luis Carlos Maas, III.
Ph.D.
18

Iwaya, Leonardo H. "Secure and Privacy-aware Data Collection and Processing in Mobile Health Systems". Licentiate thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-46982.

Abstract:
Healthcare systems have assimilated information and communication technologies in order to improve the quality of healthcare and patient's experience at reduced costs. The increasing digitalization of people's health information raises however new threats regarding information security and privacy. Accidental or deliberate data breaches of health data may lead to societal pressures, embarrassment and discrimination. Information security and privacy are paramount to achieve high quality healthcare services, and further, to not harm individuals when providing care. With that in mind, we give special attention to the category of Mobile Health (mHealth) systems. That is, the use of mobile devices (e.g., mobile phones, sensors, PDAs) to support medical and public health. Such systems, have been particularly successful in developing countries, taking advantage of the flourishing mobile market and the need to expand the coverage of primary healthcare programs. Many mHealth initiatives, however, fail to address security and privacy issues. This, coupled with the lack of specific legislation for privacy and data protection in these countries, increases the risk of harm to individuals. The overall objective of this thesis is to enhance knowledge regarding the design of security and privacy technologies for mHealth systems. In particular, we deal with mHealth Data Collection Systems (MDCSs), which consists of mobile devices for collecting and reporting health-related data, replacing paper-based approaches for health surveys and surveillance. This thesis consists of publications contributing to mHealth security and privacy in various ways: with a comprehensive literature review about mHealth in Brazil; with the design of a security framework for MDCSs (SecourHealth); with the design of a MDCS (GeoHealth); with the design of Privacy Impact Assessment template for MDCSs; and with the study of ontology-based obfuscation and anonymisation functions for health data.
19

Gurung, Sanjaya. "Integrating environmental data acquisition and low cost Wi-Fi data communication". Thesis, University of North Texas, 2009. https://digital.library.unt.edu/ark:/67531/metadc12131/.

Abstract:
This thesis describes environmental data collection in the field and transmission to a server using Wi-Fi. Also discussed are the system components, radio wave propagation, received-power calculations, and throughput tests. The measured received power was close to the calculated and simulated values, and the throughput tests gave satisfactory results. The thesis provides detailed, systematic procedures for setting up a Wi-Fi radio link and techniques for optimizing the quality of the link.
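Received-power calculations of the kind mentioned above usually start from a link budget with a path-loss model; the sketch below uses the free-space form. The transmit power, antenna gains, frequency and distance are assumed values, not figures from the thesis.

```python
# Sketch of a Wi-Fi link budget using free-space path loss (FSPL).
# Transmit power, antenna gains, frequency and distance are assumptions.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_power_dbm = 15.0      # transmitter output
tx_gain_dbi = 8.0        # directional antenna at the field station
rx_gain_dbi = 8.0        # antenna at the receiving end
freq_hz = 2.437e9        # Wi-Fi channel 6
distance_m = 2000.0      # field site to base station

rx_power_dbm = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - fspl_db(distance_m, freq_hz))
print(f"Predicted received power at {distance_m / 1000:.1f} km: {rx_power_dbm:.1f} dBm")
```

Comparing such predictions against field measurements is essentially what the received-power and throughput tests in the thesis do.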
20

Harris, Jeff R. "Processing and integration of geochemical data for mineral exploration: Application of statistics, geostatistics and GIS technology". Thesis, University of Ottawa (Canada), 2002. http://hdl.handle.net/10393/6421.

Abstract:
Geographic Information Systems (GIS) used in concert with statistical and geostatistical software provide the geologist with a powerful tool for processing, visualizing and analysing geoscience data for mineral exploration applications. This thesis focuses on different methods for analysing, visualizing and integrating geochemical data sampled from various media (rock, till, soil, humus), with other types of geoscience data. Different methods for defining geochemical anomalies and separating geochemical anomalies due to mineralization from other lithologic or surficial factors (i.e. true from false anomalies) are investigated. With respect to lithogeochemical data, this includes methods to distinguish between altered and un-altered samples, methods (normalization) for identifying lithologic from mineralization effects, and various statistical and visual methods for identifying anomalous geochemical concentrations from background. With respect to surficial geochemical data, methods for identifying bedrock signatures, and scavenging effects are presented. In addition, a new algorithm, the dispersal train identification algorithm (DTIA), is presented which broadly helps to identify and characterize anisotropies in till data due to glacial dispersion and more specifically identifies potential dispersal trains using a number of statistical parameters. The issue of interpolation of geochemical data is addressed and methods for determining whether geochemical data should or should not be interpolated are presented. New methods for visualizing geochemical data using red-green-blue (RGB) ternary displays are illustrated. Finally data techniques for integrating geochemical data with other geoscience data to produce mineral prospectivity maps are demonstrated. Both data and knowledge-driven GIS modeling methodologies are used (and compared) for producing prospectivity maps. New ways of preparing geochemical data for input to modeling are demonstrated with the aim of getting the most out of your data for mineral exploration purposes. Processing geochemical data by sub-populations, either by geographic unit (i.e., lithology) or by geochemical classification and alteration style was useful for better identification of geochemical anomalies, with respect to background, and for assessing varying alteration styles. Normal probability plots of geochemical concentrations based on spatial (lithologic) divisions and Principal Component Analysis (PCA) were found to be particularly useful for identifying geochemical anomalies and for identifying associations between major oxide elements that in turn reflect different alteration styles. (Abstract shortened by UMI.)
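One common way to separate anomalous concentrations from background within lithological sub-populations, in the spirit of the sub-population processing described above, is a robust per-group threshold. The sketch below flags values above the median plus two (scaled) median absolute deviations of the log-transformed concentrations; the element, units and lithology labels are hypothetical, and the rule is a generic stand-in rather than the thesis's specific procedure.

```python
# Sketch: per-lithology robust anomaly threshold for a geochemical element
# (median + 2*MAD on log-transformed concentrations; synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "lithology": rng.choice(["granite", "basalt", "sediment"], size=300),
    "Cu_ppm": rng.lognormal(mean=3.0, sigma=0.6, size=300),   # hypothetical copper values
})

def robust_threshold(logc, k=2.0):
    med = logc.median()
    mad = (logc - med).abs().median() * 1.4826   # scale MAD to a sigma equivalent
    return med + k * mad

df["log_Cu"] = np.log10(df["Cu_ppm"])
df["anomalous"] = df["log_Cu"] > df.groupby("lithology")["log_Cu"].transform(robust_threshold)
print(df.groupby("lithology")["anomalous"].mean())   # share of samples flagged per unit
```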
21

Kishore, Annapoorni. "AN INTERNSHIP WITH ENVIRONMENTAL SYSTEMS RESEARCH INSTITUTE". Miami University / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=miami1209153230.

22

Das, Debalina. "Waterborne Diseases: Linking Public Health And Watershed Data". Amherst, Mass. : University of Massachusetts Amherst, 2009. http://scholarworks.umass.edu/theses/235/.

23

Marchant, Christian C. "Retrieval of Aerosol Mass Concentration from Elastic Lidar Data". DigitalCommons@USU, 2010. https://digitalcommons.usu.edu/etd/812.

Abstract:
Agricultural aerosol sources can contribute significantly to air pollution in many regions of the country. Characterization of the aerosol emissions of agricultural operations is required to establish a scientific basis for crafting regulations concerning agricultural aerosols. A new lidar instrument for measuring aerosol emissions is described, as well as two new algorithms for converting lidar measurements into aerosol concentration data. The average daily aerosol emission rate is estimated from a dairy using lidar. The Aglite Lidar is a portable scanning lidar for mapping the concentration of particulate matter from agricultural and other sources. The instrument is described and performance and lidar sensitivity data are presented. Its ability to map aerosol plumes is demonstrated, as well as the ability to extract wind-speed information from the lidar data. An iterative least-squares method is presented for estimating the solution to the lidar equation. The method requires a priori knowledge of aerosol relationships from point sensors. The lidar equation is formulated and solved in vector form. The solution is stable for signals with extremely low signal-to-noise ratios and for signals at ranges far beyond the boundary point. Another lidar algorithm is also presented as part of a technique for estimating aerosol concentration and particle-size distribution. This technique uses a form of the extended Kalman Filter, wherein the target aerosol is represented as a linear combination of basis aerosols. For both algorithms, the algorithm is demonstrated using both synthetic test data and field measurements of biological aerosol simulants. The estimated particle size distribution allows straightforward calculation of parameters such as volume-fraction concentration and effective radius. Particulate matter emission rates from a dairy in the San Joaquin Valley of California were investigated during June 2008. Vertical particulate matter concentration profiles were measured both upwind and downwind of the facility using lidar, and a mass balance technique was used to estimate the average emission rate. Emission rates were also estimated using an inverse modeling technique coupled with the filter-based measurements. The concentrations measured by lidar and inverse modeling are of similar magnitude to each other, as well as to those from studies with similar conditions.
24

Kratchman, Jessica. "Predicting Chronic Non-Cancer Toxicity Levels from Short-Term Toxicity Data". Thesis, The George Washington University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10263969.

Abstract:

This dissertation includes three separate but related studies performed in partial fulfillment of the requirements for the degree of Doctor of Public Health in Environmental and Occupational Health. The main goal of this dissertation was to develop and assess quantitative relationships for predicting doses associated with chronic non-cancer toxicity levels in situations where there is an absence of chronic toxicity data, and to consider the applications of these findings to chemical substitution decisions. Data from National Toxicology Program (NTP) Technical Reports (TRs) (and, where applicable, Toxicity Reports), which detail the results of both short-term and chronic rodent toxicity tests, have been extracted and modeled using the Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). Best-fit minimum benchmark doses (BMDs) and benchmark dose lower limits (BMDLs) were determined. Endpoints of interest included non-neoplastic lesions, final mean body weights and mean organ weights. All endpoints were identified by NTP pathologists in the abstract of the TRs as either statistically or biologically significant. A total of 41 chemicals tested between 2000 and 2012 were included, with over 1700 endpoints for short-term (13-week) and chronic (2-year) exposures.

Non-cancer endpoints were the focus of this research. Chronic rodent bioassays have been used by many methodologies in predicting the carcinogenic potential of chemicals in humans (1). However, there appears to be less emphasis on non-cancer endpoints. Further, it has been shown in the literature that there is little concordance in cancerous endpoints between humans and rodents (2). The first study, Quantitative Relationship of Non-Cancer Benchmark Doses in Short-Term and Chronic Rodent Bioassays (Chapter 2), investigated quantitative relationships between non-cancer chronic and short-term toxicity levels using best-fit modeling results and orthogonal regression techniques. The findings indicate that short-term toxicity studies reasonably provide a quantitative estimate of minimum (and median) chronic non-cancer BMDs and BMDLs.
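The orthogonal regression step in the first study can be illustrated with a total least squares fit between log-transformed short-term and chronic benchmark doses, where errors are measured perpendicular to the fitted line. The data below are synthetic stand-ins; the real slope and intercept come from the NTP-derived BMDs analysed in the dissertation.

```python
# Sketch: orthogonal (total least squares) regression of log chronic BMDs on
# log short-term BMDs, fitted via SVD (synthetic data for illustration).
import numpy as np

rng = np.random.default_rng(5)
log_short = rng.uniform(-1, 3, size=41)                        # hypothetical log10 BMDs
log_chronic = 0.9 * log_short - 0.5 + rng.normal(0, 0.3, 41)   # assumed relationship

X = np.column_stack([log_short, log_chronic])
Xc = X - X.mean(axis=0)

# The TLS line direction is the right singular vector associated with the
# largest singular value of the centered data.
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
direction = vt[0]
slope = direction[1] / direction[0]
intercept = X[:, 1].mean() - slope * X[:, 0].mean()
print(f"TLS fit: log_chronic ~ {slope:.2f} * log_short + {intercept:.2f}")
```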

The next study, Assessing Implicit Assumptions in Toxicity Testing Guidelines (Chapter 3), assessed the most sensitive species and species-sex combinations associated with the best-fit minimum BMDL10 for the 41 chemicals. The findings indicate that species and species-sex sensitivity for this group of chemicals is not uniform and that rats are significantly more sensitive than mice for non-cancerous outcomes. There are also indications that male rats may be more sensitive than the other species-sex groups in certain instances.

The third and final study, Comparing Human Health Toxicity of Alternative Chemicals (Chapter 4), considered two pairs of target and alternative chemicals. A target is the chemical of concern and the alternative is the suggested substitution. The alternative chemical lacked chronic toxicity data, whereas the target had well studied non-cancer health effects. Using the quantitative relationships established in Chapter 2, Quantitative Relationship of Non-Cancer Benchmark Doses in Short-Term and Chronic Rodent Bioassays, chronic health effect levels were predicted for the alternative chemicals and compared to known points of departure (PODs) for the targets. The findings indicate some alternatives can lead to chemical exposures potentially more toxic than the target chemical.

25

Ying, Yujie. "A Data-Driven Framework for Ultrasonic Structural Health Monitoring of Pipes". Research Showcase @ CMU, 2012. http://repository.cmu.edu/dissertations/92.

Abstract:
Cylindrical shells serve important roles in broad engineering applications, such as oil and natural gas pipelines, and pressurized industrial piping systems. To ensure the safety of pipe structures, various inspection equipment and platforms have been developed based on nondestructive testing (NDT) technologies. However, most existing approaches are time and labor intensive, and are only conducted intermittently. Drawbacks of current NDT methods suggest a proactive, automated and long-term monitoring system. Structural health monitoring (SHM) techniques continuously assess structural integrity through permanently installed transducers, allowing condition-based maintenance to replace the current practice of economically inefficient schedule-based maintenance. Ultrasonics is an appealing SHM technology in which guided waves interrogate long stretches of a pipe with high sensitivity to damage, and can be generated by a surface-mounted, small-size piezoelectric wafer transducer (PZT). The challenges of implementing ultrasonic SHM with PZTs as active sensing devices lie in: (1) the wave pattern is complex and difficult to interpret; (2) it is even more difficult to differentiate changes produced by damage from changes produced by benign environmental and operational variability. The ultimate goal of this research is to develop an ultrasonic sensing and data analysis system for continuous and reliable monitoring of pipe structures. The objective of this dissertation is to devise a data-driven framework for effective and robust analysis of guided wave signals to detect and localize damage in steel pipes under environmental and operational variations. The framework is composed of a three-stage SHM scheme: damage detection, damage localization and damage characterization, supported by a multilayer data processing architecture incorporating statistical analysis, signal processing, and machine learning techniques. The data-driven methodology was first investigated through laboratory experiments conducted on a pipe specimen with varying internal air pressure. The sensed ultrasonic data were characterized and mapped onto a high dimensional feature space using various statistical and signal processing techniques. Machine learning algorithms were applied to automatically identify effective features, and to detect and localize a weak scatterer on the pipe. The reliability and generality of the data-driven framework was further validated through field tests performed on an in-service hot-water pipe under large, complex and uncontrollable operating conditions. This data-driven SHM methodology involves an integrated process of sensing, data acquisition, statistical analysis, signal processing, and pattern recognition, for continuous tracking of the structural functionality in an adaptive and cost-effective manner. The techniques developed in this dissertation are expected to have broader applications related to the regular inspection, maintenance, and management of critical infrastructures not just limited to pipes.
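A heavily simplified sketch of the baseline-comparison idea behind such guided-wave monitoring is shown below: a single damage-sensitive feature obtained by correlating the current signal against a baseline, with the detection threshold set from benign baseline-to-baseline variability. The signals, feature and threshold rule are synthetic assumptions, not the dissertation's multilayer feature set.

```python
# Sketch: baseline-subtraction damage index for guided-wave monitoring.
# DI = 1 - max normalized cross-correlation with the baseline; the threshold
# comes from benign baseline-to-baseline variation (synthetic signals).
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 1e-3, 2000)                         # 1 ms record
baseline = np.sin(2 * np.pi * 50e3 * t) * np.exp(-t / 3e-4)

def damage_index(signal, reference):
    corr = np.correlate(signal, reference, mode="full")
    norm = np.linalg.norm(signal) * np.linalg.norm(reference)
    return 1.0 - np.max(np.abs(corr)) / norm

# Benign variation (small gain drift plus noise) sets the alarm threshold.
benign = [baseline * (1 + 0.02 * rng.normal()) + 0.01 * rng.normal(size=t.size)
          for _ in range(20)]
benign_di = [damage_index(s, baseline) for s in benign]
threshold = np.mean(benign_di) + 3 * np.std(benign_di)

# A "damaged" record: a small scattered echo added to the baseline.
damaged = baseline + 0.15 * np.roll(baseline, 400) + 0.01 * rng.normal(size=t.size)
di = damage_index(damaged, baseline)
print(f"DI = {di:.4f}, threshold = {threshold:.4f}, damage flagged: {di > threshold}")
```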
26

Pentaris, Fragkiskos. "Digital signal processing for structural health monitoring of buildings". Thesis, Brunel University, 2014. http://bura.brunel.ac.uk/handle/2438/10560.

Abstract:
Structural health monitoring (SHM) is a relatively new discipline that studies the structural condition of buildings and other constructions. Current SHM systems are either wired or wireless, with relatively high cost and low accuracy. This thesis exploits a blend of digital signal processing methodologies for structural health monitoring and develops a wireless SHM system that provides a low-cost yet reliable and robust implementation. Existing technologies of wired and wireless sensor network platforms with high-sensitivity accelerometers are combined in order to create a system for monitoring the structural characteristics of buildings economically and functionally, so that it can easily be implemented in buildings at low cost. Well-known and established statistical time series methods are applied to SHM data collected from real concrete structures subjected to earthquake excitation, and their strong and weak points are investigated. The need to combine parametric and non-parametric approaches is justified, and in this direction novel and improved digital signal processing techniques and indexes are applied to vibration data recordings in order to eliminate noise and reveal structural properties and characteristics of the buildings under study that deteriorate due to environmental, seismic or anthropogenic impact. A characteristic and potentially harmful case study is presented, in which the consequences to structures of a strong earthquake of magnitude 6.4 M are investigated. Furthermore, a seismic influence profile of the buildings under study is introduced, related to the seismic sources that exist in the broader region of study.
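The parametric time-series side of such work is often built on autoregressive (AR) models of the acceleration record, with the AR coefficients or residual variance serving as damage-sensitive indexes. The sketch below fits an AR model by least squares to synthetic records; the model order, signals and the frequency shift standing in for degradation are assumptions, not the thesis's data.

```python
# Sketch: least-squares AR(p) fit to an acceleration record, with coefficient
# shifts and residual variance used as simple damage-sensitive indexes.
import numpy as np

def fit_ar(x, order):
    """Return AR coefficients and residual variance for series x."""
    rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
    X = np.array(rows)                          # lagged regressors
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return coeffs, resid.var()

rng = np.random.default_rng(7)
t = np.arange(5000) / 100.0                     # 50 s at 100 Hz
healthy = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.normal(size=t.size)
degraded = np.sin(2 * np.pi * 1.7 * t) + 0.1 * rng.normal(size=t.size)  # crude stiffness-loss stand-in

coeffs_h, var_h = fit_ar(healthy, order=6)
coeffs_d, var_d = fit_ar(degraded, order=6)
print("AR coefficient shift:", np.round(coeffs_d - coeffs_h, 3))
print(f"residual variance: healthy {var_h:.4f}, degraded {var_d:.4f}")
```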
27

Zinszer, Kate. "Predicting malaria in a highly endemic country using clinical and environmental data". Thesis, McGill University, 2014. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=123153.

Abstract:
Malaria is a public health crisis, with between 154 and 289 million cases worldwide in 2011, 80% of which are in sub-Saharan Africa. International agencies have prioritized reduction of the malaria burden, investing an estimated $1.84 billion in 2012 alone in malaria control and prevention programs in malaria endemic countries. There has been a drastic increase in resources dedicated to malaria prevention and control efforts over the past decade. Malaria thrives in poor tropical and subtropical countries where local resources are limited. Accurate disease predictions and early warnings of increased disease burden can provide public health and clinical health services with information critical for targeting malaria control and prevention measures in an efficient manner. There have been numerous studies that have developed malaria forecasting models although the limitations of several of the studies include narrowly focusing on environmental predictors and the use of scale-dependent measures. Common, scale-free accuracy measures are essential, as they will facilitate the comparison of findings between studies and between methods and likely lead to improvements in the field of malaria forecasting. The aim of my thesis work was to develop and evaluate statistical models that integrate environmental and clinical data to forecast malaria across different settings in a highly endemic country. Specifically, the first objective was to systematically examine and summarize the literature on malaria forecasting models. A scoping review was conducted and the findings of this review have informed the methods and predictors included in the forecasting models used in this thesis. The second objective was to evaluate different methods of defining the catchment areas of health facilities and this allowed us to estimate the geographic regions served by the health facilities and their study populations. The third and final objective was to identify significant predictors of malaria across different settings and forecast horizons. Two forecasting models were developed for each of the six Uganda Malaria Surveillance Project (UMSP) sites, short-term (4 weeks) and long-term (52 weeks) models for a total of 12 models. Remote sensing data were obtained for the environmental predictors and the UMSP clinical data were obtained for clinic- and patient-level predictors. Models were evaluated in terms of forecast error on data that were not used for model development. Most of the models with the lowest forecast error included both environmental and clinical predictors, and the parameters of the models often varied within a site and across sites. Generally, the short-term models were able to predict variations in malaria counts whereas the intermediate and long-term models were more useful in predicting cumulative cases (e.g., number of cases within 30 weeks).The collective work of this thesis should advance the field of public health surveillance and more specifically malaria forecasting in a number of ways: in providing methodological guidelines for future forecasting studies, in providing a simple method for catchment definition that can be applied to define the geographic limits of a forecasting model, in identifying the importance of clinical predictors for forecasting malaria, in demonstrating the development of forecasting models with high spatial and temporal resolutions, and in raising important points of consideration for future forecasting work.
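A highly simplified version of a site-level forecasting model of this kind is sketched below: weekly malaria counts regressed on lagged environmental covariates with a Poisson GLM, then scored on held-out weeks with mean absolute error. The model form, lags and data are assumptions for illustration; the thesis models combine richer clinical and environmental predictors and other error measures.

```python
# Sketch: Poisson regression forecast of weekly malaria counts from lagged
# environmental covariates, scored by mean absolute error on held-out weeks.
# All data, lags and the model form are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
weeks = 260
rain = rng.gamma(2.0, 20.0, size=weeks)                                   # weekly rainfall, mm
temp = 22 + 5 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 1, weeks)

df = pd.DataFrame({"rain_lag4": pd.Series(rain).shift(4),
                   "temp_lag2": pd.Series(temp).shift(2)})
log_mu = 2.0 + 0.004 * df["rain_lag4"] + 0.05 * df["temp_lag2"]
df["cases"] = rng.poisson(np.exp(log_mu.fillna(2.0)))
df = df.dropna()

train, test = df.iloc[:200], df.iloc[200:]
X_train = sm.add_constant(train[["rain_lag4", "temp_lag2"]])
fit = sm.GLM(train["cases"], X_train, family=sm.families.Poisson()).fit()

X_test = sm.add_constant(test[["rain_lag4", "temp_lag2"]])
forecast = fit.predict(X_test)
mae = np.mean(np.abs(forecast - test["cases"]))
print(fit.params.round(3))
print(f"mean absolute forecast error: {mae:.2f} cases per week")
```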
28

Sharaf, Taysseer. "Statistical Learning with Artificial Neural Network Applied to Health and Environmental Data". Scholar Commons, 2015. http://scholarcommons.usf.edu/etd/5866.

Abstract:
The current study illustrates the utilization of artificial neural networks in statistical methodology, more specifically in survival analysis and time series analysis, both of which have important and wide use in many real-life applications. We start our discussion by utilizing artificial neural networks in survival analysis. In the literature there exist two important methodologies for utilizing artificial neural networks in survival analysis based on the discrete survival time method. We illustrate the idea of the discrete survival time method and show how one can estimate the discrete model using an artificial neural network. We present a comparison between the two methodologies and update one of them to estimate survival time under competing risks. To fit a model using an artificial neural network, two parts need attention: the first is the neural network architecture and the second is the learning algorithm. Usually neural networks are trained using a non-linear optimization algorithm such as a quasi-Newton-Raphson algorithm. Other learning algorithms are based on Bayesian inference. In this study we present a new learning technique that uses a mixture of the two available methodologies for applying Bayesian inference in the training of neural networks. We have performed our analysis using real-world data on patients diagnosed with skin cancer in the United States, drawn from the SEER database maintained under the supervision of the National Cancer Institute. The second part of this dissertation presents the utilization of artificial neural networks for time series analysis. We present a new method of training recurrent artificial neural networks with Hybrid Monte Carlo sampling and compare our findings with the popular auto-regressive integrated moving average (ARIMA) model. We used the monthly average carbon dioxide emission data collected from NOAA for this comparison.
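In its simplest form, the discrete survival time approach mentioned above amounts to expanding each subject into person-period records and fitting a binary classifier for the hazard in each interval; a neural network can be dropped in where the logistic regression sits below. The data are synthetic, not SEER records, and the covariates are assumptions.

```python
# Sketch: discrete-time survival analysis via person-period expansion and a
# logistic hazard model (a neural network could replace the classifier).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 500
age = rng.uniform(30, 85, n)
followup = rng.integers(1, 11, n)                 # observed intervals (years), 1..10
event = rng.random(n) < 0.6                       # True = event observed, False = censored

# Person-period expansion: one row per subject per interval at risk.
rows = []
for i in range(n):
    for interval in range(1, followup[i] + 1):
        rows.append({"interval": interval, "age": age[i],
                     "event": int(event[i] and interval == followup[i])})
pp = pd.DataFrame(rows)

X = pd.get_dummies(pp["interval"], prefix="t").join(pp[["age"]])
clf = LogisticRegression(max_iter=1000).fit(X, pp["event"])

# Estimated hazard in interval 3 for a 70-year-old (hypothetical covariate values).
x_new = pd.DataFrame(np.zeros((1, X.shape[1])), columns=X.columns)
x_new["t_3"], x_new["age"] = 1, 70
print(f"estimated discrete hazard: {clf.predict_proba(x_new)[0, 1]:.3f}")
```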
Estilos ABNT, Harvard, Vancouver, APA, etc.
29

Byrne, Patricia Hiromi. "Development of an advisory system for indoor radon mitigation". PDXScholar, 1991. https://pdxscholar.library.pdx.edu/open_access_etds/4263.

Texto completo da fonte
Resumo:
A prototype hybrid knowledge-based advisory system for indoor radon mitigation has been developed to assist Pacific Northwest mitigators in the selection and design of mitigation systems for existing homes. The advisory system employs a heuristic inferencing strategy to determine which mitigation techniques are applicable, and applies procedural methods to perform the fan selection and cost estimation for particular techniques. The rule base has been developed employing knowledge in existing publications on radon mitigation. Additional knowledge has been provided by field experts. The benefits of such an advisory system include uniform record-keeping and consistent computations for the user, and verification of approved radon mitigation methods.
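As a toy illustration of the hybrid idea described above, heuristic rules that select a mitigation technique followed by a procedural cost estimate, here is a minimal sketch; the rules, techniques and figures are invented for illustration and are not the advisory system's knowledge base.

# Toy hybrid advisory sketch: heuristic rules pick a mitigation technique,
# then a procedural routine estimates cost. All rules and numbers are
# invented for illustration; they are not the thesis's rule base.
def select_technique(house):
    if house["foundation"] == "basement" and house["has_sump"]:
        return "sump-hole depressurization"
    if house["foundation"] in ("basement", "slab-on-grade"):
        return "sub-slab depressurization"
    if house["foundation"] == "crawlspace":
        return "sub-membrane depressurization"
    return "sealing and increased ventilation"

def estimate_cost(technique, floor_area_m2):
    base = {"sump-hole depressurization": 900,
            "sub-slab depressurization": 1200,
            "sub-membrane depressurization": 1500,
            "sealing and increased ventilation": 600}
    return base[technique] + 2.5 * floor_area_m2   # crude per-area adjustment

house = {"foundation": "basement", "has_sump": False}
technique = select_technique(house)
print(technique, "~$", round(estimate_cost(technique, 120)))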
Estilos ABNT, Harvard, Vancouver, APA, etc.
30

Yang, Shaojie. "A Data Augmentation Methodology for Class-imbalanced Image Processing in Prognostic and Health Management". University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin161375046654683.

Texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
31

Fritz, Godfried. "The relationship of sense of coherence to health and work in data processing personnel". Master's thesis, University of Cape Town, 1989. http://hdl.handle.net/11427/16845.

Texto completo da fonte
Resumo:
Bibliography: pages 80-86.
The aim of the present study was to test a model of stress and to examine whether the theoretical construct of sense of coherence (SOC) moderated the relationship between stressors and health-related and work-related outcomes. The construct of SOC was identified by an Israeli medical sociologist, Antonovsky. He maintained that the current focus of research on stress is largely pathogenic in nature and suggested that it would be of value to shift research more towards that which identifies the origins of health. He consequently developed the term "salutogenesis", which requires people to focus on those factors which promote well-being. He also argued that people are not either sick or well, but rather are located on a continuum between health-ease and dis-ease. With respect to their health, persons will find themselves somewhere along this continuum, where they may shift between the two positions. He then suggests that certain factors contribute to facilitating movement along this continuum. These factors together form a construct which he calls the SOC, which is comprised of core components. He hypothesizes that someone with a strong SOC is likely to make better sense of the world around him/her, thereby engendering resilience towards impinging stressors. The person with a weak SOC is likely to capitulate to these stressors more readily and, by succumbing to them, is more likely to move to the dis-ease end of the continuum. This study investigated the following research questions, namely, whether (1) the stressors were related to the stress outcomes, (2) the SOC was related to the stressors and outcomes, and (3) the SOC moderated the relationships between stressors and outcomes. In the present study the subjects were drawn from all data processing professionals in a large financial organisation. The respondents (n = 194) replied to a questionnaire which contained scales measuring a variety of job-related stressors, an SOC scale, as well as job-related and health-related outcome variables. Intercorrelations between the stressor, moderator and outcome variables were calculated. Other statistical procedures that were utilized were subgroup analyses and moderated multiple regression analyses. Partial support for all three research questions was obtained. Four of the six stressors were found to correlate significantly with somatic complaints, suggesting that stressors result in persons feeling the effects of stress and reporting them physically. The SOC was found to relate to some of the stressors and outcome variables, lending partial support to an interpretation of the SOC as having a main-effect relationship with stressor and outcome variables. The subgroup analyses showed that, out of a possible 54 relationships, the SOC moderated only seven, while the moderated multiple regression (MMR) analyses showed that, out of the 54 possible relationships, the SOC moderated 12: six involving health-related outcomes and six involving work-related outcomes. These findings partially support research question 3, which examined whether the SOC would moderate relationships between stressors and outcome variables. The study concludes with a discussion of the findings, their implications, and the limitations of this research.
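Moderation of the kind tested with MMR is usually expressed as a stressor-by-SOC interaction term; a significant interaction coefficient indicates that the SOC changes the stressor-outcome slope. Below is a minimal sketch with simulated data; the variable names are invented, not the study's scales.

# Sketch of a moderated multiple regression: an outcome is regressed on a
# stressor, the SOC score and their interaction; a significant interaction
# coefficient indicates moderation. Data and variable names are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 194
df = pd.DataFrame({
    "stressor": rng.normal(size=n),         # e.g. a role-overload score
    "soc": rng.normal(size=n),              # sense of coherence score
})
# Simulated outcome in which a high SOC dampens the stressor effect.
df["somatic"] = (0.5 * df["stressor"] - 0.3 * df["soc"]
                 - 0.2 * df["stressor"] * df["soc"]
                 + rng.normal(scale=1.0, size=n))

model = smf.ols("somatic ~ stressor * soc", data=df).fit()
print(model.summary().tables[1])   # inspect the stressor:soc coefficient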
Estilos ABNT, Harvard, Vancouver, APA, etc.
32

Picard, Charlotte. "Climate change and dengue: analysis of historical health and environmental data for Peru". Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=106459.

Texto completo da fonte
Resumo:
Dengue, a mosquito-borne viral infection that is the most common cause of hemorrhagic fever globally, is rapidly spreading worldwide. An estimated 40% of the world's population is at risk for this disease, which is transmitted by Aedes sp. mosquitoes. The Aedes mosquito-dengue virus lifecycle varies with temperature, and climate change may increase the risk of dengue epidemics in the future. This study examined whether changes in sea surface temperature (SST) along the Peruvian coast were associated with dengue incidence from 2002-2010. In Peru the effects of the El Niño cycle on weather conditions are pronounced, providing an ideal place to study fluctuations in climate and dengue incidence. Negative binomial models were used to examine the relationship between dengue cases and changes in SST across regions of Peru. Spearman's rank test was used to determine the lagged SST term that was most correlated with dengue incidence in each region. The negative binomial models included terms for the optimum lagged SST and a term for the trend of increasing dengue incidence over the study period. The magnitude and sign of the correlation coefficient between dengue and SST varied among the 15 regions of Peru with dengue cases. Nine provinces had positive correlations between the two, while six had negative correlations. The optimum lag ranged from 0 months to 6 months. In all of the regions, lagged SST was a significant predictor of dengue cases in the negative binomial model. The relationship between dengue and sea surface temperature in Peru appears to be significant across the country. Given the varied nature of the relationship between regions, it is not possible to make accurate generalisations about this relationship in Peru. Accounting for additional climatic variables such as precipitation may help in improving the predictive model.
La dengue, une infection virale transmise par les moustiques qui est la cause la plus fréquente de fièvre hémorragique au niveau mondial, se propage rapidement dans le monde entier. On estime que 40% de la population mondiale est à risque pour cette maladie qui est transmise par les moustiques Aedes sp. Le cycle de vie du virus de la dengue chez les moustiques Aedes varie avec la température, et le changement climatique peut accroître le risque d'épidémies de dengue dans le futur. Nous avons examiné si les changements de température de surface de la mer (SST) le long de la côte péruvienne ont été associés à l'incidence de la dengue de 2002 à 2010. Au Pérou, les effets du cycle El Niño sur les conditions météorologiques sont prononcés, offrant un endroit idéal pour étudier les fluctuations du climat et de l'incidence de la dengue. Des modèles binomiaux négatifs ont été utilisés pour examiner la relation entre les cas de dengue et les changements de SST dans toutes les régions du Pérou. Le test de Spearman a été utilisé pour déterminer le terme de SST retardé qui était le plus corrélé avec l'incidence de la dengue dans chaque région. Les modèles binomiaux négatifs comprenaient des termes pour la SST retardée optimale et un terme pour la tendance à la hausse de l'incidence de la dengue au cours de la période d'étude. L'amplitude et le signe du coefficient de corrélation entre la dengue et la SST varient entre les 15 régions du Pérou. Neuf provinces avaient des corrélations positives entre les deux, tandis que six avaient des corrélations négatives. Le décalage optimal varie de 0 à 6 mois. Dans toutes les régions, la SST retardée était un prédicteur important des cas de dengue dans le modèle binomial négatif. La relation entre la dengue et la température de surface de la mer au Pérou semble être significative à travers le pays. Étant donné la nature variée de la relation entre les régions, il n'est pas possible de faire des généralisations exactes à propos de cette relation au Pérou. La prise en compte d'autres variables climatiques, comme les précipitations, pourrait aider à améliorer le modèle prédictif.
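A minimal sketch of the modelling strategy summarised above, selecting the SST lag by Spearman correlation and then fitting a negative binomial model with the lagged SST and a trend term, is shown below. The monthly data are simulated and the fixed dispersion parameter is an assumption; this is not the thesis's code or the Peruvian data.

# Sketch: choose the SST lag most correlated (Spearman) with dengue counts,
# then fit a negative binomial model with lagged SST and a linear trend.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
months = 108                                   # 2002-2010, monthly
sst = 20 + 2 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 0.5, months)
cases = rng.poisson(lam=np.exp(1.0 + 0.15 * np.roll(sst, 3) + 0.01 * np.arange(months)))

# Pick the lag (0-6 months) with the strongest Spearman correlation.
lags = range(0, 7)
best_lag = max(lags, key=lambda L: abs(spearmanr(cases[L:], sst[:months - L])[0]))

y = cases[best_lag:]
X = pd.DataFrame({"sst_lag": sst[:months - best_lag],
                  "trend": np.arange(len(y))})
X = sm.add_constant(X)
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print("best lag (months):", best_lag)
print(nb.params)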
Estilos ABNT, Harvard, Vancouver, APA, etc.
33

Ioannidou, Despoina. "Characterization of environmental inequalities due to Polyaromatic Hydrocarbons in France : developing environmental data processing methods to spatialize exposure indicators for PAH substances". Thesis, Paris, CNAM, 2018. http://www.theses.fr/2018CNAM1176/document.

Texto completo da fonte
Resumo:
La réduction des inégalités d'exposition environnementale constitue un axe majeur en santé publique en France, comme en témoignent les priorités des différents Plans Nationaux Santé Environnement (PNSE). L'objectif de cette thèse est de développer une approche intégrée pour la caractérisation des inégalités environnementales et l'évaluation de l'exposition spatialisée de la population aux HAP en France. Les données produites dans le cadre des réseaux de surveillance de la qualité des milieux environnementaux sont le reflet de la contamination réelle des milieux et de l'exposition globale des populations. Toutefois, elles ne présentent généralement pas une représentativité spatiale suffisante pour caractériser finement les expositions environnementales, ces réseaux n'ayant pas été initialement conçus dans cet objectif. Des méthodes statistiques sont développées pour traiter les bases de données d'entrée (concentrations environnementales dans l'eau, l'air et le sol) et les rendre pertinentes vis-à-vis des objectifs définis de caractérisation de l'exposition. Un modèle multimédia d'exposition, interfacé avec un Système d'Information Géographique pour intégrer les variables environnementales, est développé pour estimer les doses d'exposition liées à l'ingestion d'aliments, d'eau de consommation, de sol et à l'inhalation de contaminants atmosphériques. La méthodologie a été appliquée pour trois Hydrocarbures Aromatiques Polycycliques (benzo[a]pyrène, benzo[ghi]pérylène et indéno[1,2,3-cd]pyrène) sur l'ensemble du territoire français. Les résultats permettent de cartographier des indicateurs d'exposition, d'identifier les zones de surexposition et de caractériser les déterminants environnementaux. Dans une logique de caractérisation de l'exposition, la spatialisation des données issues des mesures environnementales pose un certain nombre de questions méthodologiques qui confèrent aux cartes réalisées de nombreuses incertitudes et limites relatives à l'échantillonnage et aux représentativités spatiales et temporelles des données. Celles-ci peuvent être réduites par l'acquisition de données supplémentaires et par la construction de variables prédictives des phénomènes spatiaux et temporels considérés. Les outils de traitement statistique de données développés dans le cadre de ces travaux seront intégrés dans la plateforme PLAINE pour être déclinés sur d'autres polluants en vue de prioriser les mesures de gestion à mettre en œuvre.
Reducing environmental exposure inequalities has become a major focus of public health efforts in France, as evidenced by the French national action plans for health and the environment. The aim of this thesis is to develop an integrated approach to characterize environmental inequalities and evaluate the spatialized exposure of the population to PAHs in France. The data produced by the environmental media quality monitoring networks reflect the actual contamination of the environment and the overall exposure of the populations. However, they do not always provide an adequate spatial resolution to characterize environmental exposures, as they are usually not assembled for this specific purpose. Statistical methods are employed to process the input databases (environmental concentrations in water, air and soil) with the objective of characterizing exposure. A multimedia model interfaced with a GIS allows the integration of environmental variables in order to yield exposure doses related to ingestion of food, water and soil as well as inhalation of atmospheric contaminants. The methodology was applied to three Polycyclic Aromatic Hydrocarbon substances (benzo[a]pyrene, benzo[ghi]perylene and indeno[1,2,3-cd]pyrene) across France. The results obtained made it possible to map exposure indicators, identify areas of overexposure and characterize environmental determinants. In the context of exposure characterization, the direct spatialization of available data from environmental measurement datasets poses a certain number of methodological questions, which lead to uncertainties related to the sampling and to the spatial and temporal representativeness of the data. These could be reduced by acquiring additional data or by constructing predictive variables for the spatial and temporal phenomena considered. The data processing algorithms and exposure calculations carried out in this work will be integrated in the French coordinated integrated environment and health platform (PLAINE) in order to be applied to other pollutants and to prioritize preventative actions.
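Multimedia exposure models of the kind described here ultimately aggregate pathway-specific doses into an exposure indicator. The sketch below shows a generic average-daily-dose aggregation; every concentration, intake rate and the body weight are placeholder values, not parameters of the thesis model or the PLAINE platform.

# Generic multimedia exposure-dose sketch: the average daily dose (ADD) for a
# substance is summed over ingestion and inhalation pathways as
# ADD = sum_i C_i * IR_i / BW. All values below are placeholders.
PATHWAYS = {
    # pathway: (concentration, intake rate)
    "soil ingestion":  (0.2e-3, 50e-6),   # mg/kg soil, kg soil/day
    "water ingestion": (5e-6,   1.5),     # mg/L, L/day
    "food ingestion":  (1e-4,   1.2),     # mg/kg food, kg food/day
    "air inhalation":  (2e-7,   15.0),    # mg/m3, m3/day
}
BODY_WEIGHT = 70.0                         # kg

def average_daily_dose(pathways, bw):
    """Return total dose (mg per kg body weight per day) and the per-pathway split."""
    doses = {name: conc * rate / bw for name, (conc, rate) in pathways.items()}
    return sum(doses.values()), doses

total, by_pathway = average_daily_dose(PATHWAYS, BODY_WEIGHT)
for name, dose in by_pathway.items():
    print(f"{name:16s} {dose:.3e} mg/kg/day")
print(f"{'total':16s} {total:.3e} mg/kg/day")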
Estilos ABNT, Harvard, Vancouver, APA, etc.
34

Maher, Elizabeth. "INVESTIGATION OF ENVIRONMENTAL CADMIUM SOURCES IN EASTERN KENTUCKY". UKnowledge, 2018. https://uknowledge.uky.edu/mng_etds/41.

Texto completo da fonte
Resumo:
Utilizing data collected by the University of Kentucky Lung Cancer Research Initiative (LCRI), this study investigated potential mining-related sources for the elevated levels of cadmium in Harlan and Letcher counties. Statistical analyses for this study were conducted using SAS. A number of linear regression models and logarithmic models were used to evaluate the significance of the data. The linear regression models consisted of both simple and multivariate types, with the simple models seeking to establish significance between the potential sources and urine cadmium levels, and the multivariate models seeking both to identify any statistically significant linear relationships between source types and to establish a relationship between the potential sources and the urine cadmium levels. The analysis began by investigating which exposure route was associated with the increased levels of cadmium exposure; the routes considered were ingestion through water sources and inhalation of dust. Of these two, dust showed the higher level of correlation. The second step was to analyze a number of sources of dust, particularly those related to mining practices in the area. These included proximity to the Extended Haul Road System, secondary haul roads, railroads, and processing plants. Of the variables in the analysis, Extended Haul Roads, secondary haul roads, and railroads showed no correlation, and only the proximity to processing plants showed statistical significance.
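For illustration of the regression structure described above (simple models per source, then a multivariate model), here is a minimal sketch with simulated data; the column names and effect sizes are invented, not the LCRI variables.

# Illustrative structure of the simple and multivariate regressions described
# above: urine cadmium regressed on proximity to potential sources.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 150
df = pd.DataFrame({
    "dist_processing_km": rng.uniform(0.5, 30, n),
    "dist_haul_road_km":  rng.uniform(0.1, 15, n),
    "dist_railroad_km":   rng.uniform(0.1, 20, n),
})
# Simulated urine cadmium that declines with distance from processing plants.
df["log_urine_cd"] = -0.04 * df["dist_processing_km"] + rng.normal(0, 0.3, n)

simple = smf.ols("log_urine_cd ~ dist_processing_km", data=df).fit()
multi = smf.ols("log_urine_cd ~ dist_processing_km + dist_haul_road_km"
                " + dist_railroad_km", data=df).fit()
print(simple.pvalues["dist_processing_km"], multi.rsquared)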
Estilos ABNT, Harvard, Vancouver, APA, etc.
35

Iakovidis, Iason. "On nonstationarity from operational and environmental effects in structural health monitoring bridge data". Thesis, University of Sheffield, 2018. http://etheses.whiterose.ac.uk/22569/.

Texto completo da fonte
Resumo:
Structural Health Monitoring (SHM) describes a set of activities that can be followed in order to collect data from an existing structure, generate data-based information about its current condition, identify the presence of any signs of abnormality and forecast its future response. These activities, among others, include instrumentation, data acquisition, processing, generation of diagnostic tools, as well as transmission of information to engineers, owners and authorities. SHM and, more specifically, continuous monitoring can provide numerous measures, which can generally be classified into three categories: vibration-based, which includes natural frequencies, mode shapes and damping ratios; component-based, such as strains, tensions and deflections; and environmental and operational variations (EOVs), associated with temperature, wind, traffic, humidity and others. One of the main technical problems that SHM has to tackle is that of data normalisation. In abstract terms, this describes the impact that EOVs can have on SHM measures. In many cases, with interest placed on bridges, it has been observed that EOVs introduce nonstationarity into SHM signals that can mask the variability associated with the presence of damage, making damage detection attempts difficult. Hence, it is desirable to quantify the impacts of EOVs on damage-sensitive features and project them out, using methods such as cointegration, Principal Component Analysis (PCA) or others, in order to achieve a stationary signal. This type of signal can be assessed over time using tools such as statistical process control (SPC) charts to identify the existence of novelty, which can be linked with damage. As one can understand from the latter, it is important to detect the presence of nonstationarity in SHM signals and identify its sources. However, this is not a straightforward procedure, and one important question that needs to be answered is how one can judge whether a signal is stationary or not. In this work, this question is discussed, focusing on the definition of weak stationarity and the assumptions under which this judgement holds. In particular, the data coming from SHM are finite samples. Therefore, the mean and variance of a signal can be tracked using a sequence of moving windows, something that needs a prior determination of the width of the window. However, the major concern here is that SHM signals can be characterised as periodically-correlated or cyclostationary. In such cases, it seems better to use more advanced statistical tools to assess a signal's nonstationarity. More specifically, nonstationarity tests coming from the context of econometrics and time-series analysis can be employed. In order to use such proxies more extensively, one should build trust in their indications by understanding the mechanism under which they perform. This work concentrates on the Augmented Dickey-Fuller (ADF) nonstationarity test, and emphasis is placed on the hypothesis (unit root) under which it performs its assessment. In brief, a series of simulations are generated and, based on dimensional analysis, it is shown that the ADF test essentially counts the number of cycles/periods of the dominant periodic component. Its indications depend on the number of observations/cycles, the normalised frequency of the signal, the sampling rate and the signal-to-noise ratio (SNR).
The most important conclusion made is that, knowing the sampling frequency of any given signal, a critical frequency in Hz can be found, derived from the critical normalised one as a function of the number of cycles, which can be directly used to judge whether the signal is stationary or not. In other words, this investigation provides an answer to the question: after how many cycles of continuous monitoring (i.e. days) can an SHM signal be judged as stationary? As well as considering nonstationarity in a general way, this thesis returns to the main issue of data normalisation. To begin with, a laboratory test is performed, at the laboratory (Jonas lab) of Sheffield University, on an aluminium truss bridge model manufactured there. In particular, this involved vibration analysis of the truss bridge inside an environmental chamber, which simulated varying temperature conditions from -10 to 20 deg. Celsius, while damage was introduced on the structure by the removal of bolts and connecting brackets in two locations of the model. This experiment provided interesting results with which to discuss further the impact of EOVs on data coming from the monitoring of a small-scale structure. After that, the thesis discusses the use of Johansen's approach to cointegration in the context of SHM, demonstrates its use on the laboratory truss bridge data and provides a review of the available methods that can be used to monitor the cointegration residual. The latter is the stationary signal provided by cointegration, which is free from EOVs and suitable for novelty detection. The methodologies reviewed are various SPC charts, while the use of the ADF test is also explored, with extensive discussion. Furthermore, an important conclusion from the SHM literature is that the impact of EOVs on SHM signals can occur on widely disparate time scales. Therefore, the quantification and elimination of these impacts from signals is not an easy procedure and prior knowledge is needed. For such purposes, refined means originating from the field of signal processing can be used within SHM. Of particular interest here is the concept of multiresolution analysis (MRA), which has been used in SHM in order to decompose a given signal into its frequency components (different time scales) and evaluate the damage sensitivity of each one, employing Johansen's approach to cointegration, which is able to project out the impact of EOVs from multiple SHM series. A more principled way to perform MRA is proposed here, in order to decompose SHM signals, by introducing two additional steps. The first step is the ADF test, which can be used to assess each one of the MRA levels in terms of nonstationarity. In this way, a critical decomposition level (L*) can be found and used to decompose the original SHM signal into a non-stationary and a stationary part. The second step introduced includes the use of autocorrelation functions (ACFs) in order to test the stationary MRA levels and identify those that can be considered delta-correlated. These levels can be used to form a noisy component within the stationary one. Assuming that all the aforementioned steps are confirmed, the original signal can now be decomposed into a stationary, a mean, a non-stationary and a noisy component. The proposed decomposition can be of great interest not only for SHM purposes, but also in the general context of time-series analysis, as it provides a principled way to perform MRA.
The proposed analysis is demonstrated on natural frequency and temperature data of the Z24 Bridge. All in all, the thesis tries to answer the following questions: 1) How can an SHM signal be judged as non-stationary/stationary, and under which assumptions? 2) After how many cycles of continuous monitoring does an SHM signal that is initially non-stationary become stationary? 3) What are the main drivers of this nonstationarity (i.e. EOVs, abnormality/damage or others)? 4) How can one distinguish the effect of EOVs from that of abnormality/damage? 5) How can one project out the confounding influence of EOVs from an SHM signal and provide a signal that is suitable for novelty detection? 6) Is it possible to decompose an SHM signal and study each one of its components separately? 7) Which of these components are mostly affected by EOVs, which by damage, and which do not include important information in terms of novelty detection? Understanding and answering all the aforementioned questions can help in identifying signals that can be monitored over time or in data windows, ensuring that stationarity is achieved, and employing methodologies such as statistical process control (SPC) for novelty detection.
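For readers unfamiliar with the tools named above, the sketch below shows how a rolling mean/variance scan and the Augmented Dickey-Fuller (ADF) test are typically applied to a monitored signal, using a synthetic series with a yearly (temperature-like) cycle rather than the Z24 or laboratory data; the window width and test options are arbitrary choices.

# Sketch of the stationarity checks discussed above: rolling statistics and
# the ADF test applied to a synthetic daily signal with a yearly cycle.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
n_days = 3 * 365
t = np.arange(n_days)
# Daily natural-frequency-like series: yearly cycle plus noise.
signal = 4.0 + 0.05 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 0.01, n_days)
series = pd.Series(signal)

# Rolling statistics (the window width must be chosen beforehand).
rolling_mean = series.rolling(window=90).mean()
rolling_var = series.rolling(window=90).var()
print("rolling mean spread:", float(rolling_mean.max() - rolling_mean.min()))
print("rolling variance spread:", float(rolling_var.max() - rolling_var.min()))

# ADF test: the null hypothesis is a unit root (non-stationarity).
adf_stat, p_value, *_ = adfuller(series, autolag="AIC")
print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")
# A small p-value rejects the unit-root null; as the abstract argues, the
# verdict depends on how many cycles of the periodic component are observed.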
Estilos ABNT, Harvard, Vancouver, APA, etc.
36

Kafle, Ram C. "Trend Analysis and Modeling of Health and Environmental Data: Joinpoint and Functional Approach". Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5246.

Texto completo da fonte
Resumo:
The present study is divided into two parts: the first develops statistical analysis and modeling of mortality (or incidence) trends using Bayesian joinpoint regression, and the second fits differential equations from time series data to derive the rate of change of carbon dioxide in the atmosphere. The joinpoint regression model identifies significant changes in the trends of the incidence, mortality, and survival of a specific disease in a given population. The Bayesian approach to joinpoint regression is widely used in modeling statistical data to identify the points in the trend where significant changes occur. The purpose of the present study is to develop an age-stratified Bayesian joinpoint regression model to describe mortality trends, assuming that the observed counts are probabilistically characterized by the Poisson distribution. The proposed model is based on Bayesian model selection criteria, with the smallest number of joinpoints that is sufficient to explain the Annual Percentage Change (APC). The prior probability distributions are chosen in such a way that they are automatically derived from the model index contained in the model space. The proposed model and methodology estimate the age-adjusted mortality rates in different epidemiological studies to compare the trends while accounting for the confounding effects of age. The future mortality rates are predicted using the Bayesian Model Averaging (BMA) approach. As an application of Bayesian joinpoint regression, we first study the childhood brain cancer mortality rates (non-age-adjusted rates) and their Annual Percentage Change (APC) per year using the existing Bayesian joinpoint regression models in the literature. We use annual observed mortality counts of children ages 0-19 from 1969-2009 obtained from the Surveillance Epidemiology and End Results (SEER) database of the National Cancer Institute (NCI). The predictive distributions are used to predict the future mortality rates. We also compare this result with the mortality trend obtained using the joinpoint software of the NCI. To fit the age-stratified model, we use the cancer mortality counts of adult lung and bronchus cancer (25-85+ years) and brain and other Central Nervous System (CNS) cancer (25-85+ years) patients obtained from the SEER database of the NCI. The second part of this study is the statistical analysis and modeling of noisy data using a functional data analysis approach. Carbon dioxide is one of the major contributors to global warming. In this study, we develop a system of differential equations using time series data of the major sources of the significant contributing variables of carbon dioxide in the atmosphere. We define the differential operator as a data smoother and use the penalized least squares fitting criterion to smooth the data. Finally, we optimize the profile error sum of squares to estimate the necessary differential operator. The proposed models give an estimate of the rate of change of carbon dioxide in the atmosphere at a particular time. We apply the model to fit carbon dioxide emission data in the continental United States. The data set is obtained from the Carbon Dioxide Information Analysis Center (CDIAC), the primary climate-change data and information analysis center of the United States Department of Energy.
The first four chapters of this dissertation contribute to the development and application of joinpoint regression, and the last chapter discusses the statistical modeling and application of differential equations through data using a functional data analysis approach.
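The Annual Percentage Change (APC) that joinpoint models report is, within each trend segment, a simple transformation of the slope of a log-linear fit. The sketch below illustrates only that calculation on simulated rates; it is not the Bayesian joinpoint machinery described in the dissertation.

# Sketch of how the APC is obtained within one segment of a joinpoint model:
# fit log(rate) ~ year and transform the slope.
import numpy as np

years = np.arange(1969, 2010)
rng = np.random.default_rng(5)
# Simulated age-adjusted mortality rate declining roughly 1.5% per year.
rates = 3.0 * np.exp(-0.015 * (years - years[0])) * np.exp(rng.normal(0, 0.02, years.size))

slope, intercept = np.polyfit(years, np.log(rates), 1)
apc = 100.0 * (np.exp(slope) - 1.0)
print(f"estimated APC = {apc:.2f}% per year")
# In a joinpoint model the same calculation is applied separately to each
# segment between estimated change points.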
Estilos ABNT, Harvard, Vancouver, APA, etc.
37

Correia, Andrew William. "Estimating the Health Effects of Environmental Exposures: Statistical Methods for the Analysis of Spatio-temporal Data". Thesis, Harvard University, 2013. http://dissertations.umi.com/gsas.harvard:10828.

Texto completo da fonte
Resumo:
In the field of environmental epidemiology, a great deal of care is required in constructing models that accurately estimate the effects of environmental exposures on human health, because the data available to researchers for estimating these effects are almost always observational, making it difficult to adequately control for all potential confounders, both measured and unmeasured. Here, we tackle three different problems in which the goal is to accurately estimate the effect of an environmental exposure on various health outcomes.
Estilos ABNT, Harvard, Vancouver, APA, etc.
38

Wong, Sze-nga, e 王絲雅. "The impact of electronic health record on diabetes management : a systematic review". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hdl.handle.net/10722/193850.

Texto completo da fonte
Resumo:
Objectives: To investigate the impact of the electronic health record (EHR) on diabetes management by examining the effectiveness of EHR implementation, the quality of care and the cost-effectiveness of EHR use. Methods: Three databases, PubMed, Ovid Medline and Google Scholar, were searched with specific keyword combinations including electronic medical record, electronic health record, and diabetes. Quality appraisal and data extraction were conducted on literature that met the inclusion criteria. Results: Ten studies, with a total of 204,251 participants with diabetes, were included in this review. All subjects, with similar demographic and clinical characteristics, were from clinic and primary care settings where an EHR was used. Different outcome measures were compared to evaluate the effectiveness of EHR on quality of care and cost-effectiveness. Discussion: The impact of EHR on the effectiveness of diabetes management, potential barriers to adoption and the limitations of EHR implementation were discussed. These suggest that further research is needed to provide stronger evidence for widespread use of EHR in Hong Kong as a future direction in public health. Conclusion: In this systematic review, EHR showed potential benefit in improving the quality of care and reducing health care expenditure in the long run. Patient safety and efficiency are yet to be covered in the studies. Further research is needed on the acceptability and applicability of the use of EHR in Hong Kong.
Public Health
Master
Master of Public Health
Estilos ABNT, Harvard, Vancouver, APA, etc.
39

Mxoli, Ncedisa Avuya Mercia. "Guidelines for secure cloud-based personal health records". Thesis, Nelson Mandela Metropolitan University, 2017. http://hdl.handle.net/10948/14134.

Texto completo da fonte
Resumo:
Traditionally, health records have been stored in paper folders at the physician’s consulting rooms – or at the patient’s home. Some people stored the health records of their family members, so as to keep a running history of all the medical procedures they went through, and what medications they were given by different physicians at different stages of their lives. Technology has introduced better and safer ways of storing these records, namely, through the use of Personal Health Records (PHRs). With time, different types of PHRs have emerged, i.e. local, remote server-based, and hybrid PHRs. Web-based PHRs fall under the remote server-based PHRs; and recently, a new market in storing PHRs has emerged. Cloud computing has become a trend in storing PHRs in a more accessible and efficient manner. Despite its many benefits, cloud computing has many privacy and security concerns. As a result, the adoption rate of cloud services is not yet very high. A qualitative and exploratory research design approach was followed in this study, in order to reach the objective of proposing guidelines that could assist PHR providers in selecting a secure Cloud Service Provider (CSP) to store their customers’ health data. The research methods that were used include a literature review, systematic literature review, qualitative content analysis, reasoning, argumentation and elite interviews. A systematic literature review and qualitative content analysis were conducted to examine those risks in the cloud environment that could have a negative impact on the secure storing of PHRs. PHRs must satisfy certain dimensions, in order for them to be meaningful for use. While these were highlighted in the research, it also emerged that certain risks affect the PHR dimensions directly, thus threatening the meaningfulness and usability of cloud-based PHRs. The literature review revealed that specific control measures can be adopted to mitigate the identified risks. These control measures form part of the material used in this study to identify the guidelines for secure cloud-based PHRs. The guidelines were formulated through the use of reasoning and argumentation. After the guidelines were formulated, elite interviews were conducted, in order to validate and finalize the main research output: i.e. guidelines. The results of this study may alert PHR providers to the risks that exist in the cloud environment; so that they can make informed decisions when choosing a CSP for storing their customers’ health data.
Estilos ABNT, Harvard, Vancouver, APA, etc.
40

Wrable, Madeline. "Exploring the Association Between Remotely Sensed Environmental Parameters and Surveillance Disease Data| An Application to the Spatiotemporal Modelling of Schistosomiasis in Ghana". Thesis, Tufts University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10276352.

Texto completo da fonte
Resumo:

Schistosomiasis control in sub-Saharan Africa is enacted primarily through mass drug administration, where predictive modeling plays an important role in filling knowledge gaps in the distribution of disease burden. Remote sensing (RS) satellite imagery is used to predictively model infectious disease transmission in schistosomiasis, since transmission requires environmental conditions to sustain specific freshwater snail species. Surveys are commonly used to obtain health outcome data, and while they provide accurate estimates of disease in a specific time and place, the resources required make performing surveys at large spatiotemporal scales impractical. Ongoing national surveillance data in the form of reported counts from health centers is conceptually better suited to utilizing the full spatiotemporal capabilities of publically available RS data, as most open source satellite products can be utilized as global continuous surfaces with historical (in some cases 40-year) timespans. In addition RS data is often in the public domain and takes at most a few days to order. Therefore, the use of surveillance data as an initial descriptive approach of mapping areas of high disease prevalence (often with large focal variation present) could then be followed up with more resource intensive methods such as health surveys paired with commercial, high spatial resolution imagery. Utilization of datasets and technologies more cost effectively would lead to sustainable control, a precursor to eradication (Rollinson et al. 2013).

In this study, environmental parameters were chosen for their historical use as proxies for climate. They were used as predictors and as inputs to a novel climate classification technique, which allowed for qualitative and quantitative analysis of broad climatic trends, and they were regressed on 8 years of Ghanaian national surveillance health data. Mixed effect modeling was used to assess the relationship between reported disease counts and remote sensing data over space and time. A downward trend was observed in the reported disease rates (~1% per month). Seasonality was present, with two peaks (March and September) in the north of the country, a single peak (July) in the middle of the country, and lows consistently observed in December/January. Trend and seasonal patterns of the environmental variables and their associations with reported incidence varied across the defined climate zones. Environmental predictors explained little of the variance and did not improve model fit significantly, unlike district-level effects, which explained most of the variance. Use of climate zones showed potential and should be explored further. Overall, surveillance of neglected tropical diseases in low-income countries often suffers from incomplete records or missing observations. However, with systematic improvements, these data could potentially offer opportunities to more comprehensively analyze disease patterns by combining wide geographic coverage and varying levels of spatial and temporal aggregation. The approach can serve as a decision support tool and offers the potential for use with other climate-sensitive diseases in low-income settings.

Estilos ABNT, Harvard, Vancouver, APA, etc.
41

Poon, Wai-yin, e 潘慧賢. "Review of the implementation of electronic health record in Hong Kong". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B50257456.

Texto completo da fonte
Resumo:
eHR is one of the main development areas in the healthcare sector to ensure that a high-quality and effective healthcare service is provided in Hong Kong. However, the present development of eHR in Hong Kong is mainly focused on public-sector healthcare providers, the hospitals and clinics under the HA and the DH. Most of the private hospitals and clinics are still using paper-based health records. Although some of them may have implemented their own eHR systems, there is no interconnection with other healthcare providers. In this dissertation, eHR system development in Hong Kong for both the public and private sectors is reviewed, to trace the development of eHR and various clinical management systems, as well as the problems faced by healthcare workers and patients. The HKSAR government has also shown support for eHR development in both governance and financial aspects. To facilitate the coordination of developing an eHR sharing system among different healthcare providers, an eHR Office has been set up under the Food and Health Bureau for this purpose. The eHR Office monitors the progress of the eHR development process. As the HA has a well-developed, world-known Clinical Management System (CMS), which handles patient records in electronic form in public hospitals daily, the HA acts as a major advisor in eHR implementation for the HKSAR. Data privacy and data security issues are the major concerns of healthcare workers and patients. The Personal Data (Privacy) Ordinance (PDPO) provides legal protection of data privacy. However, no legislation on data privacy has yet been specified for eHR. Meanwhile, various physical security protections have been adopted in the implementation of eHR on the technology side, which provide a certain level of data security to the system. Chinese Medicine has developed rapidly in recent years, and it is expected that Chinese Medicine will become one of the core service areas in the healthcare sector in Hong Kong, sharing healthcare services with Western Medicine. However, there is no integration between Chinese Medicine and Western Medicine in the current eHR sharing system development. eHR development involves huge investment; to evaluate the feasibility of developing the eHR system, a scientific tool is recommended, and a cost-benefit analysis (CBA) is hence conducted for eHR in Hong Kong to compare the effectiveness of eHR with traditional paper-based health records in the healthcare setting. As recommended by the CBA, the eHR system should be developed with consideration of system flexibility and adaptability for all healthcare providers. On the other hand, the implementation of the eHR system will be a long and complex process and will require the contribution and participation of all parties.
Politics and Public Administration
Master
Master of Public Administration
Estilos ABNT, Harvard, Vancouver, APA, etc.
42

Kala, Abhishek K. "Spatially Explicit Modeling of West Nile Virus Risk Using Environmental Data". Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc822841/.

Texto completo da fonte
Resumo:
West Nile virus (WNV) is an emerging infectious disease that has widespread implications for public health practitioners across the world. Within a few years of its arrival in the United States the virus had spread across the North American continent. This research focuses on the development of a spatially explicit GIS-based predictive epidemiological model based on suitable environmental factors. We examined eleven commonly mapped environmental factors using both ordinary least squares regression (OLS) and geographically weighted regression (GWR). The GWR model was utilized to ascertain the impact of environmental factors on WNV risk patterns without the confounding effects of spatial non-stationarity that exist between place and health. It identifies the important underlying environmental factors related to suitable mosquito habitat conditions to make meaningful and spatially explicit predictions. Our model represents a multi-criteria decision analysis approach to create disease risk maps under data sparse situations. The best fitting model with an adjusted R2 of 0.71 revealed a strong association between WNV infection risk and a subset of environmental risk factors including road density, stream density, and land surface temperature. This research also postulates that understanding the underlying place characteristics and population composition for the occurrence of WNV infection is important for mitigating future outbreaks. While many spatial and aspatial models have attempted to predict the risk of WNV transmission, efforts to link these factors within a GIS framework are limited. One of the major challenges for such integration is the high dimensionality and large volumes typically associated with such models and data. This research uses a spatially explicit, multivariate geovisualization framework to integrate an environmental model of mosquito habitat with human risk factors derived from socio-economic and demographic variables. Our results show that such an integrated approach facilitates the exploratory analysis of complex data and supports reasoning about the underlying spatial processes that result in differential risks for WNV. This research provides different tools and techniques for predicting the WNV epidemic and provides more insights into targeting specific areas for controlling WNV outbreaks.
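As a rough illustration of why the study contrasts OLS with geographically weighted regression: GWR refits the regression at every location using distance-based weights, so the estimated effect of a habitat variable such as road density can vary over space. The sketch below hand-rolls a GWR with a Gaussian kernel and an arbitrary fixed bandwidth on synthetic data; it is not the study's model, and dedicated packages (or bandwidth selection by cross-validation) would normally be used.

# Sketch contrasting a global OLS fit with a geographically weighted
# regression: weighted least squares at each location with Gaussian weights.
import numpy as np

rng = np.random.default_rng(6)
n = 300
coords = rng.uniform(0, 100, size=(n, 2))            # site locations
road_density = rng.normal(size=n)
# Spatially varying effect: road density matters more in the "east".
beta_local = 0.5 + 0.01 * coords[:, 0]
risk = 1.0 + beta_local * road_density + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), road_density])

# Global OLS coefficients.
beta_ols, *_ = np.linalg.lstsq(X, risk, rcond=None)

# GWR: refit at each location with a Gaussian distance kernel.
def gwr_coefficients(X, y, coords, bandwidth=20.0):
    betas = np.empty((len(y), X.shape[1]))
    for i, (cx, cy) in enumerate(coords):
        d2 = (coords[:, 0] - cx) ** 2 + (coords[:, 1] - cy) ** 2
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        XtW = X.T * w
        betas[i] = np.linalg.solve(XtW @ X, XtW @ y)
    return betas

betas_gwr = gwr_coefficients(X, risk, coords)
print("OLS slope:", round(beta_ols[1], 3))
print("GWR slope range:", round(betas_gwr[:, 1].min(), 3), "to", round(betas_gwr[:, 1].max(), 3))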
Estilos ABNT, Harvard, Vancouver, APA, etc.
43

Pardue, Miranda Taylor. "Comparing Heatwave Related Mortality Data from Distressed Counties to Affluent Counties in Central and Southern Central Appalachia". Digital Commons @ East Tennessee State University, 2020. https://dc.etsu.edu/honors/583.

Texto completo da fonte
Resumo:
The Appalachian Mountains are home to some of the most culturally rich places in the United States, but also some of the most impoverished communities. Multiple recent observations provide evidence of climate change across the globe, and Appalachian communities may suffer more dire consequences because many lack strategies to help relieve some of the worst effects of climate change. Heatwaves are predicted to increase in duration and frequency over time, and communities that are not well prepared for the damaging effects of heatwaves can suffer unduly. This study aims to quantify the likelihood that people living in economically distressed counties in the Central and Southern Central regions of Appalachia face heatwave-related mortality more intensely than those who live in more affluent counties in the same regions. Twelve counties from each socioeconomic group were selected, based on county economic status, to analyze climate and mortality data over thirty-eight years, from 1981 through 2018. Data were collected during the warm season for each county, May 1st to September 30th, and compared to the mortality data from the same county during the same warm season. This study used all-cause mortality numbers from each of the twenty-four counties for the mortality data. The relative risk for each county in both the distressed and affluent categories was calculated, and the average relative risks for the two socioeconomic groups were then compared. The results of this study did not show a statistically significant increase in the likelihood of heatwave-related mortality for residents of socioeconomically distressed counties in the Central and Southern Central regions of Appalachia. More research with larger sample sizes and more attention paid to the factors driving socioeconomic status is needed to better assess the relationship of heatwave mortality to socioeconomic status.
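The core comparison in the study is a relative risk: the warm-season mortality rate in distressed counties divided by the rate in affluent counties. Below is a minimal sketch of that arithmetic, with invented counts and person-years rather than the study's data.

# Minimal relative-risk sketch: warm-season mortality rates compared between
# distressed and affluent counties. All numbers are invented for illustration.
deaths_distressed, person_years_distressed = 420, 250_000
deaths_affluent, person_years_affluent = 380, 300_000

rate_distressed = deaths_distressed / person_years_distressed
rate_affluent = deaths_affluent / person_years_affluent
relative_risk = rate_distressed / rate_affluent
print(f"warm-season mortality RR (distressed vs affluent) = {relative_risk:.2f}")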
Estilos ABNT, Harvard, Vancouver, APA, etc.
44

Abdioskouei, Maryam. "Improving Air Quality Prediction Through Characterizing the Model Errors Using Data from Comprehensive Field Experiments". Thesis, The University of Iowa, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13420451.

Texto completo da fonte
Resumo:

Uncertainty in emission estimates is one of the main reasons for shortcomings in Chemistry Transport Models (CTMs), which can reduce the confidence level of impact assessments of anthropogenic activities on air quality and climate. This dissertation focuses on understanding the uncertainties within CTMs and reducing these uncertainties by improving emission estimates.

The first part of this dissertation focuses on reducing the uncertainties around the emission estimates from oil and Natural Gas (NG) operations by using various observations and high-resolution CTMs. To achieve this goal, we used Weather Research and Forecasting with Chemistry (WRF-Chem) model in conjunction with extensive measurements from two major field campaigns in Colorado. Ethane was used as the indicator of oil and NG emissions to explore the sensitivity of ethane to different physical parametrizations and simulation set-ups in the WRF-Chem model using the U.S. EPA National Emission Inventory (NEI-2011). The sensitivity analysis shows up to 57.3% variability in the modeled ethane normalized mean bias (NMB) across the simulations, which highlights the important role of model configurations on the model performance.

Comparison between airborne measurements and the sensitivity simulations shows a model-measurement bias of ethane of up to -15 ppb (NMB of -80%) in regions close to oil and NG activities. Under-prediction of ethane concentration in all sensitivity runs suggests an actual under-estimation of the oil and NG emissions in the NEI-2011 in Colorado. To reduce the error in the emission inventory, we developed a three-dimensional variational inversion technique. Through this method, optimal scaling factors of up to 6 for ethane emission rates were calculated. Overall, the inversion method estimated between 11% and 15% higher ethane emission rates in the Denver-Julesburg basin compared to the NEI-2011. This method can be extended to constrain oil and NG emissions in other regions of the US using the available measurement datasets.

The second part of the dissertation discusses the University of Iowa high-resolution chemical weather forecast framework using WRF-Chem, designed for the Lake Michigan Ozone Study (LMOS-2017). The LMOS field campaign took place during summer 2017 to address high ozone episodes in coastal communities surrounding Lake Michigan. Evaluation of the model performance for clouds, onshore flows, and surface- and aircraft-sampled ozone and NOx concentrations found that the model successfully captured much of the observed synoptic variability of onshore flows. Selection of the High-Resolution Rapid Refresh (HRRR) model for initial and boundary conditions, and of the Noah land surface model, significantly improved comparison of meteorological variables to both ground-based and aircraft data. The model consistently underestimated the daily maximum concentration of ozone. Emission sensitivity analysis suggests an increase in hydrocarbon (HC) emissions. The variational inversion method, together with measurements by the GeoTAS and TROPOMI instruments and airborne and ground-based measurements, can be used to constrain NOx emissions in the region.
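The normalized mean bias (NMB) quoted above is a standard model-evaluation metric, NMB = sum(model - obs) / sum(obs) * 100%. Here is a minimal sketch with placeholder values, not WRF-Chem output or flight data.

# Sketch of the normalized mean bias (NMB) used to compare modelled and
# observed ethane. Values below are placeholders for illustration only.
import numpy as np

observed = np.array([4.0, 6.5, 9.0, 12.0, 18.0])   # ppb, hypothetical samples
modelled = np.array([1.5, 2.0, 3.5, 4.0, 5.0])     # ppb, hypothetical samples

nmb = 100.0 * np.sum(modelled - observed) / np.sum(observed)
mean_bias = np.mean(modelled - observed)
print(f"NMB = {nmb:.1f}%  mean bias = {mean_bias:.1f} ppb")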

Estilos ABNT, Harvard, Vancouver, APA, etc.
45

Mallya, Shruti. "Modelling Human Risk of West Nile Virus Using Surveillance and Environmental Data". Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/35734.

Texto completo da fonte
Resumo:
Limited research has been performed in Ontario to ascertain risk factors for West Nile Virus (WNV) and to develop a unified risk prediction strategy. The aim of the current body of work was to use spatio-temporal modelling in conjunction with surveillance and environmental data to determine which pre-season factors could forecast a high-risk WNV season, and to explore how well mosquito surveillance data could predict human cases in space and time during the WNV season. Generalized linear mixed modelling found that mean minimum monthly temperature variables and annual WNV-positive mosquito pools were the most significant predictors of the number of human WNV cases (p<0.001). Spatio-temporal cluster analysis found that positive mosquito pool clusters could predict human case clusters up to one month in advance. These results demonstrate the usefulness of mosquito surveillance data, as well as publicly available climate data, for assessing risk and informing public health practice.
Estilos ABNT, Harvard, Vancouver, APA, etc.
46

Dreyer, Anna Alexandra. "Likelihood and Bayesian signal processing methods for the analysis of auditory neural and behavioral data". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45908.

Texto completo da fonte
Resumo:
Thesis (Ph. D.)--Harvard-MIT Division of Health Sciences and Technology, 2008.
Includes bibliographical references.
Developing a consensus on how to model neural and behavioral responses and to quantify important response properties is a challenging signal processing problem because models do not always adequately capture the data and different methods often yield different estimates of the same response property. The threshold, the first stimulus level for which a difference between baseline activity and stimulus-driven activity exists, is an example of such a response property for both neural and behavioral responses.In the first and second sections of this work, we show how the state-space model framework can be used to represent neural and behavioral responses to auditory stimuli with a high degree of model goodness-of-fit. In the first section, we use likelihood methods to develop a state-space generalized linear model and estimate maximum likelihood parameters for neural data. In the second section, we develop the alternative Bayesian state-space model for behavioral data. Based on the estimated joint density, we then illustrate how important response properties, such as the neural and behavioral threshold, can be estimated, leading to lower threshold estimates than current methods by at least 2 dB. Our methods provide greater sensitivity, obviation of the hypothesis testing framework, and a more accurate description of the data.Formulating appropriate models to describe neural data in response to natural sound stimulation is another problem that currently represents a challenge. In the third section of the thesis, we develop a generalized linear model for responses to natural sound stimuli and estimate maximum likelihood parameters. Our methodology has the advantage of describing neural responses as point processes, capturing aspects of the stimulus response such as past spiking history and estimating the contributions of the various response covariates, resulting in a high degree of model goodness-of-fit.
Using our model parameter estimates, we illustrate that decoding of the natural sound stimulus in our model framework produces neural discrimination performance on par with behavioral data. These findings have important implications for developing theoretically sound and practical definitions of neural response properties, for understanding information transmission within the auditory system, and for the design of auditory prostheses.
by Anna A. Dreyer.
Ph.D.
Estilos ABNT, Harvard, Vancouver, APA, etc.
47

Jokhadar, Hossam. "Comparison of the accuracy of fit of CAD/CAM crowns using three different data acquisition methods". Thesis, NSUWorks, 2013. https://nsuworks.nova.edu/hpd_cdm_stuetd/23.

Texto completo da fonte
Resumo:
A thesis submitted to the College of Dental Medicine of Nova Southeastern University for the degree of Master of Science in Dentistry. Background. Earlier research evaluated the 3D internal fit of CAD/CAM crowns after direct versus indirect laser scanning. To date, no study has evaluated the marginal integrity of all-ceramic crowns milled with different types of scanning systems via different methods of scanning. This study was conducted to assess the marginal integrity of all-ceramic crowns milled with the E4D CAD/CAM system (D4D, Richardson, Texas) using three different scanning methods of a prepared model (direct scanning, indirect scanning of a cast, and scanning of an impression material). Methods. A metal die model of a prepared mandibular first molar was fabricated according to specifications for tooth preparation for the E4D CAD/CAM system. Fifty-five all-ceramic crowns were milled using this system: 5 crowns were made from scanning of the metal die; 25 crowns were made from scanning of 5 PVS impressions of the metal die, with each impression scanned 5 times; and 25 crowns were made from scanning stone dies poured from the same 5 PVS impressions, with each stone die scanned 5 times. An internal gap to provide space for cement was kept constant at 25 microns. Marginal integrity of the crowns was assessed using optical microscopy. Results. The overall mean marginal gap and standard deviation was 78.1 μm (18.9) for scanning the metal die, 148.9 μm (25.4) for scanning impressions and 126.2 μm (28.2) for scanning the stone casts. ANOVA revealed significant differences in marginal gap between the three groups. Conclusions. Direct scanning of a metal die produced crowns with significantly smaller marginal gaps than those seen from scanning a PVS impression or a stone cast (P<.05). Additionally, it was found that scanning PVS impressions or stone casts produced crowns with unacceptable mean marginal gaps (over 120 μm). It was also observed that the difficulty of scanning PVS impressions and tracing the finish line led to overhanging margins and larger marginal gaps for crowns produced via those methods.
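The group comparison reported in the results is a one-way ANOVA across the three scanning methods. The sketch below simulates measurements around the reported group means and standard deviations purely to illustrate the test; it is not the original data set.

# One-way ANOVA sketch for marginal gaps measured under three scanning methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
direct = rng.normal(78.1, 18.9, 5)        # metal die scanned directly (n=5)
impression = rng.normal(148.9, 25.4, 25)  # PVS impressions scanned (n=25)
cast = rng.normal(126.2, 28.2, 25)        # stone casts scanned (n=25)

f_stat, p_value = stats.f_oneway(direct, impression, cast)
print(f"F = {f_stat:.1f}, p = {p_value:.4f}")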
Estilos ABNT, Harvard, Vancouver, APA, etc.
48

Kress, Marin M. "Identification and use of indicator data to develop models for Marine-sourced risks in Massachusetts Bay". Thesis, University of Massachusetts Boston, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10118449.

Texto completo da fonte
Resumo:

The coastal watersheds around Massachusetts Bay are home to millions of people, many of whom recreate in coastal waters and consume locally harvested shellfish. Epidemiological data on food-borne illness and illnesses associated with recreational water exposure are known to be incomplete. Of major food categories, seafood has the highest recorded rate of associated foodborne illness. In total, the health impacts from these marine-sourced risks are estimated to cost millions of dollars each year in medical expenses or lost productivity. When recorded epidemiological data is incomplete it may be possible to estimate abundance or prevalence of specific pathogens or toxins in the source environment, but such environmental health challenges require an interdisciplinary approach.

This dissertation is divided into four sections: (1) a presentation of two frameworks for organizing research and responses to environmental health issues; (2) an exploration of human population dynamics in Massachusetts Bay coastal watersheds from 2000 to 2010 followed by a review of, and identification of potential indicators for, five marine-sourced risks: Enterococcus bacteria, Vibrio parahaemolyticus bacteria, Hepatitis A Virus, potentially toxigenic Pseudo-nitzschia genus diatoms, and anthropogenic antibiotics; (3) an introduction to environmental health research in the context of a changing data landscape, presentation of a generalized workflow for such research with a description of data sources relevant to marine environmental health for Massachusetts Bay; and (4) generation of models for the presence/absence of Enterococcus bacteria and Pseudo-nitzschia delicatissima complex diatoms and model selection using an information-theoretic approach.

This dissertation produced estimates of coastal watershed demographics and of usage levels for anthropogenic antibiotics; it also demonstrated that Pseudo-nitzschia delicatissima complex diatoms may be present in any season of the year. Of the models generated and compared, the Enterococcus model performed poorly overall, but the Pseudo-nitzschia delicatissima complex model performed adequately, demonstrating high sensitivity with a low rate of false negatives. This dissertation concludes that monitoring data collected for other purposes can be used to estimate marine-sourced risks in Massachusetts Bay, and that such work would be improved by data from purpose-designed studies.
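As an illustration of the information-theoretic model selection mentioned above, here is a minimal sketch that compares candidate logistic presence/absence models by AIC and reports the sensitivity (true-positive rate) of the best one. The predictors (temperature, salinity, month) and the simulated data are hypothetical placeholders, not the indicator variables or monitoring data actually used in the dissertation.

```python
# Minimal sketch: AIC-based selection among presence/absence (logistic) models.
# Predictors and data are hypothetical, not the dissertation's indicators.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300
df = pd.DataFrame({
    "temperature": rng.normal(12, 4, n),   # water temperature, degrees C
    "salinity": rng.normal(31, 1.5, n),    # salinity, PSU
    "month": rng.integers(1, 13, n),       # sampling month
})
# Simulated presence/absence outcome, loosely driven by temperature.
logit_p = -4 + 0.3 * df["temperature"]
df["present"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Candidate models compared by AIC (lower is better).
candidates = {
    "temperature only": "present ~ temperature",
    "temperature + salinity": "present ~ temperature + salinity",
    "full": "present ~ temperature + salinity + C(month)",
}
fits = {name: smf.logit(f, data=df).fit(disp=0) for name, f in candidates.items()}
for name, fit in fits.items():
    print(f"{name}: AIC = {fit.aic:.1f}")

# Sensitivity of the best model at a 0.5 threshold: how often true presences
# are detected, i.e. the complement of the false-negative rate.
best_name = min(fits, key=lambda k: fits[k].aic)
pred = (fits[best_name].predict(df) >= 0.5).astype(int)
tp = ((pred == 1) & (df["present"] == 1)).sum()
fn = ((pred == 0) & (df["present"] == 1)).sum()
print(f"best model: {best_name}, sensitivity = {tp / (tp + fn):.2f}")
```

A full information-theoretic workflow would normally report AIC differences (or Akaike weights) across the whole candidate set rather than only picking the minimum, but the structure is the same.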

Estilos ABNT, Harvard, Vancouver, APA, etc.
49

Cai, Beilei 1979. "Essays in health and environmental economics: Challenges in the empirical analysis of micro-level economic survey data". Thesis, University of Oregon, 2008. http://hdl.handle.net/1794/8505.

Texto completo da fonte
Resumo:
xi, 108 p. A print copy of this thesis is available through the UO Libraries. Search the library catalog for the location and call number.
Micro-level survey data are widely used in applied economic research. This dissertation, which consists of three empirical papers, demonstrates challenges in empirical research using micro-level survey data, as well as some methods to address these challenges. Chapter II examines the effect of China's recent public health insurance reform on health utilization and health status. Chinese policy makers have been eager to identify how this reform, characterized by a substantial increase in out-of-pocket costs, has affected health care demand and health status. However, due to self-selection of individuals into the publicly insured group, the impact of the reform remains an unresolved issue. I employ a Heckman selection model in the context of a difference-in-difference regression to accommodate the selection problem, and provide the first solid empirical evidence that the recent public health insurance reforms in China adversely affected both health care access and health status for publicly insured individuals. Chapter III examines the construct validity of a stated preference (SP) survey concerning climate change policy. Because the SP survey method remains a controversial tool for benefit-cost analysis, every part of the survey deserves thorough examination to ensure the quality of the data. Using a random utility approach, I establish that there is a great deal of logical consistency between people's professed attitudes toward different payment vehicles and their subsequent choices among policies which vary in the incidence of their costs. Chapter IV employs the same survey data used in Chapter III, but demonstrates the potential for order effects stemming from prior attitude-elicitation questions. In addition, it considers the potential impact of these order effects on Willingness to Pay (WTP) estimates for climate change mitigation. I find that the ordering of prior elicitation questions may change people's opinions toward various attributes of the different policies, and thereby increase or decrease their WTP by a substantial amount. Thus, this chapter emphasizes the significance of order effects in prior elicitation questions, and supports a call for diligence in using randomly ordered prior elicitation questions in stated preference surveys, to minimize inadvertent effects from any single arbitrary ordering.
Adviser: Trudy Ann Cameron
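To make the Chapter II strategy more concrete, here is a rough sketch of a two-step, control-function-style selection correction inside a difference-in-difference regression: a probit selection equation yields an inverse Mills ratio that is then added to the outcome equation. All variable names and the simulated data are hypothetical, and this simplified specification is a generic illustration of the idea, not the dissertation's actual Heckman-in-DiD model.

```python
# Rough sketch of a two-step selection correction combined with a DiD regression.
# Data and variable names are hypothetical; a simplified control-function variant.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "income": rng.normal(0, 1, n),
    "urban": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),   # 1 = after the insurance reform
})
# Self-selection into public insurance depends on income and urban residence.
sel_latent = 0.8 * df["income"] + 0.5 * df["urban"] + rng.normal(0, 1, n)
df["insured"] = (sel_latent > 0).astype(int)
# Outcome (e.g. health-care visits) with a true DiD effect of -0.3 for the insured post-reform.
df["visits"] = (1.0 + 0.4 * df["insured"] + 0.2 * df["post"]
                - 0.3 * df["insured"] * df["post"]
                + 0.5 * df["income"] + rng.normal(0, 1, n))

# Step 1: probit selection equation, used to form the inverse Mills ratio.
probit = sm.Probit(df["insured"], sm.add_constant(df[["income", "urban"]])).fit(disp=0)
xb = probit.fittedvalues                      # linear predictor X*beta
df["inv_mills"] = norm.pdf(xb) / norm.cdf(xb)

# Step 2: DiD outcome equation with the selection-correction term included.
df["did"] = df["insured"] * df["post"]
X = sm.add_constant(df[["insured", "post", "did", "income", "inv_mills"]])
ols = sm.OLS(df["visits"], X).fit()
print(ols.params[["did", "inv_mills"]])
```

In a full Heckman treatment the selection equation would need an exclusion restriction (an instrument affecting insurance take-up but not health outcomes), and standard errors would have to account for the generated inverse-Mills regressor; the sketch ignores both for brevity.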
Estilos ABNT, Harvard, Vancouver, APA, etc.
50

Carolan, Stephany. "Increasing adherence to digital mental health interventions delivered in the workplace". Thesis, University of Sussex, 2018. http://sro.sussex.ac.uk/id/eprint/79618/.

Texto completo da fonte
Resumo:
Background: Work-related stress, depression and anxiety are common. Despite evidence that these problems can be successfully treated in the workplace, take-up of psychological treatments by workers is low, resulting in many going untreated. One way to address this may be through the use of digital mental health interventions (DMHIs) in the workplace, but there is a lack of information about their appeal and effectiveness. Research questions: 1. What is the evidence for delivering DMHIs in the workplace? 2. What are the advantages and disadvantages to delivering DMHIs in the workplace? 3. What features of DMHIs influence engagement and adherence? What can be done to improve these? 4. What are employers' priorities when selecting DMHIs for their workforce? Method of investigation: Mixed methods were used to answer the research questions. Summary of conclusions: There is evidence for the efficacy of workplace DMHIs, especially if they are delivered over a short timeframe, utilise secondary modalities to deliver the interventions (emails and text messages), and use elements of persuasive technology (self-monitoring and tailoring). Use of online-facilitated discussion groups may increase engagement. Both employees and employers identified convenience, flexibility, and anonymity as advantages of DMHIs. Employers also valued the potential of DMHIs to reach many employees. The main barrier to engagement for employees was lack of time. For employers, barriers to purchasing DMHIs were employees' lack of access to equipment, and their low interest and skills. Cost and effectiveness were priorities for decision makers when purchasing DMHIs. Further work needs to be done with workers and employers to design and deliver DMHIs that meet both their needs.
Estilos ABNT, Harvard, Vancouver, APA, etc.