
Dissertations / Theses on the topic 'Data environmental analysis'



Consult the top 50 dissertations / theses for your research on the topic 'Data environmental analysis.'



Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Li, Yang. "The spatial data quality analysis in the environmental modelling." Thesis, University of East London, 2001. http://roar.uel.ac.uk/1305/.

Abstract:
Spatial data quality analysis is essential in environmental modelling for efficiently addressing environmental change. As the complexity of data sets and the modelling capability of computer systems increase, the need to address the quality of both data and models becomes increasingly important. Integration with environmental modelling, spatial data quality analysis and the geocomputation paradigm have been three important areas of GIS research. In this research they are brought together in the context of coastal oil spill modelling. The research covers the measurement, modelling and management of spatial data quality. Coupling GIS and environmental modelling, a systematic solution is developed for coastal oil spill modelling, which is representative of complex environmental models. The procedures of geospatial data quality analysis were implemented not only with existing GIS functionality but also with various geocomputation techniques. Spatial data quality analyses of inputs and model performance, including sensitivity analyses, error propagation analyses and fitness-for-use analyses, were carried out for the coastal oil spill modelling. The results show that in coastal oil spill modelling, a better understanding and improvement of spatial data quality can be achieved through such analyses. The examples illustrate the diversity of techniques and tools required when investigating spatial data quality issues in environmental modelling, and provide evidence of the feasibility and practicality of these flexible analysis approaches. An overall methodology is developed for each stage of a project, with particular emphasis at inception to ensure adequate data quality on which to construct the models. Furthermore, the coupling strategy of GIS and environmental modelling is revised to include a geo-data quality analysis (GQA) engine. 
With the growing availability of proprietary and public-domain software suitable for spatial data quality analysis, GQA engines will be formed as such software evolves into tightly coupled collections of tools external to GIS. The GQA engine would itself be tightly coupled with GIS and environmental models to form a modelling framework.
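The error propagation analyses this abstract describes can be illustrated with a generic Monte Carlo sketch (this is not the thesis's implementation; the `spill_extent` surrogate model and the error magnitudes below are invented for illustration): perturb uncertain inputs according to assumed error distributions and observe the induced spread in a model output.

```python
import numpy as np

rng = np.random.default_rng(42)

def spill_extent(wind_speed, current_speed):
    # Toy surrogate for an oil-spill model output (illustrative only).
    return 2.0 * wind_speed + 0.5 * current_speed

# Assumed 1-sigma measurement errors: wind 0.5 m/s, current 0.1 m/s.
n = 10_000
wind = rng.normal(10.0, 0.5, n)
current = rng.normal(1.0, 0.1, n)

# Propagate the input uncertainty through the model and summarize the spread.
extents = spill_extent(wind, current)
mean_extent, std_extent = extents.mean(), extents.std()
```

The spread of `extents` quantifies how input data quality limits the confidence one can place in the model output, which is the core question of a fitness-for-use analysis.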
2

Yang, Bin, and 杨彬. "A novel framework for binning environmental genomic fragments." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B45789344.

3

Huang, Yunshui Charles. "A prototype of data analysis visualization tool." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/12125.

4

Valtersson, Einar. "Comparison of data analysis methods KMSProTF and MsDEMPCA using Magnetotelluric data." Thesis, Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-80195.

Abstract:
In this work, two ways of processing controlled-source magnetotelluric (MT) data were tried and compared against each other. The aim was to evaluate the differences between a multivariate-based processing method and a bivariate processing method. The software KMSProTF represents conventional processing using bivariate and robust statistics. MsDEMPCA utilizes a multivariate Criss-Cross regression scheme to improve the condition of the data matrix before robustly decomposing it into principal components. Data from the FENICS-19 survey in northern Scandinavia were processed to transfer functions (TFs) using each method. The TFs were visually interpreted in KMSProTF. No significant differences were found between the methods. In addition, a calibration between instruments was carried out, which led to the exclusion of parts of the data set.
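The "robust statistics" in conventional transfer-function processing can be sketched with an iteratively reweighted least-squares regression using Huber weights. This toy real-valued example is a stand-in under stated assumptions, not KMSProTF's actual algorithm: the two magnetic channels, the true transfer coefficients and the injected outliers are all synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "magnetotelluric-style" problem: output e depends linearly on two
# magnetic channels hx, hy; a few samples are contaminated by outliers.
n = 500
hx, hy = rng.normal(size=(2, n))
z_true = np.array([2.0, -1.0])
e = z_true[0] * hx + z_true[1] * hy + 0.01 * rng.normal(size=n)
e[:20] += 10.0  # contaminated samples

H = np.column_stack([hx, hy])

# Iteratively reweighted least squares with Huber weights: large residuals
# are progressively downweighted so outliers do not distort the estimate.
w = np.ones(n)
for _ in range(20):
    WH = H * w[:, None]
    z = np.linalg.solve(WH.T @ H, WH.T @ e)          # weighted normal equations
    r = e - H @ z
    scale = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale (MAD)
    u = np.abs(r) / (1.345 * scale)
    w = np.where(u <= 1.0, 1.0, 1.0 / u)             # Huber weight function
```

An ordinary least-squares fit on the same data would be pulled toward the outliers; the reweighting recovers coefficients close to `z_true`.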
5

Adu-Prah, Samuel. "GEOGRAPHIC DATA MINING AND GEOVISUALIZATION FOR UNDERSTANDING ENVIRONMENTAL AND PUBLIC HEALTH DATA." OpenSIUC, 2013. https://opensiuc.lib.siu.edu/dissertations/657.

Abstract:
Within the theoretical framework of this study it is recognized that very large amounts of real-world facts and geospatial data are collected and stored. Decision makers cannot consider all the available disparate raw facts and data; problem-specific variables, including complex geographic identifiers, have to be selected from these data and validated. The problems associated with environmental and public health data are that (1) geospatial components of the data are not considered in the analysis and decision-making process, (2) meaningful geospatial patterns and clusters are often overlooked, and (3) public health practitioners find it difficult to comprehend geospatial data. Inspired by the advent of geographic data mining and geovisualization in public and environmental health, the goal of this study is to unveil the spatiotemporal dynamics of the prevalence of overweight and obesity in United States youths at regional and local levels over a twelve-year study period. The specific objectives of this dissertation are to (1) apply regionalization algorithms effective for the identification of meaningful clusters that are in spatial uniformity with youth overweight and obesity, (2) use Geographic Information Systems (GIS), spatial analysis techniques, and statistical methods to explore the data sets for health outcomes, and (3) explore geovisualization techniques to transform discovered patterns in the data sets for recognition, flexible interaction, and improved interpretation. To achieve the goal and specific objectives of this dissertation, we used data sets from the National Longitudinal Survey of Youth 1997 (NLSY'97) early release (1997-2004), the NLSY'97 current release (2005-2008), census 2000 data, yearly population estimates from 2001 to 2008, and synthetic data sets. The NLSY'97 cohort ranged from 6,923 to 8,565 individuals over the period. 
At the beginning of the cohort study the participants were between 12 and 17 years old; in 2008 they were between 24 and 28. For the data mining tool, we applied the Regionalization with Dynamically Constrained Agglomerative clustering and Partitioning (REDCAP) algorithms to identify hierarchical regions based on weight metrics of U.S. youths. The applied algorithms are single linkage clustering (SLK), average linkage clustering (ALK), complete linkage clustering (CLK), and Ward's method. Moreover, we used GIS, spatial analysis techniques, and statistical methods to analyze the spatially varying association of overweight and obesity prevalence in youths and to geographically visualize the results. The methods used included the ordinary least squares (OLS) model, the spatial generalized linear mixed model (GLMM), Kulldorff's space-time scan analysis, and spatial interpolation techniques (inverse distance weighting). The three main findings of this study are as follows. First, among the four algorithms, ALK, Ward and CLK identified regions more effectively than SLK, which performed very poorly. ALK provided more promising regions than the other algorithms, producing spatial uniformity effectively related to the weight variable (body mass index). The ALK regionalization algorithm provided new insights about overweight and obesity by detecting new spatial clusters with over 30% prevalence. New meaningful clusters were detected in 15 counties, including Yazoo, Holmes, Lincoln, and Attala in Mississippi; Wise, Delta, Hunt, Liberty, and Hardin in Texas; St Charles, St James, and Calcasieu in Louisiana; and Choctaw, Sumter, and Tuscaloosa in Alabama. Demographically, these counties have a race/ethnic composition of about 75% White, 11.6% Black and 13.4% other. 
Second, results from this study indicated an upward trend in the prevalence of overweight and obesity in United States youths, both male and female. Male youth obesity increased from 10.3% (95% CI = 9.0, 11.0) in 1999 to 27.0% (95% CI = 26.0, 28.0) in 2008. Likewise, female obesity increased from 9.6% (95% CI = 8.0, 11.0) in 1999 to 28.9% (95% CI = 27.0, 30.0) over the same period. Youth obesity prevalence was higher among females than among males. Aging is a substantial factor with a statistically highly significant association (p < 0.001) with the prevalence of overweight and obesity. Third, significant cluster years for high rates were detected in 2003-2008 (relative risk 1.92, 3.4 annual prevalence cases per 100,000, p < 0.0001) and for low rates in 1997-2002 (relative risk 0.39, annual prevalence cases per 100,000, p < 0.0001). Three meaningful spatiotemporal clusters of obesity (p < 0.0001) were detected in counties located within the South, Lower North Eastern, and North Central regions. Counties identified as consistently experiencing a high prevalence of obesity, with the potential of becoming obesogenic environments in the future, are Copiah, Holmes, and Hinds in Mississippi; Harris and Chamber, Texas; Oklahoma and McCain, Oklahoma; Jefferson, Louisiana; and Chicot and Jefferson, Arkansas. Surprisingly, there were mixed trends in youth obesity prevalence patterns in rural and urban areas. Finally, from a public health perspective, this research has shown that in-depth knowledge of whether, and in what respects, certain areas have worse health outcomes can be helpful in designing effective community interventions to promote healthy living. Furthermore, specific information obtained from this dissertation can help guide geographically targeted programs, policies, and preventive initiatives for overweight and obesity in the United States.
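Of the spatial techniques this abstract lists, inverse distance weighting is simple enough to sketch in a few lines. The county coordinates and rates below are invented; this illustrates the method only, not the dissertation's code.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse distance weighting: each prediction is a weighted mean of the
    known values, with weights proportional to 1/distance**power."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)           # avoid division by zero at data points
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Toy prevalence surface: four "county centroids" with known rates.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
rates = np.array([10.0, 20.0, 20.0, 30.0])
query = np.array([[0.5, 0.5]])
est = idw(pts, rates, query)
```

Because the weights are positive and normalized, every IDW estimate stays within the range of the observed values, which is one reason the method is popular for mapping prevalence surfaces.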
6

Autret, Arnaud. "Modular neural networks for analysis of flow cytometry data." Thesis, University of South Wales, 2003. https://pure.southwales.ac.uk/en/studentthesis/modular-neural-networks-for-analysis-of-flow-cytometry-data(49f3349b-e86a-4bfb-a689-c853323b6f2d).html.

Abstract:
In predicting environmental hazards or estimating the impact of human activities on the marine ecosystem, scientists have multiplied the need for sample analysis. The classical microscopic approach is time consuming and wastes the talent and intellectual abilities of trained specialists. Scientists therefore developed an automated optical tool, the flow cytometer (FC), to analyse samples quickly and in large quantities. The flow cytometer has been applied successfully to real phytoplankton studies; however, analysis of the data extracted from samples is still required. Artificial Neural Networks (ANNs) are one of the tools applied to FC data analysis. Despite several successful applications, ANNs have not been widely adopted by the marine biology community, as they do not make it possible to change the number of species in the classification problem without retraining the full system from scratch. Training is time consuming and requires expertise in ANNs. Moreover, most ANN paradigms cannot cope effectively with unknown data, such as data coming from new phytoplankton species or from species outside the scope of a study. This project developed a new ANN technique based on a modular architecture that removes the need for retraining and allows unknowns to be detected and rejected. Furthermore, the Support Vector Machine architecture is applied in this domain for the first time and compared against another ANN paradigm, Radial Basis Function networks. The results show that the modular architecture is able to deal effectively with new data, which can be incorporated into the ANN architecture without fully retraining the system.
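The "detect and reject unknowns" idea can be sketched, under heavy simplification, as a nearest-centroid rule with a distance threshold. The thesis uses modular ANNs, RBF networks and SVMs; the centroid rule and the 2-D "cytometry" features below are only a minimal invented illustration of rejection.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy flow-cytometry-style features for two known species clusters.
a = rng.normal([0.0, 0.0], 0.2, size=(200, 2))
b = rng.normal([3.0, 3.0], 0.2, size=(200, 2))
centroids = {"species_a": a.mean(axis=0), "species_b": b.mean(axis=0)}

def classify(x, centroids, reject_radius=1.0):
    """Nearest-centroid rule with rejection: a point far from every known
    class is labelled 'unknown' instead of being forced into a class."""
    name, c = min(centroids.items(), key=lambda kv: np.linalg.norm(x - kv[1]))
    return name if np.linalg.norm(x - c) <= reject_radius else "unknown"

known = classify(np.array([0.1, -0.1]), centroids)
novel = classify(np.array([-5.0, 8.0]), centroids)
```

Adding a new species here only requires computing one more centroid, which mirrors (in miniature) the modular goal of extending the classifier without retraining the whole system.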
7

Wood, Duncan Andrew. "Analysis of passive microwave data for large area environmental monitoring." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1996. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq24519.pdf.

8

Welle, Paul. "Remotely Sensed Data for High Resolution Agro-Environmental Policy Analysis." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/1012.

Abstract:
Policy analyses of agricultural and environmental systems are often limited by data constraints. Measurement campaigns can be costly, especially when the area of interest includes oceans, forests, agricultural regions or other dispersed spatial domains. Satellite-based remote sensing offers a way to increase the spatial and temporal resolution of policy analysis concerning these systems. However, there are key limitations to the use of satellite data. Uncertainty in data derived from remote sensing can be significant, and traditional methods of policy analysis for managing uncertainty on large datasets can be computationally expensive. Moreover, while satellite data can increasingly offer estimates of some parameters such as weather or crop use, other information, such as demographic or economic data, is unlikely to be estimated using these techniques. Managing these limitations in practical policy analysis remains a challenge. In this dissertation, I conduct five case studies which rely heavily on data sourced from orbital sensors. First, I assess the magnitude of climate and anthropogenic stress on coral reef ecosystems. Second, I conduct an impact assessment of soil salinity on California agriculture. Third, I measure the propensity of growers to adapt their cropping practices to soil salinization in agriculture. Fourth, I analyze whether small-scale desalination units could be applied on farms in California in order to mitigate the effects of drought and salinization as well as prevent agricultural drainage from entering vulnerable ecosystems. And fifth, I assess the feasibility of satellite-based remote sensing for salinity measurement at global scale. Through these case studies, I confront both the challenges and benefits associated with implementing satellite-based remote sensing for improved policy analysis.
9

Alexander, Lauren P. "Cell phone location data for travel behavior analysis." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/99592.

Abstract:
Thesis: S.M. in Transportation, Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 111-118).
Mobile phone technology generates vast amounts of data at low cost all over the world. This rich data provides digital traces of when and where individuals travel, improving our ability to understand, model, and predict human mobility. Especially in this era of rapid urbanization, mobile phone data presents exciting new opportunities to plan transportation infrastructure and services that meet the mobility needs and challenges associated with increasing travel demand. But to realize these benefits, methods must be developed to utilize and integrate this data into existing urban and transportation modeling frameworks. In this thesis, we draw on techniques from the transportation engineering and urban computing communities to estimate travel demand and infrastructure usage. The methods we present utilize call detail records (CDRs) from mobile phones in conjunction with geospatial data, census records, and surveys to generate representative origin-destination matrices, route trips through road networks, and evaluate traffic congestion. Moreover, we implement these algorithms in a flexible, modular, and computationally efficient software system. This platform provides an end-to-end solution that integrates raw, massive data to generate estimates of travel demand and infrastructure performance in any city, and produces interactive visualizations to effectively communicate these results. Finally, we demonstrate an application of these data and methods to evaluate the impact of ride-sharing on urban traffic. Using these approaches, we generate travel demand estimates analogous to many of the outputs of conventional travel demand models, demonstrating the potential of mobile phone data as a low-cost option for transportation planning. We hope this work will serve as a unified and comprehensive guide to integrating new big data resources into transportation modeling practices.
by Lauren P. Alexander.
S.M. in Transportation
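The core step of turning call detail records into an origin-destination matrix can be sketched as counting consecutive zone transitions per user. The records and zone names below are invented; real pipelines such as the one this thesis describes add census-based expansion factors, trip filtering, and time-of-day windows.

```python
from collections import Counter

# Minimal CDR-style records: (user, timestamp, zone observed at that time).
cdrs = [
    ("u1", 8, "zone_A"), ("u1", 9, "zone_B"),
    ("u2", 7, "zone_A"), ("u2", 10, "zone_B"),
    ("u3", 8, "zone_B"), ("u3", 9, "zone_B"),
]

# Build an origin-destination matrix by counting consecutive zone transitions
# per user; staying in the same zone is not counted as a trip.
od = Counter()
last_zone = {}
for user, t, zone in sorted(cdrs, key=lambda r: (r[0], r[1])):
    prev = last_zone.get(user)
    if prev is not None and prev != zone:
        od[(prev, zone)] += 1
    last_zone[user] = zone
```

The resulting sparse `od` counter plays the role of the origin-destination matrix that downstream steps (trip routing, congestion evaluation) consume.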
10

Shyr, Feng-Yeu. "Combining laboratory and field data in rail fatigue analysis." Thesis, Massachusetts Institute of Technology, 1993. http://hdl.handle.net/1721.1/28024.

11

Avila, Murillo Fernando. "An integrated development of correspondence analysis with applications to environmental data." Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185551.

Abstract:
Correspondence Analysis (CA) is a multivariate method that has been developed from different perspectives, for a variety of purposes. There have been reservations about the use of CA for data measured on a ratio scale, which is the usual type of data encountered in the earth and environmental sciences. These reservations have to do with the usual approaches to CA, which are restricted to count-type data, and with the actual nature of CA, as being either an exploratory or an inferential method. We present CA as an exploratory technique that provides an algebraic model for data matrices with non-negative entries. This model can be used for dimension reduction or for pattern recognition purposes. As with any other exploratory technique, the use of CA has to be supplemented with diagnostics and post-analysis if we are to have some measure of confidence in the results. We provide a set of diagnostics and a methodology for post-analysis, and we test their use on three example data sets from the earth and environmental sciences.
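The algebraic core of CA can be sketched directly: scale the correspondence matrix into standardized residuals and take its SVD. The abundance table below is invented, and the row coordinates and total inertia follow the standard textbook formulation, which I assume matches the one developed in the dissertation.

```python
import numpy as np

# Toy abundance table (sites x species) with non-negative entries.
N = np.array([[10.0, 2.0, 1.0],
              [ 8.0, 3.0, 2.0],
              [ 1.0, 9.0, 8.0]])

P = N / N.sum()                     # correspondence matrix
r = P.sum(axis=1)                   # row masses
c = P.sum(axis=0)                   # column masses

# Standardized residuals; their SVD yields the CA solution.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal row coordinates and total inertia (sum of squared singular values).
row_coords = (U * sv) / np.sqrt(r)[:, None]
inertia = (sv ** 2).sum()
```

Centring on `np.outer(r, c)` removes the trivial dimension, so the smallest singular value is numerically zero and the remaining axes decompose the total inertia (the chi-square statistic divided by the grand total).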
12

Ayala, Solares Jose Roberto. "Data mining and machine learning for environmental systems modelling and analysis." Thesis, University of Sheffield, 2017. http://etheses.whiterose.ac.uk/18321/.

Abstract:
This thesis provides an investigation of environmental systems modelling and analysis based on system identification techniques. In particular, this work focuses on adapting and developing a new Nonlinear AutoRegressive with eXogenous inputs (NARX) framework, and its application to analyse some environmental case studies. Such a framework has proved very convenient for modelling systems with nonlinear dynamics because it builds a model using the Orthogonal Forward Regression (OFR) algorithm by recursively selecting model regressors from a pool of candidate terms. This selection is performed by means of a dependency metric, which measures the contribution of a candidate term to explaining a signal of interest. For the first time, this thesis introduces a package in the R programming language for the construction of NARX models. This includes a set of features for effectively performing system identification, including model selection, parameter estimation, model validation, model visualisation and model evaluation. This package is used extensively throughout this thesis. This thesis highlights two new components of the original OFR algorithm. The first one aims to extend the deterministic notion of the NARX methodology by introducing the distance correlation metric, which can provide interpretability of nonlinear dependencies, together with the bagging method, which can provide an uncertainty analysis. This implementation produces a bootstrap distribution not only for the parameter estimates, but also for the forecasts. The biggest advantage is that it does not require the specification of prior distributions, as is usually done in Bayesian analysis. The NARX methodology has been employed with systems where both inputs and outputs are continuous variables. Nevertheless, in real-life problems, variables can also appear in categorical form. Of special interest are systems where the output signal is binary. 
The second new component of the OFR algorithm is able to deal with this type of variable by finding relationships with regressors that are continuous lagged input variables. This improvement helps to identify model terms that have a key role in a classification process. Furthermore, this thesis discusses two environmental case studies: the first one on the analysis of the Atlantic Meridional Overturning Circulation (AMOC) anomaly, and the second one on the study of global magnetic disturbances in near-Earth space. Although the AMOC anomaly has been studied in the past, this thesis analyses it using NARX models for the first time. The task is challenging given that the sample size available is small. This requires some preprocessing steps in order to obtain a feasible model that can forecast future AMOC values, and hindcast back to January of 1980. In the second case study, magnetic disturbances in near-Earth space are studied by means of the Kp index. This index goes from 0 (very quiet) to 9 (very disturbed) in 28 levels. There is special interest in the forecast of high magnetic disturbances given their impact on terrestrial technology and astronauts' safety, but these events are rare and therefore, difficult to predict. Two approaches are analysed using the NARX methodology in order to assess the best modelling strategy. Although this phenomenon has been studied with other techniques providing very promising results, the NARX models are able to provide an insightful relationship of the Kp index to solar wind parameters, which can be useful in other geomagnetic analyses.
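The OFR term-selection loop described above can be sketched in a few lines: at each step, pick the candidate regressor with the largest error reduction ratio (ERR), then orthogonalize the remaining pool against it. The toy data is linear in the regressors and is invented; the real algorithm works on lagged NARX terms and includes stopping criteria this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate term pool for a toy problem: the target depends on only two of
# the eight candidate regressors (columns 2 and 5).
n, k = 400, 8
X = rng.normal(size=(n, k))
y = 1.5 * X[:, 2] - 0.8 * X[:, 5] + 0.05 * rng.normal(size=n)

# Greedy orthogonal forward regression: at each step pick the candidate with
# the largest error reduction ratio, then deflate the remaining pool.
selected, residual, pool = [], y.copy(), X.copy()
for _ in range(2):
    err = (pool.T @ residual) ** 2 / (((pool ** 2).sum(axis=0) + 1e-12) * (y @ y))
    best = int(np.argmax(err))
    selected.append(best)
    q = pool[:, best] / np.linalg.norm(pool[:, best])
    residual = residual - q * (q @ residual)
    pool = pool - np.outer(q, q @ pool)   # orthogonalize remaining candidates
```

The ERR of each selected term reports the fraction of the output variance it explains, which is what makes the resulting models interpretable term by term.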
13

Kasahara, Hidekazu. "Activity Support Based on Human Location Data Analysis with Environmental Factors." 京都大学 (Kyoto University), 2016. http://hdl.handle.net/2433/215678.

14

Graffelman, Jan. "Contributions to the multivariate Analysis of Marine Environmental Monitoring." Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6525.

Abstract:
The thesis starts from the view that statistics begins with data, and opens by introducing the data sets studied: marine benthic species counts and chemical measurements made at a set of sites in the Norwegian Ekofisk oil field, with replicates and annual repetition. An introductory chapter details the sampling procedure and shows with reliability calculations that the (transformed) chemical variables have excellent reliability, whereas the biological variables have poor reliability, except for a small subset of abundant species. Transformed chemical variables are shown to be approximately normal. Bootstrap methods are used to assess whether the biological variables follow a Poisson distribution, and lead to the conclusion that the Poisson distribution must be rejected, except for rare species. A separate chapter details more work on the distribution of the species variables: truncated and zero-inflated Poisson distributions as well as Poisson mixtures are used in order to account for sparseness and overdispersion. Species are thought to respond to environmental variables, and regressions of the abundance of a few selected species onto chemical variables are reported. For rare species, logistic regression and Poisson regression are the tools considered, though there are problems of overdispersion. For abundant species, random coefficient models are needed in order to cope with intraclass correlation. The environmental variables, mainly heavy metals, are highly correlated, leading to multicollinearity problems. The next chapters use a multivariate approach, where all species data are treated simultaneously. The theory of correspondence analysis is reviewed, and some theoretical results on this method are reported (bounds for singular values, centring matrices). 
An applied chapter discusses the correspondence analysis of the species data in detail, detects outliers, addresses stability issues, and considers different ways of stacking data matrices to obtain an integrated analysis of several years of data and to decompose variation into within-sites and between-sites components. More than 40% of the total inertia is due to variation within stations. Principal component analysis is used to analyse the set of chemical variables. Attempts are made to integrate the analysis of the biological and chemical variables. A detailed theoretical development shows how continuous variables can be mapped in an optimal manner as supplementary vectors into a correspondence analysis biplot. Geometrical properties are worked out in detail, and measures for the quality of the display are given, while artificial data and data from the monitoring survey are used to illustrate the theory developed. The theory of the display of supplementary variables in biplots is also worked out in detail for principal component analysis, with attention to the different types of scaling and the optimality of displayed correlations. A theoretical chapter follows that gives an in-depth treatment of canonical correspondence analysis (linearly constrained correspondence analysis, CCA for short), detailing many mathematical properties and aspects of this multivariate method, such as geometrical properties, biplots, use of generalized inverses, relationships with other methods, etc. Some applications of CCA to the survey data are dealt with in a separate chapter, with their interpretation and an indication of the quality of the display of the different matrices involved in the analysis. Weighted principal component analysis of weighted averages is proposed as an alternative to CCA. 
This leads to a better display of the weighted averages of the species and, in the cases studied so far, also to biplots with a higher amount of explained variance for the environmental data. The thesis closes with a bibliography and outlines some suggestions for further research, such as the generalization of canonical correlation analysis for working with singular covariance matrices, the use of partial least squares methods to account for the excess of predictors, and data fusion problems to estimate missing biological data.
15

Wan, Chun-wah, and 尹振華. "Evaluate hotel energy performance using data envelopment analysis." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B48543640.

Abstract:
There are many factors affecting hotel energy consumption, such as hotel classification, floor area, number of guest rooms, number of room guests, mix of guest segments, level of occupancy (guest nights), scale of meeting facilities, laundry, retail operations, building features, facilities features, fuel mix, year of construction, year of retrofit, number of staff, weather conditions, management arrangements, etc. In Hong Kong and Singapore, the traditional method of benchmarking by an Energy Use Index (EUI) per particular factor was not able to analyze such a multiple-input, multiple-output environment effectively. Previous research papers have recently applied Data Envelopment Analysis (DEA) to hotel management studies in other countries, such as Portugal, Africa, Italy, Taiwan and Korea. Recently, the application of DEA to building energy analysis has been limited to residential buildings in the US, government buildings in Taiwan, and hotel buildings in Turkey. This study provides a simple, basic DEA model (CCR-I CRS) for the evaluation of hotel energy consumption of a sample hotel in Hong Kong for the tourism/hospitality industry. The DEA model was established with multiple input variables (electricity, Towngas, water, outdoor temperature and relative humidity) and multiple output variables (numbers of room nights, numbers of room guests, and numbers of food and beverage covers). The model successfully identifies the relative efficiencies of efficient decision-making units (DMUs) and inefficient DMUs; areas with saving potential are therefore shown for further improvement action through hotel management strategic planning. Benchmarks are provided for improving the operations of poor-performing DMUs: months and F&B outlets, respectively. Several interesting and useful managerial insights and implications from the study are discussed. 
Peer groups and slacks were identified among the efficient operations so that the inefficient DMUs can adjust themselves to reach the efficient frontier. The study suggests a framework which enables hotel management to develop a strategic action plan with energy conservation measures at different priorities. In the end, the hotel will be able to deliver a high standard of guest service and at the same time preserve the environment by reducing energy consumption. It is concluded that my area of study fits "the gap": the end results will extend overseas research and lay the foundation for local research in this knowledge area.
published_or_final_version
Environmental Management
Master
Master of Science in Environmental Management
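In the special case of one input and one output, the CCR efficiency score reduces to each DMU's output/input ratio divided by the best ratio in the set, which makes the idea easy to sketch. The monthly figures below are invented; the study's actual CCR-I model, with multiple inputs and outputs, solves a linear program per DMU instead.

```python
# Single-input, single-output CCR efficiency as a ratio comparison.
# Toy monthly DMUs: (energy_input_MWh, room_nights_output).
dmus = {"Jan": (100.0, 800.0), "Feb": (90.0, 810.0), "Mar": (120.0, 720.0)}

# Each DMU's productivity ratio, scaled by the best ratio observed, gives a
# score of 1.0 for efficient DMUs and < 1.0 for inefficient ones.
ratios = {k: y / x for k, (x, y) in dmus.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
```

A score of, say, 0.67 for March means the same output should be achievable with 67% of the energy input if March operated like the best-performing month, which is how DEA points management at saving potential.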
16

CARVALHO, MARIA A. G. de. "Metodos estatisticos para analise de dados de monitoracao ambiental [Statistical methods for the analysis of environmental monitoring data]." Repositório Institucional do IPEN, 2003. http://repositorio.ipen.br:8080/xmlui/handle/123456789/11121.

Thesis (Doctorate)
IPEN/T
Instituto de Pesquisas Energeticas e Nucleares - IPEN/CNEN-SP
17

Zoss, Brandon M. "Design and analysis of mobile sensing systems : an environmental data collection swarm." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104269.

Abstract:
Thesis: S.M. in Naval Architecture and Marine Engineering, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2016.
Thesis: Mech. E., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2016.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 213-217).
Recent advances in small-scale portable computing have led to an explosion in swarming as a viable method to approach large-scale data problems in the commercial, scientific, and defense sectors. This increased attention to large-scale swarm robotics has led to a growth in swarm intelligence concepts, giving more potential to address issues more effectively and rapidly than any single unit. However, the majority of today's autonomous platforms are prohibitively costly and too complex for marketable research applications. This is particularly true when considering the demands required to be temporally and spatially pervasive in a marine environment. This work presents a low-cost, portable, and highly maneuverable platform as a method to collect, share, and process environmental data. Our platform is modular, allowing a variety of sensor combinations, and may yield a heterogeneous swarm. Kalman filters are utilized to provide integrated, real-time dynamic self-awareness. In addition to an environmentally savvy platform, we define a computational framework and characteristics which allow complex problems to be solved in a distributed and collective manner. This computational framework includes two methods for scalar field estimation, which rely on low-order orthogonal Hermite basis functions. Low-order fits provide a natural method for low-pass filtering, thus avoiding ambient noise recovery in the reconstruction process. Real-time sampling and recovery allow for individual and collectively autonomous behaviors driven through globally assessed environmental parameters. Finally, we give evidence that large numbers of units can cooperatively tackle large-scale problems more efficiently and quickly than more capable and expensive units. This is particularly true when utilizing a unique methodology, presented herein, for best assembling in order to most effectively reconstruct sparse spatial scalar fields.
by Brandon M. Zoss.
S.M. in Naval Architecture and Marine Engineering
Mech. E.
APA, Harvard, Vancouver, ISO, and other styles
18

Choudhary, Vijay Singh 1979. "A client-server software application for statistical analysis of fMRI data." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/30141.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2004.
Includes bibliographical references (leaves 63-66).
Statistical analysis methods used for interrogating functional magnetic resonance imaging (fMRI) data are complex and continually evolving, and there exists a scarcity of educational material for fMRI. Thus, an instruction-based software application was developed for teaching the fundamentals of statistical analysis in fMRI. For wider accessibility, the application was designed with a client/server architecture. The Java client has a layered design for flexibility and a Graphical User Interface (GUI) for user interaction, and can be deployed to multiple platforms in heterogeneous and distributed networks. The future possibility of adding real-time data processing capabilities to the server led us to choose CGI/Perl/C as server-side technologies. The client and server communicate via a simple protocol through the Apache Web Server. The application provides students with opportunities for hands-on exploration of the key concepts using phantom data as well as sample human fMRI data. The simulation allows students to control relevant parameters and observe intermediate results for each step in the analysis stream (spatial smoothing, motion correction, statistical model parameter selection, etc.). Eventually this software tool and the accompanying tutorial will be disseminated to researchers across the globe via the Biomedical Informatics Research Network (BIRN) portal.
by Vijay Singh Choudhary.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
19

Trodd, Nigel Mitchell. "An analysis of semi-natural vegetation from remotely-sensed data." Thesis, Kingston University, 1994. http://eprints.kingston.ac.uk/20578/.

Full text
Abstract:
This thesis examines the potential of remote sensing to characterise heathland vegetation. A series of ground radiometer and airborne multispectral scanner experiments were conducted to investigate relationships between remotely-sensed data and the species composition of heathland vegetation. Particular reference was made to spatial characteristics of the vegetation. The results refuted the hypothesis that a classification of remotely-sensed data accounts for the variation in heathland vegetation communities: a classification was an inaccurate and inappropriate representation of the vegetation. However, it was cautioned that any assessment of remote sensing is dependent on the accuracy and method of processing of the ground data. An investigation of alternative methods of analysis found that multispectral data were related to continuous variations in the abundance of dominant species, and it was inferred that classification underestimated the potential of remote sensing. Vegetation continua were successfully identified from remotely-sensed data acquired throughout the season, and a transformation of the data generated spectral components that helped to explain the interaction between species composition and multispectral reflectance. A particular criticism of classification had been its inability to represent the spatial characteristics of vegetation, due to the nominal nature of the output. Abrupt boundaries, continuous transitions and homogeneous stands were identified from the vegetation data, and alternative methods, using ordinal-level remotely-sensed data, were able to represent several of these spatial characteristics. The full potential of remote sensing to characterise heathland vegetation will only be realised once the processing of remotely-sensed data is coupled to ecological models. 
The methods evaluated in this thesis are a first step in that direction, but they require a better understanding of spectral and spatial properties in order to meet the routine information needs of environmental scientists at local to global scales.
APA, Harvard, Vancouver, ISO, and other styles
20

Sandberg, Melanie (Melanie Jean). "Applications of ASDE-X Data to the analysis of airport surface operations." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/74469.

Full text
Abstract:
Thesis (S.M. in Transportation)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2012.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 133).
While much attention has been given to analyzing and optimizing problems in air transportation, relatively little research has gone into studying airport surface operations. In recent years a surface surveillance system called Airport Surface Detection Equipment, Model X (ASDE-X) has been installed at over 30 airports in the US as a safety device. The applications of the data captured by these systems are far broader than promoting safety alone. This thesis demonstrates how ASDE-X data can be analyzed to characterize airport operations, and how it might be used in real time going forward. The process of converting the raw ASDE-X data into a usable format is discussed. Then, an analysis of airport operations at LaGuardia Airport and Philadelphia Airport is presented using three months of summer data; these airports are studied both in aggregate and for individual runway configurations. Finally, a case study of an Android tablet application is presented as a next step in automation for aiding airport traffic operations.
by Melanie Sandberg.
S.M. in Transportation
APA, Harvard, Vancouver, ISO, and other styles
21

Picard, Charlotte. "Climate change and dengue: analysis of historical health and environmental data for Peru." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=106459.

Full text
Abstract:
Dengue, a mosquito-borne viral infection that is the most common cause of hemorrhagic fever globally, is rapidly spreading worldwide. An estimated 40% of the world's population is at risk for this disease, which is transmitted by Aedes sp. mosquitoes. The Aedes mosquito-dengue virus lifecycle varies with temperature, and climate change may increase the risk of dengue epidemics in the future. This study examined whether changes in sea surface temperature (SST) along the Peruvian coast were associated with dengue incidence from 2002-2010. In Peru the effects of the El Niño cycle on weather conditions are pronounced, providing an ideal place to study fluctuations in climate and dengue incidence. Negative binomial models were used to examine the relationship between dengue cases and changes in SST across regions of Peru. Spearman's rank test was used to determine the lagged SST term most correlated with dengue incidence in each region. The negative binomial models included terms for the optimum lagged SST and for the trend of increasing dengue incidence over the study period. The magnitude and sign of the correlation coefficient between dengue and SST varied among the 15 regions of Peru with dengue cases: nine provinces had positive correlations while six had negative correlations. The optimum lag ranged from 0 to 6 months. In all of the regions, lagged SST was a significant predictor of dengue cases in the negative binomial model. The relationship between dengue and sea surface temperature in Peru appears to be significant across the country, but given its varied nature between regions it is not possible to make accurate generalisations about it for Peru as a whole. Accounting for additional climatic variables such as precipitation may help improve the predictive model.
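The lag-selection step the abstract describes, picking the 0-6 month lag whose SST series is most rank-correlated with case counts via Spearman's test, can be sketched on synthetic data. Everything below is simulated; only the procedure mirrors the abstract:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
months = 108                                   # monthly series, 2002-2010

# Synthetic sea surface temperature with a seasonal cycle
sst = 20 + 3 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 0.3, months)

# Toy dengue counts driven by the SST three months earlier (log link;
# np.roll wraps the first few months, which is harmless in this toy)
true_lag = 3
rate = 50 * np.exp(0.4 * (np.roll(sst, true_lag) - 20))
cases = rng.poisson(rate)

def best_lag(sst, cases, max_lag=6):
    """Lag whose lagged SST is most rank-correlated (Spearman) with cases."""
    corr = lambda lag: abs(spearmanr(sst[: len(sst) - lag], cases[lag:])[0])
    return max(range(max_lag + 1), key=corr)

print(best_lag(sst, cases))   # recovers a lag near true_lag = 3
```

The selected lag would then enter a count-regression model (the study uses negative binomial regression) as the optimum lagged SST term.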
APA, Harvard, Vancouver, ISO, and other styles
22

Foster, Malcom B. "Remote environmental sensing platform, Crooked Lake, Antarctica : design, deployment and analysis of data." Thesis, University of Nottingham, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.439858.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Kafle, Ram C. "Trend Analysis and Modeling of Health and Environmental Data: Joinpoint and Functional Approach." Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5246.

Full text
Abstract:
The present study is divided into two parts: the first develops statistical analysis and modeling of mortality (or incidence) trends using Bayesian joinpoint regression, and the second fits differential equations to time series data to derive the rate of change of carbon dioxide in the atmosphere. A joinpoint regression model identifies significant changes in the trends of the incidence, mortality, and survival of a specific disease in a given population, and the Bayesian approach to joinpoint regression is widely used to identify the points in a trend where significant changes occur. The purpose of the present study is to develop an age-stratified Bayesian joinpoint regression model to describe mortality trends, assuming that the observed counts are probabilistically characterized by the Poisson distribution. The proposed model is based on Bayesian model selection criteria, with the smallest number of joinpoints that is sufficient to explain the Annual Percentage Change (APC). The prior probability distributions are chosen in such a way that they are automatically derived from the model index contained in the model space. The proposed model and methodology estimate age-adjusted mortality rates in different epidemiological studies to compare trends while accounting for the confounding effects of age. Future mortality rates are predicted using the Bayesian Model Averaging (BMA) approach. As an application of Bayesian joinpoint regression, we first study childhood brain cancer mortality rates (non-age-adjusted) and their Annual Percentage Change (APC) per year using the existing Bayesian joinpoint regression models in the literature, with annual observed mortality counts of children ages 0-19 from 1969-2009 obtained from the Surveillance Epidemiology and End Results (SEER) database of the National Cancer Institute (NCI). The predictive distributions are used to predict future mortality rates. 
We also compare this result with the mortality trend obtained using the joinpoint software of the NCI. To fit the age-stratified model, we use the cancer mortality counts of adult lung and bronchus cancer (25-85+ years) and brain and other Central Nervous System (CNS) cancer (25-85+ years) patients, obtained from the SEER database of the NCI. The second part of this study is the statistical analysis and modeling of noisy data using a functional data analysis approach. Carbon dioxide is one of the major contributors to global warming. In this study, we develop a system of differential equations using time series data for the major sources of the significant contributing variables of carbon dioxide in the atmosphere. We define the differential operator as a data smoother and use the penalized least squares fitting criterion to smooth the data. Finally, we optimize the profile error sum of squares to estimate the necessary differential operator. The proposed models give an estimate of the rate of change of carbon dioxide in the atmosphere at a particular time. We apply the model to carbon dioxide emission data for the continental United States, obtained from the Carbon Dioxide Information Analysis Center (CDIAC), the primary climate-change data and information analysis center of the United States Department of Energy. The first four chapters of this dissertation contribute to the development and application of joinpoint regression, and the last chapter discusses the statistical modeling and application of differential equations through data using the functional data analysis approach.
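A minimal illustration of the joinpoint idea: fit a single joinpoint to synthetic log-linear mortality rates by grid search and convert the segment slopes to Annual Percentage Changes via APC = 100(e^slope - 1). This is a frequentist toy with invented data, not the dissertation's Bayesian model-selection machinery:

```python
import numpy as np

# Toy annual mortality rates: +2%/yr before 1990, -3%/yr after
years = np.arange(1975, 2010)
log_rate = np.where(years < 1990,
                    0.02 * (years - 1975),
                    0.02 * 15 - 0.03 * (years - 1990))
rates = np.exp(log_rate + np.random.default_rng(2).normal(0, 0.01, years.size))

def fit_one_joinpoint(years, rates):
    """Grid-search a single joinpoint in a piecewise log-linear trend."""
    y = np.log(rates)
    best = None
    for k in years[2:-2]:                                  # candidate joinpoints
        X = np.column_stack([np.ones(years.size),          # intercept
                             years - years[0],             # slope before k
                             np.maximum(years - k, 0)]     # slope change after k
                            ).astype(float)
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(res[0]) if res.size else float(np.sum((y - X @ beta) ** 2))
        if best is None or sse < best[0]:
            best = (sse, int(k), beta)
    _, k, beta = best
    apc_before = 100 * (np.exp(beta[1]) - 1)               # APC = 100(e^slope - 1)
    apc_after = 100 * (np.exp(beta[1] + beta[2]) - 1)
    return k, apc_before, apc_after

print(fit_one_joinpoint(years, rates))   # joinpoint near 1990, APCs near +2% and -3%
```

The Bayesian version replaces the grid search and SSE criterion with posterior model probabilities over the number and location of joinpoints.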
APA, Harvard, Vancouver, ISO, and other styles
24

Willner, Marjorie Rose. "Environmental Analysis at the Nanoscale: From Sensor Development to Full Scale Data Processing." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/94644.

Full text
Abstract:
Raman spectroscopy is an extremely versatile technique with molecular sensitivity and fingerprint specificity. However, the translation of this tool into a deployable technology has been stymied by irreproducibility in sample preparation and the lack of complex data analysis tools. In this dissertation, a droplet microfluidic platform was prototyped to address both sample-to-sample variation and to introduce a level of quantitation to surface enhanced Raman spectroscopy (SERS). Shifting the SERS workflow from a cell-to-cell mapping routine to the mapping of tens to hundreds of cells demanded the development of an automated processing tool to perform basic SERS analyses such as baseline correction, peak feature selection, and SERS map generation. The analysis tool was subsequently expanded for use with a multitude of diverse SERS applications. Specifically, a two-dimensional SERS assay for the detection of sialic acid residues on the cell membrane was translated into a live cell assay by utilizing a droplet microfluidic device. Combining single-cell encapsulation with a chamber array to hold and immobilize droplets allowed for the interrogation of hundreds of droplets. Our novel application of computer vision algorithms to SERS maps revealed that sialic sugars on cancer cell membranes are found in small clusters, or islands, and that these islands typically occupy less than 30% of the cell surface area. Employing an opportunistic mindset for the application of the data processing platform, a number of smaller projects were pursued. Biodegradable aliphatic-aromatic copolyesters with varying aromatic content were characterized using Raman spectroscopy and principal component analysis (PCA). The six different samples could successfully be distinguished from one another and the tool was able to identify spectral feature changes resulting from an increasing number of aryl esters. 
Uniquely, PCA was performed on the 3,125 spectra collected from each sample to investigate point-to-point heterogeneities. A third set of projects evaluated the ability of the data processing tool to calculate spectral ratios in an automated fashion, which was exploited for use with nano-pH probes and Rayleigh hot-spot normalization.
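The PCA step on spectra can be sketched with synthetic data: two groups of toy "spectra" differing only in one band, decomposed by SVD. Wavenumbers, band positions, and group sizes below are invented for illustration, not taken from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(3)
wavenumbers = np.linspace(400, 1800, 300)
peak = lambda centre, width: np.exp(-0.5 * ((wavenumbers - centre) / width) ** 2)

# Two sample variants: the second has a stronger band near 1720 cm-1
spectra = np.vstack(
    [peak(1000, 20) + 0.2 * peak(1720, 15) + rng.normal(0, 0.02, 300) for _ in range(25)]
    + [peak(1000, 20) + 0.8 * peak(1720, 15) + rng.normal(0, 0.02, 300) for _ in range(25)]
)

# PCA via SVD of the mean-centred matrix
centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * s                       # sample scores on each principal component

# PC1 separates the two variants; its loading peaks at the band that differs
print(wavenumbers[np.argmax(np.abs(Vt[0]))])   # near 1720
```

Plotting the PC1 scores per sample would separate the two groups, which is how spectral feature changes between sample classes are typically identified.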
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
25

Shimazaki, Hiroto. "Application-oriented approaches of geospatial data analysis : case studies on global environmental problems." 京都大学 (Kyoto University), 2009. http://hdl.handle.net/2433/126501.

Full text
Abstract:
Kyoto University (京都大学)
0048
New-system doctoral degree by coursework
Doctor of Engineering (博士(工学))
甲第14926号
工博第3153号
新制||工||1473(附属図書館)
27364
UT51-2009-M840
Kyoto University Graduate School of Engineering, Department of Urban and Environmental Engineering
(Chief examiner) Professor Masayuki Tamura, Associate Professor Yasuto Tachikawa, Associate Professor Junichi Susaki
Eligible under Article 4, Paragraph 1 of the Degree Regulations
APA, Harvard, Vancouver, ISO, and other styles
26

Bremberger, Christoph, Francisca Bremberger, Mikulas Luptacik, and Stephan Schmitt. "Regulatory impact of environmental standards on the eco-efficiency of firms." Palgrave Macmillan, 2014. http://dx.doi.org/10.1057/jors.2013.176.

Full text
Abstract:
In this paper we propose an approach to implementing environmental standards in Data Envelopment Analysis (DEA) and thereby measuring their regulatory impact on the eco-efficiency of firms. Because one standard feature of basic DEA models (e.g., the CCR model of Charnes et al. (1978)) is the exogeneity of inputs and of desirable and undesirable outputs, it is not possible to introduce environmental constraints on these parameters directly into basic DEA models. Therefore, we use a bounded-variable approach, which allows constraints on the efficiency frontier. The regulatory impact is assessed as the difference in eco-efficiency scores before and after the fictive introduction of an environmental standard. Furthermore, we distinguish between weak and strong disposability of undesirable outputs and develop corresponding models. Assessing the regulatory impact of environmental standards in advance supports environmental policy makers in choosing appropriate instruments and in adjusting the intensity of regulation. (authors' abstract)
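For readers unfamiliar with DEA, the basic input-oriented CCR envelopment problem (the unconstrained starting point the paper builds on, not its bounded-variable extension) can be solved as a small linear program. The firm data below are invented:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 firms, one input, one desirable output
X = np.array([[2.0], [4.0], [4.0], [5.0]])   # inputs
Y = np.array([[2.0], [4.0], [3.0], [4.0]])   # outputs

def ccr_efficiency(j):
    """Input-oriented CCR score of firm j:
    min theta  s.t.  X'lam <= theta * x_j,  Y'lam >= y_j,  lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lam_1, ..., lam_n]
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.vstack([np.c_[-X[j].reshape(m, 1), X.T],    # sum x*lam - theta*x_j <= 0
                      np.c_[np.zeros((s, 1)), -Y.T]])     # -sum y*lam <= -y_j
    b_ub = np.r_[np.zeros(m), -Y[j]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_efficiency(j), 3) for j in range(4)])   # [1.0, 1.0, 0.75, 0.8]
```

The bounded-variable extension the authors propose adds constraints representing an environmental standard; the LP above is only the baseline against which such a regulatory impact would be measured.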
APA, Harvard, Vancouver, ISO, and other styles
27

Mallya, Shruti. "Modelling Human Risk of West Nile Virus Using Surveillance and Environmental Data." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/35734.

Full text
Abstract:
Limited research has been performed in Ontario to ascertain risk factors for West Nile Virus (WNV) and to develop a unified risk prediction strategy. The aim of the current body of work was to use spatio-temporal modelling in conjunction with surveillance and environmental data to determine which pre-season factors could forecast a high-risk WNV season, and to explore how well mosquito surveillance data could predict human cases in space and time during the WNV season. Generalized linear mixed modelling found that mean minimum monthly temperature variables and annual WNV-positive mosquito pools were the most significant predictors of the number of human WNV cases (p<0.001). Spatio-temporal cluster analysis found that positive mosquito pool clusters could predict human case clusters up to one month in advance. These results demonstrate the usefulness of mosquito surveillance data, as well as publicly available climate data, for assessing risk and informing public health practice.
APA, Harvard, Vancouver, ISO, and other styles
28

Rhead, Rebecca Danielle. "Concern for the natural environment and its effect on pro-environmental behaviour amongst the British public." Thesis, University of Manchester, 2015. https://www.research.manchester.ac.uk/portal/en/theses/concern-for-the-natural-environment-and-its-effect-on-proenvironmental-behaviour-amongst-the-british-public(dabf1d8e-1c31-4fdd-b431-8e3941ce0759).html.

Full text
Abstract:
Reports from the IPCC have been consistent in their findings: climate change is happening and human activity is the cause. The temperature of the earth's climate has been steadily rising since the industrial revolution, with profoundly negative consequences for the natural environment. Britain is amongst the top 10 global contributors towards climate change, producing more CO2 per capita than China, and yet little is known about the relationship the British public have with the natural environment. Drawing upon DEFRA's 2009 Survey of Public Attitudes and Behaviours Towards the Environment, a nationally representative sample of the UK, this study aims to (1) explore environmental attitudes in the DEFRA sample; (2) identify the types of environmental concern that exist in the UK; and (3) examine how environmental concern is associated with pro-environmental behaviours. The overall goal is to develop a better understanding of this attitude-behaviour relationship. The thesis has three main findings. First, environmental concern is formed of three environmental attitudes: (a) a cognitive appraisal of plant and animal welfare (ecocentric attitude); (b) concern for the welfare of the human race (human-centric attitude); and (c) a prioritisation of the self, alongside dismissal of environmental problems (denial). Second, members of the British public can be assigned to one of four groups based on their environmental concern: Pro-environment, Neutral, Disengaged and Paradoxical (the latter two groups are apathetic towards environmental issues, though in different ways). Third, when examining behaviour variation across these environmental concern groups, it was found, unsurprisingly, that membership of the Pro-environment group is strongly predictive of pro-environmental behaviour. What was surprising was that pro-environmental concern predicts a variety of behaviours, both easy and challenging (i.e. easy behaviour such as recycling household waste as well as more challenging behaviour such as increased use of public transportation over driving), whereas previous studies have typically found such behaviours to be unaffected by attitudes. Membership of the Neutral group also predicts pro-environmental behaviours, although this relationship is weaker and exists for fewer measures of behaviour. Disengaged and Paradoxical forms of concern are not significant predictors of behaviour. Upon examining the effect of socio-economic status (SES) on group membership and this attitude-behaviour relationship, it was found that SES does not moderate the attitude-behaviour relationship, but it does influence group membership: respondents with higher SES were more likely to belong to the Neutral or Pro-environment groups. After reviewing these findings, it is concluded that environmental attitudes do clearly predict behaviour, but a large portion of the UK population do not possess environmental attitudes strong enough to do so (the Disengaged and Paradoxical groups amount to 36% of the population). Future studies should focus on these apathetic groups in an attempt to understand them, determine effective methods of engagement, and identify factors that increase the probability of members transitioning out of these groups.
APA, Harvard, Vancouver, ISO, and other styles
29

Hou, Anyang. "Using GPS data in route choice analysis : case study in Boston." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/60803.

Full text
Abstract:
Thesis (S.M. in Transportation)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 101-103).
Pervasive location-based technologies, such as GPS and cell phones, help reveal geographical patterns of human behavior and uncover opportunities in the real world. In the transportation field, they help people better understand travel behavior while collecting the necessary information for analysis. One important aspect of their application is how people choose routes on the existing urban network. However, dealing with the excessive amount of data and modeling route choice behavior are two major challenges in route choice analysis. This thesis discusses the general process of route choice analysis, from GPS data processing and map matching to the generation of route choice sets, and the Path-Size Logit model is implemented to address the modeling issue. In this thesis, I develop a new and effective method, which I call Point-Based Local Search Map Matching, to match consecutive GPS data to the network data. I also develop a new model, which I call the Random Weight Choice Set Generation Model, to deal with the choice set generation problem in route choice analysis. The data come from two major sources: Boston car GPS data, which record when and where a specific car is, and Boston urban network data, which contain all types of roads in GIS format.
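The geometric core of map matching, snapping a GPS fix to the nearest road segment by point-to-segment projection, can be sketched as follows. This is only the projection step found inside most matching schemes, not the thesis's full Point-Based Local Search algorithm, which also restricts the candidate segments via a local search:

```python
import numpy as np

def match_point(p, segments):
    """Snap GPS point p to the nearest road segment (point-to-segment projection)."""
    best = (np.inf, None)
    for a, b in segments:
        a, b, p_ = map(np.asarray, (a, b, p))
        ab = b - a
        # Parameter of the projection, clamped to the segment's endpoints
        t = np.clip(np.dot(p_ - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        proj = a + t * ab
        d = np.linalg.norm(p_ - proj)
        if d < best[0]:
            best = (d, proj)
    return best[1]

# Two road segments; the point (1.0, 0.2) lies nearest the horizontal one
roads = [((0, 0), (2, 0)), ((0, 0), (0, 2))]
print(match_point((1.0, 0.2), roads))   # -> [1. 0.]
```

A full matcher would also enforce continuity between consecutive fixes, which is why naive nearest-segment snapping fails near intersections and parallel roads.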
by Anyang Hou.
S.M. in Transportation
APA, Harvard, Vancouver, ISO, and other styles
30

Levitan, Denise Madeline. "Statistical Analysis of the Environmental Geochemistry of an Unmined Uranium Ore Deposit." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/64782.

Full text
Abstract:
An evaluation of the geochemistry of the environment prior to large-scale changes enables scientists and other stakeholders to assess both baseline conditions and the potential impact of those changes on the environment. One area in which documentation of pre-development geochemistry is particularly important is the exploitation of ore deposits. Ore deposits consist of concentrations of elements or minerals that are enriched enough to be of potential economic value. Their unusual geochemistry often leaves a signature on the environment that can both aid in locating an economic resource and present environmental management challenges during its lifecycle. Coles Hill, Virginia, represents one such site: the Coles Hill property is the location of uranium-enriched rock, commonly referred to as the Coles Hill uranium deposit. This dissertation outlines study design, sampling, and statistical analysis methods that can be used in the geochemical characterization of a potential resource extraction site. It presents three studies on geoenvironmental media at Coles Hill. The first study discusses sampling strategies and statistical analysis to address variability in geology, hydrology and climate for baseline assessment, and presents an example of such an assessment at Coles Hill. Results suggest a localized environmental impact of the deposit, but differences in bedrock geology within the surrounding area could also be responsible for some of the variation. This study also emphasizes the importance of considering data below analytical detection limits and describes methods for doing so. The second study compares the geochemistry of soil samples collected at Coles Hill with reference data collected by the U.S. Geological Survey using multivariate statistical techniques. Differences are used to suggest potential pathfinder elements, such as light rare earth elements, to aid in exploration for similar deposits. 
The third study uses multivariate statistical analysis to examine differences among rocks, soils, and stream sediments to infer important geochemical processes involved in weathering of the deposit. Overall, the results of these studies can aid in the development of future environmental site studies at Coles Hill and elsewhere.
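The detection-limit point can be made concrete with a small sketch: non-detects bound the sample mean between a "zeros" substitution and a "detection limit" substitution, with the common DL/2 substitution in between. The concentrations below are invented, and more rigorous censored-data methods (e.g. Kaplan-Meier or regression on order statistics) exist:

```python
import numpy as np

# Detected concentrations (mg/kg) plus three non-detects below a 0.5 mg/kg limit
detected = np.array([1.2, 0.8, 2.5, 0.6, 1.9])
n_nondetect, dl = 3, 0.5

# Bounding the mean: treat non-detects as 0 (lower bound) or as the DL
# (upper bound); the common DL/2 substitution lies between the two.
lower = np.concatenate([detected, np.full(n_nondetect, 0.0)]).mean()
mid = np.concatenate([detected, np.full(n_nondetect, dl / 2)]).mean()
upper = np.concatenate([detected, np.full(n_nondetect, dl)]).mean()
print(round(lower, 3), round(mid, 3), round(upper, 3))
```

The gap between the bounds shows how much the choice of substitution can move a summary statistic, which is why the treatment of non-detects deserves explicit attention in baseline studies.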
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
31

Holman, Justin O. "Quantitative comparison of categorical maps with applications for the analysis of global environmental data /." view abstract or download file of text, 2004. http://wwwlib.umi.com/cr/uoregon/fullcit?p3136418.

Full text
Abstract:
Thesis (Ph. D.)--University of Oregon, 2004.
Typescript. Includes vita and abstract. Includes bibliographical references (leaves 101-107). Also available for download via the World Wide Web; free to University of Oregon users.
APA, Harvard, Vancouver, ISO, and other styles
32

Zureiqat, Hazem Marwan. "Fare policy analysis for public transport : a discrete-continuous modeling approach using panel data." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/43748.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2008.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (p. 115-117).
In many large metropolitan areas, public transport is very heavily used, and ridership is approaching system capacity in the peak periods. This has caused a shift in attention by agency decision-makers to strategies that can more effectively manage the demand for public transport, rather than simply increase overall demand. In other words, a need has arisen to understand not only why people use public transport as opposed to other modes but also how they use public transport, in terms of their ticket, mode, and time-of-day choices. To that end, fares become an increasingly important policy tool that can trigger certain behavioral changes among riders. This thesis develops a methodology to model, at the disaggregate level, the response of public transport users to fare changes. A discrete-continuous framework is proposed in which ticket choice is modeled at the higher (discrete) level and frequencies of public transport use, based on mode and time-of-day, are modeled at the lower (continuous) level. This framework takes advantage of the availability of smartcard data over time, allowing individual-specific behavioral changes with various fare policies to be captured. This methodology is applied to London's public transport system using Oyster smartcard data collected between November 2005 and February 2008. The results indicate a strong inertia effect in terms of ticket choice among public transport users in London. An individual's prior ticket choice is found to be a very important factor in determining their future ticket choice. This is also evident when we simulate the effects of two policy changes on ticket choices. We find that the impact of changing the prices of period tickets may take several months or more to fully materialize. 
In terms of the frequency of public transport use, the results indicate estimated short- and long-run fare elasticities of -0.40 and -0.64, respectively, for travel on the London Underground, and equivalent estimates of -0.08 and -0.13 for travel by bus.
The estimated Underground fare elasticities are comparable to those in the literature. The bus fare elasticities, on the other hand, are relatively smaller, in absolute value, than prior estimates; this difference reflects the small variations in bus fares in the dataset on which the model was estimated and the low fare sensitivity of users under such variations. Furthermore, we apply the model, in conjunction with related assumptions and findings from previous research, to evaluate an AM peak pricing scheme on the London Underground, in which travelers are charged £2.00 between 8:30am and 9:15am rather than the current fare of £1.50. This application estimates that such a policy could potentially decrease AM "peak-of-the-peak" demand on the Underground by about 9%, with the reduction in ridership shifting either to a different mode or to a different time period.
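Under a constant-elasticity demand curve, the reported short-run Underground elasticity of -0.40 implies the following back-of-the-envelope response to the £1.50 to £2.00 peak fare rise (a cruder calculation than the thesis's model, which also captures shifts to other time periods):

```python
def demand_change(p0, p1, elasticity):
    """Relative demand change under a constant-elasticity curve: Q1/Q0 = (P1/P0)**e."""
    return (p1 / p0) ** elasticity - 1.0

# Short-run Underground elasticity of -0.40, fare raised from £1.50 to £2.00
print(f"{demand_change(1.50, 2.00, -0.40):+.1%}")   # about -10.9%
```

This gives roughly an 11% reduction, in the same range as the thesis's model-based estimate of about 9% for peak-of-the-peak demand; the gap reflects riders who shift time period rather than abandon the Underground.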
by Hazem Marwan Zureiqat.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
33

Serrano, Balderas Eva Carmina. "Preprocessing and analysis of environmental data : Application to the water quality assessment of Mexican rivers." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS082/document.

Full text
Abstract:
Data obtained from environmental surveys may be prone to different anomalies (i.e., incomplete, inconsistent, inaccurate, or outlying data). These anomalies affect the quality of environmental data and can have considerable consequences when assessing environmental ecosystems. The selection of data preprocessing procedures is crucial to validate the results of statistical analysis; however, such selection is often poorly defined. To address this question, the thesis focused on data acquisition and data preprocessing protocols in order to ensure the validity of data analysis results and, in particular, to recommend the most suitable sequence of preprocessing tasks. We propose to control every step in the data production process, from collection in the field to analysis. In the case of water quality assessment, this comprises the chemical and hydrobiological analysis of samples, producing data that were subsequently analyzed by a set of statistical and data mining methods. The multidisciplinary contributions of the thesis are: (1) in environmental chemistry, a methodological procedure to determine the content of organochlorine pesticides in water samples using SPE-GC-ECD (Solid Phase Extraction - Gas Chromatography - Electron Capture Detector) techniques; (2) in hydrobiology, a methodological procedure to assess water quality in four Mexican rivers using macroinvertebrate-based biological indices; (3) in data science, a method to assess and guide the selection of preprocessing procedures for the data produced in the two previous steps, as well as their analysis; and (4) the development of a fully integrated analytics environment in R for statistical analysis of environmental data in general, and for water quality data analytics in particular. 
Finally, within the context of this thesis, which was developed between Mexico and France, we applied our methodological approaches to the specific case of water quality assessment of the Mexican rivers Tula, Tamazula, Humaya and Culiacan.
APA, Harvard, Vancouver, ISO, and other styles
34

Tindall, Nathaniel W. "Analyses of sustainability goals: Applying statistical models to socio-economic and environmental data." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/54259.

Full text
Abstract:
This research investigates the environment and development issues of three stakeholders at multiple scales: global, national, regional, and local. Through the analysis of financial, social, and environmental metrics, the potential benefits and risks of each case study are estimated, and their implications are considered. In the first case study, the relationship between manufacturing and environmental performance is investigated. Over 700 facilities of a global manufacturer, producing 11 products on six continents, were investigated to understand global variations and determinants of environmental performance. Water, energy, carbon dioxide emissions, and production data from these facilities were analyzed to assess environmental performance, and the relationship between production composition at the individual firm and environmental performance was investigated. Location-independent environmental performance metrics were combined to provide both global and local measures of environmental performance. These models were extended to estimate future water use, energy use, and greenhouse gas emissions considering potential demand shifts. Natural resource depletion risks were investigated, and mitigation strategies related to vulnerabilities and exposure were discussed. The case study demonstrated how data from multiple facilities can be used to characterize the variability among facilities and to preview how changes in production may affect overall corporate environmental metrics. The developed framework adds a new approach to accounting for environmental performance and degradation as well as assessing potential risk in locations where climate change may affect the availability of production resources (i.e., water and energy), and is thus a tool for understanding risk and maintaining competitive advantage. The second case study was designed to address the issue of delivering affordable and sustainable energy. 
Energy pricing was evaluated by modeling individual energy consumption behaviors. This analysis simulated a heterogeneous set of residential households in both urban and rural environments in order to understand demand shifts in the residential energy end-use sector due to the effects of electricity pricing. An agent-based model (ABM) was created to investigate the interactions of energy policy and individual household behaviors; the model incorporated empirical data on beliefs and perceptions of energy. Environmental beliefs, energy pricing grievances, and social networking dynamics were integrated into the ABM structure. This model projected the aggregate residential sector electricity demand throughout the 30-year time period and distinguished the respective numbers of households that use only electricity, that rely solely on indigenous fuels, and that use both indigenous fuels and electricity. The model is one of the first characterizations of household electricity demand response and fuel transitions related to energy pricing at the individual household level, and is one of the first approaches to evaluating consumer grievance and rioting responses to energy service delivery. The model framework is suggested as an innovative tool for energy policy analysis and can easily be revised to assist policy makers in other developing countries. In the final case study, a framework was developed for a broad cost-benefit and greenhouse gas evaluation of transit systems and their associated developments. A case study was developed of the Atlanta BeltLine. The net greenhouse gas emissions from the BeltLine light rail system will depend on the energy efficiency of the streetcars themselves, the greenhouse gas emissions from the electricity used to power the streetcars, the extent to which people use the BeltLine instead of driving personal vehicles, and the efficiency of their vehicles. 
The effects of ridership, residential densities, and housing mix on environmental performance were investigated and used to estimate overall system efficacy. The range of the net present value of this system was estimated considering health, congestion, per capita greenhouse gas emissions, and societal costs and benefits on a time-varying scale, as well as construction and operational costs. The 95% confidence interval was bounded by a potential loss of $860 million and a benefit of $2.3 billion; the mean net present value was $610 million. It is estimated that the system will generate a savings of $220 per ton of emitted CO2, with a 95% confidence interval bounded by a potential social cost of $86 per ton CO2 and a savings of $595 per ton CO2.
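A confidence interval for a system's net present value of this kind is typically obtained by propagating uncertain annual costs and benefits through a discounted cash-flow calculation. The Monte Carlo sketch below illustrates only the mechanics; the discount rate, horizon, capital cost, and benefit range are assumptions for illustration, not the thesis's inputs.

```python
import random

def npv(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(n_draws=10_000, rate=0.05, years=30, capital_cost=600.0, seed=1):
    """Monte Carlo NPV in $millions with an uncertain annual net benefit."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        annual = rng.uniform(-20.0, 120.0)  # assumed net annual benefit range
        draws.append(npv([-capital_cost] + [annual] * years, rate))
    draws.sort()
    mean = sum(draws) / len(draws)
    lo, hi = draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)]
    return mean, (lo, hi)

mean_npv, ci95 = simulate_npv()
```

Sorting the simulated draws and reading off the 2.5th and 97.5th percentiles yields an interval that, like the one reported above, can span a loss and a gain even when the mean is positive.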
APA, Harvard, Vancouver, ISO, and other styles
35

Correia, Andrew William. "Estimating the Health Effects of Environmental Exposures: Statistical Methods for the Analysis of Spatio-temporal Data." Thesis, Harvard University, 2013. http://dissertations.umi.com/gsas.harvard:10828.

Full text
Abstract:
In the field of environmental epidemiology, a great deal of care is required in constructing models that accurately estimate the effects of environmental exposures on human health. This is because the data available to researchers to estimate these effects are almost always observational, making it difficult to adequately control for all potential confounders, both measured and unmeasured. Here, we tackle three different problems in which the goal is to accurately estimate the effect of an environmental exposure on various health outcomes.
APA, Harvard, Vancouver, ISO, and other styles
36

Sánchez-Martínez, Gabriel Eduardo. "Running time variability and resource allocation : a data-driven analysis of high-frequency bus operations." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/79498.

Full text
Abstract:
Thesis (S.M. in Transportation)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, February 2013.
"February 2012." Cataloged from PDF version of thesis.
Includes bibliographical references (p. 129-132).
Running time variability is one of the most important factors determining service quality and operating cost of high-frequency bus transit. This research aims to improve performance analysis tools currently used in the bus transit industry, particularly for measuring running time variability and understanding its effect on resource allocation using automated data collection systems such as AVL. Running time variability comes from both systematic changes in ridership and traffic levels at different times of the day, which can be accounted for in service planning, and the inherent stochasticity of homogeneous periods, which must be dealt with through real-time operations control. An aggregation method is developed to measure the non-systematic variability of arbitrary time periods. Visual analysis tools are developed to illustrate running time variability by time of day at the direction and segment levels. The suite of analysis tools makes variability analysis more approachable, potentially leading to more frequent and consistent evaluations. A discrete event simulation framework is developed to evaluate hypothetical modifications to a route's fleet size using automatically collected data. A simple model based on this framework is built to demonstrate its use. Running times are modeled at the segment level, capturing correlation between adjacent segments. Explicit modeling of ridership, though supported by the framework, is not included. Validation suggests that running times are modeled accurately, but that further work in modeling terminal dispatching, dwell times, and real-time control is required to model headways robustly. A resource allocation optimization framework is developed to maximize service performance in a group of independent routes, given their headways and a total fleet size constraint. Using a simulation model to evaluate the performance of a route with varying fleet sizes, a greedy optimizer adjusts allocation toward optimality. 
Due to a number of simplifying assumptions, only minor deviations from the current resource allocation are considered. A potential application is aiding managers to fine-tune resource allocation to improve resource effectiveness.
by Gabriel Eduardo Sánchez-Martínez.
S.M.in Transportation
APA, Harvard, Vancouver, ISO, and other styles
37

Sandy, Alexis Emily. "Environmental and Digital Data Analysis of the National Wetlands Inventory (NWI) Landscape Position Classification System." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/33572.

Full text
Abstract:
The National Wetlands Inventory (NWI) is the definitive source for wetland resources in the United States. The NWI production unit in Hadley, MA has begun to upgrade their digital map database, integrating descriptors for assessment of wetland functions. Updating is conducted manually and some automation is needed to increase production and efficiency. This study assigned landscape position descriptor codes to NWI wetland polygons and correlated polygon environmental properties with public domain terrain, soils, hydrology, and vegetation data within the Coastal Plain of Virginia. Environmental properties were applied to a non-metric multidimensional scaling technique to identify similarities within individual landscape positions based on wetland plant indicators, primary and secondary hydrology indicators, and field indicators of hydric soils. Individual NWI landscape position classes were linked to field-validated environmental properties. Measures provided by this analysis indicated that wetland plant occurrence and wetland plant status obtained a stress value of 0.136 (Kruskal's stress measure = poor), making them a poor indicator when determining correlation among wetland environmental properties. This is due principally to the highly variable plant distribution and wetland plant status found among the field-validated sites. Primary and secondary hydrology indicators obtained a stress rating of 0.097 (Kruskal's stress measure = good) for correlation. The hydrology indicators measured in this analysis had a high level of correlation with all NWI landscape position classes due to the common occurrence of at least one primary hydrology indicator in all field-validated wetlands. The secondary indicators had an increased accuracy in landscape position discrimination over the primary indicators because they were less ubiquitous. 
Hydric soil characteristics listed in the 1987 Manual and NTCHS field indicators of hydric soils proved to be a relatively poor indicator, based on a Kruskal's stress measure of 0.117, for contrasting landscape position classes because the same values occurred across all classes. The six NWI field-validated landscape position classes used in this study were then further applied in a public domain digital data analysis. Mean pixel attribute values extracted from the 180 field-validated wetlands were analyzed using cluster analysis. The percent hydric soil component displayed the greatest variance when compared to elevation and slope curvature, streamflow and waterbody, Cowardin classification, and wetland vegetation type. Limitations of the soil survey data included variable date of acquisition, small scale compared to wetland size, and variable quality. Flow had limitations related to its linear attributes and was therefore often insignificant when evaluating mean pixel values across wetland landscape position polygons. NLCD data limitations included poor resolution (large pixel size) and variable classification of cover types. The three sources of information that would most improve wetland mapping and the modeling of the subtle changes in elevation and slope curvature that characterize wetland landscapes are: recent high-resolution leaf-off aerial photography, high-quality soil survey data, and high-resolution elevation data. Due to the data limitations and the choice of variables used in this study, development of models and rules that clearly separate the six landscape positions was not possible, and thus automation of coding could not be attempted.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
38

McCully, Curtis, Charles R. Keeton, Kenneth C. Wong, and Ann I. Zabludoff. "Quantifying Environmental and Line-of-sight Effects in Models of Strong Gravitational Lens Systems." IOP PUBLISHING LTD, 2017. http://hdl.handle.net/10150/623240.

Full text
Abstract:
Matter near a gravitational lens galaxy or projected along the line of sight (LOS) can affect strong lensing observables by more than contemporary measurement errors. We simulate lens fields with realistic three-dimensional mass configurations (self-consistently including voids), and then fit mock lensing observables with increasingly complex lens models to quantify biases and uncertainties associated with different ways of treating the lens environment (ENV) and LOS. We identify the combination of mass, projected offset, and redshift that determines the importance of a perturbing galaxy for lensing. Foreground structures have a stronger effect on the lens potential than background structures, due to nonlinear effects in the foreground and downweighting in the background. There is dramatic variation in the net strength of ENV/LOS effects across different lens fields; modeling fields individually yields stronger priors for H0 than ray tracing through N-body simulations. Models that ignore mass outside the lens yield poor fits and biased results. Adding external shear can account for tidal stretching from galaxies at redshifts z >= z_lens, but it requires corrections for external convergence and cannot reproduce nonlinear effects from foreground galaxies. Using the tidal approximation is reasonable for most perturbers as long as nonlinear redshift effects are included. Even then, the scatter in H0 is limited by the lens profile degeneracy. Asymmetric image configurations produced by highly elliptical lens galaxies are less sensitive to the lens profile degeneracy, so they offer appealing targets for precision lensing analyses in future surveys like LSST and Euclid.
APA, Harvard, Vancouver, ISO, and other styles
39

Kala, Abhishek K. "Spatially Explicit Modeling of West Nile Virus Risk Using Environmental Data." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc822841/.

Full text
Abstract:
West Nile virus (WNV) is an emerging infectious disease that has widespread implications for public health practitioners across the world. Within a few years of its arrival in the United States the virus had spread across the North American continent. This research focuses on the development of a spatially explicit GIS-based predictive epidemiological model based on suitable environmental factors. We examined eleven commonly mapped environmental factors using both ordinary least squares (OLS) regression and geographically weighted regression (GWR). The GWR model was utilized to ascertain the impact of environmental factors on WNV risk patterns without the confounding effects of spatial non-stationarity in the relationship between place and health. It identifies the important underlying environmental factors related to suitable mosquito habitat conditions in order to make meaningful and spatially explicit predictions. Our model represents a multi-criteria decision analysis approach to creating disease risk maps under data-sparse situations. The best fitting model, with an adjusted R2 of 0.71, revealed a strong association between WNV infection risk and a subset of environmental risk factors including road density, stream density, and land surface temperature. This research also postulates that understanding the underlying place characteristics and population composition for the occurrence of WNV infection is important for mitigating future outbreaks. While many spatial and aspatial models have attempted to predict the risk of WNV transmission, efforts to link these factors within a GIS framework are limited. One of the major challenges for such integration is the high dimensionality and large volumes typically associated with such models and data. This research uses a spatially explicit, multivariate geovisualization framework to integrate an environmental model of mosquito habitat with human risk factors derived from socio-economic and demographic variables. 
Our results show that such an integrated approach facilitates the exploratory analysis of complex data and supports reasoning about the underlying spatial processes that result in differential risks for WNV. This research provides different tools and techniques for predicting the WNV epidemic and provides more insights into targeting specific areas for controlling WNV outbreaks.
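GWR differs from OLS in fitting a separate weighted regression at every location, with observation weights decaying with distance under a kernel such as the Gaussian. A minimal one-predictor sketch on synthetic data (not the WNV dataset) could look like:

```python
import math

def gaussian_weight(dist, bandwidth):
    """Distance-decay kernel used in GWR-style local regression."""
    return math.exp(-0.5 * (dist / bandwidth) ** 2)

def local_fit(locations, x, y, site, bandwidth):
    """Weighted least-squares slope and intercept of y on x at one site,
    weighting each observation by its distance from the site."""
    w = [gaussian_weight(math.dist(site, loc), bandwidth) for loc in locations]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    slope = sxy / sxx
    return slope, ybar - slope * xbar

# synthetic observations on a west-east transect: the x-y relationship
# strengthens toward the east, which a single global OLS slope would hide
locations = [(float(i), 0.0) for i in range(9)]
x = [float(i) for i in range(9)]
y = [xi ** 2 for xi in x]  # local slope grows with location
west_slope, _ = local_fit(locations, x, y, (0.0, 0.0), 1.5)
east_slope, _ = local_fit(locations, x, y, (8.0, 0.0), 1.5)
```

The spatially varying slope surface produced by fitting at every site is what the non-stationarity analysis above maps; a global OLS fit would report a single compromise coefficient.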
APA, Harvard, Vancouver, ISO, and other styles
40

Otis, Paul T. "Dominance Based Measurement of Environmental Performance and Productive Efficiency of Manufacturing." Diss., Virginia Tech, 1999. http://hdl.handle.net/10919/26653.

Full text
Abstract:
The concept of efficiency measurement is based on the definition of a frontier that envelopes observed production plans. The effect of pollution on productive efficiency is typically studied by considering pollution as not freely disposable (i.e., there is a cost incurred to dispose of pollution) or by assigning shadow prices to pollution outputs. However, the frontier, along with the technological assumptions (such as convexity) required to define it, may be replaced with the concept of pair-wise dominance. With data from a manufacturing facility, the use of pair-wise dominance allows one to consider a wide spectrum of inputs and outputs. Pair-wise dominance can also be applied to segregate production plans into sets according to their relative environmental performance and productive efficiency. These sets are used to identify reference production plans upon which distance-based measures of performance are defined. This research applies pair-wise dominance to time series data from a printed circuit board manufacturing facility to illustrate the approach. The proposed approach is compared to the Data Envelopment Analysis (DEA) approach. It was observed that for detailed production data the proposed approach was more informative concerning the measurement of productive efficiency than the standard methods.
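The pair-wise dominance idea needs no frontier or convexity assumptions: plan A dominates plan B if A uses no more of every input (with pollution treated as an undesirable quantity to be minimized) and produces no less of every output, with at least one strict inequality. A small illustrative sketch, with made-up production plans:

```python
def dominates(a, b):
    """True if plan a dominates plan b. Each plan is (inputs, outputs);
    inputs include undesirable quantities such as emissions, so
    'less is better' applies to them."""
    ins_a, outs_a = a
    ins_b, outs_b = b
    no_worse = (all(x <= y for x, y in zip(ins_a, ins_b)) and
                all(x >= y for x, y in zip(outs_a, outs_b)))
    strictly = (any(x < y for x, y in zip(ins_a, ins_b)) or
                any(x > y for x, y in zip(outs_a, outs_b)))
    return no_worse and strictly

def undominated(plans):
    """Plans dominated by no other plan: the relatively efficient set."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

plans = [((10.0, 5.0), (100.0,)),   # (inputs incl. emissions, outputs)
         ((12.0, 5.0), (100.0,)),   # dominated: more input, same output
         ((9.0, 6.0), (100.0,))]    # incomparable with the first plan
efficient = undominated(plans)
```

Partitioning plans by repeatedly peeling off the undominated set yields the performance classes from which reference plans and distance-based measures can then be defined.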
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
41

Ng, Albert (Albert Y. ). "Use of automatically collected data for the preliminary impact analysis of the East London Line extension." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/64575.

Full text
Abstract:
Thesis (S.M. in Transportation)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 151-154).
Data from public transport automated data collection (ADC) systems are now widely used in academic research and are beginning to be used for planning purposes. ADC systems provide ubiquitous and inexpensive, if limited, data streams. Since ADC systems have been around for some time and are deployed by many large public transport agencies, the resulting data can be used for before-and-after impact analyses of changes in the transportation system. This research explores the use of automatically collected data to understand the impacts of a major public transport infrastructure investment on a complex existing network. The research presents the methods, using automatically collected data, to determine the impacts on multiple modes of transportation, and the preliminary results of the impact of the introduction of the East London Line Extension. The East London Line is still in the early stages of growth, and first- and second-order impacts continue to develop. The line is carrying an average of approximately 70,000 passengers per day and ridership continues to increase monthly. The East London Line is an important public transport crossing of the Thames River and plays a crucial role as a distributor to and from intersecting rail lines. It was estimated that between 28 and 32 percent of the daily weekday passenger journeys are new journeys to the public transport system. There is a change in ridership on many bus routes that run through the area served by the East London Line. A more detailed analysis of four bus routes that run parallel to the East London Line and two bus routes that act as feeder routes shows mixed results by route, direction, and time period. The mixed results lead us to believe that, based on this preliminary impact analysis, the East London Line can have either a positive or a negative impact on bus ridership, but the impacts are most likely route, route segment, time of day, and direction specific. 
Analysis of disaggregate data showed that journey frequency of East London Line patrons increased at a higher rate than for the control panel. It is clear that ADC system data provide a cost-effective means to capture a breadth and depth of data suitable for impact analyses.
by Albert Ng.
S.M.in Transportation
APA, Harvard, Vancouver, ISO, and other styles
42

Margulis, Steven A. (Steven Adam) 1973. "Variational sensitivity analysis and data assimilation studies of the coupled land surface-atmospheric boundary layer system." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/17525.

Full text
Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2002.
Includes bibliographical references (p. 197-205).
One of the fundamental components of Earth system science is understanding coupled land-atmosphere processes. The land plays an especially important role in the climate system principally via the regulation of surface fluxes of moisture and energy into the atmosphere. Due to the mutual dependence of these fluxes on surface and atmospheric states, the land and boundary layer comprise a coupled system with complicated interactions and feedbacks which are significant factors in modulating the variability of the weather and climate. In this thesis we develop a framework for exploratory sensitivity analysis and data assimilation in the coupled land-atmosphere system using a variational (or adjoint) approach. The framework is applied to three distinct case studies of interest. First, the variational framework is used to quantify land-atmosphere coupling and feedbacks. The model and its adjoint are used to investigate the differences in the daytime sensitivities of land surface fluxes to model states and parameters when used in coupled and uncoupled modes. Results show that the sensitivities between the two cases are significantly different because of boundary layer feedbacks. Depending on the particular case, sensitivities can be either amplified or dampened due to the presence of an interactive boundary layer. Next, we used the variational data assimilation framework to investigate the potential of estimating land and ABL states and fluxes from readily-available micrometeorological observations and radiometric surface temperature.
Results from an application to a field experiment site showed that using both surface temperature and micrometeorology allows for the accurate estimation of land surface fluxes even during non-ideal conditions. Furthermore, the assimilation scheme shows promise in diagnosing model errors that may be present due to missing process representation and/or biased parameterizations. Finally, a combined variational-ensemble framework is used to estimate boundary layer growth and entrainment fluxes. The results from our application to the field site indicate a much larger ratio of entrainment to surface fluxes compared to early literature values. The fact that the entrainment parameter is larger than first hypothesized serves to further bolster the importance of land-atmosphere coupling.
by Steven A. Margulis.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
43

Accioly, Luciano Jose de Oliveira. "Applying Spectral Mixture Analysis (SMA) For Soil Information Extraction On The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Data." Diss., The University of Arizona, 1997. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu_e9791_1997_400_sip1_w.pdf&type=application/pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Kriström, Bengt. "Valuing environmental benefits using the contingent valuation method : an econometric analysis." Doctoral thesis, Umeå universitet, Institutionen för nationalekonomi, 1990. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-90578.

Full text
Abstract:
The purpose of this study is to investigate methods for assessing the value people place on preserving our natural environments and resources. It focuses on the contingent valuation method, a method for directly asking people about their preferences. In particular, the study focuses on the use of discrete response data in contingent valuation experiments. The first part of the study explores the economic theory of the total value of a natural resource, analyzing the principal components of total value: use values and non-use values. Our application is a study of the value Swedes attach to the preservation of eleven forest areas that have high recreational value and unique environmental qualities. Six forests were selected on the basis of an official investigation covering virgin forests and other areas with unique environmental qualities; in addition, five virgin forests were selected. Two types of valuation questions are analyzed, the continuous and the discrete. The first type asks directly about willingness to pay, while the second suggests a price that the respondent may accept or reject. The results of the continuous question suggest an average willingness to pay of about 1,000 SEK per household for preservation of the areas. Further analysis of the data suggests that this value depends on several characteristics of the respondent, such as income and whether or not the respondent is an altruist. Two econometric approaches are used to analyze the discrete responses: a flexible parametric approach and a non-parametric approach. In addition, a Bayesian approach is described. It is shown that the results of a contingent valuation experiment may depend to some extent on the choice of the probability model. A re-sampling approach and a Monte Carlo approach are used to shed light on the design of a contingent valuation experiment with discrete responses. 
The econometric analysis ends with an analysis of the often observed disparity between discrete and continuous valuation questions. A cost-benefit analysis is performed in the final chapter to illustrate how the contingent valuation approach may be combined with opportunity-cost data to improve the decision basis in the environmental policy domain. This analysis does not give strong support to a cutting alternative. Finally, the results of this investigation are compared with evidence from other studies. The main conclusion of this study is that assessing people's sentiments towards changes in our natural environments and resources can be a useful supplement to decisions about the proper husbandry of those environments and resources. It also highlights the importance of careful statistical analysis of data gained from contingent valuation experiments.
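For discrete (accept/reject) responses, a distribution-free lower bound on mean willingness to pay can be read off the acceptance rates at each bid level, in the spirit of the non-parametric approach the study compares against parametric models. A sketch with made-up survey data, and a simplified running-minimum monotonization in place of full pooled-adjacent-violators:

```python
def wtp_lower_bound(responses):
    """Distribution-free lower bound on mean WTP from accept/reject bids.
    responses: {bid: (n_accept, n_asked)}. The share accepting at a bid
    estimates P(WTP >= bid); mass between adjacent bids is valued at
    the lower bid, and mass above the top bid is ignored."""
    bids = sorted(responses)
    surv = []
    for b in bids:
        yes, n = responses[b]
        rate = yes / n
        surv.append(min(rate, surv[-1]) if surv else rate)  # keep non-increasing
    surv.append(0.0)
    return sum(b * (surv[j] - surv[j + 1]) for j, b in enumerate(bids))

# made-up survey: acceptance falls as the suggested price (SEK) rises
survey = {100: (80, 100), 500: (50, 100), 1000: (20, 100)}
wtp_lb = wtp_lower_bound(survey)
```

Because every respondent's willingness to pay is rounded down to the nearest bid below it, this estimate is conservative by construction, which is one reason discrete-response means can diverge from continuous-question means.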
APA, Harvard, Vancouver, ISO, and other styles
45

Thomsen, Marianne. "QSARs in environmental risk assessment : interpretation and validation of SAR/QSAR based on multivariate data analysis /." Roskilde : Roskilde University, Department of Life Science and Chemistry, 2001. http://hdl.handle.net/1800/538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Reynes, Anthony. "Environmental steering flow analysis for central north Pacific tropical cyclones based on NCEP/NCAR reanalysis data." Thesis, University of Hawaii at Manoa, 2003. http://hdl.handle.net/10125/7009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Saraiya, Devang. "The Impact of Environmental Variables in Efficiency Analysis: A fuzzy clustering-DEA Approach." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/34637.

Full text
Abstract:
Data Envelopment Analysis (Charnes et al., 1978) is a technique used to evaluate the relative efficiency of a process or an organization. The evaluation is relative, meaning each unit is compared with other processes or organizations. In real-life situations, however, different processes or units seldom operate in similar environments. Within a relative efficiency context, if units operating in different environments are compared, the units operating in less desirable environments are at a disadvantage. To ensure that the comparison is fair within the DEA framework, a two-stage framework is presented in this thesis. Fuzzy clustering is used in the first stage to group units with similar environments; in a subsequent stage, a relative efficiency analysis is performed within these groups. By approaching the problem in this manner, the influence of environmental variables on the efficiency analysis is removed. The concept of an environmental dependency index (EDI) is introduced in this thesis. The EDI reflects the extent to which the efficiency behavior of units is due to their environment of operation, and it assists the decision maker in choosing appropriate peers to guide the changes that inefficient units need to make. A more rigorous series of steps to obtain the clustering solution is also presented in a separate chapter (Chapter 5).
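The general CCR (constant returns to scale) DEA model requires solving a linear program per unit, but in the special single-input, single-output case it reduces to each unit's output/input ratio normalized by the best observed ratio. A minimal sketch of that special case, with invented unit data (not from the thesis):

```python
# Minimal sketch of CCR DEA efficiency in the single-input, single-output
# special case: efficiency = (output/input ratio) / (best observed ratio).
# The unit data are invented for illustration.

def ccr_efficiency_1x1(inputs, outputs):
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)                    # the efficient frontier in 1x1 DEA
    return [r / best for r in ratios]

inputs = [2.0, 4.0, 5.0]     # e.g. operating cost per unit
outputs = [4.0, 4.0, 10.0]   # e.g. service output per unit
print(ccr_efficiency_1x1(inputs, outputs))  # [1.0, 0.5, 1.0]
```

Units scoring 1.0 lie on the efficient frontier; with multiple inputs and outputs, the same idea is expressed as a linear program over weighted ratios, one solve per unit.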
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
48

Yang, Jingtao. "Quantifying the Technical Efficiency of Canadian Paratransit Systems Using Data Envelopment Analysis Method." Thesis, University of Waterloo, 2005. http://hdl.handle.net/10012/841.

Full text
Abstract:
Paratransit service operators in Canada are under increasing pressure to improve the operational productivity of their services due to increased demand and tightening financial constraints. To achieve this, paratransit operators need to know how their performance compares to peer systems and to the best practices within the industry. This will enable each operator to identify where, and by how much, improvement should be made in order to be on a par with the industry's best practices. Little research effort, however, has been devoted to the issue of how to measure and compare paratransit efficiency in a consistent and systematic manner.

This research focuses on evaluating the level of efficiency of individual paratransit systems in Canada, with the specific objective of identifying the most efficient service agencies and the sources of their efficiency. By identifying the most efficient systems along with the influencing factors, new service policies and management and operational strategies could be developed for improved resource utilization and quality of service. To achieve this objective, this research applies Data Envelopment Analysis (DEA), a mathematical-programming-based technique for determining the efficiency of individual systems as compared to their peers across multiple performance measures. Annual operating data from the Canadian Urban Transit Association for Canadian paratransit systems for the years 2001, 2002 and 2003 are used in this analysis. Regression analysis is performed to identify possible relationships between the efficiency of a paratransit system and measurable operating, managerial and other factors that could affect the performance of paratransit systems. The analysis also allows for the calculation of confidence intervals and bias for the efficiency scores in order to assess their precision.
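One common way to attach confidence intervals and a bias estimate to efficiency scores, as mentioned above, is resampling. The sketch below shows a plain bootstrap percentile interval for a mean efficiency score; it is a generic illustration under invented data, not the specific procedure used in the thesis.

```python
# Hypothetical sketch: bootstrap percentile confidence interval and bias
# estimate for a mean efficiency score. The scores below are invented.
import random

def bootstrap_mean_ci(scores, n_boot=2000, alpha=0.05, seed=0):
    rng = random.Random(seed)                    # fixed seed for repeatability
    point = sum(scores) / len(scores)            # point estimate of the mean
    boot_means = sorted(
        sum(rng.choices(scores, k=len(scores))) / len(scores)
        for _ in range(n_boot)
    )
    lo = boot_means[int(alpha / 2 * n_boot)]
    hi = boot_means[int((1 - alpha / 2) * n_boot) - 1]
    bias = sum(boot_means) / n_boot - point      # bootstrap bias estimate
    return point, bias, (lo, hi)

scores = [0.92, 0.71, 0.85, 0.64, 1.00, 0.78, 0.88, 0.59]
point, bias, (lo, hi) = bootstrap_mean_ci(scores)
print(f"mean={point:.3f}, bias={bias:+.4f}, 95% CI=({lo:.3f}, {hi:.3f})")
```

For DEA specifically, more elaborate smoothed-bootstrap procedures exist because efficiency scores are bounded at 1, but the percentile mechanics are the same.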
APA, Harvard, Vancouver, ISO, and other styles
49

Diaz, Gerardo Jr. "Analysis of 2017 Multi-Agency Field Campaign Data for Wintertime Surface Pollution in the Cache Valley of Utah." TopSCHOLAR®, 2019. https://digitalcommons.wku.edu/theses/3112.

Full text
Abstract:
Atmospheric motions resulting from rising air parcels help to scatter emissions, including PM, away from their sources, decreasing local pollution levels. However, this pattern shifts during the wintertime, as cold air damming and inversion layers create stable conditions that limit the vertical transport of air masses. Both point and area sources of emissions dot the western United States and are responsible for the vast majority of agricultural pollution in the region. At the same time, population growth has resulted in an ever-increasing amount of urban-source emissions. Entrapped PM, produced when a wide array of urban and agricultural emission species are released onto a valley floor, aggregates into particles that vary in size and can negatively affect the human respiratory system. The goal of this study was therefore to investigate the processes that lead to poor wintertime air quality in the Mountain West, and primarily in Cache Valley, which experiences some of the worst air quality in the United States during the winter season. Several results, including the observation of chemical reactions such as the production of the NO3 radical, along with the discovery of significantly high levels of DMS in an area not known for its production, suggest that the chemical behavior of Cache Valley is rather complex and plays a critical role in poor wintertime air quality. Furthermore, the presence of DMS at such high concentrations could be due to its being produced on the valley floor. We hope that these results will improve our understanding of the physical and chemical dynamics of the valley during the winter season, which will in turn aid our ability to forecast such conditions and to properly plan the future industrial and commercial projects that will inevitably be introduced into the region as it continues to grow.
APA, Harvard, Vancouver, ISO, and other styles
50

He, Yi. "An Analysis of Airborne Data Collection Methods for Updating Highway Feature Inventory." DigitalCommons@USU, 2016. https://digitalcommons.usu.edu/etd/5016.

Full text
Abstract:
Highway assets, including traffic signs, traffic signals, light poles, and guardrails, are important components of transportation networks. They guide, warn and protect drivers, and regulate traffic. To manage and maintain the regular operation of the highway system, state departments of transportation (DOTs) need reliable and up-to-date information about the location and condition of highway assets. Different methodologies have been employed to collect road inventory data. Currently, ground-based technologies are widely used to help DOTs continually update their road databases, while air-based methods are not commonly used. One possible reason is that the initial investment for air-based methods is relatively high; another is the lack of a systematic and effective approach to extracting road features from raw airborne light detection and ranging (LiDAR) data and aerial image data. However, for large-area inventories (e.g., a whole state highway inventory), the total cost of aerial mapping is actually much lower than that of other methods, considering the time and personnel needed. Moreover, unmanned aerial vehicles (UAVs) are easily accessible and inexpensive, which makes it possible to further reduce the cost of aerial mapping. The focus of this project is to analyze the capability and strengths of airborne data collection systems for highway inventory data collection. In this research, a field experiment was conducted by the Remote Sensing Service Laboratory (RSSL), Utah State University (USU), to collect airborne data. Two methodologies were proposed for data processing: an ArcGIS-based algorithm for airborne LiDAR data, and a MATLAB-based procedure for aerial photography. The results demonstrated the feasibility and high efficiency of the airborne data collection method for updating a highway inventory database.
APA, Harvard, Vancouver, ISO, and other styles