Dissertations / Theses on the topic 'GPS location data'

To see the other types of publications on this topic, follow the link: GPS location data.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 25 dissertations / theses for your research on the topic 'GPS location data.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Griffin, Terry W. "GPS CaPPture: a System for GPS Trajectory Collection, Processing, and Destination Prediction." Thesis, University of North Texas, 2012. https://digital.library.unt.edu/ark:/67531/metadc115089/.

Full text
Abstract:
In the United States, smartphone ownership surpassed 69.5 million in February 2011, with a large portion of those users (20%) downloading applications (apps) that enhance the usability of a device by adding functionality. A large percentage of apps are written specifically to utilize the geographical position of a mobile device. One of the prime factors in developing location prediction models is the use of historical data to train such a model. With larger sets of training data, prediction algorithms become more accurate; however, the use of historical data can quickly become a downfall if the GPS stream is not collected or processed correctly. Inaccurate, incomplete, or improperly interpreted historical data can make it impossible to develop accurately performing prediction algorithms. As GPS chipsets become standard in the ever-increasing number of mobile devices, the opportunity for the collection of GPS data increases remarkably. The goal of this study is to build a comprehensive system that addresses the following challenges: (1) collection of GPS data streams in a manner such that the data is highly usable and errors are reduced; (2) processing and reduction of the collected data to prepare it for the creation of prediction algorithms; (3) creation of prediction/labeling algorithms at a level where they are viable for commercial use. This study identifies the key research problems toward building the CaPPture (collection, processing, prediction) system.
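As a rough illustration of how historical trajectories can train a destination predictor, here is a generic first-order Markov baseline. This is a common starting point for history-based prediction, not the CaPPture algorithm itself; the place labels are invented.

```python
from collections import Counter, defaultdict

def train_transitions(trips):
    """Count place-to-place transitions over historical trips.

    trips: sequences of visited place labels. A generic first-order
    Markov baseline, not the thesis's own prediction algorithm.
    """
    counts = defaultdict(Counter)
    for trip in trips:
        for here, nxt in zip(trip, trip[1:]):
            counts[here][nxt] += 1
    return counts

def predict_destination(counts, current_place):
    """Return the most frequently observed next place."""
    return counts[current_place].most_common(1)[0][0]

history = [["home", "work"], ["home", "work"], ["home", "gym"]]
model = train_transitions(history)
print(predict_destination(model, "home"))
```

More training trips sharpen the transition counts, which mirrors the abstract's point that larger, cleaner historical datasets yield more accurate predictors.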
APA, Harvard, Vancouver, ISO, and other styles
2

Padmanabhan, Vijaybalaji. "Developing an operational procedure to produce digitized route maps using GPS vehicle location data." Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/32202.

Full text
Abstract:
Advancements in Global Positioning System (GPS) technology now make GPS data collection a reality for transportation studies and other transportation applications. A base map for an application can be obtained by importing a road centerline map into GIS software such as AutoCAD Map, Arc/Info, or Mapix™. However, such road centerline maps are not available for all places, so it may be necessary to collect the data using GPS units. This thesis details the use of GPS technology to produce route maps that can be used to predict the arrival time of a bus. This application is particularly useful in rural areas, since the bus headway in a rural area is generally larger than in an urban area. The information is normally communicated through various interfaces such as the internet or cable TV, based on the GPS bus location data. The objective of this thesis is to develop an operational procedure to obtain a digitized route map at any desired interval or link length and to examine the accuracy of the digitized map. The operational procedure involved data collection, data processing, algorithm development, and coding to produce the digitized route maps. An algorithm was developed to produce the digitized route map from the base map of the route; it was coded in MATLAB and can be used to digitize the base map into any desired interval of distance. An accuracy comparison is made to determine the consistency between the digitized route map and the base map.
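The digitizing step, splitting a route into points spaced at a fixed distance interval, can be sketched as a polyline-resampling routine. This is a generic sketch in Python, not the thesis's MATLAB code; coordinates are assumed to be in planar map units.

```python
from math import hypot

def resample_route(points, interval):
    """Resample a route polyline at a fixed distance interval.

    points: list of (x, y) vertices of the base-map route, in map units.
    Returns the start point plus a point every `interval` units of
    travelled distance along the polyline.
    """
    out = [points[0]]
    carry = 0.0  # distance travelled since the last emitted point
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        seg = hypot(x2 - x1, y2 - y1)
        d = interval - carry
        while d <= seg:
            t = d / seg  # fraction along this segment
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
            d += interval
        carry = (carry + seg) % interval
    return out

print(resample_route([(0.0, 0.0), (10.0, 0.0)], 2.5))
```

Any desired interval can be passed in, matching the abstract's requirement that the base map be digitizable at an arbitrary distance interval.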
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
3

Jeong, Ran Hee. "The prediction of bus arrival time using Automatic Vehicle Location Systems data." Texas A&M University, 2004. http://hdl.handle.net/1969.1/1458.

Full text
Abstract:
Advanced Traveler Information Systems (ATIS) are one component of Intelligent Transportation Systems (ITS), and a major component of ATIS is travel time information. The provision of timely and accurate transit travel time information is important because it attracts additional ridership and increases the satisfaction of transit users. The cost of electronics and components for ITS has decreased, and ITS deployment is growing nationwide. Automatic Vehicle Location (AVL) systems, which are a part of ITS, have been adopted by many transit agencies and allow them to track their transit vehicles in real time. The need for models and techniques to predict transit travel time using AVL data is therefore increasing. While some research on this topic has been conducted, more is required. The objectives of this research were (1) to develop and apply a model to predict bus arrival time using AVL data, and (2) to identify the prediction interval of bus arrival time and the probability of a bus being on time. The travel time prediction model explicitly included dwell times, schedule adherence by time period, and traffic congestion, all of which are critical to predicting accurate bus arrival times. The test bed was a bus route running through downtown Houston, Texas. A historical-data-based model, regression models, and artificial neural network (ANN) models were developed to predict bus arrival time. The ANN models performed considerably better than either the historical-data-based models or the multiple linear regression models. It was hypothesized that the ANN was able to identify the complex non-linear relationship between travel time and the independent variables, and that this led to the superior results.
Because variability in travel time (both waiting and on-board) is extremely important for transit choices, it would also be useful to extend the model to provide not only estimates of travel time but also prediction intervals. With the ANN models, prediction intervals of bus arrival time were calculated. Because ANN models are non-parametric, conventional techniques for prediction intervals cannot be used; consequently, a computer-intensive method, the bootstrap technique, was used to obtain prediction intervals of bus arrival time. On-time performance is very important to transit operators seeking to provide quality service to passengers, and measuring it requires the probability of a bus being on time. In addition to the prediction interval of bus arrival time, the probability that a given bus is on time was therefore calculated. The probability density function of schedule adherence appeared to be either the gamma distribution or the normal distribution; a chi-squared goodness-of-fit test was used to determine which distribution fits schedule adherence best. In brief, the normal distribution models schedule adherence well. With the normal distribution, the probability of a bus being on time, ahead of schedule, or behind schedule can be estimated.
APA, Harvard, Vancouver, ISO, and other styles
4

Hennessey, Daniel R. "Buses as traffic probes: Empirical investigation using GPS-based location data on the OSU Campus Area Bus Service system." Connect to resource, 2007. http://hdl.handle.net/1811/25080.

Full text
Abstract:
Thesis (Honors)--Ohio State University, 2007.
Title from first page of PDF file. Document formatted into pages: contains x, 74 p.; also includes graphics. Includes bibliographical references (p. 58-59). Available online via Ohio State University's Knowledge Bank.
APA, Harvard, Vancouver, ISO, and other styles
5

Woywitka, Robin John. "Archaeological site location data: Implications for GIS." online access from Digital Dissertation Consortium access full-text, 2002. http://libweb.cityu.edu.hk/cgi-bin/er/db/ddcdiss.pl?MQ81330.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Toledo, Moreo Rafael. "Un sistema de navegación de alta integridad para vehículos en entornos desfavorables." Doctoral thesis, Universidad de Murcia, 2006. http://hdl.handle.net/10803/10923.

Full text
Abstract:
Some current road applications, such as traveller information services, automatic emergency calls, fleet management, and electronic tolling, require a high-quality solution to the problem of positioning a land vehicle that works in any environment and at a reasonable cost. This thesis presents a solution to this problem by fusing information coming mainly from satellite navigation sensors and inertial sensors, employing a new IMM-EKF multi-sensor fusion filter. The behaviour of the system has been analysed in real and controlled environments and compared with other proposed solutions. Finally, its applicability to the stated problem has been verified.
Road applications such as traveller information, automatic emergency calls, freight management, or electronic fee collection require onboard equipment (OBE) capable of offering highly available, accurate positioning at low cost, even in unfriendly environments with low satellite visibility. Specifically, in life-critical applications, users demand from the OBE accurate, continuous positioning together with information on the reliability of that position. This thesis presents a solution based on the fusion of Global Navigation Satellite Systems (GNSS) and inertial sensors (GNSS/INS), running an Extended Kalman Filter combined with an Interacting Multiple Model method (IMM-EKF). The solution developed in this work supplies continuous positioning in marketable conditions, together with a meaningful trust level for the given position. A set of tests performed in controlled and real scenarios proves the suitability of the proposed IMM-EKF implementation, as compared with low-cost GNSS-based solutions, dead-reckoning systems, and single-model extended Kalman filter (SM-EKF) solutions.
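The core of an IMM filter bank is the likelihood-weighted update of the model probabilities. The following is a minimal sketch of that one step; a full IMM-EKF, as in the thesis, also mixes the per-model state estimates, which is omitted here, and the numbers are illustrative.

```python
import numpy as np

def imm_update_model_probs(mu, trans, likelihoods):
    """One IMM model-probability update.

    mu:          prior model probabilities, shape (m,)
    trans:       Markov switching matrix, trans[i, j] = P(model j | model i)
    likelihoods: measurement likelihood under each model's EKF, shape (m,)
    """
    predicted = trans.T @ mu           # model probabilities before the measurement
    posterior = likelihoods * predicted
    return posterior / posterior.sum() # renormalize

mu = np.array([0.5, 0.5])                      # e.g. smooth-driving vs. manoeuvring model
trans = np.array([[0.95, 0.05],
                  [0.05, 0.95]])               # models tend to persist
lik = np.array([0.1, 0.9])                     # measurement fits model 2 better
print(imm_update_model_probs(mu, trans, lik))
```

Each new measurement shifts probability mass toward the motion model that explains it best, which is what lets the filter adapt in unfriendly environments.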
APA, Harvard, Vancouver, ISO, and other styles
7

Jurečka, Jan. "Analýza BI dat pomocí geografického systému." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-197063.

Full text
Abstract:
The topic of this Master's thesis is the presentation of Business Intelligence data using maps. Through the integration of BI and geographic information systems a new discipline is emerging: Location Intelligence. The main goal of the thesis is to highlight and analyse the map-based reporting possibilities of BI tools. The theoretical part is dedicated to the foundations and principles of geographic information systems and their intersection with BI, out of which the field of Location Intelligence arises. In the practical part, the BI tools IBM Cognos and Oracle BI are compared. The comparison is based on the following criteria: field of implementation, visualization, external map sources, and performance. The evaluation criteria and the evaluation method are defined at the beginning of the practical part. Methods of analysis and information collection were used to extract and revise knowledge from specific electronic and printed sources in Czech and English. Sources for the practical part originate from the author's technical knowledge of BI as well as practical experience with implementing map sources as a feature of Business Intelligence. Statistical methods are used to evaluate the criteria results. The practical and theoretical value of the thesis lies in a clear comparison of the implementation of map sources in the selected BI tools and of the options for reporting and visualizing BI data over maps. Besides the comparison, a framework for implementing maps in the selected BI tools is established.
APA, Harvard, Vancouver, ISO, and other styles
8

Almuzaini, Khalid. "Qualitative modelling of place location on the linked data web and GIS." Thesis, Cardiff University, 2017. http://orca.cf.ac.uk/106368/.

Full text
Abstract:
When asked to define where a geographic place is, people normally resort to qualitative expressions of location, such as north of and near to. This is evident in the domain of social geography, where qualitative research methods are used to gauge people's understanding of their neighbourhood. Using a GIS to represent and map the location of neighbourhood boundaries is needed to understand and compare people's perceptions of the spatial extent of their neighbourhoods, and extending the GIS to allow the qualitative modelling of place makes such representation and mapping possible. On the other hand, the collaborative definition of place on the web results in the accumulation of large sets of data resources that can be considered "location-poor", where place location is defined mostly by single point coordinates and some arbitrary combinations of relative spatial relationships. A qualitative model of place location on the Linked Data Web (LDW) allows for the homogeneous representation of, and reasoning over, place resources. This research has analysed the qualitative modelling of place location on the LDW and in GIS. On the LDW, a qualitative model of place is proposed that provides an effective representation of individual place location profiles, allowing place information to be enriched and spatially linked. This has been evaluated by applying qualitative spatial reasoning (QSR) to reason automatically over place profiles, both to check the completeness of the representation and to derive implicit links not defined by the model. In GIS, a qualitative model of place is proposed that provides a basis for mapping qualitative definitions of place location, and this has been evaluated using an implementation-driven approach: the model has been implemented in a GIS and demonstrated through a realistic case study.
A user-centric approach to development has been adopted, as users were involved throughout the design, development and evaluation stages.
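A toy example of the kind of automatic inference QSR enables: cardinal-direction relations such as "north of" compose transitively, so implicit links can be derived from stated ones. This is a minimal sketch of relation composition, not the thesis's model, and the place names are invented.

```python
def compose(rel1, rel2):
    """Compose two cardinal-direction relations: if A is rel1 of B and
    B is rel2 of C, return the inferred relation of A to C when the
    directions agree, else None. Real QSR composition tables cover
    every pair of relations; this sketch handles only the identical case.
    """
    return rel1 if rel1 == rel2 else None

# Stated facts: the castle is north of the market, the market north of the harbour.
facts = {("castle", "market"): "north", ("market", "harbour"): "north"}
inferred = compose(facts[("castle", "market")], facts[("market", "harbour")])
print(inferred)  # an implicit link: castle north of harbour
```

Deriving such implicit links over whole place profiles is what the evaluation uses to check the completeness of the representation.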
APA, Harvard, Vancouver, ISO, and other styles
9

Jalali, Jalal. "Artificial neural networks for reservoir level detection of CO₂ seepage location using permanent down-hole pressure data." Morgantown, W. Va. : [West Virginia University Libraries], 2010. http://hdl.handle.net/10450/11137.

Full text
Abstract:
Thesis (Ph. D.)--West Virginia University, 2010.
Title from document title page. Document formatted into pages; contains xii, 140 p. : ill. (some col.), col. maps. Includes abstract. Includes bibliographical references (p. 99-104).
APA, Harvard, Vancouver, ISO, and other styles
10

Zhou, Guoqing. "Co-Location Decision Tree for Enhancing Decision-Making of Pavement Maintenance and Rehabilitation." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/26059.

Full text
Abstract:
A pavement management system (PMS) is a valuable tool and one of the critical elements of the highway transportation infrastructure. A vast amount of pavement data is frequently and continuously collected, updated, and exchanged due to rapidly deteriorating road conditions, increased traffic loads, and shrinking funds, resulting in the rapid accumulation of a large pavement database; knowledge-based expert systems (KBESs) have therefore been developed to solve various transportation problems. This dissertation presents the development of the theory and algorithm for a new decision tree induction method, called the co-location-based decision tree (CL-DT), which will enhance the decision-making abilities of pavement maintenance personnel and their rehabilitation strategies. The idea stems from shortcomings of traditional decision tree induction algorithms when applied to pavement treatment strategies. The proposed algorithm utilizes the co-location (co-occurrence) characteristics of spatial attribute data in the pavement database: one distinct event occurrence can be associated with two or more attribute values that occur simultaneously in the spatial and temporal domains. The dissertation describes the details of the proposed CL-DT algorithm and the steps for realizing it. First, it describes the detailed co-location mining algorithm, including spatial attribute data selection in pavement databases, the determination of candidate co-locations, the determination of table instances of candidate co-locations, the pruning of non-prevalent co-locations, and the induction of co-location rules. In this step, a hybrid constraint is developed, consisting of a spatial geometric distance constraint and a distinct event-type constraint.
The spatial geometric distance constraint is a neighborhood-relationship-based spatial join of the table instances of many prevalent co-locations with one prevalent co-location, while the distinct event-type constraint is a Euclidean distance between a set of attributes and its corresponding cluster centre of attributes. The dissertation also develops a spatial feature pruning method using a multi-resolution pruning criterion: the cross-correlation criterion of spatial features is used to remove non-prevalent co-locations from the candidate prevalent co-location set under a given threshold. The research then focuses on the development of the co-location decision tree (CL-DT) algorithm itself, which includes non-spatial attribute data selection in the pavement management database, co-location algorithm modelling, node merging criteria, and co-location decision tree induction. In this step, co-location mining rules are used to guide decision tree generation and to induce decision rules. For each step, the dissertation gives detailed flowcharts, such as the flowchart of co-location decision tree induction, the co-location/co-occurrence decision tree (CL-DT) algorithm, and an outline of the steps of the SFS (Sequential Feature Selection) algorithm. Finally, the research uses a pavement database covering four counties, provided by NCDOT (North Carolina Department of Transportation), to verify and test the proposed method, comparing the rehabilitation treatments proposed by NCDOT, by the traditional DT induction algorithm, and by the proposed new method.
Findings and conclusions include: (1) traditional DT technology can make consistent decisions for road maintenance and rehabilitation strategies under the same road conditions, i.e., with less interference from human factors; (2) traditional DT technology can speed up decision-making, because it automatically generates a decision tree and rules once the expert knowledge is given, saving time and expense for the PMS; (3) integration of DT and GIS can provide the PMS with the capability to graphically display treatment decisions, visualize attribute and non-attribute data, and link data and information to geographical coordinates. However, traditional DT induction methods are not quite as intelligent as one might expect, so post-processing and refinement are necessary. Moreover, traditional DT induction methods for pavement M&R strategies use only non-spatial attribute data; this dissertation demonstrates that spatial data is very useful for improving decision-making processes for pavement treatment strategies. In addition, the decision trees are based on knowledge acquired from pavement management engineers for strategy selection; thus, different decision trees can be built if the requirements change.
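The spatial geometric distance constraint amounts to a neighbourhood test between events of distinct types. The following is a minimal sketch of finding candidate co-located pairs under such a constraint; it illustrates the neighbourhood idea only, not the dissertation's full mining algorithm, and the event types are invented.

```python
from math import hypot

def colocation_pairs(points, max_dist):
    """Find pairs of events of *distinct* types lying within a geometric
    distance threshold -- the neighbourhood constraint used in
    co-location mining.

    points: list of (event_type, x, y) tuples.
    """
    pairs = []
    for i, (t1, x1, y1) in enumerate(points):
        for t2, x2, y2 in points[i + 1:]:
            if t1 != t2 and hypot(x2 - x1, y2 - y1) <= max_dist:
                pairs.append((t1, t2))
    return pairs

# Illustrative pavement events with planar coordinates:
events = [("cracking", 0, 0), ("rutting", 0, 1), ("cracking", 10, 10)]
print(colocation_pairs(events, max_dist=2.0))
```

Candidate pairs found this way would then be filtered by a prevalence measure before inducing co-location rules, as the mining algorithm above describes.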
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
11

Fekih, Hassen Wiem. "A ubiquitous navigation service on smartphones." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI006.

Full text
Abstract:
Pedestrian navigation is a growing research field that aims to develop services providing continuous positioning and navigation for people both outdoors and inside buildings. In this thesis, we propose a prototype service for ubiquitous pedestrian navigation that takes into account the user's preferences and the best positioning technology available. Our main objective is to continuously estimate the position of a pedestrian carrying a smartphone. First, we propose a new algorithm, named UCOSA, which selects the positioning technology to adopt at any point along the navigation process. The UCOSA algorithm begins by inferring, using fuzzy logic, whether a handover (technology change) between the detected positioning technologies is needed (i.e. when their coverage areas overlap). It then selects the optimal technology using a function that computes a score for each available technology and consists of two parts: the first part holds the weights, computed using the Analytic Hierarchy Process (AHP), while the second part provides the normalized values of the parameters considered. The UCOSA algorithm also integrates the dead-reckoning positioning technique known as PDR to improve the computation of the smartphone's position. Second, we turn to the RSS fingerprinting positioning technique, whose principle is to compute the smartphone's position by comparing RSS values recorded in real time with RSS values stored in a database (radiomap). Most radiomaps are represented as grids of reference points (RPs).
We propose a new radiomap design that adds further RPs at the centre of gravity of each grid square. Third, we address the problem of constructing the graph that models a multi-storey building. We propose an algorithm that first creates a planar graph for each floor separately and then connects the floors with vertical links. Finally, we study a new algorithm, named SIONA, which continuously computes and displays the path between two points located inside or outside a building. Several real-world experiments were carried out to evaluate the performance of the proposed algorithms, with promising results in terms of the continuity and accuracy (around 1.8 m) of the navigation service.
Pedestrian navigation is a growing research field that aims at developing services and applications ensuring the continuous positioning and navigation of people inside and outside covered areas (e.g. buildings). In this thesis, we propose a ubiquitous pedestrian navigation service based on user preferences and the most suitable available positioning technology (e.g. WiFi, GNSS). Our main objective is to continuously estimate the position of a pedestrian carrying a smartphone equipped with a variety of technologies and sensors. First, we propose a novel positioning-technology selection algorithm, called UCOSA, for a complete ubiquitous navigation service in indoor and outdoor environments. The UCOSA algorithm starts by inferring, using a fuzzy logic technique, the need for a handover between the available positioning technologies in overlapping coverage areas. If a handover is required, a score is calculated for each captured radio frequency (RF) positioning technology. The score function consists of two parts: the first part represents the user-preference weights computed with the Analytic Hierarchy Process (AHP), whereas the second part provides the normalized values of the user requirements. The UCOSA algorithm also integrates the Pedestrian Dead Reckoning (PDR) positioning technique throughout the navigation process to enhance the estimation of the smartphone's position. Second, we focus on the RSS fingerprinting positioning technique, the most widely used technique, whose principle is to return the smartphone's position by comparing the RSS values recorded in real time with a radiomap (i.e. a database of previously stored RSS values). Most radiomaps are organized as a grid of reference points (RPs); we propose a new radiomap design that complements the grid with additional RPs located at the centre of gravity of each grid square. Third, we address the challenge of constructing a graph for a multi-floor building.
We propose an algorithm that starts by creating the horizontal graph of each floor separately and then adds vertical links between the different floors. Finally, we implement a novel algorithm, called SIONA, that calculates and displays in a continuous manner the pathway between two distinct points located indoors or outdoors. We conducted several real experiments on the campus of the University of Passau in Germany to evaluate the performance of the proposed algorithms; they yield promising results in terms of the continuity and accuracy (around 1.8 m indoors) of the navigation service.
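The score function described above (AHP-derived preference weights times normalized parameter values) reduces to a weighted sum per candidate technology, with the highest-scoring one selected. The following is a minimal sketch of that selection step; the criteria, weights, and values are made up for illustration and are not the actual UCOSA parameters.

```python
def select_technology(weights, candidates):
    """Pick the positioning technology with the highest weighted score.

    weights:    criterion -> AHP-derived weight (should sum to 1)
    candidates: technology -> {criterion: normalized value in [0, 1]}
    """
    def score(values):
        return sum(weights[c] * values[c] for c in weights)
    return max(candidates, key=lambda tech: score(candidates[tech]))

# Illustrative criteria and normalized values, not UCOSA's real ones:
weights = {"accuracy": 0.5, "availability": 0.3, "energy": 0.2}
candidates = {
    "WiFi": {"accuracy": 0.7, "availability": 0.9, "energy": 0.6},
    "GNSS": {"accuracy": 0.9, "availability": 0.3, "energy": 0.4},
}
print(select_technology(weights, candidates))
```

Indoors, where GNSS availability collapses, a weighted score of this shape naturally shifts the handover decision toward an RF technology such as WiFi.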
APA, Harvard, Vancouver, ISO, and other styles
12

Kubbara, Fawzi Saeed. "Geographic Data in City Planning Departments: The Volume and Use Related to Advancements in Geographic Information Systems (GIS) Technology." PDXScholar, 1992. https://pdxscholar.library.pdx.edu/open_access_etds/1352.

Full text
Abstract:
Many local planning departments have acquired and put into use advanced automated geocoding and Geographic Information Systems (GIS) to store, process, map and analyze geographic data. GIS technological advancements in hardware, software, and geographic databases - specifically, in geocoding methods to reference street address data to geographic locations - enable data to be integrated, mapped, and analyzed more efficiently and effectively. Also, technological advancements depend on organizational and institutional environments. The relationships between technological advancements and technical (data mapping and analysis), organizational, and institutional environments are not clear. The purpose of this study is to explain these relationships to help planning and development directors make better decisions in acquiring and using advanced geocoding and GIS technology. The findings are based on a mail survey of planning and development departments in cities with populations of 50,000 or more in the United States. The study found that planning departments with advanced geocoding and GIS technology are capable of conducting advanced geocoding applications. Data can be tabulated, aggregated, linked, and modeled for mapping and planning. Geocoding to aggregate data to small geographic areas helps by providing required and up-to-date information to solve urban problems. However, the study did not find that advanced geocoding systems enhance data quality as measured by spatial resolution and volume. Further studies are needed to explore this issue. The adoption and implementation of advanced geocoding and GIS technology are influenced by organizational and institutional environments. Large cities have more experience with hardware, software programs, computer professionals, and training programs, but they are dependent on centralized systems from an earlier computer era. 
Consequently, more recent entrants to computerized geographic data processing are advancing rapidly. As the technology becomes more advanced, hardware and software costs are declining; some organizational and institutional issues are eliminated while new ones emerge. As a result, smaller cities are adopting advanced geocoding and GIS technology more rapidly than before, and sometimes surpass large cities. This study improves understanding of automated street-address geocoding methods and how these methods relate to advancements in GIS technology. It also examines how technical, organizational, and institutional environments are interrelated in adopting and using geocoding and GIS technology. The challenge in the 1990s will not be how to fund and acquire a GIS, but how to integrate all of the pieces so that the technology works properly.
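The street-address geocoding the study examines classically works by interpolating a house number along a street segment's address range. The following is a minimal sketch of that interpolation; real geocoders also handle address parity, side-of-street offsets, and segment matching, all omitted here.

```python
def interpolate_address(house_no, from_no, to_no, start_xy, end_xy):
    """Address-range interpolation geocoding.

    Estimate a coordinate for `house_no` along a street segment whose
    address range runs from `from_no` (at start_xy) to `to_no` (at end_xy),
    by linear interpolation of the house number within the range.
    """
    frac = (house_no - from_no) / (to_no - from_no)
    (x1, y1), (x2, y2) = start_xy, end_xy
    return (x1 + frac * (x2 - x1), y1 + frac * (y2 - y1))

# House 150 on a block numbered 100-200, segment endpoints in map units:
print(interpolate_address(150, 100, 200, (0.0, 0.0), (100.0, 0.0)))
```

It is exactly this referencing of tabular address data to coordinates that lets planning departments aggregate records to small geographic areas for mapping and analysis.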
APA, Harvard, Vancouver, ISO, and other styles
13

Huňa, Tomáš. "Využití mapových podkladů při řešení reportingu." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-124674.

Full text
Abstract:
The main objective of this thesis is to describe working with spatial data and its use in reports. The author focuses mainly on the MS SQL Server 2008 R2 platform. The first part deals with theoretical foundations: it defines the concept of Location Intelligence and describes its history to the present, and it also addresses the benefits of Location Intelligence in various sectors and its relationship with Business Intelligence. The theoretical part then describes spatial data, both at a general level and at the level of MS SQL Server 2008 R2, along with the use of ESRI Shapefile files and how to work with them. From the description of spatial data the author moves on to Microsoft's reporting tools, SQL Server Reporting Services and Report Builder. Possible forms of map visualization and a detailed description of their settings close the theoretical part. The second part is focused on solving practical problems: the tools described in the theoretical part are used to create the example reports that the author set out to create at the beginning of that section.
APA, Harvard, Vancouver, ISO, and other styles
14

Adrian, Jorge Isaac. "Applicability of rock physics models in conjunction with seismic inverted data to characterize a low poro-perm gas-bearing sandstone reservoir for well location optimization, Bredasdorp Basin, SA." Master's thesis, University of Cape Town, 2015. http://hdl.handle.net/11427/19963.

Full text
Abstract:
The primary focus of this dissertation is to develop a predictive rock physics theory that establishes relations between rock properties and the observed seismic response, and to present the results of different seismic characterization techniques used to interpret a tight gas sand reservoir off the south coast of South Africa, taking rock physics analysis and inverted seismic outcomes as input. To achieve the aims of this study, a workflow involving three main processes was implemented: (1) rock physics modelling, (2) simultaneous seismic inversion, and (3) seismic reservoir characterization. First, a rock physics model was generated as a bridge between the seismic observables (density, Vp and Vs) and reservoir parameters such as fluid content, porosity and mineralogy. In situ and perturbational log-derived forward modelling was performed; both were used to generate synthetic seismic gathers, which in turn were used to study the AVA attribute responses. Overall, the seismic effect of fluid fill in this tight gas sand is modest compared with the effect of porosity changes. Second, a workflow was implemented to simultaneously invert P and S pre-stack seismic data. The derived elastic properties (acoustic impedance, Vp/Vs and density) were then combined with the rock physics analysis to characterize the reservoir seismically. The predicted acoustic impedance and Vp/Vs volumes tie well with the log data; the density outcome, however, was of limited quality compared with the other two. Finally, using the outcomes of the rock physics analysis and the inverted data, four seismic characterization techniques were applied.
The techniques are: (1) AVO cross-plotting, to derive a good-facies property based on AVO attributes (intercept, gradient) and rock physics in the study area; (2) rock physics templates (RPTs), to compute discrete rock property volumes (litho-Sw, litho-porosity) using a collection of curves covering all possible "what if" lithology/fluid-content/porosity scenarios for the reservoir together with the inverted data; (3) a lithological classification, to calculate litho-facies probability volumes based on petrophysical cut-offs, multivariate probability density functions (PDFs) and the inverted data; and (4) an extended elastic impedance (EEI) inversion, to derive rock property volumes (Vclay, porosity) from the AVO attributes (intercept, gradient). Despite differences in the input and theory behind each technique, all outcomes share parallels in the distribution of good and poor facies, or reservoir and non-reservoir zones.
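The fluid-substitution step at the heart of such perturbational forward modelling is commonly done with Gassmann's equation. A minimal Python sketch, with illustrative moduli and porosity (not values from the thesis):

```python
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus via Gassmann's equation (all moduli in GPa)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

# Illustrative values: quartz matrix, 8% porosity tight sand.
k_min = 37.0   # mineral (quartz) bulk modulus, GPa
k_dry = 20.0   # dry-rock bulk modulus, GPa
phi = 0.08     # porosity

k_brine = gassmann_ksat(k_dry, k_min, 2.8, phi)   # brine-filled
k_gas = gassmann_ksat(k_dry, k_min, 0.04, phi)    # gas-filled
```

At such low porosity the fluid term is limited, in line with the abstract's observation that porosity changes dominate the seismic response of this tight sand.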
APA, Harvard, Vancouver, ISO, and other styles
15

Belka, Kamila. "Multicriteria analysis and GIS application in the selection of sustainable motorway corridor." Thesis, Linköping University, Department of Computer and Information Science, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-4399.

Full text
Abstract:

Effects of functioning transportation infrastructure are receiving more and more environmental and social concern nowadays. Nevertheless, preliminary corridor plans are usually developed on the basis of technical and economic criteria exclusively. By the time of the environmental impact assessment (EIA) that follows, relocation is practically impossible and only preventative measures can be applied.

This paper proposes a GIS-based method of delimiting a motorway corridor that integrates social, environmental and economic factors into the early stages of planning. Multiple criteria decision making (MCDM) techniques are used to assess all possible alternatives, and a weighted shortest-path algorithm run within the GIS locates the corridor. The evaluation criteria are exemplary; they include nature conservation, buildings, forests and agricultural resources, and soils. The resulting evaluation surface is divided into a grid of cells, each assigned a suitability score derived from all evaluation criteria. Subsequently, a set of adjacent cells connecting two pre-specified points is traced by the least-cost path algorithm; the best alternative has the lowest total suitability score.

As a result, the proposed motorway corridor is routed from origin to destination and then compared with an alternative derived by traditional planning procedures. The concluding remarks are that the location criteria need to be adjusted to meet construction requirements, and that the analysis process should be automated. Nevertheless, the geographic information system and the embedded shortest-path algorithm proved well suited for preliminary corridor location analysis. Future research directions are sketched.
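The cell-based corridor tracing described in this abstract maps directly onto Dijkstra's algorithm over a cost grid. A self-contained sketch (the grid values are invented for illustration):

```python
import heapq

def least_cost_path(grid, start, goal):
    """Dijkstra over a grid of per-cell costs (4-neighbour moves).
    Returns (total_cost, path); the cost counts every cell entered,
    including the start cell."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return dist[goal], path[::-1]

# Toy suitability surface: high values = environmentally costly cells.
grid = [
    [1, 1, 9, 1],
    [9, 1, 9, 1],
    [9, 1, 1, 1],
]
cost, path = least_cost_path(grid, (0, 0), (2, 3))
```

The corridor threads through the low-cost cells, exactly as the paper's least-cost path traces adjacent cells between the two pre-specified points.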

APA, Harvard, Vancouver, ISO, and other styles
16

Mölder, Mikael. "A Mobile Platform for Measuring Air Pollution in Cities using Gas Sensors." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232121.

Full text
Abstract:
Although air pollution is one of the largest threats to human health, the data available to the public is often sparse, and neither very accurate nor up to date. For example, there are only about 5-10 air-quality measuring points across the city of Stockholm. This means that the available data is good in close proximity to the sensing equipment but can differ considerably only a couple of blocks away. For individuals to receive up-to-date information across a larger city, stationary measurements are not sufficient to give a clear picture of the current state of air quality; other methods of collecting this data are needed, for instance by making the measurements mobile. GOEASY is a project financed by the European Commission in which Galileo, Europe's new navigation service, is used to enable more location-based service applications. Part of the GOEASY project is the evaluation of the potential of collaborative applications in which users are engaged to help individuals affected by breathing-related diseases such as asthma. This thesis presents the choice of architecture and the implementation of a mobile platform serving this purpose. Using sensors mounted on a range of objects, real-time air-quality data is collected and made available. The result is a mobile platform and a connected Android application which, using air-quality sensors, reports pollution measurements together with positional coordinates to a central server. Thanks to the features of the underlying systems used, the platform is accurate and more resilient to exploits than traditional location-based services available today. The result allows individuals with respiratory conditions to receive much more accurate and up-to-date information at a higher resolution. It also serves to demonstrate the potential of the supporting technology as part of the GOEASY project.
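The reporting pipeline the thesis describes — a sensor sample plus positional coordinates sent to a central server — can be sketched as a simple geo-tagged payload. The field names here are illustrative, not the GOEASY wire format:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PollutionReading:
    """One geo-tagged air-quality sample as a mobile sensor might report it.
    Field names are invented for illustration."""
    device_id: str
    lat: float
    lon: float
    co2_ppm: float
    timestamp: float

    def to_json(self) -> str:
        return json.dumps(asdict(self))

# A sample taken in central Stockholm, serialised for upload.
reading = PollutionReading("sensor-01", 59.3293, 18.0686, 412.0, time.time())
payload = reading.to_json()  # body of the POST to the central server
```

A real deployment would add authentication and the Galileo-derived position integrity data the project emphasises; this only shows the shape of a single report.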
APA, Harvard, Vancouver, ISO, and other styles
17

Batsi, Evangelia. "Micro-seismicity and deep seafloor processes in the Western Sea of Marmara : insights from the analysis of Ocean Bottom Seismometer and Hydrophone data." Thesis, Brest, 2017. http://www.theses.fr/2017BRES0090/document.

Full text
Abstract:
Since the devastating earthquakes of 1999, east of Istanbul, the submerged section of the North Anatolian Fault (NAF) in the Sea of Marmara (SoM) has been intensively monitored, mainly using land stations. Still, the micro-seismicity remains poorly understood. In addition, although the connection of the SoM with the hydrocarbon gas system of the Thrace Basin is now well established, along with the presence of widespread gas within the sedimentary layers, the role of gas in seismicity is still not recognized. Here, we have analyzed Ocean Bottom Seismometer (OBS) data from two deployments (April-July 2011 and September-November 2014) in the western SoM. Based on a high-resolution 3D velocity model and on non-linear methods (NonLinLoc), our location results show that a large part of the micro-seismicity occurs at shallow depths (< 6 to 8 km): along secondary faults inherited from the complex history of the North Anatolian shear zone, or within the uppermost (< 1 km), gas-rich sediment layers. Part of this ultra-shallow seismicity is likely triggered by the deep earthquakes of intermediate magnitude (Ml > 4.5) that frequently occur along the western segments of the MMF. In addition, the OBSs record at least two families of short-duration (< 1 s) events (SDEs): (1) "background SDEs", occurring permanently at a rate of a few tens of SDEs per day and resulting from many possible local causes, e.g. degassing from the seafloor, biological activity near the seabed, or bioturbation; and (2) "swarmed SDEs", some of which are also recorded on the hydrophone and are characterized by a periodicity of ~1.8 seconds. The causes of these SDEs remain to be determined (candidates include anthropogenic sources, marine mammals, gas emissions, regional seismicity, and tremors from the MMF).
APA, Harvard, Vancouver, ISO, and other styles
18

Bento, Miguel José Candeias. "User behaviour identification based on location data." Master's thesis, 2020. http://hdl.handle.net/10071/21711.

Full text
Abstract:
Over the years there has been an almost exponential increase in the use of new technologies in various sectors. The main objective of these technologies is to improve or facilitate our daily life. This study focuses on one such technology within a theme that has been widely discussed over the last few years: the use of personal data to identify certain types of behaviour. More specifically, this study primarily uses the GPS data stored in the Google accounts of nine volunteers to identify the places they frequent most, also known as Points of Interest (POI). The same data is also used to identify the trajectories each volunteer covers most often. A study was carried out with the sample of nine participants, sending each their maps with POI and trajectories and thereby obtaining their validation. It was thus possible to conclude that the best way to identify POI is to build daily clusters using DBSCAN; for trajectories, the Snap-to-Road method gave the best results. The initial problem could therefore be answered: a method was found that successfully identifies most POI as well as some trajectories. Building on this work, there is a clear opportunity to improve the algorithms and processes that still have limitations, and with this in mind it is possible to develop more effective solutions.
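The daily-cluster POI step can be sketched with a from-scratch DBSCAN over geographic fixes. The eps, min_samples and sample coordinates below are invented for illustration:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def dbscan(points, eps_m, min_samples):
    """Minimal DBSCAN; returns one label per point (-1 = noise)."""
    n = len(points)
    labels = [None] * n
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        neigh = [j for j in range(n) if haversine_m(points[i], points[j]) <= eps_m]
        if len(neigh) < min_samples:
            labels[i] = -1  # noise (may later become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neigh)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise point absorbed as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = [k for k in range(n) if haversine_m(points[j], points[k]) <= eps_m]
            if len(jn) >= min_samples:
                seeds.extend(jn)  # j is a core point: expand through it
    return labels

# Two tight clusters of fixes (a "home" and a "work" POI) plus one stray fix.
fixes = [(38.74, -9.15), (38.7401, -9.1501), (38.74005, -9.15005),
         (38.75, -9.16), (38.7501, -9.1601), (38.75005, -9.16005),
         (38.90, -9.00)]
labels = dbscan(fixes, eps_m=50.0, min_samples=3)
```

Running this once per day of data, as the study does, yields the recurring dense clusters that are then interpreted as POI.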
APA, Harvard, Vancouver, ISO, and other styles
19

Ho, Min-Hau, and 何明浩. "Performance Evaluation of The Mobile Location With A-GPS and GSM Data In Urban Area." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/46122811948208790304.

Full text
Abstract:
Master's thesis
National Taiwan Ocean University
Department of Navigation and Communications
ROC academic year 92
Assisted-GPS (A-GPS) technology overcomes the downsides of the conventional GPS solution and achieves high location accuracy at reasonable cost. The assistance to the mobile phone trying to determine its own location comes from the network over the air interface. What makes this technology work so well is that the wireless network, using its own GPS receivers as well as an estimate of the mobile's location down to cell/sector, can predict with great accuracy the GPS signal the handset will receive and convey that information to the mobile. With this assistance the size of the search space is greatly reduced, and the time-to-first-fix (TTFF) is shortened from minutes to a second or less. GPS receivers are sometimes unreliable or unusable in circumstances such as urban canyons, because tall buildings block (or mask) the transmitted signals and the number of satellites in view is often not enough for the receiver to obtain a position solution. A-GPS receivers enhance the sensitivity of receivers in detecting GPS signals; however, improvements through hardware alone are somewhat limited. The author suggests that by modifying the A-GPS server system software, receivers can still work normally in certain adverse environments in which they used to fail. The main idea is that, when the number of satellites in view is not enough, the A-GPS server sends calculated ranges between the serving base station and the assisting satellites, and thereby helps to locate users. The performance of the proposed method, as analyzed with numerical simulations, is promising and reasonably feasible.
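Once ranges to known reference points are available — whether from satellites or from the base-station-derived ranges the author proposes — a position fix is a non-linear least-squares problem. A toy 2-D Gauss-Newton sketch (not the thesis's actual algorithm; real receivers solve 3-D pseudo-ranges plus a clock-bias term):

```python
import math

def trilaterate(anchors, ranges, guess=(0.0, 0.0), iters=20):
    """Gauss-Newton least-squares position fix from ranges to known anchors."""
    x, y = guess
    for _ in range(iters):
        # Accumulate the 2x2 normal equations J^T J * step = J^T r by hand.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (ax, ay), rng in zip(anchors, ranges):
            dx, dy = x - ax, y - ay
            d = math.hypot(dx, dy) or 1e-9
            res = rng - d              # measured minus modelled range
            jx, jy = dx / d, dy / d    # partial derivatives of the range
            a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
            b1 += jx * res; b2 += jy * res
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-12:
            break  # degenerate geometry
        x += (a22 * b1 - a12 * b2) / det
        y += (a11 * b2 - a12 * b1) / det
    return x, y

# Three reference points and exact ranges from a "true" position (30, 40):
anchors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
ranges = [math.dist((30.0, 40.0), a) for a in anchors]
x, y = trilaterate(anchors, ranges, guess=(50.0, 50.0))
```

The point of the thesis's proposal is visible here: each extra range (even one synthesised by the server) adds a row to the least-squares system, which is what keeps the fix solvable when few satellites are in view.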
APA, Harvard, Vancouver, ISO, and other styles
20

Liu, Yi Hung, and 劉奕宏. "Coffee shop location analysis using GIS and data mining techniques." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/dwufvs.

Full text
Abstract:
Master's thesis
National Chengchi University
Department of Computer Science
ROC academic year 100
The number of customers of coffee shop chains has grown steadily in recent years, causing the market size as well as the total consumption value to increase rapidly and continuously. Competition among chain coffee stores has intensified under the traditional profit-oriented management style. It is therefore crucial to make the correct decisions when selecting coffee shop locations and formulating operating strategies for new shops. Traditionally, site selection takes a great amount of time and human resources for collecting relevant information, conducting field visits and evaluating sites, and complex locational factors are seldom considered. Hence, a mechanism for analyzing site selection and evaluating profitability, helping investors to produce better profit and improve the chance of success, would be a major contribution. The goal of this thesis is to provide recommendations that improve the success rate of chain coffee shop site selection. Based on the successful site selection strategies of the coffee market leaders, we analyzed the correlation of population and economic-activity variables to identify the key factors in successful site selection. We also used data mining techniques to construct classification models for site selection, and analyzed the competitive relations between the two leading chain coffee brands using a geographic information system to derive recommendations for new sites. Shop rental information for Taipei City was used to explore and evaluate the models recommended by our mechanism. The experimental results show that prediction through the classification models can achieve a 70% success rate, indicating that our mechanism effectively improves the success rate of site selection. Moreover, the spatial analysis of site selection between competitors is helpful in providing appropriate site selection strategies.
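The correlation screening mentioned in the abstract reduces to computing Pearson coefficients between candidate site factors and shop performance. A small sketch with invented numbers (the thesis's actual variables and data are not reproduced here):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented example: nearby working population (thousands) versus a shop
# performance score for eight candidate sites.
population = [12, 35, 8, 50, 22, 41, 15, 30]
score = [55, 70, 40, 90, 60, 85, 50, 68]
r = pearson(population, score)
```

Factors with a strong |r| against performance are the ones worth feeding into the classification models as site selection features.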
APA, Harvard, Vancouver, ISO, and other styles
21

Huang, Mu-Ching, and 黃木清. "Application of SVM for Data Analysis to Increase the Accuracy Method for GPS Fault Locator in a Transmission Line." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/29099794178999993950.

Full text
Abstract:
Master's thesis
National Ilan University
In-service Master's Program, College of Electrical Engineering and Computer Science
ROC academic year 103
The power transmission systems in Taiwan consist mainly of overhead lines, most of which are located in the suburbs or in sparsely populated places. Consequently, they are vulnerable to the natural environment, which may result in outages due to faults in the transmission line. Without a fault detection system and device, the fault location and related information cannot be obtained effectively. This not only wastes manpower and resources but also affects the recovery and power supply schedule, which can severely impact the economic and industrial development of an area and inconvenience the community. Moreover, for installed fault detection systems, the primary issue is how to make full use of their accuracy and reliability. Considering the cost, it is impossible to install fault detectors densely along every transmission line: doing so would mean not only a high setup cost but also higher maintenance costs. Instead, fault detectors can be set up at both ends of the transmission line and combined with a fibre-optic communication link, computing capability and a signal processing unit. A GPS-based locator timestamps the travelling-wave signals at the two terminals, and processing these measurements yields the location and related information of the transmission line fault. In this way the maintenance personnel can be directed to the correct fault location, so the fault can be fixed within the shortest time. However, for a long outdoor transmission line the result is affected by many factors, such as conductor impedance, seasonal weather and temperature, and the accuracy of the components in the detector. All of these may produce large errors in the fault location, causing trouble for the maintenance personnel and delaying the repair schedule. This thesis explores the positioning accuracy of the fault locator based on the fault data generated by the GPS transmission-line fault locator currently used at the Heping Thermal Power Plant. It applies SVM data analysis to collect and classify the correct and stable fault data with reference values, so as to calculate and discuss the signal error correction value, and then improves and adjusts the parameters of the fault locator so that the positioning of transmission line faults becomes more accurate and detailed. This allows the maintenance personnel to determine the fault location effectively and quickly, fix the problem immediately, and in turn reduce the losses caused by power supply interruptions.
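The two-terminal scheme described above is the standard double-ended travelling-wave method: GPS-synchronised arrival times of the fault wavefront at the two line ends give the fault distance directly. A sketch (the wave speed is a typical figure for overhead lines, not a plant-specific value):

```python
def fault_distance_km(line_len_km, t_a_us, t_b_us, wave_speed_km_per_us=0.29):
    """Double-ended travelling-wave fault location.

    t_a_us, t_b_us are GPS-synchronised arrival times (microseconds) of the
    fault wavefront at terminals A and B; returns the distance from A.
    0.29 km/us is a typical propagation speed for overhead lines, slightly
    below the speed of light."""
    return (line_len_km + wave_speed_km_per_us * (t_a_us - t_b_us)) / 2.0

# A fault 30 km from terminal A on a 100 km line: the wavefront reaches
# A after 30/0.29 us and B after 70/0.29 us.
d = fault_distance_km(100.0, 30.0 / 0.29, 70.0 / 0.29)
```

The formula makes the thesis's error sources concrete: timestamp errors and an inaccurate wave speed (which varies with conductor impedance and weather) both feed linearly into the distance, which is what the SVM-based correction of locator parameters targets.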
APA, Harvard, Vancouver, ISO, and other styles
22

Macomber, Marcia Fraser. "Selecting locations for marine harvest refugia : a GIS study using logbook data from the Oregon trawl fishery /." 2000. http://hdl.handle.net/1957/9856.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Punke, Michele Leigh. "Predictive locational modeling of late Pleistocene archaeological sites on the southern Oregon Coast using a Geographic Information System (GIS)." Thesis, 2001. http://hdl.handle.net/1957/28949.

Full text
Abstract:
The search for archaeological materials dating to 15,000 yr BP along the southern Oregon coast is a formidable task. Using ethnographic, theoretical, and archaeological data, landscape resources which would have influenced land-use and occupation location decisions in the past are highlighted. Additionally, environmental data pertaining to the late Pleistocene is examined to determine what landscape features may have been used by human groups 15,000 years ago and to determine how these landscape features may have changed since that time. These landscape resource features are included in the modeling project as independent variables. The dependent variable in this modeling project is relative probability that an area will contain archaeological materials dating to the time period of interest. Two predictive locational models are created to facilitate the search process. These models mathematically combine the independent variables using two separate approaches. The hierarchical decision rule model approach assumes that decision makers in the past would have viewed landscape features sequentially rather than simultaneously. The additive, or weighted-value, approach assumes that a number of conditional preference aspects were evaluated simultaneously and that different environmental variables had varying amounts of influence on the locational choices of prehistoric peoples. Integration of the data and mathematical model structures into a Geographic Information System (GIS) allows for spatial analysis of the landscape and the prediction of locations most likely to contain evidence of human activity dating to 15,000 years ago. The process involved with variable integration into the GIS is delineated and results of the modeling procedures are presented in spatial, map-based formats.
Graduation date: 2002
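The additive weighted-value model described above can be sketched as a per-cell weighted overlay of criterion layers. The layers and weights below are invented for illustration:

```python
def weighted_suitability(layers, weights):
    """Additive weighted-value model: combine per-cell scores from several
    criterion layers into one relative-probability surface."""
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * layer[r][c]
    return out

# Invented 2x3 example: distance-to-water and landform scores (0-1),
# with water access weighted more heavily than landform.
water = [[0.9, 0.4, 0.1], [0.8, 0.5, 0.2]]
landform = [[0.2, 0.6, 0.9], [0.3, 0.7, 0.4]]
surface = weighted_suitability([water, landform], [0.7, 0.3])
```

The hierarchical decision-rule model differs only in applying the criteria sequentially as pass/fail filters instead of summing them with weights.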
APA, Harvard, Vancouver, ISO, and other styles
24

Σταθόπουλος, Χρήστος. "Εφαρμογές ΓΠΣ και αποτελεσματικότητα δικτυού ΑΤΜ τραπεζών." Thesis, 2008. http://nemertes.lis.upatras.gr/jspui/handle/10889/956.

Full text
Abstract:
The huge and rapid advancement of science and technology around the globe, such as GIS and satellite digital mapping, makes it a must for developing countries to catch up and utilize it. The use of GIS technology is essential, as it is becoming one of the better and faster tools for managing resources and supporting the decision process. A GIS uses electronic mapping technology to produce interactive multi-layer maps, so that queries can be posed to find optimal solutions to problems. It combines spatial and non-spatial data to construct visualized information that can be easily analyzed by decision makers. GIS technology is thus a powerful planning tool, since the digital map can be used to define the best positions for different purposes such as banks, ATMs, new schools or restaurants. The paper discusses the criteria used in the spatial analysis and reports on the success of the resulting GIS analysis in suggesting proper locations for ATMs.
APA, Harvard, Vancouver, ISO, and other styles
25

Severns, Christopher Ray. "A comparison of geocoding baselayers for electronic medical record data analysis." Thesis, 2014. http://hdl.handle.net/1805/3841.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Identifying spatial and temporal patterns of disease occurrence by mapping the residential locations of affected people can provide information that informs response by public health practitioners and improves understanding in epidemiological research. A common method of locating patients at the individual level is geocoding the residential addresses stored in electronic medical records (EMRs) using address-matching procedures in a geographic information system (GIS). While geocoding is becoming more common in public health studies, few researchers examine the effects of using different address databases on the match rate and positional accuracy of the geocoded results. This research examined and compared the accuracy and match rate of four commonly used geocoding databases applied to a sample of 59,341 subjects residing in and around Marion County/Indianapolis, IN. The results are intended to inform researchers of the benefits and downsides of their choice of database for geocoding patient addresses in EMRs.
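The comparison the study performs — match rate plus positional agreement between geocoding baselayers — can be sketched as follows (the coordinates are invented; a real EMR sample would be far larger):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def compare_geocoders(results_a, results_b):
    """Match rate of each source plus positional offsets where both matched.
    Each result is a (lat, lon) pair, or None for an unmatched address."""
    n = len(results_a)
    rate_a = sum(r is not None for r in results_a) / n
    rate_b = sum(r is not None for r in results_b) / n
    offsets = [haversine_m(*a, *b)
               for a, b in zip(results_a, results_b)
               if a is not None and b is not None]
    return rate_a, rate_b, offsets

# Invented sample of three addresses geocoded against two baselayers.
src_a = [(39.7684, -86.1581), (39.7910, -86.1480), None]
src_b = [(39.7684, -86.1581), (39.7905, -86.1482), (39.8000, -86.1200)]
rate_a, rate_b, offsets = compare_geocoders(src_a, src_b)
```

Summarising the offset distribution (median, 90th percentile) per database pair gives exactly the match-rate-versus-positional-accuracy trade-off the thesis reports on.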
APA, Harvard, Vancouver, ISO, and other styles