Dissertations / Theses on the topic 'Spatial data and applications'

Consult the top 50 dissertations / theses for your research on the topic 'Spatial data and applications.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Li, Xintong. "Modeling for Spatial and Spatio-Temporal Data with Applications." Diss., Kansas State University, 2018. http://hdl.handle.net/2097/38749.

Full text
Abstract:
Doctor of Philosophy
Department of Statistics
Juan Du
It is common to assume that spatial or spatio-temporal data are realizations of underlying random fields or stochastic processes. Effective approaches to modelling the autocorrelation structure of a random field and the association among multiple processes are in great demand in many areas, including atmospheric science, meteorology and agriculture. To this end, this dissertation studies methods and applications of spatial modeling of large-scale dependence structure and spatio-temporal regression modelling. First, variogram and variogram matrix functions play important roles in modeling the dependence structure among processes at different locations in spatial statistics. With more and more data collected on a global scale in environmental science, geophysics, and related fields, we focus on characterizations of variogram models on spheres of all dimensions for both stationary and intrinsically stationary, univariate and multivariate random fields. Some efficient approaches are proposed to construct a variety of variograms, including simple polynomial structures. In particular, the series representation and spherical behavior of intrinsically stationary random fields are explored in both theoretical and simulation studies. The applications of the proposed model and related theoretical results are demonstrated using simulation and real data analysis. Second, knowledge of the factors influencing the number of days suitable for fieldwork (DSFW) has important implications for the timing of agricultural field operations, machinery decisions, and risk management. To assess how global climate phenomena such as the El Niño Southern Oscillation (ENSO) affect DSFW, and to capture their complex associations in space and time, we propose various spatio-temporal dynamic models under a hierarchical Bayesian framework. The Integrated Nested Laplace Approximation (INLA) is used and adapted to reduce the computational burden that arises when a large number of geo-locations and time points is considered in the data set. A comparison study between dynamic models with INLA viewing the spatial domain as discrete versus continuous is conducted, and their pros and cons are evaluated against multiple criteria. Finally, a model with time-varying coefficients is shown to reflect the dynamic nature of the impact and lagged effect of ENSO on DSFW in the US, with spatio-temporal correlations accounted for.
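The basic object on which such dependence models are calibrated, the empirical variogram, can be sketched in a few lines. This is an illustrative sketch of the classical Matheron estimator in Euclidean coordinates, not the spherical constructions of the dissertation; the function name is ours.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) empirical variogram:
    gamma(h) = (1 / (2 |N(h)|)) * sum over pairs (i, j) in N(h) of (z_i - z_j)^2,
    where N(h) collects unordered pairs whose separation falls in lag bin h."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)      # each unordered pair once
    dists, sqdiff = dists[iu], sqdiff[iu]
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(gamma)):
        in_bin = (dists >= bin_edges[k]) & (dists < bin_edges[k + 1])
        if in_bin.any():
            gamma[k] = 0.5 * sqdiff[in_bin].mean()
    return gamma
```

For an intrinsically stationary field the estimate levels off at a sill; a spatial trend instead makes it grow with lag distance.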
APA, Harvard, Vancouver, ISO, and other styles
2

Embleton, Nina Lois. "Handling sparse spatial data in ecological applications." Thesis, University of Birmingham, 2015. http://etheses.bham.ac.uk//id/eprint/5840/.

Full text
Abstract:
Estimating the size of an insect pest population in an agricultural field is an integral part of insect pest monitoring. An abundance estimate can be used to decide if action is needed to bring the population size under control, and accuracy is important in ensuring that the correct decision is made. Conventionally, statistical techniques are used to formulate an estimate from population density data obtained via sampling. This thesis thoroughly investigates an alternative approach of applying numerical integration techniques. We show that when the pest population is spread over the entire field, numerical integration methods provide more accurate results than the statistical counterpart. Meanwhile, when the spatial distribution is more aggregated, the error behaves as a random variable and the conventional error estimates do not hold. We thus present a new probabilistic approach to assessing integration accuracy for such functions, and formulate a mathematically rigorous estimate of the minimum number of sample units required for accurate abundance evaluation in terms of the species diffusion rate. We show that the integration error dominates the error introduced by noise in the density data and thus demonstrate the importance of formulating numerical integration techniques which provide accurate results for sparse spatial data.
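The contrast the thesis draws, a statistical estimate versus numerical integration of the sampled density, can be sketched on a one-dimensional transect. This is an illustration under our own simplified setup (names and sampling layout are assumptions, not the thesis's formulation).

```python
import numpy as np

def abundance_statistical(density, field_length):
    """Conventional estimate: mean sampled density times field size."""
    return float(np.mean(density)) * field_length

def abundance_trapezoid(x, density):
    """Numerical-integration estimate: composite trapezoidal rule
    over (possibly irregularly spaced) sample locations x."""
    dx = np.diff(x)
    return float(np.sum(dx * (density[:-1] + density[1:]) / 2.0))
```

On irregular grids the trapezoidal estimate weights each sample by its local spacing, which is exactly where it can gain over the plain mean.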
APA, Harvard, Vancouver, ISO, and other styles
3

Davies, Jessica. "Expanding the spatial data infrastructure model to support spatial wireless applications /." Connect to thesis, 2003. http://eprints.unimelb.edu.au/archive/00001044.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Menezes, Kim Anne. "Bayesian spatial models : applications for tropospheric ozone data /." Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Yu, Jihai. "Essays on spatial dynamic panel data model theories and applications /." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1179767430.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Honnor, Thomas R. "Some spatial statistical techniques with applications to cellular imaging data." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/97940/.

Full text
Abstract:
The aim of this thesis is to provide techniques for the analysis of a variety of types of spatial data, each corresponding to one of three biological questions on the function of the protein TACC3 during mitosis. A starting point in each investigation is the interpretation of the biological question and an understanding of the form of the available data, from which a mathematical representation of the data and a corresponding statistical problem are developed. The thesis begins with a description of a methodology for application to two collections of (marked) point patterns to determine the significance of differences in their structure, achieved through comparison of summary statistics and quantification of the significance of such differences by permutation tests. A methodology is then proposed for application to a pair of spatio-temporal processes to estimate their individual temporal evolutions, drawing on ideas from optimal transportation theory, together with a test of dependence between such estimators. The thesis concludes with a proposed model for line data, designed to approximate the mitotic spindle structure using trajectories on the surface of spheroids, and a comparison score to compare model fit between models and/or observations. The results of the methodologies when applied to simulated data are presented as part of investigations into their validity and power. Application to biological data indicates that TACC3 influences microtubule structure during mitosis at a range of scales, supporting and extending previous investigations. Each of the methodologies is designed to require minimal assumptions and few parameters, resulting in techniques which may be applied more widely to similar biological data from additional experiments or to data arising from other fields.
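The permutation-test idea for comparing point patterns can be illustrated with a deliberately simple summary statistic (mean nearest-neighbour distance). This is a generic sketch, not the thesis's methodology; statistic and function names are our own.

```python
import numpy as np

def mean_nn_distance(points):
    """Summary statistic: mean nearest-neighbour distance of a 2-D pattern."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-distances
    return float(d.min(axis=1).mean())

def permutation_test(pat_a, pat_b, n_perm=999, seed=0):
    """Two-sided permutation test for a difference in the summary
    statistic between two point patterns, permuting pattern labels."""
    rng = np.random.default_rng(seed)
    pooled = np.vstack([pat_a, pat_b])
    n_a = len(pat_a)
    observed = abs(mean_nn_distance(pat_a) - mean_nn_distance(pat_b))
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        stat = abs(mean_nn_distance(pooled[idx[:n_a]])
                   - mean_nn_distance(pooled[idx[n_a:]]))
        count += stat >= observed
    return (count + 1) / (n_perm + 1)    # p-value with add-one correction
```

The null distribution is built by reshuffling which points belong to which pattern, so no distributional assumptions about the statistic are needed.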
APA, Harvard, Vancouver, ISO, and other styles
7

Vuorio, R. (Riikka). "Use of public sector’s open spatial data in commercial applications." Master's thesis, University of Oulu, 2014. http://urn.fi/URN:NBN:fi:oulu-201311201883.

Full text
Abstract:
The objective of this study was to analyse how young Finnish information technology (IT) companies utilize the public sector's open spatial data. The aim was to find out to what extent companies use the public sector's open spatial data in products and how companies are using it. In addition, defects related to the data and its use, and companies' awareness of public sector open data, were canvassed. Defects and unawareness might prevent or retard the utilization of public sector data. The public sector collects a vast amount of data from various areas when performing public tasks. The major part of this data is spatial, meaning it has a location aspect. The public sector is opening the data for everybody to use freely, and companies could use this open spatial data for commercial purposes. High expectations have been set for the data opening: along with it, innovations and business, in the form of new companies and digital products, will be created. The European Union has greatly promoted the opening of public sector data through its legislative actions, first with the PSI Directive (directive on the re-use of public sector data) and later with the INSPIRE Directive (directive establishing an Infrastructure for Spatial Information in the European Community). Both directives aim to facilitate the re-use and dissemination of public sector data, while the INSPIRE Directive has focused on the use of interoperable spatial data by creating the spatial data infrastructure. Even if the developments are still ongoing, these undertakings have already created possibilities for companies to use public sector data. This applies especially to spatial data. This study was quantitative by nature, and the empirical data was collected through an online survey targeted at randomly selected Finnish IT companies established during the years 2009–2012. The data was analyzed by descriptive statistics. The results can be generalized to the whole target population in Finland.
The results of this study show that the number of companies utilizing the public sector's open spatial data is small and that the public sector's open spatial data has not yet enabled the establishment of new companies. However, companies have developed a few new products with the contribution of the public sector's open spatial data, and the value of the data for these products is not minor. The thesis concludes that there is a need for greater investment in promoting the public sector's open data amongst companies: awareness of the public sector's open spatial data could be increased. In addition, the coverage of datasets and interface services could be improved. By eliminating these defects, the number of utilizers of the public sector's open spatial data would perhaps increase. There are now quiet signs of business awakening to the utilization of public sector data.
APA, Harvard, Vancouver, ISO, and other styles
8

Martin, Elaine B. "The detection of change in spatial processes with environmental applications." Thesis, University of Glasgow, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.361063.

Full text
Abstract:
Ever since Halley (1686) superimposed onto a map of land forms the direction of trade winds and monsoons between and near the tropics, and attempted to assign them a physical cause, homo sapiens has attempted to develop procedures which quantify the level of change in a spatial process, or assess the relationship between associated spatially measured variables. Most spatial data, whether originally point, linear or areal in nature, can be converted by a suitable procedure into a continuous form and plotted as an isarithmic map, i.e. points of equal height are joined. Once in that form it may be regarded as a statistical surface in which height varies over area in much the same way as the terrain varies on topographic maps. Particularly in environmental statistics, the underlying shape of the surface is unknown, and hence the use of non-parametric techniques is wholly appropriate. For most applications, the location of data points is beyond the control of the map-maker, hence the analyst must cope with irregularly spaced data points. A variety of possible techniques for describing a surface are given in chapter two, with attention focusing on the methodology surrounding kernel density estimation. Once a surface has been produced to describe a set of data, a decision concerning the number of contours and how they should be selected has to be taken. When comparing two sets of data, it is imperative that the contours selected are chosen using the same criteria. A data-based procedure is developed in chapter three which ensures comparability of the surfaces, so that spurious conclusions are not reached as a result of inconsistencies between surfaces. Contained within this chapter is a discussion of issues which relate to other aspects of how a contour should be drawn to minimise the potential for inaccuracies in the surface-fitting methodology. Chapter four focuses on the wealth of techniques which are currently available for comparing surfaces.
These range from the simplest method of overlaying two maps and visually comparing them to more involved techniques which require intensive numerical computation. It is the formalisation of the former of these techniques which forms the basis of the methodology developed in the following two chapters to discern whether change/association has materialised between variables. One means of quantifying change between two surfaces, represented as contoured surfaces, is in terms of the transformation which would be required for the two surfaces to be matched. Mathematically, transformations are described in terms of rotation, translation and scalar change. Chapter five provides a geometrical interpretation of the three transformations in terms of area, perimeter, orientation and the centre of gravity of the contour of interest, and their associated properties. Although grid resolution is fundamentally a secondary level of smoothing, this aspect of surface fitting has generally been ignored. However, to ensure consistency across surfaces, it is necessary to decide firstly whether data sets of different sizes should be depicted using different mesh resolutions, and secondly how fine a resolution provides optimal results, both in terms of execution time and inherent surface variability. This aspect is examined with particular reference to the geometric descriptors used to quantify change. The question of random noise contained within a measurement process has been ignored in the analysis to this point. However, in practice some form of noise will always be contained within a process. Quantifying the level of noise attributable to a process can prove difficult, since the scientist may be over-optimistic in his evaluation of the noise level. In developing a suitable set of test statistics, four situations were examined: firstly when no noise was present, and then for three levels of noise, the upper bounds of which were 5, 15 and 25%.
Based on these statistics, a series of hypothesis tests were developed to look at the question of change for individual contour levels, i.e. local analysis, or alternatively for a whole surface by combining the statistics and effectively performing a multivariate test. A number of problems are associated with the methodology. These difficulties are discussed and various remedial measures are proposed. The theoretical derivation of the test statistic, both in the absence and presence of random noise, has proved mathematically to be extremely complex, with a number of stringent assumptions required to enable the theoretical distribution to be derived. A major simulation study was subsequently undertaken to develop the empirical probability distribution function for the various statistics defining change for the four levels of noise. For each of the statistics, the resultant power of the test was also examined. The remaining chapter explicitly examines two case studies and how the methodology developed in the preceding two chapters may be implemented. The first example raises the question, 'Has a seasonal temperature change resulted during the fifty-year span, 1930 to 1980, within the contiguous United States of America?' The data base was provided by the United States Historical Climatology Network (HCN) Serial Temperature and Precipitation Data, Quinlan et al (1987). The second problem examines whether there is an association between background radiation levels, within three regions of south-west England, and the location of various forms of leukaemia, or whether case location is a product of the population distribution. Differences between this example and the previous illustration materialise in terms of the spatial resolution of the data: the leukaemia data are defined as punctual data points and are extremely sparse; the population distribution is defined as areal regions; and the radiation data are of a more continuous format.
The methodology developed required modification, but aside from this a preliminary set of conclusions were reached.
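The kernel density estimation at the heart of chapter two can be sketched for irregularly spaced points. This is an illustrative implementation with an isotropic Gaussian kernel; the fixed bandwidth and function name are our own assumptions, not the thesis's choices.

```python
import numpy as np

def kde_surface(points, grid_x, grid_y, bandwidth):
    """Evaluate an isotropic Gaussian kernel density estimate on a
    regular grid, from irregularly spaced 2-D sample points."""
    gx, gy = np.meshgrid(grid_x, grid_y, indexing="ij")
    dens = np.zeros_like(gx)
    h2 = bandwidth ** 2
    for px, py in points:
        dens += np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2.0 * h2))
    dens /= len(points) * 2.0 * np.pi * h2   # normalise: each kernel has unit mass
    return dens
```

The resulting array is exactly the kind of statistical surface the thesis contours and compares; the bandwidth plays the role of the primary smoothing level, with grid resolution as the secondary one.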
APA, Harvard, Vancouver, ISO, and other styles
9

Berndt, Christian [Verfasser]. "Spatial interpolation of climate data for hydrological and environmental applications / Christian Berndt." Hannover : Technische Informationsbibliothek (TIB), 2016. http://d-nb.info/1124166823/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zborovskiy, Marat. "Representing and manipulating spatial data in interoperable systems and its industrial applications." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/35099.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, System Design and Management Program, 2006.
Includes bibliographical references (p. 123-126).
Introduction: The amount of information available nowadays is staggering and increases exponentially. Making sense of this data has become increasingly difficult because of two factors: the sheer volume of data, and the lack of interoperability between disparate data sources and models. While one can do little about the former factor, the latter can be mitigated by advancing solutions that make data easy to work with and ensure interoperability among data sources and models in intelligent networks. One way to achieve interoperability is to force every entity involved in the data exchange to adopt the same standard. However, organizations have heavily invested in proprietary data standards and are unlikely to replace their existing standards with a new one. Therefore, another solution is to create a standard through which organizations can translate their data sources and share them with their customers or the general community. The MIT Data Center is spearheading an initiative to create M, a language capable of providing the much-needed interoperability between divergent data sources and models, with the ultimate goal of creating a new intelligent information infrastructure (Brock, Schuster and Kutz 2006).
by Marat Zborovskiy.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
11

Landrieu, Loïc. "Learning structured models on weighted graphs, with applications to spatial data analysis." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEE046/document.

Full text
Abstract:
Modeling complex processes may involve a large number of variables with a complicated correlation structure among them. For example, spatial phenomena often possess strong spatial regularity, which translates into correlations between variables that are stronger the closer the corresponding regions are. The formalism of weighted graphs captures these relationships between variables compactly, allowing the mathematical formalization of many spatial data analysis problems. The first part of the manuscript focuses on the efficient resolution of spatial regularization problems involving penalties such as the total variation or the total boundary length. We present a preconditioning strategy for the generalized forward-backward algorithm, specifically adapted to solving problems structured by weighted graphs exhibiting high variability in configurations and weights. We then present a new algorithm called cut pursuit, which exploits the relationship between flow algorithms and the total variation through a working-set strategy. These algorithms outperform the state of the art on geostatistical data aggregation tasks. The second part of this document focuses on the development of a new model that extends continuous-time Markov chains to the case of general undirected weighted graphs. This model allows a finer account of the interactions between neighbouring nodes for structured prediction, as illustrated for the supervised classification of urban land use.
Modeling complex processes often involves a high number of variables with an intricate correlation structure. For example, many spatially-localized processes display spatial regularity, as variables corresponding to neighboring regions are more correlated than distant ones. The formalism of weighted graphs allows us to capture relationships between interacting variables in a compact manner, permitting the mathematical formulation of many spatial analysis tasks. The first part of this manuscript focuses on optimization problems with graph-structured regularizers, such as the total variation or the total boundary size. We first present the convex formulation and its resolution with proximal splitting algorithms. We introduce a new preconditioning scheme for the existing generalized forward-backward proximal splitting algorithm, specifically designed for graphs with high variability in neighbourhood configurations and edge weights. We then introduce a new algorithm, cut pursuit, which uses the links between graph cuts and total variation in a working-set scheme. We also present a variation of this algorithm which solves the problem regularized by the non-convex total boundary length penalty. We show that our proposed approaches match or outperform the state of the art for geostatistical aggregation as well as image recovery problems. The second part focuses on the development of a new model, expanding continuous-time Markov chain models to general undirected weighted graphs. This allows us to take into account the interactions between neighbouring nodes in structured classification, as demonstrated for a supervised land-use classification task from cadastral data.
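Total-variation regularization itself needs the specialised solvers the thesis develops, but the shape of a graph-regularized problem can be illustrated with its quadratic (Laplacian) cousin, which admits a closed-form solve. This is a simplified stand-in, not the cut pursuit or forward-backward algorithms; all names here are ours.

```python
import numpy as np

def graph_laplacian(n_nodes, edges, weights):
    """Dense weighted graph Laplacian L, with L[i,i] = sum of incident
    weights and L[i,j] = -w_ij for each edge (i, j)."""
    L = np.zeros((n_nodes, n_nodes))
    for (i, j), w in zip(edges, weights):
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L

def graph_smooth(y, edges, weights, lam):
    """Solve min_x ||x - y||^2 + lam * sum_ij w_ij (x_i - x_j)^2,
    a smooth quadratic surrogate for the TV-type penalties above.
    The optimality condition gives (I + lam * L) x = y."""
    L = graph_laplacian(len(y), edges, weights)
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)
```

Because the rows of L sum to zero, this smoothing preserves the total of the signal while pulling neighbouring nodes together; TV penalties replace the squared differences with absolute ones, preserving sharp boundaries at the cost of a harder optimization.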
APA, Harvard, Vancouver, ISO, and other styles
12

Addison, Mark Antony. "Generalised hierarchical operators and their implementation in spatial data processing and other applications." Thesis, University of Liverpool, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.333518.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Eckel, Stefanie. "Statistical analysis of spatial point patterns - applications to economical, biomedical and ecological data." [S.l. : s.n.], 2008. http://nbn-resolving.de/urn:nbn:de:bsz:289-vts-66022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Johnson, Kenneth Harold Anthony. "The algebraic specification of spatial data types with applications to constructive volume geometry." Thesis, Swansea University, 2007. https://cronfa.swan.ac.uk/Record/cronfa42232.

Full text
Abstract:
Spatial objects are modelled as total functions, mapping a topological space of points to a topological algebra of data attributes. High-level operations on these spatial objects form algebras of spatial objects, which model spatial data types. This thesis presents a comprehensive account of the theory of spatial data types. The motivation behind the general theory is Constructive Volume Geometry (CVG). CVG is an algebraic framework for the specification, representation and manipulation of graphics objects in 3D. By using scalar fields as the basic building blocks, CVG gives an abstract representation of spatial objects, with the goal of unifying the many representations of objects used in 3D computer graphics today. The general theory developed in this thesis unifies discrete and continuous spatial data, and the many examples where such data is used - from computer graphics to hardware design. Such a theory is built from the algebraic and topological properties of spatial data types. We examine algebraic laws, approximation methods, and finiteness and computability for general spatial data types. We show how to apply the general theory to modelling (i) hardware and (ii) CVG. We pose the question "Which spatial objects can be represented in the algebraic framework developed for spatial data types?". To answer this question, we analyse the expressive power of our algebraic framework. Applying our results to the CVG framework yields a new result: we show that any CVG spatial object can be approximated by CVG terms to arbitrary accuracy.
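The idea of spatial objects as total functions combined by algebraic operators can be sketched directly. This is a toy rendition with opacity-only scalar fields; CVG's actual attribute algebras and operator set are richer, and the names below are our own.

```python
# Spatial objects as scalar fields: total functions from 3-D points to an
# opacity attribute in [0, 1]. High-level operators then combine whole
# fields pointwise, forming an algebra of spatial objects.

def sphere(cx, cy, cz, r):
    """A solid sphere as a characteristic scalar field."""
    def field(x, y, z):
        return 1.0 if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r else 0.0
    return field

def union(f, g):
    """Pointwise max: a point is as opaque as its most opaque operand."""
    return lambda x, y, z: max(f(x, y, z), g(x, y, z))

def intersection(f, g):
    """Pointwise min: opaque only where both operands are opaque."""
    return lambda x, y, z: min(f(x, y, z), g(x, y, z))
```

Because every operator returns another total function on the same point space, terms built from these operators stay inside the algebra, which is what makes an expressiveness analysis like the one in the thesis possible.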
APA, Harvard, Vancouver, ISO, and other styles
15

Rajadell, Rojas Olga. "Data selection and spectral-spatial characterisation for hyperspectral image segmentation. Applications to remote sensing." Doctoral thesis, Universitat Jaume I, 2013. http://hdl.handle.net/10803/669093.

Full text
Abstract:
Image analysis has driven many discoveries in modern science. This thesis focuses on the analysis of remote sensing images for aerial inspection, specifically on the problem of segmentation and classification according to land use. Since the birth of hyperspectral sensors, their use has been vital for this task, as they facilitate it and substantially improve the results. However, the use of hyperspectral images entails, among others, problems of dimensionality and of interaction with experts. We propose improvements that help alleviate these drawbacks and make the problem more efficient.
Image analysis has lately aided many discoveries in research. This thesis focuses on the analysis of remotely sensed images for aerial inspection. It tackles the problem of segmentation and classification according to land usage. In this field, the use of hyperspectral images has been the trend followed since the emergence of hyperspectral sensors. This type of image improves the performance of the task but raises some issues, two of which are dimensionality and interaction with experts. We propose enhancements to overcome them. Efficiency and economic reasons encouraged this work. The enhancements introduced here allow segmentation and classification of this type of image using less data, thus increasing efficiency and enabling the design of task-specific sensors, which are cheaper. Our enhancements also allow the same task to be performed with less expert collaboration, which decreases costs and accelerates the process.
APA, Harvard, Vancouver, ISO, and other styles
16

Pérez, López Andrés. "Parametric analysis of ambisonic audio: a contributions to methods, applications and data generation." Doctoral thesis, Universitat Pompeu Fabra, 2020. http://hdl.handle.net/10803/669962.

Full text
Abstract:
Due to the recent advances in virtual and augmented reality, ambisonics has emerged as the de facto standard for immersive audio. Ambisonic audio can be captured using spherical microphone arrays, which are becoming increasingly popular. Yet, many methods for acoustic and microphone array signal processing are not specifically tailored for spherical geometries. Therefore, there is still room for improvement in the field of automatic analysis and description of ambisonic recordings. In the present thesis, we tackle this problem using methods based on the parametric analysis of the sound field. Specifically, we present novel contributions in the scope of blind reverberation time estimation, diffuseness estimation, and sound event localization and detection. Furthermore, several software tools developed for ambisonic dataset generation and management are also presented.
APA, Harvard, Vancouver, ISO, and other styles
17

Pun-Cheng, Lilian Suk Ching. "A new face-entity model of digital topographic data for multi-purpose urban GIS." Thesis, University of Newcastle Upon Tyne, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285330.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Filiberti, Daniel Paul. "Combined Spatial-Spectral Processing of Multisource Data Using Thematic Content." Diss., Tucson, Arizona : University of Arizona, 2005. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu%5Fetd%5F1066%5F1%5Fm.pdf&type=application/pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Su, Ting-Li. "Application of spatial statistics to space-time disease surveillance data." Thesis, Lancaster University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.441128.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Naude, Stephanus David. "Application of spatial resource data to assist in farmland valuation." Thesis, Stellenbosch : Stellenbosch University, 2011. http://hdl.handle.net/10019.1/18118.

Full text
Abstract:
Thesis (MScAgric) -- Stellenbosch University, 2011.
ENGLISH ABSTRACT: In South Africa more than 80 percent of the total land area is used for agriculture and subsistence livelihoods. A land transaction is generally not a recurring action for most buyers and sellers; their experience and knowledge are limited, and for this reason the services of property agents and valuers are sometimes used simply to make more information available. Insufficient information and the inability to observe differences in land productivity give rise to the undervaluation of good land and the overvaluation of poor land. The value of a property plays an important role in the acquisition of a bond; in this context farm valuations are essential, and commercial banks therefore make greater use of specialist businesses that have professional valuers available. The advent of the Internet has made access to comprehensive information sources easier for property agents and valuers, whose critical time and resources can now be effectively managed through Geographic Information System (GIS) integrated workflow processes. This study aims to develop the blueprint for a farm valuation support system (FVSS) that assists valuers in their application of the comparable sales method by enabling them to do the following: (1) rapid identification of the location of the subject property and transaction properties on an electronic map; (2) comparison of the subject property with the transaction properties in terms of value-contributing attributes that can be expressed in a spatial format, mainly a) location and b) land resource quality factors not considered in existing valuation systems that primarily focus on residential property. Interpretation of soil characteristics to determine the suitability of a soil for annual or perennial crops requires the specialized knowledge of soil scientists, knowledge not normally found among property valuers or estate agents.
For this reason an algorithm that generates an index value was developed to allow easy comparison of the land of a subject property with that of transaction properties. That this index value reflects the soil suitability of different areas sufficiently accurately was confirmed by soil suitability data for the Breede and Berg River areas, obtained by soil scientists by means of a reconnaissance soil survey. This index value distinguishes the proposed FVSS from other existing property valuation systems and can therefore be used by valuers as a first approximation of a property's soil suitability before doing further field work. A nationwide survey was done among valuers and estate agents that provided information for the design of the proposed FVSS and proved that the need for such a system does exist and that it will be used by valuers.
AFRIKAANSE OPSOMMING (translated from Afrikaans): More than 80 percent of the total land area in South Africa is used for agriculture and subsistence farming. A land transaction is generally not a recurring action for most buyers and sellers, so their experience and knowledge are limited; for this reason the services of estate agents and valuers are sometimes used to make more information available. Insufficient information and the inability to identify differences in land productivity give rise to the undervaluation of good land and the overvaluation of poor land. The value of a property plays an important role in obtaining a bond. In this context farm valuations are essential, and commercial banks therefore make more use of specialised companies with professional valuers at their disposal. The advent of the Internet made access to comprehensive information sources easier for estate agents and valuers, whose critical time and resources can now be managed effectively through Geographic Information System (GIS) integrated work processes. This study aims to develop the blueprint for a farm valuation support system that will help valuers in their application of the comparable sales method by enabling them to do the following: (1) rapid identification of the location of the subject property and transaction properties on an electronic map; (2) comparison of the subject property with transaction properties in terms of value-bearing attributes expressed in a spatial format, mainly (a) location and (b) soil quality factors not considered in existing residentially oriented valuation systems. Interpretation of soil characteristics to determine the suitability of land for annual or perennial crops requires the specialised knowledge of soil scientists, knowledge not normally found among property valuers or estate agents.
For this reason an algorithm was developed so that the land of a subject property can be compared with that of transaction properties by means of an index value. The index value was confirmed as sufficiently accurate when compared with soil suitability data collected by soil scientists in the Breede and Berg River areas. This index value distinguishes the proposed farm valuation support system from other existing property valuation systems and can therefore be used by valuers as a first estimate of a property's soil suitability before further field work is done. A nationwide survey among valuers and estate agents provided information for the design of the proposed farm valuation support system and also proved that a need for such a system exists and that it would be used by valuers.
APA, Harvard, Vancouver, ISO, and other styles
21

Umande, Philip Pembe. "Spatial point pattern analysis with application to confocal microscopy data." Thesis, Imperial College London, 2008. http://hdl.handle.net/10044/1/8569.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Wang, Xiaofeng. "New Procedures for Data Mining and Measurement Error Models with Medical Imaging Applications." Case Western Reserve University School of Graduate Studies / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=case1121447716.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Gurrapu, Chaitanya. "Human Action Recognition In Video Data For Surveillance Applications." Thesis, Queensland University of Technology, 2004. https://eprints.qut.edu.au/15878/1/Chaitanya_Gurrapu_Thesis.pdf.

Full text
Abstract:
Detecting human actions using a camera has many possible applications in the security industry. When a human performs an action, his/her body goes through a signature sequence of poses. To detect these pose changes and hence the activities performed, a pattern recogniser needs to be built into the video system. Due to the temporal nature of the patterns, Hidden Markov Models (HMM), used extensively in speech recognition, were investigated. Initially a gesture recognition system was built using novel features. These features were obtained by approximating the contour of the foreground object with a polygon and extracting the polygon's vertices. A Gaussian Mixture Model (GMM) was fit to the vertices obtained from a few frames and the parameters of the GMM itself were used as features for the HMM. A more practical activity detection system using a more sophisticated foreground segmentation algorithm immune to varying lighting conditions and permanent changes to the foreground was then built. The foreground segmentation algorithm models each of the pixel values using clusters and continually uses incoming pixels to update the cluster parameters. Cast shadows were identified and removed by assuming that shadow regions were less likely to produce strong edges in the image than real objects and that this likelihood further decreases after colour segmentation. Colour segmentation itself was performed by clustering together pixel values in the feature space using a gradient ascent algorithm called mean shift. More robust features in the form of mesh features were also obtained by dividing the bounding box of the binarised object into grid elements and calculating the ratio of foreground to background pixels in each of the grid elements. These features were vector quantized to reduce their dimensionality and the resulting symbols presented as features to the HMM to achieve a recognition rate of 62% for an event involving a person writing on a white board. 
The recognition rate increased to 80% for the "seen" person sequences, i.e. the sequences of the person used to train the models. With a fixed lighting position, the lack of a shadow removal subsystem improved the detection rate. This is because of the consistent profile of the shadows in both the training and testing sequences due to the fixed lighting positions. Even with a lower recognition rate, the shadow removal subsystem was considered an indispensable part of a practical, generic surveillance system.
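The mesh features described above (the bounding box of the binarised object divided into grid elements, one value per cell) are straightforward to compute. A minimal NumPy sketch, using the per-cell foreground fraction as a numerically safer stand-in for the thesis's foreground-to-background ratio; the grid size and function name are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def mesh_features(mask: np.ndarray, rows: int = 4, cols: int = 4) -> np.ndarray:
    """Per-cell foreground fraction over a grid laid on the object's bounding box.

    mask: 2-D boolean array, True where the binarised object is foreground.
    Returns a (rows * cols,) feature vector (one value per grid cell).
    """
    ys, xs = np.nonzero(mask)
    # Crop to the bounding box of the foreground object.
    box = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    h, w = box.shape
    feats = np.empty(rows * cols)
    for i in range(rows):
        for j in range(cols):
            cell = box[i * h // rows:(i + 1) * h // rows,
                       j * w // cols:(j + 1) * w // cols]
            # Fraction of foreground pixels in this cell (0.0 if cell is empty).
            feats[i * cols + j] = cell.mean() if cell.size else 0.0
    return feats
```

In the pipeline sketched by the abstract, these vectors would then be vector quantized and the resulting symbols fed to the HMM.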
APA, Harvard, Vancouver, ISO, and other styles
24

Gurrapu, Chaitanya. "Human Action Recognition In Video Data For Surveillance Applications." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15878/.

Full text
Abstract:
Detecting human actions using a camera has many possible applications in the security industry. When a human performs an action, his/her body goes through a signature sequence of poses. To detect these pose changes and hence the activities performed, a pattern recogniser needs to be built into the video system. Due to the temporal nature of the patterns, Hidden Markov Models (HMM), used extensively in speech recognition, were investigated. Initially a gesture recognition system was built using novel features. These features were obtained by approximating the contour of the foreground object with a polygon and extracting the polygon's vertices. A Gaussian Mixture Model (GMM) was fit to the vertices obtained from a few frames and the parameters of the GMM itself were used as features for the HMM. A more practical activity detection system using a more sophisticated foreground segmentation algorithm immune to varying lighting conditions and permanent changes to the foreground was then built. The foreground segmentation algorithm models each of the pixel values using clusters and continually uses incoming pixels to update the cluster parameters. Cast shadows were identified and removed by assuming that shadow regions were less likely to produce strong edges in the image than real objects and that this likelihood further decreases after colour segmentation. Colour segmentation itself was performed by clustering together pixel values in the feature space using a gradient ascent algorithm called mean shift. More robust features in the form of mesh features were also obtained by dividing the bounding box of the binarised object into grid elements and calculating the ratio of foreground to background pixels in each of the grid elements. These features were vector quantized to reduce their dimensionality and the resulting symbols presented as features to the HMM to achieve a recognition rate of 62% for an event involving a person writing on a white board. 
The recognition rate increased to 80% for the "seen" person sequences, i.e. the sequences of the person used to train the models. With a fixed lighting position, the lack of a shadow removal subsystem improved the detection rate. This is because of the consistent profile of the shadows in both the training and testing sequences due to the fixed lighting positions. Even with a lower recognition rate, the shadow removal subsystem was considered an indispensable part of a practical, generic surveillance system.
APA, Harvard, Vancouver, ISO, and other styles
25

Yasumiishi, Misa. "Spatial and temporal analysis of human movements and applications for disaster response management: using cell phone data." Thesis, State University of New York at Buffalo, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1600848.

Full text
Abstract:

This survey study examines cell phone usage data and focuses on the application of the data to disaster response management. Through the course of this study, the structure of cell phone usage data and its characteristics will be reviewed. Cell phone usage data provides valuable information about human movements and activities. The uniqueness of the data is that it contains both spatial and temporal information, and this information is free of fixed routes such as roads or any preset data capturing timing. In short, it is a very fluid kind of data which reflects our activities as humans with freedom of movement. Depending on data extraction methods, the data server can provide additional information such as application activities, battery level and charge activities. However, cell phone usage data has shortcomings, including data inconsistency and sparseness. Both the richness and the shortcomings of the data expose the hurdles involved in data processing and force us to devise new ways to analyze this kind of data. Once the data has been properly analyzed, the findings can be applied to real-life problems, including disaster response. By understanding human movement patterns using cell phone usage data, we will be able to allocate limited emergency resources more adequately. Moreover, when disaster victims lose their cell phone functionality during a disaster, we might be able to identify or predict the locations of victims or evacuees and supply them with necessary assistance. The results of this study provide some insights into cell phone usage data and human movement patterns, including the concentration of cell phone activities in specific zones and rather universal cell phone charging patterns. The potential of the data as a movement analysis resource and its application to disaster response is apparent.
As a basis for taking the study to the next level, a possible conceptual model of human movement factors and data processing methods will be presented.

APA, Harvard, Vancouver, ISO, and other styles
26

Jackson, Marlene Frances. "Spatial allocation of educational resources, an application of Data Envelopment Analysis." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0017/NQ58138.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Boone, Edward L. "Bayesian Methodology for Missing Data, Model Selection and Hierarchical Spatial Models with Application to Ecological Data." Diss., Virginia Tech, 2003. http://hdl.handle.net/10919/26141.

Full text
Abstract:
Ecological data is often fraught with many problems, such as missing data and spatial correlation. In this dissertation we use a data set collected by the Ohio EPA as motivation for studying techniques to address these problems. The data set is concerned with the benthic health of Ohio's waterways. A new method for incorporating covariate structure and missing data mechanisms into missing data analysis is considered. This method allows us to detect relationships that other popular methods cannot. We then further extend this method into model selection. In the special case where the unobserved covariates are assumed normally distributed, we use the Bayesian Model Averaging method to average the models, select the highest probability model and do variable assessment. The accuracy of calculating the posterior model probabilities using the Laplace approximation and an approximation based on the Bayesian Information Criterion (BIC) is explored. It is shown through simulation that the Laplace approximation is superior to the BIC-based approximation. Finally, Hierarchical Spatial Linear Models are considered for the data, and we show how to combine analyses that have spatial correlation within and between clusters.
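The Laplace-versus-BIC comparison reported in this abstract can be illustrated on a toy conjugate model, y_i ~ N(theta, sigma^2) with prior theta ~ N(0, tau^2), where the exact log marginal likelihood has a closed form. This is a hedged sketch, not the dissertation's models: for a Gaussian posterior the Laplace approximation is exact, so its advantage over the BIC-style approximation holds by construction here. All parameter values are illustrative.

```python
import numpy as np

def log_marginal_laplace(y, sigma2=1.0, tau2=4.0):
    """Laplace approximation to log p(y): expand the log posterior at its mode."""
    n = len(y)
    prec = n / sigma2 + 1.0 / tau2           # negative Hessian of log posterior
    theta_hat = (y.sum() / sigma2) / prec    # posterior mode
    loglik = -0.5 * (n * np.log(2 * np.pi * sigma2) + ((y - theta_hat) ** 2).sum() / sigma2)
    logprior = -0.5 * (np.log(2 * np.pi * tau2) + theta_hat ** 2 / tau2)
    return loglik + logprior + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(prec)

def log_marginal_bic(y, sigma2=1.0):
    """BIC-style approximation: max log-likelihood minus (d/2) log n, with d = 1."""
    n = len(y)
    mle = y.mean()
    loglik = -0.5 * (n * np.log(2 * np.pi * sigma2) + ((y - mle) ** 2).sum() / sigma2)
    return loglik - 0.5 * np.log(n)

def log_marginal_exact(y, sigma2=1.0, tau2=4.0):
    """Exact log p(y): marginally y ~ N(0, sigma2*I + tau2*J) in this model."""
    n = len(y)
    logdet = (n - 1) * np.log(sigma2) + np.log(sigma2 + n * tau2)
    quad = (y ** 2).sum() / sigma2 - tau2 * y.sum() ** 2 / (sigma2 * (sigma2 + n * tau2))
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
```

On simulated data the Laplace value matches the exact log marginal to machine precision (exactness is guaranteed for a Gaussian posterior), while the BIC approximation differs by an O(1) prior-dependent amount.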
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
28

Aiken, John Charles. "The development of a colour liquid crystal display spatial light modulator and applications in polychromatic optical data processing." Thesis, Queen's University Belfast, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326384.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Nautiyal, Atul. "Aspects of spatial wavelets and their application to modelling seismic reflection data." Thesis, University of British Columbia, 1986. http://hdl.handle.net/2429/26504.

Full text
Abstract:
The propagation of seismic waves may be described in the space-frequency domain by the Rayleigh-Sommerfeld convolution integral. The kernel of this integral is called a spatial wavelet and it embodies the physics and geometry of the propagation problem. The concepts of spatial convolution and spatial wavelet are simple and are similar to other topics studied by geophysicists. With a view to understanding these concepts, some aspects of spatial wavelets and their application to two-dimensional, zero-offset, acoustic seismic modelling were investigated. In studying the spatial wavelet, two topics in particular were examined: spatial aliasing and wavelet truncation. Spatial aliasing arises from the need to compute a discrete wavelet for implementation on a computer. This problem was solved by using an analytic expression for the spatial wavelet in the Fourier (wavenumber) domain. In the wavenumber domain the wavelet was windowed by a fourth-order Butterworth operator, which removed aliasing. This technique is simple and flexible in its use. The second problem of wavelet truncation is due to the necessity of having a wavelet of finite length. A length-limiting scheme based upon the energy content of a wavelet was developed. It was argued that if a large portion of the wavelet energy was contained in a finite number of samples, then truncation at that sample would incur a minimal loss of information. Numerical experiments showed this to be true. The smallest-length wavelet was found to depend on temporal frequency, medium velocity and extrapolation increment. The combined effects of these two solutions to the practical problem of computing a spatial wavelet resulted in two drawbacks. First, the wavelets provide modelling capabilities up to structural dips of 30 degrees. Second, there is a potential for instability due to recursive application of the wavelet. However, neither of these difficulties hampered the modelling of fairly complex structures.
The spatial wavelet concept was applied to seismic modelling for media of varying complexity. Homogeneous velocity models were used to demonstrate diffraction evolution, dip limitations and imaging of curved structures. The quality of modelling was evaluated by migrating the modelled data to recover the time-image model of the reflection structure. Migrations of dipping and synform structures indicated that the modelled results were of a high calibre. Horizontally stratified velocity models were also examined for dipping and synform structures. Modelling these reflection structures showed that the introduction of a depth variable velocity profile has a tremendous influence on the synthetic seismic section. Again, migration proved that the quality of the data was excellent. Finally, the spatial wavelet algorithm was extended to the case of laterally varying velocity structures. The effects of space variant spatial convolution in the presence of a smoothed velocity field were examined. Smoothed velocity fields were computed by a simple weighted averaging procedure. The weighting function used was a decaying exponential whose decay rate determined the amount of smoothing. Seismograms computed for this case showed that the algorithm gave smoother and more continuous reflection signatures when the velocity field has been smoothed so that the largest lateral velocity gradient corresponded to the lower end of the temporal frequency band of the spatial wavelets. In this respect, the results are similar to those of geometric ray theory. Also, the travel times of these models compared favourably with those of ray tracings.
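The energy-based truncation rule described in this abstract can be sketched as follows: retain the shortest leading segment of the wavelet that contains a chosen fraction of its total energy. The 99% threshold and the function name are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def truncate_by_energy(w: np.ndarray, fraction: float = 0.99) -> np.ndarray:
    """Shortest leading segment of wavelet w holding `fraction` of its total energy.

    Works for real or complex wavelet samples (energy = |w|^2).
    """
    energy = np.cumsum(np.abs(w) ** 2)       # running energy content
    total = energy[-1]
    # First index where the running energy reaches the target fraction.
    n = int(np.searchsorted(energy, fraction * total)) + 1
    return w[:n]
```

Consistent with the abstract, the truncation length found this way would vary with temporal frequency, medium velocity and extrapolation increment, since all three change how quickly the wavelet's energy decays along the spatial axis.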
Science, Faculty of
Earth, Ocean and Atmospheric Sciences, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
30

Kanaparthy, Venu Madhav Singh. "GML REPRESENTATION FOR INTEROPERABLE SPATIAL DATA EXCHANGE IN A MOBILE MAPPING APPLICATION." MSSTATE, 2004. http://sun.library.msstate.edu/ETD-db/theses/available/etd-07102004-133629/.

Full text
Abstract:
Geographic information is critical to GIS applications located remotely for executing business operations. GIS applications need to interoperate to be able to share information for analysis and decision-making processes. Heterogeneity and complexity of information models and structures limit data flow and application interoperation. Advancements in Internet technologies have provided new opportunities for delivering spatial information to remote users. However, the spatial data delivered is in proprietary structures, limiting its utility to GIS applications. To enable information flow between GIS applications, a portable data modeling approach is necessary. However, geographic information is inherently complex to model. A comprehensive and standardized vocabulary to model the characteristics of geographic entities is required. Furthermore, applications with the need to share information should have an agreement on the information structure and content exchanged. This research presents a GML representation to provide interoperable spatial data services. The objective is achieved by providing an open framework to model, encode and deliver geographic information. The results of this research show that it is possible to develop interoperable spatial data services through a service-oriented architecture.
APA, Harvard, Vancouver, ISO, and other styles
31

Kanaparthy, Venu Madhav Singh. "GML representation for interoperable spatial data exchange in a mobile mapping application." Master's thesis, Mississippi State : Mississippi State University, 2004. http://library.msstate.edu/etd/show.asp?etd=etd-07102004-133629.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Maguire, Ralph Paul. "Application of pharmacokinetic models to projection data in positron emission tomography." Thesis, University of Surrey, 1999. http://epubs.surrey.ac.uk/844467/.

Full text
Abstract:
In positron emission tomography (PET), coincidence detection of annihilation photons enables the measurement of Radon transforms of the instantaneous activity concentration of labelled tracers in the human body. Using reconstruction algorithms, spatial maps of the activity distribution can be created and analysed to reveal the pharmacokinetics of the labelled tracer. This thesis considers the possibility of applying pharmacokinetic modelling to the count rate data measured by the detectors, rather than to reconstructed images. A new concept is proposed, parameter projections: Radon transforms of the spatial distribution of the parameters of the model, which simplifies the problem considerably. Using this idea, a general linear least squares (GLLS) framework is developed and applied to the one and two tissue-compartment models for [O-15]water and [F-18]FDG. Simulation models are developed from first principles to demonstrate the accuracy of the GLLS approach to parameter estimation. This requires the validation of the whole body distribution of each of the tracers, using pharmacokinetic techniques, leading to novel compartment-based whole body models for [O-15]water and [F-18]FDG. A simplified Monte-Carlo framework for error estimation of the tissue models is developed, based on system parameters. It is also shown that the variances of maps of the spatial variance of the parameters of the model, parametric images, can be calculated in projection space. It is clearly demonstrated that the precision of the variance estimates is higher than that obtained from estimates based on reconstructed images. Using these methods, it is shown how statistical parametric maps of the difference between two neuronal activation conditions can be calculated from projection data.
The methods developed allow faster results analysis, avoiding lengthy reconstruction of large data sets, and allow access to robust statistical techniques for activation analysis through use of the known, Poisson-distributed nature of the measured projection data.
APA, Harvard, Vancouver, ISO, and other styles
33

Zecha, Christoph Walter [Verfasser], and Roland [Akademischer Betreuer] Gerhards. "Spatial combination of sensor data deriving from mobile platforms for precision farming applications / Christoph Walter Zecha ; Betreuer: Roland Gerhards." Hohenheim : Kommunikations-, Informations- und Medienzentrum der Universität Hohenheim, 2019. http://d-nb.info/1189206706/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Nelson, Andrew Darren. "The spatial analysis of socio-economic and agricultural data across geographic scales : examples and applications in Honduras and elsewhere." Thesis, University of Leeds, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.405809.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Zecha, Christoph [Verfasser], and Roland [Akademischer Betreuer] Gerhards. "Spatial combination of sensor data deriving from mobile platforms for precision farming applications / Christoph Walter Zecha ; Betreuer: Roland Gerhards." Hohenheim : Kommunikations-, Informations- und Medienzentrum der Universität Hohenheim, 2019. http://d-nb.info/1189206706/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Sharma, Rahil. "Shared and distributed memory parallel algorithms to solve big data problems in biological, social network and spatial domain applications." Diss., University of Iowa, 2016. https://ir.uiowa.edu/etd/2277.

Full text
Abstract:
Big data refers to information which cannot be processed and analyzed using traditional approaches and tools, due to the 4 V's: sheer Volume, the Velocity at which data is received and processed, and data Variety and Veracity. Today massive volumes of data originate in domains such as geospatial analysis and biological and social networks. Hence, designing scalable algorithms for efficient processing of this massive data is a significant challenge in the field of computer science. One way to achieve such efficient and scalable algorithms is by using shared and distributed memory parallel programming models. In this thesis, we present a variety of such algorithms to solve problems in the various domains mentioned above. We solve five problems that fall into two categories. The first group of problems deals with the issue of community detection. Detecting communities in real world networks is of great importance because they consist of patterns that can be viewed as independent components, each of which has distinct features and can be detected based upon network structure. For example, communities in social networks can help target users for marketing purposes, provide user recommendations to connect with and join communities or forums, etc. We develop a novel sequential algorithm to accurately detect community structures in biological protein-protein interaction networks, where a community corresponds to a functional module of proteins. Generally, such sequential algorithms are computationally expensive, which makes them impractical to use for large real world networks. To address this limitation, we develop a new highly scalable Symmetric Multiprocessing (SMP) based parallel algorithm to detect high quality communities in large subsections of social networks like Facebook and Amazon. Due to the SMP architecture, however, our algorithm cannot process networks whose size is greater than the size of the RAM of a single machine.
With the increasing size of social networks, community detection has become even more difficult, since network size can reach up to hundreds of millions of vertices and edges. Processing such massive networks requires several hundred gigabytes of RAM, which is only possible by adopting distributed infrastructure. To address this, we develop a novel hybrid (shared + distributed memory) parallel algorithm to efficiently detect high quality communities in massive Twitter and .uk domain networks. The second group of problems deals with the issue of efficiently processing spatial Light Detection and Ranging (LiDAR) data. LiDAR data is widely used in forest and agricultural crop studies, landscape classification, 3D urban modeling, etc. Technological advancements in building LiDAR sensors have enabled highly accurate and dense LiDAR point clouds, resulting in massive data volumes, which pose computing issues with processing and storage. We develop the first published landscape-driven data reduction algorithm, which uses the slope-map of the terrain as a filter to reduce the data without sacrificing its accuracy. Our algorithm is highly scalable and adopts a shared-memory parallel architecture. We also develop a parallel interpolation technique that is used to generate highly accurate continuous terrains, i.e. Digital Elevation Models (DEMs), from discrete LiDAR point clouds.
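The slope-map filtering idea above can be sketched minimally: keep all points on steep terrain (where elevation changes quickly and detail matters) and thin the points on flat terrain. This assumes per-point slopes have already been sampled from the slope map; the threshold, thinning rate, and function name are illustrative assumptions, not the dissertation's parameters or algorithm.

```python
import numpy as np

def reduce_by_slope(points: np.ndarray, slope: np.ndarray,
                    steep_deg: float = 15.0, flat_keep_every: int = 10) -> np.ndarray:
    """Keep all LiDAR points on steep terrain; thin flat-terrain points.

    points: (N, 3) array of x, y, z coordinates.
    slope:  (N,) terrain slope in degrees, looked up from the slope map at each point.
    """
    steep = slope >= steep_deg
    # Keep only every k-th point in flat areas, all points in steep areas.
    flat_idx = np.nonzero(~steep)[0][::flat_keep_every]
    keep = np.zeros(len(points), dtype=bool)
    keep[steep] = True
    keep[flat_idx] = True
    return points[keep]
```

In flat areas little accuracy is lost by thinning, because a DEM interpolated from the surviving points changes slowly there; steep areas retain full density.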
APA, Harvard, Vancouver, ISO, and other styles
37

Huang, Xingang. "RELSA : automatic analysis of spatial data sets using visual reasoning techniques with an application to weather data analysis /." The Ohio State University, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488193665235735.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

McBride, John Jacob Bratcher Thomas L. "Conjugate hierarchical models for spatial data an application on an optimal selection procedure /." Waco, Tex. : Baylor University, 2006. http://hdl.handle.net/2104/3955.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

da, Silva Brum Bastos Vanessa. "New methods and applications for context aware movement analysis (CAMA)." Thesis, University of St Andrews, 2019. http://hdl.handle.net/10023/16812.

Full text
Abstract:
Recent years have seen a rapid growth in movement research owing to new technologies contributing to the miniaturization and reduced costs of tracking devices. Similar trends have occurred in how environmental data are being collected (e.g., through satellites, unmanned aerial vehicles, and sensor networks). However, the development of analytical techniques for movement research has failed to keep pace with these data collection advances. There is a need for new methods capable of integrating increasingly detailed movement data with a myriad of contextual data, termed context aware movement analysis (CAMA). CAMA investigates more than movement geometry, by including biological and environmental conditions that may influence movement. However, there is a shortage of methods relating movement patterns to contextual factors, which is still limiting our ability to extract meaningful information from movement data. This thesis contributes to this methodological research gap by assessing the state-of-the-art for CAMA within movement ecology and human mobility research, developing innovative methods to consider the spatio-temporal differences between movement data and contextual data, and exploring computational methods that allow identification of patterns in contextualized movement data. We developed new methods and demonstrated how they facilitated and improved the integration between high frequency tracking data and temporally dynamic environmental variables. One of the methods, multi-channel sequence analysis, is then used to discover varying human behaviour relative to weather conditions in a large human GPS tracking dataset from Scotland. The second method is developed for combining multi-sensor satellite imagery (i.e., image fusion) of differing spatial and temporal resolutions. This method is applied to GPS tracking data on maned wolves in Brazil to understand fine-scale movement behaviours related to vegetation changes across seasons.
In summary, this thesis provides a significant development in terms of new ideas and techniques for performing CAMA for human and wildlife movement studies.
APA, Harvard, Vancouver, ISO, and other styles
40

Martínez, Guardiola Francisco Javier. "Liquid Crystal on Silicon Displays Characterization for Diffractive Applications and for Holographic Data Storage in Photopolymers." Doctoral thesis, Universidad de Alicante, 2015. http://hdl.handle.net/10045/50217.

Full text
Abstract:
In this PhD thesis I present methods for characterizing PA-LCoS microdisplays. Fully characterizing this type of device is useful for evaluating the performance required in different applications. We have tested its validity in different applications, such as diffractive optical elements (DOEs). Finally, we apply these microdisplays in a full holographic data storage scheme using a photopolymer as the holographic recording medium. We evaluate the capability of PVA/AA photopolymer for this holographic data storage system, which incorporates as a novelty a convergent correlator geometry.
APA, Harvard, Vancouver, ISO, and other styles
41

Sun, Xiaoqian. "Bayesian spatial data analysis with application to the Missouri Ozark forest ecosystem project." Diss., Columbia, Mo. : University of Missouri-Columbia, 2006. http://hdl.handle.net/10355/4477.

Full text
Abstract:
Thesis (Ph.D.)--University of Missouri-Columbia, 2006.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on May 1, 2007). Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
42

Chandler, Jim H. "The acquisition of spatial data from archival photographs and their application to geomorphology." Thesis, City University London, 1989. http://openaccess.city.ac.uk/7400/.

Full text
Abstract:
This thesis discusses the development and application of an analytical photogrammetric technique which enables accurate spatial data, of known quality, to be derived from archival photographs. Such a facility represents an important advancement, particularly for geomorphologists, because the effects of geomorphological processes can be assessed quantitatively and directly by comparing spatial data derived from photographs at different epochs. Sources of archival photographs of England are identified and the type, quantity, range and age of each major collection is discussed. Existing methods of deriving spatial data from photographs are reviewed and illustrated by previous research, with particular emphasis upon the limitations associated with each method. The technique that was developed is based upon a self-calibrating bundle adjustment, and both the functional and stochastic models suitable for successful restitution of archival photographs were established. Five computer programs were developed and the algorithms associated with each are given. These programs are run sequentially and assist in rapid restitution of archival photography and in deriving measures of data quality. The technique is applied successfully to a forty-year sequence of archival photographs, obtained from a variety of sources, of the Black Ven landslide, Dorset, England. Spatial data was derived from five photographic epochs, at approximately 10-year intervals, using an analytical plotter. A secondary aim of the research was to extend existing techniques and devise new methods of processing these spatial data for geomorphological purposes. Several techniques were found to be especially valuable, including: the production of morpho-genetic maps; DTMs of difference; evolutionary models; animated sequences; and distributions of slope angle. The latter has shown that the evolutionary model of 'dynamic equilibrium' is valid for the Black Ven landslides.
All aspects of data quality are examined, particularly the functional model used in the self-calibrating bundle adjustment. This least squares estimating procedure is found to be perfectly adequate for the successful restitution of archival photographs.
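One of the processing techniques listed above, differencing digital terrain models between photographic epochs, can be sketched in a few lines. This is an illustrative example only; the grid values, the assumed 5 m grid spacing and the tiny array shapes below are hypothetical, not taken from the thesis.

```python
import numpy as np

# Two hypothetical elevation grids (metres) for the same area at two epochs.
dtm_1948 = np.array([[120.0, 118.5],
                     [115.0, 110.0]])
dtm_1958 = np.array([[119.0, 117.0],
                     [114.5, 108.0]])

# DTM of difference: per-cell elevation change between epochs.
# Negative values indicate surface lowering (e.g. landslide erosion).
dod = dtm_1958 - dtm_1948

# Net volumetric change = total elevation change x cell area.
cell_area = 25.0  # m^2, assuming a 5 m grid spacing
volume_change = dod.sum() * cell_area
print(volume_change)  # -> -125.0 (net loss of 125 m^3)
```

In practice both grids must first be co-registered to a common datum, which is exactly what the self-calibrating bundle adjustment described in the abstract provides.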
APA, Harvard, Vancouver, ISO, and other styles
43

Cunningham, Helen. "Spatially related data and GIS for land and property applications." Thesis, University of Newcastle Upon Tyne, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.240743.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Li, Han. "Statistical Modeling and Analysis of Bivariate Spatial-Temporal Data with the Application to Stream Temperature Study." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/70862.

Full text
Abstract:
Water temperature is a critical factor for the quality and biological condition of streams. Among the various factors affecting stream water temperature, air temperature is one of the most important. To appropriately quantify the relationship between water and air temperatures over a large geographic region, it is important to accommodate the spatial and temporal information of the stream temperature. In this dissertation, I devote effort to several statistical modeling techniques for analyzing bivariate spatial-temporal data in a stream temperature study. In the first part, I focus our analysis on the individual stream. A time varying coefficient model (VCM) is used to study the relationship between air temperature and water temperature for each stream. The time varying coefficient model enables dynamic modeling of the relationship, and therefore can be used to enhance the understanding of water and air temperature relationships. The proposed model is applied to 10 streams in Maryland, West Virginia, Virginia, North Carolina and Georgia using daily maximum temperatures. The VCM approach increases the prediction accuracy by more than 50% compared to the simple linear regression model and the nonlinear logistic model. The VCM that describes the relationship between water and air temperatures for each stream is represented by slope and intercept curves from the fitted model. In the second part, I consider water and air temperatures for different streams that are spatially correlated. I focus on clustering multiple streams by using intercept and slope curves estimated from the VCM. Spatial information is incorporated to make clustering results geographically meaningful. I further propose a weighted distance as a dissimilarity measure for streams, which provides a flexible framework to interpret the clustering results under different weights.
Real data analysis shows that streams in the same cluster share similar geographic features such as solar radiation, percent forest and elevation. In the third part, I develop a spatial-temporal VCM (STVCM) to deal with missing data. The STVCM takes both spatial and temporal variation of water temperature into account. I develop a novel estimation method that emphasizes the time effect and treats the space effect as a varying coefficient for the time effect. A simulation study shows that the performance of the STVCM on missing data imputation is better than several existing methods such as neural networks and Gaussian processes. The STVCM is also applied to all 156 streams in this study to obtain a complete data record.
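As a rough illustration of the time-varying coefficient idea described above, the sketch below fits a windowed least-squares regression of water on air temperature, so that the intercept and slope are allowed to drift over the year. Everything here (the synthetic temperature series, the boxcar kernel, the 30-day bandwidth) is a hypothetical stand-in for the thesis's actual VCM estimator, not a reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(365)  # day of year
air = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, 365)
# Synthetic "water" series generated with a slowly varying slope b(t).
b_true = 0.6 + 0.2 * np.cos(2 * np.pi * t / 365)
water = 5 + b_true * air + rng.normal(0, 0.5, 365)

def local_vcm(t, x, y, bandwidth=30):
    """Estimate intercept a(t) and slope b(t) by windowed least squares."""
    a_hat, b_hat = np.empty_like(x), np.empty_like(x)
    for i, ti in enumerate(t):
        w = np.abs(t - ti) <= bandwidth  # boxcar kernel window
        X = np.column_stack([np.ones(w.sum()), x[w]])
        coef, *_ = np.linalg.lstsq(X, y[w], rcond=None)
        a_hat[i], b_hat[i] = coef
    return a_hat, b_hat

a_hat, b_hat = local_vcm(t, air, water)
```

The fitted `a_hat` and `b_hat` curves play the role of the intercept and slope curves the dissertation later uses as clustering features.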
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
45

Apanasovich, Tatiyana Vladimirovna. "Testing for spatial correlation and semiparametric spatial modeling of binary outcomes with application to aberrant crypt foci in colon carcinogenesis experiments." Diss., Texas A&M University, 2004. http://hdl.handle.net/1969.1/2674.

Full text
Abstract:
In an experiment to understand colon carcinogenesis, all animals were exposed to a carcinogen while half the animals were also exposed to radiation. Spatially, we measured the existence of aberrant crypt foci (ACF), namely morphologically changed colonic crypts that are known to be precursors of colon cancer development. The biological question of interest is whether the locations of these ACFs are spatially correlated: if so, this indicates that damage to the colon due to carcinogens and radiation is localized. Statistically, the data take the form of binary outcomes (corresponding to the existence of an ACF) on a regular grid. We develop score-type methods based upon the Matérn and conditional autoregression (CAR) correlation models to test for spatial correlation in such data, while allowing for nonstationarity. Because of a technical peculiarity of the score-type test, we also develop robust versions of the method. The methods are compared to a generalization of Moran's test for continuous outcomes, and are shown via simulation to have the potential for increased power. When applied to our data, the methods indicate the existence of spatial correlation, and hence indicate localization of damage. Assuming that there are correlations in the locations of the ACF, the questions are how great these correlations are, and whether the correlation structures differ when an animal is exposed to radiation. To understand the extent of the correlation, we cast the problem as a spatial binary regression, where binary responses arise from an underlying Gaussian latent process. We model these marginal probabilities of ACF semiparametrically, using fixed-knot penalized regression splines and single-index models. We fit the models using pairwise pseudolikelihood methods. Assuming that the underlying latent process is strongly mixing, known to be the case for many Gaussian processes, we prove asymptotic normality of the methods.
The penalized regression splines have penalty parameters that must converge to zero asymptotically: we derive rates for these parameters that do and do not lead to an asymptotic bias, and we derive the optimal rate of convergence for them. Finally, we apply the methods to the data from our experiment.
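The abstract compares the proposed score-type tests to a generalization of Moran's test. As context, here is a minimal sketch of the classical Moran's I statistic on a binary grid with rook (4-neighbour) adjacency; this is only the baseline statistic, not the score-type tests the dissertation develops, and the example grid is hypothetical.

```python
import numpy as np

def morans_i(grid):
    """Moran's I on a regular grid with rook (4-neighbour) adjacency."""
    z = grid - grid.mean()
    num = 0.0    # sum of w_ij * z_i * z_j over all ordered neighbour pairs
    w_sum = 0.0  # total weight
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # visit each pair once...
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    num += 2 * z[r, c] * z[rr, cc]  # ...count it symmetrically
                    w_sum += 2
    n = grid.size
    return (n / w_sum) * num / (z ** 2).sum()

# A clustered binary pattern (1 = ACF present) gives positive I.
clustered = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0]], dtype=float)
print(round(morans_i(clustered), 3))  # -> 0.556
```

A perfect checkerboard pattern yields I = -1 under this weighting, the most negative spatial autocorrelation the grid allows, which makes the statistic a convenient sanity check for clustering versus repulsion.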
APA, Harvard, Vancouver, ISO, and other styles
46

Leighty, Brian David. "Data Mining for Induction of Adjacency Grammars and Application to Terrain Pattern Recognition." NSUWorks, 2009. http://nsuworks.nova.edu/gscis_etd/212.

Full text
Abstract:
The process of syntactic pattern recognition makes the analogy between the syntax of languages and the structure of spatial patterns. The recognition process is achieved by parsing a given pattern to determine if it is syntactically correct with respect to a defined grammar. The generation of pattern grammars can be a cumbersome process when many objects are involved. This has led to the problem of spatial grammar inference. Current approaches have used genetic algorithms and inductive techniques and have demonstrated limitations. Alternative approaches are needed that produce accurate grammars while remaining computationally efficient in light of the NP-hardness of the problem. Co-location rule mining techniques in the field of Knowledge Discovery and Data Mining address the complexity issue using neighborhood restrictions and pruning strategies based on monotonic Measures Of Interest. The goal of this research was to develop and evaluate an inductive method for inferring an adjacency grammar utilizing co-location rule mining techniques to gain efficiency while providing accurate and concise production sets. The method incrementally discovers, without supervision, adjacency patterns in spatial samples, relabels them via a production rule and repeats the procedure with the newly labeled regions. The resulting rules are used to form an adjacency grammar. Grammars were generated and evaluated within the context of a syntactic pattern recognition system that identifies landform patterns in terrain elevation datasets. The proposed method was tested using a k-fold cross-validation methodology. Two variations were also tested using unsupervised and supervised training, both with no rule pruning. Comparison of these variations with the proposed method demonstrated the effectiveness of rule pruning and rule discovery. 
Results showed that the proposed method of rule inference produced rulesets having recall, precision and accuracy values of 82.6%, 97.7% and 92.8%, respectively, which are similar to those using supervised training. These rulesets were also the smallest, had the lowest average number of rules fired in parsing, and had the shortest average parse time. The use of rule pruning substantially reduced rule inference time (104.4 s vs. 208.9 s). The neighborhood restriction used in adjacency calculations demonstrated linear complexity in the number of regions.
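The discover-and-relabel loop described above can be caricatured in one dimension: count adjacent label pairs and promote sufficiently frequent pairs to productions. This toy sketch is only an analogue of the idea; the dissertation works on 2-D region adjacency with pruning based on monotonic Measures Of Interest, and the landform labels and support threshold below are hypothetical.

```python
from collections import Counter

def discover_adjacency_rules(regions, min_support=2):
    """One pass of unsupervised adjacency-rule discovery: count adjacent
    label pairs in a 1-D strip of region labels and promote pairs meeting
    a support threshold to productions (new nonterminal -> a b)."""
    pairs = Counter(zip(regions, regions[1:]))
    rules = {}
    for (a, b), count in pairs.items():
        if count >= min_support:
            rules[f"P_{a}_{b}"] = (a, b)
    return rules

# Hypothetical strip of landform labels.
strip = ["ridge", "valley", "ridge", "valley", "plain"]
rules = discover_adjacency_rules(strip)
print(rules)  # -> {'P_ridge_valley': ('ridge', 'valley')}
```

In the full method the newly labeled regions would replace the matched pairs and the pass would repeat, so that productions compose into a hierarchical adjacency grammar.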
APA, Harvard, Vancouver, ISO, and other styles
47

Boulil, Kamal. "Une approche automatisée basée sur des contraintes d’intégrité définies en UML et OCL pour la vérification de la cohérence logique dans les systèmes SOLAP : applications dans le domaine agri-environnemental." Thesis, Clermont-Ferrand 2, 2012. http://www.theses.fr/2012CLF22285/document.

Full text
Abstract:
Spatial Data Warehouse (SDW) and Spatial OLAP (SOLAP) systems are Business Intelligence (BI) technologies allowing for interactive multidimensional analysis of huge volumes of spatial data. In such systems the quality of analysis mainly depends on three components: the quality of warehoused data, the quality of data aggregation, and the quality of data exploration. The warehoused data quality depends on elements such as accuracy, completeness and logical consistency. The data aggregation quality is affected by structural problems (e.g., non-strict dimension hierarchies that may cause double-counting of measure values) and semantic problems (e.g., summing temperature values does not make sense in many applications). The data exploration quality is mainly affected by inconsistent user queries (e.g., what were the temperature values in the USSR in 2010?) leading to possibly meaningless interpretations of query results. This thesis addresses the problems of logical inconsistency that may affect the data, aggregation and exploration qualities in SOLAP systems. Logical inconsistency is usually defined as the presence of contradictions in data; it is typically controlled by means of Integrity Constraints (IC). In this thesis, we first extend the notion of IC (in the SOLAP domain) in order to take into account aggregation and query incoherencies. To overcome the limitations of existing approaches concerning the definition of SOLAP IC, we propose a framework based on the standard languages UML and OCL. Our framework permits a platform-independent conceptual design and an automatic implementation of SOLAP IC; it consists of three parts: (1) a classification of SOLAP IC; (2) a UML profile implemented in the CASE tool MagicDraw, allowing for the conceptual design of SOLAP models and their IC; (3) an automatic implementation based on the code generators Spatial OCL2SQL and UML2MDX, which translates the conceptual specifications into code at the SDW and SOLAP server layers. Finally, the contributions of this thesis have been applied and validated in the context of French national projects aiming at developing (S)OLAP applications for agriculture and the environment.
APA, Harvard, Vancouver, ISO, and other styles
48

Fackel, Edvin, and Robin Kolmodin. "Webbapplikation för felsökning på nätverksnoder." Thesis, Högskolan i Gävle, Datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-32494.

Full text
Abstract:
Handling alarms from network nodes is currently a process that requires network technicians to visit several sources of information before they can draw a conclusion about the cause of the alarm. Consulting several different sources of information makes it difficult to get a good overview of the problem. This study was commissioned by the Swedish Transport Administration (Trafikverket) to ease the workload of the staff at its network operations center (NOC). A web application has been developed to amalgamate the most common sources a network technician visits. The web application presents the amalgamated information in a way that eases the troubleshooting process for the network technicians, and indicates the most common fault causes depending on which kind of alarm is being handled. The large number of available software options can make it difficult for a developer to choose a suitable stack; even after excluding options unsuited to the purpose, many choices remain. This study shows that NodeJS, PostgreSQL and PostGIS work well together and are a suitable choice for a web application that needs spatial functions. To evaluate the web application, a user test was conducted in which nine participants, 47% of the personnel at Trafikverket NOC, took part. The results were positive both regarding how well the web application fulfilled its purpose of indicating possible causes of a network alarm and regarding the participants' experience with it. All participants stated that the web application would save them time, and the three most popular features were polygons showing electricity grid owners, real-time weather data, and an amalgamation of internal data in one place.
APA, Harvard, Vancouver, ISO, and other styles
49

Charles, Stephen. "A study of spatial data models and their application to selecting information from pictorial databases." Thesis, Loughborough University, 1991. https://dspace.lboro.ac.uk/2134/33079.

Full text
Abstract:
People have always used visual techniques to locate information in the space surrounding them. However with the advent of powerful computer systems and user-friendly interfaces it has become possible to extend such techniques to stored pictorial information. Pictorial database systems have in the past primarily used mathematical or textual search techniques to locate specific pictures contained within such databases. However these techniques have largely relied upon complex combinations of numeric and textual queries in order to find the required pictures. Such techniques restrict users of pictorial databases to expressing what is in essence a visual query in a numeric or character based form. What is required is the ability to express such queries in a form that more closely matches the user's visual memory or perception of the picture required. It is suggested in this thesis that spatial techniques of search are important and that two of the most important attributes of a picture are the spatial positions and the spatial relationships of objects contained within such pictures. It is further suggested that a database management system which allows users to indicate the nature of their query by visually placing iconic representations of objects on an interface in spatially appropriate positions, is a feasible method by which pictures might be found from a pictorial database. This thesis undertakes a detailed study of spatial techniques using a combination of historical evidence, psychological conclusions and practical examples to demonstrate that the spatial metaphor is an important concept and that pictures can be readily found by visually specifying the spatial positions and relationships between objects contained within them.
APA, Harvard, Vancouver, ISO, and other styles
50

Tian, Peng. "Graphene-based high spatial resolution hall sensors with potential application for data storage media characterisation." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/graphenebased-high-spatial-resolution-hall-sensors-with-potential-application-for-data-storage-media-characterisation(0bb9f59f-a9e2-42e8-ac1f-0adc93e9ae01).html.

Full text
Abstract:
This thesis reports on two graphene-based structures that have been proposed and fabricated as possible prototypes for high-spatial-resolution Hall sensors with potential application in research on high-density magnetic recording technology, such as bit patterned magnetic recording (BPMR), and in other areas where the measurement of highly inhomogeneous fields is required. The first structure, named TYPE I in this thesis, has a direct graphene-metal contact, so that the anomalous Hall effect (AHE) in the ferromagnetic islands deposited on the graphene could be detected. In the second structure, named TYPE II, the graphene and the metal are isolated by an h-BN layer, so that only the stray field from the islands can be detected using the ordinary Hall effect (OHE). The transport measurements performed on TYPE I devices revealed no detectable AHE or stray-field signal, and their Hall resistance relations are non-linear and do not pass through the origin. A finite element simulation comparing the resistance of the empty graphene cross and the island-occupied cross indicates that the current in the graphene may not redistribute through the metallic islands due to interface current blocking, resulting in the non-appearance of the expected AHE signal. Moreover, an analysis of the longitudinal magnetoresistance (MR) data reveals that a two-fluid model and an effective medium theory (EMT) model might be the major graphene MR mechanisms in the regimes away from and near to the charge neutrality point (CNP), respectively. As a combined result of the above findings, a joint MR-Hall effect model, under the condition of a pre-existing transverse offset current, is proposed to explain the unusual behaviour of the Hall measurement data of the TYPE I devices. The model gives qualitatively correct fitting for all longitudinal and transverse transport data of TYPE I devices.
In addition, the nature of the graphene/metal contact is considered the likely reason for the non-appearance of the expected AHE and stray-field signal, although further experimental work, suggested in the thesis, is needed to clarify this issue. On the other hand, the TYPE II devices have shown their potential to be developed into a Hall sensor able to detect a sub-micron magnetic island, but there is still considerable room to improve the device performance. At the end of the thesis, future experimental work that could lead to the eventual development of a high-sensitivity, high-spatial-resolution Hall sensor on the basis of the TYPE I and TYPE II structures is suggested and described.
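As background to why a two-dimensional conductor such as graphene suits high-sensitivity Hall sensing, the ordinary Hall voltage scales inversely with the sheet carrier density: V_H = I B / (n_s e). The sketch below evaluates this relation at a hypothetical operating point; the drive current, field and carrier density are order-of-magnitude assumptions, not measurements from the thesis.

```python
# Ordinary Hall effect in a 2-D conductor:
# V_H = I * B / (n_s * e), where n_s is the sheet carrier density.
E_CHARGE = 1.602e-19  # elementary charge, C

def hall_voltage(current_a, field_t, sheet_density_m2):
    """Hall voltage of a two-dimensional conductor (OHE only)."""
    return current_a * field_t / (sheet_density_m2 * E_CHARGE)

# Hypothetical operating point: 10 uA drive current, 50 mT stray field,
# sheet density 1e16 m^-2 (a plausible order for gated graphene).
v_h = hall_voltage(10e-6, 0.05, 1e16)
print(v_h)  # ~3.1e-4 V, i.e. about 0.3 mV
```

The low achievable n_s in gated graphene is what pushes V_H per unit field up relative to metallic Hall crosses, at the cost of the contact and MR complications the abstract describes.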
APA, Harvard, Vancouver, ISO, and other styles