
Theses on the topic "3D Remote Sensing data"



Consult the top 50 theses for your research on the topic "3D Remote Sensing data".


You can also download the full text of each publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Alkhadour, Wissam M. "Reconstruction of 3D scenes from pairs of uncalibrated images. Creation of an interactive system for extracting 3D data points and investigation of automatic techniques for generating dense 3D data maps from pairs of uncalibrated images for remote sensing applications". Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4933.

Full text
Abstract
Much research effort has been devoted to producing algorithms that contribute directly or indirectly to the extraction of 3D information from a wide variety of types of scenes and conditions of image capture. The research work presented in this thesis is aimed at three distinct applications in this area: interactively extracting 3D points from a pair of uncalibrated images in a flexible way; finding corresponding points automatically in high resolution images, particularly those of archaeological scenes captured from a freely moving light aircraft; and improving a correlation approach to dense disparity mapping leading to 3D surface reconstructions. The fundamental concepts required to describe the principles of stereo vision, the camera models, and the epipolar geometry described by the fundamental matrix are introduced, followed by a detailed literature review of existing methods. An interactive system for viewing a scene via a monochrome or colour anaglyph is presented which allows the user to choose the level of compromise between amount of colour and ghosting perceived by controlling colour saturation, and to choose the depth plane of interest. An improved method of extracting 3D coordinates from disparity values when there is significant error is presented. Interactive methods, while very flexible, require significant effort from the user finding and fusing corresponding points and the thesis continues by presenting several variants of existing scale invariant feature transform methods to automatically find correspondences in uncalibrated high resolution aerial images with improved speed and memory requirements. In addition, a contribution to estimating lens distortion correction by a Levenberg Marquard based method is presented; generating data strings for straight lines which are essential input for estimating lens distortion correction. The remainder of the thesis presents correlation based methods for generating dense disparity maps based on single and multiple image rectifications using sets of automatically found correspondences and demonstrates improvements obtained using the latter method. Some example views of point clouds for 3D surfaces produced from pairs of uncalibrated images using the methods presented in the thesis are included.
Al-Baath University
The appendix files and images are not available online.
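As background for this entry: the epipolar constraint encoded by the fundamental matrix can be estimated from eight or more point correspondences with the classic normalized eight-point algorithm. The NumPy sketch below illustrates only that textbook step, not the interactive system or the SIFT-variant pipeline developed in the thesis; the cameras and correspondences are hypothetical.

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: centre the points and scale so the mean
    distance from the origin is sqrt(2); returns homogeneous points and T."""
    centroid = pts.mean(axis=0)
    scale = np.sqrt(2) / np.sqrt(((pts - centroid) ** 2).sum(axis=1)).mean()
    T = np.array([[scale, 0, -scale * centroid[0]],
                  [0, scale, -scale * centroid[1]],
                  [0, 0, 1]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def fundamental_8point(pts1, pts2):
    """Normalized eight-point estimate of F such that x2^T F x1 = 0."""
    x1, T1 = normalize(pts1)
    x2, T2 = normalize(pts2)
    # One row of the linear system A f = 0 per correspondence.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)                      # enforce rank 2
    F = U @ np.diag([S[0], S[1], 0]) @ Vt
    F = T2.T @ F @ T1                                # undo normalization
    return F / np.linalg.norm(F)

# Hypothetical correspondences: random 3D points seen by two simple cameras.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-1, 1, (12, 2)), rng.uniform(4, 8, 12)])
theta = 0.15
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta), np.cos(theta), 0],
              [0, 0, 1]])
t = np.array([0.5, 0.1, 0.2])
pts1 = 800 * X[:, :2] / X[:, 2:]                     # image 1 (pixels)
X2 = X @ R.T + t
pts2 = 800 * X2[:, :2] / X2[:, 2:]                   # image 2 (pixels)
F = fundamental_8point(pts1, pts2)
h = lambda p: np.column_stack([p, np.ones(len(p))])
print(np.abs(np.einsum('ij,jk,ik->i', h(pts2), F, h(pts1))).max())  # near-zero residuals
```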
2

Dayananda, Supriya. "Evaluation of remote sensing based spectral and 3D point cloud data for crop biomass estimation in southern India". Kassel: Universitätsbibliothek Kassel, 2019. http://d-nb.info/1202727409/34.

Full text
3

Alkhadour, Wissam Mohamad. "Reconstruction of 3D scenes from pairs of uncalibrated images : creation of an interactive system for extracting 3D data points and investigation of automatic techniques for generating dense 3D data maps from pairs of uncalibrated images for remote sensing applications". Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4933.

Full text
Abstract
Much research effort has been devoted to producing algorithms that contribute directly or indirectly to the extraction of 3D information from a wide variety of types of scenes and conditions of image capture. The research work presented in this thesis is aimed at three distinct applications in this area: interactively extracting 3D points from a pair of uncalibrated images in a flexible way; finding corresponding points automatically in high resolution images, particularly those of archaeological scenes captured from a freely moving light aircraft; and improving a correlation approach to dense disparity mapping leading to 3D surface reconstructions. The fundamental concepts required to describe the principles of stereo vision, the camera models, and the epipolar geometry described by the fundamental matrix are introduced, followed by a detailed literature review of existing methods. An interactive system for viewing a scene via a monochrome or colour anaglyph is presented which allows the user to choose the level of compromise between amount of colour and ghosting perceived by controlling colour saturation, and to choose the depth plane of interest. An improved method of extracting 3D coordinates from disparity values when there is significant error is presented. Interactive methods, while very flexible, require significant effort from the user finding and fusing corresponding points and the thesis continues by presenting several variants of existing scale invariant feature transform methods to automatically find correspondences in uncalibrated high resolution aerial images with improved speed and memory requirements. In addition, a contribution to estimating lens distortion correction by a Levenberg Marquard based method is presented; generating data strings for straight lines which are essential input for estimating lens distortion correction. The remainder of the thesis presents correlation based methods for generating dense disparity maps based on single and multiple image rectifications using sets of automatically found correspondences and demonstrates improvements obtained using the latter method. Some example views of point clouds for 3D surfaces produced from pairs of uncalibrated images using the methods presented in the thesis are included.
4

Kaynak, Burcak. "Assimilation of trace gas retrievals obtained from satellite (SCIAMACHY), aircraft and ground observations into a regional scale air quality model (CMAQ-DDM/3D)". Diss., Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/37134.

Full text
Abstract
A major opportunity for using satellite observations of tropospheric chemical concentrations is to improve our scientific understanding of atmospheric processes by integrated analysis of satellite, aircraft, and ground-based observations with global and regional scale models. One endpoint of such efforts is to reduce modeling biases and uncertainties. The idea of coupling these observations with a regional scale air quality model was the starting point of this research. The overall objective of this research was to improve the NOₓ emission inventories by integrating observations from different platforms and regional air quality modeling. Specific objectives were: 1) Comparison of satellite NO₂ retrievals with simulated NO₂ by the regional air quality model. Comparison of simulated tropospheric gas concentrations simulated by the regional air quality model, with aircraft and ground-based observations; 3) Assessment of the uncertainties in comparing satellite NO₂ retrievals with NOₓ emissions estimates and model simulations; 4) Identification of biases in emission inventories by data assimilation of satellite NO₂ retrievals, and ground-based NO, NO₂ and O₃ observations with an iterative inverse method using the regional air quality model coupled with sensitivity calculations; 5) Improvement of our understanding of NOₓ emissions, and the interaction between regional and global air pollution by an integrated analysis of satellite NO₂ retrievals with the regional air quality model. Along with these objectives, a lightning NOₓ emission inventory was prepared for two months of summer 2004 to account for a significant upper level NOₓ source. Spatially-resolved weekly NO₂ variations from satellite retrievals were compared with estimated NOₓ emissions for different region types. Data assimilation of satellite NO₂ retrievals, and ground-based NO, NO₂ and O₃ observations were performed to evaluate the NOₓ emission inventory. This research contributes to a better understanding of the use of satellite NO₂ retrievals in air quality modeling, and improvements in the NOₓ emission inventories by correcting some of the inconsistencies that were found in the inventories. Therefore, it may provide groups that develop emissions estimates guidance on areas for improvement. In addition, this research indicates the weaknesses and the strengths of the satellite NO₂ retrievals and offers suggestions to improve the quality of the retrievals for further use in the tropospheric air pollution research.
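To make the inversion idea in this abstract concrete, one crude relative of an iterative inverse method is a damped, mass-balance-style scaling of regional emissions toward agreement between modelled and observed NO₂ columns. The toy below assumes a linear stand-in forward operator `K`; it is not the CMAQ-DDM/3D sensitivity-based assimilation used in the dissertation.

```python
import numpy as np

def iterative_emission_adjustment(emissions, observed_no2, forward_model,
                                  n_iter=10, damping=0.5):
    """Toy iterative inversion: nudge each emission region toward agreement
    between modelled and observed NO2 via a damped multiplicative update."""
    e = emissions.copy()
    for _ in range(n_iter):
        simulated = forward_model(e)
        ratio = observed_no2 / np.maximum(simulated, 1e-12)
        e *= 1.0 + damping * (ratio - 1.0)            # mass-balance-style scaling
    return e

# Hypothetical setup: 5 emission regions and an assumed, mostly local sensitivity matrix K.
rng = np.random.default_rng(1)
K = np.diag(rng.uniform(0.8, 1.2, size=5)) + 0.1 * rng.uniform(size=(5, 5))
true_emissions = np.array([10.0, 20.0, 5.0, 8.0, 15.0])
prior_emissions = true_emissions * rng.uniform(0.5, 1.5, size=5)   # biased inventory
observations = K @ true_emissions                     # pseudo "satellite retrievals"
posterior = iterative_emission_adjustment(prior_emissions, observations, lambda e: K @ e)
print(np.round(posterior, 2), "vs true", true_emissions)
```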
5

Magnini, Luigi. "Remote sensing e object-based image analysis: metodologie di approccio per la creazione di standard archeologici". Doctoral thesis, Università degli studi di Padova, 2017. http://hdl.handle.net/11577/3423260.

Full text
Abstract
In recent years, the field of remote sensing experienced an incredible growth thanks to the increasing quality and variety of sensors and the reduction of instrumental costs. The benefits for archaeology were soon apparent. So far, data interpretation remains essentially a prerogative of the human operator and is mediated by his skills and experiences. The continuous increase of datasets volume, i.e. the Big Data Explosion, and the increasing necessity to work on large scale projects require an overall revision of the methods traditionally used in archeology. In this sense, the research presented hereinafter contributes to assess the limits and potential of the emerging field of object-based image analysis (OBIA). The work focused on the definition of OBIA protocols for the treatment of three-dimensional data acquired by airborne and terrestrial laser scanning through the development of a wide range of case studies, used to illustrate the possibilities of the method in archeology. The results include a new, automated approach to identify, map and quantify traces of the First World War landscape around Fort Lusern (Province of Trento, Italy) and the recalcified osteological tissue on the skulls of two burials in the protohistoric necropolis of Olmo di Nogara (Province of Verona, Italy). Moreover, the method was employed to create a predictive model to locate “control places” in mountainous environments; the simulation was built for the Western Asiago Plateau (Province of Vicenza, Italy) and then re-applied with success in basin of Bressanone (Province of Bolzano, Italy). The accuracy of the results was verified thanks to respectively ground surveys, remote cross-validation and comparison with published literature. This confirmed the potential of the methodology, giving reasons to introduce the concept of Archaeological Object-Based Image Analysis (ArchaeOBIA), used to highlight the role of object-based applications in archaeology.
6

Gentile, Vincenzo. "Integrazione di indagini geofisiche, dati satellitari e tecniche di rilievo 3D presso il sito archeologico di Egnazia". Doctoral thesis, Università degli studi del Molise, 2017. http://hdl.handle.net/11695/72901.

Full text
Abstract
Egnazia is an important archaeological site located in Puglia on the Adriatic coast between Bari and Brindisi. The oldest human settlement is dated back at the Bronze age (XVI century B.C.). The first urban system was created between the IV and III century B.C. and the typical roman structures were built between the Augustan age and the I century A.C. After the half of the IV century the settlement reduces its size in the old acropolis and it lasts until the XIII century A.C. In the roman city there is a complex road system characterized by a main road that travels through Egnazia towards North West-South East; it separates the public, productive and economic areas from the residential zone and it proceeds in the direction of Brindisi becoming an important point in the organization of the territory. This road has access to secondary axis which join or unite all the sectors of the city. In this thesis the results of a multidisciplinary research are presented. It was carried out with the purpose of understanding the road system of the city through the study of historical and modern maps, the analysis of multispectral, multi-temporal, multi-scalar aerial and satellite images (MIVIS, QuickBird, Google™ earth images), electromagnetic geophysical data and tridimensional survey (laser scanner) of an important structure like the cryptoporticus. The integration of different methodologies has enhanced the probability of success of the research since has provided objective information through the evaluation of diverse parameters describing the same situation. This scientific, technological and innovative multidisciplinary research was transferred and applied in different archaeological sites (the roman city of Doclea (Montenegro), the fortification of Ighram Aousser (Morocco), the archaeological site of Tell El Maskhuta (Egypt), Umm ar-Rasas (Jordan) and Gur (Iran) located in international countries and characterized by different geological and geographical conditions, with the collaboration between Italian and international institution of research.
7

Qi, Jiaguo. "Compositing multitemporal remote sensing data". Diss., The University of Arizona, 1993. http://hdl.handle.net/10150/186327.

Full text
Abstract
In order to reduce the problems of clouds, atmospheric variations, view angle effects, and the soil background variations in the high temporal frequency AVHRR data, a compositing technique is usually employed. Current compositing techniques use a single pixel selection criterion of outputting the input pixel of maximum value NDVI. Problems, however, exist due to the use of the NDVI classifier and to the imperfection of the pixel selection criteria of the algorithm itself. The NDVI was found not to have the maximum value under an ideal observation condition, while the single pixel selection criterion favors the large off-nadir sensor view angles. Consequently, the composited data still consist of substantial noise. To further reduce the noise, several data sets were obtained to study these external factor effects on the NDVI classifier and other vegetation indices. On the basis of the studies of these external factors, a new classifier was developed to further reduce the soil noise. Then, a new set of pixel selection criteria was proposed for compositing. The new compositing algorithm with the new classifier was used to composite two AVHRR data sets. The alternative approach showed that the high frequency noises were greatly reduced, while more valuable data were retained. The proposed alternative compositing algorithm not only further reduced the external factor related noises, but also retained more valuable data. In this dissertation, studies of external factor effects on remote sensing data and derived vegetation indices are presented in the first four chapters. Then the development of the new classifier and the alternative compositing algorithm were described. Perspectives and limitations of the proposed algorithms are also discussed.
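For reference, the conventional maximum-value NDVI composite that this dissertation critiques and improves upon reduces to a per-pixel argmax over acquisition dates. A minimal NumPy sketch with hypothetical reflectance stacks, not the improved classifier or selection criteria proposed in the work:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index."""
    return (nir - red) / np.maximum(nir + red, 1e-6)

def maximum_value_composite(red_stack, nir_stack):
    """Classic MVC: for every pixel, keep the date whose NDVI is largest.
    Inputs are (dates, rows, cols) stacks of red and near-infrared reflectance."""
    v = ndvi(red_stack, nir_stack)                 # (t, y, x) NDVI classifier
    best = np.argmax(v, axis=0)                    # winning date per pixel
    rows, cols = np.indices(best.shape)
    composite_ndvi = v[best, rows, cols]
    composite_red = red_stack[best, rows, cols]
    composite_nir = nir_stack[best, rows, cols]
    return composite_ndvi, composite_red, composite_nir, best

# Hypothetical 10-date, 4x4-pixel AVHRR-like stack in reflectance units.
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.30, size=(10, 4, 4))
nir = rng.uniform(0.20, 0.60, size=(10, 4, 4))
comp_ndvi, comp_red, comp_nir, chosen_date = maximum_value_composite(red, nir)
print(chosen_date)
```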
8

Lguensat, Redouane. "Learning from ocean remote sensing data". Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0050/document.

Full text
Abstract
Reconstructing geophysical fields from noisy and partial remote sensing observations is a classical problem well studied in the literature. Data assimilation is one class of popular methods to address this issue, and is done through the use of classical stochastic filtering techniques, such as ensemble Kalman or particle filters and smoothers. They proceed by an online evaluation of the physical model in order to provide a forecast for the state. Therefore, the performance of data assimilation heavily relies on the definition of the physical model. In contrast, the amount of observation and simulation data has grown very quickly in the last decades. This thesis focuses on performing data assimilation in a data-driven way and this without having access to explicit model equations. The main contribution of this thesis lies in developing and evaluating the Analog Data Assimilation (AnDA), which combines analog methods (nearest neighbors search) and stochastic filtering methods (Kalman filters, particle filters, Hidden Markov Models). Through applications to both simplified chaotic models and real ocean remote sensing case-studies (sea surface temperature, along-track sea level anomalies), we demonstrate the relevance of AnDA for missing data interpolation of nonlinear and high dimensional dynamical systems from irregularly-sampled and noisy observations. Driven by the rise of machine learning in the recent years, the last part of this thesis is dedicated to the development of deep learning models for the detection and tracking of ocean eddies from multi-source and/or multi-temporal data (e.g., SST-SSH), the general objective being to outperform expert-based approaches.
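The analog forecasting operator at the heart of AnDA can be sketched generically: the k nearest states in a catalogue of past states supply the forecast that a stochastic filter would otherwise obtain from a physical model. The toy below assumes a synthetic two-variable catalogue and an exponential distance weighting; the full AnDA couples this step with ensemble Kalman or particle filtering, which is omitted here.

```python
import numpy as np

def analog_forecast(state, catalog_now, catalog_next, k=10):
    """Analog forecasting step: find the k catalogue states closest to `state`
    and average their known successors to predict the next state."""
    dists = np.linalg.norm(catalog_now - state, axis=1)
    idx = np.argsort(dists)[:k]
    weights = np.exp(-dists[idx] / (dists[idx].mean() + 1e-12))
    weights /= weights.sum()
    return weights @ catalog_next[idx]

# Hypothetical catalogue built from a simulated two-variable dynamical system.
rng = np.random.default_rng(0)
t = np.linspace(0, 50, 2000)
trajectory = np.column_stack([np.sin(t), np.cos(t)]) + 0.01 * rng.normal(size=(2000, 2))
catalog_now, catalog_next = trajectory[:-1], trajectory[1:]

x = np.array([0.0, 1.0])                     # current (noisy) state estimate
print(analog_forecast(x, catalog_now, catalog_next))
```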
9

Amrani, Naoufal. "Spectral decorrelation for coding remote sensing data". Doctoral thesis, Universitat Autònoma de Barcelona, 2017. http://hdl.handle.net/10803/402237.

Full text
Abstract
Today remote sensing is essential for many applications addressed to Earth Observation. The potential capability of remote sensing in providing valuable information enables a better understanding of Earth characteristics and human activities. Recent advances in satellite sensors allow recovering large areas, producing images with unprecedented spatial, spectral and temporal resolution. This amount of data implies a need for efficient compression techniques to improve the capabilities of storage and transmissions. Most of these techniques are dominated by transforms or prediction methods. This thesis aims at deeply analyzing the state-of-the-art techniques and at providing efficient solutions that improve the compression of remote sensing data. In order to understand the non-linear independence and data compaction of hyperspectral images, we investigate the improvement of Principal Component Analysis (PCA) that provides optimal independence for Gaussian sources. We analyse the lossless coding efficiency of Principal Polynomial Analysis (PPA), which generalizes PCA by removing non-linear relations among components using polynomial regression. We show that principal components are not able to predict each other through polynomial regression, resulting in no improvement of PCA at the cost of higher complexity and larger amount of side information. This analysis allows us to understand better the concept of prediction in the transform domain for compression purposes. Therefore, rather than using expensive sophisticated transforms like PCA, we focus on theoretically suboptimal but simpler transforms like Discrete Wavelet Transform (DWT). Meanwhile, we adopt predictive techniques to exploit any remaining statistical dependence. Thus, we introduce a novel scheme, called Regression Wavelet Analysis (RWA), to increase the coefficient independence in remote sensing images. The algorithm employs multivariate regression to exploit the relationships among wavelet-transformed components. The proposed RWA has many important advantages, like the low complexity and no dynamic range expansion. Nevertheless, the most important advantage consists of its performance for lossless coding. Extensive experimental results over a wide range of sensors, such as AVIRIS, IASI and Hyperion, indicate that RWA outperforms the most prominent transforms like PCA and wavelets, and also the best recent coding standard, CCSDS-123. We extend the benefits of RWA to progressive lossy-to-lossless. We show that RWA can attain a rate-distortion performance superior to those obtained with the state-of-the-art techniques. To this end, we propose a Prediction Weighting Scheme that captures the prediction significance of each transformed components. The reason of using a weighting strategy is that coefficients with similar magnitude can have extremely different impact on the reconstruction quality. For a deeper analysis, we also investigate the bias in the least squares parameters, when coding with low bitrates. We show that the RWA parameters are unbiased for lossy coding, where the regression models are used not with the original transformed components, but with the recovered ones, which lack some information due to the lossy reconstruction. We show that hyperspectral images with large size in the spectral dimension can be coded via RWA without side information and at a lower computational cost. Finally, we introduce a very low-complexity version of RWA algorithm. 
Here, the prediction is based on only some few components, while the performance is maintained. When the complexity of RWA is taken to an extremely low level, a careful model selection is necessary. Contrary to expensive selection procedures, we propose a simple and efficient strategy called "neighbor selection" for using small regression models. On a set of well-known and representative hyperspectral images, these small models maintain the excellent coding performance of RWA, while reducing the computational cost by about 90%.
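The core RWA idea, a spectral Haar step followed by multivariate regression of detail coefficients on approximation coefficients, can be illustrated with a one-level floating-point sketch. The actual RWA is an integer, multi-level transform with specific regression models, so the following is only a conceptual illustration on a synthetic cube.

```python
import numpy as np

def haar_level(spectra):
    """One level of a Haar-style transform along the spectral axis:
    pairwise means (approximation) and pairwise differences (detail)."""
    even, odd = spectra[:, 0::2], spectra[:, 1::2]
    return (even + odd) / 2.0, even - odd

def regression_decorrelation(spectra):
    """Transform spectrally, then predict the detail coefficients from the
    approximation coefficients by multivariate least squares; only the
    residuals (plus the regression coefficients) would need to be coded."""
    approx, detail = haar_level(spectra)
    X = np.column_stack([approx, np.ones(len(approx))])      # regressors + bias
    coeffs, *_ = np.linalg.lstsq(X, detail, rcond=None)
    residual = detail - X @ coeffs
    return approx, residual, coeffs

# Hypothetical flattened cube (pixels, bands) with strongly correlated bands.
rng = np.random.default_rng(0)
base = rng.normal(size=(500, 1))                             # shared signal
gains = np.linspace(0.5, 1.5, 64)                            # band-dependent response
spectra = base * gains + 0.01 * rng.normal(size=(500, 64))
approx, residual, coeffs = regression_decorrelation(spectra)
print("detail variance:", np.var(spectra[:, 0::2] - spectra[:, 1::2]))
print("residual variance:", np.var(residual))
# Invertibility check: the even bands are recovered exactly from the kept data.
detail_rec = np.column_stack([approx, np.ones(len(approx))]) @ coeffs + residual
print("max reconstruction error:", np.abs(approx + detail_rec / 2 - spectra[:, 0::2]).max())
```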
10

Wende, Jon T. "Predicting soil strength with remote sensing data". Thesis, Monterey, California. Naval Postgraduate School, 2010. http://hdl.handle.net/10945/5174.

Full text
Abstract
Approved for public release; distribution is unlimited
Predicting soil strength from hyperspectral imagery enables amphibious planners to determine trafficability in the littorals. Trafficability maps can then be generated and used during the intelligence preparation of the battlespace allowing amphibious planners to select a suitable landing zone. In February and March 2010, the Naval Research Laboratory sponsored a multi-sensor remote sensing and field calibration and field validation campaign (CNMI'10). The team traveled to the islands of Pagan, Tinian, and Guam located in the Marianas archipelago. Airborne hyperspectral imagery along with ground truth data was collected from shallow water lagoons, beachfronts, vegetation, and anomalies such as World War II relics. In this thesis, beachfront hyperspectral data obtained on site was used as a reference library for evaluation against airborne hyperspectral data and ground truth data in order to determine soil strength for creating trafficability maps. Evaluation of the airborne hyperspectral images was accomplished by comparing the reference library spectra to the airborne images. The spectral angle between the reference library and airborne images was calculated, producing the trafficability maps amphibious planners can use during the intelligence preparation of the battlespace.
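The spectral-angle comparison mentioned in this abstract is the standard spectral angle mapper. The sketch below shows that generic step with a hypothetical three-spectrum library; turning the resulting angles into soil-strength or trafficability classes would require the ground-truth calibration described in the thesis.

```python
import numpy as np

def spectral_angle(spectra, reference):
    """Spectral angle (radians) between each spectrum and one reference."""
    num = spectra @ reference
    denom = np.linalg.norm(spectra, axis=-1) * np.linalg.norm(reference)
    return np.arccos(np.clip(num / np.maximum(denom, 1e-12), -1.0, 1.0))

def classify_by_angle(cube, library):
    """Assign each pixel to the library spectrum with the smallest angle.
    cube is (rows, cols, bands); library is (classes, bands)."""
    pixels = cube.reshape(-1, cube.shape[-1])
    angles = np.stack([spectral_angle(pixels, ref) for ref in library], axis=1)
    labels = angles.argmin(axis=1).reshape(cube.shape[:2])
    return labels, angles.min(axis=1).reshape(cube.shape[:2])

# Hypothetical 3-entry reference library (e.g. dry sand, wet sand, vegetation)
# and a small hyperspectral chip built from it plus noise.
rng = np.random.default_rng(0)
library = rng.uniform(0.1, 0.9, size=(3, 50))
cube = library[rng.integers(0, 3, size=(20, 20))] + 0.02 * rng.normal(size=(20, 20, 50))
labels, min_angle = classify_by_angle(cube, library)
print(labels[:3, :3])
print("mean best-match angle (rad):", min_angle.mean())
```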
11

Cabrera-Mercader, Carlos R. (Carlos Rubén). "Robust compression of multispectral remote sensing data". Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/9338.

Full text
Abstract
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999.
Includes bibliographical references (p. 241-246).
This thesis develops efficient and robust non-reversible coding algorithms for multispectral remote sensing data. Although many efficient non-reversible coding algorithms have been proposed for such data, their application is often limited due to the risk of excessively degrading the data if, for example, changes in sensor characteristics and atmospheric/surface statistics occur. On the other hand, reversible coding algorithms are inherently robust to variable conditions but they provide only limited compression when applied to data from most modern remote sensors. The algorithms developed in this work achieve high data compression by preserving only data variations containing information about the ideal, noiseless spectrum, and by exploiting inter-channel correlations in the data. The algorithms operate on calibrated data modeled as the sum of the ideal spectrum, and an independent noise component due to sensor noise, calibration error, and, possibly, impulsive noise. Coding algorithms are developed for data with and without impulsive noise. In both cases an estimate of the ideal spectrum is computed first, and then that estimate is coded efficiently. This estimator coder structure is implemented mainly using data-dependent matrix operators and scalar quantization. Both coding algorithms are robust to slow instrument drift, addressed by appropriate calibration, and outlier channels. The outliers are preserved by separately coding the noise estimates in addition to the signal estimates so that they may be reconstructed at the original resolution. In addition, for data free of impulsive noise the coding algorithm adapts to changes in the second-order statistics of the data by estimating those statistics from each block of data to be coded. The coding algorithms were tested on data simulated for the NASA 2378-channel Atmospheric Infrared Sounder (AIRS). Near-lossless compression ratios of up to 32:1 (0.4 bits/pixel/channel) were obtained in the absence of impulsive noise, without preserving outliers, and assuming the nominal noise covariance. An average noise variance reduction of 12-14 dB was obtained simultaneously for data blocks of 2400-7200 spectra. Preserving outlier channels for which the noise estimates exceed three times the estimated noise rms value would require no more than 0.08 bits/pixel/channel provided the outliers arise from the assumed noise distribution. If contaminant outliers occurred, higher bit rates would be required. Similar performance was obtained for spectra corrupted by few impulses.
by Carlos R. Cabrera-Mercader.
Ph.D.
12

Álvarez, Cortés Sara. "Pyramidal regression-based coding for remote sensing data". Doctoral thesis, Universitat Autònoma de Barcelona, 2019. http://hdl.handle.net/10803/667742.

Full text
Abstract
Remote sensing hyperspectral data have hundreds or thousands of spectral components from very similar wavelengths. To store and transmit it entails excessive demands on bandwidth and on on-board memory resources, which are already strongly restricted. This leads to stop capturing data or to discard some of the already recorded information without further processing. To alleviate these limitations, data compression techniques are applied. Besides, sensors' technology is continuously evolving, acquiring higher dimensional data. Consequently, in order to not jeopardize future space mission's performance, more competitive compression methods are required. Regression Wavelet Analysis (RWA) is the state-of-the-art lossless compression method regarding the trade-off between computational complexity and coding performance. RWA is introduced as a lossless spectral transform followed by JPEG 2000. It applies a Haar Discrete Wavelet Transform (DWT) decomposition and sequentially a regression operation. Several regression models (Maximum, Restricted and Parsimonious) and variants (only for the Maximum model) have been proposed. With the motivation of outperforming the latest compression techniques for remote sensing data, we began focusing on improving the coding performance and/or the computational complexity of RWA. First, we conducted an exhaustive research of the influence of replacing the underlying wavelet filter of RWA by more competitive Integer Wavelet Transforms (in terms of energy compaction). To this end, we reformulated the Restricted model, reducing the execution time, increasing the compression ratio, and preserving some degree of component-scalability. Besides, we showed that the regression variants are also feasible to apply to other models, decreasing their computational complexity while scarcely penalizing the coding performance. As compared to other lowest- and highest-complex techniques, our new configurations provide, respectively, better or similar compression ratios. After gaining a comprehensive understanding of the behavior of each operation block, we described the impact of applying a Predictive Weighting Scheme (PWS) in the Progressive Lossy-to-Lossless (PLL) compression performance. PLL decoding is possible thanks to the use of the rate control system of JPEG 2000. Applying this PWS to all the regression models and variants of RWA coupled by JPEG 2000 (PWS-RWA + JPEG 2000) produces superior outcomes, even for multi-class digital classification. From experimentation, we concluded that improved coding performance does not necessarily entail better classification outcomes. Indeed, in comparison with other widespread techniques that obtain better rate-distortion results, PWS-RWA + JPEG 2000 yields better classification outcomes when the distortion in the recovered scene is high. Moreover, the weighted framework presents far more stable classification versus bitrate trade-off. JPEG 2000 may be too computationally expensive for on-board computation. In order to obtain a cheaper implementation, we render results for RWA followed by another coder amenable for on-board operation. This framework includes the operation of a smart and simple criterion aiming at the lowest bitrates. This final pipeline outperforms the original RWA + JPEG 2000 and other state-of-the-art lossless techniques by obtaining average coding gains between 0.10 to 1.35 bits-per-pixel-per-component. 
Finally, we present the first lossless/near-lossless compression technique based on regression in a pyramidal multiresolution scheme. It expands RWA by introducing quantization and a feedback loop to control independently the quantization error in each decomposition level, while preserving the computational complexity. To this end, we provide a mathematical formulation that limits the maximum admissible absolute error in reconstruction. Moreover, we tackle the inconvenience of proving the huge number of possible quantization steps combinations by establishing a quantization steps-allocation definition. Our approach, named NLRWA, attains competitive coding performance and superior scene's quality retrieval. In addition, when coupled with a bitplane entropy encoder, NLRWA supports progressive lossy-to-lossless/near-lossless compression and some degree of embeddedness.
13

Kressler, Florian. "The Integration of Remote Sensing and Ancillary Data". WU Vienna University of Economics and Business, 1996. http://epub.wu.ac.at/4256/1/WSG_RR_0896.pdf.

Full text
Abstract
Obtaining up-to-date information concerning the environment at reasonable costs is a challenge faced by many institutions today. Satellite images meet both demands and thus present a very attractive source of information. The following thesis deals with the comparison of satellite images and a vector based land use data base of the City of Vienna. The satellite data is transformed using the spectral mixture analysis, which allows an investigation at a sub-pixel level. The results of the transformation are used to determine how suitable this spectral mixture analysis is to distinguish different land use classes in an urban area. In a next step the results of the spectral mixture analysis of two different images (recorded in 1986 and 1991) are used to undertake a change detection. The aim is to show those areas, where building activities have taken place. This information may aid the update of data bases, by limiting a detailed examination of an area to those areas, which show up as changes in the change detection. The proposed method is a fast and inexpensive way of analysing large areas and highlighting those areas where changes have taken place. It is not limited to urban areas but may easily be adapted for different environments. (author's abstract)
Series: Research Reports of the Institute for Economic Geography and GIScience
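The sub-pixel step referred to here, linear spectral mixture analysis, amounts to a per-pixel least-squares estimate of endmember abundances. Below is a minimal unconstrained sketch with hypothetical urban endmembers; the thesis then compares fraction images from 1986 and 1991 for change detection.

```python
import numpy as np

def unmix(pixels, endmembers):
    """Unconstrained linear spectral mixture analysis: per-pixel least-squares
    abundances a such that pixel ~ endmembers.T @ a."""
    abundances, *_ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    return abundances.T                               # (n_pixels, n_endmembers)

# Hypothetical urban endmembers over six bands (values are illustrative only).
endmembers = np.array([
    [0.05, 0.08, 0.06, 0.45, 0.40, 0.35],   # vegetation (NIR plateau)
    [0.22, 0.22, 0.23, 0.23, 0.24, 0.24],   # built-up / impervious (flat)
    [0.15, 0.20, 0.26, 0.32, 0.38, 0.42],   # bare soil (rising)
])
rng = np.random.default_rng(0)
true_fractions = rng.dirichlet(np.ones(3), size=100)  # sub-pixel cover, sums to one
pixels = true_fractions @ endmembers + 0.005 * rng.normal(size=(100, 6))
estimated = unmix(pixels, endmembers)
print("max abundance error:", np.abs(estimated - true_fractions).max())
```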
14

Jia, Xiuping. "Classification techniques for hyperspectral remote sensing image data". Thesis, University of New South Wales - Australian Defence Force Academy, School of Electrical Engineering, 1996. http://handle.unsw.edu.au/1959.4/38713.

Full text
Abstract
Hyperspectral remote sensing image data, such as that recorded by AVIRIS with 224 spectral bands, provides rich information on ground cover types. However, it presents new problems in machine assisted interpretation, mainly in long processing times and the difficulties of class training due to the low ratio of number of training samples to the number of bands. This thesis investigates feasible and efficient feature reduction and image classification techniques which are appropriate for hyperspectral image data. The study is reported in three parts. The first concerns a deterministic approach for hyperspectral data interpretation. Multigroup and multiple threshold spectral coding procedures, and associated techniques for spectral matching and classification, are proposed and tested. By coding on subgroups of bands using one or three thresholds, spectral searching and matching becomes simple, fast and free of the need for radiometric correction. Modifications of existing statistical techniques are proposed in the second part of the investigation A block-based maximum likelihood classification technique is developed. Several subgroups are formed from the complete set of spectral bands in the data, based on the properties of global correlation among the bands. Subgroups which are poorly correlated with each other are treated independently using conventional maximum likelihood classification. Experimental results demonstrate that, when using appropriate subgroup sizes, the new method provides a compromise among classification accuracy, processing time and available training pixels. Furthermore, a segmented, and possibly multi-layer, principal components transformation is proposed as a possible feature reduction technique prior to classification, and for effective colour display. The transformation is performed efficiently on each of the highly correlated subgroups of bands independently. Selected features from each transformed subgroup can be then transformed again to achieve a satisfactory data reduction ratio and to generate the three most significant components for colour display. Classification accuracy is improved and high quality colour image display is achieved in experiments using two AVIRIS data sets.
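The segmented principal components transform described above runs PCA independently on subgroups of highly correlated bands and keeps a few components per subgroup. The sketch below uses fixed contiguous subgroups on a synthetic cube; the thesis forms the subgroups from the global band-correlation structure instead.

```python
import numpy as np

def segmented_pca(cube, groups, keep=3):
    """Segmented principal components transform: run PCA independently on each
    (highly correlated) subgroup of bands and keep `keep` components per group."""
    pixels = cube.reshape(-1, cube.shape[-1])
    features = []
    for bands in groups:
        x = pixels[:, bands] - pixels[:, bands].mean(axis=0)
        cov = np.cov(x, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
        top = eigvecs[:, ::-1][:, :keep]                # leading components
        features.append(x @ top)
    return np.concatenate(features, axis=1).reshape(cube.shape[0], cube.shape[1], -1)

# Hypothetical 224-band AVIRIS-like chip split into 4 contiguous subgroups.
rng = np.random.default_rng(0)
cube = rng.normal(size=(32, 32, 224)).cumsum(axis=-1)   # spectrally correlated bands
groups = [range(0, 56), range(56, 112), range(112, 168), range(168, 224)]
reduced = segmented_pca(cube, groups, keep=3)
print(cube.shape, "->", reduced.shape)                  # (32, 32, 224) -> (32, 32, 12)
```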
15

Slone, Ambrose J. (Ambrose Jay). "Improved remote sensing data analysis using neural networks". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11461.

Full text
Abstract
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995.
Includes bibliographical references (leaf 115).
by Ambrose J. Slone.
M.Eng.
16

Ogutu, Booker. "Modelling terrestrial ecosystem productivity using remote sensing data". Thesis, University of Southampton, 2012. https://eprints.soton.ac.uk/341720/.

Full text
Abstract
Production efficiency models (PEMs) have been developed to aid with the estimation of terrestrial ecosystems productivity where large spatial scales make direct measurement impractical. One of the key datasets used in these models is the fraction of photosynthetic active radiation absorbed by vegetation (FAPAR). FAPAR is the single variable that represents vegetation function and structure in these models and hence its accurate estimation is essential. This thesis focused on improving the estimation of FAPAR and developing a new PEM model that utilises the improved FAPAR data. Foremost, the accuracy of operational LAI/FAPAR products (i.e. MGVI, MODIS LAI/FAPAR, CYCLOPES LAI/FAPAR, GLOBCARBON LAI/FAPAR, and NN-MERIS LAI TOC algorithm) over a deciduous broadleaf forest was investigated. This analysis showed that the products varied in their prediction of in-situ FAPAR/LAI measurements mainly due to differences in their definition and derivation procedures. The performance of three PEMs (i.e. Carnegie-CASA, C-Fix and MOD17GPP) in simulating gross primary productivity (GPP) across various biomes was then analysed. It was shown that structural differences in these models influenced their accuracy. Next, the influence of two FAPAR products (MODIS and CYCLOPES) on ecosystem productivity modelling was analysed. Both products were found to result in overestimation of in-situ GPP measurements. This was attributed to the lack of correction for PAR absorbed by the non-photosynthetic components of the canopy by the two products. Only PAR absorbed by chlorophyll in the leaves (FAPAR chlorophyll) is used in photosynthesis and hence it was hypothesised that deriving and using this variable would improve GPP predictions. Therefore, various components of FAPAR (i.e. FAPAR canopy, FAPAR leaf and FAPAR chlorophyll) were estimated using data from a radiative transfer model (PROSAIL-2).The FAPAR components were then related to two sets of vegetation indices (i.e. broad-band: NDVI and EVI, and red-edge: MTCI and CIred-edge). The red-edge based indices were found to be more linearly related to FAPAR chlorophyll than the broad-band indices. These findings were also supported by data from two flux tower sites, where the FAPAR chlorophyll was estimated through inversion of net ecosystem exchange data and was found to be better related to a red-edge based index (i.e. MTCI).Based on these findings a new PEM (i.e. MTCIGPP) was developed to (i) use the MTCI as a surrogate of FAPAR chlorophyll and (ii) incorporate distinct quantum yield terms between the two key plant photosynthetic pathways (i.e. C3 and C4) rather than using species-specific light use efficiency. The GPP predictions from the MTCIGPP model had strong relationship with the in-situ GPP measurements. Furthermore, GPP from the MTCIGPP model were comparable to the MOD17GPP product and better in some biomes (e.g. croplands). The MTCIGPP model is simple and easy to implement, yet provides a reliable measure of terrestrial GPP and has the potential to estimate global terrestrial carbon flux.
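The production-efficiency-model logic underlying the proposed MTCIGPP model, GPP as the product of absorbed light and an efficiency term with distinct C3/C4 quantum yields, can be written down in a few lines. The efficiency values and FAPAR-chlorophyll inputs below are placeholders, not the calibrated MTCIGPP parameters or the MTCI-based retrieval developed in the thesis.

```python
import numpy as np

def pem_gpp(par, fapar_chl, pathway, eps_c3=1.8, eps_c4=2.7, env_scalar=1.0):
    """Generic production-efficiency-model step:
    GPP = epsilon * FAPAR_chlorophyll * PAR, with separate (assumed, placeholder)
    light-use-efficiency values for C3 and C4 vegetation and an optional
    environmental down-regulation scalar."""
    eps = np.where(pathway == "C4", eps_c4, eps_c3)   # gC per MJ of APAR (assumed)
    return eps * env_scalar * fapar_chl * par

# Hypothetical daily inputs for three pixels (PAR in MJ m-2 day-1).
par = np.array([8.0, 9.5, 7.2])
fapar_chl = np.array([0.45, 0.60, 0.30])              # e.g. inferred from a red-edge index
pathway = np.array(["C3", "C4", "C3"])
print(pem_gpp(par, fapar_chl, pathway))               # GPP in gC m-2 day-1
```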
17

Dalton, Aaron James. "Autonomous Vehicle Path Planning with Remote Sensing Data". Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/36204.

Full text
Abstract
Long range path planning for an autonomous ground vehicle with minimal a-priori data is still very much an open problem. Previous research has demonstrated that least cost paths generated from aerial LIDAR and GIS data could play a role in automatically determining suitable routes over otherwise unknown terrain. However, most of this research has been theoretical. Consequently, there is very little literature on the effectiveness of these techniques in plotting paths of an actual autonomous vehicle. This research aims to develop an algorithm for using aerial LIDAR and imagery to plan paths for a full size autonomous car. Methods of identifying obstacles and potential roadways from the aerial LIDAR and imagery are reviewed. A scheme for integrating the path planning algorithms into the autonomous vehicle's existing systems was developed and eight paths were generated and driven by an autonomous vehicle. The paths were then analyzed for their drivability and the model itself was validated against the vehicle measurements. The methods described were found to be suitable for generating paths both on and off road.
Master of Science
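Least-cost path planning over a cost raster, the technique this thesis applies to aerial-LIDAR-derived terrain costs, is commonly implemented with Dijkstra's algorithm on an 8-connected grid. The sketch below uses a hypothetical cost surface; deriving real costs from LIDAR obstacles and imagery-detected roads is the part specific to the thesis and is not shown.

```python
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra over an 8-connected cost raster; returns cells from start to goal."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue                                   # stale heap entry
        for dr, dc in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Step cost: move length times the mean traversal cost of both cells.
                nd = d + np.hypot(dr, dc) * 0.5 * (cost[r, c] + cost[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# Hypothetical traversal-cost raster: low cost everywhere, one high-cost obstacle band.
cost = np.ones((50, 50))
cost[20:30, 10:45] = 50.0                              # e.g. steep or obstacle-rich terrain
route = least_cost_path(cost, (0, 0), (49, 49))
print(len(route), route[:5])
```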
18

Garner, Jamada J. "Scene classification using high spatial resolution multispectral data". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FGarner.pdf.

Full text
19

Doldirina, Catherine. "The common good and access to remote sensing data". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=104766.

Full text
Abstract
The thesis represents a search for the appropriate regime for protecting remote sensing data and information. Based on the technical and societal characteristics of this type of data, it argues in favour of the necessity to secure access to it. Using the regulatory examples of the USA and Europe, the research compares the effectiveness of such relevant legal regimes as intellectual property protection, in particular, copyright on the one hand, and the regulation of public sector information on the other. On the basis of this analysis the argument is made, that the unnecessary commodification of remote sensing data through private property-like protection regime will adversely influence their use and diminish their value. The principle of sharing, based on the theories of common property and the common good, is proposed as the best and most appropriate solution to avoid development of such a scenario. Its viability and effectiveness lies in the emphasis on the balance between the private and the public in the achievement of the common good of a better life that today manifests itself inter alia in being information rich. The principle of sharing has survived centuries of philosophical thought and is relevant today, particularly with regard to the establishment of the protection and distribution regime for remote sensing data, as the highlighted examples of geographic information infrastructures and the Geographic Earth Observation System of Systems show. The metaphor of information as a waterway rounds up the discussion regarding the relevance of the principle of sharing and emphasises the indispensability of the access-to-data oriented approach to the regulation of relationships over the generation, distribution and use of remote sensing data and information.
Los estilos APA, Harvard, Vancouver, ISO, etc.
20

Lewis, Sian Patricia. "Mapping forest parameters using geostatistics and remote sensing data". Thesis, University College London (University of London), 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.407744.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
21

Primus, Ida. "Scale-recursive estimation of precipitation using remote sensing data". Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/10852.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
22

Alam, Mohammad Tanveer. "Image Classification for Remote Sensing Using Data-Mining Techniques". Youngstown State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1313003161.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
23

Wessollek, Christine, Pierre Karrasch y Marie-Luise Kautz. "Surface irradiance estimations on watercourses with remote sensing data". SPIE, 2018. https://tud.qucosa.de/id/qucosa%3A35177.

Texto completo
Resumen
The vegetation in the riparian zone of a watercourse influences the water state in multiple ways, first via direct substance discharge and secondly via shadow casting on the water surface. Shadowing directly regulates the solar radiant energy arriving at the water surface. Solar radiation input to aquatic environments is the most important abiotic factor for the habitat development of aquatic flora and fauna. Thus, to adequately assess the ecological state of watercourses, it is necessary to quantify the solar surface irradiance E (W/m²) arriving at the water surface. When estimating the solar surface irradiance, the complex interactions between incoming solar radiation, atmospheric influences, and spatial-temporal geometries need to be investigated. This work established a workflow to compute the solar surface irradiance for water bodies using different remote sensing data. The workflow was tested at the regional level for a section of the river Freiberger Mulde, Saxony, for the year 2016. The product of the calculations is a map visualising the annual sum of the solar surface irradiance (kWh/m²) arriving at the Freiberger Mulde water surface and on the surrounding terrain. Based on this information, bio-hydrological questions can be examined further.
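The accumulation step of such a workflow can be sketched very simply: hourly irradiance modulated by a shadow fraction, summed into an annual total in kWh/m². The array shapes, the constant shading, and the toy values below are assumptions for illustration only, not the published workflow.

```python
import numpy as np

def annual_irradiance(hourly_ghi, shadow_fraction):
    """Accumulate hourly surface irradiance into an annual sum (kWh/m^2).

    hourly_ghi      : array (8760, rows, cols) of global horizontal
                      irradiance in W/m^2 for each hour of the year.
    shadow_fraction : array of the same shape with values in [0, 1],
                      where 1 means fully shaded by riparian vegetation.
    """
    # Irradiance actually reaching the water surface after shading.
    effective = hourly_ghi * (1.0 - shadow_fraction)
    # W/m^2 over one hour -> Wh/m^2; sum over the year, convert to kWh/m^2.
    return effective.sum(axis=0) / 1000.0

# Toy example: constant 500 W/m^2 for every hour and 30 % average shading.
ghi = np.full((8760, 4, 4), 500.0)
shade = np.full((8760, 4, 4), 0.3)
annual = annual_irradiance(ghi, shade)   # about 3066 kWh/m^2 per cell
```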
Los estilos APA, Harvard, Vancouver, ISO, etc.
24

Firoozi, Nejad Behnam. "Population mapping using census data, GIS and remote sensing". Thesis, Queen's University Belfast, 2016. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.705917.

Texto completo
Resumen
This thesis assesses approaches to population surface modelling by pulling together the benefits of reference gridded population data with local regression procedures and geographically weighted regression. This study provides a more detailed assessment of surface-modelling accuracy than was achieved in previous studies and assesses the factors which explain errors in the predictions. The primary aim of this thesis is to evaluate Martin's (1989) population surface modelling approach and also to design and implement a method using secondary data, suitable for application in England and Wales. This research is based on the idea that population data reported for a single zone can be redistributed within the zone using local parameters such as housing density, with a weighted sum performing the spatial redistribution. The thesis also aims to make use of remote sensing (RS) data and image processing techniques, such as maximum likelihood classification and the normalised difference vegetation index, to identify (un)populated cells. The potential of Landsat images and RS data analysis is assessed particularly for countries where high-quality land use data are not readily obtainable and their generation is not feasible in the near future. This thesis focuses on the identification of unpopulated cells, rather than populated units, using RS data. Case studies make use of data from Northern Ireland (NI) and Jonkoping in southern Sweden. The outcomes indicate the impact of population density, population variance, and the resolution of source zones on the accuracy of population allocation to grid cells using Martin's (1989) model. The results show significant accuracy in prediction to 100 m cells using an alternative approach based on settlement data for NI, and this is recommended as an alternative method for England and Wales. It is also concluded that there is potential to generate population surfaces using Landsat data for areas where local residential data are not easily accessible.
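The weighted redistribution idea described here can be shown with a minimal sketch: each zone's census total is spread over its grid cells in proportion to a local weight such as housing density, so that the cells of every zone sum back to the census count. The weight choice and the function below are simplifying assumptions, not Martin's (1989) kernel-based model.

```python
import numpy as np

def redistribute_population(zone_id, zone_total, weight):
    """Spread each zone's population over its cells by a weighted sum.

    zone_id    : 2D int array labelling the source zone of each cell.
    zone_total : dict mapping zone label -> census population count.
    weight     : 2D array of per-cell weights (e.g. housing density);
                 cells with weight 0 receive no population.
    Returns a 2D array of estimated population per cell.
    """
    pop = np.zeros_like(weight, dtype=float)
    for zone, total in zone_total.items():
        mask = zone_id == zone
        w_sum = weight[mask].sum()
        if w_sum > 0:
            pop[mask] = total * weight[mask] / w_sum
    return pop

# Toy example: two zones, population concentrated where housing density is high.
zones = np.array([[1, 1, 2, 2],
                  [1, 1, 2, 2]])
density = np.array([[0.0, 4.0, 1.0, 1.0],
                    [2.0, 2.0, 0.0, 3.0]])
grid = redistribute_population(zones, {1: 800, 2: 500}, density)
# Cells of each zone sum back to the census total (volume-preserving property).
```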
Los estilos APA, Harvard, Vancouver, ISO, etc.
25

Perez, Luis Ernesto. "A virtual supermarket for remote sensing data and images". To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2008. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
26

Houser, Paul Raymond 1970. "Remote-Sensing Soil Moisture Using Four-Dimensional Data Assimilation". Diss., The University of Arizona, 1996. http://hdl.handle.net/10150/191208.

Texto completo
Resumen
The feasibility of synthesizing distributed fields of remotely-sensed soil moisture through the novel application of four-dimensional data assimilation in a hydrological model was explored in this study. Six Push Broom Microwave Radiometer images gathered over Walnut Gulch, Arizona, were assimilated into the TOPLATS hydrological model. Several alternative assimilation procedures were implemented, including a method that adjusted the statistics of the modeled field to match those in the remotely sensed image, and the more sophisticated, traditional methods of statistical interpolation and Newtonian nudging. The high observation density characteristic of remotely-sensed imagery poses a massive computational burden when used with statistical interpolation, necessitating observation reduction through subsampling or averaging. For Newtonian nudging, the high observation density compromises the conventional weighting assumptions, requiring modified weighting procedures. Remotely-sensed soil moisture images were found to contain horizontal correlations that change with time and have length scales of several tens of kilometers, presumably because they are dependent on antecedent precipitation patterns. Such correlation therefore has a horizontal length scale, extending beyond the remotely sensed region, that approaches or exceeds the catchment scale. This suggests that remotely-sensed information can be advected beyond the image area and across the whole catchment. The remotely-sensed data were available for a short period, providing limited opportunity to investigate the effectiveness of the surface-subsurface coupling provided by alternative assimilation procedures. Surface observations were advected into the subsurface using incomplete knowledge of the surface-subsurface correlation, measured at only two sites. It is perceived that improved vertical correlation specification will be needed for optimal soil moisture assimilation. Based on direct measurement comparisons and the plausibility of the synthetic soil moisture patterns, the Newtonian nudging assimilation procedures were preferred because they preserved the observed patterns within the sampled region while also calculating plausible patterns in unmeasured regions. Statistical interpolation reduced to the trivial limit of direct data insertion in the sampled region and gave less plausible patterns outside this region. Matching the statistics of the modeled fields to those observed provided plausible patterns, but the observed patterns within the sampled area were largely lost.
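The core of a Newtonian-nudging-style update can be sketched in a few lines: the modeled field is relaxed toward an observation with a strength factor and a spatial influence weight. The single point observation, the Gaussian weight, and the parameter values below are assumptions made for illustration; the study's actual weighting procedures are more elaborate.

```python
import numpy as np

def nudge_soil_moisture(theta_model, theta_obs, obs_rc, g=0.5, radius=10.0):
    """One Newtonian-nudging-style correction of a modeled soil moisture field.

    theta_model : 2D array of modeled volumetric soil moisture.
    theta_obs   : observed value at a single pixel.
    obs_rc      : (row, col) location of the observation.
    g           : nudging strength in [0, 1].
    radius      : e-folding distance (in cells) of the spatial weight.
    """
    rows, cols = theta_model.shape
    rr, cc = np.mgrid[0:rows, 0:cols]
    dist = np.hypot(rr - obs_rc[0], cc - obs_rc[1])
    w = np.exp(-(dist / radius) ** 2)          # spatial influence of the observation
    innovation = theta_obs - theta_model[obs_rc]
    return theta_model + g * w * innovation

# Toy example: a dry model field corrected toward a wetter point observation.
field = np.full((50, 50), 0.10)
updated = nudge_soil_moisture(field, 0.25, (25, 25))
```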
Los estilos APA, Harvard, Vancouver, ISO, etc.
27

Sheffield, Kathryn Jane. "Multi-spectral remote sensing of native vegetation condition". RMIT University. Mathematical and Geospatial Sciences, 2009. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20091110.112816.

Texto completo
Resumen
Native vegetation condition provides an indication of the state of vegetation health or function relative to a stated objective or benchmark. Measures of vegetation condition provide an indication of the vegetation's capacity to provide habitat for a range of species and ecosystem functions through the assessment of selected vegetation attributes. Subsets of vegetation attributes are often combined into vegetation condition indices or metrics, which are used to provide information for natural resource management. Despite their value as surrogates of biota and ecosystem function, measures of vegetation condition are rarely used to inform biodiversity assessments at scales beyond individual stands. The extension of vegetation condition information across landscapes, and approaches for achieving this, using remote sensing technologies, is a key focus of the work presented in this thesis. The aim of this research is to assess the utility of multi-spectral remotely sensed data for the recovery of stand-level attributes of native vegetation condition at landscape scales. The use of remotely sensed data for the assessment of vegetation condition attributes in fragmented landscapes is a focus of this study. The influence of a number of practical issues, such as spatial scale and ground data sampling methodology, are also explored. This study sets limitations on the use of this technology for vegetation condition assessment and also demonstrates the practical impact of data quality issues that are frequently encountered in these types of applied integrated approaches. The work presented in this thesis demonstrates that while some measures of vegetation condition, such as vegetation cover and stem density, are readily recoverable from multi-spectral remotely sensed data, others, such as hollow-bearing trees and log length, are not easily derived from this type of data. The types of information derived from remotely sensed data, such as texture measures and vegetation indices, that are useful for vegetation condition assessments of this nature are also highlighted. The utility of multi-spectral remotely sensed data for the assessment of stand-level vegetation condition attributes is highly dependent on a number of factors including the type of attribute being measured, the characteristics of the vegetation, the sensor characteristics (i.e. the spatial, spectral, temporal, and radiometric resolution), and other spatial data quality considerations, such as site homogeneity and spatial scale. A series of case studies are presented in this thesis that explores the effects of these factors. These case studies demonstrate the importance of different aspects of spatial data and how data manipulation can greatly affect the derived relationships between vegetation attributes and remotely sensed data. The work documented in this thesis provides an assessment of what can be achieved from two sources of multi-spectral imagery in terms of recovery of individual vegetation attributes from remotely sensed data. Potential surrogate measures of vegetation condition that can be derived across broad scales are identified. This information could provide a basis for the development of landscape scale multi-spectral remotely sensed based vegetation condition assessment approaches, supplementing information provided by established site-based vegetation condition assessment approaches.
Los estilos APA, Harvard, Vancouver, ISO, etc.
28

Akkok, Inci. "Geological Mapping Using Remote Sensing Technologies". Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12610626/index.pdf.

Texto completo
Resumen
In the area of interest, the Sivas Basin, Turkey, where most of the units are sedimentary and show similar spectral characteristics, the spectral bands of the ASTER sensor alone may not be sufficient. Therefore, it is reasonable to consider other aspects, such as morphological variables, in addition to spectral classifiers. The main objective of this study is to test the usefulness of integrating spectral analysis and morphological information for geological mapping. Remotely sensed imagery obtained from the ASTER sensor is used to classify different lithological units, while a DEM is used to characterize landforms related to these lithological units. Maximum Likelihood Classification (MLC) is used to integrate data streaming from different sources. The methodology involves integrating the surface properties of the classified geological units in addition to their spectral reflectances. Seven different classification trials were conducted: 1. MLC using only the nine ASTER bands; 2. MLC using ASTER bands and DEM; 3. MLC using ASTER bands and slope; 4. MLC using ASTER bands and plan curvature; 5. MLC using ASTER bands and profile curvature; 6. MLC using ASTER bands and drainage density; and 7. MLC using ASTER bands and all ancillary data. The results revealed that integrating topographical parameters aids classification where spectral information is not sufficient to discriminate between the classes of interest. An increase of more than 5% in overall accuracy was observed when all ancillary data were integrated, and an improvement of more than 10% was identified for most of the classes. However, the results show that the areal extent of the classified units constrains the application of the methodology.
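A from-scratch Gaussian maximum likelihood classifier over a stacked feature cube (spectral bands plus ancillary layers such as slope) illustrates the kind of integration described above. This is a hedged sketch: class names, the feature stack, and the Gaussian assumption are illustrative, and it is not the software used in the thesis.

```python
import numpy as np

def train_mlc(samples):
    """samples: dict class_label -> (n_pixels, n_features) training array
    (each class needs comfortably more pixels than features)."""
    stats = {}
    for label, x in samples.items():
        mean = x.mean(axis=0)
        cov = np.cov(x, rowvar=False)
        stats[label] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify_mlc(stack, stats):
    """stack: (rows, cols, n_features) array of ASTER bands + ancillary layers."""
    rows, cols, nfeat = stack.shape
    pixels = stack.reshape(-1, nfeat)
    best_label, best_score = None, None
    for label, (mean, inv_cov, logdet) in stats.items():
        d = pixels - mean
        # Gaussian log-likelihood up to a constant: -0.5 * (logdet + d' C^-1 d)
        score = -0.5 * (logdet + np.einsum('ij,jk,ik->i', d, inv_cov, d))
        if best_score is None:
            best_label = np.full(len(pixels), label, dtype=object)
            best_score = score
        else:
            improve = score > best_score
            best_label[improve] = label
            best_score = np.maximum(best_score, score)
    return best_label.reshape(rows, cols)

# Usage (shapes only, hypothetical class names):
#   stats = train_mlc({'limestone': x1, 'gypsum': x2})
#   labels = classify_mlc(np.dstack([aster_bands, slope]), stats)
```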
Los estilos APA, Harvard, Vancouver, ISO, etc.
29

Curtis, Phillip. "Data Driven Selective Sensing for 3D Image Acquisition". Thèse, Université d'Ottawa / University of Ottawa, 2013. http://hdl.handle.net/10393/30224.

Texto completo
Resumen
It is well established that acquiring large amounts of range data with vision sensors can quickly lead to important data management challenges where processing capabilities become saturated and pre-empt full usage of the information available for autonomous systems to make educated decisions. While sub-sampling offers a naïve solution for reducing dataset dimension after acquisition, it does not capitalize on the knowledge available in already acquired data to selectively and dynamically drive the acquisition process over the most significant regions in a scene, the latter being generally characterized by variations in depth and surface shape in the context of 3D imaging. This thesis discusses the development of two formal improvement measures, the first based upon surface meshes and Ordinary Kriging that focuses on improving scene accuracy, and the second based upon probabilistic occupancy grids that focuses on improving scene coverage. Furthermore, three selection processes to automatically choose which locations within the field of view of a range sensor to acquire next are proposed based upon the two formal improvement measures. The first two selection processes each use only one of the proposed improvement measures. The third selection process combines both improvement measures in order to counterbalance the parameters of the accuracy of knowledge about the scene and the coverage of the scene. The proposed algorithms mainly target applications using random access range sensors, defined as sensors that can acquire depth measurements at a specified location within their field of view. Additionally, the algorithms are applicable to the case of estimating the improvement and point selection from within a single point of view, with the purpose of guiding the random access sensor to locations it can acquire. However, the framework is developed to be independent of the range sensing technology used, and is validated with range data of several scenes acquired from many different sensors employing various sensing technologies and configurations. Furthermore, the experimental results of the proposed selection processes are compared against those produced by a random sampling process, as well as a neural gas selective sensing algorithm.
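The general idea of data-driven selective sensing can be illustrated with a hedged sketch that scores candidate grid locations by local occupancy uncertainty (Shannon entropy of a probabilistic occupancy grid) and picks the most uncertain ones as the next measurement targets. This is a generic illustration of coverage-driven selection, not the formal improvement measures (Ordinary Kriging on surface meshes, probabilistic coverage) developed in the thesis.

```python
import numpy as np

def cell_entropy(p_occ):
    """Shannon entropy of per-cell occupancy probabilities (maximal at p = 0.5)."""
    p = np.clip(p_occ, 1e-6, 1 - 1e-6)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def select_next_points(p_occ, n_points=10, window=3):
    """Pick the n most uncertain grid locations, averaged over a local window,
    as candidate directions for the next range measurements."""
    h = cell_entropy(p_occ)
    # Local mean entropy via a simple box filter built from shifted copies.
    k = window
    padded = np.pad(h, k // 2, mode='edge')
    local = np.zeros_like(h)
    for i in range(k):
        for j in range(k):
            local += padded[i:i + h.shape[0], j:j + h.shape[1]]
    local /= k * k
    flat = np.argsort(local, axis=None)[::-1][:n_points]
    return np.column_stack(np.unravel_index(flat, h.shape))

# Toy example: cells with occupancy near 0.5 (least known) are selected first.
grid = np.random.default_rng(0).uniform(0, 1, (64, 64))
targets = select_next_points(grid, n_points=5)
```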
Los estilos APA, Harvard, Vancouver, ISO, etc.
30

Sun, Liqun y 孙立群. "A comprehensive analysis of terrestrial surface features using remote sensing data". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/208044.

Texto completo
Resumen
Using remote sensing data, this study aims to enhance our understanding of land surface features, including ecosystem distribution in association with topographic and climatic controls, vegetation disturbance due to natural hazards, and surface temperature changes with consideration of the influence of urbanization. In this study, the Global Inventory Monitoring and Modeling System (GIMMS) Normalized Difference Vegetation Index (NDVI) data sets from 1982 to 2006 were used to explore vegetation variation. A data mining method, the Exhaustive Chi-squared Automatic Interaction Detector algorithm, was successfully applied to investigate the topographic influences on vegetation distribution in China. The study revealed that elevation is the predominant factor controlling vegetation distribution among the topographic attributes considered (slope, aspect, Compound Topographic Index (CTI) and distance to the nearest river). Further, the results indicated that solar radiation is the limiting factor for plant growth in the majority of the Northern Hemisphere in summer, while temperature is the main limitation in the other seasons. The partial correlation coefficient (PCC) method was adopted to investigate the complex relationships of NDVI with weather variables (i.e., temperature, precipitation and solar radiation) and key climate indices (such as the El Niño-Southern Oscillation (ENSO), Indian Ocean Dipole (IOD), Arctic Oscillation (AO), and Antarctic Oscillation (AAO)). The study indicated that the AO is the most significant index affecting spring and winter temperatures in the Northern Hemisphere. This study also enhanced the understanding of how vegetation responds to asymmetric daytime (Tmax) and nighttime (Tmin) warming in different seasons. The results revealed that asymmetric warming of Tmax and Tmin may influence vegetation photosynthesis and respiration during different periods of plant growth across biomes. In spring and autumn, vegetation in boreal and wet temperate regions of the Northern Hemisphere is positively correlated with Tmax and negatively correlated with Tmin, whereas in dry regions NDVI is always negatively correlated with Tmax and positively correlated with Tmin. In summer, NDVI is negatively correlated with Tmax in many dry regions. In addition, this study developed a new index, the Continued Vegetation Decrease Index (CVDI), to detect vegetation disturbance due to extreme natural hazards (such as earthquakes, wildfires and ice storms). Using the Wenchuan earthquake that occurred in Sichuan, China, on 12 May 2008 as an example, the study confirmed that the CVDI can effectively identify regions with severe vegetation damage, and it is expected that the newly developed index can be used for detecting vegetation disturbance in other regions of the world. Finally, using remote sensing data (land use and surface temperature data) and weather station data, this study developed a new method to evaluate the influence of urbanization on the temperatures recorded at weather stations. The results revealed that the weather stations with the fastest temperature increases are not in developed countries but in developing countries. The results also imply that the global warming trend may be overestimated owing to the underestimation of the influence of urbanization on temperature increase.
published_or_final_version
Civil Engineering
Doctoral
Doctor of Philosophy
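The partial correlation coefficient (PCC) analysis mentioned in this abstract can be sketched with a standard regression-residual formulation: correlate NDVI with one variable after removing the linear effect of the controls from both. The variable names and the synthetic data below are illustrative assumptions, not the study's data.

```python
import numpy as np

def partial_corr(y, x, controls):
    """Partial correlation of y and x, controlling for the columns of `controls`.

    y, x     : 1D arrays (e.g. seasonal NDVI and Tmax anomalies).
    controls : 2D array (n_samples, n_controls), e.g. Tmin and precipitation.
    """
    z = np.column_stack([np.ones(len(y)), controls])
    # Residuals of y and x after removing the linear effect of the controls.
    ry = y - z @ np.linalg.lstsq(z, y, rcond=None)[0]
    rx = x - z @ np.linalg.lstsq(z, x, rcond=None)[0]
    return np.corrcoef(ry, rx)[0, 1]

# Toy example with synthetic data (names are hypothetical).
rng = np.random.default_rng(1)
tmin = rng.normal(size=200)
precip = rng.normal(size=200)
tmax = 0.6 * tmin + rng.normal(size=200)
ndvi = 0.5 * tmax - 0.4 * tmin + 0.3 * precip + rng.normal(scale=0.5, size=200)
print(partial_corr(ndvi, tmax, np.column_stack([tmin, precip])))
```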
Los estilos APA, Harvard, Vancouver, ISO, etc.
31

Agaba, Doreen. "System design of the MeerKAT L - band 3D radar for monitoring near earth objects". Doctoral thesis, University of Cape Town, 2017. http://hdl.handle.net/11427/26890.

Texto completo
Resumen
This thesis investigates the current knowledge of small space debris (diameter less than 10 cm) and potentially hazardous asteroids (PHAs) obtained by the use of radar systems. It clearly identifies the challenges involved in detecting and tracking small space debris and PHAs. The most significant challenges include: difficulty in tracking small space debris due to orbital instability and reduced radar cross-section (RCS), errors in some existing data sets, the lack of dedicated or contributing instruments in the Southern Hemisphere, and the large cost involved in building a high-performance radar for this purpose. As a cost-effective solution, the cooperative use of the KAT-7 (7 antennas) and MeerKAT (64 antennas) radio telescope receivers in a radar system to improve the monitoring of small debris and PHAs was investigated using theory and simulations. Parameters for a low-cost, high-performance radar were chosen based on the receiver digital back-end. Data from such radars would be added to existing catalogues, thereby creating a constantly updated database of near Earth objects and bridging the data gap that is currently being filled by mathematical models. Based on the literature and the system requirements, quasi-monostatic, bistatic, multistatic, and single-input multiple-output (SIMO) radar configurations were proposed for radio telescope arrays detecting, tracking and imaging small space debris in low Earth orbit (LEO) and PHAs. The maximum dwell time possible for the radar geometry was found to be 30 seconds, with coherent integration limits of 2 ms and 121 ms for accelerating and non-accelerating targets, respectively. The multistatic and SIMO radar configurations showed sufficient detection (SNR 13 dB) for small debris, and the quasi-monostatic configuration for PHAs. Radar detection, tracking and imaging (ISAR) simulations were compared to theory, and ambiguities in range and Doppler were compensated for. The main contribution made by this work is a system design for a high-performance, cost-effective 3D radar that uses the KAT-7 and MeerKAT radio telescope receivers in a commensal manner. Comparing theory and simulations, the SNR improvement, increased dwell time, and tracking and imaging capabilities for small debris and PHAs, relative to existing assets, were illustrated. Since the MeerKAT radio telescope is a precursor for the SKA Africa, extrapolating the capabilities of the MeerKAT radar to an SKA radar implies that, upon its completion, it would be the most sensitive and highest-performing contributor to space situational awareness. From this feasibility study, the MeerKAT 3D distributed radar will be able to detect debris of diameter less than 10 cm at altitudes between 700 km and 900 km, as well as PHAs, with a range resolution of 15 m and a minimum SNR of 14 dB for 152 pulses at a coherent integration time of 2.02 ms. The target range (derived from the two-way delay), velocity (from the Doppler frequency) and direction will be measured within accuracies of 2.116 m, 15.519 m/s and 0.083° (single antenna), respectively. The range and velocity accuracies and the SNR affect orbit prediction accuracy by 0.021 minutes in orbital period and 0.0057° in orbital inclination. The multistatic radar was found to be the most suitable and computationally efficient configuration compared to the bistatic and SIMO configurations, and beamforming should be implemented as required by the specific target geometry.
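A back-of-the-envelope sketch of the kind of calculation behind such detection figures is the monostatic radar-equation SNR followed by ideal coherent integration of N pulses (SNR scales with N). Every parameter value in the example is a placeholder, not a MeerKAT or KAT-7 system value, and real link budgets include many additional loss terms.

```python
import numpy as np

K_BOLTZMANN = 1.380649e-23  # J/K

def single_pulse_snr(pt, gain_tx, gain_rx, wavelength, rcs, r,
                     bandwidth, t_sys, losses=1.0):
    """Single-pulse SNR from the monostatic radar equation (linear units)."""
    num = pt * gain_tx * gain_rx * wavelength**2 * rcs
    den = (4 * np.pi)**3 * r**4 * K_BOLTZMANN * t_sys * bandwidth * losses
    return num / den

def integrated_snr_db(snr_single, n_pulses):
    """Ideal coherent integration of n pulses multiplies SNR by n."""
    return 10 * np.log10(snr_single * n_pulses)

# Toy numbers only: a small target at 800 km range, 152 coherently integrated pulses.
snr1 = single_pulse_snr(pt=100e3, gain_tx=10**4.5, gain_rx=10**4.5,
                        wavelength=0.21, rcs=1e-3, r=800e3,
                        bandwidth=10e6, t_sys=25)
print(integrated_snr_db(snr1, n_pulses=152))
```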
Los estilos APA, Harvard, Vancouver, ISO, etc.
32

He, Juan Xia. "An ontology-based methodology for geospatial data integration". Thesis, University of Ottawa (Canada), 2010. http://hdl.handle.net/10393/28710.

Texto completo
Resumen
Data semantic and schematic heterogeneity is a major obstacle to the reuse and sharing of geospatial data. This research focuses on developing an ontology-based methodology to logically integrate heterogeneous geographic data in a cross-border context. The three main obstacles hindering data integration are semantic, schematic, and syntactic heterogeneity. Approaches to overcoming these obstacles in previous research are reviewed. Among the different approaches, an ontology-based approach is selected for horizontal geospatial data integration in the context of cross-border applications. The integration methodology includes the extraction of application schemas and application ontologies, ontology integration, the creation of a reference model (or ontologies), schema matching and integration, and the creation of usable integrated datasets. The methodology is conceptual and integrates geospatial data based on semantic content, and so is not tied to specific data formats, geometric representations, or feature locations. In order to facilitate the integration procedure, four semantic relationships are used: refer-to, semantic equivalence, semantic generalization, and semantic aggregation. A hybrid ontology approach is employed in order to facilitate the addition of new geospatial data sources to the integration process. As such, three levels of ontologies are developed and illustrated within an MS Access database: application, domain, and a reference model. Furthermore, a working integration prototype is designed to facilitate the integration of geospatial data in the North American context, given the semantic and schematic heterogeneities in international Canadian-US geospatial datasets. The methodology and prototype provide users with the ability to freely query and retrieve data without knowledge of the heterogeneous data ontologies and schemas. This is illustrated via a case study identifying critical infrastructure around the Ambassador Bridge international border crossing. The methodology and prototype are compared with other GDI approaches and evaluated against criteria introduced by Buccella et al. (2009). Specific challenges unique to GDI were uncovered, including geographic discrepancies, scale compatibility, and temporal issues.
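One minimal way to picture the four semantic relationships described above is a small mapping record between application-ontology concepts and a reference model, queried from the reference-model side. All class names, concept identifiers, and the relation encoding below are invented for illustration and are not the thesis' schema.

```python
from dataclasses import dataclass

# Hypothetical encoding of the four relationships named in the abstract.
RELATIONS = {"refer_to", "equivalence", "generalization", "aggregation"}

@dataclass(frozen=True)
class Mapping:
    source_concept: str      # concept in an application ontology (e.g. a national dataset)
    target_concept: str      # concept in the shared reference model
    relation: str            # one of RELATIONS; 'generalization' means the
                             # reference concept generalizes the source concept

    def __post_init__(self):
        if self.relation not in RELATIONS:
            raise ValueError(f"unknown semantic relation: {self.relation}")

# Illustrative cross-border mappings (all names hypothetical).
mappings = [
    Mapping("ca:Autoroute", "ref:Highway", "equivalence"),
    Mapping("us:Interstate", "ref:Highway", "equivalence"),
    Mapping("ca:PontInternational", "ref:CriticalInfrastructure", "generalization"),
    Mapping("us:RoadSegment", "ref:TransportNetwork", "aggregation"),
]

def query(reference_concept, mappings):
    """Return every application concept mapped to a reference-model concept."""
    return [m.source_concept for m in mappings if m.target_concept == reference_concept]

print(query("ref:Highway", mappings))   # both national road classes
```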
Los estilos APA, Harvard, Vancouver, ISO, etc.
33

Payne, Timothy Myles. "Remote detection using fused data /". Title page, abstract and table of contents only, 1994. http://web4.library.adelaide.edu.au/theses/09PH/09php3465.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
34

Combrexelle, Sébastien. "Multifractal analysis for multivariate data with application to remote sensing". PhD thesis, Toulouse, INPT, 2016. http://oatao.univ-toulouse.fr/16477/1/Combrexelle.pdf.

Texto completo
Resumen
Texture characterization is a central element in many image processing applications. Texture analysis can be embedded in the mathematical framework of multifractal analysis, enabling the study of the fluctuations in regularity of image intensity and providing practical tools for their assessment, the wavelet coefficients or wavelet leaders. Although successfully applied in various contexts, multifractal analysis suffers at present from two major limitations. First, the accurate estimation of multifractal parameters for image texture remains a challenge, notably for small sample sizes. Second, multifractal analysis has so far been limited to the analysis of a single image, while the data available in applications are increasingly multivariate. The main goal of this thesis is to develop practical contributions to overcome these limitations. The first limitation is tackled by introducing a generic statistical model for the logarithm of wavelet leaders, parametrized by the multifractal parameters of interest. This statistical model enables us to counterbalance the variability induced by small sample sizes and to embed the estimation in a Bayesian framework. This yields robust and accurate estimation procedures, effective for both small and large images. The multifractal analysis of multivariate images is then addressed by generalizing this Bayesian framework to hierarchical models able to account for the assumption that multifractal properties evolve smoothly in the dataset. This is achieved via the design of suitable priors relating the dynamical properties of the multifractal parameters of the different components composing the dataset. Different priors are investigated and compared in this thesis by means of numerical simulations conducted on synthetic multivariate multifractal images. This work is completed by an investigation of the potential benefit of multifractal analysis and the proposed Bayesian methodology for remote sensing via the example of hyperspectral imaging.
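A heavily simplified illustration of scaling analysis is the increment-based structure function: estimate the exponents zeta(q) from the slope of log moments of image increments versus log scale, where a nonlinear zeta(q) is the usual signature of multifractality. This sketch uses raw increments rather than the wavelet leaders and Bayesian models developed in the thesis, and the scales and orders q below are arbitrary choices.

```python
import numpy as np

def structure_function_exponents(image, qs=(1, 2, 3), scales=(1, 2, 4, 8, 16)):
    """Estimate scaling exponents zeta(q) from horizontal image increments.

    For each lag s, S_q(s) = mean(|I(x+s, y) - I(x, y)|^q); zeta(q) is the
    slope of log2 S_q(s) versus log2 s.
    """
    log_s = np.log2(scales)
    zetas = []
    for q in qs:
        log_Sq = []
        for s in scales:
            inc = np.abs(image[:, s:] - image[:, :-s])
            log_Sq.append(np.log2(np.mean(inc ** q) + 1e-30))
        slope = np.polyfit(log_s, log_Sq, 1)[0]
        zetas.append(slope)
    return dict(zip(qs, zetas))

# Toy example on Brownian-like rows (cumulative sum of white noise):
# zeta(q) should come out close to q/2, i.e. linear in q (monofractal).
rng = np.random.default_rng(0)
img = np.cumsum(rng.normal(size=(256, 256)), axis=1)
print(structure_function_exponents(img))
```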
Los estilos APA, Harvard, Vancouver, ISO, etc.
35

Horn, Isaac Abraham. "Remote Sensing and Data Collection in a Marine Science Application". Fogler Library, University of Maine, 2006. http://www.library.umaine.edu/theses/pdf/HornIA2006.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
36

Zheng, Tao. "Mapping photosynthetically active radiation (PAR) using multiple remote sensing data". College Park, Md. : University of Maryland, 2007. http://hdl.handle.net/1903/7231.

Texto completo
Resumen
Thesis (Ph. D.) -- University of Maryland, College Park, 2007.
Thesis research directed by: Geography. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
Los estilos APA, Harvard, Vancouver, ISO, etc.
37

Ha, Le Thi Chau. "Remote sensing data integration for landslide susceptibility mapping in Vietnam". Thesis, University of Newcastle Upon Tyne, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.493229.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
38

Standley, Andy. "Passive microwave remote sensing of snow cover from satellite data". Thesis, University of Bristol, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.265475.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
39

Mabaso, Sizwe Doctor. "Remote sensing data for mapping and monitoring African savanna woodlands". Thesis, Aberystwyth University, 2016. http://hdl.handle.net/2160/45317b29-ca30-4cdc-9c69-ce52067a8361.

Texto completo
Resumen
Remote sensing data provide unprecedented opportunities for detecting and monitoring forest disturbance and loss. Disturbance and loss have been successfully mapped where cleared land is of sufficient extent to provide discrimination within an image. However, methodologies for mapping and monitoring forest degradation are still lacking, primarily because degradation features are small in size. In Tanzania, the size and extent of degradation is poorly understood, yet there has been an increase in its drivers, such as shifting cultivation, settlements, logging and charcoal burning of indigenous trees. This study aimed to establish the extent to which remote sensing can map forest cover and change for REDD+ monitoring in Tanzanian savanna woodlands. A forest baseline for Liwale in south-eastern Tanzania was derived using the Land Cover Classification System (LCCS) from a 2012 RapidEye image and validated using LiDAR data from 2012. The baseline and a 2014 RapidEye image were then used to perform an object-based change detection, first identifying potential change features by automatically thresholding the data using a method based on optimising the skewness and kurtosis of the distribution of the class of interest, and then classifying the true change features using the random forests algorithm. The change results were validated using 2014 LiDAR data. These methods were then scaled out to coarser Landsat imagery. The study concluded that both high- and low-resolution optical RS data have great potential for forest monitoring of the Tanzanian savanna woodlands, even though Landsat does not provide the level of detail needed to accurately depict small-scale change. Components of a remote sensing-based monitoring system are proposed; however, it was noted that for national monitoring an integration of both high- and low-resolution data was best. Because the woodlands are deciduous and the region has persistent cloud cover, the window for acquiring appropriate monitoring data is greatly limited. The developed methodology is robust and can thus be scaled up to a broader national scale and across similar environments in Africa.
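One plausible reading of the automatic thresholding step is sketched below: scan candidate thresholds and keep the one under which the retained "no change" population looks most Gaussian, by minimising the sum of absolute skewness and excess kurtosis; pixels above that threshold become potential change features. The quantile grid, the score, and the toy data are assumptions for illustration, not the published method.

```python
import numpy as np

def gaussianity_score(x):
    """|skewness| + |excess kurtosis| of a sample: 0 for a perfect Gaussian."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    if sigma == 0 or x.size < 4:
        return np.inf
    z = (x - mu) / sigma
    return abs(np.mean(z ** 3)) + abs(np.mean(z ** 4) - 3.0)

def auto_threshold(change_index, candidates=None):
    """Pick the threshold below which the 'no change' population is most
    Gaussian, so that pixels above it are flagged as potential change."""
    if candidates is None:
        candidates = np.quantile(change_index, np.linspace(0.5, 0.99, 50))
    scores = [gaussianity_score(change_index[change_index <= t]) for t in candidates]
    return candidates[int(np.argmin(scores))]

# Toy example: Gaussian 'no change' background plus a tail of change pixels.
rng = np.random.default_rng(2)
index = np.concatenate([rng.normal(0, 1, 10000), rng.normal(6, 1, 300)])
t = auto_threshold(index)
change_mask = index > t
```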
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

Marcellin, Michael W., Naoufal Amrani, Serra-Sagristà Joan, Valero Laparra y Jesus Malo. "Regression Wavelet Analysis for Lossless Coding of Remote-Sensing Data". IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2016. http://hdl.handle.net/10150/621311.

Texto completo
Resumen
A novel wavelet-based scheme to increase coefficient independence in hyperspectral images is introduced for lossless coding. The proposed regression wavelet analysis (RWA) uses multivariate regression to exploit the relationships among wavelet-transformed components. It builds on our previous nonlinear schemes that estimate each coefficient from neighboring coefficients. Specifically, RWA performs a pyramidal estimation in the wavelet domain, thus reducing the statistical relations in the residuals and the energy of the representation compared to existing wavelet-based schemes. We propose three regression models to address the issues of estimation accuracy, component scalability, and computational complexity. Other suitable regression models could be devised for other goals. RWA is invertible, allows a reversible integer implementation, and does not expand the dynamic range. Experimental results over a wide range of sensors, such as AVIRIS, Hyperion, and the Infrared Atmospheric Sounding Interferometer, suggest that RWA outperforms not only principal component analysis and wavelets but also the best and most recent coding standard in remote sensing, CCSDS-123.
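The core regression idea can be shown with a toy sketch: predict each transformed spectral component from its neighbours by least squares so that only low-energy residuals (plus the regression weights) would need to be coded. This sketch ignores the causal prediction order, reversible integer mapping, and pyramidal structure that actual RWA lossless coding requires; it only demonstrates the energy reduction.

```python
import numpy as np

def regression_residuals(components):
    """Decorrelate transformed spectral components by multivariate regression.

    components : (n_bands, n_coeffs) array of wavelet-transformed band
                 coefficients. Each band is predicted from its spectral
                 neighbours by least squares.
    Returns (residuals, list_of_weight_vectors).
    """
    n_bands, n_coeffs = components.shape
    residuals = np.empty_like(components, dtype=float)
    weights = []
    for i in range(n_bands):
        idx = [j for j in (i - 1, i + 1) if 0 <= j < n_bands]
        design = np.column_stack([np.ones(n_coeffs)] + [components[j] for j in idx])
        w, *_ = np.linalg.lstsq(design, components[i], rcond=None)
        residuals[i] = components[i] - design @ w
        weights.append(w)
    return residuals, weights

# Toy example: strongly correlated synthetic 'bands'.
rng = np.random.default_rng(3)
base = rng.normal(size=4096)
bands = np.stack([base + 0.05 * rng.normal(size=4096) for _ in range(8)])
res, _ = regression_residuals(bands)
print(bands.var(axis=1).mean(), res.var(axis=1).mean())  # residual energy is far lower
```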
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

Radhakrishnan, Aswathnarayan. "A Study on Applying Learning Techniques to Remote Sensing Data". The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1586901481703797.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Medler, Michael Johns 1962. "Integrating remote sensing and terrain data in forest fire modeling". Diss., The University of Arizona, 1997. http://hdl.handle.net/10150/282480.

Texto completo
Resumen
Forest fire policies are changing. Managers now face conflicting imperatives to re-establish pre-suppression fire regimes while simultaneously preventing resource destruction. They must, therefore, understand the spatial patterns of fires. Geographers can facilitate this understanding by developing new techniques for mapping fire behavior. This dissertation develops such techniques for mapping recent fires and using these maps to calibrate models of potential fire hazards. In so doing, it features techniques that strive to address the inherent complexity of modeling the combinations of variables found in most ecological systems. Image processing techniques were used to stratify the elements of terrain: slope, elevation, and aspect. These stratification images were used to ensure that sample placement considered the role of terrain in fire behavior. Examination of multiple stratification images indicated that samples were placed representatively across a controlled range of scales. The incorporation of terrain data also improved preliminary fire hazard classification accuracy by 40% compared with remotely sensed data alone. A Kauth-Thomas (KT) transformation of pre-fire and post-fire Thematic Mapper (TM) remotely sensed data produced brightness, greenness, and wetness images. Image subtraction indicated fire-induced change in brightness, greenness, and wetness. Field data guided a fuzzy classification of these change images. Because fuzzy classification can characterize a continuum of a phenomenon where discrete classification may produce artificial borders, fuzzy classification was found to offer a range of fire severity information unavailable with discrete classification. These mapped fire patterns were used to calibrate a model of fire hazards for the entire mountain range. Pre-fire TM imagery and a digital elevation model were combined into a set of co-registered images. Training statistics were developed from 30 polygons associated with the previously mapped fire severity. Fuzzy classifications of potential burn patterns were produced from these images. Observed field data values were displayed over the hazard imagery to indicate the effectiveness of the model. Areas that burned without suppression during maximum fire severity are predicted best. Areas with widely spaced trees and grassy understory appear to be misrepresented, perhaps as a consequence of inaccuracies in the initial fire mapping.
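The change-detection core of this workflow, differencing Kauth-Thomas components and converting the greenness loss into a fuzzy fire-severity membership, can be sketched briefly. The component ordering, the ramp thresholds, and the toy burn scar below are assumptions for illustration; the actual sensor-specific tasseled cap coefficients and the field-guided fuzzy classes are not reproduced here.

```python
import numpy as np

def kt_change(pre, post):
    """Per-pixel change in Kauth-Thomas components.

    pre, post : (rows, cols, 3) arrays of brightness, greenness, wetness.
    Returns (d_brightness, d_greenness, d_wetness).
    """
    diff = post.astype(float) - pre.astype(float)
    return diff[..., 0], diff[..., 1], diff[..., 2]

def fuzzy_severity(d_green, low=-0.05, high=-0.35):
    """Fuzzy membership in a 'high fire severity' class from greenness loss.

    Membership ramps linearly from 0 at `low` (little greenness loss) to 1 at
    `high` (strong greenness loss); the thresholds are illustrative only.
    """
    m = (d_green - low) / (high - low)
    return np.clip(m, 0.0, 1.0)

# Toy example: a simulated burn scar with a strong greenness drop.
rng = np.random.default_rng(4)
pre = rng.uniform(0, 1, (100, 100, 3))
post = pre.copy()
post[40:60, 40:60, 1] -= 0.4
_, dg, _ = kt_change(pre, post)
severity = fuzzy_severity(dg)   # close to 1 inside the scar, near 0 elsewhere
```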
Los estilos APA, Harvard, Vancouver, ISO, etc.
43

Spaniol, Jutta. "Synthesis of fractal-like surfaces from sparse data bases". Thesis, University of Exeter, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.335017.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Hindley, D. "Information content of AVHRR data for crop production estimates". Thesis, Cranfield University, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.357999.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Wood, Peter. "Hyperspectral measurement and modelling of marine remote sensing reflectance". Thesis, University of Strathclyde, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366770.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
46

Sarton, Christopher J. "Autopilot using differential thrust for ARIES autonomous underwater vehicle". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FSarton.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Zhu, Shuxiang. "Big Data System to Support Natural Disaster Analysis". Case Western Reserve University School of Graduate Studies / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=case1592404690195316.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

Mayr, Thomas. "The evaluation of PMI data for vegetation mapping in the Somerset Levels". Thesis, Cranfield University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.281899.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

Kayes, Edwin. "A NEW GENERATION OF DATA RECORDERS FOR REMOTE SENSING GROUND STATIONS". International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608543.

Texto completo
Resumen
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
Magnetic tape is the primary medium used to capture and store unprocessed data from remote sensing satellites. Recent advances in digital cassette recording technology have resulted in the introduction of a range of data recorders which are equally at home working alongside conventional recorders or as part of more advanced data capture strategies. This paper shows how users are taking advantage of the convenience, economy and efficiency of this new generation of cassette-based equipment in a range of practical applications.
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

McLean, Andrew Lister. "Applications of maximum entropy data analysis". Thesis, University of Southampton, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.319161.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.