Theses on the topic "Open environmental data"
Create an accurate citation in APA, MLA, Chicago, Harvard and other styles
Consult the 34 best theses for your research on the topic "Open environmental data".
Explore theses on a wide variety of disciplines and organize your bibliography correctly.
Sadler, Jeffrey Michael. "Hydrologic Data Sharing Using Open Source Software and Low-Cost Electronics". BYU ScholarsArchive, 2015. https://scholarsarchive.byu.edu/etd/4425.
Montori, Federico. "Delivering IoT Services in Smart Cities and Environmental Monitoring through Collective Awareness, Mobile Crowdsensing and Open Data". Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amsdottorato.unibo.it/8957/1/THESIS_REV.pdf.
Dumpawar, Suruchi. "Open government data intermediaries: mediating data to drive changes in the built environment". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/97994.
In recent years open data initiatives, which make government data publicly available in a machine-readable format for reuse and redistribution, have proliferated, driven by the launch of government open-data portals such as data.gov and data.gov.uk. Research on open data has focused on its potential for governance, its implications for transparency, accountability, and service delivery, and its limitations and barriers to use. Less attention, however, has been paid to the practices of data intermediaries: an emerging configuration of actors that plays an essential role in facilitating the use and reuse of data by aggregating open government data and enhancing it through a range of data practices. This thesis assesses the data practices of open government data intermediaries from three perspectives. First, it traces the development of open government data initiatives to contend that, at a moment when open data policy is seeing global diffusion with the potential for increasing social, political, and economic impact, there is a crucial need to assess the practices of intermediaries to understand how open government data is put to use. Second, it develops a framework for analyzing the role of open government data intermediaries by proposing a definition of "the data intermediary function" constituted by a range of technical, civic, representational, and critical data practices. Third, it assesses the data practices of two open government data intermediaries, 596 Acres and Transparent Chennai, which, as urban actors, facilitate the conversion of open government data into actionable information for communities seeking to effect changes in the built environment.
In describing and assessing the tools, practices, and methods developed by open data intermediaries, this thesis explores the potential and limitations of data intermediaries and offers recommendations that might inform future open government data initiatives seeking to mediate open government data to facilitate changes in the built environment.
Neira, Maria Elena. "An open architecture for data environments based on context interchange". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/69352.
Wiggins, John Sterling. "Design and specification of a PC-based, open architecture environment controller". Thesis, Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/17299.
Miles, Shaun Graeme. "An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment". Thesis, Rhodes University, 2012. http://hdl.handle.net/10962/d1006653.
Kalibjian, Jeffrey R. "Application of Intrusion Detection Software to Protect Telemetry Data in Open Networked Computer Environments". International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/606817.
Over the past few years, models for Internet-based sharing and selling of telemetry data have been presented [1] [2] [3] at ITC conferences. A key element of these sharing/selling architectures was security, needed to ensure that information was not compromised while in transit and that only parties with a legitimate right could access the telemetry data. While the software managing the telemetry data needs to be security conscious, the networked computer hosting the data to be shared or sold must also be resistant to compromise. Intrusion Detection Systems (IDS) may be used to help identify and protect computers from malicious attacks in which data can be compromised.
Triperina, Evangelia. "Visual interactive knowledge management for multicriteria decision making and ranking in linked open data environments". Thesis, Limoges, 2020. http://www.theses.fr/2020LIMO0010.
This dissertation involves research in the field of visual representations aided by semantic technologies and ontologies to support decision- and policy-making procedures, in the framework of research and academic information systems. The visualizations are also supported by data mining and knowledge extraction processes in the linked data environment. Specifically, visual analytics techniques are employed to organize the visualizations so that information is presented in a way that exploits human perceptual abilities and ultimately assists decision support and policy making. Furthermore, the visual representations, and consequently the decision- and policy-making processes, are improved by means of semantic technologies based on conceptual models in the form of ontologies. The main objective of the thesis is thus to combine key semantic technologies with interactive visualization techniques, based mainly on graph perception, to make decision support systems more effective. The application field is research and academic information systems.
Neumann, Bradley C. "Is All Open Space Created Equal? A Hedonic Application within a Data-Rich GIS Environment". Fogler Library, University of Maine, 2005. http://www.library.umaine.edu/theses/pdf/NeumannBC2005.pdf.
Chivarar, Sonia, and Haithem Hamdi. "Technology Convergence and Open Innovation: An Empirical Study on How Nexus of Forces Influences the Open Innovation Environment". Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Informatik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-23980.
Bennett, Stacey Patricia. "An object oriented expert system for specifying computer data security requirements in an open systems environment". Thesis, University of Birmingham, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.341835.
Castro, Ginard Alfred. "Detection, characterisation and use of open clusters in a Galactic context in a Big Data environment". Doctoral thesis, Universitat de Barcelona, 2021. http://hdl.handle.net/10803/671790.
Open clusters are gravitationally bound groups of stars, born from the same molecular cloud and sharing similar properties. They are popular tracers of the structure of the Galactic disc, such as its spiral arms. The second Gaia data release, with more than 1.3 billion stars, makes cluster detection by traditional methods impractical given the sheer volume of the catalog, so the development of automated techniques has grown together with the catalogs to be analyzed. We developed a methodology for the blind search of open clusters in the Galactic disc, using the clustering algorithm DBSCAN to find overdensities in Gaia's five-dimensional astrometric space. Implementing the clustering method in a Big Data environment, on the MareNostrum supercomputer, allows us to search for open clusters on the basis of their physical properties. The detected overdensities are identified as real open clusters by an artificial neural network that recognizes isochrones in a color-magnitude diagram. Automating the detection procedure with Big Data techniques yielded more than 650 new clusters. These new clusters represent one third of the currently known population and constitute the largest individual contribution to the catalog. We estimated the clusters' physical properties, such as distance, age and extinction, using an artificial neural network trained on known clusters. We use this information, together with radial-velocity measurements, to trace the present-day spiral structure of our Galaxy, associating the youngest open clusters (< 30 Myr) with the spiral arm in which they formed. This adds 264 young clusters to the traditionally used spiral-arm tracers and allows a better estimate of the arms' present-day parameters.
Analyzing the age distribution of the clusters within the spiral arms, and computing the speed at which the arms move from the clusters' orbits, we were able to disfavor the classical density-wave theory as the main mechanism forming the spiral structure, finding instead a more transient behavior of the arms.
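The detection step described in this abstract hinges on DBSCAN over Gaia's five-dimensional astrometric space (two sky positions, two proper motions, parallax). As a rough illustration only, here is a minimal pure-Python DBSCAN on toy 5-D points; the `eps` and `min_pts` values and the points themselves are invented, and this brute-force O(n²) sketch is nothing like the parallelized MareNostrum implementation the thesis describes.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN; returns one cluster label per point (-1 = noise)."""
    n = len(points)
    labels = [None] * n          # None = not yet visited
    cluster = -1

    def neighbours(i):
        # Brute-force range query in the 5-D astrometric space.
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]

    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1       # noise (may later become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border point: claim it, don't expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_seeds = neighbours(j)
            if len(j_seeds) >= min_pts:  # core point: keep expanding
                queue.extend(j_seeds)
    return labels

# Toy (l, b, pmra, pmdec, parallax) tuples: two tight overdensities plus noise.
stars = [(0, 0, 0, 0, 0), (0.1, 0, 0, 0, 0), (0, 0.1, 0, 0, 0),
         (5, 5, 5, 5, 5), (5.1, 5, 5, 5, 5), (5, 5.1, 5, 5, 5),
         (20, 20, 20, 20, 20)]
print(dbscan(stars, eps=0.5, min_pts=3))   # → [0, 0, 0, 1, 1, 1, -1]
```

In the thesis, candidate overdensities found this way are then vetted by a neural network on the color-magnitude diagram; that step is omitted here.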
Ramoly, Nathan. "Contextual integration of heterogeneous data in an open and opportunistic smart environment : application to humanoid robots". Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLL003/document.
Personal robots associated with ambient intelligence are an upcoming solution for domestic care: helped by devices distributed in the environment, robots could provide better care to users. However, such robots encounter challenges of perception, cognition and action. The association brings issues of variety, data quality and conflicts, leading to heterogeneity and uncertainty of data. These are challenges for both perception, i.e. context acquisition, and cognition, i.e. reasoning and decision making. With knowledge of the context, the robot can intervene through actions; however, it may encounter task failures due to a lack of knowledge or to context changes, causing it to cancel or delay its agenda. While the literature addresses these topics, it fails to provide complete solutions. In this thesis, we propose contributions exploring both reasoning and learning approaches to cover the whole spectrum of problems. First, we designed a novel context acquisition tool that supports and models the uncertainty of data. Secondly, we proposed a cognition technique that detects anomalous situations over uncertain data and takes decisions accordingly. Then, we proposed a dynamic planner that takes the latest context changes into consideration. Finally, we designed an experience-based reinforcement learning approach to proactively avoid failures. All our contributions were implemented and validated through simulations and/or with a small robot in a smart home platform.
Rathnayaka, Mudiyanselage Udara Madushantha Somarathna. "Data quality analysis in a GIS environment of OpenStreetMap geodatabase for Sri Lanka". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.
Rafes, Karima. "Le Linked Data à l'université : la plateforme LinkedWiki". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS032/document.
The Center for Data Science of the University of Paris-Saclay deployed a platform compatible with Linked Data in 2016. Because researchers face many difficulties utilizing these technologies, an approach and then a platform we call LinkedWiki were designed and tested over the university's cloud (IaaS) to enable the creation of modular virtual research environments (VREs) compatible with Linked Data. We are thus able to offer researchers a means to discover, produce and reuse the research data available within the Linked Open Data (LOD), i.e., the global information system emerging at the scale of the internet. This experience enabled us to demonstrate that the operational use of Linked Data within a university is perfectly possible with this approach. However, some problems persist, such as (i) lack of compliance with protocols and (ii) the lack of adapted tools for querying the Linked Open Data with SPARQL; we propose solutions to both. To verify compliance with the SPARQL protocol within a university's Linked Data, we created the SPARQL Score indicator, which evaluates the compliance of SPARQL services before their deployment in the university's information system. In addition, to help researchers query the LOD, we implemented SPARQLets-Finder, a demonstrator showing that it is possible to facilitate the design of SPARQL queries using autocompletion tools without prior knowledge of the RDF schemas within the LOD.
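The abstract does not spell out how SPARQLets-Finder's autocompletion works; purely to illustrate the idea, here is a toy prefix matcher over a hand-made list of RDF terms. The `KNOWN` vocabulary and the triple-pattern output format are assumptions for this sketch, not the platform's actual API.

```python
def autocomplete(prefix, known_terms):
    """Return the known RDF terms matching a typed prefix (case-insensitive)."""
    p = prefix.lower()
    return sorted(t for t in known_terms if t.lower().startswith(p))

def suggest_triples(subject_var, prefix, known_terms):
    """Turn each matching predicate into a candidate SPARQL triple pattern."""
    return [f"{subject_var} {term} ?o ." for term in autocomplete(prefix, known_terms)]

# Hypothetical vocabulary, e.g. harvested once from an endpoint's schema.
KNOWN = ["foaf:name", "foaf:knows", "dct:title", "dct:creator", "dct:created"]

print(suggest_triples("?s", "foaf:k", KNOWN))   # → ['?s foaf:knows ?o .']
```

A real assistant would rank suggestions by usage statistics harvested from the endpoint rather than alphabetically, but the interaction pattern is the same.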
Paumelle, Martin. "Description multi-dimensionnelle de l'environnement à l'échelle des territoires : contribution pour la recherche de déterminants environnementaux dans l'étiologie des maladies chroniques". Electronic Thesis or Diss., Université de Lille (2022-....), 2023. http://www.theses.fr/2023ULILR050.
Among chronic diseases, Crohn's disease (CD) and end-stage renal disease (ESRD) have a multifactorial etiology that remains partly unknown, with a strong suspicion of an environmental link. The spatial distribution of their incidence has been mapped at the municipal level in Northern France using two health registers (Epimad and Nephronor). These spatial disparities in incidence serve as the starting point for investigating potential environmental determinants that may be involved in the onset of these diseases. The characterization of the environment and its link to health is often approached in a fragmented manner, focusing on a specific emission source, pollutant, or exposure medium. While these approaches are necessary, they may be limited in comprehending the complexity of the relationship between environment and health, especially for multifactorial diseases with unknown environmental risk factors. In such cases, it is relevant to prioritize territorial and multidimensional strategies before potentially targeting specific environmental risk factors. In this context, how can multiple open environmental data sources be leveraged to identify territorial determinants of multifactorial diseases? The main objective of this thesis is to offer an integrated description of the environment at the territorial level to inform the etiology of the studied diseases. The strategy involved collecting and reusing open environmental data. This approach identified 24 data sources and generated 113 spatial indicators at the municipal level for four departments. These indicators allow the characterization of contamination levels in various media (air, water, soil), pollutant emissions, the location of emission sources, land use, agricultural practices, the natural features of territories, and climate.
Several methodologies were used to exploit these indicators and characterize the environment from a multidimensional perspective. A first approach involved developing composite spatial indices, which synthesize information from many indicators into a single global measure. Initially, vulnerability and resilience indices were calculated; they characterize the uneven spatial distribution of environmental determinants with a beneficial or detrimental impact on health. Subsequently, composite indices of multi-media contamination (air, water, soil) were constructed. A second approach used multivariate classification methods to create territorial typologies and describe the environmental profiles of municipalities. These results provide a more complex view of territories and make it possible to understand how environmental pressures are distributed in space and overlap with one another. Finally, the results of these multidimensional approaches were linked to spatial variations in the incidence of the chronic diseases, suggesting potential connections between the environment and the occurrence of these pathologies. For ESRD, associations were observed with urban pressure and fine-particulate air pollution, corroborating existing literature. For CD, links were suggested with agricultural practices, the natural characteristics of territories, and metallic soil pollution. Further epidemiological approaches are now needed to test these hypotheses and advance research in this area.
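The thesis does not give the exact aggregation formula behind its composite indices; the sketch below shows one common construction, min-max normalization of each indicator followed by a weighted average per municipality. The indicator names and values are invented for the example.

```python
def minmax(values):
    """Scale one indicator's raw values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(indicators, weights=None):
    """Aggregate several per-municipality indicators into one composite score.

    indicators: dict name -> list of raw values (one value per municipality).
    Each indicator is min-max normalized, then combined as a weighted
    average (equal weights by default).
    """
    names = list(indicators)
    weights = weights or {n: 1.0 for n in names}
    total = sum(weights[n] for n in names)
    scaled = {n: minmax(indicators[n]) for n in names}
    n_units = len(next(iter(indicators.values())))
    return [sum(weights[n] * scaled[n][i] for n in names) / total
            for i in range(n_units)]

# Invented air-quality indicators for three municipalities.
scores = composite_index({"pm25": [10, 20, 30], "no2": [5, 5, 15]})
print(scores)   # → [0.0, 0.25, 1.0]
```

Normalizing first keeps indicators with large raw ranges from dominating the aggregate, which is the usual rationale for this construction.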
Reski, Nico. "Change your Perspective : Exploration of a 3D Network created with Open Data in an Immersive Virtual Reality Environment using a Head-mounted Display and Vision-based Motion Controls". Thesis, Linnéuniversitetet, Institutionen för medieteknik (ME), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-46779.
RAZZAK, FAISAL. "The Role of Semantic Web Technologies in Smart Environments". Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2506366.
Dunkel, Alexander. "Assessing the perceived environment through crowdsourced spatial photo content for application to the fields of landscape and urban planning". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-207927.
Perception is the conscious process of subjectively making sense of the environment. It is based on information gathered through the senses, i.e. from visual, olfactory, acoustic and other stimuli, but it is also substantially shaped by internal processes: the human brain is continuously engaged, both consciously and unconsciously, in matching sensory input with memories, simplifying, associating, predicting and comparing. For this reason it is difficult to take the perception of places and landscapes into account in planning processes. Yet exactly this is demanded by the European Landscape Convention, which defines landscape as a certain area "as perceived by local people or visitors" (ELC Art. 1, para. 38). While many advances, for example in the cognitive sciences, now help us understand the perception of individual people, urban and landscape planning has scarcely benefited: knowledge of how the perceptions of many people interact is lacking. The urban planner Kevin Lynch was already concerned with this shared, collective image of the human environment ("generalized mental picture", Lynch, 1960, p. 4), but since then hardly any notable progress has been made in capturing the general public's perception of city and landscape. This was the occasion and motivation for the present work. A source of information for capturing the perception of many people, so far unused in planning, presents itself in the form of crowdsourced data (also "Big Data"): large volumes of data contributed by many people on the internet.
Compared with conventional data, for example data collected by experts and provided by public agencies, crowdsourced data open up a previously unavailable source of information for understanding the complex interplay of space, identity and subjective perception. Crowdsourced data contain mere traces of human decisions, but because of their volume it is possible to extract substantial information about the perception of those who contributed them. This allows planners to understand how people perceive and interact with their immediate surroundings. Moreover, taking the views of many into account in planning processes is becoming ever more important (Lynam, De Jong, Sheil, Kusumanto, & Evans, 2007; Brody, 2004); the demand for public participation and the number of stakeholders involved grow steadily. Using this new source of information offers an alternative to conventional approaches such as surveys, which are used to measure, for example, the opinions, positions, values, norms or preferences of particular social groups. By making such socio-cultural values easier to determine, crowdsourced data can help above all with the difficult weighting of conflicting interests and views. It is argued that using crowdsourced data to complement expert assessments can ultimately lead to a fairer, more balanced consideration of the general public in decision-making processes (Erickson, 2011, p. 1). A large number of methods are already available to extract important landscape-related information from this data source, for example assessing the attractiveness of landscapes, determining the significance of sights and landmarks, or estimating the travel preferences of user groups.
Many existing methods, however, were found insufficient for the specific needs and the broad range of questions on landscape perception in urban and landscape planning. The goal of this work is to convey practice-relevant knowledge that enables planners to explore, visualize and interpret such data themselves. The key to success is seen in synthesizing knowledge from three categories: theoretical foundations (1), technical knowledge of data processing (2), and skills in graphical visualization (3). Part I presents the theoretical foundations, first discussing weaknesses of current methods and then proposing a new conceptual-technical approach aimed explicitly at complementing existing methods. Part II demonstrates the application of the approach on an example dataset, covering questions from data retrieval, processing, analysis and visualization through to the interpretation of graphics in planning processes. The basis is a dataset of 147 million georeferenced photographs and 882 million tags from the photo-sharing platform Flickr, contributed by 1.3 million users between 2007 and 2015. These data are used to illustrate the development of new visualization techniques, including spatio-temporal tag clouds, an experimental technique for generating perception-weighted maps, the visualization of perceived landscape change, the mapping of perception-weighted sightlines, and the evaluation of individual perception of and at particular places.
The application of these techniques is tested and discussed for various test regions in the USA, Canada and Germany at all scales. Examples include the capture and assessment of sightlines and visual relations in Yosemite Valley, the monitoring of perceived change around the High Line in New York, the evaluation of individual perception of Coit Tower in San Francisco, and the assessment of regionally perceived, identity-forming landscape values for Baden-Württemberg and the Greater Toronto Area (GTA). Approaches for assessing the quality and validity of the visualizations are then presented. Finally, a specific implementation of the approach and the visualizations is briefly outlined and discussed for a concrete planning example, the London View Management Framework (LVMF). Above all, the work emphasizes the broad potential that crowdsourced data hold for assessing landscape perception in urban and landscape planning. Crowdsourced photo data in particular are seen as an important additional source of information, since they open up a previously unavailable perspective on the general public's perception of the environment. While some limits to broader application remain, the experimental methods and techniques presented can already yield important insights into a whole range of perceived landscape values. At a conceptual level, the work provides a first foundation for further research. Before broad application in practice is possible, however, crucial questions must be resolved, for example on copyright, on the definition of ethical standards within the profession, and on protecting the privacy of those involved.
In the longer term, it will be important not only to use these data, but also to exploit the essential opportunities of this development for better communication with clients, stakeholders and the public in planning and decision-making processes.
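To give a flavor of what the tag-cloud techniques involve at their simplest, the sketch below weights tags by how many photos carry them, relative to the most frequent tag. It is a toy reduction of the methods in the thesis: the photo tags are invented, and the spatial and temporal binning that makes the clouds "spatio-temporal" is ignored entirely.

```python
from collections import Counter

def tag_weights(photo_tags, min_count=2):
    """Relative display weights for a tag cloud from per-photo tag lists.

    Each tag is counted at most once per photo, rare tags are dropped,
    and counts are scaled to (0, 1] relative to the most frequent tag.
    """
    counts = Counter(tag for tags in photo_tags for tag in set(tags))
    counts = {t: c for t, c in counts.items() if c >= min_count}
    if not counts:
        return {}
    top = max(counts.values())
    return {t: c / top for t, c in counts.items()}

# Invented photo metadata around a single location.
photos = [["highline", "park"], ["highline", "nyc"],
          ["highline", "park"], ["nyc"]]
print(tag_weights(photos))   # 'highline' gets weight 1.0; 'park' and 'nyc' get 2/3
```

Counting a tag once per photo, rather than once per occurrence, keeps a single prolific user's tagging habits from dominating the cloud.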
Sao, Pedro Michael A. "Real-time Assessment, Prediction, and Scaffolding of Middle School Students’ Data Collection Skills within Physical Science Simulations". Digital WPI, 2013. https://digitalcommons.wpi.edu/etd-dissertations/168.
Adugna, Leykun, and Goran Laic. "Kan projekt med öppen källkod användas delvis eller helt för att uppfylla behoven för routing-applikationer?" Thesis, KTH, Medicinteknik och hälsosystem, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-272732.
Companies are looking to the open source community in the hope of finding a better alternative to their existing software suite: software that has the properties required to run their business and can help them avoid unnecessary costs and save time. This thesis examined companies' needs for routing applications and presented a recommendation based on a self-developed testbed, which companies can use to decide whether implementing a given routing application would be beneficial. The routing application that gave the best result in this study was FRRouting (Free Range Routing). The proposed solution proved effective in a pilot project in which the open source program retained the expected quality and functionality in a cost-effective way.
王珏琄. "The condition of applying open data for sustainable development by environmental groups in Taiwan". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/a4egv9.
Sayan, Bianca. "The Contribution of Open Frameworks to Life Cycle Assessment". Thesis, 2011. http://hdl.handle.net/10012/6336.
Sanchis, Huertas Ana. "Providing energy efficiency location-based strategies for buildings using linked open data". Master's thesis, 2012. http://hdl.handle.net/10362/8315.
Climate change has been a main concern for humanity since the end of the 20th century. To improve and care for our environment, a set of measures has been developed to monitor and manage buildings, reduce their consumption and raise their efficiency, including the integration of renewable energies and the implementation of passive measures such as improving the building envelope. Complex methodologies are used to achieve these objectives: different tools and data translations are needed, and a loss of accuracy from the detailed input information is usually unavoidable. Moreover, including these measures in the development of a project has become a trial-and-error process involving building characteristics, location data and energy efficiency measures. The rise of new technologies capable of dealing with location-based data and semantics, which relate and structure information in a machine-readable way, may allow us to provide a set of technical measures to improve energy efficiency in an accessible, open, understandable and easy way from a few data about location and building characteristics. This work defines a model and its necessary and sufficient set of data. Its application provides customized strategies acting as pre-feasibility constraints to help buildings achieve their energy efficiency objectives from their very conception. The model intends to be useful both for non-expert users who want to know about their energy-saving possibilities and for professionals seeking a sustainable starting point for their projects.
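The abstract leaves the model's rules unspecified; purely to illustrate the "pre-feasibility constraints" idea, here is a hypothetical rule set mapping a few building and location attributes to candidate strategies. Every key name and threshold below is invented for the sketch, not taken from the thesis.

```python
def suggest_strategies(building):
    """Return candidate energy-efficiency strategies for a building profile.

    `building` uses illustrative keys (annual_solar_hours, roof_area_m2,
    year_built, latitude); all thresholds are placeholders.
    """
    strategies = []
    if (building.get("annual_solar_hours", 0) > 2000
            and building.get("roof_area_m2", 0) > 20):
        strategies.append("photovoltaic panels")
    if building.get("year_built", 2020) < 1980:
        strategies.append("envelope insulation retrofit")
    if abs(building.get("latitude", 90)) < 40:
        strategies.append("external shading devices")
    return strategies

print(suggest_strategies({"annual_solar_hours": 2500, "roof_area_m2": 50,
                          "year_built": 1975, "latitude": 39.5}))
# → ['photovoltaic panels', 'envelope insulation retrofit', 'external shading devices']
```

In a linked-data setting, the attribute values would be pulled from open datasets keyed on the building's location rather than entered by hand.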
Marshall, Lucianne M. "Progression of marine phytoplankton blooms and environmental dynamics from sea-ice coverage to open waters in the coastal Arctic: comparing experimental data with continuous cabled observations". Thesis, 2018. https://dspace.library.uvic.ca//handle/1828/10131.
Chuang, Liang-Chieh, and 莊良傑. "Government Open Data Platform, Environment Security Data and Housing Prices". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/2q5s75.
元智大學 (Yuan Ze University), In-service Master Program in Management, academic year 106 (2017).
To adapt to the digital age and the global trend of citizen movements, the government began promoting its open data platform on the basis of the resolution of the Executive Yuan's 3322nd council meeting on November 8, 2012. According to the latest Global Open Data Index released by Open Knowledge International in 2017, Taiwan ranked first in 2015 and retained the title in 2016. With promotion by government units at all levels, roughly 35,000 datasets are currently available on the government open data platform. Based on four environmental safety datasets (traffic accident locations, car theft, motorcycle theft, and residential burglary), this study focused on areas with high registered actual transaction prices for real estate, divided Taoyuan into old and new districts, analyzed the characteristics of the four environmental factors, and generalized the relationships among them in order to offer the public new options when purchasing a house. The study closes with a number of suggestions, based on hands-on experience with the Government Open Data platform, intended to make its application more practical and convenient.
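One simple way to relate an environmental-safety factor to housing prices district by district is a plain Pearson correlation. The thesis does not state which statistic it used, and the per-district figures below are made up for the sketch.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-district figures: burglary counts vs. mean transaction price.
burglaries = [120, 95, 60, 30]
prices = [18.0, 21.5, 24.0, 29.5]
print(round(pearson(burglaries, prices), 3))   # strongly negative in this toy data
```

A negative coefficient here would simply say that, in these invented numbers, districts with more burglaries have lower prices; causal claims would need far more careful modeling.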
Tsou, Ya-Lun, and 鄒亞崙. "Using XML Technology on The Query in Open GIS Data Environment". Thesis, 2004. http://ndltd.ncl.edu.tw/handle/95229223047375234151.
國立成功大學 (National Cheng Kung University), Department of Surveying Engineering, master's/doctoral program, academic year 92 (2003/04).
Due to the various design on the format, software architecture, and operating procedures of different GIS software, their built-in functions are usually only applicable to the native data format designed. This inevitably imposes obstacles on the sharing, distribution and interoperability of geographic data. A typical scenario is the difficulty to issue a spatial query upon geographic data in different data formats because their corresponding query modules are often incompatible. Consequently it impedes us to query and acquire all available data in a distributed geographic data environment. The standardized description of geographic data provides a possibility that we no longer need to simultaneously manage data in various formats, and only need to concentrate on the standardized data format instead, regardless by whom data were created and what format they were created originally. Geographic Markup Language (GML), proposed by OpenGIS Consortium, has emerged as a strong candidate for the standard of geographic data description in recent years. This research intends to investigate the issue of querying data recorded in GML (Geography Markup Language) format and propose a feasible operation procedure from data access, filtering and representation. The core idea is to return the queried results in GML as well, but only containing those features fulfilling the constraints. The merits of this approach is we only need to handle data in GML format, and any GML viewer can be used to display queried results. The interaction of GML query module is therefore no different from those corresponding query modules in any current GIS software. We further introduce metadata in the querying process, as it may serve as an important reference for interpreting queried results, particularly when they come from different GML files. 
The spatial query module in this research is based on the OGC topological relationship model and is then extended to take human spatial cognition into consideration. Linking primitive topological relationships to natural-language-like spatial predicates reduces the training required for naïve users. With the development of the GML query module, differing geographic data formats are no longer a major obstacle in an OpenGIS data environment. We find it necessary to manage spatial entities of different dimensionality and different spatial data types simultaneously while processing a spatial query. Moreover, the primitive topological relationships corresponding to an individual spatial query may differ considerably depending on the spatial data types of the entities involved. When dealing with complex types of spatial entities, this deserves serious attention, as users may otherwise misinterpret the returned query results without ever noticing.
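The core idea above, filtering a GML feature collection and returning the result as GML containing only the features that satisfy the constraints, can be sketched as follows. This is a toy illustration, not the thesis's implementation: the element names (`FeatureCollection`, `featureMember`, `Road`, `lanes`) and the attribute query are invented for the example.

```python
# Minimal sketch: filter a (hypothetical) GML-like feature collection
# and return the result as GML again, keeping only matching features.
import xml.etree.ElementTree as ET

GML = """<FeatureCollection>
  <featureMember><Road><name>R1</name><lanes>4</lanes></Road></featureMember>
  <featureMember><Road><name>R2</name><lanes>2</lanes></Road></featureMember>
  <featureMember><Road><name>R3</name><lanes>6</lanes></Road></featureMember>
</FeatureCollection>"""

def query_gml(gml_text, predicate):
    """Return a new GML document containing only the feature members
    whose <Road> element satisfies `predicate`."""
    root = ET.fromstring(gml_text)
    result = ET.Element(root.tag)
    for member in root.findall("featureMember"):
        road = member.find("Road")
        if road is not None and predicate(road):
            result.append(member)
    return ET.tostring(result, encoding="unicode")

# Query: roads with at least 4 lanes; the result is itself GML.
wide = query_gml(GML, lambda r: int(r.findtext("lanes")) >= 4)
print([n.text for n in ET.fromstring(wide).iter("name")])  # ['R1', 'R3']
```

Because the result is again GML, any GML viewer could display it, which is the interoperability benefit the abstract describes.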
Teng, Yueh-chuan and 鄧岳荃. "Map Interface Content Interoperability in Geospatial SOA Environment with Open Geographic Data". Thesis, 2007. http://ndltd.ncl.edu.tw/handle/82459092928047031340.
National Cheng Kung University
Department of Geomatics (Master's and Doctoral Program)
95
Full accessibility and correct use of distributed geospatial resources are two critical issues in recent GIS development. With the rapid progress of geospatial SOA, open geospatial data formats and web services have largely removed the obstacles to data acquisition. The remaining challenges are how to develop a middleware environment that effectively integrates heterogeneous geospatial resources, exploits the chaining capability of geospatial web services, and embeds professional geospatial knowledge. Map interface operations in a middleware environment were chosen as the major topic of this research. Besides taking full advantage of the accessibility of heterogeneous geographic data via web services, we aim to further improve map interface display and application through cartographic knowledge built into the middleware. To achieve better interoperability of heterogeneous data, a general-purpose data description framework based on the fundamental characteristics of geographic data is proposed. Complying with the ISO/TC 211 19100 series of international standards, the framework enables all distributed geospatial features to automatically carry common and necessary descriptive information, so the middleware can interpret acquired data content in a standardized way and ensure the correct use of map operations. Serving as a common description framework, it can be applied to any application domain and extended whenever necessary. We further established a primitive geospatial SOA following various OGC standards (WMS, WFS, WCTS, OpenLS, and Catalogue Service) that allows the middleware to collect and process the required data via loose coupling of web services.
Based on the proposed description framework and built-in cartographic knowledge, the developed middleware meets the demands of correctly displaying and operating on heterogeneous data in the map interface, while preventing naïve users from misusing the data. It is clear that middleware will play a dominant role in bridging the gap between users and data providers in future GIS environments. Although this research focuses only on the common characteristics of geographic data, the proposed middleware environment is flexible enough to further improve the integration of heterogeneous data by including additional domain-specific knowledge.
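To make the loose coupling concrete, the sketch below assembles a standard OGC WMS GetMap request URL, the kind of call such a middleware would issue to a remote map server. The endpoint and layer names are hypothetical; the parameter names follow the WMS 1.1.1 specification.

```python
# Sketch: building an OGC WMS 1.1.1 GetMap request URL, as a middleware
# might do when fetching map content from a loosely coupled web service.
from urllib.parse import urlencode

def getmap_url(endpoint, layers, bbox, size=(800, 600), srs="EPSG:4326"):
    """Compose a GetMap URL. `bbox` is (minx, miny, maxx, maxy) in `srs`."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical server and layers, bounding box around Tainan.
url = getmap_url("http://example.org/wms", ["roads", "rivers"],
                 (120.0, 22.9, 120.4, 23.2))
print(url)
```

The same pattern generalizes to the other OGC services the abstract lists (WFS, WCTS, OpenLS, Catalogue Service), each with its own request vocabulary.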
YANG, SHUN-WEN and 楊舜文. "An Implementation of ETC Open Data Visualization and Traffic Analysis Using ELK Stack Environment". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/6uuze2.
Tunghai University
Department of Computer Science and Information Engineering
106
As the government actively promotes Open Government Data, its open data platform provides a wide variety of datasets; among the most widely used, and most closely tied to daily life, are weather and air-quality information. After de-identification, data from the Traffic Data Collection System (TDCS) is also published on this platform for everyone to use. TDCS records millions of vehicle trips per day, a volume that traditional analysis tables can no longer present effectively and quickly. This thesis builds an analysis system from the ELK Stack, a combination of three open source tools, to perform real-time analysis and statistics on the open TDCS data. Through the system's visualization charts, users can quickly understand current speed conditions and analyze traffic flow and departure-traffic statistics. The system fetches open data with a Linux shell script, reads the cleaned data via Logstash, uses Logstash filters to categorize it, exports it to the Elasticsearch database with indexing, and finally displays the analysis results in Kibana. This system overcomes the row limits of traditional pivot analysis tables: a search-and-aggregation request over 500 million records completes in about 0.3 seconds, and in a simple query test Elasticsearch was more than twice as fast as a non-indexed MariaDB.
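The kind of statistic the Kibana dashboards present, such as average speed per gantry-to-gantry segment, can be sketched in a few lines. The record layout and field names below are invented for illustration, not taken from the actual TDCS schema.

```python
# Sketch: aggregating average speed per road segment from cleaned
# (hypothetical) TDCS-style records, the kind of rollup Elasticsearch
# aggregations and Kibana charts would present.
from collections import defaultdict

records = [
    {"segment": "01F0005N-01F0017N", "speed": 92},
    {"segment": "01F0005N-01F0017N", "speed": 88},
    {"segment": "01F0017N-01F0029N", "speed": 70},
]

def average_speed(rows):
    """Map each segment to its mean recorded speed (km/h)."""
    sums = defaultdict(lambda: [0, 0])  # segment -> [total, count]
    for r in rows:
        acc = sums[r["segment"]]
        acc[0] += r["speed"]
        acc[1] += 1
    return {seg: total / count for seg, (total, count) in sums.items()}

print(average_speed(records))
# {'01F0005N-01F0017N': 90.0, '01F0017N-01F0029N': 70.0}
```

In the deployed system this rollup would be delegated to an Elasticsearch terms-plus-average aggregation over the indexed data rather than computed in application code.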
Ho, Jheng-Ying and 何政穎. "Integrating Internet of Thing and Open Data into Hadoop Cloud Computing Based on CloudStack Virtual Environment". Thesis, 2015. http://ndltd.ncl.edu.tw/handle/2zjd4f.
National Formosa University
Institute of Computer Science and Information Engineering
103
This study proposes a Mashup technology based on a virtualization cloud computing framework (MTVCCF) that integrates the Internet of Things, web services, and open data as the framework for the front-end system. CloudStack and Hadoop were employed to construct a virtualized cloud computing environment as the core of the back-end system: Hadoop cloud computing is used to resolve big data problems, whereas CloudStack is employed to develop, manage, and configure the basic services for cloud computing. A kernel-based virtual machine (KVM) was applied to improve the extensibility of the cloud servers, even out differences in utilization, and reduce the risk of server crashes. To verify the feasibility of MTVCCF, an elderly care cloud platform (ECCP) was developed to measure the physiological signals of elderly people. The platform integrates the Near Field Communication (NFC) protocol, Bluetooth, an electronic sphygmomanometer, and a wireless network into an Internet of Things framework that enables communication among objects, transmitting the generated data through a web service to the back-end system, where the data are computed and stored. Empirically, the big data generated from the ECCP was subjected to stress testing, which verified that the proposed MTVCCF can resolve the aforementioned problems of big data and Internet usage capacity.
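The front-end-to-back-end hand-off described above can be illustrated by packaging one sphygmomanometer reading as the JSON payload an IoT node might POST to the back-end web service. All field names here are invented for the example; the thesis does not specify its payload schema.

```python
# Sketch: a (hypothetical) physiological-reading payload for the
# web-service hand-off from the IoT front end to the back-end system.
import json

def make_payload(patient_id, systolic, diastolic, pulse):
    """Serialize one blood-pressure reading as a JSON string."""
    reading = {
        "patient_id": patient_id,   # e.g. identified via an NFC tag
        "systolic": systolic,       # mmHg
        "diastolic": diastolic,     # mmHg
        "pulse": pulse,             # beats per minute
    }
    return json.dumps(reading)

payload = make_payload("NFC-0042", 128, 82, 71)
print(payload)
```

On the back end, such payloads would accumulate in the Hadoop store, where batch computation over the full history is performed.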
SUNG, YI-HSUN and 宋羿勳. "An Analysis of the Production Indicators of Society, Economy and Environment with Data Mining - Taking Taiwan's Open Data as an Example". Thesis, 2017. http://ndltd.ncl.edu.tw/handle/574dg2.
Tunghai University
Department of Information Management
105
In an age when the concepts of Open Government and Open Data are prevalent, Open Government Data (OGD) provides a platform for information sharing that enables the public to access governmental data. Using the slacks-based measure of Data Envelopment Analysis (DEA-SBM), this paper classifies the data available on the OGD platform for the period 2013-2015 into three categories, identifies input and output factors, and formulates indicators to measure the efficiency of the government's economic, social, and environmental policies. The target variables and output indicators are identified with DEA-SBM and a decision-tree model, respectively. With these results, the government can strike a balance among economic growth, social development, and environmental sustainability so as to enhance governance efficiency. The 13 indicator variables representing the interaction among economic, social, and environmental factors help examine the social and environmental implications of Taiwan's economic development from 2013 to 2015.
Richter, Stefan [Verfasser]. "World libraries : towards efficiently sharing large data volumes in open untrusted environments while preserving privacy / vorgelegt von Stefan Richter". 2009. http://d-nb.info/1000139689/34.
Aboualizadehbehbahani, Maziar. "Proposing a New System Architecture for Next Generation Learning Environment". Thesis, 2016. http://hdl.handle.net/1805/10289.
The emergence of information exchange and the practice of offering features through external interfaces is a vast but immensely valuable challenge, and learning environments are no exception. Many different service providers now compete in the learning systems market, each with its own strengths; on that premise, even large learning management systems cooperate with one another to stay competitive. For instance, Instructure is a substantial company that could easily staff a dedicated team to develop video conferencing functionality, but it chooses to use an open source alternative instead: BigBlueButton. Unfortunately, learning system manufacturers use different technologies for various reasons, making integration that much harder. Standards in learning environments have emerged to resolve problems in exchanging information and in providing and consuming functionality externally, while minimizing the effort needed to integrate systems. Beyond defining and simplifying these standards, careful consideration is essential when designing new, comprehensive, and useful systems, as well as when adding interoperability to existing systems; all of these concerns are addressed in this research. In this research I review most of the standards and protocols for integration in learning environments and propose a revised approach to app stores in learning environments. Finally, as a case study, a learning tool was developed that exposes essential functionalities of a social educational learning management system integrated with other learning management systems. This tool supports the dominant and most popular interoperability standards and can be added to a learning management system within seconds.
Anderson, Winston Noël. "Investigating the universality of a semantic web-upper ontology in the context of the African languages". Diss., 2016. http://hdl.handle.net/10500/21898.
Computing
M. Sc. (Computer Science)