Dissertations / Theses on the topic 'Smart data management'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Smart data management.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

UGLIOTTI, FRANCESCA MARIA. "BIM and Facility Management for smart data management and visualization." Doctoral thesis, Politecnico di Torino, 2017. http://hdl.handle.net/11583/2696432.

Full text
Abstract:
BIM is for all buildings. Recognised as a disruptive technology, BIM completely changes the traditional way of working of the construction industry, starting from the design stage. In this scenario, the most interesting challenge is to establish a framework that brings together methods and tools for the building lifecycle, focusing on the management of existing buildings. The Smart City paradigm also translates into the availability of smart data, and therefore into the intelligent use of real estate information. The proactive involvement of Facility Management in the building process is the key to ensuring the availability of a proper dataset of information, supporting the idea of a BIM-based knowledge management system. In line with this approach, a BIM-based management process is achievable through an overall re-engineering of the supply chain, guaranteeing BIM effectiveness and providing Facility 4.0 smart services.
APA, Harvard, Vancouver, ISO, and other styles
2

Fares, Tony Yussef. "Digital rights management for smart containment objects." Access electronically, 2005. http://www.library.uow.edu.au/adt-NWU/public/adt-NWU20060511.151012/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Moreira, Helder. "Sensor data integration and management of smart environments." Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/17884.

Full text
Abstract:
Master's in Computer and Telematics Engineering
In a world of constant technological development and accelerated population growth, an increased use of energy resources is being observed. With buildings responsible for a large share of this energy consumption, many research activities are pursued with the goal of creating energy-efficient buildings and smart spaces. This dissertation aims, in a first stage, to present a review of the current solutions combining Building Automation Systems (BAS) and the Internet of Things (IoT). Then, a solution for building automation is presented, based on IoT principles and exploiting the advantages of Complex Event Processing (CEP) systems to provide higher integration of the multiple building subsystems. This solution was validated through an implementation based on standard lightweight protocols designed for the IoT, high-performance real-time platforms, and complex methods for the analysis of large data streams. The implementation is also applied to a real-world scenario and will be used as the standard solution for the management and automation of an existing building.
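As a purely illustrative sketch of the kind of integration the abstract describes (a lightweight IoT protocol feeding a simple event rule), the following Python snippet subscribes to sensor topics over MQTT and raises a toy "complex event"; the broker address, topic names and threshold are invented for this example and are not taken from the dissertation.

```python
# Illustrative sketch only: a minimal MQTT subscriber with a toy
# complex-event rule (sustained high temperature). Broker, topics and
# threshold are hypothetical examples, not taken from the thesis.
import json
from collections import deque

import paho.mqtt.client as mqtt

WINDOW = deque(maxlen=5)          # last 5 temperature readings
THRESHOLD_C = 28.0                # hypothetical comfort limit

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)          # e.g. {"temp": 29.3}
    WINDOW.append(reading["temp"])
    # "Complex event": every reading in the window exceeds the threshold.
    if len(WINDOW) == WINDOW.maxlen and all(t > THRESHOLD_C for t in WINDOW):
        client.publish("building/events", json.dumps({"event": "overheating"}))

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.org")           # hypothetical broker
client.subscribe("building/+/temperature")
client.loop_forever()
```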
APA, Harvard, Vancouver, ISO, and other styles
4

DEL, GIUDICE MATTEO. "Smart data management with BIM for Architectural Heritage." Doctoral thesis, Politecnico di Torino, 2016. http://hdl.handle.net/11583/2652020.

Full text
Abstract:
In recent years the smart buildings topic has received much attention, as have Building Information Modelling (BIM) and interoperability as independent fields. Linking these topics is an essential research target to help designers and stakeholders run processes more efficiently. Working on a smart building requires the use of Information and Communication Technology (ICT) to optimize design, construction and management. In these terms, several technologies such as sensors for remote monitoring and control, building equipment, management software, etc. are available on the market. As BIM provides an enormous amount of information in its database and is theoretically able to work with all kinds of data sources through interoperability, it is essential to define standards for both data content and exchange formats. In this respect, one possibility for aligning research activity with Horizon 2020 is the investigation of energy saving using ICT. Unfortunately, comparing the Architecture, Engineering and Construction (AEC) industry with other sectors makes it clear that advanced information technology applications have not yet been adopted in the building field. In recent years, however, the adoption of new methods for data management has been investigated by many researchers. Based on the above considerations, the main purpose of this thesis is to investigate the use of the BIM methodology for existing buildings, concerning three main topics:
• Smart data management for architectural heritage preservation;
• District data management for energy reduction;
• The maintenance of high-rises.
For these reasons, data management acquires a very important value in relation to the optimization of the building process, and it is considered the most important goal of this research. Taking into account different kinds of architectural heritage, attention is focused on existing and historical buildings, which are usually characterized by several constraints. Starting from data collection, a BIM model was developed and customized as a function of its objectives, providing information for different simulation tests. Finally, data visualization was investigated through Virtual Reality (VR) and Augmented Reality (AR). Certainly, the creation of a 3D parametric model implies that data is organized according to the use of the individual users involved in the building process. This means that each 3D model can be developed with different Levels of Detail/Development (LODs) based on the goal of the data source. Throughout this thesis the importance of LODs is considered in relation to the kind of information filled into a BIM model. In fact, based on the objectives of each project, a BIM model can be developed in different ways to facilitate data querying for the simulation tests. The three topics were compared considering each step of the building process workflow, highlighting the main differences and evaluating the strengths and weaknesses of the BIM methodology. In these terms, the importance of setting a BIM template before the modelling step was pointed out, because it provides the possibility to manage information so that it can be collected and extracted for different purposes and by specific users. Moreover, based on the results obtained in terms of the 3D parametric model and in terms of process, a proper BIM maturity level was determined for each topic. Finally, the value of interoperability emerged from these tests, considering that it provided the opportunity to develop a framework for collaboration involving all parties of the building industry.
APA, Harvard, Vancouver, ISO, and other styles
5

Simonet, Anthony. "Active Data - Enabling Smart Data Life Cycle Management for Large Distributed Scientific Data Sets." Thesis, Lyon, École normale supérieure, 2015. http://www.theses.fr/2015ENSL1004/document.

Full text
Abstract:
In all domains, scientific progress relies more and more on our ability to exploit ever-growing volumes of data. However, as data volumes increase, their management becomes more difficult. A key point is to deal with the complexity of data life cycle management, i.e. all the operations that happen to data between their creation and their deletion: transfer, archiving, replication, disposal, etc. These formerly straightforward operations become intractable when data volume grows dramatically, because of the heterogeneity of data management software on the one hand, and the complexity of the infrastructures involved on the other. In this thesis, we introduce Active Data, a meta-model, an implementation and a programming model that make it possible to represent formally and graphically the life cycle of data distributed across an assemblage of heterogeneous systems and infrastructures, naturally exposing replication, distribution and the different data identifiers. Once connected to existing applications, Active Data exposes the progress of data through their life cycle at runtime to users and programs, while keeping track of them as they pass from one system to another. The Active Data programming model allows code to be executed at each step of the data life cycle. Programs developed with Active Data have access at any time to the complete state of the data in every system and infrastructure across which they are distributed. We present micro-benchmarks and usage scenarios that demonstrate the expressivity of the programming model and the quality of the implementation. Finally, we describe the implementation of a data surveillance framework based on Active Data for the Advanced Photon Source experiment that allows scientists to monitor the progress of their data, automate most manual tasks, get relevant notifications out of a huge mass of events, and detect and recover from errors without human intervention. This work opens interesting perspectives, in particular in data provenance and open data, while facilitating collaboration between scientists from different communities.
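To make the notion of "executing code at each step of the data life cycle" more concrete, here is a minimal, hypothetical Python sketch of a handler registry in that spirit; the class and method names are invented for illustration and do not reproduce the actual Active Data API.

```python
# Hypothetical illustration of life-cycle handlers, in the spirit of the
# programming model described above; this is NOT the real Active Data API.
from collections import defaultdict

class LifeCycle:
    """Tracks a data item's state and runs user code on each transition."""
    def __init__(self, item_id, state="created"):
        self.item_id, self.state = item_id, state
        self._handlers = defaultdict(list)

    def on(self, state, handler):
        # Register code to run when the item enters `state`.
        self._handlers[state].append(handler)

    def transition(self, new_state, **info):
        self.state = new_state
        for handler in self._handlers[new_state]:
            handler(self.item_id, **info)

lc = LifeCycle("dataset-42")
lc.on("transferred", lambda i, **kw: print(f"{i} copied to {kw['site']}"))
lc.on("archived", lambda i, **kw: print(f"{i} archived"))
lc.transition("transferred", site="remote-storage")
lc.transition("archived")
```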
APA, Harvard, Vancouver, ISO, and other styles
6

Christiansen, Filip, and Matilda Tranell. "Data Management and Business Opportunities in Emerging Smart Metering Market." Thesis, KTH, Energiteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-206975.

Full text
Abstract:
Major changes in the energy systems throughout Europe have resulted in the implementation of new technologies such as smart grids and smart meters, enabling a two-way flow of information and electricity. This results in large volumes of metering data which need to be efficiently managed for market and grid operational purposes. In addition, a new market has emerged for third parties seeking to enhance and convert data into valuable information. Current data management strategies vary between countries, resulting in a great diversity of data management models. To reach consensus, the European Commission has developed three theoretical reference models intended to cover all possible options. For the success of third parties, it is important to understand the rather complex mechanisms of these reference models. This can ease the process of recognizing the data management model implemented on a given market, as well as the interaction with related obstacles or barriers, in order to determine business opportunities. This report aims to present market conditions for third-party actors in two European countries that have implemented different data management models. The Netherlands and Great Britain are selected based on certain conditions. Using existing theory on the reference models, the actual models are defined in each country, and key barriers are identified. The report then studies how appropriate the implemented models are in relation to the barriers; the two countries therefore also serve as case studies for evaluating the applicability of the reference models. In the Netherlands, case 1 of the reference models is identified by definition, although a transition towards case 2 can be observed. The major barrier consists of privacy concerns, although customer engagement is becoming a central focus. In relation to these issues, targeted regulations seem to have a more positive impact than the implemented model. The Dutch market is evolving and it is shown that customers are open to new innovative services, although the intent to purchase such services is low. A central point of access to data facilitates efficient data management, however this only includes data with a 15-minute frequency; data with a 10-second update interval can currently be accessed only via a physical smart meter port. In Great Britain, parts of both reference models 2 and 3 are implemented and the main barrier is currently customer engagement. The model has been developed with a strong emphasis on earlier privacy concerns, but it has the potential to also address customer engagement by supporting innovation and new services. However, earlier restrictive regulations only allowed certain feedback services, i.e. In-Home Displays, to be offered to customers. As of 2015, other options are allowed, which opens up a promising market for third-party actors. Data can be accessed either centrally, with half-hourly updates, or via so-called Consumer Access Devices providing data with updates every 10 seconds. A gap between the theoretical models and reality is observed; theoretical benefits are not always evident in practice. It is also observed that all possible data flows are not always properly described or included in data management model mappings. Therefore, it is important for third parties to look beyond such mappings to understand the access to the data that fits their purpose. Finally, privacy concerns can be eased through increased customer awareness and empowerment, which is also related to the receptivity to innovations among customers.
APA, Harvard, Vancouver, ISO, and other styles
7

Masilela, Mbonisi. "Supporting Data-Intensive Wireless Sensor Applications using Smart Data Fragmentation and Buffer Management." VCU Scholars Compass, 2007. http://scholarscompass.vcu.edu/etd/779.

Full text
Abstract:
Recent advances in low-power device technology have led to the development of smaller, more powerful sensors geared for use in Wireless Sensor Networks. Some of these sensors are capable of producing large data packets in a single reading. This becomes a challenging problem given the constraints imposed by current MAC and Transport Layer implementations, since a single data packet can exceed the MTU of the protocol stack. Little has been done to address this issue in Wireless Sensor Networks. This paper proposes a novel solution: a Lightweight Data Transportation Protocol that uses smart data fragmentation and efficient pipelined transmission and buffer management schemes. The methodology outlined in this paper ensures that data is successfully transmitted from source to destination with minimal delay or packet loss.
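As a generic illustration of the fragmentation idea (splitting a reading that exceeds the MTU into sequenced fragments the sink can reassemble), the following sketch assumes an invented header layout and MTU; it is not the protocol specified in the thesis.

```python
# Generic fragmentation sketch: split a payload into MTU-sized fragments
# with a tiny sequence header so the sink can reassemble them in order.
# Header layout and sizes are illustrative assumptions only.
import struct

MTU = 96                          # assumed usable payload per frame, in bytes
HEADER = struct.Struct("!HHB")    # reading id, fragment index, last-fragment flag

def fragment(reading_id: int, payload: bytes):
    chunk = MTU - HEADER.size
    pieces = [payload[i:i + chunk] for i in range(0, len(payload), chunk)]
    return [HEADER.pack(reading_id, idx, idx == len(pieces) - 1) + piece
            for idx, piece in enumerate(pieces)]

def reassemble(frames):
    frames = sorted(frames, key=lambda f: HEADER.unpack(f[:HEADER.size])[1])
    return b"".join(f[HEADER.size:] for f in frames)

data = bytes(500)                 # a 500-byte sensor reading
frames = fragment(7, data)
assert reassemble(frames) == data
```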
APA, Harvard, Vancouver, ISO, and other styles
8

Sinaeepourfard, Amir. "Hierarchical distributed fog-to-cloud data management in smart cities." Doctoral thesis, Universitat Politècnica de Catalunya, 2017. http://hdl.handle.net/10803/461740.

Full text
Abstract:
There is a vast amount of data being generated every day in the world, with different formats, quality levels, etc. This new data, together with archived historical data, constitutes the seed for future knowledge discovery and value generation in several fields of science and big data environments. Discovering value from data is a complex computing process where data is the key resource, not only during its processing but also during its entire life cycle. However, there is still great concern about how to organize and manage this data in all fields for efficient usage and exploitation during the entire data life cycle. Although several specific Data LifeCycle (DLC) models have recently been defined for particular scenarios, we argue that there is no global and comprehensive DLC framework that can be widely used across different fields. In one particular scenario, smart cities are the current technological solution to handle the challenges and complexity of growing urban density. Traditionally, smart city resource management relies on cloud-based solutions in which sensor data are collected to provide a centralized and rich set of open data. The advantages of cloud-based frameworks are their ubiquity and an (almost) unlimited resource capacity. However, accessing data from the cloud implies heavy network traffic and high latencies that are usually not appropriate for real-time or critical solutions, as well as higher security risks. Alternatively, fog computing emerges as a promising technology to absorb these inconveniences. It proposes the use of devices at the edge to provide closer computing facilities, thereby reducing network traffic and latencies drastically while improving security. We have defined a new framework for data management in the context of a smart city through a global fog-to-cloud resource management architecture. This model has the advantages of both fog and cloud technologies, as it allows reduced latencies for critical applications while still being able to use the high computing capabilities of cloud technology. In this thesis, we propose several novel ideas in the design of an F2C data management architecture for smart cities, as follows. First, we draw and describe a comprehensive, scenario-agnostic Data LifeCycle model that successfully addresses all challenges included in the 6Vs; it is not tailored to any specific environment but is easy to adapt to the requirements of any particular field. Then, we introduce the Smart City Comprehensive Data LifeCycle model, a data management architecture generated from the comprehensive scenario-agnostic model and tailored to the particular scenario of smart cities. We define the management of each data life phase and explain its implementation in a smart city with Fog-to-Cloud (F2C) resource management. We then illustrate a novel architecture for data management in the context of a smart city through a global fog-to-cloud resource management architecture, and show that this model has the advantages of both fog and cloud, as it allows reduced latencies for critical applications while being able to use the high computing capabilities of cloud technology. As a first experiment for the F2C data management architecture, a real smart city is analyzed, corresponding to the city of Barcelona, with special emphasis on the layers responsible for collecting the data generated by the deployed sensors. The amount of daily sensor data transmitted through the network has been estimated, and a rough projection has been made assuming an exhaustive deployment that fully covers the whole city. We provide solutions to both reduce the data transmission and improve the data management, and we use data filtering techniques (including data aggregation and data compression) to estimate the network traffic in this model during data collection and compare it with a traditional real system. Finally, we estimate the total data storage sizes in the F2C scenario for the Barcelona smart city.
APA, Harvard, Vancouver, ISO, and other styles
9

Finotto, Gianluca <1988>. "Smart Data: un nuovo asset intangibile a supporto del management." Master's Degree Thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/8802.

Full text
Abstract:
The refinement of technologies has led to exponential growth in devices capable of automating numerous operations, both in production settings and in private life. Moreover, the economic development of developing countries has enabled a rapid and widespread diffusion of the Internet throughout the world. Big Data therefore originates from the quantity of digital data currently available, generated ever more automatically and rapidly by individuals in private contexts, in the physical environment or in companies (through smartphones, magnetic cards, sensors, GPS, etc.), by things (cars, goods in transit, etc.) and by events (weather, aircraft landings, financial payments, the malfunctioning of a vending machine, etc.). Gathering and properly analysing these zettabytes of structured and unstructured data is fundamental to supporting decision-making. However, volume is not always directly proportional to quality: for Big Data not to be an end in themselves but to become "smart", i.e. to produce added value for organisations, a data-driven approach must spread within both public and private organisations, that is, a genuine data management culture, so as to direct future investments towards adequate infrastructures and technical and managerial knowledge.
APA, Harvard, Vancouver, ISO, and other styles
10

Stivanello, Alice <1993>. "Strategic Management over Data Privacy and Cyber Security Risk in Smart City and Smart Home." Master's Degree Thesis, Università Ca' Foscari Venezia, 2018. http://hdl.handle.net/10579/12673.

Full text
Abstract:
The growth of the world population combined with unprecedented levels of urban density is posing serious challenges for the future of our cities, which demand efficient, effective, and sustainable management of urban infrastructures and resource consumption. Through the integration of information and communication technologies (ICT), the smart city is identified as a 'system of systems' created to process real-time information exchange at large scale and consequently deliver a better quality of life to its citizens. Grounded in learning capability and cross-domain interoperability, the embedded Internet of Things (IoT) infrastructure represents a high-value attack platform, and thus its adoption should be carefully weighed against the cyber risk exposure. The main objective of this research is to explore the inner workings of such a complex ecosystem and understand the criticalities of its cyber-security requirements. Since the smart home market represents a fundamental component of a smart city and the most promising application of IoT technology, an accurate investigation is carried out. Defining the smart home as an intertwined, advanced automated system which provides the inhabitants with remote access and centralized control over the building's functions, the role played by the advancement of IoT technology is crucial. A multi-layer architectural model is presented in order to grasp the logical conditions underlying these intelligence-driven networks. Installed under the guise of customer service, surveillance facilities and remote monitoring are responsible for the potential abuse of the data retrieved and thus for the failure of safety and security solutions. In response, a cyber-physical vulnerability assessment is conducted and evaluated within a threat-based defence approach. The scope of this thesis is the identification and formulation of a safe and secure human-machine space, associating proper countermeasures to prevent data leakages and mitigate damages. Although this analysis tries to be exhaustive in all its parts, the major focus is on cyber-security concerns, as they represent a significant barrier to smart system adoption which all stakeholders should take seriously. Neglecting current cyber-security vulnerabilities and underestimating the impact of a cyber intrusion may result in cascading disasters across the entire smart industry.
APA, Harvard, Vancouver, ISO, and other styles
11

Zhang, Xin. "Secure Data Management and Transmission Infrastructure for the Future Smart Grid." Thesis, The University of Sydney, 2016. http://hdl.handle.net/2123/14657.

Full text
Abstract:
The power grid has played a crucial role since its inception in the Industrial Age. It has evolved from a wide-area network supplying energy to multiple incorporated areas into the largest cyber-physical system. Its security and reliability are crucial to any country's economy and stability [1]. With the emergence of new technologies and the growing pressure of global warming, the aging power grid can no longer meet the requirements of modern industry, which has led to the proposal of the 'smart grid'. In the smart grid, both electricity and control information are communicated across a massively distributed power network, and it is essential for the smart grid to deliver real-time data over a communication network. By using smart meters, the Advanced Metering Infrastructure (AMI) can measure energy consumption, monitor loads, collect data and forward information to collectors. The smart grid is an intelligent network consisting of many technologies not only in power but also in information, telecommunications and control. The best-known structure of the smart grid is the three-layer structure. It divides the smart grid into three different layers, each with its own duty; all three layers work together, providing a smart grid that monitors and optimizes the operations of all functional units from power generation to the end customers [2]. To enhance the security level of the future smart grid, deploying a highly secure data transmission scheme on critical nodes is an effective and practical approach. A critical node is a communication node in a cyber-physical network which can be developed to meet certain requirements. It also has firewalls and intrusion detection capability, so it is useful for a time-critical network system; in other words, it is suitable for the future smart grid. The deployment of such a scheme can be tricky with respect to different network topologies. A simple and general approach is to install it on every node in the network, that is to say, to treat all nodes in the network as critical nodes, but this takes time, energy and money and is obviously not the best way to proceed. Thus, we propose a multi-objective evolutionary algorithm for the search for critical nodes. A new scheme should be proposed for the smart grid. Also, optimal planning in the power grid for embedding a large system can effectively ensure that every power station and substation operates safely and that anomalies are detected in time. Using such a new method is a reliable way to meet increasing security challenges. The evolutionary framework helps to find an optimum without calculating the gradient of the objective function, while decomposition is useful for exploring solutions evenly in the decision space. Furthermore, constraint-handling techniques can place critical nodes in optimal locations so as to enhance system security even under several constraints of limited resources and/or hardware. The high-quality experimental results have validated the efficiency and applicability of the proposed approach. There is good reason to believe that the new algorithm has promising scope for real-world multi-objective optimization problems extracted from the power grid security domain. In this thesis, a cloud-based information infrastructure is proposed to deal with the big data storage and computation problems of the future smart grid, some challenges and limitations are addressed, and a new secure data management and transmission strategy addressing the increasing security challenges of the future smart grid is given as well.
APA, Harvard, Vancouver, ISO, and other styles
12

Shi, Heng. "Uncertainty analysis and application on smart homes and smart grids : big data approaches." Thesis, University of Bath, 2018. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.760978.

Full text
Abstract:
Methods for uncertainty quantification (UQ) and mitigation in the electrical power system are very basic: the Monte Carlo (MC) method and its meta-methods are generally deployed in most applications, owing to their simplicity and ease of generalisation. They are adequate for a traditional power system in which the load is predictable and generation is controllable. However, the large penetration of low-carbon technologies, such as solar panels, electric vehicles, and energy storage, has necessitated more comprehensive approaches to uncertainty: these technologies introduce new sources of uncertainty with larger volume and more diverse characteristics, and understanding the sources and consequences of uncertainty becomes a highly complex issue. Traditional methods assume that a given system has a single uncertainty characteristic, and hence treat the uncertainty of the system as a single component in applications. However, this view is no longer applicable in the new context, as it neglects important underlying information associated with individual uncertainty components. Therefore, this thesis aims at: i) systematically developing UQ methodologies to identify, discriminate, and quantify different uncertainty components (forward UQ), and, critically, to model and trace the associated sources independently (inverse UQ), in order to deliver new uncertainty information such as how uncertainty components are generated from their sources, how they correlate with each other, and how they propagate through system aggregation; ii) applying this new uncertainty information to further improve a range of fundamental power system applications, from Load Forecasting (LF) to Energy Management Systems (EMS). In the EMS application, the proposed forward UQ methods enable the development of a decentralised system that is able to tap into the new uncertainty information concerning the correlations between load patterns across individual households, the characteristics of uncertainty components and their propagation through aggregation. The decentralised EMS was able to achieve peak and uncertainty reductions of 18% and 45%, respectively, at the grid level. In the LF application, this thesis developed inverse UQ through a deep learning model to directly build the connection between uncertainty components and their corresponding sources. For load forecasting on expectation (point LF) and on probability (probabilistic LF), it achieved 20%/12% performance improvements compared to the state of the art, such as Support Vector Regression (SVR), Autoregressive Integrated Moving Average (ARIMA), and Multiple Linear Quantile Regression (MLQR).
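For readers unfamiliar with the Monte Carlo baseline the thesis sets out to improve on, the following toy sketch quantifies uncertainty in an aggregated household load; the per-household load model and its parameters are invented for illustration.

```python
# Toy Monte Carlo uncertainty quantification for an aggregated load.
# The per-household load model and its parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_households, n_samples = 200, 10_000

# Assumed model: each household draws a demand around 0.8 kW with 0.3 kW spread.
samples = rng.normal(loc=0.8, scale=0.3, size=(n_samples, n_households)).clip(min=0)
aggregate = samples.sum(axis=1)                    # kW at the feeder

mean = aggregate.mean()
p5, p95 = np.percentile(aggregate, [5, 95])
print(f"aggregate demand: {mean:.1f} kW, 90% interval [{p5:.1f}, {p95:.1f}] kW")
```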
APA, Harvard, Vancouver, ISO, and other styles
13

Halvorsen, Anne (Anne Fire). "Improving transit demand management with Smart Card data : general framework and applications." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/99543.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, 2015.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 169-174).
Increases in ridership are outpacing capacity expansions in a number of transit systems. By shifting their focus to demand management, agencies can instead influence how customers use the system, getting more out of the capacity they already have. However, while demand management is well researched for personal vehicle use, its applications for public transportation are still emerging. This thesis explores the strategies transit agencies can use to reduce overcrowding, with a particular focus on how automatically collected fare data can support the design and evaluation of these measures. A framework for developing demand management policies is introduced to help guide agencies through this process. It includes establishing motivations for the program, aspects to consider in its design, as well as dimensions and metrics to evaluate its impacts. Additional considerations for updating a policy are also discussed, as are the possible data sources and methods for supporting analysis. This framework was applied to a fare incentive strategy implemented at Hong Kong's MTR system. In addition to establishing existing congestion patterns, a customer classification analysis was performed to understand the typical travel patterns among MTR users. These results were used to evaluate the promotion at three levels of customer aggregation: all users, user groups, and a panel of high-frequency travelers. The incentive was found to have small but non-negligible impacts on morning travel, particularly at the beginning of the peak hour and among users with commuter-like behavior. Through a change point analysis, it was possible to identify the panel members that responded to the promotion and quantify factors that influenced their decision using a discrete choice model. The findings of these analyses are used to recommend potential improvements to MTR's current scheme.
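As a simplified illustration of the change point idea mentioned above (not the specific model used in the thesis), one can scan a traveller's weekly peak-hour trip counts for the split that best separates two mean levels:

```python
# Simplified change point sketch: find the week at which a traveller's
# peak-hour trip counts shift in mean. The data below are made up.
import numpy as np

trips = np.array([9, 10, 9, 11, 10, 9, 10,    # before the promotion
                  6, 7, 5, 6, 7, 6, 6])        # after shifting off-peak

def best_split(x):
    # Cost of a split = within-segment sum of squared deviations on each side.
    costs = [((x[:k] - x[:k].mean()) ** 2).sum() + ((x[k:] - x[k:].mean()) ** 2).sum()
             for k in range(1, len(x))]
    return int(np.argmin(costs)) + 1            # index of the first "after" week

k = best_split(trips)
print(f"change detected at week {k}: {trips[:k].mean():.1f} -> {trips[k:].mean():.1f} trips")
```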
by Anne Halvorsen.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
14

Rutqvist, David. "Data-Driven Emptying Detection for Smart Recycling Containers." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-70892.

Full text
Abstract:
Waste management is one of the biggest challenges for modern cities, caused by urbanisation and population growth. Smart waste management tries to address this challenge with the help of techniques such as the Internet of Things, machine learning and cloud computing. By utilising smart algorithms, the time when a recycling container is going to be full can be predicted. By continuously measuring the filling level of containers and then partitioning the filling level data between consecutive emptyings, a regression model can be used for prediction. In order to do this, accurate emptying detection is a requirement. This thesis investigates different data-driven approaches to the problem of accurate emptying detection in a setting where the majority of the data are non-emptyings, i.e. suspected emptyings which manual examination has concluded are not actual emptyings. This is done by starting with the currently deployed legacy solution and step by step increasing performance through optimisation and machine learning models. The final solution achieves a classification accuracy of 99.1% and a recall of 98.2% by using a random forest classifier on a set of features based on the filling level over different given time spans, compared with a recall of 50% for the legacy solution. In the end, it is concluded that the final solution, with a few minor practical modifications, is feasible for deployment in the next release of the system.
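The following is a minimal sketch of the kind of classifier described, assuming, purely for illustration, features such as the filling-level drop over a few time spans and synthetic labels; it does not reproduce the thesis's feature set or data.

```python
# Minimal sketch of a random forest emptying classifier. The features
# (filling-level drops over assumed time spans) and the synthetic data are
# illustrative stand-ins, not the thesis's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
n = 2000
# Columns: level drop over 10 min, 1 h and 6 h (percentage points).
X = rng.uniform(0, 10, size=(n, 3))
y = (X[:, 0] > 7).astype(int)                    # toy rule: big sudden drop = emptying
X[y == 1] += rng.normal(20, 5, size=(y.sum(), 3)).clip(min=0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("recall:", recall_score(y_te, clf.predict(X_te)))
```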
APA, Harvard, Vancouver, ISO, and other styles
15

He, Dawei. "An advanced non-intrusive load monitoring technique and its application in smart grid building energy management systems." Diss., Georgia Institute of Technology, 2016. http://hdl.handle.net/1853/54951.

Full text
Abstract:
The objective of the proposed research is to develop an intelligent load modeling, identification, and prediction technology to provide granular load energy consumption and performance details and to drive building energy reduction, demand reduction, and proactive equipment maintenance. Electricity consumption in the commercial and residential sectors accounts for about 70% of the total electricity generation in the United States. Buildings are the most important consumers and contribute over 80% of the consumption in these two sectors. To reduce electrical energy spending and carbon emissions, several studies from Pacific Northwest National Lab (PNNL) and the National Renewable Energy Lab (NREL) show that, if equipped with the proper technologies, a commercial or residential building can potentially improve its energy savings by about 10% to 30% of its usage. However, the market acceptance of these new technologies is still insufficient, and the reason is generally acknowledged to be the lack of solutions for quantifying the contributions of these new technologies to energy savings, together with the invisibility of the loads in buildings. A non-intrusive load monitoring (NILM) system is proposed in this dissertation, which can identify every individual load in a building and record the energy consumption, time-of-day variations and other relevant statistics of the identified load, with no access to the individual components. The challenge of such non-intrusive load monitoring is to find features that are unique to a particular load and then to match a measured feature of an unknown load against a database or library of known features. Many problems exist in this procedure, and the proposed research focuses on three directions to overcome the bottlenecks: fundamental load studies for model-driven feature extraction, adaptive identification algorithms for load space extendibility, and practical simplifications for real industrial applications. The simulation results show the great potential of this new technology in building energy monitoring and management.
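To illustrate the basic matching idea (comparing a measured load feature against a library of known signatures), here is a simplified nearest-neighbour sketch with invented signature values; the thesis's model-driven features and adaptive algorithms are more sophisticated than this.

```python
# Simplified NILM matching sketch: identify an appliance from the step change
# in real power (delta P, watts) and reactive power (delta Q, var) it causes.
# Signature values are rough illustrative numbers, not measured data.
import numpy as np

signature_library = {
    "refrigerator": (120.0, 80.0),
    "microwave": (1100.0, 150.0),
    "incandescent lamp": (60.0, 0.0),
}

def identify(delta_p: float, delta_q: float) -> str:
    event = np.array([delta_p, delta_q])
    return min(signature_library,
               key=lambda name: np.linalg.norm(event - np.array(signature_library[name])))

print(identify(1080.0, 140.0))   # -> "microwave"
```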
APA, Harvard, Vancouver, ISO, and other styles
16

Afzalan, Milad. "Data-driven customer energy behavior characterization for distributed energy management." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/99210.

Full text
Abstract:
With the ever-growing environmental and climate concerns around energy consumption in our society, it is crucial to develop novel solutions that improve the efficient utilization of distributed energy resources for energy efficiency and demand response (DR). As such, there is a need to develop targeted energy programs which not only meet the energy goals of a community but also take the energy use patterns of individual households into account. To this end, a sound understanding of the energy behavior of customers at the neighborhood level is needed, which requires operational analytics on the wealth of energy data from customers and devices. In this dissertation, we focus on data-driven solutions for customer energy behavior characterization with applications to distributed energy management and flexibility provision. To do so, the following problems were studied: (1) how different customers can be segmented for DR events based on their energy-saving potential and on balancing peak and off-peak demand, (2) what the opportunities are for extracting the time-of-use of specific loads for automated DR applications from whole-house energy data without in-situ training, and (3) how flexibility in customers' adoption of renewable and distributed resources (e.g., solar panels, batteries, and smart loads) can improve the demand-supply problem. In the first study, a segmentation methodology based on the historical energy data of households is proposed to estimate the energy-saving potential for DR programs at a community level. The proposed approach characterizes attributes of the time-series data such as frequency, consistency, and peak-time usage. The empirical evaluation of real energy data from 400 households shows the successful ranking of different subsets of consumers according to their peak energy reduction potential for a DR event. Specifically, it was shown that the proposed approach could successfully identify the 20-30% of customers who could achieve 50-70% of the total possible demand reduction for DR. Furthermore, the rebound effect problem (creating undesired peak demand after a DR event) was studied, and it was shown that the proposed approach has the potential to identify a subset of consumers (~5%-40%, with specific loads like AC and electric vehicles) who contribute to balancing the peak and off-peak demand. A projection for Austin, TX showed that a 16 MWh reduction during a 2-hour event can be achieved by a justified selection of 20% of residential customers. In the second study, the feasibility of inferring the time-of-use (ToU) operation of flexible loads for DR applications was investigated. Unlike several efforts that required considerable model parameter selection or training, we sought to infer ToU from machine learning models without in-situ training. In the first part of this study, ToU inference from low-resolution 15-minute data (smart meter data) was investigated. A framework was introduced which leverages for training the smart meter data from a set of neighbor buildings (equipped with plug meters) with similar energy use behavior. By identifying buildings with similar energy use behavior, machine learning classification models (including neural networks, SVM, and random forests) were employed to infer appliance ToU in buildings, accounting for resident behavior as reflected in the energy load shapes from smart meter data. An investigation of electric vehicles (EV) and dryers in 10 buildings over 20 days showed average F-scores of 83% and 71%, respectively.
In the second part of this study, ToU inference from high-resolution data (60 Hz) was investigated. A self-configuring framework, based on the concept of spectral clustering, was introduced that automatically extracts appliance signatures from historical data in the environment to avoid the problem of model parameter selection. Using the framework, appliance signatures are matched with new events in the electricity signal to identify the ToU of major loads. The results on ~1500 events showed an F-score of >80% for major loads like AC, washing machines, and dishwashers. In the third study, the problem of demand-supply balance in the presence of varying levels of small-scale distributed resources (solar panels, batteries, and smart loads) was investigated. The concept of load complementarity between consumers and prosumers for load balancing among a community of ~250 households was studied, and the impact of different scenarios, such as varying levels of solar penetration and battery integration, in addition to users' flexibility, on balancing supply and demand was quantitatively measured. It was shown that (1) even with 100% adoption of solar panels, the renewable supply cannot cover the demand of the network during afternoon times (e.g., after 3 pm), (2) integrating batteries for individual households could improve self-sufficiency by more than 15% during solar generation time, and (3) without any battery, smart loads are also capable of improving self-sufficiency as an alternative, by providing ~60% of what commercial battery systems would offer. The contribution of this dissertation lies in introducing data-driven solutions and investigations for characterizing the energy behavior of households, which could increase the flexibility of the aggregate daily energy load profiles of a community. When combined, the findings of this research can serve the field of utility-scale energy analytics for the integration of DR and improved reshaping of network energy profiles (i.e., mitigating the peaks and valleys in daily demand profiles).
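As a rough sketch of the first part of this ToU inference idea (training on neighbour buildings that have plug-meter labels, then predicting for an unlabelled building), with entirely synthetic 15-minute data and an assumed random forest model:

```python
# Rough sketch of ToU inference from smart meter intervals: train on
# neighbour buildings with plug-meter labels, predict whether the EV is
# charging in each 15-minute interval of an unlabelled building.
# All data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def synthetic_day(ev_start_hour):
    base = rng.uniform(0.2, 1.0, size=96)                 # 96 intervals of 15 min
    label = np.zeros(96, dtype=int)
    label[ev_start_hour * 4:(ev_start_hour + 3) * 4] = 1  # 3-hour charging block
    return base + 3.3 * label, label                      # EV adds ~3.3 kW

# "Neighbour" buildings provide labelled days; the target building does not.
X_train, y_train = zip(*[synthetic_day(rng.integers(18, 22)) for _ in range(50)])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(np.concatenate(X_train).reshape(-1, 1), np.concatenate(y_train))

target_load, target_truth = synthetic_day(20)
pred = clf.predict(target_load.reshape(-1, 1))
print("interval accuracy:", (pred == target_truth).mean())
```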
Doctor of Philosophy
Buildings account for more than 70% of electricity consumption in the U.S., in which more than 40% is associated with the residential sector. During recent years, with the advancement in Information and Communication Technologies (ICT) and the proliferation of data from consumers and devices, data-driven methods have received increasing attention for improving the energy-efficiency initiatives. With the increased adoption of renewable and distributed resources in buildings (e.g., solar panels and storage systems), an important aspect to improve the efficiency by matching the demand and supply is to add flexibility to the energy consumption patterns (e.g., trying to match the times of high energy demand from buildings and renewable generation). In this dissertation, we introduced data-driven solutions using the historical energy data of consumers with application to the flexibility provision. Specific problems include: (1) introducing a ranking score for buildings in a community to detect the candidates that can provide higher energy saving in the future events, (2) estimating the operation time of major energy-intensive appliances by analyzing the whole-house energy data using machine learning models, and (3) investigating the potential of achieving demand-supply balance in communities of buildings under the impact of different levels of solar panels, battery systems, and occupants energy consumption behavior. In the first study, a ranking score was introduced that analyzes the historical energy data from major loads such as washing machines and dishwashers in individual buildings and group the buildings based on their potential for energy saving at different times of the day. The proposed approach was investigated for real data of 400 buildings. The results for EV, washing machine, dishwasher, dryer, and AC show that the approach could successfully rank buildings by their demand reduction potential at critical times of the day. In the second study, machine learning (ML) frameworks were introduced to identify the times of the day that major energy-intensive appliances are operated. To do so, the input of the model was considered as the main circuit electricity information of the whole building either in lower-resolution data (smart meter data) or higher-resolution data (60Hz). Unlike previous studies that required considerable efforts for training the model (e.g, defining specific parameters for mathematical formulation of the appliance model), the aim was to develop data-driven approaches to learn the model either from the same building itself or from the neighbors that have appliance-level metering devices. For the lower-resolution data, the objective was that, if a few samples of buildings have already access to plug meters (i.e., appliance level data), one could estimate the operation time of major appliances through ML models by matching the energy behavior of the buildings, reflected in their smart meter information, with the ones in the neighborhood that have similar behaviors. For the higher-resolution data, an algorithm was introduced that extract the appliance signature (i.e., change in the pattern of electricity signal when an appliance is operated) to create a processed library and match the new events (i.e., times that an appliance is operated) by investigating the similarity with the ones in the processed library. The investigation on major appliances like AC, EV, dryer, and washing machine shows the >80% accuracy on standard performance metrics. 
In the third study, the impact of adding small-scale distributed resources to individual buildings (solar panels, batteries, and users' practice of changing their energy consumption behavior) on matching demand and supply in communities was investigated. A community of ~250 buildings was considered to account for realistic, uncertain energy behavior across households. It was shown that even when all buildings have a solar panel, during afternoon times (after 4 pm), when ~30% of solar generation is still possible, the community could not supply its demand. Furthermore, it was observed that including users' practice of changing their energy consumption behavior and batteries could improve the utilization of solar energy by roughly 10%-15%. The results can serve as a guideline for utilities and decision-makers to understand the impact of such different scenarios on improving the utilization of solar adoption. This series of studies contributes to the body of literature by introducing data-driven solutions and investigations for characterizing the energy behavior of households, which could increase the flexibility in energy consumption patterns.
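To make the signature-matching idea in the second study more concrete, the following is a minimal Python sketch (not the dissertation's spectral-clustering framework): it detects step events in an aggregate power signal and matches an event window against a library of appliance signatures by Euclidean distance. All wattage values, labels, and the 50 W threshold are hypothetical.

```python
import numpy as np

def detect_events(power, threshold=50.0):
    """Indices where the aggregate power signal steps by more than `threshold` watts."""
    return np.where(np.abs(np.diff(power)) > threshold)[0]

def match_event(event_window, signature_library):
    """Match an event window against a library of appliance signatures
    using Euclidean distance; return the best-matching appliance label."""
    best_label, best_dist = None, np.inf
    for label, signature in signature_library.items():
        n = min(len(event_window), len(signature))
        dist = np.linalg.norm(event_window[:n] - signature[:n])
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# Hypothetical aggregate signal and signature library built from historical data
power = np.array([100, 102, 101, 501, 510, 505, 100, 98], dtype=float)
print(detect_events(power))                    # -> indices of large step changes
library = {
    "washing_machine": np.array([0, 400, 420, 410, 30], dtype=float),
    "dishwasher":      np.array([0, 1200, 1180, 900, 50], dtype=float),
}
new_event = np.array([0, 390, 430, 400, 40], dtype=float)
print(match_event(new_event, library))         # -> ('washing_machine', distance)
```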
APA, Harvard, Vancouver, ISO, and other styles
17

Jennings, Brandon Douglas. "Leveraging smart system design to collect and analyze factory production data." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/117969.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, in conjunction with the Leaders for Global Operations Program at MIT, 2018.
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, in conjunction with the Leaders for Global Operations Program at MIT, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 54-55).
Li & Fung deals with many factories that are very geographically dispersed. These facilities generally do not have the capital available to invest in new technologies and processes, and the extremely manual nature of garment fabrication is the standard as a result. As customers continue to demand quicker product turn-arounds and higher levels of customization, factories need to better understand their current process limitations in an effort to optimize their internal operations. Since most of these factories collect virtually no process data, managers have a hard time focusing on areas in which to improve. This project is approaching the question of "how can we use technology in a responsible and sustainable way to better understand our process?" from the perspective of a factory manager, who cannot necessarily invest in sophisticated software and hardware systems that other industries have adopted to monitor quality. As a result, this project focuses heavily on the user experience of both the operator (quality inspector) and the manager, as both need to be able to interact with the proposed data system easily and reliably. The primary goal of this thesis is to detail the design and implementation of a data collection platform (built during internship) for use in low-tech garment factories that will: -- Enable the procurement of process data (specifically as it relates to quality) from operators in real-time. -- Allow factory management to easily view and analyze collected data. -- Employ an intuitive front-end user interface that allows operators to quickly and reliably collect data. Since a substantial portion of this internship was spent designing, building, and testing this data collection interface, the thesis will reflect the nuances associated with building and implementing factory data systems in low-tech factories where human interaction is the primary driver of system adoption. The design and deployment of this system was ultimately successful and resulted in a robust prototype that continues to provide Li & Fung with insights into how to achieve their ultimate goal of connecting their factory network to a centralized data platform.
by Brandon Douglas Jennings.
M.B.A.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
18

Fonti, Alessandro. "Modelling approaches to smart buildings and smart districts for the definition of demand side management strategies and data models. The ENEA "Smart Village" case study." Doctoral thesis, Università Politecnica delle Marche, 2016. http://hdl.handle.net/11566/242982.

Full text
Abstract:
Energy consumption in buildings represents a challenge in the context of reducing greenhouse gas emissions and using energy more efficiently. An answer to this issue is the use of Demand Side Management (DSM) systems which, through an increased use of technology, allow for the reduction of energy consumption. DSM systems need to be assessed during the design process by simulation tools. Moreover, they need simulation and predictive models if the control systems involved are advanced controls such as predictive or multilevel controls. With regard to multilevel controls, another important issue is the correct choice of the data model to properly structure the control systems. In this study, a real, highly sensored Smart Village located in Rome, composed of a smart building and a smart district of 8 buildings, is taken into account. A Simulink simulator based on HAMbase is developed in order to model the building and district energy demands. The building simulator is calibrated and validated on real data, taking the casual gain values as calibration parameters. The data were acquired over a period of 60 days during the winter of 2013. The optimal simulator configuration yields a MAPE of less than 6% on the daily transferred thermal energy. Afterwards, a decision support system based on Pareto-front multi-objective optimization combined with the smart building simulator is reported to show the model's potential for the definition of DSM policies. The simulator of the smart district is then derived directly from the building simulator by reprogramming the HAMbase s-function. This allows for multiple model instances in the same Simulink model. The district simulator is used to introduce the concept of a data model in the context of smart districts. Finally, the accuracy of low-order grey-box models for short-term thermal behavior prediction is analyzed. An identification procedure is carried out on a real dataset acquired during the year 2015 from the sensors installed in a single building of the smart district. The identification shows that second-order resistance-capacitance (RC) models are the best choice in terms of accuracy and complexity.
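To illustrate what a second-order RC grey-box model looks like in practice, here is a minimal forward-Euler simulation sketch of a generic 2R2C building model; the structure (two capacitances, two resistances) follows the common formulation, but all parameter values and the time step are hypothetical and not taken from the thesis.

```python
import numpy as np

def simulate_2r2c(T_a, Q, R_ie=0.01, R_ea=0.02, C_i=5e6, C_e=1e7,
                  T_i0=20.0, T_e0=15.0, dt=900.0):
    """Forward-Euler simulation of a second-order (2R2C) grey-box thermal model.
    T_a: ambient temperature series [degC], Q: heating power series [W],
    R_* [K/W], C_* [J/K], dt: time step [s]. Returns the indoor temperature series."""
    T_i, T_e = T_i0, T_e0
    indoor = []
    for Ta_k, Q_k in zip(T_a, Q):
        dTi = ((T_e - T_i) / (R_ie * C_i) + Q_k / C_i) * dt
        dTe = ((T_i - T_e) / (R_ie * C_e) + (Ta_k - T_e) / (R_ea * C_e)) * dt
        T_i, T_e = T_i + dTi, T_e + dTe
        indoor.append(T_i)
    return np.array(indoor)

# Hypothetical usage: 24 hours of 15-minute steps with constant heating power
steps = 24 * 4
T_ambient = np.full(steps, 5.0)
heating = np.full(steps, 2000.0)
print(simulate_2r2c(T_ambient, heating)[:4])
```

In an identification setting, the R and C parameters would be fitted so that the simulated indoor temperature matches measured data, which is the kind of procedure the thesis evaluates.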
APA, Harvard, Vancouver, ISO, and other styles
19

Persson, Martin. "A Framework for Monitoring Data from a Smart Home Environment." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-79884.

Full text
Abstract:
This master thesis presents the design and implementation of a framework for monitoring data related to activities of daily living (ADL) in a smart home environment, conducted for the Human Health and Activity Laboratory (H2Al) at Luleå University of Technology. The general aim of such environments is to increase quality of life by enabling the elderly to live longer at home while reducing the consumption of resources necessary. The complexity of collection, filtering and storing of data in smart home environments is however inherent due to the often many interworking sensor systems, which all may have different APIs and communication pathways. This means that knowing whether ‘all systems are go’ when, for example, doing a study is not easy, especially for persons not trained in data science. This work therefore aims to design and implement a framework for data monitoring that targets smart home environments in which activities of daily living are important for the analysis of health-related conditions and for the personalised tailoring of interventions. The framework primarily collects data from four selected systems, which for example track the position and movements of a person. The data is stored in a database and visualised on a website to allow for monitoring of the individual sensor data being collected. The framework was validated together with an occupational therapist through a proof-of-concept trial in the Human Health and Activity Laboratory, for which healthy subjects conducted a typical test (making a salad) used when assessing human performance. In conclusion, the developed framework works as expected, collecting data from many sensor systems and storing the data in a common format, while the visualisation on a website is perceived as giving an easy overview of monitored data. Additional data can easily be added to the framework and other processes beyond monitoring can be linked to the data, such as further data refinement and algorithms for activity recognition (possibly using machine learning techniques). Future work includes better distinguishing data from multiple occupants, developing the management of synchronous and asynchronous data, and refining the web interface for additional simplicity.
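As a small illustration of the 'common format' idea described above, the following sketch shows one way heterogeneous sensor readings could be normalised before storage; the record fields, system name, and raw payload are hypothetical and not taken from the thesis.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    """Assumed common storage format for readings from heterogeneous sensor systems."""
    source: str        # which sensor system produced the value
    sensor_id: str
    timestamp: datetime
    quantity: str      # e.g. "position_x", "motion", "power"
    value: float

def from_position_system(raw: dict) -> Reading:
    """Map one hypothetical raw record from a positioning system to the common format."""
    return Reading(
        source="position_system",
        sensor_id=raw["tag"],
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        quantity="position_x",
        value=float(raw["x"]),
    )

# Hypothetical raw record as it might arrive from one of the monitored systems
print(from_position_system({"tag": "wrist-1", "ts": 1700000000, "x": 3.2}))
```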
APA, Harvard, Vancouver, ISO, and other styles
20

Jan, Jonathan. "Collecting Data for Building Automation Analytics : A case study for collecting operational data with minimal human intervention." Thesis, KTH, Radio Systems Laboratory (RS Lab), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-233319.

Full text
Abstract:
Approximately 40% of the total energy consumption within the EU is due to buildings, and similar numbers can be found in the US. If the principal inefficiencies in buildings were easily identifiable, then a facility manager could focus their resources on making the buildings more efficient, which would lead both to cost savings for the facility owners and to a decrease in the building’s ecological footprint. In building automation systems today, data is already being collected every second, but due to the lack of standardization for describing this data, having access to data is not the same as being able to make use of it. The existing heterogeneity makes it very costly to gather data from multiple buildings, thus making it difficult to understand the big picture. Facility managers cannot fix what they cannot see; thus it is important to facilitate the visualization of the data collected from all of the different building automation systems. This potentially offers great benefits with regard to both sustainability and economy. In this thesis, the author’s goal is to propose a sustainable, cost- and time-effective data integration strategy for real estate owners who wish to gain greater insight into their buildings’ efficiency. The study begins with a literature review to find previous and ongoing attempts to solve this problem. Some initiatives for the standardization of semantic models were found, and two of these models, Brick and Haystack, were chosen. One building automation system (BAS) was tested in a pilot case study to test the appropriateness of a solution. The key results from this thesis project show that data from building automation systems can be integrated into an analysis platform, and an extract, transform, and load (ETL) process for this is presented. How time-efficiently data can be tagged and transformed into a common format depends heavily on the current control system’s data storage format and on whether information about its structure is adequate. It is also noted that there is no guarantee that facility managers have access to the control system’s database or information about how it is structured; in such cases other techniques can be used, such as BACnet/IP or Open Platform Communications (OPC) Unified Architecture.
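As a rough illustration of the transform step of such an ETL process, the sketch below derives Haystack-style tags from raw BAS point names using simple pattern rules; the point names, tag vocabulary, and rules are hypothetical and much simpler than a real Brick or Project Haystack mapping.

```python
import re

# Hypothetical rules mapping substrings of raw BAS point names to Haystack-style tags
TAG_RULES = [
    (re.compile(r"temp", re.I), {"temp", "sensor"}),
    (re.compile(r"co2", re.I), {"co2", "sensor"}),
    (re.compile(r"sp\b|setpoint", re.I), {"sp"}),
    (re.compile(r"ahu", re.I), {"ahu", "equip"}),
]

def tag_point(raw_name: str) -> dict:
    """Transform step: derive semantic tags from a raw point name."""
    tags = set()
    for pattern, rule_tags in TAG_RULES:
        if pattern.search(raw_name):
            tags |= rule_tags
    return {"raw_name": raw_name, "tags": sorted(tags)}

# Extract (here: a hard-coded sample), transform, and "load" by printing the result
for name in ["AHU01_SupplyTemp", "Room12_CO2", "AHU01_TempSP"]:
    print(tag_point(name))
```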
APA, Harvard, Vancouver, ISO, and other styles
21

Martins, Daniel Filipe Catita. "Utilization of blockchain in the application of master data management." Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/22724.

Full text
Abstract:
Mestrado em Engenharia de Computadores e Telemática
As the name implies, Master Data Management (MDM) manages Master Data: the set of core information needed and shared by the systems of an enterprise. Depending on the scope of the organization, master data could be data about clients, if all the systems consider client information critical for their operations and decision making. A fundamental concept of MDM is the Golden Record, an entry with the best and most valuable information about an entity, formed through the application of rules and methods on the data that exists scattered over the systems. It is the single version of the truth. Master Data Management solutions are dependent on a centralized data hub that holds the most valuable information. The solution proposed disrupts the list of offers, combining the Master Data Management concept with Blockchain technology, resulting in an MDM solution with a distributed data hub that is truly decentralized. A Blockchain consists of a chain of blocks that requires computational work to attach new blocks to the end of the chain, and where blocks cannot be changed without redoing the computational effort for all the following blocks, resulting in a trusted environment. Participants of the Blockchain network hold a full copy of the Blockchain, making it a distributed network of information. Its security protocols and requirements eliminate the need for an intermediary between transactions and make Blockchain ideal for storing things of value. The solution involves Ethereum: a Blockchain platform where transactions have programmable functionality, known as Smart Contracts, pieces of code containing a set of data and executable functions that are available through a public address. As all the data inserted through the programmed Smart Contracts goes through the same specific cleansing, matching, and merging rules, all the network participants will be in possession of the same Golden Records, resulting in a single view of the entity. To facilitate the utilization of the solution, a wrapper and user interface were developed, ensuring that the user does not need to interact directly with the Blockchain.
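To give a feel for the cleansing/matching/merging step that produces a Golden Record, here is a minimal Python sketch of survivorship rules applied to duplicate records; in the thesis this logic lives in Ethereum Smart Contracts, whereas the records, fields, and rules below are purely hypothetical.

```python
def merge_golden_record(records, rules):
    """Apply simple survivorship rules to duplicate records of one entity and
    return a single golden record (one field-selection rule per attribute)."""
    golden = {}
    for field, pick in rules.items():
        candidates = [r[field] for r in records if r.get(field)]
        golden[field] = pick(candidates) if candidates else None
    return golden

# Hypothetical duplicate customer records coming from different systems
records = [
    {"name": "J. Smith", "email": "j.smith@example.com", "updated": 2021},
    {"name": "John Smith", "email": "", "updated": 2023},
]
rules = {
    "name": lambda vals: max(vals, key=len),   # keep the most complete name
    "email": lambda vals: vals[0],             # keep the first non-empty email
    "updated": max,                            # keep the most recent year
}
print(merge_golden_record(records, rules))
```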
APA, Harvard, Vancouver, ISO, and other styles
22

Söderberg, Anna, and Philip Dahlström. "Turning Smart Water Meter Data Into Useful Information : A case study on rental apartments in Södertälje." Thesis, KTH, Vattendragsteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-217235.

Full text
Abstract:
Managing water in urban areas is an increasingly complex challenge. Technology enables sustainable urban water management, and with integrated smart metering solutions, massive amounts of water consumption data from end users can be collected. However, the possibility of generating data from the end user holds no value in itself. It is through data analysis that the vast amount of collected data can provide more insightful information, creating potential benefits. It is recognized that a deeper understanding of the end user could potentially provide benefits for operational managers as well as for the end users. A single case study of a data set containing high-frequency end-user water consumption data from rental apartments has been conducted, where the data set was analyzed in order to see what information could be extracted and interpreted based on an exploratory data analysis (EDA). Furthermore, an interview with the operational manager of the buildings under study as well as a literature review were carried out in order to understand how the gathered data is used today and to which contexts it could be extrapolated to provide potential benefits at a building level. The results suggest that EDA is a powerful approach when starting out without strong preconceptions of the data under study, and it successfully revealed patterns and a fundamental understanding of the data and its structure. Through the analysis, variations over time, water consumption patterns and excessive water users were identified, and a leak identification process was developed. Even more challenging than making meaning of the data is triggering actions, decisions and measures based on the data analysis. The unveiled information could be applied for improved operational building management, to empower the customers, for business and campaign opportunities, as well as for an integrated decision support system. To summarize, it is concluded that the usage of smart water metering data holds an untapped opportunity to save water, energy and money. In the drive towards a more sustainable and smarter city, smart water meter data from end users has the potential to enable smarter building management as well as smarter water services.
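A leak identification process of the kind mentioned above is often based on the observation that consumption in a normal apartment drops to zero at some point during the day; the sketch below encodes that rule with hypothetical hourly volumes and an assumed threshold, and is not the procedure from the thesis.

```python
def flag_possible_leak(hourly_volumes, min_flow_threshold=0.5):
    """Flag a day as a possible leak if every hourly volume stays above the threshold,
    i.e. consumption never drops to (near) zero over the whole day."""
    return min(hourly_volumes) > min_flow_threshold

# Hypothetical hourly consumption (litres) for two apartments over one day
normal_day = [0.0, 0.0, 0.0, 1.2, 5.4, 8.1] + [3.0] * 17 + [0.0]
leaky_day  = [2.1, 2.0, 2.2, 2.3, 6.0, 9.1] + [4.0] * 17 + [2.2]
print(flag_possible_leak(normal_day), flag_possible_leak(leaky_day))  # False True
```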
APA, Harvard, Vancouver, ISO, and other styles
23

Koziel, Sylvie Evelyne. "From data collection to electric grid performance : How can data analytics support asset management decisions for an efficient transition toward smart grids?" Licentiate thesis, KTH, Elektroteknisk teori och konstruktion, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292323.

Full text
Abstract:
Physical asset management in the electric power sector encompasses the scheduling of the maintenance and replacement of grid components, as well as decisions about investments in new components. Data plays a crucial role in these decisions. The importance of data is increasing with the transformation of the power system and its evolution toward smart grids. This thesis deals with questions related to data management as a way to improve the performance of asset management decisions. Data management is defined as the collection, processing, and storage of data. Here, the focus is on the collection and processing of data. First, the influence of data on the decisions related to assets is explored. In particular, the impacts of data quality on the replacement time of a generic component (a line, for example) are quantified using a scenario approach and failure modeling. In fact, decisions based on data of poor quality are most likely not optimal. In this case, faulty data related to the age of the component leads to a non-optimal scheduling of component replacement. The corresponding costs are calculated for different levels of data quality. A framework has been developed to evaluate the amount of investment needed in data quality improvement, and its profitability. Then, ways to use available data efficiently are investigated. In particular, the possibility of using machine learning algorithms on real-world datasets is examined. New approaches are developed to use only available data for component ranking and failure prediction, which are two important concepts often used to prioritize components and schedule maintenance and replacement. A large part of the scientific literature assumes that the future of smart grids lies in big data collection, and in developing algorithms to process huge amounts of data. On the contrary, this work contributes by showing how automation and machine learning techniques can actually be used to reduce the need to collect huge amounts of data, by using the available data more efficiently. One major challenge is the trade-off between the precision of modeling results and the costs of data management.
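One simple way to see how faulty age data translates into cost, in the spirit of the scenario approach above, is the classical age-replacement model: the sketch below computes the long-run cost rate of replacing a component at a given age under an assumed Weibull failure law and compares the optimal age with an age shifted by an error in the recorded data. The Weibull parameters, costs, and 5-year error are hypothetical, not values from the thesis.

```python
import numpy as np

def age_replacement_cost(T, beta=3.0, eta=20.0, c_planned=1.0, c_failure=5.0):
    """Long-run cost rate of an age-replacement policy at age T (years), assuming
    Weibull(beta, eta) failures: expected cycle cost / expected cycle length."""
    t = np.linspace(0.0, T, 2000)
    survival = np.exp(-(t / eta) ** beta)
    expected_cycle = float(np.sum(survival) * (t[1] - t[0]))  # Riemann approximation
    failure_prob = 1.0 - np.exp(-(T / eta) ** beta)
    return (c_planned * (1.0 - failure_prob) + c_failure * failure_prob) / expected_cycle

# Optimal replacement age vs. the age actually used when the recorded age is off by 5 years
ages = np.linspace(5.0, 30.0, 251)
costs = [age_replacement_cost(a) for a in ages]
best_age = ages[int(np.argmin(costs))]
print(best_age, age_replacement_cost(best_age), age_replacement_cost(best_age + 5.0))
```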

APA, Harvard, Vancouver, ISO, and other styles
24

Pisanò, Lorenzo. "IoT e Smart Irrigation: gestione dei Big Data attraverso un sistema di notifica intelligente." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23531/.

Full text
Abstract:
With this thesis work I was also able to explore another highly topical theme connected to IoT: 'Smart Irrigation', also known as 'precision irrigation'. Considering the increasingly evident need to improve the management of irrigation and energy distribution in agriculture, bearing in mind meteo-climatic indications, and given the importance of having timely and up-to-date information to improve field activities, Smart Irrigation plays a relevant role in saving water and energy, avoiding waste and improper use of these precious resources. The software I developed was created as part of a broader European programme, the SWAMP project (Smart WAter Management Platform), whose objective is to bring about a decisive shift towards a moderate, waste-free use of fresh water for irrigation, proposing an efficient system for managing the distribution of this resource in various contexts. The project's area of competence is part of the area administered by the Consorzio di Bonifica dell'Emilia Centrale (CBEC), responsible for irrigation and water drainage over an area of 1200 km2 divided into about 5400 land parcels. The software described here implements a system for acquiring data from rain gauges located in the municipality of Bologna. It then processes the data and classifies the amount of rain falling in the study area into 5 different risk levels. This information is then notified to the user through the WDA platform, making it possible to counter flooding events, including in areas adjacent to those classified 'at risk'.
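As an illustration of the classification-and-notification step described above, the sketch below maps a rain-gauge reading to one of five risk levels and builds the message that would be pushed to the platform; the thresholds, station name, and level labels are hypothetical, not those used in the SWAMP/WDA system.

```python
# Hypothetical lower bounds (mm of rain per hour) for the five risk levels
RISK_LEVELS = [
    (0.0, "level 1 - no risk"),
    (2.0, "level 2 - low"),
    (6.0, "level 3 - moderate"),
    (10.0, "level 4 - high"),
    (30.0, "level 5 - severe"),
]

def classify_rainfall(mm_per_hour: float) -> str:
    """Return the highest risk level whose lower bound the reading reaches."""
    label = RISK_LEVELS[0][1]
    for lower_bound, level in RISK_LEVELS:
        if mm_per_hour >= lower_bound:
            label = level
    return label

def notification(station: str, mm_per_hour: float) -> str:
    """Build the notification message that would be sent to the user."""
    return f"{station}: {mm_per_hour:.1f} mm/h -> {classify_rainfall(mm_per_hour)}"

for value in [0.4, 3.0, 12.5, 45.0]:
    print(notification("Bologna-01", value))
```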
APA, Harvard, Vancouver, ISO, and other styles
25

Fitzgerald, Amy Lynn. "An exercise in database customized programming to compare the Smart Data Manager and dBaseIII." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9838.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Lai, Tsz-wan, and 黎子雲. "The use of "Octopus" smart card in the secondary schooladministration." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B4004029X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Prasannan, Sooraj. "A macro-micro system architecture analysis framework applied to Smart Grid meter data management systems by Sooraj Prasannan." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/59009.

Full text
Abstract:
Thesis (S.M. in System Design and Management)--Massachusetts Institute of Technology, Engineering Systems Division, 2010.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student submitted PDF version of thesis.
Includes bibliographical references (p. 109-111).
This thesis proposes a framework for architectural analysis of a system at the Macro and Micro levels. The framework consists of two phases -- Formulation and Analysis. Formulation is made up of three steps -- Identifying the System Boundary, Identifying the Object-Process System levels using the Object-Process Methodology (OPM) and then creating the Dependency Matrix using a Design Structure Matrix (DSM). Analysis is composed of two steps -- Macro-Level and Micro-Level Analysis. Macro-Level analysis identifies the system modules and their interdependencies based on the OPM and DSM clustering analysis and Visibility-Dependency Signature Analysis. The Micro-Level analysis identifies the central components in the system based on the connectivity metrics of Indegree centrality, Outdegree centrality, Visibility and Dependency. The conclusions are drawn based on simultaneously interpreting the results derived from the Macro-Level and Micro-Level Analysis. Macro-Analysis is vital in terms of comprehending system scalability and functionality. The modules and their interactions influence the scalability of the system while the absence of certain modules within a system might indicate missing system functionality. Micro-Analysis classifies the components in the system based on connectivity and can be used to guide redesign/design efforts. Understanding how the redesign of a particular node will affect the entire system helps in planning and implementation. On the other hand, design modification/enhancement of nodes with low connectivity can be achieved without affecting the performance or architecture of the entire system. Identifying the highly central nodes also helps the system architect understand whether the system has enough redundancy built in to withstand the failure of the central nodes. Potential system bottlenecks can also be identified by using the micro-level analysis. The proposed framework is applied to two industry-leading Smart Grid Meter Data Management Systems. Meter Data Management Systems are the central repository of meter data in the Smart Grid Information Technology Layer. Exponential growth is expected in managing electrical meter data and technology firms are very interested in finding ways to leverage the Smart Information Technology market. The thesis compares the two Meter Data Management System architectures, and proposes a generic Meter Data Management System by combining the strengths of the two architectures while identifying areas of collaboration between firms to leverage this generic architecture.
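For readers unfamiliar with these connectivity metrics, the sketch below computes indegree, outdegree, visibility and dependency from a small binary DSM via its transitive closure; the 4-component matrix and the exact definitions (visibility as reachable fan-out, dependency as reachable fan-in) are assumptions made for illustration, not the thesis's formulation.

```python
import numpy as np

def reachability(adj):
    """Boolean transitive closure of a directed adjacency matrix (Warshall's algorithm)."""
    reach = adj.astype(bool)
    n = len(reach)
    for k in range(n):
        for i in range(n):
            if reach[i, k]:
                reach[i] |= reach[k]
    return reach

def connectivity_metrics(adj):
    """Direct-link degrees plus closure-based visibility/dependency for each component."""
    reach = reachability(adj)
    return {
        "outdegree":  adj.sum(axis=1),    # direct dependents of each component
        "indegree":   adj.sum(axis=0),    # direct inputs to each component
        "visibility": reach.sum(axis=1),  # components reachable from it (direct + indirect)
        "dependency": reach.sum(axis=0),  # components that can reach it (direct + indirect)
    }

# Hypothetical 4-component DSM: adj[i, j] = 1 means component i feeds component j
adj = np.array([[0, 1, 0, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 0]])
print(connectivity_metrics(adj))
```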
S.M. in System Design and Management
APA, Harvard, Vancouver, ISO, and other styles
28

Stripling, Gwendolyn D. "An Empirical Assessment of Energy Management Information System Success Using Structural Equation Modeling." NSUWorks, 2017. http://nsuworks.nova.edu/gscis_etd/1019.

Full text
Abstract:
The Energy Industry utilizes Energy Management Information Systems (EMIS) smart meters to monitor utility consumers’ energy consumption, communicate energy consumption information to consumers, and collect a plethora of energy consumption data about consumer usage. The EMIS energy consumption information is typically presented to utility consumers via a smart meter web portal. The hope is that EMIS web portal use will aid utility consumers in managing their energy consumption by helping them make effective decisions regarding their energy usage. However, little research exists that evaluates the effectiveness or success of an EMIS smart meter web portal from a utility consumer perspective. The research goal was to measure EMIS smart meter web portal success based on the DeLone and McLean Information Success Model. The objective of the study was to investigate the success constructs system quality, information quality, service quality, use, and user satisfaction, and determine their contribution to EMIS success, which was measured as net benefits. The research model used in this study employed Structural Equation Modeling (SEM) based on Partial Least Squares (PLS) to determine the validity and reliability of the measurement model and to evaluate the hypothesized relationships in the structural model. The significant validity and reliability measures obtained in this study indicate that the DeLone and McLean Information Success Model (2003) has the potential for use in future EMIS studies. The determinants responsible for explaining the variance in net benefits were EMIS use and user satisfaction. Based on the research findings, several implications are stated and directions for future research are proposed.
APA, Harvard, Vancouver, ISO, and other styles
29

Klasson, Anders, and Johan Rosengren. "Industrial IoT Management Systemfor Tubes with Integrated Sensors." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-237412.

Full text
Abstract:
Sandvik has developed a technique for placing sensors inside tubes. This technology has great market potential and can optimize many industrial processes. The finished product should be able to stream sensor data to cloud services for analysis and reading. The current system requires manual configuration on-site and the installation is labor intensive. This thesis investigates how the system’s hardware can be configured automatically, and how a supporting IT system could function. A solution is presented in which a large portion of the installation process has been automated, along with an outline for a supporting system. The solution is evaluated by measuring the configuration complexity. The evaluation shows that the developed system had increased functionality compared to today’s manual configuration, while configuration complexity was not increased; in many respects, the configuration complexity was reduced.
APA, Harvard, Vancouver, ISO, and other styles
30

Zhu, Junxiang. "Integration of Building Information Modelling and Geographic Information System at Data Level Using Semantics and Geometry Conversion Approach Towards Smart Infrastructure Management." Thesis, Curtin University, 2018. http://hdl.handle.net/20.500.11937/74945.

Full text
Abstract:
This study integrates Building Information Modelling (BIM) and Geographic Information System (GIS) at the data level, using an open-source approach for geometry transformation and an automatic attribute-searching algorithm for semantics transfer, for the purpose of facilitating data transformation from BIM to GIS. Based on that, an infrastructure management system has been developed using Web GIS technology in conjunction with the models created in BIM and transformed into GIS using the proposed approach.
APA, Harvard, Vancouver, ISO, and other styles
31

Bugeja, Joseph. "Smart connected homes : concepts, risks, and challenges." Licentiate thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-7793.

Full text
Abstract:
The growth and presence of heterogeneous connected devices inside the home have the potential to provide increased efficiency and quality of life to the residents. Simultaneously, these devices tend to be Internet-connected and continuously monitor, collect, and transmit data about the residents and their daily lifestyle activities. Such data can be of a sensitive nature, such as camera feeds, voice commands, physiological data, and more. This data allows for the implementation of services, personalization support, and benefits offered by smart home technologies. Alas, there has been a raft of security and privacy attacks on connected home devices that compromise the security, safety, and privacy of the occupants. In this thesis, we provide a comprehensive description of the smart connected home ecosystem in terms of its assets, architecture, functionality, and capabilities. In particular, we focus on the data being collected by smart home devices. Such description and organization are necessary as a precursor to performing a rigorous security and privacy analysis of the smart home. Additionally, we seek to identify threat agents, risks, and challenges, and to propose some mitigation approaches suitable for home environments. Identifying these is core to characterizing what is at stake and to gaining insights into what is required to build more robust, resilient, secure, and privacy-preserving smart home systems. Overall, we propose new concepts, models, and methods serving as a foundation for conducting deeper research work linked in particular to smart connected homes. Specifically, we propose a taxonomy of devices, a classification of data collected by smart connected homes, and a threat agent model for the smart connected home, and we identify challenges and risks and propose some mitigation approaches.
APA, Harvard, Vancouver, ISO, and other styles
32

RAZZAK, FAISAL. "The Role of Semantic Web Technologies in Smart Environments." Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2506366.

Full text
Abstract:
Today, semantic web technologies and Linked Data principles are providing formalism, standards, shared data semantics, and data integration for unstructured data over the web. The result is a transformation from the Web of Interaction to the Web of Data and actionable information. At the crossroads lie our daily lives, containing a plethora of unstructured data originating from low-cost sensors and appliances as well as from every computational element used in our modern lives, including computers, interactive watches, mobile phones, GPS devices, etc. These facts accentuate an opportunity for system designers to combine these islands of data into a large actionable information space which can be utilized by automated and intelligent agents. As a result, this phenomenon is likely to institute a space that is smart enough to provide humans with comfort of living and to build an efficient society. Thus, in this context, the focus of my research has been to propose solutions to problems in the domains of smart environments and energy management, under the umbrella of ambient intelligence. The potential role of semantic web technologies in these proposed solutions has been analysed, and architectures for these solutions were designed, implemented, and tested.
APA, Harvard, Vancouver, ISO, and other styles
33

Erkki, Robert, and Philip Johnsson. "Quality Data Management in the Next Industrial Revolution : A Study of Prerequisites for Industry 4.0 at GKN Aerospace Sweden." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-69341.

Full text
Abstract:
The so-called Industry 4.0 is commonly presented by its proponents as the fourth industrial revolution and promises to turn the manufacturing sector on its head. However, all that glimmers is not gold, and in the backwash of hefty consultant fees questions arise: What are the drivers behind Industry 4.0? Which barriers exist? How does one prepare one's manufacturing procedures in anticipation of the (if ever) coming era? What is the Internet of Things, and what file sizes are characterised as big data? To answer these questions, this thesis aims to resolve the ambiguity surrounding the definitions of Industry 4.0, as well as to clarify the fuzziness of a data-driven manufacturing approach, that is, the comprehensive usage of data, including collection and storage, quality control, and analysis. In order to do so, this thesis was carried out as a case study at GKN Aerospace Sweden (GAS). Through interviews and observations, as well as a literature review of the subject, the thesis examined different processes' data-driven needs from a quality management perspective. The findings of this thesis show that the collection of quality data at GAS is mainly concerned with explicitly stated customer requirements. As such, the data available for the examined processes proves inadequate for multivariate analytics. The transition towards a data-driven state of manufacturing involves a five-stage process wherein data collection through sensors is seen as a key enabler of multivariate analytics and a deepened process knowledge. Together, these efforts form the prerequisites for Industry 4.0. In order to effectively start the transition towards Industry 4.0, near-term recommendations for GAS include: capture all data, with emphasis on process data; improve the accessibility of data; and ultimately take advantage of advanced analytics. Collectively, these undertakings pave the way for the actual improvements of Industry 4.0, such as digital twins, machine cognition, and process self-optimization. Finally, due to the delimitations of the case study, the findings can only be generalized to companies with similar characteristics, i.e. complex processes with low volumes.
APA, Harvard, Vancouver, ISO, and other styles
34

Mohammad, Ammad Uddin. "UAV Routing Protocol (URP) for crop health management." Thesis, Brest, 2017. http://www.theses.fr/2017BRES0147/document.

Full text
Abstract:
Wireless sensor networks are now a credible means of crop data collection. The installation of a fixed communication structure to relay the monitored data from the cluster head to its final destination can either be impractical because of land topology or prohibitive due to high initial cost. A plausible solution is to use Unmanned Aerial Vehicles (UAVs) as an alternative means for both data collection and limited supervisory control of sensor status. In this paper, we consider the case of disjoint farming parcels, each including clusters of sensors, organized in a predetermined way according to farming objectives. This research focuses on deriving an optimal solution for UAV search and data gathering from all sensors installed in a crop field. Furthermore, the sensor routing protocol takes into account a trade-off between energy management and data dissemination overhead. The proposed system is evaluated using a simulated model and is expected to identify the best class among all those under consideration.
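For intuition about the routing problem, the following is a minimal greedy sketch of a UAV collection tour over the cluster heads of disjoint parcels (nearest-neighbour heuristic); it is not the URP protocol itself, and the coordinates are hypothetical.

```python
import math

def nearest_neighbour_tour(base, cluster_heads):
    """Greedy tour: the UAV starts at the base station, repeatedly flies to the
    closest unvisited cluster head, and finally returns to the base."""
    unvisited = list(cluster_heads)
    tour, current = [base], base
    while unvisited:
        nxt = min(unvisited, key=lambda p: math.dist(current, p))
        unvisited.remove(nxt)
        tour.append(nxt)
        current = nxt
    tour.append(base)
    return tour

def tour_length(tour):
    return sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))

# Hypothetical coordinates (km) of the base station and four parcel cluster heads
base = (0.0, 0.0)
heads = [(2.0, 1.0), (5.0, 4.0), (1.0, 6.0), (7.0, 2.0)]
tour = nearest_neighbour_tour(base, heads)
print(tour, round(tour_length(tour), 2))
```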
APA, Harvard, Vancouver, ISO, and other styles
35

Weiss, Tobias, and Dorothea Reisbach. "Förderung der Kundeninteraktion zur Nutzung von Datenvisualisierungen auf Basis von Smart Metering im Privatkundenbereich." TUDpress, 2019. https://tud.qucosa.de/id/qucosa%3A36564.

Full text
Abstract:
As decided in 2015 in the draft law on the digitalization of the energy transition (see BMWi (2015a)), smart meters are to be rolled out on a larger scale. These digital electricity meters consist of a digital metering unit and a communication unit that is intended to enable secure and standardized communication. The smart meters record and visualize current consumption and can additionally even record the momentary generation of energy, e.g. by a solar installation, at the same time. Through the continuous recording of current energy consumption, combined with the transmission function to the utility (EVU), customers can immediately be shown their current consumption, an essential basis for transparency in consumption and data analyses, and a starting point for consumption optimizations (cf. BMWi (2015b); Fox (2010), p. 408). [... from section 1.2]
APA, Harvard, Vancouver, ISO, and other styles
36

Massana, i. Raurich Joaquim. "Data-driven models for building energy efficiency monitoring." Doctoral thesis, Universitat de Girona, 2018. http://hdl.handle.net/10803/482148.

Full text
Abstract:
Nowadays, energy is absolutely necessary all over the world. Taking into account the advantages that it presents in transport and the needs of homes and industry, energy is transformed into electricity. Bearing in mind the expansion of electricity, initiatives like Horizon 2020 pursue the objective of a more sustainable future: reducing carbon emissions and electricity consumption and increasing the use of renewable energies. As an answer to the shortcomings of the traditional electrical network, such as large distances to the point of consumption, low levels of flexibility, low sustainability, low quality of energy, the difficulties of storing electricity, etc., Smart Grids (SG), a natural evolution of the classical network, have appeared. One of the main components that will allow the SG to improve the traditional grid is the Energy Management System (EMS). The EMS is necessary to carry out the management of the power network system, and one of the main needs of the EMS is a prediction system: that is, to know the electricity consumption in advance. Besides, the utilities will also require predictions to manage generation, maintenance and their investments. Therefore, it is necessary to have electricity consumption prediction systems that, based on the available data, forecast the consumption of the next hours, days or months as accurately as possible. It is in this field that the present research is placed since, due to the proliferation of sensor networks and more powerful computers, more precise prediction systems have been developed. Having said that, a complete study was carried out in the first work, taking into account the need to know, in depth, the state of the art in relation to the load forecasting topic. On the basis of the acquired knowledge, the installation of sensor networks, the collection of consumption data and modelling using Autoregressive (AR) models were performed in the second work. Once this model was defined, in the third work another step was taken, collecting new data, such as building occupancy, meteorology and indoor ambience, testing several paradigmatic models, such as Multiple Linear Regression (MLR), Artificial Neural Network (ANN) and Support Vector Regression (SVR), and establishing which exogenous data improve the prediction accuracy of the models. Reaching this point, and having corroborated that the use of occupancy data improves the prediction, there was a need to generate techniques and methodologies in order to have the occupancy data in advance. Therefore, several artificial occupancy attributes were designed in order to perform long-term hourly consumption predictions in the fourth work.
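To make the model comparison concrete, here is a minimal sketch that trains an MLR and an SVR model on synthetic hourly data with an artificial occupancy attribute and reports MAPE; the data, features, and hyperparameters are invented for illustration and are not those of the thesis (an ANN could be added in the same way).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)

# Synthetic stand-ins for the exogenous attributes discussed above:
# hour of day, outdoor temperature, and an artificial occupancy indicator
n = 2000
hour = rng.integers(0, 24, n)
temp = rng.normal(15.0, 7.0, n)
occupancy = ((hour >= 8) & (hour <= 18)).astype(float)
load = 20 + 3 * occupancy * np.sin(np.pi * hour / 24) + 0.4 * (20 - temp) + rng.normal(0, 1, n)

X = np.column_stack([hour, temp, occupancy])
X_tr, X_te, y_tr, y_te = train_test_split(X, load, test_size=0.25, random_state=0)

for name, model in [("MLR", LinearRegression()), ("SVR", SVR(C=10.0))]:
    model.fit(X_tr, y_tr)
    mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"{name}: MAPE = {100 * mape:.1f}%")
```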
APA, Harvard, Vancouver, ISO, and other styles
37

Tosto, Valentina. "Creazione di servizi personalizzati su dispositivi Android nell'ambito dell'Internet of Things collaborativo." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/12356/.

Full text
Abstract:
In a world where by now every everyday object is connected to the Internet and it is common to hear talk of a 'fourth industrial revolution', managing the heterogeneity of the data produced by devices in the Internet of Things becomes critical. Why not use this information to advantage, integrating it for the benefit of the city and its inhabitants? This thesis focuses on the creation of personalized services from the combination of official and unofficial data, coming from the sensors of monitoring stations or smartphones, available to private users and stakeholders. The work carried out consisted of retrieving official data from Arpae Emilia-Romagna and designing and developing Habitatest, a mobile application for the Android system. Habitatest offers its users a widget for displaying the values of the data extracted from private users' devices and of the created services, a chart showing the trend of the information, and a drag-and-drop system for combining such data with a mathematical formula in order to produce services. The objective of the project is to encourage users to share data coming from the sensors of their own devices and to create services for common purposes, such as improving their own quality of life and that of other people, applied to fields such as home automation, contributing to the emergence of Smart Cities, and saving resources, thereby safeguarding the environment in which we live.
APA, Harvard, Vancouver, ISO, and other styles
38

Pinarer, Ozgun. "Sustainable Declarative Monitoring Architecture : Energy optimization of interactions between application service oriented queries and wireless sensor devices : Application to Smart Buildings." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEI126/document.

Full text
Abstract:
Recent research and analysis reports show that the high energy consumption of buildings is a major problem in developed countries. As a result, they show concretely that building energy management systems (BEMS) and deployed wireless sensor network environments are important for the energy efficiency of building operations. In the literature, existing smart building management systems focus on the energy consumption of the building, the hardware deployed inside/outside the building, and network communication issues. They adopt static configurations for wireless sensor devices, and the proposed models are fitted to a single application. In this study, we propose a sustainable declarative monitoring architecture that focuses on the energy optimisation of interactions between application service-oriented queries and wireless sensor devices. We consider the monitoring system as a set of applications that exploit sensor measures in real time, such as HVAC automation and control systems, real-time supervision, and security. These applications can be configured dynamically by the users or by the supervisor. In our approach, we take a data point of view: applications are declaratively expressed as a set of continuous queries on the sensor data stream. To achieve our objective of energy-aware optimization of the monitoring architecture, we formalize sensor device configuration and fit data acquisition and data transmission to actual application requirements. We present a complete monitoring architecture and an algorithm that handles dynamic sensor configuration. We introduce a platform that covers physical as well as simulated wireless sensor devices.
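To make the idea of fitting one sensor's acquisition/transmission configuration to several concurrent continuous queries more concrete, here is a minimal Python sketch. The data structures and the merging rule (most demanding acquisition period wins, transmissions batched to the tightest latency bound) are assumptions chosen for illustration; they are not the 3SoSM algorithm itself.

```python
# Hypothetical sketch: derive one acquisition/transmission configuration for a
# wireless sensor from the requirements of several continuous queries.
# The merging rule below is an assumption, not the thesis's actual method.

from dataclasses import dataclass
from functools import reduce
from math import gcd

@dataclass
class QueryRequirement:
    name: str
    acquisition_period_s: int   # how often the query needs a fresh measurement
    max_latency_s: int          # how stale a delivered measurement may be

def merge_requirements(reqs):
    # Acquire often enough for every query: use the gcd of the requested periods.
    acq = reduce(gcd, (r.acquisition_period_s for r in reqs))
    # Transmit at least as often as the tightest latency bound allows.
    tx = min(r.max_latency_s for r in reqs)
    return {"acquisition_period_s": acq, "transmission_period_s": max(tx, acq)}

if __name__ == "__main__":
    queries = [
        QueryRequirement("hvac_control", acquisition_period_s=60, max_latency_s=120),
        QueryRequirement("supervision_dashboard", acquisition_period_s=300, max_latency_s=600),
    ]
    # -> {'acquisition_period_s': 60, 'transmission_period_s': 120}
    print(merge_requirements(queries))
```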
APA, Harvard, Vancouver, ISO, and other styles
39

Hjälte, David. "Mot Industri 4.0 genom statistisk dataanalys : En studie om positionen av stansade hål vid Scania Ferruforms sidobalkstillverkning." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik, konst och samhälle, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-85648.

Full text
Abstract:
The fourth industrial revolution, also known as Industry 4.0, is driven by a number of technologies that bring digitalization and automation of industrial processes. The concept involves applying data analysis with advanced analytical tools to large amounts of data, which is claimed to offer great opportunities for quality improvement. For such a transition to take place, the ability to manage data is crucial. Despite this, many companies today show insufficient use of data in decision-making. The question is how companies can manage data and carry out a transformation towards Industry 4.0. To study this topic, this thesis was carried out as a case study of a punching process at Scania Ferruform. Through a literature review, quantitative data collection, and observations and interviews, the thesis examined the current use of data in the process. Data were then examined with statistical tools to show how data can be managed in a process to gain greater knowledge about the causes of deviations. Finally, the thesis investigated how continued work on data management can be carried out to reach the goal of Industry 4.0. Analysis tools were used to analyze more than 39,000 data points. The results show that there are development opportunities in terms of the collection, quality and use of data. A framework is presented for how the company should manage data in order to extract new knowledge from its processes, and for how Ferruform can continue working towards Industry 4.0. Finally, recommendations for further studies are given. The results of the thesis will support Ferruform in its work towards more capable processes and the technical development the company is striving for.
The fourth industrial revolution, also called Industry 4.0, is powered by several technologies that result in the digitalization and automation of industrial processes. The concept includes the application of big data and advanced analytics, which are said to provide great opportunities for quality improvements. For such a transition to take place, the ability to handle data is crucial. Despite this, many companies today show a lack of use of data to drive decision-making. The question is how companies can manage data and ultimately transition towards Industry 4.0. To research this topic, this thesis was carried out as a case study of a punching process at Scania Ferruform. Through a literature review, quantitative data collection, observations and interviews, the thesis examined the current use of data in the process. Subsequently, the data were examined with statistical tools to illustrate how data can be managed in a process to gain greater knowledge about the causes of deviations. Lastly, the thesis explored future work towards Industry 4.0. Analysis tools were used to analyse more than 39,000 data points. The results of the study show that there are opportunities for development in terms of the collection, quality and use of data. A framework for how Ferruform should manage data in order to extract new knowledge from its processes is presented, together with an action plan for the transition towards Industry 4.0. Finally, recommendations are given for further studies. The results of the thesis will help Ferruform in its transition towards more efficient processes and the technical development the company strives for.
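As an illustration of the kind of statistical analysis of positional deviations described above, the following Python sketch computes individuals control chart limits for punched-hole position deviations. It uses standard SPC formulas; the measurement values are invented and this is not necessarily the exact analysis carried out in the thesis.

```python
# Hypothetical sketch: individuals (X) control chart limits for punched-hole
# position deviation (mm from nominal). Standard SPC formulas; the data are
# invented for illustration.

measurements = [0.12, -0.05, 0.08, 0.20, -0.11, 0.03, 0.15, -0.02, 0.07, 0.09]

def individuals_chart_limits(x):
    mean = sum(x) / len(x)
    moving_ranges = [abs(b - a) for a, b in zip(x, x[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128          # d2 constant for a moving range of size 2
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat

if __name__ == "__main__":
    lcl, center, ucl = individuals_chart_limits(measurements)
    print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}")
    out_of_control = [v for v in measurements if not lcl <= v <= ucl]
    print("Points outside limits:", out_of_control)
```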
APA, Harvard, Vancouver, ISO, and other styles
40

Kretek, František. "Smart Home - projekt inteligentního domu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2014. http://www.nusl.cz/ntk/nusl-220641.

Full text
Abstract:
This master's thesis deals with intelligent home electrical installations, which can markedly decrease operating costs and make managing the house much easier. The thesis describes the basic principles of data transfer, topology, installation options, and the functions of particular units. It also describes the systems and properties of solutions from the best-known producers and suppliers of intelligent electrical installations. The practical part describes the design of the house wiring system using the Ego-n system and a wireless installation solution based on Xcomfort.
APA, Harvard, Vancouver, ISO, and other styles
41

Salama, Raghda Ahmed Abdelkerim. "Data-driven modeling of smart building energy management." Master's thesis, 2021. http://hdl.handle.net/10362/132389.

Full text
Abstract:
Buildings account for approximately 40% of total energy consumption, which has a negative impact on the environment. Building Energy Management Systems (BEMS) have been used to monitor energy consumption and increase usage efficiency. In this study, the components and importance of BEMS are emphasized. Data from the management system of the Chamchuri 5 building at Chulalongkorn University, Thailand, were used as a template for data-driven modeling of energy usage in smart buildings and to analyze patterns of energy consumption. Using multilevel modeling on the Chamchuri 5 building, the main factors that drive energy consumption at the macro and micro levels are analyzed, and energy variation between zones and floors is identified.
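A multilevel (mixed-effects) model of this kind can be sketched in a few lines of Python with statsmodels. The column names, the random-intercept grouping by floor, and the tiny synthetic dataset below are assumptions for illustration only; they do not reproduce the thesis's actual model or data.

```python
# Hypothetical sketch: mixed-effects model of hourly energy use with a random
# intercept per floor, in the spirit of the macro/micro analysis described above.
# Column names and the synthetic data are invented.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
floors = np.repeat([1, 2, 3, 4], 12)                         # 4 floors, 12 hourly samples each
occupancy = rng.integers(0, 50, size=len(floors))
outdoor_temp = rng.normal(30, 3, size=len(floors))           # degrees C
floor_effect = np.array([5.0, 8.0, 6.5, 9.0])[floors - 1]    # simulated per-floor baseline
energy_kwh = floor_effect + 0.4 * occupancy + 0.8 * outdoor_temp + rng.normal(0, 1, len(floors))

df = pd.DataFrame({"floor": floors, "occupancy": occupancy,
                   "outdoor_temp": outdoor_temp, "energy_kwh": energy_kwh})

# The random intercept per floor captures between-floor variation in baseline consumption.
model = smf.mixedlm("energy_kwh ~ occupancy + outdoor_temp", df, groups=df["floor"])
result = model.fit()
print(result.summary())
```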
APA, Harvard, Vancouver, ISO, and other styles
42

Gonçalves, Sandra de Jesus Pereira. "Data-driven disaster management in a smart city." Master's thesis, 2021. http://hdl.handle.net/10071/23563.

Full text
Abstract:
Disasters, both natural and man-made, are complex events that result in the loss of human life and/or the destruction of property. Advances in Information Technology (IT) and Big Data analysis represent an opportunity for the development of resilient environments, since the application of Big Data (BD) technologies makes it possible not only to extract patterns from the occurrence of events, but also to predict them. The work carried out in this dissertation applies the CRISP-DM methodology to conduct a descriptive and predictive analysis of the events that occurred in the city of Lisbon, with emphasis on the events that affected buildings. This research verified the existence of temporal and spatial patterns of occurrences, with some events concentrated in certain periods of the year, such as floods and collapses, which are recorded more frequently in periods of high precipitation. The spatial analysis showed that the city centre is the area most affected by the occurrences, and it is in these areas that the largest proportion of buildings with major repair needs is concentrated. Finally, machine learning models were applied to the data, and the Random Forest model obtained the best result, with an accuracy of 58%. This research contributes to improving the resilience of the city, since the analysis made it possible to extract insights about the events and their occurrence patterns that will support the decision-making process.
Disasters, both natural and man-made, are complex events that result in loss of life and/or destruction of property. Advances in Information Technology and Big Data Analysis represent an opportunity for the development of resilient environments, since the application of Big Data (BD) technologies makes it possible not only to extract patterns from the occurrence of events, but also to predict them. The work carried out in this dissertation applies the CRISP-DM methodology to conduct descriptive and predictive analyses of the events that occurred in the city of Lisbon, with emphasis on events affecting buildings. The research verified the existence of temporal and spatial patterns, with events occurring in certain periods of the year, such as floods, which are recorded more frequently in periods of high precipitation. The spatial analysis showed that the city centre is the area most affected by occurrences, and it is in these areas that the largest proportion of buildings with major repair needs is concentrated. Finally, machine learning models were applied to the data, with the Random Forest model obtaining the best result, with an accuracy of 58%. This research contributes to improving the resilience of the city, since the analysis made it possible to extract insights about the events and their occurrence patterns that will support decision-making processes.
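The modelling step described above can be sketched with scikit-learn as follows. The feature names and the synthetic data are assumptions made for illustration; the 58% accuracy reported in the abstract refers to the real Lisbon dataset, not to this toy example.

```python
# Hypothetical sketch of the modelling step: a Random Forest classifier on city
# occurrence records. Feature names and data are invented for illustration.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "month": rng.integers(1, 13, n),
    "precipitation_mm": rng.gamma(2.0, 10.0, n),
    "district_id": rng.integers(1, 25, n),
    "building_age_years": rng.integers(1, 120, n),
})
# Synthetic label: floods/collapses made more likely by heavy rain and old buildings.
score = 0.03 * df["precipitation_mm"] + 0.01 * df["building_age_years"] + rng.normal(0, 1, n)
df["event_type"] = pd.cut(score, bins=3, labels=["other", "collapse", "flood"]).astype(str)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="event_type"), df["event_type"], test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("Accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 2))
```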
APA, Harvard, Vancouver, ISO, and other styles
43

Chiung-Wen Chang and 張瓊文. "On Data Analytics Framework of Smart-Project Management for Product Development." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/3mnk9v.

Full text
Abstract:
Master's thesis
National Cheng Kung University
Executive Master Program of Engineering Management
106
Due to the advancement of technology, consumers' interests and needs are constantly changing. To capture the market, enterprises must continually develop new products that meet these changing and growing needs. New Product Development (NPD) is therefore a key activity of the enterprise and one of the strategies for creating enterprise value and enhancing competitive advantage. New product development is a multifaceted and complex technical undertaking, and the proportion of successful products is not high; effective new product development requires a systematic process, appropriate methods and techniques, and effective management. As technologies such as computers, networks, social platforms, and the IoT flourish, data-centric activities combine with data science to maximize data value and create new knowledge. With big data, artificial intelligence has become increasingly mature, and the rise of data science and AI has gradually made the ideal of smart systems achievable. New product development is a dynamic process and a systems engineering procedure; if data science methods and technologies can be integrated into project management, project management becomes smart. This research uses data science concepts, methods, and techniques to design a Smart-Project Management for Product Development model and, based on this model, to design and plan a Data Analytics Framework for Project Management together with an analytics method; a case study is used to verify that the framework and model are effective. This research aims to improve the performance of new product development and thus enhance the company's competitiveness.
APA, Harvard, Vancouver, ISO, and other styles
44

Ya-Ching Chuang and 莊雅晴. "Smart Meter Management System for Microgrid Based on Data Distribution Service." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/c566s6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

León, Palacio Ana. "SILE: A Method for the Efficient Management of Smart Genomic Information." Doctoral thesis, 2019. http://hdl.handle.net/10251/131698.

Full text
Abstract:
[ES] Over the last two decades, the data generated by next-generation sequencing technologies have revolutionized our understanding of human biology. Moreover, they have allowed us to develop and improve our knowledge of how changes (variations) in DNA can be related to the risk of suffering certain diseases. Currently, a large amount of genomic data is publicly available and is frequently consulted by the scientific community to draw meaningful conclusions about the associations between risk genes and the mechanisms that produce disease. However, managing this exponentially growing amount of data has become a challenge. Researchers are forced to dive into a lake of highly complex data scattered across more than a thousand heterogeneous repositories, represented in multiple formats and with different levels of quality. Furthermore, when it comes to solving a specific task, only a small part of the vast amount of available data is really significant. These are what we call "smart" data. The main objective of this thesis is to propose a systematic approach for the efficient management of smart genomic data by using conceptual modeling techniques and data quality assessment. This approach is aimed at populating an information system with data that are sufficiently accessible, informative and useful for extracting valuable knowledge.
[CAT] Over the last two decades, the data generated by next-generation sequencing technologies have revolutionized our knowledge of human biology. Moreover, they have allowed us to develop and improve our knowledge of how changes (variations) in DNA can be related to the risk of suffering certain diseases. Currently, a large amount of genomic data is publicly available and is frequently consulted by the scientific community to draw meaningful conclusions about the associations between risk genes and the mechanisms that produce disease. However, managing this exponentially growing amount of data has become a challenge, and researchers are forced to dive into a lake of highly complex data scattered across more than a thousand heterogeneous repositories, represented in multiple formats and with different levels of quality. Furthermore, when it comes to solving a specific task, only a small part of the vast amount of available data is really significant. These are what we call "smart" data. The main objective of this thesis is to propose a systematic approach for the efficient management of smart genomic data by using conceptual modelling techniques and data quality assessment. This approach is aimed at populating an information system with data that are accessible, informative and useful for extracting valuable knowledge.
[EN] In the last two decades, the data generated by the Next Generation Sequencing Technologies have revolutionized our understanding about the human biology. Furthermore, they have allowed us to develop and improve our knowledge about how changes (variants) in the DNA can be related to the risk of developing certain diseases. Currently, a large amount of genomic data is publicly available and frequently used by the research community, in order to extract meaningful and reliable associations among risk genes and the mechanisms of disease. However, the management of this exponential growth of data has become a challenge and the researchers are forced to delve into a lake of complex data spread in over thousand heterogeneous repositories, represented in multiple formats and with different levels of quality. Nevertheless, when these data are used to solve a concrete problem only a small part of them is really significant. This is what we call "smart" data. The main goal of this thesis is to provide a systematic approach to efficiently manage smart genomic data, by using conceptual modeling techniques and the principles of data quality assessment. The aim of this approach is to populate an Information System with data that are accessible, informative and actionable enough to extract valuable knowledge.
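To give a concrete flavour of filtering heterogeneous records down to a "smart" subset using data-quality criteria, here is a minimal Python sketch. The record fields, the review-status values and the thresholds are illustrative assumptions and do not reproduce the SILE method itself.

```python
# Hypothetical sketch: keep only variant records that meet simple data-quality
# criteria. Fields and thresholds are invented; this is not the SILE method.

variants = [
    {"id": "rs0001", "clinical_significance": "pathogenic",
     "review_status": "expert panel", "sources": 4},
    {"id": "rs0002", "clinical_significance": "uncertain",
     "review_status": "single submitter", "sources": 1},
    {"id": "rs0003", "clinical_significance": "likely pathogenic",
     "review_status": "multiple submitters", "sources": 2},
]

def is_smart(record, min_sources=2):
    """Keep records that are informative and sufficiently corroborated."""
    significant = record["clinical_significance"] in {"pathogenic", "likely pathogenic"}
    reviewed = record["review_status"] != "single submitter"
    return significant and reviewed and record["sources"] >= min_sources

smart_subset = [v for v in variants if is_smart(v)]
print([v["id"] for v in smart_subset])   # ['rs0001', 'rs0003']
```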
This thesis was supported by the Research and Development Aid Program (PAID-01-16) under the FPI grant 2137.
León Palacio, A. (2019). SILE: A Method for the Efficient Management of Smart Genomic Information [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/131698
THESIS
Award-winning
APA, Harvard, Vancouver, ISO, and other styles
46

Liu, Chui-Yuan, and 劉騏源. "Design and Implementation of Cloud Data Integration Management for Smart Aquarium Device." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/36331735976512961997.

Full text
Abstract:
Master's thesis
Chung Yuan Christian University
Graduate Institute of Electrical Engineering
105
In this thesis, we design and implement a smart aquarium system that combines cloud data integration and management with temperature control and an aquarium water system, improving the flexibility of the water exchange system and allowing users to understand the aquarium state through a smart control interface. The smart aquarium control device with cloud data integration has six parts. First, a cloud database is built by collecting the aquarium state and uploading it to a Google spreadsheet. Second, temperature control using PWM with a thermoelectric cooler (TEC) keeps the aquarium temperature stable. Third, real-time charts generated from the cloud data feed a dashboard that helps users understand the state of the aquarium. Fourth, a streaming service provides a real-time display so that the aquarium can be observed from a website or a smartphone. Fifth, the smart aquarium water system improves the water quality in the aquarium. Finally, we run temperature and water-quality experiments and produce a flow chart of the cloud data management system. The contributions of the research are as follows: 1. We design an aquarium water exchange system with flexible smart UI control, which can extend fish life. 2. We integrate the cloud data into a dashboard that helps the user understand the state of the aquarium. 3. We use a thermoelectric cooler (TEC) driven by PWM technology for rapid heating in the aquarium. Keywords: cloud data, smart control, streaming service, thermoelectric cooling chip
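The PWM-driven TEC temperature control mentioned above can be illustrated with a simple proportional control loop. The setpoint, gain and the fake sensor below are assumptions, and the real GPIO/PWM driver and the Google Sheets upload are replaced by print statements; this is a sketch of the general idea, not the thesis's implementation.

```python
# Hypothetical sketch: proportional control of a TEC via a PWM duty cycle.
# Setpoint, gain and the simulated sensor are invented; hardware calls are stubbed.

import random
import time

SETPOINT_C = 26.0     # desired aquarium temperature (assumed)
KP = 20.0             # proportional gain, % duty cycle per degree C (assumed)

def read_temperature_c():
    # Stub for a real sensor read; returns a noisy value around the setpoint here.
    return 26.0 + random.uniform(-1.5, 1.5)

def set_tec_duty_cycle(percent):
    # Stub for a real PWM driver call; just report what would be applied.
    print(f"TEC duty cycle -> {percent:5.1f} %")

def control_step():
    temp = read_temperature_c()
    error = temp - SETPOINT_C                 # positive error => too warm => drive the TEC harder
    duty = max(0.0, min(100.0, KP * error))
    set_tec_duty_cycle(duty)
    return temp, duty

if __name__ == "__main__":
    for _ in range(5):
        temp, _ = control_step()
        print(f"measured {temp:.2f} C")
        time.sleep(0.1)   # a real loop would also log each sample to the cloud spreadsheet
```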
APA, Harvard, Vancouver, ISO, and other styles
47

YU, AN LIN, and 林宥安. "Using Big Data to Explore the Analysis of Smart Machine Management System." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/9ycf3k.

Full text
Abstract:
Master's thesis
National Chin-Yi University of Technology
Department of Industrial Engineering and Management
106
Modern global industry is at the beginning of an era of innovation. Industry 4.0 combines machines, analytics, the Internet of Things (IoT), automation, and data exchange. We add communication capabilities to each device so that machines are connected to one another through the IoT, establishing a smart machine factory with resource efficiency and adaptability and providing complete after-sales service in the business and value processes that integrate customers and business partners. In this study, we use WebAccess to chart production data and production status so that the process can be monitored in real time, helping the company transform in the future into a small or medium-sized enterprise with advanced Industry 4.0 technology. Through this analytical process technology, the machinery industry can help the factory control the robotic arm, which replaces the traditional manual packaging mode: the arm picks up and removes the plastic blister, grabs the colour pen, and picks up the paper card to insert it into the blister. During the process, the inspection data are transmitted to WebAccess, which collects the production-loop test data; data mining screens the large volume of data, which is then passed to a back-propagation neural network for analysis to model the production rate. Finally, a two-stage clustering method is used to verify the consistency rate. Planning production processes through data models to achieve consistency can also reduce production risks and personnel costs, evolving towards lights-out manufacturing.
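As an illustration of fitting a back-propagation neural network to production-loop data, the sketch below uses scikit-learn's MLPRegressor on synthetic data. The feature names and values are assumptions chosen for illustration; they are not the data collected through WebAccess in the thesis.

```python
# Hypothetical sketch: a back-propagation neural network (scikit-learn MLPRegressor)
# fitted to production-loop data to model the production rate. Data are invented.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 400
cycle_time_s = rng.normal(12.0, 1.5, n)        # robot arm cycle time (assumed feature)
vacuum_pressure = rng.normal(80.0, 5.0, n)     # suction pressure for the blister pick-up (assumed)
reject_rate = rng.uniform(0.0, 0.05, n)
production_rate = 3600 / cycle_time_s * (1 - reject_rate) + rng.normal(0, 5, n)

X = np.column_stack([cycle_time_s, vacuum_pressure, reject_rate])
X_train, X_test, y_train, y_test = train_test_split(X, production_rate, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```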
APA, Harvard, Vancouver, ISO, and other styles
48

Taghipour, Dizaji Roshanak. "Acquiring Multimodal Disaggregate Travel Behavior Data Using Smart Phones." Thesis, 2013. http://hdl.handle.net/10012/7304.

Full text
Abstract:
Despite the significant advances that have been made in traffic sensor technologies, there are only a few systems that provide measurements at the trip level, and fewer yet that can do so for all travel modes. On the other hand, traditional methods of collecting individual travel behavior (i.e. manual or web-based travel diaries) are resource intensive and prone to a wide range of errors. Moreover, although dedicated GPS loggers provide the ability to collect detailed travel behavior data with less effort, their use still faces several challenges, including the need to distribute and retrieve the logger; the potential need to have the survey participants upload data from the logger to a server; and the need for survey participants to carry another device with them on all their trips. The widespread adoption of smart phones provides an opportunity to acquire travel behavior data from individuals without the need for participants to record trips in a travel diary or to carry dedicated recording devices with them on their travels. The collected travel data can then be used by municipalities and regions for forecasting travel demand or for analyzing the travel behavior of individuals. In the current research, a smart-phone-based travel behavior surveying system is designed, developed, and pilot tested. The custom software written for this study is capable of recording the travel characteristics of individuals over the course of any period of time (e.g. days or weeks) and across all travel modes. In this system, a custom application on the smart phone records the GPS data (using the onboard GPS unit) at a prescribed frequency and then automatically transmits the data to a dedicated server. On the server, the data are stored in a dedicated database and then processed using trip characteristics inference algorithms. The main challenge with the implemented system is the need to reduce the amount of energy consumed by the device to calculate and transmit the GPS fixes. In order to reduce the power consumption of the travel behavior data acquisition software, several techniques are proposed in the current study. Finally, in order to evaluate the performance of the developed system, first the accuracy of the position information obtained from the data acquisition software is analyzed, and then the impact of the proposed methods for reducing the battery consumption is examined. In conclusion, the results of the implemented system show that collecting individual travel behavior data with GPS-enabled smart phones is technically feasible and would address most of the limitations associated with other survey techniques. According to the results, the accuracy of the GPS positions and speeds collected through the implemented system is comparable to that of GPS loggers. Moreover, the proposed battery reduction techniques reduce the battery consumption rate from 13.3% per hour to 5.75% per hour (a 57% reduction) when the trip maker is moving and from 5.75% per hour to 1.41% per hour (a 75.5% reduction) when the trip maker is stationary.
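One plausible battery-saving technique of the kind mentioned above is adaptive sampling: lengthen the GPS fix interval when recent fixes suggest the trip maker is stationary. The sketch below illustrates this idea in Python; the thresholds, intervals and coordinates are assumptions and do not reproduce the policy actually implemented in the thesis.

```python
# Hypothetical sketch: adaptive GPS sampling that lengthens the fix interval when
# recent fixes show little displacement. Thresholds and intervals are invented.

from math import asin, cos, radians, sin, sqrt

MOVING_INTERVAL_S = 5          # frequent fixes while travelling (assumed)
STATIONARY_INTERVAL_S = 60     # sparse fixes while stationary (assumed)
STATIONARY_RADIUS_M = 30       # displacement below this counts as stationary (assumed)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two fixes in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def next_interval(recent_fixes):
    """Choose the next sampling interval from the last few (lat, lon) fixes."""
    if len(recent_fixes) < 2:
        return MOVING_INTERVAL_S
    start, end = recent_fixes[0], recent_fixes[-1]
    displacement = haversine_m(*start, *end)
    return STATIONARY_INTERVAL_S if displacement < STATIONARY_RADIUS_M else MOVING_INTERVAL_S

if __name__ == "__main__":
    parked = [(43.4723, -80.5449), (43.4723, -80.5450), (43.4724, -80.5449)]
    driving = [(43.4723, -80.5449), (43.4750, -80.5500), (43.4790, -80.5560)]
    print("parked  ->", next_interval(parked), "s between fixes")
    print("driving ->", next_interval(driving), "s between fixes")
```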
APA, Harvard, Vancouver, ISO, and other styles
49

Firmino, Bruno Manuel Paias. "Smart Monetization - Telecom Revenue Management beyond the traditional invoice." Master's thesis, 2019. http://hdl.handle.net/10362/113609.

Full text
Abstract:
Nowadays, technological evolution is fast and unpredictable, with new systems constantly emerging on the market that have the capability of being monetized. However, these systems are not always fully and flexibly explored. Many hardware distributors sell products without a clear view of sustainable business models for them, leaving these as an afterthought. Communications Service Providers are suddenly under pressure to modernize and expand their business models so as to regain the ground claimed by over-the-top service providers, who make use of existing infrastructures to provide their own services, which naturally may lead to substantial revenue loss for the actual infrastructure owners. The Smart Monetization project aims to explore this paradigm with the design and implementation of a reusable asset, making use of Big Data and analytics tools that can ingest and process usage and billing data from customers, detecting event patterns and correlations that can be monetized, leading to improved and new service experiences and ensuring greater transparency in the process of billing and charging for these services.
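To illustrate the general idea of detecting monetizable usage patterns in customer records, here is a minimal Python sketch that aggregates usage per customer and flags heavy video streaming on a basic plan as a candidate for a dedicated offer. The record schema, plan names and threshold are invented; this is not the asset described in the abstract.

```python
# Hypothetical sketch: aggregate usage records per customer and flag a pattern
# (heavy streaming on a basic plan) that could be addressed with a new offer.
# Schema, plans and threshold are invented for illustration.

from collections import defaultdict

usage_records = [
    {"customer": "c1", "service": "video_streaming", "mb": 1200, "plan": "basic"},
    {"customer": "c1", "service": "video_streaming", "mb": 900,  "plan": "basic"},
    {"customer": "c2", "service": "web",             "mb": 150,  "plan": "basic"},
    {"customer": "c3", "service": "video_streaming", "mb": 300,  "plan": "premium"},
]

STREAMING_THRESHOLD_MB = 1500   # assumed cut-off for a "heavy streamer"

def heavy_streamers(records):
    totals = defaultdict(int)
    plans = {}
    for r in records:
        plans[r["customer"]] = r["plan"]
        if r["service"] == "video_streaming":
            totals[r["customer"]] += r["mb"]
    return [c for c, mb in totals.items()
            if mb >= STREAMING_THRESHOLD_MB and plans[c] == "basic"]

print("Candidates for a streaming add-on offer:", heavy_streamers(usage_records))
```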
APA, Harvard, Vancouver, ISO, and other styles
50

WU, DE-CHANG, and 鄔德昌. "A Study on Transformer Load Management by Utilizing Smart Meter Data of Low Voltage Customers." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/17482010491175551068.

Full text
Abstract:
Master's thesis
National Kaohsiung Marine University
Graduate Institute of Marine Engineering
105
Utilities are beginning to turn to smart metering value-added application technologies to improve distribution system operations. The aim of this thesis is to build an analysis model for the value-added application of distribution transformer load management based on low voltage (LV) smart metering. Low voltage network state estimation is used to obtain an estimate of the transformer load together with data from the customer information system. The result is then used for transformer load monitoring with data visualization techniques. The proposed method can assist utilities in transformer load management by identifying assets requiring replacement as they reach the end of their useful life.
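The core idea of estimating a transformer's interval load from the smart meters of the LV customers connected to it can be illustrated with a short Python sketch. The customer-to-transformer mapping, the interval readings and the transformer ratings below are invented; this shows only the general aggregation idea, not the thesis's estimation model.

```python
# Hypothetical sketch: sum the interval demand of the customers on each transformer
# and flag intervals that exceed the rated capacity. All data are invented.

customer_to_transformer = {"cust_1": "T1", "cust_2": "T1", "cust_3": "T2"}
transformer_rating_kw = {"T1": 50.0, "T2": 100.0}

# 15-minute interval demand per customer, in kW (invented values).
interval_kw = {
    "cust_1": [12.0, 30.0, 18.0],
    "cust_2": [15.0, 25.0, 10.0],
    "cust_3": [40.0, 45.0, 60.0],
}

def transformer_load(meter_data, mapping):
    loads = {}
    for cust, series in meter_data.items():
        tx = mapping[cust]
        current = loads.setdefault(tx, [0.0] * len(series))
        loads[tx] = [a + b for a, b in zip(current, series)]
    return loads

if __name__ == "__main__":
    loads = transformer_load(interval_kw, customer_to_transformer)
    for tx, series in loads.items():
        overloaded = [i for i, kw in enumerate(series) if kw > transformer_rating_kw[tx]]
        print(tx, "load per interval:", series, "| overloaded intervals:", overloaded)
```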
APA, Harvard, Vancouver, ISO, and other styles
