Dissertations on the topic "Données intelligentes"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Explore the top 50 dissertations for research on the topic "Données intelligentes."
Next to every work in the bibliography you will find an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online abstract, where the relevant parameters are available in the metadata.
Browse dissertations from a wide range of disciplines and assemble your bibliography correctly.
Carel, Léna. „Analyse de données volumineuses dans le domaine du transport“. Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLG001.
The aim of this thesis is to apply new methodologies to public transportation data. Indeed, we are more and more surrounded by sensors and computers generating huge amounts of data. In the field of public transportation, smart cards generate data about our purchases and our travels every time we use them. In this thesis, we used these data for two purposes. First, we wanted to be able to detect groups of passengers with similar temporal habits. To that end, we began by using Non-negative Matrix Factorization as a pre-processing tool for clustering. We then introduced the NMF-EM algorithm, which allows simultaneous dimension reduction and clustering on a multinomial mixture model. The second purpose of this thesis is to apply regression methods to these data in order to forecast the number of check-ins on a network and give a range of likely check-ins. We also used this methodology to detect anomalies on the network.
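The NMF pre-processing step mentioned in the abstract can be illustrated with a minimal, self-contained sketch. This is not the thesis's code: the update rules are the classic Lee-Seung multiplicative updates, and the tiny "passenger × time-slot" check-in matrix is invented for illustration.

```python
# Illustrative sketch: NMF (Lee-Seung multiplicative updates) as a
# pre-processing step before clustering temporal travel profiles.
# Data, sizes, and the crude argmax "clustering" are invented.
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def nmf(V, k, iters=200, eps=1e-9):
    m, n = len(V), len(V[0])
    rng = random.Random(0)
    W = [[rng.random() for _ in range(k)] for _ in range(m)]
    H = [[rng.random() for _ in range(n)] for _ in range(k)]
    for _ in range(iters):
        WT = transpose(W)
        num, den = matmul(WT, V), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)] for i in range(k)]
        HT = transpose(H)
        num, den = matmul(V, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)] for i in range(m)]
    return W, H

# Toy "passenger x hour-of-day" check-in matrix with two obvious profiles.
V = [[9, 8, 1, 0], [8, 9, 0, 1], [1, 0, 9, 8], [0, 1, 8, 9]]
W, H = nmf(V, k=2)
# Each passenger's row of W is a low-dimensional profile; assigning each
# passenger to its dominant latent factor acts as a crude clustering.
labels = [max(range(2), key=lambda j: row[j]) for row in W]
```

In the thesis this reduced representation feeds a proper clustering model (and NMF-EM performs reduction and clustering jointly); the argmax here only shows how the latent factors separate the two temporal profiles.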
Marquet, Clément. „Binaire béton : Quand les infrastructures numériques aménagent la ville“. Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLT036.
How is the city developed by and with digital technologies? To answer this question, the thesis analyses in a single movement the urban consequences of the informational and physical infrastructures of digital technology. To this end, it mobilizes the methodological and analytical frameworks of infrastructure studies, the sociology of techniques and innovation, and the sociology of public problems. Empirically, it steps back from the major mediatized experiments of the "smart city" to study the more diffuse, everyday transformations generated by digital technologies. It consists of two case studies: on the one hand, it follows a programme to develop connected services to improve the accessibility of a Paris Region transport network for people with reduced mobility; on the other hand, it analyses the discreet establishment of numerous data centres in Plaine Commune, in the north of the Parisian metropolitan area, and the resulting local unrest. The fieldwork includes several participant observations, about 40 interviews, a press review and the analysis of internal documents of the organizations. The thesis shows how the logic of immediacy, of "real time", generally at the centre of the promises associated with the digital city, requires an increased availability of workers, data and servers. Thus, in the transport company, projects to improve passenger service via smartphones confront station agents with the dual imperative of the face-to-face relationship and the alerts of the connected device. The cartographic data on which connected services are based, often taken for granted, require organizations to invent new collaborations to ensure their production and maintenance.
The servers necessary for the functioning of the digital society are accumulated, protected and maintained in data centres: imposing buildings that are geographically concentrated, disrupt the environments in which they are located, disconcert elected officials and disturb residents. The logic of real time thus weighs on the social and spatial organization of cities, and invites us to rethink the urban development of digital infrastructures in terms of work, maintenance and the environment, rather than in the more commonly mobilized terms of socio-technical imaginaries, promises of optimization and urban models.
Mbacke, Abdoul Aziz. „Collecte et remontée multi-sauts de données issues de lecteurs RFID pour la surveillance d'infrastructures urbaines“. Thesis, Lille 1, 2018. http://www.theses.fr/2018LIL1I052/document.
The rapid urbanization the world is witnessing requires better management of cities. This improved management involves monitoring and maintaining urban infrastructure and equipment to ensure greater safety and well-being for residents. A key role has therefore been given to ICTs through the concepts of the IoT and Smart Cities. This thesis is positioned in this context and proposes Radio Frequency Identification (RFID) as a complement to the techniques already in use. The adoption of large-scale RFID in urban centers, however, needs to address two main issues: reading collisions, and data collection and reporting. Through the work carried out in this thesis, we first sought to identify the solutions already proposed in the literature to reduce collisions. Based on this study, we proposed two distributed anti-collision algorithms, DEFAR and CORA. They ensure a high read throughput while maintaining a low collision rate and latency compared to literature solutions. Subsequently, we proposed DACAR, a distributed algorithm for collecting data from RFID readers in a multi-hop manner. It adapts according to the anti-collision protocol used and the position of the deployed readers to provide a reliable packet delivery ratio and low end-to-end delay. An improved version is later proposed to prioritize data and to offer more suitable alternative paths using a combination of different parameters through fuzzy logic.
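DEFAR and CORA themselves are not reproduced here. As a generic illustration of the reader-collision problem they address, the following sketch simulates the classic slotted-ALOHA style of scheduling, in which each reader transmits in a random slot of a frame; all numbers are invented.

```python
# Generic reader-collision simulation (slotted-ALOHA style), not the
# thesis's DEFAR/CORA algorithms. Two readers that pick the same slot
# of a frame are counted as collided; all readers are assumed to
# interfere with one another for simplicity.
import random

def simulate(n_readers, n_slots, rounds=1000, seed=1):
    rng = random.Random(seed)
    collided = 0
    for _ in range(rounds):
        slots = [rng.randrange(n_slots) for _ in range(n_readers)]
        collided += sum(slots.count(s) > 1 for s in slots)
    return collided / (rounds * n_readers)   # per-reader collision rate

rate_few = simulate(n_readers=8, n_slots=8)     # crowded frame
rate_many = simulate(n_readers=8, n_slots=32)   # larger frame
```

Enlarging the frame lowers the collision rate but raises latency, which is exactly the throughput/latency trade-off that distributed anti-collision protocols such as those proposed in the thesis aim to improve on.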
Pujol, Hadrien. „Antennes microphoniques intelligentes : localisation de sources acoustiques par Deep Learning“. Thesis, Paris, HESAM, 2020. http://www.theses.fr/2020HESAC025.
In this PhD thesis, I explore supervised learning for the task of locating acoustic sources. To do so, I developed a new deep neural network architecture. But to optimize the millions of learnable variables of this network, a large database of examples is needed. Two complementary approaches are therefore proposed to build these examples. The first is to carry out numerical simulations of microphone recordings. The second is to place a microphone array at the center of a sphere of loudspeakers, which makes it possible to spatialize sounds in 3D and to record directly on the microphone array the signals emitted by this experimental 3D sound-field simulator. The neural network could thus be tested under different conditions, and its performance compared to that of conventional algorithms for locating acoustic sources. The results show that this approach generally allows a more precise localization, and is also much faster than conventional algorithms from the literature.
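As a point of reference for the "conventional algorithms" the network is compared against, here is a hedged sketch of time-difference-of-arrival (TDOA) localization for a single microphone pair, under a far-field assumption; the spacing, sample rate, and signals are all invented.

```python
# Illustrative TDOA baseline (not the thesis's method): estimate a source
# direction from the delay between two microphones, found by brute-force
# cross-correlation. Far-field model: delay = d * sin(theta) / c.
import math

def tdoa_to_angle(delay_s, d=0.2, c=343.0):
    # d = microphone spacing (m), c = speed of sound (m/s); both invented
    return math.degrees(math.asin(max(-1.0, min(1.0, delay_s * c / d))))

def best_lag(a, b, max_lag):
    # Lag k maximising the cross-correlation sum a[n] * b[n + k]
    return max(range(max_lag + 1), key=lambda k: sum(x * y for x, y in zip(a, b[k:])))

fs = 8000                                       # sample rate (Hz), illustrative
a = [math.sin(2 * math.pi * 440 * n / fs) for n in range(256)]
k_true = 3
b = [0.0] * k_true + a[:-k_true]                # mic 2 hears a 3-sample delay

k = best_lag(a, b, max_lag=10)
angle = tdoa_to_angle(k / fs)                   # estimated direction, degrees
```

Scanning candidate lags like this over every microphone pair is what makes such classical methods slow compared with a single forward pass of a trained network, which is the speed advantage the abstract reports.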
Courmont, Antoine. „Politiques des données urbaines : ce que l'open data fait au gouvernement urbain“. Thesis, Paris, Institut d'études politiques, 2016. http://www.theses.fr/2016IEPP0042/document.
Analyzing open data policies, this thesis investigates the effect of the circulation of data on urban government. This political sociology of data, which jointly analyses the transformation of data and of the actors associated with them, highlights the pluralism of the politics of urban data. Based on an ethnographic investigation inside the Metropolis of Lyon, the thesis studies open data policy in the making. In addition, 70 interviews, archive material and a partial comparison with North American cities were used for the analysis. Following the chain of open data, the thesis emphasizes a tension between attachment and detachment. Attached to vast socio-technical networks, data must be detached from their initial environment to circulate, before being re-attached to new users. To this end, data undergo a series of trials. The uncertain outcome of these trials produces new agencements which question sectoral, institutional and territorial borders. That is why, to maintain control over its public policies, the challenge for a local government is to manage to regulate the flows of data on its territory. Data thus become an issue that must be governed.
Pham, Thi Hai Yen. „Smart city for the preservation of urban biodiversity“. Thesis, Lille 1, 2020. http://www.theses.fr/2020LIL1I043.
This work aims to develop and implement monitoring systems on the Scientific Campus of Lille University, in the north of France, in order to observe and evaluate the state of its biodiversity. This thesis includes four parts. The first part is a literature review concerning the role of biodiversity and the impact of urbanization on it, as well as the development of the Smart City concept and its application in the field of ecology. The second part builds a framework for urban biodiversity monitoring, covering the selection of indicators to monitor, data collection, data analysis, and the evaluation of urban biodiversity status. The third part presents the application of the methodology of part 2 to the scientific campus of Lille University. It presents successively the scientific campus, the indicators used in this work, data collection and analysis, and finally the main outcomes of this work and recommendations for the preservation of biodiversity on the scientific campus. The last part deals with open data and its application to biodiversity research. It also presents how to access such data and how they can be used in the biodiversity domain.
Nguyen, Trung Ky. „Génération d'histoires à partir de données de téléphone intelligentes : une approche de script Dealing with Imbalanced data sets for Human Activity Recognition using Mobile Phone sensors“. Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAS030.
A script is a structure that describes a typical sequence of events or actions in daily life. A story is an invoked script with one or more interesting deviations, which allows us to understand more deeply what happened within the routine behaviour of daily life. Scripts are therefore essential to many ambient-intelligence applications such as health monitoring and emergency services. In recent years, advances in sensing technologies and embedded systems have made it possible for health-care systems to collect human activity data continuously, by integrating sensors into wearable devices (e.g., smartphones, smartwatches). Human activity recognition (HAR) has consequently become a hot research topic over the past decade. Most HAR research uses machine learning approaches such as neural networks and Bayesian networks. The ultimate goal of this thesis is therefore to generate such stories or scripts from the activity data of wearable sensors using machine learning. To the best of our knowledge, this is not a trivial task, owing to the very limited information carried by wearable-sensor activity data. Although many machine learning approaches have been proposed for HAR in recent years (e.g., convolutional and deep neural networks) to enhance recognition accuracy, there is still no approach that generates scripts or stories from such data. To achieve this goal, we first propose a novel framework that addresses the problem of imbalanced data, based on active learning combined with an oversampling technique, so as to enhance the recognition accuracy of conventional machine learning models such as the multilayer perceptron. Second, we introduce a novel scheme that automatically generates scripts from wearable-sensor human activity data using deep learning models, and we evaluate its performance. Finally, we propose a neural event-embedding approach that benefits from semantic and syntactic information about the textual context of events, and that learns the stereotypical order of events from sets of narratives describing typical situations of everyday life.
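The balancing step of the imbalanced-data framework can be illustrated with the simplest form of oversampling: duplicating minority-class samples until classes are balanced. This is only a sketch of the general technique (the thesis combines oversampling with active learning); the activity labels and feature values are invented.

```python
# Illustrative random oversampling for an imbalanced activity dataset.
# Not the thesis's framework: it shows only the class-balancing idea.
import random
from collections import Counter

def oversample(X, y, seed=0):
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    X_out, y_out = list(X), list(y)
    for label, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == label]
        for _ in range(target - n):
            i = rng.choice(idx)          # duplicate a random minority sample
            X_out.append(X[i])
            y_out.append(label)
    return X_out, y_out

X = [[0.1], [0.2], [0.3], [0.9]]
y = ["walk", "walk", "walk", "fall"]     # "fall" is the rare activity
Xb, yb = oversample(X, y)
counts_after = Counter(yb)
```

Plain duplication is the crudest variant; synthetic-sample methods (e.g. SMOTE-style interpolation) follow the same balancing logic with jittered copies instead of exact ones.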
Saunier, Camille. „La protection des données personnelles des utilisateurs d'enceintes connectées «intelligentes» par le Règlement européen no 2016/679, le droit canadien et le droit québécois : approche comparatiste“. Master's thesis, Université Laval, 2020. http://hdl.handle.net/20.500.11794/38291.
This research examines the protection of the personal information of users of "smart" connected speakers. Through this particular connected object, the study looks at how personal data protection is addressed by European Regulation 2016/679 (GDPR), the Personal Information Protection and Electronic Documents Act (PIPEDA) and the Quebec Act respecting the protection of personal information in the private sector, throughout the data life cycle. These pieces of legislation differ in both their dates of adoption and their legal systems, yet the factual situations they govern make them particularly interesting objects of comparison. The study shows that the "smart" connected speaker highlights the shortcomings of the legislation examined with respect to its relationship to time, the mass of data collected, and the opacity of the machine.
Ralitera, Tahina. „Simulations multi-agent pour les villes intelligentes : une architecture multi-environnement temporelle, spatiale et organisationnelle. Apports pour l’anticipation“. Electronic Thesis or Diss., La Réunion, 2020. http://www.theses.fr/2020LARE0017.
Multi-agent simulation is a promising approach for smart city design and planning. In this context, we focus on the example of recharging electric vehicles at public charging points. This example illustrates a problem of managing resources that are limited and shared in time and space. Rolland May defines three main dimensions that should be integrated by the system: space, organisation and time. In multi-agent simulations, the spatial dimension and the social dimension are the subject of numerous proposals in the literature. By contrast, time has received very little study and consideration. In addition, while much research deals with spatial and organisational considerations in the agent's reasoning, the consideration of time, as a system dynamic, is often overlooked. This highlights two needs to which we want to contribute: the need for an interaction support to exchange spatial, social and temporal information, and the need for reasoning that takes this exchanged spatial, temporal and organisational information into account. Throughout this thesis, our first objective is to make the multi-agent simulation paradigm evolve so as to consider time as a new medium of interaction, in the same way as the spatial environment or the organisational environment. For that purpose, we draw on existing approaches commonly used for modelling space and organisations. Our model, called Agent-Group-Environment-Time (AGRET), is an extension of the generic organisational model AGR and its variant AGRE. The originality of our approach is that it integrates the temporal dimension as an environment, in the same way as the spatial environment and the social environment. This time environment supports the exchange and storage of temporal information, and it complements the simulation scheduler, which manages the simulation's activation cycle. The implementation of this new interaction environment opens up new possibilities.
One of these possibilities is the use of temporal, spatial and social information, perceived through the environments, to optimise the agent's reasoning. In this context, we choose to focus on anticipatory reasoning, which is particularly interesting in the context of the smart city. This anticipatory reasoning increases the realism of the simulation by exhibiting a cognitive capacity that is specific to humans. It also improves the agent's decision mechanism by choosing a more relevant behaviour that takes into account the agent's temporal, spatial and social activation context. This anticipatory reasoning is based on information about the past, the present and the future, which the agent perceives through the temporal environment. The inclusion of future information in the anticipatory reasoning is an original feature of this approach, made possible by the temporal environment, which allows storing and perceiving information along the temporal dimension. To summarise, both of our contributions concern time. The first is the representation of time as an environment: at the multi-agent level, we propose an interaction support for the exchange and storage of information on space, time and organisation. The second concerns temporal reasoning: we propose an anticipatory reasoning based on the perception of the spatial, temporal and social environments, exploiting in particular the visibility of the future dimension of time that the temporal environment allows. In the example of electric vehicle recharging, the integration of our approaches allows, at the collective level, the optimisation of the distribution of recharging in space and time. We show this through an implementation on a multi-agent simulation model called SkuadCityModel. More generally, at the level of the smart city, the implementation of our contributions allows the optimisation of resource management in space and time.
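AGRET's internals are not public in this listing, so the following is only a hypothetical miniature of a "time environment": agents post timestamped facts, including announced future events, into a shared store, and an anticipating agent queries a future window before committing to a charging slot. All names and numbers are invented.

```python
# Hypothetical sketch of a shared time environment for anticipation.
# Not AGRET itself: it only illustrates storing and querying facts
# along the temporal dimension, including future (announced) events.
import bisect

class TimeEnvironment:
    def __init__(self):
        self._events = []                      # sorted list of (t, fact)

    def post(self, t, fact):
        bisect.insort(self._events, (t, fact))

    def window(self, start, end):
        # All facts with start <= t <= end, in time order.
        lo = bisect.bisect_left(self._events, (start, ""))
        hi = bisect.bisect_right(self._events, (end, "\uffff"))
        return [f for _, f in self._events[lo:hi]]

env = TimeEnvironment()
env.post(8, "station A busy")                  # announced occupancy
env.post(9, "station A busy")
env.post(11, "station A free")
# An agent at t=7 anticipates by inspecting the announced near future:
plan = "wait" if "station A free" not in env.window(7, 10) else "go now"
```

The point of the sketch is that anticipation becomes a simple query over the time environment, rather than a private forecast inside each agent.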
Ali, Shayar. „Smart City : Implementation and development of platforms for the management of SunRise Smart Campus“. Thesis, Lille 1, 2018. http://www.theses.fr/2018LIL1I027/document.
This work concerns the implementation of professional platforms and the development of the SunRise platform for managing a Smart City. It is part of the SunRise project, which aims at turning the Scientific Campus of the University of Lille into a large-scale demonstrator site of the "Smart and Sustainable City". The campus is representative of a small town of 25,000 inhabitants with 100 km of urban infrastructure. This thesis includes five parts. The first part is a literature review concerning Smart Cities, with their definitions and components. The second part presents the role of data in Smart Cities, as well as the latest technologies used for Smart City management. It also presents the different existing architectures and platforms for managing a Smart City. The third part presents the SunRise Smart City demonstrator, which is used as the basis for this thesis, and details the instrumentation installed on the demonstration site as well as its GIS model. The fourth part concerns the architecture of the two professional platforms, PI System and OpenDataSoft, as well as their implementation and use for the analysis of water consumption. The last part describes the architecture of the SunRise platform and details its layers. It also presents the stages of the platform's development and implementation.
Guastella, Davide Andrea. „Dynamic learning of the environment for eco-citizen behavior“. Thesis, Toulouse 3, 2020. http://www.theses.fr/2020TOU30160.
The development of sustainable smart cities requires the deployment of Information and Communication Technology (ICT) to ensure better services and information available at any time and everywhere. Although IoT devices are becoming more powerful and lower-cost, implementing an extensive sensor network over an urban area can still be expensive. This thesis proposes a technique for estimating missing environmental information in large-scale environments. Our technique makes it possible to provide information for areas of the environment that are not covered by sensing devices. The contribution of our proposal is summarized in the following points: * limiting the number of sensing devices to be deployed in an urban environment; * the exploitation of heterogeneous data acquired from intermittent devices; * real-time processing of information; * self-calibration of the system. Our proposal uses the Adaptive Multi-Agent System (AMAS) approach to solve the problem of information unavailability. In this approach, an exception is considered as a Non-Cooperative Situation (NCS) that has to be solved locally and cooperatively. HybridIoT exploits both homogeneous information (information of the same type) and heterogeneous information (information of different types or units), acquired from the available sensing devices, to provide accurate estimates at points of the environment where no sensing device is available.
The proposed technique enables estimating accurate environmental information under conditions of uncertainty arising from the urban application context in which the project is situated, conditions which have not been explored by state-of-the-art solutions: * openness: sensors can enter or leave the system at any time without the need for any reconfiguration; * large scale: the system can be deployed in a large urban context and ensure correct operation with a significant number of devices; * heterogeneity: the system handles different types of information without any a priori configuration. Our proposal does not require any input parameters or reconfiguration. The system can operate in open, dynamic environments such as cities, where a large number of sensing devices can appear or disappear at any time, without any prior notification. We carried out different experiments to compare the results obtained with various standard techniques in order to assess the validity of our proposal. We also developed a pipeline of standard techniques to produce baseline results to be compared with those obtained by our multi-agent proposal.
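The AMAS/HybridIoT estimator itself is not reproduced here. As a simple stand-in for the underlying idea of estimating a missing reading from nearby sensors, this sketch uses inverse-distance weighting; the sensor positions and temperatures are invented.

```python
# Illustrative stand-in for missing-value estimation (not HybridIoT):
# inverse-distance weighting of the readings of nearby sensors.
def idw_estimate(target, sensors, power=2):
    # sensors: list of ((x, y), value); target: (x, y) with no sensor
    num = den = 0.0
    for (x, y), v in sensors:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return v                      # a sensor sits exactly at the target
        w = 1.0 / d2 ** (power / 2)       # closer sensors weigh more
        num += w * v
        den += w
    return num / den

sensors = [((0, 0), 20.0), ((2, 0), 22.0), ((0, 2), 21.0)]
temp = idw_estimate((1, 0), sensors)      # estimate at an uncovered point
```

Unlike this fixed formula, the cooperative agents of the thesis adapt their estimates as intermittent sensors appear and disappear, but the goal, filling spatial gaps from neighbouring readings, is the same.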
Hiot, Nicolas. „Construction automatique de bases de données pour le domaine médical : Intégration de texte et maintien de la cohérence“. Electronic Thesis or Diss., Orléans, 2024. http://www.theses.fr/2024ORLE1026.
The automatic construction of databases in the medical field represents a major challenge for guaranteeing efficient information management and facilitating decision-making. This research project focuses on the use of graph databases, an approach that offers a dynamic representation and efficient querying of data and their topology. Our project explores the convergence between databases and natural language processing, with two central objectives. On the one hand, we focus on maintaining consistency within graph databases during updates, particularly with incomplete data and specific business rules. Maintaining consistency during updates guarantees a uniform level of data quality for all users and facilitates analysis. In a world of constant change, we give priority to updates, which may involve modifying the instance to accommodate new information. But how can we effectively manage these successive updates within a graph database management system? On the other hand, we focus on the integration of information extracted from text documents, a major source of data in the medical field. In particular, we look at clinical cases and at pharmacovigilance, a crucial area for identifying the risks and adverse effects associated with the use of drugs. But how can we detect information in texts? How can this unstructured data be efficiently integrated into a graph database? How can it be structured automatically? And finally, what is a valid structure in this context? We are particularly keen to encourage reproducible research, adopting a transparent and documented approach that enables independent verification and validation of our results.
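The consistency-under-update problem can be pictured with a hypothetical miniature: a tiny in-memory property graph plus one invented business rule ("every PRESCRIBED edge must point at an existing Drug node") checked on every update. Real graph DBMSs enforce constraints very differently; this only illustrates the idea of rejecting an update that would leave the instance inconsistent.

```python
# Hypothetical miniature of consistency maintenance in a property graph.
# The rule, labels, and API are invented for illustration only.
class Graph:
    def __init__(self):
        self.nodes = {}                  # node id -> label
        self.edges = []                  # (src, kind, dst)

    def add_node(self, nid, label):
        self.nodes[nid] = label

    def add_edge(self, src, kind, dst):
        edge = (src, kind, dst)
        self.edges.append(edge)
        if not self.consistent():
            self.edges.remove(edge)      # reject update, keep instance valid
            return False
        return True

    def consistent(self):
        # Business rule: PRESCRIBED edges must target an existing Drug node.
        return all(
            self.nodes.get(dst) == "Drug"
            for _, kind, dst in self.edges
            if kind == "PRESCRIBED"
        )

g = Graph()
g.add_node("p1", "Patient")
g.add_node("d1", "Drug")
ok = g.add_edge("p1", "PRESCRIBED", "d1")    # satisfies the rule
bad = g.add_edge("p1", "PRESCRIBED", "x9")   # dangling target: rejected
```

With incomplete data, as in the thesis, the interesting case is precisely the rejected update: the system must decide whether to refuse it, repair the instance, or relax the rule.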
Calvez, Philippe. „Modélisation d'agencements énergétiques durables dans les zones urbaines intelligentes : une approche pour la réduction de l’emprise énergétique par les pratiques soutenables“. Thesis, Paris 1, 2015. http://www.theses.fr/2015PA010056.
On the one hand, the ecological transition and sustainable development are today realities that cannot be ignored, given the negative impacts of human activities on their environments. On the other hand, the increasingly extensive digitization of these environments results in the generation of massive volumes of digital traces, which are all signs of actors' activities. A significant challenge is to understand the ins and outs of the environmental impact of activities, taking the Energy Emprise (EmE) as a key indicator, and to understand how strongly this indicator can vary from one activity to another. Our approach considers the identification of practices on the basis of the digital traces generated by human and non-human entities during specific activities. Practices (instantiations of activities) use more or fewer resources (physical and virtual) during their existence. Being able to identify which ones depend more heavily on resources would help us better understand how to promote the ecological transition. Promoting, or at least identifying on the basis of quantifiable indicators (i.e., the Energy Emprise), practices that have a low impact on the environment could be an innovative approach. These practices, in the sense of the coordination of multiple heterogeneous entities in time and space, can be formalized in the form of multidimensional structures of activities, Hypergraphs of Activities, using the theory of Assemblage (agencement in French) and a set of mathematical tools (simplicial complexes, hypernetworks). This research attempts to model the phenomenon of human and non-human activity based on the characterization of context (massive contextual data). These assemblages are computed and represented in a research application (IMhoTEP), which aims to build these complex structures not from an a priori classification of entities, but by focusing on the relationships they maintain across several dimensions.
The main goal is to offer a decision-support tool that accompanies actors in the ecological transition by helping them understand which activities induce the consumption or production of resources. This academic research in the field of computer science builds on the continuous digitization of physical and virtual spaces, particularly highly connected urban areas (Smart City, Internet of Everything).
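The hypergraph-of-activities idea can be sketched in miniature: each activity is a hyperedge over the heterogeneous entities it assembles, and overlaps between hyperedges reveal shared resources. This is an invented toy, not IMhoTEP's representation, and it stands in for the much richer simplicial-complex analysis the thesis performs.

```python
# Toy "activity hypergraph": activities map to the set of entities
# (people, devices, places) they assemble. Entity names are invented.
activities = {
    "commute": {"person:A", "bus:12", "card:77"},
    "work":    {"person:A", "laptop:9", "office:3"},
    "meeting": {"person:A", "person:B", "office:3"},
}

def sharing(activities):
    # For every pair of activities, list the entities they share.
    pairs = {}
    names = sorted(activities)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            common = activities[a] & activities[b]
            if common:
                pairs[(a, b)] = common
    return pairs

links = sharing(activities)
```

Shared entities are what couple the resource footprints of practices; in the thesis this coupling is analysed with simplicial complexes and hypernetworks rather than with pairwise intersections.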
Cao, Huu Quyet. „Policy-based usage control for trustworthy data sharing in smart cities“. Electronic Thesis or Diss., Evry, Institut national des télécommunications, 2017. http://www.theses.fr/2017TELE0010.
In smart cities, Information and Communication Technologies, in particular Internet of Things (IoT) technologies, are integrated into the traditional services of our cities, for example waste management, air-pollution monitoring, and parking, to improve quality while reducing the costs of these services. IoT data in this context are generated by different actors, such as service providers, developers, and municipal authorities, and should be shared among applications or services. In the traditional scenario, however, there is no sharing of IoT data between them: each actor consumes data from sensors deployed on its own behalf, and only the network infrastructure may be shared. In order to encourage IoT data sharing, we need to establish trust between the actors. Exercising control over the usage of data by other actors is critical to building trust; thus, the actors should have the ability to control how their data are going to be used. This major issue, namely usage control, had not yet been addressed in the IoT. In this thesis, we take into account obligations defined by the actors for their data: (i) abstraction of certain information, (ii) spatial and temporal granularity, (iii) classification of actors and purposes, and (iv) monetization of data. For example, data-usage requirements in intelligent parking applications are: (i) data owners have full access to all the details, (ii) municipal authorities can access the average occupancy of parking places per street on an hourly basis, (iii) commercial service providers can access only statistical data over a zone and on a weekly basis, and (iv) monetization of data can be based on subscription types or user roles. The thesis contributions include: (i) a policy-based Data Usage Control Model (DUPO) that responds to the obligations defined by actors for their data.
(ii) A Trustworthy Data Sharing Platform as a Service that allows transparency and traceability of data usage, with open APIs based on DUPO and Semantic technologies. (iii) A visualization tool prototype that enables actors to control how their data will be used. (iv) An evaluation of the performance and impact of our solution. The results show that the added trust layer does not affect the performance of the system. Mistrust might hamper public acceptance of IoT data sharing in smart cities. Our solution is a key enabler that establishes trust between data owners and consumers by taking the obligations of the data owners into account. It is useful for data operators who would like to provide an open data platform with efficient enablers for partners, data-based services for clients, and the ability to attract partners to share data on their platforms.
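The "temporal granularity" obligation from the parking example can be sketched as a role-dependent view over one raw occupancy stream. The roles, resolutions, and numbers below are taken loosely from the abstract's example but the code itself is invented, not DUPO.

```python
# Hedged sketch of a granularity obligation (not the DUPO model): the same
# raw parking-occupancy stream is served at a resolution that depends on
# the consumer's role. Roles and slot sizes are illustrative.
def average(xs):
    return sum(xs) / len(xs)

def serve(readings, role):
    # readings: one occupancy ratio per 10-minute slot, 6 slots per hour
    if role == "owner":
        return readings                                        # full detail
    if role == "municipality":                                 # hourly means
        return [average(readings[i:i + 6]) for i in range(0, len(readings), 6)]
    if role == "commercial":                                   # one statistic
        return [average(readings)]
    raise ValueError("unknown role")

day = [i / 144 for i in range(144)]        # fake 24 h of 10-minute slots
owner = serve(day, "owner")
hourly = serve(day, "municipality")
coarse = serve(day, "commercial")
```

A policy engine like the one the thesis proposes would derive such views from declarative obligations rather than hard-coded branches, but the enforced effect on each consumer's data is the same kind of aggregation.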
Abdelouahab, Kamel. „Reconfigurable hardware acceleration of CNNs on FPGA-based smart cameras“. Thesis, Université Clermont Auvergne (2017-2020), 2018. http://www.theses.fr/2018CLFAC042/document.
Deep Convolutional Neural Networks (CNNs) have become a de facto standard in computer vision. This success came at the price of a high computational cost, making the implementation of CNNs under real-time constraints a challenging task. To address this challenge, the literature exploits the large amount of parallelism exhibited by these algorithms, motivating the use of dedicated hardware platforms. In power-constrained environments, such as smart camera nodes, FPGA-based processing cores are known to be adequate solutions for accelerating computer vision applications. This is especially true for CNN workloads, which have a streaming nature that suits reconfigurable hardware architectures well. In this context, this thesis addresses the problem of mapping CNNs onto FPGAs. In particular, it aims at improving the efficiency of CNN implementations through two main optimization strategies: the first focuses on the CNN model and parameters, while the second considers the hardware architecture and its fine-grain building blocks.
Kurdej, Marek. „Exploitation of map data for the perception of intelligent vehicles“. Thesis, Compiègne, 2015. http://www.theses.fr/2015COMP2174/document.
Der volle Inhalt der Quelle This thesis is situated in the domains of robotics and data fusion, and concerns geographic information systems. We study the utility of adding digital maps, which model the urban environment in which the vehicle evolves, as a virtual sensor improving perception results. Indeed, maps contain a phenomenal quantity of information about the environment: its geometry, its topology and additional contextual information. In this work, we extract road surface geometry and building models in order to deduce the context and the characteristics of each detected object. Our method is based on an extension of occupancy grids: evidential perception grids. It makes it possible to model explicitly the uncertainty related to map and sensor data. By this means, the approach also has the advantage of representing homogeneously data originating from various sources: lidar, camera or maps. Maps are handled on equal terms with the physical sensors. This approach allows us to add geographic information without attributing undue importance to it, which is essential in the presence of errors. In our approach, the information fusion result, stored in a perception grid, is used to predict the state of the environment at the next instant. Estimating the characteristics of dynamic elements does not satisfy the static-world hypothesis, so it is necessary to adjust the level of certainty attributed to these pieces of information. We do so by applying temporal discounting. Because existing methods are not well suited to this application, we propose a family of discount operators that take into account the type of information handled. The studied algorithms have been validated through tests on real data. We developed prototypes in Matlab and C++ software based on the Pacpus framework, with which we present the results of experiments performed in real conditions
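In its classical (Shafer) form, the temporal discounting mentioned in this abstract weakens each belief mass by a reliability factor alpha and transfers the removed mass to the whole frame of discernment Omega; the thesis proposes specialized operators that differ from this baseline. A minimal sketch of the classical case, assuming a mass function stored as a dict of focal sets:

```python
def discount(masses, alpha):
    """Classical Shafer discounting of a mass function.

    masses: dict mapping focal sets (frozensets) to belief mass;
    the frame of discernment Omega is the union of all focal sets.
    alpha: reliability factor in [0, 1]; alpha = 1 keeps the mass
    function unchanged, alpha = 0 yields total ignorance.
    """
    omega = frozenset().union(*masses)
    out = {a: alpha * m for a, m in masses.items() if a != omega}
    # The discounted mass is transferred to Omega (total ignorance).
    out[omega] = 1.0 - alpha + alpha * masses.get(omega, 0.0)
    return out

# A grid cell believed 'occupied' with mass 0.8; one discounting step
# with alpha = 0.9 weakens that belief and increases the ignorance mass.
m = {frozenset({"occupied"}): 0.8, frozenset({"occupied", "free"}): 0.2}
m1 = discount(m, 0.9)
```

Applied at each time step, this keeps the grid's total mass equal to 1 while gradually "forgetting" stale evidence about dynamic objects.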
Cao, Huu Quyet. „Policy-based usage control for trustworthy data sharing in smart cities“. Thesis, Evry, Institut national des télécommunications, 2017. http://www.theses.fr/2017TELE0010/document.
Der volle Inhalt der Quelle In smart cities, Information and Communication Technologies, in particular Internet of Things (IoT) technologies, are integrated into traditional city services, for example waste management, air pollution monitoring and parking, to improve quality while reducing the costs of these services. IoT data in this context are generated by different actors, such as service providers, developers and municipal authorities. These data should be shared among applications or services. In the traditional scenario, however, there is no sharing of IoT data between them: each actor consumes data from sensors deployed on its own behalf, and only the network infrastructure may be shared. In order to encourage IoT data sharing, we need to establish confidence between the actors. Exercising control over the usage of data by other actors is critical in building trust; the actors should therefore be able to control how their data are going to be used. This major issue, namely usage control, has not yet been addressed in the IoT. In this thesis, we take into account obligations defined by the actors for their data: (i) abstraction of certain information, (ii) spatial and temporal granularity, (iii) classification of actors and purposes, and (iv) monetization of data. For example, requirements of data usage in intelligent parking applications are: (i) data owners have full access to all the details, (ii) municipal authorities can access the average occupancy of parking places per street on an hourly basis, (iii) commercial service providers can access only statistical data over a zone on a weekly basis, and (iv) monetization of data can be based on subscription types or user roles. The thesis contributions include: (i) a policy-based Data Usage Control Model (DUPO) that responds to the obligations defined by actors over their data. 
(ii) A Trustworthy Data Sharing Platform as a Service that allows transparency and traceability of data usage through open APIs based on DUPO and semantic technologies. (iii) A visualization tool prototype that enables actors to control how their data will be used. (iv) An evaluation of the performance and impact of our solution. The results show that the added trust layer does not affect the performance of the system. Mistrust might hamper public acceptance of IoT data sharing in smart cities; our solution is key to establishing trust between data owners and consumers by taking the obligations of the data owners into account. It is useful for data operators who would like to provide an open data platform with efficient enablers for partners, data-based services for clients, and the ability to attract partners to share data on their platforms
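The granularity obligations listed in this abstract (owner sees sensor-level detail, municipality street/hour aggregates, commercial providers zone/week statistics) can be pictured as a simple policy lookup. The role names, granularity ladders and policy table below are illustrative assumptions for the sketch, not the DUPO model itself:

```python
# Illustrative policy-based usage check; roles, granularity levels and
# the policy table are assumptions made for this sketch, not DUPO.
POLICIES = {
    # role -> finest data granularity that role may access
    "owner": {"spatial": "sensor", "temporal": "minute"},
    "municipality": {"spatial": "street", "temporal": "hour"},
    "commercial": {"spatial": "zone", "temporal": "week"},
}

SPATIAL_ORDER = ["sensor", "street", "zone"]     # fine -> coarse
TEMPORAL_ORDER = ["minute", "hour", "week"]      # fine -> coarse

def allowed(role, spatial, temporal):
    """True if `role` may access data at the requested granularity,
    i.e. if the request is at least as coarse as the role's limit."""
    policy = POLICIES.get(role)
    if policy is None:
        return False
    return (SPATIAL_ORDER.index(spatial) >= SPATIAL_ORDER.index(policy["spatial"])
            and TEMPORAL_ORDER.index(temporal) >= TEMPORAL_ORDER.index(policy["temporal"]))

print(allowed("commercial", "zone", "week"))    # coarse enough -> True
print(allowed("commercial", "street", "hour"))  # too fine-grained -> False
```

A real enforcement point would additionally log each decision, which is what makes the data usage traceable for the owner.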
Afaneh, Ahmad. „GIS – based urban information system for Sustainable and Smart Cities : application to "SunRise – Smart City" demonstrator“. Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10220/document.
Der volle Inhalt der Quelle The thesis concerns the use of Geographic Information Systems (GIS) for the construction of an urban information system for sustainable and smart cities. The work includes both the development of a methodology for the construction of the GIS-based urban information system and its application to the large-scale demonstrator of the Smart and Sustainable City (SunRise Smart City). The thesis is composed of four parts. The first part includes a state of the art on the emergence of the Smart City concept and the achievements in this area. It also presents Geographic Information Systems (GIS) and their use in both environmental and urban areas. The second part presents the application of GIS to the construction of the urban information system of the Scientific Campus of the University of Lille, which is used as a demonstration site for the SunRise Smart City project. The urban information system includes information about the campus buildings as well as the urban networks. The third part presents the use of GIS for the visualization of dynamic data concerning urban networks, collected by smart sensors. It presents the methodology followed for dynamic data visualization as well as the application of this methodology to water consumption data. The last part presents the use of BIM in the SunRise urban information system for the management of buildings. The methodology is first presented, then applied to a building of the campus
Brulin, Damien. „Fusion de données multi-capteurs pour l'habitat intelligent“. Thesis, Orléans, 2010. http://www.theses.fr/2010ORLE2066/document.
Der volle Inhalt der Quelle The smart home concept has been widely developed in recent years in order to propose solutions for two main concerns: optimized energy management in buildings and support for elderly people living at home. The CAPTHOM project, in which this thesis takes place, was developed in this context. To respond to these problems, many sensors of different natures are used to detect human presence and to determine the position and the posture of the person. In fact, no sensor can, alone, provide all the required information, justifying the development of a multi-sensor system and a data fusion method. In this project, the selected sensors are passive infrared sensors (PIR), thermopiles and a video camera. No sensor is carried by the person (non-invasive system). We propose a global intelligent sensor architecture made of four fusion modules allowing respectively to detect human presence, to locate the person in 3D, to determine the posture, and to help make a decision according to the application. The human presence module fuses information from the three sensors: PIR sensors for movement, thermopiles for presence in case of immobility, and the camera to identify the detected entity. The 3D localization of the person is achieved through receding horizon position estimation. This method, called Visual Receding Horizon Estimation (VRHE), formulates the position estimation problem as a nonlinear optimization problem under constraints in the image plane. The fusion module for posture determination is based on fuzzy logic; it ensures posture determination regardless of the person and of the distance from the camera. Finally, the decision module fuses the outputs of the preceding modules and makes it possible to launch alarms (elderly people monitoring) or to command home automation devices (lighting, heating) for the energy management of buildings
Welte, Anthony. „Spatio-temporal data fusion for intelligent vehicle localization“. Thesis, Compiègne, 2020. http://bibliotheque.utc.fr/EXPLOITATION/doc/IFD/2020COMP2572.
Der volle Inhalt der Quelle Localization is an essential basic capability for vehicles to be able to navigate autonomously on the road. This can be achieved through already available sensors and new technologies (lidars, smart cameras). These sensors, combined with highly accurate maps, result in greater accuracy. In this work, the benefits of storing and reusing information in memory (in data buffers) are explored. Localization systems need to perform high-frequency estimation, map matching, calibration and error detection. A framework composed of several processing layers is proposed and studied. A main filtering layer estimates the vehicle pose while other layers address the more complex problems. High-frequency state estimation relies on proprioceptive measurements combined with GNSS observations. Calibration is essential to obtain an accurate pose. By keeping state estimates and observations in a buffer, the observation models of these sensors can be calibrated; this is achieved using smoothed estimates in place of a ground truth. Lidars and smart cameras provide measurements that can be used for localization but raise matching issues with map features. In this work, the matching problem is addressed over a spatio-temporal window, resulting in a more detailed picture of the environment. The state buffer is adjusted using the observations and all possible matches. Although using mapped features for localization makes it possible to reach greater accuracy, this is only true if the map can be trusted. An approach using the post-smoothing residuals has been developed to detect changes and either mitigate or reject the affected features
Valade, Aurelien. „Capteurs intelligents : quelles méthodologies pour la fusion de données embarquées ?“ Thesis, Toulouse, INSA, 2017. http://www.theses.fr/2017ISAT0007/document.
Der volle Inhalt der Quelle The work detailed in this document is the result of a three-year collaborative effort between LAAS-CNRS in Toulouse and MEAS-France / TE Connectivity. The goal is to develop a methodology for designing smart embedded sensors with the ability to estimate physical parameters based on multi-physical data fusion. This strategy tends to integrate sensor technologies, currently dedicated to lab measurements, into low-power embedded systems working in imperfect environments. After exploring model-oriented methods, parameter estimation and Kalman filters, we detail various existing solutions upon which a valid response to multi-physical data fusion problems can be built: the Kalman filter for linear systems, and the extended Kalman filter and the unscented Kalman filter for non-linear systems. We then synthesize a filter for hybrid systems having a linear evolution model and a non-linear measurement model, using the best of both worlds in order to obtain the best complexity/precision ratio. Once the estimation method is selected, we examine computing power and algorithmic complexity in order to find optimizations enabling the use of our system in a low-power environment. We then present the application of the developed methodology to the study case of the UQS sensor, sold by TE Connectivity. This sensor uses near-infrared spectroscopy to determine the urea concentration in a urea/water solution, in order to control the nitrogen-oxide depollution process in diesel engines. After a presentation of the design principles, we detail the model created to represent the system, to simulate its behavior and to combine the measurement data to extract the desired concentration. During this step, we focus on the obstacles of model calibration and on the compensation of deviations due to working conditions or to component aging.
Based on this development, we finally design hybrid models addressing the nominal working cases and the re-calibration of the model during the product's working life. We then present the results obtained on simulated data and on real-world measured data. Finally, we enhance the methodology with tabulated "black box" models, which are easier to calibrate and cheaper to process, and reapply it to a different sensor, a motion capture sensor, in order to assess all possible solutions and limits
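The linear Kalman filter that anchors the hybrid scheme described above reduces to a predict/update pair. A minimal one-dimensional sketch with a constant-state model (a toy illustration with assumed noise variances, not the thesis's UQS sensor model):

```python
def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One predict/update cycle of a 1-D Kalman filter for the
    constant-state model x_k = x_{k-1} + w,  z_k = x_k + v.

    x, p: previous state estimate and its variance
    z:    new measurement
    q, r: process and measurement noise variances (assumed values)
    """
    # Predict: constant-state model, so only the variance grows.
    x_pred, p_pred = x, p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Filtering noisy readings of a roughly constant concentration near 5.0.
x, p = 0.0, 1.0
for z in [5.2, 4.9, 5.1, 5.0, 4.8]:
    x, p = kalman_step(x, p, z)
```

The hybrid filters discussed in the abstract keep this linear predict step but replace the update with an EKF- or UKF-style handling of the non-linear measurement model.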
Matta, Natalie. „Vers une gestion décentralisée des données des réseaux de capteurs dans le contexte des smart grids“. Thesis, Troyes, 2014. http://www.theses.fr/2014TROY0010/document.
Der volle Inhalt der Quelle This thesis focuses on the decentralized management of data collected by wireless sensor networks deployed in a smart grid, i.e. the new generation of electricity networks. It proposes a decentralized architecture based on multi-agent systems for both data and energy management in the smart grid. In particular, our work deals with data management for sensor networks deployed in the distribution subsystem of a smart grid. It aims at answering two key challenges: (1) detection and identification of failures and disturbances requiring swift reporting and appropriate reactions; (2) efficient management of the growing volume of data caused by the proliferation of sensors and other sensing entities such as smart meters. The management of this data can call upon several methods, including the aggregation of data packets, on which we focus in this thesis. To this end, we propose to aggregate (PriBaCC) and/or to correlate (CoDA) the contents of these data packets in a decentralized manner. Data processing is thus done faster, leading to rapid and efficient decision-making concerning energy management. The validation of our contributions by means of simulation has shown that they meet the identified challenges, and has also demonstrated their improvements with respect to other existing approaches, particularly in terms of reducing data volume as well as the transmission delay of high-priority data
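Content-based aggregation of sensor packets, as described here, can be pictured as merging packets that share a key and summarizing their payloads before forwarding. The packet fields and the averaging rule below are illustrative assumptions for the sketch, not the PriBaCC protocol:

```python
from collections import defaultdict

def aggregate(packets):
    """Merge sensor packets that report on the same (feeder, kind),
    averaging their values; packet fields are illustrative assumptions."""
    groups = defaultdict(list)
    for pkt in packets:
        groups[(pkt["feeder"], pkt["kind"])].append(pkt["value"])
    # One summary packet per group, instead of forwarding every reading.
    return [
        {"feeder": f, "kind": k, "value": sum(vs) / len(vs), "count": len(vs)}
        for (f, k), vs in sorted(groups.items())
    ]

pkts = [
    {"feeder": "F1", "kind": "voltage", "value": 230.0},
    {"feeder": "F1", "kind": "voltage", "value": 232.0},
    {"feeder": "F2", "kind": "voltage", "value": 228.0},
]
out = aggregate(pkts)
```

In a decentralized setting each agent would run such a step locally, so that fewer and smaller packets travel up the network, which is the data-volume reduction the abstract refers to.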
Lutfi, Rania. „Indexation intelligente et recherche par le contenu de l'audio“. Nantes, 2003. http://www.theses.fr/2003NANT2028.
Der volle Inhalt der QuelleWacta, Christine. „Vers la "ville neuro-prothétique" du futur : une maquette numérique de ville renseignée comme plateforme d’échange et de croisement d’applications intégrant des données en temps réel et sur un support topographique de référence permettant une approche urbaine holistique qui intègre pleinement les questions socio- culturelles, économiques, politiques et environnementales nécessaires dans une conception urbaine de ville intelligente : l’approche Géo Spatiale appliquée à l’urbain“. Thesis, Université de Paris (2019-....), 2019. https://wo.app.u-paris.fr/cgi-bin/WebObjects/TheseWeb.woa/wa/show?t=3960&f=25139.
Der volle Inhalt der Quelle The question of the urban design of the future is one of the important and critical issues of our society. Global warming, biodiversity at risk, economic, social and cultural transitions, predictions of a significant increase in the urban population, changes in transportation patterns and changes in urban forms, to quote only a few: all these questions are at the heart of current issues and are part of the constraints we must face in the urban design of tomorrow. Faced with such a situation, it seems risky today to continue to think of the city with approaches or design processes based on yesterday's realities. As Albert Einstein puts it, "we cannot solve our problems using the same way of thinking that we had when we created them". Environmental issues (global warming, biodiversity, etc.) are factors of vulnerability in the current city, in such a way that it is generally accepted (ScienceNet) that built environments must now, more than in the past, be designed in a way that is "respectful of the environment". We are encouraged to develop a socially responsible and "environmentally friendly" mentality, an approach that looks beyond immediate and individual interest towards achieving stable, long-term common goals. This is only possible if we use intelligently and fairly all the resources at our disposal: our knowledge, natural resources, socio-economic and geographical assets, as well as technological advances.
For while technology and the digital have become part of citizens' common daily use, the urban design and architectural disciplines seem to have a hard time integrating them completely, in an intelligent and systemic way, as other disciplines such as medicine and aeronautics do today. This work tries to develop a methodology of urban design based on a combination of digital applications, the effort of a collective intelligence, and the ideas, concepts and techniques proposed by a handful of philosophers, historians, psychologists, architects and town planners mentioned above who marked the history of cities. It is therefore from this heterogeneous marriage of techniques and thoughts, augmented by recent geospatial technologies, that this research intends to base its point of view on the study of urban complexity, in order to cope with urban problems in constant evolution
Turmeaux, Teddy. „Contraintes et fouille de données“. Orléans, 2004. http://www.theses.fr/2004ORLE2048.
Der volle Inhalt der QuelleMondo, Mélanie. „Traces numériques et dimensions spatiales des pratiques de la ville touristique“. Thesis, La Rochelle, 2022. http://www.theses.fr/2022LAROS019.
Der volle Inhalt der Quelle This thesis explores the inputs and the impact of digital footprints on the understanding of the spatial dimensions of urban tourist practices. Digital footprints are an emerging field of investigation that promises a better understanding of service stakeholders' expectations (businesses, institutions, academia). From heat maps to dashboards, data is collected, processed, aggregated, smoothed and synthesized into visualizations that could reveal a new tourist space-time. Applying frameworks from tourism geography and critical data studies, we suggest a critical approach to analyze the way these data are used. A review of the existing literature confirms an uptrend in digital footprint usage and monitoring, identifies what is at stake regarding the observation of tourist cities, and points out critical limits. Two complementary approaches are then presented to measure the concrete value of this data regarding the space-time of urban tourist practices. In Biarritz (France), the analysis of a given social media dataset highlights the need for a contextualized analysis of footprints. In La Rochelle (France), a GPS dataset complemented with a series of interviews reveals the potential of elicitation methods to better understand digital footprints and narrate the practice of the tourist city. Eventually, these two approaches confirm our initial hypothesis, i.e. that digital footprints tend to enrich, under specific conditions, the understanding of tourism practices. The outcomes obtained allow us to advocate the relevance of contextualized and qualitative research on digital footprints in geography
Collard, Martine. „Fouille de données, Contributions Méthodologiques et Applicatives“. Habilitation à diriger des recherches, Université Nice Sophia Antipolis, 2003. http://tel.archives-ouvertes.fr/tel-01059407.
Der volle Inhalt der QuelleUgon, Adrien. „Fusion symbolique et données polysomnographiques“. Paris 6, 2013. http://www.theses.fr/2013PA066187.
Der volle Inhalt der Quelle In recent decades, the medical examinations required to diagnose and guide treatment have become more and more complex. It is even current practice to use several examinations in different medical specialties to study a disease through multiple approaches, so as to describe it more deeply. Interpretation is difficult because the data is both heterogeneous and very specific, with skilled domain knowledge required to analyse it. In this context, symbolic fusion appears to be a possible solution. Indeed, it has proved very effective in treating problems involving low or high levels of abstraction of information in order to develop high-level knowledge. This thesis demonstrates the effectiveness of symbolic fusion applied to the treatment of polysomnographic data for the development of an assisted-diagnosis tool for Sleep Apnea Syndrome. Proper diagnosis of this sleep disorder requires a polysomnography. This medical examination consists of the simultaneous recording of various physiological parameters during a night. Visual interpretation is tedious and time consuming, and there is commonly some disagreement between scorers; the use of a reliable support-to-diagnosis tool increases consensus. This thesis develops the stages of the development of such a tool
Gross-Amblard, David. „Approximation dans les bases de données contraintes“. Paris 11, 2000. http://www.theses.fr/2000PA112304.
Der volle Inhalt der QuelleDupont, Xavier. „Programmation par contraintes sur les flux de données“. Caen, 2014. http://www.theses.fr/2014CAEN2016.
Der volle Inhalt der Quelle In this thesis, we investigate the generalisation of constraint programming on finite variables to stream variables. First, the concepts of streams, infinite sequences and infinite words have been extensively studied in the literature, and we propose a state of the art that covers language theory, classical and temporal logics, as well as the numerous formalisms strongly related to them. The comparison with temporal logics is a first step towards the unification of formalisms over streams, and because temporal logics are themselves numerous, their classification allows the extrapolation of our contributions to other contexts. The second goal involves identifying the features of the existing formalisms that lend themselves to the techniques of constraint programming over finite variables. Compared to the expressivity of temporal logics, that of our formalism is more limited. This stems from the fact that constraint programming allows only the conjunction of constraints, and requires encapsulating disjunction into constraint propagators. Nevertheless, our formalism allows a gain in concision and the reuse of the concept of propagator in a temporal setting. The question of the generalisation of these results to more expressive logics is left open
Dematraz, Jessica. „Méthodologies d'extraction des connaissances issues de données hétérogènes pour l'innovation“. Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0716.
Der volle Inhalt der Quelle In the age of Big Data, where information and communication technologies are in full swing, access to information has never been so easy and fast. Paradoxically, strategic information, that is, "useful" information, the information that facilitates decision-making, has never been so rare and difficult to find. Hence the importance of setting up a process of competitive intelligence, and more precisely of information monitoring, in order to effectively exploit the information environment of an organization, a sector or even an entire country. Today, the predominance of information in a professional context no longer needs to be proven. Monitoring issues of all kinds (strategic, competitive, technological, regulatory, etc.) concern entities of all sectors (public or private) and sizes (SMEs, ETIs, large groups) in all fields of activity. Yet there is no single method applicable to everything and for everyone, but a plurality of methods that must coexist to achieve the emergence of knowledge
Leblanc, Brice. „Analyse non supervisée de données issues de Systèmes de Transport Intelligent-Coopératif“. Thesis, Reims, 2020. http://www.theses.fr/2020REIMS014.
Der volle Inhalt der Quelle This thesis takes place in the context of Vehicular Ad-hoc Networks (VANET), and more specifically of Cooperative Intelligent Transport Systems (C-ITS). These systems exchange information to enhance road safety. The purpose of this thesis is to introduce data analysis tools that may provide road operators with information on the usage and state of their infrastructures; this information may help to improve road safety. We identify two cases we want to deal with: driving profile identification and road obstacle detection. For these issues, we propose unsupervised learning approaches: clustering methods for driving profile identification, and concept drift detection for obstacle detection. This thesis introduces three main contributions: a methodology allowing us to transform raw C-ITS data into, first, trajectories and, then, learning datasets; the use of classical clustering methods and points of interest for driving profile identification, with experiments on mobile device data and network log data; and the treatment of network log data from a crowd of vehicles as data streams, used as input to concept drift detection algorithms to recognize road obstacles
Baez, miranda Belen. „Génération de récits à partir de données ambiantes“. Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAM049/document.
Der volle Inhalt der Quelle Stories are a communication tool that allows people to make sense of the world around them. They represent a platform for understanding and sharing one's culture, knowledge and identity. Stories carry a series of real or imaginary events, causing a feeling or a reaction, or even triggering an action. For this reason, they have become a subject of interest for different fields beyond Literature (Education, Marketing, Psychology, etc.) that seek to achieve a particular goal through them (persuade, reflect, learn, etc.). However, stories remain underdeveloped in Computer Science. There are works that focus on their analysis and automatic production, but those algorithms and implementations remain constrained to imitating the creative process behind literary texts from textual sources. Thus, there are no approaches that automatically produce stories in which 1) the source consists of raw material drawn from real life and 2) the content projects a perspective that seeks to convey a particular message. Working with raw data is becoming ever more relevant today as it increases exponentially each day through the use of connected devices. Given the context of Big Data, we present an approach to automatically generate stories from ambient data. The objective of this work is to bring out the lived experience of a person from the data produced during a human activity. Any area that uses such raw data could benefit from this work, for example Education or Health. It is an interdisciplinary effort that includes Natural Language Processing, Narratology, Cognitive Science and Human-Computer Interaction. This approach is based on corpora and models and includes the formalization of what we call the activity récit as well as an adapted generation approach. It consists of four stages: the formalization of the activity récit, corpus constitution, construction of models of the activity and of the récit, and the generation of text.
Each stage has been designed to overcome constraints related to the scientific questions asked, in view of the nature of the objective: manipulation of uncertain and incomplete data, valid abstraction according to the activity, and construction of models from which it is possible to transpose the reality collected through the data into a subjective perspective rendered in natural language. We used the activity récit as a case study, as practitioners use connected devices and need to share their experience. The results obtained are encouraging and open up many prospects for research
Poussevin, Mickael. „Apprentissage de représentation pour des données générées par des utilisateurs“. Thesis, Paris 6, 2015. http://www.theses.fr/2015PA066040/document.
Der volle Inhalt der Quelle In this thesis, we study how representation learning methods can be applied to user-generated data. Our contributions cover three different applications but share a common denominator: the extraction of relevant user representations. Our first application is the item recommendation task, where recommender systems build user and item profiles out of past ratings reflecting user preferences and item characteristics. Nowadays, textual information is often available together with ratings, and we propose to use it to enrich the profiles extracted from the ratings, in the hope of extracting shared opinions and preferences from the textual content. The models we propose also provide another opportunity: predicting the text a user would write on an item. Our second application is sentiment analysis and, in particular, polarity classification. Our idea is that recommender systems can be used for such a task. Recommender systems and traditional polarity classifiers operate on different time scales. We propose two hybridizations of these models: the former has better classification performance, while the latter highlights a vocabulary of surprise in the texts of the reviews. The third and final application we consider is urban mobility. It takes place beyond the frontiers of the Internet, in the physical world. Using authentication logs of subway users, recording the time and station at which users take the subway, we show that it is possible to extract robust temporal profiles
Salehi, Mehrdad. „Developing a Model and a Language to Identify and Specify the Integrity Constraints in Spatial Datacubes“. Doctoral thesis, Université Laval, 2009. http://www.theses.ulaval.ca/2009/26325/26325.pdf.
Der volle Inhalt der Quelle Text in English with abstracts in English and French. Bibliography: f. 185-197. Also published electronically in the Collection Mémoires et thèses électroniques.
Azlal, Ayoub. „Déploiement d'une stratégie Smart City à l'échelle de la ville : application à la ville de Saint-Quentin“. Thesis, Lille 1, 2020. http://www.theses.fr/2020LIL1I056.
Der volle Inhalt der Quelle This thesis work focuses on the deployment of the Smart City concept at the city level, with an application to the city of Saint-Quentin. The work presented in this manuscript contributes to enriching research in the field of the smart city, with the objective of bridging the knowledge gap between theory and practice. Thus, the main objective is to develop a methodology for drawing up a "Smart City" roadmap as the first phase of the implementation of a Smart City project. This thesis report is divided into five main parts. The first part presents a synthesis of the state of the art of research and practice on the Smart City worldwide. The second part presents the methodology developed to conduct a smart city approach; it constitutes a solid scientific basis for designing and carrying out a global "Smart City" strategy. The third part covers the application of the developed methodology to the city of Saint-Quentin. After a deep analysis of the territory, we carried out a diagnosis with a view to deploying the Smart City concept. This task included identifying the challenges facing the city and areas for improvement, and a series of pilot projects have been proposed. The fourth part describes the real estate assets of the city of Saint-Quentin and analyzes their energy consumption and CO2 emissions. Last but not least, the fifth part reflects the work carried out for the intelligent transformation of municipal buildings in the city of Saint-Quentin. Two main test sites are presented: a concert and show hall, and a nursery and primary school group. This part also presents the methodology for deploying sensors to measure and monitor comfort and safety parameters in real time, as well as the use of these data
Boudellal, Toufik. „Extraction de l'information à partir des flux de données“. Saint-Etienne, 2006. http://www.theses.fr/2006STET4014.
Der volle Inhalt der Quelle
The aim of this work is to resolve a specific data stream mining problem: the adaptive analysis of data streams. The web generation poses new challenges due to the complexity of data structures, for example data issued from virtual galleries or credit card transactions. Generally, such data are continuous in time and their sizes are dynamic. We propose a new algorithm based on measures applied to adaptive data streams; these measures make the interpretation of results possible. We compare our algorithm experimentally to other adapted approaches that are considered fundamental in the field. A modified algorithm that is more useful in applications is also discussed. The thesis closes with a set of suggestions for future work on noisy data streams, and another set of suggestions about necessary future work.
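The size-bounded, continuous analysis of a stream that this abstract describes can be illustrated with a toy sketch. This is not the thesis's algorithm: the class name and the fixed-window policy are illustrative assumptions, showing only how a summary statistic can be maintained as items arrive while memory stays bounded.

```python
from collections import deque

class SlidingWindowStats:
    """Maintain the mean over the last `size` items of a data stream."""

    def __init__(self, size):
        self.size = size
        self.window = deque()
        self.total = 0.0

    def update(self, x):
        """Consume one stream item and return the current windowed mean."""
        self.window.append(x)
        self.total += x
        if len(self.window) > self.size:
            # Evict the oldest item so memory stays bounded.
            self.total -= self.window.popleft()
        return self.total / len(self.window)
```

An adaptive variant would additionally resize or reweight the window when the stream's distribution drifts; the fixed window above is the simplest baseline.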
Dematraz, Jessica. „Méthodologies d'extraction des connaissances issues de données hétérogènes pour l'innovation“. Electronic Thesis or Diss., Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0716.
Der volle Inhalt der Quelle
In the age of Big Data, where information and communication technologies are in full swing, access to information has never been so easy and fast. Paradoxically, strategic information, that is, "useful" information that facilitates decision-making, has never been so rare and difficult to find. Hence the importance of setting up a competitive intelligence process, and more precisely information monitoring, in order to effectively exploit the information environment of an organization, a sector or even an entire country. Today, the predominance of information in a professional context no longer needs to be proven. Monitoring issues of all kinds (strategic, competitive, technological, regulatory, etc.) concern entities of all sectors (public or private) and sizes (SMEs, mid-sized companies, large groups) in all fields of activity. Yet there is no single method applicable to everything and for everyone, but a plurality of methods that must coexist to achieve the emergence of knowledge.
Masri, Ali. „Multi-Network integration for an Intelligent Mobility“. Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLV091/document.
Der volle Inhalt der Quelle
Multimodality requires the integration of heterogeneous transportation data and services to construct a broad view of the transportation network. Many new transportation services (e.g. ridesharing, car-sharing, bike-sharing) are emerging and gaining popularity, since in some cases they provide better trip solutions. However, these services are still isolated from existing multimodal solutions and are proposed as alternative plans without being truly integrated into the suggested plans. The concept of open data is rising and being adopted by many companies, which publish their data sources on the web in order to gain visibility. The goal of this thesis is to use these data to enable multimodality by constructing an extended transportation network that links these new services to existing ones. The challenges we face mainly arise from the integration problem in both transportation services and transportation data.
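The linking step behind the "extended transportation network" idea can be sketched as follows. This is a minimal illustration, not the thesis's method: the coordinates, the 100 m walking threshold and the equirectangular distance approximation are all assumptions made here. Each stop of a new service is connected by a transfer edge to every existing network stop within walking range.

```python
import math

def link_stops(network_stops, service_points, max_metres=100.0):
    """Connect each new-service point (e.g. a bike-share station) to every
    existing network stop within max_metres, producing transfer edges for
    an extended multimodal graph. Positions are (lat, lon) in degrees."""

    def distance(a, b):
        # Equirectangular approximation; adequate at city scale.
        r = 6371000.0  # mean Earth radius in metres
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return r * math.hypot(x, y)

    edges = []
    for sid, s_pos in service_points.items():
        for nid, n_pos in network_stops.items():
            if distance(s_pos, n_pos) <= max_metres:
                edges.append((sid, nid))
    return edges
```

In practice the integration problem is harder than geometry: the same stop may appear under different names and identifiers across open datasets, which is precisely the kind of heterogeneity the thesis addresses.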
Saïs, Fatiha. „Intégration sémantique de données guidée par une ontologie“. Paris 11, 2007. http://www.theses.fr/2007PA112300.
Der volle Inhalt der Quelle
This thesis deals with semantic data integration guided by an ontology. Data integration aims at combining autonomous and heterogeneous data sources. To this end, all the data should be represented according to the same schema and a unified semantics. This thesis is divided into two parts. In the first one, we present an automatic and flexible method for reconciling data with an ontology, considering the case where data are represented in tables. The reconciliation result is represented in the SML format which we have defined. Its originality stems from the fact that it allows representing all the established mappings, but also information that is imperfectly identified. In the second part, we present two methods of reference reconciliation. This problem consists in deciding whether different data descriptions refer to the same real-world entity. We have considered this problem when data are described according to the same schema. The first method, called L2R, is logical: it translates the schema and data semantics into a set of logical rules which allow inferring correct reconciliation and non-reconciliation decisions. The second method, called N2R, is numerical: it translates the schema semantics into an informed similarity measure used in a numerical computation of the similarity of reference pairs. This computation is expressed as a nonlinear equation system solved by an iterative method. Our experiments on real datasets demonstrated the robustness and feasibility of our approaches. The solutions that we bring to the two reconciliation problems are completely automatic and guided only by an ontology.
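The iterative similarity computation attributed to N2R can be illustrated with a much-simplified fixed-point sketch. The propagation rule, the damping factor and the neighbour structure below are assumptions for illustration, not the thesis's actual equation system: each reference pair's score combines its own attribute similarity with the best score among the pairs it depends on, iterated until the scores stabilize.

```python
def iterative_similarity(pairs, neighbours, base_sim,
                         damping=0.5, tol=1e-6, max_iter=100):
    """Fixed-point computation of reference-pair similarity scores.

    pairs:      identifiers of reference pairs
    neighbours: pair -> list of pairs whose similarity influences it
    base_sim:   pair -> attribute-level similarity in [0, 1]
    """
    sim = dict(base_sim)
    for _ in range(max_iter):
        new_sim = {}
        for p in pairs:
            # Best score among dependent pairs (own base score if none).
            propagated = max((sim[q] for q in neighbours.get(p, [])),
                             default=base_sim[p])
            new_sim[p] = (1 - damping) * base_sim[p] + damping * propagated
        converged = max(abs(new_sim[p] - sim[p]) for p in pairs) < tol
        sim = new_sim
        if converged:
            break
    return sim
```

The point of the sketch is the fixed-point structure itself: similarity decisions about one pair of references reinforce decisions about related pairs, which is what distinguishes N2R-style computation from independent pairwise matching.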
Ravi, Mondi. „Confiance et incertitude dans les environnements distribués : application à la gestion des données et de la qualité des sources de données dans les systèmes M2M (Machine to Machine)“. Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM090/document.
Der volle Inhalt der Quelle
Trust and uncertainty are two important aspects of many distributed systems. For example, multiple sources may be available for the same type of information. This poses the problem of selecting the source that produces the most certain information, and of resolving incoherence amongst the available information. Managing trust and uncertainty together forms a complex problem, and through this thesis we develop a solution to it. Trust and uncertainty have an intrinsic relationship: trust is primarily related to sources of information, while uncertainty is a characteristic of the information itself. In the absence of trust and uncertainty measures, a system generally suffers from problems like incoherence and uncertainty. To improve on this, we hypothesize that sources with higher trust levels produce more certain information than those with lower trust values. We then use the trust measures of the information sources to quantify uncertainty in the information, and thereby infer high-level conclusions with greater certainty. A general trend in modern distributed systems is to embed reasoning capabilities in end devices to make them smart and autonomous. We model these end devices as agents of a multi-agent system. Major sources of beliefs for such agents are external information sources with varying trust levels. Moreover, the incoming information and beliefs are associated with a degree of uncertainty. Hence, the agents face the two-fold problem of managing trust in sources and handling uncertainty in the information. We illustrate this with three application domains: (i) the intelligent community, (ii) smart city garbage collection, and (iii) FIWARE, a European project about the Future Internet that motivated the research on this topic.
Our solution to the problem involves modelling the devices (or entities) of these domains as intelligent agents that comprise a trust management module, an inference engine and a belief revision system. We show that this set of components can help agents manage trust in other sources, quantify uncertainty in the information, and then use this to infer more certain high-level conclusions. We finally assess our approach using simulated and real data pertaining to the different application domains.
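The central hypothesis above (higher-trust sources yield more certain conclusions) can be given a toy illustration. The scoring rule below is an assumption of this sketch, not the agent model developed in the thesis: conflicting reports about the same fact are fused by weighting each report's certainty by the trust placed in its source, and the best-supported value wins.

```python
def fuse_reports(reports, trust):
    """Fuse conflicting reports about one fact.

    reports: iterable of (source, value, certainty) with certainty in [0, 1]
    trust:   source -> trust level in [0, 1]
    Returns the value with the highest total trust-weighted certainty.
    """
    scores = {}
    for source, value, certainty in reports:
        # A report counts for as much as its source is trusted.
        scores[value] = scores.get(value, 0.0) + trust[source] * certainty
    best = max(scores, key=scores.get)
    return best, scores[best]
```

Here a highly trusted source reporting with moderate certainty can outweigh a poorly trusted source reporting with high certainty, which is the behaviour the hypothesis predicts.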
Boudane, Abdelhamid. „Fouille de données par contraintes“. Thesis, Artois, 2018. http://www.theses.fr/2018ARTO0403/document.
Der volle Inhalt der Quelle
In this thesis, we address the well-known clustering and association rule mining problems. Our first contribution introduces a new clustering framework where complex objects are described by propositional formulas. First, we extend the two well-known k-means and hierarchical agglomerative clustering techniques to deal with these complex objects. Second, we introduce a new divisive algorithm for clustering objects represented explicitly by sets of models. Finally, we propose a propositional-satisfiability-based encoding of the problem of clustering propositional formulas without the need for an explicit representation of their models. In a second contribution, we propose a new propositional-satisfiability-based approach to mine association rules in a single step. The task is modeled as a propositional formula whose models correspond to the rules to be mined. To highlight the flexibility of our framework, we also address other variants, namely the closed, minimal non-redundant, most general and indirect association rule mining tasks. Experiments on many datasets show that on the majority of the considered association rule mining tasks, our declarative approach achieves better performance than state-of-the-art specialized techniques.
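For orientation, the classic (non-declarative) formulation of the association rule mining task addressed here can be sketched in a few lines. This is the baseline support/confidence enumeration, not the SAT encoding proposed in the thesis, and it is restricted to single-item rules for brevity.

```python
from itertools import combinations

def mine_rules(transactions, min_support, min_confidence):
    """Enumerate association rules x -> y between single items.

    transactions: list of sets of items
    A rule is kept if support({x, y}) >= min_support and
    confidence = support({x, y}) / support({x}) >= min_confidence.
    """
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    rules = []
    for a, b in combinations(items, 2):
        for x, y in ((a, b), (b, a)):
            supp = support({x, y})
            if supp >= min_support:
                conf = supp / support({x})
                if conf >= min_confidence:
                    rules.append((x, y, supp, conf))
    return rules
```

The thesis's contribution is to replace this kind of specialized enumeration with a single propositional formula whose models are exactly the admissible rules, so that variants (closed, minimal non-redundant, etc.) become changes to the encoding rather than new algorithms.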
Duong, Ngoc Son. „Instrumentation de chaussées : la route intelligente qui s’auto-détecte ?“ Thesis, Ecole centrale de Nantes, 2017. http://www.theses.fr/2017ECDN0033.
Der volle Inhalt der Quelle
Nowadays, roads supporting a great number of heavy vehicles usually have a thick, hardly deformable structure. To evaluate pavement performance, deflection measurement devices have been used; however, these measurements are not accurate, and degradation detection is not sufficient to detect the onset of pavement damage. In order to obtain more accurate and continuous mechanical pavement measurements, highway sections were instrumented with specific sensors (temperature probes, strain gauges, geophones). However, analyzing measurements under real traffic generates a great amount of data and considerable measurement variability, so this problem requires an original signal-sorting process. The study of strain measurements allows analyzing real strain variations, which take into account the daily and seasonal variations of environmental parameters. Modelling calculations with different assumptions were then carried out in order to obtain the best prediction of the mechanical pavement behavior. The study of geophone measurements allows measuring pavement deflections, which represent pavement bearing capacity. In addition, different geophones were used to characterize heavy vehicle silhouettes, vehicle speeds and their lateral positions. The thesis work meets the expectation of construction managers to continuously monitor their infrastructures under real traffic.
Salem, Rashed. „Active XML Data Warehouses for Intelligent, On-line Decision Support“. Thesis, Lyon 2, 2012. http://www.theses.fr/2012LYO22002.
Der volle Inhalt der Quelle
A decision support system (DSS) is an information system that supports decision-makers involved in complex decision-making processes. Modern DSSs need to exploit data that are not only numerical or symbolic, but also heterogeneously structured (e.g., text and multimedia data) and coming from various sources (e.g., the Web). We term such data complex data. Data warehouses are classically used as the basis of such DSSs: they help integrate data from a variety of sources to support decision-making. However, the advent of complex data imposes another vision of data warehousing, including data integration, data storage and data analysis. Moreover, today's requirements impose integrating complex data in near real-time rather than with traditional snapshot and batch ETL (Extraction, Transformation and Loading). Real-time and near real-time processing requires a more active ETL process. Data integration tasks must react in an intelligent, i.e., active and autonomous way, to changes encountered in the data integration environment, especially in data sources. In this dissertation, we propose novel solutions for integrating complex data in near real-time, actively and autonomously. We provide a generic metadata-based, service-oriented and event-driven approach for integrating complex data. To address data complexity issues, our approach stores heterogeneous data in a unified format using a metadata-based approach and XML. We also tackle data distribution and interoperability using a service-oriented approach. Moreover, to address near real-time requirements, our approach stores not only integrated data in a unified repository, but also functions to integrate data on-the-fly; a service-oriented approach is likewise applied to track relevant data changes in near real-time. Furthermore, the idea of integrating complex data actively and autonomously revolves around mining logged events of the data integration environment.
To this end, we propose an incremental XML-based algorithm for mining association rules from logged events. Then, we define active rules upon the mined data to reactivate integration tasks. To validate our approach for managing complex data integration, we developed a high-level software framework, namely AX-InCoDa (Active XML-based framework for Integrating Complex Data). AX-InCoDa is implemented as a web application using open-source tools. It exploits web standards (e.g., XML and Web services) and Active XML to handle complexity issues and near real-time requirements. Besides warehousing logged events into an event repository to be mined for self-managing purposes, AX-InCoDa is enriched with active rules. AX-InCoDa's feasibility is illustrated by a healthcare case study. Finally, the performance of our incremental event mining algorithm is experimentally demonstrated.
Vigneron, Vincent. „Programmation par contraintes et découverte de motifs sur données séquentielles“. Thesis, Angers, 2017. http://www.theses.fr/2017ANGE0028/document.
Der volle Inhalt der Quelle
Recent works have shown the relevance of constraint programming to tackle data mining tasks. This thesis follows this approach and addresses motif discovery in sequential data. We focus in particular, in the case of classified sequences, on the search for motifs that best fit each individual class. We propose a language of constraints over matrix domains to model such problems. The language assumes a preprocessing of the data set (e.g., by pre-computing the locations of each character in each sequence) and views a motif as the choice of a sub-matrix (i.e., characters, sequences, and locations). We introduce different matrix constraints (compatibility of locations with the database, class covering, location-based character ordering common to sequences, etc.) and address two NP-complete problems: the search for class-specific totally ordered motifs (e.g., exclusive subsequences) or partially ordered motifs. We provide two CSP models that rely on global constraints to prove exclusivity. We then present a memetic algorithm that uses this CSP model during initialisation and intensification. This hybrid approach proves competitive compared to the pure CSP approach as shown by experiments carried out on protein sequences. Lastly, we investigate data set preprocessing based on patterns rather than characters, in order to reduce the size of the resulting matrix domain. To this end, we present and compare two alternative methods, one based on lattice search, the other on dynamic programming.
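The notion of a class-exclusive motif used above can be made concrete with a naive checker. This is a brute-force sketch for intuition only; the thesis encodes exclusivity via global constraints in a CSP, which this is not.

```python
def is_subsequence(motif, seq):
    """True if motif occurs in seq as a (non-contiguous) subsequence."""
    it = iter(seq)
    # `c in it` advances the iterator, so characters must appear in order.
    return all(c in it for c in motif)

def exclusive(motif, positives, negatives):
    """A motif is exclusive to a class if it occurs in every sequence
    of that class and in no sequence of the other classes."""
    return (all(is_subsequence(motif, s) for s in positives)
            and not any(is_subsequence(motif, s) for s in negatives))
```

Checking a candidate motif is easy; the NP-complete part addressed by the thesis is searching the space of motifs (totally or partially ordered) for one that satisfies such constraints.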
Kuchmann-Beauger, Nicolas. „Question Answering System in a Business Intelligence Context“. Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2013. http://www.theses.fr/2013ECAP0021/document.
Der volle Inhalt der Quelle
The amount and complexity of data generated by information systems keeps increasing in data warehouses. The domain of Business Intelligence (BI) aims at providing methods and tools to better help users retrieve those data. Data sources are distributed over distinct locations and are usually accessible through various applications. Looking for new information can be a tedious task, as business users try to reduce their work overload. To tackle this problem, Enterprise Search has emerged in the last few years as a field that takes into consideration the different corporate data sources as well as sources available to the public (e.g. World Wide Web pages). However, corporate retrieval systems nowadays still suffer from information overload. We believe that such systems would benefit from Natural Language (NL) approaches combined with Q&A techniques. Indeed, NL interfaces allow users to search for new information in their own terms, and thus obtain precise answers instead of turning to a plethora of documents. In this way, users do not have to employ exact keywords or appropriate syntax, and can have faster access to new information. The major challenges in designing such a system are to interface different applications and their underlying query languages on the one hand, and to support users' vocabulary and be easily configurable for new application domains on the other hand. This thesis outlines an end-to-end Q&A framework for corporate use-cases that can be configured in different settings. In traditional BI systems, user preferences are usually not taken into account, nor are their specific contextual situations. State-of-the-art systems in this field, Soda and Safe, do not compute search results on the basis of users' situations. This thesis introduces a more personalized approach, which better suits end-users' situations.
Our main experimentation, in this case, works as a search interface which displays search results on a dashboard that usually takes the form of charts, fact tables, and thumbnails of unstructured documents. Depending on users' initial queries, recommendations for alternatives are also displayed, so as to reduce the response time of the overall system. This process is often seen as a kind of prediction model. Our work contributes the following: first, an architecture, implemented with parallel algorithms, that leverages different data sources, namely structured and unstructured document repositories, through an extensible Q&A framework that can be easily configured for distinct corporate settings; secondly, a constraint-matching-based translation approach, which replaces a pivot language with a conceptual model and leads to more personalized multidimensional queries; thirdly, a set of NL patterns for translating BI questions into structured queries that can be easily configured in specific settings. In addition, we have implemented an iPhone/iPad™ application and an HTML front-end that demonstrate the feasibility of the various approaches developed, through a series of evaluation metrics for the core component and scenario of the Q&A framework. To this end, we elaborate a range of gold-standard queries that can be used as a basis for evaluating retrieval systems in this area, and show that our system behaves similarly to the well-known WolframAlpha™ system, depending on the evaluation settings.
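A minimal sketch of the pattern-based NL-to-query translation mentioned above: the single pattern, the `sales` table and the SQL template are invented for illustration, and the thesis's patterns target multidimensional BI queries rather than raw SQL.

```python
import re

# One illustrative pattern: "total <measure> by <dimension>".
# The table name `sales` is a hypothetical placeholder.
PATTERNS = [
    (re.compile(r"total (\w+) by (\w+)", re.IGNORECASE),
     "SELECT {1}, SUM({0}) FROM sales GROUP BY {1}"),
]

def translate(question):
    """Return a structured query for the first matching NL pattern,
    or None if no pattern applies."""
    for pattern, template in PATTERNS:
        match = pattern.search(question)
        if match:
            return template.format(*match.groups())
    return None
```

A production system would hold many such patterns, map the captured words onto a conceptual model of measures and dimensions, and validate them against the warehouse schema before executing anything.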
Poussevin, Mickael. „Apprentissage de représentation pour des données générées par des utilisateurs“. Electronic Thesis or Diss., Paris 6, 2015. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2015PA066040.pdf.
Der volle Inhalt der Quelle
In this thesis, we study how representation learning methods can be applied to user-generated data. Our contributions cover three different applications but share a common denominator: the extraction of relevant user representations. Our first application is the item recommendation task, where recommender systems build user and item profiles out of past ratings reflecting user preferences and item characteristics. Nowadays, textual information is often available together with ratings, and we propose to use it to enrich the profiles extracted from the ratings, in the hope of extracting shared opinions and preferences from the textual content. The models we propose provide another opportunity: predicting the text a user would write about an item. Our second application is sentiment analysis and, in particular, polarity classification. Our idea is that recommender systems can be used for such a task. Recommender systems and traditional polarity classifiers operate on different time scales. We propose two hybridizations of these models: the former has better classification performance, while the latter highlights a vocabulary of surprise in the texts of the reviews. The third and final application we consider is urban mobility, which takes place beyond the frontiers of the Internet, in the physical world. Using authentication logs of subway users, recording the time and station at which users enter the subway, we show that it is possible to extract robust temporal profiles.
Bezet, Olivier. „Etude de la qualité temporelle des données dans un système distribué pour la fusion multi-capteurs“. Compiègne, 2005. http://www.theses.fr/2005COMP1586.
Der volle Inhalt der Quelle
The research work presented in this thesis concerns multi-sensor data fusion and combination in distributed environments. The objective is to improve data accuracy by taking into account the timestamping error. The target application considered in this thesis consists of a data acquisition and processing system embedded in an instrumented vehicle. Firstly, a method of interval timestamping correspondence in a distributed environment is proposed. In addition to its good synchronization quality, the method has the advantage of limiting the messages exchanged on the bus. In the second stage, and in order to reuse existing algorithms based on exact dates, we propose a method to linearly convert interval dates into punctual dates; the timestamping error is thus reflected in the data imprecision. Different experiments in the advanced driver assistance systems domain have validated this study.
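The interval-to-punctual conversion described above can be sketched with the obvious midpoint rule. The midpoint choice is an assumption of this sketch (the thesis's linear conversion may differ): the punctual date is the interval's centre, and the half-width becomes an explicit imprecision term carried alongside the data.

```python
def interval_to_point(lower, upper):
    """Convert an interval timestamp [lower, upper] into a punctual
    date (the midpoint) plus an imprecision term (the half-width),
    so that downstream algorithms expecting exact dates can be reused
    while the timestamping error remains explicit."""
    point = (lower + upper) / 2.0
    imprecision = (upper - lower) / 2.0
    return point, imprecision
```

A fusion algorithm can then propagate the imprecision term through its computations instead of silently treating every timestamp as exact.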
Kou, Huaizhong. „Génération d'adaptateurs web intelligents à l'aide de techniques de fouilles de texte“. Versailles-St Quentin en Yvelines, 2003. http://www.theses.fr/2003VERS0011.
Der volle Inhalt der Quelle
This thesis defines a framework for semantically integrating Web information, called SEWISE. It can integrate text information from various Web sources belonging to an application domain into a common domain-specific concept ontology. In SEWISE, Web wrappers are built around different Web sites to automatically extract the information of interest from them. Text mining technologies are then used to discover the semantics that Web documents talk about. SEWISE can ease topic-oriented information research over the Web. Three problems related to document categorization are studied. Firstly, we investigate approaches to feature selection and propose two approaches, CBA and IBA, to select features. A mathematical model is proposed to estimate statistical term associations and integrate them within a document similarity model. Finally, the category score calculation algorithms used by k-NN classifiers are studied, and two weighted algorithms, CBW and IBW, for calculating category scores are proposed.
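The category-score step in a k-NN text classifier, which the CBW and IBW algorithms refine, can be sketched in its basic similarity-weighted form. The function name and the simple voting rule are the textbook version, not the thesis's weighted variants.

```python
from collections import defaultdict

def knn_category_scores(neighbours, k):
    """Similarity-weighted k-NN category scoring: each of the k most
    similar documents votes for its category with weight equal to its
    similarity to the query document.

    neighbours: list of (category, similarity) for candidate documents
    Returns a dict mapping each category to its accumulated score.
    """
    scores = defaultdict(float)
    top_k = sorted(neighbours, key=lambda n: n[1], reverse=True)[:k]
    for category, similarity in top_k:
        scores[category] += similarity
    return dict(scores)
```

The predicted category is then the argmax over the returned scores; weighted variants such as those studied in the thesis change how each neighbour's contribution is computed.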