Dissertations / Theses on the topic 'DATA INTEGRATION APPROACH'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'DATA INTEGRATION APPROACH.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.
Fan, Hao. "Investigating a heterogeneous data integration approach for data warehousing." Thesis, Birkbeck (University of London), 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.424299.
Ziegler, Patrick. "The SIRUP approach to personal semantic data integration /." [S.l. : s.n.], 2007. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=016357341&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.
Criollo, Manjarrez Rotman A. "An approach for hydrogeological data management, integration and analysis." Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/666507.
The conceptualisation of a hydrogeological system involves the continuous monitoring and evaluation of a large number of parameters (e.g., hydraulic parameters). Regarding aquifer hydraulic parameters, their quantification is one of the most common problems in hydrogeological studies. It is widely recognised that the methods used to obtain such parameters have limitations and depend on the scale of analysis. For this reason, methods and tools are needed to estimate them within their spatial context and to validate their uncertainty when they are applied at a larger scale of analysis. The data compiled and generated to build a hydrogeological conceptual model are often stored at different scales and in different formats (e.g., maps, spreadsheets or databases). This continuously growing volume of data requires tools and methodologies that improve its compilation and management for subsequent analysis. The contributions of this thesis are: (i) to provide dynamic and scalable methodologies for migrating and integrating multiple data infrastructures (spatial and non-spatial data infrastructures, or the implementation of ICT tools); (ii) to obtain better performance from hydrogeological analysis by taking its spatial context into account; (iii) to provide specific tools for analysing hydrogeological processes and obtaining hydraulic parameters that play a key role in groundwater studies; and (iv) to disseminate free, open-source and easily accessible software that allows hydrogeological data to be standardised, managed, analysed, interpreted and exchanged with a numerical model within a single geographic information system (GIS) platform. A dynamic and scalable methodology has been designed to harmonise and standardise multiple datasets of different origins, or to connect these data infrastructures with ICT tools. This methodology can be implemented in any kind of data migration and integration (DMI) process, to develop spatial and non-spatial data infrastructures, or to implement ICT tools on existing data infrastructures for further analysis, thereby improving data governance. Better performance in obtaining aquifer hydraulic parameters is addressed through the development of a GIS tool. Interpreting pumping tests within their spatial context can reduce the uncertainty of their analysis through precise knowledge of the aquifer geometry and boundaries. This software, designed to compile, manage, visualise and analyse pumping tests in a GIS environment, supports the hydraulic parameterisation of groundwater flow and transport models. To improve the quantification of hydraulic parameters, a compilation, review and analysis of hydraulic conductivity based on grain-size methodologies was carried out. Subsequently, the uncertainty of applying these methods at a larger scale was considered and discussed by comparing the upscaled results with pumping tests. Finally, a free, open-source and easy-to-use GIS tool is presented. This new generation of GIS tools aims to simplify the characterisation of groundwater bodies in order to build rigorous environmental conceptual models. In addition, the tool makes it possible to standardise, manage, analyse and interpret hydrogeological and hydrochemical data. Since its architecture is free and open source, it can be updated and extended according to the custom applications each user requires.
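As an aside on the grain-size methodologies mentioned above: the kind of empirical relation they rely on can be illustrated with Hazen's classic formula, K [cm/s] ≈ C · (d10 [cm])², with C ≈ 100 for clean sands. A minimal Python sketch with illustrative values (not figures from the thesis):

    # Hazen's grain-size relation for hydraulic conductivity; illustrative values only.
    C = 100.0        # empirical coefficient (typical range roughly 40-150 for clean sands)
    d10_cm = 0.02    # hypothetical effective grain size d10 of 0.2 mm, expressed in cm
    K_cm_per_s = C * d10_cm ** 2
    print(f"Estimated K = {K_cm_per_s:.3f} cm/s")   # 0.040 cm/s for this example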
Faulstich, Lukas C. "The HyperView approach to the integration of semistructured data." [S.l. : s.n.], 2000. http://www.diss.fu-berlin.de/2000/33/index.html.
Kittivoravitkul, Sasivimol. "A bi-directional transformation approach for semistructured data integration." Thesis, Imperial College London, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.444093.
GHEZZI, ANNALISA. "A new approach to data integration in Archaeological geophysics." Doctoral thesis, Università degli Studi di Camerino, 2020. http://hdl.handle.net/11581/447386.
Lewis, Richard. "A semantic approach to railway data integration and decision support." Thesis, University of Birmingham, 2015. http://etheses.bham.ac.uk//id/eprint/5959/.
Mireku Kwakye, Michael. "A Practical Approach to Merging Multidimensional Data Models." Thesis, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/20457.
Mukviboonchai, Suvimol. "The mediated data integration (MeDInt) : An approach to the integration of database and legacy systems." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2003. https://ro.ecu.edu.au/theses/1308.
Engström, Henrik. "Selection of maintenance policies for a data warehousing environment : a cost based approach to meeting quality of service requirements." Thesis, University of Exeter, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.269667.
Sannellappanavar, Vijaya Laxmankumar. "DATAWAREHOUSE APPROACH TO DECISION SUPPORT SYSTEM FROM DISTRIBUTED, HETEROGENEOUS SOURCES." University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1153506475.
Sehlstedt, Jonas. "Replacing qpcr non-detects with microarray expression data : An initialized approach towards microarray and qPCR data integration." Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15790.
Shea, Geoffrey Yu Kai Surveying & Spatial Information Systems Faculty of Engineering UNSW. "A Web-Based Approach to the Integration of Diverse Data Sources for GIS." Awarded by:University of New South Wales. Surveying and Spatial Information Systems, 2001. http://handle.unsw.edu.au/1959.4/17855.
Shea, Geoffrey Yu Kai. "A web-based approach to the integration of diverse data sources for GIS /." Sydney : School of Surveying and Spatial Information Systems, University of New South Wales, 2001. http://www.library.unsw.edu.au/~thesis/adt-NUN/public/adt-NUN20011018.170350/index.html.
Jakobsson, Erik. "A new approach to Pairs Trading : Using fundamental data to find optimal portfolios." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-104314.
Proesser, Malte. "A new approach to systems integration in the mechatronic engineering design process of manufacturing systems." Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10492.
Kabir, Sami. "BRB based Deep Learning Approach with Application in Sensor Data Streams." Thesis, Luleå tekniska universitet, Datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-75974.
Full textThis is a Master Thesis Report as part of degree requirement of Erasmus Mundus Joint Master Degree (EMJMD) in Pervasive Computing and Communications for Sustainable Development (PERCCOM).
Coronado, Zamora Marta. "Mapping natural selection through the drosophila melanogaster development following a multiomics data integration approach." Doctoral thesis, Universitat Autònoma de Barcelona, 2018. http://hdl.handle.net/10803/666761.
Charles Darwin's theory of evolution proposes that the adaptations of organisms arise through the process of natural selection. Natural selection leaves a characteristic footprint on the patterns of genetic variation that can be detected by means of statistical methods of genomic analysis. Today, we can infer the action of natural selection in a genome and even quantify what proportion of the genetic variants incorporated into populations are adaptive. The genomic era has led to the paradoxical situation in which much more evidence of selection is available for the genome than for the phenotype of the organism, the primary target of natural selection. The advent of next-generation sequencing (NGS) technologies is providing a vast amount of -omics data, especially increasing the breadth of available developmental transcriptomic series. In contrast to the genome of an organism, the transcriptome is a phenotype that varies during the lifetime and across different body parts. Studying a developmental transcriptome from a population-genomic and spatio-temporal perspective is a promising approach to understanding the genetic and developmental basis of phenotypic change. This thesis is an integrative population genomics and evolutionary biology project following a bioinformatic approach. It is performed in three sequential steps: (i) the comparison of different variations of the McDonald and Kreitman test (MKT), a method to detect recurrent positive selection on coding sequences at the molecular level, using empirical data from a North American population of D. melanogaster and simulated data; (ii) the inference of the genome features correlated with the evolutionary rate of protein-coding genes; and (iii) the integration of patterns of genomic variation with annotations of large sets of spatio-temporal developmental data (evo-dev-omics). As a result of this approach, we have carried out two studies integrating the patterns of genomic diversity with multiomics layers across developmental time and space. In the first study we give a global perspective on how natural selection acts during the whole life cycle of D. melanogaster, assessing whether different regimes of selection act through the developmental stages. In the second study, we draw an exhaustive map of selection acting on the complete embryo anatomy of D. melanogaster. Taken together, our results show that genes expressed in mid- and late-embryonic development stages exhibit the highest sequence conservation and the most complex structure: they are larger, consist of more exons and longer introns, encode a large number of isoforms and, on average, are highly expressed. Selective constraint is pervasive, particularly on the digestive and nervous systems. On the other hand, earlier stages of embryonic development are the most divergent, which seems to be due to the diminished efficiency of natural selection on maternal-effect genes. Additionally, genes expressed in these first stages have on average the shortest introns, probably due to the need for rapid and efficient expression during the short cell cycles. Adaptation is found in the structures that also show evidence of adaptation in the adult, namely the immune and reproductive systems. Finally, genes that are expressed in one or a few anatomical structures are younger and have higher rates of evolution, unlike genes that are expressed in all or almost all structures.
Population genomics is no longer a theoretical science; it has become an interdisciplinary field where bioinformatics, large functional -omics datasets, statistical and evolutionary models, and emerging molecular techniques are all integrated to get a systemic view of the causes and consequences of evolution. The integration of population genomics with other phenotypic multiomics data is the necessary next step to gain a global picture of how adaptation occurs in nature.
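For readers unfamiliar with the McDonald and Kreitman test (MKT) mentioned above, the proportion of adaptive substitutions is commonly estimated from the four MKT counts as alpha = 1 - (Ds · Pn) / (Dn · Ps). A minimal Python sketch with made-up counts (not data from the thesis):

    # Standard MKT-based estimate of the proportion of adaptive substitutions.
    Dn, Ds = 80, 200    # nonsynonymous / synonymous fixed differences (divergence)
    Pn, Ps = 40, 160    # nonsynonymous / synonymous polymorphisms; all counts illustrative
    alpha = 1 - (Ds * Pn) / (Dn * Ps)
    print(f"alpha = {alpha:.2f}")   # 0.38 with these made-up counts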
Samuel, John. "Feeding a data warehouse with data coming from web services. A mediation approach for the DaWeS prototype." Thesis, Clermont-Ferrand 2, 2014. http://www.theses.fr/2014CLF22493/document.
The role of the data warehouse for business analytics cannot be overstated for any enterprise, irrespective of its size. But the growing dependence on web services has resulted in a situation where the enterprise data is managed by multiple autonomous and heterogeneous service providers. We present our approach and its associated prototype DaWeS [Samuel, 2014; Samuel and Rey, 2014; Samuel et al., 2014], a DAta warehouse fed with data coming from WEb Services, to extract, transform and store enterprise data from web services and to build performance indicators from them (stored enterprise data), hiding from the end users the heterogeneity of the numerous underlying web services. Its ETL process is grounded on a mediation approach usually used in data integration. This enables DaWeS (i) to be fully configurable in a declarative manner only (XML, XSLT, SQL, datalog) and (ii) to make part of the warehouse schema dynamic so it can be easily updated. (i) and (ii) allow DaWeS managers to shift from development to administration when they want to connect to new web services or to update the APIs (application programming interfaces) of already connected ones. The aim is to make DaWeS scalable and adaptable so that it can smoothly face the ever-changing and growing web services offer. We point out that this also enables DaWeS to be used with the vast majority of actual web service interfaces, which are defined with basic technologies only (HTTP, REST, XML and JSON) and not with more advanced standards (WSDL, WADL, hRESTS or SAWSDL), since these more advanced standards are not yet widely used to describe real web services. In terms of applications, the aim is to allow a DaWeS administrator to provide to small and medium companies a service to store and query their business data coming from their usage of third-party services, without having to manage their own warehouse. In particular, DaWeS enables the easy design (as SQL queries) of personalized performance indicators. We present in detail this mediation approach for ETL and the architecture of DaWeS. Besides its industrial purpose, working on building DaWeS brought forth further scientific challenges, like the need to optimize the number of web service API operation calls or to handle incomplete information. We propose a bound on the number of calls to web services. This bound is a tool to compare future optimization techniques. We also present a heuristic to handle incomplete information.
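To make the extract-transform-load pattern described above concrete, here is a minimal, generic Python sketch of the kind of pipeline DaWeS automates behind its mediation layer; it is not DaWeS itself, and the endpoint, fields, table and indicator are hypothetical:

    import sqlite3
    import requests

    db = sqlite3.connect("warehouse.db")
    db.execute("CREATE TABLE IF NOT EXISTS ticket (id TEXT PRIMARY KEY, status TEXT, opened_at TEXT)")

    # Extract: call a (hypothetical) REST/JSON web-service API.
    rows = requests.get("https://api.example.com/v1/tickets", timeout=30).json()

    # Transform and load: keep only the fields the warehouse schema needs.
    db.executemany(
        "INSERT OR REPLACE INTO ticket (id, status, opened_at) VALUES (?, ?, ?)",
        [(r["id"], r["status"], r["opened_at"]) for r in rows],
    )
    db.commit()

    # A performance indicator expressed as a plain SQL query over the stored data.
    open_count = db.execute("SELECT COUNT(*) FROM ticket WHERE status = 'open'").fetchone()[0]
    print("open tickets:", open_count)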
Werlang, Ricardo. "Ontology-based approach for standard formats integration in reservoir modeling." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/115196.
The integration of data issued from autonomous and heterogeneous sources is still a significant problem for an important number of applications. In the oil and gas industry, a large amount of data is generated every day from multiple sources such as seismic data, well data, drilling data, transportation data, and marketing data. However, these data are acquired by the application of different techniques and represented in different standards and formats. Thus, these data exist in a structured form in databases, and in semi-structured forms in spreadsheets and documents such as reports and multimedia collections. To deal with this large amount of information, as well as the heterogeneity of the data formats, the information needs to be standardized and integrated across systems, disciplines and organizational boundaries. As a result, this information integration will enable better decision making within collaborations, since high-quality data will be accessible in a timely manner. The petroleum industry depends on the efficient use of these data for the construction of computer models that simplify the geological reality and help to understand it. Such a model, which contains geological objects analyzed by different professionals – geologists, geophysicists and engineers – does not represent the reality itself, but the expert's conceptualization. As a result, the geological objects modeled assume distinct semantic representations, which are complementary in supporting decision-making. To keep the original intended meanings, ontologies were used for making explicit the semantics of the models and for integrating the data and files generated in the various stages of the exploration chain. The major claim of this work is that interoperability among earth models built and manipulated by different professionals and systems can be achieved by making apparent the meaning of the geological objects represented in the models. We show that domain ontologies developed with the support of the theoretical background of foundational ontologies prove to be an adequate tool to clarify the semantics of geological concepts. We exemplify this capability by analyzing the communication standard formats most used in the modeling chain (LAS, WITSML, and RESQML), searching for entities semantically related to the geological concepts described in ontologies for the Geosciences. We show how the notions of identity, rigidity, essentiality and unity applied to ontological concepts lead the modeler to define the geological objects in the model more precisely. By making explicit the identity properties of the modeled objects, the modeler who applies data standards can overcome the ambiguities of geological terminology. In doing so, we clarify which are the relevant objects and properties that can be mapped from one model to another, even when they are represented with different names and formats.
Prieto, Barja Pablo 1986. "NGS applications in genome evolution and adaptation : A reproducible approach to NGS data analysis and integration." Doctoral thesis, Universitat Pompeu Fabra, 2017. http://hdl.handle.net/10803/565601.
In this doctorate I used NGS technologies in different organisms and projects such as ENCODE, comparing the conservation and evolution of long non-coding RNA sequences between mouse and human, using experimental evidence from the genome, transcriptome and chromatin. I followed a similar strategy in other organisms, such as the Mesoamerican common bean and the chicken. In other analyses I used NGS data to study the well-known parasite Leishmania donovani, the causative agent of the disease leishmaniasis. Using NGS data obtained from the genome and transcriptome, I studied the consequences of the genome for long-term adaptation and evolution strategies. This work was carried out while working on tools and strategies to efficiently design and implement the bioinformatic analyses known as workflows, in order to make them easy to use, easily reproducible, accessible and high-performing. This work presents several strategies to avoid the lack of reproducibility and consistency in scientific research, with real applications to the biology of sequence analysis and genome evolution.
Saal, Petronella Elize. "Integrating computers into mathematics education in South African Schools." Diss., University of Pretoria, 2017. http://hdl.handle.net/2263/62904.
Dissertation (MEd)--University of Pretoria, 2017. Science, Mathematics and Technology Education.
Swint, Galen Steen. "Clearwater an extensible, pliable, and customizable approach to code generation /." Diss., Available online, Georgia Institute of Technology, 2006, 2006. http://etd.gatech.edu/theses/available/etd-07082006-012732/.
Calton Pu, Committee Chair; Ling Liu, Committee Member; Karsten Schwan, Committee Member; Olin Shivers, Committee Member; Donald F. Ferguson, Committee Member.
Watanabe, Toyohide, Yuuji Yoshida, and Teruo Fukumura. "Editing model based on the object-oriented approach." IEEE, 1988. http://hdl.handle.net/2237/6930.
Tessier, Sean Michael. "Ontology-based approach to enable feature interoperability between CAD systems." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/41118.
Full textPhalak, Kashmira. "Cognitively-inspired Architecture for Wireless Sensor Networks: A Model Driven Approach for Data Integration in a Traffic Monitoring System." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/33675.
We describe CoSMo, a Cognitively Inspired Service and Model Architecture for situational awareness and monitoring of vehicular traffic in urban transportation systems using a network of wireless sensors. The system architecture combines (i) a cognitively inspired internal representation for analyzing and answering queries concerning the observed system and (ii) a service oriented architecture that facilitates interaction among the individual modules of the internal representation, the observed system and the user. The cognitively inspired model architecture allows effective deductive as well as inductive reasoning by combining simulation based dynamic models for planning with traditional relational databases for knowledge and data representation. On the other hand, the service oriented design of interaction allows one to build flexible, extensible and scalable systems that can be deployed in practical settings. To illustrate our concepts and the novel features of our architecture, we have recently completed a prototype implementation of CoSMo. The prototype illustrates the advantages of our approach over other traditional approaches for designing scalable software for situational awareness in large complex systems. The basic architecture and its prototype implementation are generic and can be applied to monitoring other complex systems. This thesis describes the design of the cognitively-inspired model architecture and its corresponding prototype. Two important contributions include the following:
- The cognitively-inspired architecture: In contrast to earlier work in model driven architecture, CoSMo contains a number of cognitively inspired features, including perception, memory and learning. Apart from illustrating interesting trade-offs between computational cost (e.g. access time, memory) and the correctness available to a user, it also allows user-specified deductive and inductive queries.
- Distributed Data Integration and Fusion: In keeping with the cognitively-inspired model-driven approach, the system allows for efficient data fusion from heterogeneous sensors, simulation-based dynamic models and databases that are continually updated with real-world and simulated data. It is capable of supporting a rich class of queries.
Master of Science
Du, Preez Jacobus Frederick. "The integration of informal minibus-taxi transport services into formal public transport planning and operations - A data driven approach." Master's thesis, University of Cape Town, 2018. http://hdl.handle.net/11427/29885.
SELICATI, VALERIA. "Innovative thermodynamic hybrid model-based and data-driven techniques for real time manufacturing sustainability assessment." Doctoral thesis, Università degli studi della Basilicata, 2022. http://hdl.handle.net/11563/157566.
Full textKoga, Ivo Kenji 1981. "An event-based approach to process environmental data = Um enfoque baseado em eventos para processar dados ambientais." [s.n.], 2013. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275618.
Thesis (doctorate) - Universidade Estadual de Campinas, Instituto de Computação.
Resumo: The abstract in Portuguese can be viewed in the full text of the digital thesis.
Abstract: The complete abstract is available with the full electronic document.
Doctorate in Computer Science (Doutor em Ciência da Computação).
Singh, Shikhar. "An approach to automate the adaptor software generation for tool integration in Application/ Product Lifecycle Management tool chains." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-193919.
Resource Description Framework (RDF) provides data integration functionality independently of the underlying schemas, by treating data as resources and representing them as URIs. The Resource Description Framework model is a data model used for the exchange and communication of data on the Internet, and it can be used to solve other real-world problems such as tool integration and the automation of communication between relational databases. A growing problem in organizations is that a large number of tools store data and communicate with each other very frequently throughout the whole application or product development process, yet no means of communication exists without the intervention of a central entity (usually a server). Accessing data across tools, and linking between them, is resource-intensive. As part of this thesis we develop a piece of software (also referred to as the 'adaptor' in the thesis) that integrates the data without major problems. This eliminates the need to store database schemas in a central repository and makes the process of retrieving data from the tools less resource-intensive. This is done after deciding on a particular strategy for achieving communication between the different tools, which may be a combination of many relevant concepts, through the study of new and upcoming methods that can help in the scenarios described. With the developed software, the data in the relational databases is first converted into RDF form and is then sent and received in RDF format. RDF thus constitutes the key underlying concept of the software. The main goal of the thesis is to automate the development of such a tool (the adaptor). With this automation, the need for someone to manually examine the database and then develop the adaptor according to the database is eliminated. Such a tool can be used to implement similar solutions in other organizations that face similar problems. The outcome of the thesis is thus an algorithm, or an approach, for automating the process of creating the adaptor.
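A minimal sketch of the relational-row-to-RDF step described above, written with the rdflib Python library; the namespace, predicates and example record are hypothetical, not taken from the thesis:

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/plm/")
    g = Graph()

    row = {"id": "REQ-42", "title": "Brake sensor requirement", "status": "approved"}  # hypothetical tuple

    subject = EX[f"requirement/{row['id']}"]      # each record becomes a resource identified by a URI
    g.add((subject, RDF.type, EX.Requirement))
    g.add((subject, EX.title, Literal(row["title"])))
    g.add((subject, EX.status, Literal(row["status"])))

    print(g.serialize(format="turtle"))           # RDF another tool can consume without knowing the source schema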
Zhu, Junxiang. "Integration of Building Information Modelling and Geographic Information System at Data Level Using Semantics and Geometry Conversion Approach Towards Smart Infrastructure Management." Thesis, Curtin University, 2018. http://hdl.handle.net/20.500.11937/74945.
Reda, Roberto. "A Semantic Web approach to ontology-based system: integrating, sharing and analysing IoT health and fitness data." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14645/.
Ruppert, Jan Christian [Verfasser], Michael [Akademischer Betreuer] Bonkowski, Hartmut [Akademischer Betreuer] Arndt, and Stefan [Akademischer Betreuer] Porembski. "Advancing Functional Understanding of Primary Production in Drylands: Insights from a Data-Integration Approach / Jan Christian Ruppert. Gutachter: Michael Bonkowski ; Hartmut Arndt ; Stefan Porembski." Köln : Universitäts- und Stadtbibliothek Köln, 2014. http://d-nb.info/1054445044/34.
Dabous, Feras Taleb Abdel Rahman School of Information Systems Technology & Management UNSW. "A pattern based approach for the architectural design of e-business applications." Awarded by:University of New South Wales. School of Information Systems Technology and Management, 2005. http://handle.unsw.edu.au/1959.4/22047.
Wang, Penghao. "An integrative approach for phylogenetic inference." Thesis, The University of Sydney, 2009. https://hdl.handle.net/2123/28166.
Guan, Xiaowei. "Bioinformatics Approaches to Heterogeneous Omic Data Integration." Case Western Reserve University School of Graduate Studies / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=case1340302883.
Jana, Vimbai Lisa Michelle. "Adopting a harmonised regional approach to customs regulation for the tripartite free trade agreement." Thesis, University of the Western Cape, 2013. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_8861_1380710167.
Xiao, Hui. "Network-based approaches for multi-omic data integration." Thesis, University of Cambridge, 2019. https://www.repository.cam.ac.uk/handle/1810/289716.
Zheng, Feng. "A data exchange approach to integrating autonomous manufacturing databases." Thesis, Loughborough University, 1996. https://dspace.lboro.ac.uk/2134/27164.
Tous, Ruben. "Data integration with XML and semantic web technologies novel approaches in the design of modern data integration systems." Saarbrücken VDM Verlag Dr. Müller, 2006. http://d-nb.info/991303105/04.
He, Zhong. "Integration of dynamic data into reservoir description using streamline approaches." Texas A&M University, 2003. http://hdl.handle.net/1969.1/1188.
PATRIZI, SARA. "Multi-omics approaches to complex diseases in children." Doctoral thesis, Università degli Studi di Trieste, 2022. http://hdl.handle.net/11368/3015193.
"-Omic" technologies can detect the entirety of the molecules in the biological sample of interest, in a non-targeted and non-biased fashion. The integration of multiple types of omics data, known as "multi-omics" or "vertical omics", can provide a better understanding of how the cause of disease leads to its functional consequences, which is particularly valuable in the study of complex diseases, which are caused by the interaction of multiple genetic and regulatory factors with contributions from the environment. In the present work, appropriate multi-omics approaches are applied to two complex conditions that usually first manifest in childhood, have rising incidence and gaps in the knowledge of their molecular pathology: Congenital Lung Malformations and Coeliac Disease (CD). The aims are, respectively, to verify whether cancer-associated genomic variants or DNA methylation features exist in the malformed lung tissue, and to find common alterations in the methylome and the transcriptome of small intestine epithelial cells of children with CD. The methods used in the Congenital Lung Malformations project are whole genome methylation microarrays and whole genome sequencing; for Coeliac Disease, whole genome methylation microarrays and mRNA sequencing. Differentially methylated regions in possibly cancer-related genes were found in each one of the 20 lung malformation samples included. Moreover, 5 malformed samples had at least one somatic missense single nucleotide variant in genes known as lung cancer drivers, and 5 malformed samples had a total of 2 deletions of lung cancer driver tumour suppressor genes and 10 amplifications of lung cancer driver oncogenes. The data showed that congenital lung malformations can have premalignant genetic and epigenetic features that are impossible to predict with clinical information only. In the second project, principal component analysis of the whole genome methylation data showed that CD patients divide into two clusters, one of which overlaps with controls. 174 genes were differentially methylated compared to the controls in both clusters. Principal component analysis of gene expression data (mRNA-Seq) showed a distribution similar to the methylation data, and 442 genes were differentially expressed in both clusters. Six genes, mainly related to interferon response and antigen processing and presentation, were differentially expressed and methylated in both clusters. These results show that the intestinal epithelial cells of individuals with CD are highly variable from a molecular point of view, but they share some fundamental differences that make them able to respond to interferons and to process and present antigens more efficiently than controls. Despite the limitations of the present studies, they have shown that targeted multi-omics approaches can be set up to answer relevant disease-specific questions by investigating many cellular functions at once, often generating new hypotheses and making unexpected discoveries in the process.
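As a generic illustration of the principal component analysis step mentioned above (not the pipeline used in the thesis), here is a minimal Python sketch; the matrix dimensions and values are random placeholders:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    beta = rng.random((24, 5000))          # 24 samples x 5000 CpG probes (placeholder beta values)

    pcs = PCA(n_components=2).fit_transform(beta)
    print(pcs.shape)                        # (24, 2): the coordinates inspected for patient clusters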
Chen, Wen-Chih. "Integrating approaches to efficiency and productivity measurement." Diss., Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/25422.
Full textLaurén, Carl-Fredrik. "Current approaches on how to acquire and integrate external data into Data Warehouses." Thesis, University of Skövde, Department of Computer Science, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-754.
Integration of internal data is often mentioned in the literature as the most demanding task when building or maintaining a DW. There is, however, no literature that outlines an approach for the integration of external data into a DW. The integration of external data has increased during recent years, enabling corporations to understand the opportunities in the market and to better plan for the future success of the corporation. The aim of this work is to provide an exploratory outline of current approaches for acquiring and integrating external data into a DW and to give a brief overview of future trends for external data integration. This aim was researched using an interview study. The results show that how external data should be integrated depends on the corporation's purpose for that data. Additional results show that how to integrate external data also depends on how the data is acquired.
Do, Hong-Hai. "Schema matching and mapping based data integration architecture, approaches and evaluation." Saarbrücken VDM, Müller, 2006. http://deposit.d-nb.de/cgi-bin/dokserv?id=2863983&prov=M&dok_var=1&dok_ext=htm.
Full textFleischman, David R. "A data oriented approach to integrating manufacturing functions in flexible Manufacturing Systems." Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23150.
Full textKettouch, Mohamed Salah. "A new approach for interlinking and integrating semi-structured and linked data." Thesis, Anglia Ruskin University, 2017. http://arro.anglia.ac.uk/702722/.
Full textGadaleta, Emanuela. "A multidisciplinary computational approach to model cancer-omics data : organising, integrating and mining multiple sources of data." Thesis, Queen Mary, University of London, 2015. http://qmro.qmul.ac.uk/xmlui/handle/123456789/8141.
Full textAl, Shekaili Dhahi. "Integrating Linked Data search results using statistical relational learning approaches." Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/integrating-linked-data-search-results-using-statistical-relational-learning-approaches(3f77386b-a38a-4110-8ce1-bda6340e6f0b).html.
Full textMönchgesang, Susann [Verfasser]. "Metabolomics and biochemical omics data - integrative approaches : [kumulative Dissertation] / Susann Mönchgesang." Halle, 2017. http://d-nb.info/1131075994/34.