
Dissertations / Theses on the topic 'Data warehouse'



Consult the top 50 dissertations / theses for your research on the topic 'Data warehouse.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Sarkis, Laura Costa. "Data warehouse." Florianópolis, SC, 2001. http://repositorio.ufsc.br/xmlui/handle/123456789/80047.

Full text
Abstract:
Master's dissertation (mestrado) - Universidade Federal de Santa Catarina, Centro Tecnológico, Graduate Program in Computer Science.
This work describes the basic concepts of the Data Warehouse environment, focusing in particular on the data migration process. Some of the more recent techniques and technologies available on the market for this purpose are presented. Starting from an initial study of Data Warehouse concepts, the work was scoped around the data migration process. To that end, four approaches were studied and a comparative analysis was carried out in an attempt to determine which of them is the most suitable for the process. In a data migration process it is also important to guarantee data quality; accordingly, the work includes the description of an approach that explains how the data quality process is carried out in a Data Warehouse. Some tools available on the market that may support the processes of data migration to the Data Warehouse and of data quality are also mentioned.
APA, Harvard, Vancouver, ISO, and other styles
2

Mathew, Avin D. "Asset management data warehouse data modelling." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/19310/1/Avin_Mathew_Thesis.pdf.

Full text
Abstract:
Data are the lifeblood of an organisation, being employed by virtually all business functions within a firm. Data management, therefore, is a critical process in prolonging the life of a company and determining the success of each of an organisation's business functions. The last decade and a half has seen data warehousing rising in priority within corporate data management as it provides an effective supporting platform for decision support tools. A cross-sectional survey conducted by this research showed that data warehousing is starting to be used within organisations for their engineering asset management; however, the industry uptake is slow and has much room for development and improvement. This conclusion is also evidenced by the lack of systematic scholarly research within asset management data warehousing as compared to data warehousing for other business areas. This research is motivated by the lack of dedicated research into asset management data warehousing and attempts to provide original contributions to the area, focussing on data modelling. Integration is a fundamental characteristic of a data warehouse and facilitates the analysis of data from multiple sources. While several integration models exist for asset management, these only cover select areas of asset management. This research presents a novel conceptual data warehousing data model that integrates the numerous asset management data areas. The comprehensive ethnographic modelling methodology involved a diverse set of inputs (including data model patterns, standards, information system data models, and business process models) that described asset management data. Used as an integrated data source, the conceptual data model was verified by more than 20 experts in asset management and validated against four case studies. A large section of asset management data is stored in a relational format due to the maturity and pervasiveness of relational database management systems. Data warehousing offers the alternative approach of structuring data in a dimensional format, which suggests increased data retrieval speeds in addition to reducing analysis complexity for end users. To investigate the benefits of moving asset management data from a relational to a multidimensional format, this research presents an innovative relational vs. multidimensional model evaluation procedure. To undertake an equitable comparison, the compared multidimensional models are derived from an asset management relational model and, as such, this research presents an original multidimensional modelling derivation methodology for asset management relational models. Multidimensional models were derived from the relational models in the asset management data exchange standard, MIMOSA OSA-EAI. The multidimensional and relational models were compared through a series of queries. It was discovered that multidimensional schemas reduced the data size and subsequently data insertion time, decreased the complexity of query conceptualisation, and improved the query execution performance across a range of query types. To facilitate the quicker uptake of these data warehouse multidimensional models within organisations, an alternative modelling methodology was investigated. This research presents an innovative approach of using a case-based reasoning methodology for data warehouse schema design. Using unique case representation and indexing techniques, the system also uses a business vocabulary repository to augment case searching and adaptation.
The system was validated through a case study in which multidimensional schema design speed and accuracy were measured. It was found that the case-based reasoning system provided a marginal benefit, with greater benefits gained when confronted with more difficult scenarios.
APA, Harvard, Vancouver, ISO, and other styles
3

Mathew, Avin D. "Asset management data warehouse data modelling." Queensland University of Technology, 2008. http://eprints.qut.edu.au/19310/.

Full text
Abstract:
Data are the lifeblood of an organisation, being employed by virtually all business functions within a firm. Data management, therefore, is a critical process in prolonging the life of a company and determining the success of each of an organisation's business functions. The last decade and a half has seen data warehousing rising in priority within corporate data management as it provides an effective supporting platform for decision support tools. A cross-sectional survey conducted by this research showed that data warehousing is starting to be used within organisations for their engineering asset management; however, the industry uptake is slow and has much room for development and improvement. This conclusion is also evidenced by the lack of systematic scholarly research within asset management data warehousing as compared to data warehousing for other business areas. This research is motivated by the lack of dedicated research into asset management data warehousing and attempts to provide original contributions to the area, focussing on data modelling. Integration is a fundamental characteristic of a data warehouse and facilitates the analysis of data from multiple sources. While several integration models exist for asset management, these only cover select areas of asset management. This research presents a novel conceptual data warehousing data model that integrates the numerous asset management data areas. The comprehensive ethnographic modelling methodology involved a diverse set of inputs (including data model patterns, standards, information system data models, and business process models) that described asset management data. Used as an integrated data source, the conceptual data model was verified by more than 20 experts in asset management and validated against four case studies. A large section of asset management data is stored in a relational format due to the maturity and pervasiveness of relational database management systems. Data warehousing offers the alternative approach of structuring data in a dimensional format, which suggests increased data retrieval speeds in addition to reducing analysis complexity for end users. To investigate the benefits of moving asset management data from a relational to a multidimensional format, this research presents an innovative relational vs. multidimensional model evaluation procedure. To undertake an equitable comparison, the compared multidimensional models are derived from an asset management relational model and, as such, this research presents an original multidimensional modelling derivation methodology for asset management relational models. Multidimensional models were derived from the relational models in the asset management data exchange standard, MIMOSA OSA-EAI. The multidimensional and relational models were compared through a series of queries. It was discovered that multidimensional schemas reduced the data size and subsequently data insertion time, decreased the complexity of query conceptualisation, and improved the query execution performance across a range of query types. To facilitate the quicker uptake of these data warehouse multidimensional models within organisations, an alternative modelling methodology was investigated. This research presents an innovative approach of using a case-based reasoning methodology for data warehouse schema design. Using unique case representation and indexing techniques, the system also uses a business vocabulary repository to augment case searching and adaptation.
The system was validated through a case study in which multidimensional schema design speed and accuracy were measured. It was found that the case-based reasoning system provided a marginal benefit, with greater benefits gained when confronted with more difficult scenarios.
APA, Harvard, Vancouver, ISO, and other styles
4

Sharathkumar, Sudhindra. "An Automated Data Warehouse." ScholarWorks@UNO, 2003. http://scholarworks.uno.edu/td/36.

Full text
Abstract:
An increasing number of organizations are implementing data warehouses to strengthen their decision support systems. This comes with the challenges of populating and periodically updating the data warehouse. In this thesis, we present a tool that provides users with features to create a warehouse database and transform structures of the source database into structures for the warehouse database. It is highly interactive, easy to use, and hides the underlying complexity of manual SQL code generation from its users. Attributes from source tables can be mapped into new attributes in the warehouse database tables using aggregate functions. Then, relevant data is automatically transported from the source database to the newly created warehouse. The tool thus integrates warehouse creation, schema mapping and data population into a single general-purpose tool. It has been designed as a component of the framework for an automated data warehouse being developed at the Computer Science Department, University of New Orleans. Users of this framework are the database administrators, who will also be able to synchronize updates of multiple copies of the data warehouse. Warehouse images that need to be updated are taken offline, and applications that need to access the data warehouse can access any of the other image warehouses in the meantime. The Switching Application built into this framework switches between databases in a way that is totally transparent to applications, so that they do not notice the existence of multiple copies of the data warehouse. In effect, even non-technical users can create, populate and update data warehouses with minimal time and effort.
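Purely as an illustration of the kind of SQL generation such a tool hides from its users, the following minimal sketch builds an aggregate-mapping load statement from a declarative attribute mapping. Table, column and function names are invented for the example and are not taken from the thesis.

```python
# Hypothetical sketch: generate an INSERT ... SELECT that maps source attributes
# to warehouse attributes via aggregate functions, as an automated-warehouse tool might.
def build_aggregate_load_sql(source_table, target_table, group_by, mappings):
    """mappings: list of (target_column, aggregate_function, source_column)."""
    select_parts = list(group_by) + [
        f"{func}({src}) AS {tgt}" for tgt, func, src in mappings
    ]
    target_cols = list(group_by) + [tgt for tgt, _, _ in mappings]
    return (
        f"INSERT INTO {target_table} ({', '.join(target_cols)})\n"
        f"SELECT {', '.join(select_parts)}\n"
        f"FROM {source_table}\n"
        f"GROUP BY {', '.join(group_by)};"
    )

# Example usage with invented names:
print(build_aggregate_load_sql(
    source_table="sales_src",
    target_table="dw_sales_fact",
    group_by=["store_id", "sale_date"],
    mappings=[("total_amount", "SUM", "amount"), ("num_items", "COUNT", "item_id")],
))
```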
APA, Harvard, Vancouver, ISO, and other styles
5

Qian, Yi. "Financial aid data warehouse /." Connect to title online, 2008. http://minds.wisconsin.edu/handle/1793/34216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hinrichs, Holger. "Datenqualitätsmanagement in Data-warehouse-Systemen." [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=964461552.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mazzola, Irany Salgado. "Projeto de data warehouse dimensional." Florianópolis, SC, 2002. http://repositorio.ufsc.br/xmlui/handle/123456789/83465.

Full text
Abstract:
Master's dissertation (mestrado) - Universidade Federal de Santa Catarina, Centro Tecnológico, Graduate Program in Computer Science.
The objective of this work is to propose a modelling project for a Dimensional Data Warehouse for a retail company with several areas whose information feeds the information system supported by the DW. A Data Warehouse is a technology that is still taking shape, which structures and formalises an organisation's information and data based on a model that may be: a) relational, that is, based on a relational database model, currently most used in online transactions; or b) dimensional, based on databases whose dimensions provide a more complete and much less complex operational and managerial view of the organisation as a whole as an information system. The modelling project proposed in this work aims to present the various stages involved in building a Data Warehouse, as well as to show the components most commonly used to visualise historical, non-detailed information for online analysis. Using this technology allows data of organisational value spanning periods of more than five years to be archived, supporting a critical assessment of the organisation's behaviour while more effective operational and tactical management strategies are devised.
APA, Harvard, Vancouver, ISO, and other styles
8

Kanna, Rajesh. "Managing XML data in a relational warehouse on query translation, warehouse maintenance, and data staleness /." [Gainesville, Fla.] : University of Florida, 2001. http://etd.fcla.edu/etd/uf/2001/anp4011/Thesis.PDF.

Full text
Abstract:
Thesis (M.S.)--University of Florida, 2001.
Title from first page of PDF file. Document formatted into pages; contains x, 75 p.; also contains graphics. Vita. Includes bibliographical references (p. 71-74).
APA, Harvard, Vancouver, ISO, and other styles
9

Redgert, Rebecca. "Evaluating Data Quality in a Data Warehouse Environment." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208766.

Full text
Abstract:
The amount of data accumulated by organizations has grown significantly during the last couple of years, increasing the importance of data quality. Ensuring data quality for large amounts of data is a complicated task, but crucial to subsequent analysis. This study investigates how to maintain and improve data quality in a data warehouse. A case study of the errors in a data warehouse was conducted at the Swedish company Kaplan, and resulted in guiding principles on how to improve the data quality. The investigation was done by manually comparing data from the source systems to the data integrated in the data warehouse and applying a quality framework based on semiotic theory to identify errors. The three main guiding principles given are (1) to implement a standardized format for the source data, (2) to implement a check prior to integration where the source data are reviewed and corrected if necessary, and (3) to create and implement specific database integrity rules. Further work is encouraged on establishing a guide for the framework describing how best to perform a manual comparison of data, and on quality assurance of source data.
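As a rough illustration of guideline (2), a pre-integration check might look like the following sketch. The field names and rules are invented for the example and are not taken from the thesis.

```python
# Minimal sketch of a pre-integration check: review source rows against simple
# rules and separate the ones that need correction before loading into the DW.
from datetime import datetime

def check_source_rows(rows, required_fields=("customer_id", "amount", "created_at")):
    accepted, rejected = [], []
    for row in rows:
        problems = [f for f in required_fields if row.get(f) in (None, "")]
        try:
            datetime.fromisoformat(str(row.get("created_at", "")))
        except ValueError:
            problems.append("created_at:not-a-date")
        (rejected if problems else accepted).append((row, problems))
    return accepted, rejected

accepted, rejected = check_source_rows([
    {"customer_id": 1, "amount": 9.5, "created_at": "2017-03-01"},
    {"customer_id": None, "amount": 4.0, "created_at": "not a date"},
])
print(len(accepted), "accepted;", len(rejected), "rejected")
```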
APA, Harvard, Vancouver, ISO, and other styles
10

Öhman, Mikael. "a Data-Warehouse Solution for OMS Data Management." Thesis, Umeå universitet, Institutionen för datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-80688.

Full text
Abstract:
A database system for storing and querying data of a dynamic schema has been developed based on the kdb+ database management system and the q programming language, for use in a financial setting of order and execution services. Some basic assumptions are made about mandatory fields of the data to be stored, including that the data are time-series based. A dynamic schema enables an Order-Management System (OMS) to store information that is not suitable or usable when stored in log files or traditional databases. Log files are linear, cannot be queried effectively and are not suitable for the volumes produced by modern OMSs. Traditional databases are typically row-oriented, which does not suit time-series data, and rely on the relational model, which uses statically typed sets to store relations. The created system includes software that is capable of mining the actual schema stored in the database and visualizing it. This enables easy exploratory querying and the production of applications that use the database. A feedhandler optimized for handling high volumes of data has been created. Volumes in finance are steadily growing as the industry continues to adopt computer automation of tasks. Feedhandler performance is important to reduce latency and for cost savings as a result of not having to scale horizontally. A study of the area of algorithmic trading has been performed with a focus on transaction-cost analysis, and fundamental algorithms have been reviewed. A proof-of-concept application has been created that simulates an OMS storing logs on the execution of a Volume Weighted Average Price (VWAP) trading algorithm. The stored logs are then used to improve the performance of the trading algorithm through basic data mining and machine learning techniques. The actual learning algorithm focuses on predicting intraday volume patterns.
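For readers unfamiliar with the benchmark mentioned above, this is a minimal sketch of how a volume weighted average price is computed from trade records. It is plain Python rather than the q language used in the thesis, and the sample trades are invented.

```python
# Minimal VWAP sketch: price of each trade weighted by its volume.
def vwap(trades):
    """trades: iterable of (price, volume) pairs."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume if total_volume else 0.0

# A VWAP execution algorithm tries to make its own average fill price track
# this benchmark by slicing orders according to expected intraday volume.
print(vwap([(100.0, 200), (100.5, 300), (99.8, 500)]))
```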
APA, Harvard, Vancouver, ISO, and other styles
11

Pérez, Martínez Juan Manuel. "Contextualizing a Data Warehouse with Documents." Doctoral thesis, Universitat Jaume I, 2007. http://hdl.handle.net/10803/10482.

Full text
Abstract:
Current data warehouse technology and OLAP techniques allow organisations to analyse the structured data that they gather in their databases. The circumstances surrounding these data are described in documents that are typically rich in text. This information about the context of the data recorded in the warehouse is very valuable, since it allows us to interpret the results obtained in historical analyses. For example, a financial crisis reported in an online economics magazine could explain a drop in sales in a given region. However, it is not possible to exploit this contextual information directly with traditional OLAP tools, mainly because of the unstructured, text-rich nature of the documents that hold it. This thesis presents the contextualized warehouse: a new kind of decision support system that combines data warehouse and information retrieval technologies to integrate an organisation's structured and document information sources and to analyse these data under different contexts.
APA, Harvard, Vancouver, ISO, and other styles
12

Andersson, Ola. "Benchmarking of Data Warehouse Maintenance Policies." Thesis, University of Skövde, Department of Computer Science, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-472.

Full text
Abstract:

Many maintenance policies have been proposed for refreshing a warehouse. The difficulty of selecting an appropriate maintenance policy for a specific scenario with specific source characteristics, user requirements etc. has triggered researchers to develop algorithms and cost models for predicting the cost associated with a policy and a scenario. In this dissertation, we develop a benchmarking tool for testing scenarios and retrieving real-world data that can be compared against algorithms and cost models. The approach was to support a broad set of configurations, including support for the source characteristics proposed in [ENG00], in order to be able to test a diverse set of scenarios.

APA, Harvard, Vancouver, ISO, and other styles
13

Khan, M. Shahan Ali, and Ahmad ElMadi. "Data Warehouse Testing : An Exploratory Study." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4767.

Full text
Abstract:
Context. The use of data warehouses, a specialized class of information systems, by organizations all over the globe has recently experienced a dramatic increase. A Data Warehouse (DW) serves organizations for various important purposes such as reporting and strategic decision making. Maintaining the quality of such systems is a difficult task, as DWs are much more complex than ordinary operational software applications; therefore, conventional methods of software testing cannot be applied to DW systems. Objectives. The objectives of this thesis study were to investigate the current state of the art in DW testing, to explore various DW testing tools and techniques and the challenges in DW testing, and to identify improvement opportunities for the DW testing process. Methods. This study consists of an exploratory and a confirmatory part. In the exploratory part, a Systematic Literature Review (SLR) followed by the Snowball Sampling Technique (SST), a case study at a Swedish government organization, and interviews were conducted. For the SLR, a number of article sources were used, including Compendex, Inspec, IEEE Xplore, ACM Digital Library, Springer Link, Science Direct, Scopus, etc. References in selected studies and citation databases were used for performing backward and forward SST, respectively. 44 primary studies were identified as a result of the SLR and SST. For the case study, interviews with 6 practitioners were conducted. The case study was followed by 9 additional interviews with practitioners from different organizations in Sweden and from other countries. The exploratory phase was followed by a confirmatory phase, where the challenges identified during the exploratory phase were validated by conducting 3 more interviews with industry practitioners. Results. In this study we identified various challenges that are faced by industry practitioners as well as various tools and testing techniques that are used for testing DW systems. 47 challenges and a number of testing tools and techniques were found in the study. The challenges were classified and improvement suggestions were made to address them in order to reduce their impact. Only 8 of the challenges were found to be common to the industry and the literature studies. Conclusions. Most of the identified challenges were related to test data creation and to the need for tools for various purposes of DW testing. The rising trend of DW systems requires a standardized testing approach and tools that can help to save time by automating the testing process. While tools for operational software testing are available commercially as well as from the open source community, there is a lack of such tools for DW testing. It was also found that a number of challenges are related to management activities, such as lack of communication and challenges in DW testing budget estimation. We also identified a need for a comprehensive framework for testing data warehouse systems and for tools that can help to automate the testing tasks. Moreover, it was found that the impact of management factors on the quality of DW systems should be measured.
APA, Harvard, Vancouver, ISO, and other styles
14

Noaman, Amin Yousef. "Distributed data warehouse architecture and design." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0027/NQ51662.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Khandelwal, Nileshkumar. "An aggregate navigator for data warehouse." Ohio : Ohio University, 2000. http://www.ohiolink.edu/etd/view.cgi?ohiou1172255887.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Navrade, Frank. "Strategische Planung mit Data-Warehouse-Systemen." Wiesbaden Gabler, 2007. http://d-nb.info/987832239/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Navrade, Frank. "Strategische Planung mit Data-Warehouse-Systemen /." Wiesbaden : Gabler, 2008. http://www.gbv.de/dms/zbw/560183593.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Bouillet, André. "Strategien zur Erweiterung eines Data Warehouse." [S.l. : s.n.], 2002. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB9866634.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Stock, Steffen. "Modellierung zeitbezogener Daten im data warehouse /." Wiesbaden : Deutscher Universitäts-Verlag ; Gabler, 2001. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=009236518&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Li, Richard D. (Richard Ding) 1978. "Web clickstream data analysis using a dimensional data warehouse." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/86671.

Full text
Abstract:
Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, February 2001.
Includes bibliographical references (leaves 83-84).
by Richard D. Li.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
21

Herden, Olaf. "Eine Entwurfsmethodik für data warehouses." [S.l.] : [s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=964097281.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Hiléia, da Silva Melo Áurea. "Uma Estratégia de Desenvolvimento de Data Warehouse Geográfico com Integração Híbrida Aplicada ao Monitoramento de Queimadas na Amazônia." Universidade Federal de Pernambuco, 2003. https://repositorio.ufpe.br/handle/123456789/5569.

Full text
Abstract:
Decision support systems have crossed the boundaries of commercial business and now reach the most diverse areas of knowledge. Whether in educational, governmental or private institutions, for-profit or not, wherever there is a decision-making process there will be a need for systems of this kind. Within this category the Data Warehouse (DW) stands out, enabling the generation of integrated and historical data and allowing its users to make decisions based on facts rather than speculation; yet, although it has existed for almost twenty years, there is still no formal methodology for its implementation. The same applies to Geographic Data Warehouses (GDW), that is, DW environments that handle and store geo-referenced data. In this context, the present work defines a development strategy for GDW, covering everything from the requirements phase to data loading, following a hybrid integration method and taking as its basis the methodologies of Ralph Kimball and John A. Zachman for traditional DWs. In addition, a case study of the Legal Amazon Fire Monitoring System (Sistema de Monitoramento de Queimadas da Amazônia Legal) is carried out in order to evaluate and validate each stage of the implemented strategy.
APA, Harvard, Vancouver, ISO, and other styles
23

Drudi, Riccardo. "Progettazione di Data Warehouse di dati genomici su piattaforma Hadoop." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/10441/.

Full text
Abstract:
In recent years biology has relied increasingly on computer science to tackle complex analyses involving large amounts of data. Among the biological sciences that involve processing considerable volumes of data is genomics, a branch of molecular biology that studies the structure, content, function and evolution of the genome of living organisms. Data warehouse systems are an information technology well suited to supporting certain kinds of analysis in genomics, because they allow exploratory and dynamic analyses, which are useful when one wants to derive summary information from a large amount of data and to explore different perspectives and levels of detail. This thesis is part of a broader project concerning the design of a data warehouse for genomic data. The analyses carried out led to the discovery of functional dependencies and, consequently, to the definition of a hierarchy in the data. By inserting this hierarchy into a multidimensional model of the genomic data, it will be possible to broaden the range of analyses that can be performed on the data warehouse, introducing additional information about patient characteristics. The steps carried out in this thesis were, first of all, the loading and filtering of the data. The core of the work was the implementation of an algorithm for discovering functional dependencies in order to derive a hierarchy from the data. In the final phase, the derived hierarchy was inserted into a pre-existing multidimensional model. The entire work was carried out using Apache Spark and Apache Hadoop.
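As a loose illustration of the functional-dependency check at the heart of this kind of work, here is a naive single-machine sketch, not the Spark implementation described in the thesis; the column names and toy rows are invented.

```python
# Naive functional dependency check: X -> Y holds if every value of X maps
# to exactly one value of Y across all rows.
def holds_fd(rows, x, y):
    seen = {}
    for row in rows:
        key, value = row[x], row[y]
        if key in seen and seen[key] != value:
            return False
        seen[key] = value
    return True

rows = [
    {"sample_id": "s1", "patient_id": "p1", "tissue": "liver"},
    {"sample_id": "s2", "patient_id": "p1", "tissue": "liver"},
    {"sample_id": "s3", "patient_id": "p2", "tissue": "blood"},
]
# patient_id -> tissue holds in this toy data, suggesting a hierarchy level.
print(holds_fd(rows, "patient_id", "tissue"))
```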
APA, Harvard, Vancouver, ISO, and other styles
24

Felden, Carsten. "Personalisierung der Informationsversorgung in Unternehmen /." Wiesbaden : Dt. Univ.-Verl, 2006. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=014950251&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Lester, Margaret C. "Factors in the design and development of a data warehouse for academic data." [Johnson City, Tenn. : East Tennessee State University], 2003. http://etd-submit.etsu.edu/etd/theses/available/etd-0328103-154549/unrestricted/Lester041403f.pdf.

Full text
Abstract:
Thesis (M.S.)--East Tennessee State University, 2003.
Title from electronic submission form. ETSU ETD database URN: etd-0328103-154549. Includes bibliographical references. Also available via Internet at the UMI web site.
APA, Harvard, Vancouver, ISO, and other styles
26

Ahl, Alexander. "Kreditbedömningar och Data Warehouse : En studie om riktlinjer för insamling, transformering och inladdning av kreditbedömningsinformation i Data Warehouse." Thesis, Högskolan i Skövde, Institutionen för kommunikation och information, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-8267.

Full text
Abstract:
The study set out to produce guidelines for how the extraction, transformation and loading (ETL) of credit assessment information should be carried out when introducing a Data Warehouse (DW). A qualitative approach was used, combining a case study and an interview study. The case study was conducted with Asitis AB, a system supplier of finance and ledger-financing (factoring) software, to examine potential business opportunities for a DW containing credit assessment information. The interview study was used to gather valuable information from individuals across all the problem areas concerned, which was then used in the case study. The result comprised both success factors and pitfalls for the ETL process for credit assessment information, which can serve as support for organisations with this need. It was found that the ETL process for credit assessment information is an area with great potential for business opportunities, and that carrying it out requires a high level of competence, domain knowledge, proactivity and legal knowledge.
APA, Harvard, Vancouver, ISO, and other styles
27

Olsson, Marcus. "Data Warehouse : An Outlook of Current Usage of External Data." Thesis, University of Skövde, Department of Computer Science, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-682.

Full text
Abstract:

A data warehouse is a data collection that integrates large amounts of data from several sources, with the aim of supporting the decision-making process in a company. Data can be acquired from internal sources within the organization itself, as well as from external sources outside the organization.

The overall aim of this dissertation is to examine the current usage of external data and its sources for integration into DWs, in order to give users of a DW the best possible foundation for decision-making. In order to investigate this problem, we have conducted an interview study with DW developers.

Based on the interview study, the results show that it is relatively common to integrate external data into DWs. The study also identifies the different types of external data that are integrated, and the external sources from which data are commonly acquired. In addition, opportunities and pitfalls of integrating external data have been highlighted.

APA, Harvard, Vancouver, ISO, and other styles
28

Liu, Xiaodong. "Web-based access to a data warehouse of administrative data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape7/PQDD_0019/MQ48278.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Omelchenko, Arkadiy. "Hierarchische physische Data-Cube-Strukturen in einem mobilen Data-Warehouse." Hamburg Diplomica-Verl, 2007. http://www.diplom.de/katalog/arbeit/10542.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Martin, Bryan. "Collocation of Data in a Multi-temperate Logical Data Warehouse." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1573569703200564.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Thiele, Maik. "Qualitätsgetriebene Datenproduktionssteuerung in Echtzeit-Data-Warehouse-Systemen." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-39639.

Full text
Abstract:
Whereas data warehouse systems used to be employed mostly for data analysis in support of management decision-making, they have since evolved into the central platform for an enterprise's integrated information supply. Above all, this includes embedding the data warehouse in operational processes, which on the one hand require very up-to-date data and on the other demand fast query processing. At the same time, classical data warehouse applications continue to exist and require high-quality, refined data. The users of a data warehouse system therefore have different and partly conflicting requirements regarding data freshness, query latency and data stability. This dissertation develops methods and techniques that address and resolve this conflict. The overall goal was to develop a real-time data warehouse architecture that covers the information supply in its full breadth, from historical to current data. First, a procedure for scheduling continuous update streams was devised; it takes the conflicting requirements of the data warehouse users into account and provably produces optimal schedules. In the next step, scheduling was examined in the context of multi-stage data production processes, analysing in particular the conditions under which scheduling can be applied profitably in such processes. To support the analysis of complex data warehouse processes, a visualisation of how data states evolve across the production processes was proposed, providing a tool with which data production processes can be explored for their optimisation potential. Because the real-time data warehouse is subject to operational data changes, inconsistencies arise in report production; therefore, a decoupled data layer optimised for report production was developed. Furthermore, an aggregation concept was developed to speed up query processing, and special query techniques guarantee the completeness of report queries. Finally, two data warehouse case studies from large enterprises were presented, their specific challenges were analysed, and the concepts developed in this dissertation were examined for their benefit and applicability in these practical scenarios.
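Purely to illustrate the kind of trade-off such a scheduler weighs, the toy rule below scores the two pending actions against a user preference between freshness and latency. It is an invented example, not the dissertation's provably optimal scheduling algorithm.

```python
# Toy trade-off rule: choose between applying the next pending update (improves
# freshness) and answering the next waiting query (improves latency), given a
# user preference 0..1 where 1 means "freshness matters most".
def next_action(pending_updates, waiting_queries, freshness_weight):
    freshness_pressure = freshness_weight * pending_updates
    latency_pressure = (1.0 - freshness_weight) * waiting_queries
    return "apply_update" if freshness_pressure >= latency_pressure else "run_query"

print(next_action(pending_updates=10, waiting_queries=3, freshness_weight=0.7))
print(next_action(pending_updates=2, waiting_queries=8, freshness_weight=0.3))
```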
APA, Harvard, Vancouver, ISO, and other styles
32

Sun, Jiake, and Wenjie Jiang. "Analysis Tool for Warehouse Material Handling Data." Thesis, Högskolan i Halmstad, Sektionen för Informationsvetenskap, Data– och Elektroteknik (IDE), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-15205.

Full text
Abstract:
Effective material handling plays a key role in cutting costs. Well-organized material handling can cut production costs by optimizing product transfer paths, decreasing the damage rate and increasing the utilization of storage space. This report presents the development of an analysis system for StoraEnso Hylte's paper reel database. The system extracts and classifies key points from the database which are related to material handling, such as attributes related to the product (paper reel), forklift truck information and storage cell utilization. The analysis based on paper reels includes the damage rate and the transfer paths of paper reels. A mathematical model is also presented, which tells us that the probability of damage per transport is more important than the number of transports for paper reel handling; the effect of decreasing non-optimal transportation (optimizing the path) is very small.
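To make the point about damage probability concrete, a toy calculation under the usual independence assumption (not the exact model from the report) compares the two levers:

```python
# Toy model: probability a reel is damaged over n transports when each transport
# independently damages it with probability p; roughly 1-(1-p)**n, i.e. about n*p.
def damage_probability(p_per_transport, n_transports):
    return 1.0 - (1.0 - p_per_transport) ** n_transports

baseline = damage_probability(0.02, 5)
fewer_moves = damage_probability(0.02, 4)      # shorten the path by one move
safer_moves = damage_probability(0.016, 5)     # 20% lower damage rate per move
print(round(baseline, 4), round(fewer_moves, 4), round(safer_moves, 4))
# Cutting p by 20% helps about as much as removing 20% of the transports, and
# there is usually far more room to reduce p than to shorten transfer paths.
```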
APA, Harvard, Vancouver, ISO, and other styles
33

Munir, Wahab. "Optimization of Data Warehouse Design and Architecture." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-37233.

Full text
Abstract:
A huge number of SCANIA trucks and buses are running on the roads. Unlike the trucks and buses of the past, they are hi-tech vehicles, carrying a lot of technical and operational information that pertains to different aspects such as load statistics, driving time, engine speed over time and much more. This information is fed into an analysis system where it is organized to support analytical questions. Over time this system has become overloaded and needs to be optimized. A number of areas have been identified that can be considered for improvement; however, it is not possible to analyze the whole system within the given constraints, so a subset was picked that is considered sufficient for the purpose of the thesis. The system takes a lot of time to load new data, data loading is not incremental, there is a lot of redundancy in the storage structure, and query execution takes a lot of time in some parts of the database. The methods chosen for this thesis include data warehouse design and architecture analysis, a review of end-user queries, and code analysis. A potential solution is presented to reduce the storage space requirements and the maintenance time taken by the databases; this is achieved by reducing the number of databases that are maintained in parallel and contain duplicated data. Some optimizations have been made in the storage structure and design to improve query processing time for end users. An example incremental loading strategy is also implemented to demonstrate the idea in practice, which helps to reduce loading time. Moreover, an investigation has been made into a commercially available data warehouse management system; this part is only theoretical and is mostly based on hardware architecture and how it can contribute to better performance. Based on the analysis, recommendations are made regarding the architecture and design of the data warehouse.
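The incremental-loading idea mentioned above can be sketched as a generic high-water-mark approach; the table and column names below are invented for illustration and do not reflect the actual implementation described in the thesis.

```python
# High-water-mark incremental load sketch: only rows newer than the last
# loaded timestamp are extracted, instead of reloading the full source table.
def incremental_load(source_rows, warehouse_rows, last_loaded_ts):
    new_rows = [r for r in source_rows if r["updated_at"] > last_loaded_ts]
    warehouse_rows.extend(new_rows)
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_loaded_ts)
    return new_watermark

warehouse = []
watermark = "2011-01-01T00:00:00"
source = [
    {"vehicle_id": 1, "engine_hours": 120, "updated_at": "2011-01-02T08:00:00"},
    {"vehicle_id": 2, "engine_hours": 300, "updated_at": "2010-12-30T10:00:00"},
]
watermark = incremental_load(source, warehouse, watermark)
print(len(warehouse), watermark)  # only the newer row is loaded
```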
APA, Harvard, Vancouver, ISO, and other styles
34

Liu, Yi. "Code generator for integrating warehouse data sources." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/MQ62244.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Mathews, Reena. "Simple Strategies to Improve Data Warehouse Performance." NCSU, 2004. http://www.lib.ncsu.edu/theses/available/etd-05172004-213304/.

Full text
Abstract:
Data warehouse management is fast becoming one of the most popular and important topics in industry today. For business executives, it promises significant competitive advantage for their companies, while presenting information system managers with a way to overcome the obstructions in providing business information to managers and other users. The company studied here is facing the problem of inefficient performance of its data warehouse. To find an appropriate solution to this problem, we first try to understand the data warehouse concept and its basic architecture, followed by an in-depth study of the company's data warehouse and the various issues affecting it. We propose and evaluate a set of solutions including classification of suppliers, implementing a corporate commodity classification and coding system, obtaining level three spend details for PCard purchases, etc. The experimental results show considerable improvement in data quality and data warehouse performance. We further support these recommendations by evaluating the return on investment for improved data quality. Lastly, we discuss the future scope and other possible improvement techniques for obtaining better results.
APA, Harvard, Vancouver, ISO, and other styles
36

Totok, Andreas. "Modellierung von OLAP- und Data-Warehouse-Systemen /." Wiesbaden : Dt. Univ.-Verl. [u.a.], 2000. http://www.gbv.de/dms/ilmenau/toc/312031483.PDF.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Heinrich, Uwe. "Data Warehouse für boden- und agrarwissenschaftliche Forschungsdaten." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-211977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Brandão, Filipe Ferreira. "Estágio TAP/Megasys - Data Warehouse & Reporting." Master's thesis, Instituto Superior de Economia e Gestão, 2014. http://hdl.handle.net/10400.5/12024.

Full text
Abstract:
Master's degree in Information Systems Management (Mestrado em Gestão de Sistemas de Informação)
The business activity of an airline, involving several operational areas and sometimes several companies, generates different scopes of information as well as a huge data volume. The analysis and documentation of these data is essential to maximize the presentation of information and optimize the decision-making process. A Business Intelligence plan enables decision makers to interpret, in a quick and efficient way, the reliable information provided by the many business sectors of their company. Megasis/TAP uses MicroStrategy tools, such as reports and dashboards, that allow information users to access and share data and support the actions of decision makers in the TAP Group. This study describes the work developed by the Business Intelligence and CRM team at TAP Portugal/Megasis, through relevant theory on the contents and benefits of a Business Intelligence plan. It also describes the tasks in which the author was involved and the conclusions drawn from this experience.
APA, Harvard, Vancouver, ISO, and other styles
39

Ferreira, Rafael Gastão Coimbra. "Data Warehouse na prática : fundamentos e implantação." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2002. http://hdl.handle.net/10183/25495.

Full text
Abstract:
Although the concept of the Data Warehouse (DW), in its various forms, continues to attract interest, many DW projects are not generating the benefits expected and many are proving to be too expensive to develop and maintain. This work organizes the concepts of DW through a literature review, discussing its real benefit and how to realize this benefit at a cost that is acceptable to the company. In particular, methodologies are analysed to serve as a foundation for a proposed DW design methodology, which is applied to a real case study for Cia Zaffari, taking into account criteria currently found in the development of a data warehouse, a subset of which is treated in the dissertation.
APA, Harvard, Vancouver, ISO, and other styles
40

Schlesinger, Lutz, Wolfgang Lehner, Wolfgang Hümmer, and Andreas Bauer. "Nutzung von Datenbankdiensten in Data-Warehouse-Anwendungen." De Gruyter Oldenbourg, 2003. https://tud.qucosa.de/id/qucosa%3A72851.

Full text
Abstract:
The connection between the applications and the underlying database system is crucial for performing analyses efficiently within a data warehouse system. This paper classifies and discusses different methods to bring data warehouse applications logically close to the underlying database system, so that the computation of complex OLAP scenarios may be performed within the database system rather than outside in the application. Four different categories, ranging from language extensions (SQL) and an application-specific new query language (MDX) to the use of specific object models (JOLAP) and XML-based Web Services (XCube), are discussed individually and compared in detail.
APA, Harvard, Vancouver, ISO, and other styles
41

Bulusu, Prakash. "DIVA-Data warehouse Interface for Visual Analysis." [Gainesville, Fla.] : University of Florida, 2003. http://purl.fcla.edu/fcla/etd/UFE0000655.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Wilmes, Caroline. "Der strategische Einsatz eines Data Warehouse-Systems." Aachen Shaker, 2009. http://d-nb.info/999600419/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Issa, Carla Mounir. "Data warehouse applications in modern day business." CSUSB ScholarWorks, 2002. https://scholarworks.lib.csusb.edu/etd-project/2148.

Full text
Abstract:
Data warehousing provides organizations with strategic tools to achieve the competitive advantage that organizations are constantly seeking. The use of tools such as data mining, indexing and summaries enables management to retrieve information and perform thorough analysis, planning and forecasting to meet the changes in the market environment. In addition, the data warehouse provides security measures that, if properly implemented and planned, help organizations ensure that their data quality and validity remain intact.
APA, Harvard, Vancouver, ISO, and other styles
44

Gehrke, Christian. "Informationsagenten im Data Warehousing /." Heidelberg : Physica-Verlag, 2000. http://aleph.unisg.ch/hsgscan/hm00015380.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Jürgens, Marcus. "Index structures for data warehouses." [S.l. : s.n.], 1999. http://www.diss.fu-berlin.de/2000/93/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Jürgens, Marcus. "Index structures for data warehouses." Berlin ; Heidelberg : Springer, 2000. http://deposit.ddb.de/cgi-bin/dokserv?idn=96554155X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Rifaie, Mohammad. "Strategy and methodology for enterprise data warehouse development : integrating data mining and social networking techniques for identifying different communities within the data warehouse." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4416.

Full text
Abstract:
Data warehouse technology has been successfully integrated into the information infrastructure of major organizations as a potential solution for eliminating redundancy and providing comprehensive data integration. Realizing the importance of a data warehouse as the main data repository within an organization, this dissertation addresses different aspects related to data warehouse architecture and performance issues. Many data warehouse architectures have been presented by industry analysts and research organizations. These architectures vary from independent, physical business-unit-centric data marts to the centralised two-tier hub-and-spoke data warehouse. The operational data store is a third tier which was offered later to address the business requirements for intra-day data loading. While the industry-available architectures are all valid, I found them to be suboptimal in efficiency (cost) and effectiveness (productivity). In this dissertation, I am advocating a new architecture (the Hybrid Architecture) which encompasses the industry-advocated architectures. The hybrid architecture demands the acquisition, loading and consolidation of enterprise atomic and detailed data into a single integrated enterprise data store (the Enterprise Data Warehouse), where business-unit-centric Data Marts and Operational Data Stores (ODS) are built in the same instance of the Enterprise Data Warehouse. For the purpose of highlighting the role of data warehouses for different applications, we describe an effort to develop a data warehouse for a geographical information system (GIS). We further study the importance of data practices, quality and governance for financial institutions by commenting on the RBC Financial Group case. The development and deployment of the Enterprise Data Warehouse based on the Hybrid Architecture spawned its own issues and challenges. Organic data growth and business requirements to load additional new data will significantly increase the amount of stored data; consequently, the number of users will increase significantly. Enterprise data warehouse obesity, performance degradation and navigation difficulties are chief amongst the issues and challenges. Association rule mining and social networks have been adopted in this thesis to address the above-mentioned issues and challenges. We describe an approach that uses frequent pattern mining and social network techniques to discover different communities within the data warehouse. These communities include sets of tables frequently accessed together, sets of tables retrieved together most of the time, and sets of attributes that mostly appear together in the queries. We concentrate on tables in the discussion; however, the model is general enough to discover other communities. We first build a frequent pattern mining model by considering each query as a transaction and the tables as items. Then, we mine closed frequent itemsets of tables; these itemsets include tables that are mostly accessed together and hence should be treated as one unit in storage and retrieval for better overall performance. We utilize social network construction and analysis to find maximum-sized sets of related tables; this is a more robust approach as opposed to a union of overlapping itemsets. We derive the Jaccard distance between the closed itemsets and construct the social network of tables by adding links that represent distance above a given threshold.
The constructed network is analyzed to discover communities of tables that are mostly accessed together. The reported test results are promising and demonstrate the applicability and effectiveness of the developed approach.
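A rough sketch of the table-community idea described above is given below. It is a simplified single-machine version using Jaccard similarity between tables' query sets, with invented query logs; it is not the thesis implementation.

```python
# Sketch: treat each query as the set of tables it touches, compute Jaccard
# similarity between tables' query sets, and link tables whose similarity
# exceeds a threshold; connected groups are candidate "communities" of tables.
from itertools import combinations

queries = [
    {"sales", "customer", "date_dim"},
    {"sales", "customer"},
    {"inventory", "warehouse_dim"},
    {"sales", "date_dim"},
]

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

# For each table, the set of query ids in which it appears.
appears_in = {}
for qid, tables in enumerate(queries):
    for t in tables:
        appears_in.setdefault(t, set()).add(qid)

threshold = 0.5
links = [
    (t1, t2)
    for t1, t2 in combinations(sorted(appears_in), 2)
    if jaccard(appears_in[t1], appears_in[t2]) >= threshold
]
print(links)  # pairs of tables that are mostly accessed together
```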
APA, Harvard, Vancouver, ISO, and other styles
48

Grillo, Aderibigbe. "Developing a data quality scorecard that measures data quality in a data warehouse." Thesis, Brunel University, 2018. http://bura.brunel.ac.uk/handle/2438/17137.

Full text
Abstract:
The main purpose of this thesis is to develop a data quality scorecard (DQS) that aligns the data quality needs of the data warehouse stakeholder group with selected data quality dimensions. To comprehend the research domain, a general and systematic literature review (SLR) was carried out, after which the research scope was established. Using Design Science Research (DSR) as the methodology to structure the research, three iterations were carried out to achieve the research aim highlighted in this thesis. In the first iteration, following the DSR paradigm, the artefact was built from the results of the general and systematic literature review conducted, and a data quality scorecard (DQS) was conceptualised. The results of the SLR and the recommendations for designing an effective scorecard provided the input for the development of the DQS. Using a System Usability Scale (SUS) to validate the usability of the DQS, the results of the first iteration suggest that the DW stakeholders found the DQS useful. The second iteration further evaluated the DQS through a run-through in the FMCG domain followed by semi-structured interviews. The thematic analysis of the semi-structured interviews demonstrated that the stakeholder participants found the DQS to be transparent, a useful additional reporting tool, well integrated, easy to use and consistent, and that it increases confidence in the data. However, the timeliness data dimension was found to be redundant, necessitating a modification to the DQS. The third iteration was conducted with similar steps to the second iteration but with the modified DQS, in the oil and gas domain. The results from the third iteration suggest that the DQS is a useful tool that is easy to use on a daily basis. The research contributes to theory by demonstrating a novel approach to DQS design. This was achieved by ensuring the design of the DQS aligns with the data quality concern areas of the DW stakeholders and the data quality dimensions. Further, this research lays a good foundation for the future by establishing a DQS model that can be used as a base for further development.
APA, Harvard, Vancouver, ISO, and other styles
49

Goeken, Matthias. "Entwicklung von Data-Warehouse-Systemen : Anforderungsmanagement, Modellierung, Implementierung /." Wiesbaden : Dt. Univ.-Verl, 2006. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=015478852&line_number=0002&func_code=DB_RECORDS&service_type=MEDIA.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Matias, Francisco Luís Marinho do Rosário. "Metodologias de qualidade e testes de data warehouses - sua importância para a flexibilização e optimização do processo de data warehouse." Master's thesis, Universidade de Évora, 2008. http://hdl.handle.net/10174/18306.

Full text
Abstract:
Getting information that allows decision making based on a voluminous data history becomes difficult when large companies have an enormous amount of data scattered across different systems. Data Warehouse systems arose from the need to transform and organize these data in order to generate useful strategic information. With the exponential growth of technological progress, the life cycle of product development becomes shorter, forcing companies to take strategic and operational decisions in short intervals of time. Thus, the competitiveness of an organization is determined by its ability to make accurate decisions that ensure the success of its business strategy. These decisions should be based on up-to-date, complete and high-quality information. The application of quality and testing methodologies is essential to the maintenance and optimization of the Data Warehouse process. Investing in these methodologies is investing in the quality of information.
APA, Harvard, Vancouver, ISO, and other styles
