Dissertations / Theses on the topic 'Data system design'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Data system design.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Pullokkaran, Laijo John. "Analysis of data virtualization & enterprise data standardization in business intelligence." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/90703.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 59).
Business Intelligence is an essential tool used by enterprises for strategic, tactical and operational decision making. Business Intelligence most often needs to correlate data from disparate data sources to derive insights. Unifying data from disparate data sources and providing a unified view of the data is generally known as data integration. Traditionally, enterprises employed ETL and data warehouses for data integration. However, in the last few years a technology known as "Data Virtualization" has found some acceptance as an alternative data integration solution. "Data Virtualization" is a form of federated database, termed a composite database by McLeod and Heimbigner in 1985. Until a few years ago, Data Virtualization was not considered an alternative to ETL but was rather thought of as a technology for niche integration challenges. In this paper we hypothesize that for many BI applications "data virtualization" is a more cost-effective data integration strategy. We analyze the system architecture of "Data Warehouse" and "Data Virtualization" solutions. We further employ a System Dynamics model to compare a few key metrics, such as "Time to Market" and "Cost", of "Data Warehouse" and "Data Virtualization" solutions. We also look at the impact of "Enterprise Data Standardization" on data integration.
by Laijo John Pullokkaran.
S.M. in Engineering and Management
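
To make the contrast in the abstract above concrete, here is a minimal sketch (ours, not the thesis's) of the data virtualization idea: a federated view that correlates two disparate sources at query time instead of copying them into a warehouse first. The CRM/ERP tables and values are invented for illustration.

    import sqlite3

    crm = sqlite3.connect(":memory:")  # hypothetical CRM source
    crm.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
    crm.execute("INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC')")

    erp_revenue = {1: 250.0, 2: 975.0}  # hypothetical second source (ERP)

    def virtual_view():
        # Correlate the two sources on demand; nothing is staged or copied,
        # which is the essential difference from an ETL pipeline.
        for cust_id, region in crm.execute("SELECT id, region FROM customers"):
            yield {"id": cust_id, "region": region,
                   "revenue": erp_revenue.get(cust_id, 0.0)}

    for row in virtual_view():
        print(row)

An ETL approach would instead materialize the joined rows into a warehouse table on a schedule; the cost and time-to-market trade-offs between the two are what the thesis models.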
2

Brown, Judith Spaulding. "Tooling data collection system professional project /." [Denver, Colo.] : Regis University, 2006. http://165.236.235.140/lib/jbrownparti2006.pdf.

3

洪宜偉 and Edward Hung. "Data cube system design: an optimization problem." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31222730.

4

Schanzenberger, Anja. "System design for periodic data production management." Thesis, Middlesex University, 2006. http://eprints.mdx.ac.uk/10697/.

Abstract:
This research project introduces a new type of information system, the periodic data production management system, and proposes several innovative system design concepts for this application area. Periodic data production systems are common in the information industry. These systems process large quantities of data in order to produce statistical reports in predefined intervals. The workflow of such a system is typically distributed world-wide and consists of several semi-computerized production steps which transform data packages. For example, market research companies apply these systems in order to sell marketing information over specified timelines. A lack of concepts for IT-aided management in this area has been identified. This thesis clearly defines the complex requirements of periodic data production management systems. It is shown that these systems can be defined as IT support for planning, monitoring and controlling periodic data production processes. Their significant advantage is that the information industry will be enabled to increase production performance, and to ease (and speed up) the identification of the production progress as well as the achievable optimisation potential in order to control rationalisation goals. In addition, this thesis provides solutions for the generic problem of how to introduce such a management system on top of an unchangeable periodic data production system. Two promising system designs for periodic data production management are derived, analysed and compared in order to gain knowledge about appropriate concepts for this application area. Production planning systems are the metaphor model used for the so-called closely coupled approach; the metaphor model for the loosely coupled approach is project management. The latter approach is prototyped as an application in the market research industry and used as a case study. Evaluation results are real-world experiences which demonstrate the extraordinary efficiency of systems based on the loosely coupled approach. Of particular note is a scenario-based evaluation that accurately demonstrates the many improvements achievable with this approach. The main results are that production planning and process quality can be vitally improved. Finally, among other propositions, it is suggested that future work concentrate on the development of product lines for periodic data production management systems in order to increase their reuse.
5

Hung, Edward. "Data cube system design : an optimization problem /." Hong Kong : University of Hong Kong, 2000. http://sunzi.lib.hku.hk/hkuto/record.jsp?B21852340.

6

Horan, Stephen. "Using Labview to Design a Payload Control System." International Foundation for Telemetering, 2008. http://hdl.handle.net/10150/606179.

Abstract:
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California
As part of a project to develop small satellites, we have developed a combined ground station and flight computer control software package using LabVIEW. These computer systems are used to acquire data from sensors, control communications links, provide automatic data acquisition capabilities, and provide a user interface. In this paper, we will look at the state machines that describe both sets of software, the challenges for the flight computer development given the PC/104 format, and show how the final product was deployed.
7

Lam, Lawrence G. "Digital Health-Data platforms : biometric data aggregation and their potential impact to centralize Digital Health-Data." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106235.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, School of Engineering, System Design and Management Program, Engineering and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 81).
Digital Health-Data is being collected at unprecedented rates today as biometric micro sensors continue to diffuse into our lives in the form of smart devices, wearables, and even clothing. From this data, we hope to learn more about preventative health so that we can spend less money on the doctor. To help users aggregate this perpetual growth of biometric "big" data, Apple HealthKit, Google Fit, and Samsung SAMI were each created with the hope of becoming the dominant design platform for Digital Health-Data. The research for this paper consists of citations from technology strategy literature and relevant journalism articles covering recent and past developments in the wearables market and the digitization movement of electronic health records (EHR) and protected health information (PHI), along with their rules and regulations. These citations contribute to my hypothesis, and the analysis attempts to support my recommendations for Apple, Google, and Samsung. The ending chapters encompass discussions of network effects and the costs associated with multi-homing user data across multiple platforms, ending with my conclusion based on my hypothesis.
by Lawrence G. Lam.
S.M. in Engineering and Management
8

Chung, Kristie (Kristie J. ). "Applying systems thinking to healthcare data cybersecurity." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/105307.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 85-90).
Since the HITECH Act of 2009, adoption of Electronic Health Record (EHR) systems in US healthcare organizations has increased significantly. Along with the rapid increase in usage of EHR, cybercrimes are on the rise as well. Two recent cybercrime cases from early 2015, the Anthem and Premera breaches, are examples of the alarming increase of cybercrimes in this domain. Although modern Information Technology (IT) systems have evolved to become very complex and dynamic, cybersecurity strategies have remained static. Cyber attackers are now adopting more adaptive, sophisticated tactics, yet the cybersecurity counter-tactics have proven to be inadequate and ineffective. The objective of this thesis is to analyze the recent Anthem security breach to assess the vulnerabilities of Anthem's data systems, using current cybersecurity frameworks and guidelines and the Systems-Theoretic Accident Model and Process (STAMP) method. The STAMP analysis revealed that Anthem's cybersecurity strategy needs to be reassessed and redesigned from a systems perspective using a holistic approach. Unless our society and government understand cybersecurity from a sociotechnical perspective, we will never be equipped to protect valuable information and will always lose this battle.
by Kristie Chung.
S.M. in Engineering and Management
9

Waltmire, Michelle Klaassen. "Design of IDOMS : Intelligent Data Object Management System." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9982.

10

Yang, Sun. "Solar Energy Control System Design." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-141489.

Abstract:
This thesis covers the design, simulation and implementation of a solar energy control system for an on-grid energy storage device. The design covers several control methods such as energy balance control, operating mode switching and data exchange. A genetic algorithm was designed to optimize the design of the control system parameters, and the algorithm's simulation and real-time operating system implementation showed comparable results. The control system was implemented to connect a power supply to the grid. The power supply simulated a solar panel and connected to an electrical grid via Energy Hub equipment, and the energy transfer characteristics of the designed control system were tested. The results showed that the selected algorithm matches the target performance criteria.
11

Pellegrino, Gregory S. "Design of a Low-Cost Data Acquisition System for Rotordynamic Data Collection." DigitalCommons@CalPoly, 2019. https://digitalcommons.calpoly.edu/theses/1978.

Abstract:
A data acquisition system (DAQ) was designed based on the use of an STM32 microcontroller. Its purpose is to provide a transparent and low-cost alternative to commercially available DAQs, giving educators a means to teach students about the process through which data are collected as well as the uses of collected data. The DAQ was designed to collect data from rotating machinery spinning at speeds up to 10,000 RPM and send this data to a computer through a USB 2.0 full-speed connection. Multitasking code was written for the DAQ to allow data to be simultaneously collected and transferred over USB. Additionally, a console application was created to control the DAQ and read data, and MATLAB code was written to analyze the data. The DAQ was compared against a custom-assembled National Instruments CompactDAQ system. Using a Bentley-Nevada RK 4 Rotor Kit, data was simultaneously collected using both DAQs. Analysis of this data shows the capabilities and limitations of the low-cost DAQ compared to the custom CompactDAQ.
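
As a rough illustration of the multitasking pattern this abstract describes (collecting and transferring at the same time), here is a toy producer/consumer sketch of ours in Python; the actual firmware is C on an STM32, and the sample source, rate and buffer size below are invented.

    import queue
    import random
    import threading
    import time

    buf = queue.Queue(maxsize=64)  # bounded buffer shared by the two tasks

    def sample_task(n_samples):
        # Producer: stand-in for the ADC path filling the buffer.
        for i in range(n_samples):
            buf.put((i, random.uniform(-1.0, 1.0)))  # (index, reading)
            time.sleep(0.001)  # pretend sample period
        buf.put(None)  # sentinel: acquisition finished

    def usb_task():
        # Consumer: drains the buffer concurrently, as the USB task would.
        sent = 0
        while buf.get() is not None:
            sent += 1
        print(f"transferred {sent} samples")

    threading.Thread(target=sample_task, args=(500,)).start()
    usb_task()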
12

Wilmer, Greg. "OPM model-based integration of multiple data repositories." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100389.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 90).
Data integration is at the heart of a significant portion of current information system implementations. As companies continue to move towards a diverse, growing set of Commercial Off the Shelf (COTS) applications to fulfill their information technology needs, the need to integrate data between them continues to increase. In addition, these diverse application portfolios are becoming more geographically dispersed as more software is provided using the Software as a Service (SaaS) model, and companies continue the pattern of moving their internal data centers to cloud-based computing. As the growth of data integration activities continues, several prominent data integration patterns have emerged, and commercial software packages have been created that cover each of the patterns below: 1. Bulk and/or batch data extraction and delivery (ETL, ELT, etc.); 2. Messaging / message-oriented data movement; 3. Granular, low-latency data capture and propagation (data synchronization). As the data integration landscape within an organization, and between organizations, becomes larger and more complex, opportunities exist to streamline aspects of the data integration process not covered by current toolsets, including: 1. Extensibility by third parties, as many COTS integration toolsets today are difficult, if not impossible, for third parties to extend; 2. Capabilities to handle different types of structured data, from relational to hierarchical to graph models; 3. Enhanced modeling capabilities through the use of data visualization and modeling techniques and tools; 4. Capabilities for automated unit testing of integrations; 5. A unified toolset that covers all three patterns, allowing an enterprise to implement the pattern that best suits business needs for the specific scenario; 6. A Web-based toolset that allows configuration, management and deployment via Web technologies, making application deployment and integration geographically indifferent. While discussing these challenges with a large Fortune 500 client, they expressed the need for an enhanced data integration toolset that would allow them to accomplish such tasks. Given this request, the Object Process Methodology (OPM) and the Opcat toolset were used to begin the design of a data integration toolset that could fulfill these needs. As part of this design process, lessons learned covering both the use of OPM in software design projects and enhancement requests for the Opcat toolset were documented.
by Greg Wilmer.
S.M. in Engineering and Management
13

CHOOBINEH, JOOBIN. "FORM DRIVEN CONCEPTUAL DATA MODELING (DATABASE DESIGN, EXPERT SYSTEMS, CONCEPTUAL)." Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/188043.

Abstract:
A conceptual data schema is constructed from the analysis of the business forms used in an enterprise. In order to perform the analysis, a data model, a forms model, and heuristics to map from the forms model to the data model are developed. The data model we use is an extended version of the Entity-Relationship Model. Extensions include the addition of min-max cardinalities and a generalization hierarchy. By extending the min-max cardinalities to attributes, we capture a number of significant characteristics of the entities in a concise manner. We introduce a hierarchical model of forms. The model specifies various properties of each form field within the form, such as its origin, hierarchical structure, and cardinalities. The inter-connection of the forms is expressed by specifying which form fields flow from one form to another. The Expert Database Design System creates a conceptual schema by incrementally integrating related collections of forms. The rules of the expert system are divided into six groups: (1) Form Selection, (2) Entity Identification, (3) Attribute Attachment, (4) Relationship Identification, (5) Cardinality Identification, and (6) Integrity Constraints. The rules of the first group use knowledge about the form flow to determine the order in which forms are analyzed. The rules in the other groups are used in conjunction with a designer dialogue to identify the entities, relationships, and attributes of a schema that represents the collection of forms.
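
As a toy illustration of the kind of heuristic such rule groups can contain (our invention, not the dissertation's actual rules): treat a form field whose values never repeat across form instances as a candidate entity key.

    forms = [  # hypothetical instances of an 'Order' form: field -> value
        {"order_no": "A-001", "customer": "Acme", "qty": 5},
        {"order_no": "A-002", "customer": "Acme", "qty": 3},
        {"order_no": "A-003", "customer": "Bolt", "qty": 5},
    ]

    def candidate_keys(instances):
        # A field is a key candidate if its values are unique across instances.
        return [f for f in instances[0]
                if len({inst[f] for inst in instances}) == len(instances)]

    print(candidate_keys(forms))  # -> ['order_no']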
14

Sharkey, Jeffrey Allen. "Automated radio network design using ant colony optimization." Thesis, Montana State University, 2008. http://etd.lib.montana.edu/etd/2008/sharkey/SharkeyJ0508.pdf.

Abstract:
Radio networks can provide reliable communication for rural intelligent transportation systems (ITS). Engineers manually design these radio networks by selecting tower locations and equipment while meeting a series of constraints such as coverage, bandwidth, maximum delay, and redundancy, all while minimizing network cost. As network size and constraints grow, the design process can quickly become overwhelming. In this thesis we model the network design problem (NDP) as a generalized Steiner tree-star (GSTS) problem. Any solution to the minimum Steiner tree (MST) problem on a constructed GSTS graph will directly identify the tower locations and equipment needed to build the network at an optimal cost. The direct MST solution can only satisfy coverage constraints. Because the MST problem is known to be NP-hard, our research applies ant colony optimization (ACO) to find near-optimal MST solutions. Using ACO also allows us to meet bandwidth, maximum delay, and redundancy constraints. We verify that our approach finds near-optimal designs by comparing it against a 2-approximation algorithm in several different scenarios.
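
For readers unfamiliar with ACO, the sketch below shows its core mechanics (probabilistic edge choice plus pheromone evaporation and deposit) on a toy path-finding graph. It is our simplification, not the thesis code, which targets the much richer GSTS formulation; node names and link costs are invented.

    import random

    graph = {  # hypothetical link costs between candidate tower sites
        "src": {"a": 2.0, "b": 5.0},
        "a": {"b": 1.0, "dst": 6.0},
        "b": {"dst": 3.0},
        "dst": {},
    }
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone per edge

    def walk():
        # One ant: pick next hops in proportion to pheromone / cost.
        node, path, cost = "src", [], 0.0
        while node != "dst":
            nbrs = list(graph[node].items())
            weights = [tau[(node, v)] / c for v, c in nbrs]
            v, c = random.choices(nbrs, weights=weights)[0]
            path.append((node, v))
            cost += c
            node = v
        return path, cost

    best_path, best_cost = None, float("inf")
    for _ in range(200):
        path, cost = walk()
        if cost < best_cost:
            best_path, best_cost = path, cost
        for e in tau:   # evaporation forgets stale choices
            tau[e] *= 0.9
        for e in path:  # deposit: cheaper paths reinforce their edges more
            tau[e] += 1.0 / cost

    print(best_path, best_cost)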
15

Yildirim, Cem S. M. Massachusetts Institute of Technology. "Data-Centric Business Transformation." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/107344.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, System Design and Management Program, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 44-45).
Today's digital business environment is imposing a great transformation challenge on enterprises to effectively use vast amounts of data in order to gain critical business insights and stay competitive. In their aim to take advantage of data, many large organizations are launching data management programs. In these attempts, organizations recognize that taking full advantage of data requires enterprise-wide changes in organizational aspects, business processes, and technology. The lack of recognition of this enterprise-wide scope haunts most data management programs. Research shows that most of these programs fail and are abandoned after long efforts and investments. This study aims to highlight critical reasons why these programs fail, and a different approach to address the fundamental problems associated with the majority of these failures. It is important to succeed in these data efforts because data-driven businesses are gaining a significant competitive edge. Data-Centric Business Transformation (DCBT) Strategy is a holistic approach for the enterprise to transform into a data-driven and agile entity. DCBT is also a way to achieve better alignment in the enterprise. DCBT aims to achieve two goals to transform the organization: become a smarter organization by instilling a continuous learning and improvement culture in all aspects of the business, and achieve agility in enterprise-wide organizational learning and technology. To achieve these two goals, understanding the current state of the organization in the three fundamental DCBT areas of organizational learning capacity, business processes and technology is essential to incrementally and continuously improve each one in concert. Required improvements should be introduced to smaller parts of the organization, delivering the value of data. A strategically chosen pipeline of projects would allow the organization to ramp up into a continuously learning and changing organization. In the age of the digital economy, agile organizations can learn quicker from large amounts of data and gain the competitive edge. This study will also look into how a data management program relates to DCBT and can be used in concert to enable DCBT.
by Cem Yildirim.
S.M. in Engineering and Management
16

Polany, Rany. "Multidisciplinary system design optimization of fiber-optic networks within data centers." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/107503.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, School of Engineering, System Design and Management Program, Engineering and Management Program, 2016.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 136-142).
The growth of the Internet and the vast amount of cloud-based data have created a need to develop data centers that can respond to market dynamics. The role of a data center designer, who is responsible for scoping, building, and managing the infrastructure design, is becoming increasingly complex. This work presents a new analytical systems approach to modeling fiber-optic network design within data centers. Multidisciplinary system design optimization (MSDO) is utilized to integrate seven disciplines into a unified software framework for modeling 10G, 40G, and 100G multi-mode fiber-optic networks: 1) market and industry analysis, 2) fiber-optic technology, 3) data center infrastructure, 4) systems analysis, 5) multi-objective optimization using genetic algorithms, 6) parallel computing, and 7) simulation research using MATLAB and OptiSystem. The framework is applied to four theoretical data center case studies to simultaneously evaluate the Pareto optimal trade-offs of (a) minimizing life-cycle costs, (b) maximizing user capacity, and (c) maximizing optical transmission quality (Q-factor). The results demonstrate that data center life-cycle costs are most sensitive to power costs, that 10G OM4 multi-mode optical fiber is Pareto optimal for long reach and low user capacity needs, and that 100G OM4 multi-mode optical fiber is Pareto optimal for short reach and high user capacity needs.
by Rany Polany.
S.M. in Engineering and Management
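
The multi-objective core of such a study is Pareto dominance. Below is a minimal sketch of ours (the design tuples are invented numbers, not results from the thesis) that filters candidate designs on the three stated objectives.

    designs = {  # name: (life-cycle cost $M, user capacity, Q-factor)
        "10G-OM4": (4.0, 800, 8.2),
        "40G-OM4": (6.5, 2400, 7.1),
        "100G-OM4": (9.0, 6000, 6.4),
        "bad-mix": (9.5, 2000, 6.0),
    }

    def dominates(a, b):
        # a dominates b: no worse on all objectives, strictly better on one.
        (ca, ua, qa), (cb, ub, qb) = a, b
        no_worse = ca <= cb and ua >= ub and qa >= qb
        better = ca < cb or ua > ub or qa > qb
        return no_worse and better

    pareto = [name for name, v in designs.items()
              if not any(dominates(w, v) for w in designs.values() if w != v)]
    print(pareto)  # 'bad-mix' is dominated by '100G-OM4' and drops out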
17

Dacus, Farron L., Steven P. Hendrix, and Joseph J. Bouchez. "INTRODUCTORY SYSTEM DESIGN OF THE ADVANCED SUBMINIATURE TELEMETRY SYSTEM." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/606786.

Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
The Advanced SubMiniature Telemetry System (ASMT) with Wireless Sensor extension is an ambitious program aimed at incorporating modern wireless system and electronic design methods into a two way, miniature, low cost, modular, and completely software controlled wireless data acquisition system. The program was conceived and is sponsored by the U.S. Air Force SEEK EAGLE Office as a means of both lowering test cost and increasing test effectiveness. This article shall present the fundamental system design challenges of the program and how modern design methods can provide a new standard of cost effectiveness, mission capability, and high spectral efficiency.
18

Bhagattjee, Benoy. "Emergence and taxonomy of big data as a service." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/90709.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 82-83).
The amount of data that we produce and consume is growing exponentially in the modern world. Increasing use of social media and new innovations such as smartphones generate large amounts of data that can yield invaluable information if properly managed. These large datasets, popularly known as Big Data, are difficult to manage using traditional computing technologies. New technologies are emerging in the market to address the problem of managing and analyzing Big Data to produce invaluable insights from it. Organizations are finding it difficult to implement these Big Data technologies effectively due to problems such as a lack of available expertise. Some of the latest innovations in the industry are related to cloud computing and Big Data. There is significant interest in academia and industry in combining Big Data and cloud computing to create new technologies that can solve the Big Data problem. Big Data based on cloud computing is an upcoming area in computer science, and many vendors are offering their ideas on this topic. The combination of Big Data technologies and cloud computing platforms has led to the emergence of a new category of technology called Big Data as a Service, or BDaaS. This thesis aims to define the BDaaS service stack and to evaluate a few technologies in the cloud computing ecosystem using the BDaaS service stack. The BDaaS service stack provides an effective way to classify Big Data technologies, enabling technology users to evaluate and choose the technology that meets their requirements effectively. Technology vendors can use the same BDaaS stack to better communicate their product offerings to consumers.
by Benoy Bhagattjee.
S.M. in Engineering and Management
19

Rao, Ananth K. "The DFS distributed file system : design and implementation." Online version of thesis, 1989. http://hdl.handle.net/1850/10500.

20

Eriksson, Hanna. "On-Board Data Acquisition System : Conceptual Design of an Airdrop Tracking System." Thesis, Linköpings universitet, Maskinkonstruktion, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-159201.

Abstract:
This thesis is, on behalf of Saab AB, a pre-study of possible on-board solutions for position measuring during store separation tests, aimed at the test and evaluation of JAS 39 Gripen. The purpose is to replace the present ground-based system in order to achieve more time- and cost-effective trials. Three different concept development methodologies were investigated in order to find the most suitable one for this thesis. These were merged into one adapted methodology containing the following phases: Planning, Function Analysis, Concept Generation and Concept Evaluation. The work progressed as the methodology states, and the largest amount of work was dedicated to the Planning phase. The requirements and desiderata for the system were produced with an agile process, resulting in the Construction Specification List that eventually became the basis for the Concept Generation phase. Knowledge about the technical theory needed to solve the problem was obtained in parallel with the Function Analysis and Concept Generation. The most adaptable techniques to measure position were found to be the Global Positioning System (GPS) and the Inertial Navigation System (INS). After extensive work on the Concept Generation in parallel with a continuously updated Construction Specification List, three concepts were developed. One concept is based on GPS, the second on INS, and the third is a combination of GPS and INS. All three concepts share the same telemetry system and casing, which fulfills the requirement of simple installation and the possibility to install in different stores. In the final phase, Concept Evaluation, a comparison between the concepts was performed. Advantages and disadvantages were listed and the fulfillment of requirements was investigated. All three concepts were handed over to Saab in order to let them decide which concept(s) to develop further.
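
As a one-dimensional illustration of why the GPS + INS combination is attractive (a sketch of ours, not Saab's design; all numbers are invented), a complementary filter blends smooth-but-drifting INS dead reckoning with noisy-but-absolute GPS fixes:

    import random

    dt, alpha = 0.1, 0.98  # time step [s]; weight given to the INS branch
    true_v = 30.0          # true velocity of the store [m/s]
    pos_true, pos_est = 0.0, 0.0

    for _ in range(50):
        pos_true += true_v * dt
        ins_pos = pos_est + (true_v + random.gauss(0, 0.2)) * dt  # integrate IMU
        gps_pos = pos_true + random.gauss(0, 3.0)                 # noisy GPS fix
        pos_est = alpha * ins_pos + (1 - alpha) * gps_pos         # blend the two

    print(f"true {pos_true:.1f} m, fused estimate {pos_est:.1f} m")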
21

Nilekar, Shirish K. "A system-oriented analysis of team decision making in data rich environments." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/90698.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 78-80).
The information processing view of organizations [1] and subsequent works highlight the primary role of information processing in the effective functioning of markets and organizations. With the current wave of "big data" and related technologies, data-oriented decision making is being widely discussed [2] as a means of using this vast amount of available data for better decisions that can lead to improved business results. The focus of many of these studies is at the organization level. However, decisions are made by teams of individuals, and this is a complex socio-technical process. The quality of a decision depends on many factors, including technical capabilities for data analysis and human factors like team dynamics and the cognitive capabilities of the individuals and the team. In this thesis, we developed a systems-theory-based framework for decision making and identified four socio-technical factors, viz. data analytics, data sensing, power distribution, and conflict level, which affect the quality of decisions made by teams. We then conducted "thought experiments" to investigate the relative contribution of each of these factors to the quality of decisions. Our experiments and subsequent analyses show that while improved data analytics does result in better decisions, human factors have an outsized contribution to the quality of decisions, even in data-rich environments. Moreover, when the human factors in a team improve, the predictability of the positive impacts due to improvements in the technical capabilities of the team also increases.
by Shirish K. Nilekar.
S.M. in Engineering and Management
22

Lee, Moi-Keow. "Data feedback in an integrated design-to-manufacture system." Thesis, Loughborough University, 1990. https://dspace.lboro.ac.uk/2134/27789.

Abstract:
This work is set within the context of a research project on an information support system for design and manufacture, involving collaboration with other research workers and commitment to other specific applications within a software environment. The work is targeted at the design-to-manufacture of prismatic parts in a prototyping environment. The research moves from the identification of requirements for dimensional and process analysis, through the specification of data structures within a product data model and the definition of a decision network and measurement graph for dimensional analysis, to the provision of verification checks and fault clusters for process analysis. The two analyses are centred on a common fault library. The two facilities to support the prototyping environment have been developed and tested through an industrial case study. The essential role of these facilities is to reduce the lead time involved in 'prove-out'. The facility embodies a generic approach to capturing manufacturing knowledge and a data feedback functionality that closes the loop from manufacture to design.
23

Ferrill, Paul. "REFERENCE DESIGN FOR A SQUADRON LEVEL DATA ARCHIVAL SYSTEM." International Foundation for Telemetering, 2006. http://hdl.handle.net/10150/604259.

Abstract:
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California
As more aircraft are fitted with solid state memory recording systems, the need for a large data archival storage system becomes increasingly important. In addition, there is a need to keep classified and unclassified data separate but available to the aircrews for training and debriefing along with some type of system for cataloging and searching for specific missions. This paper will present a novel approach along with a reference design for using commercially available hardware and software and a minimal amount of custom programming to help address these issues.
24

Berdugo, Albert. "DESIGN OF A GIGABIT DATA ACQUISITION AND MULTIPLEXER SYSTEM." International Foundation for Telemetering, 2003. http://hdl.handle.net/10150/606727.

Abstract:
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Gigabit and hundreds-of-megabits communication buses are starting to appear as the avionic buses of choice for new or upgraded airborne systems. This trend presents new challenges for instrumentation engineers in the areas of high-speed data multiplexing, data recording, and data transmission of flight safety information. This paper describes the approach currently under development to acquire data from several types of high-speed avionic buses using distributed multiplexer and acquisition units. Additional input data may include PCM, wideband analog data, discretes, real-time video and others. The system is capable of multiplexing and recording all incoming data channels, while at the same time providing data selection down to the parameter level from input channels for transmission of flight safety information. Additionally, an extensive set of data capture trigger/filter/truncation mechanisms is supported.
25

Patil, Devadas V. "Business development trends and analysis for the data networking market." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/70804.

Abstract:
Thesis (S.M. in Engineering and Management)--Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 89-96).
The Internet has come a long way after the widely reported invention by Sandra Lerner and Leonard Bosack of the router, a device that can transmit data from one network to another based on certain protocols and principles. Despite a slow start in the mid 1980s, the Internet has emerged as one of the primary means of communication for people of all walks of life. Sophisticated, network-aware applications that integrate data, voice and video have helped fuel this growth. This thesis examines the latest technology trends and historical developments in various market segments of the Internet. Using technology trends as a backdrop, it analyzes business development at Cisco Systems, Inc., a major player in all Internet market segments. Well-known tools and concepts such as the Familiarity Matrix and Technology S-curve are used for case studies of business development at Cisco. Business Development is almost always a high-stakes endeavor requiring keen insight on both financial and strategy fronts. What are good strategies for corporate entrepreneurship? What are the challenges in business development by way of acquisitions? Will cyber anonymity continue to make us lonely and distanced, or will there be a new breed of Internet applications that will genuinely bring people closer? These are some of the questions this thesis explores, drawing on the wisdom and experience of industry experts.
by Devadas V. Patil.
S.M. in Engineering and Management
26

Praveen, Vikram. "Event Driven GPS Data Collection System for Studying Ionospheric Scintillation." Miami University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=miami1323894410.

27

Williams, Alan C. "A behavioural VHDL synthesis system using data path optimisation." Thesis, University of Southampton, 1997. https://eprints.soton.ac.uk/251233/.

Abstract:
MOODS (Multiple Objective Optimisation in Data and control path synthesis) is a synthesis system which provides the ability to automatically optimise a design from a behavioural to a structural VHDL description. This thesis details two sets of enhancements made to the original system to improve the overall quality of the final hardware implementations obtained, and expand the range of the accepted VHDL subset. Whereas the original MOODS considered each functional unit in the target module library to be a purely combinational logic block, the 'expanded modules' developed for this project provide a means of implementing sequential multi-cycle modules. These modules are defined as technology-independent templates, which are inline expanded into the internal design structure during synthesis. This enables inter-module optimisation to occur at the sub-module level, thus affording greater opportunities for unit sharing and module binding. The templates also facilitate the development of specialised interface modules. These enable the use of fixed timing I/O protocols for external interfacing, while maintaining maximum scheduling flexibility within the body of the behaviour. The second set of enhancements includes an improved implementation of behavioural VHDL as input to the system. This expands the previously limited subset to include such elements as signals, wait statements, concurrent processes, and functions and procedures. These are implemented according to the IEEE standard thereby preserving the computational effects of the VHDL simulation model. The final section of work involves the development and construction of an FPGA-based real-time audio-band spectrum analyser, synthesised within the MOODS environment. This design process provides valuable insights into the strengths and weaknesses of both MOODS and behavioural synthesis in general, serving as a firm foundation to guide future development of the system.
28

Kam, Ambrose (Ambrose M. ). 1971. "A technology analysis on mobilizing corporate data." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/91735.

29

Srinivasan, K. "Design and development of an enterprise modeling framework." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/8285.

30

Wade, John Leconte. "The development of a design for manufacture expert system." Thesis, Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/17600.

31

Lin, Paul Hong-Yi. "Data quality enhancement in oil reservoir operations : an application of IPMAP." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/76569.

Abstract:
Thesis (S.M. in Engineering and Management)--Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2012.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 70-71).
This thesis presents a study of data quality enhancement opportunities in the upstream oil and gas industry. The Information Product MAP (IPMAP) methodology is applied to reservoir pressure and reservoir simulation data to propose data quality recommendations for the company under study. In particular, a new four-step methodology for examining data quality in reservoir pressure management systems is proposed: 1. Trace the data flow and draw the IPMAP; 2. Highlight the cross-system and organizational boundaries; 3. Select data quality analytical questions based on a review of the data quality literature; 4. Apply the analytical questions at each boundary and document the results. This original methodology is applied to the three management systems used to collect a pressure survey: a spreadsheet, a standardized database and an automated database. IPMAPs are drawn for each of these three systems, and cross-system and organizational boundaries are highlighted. Next, the systematic data quality questions are applied. As a result, three data quality problems are identified and documented: the well identifier number, well bore data and reservoir datum. The second experiment investigates data quality issues in the scope of reservoir simulation and forecasting. A high-level IPMAP and a process flow for reservoir simulation and forecasting are generated. The next section further elaborates on the first high-level process flow and drills into the process flow for simulation. The analytical data quality questions are applied to the second simulation process flow, and limited findings are documented. This thesis concludes with lessons learned and directions for future research.
by Paul Hong-Yi Lin.
S.M. in Engineering and Management
32

Weng, Li. "Automatic and efficient data virtualization system for scientific datasets." Columbus, Ohio : Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1154717945.

33

Nair, Deepa R. "Visual design versus development a case study presenting how XML and XSLT can separate presentation from data /." [Gainesville, Fla.] : University of Florida, 2001. http://etd.fcla.edu/etd/uf/2001/anp1594/thesis.pdf.

Abstract:
Thesis (M.S.)--University of Florida, 2001.
Title from first page of PDF file. Document formatted into pages; contains xi, 86 p.; also contains graphics. Vita. Includes bibliographical references (p. 85).
34

Cherif, Mohamed. "Design and development of a simple data acquisition system for monitoring and recording data." Thesis, KTH, Energiteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-170868.

Abstract:
This project is about constructing a data acquisition system. These systems are very useful for technological companies because technicians can monitor every machine in the factory through a computer and see if one or several are malfunctioning. A computer-based data acquisition system is an effective and cost-efficient method to get an overview of the whole production line inside a factory. Data acquisition means the process of measuring and acquiring signals from physical or electrical phenomena such as pressure, temperature, voltage and flow with a computer and software. The purpose of the project is to design and develop a simple data acquisition system. LabVIEW was used to construct the system, and the system is limited to acquiring signals only from flow and temperature. The measurements were made in the HPT laboratory at the Royal Institute of Technology. By the end of the project a Graphical User Interface was developed so that the user can monitor how the flow and temperature vary with time.
In today's industry, companies place great weight on developing methods and tools that make work processes more efficient. One problem large factories can face is that if one of the machines stops working correctly, it can be hard to detect in time. A PC-based data acquisition system is a flexible and cost-effective way to oversee the entire process in a factory and easily keep track of all machines. Data acquisition is the process of measuring an electrical or physical phenomenon such as voltage, current, temperature, pressure or sound using a computer and software. The purpose of the project is to design and develop a data acquisition system. LabVIEW was used to construct the system, and it is limited to collecting data only for flow and temperature. The measurements were carried out in the HPT laboratory at the Royal Institute of Technology. The result was that a Graphical User Interface was constructed so that the user can easily see how the temperature and flow vary with time.
35

Noralm, Zeerak. "Design and development of a simple data acquisition system for monitoring and recording data." Thesis, KTH, Energiteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-170869.

Abstract:
This project is about constructing a data acquisition system. These systems are very useful for technological companies because technicians can monitor every machine in the factory through a computer and see if one or several are malfunctioning. A computer-based data acquisition system is an effective and cost-efficient method to get an overview of the whole production line inside a factory. Data acquisition means the process of measuring and acquiring signals from physical or electrical phenomena such as pressure, temperature, voltage and flow with a computer and software. The purpose of the project is to design and develop a simple data acquisition system. LabVIEW was used to construct the system, and the system is limited to acquiring signals only from flow and temperature. The measurements were made in the HPT laboratory at the Royal Institute of Technology. By the end of the project a Graphical User Interface was developed so that the user can monitor how the flow and temperature vary with time.
In today's industry, companies place great weight on developing methods and tools that make work processes more efficient. One problem large factories can face is that if one of the machines stops working correctly, it can be hard to detect in time. A PC-based data acquisition system is a flexible and cost-effective way to oversee the entire process in a factory and easily keep track of all machines. Data acquisition is the process of measuring an electrical or physical phenomenon such as voltage, current, temperature, pressure or sound using a computer and software. The purpose of the project is to design and develop a data acquisition system. LabVIEW was used to construct the system, and it is limited to collecting data only for flow and temperature. The measurements were carried out in the HPT laboratory at the Royal Institute of Technology. The result was that a Graphical User Interface was constructed so that the user can easily see how the temperature and flow vary with time.
36

Cherif, Mohamed, and Zeerak Noralm. "Design and development of a simple data acquisition system for monitoring and recording data." Thesis, KTH, Kraft- och värmeteknologi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-169613.

37

Wong, Angela Sai On. "A fully automatic analytic approach to budget-constrained system upgrade." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26670.

Abstract:
This thesis describes the development of a software package to upgrade computer systems. The package, named OPTIMAL, solves the following problem: given an existing computer system and its workload, a budget, and the costs and descriptions of available upgrade alternatives for devices in the system, what is the most cost-effective way of upgrading and tuning the system to produce the optimal system throughput? To enhance the practicality of OPTIMAL, the research followed two criteria: i) input required by OPTIMAL must be system and workload characteristics directly measurable from the system under consideration; ii) other than gathering the appropriate input data, the package must be completely automated and must not require any specialized knowledge in systems performance evaluation to interpret the results. The output of OPTIMAL consists of the optimal system throughput under the budget constraint, the workload and system configuration (or upgrade strategy) that provide such throughput, and the cost of the upgrade. Various optimization techniques, including saturation analysis and fine tuning, have been applied to enhance the performance of OPTIMAL.
Faculty of Science, Department of Computer Science, Graduate
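
The flavor of the saturation analysis mentioned above can be seen in the classic bottleneck bound from operational analysis: throughput cannot exceed 1/D_max, where D_i is a device's total service demand per job. The greedy budget loop below is a hedged sketch of ours illustrating that idea, not OPTIMAL itself; all demands, upgrade options and costs are invented.

    demands = {"cpu": 0.05, "disk": 0.12, "net": 0.08}      # sec of service per job
    upgrades = {"disk": (0.06, 40.0), "net": (0.05, 30.0)}  # new demand, cost
    budget = 50.0

    def throughput_bound(d):
        # Asymptotic bound: the saturated (max-demand) device limits the system.
        return 1.0 / max(d.values())

    while True:
        bottleneck = max(demands, key=demands.get)
        option = upgrades.pop(bottleneck, None)
        if option is None:
            break
        new_demand, cost = option
        if cost > budget or new_demand >= demands[bottleneck]:
            break
        budget -= cost
        demands[bottleneck] = new_demand
        print(f"upgrade {bottleneck}: bound now {throughput_bound(demands):.1f} jobs/s")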
38

Fung, Charles. "An extended relational data base management system for engineering design /." Online version of thesis, 1986. http://hdl.handle.net/1850/8807.

39

Mustafa, Mudassir Imran. "Design Principles for Data Export : Action Design Research in U-CARE." Thesis, Uppsala universitet, Institutionen för informatik och media, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-180061.

Abstract:
In this thesis, we report the findings of designing data export functionality in the Uppsala University Psychosocial Care Program (U-CARE). The aim was to explore the design space for generic data export functionality in data-centric clinical research applications for data analysis. This was attained through the construction and evaluation of a prototype for a data-centric clinical research application. For this purpose, Action Design Research (ADR) was conducted, situated in the domain of clinical research. The results consist of a set of design principles expressing key aspects that need to be addressed when designing data export functionality. The artifacts derived from the development and evaluation process each constitute an example of how to design data export functionality of this kind.
40

Deng, Thomas. "The design and implementation of a portable data acquisition system." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36661.

41

Immorlica, Nicole 1978. "Data acquisition system design for the Alpha Magnetic Spectrometer experiment." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87233.

Abstract:
Thesis (M.Eng. and S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002.
Includes bibliographical references (p. 69-70).
by Nicole Immorlica.
M.Eng. and S.B.
42

Khan, Kevin Jamil Hiroshi. "Wide Area Power System Monitoring Device Design and Data Analysis." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/34052.

Abstract:
The frequency disturbance recorder (FDR) is a cost-effective data acquisition device used to measure power system frequency at the distribution level. FDRs are time-synchronized via global positioning system (GPS) timing, and data recorded by FDRs are time-stamped to allow for comparative analysis between FDRs. The data is transmitted over the internet to a central server, where it is collected and stored for post-mortem analysis. Currently, most of the analysis is done with power system frequency.

The purpose of this study is to take a first in-depth look at the angle data collected by FDRs. Different data conditioning techniques are proposed and tested before one is chosen. The chosen technique is then used to extract usable angle data for angle analysis on eight generation trip events. The angle differences are then used to create surface-plot angle difference movies for further analysis.

A new event detection algorithm, the k-means algorithm, is also presented in this thesis. The algorithm is proposed as a simple and fast alternative to the current detection method. Next, this thesis examines several GPS modules and recommends one as a replacement for the current GPS chip, which is no longer in production. Finally, the manufacturing process for creating an FDR is documented.

This thesis may have raised more questions than it answers, and it is hoped that this work will lay the foundation for further analysis of angles from FDR data.

Master of Science
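
To show the idea of k-means as an event detector (our sketch with synthetic data, not the thesis implementation): split frequency samples into two clusters and treat the lower-frequency cluster as the disturbance.

    import random

    nominal = [random.gauss(60.00, 0.005) for _ in range(200)]  # Hz
    dip = [random.gauss(59.95, 0.005) for _ in range(20)]       # generation trip
    samples = nominal + dip

    c0, c1 = min(samples), max(samples)  # initial centroids
    for _ in range(20):                  # Lloyd's iterations, k = 2
        g0 = [s for s in samples if abs(s - c0) <= abs(s - c1)]
        g1 = [s for s in samples if abs(s - c0) > abs(s - c1)]
        c0, c1 = sum(g0) / len(g0), sum(g1) / len(g1)

    print(f"centroids {c0:.3f} Hz / {c1:.3f} Hz; "
          f"event cluster is the one near {min(c0, c1):.3f} Hz")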
43

Chiou, Ian Yiing-shyang. "Design and analysis of a voice/data internet transport system /." The Ohio State University, 1986. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487266691096329.

44

Cheema, Saad Saadat. "Design and Performance Analysis of a Sonar Data Acquisition System." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1563874431553724.

45

Schneider, Ian Michael. "Market design opportunities for an evolving power system." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/128639.

Abstract:
Thesis: Ph. D. in Social and Engineering Systems, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, February, 2020
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 117-126).
The rapid growth of renewable energy is transforming the electric power sector. Wind and solar energy are non-dispatchable: their energy output is uncertain and variable from hour to hour. New challenges arise in electricity markets with a large share of uncertain and variable renewable energy. We investigate some of these challenges and identify economic opportunities and policy changes to mitigate them. We study electricity markets by focusing on the preferences and strategic behavior of three different groups: producers, consumers, and load-serving entities. First, we develop a game-theoretic model to investigate energy producer strategy in electricity markets with high levels of uncertain renewable energy. We show that increased geographic dispersion of renewable generators can reduce market power and increase social welfare. We also demonstrate that high-quality public forecasting of energy production can increase welfare. Second, we model and explain the effects of retail electricity competition on producer market power and forward contracting. We show that increased retail competition could decrease forward contracting and increase electricity prices; this is a downside to the general trend of increased access to retail electricity competition. Finally, we propose new methods for improving demand response programs. A demand response program operator commonly sets customer baseline thresholds to determine compensation for individual customers. The optimal way to do this remains an open question. We create a new model that casts the demand response program as a sequential decision problem; this formulation highlights the importance of learning about individual customers over time. We develop associated algorithms using tools from online learning, and we show that they outperform the current state of practice.
by Ian Michael Schneider.
Ph. D. in Social and Engineering Systems
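
As a rough illustration of casting baseline-setting as a sequential decision problem, the sketch below runs an exponential-weights learner over candidate baseline thresholds for a single demand-response customer. The candidate grid, the reward function, and the full-information feedback are all assumptions made for this sketch; the thesis's actual algorithms are not reproduced here.

import numpy as np

class BaselineLearner:
    # Exponential-weights (Hedge) over candidate baseline thresholds.
    def __init__(self, candidates_kw, eta=0.5):
        self.candidates = np.asarray(candidates_kw, dtype=float)
        self.weights = np.ones(len(self.candidates))
        self.eta = eta

    def choose(self, rng):
        # Sample a threshold for this event in proportion to its weight.
        p = self.weights / self.weights.sum()
        return self.candidates[rng.choice(len(self.candidates), p=p)]

    def update(self, rewards):
        # Full-information update: one reward per candidate (a simplification).
        self.weights *= np.exp(self.eta * np.asarray(rewards))

# Toy run: the customer's true counterfactual load is about 10 kW,
# and rewards penalize thresholds that over- or under-credit it.
rng = np.random.default_rng(0)
learner = BaselineLearner(candidates_kw=[6, 8, 10, 12, 14])
for _ in range(200):
    learner.choose(rng)
    true_load = 10.0 + rng.normal(0, 0.5)
    learner.update(-np.abs(learner.candidates - true_load) / 10.0)
print(learner.candidates[np.argmax(learner.weights)])  # converges toward 10

A bandit-style variant that only observes the reward of the chosen threshold would match the sequential-decision framing more closely; the full-information version is used here to keep the sketch short.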
APA, Harvard, Vancouver, ISO, and other styles
46

Hussain, Romana. "System-in-use methodology : a methodology to generate conceptual PSS (Product-Service Systems) and conventional designs using systems-in-use data." Thesis, Cranfield University, 2013. http://dspace.lib.cranfield.ac.uk/handle/1826/8262.

Full text
Abstract:
Industries want to add value to their offerings, but to do this they can no longer simply accept customer requirements: they need to know how their products and/or services are embedded within the customer's process to achieve the customer's goal, since any gaps within that process present an opportunity for the provider to fill. The System-In-Use (SIU) Methodology presented in this thesis enables customer issues to "pull" the supply chain into creating new solutions, as well as the supply chain to "push" new value propositions into improving customer processes. It does this by drawing on a detailed theory of value and capability developed as part of this research. The method has been applied in five industries, in processes involving high-value assets, with very positive outcomes for each of the stakeholders involved: notably, three solutions have been adopted in industry, for which a KT-Box award was granted by Cambridge University. Cont/d.
APA, Harvard, Vancouver, ISO, and other styles
47

Kouyoumdjieva, Sylvia T. "System Design for Opportunistic Networks." Doctoral thesis, KTH, Kommunikationsnät, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-176479.

Full text
Abstract:
Device-to-device communication has been suggested as a complement to traditional cellular networks as a means of offloading cellular traffic. In this thesis we explore a solution for device-to-device communication based on opportunistic content distribution in a content-centric network. Communication opportunities arise as mobile nodes roam around in an area and occasionally come within direct communication range of one another. We consider a node to be a pedestrian equipped with a mobile device and explore the properties of opportunistic communication in the context of content dissemination in urban areas. The contributions of this thesis lie in three areas. First, we study human mobility as one of the main enablers of opportunistic communication. We introduce traces collected from a realistic pedestrian mobility simulator and demonstrate that the performance of opportunistic networks is not very sensitive to the accurate estimation of the probability distributions of mobility parameters. However, capturing the space in which mobility occurs may be of high importance. Second, we design and implement a middleware for opportunistic content-centric networking, and we evaluate it via a small-scale testbed, as well as through extensive simulations. We conclude that energy-saving mechanisms should be part of the middleware design, while caching should be considered only as an add-on feature. Third, we present and evaluate three different energy-saving mechanisms in the context of opportunistic networking: a dual-radio architecture, an asynchronous duty-cycling scheme, and an energy-aware algorithm which takes into account node selfishness. We evaluate our proposals analytically and via simulations. We demonstrate that when a critical mass of participants is available, the performance of the opportunistic network is comparable to downloading contents directly via the cellular network in terms of energy consumption while offloading large traffic volumes from the operator.
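
The cost of asynchronous duty cycling can be made concrete with a toy Monte Carlo estimate: two nodes with independent, unsynchronized on/off schedules discover each other only if their awake intervals overlap at some instant of a contact. The period, contact duration, and independence assumptions below are illustrative and do not come from the thesis.

import numpy as np

def discovery_probability(duty_cycle, period_s, contact_s, trials=20_000, seed=0):
    # Probability that both nodes are awake simultaneously during a contact.
    rng = np.random.default_rng(seed)
    on = duty_cycle * period_s
    phase = rng.uniform(0, period_s, size=(trials, 2))  # random schedule offsets
    t = np.linspace(0, contact_s, 200)                  # instants within the contact
    awake = ((t[None, :, None] + phase[:, None, :]) % period_s) < on
    both_awake = awake.all(axis=2)                      # both nodes awake at an instant
    return np.any(both_awake, axis=1).mean()

for d in (0.1, 0.3, 0.5):
    print(d, round(discovery_probability(d, period_s=10.0, contact_s=30.0), 3))

Even this toy model shows the qualitative trade-off: low duty cycles save energy but miss many short contacts, which is exactly the tension that energy-saving mechanisms like those studied in the thesis have to manage.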


APA, Harvard, Vancouver, ISO, and other styles
48

Zhenhua, Yu, Kou Yanhong, and Zhang Qishan. "DESIGN OF A NEW WEBGIS SYSTEM BASED ON XML." International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604886.

Full text
Abstract:
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada
With the development of the Internet and the growing need for GIS (Geographic Information System), XML (eXtensible Markup Language) provides a powerful new foundation for web applications. This paper investigates the application of SVG, an XML-based graphics format, in WebGIS (World Wide Web Geographical Information System). The characteristics of XML are briefly described; the application of XML in WebGIS is discussed; the features of SVG and its usage with XML are presented; and a design for an XML/SVG-based WebGIS system is given.
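
A minimal sketch of the XML-to-SVG idea: read point features from an XML document and emit them as an SVG layer. The element and attribute names below are invented for illustration and are not a schema from the paper.

import xml.etree.ElementTree as ET

# Hypothetical feature data; names and coordinates are assumptions.
FEATURES_XML = """<features>
  <point name="StationA" x="120" y="80"/>
  <point name="StationB" x="40" y="150"/>
</features>"""

def features_to_svg(xml_text, width=200, height=200):
    # Render each point feature as an SVG circle on a fixed-size canvas.
    root = ET.fromstring(xml_text)
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width=str(width), height=str(height))
    for pt in root.iter("point"):
        c = ET.SubElement(svg, "circle", cx=pt.get("x"), cy=pt.get("y"), r="4")
        c.set("data-name", pt.get("name"))
    return ET.tostring(svg, encoding="unicode")

print(features_to_svg(FEATURES_XML))

Because SVG is itself XML, the same parsing and serialization machinery handles both the feature data and the rendered map, which is the main attraction of the approach described in the abstract.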
APA, Harvard, Vancouver, ISO, and other styles
49

Alvidrez, Carlos. "A systematic framework for preparing and enhancing structured data sets for time series analysis." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100367.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 216-217).
This thesis proposes a framework to systematically prepare and enhance structured data for time series analysis. It suggests the production of intermediate derived calculations, which aid in the analysis and rationalization of variation over time, to enhance the consistency and the efficiency of data analysis. This thesis was developed with the cooperation of a major international financial firm. The use of their actual historical financial credit risk data sets significantly aided this work by providing genuine feedback, validating specific results, and confirming the usefulness of the method. While illustrated through the use of credit risk data sets, the methodology this thesis presents is designed to be applied easily and transparently to structured data sets used for time series analysis.
by Carlos Alvidrez.
S.M. in Engineering and Management
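
In the spirit of the framework described above, "intermediate derived calculations" for a structured time series might look like the sketch below (pandas; the column names and the choice of derived measures are assumptions for illustration, not the thesis's framework).

import pandas as pd

def enhance_for_time_series(df, value_col, date_col="date"):
    # Add derived columns that make period-over-period variation visible.
    out = df.sort_values(date_col).copy()
    out["delta"] = out[value_col].diff()             # absolute change
    out["pct_change"] = out[value_col].pct_change()  # relative change
    out["rolling_mean_4"] = out[value_col].rolling(4).mean()
    out["zscore_4"] = ((out[value_col] - out["rolling_mean_4"])
                       / out[value_col].rolling(4).std())
    return out

df = pd.DataFrame({
    "date": pd.date_range("2015-01-01", periods=8, freq="D"),
    "exposure": [100, 102, 99, 140, 141, 143, 90, 95],
})
print(enhance_for_time_series(df, "exposure"))

Computing such columns once, upstream of the analysis, is what delivers the consistency benefit the abstract claims: every analyst sees the same definition of "change" and "anomaly".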
APA, Harvard, Vancouver, ISO, and other styles
50

Markina-Khusid, Aleksandra. "Effect of learning on stakeholder negotiation outcomes : modeling and analysis of game-generated data." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100390.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 75-81).
A design negotiation game based on a stakeholder salience framework was created by the APACE research team to explore negotiation dynamics between stakeholders with individual attributes and agendas. Experimental data were collected anonymously during games played by groups of human participants through a web interface. It was found that the negotiation process takes a non-zero number of iterations even under conditions that strongly favor agreement. A realistic scenario was created based on extensive interviews with the major stakeholders involved in a real negotiation of a plan for a new government information technology system. Solution space exploration of this scenario demonstrated that the experimentally obtained solutions lie far from the optimality frontier. Performance differed significantly in two groups of participants with dissimilar professional experience; games played by interns achieved higher scores than those played by senior staff. An agent-based model was built to simulate multi-stage design negotiation. Utility functions of individual players were based on their private agendas. Players voted for a design according to the relative attractiveness of the design as established by the individual utility function. The negotiation process helps players discover other players' agendas. It was hypothesized that knowledge of each other's private objectives would enable groups of players to achieve design solutions that are closer to optimal. Effects of learning were introduced into the model by adding a fraction of the sum of all players' utility functions to each individual utility function. Simulated games with learning effects yielded solutions with higher total player scores than simulated games without learning did. Results of simulated games with a substantial level of learning effects were similar to average experimental results from groups of interns. Results of simulated games without learning were close to the average results of games played by senior staff.
by Aleksandra Markina-Khusid.
S.M. in Engineering and Management
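
The learning mechanism described in the abstract, adding a fraction of the group's total utility to each player's private utility, can be sketched in a few lines. The voting rule (each player picks the design maximizing effective utility) and the numbers below are assumptions for illustration.

import numpy as np

def vote_counts(private_utils, learning=0.0):
    # private_utils: players x designs matrix of private utilities.
    private_utils = np.asarray(private_utils, dtype=float)
    group = private_utils.sum(axis=0)             # total utility per design
    effective = private_utils + learning * group  # blend in the group's view
    picks = np.argmax(effective, axis=1)          # each player's favorite design
    return np.bincount(picks, minlength=private_utils.shape[1])

# Three players, three designs; design 2 is the social optimum.
U = [[5.0, 1.0, 4.0],
     [1.0, 5.0, 4.0],
     [2.0, 2.0, 4.0]]
print(vote_counts(U, learning=0.0))  # [1 1 1]: no consensus
print(vote_counts(U, learning=0.5))  # [0 0 3]: learning pulls votes together

The qualitative behavior matches the reported finding: as the learning weight grows, individually rational votes converge on designs with a higher total score.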
APA, Harvard, Vancouver, ISO, and other styles