Dissertations / Theses on the topic 'Gun control – Data processing'

Consult the top 50 dissertations / theses for your research on the topic 'Gun control – Data processing.'


1

Van Schalkwyk, Dirko. "Distributed real-time processing using GNU/Linux/libré software and COTS hardware." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/49933.

Full text
Abstract:
Thesis (MScIng)--Stellenbosch University, 2004.
ENGLISH ABSTRACT: This dissertation studies the viability of using both low-cost consumer Commodity Off The Shelf (COTS) PCs and libré software to implement a distributed real-time system modelled on a real-world engineering problem. Debugging and developing a modular satellite system is both time consuming and complex; to this end the SUNSAT team envisioned the Interactive Test System, a dual-mode simulator/monitoring system. It is this system that requires a real-time back-end and serves as the real-world problem model to implement. The implementation was accomplished by researching the available parallel-processing software and real-time extensions to GNU/Linux and choosing the appropriate solutions based on the needs of the model. A monitoring system was also implemented, for system verification, using freely available system-monitoring utilities. The model was successfully implemented and verified with a global synchronization of < 10 ms. It was shown that GNU/Linux and libré software are both mature enough and appropriate for solving a real-world distributed real-time problem.
APA, Harvard, Vancouver, ISO, and other styles
2

Cheng, Xiaofeng. "Analysis of States Gun Control Restrictions." Scholar Commons, 2002. http://purl.fcla.edu/fcla/etd/SFE0000037.

Full text
3

Lane, Alexander M. "Mass Shootings and Gun Control: Obama’s Road to Reform." Scholarship @ Claremont, 2013. http://scholarship.claremont.edu/cmc_theses/621.

Full text
Abstract:
This work evaluates President Obama’s gun control policies by determining whether stricter federal gun control laws should apply within the United States. This paper examines whether setting legal standards at a national level would effectively reduce gun-related violence and mass shootings at the local and state level. These include events such as the Sandy Hook Elementary shooting in Newtown, Connecticut, the Virginia Polytechnic school shootings, and the Aurora theater shooting in Aurora, Colorado. Specifically, could executive orders proposed by the president, such as assault-weapon bans, rigorous background checks on gun sales, submission of mental health records to FBI databases, and increased cooperation between states and mental health care institutions effectively reduce horrific incidents of gun-related violence? Using gun control data from past and present as our research, we will determine whether stricter gun control policies have deterred violent crimes, murder rates, suicides and mass shootings. Since our research focuses on policy solutions as an alternative means of reducing mass shootings, not the psychological makeup and environmental factors of mass shooters, we will omit America’s gun culture as a variable within our study, such as the effects violent video games and movies could have on the psyche of troubled individuals. After carefully analyzing data related to gun violence and crime, this work will attempt to suggest whether or not President Obama’s gun control policies will pass in Congress and which legislation will be the most effective in limiting gun violence and mass shootings.
4

Sabri, Dina O. "Process control using an optomux control board." Virtual Press, 1987. http://liblink.bsu.edu/uhtbin/catkey/484759.

Full text
Abstract:
In this thesis, process control concepts were used to develop software that could be adapted to a real-world situation. The software was used to control a simple temperature-regulation experiment, which demonstrated the use of OPTOMUX analog and digital input/output devices in controlling a process. The goal of this experiment was to use the input/output devices to hold the temperature of the test box within specified tolerances for a designated period of time. To accomplish optimal use of equipment and optimal control, a mathematical model was derived to predict the behavior of the process under control. The pattern observed while the temperature was increasing toward room temperature closely resembled an exponential function; for temperatures above room temperature the curve approximated a square-root function. The pattern followed when decreasing the temperature was exponential throughout. The time required to collect all the significant data was two hours when increasing the temperature and one hour when decreasing it; beyond these time limits the temperature remained essentially constant. The maximum temperature that could be reached was six degrees above room temperature, and the minimum two degrees below room temperature.
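The rise-and-settle behaviour reported in this abstract can be made concrete with a small sketch. The following is an illustrative first-order exponential model, not the thesis software; the ambient temperature, ceiling, and time constant are assumed values, and the square-root region the abstract mentions is omitted for simplicity.

```python
# Illustrative first-order model of the reported behaviour: the box
# temperature approaching a ceiling a few degrees above ambient.
# Ambient value, ceiling and time constant are assumed, not measured.
import math

T_ROOM = 22.0         # assumed ambient temperature (deg C)
T_MAX = T_ROOM + 6.0  # abstract: at most six degrees above ambient
TAU = 1800.0          # assumed time constant (seconds)

def predicted_temperature(t_seconds):
    """Exponential approach of the box temperature to its ceiling."""
    return T_MAX - (T_MAX - T_ROOM) * math.exp(-t_seconds / TAU)
```

A model of this shape lets a controller predict how long a heating or cooling action takes to settle, which is how the derived model supports "optimal use of equipment" in the experiment.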
5

May, Brian 1975. "Scalable access control." Monash University, School of Computer Science and Software, 2001. http://arrow.monash.edu.au/hdl/1959.1/8043.

Full text
6

Cline, George E. "A control framework for distributed (parallel) processing environments." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-12042009-020227/.

Full text
7

Kleine, Matthias, Robert Hirschfeld, and Gilad Bracha. "An abstraction for version control systems." Universität Potsdam, 2011. http://opus.kobv.de/ubp/volltexte/2012/5562/.

Full text
Abstract:
Version Control Systems (VCS) allow developers to manage changes to software artifacts. Developers interact with VCSs through a variety of client programs, such as graphical front-ends or command line tools. It is desirable to use the same version control client program against different VCSs. Unfortunately, no established abstraction over VCS concepts exists. Instead, VCS client programs implement ad-hoc solutions to support interaction with multiple VCSs. This thesis presents Pur, an abstraction over version control concepts that allows building rich client programs that can interact with multiple VCSs. We provide an implementation of this abstraction and validate it by implementing a client application.
8

Moi, Havard. "Rule-based control of manufacturing systems." Thesis, Hong Kong : University of Hong Kong, 2000. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22190168.

Full text
9

Popa, Tiberiu. "Compiling Data Dependent Control Flow on SIMD GPUs." Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/1186.

Full text
Abstract:
Current Graphics Processing Units (GPUs) (circa 2003/2004) have programmable vertex and fragment units. Often these units are implemented as SIMD processors employing parallel pipelines. Data-dependent conditional execution on SIMD architectures implemented using processor idling is inefficient. I propose a multi-pass approach based on conditional streams which allows dynamic load balancing of the fragment units of the GPU and better theoretical performance on programs using data-dependent conditionals and loops. The proposed system can be used to turn the fragment unit of a SIMD GPU into a stream processor with data-dependent control flow.
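The conditional-stream idea can be sketched on a CPU. The NumPy fragment below is a hedged illustration, not the thesis implementation: elements are compacted into two dense streams so each branch kernel runs over fully packed data, mimicking how multi-pass routing avoids idling SIMD lanes on a data-dependent branch.

```python
# Hedged CPU illustration of multi-pass conditional streams: rather
# than masking lanes per-element, route elements into dense "true"
# and "false" streams and run each branch kernel in its own pass.
import numpy as np

def run_conditional(data, cond, then_kernel, else_kernel):
    out = np.empty_like(data)
    idx_true = np.nonzero(cond)[0]    # pass 1: route "true" elements
    idx_false = np.nonzero(~cond)[0]  # pass 2: route "false" elements
    out[idx_true] = then_kernel(data[idx_true])    # dense, fully utilised
    out[idx_false] = else_kernel(data[idx_false])  # dense, fully utilised
    return out

data = np.array([1.0, -2.0, 3.0, -4.0])
result = run_conditional(data, data > 0, np.sqrt, np.abs)
```

The payoff appears when the two branches have very different costs: each pass does only the work its stream needs, instead of every lane paying for both branches.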
10

Kulatunga, Chamil. "Enforcing receiver-driven multicast congestion control using ECN-Nonce." Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=33532.

Full text
11

Koenig, Mark A. "A DECENTRALIZED ADAPTIVE CONTROL SCHEME FOR ROBOTIC MANIPULATORS." Thesis, The University of Arizona, 1985. http://hdl.handle.net/10150/275238.

Full text
12

Cheung, Shun Yan. "Optimizing the performance of quorum consensus replica control protocols." Diss., Georgia Institute of Technology, 1990. http://hdl.handle.net/1853/8150.

Full text
13

Vanderhyde, James. "Topology Control of Volumetric Data." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/16215.

Full text
Abstract:
Three-dimensional scans and other volumetric data sources often result in representations that are more complex topologically than the original model. The extraneous critical points, handles, and components are called topological noise. Many algorithms in computer graphics require simple topology in order to work optimally, including texture mapping, surface parameterization, flows on surfaces, and conformal mappings. The topological noise disrupts these procedures by requiring each small handle to be dealt with individually. Furthermore, topological descriptions of volumetric data are useful for visualization and data queries. One such description is the contour tree (or Reeb graph), which depicts when the isosurfaces split and merge as the isovalue changes. In the presence of topological noise, the contour tree can be too large to be useful. For these reasons, an important goal in computer graphics is simplification of the topology of volumetric data. The key to this thesis is that the global topology of volumetric data sets is determined by local changes at individual points. Therefore, we march through the data one grid cell at a time, and for each cell, we use a local check to determine if the topology of an isosurface is changing. If so, we change the value of the cell so that the topology change is prevented. In this thesis we describe variations on the local topology check for use in different settings. We use the topology simplification procedure to extract a single component with controlled topology from an isosurface in volume data sets and partially-defined volume data sets. We also use it to remove critical points from three-dimensional volumes, as well as time-varying volumes. We have applied the technique to two-dimensional (plus time) data sets and three-dimensional (plus time) data sets.
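The "local check drives global topology" idea can be illustrated in miniature. The sketch below is a hypothetical 2D analogue, not the thesis code: sweeping grid points in ascending value with union-find over already-swept neighbours, a point with no swept neighbour starts a new component (a local minimum is born), and a point joining two or more components records a merge; raising such a point's value is the flavour of "changing the value of the cell so that the topology change is prevented".

```python
# Hypothetical 2D analogue of the local topology check. Sweep grid
# points in ascending value; union-find over already-swept 4-neighbours
# classifies each point as a birth (new component) or a merge event.
def sweep_topology_events(field):
    rows, cols = len(field), len(field[0])
    parent = {}

    def find(p):
        while parent[p] != p:
            parent[p] = parent[parent[p]]  # path compression
            p = parent[p]
        return p

    births = merges = 0
    order = sorted((field[r][c], r, c) for r in range(rows) for c in range(cols))
    for _, r, c in order:
        roots = {find((nr, nc))
                 for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                 if (nr, nc) in parent}
        parent[(r, c)] = (r, c)
        if not roots:
            births += 1        # a new component appears (local minimum)
        else:
            if len(roots) > 1:
                merges += 1    # existing components join (saddle-like event)
            for root in roots:
                parent[root] = (r, c)
    return births, merges
```

For a connected field, merges come out to births minus one; extra births beyond the first correspond to the extraneous critical points the thesis seeks to remove.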
14

Chiu, Lin. "A methodology for designing concurrency control schemes in distributed databases /." The Ohio State University, 1987. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487584612163117.

Full text
15

Mackenzie, Donald, and Richard Fielding. "Control of a Remote Receiving Station and Data Processing at RA Range Hebrides." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611665.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
The Royal Artillery Range (RA Range) is the British Army's weapons practice range in the Outer Hebrides of Scotland. The large sea range is also used by the Royal Air Force and Royal Navy for new weapons system evaluation and in-service practice firing. This paper describes the telemetry facility, comprising two prime sites separated by 40 miles of open sea. Tracking antennas and receivers are at the remote island site of St Kilda, with data processing and control at the Range Control Base (RCB), Benbecula. To improve operational capabilities and effectiveness, full remote control and monitoring of the multiple receivers and combiners has been installed. Radar tracking outputs are processed in the telemetry computer to produce individual antenna-pointing demands.
16

Morgan, Clifford Owen. "Development of computer aided analysis and design software for studying dynamic process operability." Thesis, Georgia Institute of Technology, 1986. http://hdl.handle.net/1853/10187.

Full text
17

Yamashita, Takao. "Dynamic control of distributed loosely coupled replicas for processing weakly consistent data." 京都大学 (Kyoto University), 2006. http://hdl.handle.net/2433/136026.

Full text
18

Tolat, Viral V. "A Software Architecture for Realtime Data Acquisition, Instrument Control and Command Processing." International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/611944.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California
In this paper we describe the flight software for the SETS (Shuttle Electrodynamic Tethered System) experiment. The SETS experiment will fly as part of the TSS-1 (Tethered Satellite System) experiment on STS-46 currently scheduled for July 1992. The software consists of two major components: the SETSOS (SETS Operating System) and the SETS Application. The SETSOS is a UNIX-like operating system developed especially for realtime data acquisition, instrument control and command processing. The SETSOS, like all operating systems, provides resource management for application programs. It is UNIX-like in that access to resources is provided through a standard set of UNIX system calls. The SETSOS also implements the standard UNIX I/O model and a hierarchical file system. In addition to providing access to physical devices, the SETSOS provides support for two virtual devices: a packet-based data device and a command device. The packet-based data device is used by applications to place data into the telemetry stream. The command device is used to manage commands from the command uplink as well as other sources including other applications and other processors. The SETS Application is the primary program which runs under the SETSOS to handle data acquisition, instrument control and command processing. It executes as 5 separate processes, each performing a special task. The tasks include housekeeping data acquisition, limit checking, timeline management, and command processing. The processes communicate via shared memory. Time critical processing is coordinated by using signals and interrupts. In addition to a description of the software, we will discuss the relative merits and tradeoffs of using such a system design for command processing and data acquisition.
19

Snowdon, Jane Louise. "Workflow control for surges from a batch work station." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/25100.

Full text
20

Hartley, Joanna Katherine. "Parallel algorithms for fuzzy data processing with application to water systems." Thesis, Nottingham Trent University, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.296029.

Full text
21

Forman, Michael L., Tushar K. Hazra, Gregory M. Troendly, and William G. Nickum. "APPLYING PC-BASED EMBEDDED PROCESSING FOR REAL-TIME SATELLITE DATA ACQUISITION AND CONTROL." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608833.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
The performance and cost effectiveness of embedded processing has greatly enhanced the personal computer's (PC) capability, particularly when used for real-time satellite data acquisition, telemetry processing, and command and control operations. Utilizing a transputer-based parallel architecture, a modular, reusable, and scalable control system is attainable. The synergism between the personal computer and embedded processing results in efficient, low-cost desktop workstations with up to 1000 MIPS of performance.
22

Sheth, Amit Pravin. "Adaptive concurrency control for distributed database systems /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487262513408523.

Full text
23

Nodine, Dewayne J. "Spatial decision support system for evaluation of land use plans based upon storm water runoff impacts : a theoretical framework." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1020175.

Full text
Abstract:
All land uses affect storm water runoff; however, different uses of the same site generate varying amounts of runoff. Many communities have come to rely upon detention and/or retention basins for controlling the additional runoff resulting from land development. It is argued that this incremental approach to storm water management must be replaced with a more proactive long-term view. To achieve this, more user-friendly software capable of modeling the effect long-range land use plans have on the volume and behavior of storm water runoff is needed. This software, called a Spatial Decision Support System (SDSS), must be capable of guiding the user, who may not be an expert at runoff analysis, through the process, and also capable of generating output in various formats understandable by lay persons. This study utilizes a systems analysis technique to develop a theoretical framework for the Storm Water SDSS.
Department of Urban Planning
24

Swientek, Martin. "High-performance near-time processing of bulk data." Thesis, University of Plymouth, 2015. http://hdl.handle.net/10026.1/3461.

Full text
Abstract:
Enterprise systems like customer-billing systems or financial transaction systems are required to process large volumes of data in a fixed period of time. Those systems are increasingly required to also provide near-time processing of data to support new service offerings. Common systems for data processing are optimized either for high maximum throughput or for low latency. This thesis proposes the concept of an adaptive middleware, a new approach to designing systems for bulk data processing. The adaptive middleware is able to adapt its processing type fluently between batch processing and single-event processing. By using message aggregation, message routing and a closed feedback loop to adjust the data granularity at runtime, the system is able to minimize the end-to-end latency for different load scenarios. The relationship between end-to-end latency and throughput of batch and message-based systems is formally analyzed, and a performance evaluation of both processing types has been conducted. Additionally, the impact of message aggregation on throughput and latency is investigated. The proposed middleware concept has been implemented in a research prototype and evaluated. The results of the evaluation show that the concept is viable and is able to optimize the end-to-end latency of a system. The design, implementation and operation of an adaptive system for bulk data processing differ from common approaches to implementing enterprise systems. A conceptual framework has been developed to guide the process of building adaptive software for bulk data processing. It defines the needed roles and their skills, the necessary tasks and their relationships, the artifacts created and required by different tasks, the tools needed to process the tasks, and the processes that describe the order of tasks.
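The closed feedback loop that adjusts data granularity at runtime can be sketched as follows. This is a hypothetical illustration of the idea, not the thesis middleware; the class name, the doubling/halving policy, and the limits are invented.

```python
# Hypothetical feedback loop adapting message aggregation at runtime:
# below the latency target, aggregate more (toward batch processing,
# higher throughput); above it, aggregate less (toward single-event
# processing, lower latency).
class AdaptiveAggregator:
    def __init__(self, target_latency, min_batch=1, max_batch=1000):
        self.target = target_latency
        self.min_batch, self.max_batch = min_batch, max_batch
        self.batch_size = min_batch

    def adjust(self, measured_latency):
        # Simple multiplicative policy; a real controller could use
        # smoother gains, but the feedback structure is the point.
        if measured_latency < self.target:
            self.batch_size = min(self.max_batch, self.batch_size * 2)
        else:
            self.batch_size = max(self.min_batch, self.batch_size // 2)
        return self.batch_size
```

Driving the batch size from measured latency is what lets one system slide fluently between the batch and single-event ends of the spectrum as load changes.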
25

Wad, Charudatta V. "QoS : quality driven data abstraction for large databases." Worcester, Mass. : Worcester Polytechnic Institute, 2008. http://www.wpi.edu/Pubs/ETD/Available/etd-020508-151213/.

Full text
26

Alici, Semra. "Dynamic data reconciliation using process simulation software and model identification tools." Access restricted to users with UT Austin EID Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3025133.

Full text
27

Benson, Glenn Stuart. "A formal protection model of security in distributed systems." Diss., Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/12238.

Full text
28

Lakshmanan, Nithya M. "Estimation and control of nonlinear batch processes using multiple linear models." Thesis, Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/11835.

Full text
29

Jones, Patricia Marie. "Constructing and validating a model-based operator's associate for supervisory control." Thesis, Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/24274.

Full text
30

Calitz, Wietsche Roets. "Independent formant and pitch control applied to singing voice." Thesis, Stellenbosch : University of Stellenbosch, 2004. http://hdl.handle.net/10019.1/16267.

Full text
Abstract:
Thesis (MScIng)--University of Stellenbosch, 2004.
ENGLISH ABSTRACT: A singing voice can be manipulated artificially by means of a digital computer for the purposes of creating new melodies or to correct existing ones. When the fundamental frequency of an audio signal that represents a human voice is changed by simple algorithms, the formants of the voice tend to move to new frequency locations, making it sound unnatural. The main purpose is to design a technique by which the pitch and formants of a singing voice can be controlled independently.
31

Yung, King Stanley (容勁). "Application of multi-agent technology to supply chain management." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B31223886.

Full text
32

Xu, Yifan. "New data synchronization & mapping strategies for PACE - VLSI processor architecture." Thesis, University of Nottingham, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.283229.

Full text
33

Huang, Shiping. "Exploratory visualization of data with variable quality." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-01115-225546/.

Full text
34

Lai, Ho-yin Albert (黎浩然). "Artificial intelligence based thermal comfort control with CFD modelling." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B3122278X.

Full text
35

Ogidan, Olugbenga Kayode. "Design of nonlinear networked control for wastewater distributed systems." Thesis, Cape Peninsula University of Technology, 2014. http://hdl.handle.net/20.500.11838/1201.

Full text
Abstract:
Thesis submitted in fulfilment of the requirements for the degree Doctor of Technology: Electrical Engineering in the Faculty of Engineering at the Cape Peninsula University of Technology 2014
This thesis focuses on the design, development and real-time simulation of a robust nonlinear networked control for the dissolved oxygen (DO) concentration as part of wastewater distributed systems. This concept differs from previous methods of wastewater control in that the controller and the wastewater treatment plants are separated by a wide geographical distance and exchange data through a communication medium. The communication network introduced between the controller and the DO process creates imperfections during its operation, such as time delays, which are an object of investigation in this thesis. Due to these imperfections, new control strategies that take cognisance of the network imperfections during controller design are needed to provide adequate robustness for the DO process control system. The thesis first investigates the effects of constant and random network-induced time delays, and the effects of controller parameters, on the DO process behaviour, with a view to using the obtained information to design an appropriate controller for the networked closed-loop system. On the basis of this information, a Smith predictor delay compensation controller is developed to eliminate the dead time, provide robustness and improve the performance of the DO process. Two approaches are adopted in the design of the Smith predictor compensation scheme. The first is the transfer function approach, which allows a linearized model of the DO process to be described in the frequency domain. The second is the nonlinear linearising approach in the time domain. Simulation results reveal that the developed Smith predictor controllers out-performed the nonlinear linearising controller designed for the DO process without time delays, compensating for the network imperfections and maintaining the DO concentration within a desired acceptable level.
The transfer function approach to designing the Smith predictor is found to perform better under small time delays, but its performance deteriorates under large time delays and disturbances. It also responds faster than the nonlinear approach. The nonlinear feedback linearising approach is slower in response time but out-performs the transfer function approach in providing robustness and performance for the DO process under large time delays and disturbances. The developed Smith predictor compensation schemes were later simulated in a real-time platform using LabVIEW. The Smith predictor controllers developed in this thesis can be applied to process control plants other than wastewater plants, wherever distributed control is required. They can also be applied in nuclear reactor plants where remote control is required under hazardous conditions. The developed LabVIEW real-time simulation environment would be a valuable tool for researchers and students in the field of control system engineering. Lastly, this thesis forms a basis for further research in the field of distributed wastewater control.
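The Smith predictor's dead-time compensation can be illustrated with a minimal discrete-time sketch. This is a generic textbook-style construction, not the thesis controllers: a PI law acts on a delay-free internal model, and the delayed model output corrects the measured plant output, so the delay is effectively removed from the feedback path. The first-order model and all tuning values are invented.

```python
# Illustrative discrete-time Smith predictor with a PI law. The
# internal model y[k+1] = a*y[k] + b*u[k] and the gains are invented.
from collections import deque

class SmithPredictorPI:
    def __init__(self, kp, ki, a, b, delay_steps):
        self.kp, self.ki = kp, ki
        self.a, self.b = a, b
        self.integral = 0.0
        self.y_model = 0.0                        # delay-free model output
        self.buffer = deque([0.0] * delay_steps)  # model output, delayed

    def step(self, setpoint, y_measured):
        # Feedback = delay-free model prediction, corrected by the
        # mismatch between the measurement and the delayed model output.
        feedback = self.y_model + (y_measured - self.buffer[0])
        error = setpoint - feedback
        self.integral += error
        u = self.kp * error + self.ki * self.integral
        # Advance the internal model and its delay line.
        self.y_model = self.a * self.y_model + self.b * u
        self.buffer.append(self.y_model)
        self.buffer.popleft()
        return u
```

With a matched model the delay drops out of the loop entirely, so the PI gains can be tuned as if the process had no dead time; robustness questions arise exactly when the model and delay estimate are imperfect, which is what the two thesis approaches address.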
36

Burdis, Keith Robert. "Distributed authentication for resource control." Thesis, Rhodes University, 2000. http://hdl.handle.net/10962/d1006512.

Full text
Abstract:
This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and to provide additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys. We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms we developed the Secure Remote Password SASL mechanism (SRP-SASL), which provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
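The password-verifier idea at the heart of SRP-based mechanisms such as SRPGM and SRP-SASL can be sketched briefly. This follows the general SRP scheme (in the spirit of RFC 2945) but with illustrative parameters: the modulus below is not a standardised SRP group, and the code is not from the thesis.

```python
# Hedged sketch of the SRP password-verifier computation. The server
# stores only (salt, v) instead of the password; authentication then
# proceeds via an exchange built on v, never sending the password.
import hashlib

N = 2**127 - 1   # toy modulus; real SRP uses a large standardised safe prime
g = 2

def make_verifier(salt: bytes, username: bytes, password: bytes) -> int:
    # x = H(salt | H(username ":" password)),  v = g^x mod N
    inner = hashlib.sha256(username + b":" + password).digest()
    x = int.from_bytes(hashlib.sha256(salt + inner).digest(), "big")
    return pow(g, x, N)

verifier = make_verifier(b"salt123", b"alice", b"correct horse")
```

Because only the verifier is stored, no long-term asymmetric key pair is needed, which is the "low infrastructure requirements" property the abstract highlights for SRPGM.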
37

Okere, Irene Onyekachi. "A control framework for the assessment of information security culture." Thesis, Nelson Mandela Metropolitan University, 2013. http://hdl.handle.net/10948/d1019861.

Full text
Abstract:
The modern organisation relies heavily on information to function effectively. With such reliance on information, it is vital that information be protected from both internal (employees) and external threats. The protection of information or information security to a large extent depends on the behaviour of humans (employees) in the organisation. The behaviour of employees is one of the top information security issues facing organisations as the human factor is regarded as the weakest link in the security chain. To address this human factor many researchers have suggested the fostering of a culture of information security so that information security becomes second nature to employees. Information security culture as defined for this research study exists in four levels namely artefacts, espoused values, shared tacit assumptions and information security knowledge. An important step in the fostering of an information security culture is the assessment of the current state of such a culture. Gaps in current approaches for assessing information security culture were identified and this research study proposes the use of a control framework to address the identified gaps. This research study focuses on the assessment of information security culture and addresses 5 research objectives namely 1) to describe information security culture in the field of information security, 2) to determine ways to foster information security culture in an organisation, 3) to demonstrate the gap in current approaches used to assess information security culture, 4) to determine the components that could be used for the assessment of information security culture for each of the culture’s underlying levels and 5) to describe a process for the assessment of information security culture for all four levels. 
This research study follows a qualitative approach utilising a design science strategy and multi-method qualitative data collection techniques including literature review, qualitative content analysis, argumentation, and modelling techniques. The research methods provide a means for the interpretation of the data and the development of the proposed control framework.
APA, Harvard, Vancouver, ISO, and other styles
38

Hernández, Correa Evelio. "Control of nonlinear systems using input-output information." Diss., Georgia Institute of Technology, 1992. http://hdl.handle.net/1853/11176.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Paul, Douglas James. "Parallel microcomputer control of a 3DOF robotic arm." Thesis, Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/18371.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Tian, Yu-Chu. "Dynamics analysis and integrated design of real-time control systems." School of Electrical and Information Engineering, 2009. http://hdl.handle.net/2123/5743.

Full text
Abstract:
Doctor of Philosophy (PhD)
Real-time control systems are widely deployed in many applications. Theory and practice for the design and deployment of real-time control systems have evolved significantly. From the design perspective, control strategy development has been the focus of the research in the control community. In order to develop good control strategies, process modelling and analysis have been investigated for decades, and stability analysis and model-based control have been heavily studied in the literature. From the implementation perspective, real-time control systems require timeliness and predictable timing behaviour in addition to logical correctness, and a real-time control system may behave very differently with different software implementations of the control strategies on a digital controller, which typically has limited computing resources. Most current research activities on software implementations concentrate on various scheduling methodologies to ensure the schedulability of multiple control tasks in constrained environments. Recently, more and more real-time control systems are implemented over data networks, leading to increasing interest worldwide in the design and implementation of networked control systems (NCS). Major research activities in NCS include control-oriented and scheduling-oriented investigations. In spite of significant progress in the research and development of real-time control systems, major difficulties exist in the state of the art. A key issue is the lack of integrated design for control development and its software implementation. For control design, the model-based control technique, the current focus of control research, does not work when a good process model is not available or is too complicated for control design. 
For control implementation on digital controllers running multiple tasks, the system schedulability is essential but is not enough; the ultimate objective of satisfactory quality-of-control (QoC) performance has not been addressed directly. For networked control, the majority of the control-oriented investigations are based on two unrealistic assumptions about the network induced delay. The scheduling-oriented research focuses on schedulability and does not directly link to the overall QoC of the system. General solutions with direct QoC consideration from the network perspective to the challenging problems of network delay and packet dropout in NCS have not been found in the literature. This thesis addresses the design and implementation of real-time control systems with regard to dynamics analysis and integrated design. Three related areas have been investigated, namely control development for controllers, control implementation and scheduling on controllers, and real-time control in networked environments. Seven research problems are identified from these areas for investigation in this thesis, and accordingly seven major contributions have been claimed. Timing behaviour, quality of control, and integrated design for real-time control systems are highlighted throughout this thesis. In control design, a model-free control technique, pattern predictive control, is developed for complex reactive distillation processes. Alleviating the requirement of accurate process models, the developed control technique integrates pattern recognition, fuzzy logic, non-linear transformation, and predictive control into a unified framework to solve complex problems. Characterising the QoC indirectly with control latency and jitter, scheduling strategies for multiple control tasks are proposed to minimise the latency and/or jitter. 
Also, a hierarchical, QoC driven, and event-triggering feedback scheduling architecture is developed with plug-ins of either the earliest-deadline-first or fixed priority scheduling. Linking to the QoC directly, the architecture minimises the use of computing resources without sacrifice of the system QoC. It considers the control requirements, but does not rely on the control design. For real-time NCS, the dynamics of the network delay are analysed first, and the nonuniform distribution and multi-fractal nature of the delay are revealed. These results do not support two fundamental assumptions used in existing NCS literature. Then, considering the control requirements, solutions are provided to the challenging NCS problems from the network perspective. To compensate for the network delay, a real-time queuing protocol is developed to smooth out the time-varying delay and thus to achieve more predictable behaviour of packet transmissions. For control packet dropout, simple yet effective compensators are proposed. Finally, combining the queuing protocol, the packet loss compensation, the configuration of the worst-case communication delay, and the control design, an integrated design framework is developed for real-time NCS. With this framework, the network delay is limited to within a single control period, leading to simplified system analysis and improved QoC.
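The delay-smoothing idea behind the real-time queuing protocol, trading a larger but constant delay for predictability by releasing every packet only once a fixed worst-case bound has elapsed, can be sketched as follows (an illustrative toy, not the protocol developed in the thesis; the class and method names are invented):

```python
import heapq
import itertools

class DelaySmoother:
    """Hold each packet until a fixed worst-case delay D has elapsed since it
    was sent, so the controller observes a constant delay instead of a
    time-varying one. Packets arriving early simply wait in the queue."""

    def __init__(self, worst_case_delay: float):
        self.D = worst_case_delay
        self._heap = []                 # (release_time, seq, payload)
        self._seq = itertools.count()   # tie-breaker for equal release times

    def on_arrival(self, send_time: float, payload) -> None:
        # Timestamped at the sender; release exactly D after sending.
        heapq.heappush(self._heap, (send_time + self.D, next(self._seq), payload))

    def release(self, now: float) -> list:
        """Return all packets whose smoothed delay has elapsed by `now`."""
        out = []
        while self._heap and self._heap[0][0] <= now:
            out.append(heapq.heappop(self._heap)[2])
        return out
```

With the worst-case bound configured to stay within one control period, the closed loop can then be analysed as a constant-delay system, which is the simplification the integrated design framework exploits.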
APA, Harvard, Vancouver, ISO, and other styles
41

Tian, Yu-Chu. "Dynamics analysis and integrated design of real-time control systems." Connect to full text, 2008. http://ses.library.usyd.edu.au/handle/2123/5743.

Full text
Abstract:
Thesis (Ph. D.)--University of Sydney, 2009.
Title from title screen (viewed November 30, 2009). Submitted in fulfilment of the requirements for the degree of Doctor of Philosophy to the School of Electrical and Information Engineering in the Faculty of Engineering & Information Technologies. Degree awarded 2009; thesis submitted 2008. Includes bibliographical references. Also available in print form.
APA, Harvard, Vancouver, ISO, and other styles
42

Viljoen, Melanie. "A framework towards effective control in information security governance." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/887.

Full text
Abstract:
The importance of information in business today has made the need to properly secure this asset evident. Information security has become a responsibility for all managers of an organization. To better support more efficient management of information security, timely information security management information should be made available to all managers. Smaller organizations face special challenges with regard to information security management and reporting due to limited resources (Ross, 2008). This dissertation discusses a Framework for Information Security Management Information (FISMI) that aims to improve the visibility and contribute to better management of information security throughout an organization by enabling the provision of summarized, comprehensive information security management information to all managers in an affordable manner.
APA, Harvard, Vancouver, ISO, and other styles
43

Wu, Qinyi. "Partial persistent sequences and their applications to collaborative text document editing and processing." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/44916.

Full text
Abstract:
In a variety of text document editing and processing applications, it is necessary to keep track of the revision history of text documents by recording changes and the metadata of those changes (e.g., user names and modification timestamps). The recent Web 2.0 document editing and processing applications, such as real-time collaborative note taking and wikis, require fine-grained shared access to collaborative text documents as well as efficient retrieval of metadata associated with different parts of collaborative text documents. Current revision control techniques only support coarse-grained shared access and are inefficient to retrieve metadata of changes at the sub-document granularity. In this dissertation, we design and implement partial persistent sequences (PPSs) to support real-time collaborations and manage metadata of changes at fine granularities for collaborative text document editing and processing applications. As a persistent data structure, PPSs have two important features. First, items in the data structure are never removed. We maintain necessary timestamp information to keep track of both inserted and deleted items and use the timestamp information to reconstruct the state of a document at any point in time. Second, PPSs create unique, persistent, and ordered identifiers for items of a document at fine granularities (e.g., a word or a sentence). As a result, we are able to support consistent and fine-grained shared access to collaborative text documents by detecting and resolving editing conflicts based on the revision history as well as to efficiently index and retrieve metadata associated with different parts of collaborative text documents. We demonstrate the capabilities of PPSs through two important problems in collaborative text document editing and processing applications: data consistency control and fine-grained document provenance management. 
The first problem studies how to detect and resolve editing conflicts in collaborative text document editing systems. We approach this problem in two steps. In the first step, we use PPSs to capture data dependencies between different editing operations and define a consistency model more suitable for real-time collaborative editing systems. In the second step, we extend our work to the entire spectrum of collaborations and adapt transactional techniques to build a flexible framework for the development of various collaborative editing systems. The generality of this framework is demonstrated by its capabilities to specify three different types of collaborations as exemplified in the systems of RCS, MediaWiki, and Google Docs respectively. We precisely specify the programming interfaces of this framework and describe a prototype implementation over Oracle Berkeley DB High Availability, a replicated database management engine. The second problem of fine-grained document provenance management studies how to efficiently index and retrieve fine-grained metadata for different parts of collaborative text documents. We use PPSs to design both disk-economic and computation-efficient techniques to index provenance data for millions of Wikipedia articles. Our approach is disk economic because we only save a few full versions of a document and only keep delta changes between those full versions. Our approach is also computation-efficient because we avoid the necessity of parsing the revision history of collaborative documents to retrieve fine-grained metadata. Compared to MediaWiki, the revision control system for Wikipedia, our system uses less than 10% of disk space and achieves at least an order of magnitude speed-up to retrieve fine-grained metadata for documents with thousands of revisions.
APA, Harvard, Vancouver, ISO, and other styles
44

Morris, J. W. "The development of techniques to select a control policy during proactive on-line planning and control." Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52513.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2001.
ENGLISH ABSTRACT: The worldwide trend for systems is to become more complex. This leads to the need for new ways to control these complex systems. A relatively new approach for controlling systems, called on-line planning and control, poses many potential benefits to a variety of end-users, especially in the manufacturing environment. Davis [3] developed a framework for on-line planning and control that is currently incomplete. This project aims to fill one of the gaps in the framework by automating one of the functions, eliminating the need for a human observer. This function, the real-time compromise analysis function, does the comparison of the statistical performance estimates to select a control policy for implementation in the system being controlled (the real-world system) at the current moment in time. In this project, two techniques were developed to automate the function. The first technique is based on a common technique for statistically comparing two systems, the paired-t confidence interval technique. The paired-t confidence interval technique is used to compare the control policies by building confidence intervals of the expected differences for the respective performance criteria and testing the hypothesis that the statistical performance estimates of the one control policy are better than those of the other control policy. The results of these comparisons are then consolidated into a compromise function that is used to determine the control policy to be implemented currently in the real-world system. The second developed technique is derived, but differs greatly, from Davis's [3] dominance probability density function approach, and it includes principles of the paired-t confidence interval technique. It compares the control policies by determining the probability (confidence level) with which one can assume that the performance criterion of the one control policy will provide a performance value that is better than the other's and vice versa.
These confidence levels are then aggregated into a single compromise function that is used to determine the control policy to be implemented currently in the real-world system. After the techniques were developed, it was not possible to determine their efficiency mathematically, because their statistical base is suspect. The techniques needed to be implemented before they could be evaluated and it was decided to develop an emulator of the on-line planning and control process in accordance with the framework given by Davis [3] to implement them. This Emulator is in essence a Visual Basic program that uses Arena models. However, this Emulator needed certain deviations from the framework to make it possible. Firstly, while the systems that will be controlled with the on-line planning and control process will be complex systems, the system controlled in the Emulator is only a straightforward M/M/1/FIFO/∞/∞ system. This allowed for the conditions that have not been addressed sufficiently, e.g. the initialising of the system models, to be bypassed. Secondly, the Emulator does not include all parts of the framework, and parts for which the technology does not currently exist have been excluded. Thirdly, the real-world system is replaced with a model, because a real-world system was not available for the study. Finally, concurrent operations are actually done sequentially, but in a way that makes it seem that they were done concurrently, so as not to influence the results. This Emulator was used to analyse both techniques for two different traffic intensities. The first part of the analysis consisted of an off-line non-terminating analysis of the individual control policies of the system. This was used as a baseline against which the on-line planning and control process of the Emulator was evaluated.
The findings of the evaluations were that, at the traffic intensities evaluated, the techniques provided results that were very similar to the results of the best individual control policy. From these results, it was speculated that at different traffic intensities, different control policies would be better than the techniques themselves, while the techniques will only give slightly worse results. In addition, because the on-line planning and control process attempts to respond to changing conditions, it can be assumed that the techniques will excel in those conditions where the input distribution is changing continuously. It is also speculated that the techniques may be advantageous in cases where it is not possible to determine beforehand which of the individual control policies to use because it is impossible to predict the input distribution that will occur. It is expected that the techniques will give good (but unfortunately, not necessarily the best) results for any input distribution, while an individual control policy that may give the best results for one input distribution, may prove disastrous for another input distribution. Three important conclusions can be made from the project. Firstly, it is possible to automate the real-time compromise analysis function. Secondly, an emulator can be developed to evaluate the techniques for the real-time compromise analysis. The greatest advantage of this Emulator is that it can run significantly faster than real-time, enabling the generation of enough data to make the significant statistical comparisons needed to evaluate the techniques. The final conclusion is that while initial evaluations are inconclusive, it can be shown that the techniques warrant further study. Three important recommendations can be made from the project. Firstly, the techniques need to be studied further, because it cannot be claimed that they are perfect, or that they are the only possible techniques that will work.
In fact, they are merely techniques that may work and other techniques may still prove to be better. Secondly, because it would be foolhardy to assume that the Emulator is complete, the Emulator needs to be improved with the most critical need to develop the Emulator in a programming language and simulation package that allows concurrent operations and effortless initialisation. This will enable the Emulator to be much faster and a lot more flexible. The final recommendation is that the techniques need to be evaluated with other parameters in other increasingly complex systems, culminating in the evaluation of the on-line planning and control process with the techniques included in a real-world flexible manufacturing system. Only then can there be decided conclusively on whether the techniques are efficient or not. It is hoped that this project will form a valuable building block that will facilitate making on-line planning and control a viable alternative to controlling complex systems, enabling them to respond better to changing conditions that are currently becoming the norm.
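The paired-t comparison described in the English abstract can be sketched as follows (an illustrative sketch, not the thesis's implementation; the compromise function aggregating multiple performance criteria is omitted, and `t_crit` is the Student-t critical value for the chosen confidence level and n-1 degrees of freedom):

```python
import math
import statistics

def paired_t_ci(a, b, t_crit):
    """Paired-t confidence interval for E[a_i - b_i], built from paired
    replications of two control policies on a common performance criterion."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = statistics.fmean(diffs)
    half = t_crit * statistics.stdev(diffs) / math.sqrt(n)
    return mean - half, mean + half

def better_policy(a, b, t_crit):
    """Select a policy when the interval excludes zero; otherwise call a tie.
    Assumes a smaller-is-better criterion, e.g. mean flow time."""
    lo, hi = paired_t_ci(a, b, t_crit)
    if hi < 0:
        return "A"
    if lo > 0:
        return "B"
    return "tie"
```

In the on-line setting, such per-criterion hypothesis tests would be consolidated into the compromise function that picks the policy to deploy at the current moment.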
AFRIKAANSE OPSOMMING: Wêreldwyd is stelsels besig om meer ingewikkeld te raak. Dit bring mee dat nuwe metodes benodig word om hierdie ingewikkelde stelsels te beheer. Gekoppelde beplanning en beheer ("On-line planning and control") is 'n relatiewe nuwe metode om stelsels te beheer en het baie moontlike voordele vir 'n verskeidenheid van gebruikers, veral in die vervaardigingsomgewing. Davis [3] het 'n raamwerk ontwikkel vir gekoppelde beplanning en beheer, maar die raamwerk is tans onvolledig. Hierdie projek het gepoog om een van die gapings in die raamwerk te vul deur een van die funksies te outomatiseer en sodoende die behoefte vir 'n menslike waarnemer te elimineer. Hierdie funksie, die intydse-kompromie-analise-funksie ("real-time compromise analysis function"), is verantwoordelik vir die vergelyking van die statistiese prestasieskattings om 'n beheerbeleid te kies wat geïmplementeer moet word in die stelsel wat beheer word (die regtewêreld -stelsel). Die projek het twee tegnieke ontwikkel om die funksie te outomatiseer. Die eerste tegniek is gebaseer op 'n algemene tegniek om twee stelsels statisties met mekaar te vergelyk, naamlik die gepaarde-t vertrouensinterval-tegniek. Die gepaarde-t vertrouensinterval-tegniek word gebruik om die beheerbeleide te vergelyk deur vertrouensintervalle te bou van die verwagte verskille vir die verskillende vertoningskriteria en om die hipotese te toets dat die statistiese prestasieskattings van die een beheerbeleid beter is as dié van 'n ander beheerbeleid. Die resultate van hierdie vergelykings word dan gekonsolideer in 'n kompromiefunksie wat gebruik word om te bepaal watter beheerbeleid tans geïmplementeer moet word in die regte-wêreld-stelsel. Die tweede ontwikkelde tegniek is afgelei, maar verskil baie, van Davis [3] se oorheersende waarskynlikheidsdigtheid-funksie ("dominance probability density function") -benadering en gebruik ook idees van die gepaarde-t vertrouensinterval-tegniek. 
Dit vergelyk die beheerbeleide deur die waarskynlikheid (vertrouensvlak) te bereken waarmee aanvaar kan word dat die vertoningskriterion van een van die beheerbeleide 'n beter vertoningswaarde sal hê as die ander, en omgekeerd. Hierdie vertrouensvlakke word dan gekonsolideer in 'n kompromiefunksie wat gebruik word om te bepaal watter beheerbeleid tans geïmplementeer moet word in die regte-wêreld-stelsel. Nadat die tegnieke ontwikkel is, was dit nie moontlik om hulle effektiwiteit wiskundig te evalueer nie, want hulle statistiese basis is verdag. Dus moes die tegnieke geïmplementeer word voordat hulle geëvalueer kon word. Daar is besluit om 'n emuleerder van die proses van gekoppelde beplanning en beheer te ontwikkel volgens die raamwerk wat deur Davis [3] ontwikkel is sodat die tegnieke geïmplementeer kan word. Hierdie Emuleerder is 'n Visual Basic-program wat Arena-modelle gebruik. Om die Emuleerder moontlik te maak, was sekere afwykings van die raamwerk nodig. Die eerste hiervan is dat die stelsels wat beheer word met gekoppelde beplanning en beheer, komplekse stelsels is, maar dat die stelsel wat deur die Emuleerder beheer word, slegs 'n eenvoudige M/M/1/EIEB/∞/∞-sisteem is. Dit maak dit moontlik om aspekte wat nog nie genoegsaam aangespreek is nie, byvoorbeeld die inisiëring van die stelselmodelle, te omseil. Tweedens bevat die Emuleerder nie al die dele van die raamwerk nie en dele waarvoor die tegnologie tans nog nie bestaan nie, is uitgelaat. Derdens, die regte-wêreld-stelsel is vervang met 'n model, want 'n regte-wêreld-stelsel was nie beskikbaar nie. Laastens is operasies wat eintlik gelyktydig gedoen moes word, sekwensieel gedoen, maar op so 'n manier dat dit lyk asof hulle gelyktydig gedoen is, sodat die resultate nie beïnvloed word nie. Die Emuleerder is gebruik om beide tegnieke te analiseer vir twee verskillende verkeersdigthede.
Die eerste deel van die analise het bestaan uit 'n nie-terminerende analise van die individuele beheerbeleide van die stelsel. Dit is gebruik as 'n basislyn waarteen die Emuleerder se proses van gekoppelde beplanning en beheer geëvalueer is. Die bevindinge van die evaluasie was dat vir die verkeersdigthede wat geëvalueer is, die tegnieke resultate lewer wat vergelykbaar is met dié van die beste individuele beheerbeleide. Oor hierdie resultate is daar gespekuleer dat by verskillende verkeersdigthede, verskillende beheerbeleide beter sal vaar as die tegnieke, terwyl die tegnieke slegs marginaal swakker resultate sal lewer. En omdat gekoppelde beplanning en beheer poog om te reageer op veranderende omstandighede, kan dit aanvaar word dat die tegnieke sal presteer in omstandighede waar die toevoerverdeling die heeltyd verander. Dit word ook beweer dat die tegnieke tot voordeel sal wees in gevalle waar dit nie moontlik is om vooraf te bepaal watter van die individuele beheerbeleide om te gebruik nie, omdat dit onmoontlik is om te voorspel watter toevoerverdeling gerealiseer gaan word. Dit word verwag dat die tegnieke goeie (maar ongelukkig nie noodwendig die beste nie) resultate sal lewer vir enige toevoerverdeling, terwyl 'n individuele beheerbeleid wat moontlik die beste resultate vir die een toevoerverdeling sal gee, katastrofies kan wees vir 'n ander toevoerverdeling. Drie belangrike gevolgtrekkings kan gemaak word van die projek. Eerstens, dit is moontlik om die intydse-kompromie-analise-funksie te outomatiseer. Tweedens, 'n emuleerder kan ontwikkel word om die tegnieke vir die intydse-kompromie-analise te evalueer. Die grootste voordeel van die Emuleerder is dat dit heelwat vinniger as reële tyd kan opereer, wat dit moontlik maak om genoeg data te genereer om die betekenisvolle statistiese vergelykings te maak wat benodig word om die tegnieke te evalueer.
Die laaste gevolgtrekking is dat, alhoewel die aanvanklike evaluasie nie beslissend is nie, dit gewys kan word dat die tegnieke verdere studie verdien. Drie belangrike aanbevelings kan gemaak word vanuit die projek. Eerstens, die tegnieke moet nog verder bestudeer word, omdat daar nie beweer kan word dat hulle perfek is of dat hulle die enigste tegnieke is wat kan werk nie. Om die waarheid te sê, hulle is slegs tegnieke wat moontlik kan werk en ander tegnieke kan steeds bewys word om beter te wees. Tweedens sou dit onsinnig wees om te beweer dat die Emuleerder volledig is, en moet die Emuleerder nog verbeter word. Die mees kritiese vereiste is om die Emuleerder te ontwikkel in 'n programmeringstaal en simulasiepakket wat gelyktydige operasies en moeitelose inisiëring toelaat. Dit sal die Emuleerder toelaat om baie vinniger en meer buigsaam te wees. Die laaste aanbeveling is dat die tegnieke geëvalueer moet word met ander parameters in ander stelsels van stygende kompleksiteit, wat die hoogtepunt bereik in die evaluasie van die proses van gekoppelde beplanning en beheer met die tegnieke ingesluit in 'n regte-wêreld buigbare vervaardigingstelsel ("flexible manufacturing system"). Slegs dan sal dit moontlik wees om onomwonde te sê of die tegnieke effektief is of nie. Daar word gehoop dat hierdie projek 'n waardevolle boublok sal vorm wat sal bydra om gekoppelde beplanning en beheer 'n uitvoerbare alternatief te maak vir die beheer van komplekse stelsels, omdat dit hulle sal toelaat om beter te reageer op die veranderende omstandighede wat deesdae die norm is.
APA, Harvard, Vancouver, ISO, and other styles
45

Knauer, Christian, and Klaus Ralf Nötzel. "REENGINEERING A TRADITIONAL SPACECRAFT CONTROL CENTER." International Foundation for Telemetering, 2001. http://hdl.handle.net/10150/607691.

Full text
Abstract:
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Deutsche Telekom has been operating various communication satellites since 1989. The SCC (spacecraft control center) is located near Frankfurt, Germany. The entire system is based on antenna/RF equipment, baseband equipment, and computer software packages running on a network of different machines. Due to increased maintenance effort, the old baseband system needed to be replaced. This also affected the computer system, especially the M&C. The aim was to design the entire system in a way that minimizes both operating cost and human intervention. This paper shows the successful real-world project of reengineering a traditional spacecraft control center (SCC) and how a fifteen-year-old hardware (baseband system) and software design was replaced by a modern concept during normal operations. The new software packages execute all necessary tasks for spacecraft and ground-station control. The Monitor and Control System (M&C) is a database-driven design (FRAMTEC, from CAM Germany).
APA, Harvard, Vancouver, ISO, and other styles
46

Xu, Cheng. "Authenticated query processing in the cloud." HKBU Institutional Repository, 2019. https://repository.hkbu.edu.hk/etd_oa/620.

Full text
Abstract:
With recent advances in data-as-a-service (DaaS) and cloud computing, outsourcing data to the cloud has become a common practice. In a typical scenario, the data owner (DO) outsources the data and delegates the query processing service to a service provider (SP). However, as the SP is often an untrusted third party, the integrity of the query results cannot be guaranteed and must therefore be authenticated. To tackle this issue, a typical approach is to let the SP provide a cryptographic proof, which the clients can use to verify the soundness and completeness of the query results. Despite extensive research on authenticated query processing for outsourced databases, existing techniques have only considered limited query types. They fail to address a variety of needs demanded by enterprise customers such as supporting aggregate queries over set-valued data, enforcing fine-grained access control, and using distributed computing paradigms. In this dissertation, we take the first step to comprehensively investigate authenticated query processing in the cloud that fulfills the aforementioned requirements. Security analysis and performance evaluation show that the proposed solutions and techniques are robust and efficient under a wide range of system settings.
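A common building block for such cryptographic proofs is the Merkle hash tree: the DO signs the root digest, and the SP returns each result together with a sibling path from which the client recomputes the root. A minimal sketch (illustrative only; real authenticated query schemes use richer authenticated data structures than this):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a Merkle hash tree over the leaf hashes, duplicating the
    last node on odd-sized levels."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify(leaf, proof, root) -> bool:
    """Recompute the root from a returned leaf and its sibling path;
    `proof` is a list of (sibling_hash, sibling_is_left) pairs from the SP."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root
```

A match against the DO-signed root establishes soundness (the result really is the DO's data); completeness additionally requires proving that no qualifying record was omitted, which the dissertation's constructions address beyond this sketch.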
APA, Harvard, Vancouver, ISO, and other styles
47

Barr, Thomas W. "Development of a graphics interface for the Savannah River control program "Savres"." Thesis, Georgia Institute of Technology, 1992. http://hdl.handle.net/1853/20978.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Leach, Christopher. "Novel Internet based methods for chemical information control." Thesis, Imperial College London, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.300623.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Licona-Nunez, Jorge Estuardo. "M-ary Runlength Limited Coding and Signal Processing for Optical Data Storage." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5195.

Full text
Abstract:
Recent attempts to increase the capacity of the compact disc (CD) and digital versatile disc (DVD) have explored the use of multilevel recording instead of binary recording. Systems that achieve an increase in capacity of about three times that of conventional CD have been proposed for production. Marks in these systems are multilevel and fixed-length as opposed to binary and variable-length in CD and DVD. The main objective of this work is to evaluate the performance of multilevel (M-ary) runlength-limited (RLL) coded sequences in optical data storage. First, the waterfilling capacity of a multilevel optical recording channel (M-ary ORC) is derived and evaluated. This provides insight into the achievable user bit densities, as well as a theoretical limit against which simulated systems can be compared. Then, we evaluate the performance of RLL codes on the M-ary ORC. A new channel model that includes the runlength constraint in the transmitted signal is used. We compare the performance of specific RLL codes, namely M-ary permutation codes, to that of real systems using multilevel fixed-length marks for recording and the theoretical limits. The Viterbi detector is used to estimate the original recorded symbols from the readout signal. Then, error correction is used to reduce the symbol error probability. We use a combined ECC/RLL code for phrase encoding. We evaluate the use of trellis coded modulation (TCM) for amplitude encoding. The detection of the readout signal is also studied. A post-processing algorithm for the Viterbi detector is introduced, which ensures that the detected word satisfies the code constraints. Specifying the codes and detector for the M-ary ORC gives a complete system whose performance can be compared to that of the recently developed systems found in the literature and the theoretical limits calculated in this research.
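A (d, k) runlength constraint bounds how short and how long a run of identical symbols may be, and for M-ary sequences the check generalises directly from the binary case. A hedged sketch of such a constraint check, the kind of test a post-processor can apply to a detected word (one common convention; the dissertation's exact constraint definition may differ):

```python
def satisfies_rll(seq, d, k):
    """Check an M-ary (d, k) runlength constraint: every run of identical
    symbols must span at least d + 1 and at most k + 1 symbols."""
    if not seq:
        return True
    runs, run = [], 1
    for prev, cur in zip(seq, seq[1:]):
        if cur == prev:
            run += 1
        else:
            runs.append(run)
            run = 1
    runs.append(run)
    return all(d + 1 <= r <= k + 1 for r in runs)
```

A Viterbi post-processor as described in the abstract would reject (or correct) any detected word for which such a check fails, since channel noise can otherwise produce estimates outside the code's constraint.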
APA, Harvard, Vancouver, ISO, and other styles
50

馮潤開 and Yun-hoi Fung. "Linguistic fuzzy-logic control of autonomous vehicles." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B29812690.

Full text
APA, Harvard, Vancouver, ISO, and other styles