Dissertations / Theses on the topic 'Computer networks Standards Australia'

To see the other types of publications on this topic, follow the link: Computer networks Standards Australia.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 49 dissertations / theses for your research on the topic 'Computer networks Standards Australia.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses across a wide variety of disciplines and organise your bibliography correctly.

1

Aitken, William Theodore. "Network management standards from the fault management perspective." Dissertation (Electrical Engineering), Carleton University, Ottawa, 1992.

2

Staples, John. "Local area network development standards." Master's thesis, Virginia Polytechnic Institute and State University, 1990. http://scholar.lib.vt.edu/theses/available/etd-03302010-020011/.

3

Okano, Stephen Hiroshi. "Optimizing secure communication standards for disadvantaged networks." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/61316.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 137-140).
We present methods for optimizing standardized cryptographic message protocols for use on disadvantaged network links. We first provide an assessment of current secure communication message packing standards and their relevance to disadvantaged networks. Then we offer methods to reduce message overhead in packing Cryptographic Message Syntax (CMS) structures by using ZLIB compression and using a Lite version of CMS. Finally, we offer a few extensions to the Extensible Messaging and Presence Protocol (XMPP) to wrap secure group messages for chat on disadvantaged networks and to reduce XMPP message overhead in secure group transmissions. We present the design and implementation of these optimizations and the results that these optimizations have on message overhead, extensibility, and usability of both CMS and XMPP. We have developed these methods to extend CMS and XMPP with the ultimate goal of establishing standards for securing communications in disadvantaged networks.
by Stephen Hiroshi Okano.
M.Eng.
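The overhead reduction described in this abstract (ZLIB-compressing a packed CMS structure before it crosses a bandwidth-constrained link) can be sketched in a few lines of Python. The payload below is a placeholder byte string, not a real DER-encoded CMS message:

```python
import zlib

# Placeholder standing in for a DER-encoded CMS structure; real CMS
# payloads carry repetitive ASN.1 framing and certificate data that
# compress well.
cms_payload = b"\x30\x82" + b"certificate-and-signer-info " * 50

# Deflate at maximum compression, as a pre-transmission packing step.
compressed = zlib.compress(cms_payload, level=9)

# On a disadvantaged link, fewer bytes on the wire means lower delay;
# the receiver recovers the original structure losslessly.
assert zlib.decompress(compressed) == cms_payload
print(f"{len(cms_payload)} bytes -> {len(compressed)} bytes")
```

CMS itself defines a compressed-data content type (RFC 3274) based on the same DEFLATE algorithm; the sketch above only illustrates the size effect.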
4

Laska, William David. "Development of a standards based open environment for the worldwide military command and control system." Master's thesis, Virginia Polytechnic Institute and State University, 1993. http://scholar.lib.vt.edu/theses/available/etd-03302010-020318/.

5

De, Kock Johannes Marthinus. "Optimal management of MPLS networks." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52977.

Abstract:
Thesis (MSc)--Stellenbosch University, 2002.
ENGLISH ABSTRACT: Multiprotocol Label Switching (MPLS) is a routing technology which can manage Quality of Service (QoS) in scalable connectionless networks using relatively simple packet forwarding mechanisms. This thesis considers the optimisation of the QoS offered by an MPLS network. The QoS measure used is the expected packet delay which is minimised by switching packets along optimal label switched paths (LSPs). Two mathematical models of MPLS networks are presented together with appropriate algorithms for optimally dividing the network traffic into forwarding equivalence classes (FECs), finding optimal LSPs which minimise the expected packet delay and switching these FECs along the optimal LSPs. These algorithms are applied to compute optimal LSPs for several test networks. The mathematics on which these algorithms are based is also reviewed. This thesis provides the MPLS network operator with efficient packet routing algorithms for optimising the network's QoS.
AFRIKAANSE OPSOMMING: Multiprotocol Label Switching (MPLS) is 'n roeteringsmetode om die diensvlak (QoS) van 'n skaleerbare, verbindinglose netwerk te bestuur deur middel van relatief eenvoudige versendingsmeganismes. Hierdie tesis beskou die optimering van die QoS van 'n MPLS-netwerk. Die QoS-maatstaf is die verwagte vertraging van 'n netwerk-pakkie. Dit word geminimeer deur pakkies langs optimale "label switched paths" (LSPs) te stuur. Twee wiskundige modelle van MPLS-netwerke word ondersoek. Toepaslike algoritmes word verskaf vir die optimale verdeling van die netwerkverkeer in "forwarding equivalence classes" (FECs), die soektog na optimale LSPs (wat die verwagte pakkie-vertraging minimeer) en die stuur van die FECs langs die optimale LSPs. Hierdie algoritmes word ingespan om optimale LSPs vir verskeie toetsnetwerke op te stel. Die wiskundige teorie waarop hierdie algoritmes gegrond is, word ook hersien. Hierdie tesis verskaf doeltreffende roeteringsalgoritmes waarmee 'n MPLS-netwerkbestuurder die netwerk se QoS kan optimeer.
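The delay-minimising LSP selection this abstract describes can be illustrated with a small sketch. It assumes the classical M/M/1 approximation, where a link with capacity c and offered load λ has expected delay 1/(c - λ), and runs Dijkstra over those weights; the four-node topology is invented for illustration:

```python
import heapq

def expected_delay(capacity, load):
    # Classical M/M/1 approximation: delay 1/(c - load) grows without
    # bound as the offered load approaches the link capacity.
    assert load < capacity
    return 1.0 / (capacity - load)

def min_delay_lsp(links, src, dst):
    """Dijkstra over per-link expected delays.

    links: {node: [(neighbour, capacity, current_load), ...]}
    Returns (total_expected_delay, path) for the best LSP.
    """
    queue = [(0.0, src, [src])]
    seen = set()
    while queue:
        delay, node, path = heapq.heappop(queue)
        if node == dst:
            return delay, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, cap, load in links.get(node, []):
            if nbr not in seen:
                heapq.heappush(
                    queue, (delay + expected_delay(cap, load), nbr, path + [nbr])
                )
    return float("inf"), []

# Hypothetical four-node network; capacities and loads in packets/s.
net = {
    "A": [("B", 10.0, 8.0), ("C", 10.0, 2.0)],
    "B": [("D", 10.0, 1.0)],
    "C": [("D", 10.0, 2.0)],
}
delay, path = min_delay_lsp(net, "A", "D")
print(path)  # the lightly loaded route wins: 1/8 + 1/8 < 1/2 + 1/9
```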
6

Ngqondi, Tembisa Grace. "The ISO/IEC 27002 and ISO/IEC 27799 information security management standards : a comparative analysis from a healthcare perspective." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/1066.

Abstract:
Technological shift has become significant and an area of concern in the health sector with regard to securing health information assets. Health information systems hosting personal health information expose these information assets to ever-evolving threats. This information includes aspects of an extremely sensitive nature, for example, a particular patient may have a history of drug abuse, which would be reflected in the patient’s medical record. The private nature of patient information places a higher demand on the need to ensure privacy. Ensuring that the security and privacy of health information remain intact is therefore vital in the healthcare environment. In order to protect information appropriately and effectively, good information security management practices should be followed. To this end, the International Organization for Standardization (ISO) published a code of practice for information security management, namely the ISO 27002 (2005). This standard is widely used in industry but is a generic standard aimed at all industries. Therefore it does not consider the unique security needs of a particular environment. Because of the unique nature of personal health information and its security and privacy requirements, the need to introduce a healthcare sector-specific standard for information security management was identified. The ISO 27799 was therefore published as an industry-specific variant of the ISO 27002 which is geared towards addressing security requirements in health informatics. It serves as an implementation guide for the ISO 27002 when implemented in the health sector. The publication of the ISO 27799 is considered as a positive development in the quest to improve health information security. However, the question arises whether the ISO 27799 addresses the security needs of the healthcare domain sufficiently. 
The extensive use of the ISO 27002 implies that many proponents of this standard (in healthcare), now have to ensure that they meet the (assumed) increased requirements of the ISO 27799. The purpose of this research is therefore to conduct a comprehensive comparison of the ISO 27002 and ISO 27799 standards to determine whether the ISO 27799 serves the specific needs of the health sector from an information security management point of view.
7

Kheirkhah Sabetghadam, Morteza. "MMPTCP : a novel transport protocol for data centre networks." Thesis, University of Sussex, 2016. http://sro.sussex.ac.uk/id/eprint/61781/.

Abstract:
Modern data centres provide large aggregate capacity in the backbone of networks so that servers can theoretically communicate with each other at their maximum rates. However, the Transmission Control Protocol (TCP) cannot efficiently use this large capacity even if Equal-Cost Multi-Path (ECMP) routing is enabled to exploit the existence of parallel paths. MultiPath TCP (MPTCP) can effectively use the network resources of such topologies by performing fast distributed load balancing. MPTCP is an appealing technique for data centres that are very dynamic in nature. However, it is ill-suited for handling short flows since it increases their flow completion time. To mitigate these problems, we propose Maximum MultiPath TCP (MMPTCP), a novel transport protocol for modern data centres. Unlike MPTCP, it provides high performance for all network flows. It also decreases the bursty nature of data centres, which is essentially rooted in the traffic patterns of short flows. MMPTCP achieves these features by randomising a flow's packets via all parallel paths to a destination during the initial phase of data transmission until a certain amount of data is delivered. It then switches to MPTCP with several subflows, in which data transmission is governed by MPTCP congestion control. In this way, short flows are delivered very fast via the initial phase only, and long flows are delivered by MPTCP with several subflows. We evaluate MMPTCP in a FatTree topology under various network conditions. We found that MMPTCP decreases the loss rate of all the links throughout the network and helps competing flows to achieve better performance. Unlike MPTCP with a fixed number of subflows, MMPTCP offers high burst tolerance and low latency for short flows while maintaining high overall network utilisation. MMPTCP is incrementally deployable in existing data centres because it does not require any modification to the network and application layers.
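MMPTCP's two-phase behaviour, spraying packets over all parallel paths until a data threshold is reached and then handing over to a fixed set of MPTCP-style subflows, can be mimicked with a toy scheduler. The 16-path fabric, 100 kB threshold and subflow count below are illustrative assumptions, not values from the thesis:

```python
import random

class TwoPhaseScheduler:
    """Toy model of MMPTCP's two-phase path selection.

    Phase 1: spray each packet on a uniformly random parallel path,
    which spreads short flows (and their bursts) across the fabric.
    Phase 2: after `threshold` bytes, fall back to a small fixed set
    of subflows, as MPTCP congestion control would govern long flows.
    """

    def __init__(self, paths, threshold_bytes, num_subflows=4, seed=0):
        self.paths = paths
        self.threshold = threshold_bytes
        self.sent = 0
        self.rng = random.Random(seed)
        self.subflows = self.rng.sample(paths, min(num_subflows, len(paths)))

    def pick_path(self, packet_size):
        if self.sent < self.threshold:
            path = self.rng.choice(self.paths)      # packet-scatter phase
        else:
            path = self.rng.choice(self.subflows)   # MPTCP subflow phase
        self.sent += packet_size
        return path

sched = TwoPhaseScheduler(paths=list(range(16)), threshold_bytes=100_000)
# A short flow (~64 kB in MSS-sized packets) completes entirely inside
# the scatter phase, so it typically touches many of the 16 paths.
used = {sched.pick_path(1500) for _ in range(44)}
print(sorted(used))
```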
8

Tsietsi, Mosiuoa Jeremia. "Prototyping a peer-to-peer session initiation protocol user agent." Thesis, Rhodes University, 2008. http://hdl.handle.net/10962/d1006603.

Abstract:
The Session Initiation Protocol (SIP) has in recent years become a popular protocol for the exchange of text, voice and video over IP networks. This thesis proposes the use of a class of structured peer to peer protocols - commonly known as Distributed Hash Tables (DHTs) - to provide a SIP overlay with services such as end-point location management and message relay, in the absence of traditional, centralised resources such as SIP proxies and registrars. A peer-to-peer layer named OverCord, which allows the interaction with any specific DHT protocol via the use of appropriate plug-ins, was designed, implemented and tested. This layer was then incorporated into a SIP user agent distributed by NIST (National Institute of Standards and Technology, USA). The modified user agent is capable of reliably establishing text, audio and video communication with similarly modified agents (peers) as well as conventional, centralised SIP overlays.
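The endpoint-location service that the DHT overlay provides can be sketched with a toy consistent-hash ring: each SIP Address-of-Record hashes to a key, and the peer whose identifier is the key's clockwise successor stores the registration. The peer names and addresses here are made up:

```python
import hashlib
from bisect import bisect_left

def key(s):
    # 32-bit key derived from SHA-1, as Chord-style DHTs do.
    return int(hashlib.sha1(s.encode()).hexdigest(), 16) % (2 ** 32)

class ToyDHT:
    """Toy ring: each key is owned by its clockwise successor peer."""

    def __init__(self, peers):
        self.ring = sorted((key(p), p) for p in peers)

    def lookup(self, aor):
        # Find the successor of the key on the ring, wrapping at the top.
        k = key(aor)
        idx = bisect_left(self.ring, (k, ""))
        return self.ring[idx % len(self.ring)][1]

# Peers register and resolve SIP AoRs without a central registrar.
dht = ToyDHT(["peer1.example.net", "peer2.example.net", "peer3.example.net"])
owner = dht.lookup("sip:alice@example.net")
registrations = {owner: {"sip:alice@example.net": "192.0.2.10:5060"}}

# Any peer resolving alice's AoR reaches the same responsible node.
assert dht.lookup("sip:alice@example.net") == owner
```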
9

Kapp, James A. "Implementation of operational framework in the NLP based on MOF and ITIL standards." [Denver, Colo.] : Regis University, 2006. http://165.236.235.140/lib/JKappPartI2006.pdf.

10

Li, Xiao-Yu. "Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/544.

Abstract:
As digital data collections have increased in scale and number, they have become an important type of resource serving a wide community of researchers. Cross-institutional data-sharing and collaboration offer a suitable approach to support research institutions that lack data and the related IT infrastructure. Grid computing has become a widely adopted approach to enable cross-institutional resource-sharing and collaboration. It integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system, which uses Grid technology to enable data-access and integration, and collaborative operations across multiple distributed institutions in the context of HIV/AIDS research. This study is based on wider research into OGSA-based Grid services architecture, comprising a data-analysis system which utilizes a data warehouse, data marts, and a near-line operational database that are hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualization and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. This study also concerns a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation, and attribute-based authorization. These mechanisms, as supported by the Globus Toolkit’s Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between the various security mechanisms and policies within different institutions; manage credentials; and ensure secure interactions.
11

Locher, Melissa. "A case study of the change process involved in the implementation of web accessibility standards /." free to MU campus, to others for purchase, 2004. http://wwwlib.umi.com/cr/mo/fullcit?p3164525.

12

Gilliam, Austin Taylor. "Using Deep Neural Networks and Industry-Friendly Standards to Create a Robot Follower for Human Leaders." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1524150398390964.

13

Hicks, Michael. "Organisational barriers and their relationship to the effective use of information system audit trails." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2006. https://ro.ecu.edu.au/theses/335.

Abstract:
Audit trails are important as a detection and monitoring mechanism for unethical or unauthorised behaviour by internal as well as external users. In addition, they can be used to demonstrate proof of a business process or as an evidentiary record to assess the integrity of an information system. Their effective use is promoted as an essential component of a well-balanced and complete security policy. Despite widespread acknowledgment of the importance of audit trails, surveys have repeatedly shown that they are often neglected in terms of both the degree of implementation and effectiveness. This study explores the evidence suggesting that organisational issues, rather than technical problems, may be the cause of deficiencies in audit trail effectiveness. Organisational barriers identified in current and prior studies include a lack of appropriate training, a lack of comprehensive security policies and procedures, and the absence of an IT staff recruitment policy.
14

Botha, Marlene. "Online traffic engineering for MPLS networks." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/50049.

Abstract:
Thesis (MSc) -- Stellenbosch University, 2004.
ENGLISH ABSTRACT: The Internet is fast evolving into a commercial platform that carries a mixture of narrow- and broadband applications such as voice, video, and data. Users expect a certain level of guaranteed service from their service providers and consequently the need exists for efficient Internet traffic engineering to enable better Quality of Service (QoS) capabilities. Multi-protocol Label Switching (MPLS) is a label switching protocol that has emerged as an enabling technology to achieve efficient traffic engineering for QoS management in IP networks. The ability of the MPLS protocol to create explicit virtual connections called Label Switched Paths (LSPs) to carry network traffic significantly enhances the traffic engineering capabilities of communication networks. The MPLS protocol supports two options for explicit LSP selection: offline LSP computation using an optimization method and dynamic route selection where a single node makes use of current available network state information in order to compute an explicit LSP online. This thesis investigates various methods for the selection of explicit bandwidth guaranteed LSPs through dynamic route selection. We address the problem of computing a sequence of optimal LSPs where each LSP can carry a specific traffic demand, and we assume that no prior information regarding the future traffic demands is available and that the arrival sequence of LSP requests to the network is unknown. Furthermore, we investigate the rerouting abilities of the online LSP selection methods to perform MPLS failure restoration upon link failure. We propose a new online routing framework known as Least Interference Optimization (LIO) that utilizes the current bandwidth availability and traffic flow distribution to achieve efficient traffic engineering. 
We present the Least Interference Optimization Algorithm (LIOA) that reduces the interference among competing network flows by balancing the number and quantity of flows carried by a link for the setup of bandwidth guaranteed LSPs in MPLS networks. The LIOA routing strategy is evaluated and compared against well-known routing strategies such as the Minimum Hop Algorithm (MHA), Minimum Interference Routing Algorithm (MIRA), Open Shortest Path First (OSPF) and Constraint Shortest Path First (CSPF) by means of simulation. Simulation results revealed that, for the network topologies under consideration, the routing strategies that employed dynamic network state information in their routing decisions (LIOA, CSPF and MIRA) generally outperformed the routing strategies that only rely on static network information (OSPF and MHA). In most simulation experiments the best performance was achieved by the LIOA routing strategy while the MHA performed the worst. Furthermore, we observed that the computational complexity of the MIRA routing strategy does not translate into equivalent performance gains. We employed the online routing strategies for MPLS failure recovery upon link failure. In particular, we investigated two aspects to determine the efficiency of the routing strategies for MPLS rerouting: the suitability of the LSP configuration that results due to the establishment of LSPs prior to link failure and the ability of the online routing strategy to reroute failed LSPs upon link failure. Simulation results revealed similar rerouting performance for all online routing strategies under investigation, but an LSP configuration most suitable for online rerouting was observed for the LIOA routing strategy.
AFRIKAANSE OPSOMMING: Die Internet is voortdurend besig om te evolueer in 'n medium wat 'n wye reeks moderne kommunikasietegnologieë ondersteun, insluitende telefoon, video en data. Internet gebruikers verwag gewaarborgde diens van hul diensverskaffers en daar bestaan dus 'n vraag na doeltreffende televerkeerbeheer vir gewaarborgde Internet diensgehalte. Multiprotokol Etiketskakeling (MPLS) is 'n etiketskakeling protokol wat doeltreffende televerkeerbeheer en diensgehalte moontlik maak deur die eksplisiete seleksie van virtuele konneksies vir die transmissie van netwerkverkeer in Internetprotokol (IP) netwerke. Hierdie virtuele konneksies staan bekend as etiketgeskakelde paaie. Die MPLS protokol ondersteun tans twee moontlikhede vir eksplisiete seleksie van etiketgeskakelde paaie: aflyn padberekening met behulp van optimeringsmetodes en dinamiese aanlyn padseleksie waar 'n gekose node 'n eksplisiete pad bereken deur die huidige stand van die netwerk in ag te neem. In hierdie tesis word verskeie padseleksiemetodes vir die seleksie van eksplisiete bandwydte-gewaarborgde etiketgeskakelde paaie deur middel van dinamiese padseleksie ondersoek. Die probleem om 'n reeks optimale etiketgeskakelde paaie te bereken wat elk 'n gespesifiseerde verkeersaanvraag kan akkommodeer word aangespreek. Daar word aanvaar dat geen informasie in verband met die toekomstige verkeersaanvraag bekend is nie en dat die aankomsvolgorde van etiketgeskakelde padversoeke onbekend is. Ons ondersoek verder die herroeteringsmoontlikhede van die aanlyn padseleksiemetodes vir MPLS foutrestorasie in die geval van skakelonderbreking. Vir hierdie doel word 'n nuwe aanlyn roeteringsraamwerk naamlik Laagste Inwerking Optimering (LIO) voorgestel. LIO benut die huidige beskikbare bandwydte en verkeersvloeidistribusie van die netwerk om doeltreffende televerkeerbeheer moontlik te maak. 
Ons beskryf 'n Laagste Inwerking Optimering Algoritme (LIOA) wat die inwerking tussen kompeterende verkeersvloei verminder deur 'n balans te handhaaf tussen die aantal en kwantiteit van die verkeersvloeistrome wat gedra word deur elke netwerkskakel. Die LIOA roeteringstrategie word geëvalueer met behulp van simulasie en die resultate word vergelyk met ander bekende roeteringstrategieë insluitende die Minimum Node Algoritme (MHA), die Minimum Inwerking Algoritme (MIRA), die Wydste Kortste Pad Eerste Algoritme (OSPF) en die Beperkte Kortste Pad Eerste Algoritme (CSPF). Die resultate van die simulasie-eksperimente toon dat, vir die netwerk topologieë onder eksperimentasie, die roeteringstrategieë wat roeteringsbesluite op dinamiese netwerk informasie baseer (LIOA, MIRA, CSPF) oor die algemeen beter vaar as die wat slegs staatmaak op statiese netwerkinformasie (MHA, OSPF). In die meeste simulasie-eksperimente vaar die LIOA roeteringstrategie die beste en die MHA roeteringstrategie die slegste. Daar word verder waargeneem dat die komputasiekompleksiteit van die MIRA roeteringstrategie nie noodwendig weerspieël word in die sukses van roeteringsuitkoms nie. In die geval waar die aanlyn roeteringstrategieë aangewend word vir MPLS foutrestorasie, toon die resultate van simulasie-eksperimente dat al die roeteringstrategieë min of meer dieselfde uitkoms lewer ten opsigte van herroetering van onderbreekte verkeersvloei. Die konfigurasie van etiketgeskakelde paaie deur die LIOA roeteringstrategie voor skakelonderbreking is egter die geskikste vir televerkeer herroetering na skakelonderbreking.
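The balancing idea behind LIOA, penalising links that already carry many or large flows so that a new bandwidth-guaranteed LSP interferes less with existing traffic, can be illustrated with a toy link-cost function. The weighting terms below are an assumption for illustration, not the thesis's actual formula:

```python
def link_cost(num_flows, reserved_bw, capacity, alpha=1.0, beta=1.0):
    # Toy interference-aware cost: a link already carrying many flows
    # (num_flows) or much reserved bandwidth (reserved_bw) is a worse
    # candidate for a new LSP, so competing flows spread out.
    # The alpha/beta weightings are illustrative assumptions.
    residual = capacity - reserved_bw
    if residual <= 0:
        return float("inf")  # no room for a bandwidth-guaranteed LSP
    return alpha * num_flows + beta * (reserved_bw / capacity) + 1.0 / residual

# An empty link is preferred over one carrying three flows at 60% load.
print(link_cost(0, 0.0, 100.0) < link_cost(3, 60.0, 100.0))
```

A routing strategy would feed such costs into a shortest-path search, so that each new LSP is steered away from heavily committed links.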
15

Foulkes, Philip James. "An investigation into the control of audio streaming across networks having diverse quality of service mechanisms." Thesis, Rhodes University, 2012. http://hdl.handle.net/10962/d1004865.

Abstract:
The transmission of realtime audio data across digital networks is subject to strict quality of service requirements. These networks need to be able to guarantee network resources (e.g., bandwidth), ensure timely and deterministic data delivery, and provide time synchronisation mechanisms to ensure successful transmission of this data. Two open standards-based networking technologies, namely IEEE 1394 and the recently standardised Ethernet AVB, provide distinct methods for achieving these goals. Audio devices that are compatible with IEEE 1394 networks exist, and audio devices that are compatible with Ethernet AVB networks are starting to come onto the market. There is a need for mechanisms to provide compatibility between the audio devices that reside on these disparate networks such that existing IEEE 1394 audio devices are able to communicate with Ethernet AVB audio devices, and vice versa. The audio devices that reside on these networks may be remotely controlled by a diverse set of incompatible command and control protocols. It is desirable to have a common network-neutral method of control over the various parameters of the devices that reside on these networks. As part of this study, two Ethernet AVB systems were developed. One system acts as an Ethernet AVB audio endpoint device and another system acts as an audio gateway between IEEE 1394 and Ethernet AVB networks. These systems, along with existing IEEE 1394 audio devices, were used to demonstrate the ability to transfer audio data between the networking technologies. Each of the devices is remotely controllable via a network neutral command and control protocol, XFN. The IEEE 1394 and Ethernet AVB devices are used to demonstrate the use of the XFN protocol to allow for network neutral connection management to take place between IEEE 1394 and Ethernet AVB networks. 
User control over these diverse devices is achieved via the use of a graphical patchbay application, which aims to provide a consistent user interface to a diverse range of devices.
16

Thomson, Steven Michael. "A standards-based security model for health information systems." Thesis, Nelson Mandela Metropolitan University, 2008. http://hdl.handle.net/10948/718.

Abstract:
In the healthcare environment, various types of patient information are stored in electronic format. This prevents the re-entering of information that was captured previously. In the past this information was stored on paper and kept in large filing cabinets. However, with the technology advancements that have occurred over the years, the idea of storing patient information in electronic systems arose. This led to a number of electronic health information systems being created, which in turn led to an increase in possible security risks. Any organization that stores information of a sensitive nature must apply information security principles in order to ensure that the stored information is kept secure. At a basic level, this entails ensuring the confidentiality, integrity and availability of the information, which is not an easy feat in today’s distributed and networked environments. This paved the way for organized standardization activities in the areas of information security and information security management. Throughout history, there have been practices that were created to help “standardize” industries of all areas, to the extent that there are professional organizations whose main objective it is to create such standards to help connect industries all over the world. This applies equally to the healthcare environment, where standardization took off in the late eighties. Healthcare organizations must follow standardized security measures to ensure that patient information stored in health information systems is kept secure. However, the proliferation in standards makes it difficult to understand, adopt and deploy these standards in a coherent manner. This research, therefore, proposes a standards-based security model for health information systems to ensure that such standards are applied in a manner that contributes to securing the healthcare environment as a whole, rather than in a piecemeal fashion.
17

Cripps, Helen. "Collaborative business relationships and the use of ICT: The case of the marine, defence and resources cluster, Western Australia." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2007. https://ro.ecu.edu.au/theses/301.

Abstract:
The research project was developed from an Australian Research Council Grant designed to investigate collaborative commerce and its impact on regional economic development. Through a process of consultation with the industry partner, the South West Group, the research was designed to investigate the drivers and inhibitors of collaborative relationships and the factors that impact on the creation and sustaining of these relationships. The role of Information Communication Technology (ICT) in facilitating and sustaining collaborative relationships, and the perceived benefits and drawbacks of collaborative relationships, were also investigated. The research sought to identify models of best practice in the adoption of collaborative relationships.
18

Stockdale, Rosemary. "Identification and realisation of the benefits of participating in an electronic marketplace : An interpretive evaluation approach." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2003. https://ro.ecu.edu.au/theses/1333.

Abstract:
Electronic marketplaces have proliferated as use of the Internet has become widespread in business. A rapid growth in the number of marketplaces, followed by a period of stringent consolidation as market makers develop a greater understanding of effective business models, has resulted in a climate of uncertainty and confusion. As with many aspects of e-commerce, the drive towards participation is fuelled less by strategic planning than by a fear of lagging behind competitors or losing first-mover advantage. In this climate of uncertainty, organisations often bypass effective evaluation of the benefits that can be realised from participation in e-marketplaces, thereby exacerbating the problems facing them and hampering effective decision-making. Evaluation is perceived as a fraught subject within the Information Systems field, and particularly within the business community, which adheres to tried and trusted, albeit often inappropriate, methods such as financial or technical evaluation. The difficulties involved in effective evaluation of systems are well documented; these will increase as systems become more pervasive throughout organisations and those of their trading partners. Calls for a more holistic approach to evaluation are increasing, based on a developing appreciation of interpretive methods of research within the Information Systems discipline. However, the understanding that the social, political and cultural factors affecting an organisation have an impact on the uses and advantages of systems is by no means universal, and empirical evidence for this view is only slowly emerging. This research examines the benefits that can be realised from participation in an electronic marketplace by taking an interpretive approach to the evaluation. It examines the nature of electronic marketplaces to provide clarity to a confused and dynamic environment. 
The study then focuses on the development of evaluation studies within the IS discipline to identify how an effective evaluation method for assessing the benefits of e-marketplace participation can be achieved. An empirical examination of an organisation’s participation in an electronic marketplace is used to identify the benefits that are realisable and the issues that impact on them. The case study is conducted through an interpretive lens, using a content, context, process (CCP) approach based on existing IS literature. This enables a crucial understanding of the internal and external environments influencing the organisation and its realisation of potential benefits. To allow for the range of interpretations and reflections required to fully address the complexity of the issues involved in such a case study, a variety of research influences such as dialectical hermeneutics, critical realism and case study theory are drawn into the research model. The case study organisation’s motivation for participating in an e-marketplace was primarily cost savings. Over the two years of the study, several more potential benefits were identified, such as supply chain efficiencies, greater market awareness and a widening of the supplier base. However, the organisation’s commitments to its local and regional communities, its need to retain status and some consideration of existing relationships needed to be balanced against the gains that might be realised. In some cases the organisation chose to forgo a potential benefit in favour of socially or politically motivated actions. Cultural factors also influenced their actions, particularly as they moved towards extending participation in the marketplace to gain from a global sourcing strategy. The contribution of this research lies in two areas. 
Firstly, it uses existing evaluation literature to develop a framework for the evaluation of benefits in the complex area of electronic marketplaces, thereby extending and informing the call for more inclusive and interpretive evaluation studies. Secondly, the research contributes empirical evidence supporting the recognition of benefits to be gained from electronic marketplaces and shows how the realisation of economic benefits is affected by the social, political and cultural factors that influence an organisation.
APA, Harvard, Vancouver, ISO, and other styles
19

Kilic, Ozgur. "Achieving Electronic Healthcare Record (ehr) Interoperability Across Healthcare Information Systems." Phd thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609665/index.pdf.

Full text
Abstract:
Providing an interoperability infrastructure for Electronic Healthcare Records (EHRs) is on the agenda of many national and regional eHealth initiatives. Two important integration profiles have been specified for this purpose: "IHE Cross-enterprise Document Sharing (XDS)" and "IHE Cross Community Access (XCA)". XDS describes how to share EHRs in a community of healthcare enterprises, and XCA describes how EHRs are shared across communities. However, no current solution addresses some of the important challenges of cross-community exchange environments. The first challenge is scalability: if every community joining the network needs to connect to every other community, the solution will not scale. Furthermore, each community may use a different coding vocabulary for the same metadata attribute, in which case the target community cannot interpret a query involving such an attribute. Another important challenge is that each community has a different patient identifier domain, and querying for patient identifiers in another community using patient demographic data may create patient privacy concerns. Yet another challenge in cross-community EHR access is EHR interoperability, since the communities may be using different EHR content standards.
APA, Harvard, Vancouver, ISO, and other styles
20

Huang, Jie. "Efficient Support for Application-Specific Video Adaptation." PDXScholar, 2006. https://pdxscholar.library.pdx.edu/open_access_etds/2670.

Full text
Abstract:
As video applications become more diverse, video must be adapted in different ways to meet the requirements of different applications when there are insufficient resources. In this dissertation, we address two sorts of requirements that cannot be addressed by existing video adaptation technologies: (i) accommodating large variations in resolution and (ii) collecting video effectively in a multi-hop sensor network. In addition, we also address requirements for implementing video adaptation in a sensor network. Accommodating large variation in resolution is required by the existence of display devices with widely disparate screen sizes. Existing resolution adaptation technologies usually aim at adapting video between two resolutions. We examine the limitations of these technologies that prevent them from supporting a large number of resolutions efficiently. We propose several hybrid schemes and study their performance. Among these hybrid schemes, Bonneville, a framework that combines multiple encodings with limited scalability, can make good trade-offs when organizing compressed video to support a wide range of resolutions. Video collection in a sensor network requires adapting video in a multi-hop store-and-forward network and with multiple video sources. This task cannot be supported effectively by existing adaptation technologies, which are designed for real-time streaming applications from a single source over IP-style end-to-end connections. We propose to adapt video in the network instead of at the network edge. We also propose a framework, Steens, to compose adaptation mechanisms on multiple nodes. We design two signaling protocols in Steens to coordinate multiple nodes. Our simulations show that in-network adaptation can use buffer space on intermediate nodes for adaptation and achieve better video quality than conventional network-edge adaptation. 
Our simulations also show that explicit collaboration among multiple nodes through signaling can improve video quality, waste less bandwidth, and maintain bandwidth-sharing fairness. The implementation of video adaptation in a sensor network requires system support for programmability, retaskability, and high performance. We propose Cascades, a component-based framework, to provide the required support. A prototype implementation of Steens in this framework shows that the performance overhead is less than 5% compared to a hard-coded C implementation.
APA, Harvard, Vancouver, ISO, and other styles
21

Mascarenhas, da Veiga Alves Manoel Eduardo. "Characterisation of end-to-end performance for web-based file server respositories." Title page, contents and abstract only, 2001. http://web4.library.adelaide.edu.au/theses/09ENS/09ensm395.pdf.

Full text
Abstract:
Bibliography: leaves 128-135. Investigates the behaviour of TCP bulk file transfer application sessions in a broadband access environment. Introduces concepts for evaluating network behaviour: a path instability parameter for analysing different TCP connections, and a minimum RTT delay and a minimum typical path for estimating path characteristics between a client and application servers.
APA, Harvard, Vancouver, ISO, and other styles
22

Goh, Che Seng. "Prototype system for detecting and processing of IEEE 802.11a signals." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Mar%5FGoh.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Alkadi, Alaa. "Anomaly Detection in RFID Networks." UNF Digital Commons, 2017. https://digitalcommons.unf.edu/etd/768.

Full text
Abstract:
Available security standards for RFID networks (e.g. ISO/IEC 29167) are designed to secure individual tag-reader sessions and do not protect against active attacks that could compromise the system as a whole (e.g. tag cloning or replay attacks). Proper traffic characterization models of the communication within an RFID network can lead to a better understanding of operation under “normal” system state conditions and can consequently help identify security breaches not addressed by current standards. This study of RFID traffic characterization considers two piecewise-constant data smoothing techniques over time-tagged events, namely the Bayesian blocks and Knuth's algorithms, and compares them in the context of rate-based anomaly detection. This was accomplished using data from experimental RFID readings and comparing (1) the event counts versus time obtained from the smoothed curves against empirical histograms of the raw data, and (2) the threshold-dependent alert rates based on inter-arrival times obtained from the smoothed curves against those of the raw data itself. Results indicate that both algorithms adequately model RFID traffic in which inter-event time statistics are stationary, but that Bayesian blocks becomes superior for traffic in which such statistics experience abrupt changes.
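The threshold-dependent alert-rate idea from this abstract can be sketched in a few lines. This is an illustrative fragment only, not code from the thesis: the timestamps and the one-second threshold are invented, and the piecewise-constant smoothing step (Bayesian blocks or Knuth's algorithm) that the study layers on top is omitted.

```python
# Rate-based alerting on time-tagged RFID events, reduced to its core:
# flag inter-arrival gaps above a threshold. Timestamps and threshold
# are invented for illustration.

def interarrival_times(timestamps):
    """Gaps between consecutive events (timestamps assumed sorted)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def alert_rate(timestamps, threshold):
    """Fraction of inter-arrival gaps that exceed the threshold."""
    gaps = interarrival_times(timestamps)
    if not gaps:
        return 0.0
    return sum(g > threshold for g in gaps) / len(gaps)

events = [0.0, 0.1, 0.2, 0.3, 2.5, 2.6, 2.7]  # a burst, a lull, a burst
rate = alert_rate(events, threshold=1.0)       # one long gap out of six
```

Sweeping the threshold and comparing the resulting alert rates on raw versus smoothed data corresponds to comparison (2) in the abstract.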
APA, Harvard, Vancouver, ISO, and other styles
24

Okcan, Alper. "Automatic Acquisition And Use Of Multimodal Medical Device Observations Based On Iso/ieee 11073 And Hl7 Standards." Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/12608462/index.pdf.

Full text
Abstract:
The delivery of quality healthcare to all citizens at reasonable cost is an important challenge. As the population ages, the costs of managing chronic diseases increase, and healthcare services tend to shift from recovery to prevention. Remote healthcare monitoring is crucial for the prevention and monitoring of chronic diseases, since they require continuous, long-term monitoring. Advances in networking, mobile communications and medical device technologies offer great potential to realize remote healthcare monitoring. However, seamless integration of multi-modal medical devices into existing healthcare information systems is necessary for the automated use of medical device observations in related applications. This thesis addresses the automatic acquisition and use of multi-modal medical device observations in healthcare information systems. The interoperability of medical devices with healthcare information systems requires both physical connectivity and application-level interoperability. Therefore, the thesis concentrates on both the medical device domain and the interoperability efforts in existing healthcare information systems. It provides an interoperability solution based on the ISO/IEEE 11073 and HL7 standards. This work also realizes the automatic acquisition and use of multi-modal medical device observations in an intelligent healthcare monitoring and decision support system developed as part of the IST-027074 SAPHIRE project funded by the European Commission.
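On the HL7 side of such an integration, device observations are commonly carried in OBX (observation) segments. The fragment below is a hedged illustration of that encoding only: the helper name and the heart-rate example are assumptions for illustration (8867-4 is the LOINC code for heart rate), and the thesis's actual ISO/IEEE 11073-to-HL7 mapping is considerably more involved.

```python
# Illustrative HL7 v2 OBX segment for a numeric device reading. The field
# layout follows HL7 v2 (OBX-2 value type, OBX-5 value, OBX-6 units,
# OBX-11 result status); the helper and example are not the SAPHIRE
# project's actual mapping.

def obx_segment(set_id, code, name, value, units):
    """Assemble a pipe-delimited OBX segment for a final numeric result."""
    fields = [
        "OBX", str(set_id), "NM",      # NM = numeric value type (OBX-2)
        f"{code}^{name}^LN",           # observation identifier (OBX-3)
        "",                            # observation sub-ID (OBX-4), unused
        str(value), units,             # value (OBX-5) and units (OBX-6)
        "", "", "", "",                # OBX-7..OBX-10 left empty
        "F",                           # result status (OBX-11): final
    ]
    return "|".join(fields)

segment = obx_segment(1, "8867-4", "Heart rate", 72, "/min")
```

A receiving system reads the value and units from the fifth and sixth pipe-delimited positions, which is what makes automated consumption of device observations possible.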
APA, Harvard, Vancouver, ISO, and other styles
25

Bligh, W. O. M. "Application of machine learning and connectionist modeling to an Australian dairy database." Thesis, Queensland University of Technology, 2000. https://eprints.qut.edu.au/36851/1/36851_Bligh_2000.pdf.

Full text
Abstract:
The Australian Dairy Herd Improvement Scheme (ADHIS) provides a database containing both raw and processed data relating to milk production in Australia. This thesis provides estimations of potential milk production for dairy breeding using dairy animal data and artificial neural networks (ANNs). By predicting daughter milk production from data representative of dams in a herd and of artificial insemination sires, an evaluation of those potential daughter results can lead to the selection of a breeding sire for that herd. Relevant data fields and derived attributes from the dairy database that significantly affect daughter milk production are utilised in the development of a prediction model. Further investigation of data fields proven to influence daughter milk production results in a set of rules extracted for human interpretation. Only Victorian data is used in this study.
APA, Harvard, Vancouver, ISO, and other styles
26

Hachfi, Fakhreddine Mohamed. "Future of asynchronous transfer mode networking." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2639.

Full text
Abstract:
Asynchronous Transfer Mode (ATM) was considered to be the ideal carrier for high-bandwidth applications like video on demand and multimedia e-learning. ATM emerged commercially at the beginning of the 1990s. It was designed to provide differentiated quality of service at speeds of up to 100 Gbps for both real-time and non-real-time applications. The turn of the 1990s saw a variety of competing technologies being developed. This project analyzes these technologies, compares them to Asynchronous Transfer Mode and assesses the future of ATM.
APA, Harvard, Vancouver, ISO, and other styles
27

Owen, Morné. "An enterprise information security model for a micro finance company: a case study." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/1151.

Full text
Abstract:
The world has entered the information age, and how information is used within an organisation will determine its success or failure. This study aims to provide a model that, once implemented, will provide the required protection for information assets. The model is based on ISO 27002, an international security standard. The primary objective is to build a model that provides a holistic security system specifically for a South African Micro Finance Company (MFC). The secondary objectives focus on the successful implementation of such a model, the unique characteristics of the MFC that should be taken into account, and the maintenance of the model once implemented to ensure its ongoing relevance. A questionnaire conducted at the MFC provided insight into the perceived understanding of information security, and its results were used to ensure the model addressed current information security shortcomings within the MFC. This study found that the information security controls in ISO 27002 should be applicable to any industry; the uniqueness of the MFC lies not in the security controls, but in the regulations and laws applicable to it.
APA, Harvard, Vancouver, ISO, and other styles
28

Harte, David. "Internet content control in Australia : data topology, topography and the data deficit." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2001. https://ro.ecu.edu.au/theses/1073.

Full text
Abstract:
The success of the online adult industry has provoked a public policy controversy over the need for Internet censorship, and in recent times a desire has emerged to protect minors from possibly unsuitable content. On 1 January 2000, the Broadcasting Services Amendment (Online Services) Act 1999 (Cwlth) (BSA) was proclaimed. The Act purports to regulate and control Internet content in Australia. Operating in tandem with the Act is the Internet Industry Association Code of Practice, giving Australia a co-regulatory approach to Internet content control. The Australian Broadcasting Authority (ABA) is charged with implementing the regime. This study sets out to examine the Internet content control problem in the Australian context. The political issues surrounding the topic of Internet censorship, and the lack of reliable operational statistics, revealed the difficulty of estimating the effectiveness of the current control regime. Pivotal questions for the study concerned the scope and scale of content control in the Australian context and trends in hosting. This study used website typology, as defined by data topology and data topography, to examine the scope and scale of the content control task and the implications for the effectiveness of the BSA. It was expected that if the BSA was to have an impact, a discernible change in user download behaviour should ensue. This study used information provided by the adult Internet Content Provider (ICP) industry to gauge the BSA's impact on user download behaviour as a measure of the control regime's effectiveness. It was suggested by some observers that the so-called 'data deficit' between Australia and the US would be exacerbated by the new content control regime, with possible negative implications for the conduct of e-commerce in Australia generally. A study of Australian adult website hosting arrangements and data topography was conducted to examine the implications of the control regime for the 'data deficit'. 
This study suggests that most Australian online adult content is in fact hosted in the US. The reasons for offshore hosting are almost entirely financial and pre-date the introduction of the Broadcasting Services Amendment (Online Services) Act 1999. The study also suggests that any effect on the 'data deficit' should be minimal, and that the typology of adult content websites is such that the current co-regulatory regime may prove ineffective in controlling access to adult content.
APA, Harvard, Vancouver, ISO, and other styles
29

Julie, Ferdie Gavin. "Development of an IEC 61850 standard-based automation system for a distribution power network." Thesis, Cape Peninsula University of Technology, 2014. http://hdl.handle.net/20.500.11838/1183.

Full text
Abstract:
Thesis submitted in fulfillment of the requirements for the degree Master of Technology: Electrical Engineering in the Faculty of Engineering at the Cape Peninsula University of Technology
The electric power distribution network, an essential section of the electric power system, supplies electrical power to the customer. Automating the distribution network allows for better efficiency, reliability and levels of service through the installation of distribution control systems. Presently, research and development efforts are focused on communication technologies and the application of the IEC 61850 protocol to make distribution automation more comprehensive, efficient and affordable. The aim of the thesis is to evaluate the relevance of IEC 61850 standard-based technology to the development and investigation of distribution automation for a typical underground distribution network, through the development of a distribution automation algorithm for fault detection, location, isolation and service restoration, and the building of a lab-scale test bench. Distribution Automation (DA) has been around for many decades, and each utility applies its developments for different reasons. Nowadays, due to advances in communication technology, a reliable, automatically reconfigurable power system that responds swiftly to events is possible. Distribution automation functions do not merely supersede legacy devices; they allow the distribution network to operate on another level. The primary function of a DA system is to enable the devices on the distribution network to be operated and controlled remotely, to automatically locate and isolate faults and reconnect supply during fault conditions. Utilities have become increasingly interested in DA due to the numerous benefits it offers. Operations, maintenance and efficiency within substations and out on the feeders can be improved by the development of new DA capabilities. Furthermore, the new standard-based technology has advanced further than the traditional Distribution Supervisory Control and Data Acquisition (DSCADA) system. 
These days, the most important components of a DA system are Intelligent Electronic Devices (IEDs). IEDs have evolved through the years to execute various protection, monitoring and control functions, and are very promising for improving the operation of DA systems. The thesis develops an algorithm for automatic fault detection, location, isolation and supply restoration using the functions of IEC 61850 standard-based technology. A lab-scale system that meets existing and future requirements for the control and automation of a typical underground distribution system is designed and constructed. The requirement for the lab-scale distribution system is the ability to clear faults through reliable and fast protection operation, isolate the faulted section(s) of the network, and restore power to the unaffected parts of the network through the automation and control functions of the IEC 61850 standard. Various tests and simulations have been performed on the lab-scale test bench to prove that the objective of the thesis is achieved. Keywords: IEC 61850 standard, distribution automation, distribution automation system, IEDs, lab-scale test bench, protection, algorithm for automatic control
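The fault isolation and restoration logic described here can be caricatured on a radial feeder with a normally-open tie switch at its far end. Everything below is a toy model: section and switch names are hypothetical, and the thesis realises this behaviour through IEC 61850 communication between IEDs rather than a single function.

```python
# Toy fault location, isolation and supply restoration (FLISR) on a radial
# feeder fed from its first section, with a normally-open tie switch beyond
# the last one. Names are hypothetical, for illustration only.

def flisr(sections, faulted):
    """Return a switching plan that isolates `faulted` and restores the rest."""
    i = sections.index(faulted)
    actions = []
    if i > 0:
        # open the upstream switch: disconnect the fault from the source
        actions.append(("open", f"sw_{sections[i - 1]}_{faulted}"))
    if i < len(sections) - 1:
        # open the downstream switch, then back-feed the healthy remainder
        actions.append(("open", f"sw_{faulted}_{sections[i + 1]}"))
        actions.append(("close", "tie"))
    return actions

feeder = ["A", "B", "C", "D"]  # supplied at A; tie switch after D
plan = flisr(feeder, "B")      # isolate B, restore C and D via the tie
```

In the lab-scale system, each switching action would be a command to an IED, with protection relays providing the fault detection that triggers the plan.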
APA, Harvard, Vancouver, ISO, and other styles
30

Oh, Khoon Wee. "Wireless network security : design considerations for an enterprise network /." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Dec%5FOh.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Klaška, Patrik. "Návrh autentizace uživatelů ve společnosti." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2018. http://www.nusl.cz/ntk/nusl-378339.

Full text
Abstract:
This thesis focuses on the creation of a functional authentication process for users of the computer network at Wistron InfoComm s.r.o., and discusses issues related to this process. The main aim of the thesis is to implement a functional and realistic solution based on the company's requirements, and to describe the problems associated with implementing this solution.
APA, Harvard, Vancouver, ISO, and other styles
32

Ash, Colin. "Exploring The Antecedents Of Successful E-business Implementations Through ERP : A Longitudinal Study of SAP-based Organisations 1999-2003." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2003. https://ro.ecu.edu.au/theses/1486.

Full text
Abstract:
This research was carried out between 1999 and 2003 on the use of e-business applications in ERP-based organisations. A composite research method based on structured case studies was developed for this study, combining the application of case methods by Carroll et al. (1998), Klein and Myers (1998), and Eisenhardt (1989). This was used to provide a focused yet flexible structure, as a dynamic approach to interpretive case study research. The research method used three distinct models at three progressive stages of the study to provide a multi-faceted view of each case. This composite case-based method was developed to maintain the balance between research rigour and relevance. A pilot case study of nine Australian SAP sites helped ground the theory of the study. This was followed by three stages of study of eleven international cases within a diverse industry context. The method revealed the antecedents of e-business success using the findings from case analyses against three separate research models: B2B interaction, e-business change, and virtual organising. A final conceptual framework was developed as new theory of e-business transformation. The theory views e-business transformation as realising the benefits of virtual organising within complex B2B interactions by utilising the facilitators of successful e-business change. The research demonstrates that successful e-business transformation with ERP occurs when value propositions are realised through integration and differentiation of the technologies used to support new business models that deliver products and services online. The associated management practice evolves through efficiency from self-service, effectiveness through empowerment towards customer care, and value enhancement from extensive relationship building with multiple alliances. 
The new theory of e-business transformation identifies the stages of e-business growth and development as a comprehensive plan that should assist managers of ERP-based organisations in migrating their company towards a successful e-business organisation. The detailed analysis of the findings offers a foundational perspective of strategies, tactics and performance objectives for e-ERP implementations. The strength of the theory lies in the synthesis of multiple case analyses using three different lenses over three separate time periods. The triangulation of the three research frameworks provides a method for study at appropriate levels of complexity. It is evolutionary in nature and is content driven. Other researchers are urged to apply similar multi-viewed analysis.
APA, Harvard, Vancouver, ISO, and other styles
33

Renzi, Stefano. "Differences in university teaching after Learning Management System adoption : an explanatory model based on Ajzen's Theory of Planned Behavior." University of Western Australia. Graduate School of Management, 2008. http://theses.library.uwa.edu.au/adt-WU2008.0193.

Full text
Abstract:
[Truncated abstract] Current literature about university teaching argues that, to be effective, online teaching requires online social learning based on social interaction. This implies a shift towards a pedagogy based on engagement and collaboration, instead of trying to reproduce face-to-face teaching in online environments. However, when a university adopts an e-learning platform (or Learning Management System, LMS), most teachers tend to reproduce their traditional teaching by delivering educational material through the LMS. This study explored factors which influence university teachers to adopt teaching models based on online social interaction (OSI) when an e-learning platform is used to complement undergraduate classroom teaching. Online teaching model adoption was considered in the framework of technology adoption and post-adoption behavior, i.e., adoption and use by individuals after an organization has adopted an ICT-based innovation (Jasperson, Carter, & Zmud, 2005). Behaviors were investigated using a model based on Ajzen's (1991) Theory of Planned Behavior (TPB). In total, 26 university teachers teaching undergraduate courses, 15 from Australia and 11 from Italy, were recruited. They responded to a semi-structured interview based on the TPB, built for the purpose of this research. Teachers were divided into three groups on the basis of their approach to online teaching, corresponding to three different levels of adoption of OSI. The three different online teaching models were:
APA, Harvard, Vancouver, ISO, and other styles
34

Hsiao, Chih-Wen, David Turner, and Keith Ross. "A secure lightweight currency service provider." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2594.

Full text
Abstract:
The main purpose of this project is to build a bank system that offers a friendly and simple interface to let users easily manage their lightweight currencies. The Lightweight Currency Protocol (LCP) was originally proposed to solve the problem of fairness in resource cooperatives. However, there are other possible applications of the protocol, including the control of spam and use as a general-purpose medium of exchange for low-value transactions. This project investigates the implementation issues of the LCP, and also investigates LCP bank services that provide a human interface to currency operations.
APA, Harvard, Vancouver, ISO, and other styles
35

Valli, Craig. "Non-business use of the World Wide Web : A study of selected Western Australian organisations." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2003. https://ro.ecu.edu.au/theses/1311.

Full text
Abstract:
Employees undertake a wide range of activities when they use the World Wide Web in the workplace. Some of these activities may leave the modern Internet-connected organisation vulnerable to undue or unknown risk, potential productivity losses and expense as a result of misuse or abuse of the Internet provision. Much of the existing literature on this subject points to a purported epidemic of misuse in the workplace. If this practice is so prevalent and widespread, what can modern Internet-connected organisations do to identify the abuse and reduce the risks and losses that these abuses represent? To what extent is the World Wide Web used by employees for non-business-related activities in organisations, and can filtering or organisational policies have an impact on this activity? This research specifically examines, contextually, the level of misuse of the World Wide Web in three selected Western Australian organisations, using multiple interpretive case studies as the vehicle for the study. The research is significant internationally to all organisations that use the Internet in their everyday work. The research discovered anomalous behaviour on the part of non-business users, who employed a variety of techniques and tactics to mask their activities. Also, organisational management in the cases examined demonstrated shortfalls both in their perception of misuse within their organisations and in the implementation of effective policy.
APA, Harvard, Vancouver, ISO, and other styles
36

Martinus, Ian. "Can B2G portals be used effectively to stimulate business in SMEs?: A case analysis of the 2Cities Business To Government portal." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2004. https://ro.ecu.edu.au/theses/1611.

Full text
Abstract:
Small and Medium Enterprises (SMEs) have many options when purchasing goods or services. These include personal contacts and networks, familiar centralised supply sources and other ad hoc means. One purchasing possibility is to buy from and sell to other businesses within a similar geographic area, though the benefits of buying and selling locally may not occur to SMEs. They seek, like other consumers, to get value for money, fast and efficient service, and a reasonable level of quality, and many factors can impinge upon an SME's decision to purchase locally. It can be assumed that, given a reasonable local option, SMEs wish to buy from and sell to other local businesses. It can also reasonably be expected that if government purchasers were willing to purchase within their geographic area, SMEs would be interested in supplying local government as well. This study investigates SMEs in the Wanneroo and Joondalup regions of Western Australia and considers the factors that may influence their decision to use the 2Cities Business-to-Government (B2G) portal. The study is concerned with gaining insight into particular phenomena from the participants' (SME) perspective, with the researcher as the primary instrument for data collection and analysis; it requires the researcher to get close to the natural setting of the study and interact with the small business owners. This study triangulated results from three major sources. One source of data was contemporary Wanneroo and Joondalup secondary data gathered from research reports relating to local SME matters. This was combined with semi-structured interviews of forty SMEs and two focus groups. Participant SMEs were invited to discuss factors affecting their decision to use or not use the 2Cities B2G portal. SMEs have a clear perception of what impedes and assists them in running their business, and this comes through strongly. 
The problem facing the 2Cities portal management board is the extent to which it can influence the SME decision to buy and sell within the local area using the portal. The results form the basis of an improved model for B2G participation.
APA, Harvard, Vancouver, ISO, and other styles
37

Dosciatti, Eden Ricardo. "Uma nova arquitetura para provisão de QoS utilizando enxame de partículas em redes WiMAX fixas." Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1309.

Full text
Abstract:
Technological advances in recent years have led to growth in the user base of communication networks, especially for multimedia services such as IPTV, videoconferencing and VoIP. As these services require more resources and place great demand on the network infrastructure, each user's applications must be handled according to defined priorities in order to provide an acceptable level of service. Broadband wireless access networks based on the IEEE 802.16 standard, also known as WiMAX, can meet the demands of end users for access to data at any time and in any place, and for efficient broadband connectivity, offering an excellent trade-off for the end user by enabling high data transmission capacity at a relatively low deployment cost. The provision of quality of service is very important for the performance of communication networks; to achieve it, mechanisms for scheduling, connection admission control and traffic policing must be present. However, the IEEE 802.16 standard only specifies the policies these mechanisms must follow; it does not define how to implement them. This research presents a new architecture for uplink traffic at WiMAX base stations, with quality-of-service provision, using the particle swarm optimization metaheuristic to calculate the frame duration when needed, making it possible to find an ideal value and providing a better allocation of users on the network.
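Particle swarm optimization over a single variable such as the frame duration is compact enough to sketch. The cost function below (per-frame overhead that falls with longer frames against a delay penalty that grows with them) is invented for illustration and is not the model used in the thesis.

```python
import random

# Bare-bones particle swarm optimization over one variable. The cost
# trade-off is hypothetical: fixed per-frame overhead amortised over the
# frame length, against a delay penalty that grows with it.

def pso_minimize(cost, lo, hi, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Return the best position found for `cost` on the interval [lo, hi]."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest, pbest_cost = xs[:], [cost(x) for x in xs]
    g = min(range(n_particles), key=pbest_cost.__getitem__)
    gbest, gbest_cost = pbest[g], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # velocity update: inertia + pull towards personal and global bests
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to the interval
            c = cost(xs[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = xs[i], c
                if c < gbest_cost:
                    gbest, gbest_cost = xs[i], c
    return gbest

cost = lambda frame: 1.0 / frame + 0.05 * frame ** 2  # hypothetical trade-off
best = pso_minimize(cost, 0.5, 20.0)
```

For this convex toy cost the swarm settles near the analytic optimum; in the thesis's setting the cost would instead reflect the base station's scheduling and QoS constraints.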
APA, Harvard, Vancouver, ISO, and other styles
38

Srinivasan, Suman Ramkumar. "Improving Content Delivery and Service Discovery in Networks." Thesis, 2016. https://doi.org/10.7916/D8Z89CCC.

Full text
Abstract:
Production and consumption of multimedia content on the Internet is rising, fueled by the demand for content from services such as YouTube, Netflix and Facebook video. The Internet is shifting from host-based to content-centric networking. At the same time, users are shifting away from a homogeneous desktop computing environment to a heterogeneous mix of devices, such as smartphones, tablets and thin clients, all of which allow users to consume data on the move using wireless and cellular data networks. The popularity of this new class of devices has, in turn, increased demand for multimedia content by mobile users. The emergence of rich Internet applications and the widespread adoption of High Definition (HD) video have also placed greater pressure on service providers and the core Internet backbone, forcing service providers to respond to increased bandwidth use in such networks. In my thesis, I aim to provide clarity and insight into the usage of core networking protocols and multimedia consumption on both mobile and wireless networks, as well as the network core. I also present research prototypes of potential solutions to some of the problems caused by increased multimedia consumption on the Internet.
APA, Harvard, Vancouver, ISO, and other styles
39

Bowden, G. J. (Gavin James). "Forecasting water resources variables using artificial neural networks." 2003. http://web4.library.adelaide.edu.au/theses/09PH/09phb7844.pdf.

Full text
Abstract:
"February 2003." Corrigenda inserted at back. Includes bibliographical references (leaves 475-524). A methodology is formulated for the successful design and implementation of artificial neural network (ANN) models for water resources applications. Attention is paid to each of the steps that should be followed in order to develop an optimal ANN model, including: when ANNs should be used in preference to more conventional statistical models; dividing the available data into subsets for modelling purposes; deciding on a suitable data transformation; determination of significant model inputs; choice of network type and architecture; selection of an appropriate performance measure; training (optimisation) of the network's weights; and deployment of the optimised ANN model in an operational environment. The developed methodology is successfully applied to two water resources case studies: the forecasting of salinity in the River Murray at Murray Bridge, South Australia; and the forecasting of cyanobacteria (Anabaena spp.) in the River Murray at Morgan, South Australia.
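The modelling steps listed above (data division, transformation, input determination, training) can be sketched end-to-end. The synthetic series, lag count and network size below are illustrative stand-ins, not the thesis's River Murray data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a hydrological time series.
t = np.arange(400)
series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(400)

# 1. Input determination: build lagged inputs to predict the next value.
lags = 3
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

# 2. Data division: calibration and validation subsets.
split = int(0.8 * len(X))
Xtr, ytr, Xva, yva = X[:split], y[:split], X[split:], y[split:]

# 3. Data transformation: standardise with calibration statistics only.
mu, sd = Xtr.mean(0), Xtr.std(0)
Xtr, Xva = (Xtr - mu) / sd, (Xva - mu) / sd

# 4. Training: one-hidden-layer network, batch gradient descent on MSE.
H, lr = 8, 0.1
W1 = 0.5 * rng.standard_normal((lags, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal(H);          b2 = 0.0
for _ in range(3000):
    h = np.tanh(Xtr @ W1 + b1)
    err = h @ W2 + b2 - ytr
    gW2 = h.T @ err / len(ytr); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = Xtr.T @ dh / len(ytr); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# 5. Performance measure: RMSE on the held-out validation subset.
val_rmse = np.sqrt(np.mean((np.tanh(Xva @ W1 + b1) @ W2 + b2 - yva) ** 2))
```

A real application would add the remaining steps the abstract names, such as comparing network architectures and monitoring the deployed model.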
APA, Harvard, Vancouver, ISO, and other styles
40

Mahlaba, Simon Bonginkosi. "A MAC protocol for IP-based CDMA wireless networks." Thesis, 2005. http://hdl.handle.net/10413/2772.

Full text
Abstract:
The evolution of the Internet Protocol (IP) to offer quality of service (QoS) makes it a suitable core network protocol for next generation networks (NGN). The QoS features incorporated into IP will enable future IP-based wireless networks to meet the QoS requirements of various multimedia traffic. The Differentiated Services (DiffServ) architecture is a promising QoS technology due to the scalability that arises from its use of traffic flow aggregates. For this reason, a network infrastructure based on DiffServ is assumed in this dissertation. This architecture provides assured service (AS) and premium service (PrS) classes in addition to best-effort service (BE). The medium access control (MAC) protocol is one of the important design issues in wireless networks. In a wireless network carrying multimedia traffic, the MAC protocol is required to provide simultaneous support for a wide variety of traffic types, support traffic with delay and jitter bounds, and assign bandwidth in an efficient and fair manner among traffic classes. Several MAC protocols capable of supporting multimedia services have been proposed in the literature, the majority of which were designed for wireless ATM (Asynchronous Transfer Mode). The focus of this dissertation is on time division multiple access and code division multiple access (TDMA/CDMA) based MAC protocols that support QoS in IP-based wireless networks. This dissertation begins by giving a survey of wireless MAC protocols. The survey considers MAC protocols for centralised wireless networks and classifies them according to their multiple access technology as well as their method of resource sharing. A novel TDMA/CDMA-based MAC protocol incorporating techniques from existing protocols is then proposed. To provide the above-mentioned services, the bandwidth is partitioned between the AS and PrS classes. The BE class utilises the remaining bandwidth from the two classes because it does not have QoS requirements.
The protocol employs a demand assignment (DA) scheme to support traffic from the PrS and AS classes. BE traffic is supported by a random reservation access scheme with dual multiple access interference (MAI) admission thresholds. The performance of the protocol, i.e. the AS or PrS call blocking probability and the BE throughput, is evaluated through Markov analytical models and Monte Carlo simulations. Furthermore, the protocol is modified and incorporated into an IEEE 802.16 broadband wireless access (BWA) network.
Thesis (M.Sc.)-University of KwaZulu-Natal, Durban, 2005.
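The call blocking probability mentioned in the abstract can be illustrated with the classic Erlang-B recursion, a deliberate simplification of the full Markov models developed in the dissertation (a fixed channel pool, Poisson call arrivals and exponential holding times are assumed here; the `admit` target threshold is hypothetical).

```python
def erlang_b(channels, offered_load):
    """Blocking probability for `channels` servers carrying `offered_load`
    Erlangs, computed with the numerically stable Erlang-B recursion."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

def admit(channels, offered_load, target=0.02):
    """Sketch of an admission check: accept a class only while its
    predicted blocking stays within a target probability."""
    return erlang_b(channels, offered_load) <= target
```

For example, ten channels carrying 3 Erlangs block well under 1% of calls, while two channels carrying the same load block more than half of them, so the admission check would reject the latter configuration.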
APA, Harvard, Vancouver, ISO, and other styles
41

Mascarenhas, da Veiga Alves Manoel Eduardo. "Characterisation of end-to-end performance for web-based file server respositories." Thesis, 2001. http://hdl.handle.net/2440/107771.

Full text
Abstract:
Investigates the behaviour of TCP bulk file transfer application sessions in a broadband access environment. Introduces some concepts for evaluating network behaviour: a path instability parameter for analyzing different TCP connections; a minimum RTT delay and a minimum typical path for estimating path characteristics between a client and application servers.
Thesis (Ph.D.) -- University of Adelaide, Dept. of Electrical and Electronic Engineering, 2001
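The path concepts named in the abstract can be sketched from a set of RTT samples. The thesis defines its own path instability parameter; the coefficient-of-variation style index below is only a hypothetical illustration of the idea.

```python
import statistics

def min_rtt(samples_ms):
    """Minimum observed RTT: an estimate of the propagation floor of a path."""
    return min(samples_ms)

def path_instability(samples_ms):
    """Hypothetical instability index: RTT spread relative to the floor.
    Larger values suggest a less stable path between client and server."""
    floor = min(samples_ms)
    return statistics.pstdev(samples_ms) / floor
```

A stable path with nearly constant RTTs scores close to zero, while a path whose samples swing widely scores higher, which is enough to rank TCP connections for comparison.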
APA, Harvard, Vancouver, ISO, and other styles
42

Kabiwa, Tchokonte Maxime Stephane. "Development of an improved link metric for routing protocols in wireless ad-hoc networks." 2014. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001649.

Full text
Abstract:
M. Tech. Electrical engineering.
Discusses the interference- and bandwidth-adjusted ETX routing metric, which uses a logical interference model that refers to the interference arising from the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) based Medium Access Control. This approach to capturing inter-flow interference is complex and more restrictive. In this dissertation, a more realistic and less restrictive approach based on information available at the physical layer (signal strength) is used to capture the interference. In contrast to the logical interference model, this has the advantage of measuring the parameters using online data traffic. The question is whether actual capacity improvements can be achieved by considering the physical interference model.
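The ETX family of metrics referred to above is easy to state: each link's expected transmission count is derived from its forward and reverse delivery ratios, and a route's metric is the sum over its hops. The signal-strength penalty below is a hypothetical illustration of the physical-layer idea, not the metric developed in the dissertation.

```python
def etx(df, dr):
    """Expected Transmission Count for one link, where df and dr are the
    forward and reverse delivery ratios measured with probe packets."""
    if df <= 0 or dr <= 0:
        return float('inf')   # link unusable in one direction
    return 1.0 / (df * dr)

def path_metric(links):
    """Route metric: sum of per-hop ETX values over (df, dr) pairs."""
    return sum(etx(df, dr) for df, dr in links)

def interference_adjusted_etx(df, dr, rssi_dbm, floor_dbm=-90.0):
    """Hypothetical adjustment: links whose received signal strength sits
    close to the noise floor are penalised (illustration only)."""
    margin = max(rssi_dbm - floor_dbm, 1.0)
    return etx(df, dr) * (1.0 + 10.0 / margin)
```

With this shape, a route chooses fewer lossy hops, and the signal-strength term steers it away from links that are technically deliverable but interference-prone.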
APA, Harvard, Vancouver, ISO, and other styles
43

Chen, Peter John. "Australia's online censorship regime : the advocacy coalition framework and governance compared." Phd thesis, 2000. http://hdl.handle.net/1885/147789.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

London, Ray William. "Comparative data protection and security : a critical evaluation of legal standards." Thesis, 2013. http://hdl.handle.net/10500/13859.

Full text
Abstract:
This study addresses the key information technology issues of the age and their unintended consequences. The issues include social control by businesses, governments, and information age Star Chambers. The study focuses on a comparative analysis of data protection, data security, and information privacy (DPSIP) laws, regulations, and practices in five countries. The countries include Australia, Canada, South Africa, the United Kingdom, and the United States. The study addresses relevant international legal standards and justifications. This multidisciplinary analysis includes a systems thinking approach from a legal, business, governmental, policy, political theory, psychosocial, and psychological perspective. The study implements a comparative law and sociolegal research strategy. Historic, linguistic, and statistical strategies are applied. The study concludes with a next-step proposal, based on the research, for the international community, the five countries in the study, and specifically, South Africa as it has yet to enact a sound DPSIP approach.
LL. D.
APA, Harvard, Vancouver, ISO, and other styles
45

"Contingency planning models for Government agencies." University of Technology, Sydney. School of Computing Sciences, 1996. http://hdl.handle.net/2100/245.

Full text
Abstract:
This report describes a research study into the current situation within Federal, State Government and selected private sector agencies, assessing contingency plans for Information Systems and suggests models for state-wide planning against Information Systems disasters. Following a brief look at various phases of contingency plan development, the study looks into the factors that prompt organisations to prepare contingency plans. The project involved a survey of current Information Systems contingency plans in the government agencies in the states of Victoria, Western Australia, South Australia, New South Wales and in the Australian Capital Territory. It also included two major banks, an insurance company and two computer services bureaux in the private sector within New South Wales. The survey determined that particular factors play important roles in the decision by organisations to commence contingency planning. These include actual disaster experience, senior management support, auditor's comments, legal requirements, risk analysis and business impact study, economic considerations, insurance requirements, contract commitment, new staff and introduction of new hardware and software. The critical success factors in contingency planning include regular maintenance and testing of the plan. The project also discusses the current contingency planning environment within New South Wales Government agencies and suggests cost-effective models for state-wide adoption.
APA, Harvard, Vancouver, ISO, and other styles
46

London, R. W. "Comparative data protection and security : a critical evaluation of legal standards." Thesis, 2013. http://hdl.handle.net/10500/13859.

Full text
Abstract:
This study addresses the key information technology issues of the age and their unintended consequences. The issues include social control by businesses, governments, and information age Star Chambers. The study focuses on a comparative analysis of data protection, data security, and information privacy (DPSIP) laws, regulations, and practices in five countries. The countries include Australia, Canada, South Africa, the United Kingdom, and the United States. The study addresses relevant international legal standards and justifications. This multidisciplinary analysis includes a systems thinking approach from a legal, business, governmental, policy, political theory, psychosocial, and psychological perspective. The study implements a comparative law and sociolegal research strategy. Historic, linguistic, and statistical strategies are applied. The study concludes with a next-step proposal, based on the research, for the international community, the five countries in the study, and specifically, South Africa as it has yet to enact a sound DPSIP approach.
LL.D. (Laws)
APA, Harvard, Vancouver, ISO, and other styles
47

Sullivan, Clare Linda. "Digital identity: an emergent legal concept; an analysis of the role and legal nature of digital identity in a transactional context." 2009. http://hdl.handle.net/2440/54148.

Full text
Abstract:
This thesis examines the emergent legal concept of digital identity under the United Kingdom National Identity Scheme ('NIS') and its Australian counterpart, the Access Card Scheme ('ACS') proposed in 2007. The Identity Cards Act 2006 UK c 15 ('Identity Cards Act') and the Human Services (Enhanced Service Delivery) Bill (Cth) 2007 ('Access Card Bill') reveal a remarkably similar concept of identity in terms of its constitution and especially its functions. The United Kingdom scheme is currently being established, whereas the proposed Australian scheme has been shelved following a change of government late in 2007. The NIS is therefore used as the model for this study but the analysis applies to any such scheme based on digital technology, including the ACS, should it be resurrected. The emergent concept of digital identity which is the subject of this thesis arises from legislation. It is a legal construct which consists of a collection of information that is stored and transmitted in digital form, and which has specific functions under the identity scheme. In this study, the information recorded about an individual for an identity scheme is referred to as an individual's 'database identity.' Database identity consists of information prescribed by legislation. Collectively, that information comprises an individual's registered identity. Under the United Kingdom scheme, it includes an individual's name/s, gender, date and place of birth and date of death, photograph, signature and biometrics, and other information such as citizenship and residential status including residential address/es, nationality, identity card number, passport number, work permit number, driver's licence number, and administrative information such as security and verification details. Within database identity is a small subset of information which is an individual's transactional identity, that is, an individual's identity for transactional purposes.
In this study, that subset of database identity is called an individual's 'token identity'. Under the NIS, token identity consists of name, gender, date and place of birth, date of death and biometrics. Token identity is the gateway to the other information which makes up database identity, and token identity has specific functions at the time of a transaction which give it legal character. In effect, it operates as the individual's transactional 'key.' Presentation of the required token identity at the time of the transaction enables the system to recognise, and to deal with, the registered identity. This thesis is therefore not about identity in the deep philosophical sense of 'who am I?' or 'what makes me, me?' It is about a legal concept of individual identity for specific purposes under a national identity scheme. In many ways, though, the concept of digital identity which is the subject of this thesis is just as important in a modern legal context. Under a national identity scheme, the response to the question 'who am I?' is 'you are who the scheme (and in particular, the National Identity Register ('NIR')) says you are.' As the first conceptual legal analysis of identity in a transactional context, this thesis examines the functions and legal nature of database identity, and particularly token identity. Token identity has specific functions at the time of a transaction which are analysed from a legal perspective to determine whether token identity is a form of legal personality. This thesis also contends that individual personal and proprietary rights necessarily apply as a result of the functions and legal nature of this emergent concept of identity. In addition to the well-recognised right to privacy, this thesis argues that the concept gives rise to the right to identity which has been overlooked in this context.
For the first time, identity as a legal concept is distinguished from privacy which is the focus of legal scholarship and jurisprudence in this area. The right to identity is contrasted with the right to privacy, and the protection afforded in this context by those human rights in the United Kingdom is considered. The protection afforded to an individual in the United Kingdom is contrasted with the situation in Australia which does not currently have a comprehensive national human rights charter. In view of the limited protection which is currently provided to token identity by the civil law, the protection provided by the criminal law in both the United Kingdom and Australia becomes particularly significant in considering the obligations and rights which arise under the scheme. The adequacy of the criminal law in addressing the nature and consequences of the dishonest use by a person of another person's identity information is therefore also examined. Identity theft is defined and distinguished from identity fraud, having regard to the emergent concept of digital identity and the wrong and the harm caused by its misuse. In particular, the nature of token identity is examined and the consequences of its misuse by another person are considered in determining whether token identity is property which is capable of being the subject of theft and criminal damage. The thesis concludes by summarising the major insights provided by chapters 1-6 with a view to the future when national identity schemes like that of the United Kingdom, and indeed international schemes, will be commonplace and token identity routinely required for most commercial transactions. In that environment, being asked to provide one's token identity is likely to be as common and as routine as being asked one's name.
Thesis (Ph.D.) -- University of Adelaide, Law School, 2009
APA, Harvard, Vancouver, ISO, and other styles
48

Ruivo, José Miguel Costa. "Deep neural networks for image quality: a comparison study for identification photos." Master's thesis, 2018. http://hdl.handle.net/10071/18359.

Full text
Abstract:
Many online platforms allow their users to upload images to their account profile. Because a user is free to upload any image of their liking to a university or job platform, profile images occur that are not very professional or adequate in either of those contexts. Another problem associated with submitting a profile image is that even where there is some kind of control over each submitted image, that control is performed manually by someone, and that process alone can be very tedious and time-consuming, especially when there is a large influx of new users joining those platforms. Based on international compliance standards used to validate photographs for machine-readable travel documents, there are SDKs that already perform automatic classification of the quality of those photographs; however, the classification is based on traditional computer vision algorithms. With the growing popularity and powerful performance of deep neural networks, it is interesting to examine how these would perform in this task. This dissertation proposes a deep neural network model to classify the quality of profile images, and a comparison of this model against traditional computer vision algorithms with respect to the complexity of the implementation, the quality of the classifications, and the computation time associated with the classification process. To the best of our knowledge, this dissertation is the first to study the use of deep neural networks for this image quality classification task.
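A traditional computer-vision baseline of the kind the dissertation compares against can be as simple as a single quality cue. The Laplacian-variance sharpness measure below is one common example and is only illustrative; the compliance standards for travel-document photographs check many more properties, such as pose, lighting and background.

```python
import numpy as np

def sharpness(img):
    """Variance of the Laplacian response over a grayscale image:
    low values suggest a blurred (low-quality) photograph."""
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (img[i:i + 3, j:j + 3] * k).sum()
    return out.var()
```

A deep-learning counterpart would instead learn such cues from labelled examples, which is precisely the comparison the dissertation sets up: hand-crafted measures like this one versus a trained network, judged on implementation complexity, classification quality and computation time.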
APA, Harvard, Vancouver, ISO, and other styles
49

Marais, Hester 1961. "Authority control in an academic library consortium using a union catalogue maintained by a central office for authority control." Thesis, 2004. http://hdl.handle.net/10500/2546.

Full text
Abstract:
Authority control is the backbone of the library catalogue and therefore a critical library activity. Experienced staff create authority records to assist users in their quest for information. The focus of this study is on authority control as a means of co-operation in academic library consortia using a union catalogue maintained by a Central Office for Authority Control. Literature studies were conducted on three sub-problems: the development of academic library consortia in South Africa, and the various forms, characteristics and functions of academic library consortia in general; the characteristics, principles and objectives of authority control; and the functions of union catalogues, with special reference to the role of Z39.50 within virtual union catalogues. The conclusion was that existing and new authority records should be made available as widely as possible within consortia through a union catalogue. This is, however, a partial solution, because not all the libraries within a consortium have the expertise to create new authority records. Two empirical studies were conducted. A cost analysis was done to determine the cost of creating and changing authority records within academic library consortia in South Africa, in order to choose a system within which authority control can be performed effectively and speedily. Secondly, a questionnaire was sent to libraries in the United States to gather information on their experiences with regard to authority control, library co-operation in general, and virtual union catalogues. The United States was the natural choice because it can be regarded as the birthplace of modern library consortia. Inferences drawn from the information received were used to develop the structure and functions of a Central Office for Authority Control in academic library consortia in South Africa.
It was found that authority control within an academic library consortium using a union catalogue could be conducted most cost-effectively and timeously through such a Central Office for Authority Control. The purpose of the Central Office would be to co-ordinate authority control within the consortium. Pooling available resources within the consortium would keep the cost of authority control as low as possible. Libraries with the required infrastructure and expertise would have the opportunity to create authority records on behalf of other libraries and be compensated for their services. Through such a Central Office more authority records created according to mutually accepted standards would be available for sharing within the consortium.
Information Science
D.Litt. et Phil. (Information Science)
APA, Harvard, Vancouver, ISO, and other styles