Dissertations / Theses on the topic '280000 Information, Computing and Communication Sciences'

To see the other types of publications on this topic, follow the link: 280000 Information, Computing and Communication Sciences.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic '280000 Information, Computing and Communication Sciences.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Mugridge, Warwick Bruce. "Enhancements to an object-oriented programming language." Thesis, University of Auckland, 1990. http://hdl.handle.net/2292/1997.

Full text
Abstract:
The objective of this thesis has been to explore the value and limitations of Class, an object-oriented programming language, in order to further the development of the language. The pivot for this thesis is WallBrace, a code-checking system. The development of the WallBrace system is the basis of a critique of Class, and leads to a number of language extensions being proposed. An important aim in this work has been the careful integration of these enhancements with the rest of the language, avoiding unnecessary additions. A number of functional and object-oriented extensions to the language are proposed. Discrimination functions, which may be higher-order and polymorphic, add considerable functional power. Generic classes allow for abstract data types, such as sets and lists, to be defined within the language. The forms interface proposed will greatly enhance the quality of user interfaces to Class programs. An external interface will allow Class programs to communicate with files, databases, and specialist user-interface programs, such as for plan entry.
APA, Harvard, Vancouver, ISO, and other styles
2

Sluti, Donald George. "Linking process quality with performance: an empirical study of New Zealand manufacturing plants." Thesis, University of Auckland, 1992. http://hdl.handle.net/2292/2028.

Full text
Abstract:
This study was conducted to assess the impacts of quality on operational and business performance in manufacturing firms. Data were provided by 184 diversified New Zealand manufacturing plants. Quality is defined as the degree of conformance to specifications. The first phase of the research was the construction of a theoretical model to incorporate the impacts of quality on manufacturing performance, manufacturing productivity and business performance. The relationships of the model are based on the quality management literature. The second phase of the research was the design and administration of a survey instrument for the collection of empirical performance data. The data were then used to evaluate the relationships represented in the model. The final phase of the research used structural equations modelling to evaluate the relationships of the model. Quality was found to have significant and positive impacts on operational performance measures for process utilization, process output, production costs, work-in-process inventory levels and on-time delivery rate. The analysis found that change in quality level was most strongly associated with change in process utilization. The findings for the impacts of quality on operational performance were compatible with the quality management literature. The impacts of quality on business performance given by structural equations analysis were significant and positive for productivity-induced improvements of quality. Generally, the support for the impacts of quality on business performance which occur through other aspects of operational performance was not significant. The limitations of the study were specified. The implications of the findings of the study for manufacturers were reviewed, along with directions for future research.
APA, Harvard, Vancouver, ISO, and other styles
3

Gutmann, Peter. "The Design and Verification of a Cryptographic Security Architecture." Thesis, University of Auckland, 2000. http://hdl.handle.net/2292/2310.

Full text
Abstract:
A cryptographic security architecture constitutes the collection of hardware and software which protects and controls the use of encryption keys and similar cryptovariables. This thesis presents a design for a portable, flexible high-security architecture based on a traditional computer security model. Behind the API it consists of a kernel implementing a reference monitor which controls access to security-relevant objects and attributes based on a configurable security policy. Layered over the kernel are various objects which abstract core functionality such as encryption and digital signature capabilities, certificate management and secure sessions and data enveloping (email encryption). The kernel itself uses a novel design which bases its security policy on a collection of filter rules enforcing a cryptographic module-specific security policy. Since the enforcement mechanism (the kernel) is completely independent of the policy database (the filter rules), it is possible to change the behaviour of the architecture by updating the policy database without having to make any changes to the kernel itself. This clear separation of policy and mechanism contrasts with current cryptographic security architecture approaches which, if they enforce controls at all, hardcode them into the implementation, making it difficult to either change the controls to meet application-specific requirements or to assess and verify them. To provide assurance of the correctness of the implementation, this thesis presents a design and implementation process which has been selected to allow the implementation to be verified in a manner which can reassure an outsider that it does indeed function as required. In addition to producing verification evidence which is understandable to the average user, the verification process for an implementation needs to be fully automated and capable of being taken down to the level of running code, an approach which is currently impossible with traditional methods. The approach presented here makes it possible to perform verification at this level, something which had previously been classed as "beyond A1" (that is, not achievable using any known technology). The versatility of the architecture presented here has been proven through its use in implementations ranging from 16-bit microcontrollers through to supercomputers, as well as a number of unusual areas such as security modules in ATMs and cryptographic coprocessors for general-purpose computers.
Note: Updated version of the thesis now published as Gutmann, P (2004). Cryptographic security architecture: design and verification. New York: Springer. ISBN 9780387953876.
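For illustration only, the policy/mechanism separation described in this abstract can be sketched as a reference monitor that consults a replaceable database of filter rules; the class and method names below are hypothetical and are not taken from Gutmann's implementation.

import java.util.List;

// Hypothetical sketch of a reference-monitor kernel whose behaviour is
// determined entirely by an external, replaceable database of filter rules.
interface FilterRule {
    // Returns true if this rule permits the subject to perform the given
    // operation on the given object attribute.
    boolean permits(String subject, String operation, String objectAttribute);
}

final class ReferenceMonitor {
    private final List<FilterRule> policy; // the policy database (filter rules)

    ReferenceMonitor(List<FilterRule> policy) {
        this.policy = policy;
    }

    // The enforcement mechanism below never changes; swapping in a different
    // rule list changes the architecture's behaviour without touching the kernel.
    boolean accessAllowed(String subject, String operation, String objectAttribute) {
        for (FilterRule rule : policy) {
            if (!rule.permits(subject, operation, objectAttribute)) {
                return false; // any failing rule denies the request
            }
        }
        return true;
    }
}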
APA, Harvard, Vancouver, ISO, and other styles
4

Wang, Lei. "Effectiveness of text-based mobile learning applications: case studies in tertiary education : a thesis presented to the academic faculty, submitted in partial fulfilment of the requirements for the degree of Master of Information Sciences in Information Technology, Massey University." Massey University, 2009. http://hdl.handle.net/10179/1092.

Full text
Abstract:
This research focuses on developing a series of mobile learning applications for future 'beyond' classroom learning environments. The thesis describes the general use pattern of the prototype and explores the key factors that could affect users' attitudes towards potential acceptance of the mobile learning applications. Finally, this thesis explores the user acceptance of the mobile learning applications, and investigates the mobility issue and how learning activities compare when delivered through mobile learning and e-learning.
APA, Harvard, Vancouver, ISO, and other styles
5

Yang, Chun Chieh. "Evaluating online support for mobile phone selection : using properties and performance criteria to reduce information overload : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Information Systems at Massey University, Auckland, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/844.

Full text
Abstract:
The mobile phone has been regarded as one of the most significant inventions in the field of communications and information technology over the past decade. Due to the rapid growth of mobile phone subscribers, hundreds of phone models have been introduced. Therefore, customers may find it difficult to select the most appropriate mobile phone because of information overload. The aim of this study is to investigate web support for customers who are selecting a mobile phone. Firstly, all the models of mobile phones in the New Zealand market were identified by visiting shops and local websites. Secondly, a list of all the features of these mobile phones was collated from local shops, websites and magazines. This list was categorised into mobile phone properties and performance criteria. An experiment then compared three different selection support methods: A (mobile phone catalogue), B (mobile phone property selection) and C (mobile phone property and performance criteria selection). The results of the experiment revealed that selection support methods B and C had higher overall satisfaction ratings than selection support method A; both methods B and C had similar satisfaction ratings. The results also suggested that males and females select their mobile phones differently, though there was no gender preference in selection support methods.
APA, Harvard, Vancouver, ISO, and other styles
6

Thompson, Errol Lindsay. "How do they understand? Practitioner perceptions of an object-oriented program : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Education (Computer Science) at Massey University, Palmerston North, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/854.

Full text
Abstract:
In the computer science community, there is considerable debate about the appropriate sequence for introducing object-oriented concepts to novice programmers. Research into novice programming has struggled to identify the critical aspects that would provide a consistently successful approach to teaching introductory object-oriented programming. Starting from the premise that the conceptions of a task determine the type of output from the task, assisting novice programmers to become aware of what the required output should be may lay a foundation for improving learning. This study adopted a phenomenographic approach. Thirty-one practitioners were interviewed about the ways in which they experience object-oriented programming, and categories of description and critical aspects were identified. These critical aspects were then used to examine the spaces of learning provided in twenty introductory textbooks. The study uncovered critical aspects that related to the way that practitioners expressed their understanding of an object-oriented program and the influences on their approach to designing programs. The study of the textbooks revealed large variability in the coverage of these critical aspects.
APA, Harvard, Vancouver, ISO, and other styles
7

Senjov-Makohon, Natalie. "Digital immigrant teachers learning for the information age." full-text, 2009. http://eprints.vu.edu.au/2063/1/senjov_makohon.pdf.

Full text
Abstract:
This study investigated how experienced teachers learned Information and Communication Technologies (ICT) during their professional development. With the introduction of ICT, experienced teachers encountered change becoming virtually displaced persons – digital immigrants; new settlers – endeavouring to obtain digital citizenship in order to survive in the information age. In the process, these teachers moved from learning how to push buttons, to applying software, and finally to changing their practice. They learned collectively and individually, in communities and networks, like immigrants and adult learners: by doing, experimenting and reflecting on ICT. Unfortunately, for these teachers-as-pedagogues, their focus on pedagogical theory during the action research they conducted, was not fully investigated or embraced during the year-long study. This study used a participant observation qualitative methodology to follow teachers in their university classroom. Interviews were conducted and documentation collected and verified by the teacher educator. The application of Kolb's, Gardner's, and Vygotsky's work allowed for the observation of these teachers within their sociocultural contexts. Kolb's work helped to understand their learning processes and Gardner's work indicated the learning abilities that these teachers valued in the new ICT environment. Meanwhile Vygotsky's work – and in particular three concepts, uchit, perezhivanija, and mislenija – presented a richer and more informed basis to understand immigration and change. Finally, this research proposes that teachers learn ICT through what is termed a hyperuchit model, consisting of developments; action; interaction; and reflection. The recommendation is that future teacher university ICT professional learning incorporates this hyperuchit model.
APA, Harvard, Vancouver, ISO, and other styles
8

Zhang, Yang. "An empirical study on the relationship between identity-checking steps and perceived trustworthiness in online banking system use : submitted in partial fulfilment of the requirements for the Degree of Master of Information Sciences in Information Technology." Massey University, 2009. http://hdl.handle.net/10179/982.

Full text
Abstract:
Online banking systems have become more common and widely used in daily life, bringing huge changes in modern banking transaction activities and giving us a greater opportunity to access the banking system anytime and anywhere. At the same time, however, one of the key challenges that still remains is to fully resolve the security concerns associated with the online banking system. Many clients feel that online banking is not secure enough, and to increase its security levels, many banks simply add more identity-checking steps or further security measures to give users the impression of a secure online banking system. However, this is easier said than done, because we believe that more identity-checking steps could compromise the usability of the online banking system, which is an essential feature in the design of usable and useful online banking systems. Banks can simply enhance their security level with more sophisticated technologies, but this does not seem to guarantee the online banking system is in line with its key usability concern. Therefore, the research question raised in this thesis is to establish the relationships between usability, security and trustworthiness in the online banking system. To demonstrate these relationships, three experiments were carried out using a simulation of an online banking logon procedure to provide a similar online banking experience. Post questionnaires were used to measure the three concepts, i.e. usability, security and trustworthiness. The resulting analyses revealed that simply adding more identity-checking steps in the online banking system did not improve the customers' perceived security and trustworthiness, nor did the biometric security technique (i.e., fingerprints) enhance the subjective ratings of perceived security and trustworthiness. This showed that the systems designer needs to be aware that the customer's perception of the online banking system is not the same as that conceived from a technical standpoint.
APA, Harvard, Vancouver, ISO, and other styles
9

Mohanarajah, Selvarajah. "Designing CBL systems for complex domains using problem transformation and fuzzy logic : a thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer Science at Massey University, Palmerston North, New Zealand." Massey University, 2007. http://hdl.handle.net/10179/743.

Full text
Abstract:
Some disciplines are inherently complex and challenging to learn. This research attempts to design an instructional strategy for CBL systems to simplify learning certain complex domains. Firstly, problem transformation, a constructionist instructional technique, is used to promote active learning by encouraging students to construct more complex artefacts based on less complex ones. Scaffolding is used at the initial learning stages to alleviate the difficulty associated with complex transformation processes. The proposed instructional strategy brings various techniques together to enhance the learning experience. A functional prototype is implemented with Object-Z as the exemplar subject. Both objective and subjective evaluations using the prototype indicate that the proposed CBL system has a statistically significant impact on learning a complex domain. CBL systems include Learner models to provide adaptable support tailored to individual learners. Bayesian theory is generally used to manage uncertainty in Learner models. In this research, a fuzzy-logic-based locally intelligent Learner model is utilized. The fuzzy model is simple to design and implement, and easy to understand and explain, as well as efficient. Bayesian theory is used to complement the fuzzy model. Evaluation shows that the accuracy of the proposed Learner model is statistically significant. Further, opening the Learner model reduces uncertainty, and the fuzzy rules are simple and resemble human reasoning processes. Therefore, it is argued that opening a fuzzy Learner model is both easy and effective. Scaffolding requires formative assessments. In this research, a confidence-based multiple-test marking scheme is proposed as traditional schemes are not suitable for measuring partial knowledge. Subjective evaluation confirms that the proposed scheme is effective. Finally, a step-by-step methodology to transform simple UML class diagrams to Object-Z schemas is designed in order to implement problem transformation. This methodology could be extended to implement a semi-automated translation system for UML to Object Models.
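As a minimal sketch only (membership functions, category names and thresholds are assumptions, not taken from the thesis), a fuzzy Learner model of this kind maps an assessment score onto overlapping proficiency categories rather than a single hard label, and fuzzy rules then fire with the corresponding membership strength:

// Hypothetical fragment of a fuzzy learner model in Java.
public class FuzzyLearnerModel {
    // Full membership below b, falling linearly to zero at c.
    static double leftShoulder(double x, double b, double c) {
        if (x <= b) return 1.0;
        if (x >= c) return 0.0;
        return (c - x) / (c - b);
    }

    // Zero below a, rising linearly to full membership at b.
    static double rightShoulder(double x, double a, double b) {
        if (x <= a) return 0.0;
        if (x >= b) return 1.0;
        return (x - a) / (b - a);
    }

    // Triangular membership with feet at a and c and peak at b.
    static double triangular(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0.0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    public static void main(String[] args) {
        double score = 62.0; // assumed formative-assessment score out of 100
        double novice = leftShoulder(score, 30, 50);
        double competent = triangular(score, 40, 60, 80);
        double expert = rightShoulder(score, 70, 90);
        // A rule such as "IF learner IS competent THEN reduce scaffolding"
        // fires with strength equal to the 'competent' membership value.
        System.out.printf("novice=%.2f competent=%.2f expert=%.2f%n",
                novice, competent, expert);
    }
}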
APA, Harvard, Vancouver, ISO, and other styles
10

Ashwell, Douglas James. "Reflecting diversity or selecting viewpoints : an analysis of the GM debate in New Zealand's media 1998-2002 : a thesis presented in partial fulfilment of the requirements for the degree of PhD in Communication at Massey University, Palmerston North, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1200.

Full text
Abstract:
The advent of genetically modified (GM) food in New Zealand in 1998 saw the beginning of a highly politicised debate about GM use in the country. The concern over GM and the political situation after the 1999 general election saw the Government establish a Royal Commission of Inquiry on Genetic Modification in May, 2000. The Royal Commission and strong public opposition to GM, evident in large public protests and other actions, made the issue highly newsworthy. The aim of this study was to explore how newspapers reported the GM debate, in particular, examining whether the reportage facilitated greater public debate and awareness about GM through journalists adhering to the ideals of the theory of social responsibility and enacting their watchdog role as encapsulated in the Fourth Estate tradition of the media. To achieve these aims the overall tone of the reportage and also which news source types and themes were most frequently reported were examined. In addition, the relationship and perceptions of scientists and journalists involved in the reporting were explored to examine how these relationships may have influenced the reportage. Content analysis showed the reportage had a pro-GM bias with policy-makers, scientists and industry spokespeople the most frequently cited news sources. The themes of Science, Economics and Politics dominated the reportage. Other source types and themes were less represented, especially themes dealing with ethical and environmental arguments. This lack of representation occurred despite the Royal Commission offering a space for all interested parties to speak. The interviews illustrated that scientists believed the quality of newspaper coverage of GM lacked depth and that important issues were unreported. Journalists found the issue complex to report and said they took care not to oversimplify the science and issues surrounding GM. The relationship between scientists and journalists indicated particular tensions existing between the two groups. The thesis concludes that if robust public debate is to occur within New Zealand regarding GM and other scientific developments, then the media should reflect a greater diversity of opinion by citing other potential news sources offering alternative arguments based on, for example, ethical or environmental grounds.
APA, Harvard, Vancouver, ISO, and other styles
11

Jonnavithula, Lalitha. "Improving the interfaces of online discussion forums to enhance learning support : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Information Systems at Massey University, Palmerston North, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/968.

Full text
Abstract:
This thesis describes research aimed at improving the interfaces of online discussion forums (ODFs) in relation to their functional support to enhance learning. These ODFs form part of almost all Learning Management Systems (LMSs) such as WebCT, Moodle and Blackboard, which are widely used in education nowadays. Although ODFs are identified as valuable sources for learning, their interfaces are limited in terms of providing support to students, such as in the areas of managing their postings as well as in facilitating them to quickly locate and obtain specified information. In addition, these systems lack features to support inter-institutional cooperation that could potentially increase knowledge sharing between students and educators of different institutions. The interface design objective of this study therefore was to explore and overcome the limitations identified above, and enhance the effectiveness and efficiency of ODFs' support to learning. Using a task-centred design approach, the required features were developed and implemented in a working prototype called eQuake (electronic Question answer knowledge environment). eQuake is a shared online discussion forum system developed as an add-on to a well-known open-source e-learning platform (Moodle). This system was intended for use among inter-institutional students in New Zealand tertiary institutions that teach similar courses. The improved interface functionalities of eQuake are expected to enhance learning support in terms of widening communication among users, increasing the knowledge base, providing existing matching answer(s) quickly to students, and exposing students to multiple perspectives. This study considers such improvements to ODF interfaces as vital to enable users to enjoy the benefits of a technology-mediated environment. The perceived usefulness and ease-of-use of improved features in eQuake were evaluated using a quantitative experimental research method. The evaluation was conducted at three tertiary institutions in New Zealand, and the overall results indicated a positive response, although some suggestions for improvement were made in the evaluation. This thesis presents a review of the related literature, describes the design and development of a user interface, followed by its implementation in eQuake, and a description of the evaluation. The thesis concludes with recommendations for better interface design of ODFs and provides suggestions for future research in this area.
APA, Harvard, Vancouver, ISO, and other styles
12

Rountree, Richard John. "Novel technologies for the manipulation of meshes on the CPU and GPU : a thesis presented in partial fulfilment of the requirements for the degree of Masters of Science in Computer Science at Massey University, Palmerston North, New Zealand." Massey University, 2007. http://hdl.handle.net/10179/700.

Full text
Abstract:
This thesis relates to research and development in the field of 3D mesh data for computer graphics. A review of existing storage and manipulation techniques for mesh data is given, followed by a framework for mesh editing. The proposed framework combines complex mesh editing techniques, automatic level of detail generation and mesh compression for storage. These methods work coherently due to the underlying data structure. The problem of storing and manipulating data for 3D models is a highly researched field. Models are usually represented by sparse mesh data which consists of vertex position information, the connectivity information to generate faces from those vertices, surface normal data and texture coordinate information. This sparse data is sent to the graphics hardware for rendering but must be manipulated on the CPU. The proposed framework is based upon geometry images and is designed to store and manipulate the mesh data entirely on the graphics hardware. By utilizing the highly parallel nature of current graphics hardware and new hardware features, new levels of interactivity with large meshes can be gained. Automatic level of detail rendering can be used to allow models upwards of 2 million polygons to be manipulated in real time while viewing a lower level of detail. Through the use of pixel shaders the high detail is preserved in the surface normals while geometric detail is reduced. A compression scheme is then introduced which utilizes the regular structure of the geometry image to compress the floating-point data. A number of existing compression schemes are compared as well as custom bit packing. This is a TIF-funded project which is partnered with Unlimited Realities, a Palmerston North software development company. The project was to design a system to create, manipulate and store 3D meshes in a compressed and easy-to-manipulate manner. The goal is to create the underlying technologies to allow a 3D modelling system to become integrated into the Umajin engine, not to create a user interface/stand-alone modelling program. The Umajin engine is a 3D engine created by Unlimited Realities which has a strong focus on multimedia. More information on the Umajin engine can be found at www.umajin.com. In this project we propose a method which gives the user the ability to model with the high level of detail found in packages aimed at creating offline renders but create models which are designed for real-time rendering.
APA, Harvard, Vancouver, ISO, and other styles
13

Zhang, Jun. "Using computers to facilitate formative assessment of open-ended written assignments : a thesis presented in partial fulfillment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand." Massey University. Institute of Information Sciences and Technology, 2005. http://hdl.handle.net/10179/245.

Full text
Abstract:
This thesis presents an e-learning solution to facilitate formative assessment of electronically submitted open-ended written assignments. It is widely accepted that formative assessment is highly beneficial to student learning. A number of researchers are active in developing specialized approaches and software systems for assisting formative assessment of student work. However, no comprehensive e-learning solution exists for facilitating formative assessment of students' open-ended written work. The project presented in this thesis has developed a new approach for using computers to facilitate formative assessment of electronically submitted open-ended written assignments. Based on a literature review of the education theories around formative assessment and current computer software technologies, this project has developed three principles for e-learning support for formative assessment of open-ended written assignments: 1. It needs to facilitate all the activities that are potentially required for formative assessment of student assignments (for example, the creation of assessment criteria, the submission of assignments, and the analysis of the assessment results), not only the marking activity to create feedback on assignments. 2. It needs to provide an onscreen marking tool which enables human markers to mark open-ended written assignments in an intuitive and efficient way by replicating their paper-based assessment approaches. 3. It needs to provide a generic solution for facilitating formative assessment of open-ended written assignments from all disciplines, not a limited solution restricted to some specific domains (for example, computer science or business courses). Based on these principles, a specification of an e-learning system for facilitating formative assessment of open-ended written assignments was developed and a system was implemented accordingly. This system, called the Written Assignment Assessment (WAA) system, has already been used in the assignment marking of several courses at Massey University.
APA, Harvard, Vancouver, ISO, and other styles
14

Steele, Aaron. "Ontological lockdown assessment : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Information Technology at Massey University, Palmerston North, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/946.

Full text
Abstract:
In order to keep shared access computers secure and stable, system administrators resort to locking down the computing environment to prevent intentional and unintentional damage by users. Skilled attackers are often able to break out of locked down computing environments and intentionally misuse shared access computers. This misuse has resulted in cases of mass identity theft and fraud, some of which have had an estimated cost in the millions. In order to determine whether it is possible to break out of locked down computing environments, an assessment method is required. Although a number of vulnerability assessment techniques exist, none of the existing techniques is sufficient for assessing locked down shared access computers. This is due to the existing techniques focusing on traditional, application-specific software vulnerabilities. Break out path vulnerabilities (which are exploited by attackers in order to break out of locked down environments) differ substantially from traditional vulnerabilities, and as a consequence are not easily discovered using existing techniques. Ontologies can be thought of as a modelling technique that can be used to capture expert knowledge about a domain of interest. The method for discovering break out paths in locked down computers can be considered expert knowledge in the domain of shared access computer security. This research proposes an ontology-based assessment process for discovering break out path vulnerabilities in locked down shared access computers. The proposed approach is called the ontological lockdown assessment process. The ontological lockdown assessment process is implemented against a real-world system and successfully identifies numerous break out path vulnerabilities.
APA, Harvard, Vancouver, ISO, and other styles
15

Zhao, Yue. "Modelling avian influenza in bird-human systems : this thesis is presented in the partial fulfillment of the requirement for the degree of Masters of Information Science in Mathematics at Massey University, Albany, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1145.

Full text
Abstract:
In 1997, the first human case of avian influenza infection was reported in Hong Kong. Since then, avian influenza has become more and more hazardous for both animal and human health. Scientists believed that it would not take long before the virus mutated to become contagious from human to human. In this thesis, we construct models of avian influenza with possible mutation scenarios in bird-human systems. Possible control measures for humans are also introduced into the systems. We compare the analytical and numerical results and try to find the most efficient control measures to prevent the disease.
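For illustration only (the thesis's actual equations are not reproduced here; all symbols and parameters below are assumptions), a generic two-host model of this kind couples an SI-type bird population to an SIR-type human population, with a separate human-to-human transmission term representing the mutated strain:

\begin{aligned}
\dot{S}_b &= \Lambda_b - \beta_b S_b I_b - \mu_b S_b, &
\dot{I}_b &= \beta_b S_b I_b - (\mu_b + \delta_b) I_b,\\
\dot{S}_h &= \Lambda_h - \beta_{bh} S_h I_b - \beta_h S_h I_h - \mu_h S_h, &
\dot{I}_h &= \beta_{bh} S_h I_b + \beta_h S_h I_h - (\mu_h + \delta_h + \gamma) I_h,\\
\dot{R}_h &= \gamma I_h - \mu_h R_h, &&
\end{aligned}

where the subscripts b and h denote birds and humans, \beta_{bh} is the bird-to-human transmission rate, \beta_h is the human-to-human rate that becomes non-zero after mutation, and control measures (culling, quarantine, antivirals) act by lowering the relevant \beta or raising the removal rates.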
APA, Harvard, Vancouver, ISO, and other styles
16

Irie, Kenji. "Noise-limited scene-change detection in images." Diss., Lincoln University, 2009. http://hdl.handle.net/10182/1351.

Full text
Abstract:
This thesis describes the theoretical, experimental, and practical aspects of a noise-limited method for scene-change detection in images. The research is divided into three sections: noise analysis and modelling, dual illumination scene-change modelling, and integration of noise into the scene-change model. The sources of noise within commercially available digital cameras are described, with a new model for image noise derived for charge-coupled device (CCD) cameras. The model is validated experimentally through the development of techniques that allow the individual noise components to be measured from the analysis of output images alone. A generic model for complementary metal-oxide-semiconductor (CMOS) cameras is also derived. Methods for the analysis of spatial (inter-pixel) and temporal (intra-pixel) noise are developed. These are used subsequently to investigate the effects of environmental temperature on camera noise. Based on the cameras tested, the results show that the CCD camera noise response to variation in environmental temperature is complex whereas the CMOS camera response simply increases monotonically. A new concept for scene-change detection is proposed based upon a dual illumination concept where both direct and ambient illumination sources are present in an environment, such as that which occurs in natural outdoor scenes with direct sunlight and ambient skylight. The transition of pixel colour from the combined direct and ambient illuminants to the ambient illuminant only is modelled. A method for shadow-free scene-change is then developed that predicts a pixel's colour when the area in the scene is subjected to ambient illumination only, allowing pixel change to be distinguished as either being due to a cast shadow or due to a genuine change in the scene. Experiments on images captured in controlled lighting demonstrate 91% of scene-change and 83% of cast shadows are correctly determined from analysis of pixel colour change alone. A statistical method for detecting shadow-free scene-change is developed. This is achieved by bounding the dual illumination model by the confidence interval associated with the pixel's noise. Three benefits arise from the integration of noise into the scene-change detection method: - The necessity for pre-filtering images for noise is removed; - All empirical thresholds are removed; and - Performance is improved. The noise-limited scene-change detection algorithm correctly classifies 93% of scene-change and 87% of cast shadows from pixel colour change alone. When simple post-analysis size-filtering is applied both these figures increase to 95%.
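As an illustrative sketch only (notation assumed rather than taken from the thesis), the noise-limited test described above can be written as declaring a genuine scene change at pixel p only when the observed colour lies outside the noise confidence interval around the colour predicted by the ambient-only (shadow) model:

\text{scene change at } p \iff \left\lVert \mathbf{c}_p^{\text{obs}} - \hat{\mathbf{c}}_p^{\text{amb}} \right\rVert > k\,\sigma_p,

where \hat{\mathbf{c}}_p^{\text{amb}} is the pixel colour predicted under ambient illumination only, \sigma_p is the pixel's noise standard deviation from the camera noise model, and k sets the confidence level; differences within the bound are attributed to a cast shadow or to noise.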
APA, Harvard, Vancouver, ISO, and other styles
17

Senjov-Makohon, Natalie. "Digital immigrant teachers learning for the information age." Thesis, full-text, 2009. https://vuir.vu.edu.au/2063/.

Full text
Abstract:
This study investigated how experienced teachers learned Information and Communication Technologies (ICT) during their professional development. With the introduction of ICT, experienced teachers encountered change becoming virtually displaced persons – digital immigrants; new settlers – endeavouring to obtain digital citizenship in order to survive in the information age. In the process, these teachers moved from learning how to push buttons, to applying software, and finally to changing their practice. They learned collectively and individually, in communities and networks, like immigrants and adult learners: by doing, experimenting and reflecting on ICT. Unfortunately, for these teachers-as-pedagogues, their focus on pedagogical theory during the action research they conducted, was not fully investigated or embraced during the year-long study. This study used a participant observation qualitative methodology to follow teachers in their university classroom. Interviews were conducted and documentation collected and verified by the teacher educator. The application of Kolb's, Gardner's, and Vygotsky's work allowed for the observation of these teachers within their sociocultural contexts. Kolb's work helped to understand their learning processes and Gardner's work indicated the learning abilities that these teachers valued in the new ICT environment. Meanwhile Vygotsky's work – and in particular three concepts, uchit, perezhivanija, and mislenija – presented a richer and more informed basis to understand immigration and change. Finally, this research proposes that teachers learn ICT through what is termed a hyperuchit model, consisting of developments; action; interaction; and reflection. The recommendation is that future teacher university ICT professional learning incorporates this hyperuchit model.
APA, Harvard, Vancouver, ISO, and other styles
18

Punchihewa, Amal. "Synthetic test patterns and compression artefact distortion metrics for image codecs : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Engineering at Massey University, Palmerston North, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1131.

Full text
Abstract:
This thesis presents a framework of test methodology to assess spatial domain compression artefacts produced by image and intra-frame coded video codecs. Few researchers have studied this broad range of artefacts. A taxonomy of image and video compression artefacts is proposed. This is based on the point of origin of the artefact in the image communication model. This thesis presents objective evaluation of distortions known as artefacts due to image and intra-frame coded video compression made using synthetic test patterns. The American National Standards Institute document ANSI T1.801 qualitatively defines blockiness, blur and ringing artefacts. These definitions have been augmented with quantitative definitions in conjunction with the test patterns proposed. A test and measurement environment is proposed in which the codec under test is exercised using a portfolio of test patterns. The test patterns are designed to highlight the artefact under study. Algorithms have been developed to detect and measure individual artefacts based on the characteristics of the respective artefacts. Since the spatial contents of the original test patterns form known structural details, the artefact distortion metrics based on the characteristics of those artefacts are clean and swift to calculate. Distortion metrics are validated using a human vision system inspired modern image quality metric. Blockiness, blur and ringing artefacts are evaluated for representative codecs using the proposed synthetic test patterns. Colour bleeding due to image and video compression is discussed, with both qualitative and quantitative definitions for the colour bleeding artefacts introduced. The image reproduction performance of a few codecs was evaluated to ascertain the utility of the proposed metrics and test patterns.
APA, Harvard, Vancouver, ISO, and other styles
19

Bishell, Aaron. "Designing application-specific processors for image processing : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science, Massey University, Palmerston North, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/1024.

Full text
Abstract:
Implementing a real-time image-processing algorithm on a serial processor is difficult to achieve because such a processor cannot cope with the volume of data in the low-level operations. However, a parallel implementation, required to meet timing constraints for the low-level operations, results in low resource utilisation when implementing the high-level operations. These factors suggested a combination of parallel hardware, for the low-level operations, and a serial processor, for the high-level operations, for implementing a high-level image-processing algorithm. Several types of serial processors were available. A general-purpose processor requires an extensive instruction set to be able to execute any arbitrary algorithm, resulting in a relatively complex instruction decoder and possibly extra functional units (FUs). An application-specific processor, which was considered in this research, implements enough FUs to execute a given algorithm and implements a simpler, and more efficient, instruction decoder. In addition, an algorithm's behaviour on a processor could be represented in either hardware (i.e. hardwired logic), which limits the ability to modify the algorithm behaviour of a processor, or "software" (i.e. programmable logic), which enables external sources to specify the algorithm behaviour. This research investigated hardware- and software-controlled application-specific serial processors for the implementation of high-level image-processing algorithms and compared these against parallel hardware and general-purpose serial processors. It was found that application-specific processors are easily able to meet the timing constraints imposed by real-time high-level image processing. In addition, the software-controlled processors had additional flexibility, a performance penalty of 9.9% and 36.9%, and inconclusive footprint savings (and costs) when compared to hardware-controlled processors.
APA, Harvard, Vancouver, ISO, and other styles
20

Engelbrecht, Judith Merrylyn. "Electronic clinical decision support (eCDS) in primary health care: a multiple case study of three New Zealand PHOs : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Palmerston North, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1107.

Full text
Abstract:
Health care providers internationally are facing challenges surrounding the delivery of high quality, cost effective services. The use of integrated electronic information systems is seen by many people working in the health sector as a way to address some of the associated issues. In New Zealand the primary health care sector has been restructured to follow a population based care model and provides services through not-for-profit Primary Health Organisations (PHOs). PHOs, together with their District Health Boards (DHBs), contributing service providers, and local communities, are responsible for the care of their enrolled populations. The Ministry of Health (MoH) is streamlining information sharing in this environment through improvements to computer based information systems (IS). By providing health professionals with improved access to required information within an appropriate time frame, services can be targeted efficiently and effectively and patient health outcomes potentially improved. However, the adoption of IS in health care has been slower than in other industries. Therefore, a thorough knowledge of health care professionals’ attitudes to, and use of, available IS is currently needed to contribute to the development of appropriate systems. This research employs a multiple case study strategy to establish the usage of IS by three New Zealand PHOs and their member primary health care providers (PHPs), with a focus on the role of IS in clinical decision support (CDS). A mixed method approach including semi-structured interviews and postal surveys was used in the study. Firstly, the research develops and applies a survey tool based on an adaptation of an existing framework, for the study of IT sophistication in the organisations. This provides the foundation for an in-depth study of the use of computerised CDS (eCDS) in the PHO environment. Secondly, a conceptual model of eCDS utilisation is presented, illustrating the variation of eCDS use by member general practitioner (GP) practices within individual organisations. Thirdly, five areas of importance for improving eCDS utilisation within PHO’s are identified, contributing information of use to organisations, practitioners, planners, and systems developers. Lastly, the research provides a structure for the study of the domain of eCDS in PHOs by presenting a research approach and information specific for the area.
APA, Harvard, Vancouver, ISO, and other styles
21

Chetsumon, Sireerat. "Attitudes of extension agents towards expert systems as decision support tools in Thailand." Lincoln University, 2005. http://hdl.handle.net/10182/1371.

Full text
Abstract:
It has been suggested 'expert systems' might have a significant role in the future through enabling many more people to access human experts. It is, therefore, important to understand how potential users interact with these computer systems. This study investigates the effect of extension agents' attitudes towards the features and use of an example expert system for rice disease diagnosis and management (POSOP). It also considers the effect of extension agents' personality traits and intelligence on their attitudes towards its use, and the agents' perception of control over using it. Answers to these questions lead to developing better systems and to increasing their adoption. Using structural equation modelling, two models - the extension agents' perceived usefulness of POSOP, and their attitude towards the use of POSOP, were developed (Models ATU and ATP). Two of POSOP's features (its value as a decision support tool, and its user interface), two personality traits (Openness (O) and Extraversion (E)), and the agents' intelligence, proved to be significant, and were evaluated. The agents' attitude towards POSOP's value had a substantial impact on their perceived usefulness and their attitude towards using it, and thus their intention to use POSOP. Their attitude towards POSOP's user interface also had an impact on their attitude towards its perceived usefulness, but had no impact on their attitude towards using it. However, the user interface did contribute to its value. In Model ATU, neither Openness (O) nor Extraversion (E) had an impact on the agents' perceived usefulness indicating POSOP was considered useful regardless of the agents' personality background. However, Extraversion (E) had a negative impact on their intention to use POSOP in Model ATP indicating that 'introverted' agents had a clear intention to use POSOP relative to the 'extroverted' agents. Extension agents' intelligence, in terms of their GPA, had neither an impact on their attitude, nor their subjective norm (expectation of 'others' beliefs), to the use of POSOP. It also had no association with any of the variables in both models. Both models explain and predict that it is likely that the agents will use POSOP. However, the availability of computers, particularly their capacity, are likely to impede its use. Although the agents believed using POSOP would not be difficult, they still believed training would be beneficial. To be a useful decision support tool, the expert system's value and user interface as well as its usefulness and ease of use, are all crucially important to the preliminary acceptance of a system. Most importantly, the users' problems and needs should be assessed and taken into account as a first priority in developing an expert system. Furthermore, the users should be involved in the system development. The results emphasise that the use of an expert system is not only determined by the system's value and its user interface, but also the agents' perceived usefulness, and their attitude towards using it. In addition, the agents' perception of control over using it is also a significant factor. The results suggested improvements to the system's value and its user interface would increase its potential use, and also providing suitable computers, coupled with training, would encourage its use.
APA, Harvard, Vancouver, ISO, and other styles
22

Johnston, Christopher Troy. "VERTIPH : a visual environment for real-time image processing on hardware : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Systems Engineering at Massey University, Palmerston North, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1219.

Full text
Abstract:
This thesis presents VERTIPH, a visual programming language for the development of image processing algorithms on FPGA hardware. The research began with an examination of the whole design cycle, with a view to identifying requirements for implementing image processing on FPGAs. Based on this analysis, a design process was developed where a selected software algorithm is matched to a hardware architecture tailor-made for its implementation. The algorithm and architecture are then transformed into an FPGA-suitable design. It was found that in most cases the most efficient mapping for image processing algorithms is to use a streamed processing approach. This constrains how data is presented and requires most existing algorithms to be extensively modified. Therefore, the resultant designs are heavily streamed and pipelined. A visual notation was developed to complement this design process, as both streaming and pipelining can be well represented by data-flow visual languages. The notation has three views, each of which represents and supports a different part of the design process. An architecture view gives an overview of the design's main blocks and their interconnections. A computational view represents lower-level details by representing each block by a set of computational expressions and low-level controls. This includes a novel visual representation of pipelining that simplifies latency analysis, multiphase design, priming, flushing and stalling, and the detection of sequencing errors. A scheduling view adds a state machine for high-level control of processing blocks. This view extends state objects to allow for the priming and flushing of pipelined operations. User evaluations of an implementation of the key parts of this language (the architecture view and the computational view) found that both were generally good visualisations and aided in design (especially the type interface, pipeline and control notations). The user evaluations provided several suggestions for the improvement of the language, and in particular the evaluators would have preferred to use the diagrams as a verification tool for a textual representation rather than as the primary data capture mechanism. A cognitive dimensions analysis showed that the language scores highly for thirteen of the twenty dimensions considered, particularly those related to making details of the design clearer to the developer.
APA, Harvard, Vancouver, ISO, and other styles
23

Jiang, Feng. "Capturing event metadata in the sky : a Java-based application for receiving astronomical internet feeds : a thesis presented in partial fulfilment of the requirements for the degree of Master of Computer Science in Computer Science at Massey University, Auckland, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/897.

Full text
Abstract:
When an astronomical observer discovers a transient event in the sky, how can the information be immediately shared and delivered to others? Not long ago, people shared information about what they discovered in the sky by books, telegraphs, and telephones. The new way of transferring event data is via the Internet. Information about astronomical events can be packaged and published online as an Internet feed. To receive these packaged data, Internet feed listener software is required on a terminal computer. In other applications, the listener would connect to an intelligent robotic telescope network and automatically drive a telescope to capture the transient astrophysical phenomena. However, because the technologies for transferring astronomical event data are still at an early stage, the only resource available is the Perl-based Internet feed listener developed by the eSTAR team. In this research, a Java-based Internet feed listener was developed. The application supports more features than the Perl-based application. By exploiting the benefits of Java, the application is able to receive, parse and manage the Internet feed data efficiently through a friendly user interface. Keywords: Java, socket programming, VOEvent, real-time astronomy
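As a minimal sketch only (not the thesis's implementation; the host, port and one-message-per-line framing are assumptions), a Java listener for an event feed delivered over a TCP socket could look like this:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Socket;

// Minimal TCP feed listener: connects to a feed server and prints each
// received line (e.g. an XML VOEvent packet delivered one message per line).
public class FeedListener {
    public static void main(String[] args) throws Exception {
        String host = args.length > 0 ? args[0] : "localhost"; // assumed feed host
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 8099; // assumed port
        try (Socket socket = new Socket(host, port);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                // A real listener would parse the VOEvent XML and act on it;
                // here we simply log the raw message.
                System.out.println("Received event: " + line);
            }
        }
    }
}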
APA, Harvard, Vancouver, ISO, and other styles
24

Blakey, Jeremy Peter. "Database training for novice end users : a design research approach : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Albany, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/880.

Full text
Abstract:
Of all of the desktop software available, that for the implementation of a database is some of the most complex. With the increasing number of computer users having access to this sophisticated software, but with no obvious way to learn the rudiments of data modelling for the implementation of a database, there is a need for a simple, convenient method to improve their understanding. The research described in this thesis represents the first steps in the development of a tool to accomplish this improvement. In a preliminary study using empirical research a conceptual model was used to improve novice end users’ understanding of the relational concepts of data organisation and the use of a database software package. The results showed that no conclusions could be drawn about either the artefact used or the method of evaluation. Following the lead of researchers in the fields of both education and information systems, a design research process was developed, consisting of the construction and evaluation of a training artefact. A combination of design research and a design experiment was used in the main study described in this thesis. New to research in information systems, design research is a methodology or set of analytical techniques and perspectives, and this was used to develop a process (development of an artefact) and a product (the artefact itself). The artefact, once developed, needed to be evaluated for its effectiveness, and this was done using a design experiment. The experiment involved exposing the artefact to a small group of end users in a realistic setting and defining a process for the evaluation of the artefact. The artefact was the tool that would facilitate the improvement of the understanding of data modelling, the vital precursor to the development of a database. The research was conducted among a group of novice end users who were exposed to the artefact, facilitated by an independent person. In order to assess whether there was any improvement in the novices’ understanding of relational data modelling and database concepts, they then completed a post-test. Results confirmed that the artefact, trialled through one iteration, was successful in improving the understanding of these novice end users in the area of data modelling. The combination of design research and design experiment as described above gave rise to a new methodology, called experimental design research at this early juncture. The successful outcome of this research will lead to further iterations of the design research methodology, leading in turn to the further development of the artefact which will be both useful and accessible to novice users of personal computers and database software. This research has made the following original contributions. Firstly, the use of the design research methodology for the development of the artefact, which proved successful in improving novice users’ understanding of relational data structures. Secondly, the novel use of a design experiment in an information systems project, which was used to evaluate the success of the artefact. And finally, the combination of the developed artefact followed by its successful evaluation using a design experiment resulted in the hybrid experimental design research methodology. The success of the implementation of the experimental design research methodology in this information systems project shows much promise for its successful application to similar projects.
APA, Harvard, Vancouver, ISO, and other styles
25

Liu, MingHui. "Navel orange blemish identification for quality grading system : a thesis submitted in partial fulfilment of the requirements for the degree of Master of Computer Science at Massey University, Albany, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1175.

Full text
Abstract:
Each year, the world’s top orange producers output millions of oranges for human consumption. This production is projected to grow by as much as 64 million in 2010, and so the demand for fast, low-cost and precise automated orange fruit grading systems is expected to become increasingly important. There is, however, an underlying limit to most orange blemish detection algorithms. Most existing statistical-based, structural-based, model-based and transform-based orange blemish detection algorithms are plagued by the following problem: any pixels in an image of an orange having about the same magnitudes for the red, green and blue channels will almost always be classified as belonging to the same category (either a blemish or not). This, however, presents a significant problem, as the RGB components of the pixels corresponding to blemishes are very similar to those of pixels near the boundary of an orange. In light of this problem, this research utilizes a priori knowledge of the local intensity variations observed on rounded convex objects to classify the ambiguous pixels correctly. The algorithm has the effect of peeling off layers of the orange skin according to gradations of the intensity. Any abrupt discontinuities detected along successive layers therefore help to identify skin blemishes more accurately. A commercial-grade fruit inspection and distribution system was used to collect 170 navel orange images. Of these images, 100 were manually classified as good oranges by human inspection and the rest were blemished. We demonstrate the efficacy of the algorithm using these images as the benchmarking test set. Our results show that the system correctly classified 96% of the good oranges and 97% of the blemished oranges. The proposed system is easily customizable as it does not require any training. The fruit quality bands can be adjusted to meet the requirements set by market standards by specifying an agreeable percentage of blemishes for each band.
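To illustrate the layer-peeling idea described in this abstract, here is a minimal sketch (not taken from the thesis; the band count, drop threshold and synthetic test image are assumptions): the fruit region is split into intensity layers and pixels that fall abruptly below their layer's typical intensity are flagged as blemishes.

```python
# Illustrative sketch only: layer-peeling blemish detection on a synthetic
# "orange" image. The layer count and threshold are assumptions, not values
# taken from the thesis.
import numpy as np

def detect_blemishes(gray, mask, n_layers=8, drop_thresh=0.25):
    """Split the fruit region into intensity layers and flag pixels whose
    intensity falls well below their layer's typical value."""
    vals = gray[mask]
    # Layer boundaries follow the natural intensity gradation from the
    # bright centre towards the darker boundary of the convex fruit.
    edges = np.quantile(vals, np.linspace(0, 1, n_layers + 1))
    blemish = np.zeros_like(mask)
    for lo, hi in zip(edges[:-1], edges[1:]):
        layer = mask & (gray >= lo) & (gray <= hi)
        if not layer.any():
            continue
        typical = np.median(gray[layer])
        # An abrupt discontinuity within a layer is treated as a blemish.
        blemish |= layer & (gray < (1.0 - drop_thresh) * typical)
    return blemish

if __name__ == "__main__":
    # Synthetic orange: radially decreasing brightness plus one dark spot.
    yy, xx = np.mgrid[-100:100, -100:100]
    r = np.hypot(xx, yy)
    mask = r < 90
    gray = np.where(mask, 200.0 - r, 0.0)
    gray[np.hypot(xx - 30, yy - 20) < 8] = 60.0   # the blemish
    found = detect_blemishes(gray, mask)
    print("blemish pixels flagged:", int(found.sum()))
```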
APA, Harvard, Vancouver, ISO, and other styles
26

Brodie, Matthew Andrew Dalhousie. "Development of fusion motion capture for optimisation of performance in alpine ski racing : a thesis presented in fulfilment of the requirements for the degree of Doctor of Philosophy in Science at Massey University, Wellington, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/1041.

Full text
Abstract:
Fusion Motion Capture (FMC), a wearable motion capture system, was developed and applied to the optimisation of athlete performance in alpine ski racing. In what may be a world first, the three-dimensional movements of a skilled athlete (with fewer than 20 FIS points) skiing through a complete training giant slalom racecourse were analysed. FMC consists of multiple lightweight sensors attached to the athlete, including inertial measurement units (IMUs), pressure sensitive insoles and a global positioning system (GPS) receiver. The IMUs contain accelerometers, gyroscopes, and magnetometers. Limb orientation and location are obtained by mathematically combining the most reliable data from each sensor using fusion algorithms developed by the author. FMC fuses the signals from the IMUs and GPS without the need for the post filtering usually applied to motion capture data, and therefore maintains maximum bandwidth. The FMC results were stable and relatively independent of motion type and duration, unlike other inertial systems available in 2005, when the research was initiated. Analysis of data collected from an athlete skiing giant slalom contradicts the traditional 'going straight turning short' race strategy. The shortest path may not always be the fastest. Instead each gate has a different optimum approach arc. Optimum turn radius increases with both increasing speed and increasing terrain slope. The results also contradict laboratory measurements of ski/snow sliding friction and suggest that snow resistance in giant slalom is of similar importance to wind drag. In addition to gravity, the athlete increased speed using the techniques of 'lateral projection' and 'pumping'. Race performance was determined from the analysis of the athlete skiing through the entire course. FMC proved, therefore, to be more suitable than traditional optical systems that are practically limited to capturing small sections of a race course. The athlete experienced high and rapidly fluctuating torques about all three axes of the lower joints. This information could be useful in designing training programmes, racecourses and equipment to reduce knee injuries. Data driven animations and colour coded force vector diagrams were developed to enhance athlete feedback. Inline skating data was also analysed.
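The abstract describes fusing short-term inertial data with longer-term reference measurements. The following is a generic complementary-filter sketch of that idea; it is not the author's FMC algorithm, and the blend factor, sample rate and one-axis simplification are assumptions.

```python
# Illustrative sketch only: blending gyroscope integration with an
# accelerometer tilt estimate via a complementary filter.
import math

def fuse_tilt(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Estimate a single tilt angle (radians) from gyro rate (rad/s) and
    accelerometer (ax, az) samples taken at interval dt."""
    angle = 0.0
    history = []
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        gyro_angle = angle + rate * dt        # short-term, but drifts slowly
        accel_angle = math.atan2(ax, az)      # long-term reference, but noisy
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
        history.append(angle)
    return history

if __name__ == "__main__":
    n = 500
    gyro = [0.1] * n                          # constant 0.1 rad/s rotation
    accel = [(math.sin(0.1 * i * 0.01), math.cos(0.1 * i * 0.01)) for i in range(n)]
    est = fuse_tilt(gyro, accel)
    print("final tilt estimate (rad):", round(est[-1], 3))
```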
APA, Harvard, Vancouver, ISO, and other styles
27

Sun, Zhibin. "Application of artificial neural networks in early detection of Mastitis from improved data collected on-line by robotic milking stations." Lincoln University, 2008. http://hdl.handle.net/10182/665.

Full text
Abstract:
Two types of artificial neural networks, Multilayer Perceptron (MLP) and Self-organizing Feature Map (SOM), were employed to detect mastitis for robotic milking stations using preprocessed data relating to electrical conductivity and milk yield. The SOM was developed to classify the health status into three categories: healthy, moderately ill and severely ill. The clustering results were successfully evaluated and validated by using statistical techniques such as K-means clustering, ANOVA and Least Significant Difference. The results show that the SOM could be used in robotic milking stations as a detection model for mastitis. For developing the MLP models, a new mastitis definition based on higher EC and lower quarter yield was created, and the Principal Components Analysis technique was adopted to address the problem of multi-collinearity present in the data. Four MLPs with four combined datasets were developed, and the results showed that the PCA-based MLP model is superior to the non-PCA-based models in many respects, such as lower complexity and higher predictive accuracy. The overall correct classification rate (CCR), sensitivity and specificity of the model were 90.74%, 86.90% and 91.36%, respectively. We conclude that the PCA-based model developed here can improve the accuracy of prediction of mastitis by robotic milking stations.
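A minimal sketch of a PCA-based MLP classifier in the spirit of the approach described above is shown below. The synthetic data, component count and network size are assumptions, not the thesis's settings or results.

```python
# Illustrative sketch only: PCA followed by a multilayer perceptron,
# addressing collinear input features before classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Fake quarter-level records: conductivity/yield features that are
# deliberately correlated (the multi-collinearity that PCA addresses).
n = 600
base = rng.normal(size=(n, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(n, 1)) for _ in range(6)])
y = (base[:, 0] + 0.5 * rng.normal(size=n) > 0.8).astype(int)  # 1 = mastitis

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(
    PCA(n_components=2),                      # remove collinearity first
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```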
APA, Harvard, Vancouver, ISO, and other styles
28

Tian, Yuan. "Simulation for LEGO Mindstorms robotics." Lincoln University, 2008. http://hdl.handle.net/10182/304.

Full text
Abstract:
The LEGO® MINDSTORMS® toolkit can be used to help students learn basic programming and engineering concepts. Software that is widely used with LEGO MINDSTORMS is ROBOLAB, developed by Professor Chris Rogers from Tufts University, Boston, United States. It has been adopted in about 10,000 schools in the United States and other countries. It is used to program LEGO MINDSTORMS robotics in its icon-based programming environment. However, this software does not provide debugging features for LEGO MINDSTORMS programs: users cannot test a program before downloading it into the LEGO robotics hardware. In this project, we develop a simulator for LEGO MINDSTORMS to simulate the motions of LEGO robotics in a virtual 3D environment. We use ODE (Open Dynamics Engine) and OpenGL, combined with ROBOLAB. The simulator allows users to test their ROBOLAB programs before downloading them into the LEGO MINDSTORMS hardware. Users who do not have the hardware may use the simulator to learn ROBOLAB programming skills, which can be tested and debugged using the simulator. The simulator can track and display program execution as the simulation runs. This helps users to learn and understand basic robotics programming concepts. An introduction to the overall structure and architecture of the simulator is given, followed by a detailed description of each component in the system. This presents the techniques that are used to implement each feature of the simulator. Discussions based on several test results are then given. These lead to the conclusion that the simulator is able to accurately represent the actions of robots under certain assumptions and conditions.
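For illustration only, the sketch below shows the kind of virtual testing such a simulator enables: a fixed-timestep loop executes a small motor program for a differential-drive robot and traces its state as it runs. It uses simple kinematics rather than ODE, and every name and value is an assumption rather than part of the thesis's system.

```python
# Illustrative sketch only: simulate a differential-drive robot executing a
# motor program, tracing execution step by step.
import math

def simulate(program, dt=0.05, wheel_base=0.12):
    x = y = heading = 0.0
    for left_speed, right_speed, duration in program:   # m/s, m/s, seconds
        for _ in range(int(duration / dt)):
            v = (left_speed + right_speed) / 2.0          # forward speed
            w = (right_speed - left_speed) / wheel_base   # turn rate
            x += v * math.cos(heading) * dt
            y += v * math.sin(heading) * dt
            heading += w * dt
        # Tracing each instruction mirrors the simulator's ability to
        # display program execution as the simulation runs.
        print(f"after ({left_speed}, {right_speed}, {duration}s): "
              f"x={x:.2f} y={y:.2f} heading={math.degrees(heading):.0f} deg")
    return x, y, heading

if __name__ == "__main__":
    # Drive forward, turn roughly 90 degrees, drive forward again.
    simulate([(0.2, 0.2, 2.0), (0.1, -0.1, 0.95), (0.2, 0.2, 2.0)])
```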
APA, Harvard, Vancouver, ISO, and other styles
29

Deng, Yanbo. "Using web services for customised data entry." Master's thesis, Lincoln University. Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080313.185408/.

Full text
Abstract:
Scientific databases often need to be accessed from a variety of different applications. There are usually many ways to retrieve and analyse data already in a database. However, it can be more difficult to enter data which has originally been stored in different sources and formats (e.g. spreadsheets, other databases, statistical packages). This project focuses on investigating a generic, platform independent way to simplify the loading of databases. The proposed solution uses Web services as middleware to supply essential data management functionality such as inserting, updating, deleting and retrieval of data. These functions allow application developers to easily customise their own data entry applications according to local data sources, formats and user requirements. We implemented a Web service to support loading data to the Germinate database at the New Zealand Institute of Crop & Food Research (CFR). We also provided language specific client toolkits to help developers invoke the Web service. The toolkits allow applications to be easily customised for different platforms. In addition, we developed sample applications to help end users load data from their project data sources via the Web service. The Web service approach was evaluated through user and developer trials. The feedback from the developer trial showed that using Web services as middleware is a useful approach to allow developers and competent end users to customise data entry with minimal effort. More importantly, the customised client applications enabled end users to load data directly from their project spreadsheets and databases. It significantly reduced the effort required for exporting or transforming the source data.
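As an illustration of the middleware idea described above, the sketch below shows a thin client for a hypothetical data-entry web service offering generic insert/update/delete operations. The endpoint URL and JSON payload shape are assumptions, not the actual Germinate service interface developed in the thesis, and the demo builds a request without sending it.

```python
# Illustrative sketch only: a customisable client for a hypothetical
# data-entry web service.
import json
import urllib.request

class DataEntryClient:
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def _payload(self, action, table, record):
        return {"action": action, "table": table, "record": record}

    def insert(self, table, record):
        return self._payload("insert", table, record)

    def send(self, payload):
        """POST the payload to the service (requires a live endpoint)."""
        req = urllib.request.Request(
            self.endpoint,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

if __name__ == "__main__":
    client = DataEntryClient("http://example.org/dataentry")  # hypothetical URL
    # A spreadsheet row mapped onto the service's generic insert operation.
    row = {"accession": "A123", "trait": "grain_weight", "value": 42.1}
    print(json.dumps(client.insert("phenotype", row), indent=2))
```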
APA, Harvard, Vancouver, ISO, and other styles
30

Kirk, Diana Caroline. "Flexible software process model." 2007. http://hdl.handle.net/2292/4228.

Full text
Abstract:
Many different kinds of process are used to develop software intensive products, but there is little agreement as to which processes give the best results under which circumstances. Practitioners and researchers believe that project outcomes would be improved if the development process was constructed according to project-specific factors. In order to achieve this goal, greater understanding of the factors that most affect outcomes is needed. To improve understanding, researchers build models of the process and carry out studies based on these models. However, current models contain many ambiguities and assumptions, and so it is not clear what the results of the studies mean. The statement of this thesis is that it is possible to create an abstraction of the software development process that will provide a mechanism for comparing software processes and software process models. The long term goal of the research is to provide planners with a means of tailoring the development process on a project by project basis, with the aim of reducing risk and improving outcomes.
APA, Harvard, Vancouver, ISO, and other styles
31

Thornber, Michael John. "Square pegs and round holes: application of ISO 9000 in healthcare." 2002. http://hdl.handle.net/2292/2180.

Full text
Abstract:
This research examines the application of the ISO 9000 model for quality management in healthcare. Exploratory case study is made of three healthcare provider organisations: community health service; independent practitioner association; Maori health network. Three research models are developed to examine identified gaps and areas of interest in healthcare quality management literature. The first model relates to differences between generic standards and specification standards. The second model relates to the fit of healthcare service delivery systems and ISO 9000. The third model relates to exploration of the linkages and co-ordination of an integrated care delivery network. One proposition and two hypotheses are developed in relation to the models, and are closely associated with gaps in healthcare service quality knowledge. Strong support is found for the first hypothesis though not the second hypothesis, and there are also some unexpected results. There is strong support that the process of implementing the ISO 9000 model will enhance healthcare management performance, even though the outcomes are unpredictable. There are indications supporting the notion that implementation of the ISO 9000 model will increase effective linkages and co-ordination within integrated care delivery networks. The body of evidence accumulated during the study did not, however, permit a valid conclusion regarding the hypothesis. The findings of the study can be extended to other healthcare service areas and through interpretation and extrapolation they add value to healthcare service quality research in general. In particular, the findings of the three case studies in this research suggest that future models for healthcare service quality should include a comprehensive generic model for quality management of individual and integrated healthcare service organisations.
APA, Harvard, Vancouver, ISO, and other styles
32

Tan, Felix B. "Business-IT Alignment and Shared Understanding Between Business and IS Executives: A Cognitive Mapping Investigation." 2001. http://hdl.handle.net/2292/2228.

Full text
Abstract:
Achieving and sustaining business-IT alignment in organisations continues to be a management challenge into the new millennium. As organisations strive toward this end, researchers are attempting to better understand the alignment phenomenon. Empirical research into business-IT alignment is dominated by studies examining the relationship between business strategy, information technology and performance. Investigations into the factors enabling or inhibiting alignment are emerging. This research has traditionally taken a behavioural perspective. There is evidence of little research that examines the issue through a cognitive lens. This thesis builds on and extends the study of business-IT alignment by investigating the cognition of the key stakeholders of the alignment process - business and IS executives. Drawing on Personal Construct Theory (Kelly, 1955), this study uses a cognitive mapping methodology known as the repertory grid technique to investigate two questions: i) is there a positive relationship between business-IT alignment and shared understanding between business and IS executives?; and ii) are there differences in the cognitive maps of business and IS executives in companies that report high business-IT alignment and those that report low business-IT alignment? Shared understanding is defined as cognition that is held in common between and that which is distributed amongst business and IS executives. It is portrayed in the form of a cognitive map for each company. The study proposes that business-IT alignment is directly related to the shared understanding between business and IS executives and that the cognitive maps of these executive groups are less diverse in companies that report a high level of alignment. Eighty business and IS executives from six companies were interviewed. Cognitive maps were elicited from the research participants from which diversity between cognitive maps of business and IS executives are measured. A collective cognitive map was produced to illustrate the quality of the shared understanding in each company. The state of business-IT alignment in each company was also measured. The results of the study suggest that there is a strong positive link between business-IT alignment and shared understanding between business and IS executives. As expected, companies with a high-level of business-IT alignment demonstrate high quality shared understanding between its business and IS executives as measured and portrayed by their collective cognitive maps. The investigation further finds significant diversity in the structure and content of the cognitive maps of these executive groups in companies reporting a low-level of alignment. This study concludes that shared understanding, between business and IS executives, is important to business-IT alignment. Reconciling the diversity in the cognitive maps of business and IS executives is a step toward achieving and sustaining alignment. Practical approaches to developing shared understanding are proposed. A methodology to aid organisations in assessing shared understanding between their business and IS executives is also outlined. Finally research on business-IT alignment continues to be a fruitful and important field of IS research. This study suggests that the most interesting issues are at the interface between cognition and behaviour. 
The process of business-IT alignment in organisations is characterised by the individuality and commonality in the cognition of key stakeholders, its influence on the behaviour of these members and hence the organisational action taken.
APA, Harvard, Vancouver, ISO, and other styles
33

Day, Karen Jean. "Supporting the emergence of a shared services organisation: Managing change in complex health ICT projects." 2008. http://hdl.handle.net/2292/2476.

Full text
Abstract:
Although there is a high risk of failure in the implementation of ICT projects (which appears to extend to health ICT projects), we continue to implement health information systems in order to deliver quality, cost-effective healthcare. The purpose of the research was to participate in and study the change management as a critical success factor in health ICT projects, and to examine people’s responses to change so as to develop understanding and theory that could be used in future change management programmes. The research was conducted within the context of a large infrastructure project that resulted from the emergence of a shared services organisation (from two participating District Health Boards in Auckland, New Zealand). Action research (AR) formed the basis of the methodology used, and provided the foundation for a change management programme: the AR intervention. Grounded theory (GT) was used for some of the data analysis, the generation of themes by means of constant comparison and the deeper examination of the change process using theoretical sampling. AR and GT together supported the development of theory regarding the change process associated with health ICT projects. Health ICT projects were revealed in the findings as exhibiting the properties of complex adaptive systems. This complexity highlighted the art of change management as a critical success factor for such projects. The fabric of change emerged as a composite of processes linked to project processes and organisational processes. The turning point in the change process from the before state to the after state is marked by a capability crisis which requires effective patterns of leadership, sensitive targeting of communication, effective learning, and management of increased workload and diminishing resources during the course of health ICT projects. A well managed capability crisis period as a component of change management can substantially contribute to health ICT project success.
APA, Harvard, Vancouver, ISO, and other styles
34

Gutierrez, Jairo A. "Multi-Vendor System Network Management: A Roadmap for Coexistence." 1997. http://hdl.handle.net/2292/1970.

Full text
Abstract:
As computer networks become more complex and more heterogeneous (often involving systems from multiple vendors), the importance of integrated network management increases. This thesis summarises the efforts of research carried out 1) to identify the characteristics and requirements of an Integrated Network Management Environment (INME) and its individual components, 2) to propose a model to represent the INME, 3) to demonstrate the validity of the model, 4) to describe the steps needed to formally specify the model, and 5) to suggest an implementation plan for the INME. One of the key aspects of this thesis is the introduction of three different and complementary models used to integrate the emerging OSI management standards with the tried-and-proven network management solutions promoted by the Internet Activities Board. The Protocol-Oriented Network Management Model is used to represent the existing network management supported by the INME, i.e., OSI and Internet-based systems. The Element-Oriented Network Management Model represents the components that are used within individual network systems. It describes the managed objects and the platform application program interfaces (APIs). This model also includes the translation mechanisms needed to support the interaction between OSI managers and Internet agents. The Interoperability Model is used to represent the underlying communications infrastructure supporting network management. The communications between agents and managers are represented with this model by using the required protocol stacks (OSI or TCP/IP), and by depicting the interconnection between the entities using the network management functions. This three-pronged classification provides a richer level of abstraction, facilitating the coexistence of the standard network management systems, allowing different levels of modelling complexity, and improving access to managed objects. The ultimate goal of this thesis is to describe a framework that assists developers of network management applications in the process of integrating their solutions with an open systems network management platform. This framework will also help network managers to minimise the risks involved in the transition from first generation network management systems to more integrated alternatives as they become available.
APA, Harvard, Vancouver, ISO, and other styles
35

Costain, Gay. "Cognitive Support during Object-Oriented Software Development: The Case of UML Diagrams." 2008. http://hdl.handle.net/2292/2603.

Full text
Abstract:
The Object Management Group (OMG) accepted the Unified Modelling Language (UML) as a standard in 1997, yet there is sparse empirical evidence to justify its choice. This research aimed to address that lack by investigating the modification of programs for which external representations, drawn using the UML notations most commonly used in industry, were provided. The aim of the research was to discover if diagrams using those UML notations provided the modifying programmer with cognitive support. The application of the use of modelling to assist program modification was chosen as a result of interviews that were carried out in New Zealand and North America to discover how workers in the software industry used modelling, and if so, whether UML notation satisfied their needs. The most preferred UML diagrams were identified from the interviews. A framework of modelling use in software development was derived. A longitudinal study at a Seattle-based company was the source that suggested that program modification should be investigated. The methodology chosen for the research required subjects to modify two non-trivial programs, one of which was supplied with UML documentation. There were two aspects to the methodology. First, the subjects’ performances with and without the aid of UML documentation were compared. Modifying a program is an exercise in problem solving which is a cognitive activity. If the use of UML improved subjects’ performances then it could be said that the UML had aided the subjects’ cognition. Second, concurrent verbal protocols were collected whilst the subjects modified the programs. The protocols for the modification with UML documentation, for ten of the more successful subjects, were transcribed and analysed according to a framework derived from the literature. The framework listed the possible cognitive steps involved in problem solving where cognition could be distributed to and from external representations. The categories of evidence that would confirm cognitive support were also derived from the literature. The experiments confirmed that programmers from similar backgrounds varied widely in ability and style. Twenty programmers modified both an invoice application and a diary application. There was some indication that the UML diagrams aided performance. The analyses of all ten of the transcribed subjects showed evidence of UML cognitive support.
APA, Harvard, Vancouver, ISO, and other styles
36

Berkowitz, Zeev. "A methodology for business processes identification: developing instruments for an effective enterprise system project." 2006. http://hdl.handle.net/2292/4346.

Full text
Abstract:
Since the mid 1990s, thousands of companies around the world have implemented Enterprise Systems (ES), which are considered to be the most important development in the corporate use of information technology. By providing computerized support to business processes spanning both the enterprise and the supply chain, these systems have become an indispensable tool utilized by organizations to accomplish and maintain efficient and effective operational performance. However, there are many cases in which ES implementation has failed in terms of the required time and budget, and more importantly, in terms of functionality and performance. One of the main causes of these failures is the misidentification and improper selection of business processes to be implemented into the ES, which are a crucial element of the system's implementation life cycle. In order to achieve effective implementation, a ‘necessary and sufficient’ set of business processes must be designed and implemented. Implementing an excessive set of business processes is costly; yet implementing an insufficient set is ruinous. The heuristic identification of the set of business processes, based on requirement elicitation, is flawed; there is no guarantee that all the necessary processes have been captured (Type I error), and/or that superfluous processes have been selected for implementation (Type II error). The existing implementation methods do not include a methodology to address this vital issue. This thesis aims to resolve this problem and to provide a methodology that will generate a necessary and sufficient set of business processes in a given organization, based on its specific characteristics, which will be used as a baseline for implementing an ES. A proper definition of the business processes and their associated properties is proposed and detailed. The properties are then used as parameters to generate the complete set of all the possible business processes in the organization; from this set, necessary and sufficient processes are selected. The methodology exposes the fundamental level of business processes, which are then used as a baseline for further phases in the implementation process. The proposed methodology has been tested through the analysis of companies that have implemented ES. In each of these cases, the identification of business processes utilizing the proposed methodology has proven to provide superior results to those obtained through all other implemented practices, producing a better approximation of their existing business processes.
APA, Harvard, Vancouver, ISO, and other styles
37

Abrahams, Brooke. "Tourism information systems integration and utilization within the semantic web." 2006. http://eprints.vu.edu.au/1477/1/Abrahams.pdf.

Full text
Abstract:
The objective of this research was to generate grounded theory about the extent to which the Semantic Web and related technologies can assist with the creation, capture, integration, and utilization of accurate, consistent, timely, and up-to-date Web based tourism information. Tourism is vital to the economies of most countries worldwide (developed and less-developed). Advanced Destination Marketing Systems (DMS) are essential if a country’s tourism infrastructure, facilities and attractions are to receive maximum exposure. A necessary prerequisite here is that relevant data must be captured, ‘cleansed’, organized, integrated and made available to key industry parties (e.g. travel agents and inbound tour operators). While more and more tourists are using the Internet for travel planning, the usability of the Internet as a travel information source remains a problem, with travellers often having trouble finding the information they seek as the amount of online travel-related information increases. The problem is largely caused by the current Web’s lack of structure, which makes the integration of heterogeneous data a difficult, time consuming task. Traditional approaches to overcoming heterogeneity have to a large extent been unsuccessful. In the past, organizations attempted to rectify the problem by investing heavily in top-down strategic information systems planning projects (SISP), with the ultimate aim of establishing a new generation of systems built around a single common set of enterprise databases. An example of this approach to integration is that undertaken by the Bell companies (Nolan, Puryear & Elron 1989), whose massive investment in computer systems turned out to be more of a liability than an asset. The Semantic Web offers a new approach to integration. Broadly speaking, the Semantic Web (Berners-Lee, Hendler & Lassila 2001) refers to a range of standards, languages, development frameworks and tool development initiatives aimed at annotating Web pages with well-defined metadata so that intelligent agents can reason more effectively about services offered at particular sites. The technology is being developed by a number of scientists and industry organizations in a collaborative effort led by the World Wide Web Consortium (W3C), with the goal of providing machine readable Web intelligence that would come from hyperlinked vocabularies, enabling Web authors to explicitly define their words and concepts. It is based on new markup languages such as Resource Description Framework (RDF) (Manola & Miller 2004) and Ontology Web Language (OWL) (McGuinness & Harmelen 2004), and on ontologies, which provide a shared and formal description of key concepts in a given domain. The ontology driven approach to integration advocated here might be considered ‘bottom-up’, since individual enterprises (and parts of the one enterprise) can apply the technology (largely) independently – thereby mirroring the processes by which the Web itself evolved. The idea is that organizations could be provided with a common model (the Semantic Web ontology), and associated (easy-to-use) software could then be employed to guide them in the development of their Websites. As such, because Website production is driven by the common ontology, consistency and convenient integration is almost an automatic by-product (for all companies that take advantage of the technology and approach).
In many cases, organizations would not have to change their present data structures or naming conventions, which could potentially overcome many of the change management issues that have led to the failure of previous integration initiatives. Many researchers (e.g. El Sawy 2001) have stressed the necessity of taking a holistic view of technology, people, structure and processes in IT projects and, more specifically, Sharma et al. (2000, p. 151) have noted that as significant as DMS technological problems are, they may well pale into insignificance when compared with the managerial issues that need to be resolved. With this in mind, a systems development research approach supported by a survey of tourism operators and secondary interviews was used to generate grounded theory. The systems development and evaluation were designed to uncover technical benefits of using the Semantic Web for the integration and utilization of online tourism information. The survey of tourism operators and secondary data interviews were aimed at providing an understanding of attitudes towards adoption of a radical new online technology among industry stakeholders. A distinguishing feature of this research was its applied and pragmatic focus: in particular, one aim was to determine just what of practical use can be accomplished today, with current (albeit, extended) technology, in a real industry setting.
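To make the shared-ontology idea concrete, the sketch below describes a tourism operator as RDF triples using rdflib. The vocabulary (namespace and property names) is an assumption for illustration, not the ontology developed in the thesis.

```python
# Illustrative sketch only: publishing tourism data against a shared
# (hypothetical) vocabulary so that sites can be merged and queried together.
from rdflib import Graph, Literal, Namespace, RDF

TOUR = Namespace("http://example.org/tourism#")   # hypothetical vocabulary

g = Graph()
g.bind("tour", TOUR)

operator = TOUR["SunsetKayaks"]
g.add((operator, RDF.type, TOUR.TourOperator))
g.add((operator, TOUR.locatedIn, TOUR.Queenstown))
g.add((operator, TOUR.offersActivity, TOUR.SeaKayaking))
g.add((operator, TOUR.contactEmail, Literal("info@sunsetkayaks.example")))

# Because every site shares the ontology, integration reduces to merging
# graphs and querying across them.
print(g.serialize(format="turtle"))
```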
APA, Harvard, Vancouver, ISO, and other styles
38

Miliszewska, Iwona. "A Multidimensional Model for Transnational Computing Education Programs." 2006. http://eprints.vu.edu.au/579/1/Template.pdf.

Full text
Abstract:
As transnational education is becoming firmly embedded as a part of the distance education landscape, governments and universities are calling for meaningful research on transnational education. This study involved the development and validation of a model for effective transnational education programs. The study used student experience as a key indicator of program effectiveness and, following a holistic approach, took into consideration various dimensions of the transnational education context including student, instructor, curriculum and instruction design, interaction, evaluation and assessment, technology, and program management and organisational support. This selection of dimensions, together with their attributes, formed the proposed model for transnational education programs. The model was applied for validation against three transnational computing education programs currently offered by Australian universities in Hong Kong. Two methods of data collection - a survey, and group interviews with students - were used to validate the model; data was obtained from approximately three hundred subjects. The model was evaluated in terms of the perceived importance, to the students, of the various attributes of each program dimension on program effectiveness. The results of the validation indicated that the students in all the programs participating in the evaluation were in agreement as to the factors they consider most important to the effectiveness of transnational programs. The validation of the model led to its refinement; first, the least important attributes were removed from dimensions; second, a new dimension, pre-enrolment considerations, was introduced to the model; and finally, the attributes within each of the dimensions were ordered in terms of their perceived importance.
APA, Harvard, Vancouver, ISO, and other styles
39

Giles, Jonathan Andrew. "Improving Centruflow using semantic web technologies : a thesis presented in partial fulfillment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand." 2007. http://hdl.handle.net/10179/801.

Full text
Abstract:
Centruflow is an application that can be used to visualise structured data. It does this by drawing graphs, allowing for users to explore information relationships that may not be visible or easily understood otherwise. This helps users to gain a better understanding of their organisation and to communicate more effectively. In earlier versions of Centruflow, it was difficult to develop new functionality as it was built using a relatively unsupported and proprietary visualisation toolkit. In addition, there were major issues surrounding information currency and trust. Something had to be done, and this was a sub-project of this thesis. The main purpose of this thesis however was to research and develop a set of mathematical algorithms to infer implicit relationships in Centruflow data sources. Once these implicit relationships were found, we could make them explicit by showing them within Centruflow. To enable this, relationships were to be calculated based on providing users with the ability to 'tag' resources with metadata. We believed that by using this tagging metadata, Centruflow could offer users far more insight into their own data. Implementing this was not a straight-forward task, as it required a considerable amount of research and development to be undertaken to understand and appreciate technologies that could help us in our goal. Our focus was primarily on technologies and approaches common in the semantic web and 'Web 2.0' areas. By pursuing semantic web technologies, we ensured that Centruflow would be considerably more standards-compliant than it was previously. At the conclusion of our development period, Centruflow had been rather substantially 'retrofitted', with all proprietary technologies replaced with equivalent semantic web technologies. The result of this is that Centruflow is now positioned on the forefront of the semantic web wave, allowing for far more comprehensive and rapid visualisation of a far larger set of readily-available data than what was possible previously. Having implemented all necessary functionality, we validated our approach and were pleased to find that our improvements led to a considerably more intelligent and useful Centruflow application than was previously available. This functionality is now available as part of 'Centruflow 3.0', which will be publicly released in March 2008. Finally, we conclude this thesis with a discussion on the future work that should be undertaken to improve on the current release.
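The abstract describes inferring implicit relationships from tagging metadata. A minimal sketch of one generic way to do this (Jaccard similarity over tag sets) is shown below; the threshold, the tag data and the measure itself are assumptions, not the algorithms developed in the thesis.

```python
# Illustrative sketch only: inferring implicit links between resources from
# shared user tags via Jaccard similarity.
from itertools import combinations

tags = {
    "budget-2008.xls":  {"finance", "q3", "forecast"},
    "sales-report.doc": {"finance", "q3", "regions"},
    "staff-survey.pdf": {"hr", "survey"},
}

def jaccard(a, b):
    return len(a & b) / len(a | b)

THRESHOLD = 0.3
implicit_links = [
    (x, y, round(jaccard(tags[x], tags[y]), 2))
    for x, y in combinations(tags, 2)
    if jaccard(tags[x], tags[y]) >= THRESHOLD
]

# Each inferred pair could be drawn as an extra edge in the visualisation.
for x, y, score in implicit_links:
    print(f"{x} <-> {y}  (similarity {score})")
```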
APA, Harvard, Vancouver, ISO, and other styles
40

Danthuluri, Ravi. "Investigation on the quality of videoconferencing over the Internet and intranet environments." 2003. http://eprints.vu.edu.au/271/1/02whole.pdf.

Full text
Abstract:
This study deals with the scope and feasibility of video conferencing on the Internet and Intranet, for a real-time implementation of a classroom atmosphere linking different universities. I have considered the effects of various factors on video conferencing, and different tests have been performed to study the data transfer during the online sessions. Readings of send rate, receive rate and CPU load have been taken during these tests, and the results have been plotted in the form of graphs. The study also gives conclusions at regular intervals on the tests performed and on the limitations of various video conferencing sessions. From the statistics collected, I have drawn conclusions on the hardware requirements for optimized performance of video conferencing over the Internet. The study also states the scope of research to be undertaken in future for much better performance and understanding of different types of protocols. This thesis includes the study of various network-monitoring tools.
APA, Harvard, Vancouver, ISO, and other styles
41

AlShihi, Hafedh. "Critical Factors in the Adoption and Diffusion of E-Government Initiatives in Oman." 2006. http://eprints.vu.edu.au/483/1/483contents.pdf.

Full text
Abstract:
Many significant barriers must be faced in the adoption and dissemination of e-government systems, regardless of how advanced or modest a country is in terms of ICT infrastructure and deployment. This research has endeavored to investigate the impediments associated with the development and diffusion of e-government, with a concentration on non-technical and country-specific factors. The focus of the research was on Oman's efforts to develop an e-government system, using advanced nations' experiences in the same domain to establish benchmarks. Initially, this research undertook a general literature review to define the barriers to the uptake of e-government and to set and refine the aims, scope and questions asked of the research. Subsequently, a more focused literature review was conducted on the experiences of advanced nations with e-government, to identify possible lessons for and solutions to the barriers facing the take-up of e-government. In parallel, an exploratory case study of the Oman e-government project was conducted that aimed to test the extent to which the barriers and solutions drawn from the largely Western-centric literature apply in the Omani situation, and to investigate other possible cultural and country-specific barriers. Semi-structured interviews and face-to-face administered questionnaires were the primary data collection strategies used throughout the case study phase. The study found that non-technical barriers in Oman, such as users' lack of IT knowledge and the absence of marketing campaigns, have negatively affected people's decisions to use the technology and inhibited decision makers from implementing or adopting technology initiatives. In addition, several country-specific limits to e-government growth were identified. Government decision makers in Oman were found to be prone to short-term planning, which prevents them from anticipating the long-term potential of e-government. Additionally, frequent structural changes within ministries, and the fact that the e-government project is not given high priority nor urgently needed at present, have contributed to delaying the development of and improvements to such a system. Ultimately, this research delivered a socio-technical framework for adoption, detailing causes and effects of the critical factors in the adoption and diffusion of e-government initiatives in Oman.
APA, Harvard, Vancouver, ISO, and other styles
42

Kripanont, Napaporn. "Examining a technology acceptance model of internet usage by academics within Thai business schools." 2007. http://eprints.vu.edu.au/1512/1/Kripanont.pdf.

Full text
Abstract:
Information Technology has been a significant research area for some time, but its nature has changed considerably since the Internet became prominent just over a decade ago. Many researchers have studied and proposed theories and models of technology acceptance in order to predict and explain user behaviour with technology, to account for rapid change in both technologies and their environments. Each theory or model has been proposed with different sets of determinants and moderators, and most of them have been developed in the U.S. It is therefore questioned whether the theories and models of technology acceptance that have been developed, modified, and extended in the U.S. can be used in other countries, especially in Thailand. It is also questioned whether there might be other determinants and moderators that also play important roles in this specific environment. This thesis (1) reviewed the literature in respect of nine prominent theories and models, (2) reviewed previous literature about IT acceptance and usage within four contexts of study, (3) investigated the extent to which academics use and intend to use the Internet in their work, (4) investigated how to motivate academics to make full use of the Internet in their work, (5) investigated to what extent using the Internet helps in improving academics’ professional practice, professional development and quality of working life, (6) formulated a research model of technology acceptance regarding Internet usage by Thai academics, and (7) generated and validated the research model that best describes Thai academics’ Internet usage behaviour and behaviour intention. These last two objectives represent the main focus of the thesis. The questionnaire survey method was used to collect primary data from 927 academics within Business Schools in 20 Public Universities in Thailand. The survey yielded 455 usable questionnaires, with a response rate of 49%. Statistical analysis methods and Structural Equation Modelling with AMOS version 6.0 were used to analyse the data. The research model was formulated with five core determinants of usage and up to nine moderators of key relationships. It was then tested and modified; the final modified model, evidenced by goodness of fit of the model to the data, explained 31.6% (Squared Multiple Correlation) of the variance in usage behaviour in teaching, 42.6% in usage behaviour in other tasks, 55.7% in behaviour intention in teaching and 59.8% in behaviour intention in other tasks. From the findings, three core determinants (perceived usefulness, perceived ease of use and self-efficacy) significantly determined usage behaviour in teaching. Two core determinants (perceived usefulness and self-efficacy) significantly determined usage behaviour in other tasks. Finally, usage behaviour significantly influenced behaviour intention. In addition, three moderators (age, e-university plan and level of reading and writing) impacted the influence of key determinants toward usage behaviour. Only two moderators (age and research university plan) impacted the influence of usage behaviour toward behaviour intention. The rest, including gender, education level, academic position, experience and Thai language usage, did not impact the influence of the key determinants toward usage behaviour and did not impact the influence of usage behaviour toward behaviour intention.
Consequently, the final modified research model which is called the “Internet Acceptance Model” or “IAM” has the power to explain and predict user behaviour in a Thai Business Schools environment. A thorough understanding of the model may help practitioners to analyse the reasons for resistance toward the technology and also help them to take efficient measures to improve user acceptance and usage of the technology.
APA, Harvard, Vancouver, ISO, and other styles
43

Azzam, Ibrahim Ahmed Aref. "Implicit Concept-based Image Indexing and Retrieval for Visual Information Systems." 2006. http://eprints.vu.edu.au/479/1/Implicit_Concept-based_Image.pdf.

Full text
Abstract:
This thesis focuses on Implicit Concept-based Image Indexing and Retrieval (ICIIR), and the development of a novel method for the indexing and retrieval of images. Image indexing and retrieval using a concept-based approach involves extraction, modelling and indexing of image content information. Computer vision offers a variety of techniques for searching images in large collections. We propose a method, which involves the development of techniques to enable components of an image to be categorised on the basis of their relative importance within the image in combination with filtered representations. Our method concentrates on matching subparts of images, defined in a variety of ways, in order to find particular objects. The storage of images involves an implicit, rather than an explicit, indexing scheme. Retrieval of images will then be achieved by application of an algorithm based on this categorisation, which will allow relevant images to be identified and retrieved accurately and efficiently. We focus on Implicit Concept-based Image Indexing and Retrieval, using fuzzy expert systems, density measure, supporting factors, weights and other attributes of image components to identify and retrieve images.
APA, Harvard, Vancouver, ISO, and other styles
44

Grundy, John (John Collis). "Multiple textual and graphical views for interactive software development environments." 1993. http://www.cs.auckland.ac.nz/~john-g/papers/thesis93.pdf.

Full text
Abstract:
Diagram construction can be used to visually analyse and design a complex software system using natural, graphical representations describing high-level structure and semantics. Textual programming can specify detailed documentation and functionality not well expressed at a visual level. Integrating multiple textual and graphical views of software development allows programmers to utilise both representations as appropriate. Consistency management between these views must be automatically maintained by the development environment. MViews, a model for such software development environments, has been developed. MViews supports integrated textual and graphical views of software development with consistency management. MViews provides flexible program and view representation using a novel object dependency graph approach. Multiple views of a program may contain common information and are stored as graphs with textual or graphical renderings and editing. Change propagation between program components and views is supported using a novel update record mechanism. Different editing tools are integrated as views of a common program repository and new program representations and editors can be integrated without affecting existing views. A specification language for program and view state and manipulation semantics, and a visual specification language for view appearance and editing semantics, have been developed. An object-oriented architecture based on MViews abstractions allows environment specifications to be translated into a design for implementing environments. Environment designs are implemented by specialising a framework of object-oriented language classes based on the MViews architecture. A new language is described which provides object-oriented extensions to Prolog. An integrated software development environment for this language is discussed and the specification, design and implementation of this environment using MViews are described. MViews has also been reused to produce a graphical entity-relationship/textual relational database schema modeller, a dialogue painter with a graphical editing view and textual constraints view, and various program visualisation systems.
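For illustration, the sketch below shows the general flavour of change propagation between program components and views via update records on a dependency graph, loosely following the MViews description above. The class and field names are assumptions, not the actual MViews design.

```python
# Illustrative sketch only: update records propagated from a program
# component to its dependent textual and graphical views.
class Component:
    def __init__(self, name):
        self.name = name
        self.dependents = []          # views/components to notify

    def attach(self, view):
        self.dependents.append(view)

    def change(self, attribute, new_value):
        update_record = {"source": self.name,
                         "attribute": attribute,
                         "value": new_value}
        for view in self.dependents:
            view.on_update(update_record)

class View:
    def __init__(self, name):
        self.name = name

    def on_update(self, record):
        # A real view would re-render its text or diagram here.
        print(f"{self.name}: re-render because {record['source']}."
              f"{record['attribute']} -> {record['value']!r}")

if __name__ == "__main__":
    cls = Component("class Invoice")
    cls.attach(View("textual editor"))
    cls.attach(View("class diagram"))
    cls.change("method list", ["total()", "addLine()"])
```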
APA, Harvard, Vancouver, ISO, and other styles
45

Rugis, John. "Digital surface curvature." 2008. http://hdl.handle.net/2292/3105.

Full text
Abstract:
The theoretical basis for this thesis can be found in the subject of differential geometry where both line and surface curvature is a core feature. We begin with a review of curvature basics, establish notational conventions, and contribute new results (on n-cuts) which are of importance for this thesis. A new scale invariant curvature measure is presented. Even though curvature of continuous smooth lines and surfaces is a well-defined property, when working with digital surfaces, curvature can only be estimated. We review the nature of digitized surfaces and present a number of curvature estimators, one of which (the 3-cut mean estimator) is new. We also develop an estimator for our new scale invariant curvature measure, and apply it to digital surfaces. Surface curvature maps are defined and examples are presented. A number of curvature visualization examples are provided. In practical applications, the noise present in digital surfaces usually precludes the possibility of direct curvature calculation. We address this noise problem with solutions including a new 2.5D filter. Combining techniques, we introduce a data processing pipeline designed to generate surface registration markers which can be used to identify correspondences between multiple surfaces. We present a method (projecting curvature maps) in which high resolution detail is merged with a simplified mesh model for visualization purposes. Finally, we present the results of experiments (using texture projection merging and image processing assisted physical measurement) in which we have identified, characterized, and produced visualizations of selected fine surface detail from a digitization of Michelangelo’s David statue.
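Since curvature on digital data can only be estimated, the sketch below shows one generic discrete estimator (the Menger, or circumcircle, curvature of consecutive point triples) checked against a circle of known radius. This is an illustration of the estimation problem, not one of the thesis's estimators.

```python
# Illustrative sketch only: estimating curvature of a digitised plane curve
# from consecutive sample triples.
import numpy as np

def menger_curvature(p, q, r):
    """Curvature (1/radius) of the circle through three points."""
    a, b, c = np.linalg.norm(q - r), np.linalg.norm(p - r), np.linalg.norm(p - q)
    cross = abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1]))
    if a * b * c == 0:
        return 0.0
    return 2 * cross / (a * b * c)    # 4 * area / (a * b * c)

if __name__ == "__main__":
    # Sample a circle of radius 5: true curvature is 1/5 = 0.2 everywhere.
    t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    pts = np.stack([5 * np.cos(t), 5 * np.sin(t)], axis=1)
    ks = [menger_curvature(pts[i - 1], pts[i], pts[i + 1]) for i in range(1, len(pts) - 1)]
    print("mean estimated curvature:", round(float(np.mean(ks)), 4))
```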
APA, Harvard, Vancouver, ISO, and other styles
46

Ma, Hui. "Distribution design for complex value databases : a dissertation presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University." 2007. http://hdl.handle.net/10179/747.

Full text
Abstract:
Distribution design for databases usually addresses the problems of fragmentation, allocation and replication. However, the main purposes of distribution are to improve performance and to increase system reliability. The former aspect is particularly relevant in cases where the desire to distribute data originates from the distributed nature of an organization with many data needs only arising locally, i.e., some data are retrieved and processed at only one or at most very few locations. Therefore, query optimization should be treated as an intrinsic part of distribution design. Due to the interdependencies between fragmentation, allocation and distributed query optimization it is not efficient to study each of the problems in isolation to get overall optimal distribution design. However, the combined problem of fragmentation, allocation and distributed query optimization is NP-hard, and thus requires heuristics to generate efficient solutions. In this thesis the foundations of fragmentation and allocation in databases on query processing are investigated using a query cost model. The considered databases are defined on complex value data models, which capture complex value, object-oriented and XML-based databases. The emphasis on complex value databases enables a large variety of schema fragmentation, while at the same time it imposes restrictions on the way schemata can be fragmented. It is shown that the allocation of locations to the nodes of an optimized query tree is only marginally affected by the allocation of fragments. This implies that optimization of query processing and optimization of fragment allocation are largely orthogonal to each other, leading to several scenarios for fragment allocation. Therefore, it is reasonable to assume that optimized queries are given with subqueries having selection and projection operations applied to leaves. With this assumption some heuristic procedures can be developed to find an “optimal” fragmentation and allocation. In particular, cost-based algorithms for primary horizontal and derived horizontal fragmentation, vertical fragmentation are presented.
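To illustrate the kind of cost-driven allocation reasoning described above, here is a minimal greedy sketch that places each fragment at the site that uses it most under a simple query-frequency cost model. The figures and the single-copy (no replication) assumption are illustrative; they are not the cost model or algorithms developed in the thesis.

```python
# Illustrative sketch only: greedy fragment allocation driven by a simple
# remote-access cost model.
# access[site][fragment] = how often queries at that site use the fragment.
access = {
    "Auckland":   {"F1": 90, "F2": 5,  "F3": 10},
    "Wellington": {"F1": 10, "F2": 70, "F3": 15},
    "Dunedin":    {"F1": 5,  "F2": 10, "F3": 60},
}
remote_cost = 1.0   # cost per remote access (local access assumed free)

def allocate(access):
    """Place each fragment at the site that uses it most, minimising total
    remote-access cost under the single-copy assumption."""
    fragments = next(iter(access.values())).keys()
    return {frag: max(access, key=lambda site: access[site][frag])
            for frag in fragments}

placement = allocate(access)
total = sum(freqs[f] * remote_cost
            for site, freqs in access.items()
            for f, home in placement.items() if home != site)
print("placement:", placement)
print("total remote-access cost:", total)
```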
APA, Harvard, Vancouver, ISO, and other styles
47

Ferrarotti, Flavio Antonio. "Expressibility of higher-order logics on relational databases : proper hierarchies : a dissertation presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Wellington, New Zealand." 2008. http://hdl.handle.net/10179/799.

Full text
Abstract:
We investigate the expressive power of different fragments of higher-order logics over finite relational structures (or equivalently, relational databases) with special emphasis in higher-order logics of order greater than or equal three. Our main results concern the study of the effect on the expressive power of higher-order logics, of simultaneously bounding the arity of the higher-order variables and the alternation of quantifiers. Let AAi(r,m) be the class of (i + 1)-th order logic formulae where all quantifiers are grouped together at the beginning of the formulae, forming m alternating blocks of consecutive existential and universal quantifiers, and such that the maximal-arity (a generalization of the concept of arity, not just the maximal of the arities of the quantified variables) of the higher-order variables is bounded by r. Note that, the order of the quantifiers in the prefix may be mixed. We show that, for every i [greater than or equal to] 1, the resulting AAi hierarchy of formulae of (i + 1)-th order logic is proper. This extends a result by Makowsky and Pnueli who proved that the same hierarchy in second-order logic is proper. In both cases the strategy used to prove the results consists in considering the set AUTOSAT(F) of formulae in a given logic F which, represented as finite structures, satisfy themselves. We then use a similar strategy to prove that the classes of [Sigma superscript i subscript m union Pi superscript i subscript m] formulae in which the higher-order variables of all orders up to i+1 have maximal-arity at most r, also induce a proper hierarchy in each higher-order logic of order i [greater than or equal to] 3. It is not known whether the correspondent hierarchy in second-order logic is proper. Using the concept of finite model truth definitions introduced by M. Mostowski, we give a sufficient condition for that to be the case. We also study the complexity of the set AUTOSAT(F) and show that when F is one of the prenex fragments [Sigma superscript 1 subscript m] of second-order logic, it follows that AUTOSAT(F) becomes a complete problem for the corresponding prenex fragment [Sigma superscript 2 subscript m] of third-order logic. Finally, aiming to provide the background for a future line of research in higher-order logics, we take a closer look to the restricted second-order logic SO[superscript w] introduced by Dawar. We further investigate its connection with the concept of relational complexity studied by Abiteboul, Vardi and Vianu. Dawar showed that the existential fragment of SO[superscript w] is equivalent to the nondeterministic inflationary fixed-point logic NFP. Since NFP captures relational NP, it follows that the existential fragment of SO[superscript w] captures relational NP. We give a direct proof, in the style of the proof of Fagin’s theorem, of this fact. We then define formally the concept of relational machine with relational oracle and prove the exact correspondence between the prenex fragments of SO[superscript w] and the levels of the relational polynomial-time hierarchy. This allows us to stablish a direct connection between the relational polynomial hierarchy and SO without using the Abiteboul and Vianu normal form for relational machines.
APA, Harvard, Vancouver, ISO, and other styles
48

Kannangara, Shyama Dilrukshi. "Adaptive Duplexer for Software Radio." 2006. http://eprints.vu.edu.au/600/1/600contents.pdf.

Full text
Abstract:
Different geographies and localities around the world have adopted various wireless interface standards for mobile communications. As a result, roaming users require multiple handsets supporting multiple standards and multiple bands. Triple-band handsets are currently offered for high-end users, and in the future quad-band handsets including GSM 850 will become common in the market; this trend will continue. The addition of third-generation functionality to second-generation platforms will be even more difficult and complex. The radio handset should be able to use the same hardware for communications anywhere in the world, so users will require small, low-cost terminals with multimode/multi-band capability. The software radio concept has been developed to address these challenges. The replacement of fixed-frequency components in the front end of the software radio is one of the key architectural changes required, and the duplexer is one such component. Since duplexing filters are not normally tuneable, each band requires a separate duplexer in a multi-band system. Duplexers are passive devices (ceramic or SAW), and multiple duplexers lead to a dramatic increase in terminal cost and size. This thesis proposes a new adaptive duplexer architecture to reduce or eliminate the multiple duplexer problem in software radio. The technique is based on combining a low-isolation device with an adaptive double loop cancelling scheme. The proposed double loop cancellation provides the required transmitter leakage and transmitter noise isolation over a wide bandwidth, using a delay element and an adjustable vector attenuator in each cancellation path. This thesis analyses the double loop cancellation technique. The cancellation path delay constraints are derived for coefficients with a limited adjustment range in the cancellation paths. A linear relationship between the bandwidth and the achievable cancellation level is obtained, and it is shown that the residual signal power is proportional to the square of the duplexing frequency. It is concluded that the delays in the cancellation paths should be chosen to straddle the expected range variation of the delay in the main path, which is predominantly caused by variations in antenna matching. The new algorithm uses a single cost function to achieve simultaneous cancellation in both the transmit band and the receive band. A direct conversion receiver architecture was chosen for the hardware prototype, since it is more suitable for multi-band systems; alternative structures are also possible. A prototype of the adaptive duplexer using a 20 dB circulator and a single loop cancelling technique was designed and implemented. It achieved a total Tx leakage cancellation of 69 dB at 2 GHz with a 45 MHz duplexing frequency; however, it was not possible to simultaneously cancel the transmitter noise in the receiver band. The original prototype was extended to include the second loop. The achieved isolation between the transmit and receive signals, and the achieved reduction of the transmitter noise in the receiver band, were 66.8 dB and 58 dB respectively. These results were obtained over a 5 MHz bandwidth using a 190 MHz duplexing frequency. The performance is more than adequate for W-CDMA applications. Lowering the duplexing frequency improves the cancellation bandwidth, so the scheme performs better with other standards, such as IS-95 (CDMA), which use a 45 MHz duplexing offset.
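The double loop cancellation idea can be illustrated with a small numerical sketch. The Python fragment below is a toy model, not the thesis prototype: all delays, gains and band centres are assumed values. It nulls a single leakage path at the Tx and Rx band centres using two delayed, weighted cancellation paths whose delays straddle the main-path delay, then reports the residual leakage over a 5 MHz band around each centre.

```python
import numpy as np

# Minimal sketch (assumed model, not the thesis implementation): a leakage
# path with gain a_m and delay tau_m is cancelled by two adjustable paths
# (delays tau_1, tau_2 straddling tau_m, complex weights w_1, w_2). Nulling
# the combined response at the Tx and Rx band centres illustrates
# simultaneous leakage and Tx-noise cancellation.

def path(f, tau):
    """Frequency response of an ideal delay element."""
    return np.exp(-2j * np.pi * f * tau)

a_m, tau_m = 1.0, 5e-9           # main (antenna) leakage path, assumed
tau_1, tau_2 = 3e-9, 7e-9        # cancellation delays straddle tau_m
f_tx, f_rx = 2.0e9, 2.19e9       # assumed Tx/Rx centres, 190 MHz offset

# Solve a 2x2 linear system for the weights that null both band centres.
A = np.array([[path(f_tx, tau_1), path(f_tx, tau_2)],
              [path(f_rx, tau_1), path(f_rx, tau_2)]])
b = -a_m * np.array([path(f_tx, tau_m), path(f_rx, tau_m)])
w1, w2 = np.linalg.solve(A, b)

# Residual leakage across a 5 MHz band around each centre frequency.
for name, fc in (("Tx band", f_tx), ("Rx band", f_rx)):
    f = np.linspace(fc - 2.5e6, fc + 2.5e6, 101)
    resid = a_m * path(f, tau_m) + w1 * path(f, tau_1) + w2 * path(f, tau_2)
    print(f"{name}: worst-case cancellation "
          f"{-20 * np.log10(np.max(np.abs(resid))):.1f} dB")
```

The worst-case residual grows with the offset from the nulled frequencies, which is consistent with the abstract's observation that the residual power scales with the square of the duplexing frequency.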
APA, Harvard, Vancouver, ISO, and other styles
49

Zhao, Fei. "The future of personal area networks in a ubiquitous computing world : a thesis presented in partial fulfillment of the requirements for the degree of Master of Information Sciences in Information Systems at Massey University, Auckland, New Zealand." 2008. http://hdl.handle.net/10179/819.

Full text
Abstract:
In the future world of ubiquitous computing, wireless devices will be everywhere. Personal area networks (PANs), networks that facilitate communications between devices within a short range, will be used to send and receive data and commands that fulfill an individual’s needs. This research determines the future prospects of PANs by examining success criteria, application areas and barriers/challenges. An initial set of issues in each of these three areas is identified from the literature. The Delphi Method is then used to determine what experts believe are the most important success criteria, application areas and barriers/challenges. Critical success factors that will determine the future of personal area networks include reliability of connections, interoperability, and usability. Key application areas include monitoring, healthcare, and smart things. Important barriers and challenges facing the deployment of PANs are security, interference and coexistence, and regulation and standards.
APA, Harvard, Vancouver, ISO, and other styles
50

Tolochko, Igor Aleksandrovich. "Channel Estimation for OFDM Systems With Transmitter Diversity." 2005. http://eprints.vu.edu.au/313/1/313contents.pdf.

Full text
Abstract:
Orthogonal Frequency-Division Multiplexing (OFDM) is now regarded as a feasible alternative to conventional single-carrier modulation techniques for high data rate communication systems, mainly because of its inherent equalisation simplicity. Transmitter diversity can effectively combat multipath channel impairments due to the dispersive wireless channel, which can cause deep fades in some subchannels. The combination of the two techniques, OFDM and transmitter diversity, can further enhance data rates in a frequency-selective fading environment. However, this enhancement requires accurate and computationally efficient channel state information when coherent detection is involved. A good choice for high-accuracy channel estimation is the linear minimum mean-squared error (LMMSE) technique, but it requires a large number of processing operations. In this thesis, a thorough study is carried out, based on mathematical analysis and simulations in MATLAB, to find new and effective channel estimation methods for OFDM in a transmit diversity environment. As a result, three novel LMMSE-based channel estimation algorithms are evaluated: real-time LMMSE, LMMSE by significant weight catching (SWC), and low-complexity LMMSE with the power delay profile approximated as uniform. The new techniques and their combinations can significantly reduce the full LMMSE processor complexity, by 50% or more, while the estimation accuracy loss remains within 1-2 dB over a wide range of channel delay spreads and signal-to-noise ratios (SNR). To further enhance channel estimator performance, pilot symbol structures are investigated and methods for real-time statistical parameter estimation are also presented.
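To make the LMMSE idea concrete, the sketch below shows standard LMMSE smoothing of a least-squares OFDM channel estimate using a frequency correlation matrix derived from a uniform power delay profile, the approximation mentioned in the abstract. It is an illustrative baseline only: the subcarrier count, delay spread and SNR are assumed values, and it does not reproduce the thesis' reduced-complexity algorithms (real-time LMMSE, SWC) or the transmit diversity setting.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the thesis code): LMMSE smoothing
# of a least-squares OFDM channel estimate, with the channel correlation
# matrix built from a uniform power delay profile of length L.

N = 64                      # subcarriers (assumed)
L = 8                       # assumed maximum delay spread in samples
snr_lin = 10 ** (20 / 10)   # assumed 20 dB SNR

# Frequency correlation for a uniform power delay profile over taps 0..L-1:
# R[k, l] = (1/L) * sum_n exp(-j*2*pi*(k - l)*n / N)
k = np.arange(N)
dk = k[:, None] - k[None, :]
R_hh = np.exp(-1j * np.pi * dk * (L - 1) / N) * \
       np.sinc(dk * L / N) / np.sinc(dk / N)

# Simulate one channel realisation and a noisy least-squares estimate
# (an all-pilot symbol is assumed here for simplicity).
h_time = (np.random.randn(L) + 1j * np.random.randn(L)) / np.sqrt(2 * L)
H_true = np.fft.fft(h_time, N)
noise = (np.random.randn(N) + 1j * np.random.randn(N)) / np.sqrt(2 * snr_lin)
H_ls = H_true + noise

# LMMSE smoothing: H_lmmse = R_hh (R_hh + (1/SNR) I)^{-1} H_ls
W = R_hh @ np.linalg.inv(R_hh + np.eye(N) / snr_lin)
H_lmmse = W @ H_ls

print("LS    MSE:", np.mean(np.abs(H_ls - H_true) ** 2))
print("LMMSE MSE:", np.mean(np.abs(H_lmmse - H_true) ** 2))
```

The full matrix inverse above is exactly the cost the thesis seeks to avoid; its contribution lies in cutting that complexity while staying within 1-2 dB of this estimator's accuracy.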
APA, Harvard, Vancouver, ISO, and other styles
