Theses on the topic "280300 Computer Software"

Follow this link to see other types of publications on the topic: 280300 Computer Software.

Cite a source in APA, MLA, Chicago, Harvard, and many other styles


Consult the top dissertations (master's and doctoral theses) for research on the topic "280300 Computer Software".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, if one is included in the record's metadata.

Browse theses on many scholarly subjects and compile an accurate bibliography.

1

Jiang, Feng. "Capturing event metadata in the sky : a Java-based application for receiving astronomical internet feeds : a thesis presented in partial fulfilment of the requirements for the degree of Master of Computer Science in Computer Science at Massey University, Auckland, New Zealand". Massey University, 2008. http://hdl.handle.net/10179/897.

Full text
Abstract:
When an astronomical observer discovers a transient event in the sky, how can the information be immediately shared and delivered to others? Not long ago, people shared what they discovered in the sky through books, telegraphs, and telephones. The new way of transferring event data is via the Internet: information about astronomical events can be packaged and published online as an Internet feed. To receive these packaged data, Internet feed listener software is required on a terminal computer. In other applications, the listener can connect to an intelligent robotic telescope network and automatically drive a telescope to capture transient astrophysical phenomena. However, because the technologies for transferring astronomical event data are at an early stage, the only resource available was the Perl-based Internet feed listener developed by the eSTAR team. In this research, a Java-based Internet feed listener was developed. The application supports more features than the Perl-based application, and by drawing on Java's strengths it can receive, parse, and manage Internet feed data efficiently through a friendly user interface. Keywords: Java, socket programming, VOEvent, real-time astronomy
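
The listener the abstract describes boils down to a long-lived socket that blocks on an event stream. Below is a minimal sketch of that receive loop, assuming a line-delimited feed and a hypothetical broker host and port; the real eSTAR/VOEvent transport and message framing differ.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

/** Minimal sketch of a socket-based feed listener (host, port, framing assumed). */
public class FeedListener {
    public static void main(String[] args) throws IOException {
        String host = args.length > 0 ? args[0] : "voevent.example.org"; // hypothetical broker
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 8099;
        try (Socket socket = new Socket(host, port);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {   // block until the broker pushes an event
                System.out.println("event received: " + line);
                // a real listener would parse the XML payload here and notify subscribers
            }
        }
    }
}
```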
2

Thompson, Errol Lindsay. "How do they understand? Practitioner perceptions of an object-oriented program : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Education (Computer Science) at Massey University, Palmerston North, New Zealand". Massey University, 2008. http://hdl.handle.net/10179/854.

Full text
Abstract:
In the computer science community, there is considerable debate about the appropriate sequence for introducing object-oriented concepts to novice programmers. Research into novice programming has struggled to identify the critical aspects that would provide a consistently successful approach to teaching introductory object-oriented programming. Starting from the premise that the conceptions of a task determine the type of output from the task, assisting novice programmers to become aware of what the required output should be may lay a foundation for improving learning. This study adopted a phenomenographic approach. Thirty-one practitioners were interviewed about the ways in which they experience object-oriented programming, and categories of description and critical aspects were identified. These critical aspects were then used to examine the spaces of learning provided in twenty introductory textbooks. The study uncovered critical aspects that related to the way that practitioners expressed their understanding of an object-oriented program and the influences on their approach to designing programs. The study of the textbooks revealed large variability in the coverage of these critical aspects.
3

Rountree, Richard John. "Novel technologies for the manipulation of meshes on the CPU and GPU : a thesis presented in partial fulfilment of the requirements for the degree of Masters of Science in Computer Science at Massey University, Palmerston North, New Zealand". Massey University, 2007. http://hdl.handle.net/10179/700.

Full text
Abstract:
This thesis relates to research and development in the field of 3D mesh data for computer graphics. A review of existing storage and manipulation techniques for mesh data is given, followed by a framework for mesh editing. The proposed framework combines complex mesh editing techniques, automatic level of detail generation and mesh compression for storage. These methods work coherently due to the underlying data structure. The problem of storing and manipulating data for 3D models is a highly researched field. Models are usually represented by sparse mesh data which consists of vertex position information, the connectivity information to generate faces from those vertices, surface normal data and texture coordinate information. This sparse data is sent to the graphics hardware for rendering but must be manipulated on the CPU. The proposed framework is based upon geometry images and is designed to store and manipulate the mesh data entirely on the graphics hardware. By utilizing the highly parallel nature of current graphics hardware and new hardware features, new levels of interactivity with large meshes can be gained. Automatic level of detail rendering can be used to allow models upwards of 2 million polygons to be manipulated in real time while viewing a lower level of detail. Through the use of pixel shaders the high detail is preserved in the surface normals while geometric detail is reduced. A compression scheme is then introduced which utilizes the regular structure of the geometry image to compress the floating point data. A number of existing compression schemes are compared as well as custom bit packing. This is a TIF-funded project which is partnered with Unlimited Realities, a Palmerston North software development company. The project was to design a system to create, manipulate and store 3D meshes in a compressed and easy to manipulate manner. The goal is to create the underlying technologies to allow for a 3D modelling system to become integrated into the Umajin engine, not to create a user interface or stand-alone modelling program. The Umajin engine is a 3D engine created by Unlimited Realities which has a strong focus on multimedia. More information on the Umajin engine can be found at www.umajin.com. In this project we propose a method which gives the user the ability to model with the high level of detail found in packages aimed at creating offline renders but create models which are designed for real time rendering.
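
The custom bit packing mentioned at the end can be illustrated by quantising a vertex position to fixed point and packing it into a single 32-bit word. The 11/11/10-bit layout below is an assumption for illustration, not the scheme evaluated in the thesis.

```java
/** Sketch: pack a unit-cube vertex position into one 32-bit word (11/11/10 bits). */
public class VertexPacker {
    // Quantise a float in [0,1] to an n-bit unsigned integer.
    static int quantise(float v, int bits) {
        int max = (1 << bits) - 1;
        return Math.round(Math.max(0f, Math.min(1f, v)) * max);
    }
    static int pack(float x, float y, float z) {
        return (quantise(x, 11) << 21) | (quantise(y, 11) << 10) | quantise(z, 10);
    }
    static float[] unpack(int w) {
        return new float[] {
            ((w >>> 21) & 0x7FF) / 2047f,   // x: top 11 bits
            ((w >>> 10) & 0x7FF) / 2047f,   // y: middle 11 bits
            (w & 0x3FF) / 1023f             // z: low 10 bits
        };
    }
    public static void main(String[] args) {
        int w = pack(0.25f, 0.5f, 0.75f);
        float[] p = unpack(w);
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]); // small quantisation error
    }
}
```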
4

Johnston, Christopher Troy. "VERTIPH : a visual environment for real-time image processing on hardware : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Systems Engineering at Massey University, Palmerston North, New Zealand". Massey University, 2009. http://hdl.handle.net/10179/1219.

Full text
Abstract:
This thesis presents VERTIPH, a visual programming language for the development of image processing algorithms on FPGA hardware. The research began with an examination of the whole design cycle, with a view to identifying requirements for implementing image processing on FPGAs. Based on this analysis, a design process was developed where a selected software algorithm is matched to a hardware architecture tailor-made for its implementation. The algorithm and architecture are then transformed into an FPGA-suitable design. It was found that in most cases the most efficient mapping for image processing algorithms is to use a streamed processing approach. This constrains how data is presented and requires most existing algorithms to be extensively modified. Therefore, the resultant designs are heavily streamed and pipelined. A visual notation was developed to complement this design process, as both streaming and pipelining can be well represented by data flow visual languages. The notation has three views, each of which represents and supports a different part of the design process. An architecture view gives an overview of the design's main blocks and their interconnections. A computational view represents lower-level details by representing each block by a set of computational expressions and low-level controls. This includes a novel visual representation of pipelining that simplifies latency analysis, multiphase design, priming, flushing and stalling, and the detection of sequencing errors. A scheduling view adds a state machine for high-level control of processing blocks. This extends state objects to allow for the priming and flushing of pipelined operations. User evaluations of an implementation of the key parts of this language (the architecture view and the computational view) found that both were generally good visualisations and aided in design (especially the type interface, pipeline and control notations). The user evaluations provided several suggestions for the improvement of the language, and in particular the evaluators would have preferred to use the diagrams as a verification tool for a textual representation rather than as the primary data capture mechanism. A cognitive dimensions analysis showed that the language scores highly for thirteen of the twenty dimensions considered, particularly those related to making details of the design clearer to the developer.
5

Mohanarajah, Selvarajah. "Designing CBL systems for complex domains using problem transformation and fuzzy logic : a thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer Science at Massey University, Palmerston North, New Zealand". Massey University, 2007. http://hdl.handle.net/10179/743.

Full text
Abstract:
Some disciplines are inherently complex and challenging to learn. This research attempts to design an instructional strategy for CBL systems to simplify learning certain complex domains. Firstly, problem transformation, a constructionist instructional technique, is used to promote active learning by encouraging students to construct more complex artefacts based on less complex ones. Scaffolding is used at the initial learning stages to alleviate the difficulty associated with complex transformation processes. The proposed instructional strategy brings various techniques together to enhance the learning experience. A functional prototype is implemented with Object-Z as the exemplar subject. Both objective and subjective evaluations using the prototype indicate that the proposed CBL system has a statistically significant impact on learning a complex domain. CBL systems include Learner models to provide adaptable support tailored to individual learners. Bayesian theory is generally used to manage uncertainty in Learner models. In this research, a fuzzy logic based, locally intelligent Learner model is utilized. The fuzzy model is simple to design and implement, easy to understand and explain, and efficient. Bayesian theory is used to complement the fuzzy model. Evaluation shows that the accuracy of the proposed Learner model is statistically significant. Further, opening the Learner model reduces uncertainty, and the fuzzy rules are simple and resemble human reasoning processes. Therefore, it is argued that opening a fuzzy Learner model is both easy and effective. Scaffolding requires formative assessments. In this research, a confidence-based multiple-test marking scheme is proposed, as traditional schemes are not suitable for measuring partial knowledge. Subjective evaluation confirms that the proposed scheme is effective. Finally, a step-by-step methodology to transform simple UML class diagrams to Object-Z schemas is designed in order to implement problem transformation. This methodology could be extended to implement a semi-automated translation system for UML to Object Models.
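
The flavour of a fuzzy Learner model can be sketched with triangular membership functions and two human-readable rules. The rule shapes, thresholds, and inputs below are illustrative assumptions, not the thesis's actual model.

```java
/** Sketch: a two-rule fuzzy estimate of learner mastery (membership shapes assumed). */
public class FuzzyLearnerModel {
    // Triangular membership function over [a, c] peaking at b.
    static double tri(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0.0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }
    /** score: recent quiz score in [0,1]; confidence: self-reported in [0,1]. */
    static String classify(double score, double confidence) {
        // Rule 1: low score OR low confidence -> needs scaffolding (fuzzy OR = max)
        double low  = Math.max(tri(score, -0.01, 0.0, 0.5), 1.0 - confidence);
        // Rule 2: high score AND high confidence -> mastered (fuzzy AND = min)
        double high = Math.min(tri(score, 0.5, 1.0, 1.01), confidence);
        return high > low ? "mastered" : "needs scaffolding";
    }
    public static void main(String[] args) {
        System.out.println(classify(0.85, 0.9)); // mastered
        System.out.println(classify(0.40, 0.8)); // needs scaffolding
    }
}
```

Rules of this shape are easy to show to the learner, which is the point made above about opening a fuzzy Learner model.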
6

Chetsumon, Sireerat. "Attitudes of extension agents towards expert systems as decision support tools in Thailand". Lincoln University, 2005. http://hdl.handle.net/10182/1371.

Full text
Abstract:
It has been suggested 'expert systems' might have a significant role in the future through enabling many more people to access human experts. It is, therefore, important to understand how potential users interact with these computer systems. This study investigates the effect of extension agents' attitudes towards the features and use of an example expert system for rice disease diagnosis and management (POSOP). It also considers the effect of extension agents' personality traits and intelligence on their attitudes towards its use, and the agents' perception of control over using it. Answers to these questions lead to developing better systems and to increasing their adoption. Using structural equation modelling, two models - the extension agents' perceived usefulness of POSOP, and their attitude towards the use of POSOP, were developed (Models ATU and ATP). Two of POSOP's features (its value as a decision support tool, and its user interface), two personality traits (Openness (O) and Extraversion (E)), and the agents' intelligence, proved to be significant, and were evaluated. The agents' attitude towards POSOP's value had a substantial impact on their perceived usefulness and their attitude towards using it, and thus their intention to use POSOP. Their attitude towards POSOP's user interface also had an impact on their attitude towards its perceived usefulness, but had no impact on their attitude towards using it. However, the user interface did contribute to its value. In Model ATU, neither Openness (O) nor Extraversion (E) had an impact on the agents' perceived usefulness, indicating POSOP was considered useful regardless of the agents' personality background. However, Extraversion (E) had a negative impact on their intention to use POSOP in Model ATP, indicating that 'introverted' agents had a clear intention to use POSOP relative to the 'extroverted' agents. Extension agents' intelligence, in terms of their GPA, had neither an impact on their attitude, nor their subjective norm (expectation of 'others' beliefs), to the use of POSOP. It also had no association with any of the variables in both models. Both models explain and predict that it is likely that the agents will use POSOP. However, the availability of computers, particularly their capacity, are likely to impede its use. Although the agents believed using POSOP would not be difficult, they still believed training would be beneficial. To be a useful decision support tool, the expert system's value and user interface as well as its usefulness and ease of use, are all crucially important to the preliminary acceptance of a system. Most importantly, the users' problems and needs should be assessed and taken into account as a first priority in developing an expert system. Furthermore, the users should be involved in the system development. The results emphasise that the use of an expert system is not only determined by the system's value and its user interface, but also the agents' perceived usefulness, and their attitude towards using it. In addition, the agents' perception of control over using it is also a significant factor. The results suggested improvements to the system's value and its user interface would increase its potential use, and also providing suitable computers, coupled with training, would encourage its use.
7

Blakey, Jeremy Peter. "Database training for novice end users : a design research approach : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Albany, New Zealand". Massey University, 2008. http://hdl.handle.net/10179/880.

Full text
Abstract:
Of all of the desktop software available, that for the implementation of a database is some of the most complex. With the increasing number of computer users having access to this sophisticated software, but with no obvious way to learn the rudiments of data modelling for the implementation of a database, there is a need for a simple, convenient method to improve their understanding. The research described in this thesis represents the first steps in the development of a tool to accomplish this improvement. In a preliminary study using empirical research a conceptual model was used to improve novice end users’ understanding of the relational concepts of data organisation and the use of a database software package. The results showed that no conclusions could be drawn about either the artefact used or the method of evaluation. Following the lead of researchers in the fields of both education and information systems, a design research process was developed, consisting of the construction and evaluation of a training artefact. A combination of design research and a design experiment was used in the main study described in this thesis. New to research in information systems, design research is a methodology or set of analytical techniques and perspectives, and this was used to develop a process (development of an artefact) and a product (the artefact itself). The artefact, once developed, needed to be evaluated for its effectiveness, and this was done using a design experiment. The experiment involved exposing the artefact to a small group of end users in a realistic setting and defining a process for the evaluation of the artefact. The artefact was the tool that would facilitate the improvement of the understanding of data modelling, the vital precursor to the development of a database. The research was conducted among a group of novice end users who were exposed to the artefact, facilitated by an independent person. In order to assess whether there was any improvement in the novices’ understanding of relational data modelling and database concepts, they then completed a post-test. Results confirmed that the artefact, trialled through one iteration, was successful in improving the understanding of these novice end users in the area of data modelling. The combination of design research and design experiment as described above gave rise to a new methodology, called experimental design research at this early juncture. The successful outcome of this research will lead to further iterations of the design research methodology, leading in turn to the further development of the artefact which will be both useful and accessible to novice users of personal computers and database software. This research has made the following original contributions. Firstly, the use of the design research methodology for the development of the artefact, which proved successful in improving novice users’ understanding of relational data structures. Secondly, the novel use of a design experiment in an information systems project, which was used to evaluate the success of the artefact. And finally, the combination of the developed artefact followed by its successful evaluation using a design experiment resulted in the hybrid experimental design research methodology. The success of the implementation of the experimental design research methodology in this information systems project shows much promise for its successful application to similar projects.
8

Rutherford, Paul. "Usability of navigation tools in software for browsing genetic sequences". Diss., Lincoln University, 2008. http://hdl.handle.net/10182/948.

Full text
Abstract:
Software to display and analyse DNA sequences is a crucial tool for bioinformatics research. The data of a DNA sequence has a relatively simple format but the length and sheer volume of data can create difficulties in navigation while maintaining overall context. This is one reason that current bioinformatics applications can be difficult to use. This research examines techniques for navigating through large single DNA sequences and their annotations. Navigation in DNA sequences is considered here in terms of the navigational activities: exploration, wayfinding and identifying objects. A process incorporating user-centred design was used to create prototypes involving panning and zooming of DNA sequences. This approach included a questionnaire to define the target users and their goals, an examination of existing bioinformatics applications to identify navigation designs, a heuristic evaluation of those designs, and a usability study of prototypes. Three designs for panning and five designs for zooming were selected for development. During usability testing, users were asked to perform common navigational activities using each of the designs. The "Connected View" design was found to be the most usable for panning, while the "Zoom Slider" design was best for zooming and was the most useful zooming tool for tasks involving browsing. For some tasks the ability to zoom was unnecessary. The research provides important insights into the expectations that researchers have of bioinformatics applications and suitable methods for designing for that audience. The outcomes of this type of research can be used to help improve bioinformatics applications so that they will be truly usable by researchers.
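
The panning and zooming tasks studied here reduce to viewport arithmetic over base-pair coordinates. Below is a sketch of that mapping with hypothetical names and a cursor-anchored zoom; it illustrates the underlying arithmetic, not the "Connected View" or "Zoom Slider" designs themselves.

```java
/** Sketch: viewport arithmetic for panning/zooming a long DNA sequence (names assumed). */
public class SequenceViewport {
    private final long sequenceLength;   // total bases
    private final int widthPx;           // viewport width in pixels
    private double basesPerPixel = 100;  // zoom level
    private long offset = 0;             // leftmost visible base

    SequenceViewport(long sequenceLength, int widthPx) {
        this.sequenceLength = sequenceLength;
        this.widthPx = widthPx;
    }
    void pan(int deltaPx) {              // drag left/right by whole pixels
        offset = clamp(offset + Math.round(deltaPx * basesPerPixel));
    }
    void zoom(double factor, int anchorPx) {  // keep the base under the cursor fixed
        long anchorBase = offset + Math.round(anchorPx * basesPerPixel);
        basesPerPixel = Math.max(0.1, basesPerPixel * factor);
        offset = clamp(anchorBase - Math.round(anchorPx * basesPerPixel));
    }
    long clamp(long v) {                 // never scroll past either end of the sequence
        long maxOffset = Math.max(0, sequenceLength - Math.round(widthPx * basesPerPixel));
        return Math.max(0, Math.min(v, maxOffset));
    }
    public static void main(String[] args) {
        SequenceViewport vp = new SequenceViewport(3_000_000L, 800);
        vp.zoom(0.5, 400);  // zoom in, centred on mid-screen
        vp.pan(-80);        // pan left 80 px
        System.out.println("first visible base: " + vp.offset);
    }
}
```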
9

Dermoudy, J. "Effective Run-Time Management of Parallelism in a Functional Programming Context". 2002. http://eprints.utas.edu.au/67.

Full text
Abstract:
This thesis considers how to speed up the execution of functional programs using parallel execution, load distribution, and speculative evaluation. This is an important challenge given the increasing complexity of software systems, the decreasing cost of individual processors, and the appropriateness of the functional paradigm for parallelisation. Processor speeds are continuing to climb — but the magnitudes of increase are overridden by both the increasing complexity of software and the escalating expectation of users. Future gains in speed are likely to occur through the combination of today’s conventional uni-processors to form loosely-coupled multicomputers. Parallel program execution can theoretically provide linear speed-ups, but for this theoretical benefit to be realised two main hurdles must be overcome. The first of these is the identification and extraction of parallelism within the program to be executed. The second hurdle is the runtime management and scheduling of the parallel components to achieve the speed-up without slowing the execution of the program. Clearly a lot of work can be done by the programmer to ‘parallelise’ the algorithm. There is often, however, much parallelism available without significant effort on the part of the programmer. Functional programming languages and compilers have received much attention in the last decade for the contributions possible in parallel executions. Since the semantics of languages from the functional programming paradigm manifest the Church-Rosser property (that the order of evaluation of sub-expressions does not affect the result), sub-expressions may be executed in parallel. The absence of side-effects and the lack of state facilitate the availability of expressions suitable for concurrent evaluation. Unfortunately, such expressions may involve varying amounts of computation or require high amounts of data — both of which complicate the management of parallel execution. If the future of computation is through the formation of multicomputers, we are faced with the high probability that the number of available processing units will quickly outweigh the known parallelism of an algorithm at any given moment during execution. Intuitively this spare processing power should be utilised if possible. The premise of speculative evaluation is that it employs otherwise idle tasks on work that may prove beneficial. The more program components available for execution the greater the opportunity for speculation and potentially the quicker the program’s result may be obtained. The second impediment for the parallel execution of programs is the scheduling of program components for evaluation. Multicomputer execution of a program involves the allocation of program components among the available tasks to maximise throughput. We present a decentralised, speculation-cognate, load distribution algorithm that allocates and manages the distribution of program components among the tasks with the co-aim of minimising the impact on tasks executing program components known to be required. In this dissertation we present our implementation of minimal-impact speculative evaluation in the context of the functional programming language Haskell augmented with a number of primitives for the indication of useful parallelism. We expound four (two quantitative and two qualitative) novel schemes for expressing the initial speculative contribution of program components and provide a translation mechanism to illustrate the equivalence of the four. 
The implementation is based on the Glasgow Haskell Compiler (GHC) version 0.29 — the de facto standard for parallel functional programming research — and strives to minimise the runtime overhead of managing speculative evaluation. We have augmented the Graph reduction for a Unified Machine model (GUM) runtime system with our load distribution algorithm and speculative evaluation sub-system. Both are motivated by the need to facilitate speculative evaluation without adversely impacting on program components directly influencing the program's result. Experiments have been undertaken using common benchmark programs. These programs have been executed under sequential, conservative parallel, and speculative parallel evaluation to study the overheads of the runtime system and to show the benefits of speculation. The results of the experiments conducted using an emulated multicomputer add evidence of the usefulness of speculative evaluation in general and effective speculative evaluation in particular.
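
The premise of speculative evaluation, using otherwise idle capacity for work that may prove beneficial and abandoning it when it does not, can be sketched outside the thesis's setting too. Here is a minimal Java analogue with a thread pool standing in for spare tasks; the thesis itself works in Haskell on GHC/GUM, so everything below is an illustrative translation of the idea.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/** Sketch: speculative evaluation with threads (the thesis works in Haskell/GUM, not Java). */
public class Speculate {
    public static void main(String[] args) throws Exception {
        ExecutorService idleWorkers = Executors.newFixedThreadPool(2);
        // Speculatively start a computation whose result we *might* need.
        Future<Long> maybeNeeded = idleWorkers.submit(Speculate::expensiveAlternative);
        long primary = mandatoryComputation();       // work known to be required
        if (primary < 0) {                           // only now do we know the branch is needed
            System.out.println("speculation paid off: " + maybeNeeded.get());
        } else {
            maybeNeeded.cancel(true);                // discard the speculative work
            System.out.println("primary sufficed: " + primary);
        }
        idleWorkers.shutdown();
    }
    static long mandatoryComputation() { return 42; }
    static long expensiveAlternative() {
        long s = 0;
        for (int i = 0; i < 1_000_000; i++) s += i;  // stand-in for a costly sub-expression
        return s;
    }
}
```

The scheduling problem the abstract describes is exactly the tension visible here: speculative work must never delay the mandatory computation it might replace.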
10

Goldsmith, B. "A Peer to Peer Supply Chain Network". 2004. http://eprints.utas.edu.au/148.

Full text
Abstract:
Many papers have speculated on the possibility of applying peer-to-peer networking concepts to networks that exist in the physical world, such as financial markets, business or personal communication, and ad hoc networking. One such application discussed in the literature is the application of peer-to-peer networking to corporate supply chains, to provide a flexible communication medium that may overcome some classical problems in supply chain management. This thesis presents the design, development and evaluation of a peer-to-peer supply chain system. A general, flexible peer-to-peer network was developed to serve as a foundation for building peer-to-peer data-swapping applications. It provides simple network management, searching and data-swapping services, which form the basis of many peer-to-peer systems. Using the developed framework, a supply-chain-focused application was built to test the feasibility of applying peer-to-peer networking to supply chain management. Results and discussion of a scenario analysis, which yielded positive results, are presented. Several future directions for research in this area are also discussed.
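
The searching service such a network provides is typically a TTL-limited flood to neighbouring peers. Here is a toy in-memory sketch of that pattern, with invented supplier names and no real networking; the thesis's actual protocol is not reproduced.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Optional;
import java.util.Set;

/** Sketch: TTL-limited flooding search over an in-memory peer graph (structures assumed). */
public class PeerSearch {
    final String name;
    final Set<String> inventory = new HashSet<>();   // e.g. part numbers this supplier holds
    final List<PeerSearch> neighbours = new ArrayList<>();

    PeerSearch(String name) { this.name = name; }

    // Returns which peer can supply the item, flooding at most ttl hops.
    Optional<String> search(String item, int ttl, Set<PeerSearch> visited) {
        if (!visited.add(this) || ttl < 0) return Optional.empty();
        if (inventory.contains(item)) return Optional.of(name);
        for (PeerSearch p : neighbours) {
            Optional<String> hit = p.search(item, ttl - 1, visited);
            if (hit.isPresent()) return hit;
        }
        return Optional.empty();
    }
    public static void main(String[] args) {
        PeerSearch a = new PeerSearch("assembler"), b = new PeerSearch("wholesaler"),
                   c = new PeerSearch("parts-maker");
        a.neighbours.add(b); b.neighbours.add(c);
        c.inventory.add("gearbox-7");
        System.out.println(a.search("gearbox-7", 3, new HashSet<>())); // Optional[parts-maker]
    }
}
```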
11

Kirk, Diana Caroline. "Flexible software process model". 2007. http://hdl.handle.net/2292/4228.

Full text
Abstract:
Many different kinds of process are used to develop software-intensive products, but there is little agreement as to which processes give the best results under which circumstances. Practitioners and researchers believe that project outcomes would be improved if the development process was constructed according to project-specific factors. In order to achieve this goal, greater understanding of the factors that most affect outcomes is needed. To improve understanding, researchers build models of the process and carry out studies based on these models. However, current models contain many ambiguities and assumptions, and so it is not clear what the results of the studies mean. The statement of this thesis is that it is possible to create an abstraction of the software development process that will provide a mechanism for comparing software processes and software process models. The long-term goal of the research is to provide planners with a means of tailoring the development process on a project-by-project basis, with the aim of reducing risk and improving outcomes.
12

Yakovlev, Vyacheslav. "Cluster analysis of object-oriented programs : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand". 2009. http://hdl.handle.net/10179/1030.

Full text
Abstract:
In this thesis we present a novel approach to the analysis of dependency graphs of object-oriented programs, and we describe a tool that has been implemented for this purpose. A graph-theoretical clustering algorithm is used to compute the modular structure of programs. This can assist software engineers to redraw component boundaries in software, improving the level of reuse and maintainability. The analysis of the dependency graph of an object-oriented program is useful for assessing the quality of software design. The dependency graph can be extracted from a program using various methods, including source code, byte code, and dynamic (behavioral) analysis. The nodes in the dependency graph are classes, members, packages and other artifacts, while the edges represent uses and extends relationships between those artifacts. Once the dependency graph has been extracted, it can be analysed to quantify certain characteristics of the respective program. Examples include the detection of circular dependencies and measurements of the responsibility or independence of units based on their relationships. Tools like JDepend implementing these principles have become very popular in recent years. Our work includes grouping types in dependency graphs using different clustering methods: grouping into namespaces; grouping into clusters using graph clustering algorithms; and grouping into clusters using rules. The detected mismatches are candidates for refactoring. We have developed a tool for clustering dependency graphs and presenting the results so that users can identify possible design violations.
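
The grouping step can be illustrated with a union-find pass that clusters classes connected by uses/extends edges. This connected-components grouping is a simple stand-in for the graph-clustering algorithm the thesis actually uses, and the class names are invented.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/** Sketch: group classes of a dependency graph into clusters via union-find
 *  (a stand-in for the thesis's graph-clustering algorithm). */
public class DependencyClusters {
    static Map<String, String> parent = new HashMap<>();

    static String find(String x) {                       // root lookup with path compression
        parent.putIfAbsent(x, x);
        if (!parent.get(x).equals(x)) parent.put(x, find(parent.get(x)));
        return parent.get(x);
    }
    static void union(String a, String b) { parent.put(find(a), find(b)); }

    public static void main(String[] args) {
        // "uses"/"extends" edges extracted from source or byte code (illustrative data)
        String[][] edges = { {"Order", "Customer"}, {"Order", "Invoice"},
                             {"MeshLoader", "Geometry"} };
        for (String[] e : edges) union(e[0], e[1]);

        Map<String, List<String>> clusters = new TreeMap<>();
        for (String cls : parent.keySet())
            clusters.computeIfAbsent(find(cls), k -> new ArrayList<>()).add(cls);
        clusters.values().forEach(System.out::println);  // candidate component boundaries
    }
}
```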
13

Mastilovich, Nikola. "Automatisation of programming of a PLC code : a thesis presented in partial fulfilment of the requirements of the degree of Masters of Engineering in Mechatronics". 2010. http://hdl.handle.net/10179/1681.

Full text
Abstract:
Appendix D (CD content) can be found with the print thesis held at the Turitea library, Palmerston North. Content: empty APCG program; empty RSLogix5000 l5k file; empty RSLogix5000 ACD file; real-life project APCG program (partial); real-life project RSLogix5000 l5k file (partial); real-life project RSLogix5000 ACD file (partial).
A competitive edge is one of the requirements of a successful business, and tools that increase an engineer's productivity and minimize cost can be considered a competitive edge. The objective of this thesis was to design, create, and implement Automatic PLC Code Generator (APCG) software. A secondary objective was to demonstrate that the use of the APCG software leads to improved project efficiency and an enhanced profit margin. The APCG software was built on MS Excel and Visual Basic for Applications (VBA): MS Excel sheets serve as the user interface, while VBA creates the PLC code from the information entered by the engineer. The PLC code created by the APCG software follows the PLC structure of Realcold Milmech Pty. Ltd., as well as the paper "Automatic generation of PLC code beyond the nominal sequence" by Guttel et al. [1]. The APCG software was used to design and create a PLC code for one of the projects undertaken by Realcold Milmech Pty. Ltd. By using the APCG software, the time to design, create, and test the PLC code was improved compared to the budgeted time, and the project's profit margin was increased. Based on the results of this thesis it is expected that the APCG software will be useful for programmers who handle a variety of projects on a regular basis, where programming in a modular way is not appropriate.
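
The core of such a generator is filling code templates from tabular I/O definitions. Below is a hedged sketch of the idea in Java; APCG itself is Excel/VBA, and the latch/unlatch rung text emitted here is illustrative rather than valid RSLogix5000 L5K syntax.

```java
import java.util.List;

/** Sketch: template-based PLC code generation (the APCG tool itself is Excel/VBA;
 *  the rung format below is illustrative, not RSLogix5000 L5K syntax). */
public class PlcCodeGen {
    record Actuator(String tag, String startInput, String stopInput) {}

    static String rungFor(Actuator a) {
        // Latch the run bit on the start input, unlatch it on the stop input.
        return String.format("XIC(%s) OTL(%s_Run);%n", a.startInput(), a.tag())
             + String.format("XIC(%s) OTU(%s_Run);%n", a.stopInput(), a.tag());
    }
    public static void main(String[] args) {
        // Rows an engineer would enter in the spreadsheet front end (invented data).
        List<Actuator> io = List.of(
                new Actuator("Conveyor1", "PB_Start_1", "PB_Stop_1"),
                new Actuator("Mixer2",    "PB_Start_2", "PB_Stop_2"));
        io.forEach(a -> System.out.print(rungFor(a)));
    }
}
```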
14

Majumdar, Anirban. "Design and evaluation of software obfuscations". 2008. http://hdl.handle.net/2292/3107.

Full text
Abstract:
Software obfuscation is a protection technique for making code unintelligible to automated program comprehension and analysis tools. It works by performing semantics-preserving transformations such that the difficulty of automatically extracting the computational logic out of code is increased. Obfuscating transforms in existing literature have been designed with the ambitious goal of being resilient against all possible reverse engineering attacks. Even though some of the constructions are based on intractable computational problems, we do not know, in practice, how to generate hard instances of obfuscated problems such that all forms of program analyses would fail. In this thesis, we address the problem of software protection by developing a weaker notion of obfuscation under which it is not required to guarantee absolute black-box security. Using this notion, we develop provably correct obfuscating transforms using dependencies existing within program structures and indeterminacies in communication characteristics between programs in a distributed computing environment. We show how several well-known static analysis tools can be used for reverse engineering obfuscating transforms that derive resilience from computationally hard problems. In particular, we restrict ourselves to one common and potent static analysis tool, the static slicer, and use it as our attack tool. We show the use of derived software engineering metrics to indicate the degree of success or failure of a slicer attack on a piece of obfuscated code. We address the issue of proving correctness of obfuscating transforms by adapting existing proof techniques for functional program refinement and communicating sequential processes. The results of this thesis could be used for future work in two ways: firstly, future researchers may extend our proposed techniques to design obfuscations using a wider range of dependencies that exist between dynamic program structures. Our restricted attack model using one static analysis tool can also be relaxed, and obfuscations capable of withstanding a broader class of static and dynamic analysis attacks could be developed based on the same principles. Secondly, our obfuscatory strength evaluation techniques could guide anti-malware researchers in the development of tools to detect obfuscated strains of polymorphic viruses.
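
A classic example of a semantics-preserving transform of the kind discussed here is an opaque predicate: a test whose outcome the obfuscator knows statically but an analysis tool must reason about. The sketch below, not taken from the thesis, uses the well-known fact that 7y² − 1 is never a perfect square (the congruence argument modulo 8 also survives Java's wrap-around integer arithmetic).

```java
/** Sketch: a semantics-preserving obfuscating transform using an opaque predicate. */
public class OpaqueExample {
    // Original: return n * 2;
    static int original(int n) { return n * 2; }

    // Obfuscated: same result, but control flow now depends on an opaquely true test,
    // and a bogus variable creates extra dependencies for a static slicer to chase.
    static int obfuscated(int n, int x, int y) {
        int bogus = x ^ y;
        if (7 * y * y - 1 != x * x) {      // opaquely true for all ints x, y
            return (n << 1) + (bogus & 0); // same value as n * 2
        }
        return bogus;                      // dead branch, never taken
    }
    public static void main(String[] args) {
        System.out.println(original(21));           // 42
        System.out.println(obfuscated(21, 5, 9));   // 42
    }
}
```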
Whole document restricted, but available on request; use the feedback form to request access.
15

Jan, Zaid. "Intelligent medical device integration with real time operating system : a thesis submitted to the School of Engineering in partial fulfilment of the requirements for the degree of Master of Engineering, Department of Electronics and Computer Syetem [i.e. Systems] Engineering at Massey University, [Albany], New Zealand". 2009. http://hdl.handle.net/10179/1501.

Full text
Abstract:
Many commercial devices now being produced have the ability to be remotely monitored and controlled. This thesis aims to develop a generic platform that can easily be extended to interface with many different kinds of devices for remote monitoring and control via a TCP/IP connection. The deployment concentrates on medical devices but can be extended to all serial device interfaces. The hardware used in the development of this platform is an ARM Cortex-M3 based microcontroller board, designed to meet the requirements set by Precept Health, the founder of this platform. The design was conducted at Massey University in collaboration with a senior engineer from the company. The main task in achieving the aim was the development of the necessary software layers to implement remote monitoring and control. The eCosCentric real-time embedded operating system was used to form a generic base for developing applications to monitor and control specific devices, and the majority of the work involved deploying the operating system to the microcontroller. During the development process, several hardware issues were discovered with the Ethernet interface and were corrected. Using the generic platform, an application was developed to allow bidirectional pass-through of a communication protocol from four isolated serial input channels to an Ethernet channel using the TCP protocol.
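
The pass-through application amounts to relaying a device byte stream to a TCP client unchanged. A hedged single-channel sketch follows; the thesis runs on an RTOS on an ARM Cortex-M3, so Java and the stdin stand-in for the serial port are illustrative only.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

/** Sketch: one-way serial-to-TCP pass-through for a single monitoring client.
 *  The serial InputStream is a stand-in; real hardware needs a serial driver. */
public class SerialTcpBridge {
    static void bridge(InputStream serialIn, int tcpPort) throws IOException {
        try (ServerSocket server = new ServerSocket(tcpPort);
             Socket client = server.accept();                 // wait for the monitor station
             OutputStream out = client.getOutputStream()) {
            byte[] buf = new byte[256];
            int n;
            while ((n = serialIn.read(buf)) != -1) {          // byte stream from the device
                out.write(buf, 0, n);                         // relay unchanged over TCP
                out.flush();
            }
        }
    }
    public static void main(String[] args) throws IOException {
        bridge(System.in, 5000);   // demo: stdin plays the role of the serial channel
    }
}
```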
16

Chen, Yi. "Efficient web-based application development tools on XML-enabled databases : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Sciences". 2008. http://hdl.handle.net/10179/896.

Full text
17

Ye, Jun. "A reusable peer-to-peer conversation tool for online second language learning : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Computer Science at Massey University, Palmerston North, New Zealand". 2008. http://hdl.handle.net/10179/865.

Full text
Abstract:
To support extramural learning, Johnson (2005) has proposed the Learning Computer concept, which aims to provide a learning appliance that can be used for studying university courses at any time, from anywhere, and by anybody who might have only basic software and hardware, a dial-up Internet connection, and little computer literacy. Isolated extramural students need extra support for interaction and collaboration in learning, especially in second language learning, which requires intensive oral language practice between the students and the tutor. This research project was a trial to extend IMMEDIATE (the prototype of the Learning Computer) to a second language extramural course. To meet the requirements of long-distance conversation in such a course, a synchronous/asynchronous bimodal approach was conceptualised based on a review of e-learning, communication, and VoIP technologies. It was proposed that the prototype should automatically adapt to either synchronous mode or asynchronous mode according to the level of Internet connection speed. An asynchronous conversation mode similar to Push-to-Talk (PTT) was also proposed. A VoIP SDK was investigated and used in the prototype for fast development, and the IMMEDIATE messaging protocols were extended in the prototype to control call procedures and the asynchronous conversation mode. An evaluation conducted to assess the prototype's usability, functionality and integrity demonstrated that users can conduct telephone-like synchronous conversation efficiently at high connection speed. Although the PTT-like asynchronous mode has a time-lag problem, especially when two users are both at low connection speed, it is still a good way for novices to practise second language oral skills. The evaluation gave strong support to the feasibility and effectiveness of the bimodal approach for applying IMMEDIATE in second language extramural learning.
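
The bimodal adaptation reduces to a threshold decision on the measured link speed. Here is a sketch with an assumed threshold; the prototype's actual probing method and cut-off value are not reproduced.

```java
/** Sketch: pick conversation mode from a measured link estimate (threshold assumed). */
public class ModeSelector {
    enum Mode { SYNCHRONOUS, ASYNCHRONOUS }

    // Minimum sustained rate assumed adequate for telephone-like duplex audio.
    static final int SYNC_THRESHOLD_KBPS = 48;

    static Mode select(int measuredKbps) {
        return measuredKbps >= SYNC_THRESHOLD_KBPS ? Mode.SYNCHRONOUS : Mode.ASYNCHRONOUS;
    }
    public static void main(String[] args) {
        System.out.println(select(256)); // broadband -> SYNCHRONOUS
        System.out.println(select(33));  // dial-up   -> ASYNCHRONOUS (push-to-talk style)
    }
}
```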
18

Zhang, Hao Lan. "Agent-based open connectivity for decision support systems". 2007. http://eprints.vu.edu.au/1453/1/zhang.pdf.

Full text
Abstract:
One of the major problems that discourages the development of Decision Support Systems (DSSs) is the un-standardised DSS environment. Computers that support modern business processes are no longer stand-alone systems, but have become tightly connected both with each other and their users. Therefore, having a standardised environment that allows different DSS applications to communicate and cooperate is crucial. The integration difficulty is the most crucial problem that affects the development of DSSs. Therefore, an open and standardised environment for integrating various DSSs is required. Despite the critical need for an open architecture in DSS designs, present DSS architectural designs are unable to provide a fundamental solution to enhance the flexibility, connectivity, compatibility, and intelligence of a DSS. The emergence of intelligent agent technology fulfils the requirements of developing innovative and efficient DSS applications, as intelligent agents offer various advantages, such as mobility, flexibility, and intelligence, to tackle the major problems in existing DSSs. Although various agent-based DSS applications have been suggested, most of these applications are unable to balance manageability with flexibility. Moreover, most existing agent-based DSSs are based on agent-coordinated design mechanisms, and often overlook the living environment for agents. This can cause difficulties in cooperating and upgrading agents, because the agent-based coordination mechanisms have limited capabilities to provide agents with relatively comprehensive information about global system objectives. This thesis proposes a novel multi-agent-based architecture for DSS, called Agent-based Open Connectivity for Decision support systems (AOCD). The AOCD architecture adopts a hybrid agent network topology that makes use of a unique feature called the Matrix-agent connection. The novel component, i.e. the Matrix, provides a living environment for agents; it allows agents to upgrade themselves through interacting with the Matrix. This architecture is able to overcome the difficulties in concurrency control and synchronous communication that plague many decentralised systems. Performance analysis has been carried out on this framework and we find that it is able to provide a high degree of flexibility and efficiency compared with other frameworks. The thesis explores the detailed design of the AOCD framework and the major components employed in this framework, including the Matrix, agents, and the unified Matrices structure. The proposed framework is able to enhance system reusability and maximize system performance. By using a set of interoperable autonomous agents, more creative decision-making can be accomplished in comparison with a hard-coded programmed approach. In this research, we systematically classified the agent network topologies, and developed an experimental program to evaluate the system performance based on three different agent network topologies. The experimental results present the evidence that the hybrid topology is efficient in the AOCD framework design. Furthermore, a novel topological description language for agent networks (TDLA) has been introduced in this research work, which provides an efficient mechanism for agents to perceive information about their interconnected network. A new Agent-Rank algorithm is introduced in the thesis in order to provide an efficient matching mechanism for agent cooperation.
The computational results based on our recently developed program for agent matchmaking demonstrate the efficiency and effectiveness of the Agent-Rank algorithm in the agent-matching and re-matching processes.
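
The Agent-Rank matching idea invites comparison with PageRank-style iteration over the agent network's adjacency structure. The sketch below is in that spirit, with an invented three-agent topology; the thesis's actual algorithm and weighting are not reproduced here.

```java
import java.util.Arrays;

/** Sketch: a PageRank-style ranking over an agent network, in the spirit of the
 *  thesis's Agent-Rank idea (the actual algorithm's details are not reproduced). */
public class AgentRank {
    public static void main(String[] args) {
        // adj[i][j] = 1 if agent i offers its results to agent j (illustrative topology)
        int[][] adj = { {0, 1, 1}, {0, 0, 1}, {1, 0, 0} };
        int n = adj.length;
        double d = 0.85;                       // damping factor
        double[] rank = new double[n];
        Arrays.fill(rank, 1.0 / n);            // uniform initial rank
        for (int iter = 0; iter < 50; iter++) {
            double[] next = new double[n];
            Arrays.fill(next, (1 - d) / n);
            for (int i = 0; i < n; i++) {
                int outDeg = Arrays.stream(adj[i]).sum();
                for (int j = 0; j < n; j++)
                    if (adj[i][j] == 1) next[j] += d * rank[i] / outDeg;
            }
            rank = next;
        }
        System.out.println(Arrays.toString(rank)); // higher rank -> preferred match
    }
}
```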
