Theses on the topic « ADS TOOL »

To see other types of publications on this topic, follow the link: ADS TOOL.

Create a correct reference in APA, MLA, Chicago, Harvard and various other citation styles.


Consult the 50 best theses for your research on the topic « ADS TOOL ».

Next to every source in the list of references there is an « Add to bibliography » button. Click on it, and we will automatically generate the bibliographic reference to the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online, whenever this information is included in the metadata.

Browse theses on a wide variety of disciplines and organise your bibliography correctly.

1

Piotroski, Janina. « THE EFFECTIVENESS OF USING AN ABSTRACTION-DECOMPOSITION SPACE AS A TOOL FOR CHARACTERIZING A KNOWLEDGE DOMAIN AND ENHANCING LEARNING ». Miami University / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=miami1161869699.

Full text
2

Rice, Richard M. « Creating an Ada module description tool ». Virtual Press, 1988. http://liblink.bsu.edu/uhtbin/catkey/539630.

Full text
Abstract:
The purpose of this project was to develop, using Object Oriented Development (OOD), a software tool identified as the Ada Module Description Tool (AMDT). The AMDT provides an automated way to get a module-level description of Ada code. A module-level description identifies packages, subprograms, objects and type declarations and their relationships. The tool also has the ability to compare Ada source code with a module-level description; the comparison identifies any object, type, subprogram or package declared in the module-level description that does not match the provided source code. The AMDT is made up of two executable programs that run on a VAX/VMS system. The Module Description Generator (MDG) generates a module-level description from a set of Ada source code files. The Module Description Checker (MDC) compares a module-level description to the Ada source code. Ada is the required High Order Language for the Department of Defense. The development methodology used was essentially Object Oriented Development as described in the book Software Engineering with Ada by Grady Booch and the Software Standards and Procedure Manual for Object Oriented Development (SSPM-M02.04 Draft). Booch's book is a description of the Object Oriented Development methodology, while the SSPM is a set of instructions and standard formats to implement the methodology. The total design of the AMDT is documented in five segments; the SSPM defines a segment as the code and documentation resulting from a pass through the OOD process. From a Software Quality Engineer's point of view the AMDT would save time by removing the need to check module descriptions by hand. From the Software Engineer's point of view, when the code is updated a new module description can be generated easily to keep the documentation current with the code. The AMDT as written does not find object declarations in the code; fortunately the effect is minor because the module description needs to be edited anyway. The module description generated by the MDG may contain too much information: the designer wants only the types, objects and operations that aid the understandability of the design and how it is implemented. The only check the MDC makes is whether an identifier in the module description appears in the code; it does not check whether there are extra items in the code that should be required in the module description.
Department of Computer Science
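As a rough illustration of the kind of check the MDC performs, here is a minimal Python sketch (not the thesis's Ada/VMS implementation; the pattern and names are invented for illustration): it pulls package, subprogram and type names out of Ada source with a simple pattern match and reports any identifier that a module description lists but the code does not declare.

```python
import re

# Naive pattern for Ada declarations; a real tool like the AMDT would use a proper parser.
DECL_RE = re.compile(r"\b(?:package(?:\s+body)?|procedure|function|type)\s+(\w+)",
                     re.IGNORECASE)

def declared_identifiers(ada_source: str) -> set:
    """Collect identifiers introduced by package/subprogram/type declarations."""
    return {name.lower() for name in DECL_RE.findall(ada_source)}

def check_module_description(described, ada_source: str) -> set:
    """Return identifiers listed in a module description but absent from the code,
    in the spirit of the MDC-style check summarised above."""
    return {d.lower() for d in described} - declared_identifiers(ada_source)

source = """
package Stack is
   type Item is new Integer;
   procedure Push (X : in Item);
   function Pop return Item;
end Stack;
"""
description = {"Stack", "Item", "Push", "Pop", "Clear"}
print(check_module_description(description, source))   # {'clear'}
```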
3

Britnell, Richard Neely. « Ada as a paedeutic tool for abstract data types ». Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/22852.

Full text
4

Fischer, Thomas. « Designing (tools (for designing (tools for ...))) ». RMIT University. Architecture and Design, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080424.160537.

Full text
Abstract:
Outcomes of innovative designing are frequently described as enabling us in achieving more desirable futures. How can we design and innovate so as to enable future processes of design and innovation? To investigate this question, this thesis probes the conditions, possibilities and limitations of toolmaking for novelty and knowledge generation, or in other words, it examines designing for designing. The focus of this thesis is on the development of digital design tools that support the reconciliation of conflicting criteria centred on architectural geometry. Of particular interest are the roles of methodological approaches and of biological analogies as guides in toolmaking for design, as well as the possibility of generalising design tools beyond the contexts from which they originate. The presented investigation consists of an applied toolmaking study and a subsequent reflective analysis using second-order cybernetics as a theoretical framework. Observations made during the toolmaking study suggest that biological analogies can, in informal ways, inspire designing, including the designing of design tools. Design tools seem to enable the generation of novelty and knowledge beyond the contexts in and for which they are developed only if their users apply them in ways unanticipated by the toolmaker. The reflective analysis offers theoretical explanations for these observations based on aspects of second-order cybernetics. These aspects include the modelling of designing as a conversation, different relationships between observers (such as designers) and systems (such as designers engaged in their projects), the distinction between coded and uncoded knowledge, as well as processes underlying the production and the restriction of meaning. Initially aimed at the development of generally applicable, prescriptive digital tools for designing, the presented work results in a personal descriptive model of novelty and knowledge generation in science and design. This shift indicates a perspective change from a positivist to a relativist outlook on designing, which was accomplished over the course of the study. Investigating theory and practice of designing and of science, this study establishes an epistemological model of designing that accommodates and extends a number of theoretical concepts others have previously proposed. According to this model, both design and science generate and encode new knowledge through conversational processes, in which open-minded perception appears to be of greater innovative power than efforts to exercise control. The presented work substantiates and exemplifies radical constructivist theory of knowledge and novelty production, establishes correspondences between systems theory and design research theory and implies that mainstream scientific theories and practices are insufficient to account for and to guide innovation. Keywords: Digital design tools, geometry rationalisation, second-order cybernetics, knowledge generation
5

Kemshal-Bell, Guy Jonathon. « Interactive media - a tool to enhance human communication ». RMIT University. Creative Media, 2007. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080102.100544.

Full text
Abstract:
This exegesis investigates the use of interactive online media to support the development of communication and problem solving skills amongst learners in a Vocational Education and Training (VET) context. It describes the development of the Maelstrom website as a response to the identified need for a collaborative, interactive online space where learners can explore and experiment within the safe and anonymous environment provided. The user interaction within the Maelstrom and user responses to their experiences are discussed and analysed not only to inform the role of the Maelstrom within the broader context of interactive online communication and collaboration, but also to guide future research.
6

Eliasson, Sandra. « Connections between household and street : Social, calm, safe and intimate - Tool housing ». Thesis, Umeå universitet, Arkitekthögskolan vid Umeå universitet, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-72674.

Full text
7

Kumar, Hemant. « Software analytical tool for assessing cardiac blood flow parameters / ». View thesis, 2001. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030724.122149/index.html.

Full text
8

Jankowitz, H. T. « Software tools to aid PASCAL and ADA program design ». Thesis, University of Southampton, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.381132.

Full text
9

Vom Braucke, Troy S. « Establishment of a database for tool life performance ». Swinburne University of Technology, 2004. http://adt.lib.swin.edu.au./public/adt-VSWT20050914.085324.

Full text
Abstract:
The cutting tool industry has evolved over the last half century to the point where an increasing range and complexity of cutting tools are available for metal machining. This highlighted a need to provide an intelligent, user-friendly system of tool selection and recommendation that can also provide predictive economic performance data for engineers and end-users alike. Such an 'expert system' was developed for a local manufacturer of cutting tools in the form of a relational database to be accessed over the Internet. A number of predictive performance models were reviewed for various machining processes; however, they did not encompass the wide range of variables encountered in metal machining, so adapting these existing models for an expert system was reasoned to be economically prohibitive at this time. Interrogation of published expert systems from cutting tool manufacturers showed the knowledge-engineered principle to be a common approach to transferring economic and technological information to an end-user, the key advantage being the flexibility to allow further improvements as new knowledge is gained. As such, a relational database was built upon the knowledge-engineered principle, based on skilled craft-oriented knowledge, to establish an expert system for the selection and performance assessment of cutting tools. An investigation into tapping of austenitic stainless steels was undertaken to develop part of a larger expert system. The expert system was then interrogated in this specific area in order to challenge, by experiment, the skilled craft-oriented knowledge in this area. The experimental results were incorporated into the database where appropriate, providing a user-friendly working expert system for intelligent cutting tool selection, recommendation and performance data.
10

Daddabbo, Gianvincenzo. « Virtual ADAS/AD ECU Validation & Design Scenario Tool in MIL and HIL environment ». Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022. http://amslaurea.unibo.it/25045/.

Full text
Abstract:
Autonomous driving is the main trend the automotive industry is following. Starting from the first driver assistance systems, cars have been growing steadily in complexity, becoming four-wheeled supercomputers able to perform many driving actions fully autonomously. This thesis aims to describe the functionalities and methodologies that contribute to this progress. The entire activity is based on a tool for automatic scenario generation. In the first two chapters the context in which the tool is placed is introduced, highlighting the method by which it is possible to design its input. Subsequently, the most important aspects of the tool are described, focusing on the graphical interface and the standard it must be compliant with. Two entire chapters are devoted to the analysis of fundamental testing methodologies applied to the tool, while in the last part of the thesis the final results are presented and discussed.
11

Khanse, Karan Rajiv. « Development and Validation of a Tool for In-Plane Antilock Braking System (ABS) Simulations ». Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/56567.

Full text
Abstract:
Automotive and tire companies spend extensive amounts of time and money to tune their products through prototype testing at dedicated test facilities. This is mainly due to the limitations in the simulation capabilities that exist today. With greater competence in simulation comes more control over designs in the initial stages, which in turn lowers the demand on the expensive stage of tuning. The work presented aims at taking today's simulation capabilities a step forward by integrating models that are best developed in different software interfaces. An in-plane rigid ring model is used to understand the transient response of tires to various high-frequency events such as anti-lock braking and short-wavelength road disturbances. A rule-based ABS model performs the high-frequency braking operation. The tire and ABS models have been created in the Matlab-Simulink environment. The vehicle model has been developed in CarSim. The models developed in Simulink have been integrated with the vehicle model in CarSim, in the form of a design tool that can be used by tire as well as vehicle designers for further tuning of the vehicle functional performances as they relate to in-line braking scenarios. Outdoor validation tests were performed to obtain data from a vehicle that was measured on a suspension parameter measuring machine (SPMM) in order to complement this design tool. The results of the objective tests performed are discussed, and the correlations and variations with respect to the simulation results are analyzed.
Master of Science
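To make the idea of a rule-based ABS model concrete, here is a minimal sketch, assuming invented thresholds and a simple bang-bang rule; it is not the Simulink/CarSim model described above, only an illustration of modulating brake pressure from the wheel slip ratio during straight-line braking.

```python
def slip_ratio(vehicle_speed: float, wheel_speed: float, wheel_radius: float) -> float:
    """Longitudinal slip during braking: 0 = free rolling, 1 = locked wheel."""
    if vehicle_speed <= 0.1:           # avoid division by zero near standstill
        return 0.0
    return (vehicle_speed - wheel_speed * wheel_radius) / vehicle_speed

def abs_command(slip: float, pressure: float, dp: float = 5.0,
                release_above: float = 0.2, apply_below: float = 0.1) -> float:
    """Simple rule-based logic: release pressure when slip is too high,
    re-apply when slip has recovered, otherwise hold."""
    if slip > release_above:
        pressure -= dp                 # wheel tending to lock: release
    elif slip < apply_below:
        pressure += dp                 # good grip: build pressure again
    return max(0.0, pressure)

# Example: one control step at 20 m/s with a wheel spinning at 50 rad/s (r = 0.3 m)
s = slip_ratio(20.0, 50.0, 0.3)        # slip = 0.25 -> release
print(round(s, 2), abs_command(s, pressure=80.0))   # 0.25 75.0
```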
12

Coupland, Mary. « Learning with new tools ». Access electronically, 2004. http://www.library.uow.edu.au/adt-NWU/public/adt-NWU20041221.111821/index.html.

Full text
13

Vanstone, Anita Mary. « Are cultural impact assessments a tool for collaborative management ? » University of Otago. Department of Geography, 2003. http://adt.otago.ac.nz./public/adt-NZDU20070504.113743.

Full text
Abstract:
This thesis investigates the participation of Maori (New Zealand's indigenous people) in the impact assessment process. Traditionally, Maori have had limited involvement in the management of New Zealand's environment. One possible solution to this could be the adoption of a collaborative management framework. Unfortunately, there is limited information and research on tools that could facilitate collaborative management between iwi and applicants for resource consent (including developers, planning consultants and local authorities). Therefore, this research attempts to fill a gap in the current literature and to investigate the potential of the cultural impact assessment as a tool for collaborative management. Despite some criticisms of collaborative management, there are examples where this form of communicative planning has resulted in a very positive outcome for indigenous groups. Therefore, the specific aim of this research is to analyse the extent to which cultural impact assessments can be used as a tool to promote collaborative management between iwi and applicants. In achieving the research objectives of the thesis, the theoretical background of collaborative management and impact assessment theories is explored. In addition, democracy and participation theories are also investigated. In particular, in the discussion of these theories emphasis is placed on the potential involvement of indigenous peoples. The thesis argues that the application of collaborative management via the use of cultural impact assessments may potentially increase Maori involvement in planning. Analysis of collaborative management and impact assessment theories is supported by empirical research. This includes: 1) an exploration of the New Zealand setting for the two theories, 2) a content analysis of cultural impact assessments from eight different iwi authorities in New Zealand, and 3) a case study analysis of two iwi organizations that have an established system for undertaking cultural impact assessments (Kai Tahu ki Otago and the Wellington Tenths Trust). The research finds that cultural impact assessments are very similar to other impact assessment reports. However, they should be viewed as evolving documents, as there are some areas of the assessment process that need to be improved upon. The research concludes by suggesting that cultural impact assessments do have the potential to be a tool for collaborative management between iwi and applicants. Further research and education in relation to the content, value and process of cultural impact assessments is required. It is also argued that increased resourcing, training and legislative requirements are needed to further increase Maori participation in planning.
14

Van Der Loo, Kenneth William. « Design and implementation of ASCGEN, a PROLOG tool for translating graphical designs into ADA code ». Carleton University, Ottawa, 1987.

Find full text
15

Chan, Tsz Lung. « Performance enhancement of machining process by an add-on online measurement system / ». View abstract or full-text, 2008. http://library.ust.hk/cgi/db/thesis.pl?IELM%202008%20CHAN.

Full text
16

Stebbins, Andrew. « The Chinese Civilizing Process : Eliasian Thought as an Effective Analytical Tool for the Chinese Cultural Context ». Murdoch University, 2009. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20100211.123651.

Full text
Abstract:
This thesis is an effort to apply Elias's thinking on social development to the Chinese social situation. At first glance his account of the civilizing process would appear incompatible with this context, in that, after state formation with the Qin and Han dynasties beginning in 221 BC, Chinese civilization remained both stable and highly traditional for well over two millennia. It is argued, however, that closer scrutiny reveals a process that was merely interrupted for a considerable period. The traditional system relied upon a symbiotic relationship between local society and the centre whereby the centre remained relatively small and aloof, not interfering with local social relations, as long as local society provided the required taxes and labour. In this situation the state had the monopolies of both violence and taxation that Elias would look for, but left local society to its own devices primarily because it was already pacified. This self-reinforcing system was enshrined and codified in the Confucian canon over the course of centuries from the Han dynasty. Central control of the distribution of resources was eventually required to re-start the Chinese civilizing process, for this was the mechanism through which the local social structure would finally be altered. This only happened within the past century as the Chinese people struggled to grapple with their own 'backwardness' in the face of incessant Western and Japanese incursions. At this point the old system was toppled and replaced by progressively more aggressive central governments who saw as their most important task the destruction of the traditional social order in the interest of modernization. As the Chinese state consciously and forcibly took control of the distribution of resources at all levels of society, traditional social relations were stretched and warped, and the Chinese civilizing process re-commenced its long-stalled march toward modernization. This has been evidenced both by the dramatic growth in mobility and the rapidly extending chains of interdependence in the form of guanxi connections primarily during the Post-Opening period after 1978.
17

Blundell, Ian. « Co-management : a tool for genuine Maori involvement in coastal management ». University of Otago. Department of Geography, 2003. http://adt.otago.ac.nz./public/adt-NZDU20070507.114028.

Full text
Abstract:
For Maori, the management of New Zealand's coast and its resources is fundamental to their cultural identity. Iwi and hapu throughout New Zealand have close relationships with the coast and unique rights and responsibilities for its future management. However, there does not appear to be wide recognition of the crucial role of Maori in New Zealand's coastal management regime. Co-management initiatives in coastal management, particularly under the Conservation Act 1987, the Resource Management Act 1991 and several legislative initiatives controlling fisheries management, are explored and critiqued. Overall, the initiatives demonstrate that progress is being made in coastal management towards better protection of Maori coastal values. Nevertheless, there is scope for further improvements concerning greater Maori involvement in coastal management. Recommendations for genuine co-management systems in New Zealand's coastal management regime include effective communication between iwi and Government; appreciation of the unique nature of each iwi in New Zealand; involvement of a third party communicating between iwi and Government representatives; appropriate funding and resources to maintain the co-management system; and encouragement and motivation from the Government to initiate and maintain the co-management system.
18

Sieber, François. « Development of a tool to address nucleic acids into mitochondria ». Strasbourg, 2011. http://www.theses.fr/2011STRA6123.

Full text
Abstract:
Mitochondria are organelles found in nearly all eukaryotes. They are considered to be the energetic center of the cell because they generate ATP by oxidative phosphorylation, but they are also involved in many more biological processes such as lipid and amino acid metabolism, iron-sulphur (FeS) cluster biogenesis, calcium homeostasis and apoptosis. Mitochondria originate from the endosymbiosis of an alpha-proteobacterium ancestor into a proto-eukaryotic cell. Classical mitochondria have retained a highly reduced vestige of the genome of the ancestral bacteria, such that most mitochondrial proteins but also numerous tRNAs have to be imported from the cytosol into the mitochondria (Sieber et al., 2011a). Mitochondrial genomes are subject to numerous mutations that can result in mitochondrial dysfunctions, which are often dramatic for cell viability. Such mitochondrial disorders can be at the center of human neurodegenerative and neuromuscular diseases, diabetes, aging and also cancers (Florentz et al., 2003). In plants, mitochondrial disorders can originate from the presence of chimeric sequences in mitochondrial genomes, which lead to cytoplasmic male sterility (CMS). CMS plants are incapable of producing functional pollen and constitute a valuable tool in agronomy to produce hybrid plants that are more vigorous in culture (Budar and Pelletier, 2001). Mitochondrial transformation is thus of great interest, both for the study of mitochondrial disorders and as a biotechnological tool, for example in agronomy. Mitochondrial gene expression also remains poorly characterized and awaits reverse genetics tools for a better understanding. Except for the yeast S. cerevisiae and the unicellular alga C. reinhardtii, where stable mitochondrial transformation has been achieved by biolistic approaches (Fox et al., 1988; Remacle et al., 2006), no means exist to stably transform the mitochondrial DNA of higher eukaryotes. In my Ph.D., I focused on developing a tool for the efficient introduction of exogenous RNA into mitochondria of various model organisms. The strategy was to use a nucleic acid binding protein fused to a mitochondrial targeting sequence to create a protein shuttle capable of targeting RNA substrates to the mitochondrial matrix. As a shuttle candidate, I chose the mouse dihydrofolate reductase (DHFR), which binds nucleic acids non-specifically in vitro. A mitochondrial targeting sequence fused to the protein allows it to be imported into isolated mitochondria. DHFR fused to a mitochondrial targeting sequence (pDHFR) is conventionally used to dissect the mechanism of protein import into the mitochondria of yeast (Pfanner et al., 1987), and was used as a starting point for this study.
19

Šanda, Vladimír. « Aplikace controllingových nástrojů ve společnosti HAIDY a.s ». Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-162688.

Full text
Abstract:
The diploma thesis focuses on the preparation and application of controlling tools in a medium-sized, growing company. Because the company had almost no controlling system in place, the theoretical part of the thesis describes the term controlling itself, its goals, functions and possible benefits, and the ways in which companies can approach controlling. The practical part of the thesis focuses on three key areas of the company: price formation, material management and the controlling of orders. For each of these areas the original (and, from a controlling point of view, unsatisfactory) state is described, followed by the way controlling measures were applied in that area. Thanks to a sufficient time interval it was possible to evaluate the results of the applied measures and to outline future steps that could lead to an improvement of the company's controlling.
20

Vykusová, Michaela. « Nástroje interního a externího reportingu ve společnosti Siemens Engineering, a.s ». Master's thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-4381.

Full text
Abstract:
This graduation thesis deals with the question of reporting in today's business conditions. It is concerned with the meaning, purpose and development of reporting and its relation to accounting and controlling. It pays attention to various classifications of reporting, especially internal and external reporting, and focuses on the potential users of reporting and especially on its instruments. The thesis also explains the principle of International Accounting Standard 11. The aim of the practical part of the thesis is to describe the reporting system in a concrete company. It concentrates on the tools of internal reporting in one division of the company, presents the instruments of internal reporting, and focuses on the purpose of the individual reports, the users of these reports and the key figures reported in them.
21

Wilson, Amanda J. « Stone tool production at Cat's Eye Point, Kakanui, North Otago, New Zealand ». University of Otago. Department of Anthropology, 1999. http://adt.otago.ac.nz./public/adt-NZDU20070523.143909.

Full text
Abstract:
This thesis examines a lithic assemblage from Cat's Eye Point (J42/4), Kakanui, North Otago, New Zealand. This archaic site was excavated during 1996 and 1997 and the lithic assemblage was collected from the 41 m² excavated during these two seasons. Previous studies of lithic material from New Zealand and the Pacific are reviewed to indicate the range of information that can be gained from lithic analysis. Themes of research in the North Otago region are also examined to place Cat's Eye Point into its regional context. This thesis had three main areas of investigation. The first involved a descriptive and technological analysis of the debitage using mass flake analysis (MFA) and individual flake analysis (IFA). Formal artefacts, such as hammerstones, blanks, and preforms, were also examined. Secondly, spatial analysis was used to determine if the lithic assemblage could be used to infer intra-site activity areas. This was conducted by analysing macro- (flakes larger than 3 mm) and microdebitage (flakes less than 3 mm) and examining the range of material types. The third area of investigation examined debitage recovered from 6.4 mm (1/4 inch) and 3.2 mm (1/8 inch) sieves to determine if any significant technological information was gained from debitage in the 3.2 mm sieve. The conclusions of this study indicate that there were two methods of basalt cobble reduction at Cat's Eye Point for adze production. Adze production at Cat's Eye Point was opportunistic and the non-local material curated. The results of the debitage analysis indicate that the entire sequence of adze manufacture did not occur in the excavated area of Cat's Eye Point and the initial cobble reduction probably occurred on the adjacent beach where the cobbles are found today. Consequently, coastal rock outcrops, even without evidence of associated debitage, must be viewed as potential sources of rock for stone tool manufacture unless determined otherwise. The spatial analysis detected two activity areas and a disposal area at Cat's Eye Point. The analysis of the 6.4 mm and 3.2 mm debitage found that no significant technological information was gained by examining the smaller flakes.
22

Hill, Geof. « An inquiry into 'human sculpture' as a tool for use in the dramatistic approach to organisational communication / ». View thesis, 1995. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030821.144019/index.html.

Full text
Abstract:
Thesis (M.Sc. (Hons.) Social Ecology) -- University of Western Sydney, Hawkesbury, 1995.
"Submitted for examination in the Master of Science (Hons) Social Ecology, University of Western Sydney, Hawkesbury" Bibliography : leaves 164-168.
23

Tukana, Andrew. « A Study of biogas digesters as an animal waste management tool on livestock farming systems in Fiji / ». View thesis, 2005. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20060502.151953/index.html.

Full text
Abstract:
Thesis (M. Sc.) (Hons) -- University of Western Sydney, 2005.
" A thesis presented to the School of Environment and Agriculture, University of Western Sydney, in fulfilment of the requirements for the degree of Master of Science (Honours)." Includes bibliography : leaves 165 -175, and appendices.
24

Simón Martínez, Marc. « Ancient DNA : a multifunctional tool for resolving anthropological questions ». Doctoral thesis, Universitat Autònoma de Barcelona, 2015. http://hdl.handle.net/10803/298316.

Full text
Abstract:
In the current thesis we have examined anthropological questions about ancient Catalan and Balearic populations using DNA. First, we tried to improve our methodology by applying a different protocol in the study of a site which had provided unexpectedly poor results. We have seen that under some circumstances replacing our phenol-chloroform protocol with the QIAamp DNA Investigator kit, which uses DNA affinity to silica particles, is positive and delivers significantly better results, although it seems that the optimal method can vary at any given site and the most suitable protocol should be decided on a site-by-site basis. Next, we aimed to analyse intrapopulation relationships in groups buried in close association in order to examine the role of nuclear families in antiquity. We examined a common burial cave from Catalonia in the Late Bronze Age and proved that the hypothesis considering the individuals found in this burial cave a nuclear family was erroneous. The high mtDNA variability inside the group, and the fact that the only shared haplogroup (4 individuals) was uncommon in the region at that time, suggested the existence of a patrilocal mating system with the integration of foreign women and pointed to the kinship of some of the individuals. Our conclusion is that genetic ties were possibly not the only determinant factor in close groups in the Late Bronze Age, in contrast to the situation in current nuclear families in Western society, with cultural issues also playing an important role in what could possibly be seen as an extended family structure. Concerning ancient Balearic populations, we analysed intra- and interpopulational relationships. In general their frequency of haplogroup H has been very important since at least the Iron Age, with the exception of one necropolis, Son Real, which has very particular characteristics. This fits with the high values of their contemporary populations from the Western Mediterranean as well as with the current ones, exception made of current Minorca, which may be influenced by the English colonization during the 18th century. Our results in the Minorcan necropolises proved the lack of sex bias in the interments and suggested different lifestyles, with regard to inbreeding, between the populations living in the plain and those living in the rugged southern coast. In Majorca we proved a differential use of the necropolises. With the exception of Son Real, all the ancient Balearic necropolises seem to have a homogeneous European haplogroup pool, so these differences show that treating each necropolis individually makes sense, as they all have their own uniqueness. The haplotypic analysis confirms that they already belong to the European genetic variability and show a very similar genetic pool to the ancient Catalan population, reaffirming their already documented historic interactions, and rules out a direct relationship between the members of the Nuragic and the Talaiotic cultures regarding the feminine lineages. Finally, we have made a first approach to the illnesses which accompanied these populations by studying a widespread infection, caries.
Studying the virulence factor dextranase of the cariogenic agent Streptococcus mutans, we have been able to use direct data from the Bronze Age to the current era to propose that it has been evolving neutrally at least since then, and that a constraint of the selective pressure on this segment seems the most plausible explanation for its changes over time.
25

Li, Zhaoyi. « Analysis and Design of Virtual Reality Visualization for a Micro Electro Mechanical Systems (MEMS) CAD Tool ». Griffith University. School of Information and Communication Technology, 2005. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20060731.121340.

Full text
Abstract:
Since the proliferation of CAD tools, visualizations have gained importance. They provide invaluable visual feedback at the time of design, regardless of whether it is for civil engineering or electronic circuit design layout. Typically, dynamic visualizations are produced in a two-phase process: the calculation of positions and the rendering of the image, followed by its presentation as an animated video clip. This is a slow process that is unsuitable for interactive CAD visualizations, because the former two steps require finite element analysis. Faster hardware eases the problem but does not overcome it, because the algorithms are still too slow. Our MEMS CAD project works towards methods and techniques that are suitable for interactive design, with faster methods. The purpose of this PhD thesis is to contribute to the design of an interactive virtual prototyping environment for Micro Electro Mechanical Systems (MEMS). This research comprises the analysis of the visualization techniques that are appropriate for these tasks and the identification of the difficulties that need to be overcome in order to offer a MEMS design engineer a meaningful and interactive CAD design environment. Such a VR-CAD system is being built in our research group with many participants in the team. Two particular problems are addressed by presenting algorithms for truthful VR visualization methods: one is displaying objects that differ greatly in size on the computer screen; the other is modelling unsynchronized motion dynamics, that is, different objects moving simultaneously at very high and very low speed, by proposing stroboscopic simulation to present their dynamics on the screen. They require specific size scaling and time scaling and filtering. It is these issues and challenges which make the design of a MEMS CAD tool different from other CAD tools. In the thesis I present algorithms for displaying animated virtual reality for MEMS virtual prototyping in a physically truthful way, using simulated stroboscopic illumination to filter animated images so that unsynchronized motion can be shown. A scaling method is used to show or hide objects which cannot be shown simultaneously on the computer screen because of their large difference in size. The visualization of the objects being designed and their animations is done with much consideration of visual perception and computer capability, which is gaining attention but is not often mentioned in the visualization domain.
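The stroboscopic idea can be illustrated with a toy calculation (an assumption-laden sketch with invented numbers, not the thesis's rendering algorithm): sampling a fast periodic motion at a fixed strobe/frame rate makes it appear on screen at the much lower aliased frequency |f - k*fs|, which is what allows very fast and very slow motions to be presented together.

```python
import math

def sampled_angle(f_motion_hz: float, f_strobe_hz: float, frame: int) -> float:
    """Angle (rad) of an oscillating part as captured at a given strobe frame."""
    t = frame / f_strobe_hz
    return (2.0 * math.pi * f_motion_hz * t) % (2.0 * math.pi)

# A resonator moving at 1001 Hz rendered at 50 frames per second:
f_motion, f_strobe = 1001.0, 50.0
k = round(f_motion / f_strobe)                 # nearest strobe harmonic (20)
apparent_hz = abs(f_motion - k * f_strobe)     # 1 Hz apparent motion on screen
print(f"apparent frequency: {apparent_hz} Hz")
for frame in range(3):
    print(frame, round(sampled_angle(f_motion, f_strobe, frame), 3))
```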
26

Lien, Jung-Hsun. « Integrating Strategic Environmental Assessment into Transport Planning ». Griffith University. Griffith School of Environment, 2007. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070813.155624.

Full text
Abstract:
Strategic Environmental Assessment (SEA) has become recognised as an improvement on the existing, limited system of project-based EIA. It aims to integrate environmental considerations into government policies, plans and programmes, and provides a basis for arriving at better-informed decisions at broader strategic levels. However, the compatibility of this new environmental planning tool with other planning systems such as transport, holds the key to successful integration of environmental concerns into existing planning approaches. This study investigates whether SEA can influence and integrate with transport planning and policy development processes through a survey of attitudes and opinions of planners on transport SEA in Taiwan. Transport planning has been criticised for considering too few alternatives, and for basing evaluations solely on technical and economic grounds. The emerging SEA seems theoretically feasible and potentially beneficial in allowing the integration of environmental concerns into strategic transport planning. Though many countries or regions have transport SEA provisions, practical transport SEA applications remain limited, mostly in Western developed countries with high environmental awareness. SEA applications are also limited in their strategies, focusing mainly on infrastructure-related projects. Moreover, most current transport SEA practices lack strategic focus and thus fail to fulfill SEA principles. This suggests that many planners are unfamiliar with the nature and techniques of SEA, and the conceptual impediments are still critical, which may result in significant barriers to transport SEA application. The EIA Act promulgated in 1994, together with its relevant provisions, have provided an applicable mechanism and a legal basis for SEA application in Taiwan, however, no transport SEA cases have been conducted. Many technical and non-technical barriers have been identified by the interviewees, indicating that most of the planners in Taiwan believe that transport SEA is conceptually and practically immature, and planners are not yet ready for it. The conceptual barriers seem more critical at this stage because practical barriers can only be identified and overcome when planners and decision-makers have a clear and proper concept of SEA. This narrowly-viewed application has limited the benefits of SEA, and has resulted in a rigid and incorrect idea that SEA was a passive impact-reducing mechanism; this may mislead the attitudes of planners to transport SEA. In fact, the emerging SEA is a re-engineered planning system framework that serves to remind planners that they are able to improve their efforts. It is a paradigm revolution, as the way in which planners think can make a vast difference. Thus, the potential for the emerging SEA concept to influence and integrate with transport planning and transport policy development processes depends not only on practical feasibility but also on a fundamental conceptual recognition of transport SEA. SEA could influence and integrate with transport planning and transport policy development processes if planners and decision-makers changed their ways of thinking. This study also found that a tiered and integrated transport SEA, embedded in the main transport planning process at different strategic levels, has great potential to embody the environmental and sustainable concerns in transport planning and decision-making. 
This finding is based on several contentions supported by the recent SEA studies showing that it should not be detached from the main planning process. SEA needs to be flexible in order to meet various policies, plans and programmes (PPP) demands, and it must be value-driven, not impact-oriented. A tiered, integrated transport SEA provides ways to overcome identified transport SEA application impediments. This two-in-one planning system is a simple solution which allows transport SEA to be conducted without involving complex legal processes. It improves institutional coordination and integrates not only with planning processes but also with values and resources.
27

Arguelles, Michael A. « Using IT-21 tools to provide Asynchronous Distributed Learning (ADL) to the fleet ». Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2000. http://handle.dtic.mil/100.2/ADA379253.

Full text
Abstract:
Thesis (M.S. in Information System Technology)--Naval Postgraduate School, June 2000.
Thesis advisors: Porter, Gary ; Jones, Carl. "June 2000." Includes bibliographical references (p. 68). Also Available online.
28

Eagle, Christopher S. « Tools for storage and retrieval of Ada software components in a software base ». Thesis, Monterey, California. Naval Postgraduate School, 1995. http://hdl.handle.net/10945/7530.

Full text
Abstract:
One problem facing the Computer Aided Prototyping System (CAPS) project at the Naval Postgraduate School is the lack of a large repository of existing reliable software components to draw upon for the creation of new prototype designs. […]
29

Steffen, Jamin D. « Design, Development, and Evaluation of Tools to Study Cellular ADP-ribose Polymer Metabolism ». Diss., The University of Arizona, 2011. http://hdl.handle.net/10150/202973.

Full text
Abstract:
The metabolism of ADP-ribose polymers (PAR) is involved in several cellular processes, with a primary focus on maintaining genomic integrity. PAR metabolism following genotoxic stress is transient due to a close coordination between poly(ADP-ribose) polymerases (PARPs), which synthesize PAR, and poly(ADP-ribose) glycohydrolase (PARG), which degrades PAR. PARP-1 inhibitors have emerged as promising anticancer therapeutics by increasing chemotherapy sensitivity and selectively targeting tumors harboring DNA repair defects. Several pharmaceutical companies have PARP-1 inhibitors in clinical trials for the treatment of cancer. PARP-1 inhibitors are generally well tolerated, although they typically have poor selectivity among PARPs, and potentially other NAD binding enzymes. The promise of PARP-1 inhibitors as cancer therapeutics has led this dissertation research towards developing alternative tools and approaches to target PAR metabolism. One approach described is an evaluation of high-throughput PARP-1 screening assays as potential tools to discover new classes of PARP-1 inhibitors. These assays were compared to a widely used radiolabeling PARP-1 assay. They were found to offer several advantages that include simplicity, sensitivity, reproducibility, accuracy and eliminating the need for radioactive materials. The primary focus of this dissertation research was to develop PARG inhibitors as an alternative way of targeting PAR metabolism. The lack of viable genetically engineered animals, effective siRNA, and useful pharmacological inhibitors has prevented PARG from being evaluated as a therapeutic target. This dissertation describes the first systematic approach, using Target-related Affinity Profiling (TRAP) technology, for the discovery of PARG inhibitors. Identification of several hits led to the first detailed structure-activity relationship (SAR) studies defining a pharmacophore for PARG inhibition. Interestingly, these molecules show varying degrees of PARP-1 inhibition, providing the first direct evidence for homology in the active sites of PARP-1 and PARG. Evaluation of a lead inhibitor has provided the first evidence for PARG inhibition in intact cells. Further optimization resulted in a cell-permeable inhibitor with reduced toxicity and poor selectivity, providing evidence for a new class of inhibitors that disrupt PAR metabolism by inhibiting both enzymes. The use of dual PARG/PARP-1 inhibitors represents a new approach for the therapeutic development of anticancer agents. Finally, directions aimed at overcoming the remaining challenges are discussed.
30

Lilley, Rebbecca Catherine. « The development of an occupational health and safety surveillance tool for New Zealand workers ». University of Otago. Dunedin School of Medicine, 2007. http://adt.otago.ac.nz./public/adt-NZDU20071011.112802.

Full text
Abstract:
World-wide, working life is undergoing major changes. Established market economies are increasingly characterised by demands for vastly greater market flexibility. New Zealand (NZ) has been no different, with rapid changes occurring over the last two decades in the organisation of labour, of work and of the work environment. Recent international research suggests that work change significantly impacts upon worker health and safety. Many OECD nations undertake routine cross-sectional surveys to monitor changes in working conditions and environments, assessing the health and safety impact of these changes. Similar monitoring is not undertaken in NZ, and the impact of the work environment on health and injury outcomes is poorly understood. This lack of knowledge (monitoring) is considered to be a significant impediment to the progression of health and safety initiatives in NZ. The aim of this thesis was to develop a tool (questionnaire) and methodology suitable for use in the surveillance of working conditions, work environments and health and injury outcomes using workers' surveys. The survey development was undertaken in three phases: i) development of the tool through critical review; ii) empirical methodological testing; and iii) an empirical validation study. Questionnaire development was a stepwise process of content selection. Firstly, key dimensional themes were identified via a critical review of the literature and existing international surveys, leading to the establishment of a dimensional framework. Secondly, a critical review of questions to measure the key dimensions, based upon selection criteria, was undertaken. Finally, the selected questions and design were pre-tested before piloting. A similar development process was undertaken for a calendar collecting occupational histories. A methodological study was undertaken piloting the questionnaire. Two methods of data collection were evaluated, face-to-face and telephone interviews, as were two methods of occupational history collection, calendar and question set. Telephone interviewing was found to be the more efficient and effective data collection method, while occupational history collection was found to be less time consuming by question set. Focus groups indicated the questions were acceptable and suitable for NZ workers. A validation study was undertaken with a cross-sectional study in two distinctly different occupational groups: cleaners and clerical workers. Comparisons were made between the groups, with cleaners expected to be employed under more hazardous working conditions and exposed to more hazards of a physical nature, while clerical workers were expected to be exposed to more hazards of a psychological nature. Results indicated the questionnaire provides data capable of making valid comparisons, identifies work patterns of high risk and provides good predictive validity. The final survey has the potential to generate population data on a wide range of work-related exposure and health variables relevant to contemporary working life. The survey results will contribute to understanding the range of working conditions and work environments NZ workers are currently exposed to and to assessing the health and safety impact of these exposures. It is therefore recommended that this tool initially be used in a national workforce survey to establish baseline surveillance data on working conditions, work environments and health and safety outcomes in NZ.
31

Zhang, Jing Juan. « Water deficit in bread wheat : Characterisation using genetic and physiological tools ». Murdoch University, 2009. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20090227.120256.

Full text
Abstract:
Under terminal water deficit, the remobilization of stem carbohydrate has greater significance because post-anthesis assimilation is limited and grain growth depends on the translocation of carbohydrate reserves. The working hypothesis of this thesis is that increases in stem carbohydrates facilitate tolerance to terminal drought in wheat. The goals of this thesis are to examine this hypothesis using physiological and genetic tools, to identify genes that are related to QTL for stem carbohydrate, and to work with wheat and barley breeders to integrate findings into the breeding program of the Department of Agriculture and Food Western Australia. The physiological data from three drought experiments (two years in a glasshouse and one year in the field) suggested that the maximum level of stem water soluble carbohydrate (WSC) is not consistently related to grain weight, especially under water deficit. The patterns of WSC accumulation after anthesis differed depending on variety and suggested that WSC degradation and translocation have different genetic determinants. Most of the carbohydrates in stem WSC in wheat are fructans. Because 1-FEH is an important gene in fructan degradation, the three copies of this gene (1-FEH w1, 1-FEH w2 and 1-FEH w3) were isolated from the respective genomes of bread wheat. In addition, the genes were mapped to chromosome locations that coincided with QTL for grain weight. The results of gene expression studies show that 1-FEH w3 had significantly higher expression levels in the stem and sheath, which corresponded negatively to the level of stem WSC in two wheat varieties in both water-deficit and well-watered treatments. Strikingly, 1-FEH w3 appeared to be activated by water deficit in Westonia but not in Kauz. The results suggest that stem WSC level is not, on its own, a reliable criterion to identify potential grain yield in wheat exposed to water deficit during grain filling. The expression of 1-FEH w3 may provide a better indicator when linked to instantaneous water use efficiency, osmotic potential and green leaf retention, and this requires validation in field-grown plants. In view of the contribution of stem WSC to grain filling, 1-FEH w3 is a potential candidate gene contributing to grain filling. The numerous differences in the intron sequences of the 1-FEH genes provide further opportunities to find markers associated with the QTL. A new FEH gene was partially isolated from Chinese Spring and its sequence was closely related to the 1-FEH genes. This gene, FEH w4, was mapped to 6AS using Chinese Spring deletion bin lines. Polymorphism in this gene was found between different bread wheat varieties using PCRs and RFLPs, and this allowed the gene to be mapped in two populations, Hanxuan 10 × Lumai 14 and Cranbrook × Halberd. In the Hanxuan 10 × Lumai 14 population, it was close to the SSR markers xgwm334 and wmc297, where QTL for thousand grain weight and grain filling efficiency are located. This result indicates that this gene might be another possible candidate gene for grain weight and grain filling in wheat.
Styles APA, Harvard, Vancouver, ISO, etc.
32

Cameron, Michael. « Efficient Homology Search for Genomic Sequence Databases ». RMIT University. Computer Science and Information Technology, 2006. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070509.162443.

Texte intégral
Résumé :
Genomic search tools can provide valuable insights into the chemical structure, evolutionary origin and biochemical function of genetic material. A homology search algorithm compares a protein or nucleotide query sequence to each entry in a large sequence database and reports alignments with highly similar sequences. The exponential growth of public data banks such as GenBank has necessitated the development of fast, heuristic approaches to homology search. The versatile and popular blast algorithm, developed by researchers at the US National Center for Biotechnology Information (NCBI), uses a four-stage heuristic approach to efficiently search large collections for analogous sequences while retaining a high degree of accuracy. Despite an abundance of alternative approaches to homology search, blast remains the only method to offer fast, sensitive search of large genomic collections on modern desktop hardware. As a result, the tool has found widespread use with millions of queries posed each day. A significant investment of computing resources is required to process this large volume of genomic searches and a cluster of over 200 workstations is employed by the NCBI to handle queries posed through the organisation's website. As the growth of sequence databases continues to outpace improvements in modern hardware, blast searches are becoming slower each year and novel, faster methods for sequence comparison are required. In this thesis we propose new techniques for fast yet accurate homology search that result in significantly faster blast searches. First, we describe improvements to the final, gapped alignment stages where the query and sequences from the collection are aligned to provide a fine-grain measure of similarity. We describe three new methods for aligning sequences that roughly halve the time required to perform this computationally expensive stage. Next, we investigate improvements to the first stage of search, where short regions of similarity between a pair of sequences are identified. We propose a novel deterministic finite automaton data structure that is significantly smaller than the codeword lookup table employed by ncbi-blast, resulting in improved cache performance and faster search times. We also discuss fast methods for nucleotide sequence comparison. We describe novel approaches for processing sequences that are compressed using the byte packed format already utilised by blast, where four nucleotide bases from a strand of DNA are stored in a single byte. Rather than decompress sequences to perform pairwise comparisons, our innovations permit sequences to be processed in their compressed form, four bases at a time. Our techniques roughly halve average query evaluation times for nucleotide searches with no effect on the sensitivity of blast. Finally, we present a new scheme for managing the high degree of redundancy that is prevalent in genomic collections. Near-duplicate entries in sequence data banks are highly detrimental to retrieval performance, however existing methods for managing redundancy are both slow, requiring almost ten hours to process the GenBank database, and crude, because they simply purge highly-similar sequences to reduce the level of internal redundancy. We describe a new approach for identifying near-duplicate entries that is roughly six times faster than the most successful existing approaches, and a novel approach to managing redundancy that reduces collection size and search times but still provides accurate and comprehensive search results. 
Our improvements to blast have been integrated into our own version of the tool. We find that our innovations more than halve average search times for nucleotide and protein searches, and have no significant effect on search accuracy. Given the enormous popularity of blast, this represents a very significant advance in computational methods to aid life science research.
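To make the byte-packed format described above concrete, here is a minimal Python sketch (illustrative only, not code from the thesis or from ncbi-blast; the function names are invented) that packs four DNA bases into one byte at two bits per base and counts matching bases between two packed sequences without unpacking them, which is the general idea behind comparing sequences in their compressed form:

    # Four bases per byte, two bits per base; names are illustrative only.
    CODE = {"A": 0, "C": 1, "G": 2, "T": 3}

    def pack(seq):
        """Pack a DNA string (length a multiple of 4 for simplicity) into bytes."""
        packed = bytearray()
        for i in range(0, len(seq), 4):
            b = 0
            for base in seq[i:i + 4]:
                b = (b << 2) | CODE[base]
            packed.append(b)
        return bytes(packed)

    def matching_bases(packed_a, packed_b):
        """Count identical bases between equal-length packed sequences,
        examining one byte (four bases) at a time instead of unpacking."""
        matches = 0
        for x, y in zip(packed_a, packed_b):
            diff = x ^ y  # a zero 2-bit group marks a matching base
            for shift in (6, 4, 2, 0):
                if (diff >> shift) & 0b11 == 0:
                    matches += 1
        return matches

    print(matching_bases(pack("ACGTACGT"), pack("ACGAACGT")))  # 7 of 8 bases match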
Styles APA, Harvard, Vancouver, ISO, etc.
33

Kazanis, Phillip. « Methodologies and tools for etransforming small- to medium-size enterprises ». View thesis, 2004. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20050804.095044/index.html.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
34

Kern, Christine Luise. « Demarketing as a tool for managing visitor demand in national parks : an Australian case study ». University of Canberra. Languages, International Studies & Tourism, 2006. http://erl.canberra.edu.au./public/adt-AUC20061114.125254.

Texte intégral
Résumé :
Nature-based tourism and recreation is a growing phenomenon around the world. In Australia, nature-based tourism represents an important part of the tourism sector and is to a large extent dependent on protected areas such as World Heritage areas, marine parks and national parks. While tourism and recreation can benefit protected areas, some are under pressure from visitation, and marketing should play a role in managing visitor demand. To this end, a number of authors have suggested demarketing as a management tool to address situations of excess visitor demand; however, research on demarketing in protected areas is limited. To address this research gap, this thesis examines the use of demarketing in Australian national parks that face excess visitor demand using a case study of the Blue Mountains National Park. The thesis investigates factors that contribute to high visitor demand for the park, the use of demarketing to manage demand and factors that influence when and how demarketing is applied. Demarketing is that aspect of marketing that deals with discouraging customers in general or a certain class of customers in particular on either a temporary or permanent basis. In protected areas specifically, demarketing is concerned with reducing visitor numbers in total or selectively and redistributing demand spatially or temporally. Six factors that contribute to high visitor demand for the national park were identified, including the attractiveness of the park, its proximity to Sydney and the fact that the park is a renowned destination with icon sites. It was established that no holistic demarketing strategy is currently employed in the park and that the demarketing measures that are applied are not consciously used as demarketing. The measures used in the Blue Mountains National Park were discussed according to their association with the marketing mix components (4 Ps). Demarketing measures related to 'product' include limiting recreational activities by defining specific areas where they can be conducted, limiting the duration of activities and closures of sites or features in the park. The measures related to 'place' are the use of a booking system, limiting visitor numbers and group sizes, commercial licensing and limiting signage. Measures related to 'price' are not extensively used in the park. The promotional demarketing measures applied include stressing restrictions and appropriate environmental behaviour in promotional material and non-promotion of certain areas or experiences in the park. Importantly, these demarketing measures are not employed across the whole park or for all user groups, but are used for certain experiences in specific contexts and circumstances. Three types of factors influence the use of demarketing in the Blue Mountains National Park: pragmatic considerations, resource considerations and stakeholder interests. Pragmatic considerations include the feasibility and effectiveness of certain demarketing measures, which are influenced by the specific context of the national park. Resource considerations relate to financial, human and temporal resources, and the findings suggest that a lack of resources influences and at times inhibits the use of demarketing measures. It was also found that various stakeholders have a profound influence on the use of demarketing measures. The stakeholder groups have diverse interests and therefore influence the use of demarketing in different ways by supporting or impeding certain measures.
Based on the findings and limitations of this study, recommendations for government and future research are made. These emphasise among others the need for more consistent and comprehensive collection of visitor information to tailor management actions more effectively. It is also suggested that a more conscious and holistic application of demarketing measures may help to manage visitor demand to parks proactively to ensure that the resource remains for future generations.
Styles APA, Harvard, Vancouver, ISO, etc.
35

Smuts, Celia Margaretha. « Development of tools to improve the detection of Trypanosoma evansi in Australia ». Murdoch University Digital Theses Program, 2009. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20090709.113425.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
36

Balasubramanian, ArunKumar. « Benchmarking of Vision-Based Prototyping and Testing Tools ». Master's thesis, Universitätsbibliothek Chemnitz, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-229999.

Texte intégral
Résumé :
The demand for Advanced Driver Assistance System (ADAS) applications is increasing day by day, and their development requires efficient prototyping and real-time testing. ADTF (Automotive Data and Time Triggered Framework) is a software tool from Elektrobit used for the development, validation and visualization of vision-based applications, mainly for ADAS and autonomous driving. With the ADTF tool, image or video data can be recorded and visualized, and data can be tested both on-line and off-line. The development of ADAS applications needs image and video processing, and the algorithms have to be highly efficient and must satisfy real-time requirements. The main objective of this research is to integrate the OpenCV library with the cross-platform ADTF framework. OpenCV provides efficient image processing algorithms which can be used with ADTF for quick benchmarking and testing. An ADTF filter framework has been developed in which OpenCV algorithms can be used directly, and testing of the framework is carried out with .DAT and image files using a modular approach. CMake is also described in this thesis as a way to build the system easily. The ADTF filters are developed in Microsoft Visual Studio 2010 in C++, and the OpenMP API is used for a parallel programming approach.
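As a rough illustration of the kind of per-frame OpenCV processing that would sit inside an ADTF filter's processing callback, the sketch below uses OpenCV's Python bindings; the thesis itself implements the filters in C++ against the ADTF SDK, whose filter classes, pins and media samples are proprietary and not reproduced here, and the file names are placeholders.

    # Toy ADAS-style step: grayscale, blur, edge detection. The ADTF plumbing
    # (filter class, pins, media samples) is SDK-specific and omitted.
    import cv2

    def process_frame(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)
        return cv2.Canny(blurred, 50, 150)

    if __name__ == "__main__":
        img = cv2.imread("test_frame.png")      # placeholder input file
        if img is not None:
            cv2.imwrite("edges.png", process_frame(img))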
Styles APA, Harvard, Vancouver, ISO, etc.
37

Aumeerally, Manisah. « Analytic Model Derivation Of Microfluidic Flow For MEMS Virtual-Reality CAD ». Griffith University. School of Information and Communication Technology, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061106.095352.

Texte intégral
Résumé :
This thesis derives a first approximation model that describes the flow of fluid in microfluidic devices such as microchannels, microdiffusers and micronozzles using electrical network modelling. The important parameters of concern are the flow rates of these devices. The purpose of this work is to contribute to the physical component of our interactive Virtual Reality (VR) prototyping tool for MEMS, with emphasis on fast calculations for interactive CAD design. Current calculations are too time consuming and not suitable for interactive CAD with dynamic animations. This work contributes to and fills the need for the development of MEMS dynamic visualisation, showing the movement of fluid within microdevices in time scale. Microfluidic MEMS devices are used in a wide range of applications, such as chemical analysis, gene expression analysis, electronic cooling systems and inkjet printers. Their success lies in their microdimensions, enabling the creation of systems that are considerably minute yet can contain many complex subsystems. With this reduction in size, the advantages of requiring less material for analysis, less power consumption, less wastage and an increase in portability become their selling points. Market size was in excess of US$50 billion in 2004, according to a study by Nexus. New applications are constantly being developed, leading to the creation of new devices such as the DNA chip and the protein chip. Applications are found in pharmaceuticals, diagnostics, biotechnology and the food industry. An example is the mapping and sequencing of the human genome in the late 1990s, leading to a greater understanding of our genetic makeup. Armed with this knowledge, doctors will be able to treat diseases that were deemed untreatable before, such as diabetes or cancer. Among the tools with which this can be achieved are the DNA chip, used to analyse an individual's genetic makeup, and the gene chip, used in the study of cancer. With this burgeoning influx of new devices and an increase in demand for them, there is a need for better and more efficient designs. The MEMS design process is time consuming and costly. Many calculations rely on Finite Element Analysis (FEA), which uses slow, time-consuming algorithms that make interactive CAD unworkable, because the iterative algorithms for calculating the animated images that show the ongoing process as it occurs are too slow. Faster computers do not fill the void left by inefficient algorithms, because with faster computers comes the demand for faster responses; faster hardware alone will not, in the coming decades, reduce a 40-90 minute FEA calculation to an almost instant response. Efficient design tools are required to shorten this process. These interactive CAD tools need to be able to give quick yet accurate results. Current CAD tools involve time-consuming numerical analysis techniques which require hours of iteration for the device structure design, followed by more calculations to achieve the required output specification. Although there is a need for detailed analysis, especially in solving for a particular aspect of the design, having a tool to quickly obtain a first approximation will greatly shorten the guesswork involved in determining the overall requirement.
The underlying theory for the fluid flow model is based on traditional continuum theory, and the Navier-Stokes equation is used in the derivation of a layered flow model in which the flow region is segmented into layered sections, each having a different flow rate. The flow characteristics of each section are modelled as electrical components in an electrical circuit. Matlab 6.5 (MatlabTM) is used for the modelling and Simulink is used for the simulation.
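For context, the electrical analogy used above treats pressure drop like voltage and volumetric flow rate like current, so that Q = Δp / R_hyd mirrors Ohm's law. The sketch below is a generic textbook illustration using the Hagen-Poiseuille resistance of a circular channel, not the layered model derived in the thesis, and the numbers are purely illustrative.

    # Ohm's-law analogy for laminar channel flow: Q = dP / R_hyd, with the
    # Hagen-Poiseuille resistance R_hyd = 8*mu*L / (pi*r**4) for a circular channel.
    import math

    def hydraulic_resistance_circular(mu, length, radius):
        """Resistance of a circular microchannel in Pa.s/m^3."""
        return 8.0 * mu * length / (math.pi * radius ** 4)

    # Water (mu ~ 1e-3 Pa.s) in a 10 mm long, 50 um radius channel, 1 kPa drop.
    R = hydraulic_resistance_circular(mu=1e-3, length=10e-3, radius=50e-6)
    Q = 1000.0 / R                 # volumetric flow rate in m^3/s
    print(Q * 1e9 * 60)            # the same flow expressed in microlitres per minute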
Styles APA, Harvard, Vancouver, ISO, etc.
38

Braczynski, Anne Kristin [Verfasser]. « Development of novel tools to study PARP10-mediated ADP-ribosylation reactions / Anne Kristin Braczynski ». Aachen : Hochschulbibliothek der Rheinisch-Westfälischen Technischen Hochschule Aachen, 2013. http://d-nb.info/1047231514/34.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
39

Wolverton, Cheryl Lynn. « Staff nurse perceptions of nurse manager caring behaviors : Psychometric testing of the Caring Assessment Tool-Administration (CAT-adm©) ». Thesis, Indiana University - Purdue University Indianapolis, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10133766.

Texte intégral
Résumé :

Caring relationships established between nurse managers and staff nurses promote positive work environments. However, research about staff nurses’ perceptions of nurse manager caring behaviors is limited. A 94-item Caring Assessment Tool-Administration (CAT-adm©) was developed to measure staff nurses’ perceptions of nurse managers’ caring behaviors; however, it lacked robust psychometric testing. This study was undertaken to establish the CAT-adm© survey as a reliable and valid tool to measure staff nurses’ perceptions of nurse managers’ caring behaviors.

The Quality-Caring Model® (QCM®) served as the theoretical framework. Specific aims were to 1) evaluate construct validity of the CAT-adm© survey by describing factors that account for variance in staff nurses' perceptions of nurse manager caring, 2) estimate internal consistency, and 3) conduct item reduction analysis. Four research questions were: 1) Will the factor structure of observed data fit an 8-factor solution? 2) What is the internal consistency reliability of the CAT- adm©? 3) What items can be reduced while maintaining an acceptable factor structure? and 4) What are staff nurses’ perceptions of nurse manager caring behaviors?

A cross-sectional descriptive design was used. A sample of 703 staff nurses from Midwestern, Midatlantic and Southern Regions of the U.S. completed the CAT-adm© survey electronically. Analysis included Confirmatory Factor Analysis (CFA), Exploratory Factor Analysis (EFA), univariate analysis, and descriptive statistics. CFA did not support an 8-factor solution. EFA supported a two-factor solution and demonstrated significant shared variance between the two factors. This shared variance supported a one-factor solution that could conceptually be labeled Caring Behaviors. Random selection reduced the scale to 25-items while maintaining a Cronbach’s Alpha of .98. Using the new 25-item scale, the composite score mean of staff nurses’ perceptions of nurse manager caring behaviors indicated a moderately high level of caring. Suggestions for nursing administration, nurse manager practice, leadership, education and for future research were given.

The new 25-item CAT-adm© survey has acceptable reliability and validity. The 25-item CAT-adm© survey provides hospital administrators, nurse managers, and researchers with an instrument to collect valuable information about the caring behaviors used by nurse managers in relationship with staff nurses.
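For readers unfamiliar with the reliability statistic quoted above, Cronbach's alpha can be computed from an item-response matrix as in the generic sketch below; this is the standard formula only, not the study's data or analysis scripts, and the toy responses are invented.

    # Standard Cronbach's alpha: rows = respondents, columns = survey items.
    import numpy as np

    def cronbach_alpha(scores):
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                          # number of items
        item_vars = scores.var(axis=0, ddof=1)       # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    demo = [[4, 5, 4],   # invented Likert-type responses, 4 respondents x 3 items
            [3, 4, 3],
            [5, 5, 4],
            [2, 3, 2]]
    print(round(cronbach_alpha(demo), 2))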

Styles APA, Harvard, Vancouver, ISO, etc.
40

Smuts, Celia. « Development of diagnostic tools to improve the detection of Trypanosoma evansi in Australia ». Murdoch University, 2009. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20090709.113425.

Texte intégral
Résumé :
The aim of this study was to evaluate new methods to improve detection and investigation of the effects of chronic or subclinical infection with Trypanosoma evansi in various mammalian species. Some of the more resistant host species, including pigs and buffaloes, are present in large feral populations in the northern parts of Australia, the area where T. evansi is most likely to gain entry to the country. Existing tests are not sufficiently reliable to detect all cases of disease and they cannot distinguish acute from chronic infections. Furthermore, the tests have different sensitivities in different host species. Surveillance for trypanosomiasis in Australia is problematic because of the need to work in remote parts of northern Australia where provision of a cold-chain for traditional blood and serum storage is difficult. An existing dried blood storage system was modified by treating cotton lint filter paper (Whatman #903) with a commercial post coating buffer (TropBio, Queensland). This treatment increased the longevity of antibodies to T. evansi in serum and blood stored on the paper (detected using an antibody-detection ELISA) compared to samples stored on plain paper, especially when the papers were stored under humid conditions and at high ambient temperatures. Attempts were made to improve the diagnostic utility and repeatability of antibody-ELISAs through the use of 2 recombinant T. brucei antigens (PFRA and GM6) and to optimize a competitive ELISA using RoTat 1.2 variable surface antigen and its monoclonal antibody. Antibody-detection using the two recombinant proteins was not sufficiently specific to enable their use for the detection of T. evansi. The RoTat 1.2 cELISA had good sensitivity and specificity (75% and 98% respectively) when used to test serum from cattle and buffaloes experimentally infected with T. evansi and uninfected animals. However, the test was not able to detect anti-T. evansi antibodies in serum from wallabies, pigs, a dog or a horse that were experimentally infected with T. evansi. The inability of the cELISA to detect anti-T. evansi antibodies may be due to the small number of samples tested or the lack of RoTat 1.2 specific antibodies in the animals tested. The feasibility of using an enzymatic test to detect trypanosome aminotransferase or antibodies to this enzyme was evaluated. Prior publications suggested that the detection of TAT was an appropriate diagnostic tool for the detection of T. evansi infection in camels. However, the results from this study did not support the use of this test for the detection of T. evansi infection in cattle or buffaloes with low to moderate parasitaemia. Trypanosomiasis is an immunological disease that affects most of the body’s organs, with more severe disease developing over time. Attempts were made to determine key cytokine and biochemical patterns that would distinguish infected from uninfected animals and acute from chronic infections. The results from this study showed that there was no specific pattern in serum cytokines or serum biochemistry that could be used to distinguish infected from uninfected animals, or different stages of disease. Immunohistochemistry was used on tissues from buffaloes and mice experimentally infected with T. evansi and T. brucei gambiense respectively to characterise the cellular immune response that was present. The immune response was predominantly cell mediated, with CD3+ T lymphocyte and macrophage infiltration occurring in most tissues. 
In end stage disease there was often suppression of the immune system with disruption of the architecture of the spleen and a decrease in B lymphocytes in the circulation. Trypanosomes were rarely visible in the tissues and were only seen in those animals with high parasitaemia. Lesions generally became more severe over time, but there was a large variation between animals, which suggests that immunohistochemistry is unsuitable as a diagnostic tool.
Styles APA, Harvard, Vancouver, ISO, etc.
41

Snitow, Samantha. « Reducing the drink driving road toll : A case study in integrating communication and social policy enforcement ». RMIT University. Applied Communication, 2004. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20060307.165115.

Texte intégral
Résumé :
This thesis presents a case study of the drink drive initiatives, including marketing communications, legislation and enforcement practices implemented in the state of Victoria (Australia) between 1989-2000. It has been argued that the 51% reduction in road toll was related to these initiatives. In order to explore the veracity of these claims, a holistic case study approach was adopted. In addition to an examination of the communications tactics and extant practices of enforcement agencies, the study involved interviews with two distinct groups: professionals in various fields pertaining to road safety, and members of the general Victorian driving community. The focus of this work was on the advertising and communications campaigns that were run by the Transport Accident Commission from 1989-2000; however the policy and enforcement initiatives were also examined in terms of their potential impact on the lowering of the road toll. Suggestions for the improvement of policy and communication strategies within a social marketing context are made.
Styles APA, Harvard, Vancouver, ISO, etc.
42

McLeod, Philip. « Fast, accurate pitch detection tools for music analysis ». University of Otago. Department of Computer Science, 2009. http://adt.otago.ac.nz./public/adt-NZDU20090220.090438.

Texte intégral
Résumé :
Precise pitch is important to musicians. We created algorithms for real-time pitch detection that generalise well over a range of single 'voiced' musical instruments. A high pitch detection accuracy is achieved whilst maintaining a fast response using a special normalisation of the autocorrelation (SNAC) function and its windowed version, WSNAC. Incremental versions of these functions provide pitch values updated at every input sample. A robust octave detection is achieved through a modified cepstrum, utilising properties of human pitch perception and putting the pitch of the current frame within the context of its full note duration. The algorithms have been tested thoroughly both with synthetic waveforms and sounds from real instruments. A method for detecting note changes using only pitch is also presented. Furthermore, we describe a real-time method to determine vibrato parameters - higher level information of pitch variations, including the envelopes of vibrato speed, height, phase and centre offset. Some novel ways of visualising the pitch and vibrato information are presented. Our project 'Tartini' provides music students, teachers, performers and researchers with new visual tools to help them learn their art, refine their technique and advance their fields.
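As a rough illustration of the normalised-autocorrelation idea behind the SNAC function, the sketch below computes n'(τ) = 2·Σ x[j]·x[j+τ] / Σ (x[j]² + x[j+τ]²) over a frame and reads the pitch period from the first strong peak; the thesis's windowed and incremental variants and its modified-cepstrum octave correction are not reproduced, and the 0.8 threshold is an arbitrary choice for the example.

    # Normalised autocorrelation pitch sketch (not the thesis's exact algorithm).
    import numpy as np

    def normalised_autocorrelation(x):
        n = len(x)
        out = np.zeros(n)
        for tau in range(n):
            a, b = x[:n - tau], x[tau:]
            denom = np.dot(a, a) + np.dot(b, b)
            out[tau] = 2.0 * np.dot(a, b) / denom if denom > 0 else 0.0
        return out

    def estimate_pitch(frame, sample_rate, threshold=0.8):
        nsdf = normalised_autocorrelation(np.asarray(frame, dtype=float))
        for tau in range(1, len(nsdf) - 1):          # first qualifying local peak
            if nsdf[tau] > threshold and nsdf[tau - 1] <= nsdf[tau] >= nsdf[tau + 1]:
                return sample_rate / tau
        return None

    fs = 44100
    t = np.arange(1024) / fs
    print(estimate_pitch(np.sin(2 * np.pi * 440.0 * t), fs))   # close to 440 Hz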
Styles APA, Harvard, Vancouver, ISO, etc.
43

Dugan, Loren J. « The impact of CASE tools, Ada, and software reuse of a DoD software development project ». Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1996. http://handle.dtic.mil/100.2/ADA308162.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
44

Sapkota, Virginia A. « Welfare implications of nonidentical time valuations under constrained road pricing policies : analytical studies with corridor and urban-wide networks ». UWA Business School, 2004. http://theses.library.uwa.edu.au/adt-WU2005.0006.

Texte intégral
Résumé :
The goal of the research is to devise an equitable road pricing system which would leave the majority of routes free of tolls, so that low income people would suffer no cash loss although they would probably suffer loss of time. The aims of the dissertation are twofold. The first is to provide a numerical analysis of how urban commuters with differing abilities to pay would respond to additional road user charges. The welfare implications of such differential responses are examined and their policy implications analysed. The second aim is to develop a practical framework to model congestion pricing policies in the context of heterogeneous users. To achieve these aims, the following objectives have been set: (a) Using a simple network with two parallel competing routes, determine both welfare maximising and revenue maximising tolls under the constraint that only one route can be priced. In this setting, determine the allocation of traffic between the alternative routes, the efficiency gain, the revenue, the changes in travel cost and the distributional effects. (b) Establish a realistic model of an actual urban area to examine the impacts of selectively tolling congestible routes. As in the simple network case, assess the effects of toll policy on traffic distribution, network efficiency, revenues, and the welfare of the individual consumer and society. (c) Evaluate whether the non-identical treatment of users will enhance the acceptability of congestion pricing as a transport policy. Results from the simulations indicate that non-identical treatment of drivers' responses to toll charges provides better understanding of the differential impacts of various pricing policies. Allowing for heterogeneity in time valuation provides a better assessment of the efficiency of pricing policies and of the welfare impacts of toll charges, as it is able to capture their differential effects. More importantly, it shows that low-income commuters may not be significantly worse off with pricing, especially when there is a free alternative route. This research demonstrates the need to adopt appropriate analytical techniques and assumptions when modelling the traffic equilibrium in a network with tolls. These include relaxing the homogeneity assumption, examining sensitivity to supply function parameter values and to the effect of vehicle operating cost, and using a route-based rather than link-based measure of consumer surplus.
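For background, a standard textbook result (not the dissertation's own derivation): with link travel time t(q) increasing in flow q and a value of time α, the first-best congestion toll equals the marginal external cost, and with heterogeneous values of time the split between a tolled route T and a free route F is characterised by the user who is indifferent between them:

    \[
    \tau^{*} = \alpha \, q \, \frac{\mathrm{d}t(q)}{\mathrm{d}q},
    \qquad
    \alpha_i \, t_T(q_T) + \tau = \alpha_i \, t_F(q_F).
    \]

Commuters whose value of time exceeds the indifference value α_i use the tolled route and the rest stay on the free route, which is why, under such a constrained scheme, low-income users need suffer no cash loss even though they may lose time.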
Styles APA, Harvard, Vancouver, ISO, etc.
45

Hogan, Bernard Michael. « The Internet as a Research and/or Communication Tool to Support Classroom-Based Instruction : Usage, Value, and Utility for Post-Secondary Students ». Griffith University. School of Film, Media and Cultural Studies, 2004. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20040719.124141.

Texte intégral
Résumé :
Recent research indicates that the Internet (or Net) is currently being used at many post-secondary institutions in support of traditional, classroom-based instruction. From 1994 to 2002, the percentage of post-secondary classes using the Web as a research tool and E-mail as a method of communication has increased almost ten fold. An extensive literature on the evaluation of the Internet as an educational technology has developed in recent years; however, there are some gaps that need to be filled to provide a more complete understanding of the Internet and its use by post-secondary students. First, most of the studies focus primarily on student usage of the Net, and less so on the value (or the advantages and disadvantages) and the utility (or usefulness) associated with that usage. Second, many of these studies make a distinction between the research and communication functions of the Internet. While I argue that this is an appropriate distinction, many examine one function or the other only – and not both simultaneously. The central research problem that this study addresses is helping to fill those two gaps in the evaluation literature by examining in detail student usage, value and utility of the Net as a research and/or communication tool for post-secondary students in support of classroom-based instruction. Drawing upon work from the fields of media studies, learning theory, and theories of communication, I establish a "Net as Tool" framework and adopt a uses and gratifications approach to examine student use of the Net. The three main inter-related concepts of usage, value and utility are used as organizing themes for the study, and I designed and developed a survey instrument to gather original quantitative data from post-secondary students in both Canada and Australia to fully examine those concepts. Two focus group sessions were designed to supplement this quantitative data with qualitative findings (and to generate more in-depth insights into student usage, value and utility of the Net as a research and/or communication tool). The results presented in this study have both theoretical and practical importance. In regards to the theoretical side, I have identified the underlying dimensions of usage, value, and utility, and highlighted what makes the Net valuable and useful as a research and/or communication tool. Additionally, I have identified the factors which are related to usage, value, and utility, and explored the inter-related nature of those three concepts. I concluded my study with an outline of the importance of the skill of digital literacy so that students can cope effectively with the online environment. These findings are significant because they help to fill some specific gaps in the evaluation knowledge of the Net in post-secondary education. In addition, I have developed a practical strategy which suggests how the Net could be used most effectively by students as a research and/or communication tool in support of classroom based instruction. The areas addressed by the strategy include access, infrastructure, technical support, training, integration into the curriculum, and appropriate use of the tool. The overall strategy is important because it contributes to our understanding of the Net as an educational tool, and it outlines ways to address the issue of the digital divide within post-secondary education. It is hoped the strategy will be useful to training staff, post-secondary administrators, instructors, and students.
Styles APA, Harvard, Vancouver, ISO, etc.
46

Enticott, Steven John. « A critical evaluation of exchange traded option 'Delta' as a risk management tool for self-managed superannuation funds ». Swinburne University of Technology, 2006. http://adt.lib.swin.edu.au./public/adt-VSWT20061117.125347.

Texte intégral
Résumé :
This research discusses the use of Delta in regulating the investment behaviour of the Trustees of Self-Managed Superannuation Funds (SMSFs) who use Exchange Traded Options (ETOs) in their investment strategies. An ETO represents a contract between two parties, giving the taker (the buyer) the right, but not the obligation, to buy or sell a parcel of shares at a predetermined price, on or before a predetermined date, to or from the writer (the seller). It is acceptable for SMSF Trustees to use ETO investments as part of their overall investment strategy, providing that leverage or mere speculation are not the reasons behind that investment. It is important to note that neither the Regulator, the Australian Taxation Office (ATO), nor its predecessor, the Australian Prudential Regulatory Authority (APRA), actually states what constitutes 'speculation', or what the allowable uses for derivatives are. There are no practical guidelines. This is a key issue for this research, which aims, as practically as possible, to fill these crucial gaps. A Trustee must abide by their superannuation fund's overriding covenants and investment strategy, and inform its members, through Risk Management Statements, of the trust's derivative strategy. While ETOs can be used to manage risk, they also carry a level of risk themselves. Delta measures an ETO's value movement in correlation with a movement in the option's underlying share price. An ETO carrying a low Delta generally means a cheaper price (premium) per contract than an option carrying a higher Delta. The lower the Delta, however, the lower the chance there is of a positive result for the buyer. This research shows that an ETO Delta of less than 0.2 gives results in favour of buyers in only 11 out of 100 occurrences. This figure rises to 42 out of 100 when Delta is greater than 0.8. From the sampled data, there is an overall financial loss to the buyer of -1.91%, with the financial return results being mixed at all levels of Delta. The overall return results have been compiled without preference to market direction, and clearly highlight the natural premium bias (which the buyer pays) to the seller. What this data does is reinforce the need for Trustees to have a solid view of market directions, or a set strategy in place, as buyers of ETOs. The conclusions drawn from the findings show that the chance of loss (when buying), or gain (when selling), ETOs with a Delta of less than 0.20 is 89%; less than 0.40, 74%; less than 0.60, 66%; less than 0.80, 57%; and greater than 0.80, 58%. For example, a Trustee buying an ETO with a Delta of less than 0.20 faces an 89% chance of loss, while a Trustee selling an ETO with a Delta of less than 0.20 faces an 89% chance of gain. The findings on overall financial returns (profit or loss) offer additional support to this critical review of Delta as a risk measurement tool. Whilst it is impossible to know the motives or actual positions of portfolio managers of SMSFs at any time, the aim of the thesis is to provide a measurement tool that can be used to assist the trustee at any given time by measuring the option risk element alone. When interpreting the findings, the reader must remember that ETO strategies are numerous, and a high-risk profile for one strategy may represent a low risk for another. Further to this, an ETO strategy's risk profile may change with the overlaying of another ETO. For example, where a Call option is bought, the risk involved in that purchase is represented by the premium paid.
However, another Call option can then be sold against that position, with a later (or earlier) date to expiry, and with a higher strike price. This 'overlay' reduces the initial risk, but impacts on the maximum gain. It is vital that Trustees have a solid understanding of the basics of ETO strategies before considering using Delta as a measure of risk. The research proposes some guidelines Trustees can use when assessing an ETO strategy against their derivative/investment risk profile. The findings from 2400 data samples show strong trends in support of the underlying premise (see Figure: Positive Results Versus Delta (ETO Buyers)). Given these findings, the research concludes that Delta can be used as a measure of risk by SMSF Trustees. Delta may not be suitable, however, for measuring multiple layers of combined ETO positions, a type of derivative strategy not suited to or usual in the context of measuring risk within an SMSF. There is a major difference between simple and simplistic solutions offering practical answers in an environment of increasing complexity. Often, simple solutions offer far more value to the less experienced, when compared to complex ones, especially given the growing number of SMSFs, and the increasing lack of expertise in the areas of superannuation and risk management that this growth implies.
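For reference, the Delta quoted for exchange traded options is conventionally the Black-Scholes sensitivity of the option premium to the underlying share price. The thesis does not reproduce its pricing model, so the sketch below is the textbook calculation only (European options, no dividends), with purely illustrative inputs.

    # Textbook Black-Scholes delta; illustrative only, not the thesis's model.
    from math import log, sqrt, erf

    def norm_cdf(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def bs_delta(spot, strike, rate, vol, years, kind="call"):
        d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * years) / (vol * sqrt(years))
        return norm_cdf(d1) if kind == "call" else norm_cdf(d1) - 1.0

    # An at-the-money call on a $10.00 share, 20% volatility, three months to expiry.
    print(round(bs_delta(10.0, 10.0, 0.05, 0.20, 0.25), 2))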
Styles APA, Harvard, Vancouver, ISO, etc.
47

Malouf, Christopher P. « Evaluation of an airborne thermal scanner (8-12 µm) as an irrigation scheduling tool for cotton (Gossypium hirsutum) ». University of Canberra. Resource, Environmental & Heritage Sciences, 1996. http://erl.canberra.edu.au./public/adt-AUC20060829.143622.

Texte intégral
Résumé :
Water is Australia's most precious natural resource. The quality, quantity and availability of this resource is the single factor most limiting agricultural development and sustainability in this country. Since the development of Australia's cotton industry in the 1960s, and the expanding areas of irrigated crop, there has been an increasing demand placed on the limited water resources of the country. Consequently, the cotton industry has been the target of protest from conservation groups, residents of rural townships and other farmers engaged in competing rural sectors. Therefore, cotton farmers need to develop best practice in terms of water use efficiency. Not only does this make good ecological sense but also good economic sense. Traditional methods of irrigation scheduling have proven to be subjective and haphazard. Recently developed methods, while providing more quantitative techniques, do not give a synoptic view of a field's or region's crop moisture status. The main objective of this project was to evaluate an airborne thermal scanner (8-12 µm) as a practical tool for monitoring the water requirements of an irrigated cotton crop. The thermal scanner was mounted below a light aircraft and imagery was collected over Field 86, Togo Station, north-west NSW during the summer of 1990/91. The field was divided into nine treatments for the purpose of this project. Three irrigation regimes (early, normal and late) with three repetitions were applied to the nine treatments. A total of fourteen images were selected for analysis. These images were grouped into sets of AM images and PM images, as well as diurnal groupings, which were interpreted for three separate dates during the growing season. Ground-based measurements of infrared crop surface and soil temperature, soil moisture deficit, leaf area index (LAI) and the Crop Water Stress Index (CWSI) were collected to calibrate the airborne imagery. Imagery was in the first instance visually interpreted to determine what information could be gained from this technique. Patterns on the imagery were related to diurnal variations in soil and crop temperatures. This investigation revealed a number of soil-related phenomena inherent to the field which were influencing the airborne detected temperatures. While this technique showed variability across the field, the interpretation was somewhat subjective. Temperature values were extracted from the imagery in order to conduct an analysis of variance (ANOVA) between the airborne and ground measurements of infrared crop surface temperature. In summary, this analysis did not show a strong relationship between the airborne and ground-based measurements. A number of contributing factors have been proposed as the reason for this variation in the two datasets. Pearson's correlation analysis was applied to the AM (r = 0.65) and PM (r = 0.32) groups of airborne and ground temperatures. Airborne-derived calculations of the CWSI were compared to ground-based measurements for the AM group of flights. These derived values were only acceptable in instances where the ANOVA results had shown them to approximate the ground-based measurements. While airborne thermal imagery provides a useful tool for determining general variations in temperatures across a field, there are many additional factors, the most dominant being the thermal characteristics of the background soil, which influence the detected temperatures.
This technique does not provide the precise quantitative information required to accurately determine across-field measurement of the CWSI.
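For context, the Crop Water Stress Index referred to above is conventionally defined (in the standard empirical formulation, which may differ in detail from the one used in this project) from the canopy-air temperature difference relative to lower (well-watered) and upper (non-transpiring) baselines:

    \[
    \mathrm{CWSI} = \frac{(T_c - T_a) - (T_c - T_a)_{\mathrm{lower}}}{(T_c - T_a)_{\mathrm{upper}} - (T_c - T_a)_{\mathrm{lower}}}
    \]

where T_c is the infrared canopy temperature and T_a the air temperature; values near 0 indicate an unstressed crop and values near 1 a severely water-stressed one.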
Styles APA, Harvard, Vancouver, ISO, etc.
48

Enticott, Steven John. « A critical evaluation of exchange traded option 'Delta' as a risk management tool for self-managed superannuation funds ». Australasian Digital Thesis Program, 2006. http://adt.lib.swin.edu.au/public/adt-VSWT20061117.125347.

Texte intégral
Résumé :
Thesis (DBA) - Swinburne University of Technology, 2006.
Submitted in partial fulfilment of the requirements for the degree of Doctor of Business Administration, Australasian Graduate School of Management, Swinburne University of Technology, 2006. Typescript. Includes bibliographical references (p. 89-92).
Styles APA, Harvard, Vancouver, ISO, etc.
49

Shepherd, David Peter. « Optimisation of Iterative Multi-user Receivers using Analytical Tools ». The Australian National University. Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20081114.221408.

Texte intégral
Résumé :
The objective of this thesis is to develop tools for the analysis and optimization of an iterative receiver. These tools can be applied to most soft-in soft-out (SISO) receiver components. For illustration purposes we consider a multi-user DS-CDMA system with forward error correction that employs iterative multi-user detection based on soft interference cancellation and single user decoding. Optimized power levels combined with adaptive scheduling allow for efficient utilization of receiver resources for heavily loaded systems.
Metric transfer analysis has been shown to be an accurate method of predicting the convergence behavior of iterative receivers. EXtrinsic Information (EXIT), fidelity (FT) and variance (VT) transfer analysis are well-known methods; however, the relationship between the different approaches has not been explored in detail. We compare the metrics numerically and analytically and derive functions to closely approximate the relationship between them. The result allows for easy translation between EXIT, FT and VT methods. Furthermore, we extend the J function, which describes mutual information as a function of variance, to fidelity and symbol error variance, the Rayleigh fading channel model and a channel estimate. These J functions allow the a priori inputs to the channel estimator, interference canceller and decoder to be accurately modeled. We also derive effective EXIT charts which can be used for the convergence analysis and performance predictions of unequal-power CDMA systems.
The optimization of the coded DS-CDMA system is done in two parts: firstly, the received power levels are optimized to minimize the power used in the terminal transmitters; then the decoder activation schedule is optimized such that the multi-user receiver complexity is minimized. The uplink received power levels are optimized for the system load using a constrained nonlinear optimization approach. EXIT charts are used to optimize the power allocation in a multi-user turbo-coded DS-CDMA system. We show through simulation that the optimized power levels allow for successful decoding of heavily loaded systems with a large reduction in the convergence SNR.
We utilize EXIT chart analysis and a Viterbi search algorithm to derive the optimal decoding schedule for a multi-component receiver/decoder. We show through simulations that decoding delay and complexity can be significantly reduced while maintaining BER performance through optimization of the decoding schedule.
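As background, the J function referred to above is, in its standard form, the mutual information between a transmitted bit and its log-likelihood ratio when the LLR is modelled as a consistent Gaussian with mean σ²/2 and variance σ²:

    \[
    J(\sigma) = 1 - \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^{2}}}
    \exp\!\left(-\frac{(\xi - \sigma^{2}/2)^{2}}{2\sigma^{2}}\right)
    \log_{2}\!\left(1 + e^{-\xi}\right) \mathrm{d}\xi .
    \]

The thesis's extensions of this function to fidelity, symbol-error variance, Rayleigh fading and channel estimation are not reproduced here.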
Styles APA, Harvard, Vancouver, ISO, etc.
50

Jaroenjittichai, Phrudth. « Pulsar polarisation as a diagnostic tool ». Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/pulsar-polarisation-as-a-diagnostic-tool(cbbef2d8-5779-4f5b-a48d-09eec413f247).html.

Texte intégral
Résumé :
The geometry of pulsar beams is one of the intrinsic properties of neutron stars, governing the pulse-profile phenomenon and other aspects of pulsar astronomy. With a number of pulsars in our dataset, their beam geometry is derived from the polarisation position angle (PPA) using the simple polar cap emission and dipole field model. This includes the rotating vector model (RVM), for which the solutions can hardly be constrained or fail to be consistent because of the limitations of the model itself. The inconsistencies in the results suggest that the initial PPAs can be strongly perturbed by additional parameters above the emission altitude, such as the plasma medium or rotational aberration effects, after which their characteristic shape is no longer related to the geometry via the RVM. We investigate further into the effects of wave propagation in the pulsar magnetosphere, and find an indication that, in most cases, the RVM-calculated PPAs are likely to be altered by plasma effects. In recent years, there have been an increasing number of intermittent and mode-switching pulsars observed to have their radio pulse profiles correlated with the change in pulsar spin frequency (ν̇) (e.g. Lorimer et al. 2012, Lyne et al. 2010). These two phenomena are understood to be related via the states of plasma in the magnetosphere. As one such pulsar, and also one with known geometry and other astonishing behaviour, PSR B1822–09 is studied in terms of the mode-switching properties, the hollow-cone model and the wave propagation in the magnetosphere. We also study the model for explaining the intermittent pulsars PSRs B1931+24, J1841+0500 and J1832+0029, and find it can be consistently applied for PSRs B1822–09 and B0943+10, and other profile-switching pulsars. However, aspects of the conclusions are limited because of the lack of understanding of the connection between the radio flux and the states of plasma. We are also able to use the difference in the PPAs between two states of PSR B0943+10 to predict the change in plasma states and ν̇, which cannot be measured directly from timing analysis as its switching timescale is too short.
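For reference, the rotating vector model mentioned above predicts the position angle ψ as a function of pulse phase φ from the magnetic inclination α and the viewing angle ζ in its commonly quoted form (sign conventions vary between authors, and the propagation effects studied in the thesis perturb this curve):

    \[
    \tan\left(\psi - \psi_{0}\right) =
    \frac{\sin\alpha \, \sin\left(\phi - \phi_{0}\right)}
    {\sin\zeta \, \cos\alpha - \cos\zeta \, \sin\alpha \, \cos\left(\phi - \phi_{0}\right)} .
    \]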
Styles APA, Harvard, Vancouver, ISO, etc.