Dissertations / Theses on the topic 'Theory of computation not elsewhere classified'

To see the other types of publications on this topic, follow the link: Theory of computation not elsewhere classified.

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 35 dissertations / theses for your research on the topic 'Theory of computation not elsewhere classified.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Zhu, Huaiyu. "Neural networks and adaptive computers : theory and methods of stochastic adaptive computation." Thesis, University of Liverpool, 1993. http://eprints.aston.ac.uk/365/.

Full text
Abstract:
This thesis studies the theory of stochastic adaptive computation based on neural networks. A mathematical theory of computation is developed in the framework of information geometry, which generalises Turing machine (TM) computation in three aspects (it can be continuous, stochastic and adaptive) and retains TM computation as a subclass called "data processing". The concepts of Boltzmann distribution, Gibbs sampler and simulated annealing are formally defined and their interrelationships are studied. The concept of a "trainable information processor" (TIP), a parameterised stochastic mapping with a rule to change the parameters, is introduced as an abstraction of neural network models. A mathematical theory of the class of homogeneous semilinear neural networks is developed, which includes most of the commonly studied NN models such as the back propagation NN, the Boltzmann machine and the Hopfield net, and a general scheme is developed to classify their structures, dynamics and learning rules. All the previously known general learning rules are based on gradient following (GF) and are therefore susceptible to local optima in weight space. Contrary to the widely held belief that this is rarely a problem in practice, numerical experiments show that for most non-trivial learning tasks GF learning never converges to a global optimum. To overcome the local optima, simulated annealing is introduced into the learning rule, so that the network retains an adequate amount of "global search" in the learning process. Extensive numerical experiments confirm that the network always converges to a global optimum in the weight space. The resulting learning rule is also easier to implement and more biologically plausible than the back propagation and Boltzmann machine learning rules: only a scalar needs to be back-propagated for the whole network.
Various connectionist models have been proposed in the literature for solving various instances of problems, without a general method by which their merits can be combined. Instead of proposing yet another model, we try to build a modular structure in which each module is basically a TIP. As an extension of simulated annealing to temporal problems, we generalise the theory of dynamic programming and Markov decision process to allow adaptive learning, resulting in a computational system called a "basic adaptive computer", which has the advantage over earlier reinforcement learning systems, such as Sutton's "Dyna", in that it can adapt in a combinatorial environment and still converge to a global optimum. The theories are developed with a universal normalisation scheme for all the learning parameters so that the learning system can be built without prior knowledge of the problems it is to solve.
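As a rough illustration of why simulated annealing can escape the local optima that defeat gradient following, here is a minimal sketch (not Zhu's network learning rule; the double-well objective, Gaussian proposals and linear cooling schedule are illustrative assumptions):

```python
import math
import random

def f(w):
    # Double-well objective: local minimum near w = +1, global minimum near w = -1.
    return (w * w - 1.0) ** 2 + 0.3 * w

def anneal(w, steps=20000, t0=1.0, seed=0):
    rng = random.Random(seed)
    fw = f(w)
    best_w, best_f = w, fw
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-3        # linear cooling schedule
        cand = w + rng.gauss(0.0, 0.5)           # Gaussian proposal
        fc = f(cand)
        # Metropolis rule: always accept improvements, sometimes accept worse moves.
        if fc < fw or rng.random() < math.exp((fw - fc) / t):
            w, fw = cand, fc
            if fw < best_f:
                best_w, best_f = w, fw
    return best_w

w_star = anneal(1.0)   # start inside the basin of the *local* minimum
```

Started at the local minimum near w = +1, the Metropolis acceptance rule lets the search cross the barrier and settle near the global minimum around w ≈ -1; a pure gradient follower started at the same point stays trapped.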
APA, Harvard, Vancouver, ISO, and other styles
2

Laverick, Craig. "Enforcing the ISM Code, and improving maritime safety, with an improved Corporate Manslaughter Act : a safety culture theory perspective." Thesis, University of Central Lancashire, 2018. http://clok.uclan.ac.uk/23768/.

Full text
Abstract:
The International Safety Management (ISM) Code was introduced in 1998 in response to a number of high-profile maritime disasters, with the aim of establishing minimum standards for the safe operation of ships and creating an enhanced safety culture. It was the first piece of legislation introduced by the International Maritime Organisation that demanded a change in the behaviour and attitude of the international maritime community. Whilst there is no doubt that the ISM Code has been successful at improving maritime safety, there is now an increasing problem with complacency. The aim of this thesis is to consider how complacency with the ISM Code in the UK can be tackled by using reformed corporate manslaughter legislation. This thesis adopts a Safety Culture Theory approach and uses a multi-model research design methodology: a doctrinal model and a socio-legal model. The thesis hypothesis and the author's proposed corporate manslaughter reforms are tested through case studies and a survey. The thesis proposes the introduction of secondary individual liability for corporate manslaughter, in addition to existing primary corporate liability. If the proposed provisions were to be implemented, a gap in the law would be filled and, for the maritime industry, both the ship company and its corporate individuals would be held accountable for deaths at sea that are attributable to non-implementation of the ISM Code. It is suggested that this would deter further ISM complacency and so encourage the ISM Code's intended safety culture. This thesis contributes to the intellectual advancement of the significant and developing interplay between criminal and maritime law, by adding to the scholarly understanding of the safety culture operating within the international maritime community, and examining how corporate manslaughter legislation could be used to improve implementation of the ISM Code.
It offers sound research for consideration by legal researchers and scholars, and also by those working within the field of maritime safety regulation.
3

Oliver, Christine. "Systemic reflexivity : building theory for organisational consultancy." Thesis, University of Bedfordshire, 2012. http://hdl.handle.net/10547/567099.

Full text
Abstract:
This dissertation argues for the value of the concept of systemic reflexivity in sense making, orientation and action in systemic practice, and in organisational practice in particular. The concept emerges as a theme through the development of two specific strands of published work from 1992 to 2013, that of Coordinated Management of Meaning Theory (CMM) and Appreciative Inquiry (AI). Both lines of inquiry highlight the moral dimension of practitioners’ conceptualisation and practice. Systemic reflexivity alerts us to the opportunities and constraints system participants make for the system in focus, facilitating exploration of a system’s coherence, through a detailed framework for systemic thinking which links patterns of communication to their narratives of influence and narrative consequences. It provides the conditions for enabling individual and collective responsibility for the ways that communication shapes our social worlds. The concept is illustrated in practice through a range of case studies within the published works.
4

Edmonds, Andrew Nicola. "Time series prediction using supervised learning and tools from chaos theory." Thesis, University of Bedfordshire, 1996. http://hdl.handle.net/10547/582141.

Full text
Abstract:
In this work methods for performing time series prediction on complex real world time series are examined. In particular series exhibiting non-linear or chaotic behaviour are selected for analysis. A range of methodologies based on Takens' embedding theorem are considered and compared with more conventional methods. A novel combination of methods for determining the optimal embedding parameters are employed and tried out with multivariate financial time series data and with a complex series derived from an experiment in biotechnology. The results show that this combination of techniques provide accurate results while improving dramatically the time required to produce predictions and analyses, and eliminating a range of parameters that had hitherto been fixed empirically. The architecture and methodology of the prediction software developed is described along with design decisions and their justification. Sensitivity analyses are employed to justify the use of this combination of methods, and comparisons are made with more conventional predictive techniques and trivial predictors showing the superiority of the results generated by the work detailed in this thesis.
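The core of a Takens-style delay embedding can be sketched as follows (a toy illustration, not the thesis's software; the embedding dimension, delay and nearest-neighbour "method of analogues" predictor are illustrative choices):

```python
import numpy as np

def delay_embed(x, dim, tau):
    # Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] row by row.
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nn_predict(x, dim=3, tau=2):
    # Predict the next value: find the nearest past delay vector to the
    # present one and reuse its observed successor.
    emb = delay_embed(np.asarray(x), dim, tau)
    dists = np.linalg.norm(emb[:-1] - emb[-1], axis=1)
    j = int(np.argmin(dists))
    # The last coordinate of row j is x[j + (dim-1)*tau]; return its successor.
    return x[j + (dim - 1) * tau + 1]

t = np.arange(400)
x = np.sin(0.3 * t)          # a toy deterministic series standing in for real data
pred = nn_predict(x)         # close to the true next value sin(0.3 * 400)
```

For a deterministic series the nearest past state evolves almost identically to the present one, which is why the embedding parameters (dim, tau) matter so much in practice.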
5

(9805715), Zhigang Huang. "A recursive algorithm for reliability assessment in water distribution networks with applications of parallel programming techniques." Thesis, 1994. https://figshare.com/articles/thesis/A_recursive_algorithm_for_reliability_assessment_in_water_distribution_networks_with_applications_of_parallel_programming_techniques/13425371.

Full text
Abstract:
This project models the reliability of an urban water distribution network. Reliability is one of the fundamental considerations in the design of urban water distribution networks. The reliability of a network can be modelled by the probability of the connectedness of a stochastic graph. The enumeration of a set of cuts of the graph, and the calculation of the disjoint probability products of the cuts, are two fundamental steps in network reliability assessment. An improved algorithm for the enumeration of all the minimal cutsets of a graph is presented. Based on this, a recursive algorithm for the enumeration of all Buzacott cuts (a particular set of ordered cuts) of a graph has been developed. The final algorithm presented in this thesis incorporates the enumeration of Buzacott cuts and the calculation of the disjoint probability products of the cuts to obtain the network reliability. As a result, it is tightly coupled and very efficient. Experimental results show that this algorithm has a higher efficiency than other reported methods. The parallelism existing in the reliability assessment is investigated. The final algorithm has been implemented in a concurrent computer program. The effectiveness of parallel programming techniques in reducing the computing time required by the reliability assessment is also discussed.
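For intuition, the two-terminal reliability of a tiny bridge network can be computed by brute-force enumeration of edge states (an illustrative baseline only; the edge probabilities are assumed, and the thesis's disjoint-products recursion exists precisely to avoid this exponential enumeration):

```python
from itertools import product

# A small s-t "bridge" network; edge operating probabilities are assumed values.
edges = {("s", "a"): 0.9, ("s", "b"): 0.9, ("a", "t"): 0.8,
         ("b", "t"): 0.8, ("a", "b"): 0.95}

def connected(up, src="s", dst="t"):
    # Depth-first search over the currently operating (undirected) edges.
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for x, y in up:
            for v in ((y,) if x == u else (x,) if y == u else ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
    return dst in seen

def reliability():
    # Exhaustive enumeration of all 2^|E| edge states -- tractable only for
    # tiny graphs; the disjoint-products method avoids this blow-up.
    es, total = list(edges), 0.0
    for state in product((0, 1), repeat=len(es)):
        p = 1.0
        for e, s in zip(es, state):
            p *= edges[e] if s else 1.0 - edges[e]
        if connected([e for e, s in zip(es, state) if s]):
            total += p
    return total

r = reliability()   # 0.94896 for this bridge network
```

Conditioning on the bridge edge gives 0.95 × 0.9504 + 0.05 × 0.9216 = 0.94896, which the enumeration reproduces.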
6

(9831926), David Ruxton. "Differential dynamic programming and optimal control of inequality constrained continuous dynamic systems." Thesis, 1991. https://figshare.com/articles/thesis/Differential_dynamic_programming_and_optimal_control_of_inequality_constrained_continuous_dynamic_systems/13430081.

Full text
Abstract:
In this thesis, the development of the differential dynamic programming (DDP) algorithm is extensively reviewed from its introduction to its present status. Following this review, the DDP algorithm is shown to be readily adapted to handle inequality constrained continuous optimal control problems. In particular, a new approach using multiplier penalty functions implemented in conjunction with the DDP algorithm, is introduced and shown to be effective. Emphasis is placed on the practical aspects of implementing and applying DDP algorithm variants. The new DDP and multiplier penalty function algorithm variant is then tested and compared with established DDP algorithm variants as well as another numerical method before being applied to solve a problem involving the control of a robot arm in the plane.
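The multiplier penalty idea (an augmented Lagrangian) can be sketched on a toy equality-constrained problem (illustrative only, not the DDP variant itself; the objective, step sizes and iteration counts are assumptions):

```python
def solve():
    # Minimize x1^2 + x2^2 subject to x1 + x2 = 1; the optimum is (0.5, 0.5).
    x, lam, mu = [0.0, 0.0], 0.0, 10.0
    for _ in range(20):                        # outer multiplier updates
        for _ in range(2000):                  # inner unconstrained minimisation
            c = x[0] + x[1] - 1.0
            # gradient of f(x) + lam*c + (mu/2)*c^2
            g = [2.0 * x[0] + lam + mu * c, 2.0 * x[1] + lam + mu * c]
            x = [x[0] - 0.01 * g[0], x[1] - 0.01 * g[1]]
        lam += mu * (x[0] + x[1] - 1.0)        # multiplier (dual) update
    return x

sol = solve()   # approaches (0.5, 0.5)
```

Unlike a pure penalty method, the multiplier update lets the constraint be satisfied without driving mu to infinity, which is the practical attraction of combining it with DDP.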
7

(9811760), Scott Ladley. "An investigation into the application of evolutionary algorithms on highly constrained optimal control problems and the development of a graphical user interface for comprehensive algorithm control and monitoring." Thesis, 2003. https://figshare.com/articles/thesis/An_investigation_into_the_application_of_evolutionary_algorithms_on_highly_constrained_optimal_control_problems_and_the_development_of_a_graphical_user_interface_for_comprehensive_algorithm_control_and_monitoring/19930160.

Full text
Abstract:

In this thesis we investigate how intelligent techniques, such as Evolutionary Algorithms, can be applied to finding solutions to discrete optimal control problems. Also, a detailed investigation is carried out into the design and development of a superior execution environment for Evolutionary Algorithms.

An overview of the basic processes of an Evolutionary Algorithm is given, as well as detailed descriptions of several genetic operators. Several additional operators that may be applied in conjunction with an Evolutionary Algorithm are also studied. These operators include several versions of the simplex method, as well as three distinct hill-climbers, each designed for a specific purpose: local search, escaping local minima, and self-adaptation to a broad range of problems.

The mathematical programming formulation of discrete optimal control problems is used to generate a class of highly constrained problems. Techniques are developed to accurately and rapidly solve these problems, whilst satisfying the equality constraints to machine accuracy.

The improved execution environment for Evolutionary Algorithms proposes the use of a Graphical User Interface for data visualisation, algorithm control and monitoring, as well as a Client/Server network interface for connecting the GUI to remotely run algorithms.
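A minimal hybrid of an evolutionary algorithm with an improvement-only hill-climber, in the spirit described above (the objective, population sizes and operators are illustrative assumptions, not the thesis's implementation):

```python
import random

def f(x):
    # Toy objective: sphere centred at (3, 3); global minimum value 0.
    return sum((xi - 3.0) ** 2 for xi in x)

def hill_climb(x, rng, step=0.1, iters=200):
    # Improvement-only local search: keep a perturbation only if it lowers f.
    fx = f(x)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x

def evolve(pop_size=20, gens=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            child = [xi + rng.gauss(0.0, 0.5) for xi in rng.choice(parents)]
            children.append(hill_climb(child, rng))   # local refinement
        pop = parents + children
    return min(pop, key=f)

best = evolve()   # lands very close to (3, 3)
```

The division of labour is the point: the evolutionary loop supplies global exploration while the hill-climber polishes each candidate, mirroring the local-search role described for the thesis's first hill-climbing routine.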

8

Nguyen, Van-Tuong. "An implementation of the parallelism, distribution and nondeterminism of membrane computing models on reconfigurable hardware." 2010. http://arrow.unisa.edu.au:8081/1959.8/100802.

Full text
Abstract:
Membrane computing investigates models of computation inspired by certain features of biological cells, especially features arising because of the presence of membranes. Because of their inherent large-scale parallelism, membrane computing models (called P systems) can be fully exploited only through the use of a parallel computing platform. However, it is an open question whether it is feasible to develop an efficient and useful parallel computing platform for membrane computing applications. Such a computing platform would significantly outperform equivalent sequential computing platforms while still achieving acceptable scalability, flexibility and extensibility. To move closer to an answer to this question, I have investigated a novel approach to the development of a parallel computing platform for membrane computing applications that has the potential to deliver a good balance between performance, flexibility, scalability and extensibility. This approach involves the use of reconfigurable hardware and an intelligent software component that is able to configure the hardware to suit the specific properties of the P system to be executed. As part of my investigations, I have created a prototype computing platform called Reconfig-P based on the proposed development approach. Reconfig-P is the only existing computing platform for membrane computing applications able to support both system-level and region-level parallelism. Using an intelligent hardware source code generator called P Builder, Reconfig-P is able to realise an input P system as a hardware circuit in various ways, depending on which aspects of P systems the user wishes to emphasise at the implementation level. For example, Reconfig-P can realise a P system in a rule-oriented manner or in a region-oriented manner. P Builder provides a unified implementation framework within which the various implementation strategies can be supported. 
The basic principles of this framework conform to a novel design pattern called Content-Form-Strategy. The framework seamlessly integrates the currently supported implementation approaches, and facilitates the inclusion of additional implementation strategies and additional P system features. Theoretical and empirical results regarding the execution time performance and hardware resource consumption of Reconfig-P suggest that the proposed development approach is a viable means of attaining a good balance between performance, scalability, flexibility and extensibility. Most of the existing computing platforms for membrane computing applications fail to support nondeterministic object distribution, a key aspect of P systems that presents several interesting implementation challenges. I have devised an efficient algorithm for nondeterministic object distribution that is suitable for implementation in hardware. Experimental results suggest that this algorithm could be incorporated into Reconfig-P without too significantly reducing its performance or efficiency.
Thesis (PhD Information Technology)--University of South Australia, 2010
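The nondeterministic, maximally parallel rule application that makes P systems challenging to implement can be sketched for a single membrane (a software toy, not Reconfig-P's hardware realisation; the rules and objects are invented for illustration):

```python
import random
from collections import Counter

# A single membrane with two multiset rewriting rules (invented for illustration).
rules = [
    (Counter("a"), Counter("bb")),    # a  -> bb
    (Counter("ab"), Counter("c")),    # ab -> c
]

def step(objects, rng):
    # One maximally parallel step: nondeterministically commit rule instances
    # until no rule's left-hand side fits in the objects still unassigned.
    products = []
    while True:
        applicable = [r for r in rules
                      if all(objects[o] >= n for o, n in r[0].items())]
        if not applicable:
            break                                  # maximality reached
        lhs, rhs = rng.choice(applicable)          # nondeterministic choice
        objects -= lhs
        products.append(rhs)
    for rhs in products:
        objects += rhs                             # products appear afterwards
    return objects

m = step(Counter("aab"), random.Random(0))
```

Different random choices yield different legal outcomes from the same configuration, which is exactly the nondeterministic object distribution that the dissertation's hardware algorithm has to realise efficiently.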
9

(10514360), Uttara Vinay Tipnis. "Data Science Approaches on Brain Connectivity: Communication Dynamics and Fingerprint Gradients." Thesis, 2021.

Find full text
Abstract:
The innovations in Magnetic Resonance Imaging (MRI) in the recent decades have given rise to large open-source datasets. MRI affords researchers the ability to look at both structure and function of the human brain. This dissertation will make use of one of these large open-source datasets, the Human Connectome Project (HCP), to study the structural and functional connectivity in the brain.
Communication processes within the human brain at different cognitive states are neither well understood nor completely characterized. We assess communication processes in the human connectome using an ant colony-inspired cooperative learning algorithm, starting from a source with no a priori information about the network topology, and cooperatively searching for the target through a pheromone-inspired model. This framework relies on two parameters, namely pheromone and edge perception, to define the cognizance and subsequent behaviour of the ants on the network and the communication processes happening between source and target. Simulations with different configurations allow the identification of path-ensembles that are involved in the communication between node pairs. In order to assess the different communication regimes displayed in the simulations and their associations with functional connectivity, we introduce two network measurements, effective path-length and arrival rate. These measurements are tested as individual and combined descriptors of functional connectivity during different tasks. Finally, different communication regimes are found in different specialized functional networks. This framework may be used as a test-bed for different communication regimes on top of an underlying topology.
The assessment of brain fingerprints has emerged in the recent years as an important tool to study individual differences. Studies so far have mainly focused on connectivity fingerprints between different brain scans of the same individual. We extend the concept of brain connectivity fingerprints beyond test/retest and assess fingerprint gradients in young adults by developing an extension of the differential identifiability framework. To do so, we look at the similarity between not only the multiple scans of an individual (subject fingerprint), but also between the scans of monozygotic and dizygotic twins (twin fingerprint). We have carried out this analysis on the 8 fMRI conditions present in the Human Connectome Project -- Young Adult dataset, which we processed into functional connectomes (FCs) and time series parcellated according to the Schaefer Atlas scheme, which has multiple levels of resolution. Our differential identifiability results show that the fingerprint gradients based on genetic and environmental similarities are indeed present when comparing FCs for all parcellations and fMRI conditions. Importantly, only when assessing optimally reconstructed FCs, we fully uncover fingerprints present in higher resolution atlases. We also study the effect of scanning length on subject fingerprint of resting-state FCs to analyze the effect of scanning length and parcellation. In the pursuit of open science, we have also made available the processed and parcellated FCs and time series for all conditions for ~1200 subjects part of the HCP-YA dataset to the scientific community.
Lastly, we have estimated the effect of genetics and environment on the original and optimally reconstructed FC with an ACE model.
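The pheromone-biased walk underlying such ant-colony communication models can be sketched on a toy graph (illustrative only; the four-node network, deposit rule and parameters are assumptions, not the dissertation's connectome framework):

```python
import random

# A toy four-node network (assumed), not the HCP connectome.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
pher = {(u, v): 1.0 for u in graph for v in graph[u]}   # uniform initial pheromone

def walk(src, dst, rng, max_hops=20):
    # One ant: a random walk whose edge choices are biased by pheromone.
    node, path = src, [src]
    for _ in range(max_hops):
        nbrs = graph[node]
        node = rng.choices(nbrs, weights=[pher[(node, v)] for v in nbrs])[0]
        path.append(node)
        if node == dst:
            return path
    return None                                # ant failed to arrive

def simulate(src=0, dst=3, ants=2000, seed=0):
    rng = random.Random(seed)
    arrived, hops = 0, 0
    for _ in range(ants):
        p = walk(src, dst, rng)
        if p is not None:
            arrived += 1
            hops += len(p) - 1
            for u, v in zip(p, p[1:]):         # reinforce the successful path
                pher[(u, v)] += 1.0 / (len(p) - 1)
    return arrived / ants, hops / max(arrived, 1)

rate, epl = simulate()
```

Here `rate` and `epl` play the roles of the abstract's arrival rate and effective path-length: together they summarise the communication regime that emerges on a given topology.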
10

(8713962), James Ulcickas. "LIGHT AND CHEMISTRY AT THE INTERFACE OF THEORY AND EXPERIMENT." Thesis, 2020.

Find full text
Abstract:
Optics are a powerful probe of chemical structure that can often be linked to theoretical predictions, providing robustness as a measurement tool. Not only do optical interactions like second harmonic generation (SHG), single and two-photon excited fluorescence (TPEF), and infrared absorption provide chemical specificity at the molecular and macromolecular scale, but the ability to image enables mapping heterogeneous behavior across complex systems such as biological tissue. This thesis will discuss nonlinear and linear optics, leveraging theoretical predictions to provide frameworks for interpreting analytical measurement. In turn, the causal mechanistic understanding provided by these frameworks will enable structurally specific quantitative tools with a special emphasis on application in biological imaging. The thesis will begin with an introduction to 2nd order nonlinear optics and the polarization analysis thereof, covering both the Jones framework for polarization analysis and the design of experiment. Novel experimental architectures aimed at reducing 1/f noise in polarization analysis will be discussed, leveraging both rapid modulation in time through electro-optic modulators (Chapter 2), as well as fixed-optic spatial modulation approaches (Chapter 3). In addition, challenges in polarization-dependent imaging within turbid systems will be addressed with the discussion of a theoretical framework to model SHG occurring from unpolarized light (Chapter 4). The application of this framework to thick tissue imaging for analysis of collagen local structure can provide a method for characterizing changes in tissue morphology associated with some common cancers (Chapter 5). In addition to discussion of nonlinear optical phenomena, a novel mechanism for electric dipole allowed fluorescence-detected circular dichroism will be introduced (Chapter 6). 
Tackling challenges associated with label-free chemically specific imaging, the construction of a novel infrared hyperspectral microscope for chemical classification in complex mixtures will be presented (Chapter 7). The thesis will conclude with a discussion of the inherent disadvantages in taking the traditional paradigm of modeling and measuring chemistry separately and provide the multi-agent consensus equilibrium (MACE) framework as an alternative to the classic meet-in-the-middle approach (Chapter 8). Spanning topics from pure theoretical descriptions of light-matter interaction to full experimental work, this thesis aims to unify these two fronts.
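The Jones framework for polarization analysis mentioned above can be illustrated with a small numerical example (standard textbook Jones matrices, not the instrument designs of Chapters 2-3):

```python
import numpy as np

def polarizer(theta):
    # Jones matrix of an ideal linear polarizer with its axis at angle theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def retarder(theta, delta):
    # Jones matrix of a waveplate (retardance delta, fast axis at theta).
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    core = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
    return rot @ core @ rot.T

E_in = np.array([1.0, 0.0], dtype=complex)        # horizontally polarized field
E = retarder(np.pi / 4, np.pi / 2) @ E_in         # quarter-wave plate at 45 deg
# Circular light: a rotating analyzer passes the same intensity at every angle.
I = [float(np.vdot(polarizer(a) @ E, polarizer(a) @ E).real)
     for a in np.linspace(0.0, np.pi, 8)]
```

A quarter-wave plate at 45 degrees converts horizontal polarization to circular, so every analyzer angle transmits exactly half of the intensity; chaining such matrices is the basis of the polarization analyses discussed in the thesis.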
11

(6012225), Huian Li. "Transparent and Mutual Restraining Electronic Voting." Thesis, 2019.

Find full text
Abstract:
Many e-voting techniques have been proposed but not widely used in reality. One of the problems associated with most of existing e-voting techniques is the lack of transparency, leading to a failure to deliver voter assurance. In this work, we propose a transparent, auditable, end-to-end verifiable, and mutual restraining e-voting protocol that exploits the existing multi-party political dynamics such as in the US. The new e-voting protocol consists of three original technical contributions -- universal verifiable voting vector, forward and backward mutual lock voting, and in-process check and enforcement -- that, along with a public real time bulletin board, resolves the apparent conflicts in voting such as anonymity vs. accountability and privacy vs. verifiability. Especially, the trust is split equally among tallying authorities who have conflicting interests and will technically restrain each other. The voting and tallying processes are transparent to voters and any third party, which allow any voter to verify that his vote is indeed counted and also allow any third party to audit the tally. For the environment requiring receipt-freeness and coercion-resistance, we introduce additional approaches to counter vote-selling and voter-coercion issues. Our interactive voting protocol is suitable for small number of voters like boardroom voting where interaction between voters is encouraged and self-tallying is necessary; while our non-interactive protocol is for the scenario of large number of voters where interaction is prohibitively expensive. Equipped with a hierarchical voting structure, our protocols can enable open and fair elections at any scale.
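Splitting trust among mutually restraining tallying authorities is commonly built on additive secret sharing; a minimal sketch (an illustrative primitive only, not the protocol's actual voting-vector construction):

```python
import random

P = 2 ** 31 - 1      # a public prime; all share arithmetic is modulo P

def share(vote, n_auth, rng):
    # Split one vote into n_auth additive shares that sum to the vote mod P.
    shares = [rng.randrange(P) for _ in range(n_auth - 1)]
    shares.append((vote - sum(shares)) % P)
    return shares

def tally(all_shares):
    # Each authority publishes only the sum of the shares it holds; adding the
    # published partial sums reconstructs the tally, yet no coalition short of
    # all authorities learns any individual vote.
    partials = [sum(col) % P for col in zip(*all_shares)]
    return sum(partials) % P

rng = random.Random(42)
votes = [1, 0, 1, 1, 0, 1]                          # 1 = yes, 0 = no
result = tally([share(v, 3, rng) for v in votes])   # -> 4 yes-votes
```

Because reconstruction needs every authority's partial sum, authorities with conflicting interests technically restrain each other, which is the property the protocol above builds on and extends with verifiability.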
12

(9776870), Jeanne Allen. "The "theory-practice gap": Turning theory into practice in a pre-service teacher education program." Thesis, 2009. https://figshare.com/articles/thesis/The_theory-practice_gap_Turning_theory_into_practice_in_a_pre-service_teacher_education_program/13455275.

Full text
Abstract:
This thesis investigates the theory-practice gap using the exemplar of teacher education. The research is situated in a pre-service teacher education program that explicitly seeks to bridge the theory-practice gap so that it produces “learning managers” who can negotiate the contemporary knowledge society in ways different to those of their predecessors. The empirical work reported in this thesis describes and interprets the experiences of preservice and beginning teachers in turning theory into practice. In order to accomplish this outcome, the thesis draws on Mead’s theory of emergence and symbolic interactionism to provide a theoretical perspective for meaning-making in social situations. Data for the study were collected through interviews and focus groups involving a sample of first-year graduate teachers of an Australian pre-service teacher education program. The main finding of this thesis is that the theory-practice gap in pre-service teacher education under present institutional arrangements is an inevitable phenomenon arising as individuals undergo the process of emergence from pre-service to graduate and then beginning teachers. The study shows that despite the efforts of the program developers, environmental, social and cultural conditions in teacher education processes and structures and in schools inhibit the trainee and novitiate teacher from exercising agency to effect change in traditional classroom practices. Thus, the gap between theory and practice is co-produced and sustained in the model that characterises contemporary preservice teacher education in the perspectives of lecturers, teachers and administrators -- Abstract.
13

(5930264), Arthur J. Shih. "Synthesis and Characterization of Copper-Exchanged Zeolite Catalysts and Kinetic Studies on NOx Selective Catalytic Reduction with Ammonia." 2019.

Find full text
Abstract:

Although Cu-SSZ-13 zeolites are used commercially in diesel engine exhaust after-treatment for abatement of toxic NOx pollutants via selective catalytic reduction (SCR) with NH3, molecular details of their active centers and mechanistic details of the redox reactions they catalyze, specifically of the Cu(I) to Cu(II) oxidation half-reaction, are not well understood. A detailed understanding of the SCR reaction mechanism and nature of the Cu active site would provide insight into their catalytic performance and guidance on synthesizing materials with improved low temperature (< 473 K) reactivity and stability against deactivation (e.g. hydrothermal, sulfur oxides). We use computational, titration, spectroscopic, and kinetic techniques to elucidate (1) the presence of two types of Cu2+ ions in Cu-SSZ-13 materials, (2) molecular details on how these Cu cations, facilitated by NH3 solvation, undergo a reduction-oxidation catalytic cycle, and (3) that sulfur oxides poison the two different types of Cu2+ ions to different extents via different mechanisms.


Copper was exchanged onto H-SSZ-13 samples with different Si:Al ratios (4.5, 15, and 25) via liquid-phase ion exchange using Cu(NO3)2 as the precursor. The speciation of copper started from the most stable Cu2+ coordinated to two anionic sites on the zeolite framework to [CuOH]+ coordinated to only one anionic site on the zeolite framework with increasing Cu:Al ratios. The number of Cu2+ and [CuOH]+ sites was quantified by selective NH3 titration of the number of residual Brønsted acid sites after Cu exchange, and by quantification of Brønsted acidic Si(OH)Al and CuOH stretching vibrations from IR spectra. Cu-SSZ-13 with similar Cu densities and anionic framework site densities exhibit similar standard SCR rates, apparent activation energies, and orders regardless of the fraction of Z2Cu and ZCuOH sites, indicating that both sites are equally active within measurable error for SCR.


The standard SCR reaction uses O2 as the oxidant (4NH3 + 4NO + O2 -> 6H2O + 4N2) and involves a Cu(I)/Cu(II) redox cycle, with Cu(II) reduction mediated by NO and NH3, and Cu(I) oxidation mediated by NO and O2. In contrast, the fast SCR reaction (4NH3 + 2NO + 2NO2 -> 6H2O + 4N2) uses NO2 as the oxidant. Low temperature (437 K) standard SCR reaction kinetics over Cu-SSZ-13 zeolites depend on the spatial density and distribution of Cu ions, varied by changing the Cu:Al and Si:Al ratio. Facilitated by NH3 solvation, mobile Cu(I) complexes can dimerize with other Cu(I) complexes within diffusion distances to activate O2, as demonstrated through X-ray absorption spectroscopy and density functional theory calculations. Monte Carlo simulations are used to define average Cu-Cu distances. In contrast with O2-assisted oxidation reactions, NO2 oxidizes single Cu(I) complexes with similar kinetics among samples of varying Cu spatial density. These findings demonstrate that low temperature standard SCR is dependent on Cu spatial density and requires NH3 solvation to mobilize Cu(I) sites to activate O2, while in contrast fast SCR uses NO2 to oxidize single Cu(I) sites.


We also studied the effect of sulfur oxides, a common poison in diesel exhaust, on Cu-SSZ-13 zeolites. Model Cu-SSZ-13 samples were exposed to dry SO2 and O2 streams at 473 and 673 K. These Cu-SSZ-13 zeolites were synthesized and characterized to contain distinct Cu active site types, predominantly either divalent Cu2+ ions exchanged at proximal framework Al sites (Z2Cu), or monovalent CuOH+ complexes exchanged at isolated framework Al sites (ZCuOH). On the model Z2Cu sample, SCR turnover rates (473 K, per Cu) decreased linearly with increasing S content to undetectable values at equimolar S:Cu molar ratios, while apparent activation energies remained constant at ~65 kJ mol-1, consistent with poisoning of each Z2Cu site with one SO2-derived intermediate. On the model ZCuOH sample, SCR turnover rates also decreased linearly with increasing S content, yet apparent activation energies decreased monotonically from ~50 to ~10 kJ mol-1, suggesting that multiple phenomena are responsible for the observed poisoning behavior and consistent with findings that SO2 exposure led to additional storage of SO2-derived intermediates on non-Cu surface sites. Changes to Cu2+ charge transfer features in UV-Visible spectra were more pronounced for SO2-poisoned ZCuOH than Z2Cu sites, while X-ray diffraction and micropore volume measurements show evidence of partial occlusion of microporous voids by SO2-derived deposits, suggesting that deactivation may not only reflect Cu site poisoning. Density functional theory calculations are used to identify the structures and binding energies of different SO2-derived intermediates at Z2Cu and ZCuOH sites. It is found that bisulfates are particularly low in energy, and residual Brønsted protons are liberated as these bisulfates are formed. These findings indicate that Z2Cu sites are more resistant to SO2 poisoning than ZCuOH sites, and are easier to regenerate once poisoned.

APA, Harvard, Vancouver, ISO, and other styles
14

(11205636), Sarah B. Percival. "Efficient Computation of Reeb Spaces and First Homology Groups." Thesis, 2021.

Find full text
Abstract:
This thesis studies problems in computational topology through the lens of semi-algebraic geometry. We first give an algorithm for computing a semi-algebraic basis for the first homology group, H1(S, F), with coefficients in a field F, of any given semi-algebraic set S ⊂ R^k defined by a closed formula. The complexity of the algorithm is bounded singly exponentially. More precisely, if the given quantifier-free formula involves s polynomials whose degrees are bounded by d, the complexity of the algorithm is bounded by (sd)^(k^O(1)). This algorithm generalizes well-known algorithms having singly exponential complexity for computing a semi-algebraic basis of the zero-th homology group of semi-algebraic sets, which is equivalent to the problem of computing a set of points meeting every semi-algebraically connected component of the given semi-algebraic set at a unique point. We then turn our attention to the Reeb graph, a tool from Morse theory which has recently found use in applied topology due to its ability to track the changes in connectivity of level sets of a function. The roadmap of a set, a construction that arises in semi-algebraic geometry, is a one-dimensional set that encodes information about the connected components of a set. In this thesis, we show that the Reeb graph and, more generally, the Reeb space, of a semi-algebraic set is homeomorphic to a semi-algebraic set, which opens up the algorithmic problem of computing a semi-algebraic description of the Reeb graph. We present an algorithm with singly-exponential complexity that realizes the Reeb graph of a function f: X → Y as a semi-algebraic quotient using the roadmap of X with respect to f.
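The linear-algebra core of a first homology computation can be illustrated on a simplicial (not semi-algebraic) toy case: over the field F2, the first Betti number is b1 = dim ker ∂1 − rank ∂2 = (number of edges − rank ∂1) − rank ∂2. This sketch is only an analogy for the thesis's algorithm, which works with semi-algebraic sets and carries the complexity bounds quoted above.

```python
def rank_gf2(rows):
    """Rank over F2 of a 0/1 matrix whose rows are given as int bitmasks."""
    rows = list(rows)
    rank = 0
    for bit in range(max((r.bit_length() for r in rows), default=0)):
        # find a pivot row (index >= rank) with this bit set
        pivot = next((i for i, r in enumerate(rows[rank:], rank)
                      if (r >> bit) & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i, r in enumerate(rows):
            if i != rank and (r >> bit) & 1:
                rows[i] ^= rows[rank]  # eliminate the bit everywhere else
        rank += 1
    return rank

def betti1(n_edges, d1_rows, d2_rows):
    """b1 = (n_edges - rank d1) - rank d2, with F2 coefficients."""
    return n_edges - rank_gf2(d1_rows) - rank_gf2(d2_rows)

# Hollow triangle (a topological circle): edges 01, 02, 12 encoded as
# vertex bitmasks; no 2-cells, so one independent 1-cycle survives.
edges = [0b011, 0b101, 0b110]
assert betti1(3, edges, []) == 1
# Filling in the face (boundary = sum of all three edges) kills the cycle.
assert betti1(3, edges, [0b111]) == 0
```

The bitmask representation keeps the Gaussian elimination compact; for larger complexes any F2 matrix library would do the same job.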
APA, Harvard, Vancouver, ISO, and other styles
15

(9834218), Julie Shaw. "Constructing a grounded theory of young adult health literacy." Thesis, 2017. https://figshare.com/articles/thesis/Constructing_a_grounded_theory_of_young_adult_health_literacy/13443047.

Full text
Abstract:
The health literacy of young adults indicates their preparedness to manage their health. Health literacy is currently viewed from the perspective of health professionals and patients, with two dominant perspectives being functional health literacy and public health/health promotion health literacy, both of which are known to improve health outcomes. A review of the literature identifies that young people’s perception of health is inconsistent; as well, there is limited information on the health literacy of young adults. The rise of the consumer society in Western countries in the 1950s created the need for the health literacy of the broader population to be enhanced, as individuals are increasingly asked to make choices and decisions in the management of their lifestyle and health. The lifestyle and health management choices of young people impact their current and future health. The young adult health literacy study aimed to address this gap in knowledge and construct a theory of young adult health literacy. Charmaz's constructivist grounded theory approach underpinned this study. Recruitment of participants involved purposive sampling and snowballing. Twelve young adults who met the inclusion criteria for this study were recruited. Data collection involved completion of a demographic questionnaire along with semi-structured face-to-face and group interviews. Transcribed data were analysed using Charmaz's grounded theory framework, resulting in the construction of a theoretical model of health literacy. The constructed theory has a central theme of My Health, representing the embodiment of health. There are four other interconnected themes: Learning about Health, Developing Meaningful Knowledge, Making Health Decisions, representing empowerment of the young adult, and Context, representing the dynamics of daily living. The theory accommodates the diversity of young adults, and its fluidity allows for movement within the model.
The theory contributes to the clarification of health literacy as a construct; identifies the enablers and barriers of health literacy for young adults; provides a framework for education, health information, health professionals and health services to use in the further enhancement of young adults’ health, health management and health literacy; and provides a foundation for young adults in making health decisions in later life.
APA, Harvard, Vancouver, ISO, and other styles
16

(7307489), Ishant Khurana. "Catalytic Consequences of Active Site Speciation, Density, Mobility and Stability on Selective Catalytic Reduction of NOx with Ammonia over Cu-Exchanged Zeolites." Thesis, 2019.

Find full text
Abstract:

Selective catalytic reduction (SCR) of NOx using NH3 as a reductant (4NH3 + 4NO + O2 → 6H2O + 4N2) over Cu-SSZ-13 zeolites is a commercial technology used to meet emissions targets in lean-burn and diesel engine exhaust. Optimization of catalyst design parameters to improve catalyst reactivity and stability against deactivation (hydrothermal and sulfur poisoning) necessitates detailed molecular-level understanding of structurally different active Cu sites and the reaction mechanism. With the help of synthetic, titrimetric, spectroscopic, kinetic and computational techniques, we established new molecular-level details regarding 1) active Cu site speciation in monomeric and dimeric complexes in Cu-SSZ-13, 2) elementary steps in the catalytic reaction mechanism, and 3) deactivation mechanisms upon hydrothermal treatment and sulfur poisoning.

We have demonstrated that Cu in Cu-SSZ-13 speciates as two distinct isolated sites, nominally divalent CuII and monovalent [CuII(OH)]+ complexes exchanged at paired Al and isolated Al sites, respectively. This Cu site model accurately described a wide range of zeolite chemical composition, as evidenced by spectroscopic (Infrared and X-ray absorption) and titrimetric characterization of Cu sites under ex situ conditions and in situ and operando SCR reaction conditions. Monovalent [CuII(OH)]+ complexes have been further found to condense to form multinuclear Cu-oxo complexes upon high temperature oxidative treatment, which have been characterized using UV-visible spectroscopy, CO-temperature programmed reduction and dry NO oxidation as a probe reaction. Structurally different isolated Cu sites have different susceptibilities to H2 and He reductions, but are similarly susceptible to NO+NH3 reduction and have been found to catalyze NOx SCR reaction at similar turnover rates (per CuII; 473 K) via a CuII/CuI redox cycle, as their structurally different identities are masked by NH3 solvation during reaction.


Molecular level insights on the low temperature CuII/CuI redox mechanism have been obtained using experiments performed in situ and in operando coupled with theory. Evidence has been provided to show that the CuII to CuI reduction half-cycle involves single-site Cu reduction of isolated CuII sites with NO+NH3, which is independent of Cu spatial density. In contrast, the CuI to CuII oxidation half-cycle involves dual-site Cu oxidation with O2 to form dimeric Cu-oxo complexes, which is dependent on Cu spatial density. Such dual-site oxidation during the SCR CuII/CuI redox cycle requires two CuI(NH3)2 sites, which is enabled by NH3 solvation that confers mobility to isolated CuI sites and allows reactions between two CuI(NH3)2 species and O2. As a result, standard SCR rates depend on Cu proximity in Cu-SSZ-13 zeolites when CuI oxidation steps are kinetically relevant. Additional unresolved pieces of the mechanism have been investigated, such as the reactivity of Cu dimers, the types of reaction intermediates involved, and the debated role of Brønsted acid sites in the SCR cycle, to postulate a detailed reaction mechanism. A strategy has been discussed to operate either in oxidation- or reduction-limited kinetic regimes, to extract oxidation and reduction rate constants, and to better interpret the kinetic differences among Cu-SSZ-13 catalysts.


The stability of active Cu sites upon sulfur oxide poisoning has been assessed by exposing model Cu-zeolite samples to dry SO2 and O2 streams at 473 and 673 K, and then analyzing the surface intermediates formed via spectroscopic and kinetic assessments. Model Cu-SSZ-13 zeolites were synthesized to contain distinct Cu active site types, predominantly either divalent CuII ions exchanged at proximal framework Al (Z2Cu), or monovalent [CuIIOH]+ complexes exchanged at isolated framework Al (ZCuOH). SCR turnover rates (473 K, per Cu) decreased linearly with increasing S content to undetectable values at equimolar S:Cu ratios, consistent with poisoning of each Cu site with one SO2-derived intermediate. Cu and S K-edge X-ray absorption spectroscopy and density functional theory calculations were used to identify the structures and binding energies of different SO2-derived intermediates at Z2Cu and ZCuOH sites, revealing that bisulfates are particularly low in energy, and residual Brønsted protons are liberated at Z2Cu sites as bisulfates are formed. Molecular dynamics simulations also show that Cu sites bound to one HSO4- are immobile, but become liberated from the framework and more mobile when bound to two HSO4-. These findings indicate that Z2Cu sites are more resistant to SO2 poisoning than ZCuOH sites, and are easier to regenerate once poisoned.


The stability of active Cu sites on various small-pore Cu-zeolites during hydrothermal deactivation (high temperature steaming conditions) has also been assessed by probing the structural and kinetic changes to active Cu sites. Three small-pore, eight-membered ring (8-MR) zeolites of different cage-based topology (CHA, AEI, RTH) have been investigated. With the help of UV-visible spectroscopy to probe the Cu structure, in conjunction with measuring differential reaction kinetics before and after subsequent treatments, it has been suggested that the RTH framework imposes internal transport restrictions, effectively functioning as a 1-D framework during SCR catalysis. Hydrothermal aging of Cu-RTH results in complete deactivation and undetectable SCR rates, despite no changes in long-range structure or micropore volume after hydrothermal aging treatments and subsequent SCR exposure, highlighting beneficial properties conferred by double six-membered ring (D6R) composite building units. Exposure to aging conditions and SCR reactants resulted in deleterious structural changes to Cu sites, likely reflecting the formation of inactive copper-aluminate domains. Therefore, the viability of Cu-zeolites for practical low temperature NOx SCR catalysis cannot be inferred solely from assessments of framework structural integrity after aging treatments, but also requires Cu active site and kinetic characterization after aged zeolites are exposed to low temperature SCR conditions.

APA, Harvard, Vancouver, ISO, and other styles
17

(9846839), Sandra Worsley. "A foot in both camps: A constructivist grounded theory study exploring the experience of nurses who became homeopaths." Thesis, 2020. https://figshare.com/articles/thesis/A_foot_in_both_camps_A_constructivist_grounded_theory_study_exploring_the_experience_of_nurses_who_became_homeopaths/13411316.

Full text
Abstract:
This research aimed to understand the factors attracting qualified nurses to the practice of homeopathy and the influence if any, their respective identities as nurses and homeopaths had on their nursing and homeopathic practice. Using constructivist grounded theory methodology, data was collected via semi-structured interviews with fifteen registered nurses, who were also registered homeopaths, from three states of Australia. Data from the study resulted in the development of a substantive theory, the ‘Theory of Congruent Positioning’, which proposes that the nurses in this study were attracted to the practice of homeopathy through a process of experiential and transformative learning, whereby they connected with the core tenets of homeopathic philosophy. The ‘Theory of Congruent Positioning’ also provides insights into how the respective nursing and homeopathic identities of the nurses in this study influenced their respective nursing and homeopathic practice.
APA, Harvard, Vancouver, ISO, and other styles
18

(11178198), Harrison Wong. "K-theory of certain additive categories associated with varieties." Thesis, 2021.

Find full text
Abstract:
Let K0(Var_k) be the Grothendieck group of varieties over a field k. We construct an exact category, denoted Add(Var_k)_S, such that there is a surjection K0(Var_k) → K0(Add(Var_k)_S). If we consider only zero-dimensional varieties, then this surjection is an isomorphism. Like K0(Var_k), the group K0(Add(Var_k)_S) is also generated by isomorphism classes of varieties, and we construct motivic measures on K0(Add(Var_k)_S), including the Euler characteristic if k = C, and point counting measures and the zeta function if k is finite.
APA, Harvard, Vancouver, ISO, and other styles
19

(9845663), Leonie Williams. "What needs? Nurses and Aboriginal patients in hospital. A grounded theory study." Thesis, 1999. https://figshare.com/articles/thesis/What_needs_Nurses_and_Aboriginal_patients_in_hospital_A_grounded_theory_study/13424732.

Full text
Abstract:
This dissertation provides the results of a grounded theory study, conducted in regional Queensland, that examined the practice of nurses with members of a local adult Aboriginal community admitted to hospital and resulted in the development of a substantive theory of nursing practice with adult Aboriginal people in hospital. This study is foundational both for nursing research and for developing an understanding of Aboriginal-non-Aboriginal relationships in the 20th century. No previous studies identified through the literature or available databases have attempted to explore or explain the relationships of nurses with Aboriginal people who are hospitalised. In addition, there is little evidence of any qualitative studies with urban Aboriginal people. Through the analysis of a variety of data, the research has identified a theory of nursing practice which is unique to the professional relationship between Aboriginal patients and professional nurses. In addition, the theory identifies those variables which support optimal nursing practices with Aboriginal patients and others which reduce standards of professional service. The development of the theory and the tenets of the theory have resulted in a number of significant implications for both the nursing profession and providers of health services to Aboriginal people. In this current climate of reconciliation, it is timely that professional nurses undertake a critical examination of their practices with Aboriginal people and those aspects which can redress some of the inequities perpetrated through ignorance or ethnocentricity over the last 160 years.
APA, Harvard, Vancouver, ISO, and other styles
20

(9833828), Brian Sengstock. "A grounded theory study of nursing students' experiences in the off-campus clinical setting." Thesis, 2009. https://figshare.com/articles/thesis/A_grounded_theory_study_of_nursing_students_experiences_in_the_off-campus_clinical_setting/13425191.

Full text
Abstract:
Poor workplace relations are an issue of concern in many workplaces, and this phenomenon is not restricted to the nursing profession. The issue of workplace violence in nursing is well documented, and an increasing number of studies have investigated the notion of horizontal violence amongst graduate nurses. The impact that poor workplace relations have on the development of a professional identity by nursing students in the off-campus clinical setting is significant in light of the current global shortage of nurses. There is a dearth of knowledge in understanding how Australian undergraduate nursing students experience the off-campus clinical setting and subsequently develop a professional identity as a nurse. Therefore, the aim of this study was to discover and describe the phenomena in order to develop a substantive theory that explains the experiences of undergraduate nursing students in a regional setting. Constructivist grounded theory methods were utilised in the conduct of the study. A sample of 29 participants was recruited, permitting the formulation of a substantive theory regarding the development of a professional identity in nursing students. This substantive theory contributes knowledge relevant to undergraduate nursing students, nurse educators, nursing workforce planners, and the tertiary educational institutions offering nursing. This is achieved through discovering, describing and explaining the phenomenon of anxiety which the nursing students experience as a result of the interrelationship and interactions of tradition bearing, staff and student performance. These interactions intersect to form expectations of where the student fits within the hierarchy of the facility and the nursing profession in general. 
An understanding of the issues associated with tradition bearing, staff performance, and student performance and the impact that the interaction of these conditions has upon the students developing professional identity as a nurse is necessary to allow for the implementation of corrective strategies.
APA, Harvard, Vancouver, ISO, and other styles
21

(5930132), Francisco J. Pena. "Efficient Computation of Accurate Seismic Fragility Functions Through Strategic Statistical Selection." Thesis, 2019.

Find full text
Abstract:
A fragility function quantifies the probability that a structural system reaches an undesirable limit state, conditioned on the occurrence of a hazard of prescribed intensity level. Multiple sources of uncertainty are present when estimating fragility functions, e.g., record-to-record variation, uncertain material and geometric properties, model assumptions, adopted methodologies, and scarce data to characterize the hazard. Advances in the last decades have provided considerable research about parameter selection, hazard characteristics and multiple methodology for the computation of these functions. However, there is no clear path on the type of methodologies and data to ensure that accurate fragility functions can be computed in an efficient manner. Fragility functions are influenced by the selection of a methodology and the data to be analyzed. Each selection may lead to different levels of accuracy, due to either increased potential for bias or the rate of convergence of the fragility functions as more data is used. To overcome this difficulty, it is necessary to evaluate the level of agreement between different statistical models and the available data as well as to exploit the information provided by each piece of available data. By doing this, it is possible to accomplish more accurate fragility functions with less uncertainty while enabling faster and widespread analysis. In this dissertation, two methodologies are developed to address the aforementioned challenges. The first methodology provides a way to quantify uncertainty and perform statistical model selection to compute seismic fragility functions. This outcome is achieved by implementing a hierarchical Bayesian inference framework in conjunction with a sequential Monte Carlo technique. Using a finite amount of simulations, the stochastic map between the hazard level and the structural response is constructed using Bayesian inference. 
The Bayesian approach allows for the quantification of the epistemic uncertainty induced by the limited number of simulations. The most probable model is then selected using Bayesian model selection and validated through multiple metrics such as the Kolmogorov-Smirnov test. The subsequent methodology proposes a sequential selection strategy to choose the earthquake with characteristics that yield the largest reduction in uncertainty. The quantification of uncertainty is exploited to consecutively select the ground motion simulations that expedite learning, providing unbiased fragility functions with fewer simulations. Lastly, some examples of practices during the computation of fragility functions that result in undesirable bias are discussed. The methodologies are implemented on a widely studied twenty-story steel nonlinear benchmark building model and employ a set of realistic synthetic ground motions obtained from earthquake scenarios in California. Further analysis of this case study demonstrates the superior performance when using a lognormal probability distribution compared to other models considered. It is concluded by demonstrating that the methodologies developed in this dissertation can yield lower levels of uncertainty than traditional sampling techniques using the same number of simulations. The methodologies developed in this dissertation enable reliable and efficient structural assessment, by means of fragility functions, for civil infrastructure, especially for time-critical applications such as post-disaster evaluation. Additionally, this research empowers implementation by being transferable, facilitating such analysis at community level and for other critical infrastructure systems (e.g., transportation, communication, energy, water, security) and their interdependencies.
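The lognormal fragility model favored in the analysis above can be sketched minimally: P(limit state | IM = x) = Φ((ln x − ln θ)/β). The sketch below fits (θ, β) to binary damage observations by a coarse grid-search maximum likelihood, an illustrative stand-in for the hierarchical Bayesian and sequential Monte Carlo machinery of the dissertation; the data and parameter grids are invented for illustration.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fragility(x, theta, beta):
    """Lognormal fragility: probability of reaching the limit state at IM = x."""
    return norm_cdf(math.log(x / theta) / beta)

def fit_fragility(ims, failed):
    """Grid-search MLE of (theta, beta) from binary failure observations."""
    best = (None, None, -float("inf"))
    for theta in [0.1 * k for k in range(2, 40)]:
        for beta in [0.05 * k for k in range(2, 20)]:
            ll = 0.0
            for x, y in zip(ims, failed):
                p = min(max(fragility(x, theta, beta), 1e-12), 1 - 1e-12)
                ll += math.log(p) if y else math.log(1.0 - p)
            if ll > best[2]:
                best = (theta, beta, ll)
    return best[:2]

# Synthetic observations with failures setting in near IM ~ 1.0
ims = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 2.0]
failed = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
theta, beta = fit_fragility(ims, failed)
assert 0.7 < theta < 1.3  # median capacity recovered near the true transition
```

A Bayesian treatment would replace the point estimate with a posterior over (θ, β), which is what enables the epistemic-uncertainty quantification described in the abstract.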
APA, Harvard, Vancouver, ISO, and other styles
22

(9777596), Judith Applegarth. "Understanding Assisted Reproductive Technology nursing (ART) practice in Australia: A grounded theory study." Thesis, 2011. https://figshare.com/articles/thesis/Understanding_Assisted_Reproductive_Technology_nursing_ART_practice_in_Australia_A_grounded_theory_study/13464605.

Full text
Abstract:
"The aim of this study was to develop a theory to explain the clinical practice experiences of ART nurses in Australia ... Semi-structured interviews were undertaken with 15 ART nurses working in metropolitan and regional ART units around Australia"--Abstract.
APA, Harvard, Vancouver, ISO, and other styles
23

(7041383), Carl J. Olthoff. "Computation of Large Displacement Stability Metrics in DC Power Systems." Thesis, 2019.

Find full text
Abstract:
Due to the instabilities that may occur in dc power systems with regulated power electronic loads, such as those used in aircraft, ships, and terrestrial vehicles, many analysis techniques and design methodologies have been developed to ensure stable operation following small disturbances from normal operating conditions. However, these techniques do not necessarily guarantee large-displacement stability following major disturbances such as faults, regenerative operation, pulsed loads, and/or loss of generating capacity. In this thesis, a formal mathematical definition of large-displacement stability is described, and the analytical conditions needed to guarantee large-displacement stability are investigated for a notional dc power system. It is shown possible to guarantee large-displacement stability for any piecewise continuous value of load power, provided it is bounded by the peak rating of the dc source.
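The distinction between small-disturbance and large-displacement stability can be illustrated numerically on a hypothetical dc source feeding a constant-power load through an RL filter: L di/dt = Vs − Ri − v, C dv/dt = i − P/v. The parameter values, initial conditions, and forward-Euler integration below are assumptions for illustration, not the thesis's system or method.

```python
import math

def simulate(v0, i0, t_end=20.0, dt=1e-3,
             Vs=1.0, R=0.1, L=0.01, C=1.0, P=1.0):
    """Forward-Euler trajectory of a notional dc source / RL filter /
    constant-power load system; returns the final bus voltage."""
    i, v = i0, v0
    for _ in range(int(t_end / dt)):
        if v < 1e-3:          # voltage collapse: stop integrating
            return v
        di = (Vs - R * i - v) / L
        dv = (i - P / v) / C
        i += dt * di
        v += dt * dv
    return v

# High-voltage equilibrium of Vs = R*P/v + v, per the chosen parameters
v_star = (1.0 + math.sqrt(1.0 - 0.4)) / 2.0

# Recovery from a moderate sag: locally stable AND inside the basin.
assert abs(simulate(0.5, 0.0) - v_star) < 1e-2
# A deep enough sag leaves the basin of attraction and the voltage collapses,
# even though the same equilibrium is small-disturbance stable.
assert simulate(0.05, 0.0) < 0.2
```

The second assertion is the point of the abstract: linearized (small-disturbance) analysis certifies the equilibrium, yet a major disturbance can still drive the system outside its region of attraction.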
APA, Harvard, Vancouver, ISO, and other styles
24

(9844157), Anthony Weber. "Morphine administration by paramedics: An application of the theory of planned behaviour." Thesis, 2014. https://figshare.com/articles/thesis/Morphine_administration_by_paramedics_An_application_of_the_theory_of_planned_behaviour/13387235.

Full text
Abstract:
The core principles of the Queensland Ambulance Service (QAS) that are founded on improving the health and well-being of all persons have remained relatively stable since 1892. This is despite changes in organisational structure, policies, protocols and procedures employed by operational paramedics. The primary scope of QAS operations is focused on the pre-hospital aspects of the health care continuum and has seen changes over time, with particularly rapid changes in the last two years to the content and nature of paramedic clinical practice. Timely and appropriate pain management in the pre-hospital environment is paramount to effective patient care. It is readily identified as a priority within the paramedic profession. Numerous studies have identified many factors that hinder the delivery of adequate pain management to patients with pain. A comprehensive review of the literature related to prehospital pain management, education and barriers to pain management has been conducted. This thesis has attempted to identify if educational programs improved knowledge and changed clinical behaviour, specifically patient care interventions and patient health outcomes. This information is valuable to those who develop clinical standards and education for ambulance services. As a result, this information could be used to help design programs that better meet the educational needs of paramedics and ultimately the needs of their patients and the community. The literature did not sufficiently identify the influences on clinical behaviour other than knowledge, so from this outcome it was identified that future studies must examine a theoretical model that can be used to assess paramedics’ intention to administer morphine to patients experiencing pain. 
The Theory of Planned Behaviour (TPB) was identified as an effective model for analysing paramedic behavioural intention; it was recognised that this theory might help to identify and better understand the constructs of attitudes, social norms and behavioural control beliefs that influence paramedics' intention to administer opioids to patients with pain. The purpose of this study was to analyse the ability of the direct measures of the TPB model to mediate factors influencing ambulance paramedics' intention to administer morphine to patients with pain. Participants in this study were Advanced Care and Intensive Care Paramedics who were deemed competent in morphine administration through the education division of the Queensland Ambulance Service. Data were collected by means of a questionnaire that used the constructs of the TPB, including subjective norm, perceived behavioural control and attitude. While participants reported strong intentions to administer morphine, they also reported negative attitudes towards the behaviour (morphine administration). The constructs of the TPB explained 26 per cent of the variance in intention to administer morphine, with subjective norm being the strongest significant predictor. The findings related to specific attitudes and normative pressures provide insight into paramedics' pain management behaviour. This research may be the first step in identifying whether concepts taught in the classroom are being transferred to the clinical setting. Findings from this study could be used to improve organisational awareness of factors that contribute to the future education and professional development of QAS paramedics.
APA, Harvard, Vancouver, ISO, and other styles
25

(5930021), Paulami Majumdar. "Density Functional Theory Investigations of Metal/Oxide Interfaces and Transition Metal Catalysts." Thesis, 2021.

Find full text
Abstract:
One of the most important advances in modern theoretical surface science and catalysis research has been the advent of ab initio Density Functional Theory (DFT). Based on the electronic structure formulation of Pierre Hohenberg, Walter Kohn and Lu Jeu Sham, DFT has revolutionized theoretical research in heterogeneous catalysis, electrocatalysis, batteries, as well as homogeneous catalysis using first-principles electronic structure simulations. Combined with statistical mechanics, kinetic theory, and experimental inputs, DFT provides a powerful technique for investigating surface structure, reaction mechanisms, understanding underlying reactivity trends, and using them for rational and predictive design of materials for various catalytic chemistries, including those that can propel us towards a clean energy future: for example, water gas shift (WGS), methanol synthesis, oxidation reactions, and CO2 electroreduction, among many others. Fueled by advances in supercomputing facilities, numerous early and current DFT studies have been primarily focused on idealized simulations aimed at obtaining qualitative insights into experimental observations. However, as the immense potential of DFT has been unfolding, the demand for closer representation of realistic catalytic situations has rapidly emerged, and with it, the recognition of the need to reduce the disparity between theoretical DFT structures and real catalytic environments. Bridging this ‘materials gap’ necessitates using more rigorous catalyst structures in DFT calculations that can capture realistic experimental geometries while, at the same time, being creatively simplified to be computationally tractable. This thesis is a compilation of several projects on metals and metal/oxide systems that have been undertaken using DFT, in collaboration with experimental colleagues, with the goal of addressing some of the challenges in heterogeneous catalysis, while decreasing the ‘materials gap’ between theory and experiments.

The first several chapters of this thesis focus on bifunctional, metal/oxide systems. These systems are quintessential in numerous heterogeneous catalysis applications and have been the subject of extensive study. More interestingly, they sometimes exhibit synergistic enhancement in rates that is greater than the sum of the individual rates on the metal (on an inert support) or on the oxide in isolation. Such bifunctionality often stems from the modified properties at the nanoscale interface between the metal and the oxide and is an active field of research. In particular, while a large body of literature exists that investigates the activity of metals, the role of the support in bifunctional systems is often uncertain and is the subject of investigation of the first few chapters of this thesis. We chose to study WGS on Au as support effects are particularly prominent on this system. The second chapter examines WGS on Au/ZnO, where realistic catalytic environment at the interface is reproduced by analyzing the thermodynamics of surface hydroxylation of the oxide under reaction conditions, and its effect on WGS kinetics is quantified through a microkinetic analysis. This study highlights the importance of considering spectator species which can drastically influence the energetics and kinetics of a reaction at a metal/oxide interface. In addition, fundamental aspects of the effect of surface hydroxyls on the electronic structure at the interface is also discussed.

The third chapter of the thesis builds on this theme and analyzes the effect of systematic perturbation of electronic structure at the interface through substitutional doping of the oxide. Chapters 3 and 4 focus on Au/MgO, a system which has been previously studied in extensive detail in our group and benchmarked through experiments. The effects of a series of dopants of varying electronic valences have been analyzed on a number of properties at the interface: vacancy formation energies, adsorption energies of intermediates, scaling properties, activation energy barriers, and so on. Exciting new scaling relationships are identified at this interface, having properties different from those observed on extended surfaces, and are interpreted using an electrostatic model. In the subsequent chapter, we identified Brønsted-Evans-Polanyi relationships for the different steps in the WGS pathway for a series of dopants. Coupled with the scaling relations, these trends were then used in conjunction with a dual-site microkinetic model to perform a volcano analysis for interfacial rates. Our analysis thus builds, for the first time, a rational design paradigm for electronic structure perturbation of the support at a bifunctional interface. The next chapter further investigates support effects, both geometric and electronic, in greater detail for Au supported on a series of oxide supports, and discusses accelerated identification of an activity descriptor through a close fusion between computations and experiments.

In addition to interfacial effects of the support, this thesis also briefly examines a more apparent role of the oxide, wherein it influences the geometry of the supported metal. Two different Au-based systems are investigated using surface science approaches in Chapter 6: the segregation properties of a bimetallic Au/Ir alloy on anatase and the wetting behavior of Au-FexOy heterodimers, both of which are representative of the structural evolution of a supported catalyst under reaction conditions. Through our analysis, we show that the oxide directly influences these behaviors of the supported metal.

The next few chapters explore catalysis on metallic systems, focusing on transition metals, an important class of materials in heterogeneous catalysis that constitutes the major body of DFT literature for trend-based catalytic analyses. A crucial factor in the success of such high-throughput screening studies was the identification of linear scaling relationships on transition metals, whereby the adsorption energy of complex molecular fragments is linearly related to that of simple atomic adsorbates. However, while these relationships are valid at low adsorbate coverages, at the higher, catalytically relevant coverages deviations from linearity are common, presenting a materials gap in volcano analyses. The incorporation of coverage effects into scaling relations has therefore been a pressing challenge. This thesis describes a simple means of systematically capturing changes in reaction energies due to coverage effects through a pairwise interaction model, where the changes in adsorption energies are shown to be a direct function of the number of neighbors and of interaction parameters determined through DFT. In addition, we draw a mathematical correspondence between scaling relations at high coverage and those at low coverage and discuss its implications for the existence of linear scaling relations.
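The pairwise interaction idea described above can be sketched in a few lines: the adsorption energy at finite coverage is the low-coverage value plus a per-neighbor penalty. Both parameters below are hypothetical illustrations, not interaction parameters from the thesis:

```python
# Minimal sketch of a pairwise lateral-interaction model: the adsorption
# energy at finite coverage is the low-coverage value plus a repulsive
# contribution per occupied neighbor site (hypothetical parameters).
E_ADS_LOW = -1.2     # adsorption energy in the low-coverage limit (eV)
EPS_PAIR = 0.15      # repulsive nearest-neighbor interaction (eV per pair)

def e_ads(n_neighbors):
    """Adsorption energy as a direct function of occupied neighbor sites."""
    return E_ADS_LOW + EPS_PAIR * n_neighbors

# On a close-packed (111) surface at full coverage each adsorbate has six
# occupied neighbors, so binding weakens monotonically with coverage:
coverage_series = [round(e_ads(n), 2) for n in range(7)]
```

In such a model the coverage correction enters linearly, which is what preserves the form of the scaling relations at high coverage and underlies the low-coverage/high-coverage correspondence mentioned in the abstract.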

In Chapter 8, we discuss collaborative work on Pt-based catalysts, active catalysts for many chemical and electrochemical systems. We explore trends in WGS on bimetallic Pt-M systems and identify an activity descriptor by correlating experimental rates with the binding strength of OH* on model surfaces of bimetallic alloys. In addition, we investigate the interaction between the Na promoter and Pt under reaction conditions, using an inverse oxide model, to obtain insights into the nature of the promotion of WGS on Pt catalysts by alkali metals.
APA, Harvard, Vancouver, ISO, and other styles
26

(9828500), Kylie Radel. "The Dreamtime Cultural Centre: A grounded theory of doing business in an Indigenous tourism enterprise." Thesis, 2010. https://figshare.com/articles/thesis/The_Dreamtime_Cultural_Centre_A_grounded_theory_of_doing_business_in_an_Indigenous_tourism_enterprise/13459751.

Full text
Abstract:
The Dreamtime Cultural Centre in Central Queensland began operation in Australia's Bicentennial year, 1988. Over the 22 years of its operation, the enterprise has grown to provide a range of services and facilities including the original cultural tours, three venues for conference facilities, a 31-room motel, restaurant and conference centre, a SkillShare Hospitality Training Centre, kiosk, Bimbi Artefact Shop and a theatrette. The grounded theory model accounting for the success of the Dreamtime Cultural Centre demonstrates complex and highly interdependent processes. As an Indigenous tourism enterprise, the Centre is structured to reflect and reinforce the integration of social structures and processes that accommodate the dynamics and relationships arising from the cultural mores, traditions and expectations of a complex community. The success factors incorporate adaptations of Western organisational principles and concepts characteristic of small-to-medium enterprises, overlaid on Indigenous foundations of family structures, kinship roles, traditions, cultures and heritages. There are two critical factors that underpin and drive success for the enterprise: the constructs of the 'Family' and the 'Entrepreneur'. The core of the enterprise is grounded in the Family relationships, connectedness and kinship frameworks which reflect Indigenous Australians' ways of being and doing. These frameworks are reflected in the organisational structure, organisational culture and rules guiding practices and processes respectively.
APA, Harvard, Vancouver, ISO, and other styles
27

(9804593), Pamela Hogan. "Registered nurse understanding of organisational commitment and its link to retention: A grounded theory study." Thesis, 2012. https://figshare.com/articles/thesis/Registered_nurse_understanding_of_organisational_commitment_and_its_link_to_retention_A_grounded_theory_study/13464893.

Full text
Abstract:
"Destabilisation of the nursing workforce due to poor retention creates inconsistencies and disruptions to the delivery of health care services. It can also have a negative impact on patient care and safety. If registered nurses remain in their jobs then hospitals and the health care system will realise significant savings in costs associated with replacing registered nurses. The impact of the nursing shortage is that health care facilities will continue to have difficulty replacing registered nurses once they have left. Focusing on nurse retention rather than on recruitment, may be a useful strategy to address the nursing shortage. Organisational commitment as a construct in workforce research has been related both negatively to turnover intentions and positively related to retention amongst employees. This construct was applied to this research which used a Grounded Theory methodology to examine how registered nurses understand organisational commitment and its link to retention. The registered nurse participant group came from acute care hospitals in Australia. The findings of this research are posited privileging the voices of the participants. Results add to the existing body of knowledge and are able to be explained and supported by existing literature within the field. The purposive sample group contributed to this study by participating in semi-structured in-depth interviews in which they described and discussed their commitment and their experiences related to workplace commitment and its link to retention. The main finding of this study was that the registered nurse participants understood organisational commitment to be at the ‘local’ level. That is, being committed to their work unit, to their nursing practice within the work unit and to the patients within the work unit. The strength of the participants’ organisational commitment, and hence their retention, was influenced positively or negatively by the management behaviours of their Nurse Managers. 
These findings formed the substantive theory of how registered nurses understand organisational commitment and its link to retention"--Abstract.
APA, Harvard, Vancouver, ISO, and other styles
28

(11178675), Reza Soltani. "COLLISION AVOIDANCE FOR AUTOMATED VEHICLES USING OCCUPANCY GRID MAP AND BELIEF THEORY." Thesis, 2021.

Find full text
Abstract:
This thesis discusses occupancy grid maps, collision avoidance systems and belief theory, and proposes some of the latest and most effective methods, such as a predictive occupancy grid map, a risk-evaluation model, and the role of the OGM in belief-function theory, with an approach to decision uncertainty based on environment perception and the degree of belief in the acceptability of driving commands. Finally, it discusses how the proposed models mitigate or prevent collisions.
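As a rough illustration of how belief-function (Dempster-Shafer) theory can fuse uncertain evidence about a single occupancy-grid cell, the following sketch combines two mass functions with Dempster's rule. The sensor mass values are invented for illustration; the thesis's actual models are more elaborate:

```python
def combine(m1, m2):
    """Dempster's rule of combination on the frame {O (occupied), F (free)}.
    Each mass function is a tuple (m_O, m_F, m_unknown) summing to 1."""
    o1, f1, u1 = m1
    o2, f2, u2 = m2
    conflict = o1 * f2 + f1 * o2          # mass assigned to contradictions
    k = 1.0 - conflict                    # normalization constant
    o = (o1 * o2 + o1 * u2 + u1 * o2) / k
    f = (f1 * f2 + f1 * u2 + u1 * f2) / k
    return (o, f, 1.0 - o - f)

# Two range readings, each only weakly supporting "occupied" for one cell:
cell = combine((0.6, 0.1, 0.3), (0.5, 0.2, 0.3))
```

Agreeing sources reinforce each other (the combined belief in "occupied" exceeds either input), while the explicit "unknown" mass is what lets the framework represent perception uncertainty rather than forcing a binary occupied/free decision.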
APA, Harvard, Vancouver, ISO, and other styles
29

(9838145), Nadia O'Connell. "Exploring the role of relationship marketing between universities and education agents: A case study analysis." Thesis, 2012. https://figshare.com/articles/thesis/Exploring_the_role_of_relationship_marketing_between_universities_and_education_agents_A_case_study_analysis/13457729.

Full text
Abstract:
"The purpose of this study is to understand the role of relationship marketing and agency theory control mechanisms in the recruitment of international students through education agents to Australian universities ... A multi-case embedded design was utilised with a sample consisting of in-depth interviews with 31 international marketing directors and marketing staff from eight Australian universities and on the other side of the dyad, interviews with 16 education agent managers and staff from 12 education agencies"--Abstract.
APA, Harvard, Vancouver, ISO, and other styles
30

(8082827), Kuang-Chung Wang. "METHOD DEVELOPMENT IN THE NEGF FRAMEWORK: MAXIMALLY LOCALIZED WANNIER FUNCTION AND BÜTTIKER PROBE FOR MULTI-PARTICLE INTERACTION." Thesis, 2019.

Find full text
Abstract:
This work involves the implementation and application of two new methods in the quantum transport community for nanoscale electronic devices.

First method: Ab-initio Tight-Binding (TB)
With the emergence of novel 2D materials, layers can be stacked freely on top of each other with atomic precision, bound by van der Waals forces. New devices with unique characteristics will need theoretical guidance. The empirical tight-binding method is known to have difficulty accurately representing the Hamiltonians of 2D materials. Maximally localized Wannier functions (MLWF), constructed directly from ab-initio calculations, provide an efficient and accurate method for basis construction. Together with NEGF, device calculations can be conducted. The implementation of MLWF in NEMO5 and its application to a 2D MOS structure to demystify interlayer coupling are addressed.
Second method: Büttiker-probe Recombination/Generation (RG) method

The non-equilibrium Green's function (NEGF) method is capable of nanodevice performance predictions including coherent and incoherent effects. However, treating incoherent scattering, carrier generation, and recombination is computationally very expensive. In this work, the numerically efficient Büttiker-probe model is expanded to cover recombination and generation effects in addition to various incoherent scattering processes. The capability of the new method to predict nanodevices is exemplified with quantum-well III-N light-emitting diodes and a photodetector. Comparison is made with the state-of-the-art drift-diffusion method. Agreements are found that justify the method, and disagreements are identified and attributed to quantum effects.

The two methods are individually developed and then utilized together to study the BP/MoS2 interface. In this vertical 2D device, an anti-ambipolar (AAP) I-V curve has been identified experimentally, with differing explanations in the current literature. An atomistic simulation is performed with a basis generated from density functional theory. The recombination process is included and is able to explain the experimental findings and to provide insights into 2D interface devices.

APA, Harvard, Vancouver, ISO, and other styles
31

(9165011), Salar Safarkhani. "GAME-THEORETIC MODELING OF MULTI-AGENT SYSTEMS: APPLICATIONS IN SYSTEMS ENGINEERING AND ACQUISITION PROCESSES." Thesis, 2020.

Find full text
Abstract:

The process of acquiring large-scale complex systems is usually characterized by cost and schedule overruns. To investigate the causes of this problem, we may view the acquisition of a complex system at several different time scales. At finer time scales, one may study different stages of the acquisition process, from the intricate details of the entire systems engineering process to communication between design teams to how individual designers solve problems. At the largest time scale, one may consider the acquisition process as a series of actions: request for bids, bidding and auctioning, contracting, and finally building and deploying the system, without resolving the fine details that occur within each step. In this work, we study acquisition processes at multiple scales. First, we develop a game-theoretic model for the engineering of systems in the building and deploying stage. We model the interactions among the systems and subsystem engineers as a principal-agent problem. We develop a one-shot shallow systems engineering process and obtain the optimal transfer functions that best incentivize the subsystem engineers to maximize the expected system-level utility. The core of the principal-agent model is the quality function, which maps the effort of the agent to the performance (quality) of the system. We therefore build the stochastic quality function by modeling the design process as a sequential decision-making problem. Second, we develop and evaluate a model of the acquisition process that accounts for the strategic behavior of the different parties. We cast our model in terms of government-funded projects and assume the following steps. First, the government publishes a request for bids. Then, private firms offer their proposals in a bidding process, and the winning bidder enters into a contract with the government. The contract describes the system requirements and the corresponding monetary transfers for meeting them.
The winning firm devotes effort to deliver a system that fulfills the requirements. This can be viewed as a game that the government plays with the bidding firms. We study how different parameters in the acquisition procedure affect the bidders' behaviors and, therefore, the utility of the government. Using reinforcement learning, we seek to learn the optimal policies of the actors involved in this game. In particular, we study how the requirements, contract types such as cost-plus and incentive-based contracts, the number of bidders, problem complexity, etc., affect the acquisition procedure. Furthermore, we study the bidding strategy of the private firms and how the contract types affect their strategic behavior.
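One of the effects mentioned above, how the number of bidders shapes the government's utility, can be sketched with a toy sealed-bid procurement auction. The cost distribution, fixed markup, and bidding rule are all simplifying assumptions for illustration; the thesis studies richer strategic behavior with reinforcement learning:

```python
import random

random.seed(0)

def procurement_price(n_bidders, margin):
    """Toy sealed-bid procurement auction: each firm draws a private cost,
    bids cost * (1 + margin), and the government pays the lowest bid."""
    costs = (random.uniform(0.8, 1.2) for _ in range(n_bidders))
    return min(c * (1.0 + margin) for c in costs)

def expected_price(n_bidders, trials=2000):
    """Monte Carlo estimate of the price the government expects to pay."""
    return sum(procurement_price(n_bidders, 0.2) for _ in range(trials)) / trials

# More bidders means more competition and a lower expected price:
duopoly, crowded = expected_price(2), expected_price(8)
```

Even this naive model reproduces the basic competitive effect; the interesting questions in the thesis concern how contract type and requirements change the firms' optimal margins rather than holding them fixed.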

APA, Harvard, Vancouver, ISO, and other styles
32

(10676241), Stacy Lynn Walker. "Connecting Through Communication: Scripts Enacting Three Theories." 2021.

Find full text
Abstract:
This creative non-thesis project covers three theories from communication studies: Uncertainty Reduction Theory, Cultivation Theory, and Cognitive Dissonance Theory. Each theoretical framework also includes a script written with the intent of filming in the future; those videos could be shown in communication classes. These three theories cover a breadth of knowledge in the field, as they pertain to interpersonal communication, media studies, and persuasion.
APA, Harvard, Vancouver, ISO, and other styles
33

(9875528), DP Irwin. "Evaluation of water quality monitoring networks using the concept of information entropy." Thesis, 1996. https://figshare.com/articles/thesis/Evaluation_of_water_quality_monitoring_networks_using_the_concept_of_information_entropy/13425992.

Full text
Abstract:
This thesis demonstrates the procedure and flexibility of using an information theory approach to evaluate the performance of water quality monitoring networks, with specific application to the Isaac River water quality monitoring network in Central Queensland. Continued population growth, improved standards of living, and an increasing concern for the environment require careful management strategies to ensure the ecologically sustainable development and future availability of water resources. Water quality monitoring is an integral component of this management process. A water quality monitoring network is a monitoring system which provides management with the status of water quality conditions at a variety of locations and instances of time. While a range of significant challenges exists within the effective and efficient design of the monitoring system, one of the most critical is the need to select representative temporal and spatial sampling frequencies which neither replicate nor compromise the 'informativeness' and effectiveness of the monitoring. Recent research has employed the information entropy concepts of transferred information and loss of information content to measure the representativeness and 'informativeness' of water quality monitoring networks. On the basis of a review of the principles and theory which underpin these concepts in their application to performance evaluation, an improved methodology based on entropy theory was developed and applied to a typical Central Queensland water quality monitoring network. Success was particularly achieved in identifying the optimal selections of water quality variables and monitoring station locations.
Other significant results from the analysis include the recognition of relationships between the transfer function and the criterion of transferred information, evidence of a 'priority order' for the removal of variables from a monitoring programme, and a relationship between the covariance matrix determinant and the class interval size which may alleviate the problem of calculating unstable joint entropies.
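The transferred-information criterion at the core of such entropy-based network evaluation can be sketched for two stations with discretized records. The class-interval indices below are invented for illustration; the thesis applies the idea to real Isaac River data:

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a discretized record."""
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

def transinformation(x, y):
    """Information one station's record carries about another's:
    T(X;Y) = H(X) + H(Y) - H(X,Y). Redundant stations score high."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Hypothetical class-interval indices for two monitoring stations:
station_a = [0, 1, 1, 2, 0, 2, 1, 0]
station_b = [0, 1, 1, 2, 0, 2, 1, 1]   # nearly redundant with station_a
t_ab = transinformation(station_a, station_b)
```

A station whose record is largely recoverable from another's (high transinformation) is a candidate for removal, which is the logic behind optimizing variable selections and station locations in the network.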
APA, Harvard, Vancouver, ISO, and other styles
34

(6632552), Heather A. W. Cann. "BEYOND THE CLIMATE SCIENCE WARS: ELITE FRAMING AND CLIMATE CHANGE POLICY CONFLICT." Thesis, 2019.

Find full text
Abstract:
Stakeholders involved in debates around climate-energy policy shape public conversations through different “frames”: message units that strategically emphasize particular aspects of an issue while downplaying others. I investigate the presence of frames within climate change discourse and their political influence in the creation of climate-energy policies. Findings suggest that science frames may play a limited role when it comes to the development of actual climate policy at the state level, and importantly, that the strategic use of issue frames was able to level the playing field between environmental advocates and historically dominant industry actors. This work thus contributes to ongoing debates in the climate change framing literature by considering the “real world” of political communication coupled with an on-the-ground policy conflict.
APA, Harvard, Vancouver, ISO, and other styles
35

(3436478), Brigid Lynch. "Implementing skin cancer screening clinics in a rural community: A case study of diffusion theory." Thesis, 2001. https://figshare.com/articles/thesis/Implementing_skin_cancer_screening_clinics_in_a_rural_community_A_case_study_of_diffusion_theory/20022704.

Full text
Abstract:

Skin cancer screening clinics were introduced into a number of towns throughout Queensland as part of the Melanoma Screening Trial (MST), a study investigating the efficacy of screening for melanoma. The MST requires 60% of these towns' populations aged over 30 years to be screened for melanoma within a three year intervention phase. The aim of this case study is to assess the relationship between Rogers' (1995) diffusion of innovations and the health promotion strategies implemented to encourage attendance at skin cancer screening clinics.

Data were obtained from a number of sources, including administrative files, progress reports, interviews and focus groups, and were positioned within a comparative theory/practice matrix. Pattern matching logic was used to assess the relationship between the health promotion strategies and the theoretical construct of diffusion of innovations.

All components of diffusion of innovations (Rogers, 1995) were addressed by the health promotion strategies encouraging attendance at the skin cancer screening clinics. The delivery of the skin cancer screening clinics was in accordance with principles identified by past diffusion research. The skin cancer screening clinics conformed to most predictors of diffusion success and were delivered within a "real" environment, as suggested by past community-based interventions. A number of changes to existing health promotion strategies and the addition of some new strategies have been suggested to improve the rate of diffusion of skin cancer screening clinics in the future.

APA, Harvard, Vancouver, ISO, and other styles
