Dissertations / Theses on the topic '080699 Information Systems not elsewhere classified'


Consult the top 24 dissertations / theses for your research on the topic '080699 Information Systems not elsewhere classified.'


1

Herzallah, Randa. "Exploiting uncertainty in nonlinear stochastic control problem." Thesis, Aston University, 2003. http://publications.aston.ac.uk/13267/.

Full text
Abstract:
This work introduces a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems which could also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty for the outputs of neural networks can be obtained using the statistical properties of networks. More generally, multicomponent distributions can be modelled by the mixture density network. Based on importance sampling from these distributions a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. Convergence of the output error for the proposed control method is verified by using a Lyapunov function. Several simulation examples are provided to demonstrate the efficiency of the developed control method. The manner in which such a method is extended to nonlinear multi-variable systems with different delays between the input-output pairs is considered and demonstrated through simulation examples.
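The importance-sampling idea described above can be sketched in a few lines. This is a hypothetical illustration, not the thesis's models: it assumes a one-dimensional, multi-valued toy plant and a two-component mixture that stands in for the inverse model's predicted density, then draws candidate controls only from the plausible modes and selects the best.

```python
import random

# Hedged sketch (not the thesis's implementation): use a predicted
# mixture density over candidate controls to constrain the search,
# sampling only from regions the inverse model considers plausible.

def sample_mixture(weights, means, stds, n, rng):
    """Draw n candidate controls from a 1-D Gaussian mixture."""
    out = []
    for _ in range(n):
        k = rng.choices(range(len(weights)), weights=weights)[0]
        out.append(rng.gauss(means[k], stds[k]))
    return out

def best_control(candidates, plant, target):
    """Pick the sampled control whose predicted output is closest to target."""
    return min(candidates, key=lambda u: abs(plant(u) - target))

rng = random.Random(1)
# Toy multi-valued plant: y = u**2 has two valid controls for y = 4.
plant = lambda u: u * u
# Suppose the mixture density network predicts two modes near +/-2:
cands = sample_mixture([0.5, 0.5], [-2.0, 2.0], [0.3, 0.3], 200, rng)
u = best_control(cands, plant, target=4.0)
# |u| should be close to 2, whichever mode the best sample came from
```

For a multi-valued plant such as y = u², both modes yield valid controls; sampling from the predicted mixture keeps the search confined to those regions instead of the whole control space.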
APA, Harvard, Vancouver, ISO, and other styles
2

Woon, Wei Lee. "Analysis of magnetoencephalographic data as a nonlinear dynamical system." Thesis, Aston University, 2002. http://publications.aston.ac.uk/13266/.

Full text
Abstract:
This thesis presents the results of an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, its progress hindered by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear, despite a variety of reasons to suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. 
One of the main objectives of this thesis will be to prove that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high costs and low portability of state-of-the-art multichannel machines. The result of this is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data. 
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
3

Ingram, Benjamin R. "Pragmatic algorithms for implementing geostatistics with large datasets." Thesis, Aston University, 2008. http://publications.aston.ac.uk/13265/.

Full text
Abstract:
With the ability to collect and store increasingly large datasets on modern computers comes the need to process the data in a way that is useful to a geostatistician or application scientist. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively, for likelihood-based geostatistics. Various methods have been proposed, and are extensively used, in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to produce techniques for further improving computational efficiency.
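To make the scaling argument concrete, here is a minimal back-of-the-envelope sketch (not from the thesis): the dense covariance matrix of a likelihood-based model has n² entries, and factorising it costs on the order of n³ operations.

```python
# Illustrative cost model for dense likelihood-based geostatistics:
# memory grows with the n x n covariance matrix, and run time with
# the ~n^3 cost of factorising it (e.g. by Cholesky decomposition).

def dense_covariance_cost(n):
    """Return (matrix_entries, factorisation_ops) for n observations."""
    return n * n, n ** 3

small = dense_covariance_cost(1_000)
large = dense_covariance_cost(10_000)

# A 10x larger dataset needs 100x the memory and 1000x the compute,
# which is what motivates reduced-complexity and sparse approximations.
ratio_mem = large[0] / small[0]    # 100.0
ratio_time = large[1] / small[1]   # 1000.0
```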
4

Sun, Yi. "Non-linear hierarchical visualisation." Thesis, Aston University, 2002. http://publications.aston.ac.uk/13263/.

Full text
Abstract:
This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation for it was the lack of viable approaches for analysing High Throughput Screening datasets, which may include thousands of data points of high dimension. We believe that a latent trait model (LTM) with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can infer the distribution of the data from magnification factor and curvature plots. Rather than obtaining useful information from just a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found. The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E-step and the M-step. Another problem that can occur when visualising a large data set is that there may be significant overlaps of data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which helps the user decide the optimal structure of the model.
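The E-step/M-step alternation mentioned above can be illustrated with a toy example. This is a generic EM sketch for a two-component 1-D Gaussian mixture with fixed unit variances, not the thesis's latent trait model; the data and initial means are invented for illustration.

```python
import math
import random

# Minimal EM sketch: alternate between computing responsibilities
# (E-step) and re-estimating component means from responsibility-
# weighted data (M-step), for a 1-D mixture of two unit-variance
# Gaussians with equal mixing weights.

def em_two_gaussians(data, mu, iters=50):
    """mu: initial [mu0, mu1]; returns the refined component means."""
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        resp = []
        for x in data:
            p0 = math.exp(-0.5 * (x - mu[0]) ** 2)
            p1 = math.exp(-0.5 * (x - mu[1]) ** 2)
            resp.append(p1 / (p0 + p1))
        # M-step: re-estimate means from responsibility-weighted data
        w1 = sum(resp)
        w0 = len(data) - w1
        mu = [sum((1 - r) * x for r, x in zip(resp, data)) / w0,
              sum(r * x for r, x in zip(resp, data)) / w1]
    return mu

random.seed(0)
data = ([random.gauss(-2, 1) for _ in range(200)] +
        [random.gauss(3, 1) for _ in range(200)])
mu = em_two_gaussians(data, [-1.0, 1.0])
# the means should approach the generating centres -2 and 3
```

Each pass performs exactly the alternation described in the abstract; fitting each LTM in the hierarchy applies the same scheme at much larger scale and with a non-linear latent-to-data mapping.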
5

Lesch, Ragnar H. "Modelling nonlinear stochastic dynamics in financial time series." Thesis, Aston University, 2000. http://publications.aston.ac.uk/13260/.

Full text
Abstract:
For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic and prices therefore follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the hypothesis. Unfortunately, these methods are not well suited to data characterised in part by both paradigms. This thesis investigates these two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non- and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches can be combined to achieve a more realistic model for financial time series.
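As a hypothetical illustration of the deterministic framework (not the thesis's code), characterising the dimensionality of "embedded" data usually starts from a delay embedding, which turns a scalar series into state-space vectors; the toy price series below is invented.

```python
# Delay embedding: reconstruct a state space from a scalar series by
# stacking lagged copies, x_t -> [x[t], x[t-lag], ..., x[t-(dim-1)*lag]].
# Dimensionality estimates are then computed on these vectors.

def delay_embed(series, dim, lag):
    """Return the list of delay vectors for the given dimension and lag."""
    start = (dim - 1) * lag
    return [[series[t - i * lag] for i in range(dim)]
            for t in range(start, len(series))]

prices = [100.0, 100.5, 99.8, 101.2, 100.9, 101.7, 102.0, 101.4]
vectors = delay_embed(prices, dim=3, lag=2)
# the first vector pairs x[4] with x[2] and x[0]
```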
6

Harries, Alun M. "Investigating viscous fluid flow in an internal mixer using computational fluid dynamics." Thesis, Aston University, 2000. http://publications.aston.ac.uk/13261/.

Full text
Abstract:
This thesis presents an effective methodology for the generation of a simulation which can be used to increase the understanding of viscous fluid processing equipment and aid in its development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin rotor mixer has been simulated with a view to establishing model accuracies, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of rotor configurations using the commercial code Polyflow, it was evident that the model was of some worth and its predictions are in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry, and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts to model internal mixers by other researchers, and it was clear that there was no generally accepted methodology to provide a practical three-dimensional model which has been adequately validated. This research, unlike others, presents a full complex three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip which overcomes these limitations using unmatched gridding and sliding-mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and improves accuracy for investigating equipment design and operating conditions. 
Model validation has been identified as an area which has been neglected by other researchers in this field, especially for time dependent geometries, and has been rigorously pursued in terms of qualitative and quantitative velocity vector analysis of the isothermal, full fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.
7

Bounkong, Stephane. "Digital image watermarking." Thesis, Aston University, 2004. http://publications.aston.ac.uk/13264/.

Full text
Abstract:
In recent years, interest in digital watermarking has grown significantly. Indeed, the use of digital watermarking techniques is seen as a promising means to protect intellectual property rights of digital data and to ensure its authentication. Thus, a significant research effort has been devoted to the study of practical watermarking systems, in particular for digital images. In this thesis, a practical and principled approach to the problem is adopted. Several aspects of practical watermarking schemes are investigated. First, a power-constraint formulation of the problem is presented. Then, a new analysis of quantisation effects on the information rate of digital watermarking schemes is proposed and compared to other approaches suggested in the literature. Subsequently, a new information embedding technique, based on quantisation, is put forward and its performance evaluated. Finally, the influence of image data representation on the performance of practical schemes is studied, along with a new representation based on independent component analysis.
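A common quantisation-based embedding scheme, quantisation index modulation (QIM), illustrates the general idea of information embedding by quantisation; the step size and coset layout below are generic textbook choices, not necessarily the scheme proposed in the thesis.

```python
# QIM sketch: embed one bit into a real-valued image coefficient by
# quantising it onto one of two interleaved lattices (cosets); detect
# by finding the nearest coset.

STEP = 8.0  # quantiser step: larger = more robust, more distortion

def embed(coefficient, bit):
    """Quantise the coefficient onto the coset carrying `bit`."""
    offset = 0.0 if bit == 0 else STEP / 2
    return round((coefficient - offset) / STEP) * STEP + offset

def detect(received):
    """Recover the bit by finding the nearest coset point."""
    d0 = abs(received - embed(received, 0))
    d1 = abs(received - embed(received, 1))
    return 0 if d0 <= d1 else 1

marked = embed(23.7, 1)
assert detect(marked) == 1
# robust to perturbations smaller than STEP / 4:
assert detect(marked + 1.5) == 1
```

Larger STEP tolerates more channel noise (up to STEP/4) at the cost of more embedding distortion, which is the kind of power/rate trade-off the power-constraint formulation above makes precise.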
8

Alford, Philip. "A communicative model for stakeholder consultation : towards a framework for action inquiry in tourism I.T." Thesis, University of Bedfordshire, 2007. http://hdl.handle.net/10547/561263.

Full text
Abstract:
This thesis focuses on an under-researched area of tourism: the multi-stakeholder, inter-organisational, business-to-business Tourism IT domain, which exhibits a marked rate of failure. A critical review of B2B case studies reveals that this failure is in large part due to the primacy afforded to technical problem-solving approaches over human-centred ones. The main purpose of the research is therefore stated as: "how do we ensure that, as technological solutions are implemented within this domain, due consideration is given to human-centred issues?" In order to tackle this research problem, an interdisciplinary approach is taken and a communicative model for stakeholder consultation is developed. At the centre of the model lies an innovative method for deconstructing and reconstructing stakeholder discourse. A Co-operative Inquiry research methodology was used, and a significant number of stakeholders were engaged in an Open Space event sponsored by two major Tourism IT companies who wanted to investigate the issues and opportunities connected with travel distribution and technology. This was followed up with face-to-face interviews and live discussions over the internet. In addition, stakeholder discourse was captured via the Travelmole tourism discussion site. The discourse between stakeholders was reconstructed and the normative and objective claims analysed in depth. The presentation of these reconstructions in textual, tabular and diagrammatic formats captures the complexity of stakeholder interactions, revealing that although IT is an important tool, what really lies at the core of multi-stakeholder projects are the normative positions to which participants subscribe. The model provided a practical means for critiquing stakeholder discourse: helping to identify stakeholders both involved in and affected by the issue; juxtaposing the 'is' against the 'ought'; and enabling critical reflection on the coercive use of power. 
The review of the tourism literature revealed that these issues are as important in general B2B tourism partnerships as in Tourism IT, and in this respect the model provides a practical tool for critique and for enabling the formation of a shared normative infrastructure on which multi-stakeholder projects can proceed. In addition, while borrowing from Management Science, this thesis also makes a contribution to it, specifically in the area of boundary critique, through the way in which Habermas' ideal speech criteria are practically implemented.
9

Blakey, Jeremy Peter. "Database training for novice end users : a design research approach : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Albany, New Zealand." Massey University, 2008. http://hdl.handle.net/10179/880.

Full text
Abstract:
Of all of the desktop software available, that for the implementation of a database is some of the most complex. With the increasing number of computer users having access to this sophisticated software, but with no obvious way to learn the rudiments of data modelling for the implementation of a database, there is a need for a simple, convenient method to improve their understanding. The research described in this thesis represents the first steps in the development of a tool to accomplish this improvement. In a preliminary study using empirical research a conceptual model was used to improve novice end users’ understanding of the relational concepts of data organisation and the use of a database software package. The results showed that no conclusions could be drawn about either the artefact used or the method of evaluation. Following the lead of researchers in the fields of both education and information systems, a design research process was developed, consisting of the construction and evaluation of a training artefact. A combination of design research and a design experiment was used in the main study described in this thesis. New to research in information systems, design research is a methodology or set of analytical techniques and perspectives, and this was used to develop a process (development of an artefact) and a product (the artefact itself). The artefact, once developed, needed to be evaluated for its effectiveness, and this was done using a design experiment. The experiment involved exposing the artefact to a small group of end users in a realistic setting and defining a process for the evaluation of the artefact. The artefact was the tool that would facilitate the improvement of the understanding of data modelling, the vital precursor to the development of a database. The research was conducted among a group of novice end users who were exposed to the artefact, facilitated by an independent person. 
In order to assess whether there was any improvement in the novices’ understanding of relational data modelling and database concepts, they then completed a post-test. Results confirmed that the artefact, trialled through one iteration, was successful in improving the understanding of these novice end users in the area of data modelling. The combination of design research and design experiment as described above gave rise to a new methodology, called experimental design research at this early juncture. The successful outcome of this research will lead to further iterations of the design research methodology, leading in turn to the further development of the artefact which will be both useful and accessible to novice users of personal computers and database software. This research has made the following original contributions. Firstly, the use of the design research methodology for the development of the artefact, which proved successful in improving novice users’ understanding of relational data structures. Secondly, the novel use of a design experiment in an information systems project, which was used to evaluate the success of the artefact. And finally, the combination of the developed artefact followed by its successful evaluation using a design experiment resulted in the hybrid experimental design research methodology. The success of the implementation of the experimental design research methodology in this information systems project shows much promise for its successful application to similar projects.
10

Gales, Mathis. "Collaborative map-exploration around large table-top displays: Designing a collaboration interface for the Rapid Analytics Interactive Scenario Explorer toolkit." Thesis, Ludwig-Maximilians-University Munich, 2018. https://eprints.qut.edu.au/115909/1/Master_Thesis_Mathis_Gales_final_opt.pdf.

Full text
Abstract:
Sense-making of spatial data on an urban level and large-scale decisions on new infrastructure projects need teamwork from experts with varied backgrounds. Technology can facilitate this collaboration process and magnify the effect of collective intelligence. Therefore, this work explores new useful collaboration interactions and visualizations for map-exploration software with a strong focus on usability. Additionally, for same-time and same-place group work, interactive table-top displays serve as a natural platform. Thus, the second aim of this project is to develop a user-friendly concept for integrating table-top displays with collaborative map-exploration. To achieve these goals, we continuously adapted the user-interface of the map-exploration software RAISE. We adopted a user-centred design approach and a simple iterative interaction design lifecycle model. Alternating between quick prototyping and user-testing phases, new design concepts were assessed and consequently improved or rejected. The necessary data was gathered through continuous dialogue with users and experts, a participatory design workshop, and a final observational study. Adopting a cross-device concept, our final prototype supports sharing information between a user’s personal device and table-top display(s). We found that this allows for a comfortable and practical separation between private and shared workspaces. The tool empowers users to share the current camera-position, data queries, and active layers between devices and with other users. We generalized further findings into a set of recommendations for designing user-friendly tools for collaborative map-exploration. The set includes recommendations regarding the sharing behaviour, the user-interface design, and the idea of playfulness in collaboration.
11

Mayer, Miriam. "Democratising the City: Technology as Enabler of Citizen-Led Urban Innovation." Thesis, Ludwig-Maximilians-Universität München, 2018. https://eprints.qut.edu.au/115908/1/Masterarbeit%20Miriam%20Mayer_final_opt.pdf.

Full text
Abstract:
This study deals with finding a way to enable citizen-led urban innovation through technology, concentrating on various aspects of controversial city developments. The literature concerning this topic is first reviewed, and current online systems designed for citizens to engage in city development decisions are explored. In addition, literature, approaches and systems related to conflict resolution are presented and discussed. By applying multiple design cycles, including several user studies, an online platform was developed on which citizens can collaboratively elaborate controversial ideas for the city. These design cycles focused first on finding a suitable process for elaborating on ideas and finding consent. The process implementing this was tested during two workshops that portrayed the procedure that would be realised on the platform. Findings after each workshop were used to revise the process. In order to design a user interface that could implement such a process, an expert focus group was first asked to brainstorm solutions for multiple design questions. Considering this input, two platform mock-ups were created and shown to participants to receive feedback. A final prototype of the online platform was then implemented and tested in a final user study. During this study, participants elaborated an idea together to test the whole resulting product, while being able to use the online platform in an in-the-wild setting. In spite of discovering how dependent the usage of the platform is on its users, the feedback received for the general idea of using an online platform to elaborate on ideas and find consent was overall positive.
12

Zito, Rocco. "The integration of GPS and GIS in transportation applications." 2002. http://arrow.unisa.edu.au:8081/1959.8/45754.

Full text
13

Calabretto, Jean-Pierre. "Supporting medication-related decision making with information model-based digital documents." 2007. http://arrow.unisa.edu.au:8081/1959.8/34051.

Full text
Abstract:
Medication is vital in treating chronic disease. Increasing use of medication, however, can lead to (potentially preventable) medication-related adverse events. Medication management offers a means of addressing such adverse events and pharmacists have an important role in this solution, especially in terms of reviews of patient medication. Improved availability and sharing of patient-related information are critical factors in medication management, so that providing access to this information becomes a major factor in effective medication reviews. Although clinical decision support tools can significantly assist doctors in accessing relevant point-of-care information for greater patient safety, it has proven difficult to ensure the availability and appropriate structure of patient-related information for such support tools. These information access and input problems are further exacerbated by a lack of existing research into suitable decision support solutions for pharmacists. This research project explored the suitability of an essential information model to support an electronic document solution to support clinical documentation and allow effective communication between pharmacists and doctors for medication reviews. The project investigated whether this approach could improve safety, quality and efficiency in the medication review process; as well as more generally identifying factors influencing development and uptake of document-based support tools in the Health sector. The project used a qualitative Design Research approach and iterated through three scenarios. The first, information-rich, hospital scenario developed an information model of essential medication management components, which underpinned the development of a digital document prototype implemented using XForms technology. 
In the second scenario, accredited pharmacists evaluated the digital document to enable refinement of the information model and its associated digital document for the broader community context. The third scenario involved field studies which evaluated the digital document (and thus the underlying information model) within the community, assessing its contribution to quality, safety and efficiency throughout the medication review process. The investigation identified a number of themes which guided the design and development of the prototype, and which appeared likely to have a broader impact on the successful uptake of decision support tools. Missing information proved to be a constant and serious problem for health professionals although, in this project, it also became a way of determining the value of an information element and thus its inclusion in the information model. Conversation played a significant role in the hospital environment in helping to meet pharmacists' information needs. Information granularity, the language of health professionals, and their time constraints were major factors influencing design. Health professionals' extensive use of their personal knowledge also suggested that decision support tools in this sector should be systems for experts rather than expert systems, i.e. the decision support tool and its users' personal knowledge should complement one another. The results of this proof-of-concept project suggest practice improvement in medication management is possible, with perceived improvements in safety, quality and efficiency of the medication management process. These benefits, however, now need to be affirmed in larger field studies. The contributions of this research are two-fold. Firstly, it is possible to develop a model of essential medication-related information which is succinct, relevant, and can be understood and shared by health professionals in conjunction with the individual's personal knowledge. 
Secondly, a document metaphor is a natural fit for health professionals when representing and communicating information. Expressing this metaphor as a digital document overcomes the main problems of paper-based documents, namely sharing and communication; and the dynamic properties of digital documents assist in decision-making.
14

(7042784), Mohammed S. Alyakoob. "The Economics of Geographic and Demographic Heterogeneity in Digitally Transformed Markets." Thesis, 2019.

Find full text
Abstract:
The digital transformation of markets can remove traditional geographic restrictions, democratising access to previously unattainable products, and enable individuals to extract rent from their personal assets. However, these digital innovations often have competitors and complementors that are not immune to the impact of local factors such as the local market structure, economic conditions, and even demographics. This dissertation examines the geographic and demographic heterogeneity-driven disparities in two digitally transformed markets, the financial and accommodation sectors respectively.

First, we study the impact of local financial market competition in managing online peer-to-peer loans. With the boom of financial technologies (FinTech), a critical question is whether the local financial market structure still matters. Unlike traditional retail financial institutions that are predominantly territorial, FinTech-based platforms, in particular peer-to-peer (P2P) lending, provide individuals equal access to funds by removing typical geographic restrictions. Combined with other benefits such as ease of use and lower interest rates, P2P lenders are increasingly threatening the traditional local lenders. A largely unanswered question in the literature is whether the local retail financial institutions strategically respond to the rise of such P2P platforms. Moreover, if the strategic reaction of traditional institutions continues the legacy of being territorial, borrowers will ultimately gain unevenly from the competition. That is, where a borrower lives may still matter. In this chapter, we devise multiple strategies to empirically analyze the extent and nature of the strategic response of traditional institutions to P2P lending. These include: (1) utilization of a Probit model that leverages the richness of our local market data and (2) exploitation of bank mergers as exogenous shocks to local market structure. We find consistently that a borrower from a more competitive market is more likely to prepay, suggesting that local market structure plays a pivotal role in P2P borrowers' debt management. We validate the underlying mechanism by studying the improving credit profiles of borrowers and platforms' (exogenous) changes in pricing in moderating the main effect. This mechanism reveals that traditional banks, especially when local market conditions permit, respond credibly to the growth of P2P lending and are successful in attracting consumers back to traditional financial products. 
Relatedly, we document heterogeneity in the benefits that borrowers gain from the local market structure (using a machine learning algorithm) and verify the robustness of our main findings. We discuss the implications for P2P lending, other crowd-based markets, and local retail financial markets.
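The probit specification described above can be sketched in a few lines. All coefficients and covariates below are hypothetical illustrations, not estimates from the dissertation:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF, the probit link function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def prepay_probability(intercept, beta_hhi, hhi, beta_rate, rate_gap):
    """P(prepay) = Phi(b0 + b1*HHI + b2*rate_gap) under a probit link.

    HHI proxies local market concentration (higher = less competitive);
    rate_gap is the borrower's P2P rate minus the best local bank offer.
    Both covariates and coefficients here are made up for illustration.
    """
    return norm_cdf(intercept + beta_hhi * hhi + beta_rate * rate_gap)

# A negative HHI coefficient encodes the finding that borrowers in more
# competitive (low-HHI) markets are more likely to prepay.
p_competitive = prepay_probability(-1.0, -0.8, hhi=0.10, beta_rate=0.5, rate_gap=1.2)
p_concentrated = prepay_probability(-1.0, -0.8, hhi=0.60, beta_rate=0.5, rate_gap=1.2)
print(p_competitive > p_concentrated)  # True
```

In practice the coefficients would be estimated by maximum likelihood on loan-level data; the sketch only illustrates the direction of the competition effect.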

Second, we examine the heterogeneous economic spillover effects of a home-sharing platform---Airbnb---on the growth of a complementary local service---restaurants. By circumventing traditional land-use regulation and providing access to underutilized inventory, Airbnb attracts visitors to vicinities of a city that are not traditional tourist destinations. Although visitors generally bring significant spending power, it is not clear whether they use Airbnb primarily for lodging and thus contribute little to the adjacent vicinity economy. To evaluate this, we focus on the impact of Airbnb on restaurant employment growth across vicinities in New York City (NYC). Our results indicate that if the intensity of Airbnb activity (Airbnb reviews per household) increases by 1%, restaurant employment in an average area grows by approximately 1.03%. We also investigate the role of demographics and market concentration in driving the variation. Notably, restaurants in areas with a relatively high number of Black residents do not benefit from the economic spillover of Airbnb activity. Also, restaurants in more competitive areas benefit most from this spillover. We validate the underlying mechanism behind the main result by evaluating the impact of Airbnb on Yelp visitor reviews -- areas with increasing Airbnb activity experience a surge in their share of NYC visitor reviews. This result is further validated by evaluating the impact of a unique neighborhood-level Airbnb policy recently implemented in New Orleans.
APA, Harvard, Vancouver, ISO, and other styles
15

(7525319), Megan M. Nyre-Yu. "Determining System Requirements for Human-Machine Integration in Cyber Security Incident Response." Thesis, 2019.

Find full text
Abstract:
In 2019, cyber security is considered one of the most significant threats to the global economy and national security. Top U.S. agencies have acknowledged this fact, and provided direction regarding strategic priorities and future initiatives within the domain. However, there is still a lack of basic understanding of factors that impact complexity, scope, and effectiveness of cyber defense efforts. Computer security incident response is the short-term process of detecting, identifying, mitigating, and resolving a potential security threat to a network. These activities are typically conducted in computer security incident response teams (CSIRTs) comprised of human analysts that are organized into hierarchical tiers and work closely with many different computational tools and programs. Despite the fact that CSIRTs often provide the first line of defense to a network, there is currently a substantial global skills shortage of analysts to fill open positions. Research and development efforts from educational and technological perspectives have been independently ineffective at addressing this shortage due to time lags in meeting demand and associated costs. This dissertation explored how to combine the two approaches by considering how human-centered research can inform development of computational solutions toward augmenting human analyst capabilities. The larger goal of combining these approaches is to effectively complement human expertise with technological capability to alleviate pressures from the skills shortage.

Insights and design recommendations for hybrid systems to advance the current state of security automation were developed through three studies. The first study was an ethnographic field study that focused on collecting and analyzing contextual data from three diverse CSIRTs from different sectors; the scope extended beyond individual incident response tasks to include aspects of organization and information sharing within teams. Analysis revealed larger design implications regarding collaboration and coordination in different team environments, as well as considerations about the usefulness and adoption of automation. The second study was a cognitive task analysis with CSIR experts with diverse backgrounds; the interviews focused on expertise requirements for information sharing tasks in CSIRTs. Outputs utilized a dimensional expertise construct to identify and prioritize potential expertise areas for augmentation with automated tools and features. The third study included a market analysis of current automation platforms based on the expertise areas identified in the second study, and used systems engineering methodologies to develop concepts and functional architectures for future system (and feature) development.

Findings of all three studies support future directions for hybrid automation development in CSIR by identifying social and organizational factors beyond traditional tool design in security that supports human-systems integration. Additionally, this dissertation delivered functional considerations for automated technology that can augment human capabilities in incident response; these functions support better information sharing between humans and between humans and technological systems. By pursuing human-systems integration in CSIR, research can help alleviate the skills shortage by identifying where automation can dynamically assist with information sharing and expertise development. Future research can expand upon the expertise framework developed for CSIR and extend the application of proposed augmenting functions in other domains.
16

Palmer, Kent D. "Emergent design : explorations in systems phenomenology in relation to ontology, hermeneutics and the meta-dialectics of design." 2009. http://arrow.unisa.edu.au:8081/1959.8/74458.

Full text
Abstract:
A Phenomenological Analysis of Emergent Design is performed based on the foundations of General Schemas Theory. The concept of Sign Engineering is explored in terms of Hermeneutics, Dialectics, and Ontology in order to define Emergent Systems and Meta-systems Engineering based on the concept of Meta-dialectics. Phenomenology, Ontology, Hermeneutics, and Dialectics will dominate our inquiry into the nature of the Emergent Design of the System and its inverse dual, the Meta-system. This is a speculative dissertation that attempts to produce a philosophical, mathematical, and theoretical view of the nature of Systems Engineering Design. Emergent System Design, i.e., the design of yet unheard of and/or hitherto non-existent Systems and Meta-systems, is the focus. This study is a frontal assault on the hard problem of explaining how Engineering produces new things, rather than a repetition or reordering of concepts that already exist. In this work the philosophies of E. Husserl, A. Gurwitsch, M. Heidegger, J. Derrida, G. Deleuze, A. Badiou, G. Hegel, I. Kant and other Continental Philosophers are brought to bear on different aspects of how new technological systems come into existence through the midwifery of Systems Engineering. Sign Engineering is singled out as the most important aspect of Systems Engineering. We will build on the work of Pieter Wisse and extend his theory of Sign Engineering to define Meta-dialectics in the form of Quadralectics and then Pentalectics. Along the way the various ontological levels of Being are explored in conjunction with the discovery that the Quadralectic is related to the possibility of design primarily at the Third Meta-level of Being, called Hyper Being. Design Process is dependent upon the emergent possibilities that appear in Hyper Being.
Hyper Being, termed by Heidegger as Being (Being crossed-out) and termed by Derrida as Differance, also appears as the widest space within the Design Field at the third meta-level of Being and therefore provides the most leverage that is needed to produce emergent effects. Hyper Being is where possibilities appear within our worldview. Possibility is necessary for emergent events to occur. Hyper Being possibilities are extended by Wild Being propensities to allow the embodiment of new things. We discuss how this philosophical background relates to meta-methods such as the Gurevich Abstract State Machine and the Wisse Metapattern methods, as well as real-time architectural design methods as described in the Integral Software Engineering Methodology. One aim of this research is to find the foundation for extending the ISEM methodology to become a general purpose Systems Design Methodology. Our purpose is also to bring these philosophical considerations into the practical realm by examining P. Bourdieu's ideas on the relationship between theoretical and practical reason and M. de Certeau's ideas on practice. The relationship between design and implementation is seen in terms of the Set/Mass conceptual opposition. General Schemas Theory is used as a way of critiquing the dependence of Set based mathematics as a basis for Design. The dissertation delineates a new foundation for Systems Engineering as Emergent Engineering based on General Schemas Theory, and provides an advanced theory of Design based on the understanding of the meta-levels of Being, particularly focusing upon the relationship between Hyper Being and Wild Being in the context of Pure and Process Being.
17

(11173440), Cassandra M. McCormack. "Information Architecture and Cognitive User Experience in Distributed, Asynchronous Learning: A Case Design of a Modularized Online Systems Engineering Learning Environment." Thesis, 2021.

Find full text
Abstract:

Systems engineering (SE) is an increasingly relevant domain in an increasingly interconnected world, but meeting the demand for SE education is impeded by the challenges of effectively teaching interdisciplinary material that emphasizes the development of a mentality over specific skills. A modularized, asynchronous, distributed course configuration may provide an advantageous alternative to more traditional hybrid course designs. Online courses have been a topic in the educational field since the establishment of the internet. However, the widespread disruptions to higher education due to the COVID-19 pandemic highlighted the demand for, and the difficulty of, deliberate and robust learning environment designs that consider a variety of traditional and non-traditional students. This thesis presents a case design of a learning environment for an interdisciplinary-focused, introductory graduate-level systems course that has previously been designed for, and taught in, a hybrid environment. The case design emphasizes the information architecture (IA) and user experience (UX) prototype design of the learning environment as informed by user-centric principles, cognitive theories and analyses, the IA literature, and existing course content. This focus on learner knowledge development (“beyond-the-screen”) factors rather than the direct user interface (“at-the-screen”) provides design recommendations and insights that are robust to changing user interface trends and preferences. A distribution of learners with varying backgrounds, learning needs, and goals associated with the material is identified. These individual differences can dramatically impact the effectiveness of potential interventions, particularly when different types of learners have directly conflicting needs. Thus, the online learning environment utilizes adaptable interfaces to move away from a “one-size-fits-all” design approach.
Content modularization and non-sequential, tag-based navigation were utilized to address the challenges of teaching highly interdisciplinary material. This thesis emphasizes a learning environment design that aims to teach highly interdisciplinary systems subject matter to a variety of learners with a variety of characteristics in an asynchronous, online format while making use of existing course material.

18

(9183002), Ashish Mortiram Chaudhari. "Information Acquisition in Engineering Design: Descriptive Models and Behavioral Experiments." Thesis, 2020.

Find full text
Abstract:
Engineering designers commonly make sequential information acquisition decisions such as selecting designs for performance evaluation, selecting information sources, deciding whom to communicate with in design teams, and deciding when to stop design exploration. There is significant literature on normative decision making for engineering design; however, there is a lack of descriptive modeling of how designers actually make information acquisition decisions. Such descriptive modeling is important for accurately modeling design decisions, identifying sources of inefficiencies, and improving the design process. To that end, the research objective of the dissertation is to understand how designers make sequential information acquisition decisions and to identify models that best describe a designer's decision strategies. To gain this understanding, the research approach consists of a synthesis of descriptive theories from the psychological and cognitive sciences, along with empirical evidence from behavioral experiments under different design situations. Statistical Bayesian inference is used to determine how well alternative descriptive decision models describe the experimental data. This approach quantifies a designer's decision strategies through posterior parameter estimation and Bayesian model comparison.
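For a discrete set of candidate decision models, Bayesian model comparison reduces to normalizing marginal likelihoods into posterior model probabilities. A minimal sketch with made-up log evidences (the dissertation's actual models and likelihoods are not reproduced here):

```python
import math

def posterior_model_probs(log_evidences, prior_probs=None):
    """Posterior P(model | data) from log marginal likelihoods.

    Uses the log-sum-exp trick for numerical stability, so very
    negative log evidences do not underflow to zero.
    """
    n = len(log_evidences)
    prior_probs = prior_probs or [1.0 / n] * n  # uniform model prior by default
    log_post = [le + math.log(p) for le, p in zip(log_evidences, prior_probs)]
    m = max(log_post)
    weights = [math.exp(lp - m) for lp in log_post]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical log evidences: a heuristic model vs. an expected-utility model.
probs = posterior_model_probs([-105.2, -112.7])
print(probs[0] > probs[1])  # True
```

A difference of 7.5 nats in log evidence corresponds to a Bayes factor near 1800, so the posterior concentrates almost entirely on the first model.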

Two research studies, presented in this dissertation, focus on assessing the effects of monetary incentives, a fixed budget, the type of design space exploration, and the availability of system-wide information on information acquisition decisions. The first study investigates information acquisition by an individual designer when multiple information sources are available and the total budget is limited. The results suggest that the student subjects' decisions are better represented by the heuristic-based models than the expected utility (EU)-based models.
While the EU-based models result in better net payoff, the heuristic models used by the subjects generate better design performance. The results also indicate the potential for nudging designers' decisions towards maximizing the net payoff by setting the fixed budget at low values and providing monetary incentives proportional to the saved budget.

The second study investigates information acquisition through communication. The focus is on designers' decisions about whom to communicate with, and how much to communicate, when there is interdependence between the subsystems being designed. This study analyzes the team communication of NASA engineers at a mission design laboratory (MDL) as well as of engineering students designing a simplified automotive engine in an undergraduate classroom environment. The results indicate that the rate of interactions increases in response to a reduction in system-level design performance in both settings. Additionally, the following factors seem to positively influence communication decisions: pairwise design interdependence, node-wise popularity (significant with NASA MDL engineers due to the large team size), and pairwise reciprocity.

The dissertation increases knowledge about engineering design decision making in the following ways. First, individuals make information acquisition decisions using simple heuristics based on in-situ information such as the available budget and present system performance.
The proposed multi-discipline approach proves helpful for describing heuristics analytically and inferring context-specific decision strategies using statistical Bayesian inference. This work has potential application in developing decision support tools for engineering design. Second, the comparison of communication patterns between student design teams and NASA MDL teams reveals that the engine experiment preserves some but not all of the communication patterns of interest. We find that the representativeness depends not on matching subjects, tasks, and context separately, but rather on the behavior that results from the interactions of these three dimensions. This work provides lessons for designing representative experiments in the future.
19

Davy, Carol. "Primary health care: knowledge development and application in Papua New Guinea." 2009. http://arrow.unisa.edu.au/vital/access/manager/Repository/unisa:38312.

Full text
Abstract:
Research into the use of information by health care professionals has generally been conducted in countries dominated by the biomedical model. In these contexts, illness is considered to have a scientifically identifiable physical cause, and treatment practices within the formal health care sector are prescribed and managed in accordance with this definition. Yet there are also contexts where other belief systems inform and guide the way that people think about their health. In comparison to the biomedical model, these contexts have contributed very little to our understanding of how health professionals develop their knowledge. This research investigates how primary health care workers (PHCWs) in one such context, Papua New Guinea (PNG), develop their knowledge about the health services they provide. In order to discover and understand the differing views of these PHCWs, 69 semi-structured interviews were conducted in three culturally and geographically diverse regions of PNG. In explaining the diagnostic and treatment practices they use, these participants provided insights into not only how PHCWs engage with information but also how it informs their professional practice. These data were analysed, interpreted and discussed using a framework consisting of four, primary but interconnecting aspects: the context in which information was provided, the interactions with the sources of information, the processes by which information was understood, and the outcomes realized as a result of the information being used. Findings indicated that the majority of participants in this study acknowledged, if not incorporated, information pertaining to biomedicine, Christianity and Indigenous belief systems into their diagnostic and treatment practices. Even when these belief systems clearly contradicted each other, PHCWs did not in general feel the need to make a conscious choice between them. 
From their comments it would appear that four factors contributed to this ability to incorporate diverse and often conflicting ideas into the way that patients were cared for. First, all of the belief systems were considered legitimate by at least one group of people connected to the community in which the PHCW worked. Second, although varying in degrees of availability and accessibility, members of these groups were able to disseminate information pertaining to the belief system they supported. Third, the PHCW had no particular affiliation with any one of these groups but instead regularly interacted with a range of different people. Lastly, the PHCW worked in situations where health practices were not generally well supervised by their employers and therefore they were relatively free to choose between various diagnostic and treatment practices. The qualitative interpretive approach adopted in this thesis contributes to the field of human information behavior by affirming that conflict is in the eye of the beholder. When a number of belief systems coexist and all are considered legitimate, information about them is freely available, and the recipients' actions are neither constrained by their own dogma nor imposed upon by others, individuals may quite comfortably embrace diverse beliefs. These findings may also contribute to a better understanding of health management practices in developing countries by suggesting that health professionals are not merely personifications of a biomedical model. Instead, the study demonstrates that multiple belief systems can be combined by PHCWs, and that in turn this benefits the formal health care sector through increased treatment options that are both appropriate and effective in such circumstances.
20

(9756986), Shubham Agrawal. "Understanding the Cognitive and Psychological Impacts of Emerging Technologies on Driver Decision-Making Using Physiological Data." Thesis, 2020.

Find full text
Abstract:

Emerging technologies such as real-time travel information systems and automated vehicles (AVs) have profound impacts on driver decision-making behavior. While they generally have positive impacts by enabling drivers to make more informed decisions or by reducing their driving effort, there are several concerns related to inadequate consideration of cognitive and psychological aspects in their design. In this context, this dissertation analyzes different aspects of driver cognition and psychology that arise from drivers’ interactions with these technologies using physiological data collected in two sets of driving simulator experiments.

This research analyzes the latent cognitive and psychological effects of real-time travel information using electroencephalogram (EEG) data measured in the first set of driving simulator experiments. Using insights from the previous analysis, a hybrid route choice modeling framework is proposed that incorporates the impacts of the latent information-induced cognitive and psychological effects along with other explanatory variables that can be measured directly (i.e., route characteristics, information characteristics, driver attributes, and situational factors) on drivers’ route choice decisions. EEG data is analyzed to extract two latent cognitive variables that capture the driver’s cognitive effort during and immediately after the information provision, and cognitive inattention before implementing the route choice decision.

Several safety concerns emerge for the transition of control from the automated driving system to a human driver after the vehicle issues a takeover warning under conditional vehicle automation (SAE Level 3). In this context, this study investigates the impacts of driver’s pre-warning cognitive state on takeover performance (i.e., driving performance while resuming manual control) using EEG data measured in the second set of driving simulator experiments. However, there is no comprehensive metric available in the literature that could be used to benchmark the role of driver’s pre-warning cognitive state on takeover performance, as most existing studies ignore the interdependencies between the associated driving performance indicators by analyzing them independently. This study proposes a novel comprehensive takeover performance metric, Takeover Performance Index (TOPI), that combines multiple driving performance indicators representing different aspects of takeover performance.
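Composite metrics of this kind are typically built by normalizing each driving-performance indicator and aggregating with weights. The sketch below is a generic illustration only, not the dissertation's actual TOPI construction; the indicator names and equal weights are assumptions:

```python
import statistics

def composite_index(indicators, weights):
    """Aggregate z-scored performance indicators into one index per driver.

    indicators: dict mapping indicator name -> list of values across drivers,
                sign-aligned so that higher always means better performance.
    weights:    dict mapping indicator name -> weight.
    """
    names = list(indicators)
    n = len(indicators[names[0]])
    z = {}
    for name in names:
        vals = indicators[name]
        mu, sd = statistics.mean(vals), statistics.pstdev(vals)
        z[name] = [(v - mu) / sd for v in vals]  # z-score each indicator
    return [sum(weights[name] * z[name][i] for name in names) for i in range(n)]

# Hypothetical takeover indicators for three drivers (already sign-aligned).
scores = composite_index(
    {"reaction_quickness": [0.9, 0.5, 0.1],
     "lane_stability":     [0.8, 0.6, 0.2]},
    {"reaction_quickness": 0.5, "lane_stability": 0.5},
)
print(scores[0] > scores[1] > scores[2])  # True
```

The point of such an index is that drivers are ranked by all indicators jointly rather than by each one independently, which is what preserves the interdependencies the paragraph above mentions.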

Acknowledging the practical limitations of EEG data for real-world applications, this dissertation evaluates the driver's situational awareness (SA) and mental stress using eye-tracking and heart-rate measures, respectively, which can be obtained in real time from in-vehicle driver monitoring systems. The differences in SA and mental stress over time, their correlations, and their impacts on the TOPI are analyzed to evaluate the efficacy of using eye-tracking and heart-rate measures for estimating overall takeover performance in conditionally automated vehicles.

The study findings can assist information service providers and auto manufacturers to incorporate driver cognition and psychology in designing safer real-time information and their delivery systems. They can also aid traffic operators to incorporate cognitive aspects while devising strategies for designing and disseminating real-time travel information to influence drivers’ route choices. Further, the study findings provide valuable insights to design operating and licensing strategies, and regulations for conditionally automated vehicles. They can also assist auto manufacturers in designing integrated in-vehicle driver monitoring and warning systems that enhance road safety and user experience.
21

(10723737), Navin Bhartoor Lingaraju. "Spectral Multiplexing and Information Processing for Quantum Networks." Thesis, 2021.

Find full text
Abstract:
Modern fiber-optic networks leverage massive parallelization of communications channels in the spectral domain, as well as low-noise recovery of optical signals, to achieve high rates of information transfer. However, quantum information imposes additional constraints on optical transport networks – the no-cloning theorem forbids use of signal regeneration and many network protocols are premised on operations like Bell state measurements that prize spectral indistinguishability. Consequently, a key challenge for quantum networks is identifying a path to high-rate and high-fidelity quantum state transport.

To bridge this gap between the capabilities of classical and quantum networks, we developed techniques that harness spectral multiplexing of quantum channels and that support frequency encoding. In relation to the former, we demonstrated reconfigurable connectivity over arbitrary subgraphs in a multi-user quantum network. In particular, through flexible provisioning of the pair source bandwidth, we adjusted the rate at which entanglement was distributed over any user-to-user link. To facilitate networking protocols compatible with both spectral multiplexing and frequency encoding, we synthesized a Bell state analyzer based on mixing outcomes that populate different spectral modes, in contrast to conventional approaches based on mixing outcomes that populate different spatial paths. This advance breaks the tradeoff between the fidelity of remote entanglement and the spectral distinguishability of photons participating in a joint measurement.

Finally, we take steps toward field deployment by developing photonic integrated circuits to migrate the aforementioned functionality to a chip-scale platform while also achieving the low loss transmission and high-fidelity operation needed for practical quantum networks.
22

Beckett, Jason. "Forensic computing : a deterministic model for validation and verification through an ontological examination of forensic functions and processes." 2010. http://arrow.unisa.edu.au:8081/1959.8/93190.

Full text
Abstract:
This dissertation contextualises the forensic computing domain in terms of validation of tools and processes. It explores the current state of forensic computing, comparing it to the traditional forensic sciences. The research then develops a classification system for the discipline's functions to establish the extensible base on which a validation system is developed.
Thesis (PhD)--University of South Australia, 2010
23

(5930729), Ke Liu. "Pattern Exploration from Citizen Geospatial Data." Thesis, 2019.

Find full text
Abstract:
Due to advances in location-acquisition techniques, citizen geospatial data has emerged, bringing opportunities for research, development, innovation, and business. A variety of research has been developed to study society and citizens by exploring patterns in geospatial data. In this thesis, we investigate patterns of population and human sentiment using GPS trajectory data and geo-tagged tweets. Kernel density estimation and emerging hot spot analysis are first used to demonstrate population distribution across space and time. Then a flow extraction model based on density difference is proposed for detecting and visualizing human movement. Case studies of a volleyball game in West Lafayette and traffic in Puerto Rico verify the effectiveness of this method. Flow maps are capable of tracking clustering behaviors, and direction maps drawn from the orientation of vectors can precisely identify the locations of events. This thesis also analyzes patterns of human sentiment. The polarity of each tweet is represented by a numeric value based on linguistic rules. Sentiments in four US college cities are analyzed according to their distribution across citizens, time, and space. The results suggest that social media can be used to understand patterns of public sentiment and well-being.
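As an illustration of the first step, a Gaussian kernel density estimate over a cloud of geotagged points peaks where the points cluster, which is what makes population hotspots visible on a map. The bandwidth and synthetic data below are arbitrary choices, not values from the thesis:

```python
import numpy as np

def kde2d(points, query, bandwidth=0.5):
    """Gaussian kernel density estimate at `query` from an (n, 2) array of points."""
    d2 = np.sum((points - query) ** 2, axis=1)          # squared distances to query
    k = np.exp(-d2 / (2 * bandwidth ** 2)) / (2 * np.pi * bandwidth ** 2)
    return k.mean()                                     # average kernel contribution

rng = np.random.default_rng(0)
# Synthetic hotspot: 200 points clustered around the origin.
cluster = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(200, 2))
density_center = kde2d(cluster, np.array([0.0, 0.0]))
density_far = kde2d(cluster, np.array([3.0, 3.0]))
print(density_center > density_far)  # True: the hotspot shows up as a density peak
```

Evaluating the same estimate on a grid over time bins, and differencing the resulting surfaces, is the kind of operation the thesis's density-difference flow extraction builds on.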
24

(5930165), Xinwu Qian. "Linking urban mobility with disease contagion in urban networks." Thesis, 2019.

Find full text
Abstract:
This dissertation focuses on developing a series of mathematical models to understand the role of urban transportation system, urban mobility and information dissemination in the spreading process of infectious diseases within metropolitan areas. Urban transportation system serves as the catalyst of disease contagion since it provides the mobility for bringing people to participate in intensive urban activities and has high passenger volume and long commuting time which facilitates the spread of contagious diseases. In light of significant needs in understanding the connection between disease contagion and the urban transportation systems, both macroscopic and microscopic models are developed and the dissertation consists of three main parts.
The first part of the dissertation models disease spreading within urban transportation systems at the macroscopic level based on compartment models. Nonlinear dynamic systems are developed to model the spread of infectious disease across various travel modes, compare models with and without contagion during travel, understand how urban transportation systems may facilitate or impede epidemics, and devise control strategies for mitigating epidemics at the network level. Hybrid automata are also introduced to account for systems with different levels of control and with uncertain initial epidemic size, and reachability analysis is used to over-approximate the disease trajectories of the nonlinear systems. The 2003 Beijing SARS data are used to validate the effectiveness of the model. In addition, comprehensive numerical experiments are conducted to understand the importance of modeling travel contagion during urban disease outbreaks and to develop control strategies for regulating entry to the urban transportation system to reduce the epidemic size.
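The compartment-model backbone of this part can be illustrated with a plain SIR system, where lowering the contact rate beta stands in for restricting entry to the transportation network. The parameters below are arbitrary, not calibrated to the SARS data, and the dissertation's actual models include travel modes this sketch omits:

```python
def sir_step(s, i, r, beta, gamma, dt=0.1):
    """One forward-Euler step of the classic SIR model (population fractions)."""
    new_inf = beta * s * i * dt   # S -> I transitions this step
    new_rec = gamma * i * dt      # I -> R transitions this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(beta, gamma, steps=2000):
    """Run an outbreak from a 1% infected seed; return peak prevalence and final size."""
    s, i, r = 0.99, 0.01, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak, r

# Hypothetical rates: entry control is modeled as a lower contact rate beta.
peak_open, final_open = simulate(beta=0.5, gamma=0.2)
peak_ctrl, final_ctrl = simulate(beta=0.3, gamma=0.2)
print(peak_ctrl < peak_open and final_ctrl < final_open)  # True
```

Even this toy version reproduces the qualitative effect the control strategies target: reducing contact opportunities lowers both the epidemic peak and the final outbreak size.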
The second part of the dissertation develops a data-driven framework to investigate disease spreading dynamics at the individual level. In particular, a contact network generation algorithm is developed to reproduce individuals' contact patterns based on smart card transaction data from metro systems in three major cities in China. Disease dynamics are connected with contact network structures using individual-based mean field and origin-destination pair-based mean field approaches. The results suggest that the vulnerability of contact networks depends solely on the risk exposure of the most dangerous individual; however, the overall degree distribution of the contact network determines the difficulty of controlling the disease's spread. Moreover, a generation model is proposed to depict how individuals come into contact and how long their contacts last, based on their travel characteristics. The metro data are used to validate the correctness of the generation model, provide insights on monitoring the risk level of transportation systems, and evaluate possible control strategies to mitigate the impacts of infectious diseases.
Finally, the third part of the dissertation focuses on the role played by information in urban travel and develops a multiplex network model to investigate the co-evolution of disease dynamics and information dissemination. The model considers that individuals may obtain information on the state of disease by observing symptoms in the people they meet during travel and from centralized sources such as news agencies and social media. Accordingly, a multiplex network model is developed with one layer capturing information percolation and the other modeling the disease dynamics, where the dynamics on each layer depend on those of the other. The multiplex network model is found to have three stable states, and their corresponding threshold values are analytically derived. In the end, numerical experiments are conducted to investigate the effectiveness of local and global information in reducing the size of disease outbreaks, and the synchronization between disease and information dynamics is discussed.
