Dissertations / Theses on the topic 'Criminal investigation Data processing'


Consult the top 50 dissertations / theses for your research on the topic 'Criminal investigation Data processing.'


You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Carnaz, Gonçalo José Freitas. "A graph-based framework for data retrieved from criminal-related documents." Doctoral thesis, Universidade de Évora, 2021. http://hdl.handle.net/10174/29954.

Full text
Abstract:
The digitalisation of companies and services has enhanced the treatment and analysis of a growing volume of data from heterogeneous sources, with emerging challenges, namely those related to knowledge representation. Criminal police forces face the same challenge, given the volume of unstructured data from police reports that is analysed manually by criminal investigators, consuming time and resources. There is therefore a need to extract and represent, automatically, the unstructured data in criminal-related documents, reducing the manual analysis carried out by criminal investigators. This is a challenge for computer science, which can offer a computational alternative for extracting and representing the data by adapting existing methods or proposing new ones. Several computational methods have already been applied to the criminal domain, such as the identification and classification of named entities (NEs), for example narcotics, or the extraction of relations between entities relevant to a criminal investigation. However, these methods have mainly been applied to the English language, and in Portugal research in this area has received little attention, making its application in the context of criminal investigation unfeasible. This thesis proposes an integrated solution for the representation of unstructured data retrieved from documents, using a set of computational methods: a Criminal-Related Document Preprocessing module, built on Extraction, Transformation and Loading tasks adapted to criminal-related documents, followed by a Natural Language Processing pipeline for the Portuguese language that performs syntactic and semantic analysis of the textual data; a 5W1H Information Extraction Method that combines Named-Entity Recognition, Semantic Role Labelling and Criminal Terms Extraction; and a Graph Database Population and Enrichment step that represents the retrieved data in a Neo4j graph database. Overall, the integrated solution presents promising results, validated using prototypes developed for the purpose. The feasibility of extracting the unstructured data, interpreting it syntactically and semantically, and representing it in the graph database has also been demonstrated.
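As an illustration of the kind of pipeline described above, the sketch below extracts named entities from a Portuguese text with spaCy and writes them to a Neo4j graph. It is a minimal, hypothetical example rather than the prototype developed in the thesis; the model name, node labels and Cypher statements are assumptions.

    # Minimal sketch: extract entities from a Portuguese report snippet and
    # store them as nodes in a Neo4j graph (assumed schema, not the thesis's).
    import spacy
    from neo4j import GraphDatabase

    nlp = spacy.load("pt_core_news_sm")   # small Portuguese model, assumed installed
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def ingest(report_id: str, text: str) -> None:
        doc = nlp(text)
        with driver.session() as session:
            session.run("MERGE (d:Document {id: $id})", id=report_id)
            for ent in doc.ents:
                session.run(
                    "MERGE (e:Entity {name: $name, label: $label}) "
                    "WITH e MATCH (d:Document {id: $id}) MERGE (d)-[:MENTIONS]->(e)",
                    name=ent.text, label=ent.label_, id=report_id,
                )

    ingest("relatorio-001", "O suspeito João Silva foi visto em Évora a 12 de Março.")
    driver.close()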
APA, Harvard, Vancouver, ISO, and other styles
2

Yalcinkaya, Ramazan. "Police officers' adoption of information technology: A case study of the Turkish POLNET system." Thesis, University of North Texas, 2007. https://digital.library.unt.edu/ark:/67531/metadc3900/.

Full text
Abstract:
An important branch of government, vital to the community, police agencies are organizations with high usage rates of information technology systems, since they are in the intelligence sector and thus have information incentives. Not only can information technologies develop intra- and inter-relationships of law enforcement agencies, but they also improve the efficiency and effectiveness of police officers and agencies without adding additional costs. Thus, identifying the factors that influence police officers' adoption of information technology can help predict and determine how information technology will contribute to the social organization of policing in terms of effectiveness and efficiency gains. A research framework was developed by integrating three different models, the theory of planned behavior (TPB), the technology acceptance model (TAM), and the diffusion of innovation theory (DOI), while adding two other factors, facility and voluntariness, to better determine the factors affecting the implementation and adoption of the POLNET software system used by the Turkish National Police (TNP). The integrated model used in this study covers not only basic technology acceptance factors, but also factors related to policing. It also attempts to account for cultural differences by considering important aspects of Turkish culture. A cross-sectional survey was conducted among TNP officers using the POLNET system. The LISREL 8.5® analysis for the hypothesized model resulted in a good model fit; 13 of the 15 hypotheses were supported.
APA, Harvard, Vancouver, ISO, and other styles
3

Hargreaves, C. J. "Assessing the Reliability of Digital Evidence from Live Investigations Involving Encryption." Thesis, Department of Informatics and Sensors, 2009. http://hdl.handle.net/1826/4007.

Full text
Abstract:
The traditional approach to a digital investigation when a computer system is encountered in a running state is to remove the power, image the machine using a write blocker and then analyse the acquired image. This has the advantage of preserving the contents of the computer’s hard disk at that point in time. However, the disadvantage of this approach is that the preservation of the disk is at the expense of volatile data such as that stored in memory, which does not remain once the power is disconnected. There are an increasing number of situations where this traditional approach of ‘pulling the plug’ is not ideal since volatile data is relevant to the investigation; one of these situations is when the machine under investigation is using encryption. If encrypted data is encountered on a live machine, a live investigation can be performed to preserve this evidence in a form that can be later analysed. However, there are a number of difficulties with using evidence obtained from live investigations that may cause the reliability of such evidence to be questioned. This research investigates whether digital evidence obtained from live investigations involving encryption can be considered to be reliable. To determine this, a means of assessing reliability is established, which involves evaluating digital evidence against a set of criteria; evidence should be authentic, accurate and complete. This research considers how traditional digital investigations satisfy these requirements and then determines the extent to which evidence from live investigations involving encryption can satisfy the same criteria. This research concludes that it is possible for live digital evidence to be considered to be reliable, but that reliability of digital evidence ultimately depends on the specific investigation and the importance of the decision being made. However, the research provides structured criteria that allow the reliability of digital evidence to be assessed, demonstrates the use of these criteria in the context of live digital investigations involving encryption, and shows the extent to which each can currently be met.
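One of the reliability criteria discussed above, the accuracy and authenticity of an acquired image, is commonly supported in practice by cryptographic hashing; the fragment below is a generic illustration of that practice, not a procedure taken from the thesis, and the file name is hypothetical.

    # Generic illustration: hash a disk image at acquisition time and verify it later.
    import hashlib

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    acquired = sha256_of("evidence.dd")   # recorded when the image is taken
    verified = sha256_of("evidence.dd")   # recomputed before analysis
    print("integrity preserved" if acquired == verified else "image has changed")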
APA, Harvard, Vancouver, ISO, and other styles
4

Hargreaves, Christopher James. "Assessing the reliability of digital evidence from live investigations involving encryption." Thesis, Cranfield University, 2009. http://dspace.lib.cranfield.ac.uk/handle/1826/4007.

Full text
Abstract:
The traditional approach to a digital investigation when a computer system is encountered in a running state is to remove the power, image the machine using a write blocker and then analyse the acquired image. This has the advantage of preserving the contents of the computer’s hard disk at that point in time. However, the disadvantage of this approach is that the preservation of the disk is at the expense of volatile data such as that stored in memory, which does not remain once the power is disconnected. There are an increasing number of situations where this traditional approach of ‘pulling the plug’ is not ideal since volatile data is relevant to the investigation; one of these situations is when the machine under investigation is using encryption. If encrypted data is encountered on a live machine, a live investigation can be performed to preserve this evidence in a form that can be later analysed. However, there are a number of difficulties with using evidence obtained from live investigations that may cause the reliability of such evidence to be questioned. This research investigates whether digital evidence obtained from live investigations involving encryption can be considered to be reliable. To determine this, a means of assessing reliability is established, which involves evaluating digital evidence against a set of criteria; evidence should be authentic, accurate and complete. This research considers how traditional digital investigations satisfy these requirements and then determines the extent to which evidence from live investigations involving encryption can satisfy the same criteria. This research concludes that it is possible for live digital evidence to be considered to be reliable, but that reliability of digital evidence ultimately depends on the specific investigation and the importance of the decision being made. However, the research provides structured criteria that allow the reliability of digital evidence to be assessed, demonstrates the use of these criteria in the context of live digital investigations involving encryption, and shows the extent to which each can currently be met.
APA, Harvard, Vancouver, ISO, and other styles
5

Ask, Karl. "Criminal investigation : motivation, emotion and cognition in the processing of evidence /." Göteborg : Dept. of Psychology Göteborg Univ, 2006. http://www.gbv.de/dms/spk/sbb/recht/toc/513716297.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Padhye, Manoday D. "Use of data mining for investigation of crime patterns." Morgantown, W. Va. : [West Virginia University Libraries], 2006. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=4836.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2006.
Title from document title page. Document formatted into pages; contains viii, 108 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 80-81).
APA, Harvard, Vancouver, ISO, and other styles
7

Brouse, Andrew. "The interharmonium : an investigation into networked musical applications and brainwaves." Thesis, McGill University, 2001. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=33878.

Full text
Abstract:
This work surveys currently available methods for measuring human brainwaves in order to generate music and technologies for real-time transmission of audio and music over the Internet. The end goal is to produce a performable music system which sends live human brainwaves over the Internet to produce sounding music at another, physically separated location.
APA, Harvard, Vancouver, ISO, and other styles
8

Slingsby, T. P. "An investigation into the development of a facilities management system for the University of Cape Town." Master's thesis, University of Cape Town, 2004. http://hdl.handle.net/11427/5585.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Todd, Peter A. "An experimental investigation of the impact of computer based decision aids on the process of preferential choice." Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/29442.

Full text
Abstract:
This research examines the impact of Decision Support Systems (DSS) on the decision-making process for preferential choice tasks. The potential impact of DSS on the decision process is evaluated in terms of how the system alters the decision maker's cognitive load. Competing hypotheses are developed based on the possible objectives of the decision maker with respect to decision effort and decision quality. One line of reasoning assumes that the DSS will be used in such a way as to maximise decision quality. The other asserts that the use of the DSS will be geared towards effort conservation. These hypotheses about the impact of the DSS on the decision process are tested in three experiments. The three studies employed concurrent verbal protocols to capture data about the decision process. In experiment 1 subjects were placed in either an aided or an unaided decision setting and given problems of either five or ten alternatives from which to make a choice. The results showed that decision strategy changed as a result of the use of the decision aid. In general, subjects behaved as effort minimisers. There were no significant effects related to the amount of information processing. Experiment 2 was similar to experiment 1 except that subjects were given problems with either ten or twenty alternatives. The results were consistent with, though stronger than, those of experiment 1. Almost all aided-group subjects used an elimination-by-aspects strategy while the unaided group used a conjunctive strategy. This is consistent with the notion of effort minimisation. There were no significant differences in the amount of information processing. Experiment 3 was designed to test whether the results in experiments 1 and 2 were due to the tendency of decision makers to minimise effort or because the aid was not powerful enough to induce additive processing. In this study the DSS was altered to both increase the support for the additive difference strategy and reduce support for the elimination-by-aspects approach. The results of experiment 3 show that decision makers tend to adapt their strategy to the type of decision aids available. There is evidence that if additive strategies are made sufficiently less effortful to use they will be employed. Similarly, when the degree of effort to follow a particular elimination strategy is manipulated, decision makers tend to adapt in such a way as to minimise effort. Overall the results of the three experiments are consistent in demonstrating the adaptivity of decision makers to the types of support tools available to them. This adaptivity centres around the minimisation of decision effort. It appears that decision makers are highly conscious of the effort required to make decisions and work in such a way as to minimise that expenditure. When faced with the use of a decision aid they appear to calibrate their own decision effort to that provided by the decision aid. There is some evidence that sufficient changes in the relative effort required to use various strategies can lead decision makers to follow more effortful approaches than they might otherwise consider. The precise nature of this effort-accuracy relationship needs to be studied more closely. The basic contribution of the dissertation has been to provide a formal approach for the study of DSS, based on concepts drawn from behavioural decision theory and information processing psychology.
This work also has implications for behavioural decision theorists and consumer researchers, and practical implications for the development of DSS in preferential choice settings.
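To make the strategies contrasted above concrete, the sketch below implements a simple elimination-by-aspects screen over a small choice set; the attributes, cutoffs and alternatives are invented for illustration and are not taken from the experiments.

    # Illustrative elimination-by-aspects (EBA): screen alternatives attribute by
    # attribute, dropping any alternative that fails the current cutoff.
    alternatives = {
        "A": {"price": 700, "battery_h": 9, "weight_kg": 1.4},
        "B": {"price": 950, "battery_h": 12, "weight_kg": 1.1},
        "C": {"price": 650, "battery_h": 6, "weight_kg": 1.8},
    }
    # Aspects in order of importance: (attribute, test the alternative must pass).
    aspects = [
        ("price", lambda v: v <= 900),
        ("battery_h", lambda v: v >= 8),
    ]

    remaining = dict(alternatives)
    for attr, passes in aspects:
        remaining = {k: v for k, v in remaining.items() if passes(v[attr])}
        if len(remaining) <= 1:
            break

    print("chosen:", list(remaining) or "none")   # -> ['A']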
APA, Harvard, Vancouver, ISO, and other styles
10

Olson, Chandra. "Jini an investigation in distributed computing /." [Florida] : State University System of Florida, 2001. http://etd.fcla.edu/etd/uf/2001/ank7122/chandra.PDF.

Full text
Abstract:
Thesis (M.E.)--University of Florida, 2001.
Title from first page of PDF file. Document formatted into pages; contains viii, 71 p.; also contains graphics. Vita. Includes bibliographical references (p. 69-70).
APA, Harvard, Vancouver, ISO, and other styles
11

Salmon, Gwendal. "Processing of shear waves from VSP data at the Forsmark site investigation." Thesis, Uppsala universitet, Geofysik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-183684.

Full text
Abstract:
The Forsmark Nuclear Plant is one of the largest in Sweden and produces around one-sixth of the total electrical energy in the country. It is situated on the east coast of Sweden in the Uppland region. Nuclear waste has to be properly handled every year and Forsmark is one site proposed for long-term storage of all spent fuel from Swedish nuclear power reactors. This potential high-level repository (a low-level one already exists in the area) will be based on the KBS-3 design process, which consists of 6000 iron-copper capsules where the waste will be stored for 30 years and finally buried 500 m down, isolated from the environment for 100,000 years. Before using the disposal site, numerous investigations must be done in the area so the risks are reduced as much as possible. These investigations include drilling of cored boreholes down to 1000 m depth. In this study the KFM01A borehole was used with different shot points to analyze possible anisotropy in the subsurface. The anisotropy in rocks can be due to different mechanisms, such as crystal and mineral grain alignment, crack and pore space alignment and thin layer anisotropy (Rowlands J. et al., 1993). For this purpose a shear wave splitting analysis was done in an attempt to determine both orientation and density of fractures. Shear wave splitting has been shown to be a very effective method for detecting fractures, providing a unique ability to measure anisotropic seismic attributes that are sensitive to fractures (James E. Gaiser, 2004). This can be useful in many domains, for example by oil companies to improve reservoir management (James E. Gaiser, 2004) or as an imaging tool in fracture-controlled geothermal reservoirs, to monitor fluid pressure in the cracks and changes in crack density (Tang Chuanhai, 2005). Shear wave splitting studies have also been done in seismology for crustal studies (Rowlands J. et al., 1993). When shear waves enter anisotropic media they split into two approximately orthogonal components, where the faster and slower components will travel parallel and perpendicular to the fracture planes respectively. The time delay will depend on the amount of anisotropy and the path length. Different methods can be used to evaluate the anisotropy: polarization diagrams (Crampin et al., 1986) and linear moveout plots of the horizontal components (Li et al., 1988). The procedure described by Li et al. (1988) is the technique used in the present study. The fracture orientation is also analyzed and compared with the general stress components in the area using well bore information from previous studies, as well as the general tectonic characteristics of the zone.
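A common way to quantify splitting is to cross-correlate the two horizontal components and read off the lag of maximum correlation. The synthetic example below illustrates only that step; the sampling interval, wavelet and delay are invented and this is not the processing flow used in the study.

    # Synthetic illustration: estimate the delay between fast and slow shear-wave
    # components by cross-correlation (values are invented, not Forsmark data).
    import numpy as np

    dt = 0.001                                  # sample interval, s
    t = np.arange(0, 0.5, dt)
    wavelet = np.exp(-((t - 0.1) / 0.01) ** 2)  # simple Gaussian pulse
    fast = wavelet
    slow = np.roll(wavelet, 20)                 # slow component arrives 20 samples later

    xcorr = np.correlate(slow, fast, mode="full")
    lag = xcorr.argmax() - (len(fast) - 1)      # lag of slow relative to fast, in samples
    print(f"estimated delay: {lag * dt * 1000:.1f} ms")   # -> 20.0 ms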
APA, Harvard, Vancouver, ISO, and other styles
12

Smith, Michael Alan. "An empirical investigation of the determinants of information systems outsourcing." Diss., Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/29455.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Spence, William MacDonald. "Investigation into the opportunities presented by big data for the 4C Group." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/97409.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: The telecommunications industry generates vast amounts of data on a daily basis. The exponential growth in this industry has, therefore, increased the number of nodes that generate data on a near real-time basis, and the processing power required to process all this information has increased as well. Organisations in different industries have experienced the same growth in information processing, and, in recent years, professionals in the Information Systems (IS) industry have started referring to these challenges as the concept of Big Data (BD). This theoretical research investigated the definition of big data as defined by several leading players in the industry. The theoretical research further focussed on several key areas relating to the big data era: i) common attributes of big data; ii) how organisations respond to big data; and iii) the opportunities that big data provides to organisations. A selection of case studies is presented to determine what other players in the IS industry do to exploit big data opportunities. The study showed that the concept of big data has emerged because IT infrastructure is struggling to cope with the increased volume, variety and velocity of data being generated, and that organisations are finding it difficult to incorporate the results from new and advanced mining and analytical techniques into their operations in order to extract the maximum value from their data. The study further found that big data impacts each component of the modern-day computer-based information system, and the exploration of several practical cases further highlighted how different organisations have addressed this big data phenomenon in their IS environment. Using all this information, the study investigated the 4C Group business model and identified some key opportunities for this IT vendor in the big data era. As the 4C Group has positioned itself across the ICT value chain, big data presents several good opportunities to explore in all components of the IS. While training and consulting can establish the 4C Group as a knowledgeable big data vendor, some enhancements to its application software functionality can provide additional big data opportunities. As true big data value only originates from the utilisation of data in daily decision-making processes, by offering IaaS the 4C Group can enable its clients to achieve the elusive goal of becoming a data-driven organisation.
APA, Harvard, Vancouver, ISO, and other styles
14

Van, der Walt Craig. "An investigation into the practical implementation of speech recognition for data capturing." Thesis, Cape Technikon, 1993. http://hdl.handle.net/20.500.11838/1156.

Full text
Abstract:
Thesis (Master Diploma (Technology))--Cape Technikon, Cape Town, 1993
A study into the practical implementation of speech recognition for the purposes of data capturing within Telkom SA is described. As data capturing is increasing in demand, a more efficient method of capturing is sought. The technology relating to speech recognition is examined herein, and practical guidelines for selecting a speech recognition system are described. These guidelines are used to show how commercially available systems can be evaluated. Specific tests on a selected speech recognition system are described, relating to the accuracy and adaptability of the system. The results obtained illustrate why, at present, speech recognition systems are not advisable for the purpose of data capturing. The results also demonstrate how the selection of keywords can affect system performance. Areas of further research are highlighted relating to recognition performance and vocabulary selection.
APA, Harvard, Vancouver, ISO, and other styles
15

Tse, Ka-sze Hayson, and 謝家樹. "Bayesian network analysis of evidence in criminal court cases." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2015. http://hdl.handle.net/10722/212618.

Full text
Abstract:
When justice goes wrong, grave consequences follow. They are damaging to the standing of the legal system and to people's lives. Humans are not good at assessing uncertainties. Parties to a legal proceeding adduce evidence to support or reject hypotheses. Errors happen when the tribunal fails to consider properly all the evidence in the context of inherent probabilities or improbabilities. This research work does not advocate trials by mathematics or statistics. This work extends the understanding of the application of Bayesian networks in the law domain. The original contribution to knowledge is the analysis of evidence by Bayesian networks in the context of the specific legal requirements of Hong Kong. The research questions are: 1. What are the legal requirements for the analysis of evidence in a criminal trial in Hong Kong? 2. How can a Bayesian network be constructed for the purpose of such analysis? 3. Is such a Bayesian network effective for the analysis? In answering the questions, this research work examined the feasibility of generic models created for digital crime scene investigations and concluded that each case must be, for the purpose of analysis of evidence in the trial, represented by a different Bayesian network. This research work examined the trial processes, the tasks of the tribunal of fact in criminal trials and some appellate decisions in Hong Kong. The work also created models of reasoning processes for juries in Hong Kong. The work then compared the properties of Bayesian networks with the processes of evaluation of evidence during trials. This research work also considered the reluctance of courts in the United Kingdom to allow experts to express their opinions on the basis of Bayesian calculations, even though trials are practically evaluations of uncertainties and assignments of degrees of belief. This research work then constructed a schedule of levels of proof and proposed a schematic method, to be used with the schedule, to construct Bayesian networks for any type of trial in Hong Kong. The method requires an analyst to go through a mass of evidence systematically, and analyse the relationships amongst the ultimate probandum, the penultimate probanda, the intermediate propositions, the facts in issue and the facts relevant to the issue. This work then demonstrated the application in two criminal cases in Hong Kong. The analyses show that the construction of a Bayesian network by the schematic method enables an analyst to take precautions, to reach an assessment rationally, and to approximate his or her belief to the facts in issue as far as possible.
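To show the general form of such an analysis, the toy network below scores a single hypothesis against one piece of evidence using pgmpy; the variables, structure and probabilities are invented for illustration and do not come from the Hong Kong cases analysed in the thesis.

    # Toy Bayesian network: posterior belief in a hypothesis given one evidence node.
    # Structure and numbers are invented for illustration only.
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([("Hypothesis", "Evidence")])
    model.add_cpds(
        TabularCPD("Hypothesis", 2, [[0.99], [0.01]]),          # prior: 1% true
        TabularCPD("Evidence", 2,
                   [[0.9, 0.2],    # P(Evidence = absent  | Hypothesis = false, true)
                    [0.1, 0.8]],   # P(Evidence = present | Hypothesis = false, true)
                   evidence=["Hypothesis"], evidence_card=[2]),
    )
    assert model.check_model()

    posterior = VariableElimination(model).query(["Hypothesis"], evidence={"Evidence": 1})
    print(posterior)   # posterior distribution over the hypothesis given the evidence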
APA, Harvard, Vancouver, ISO, and other styles
16

Kubisak, Timothy D. "Investigation of acoustic vector sensor data processing in the presence of highly variable bathymetry." Thesis, Monterey, California: Naval Postgraduate School, 2014. http://hdl.handle.net/10945/42664.

Full text
Abstract:
Approved for public release; distribution is unlimited
Data has been collected on acoustic vector sensors mounted on autonomous underwater gliders in the Monterey Bay during 2012–2013. Previous processing work computed the acoustic vector intensity to estimate bearing to impulsive sources of interest. These sources included small explosive shots deployed by local fishermen and humpback whale vocalizations. While the highly impulsive shot data produced unambiguous bearing estimations, the longer duration whale vocalizations showed a fairly wide spread in bearing. In this work, causes of the ambiguity in bearing estimation are investigated in the context of the highly variable bathymetry of the Monterey Bay Canyon, as well as the coherent multipath interference in the longer duration calls. Sound speed data collected during the previous experimental effort, along with a three-dimensional bathymetric relief of the Monterey Bay Canyon, are incorporated into a three-dimensional version of the Monterey-Miami Parabolic Equation Model. Propagation results are computed over a frequency band from 336–464 Hz in order to provide predictions of pulse arrival structure. This data is analyzed using conventional pressure plane-wave beamforming techniques in order to highlight horizontal coupling caused by the canyon bathymetry. The data is also analyzed using the previously developed acoustic vector intensity processing string and shown to exhibit a qualitatively similar spread in the estimated bearing.
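For reference, the conventional plane-wave beamforming mentioned above amounts to phase-shifting the array elements for each candidate bearing and summing. The sketch below does this for a synthetic narrowband line array; the geometry, frequency and bearing are invented and this is not the processing string used in the thesis.

    # Conventional (delay-and-sum) plane-wave beamformer on a synthetic line array.
    import numpy as np

    c, f = 1500.0, 400.0                 # sound speed (m/s), frequency (Hz)
    k = 2 * np.pi * f / c
    x = np.arange(8) * 1.5               # 8 elements, 1.5 m spacing
    true_bearing = np.deg2rad(40.0)

    # Narrowband element data for a unit-amplitude plane wave from true_bearing.
    data = np.exp(1j * k * x * np.sin(true_bearing))

    bearings = np.deg2rad(np.arange(-90, 91))
    steering = np.exp(1j * k * np.outer(np.sin(bearings), x))   # candidate plane waves
    power = np.abs(steering.conj() @ data) ** 2

    print("estimated bearing:", np.rad2deg(bearings[power.argmax()]), "deg")   # ~40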
APA, Harvard, Vancouver, ISO, and other styles
17

Khan, Abdullah. "An investigation into improving the functioning of manufacturing executions system at the Impala base metals refinery." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/6408.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Baxter, David. "Perception of organisational politics and workplace innovation : an investigation of the perceptions and behaviour of staff in an Australian IT services organisation /." Swinburne Research Bank, 2004. http://hdl.handle.net/1959.3/46062.

Full text
Abstract:
Thesis (D.B.A.)--Swinburne University of Technology, Australian Graduate School of Entrepreneurship, 2004.
A thesis submitted to the fulfilment of the requirements for the degree of Doctor of Philosophy, Australian Graduate School of Entrepreneurship, Swinburne University of Technology, 2004. Typescript. Includes bibliographical references (p. 229-230).
APA, Harvard, Vancouver, ISO, and other styles
19

Cooke, Nicholas Duncan. "Initial findings of an investigation into the feasibility of a low level image processing workstation using transputers." Thesis, Rhodes University, 1990. http://hdl.handle.net/10962/d1006702.

Full text
Abstract:
From Introduction: The research concentrates primarily on a feasibility study involving the setting up of an image processing workstation. As broad as this statement concerning the workstation may seem, there are several factors limiting the extent of the research. This project is not concerned with the design and implementation of a fully-fledged image processing workstation. Rather, it concerns an initial feasibility study of such a workstation, centered on the theme of image processing aided by the parallel processing paradigm. In looking at the hardware available for the project, in the context of an image processing environment, a large amount of initial investigation was required prior to that concerned with the transputer and parallel processing. Work was done on the capturing and displaying of images. This formed a vital part of the project. Furthermore, considering that a new architecture was being used as the workhorse within a conventional host architecture, the INTEL 80286, several aspects of the host architecture had also to be investigated. These included the actual processing capabilities of the host, the capturing and storing of the images on the host, and most importantly, the interface between the host and the transputer [C0089]. Benchmarking was important in order for good conclusions to be drawn about the viability of the two types of hardware used, both individually and together. On the subject of the transputer as the workhorse, there were several areas which required investigation. Initial work had to cover the choice of network topology on which the benchmarking of some of the image processing applications was performed. Research into this was based on the previous work of several authors, which introduced features relevant to this investigation. The network used for this investigation was chosen to be generally applicable to a broad spectrum of applications in image processing. It was not chosen for its applicability to a single dedicated application, as has been the case for much of the past research performed in image processing [SAN88] [SCH89]. The concept of image processing techniques being implemented on the transputer required careful consideration in respect of what should be implemented. Image processing is not a new subject, and it encompasses a large spectrum of applications. The transputer, with image processing being highly suited to it, has attracted a good deal of research. It would not be rash to say that the easy research was covered first. The more trivial operations in image processing, requiring matrix-type operations on the pixels, attracted the most coverage. Several researchers in the field of image processing on the transputer have broken the back of this set of problems. Conclusions regarding these operations on the transputer returned a fairly standard answer. An area of image processing which has not produced the same volume of return as that concerning the more trivial operations is the subject of Fourier analysis, that is, the Fourier transform. Thus a major part of this project concerns an investigation into the Fourier transform in image processing, in particular the Fast Fourier Transform. The network chosen for this research has placed some constraint upon the degree of parallelism that can be achieved. It should be emphasized that this project is not concerned with the most efficient implementation of a specific image processing algorithm on a dedicated topology.
Rather, it looks at the feasibility of a general system in the domain of image processing, concerned with a highly computationally intensive operation. This has had the effect of testing the processing power of the hardware used, and contributing a widely applicable parallel algorithm for use in Fourier analysis. These are discussed more fully in Chapter 2, which covers the work related to this project. The results of the investigation are presented along with a discussion of the methods throughout the thesis. The final chapter summarizes the findings of the research, assesses the value of the investigation, and points out areas for future investigation.
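As a reminder of the kind of operation being parallelised, a Fourier-domain filter applied to an image looks like the fragment below in a modern array library; it is purely illustrative, uses a synthetic image, and is unrelated to the transputer implementation described in the thesis.

    # Illustrative frequency-domain low-pass filter on a synthetic image, the kind
    # of FFT-based operation the thesis investigates on a transputer network.
    import numpy as np

    img = np.random.rand(256, 256)                      # stand-in for a captured image
    spectrum = np.fft.fftshift(np.fft.fft2(img))

    rows, cols = img.shape
    r = np.hypot(*np.ogrid[-rows // 2:rows // 2, -cols // 2:cols // 2])
    mask = r <= 30                                      # keep only low spatial frequencies

    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
    print(filtered.shape)                               # (256, 256)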
APA, Harvard, Vancouver, ISO, and other styles
20

Yang, Wenwei, and 楊文衛. "Development and application of automatic monitoring system for standard penetration test in site investigation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36811919.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Miles, Shaun Graeme. "An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment." Thesis, Rhodes University, 2012. http://hdl.handle.net/10962/d1006653.

Full text
Abstract:
This thesis performs an investigation into issues concerning the broad area of Identity and Access Management, with a focus on open environments. Through literature research the issues of privacy, anonymity and access control are identified. The issue of privacy is an inherent problem due to the nature of the digital network environment. Information can be duplicated and modified regardless of the wishes and intentions of the owner of that information unless proper measures are taken to secure the environment. Once information is published or divulged on the network, there is very little way of controlling the subsequent usage of that information. To address this issue a model for privacy is presented that follows the user-centric paradigm of meta-identity. The lack of anonymity, where security measures can be thwarted through the observation of the environment, is a concern for users and systems. By an attacker observing the communication channel and monitoring the interactions between users and systems over a long enough period of time, it is possible to infer knowledge about the users and systems. This knowledge is used to build an identity profile of potential victims to be used in subsequent attacks. To address the problem, mechanisms for providing an acceptable level of anonymity while maintaining adequate accountability (from a legal standpoint) are explored. In terms of access control, the inherent weakness of single-factor authentication mechanisms is discussed. The typical mechanism is the username and password pair, which provides a single point of failure. By increasing the factors used in authentication, the amount of work required to compromise the system increases non-linearly. Within an open network, several aspects hinder wide-scale adoption and use of multi-factor authentication schemes, such as token management and the impact on usability. The framework is developed from a Utopian point of view, with the aim of being applicable to many situations as opposed to a single specific domain. The framework incorporates multi-factor authentication over multiple paths using mobile phones and GSM networks, and explores the usefulness of such an approach. The models are in turn analysed, providing a discussion of the assumptions made and the problems faced by each model.
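As a generic illustration of one widely used second factor (not the multi-path GSM scheme proposed in the thesis), a time-based one-time password can be derived from a shared secret as follows; the secret shown is a placeholder.

    # Generic TOTP (RFC 6238) illustration of a second authentication factor.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(secret_b32)
        counter = struct.pack(">Q", int(time.time()) // step)
        digest = hmac.new(key, counter, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F
        code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
        return str(code).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))   # changes every 30 seconds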
APA, Harvard, Vancouver, ISO, and other styles
22

Ndwe, Tembalethu Jama. "An investigation into the viability of deploying thin client technology to support effective learning in a disadvantaged, rural high school setting." Thesis, Rhodes University, 2002. http://hdl.handle.net/10962/d1006500.

Full text
Abstract:
Computer Based Training offers many attractive learning opportunities for high school pupils. Its deployment in economically depressed and educationally marginalized rural schools is extremely uncommon due to the high technology skills and costs involved in its deployment and ongoing maintenance. This thesis puts forward thin client technology as a potential solution to the needs of education environments of this kind. A functional business case is developed and evaluated in this thesis, based upon a requirements analysis of media delivery in learning, and upon formal cost/performance models and a deployment field trial. Because of the economic constraints of the envisaged deployment area in rural education, an industrial field trial is used, and the aspects of this trial that can be carried over to the rural school situation have been used to assess performance and cost indicators. Our study finds that thin client technology could be deployed and maintained more cost-effectively than conventional fat-client solutions in rural schools, that it is capable of supporting the learning elements needed in this deployment area, and that it is able to deliver the predominantly text-based applications currently being used in schools. However, we find that technological improvements are needed before future multimedia-intensive applications can be adequately supported.
APA, Harvard, Vancouver, ISO, and other styles
23

Adonis, Ridoh. "An empirical investigation into the information management systems at a South African financial institution." Thesis, Cape Peninsula University of Technology, 2016. http://hdl.handle.net/20.500.11838/2474.

Full text
Abstract:
Thesis (MTech (Business Administration))--Cape Peninsula University of Technology, 2016.
The study has been triggered by the increase in information breaches in organisations. Organisations may have policies and procedures, strategies and systems in place in order to mitigate the risk of information breaches; however, data breaches are still on the rise. Governments across the world have put, or are putting, in place data protection laws with which organisations have to align their processes, strategies and systems. The continuous and rapid emergence of new technology is making it even easier for information breaches to occur. In particular, the focus of this study is on the information management systems of a selected financial institution in South Africa. Based on the objectives, this study: explored the shortfalls of information security at a South African financial institution; investigated whether data remains separate while privacy is ensured; investigated the responsiveness of business processes with respect to information management; investigated the capability of systems for information management; investigated the strategies formulated for information management; and finally, investigated projects and programmes aimed at addressing information management. Quantitative as well as qualitative analysis was employed: questionnaires were sent to employees in junior management positions, and semi-structured in-depth interviews were conducted by the researcher with senior management at the organisation. These senior managers, drawn from different value chains, are responsible for implementing information management policies and strategy.
APA, Harvard, Vancouver, ISO, and other styles
24

Jones, Susan M. "An investigation of methodologies for software development prototyping." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1993. https://ro.ecu.edu.au/theses/1150.

Full text
Abstract:
The computer industry has a poor record of system development using the traditional life-cycle approach. The main cause of user dissatisfaction is the unacceptably large amount of time between specification and delivery of a system. In addition, users have limited opportunity to influence how the system will look when implemented once development has commenced. With the advent of 4GLs, system development using a prototyping approach has become a viable option. This has reduced the development time significantly and, together with the use of prototyping, has allowed users to become more involved in the development process. However, this change in the development process has meant that often the use of an accepted methodology/system life cycle has been ignored or altered. This has resulted in systems where the definition-of-requirements phase was often fast-tracked or omitted totally and the system documentation is insufficient for effective maintenance. Thus, this approach has not proved to be as successful as expected. However, the opportunities that prototyping offers should not be discarded because of the use of inappropriate software development methodologies, languages or tools. This study seeks to identify factors that may influence the success or failure of a prototyping project and to assess the importance of any development methodologies being used. Information was gathered via interviews, questionnaires and, where deemed necessary, the reviewing of development procedures used. Conclusions have been drawn from data gathered from various organisations in Western Australia that have used prototyping for a number of projects, thus suggesting a refinement of the development process. Two main areas appeared to affect the success of a software development project. The first is the lack of flexibility in the methodology used and the inappropriateness of the development tools and languages. The second is insufficient requirements analysis. The results indicate that a methodology is required that provides a good framework, but is flexible enough to handle different types and sizes of project. It should specifically address prototyping and include guidelines as to how to select the most suitable prototyping approach for each project. It should contain examples of different deliverables and various development cycles appropriate for each type of prototyping. There should be automated tools available to handle documentation and code generation where possible. The development of a methodology with the above characteristics is required if the advantages of prototyping are to be maximised in the future.
APA, Harvard, Vancouver, ISO, and other styles
25

Carrus, Elisa. "An investigation into the interaction between music and language processing : evidence from EEG and behavioural data." Thesis, Goldsmiths College (University of London), 2011. http://eprints.gold.ac.uk/7053/.

Full text
Abstract:
Music and language are both regulated by a syntactic system that allows for the combination of single elements into higher-order structures. Patel (2003) suggested that music and language processing share (neural) resources used for structural processing. Previous electrophysiological studies (e.g., Koelsch et al., 2005) investigating music-language interactions have relied exclusively on the event-related potential (ERP), which does not provide information regarding neural oscillations that are relevant for higher cognition. Further, possible interactions with pitch and melody have not been investigated, yet melody contains multiple structured relationships between individual tones. The present research aims to fill these gaps in the literature with special emphasis on melodic processing and on neural oscillations. In all experiments (two EEG and three behavioural) sentences with or without a violation in the last word were visually presented in sync with aurally presented music with or without a (harmonic or melodic) violation in the final note. The first EEG experiment investigated patterns of brain oscillations, reporting evidence of a shared activation of the delta-theta band. Next, three behavioural experiments investigated the effect of melodic violations on the processing of language. Melodic violations were constructed by a computational model of pitch expectation (Pearce, 2005), which systematically estimates the conditional probability of final notes in a melody: high and low probability notes are perceived as expected and unexpected notes, respectively. In all three experiments participants showed facilitation for processing of incorrect sentences, as reflected in a reduced language expectancy effect (faster processing for correct than for incorrect sentences) when sentences were presented on unexpected rather than expected notes; this effect was increasingly suppressed with working memory load. Finally, the second EEG experiment complemented previously reported interactions in the ERP and oscillatory domains by investigating interactions with melody. In conclusion, these studies show that a system of shared networks is activated during processing of music and language.
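As an illustration of the oscillatory measure referred to above, band power in the delta-theta range for a single EEG channel can be computed from a standard spectral estimate; the sketch below uses synthetic data and is not the analysis pipeline of the EEG experiments.

    # Illustrative delta-theta (1-7 Hz) band power from a synthetic single-channel EEG.
    import numpy as np
    from scipy.signal import welch

    fs = 250                                        # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)   # 5 Hz rhythm + noise

    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    band = (freqs >= 1) & (freqs <= 7)
    band_power = np.trapz(psd[band], freqs[band])   # integrate the PSD over the band
    print(f"delta-theta band power: {band_power:.3f}")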
APA, Harvard, Vancouver, ISO, and other styles
26

Chan, Yik-Kwan Eric, and 陳奕鈞. "Investigation of a router-based approach to defense against Distributed Denial-of-Service (DDoS) attack." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B30173309.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Kohnen, William. "A prototype investigation of a multi-GHz multi-channel analog transient recorder /." Thesis, McGill University, 1986. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=65462.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Tee, Singwhat. "An empirical investigation of factors influencing organisations to improve data quality in their information systems /." St. Lucia, Qld, 2003. http://www.library.uq.edu.au/pdfserve.php?image=thesisabs/absthe17473.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Laubscher, Robert Alan. "An investigation into the use of IEEE 1394 for audio and control data distribution in music studio environments." Thesis, Rhodes University, 1999. http://hdl.handle.net/10962/d1006483.

Full text
Abstract:
This thesis investigates the feasibility of using a new digital interconnection technology, the IEEE-1394 High Performance Serial Bus, for audio and control data distribution in local and remote music recording studio environments. Current methods for connecting studio devices are described, and the need for a new digital interconnection technology explained. It is shown how this new interconnection technology and developing protocol standards make provision for multi-channel audio and control data distribution, routing, copyright protection, and device synchronisation. Feasibility is demonstrated by the implementation of a custom hardware and software solution. Remote music studio connectivity is considered, and the emerging standards and technologies for connecting future music studios utilising this new technology are discussed.
APA, Harvard, Vancouver, ISO, and other styles
30

Cucciniello, Maria. "Investigation of the use of ICT in the modernization of the health care sector : a comparative analysis." Thesis, University of Edinburgh, 2011. http://hdl.handle.net/1842/8733.

Full text
Abstract:
This Ph.D. project started from a broad analysis investigating the key issues in the development of Information and Communication Technologies (ICT) in the health care sector, and then made an in-depth investigation to evaluate the effects of Electronic Medical Record (EMR) implementation on the organizations adopting such systems. The study examined two study settings which have adopted the same EMR system produced by the same provider. This comparative study aims, in particular, to analyse how EMR systems are adopted by different health organizations, focusing on the antecedents of the EMR project, on the implementation processes used and on the impacts produced. Diffusion theory, through the lens of a socio-technical approach, represents the theoretical framework of the analysis. The research results are based on policy evaluation and case studies. The two hospitals selected for the case study analysis are the Regional Hospital of the Local Health Authority in Aosta, Italy and the Royal Infirmary of Edinburgh, Scotland. In conducting the data collection several strategies were used: documentary analysis, interviews and observations were carried out. This work provides an overview of the key issues arising over e-health policy development through a comparative analysis of the UK and Italy, and provides an insight into how EMR systems are adopted, implemented and evaluated within acute care organizations. The thesis is comparative international research on the development of e-health and the use of ICT in the health care sector. This approach makes both a theoretical and a methodological contribution. By focusing, in particular, on EMR systems, it offers practitioners and policy makers a better basis for analysing ICT usage and its impacts on health care service delivery.
APA, Harvard, Vancouver, ISO, and other styles
31

DelFrate, Judith. "An investigation of teachers' backgrounds, usage, and attitudes towards computers in education /." The Ohio State University, 1987. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487584612163682.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Ainul, Azyan Zuliyanti Hanizan. "An investigation into the practicality of using a digital camera's RAW data in print publishing applications /." Link to online version, 2005. https://ritdml.rit.edu/dspace/handle/1850/1110.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Beauvais, Erik Alexander Maoui. "An investigation of a framework to evaluate computer supported collaborative work." Thesis, Rhodes University, 1999. http://eprints.ru.ac.za/1383/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Hutton, Alistair James. "An empirical investigation of issues relating to software immigrants." Thesis, Connect to e-thesis, 2008. http://theses.gla.ac.uk/136/.

Full text
Abstract:
Thesis (Ph.D.) - University of Glasgow, 2008.
Ph.D. thesis submitted to the Department of Computing Science, Faculty of Information and Mathematical Sciences, University of Glasgow, 2008. Includes bibliographical references. Print version also available.
APA, Harvard, Vancouver, ISO, and other styles
35

Klinkradt, Bradley Hugh. "An investigation into the application of the IEEE 1394 high performance serial bus to sound installation control." Thesis, Rhodes University, 2003. http://hdl.handle.net/10962/d1004899.

Full text
Abstract:
This thesis investigates the feasibility of using existing IP-based control and monitoring protocols within professional audio installations utilising IEEE 1394 technology. Current control and monitoring technologies are examined, and the characteristics common to all are extracted and compiled into an object model. This model forms the foundation for a set of evaluation criteria against which current and future control and monitoring protocols may be measured. Protocols considered include AV/C, MIDI, QSC-24, and those utilised within the UPnP architecture. As QSC-24 and the UPnP architecture are IP-based, the facilities required to transport IP datagrams over the IEEE 1394 bus are investigated and implemented. Example QSC-24 and UPnP architecture implementations are described, which permit the control and monitoring of audio devices over the IEEE 1394 network using these IP-based technologies. The way forward for the control and monitoring of professional audio devices within installations is considered, and recommendations are provided.
APA, Harvard, Vancouver, ISO, and other styles
36

Ferguson, Colin B., and mikewood@deakin edu au. "An investigation of the effects of microcomputers on the work of professional accountants." Deakin University. School of Accounting and Finance, 1994. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20050915.155628.

Full text
Abstract:
Information technology research over the past two decades suggests that the installation and use of computers fundamentally affects the structure and function of organisations and, m particular, the workers in these organizations. Following the release of the IBM Personal Computer in 1982, microcomputers have become an integral part of most work environments. The accounting services industry, in particular, has felt the impact of this ‘microcomputer revolution’. In Big Six accounting firms, there is almost one microcomputer for each professional accountant employed, Notwithstanding this, little research has been done on the effect of microcomputers on the work outcomes of professional accountants working in these firms. This study addresses this issue. It assesses, in an organisational setting, how accountant’ perceptions of ease of use and usefulness of microcomputers act on their computer anxieties, microcomputer attitudes and use to affect their job satisfaction and job performance. The research also examines how different types of human-computer interfaces affect the relationships between accountants' beliefs about microcomputer utility and ease of use, computer anxiety, microcomputer attitudes and microcomputer use. To attain this research objective, a conceptual model was first developed, The model indicates that work outcomes (job satisfaction and job performance) of professional accountants using microcomputers are influenced by users' perceptions of ease of use and usefulness of microcomputers via paths through (a) the level of computer anxiety experienced by users, (b) the general attitude of users toward using microcomputers, and (c) the extent to which microcomputers are used by individuals. Empirically testable propositions were derived from the model to test the postulated relationships between these constructs. The study also tested whether or not users of different human-computer interfaces reacted differently to the perceptions and anxieties they hold about microcomputers and their use in the workplace. It was argued that users of graphical interfaces, because of the characteristics of those interfaces, react differently to their perceptions and anxieties about microcomputers compared with users of command-line (or textual-based) interfaces. A passive-observational study in a field setting was used to test the model and the research propositions. Data was collected from 164 professional accountants working in a Big Six accounting firm in a metropolitan city in Australia. Structural equation modelling techniques were used to test the, hypothesised causal relationships between the components comprising the general research model. Path analysis and ordinary least squares regression was used to estimate the parameters of the model and analyse the data obtained. Multisample analysis (or stacked model analysis) using EQS was used to test the fit of the model to the data of the different human-computer interface groups and to estimate the parameters for the paths in those different groups. The results show that the research model is a good description of the data. The job satisfaction of professional accountants is directly affected by their attitude toward using microcomputers and by microcomputer use itself. However, job performance appears to be only directly affected by microcomputer attitudes. Microcomputer use does not directly affect job performance. 
Along with perceived ease of use and perceived usefulness, computer anxiety is shown to be an important determinant of attitudes toward using microcomputers - higher levels of computer anxiety negatively affect attitudes toward using microcomputers. Conversely, higher levels of perceived ease of use and perceived usefulness heighten individuals' positive attitudes toward using microcomputers. Perceived ease of use and perceived usefulness also indirectly affect microcomputer attitudes through their effect on computer anxiety. The results show that higher levels of perceived ease of use and perceived usefulness result in lower levels of computer anxiety. A surprising result from the study is that while perceived ease of use is shown to directly affect the level of microcomputer usage, perceived usefulness and attitude toward using microcomputers do not. The results of the multisample analysis confirm that the research model fits the stacked model and that the stacked model is a significantly better fit if specific parameters are allowed to vary between the two human-computer interface user groups. In general, these results confirm that an interaction exists between the type of human-computer interface (the variable providing the grouping) and the other variables in the model. The results show a clear difference between the two groups in the way in which perceived ease of use and perceived usefulness affect microcomputer attitude. In the case of users of command-line interfaces, these variables appear to affect microcomputer attitude via an intervening variable, computer anxiety, whereas in the graphical interface user group the effect occurs directly. Related to this, the results show that perceived ease of use and perceived usefulness have a significant direct effect on computer anxiety in command-line interface users, but no effect at all for graphical interface users. Of the two exogenous variables, only perceived ease of use, and only in the case of the command-line interface users, has a direct significant effect on the extent of use of microcomputers. In summary, the research has contributed to the development of a theory of individual adjustment to information technology in the workplace. It identifies certain perceptions, anxieties and attitudes about microcomputers and shows how they may affect work outcomes such as job satisfaction and job performance. It also shows that microcomputer-interface types have a differential effect on some of the hypothesised relationships represented in the general model. Future replication studies could sample a broader cross-section of the microcomputer user community. Finally, the results should help Big Six accounting firms to maximise the benefits of microcomputer use by making them aware of how working with microcomputers affects job satisfaction and job performance.
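The path-analytic estimation described in this abstract can be illustrated outside a dedicated SEM package such as EQS. The sketch below is a minimal approximation only: the variable names are hypothetical, the data are simulated, and each endogenous construct is simply regressed on its assumed predecessors with ordinary least squares, one regression per node of the path diagram.

import numpy as np

rng = np.random.default_rng(0)
n = 164  # sample size reported in the study

# Simulated stand-ins for the model's constructs (hypothetical data).
ease = rng.normal(size=n)            # perceived ease of use
useful = rng.normal(size=n)          # perceived usefulness
anxiety = -0.4 * ease - 0.3 * useful + rng.normal(scale=0.8, size=n)
attitude = 0.3 * ease + 0.4 * useful - 0.3 * anxiety + rng.normal(scale=0.7, size=n)
use = 0.35 * ease + rng.normal(scale=0.9, size=n)
satisfaction = 0.4 * attitude + 0.3 * use + rng.normal(scale=0.8, size=n)

def ols_paths(y, *predictors):
    """Return OLS path coefficients of y on the given predictors (intercept dropped)."""
    X = np.column_stack([np.ones_like(y), *predictors])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1:]

print("anxiety      <- ease, useful:          ", ols_paths(anxiety, ease, useful))
print("attitude     <- ease, useful, anxiety: ", ols_paths(attitude, ease, useful, anxiety))
print("use          <- ease, attitude:        ", ols_paths(use, ease, attitude))
print("satisfaction <- attitude, use:         ", ols_paths(satisfaction, attitude, use))

A multisample comparison of the kind reported above would fit the same regressions separately for command-line and graphical interface users and compare the resulting path coefficients across the two groups.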
APA, Harvard, Vancouver, ISO, and other styles
37

Cleal, Bryan. "Capturing communities : the account of an anthropological investigation into technology and innovation within a 'European' framework." Thesis, University of Cambridge, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.274162.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Al-Mushayt, Omar S. "An empirical investigation of factors influencing the successful treatment of organisational issues in information systems development." Thesis, Loughborough University, 2000. https://dspace.lboro.ac.uk/2134/7254.

Full text
Abstract:
There are far too many Information Systems (IS) projects which end in failure. It is widely recognised that the primary reasons for this are essentially human and organisational and rarely technical. Although it is found that the vast majority of IT specialists consider human and organisational issues to be of equal if not of greater importance than technical issues, in practice they still focus on technical aspects at the expense of human and organisational issues in Information Systems Development (ISD) and implementation. Despite the awareness of the importance of human and organisational issues in ISD, little is known about how these issues can actually be addressed. This study attempts to fill this gap by investigating empirically how, when and by whom a set of 14 specific organisational issues is treated in practice, and explores whether the treatment of this set of issues is dependent upon the employment of specific Systems Development Methods (SDM) or the successful adoption of organisationally oriented best practice factors. In excess of 2,250 questionnaires were posted to IS/IT directors in different British organisations which had over 250 employees, and 344 valid responses were received. This mail survey was followed by a series of focus group interviews with IT practitioners. It was envisaged that the integration of the two strategies would provide a very effective mechanism for combining the complementary advantages of the qualitative and quantitative research approaches. The interviews provided a richer picture of the statistical results and explored their meaning and implications. This research presents empirical evidence that the level of consideration given to organisational issues, the timing of their treatment, and the person/people responsible for the treatment during ISD significantly influence the overall level of systems' success. The findings also show that there is a significant correlation between the adoption of best practice factors and the overall success of IS and the treatment of organisational issues. There is, however, no significant relationship between the use of systems development methods and the overall success of IS or the treatment of organisational issues. These findings suggest that it is not the choice of a specific systems development method that ensures the consideration of a wide range of organisational issues, but the successful adoption of organisationally oriented best practice approaches.
APA, Harvard, Vancouver, ISO, and other styles
39

Lo, Kin-keung, and 羅建強. "An investigation of computer assisted testing for civil engineering students in a Hong Kong technical institute." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1988. http://hub.hku.hk/bib/B38627000.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Majeke, Lunga. "Preliminary investigation into estimating eye disease incidence rate from age specific prevalence data." Thesis, University of Fort Hare, 2011. http://hdl.handle.net/10353/464.

Full text
Abstract:
This study presents a methodology for estimating the incidence rate from the age-specific prevalence data of three different eye diseases. We consider both the situation where mortality may differ between persons with and without the disease and the situation where it does not. The method used was developed by Marvin J. Podgor for estimating incidence rates from prevalence data. It applies logistic regression to obtain smoothed prevalence rates, which in turn yield the incidence rate. The study concluded that logistic regression can produce a meaningful model, and that the incidence rates of these diseases were not affected by the assumption of differential mortality.
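The logic of estimating incidence from prevalence can be sketched briefly. Assuming an irreversible disease and no differential mortality, the proportion still disease-free satisfies 1 - P(a2) = (1 - P(a1)) * exp(-lambda * (a2 - a1)), so the average incidence between two ages follows from the smoothed prevalences. The code below is an illustrative sketch with simulated data and invented parameter values, using scikit-learn's logistic regression as a stand-in for the smoothing step; it is not the thesis's own implementation.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical cross-sectional survey: age and disease status (1 = disease present).
rng = np.random.default_rng(1)
age = rng.uniform(40, 80, size=2000)
p_true = 1 / (1 + np.exp(-(-6.0 + 0.08 * age)))   # assumed true prevalence curve
status = rng.binomial(1, p_true)

# Smooth the age-specific prevalence with logistic regression (large C ~ unpenalised fit).
model = LogisticRegression(C=1e6).fit(age.reshape(-1, 1), status)

def smoothed_prevalence(a):
    return model.predict_proba(np.array([[a]]))[0, 1]

def incidence_rate(a1, a2):
    """Average incidence per person-year between ages a1 and a2, assuming an
    irreversible disease and equal mortality with and without the disease."""
    p1, p2 = smoothed_prevalence(a1), smoothed_prevalence(a2)
    return -np.log((1 - p2) / (1 - p1)) / (a2 - a1)

print("Estimated incidence, ages 55-60:", incidence_rate(55, 60))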
APA, Harvard, Vancouver, ISO, and other styles
41

Gultekin, Kubra. "Knowledge Management and Law Enforcement: An Examination of Knowledge Management Strategies of the Police Information System (POLNET) in the Turkish National Police." Thesis, University of North Texas, 2009. https://digital.library.unt.edu/ark:/67531/metadc11040/.

Full text
Abstract:
This research study explores knowledge management (KM) in law enforcement, focusing on the POLNET system established by the Turkish National Police as a knowledge-sharing tool. This study employs a qualitative case study for exploratory and descriptive purposes. The qualitative data set came from semi-structured face-to-face and telephone interviews, as well as self-administered e-mail questionnaires. The sample was composed of police administrators who created POLNET, working under the Department of Information Technologies and the Department of Communication. A content analysis method is used to analyze the data. This study finds that law enforcement organizations' KM strategies differ in several respects from Handzic and Zhou's integrated KM model. In particular, the organizational culture and structure of law enforcement agencies affect knowledge creation, conversion, retrieval, and sharing processes differently. Accordingly, this study offers a new, dynamic model which suggests that outcomes always affect drivers.
APA, Harvard, Vancouver, ISO, and other styles
42

Heacock, Gregory. "An investigation of the role of virtual reality systems and their application to ophthalmic teaching, diagnosis and treatment." Thesis, King's College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.287483.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Irwin, Barry Vivian William. "Bandwidth management and monitoring for IP network traffic : an investigation." Thesis, Rhodes University, 2001. http://hdl.handle.net/10962/d1006492.

Full text
Abstract:
Bandwidth management is a topic which is often discussed, but on which relatively little work has been done with regard to compiling a comprehensive set of techniques and methods for managing traffic on a network. What work has been done has concentrated on higher end networks, rather than the low bandwidth links which are commonly available in South Africa and other areas outside the United States. With more organisations increasingly making use of the Internet on a daily basis, the demand for bandwidth is outstripping the ability of providers to upgrade their infrastructure. This resource is therefore in need of management. In addition, for Internet access to become economically viable for widespread use by schools, NGOs and other academic institutions, the associated costs need to be controlled. Bandwidth management not only impacts on direct cost control, but encompasses the process of engineering a network and network resources in order to ensure the provision of as optimal a service as possible. Included in this is the provision of user education. Software has been developed for the implementation of traffic quotas, dynamic firewalling and visualisation. The research investigates various methods for monitoring and management of IP traffic with particular applicability to low bandwidth links. Several forms of visualisation for the analysis of historical and near-realtime traffic data are also discussed, including the use of three-dimensional landscapes. A number of bandwidth management practices are proposed, and the advantages of their combination, and complementary use are highlighted. By implementing these suggested policies, a holistic approach can be taken to the issue of bandwidth management on Internet links.
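As a small illustration of the traffic-quota idea mentioned in the abstract (not the software developed for the thesis), the sketch below accumulates per-host byte counts and reports hosts that exceed a daily quota and could be throttled or dynamically firewalled; the host names and quota value are invented for the example.

from collections import defaultdict

DAILY_QUOTA_BYTES = 200 * 1024 * 1024   # assumed quota: 200 MB per host per day

usage = defaultdict(int)

def record_transfer(host, nbytes):
    """Accumulate observed traffic for a host (e.g. fed from flow or proxy logs)."""
    usage[host] += nbytes

def over_quota():
    """Return hosts that have exceeded the daily quota and are candidates for rate-limiting."""
    return [h for h, b in usage.items() if b > DAILY_QUOTA_BYTES]

# Example feed of (host, bytes) records.
for host, nbytes in [("lab-pc-01", 150_000_000), ("lab-pc-01", 90_000_000),
                     ("lab-pc-02", 20_000_000)]:
    record_transfer(host, nbytes)

print("Hosts to throttle:", over_quota())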
APA, Harvard, Vancouver, ISO, and other styles
44

Russ, Keith David. "An investigation into the application of computers for the processing of survey and planning data for 2D and 3D interpretation." Thesis, University of Exeter, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.260748.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Schlenkrich, Lara. "An investigation of social computing." Thesis, Rhodes University, 2009. http://hdl.handle.net/10962/d1006194.

Full text
Abstract:
Social network sites have recently become extremely popular online destinations as they offer users easy ways to build and maintain their relationships with each other. Consequently, students, lecturers, teachers, parents and businesses are using these tools to communicate with each other in a fast and cost-effective manner. However, literature suggests that the full potential of social network sites has not yet been revealed since users are still battling to overcome the various negative characteristics surrounding these sites. A framework for appropriate use of these sites is needed so that users are able to overcome these negative aspects, allowing them to be more effective and use the sites successfully. The goal of this research is to construct a framework for perceived successful use of social computing tools in educational institutions. This framework will include critical success factors that need to be adopted by users in order to develop the positive aspects of social computing, while at the same time overcoming the disadvantages experienced by users. Factors for successful use were derived from the literature and consolidated into a theoretical framework in order to understand the factors that drive successful use of social network sites. Measures used to test successful use of social network sites were also derived from these sources and were included in the same theoretical framework; these measures allow users to evaluate the extent of perceived successful use of social network sites. This framework was tested empirically by means of a pilot study and online survey, and revised according to the results of the survey. The factors were identified using Cronbach alpha coefficients (in the pilot study) and exploratory factor analysis to confirm the reliability of the scales developed. Pearson product-moment correlation coefficient analysis, t-tests and Pearson Chi-Square tests were used to measure the relationships amongst the variables in the framework proposed in this research. The factors influencing perceived successful use of social network sites were identified by the empirical study as:
• Privacy and Security Settings need to be enabled. These are split into:
- Settings: content that users allow others to see
- Viewers: people who are allowed onto a user's profile
• It is necessary for users to practise Legal and Acceptable Activities when using social network sites
• Suspect Information needs to be checked before sharing it with others
• Personal and Professional Time needs to be separated to ensure that work is completed before social activities occur
• Users need to practise Professional and Ethical Behaviour
• Users need to have a Positive Attitude when using social network sites
• Usability of sites affects their success. This includes:
- technical capacity (broadband)
- ease of use
- functionality (range of features and functions)
• Current and Controversial Issues need to be discussed on social network sites.
The extent to which social network sites are being used successfully can be evaluated by the presence of the following measures:
• Range of Content must be available to users. This includes:
- Content displayed on profiles
- Viewers able to visit profiles
• Visitors' Behaviour is monitored and no unwanted visitors are present on users' profiles
• Social Contracts found on sites are followed by users
• Critical Thinking Skills and Accurate Information are displayed by users
• Work is completed before social activities occur on sites
• A Variety of Users is present on sites
• Collaboration between people as well as a variety of opinions exist on sites
• Social Capital (well-being) is present after users have been on sites
• Learning and Advising Skills are enhanced on sites.
The framework developed provides users with a useful instrument to overcome the negative characteristics associated with social network sites. If used successfully, social network sites can offer lecturers and students a unique method to develop their relationships, creating a positive learning experience.
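The scale-reliability step mentioned above (Cronbach alpha coefficients from the pilot study) can be made concrete with a short sketch. Cronbach's alpha is k/(k-1) * (1 - sum of item variances / variance of the total score); the responses below are simulated and purely illustrative, not the study's data.

import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point responses from 8 respondents to a 4-item scale.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
    [5, 4, 4, 5],
    [2, 2, 2, 3],
])
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))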
APA, Harvard, Vancouver, ISO, and other styles
46

Tang, Shijun. "Investigation on Segmentation, Recognition and 3D Reconstruction of Objects Based on LiDAR Data Or MRI." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc801920/.

Full text
Abstract:
Segmentation, recognition and 3D reconstruction of objects have been cutting-edge research topics, with applications ranging from environmental and medical uses to geographical applications and intelligent transportation. In this dissertation, I focus on the study of segmentation, recognition and 3D reconstruction of objects using LiDAR data/MRI. Three main pieces of work are presented. (I) Feature extraction based on sparse LiDAR data: a novel method has been proposed for feature extraction from sparse LiDAR data. The algorithm and the related principles are described, and the choices and roles of parameters are tested and discussed. By using the correlation of neighboring points directly, the statistical distribution of normal vectors at each point is used effectively to determine the category of the selected point. (II) Segmentation and 3D reconstruction of objects based on LiDAR/MRI: the 3D LiDAR data are layered, different categories are segmented, and 3D canopy surfaces of individual tree crowns and clusters of trees are reconstructed from LiDAR point data based on a region-based active contour model. The proposed method allows 3D forest canopy to be delineated naturally from the contours of raw LiDAR point clouds. The proposed model is suitable not only for a series of ideal cone shapes, but also for other kinds of 3D shapes as well as other kinds of datasets such as MRI. (III) Novel algorithms for recognition of objects based on LiDAR/MRI: aimed at sparse LiDAR data, the feature extraction algorithm has been proposed and applied to classify buildings and trees. More importantly, novel algorithms based on level set methods have been developed and employed to recognize not only buildings and trees, and different tree species (e.g. oak trees and Douglas firs), but also the subthalamic nuclei (STNs). Using these level-set-based algorithms, a 3D model of the subthalamic nuclei in the brain has been successfully reconstructed, based on the statistical data of previous investigations of an anatomical atlas as reference. The 3D rendering of the subthalamic nuclei and the skull directly from MR imaging is also utilized to determine the 3D coordinates of the STNs in the brain. In summary, novel methods and algorithms for segmentation, recognition and 3D reconstruction of objects have been proposed. The related experiments have been done to test and confirm the validity of the proposed methods. The experimental results also demonstrate the accuracy, efficiency and effectiveness of the proposed methods. A framework for segmentation, recognition and 3D reconstruction of objects has been established, which has been applied to many research areas.
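The normal-vector statistics used in part (I) can be approximated with standard tools: estimate a normal at each LiDAR point from the covariance of its nearest neighbours, then use the distribution of normals to separate roughly planar returns (such as roofs) from scattered returns (such as tree canopy). The sketch below is an illustrative approximation with an invented point cloud and parameter values, not the algorithm developed in the dissertation.

import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=10):
    """Estimate a unit normal per point as the eigenvector of the neighbourhood
    covariance with the smallest eigenvalue (classic PCA-based estimate)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        normals[i] = eigvecs[:, 0]          # smallest eigenvalue -> surface normal
    return normals

# Hypothetical sparse point cloud: a flat patch plus random scatter.
rng = np.random.default_rng(2)
plane = np.column_stack([rng.uniform(0, 10, 200), rng.uniform(0, 10, 200), np.zeros(200)])
scatter = rng.uniform(0, 10, size=(200, 3))
cloud = np.vstack([plane, scatter])

normals = estimate_normals(cloud)
verticality = np.abs(normals[:, 2])         # |n_z| near 1 suggests a planar, roof-like point
print("Mean |n_z| (planar patch):   ", verticality[:200].mean())
print("Mean |n_z| (scattered points):", verticality[200:].mean())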
APA, Harvard, Vancouver, ISO, and other styles
47

Zou, Haichuan. "Investigation of hardware and software configuration on a wavelet-based vision system--a case study." Thesis, Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/8719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Andress, John Charles. "Development of the BaBar trigger for the investigation of CP violation." Thesis, University of Bristol, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.322355.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Funk, Antje Elisabeth Margarete. "Criminal liability of Internet providers in Germany and other jurisdictions." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/70134.

Full text
Abstract:
Thesis (LLM)--Stellenbosch University, 2004
ENGLISH ABSTRACT: This thesis deals with the criminal liability of Internet providers. The focus is on Germany, but the analysis is put in a wider, comparative context. This is done with reference to South Africa, as well as Europe and the American system. This thesis demonstrates and discusses the existing legal norms that regulate Internet provider liability for illegal content on the Internet and the international efforts to deal with this issue. In the introduction it is shown how the Internet has given rise to a new form of global communication and the accompanying legal problems. This is followed by an examination of the different functions Internet providers have. A survey of some of the important crimes affecting the Internet and also some Internet-specific offences puts the more general issue of liability in a more specific context. Traditional and new forms of crimes are discussed. This section is followed by an analysis of Internet provider liability under German criminal law and Germany's Teleservices Act. From an international criminal law perspective, some international instruments, such as the Cybercrime Convention of the Council of Europe, are discussed. National legislation, especially in the context of the European Union, must always be put in the proper regional and international context. The thesis concludes with some thoughts on alternative, or perhaps complementary, methods to fight illegal and criminal conduct on the Internet. This is done not as a critique of the responses to Internet crime, but rather to strengthen the many hands trying to reduce Internet crime.
AFRIKAANSE OPSOMMING: This thesis deals with the criminal liability of Internet service providers. The focus falls on Germany, but the analysis is also placed in a wider, comparative context. This is done with reference to South Africa, as well as Europe and the USA. The thesis demonstrates and discusses the existing legal norms that regulate Internet service providers, with specific reference to liability for illegal content on the Internet and international efforts to address this problem. By way of introduction it is shown how the Internet gives rise to new forms of global communication and the legal problems that result from it. This is followed by an examination of the different functions of Internet providers. An analysis and discussion of Internet-specific offences places the more general question in a more focused context. Traditional and new forms of crime are discussed. This section is followed by an analysis of Internet service provider liability under German law and the German legislation in the field of teleservices. From an international criminal law perspective, certain international instruments, such as the Cybercrime Convention of the Council of Europe, are discussed. National legislation, especially in the context of the European Union, is also placed in the relevant regional and international context. The thesis concludes with some thoughts on alternative, or possibly complementary, methods in the fight against Internet crime. This should not be seen as criticism of the current state of affairs, but rather as an attempt to strengthen the hands of the many role players in the fight against Internet crime.
APA, Harvard, Vancouver, ISO, and other styles
50

Williams, Patricia A. "An investigation into information security in general medical practice." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2007. https://ro.ecu.edu.au/theses/274.

Full text
Abstract:
Increased demand by governments and patients for better healthcare communication has seen a growth in the adoption of electronic medical records, with general practice as the cornerstone of this distributed environment. In this progressively more electronic state, general practice is charged with the responsibility to ensure the confidentiality and privacy of patient information. However, evidence suggests that the protection of patient information is poorly handled in general practice. The deficiency in awareness of vulnerability and risk, together with the lack of appropriate controls and knowledge, leaves medical practice insecure and potentially vulnerable to information security breaches.
APA, Harvard, Vancouver, ISO, and other styles
