To see the other types of publications on this topic, follow the link: Software crisis.

Dissertations on the topic "Software crisis"


Consult the top 26 dissertations for your research on the topic "Software crisis".

Next to every source in the list of references there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be generated automatically in the required citation style: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its online annotation, provided the relevant parameters are available in the metadata.

Browse dissertations in various fields of specialisation and compile your bibliography correctly.

1

Pelaez, Valdez Maria Eloina. „A gift from Pandora's Box : the software crisis“. Thesis, University of Edinburgh, 1988. http://hdl.handle.net/1842/19228.

2

Santos, Daniel Soares. „Quality Evaluation Model for Crisis and Emergency Management Systems-of-Systems“. Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-10072017-162919/.

Annotation:
Systems-of-Systems (SoS) play an important, even essential, role in society. They are complex software-intensive systems that result from the interoperability of independent constituent systems working together to achieve more complex missions. SoS have emerged especially in critical application domains, and therefore a high level of quality must be assured during their development and evolution. However, dealing with the quality of SoS still presents great challenges, as SoS have a set of unique characteristics that can directly affect their quality. Moreover, there are no comprehensive models that can support the quality evaluation of SoS. Motivated by this scenario, the main contribution of this Master's project is a SoS quality evaluation model, addressing specifically the crisis/emergency management domain, built in the context of a large international research project. The proposed model covers important evaluation activities and considers SoS characteristics and challenges not usually addressed by other models. The model was applied to evaluate a crisis/emergency management SoS, and our results have shown its viability for the effective management of SoS quality.
3

Nourjou, Reza. „GIS-based Intelligent Assistant Agent for Supporting Decisions of Incident Commander in Disaster Response“. 京都大学 (Kyoto University), 2014. http://hdl.handle.net/2433/188867.

4

Saoutal, Amina. „Amélioration de l'awareness informationnelle dans la collaboration inter-organisations pendant la gestion de crise“. Thesis, Troyes, 2015. http://www.theses.fr/2015TROY0036/document.

Annotation:
Significant technological and social obstacles have been identified in supporting collaboration between crisis actors: computer systems designed for this purpose often fail to meet users' needs, and overly rigid systems cannot support dynamic situations in which events are unexpected and call for emergent measures. To overcome these obstacles, our work is positioned in the field of Computer-Supported Cooperative Work (CSCW), characterized by its dual social and technical aspects. This research proposes a flexible information and communication system that supports information awareness in inter-agency collaboration during emergent and complex situations such as crises. These situations add several constraints to collaborative work, notably stress, unpredictability, the multitude of actors, and organizational boundaries. In crisis management, the various organizations - emergency medical services, firefighters, police and others - need to perceive the information useful to them in order to complete their inter-agency activities. However, actors encounter problems that prevent them from reaching their goals. With the help of the social sciences, which rely on the study of actual practices to understand and analyze users, their activities and the work environment, this study makes a contribution to computer science that is open to interdisciplinarity.
5

Sonnekus, Michael Hendrik. „A comparison of open source and proprietary digital forensic software“. Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1017939.

Annotation:
Scrutiny of the capabilities and accuracy of computer forensic tools is increasing as the number of incidents relying on digital evidence, and the weight of that evidence, increases. This thesis describes the capabilities of the leading proprietary and open source digital forensic tools. The capabilities of the tools were tested separately on digital media that had been formatted using Windows and Linux. Experiments were carried out with the intention of establishing whether the capabilities of open source computer forensic tools are similar to those of proprietary tools, and whether these tools could complement one another. The tools were tested with regard to their capabilities to make and analyse digital forensic images in a forensically sound manner. The tests were carried out on each media type after deleting data from the media, and then repeated after formatting the media. The results of the experiments demonstrate that both proprietary and open source computer forensic tools have superior capabilities in different scenarios, and that the toolsets can be used to validate and complement one another. The implication of these findings is that investigators have an affordable means of validating their findings and are able to investigate digital media more effectively.
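The forensically sound acquisition the tools are tested for hinges on digest verification: an image is only trusted if its cryptographic hash matches that of the source medium. A minimal sketch of that check in Python (the device and image paths are hypothetical):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file (or raw device node) in chunks to avoid loading it into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: a source device and the image acquired from it.
if sha256_of("/dev/sdb") == sha256_of("evidence/sdb.dd"):
    print("acquisition verified: image digest matches the source")
else:
    print("hash mismatch: the image is not a forensically sound copy")
```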
6

Chitturi, Kiran. „Building CTRnet Digital Library Services using Archive-It and LucidWorks Big Data Software“. Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/46865.

Annotation:
When a crisis occurs, information flows rapidly through the Web via social media, blogs, and news articles. The shared information captures the reactions, impacts, and responses of the government as well as the public. Later, researchers, scholars, students, and others seek information about earlier events, sometimes for cross-event analysis or comparison. There are very few integrated systems that try to collect and permanently archive the information about an event and provide access to the crisis information at the same time. In this thesis, we describe the CTRnet Digital Library and Archive, which aims to permanently archive crisis event information by using Archive-It services and then provide access to the archived information by using LucidWorks Big Data software. Through the LucidWorks Big Data (LWBD) software, we take advantage of text extraction, clustering, similarity, annotation, and indexing services and build digital libraries with the generated metadata, which helps the system stakeholders locate information about an event. For this study, we collected data for 46 crisis events using Archive-It. We built a CTRnet DL prototype and its services for the "Boston Marathon Bombing" collection by using the components of LucidWorks Big Data. Running LucidWorks Big Data on a 30-node Hadoop cluster accelerates the sub-workflow processing and also provides fault-tolerant execution. The LWBD sub-workflows "ingest" and "extract" processed the textual data present in the WARC files. The other sub-workflows, "kmeans", "simdoc", and "annotate", helped in grouping the search results, deleting duplicates, and providing metadata for additional facets in the CTRnet DL prototype, respectively.
Master of Science
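The grouping performed by a "kmeans" sub-workflow can be illustrated with a small, self-contained sketch: TF-IDF vectors over toy documents, clustered with scikit-learn's k-means. This is a stand-in for the idea only, not the LucidWorks Big Data pipeline itself, and the sample texts are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy stand-ins for text extracted from archived WARC files.
docs = [
    "marathon bombing suspect manhunt boston",
    "boston police press conference after bombing",
    "hurricane evacuation routes closed by flooding",
    "flooding damage assessed after hurricane landfall",
]

vectors = TfidfVectorizer().fit_transform(docs)  # one TF-IDF vector per document
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
for label, doc in sorted(zip(labels, docs)):
    print(label, doc)
```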
7

Judd, Aaron C. „Improved Network Security and Disguising TCP/IP Fingerprint Through Dynamic Stack Modification /“. Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Sep%5FJudd.pdf.

8

Santos, Rodrigo Antônio dos. „Criminalidade em Goiânia: mapeamento dos crimes contra a pessoa nos contextos sociais de 2010 a 2014“. Universidade Federal de Goiás, 2016. http://repositorio.bc.ufg.br/tede/handle/tede/6604.

Annotation:
This work aimed to map, by neighborhood of Goiânia, the crimes against the person recorded in the period 2010-2014. With the support of the GIS software ArcGIS, crime data were cross-referenced with demographic and criminal variables, such as drug dealing, association with drug dealing, and the colour/race, gender and income of the population. The chosen crimes against the person were intentional homicide, manslaughter, bodily injury followed by death, and robbery with homicide (the latter being a crime against property). In the end, the rapid growth of violence within the city of Goiânia can be seen, as well as the neighborhoods with the highest concentration of crime and their correlation with the demographic variables.
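The cross-referencing described above, joining per-neighborhood crime counts to demographic variables, can be sketched with pandas; the thesis itself did this in ArcGIS, and the neighborhoods, counts and incomes below are invented for illustration:

```python
import pandas as pd

# Hypothetical per-record crimes and per-neighborhood demographics.
crimes = pd.DataFrame({
    "neighborhood": ["Centro", "Centro", "Campinas", "Bueno", "Centro"],
    "crime": ["intentional homicide", "robbery with homicide",
              "intentional homicide", "bodily injury", "manslaughter"],
})
demographics = pd.DataFrame({
    "neighborhood": ["Centro", "Campinas", "Bueno"],
    "median_income": [1800, 1200, 3500],  # hypothetical monthly income (BRL)
})

counts = crimes.groupby("neighborhood").size().rename("crime_count").reset_index()
table = counts.merge(demographics, on="neighborhood")
print(table)
print(table[["crime_count", "median_income"]].corr())  # crude correlation check
```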
9

Forrester, Jock Ingram. „An exploration into the use of webinjects by financial malware“. Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1012079.

Annotation:
As the number of computing devices connected to the Internet increases and the Internet itself becomes more pervasive, so does the opportunity for criminals to use these devices in cybercrimes. Supporting the increase in cybercrime is the growth and maturity of the digital underground economy with strong links to its more visible and physical counterpart. The digital underground economy provides software and related services to equip the entrepreneurial cybercriminal with the appropriate skills and required tools. Financial malware, particularly the capability for injection of code into web browsers, has become one of the more profitable cybercrime tool sets due to its versatility and adaptability when targeting clients of institutions with an online presence, both in and outside of the financial industry. There are numerous families of financial malware available for use, with perhaps the most prevalent being Zeus and SpyEye. Criminals create (or purchase) and grow botnets of computing devices infected with financial malware that has been configured to attack clients of certain websites. In the research data set there are 483 configuration files containing approximately 40 000 webinjects that were captured from various financial malware botnets between October 2010 and June 2012. They were processed and analysed to determine the methods used by criminals to defraud either the user of the computing device, or the institution of which the user is a client. The configuration files contain the injection code that is executed in the web browser to create a surrogate interface, which is then used by the criminal to interact with the user and institution in order to commit fraud. Demographics on the captured data set are presented and case studies are documented based on the various methods used to defraud and bypass financial security controls across multiple industries. The case studies cover techniques used in social engineering, bypassing security controls and automated transfers.
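Webinject configuration files of the Zeus/SpyEye family follow a simple block format: a set_url line naming the target page, then data_before, data_inject and data_after sections, each terminated by data_end, which tell the malware what HTML to splice in and where. A simplified parser sketch (the configuration text is invented, and real files carry more flags and edge cases):

```python
import re

# Invented, simplified webinject entry in the Zeus/SpyEye style.
config = """
set_url https://bank.example/login* GP
data_before
<div id="login">
data_end
data_inject
<input type="text" name="atm_pin">
data_end
data_after
</div>
data_end
"""

# Each set_url block names a target page and three data_end-terminated sections.
pattern = re.compile(
    r"set_url\s+(\S+).*?"
    r"data_before\s*(.*?)\s*data_end\s*"
    r"data_inject\s*(.*?)\s*data_end\s*"
    r"data_after\s*(.*?)\s*data_end",
    re.DOTALL)

for url, before, inject, after in pattern.findall(config):
    print(f"{url}: inject {inject!r} between {before!r} and {after!r}")
```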
10

Ransbotham, III Samuel B. „Acquisition and diffusion of technology innovation“. Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28094.

Annotation:
Thesis (M. S.)--Management, Georgia Institute of Technology, 2008.
Committee Chair: Sabyasachi Mitra; Committee Member: Frank Rothaermel; Committee Member: Sandra Slaughter; Committee Member: Sridhar Narasimhan; Committee Member: Vivek Ghosal.
11

OURIQUES, João Felipe Silva. „Investigation of Test Case Prioritization for Model-Based Testing“. Universidade Federal de Campina Grande, 2017. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1674.

Annotation:
Software testing is an expensive and laborious Verification and Validation task. Nevertheless, testing is still the main way of assessing the quality of systems under development. In Model-Based Testing (MBT), test cases are generated automatically from system models, which reduces time and cost. On the other hand, the size of the generated test suites frequently makes a complete execution unfeasible. Therefore, some approaches are used to deal with the costs involved in test case execution, for example Test Case Selection, Test Suite Reduction and Test Case Prioritization (TCP). In order to improve resource consumption during system-level testing, while not eliminating test cases, we explore TCP in our work. In this doctoral research we address the general problem of proposing a new order for the execution of system-level test cases, specifically the ones generated through MBT approaches, considering that historical information is not available to guide this process. As a first step, we evaluate current techniques in this context, aiming to assess their strengths and weaknesses. Based on these results, we propose two TCP techniques: HARP and CRISPy. The former uses indications (hints) provided by development team members regarding error-prone portions of the system under test as guidance for prioritization; the latter clusters test cases in order to add variability to the prioritization. HARP is based on the adaptive random strategy and explores test cases that cover steps related to the hints. To validate it, we perform two studies, from which we conclude that, with good hints, HARP can suggest effective prioritized test suites, provided that one avoids situations in which the hints are not consensual within the development team. In turn, we propose CRISPy to complement HARP, i.e. to apply when HARP is not indicated. To validate it we perform three studies and, even though it is a simple clustering technique, it already proposes effective prioritized test suites in comparison to current TCP techniques. Moreover, depending on the kind of test suites involved, choosing an adequate distance function can be a necessary tuning step. We detail the implementation of all investigated and proposed techniques, and publish the artifacts related to the empirical studies on a companion site, enabling researchers to verify and reproduce our findings.
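A minimal sketch of the clustering idea behind a technique like CRISPy (not the authors' implementation): vectorize each test case by its model steps, cluster the vectors, then draw test cases from the clusters round-robin so that dissimilar behaviours are exercised early. The test cases and step names are hypothetical:

```python
import itertools
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical MBT test cases, each a sequence of model steps.
tests = {
    "t1": "open_account deposit check_balance",
    "t2": "open_account deposit withdraw",
    "t3": "login view_statement logout",
    "t4": "login transfer logout",
}

# Represent each test case by its step counts, then cluster by similarity.
X = CountVectorizer().fit_transform(tests.values()).toarray()
labels = AgglomerativeClustering(n_clusters=2).fit_predict(X)

# Round-robin across clusters so early positions cover dissimilar behaviours.
clusters = {c: [t for t, l in zip(tests, labels) if l == c] for c in set(labels)}
order = [t for batch in itertools.zip_longest(*clusters.values())
         for t in batch if t is not None]
print(order)  # e.g. one account test, one login test, then the rest
```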
12

Koen, Renico. „The development of an open-source forensics platform“. Diss., Pretoria : [s.n.], 2009. http://upetd.up.ac.za/thesis/available/etd-02172009-014722/.

13

Ransbotham, Samuel B. III. „Acquisition and diffusion of technology innovation“. Diss., Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28094.

Annotation:
In the first essay, I examine value created through external acquisition of nascent technology innovation. External acquisition of new technology is a growing trend in the innovation process, particularly in high technology industries, as firms complement internal efforts with aggressive acquisition programs. Yet, despite its importance, there is little empirical research on the timing of acquisition decisions in high technology environments. I examine the impact of target age on value created for the buyer. Applying an event study methodology to technology acquisitions in the telecommunications industry from 1995 to 2001, empirical evidence supports acquiring early in the face of uncertainty. The equity markets reward the acquisition of younger companies. In sharp contrast to the first essay, the second essay examines the diffusion of negative innovations. While destruction can be creative, certainly not all destruction is creative. Some is just destruction. I examine two fundamentally different paths to information security compromise: an opportunistic path and a deliberate path. Through a grounded approach using interviews, observations, and secondary data, I advance a model of the information security compromise process. Using one year of alert data from intrusion detection devices, empirical analysis provides evidence that these paths follow two distinct, but interrelated, diffusion patterns. Although distinct, I find empirical evidence that these paths both converge and escalate. Beyond the specific findings in the Internet security context, the study leads to a richer understanding of the diffusion of negative technological innovation. In the third essay, I build on the second essay by examining the effectiveness of reward-based mechanisms in restricting the diffusion of negative innovations. Concerns have been raised that reward-based private infomediaries introduce information leakage which decreases social welfare. Using two years of alert data, I find evidence of their effectiveness despite any leakage which may be occurring. While reward-based disclosures are just as likely to be exploited as non-reward-based disclosures, exploits from reward-based disclosures are less likely to occur in the first week after disclosure. Further, the overall volume of alerts is reduced. This research helps determine the effectiveness of reward mechanisms and provides guidance for security policy makers.
14

Rastogi, Achal. „Phaeodactylum tricornutum genome and epigenome : characterization of natural variants“. Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEE048/document.

Annotation:
Since the discovery of Phaeodactylum tricornutum by Bohlin in 1897, its classification within the tree of life has been controversial. It was in 1958, when Lewin, using oval and fusiform morphotypes, described multiple characteristic features of this species that resemble diatom structure, that the debate over whether to classify P. tricornutum as a member of the Bacillariophyceae was ended. To date, three morphotypes (oval, fusiform and triradiate) of Phaeodactylum tricornutum have been observed. Over the course of approximately 100 years, from 1908 until 2000, 10 strains of Phaeodactylum tricornutum (referred to as ecotypes) have been collected and stored axenically as cryopreserved stocks at various stock centers. Various cellular and molecular tools have been established to dissect and understand the physiology and evolution of P. tricornutum, and of diatoms in general. It is because of decades of research and effort by many laboratories that P. tricornutum is now considered a model diatom species. My thesis focuses chiefly on understanding the genetic and epigenetic makeup of the P. tricornutum genome in order to decipher the underlying morphological and physiological diversity within different ecotype populations. To do so, I established the epigenetic landscape of the P. tricornutum genome using various histone post-translational modification marks (chapters 1 and 2) and also compared the natural variation in the distribution of some key histone PTMs between two ecotype populations (chapter 4). We also generated a genome-wide genetic diversity map across 10 ecotypes of P. tricornutum, revealing the presence of a species complex within the genus Phaeodactylum as a consequence of ancient hybridization (chapter 3). Based on the evidence from many previous reports and similar observations within P. tricornutum, we propose natural hybridization as a strong and plausible foundation for explaining the unprecedented species diversity within the diatom clade. Moreover, we updated the functional and structural annotations of the P. tricornutum genome (Phatr3, chapter 2) and developed a user-friendly software algorithm to fetch CRISPR/Cas9 targets, a basis for performing knockout studies with the CRISPR/Cas9 genome editing protocol, in 13 phytoplankton genomes including P. tricornutum (chapter 5). To accomplish all this, I used various state-of-the-art technologies such as mass spectrometry, ChIP sequencing, whole genome sequencing, RNA sequencing, CRISPR genome editing protocols and several computational software packages/pipelines. In brief, this thesis provides a comprehensive platform for future epigenetic, genetic and functional molecular studies in diatoms using Phaeodactylum tricornutum as a model. The work adds value to the current state of diatom research by answering questions that have never been asked before and opens a new horizon for epigenetics research, which underlies the ecological success of diatoms in the modern-day ocean.
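The core of fetching CRISPR/Cas9 targets can be sketched briefly: scan a sequence for 20-nt protospacers immediately followed by the SpCas9 "NGG" PAM. The thesis tool works genome-wide with additional checks; this toy version assumes a single forward strand and an invented sequence:

```python
import re

def cas9_targets(seq: str):
    """Yield (position, protospacer, PAM) for every 20-mer followed by NGG."""
    seq = seq.upper()
    # The lookahead keeps overlapping sites; PAM = any base followed by GG.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        yield m.start(), m.group(1), m.group(2)

# Invented toy sequence, not from P. tricornutum.
for pos, spacer, pam in cas9_targets("ATGCGTACCGTTAGCATGCCAGGTTACGATCGATCGTACGGTAGCTAGG"):
    print(pos, spacer, pam)
```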
15

Ramos, Ana Rita Galocha. „Desenvolvimento de um equipamento para diagnóstico do pé diabético“. Master's thesis, Universidade de Aveiro, 2014. http://hdl.handle.net/10773/14072.

Annotation:
In recent decades, the incidence of Diabetes Mellitus has been increasing, accompanying the increase in population obesity. This chronic, metabolic disease causes not only a decrease in quality of life, morbidity and mortality, but also high costs with serious social and economic consequences. It therefore becomes essential to improve the means of diagnosing and monitoring disease-related complications. Among the most serious and costly complications, especially of Diabetes Mellitus type 2, is Diabetic Foot disease, the main cause of approximately half of lower limb amputations from non-traumatic causes. Early identification of the Diabetic Foot is critical for the adoption of preventive measures that can modify the disease prognosis. The aim of this dissertation project, carried out in synergy with the company Exatronic, is the study and collection of information to develop a technological platform for Diabetic Foot diagnosis. A market study was performed first, to identify the devices currently used in this type of diagnosis and to define the main needs and methodologies to take into account in the design of a prototype. After this study, it was decided to apply thermography in the construction of the equipment, since thermal analysis has demonstrated wide applicability in the medical diagnosis of various pathologies, and any patient with Diabetic Foot invariably shows notable changes in the skin temperature of the foot. Liquid crystal technology, with its fast response times, high sensitivity to thermal changes, low cost and easy image acquisition, offered a successful thermographic approach for the desired application. For the prototype, an adapted scanner containing a liquid crystal sheet was therefore used to acquire thermal patterns of the patients' soles. As part of the planned platform, a computer interface was developed in MATLAB which segments and digitally stores the images obtained by the equipment, to effectively assist the health professional in interpreting the results. Subsequently, the main steps of the medical certification process were analysed, in order to enable future CE marking, which is essential for placing the device on the market; to this end, compliance with the medical directive 93/42/EEC was verified. Given the current lack of instruments for measuring foot temperatures in medical check-ups for this pathology, and the high costs of most medical platforms, the technology presented in this project has attracted considerable interest from the medical community as a way to improve the prevention and monitoring of this complication.
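The segmentation step of such an interface, separating the warm sole from the scanner background, can be illustrated with a simple global threshold. The dissertation's interface was written in MATLAB; this sketch uses Python with an invented miniature image as a stand-in for a scanned liquid-crystal sheet:

```python
import numpy as np

# Invented miniature grayscale scan; higher values = warmer liquid-crystal response.
image = np.array([
    [0, 0, 1, 1, 0],
    [0, 7, 8, 7, 0],
    [0, 9, 9, 8, 0],
    [0, 6, 7, 6, 0],
    [0, 0, 1, 0, 0],
], dtype=float)

threshold = image.mean() + image.std() / 2  # crude global threshold
sole = image > threshold                    # boolean mask of the warm sole region
print("segmented pixels:", int(sole.sum()))
print("mean sole intensity:", round(float(image[sole].mean()), 2))
```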
16

Coetzee, Dirk. „Visualisation of PF firewall logs using open source“. Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1018552.

Annotation:
"If you cannot measure, you cannot manage." This is an age-old saying, but still very true, especially within the current South African cybercrime scene and the ever-growing Internet footprint. Due to the significant increase in cybercrime across the globe, information security specialists are starting to see the intrinsic value of logs that can "tell a story". Logs do not only tell a story, but also provide a tool to measure a normally dark force within an organisation. Collecting current logs from installed systems, operating systems and devices is imperative in the event of a hacking attempt, data leak or data theft, whether the attempt is successful or not. No logs mean no evidence, and in many cases not even the opportunity to find the mistake or fault in the organisation's defence systems. It remains difficult to choose which logs an organisation requires. A number of questions should be considered: should a centralised or decentralised approach to collecting these logs be followed, or a combination of both? How many events will be collected, how much additional bandwidth will be required, and will the log collection be near real time? How long must the logs be kept, and what hashing and encryption (for data integrity), if any, should be used? Lastly, what system should be used to correlate and analyse the logs and to make alerts and reports available? This thesis addresses these questions, examining the current lack of log analysis and of practical implementations in modern organisations, and how this need can be met by means of a basic approach. South African organisations must use the technology at hand in order to know what electronic data enter and leave their networks. Concentrating only on FreeBSD PF firewall logs, this thesis demonstrates that excellent results are possible when logs are collected to obtain a visual display of what data is traversing the corporate network and which parts of that data pose a threat. This threat is easily determined through a visual interpretation of statistical outliers. The thesis aims to show that in the field of corporate data protection, if you can measure, you can manage.
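The "statistical outliers" idea reduces to aggregating log lines per source and flagging hosts far from the norm. A minimal sketch with invented per-IP counts (real input would be parsed from pflog/tcpdump output):

```python
from statistics import mean, stdev

# Hypothetical blocked-connection counts per source IP, parsed from PF logs.
blocked = {"10.0.0.1": 4, "10.0.0.2": 3, "10.0.0.3": 5, "10.0.0.4": 6,
           "10.0.0.5": 4, "10.0.0.6": 5, "10.0.0.7": 3, "10.0.0.8": 6,
           "10.0.0.9": 4, "10.0.0.10": 5, "192.0.2.66": 48}

mu, sigma = mean(blocked.values()), stdev(blocked.values())
for ip, n in blocked.items():
    if n > mu + 2 * sigma:  # crude z-score rule: flag hosts far above the norm
        print(f"outlier: {ip} blocked {n} times (mean {mu:.1f}, sd {sigma:.1f})")
```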
17

Price, Erin Peta. „Development of novel combinatorial methods for genotyping the common foodborne pathogen Campylobacter jejuni“. Thesis, Queensland University of Technology, 2007. https://eprints.qut.edu.au/16601/1/Erin_Peta_Price_Thesis.pdf.

Annotation:
Campylobacter jejuni is the commonest cause of bacterial foodborne gastroenteritis in industrialised countries. Despite its significance, it remains unclear how C. jejuni is disseminated in the environment, whether particular strains are more pathogenic than others, and by what routes this bacterium is transmitted to humans. One major factor hampering this knowledge is the lack of a standardised method for fingerprinting C. jejuni. Therefore, the overall aim of this project was to develop systematic and novel genotyping methods for C. jejuni. Chapter Three describes the use of single nucleotide polymorphisms (SNPs) derived from the multilocus sequence typing (MLST) database of C. jejuni and the closely related Campylobacter coli for genotyping these pathogens. The MLST database contains DNA sequence data for over 4000 strains, making it the largest comparative database available for these organisms. Using the in-house software package "Minimum SNPs", seven SNPs were identified from the C. jejuni/C. coli MLST database that gave a Simpson's Index of Diversity (D), or resolving power, of 0.98. An allele-specific real-time PCR method was developed and tested on 154 Australian C. jejuni and C. coli isolates. The major advantage of the seven SNPs over MLST is that they are cheaper, faster and simpler to interrogate than the sequence-based MLST method. When the SNP profiles were combined with sequencing of the rapidly evolving flaA short variable region (flaA SVR) locus, the genotype distributions were comparable to those obtained by MLST-flaA SVR. Recent technological advances have facilitated the characterisation of entire bacterial genomes using comparative genome hybridisation (CGH) microarrays. Chapter Four of this thesis explores the large volume of CGH data generated for C. jejuni; eight binary genes (genes present in some strains but absent in others) were identified that provided complete discrimination of 20 epidemiologically unrelated strains of C. jejuni. Real-time PCR assays were developed for the eight binary genes and tested on the Australian isolates. The results from this study showed that the SNP-binary assay provided a sufficient replacement for the more laborious MLST-flaA SVR sequencing method. The clustered regularly interspaced short palindromic repeat (CRISPR) region comprises tandem repeats, with one half of the repeat region highly conserved and the other half highly diverse in sequence. Recent advances in real-time PCR enabled the interrogation of these repeat regions in C. jejuni using high-resolution melt differentiation of PCR products. It was found that the CRISPR loci discriminated epidemiologically distinct isolates that were indistinguishable by the other typing methods (Chapter Five). Importantly, the combinatorial SNP-binary-CRISPR assay provided resolution comparable to the current 'gold standard' genotyping methodology, pulsed-field gel electrophoresis. Chapter Six describes a novel third module of "Minimum SNPs", 'Not-N', to identify genetic targets diagnostic for strain populations of interest against the remaining population. The applicability of Not-N was tested using bacterial and viral sequence databases. Due to the weakly clonal population structure of C. jejuni and C. coli, Not-N was inefficient at identifying small numbers of SNPs for the major MLST clonal complexes. In contrast, Not-N completely discriminated the 13 major subtypes of hepatitis C virus using 15 SNPs, and identified binary gene targets superior to those previously found for phylogenetic clades of C. jejuni, Yersinia enterocolitica and Clostridium difficile, demonstrating the utility of this additional module of "Minimum SNPs". Taken together, the presented work demonstrates the potentially far-reaching applications of novel and systematic genotyping assays to characterise bacterial pathogens with high accuracy and discriminatory power. This project exploited the known genetic diversity of C. jejuni to develop highly targeted assays that approach the resolution of the current 'gold standard' typing methods. By targeting differentially evolving genetic markers, an epidemiologically relevant, high-resolution fingerprint of the isolate in question can be determined at a fraction of the time, effort and cost of current genotyping procedures. The outcomes from this study will pave the way for improved diagnostics for many clinically significant pathogens as the concept of hierarchical combinatorial genotyping gains momentum amongst infectious disease specialists and public health-related agencies.
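The resolving power quoted above, Simpson's Index of Diversity, is conventionally computed in its Hunter-Gaston form, D = 1 - [Σ n_j(n_j - 1)] / [N(N - 1)], where n_j is the number of isolates of type j and N the total. Assuming that is the variant used here, a short sketch (the genotype counts are invented):

```python
def simpsons_diversity(counts):
    """Hunter-Gaston form of Simpson's Index of Diversity: the probability that
    two isolates drawn at random without replacement belong to different types."""
    n = sum(counts)
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Invented genotype counts for 154 isolates (not the thesis data).
counts = [20, 15, 12, 10] + [8] * 6 + [7] * 7
print(round(simpsons_diversity(counts), 3))  # ~0.939; more even types push D higher
```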
18

Qureshi, Tahir N. „Attacking software crisis a macro approach“. Thesis, 1985. http://hdl.handle.net/10945/21171.

19

Tsao, Chia-Chi, und 曹嘉琪. „Crisis Management for Information Systems: An Example of the Year 2000 Software Crisis“. Thesis, 1999. http://ndltd.ncl.edu.tw/handle/24093626100260116466.

Annotation:
Master's thesis, National Chung Cheng University, Department of Information Management, 1998 (ROC year 87).
Crises are inevitable; only those enterprises that take precautions against crises can reduce the damage. In the past, information system crises were often treated in the same way as general crises, but this is no longer adequate. The information system problem is a crisis that most companies cannot ignore. Take the year 2000 software (Y2K) crisis, for example: according to Gartner Group, Y2K carried an estimated $300 to $600 billion price tag just to fix, and the problem cut across all industry boundaries. This study aims to answer the question of whether information system crises are handled differently from general, non-information-system crises. Crisis management is examined at three stages of a crisis, namely before, during, and after. At each stage, we identify the characteristics of crisis management from three aspects: the characteristics of decision-makers, the characteristics of firms and industries, and the types of crises. Results show that: (1) the environmental characteristics of enterprises have no significant correlation with top managers' attitudes toward either information system or non-information-system crises; (2) top managers' characteristics have a strong impact on how crises are handled; in particular, whether a manager keeps abreast of technology strongly influences that manager's attitude toward information system crisis management; (3) across industries, the banking and financial industry is the most active in handling and anticipating information system crises.
20

Chung, Yu-Yao, und 鐘裕堯. „An Empirical Study of Year 2000 Software Crisis - the Bank industry as a Case Study“. Thesis, 1997. http://ndltd.ncl.edu.tw/handle/01655040494069771909.

Annotation:
Master's thesis, Tamkang University, Department of Information Management, 1996 (ROC year 85).
Due to insufficient date-field lengths and possible errors in leap-year formulas, some information systems will run into trouble with data processing, and even with company operations, at the beginning of the next century. Since the daily work of domestic banks depends heavily on their information systems, and those systems were developed before the software crisis was announced, the responsible chief executives in the banking industry must pay attention to this issue. In this article, we first propose a framework based on literature reviews and expert opinions. The framework includes variables covering crisis recognition, feasibility analyses and reaction strategies. Based on this framework, we designed a questionnaire and surveyed 149 banking CIOs. We obtained 48 valid samples and several findings: (1) most CEOs and all CIOs recognized this software crisis; (2) the larger the organization, the more aggressive the strategies adopted; (3) man-power and professional consultants were the two major needs to outsource when the legacy system had to be modified; (4) the source code and system design documentation of the legacy systems were complete in general; (5) the crisis items of most concern related to the validity of file updates and the smooth running of branches; (6) there was no statistically significant relationship between organizational size and crisis recognition.
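Both failure modes described above are easy to reproduce: two-digit year fields wrap around at the century boundary, and a naive leap-year formula misclassifies the year 2000 (divisible by 400, hence a leap year). A minimal illustration:

```python
def age_with_two_digit_years(birth_yy: int, now_yy: int) -> int:
    """Buggy Y2K-era logic: both years stored in a two-digit field."""
    return now_yy - birth_yy  # in 2000 (yy == 0), a 1960 birth year yields -60

def is_leap_naive(year: int) -> bool:
    return year % 4 == 0 and year % 100 != 0  # wrongly excludes 2000

def is_leap(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(age_with_two_digit_years(60, 0))      # -60 instead of 40
print(is_leap_naive(2000), is_leap(2000))   # False True: 2000 is a leap year
```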
21

Granåsen, Magdalena. „Exploring C2 Capability and Effectiveness in Challenging Situations : Interorganizational Crisis Management, Military Operations and Cyber Defence“. Licentiate thesis, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-156151.

Annotation:
Modern societies are affected by various threats and hazards, including natural disasters, cyber-attacks, extreme weather events and inter-state conflicts. Managing these challenging situations requires immediate action, suspension of ordinary procedures, decision making under uncertainty and coordinated action. In other words, challenging situations put high demands on command and control (C2) capability. To strengthen C2 capability, it is vital to identify the prerequisites for effective coordination and direction within the domain of interest. This thesis explores C2 capability and effectiveness in three domains: interorganizational crisis management, military command and control, and cyber defence operations. The thesis aims to answer three research questions: (1) What constitutes C2 capability? (2) What constitutes C2 effectiveness? and (3) How can C2 effectiveness be assessed? The work was carried out as two case studies and one systematic literature review. The main contributions of the thesis are the identification of perspectives on C2 capability in challenging situations and an overview of approaches to C2 effectiveness assessment. Based on the results of the three studies, six recurring perspectives on capability in the domains studied were identified: interaction (collaboration), direction and coordination, relationships, situation awareness, resilience and preparedness. The domains differ in which perspectives are most emphasized in order to obtain C2 capability. C2 effectiveness is defined as the extent to which a C2 system is successful in achieving its intended result. The thesis discusses the interconnectedness of performance and effectiveness measures, and concludes that there is no unified view on the difference between measures of effectiveness and measures of performance. Different approaches to effectiveness assessment were identified, whereby assessment may be conducted on the basis of one specific issue, in relation to a defined goal for a C2 function, or using a more exploratory approach.

In the printed version, the permanent link to this publication is incorrect. The link has been corrected in the online version.

APA, Harvard, Vancouver, ISO and other citation styles
23

Odendaal, Maria Elizabeth. „An interpretive case study into the application of software engineering theory“. Diss., 2012. http://hdl.handle.net/2263/25756.

The full content of the source
Annotation:
Even before software engineering was formally defined as a discipline, software projects were notorious for running behind schedule and over budget, and the resulting software systems were often described as unreliable. Researchers in the field have, over the years, theorised and proposed many standards, methods, processes and techniques to improve software project outcomes. Based on anecdotal evidence, however, it would seem that these proposals are often not applied in practice. This study was inspired by a desire to probe this general theme, namely the extent to which (if at all) software engineering theory is adopted in practice. The core of this research is an interpretive case study of a software project in the financial services industry that ran from the end of 2006 to mid-2008. I was one of a team of approximately 20 developers, analysts and development managers working on the project, until I left the company in 2009. Results are reported in a two-phase fashion over several themes. Firstly, the literature on recommended software engineering practices relating to a particular theme is reviewed; this is regarded as the "theory". Thereafter, the observations and evidence collected from the interpretive study in regard to the relevant theme are presented and discussed. The first theme investigated is the notion of "project outcome". Definitions of successful and failed software projects are considered from the perspective of the various stakeholders, as are factors that contribute to project success or failure. After examining how case study participants viewed the project's outcome, it is argued that the project could be labelled neither as a complete success nor as a complete failure. Two areas were identified as problematic: the requirements gathering process and the system architecture that had been chosen. Improvements in these areas would arguably have benefitted the project's outcome most. For this reason, recommended practices in the literature relating both to requirements engineering and to software architecture design were probed. The case study project was then evaluated against these recommended practices to determine the degree to which they were implemented. In cases where the recommended practices were not implemented or only partially implemented, a number of reasons for the lack of adoption are considered. Of course, the conclusions made in this study as to why the recommended practices were not implemented cannot be naïvely generalized to the software engineering field as a whole. Instead, in line with the interpretive nature of the study, an attempt was made to gain in-depth knowledge of a particular project, to show how that project's individual characteristics influenced the adoption of software engineering theory, and to probe the consequences of such adoption or lack thereof. The study suggested that the complex and individual nature of software projects has a substantial influence on the extent to which theory is adopted in practice, and that the impact such adoption has on a project's outcome is critically influenced by the nature of the software project.
Dissertation (MSc)--University of Pretoria, 2012.
Computer Science
APA, Harvard, Vancouver, ISO and other citation styles
24

Corregedor, Manuel Rodrigues. „Utilizing rootkits to address the vulnerabilities exploited by malware“. Thesis, 2012. http://hdl.handle.net/10210/6257.

The full content of the source
Annotation:
M.Sc.
Anyone who uses a computer for work or recreational purposes has come across one or all of the following problems, directly or indirectly (knowingly or not): viruses, worms, trojans, rootkits and botnets. This is especially the case if the computer is connected to the Internet. Looking at the statistics in [1], we can see that although malware detection techniques detect and prevent malware, they do not guarantee 100% detection and/or prevention. Furthermore, the statistics in [2] show that malware infection rates are increasing around the world at an alarming rate. The statistics also show that a high number of new malware samples are discovered every month and that 31% of malware attacks resulted in data loss [3], with 10% of companies reporting the loss of sensitive business data [4][5]. The reason 100% detection and/or prevention of malware cannot be achieved is that malware authors make use of sophisticated techniques, such as code obfuscation, to prevent their malware from being detected. This has resulted in the emergence of malware known as polymorphic and metamorphic malware. Such malware poses serious challenges for anti-malware software, particularly signature-based techniques. An even more serious threat that needs to be addressed is that of rootkits. Rootkits can execute at the same privilege level as the operating system (OS) itself. At this level a rootkit can manipulate the OS such that it can distribute other malware, hide existing malware, steal information, hide itself, disable anti-malware software and so on, all without the knowledge of the user. It is clear from the statistics that anti-malware products are not working, because infection rates continue to rise and companies and end users continue to fall victim to these attacks. This dissertation therefore addresses the problem that current anti-malware techniques are not working. Its main objective is to create a framework called ATE (Anti-malware Technique Evaluator) that can be used to critically evaluate current commercial anti-malware products. The framework achieves this by identifying the vulnerabilities that currently exist in commercial anti-malware products and the operating system. The former is achieved by making use of two rootkits, the Evader rootkit and the Sabotager rootkit, which were specifically developed to support the anti-malware product evaluation. Finally, an anti-malware architecture, which we call External Malware Scanner (EMS), is proposed to address the identified vulnerabilities.
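The weakness of signature-based techniques that the abstract attributes to code obfuscation can be illustrated briefly. The signature database and the XOR mutation below are invented for illustration; real scanners and real polymorphic engines are far more elaborate:

```python
import hashlib

# A naive signature database: known-bad payloads identified by their hash.
SIGNATURES = {hashlib.sha256(b"malicious_payload_v1").hexdigest()}

def signature_scan(sample: bytes) -> bool:
    """Flag the sample only if its hash matches a known signature."""
    return hashlib.sha256(sample).hexdigest() in SIGNATURES

original = b"malicious_payload_v1"

# A trivial polymorphic step: XOR-encode the body with a per-copy key.
# Behaviour is preserved (a stub would decode it at run time), but every
# byte of the stored sample changes, so the hash no longer matches.
key = 0x5A
mutated = bytes(b ^ key for b in original)

print(signature_scan(original))  # True  - the known sample is caught
print(signature_scan(mutated))   # False - same malware, unseen signature
```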
APA, Harvard, Vancouver, ISO and other citation styles
25

Botes, Christo. „Utilising advanced accounting software to trace the reintegration of proceeds of crime, from underground banking into the formal banking system“. Diss., 2008. http://hdl.handle.net/10500/791.

The full content of the source
Annotation:
The aim of this paper is to research how advanced accounting software can be used by police detectives, financial risk specialists and forensic investigation specialists who are responsible for investigating and tracing the reintegration of proceeds of crime from underground banking into the formal banking system (proactive and reactive money laundering investigation), with a view to criminal prosecution. The research started off by looking at the basic ways in which proceeds of crime are smuggled before they are integrated into the formal banking system. In that context, the phenomenon of underground banking was researched; currency smuggling, hawala currency transfer schemes and the ways in which they are used to move proceeds of crime were discussed in detail. Thereafter, formal banking and the way in which proceeds of crime are reintegrated from underground banking structures into formal banking systems were discussed. The use of advanced accounting software to trace the point where proceeds of crime are reintegrated into formal banking was researched extensively. Accounting software and investigative techniques for tracing financial transactions which might be tainted with proceeds of crime were discussed. Accounting software that can be used on office computers such as laptops was discussed, as were more advanced automated systems that can be used to trace proceeds-of-crime transactions in the formal banking system. In particular, the investigative techniques for using these systems as investigative tools were discussed in great detail. This research paper gives a unique perspective on the financial investigative and analytical angle on proceeds of crime and money laundering detection.
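Tracing transactions that may carry proceeds of crime typically begins with simple pattern rules of the kind such accounting software automates. The sketch below flags "structuring" (repeated deposits kept just under a reporting threshold); the threshold, margin and data are hypothetical and not taken from the dissertation:

```python
from collections import defaultdict

REPORT_THRESHOLD = 25_000   # hypothetical cash-reporting threshold
MARGIN = 0.10               # "just under" = within 10% below the threshold
MIN_HITS = 3                # near-threshold deposits needed to raise a flag

def flag_structuring(deposits):
    """Group deposits by account and flag accounts with repeated
    deposits kept just below the reporting threshold."""
    near_threshold = defaultdict(int)
    for account, amount in deposits:
        if REPORT_THRESHOLD * (1 - MARGIN) <= amount < REPORT_THRESHOLD:
            near_threshold[account] += 1
    return {acc for acc, n in near_threshold.items() if n >= MIN_HITS}

deposits = [
    ("ACC-1", 24_500), ("ACC-1", 24_900), ("ACC-1", 23_800),
    ("ACC-2", 4_000), ("ACC-2", 60_000),   # large, but reported normally
]
print(flag_structuring(deposits))  # {'ACC-1'}
```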
Criminal Justice
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO and other citation styles
26

Dagada, Rabelani. „Legal and policy aspects to consider when providing information security in the corporate environment“. Thesis, 2014. http://hdl.handle.net/10500/18839.

The full content of the source
Annotation:
E-commerce is growing rapidly due to the massive use of the Internet to conduct commercial transactions. This growth has presented both customers and merchants with many advantages. However, one of the challenges in e-commerce is information security. In order to mitigate e-crime, the South African government promulgated laws containing information security legal aspects that should be integrated into the establishment of information security. Although several authors have written about legal and policy aspects of information security in the South African context, it has not yet been explained how these aspects are used in the provision of information security in the South African corporate environment. This is the premise upon which the study was undertaken. Forty-five South African organisations participated in this research. Data gathering methods included individual interviews, website analysis and document analysis. The findings of this study indicate that most organisations in South Africa do not integrate legal aspects into their information security policies. One of the most important outcomes of this study is the proposed Concept Model of Legal Compliance in the Corporate Environment. This Concept Model embodies the contribution of this study and demonstrates how legal requirements can be incorporated into information security endeavours. The fact that the proposed Concept Model is technology-independent and can be implemented in a real corporate environment, regardless of the organisation's governance and management structure, holds great promise for the future of information security in South Africa and abroad. Furthermore, this thesis has generated a topology for linking legislation to the provision of information security, which can be used by any academic or practitioner who intends to implement information security measures in line with the provisions of the law. On this basis, practitioners can, to some extent, infer that the integration of legislation into information security policies can also be achieved in other South African organisations that did not participate in this study. Although this study has yielded theoretical, methodological and practical contributions, there is more research work to be done in this area.
School of Computing
D. Phil. (Information Systems)
APA, Harvard, Vancouver, ISO and other citation styles
