Doctoral dissertations on the topic "Diagnosis – Data processing – Congresses"

Follow this link to see other types of publications on the topic: Diagnosis – Data processing – Congresses.

Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles

Select a source type:

Browse the 41 best doctoral dissertations on the topic "Diagnosis – Data processing – Congresses".

An "Add to bibliography" button is available next to every work in the list. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these details are available in the work's metadata.

Browse doctoral dissertations from a wide range of disciplines and compile an accurate bibliography.

1

Van, Boening Mark Virgil. "Call versus continuous auctions: An experimental study of market organization". Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185542.

Full text of the source
Abstract:
The results from 17 new experiments and 19 previously reported experiments are compared in an investigation of call and continuous auctions. The call auction used is the computerized PLATO sealed bid/offer (SBO), uniform price auction. The continuous auction used is the PLATO double auction (DA), a computerized version of the "open outcry" double auction. The SBO call auction has temporal consolidation of market orders and has limited information about trading activity. The continuous DA auction is characterized by sequential bilateral trades, and trading information (bids, offers, and prices) is publicly displayed. The paper first explores the effect of multiple crossings per trading period in the SBO call auction. Next, a comparison of SBO and DA is made, based on market experiments using flow supply and demand schedules. The institutional comparison is then extended to experimental asset markets. The results imply the following. First, multiple calls per period increase the efficiency of the SBO call auction, relative to one call per period, but they also induce greater misrepresentation of costs and values in the first crossing each period. Buyers and sellers also withhold units from the first crossing in a further attempt to gain strategic advantage. However, neither the withholding nor the misrepresentation appears to have any substantial influence on price. Second, the SBO auction with two calls per period is as efficient as the DA auction. In markets with a random competitive equilibrium (CE) each period, the SBO auction does a better job than DA at tracking the random CE price. Thus the SBO auction is as efficient as the DA, and has the further attributes of lower price volatility and greater privacy. Third, in laboratory asset markets, the SBO auction exhibits price bubbles similar to those observed in DA markets. Price dynamics in the two institutions are comparable, despite the stark differences in order flow and information dissemination.
APA, Harvard, Vancouver, ISO, and other citation styles
2

Bui, Bang Huy. "Development of algorithms for processing psychology data". Thesis, Queensland University of Technology, 1997. https://eprints.qut.edu.au/36007/1/36007_Bui_1997.pdf.

Full text of the source
Abstract:
This thesis presents the current analysis technique applied to certain psychology data and outlines alternative engineering approaches to such analysis. Current research on panic disorder involves data measurement and analysis of many physiological, neuro-chemical and psychological variables. Owing to the complexity of the human body and the limited knowledge about it, there are no firm theories on what actually gives rise to these variables, i.e., what results in a rise in negative cognition or distress level. Current studies [1, 2] have reported that the patients' cognitive responses tend to be more closely related to the distress level than other quantities, and that the heart rate is related to the distress level but on a smaller scale. However, the conclusions drawn from the results were not definitive. The engineering analysis techniques carried out indicated that the cognitions of the patients play an important role in the mechanisms of panic, thus confirming the results obtained by current studies in a more rigorous manner.
APA, Harvard, Vancouver, ISO, and other citation styles
3

Nakamura, Carlos. "The effects of specific support to hypothesis generation on the diagnostic performance of medical students /". Thesis, McGill University, 2006. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102817.

Full text of the source
Abstract:
The hypothetico-deductive method, which involves an iterative process of hypothesis generation and evaluation, has been used for decades by physicians to diagnose patients. This study focuses on the levels of support that medical information systems can provide during these stages of the diagnostic reasoning process. The physician initially generates a list of possible diagnoses (hypotheses) based on the patient's symptoms. Later, those hypotheses are examined to determine which ones best account for the signs, symptoms, physical examination findings, and laboratory test results. Hypothesis generation is especially challenging for medical students because the organization of knowledge in medical school curricula is disease-centered. Furthermore, the clinical reference tools that are regularly used by medical students (such as Harrison's Online, UpToDate, and eMedicine) are mostly organized by disease. To address this issue, Abduction, a hypothesis generation tool, was developed for this study. Sixteen medical students were asked to solve two patient cases in two different conditions: A (support of clinical reference tools chosen by the participant and Abduction) and B (support of clinical reference tools chosen by the participant). In Condition A, participants were able to generate the correct diagnosis on all 16 occasions (100%) and were able to confirm it on 13 occasions (81.25%). In Condition B, participants were able to generate the correct diagnosis on three out of 16 occasions (18.75%) and were able to confirm it once (6.25%). The implications of this study are discussed with respect to the cognitive support that Abduction can provide to medical students for clinical diagnosis.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Moni, Mohammad Ali. "Clinical bioinformatics and computational modelling for disease comorbidities diagnosis". Thesis, University of Cambridge, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.708646.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
5

Frigo, Alessandro. "A procedure for the autonomic diagnosis of esophageal motor disorders from HRM data processing". Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424419.

Full text of the source
Abstract:
A proper understanding of the physiological mechanisms for the propulsion of ingested food along the gastrointestinal canal is mandatory for the diagnosis of pathologies affecting its motility. One of the most discussed regions within the digestive system is the esophagus, a tubular structure whose function is to carry food from mouth to stomach by means of a precise sequence of longitudinal and circumferential muscular contractions, called peristalsis. Pathologies and degenerative phenomena may affect this mechanism, leading to chest pain, acid reflux, cancer development and/or inability to swallow. As a growing number of subjects suffer from esophageal motility disorders, they represent a relevant social-health problem. The diagnosis of esophageal motility disorders is currently performed by analyzing results from High Resolution Manometry (HRM), the gold standard in esophageal diagnostics. HRM consists of a clinical test designed to measure the pressure evolution over time at different positions within a duct by means of a special probe. A number of models have been proposed in the literature to interpret data from HRM, but results are often inadequate because of an improper evaluation of the complex esophageal conformation and the corresponding heterogeneous distribution of physio-mechanical properties. Furthermore, inadequate effort was made to identify relationships between model parameters and esophageal properties, and their identification was usually performed on limited sets of experimental data. The guidelines for the diagnosis of esophageal motility disorders are currently defined by the Chicago Classification: a hierarchical algorithm that accounts for specific parameters evaluated by analyzing HRM results. The main drawback of this procedure is the requirement of specialized experts for the evaluation of such parameters, inducing intra- and inter-operator variability with regard to the final diagnosis.
Esophageal motility was investigated with the goal of providing a physiological model able to interpret results from HRM, accounting for parameters related to specific physio-mechanical properties of the esophagus and their heterogeneous distribution. Activities focused on the implementation of a procedure for the autonomic detection of esophageal motility dysfunctions based on the processing of HRM measurements. As a result, objective criteria were defined to support the medical staff during the traditional diagnostic activity for esophageal motility disorders. The physiological model was developed for this purpose to evaluate the pressure distribution due to the transit of a generic pressure wave. The corresponding optimal sets of model parameters were identified from the HRM results of each subject of a training set composed of 229 patients and 35 healthy volunteers. Patients and volunteers were classified into groups according to their specific healthy or pathological conditions: non-pathological (73 patients and 35 volunteers), Achalasia pattern I (34 subjects), Achalasia pattern II (44 subjects), Achalasia pattern III (7 subjects), Esophago-Gastric Junction (EGJ) outflow obstruction (39 subjects), hypertensive LES (9 subjects), Nutcracker esophagus (14 subjects) and Diffuse Esophageal Spasm (9 subjects). The identified model parameters were analyzed, and their distributions were assessed for each group of subjects, as the basis for the implementation of the autonomic diagnosis procedure. The condition of a generic patient could thus be determined through the evaluation of a similarity index designed to correlate the model parameters of the patient with the parameter distributions of the training set. A preliminary set of HRMs of healthy and pathological subjects was collected for the proper design and testing of the autonomic diagnosis software.
The suitability of the developed physiological model was assessed by evaluating the coefficient of determination R² between clinical data and model results, ranging from 83% to 96% among the different groups of subjects. The application of the model to each subject of the dataset allowed the distribution of model parameters to be assessed for the different healthy or pathological conditions, as the basis for the development of the autonomic diagnosis procedure. Furthermore, the main differences between the parameter distributions of pathological groups and those of the healthy group were observed in the specific regions where the different symptoms are manifested, endorsing the suitability of the model to interpret the variation of physiological properties in pathological situations. Finally, the reliability of the autonomic diagnosis procedure was assessed by analyzing the performance of the algorithm, which was able to match the correct diagnosis in 86% of the considered cases. The results suggest that the computational tools provided may represent a reliable support to the medical staff during the traditional diagnostic activity. As the model parameter distributions represent the basis for the autonomic diagnosis procedure, there is room for improvement of the algorithm by considering a larger training set, which must be extended and continuously updated by involving different research groups, as a future development of the research activity. Furthermore, the autonomic diagnosis procedure should be extended to make it capable of diagnosing pathologies from additional clinical tests providing information about the conductivity, morphometry and mechanical behavior of the involved biological tissues. Such information should be collected in a single clinical test in order to reduce costs and invasiveness for the patient, which can be performed by means of an innovative esophageal endoscope that is already under development.
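The abstract does not specify how the similarity index correlates a patient's model parameters with the group distributions of the training set. A minimal sketch of one plausible formulation, an average Gaussian kernel over per-parameter z-scores, is shown below; the function names, the kernel choice, and the toy parameter vectors are all assumptions, not the thesis's actual method:

```python
import math

def similarity_index(patient_params, group_samples):
    """Average Gaussian kernel over per-parameter z-scores: 1.0 means the
    patient sits exactly on the group means; values near 0 mean a poor match."""
    n = len(group_samples)
    dims = len(patient_params)
    scores = []
    for d in range(dims):
        values = [sample[d] for sample in group_samples]
        mean = sum(values) / n
        std = math.sqrt(sum((v - mean) ** 2 for v in values) / n) or 1e-9
        z = (patient_params[d] - mean) / std
        scores.append(math.exp(-0.5 * z * z))
    return sum(scores) / dims

def classify(patient_params, groups):
    """Assign the patient to the group whose parameter distributions
    yield the highest similarity index."""
    return max(groups, key=lambda g: similarity_index(patient_params, groups[g]))
```

With per-group parameter distributions estimated from the training set, classification reduces to evaluating this index once per group and taking the maximum.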
APA, Harvard, Vancouver, ISO, and other citation styles
6

Lembke, Benjamin. "Bearing Diagnosis Using Fault Signal Enhancing Teqniques and Data-driven Classification". Thesis, Linköpings universitet, Fordonssystem, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-158240.

Full text of the source
Abstract:
Rolling element bearings are a vital part of much rotating machinery, including vehicles. A defective bearing can be a symptom of other problems in the machinery, and bearings themselves are subject to a high failure rate. Early detection of bearing defects can therefore help to prevent malfunctions that could ultimately lead to a total collapse. The thesis was done in collaboration with Scania, which wants a better understanding of how external sensors, such as accelerometers, can be used for condition monitoring of their gearboxes. Defective bearings create vibrations with specific frequencies, known as Bearing Characteristic Frequencies, BCF [23]. A key component of the proposed method is the identification and extraction of these frequencies from vibration signals recorded by accelerometers mounted near the monitored bearing. Three solutions are proposed for automatic bearing fault detection. Two are based on data-driven classification using a set of machine learning methods called Support Vector Machines, and one method uses only the computed characteristic frequencies of the considered bearing faults. Two types of features are developed as inputs to the data-driven classifiers. One is based on the extracted amplitudes of the BCF, the other on statistical properties of Intrinsic Mode Functions generated by an improved Empirical Mode Decomposition algorithm. In order to enhance the diagnostic information in the vibration signals, two pre-processing steps are proposed. Separation of the bearing signal from masking noise is done with the Cepstral Editing Procedure, which removes discrete frequencies from the raw vibration signal. Enhancement of the bearing signal is achieved by band-pass filtering and amplitude demodulation. The frequency band is produced by the band selection algorithms Kurtogram and Autogram.
The proposed methods are evaluated on two large public data sets concerning bearing fault classification using accelerometer data, and on a smaller data set collected from a Scania gearbox. The produced features achieved significant separation on both the public and the collected data. Manual detection of the induced defect on the outer race of the bearing from the gearbox was achieved. Due to the small amount of training data, the automatic solutions were only tested on the public data sets. Isolation performance of the correct bearing and fault mode among multiple bearings was investigated. One of the best trade-offs achieved was a 76.39% fault detection rate with an 8.33% false alarm rate; another was a 54.86% fault detection rate with a 0% false alarm rate.
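The Bearing Characteristic Frequencies mentioned above follow from standard bearing kinematics (shaft speed, number of rolling elements, ball and pitch diameters, contact angle). A small sketch computing them; function and parameter names are illustrative, not taken from the thesis:

```python
import math

def bearing_characteristic_frequencies(shaft_hz, n_elements, ball_d, pitch_d,
                                       contact_angle=0.0):
    """Classic kinematic defect frequencies for a rolling element bearing
    (contact angle in radians, both diameters in the same unit)."""
    ratio = (ball_d / pitch_d) * math.cos(contact_angle)
    return {
        "BPFO": n_elements / 2 * shaft_hz * (1 - ratio),              # outer-race defect
        "BPFI": n_elements / 2 * shaft_hz * (1 + ratio),              # inner-race defect
        "BSF": pitch_d / (2 * ball_d) * shaft_hz * (1 - ratio ** 2),  # rolling-element defect
        "FTF": shaft_hz / 2 * (1 - ratio),                            # cage (train) frequency
    }
```

A fault detector of the third kind described above would look for peaks at these frequencies (and their harmonics) in the envelope spectrum of the pre-processed vibration signal.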
APA, Harvard, Vancouver, ISO, and other citation styles
7

Chou, Chuan-Ting. "Traditional Chinese medicine on-line diagnosis system". CSUSB ScholarWorks, 2006. https://scholarworks.lib.csusb.edu/etd-project/3182.

Full text of the source
Abstract:
The project developed a web-based application that provides a user-friendly interface to assist practitioners of traditional Chinese medicine in determining the correct diagnosis. The Traditional Chinese Medicine On-line Diagnosis System (TCMODS) allows a diagnostician to enter a patient's symptoms using a series of questionnaires to determine health status, which is then stored in the database as part of the patient's medical records. The database also differentiates among the patterns of syndromes known in traditional Chinese medicine, matching the patient's data against these patterns and the known uses of Chinese herbs. TCMODS then generates the patient's medical record, including the symptoms of the ailment, the syndrome, and a prescription. User identification and access privileges were differentiated in order to maintain the integrity of the patient medical data and the information needed to make the diagnoses. The project was designed to function across platforms and was written using HTML, JSP, and MySQL.
APA, Harvard, Vancouver, ISO, and other citation styles
8

Subbiah, Arun. "Design and evaluation of a distributed diagnosis algorithm for arbitrary network topologies in dynamic fault environments". Thesis, Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/13273.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
9

Faremo, Sonia. "Medical problem solving and post-problem reflection in BioWorld". Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=84992.

Full text of the source
Abstract:
This study examined diagnostic problem solving and post-problem reflection in medical students, residents, and experts. Participants worked on three internal medicine cases from the computer-based learning environment, BioWorld. The analyses focused on general performance measures, problem solving operators and knowledge states, and post-problem reflection activities. Verbal protocol data was collected and examined using a coding scheme developed and implemented with the N-Vivo software. Students and residents differed in overall diagnostic accuracy, and significant differences were found in solution time and the number of utterances made for cases of varying difficulty. Differences in the use of operators and knowledge states are highlighted, although the groups were quite similar on many measures. The experts spent considerably more time working on case history information, consistently engaged in planning, and always generated the correct diagnosis (among others) in response to case history information. During post-problem reflection students used more case history data than residents. Expert models highlight the experts' problem solving cycle that consisted of reviewing data, identifying hypotheses, and planning. Post-questionnaire results indicate that participants found the cases to be interesting, useful for learning, but not especially difficult. Finally, several implications are drawn for the future development of BioWorld for medical training.
APA, Harvard, Vancouver, ISO, and other citation styles
10

Heacock, Gregory. "An investigation of the role of virtual reality systems and their application to ophthalmic teaching, diagnosis and treatment". Thesis, King's College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.287483.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
11

Wen, Yiding. "Detecting microcalcifications in digitised mammograms by a computer aided diagnostic system". Thesis, The University of Sydney, 1999. https://hdl.handle.net/2123/27591.

Full text of the source
Abstract:
Breast cancer is now one of the most common forms of cancer and the leading cause of mortality in women in the developed countries. Early detection of breast cancer is currently the way to reduce breast cancer mortality and enhance the cure rate. Mammogram screening is widely recognized as the most reliable method for early detection of lesions and clustered microcalcifications, which are the two prominent symptoms of breast cancer. This thesis presents an image processing procedure for the automatic detection of clustered microcalcifications in digitized mammograms. The method consists of two main steps. First, possible microcalcification pixels in the mammograms are segmented out using wavelet features and grouped into potential individual microcalcification objects by their spatial connectivity. Second, individual microcalcifications are detected by using the structure features extracted from the potential microcalcification objects. The classifiers used in the two steps are feedforward neural networks. The method is applied to 40 regions of interest extracted from mammograms in the Nijmegen database containing 144 clusters of microcalcifications. Results show that the proposed procedure gives satisfactory detection performance. In particular, a 97 percent mean true positive detection rate is achieved at the cost of 1.67 false positives in the whole dataset.
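The grouping of segmented candidate pixels "by their spatial connectivity" in the first step is a classic connected-component pass. A minimal sketch using 8-connectivity and a breadth-first flood fill; the names and the pixel representation are illustrative, not from the thesis:

```python
from collections import deque

def group_by_connectivity(pixels):
    """Group candidate pixels (row, col) into objects via 8-connectivity,
    flood-filling each component breadth-first over the segmented pixel set."""
    remaining = set(pixels)
    objects = []
    while remaining:
        seed = remaining.pop()
        queue = deque([seed])
        component = [seed]
        while queue:
            r, c = queue.popleft()
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    neighbour = (r + dr, c + dc)
                    if neighbour in remaining:
                        remaining.remove(neighbour)
                        queue.append(neighbour)
                        component.append(neighbour)
        objects.append(component)
    return objects
```

Each returned component is one potential microcalcification object, from which structure features can then be extracted for the second-stage classifier.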
APA, Harvard, Vancouver, ISO, and other citation styles
12

Roberts, Tim S. "The development of an expert system for the diagnosis of diseases in fibre and dairy goats". Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1990. https://ro.ecu.edu.au/theses/1113.

Full text of the source
Abstract:
This thesis details the development of an expert system for the diagnosis of diseases in fibre and dairy goats. Divided into five sections, five appendices, and a bibliography, this thesis centres on the methods used to build the expert system: the decisions taken at the outset of, and during the course of, development; some of the problems encountered, and the solutions to those problems. A detailed appraisal is made of the development process and suggestions are made for future developments over similar domains (for example, the diagnosis of diseases in animals other than goats). Much emphasis is placed on three topics in particular: the selection of the expert system tool(s) to be used (and the rejection of numerous others); the methodology employed for this selection process; and the methodology used for the process of development. Other topics which are routinely found in texts on expert systems (for example, knowledge elicitation techniques, explanatory facilities, expert system evaluation etc) are dealt with only briefly. However, for the reader interested in further information on these topics, references are made in the text to appropriate sources.
APA, Harvard, Vancouver, ISO, and other citation styles
13

Van, Greunen Francois. "Microcomputer-assisted diagnosis of inherited disorders of the skeleton". Master's thesis, University of Cape Town, 1988. http://hdl.handle.net/11427/25754.

Full text of the source
Abstract:
Several hundred inherited disorders of the skeleton have been delineated. Individually these conditions are rare, but as a group they cause much crippling and hardship. Several factors, including the rarity and complexity of the manifestations of these conditions, as well as semantic overlap, impede the accurate diagnosis which is essential for effective treatment. In this regard, the adoption of microcomputers warrants evaluation as a high-technology aid. Microcomputers have developed tremendous capabilities during recent years. The state of the art has become such that a diagnostic aid facility on such a device has been demonstrated in various disciplines of medicine and may also be feasible in the area of inherited skeletal disorders. The study which forms the basis of this thesis concerns the investigation of this feasibility and has led to the development of an effective working model which sets the basis for microcomputer-aided diagnosis. The design features followed in this project are similar to those conventionally employed for "expert systems" on mainframe computers. A comprehensive knowledge base consisting of over 200 skeletal disorders and 700 radiographic and clinical manifestations has resulted. Furthermore, the application is capable of "learning", although inference as employed by the inference engines of real expert systems is not used. In this context learning implies that the knowledge base, with the passage of time, improves considerably when used by experts. Serendipitous findings in this regard are: (1) considerable improvement of existing profile descriptions can occur without any increased demands on computer memory and storage space; (2) growth of the knowledge base in the form of additional disease profiles can be effected with very modest inroads on memory and storage resources.
The computerized diagnostic aid which resulted from this thesis has been demonstrated to be successful in both the Department of Human Genetics of the University of Cape Town and the Department of Paediatrics of the Johannes Gutenberg University in Mainz. Evaluated both in terms of efficiency and utility, the system provides an enhancement to the specialist genetic diagnostician. These achievements have been effected by means of a unique, newly developed application of compressed bit-mapping, attained by writing the applicable programs in Turbo Pascal and 8086 assembler. Calculations indicate that much larger databases may be implemented on present-day microcomputers by means of the methods developed in this project.
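The compressed bit-mapping mentioned above suggests storing each disease profile as a bitmask over the vocabulary of manifestations, so that matching a case against hundreds of profiles reduces to a bitwise AND and a population count per profile. A hedged sketch of that idea in Python (not the thesis's actual Turbo Pascal/assembler implementation; all names are illustrative):

```python
def make_mask(manifestations, vocabulary):
    """Pack a set of manifestations into an integer bitmask."""
    index = {m: i for i, m in enumerate(vocabulary)}
    mask = 0
    for m in manifestations:
        mask |= 1 << index[m]
    return mask

def rank_profiles(case_manifestations, profiles, vocabulary):
    """Rank disease profiles by the number of manifestations shared with the
    case: one bitwise AND plus a population count per profile."""
    case = make_mask(case_manifestations, vocabulary)
    scored = sorted(
        ((bin(case & mask).count("1"), name) for name, mask in profiles.items()),
        reverse=True,
    )
    return [name for _, name in scored]
```

Because each profile occupies only one machine word per 32 or so manifestations, a knowledge base of hundreds of disorders stays small, which is consistent with the memory observations reported in the abstract.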
APA, Harvard, Vancouver, ISO, and other citation styles
14

Elieson, S. Willard (Sanfred Willard). "Development of an Expert System to Teach Diagnostic Skills". Thesis, University of North Texas, 1990. https://digital.library.unt.edu/ark:/67531/metadc331448/.

Full text of the source
Abstract:
The primary purpose of the study was to develop an expert system that could (1) perform medical diagnoses in selected problem areas, and (2) provide diagnostic insights to assist medical students in their training. An expert system is a computer-based set of procedures and algorithms that can solve problems in a given domain. Two research questions were proposed. The first was "Given a problem space defined by a matrix of diseases and symptoms, can a computer-based model be derived that will consistently perform accurate and efficient diagnoses of cases within that problem area?" The second question was "If the techniques derived from the model are taught to a medical student, is there a subsequent improvement of diagnostic skill?" An expert system was developed which met the objectives of the study. It was able to diagnose cases in the two problem areas studied with an accuracy of 94-95%. Furthermore, it was able to perform those diagnoses in a very efficient manner, often using no more than the theoretical minimum number of steps. The expert system employed three phases: rapid search by discrimination, confirmation by pattern matching against prototypes, and elimination of some candidates (impossible states) by making use of negative information. The discrimination phase alone achieved accuracies of 73-78%. By comparison, medical students achieved mean accuracies of 54-55% in the same problem areas. This suggests that novices could improve their diagnostic accuracy by approximately 20% by following the simple rules used in the first phase of the expert system. Curricular implications are discussed. When 49 first-year medical students at the Texas College of Osteopathic Medicine were exposed to some of the insights of the expert system by means of a videotaped 10-minute lecture, their diagnostic approach was modified and the accuracy of their diagnoses did improve. However, the degree of improvement was not statistically significant.
Recommendations for further research are made.
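The three-phase strategy the abstract describes (discrimination search, prototype pattern matching, elimination via negative information) can be sketched as below. This is a minimal illustration only: the disease-symptom matrix, the disease names, and the exact phase rules are invented stand-ins, not the study's actual model.

```python
# Hedged sketch of a three-phase diagnostic pass over a disease-symptom
# matrix. The diseases and symptom prototypes below are hypothetical.
DISEASES = {
    "flu":   {"fever", "cough", "aches"},
    "cold":  {"cough", "sneezing"},
    "strep": {"fever", "sore_throat"},
}

def diagnose(present, absent):
    """Return candidate diseases surviving all three phases, best first."""
    # Phase 1: rapid search by discrimination -- keep diseases consistent
    # with at least one observed symptom.
    candidates = {d for d, s in DISEASES.items() if s & present}
    # Phase 2: confirmation by pattern matching against prototypes --
    # rank candidates by overlap with the full symptom prototype.
    ranked = sorted(candidates,
                    key=lambda d: len(DISEASES[d] & present),
                    reverse=True)
    # Phase 3: eliminate impossible states using negative information --
    # drop any disease whose prototype contains a symptom known absent.
    return [d for d in ranked if not (DISEASES[d] & absent)]

print(diagnose({"fever", "cough"}, {"sore_throat"}))
```

A real system would, as the abstract notes, often reach a unique answer in close to the theoretical minimum number of discrimination steps.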
Style APA, Harvard, Vancouver, ISO itp.
15

White, Glen Ross. "Implementation of Dave : an expert system for the analysis of the Wechsler Adult Intelligence Scales and related information". Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9891.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
16

Puelz, Michael. "A program to generate and validate new test versions of a neuropsychological planning test". Virtual Press, 1991. http://liblink.bsu.edu/uhtbin/catkey/834522.

Pełny tekst źródła
Streszczenie:
Computers are used for diagnosis and training in neuropsychological rehabilitation. PLANTEST is a program for the IBM-PC that was developed for diagnostic support. It implements a test that gives information about the reduced ability of brain-injured patients to make plans for a given task. The presented thesis describes a knowledge-based system that can be used to develop new test versions for PLANTEST. The program, called SolvePT, can prove the solvability of test material used in PLANTEST and can also automatically generate new test material. It uses an exhaustive forward-chaining, depth-first search and is implemented in Prolog. The data structures and algorithm of the program, as well as its space and time requirements, are discussed.
Department of Computer Science
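The exhaustive depth-first, forward-chaining solvability check that SolvePT performs can be sketched generically as a reachability search over a state space. The sketch below is in Python rather than the thesis's Prolog, and the toy "task" (reaching a target value under two allowed moves) is a hypothetical stand-in for the actual PLANTEST material.

```python
# Sketch of an exhaustive depth-first search that proves whether a goal
# state is reachable from a start state, SolvePT-style.
def solvable(start, goal, moves):
    """Return True if some sequence of moves reaches the goal state."""
    stack, seen = [start], {start}
    while stack:
        state = stack.pop()
        if state == goal:
            return True
        for nxt in moves(state):       # forward-chaining expansion
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

# Toy example: states are integers; a move adds 3 or doubles the value.
moves = lambda n: [m for m in (n + 3, n * 2) if m <= 20]
print(solvable(1, 16, moves))   # reachable via 1 -> 2 -> 4 -> 8 -> 16
```

Because the search is exhaustive over a finite state space, a `False` answer is a proof of unsolvability, which is exactly what a test-material validator needs.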
Style APA, Harvard, Vancouver, ISO itp.
17

Oteniya, Lloyd. "Bayesian belief networks for dementia diagnosis and other applications : a comparison of hand-crafting and construction using a novel data driven technique". Thesis, University of Stirling, 2008. http://hdl.handle.net/1893/497.

Pełny tekst źródła
Streszczenie:
The Bayesian network (BN) formalism is a powerful representation for encoding domains characterised by uncertainty. However, before it can be used it must first be constructed, which is a major challenge for any real-life problem. There are two broad approaches, namely the hand-crafted approach, which relies on a human expert, and the data-driven approach, which relies on data. The former approach is useful; however, issues such as human bias can introduce errors into the model. We have conducted a literature review of the expert-driven approach, cherry-picked a number of common methods, and engineered a framework to assist non-BN experts with expert-driven construction of BNs. The latter construction approach uses algorithms to construct the model from a data set. However, construction from data is provably NP-hard. To solve this problem, approximate, heuristic algorithms have been proposed; in particular, algorithms that assume an order between the nodes, therefore reducing the search space. However, traditionally, this approach relies on an expert providing the order among the variables; an expert may not always be available, or may be unable to provide the order. Nevertheless, if a good order is available, these order-based algorithms have demonstrated good performance. More recent approaches attempt to "learn" a good order and then use the order-based algorithm to discover the structure. To eliminate the need for order information during construction, we propose a search in the entire space of Bayesian network structures; we present a novel approach for carrying out this task, and we demonstrate its performance against existing algorithms that search in the entire space and the space of orders. Finally, we employ the hand-crafting framework to construct models for the task of diagnosis in a "real-life" medical domain, dementia diagnosis.
We collect real dementia data from clinical practice, and we apply the data-driven algorithms developed to assess the concordance between the reference models developed by hand and the models derived from real clinical data.
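The order-based search the abstract refers to can be sketched as follows: given an ordering of the variables, each node may only take parents from its predecessors, which dramatically shrinks the structure search space (this is the idea behind K2-style algorithms). In this sketch the scoring function is a toy lookup table, not the Bayesian score a real learner would compute from data.

```python
# Sketch of order-based BN structure search: parents come only from a
# node's predecessors in the given order. Scoring here is a placeholder.
from itertools import combinations

def best_parents(node, predecessors, score, max_parents=2):
    """Exhaustively score small parent sets drawn from the predecessors."""
    best, best_s = (), score(node, ())
    for k in range(1, max_parents + 1):
        for ps in combinations(predecessors, k):
            s = score(node, ps)
            if s > best_s:
                best, best_s = ps, s
    return best

def learn_structure(order, score, max_parents=2):
    """Return {node: parent tuple} respecting the given node order."""
    return {n: best_parents(n, order[:i], score, max_parents)
            for i, n in enumerate(order)}

# Toy score: pretend "C" is best explained by {A, B} and "B" by {A}.
toy = {("B", ("A",)): 2.0, ("C", ("A", "B")): 3.0, ("C", ("A",)): 1.5}
score = lambda n, ps: toy.get((n, tuple(sorted(ps))), 0.0)
print(learn_structure(["A", "B", "C"], score))
```

Removing the order assumption, as the thesis proposes, means searching over structures (or over orders as well), which is far more expensive but removes the dependence on an expert-supplied ordering.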
Style APA, Harvard, Vancouver, ISO itp.
18

Gao, Feng. "Complex medical event detection using temporal constraint reasoning". Thesis, University of Aberdeen, 2010. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=153271.

Pełny tekst źródła
Streszczenie:
The Neonatal Intensive Care Unit (NICU) is a hospital ward specializing in looking after premature and ill newborn babies. Working in such a busy and complex environment is not easy, and sophisticated equipment is used to help the daily work of the medical staff. Computers are used to analyse the large amount of monitored data and extract hidden information, e.g. to detect interesting events. Unfortunately, one group of important events lacks features that are recognizable by computers. This group includes the actions taken by the medical staff, for example two actions related to the respiratory system: inserting an endotracheal tube into a baby's trachea (ET Intubating) or sucking out the tube (ET Suctioning). These events are very important building blocks for other computer applications aimed at helping the staff. In this research, a strategy for detecting these medical actions based on contextual knowledge is proposed. This contextual knowledge specifies what other events normally occur with each target event and how they are temporally related to each other. The idea behind this strategy is that all medical actions are taken for different purposes and hence may have different procedures (contextual knowledge) for performing them. This contextual knowledge is modelled using a point-based framework with special attention given to various types of uncertainty. Event detection consists in searching for a consistent matching between a model based on the contextual knowledge and the observed event instances - a Temporal Constraint Satisfaction Problem (TCSP). The strategy is evaluated by detecting ET Intubating and ET Suctioning events, using a specially collected NICU monitoring dataset. The results of this evaluation are encouraging and show that the strategy is capable of detecting complex events in an NICU.
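The point-based matching idea can be sketched as a check that observed event time points satisfy the delay windows the contextual model allows between them. The event names and delay bounds below are invented for illustration; a real model would come from the clinical contextual knowledge, and a real TCSP solver would also handle unobserved points and uncertainty.

```python
# Sketch of matching observed events against a point-based temporal model.
MODEL = {
    # (earlier_event, later_event): (min_delay_s, max_delay_s) -- invented
    ("ventilator_disconnect", "suction_start"): (0, 30),
    ("suction_start", "suction_end"): (5, 120),
}

def consistent(times, model=MODEL):
    """True if every constrained pair of observed time points falls
    inside its allowed delay window."""
    for (a, b), (lo, hi) in model.items():
        if a in times and b in times:
            if not (lo <= times[b] - times[a] <= hi):
                return False
    return True

obs = {"ventilator_disconnect": 0, "suction_start": 12, "suction_end": 60}
print(consistent(obs))   # both delays (12 s and 48 s) are in range
```

Detection then amounts to searching the monitored event stream for a set of instances for which such a matching is consistent, i.e. solving the TCSP.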
Style APA, Harvard, Vancouver, ISO itp.
19

O'Donnell, Melissa. "Towards prevention - a population health approach to child abuse and neglect : health indicators and the identification of antecedent causal pathways". University of Western Australia. School of Paediatrics and Child Health, 2009. http://theses.library.uwa.edu.au/adt-WU2010.0029.

Pełny tekst źródła
Streszczenie:
[Truncated abstract] The primary aims of this thesis were to investigate health indicators of child maltreatment, as well as pathways into the child protection system using routinely collected government databases, enabling a preventative health approach to child abuse and neglect. This thesis aims to improve understanding of the trends in child maltreatment and the factors, at the child and family level, which increase or reduce vulnerability to child maltreatment so more effective prevention policies and practices can be developed. This project uses longitudinal de-identified population data from the Western Australian Government Departments of Child Protection, Health and Disability Services. These data contained information on demographic, clinical, social and child protection outcomes of children and their families. Record linkage of administrative data was undertaken to: investigate health indicators of abuse and neglect using Hospital Morbidity data to enable the monitoring of population trends in abuse and neglect; compare the proportion of cases obtained using health indicators with the Department of Child Protection data; and describe the physical, psychological and social characteristics of abused and/or neglected children and families. Statistical techniques utilised include logistic and Cox regression to investigate risk of adverse child outcomes, taking into account potential confounding and time to event. The main findings include: There has been an increase in assault- and maltreatment-related hospital admissions over the last 25 years. ... There has been a marked increase in the birth prevalence of Neonatal Withdrawal Syndrome (NWS) in Western Australia over the last 25 years, from 1 per 10,000 live births in 1980, to 31 per 10,000 live births in 2005. Specific maternal characteristics associated with having a child with NWS are identified and these children have an increased risk of child protection involvement.
A population-level analysis of child and parental factors determined the estimated increase in risk of substantiated child maltreatment for child intellectual disability, parental admissions for mental health, substance use, and assault, as well as greater socio-economic disadvantage. Conclusions: This is the first body of research which has extensively used longitudinal, population-level linked health and child protection data to investigate health indicators of child abuse and neglect and antecedent causal pathways. Monitoring injuries and conditions associated with child abuse and neglect in routinely collected data and using multiple sources of ascertainment are important initiatives in child maltreatment surveillance. Health indicators of child abuse and neglect are not subject to the same definitional and policy issues as child protection data and therefore provide a more valid comparison over time and between jurisdictions. The identification of factors which increase vulnerability for children and families to child maltreatment is essential in the implementation of prevention strategies including universal public health approaches as well as the identification of at-risk families for targeted intervention.
Style APA, Harvard, Vancouver, ISO itp.
20

Papa, Frank J. "Test of the Generalizability Of "KBIT" (an Artificial Intelligence-Derived Assessment Instrument) Across Medical Problems". Thesis, University of North Texas, 1991. https://digital.library.unt.edu/ark:/67531/metadc332695/.

Pełny tekst źródła
Streszczenie:
This study was motivated by concerns within the medical education community regarding the psychometric soundness of current assessment methodologies. More specifically, there is reason to seriously question the reliability and/or validity of these methodologies in assessing the intellectual skills upon which medical competence is based.
Style APA, Harvard, Vancouver, ISO itp.
21

Farooq, Kamran. "A novel ontology and machine learning driven hybrid clinical decision support framework for cardiovascular preventative care". Thesis, University of Stirling, 2015. http://hdl.handle.net/1893/22328.

Pełny tekst źródła
Streszczenie:
Clinical risk assessment of chronic illnesses is a challenging and complex task which requires the utilisation of standardised clinical practice guidelines and documentation procedures in order to ensure consistent and efficient patient care. Conventional cardiovascular decision support systems have significant limitations, which include the inflexibility to deal with complex clinical processes, hard-wired rigid architectures based on branching logic and the inability to deal with legacy patient data without significant software engineering work. In light of these challenges, we are proposing a novel ontology and machine learning-driven hybrid clinical decision support framework for cardiovascular preventative care. An ontology-inspired approach provides a foundation for information collection, knowledge acquisition and decision support capabilities and aims to develop context-sensitive decision support solutions based on ontology engineering principles. The proposed framework incorporates an ontology-driven clinical risk assessment and recommendation system (ODCRARS) and a Machine Learning Driven Prognostic System (MLDPS), integrated as a complete system to provide a cardiovascular preventative care solution. The proposed clinical decision support framework has been developed under the close supervision of clinical domain experts from both UK and US hospitals and is capable of handling multiple cardiovascular diseases. The proposed framework comprises two novel key components: (1) the ODCRARS and (2) the MLDPS. The ODCRARS is developed under the close supervision of consultant cardiologists Professor Calum MacRae from Harvard Medical School and Professor Stephen Leslie from Raigmore Hospital in Inverness, UK. The ODCRARS comprises various components, which include: (a) Ontology-driven intelligent context-aware information collection for conducting patient interviews, driven through a novel clinical questionnaire ontology.
(b) A patient semantic profile is generated using patient medical records which are collated during patient interviews (conducted through an ontology-driven, context-aware adaptive information collection component). The semantic transformation of patients' medical data is carried out through a novel patient semantic profile ontology in order to give patient data an intrinsic meaning and alleviate interoperability issues with third-party healthcare systems. (c) Ontology-driven clinical decision support comprises a recommendation ontology and a NICE/Expert-driven clinical rules engine. The recommendation ontology is developed using clinical rules provided by the consultant cardiologist from the US hospital. The recommendation ontology utilises the patient semantic profile for lab tests and medication recommendation. A clinical rules engine is developed to implement a cardiac risk assessment mechanism for various cardiovascular conditions. The clinical rules engine is also utilised to control the patient flow within the integrated cardiovascular preventative care solution. The machine learning-driven prognostic system is developed in an iterative manner using state-of-the-art feature selection and machine learning techniques. A prognostic model development process is exploited for the development of the MLDPS based on clinical case studies in the cardiovascular domain. An additional clinical case study in the breast cancer domain is also carried out for development and validation purposes. The prognostic model development process is general enough to handle a variety of healthcare datasets, which will enable researchers to develop cost-effective and evidence-based clinical decision support systems. The proposed clinical decision support framework also provides a learning mechanism based on machine learning techniques. The learning mechanism is provided through exchange of patient data between the MLDPS and the ODCRARS.
The machine learning-driven prognostic system is validated using Raigmore Hospital's RACPC, heart disease and breast cancer clinical case studies.
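A typical first step in a prognostic pipeline like the MLDPS is filter-style feature selection: rank candidate features by their association with the outcome and keep the top few before model fitting. The sketch below uses Pearson correlation on fabricated data; the feature names and values are invented for illustration and do not come from the thesis's datasets.

```python
# Hedged sketch of filter feature selection by absolute correlation.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def select_features(rows, outcome, k):
    """rows: {feature_name: value list}; keep the k best-correlated."""
    ranked = sorted(rows, key=lambda f: abs(pearson(rows[f], outcome)),
                    reverse=True)
    return ranked[:k]

features = {
    "age":       [40, 55, 63, 70],
    "chol":      [180, 240, 260, 300],
    "shoe_size": [42, 39, 44, 41],   # deliberately irrelevant
}
risk = [0, 1, 1, 1]
print(select_features(features, risk, 2))
```

A real system would use cross-validated, multivariate selection rather than a univariate filter, but the principle of discarding weakly associated features before training is the same.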
Style APA, Harvard, Vancouver, ISO itp.
22

Silveira, Gabriela. "Narrativas produzidas por indivíduos afásicos e indivíduos cognitivamente sadios: análise computadorizada de macro e micro estrutura". Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/5/5170/tde-01112018-101055/.

Pełny tekst źródła
Streszczenie:
INTRODUCTION: The research topic, the discourse of aphasic individuals, provides important information about phonological, morphological, syntactic, semantic and pragmatic aspects of the language of patients who have suffered a stroke. One way to study discourse is through simple or sequenced thematic pictures. The Cinderella story sequence is frequently used in studies because it is familiar worldwide, which favours cross-cultural studies, and because it elicits narratives rather than the descriptions usually obtained when a single picture is used to elicit discourse. Another advantage of the Cinderella sequence is that it generates enough linguistic material for detailed analysis. OBJECTIVES: (1) to analyse, using computerized technologies, macro- and microstructural aspects of the discourse of cognitively healthy individuals, individuals with Broca's aphasia and individuals with anomic aphasia; (2) to explore discourse as an indicator of the evolution of aphasia; (3) to analyse the contribution of SPECT, alongside discourse, to verifying the evolution of aphasia. METHOD: The study included eight individuals with Broca's or anomic aphasia who formed the longitudinal study group (G1), 15 individuals with Broca's or anomic aphasia who formed the other study group (G2), and 30 cognitively healthy individuals (GC). Participants were asked to examine the scenes of the Cinderella story and then retell the story in their own words. Computerized technologies were explored, and macro- and microstructural aspects of the produced discourses were analysed. For G1, discourse was also collected with the Cookie Theft ("Roubo dos Biscoitos") picture, the SPECT exam was analysed, and participants were followed longitudinally for six months.
RESULTS: Comparing the GC and G2 with respect to macrostructure, the G2 aphasics differed significantly from the GC in all propositions; with respect to microstructure, seven metrics were able to differentiate the two groups. There were significant macro- and microstructural differences between Broca's and anomic aphasic subjects. Differences in macro- and microstructure measures were observed in G1 as time after stroke advanced. The Cinderella story provided more complete microstructure data than the Cookie Theft picture. The SPECT results remained the same, showing no change with the evolution of the aphasia. CONCLUSION: Narrative production generated material for macrostructure and microstructure analysis, and both macro- and microstructural aspects differentiated cognitively healthy individuals from aphasic subjects. The Cinderella discourse analysis served as an instrument to measure language improvement in aphasic subjects. The computational tool aided the discourse analyses.
INTRODUCTION: Aphasic discourse analysis provides important information about the phonological, morphological, syntactic, semantic and pragmatic aspects of the language of patients who have suffered a stroke. The evaluation of discourse, along with other methods, can contribute to observation of the evolution of the language and communication of aphasic patients; however, manual analysis is laborious and can lead to errors. OBJECTIVES: (1) to analyze, by computerized technologies, macro- and microstructural aspects of the discourse of cognitively healthy individuals and of Broca's and anomic aphasics; (2) to explore discourse as an indicator of the evolution of aphasia; (3) to analyze the contribution of single photon emission computed tomography (SPECT) to verify the correlation between behavioral and neuroimaging evolution data. METHOD: Two groups of patients were studied: GA1, consisting of eight individuals with Broca's aphasia and anomic aphasia, who were analyzed longitudinally from the sub-acute phase of the lesion and after three and six months; GA2, composed of 15 individuals with Broca's and anomic aphasia, with varying times since stroke; and a GC consisting of 30 cognitively healthy participants. Computerized technologies were explored for the analysis of metrics related to the micro- and macrostructure of discourses elicited by the Cinderella story and the Cookie Theft picture. RESULTS: Comparing the GC and GA2 in relation to discourse macrostructure, the GA2 aphasics differed significantly from the GC in the total number of propositions emitted; considering the microstructure, seven metrics differentiated the two groups. There was a significant difference in macro- and microstructure between the discourses of Broca's aphasic subjects and anomic ones. It was possible to verify differences in macro- and microstructure measurements in GA1 with the advancement of injury time.
In GA1, the comparison between parameters in the sub-acute phase and after 6 months of stroke revealed differences in macrostructure: an increase in the number of propositions of the orientation block and in total propositions. Regarding the microstructure, the initial measures of syllables per content word, incidence of nouns and incidence of content words differed after 6 months of intervention. The incidence of words missing from the dictionary showed a significantly lower value after three months of stroke. Cinderella's story provided more complete microstructure data than the Cookie Theft picture. SPECT showed no change over time with the evolution of the aphasia. CONCLUSION: The discourse produced from the story of Cinderella and the Cookie Theft picture generated material for macrostructure and microstructure analysis of cognitively healthy and aphasic individuals, made it possible to quantify and qualify the evolution of language in different phases of stroke recovery, and distinguished the behavior of healthy individuals from those with Broca's and anomic aphasia in macro- and microstructural aspects. The exploration of computerized tools facilitated the analysis of the data in relation to the microstructure, but it was not applicable to the macrostructure, demonstrating the need for tool adjustments for the discourse analysis of patients. SPECT data did not reflect the behavioral improvement of the language of aphasic subjects.
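The microstructure metrics these studies extract automatically can be illustrated with a toy example. The metric set below (sentence count, word count, words per sentence, type-token ratio) is a simplified stand-in for the actual measures computed by the tool used in the study, and the sample sentence is invented.

```python
# Illustrative sketch of automatic microstructure metrics for a
# transcribed narrative; metric choice is a simplified assumption.
import re

def microstructure(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    return {
        "n_sentences": len(sentences),
        "n_words": len(words),
        "words_per_sentence": len(words) / len(sentences),
        "type_token_ratio": len(set(words)) / len(words),
    }

sample = "She lost the slipper. The prince found the slipper."
print(microstructure(sample))
```

Group comparisons then reduce to statistics over such per-narrative metric vectors, which is what makes the computerized approach far faster than manual coding.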
Style APA, Harvard, Vancouver, ISO itp.
23

Toledo, Cíntia Matsuda. "Análise de aspectos micro e macrolinguísticos da narrativa de indivíduos com doença de Alzheimer, comprometimento cognitivo leve e sem comprometimentos cognitivos". Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/5/5170/tde-11092017-133850/.

Pełny tekst źródła
Streszczenie:
INTRODUCTION: Population aging is a well-known social trend in developed countries and is increasingly pronounced in developing countries. Dementia is considered one of the main health problems due to the rapid growth of the elderly population, and language disorders are regarded as important in these conditions. Discourse has gained prominence for identifying linguistic disturbances in dementia and for following these patients. Characterizing the differences can aid differential diagnosis and contribute to future tools that support clinical intervention and help prevent the evolution and/or progression of dementia. Transcribing and analysing discourse is quite laborious, so computational methods have been used to help identify and extract linguistic features. OBJECTIVE: to identify changes in micro- and macrolinguistic aspects that differentiate individuals with Alzheimer's disease, individuals with mild cognitive impairment and elderly individuals without cognitive impairment in a sequenced-picture narrative task, and to explore a computational tool (Coh-Metrix-Dementia) for analysing these subjects' discourse. METHOD: Sixty individuals were evaluated, 20 in each research group (mild Alzheimer's disease - GDA, amnestic mild cognitive impairment - GCCLa, and control - GC). The individuals were asked to produce a narrative based on 22 sequenced scenes depicting the Cinderella story. The following linguistic-cognitive tests were also applied: Verbal Fluency, the Boston Naming Test, and the Camel and Cactus Test. Coh-Metrix-Dementia was used for automatic extraction of the metrics. RESULTS: The values extracted by Coh-Metrix-Dementia were treated statistically, and it was possible to identify metrics capable of distinguishing the groups studied.
Regarding microlinguistic aspects, the GDA showed reduced syntactic abilities, greater word-retrieval difficulty, and discourse with less cohesion and local coherence. At the macrolinguistic level, the GDA produced the least informative discourse, with greater impairment of global coherence and a larger number of modalizations. The GDA also showed greater impairment of narrative structure. No discourse metric in this study discriminated the GCCLa from the GC. Adaptations to sentence segmentation were made so the computational tool would work better. CONCLUSION: GDA individuals produced discourse with greater macro- and microstructural impairment. The computational tool proved an important ally for discourse analysis.
INTRODUCTION: Population aging is a social trend known in developed countries and increasingly pronounced in developing countries. Dementia is considered one of the main health problems due to the rapid population growth of the elderly, and language disorders are considered important in these settings. Discourse is important for the identification of linguistic disorders in dementias as well as in the follow-up of these patients. Characterizing discourse differences can help in differential diagnosis and contribute to the creation of future tools for clinical intervention, helping to prevent the evolution and/or progression of dementia. Transcription and discourse analysis are laborious, so computational methods were used to help identify and extract linguistic characteristics. OBJECTIVE: The objective of this study was to identify changes in micro- and macrolinguistic aspects that differentiate individuals with Alzheimer's disease, mild cognitive impairment and healthy elderly individuals during a narrative of figures in sequence, and to explore a computational tool (Coh-Metrix-Dementia) to analyze the subjects' discourse. METHODS: 60 subjects were evaluated, 20 in each research group (mild Alzheimer's disease - GDA, amnestic mild cognitive impairment - GCCLa, and control - GC). The subjects were asked to construct a narrative based on a sequence of pictures about the Cinderella story. The following linguistic-cognitive tests were also applied: Verbal Fluency, the Boston Naming Test, and the Camel and Cactus test. Coh-Metrix-Dementia was used for automatic metrics extraction. RESULTS: The values extracted by Coh-Metrix-Dementia were statistically treated, and it was possible to obtain metrics capable of distinguishing the studied groups. Regarding microlinguistic aspects, the GDA showed reduced syntactic abilities, greater difficulty in word retrieval, and discourse with less cohesion and local coherence.
At the macrolinguistic level, the GDA presented the least informative discourse, with greater loss of global coherence and a greater number of modalizations. The GDA also presented greater impairment of narrative structure. It was not possible to discriminate the GCCLa from the GC with any discourse metric in this study. CONCLUSION: The GDA subjects presented discourse with greater macro- and microstructural impairment. The use of the computational tool proved to be an important ally for discursive analysis.
Style APA, Harvard, Vancouver, ISO itp.
24

Forsyth, Rowena Public Health & Community Medicine Faculty of Medicine UNSW. "Tricky technology, troubled tribes: a video ethnographic study of the impact of information technology on health care professionals' practices and relationships". Awarded by: University of New South Wales. School of Public Health and Community Medicine, 2006. http://handle.unsw.edu.au/1959.4/30175.

Pełny tekst źródła
Streszczenie:
Whilst technology use has always been a part of the practice of health care delivery, more recently, information technology has been applied to aspects of clinical work concerned with documentation. This thesis presents an analysis of the ways that two professional groups, one clinical and one ancillary, at a single hospital cooperatively engage in a work practice that has recently been computerised. It investigates the way that a clinical group's approach to and actual use of the system creates problems for the ancillary group. It understands these problems to arise from the contrasting ways that the groups position their use of documentation technology in their local definitions of professional status. The data on which analysis of these practices is based includes 16 hours of video recordings of the work practices of the two groups as they engage with the technology in their local work settings as well as video recordings of a reflexive viewing session conducted with participants from the ancillary group. Also included in the analysis are observational field notes, interviews and documentary analysis. The analysis aimed to produce a set of themes grounded in the specifics of the data, and drew on TLSTranscription software for the management and classification of video data. This thesis seeks to contribute to three research fields: health informatics, sociology of professions and social science research methodology. In terms of health informatics, this thesis argues for the necessity for health care information technology design to understand and incorporate the work practices of all professional groups who will be involved in using the technology system or whose work will be affected by its introduction. In terms of the sociology of professions, this thesis finds doctors and scientists to belong to two distinct occupational communities that each utilise documentation technology to different extents in their displays of professional competence.
Thirdly, in terms of social science research methodology, this thesis speculates about the possibility for viewing the engagement of the groups with the research process as indicative of their reactions to future sources of outside perturbance to their work.
Style APA, Harvard, Vancouver, ISO itp.
25

HUANG, YI-TING, i 黃翊婷. "Data Processing and conversion in Equipment Intelligent Diagnosis". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/jyh9e3.

Pełny tekst źródła
Streszczenie:
Master's thesis
National Kaohsiung First University of Science and Technology
Master's Program, Department of Safety, Health and Environmental Engineering
Academic year 106 (ROC calendar)
An intelligent diagnosis system can obtain potential information by analyzing equipment maintenance records to establish a diagnosis model. An appropriate diagnosis method and complete records are therefore the foundation of an intelligent diagnosis system. However, missing values, outliers, or redundant data in the records may result in poor data quality and decrease the accuracy of the diagnostic results. Moreover, different diagnostic methods have their own requirements for data types, formats, and so on. To improve data quality and meet the needs of the diagnosis method, a series of data pre-processing steps is required. This study takes a reciprocating compressor as its object of study, discusses the data required for an intelligent diagnostic system, and proposes a two-stage data pre-processing procedure for equipment maintenance records. In addition, an experiment was conducted using a neural network as the diagnostic method to compare the data transformation and dimensionality reduction results of different normalization methods and principal component analysis (PCA), and their impact on the diagnostic results. The results show that different normalization methods have different characteristics and will change the distribution of the data. PCA is a widely used dimensionality reduction method, but there are still limitations in its application: when performing equipment fault diagnosis, PCA may ignore small changes in the fault characteristics of the equipment. An intelligent diagnosis system based on equipment maintenance records appears feasible, but companies promoting it will face problems of insufficient data and poor data quality, which leads to about 75% of project time being spent on data processing. If companies keep complete records and research on intelligent diagnosis continues to develop, the goal of intelligent diagnosis can eventually be achieved.
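The two normalization choices the study compares behave quite differently, as the abstract notes: min-max scaling maps values into [0, 1] while preserving relative spacing, whereas z-scoring centres the data at zero with unit variance. The sketch below contrasts the two on a fabricated vibration reading with one fault-like outlier; the numbers are invented for illustration.

```python
# Sketch of the two normalization methods compared in the study.
def min_max(xs):
    """Scale values into [0, 1]; extremes are pinned to 0 and 1."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def z_score(xs):
    """Centre at mean 0 with (population) standard deviation 1."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / sd for x in xs]

vibration = [2.0, 2.5, 3.0, 9.0]       # one fault-like outlier, fabricated
print(min_max(vibration))              # outlier pinned at 1.0
print([round(z, 2) for z in z_score(vibration)])
```

The outlier dominates the min-max range and compresses the normal readings toward zero, which illustrates why, as the study argues, the choice of normalization (and any subsequent PCA projection) can mask the small fault-characteristic variations that matter for diagnosis.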
Style APA, Harvard, Vancouver, ISO itp.
26

Jhan, Bo-Yan, i 詹伯彥. "A Medical Tongue Diagnosis Assistant System - Image and Clinical Data Processing". Thesis, 2015. http://ndltd.ncl.edu.tw/handle/5j47tv.

Pełny tekst źródła
Streszczenie:
Master's thesis
National Chung Hsing University
Department of Computer Science and Engineering
Academic year 103 (ROC calendar)
In Traditional Chinese Medicine (TCM), tongue diagnosis is as important as pulse diagnosis, yet the objectivity and consistency of its dialectics have long been an open issue. In this study, we built a tongue-diagnosis-evaluation data collection system for the analysis of tongue images, collecting a large volume of tongue images and their features. These clinical data lay a foundation for tongue diagnosis research and will facilitate later studies. When clinical data are applied in medical studies, their reliability directly affects the credibility of the results. This study uses a cloud storage architecture and, through the design of the human-computer interface, provides a consistent, standardized process for collecting evaluations from multiple doctors, avoiding incomplete diagnostic information. Captured tongue images are color-corrected with respect to both the clinic environment and the image-capture environment. Studies from the past decade had physicians evaluate only standard-colored images, which can bias feature evaluation because the color information cannot be adapted to different environments. To address this, we provide a consistent, standardized color-correction process and additionally perform correction based on the clinic environment, producing images better suited to the doctors. With both standard color correction and clinic-environment-based correction, evaluation results become more consistent and the clinical data more reliable for feature analysis. Experimental measurements of color consistency, continuity, and stability show that the proposed color-correction method is reliable.
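Color correction of the kind described is often implemented as a linear map fitted on reference patches. A hedged sketch (NumPy; the 3x3-matrix approach and the patch values are illustrative assumptions, not the thesis's actual method):

```python
import numpy as np

def fit_color_correction(measured, reference):
    """Fit a 3x3 linear matrix M such that measured @ M ~ reference.

    measured, reference: (N, 3) RGB values of the same color patches,
    as captured by the clinic camera and under a standard illuminant.
    """
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M

# Hypothetical calibration patches (e.g. from a small color checker
# placed in the tongue-image capture environment).
reference = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0], [0.5, 0.5, 0.5]])
# Simulate a camera with channel cross-talk by mixing the reference colors.
measured = reference @ np.array([[0.9, 0.05, 0.0],
                                 [0.05, 0.8, 0.1],
                                 [0.0, 0.1, 0.95]])
M = fit_color_correction(measured, reference)
corrected = measured @ M
print(np.allclose(corrected, reference, atol=1e-6))  # True
```

Fitting one matrix against a standard illuminant and another against the clinic lighting would give the two correction modes the abstract distinguishes.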
Style APA, Harvard, Vancouver, ISO itp.
27

Dendamrongvit, Thidarat. "An ontology-based system for representation and diagnosis of electrocardiogram (ECG) data". Thesis, 2006. http://hdl.handle.net/1957/28946.

Pełny tekst źródła
Streszczenie:
Electrocardiogram (ECG) data are stored and analyzed in different formats, devices, and computer platforms. There is a need for an independent platform to support ECG processing across different resources, both to improve the quality of health care and to disseminate research results. Currently, ECG devices are proprietary: devices from different manufacturers cannot communicate with each other. It is crucial to have an open standard to manage ECG data for representation and diagnosis. This research explores methods for the representation and diagnosis of ECG by developing an Ontology for shared ECG data based on the Health Level Seven (HL7) standard. The developed Ontology bridges the conceptual gap by integrating ECG waveform data, HL7 standard data descriptions, and cardiac diagnosis rules. The Ontology is encoded in Extensible Markup Language (XML), providing a human- and machine-readable format. Thus, the interoperability issue is resolved and ECG data can be shared among different ECG devices and systems. The Ontology also provides diagnostic decision support through an automated ECG diagnosis system that assists medical technicians and physicians in diagnosing cardiac disease. An experiment was conducted to validate the interoperability of the Ontology and to assess the accuracy of the diagnosis model it provides. Results showed 100% interoperability for ECG data drawn from eight different databases, and 93% accuracy in diagnosing normal and abnormal cardiac conditions.
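The idea of an XML-encoded, rule-driven ECG representation can be sketched briefly. The schema below is a hypothetical simplification, not the actual HL7 aECG format, and the rate rule is a textbook example of the kind of rule an ontology could encode declaratively:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XML fragment; the real HL7 aECG schema is
# far richer (this only illustrates the machine-readable idea).
doc = """
<ecg>
  <lead name="II">
    <rrIntervalMs>1250</rrIntervalMs>
    <qrsDurationMs>95</qrsDurationMs>
  </lead>
</ecg>
"""

root = ET.fromstring(doc)
rr = float(root.find("./lead/rrIntervalMs").text)
heart_rate = 60000.0 / rr  # beats per minute from the R-R interval

# A toy rule of the kind an ontology could carry alongside the data.
if heart_rate < 60:
    diagnosis = "sinus bradycardia (rule fired: HR < 60 bpm)"
elif heart_rate > 100:
    diagnosis = "sinus tachycardia (rule fired: HR > 100 bpm)"
else:
    diagnosis = "normal rate"
print(round(heart_rate), diagnosis)  # 48 sinus bradycardia ...
```

Because both the waveform description and the rules live in one machine-readable document, any compliant reader can reproduce the same diagnostic decision.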
Graduation date: 2006
Style APA, Harvard, Vancouver, ISO itp.
28

Borisov, Nedyalko Krasimirov. "Integrated Management of the Persistent-Storage and Data-Processing Layers in Data-Intensive Computing Systems". Diss., 2012. http://hdl.handle.net/10161/5806.

Pełny tekst źródła
Streszczenie:

Over the next decade, it is estimated that the number of servers (virtual and physical) in enterprise datacenters will grow by a factor of 10, the amount of data managed by these datacenters will grow by a factor of 50, and the number of files the datacenter has to deal with will grow by a factor of 75. Meanwhile, the skilled information technology (IT) staff available to manage the growing number of servers and data will grow by less than a factor of 1.5. Thus, a system administrator will face the challenging task of managing larger and larger numbers of production systems. We have developed solutions to make the system administrator more productive by automating some of the hard and time-consuming tasks in system management. In particular, we make new contributions in the Monitoring, Problem Diagnosing, and Testing phases of the system management cycle.

We start by describing our contributions in the Monitoring phase. We have developed a tool called Amulet that can continuously monitor and proactively detect problems on production systems. A notoriously hard problem that Amulet can detect is that of data corruption where bits of data in persistent storage differ from their true values. Once a problem is detected, our DiaDS tool helps in diagnosing the cause of the problem. DiaDS uses a novel combination of machine learning techniques and domain knowledge encoded in a symptoms database to guide the system administrator towards the root cause of the problem.

Before applying any change (e.g., changing a configuration parameter setting) to the production system, the system administrator needs to thoroughly understand the effect that this change can have. Well-meaning changes to production systems have led to performance or availability problems in the past. For this phase, our Flex tool enables administrators to evaluate the change hypothetically, in a manner that is fairly accurate while avoiding overheads on the production system. We have conducted a comprehensive evaluation of Amulet, DiaDS, and Flex in terms of effectiveness, efficiency, integration of these contributions in the system management cycle, and how these tools bring data-intensive computing systems closer to the goal of self-managing systems.


Dissertation
Style APA, Harvard, Vancouver, ISO itp.
29

Yuan, Soe-Tsyr. "Knowledge-based decision model construction for hierarchical diagnosis and repair". Thesis, 1994. http://hdl.handle.net/1957/35300.

Pełny tekst źródła
Streszczenie:
Knowledge-Based Model Construction (KBMC) has attracted much attention as a technique for generating probabilistic or decision-theoretic models, vastly increasing their range of applicability in AI. However, no one has tried to analyze the essential issues in KBMC, to determine whether a general efficient KBMC method exists for any problem domain, or to identify fruitful future research on KBMC. This research presents a unified framework for the comparative analysis of KBMC systems, identifying the essential issues in KBMC, showing that no such general efficient KBMC method exists, and listing fruitful directions for future research. The thesis then presents a new KBMC mechanism for hierarchical diagnosis and repair. Diagnosis is formulated as a stochastic process and modeled using influence diagrams. In the best case, using an abstraction hierarchy in problem-solving can yield an exponential speedup in search efficiency. However, this speedup assumes that backtracking never occurs across abstraction levels. When this assumption fails, search may have to consider several abstract solutions before finding one that can be refined to a base solution, and search efficiency is not necessarily improved. In this thesis, we present a decision model construction method for hierarchical diagnosis and repair. We show analytically and experimentally that our method always yields a significant speedup in search efficiency, and that hierarchies with smaller branching factors yield larger efficiency gains. The thesis employs two causal pathways (functional and bridge fault) of domain knowledge in device troubleshooting; omitting either would leave a whole class of faults that could never be diagnosed. Each causal pathway models the knowledge of adjacency and behavior within the corresponding interaction layer. Careful search of the causal pathways allows us to restrict the search space of fault hypotheses at each step.
We model this search among causal pathways decision-theoretically. Decision-theoretic control usually results in significant improvements over unaided human expert judgments. Furthermore, these improvements in performance are robust to substantial errors in the assessed costs and probabilities.
Graduation date: 1995
Style APA, Harvard, Vancouver, ISO itp.
30

Kan, John Priscilla. "Discrete and hybrid methods for the diagnosis of distributed systems". Phd thesis, 2013. http://hdl.handle.net/1885/156149.

Pełny tekst źródła
Streszczenie:
Many important activities of modern society rely on the proper functioning of complex systems such as electricity networks, telecommunication networks, manufacturing plants, and aircraft. The supervision of such systems must include strong diagnosis capabilities to detect the occurrence of faults effectively and ensure that appropriate corrective measures can be taken to recover from the faults or prevent total failure. This thesis addresses issues in the diagnosis of large complex systems. Such systems are usually distributed in nature, i.e. they consist of many interconnected components, each with its own local behaviour. These components interact to produce an emergent global behaviour that is complex. As these systems increase in complexity and size, their diagnosis becomes increasingly challenging. In the first part of this thesis, a method is proposed for diagnosing distributed systems that avoids a monolithic global computation. The method, based on converting the graph of the system into a junction tree, takes the topology of the system into account when choosing how to merge local diagnoses on the components, while still obtaining a globally consistent result. The method is shown to work well for systems with tree or near-tree structures, and is further extended to handle systems with high clustering by selectively ignoring some connections in a way that still allows an accurate diagnosis. A hybrid system approach is explored in the second part of the thesis, where continuous dynamics information on the system is also retained to help better isolate or identify faults. A hybrid system framework is presented that models both continuous dynamics and discrete evolution in dynamical systems, based on detecting changes in the fundamental governing dynamics of the system rather than on residual estimation. This makes it possible to handle systems that might not be well characterised and where parameter drift is present.
The discrete aspect of the hybrid system model is used to derive diagnosability conditions, using indicator functions, for the detection and isolation of multiple, arbitrary, sequential or simultaneous events in hybrid dynamical networks. Issues with diagnosis under measurement uncertainty due to sensor or actuator noise are also addressed: faults may generate symptoms of the same order of magnitude as the noise. The use of statistical techniques within a hybrid system framework is proposed to detect these elusive fault symptoms and translate them into probabilities for the current operational mode and for transitions between modes, making it possible to apply probabilistic analysis to handle the underlying uncertainty.
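Detecting fault symptoms of the same magnitude as the noise is the classic setting for sequential statistical tests such as CUSUM. A hedged sketch (NumPy; the signal, shift, and thresholds are invented for illustration, and the thesis's actual statistical machinery may differ):

```python
import numpy as np

def cusum(signal, target, threshold, drift=0.0):
    """One-sided CUSUM: return the first index at which the cumulative
    positive deviation from `target` exceeds `threshold`, else None."""
    s = 0.0
    for i, x in enumerate(signal):
        s = max(0.0, s + (x - target - drift))  # drift = allowance term
        if s > threshold:
            return i
    return None

rng = np.random.default_rng(1)
# Noisy sensor reading; a small mean shift, of the same order as the
# noise, begins at sample 100 (the "elusive" fault symptom).
signal = np.concatenate([rng.normal(0.0, 1.0, 100),
                         rng.normal(0.7, 1.0, 100)])
alarm = cusum(signal, target=0.0, threshold=15.0, drift=0.2)
print(alarm)  # alarm index (None if the test never fired)
```

No single sample is distinguishable from noise here; only the accumulated statistic crosses the threshold, which is why a per-sample comparison would miss such faults.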
Style APA, Harvard, Vancouver, ISO itp.
31

"Intelligent online monitoring and diagnosis for metal stamping operations". 2003. http://library.cuhk.edu.hk/record=b6073523.

Pełny tekst źródła
Streszczenie:
"March 2003."
Thesis (Ph.D.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (p. 183-193).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Mode of access: World Wide Web.
Abstracts in English and Chinese.
Style APA, Harvard, Vancouver, ISO itp.
32

"An approach to diagnose cardiac conditions from electrocardiogram signals". 2011. http://library.cuhk.edu.hk/record=b5894714.

Pełny tekst źródła
Streszczenie:
Lu, Yan.
"October 2010."
Thesis (M.Phil.)--Chinese University of Hong Kong, 2011.
Includes bibliographical references (leaves 65-68).
Abstracts in English and Chinese.
Abstract --- p.i
Acknowledgement --- p.iv
Chapter 1. --- Introduction --- p.1
Chapter 1.1 --- Electrocardiogram --- p.1
Chapter 1.1.1 --- ECG Measurement --- p.2
Chapter 1.1.2 --- Cardiac Conduction Pathway and ECG Morphology --- p.4
Chapter 1.1.3 --- A Basic Clinical Approach to ECG Analysis --- p.6
Chapter 1.2 --- Cardiovascular Disease --- p.7
Chapter 1.3 --- Motivation --- p.9
Chapter 1.4 --- Related Work --- p.10
Chapter 1.5 --- Overview of Proposed Approach --- p.11
Chapter 1.6 --- Thesis Outline --- p.13
Chapter 2. --- ECG Signal Preprocessing --- p.14
Chapter 2.1 --- ECG Model and Its Generalization --- p.14
Chapter 2.1.1 --- ECG Dynamic Model --- p.14
Chapter 2.1.2 --- Generalization of ECG Model --- p.15
Chapter 2.2 --- Empirical Mode Decomposition --- p.17
Chapter 2.3 --- Baseline Wander Removal --- p.20
Chapter 2.3.1 --- Sources of Baseline Wander --- p.20
Chapter 2.3.2 --- Baseline Wander Removal by EMD --- p.20
Chapter 2.3.3 --- Experiments on Baseline Wander Removal --- p.21
Chapter 2.4 --- ECG Denoising --- p.24
Chapter 2.4.1 --- Introduction --- p.24
Chapter 2.4.2 --- Instantaneous Frequency --- p.26
Chapter 2.4.3 --- Problem of Direct ECG Denoising by EMD : --- p.28
Chapter 2.4.4 --- Model-based Pre-filtering --- p.30
Chapter 2.4.5 --- EMD Denoising Using Significance Test --- p.33
Chapter 2.4.6 --- EMD Denoising using Instantaneous Frequency --- p.35
Chapter 2.4.7 --- Experiments --- p.39
Chapter 2.5 --- Chapter Summary --- p.44
Chapter 3. --- ECG Classification --- p.45
Chapter 3.1 --- Database --- p.45
Chapter 3.2 --- Feature Extraction --- p.46
Chapter 3.2.1 --- Feature Selection --- p.46
Chapter 3.2.2 --- Feature Dimension Reduction by GDA --- p.48
Chapter 3.3 --- Classification by Support Vector Machine --- p.50
Chapter 3.4 --- Experiments --- p.53
Chapter 3.4.1 --- Performance of Feature Reduction --- p.54
Chapter 3.4.2 --- Performance of Classification --- p.57
Chapter 3.4.3 --- Performance Comparison with Other Works --- p.60
Chapter 3.5 --- Chapter Summary --- p.61
Chapter 4. --- Conclusions --- p.63
Reference --- p.65
Style APA, Harvard, Vancouver, ISO itp.
33

Binczyk, Franciszek Eugeniusz. "Processing and analysis of data obtained using Nuclear Magnetic Resonance technology in the diagnosis and treatment of brain tumours". Rozprawa doktorska, 2016. https://repolis.bg.polsl.pl/dlibra/docmetadata?showContent=true&id=62588.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
34

Binczyk, Franciszek Eugeniusz. "Processing and analysis of data obtained using Nuclear Magnetic Resonance technology in the diagnosis and treatment of brain tumours". Rozprawa doktorska, 2016. https://delibra.bg.polsl.pl/dlibra/docmetadata?showContent=true&id=62588.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
35

"A new stethoscope for reduction of heart sounds from lung sound recordings". 2001. http://library.cuhk.edu.hk/record=b5890844.

Pełny tekst źródła
Streszczenie:
Yip Lung.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2001.
Includes bibliographical references.
Abstracts in English and Chinese.
Chapter 1 --- Introduction
Chapter 1.1 --- Heart and Lung Diseases --- p.1
Chapter 1.1.1 --- Hong Kong --- p.1
Chapter 1.1.2 --- China --- p.2
Chapter 1.1.3 --- the United States of America (USA) --- p.3
Chapter 1.2 --- Auscultation --- p.3
Chapter 1.2.1 --- Introduction of Auscultation --- p.4
Chapter 1.2.2 --- Comparison between Auscultation and Ultrasound --- p.6
Chapter 1.3 --- Stethoscope --- p.7
Chapter 1.3.1 --- History of Stethoscope --- p.7
Chapter 1.3.2 --- New Electronic Stethoscope --- p.14
Chapter 1.4 --- Main Purpose of the Study --- p.16
Chapter 1.5 --- Organization of Thesis --- p.16
References --- p.18
Chapter 2 --- A New Electronic Stethoscope's Head
Chapter 2.1 --- Introduction --- p.20
Chapter 2.2 --- Biopotential Electrode --- p.21
Chapter 2.2.1 --- Flexible Electrode --- p.21
Chapter 2.2.2 --- Laplacian Electrocardiogram --- p.22
Chapter 2.3 --- Transducer --- p.25
Chapter 2.4 --- Design of the Head of Stethoscope --- p.26
Chapter 2.5 --- Experimental Results --- p.27
Chapter 2.5.1 --- Bias Voltage of Condenser Microphone --- p.27
Chapter 2.5.2 --- Frequency Response of New Stethoscope's Head --- p.29
Chapter 2.6 --- Discussion --- p.30
Chapter 2.7 --- Section Summary --- p.31
References --- p.33
Chapter 3 --- Signal Pre-processing Unit
Chapter 3.1 --- Introduction --- p.35
Chapter 3.2 --- High Input Impedance IC Amplifier --- p.36
Chapter 3.3 --- Voltage Control Voltage Source High Pass Filter Circuit --- p.37
Chapter 3.4 --- Multiple Feed Back Low Pass Filter Circuit --- p.39
Chapter 3.5 --- Overall Circuit --- p.41
Chapter 3.6 --- Experimental Results --- p.43
Chapter 3.7 --- Discussion --- p.46
Chapter 3.8 --- Section Summary --- p.47
References --- p.48
Chapter 4 --- Central Platform
Chapter 4.1 --- Introduction --- p.49
Chapter 4.2 --- Adaptive Filter --- p.49
Chapter 4.2.1 --- Introduction to Adaptive Filtering --- p.49
Chapter 4.2.2 --- Least-Mean-Square (LMS) Algorithm --- p.51
Chapter 4.2.3 --- Applications --- p.52
Chapter 4.3 --- Offline Processing --- p.54
Chapter 4.3.1 --- WINDAQ and MATLAB --- p.55
Chapter 4.3.2 --- Direct Reference Algorithm --- p.57
Chapter 4.3.3 --- Determination of Parameters in DRA --- p.62
Chapter 4.3.4 --- Experimental Results of DRA --- p.67
Chapter 4.3.5 --- Acoustic Waveform Based Algorithm --- p.72
Chapter 4.3.6 --- Experimental Results of AWBA --- p.81
Chapter 4.4 --- Online Processing --- p.85
Chapter 4.4.1 --- LABVIEW --- p.85
Chapter 4.4.2 --- Automated Gain Control --- p.88
Chapter 4.4.3 --- Implementation of LMS adaptive filter --- p.89
Chapter 4.4.4 --- Experimental Results of Online-AGC --- p.92
Chapter 4.5 --- Discussion --- p.93
Chapter 4.6 --- Section Summary --- p.97
References --- p.98
Chapter 5 --- Conclusion and Further Development
Chapter 5.1 --- Conclusion of the Main Contribution --- p.100
Chapter 5.2 --- Future Works --- p.102
Chapter 5.2.1 --- Modification of the Head of Stethoscope --- p.102
Chapter 5.2.2 --- Validation of Abnormal Breath --- p.102
Chapter 5.2.3 --- Low Frequency Analysis --- p.102
Chapter 5.2.4 --- AGC-AWBA Approach --- p.102
Chapter 5.2.5 --- Standalone Device --- p.103
Chapter 5.2.6 --- Demand on Stethoscope --- p.109
References --- p.110
Appendix
Chapter A.1 --- Determination of parameters in VCVS High Pass Filter --- p.106
Chapter A.2 --- Determination of parameters in MFB Low Pass Filter --- p.110
Chapter A.3 --- Source code of DRA (MATLAB) --- p.114
Chapter A.4 --- Source code of AWBA (MATLAB) --- p.129
Chapter A.5 --- Source code of online AGC (LABVIEW) --- p.134
Style APA, Harvard, Vancouver, ISO itp.
36

Patel, Jay Sureshbhai. "Utilizing Electronic Dental Record Data to Track Periodontal Disease Change". Diss., 2020. http://hdl.handle.net/1805/23677.

Pełny tekst źródła
Streszczenie:
Indiana University-Purdue University Indianapolis (IUPUI)
Periodontal disease (PD) affects 42% of the US population, resulting in compromised quality of life, potential tooth loss, and effects on overall health. Despite a significant understanding of PD etiology, few longitudinal studies have investigated PD change in response to various treatments. A major barrier is the difficulty of conducting randomized controlled trials with adequate numbers of patients over a long period. Electronic dental record (EDR) data offer the opportunity to study outcomes following various periodontal treatments; however, using EDR data for research has challenges, including data quality and missing data. In this dissertation, I studied a cohort of patients with PD from the EDR to monitor their disease status over time: a retrospective cohort of 28,908 patients who received a comprehensive oral evaluation at the Indiana University School of Dentistry between January 1, 2009 and December 31, 2014. Using natural language processing and automated approaches, we 1) determined PD diagnoses from periodontal charting based on case definitions for surveillance studies, 2) extracted clinician-recorded diagnoses from clinical notes, and 3) determined the number of patients with disease improvement or progression over time from EDR data. We found 100% completeness for age and sex, 72% for race, 80% for periodontal charting findings, and 47% for clinician-recorded diagnoses. The number of visits ranged from 1 to 14, with an average of two. From diagnoses derived from charting findings, 37% of patients had gingivitis, 55% had moderate periodontitis, and 28% had severe periodontitis; in clinician-recorded diagnoses, 50% had gingivitis, 18% mild, 14% moderate, and 4% severe periodontitis. The concordance between charting-generated and clinician-recorded diagnoses was 47%. These results indicate that the case definitions underestimate gingivitis and overestimate the prevalence of periodontitis. Expert review found that clinicians relied on visual assessment and radiographic findings, in addition to the case definition criteria, when documenting PD diagnoses.
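Surveillance case definitions of the kind applied to the charting data are rule-based. A toy sketch with CDC/AAP-style thresholds (the exact cut-offs and the helper below are illustrative assumptions, not the dissertation's rules):

```python
def classify_periodontitis(sites):
    """Toy surveillance-style case definition over full-mouth charting.

    `sites`: list of (tooth_number, cal_mm, pd_mm) for interproximal
    sites, where cal = clinical attachment loss, pd = probing depth.
    Thresholds are illustrative of CDC/AAP-style definitions only.
    """
    teeth_cal6 = {t for t, cal, pd in sites if cal >= 6}
    teeth_cal4 = {t for t, cal, pd in sites if cal >= 4}
    sites_pd5 = sum(1 for t, cal, pd in sites if pd >= 5)
    if len(teeth_cal6) >= 2 and sites_pd5 >= 1:
        return "severe periodontitis"
    if len(teeth_cal4) >= 2 or sites_pd5 >= 2:
        return "moderate periodontitis"
    return "no/mild periodontitis"

# Hypothetical charting for one patient: deep pockets on teeth 3 and 14.
charting = [(3, 6, 5), (14, 7, 6), (19, 4, 3)]
print(classify_periodontitis(charting))  # severe periodontitis
```

Because such rules see only numeric charting, they cannot use the visual and radiographic evidence clinicians draw on, which is one plausible source of the 47% concordance reported above.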
2021-08-10
Style APA, Harvard, Vancouver, ISO itp.
37

"Synopsis of video streams and its application to computer aided diagnosis for GI tract abnormalities based on wireless capsule endoscopy (CE) video". 2012. http://library.cuhk.edu.hk/record=b5549629.

Pełny tekst źródła
Streszczenie:
Wireless Capsule Endoscopy (CE) is a non-invasive technology for inspecting the whole gastrointestinal (GI) tract, especially the small intestine. It has dramatically changed the diagnosis and management of many diseases of the small intestine, such as obscure gastrointestinal bleeding, Crohn's disease, small bowel tumors, and polyposis syndromes. Despite its promising clinical findings, it still has limitations. The main problem is that each examination produces approximately 50,000 low-quality images whose manual assessment is highly time-consuming and labor-intensive.
CE analysis and assessment have so far treated CE images as individual, independent observations. This is clearly not the case, as there is often significant overlap among images. In particular, CE captures multiple views of the same anatomy as the capsule is slowly propelled by peristalsis. Our broader work aims to perform computer aided diagnosis (CAD) in endoscopy using all available information, including multiple images.
In this dissertation, a framework of multi-class Hidden Markov Models (HMM) embedded with statistical classifiers for combining information from multiple CE images is proposed. Because of the low quality of CE images, pre-processing is performed to enhance them by increasing contrast and removing noise; several image enhancement methods are investigated and customized for CE images. For frame-based supervised classification, AdaBoost is used as the ensemble classifier to combine multiple classifiers, i.e. support vector machine (SVM), k-nearest neighbor (k-NN), and a Bayes classifier. Before classification, color, edge, and texture features are extracted and fused. Finally, both supervised and unsupervised methods are proposed for CE study synopsis. For the supervised method, a flexible and extensible framework based on HMM is developed to integrate temporal information in CE images; it can be extended to multiple classes, features, and states. Improvements are obtained with a combined HMM and a Parallel HMM (PHMM), introduced as decision-level fusion schemes: the combined HMM considers different sources via a multi-layer HMM model to perform classification and video synopsis, while the PHMM employs Bayesian inference to combine the recognition results at the decision level. For the unsupervised method, illumination-independent opponent color moment invariants and a local binary pattern (LBP) based on the Contourlet transform are explored as color and texture features, respectively. Pair-wise image dissimilarity is measured in the feature space and treated as points on an open contour in a 2-D plane. The CE video is segmented at sudden change points, detected using a non-parametric key-point detection method, and representative frames are extracted from each segment to summarize the CE video. Validation results on simulated and real patient data show promising performance of the proposed framework. It has great potential to achieve automatic assessment of CE images.
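The benefit of integrating temporal information with an HMM over frame-by-frame classification can be shown with a small Viterbi decode. A sketch (NumPy; the two-class setup, probabilities, and transition matrix are invented for illustration and are not the thesis's trained model):

```python
import numpy as np

def viterbi(obs_loglik, log_trans, log_init):
    """obs_loglik: (T, S) per-frame log-likelihoods; returns the most
    probable state path under the given HMM parameters."""
    T, S = obs_loglik.shape
    delta = log_init + obs_loglik[0]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans      # (from_state, to_state)
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + obs_loglik[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# States: 0 = normal mucosa, 1 = lesion. A "sticky" transition matrix
# encodes that adjacent frames usually show the same tissue.
log_trans = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))
log_init = np.log(np.array([0.5, 0.5]))
# Frame 2 weakly favors "lesion" (0.6 vs 0.4): a per-frame classifier
# would flip, but the temporal model smooths the isolated outlier away.
frame_loglik = np.log(np.array([[0.9, 0.1], [0.9, 0.1], [0.4, 0.6],
                                [0.9, 0.1], [0.9, 0.1]]))
print(viterbi(frame_loglik, log_trans, log_init))  # [0, 0, 0, 0, 0]
```

A sustained run of lesion-favoring frames, by contrast, would outweigh the switching penalty and the decoded path would change state, which is exactly the temporal-integration effect the framework exploits.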
Zhao, Qian.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2012.
Includes bibliographical references (leaves 142-175).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstract also in Chinese.
Abstract --- p.ii
Acknowledgments --- p.vii
List of Tables --- p.xiii
List of Figures --- p.xv
Chapter 1 --- The Relevance of Synopsis --- p.1
Chapter 1.1 --- Problem Statement --- p.1
Chapter 1.2 --- Application - Capsule Endoscopy Assessment --- p.4
Chapter 1.3 --- Literature Review --- p.9
Chapter 1.3.1 --- Methods Based on Frame Classification --- p.11
Chapter 1.3.2 --- Methods Integrating Temporal Information --- p.14
Chapter 1.4 --- Contributions --- p.19
Chapter 1.5 --- Organization --- p.23
Chapter 2 --- Preliminary --- p.25
Chapter 2.1 --- Hidden Markov Model (HMM) --- p.25
Chapter 2.2 --- Factorial HMM --- p.35
Chapter 3 --- Temporal Integration in Capsule Endoscopy Image Analysis --- p.37
Chapter 3.1 --- Pre-processing --- p.38
Chapter 3.2 --- Feature Extraction --- p.43
Chapter 3.3 --- Frame-based Supervised Classification --- p.47
Chapter 3.3.1 --- Supervised Classification using Individual Frames --- p.47
Chapter 3.3.2 --- Ensemble Learning Based on AdaBoost --- p.50
Chapter 3.4 --- Sequence-based Supervised Classification --- p.52
Chapter 3.5 --- Experiments --- p.58
Chapter 3.5.1 --- Capsule Endoscopy Image Enhancement --- p.60
Chapter 3.5.2 --- Frame-based Supervised Classification --- p.67
Chapter 3.5.3 --- Image Sequence Classification --- p.68
Chapter 3.6 --- Discussion --- p.80
Chapter 3.7 --- Summary --- p.82
Chapter 4 --- Capsule Endoscopy Study Synopsis --- p.98
Chapter 4.1 --- Supervised Synopsis Using Statistical Models --- p.98
Chapter 4.2 --- Unsupervised Synopsis via Representative Frame Extraction --- p.100
Chapter 4.2.1 --- Feature Extraction --- p.100
Chapter 4.2.2 --- Non-parametric Key-point Detection --- p.111
Chapter 4.2.3 --- Representative Frame Extraction --- p.112
Chapter 4.3 --- Experiments --- p.119
Chapter 4.3.1 --- Supervised Synopsis Based on HMM --- p.119
Chapter 4.3.2 --- Unsupervised Synopsis --- p.125
Chapter 4.4 --- Discussion --- p.132
Chapter 4.5 --- Summary --- p.133
Chapter 5 --- Conclusions and Future Work --- p.138
Chapter 5.1 --- Conclusions --- p.138
Chapter 5.2 --- Future Work --- p.141
Bibliography --- p.142
Style APA, Harvard, Vancouver, ISO itp.
38

Beggs, Clive B., Simon J. Shepherd i P. Zamboni. "Cerebral venous outflow resistance and interpretation of cervical plethysmography data with respect to the diagnosis of chronic cerebrospinal venous insufficiency". 2014. http://hdl.handle.net/10454/10606.

Pełny tekst źródła
Streszczenie:
PURPOSE: To investigate cerebrospinal fluid (CSF) dynamics in the aqueduct of Sylvius (AoS) in chronic cerebrospinal venous insufficiency (CCSVI)-positive and -negative healthy individuals using cine phase contrast imaging. MATERIALS AND METHODS: Fifty-one healthy individuals (32 CCSVI-negative and 19 age-matched CCSVI-positive subjects) were examined using Doppler sonography (DS). Diagnosis of CCSVI was established if subjects fulfilled >/=2 venous hemodynamic criteria on DS. CSF flow and velocity measures were quantified using a semiautomated method and compared with clinical and routine 3T MRI outcomes. RESULTS: CCSVI was associated with increased CSF pulsatility in the AoS. Net positive CSF flow was 32% greater in the CCSVI-positive group compared with the CCSVI-negative group (P = 0.008). This was accompanied by a 28% increase in the mean aqueductal characteristic signal (ie, the AoS cross-sectional area over the cardiac cycle) in the CCSVI-positive group compared with the CCSVI-negative group (P = 0.021). CONCLUSION: CSF dynamics are altered in CCSVI-positive healthy individuals, as demonstrated by increased pulsatility. This is accompanied by enlargement of the AoS, suggesting that structural changes may be occurring in the brain parenchyma of CCSVI-positive healthy individuals.
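Group comparisons with reported P values of this kind typically rest on a two-sample test such as Welch's t. A sketch with synthetic data (the flow values are invented to mirror the reported 32% difference and the 19-vs-32 group sizes; they are not the study's measurements):

```python
import math
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for an unequal-variance two-sample comparison."""
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / math.sqrt(se2)

rng = np.random.default_rng(42)
# Hypothetical net CSF flow values (arbitrary units): the CCSVI-positive
# group mean is set ~32% higher, as in the reported result.
ccsvi_pos = rng.normal(1.32, 0.2, 19)
ccsvi_neg = rng.normal(1.00, 0.2, 32)
t = welch_t(ccsvi_pos, ccsvi_neg)
print(f"Welch t = {t:.2f}")  # well above the ~2.0 critical value
```

Comparing |t| against the critical value for the Welch-Satterthwaite degrees of freedom yields a two-sided P value of the kind quoted (P = 0.008).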
Style APA, Harvard, Vancouver, ISO itp.
39

Kaur, Rajvir. "A comparative analysis of selected set of natural language processing (NLP) and machine learning (ML) algorithms for clinical coding using clinical classification standards". Thesis, 2018. http://hdl.handle.net/1959.7/uws:49614.

Pełny tekst źródła
Streszczenie:
In Australia, hospital discharge summaries created at the end of an episode of care contain patient information such as demographic data, medical history, diagnoses, interventions carried out, and medications and drug therapies provided to the patient. These discharge summaries not only serve as a record of the episode of care but are later converted into a set of clinical codes for statistical analysis. Clinical coding refers to assigning alphanumeric codes to discharge summaries. In Australia, clinical coding is done using the International Classification of Diseases, version 10, Australian Modification (ICD-10-AM) and the Australian Classification of Health Interventions (ACHI), as per the Australian Coding Standards (ACS), in acute and subacute care settings in both public and private hospitals. Clinical coding and its subsequent analysis facilitate funding, insurance claims processing, and research. Assigning codes to an episode of care is a manual process, which poses challenges: the ever-increasing set of codes in ICD-10-AM and ACHI, changing coding standards in the ACS, the complexity of care episodes, and the large training and recruitment costs associated with clinical coders. In addition, manual clinical coding is time-consuming and prone to errors, leading to financial losses. The use of Natural Language Processing (NLP) and Machine Learning (ML) techniques is considered as a solution to this problem. In this thesis, four approaches, namely pattern matching, rule-based, machine learning, and a hybrid technique, are compared to identify the algorithm best suited to clinical coding. ICD-10-AM and ACHI consist of 22 chapters based on human body organs, where each chapter describes the diseases and interventions of a body system. The NLP and ML comparison is carried out on only two chapters: diseases of the respiratory system and diseases of the digestive system.
Initially, the dataset contained 190 clinical records from the two chapters and was named Data190. Because of the limited number of clinical records, another 45 records were added to the existing dataset, and the resulting dataset was named Data235. The clinical records were cleaned in a pre-processing stage to extract useful information, including the principal diagnosis, additional diagnoses, diabetes condition, principal procedure, additional procedures and anaesthesia details. In data pre-processing, various NLP techniques such as tokenisation, stop-word removal, spelling error detection and correction, negation detection and abbreviation expansion were applied. In the pattern matching approach, text strings were matched character by character against the ICD-10-AM and ACHI coding guides using regular expressions; if a match was found, the corresponding codes were assigned. In the rule-based approach, 409 rules were defined to avoid coding wrong patterns. In the machine learning approach, once the unwanted information was removed from the clinical records, the text was represented in vector form for feature extraction using a Bag-of-Words (BoW) representation (Manning, Raghavan, & Schütze, 2008, p. 117) and a Term Frequency-Inverse Document Frequency (TF-IDF) vectoriser (Manning et al., 2008, p. 118). After feature extraction, classification was done using seven classifiers, namely Support Vector Machine (SVM) (Cortes & Vapnik, 1995), Naïve Bayes (Manning et al., 2008, p. 258), Decision Tree (Kumar, Assistant, & Sahni, 2011), Random Forest (Breiman, 2001), AdaBoost (Freund & Schapire, 1999), Multi-Layer Perceptron (MLP) (Naraei, Abhari, & Sadeghian, 2016) and k-Nearest Neighbour (kNN) (Manning et al., 2008, p. 297). A set of standard metrics, Precision (P), Recall (R), F-score, Accuracy, Hamming Loss (HL) and Jaccard Similarity (JS) (Dalianis, 2018; Aldrees & Chikh, 2016), was used to measure the efficiency of the NLP and ML algorithms on the two datasets.
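The regular-expression matching described above can be illustrated with a minimal sketch. The coding-guide entries, patterns and helper below are hypothetical examples constructed for illustration, not the actual ICD-10-AM/ACHI guide or matching rules used in the thesis:

```python
import re

# Hypothetical excerpt of a coding guide mapping a textual pattern
# to a diagnosis code (entries are illustrative only).
CODING_GUIDE = {
    r"\bacute\s+bronchitis\b": "J20.9",
    r"\bgastro-?oesophageal\s+reflux\b": "K21.9",
}

def assign_codes(text):
    """Return the set of codes whose pattern occurs in the free text."""
    found = set()
    for pattern, code in CODING_GUIDE.items():
        if re.search(pattern, text, flags=re.IGNORECASE):
            found.add(code)
    return found
```

In practice the thesis matches against the full coding guide and layers 409 hand-written rules on top to suppress matches on wrong patterns (for example, negated mentions detected in pre-processing).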
For both datasets (Data190 and Data235), the machine learning and hybrid approaches performed well in comparison to the pattern matching and rule-based approaches. On Data190, the Decision Tree classifier performed better than all the other classifiers using a 4-gram feature set, achieving a 0.87 F-score, 0.7453 JS and 0.0877 HL. On Data235, AdaBoost outperformed the other classifiers, followed by Decision Tree, achieving a 0.91 F-score, 0.8294 JS and 0.0945 HL.
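The Hamming Loss and Jaccard Similarity figures above are standard multi-label metrics over sets of assigned codes. A minimal sketch of how they are typically computed is given below; the code sets and label space are illustrative, not data from the thesis:

```python
def hamming_loss(true_sets, pred_sets, label_space):
    """Fraction of label decisions that are wrong, averaged over records."""
    total = 0.0
    for t, p in zip(true_sets, pred_sets):
        # symmetric difference counts both missed and spurious codes
        total += len(t.symmetric_difference(p)) / len(label_space)
    return total / len(true_sets)

def jaccard_similarity(true_sets, pred_sets):
    """Mean |T & P| / |T | P| over records (1.0 when both sets are empty)."""
    scores = []
    for t, p in zip(true_sets, pred_sets):
        union = t | p
        scores.append(len(t & p) / len(union) if union else 1.0)
    return sum(scores) / len(scores)
```

Lower HL is better (0 means every code decision was correct), while higher JS is better, which matches the direction of the results reported above.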
APA, Harvard, Vancouver, ISO, and other styles
40

Philips, Santosh. "Computational biology approaches in drug repurposing and gene essentiality screening". Diss., 2016. http://hdl.handle.net/1805/10978.

Full text of the source
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
The rapid innovations in biotechnology have led to an exponential growth of data and electronically accessible scientific literature. Within this enormous body of scientific data, knowledge can be exploited and novel discoveries can be made. In my dissertation, I have focused on novel molecular mechanisms and therapeutic discoveries from big data for complex diseases. It is very evident today that complex diseases involve many factors, including genetics and environmental effects. The discovery of these factors is challenging and critical in personalized medicine. The increasing cost and time required to develop new drugs pose a new challenge in effectively treating complex diseases. In this dissertation, I demonstrate the use of existing data and literature as a potential resource for discovering novel therapies and for repositioning existing drugs. The key to identifying novel knowledge is integrating information from decades of research across the different scientific disciplines to uncover interactions that are not explicitly stated. This puts critical information at the fingertips of researchers and clinicians, who can take advantage of this newly acquired knowledge to make informed decisions. This dissertation utilizes computational biology methods to identify and integrate existing scientific data and literature resources in the discovery of novel molecular targets and drugs that can be repurposed. In chapter 1 of my dissertation, I extensively sifted through the scientific literature and identified a novel interaction between vitamin A and CYP19A1 that could lead to a potential increase in the production of estrogens. Further, in chapter 2, by exploring a microarray dataset from an estradiol gene-sensitivity study, I identified a potential novel anti-estrogenic indication for the commonly used urinary analgesic phenazopyridine. Both discoveries were experimentally validated in the laboratory.
In chapter 3 of my dissertation, through the use of a manually curated corpus and machine learning algorithms, I identified and extracted genes that are essential for cell survival. These results underscore the reality that novel knowledge with potential clinical applications can be discovered from existing data and literature by integrating information across various scientific disciplines.
APA, Harvard, Vancouver, ISO, and other styles
41

Saurombe, Nampombe Pearson. "Public programming of public archives in the East and Southern Africa regional branch of the International Council on Archives (ESARBICA):". Thesis, 2016. http://hdl.handle.net/10500/20084.

Full text of the source
Abstract:
Public programming initiatives are considered an integral part of archival operations because they support greater use of archival records. This study investigated public programming practices in the ESARBICA region. The findings of the study were determined after applying methodological triangulation within a quantitative research context. This included the use of self-administered questionnaires, semi-structured interviews and the analysis of documents and websites. Participants in this study were ESARBICA board members, Directors of the National Archives and archivists from the ESARBICA region. Nine (69.2%) national directors representing different member states completed the questionnaire, and eight archivists from the same region were interviewed. Furthermore, three ESARBICA board members were also interviewed. Legislation and country reports from ESARBICA member states were reviewed, together with websites of institutions within the ESARBICA region that offered archival education and training. The findings indicated that public programming initiatives were not a priority. Reasons for this included a lack of public programming policies, budgetary constraints, staff shortages and a lack of transport. Furthermore, the national archives were reluctant to adopt technology to promote their archives, and collaboration efforts with regard to promoting archives were limited. Moreover, the investigation of user needs was restricted to existing users of the archives. In addition, the archivists felt that they needed to improve their public programming skills. The study therefore suggests that the national archives of ESARBICA should focus on legislation, public programming policies, advocacy, users, partnerships and skills. Taking these factors into consideration, an inclusive and integrated public programming framework was developed and proposed as a possible measure for improving public programming efforts in the ESARBICA region.
Information Science
D. Litt. et Phil. (Information Science)
APA, Harvard, Vancouver, ISO, and other styles