Dissertations / Theses on the topic 'Patent Statistics'

Consult the top 50 dissertations / theses for your research on the topic 'Patent Statistics.'


1

Cihan, Cengiz. "An Empirical Analysis of Knowledge Production Function: What Differs Among The OECD Countries Including Turkey." Thesis, The University of Sydney, 2005. http://hdl.handle.net/2123/1757.

Abstract:
Since the 1950s, economic growth has been one of the main topics of the economic discipline. In this context, the sources of economic growth have been analysed by different economic theories, which can be divided into two groups: modern neoclassical theory and evolutionary economic theory. In modern neoclassical theory, technological progress is considered the main determinant of long-run economic growth, and the sources of growth differences among countries are analysed with various types of models. The earliest studies assumed that technological progress is exogenous (the Solow-Swan model); constant returns to scale and a perfectly competitive market structure are their main assumptions. Later developments treated technological progress differently in a new line of models, the endogenous growth models, in which technological progress is an endogenously determined process. In contrast to the earlier models, increasing returns to scale, which stem from externalities and a monopolistic market structure, play a significant role in endogenous growth models. We conclude that, although it suffers from some weaknesses, the endogenous growth model offers a more realistic explanation of the economic growth process. In evolutionary economic theory, technological progress is also considered the main determinant of economic growth. However, this theory deals with empirical issues by focusing on observed facts rather than constructing formal models, and provides both guidance and interpretation regarding technological progress. The variables and relationships it considers have many practical implications; in that respect, its structure is realistic and avoids certain logical gaps and inconsistencies.
One aim of this thesis is to examine developments in economic theory by focusing on technological progress. For this purpose, we compare formal and evolutionary theories. Our theoretical review reveals that both the endogenous growth models in the tradition of modern neoclassical theory and the insights of evolutionary economic theory help to analyse technological progress and economic growth. The thesis also aims to measure technological progress, which is vital for nations' development strategies and firms' innovation policies; we use patent statistics as a proxy for technological progress. The empirical parts of the thesis apply endogenous growth theory while taking the propositions of modern neoclassical theory into account. Growth rate differences across countries are examined within the frameworks of both the modern neoclassical and evolutionary theories, and the results show that both theories have reasonable power to explain why growth rates differ across countries. In addition, we conclude that patenting activities represent innovative activities more suitably than R&D activities do. Moreover, the thesis empirically tests the knowledge generation process within the endogenous growth approach, employing the knowledge production function. Both domestic and international stocks of knowledge, as measured by granted patent statistics, R&D activities, human capital and openness measures, are found to be significant factors in explaining productivity growth. Furthermore, the product variety and quality improvement dimensions of technological progress are analysed empirically using patent statistics; both dimensions significantly affect the creation of new technologies.
Finally, the findings indicate that Turkey's technological capability lags far behind that of the other developed countries covered by this study.
APA, Harvard, Vancouver, ISO, and other styles
2

De Rassenfosse, Gaétan. "Essays on the propensity to patent: measurement and determinants." Doctoral thesis, Université Libre de Bruxelles, 2010. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210130.

Abstract:
Chapter 1 discusses the econometric pitfalls associated with the use of patent production functions to study the invention process. It then goes on to argue that a sound understanding of the invention process necessarily requires an understanding of the propensity to patent. The empirical analysis carried out in Chapter 1 seeks to explain the proportion of inventions patented – a potential metric for the propensity to patent – from an international sample of manufacturing firms.

Chapter 2 proposes a methodology to filter out the noise induced by varying patent practices in the R&D-patent relationship. The methodology explicitly decomposes the patent-to-R&D ratio into its components of productivity and propensity. It is then applied to a novel data set of priority patent applications in four countries and six industries.

Chapter 3 takes stock of the literature on the role of fees in patent systems, while Chapter 4 presents estimates of the price elasticity of demand for patents at the trilateral offices (that is, in the U.S., Japan, and Europe). The estimation of dynamic panel data models of patent applications suggests that the long-term price elasticity is about -0.30.
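For intuition, a constant-elasticity demand curve makes the reported estimate concrete. This is only an illustrative sketch of what an elasticity of -0.30 implies, not the dynamic panel model estimated in the thesis; the helper function is invented for the example.

```python
# Constant-elasticity demand: Q = A * P**e, so Q2/Q1 = (P2/P1)**e.
# e = -0.30 is the long-term estimate reported in Chapter 4.

def demand_change(price_change, e=-0.30):
    """Proportional change in patent applications for a given
    proportional change in fees, under constant elasticity e."""
    return (1.0 + price_change) ** e - 1.0

# a 10% increase in patent fees
drop = demand_change(0.10)
print(f"{drop:.1%}")  # roughly a 2.8% fall in applications
```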



3

Danguy, Jérôme. "Essays on the globalization of innovation using patent-based indicators." Doctoral thesis, Universite Libre de Bruxelles, 2013. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209409.

Abstract:
Compared to the globalized markets for goods and services, technology production has often been described as "far from globalized" and mainly concentrated in the home country of multinational enterprises. However, academics and international organizations recognize that research and development (R&D) activities are increasingly performed at the international level. In particular, the globalization of innovation is a major concern since it lies at the crossroads of the rising importance of the knowledge economy and the increasing international slicing of firms' value chains. In this context, the main motivations of this thesis are to investigate the extent to which innovation takes place across national borders and to analyze the drivers of this phenomenon across countries and industries. For this purpose, this dissertation provides new evidence on the globalization of innovation in four empirical essays using patent-based indicators.

First, the relevance of patent statistics as indicators of innovation is evaluated by studying the relationship between R&D expenditures and patenting efforts. Chapter 2 decomposes this relationship at the industry level to shed light on the origins of the worldwide surge in patent applications. The empirical investigation of the R&D-patent relationship relies on a unique panel dataset composed of 18 manufacturing industries in 19 countries covering the period from 1987 to 2005, for which five broad patent indicators are developed. This study shows that patent applications at the industry level reflect not only research productivity, but also two components of the propensity to patent that capture firms' strategic considerations: the decision to protect an invention with a patent (the "appropriability strategy") and the number of patents filed to protect an innovation (the "filing strategy"). The comparison between the results for the various patent count indicators also provides interesting insights. While some industries (computers and communication technologies) and countries (South Korea, Spain, and Poland) have experienced a drastic increase in patent applications, the ratio of priority patent applications to R&D expenditures has been generally constant, suggesting that there has been no spurt in innovation productivity. In contrast, regional applications (filings at the United States Patent and Trademark Office or at the European Patent Office) have been increasing since the early 1990s, suggesting that the patent explosion observed at large regional patent offices is due to the greater globalization of intellectual property rights rather than a surge in research productivity. Innovative firms are increasingly targeting global markets and hence have a higher tendency to seek protection in key markets worldwide.
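The decomposition described above can be sketched numerically. The figures below are hypothetical, chosen only to show how the patent-to-R&D ratio factors into research productivity and the two propensity components (appropriability and filing strategies):

```python
# Hypothetical figures (not from the thesis data) illustrating the
# decomposition: patents/R&D = productivity * propensity.

rd_spending = 100.0           # R&D expenditure (e.g. millions of euros)
inventions = 50.0             # inventions produced from that R&D
share_patented = 0.60         # appropriability strategy: share of inventions patented
filings_per_invention = 2.5   # filing strategy: applications per protected invention

research_productivity = inventions / rd_spending
propensity_to_patent = share_patented * filings_per_invention
patent_applications = inventions * share_patented * filings_per_invention
patent_rd_ratio = patent_applications / rd_spending

# the observed ratio is the product of productivity and propensity, so a
# stable ratio is consistent with flat research productivity even when
# filing strategies push raw application counts up
assert abs(patent_rd_ratio - research_productivity * propensity_to_patent) < 1e-9
```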

Chapter 3 first introduces aggregate patent-based indicators to measure the globalization of innovation production. Second, it describes the patterns in international technology production for a large panel dataset covering 21 industries in 29 countries from 1980 to 2005. A strong growth in the intensity of globalization of innovation is confirmed, not only in terms of cross-border ownership of innovation but also in terms of international technological collaborations. More interestingly, heterogeneity across countries and industries is observed. On the one hand, more innovative countries (or industries) do not exhibit a more globalized innovation footprint. On the other hand, the ownership of innovation is still strongly concentrated in a few countries, although its location is increasingly dispersed across the world. Third, the chapter empirically investigates the two main opposing motives driving the internationalization of innovation: home-base augmenting and home-base exploiting strategies. The results show that the degree of internationalization of innovation is negatively related to the revealed technological advantage of countries across industries: countries tend to be more technologically globalized in industrial sectors in which they are less technologically specialized. The empirical findings also suggest that countries with multidisciplinary technological knowledge are more likely to take part in international co-invention of new technologies and to be attractive to foreign innovative firms. This aggregated patent-based analysis provides additional evidence that the globalization of innovation is a means of acquiring competences abroad that are lacking at home, suggesting that home-base augmenting motives matter in the globalization of innovation production.
By contrast, the internationalization of innovation does not seem to be purely market-driven, since large economies are not the main targets of foreign innovative firms and international patenting is related more to the international competitiveness of country-industry pairs than to the direction of trade flows.

While the previous chapter studies the globalization of innovation of a country with the rest of the world, Chapter 4 aims to explain who collaborates with whom in the international production of technology. In particular, the impact of technological distance between partner economies is investigated for a panel dataset covering international co-inventions between 29 countries in 21 industries between 1988 and 2005. The descriptive analysis highlights that the overall growth in the internationalization of innovation is due both to the increase in the number of international innovative actors and to the rise in the average intensity of collaboration. The empirical findings then suggest that the two main arguments related to technological distance – similarity versus diversity – can be reconciled by taking an industry approach. Indeed, the estimation results show that the impact of technological distance on the intensity of collaborative innovation at the industry level is twofold. On the one hand, the more similar the industry-specific knowledge of two countries (low technological distance within the industry), the more easily they collaborate by sharing common industrial knowledge. On the other hand, the more different their non-industry-specific knowledge (high technological distance outside the scope of the industry), the more they collaborate to gain access to broad and interdisciplinary expertise. This suggests that the relative absorptive capacity between partner economies and the search for novel and complementary knowledge are key drivers of the globalization of innovation. Moreover, the results confirm the moderating effect of non-technological distance factors (spatial proximity, ease of communication, institutional proximity, and overall economic ties) in cross-border innovative relationships.

The topic of Chapter 5 is a cost-benefit analysis of the creation of a new 'globalized' patent: the EU Patent (formerly known as the Community Patent), a single patent covering the entire EU territory for both the application procedure and legal enforcement after grant. The objective of this chapter is threefold: (i) to simulate the budgetary consequences in terms of renewal-fee income for the European and national patent offices; (ii) to evaluate the implications for the business sector in terms of absolute and relative fees; and (iii) to assess the total economic impact on the most important actors of the European patent system. Based on an econometric model explaining the determinants of the maintenance rate of patents, the simulations suggest that – with a sound renewal fee structure – the EU Patent could generate more income for nearly all patent offices than the current status quo, while substantially reducing relative patenting costs for applicants. Finally, the loss of economic rents by patent attorneys, translators and lawyers, and the loss of controlling power by national patent offices further elucidate the persistence of a fragmented European patent system.



4

Savin, Maxim. "National Systems of Innovation: Evidence from the Industry Level." Thesis, KTH, Samhällsekonomi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-98669.

5

Cihan, Cengiz. "An Empirical Analysis of Knowledge Production Function: What Differs Among The OECD Countries Including Turkey." University of Sydney, 2006. http://hdl.handle.net/2123/1757.

6

Righter, Emily Stewart. "Graphical and Bayesian Analysis of Unbalanced Patient Management Data." Diss., Brigham Young University, 2007. http://contentdm.lib.byu.edu/ETD/image/etd1710.pdf.

7

Schwarz, Patrick. "Prediction with Penalized Logistic Regression : An Application on COVID-19 Patient Gender based on Case Series Data." Thesis, Karlstads universitet, Handelshögskolan (from 2013), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-85642.

Abstract:
The aim of the study was to evaluate different types of logistic regression in order to find the optimal model for predicting the gender of hospitalized COVID-19 patients. The models were based on COVID-19 case series data from Pakistan using a set of 18 explanatory variables, of which patient age and BMI were numerical and the rest were categorical, expressing symptoms and previous health issues. Compared were a logistic regression using all variables, a logistic regression using stepwise variable selection with 4 explanatory variables, a logistic Ridge regression model, a logistic Lasso regression model and a logistic Elastic Net regression model. Based on several goodness-of-fit metrics and an evaluation of predictive power using the area under the ROC curve, the Elastic Net model that used only the Lasso penalty performed best and predicted 82.5% of the test cases correctly.
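As a rough illustration of the family of penalized models being compared, here is a minimal NumPy sketch of elastic-net-penalized logistic regression fitted by (sub)gradient descent on synthetic data. The thesis works with the Pakistani case-series data, presumably via a package such as glmnet or scikit-learn; the data, settings, and function below are invented for illustration.

```python
import numpy as np

def fit_penalized_logistic(X, y, lam=0.1, alpha=0.0, lr=0.1, n_iter=3000):
    """Elastic-net-penalized logistic regression by (sub)gradient descent.
    alpha mixes the penalties: 0 -> pure Ridge (L2), 1 -> pure Lasso (L1)."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        err = prob - y
        grad_w = X.T @ err / n + lam * ((1.0 - alpha) * w + alpha * np.sign(w))
        w -= lr * grad_w
        b -= lr * err.mean()          # intercept is left unpenalized
    return w, b

# synthetic stand-in data: 2 informative predictors out of 6
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
true_w = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(int)

w_ridge, b_ridge = fit_penalized_logistic(X, y, alpha=0.0)  # Ridge penalty
w_lasso, b_lasso = fit_penalized_logistic(X, y, alpha=1.0)  # Lasso penalty
accuracy = ((X @ w_lasso + b_lasso > 0).astype(int) == y).mean()
```

The Lasso penalty shrinks the uninformative coefficients toward zero, which is the variable-selection behaviour the abstract's comparison turns on.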
8

Simmonds, Mark Crawford. "Statistical methods for individual patient data meta-analysis." Thesis, University of Cambridge, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.595824.

9

Vanier, Antoine. "The concept, measurement, and integration of response shift phenomenon in Patient-Reported Outcomes data analyses : on certain methodological and statistical considerations." Thesis, Nantes, 2016. http://www.theses.fr/2016NANT1009/document.

Abstract:
Patient-Reported Outcomes are increasingly used in health-related research. These instruments allow the assessment of subjective concepts such as health-related quality of life, anxiety, pain or fatigue. Initially, the interpretation of a difference in score over time rested on the assumption that the meaning of concepts and measurement scales remains stable in individuals' minds over time. This assumption has been challenged: the self-assessment of a concept is now understood as contingent on the subjective meaning a subject attaches to that concept, which can change over time, especially as a result of a salient medical event. This is the "response shift" phenomenon. Since the end of the 1990s, research on the response shift phenomenon has become of prime interest in health-related research. Although developments have been made, it is still a young field with ongoing scientific debates on theoretical, methodological and statistical levels. The broad objective of this thesis is therefore to investigate some methodological and statistical issues regarding the response shift concept, its detection, and its integration into PRO data analyses. The manuscript comprises three main works: a state of the art and synthesis of the work conducted internationally since the response shift phenomenon began to be investigated; a pilot simulation study of the statistical performance of Oort's procedure (a popular method of response shift detection using structural equation modelling); and a theoretical work on the links between response shift occurrence and the semantic complexity of the concepts measured and items used.
10

Holm Hansen, Christian. "Analysis of routinely collected repeated patient outcomes." Thesis, University of Edinburgh, 2014. http://hdl.handle.net/1842/9556.

Abstract:
Clinical practice should be based on the best available evidence. Ideally such evidence is obtained through rigorously conducted, purpose-designed clinical studies such as randomised controlled trials and prospective cohort studies. However gathering information in this way requires a massive effort, can be prohibitively expensive, is time consuming, and may not always be ethical or practicable. When answers are needed urgently and purpose-designed prospective studies are not feasible, retrospective healthcare data may offer the best evidence there is. But can we rely on analysis with such data to give us meaningful answers? The current thesis studies this question through analysis with repeated psychological symptom screening data that were routinely collected from over 20,000 outpatients who attended selected oncology clinics in Scotland. Linked to patients’ oncology records these data offer a unique opportunity to study the progress of distress symptoms on an unprecedented scale in this population. However, the limitations to such routinely collected observational healthcare data are many. We approach the analysis within a missing data context and develop a Bayesian model in WinBUGS to estimate the posterior predictive distribution for the incomplete longitudinal response and covariate data under both Missing At Random and Missing Not At Random mechanisms and use this model to generate multiply imputed datasets for further frequentist analysis. Additional to the routinely collected screening data we also present a purpose-designed, prospective cohort study of distress symptoms in the same cancer outpatient population. This study collected distress outcome scores from enrolled patients at regular intervals and with very little missing data. 
Consequently it contained many of the features that were lacking in the routinely collected screening data and provided a useful contrast, offering an insight into how the screening data might have been were it not for the limitations. We evaluate the extent to which it was possible to reproduce the clinical study results with the analysis of the observational screening data. Lastly, using the modelling strategy previously developed we analyse the abundant screening data to estimate the prevalence of depression in a cancer outpatient population and the associations with demographic and clinical characteristics, thereby addressing important clinical research questions that have not been adequately studied elsewhere. The thesis concludes that analysis with observational healthcare data can potentially be advanced considerably with the use of flexible and innovative modelling techniques now made practicable with modern computing power.
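The Bayesian WinBUGS model used in the thesis is not reproduced here, but the general idea of multiple imputation under Missing At Random versus Missing Not At Random can be sketched in a toy NumPy example: missing values are drawn from a posterior predictive distribution, optionally shifted by a sensitivity parameter delta to mimic an MNAR mechanism, and the completed-data estimates are pooled with Rubin's rules. All names and figures below are hypothetical.

```python
import numpy as np

def multiply_impute_mean(y_obs, n_missing, n_imputations=20, delta=0.0, seed=0):
    """Multiple imputation of a normal outcome's mean.
    Missing values are drawn from the posterior predictive distribution of
    the observed data; a nonzero delta shifts the imputed values to mimic a
    Missing Not At Random mechanism (delta = 0 corresponds to MAR)."""
    rng = np.random.default_rng(seed)
    n = len(y_obs)
    estimates, variances = [], []
    for _ in range(n_imputations):
        # approximate posterior draw of (mu, sigma) under a vague prior
        sigma = y_obs.std(ddof=1) * np.sqrt((n - 1) / rng.chisquare(n - 1))
        mu = rng.normal(y_obs.mean(), sigma / np.sqrt(n))
        y_mis = rng.normal(mu + delta, sigma, size=n_missing)  # imputed values
        y_full = np.concatenate([y_obs, y_mis])
        estimates.append(y_full.mean())
        variances.append(y_full.var(ddof=1) / len(y_full))
    # Rubin's rules: pool the point estimates and the total variance
    q_bar = float(np.mean(estimates))
    within = np.mean(variances)
    between = np.var(estimates, ddof=1)
    total_var = within + (1 + 1 / n_imputations) * between
    return q_bar, float(total_var)

rng = np.random.default_rng(42)
y_obs = rng.normal(5.0, 2.0, size=100)   # observed outcomes; 40 patients missing
q_mar, v_mar = multiply_impute_mean(y_obs, n_missing=40)
q_mnar, v_mnar = multiply_impute_mean(y_obs, n_missing=40, delta=-1.0)
```

Comparing `q_mar` with `q_mnar` is the kind of sensitivity analysis the MAR/MNAR contrast in the thesis motivates: a negative delta assumes the unobserved patients were worse off, pulling the pooled estimate down.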
11

Berkman, Janet. "Predicting patient knowledge of cardiac risk factors: A comparison of two approaches." Thesis, University of Ottawa (Canada), 2004. http://hdl.handle.net/10393/26581.

Abstract:
The University of Ottawa Heart Institute conducted a survey of patients to understand the level of knowledge of cardiac risk factors and to identify any subgroups of patients that could benefit from specially designed educational programs. This thesis compares two approaches to the analysis of this multidimensional dataset. Both techniques looked at the 10 modifiable risk factors and a number of predictor variables (age, gender, education, and smoking status). Logistic regression was hampered by low sample size, sparse data, and the high probability responses of many of the binary knowledge variables. Only one risk factor was successfully explained by any of the predictor variables. Correspondence analysis demonstrated that those who are unaware of smoking as a risk factor are not current smokers; knowledge of low fibre diet is related to education but not to gender; females are more aware of high salt diet and stress as risk factors, and are more likely to have never smoked; smokers tend to have lower education and be unaware of the risk of a low fibre diet.
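Classical correspondence analysis amounts to a singular value decomposition of the standardized residuals of a contingency table under independence. The sketch below uses a hypothetical smoking-by-awareness cross-tabulation, not the Heart Institute survey data.

```python
import numpy as np

def correspondence_analysis(table, n_dims=2):
    """Simple correspondence analysis: SVD of the standardized residuals
    of a two-way contingency table under the independence model."""
    P = table / table.sum()
    r = P.sum(axis=1)                       # row masses
    c = P.sum(axis=0)                       # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # principal coordinates: scale by singular values and masses
    row_coords = (U[:, :n_dims] * sv[:n_dims]) / np.sqrt(r)[:, None]
    col_coords = (Vt.T[:, :n_dims] * sv[:n_dims]) / np.sqrt(c)[:, None]
    inertia = sv ** 2                       # total inertia = chi-square / n
    return row_coords, col_coords, inertia

# hypothetical cross-tabulation: smoking status vs. awareness of a risk factor
table = np.array([[40.0, 10.0],   # never smoked:   aware / unaware
                  [25.0, 15.0],   # ex-smoker
                  [10.0, 30.0]])  # current smoker
rows, cols, inertia = correspondence_analysis(table, n_dims=1)
```

Row and column categories that deviate from independence in the same direction get nearby coordinates, which is what lets the abstract's statements ("those unaware of smoking as a risk factor are not current smokers") be read off a CA map.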
12

Deng, Lisha. "Analysis of patient-safety related data using statistical modeling." Thesis, Lancaster University, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.643060.

Abstract:
To improve the quality of healthcare services, in particular by reducing unintended harm to patients during the delivery of care, patient safety has become an important research topic since the 1990s. This thesis aims to contribute to patient safety research through statistical modelling based on the analysis of incident reports. Incident-report data can be analysed with time series methods for count data or with point process methods. Strictly speaking, however, point process models using the exact event times should be preferred, because point process methods yield more efficient estimates than interval-censored count data, which discard information, particularly when the underlying intensity driving the process is very wiggly. We provide a theoretical analysis, using the Poisson process as an example, to illustrate this efficiency loss in Chapter 5. The thesis also presents four case studies of patient safety data. The safety incident report study and the Ventilator-Associated Pneumonia (VAP) study used time series methods, in particular the Poisson log-linear model, to study which factors influence incidence trends and whether incidence rates differ among hospital sites. The Methicillin-Resistant Staphylococcus aureus (MRSA) and campylobacteriosis studies used point process methods, in particular Poisson process models, to study trends in incident rates, which population groups have higher risk, and whether incidence rates differ among hospitals. However, we assumed that the counts/infections occurred independently, which can be unrealistic for time series or infectious disease data when dependence such as cross-infection cannot be neglected. We therefore propose a new method for estimating the parameters of the log-Gaussian Cox process, which is often used for clustered events. The method uses importance sampling in conjunction with non-parametric intensity estimation.
This method is computationally easier than the Markov chain Monte Carlo (MCMC) approach. It also appears more efficient than minimum contrast estimation using the K-function and the pair correlation g-function in a simulation study when the intensity function is smooth.
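A Poisson log-linear model of incident counts, of the kind used in the incident-report and VAP studies, can be sketched with Newton's method (equivalently, iteratively reweighted least squares). The covariates and counts below are hypothetical, not the thesis's hospital data.

```python
import numpy as np

def fit_poisson_loglinear(X, y, n_iter=25):
    """Poisson log-linear model, log E[y] = X @ beta, fitted by Newton's
    method (iteratively reweighted least squares)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())              # start at the intercept-only fit
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        score = X.T @ (y - mu)
        fisher_info = X.T @ (X * mu[:, None])
        beta += np.linalg.solve(fisher_info, score)
    return beta

# hypothetical monthly incident counts with a log-linear time trend
rng = np.random.default_rng(1)
months = np.arange(48)
X = np.column_stack([np.ones(48), months / 12.0])  # intercept + trend per year
true_beta = np.array([2.0, 0.15])                  # ~16% rise in rate per year
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = fit_poisson_loglinear(X, y)
```

At convergence the score equation forces fitted and observed totals to agree, and the slope estimates the log-linear trend in the incidence rate.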
13

Gruselius, Hanna. "Generative Models and Feature Extraction on Patient Images and Structure Data in Radiation Therapy." Thesis, KTH, Matematisk statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229407.

Abstract:
This Master thesis focuses on generative models for medical patient data for radiation therapy. The objective with the project is to implement and investigate the characteristics of a Variational Autoencoder applied to this diverse and versatile data. The questions this thesis aims to answer are: (i) whether the VAE can capture salient features of medical image data, and (ii) if these features can be used to compare similarity between patients. Furthermore, (iii) if the VAE network can successfully reconstruct its input and lastly (iv) if the VAE can generate artificial data having a reasonable anatomical appearance. The experiments carried out conveyed that the VAE is a promising method for feature extraction, since it appeared to ascertain similarity between patient images. Moreover, the reconstruction of training inputs demonstrated that the method is capable of identifying and preserving anatomical details. Regarding the generative abilities, the artificial samples generally conveyed fairly realistic anatomical structures. Future work could be to investigate the VAEs ability to generalize, with respect to both the amount of data and probabilistic considerations as well as probabilistic assumptions.
The focus of this Master's thesis is generative models for patient data from radiation therapy. The purpose of the project is to implement and investigate the properties of a Variational Autoencoder (VAE) applied to this versatile and varied data. The questions to be answered are: (i) can a VAE capture salient features of medical image data, and (ii) can these features be used to compare similarity between patients. Furthermore, (iii) can the VAE network reconstruct its input well and, finally, (iv) can a VAE generate artificial data with a reasonable anatomical appearance. The experiments performed indicated that a VAE can be a promising method for extracting salient features of patients, since the method appeared to discern similarities between different patients' images. Moreover, the reconstruction of training data showed that the method is capable of identifying and preserving anatomical details. Furthermore, the artificially generated data generally exhibited a realistic anatomical structure. Future work could consist of investigating how well a VAE can generalize, with respect to both the amount of data required and probabilistic restrictions and assumptions.
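As an illustration of the VAE machinery the abstract describes (encoder, reparameterization trick, decoder, and the ELBO-style loss), here is a minimal numpy forward-pass sketch with random, untrained weights; the layer sizes and toy binary "patches" are invented for the example and are not the thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def vae_forward(x, W_enc, W_mu, W_logvar, W_dec):
    """One VAE forward pass: encode, reparameterize, decode; return loss parts."""
    h = np.tanh(x @ W_enc)                   # encoder hidden layer
    mu, logvar = h @ W_mu, h @ W_logvar      # latent Gaussian parameters
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps      # reparameterization trick
    x_hat = 1 / (1 + np.exp(-(z @ W_dec)))   # decoder with sigmoid output
    recon = -np.sum(x * np.log(x_hat + 1e-9)
                    + (1 - x) * np.log(1 - x_hat + 1e-9))   # cross-entropy
    kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))  # KL to N(0, I)
    return x_hat, z, recon + kl

# Toy "images": 4 flattened 8x8 binary patches, 2-dimensional latent space.
d, h_dim, k = 64, 16, 2
x = (rng.random((4, d)) > 0.5).astype(float)
W_enc = rng.standard_normal((d, h_dim)) * 0.1
W_mu = rng.standard_normal((h_dim, k)) * 0.1
W_logvar = rng.standard_normal((h_dim, k)) * 0.1
W_dec = rng.standard_normal((k, d)) * 0.1
x_hat, z, loss = vae_forward(x, W_enc, W_mu, W_logvar, W_dec)
```

Training would minimize `loss` by gradient descent on the weights; the latent codes `z` are the patient-level features whose distances can be compared.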
APA, Harvard, Vancouver, ISO, and other styles
14

Jones, J. M. "The statistical analysis of the long-term outcome of breast cancer patients." Thesis, University of Manchester, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.378816.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Liu, Yazhuo. "Patient Populations, Clinical Associations, and System Efficiency in Healthcare Delivery System." Thesis, University of South Florida, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3714134.

Full text
Abstract:

The efforts to improve health care delivery usually involve studies and analysis of patient populations and healthcare systems. In this dissertation, I present research conducted in the following areas: identifying patient groups, improving treatments for specific conditions using statistical as well as data mining techniques, and developing new operations research models to increase system efficiency from the health institutions' perspective. The results provide a better understanding of high-risk patient groups, more accuracy in detecting disease correlations, and practical scheduling tools that consider uncertain operation durations and real-life constraints.

APA, Harvard, Vancouver, ISO, and other styles
16

David, Shannon L. "Development and Validation of the Patient-AT Trust Instrument." Ohio University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1375825756.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Forster, Jeri E. "Varying-coefficient models for longitudinal data : piecewise-continuous, flexible, mixed-effects models and methods for analyzing data with nonignorable dropout /." Connect to full text via ProQuest. Limited to UCD Anschutz Medical Campus, 2006.

Find full text
Abstract:
Thesis (Ph.D. in Biostatistics) -- University of Colorado at Denver and Health Sciences Center, 2006.
Typescript. Includes bibliographical references (leaves 72-75). Free to UCD Anschutz Medical Campus. Online version available via ProQuest Digital Dissertations;
APA, Harvard, Vancouver, ISO, and other styles
18

Sudoh, Katsuhito. "A Japanese-to-English Statistical Machine Translation System for Technical Documents." 京都大学 (Kyoto University), 2015. http://hdl.handle.net/2433/195986.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Ahmad, Ashar [Verfasser]. "Dissecting patient heterogeneity via statistical modeling based on multi-modal omics data / Ashar Ahmad." Bonn : Universitäts- und Landesbibliothek Bonn, 2019. http://d-nb.info/119183199X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Rahman, Md Abdur. "Statistical and Machine Learning for assessment of Traumatic Brain Injury Severity and Patient Outcomes." Thesis, Högskolan Dalarna, Institutionen för information och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:du-37710.

Full text
Abstract:
Traumatic brain injury (TBI) is a leading cause of death in all age groups and a major societal concern. However, TBI diagnostics and the prediction of patient outcomes are still lacking in medical science. In this thesis, I used a subset of TBIcare data from Turku University Hospital in Finland to classify injury severity, patient outcomes, and CT (computerized tomography) findings as positive/negative. The dataset was derived from comprehensive metabolic profiling of serum samples from TBI patients. The study included 96 TBI patients, diagnosed as 7 severe (sTBI=7), 10 moderate (moTBI=10), and 79 mild (mTBI=79). Among them, there were 85 good recoveries (Good_Recovery=85) and 11 bad recoveries (Bad_Recovery=11), as well as 49 CT positive (CT.Positive=49) and 47 CT negative (CT.Negative=47). There was a total of 455 metabolites (features), excluding the three response variables. Feature selection techniques were applied to retain the most important features while discarding the rest. Subsequently, four methods were used for classification: ridge regression, lasso regression, neural networks, and deep learning. Ridge regression yielded the best results for binary classifications such as patient outcomes and CT positive/negative. The accuracy for CT positive/negative was 74% (AUC of 0.74), while the accuracy for patient outcomes was 91% (AUC of 0.91). For severity classification (multi-class classification), neural networks performed well, with a total accuracy of 90%. Despite the limited number of data points, the overall result was satisfactory.
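A hedged sketch of the kind of pipeline the abstract describes: filter-style feature selection followed by L2-penalized ("ridge") logistic regression for a binary outcome. The dimensions mirror the abstract (96 patients, 455 features), but the synthetic data, the correlation-based selection rule, and the optimizer details are illustrative assumptions, not the thesis's code:

```python
import numpy as np

def select_features(X, y, k):
    """Filter selection: keep the k features most correlated with the label."""
    Xc = X - X.mean(0)
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0)
                                * np.linalg.norm(yc) + 1e-12)
    return np.argsort(corr)[-k:]

def fit_ridge_logistic(X, y, lam=1.0, lr=0.1, n_iter=2000):
    """L2-penalized logistic regression fitted by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * ((X.T @ (p - y)) / len(y) + lam * w / len(y))
    return w

rng = np.random.default_rng(2)
n, d = 96, 455                            # sample sizes mirror the abstract
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 2.0                          # only 5 informative "metabolites"
y = (X @ w_true + rng.standard_normal(n) > 0).astype(float)

idx = select_features(X, y, 20)           # retain 20 features
w = fit_ridge_logistic(X[:, idx], y)
acc = ((1 / (1 + np.exp(-X[:, idx] @ w)) > 0.5) == y).mean()
```

With so few patients relative to features, out-of-sample accuracy and AUC would of course need cross-validation; this sketch only shows the shape of the select-then-regularize workflow.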
APA, Harvard, Vancouver, ISO, and other styles
21

Kreif, Noemi. "Statistical methods to address selection bias in economic evaluations that use patient-level observational data." Thesis, London School of Hygiene and Tropical Medicine (University of London), 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Coad, D. Stephen. "Outcome-dependent randomisation schemes for clinical trials with fluctuations in patient characteristics." Thesis, University of Oxford, 1989. http://ora.ox.ac.uk/objects/uuid:970a8103-24fc-496e-82c0-0645f2b4e9c4.

Full text
Abstract:
A clinical trial is considered in which two treatments are to be compared. Treatment allocation schemes are usually designed to assign approximately equal numbers of patients to each treatment. The purpose of this thesis is to investigate the efficiency of estimation and the effect of instability in the response variable for allocation schemes which are aimed at reducing the number of patients who receive the inferior treatment. The general background to outcome-dependent allocation schemes is described in Chapter 1. A discussion of ethical and practical problems associated with these methods is presented together with brief details of actual trials conducted. In Chapter 2, the response to treatment is Bernoulli and the trial size is fixed. A simple method for estimating the treatment difference is proposed. Simulation results for a selection of allocation schemes indicate that the effect of instability upon the performance of the schemes can sometimes be substantial. A decision-theory approach is taken in Chapter 3. The trial is conducted in a number of stages and the interests of both the patients in the trial and those who will be treated after the end of the trial are taken into account. Using results for conditional normal distributions, analytical results are derived for estimation of the treatment difference for both a stable and an unstable normal response variable for three allocation schemes. Some results for estimation are also given for other responses. The problem of sequential testing is addressed in Chapter 4. With instability in the response variable, it is shown that the error probabilities for the test for a stable response variable can be approximately preserved by using a modified test statistic with appropriately-widened stopping boundaries. In addition, some recent results for estimation following sequential tests are outlined. Finally, the main conclusions of the thesis are highlighted in Chapter 5.
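One classical outcome-dependent allocation scheme of the kind this abstract discusses is the randomized play-the-winner rule. The following simulation (illustrative, not taken from the thesis) shows how the urn tilts allocation toward the better treatment while still allowing a simple estimate of the treatment difference:

```python
import numpy as np

def simulate_rpw(p_a, p_b, n_patients, rng):
    """Randomized play-the-winner RPW(1,1): a success on an arm (or a
    failure on the other arm) adds a ball for that arm, so allocation
    drifts toward the treatment with the higher success probability."""
    urn = [1, 1]                            # initial balls for arms A, B
    arms, outcomes = [], []
    for _ in range(n_patients):
        arm = 0 if rng.random() < urn[0] / (urn[0] + urn[1]) else 1
        success = rng.random() < (p_a if arm == 0 else p_b)
        urn[arm if success else 1 - arm] += 1
        arms.append(arm)
        outcomes.append(success)
    return np.array(arms), np.array(outcomes)

rng = np.random.default_rng(3)
arms, outcomes = simulate_rpw(p_a=0.7, p_b=0.3, n_patients=500, rng=rng)
share_a = (arms == 0).mean()                # fraction sent to the better arm
# Naive treatment-difference estimate from arm-wise observed success rates:
p_hat_a = outcomes[arms == 0].mean()
p_hat_b = outcomes[arms == 1].mean()
```

For RPW(1,1) the limiting allocation to arm A is (1-p_b)/((1-p_a)+(1-p_b)), here 0.7, so fewer patients receive the inferior treatment; the thesis's concern is how such schemes behave when p_a and p_b fluctuate over the trial.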
APA, Harvard, Vancouver, ISO, and other styles
23

Armstrong, Paul Walter. "Fact or fiction : the problem of bias in Government Statistical Service estimates of patient waiting times." Thesis, London School of Hygiene and Tropical Medicine (University of London), 2000. http://researchonline.lshtm.ac.uk/682277/.

Full text
Abstract:
The cumulative likelihood of admission estimated for any given 'time-since-enrolment' depends on how we define membership of the population 'at-risk' and on how we handle right and left censored waiting times. As a result, published statistics will be biased because they assume that the waiting list is both stationary and closed and exclude all those not yet or never to be admitted. The cumulative likelihood of admission within three months was estimated using the Government Statistical Service method and compared with estimates which relaxed the assumption of stationarity and reflected variation in the numbers recruited to, and admitted from, the waiting list each quarter. The difference between the two estimates ranged from +5.5 to -9.1 percentage points among 11 Orthopaedic waiting lists in South Thames Region. In the absence of information on 'times-to-admission', exact 'times-since-enrolment' were extracted from Hospital Episode Statistics and assumed to be similarly distributed. In the absence of information on 'times-to-competing-event', the number of competing events falling in each waiting time category was estimated by differencing. A period lifetable was constructed using these approximations, census counts, counts of the number of new recruits and estimates of the number 'reset-to-zero' each quarter. The results support the view that the method used by the Government Statistical Service overestimates the cumulative likelihood of elective admission among those listed. The Government Statistical Service calculates the cumulative likelihood of admission within three months (range: 0.62-0.27) conditional on the fact of admission. Multiplying by the unconditional likelihood of being admitted (range: 0.93-0.31) estimates the cumulative likelihood of admission within three months among those listed (range: 0.55-0.12) and gives a rather different ranking of waiting list performance among 34 Orthopaedic waiting lists in South Thames Region.
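The bias the abstract describes arises from ignoring censored waiting times. A Kaplan-Meier-style estimator that treats not-yet-admitted patients as right-censored can be sketched as follows (toy data; a simplified illustration rather than the Government Statistical Service method or the period lifetable constructed in the thesis):

```python
import numpy as np

def km_cumulative_incidence(times, admitted):
    """Kaplan-Meier-style estimate of P(admitted within t), treating
    non-admitted records (still waiting / removed) as right-censored."""
    order = np.argsort(times, kind="stable")
    times, admitted = times[order], admitted[order]
    surv = 1.0
    at_risk = len(times)
    t_grid, cum = [], []
    for t, d in zip(times, admitted):
        if d:                                # admission event
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1                         # leaves the risk set either way
        t_grid.append(t)
        cum.append(1 - surv)
    return np.array(t_grid), np.array(cum)

# Toy waiting list: weeks since enrolment; 1 = admitted, 0 = censored.
times = np.array([2, 3, 3, 5, 8, 9, 12, 13, 13, 20], float)
admitted = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 0])
t_grid, cum = km_cumulative_incidence(times, admitted)
```

Conditioning only on patients who were actually admitted (as the published statistics do) would instead divide admissions by the number admitted, overstating the cumulative likelihood among all those listed.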
APA, Harvard, Vancouver, ISO, and other styles
24

Wang, Yu. "Toward Better Health Care Service: Statistical and Machine Learning Based Analysis of Swedish Patient Satisfaction Survey." Thesis, KTH, Teknisk informationsvetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-212984.

Full text
Abstract:
Patients, as customers of the health care service, have the right to evaluate the service they received, and health care providers and professionals may take advantage of these evaluations to improve the health care service. To investigate the relationship between patients' overall satisfaction and satisfaction with specific aspects, this study uses classical statistical and machine learning based methods to analyze Swedish national patient satisfaction survey data. Statistical methods including cross tabulation, chi-square test, correlation matrix and linear regression identify the relationships between features. It is found that patients' demographics have a significant association with overall satisfaction, and patients' responses in each dimension show similar trends which contribute to patients' overall satisfaction. Machine learning classification approaches including the Naïve Bayes classifier, logistic regression, tree-based models (decision tree, random forest, adaptive boosting decision tree), support vector machines and artificial neural networks are used to build models to classify patients' overall satisfaction (positive or negative) based on survey responses in each dimension and patients' demographic information. These models all have relatively high accuracy (87.41%-89.85%) and could help to find the important features of health care service and hence improve the quality of health care service in Sweden.
Patients, as customers of the health care service, have the right to evaluate the care they have received, and care providers and professionals can use these evaluations to improve care. To examine the relationship between patients' overall satisfaction and satisfaction with specific aspects, this study uses classical statistical and machine learning based methods to analyze Swedish national patient survey data. Statistical methods, including cross tabulation, chi-square test, correlation matrix and linear regression, identify the relationships between features. It is found that patients' demographics have a significant association with overall satisfaction, and patients' responses in each dimension show a similar trend that contributes to overall satisfaction. Machine learning classification methods, including the Naïve Bayes classifier, logistic regression, tree-based models (decision tree, random forest, adaptively boosted decision tree), support vector machines and artificial neural networks, are used to build models to classify patients' overall satisfaction (positive or negative) based on survey responses in each dimension and patients' demographic information. These models all have relatively high accuracy (87.41%-89.85%) and can help to identify the most important characteristics of care and thereby improve the quality of care in Sweden.
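A minimal sketch of one of the statistical tools the abstract lists, the Pearson chi-square test of independence on a cross-tabulation (the toy table is invented for illustration):

```python
import numpy as np

def chi2_independence(table):
    """Pearson chi-square statistic and degrees of freedom for a cross-tab."""
    table = np.asarray(table, float)
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, dof

# Toy cross-tab: rows = two demographic groups,
# cols = (negative, positive) overall satisfaction.
table = [[30, 70],
         [55, 45]]
stat, dof = chi2_independence(table)
```

Comparing `stat` against the chi-square distribution with `dof` degrees of freedom (e.g. the 0.05 critical value 3.84 for one degree of freedom) tests whether satisfaction is associated with the demographic variable.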
APA, Harvard, Vancouver, ISO, and other styles
25

Sun, Junfeng. "Stochastic models for compliance analysis and applications." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1117049743.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Ma, Linna. "Splitting frames based on hypothesis testing for patient motion compensation in SPECT." Link to electronic thesis, 2006. http://www.wpi.edu/Pubs/ETD/Available/etd-083006-154306/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Puertas, Monica A. "Statistical and Prognostic Modeling of Clinical Outcomes with Complex Physiologic Data." Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5106.

Full text
Abstract:
Laboratory tests are a primary resource for diagnosing patient diseases. However, physicians often make decisions based on a single laboratory result and have a limited perspective of the role of commonly-measured parameters in enhancing the diagnostic process. By providing a dynamic patient profile, the diagnosis could be more accurate and timely, allowing physicians to anticipate changes in the recovery trajectory and intervene more effectively. The assessment and monitoring of the circulatory system is essential for patients in intensive care units (ICU). One component of this system is the platelet count, which is used in assessing blood clotting. However, platelet counts represent a dynamic equilibrium of many simultaneous processes, including altered capillary permeability, inflammatory cascades (sepsis), and the coagulation process. To characterize the value of dynamic changes in platelet count, analytical methods are applied to datasets of critically-ill patients in (1) a homogeneous population of ICU cardiac surgery patients and (2) a heterogeneous group of ICU patients with different conditions and several hospital admissions. The objective of this study was to develop a methodology to anticipate adverse events using metrics that capture dynamic changes of platelet counts in a homogeneous population, then redefine the methodology for a more heterogeneous and complex dataset. The methodology was extended to analyze other important physiological parameters of the circulatory system (i.e., calcium, albumin, anion gap, and total carbon dioxide). Finally, the methodology was applied to simultaneously analyze some parameters enhancing the predictive power of various models. This methodology assesses dynamic changes of clinical parameters for a heterogeneous population of ICU patients, defining rates of change determined by multiple point regression and by the simpler fixed time parameter value ratios at specific time intervals. 
Both metrics provide prognostic information, differentiating survivors from non-survivors, and proved more predictive than complex metrics and risk assessment scores of greater dimensionality. The goal was to determine a minimal set of biomarkers that would better assist care providers in assessing the risk of complications, allowing them to alter the management of patients. These metrics should be simple, and their implementation would be feasible in any environment and under uncertain conditions regarding the specific diagnosis and the onset of the acute event that causes a patient's admission to the ICU. The results provide evidence of the different behaviors of physiologic parameters during the recovery processes of survivors and non-survivors. These differences were observed during the first 8 to 10 days after a patient's admission to the ICU. The application of the presented methodology could enhance physicians' ability to diagnose more accurately, anticipate changes in recovery trajectories, and prescribe effective treatment, leading to more personalized care and reduced mortality rates.
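The two kinds of dynamic metrics the abstract describes, a multiple-point regression slope and a fixed-time parameter-value ratio, can be sketched as follows (the toy platelet trajectories are invented for illustration):

```python
import numpy as np

def slope_metric(days, values):
    """Rate of change: least-squares slope over the observed time points."""
    A = np.column_stack([np.ones_like(days, dtype=float), days])
    coef, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coef[1]

def ratio_metric(values, i, j):
    """Fixed-time ratio: parameter value at time j relative to time i."""
    return values[j] / values[i]

# Toy platelet counts (x1000/uL) over ICU days 0-4.
days = np.arange(5)
survivor = np.array([150., 120., 130., 170., 220.])   # rebound pattern
nonsurvivor = np.array([150., 110., 90., 80., 70.])   # sustained decline

s1, s2 = slope_metric(days, survivor), slope_metric(days, nonsurvivor)
r1, r2 = ratio_metric(survivor, 0, 4), ratio_metric(nonsurvivor, 0, 4)
```

A positive slope and a day-4/day-0 ratio above 1 capture the recovering trajectory; the thesis applies such summaries across several circulatory parameters and time intervals.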
APA, Harvard, Vancouver, ISO, and other styles
28

Daffue, Ruan Albert. "Applying patient-admission predictive algorithms in the South African healthcare system." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/79897.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Predictive analytics in healthcare has become one of the major focus areas in healthcare delivery worldwide. Due to the massive amount of healthcare data being captured, healthcare providers and health insurers are investing in predictive analytics and its enabling technologies to provide valuable insight into a large variety of healthcare outcomes. One of the latest developments in the field of healthcare predictive modelling (PM) was the launch of the Heritage Health Prize; a competition that challenges individuals from across the world to develop a predictive model that successfully identifies the patients at risk of admission to hospital from a given patient population. The patient-admission predictive algorithm (PAPA) is aimed at reducing the number of unnecessary hospitalisations that needlessly constrain healthcare service delivery worldwide. The aim of the research presented is to determine the feasibility and value of applying PAPAs in the South African healthcare system as part of a preventive care intervention strategy. A preventive care intervention strategy is a term used to describe an out-patient hospital service, aimed at providing preventive care in an effort to avoid unnecessary hospitalisations from occurring. The thesis utilises quantitative and qualitative techniques. This included a review of the current and historic PM applications in healthcare to determine the major expected shortfalls and barriers to implementation of PAPAs, as well as the institutional and operational requirements of these predictive algorithms. The literature study is concluded with a review of the current state of affairs in the South African healthcare system to, firstly, articulate the need for PAPAs and, secondly, to determine whether the public and private sectors provide a suitable platform for implementation (evaluated based on the operational and institutional requirements of PAPAs). 
Furthermore, a methodology to measure and analyse the potential value-add of a PAPA care intervention strategy was designed and developed. The methodology required a survey of the industry leaders in the private healthcare sector of South Africa to identify, firstly, the current performance foci and, secondly, the factors that compromise the performance of these organisations to deliver high quality, resource-effective care. A quantitative model was developed and applied to an industry leader in the private healthcare sector of South Africa, in order to gauge the resultant impact of a PAPA care intervention strategy on healthcare provider performance. Lastly, in an effort to ensure the seamless implementation and operation of PAPAs, an implementation framework was developed to address the strategic, tactical, and operational challenges of applying predictive analytics and preventive care strategies similar to PAPAs. The research found that the application of PAPAs in the public healthcare sector of South Africa is infeasible. The private healthcare sector, however, was considered a suitable platform to implement PAPAs, as this sector satisfies the institutional and operational requirements of PAPAs. The value-add model found that a PAPA intervention strategy will add significant value to the performance of healthcare providers in the private healthcare sector of South Africa. Noteworthy improvements are expected in the ability of healthcare provider’s to coordinate patient care, patient-practitioner relationships, inventory service levels, and staffing level efficiency and effectiveness. A slight decrease in the financial operating margin, however, was documented. The value-add methodology and implementation support framework provides a suitable platform for future researchers to explore the collaboration of preventive care and PM in an effort to improve healthcare resource management in hospitals. 
In conclusion, patient-admission predictive algorithms provide improved evidence-based decision making for preventive care intervention strategies. An efficient and effective preventive care intervention strategy improves healthcare provider performance and, therefore, adds significant value to these organisations. With the proper planning and implementation support, the application of PAPA care intervention strategies will change the way healthcare is delivered worldwide.
AFRIKAANSE OPSOMMING: Predictive analytics in healthcare has developed into one of the most important focus areas in the delivery of quality healthcare in developed countries. Healthcare providers and life insurers invest in predictive analytics and the corresponding technologies to capture large volumes of patient data, offering valuable insight into a wide variety of healthcare outcomes. One of the latest developments in the field of healthcare predictive analytics was the launch of the Heritage Health Prize, a competition challenging individuals across the world to develop a predictive algorithm that identifies the patients most likely to be hospitalised in the coming year and regarded as resource-intensive owing to the estimated time these individuals will spend in hospital. The patient-admission predictive algorithm (PAPA) aims to identify and prevent unnecessary hospitalisations in order to achieve improved healthcare resource management worldwide. The aim of this project is to determine the feasibility and value of applying PAPAs, as a preventive care intervention strategy, in the South African healthcare system. A preventive care intervention strategy seeks to avoid unnecessary hospitalisations by providing the necessary care to high-risk patients without necessarily hospitalising these individuals. The thesis makes use of quantitative and qualitative techniques. This includes a review of current and historical predictive models in the healthcare sector to identify the expected barriers to the implementation of PAPAs, and to determine the institutional and operational requirements of these predictive algorithms.
The literature study concludes with an overview of the current state of affairs in the South African healthcare system to, firstly, identify the need for PAPAs and, secondly, to determine whether the public and private sectors offer a suitable platform for implementation (based on the operational and institutional requirements of PAPAs). Furthermore, a methodology is designed and developed to determine the potential value-add of a PAPA care intervention strategy. The method requires a survey of the industry leaders in the private healthcare sector of South Africa to identify the current high-priority key performance indicators (KPIs), and the factors that compromise the performance of these organisations in delivering high-quality, resource-effective care. A quantitative model was developed and applied to one industry leader in the private healthcare sector of South Africa, to measure the resulting impact of a PAPA care intervention strategy on performance improvement. Finally, in an effort to ensure that the implementation and operation of PAPAs run smoothly, an implementation framework was developed to address the strategic, tactical and operational challenges in applying predictive analytics and preventive care strategies similar to PAPAs. The research found that the application of PAPAs in the public healthcare sector of South Africa is not viable. The private healthcare sector, however, is considered a suitable platform to implement PAPAs, owing to its satisfying the institutional and operational requirements of PAPAs. The value-add model found that a PAPA intervention strategy can add significant value to the performance improvement of healthcare providers in the private healthcare sector of South Africa.
The greatest improvements are expected in the following KPIs: care coordination, doctor-patient relationships, inventory service levels, and staff efficiency and effectiveness. A slight decrease in the financial operating margin is, however, documented. An implementation support framework was developed in an effort to highlight the key strategic, tactical and operational factors in the implementation and execution of a PAPA care intervention strategy. The value-add methodology and implementation support framework offer a suitable platform for future researchers to investigate the role of predictive models in preventive care, in an effort to improve resource management in hospitals. In conclusion, PAPAs improve evidence-based decision making for preventive care intervention strategies. An efficient and effective preventive care intervention strategy adds considerable value to the overall performance improvement of healthcare providers. With proper planning and implementation support, PAPA care intervention strategies will change the way healthcare is delivered worldwide.
APA, Harvard, Vancouver, ISO, and other styles
29

MA, LINNA. "Splitting Frames Based on Hypothesis Testing for Patient Motion Compensation in SPECT." Digital WPI, 2006. https://digitalcommons.wpi.edu/etd-theses/1002.

Full text
Abstract:
"Patient motion is a significant cause of artifacts in SPECT imaging. It is important to be able to detect when a patient undergoing SPECT imaging is stationary, and when significant motion has occurred, in order to selectively apply motion compensation. In our system, optical cameras observe reflective markers on the patient. Subsequent image processing determines the marker positions relative to the SPECT system, calculating patient motion. We use this information to decide how to aggregate detected gamma rays (events) into projection images (frames) for tomographic reconstruction. For the most part, patients are stationary, and all events acquired at a single detector angle are treated as a single frame. When a patient moves, it becomes necessary to split a frame into subframes during each of which the patient is stationary. This thesis presents a method for splitting frames based on hypothesis testing. Two competing hypotheses and a probability model are designed. Whether to split frames is based on a Bayesian recursive estimation of the likelihood function. The estimation procedure lends itself to an efficient iterative implementation. We show that the frame splitting algorithm performance is good for a sample SNR. Different motion simulation cases are presented to verify the algorithm performance. This work is expected to improve the accuracy of motion compensation in clinical diagnoses."
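The recursive Bayesian update underlying such a frame-splitting test can be sketched for a simplified one-dimensional case (known pre- and post-motion marker positions and Gaussian measurement noise are illustrative assumptions here; the thesis's actual probability model is more elaborate):

```python
import numpy as np

def motion_posterior(obs, mu0, mu1, sigma, prior_moved=0.5):
    """Recursively update P(patient moved) from noisy marker positions.
    H0: marker stays at mu0; H1: marker has shifted to mu1 (both assumed
    known for illustration). Returns the posterior after each observation."""
    log_odds = np.log(prior_moved / (1 - prior_moved))
    post = []
    for x in obs:
        # Log-likelihood ratio of one Gaussian observation under H1 vs H0.
        llr = (-(x - mu1) ** 2 + (x - mu0) ** 2) / (2 * sigma ** 2)
        log_odds += llr
        post.append(1 / (1 + np.exp(-log_odds)))
    return np.array(post)

rng = np.random.default_rng(4)
sigma = 1.0
obs = rng.normal(loc=2.0, scale=sigma, size=30)   # marker truly shifted by 2
post = motion_posterior(obs, mu0=0.0, mu1=2.0, sigma=sigma)
```

Once the posterior crosses a decision threshold, the events acquired at that angle would be split into pre- and post-motion subframes.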
APA, Harvard, Vancouver, ISO, and other styles
30

Doudna, Aaron Seth II. "Examining Adverse Patient Outcomes: The Role of Task Demand and Fatigue." Ohio University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1574380981746224.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Webster, Ronald A. "Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/16622/1/Ronald_Albert_Webster_Thesis.pdf.

Full text
Abstract:
The research in this thesis has been undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk adjusted CUSUM schemes for varying risk profiles of the populations being monitored, the effects of uncertainty in the estimates of expected probabilities of mortality on performance of risk adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk adjusted CUSUM schemes found using the Markov chain approach. The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, the risk factors could be classified into their "dimensions of risk", and a risk score could not be generalized to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or it could construct a customized risk score with risk factors that provide at least one measure for each dimension of risk. The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticized in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and the uncertainty associated with the estimation of institutions' outcome rates could be mitigated with trend analysis.
A second analysis of the mortality of patients admitted to coronary care units demonstrated the use of notched box plots, fixed and random effect models, and risk adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide. A detailed review of the risk adjusted CUSUM scheme was undertaken and the use of average run lengths (ARLs) to assess the scheme, as the risk profile of the population being monitored changes, was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk adjusted CUSUM scheme, where the step size for in-control to out-of-control outcome probabilities were constrained to no less than 0.05, was proposed. The ARLs of this "minimum effect" CUSUM scheme were found to be stable. The previous assessment of the risk adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance, where the estimates of the expected probability of patient mortality were uncertain, showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided that the risk score was well calibrated. Uncertainty in the calibration of the risk model appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk adjusted CUSUM schemes were approximated using simulation because the approximation method using the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, where probability is concentrated at the midpoint of its Markov state. 
If probability was assumed to be uniformly distributed over each Markov state, the ARLs were stabilized, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
APA, Harvard, Vancouver, ISO, and other styles
32

Webster, Ronald A. "Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks." Queensland University of Technology, 2008. http://eprints.qut.edu.au/16622/.

Full text
Abstract:
The research in this thesis has been undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk adjusted CUSUM schemes for varying risk profiles of the populations being monitored, the effects of uncertainty in the estimates of expected probabilities of mortality on performance of risk adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk adjusted CUSUM schemes found using the Markov chain approach. The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, the risk factors could be classified into their "dimensions of risk", and a risk score could not be generalized to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or it could construct a customized risk score with risk factors that provide at least one measure for each dimension of risk. The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticized in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and the uncertainty associated with the estimation of institutions' outcome rates could be mitigated with trend analysis. 
A second analysis of the mortality of patients admitted to coronary care units demonstrated the use of notched box plots, fixed and random effect models, and risk adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide. A detailed review of the risk adjusted CUSUM scheme was undertaken and the use of average run lengths (ARLs) to assess the scheme, as the risk profile of the population being monitored changes, was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk adjusted CUSUM scheme, where the step size for in-control to out-of-control outcome probabilities was constrained to no less than 0.05, was proposed. The ARLs of this "minimum effect" CUSUM scheme were found to be stable. The previous assessment of the risk adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance, where the estimates of the expected probability of patient mortality were uncertain, showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided that the risk score was well calibrated. Uncertainty in the calibration of the risk model appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk adjusted CUSUM schemes were approximated using simulation because the approximation method using the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, where probability is concentrated at the midpoint of its Markov state. 
If probability was assumed to be uniformly distributed over each Markov state, the ARLs were stabilized, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
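The Markov chain ARL computation this abstract refers to goes back to Brook and Evans (1972): discretize the CUSUM statistic into states, build a transition matrix, and solve a linear system for the expected time to absorption. The sketch below is an illustrative assumption, not the thesis code: it handles a plain (non-risk-adjusted) Bernoulli CUSUM, and it uses the midpoint state representation that the abstract reports as a source of instability.

```python
import math

def bernoulli_cusum_arl(p_obs, p0, p1, h, m=100):
    """Average run length of a Bernoulli CUSUM via the Brook-Evans
    Markov chain method, with probability concentrated at state
    midpoints (the representation the abstract reports as unstable).

    p_obs: actual adverse-outcome rate of the monitored process;
    p0, p1: in-control / out-of-control rates defining the weights;
    h: signal threshold; m: number of transient Markov states.
    """
    w_fail = math.log(p1 / p0)              # step for an adverse outcome
    w_succ = math.log((1 - p1) / (1 - p0))  # (negative) step otherwise
    width = h / m

    def state_of(value):
        # None denotes the absorbing "signal" state at or above h.
        return None if value >= h else min(m - 1, int(value / width))

    # Build the transient transition matrix Q.
    Q = [[0.0] * m for _ in range(m)]
    for i in range(m):
        s = (i + 0.5) * width               # midpoint representation
        for w, p in ((w_fail, p_obs), (w_succ, 1 - p_obs)):
            j = state_of(max(0.0, s + w))
            if j is not None:
                Q[i][j] += p

    # Solve (I - Q) L = 1 by Gauss-Jordan elimination; L[0] is the
    # ARL when the CUSUM starts from zero.
    A = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(m)] + [1.0]
         for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(m):
            if r != col and A[r][col] != 0.0:
                f = A[r][col] / A[col][col]
                for c in range(col, m + 1):
                    A[r][c] -= f * A[col][c]
    return A[0][m] / A[0][0]
```

Spreading each state's probability uniformly over its interval, instead of concentrating it at the midpoint as above, is the modification the abstract credits with stabilizing the ARLs for discrete, finite risk scores.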
APA, Harvard, Vancouver, ISO, and other styles
33

Rosmarin, Daniel Norris. "Germline determinants of 5-fluorouracil drug toxicity and patient survival in colorectal cancer." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:d5e2c306-689c-4c53-b4c3-2c1001b04ec6.

Full text
Abstract:
Despite a decade of publications investigating the effect of germline polymorphisms on both toxicity related to treatment with 5-fluorouracil-based (5-FU) chemotherapy and prognosis following diagnosis with colorectal cancer (CRC), few genetic biomarkers have been identified convincingly. For 5-FU toxicity and CRC prognosis, in four results chapters, this thesis aims to validate previously-reported genetic biomarkers, identify new markers, determine the mechanistic basis of associated polymorphisms, and expand upon methods in the field. The first three results chapters investigate genetic biomarkers for the prediction of toxicity caused by 5-FU-based treatment, particularly for the 5-FU prodrug capecitabine (Xeloda®, Roche). In the first, a systematic review and meta-analysis is performed for all variants that have been previously studied for an association with toxicity caused by any 5-FU-based drug regimen. 16 studies are analysed, including 36 previously-studied variants. Four variants show strong evidence of affecting a patient’s risk of global (any) 5-FU-related toxicity upon analysis of both the existing data and over 900 patients from the QUASAR2 trial of capecitabine +/- bevacizumab (Avastin®, Roche/Genentech): DPYD 2846, DPYD *2A, TYMS 5’VNTR and TYMS 3’UTR. Next, 1,456 polymorphisms in 25 genes involved in the activation, action or degradation of 5-FU are investigated in 1,046 patients from QUASAR2. At a Bonferroni-corrected p-value threshold of 3.43e-05, three novel associations with capecitabine-related toxicity are identified in DPYD (rs12132152, rs7548189, A551T) and the previously-identified TYMS 5’VNTR and 3’UTR toxicity polymorphisms are refined to a tagging SNP (rs2612091) downstream of TYMS and intronic to the adjacent ENOSF1, the latter of which appears to be functional. 
Finally, a genome-wide investigation of 4.77 million directly genotyped or imputed SNPs identifies one variant (rs2093152 on chr20) as significantly associated with capecitabine-related diarrhoea (p<5e-08), though no associations meet this threshold for global toxicity. In the study of CRC prognosis, a severe left truncation to the VICTOR trial is defined and shown to probably reduce statistical power but not bias effect estimates. Applying standard and novel genome-wide analysis approaches, a set of 43 SNPs is prioritised for future work. With over one million new CRC cases annually, this work helps define biomarkers that could become broadly applicable in the clinical setting.
APA, Harvard, Vancouver, ISO, and other styles
34

Wu, Po-man, and 胡寶文. "Statistical analysis of cancer of cervix patients at Queen Mary Hospital." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1991. http://hub.hku.hk/bib/B31976815.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Kline, David. "Systematically Missing Subject-Level Data in Longitudinal Research Synthesis." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1440067809.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Jones, Scott Gordon. "Hydrogel spacers in external beam radiation therapy of prostate cancer: Patient selection and cost-effectiveness." Thesis, Queensland University of Technology, 2020. https://eprints.qut.edu.au/206984/1/Scott_Jones_Thesis.pdf.

Full text
Abstract:
This research compared statistical models for the prediction of rectum-related side-effects following a course of high-dose external beam radiation therapy for prostate cancer. This allowed patient sub-cohorts with a higher risk of developing long-term side-effects to be identified as suitable for a dose-limiting internal organ spacer known as hydrogel. High-risk sub-cohorts were used in a cost-effectiveness analysis of hydrogel spacer to determine both the clinical and economic value of this intervention in the Australian health care setting.
APA, Harvard, Vancouver, ISO, and other styles
37

Bell, M. L., M. H. Fiero, H. M. Dhillon, V. J. Bray, and J. L. Vardy. "Statistical controversies in cancer research: using standardized effect size graphs to enhance interpretability of cancer-related clinical trials with patient-reported outcomes." Oxford University Press, 2017. http://hdl.handle.net/10150/626025.

Full text
Abstract:
Patient-reported outcomes (PROs) are becoming increasingly important in cancer studies, particularly with the emphasis on patient-centered outcomes research. However, multiple PROs, using different scales, with different directions of favorability are often used within a trial, making interpretation difficult. To enhance interpretability, we propose the use of a standardized effect size graph, which shows all PROs from a study on the same figure, on the same scale. Plotting standardized effects with their 95% confidence intervals (CIs) on a single graph clearly showing the null value conveys a comprehensive picture of trial results. We demonstrate how to create such a graph using data from a randomized controlled trial that measured 12 PROs at two time points. The 24 effect sizes and CIs are shown on one graph and clearly indicate that the intervention is effective and sustained.
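The graph the authors propose rests on two computations per PRO: a standardized effect size with its 95% CI, and an orientation flip so that positive values always favour the intervention. A minimal sketch, assuming Cohen's d with the usual large-sample standard error (the helper name, arguments, and example numbers are all illustrative, not the paper's data):

```python
import math

def standardized_effect(mean_trt, mean_ctl, sd_pooled, n_trt, n_ctl,
                        higher_is_better=True):
    """Cohen's d with an approximate 95% CI, oriented so that positive
    values always favour the intervention (illustrative helper only)."""
    d = (mean_trt - mean_ctl) / sd_pooled
    if not higher_is_better:   # flip scales where lower scores are good
        d = -d
    # Large-sample standard error of d (Hedges-Olkin approximation).
    se = math.sqrt((n_trt + n_ctl) / (n_trt * n_ctl)
                   + d * d / (2 * (n_trt + n_ctl)))
    return d, d - 1.96 * se, d + 1.96 * se

# Hypothetical PROs: a fatigue scale (lower is better) and a QoL scale.
fatigue = standardized_effect(48, 52, 10, 100, 100, higher_is_better=False)
qol = standardized_effect(55, 50, 10, 100, 100)
```

Each resulting (d, lower, upper) triple can then be drawn on one shared horizontal axis with a reference line at zero, which is what puts all PROs "on the same figure, on the same scale".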
APA, Harvard, Vancouver, ISO, and other styles
38

Vannicola, Catherine Marie. "Analysis of medical time series data using phase space analysis a complex systems approach /." Diss., Online access via UMI:, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
39

Hensler, Erik, and Pontus Karlsson. "The Impact of Overcrowding and Pre-triage Nurses on Patient flow: A Comparative Study at the Emergency Department at Linkoping University Hospital." Thesis, Linköpings universitet, Kommunikations- och transportsystem, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-141684.

Full text
Abstract:
A new emergency department will open at Linköping University Hospital, with a pre-triage nurse stationed just inside the arrival entrance. This measure may reduce waiting time to medical assessment and total length of stay in the emergency department. The aim of the study was to examine how overcrowding and the pre-triage nurse affect patients' waiting time to medical assessment and their length of stay in an emergency department. The study also analysed whether the pre-triage nurse's work affected the proportion of patients who were assigned a higher priority during their stay. Analyses were performed using multiple linear regression and data selection. The conclusions show that overcrowding increased both waiting time and length of stay, and that the pre-triage nurse reduced the waiting time to medical assessment. More data need to be analysed to confirm the result for the effect on length of stay, and before conclusions can be drawn about the pre-triage nurse's effect on the proportion of patients receiving a higher priority.
APA, Harvard, Vancouver, ISO, and other styles
40

Hammami, Imen. "Statistical properties of parasite density estimators in malaria and field applications." Phd thesis, Université René Descartes - Paris V, 2013. http://tel.archives-ouvertes.fr/tel-01064071.

Full text
Abstract:
Malaria is a devastating global health problem that affected 219 million people and caused 660,000 deaths in 2010. Inaccurate estimation of the level of infection may have adverse clinical and therapeutic implications for patients, and for epidemiological endpoint measurements. The level of infection, expressed as the parasite density (PD), is classically defined as the number of asexual parasites relative to a microliter of blood. Microscopy of Giemsa-stained thick blood smears (TBSs) is the gold standard for parasite enumeration. Parasites are counted in a predetermined number of high-power fields (HPFs) or against a fixed number of leukocytes. PD estimation methods usually involve threshold values; either the number of leukocytes counted or the number of HPFs read. Most of these methods assume that (1) the distribution of the thickness of the TBS, and hence the distribution of parasites and leukocytes within the TBS, is homogeneous; and that (2) parasites and leukocytes are evenly distributed in TBSs, and thus can be modeled through a Poisson-distribution. The violation of these assumptions commonly results in overdispersion. Firstly, we studied the statistical properties (mean error, coefficient of variation, false negative rates) of PD estimators of commonly used threshold-based counting techniques and assessed the influence of the thresholds on the cost-effectiveness of these methods. Secondly, we constituted and published the first dataset on parasite and leukocyte counts per HPF. Two sources of overdispersion in data were investigated: latent heterogeneity and spatial dependence. We accounted for unobserved heterogeneity in data by considering more flexible models that allow for overdispersion. Of particular interest were the negative binomial model (NB) and mixture models. The dependent structure in data was modeled with hidden Markov models (HMMs). We found evidence that assumptions (1) and (2) are inconsistent with parasite and leukocyte distributions. 
The NB-HMM is the closest model to the unknown distribution that generates the data. Finally, we devised a reduced reading procedure for the PD that aims at better operational optimization and a practical assessment of the heterogeneity in the distribution of parasites and leukocytes in TBSs. A patent application has been filed and a prototype of the counter is under development.
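The overdispersion at issue here can be illustrated by comparing the dispersion index (sample variance over sample mean) of Poisson counts, for which it is about 1, with that of negative binomial counts generated as a Gamma-Poisson mixture. This is a self-contained sketch of the statistical point, not the thesis code, and uses only the standard library:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def neg_binomial(mu, r, rng):
    """Negative binomial count as a Gamma-Poisson mixture:
    mean mu, variance mu + mu**2 / r, hence overdispersed."""
    return poisson(rng.gammavariate(r, mu / r), rng)

def dispersion_index(xs):
    """Sample variance divided by sample mean; about 1 for Poisson data."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m

rng = random.Random(42)
pois = [poisson(5.0, rng) for _ in range(2000)]          # equidispersed
nb = [neg_binomial(5.0, 2.0, rng) for _ in range(2000)]  # overdispersed
```

Parasite or leukocyte counts per HPF behaving like `nb` rather than `pois` is exactly the evidence the authors cite against the Poisson assumption (2).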
APA, Harvard, Vancouver, ISO, and other styles
41

Jaros, Mark J. "A joint model for longitudinal data and competing risks /." Connect to full text via ProQuest. Limited to UCD Anschutz Medical Campus, 2008.

Find full text
Abstract:
Thesis (Ph.D. in Biostatistics) -- University of Colorado Denver, 2008.
Typescript. Includes bibliographical references (leaves 117-119). Free to UCD Anschutz Medical Campus. Online version available via ProQuest Digital Dissertations;
APA, Harvard, Vancouver, ISO, and other styles
42

Agbedjro, Deborah. "Using statistical and machine learning methods to improve treatment success in patients with schizophrenia." Thesis, King's College London (University of London), 2018. https://kclpure.kcl.ac.uk/portal/en/theses/using-statistical-and-machine-learning-methods-to-improve-treatment-success-in-patients-with-schizophrenia(22171de1-b666-4cd5-b974-2e2f2b930dfe).html.

Full text
Abstract:
Background: People with schizophrenia (SCZ) suffer from impaired cognitive abilities, and these are associated with poor functional outcomes. Cognitive Remediation Therapy (CRT) has been shown to be effective in improving the cognitive deficits of SCZ. Because there is evidence for CRT treatment heterogeneity of outcomes, there is a need to identify CRT predictors of differential response using moderation analysis of high dimensional psychiatric data, which typically contain relatively large percentages of missingness. This will contribute to precision medicine treatment, understanding the mechanisms responsible for differential therapy responses, and better prognosis. Aims: The primary aim of this PhD consisted of developing a CRT precision medicine model, using computer-intensive statistical learning methods able to deal with high dimensional psychiatric data containing large percentages of missingness in the predictors and smaller percentages in the outcome. Secondary aims were overcoming the following problems: variable selection or measurement of variable importance in the model, multicollinearity and overfitting, and summarising commensurate outcomes in one latent outcome. Methods: A simulation study comparing four statistical learning methods (Lasso, Elastic-net, Random Forests and Conditional Inference Random Forests) combined with two missing data imputation techniques (Multivariate Imputation using Chained Equations and MissForest) was run. The combined methods were assessed according to their optimism-corrected (via bootstrap internal validation) prediction accuracy and variable selection performance in different scenarios. The best method was chosen to develop a CRT precision medicine model using individual participant data from seven randomised controlled trials with approximately 400 patients. 
Factor scores from a latent summary measure of commensurate cognitive outcomes, obtained via Factor Analysis, were used as the model dependent variable, to accommodate the above univariate statistical learning methods. Results: In the simulations, the method combining MissForest imputation with Lasso was the best compromise between prediction accuracy and clinical interpretability. MissForest-Lasso was then used to develop an internally validated precision medicine model, which selected only a weak moderator of treatment response. The model was therefore mainly prognostic. Conclusion: In future research, more modalities of data, such as genetics, OMICS and neuroimaging data, are recommended to successfully identify moderators of CRT success.
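The bootstrap internal validation with optimism correction mentioned above follows Harrell's general recipe: refit the model on each bootstrap resample, and subtract the average gap between the resample's apparent performance and the same refitted model's performance on the original sample. A generic sketch under that assumption (the `fit`/`score` interface is invented for illustration, not the thesis implementation):

```python
import random

def optimism_corrected(fit, score, X, y, n_boot=100, seed=0):
    """Harrell-style bootstrap internal validation (generic sketch).

    fit(X, y) returns a model; score(model, X, y) returns a
    higher-is-better performance measure. The returned value is the
    apparent performance minus the estimated optimism.
    """
    rng = random.Random(seed)
    apparent = score(fit(X, y), X, y)
    optimism = 0.0
    for _ in range(n_boot):
        idx = [rng.randrange(len(y)) for _ in range(len(y))]
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        mb = fit(Xb, yb)
        # optimism = performance on the resample minus performance of
        # the same refitted model back on the original sample
        optimism += score(mb, Xb, yb) - score(mb, X, y)
    return apparent - optimism / n_boot
```

Any learner fits this interface; for instance, a trivial majority-class classifier scored by accuracy already shows the corrected estimate sitting at or below the apparent one.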
APA, Harvard, Vancouver, ISO, and other styles
43

Hugueny, Samuel Y. "Novelty detection with extreme value theory in vital-sign monitoring." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:804a226c-a298-4764-9bc8-b191d2b852cd.

Full text
Abstract:
Every year in the UK, tens of thousands of hospital patients suffer adverse events, such as unplanned transfers to Intensive Therapy Units or unexpected cardiac arrests. Studies have shown that in a large majority of cases, significant physiological abnormalities can be observed within the 24-hour period preceding such events. Such warning signs may go unnoticed, if they occur between observations by the nursing staff, or are simply not identified as such. Timely detection of these warning signs and appropriate escalation schemes have been shown to improve both patient outcomes and the use of hospital resources, most notably by reducing patients’ length of stay. Automated real-time early-warning systems appear to be cost-efficient answers to the need for continuous vital-sign monitoring. Traditionally, a limitation of such systems has been their sensitivity to noisy and artefactual measurements, resulting in false-alert rates that made them unusable in practice, or earned them the mistrust of clinical staff. Tarassenko et al. (2005) and Hann (2008) proposed a novelty detection approach to the problem of continuous vital-sign monitoring, which, in a clinical trial, was shown to yield clinically acceptable false alert rates. In this approach, an observation is compared to a data fusion model, and its “normality” assessed by comparing a chosen statistic to a pre-set threshold. The method, while informed by large amounts of training data, has a number of heuristic aspects. This thesis proposes a principled approach to multivariate novelty detection in stochastic time-series, where novelty scores have a probabilistic interpretation, and are explicitly linked to the starting assumptions made. Our approach stems from the observation that novelty detection using complex multivariate, multimodal generative models is generally an ill-defined problem when attempted in the data space. 
In situations where “novel” is equivalent to “improbable with respect to a probability distribution”, formulating the problem in a univariate probability space allows us to use classical results of univariate statistical theory. Specifically, we propose a multivariate extension to extreme value theory and, more generally, order statistics, suitable for performing novelty detection in time-series generated from a multivariate, possibly multimodal model. All the methods introduced in this thesis are applied to a vital-sign monitoring problem and compared to the existing method of choice. We show that it is possible to outperform the existing method while retaining a probabilistic interpretation. In addition to their application to novelty detection for vital-sign monitoring, contributions in this thesis to existing extreme value theory and order statistics are also valid in the broader context of data-modelling, and may be useful for analysing data from other complex systems.
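The classical result underpinning this kind of probabilistic novelty score is that the maximum of n i.i.d. draws from a distribution with CDF F has CDF F(x)^n, so an alert threshold with a chosen false-alert probability can be solved for directly. A univariate Gaussian sketch of that idea (illustrative only; the thesis extends this to multivariate, possibly multimodal models):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def evt_threshold(n, alpha=0.01, lo=0.0, hi=20.0):
    """Threshold t with P(max of n standard-normal draws > t) = alpha.

    Uses F_max(t) = Phi(t)**n and solves Phi(t) = (1 - alpha)**(1/n)
    by bisection (univariate sketch, not the thesis method itself).
    """
    target = (1.0 - alpha) ** (1.0 / n)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if normal_cdf(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The novelty score of an observation x over a window of n samples is then `normal_cdf(x) ** n`, flagged when it exceeds 1 - alpha; note how the threshold grows with n, which is why a fixed per-observation threshold over-alerts on long monitoring runs.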
APA, Harvard, Vancouver, ISO, and other styles
44

Rutledge, M. Hannah. "Patient Family and Hospital Staff Information Needs at a Pediatric Hospital: an Analysis of Information Requests Received by the Family Resource Libraries." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc801947/.

Full text
Abstract:
This research explored the information needs of patient families and hospital staff at a pediatric hospital system in Dallas, Texas. Library statistics recorded in four hospital libraries from 2011 - 2013 were used to analyze the information requests from patient families and hospital staff. Crosstabulations revealed the extent to which patient families and hospital staff used the libraries to satisfy their information needs. The data showed that patient families used the libraries very differently than hospital staff. Chi-square tests for independence were performed to identify the relationships between the Classification (Patient Family, Hospital Staff) and two descriptors of information needs (Request Type, Resources Used). There were a total of 1,406 information requests analyzed. The data showed that patient families and hospital staff information requests differed greatly in the number of information requests, the type of information requested, the resources used and the time the library staff spent on the requests. Chi-square analyses revealed relationships statistically significant at the p < .05 level; however, the strength of the relationships varied.
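The chi-square tests of independence used on these crosstabulations compare observed cell counts with the counts expected if Classification and the descriptor were unrelated. A minimal sketch with hypothetical counts (the actual library data are not reproduced here):

```python
def chi_square_independence(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical crosstab: rows = Patient Family / Hospital Staff,
# columns = two request types (counts invented for illustration).
stat = chi_square_independence([[30, 10], [10, 30]])  # → 20.0
```

With (2-1)(2-1) = 1 degree of freedom the critical value at p < .05 is 3.841, so counts this unbalanced would indicate a significant association between who asks and what is asked.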
APA, Harvard, Vancouver, ISO, and other styles
45

Johnson, Ashley Nzinga. "A statistical framework for the analysis of neural control of movement with aging and other clinical applications." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/47573.

Full text
Abstract:
The majority of daily living tasks necessitate the use of bimanual movements or concurrent cognitive processing, which are often more difficult for elderly adults. With the number of Americans age 65 and older expected to double in the next 25 years, in-depth research and sophisticated technologies are necessary to understand the mechanisms involved in normal neuromuscular aging. The objective of the research is to understand the effects of aging on biological signals for motor control and to develop a methodology to classify aging and stroke populations. The methodological approach investigated the influence on correlated activity (coherence) between electroencephalogram (EEG) and electromyogram (EMG) signals into senior age. In support of classifying aging and stroke populations, the methodology selected optimal features from the time, frequency, and information theory domains. Additionally, the use of cepstral analysis was modified toward this application to analyze EEG and EMG signals. The inclusion and optimization of cepstral features significantly improved classification accuracy. Additionally, classification of young and elderly adults using Gaussian Mixture Models with Minimum Classification Error improved overall accuracy values. Contributions from the dissertation include demonstration of the change in correlated activity between EMG and EEG with fine motor simple and complex dual tasks; a quantitative feature library for characterizing the neural control of movement with aging under three task conditions; and a methodology for the selection and classification of features to characterize the neural control of movement. Additionally, the dissertation provides functional insight for the association of features with tasks, aging, and clinical conditions. The results of the work are significant because classification of the neural control of movement with aging is not well established. 
These contributions point to future work: a methodology for physiologists to analyze and interpret data, and a computational tool to provide early detection of neuromuscular disorders.
APA, Harvard, Vancouver, ISO, and other styles
46

Maumet, Camille. "From group to patient-specific analysis of brain function in arterial spin labelling and BOLD functional MRI." Phd thesis, Université Rennes 1, 2013. http://tel.archives-ouvertes.fr/tel-00863908.

Full text
Abstract:
This thesis deals with the analysis of brain function in Magnetic Resonance Imaging (MRI) using two sequences: BOLD functional MRI (fMRI) and Arterial Spin Labelling (ASL). In this context, group statistical analyses are of great importance in order to understand the general mechanisms underlying a pathology, but there is also an increasing interest in patient-specific analyses that draw conclusions at the patient level. Both group and patient-specific analyses are studied in this thesis. We first introduce a group analysis in BOLD fMRI for the study of specific language impairment, a pathology that had previously received little attention in neuroimaging. We outline atypical patterns of functional activity and lateralisation in language regions. We then move on to patient-specific analysis. We propose the use of robust estimators to compute cerebral blood flow maps in ASL. Then, we analyse the validity of the assumptions underlying standard statistical analyses in the context of ASL. Finally, we propose a new locally multivariate statistical method based on an a contrario approach and apply it to the detection of atypical patterns of perfusion in ASL and to activation detection in BOLD functional MRI.
APA, Harvard, Vancouver, ISO, and other styles
47

Viehmann, Manuel Alexander. "Komplementäre Therapie der zervikalen Dystonie." Doctoral thesis, Universitätsbibliothek Leipzig, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-115152.

Full text
Abstract:
Botulinum toxin (BTX) therapy is used successfully in the treatment of cervical dystonia. Alongside this therapy, patients often raise and try complementary and alternative medicine (CAM) therapies. This study investigates how frequently CAM therapies are used, how their effects are rated, and whether there are predictors for the choice of therapy. For data collection, 265 questionnaires from two patient groups were evaluated. Group 1 (n=101) consisted of patients from the botulinum toxin clinics of the University Hospital Leipzig and the Paracelsus Klinik Zwickau. Group 2 (n=165) was recruited from members of the self-help association Bundesverband-Torticollis e.V. BTX therapy was used in 86% of the patients. Among the CAM options, physical therapies were mentioned most often (massage, n=171). However, alongside BTX therapy, specific physiotherapeutic and psychotherapeutic procedures were rated best. CAM therapies were frequently used in combination with BTX therapy, and by patients whose disease had a long chronic course (>10 years). Predictors for choosing a CAM therapy were membership of Group 2, side effects experienced during BTX therapy, male sex, and elevated stress. In addition, patients who made greater use of CAM therapies differed significantly in having higher educational attainment and working in higher-level occupations. In summary, CAM therapies were frequently used by respondents alongside BTX treatment. Combinations with physiotherapeutic procedures or psychotherapy achieved high satisfaction ratings. The choice of CAM therapies depends on disease duration, education, and financial resources.
APA, Harvard, Vancouver, ISO, and other styles
48

Bucci, Francesca. "Personalized biomechanical model of a patient with severe hip osteoarthritis for the prediction of pelvic biomechanics." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15879/.

Full text
Abstract:
The hip joint is a ball-and-socket synovial joint that forms the primary connection between the lower limbs and the skeleton of the upper body. During routine daily activities, repeated abnormal loads on the hip can damage the articular cartilage and consequently lead to osteoarthritis (OA). Hip OA is a chronic, progressive musculoskeletal condition whose treatment in severe cases is total hip arthroplasty (THA). The hip joint centre (HJC) is of great importance in the analysis of hip biomechanics, as is its displacement, which may result from pathology, such as OA, or from surgery, such as THA. To evaluate the biomechanics of the pelvis, this thesis implemented a personalized musculoskeletal (NMS) model, statistical shape models, and finite element (FE) models of a patient with severe mono-lateral hip OA. The accuracy of the generic scaled model in predicting the most important biomechanical quantities during gait is discussed. Through the FE models, the effect of misestimation and/or displacement of the hip joint centre in the antero-posterior, medio-lateral, or infero-superior direction was studied to evaluate the stress state of the pelvis. Finally, the results of an integrated multiscale approach are presented, used to evaluate the biomechanical characteristics of the patient, moving from NMS modelling to the analysis of the FE model of the pelvis, in order to carry out a comparative analysis of the osteoarthritic limb against the model of the contralateral limb before and after surgery.
APA, Harvard, Vancouver, ISO, and other styles
49

Liu, Rong. "A Comparison for Longitudinal Data Missing Due to Truncation." VCU Scholars Compass, 2006. http://hdl.handle.net/10156/1755.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Lee, Jean. "Variation in pediatric gastroenteritis admissions among Florida counties, 1995-2002." [Tampa, Fla] : University of South Florida, 2006. http://purl.fcla.edu/usf/dc/et/SFE0001610.

Full text
APA, Harvard, Vancouver, ISO, and other styles