Dissertations / Theses on the topic 'Endpoints'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Endpoints.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Weaver, Jean M. "Molecular pharmacodynamic endpoints." The Ohio State University, 1997. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487945320761452.

Full text
2

Liu, Yi. "Testing for Efficacy for Primary and Secondary Endpoints by Partitioning Decision Paths." The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1259598621.

Full text
3

Izadinia, Vafa Dario. "Fingerprinting Encrypted Tunnel Endpoints." Diss., University of Pretoria, 2005. http://hdl.handle.net/2263/25351.

Full text
Abstract:
Operating System fingerprinting is a reconnaissance method used by Whitehats and Blackhats alike. Current fingerprinting techniques do not take into account tunnelling protocols, such as IPSec, SSL/TLS, and SSH, which effectively 'wrap' network traffic in a ciphertext mantle, thus potentially rendering passive monitoring ineffectual. Whether encryption makes VPN tunnel endpoints immune to fingerprinting, or renders the encrypted contents of the VPN tunnel entirely indistinguishable, is a topic that has received modest coverage in the academic literature. This study addresses these questions by targeting two tunnelling protocols: IPSec and SSL/TLS. A new fingerprinting methodology is presented, several fingerprinting discriminants are identified, and test results are set forth, showing that endpoint identities can be uncovered and that some of the contents of encrypted VPN tunnels can in fact be discerned.
Dissertation (MSc (Computer Science))--University of Pretoria, 2005.
4

Feng, Chunyao (advisor: John Weldon Seaman). "Bayesian evaluation of surrogate endpoints." Waco, Tex.: Baylor University, 2006. http://hdl.handle.net/2104/4187.

Full text
5

Nordman, Ina. "Surrogate endpoints of survival in metastatic carcinoma." University of New South Wales, Clinical School - St Vincent's Hospital, 2008. http://handle.unsw.edu.au/1959.4/42791.

Full text
Abstract:
In most randomised controlled trials (RCTs), a large number of patients need to be followed over many years for the clinical benefit of the drug to be accurately quantified (1). Using an early proxy, or a surrogate endpoint, in place of the direct endpoint of overall survival (OS) could theoretically shorten the duration of RCTs and minimise the exposure of patients to ineffective or toxic treatments (2, 3). This thesis examined the relationship between surrogate endpoints and OS in metastatic colorectal cancer (CRC), advanced non-small cell lung cancer (NSCLC) and metastatic breast cancer (MBC). A review of the literature identified 144 RCTs in metastatic CRC, 189 in advanced NSCLC and 133 in MBC. The publications were generally of poor quality, with incomplete reporting on many key variables, making comparisons between studies difficult. The introduction of the CONSORT statement was associated with improvements in the quality of reporting. For CRC (337 arms), NSCLC (429 arms) and MBC (290 arms) there were strong relationships between OS and progression-free survival (PFS), time to progression (TTP), disease control rate (DCR), response rate (RR) and partial response (PR). Correlation was also demonstrated between OS and complete response (CR) in CRC and duration of response (DOR) in MBC. However, while strong relationships were found, the proportion of variance explained by the models was small. Prediction bands constructed to determine the surrogate threshold effect size indicated that large improvements in the surrogate endpoints were needed to predict overall survival gains. PFS and TTP showed the most promise as surrogates. The gain in PFS and TTP required to predict a significant gain in overall survival was between 1.2 and 7.0 months and between 1.8 and 7.7 months, respectively, depending on trial size and tumour type. DCR was a better potential predictor of OS than RR. The results of this study could be used to design future clinical trials, with particular reference to the selection of the surrogate endpoint and trial size.
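As a rough illustration of the surrogate threshold effect idea used in this work: at the trial level, one can regress the treatment effect on OS against the treatment effect on the surrogate across trials (weighted by trial size) and scan for the smallest surrogate effect whose prediction interval excludes no survival benefit. The Python sketch below does this on simulated, hypothetical trial data; the data and model are illustrative assumptions, not the thesis's actual analysis.

    # Hypothetical trial-level data: per-trial log hazard ratios for the
    # surrogate (e.g., PFS) and for overall survival, weighted by trial size.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_trials = 40
    log_hr_pfs = rng.normal(-0.2, 0.15, n_trials)                 # surrogate effects
    log_hr_os = 0.6 * log_hr_pfs + rng.normal(0, 0.08, n_trials)  # OS effects
    weights = rng.integers(100, 800, n_trials)                    # trial sizes

    fit = sm.WLS(log_hr_os, sm.add_constant(log_hr_pfs), weights=weights).fit()

    # Surrogate threshold effect: the smallest surrogate effect whose 95%
    # prediction interval for the OS effect lies entirely below zero.
    grid = np.linspace(-0.8, 0.0, 400)
    pred = fit.get_prediction(sm.add_constant(grid)).summary_frame(alpha=0.05)
    excludes_zero = (pred["obs_ci_upper"] < 0).to_numpy()
    ste = grid[excludes_zero].max() if excludes_zero.any() else None
    print("surrogate threshold effect (log HR scale):", ste)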
6

Studer, Ahren M. "Verifying Physical Endpoints to Secure Digital Systems." Research Showcase @ CMU, 2011. http://repository.cmu.edu/dissertations/77.

Full text
Abstract:
The proliferation of electronic devices supporting sensing, actuation, and wireless communication enables the monitoring and/or control of a variety of physical systems with digital communication. Such “cyber physical systems” blur the boundaries of the digital and physical worlds, where correct information about the physical world is needed for the correct operation of the digital system. Often in these systems the physical source or destination of information is as important as the information itself. However, the omni-directional and invisible nature of wireless communication makes it difficult to determine communication endpoints. This allows a malicious party to intercept wireless messages or pose as other entities in the system. As such, these systems require new protocols to associate the endpoints of digital communication with physical entities. Traditional security approaches that associate cryptographic keys with names can help verify endpoints in static systems where a string accurately describes the role of a device. In other systems, the role of a device depends on its physical properties, such as location, which change over time. This dynamic nature implies that identification of an endpoint based on a static name is insufficient. Instead, we can leverage devices’ sensing and actuation capabilities to verify the physical properties and determine the physical endpoints of communication. We investigate three different scenarios where the physical source and/or destination is important and propose endpoint verification techniques: verifying the physical endpoints during an exchange between two smartphones, verifying the receiver of information is in a physical space to enable location-based access control, and verifying the source of information to protect Vehicle-to-Vehicle (V2V) applications. We evaluate our proposals in these systems and show that our solutions fulfill the security requirements while utilizing existing hardware.
Exchanging Information Between Smartphones: Shake on it (SHOT) allows users to verify the endpoints during an exchange of information between two smartphones. In our protocol, the phones use their vibrators and accelerometers to establish a human-observable communication channel. The users hold the phones together while the phones use this channel to bootstrap and verify the authenticity of an exchange that occurs over the higher-bandwidth wireless channel. Users can detect the injection of information from other devices as additional vibrations, and prevent such attacks. Our implementation of SHOT for the DROID smartphone is able to support sender and receiver verification during an exchange between two smartphones in 15 seconds on average.
Location-Based Access Control: We propose using location-based access control to protect sensitive files on laptops, without requiring any effort from the user to provide security. With a purely wireless electronic system, verifying that a given device is in a physical space is a challenge; either the definition of the physical space is vague (radio waves can travel beyond walls) or the solution requires expensive hardware to measure a message’s time of flight. Instead, we use infrared as a signal that walls can contain. We develop key derivation protocols that ensure only a receiver in the physical room with access to the signal can derive the key. We implement a system that uses the laptop’s webcam to record the infrared signal, derive a key, and decrypt sensitive files in less than 5 seconds.
Source Verification for V2V Networks: A number of V2V applications use information about nearby vehicles to prevent accidents or reduce fuel consumption. However, false information about the positioning of vehicles can cause erroneous behavior, including accidents that would not occur in the absence of V2V. As such, we need a way to verify which vehicle sent a message and that the message accurately describes the physical state of that vehicle. We propose using LED lights on vehicles to broadcast the certificate a vehicle is currently using. Receivers can use onboard cameras to film the encoding of the certificate and estimate the relative location of the vehicle. This visual channel allows a receiver to associate a physical vehicle at a known location with the cryptographic credentials used to sign a location claim. Our simulations indicate that even with a pessimistic visual channel, visual verification of V2V senders provides sufficient verification capabilities to support the relevant applications.
7

Cooney, Maureen Anne (advisor: Anna Maria Siega-Riz). "In utero environmental exposures and reproductive endpoints." Chapel Hill, N.C.: University of North Carolina at Chapel Hill, 2008. http://dc.lib.unc.edu/u?/etd,2433.

Full text
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2009.
Title from electronic title page (viewed Sep. 3, 2009). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Epidemiology." Discipline: Epidemiology; Department/School: Public Health.
8

Alhabib, Nada. "Explosion of escaping endpoints of exponential maps." Thesis, University of Liverpool, 2016. http://livrepository.liverpool.ac.uk/3001508/.

Full text
9

Del Gobbo, Liana. "Magnesium biomarkers and cardiometabolic endpoints in multiethnic populations." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=107839.

Full text
Abstract:
Background: Magnesium (Mg) is known to exert diverse actions on cardiometabolic function; suboptimal Mg intake and status may contribute significantly to adverse metabolic and cardiovascular health. While type 2 diabetes mellitus (T2DM) is the most common cause of Mg depletion, research gaps exist regarding the potential influence of transient states of impaired glycemia, such as gestational diabetes, on Mg biomarkers in affected mothers or their offspring. Further, whether or not diabetes is an important effect modifier of associations between Mg biomarkers and cardiovascular endpoints, such as arrhythmias, is unknown. The determination of Mg status is riddled with challenges, however, and no simple, rapid and accurate test has emerged. Plasma (pMg) and serum Mg (sMg) are the most commonly used biomarkers; comparatively little is known about erythrocyte Mg (rMg) and cardiometabolic outcomes.
Objectives: The main objectives of this thesis were: 1) to determine whether gestational diabetes history prospectively influences Mg concentrations or associations between Mg and glycemic variables in mothers and offspring 15 years post-partum (Montreal cohort); 2) to examine associations between sMg, rMg and cardiovascular risk profiles in adults from two ethnically distinct, cross-sectional studies (Cree, Inuit) and evaluate the utility of rMg as a cardiovascular risk associate; and 3) to estimate the risk of an arrhythmia associated with mortality, ventricular premature beats, across the sMg concentration gradient, and assess potential effect modification by T2DM status (Cree).
Methods: Secondary data analysis was conducted on data collected from three diverse studies: 1) Diabetic Pregnancies: longitudinal follow-up of mother-offspring pairs (multiethnic Montreal cohort); 2) Nituuchischaayihtitaau Aschii: a multi-community environment and health longitudinal study in Iiyiyiu Aschii (Cree survey); 3) International Polar Year (IPY) Inuit Health Survey (Inuit survey). Associations between Mg biomarkers and endpoints were examined in multivariate linear and logistic regression models.
Results: Gestational diabetes history 15 years prior was associated with reduced pMg in mothers (p=0.002) and elevated pMg in teenage offspring (p=0.002) relative to mother and offspring controls without gestational diabetes history. Associations between Mg status and some glycemic variables (fasting glucose, insulin, and insulin sensitivity) were stronger in mothers and offspring with gestational diabetes history than in those without. In Cree adults without T2DM, sMg was inversely associated with fasting glucose (p=0.001), in addition to cardiovascular variables such as hsCRP (p=0.038) and carotid intima-media thickness (p=0.044). rMg was significantly associated with adiposity in both Cree and Inuit (p<0.001), but no associations between rMg and fasting glucose, insulin, hsCRP, blood pressure, or carotid intima-media thickness were observed. In Cree, hypomagnesaemia (sMg <0.70 mmol/L) was associated with an increased prevalence of ventricular premature beats (p<0.05). T2DM was a significant effect modifier of the association between sMg and risk of this arrhythmia (p<0.05).
Conclusions: Transient states of impaired glycemia (gestational diabetes) may be associated with pMg and its associations with glycemic outcome variables in affected mothers and offspring. Unlike sMg or pMg, there is no current evidence that total rMg is significantly associated with a favourable cardiovascular risk profile or adds value to cardiometabolic risk assessment. Ventricular premature beats are more common in Cree adults with hypomagnesaemia and T2DM. Further investigations evaluating the potential utility and predictive value of pMg and sMg as markers of cardiometabolic risk are warranted.
10

Blatchford, Patrick Judson. "Monitoring bivariate endpoints in group sequential clinical trials." University of Colorado Denver, 2007.

Find full text
Abstract:
Thesis (Ph.D. in Biostatistics) -- University of Colorado Denver, 2007.
Typescript. Includes bibliographical references (leaves 104-106). Free to UCD affiliates. Online version available via ProQuest Digital Dissertations.
11

Haihambo, Paulus. "Hyperconvexity and endpoints in T₀-quasi-metric spaces." Master's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/6617.

Full text
Abstract:
Over the last decades much progress has been made in the investigation of hyperconvexity in metric spaces. Recently Kemajou and others have published an article concerning hyperconvexity in T₀-quasi-metric spaces. In 1964 Isbell introduced and studied the concept of an endpoint of a metric space. The aim of this dissertation is to begin an investigation into hyperconvexity and endpoints of T₀-quasi-metric spaces. It starts off with basic definitions and some well-known properties of quasi-pseudometric spaces. We conclude by commencing an investigation into hyperconvexity and endpoints of T₀-quasi-metric spaces. In this dissertation several results obtained for hyperconvexity and endpoints in metric spaces are generalized to T₀-quasi-metric spaces, and some original results for hyperconvexity and endpoints of T₀-quasi-metric spaces are presented. We also discuss for a partially ordered set the connection between its Dedekind-MacNeille completion and the q-hyperconvex hull of its natural T₀-quasi-metric space.
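For orientation, the standard definitions behind this abstract can be stated as follows (taken from the general literature on quasi-metric spaces, not quoted from the dissertation itself):

    \[
    d \colon X \times X \to [0,\infty), \qquad
    d(x,x) = 0, \qquad
    d(x,z) \le d(x,y) + d(y,z) \quad \text{for all } x, y, z \in X,
    \]
    \[
    \text{with the } T_0 \text{ axiom:} \qquad
    d(x,y) = 0 \ \text{ and } \ d(y,x) = 0 \ \Longrightarrow \ x = y.
    \]

Symmetry, d(x,y) = d(y,x), is deliberately not assumed; dropping it is what distinguishes this quasi-metric setting from the metric one in which Isbell originally studied endpoints and hyperconvexity.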
12

Wang, Hui. "Response Adaptive Randomization using Surrogate and Primary Endpoints." VCU Scholars Compass, 2016. http://scholarscompass.vcu.edu/etd/4517.

Full text
Abstract:
In recent years, adaptive designs in clinical trials have been attractive due to their efficiency and flexibility. Response-adaptive randomization procedures in phase II or III clinical trials have been proposed to address ethical concerns by skewing the probability of patient assignment based on the responses obtained thus far, so that more patients are assigned to a superior treatment group. General response-adaptive randomizations usually assume that the primary endpoint can be obtained quickly after treatment. However, in real clinical trials the primary outcome is often delayed, making it unusable for adaptation. Therefore, we utilize surrogate and primary endpoints simultaneously to adaptively assign subjects between treatment groups for clinical trials with continuous responses. We explore two types of primary endpoints commonly used in clinical trials: normally distributed outcomes and time-to-event outcomes. We establish a connection between the surrogate and primary endpoints through a Bayesian model, and then update the allocation ratio based on the accumulated data. Through simulation studies, we find that our proposed response-adaptive randomization is more effective in assigning patients to better treatments compared with equal allocation randomization and standard response-adaptive randomization based solely on the primary endpoint.
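The allocation-updating loop at the heart of such a design can be sketched as follows: a conjugate normal model is updated after each observed response, and the next assignment probability is skewed toward the arm more likely to be superior. This is a generic response-adaptive scheme under simplifying assumptions (known variance, immediately observed responses), not the thesis's surrogate-linked Bayesian model.

    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(0)
    true_mean = {"A": 1.0, "B": 0.5}    # hypothetical truth: arm A superior
    sigma = 1.0                          # known response standard deviation
    prior_mean, prior_prec = 0.0, 1e-6   # vague conjugate normal prior

    data = {"A": [], "B": []}
    p_assign_A = 0.5
    for patient in range(200):
        arm = "A" if rng.random() < p_assign_A else "B"
        data[arm].append(rng.normal(true_mean[arm], sigma))

        # Normal-normal conjugate posterior for each arm's mean response.
        post = {}
        for k, y in data.items():
            prec = prior_prec + len(y) / sigma**2
            post[k] = ((prior_prec * prior_mean + sum(y) / sigma**2) / prec,
                       1.0 / prec)

        # P(mu_A > mu_B) under the posteriors, tempered by a square root
        # so the allocation ratio is not skewed too aggressively.
        (mA, vA), (mB, vB) = post["A"], post["B"]
        p_sup = 0.5 * (1 + erf((mA - mB) / sqrt(2 * (vA + vB))))
        p_assign_A = sqrt(p_sup) / (sqrt(p_sup) + sqrt(1 - p_sup))

    print(f"final P(assign A) = {p_assign_A:.3f}, "
          f"n_A = {len(data['A'])}, n_B = {len(data['B'])}")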
13

James, Susan Elizabeth. "The analysis of multiple endpoints in clinical trials." Thesis, University of Leicester, 1993. http://hdl.handle.net/2381/34548.

Full text
14

Zain, Zakiyah. "Combining multiple survival endpoints within a single statistical analysis." Thesis, Lancaster University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.618302.

Full text
Abstract:
The aim of this thesis is to develop methodology for combining multiple endpoints within a single statistical analysis that compares the responses of patients treated with a novel treatment with those of control patients treated conventionally. The focus is on interval-censored bivariate survival data, and five real data sets from previous studies concerning multiple responses are used to illustrate the techniques developed. The background to survival analysis is introduced by a general description of survival data, and an overview of existing methods and underlying models is included. A review is given of two of the most popular survival analysis methods, namely the logrank test and Cox's proportional hazards model. The global score test methodology for combining multiple endpoints is described in detail, and application to real data demonstrates its benefits. The correlation between two score statistics arising from bivariate interval-censored survival data is the core of this research. The global score test methodology is extended to the case of bivariate interval-censored survival data, and a complementary log-log link is applied to derive the covariance and the correlation between the two score statistics. A number of common scenarios are considered in this investigation and the accuracy of the estimator is evaluated by means of extensive simulations. An established method, namely the approach of Wei, Lin and Weissfeld, is examined and compared with the proposed method using both real and simulated data. It is concluded that our method is accurate, consistent and comparable to the competitor. This study marked the first successful development of the global score test methodology for bivariate survival data, employing a new approach to the derivation of the covariance between two score statistics on the basis of an interval-censored model. Additionally, the relationship between the jackknife technique and the Wei, Lin and Weissfeld method has been clarified.
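To make the role of that correlation concrete, recall the usual equally weighted global score test for two endpoints: if Z_1 and Z_2 are standardized score statistics with correlation rho, the combined statistic is

    \[
    Z_{\text{global}} \;=\; \frac{Z_1 + Z_2}{\sqrt{2 + 2\rho}}
    \;\sim\; N(0,1) \quad \text{under } H_0,
    \]

a standard form from the global-test literature, stated here for context. The covariance derivation for interval-censored bivariate data described above is precisely what makes the denominator computable in that setting.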
15

Sved, Daniel W. "Monooxygenase induction and lethality as endpoints in aquatic toxicology." W&M ScholarWorks, 1991. https://scholarworks.wm.edu/etd/1539616869.

Full text
Abstract:
Spot, Leiostomus xanthurus, were exposed to suspended sediments (≈20 mg/L) contaminated with polycyclic aromatic hydrocarbons (PAH) in a laboratory flow-through system to evaluate the applicability of hepatic ethoxyresorufin O-deethylase (EROD) induction as an indicator of PAH exposure. PAH sources tested were coal-tar creosote (CTC), a low molecular weight fraction of creosote (LMWF), and a high molecular weight fraction of creosote (HMWF). A standard 96-h acute toxicity test was conducted to ensure that PAH concentrations tested in induction studies were sub-acutely toxic. The 96-h LC50 for spot was 1740 µg PAH/L (95% confidence interval = 1480-2060 µg PAH/L). The lowest concentration producing an observable effect in 96 h was 560 µg PAH/L; no effects were observed for spot exposed to 250 µg PAH/L for 96 h. Induction of hepatic EROD activity occurred rapidly in fish exposed to high environmentally realistic concentrations of CTC or the HMWF, but not the LMWF. Maximal induction (30-fold) occurred in fish exposed for 48 h to 150 µg PAH/L. Induction was concentration-dependent up to 150 µg PAH/L; at 320 µg PAH/L induction was 14-fold. EROD activity decreased upon further exposure; by day 7, EROD activity was not significantly different from that on day 0. EROD activity in fish exposed to 16 µg PAH/L was not consistently higher than that in control fish. Spot exposed to at least 70 µg PAH/L from CTC or the HMWF experienced severe fin erosion, epidermal lesions, and mortality beginning a few days after maximal EROD induction occurred. No relationship between EROD induction and whole-animal responses is implied, only that EROD induction did precede any higher-order effects. These results indicate complications in the use of EROD activity as a sensitive, reliable indicator of PAH exposure. The toxicity of CTC may inhibit or interfere with continued induction of EROD activity, but neither the toxicity nor the inducing capability is associated with the LMWF. The lack of exposure-dependent EROD induction indicates there could be difficulties in interpreting field studies, where fish have unknown exposure histories.
16

Bruce, Erica Dawn. "Modeling toxic endpoints for improving human health risk assessment." College Station, Tex.: Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-1277.

Full text
17

Chen, May. "Effects of Chronic Energy Drink Consumption on Cardiometabolic Endpoints." Scholarly Commons, 2020. https://scholarlycommons.pacific.edu/uop_etds/3674.

Full text
Abstract:
Background: Since their introduction in the early 2000s, energy drinks have become increasingly popular among an extensive range of consumers, including adolescents and young adults. Currently, the United States Food and Drug Administration (FDA) does not regulate the formulation of energy drinks, which may vary widely in the amounts of caffeine and sugar, as well as in various types of supplements. Recent reports of severe and fatal adverse effects related to energy drinks have led to growing concerns about the safety of energy drink consumption.
Objective: This study aimed to investigate the effects of chronic daily consumption of energy drinks on cardiometabolic endpoints, including blood pressure, ECG parameters, blood glucose, lipid parameters, weight, body mass index, and body fat composition, in a healthy adult population.
Methods: The study was an unblinded, non-randomized, proof-of-concept, prospective study that evaluated the effects of chronic consumption of energy drinks in a healthy adult population. Each participant consumed two 16 oz. cans of a commercially available energy drink daily in two divided doses for 28 days. Investigators met with the participants on days 0, 7, 14, 21, and 28 of the study. Participants were required to complete a standardized log of consumption, which included the date and time of consumption as well as an estimate of additional caffeine intake. The following measurements were taken for each participant over the 28 days: blood pressure (BP), electrocardiogram (ECG), fasting blood glucose (FBG), fasting lipid panel (FLP), weight, BMI, body fat composition, and serum creatinine. Adverse side effects related to energy drink consumption were also recorded. Wilcoxon signed-rank tests were used to compare and detect statistical differences between baseline and maximum post-dose systolic BP, QTc, FBG, FLP, weight, BMI, body fat, and serum creatinine values.
Results: Of the 14 total participants enrolled in the study, 12 completed the full study protocol for 28 days. Maximum measurements of peripheral systolic blood pressure (pSBP), peripheral diastolic blood pressure (pDBP), central systolic blood pressure (cSBP), central diastolic blood pressure (cDBP), and heart rate (HR) were statistically significantly higher than baseline measurements (all P < 0.05). The maximum changes from baseline in pSBP, pDBP, cSBP, and cDBP were 9±7 mmHg, 5±4 mmHg, 7±6 mmHg, and 5±4 mmHg, respectively. Maximum QTcB and QTcF intervals were also statistically higher than baseline (both P = 0.001). The maximum changes from baseline in QTcB and QTcF interval were 19±13 ms and 15±10 ms, respectively. Both QTcB and QTcF intervals on days 7, 14, 21, and 28 were all found to be significantly higher than baseline (all P …).
18

Fergusson, Nicholas Anthony. "Alternative Endpoints and Analysis Techniques in Kidney Transplant Trials." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/36230.

Full text
Abstract:
Clinical trials in kidney transplantation suffer from several major issues, including: 1) unfeasibility due to low short-term event rates of hard outcomes; and 2) reliance on a composite outcome consisting of unequal endpoints, which may generate misleading results. This thesis attempts to explore and apply methods to solve these issues and, ultimately, to improve kidney transplantation trials. We present a secondary analysis of the ACE trial in kidney transplant using composites with alternative graft-function surrogate endpoints. Typically, kidney transplant trials, including the ACE trial, use a time-to-event composite of death, end-stage renal disease (ESRD), and doubling of serum creatinine. Instead of doubling of serum creatinine, we investigated the use of percentage declines of estimated glomerular filtration rate (eGFR) within a time-to-event composite of death and ESRD. Additionally, we present an application of an innovative analysis method, the win ratio approach, to the ACE trial as a way of lessening concerns associated with unequal composite endpoints. Composites of death, ESRD, and either a 40%, 30%, or 20% decline in eGFR did not alter the original ACE trial results, interpretations, or conclusions. The win ratio approach, and the presentation of a win ratio, generated very comparable results to a standard time-to-event analysis while lessening the impact of unequal composite endpoints and making fewer statistical assumptions. This research provides a novel, trial-level application of alternative endpoints and analysis techniques within a kidney transplant trial setting.
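The win ratio computation itself is easy to state: every treatment patient is compared with every control patient on the hierarchy (death first, then ESRD, then eGFR decline), and the ratio of wins to losses is reported. The Python sketch below shows the mechanics on hypothetical records; it ignores the censoring rules a real analysis needs and is not the code used in this thesis.

    from itertools import product

    def compare(t, c):
        # Hierarchical comparison: a later event time (or no event, coded
        # as None) is better; ties at one level fall through to the next.
        for level in ("death_time", "esrd_time", "egfr_decline_time"):
            tv, cv = t[level], c[level]
            if tv == cv:
                continue
            if cv is None:
                return "loss"                  # control never had the event
            if tv is None or tv > cv:
                return "win"
            return "loss"
        return "tie"

    treatment = [{"death_time": None, "esrd_time": 60, "egfr_decline_time": 24},
                 {"death_time": 48, "esrd_time": None, "egfr_decline_time": 12}]
    control   = [{"death_time": 36, "esrd_time": None, "egfr_decline_time": 18},
                 {"death_time": None, "esrd_time": 40, "egfr_decline_time": 10}]

    wins = losses = 0
    for t, c in product(treatment, control):
        result = compare(t, c)
        wins += result == "win"
        losses += result == "loss"

    print("win ratio:", wins / losses if losses else float("inf"))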
19

Hellrung, James Lee, II. "Labels for Anchors: The Good, the Bad, and the Endpoints." TopSCHOLAR®, 2010. http://digitalcommons.wku.edu/theses/156.

Full text
Abstract:
This study was designed to determine whether any differences in internal consistency existed between different designs of scale anchors. The three designs explored were properly designed scales, improperly designed scales, and endpoint-only scales. Two hundred thirty-five participants rated the frequency with which they performed various computer activities on a survey using one of the three designs. Contrary to expectations, internal consistency did not differ across the three designs.
20

Powell, Christine Louise (advisor: Ivan Rusyn). "Improving linkage of hepatic toxicity and pathology endpoints with toxicogenomics." Chapel Hill, N.C.: University of North Carolina at Chapel Hill, 2007. http://dc.lib.unc.edu/u?/etd,997.

Full text
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2007.
Title from electronic title page (viewed Dec. 18, 2007). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Curriculum in Toxicology." Discipline: Toxicology; Department/School: Medicine.
21

Luo, Yingchun. "Nonparametric statistical procedures for therapeutic clinical trials with survival endpoints." Thesis, Kingston, Ont. : [s.n.], 2007. http://hdl.handle.net/1974/492.

Full text
22

Rossouw, Maria Susanna. "Validation of endpoints as biomarkers of low-dose radiation damage." Thesis, Cape Technikon, 2004. http://hdl.handle.net/20.500.11838/1461.

Full text
Abstract:
Thesis (MTech (Biomedical Technology))--Cape Technikon, Cape Town, 2004
The need for radiobiological research was born from the discovery that high doses of radiation could cause cancer and other health effects. However, recent developments in molecular biology uncovered the effects of low doses of radiation on different biological systems, and as a result new techniques have been developed to measure these effects. The aim of this study was thus to validate biomarkers of initial DNA strand breaks, micronucleus formation, and the different phases of apoptosis as biological indicators of low-dose radiation damage. Furthermore, the difference in response of blood cells to different qualities and doses of radiation was investigated by irradiating cells with low- and high-LET radiation simultaneously. Blood from one donor was irradiated with doses between 0 and 4 Gy of gamma and neutron radiation. The alkaline single-cell gel electrophoresis (comet) assay was performed on different cell preparations directly after irradiation for the detection of initial DNA strand breaks. Radiation-induced cytogenetic damage was investigated using the cytokinesis-blocked micronucleus assay, while different features of apoptosis were investigated by measuring caspase activation, enzymatic DNA fragmentation, and cellular morphology. The comet assay was sensitive enough to detect DNA strand breaks above 0.25 Gy and showed that the lymphocyte isolation process induced some endogenous damage in cells, detected by the formation of highly damaged cells and hedgehogs in isolated cell preparations only.
23

Wang, Wenbin. "HDACi-induced DNA damage : identifying potential endpoints for safety assessment." Thesis, Cardiff University, 2016. http://orca.cf.ac.uk/98927/.

Full text
Abstract:
Histone deacetylase inhibitors (HDACi) have been designed to alter the actions of epigenetic modifiers with the aim of 'reprogramming' the epigenome of diseased tissues back to their normal disease-free state. These inhibitors were designed to be non-DNA-reactive and are therefore considered safe from a genetic toxicology point of view. However, HDACis have been shown to induce DNA damage in healthy cells through unknown mechanisms, thereby posing significant risks to human health. Studies suggest that HDAC inhibitor-induced DNA damage is partly associated with changes in transcription and replication. Consequently, collisions between these events can result in the formation of DNA lesions and stable DNA:RNA hybrid structures (R-loops), which are implicated in the onset of cancer and various neurological diseases. The aims of the current study were therefore to better understand the mechanisms by which HDAC inhibitors may induce DNA damage and to identify potential endpoints for safety assessment.
Chapter III: Efforts to study the effects of HDAC inhibition through a chemical approach proved unsuccessful in the yeast model organism but identified the HDAC mutant, rpd3Δ, showing histone hyper-acetylation compared to the wild type.
Chapter IV: ChIP-chip was established for the TK6 lymphoblastoid cell line as a genome-wide tool for measuring the genotoxicity of HDAC inhibitors.
Chapter V: Application of the ChIP-chip method showed that Trichostatin A-induced changes in histone H4 acetylation led to the re-distribution of transcription and replication on chromosome 17 in TK6 cells. This resulted in their co-localisation, suggestive of potential collisions. However, further efforts to determine this by mapping γH2AX and R-loop formation proved unsuccessful.
Chapter VI: The yeast genetic mutant rpd3Δ was used to mimic the effects of treating with an HDAC inhibitor. The loss of RPD3 resulted in significantly higher levels of γH2A, predominantly at telomere regions.
In conclusion, this thesis presents strong evidence that Trichostatin A promotes the co-localisation of transcription and replication, suggesting a greater possibility of these processes colliding to form DNA damage.
24

Richert, Laura. "Trial design and analysis of endpoints in HIV vaccine trials." Thesis, Bordeaux 2, 2013. http://www.theses.fr/2013BOR22048/document.

Full text
Abstract:
Complex data are frequently recorded in recent clinical trials and require the use of appropriate statistical methods. HIV vaccine research is an example of a domain with complex data and a lack of validated endpoints for early-stage clinical trials. This thesis concerns methodological research with regard to the design and analysis aspects of HIV vaccine trials, in particular the definition of immunogenicity endpoints and phase I-II trial designs. Using cytokine multiplex data, we illustrate the methodological aspects specific to a given assay technique. We then propose endpoint definitions and statistical methods appropriate for the analysis of multidimensional immunogenicity data. We show in particular the value of non-parametric multivariate scores, which allow for summarizing information across different immunogenicity markers and for making statistical comparisons between and within groups. With the aim of contributing to the design of new vaccine trials, we present the construction of an optimized early-stage HIV vaccine design. Combining phase I and II assessments, the proposed design allows for accelerating the clinical development of several vaccine strategies in parallel. The integration of a stopping rule is proposed from both a frequentist and a Bayesian perspective. The methods advocated in this thesis are transposable to other research domains with complex data, such as imaging data or trials of other immune therapies.
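One classical example of such a nonparametric multivariate score is O'Brien's rank-sum summary: rank each marker across all subjects, sum the per-subject ranks, and compare groups on the resulting single score. The sketch below illustrates the idea on hypothetical immunogenicity values; it is a generic illustration, not the scoring actually developed in the thesis.

    import numpy as np
    from scipy.stats import rankdata, mannwhitneyu

    markers = np.array([   # rows = subjects, columns = immunogenicity markers
        [120, 0.8, 15], [340, 1.9, 40], [210, 1.1, 22],
        [400, 2.5, 55], [150, 0.9, 18], [380, 2.2, 47],
    ])
    group = np.array([0, 1, 0, 1, 0, 1])   # 0 = control, 1 = vaccine

    ranks = np.apply_along_axis(rankdata, 0, markers)  # rank within each marker
    score = ranks.sum(axis=1)                          # one summary per subject

    stat, p = mannwhitneyu(score[group == 1], score[group == 0],
                           alternative="greater")
    print("per-subject rank-sum scores:", score, " p-value:", round(p, 3))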
25

Baurne, Yvette. "Nonparametric Combination Methodology : A Better Way to Handle Composite Endpoints?" Thesis, Uppsala universitet, Statistiska institutionen, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-274959.

Full text
Abstract:
Composite endpoints are widely used in clinical trials. The outcome of a clinical trial can affect many individuals, and it is therefore important that the methods used are as effective and correct as possible. Improvements to the standard method of testing composite endpoints have been proposed, and in this thesis the alternative method using nonparametric combination methodology is compared to the standard method. In a simulation study, the power of three combining functions (Fisher, Tippett, and the logistic function) is compared to the power of the standard method. The performance of the four methods is evaluated for different compositions of treatment effects, as well as for independent and dependent components. The results show that using the nonparametric combination methodology leads to higher power in both the dependent and independent cases. The combining functions are suitable for different compositions of treatment effects, the Fisher combining function being the most versatile. The thesis was written with support from Statisticon AB.
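The three combining functions have standard closed forms: Fisher's -2 * sum(log p_i), Tippett's min(p_i), and the logistic -sum(log(p_i / (1 - p_i))). A minimal permutation-based NPC sketch on hypothetical two-endpoint data might look like the following; reusing the same permutations for both endpoints is what preserves their dependence.

    import numpy as np

    rng = np.random.default_rng(7)
    nA, nB = 15, 15
    x = np.vstack([rng.normal([0.5, 0.3], 1.0, size=(nA, 2)),   # treatment
                   rng.normal([0.0, 0.0], 1.0, size=(nB, 2))])  # control
    labels = np.array([1] * nA + [0] * nB)

    def stat(lab):
        # One partial statistic per endpoint: difference in group means.
        return x[lab == 1].mean(axis=0) - x[lab == 0].mean(axis=0)

    B = 2000
    T = np.vstack([stat(labels)] +
                  [stat(rng.permutation(labels)) for _ in range(B)])
    # Partial permutation p-values; row 0 holds the observed statistics.
    P = (T[None, :, :] >= T[:, None, :]).mean(axis=1)
    P = np.clip(P, 1 / (B + 1), B / (B + 1))     # keep the logs finite

    combining = {
        "Fisher":   lambda p: -2 * np.log(p).sum(axis=-1),
        "Tippett":  lambda p: 1 - p.min(axis=-1),
        "Logistic": lambda p: -np.log(p / (1 - p)).sum(axis=-1),
    }
    for name, f in combining.items():
        t = f(P)                                 # larger = more extreme
        print(name, "global p-value:", (t[1:] >= t[0]).mean())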
26

Lu, Qingshu. "Statistical analysis for two-stage adaptive designs with different study endpoints." City University of Hong Kong, 2009. http://libweb.cityu.edu.hk/cgi-bin/ezdb/thesis.pl?phd-ms-b30082766f.pdf.

Full text
Abstract:
Thesis (Ph.D.)--City University of Hong Kong, 2009.
"Submitted to Department of Management Science in partial fulfillment of the requirements for the degree of Doctor of Philosophy." Includes bibliographical references (leaves 126-130)
27

Das, Ashish. "Development of Energy-Based Endpoints for diagnosis of Pulmonary Valve Insufficiency." University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1384864758.

Full text
28

Bark, Charles. "Clinical symptoms and microbiological outcomes in tuberculosis treatment trials." Case Western Reserve University School of Graduate Studies / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=case1307630776.

Full text
29

Gómez-Mateu, Moisés. "Composite endpoints in clinical trials : computational tools, practical guidelines and methodological extensions." Doctoral thesis, Universitat Politècnica de Catalunya, 2016. http://hdl.handle.net/10803/396263.

Full text
Abstract:
The conclusions from randomized clinical trials (RCT) rely on the primary endpoint (PE), which is chosen at the design stage of the study; thus, it is of utmost importance to select it appropriately. In an RCT there should generally be only one PE, and it should provide the most clinically relevant and scientific evidence regarding the potential efficacy of the new treatment. Composite endpoints (CE) consist of the union of two or more outcomes and are often used in RCT. When the focus is time-to-event analysis, the CE refers to the elapsed time from randomization until the first of its components occurs. In oncology trials, for instance, progression-free survival is defined as the time to disease progression or death. The decision on whether to use a CE versus a single component as the PE is controversial. The advantages and drawbacks of using CE have been extensively discussed in the literature. Gómez and Lagakos developed a statistical methodology to evaluate the convenience of using a relevant endpoint (RE) versus a CE consisting of the union of the RE plus another additional endpoint (AE). Their strategy is based on the value of the asymptotic relative efficiency (ARE), which relates the efficiency of the logrank test based on the RE to the efficiency of the test based on the CE. The ARE is expressed as a function of the marginal laws of the times to each component RE and AE, the probabilities of observing each component in the control group, the hazard ratios measured by each component of the CE between the two treatment groups, and the correlation between components. This thesis explores, elaborates on, implements and applies the ARE method. We have also developed a new online platform named CompARE that facilitates the practical use of this method. The ARE method has been applied to cardiovascular studies. We have made further progress on the theoretical meaning of the ARE and have explored how to handle the probability and the hazard ratio of a combination of endpoints. In cardiovascular trials it is common to use CE. We systematically examine the use of CE in this field by means of a literature search and the discussion of several case studies. Based on the ARE methodology, we provide guidelines for the informed choice of the PE. We prove that the usual interpretation of the ARE as a ratio of sample sizes holds and that it can be applied to evaluate the efficiency of the RE versus the CE. Furthermore, we carry out a simulation study to empirically check the proximity between the ratio of finite sample sizes and the ARE. We discuss how to derive the probabilities and hazard ratios when they come from a combination of several components. Furthermore, it is shown that the combined hazard ratio (HR*) is, in general, not constant over time, even if the hazard ratios of the marginal components are. This non-constant behaviour might have a strong influence on the interpretation of the treatment effect and on sample size assessment. We evaluate the behaviour of the HR* with respect to the marginal parameters, and we study its departure from constancy in different scenarios. This thesis has implemented the ARE methodology on the online platform CompARE. Clinicians and biostatisticians can use CompARE to study the performance of different endpoints in a variety of scenarios. CompARE has an intuitive interface and is a convenient tool for better informed decisions regarding the PE. Results from different parameter settings are shown immediately by means of tables and plots.
CompARE is extended to quantify specific values for the combined probability and hazard ratios. When the user cannot anticipate some of the needed parameters, CompARE provides a range of plausible values. Moreover, the departure from constancy of a combined hazard ratio can be explored by visualizing its shape over time. Sample size computations are implemented as well.
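The non-constancy of the combined hazard ratio is easiest to see in the simplified case of independent components (the thesis treats the general, dependent case): the hazard of the time to the first event is then the sum of the component hazards, so with constant component hazard ratios HR_1 and HR_2,

    \[
    \mathrm{HR}^{*}(t)
      = \frac{\lambda_1^{(1)}(t) + \lambda_2^{(1)}(t)}
             {\lambda_1^{(0)}(t) + \lambda_2^{(0)}(t)}
      = \frac{\mathrm{HR}_1\,\lambda_1^{(0)}(t) + \mathrm{HR}_2\,\lambda_2^{(0)}(t)}
             {\lambda_1^{(0)}(t) + \lambda_2^{(0)}(t)},
    \]

a hazard-weighted average of HR_1 and HR_2 whose weights shift over time. It is constant only when HR_1 = HR_2 or the two baseline hazards are proportional, which matches the behaviour explored in the thesis.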
30

Carrió Gaspar, Pau. "Development of advanced strategies for the prediction of toxicity endpoints in drug development." Doctoral thesis, Universitat Pompeu Fabra, 2015. http://hdl.handle.net/10803/328418.

Full text
Abstract:
Safety concerns are one of the main causes of drug attrition. The moment at which a drug's toxic effects are discovered changes dramatically the importance of the finding: discarding a valuable candidate at the clinical testing stage means wasting years of effort and huge economic investments. Even more dramatic is the discovery of toxic effects at the post-marketing stage, when the drug could already have produced severe side effects in a number of patients. For these reasons there is a pressing need to develop methods able to assess the safety of drug candidates at early stages of development. Among these, in silico methods have many advantages: they do not even require the availability of the compound, they waste none of it in case it has already been synthesized, and they are fast, cheap and make no use of animal testing. Unfortunately, in silico prediction methods for toxicity endpoints do not always perform as expected. The reasons are still under debate, but likely ones include the complexity of the biological phenomena under study and the large structural diversity of the drug candidates, among others. The aim of this thesis is to improve currently used in silico prediction methods for their application to biological endpoints of interest in drug development, with a special emphasis on toxicological endpoints. Here, we report a novel general methodology called ADAN (Applicability Domain Analysis) for assessing the reliability of drug property predictions obtained by in silico methods. Furthermore, we propose a unifying strategy for the use of in silico predictive methods in this field, defining rational criteria for the application of a whole spectrum of methods, from structural alerts to global QSAR models, including read-across and local models. The usefulness of all the proposed methodologies is tested using a systematic analysis on representative datasets, obtaining good results that confirm their validity.
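For a flavour of what an applicability-domain check does, the sketch below flags a query compound as outside the domain when its distance to the training set in descriptor space exceeds the 95th percentile of the training set's own nearest-neighbour distances. This is a generic, single-criterion illustration in Python with synthetic descriptors; ADAN itself combines several such criteria and is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(5)
    train = rng.normal(size=(200, 8))          # hypothetical descriptor matrix
    query = rng.normal(2.5, 1.0, size=(3, 8))  # hypothetical query compounds

    def nn_distance(points, ref):
        # Euclidean distance from each point to its nearest reference row;
        # for the training set itself, skip the zero self-distance.
        d = np.linalg.norm(points[:, None, :] - ref[None, :, :], axis=2)
        return np.sort(d, axis=1)[:, 1 if points is ref else 0]

    threshold = np.percentile(nn_distance(train, train), 95)
    outside = nn_distance(query, train) > threshold
    print("outside the applicability domain:", outside)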
31

Lessigiarska, Iglika. "Development of structure-activity relationships for pharmacotoxicological endpoints relevant to European Union legislation." Thesis, Liverpool John Moores University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.436564.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Saxena, Akriti. "A SEQUENTIAL ALGORITHM TO IDENTIFY THE MIXING ENDPOINTS IN LIQUIDS IN PHARMACEUTICAL APPLICATIONS." VCU Scholars Compass, 2009. http://scholarscompass.vcu.edu/etd/1931.

Full text
Abstract:
The objective of this thesis is to develop a sequential algorithm to determine, accurately and quickly, at which point in time a product is well mixed, i.e. reaches a steady-state plateau, in terms of the Refractive Index (RI). An algorithm using sequential non-linear model fitting and prediction is proposed. A simulation study representing typical scenarios in a liquid manufacturing process in the pharmaceutical industry was performed to evaluate the proposed algorithm. The simulated data included autocorrelated normal errors and used the Gompertz model. A set of 27 different combinations of the parameters of the Gompertz function was considered. The results from the simulation study suggest that the algorithm is insensitive to the functional form and achieves the goal consistently with the fewest time points.
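The idea can be sketched directly: refit a Gompertz curve to the RI readings collected so far and stop once the fitted curve's slope at the latest time point falls below a tolerance. A minimal sketch, assuming a hypothetical plateau-slope tolerance and simulated data; the thesis's actual prediction step and stopping criterion may differ:

    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, a, b, c):
        # a: plateau RI, b: displacement, c: rate
        return a * np.exp(-b * np.exp(-c * t))

    def mixing_endpoint(times, ri, tol=1e-4):
        for n in range(5, len(times) + 1):        # refit as each point arrives
            try:
                (a, b, c), _ = curve_fit(gompertz, times[:n], ri[:n],
                                         p0=(ri.max(), 1.0, 0.1), maxfev=5000)
            except RuntimeError:
                continue                          # fit failed; wait for more data
            t = times[n - 1]
            slope = a * b * c * np.exp(-c * t) * np.exp(-b * np.exp(-c * t))
            if abs(slope) < tol:
                return t                          # declared well mixed here
        return None

    times = np.arange(0.0, 61.0)
    rng = np.random.default_rng(1)
    ri = gompertz(times, 1.35, 0.5, 0.15) + rng.normal(0, 1e-4, times.size)
    print(mixing_endpoint(times, ri))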
APA, Harvard, Vancouver, ISO, and other styles
33

Liu, Meng. "A PREDICTIVE PROBABILITY INTERIM DESIGN FOR PHASE II CLINICAL TRIALS WITH CONTINUOUS ENDPOINTS." UKnowledge, 2017. http://uknowledge.uky.edu/epb_etds/15.

Full text
Abstract:
Phase II clinical trials aim to screen out ineffective therapies and identify effective ones to move forward to randomized phase III trials. Single-arm studies remain the most utilized design in phase II oncology trials, especially in scenarios where a randomized design is simply not practical. Due to concerns regarding excessive toxicity or ineffective new treatment strategies, interim analyses are typically incorporated in the trial, and the choice of statistical methods mainly depends on the type of primary endpoint. For oncology trials, the most common primary objectives in phase II trials are tumor response rate (a binary endpoint) and progression-free survival (a time-to-event endpoint). Interim strategies are well developed for both endpoints in single-arm phase II trials. The advent of molecular targeted therapies, often with lower toxicity profiles than traditional cytotoxic treatments, has shifted the drug development paradigm towards establishing evidence of biological activity, target modulation and pharmacodynamic effects of these therapies in early phase trials. As such, these trials need to address simultaneous evaluation of safety as well as proof of concept of biological marker activity or changes in continuous tumor size instead of binary response rates. In this dissertation, we extend a predictive probability design for binary outcomes in the single-arm clinical trial setting and develop two interim designs for continuous endpoints, such as continuous tumor shrinkage or change in a biomarker over time. The two-stage design mainly focuses on futility stopping strategies, while it also has the capacity for early stopping for efficacy. Both optimal and minimax designs are presented for the two-stage design. The multi-stage design has the flexibility of stopping the trial early due to either futility or efficacy. Due to the intense computation and searching strategy we adopt, only the minimax design is presented for the multi-stage design. The multi-stage design allows for up to 40 interim looks, with continuous monitoring possible for large and moderate effect sizes, requiring an overall sample size of less than 40. The stopping boundaries for both designs are based on predictive probability with a normal likelihood and its conjugate prior distributions, while the design itself satisfies the pre-specified type I and type II error rate constraints. Simulation results show that, compared with binary endpoints, both designs preserve statistical properties well across different effect sizes with reduced sample size. We also develop an R package, PPSC, detailed in chapter four, so that both designs can be freely accessible for use in future phase II clinical trials with the collaborative efforts of biostatisticians. Clinical investigators and biostatisticians have the flexibility to specify the parameters of the hypothesis testing framework, the search ranges of the boundaries for predictive probabilities, the number of interim looks involved, whether continuous monitoring is preferred, and so on.
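For a continuous endpoint with a normal likelihood and conjugate normal prior, the predictive probability at an interim look is easy to approximate by Monte Carlo: draw the mean from its posterior, generate the remaining data, and count how often the final test would succeed. A stripped-down one-arm sketch with known variance; all numerical inputs are hypothetical, and the dissertation's designs add optimal/minimax boundary searches on top of this:

    import numpy as np
    from scipy import stats

    def predictive_prob(xbar1, n1, N, sigma=1.0, mu0=0.0, tau=10.0,
                        alpha=0.05, n_sims=100_000, seed=0):
        # Predictive probability that a one-sided final z-test of H0: mu <= 0
        # is significant, given an interim mean xbar1 on n1 of N observations.
        rng = np.random.default_rng(seed)
        prec = 1 / tau**2 + n1 / sigma**2          # conjugate normal posterior
        post_mean = (mu0 / tau**2 + n1 * xbar1 / sigma**2) / prec
        mu = rng.normal(post_mean, prec**-0.5, n_sims)
        n2 = N - n1
        xbar2 = rng.normal(mu, sigma / np.sqrt(n2))  # predictive remaining mean
        xbarN = (n1 * xbar1 + n2 * xbar2) / N
        z = xbarN / (sigma / np.sqrt(N))
        return float(np.mean(z > stats.norm.ppf(1 - alpha)))

    # Stop for futility if the predictive probability drops below, say, 0.10.
    print(predictive_prob(xbar1=0.35, n1=20, N=40))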
APA, Harvard, Vancouver, ISO, and other styles
34

Ntambwe, Lupetu Ives. "Sequential sample size re-estimation in clinical trials with multiple co-primary endpoints." Thesis, University of Warwick, 2014. http://wrap.warwick.ac.uk/66339/.

Full text
Abstract:
In this thesis, we consider interim sample size adjustment in clinical trials with multiple continuous co-primary endpoints. We aim to answer two questions: first, how to adjust the sample size in a clinical trial with multiple continuous co-primary endpoints using adaptive and group sequential designs; second, how to construct a test that controls the family-wise type I error rate and maintains the power even if the correlation ρ between endpoints is not known. To answer the first question, we conduct K different interim tests, one for each endpoint and each at level α/K (i.e. a Bonferroni adjustment). To answer the second question, we either perform a sample size re-estimation, in which the results of the interim analysis are used to estimate one or more nuisance parameters and this information determines the sample size for the rest of the trial, either directly or via an inverse normal combination test type approach; or we conduct a group sequential test in which we monitor the information, adjusted so that the correlation ρ can be estimated at each stage, again either directly or via the inverse normal combination test type approach. We show that both methods control the family-wise type I error α and maintain the power, and that the group sequential methodology seems to be more powerful, though this depends on the spending function.
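The first ingredient, testing each of the K endpoints at level α/K while re-estimating the sample size from interim estimates of the nuisance parameter (here the standard deviation), reduces to a standard normal-approximation calculation per endpoint, with the final size driven by the worst endpoint. A sketch in which the interim estimates and the target effect δ are hypothetical:

    import numpy as np
    from scipy import stats

    def reestimate_n(sd_hat, delta, alpha=0.05, power=0.80, K=2):
        # Per-arm n for one endpoint at Bonferroni-adjusted one-sided level
        # alpha/K, using the interim estimate of the standard deviation.
        za = stats.norm.ppf(1 - alpha / K)
        zb = stats.norm.ppf(power)
        return int(np.ceil(2 * ((za + zb) * sd_hat / delta) ** 2))

    interim_sds = [1.1, 1.6]   # hypothetical interim estimates for K=2 endpoints
    print(max(reestimate_n(sd, delta=0.5) for sd in interim_sds))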
APA, Harvard, Vancouver, ISO, and other styles
35

Pallmann, Philip Steffen [Verfasser]. "Multiple contrast tests with repeated and multiple endpoints : with biological applications / Philip Steffen Pallmann." Hannover : Technische Informationsbibliothek (TIB), 2016. http://d-nb.info/1112948635/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Poppel, Gerardus Arnoldus Franciscus Catharina van. "Beta-carotene and cancer risk: a trial in smokers using biomarkers as intermediate endpoints /." Maastricht : Rijksuniversiteit Limburg ; University Library, Maastricht University [Host], 1994. http://arno.unimaas.nl/show.cgi?fid=5908.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Sundaram, Supriya. "Characterisation of exacerbations in non-CF bronchiectasis to establish endpoints in measuring treatment efficacy." Thesis, University of East Anglia, 2013. https://ueaeprints.uea.ac.uk/50709/.

Full text
Abstract:
Bronchiectasis is characterised by chronic cough productive of mucopurulent sputum and frequent exacerbations. We aimed to validate clinical, biochemical and microbiological endpoints to aid the planning of future interventional studies. We recruited fifty-eight subjects with bronchiectasis at the Lung Defence Unit (Papworth Hospital, Cambridge) and studied them in the stable state (no exacerbation in the preceding four weeks) and during an exacerbation, over a period of two years. Clinical symptoms: cough, chest pain, chest discomfort, colour and volume of sputum, and fatigue measured by a visual analogue score are useful endpoints. Breathlessness is a reliable endpoint when measured using either a visual analogue score or the modified Borg breathlessness score. Health-related quality of life measured using the EuroQol questionnaire is a sensitive marker of change during an exacerbation. The St George's respiratory questionnaire did not demonstrate a significant change during an exacerbation. Spirometry: forced expiratory volume in the first second (actual and percentage predicted) and forced vital capacity (percentage predicted) do not change during the course of an exacerbation; actual forced vital capacity may be used as an endpoint. The pH of exhaled breath condensate in bronchiectasis is lower than in healthy subjects but does not change during the course of an exacerbation. Sputum appearance is a valid endpoint, while 24-hour sputum volume, microbial clearance and anti-pseudomonal antibody titres cannot be used. ESR and serum titres of IFN-γ, TNF-α, IL-6, IL-8, IL-10, IL-17 and IL-1β, and sputum titres of IFN-γ, IL-6 and IL-17, do not change during an exacerbation. C-reactive protein and sputum titres of TNF-α, IL-8 and IL-1β are effective indicators and can be recommended for use as endpoints in therapeutic interventional trials.
APA, Harvard, Vancouver, ISO, and other styles
38

Mazzotti, Andrea. "Analisi e sviluppo di un engine per l'interrogazione di endpoints SPARQL tramite linguaggio naturale." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/4179/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Farhat, Naha. "Design and Analysis of Toxicological Experiments with Multiple Endpoints and Synergistic and Inhibitory Effects." VCU Scholars Compass, 2014. https://scholarscompass.vcu.edu/etd/636.

Full text
Abstract:
The enormous increase in exposure to toxic materials and hazardous chemicals in recent years is a major concern due to the adverse effects of such exposure on human health specifically and all organisms in general. Among the major concerns of toxicologists is determining acceptable levels of exposure to such hazardous substances. Current approaches often evaluate each endpoint and stressor individually. Herein we propose two novel approaches to simultaneously determine the Benchmark Dose Tolerable Region (BMDTR) for studies with multiple endpoints and multiple stressors when the stressors exhibit no more than additive effects, adopting a Bayesian approach to fit the non-linear hierarchical model. A main concern when assessing the combined toxicological effect of a chemical mixture is the anticipated type of combined action (i.e. synergistic or antagonistic), so it was essential to extend the two proposed methods to handle this situation, which imposes more challenges due to the non-linearity of the tolerable region. Furthermore, we propose a new method to determine the endpoint probabilities for each endpoint, which reflect the importance of each endpoint in determining the boundaries of the BMDTR. This method was also extended to situations where there is an interaction effect between stressors. The results obtained from this method were consistent with the BMDTR approach in both scenarios (i.e. additive and non-additive effects). In addition, we developed new criteria for determining ray designs for follow-up experiments in toxicology studies based on the popular D-, A- and E-optimality criteria introduced by Kiefer (1959), for both scenarios. Moreover, the endpoint probabilities were used to extend these criteria into weighted versions, the main motivation being to separate necessary from unnecessary information by incorporating the probabilities as weights into the Fisher information matrix. Illustrative examples from simulated data are provided for all methods and criteria.
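The weighted-criterion idea can be illustrated for D-optimality: score each candidate ray design by the endpoint-probability-weighted log-determinant of the per-endpoint information matrices and keep the design that scores highest. A sketch under strong simplifying assumptions (a linearised model along each ray; the designs, weights and names are hypothetical):

    import numpy as np

    def weighted_logdet(model_matrices, weights):
        # Weighted D-optimality: sum_k w_k * log det(X_k' X_k), with w_k the
        # endpoint probability acting as a weight on endpoint k's information.
        return sum(w * np.linalg.slogdet(X.T @ X)[1]
                   for X, w in zip(model_matrices, weights))

    def model_matrix(doses):                 # intercept + dose along the ray
        return np.column_stack([np.ones_like(doses), doses])

    ray_a = [model_matrix(np.array([0.0, 0.5, 1.0, 1.5]))] * 2   # 2 endpoints
    ray_b = [model_matrix(np.array([0.0, 1.0, 1.0, 2.0]))] * 2
    w = [0.7, 0.3]                           # hypothetical endpoint probabilities
    best = max([("ray_a", ray_a), ("ray_b", ray_b)],
               key=lambda d: weighted_logdet(d[1], w))
    print(best[0])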
APA, Harvard, Vancouver, ISO, and other styles
40

Krishnamoorthy, Mahesh kumaar. "Investigations on Linkages Between Blood Flow Dynamics and Histological Endpoints in Dialysis Access Fistula." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1267718697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Lee, Namheon. "Assessment of Pulmonary Insufficiency using Energy-Based Endpoints and 4D Phase Contrast MR Imaging." University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1384865927.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Michelson, Maximilian. "Client controlled, secure endpoint-to-endpoint storage in the cloud." Thesis, KTH, Hälsoinformatik och logistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-278069.

Full text
Abstract:
Softronic's customers do not want them to store sensitive data in a cloud environment, as they distrust the cloud providers with keeping sensitive data secret and are afraid of violating GDPR. Softronic wants to prove that data can be kept protected using encryption even though it is stored in a cloud, and the goal of this thesis is to find a cryptographic solution with good security and performance. The chosen solution was to implement object-level encryption with both encryption and decryption done on-site at Softronic, with the cloud provider kept outside of the encryption process. Encrypted data can then safely be stored in the cloud and decrypted on demand on-site again. The cryptography used in the solution was determined after multiple evaluations comparing encryption algorithms and the effects of key lengths, block sizes, and modes of operation. The evaluations showed big performance differences between encryption algorithms as well as for different encryption modes, where the biggest difference was between those with and without integrity checks built in. The key length used did not affect object-level encryption performance, and the biggest key size can therefore be used for maximum security. The different block sizes did not affect performance either, but a 128-bit one, as opposed to a 64-bit one, requires less maintenance, as key rotations are not required as frequently. The secure transport protocol, TLS, performed in-transit encryption of the object-level encrypted data as it was sent to the cloud for storage, which adversely affects performance. TLS encryption suites were therefore evaluated to find the one with the smallest performance impact. The evaluations found that the key size affected performance when doing in-transit encryption, as opposed to object-level encryption, and that the encryption suite TLS_AES_128_GCM_SHA256, with the smallest key, performed the best. Keywords: Encryption, data protection, cloud databases, symmetric encryption, TLS, GDPR, AEAD, Crypto
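Object-level encryption of the kind evaluated here, AEAD encryption and decryption performed on-site so the cloud only ever stores ciphertext, can be sketched with AES-GCM and a 256-bit key (following the finding that the largest key costs nothing at object level). A minimal sketch using Python's cryptography package; key management and the actual implementation built at Softronic are out of scope:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # largest key: no object-level penalty
    aes = AESGCM(key)

    def encrypt_object(plaintext: bytes, object_id: bytes) -> bytes:
        nonce = os.urandom(12)                  # 96-bit nonce, never reused per key
        # The object id is bound as associated data, so a ciphertext cannot be
        # silently swapped onto another object without failing authentication.
        return nonce + aes.encrypt(nonce, plaintext, object_id)

    def decrypt_object(blob: bytes, object_id: bytes) -> bytes:
        return aes.decrypt(blob[:12], blob[12:], object_id)

    blob = encrypt_object(b"sensitive record", b"object-42")   # store blob in the cloud
    assert decrypt_object(blob, b"object-42") == b"sensitive record"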
APA, Harvard, Vancouver, ISO, and other styles
43

Wente, Stephen P. "Optimizing land acquisition-conversion projects for water quality protection and enhancement using biological integrity endpoints." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1036190.

Full text
Abstract:
Biological monitoring and land use data analysis were performed for a small (79,800 acre) watershed in west-central Indiana. A model was developed relating the Hilsenhoff biotic index to the percentage of water (by volume) draining through forestland at each sample site (R² = .92, P < .002). This water volume model was found to explain more of the variation in biological integrity than USEPA and Ohio EPA habitat assessment methods, as well as a land use model based upon percentage of watershed surface area. Based on this water volume model, maps were created depicting regions within the watershed that had the greatest potential to damage water quality. Land acquisition/conversion projects based upon these maps should improve biological integrity and water quality more efficiently, requiring less land acquisition/conversion, and therefore lowering project costs while increasing water quality benefits.
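The underlying model is a simple regression of the Hilsenhoff biotic index on the percentage of water volume draining through forestland. A sketch with entirely hypothetical site data (the thesis's data and fitted coefficients are not reproduced here):

    import numpy as np

    # Hypothetical sites: percent of water volume draining through forestland
    # vs. Hilsenhoff biotic index (lower HBI = better water quality).
    pct_forest = np.array([5.0, 15, 30, 45, 60, 75, 90])
    hbi = np.array([8.1, 7.4, 6.6, 5.9, 5.1, 4.6, 4.0])

    slope, intercept = np.polyfit(pct_forest, hbi, 1)
    r2 = np.corrcoef(pct_forest, hbi)[0, 1] ** 2
    print(f"HBI ~ {intercept:.2f} + {slope:.3f} * pct_forest  (R^2 = {r2:.2f})")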
Department of Biology
APA, Harvard, Vancouver, ISO, and other styles
44

Bruce, Sharon Diane. "Development of rate-related exercise-induced myocardial ischemia and risk of selected coronary disease endpoints." Thesis, This resource online, 1993. http://scholar.lib.vt.edu/theses/available/etd-11102009-020132/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Zhou, Bin. "Validity of cerebrospinal fluid biomarkers as endpoints in early-phase clinical trials for Alzheimer disease." Kyoto University, 2009. http://hdl.handle.net/2433/126429.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Chambilla, Aquino Teófilo. "Lenguaje de especificación para la delegación de tareas en Servidores Web mediante agentes." Tesis, Universidad de Chile, 2016. http://repositorio.uchile.cl/handle/2250/139150.

Full text
Abstract:
Magíster en Ciencias, Mención Computación (Master of Science, Computer Science)
Agent technology has become the foundation of a large number of applications, since it allows knowledge bases of actions and tasks to be incorporated in order to solve complex problems. Web servers, on the other hand, rest on the HTTP protocol, which only supports requests and responses between client and server and does not allow functions to be delegated to other, geographically separated servers. This research is an exploratory study of the concept of delegation in the context of the Web, in which agents residing on different Web servers cooperate to solve complex tasks. To this end, a specification language for the delegation of tasks between Web servers by means of agents is proposed, with the properties needed for their autonomy, so that they can be used flexibly in distributed environments under the constraints of the HTTP communication protocol. First, the abstract model of delegation in the Web environment is presented, together with the components needed to build the proposed specification language, through the definition of basic and optional actions implemented by the agents taking part in the delegation process. Second, as a case study, a distributed, agent-based implementation of NautiLOD is developed. NautiLOD is a declarative expression language designed to specify navigation patterns in the Linked Open Data network, whose first proposed implementations were centralised. Third, Agent Server is presented: a flexible and scalable platform for Web-based multi-agent systems, developed following REST principles, which manages distributed agents. The main conclusion of the thesis is the validation of the specification language on a homogeneous platform such as Linked Data, whose semantics allow agents to process its content, reason about it and draw logical deductions. This was carried out with purpose-built queries against SPARQL endpoints, expressed in NautiLOD.
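Because the agents are constrained to HTTP, each delegated navigation step ultimately reduces to an HTTP request against a SPARQL endpoint. A minimal Python sketch of such a step; the endpoint and query are illustrative, and the actual NautiLOD evaluation logic is considerably richer:

    import requests

    ENDPOINT = "https://dbpedia.org/sparql"     # any public SPARQL endpoint
    QUERY = """
    SELECT ?city WHERE {
      ?city <http://dbpedia.org/ontology/country> <http://dbpedia.org/resource/Chile> .
    } LIMIT 5
    """

    resp = requests.get(ENDPOINT, params={"query": QUERY},
                        headers={"Accept": "application/sparql-results+json"})
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["city"]["value"])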
APA, Harvard, Vancouver, ISO, and other styles
47

Kugler, Josephine [Verfasser]. "Essential Signaling Cascades as Predictive Endpoints for Teratogenicity in vitro : A Proof of Principle Study / Josephine Kugler." Berlin : Freie Universität Berlin, 2017. http://d-nb.info/1126505056/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Ross, Graham Andrew. "An investigation into the biological basis of 'late effect' endpoints in the rectum of rats after radiation." Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.339103.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Fulton, Rachael Louise. "Exploratory analyses to guide inclusion, limitation of sample size and strengthening of endpoints in clinical stroke trials." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/4530/.

Full text
Abstract:
Clinical trials for treatment of acute ischaemic stroke require large numbers of patients and are expensive to conduct. Treatment is typically administered within the first hours or days after stroke onset. Outcome is usually assessed by a single measure, the most common being the modified Rankin scale (mRS) at day 90. Any strategy that can reduce cost or deliver more reliable answers on the safety and efficacy of the investigational treatment would be welcome for future exploratory testing of novel treatments. This thesis focused on the impact of applying different methods of design, inclusion and outcome measurement to limit sample size and strengthen analysis in clinical trials in acute stroke. Firstly, inclusion criteria were investigated to assess the impact on functional outcome. By assessing how the effect of thrombolysis changes over onset time to treatment (OTT), the relationship between OTT and age could be investigated. Looking across the entire range of OTT and assessing the interaction between the two covariates provided complementary data to a previous VISTA analysis conducted by Mishra et al. It was found that across the full range of OTT, up to 3.5h, the treatment effect of thrombolysis in very elderly stroke patients (>80 years old) was comparable to that of their younger counterparts. The association of AF and the modified Rankin Scale (mRS) at day 90 was then assessed. Multiple logistic regression analysis adjusted for age and baseline National Institutes of Health Stroke Scale (NIHSS) showed that history of AF had no independent impact on stroke outcome. Deferred selection of subjects for neurorestorative therapies from hyperacute (<6h) to 24h was then explored using a simulation approach. The sample size required to detect a 'shift' in mRS outcome equivalent to a 5% absolute difference in the proportion achieving mRS 0-2 versus 3-6 was modelled, setting power at 80% and assuming adjustment for entry age and NIHSS. It was found that extending the time window for patient selection provides a measurement which has a stronger, more predictive relationship with outcome. Trial inclusion was explored further by investigating selection for delayed treatment with thrombolysis. Prognostic scoring methods were proposed to identify a strategy for patient selection, to be applied first to an existing trial dataset and then validated in the pooled RCT 4.5-6h data. Prognostic score limits were chosen to optimise the sample for a net treatment benefit significant at p=0.01 by the Cochran-Mantel-Haenszel test and by ordinal logistic regression. More inclusive limits were also defined based on p=0.05 criteria. After finalising the prognostic score limits, they were applied for validation by an independent statistician to the pooled RCT data for 4.5-6h. The validation analysis based on ordinal outcomes failed to deliver a population in whom treatment >4.5h was safe and effective; analysis based on net benefit (mRS 0-1) showed significance. Secondly, different strategies for endpoint selection were considered. In the past, some trialists have investigated the use of earlier endpoints on single trial datasets and taken advantage of the fact that numerous outcome scales are available to measure various domains of neurological and functional recovery. The use of an earlier neurological endpoint for detecting futility in a trial was considered, with validation on external RCT data. Global endpoints, investigating different aspects of functional recovery at different time-points, were then considered.
Simulations were undertaken to assess the relationship between sample size and power for ordinal scales and the corresponding global outcomes. Day 7 NIHSS was found to be the most sensitive individual ordinal endpoint. Dichotomised analyses supported these results; however, this needed validation in a randomised trial dataset for use in exploratory stroke trials. The validation study reinforced the results from the non-randomised VISTA study. The global test combination of NIHSS90 with NIHSS7 appeared to offer incremental sensitivity to treatment effect compared to the ordinal scales alone. The combination of mRS90 with NIHSS7 did not increase the sensitivity to treatment effect when compared to NIHSS alone, but offers a broader clinical measure without loss of statistical power. Finally, alternatives to the traditional RCT were considered. Abandoning the rigour of the blinded RCT carries a substantial penalty in loss of reliability and should not be undertaken lightly. If a placebo control is deemed impractical or unethical, investigators often consider comparisons against historical controls. A within-VISTA exploration of case-control matching is presented. The reliability of different matching methods and covariate combinations was assessed using a simulation approach. The results indicate that caution must be taken when using historical controls to generate a matched control group; substantial further work on matching to external data and validation against RCT data is needed. Cluster randomised trials, which randomise patients by groups, are becoming a more widely used approach. When evaluating strategies to promote the transfer of research findings into clinical practice, i.e. in "Implementation Research", an RCT is impractical and a cluster randomised trial design is advantageous. Some elements of the design and sample size calculation of cluster randomised trials were considered. Intra-cluster correlation coefficients (ICCs) were estimated from linear and generalised linear mixed models using maximum likelihood estimation for common measures used in stroke research. These estimates of relevant ICCs should assist in the design and planning of cluster randomised trials. In conclusion, this research has shown that there are several areas in the design of clinical trials of acute stroke that merit further investigation. Several strategies have been highlighted that could potentially reduce sample size whilst retaining optimal levels of statistical power. However, other aspects such as patient selection and the nature of the intervention under study can affect trial cost and statistical power and need to be taken into consideration.
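The simulations relating sample size to power for an ordinal 'shift' analysis can be sketched directly: draw control and treated day-90 mRS samples, apply a rank-based test, and count rejections. A stripped-down sketch with hypothetical outcome distributions and no covariate adjustment (the thesis adjusts for age and baseline NIHSS):

    import numpy as np
    from scipy.stats import mannwhitneyu

    control_p = np.array([.08, .12, .15, .20, .20, .15, .10])   # P(mRS = 0..6)
    treat_p   = np.array([.11, .15, .16, .20, .18, .12, .08])   # modest favourable shift

    def power(n_per_arm, n_sims=2000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sims):
            c = rng.choice(7, n_per_arm, p=control_p)
            t = rng.choice(7, n_per_arm, p=treat_p)
            # Lower mRS is better, so test whether treated scores tend lower.
            if mannwhitneyu(t, c, alternative="less").pvalue < alpha:
                hits += 1
        return hits / n_sims

    for n in (300, 600, 900):
        print(n, power(n))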
APA, Harvard, Vancouver, ISO, and other styles
50

Walker, Ann Sarah. "The analysis of multivariate failure time data with application to multiple endpoints in trials in HIV infection." Thesis, University College London (University of London), 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390624.

Full text
APA, Harvard, Vancouver, ISO, and other styles