Dissertations / Theses on the topic 'Chemistry – Statistical methods'

Consult the top 20 dissertations / theses for your research on the topic 'Chemistry – Statistical methods.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Freitas, Sonia Maria de. "Statistical Methodology for Analytical Methods Validation Applicable to Metrology in Chemistry." Pontifícia Universidade Católica do Rio de Janeiro, 2003. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=4058@1.

Abstract:
The statistical methodology chosen for the validation of analytical methods applicable to metrology in chemistry is fundamental to assure quality, demonstrate efficiency, and establish the accuracy of measurement results in chemical analyses. Developed in accordance with metrological rigour, such a methodology yields a validated, reliable measurement system with quantified uncertainties. This work proposes a general methodology for the validation of analytical methods. The methodology developed resulted from a synthesis of partial methods described in the literature and includes a critical choice of the most suitable techniques among the existing alternatives. The proposed approach combines four different aspects of validation: modelling of the calibration curve; control of the specificity of the method; comparison of the trueness and precision (repeatability and intermediate precision) of the method against a reference method; and estimation of the uncertainty components inherent in all of these aspects. As a result, in addition to a proposal for the validation of methods for use in chemical analyses, the inverse calibration function and the expanded uncertainties are obtained, which allow the analytical results associated with the response values to be reported together with their respective uncertainties. In the general modelling used to obtain the calibration curve, statistical techniques are employed to assess linearity and to calculate the minimum detectable value and the minimum quantifiable value. The specificity of the analytical method is evaluated by adding standards to a set of representative samples and subsequently recovering them, with a least-squares fit and hypothesis tests. To study the trueness and precision of the method when compared with a reference method, a four-level hierarchical model is used together with the Satterthwaite approximation to determine the number of degrees of freedom associated with the variance components. The statistical techniques used are illustrated step by step with numerical examples.
The use of statistical methodology for the validation of analytical methods is vital to assure that measurements have the quality level required by the goal to be attained. This thesis describes a statistical modelling approach that combines four different aspects of validation: checking the linearity of the calibration curve and computing the detection and quantification limits; controlling the specificity of the analytical method; estimating the accuracy (trueness and precision) of the alternative method by comparison with a reference method; and estimating the uncertainty components associated with all of these aspects. The general approach is a synthesis of several partial techniques found in the literature, with the most appropriate technique chosen in each case. For determination of the response function, statistical techniques are used to assess the fit of the regression model and to determine the detection and quantification limits. Method specificity is evaluated by fitting a straight line between added and recovered concentrations via least-squares regression and hypothesis tests on the slope and intercept. To compare a method B with a reference method A, the precision and accuracy of method B are estimated; a four-factor nested design is employed for this purpose. The different variance estimates are calculated from the experimental data by ANOVA, and the Satterthwaite approximation is used to determine the number of degrees of freedom associated with the variance components. The application of the methodology is thoroughly illustrated with step-by-step examples.
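
As a rough illustration of the specificity check described above (standards added, recoveries fitted by least squares, and hypothesis tests on the slope and intercept), the following Python sketch uses hypothetical data; it is not code from the thesis.

import numpy as np
from scipy import stats

added = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # spiked concentrations (hypothetical units)
recovered = np.array([0.52, 0.98, 2.05, 3.90, 7.95])    # concentrations recovered after spiking

res = stats.linregress(added, recovered)
n = len(added)
# Residual standard deviation and standard error of the intercept
s_yx = np.sqrt(np.sum((recovered - (res.intercept + res.slope * added)) ** 2) / (n - 2))
se_int = s_yx * np.sqrt(np.sum(added ** 2) / (n * np.sum((added - added.mean()) ** 2)))

t_slope = (res.slope - 1.0) / res.stderr     # H0: slope = 1 (complete recovery)
t_int = res.intercept / se_int               # H0: intercept = 0 (no constant bias)
p_slope = 2 * stats.t.sf(abs(t_slope), df=n - 2)
p_int = 2 * stats.t.sf(abs(t_int), df=n - 2)
print(f"slope = {res.slope:.3f} (p = {p_slope:.3f}), intercept = {res.intercept:.3f} (p = {p_int:.3f})")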
2

Farhat, Hikmat. "Studies in computational methods for statistical mechanics of fluids." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0026/NQ50157.pdf.

3

Agrawala, Gautam Kumar. "Regional ground water interpretation using multivariate statistical methods." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2007. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

4

Ganti, Satyakala. "Development of HPLC Methods for Pharmaceutically Relevant Molecules; Method Transfer to UPLC: Comparing Methods Statistically for Equivalence." Diss., Temple University Libraries, 2011. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/118587.

Abstract:
High Pressure Liquid Chromatography (HPLC) is a well-known and widely used analytical technique that is prevalent throughout the pharmaceutical industry as a research tool. Despite its prominence, HPLC has some disadvantages, most notably slow analysis times and large consumption of organic solvents. Ultra Pressure Liquid Chromatography (UPLC) is a relatively new technique that offers the same separation capabilities as HPLC with the added benefits of shorter run times and lower solvent consumption. One of the key developments that enables UPLC is the use of sub-2-µm particles as column packing material; these particles allow higher operating pressures and increased flow rates while still providing good separation. Although UPLC technology has been available since the early 2000s, few laboratories have embraced it as an alternative to HPLC. Besides the reluctance to invest new capital, another major roadblock is converting existing HPLC methodology to UPLC without disruption. This research provides a framework for converting existing HPLC methods to UPLC. An existing HPLC method for the analysis of galantamine hydrobromide was converted to UPLC and validated according to ICH guidelines. A series of statistical evaluations of the validation data was performed to demonstrate the equivalence of the original HPLC method and the new UPLC method. The resulting statistical strategy is general and can be applied to any two methods to establish their equivalence.
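
The thesis develops its own statistical strategy for demonstrating equivalence; as a generic illustration only, the sketch below applies a two one-sided tests (TOST) procedure to hypothetical recovery data from two methods. The data, the ±2% margin and the simple pooled degrees of freedom are assumptions, not values or choices from the work.

import numpy as np
from scipy import stats

hplc = np.array([99.1, 100.2, 99.8, 100.5, 99.6, 100.1])   # % recovery by HPLC (hypothetical)
uplc = np.array([99.4, 100.0, 99.9, 100.3, 99.7, 100.2])   # % recovery by UPLC (hypothetical)
margin = 2.0                                                 # equivalence margin in % recovery (assumed)

diff = uplc.mean() - hplc.mean()
se = np.sqrt(uplc.var(ddof=1) / len(uplc) + hplc.var(ddof=1) / len(hplc))
df = len(uplc) + len(hplc) - 2                               # simple approximation

t_lower = (diff + margin) / se      # H0: difference <= -margin
t_upper = (diff - margin) / se      # H0: difference >= +margin
p_lower = stats.t.sf(t_lower, df)
p_upper = stats.t.cdf(t_upper, df)
p_tost = max(p_lower, p_upper)      # both one-sided tests must reject for equivalence
print(f"mean difference = {diff:.3f}%, TOST p-value = {p_tost:.4f}")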
5

Goodpaster, Aaron M. "Statistical Analysis Methods Development for Nuclear Magnetic Resonance and Liquid Chromatography/Mass Spectroscopy Based Metabonomics Research." Miami University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=miami1312317652.

6

Woocay-Prieto, Arturo. "Groundwater hydrochemical facies, flowpaths and recharge determined by multivariate statistical, isotopic and chloride mass-balance methods." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2008. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

7

Gabrielsson, Jon. "Multivariate methods in tablet formulation." Doctoral thesis, Umeå University, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-268.

8

Wang, Bo. "Novel statistical methods for evaluation of metabolic biomarkers applied to human cancer cell lines." Miami University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=miami1399046331.

9

Pham, Minh H. "Signal Detection of Adverse Drug Reaction using the Adverse Event Reporting System: Literature Review and Novel Methods." Scholar Commons, 2018. http://scholarcommons.usf.edu/etd/7218.

Abstract:
One of the objectives of the U.S. Food and Drug Administration is to protect public health through post-marketing drug safety surveillance, also known as pharmacovigilance. An inexpensive and efficient way to monitor post-marketing drug safety is to apply data mining algorithms to electronic health records to discover associations between drugs and adverse events. The purpose of this study is two-fold. First, we review the methods and algorithms proposed in the literature for identifying associations between drugs and an adverse event and discuss their advantages and drawbacks. Second, we adapt novel methods that have been used on comparable problems, such as genome-wide association studies and the market-basket problem. Most of the common methods for the drug-adverse event problem have a univariate structure and are therefore prone to false positives when certain drugs are commonly co-prescribed. We therefore study the applicability of multivariate methods from the literature, such as logistic regression and the regression-adjusted Gamma-Poisson Shrinkage model, for these association studies. We also adapt Random Forest and Monte Carlo logic regression from genome-wide association studies to our problem because of their ability to detect inherent interactions. We have built a computer program for the regression-adjusted Gamma-Poisson Shrinkage model, which was proposed by DuMouchel in 2013 but has not been made available in any public software package. A comparison study between popular methods and the proposed new methods is presented.
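
For orientation, a classical univariate disproportionality measure of the kind this literature builds on is the proportional reporting ratio (PRR), computed from a 2×2 table of report counts. The sketch below uses invented counts and is a baseline illustration only, not the regression-adjusted Gamma-Poisson Shrinkage model itself.

import math

# Hypothetical 2x2 counts from a spontaneous-reporting database
a = 40      # reports with the drug of interest and the adverse event of interest
b = 960     # reports with the drug, without the event
c = 200     # reports with the event, without the drug
d = 48800   # reports with neither

prr = (a / (a + b)) / (c / (c + d))
se_log_prr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
ci_low = math.exp(math.log(prr) - 1.96 * se_log_prr)
ci_high = math.exp(math.log(prr) + 1.96 * se_log_prr)
print(f"PRR = {prr:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")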
10

Moller, Jurgen Johann. "The implementation of noise addition partial least squares." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/3362.

Abstract:
When determining the chemical composition of a specimen, traditional laboratory techniques are often both expensive and time consuming. It is therefore preferable to employ more cost-effective spectroscopic techniques such as near infrared (NIR) spectroscopy. Traditionally, the calibration problem has been solved by means of multiple linear regression to specify the model between X and Y. Traditional regression techniques, however, quickly fail with spectroscopic data, as the number of wavelengths can easily be several hundred, often exceeding the number of chemical samples. This scenario, together with the high level of collinearity between wavelengths, necessarily leads to singularity problems when calculating the regression coefficients. Ways of dealing with the collinearity problem include principal component regression (PCR), ridge regression (RR) and PLS regression. Both PCR and RR require a significant amount of computation when the number of variables is large. PLS overcomes the collinearity problem in a similar way to PCR, by modelling both the chemical and spectral data as functions of common latent variables. The quality of the reference method employed greatly affects the coefficients of the regression model and therefore the quality of its predictions. With both X and Y subject to random error, the quality of the predictions of Y is reduced as the level of noise increases. Previous research has focused mainly on the effects of noise in X. This thesis focuses on a method proposed by Dardenne and Fernández Pierna, called Noise Addition Partial Least Squares (NAPLS), that attempts to deal with the problem of poor reference values. Some aspects of the theory behind PCR, PLS and model selection are discussed, followed by a discussion of the NAPLS algorithm. Both PLS and NAPLS are implemented on various datasets that arise in practice, in order to determine cases where NAPLS is beneficial over conventional PLS. For each dataset, specific attention is given to the analysis of outliers, influential values and the linearity between X and Y, using graphical techniques. Lastly, the performance of the NAPLS algorithm is evaluated for various
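
To convey the basic setting (PLS calibration of many collinear wavelengths against reference values Y that themselves carry error), the sketch below fits ordinary PLS to simulated spectra and shows how cross-validated performance degrades as extra noise is added to the reference values. It illustrates the problem NAPLS targets, not the NAPLS algorithm itself; all data are simulated.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 400
X = rng.normal(size=(n_samples, n_wavelengths))            # stand-in for NIR spectra
true_coef = np.zeros(n_wavelengths)
true_coef[:10] = 1.0
y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)  # reference values with measurement error

for sigma in (0.0, 0.5, 1.0):                              # increasing noise added to the reference values
    y_noisy = y + rng.normal(scale=sigma, size=n_samples)
    pls = PLSRegression(n_components=5)
    r2 = cross_val_score(pls, X, y_noisy, cv=5, scoring="r2").mean()
    print(f"noise sd = {sigma:.1f}: cross-validated R^2 = {r2:.3f}")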
11

Wiklund, Susanne. "Spectroscopic data and multivariate analysis : tools to study genetic perturbations in poplar trees." Doctoral thesis, Department of Chemistry, Umeå University, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1396.

12

Oosthuysen, Coenraad Willem. "Work management business process against employee engagement." Thesis, Stellenbosch : University of Stellenbosch, 2008. http://hdl.handle.net/10019.1/810.

Abstract:
Presents a qualitative evaluation of the factors that influence employee engagement from the perspective of implementing and sustaining a work management business process. Describes the intent of employee engagement and business processes in organizations, and compares the principles of employee engagement with the work management business process. By applying statistical evaluation methods (frequency analyses), this research adds to the existing body of knowledge in this field by identifying factors that lead to disengagement of employees in the execution of work management business processes and a subsequent loss of skills. The analysis indicates that the business process facilitates engagement of employees in work management at the start of their careers; however, career development and personal growth for experienced employees are not well supported in the execution of work management. Concludes with recommendations for sustaining employee engagement in work management.
13

Chang, Qiang. "Continuous-time random-walk simulation of surface kinetics." The Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=osu1166592142.

14

Bunch, Nathan. "Oral Fluid Method Validation for Bowling Green State University." Bowling Green State University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1586969951770212.

15

Mehta, Darshan. "New Fragmentation Method to Enhance Structure-Based In Silico Modeling of Chemically-Induced Toxicity." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1449759891.

16

Lundberg, Robert. "Validation of Biomarkers for the Revision of the CEN/TR 15522-2:2012 Method : A Statistical Study of Sampling, Discriminating Powers and Weathering of new Biomarkers for Comparative Analysis of Lighter Oils." Thesis, Linköpings universitet, Kemi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-156975.

Abstract:
The revision of the CEN/TR 15522-2:2012 methodology contains new biomarkers to facilitate forensic fingerprinting of the variety of oil types that can be involved in different crimes, and the purpose of this project is to validate the biomarkers of the new methodology. The biomarkers were validated by examining the corresponding diagnostic ratios' compatibility with the internationally used sampling cloth, their discriminating power, their correlation, and their sensitivity to simulated weathering, using GC-SIM-MS analysis followed by statistical evaluation with t-tests, diagnostic power, Pearson correlation matrices and MS-PW plots, respectively. For most of the diagnostic ratios, the results showed good compatibility with the internationally used sampling cloth and the expected patterns of biodegradation and photo-oxidation, apart from the observed photo-oxidation of hydro-PAHs, and normative and informative ratios with high diagnostic power, though with strong correlations among some of the tested ratios, could be identified in diesel oils. Owing to limitations, however, such as the limited number of oils with similar origins that were analysed, the results should be regarded as guidelines that can be expanded upon.
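
Two of the routine steps mentioned above, forming diagnostic ratios from integrated peak areas and checking how strongly those ratios are correlated, can be sketched as follows; the compound names, peak areas and ratio definitions are purely hypothetical stand-ins, not values from the study.

import pandas as pd

# Hypothetical peak areas for three biomarker compounds across five diesel samples
peaks = pd.DataFrame(
    {"compound_A": [120, 135, 118, 142, 130],
     "compound_B": [60, 66, 58, 70, 64],
     "compound_C": [30, 31, 29, 36, 33]},
    index=[f"sample_{i}" for i in range(1, 6)],
)

# Diagnostic ratios of the normative A/(A+B) form used for oil-to-oil comparison
ratios = pd.DataFrame({
    "A/(A+B)": peaks.compound_A / (peaks.compound_A + peaks.compound_B),
    "B/(B+C)": peaks.compound_B / (peaks.compound_B + peaks.compound_C),
    "A/(A+C)": peaks.compound_A / (peaks.compound_A + peaks.compound_C),
})

print(ratios.round(3))
print(ratios.corr(method="pearson").round(2))   # strongly correlated ratios carry redundant information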
17

Swann, Ellen Therese. "Development and application of statistical and quantum mechanical methods for modelling molecular ensembles." PhD thesis, 2018. http://hdl.handle.net/1885/142784.

Abstract:
The development of new quantum chemical methods requires extensive benchmarking to establish the accuracy and limitations of a method. Current benchmarking practices in computational chemistry use test sets that are subject to human biases and as such can be fundamentally flawed. This work presents a thorough benchmark of diffusion Monte Carlo (DMC) for a range of systems and properties as well as a novel method for developing new, unbiased test sets using multivariate statistical techniques. Firstly, the hydrogen abstraction of methanol is used as a test system to develop a more efficient protocol that minimises the computational cost of DMC without compromising accuracy. This protocol is then applied to three test sets of reaction energies, including 43 radical stabilisation energies, 14 Diels-Alder reactions and 76 barrier heights of hydrogen and non-hydrogen transfer reactions. The average mean absolute error for all three databases is just 0.9 kcal/mol. The accuracy of the explicitly correlated trial wavefunction used in DMC is demonstrated using the ionisation potentials and electron affinities of first- and second-row atoms. A multi-determinant trial wavefunction reduces the errors for systems with strong multi-configuration character, as well as for predominantly single-reference systems. It is shown that the use of pseudopotentials in place of all-electron basis sets slightly increases the error for these systems. DMC is then tested with a set of eighteen challenging reactions. Incorporating more determinants in the trial wavefunction reduced the errors for most systems but results are highly dependent on the active space used in the CISD wavefunction. The accuracy of multi-determinant DMC for strongly multi-reference systems is tested for the isomerisation of diazene. In this case no method was capable of reducing the error of the strongly-correlated rotational transition state. Finally, an improved method for selecting test sets is presented using multivariate statistical techniques. Bias-free test sets are constructed by selecting archetypes and prototypes based on numerical representations of molecules. Descriptors based on the one-, two- and three-dimensional structures of a molecule are tested. These new test sets are then used to benchmark a number of methods.
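
One simple way to select representative "prototype" structures from a numerical descriptor matrix, loosely in the spirit of the unbiased test-set construction described above, is to cluster the descriptors and keep the molecule closest to each cluster centre. The sketch below is only a schematic analogue with random stand-in descriptors and an arbitrary cluster count, not the archetype/prototype procedure developed in the thesis.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
descriptors = rng.normal(size=(200, 12))        # 200 molecules x 12 structural descriptors (hypothetical)
X = StandardScaler().fit_transform(descriptors)

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
prototypes = []
for k in range(km.n_clusters):
    members = np.where(km.labels_ == k)[0]
    dists = np.linalg.norm(X[members] - km.cluster_centers_[k], axis=1)
    prototypes.append(members[np.argmin(dists)])   # index of the most central molecule in cluster k
print("prototype molecule indices:", sorted(prototypes))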
18

Ebrahimi, Mohammadi Diako. "Multi-purpose multi-way data analysis." Thesis, Chemistry, Faculty of Science, UNSW, 2007. http://handle.unsw.edu.au/1959.4/40646.

Abstract:
In this dissertation, the application of multi-way analysis is extended into new areas of environmental chemistry, microbiology, electrochemistry and organometallic chemistry, and new practical aspects of some of the multi-way analysis methods are discussed. Parallel Factor Analysis 2 (PARAFAC2) is used to classify a wide range of weathered petroleum oils from GC-MS data. Various chemical and data analysis issues that exist in current methods of oil spill analysis are discussed, and the proposed method is shown to have the potential to be employed in identifying the source of oil spills. Two important practical aspects of PARAFAC2 are exploited to deal with chromatographic shifts and non-diagnostic peaks. Generalized Multiplicative Analysis of Variance (GEMANOVA) is applied to assess the bactericidal activity of new natural antibacterial extracts on three species of bacteria, in different structural and oxidation forms and at different concentrations. In this work, while the applicability of traditional ANOVA is restricted by the high interaction among the factors, GEMANOVA is shown to return robust and easily interpretable models that conform to the actual structure of the data. Peptide-modified electrochemical sensors are used to determine the three metal cations Cu2+, Cd2+ and Pb2+ simultaneously. Two sets of experiments are performed, one using a four-electrode system, giving a three-way array of size (sample × current × electrode), and one using a single electrode, giving a two-way data set of size (sample × current). The data from the former are modelled by N-PLS and those from the latter by PLS. Despite the presence of highly overlapped voltammograms and several sources of non-linearity, N-PLS returns reasonable models while PLS fails. Finally, an intramolecular hydroamination reaction is catalysed by several organometallic catalysts to identify the most effective ones. The reaction of the starting material in the presence of 72 different catalysts is monitored by UV-Vis at two time points, before and after heating the mixtures in an oven. PARAFAC is applied to the three-way data set (sample × wavelength × time) to resolve the overlapped UV-Vis peaks and to identify the effective catalysts from the estimated relative concentration of product (the loadings plot of the sample mode).
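
A minimal sketch of a PARAFAC decomposition of a three-way (sample × wavelength × time) array follows, assuming the third-party tensorly package is available; the array here is random, standing in for the UV-Vis data described above, and the rank of two components is an arbitrary choice.

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(2)
X = rng.random((72, 120, 2))            # 72 catalyst mixtures x 120 wavelengths x 2 time points (hypothetical)

cp = parafac(tl.tensor(X), rank=2)      # two-component trilinear model
sample_loads, spectral_loads, time_loads = cp.factors
print(sample_loads.shape, spectral_loads.shape, time_loads.shape)   # (72, 2) (120, 2) (2, 2)
# The sample-mode loadings play the role of the relative concentrations used to rank catalysts.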
19

Reichard, Eric Jonathan. "Chemometrics applied to the discrimination of synthetic fibers by microspectrophotometry." 2014. http://hdl.handle.net/1805/3795.

Abstract:
Microspectrophotometry is a quick, accurate, and reproducible method for comparing colored fibers for forensic purposes. Chemometric techniques applied to spectroscopic data can provide valuable discriminatory information, especially for complex datasets. Differentiating a group of samples by chemometric analysis increases the evidential value of fiber comparisons by decreasing the probability of false association. The aims of this research were to (1) evaluate the chemometric procedure on a data set consisting of blue acrylic fibers and (2) accurately discriminate between yellow polyester fibers with the same dye composition but different dye loadings, while introducing a multivariate calibration approach to determine the dye concentration of fibers. In the first study, background-subtracted and normalized visible spectra from eleven blue acrylic exemplars dyed with varying compositions of dyes were discriminated from one another using agglomerative hierarchical clustering (AHC), principal component analysis (PCA), and discriminant analysis (DA). The AHC and PCA results agreed, with similar spectra clustering close to one another. DA indicated a total classification accuracy of approximately 93%, with only two of the eleven exemplars confused with one another; this was expected because those two exemplars had the same dye compositions. An external validation of the data set showed consistent results, which validated the model produced from the training set. In the second study, background-subtracted and normalized visible spectra from ten yellow polyester exemplars dyed with different concentrations of the same dye, ranging from 0.1-3.5% (w/w), were analyzed by the same techniques. Three classes of fibers, representing low, medium, and high dye loadings, were found with a classification accuracy of approximately 96%. Exemplars with similar dye loadings could in some cases be readily discriminated, based on a classification accuracy of 90% or higher and a receiver operating characteristic area under the curve of 0.9 or greater. Calibration curves based upon a proximity matrix of dye loadings between 0.1-0.75% (w/w) were developed that provided better accuracy and precision than a traditional approach.
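
The AHC / PCA / DA chain described above can be sketched in a few lines; the "spectra" below are synthetic stand-ins with three invented classes, so the numbers mean nothing beyond showing how the pieces fit together.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_per_class, n_points = 20, 200
centers = rng.normal(size=(3, n_points))                       # three "exemplar" mean spectra
X = np.vstack([c + 0.3 * rng.normal(size=(n_per_class, n_points)) for c in centers])
y = np.repeat([0, 1, 2], n_per_class)

scores = PCA(n_components=5).fit_transform(X)                  # dimension reduction before clustering / DA
clusters = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")    # agglomerative clustering
accuracy = cross_val_score(LinearDiscriminantAnalysis(), scores, y, cv=5).mean()  # discriminant analysis
print("AHC cluster labels:", clusters[:10], "... DA accuracy:", round(accuracy, 3))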
20

Wright, Eric Thomas. "Bayesian learning methods for potential energy parameter inference in coarse-grained models of atomistic systems." Thesis, 2015. http://hdl.handle.net/2152/30464.

Abstract:
The present work addresses issues related to the derivation of reduced models of atomistic systems, their statistical calibration, and their relation to atomistic models of materials. The reduced model, known in the chemical physics community as a coarse-grained model, is calibrated within a Bayesian framework. Particular attention is given to developing likelihood functions, assigning priors on coarse-grained model parameters, and using data from molecular dynamics representations of atomistic systems to calibrate coarse-grained models such that certain physically relevant atomistic observables are accurately reproduced. The developed Bayesian framework is then applied in three case studies of increasing complexity and practical application. A freely jointed chain model is considered first for illustrative purposes. The next example entails the construction of a coarse-grained model for a liquid heptane system, with the explicit design goal of accurately predicting a vapor-liquid transfer free energy. Finally, a coarse-grained model is developed for an alkylthiophene polymer that has been shown to have practical use in certain types of photovoltaic cells. The development therein employs Bayesian decision theory to select an optimal CG potential energy function. Subsequently, this model is subjected to validation tests in a prediction scenario that is relevant to the performance of a polyalkylthiophene-based solar cell.
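
As a generic illustration of Bayesian calibration with a likelihood, a prior, and posterior sampling, the sketch below calibrates a single scalar parameter against synthetic "observations" using a random-walk Metropolis sampler. It is not the coarse-grained potential calibration of the thesis; the observable, the Gaussian likelihood, the prior and all numbers are invented.

import numpy as np

rng = np.random.default_rng(4)
theta_true = 2.0
data = theta_true + rng.normal(scale=0.3, size=50)       # noisy observations of an atomistic observable

def log_posterior(theta):
    log_prior = -0.5 * (theta / 10.0) ** 2                # broad Gaussian prior
    log_like = -0.5 * np.sum((data - theta) ** 2) / 0.3 ** 2
    return log_prior + log_like

theta, samples = 0.0, []
for _ in range(5000):                                      # random-walk Metropolis
    proposal = theta + 0.1 * rng.normal()
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)
post = np.array(samples[1000:])                            # discard burn-in
print(f"posterior mean = {post.mean():.3f}, sd = {post.std():.3f}")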