Dissertations / Theses on the topic 'Engineering - Statistical methods'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Engineering - Statistical methods.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Marco Almagro, Lluís. "Statistical methods in Kansei engineering studies." Doctoral thesis, Universitat Politècnica de Catalunya, 2011. http://hdl.handle.net/10803/85059.
This PhD thesis deals with Kansei Engineering (KE), a technique for translating emotions elicited by products into technical parameters, and with statistical methods that can benefit the discipline. The basic purpose of KE is discovering in which way some properties of a product convey certain emotions to its users. It is a quantitative method, and data are typically collected using questionnaires. Conclusions are reached by analyzing the collected data, normally using some kind of regression analysis.

Kansei Engineering can be placed within the more general research area of emotional design. The thesis starts by justifying the importance of emotional design. As the range of techniques used under the name of Kansei Engineering is rather vast and not very clearly delimited, the thesis develops a detailed definition of KE that serves the purpose of delimiting its scope. A model for conducting KE studies is then suggested. The model includes spanning the semantic space – the whole range of emotions the product can elicit – and the space of properties – the technical variables that can be modified in the design phase. After the data collection, the synthesis phase links both spaces; that is, it discovers how several properties of the product elicit certain emotions. Each step of the model is explained in detail using a KE study performed specially for this thesis: the fruit juice experiment. The initial model is progressively improved during the thesis, and data from the experiment are reanalyzed using the new proposals.

Many practical concerns arise when looking at the above-mentioned model for KE studies (among many others, how many participants are needed and how the data collection session is conducted). An extensive literature review is done with the aim of answering these and other questions. The most common applications of KE are also depicted, together with comments on particularly interesting ideas from several papers. The literature review also serves to list the most common tools used in the synthesis phase.

The central part of the thesis focuses precisely on tools for the synthesis phase. Statistical tools such as quantification theory type I and ordinal logistic regression are studied in detail, and several improvements are suggested. In particular, a new graphical way to represent results from an ordinal logistic regression is proposed. An automatic learning technique, rough sets, is introduced, and a discussion is included on its adequacy for KE studies. Several sets of simulated data are used to assess the behavior of the suggested statistical techniques, leading to some useful recommendations.

No matter which analysis tools are used in the synthesis phase, conclusions are likely to be flawed when the design matrix is not appropriate. A method to evaluate the suitability of design matrices used in KE studies is proposed, based on two new indicators: an orthogonality index and a confusion index. The commonly forgotten role of interactions in KE studies is examined, and a method to include an interaction is suggested, together with a way to represent it graphically. Finally, the largely untreated topic of variability in KE studies is tackled in the last part of the thesis. A method (based on cluster analysis) for finding segments among subjects according to their emotional responses and a way to rank subjects based on their coherence when rating products (using an intraclass correlation coefficient) are proposed.
As many users of Kansei Engineering are not specialists in the interpretation of the numerical output from statistical techniques, visual representations for these two new proposals are included to aid understanding.
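The participant-coherence idea in the Marco Almagro entry lends itself to a small illustration. The Python sketch below ranks raters by how closely each one's scores track the rest of the panel; it is a simplified stand-in for the intraclass-correlation ranking described in the abstract, and the function name and toy data are hypothetical.

```python
import numpy as np

def rater_coherence(ratings):
    # ratings: (n_raters, n_products) array of emotional-response scores.
    # Coherence of rater i = correlation of rater i's scores with the
    # mean scores of all remaining raters (leave-one-out panel mean).
    n_raters = ratings.shape[0]
    coherence = np.empty(n_raters)
    for i in range(n_raters):
        panel = np.delete(ratings, i, axis=0).mean(axis=0)
        coherence[i] = np.corrcoef(ratings[i], panel)[0, 1]
    return coherence

# Toy data: 5 participants rating 8 fruit-juice designs on one Kansei word
rng = np.random.default_rng(0)
profile = rng.normal(size=8)                      # shared "true" response
scores = profile + rng.normal(scale=0.5, size=(5, 8))
scores[4] = rng.normal(size=8)                    # one incoherent rater
print(np.argsort(-rater_coherence(scores)))       # most coherent first
```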
Molaro, Mark Christopher. "Computational statistical methods in chemical engineering." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/111286.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 175-182).
Recent advances in theory and practice have introduced a wide variety of tools from machine learning that can be applied to data-intensive chemical engineering problems. This thesis covers applications of statistical learning spanning a range of relative importance of data versus existing detailed theory. In each application, the quantity and quality of data available from experimental systems are used in conjunction with an understanding of the theoretical physical laws governing system behavior, to the extent they are available.

A detailed generative parametric model for optical spectra of multicomponent mixtures is introduced. The application of interest is the quantification of uncertainty associated with estimating the relative abundance of mixtures of carbon nanotubes in solution. This work describes a detailed analysis of sources of uncertainty in the estimation of the relative abundance of chemical species in solution from optical spectroscopy. In particular, the quantification of uncertainty in mixtures with parametric uncertainty in the pure-component spectra is addressed. Markov chain Monte Carlo methods are utilized to quantify uncertainty in these situations, and the inaccuracy and potential for error of simpler methods are demonstrated. Strategies to improve estimation accuracy and reduce uncertainty in practical experimental situations are developed, including when multiple measurements are available and with sequential data. The utilization of computational Bayesian inference in chemometric problems shows great promise in a wide variety of practical experimental applications.

A related deconvolution problem is addressed in which a detailed physical model is not available, but the objective of the analysis is to map from a measured vector-valued signal to a sum of an unknown number of discrete contributions. The data analyzed in this application are electrical signals generated from a free-surface electrospinning apparatus. In this information-poor system, MAP estimation is used to reduce the variance in estimates of the physical parameters of interest. The formulation of the estimation problem in a probabilistic context allows for the introduction of prior knowledge to compensate for a high-dimensional, ill-conditioned inverse problem. The estimates from this work are used to develop a productivity model, expanding on previous work and showing how the uncertainty from estimation impacts system understanding.

A new machine learning based method for monitoring for anomalous behavior in production oil wells is reported. The method entails a transformation of the available time series of measurements into a high-dimensional feature-space representation. This transformation yields results which can be treated as static independent measurements. A new method for feature selection in one-class classification problems is developed based on approximate knowledge of the state of the system. An extension of feature-space transformation methods on time series data is introduced to handle multivariate data in large, computationally burdensome domains by using sparse feature extraction methods. As a whole, these projects demonstrate the application of modern statistical modeling methods to achieve superior results in data-driven chemical engineering challenges.
by Mark Christopher Molaro.
Ph. D.
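The MCMC-based uncertainty quantification described in the Molaro abstract can be illustrated with a minimal sketch. The following Python code estimates the relative abundances of two components in a noisy mixed spectrum with random-walk Metropolis sampling; the spectra, noise level, and tuning constants are invented for illustration and are not from the thesis.

```python
import numpy as np

# Hypothetical pure-component spectra and a simulated mixture measurement
rng = np.random.default_rng(1)
wavelengths = np.linspace(0.0, 1.0, 200)
s1 = np.exp(-((wavelengths - 0.3) ** 2) / 0.01)
s2 = np.exp(-((wavelengths - 0.6) ** 2) / 0.02)
true_w = np.array([0.7, 0.3])
y = true_w[0] * s1 + true_w[1] * s2 + rng.normal(scale=0.02, size=200)

def log_post(w, sigma=0.02):
    # Nonnegativity prior on abundances; Gaussian measurement noise
    if w.min() < 0:
        return -np.inf
    resid = y - w[0] * s1 - w[1] * s2
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis over the two abundances
w = np.array([0.5, 0.5])
lp = log_post(w)
samples = []
for _ in range(20000):
    prop = w + rng.normal(scale=0.01, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, lp = prop, lp_prop
    samples.append(w.copy())
samples = np.array(samples[5000:])                 # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))   # posterior mean, spread
```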
Chang, Chia-Jung. "Statistical and engineering methods for model enhancement." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/44766.
Walls, Frederick George 1976. "Topic detection through statistical methods." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/80244.
Includes bibliographical references (p. 77-79).
by Frederick George Walls.
M.Eng.
Maas, Luis C. (Luis Carlos). "Statistical methods in ultrasonic tissue characterization." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/36456.
Includes bibliographical references (p. 88-93).
by Luis Carlos Maas III.
M.S.
Yu, Huan. "New Statistical Methods for Simulation Output Analysis." Diss., University of Iowa, 2013. https://ir.uiowa.edu/etd/4931.
Betschart, Willie. "Applying intelligent statistical methods on biometric systems." Thesis, Blekinge Tekniska Högskola, Avdelningen för signalbehandling, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1694.
Chandrasekaran, Venkat. "Convex optimization methods for graphs and statistical modeling." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/66002.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 209-220).
An outstanding challenge in many problems throughout science and engineering is to succinctly characterize the relationships among a large number of interacting entities. Models based on graphs form one major thrust in this thesis, as graphs often provide a concise representation of the interactions among a large set of variables. A second major emphasis of this thesis is classes of structured models that satisfy certain algebraic constraints. The common theme underlying these approaches is the development of computational methods based on convex optimization, which are in turn useful in a broad array of problems in signal processing and machine learning. The specific contributions are as follows:

-- We propose a convex optimization method for decomposing the sum of a sparse matrix and a low-rank matrix into the individual components. Based on new rank-sparsity uncertainty principles, we give conditions under which the convex program exactly recovers the underlying components.

-- Building on the previous point, we describe a convex optimization approach to latent variable Gaussian graphical model selection. We provide theoretical guarantees of the statistical consistency of this convex program in the high-dimensional scaling regime in which the number of latent/observed variables grows with the number of samples of the observed variables. The algebraic varieties of sparse and low-rank matrices play a prominent role in this analysis.

-- We present a general convex optimization formulation for linear inverse problems, in which we have limited measurements in the form of linear functionals of a signal or model of interest. When these underlying models have algebraic structure, the resulting convex programs can be solved exactly or approximately via semidefinite programming. We provide sharp estimates (based on computing certain Gaussian statistics related to the underlying model geometry) of the number of generic linear measurements required for exact and robust recovery in a variety of settings.

-- We present convex graph invariants, which are invariants of a graph that are convex functions of the underlying adjacency matrix. Graph invariants characterize structural properties of a graph that do not depend on the labeling of the nodes; convex graph invariants constitute an important subclass, and they provide a systematic and unified computational framework based on convex optimization for solving a number of interesting graph problems.

We emphasize a unified view of the underlying convex geometry common to these different frameworks. We describe applications of both these methods to problems in financial modeling and network analysis, and conclude with a discussion of directions for future research.
by Venkat Chandrasekaran.
Ph.D.
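The first contribution listed in the Chandrasekaran abstract, decomposing a matrix into sparse plus low-rank parts, has a well-known convex formulation (principal component pursuit) that can be sketched compactly. The code below is an illustrative re-implementation of the standard inexact augmented-Lagrangian iteration, not code from the thesis; the step-size heuristic and test data are assumptions.

```python
import numpy as np

def rpca(M, n_iter=500, tol=1e-7):
    # Principal component pursuit: M ~ L (low rank) + S (sparse), solved
    # by alternating singular-value thresholding (for L) and elementwise
    # soft thresholding (for S) on an augmented Lagrangian.
    n1, n2 = M.shape
    lam = 1.0 / np.sqrt(max(n1, n2))           # standard PCP weight
    mu = n1 * n2 / (4.0 * np.abs(M).sum())     # common step-size heuristic
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    normM = np.linalg.norm(M)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y = Y + mu * (M - L - S)
        if np.linalg.norm(M - L - S) <= tol * normM:
            break
    return L, S

rng = np.random.default_rng(0)
L_true = rng.normal(size=(80, 5)) @ rng.normal(size=(5, 80))    # rank 5
S_true = np.where(rng.uniform(size=(80, 80)) < 0.05,
                  10 * rng.normal(size=(80, 80)), 0.0)          # 5% spikes
L_hat, S_hat = rpca(L_true + S_true)
print(np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true))  # small
```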
Lingg, Andrew James. "Statistical Methods for Image Change Detection with Uncertainty." Wright State University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=wright1357249370.
Ranger, Jeremy. "Adaptive image magnification using edge-directed and statistical methods." Thesis, University of Ottawa (Canada), 2004. http://hdl.handle.net/10393/26753.
Farhat, Hikmat. "Studies in computational methods for statistical mechanics of fluids." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0026/NQ50157.pdf.
Laporte, Catherine. "Statistical methods for out-of-plane ultrasound transducer motion estimation." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=86597.
Freehand 3D ultrasound typically involves moving, and measuring the displacement of, a conventional 2D ultrasound probe over a subject, and creating a volume from the images that is then interpreted for medical purposes. Because external position sensors can be cumbersome, there is interest in computing the probe trajectory from the images themselves. This thesis investigates new methods for computing the out-of-plane component of the probe trajectory using the predictive relationship between out-of-plane decorrelation of ultrasound speckle and probe displacement. To resolve the directional ambiguities associated with this approach, a new framework is proposed that combines combinatorial optimization and robust statistical techniques to detect non-monotonic motion and intersections between images. To account for the variability of the sample correlation coefficient between corresponding patches of fully developed speckle, a new probabilistic model of speckle decorrelation is developed. This model quantifies the uncertainty associated with a displacement estimate, thereby enabling a new maximum likelihood approach to out-of-plane trajectory estimation that fully exploits the information made available by multiple redundant correlation measurements acquired in fully developed speckle images. To generalize these methods to images of real tissue, a new data-driven algorithm is proposed for estimating the local out-of-plane correlation length from statistical features computed within the image plane. In this approach, the relationship between the image features and the corr…
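A minimal sketch of the speckle-decorrelation idea underlying the Laporte thesis: under the common Gaussian idealization rho(d) = exp(-d^2 / (2 w^2)), a measured inter-frame correlation can be inverted for an out-of-plane displacement estimate. The thesis goes further and treats rho as a noisy sample statistic; the correlation length used here is hypothetical.

```python
import numpy as np

def displacement_from_correlation(rho, w):
    # Gaussian decorrelation model: rho(d) = exp(-d**2 / (2 * w**2)),
    # where w is the elevational speckle correlation length.
    # Solving for d gives the out-of-plane displacement estimate.
    rho = np.clip(rho, 1e-6, 1.0)
    return w * np.sqrt(-2.0 * np.log(rho))

# A correlation of 0.8 between frames, with an assumed 0.5 mm correlation
# length, maps to roughly a third of a millimetre of elevational motion.
print(displacement_from_correlation(0.8, w=0.5))
```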
Kim, Junmo 1976. "Nonparametric statistical methods for image segmentation and shape analysis." Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/30352.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Page 131 blank.
Includes bibliographical references (p. 125-130).
Image segmentation, the process of decomposing an image into meaningful regions, is a fundamental problem in image processing and computer vision. Recently, image segmentation techniques based on active contour models with level set implementations have received considerable attention. The objective of this thesis is the development of advanced active contour-based image segmentation methods that incorporate complex statistical information into the segmentation process, either about the image intensities or about the shapes of the objects to be segmented. To this end, we use nonparametric statistical methods for modeling both the intensity distributions and the shape distributions. Previous work on active contour-based segmentation considered the class of images in which each region can be distinguished from others by second order statistical features such as the mean or variance of image intensities of that region. This thesis addresses the problem of segmenting a more general class of images in which each region has a distinct arbitrary intensity distribution. To this end, we develop a nonparametric information-theoretic method for image segmentation. In particular, we cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities. The resulting curve evolution equation is given in terms of nonparametric density estimates of intensity distributions, and the segmentation method can deal with a variety of intensity distributions in an unsupervised fashion. The second component of this thesis addresses the problem of estimating shape densities from training shapes and incorporating such shape prior densities into the image segmentation process.
To this end, we propose nonparametric density estimation methods in the space of curves and the space of signed distance functions. We then derive a corresponding curve evolution equation for shape-based image segmentation. Finally, we consider the case in which the shape density is estimated from training shapes that form multiple clusters. This case leads to the construction of complex, potentially multi-modal prior densities for shapes. As compared to existing methods, our shape priors can: (a) model more complex shape distributions; (b) deal with shape variability in a more principled way; and (c) represent more complex shapes.
by Junmo Kim.
Ph.D.
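The information-theoretic criterion in the Kim abstract, mutual information between region labels and pixel intensities, can be estimated nonparametrically from histograms. The sketch below is a discrete illustration of that quantity on synthetic data, not the thesis's curve-evolution formulation.

```python
import numpy as np

def mutual_information(labels, intensities, bins=64):
    # Discrete estimate of I(L; X) between region labels and pixel
    # intensities: sum over l of p(l) * KL( p(x|l) || p(x) ).
    edges = np.histogram_bin_edges(intensities, bins=bins)
    p_x = np.histogram(intensities, bins=edges)[0] / len(intensities)
    mi = 0.0
    for l in np.unique(labels):
        sel = intensities[labels == l]
        p_l = len(sel) / len(intensities)
        p_x_l = np.histogram(sel, bins=edges)[0] / len(sel)
        nz = (p_x_l > 0) & (p_x > 0)
        mi += p_l * np.sum(p_x_l[nz] * np.log(p_x_l[nz] / p_x[nz]))
    return mi

# Two regions whose intensity distributions differ in shape, not mean:
# a bimodal "object" against a unimodal background of equal mean.
rng = np.random.default_rng(2)
obj = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
bg = rng.normal(0.0, 1.0, 1000)
labels = np.repeat([1, 0], [1000, 1000])
print(mutual_information(labels, np.concatenate([obj, bg])))
```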
Guyader, Andrew C. "A statistical approach to equivalent linearization with application to performance-based engineering." Pasadena: California Institute of Technology, Earthquake Engineering Research Laboratory, 2004. http://caltecheerl.library.caltech.edu.
Strong, Mark J. (Mark Joseph). "Statistical methods for process control in automobile body assembly." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/10922.
Includes bibliographical references (p. 117-120).
by Mark J. Strong.
M.S.
Wright, Christopher M. "Using Statistical Methods to Determine Geolocation Via Twitter." TopSCHOLAR®, 2014. http://digitalcommons.wku.edu/theses/1372.
Sun, Felice (Felice Tzu-yun) 1976. "Integrating statistical and knowledge-based methods for automatic phonemic segmentation." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/80127.
Stunes, Michael R. "Statistical methods for locating performance problems in multi-tier applications." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/77017.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 59-60).
This thesis describes an algorithm developed to aid in performance diagnosis by automatically identifying the specific component in a multicomponent application system responsible for a performance problem. The algorithm monitors the system, collecting load and latency information from each component; searches the data for patterns indicative of performance saturation using statistical methods; and uses a machine learning classifier to interpret those results. The algorithm was tested with two test applications in several configurations, with different performance problems synthetically introduced. The algorithm correctly located these problems as much as 90% of the time, indicating that this is a good approach to the problem of automatic performance problem location. The experimentation also demonstrated that the algorithm can locate performance problems in environments different from those for which it was designed and from the one on which it was trained.
by Michael R. Stunes.
M.Eng.
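One simple statistical signature of the saturation patterns mentioned in the Stunes abstract is a monotone association between offered load and latency. The sketch below screens hypothetical per-component time series with a Spearman rank correlation; the component names and data are invented, and the thesis's actual pipeline adds a feature transformation and a trained classifier.

```python
import numpy as np
from scipy.stats import spearmanr

# Toy saturation screen: for each component, test whether latency rises
# monotonically with offered load.
rng = np.random.default_rng(6)
load = np.linspace(10, 100, 200) + rng.normal(scale=2, size=200)
latency = {
    "web_tier": 5 + rng.normal(scale=0.5, size=200),              # healthy
    "db_tier": 5 + 0.002 * np.maximum(load - 60, 0) ** 2
               + rng.normal(scale=0.5, size=200),                 # saturating
}
for name, lat in latency.items():
    rho, p = spearmanr(load, lat)
    print(f"{name}: rho={rho:.2f}, p={p:.3g}")   # high rho flags saturation
```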
Sharma, Vikas. "A new modeling methodology combining engineering and statistical modeling methods : a semiconductor manufacturing application." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/10686.
Kang, Bei. "STATISTICAL CONTROL USING NEURAL NETWORK METHODS WITH HIERARCHICAL HYBRID SYSTEMS." Diss., Temple University Libraries, 2011. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/122303.
Ph.D.
The goal of an optimal control algorithm is to improve the performance of a system. For a stochastic system, a typical optimal control method minimizes the mean (first cumulant) of the cost function. However, there are other statistical properties of the cost function, such as the variance (second cumulant) and skewness (third cumulant), which also affect system performance. In this dissertation, work on statistical optimal control is presented, which extends the traditional optimal control method by using cost cumulants to shape the system performance. Statistical optimal control allows more design freedom to achieve better performance. The solutions of statistical control involve solving a partial differential equation known as the Hamilton-Jacobi-Bellman equation. A numerical method based on neural networks is employed to find solutions of this equation. Furthermore, a complex problem such as multiple-satellite control has both continuous and discrete dynamics. Thus, a hierarchical hybrid architecture is developed in this dissertation, where the discrete event system is applied to the discrete dynamics and statistical control is applied to the continuous dynamics. The application of a multiple-satellite navigation system is then analyzed using the hierarchical hybrid architecture. Through this dissertation, it is shown that statistical control theory is a flexible optimal control method that improves performance, and that the hierarchical hybrid architecture allows control and navigation of a complex system containing both continuous and discrete dynamics.
Temple University--Theses
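The cost-cumulant idea in the Kang abstract can be made concrete with a toy example. The Python sketch below compares a mean-only (first cumulant) criterion against a mean-plus-variance (first two cumulants) criterion when choosing a feedback gain for a scalar stochastic system by Monte Carlo; all constants are hypothetical, and the thesis itself works with Hamilton-Jacobi-Bellman equations rather than a gain search.

```python
import numpy as np

# Scalar stochastic system x[k+1] = a x[k] + b u[k] + w[k], feedback
# u[k] = -K x[k], cost J = sum_k (x[k]^2 + r u[k]^2). Classical LQG
# minimizes only E[J]; statistical control also weights Var[J].
rng = np.random.default_rng(3)
a, b, r, T, n_mc = 0.95, 1.0, 0.1, 50, 4000

def cost_samples(K):
    x = rng.normal(size=n_mc)
    J = np.zeros(n_mc)
    for _ in range(T):
        u = -K * x
        J += x ** 2 + r * u ** 2
        x = a * x + b * u + 0.1 * rng.normal(size=n_mc)
    return J

gains = np.linspace(0.1, 1.2, 23)
for theta in (0.0, 0.05):              # weight on the variance term
    scores = []
    for K in gains:
        J = cost_samples(K)
        scores.append(J.mean() + theta * J.var())
    print(theta, gains[int(np.argmin(scores))])   # chosen gain shifts
```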
Mastin, Dana Andrew. "Statistical methods for 2D-3D registration of optical and LIDAR images." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/55123.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 121-123).
Fusion of 3D laser radar (LIDAR) imagery and aerial optical imagery is an efficient method for constructing 3D virtual-reality models. One difficult aspect of creating such models is registering the optical image with the LIDAR point cloud, which is a camera pose estimation problem. We propose a novel application of mutual information registration which exploits statistical dependencies in urban scenes, using variables such as LIDAR elevation, LIDAR probability of detection (pdet), and optical luminance. We employ the well-known downhill simplex optimization to infer camera pose parameters. Utilization of OpenGL and graphics hardware in the optimization process yields registration times on the order of seconds. Using an initial registration comparable to GPS/INS accuracy, we demonstrate the utility of our algorithms with a collection of urban images. Our analysis begins with three basic methods for measuring mutual information. We demonstrate the utility of the mutual information measures with a series of probing experiments and registration tests. We improve the basic algorithms with a novel application of foliage detection, where the use of only non-foliage points improves registration reliability significantly. Finally, we show how an existing registered optical image can be used in conjunction with foliage detection to achieve even more reliable registration.
by Dana Andrew Mastin.
S.M.
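The core loop in the Mastin abstract, maximizing mutual information over pose parameters with downhill simplex, can be sketched in two dimensions. The code below registers a synthetically shifted image by maximizing a histogram-based mutual information estimate with Nelder-Mead; the toy images and two-parameter pose are assumptions, standing in for the thesis's six-parameter camera pose and LIDAR/optical variables.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize

def mutual_info(a, b, bins=32):
    # Histogram-based mutual information between two images.
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

# Toy problem: recover an in-plane shift between two smooth random fields.
rng = np.random.default_rng(4)
fixed = rng.normal(size=(128, 128)).cumsum(axis=0).cumsum(axis=1)
moving = nd_shift(fixed, (3.5, -2.0), order=1)

# Downhill simplex (Nelder-Mead) over the two shift parameters.
res = minimize(lambda t: -mutual_info(fixed, nd_shift(moving, t, order=1)),
               x0=np.zeros(2), method="Nelder-Mead")
print(res.x)   # approximately (-3.5, 2.0)
```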
Ritchie, Paul Andrew 1960. "A systematic, experimental methodology for design optimization." Thesis, The University of Arizona, 1988. http://hdl.handle.net/10150/276698.
Man, Peter Lau Weilen. "Statistical methods for computing sensitivities and parameter estimates of population balance models." Thesis, University of Cambridge, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608291.
Muller, Cole. "Reliability analysis of the 4.5 roller bearing." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FMuller.pdf.
Thesis advisor(s): David H. Olwell, Samuel E. Buttrey. Includes bibliographical references (p. 65). Also available online.
Capaci, Francesca. "Contributions to the Use of Statistical Methods for Improving Continuous Production." Licentiate thesis, Luleå tekniska universitet, Industriell Ekonomi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-66256.
Lin, Daming (林達明). "Reliability growth models and reliability acceptance sampling plans from a Bayesian viewpoint." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1995. http://hub.hku.hk/bib/B3123429X.
Lomangino, F. Paul. "Grammar- and optimization-based mechanical packaging." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/15848.
Kentwell, D. J. "Fractal relationships and spatial distribution of ore body modelling." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1997. https://ro.ecu.edu.au/theses/882.
Patil, Vidyaangi. "Different forms of modularity in trunk muscles in the rat revealed by various statistical methods." Philadelphia, Pa.: Drexel University, 2007. http://hdl.handle.net/1860/1563.
Chen, Hongshu. "Sampling-based Bayesian latent variable regression methods with applications in process engineering." Columbus, Ohio: Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1189650596.
Srinivasan, Raghuram. "Monte Carlo Alternate Approaches to Statistical Performance Estimation in VLSI Circuits." University of Cincinnati / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1396531763.
Tosi, Riccardo. "Towards stochastic methods in CFD for engineering applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/673389.
The high-performance computing developments of recent decades make it possible to solve current scientific problems using sophisticated computational methods. However, it is necessary to ensure the efficiency of modern computational methods in order to exploit technological capabilities to the fullest. In this thesis we propose different methods, related to uncertainty quantification and high-performance computing, that minimize the computational time needed to run simulations while guaranteeing high reliability. Specifically, we solve fluid dynamics systems characterized by uncertainties.

In computational fluid dynamics there are different types of uncertainty. We consider, for example, the shape and time evolution of the boundary conditions, as well as the randomness of the external forces acting on the system. From a practical point of view, statistical values of the fluid flow must be estimated while satisfying convergence criteria that guarantee the reliability of the method. To quantify the effect of the uncertainties we use hierarchical Monte Carlo methods. These strategies have three levels of parallelization: across the levels of the hierarchy, across the realizations of each level, and within the solution of each realization. We propose adding a new level of parallelization, across batches, in which each batch is independent of the others and has its own hierarchy, composed of levels and of realizations distributed over the levels. We call these new algorithms asynchronous hierarchical Monte Carlo methods.

We also focus on reducing the computational time needed to compute statistical estimators of chaotic, incompressible flows. Our method consists of replacing a single fluid dynamics simulation with a long time window by the average of an ensemble of independent simulations with different initial conditions and a shorter time window. The ensemble of simulations can be run in parallel on supercomputers, reducing the wall-clock time; this approach is known as ensemble averaging. Analyzing the different contributions to the error of the statistical estimator, we identify two terms: the error due to the initial conditions and the statistical error. We propose a method that minimizes the error due to the initial conditions and, in parallel, suggest several strategies to reduce the computational cost of the simulation.

Finally, we propose an integration of the Monte Carlo and ensemble averaging methods, whose objective is to reduce the computational time required to compute statistical estimators of time-dependent, chaotic and stochastic fluid dynamics problems. We replace each Monte Carlo realization by an ensemble of independent realizations, each characterized by the same random event and different initial conditions. We consider and solve different physical systems, all relevant in computational fluid dynamics, such as wind flow around high-rise buildings and potential flow problems. We demonstrate the accuracy, efficiency and effectiveness of our proposals by solving these numerical examples.
Civil engineering
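The ensemble-averaging idea in the Tosi abstract can be demonstrated on any ergodic chaotic system. The sketch below uses a logistic map in place of a CFD solver: one long time window is replaced by independently initialized short windows whose averages can be computed in parallel, with a burn-in discarded to reduce the initial-condition bias the thesis analyzes. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def chaotic_run(x0, n_steps, burn_in):
    # Time average of a chaotic signal over one window, after discarding
    # a burn-in transient that depends on the initial condition.
    x, total, count = x0, 0.0, 0
    for k in range(n_steps):
        x = 3.9 * x * (1.0 - x)          # logistic map, chaotic regime
        if k >= burn_in:
            total += x
            count += 1
    return total / count

# One long window vs. an ensemble of short, independently seeded windows.
long_estimate = chaotic_run(rng.uniform(), n_steps=100_000, burn_in=1000)
ensemble = [chaotic_run(rng.uniform(), n_steps=5000, burn_in=1000)
            for _ in range(20)]          # each member is parallelizable
print(long_estimate, np.mean(ensemble), np.std(ensemble) / np.sqrt(20))
```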
Knopp, Jeremy Scott. "Modern Statistical Methods and Uncertainty Quantification for Evaluating Reliability of Nondestructive Evaluation Systems." Wright State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=wright1395942220.
Togiti, Varun. "Pattern Recognition of Power System Voltage Stability using Statistical and Algorithmic Methods." ScholarWorks@UNO, 2012. http://scholarworks.uno.edu/td/1488.
Fox, Marshall Edward. "Identifying opportunities to reduce emergency service calls in hematology manufacturing using statistical methods." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104308.
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 38-39).
The main goal of this project is to identify opportunities to improve the reliability of the DxH™ product line, an automated hematology instrument for analyzing patient blood samples. The product was developed by Beckman Coulter Diagnostics, a division of a Danaher operating company with principal manufacturing and support operations based near Miami, Florida. A critical business metric used to reflect reliability is the Emergency Service Call (ESC) rate, defined for an instrument as the number of unscheduled, on-site technician visits during the one-year warranty period. Though Beckman Coulter already deploys an extremely robust quality control system, ESCs can still occur for a wide variety of other reasons, resulting in an impact on reliability. Any tools that support the reduction of ESCs may help generate positive perceptions among customers, since their instruments will have greater up-time.

This project entails an evaluation of a new initiative called "Reliability Statistical Process Control" (R-SPC), a form of manufacturing process control developed internally that consists of an electronic tool collecting raw instrument data during manufacturing. Unusual measurements are automatically sent to a cross-functional team, which examines the potential trend in more detail. If an abnormal trend is identified, the examination can generate a lasting improvement in the manufacturing process. Currently, the success of R-SPC is measured by the extent to which it reduces ESCs. Because an unusual measurement triggers further actions to investigate an instrument, it is desirable to show with empirical evidence that the measurement is linked to reliability.

To assess whether particular measurements were systematically related to the ESC rate, relevant data were analyzed via the Pearson chi-squared statistical test. The tests revealed that some of the variables now monitored do not appear to affect the ESC rate for the range of values studied. In contrast, several proposed "derived" parameters may serve as better indicators of an instrument's ESC rate. Moreover, the chi-squared methodology described can be used to investigate the relationships between other variables and the ESC rate. The thesis concludes by offering several specific recommendations to help refine the R-SPC initiative.
by Marshall Edward Fox.
M.B.A.
S.M. in Engineering Systems
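The chi-squared screening described in the Fox abstract amounts to a test of independence on a contingency table. A minimal sketch with scipy follows; the table values are hypothetical, pairing a manufacturing parameter band with warranty ESC incidence.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = manufacturing parameter band
# (low/high), columns = instrument outcome in warranty (no ESC / >=1 ESC).
table = np.array([[180, 20],
                  [150, 50]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
# A small p-value suggests the parameter band and ESC incidence are not
# independent, i.e. the measurement may be a useful reliability indicator.
```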
Pookhao, Naruekamol. "Statistical Methods for Functional Metagenomic Analysis Based on Next-Generation Sequencing Data." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/320986.
Aradhye, Hrishikesh Balkrishna. "Anomaly Detection Using Multiscale Methods." The Ohio State University, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=osu989701610.
Longmire, Pamela. "Nonparametric statistical methods applied to the final status decommissioning survey of Fort St. Vrain's prestressed concrete reactor vessel." The Ohio State University, 1998. http://rave.ohiolink.edu/etdc/view?acc_num=osu1407398430.
Uddin, Mohammad Moin. "ROBUST STATISTICAL METHODS FOR NON-NORMAL QUALITY ASSURANCE DATA ANALYSIS IN TRANSPORTATION PROJECTS." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/153.
El Hayek, Mustapha (Mechanical & Manufacturing Engineering, Faculty of Engineering, UNSW). "Optimizing life-cycle maintenance cost of complex machinery using advanced statistical techniques and simulation." Awarded by: University of New South Wales, School of Mechanical and Manufacturing Engineering, 2006. http://handle.unsw.edu.au/1959.4/24955.
Zibdeh, Hazim S. "Environmental thermal stresses as a first passage problem." Diss., Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/49971.
Ph. D.
Rose, Michael Benjamin. "Statistical Methods for Launch Vehicle Guidance, Navigation, and Control (GN&C) System Design and Analysis." DigitalCommons@USU, 2012. https://digitalcommons.usu.edu/etd/1278.
Chan, Shu-hei (陳樹禧). "Statistical distribution of forces in random packings of spheres and honeycomb structures." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B29545365.
Mahadevan, Sankaran. "Stochastic finite element-based structural reliability analysis and optimization." Diss., Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/19517.
Tang, Philip Kwok Fan. "Stochastic Hydrologic Modeling in Real Time Using a Deterministic Model (Streamflow Synthesis and Reservoir Regulation Model), Time Series Model, and Kalman Filter." PDXScholar, 1991. https://pdxscholar.library.pdx.edu/open_access_etds/4580.
Kapur, Loveena. "Investigation of artificial neural networks, alternating conditional expectation, and Bayesian methods for reservoir characterization." 1998. Digital version accessible at http://wwwlib.umi.com/cr/utexas/main.
Ayllón, David. "Methods for Cole Parameter Estimation from Bioimpedance Spectroscopy Measurements." Thesis, Högskolan i Borås, Institutionen Ingenjörshögskolan, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-19843.
Shenoi, Sangeetha Chandra. "A Comparative Study on Methods for Stochastic Number Generation." University of Cincinnati / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1511881394773194.
Lucas, Tamara J. H. "Formulation and solution of hierarchical decision support problems." Thesis, Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/17291.
Hoi, Ka In. "Enhancement of efficiency and robustness of Kalman filter based statistical air quality models by using Bayesian approach." Thesis, University of Macau, 2010. http://umaclib3.umac.mo/record=b2488003.