
Dissertations / Theses on the topic 'Methodology of experiments'



Consult the top 50 dissertations / theses for your research on the topic 'Methodology of experiments.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Wen, Qi. "Journal impact assessment : methodology and experiments /." View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?IELM%202009%20WEN.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pate, Sharon. "Methodology for the on-line design of identification experiments." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp01/MQ29384.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Seay, Jeffrey Richard Eden Mario R. "A methodology for integrating process design elements with laboratory experiments." Auburn, Ala, 2008. http://repo.lib.auburn.edu/EtdRoot/2008/SPRING/Chemical_Engineering/Dissertation/Seay_Jeffrey_51.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Auburn University, 2008.
Abstract. Vita. Written in two volumes. Volume one is public and contains non-proprietary data and results. Volume two contains all proprietary data, and will be available only to the dissertation committee and industrial sponsor. Includes bibliographical references (p. 161-166).
APA, Harvard, Vancouver, ISO, and other styles
4

Murphy, Margaret Kathleen. "Psychological aspects of survey methodology : experiments on the response process." Thesis, London School of Economics and Political Science (University of London), 1996. http://etheses.lse.ac.uk/3048/.

Full text
Abstract:
This thesis examines the psychological processes involved in responding to survey questions. Minor variations in questions have been shown to lead to variation in responses. These findings are inconsistent with the assumption that survey questions are tapping stable responses. Recently, psychological theories have been used to provide an explanation for these response effects. Research applying psychological theory to survey response is reviewed, covering research on both behavioural and attitudinal questions. These reviews illustrate a reconceptualisation of the basis of the survey response. The need for more detailed data on the response process is identified. Verbal reports are identified as a potential method for producing process data, yet, uncertainty over their validity is noted. The use of verbal reports as data is then reviewed, covering both their historical and more recent use. In the present research verbal report techniques are first experimentally examined to find an appropriate technique for obtaining process data in surveys. Think-aloud techniques are then used to examine the processes involved in responding to questions. A split-ballot questionnaire was administered, varying a number of questionnaire features where response effects have been hypothesised or shown to occur. Generally, the verbal protocols showed processing differences between the different question forms, and provided information about general types of cognitive processing during response. The next study moved on to look at context effects for attitudinal questions. An experiment was carried out in which a number of factors hypothesised to be influential in producing context effects were examined. A questionnaire was administered via computer and response latencies were measured. The results provide further information about the nature of context effects at attitude questions. The findings from this study are then discussed in terms of the methodologies used, the specific response effects addressed, and the survey response process generally.
APA, Harvard, Vancouver, ISO, and other styles
5

Meloso, Debrah C. Bossaerts Peter L. Zame William. "Prices, holdings, and learning in financial markets : Experiments and methodology /." Diss., Pasadena, Calif. : California Institute of Technology, 2007. http://resolver.caltech.edu/CaltechETD:etd-05292007-143256.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Maheshwari, Nitin. "On the Selection of CMM Based Inspection Methodology for Circularity Tolerance." University of Cincinnati / OhioLINK, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=ucin988220697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Gupta, Abhishek. "Robust design using sequential computer experiments." Thesis, Texas A&M University, 2004. http://hdl.handle.net/1969.1/492.

Full text
Abstract:
Modern engineering design tends to use computer simulations such as Finite Element Analysis (FEA) to replace physical experiments when evaluating a quality response, e.g., the stress level in a phone packaging process. The use of computer models has certain advantages over running physical experiments, such as being cost effective, easy to try out different design alternatives, and having greater impact on product design. However, due to the complexity of FEA codes, it could be computationally expensive to calculate the quality response function over a large number of combinations of design and environmental factors. Traditional experimental design and response surface methodology, which were developed for physical experiments with the presence of random errors, are not very effective in dealing with deterministic FEA simulation outputs. In this thesis, we will utilize a spatial statistical method (i.e., Kriging model) for analyzing deterministic computer simulation-based experiments. Subsequently, we will devise a sequential strategy, which allows us to explore the whole response surface in an efficient way. The overall number of computer experiments will be remarkably reduced compared with the traditional response surface methodology. The proposed methodology is illustrated using an electronic packaging example.
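The Kriging-plus-sequential-design strategy described above can be prototyped with an off-the-shelf Gaussian process library. The sketch below is a minimal illustration under assumptions of my own, not the thesis's actual algorithm: `simulate` is a cheap stand-in for an expensive FEA run, and the emulator's predictive standard deviation is used as the infill criterion.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulate(x):
    # Hypothetical stand-in for an expensive deterministic FEA run (1-D input).
    return np.sin(3 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(5, 1))           # small initial design
y = simulate(X).ravel()

candidates = np.linspace(0, 2, 201).reshape(-1, 1)
for _ in range(10):                          # sequential infill loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  normalize_y=True).fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(sd)]        # most uncertain candidate point
    X = np.vstack([X, x_new])
    y = np.append(y, simulate(x_new))
```

Each iteration refits the Kriging emulator and adds the design point about which it is least certain, which is one simple way to explore the whole response surface with few runs.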
APA, Harvard, Vancouver, ISO, and other styles
8

Sgaravatti, Daniele. "Down to earth philosophy : an anti-exceptionalist essay on thought experiments and philosophical methodology." Thesis, University of St Andrews, 2012. http://hdl.handle.net/10023/3228.

Full text
Abstract:
In the first part of the dissertation, chapters 1 to 3, I criticize several views which tend to set philosophy apart from other cognitive achievements. I argue against the popular views that 1) intuitions, as a sui generis mental state, are involved crucially in philosophical methodology, 2) philosophy requires engagement in conceptual analysis, understood as the activity of considering thought experiments with the aim of throwing light on the nature of our concepts, and 3) much philosophical knowledge is a priori. I do not claim to have a proof that nothing in the vicinity of these views is correct; such a proof might well be impossible to give. However, I consider several versions, usually prominent ones, of each of the views, and I show those versions to be defective. Quite often, moreover, different versions of the same worry apply to different versions of the same theory. In the fourth chapter I discuss the epistemology of the judgements involved in philosophical thought experiments, arguing that their justification depends on their being the product of a competence in applying the concepts involved, a competence which goes beyond the possession of the concepts. I then offer, drawing from empirical psychology, a sketch of the form this cognitive competence could take. The overall picture squares well with the conclusions of the first part. In the last chapter I consider a challenge to the use of thought experiments in contemporary analytic philosophy coming from the 'experimental philosophy' movement. I argue that there is no way of individuating the class of hypothetical judgements under discussion which makes the challenge both interesting and sound. Moreover, I argue that there are reasons to think that philosophers possess some sort of expertise which sets them apart from non-philosophers in relevant ways.
APA, Harvard, Vancouver, ISO, and other styles
9

Gilfether, Kevin G. "The Content of Thought Experiments and Philosophical Context." Oberlin College Honors Theses / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=oberlin1368182630.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bryson, Anthony Alan Fumerton Richard A. "The view from the armchair : a defense of traditional philosophy /." Iowa City : University of Iowa, 2009. http://ir.uiowa.edu/etd/340.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Fitton, N. V. "Why and How to Report Distributions of Optima in Experiments on Heuristic Algorithms." University of Cincinnati / OhioLINK, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1006054556.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Bennett, Patrick Lawrence. "Using the Non-Intrusive Load Monitor for Shipboard Supervisory Control." Thesis, Monterey, California. Naval Postgraduate School, 2007. http://hdl.handle.net/10945/3008.

Full text
Abstract:
Field studies have demonstrated that it is possible to evaluate the state of many shipboard systems by analyzing the power drawn by electromechanical actuators. One device that can perform such an analysis is the non-intrusive load monitor (NILM). This thesis investigates the use of the NILM as a supervisory control system in the engineering plant of a gas-turbine-powered vessel. Field tests demonstrate that the NILM can potentially reduce overall sensor count if used in a supervisory control system. To demonstrate the NILM's capabilities in supervisory control systems, experiments are being conducted at the U.S. Navy's Land-Based Engineering Site (LBES) in Philadelphia, Pennsylvania. Following a brief description of the LBES facility and the NILM itself, this thesis presents testing procedures and methodology with results obtained during the extensive field studies. This thesis also describes the on-going efforts to further demonstrate and develop the NILM's capabilities in supervisory control systems.
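The core NILM idea of inferring actuator activity from an aggregate electrical signal can be illustrated with a toy change-detection sketch. This is a heavy simplification under assumed numbers, not the NILM algorithm used in the thesis: step changes in real power larger than a threshold are treated as load on/off events.

```python
import numpy as np

def detect_events(power, threshold=100.0):
    """Flag indices where aggregate real power (W) jumps by more than
    `threshold`; a crude stand-in for NILM event detection."""
    steps = np.diff(power)
    idx = np.flatnonzero(np.abs(steps) > threshold)
    return [(i + 1, steps[i]) for i in idx]   # (sample index, delta W)

# Synthetic aggregate signal: a 500 W pump switches on, then off.
power = np.concatenate([np.full(50, 200.0),
                        np.full(50, 700.0),
                        np.full(50, 200.0)])
print(detect_events(power))   # [(50, 500.0), (100, -500.0)]
```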
APA, Harvard, Vancouver, ISO, and other styles
13

Liang, Li. "Graphical Tools, Incorporating Cost and Optimizing Central Composite Designs for Split-Plot Response Surface Methodology Experiments." Diss., Virginia Tech, 2005. http://hdl.handle.net/10919/26768.

Full text
Abstract:
In many industrial experiments, completely randomized designs (CRDs) are impractical due to restrictions on randomization or the existence of one or more hard-to-change factors. In these situations, split-plot experiments are more realistic. The two separate randomizations in split-plot experiments lead to a different error structure from that in CRDs, and hence this affects not only response modeling but also the choice of design. In this dissertation, two graphical tools, three-dimensional variance dispersion graphs (3-D VDGs) and fractions of design space (FDS) plots, are adapted for split-plot designs (SPDs). They are used for examining and comparing different variations of central composite designs (CCDs) with standard, V- and G-optimal factorial levels. The graphical tools are shown to be informative for evaluating and developing strategies for improving the prediction performance of SPDs. The overall cost of a SPD involves two types of experimental units, and often an individual whole plot is more expensive than an individual subplot and measurement. Therefore, considering only the total number of observations is likely not the best way to reflect the cost of split-plot experiments. In this dissertation, a cost formulation involving the weighted sum of the number of whole plots and the total number of observations is discussed, and three cost-adjusted optimality criteria are proposed. The effects of considering different cost scenarios on the choice of design are shown in two examples. Often in practice it is difficult for the experimenter to select only one aspect for finding the optimal design. A realistic strategy is to select a design with good balance across multiple estimation and prediction criteria. Variations of the CCDs with the best cost-adjusted performance for estimation and prediction are studied for the combination of D-, G- and V-optimality criteria and for each individual criterion.
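The weighted-sum cost idea can be sketched in a few lines. The relative whole-plot weight `r`, the candidate model matrices and the use of D-efficiency per unit cost below are illustrative assumptions, not the dissertation's exact criteria.

```python
import numpy as np

def design_cost(n_whole_plots, n_obs, r=5.0):
    # r: assumed cost of one whole plot relative to one observation.
    return r * n_whole_plots + n_obs

def d_efficiency_per_cost(X, n_whole_plots, r=5.0):
    """Crude cost-adjusted score: (|X'X|^(1/p) / n) divided by cost."""
    n, p = X.shape
    d_eff = np.linalg.det(X.T @ X) ** (1.0 / p) / n
    return d_eff / design_cost(n_whole_plots, n, r)

# Compare two hypothetical model matrices (intercept + two factors).
X_a = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]], float)
X_b = np.vstack([X_a, [[1, 0, 0], [1, 0, 0]]])   # two extra centre runs
print(d_efficiency_per_cost(X_a, n_whole_plots=2),
      d_efficiency_per_cost(X_b, n_whole_plots=3))
```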
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
14

Agarwal, Gunjan. "Experiments with the pentium Performance monitoring counters." Thesis, Indian Institute of Science, 2002. http://hdl.handle.net/2005/69.

Full text
Abstract:
Performance monitoring counters are implemented in most recent microprocessors. In this thesis, we describe various performance measurement experiments for a program and a system, conducted on a Linux operating system using the Pentium performance counters. We carried out our measurements on a Pentium II microprocessor, whose performance counters can be configured to count events such as cache misses, TLB misses and instructions executed. We used a low-intrusive-overhead technique to access these counters, and used them to measure the cache-miss overheads due to context switches in a Linux system. Our methodology involves sampling the hardware counters every 50ps; the sampling was set up using signals related to interval timers. We describe an analytical cache performance model under multiprogrammed conditions from the literature and validate it using the performance monitoring counters. We next explore the long-term performance of a system under different workload conditions. Various performance monitoring events - data cache hits, data TLB misses, data cache reads or writes, branches, etc. - are monitored over a 24-hour period, which is useful in identifying activities that cause loss of system performance; timer interrupts are used for sampling the counters. Finally, we develop a profiling methodology that gives a perspective of the performance of the different functions of a program, not only on the basis of execution time but also on the basis of data cache misses. Available tools like prof on Unix can pinpoint the regions of performance loss in a program, but they rely mainly on execution-time profiles and give no insight into that program's cache performance; our methodology therefore reports the performance of each function on the basis of both its execution time and its cache behavior.
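The sampling methodology described (reading counters on a periodic interval-timer signal) can be mimicked in user space as below. This is only a schematic, Unix-only sketch: `read_counter` is a hypothetical placeholder for whatever low-overhead interface exposes the hardware counters, and the 50 ms period is an arbitrary illustrative choice.

```python
import random
import signal
import time

def read_counter():
    # Hypothetical stand-in for reading a hardware event counter.
    return random.randint(0, 1000)

samples = []

def on_tick(signum, frame):
    samples.append((time.monotonic(), read_counter()))

signal.signal(signal.SIGALRM, on_tick)
signal.setitimer(signal.ITIMER_REAL, 0.05, 0.05)   # fire every 50 ms (Unix-only)

t_end = time.monotonic() + 1.0        # sample for one second
while time.monotonic() < t_end:
    signal.pause()                    # sleep until the next SIGALRM

signal.setitimer(signal.ITIMER_REAL, 0.0)          # stop the timer
print(f"collected {len(samples)} samples")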
APA, Harvard, Vancouver, ISO, and other styles
15

Langkau, Julia. "Appealing to intuitions." Thesis, University of St Andrews, 2013. http://hdl.handle.net/10023/3544.

Full text
Abstract:
This thesis is concerned with the ontology, epistemology, and methodology of intuitions in philosophy. It consists of an introduction, Chapter 1, and three main parts. In the first part, Chapter 2, I defend an account of intuitions as appearance states according to which intuitions cannot be reduced to beliefs or belief-like states. I argue that an account of intuitions as appearance states can explain some crucial phenomena with respect to intuitions better than popular accounts in the current debate over the ontology of intuitions. The second part, Chapters 3 to 5, is a reply to Timothy Williamson's (2004, 2007) view on the epistemology and methodology of intuitions. The practice of appealing to the fact that we have an intuition as evidence from thought experiments has recently been criticised by experimental philosophers. Williamson argues that since thought experiments reliably lead to knowledge of the content of our intuition, we can avoid this criticism and the resulting sceptical threat by appealing to the content of the intuition. I agree that thought experiments usually lead to knowledge of the content of our intuition. However, I show that appealing to the fact that we have an intuition is a common and useful practice. I defend the view that for methodological reasons, we ought to appeal to the fact that we have an intuition as initial evidence from thought experiments. The third part, Chapter 6, is devoted to a paradigm method involving intuitions: the method of reflective equilibrium. Some philosophers have recently claimed that it is trivial and could even accommodate scepticism about the reliability of intuitions. I argue that reflective equilibrium is not compatible with such scepticism. While it is compatible with the view I defend in the second part of the thesis, more specific methodological claims have to be made.
APA, Harvard, Vancouver, ISO, and other styles
16

Leite, de Vasconcelos Luis Arthur. "Methodological investigations into design inspiration and fixation experiments." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/267495.

Full text
Abstract:
Designers often look for inspiration in their environment when exploring possible solutions to a given problem. However, many studies have reported that external stimuli may constrain designers’ imagination and limit their exploration to similar solutions, a phenomenon described as design fixation. Inspiration and fixation effects are traditionally studied with a similar experimental paradigm, which has produced a complex web of findings and explanations. Yet, when analysing the experiments and their findings closely, it becomes clear that there is considerable variation in how studies are conducted and the results they produce. Such variation makes it difficult to formulate a general view of how external stimuli affect the design process, and to translate the research findings into education and practice. Moreover, it raises questions about the reliability and effectiveness of the traditional experimental method. This thesis reports on a collection of studies that examine how design inspiration and fixation research is done and how it can be improved. It explores the research area by reviewing the literature and analysing data from a workshop; describes the research method by scrutinising experiments and their procedures; and explains the variation in research findings by testing experimental procedures empirically and suggesting new interpretations. My main findings are that: abstract stimuli can inspire or fixate designers to different degrees depending on how explicitly the stimuli are represented; external stimuli can inhibit the exploration of ideas that would otherwise be explored; the effect of experimental instructions varies depending on how encouraging the instructions are; and the way participants represent and elaborate ideas can moderate fixation results. Whilst this thesis offers insights into design practice and education, its main contribution is to design research, where it represents a fundamental material for those who are new to inspiration and fixation research, and for those who are already expert.
APA, Harvard, Vancouver, ISO, and other styles
17

Jackson, Samuel Edward. "Design of physical system experiments using Bayes linear emulation and history matching methodology with application to Arabidopsis thaliana." Thesis, Durham University, 2018. http://etheses.dur.ac.uk/12826/.

Full text
Abstract:
There are many physical processes within our world which scientists aim to understand. Computer models representing these processes are fundamental to achieving such understanding. Bayes linear emulation is a powerful tool for comprehensively exploring the behaviour of computationally intensive models. History matching is a method for finding the set of inputs to a computer model for which the corresponding model outputs give acceptable matches to observed data, given our state of uncertainty regarding the model itself, the measurements, and, if used, the emulators representing the model. This thesis provides three major developments to the current methodology in this area. We develop sequential history matching methodology by splitting the available data into groups and gaining insight about the information obtained from each group. Such insight is then realised through a wide array of novel visualisations. We develop emulation techniques for the case when there are hypersurfaces of input space across which we have essentially perfect knowledge about the model’s behaviour. Finally, we have developed the use of history matching methodology as criteria for the design of physical system experiments. We outline the general framework for design in a history matching setting, before discussing many extensions, including the performance of a comprehensive robustness analysis on our design choice. We outline our novel methodology on a model of hormonal crosstalk in the roots of an Arabidopsis plant.
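History matching is usually operationalised through an implausibility measure; the sketch below uses the standard formulation from the history-matching literature (observation, model-discrepancy and emulator variances in the denominator) with invented numbers, since the thesis's specific emulators and design criteria are not reproduced here.

```python
import numpy as np

def implausibility(z, mu_em, var_em, var_obs, var_disc):
    """I(x) = |z - E[f(x)]| / sqrt(Var_obs + Var_disc + Var_em(x))."""
    return np.abs(z - mu_em) / np.sqrt(var_obs + var_disc + var_em)

# Hypothetical emulator output at a grid of candidate inputs.
x = np.linspace(0, 1, 11)
mu_em = 2.0 * x                          # emulator mean
var_em = 0.05 + 0.1 * x                  # emulator variance
z, var_obs, var_disc = 1.0, 0.04, 0.02   # observed datum and variances

I = implausibility(z, mu_em, var_em, var_obs, var_disc)
non_implausible = x[I < 3.0]             # conventional cut-off of 3
print(non_implausible)
```

Inputs surviving the cut-off form the non-implausible region that subsequent waves, or proposed physical experiments, would refine further.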
APA, Harvard, Vancouver, ISO, and other styles
18

Balabanov, Vladimir Olegovich. "Development of Approximations for HSCT Wing Bending Material Weight using Response Surface Methodology." Diss., Virginia Tech, 1997. http://hdl.handle.net/10919/30730.

Full text
Abstract:
A procedure for generating a customized weight function for wing bending material weight of a High Speed Civil Transport (HSCT) is described. The weight function is based on HSCT configuration parameters. A response surface methodology is used to fit a quadratic polynomial to data gathered from a large number of structural optimizations. To reduce the time of performing a large number of structural optimizations, coarse-grained parallelization with a master-slave processor assignment on an Intel Paragon computer is used. The results of the structural optimization are noisy. Noise reduction in the structural optimization results is discussed. It is shown that the response surface filters out this noise. A statistical design of experiments technique is used to minimize the number of required structural optimizations and to maintain accuracy. Simple analysis techniques are used to find regions of the design space where reasonable HSCT designs could occur, thus customizing the weight function to the design requirements of the HSCT, while the response surface itself is created employing detailed analysis methods. Analysis of variance is used to reduce the number of polynomial terms in the response surface model function. Linear and constant corrections based on a small number of high fidelity results are employed to improve the accuracy of the response surface model. Configuration optimization of the HSCT employing a customized weight function is compared to the configuration optimization of the HSCT with a general weight function.
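Fitting a quadratic response surface to optimisation results, as described above, amounts to ordinary least squares on a polynomial basis. The sketch below is a minimal illustration with synthetic data standing in for the HSCT structural-optimisation outputs; the variable names are assumptions.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(60, 3))        # 3 configuration variables
y = (10 + 4*X[:, 0] - 2*X[:, 1] + 3*X[:, 0]*X[:, 2]
     + 5*X[:, 2]**2 + rng.normal(0, 0.3, 60))   # noisy "weight" responses

quad = PolynomialFeatures(degree=2, include_bias=True)
Z = quad.fit_transform(X)                   # 1, x_i, x_i*x_j, x_i^2 terms
model = LinearRegression(fit_intercept=False).fit(Z, y)
print(dict(zip(quad.get_feature_names_out(['x1', 'x2', 'x3']),
               np.round(model.coef_, 2))))
```

In practice, terms whose coefficients are statistically insignificant (via analysis of variance) would be dropped, mirroring the term reduction mentioned in the abstract.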
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
19

JABBARI, BEHZAD J. "EXPERIMENTS IN PUBLIC OPINION RESEARCH ON THE INTERNET." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1123627488.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Lherm, Francois-Rene. "Reassembling audit around the jurisdiction of comfort : how to design and conduct performative experiments." Thesis, Paris 1, 2016. http://www.theses.fr/2016PA01E072/document.

Full text
Abstract:
This thesis is composed of three articles which together contribute to enhancing the relevance of audit and of accounting research, based on the development of a new method of experimentation adapted to the management sciences. First, it underlines the importance of two highly relevant "how to" questions: "how to make the use of comfort legitimate in the audit of estimates?" and "how to optimize the practical relevance of theoretically valid results in auditing research?". It then develops the performative experiment, a method which makes it possible for the management sciences to address "how to" questions, and applies this method to the two questions raised. In spite of its limits, this research may contribute to, and open new perspectives for, practice and theory in three different fields. In audit, it indicates a new way to think about auditability, beyond reference to external benchmarks. In accounting research, it offers a step-by-step process for conferring experimental validity on theoretical results. In science studies, it delivers fresh data and perspectives with which to rethink the management sciences. Finally, this thesis may indicate a practicable way out of the very idea of modernity.
APA, Harvard, Vancouver, ISO, and other styles
21

Buyukburc, Atil. "Robust Design Of Lithium Extraction From Boron Clays By Using Statistical Design And Analysis Of Experiments." Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/1027036/index.pdf.

Full text
Abstract:
This thesis aims to design lithium extraction from boron clays using statistical design of experiments and robust design methodologies. Of the several factors affecting the extraction of lithium from clays, the six most important have been selected: gypsum-to-clay ratio, roasting temperature, roasting time, leaching solid-to-liquid ratio, leaching time and limestone-to-clay ratio. Three levels have been chosen for every factor and an experiment has been designed. After performing three replications of each experimental run, signal-to-noise ratio transformation, ANOVA, regression analysis and response surface methodology have been applied to the results. Optimization and confirmation experiments have been carried out sequentially to find factor settings that maximize lithium extraction with minimal variation. The mean of the maximum extraction has been observed as 83.81% with a standard deviation of 4.89, and the 95% prediction interval for the mean extraction is (73.729, 94.730). This result is in agreement with studies in the literature. However, this study is unique in that lithium is extracted from boron clays using limestone taken directly from nature and gypsum obtained as a waste product of boric acid production. Since these two materials add about 20% to the cost of the extraction process, the results of this study become important. Moreover, this study shows that statistical design of experiments can help the mining industry reduce the need for standardization.
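The signal-to-noise transformation used in the analysis can be illustrated with the standard Taguchi larger-the-better formula applied to three replications per run; the numbers below are invented, not the thesis data.

```python
import numpy as np

def sn_larger_the_better(replicates):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(replicates, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Lithium extraction (%) for three replications of two example runs.
runs = {"run_1": [78.2, 81.5, 80.1],
        "run_2": [62.4, 70.8, 55.3]}
for name, reps in runs.items():
    print(name, round(sn_larger_the_better(reps), 2), "dB")
```

Runs with higher, more consistent extraction receive a larger S/N ratio, which is why maximizing it targets high mean extraction with minimal variation.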
APA, Harvard, Vancouver, ISO, and other styles
22

Everett, Ryan Vincent. "An Improved Model-Based Methodology for Calibration of an Alternative Fueled Engine." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1321285633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Tomčová, Renata. "Testování výkonnosti maziv pro kolejovou dopravu." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-443211.

Full text
Abstract:
Top-of-rail lubricants are an effective way to control adhesion and to reduce contact wear and noise in rail transport. However, despite the widespread use of these lubricants, there is currently no standard defining how to test and evaluate their performance. This work aims to develop a methodology for testing top-of-rail lubricants in a laboratory environment using a tribometer in a ball-on-disk configuration. First, the important operational parameters of the experiments are analyzed and experimentally tested: mainly wear-in and run-in, the method of application, and the roughness and geometric parameters of the contact bodies. The result of this work is a testing methodology that guarantees good repeatability and reliability of results. In the last part of the work, the methodology is verified using commercial top-of-rail lubricants.
APA, Harvard, Vancouver, ISO, and other styles
24

Habibulla, Murtuza. "Analyzing the performance of an order accumulation and sortation system using simulation : a design of experiments approach." Ohio : Ohio University, 2001. http://www.ohiolink.edu/etd/view.cgi?ohiou1173895842.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Obeidat, Khaled Ahmad. "Design Methodology for Wideband Electrically Small Antennas (ESA) Based on the Theory of Characteristic Modes (CM)." The Ohio State University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=osu1274730653.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Gryder, Ryan W. "Design & Analysis of a Computer Experiment for an Aerospace Conformance Simulation Study." VCU Scholars Compass, 2016. http://scholarscompass.vcu.edu/etd/4208.

Full text
Abstract:
Within NASA's Air Traffic Management Technology Demonstration # 1 (ATD-1), Interval Management (IM) is a flight deck tool that enables pilots to achieve or maintain a precise in-trail spacing behind a target aircraft. Previous research has shown that violations of aircraft spacing requirements can occur between an IM aircraft and its surrounding non-IM aircraft when it is following a target on a separate route. This research focused on the experimental design and analysis of a deterministic computer simulation which models our airspace configuration of interest. Using an original space-filling design and Gaussian process modeling, we found that aircraft delay assignments and wind profiles significantly impact the likelihood of spacing violations and the interruption of IM operations. However, we also found that implementing two theoretical advancements in IM technologies can potentially lead to promising results.
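The combination of a space-filling design with Gaussian process modelling, as used in this study, can be sketched with SciPy's Latin hypercube sampler and scikit-learn. The two input factors, their ranges and the toy response below are assumptions for illustration only, not the NASA simulation.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Two illustrative inputs: an assigned delay (s) and a wind-scaling factor.
sampler = qmc.LatinHypercube(d=2, seed=42)
unit = sampler.random(n=30)                          # space-filling design
X = qmc.scale(unit, l_bounds=[0, 0.5], u_bounds=[60, 1.5])

def simulate_spacing_margin(x):
    # Hypothetical deterministic simulator output (not the ATD-1 model).
    return 5.0 - 0.05 * x[:, 0] * x[:, 1]

y = simulate_spacing_margin(X)
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)
print(gp.predict(np.array([[30.0, 1.0]])))           # predicted margin
```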
APA, Harvard, Vancouver, ISO, and other styles
27

Amanna, Ashwin Earl. "Statistical Experimental Design Framework for Cognitive Radio." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/77331.

Full text
Abstract:
This dissertation presents an empirical approach to identifying decisions for adapting cognitive radio parameters with no a priori knowledge of the environment. Cognitively inspired radios attempt to combine observed metrics of system performance with artificial intelligence decision-making algorithms. Current architectures trend towards hybrid combinations of heuristics, such as genetic algorithms (GA), and experiential methods, such as case-based reasoning (CBR). A weakness of the GA is its reliance on limited mathematical models for estimating bit error rate, packet error rate, throughput, and signal-to-noise ratio. The CBR approach is similarly limited by its dependency on past experiences. Both methods can suffer in environments not previously encountered. In contrast, the statistical methods identify performance estimation models by exercising defined experimental designs. This represents an experiential decision-making process formed in the present rather than the past. There are three core contributions from this empirical framework: 1) it enables a new approach to decision making based on empirical estimation models of system performance, 2) it provides a systematic method for initializing cognitive engine configuration parameters, and 3) it facilitates deeper understanding of system behavior by quantifying parameter significance and interaction effects. Ultimately, this understanding enables simplification of system models by identifying insignificant parameters. This dissertation defines an abstract framework that enables the application of statistical approaches to cognitive radio systems regardless of platform or application space. Specifically, it assesses factorial design of experiments and response surface methodology (RSM) on an over-the-air wireless radio link. Results are compared to a benchmark GA cognitive engine. The framework is then used for identifying software-defined radio initialization settings. Taguchi designs, a related statistical method, are implemented to identify initialization settings of a GA.
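Quantifying parameter significance and interaction effects from a two-level factorial experiment, as in the framework above, reduces to simple contrasts. The sketch below computes main effects for a hypothetical 2^3 screen of radio parameters; the parameter names and throughput values are invented.

```python
import itertools
import numpy as np

factors = ["tx_power", "modulation_order", "coding_rate"]
levels = list(itertools.product([-1, 1], repeat=3))      # 2^3 design
# Hypothetical measured throughput (Mbit/s) for each run, in design order.
throughput = np.array([4.1, 5.0, 4.3, 5.6, 6.0, 7.2, 6.4, 7.9])

design = np.array(levels, dtype=float)
for j, name in enumerate(factors):
    effect = (throughput[design[:, j] == 1].mean()
              - throughput[design[:, j] == -1].mean())
    print(f"main effect of {name}: {effect:+.2f} Mbit/s")

# Two-factor interaction contrast, e.g. tx_power x coding_rate:
ab = design[:, 0] * design[:, 2]
print("tx_power x coding_rate:",
      throughput[ab == 1].mean() - throughput[ab == -1].mean())
```

Effects that are small relative to experimental noise flag parameters that can be dropped from the system model, which is the simplification mentioned in the abstract.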
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
28

Ayan, Elif. "Parameter Optimization Of Steel Fiber Reinforced High Strength Concrete By Statistical Design And Analysis Of Experiments." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/1051960/index.pdf.

Full text
Abstract:
This thesis illustrates parameter optimization of the compressive strength, flexural strength and impact resistance of steel fiber reinforced high strength concrete (SFRHSC) by statistical design and analysis of experiments. Among the several factors affecting these responses, five parameters have been chosen as the most important: age of testing, binder type, binder amount, curing type and steel fiber volume fraction. Taguchi and regression analysis techniques have been used to evaluate the results of an L27(3^13) Taguchi orthogonal array and a 3^4x2^1 full factorial experimental design. Signal-to-noise ratio transformation and ANOVA have been applied to the results of the experiments in the Taguchi analysis. Response surface methodology has been employed to optimize the best regression model selected for each of the three responses. In this study the Charpy impact test, a different kind of impact test, has been applied to SFRHSC for the first time. The means of compressive strength, flexural strength and impact resistance have been observed as around 125 MPa, 14.5 MPa and 9.5 kgf.m respectively, which are very close to the desired values. Moreover, this study is unique in the sense that the derived models enable the identification of the underlying primary factors and interactions that influence the modeled responses of steel fiber reinforced high strength concrete.
APA, Harvard, Vancouver, ISO, and other styles
29

Aldemir, Basak. "Parameter Optimization Of Chemically Activated Mortars Containing High Volumes Of Pozzolan By Statistical Design And Analysis Of Experiments." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607106/index.pdf.

Full text
Abstract:
Aldemir, Başak. M.S., Department of Industrial Engineering. Supervisor: Prof. Dr. Ömer Saatçioglu; Co-Supervisor: Assoc. Prof. Dr. Lutfullah Turanli. January 2006, 167 pages.
This thesis illustrates parameter optimization of the early and late compressive strengths of chemically activated mortars containing high volumes of pozzolan by statistical design and analysis of experiments. Four dominant parameters in the chemical activation of natural pozzolans are chosen for the research: natural pozzolan replacement, amount of pozzolan passing the 45 μm sieve, activator dosage and activator type. Response surface methodology has been employed in the statistical design and analysis of the experiments. Based on various second-order response surface designs, experimental data have been collected and the best regression models have been chosen and optimized. In addition to optimizing the early and late strength responses separately, simultaneous optimization of compressive strength with several other responses, such as cost and a standard deviation estimate, has also been performed. The research highlight is the uniqueness of the statistical optimization approach to the chemical activation of natural pozzolans.
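Simultaneous optimisation of several responses, as mentioned above, is commonly handled with a desirability function. The sketch below shows that generic approach with invented targets and predictions; it is not the specific formulation used in the thesis.

```python
import numpy as np

def desirability_max(y, low, high):
    """Derringer-type desirability for a larger-the-better response."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def desirability_min(y, low, high):
    """Desirability for a smaller-the-better response (e.g. cost)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Hypothetical model predictions at three candidate factor settings.
strength = np.array([32.0, 41.0, 37.5])      # MPa, want high
cost = np.array([1.00, 1.35, 1.10])          # relative cost, want low

D = np.sqrt(desirability_max(strength, 30, 45)
            * desirability_min(cost, 0.9, 1.5))   # geometric mean
print("best candidate:", int(np.argmax(D)), D.round(3))
```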
APA, Harvard, Vancouver, ISO, and other styles
30

Mahendra, Adhiguna. "Methodology of surface defect detection using machine vision with magnetic particle inspection on tubular material." Thesis, Dijon, 2012. http://www.theses.fr/2012DIJOS051.

Full text
Abstract:
Industrial surface inspection of tubular material based on Magnetic Particle Inspection (MPI) is a challenging task. Magnetic Particle Inspection is a well-known non-destructive testing method whose goal is to reveal the presence of cracks in the surface of a metallic material: the surface is coated with a solution containing magnetic particles, magnetised and then examined under ultraviolet lighting. Currently, MPI-based inspection of tubular material at Vallourec production sites is still based on the judgment of a human operator. It is a time-consuming and tedious job, and it is prone to error due to human eye fatigue. In this thesis we propose a machine vision approach to detect defects automatically, without human supervision and with the best possible detection rate, from images of tube surfaces after MPI treatment. We focus on crack-like defects since they represent the major ones. To fulfil this objective, a machine vision methodology is developed step by step, from image acquisition to defect classification. The proposed framework was developed according to industrial constraints and standards, so accuracy, computational speed and simplicity were very important. Based on Magnetic Particle Inspection principles, an acquisition system is developed and optimized in order to acquire images of tubular material for storage or processing. The characteristics of the crack-like defects, with respect to their geometric model and curvature, are used as prior knowledge for mathematical morphology and linear filtering. After segmentation and binarization of the image, a vast number of defect candidates exist. Aside from geometric and intensity features, multiresolution analysis is performed on the images to extract textural features. Finally, classification is performed with a Random Forest classifier, chosen for its robustness and speed, and compared with other classifiers such as the Support Vector Machine and decision trees. The parameters for mathematical morphology, linear filtering and classification are analyzed and optimized with design of experiments based on the Taguchi approach and a genetic algorithm, which yielded significant gains in time and efficiency; the most significant parameters obtained may be analyzed and tuned further. Experiments are performed on two image databases corresponding to two product types, tool joints and coupling tubes, with one third of the images used for training, and are evaluated for accuracy and robustness by comparing ground truth and processed images. The Random Forest classifier combined with geometric features and wavelet-based texture features gives the best classification rate for tool-joint defects (95.5%); for coupling tubes, the best rate is obtained by the SVM with multiresolution analysis (89.2%), while Random Forest offers a good compromise at 82.4%. The main industrial target of a 100% defect detection rate is thus approached, with rates of the order of 90%; the remaining false positives and false negatives originate mainly in the machined appearance of the tube in certain areas ("hard bending"). This methodology can be replicated for other surface inspection applications, with or without MPI, especially those related to surface crack detection on metal product lines.
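The classifier comparison described above (Random Forest versus SVM on geometric, intensity and wavelet-texture features) can be sketched with scikit-learn. The synthetic feature matrix below stands in for the real defect candidates, which are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 12))     # 12 geometric/intensity/texture features
y = (X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 7]
     + rng.normal(0, 0.5, 300)) > 0          # defect vs non-defect labels

rf = RandomForestClassifier(n_estimators=200, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

for name, clf in [("random forest", rf), ("SVM (RBF)", svm)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f} cross-validated accuracy")
```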
APA, Harvard, Vancouver, ISO, and other styles
31

Hays, Mark A. "A Fault-Based Model of Fault Localization Techniques." UKnowledge, 2014. http://uknowledge.uky.edu/cs_etds/21.

Full text
Abstract:
Every day, ordinary people depend on software working properly. We take it for granted; from banking software, to railroad switching software, to flight control software, to software that controls medical devices such as pacemakers or even gas pumps, our lives are touched by software that we expect to work. It is well known that the main technique/activity used to ensure the quality of software is testing. Often it is the only quality assurance activity undertaken, making it that much more important. In a typical experiment studying these techniques, a researcher will intentionally seed a fault (intentionally breaking the functionality of some source code) with the hopes that the automated techniques under study will be able to identify the fault's location in the source code. These faults are picked arbitrarily; there is potential for bias in the selection of the faults. Previous researchers have established an ontology for understanding or expressing this bias called fault size. This research captures the fault size ontology in the form of a probabilistic model. The results of applying this model to measure fault size suggest that many faults generated through program mutation (the systematic replacement of source code operators to create faults) are very large and easily found. Secondary measures generated in the assessment of the model suggest a new static analysis method, called testability, for predicting the likelihood that code will contain a fault in the future. While software testing researchers are not statisticians, they nonetheless make extensive use of statistics in their experiments to assess fault localization techniques. Researchers often select their statistical techniques without justification. This is a very worrisome situation because it can lead to incorrect conclusions about the significance of research. This research introduces an algorithm, MeansTest, which helps automate some aspects of the selection of appropriate statistical techniques. The results of an evaluation of MeansTest suggest that MeansTest performs well relative to its peers. This research then surveys recent work in software testing using MeansTest to evaluate the significance of researchers' work. The results of the survey indicate that software testing researchers are underreporting the significance of their work.
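The thesis's MeansTest algorithm is not reproduced here, but the general idea of automating the choice of a statistical technique can be sketched as follows: check a distributional assumption first, then fall back to a non-parametric test when it fails. The threshold and the specific tests are assumptions for illustration.

```python
from scipy import stats

def compare_means(sample_a, sample_b, alpha=0.05):
    """Toy automated test selection (not the MeansTest algorithm):
    use Welch's t-test if both samples look normal, else Mann-Whitney U."""
    normal = (stats.shapiro(sample_a).pvalue > alpha
              and stats.shapiro(sample_b).pvalue > alpha)
    if normal:
        return "welch_t", stats.ttest_ind(sample_a, sample_b, equal_var=False)
    return "mann_whitney", stats.mannwhitneyu(sample_a, sample_b)

# Hypothetical fault-localization scores from two techniques.
a = [0.61, 0.58, 0.72, 0.66, 0.69, 0.63, 0.70, 0.65]
b = [0.52, 0.49, 0.57, 0.55, 0.50, 0.60, 0.48, 0.54]
print(compare_means(a, b))
```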
APA, Harvard, Vancouver, ISO, and other styles
32

Palmér, Daniel. "Exploring interaction design for counter-narration and agonistic co-design – Four experiments to increase understanding of, and facilitate, an established practice of grassroots activism." Thesis, Malmö högskola, Fakulteten för kultur och samhälle (KS), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-22507.

Full text
Abstract:
This is a documentation of a programmatic design approach, moving through different levels of an established practice of grassroots activism. The text frames an open-ended, exploratory methodology, as four stages of investigation, trying to find possible ways to shape and increase understanding of, and facilitate a process, of co-designing a practice. It presents the experience of looking for opportunities for counter-narration, as contribution to an activist cause, and questioning the role, purpose and approach of a designer in a grassroots activist environment.
APA, Harvard, Vancouver, ISO, and other styles
33

Rebello, André Luiz Santos. "A contribuição do ensino de física no ensino médio integrado da escola técnica de alimentos (panificação e derivados de leite): nata." Niterói, 2017. https://app.uff.br/riuff/handle/1/4951.

Full text
Abstract:
Taking as theoretical references the meaningful learning proposed by Ausubel and Bachelard's epistemology of knowledge, this dissertation proposes the development of a teaching methodology better suited to the Colégio Estadual Comendador Valentim Dos Santos Diniz (CECVSD). The CECVSD is an integrated secondary-level school, and the proposed teaching methodology arose from observations reported by teachers during school planning meetings, in which students signaled difficulties in connecting the theory of the common-cycle content with its practical application in the technical-professional component. The methodology consists of using experiments that we built to support the teacher, in order to contribute to better learning and student formation. These experiments are made available as teaching material.
APA, Harvard, Vancouver, ISO, and other styles
34

Yan, Chang. "A computational game-theoretic study of reputation." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:e6acb250-efb8-410b-86dd-9e3e85b427b6.

Full text
Abstract:
As societies become increasingly connected thanks to advancing technologies and the Internet in particular, individuals and organizations (i.e. agents hereafter) engage in innumerable interactions and constantly face the possibilities thereof. Such unprecedented connectivity offers opportunities through which social and economic benefits are realised and disseminated. Nonetheless, risky and damaging interactions abound. To promote beneficial relationships and to deter adverse outcomes, agents adopt different means and resources. This thesis focuses on reputation as a crucial mechanism for promoting positive interaction, and examines the topic from a game-theoretic perspective using computational methods. First, we investigate the design of reputation systems by incorporating economic incentives into algorithm design. Focusing on ubiquitous user-generated ratings on the Internet, we propose a truthful reputation mechanism that not only enforces honest reporting from individual raters but also takes into account their personal preferences. The mechanism is constructed using a blend of Bayesian Truth Serum and SimRank algorithms, both specifically adapted for our use case of online ratings. We show that the resulting mechanism is Bayesian incentive compatible and is computable in polynomial time. In addition, the mechanism is shown to be resistant to common manipulations on the Internet such as uniform fake ratings and targeted collusions. Lastly, we discuss detailed considerations for implementing the mechanism in practice. Second, we investigate experimentally the relative importance of reputational and social knowledge in sustaining cooperation in dynamic networks. In our experiments, U.S.-based subjects play a repeated game where, in each round, an endogenous network is formed among a group of 13 players and each player chooses a cooperative or non-cooperative action that applies to all her connections. We vary the availability of reputational and social knowledge to subjects in 4 treatments. At the aggregate level, we find that reputational knowledge is of first-order importance for supporting cooperation, while social knowledge plays a complementary role only when reputational knowledge is available. Further community-level analysis reveals that reputational knowledge leads to the emergence of highly cooperative hubs and a dense and clustered network, while social knowledge enhances cooperation by forming a large, dense and clustered community of cooperators who exclude outsiders through link removals and link refusals. At the individual level, reputational knowledge proves essential for the emergence of network structural characteristics that are associated with cooperative actions. In contrast, in treatments without reputational information, none of the network metrics is predictive of subjects' choices of action. Furthermore, we present UbiquityLab, a pioneering online platform for conducting real-time interactive experiments for game-theoretic studies. UbiquityLab supports both synchronous and asynchronous game models, and allows for complex and customisable interaction between subjects. It offers both back-end and front-end infrastructure with a modularised design to enable rapid development and streamlined operation. For instance, in synchronous mode, all per-stage and inter-stage logic are fully encapsulated by a thin server-side module, while a suite of client-side components eases the creation of the game interface. The platform features a robust messaging protocol, such that player connections and game states are restored automatically upon networking errors and dropped-out subjects are seamlessly substituted by customisable program players. Online experiments enjoy clear advantages over lab equivalents as they benefit from low operation cost, efficient execution, large and diverse subject pools, etc. UbiquityLab aims to promote online experiments as an emerging research methodology in experimental economics by bringing its benefits to other researchers.
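One of the building blocks mentioned above, SimRank, can be sketched in a few lines. This is the classic algorithm on a tiny hand-made graph, not the adapted version developed in the thesis.

```python
def simrank(in_neighbours, c=0.8, iters=20):
    """Classic SimRank on a directed graph given as {node: [in-neighbours]}."""
    nodes = list(in_neighbours)
    sim = {a: {b: 1.0 if a == b else 0.0 for b in nodes} for a in nodes}
    for _ in range(iters):
        new = {a: {b: 1.0 if a == b else 0.0 for b in nodes} for a in nodes}
        for a in nodes:
            for b in nodes:
                if a == b or not in_neighbours[a] or not in_neighbours[b]:
                    continue
                total = sum(sim[u][v]
                            for u in in_neighbours[a] for v in in_neighbours[b])
                new[a][b] = c * total / (len(in_neighbours[a]) * len(in_neighbours[b]))
        sim = new
    return sim

# Tiny rating-style graph: items point to the users who rated them.
graph = {"u1": [], "u2": [], "item_a": ["u1", "u2"],
         "item_b": ["u1"], "item_c": ["u2"]}
print(round(simrank(graph)["item_a"]["item_b"], 3))   # 0.4
```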
APA, Harvard, Vancouver, ISO, and other styles
35

Mullaney, Tara. "Thinking beyond the Cure : a constructive design research investigation into the patient experience of radiotherapy." Doctoral thesis, Umeå universitet, Designhögskolan vid Umeå universitet, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-116989.

Full text
Abstract:
This constructive design research dissertation aims to understand how design can be used as part of a composite research approach to generate knowledge about how complex phenomena are composed through their interactions and relationships with various actors, both human and non-human. It has done this by investigating a single phenomenon, the patient experience of radiotherapy. Through the purposeful selection and application of methods, theories, and existing research from design, nursing, and STS, this thesis utilizes a mixed-method approach comprised of qualitative, quantitative methods, and design experimentation, across multiple research sites and patient populations, in three research projects – PERT, DUMBO, and POIS – to generate rich and layered knowledge of the patient experience. Experience prototypes are used to challenge, through intervention or provocation, the relationships between the various radiotherapy actors identified through the empirical methods. Together, the research generated in PERT, DUMBO, and POIS construct a map of the networked, interdependent actors which shape the patient’s emotional experience of radiotherapy: the staff, technology, information, environment, and institutions. It also calls attention to the problematic relationship between radiotherapy patients and the technologies used to treat them, which can lead to anxiety, worry, and fear. This thesis offers contributions related to both improving patient experience and designing for complex social issues. First, this research suggests that individuals, other than primary users, need to be acknowledged in the design of medical technologies. It proposes calling attention to patients by naming them as interactors in their relationships with the aforementioned technologies, removing them from the role of implicated actor. Second, this thesis problematizes treating the actors within a network as independent entities, which medical research and user-centered design often does, and calls for a new type of design practice which attends to these networked relationships. Third, this thesis suggests two ways in which design research practice should be shifted methodologically if it wants to engage with and design for complex social issues like patient experience; widening the researcher’s perspective on the issue through the use of a composite methodology, and having the researcher maintain this scope by remaining closely connected to their research context. The implications of this work concern how design research, design education, and design practice might shift their approaches to fully acknowledge and attend to the complexity of systems like healthcare.
APA, Harvard, Vancouver, ISO, and other styles
36

Jowsey, Allan. "Fire imposed heat fluxes for structural analysis." Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/1480.

Full text
Abstract:
The last two decades have seen new insights, data and analytical methods for establishing the behaviour of structures in fire. These methods have slowly migrated into practice and now form the basis for modern quantitative structural fire engineering. This study presents a novel methodology for determining the heat fluxes imposed on structural members. To properly characterise the temperature rise of the structural elements, a post-processing model for computational fluid dynamics tools was developed to establish the heat fluxes imposed on all surfaces by a fire. This model acts as a tool for any computational fluid dynamics model and works on the basis of well resolved local gas conditions. Analysis of the smoke layer and the products of combustion allows heat fluxes to be defined from smoke absorption coefficients and temperatures. These heat fluxes are defined at all points on the structure by considering full spatial and temporal distributions, as well as directionality and the characteristic length and time scales of fires. Length scales are evaluated for different structural member geometries, while time scales are evaluated for different structural materials, including applied fire protection. The output of this model provides the input for the thermal analysis of the structural members, a necessary step prior to the structural analysis itself. The model is validated against experimental results from large-scale fire tests, showing good agreement. In addition, comparisons are made with current methods to highlight their potential inadequacies.
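As a minimal illustration of the kind of relation such a post-processor builds on, the Python sketch below computes a radiative heat flux from a local absorption coefficient, path length and smoke temperature using the standard grey-gas approximation; the input values are illustrative and are not taken from the thesis or its validation tests.

    # Grey-gas radiative flux from local gas conditions (illustrative values only).
    import math

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

    def radiative_flux(kappa, path_length, gas_temp_k):
        """Incident radiative flux from a smoke layer with absorption coefficient kappa (1/m),
        characteristic path length (m) and gas temperature (K)."""
        emissivity = 1.0 - math.exp(-kappa * path_length)  # grey-gas emissivity
        return emissivity * SIGMA * gas_temp_k ** 4

    print(radiative_flux(kappa=0.8, path_length=1.5, gas_temp_k=900.0))  # W/m^2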
APA, Harvard, Vancouver, ISO, and other styles
37

Galdámez, Edwin Vladimir Cardoza. "Aplicação das técnicas de planejamento e análise de experimentos na melhoria da qualidade de um processo de fabricação de produtos plásticos." Universidade de São Paulo, 2002. http://www.teses.usp.br/teses/disponiveis/18/18140/tde-18112002-090421/.

Full text
Abstract:
Industrial experiments are carried out by companies to improve the quality characteristics of their products and manufacturing processes. This dissertation studies and applies design and analysis of experiments techniques to industrial quality improvement. As part of this objective, fractional factorial 2^(k-p) designs, response surface methodology and analysis of variance are applied to an injection moulding process used by a company that manufactures and sells plastic components for the construction industry. The experimental study identifies the most important parameters of the plastic injection process, melt temperature and injection pressure, and determines their optimal settings. The study also evaluates the procedure for implementing the experimental techniques and the practical difficulties encountered, and seeks to contribute to the integration between university and industry.
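As an illustration of the fractional factorial screening described above, the Python sketch below builds a 2^(3-1) design and estimates main effects; the first two factor names follow the abstract, while the third factor and the response values are invented for the example.

    # 2^(3-1) fractional factorial with defining relation C = A*B, plus main effects.
    import itertools
    import numpy as np

    base = np.array(list(itertools.product([-1, 1], repeat=2)))      # full factorial in A, B
    design = np.column_stack([base, base[:, 0] * base[:, 1]])        # columns: A, B, C = A*B

    y = np.array([2.1, 1.7, 1.9, 1.2])                               # hypothetical shrinkage responses

    factors = ["melt_temperature", "injection_pressure", "cooling_time"]
    effects = {name: y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
               for j, name in enumerate(factors)}
    print(effects)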
APA, Harvard, Vancouver, ISO, and other styles
38

Cevik, Mert. "Design And Optimization Of A Mixed Flow Compressor Impeller Using Robust Design Methods." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12611105/index.pdf.

Full text
Abstract:
This study focuses on developing an individual design methodology for a centrifugal impeller and on generating a mixed flow impeller for a small turbojet engine using that methodology. The methodology is structured around design, modelling and optimization processes operated sequentially. The design process consists of engine design and compressor design codes operated together with a commercial design code. Design of Experiments methods and an in-house neural network code are used for the modelling phase. The optimization is based on an in-house code built around a multidirectional search algorithm. The optimization problem is constructed using the in-house parametric design codes of the engine and the compressor; its goal is to reach an optimum design that gives the best possible combination of thrust and fuel consumption for a small turbojet engine. The final combination of design parameters obtained from the optimization study is used to generate the final design with the commercial design code. In the last part of the thesis, the final design is compared with a standard radial flow impeller in order to clarify the benefit of the study. The results show that the mixed flow compressor design is superior to a standard radial flow compressor in a small turbojet application.
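The loop the abstract outlines (sample the design space, fit a cheap model, search the model) can be sketched in Python as below; a quadratic response surface and a coarse grid search stand in for the thesis's in-house neural network and multidirectional search, and the objective is a stand-in for the coupled engine and compressor codes.

    # Surrogate-based optimization sketch with a stand-in objective and model.
    import numpy as np

    rng = np.random.default_rng(0)

    def stand_in_objective(x):
        # hypothetical trade-off between thrust and fuel consumption
        return (x[:, 0] - 0.3) ** 2 + 2.0 * (x[:, 1] - 0.7) ** 2

    X = rng.uniform(0.0, 1.0, size=(30, 2))     # sampled design points
    y = stand_in_objective(X)

    def features(X):
        # quadratic response surface terms: 1, x1, x2, x1^2, x2^2, x1*x2
        return np.column_stack([np.ones(len(X)), X, X ** 2, X[:, :1] * X[:, 1:]])

    beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

    grid = np.array([[a, b] for a in np.linspace(0, 1, 101) for b in np.linspace(0, 1, 101)])
    best = grid[np.argmin(features(grid) @ beta)]
    print("surrogate optimum near", best)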
APA, Harvard, Vancouver, ISO, and other styles
39

BÔSSO, Antônio Rafael de Souza Alves. "Desenvolvimento do Software PlanEx de planejamento de experimentos online e sua aplicação didática na pós-Graduação." Universidade Federal de Goiás, 2012. http://repositorio.bc.ufg.br/tede/handle/tde/1023.

Full text
Abstract:
This work describes the factorial design of experiments and response surface methodology (RSM) tools implemented in PlanEx, a new web-based software written in Java. The goal was to develop and validate PlanEx as a simple, efficient, interactive and free online tool for teaching and research, with no download required; Java was chosen because it is open source and offers a wide range of graphical features and free libraries for statistics and linear algebra. PlanEx provides full (2^k) and fractional (2^(k-1)) factorial designs with up to six variables, with or without replicates, and supports screening of the variables that significantly affect the experimental response through replicates and normal probability plots, as well as assessment of model bias through plots of residuals against fitted values. For optimization or expansion of a factorial design, PlanEx offers RSM with up to four variables at three levels, with a central point and with or without replicates; through RSM the user can check whether a linear or a quadratic model better predicts the experimental responses, using 3D response surface plots and analysis of variance (ANOVA). PlanEx was applied in the second terms of 2009, 2010 and 2011 in the Chemometrics II course taught by Prof. Anselmo Elcana de Oliveira in the graduate programmes in Chemistry at UFG and the multi-institutional UFU/UFG/UFMT doctorate, where it proved to be a fast, interactive, free and simple design-of-experiments tool for graduate-level teaching and research.
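A minimal Python sketch of what a 2^k generator like the one described has to produce (the run matrix and the signed columns for every main effect and interaction) follows; it illustrates the underlying construction only and is not PlanEx's Java implementation.

    # Full factorial 2^k runs and the corresponding effect/interaction columns.
    import itertools
    import numpy as np

    def full_factorial(k):
        """Runs of a 2^k design in standard order, coded -1/+1."""
        return np.array(list(itertools.product([-1, 1], repeat=k)))

    def model_matrix(runs):
        """Signed columns for all main effects and interactions (no intercept)."""
        k = runs.shape[1]
        cols, labels = [], []
        for size in range(1, k + 1):
            for combo in itertools.combinations(range(k), size):
                cols.append(np.prod(runs[:, combo], axis=1))
                labels.append("*".join(f"x{i + 1}" for i in combo))
        return np.column_stack(cols), labels

    runs = full_factorial(3)
    M, labels = model_matrix(runs)
    print(labels)   # ['x1', 'x2', 'x3', 'x1*x2', 'x1*x3', 'x2*x3', 'x1*x2*x3']
    print(M.shape)  # (8, 7)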
APA, Harvard, Vancouver, ISO, and other styles
40

Zhang, Aijun. "Majorization methodology for experimental designs." HKBU Institutional Repository, 2004. http://repository.hkbu.edu.hk/etd_ra/521.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Gomes, Charles. "Contribution de la planification expérimentale à la modélisation de phénomènes complexes en formulation." Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0735.

Full text
Abstract:
In some areas of formulation, such as cosmetics, the phenomena under study can be highly chaotic, with discontinuities or non-linear zones. In such cases the formulator must ask many questions before proposing the optimal experimental strategy, which has to be adapted as closely as possible to the problem and its constraints. For such phenomena, classical designs of experiments, such as Scheffé simplex lattices or D-optimal designs, prove insufficient because the experimental points do not cover the experimental space uniformly. In these studies it is essential to explore the whole experimental domain and to distribute the points uniformly in space. For that purpose, space-filling designs (SFD), frequently used with orthogonal variables but rarely with mixture variables, are particularly interesting. The objective of this thesis is to adapt algorithms for constructing uniform designs to the case of mixture designs and to propose simple guidelines for choosing the nature and the number of points of the mixture design.
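One simple way to obtain a space-filling set of mixture points is sketched below in Python: draw many candidates uniformly on the simplex and keep a maximin subset. This illustrates the general idea only, not the algorithms developed in the thesis.

    # Space-filling mixture design sketch: uniform simplex candidates + maximin selection.
    import numpy as np

    rng = np.random.default_rng(1)
    candidates = rng.dirichlet(np.ones(3), size=500)   # uniform points on the 3-component simplex

    def maximin_subset(points, n):
        chosen = [0]                                   # start from an arbitrary candidate
        for _ in range(n - 1):
            dist = np.linalg.norm(points[:, None, :] - points[chosen][None, :, :], axis=2)
            d = dist.min(axis=1)                       # distance of each candidate to the chosen set
            d[chosen] = -1.0                           # never re-pick a chosen point
            chosen.append(int(np.argmax(d)))           # add the farthest-from-the-set candidate
        return points[chosen]

    design = maximin_subset(candidates, 12)
    print(design.sum(axis=1))                          # every row is a mixture summing to 1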
APA, Harvard, Vancouver, ISO, and other styles
42

Mitchinson, Pelham James. "Crowding indices : experimental methodology and predictive accuracy." Thesis, University of Southampton, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302320.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Ritchie, Paul Andrew 1960. "A systematic, experimental methodology for design optimization." Thesis, The University of Arizona, 1988. http://hdl.handle.net/10150/276698.

Full text
Abstract:
Much attention has been directed at off-line quality control techniques in recent literature. This study is a refinement of and an enhancement to one technique, the Taguchi Method, for determining the optimum setting of design parameters in a product or process. In place of the signal-to-noise ratio, the mean square error (MSE) for each quality characteristic of interest is used. Polynomial models describing mean response and variance are fit to the observed data using statistical methods. The settings for the design parameters are determined by minimizing a statistical model. The model uses a multicriterion objective consisting of the MSE for each quality characteristic of interest. Minimum bias central composite designs are used during the data collection step to determine the settings of the parameters where observations are to be taken. Included is the development of minimum bias designs for various cases. A detailed example is given.
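As a small illustration of the objective described above, the Python sketch below computes MSE = (mean - target)^2 + variance for each quality characteristic at a single parameter setting and combines them with weights; the data and weights are invented, and in the study itself polynomial models of the mean and variance fitted over minimum bias central composite designs are minimized instead.

    # Multicriterion MSE objective at one design-parameter setting (invented data).
    import numpy as np

    obs = {"thickness": np.array([2.02, 1.98, 2.05, 1.97]),
           "hardness":  np.array([51.0, 49.5, 50.4, 50.1])}
    targets = {"thickness": 2.00, "hardness": 50.0}
    weights = {"thickness": 1.0, "hardness": 0.5}

    def mse(values, target):
        # mean square error = squared deviation of the mean from target + sample variance
        return (values.mean() - target) ** 2 + values.var(ddof=1)

    objective = sum(w * mse(obs[k], targets[k]) for k, w in weights.items())
    print(objective)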
APA, Harvard, Vancouver, ISO, and other styles
44

Mulligan, Talley. "Lekhost a Tize: An Experiment in Film Production Methodology." ScholarWorks@UNO, 2003. http://scholarworks.uno.edu/td/46.

Full text
Abstract:
The proliferation and convergence of new technologies, as well as the diverse media they have given rise to, have had a dramatic impact on the theory and practice of contemporary filmmaking. This trend also holds considerable implications for the range of cinematic forms likely to be embraced in the future, as well as for the methodologies necessarily exploited in their making. The formal and expressive possibilities inherent in the climate of experimentation existing at this unique juncture in history encourage out-of-the-ordinary solutions to long-standing problems while raising important questions regarding the process and goal of filmmaking. Taking the films of the Czech New Wave and their trademark formal experimentation as a point of departure, the present study attempts to incorporate the disparate influences of these novel circumstances into filmmaking. As such, Lekhost a Tíže represents one filmmaker's efforts towards the goal of an intuitive and personal system of filmmaking, based on a flexible, yet expressive visual language that seeks to promote discovery without forfeiting narrative coherence.
APA, Harvard, Vancouver, ISO, and other styles
45

Nandivada, Rakesh. "Experimental methodology for measurement of diesel exhaust particulates." Morgantown, W. Va. : [West Virginia University Libraries], 2007. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=5484.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2007.
Title from document title page. Document formatted into pages; contains ix, 69 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 59-64).
APA, Harvard, Vancouver, ISO, and other styles
46

Feitosa, Filipe Xavier. "Development of PVT methodology and mounting experimental apparatus." Universidade Federal do Ceará, 2013. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=14029.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
The aim of this work was to assemble a PVT experimental apparatus capable of producing phase-equilibrium data at high pressures and to establish an efficient measurement methodology. Assembly tests were carried out while the methodology was developed: bubble-point data for pure carbon dioxide were determined at 25, 26, 27 and 28 °C and compared with data obtained on similar equipment, giving an average deviation of 0.4 bar (0.6 % of the measured value), of the same order of magnitude as similar apparatus. To test systems in which a component that is liquid at ambient pressure is introduced into the phase-equilibrium analyses, the ethanol + CO2 binary was analysed at 40 °C and also compared with literature data. The results of these preliminary analyses showed that the system was able to produce new phase-equilibrium data. The study continued with the application of the methodology to the systems fish viscera oil + carbon dioxide, fish viscera oil + ethanol + carbon dioxide, corn oil + carbon dioxide and corn oil + ethanol + carbon dioxide, at 40, 50, 60, 70, 80, 90 and 110 °C for the systems without ethanol and at 40, 60, 80 and 110 °C for the others. The phase diagrams obtained for all the systems studied were of type IV according to the classification of van Konynenburg and Scott, in agreement with phase diagrams reported in the literature for systems of triglycerides and carbon dioxide, demonstrating the capacity of the developed apparatus and methodology to produce new data.
APA, Harvard, Vancouver, ISO, and other styles
47

Laverde, Albenise. "Os espaços experimentais das escolas públicas de arquitetura do Brasil: realidade ou utopia?" Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/16/16132/tde-18122017-153956/.

Full text
Abstract:
This thesis contributes to the debate on the use of constructive experimentation in the conception of materiality and on its role as a pedagogical resource. More specifically, it addresses experimental workspaces and practices in the Brazilian academic context, aiming to understand how the workspaces of the technical-constructive area established in public schools of architecture are shaped by the particularities of different regional contexts, by educational policies and by the actions of the actors involved, and to identify the essential conditions for these practices to be implemented and strengthened in academia. The work was developed through documentary research and technical visits to 21 public schools of architecture located in different regions of the country, with interviews directed at the main actors currently in charge of the field of construction technology. The work offers theoretical contributions by systematising a comprehensive bibliography on a topic little studied in the national context. As for the practical contributions, the data obtained during the technical visits made it possible to contextualise the challenges faced in teaching construction technology, which are not limited to the physical arrangement of laboratories but extend to a broader dimension covering educational-policy, structural and socio-economic aspects, as well as underlying issues such as interpersonal relations and bureaucracy. These results made it possible to identify the conditions considered essential for (re)formulating strategies for the field of construction technology and its infrastructure, so that existing experiences can be strengthened and new facilities can be implemented with greater technical support, according to contextual particularities.
APA, Harvard, Vancouver, ISO, and other styles
48

Mobasseri, Seyed Omid. "Developing a QFD-based design-integrated structural analysis methodology." Thesis, Brunel University, 2012. http://bura.brunel.ac.uk/handle/2438/7047.

Full text
Abstract:
The design of mechanical components depends greatly on their expected structural performance. In modern design applications this performance is quantified by computer-based analysis and occasionally confirmed by experimental measurements or theoretical calculations. The dependence of a mechanical product on the structural analysis process is even more significant for multi-functional products, which require analyses over a variety of variable input parameters, for various structural responses and against more than one failure or design criterion. Structural analysis is an expert field that requires upfront investment and facilitation to be implemented in a commercial design environment. The product design process, on the other hand, is a systematic and sequential activity that puts the designer in the central role of decision making. A lack of mutual understanding between the two disciplines reduces the efficiency of structural analysis for design. This research aims to develop an integrated methodology that embeds structural analysis in the design process. The proposed methodology combines the benefits of state-of-the-art approaches, early simulation and Validation and Verification practice, towards this aim; its novelty lies in the creative application of the Quality Function Deployment (QFD) method to capture the product's multi-functionality. The resulting QFD-Based Design-Integrated Structural Analysis methodology provides a reliable platform for increasing the efficiency of the structural analysis process for product design. The methodology is examined through an industrial case study of the telescopic cantilever boom as it appears in access platforms and crane products. The findings of the case study give a reliable account of structural performance in the early stages of design and confirm the functionality of the proposed methodology.
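A small Python sketch of the Quality Function Deployment step such a methodology builds on is given below: customer-requirement weights multiplied by a relationship matrix yield importance ratings for the engineering characteristics (here, structural analysis cases). The requirements, characteristics and scores are invented for illustration.

    # QFD importance ratings from requirement weights and a relationship matrix.
    import numpy as np

    requirement_weights = np.array([5, 3, 4])           # e.g. stiffness, weight, fatigue life
    relationship = np.array([[9, 3, 1],                 # rows: customer requirements
                             [1, 9, 3],                 # columns: analysis cases / criteria
                             [3, 1, 9]])                # 9 / 3 / 1 = strong / medium / weak link

    importance = requirement_weights @ relationship     # technical importance of each analysis case
    print(importance)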
APA, Harvard, Vancouver, ISO, and other styles
49

Sitzia, Stefania. "Essays in products complexity, shaping effects and experimental methodology." Thesis, University of East Anglia, 2010. https://ueaeprints.uea.ac.uk/33361/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Parikh, Harshal. "Reservoir characterization using experimental design and response surface methodology." Texas A&M University, 2003. http://hdl.handle.net/1969/480.

Full text
APA, Harvard, Vancouver, ISO, and other styles