
Dissertations / Theses on the topic 'Quality control'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Quality control.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Jin, Ye. "Quality control of phytopharmaceuticals : assessment and quality control of traditional Chinese medicine." Thesis, Liverpool John Moores University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.327675.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bush, Helen Meyers. "Nonparametric multivariate quality control." Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/25571.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

PICCOLI, FLAVIO. "Visual Anomaly Detection For Automatic Quality Control." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2019. http://hdl.handle.net/10281/241219.

Full text
Abstract:
Automatic quality control is one of the key ingredients of the fourth industrial revolution that will lead to the development of the so-called Industry 4.0. In this context, a crucial element is the detection of defects, anomalies or product failures in a time compatible with production. This thesis focuses exactly on this theme: anomaly detection for industrial quality inspection through the analysis of images depicting the product under inspection. This analysis is done with machine learning, and especially with convolutional neural networks (CNNs), a powerful instrument for image analysis. The thesis starts with an extensive study of the subject to introduce the reader and to propose a pipeline for automatic anomaly detection. The pipeline is composed of two steps: 1) enhancement of the input images to highlight defects; 2) detection of the anomalies. The first step is addressed with a global color transformation able to remove undesired lighting effects and to enhance contrast. This transformation is inferred by SpliNet, a new CNN-based method presented here, which enhances the input images by inferring the parameters of a set of splines. For anomaly detection, two methods are presented. The first aims to model normality by learning a dictionary and using it at test time to determine the degree of abnormality of a query image. This method is based on deep learning, which is known to be data-hungry; however, the proposed algorithm also works on very small training sets (on the order of five images). The presented method improves performance by 5% with respect to the state of the art on the SEM-acquired nanofibers dataset, achieving an area under the curve of 97.4%. The second proposed algorithm is a generative method that restores the input, creating an anomaly-free version of the query image. 
This method uses a set of local transforms to restore the input images. Specifically, these transforms are sets of polynomials of degree two, whose parameters are determined by a convolutional neural network. The method can be tuned with a parameter toward accuracy or speed, to match the needs of the final user. To address the lack of data in this field, a new method for data augmentation based on deep learning is presented. It is able to generate thousands of synthesized samples starting from a few real ones, and is thus particularly suitable for augmenting long-tail datasets. The quality of the synthesized samples is demonstrated by the increase in performance of machine learning algorithms trained on the augmented dataset. The method has been employed to enlarge a dataset of defective asphalts; there, the augmented dataset increased the average performance on anomaly segmentation by up to 17.5 percentage points, and by up to 54.5 percentage points for classes of low cardinality. For all the methods presented, effectiveness is shown by comparison with the respective state of the art, which they outperform.
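The dictionary-based abnormality scoring described in this abstract can be illustrated with a minimal sketch. This is not the thesis's actual implementation: the function names, the patch size, and the toy "dictionary" (a random subset of normal patches instead of a learned one) are all illustrative assumptions, using only numpy. The idea is to collect patches from defect-free images as the dictionary, then score a test image by the distance of its worst patch to the nearest atom.

```python
import numpy as np

def extract_patches(img, size=8, stride=8):
    """Collect flattened square patches from a 2-D image array."""
    patches = []
    for r in range(0, img.shape[0] - size + 1, stride):
        for c in range(0, img.shape[1] - size + 1, stride):
            patches.append(img[r:r + size, c:c + size].ravel())
    return np.array(patches, dtype=float)

def learn_dictionary(normal_imgs, n_atoms=32, size=8):
    """Toy 'dictionary': a random subset of patches from normal images
    (a stand-in for a properly learned dictionary)."""
    pool = np.vstack([extract_patches(im, size) for im in normal_imgs])
    rng = np.random.default_rng(0)
    idx = rng.choice(len(pool), size=min(n_atoms, len(pool)), replace=False)
    return pool[idx]

def abnormality_score(img, dictionary, size=8):
    """Max over patches of the distance to the closest dictionary atom."""
    patches = extract_patches(img, size)
    # pairwise distances: (n_patches, n_atoms)
    d = np.linalg.norm(patches[:, None, :] - dictionary[None, :, :], axis=2)
    return d.min(axis=1).max()

# A defect-free texture scores lower than one with a localized anomaly.
rng = np.random.default_rng(1)
normal = [rng.normal(0, 0.1, (32, 32)) for _ in range(5)]  # tiny trainset
clean = rng.normal(0, 0.1, (32, 32))
defective = clean.copy()
defective[8:16, 8:16] += 5.0  # simulated bright defect blob
D = learn_dictionary(normal)
assert abnormality_score(defective, D) > abnormality_score(clean, D)
```

A query image is flagged as anomalous when its score exceeds a threshold calibrated on the normal training images.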
APA, Harvard, Vancouver, ISO, and other styles
4

Sepúlveda, Ariel. "The Minimax control chart for multivariate quality control." Diss., Virginia Tech, 1996. http://hdl.handle.net/10919/30230.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kenerson, Jonathan E. "Quality Assurance and Quality Control Methods for Resin Infusion." Fogler Library, University of Maine, 2010. http://www.library.umaine.edu/theses/pdf/KenersonJE2010.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Crossman, S. H. "Quality control in developing epithelia." Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10044689/.

Full text
Abstract:
In the fruit-fly Drosophila melanogaster, extensive apoptosis is observed throughout the embryonic epidermis upon the mutation of many essential patterning genes. The molecular basis of cell elimination in this context is poorly understood, although previous studies have suggested the existence of a cell-autonomous quality control mechanism, which detects cells unable to adopt an appropriate terminal fate and removes them through apoptosis. This hypothetical system is thought to protect against patterning errors in order to preserve the integrity of the developing epidermis. To identify factors required for apoptosis in mis-patterned cells, I performed a targeted genetic screen, which identified a potential role for the EGFR signalling pathway in this process. Excess EGFR signalling was shown to rescue the cell death phenotype of the archetypal patterning mutant fushi tarazu (ftz), whilst EGFR null alleles triggered extensive epidermal apoptosis. Upon further experimentation, I was able to show that patterning mutant embryos fail to express the major EGFR activating ligands in the correct spatial pattern. This causes local troughs in EGFR signalling, which trigger transcriptional upregulation of the pro-apoptotic gene hid and subsequent cell death. These results argue against a cell-autonomous mechanism of cell elimination in mis-patterned embryos and instead suggest that the tissue-wide landscape of EGFR activity is responsible for coordinating cell fate and cell survival in the embryonic epidermis. Building on these observations, I have been able to show that the EGFR pathway also regulates apoptosis during normal development, where it specifies the maximum dimensions of embryonic segments. Taken together, these findings provide a novel link between early patterning events, cell viability and compartment size in the developing Drosophila embryo.
APA, Harvard, Vancouver, ISO, and other styles
7

Lennard, Nicola S. "Quality control for carotid endarterectomy." Thesis, University of Leicester, 2004. http://hdl.handle.net/2381/29469.

Full text
Abstract:
The aims of this study are to assess whether the introduction of a rigorous quality control method could produce a sustained reduction in the intraoperative stroke rate in this unit, and whether it was feasible and practical to implement such a programme. The second part of the study assesses the incidence of sustained embolisation in the early post-operative period and investigates whether the antiplatelet agent Dextran 40 can help stop this embolisation, potentially preventing carotid artery thrombosis.

A prospective audit of all patients undergoing carotid endarterectomy (CEA) was performed. The ability to monitor intraoperatively with transcranial Doppler (TCD) and perform completion angioscopy was assessed, as was the impact that these quality control techniques had on the surgery. Patients were monitored postoperatively with TCD, and any patient who developed sustained embolisation was commenced on an infusion of Dextran 40.

91% of patients had continuous intraoperative TCD monitoring and 94% underwent successful completion angioscopy; a technical error was identified in 5% of angioscopic assessments. The intraoperative stroke rate was 0% during this study. Postoperative monitoring revealed that 5% of patients develop significant embolisation following CEA, and Dextran 40 appeared to stop this embolisation. The overall 30-day stroke or death rate following CEA fell from 6% prior to 1992 to 2.2% in 1998.

It is possible to implement a quality control programme for CEA, and this has been associated with a fall in the overall 30-day death and any-stroke rate.
APA, Harvard, Vancouver, ISO, and other styles
8

Hughes, Anthony. "Quality control in radionuclide imaging." Thesis, University of Aberdeen, 1990. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU601994.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Gordon, Kara Leigh. "TorsinA and protein quality control." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/2708.

Full text
Abstract:
DYT1 dystonia (DYT1) is a disabling inherited neurological disorder with juvenile onset. The genetic mutation in DYT1 leads to the deletion of a glutamic acid (E) residue in the protein torsinA. The function of torsinA and how the mutation leads to DYT1 are poorly understood. We hypothesize that how efficiently the disease-linked mutant protein is cleared may be critical for DYT1 pathogenesis. Therefore we explored mechanisms of torsinA catabolism, employing biochemical, cellular, and animal-based approaches. We asked if torsinA(wt) and torsinA(DE) are degraded preferentially through different catabolic mechanisms, specifically the ubiquitin proteasome pathway (UPP) and autophagy. We determined that torsinA(wt) is cleared by autophagy while torsinA(DE) is efficiently degraded by the UPP, suggesting degradation processes can modulate torsinA(DE) levels. Proteins implicated in recognizing motifs on torsinA(DE) for targeting to the UPP represent candidate proteins that may modify DYT1 pathogenesis. We examined how removal of the hydrophobic domain and mutation of glycosylated asparagine residues on torsinA altered stability and catabolic mechanism. We found that the glycosylation sites on torsinA are important for stability and modulate its degradation through the UPP. F-box G-domain protein 1 (FBG1) has been implicated in degradation of glycosylated ER proteins. We hypothesized that FBG1 would promote torsinA degradation and demonstrated that FBG1 modulates levels of torsinA in a non-canonical manner through the UPP and autophagy. We examined if lack of FBG1 in a torsinA(DE) mouse model altered motor phenotypes. We saw no effect, which suggests FBG1 does not alter DYT1 pathogenesis despite its promotion of torsinA(DE) degradation. In addition, we explored a potential mechanism for the previously described role of torsinA in modulating cytoplasmic protein aggregation. 
We hypothesized this endoplasmic reticulum (ER) resident protein would indirectly alter cytoplasmic protein aggregation through modulation of ER stress. We employed a poly-glutamine expanded repeat protein and pharmacological ER stressors to determine that torsinA does not alter poly-glutamine protein aggregation nor ER stress in a mammalian system. In summary, this thesis suggests proteins involved in the catabolism of torsinA(DE) may modify DYT1 pathogenesis and that torsinA and its DYT1-linked mutant are model proteins for investigating ER protein degradation by the UPP and autophagy.
APA, Harvard, Vancouver, ISO, and other styles
10

Cassady, Charles Richard. "Statistical quality control techniques using multilevel discrete product quality measures." Diss., This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-06062008-151120/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Wändell, Johan. "Multistage gearboxes : vibration based quality control." Licentiate thesis, KTH, Aeronautical and Vehicle Engineering, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3987.

Full text
Abstract:

In this thesis, vibration based techniques for detection of localised surface damages in multistage gearboxes are presented and evaluated.

A modern vehicle gearbox is a complex system and the number of potential errors is large. For instance, surface damages can be caused by rough handling during assembly. Large savings can be made in the production industry by assuring the quality of products such as gearboxes. An automated quality test as a final step in the production line is one way to achieve this.

A brief review of available methods for vibration based condition monitoring of gearboxes is given in the opening summary. In the appended papers, a selection of these methods is used to design signal processing procedures for detection of localised surface damages in gearboxes. The procedures include the Synchronous signal averaging technique (SSAT), residual calculation, filtering with a prediction error filter (PEF) based on an AR-model and the use of crest factor and kurtosis as state features. The procedures are fully automatic and require no manual input during calibration or testing. This makes them easy to adapt to new test objects.

A numerical model, generating simulated gearbox vibration signals, is used to systematically evaluate the proposed procedures. The model originates from an existing model which is extended to include contributions from several gear stages as well as measurement noise. This enables simulation of difficulties likely to arise in quality testing such as varying background noise and modulation due to test rig misalignment. Without the numerical model, the evaluation would require extensive measurements. The numerical model is experimentally validated by comparing the simulated vibration signals to signals measured on a real gearbox.

In the experimental part of the study, vibration data is collected with accelerometers while the gearbox is running in an industrial test rig. In addition to the healthy condition, conditions including three different surface damage sizes are also considered.

The numerical and the experimental analysis show that the presented procedures are able to detect localised surface damages at an early stage. Previous studies of similar procedures have focused on gear crack detection and overall condition monitoring. The procedures can handle varying background noise and reasonable modulation changes due to misalignment.

The results show that the choice of sensor position and operating conditions during measurements has a significant impact on the efficiency of the fault detection procedures. A localised surface damage excites resonances in the transfer path between the gear mesh and the accelerometer. These resonances amplify the defect signal. The results indicate that it is favourable to choose a speed at which the resonant defect signals are well separated from the gear meshing harmonics in the order domain. This knowledge is of great importance when it comes to quality testing. When a quality test procedure is being developed, it is often possible to choose the operating conditions and sensor positions. It can in fact be more important to choose proper operating conditions than to apply an optimal signal processing procedure.
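The crest factor and kurtosis mentioned above as state features are standard signal statistics. The sketch below is a hedged illustration, not the thesis's code, and the simulated "healthy" and "faulty" signals are assumptions; it shows why a localised surface damage, which adds short periodic impacts to the vibration signal, raises both features.

```python
import numpy as np

def crest_factor(x):
    """Peak amplitude over RMS; rises when short transients appear."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    return np.max(np.abs(x)) / rms

def kurtosis(x):
    """Fourth standardized moment; about 3.0 for a Gaussian signal,
    larger when the signal contains impulsive spikes."""
    x = np.asarray(x, dtype=float)
    m = x - x.mean()
    return np.mean(m ** 4) / np.mean(m ** 2) ** 2

rng = np.random.default_rng(0)
# Healthy signal: gear-mesh-like tone plus measurement noise.
healthy = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * rng.normal(size=4000)
# Faulty signal: the same, plus one impact per shaft revolution.
faulty = healthy.copy()
faulty[::400] += 5.0  # periodic impacts from a localized surface damage
assert crest_factor(faulty) > crest_factor(healthy)
assert kurtosis(faulty) > kurtosis(healthy)
```

In a detection procedure like the one summarised above, these features would be computed not on the raw signal but on the residual after synchronous averaging and prediction-error filtering, which suppresses the regular gear meshing components and leaves the impacts.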

APA, Harvard, Vancouver, ISO, and other styles
12

Moffitt, Richard Austin. "Quality control for translational biomedical informatics." Diss., Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/34721.

Full text
Abstract:
Translational biomedical informatics is the application of computational methods to facilitate the translation of basic biomedical science to clinical relevance. An example of this is the multi-step process in which large-scale microarray-based discovery experiments are refined into reliable clinical tests. Unfortunately, the quality of microarray data is a major issue that must be addressed before microarrays can reach their full potential as a clinical molecular profiling tool for personalized and predictive medicine. A new methodology, titled caCORRECT, has been developed to replace or augment existing microarray processing technologies, in order to improve the translation of microarray data to clinical relevance. Results of validation studies show that caCORRECT is able to improve the mean accuracy of microarray gene expression by as much as 60%, depending on the magnitude and size of artifacts on the array surface. As part of a case study to demonstrate the widespread usefulness of caCORRECT, the entire pipeline of biomarker discovery has been executed for the clinical problem of classifying Renal Cell Carcinoma (RCC) specimens into appropriate subtypes. As a result, we have discovered and validated a novel two-gene RT-PCR assay, which has the ability to diagnose between the Clear Cell and Oncocytoma RCC subtypes with near perfect accuracy. As an extension to this work, progress has been made towards a quantitative quantum dot immunohistochemical assay, which is expected to be more clinically viable than a PCR-based test.
APA, Harvard, Vancouver, ISO, and other styles
13

Ghoudi, Kilani. "Multivariate non-parametric quality control statistics." Thesis, University of Ottawa (Canada), 1990. http://hdl.handle.net/10393/5658.

Full text
Abstract:
During the startup phase of a production process, while statistics on the product quality are being collected, it is useful to establish that the process is under control. Small samples $\{n_i\}_{i=1}^{q}$ are taken periodically for $q$ periods. We shall assume each measurement is bivariate. A process is under control, or on-target, if all the observations are deemed to be independent and identically distributed and, moreover, the distribution of each observation is a product distribution. This would be the case if each coordinate of an observation is a nominal value plus noise. Let $F^i$ represent the empirical distribution function of the $i$-th sample. Let $\overline{F}$ represent the empirical distribution function of all observations. Following Lehmann (1951) we propose statistics of the form $$\sum_{i=1}^{q}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\bigl(F^i(s,t) - \overline{F}(s)\overline{F}(t)\bigr)^2\, d\overline{F}(s,t)\eqno(1)$$ The emphasis there, however, is on the case where $n_i \to \infty$ while $q$ stays fixed. Here we study the following family of statistics $$S_q = \sum_{i=1}^{q}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}k_q\bigl(n, i, F^i(s,t), \overline{F}(s)\overline{F}(t)\bigr)\, n_i\, dF^i(s,t)\eqno(2)$$ in the above quality control situation, where $q\to\infty$ while $n_i$ stays fixed. (Abstract shortened by UMI.)
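Statistic (1) can be evaluated numerically: the integral against $d\overline{F}(s,t)$ amounts to averaging over the pooled observation points. The sketch below is a hedged numpy illustration (function names and the simulation setup are assumptions, not the thesis's code) of the squared gap between each period's bivariate ECDF and the product of the pooled marginal ECDFs.

```python
import numpy as np

def ecdf2(sample, pts):
    """Bivariate empirical CDF of `sample` evaluated at points `pts`."""
    # F(s, t) = proportion of sample points with x <= s and y <= t
    le = (sample[:, None, 0] <= pts[None, :, 0]) & \
         (sample[:, None, 1] <= pts[None, :, 1])
    return le.mean(axis=0)

def lehmann_statistic(samples):
    """Statistic (1): sum over periods of the squared gap between the
    period ECDF F^i and the product of pooled marginal ECDFs, averaged
    over the pooled observations (the d F-bar(s,t) integration)."""
    pooled = np.vstack([np.asarray(s, dtype=float) for s in samples])
    sx = np.sort(pooled[:, 0])
    sy = np.sort(pooled[:, 1])
    # pooled marginal ECDFs evaluated at the pooled points
    Gs = np.searchsorted(sx, pooled[:, 0], side='right') / len(pooled)
    Gt = np.searchsorted(sy, pooled[:, 1], side='right') / len(pooled)
    total = 0.0
    for sample in samples:
        Fi = ecdf2(np.asarray(sample, dtype=float), pooled)
        total += np.mean((Fi - Gs * Gt) ** 2)
    return total

# q = 20 periods of n_i = 5 bivariate observations, on-target
# (independent coordinates):
rng = np.random.default_rng(0)
on_target = [rng.normal(size=(5, 2)) for _ in range(20)]
s = lehmann_statistic(on_target)
assert 0.0 <= s <= 20.0  # each period contributes a mean of squares <= 1
```

In the on-target case the statistic stays small relative to its null distribution; large values signal either dependence between the coordinates or drift across periods.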
APA, Harvard, Vancouver, ISO, and other styles
14

Wändell, Johan. "Multistage gearboxes : vibration based quality control /." Stockholm : Royal Institute of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3987.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Smith, Kaleigh. "Towards quality control in DNA microarrays." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79129.

Full text
Abstract:
We present a framework for detecting degenerate probes in a DNA microarray that may add to measurement error in hybridization experiments. We consider four types of behaviour: secondary structure formation, self-dimerization, cross-hybridization and dimerization. The framework uses a well-established model of nucleic acid sequence hybridization and a novel method for the detection of patterns in hybridization experiment data. Our primary result is the identification of unique patterns in hybridization experiment data that are correlated with each type of degenerate probe behaviour. The framework also contains a machine learning technique to learn from the hybridization experiment data. We implement the components of the framework and evaluate the ability of the framework to detect degenerate probes in the Affymetrix HuGeneFL GeneChip.
APA, Harvard, Vancouver, ISO, and other styles
16

Ng, Gary K. L. "Quality control in laser percussion drilling." Thesis, University of Manchester, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.488033.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Wort, Ralph George. "Integrated information system for quality control." Thesis, University of the West of England, Bristol, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.283909.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Mehenni, B. "Fast visual inspection for quality control." Thesis, University of Nottingham, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.328421.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Robinson, David Charles. "Computer based on-line quality control." Thesis, University of Portsmouth, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292316.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Safronova, D. "Biological method for water quality control." Thesis, Видавництво СумДУ, 2012. http://essuir.sumdu.edu.ua/handle/123456789/26726.

Full text
Abstract:
Saint Petersburg is located on the Neva river, and almost all drinking water for its 5 million inhabitants is taken from it. Regarding this centralised water supply source for Saint Petersburg, one can note many negative factors influencing the water quality; agricultural pollution, transport and economic problems are surely among them.
APA, Harvard, Vancouver, ISO, and other styles
21

Tshenye, Thapelo Obed. "Quality control of astronomical CCD observations." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/4409.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Alexander, Adam Ross Washer Glenn A. "Guideline for implementing quality control and quality assurance for bridge inspection." Diss., Columbia, Mo. : University of Missouri--Columbia, 2009. http://hdl.handle.net/10355/6560.

Full text
Abstract:
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on October 13, 2009). Thesis advisor: Dr. Glenn Washer. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
23

Reque, Calisaya Ramiro Benjamin. "Total quality control - TQC: controle total de qualidade na area de serviços." reponame:Repositório Institucional do FGV, 1993. http://hdl.handle.net/10438/4665.

Full text
Abstract:
This is a theoretical analysis of the modern concepts of quality control (QC), as reflected in Total Quality Control (TQC), applied to the service areas of industrial companies and to service-providing companies.
APA, Harvard, Vancouver, ISO, and other styles
24

Banús, Paradell Núria. "New solutions to control robotic environments: quality control in food packaging." Doctoral thesis, Universitat de Girona, 2021. http://hdl.handle.net/10803/673469.

Full text
Abstract:
Machine vision systems and artificial intelligence techniques are two active research areas in the context of Industry 4.0. Their combination allows the reproduction of human procedures while improving the performance of the processes. However, to achieve the desired full automation, there is a need for new applications able to cover as many industrial scenarios and processes as possible. One of the areas that needs further research and development is the quality control of food packaging, and more specifically the closure and sealing control of thermoformed packages. The shortcomings in this area were detected by TAVIL who, in collaboration with GILAB, proposed an Industrial Doctorate to investigate, develop and integrate in real scenarios new methods to improve the packaging stage of the food industry using machine vision systems and artificial intelligence techniques. In the context of this Industrial Doctorate, two lines of research were defined, differing in the level at which the problem is studied: the first focused on the quality control of food packages, and the second on the efficient management of machine vision systems in industrial scenarios.
Doctoral Programme in Technology
APA, Harvard, Vancouver, ISO, and other styles
25

Bae, Suk Joo. "Analysis of dynamic robust design experiment and modeling approach for degradation testing." Diss., Available online, Georgia Institute of Technology, 2004:, 2003. http://etd.gatech.edu/theses/available/etd-04052004-180010/unrestricted/bae%5Fsuk%5Fj%5F2003%5Fphd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Abrahamsson, Petter. "User Interface Design for Quality Control : Development of a user interface for quality control of industrial manufactured parts." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-79724.

Full text
Abstract:
The expected quality on manufactured components in the automotive industry is high, often with an accuracy of tenths of a millimeter. The conventional methods used to ensure the manufactured components are very accurate, but they are both time consuming and insufficient and only a small part of the produced series are analyzed today. The measurement is performed manually in so-called measurement fixtures. Where each component is fixed and predetermined points of investigation are controlled with a dial indicator. These fixtures are very expensive to manufacture and they are only compatible with one specific kind of component. Nowadays, great volumes of material are scrapped from these procedures in the automotive industry. Hence, there is a great need to increase the amount of controlled components without affecting the production rate negatively. This project was carried out for the relatively new company Viospatia, which is a spin-off company based on research from Luleå University of Technology. They have developed a system that automatically measures each component directly at the production line with the use of photogrammetry technology. This makes it possible to discover erroneous components almost immediately and the manufacturer gets a more distinct view of their production and its capability. The aim of this thesis has been to investigate how a user interface should be developed to be as user-friendly as possible without limiting the system’s functions. The objective has been to design a proposal of a user interface adapted for the intended user, creating value and is easy to use. The progression has been structured around a human-centered approach expedient for interaction design, where the developing phase, containing analyze, design and validate, is performed through iterations with continuous feedback from users and the project’s employer. 
The context in which the intended solution is supposed to be used was investigated through interviews and observations at the involved companies. Three factories were involved in the project: Gestamp Hardtech and Scania Ferruform in Luleå, and Volvo Cars in Olofström. These factories use similar production methods (sheet-metal stamping), so their prerequisites and needs are similar for this type of quality control system. Creative methods were applied throughout the project to generate as many ideas as possible while trying to satisfy all the important aspects. Initially, analog prototypes were created, but they soon developed into digital interactive prototypes. A larger usability test was conducted with seven participants using a web link to the digital prototype. With the support of the feedback these tests generated, some adjustments were made and the final user interface was designed, separated into two levels: Supervisor and Operator. Through an extensive literature study and user testing it became clear that the operator needs to get an unmistakable message from the user interface: there should not be any doubts whatsoever, and the operator should be able to react immediately. This message is delivered through colors that have an established meaning. By identifying the needs of the different actors, the system's functions can be separated and made accessible only to the intended user. The functions can then be developed specifically for the intended user instead of being modified into a compromise that fits everybody. This separation of functions is not something the user has to do actively; it is performed automatically by the user interface when the user signs in.
APA, Harvard, Vancouver, ISO, and other styles
27

Wad, Charudatta V. "QoS : quality driven data abstraction for large databases." Worcester, Mass. : Worcester Polytechnic Institute, 2008. http://www.wpi.edu/Pubs/ETD/Available/etd-020508-151213/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

McCaskey, Suzanne D. "Robust design of dynamic systems." Thesis, Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/24223.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Awwad, Tarek. "Context-aware worker selection for efficient quality control in crowdsourcing." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEI099/document.

Full text
Abstract:
Crowdsourcing has proved its ability to address large-scale data collection tasks at a low cost and in a short time. However, due to the dependence on unknown workers, the quality of the crowdsourcing process is questionable and must be controlled. Indeed, maintaining the efficiency of crowdsourcing requires the time and cost overhead related to this quality control to stay low. Current quality control techniques suffer from high time and budget overheads and from their dependency on prior knowledge about individual workers. In this thesis, we address these limitations by proposing the CAWS (Context-Aware Worker Selection) method, which operates in two phases: in an offline phase, the correlations between the worker declarative profiles and the task types are learned. Then, in an online phase, the learned profile models are used to select the most reliable online workers for the incoming tasks depending on their types. Using declarative profiles helps eliminate any probing process, which reduces the time and the budget while maintaining the crowdsourcing quality. In order to evaluate CAWS, we introduce an information-rich dataset called CrowdED (Crowdsourcing Evaluation Dataset). The generation of CrowdED relies on a constrained sampling approach that allows producing a dataset which respects the requester's budget and type constraints. Through its generality and richness, CrowdED also helps plug the benchmarking gap present in the crowdsourcing community. Using CrowdED, we evaluate the performance of CAWS in terms of quality, time, and budget gain. Results show that automatic grouping is able to achieve a learning quality similar to job-based grouping, and that CAWS is able to outperform the state-of-the-art profile-based worker selection when it comes to quality, especially when strong budget and time constraints exist.
Finally, we propose CREX (CReate Enrich eXtend), which provides the tools to select and sample input tasks and to automatically generate custom crowdsourcing campaign sites in order to extend and enrich CrowdED.
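The two-phase idea described in this abstract (offline task clustering with a learned reference profile per cluster, then online ranking of workers by similarity to that profile) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the clustering stand-in, keyword vectors, and worker names are all invented.

```python
# A minimal sketch of the two-phase CAWS idea (illustrative names and data,
# not the thesis code).
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse keyword-count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# --- Offline phase: group historical tasks and learn one reference profile
# (the declarative profile that answered each cluster's tasks best).
history = {
    "image": {"profile": Counter(photography=3, art=1)},
    "text":  {"profile": Counter(linguistics=2, writing=2)},
}

def task_cluster(task_keywords):
    # Stand-in for the clustering step: route by dominant keyword.
    return "image" if "photo" in task_keywords else "text"

# --- Online phase: pick the available workers whose declarative profile is
# most similar to the reference profile of the incoming task's cluster.
def select_workers(task_keywords, online_workers, k=1):
    ref = history[task_cluster(task_keywords)]["profile"]
    ranked = sorted(online_workers.items(),
                    key=lambda w: cosine(w[1], ref), reverse=True)
    return [name for name, _ in ranked[:k]]

workers = {
    "alice": Counter(photography=2, travel=1),
    "bob":   Counter(writing=3),
}
print(select_workers({"photo", "label"}, workers))  # → ['alice']
```

Because the worker profiles are declarative (self-reported), no probing tasks are needed before selection, which is the source of the time and budget savings claimed above.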
APA, Harvard, Vancouver, ISO, and other styles
30

Clarke-Pringle, Tracy Lee. "Product quality control in reduced dimension spaces." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0034/NQ66260.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Matos, Cristina Filipa Rodrigues de Oliveira. "Tat-mediated quality control in Escherichia coli." Thesis, University of Warwick, 2010. http://wrap.warwick.ac.uk/3769/.

Full text
Abstract:
The E. coli Twin-arginine translocation (Tat) pathway transports a subset of proteins from the cytosol, across the inner membrane, to the periplasmic space. One of the unique features of this pathway is its ability to transport fully folded passenger proteins. These passenger proteins often contain redox co-factors, such as the iron-sulphur (FeS) proteins. This feature of the pathway suggests that any quality control of passenger proteins must occur prior to export. In this study the question of Tat pathway quality control is addressed. Initial studies (Chapter 3) addressed the degree to which the Tat pathway would tolerate the misfolding of its passenger proteins. To this end, mutant forms of the FeS proteins NrfC and NapG were generated with incremental impairment of FeS cluster formation. Expression of these mutants in E. coli revealed that the Tat system completely blocked the export of NrfC when even one of its four FeS centres was mutagenised. Furthermore, the rejected passenger proteins were rapidly degraded in a Tat-dependent manner. Dissection of the components involved in this process led to the discovery that TatA/E are essential for the degradation (Chapter 3). Furthermore, the previously neglected subunit TatD also plays a central role in Tat-mediated quality control and degradation (Chapter 4). Interestingly, the data presented here demonstrate that this quality control of Tat passengers also extends to non-mutated and non-cofactor-containing proteins that are not exported in a timely manner. Investigations into the mechanism of cytosolic degradation of rejected passenger proteins led to the discovery that the ClpAPS system is involved. Interestingly, in the case of FeS proteins, ClpP is not responsible for proteolysis, yet ClpS and ClpA are required. However, the degradation of the non-cofactor-containing passenger protein FhuD is dependent on the entire ClpAPS system (Chapter 5).
APA, Harvard, Vancouver, ISO, and other styles
32

Ratnam, Edward. "Indoor air quality simulation and feedback control." Thesis, Glasgow Caledonian University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.388935.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Nuttall, James. "Protein quality control mechanisms in plant cells." Thesis, University of Warwick, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.399507.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Watts, Kate. "A study of quality control during galvannealing." Thesis, Swansea University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.307558.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Heath, Michael Lindsey. "Quality control improvement in global apparel sourcing." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104309.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: S.M. in Engineering Systems, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2016. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 49-50).
This project addressed challenges within the quality management process of one of the operating groups of Li & Fung. The primary goals were improved product quality and reduced quality control costs. The operating group works with thousands of factories across the world, producing a large variety of apparel and textile products. The industry trend of fast fashion, with small order sizes and shorter lead times, has placed a considerable burden on the limited quality control resources. Understanding the current state of the quality management process was the first project step, and this was accomplished through factory visits and interviews with workers. The current inspection process was designed for large orders and performs sub-optimally with smaller orders. Second, the project took a broad view of the supplier base, performing statistical analysis of inspection and factory data. This revealed problems with the process that lead to high inspection costs and inaccurate inspection results. Next, the project identified technological solutions and process improvements to address the root causes of these issues and to increase the accuracy and efficiency of inspectors. Three specific technology solutions were developed: measurement digitization, label scanners, and improved management metrics. Each solution was prototyped and the critical functionality was tested to demonstrate the value of implementation. Business analysis of the solutions revealed time savings of 60,000 inspector hours/year and cost savings of more than $1 million. At the conclusion of the project, integration of the solutions within the current inspection mobile app was ongoing and expected to be rolled out across the quality organization in the first half of 2016. Finally, recommendations beyond the scope of the technology solutions are provided for further improvement of the quality management process.
by Michael Lindsey Heath.
M.B.A.
S.M. in Engineering Systems
APA, Harvard, Vancouver, ISO, and other styles
36

Karamancı, Kaan. "Exploratory data analysis for preemptive quality control." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/53126.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009.
Includes bibliographical references (p. 113).
In this thesis, I proposed and implemented a methodology to perform preemptive quality control on low-tech industrial processes with abundant process data. This involves a four-stage process: understanding the process, interpreting and linking the available process-parameter and quality control data, developing an exploratory data toolset, and presenting the findings in a visual and easily implementable fashion. In particular, the exploratory data techniques used rely on visual human pattern recognition through data projection, and on machine learning techniques for clustering. The presentation of findings is achieved via software that visualizes high-dimensional data with Chernoff faces. Performance was tested on both simulated and real industry data. The data obtained from a company were not suitable, so suggestions on how to collect suitable data were given.
by Kaan Karamancı.
M.Eng.
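The projection-and-clustering step this abstract describes can be illustrated with a toy example: process-parameter records are projected onto a single axis and clustered so that an abnormal run stands out as a small, distant cluster. The data, the projection (a simple sum of two parameters), and the two-cluster k-means are all invented stand-ins for the thesis's actual techniques.

```python
# Toy sketch of preemptive QC by projection + clustering (illustrative only).
def kmeans_1d(xs, iters=20):
    """Tiny two-cluster 1-D k-means with deterministic extreme-point init."""
    centers = [min(xs), max(xs)]
    labels = [0] * len(xs)
    for _ in range(iters):
        labels = [0 if abs(x - centers[0]) <= abs(x - centers[1]) else 1
                  for x in xs]
        for j in (0, 1):
            members = [x for x, l in zip(xs, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return centers, labels

# Project two process parameters onto one axis (here simply their sum),
# then look for a small cluster far from the bulk: a candidate defect group.
records = [(10.1, 5.0), (9.9, 5.1), (10.0, 4.9), (14.8, 8.9)]
proj = [a + b for a, b in records]
centers, labels = kmeans_1d(proj)
outlier_cluster = min(range(2), key=lambda j: labels.count(j))
print([i for i, l in enumerate(labels) if l == outlier_cluster])  # → [3]
```

In the thesis the same idea is carried further: the clusters are inspected visually (e.g. via Chernoff faces) rather than flagged by a fixed rule.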
APA, Harvard, Vancouver, ISO, and other styles
37

Thoutou, Sayi Mbani. "Quality control charts under random fuzzy measurements." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/19140.

Full text
Abstract:
Includes bibliographical references.
We consider statistical process control charts as tools that statistical process control uses for monitoring changes and identifying process variations and their causes in industrial (manufacturing) processes, and which help manufacturers take the appropriate action, rectify problems, or improve manufacturing processes so as to produce good-quality products. As an essential tool, process control charts have always received attention from researchers, and the sample sizes required for establishing control charts are often under discussion, depending on the field of study. Of late, the problem of fuzziness and randomness often brought into modern manufacturing processes by shortening product life cycles and diversification (in product designs, raw material supply, etc.) has compelled researchers to invoke quality control methodologies in their search for high customer satisfaction and better market shares (Guo et al., 2006). We herein focus our attention on small sample sizes and on the development of quality control charts in terms of the economic design of quality control charts, based on credibility measure theory under random fuzzy measurements and small-sample asymptotic distribution theory. Economic process data from the study of Duncan (1956) will be used as an illustrative example of these new developments.
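For orientation, the classical Shewhart c-chart is the crisp baseline that economic designs in the tradition of Duncan (1956) optimise over cost parameters. The sketch below computes standard 3-sigma c-chart limits from baseline defect counts; it is a textbook illustration, not the thesis's fuzzy/credibilistic construction, and the data are invented.

```python
# Classical c-chart limits (3-sigma, Poisson counts) as the crisp baseline
# that economic and random-fuzzy designs extend. Illustrative data.
from math import sqrt

def c_chart_limits(counts):
    """Shewhart c-chart: centre line and 3-sigma limits for defect counts."""
    c_bar = sum(counts) / len(counts)
    ucl = c_bar + 3 * sqrt(c_bar)
    lcl = max(0.0, c_bar - 3 * sqrt(c_bar))
    return lcl, c_bar, ucl

baseline = [4, 5, 3, 6, 4, 5, 3, 4, 5, 4, 6, 3, 5, 4, 5, 3, 4, 6, 5, 4]
lcl, cl, ucl = c_chart_limits(baseline)
print(round(cl, 2), round(ucl, 2))  # → 4.4 10.69

# A new sample with 14 defects would signal an out-of-control condition:
print(14 > ucl)  # → True
```

An economic design replaces the fixed 3-sigma multiplier, sample size, and sampling interval with values chosen to minimise expected cost per unit time.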
APA, Harvard, Vancouver, ISO, and other styles
38

Kim, Sang Ik. "Contributions to experimental design for quality control." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53551.

Full text
Abstract:
A parameter design introduced by Taguchi provides a new quality control method which can cost-effectively reduce the product variation due to various uncontrollable noise factors such as product deterioration, manufacturing imperfections, and environmental factors under which a product is actually used. This experimental design technique identifies the optimal setting of the control factors which is least sensitive to the noise factors. Taguchi’s method utilizes orthogonal arrays, which allow the investigation of main effects only, under the assumption that interaction effects are negligible. In this paper new techniques are developed to investigate two-factor interactions for 2^t and 3^t factorial parameter designs. The major objective is to be able to identify influential two-factor interactions and take those into account in properly assessing the optimal setting of the control factors. For 2^t factorial parameter designs, we develop some new designs for the control factors by using a partially balanced array. These designs are characterized by a small number of runs and a balancedness property of the variance-covariance matrix of the estimates of main effects and two-factor interactions. Methods of analyzing the new designs are also developed. For 3^t factorial parameter designs, a detection procedure consisting of two stages is developed using a sequential method in order to reduce the number of runs needed to detect influential two-factor interactions. In this paper, an extension of the parameter design to several quality characteristics is also developed by devising suitable statistics to be analyzed, depending on whether a proper loss function can be specified or not.
Ph. D.
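The core of a Taguchi parameter design is comparing control-factor settings by a signal-to-noise (S/N) ratio computed across noise replicates. The sketch below uses the textbook "larger-the-better" S/N statistic on invented data; the setting labels and responses are hypothetical and the thesis's contribution (interaction detection) is not shown.

```python
# Textbook "larger-the-better" Taguchi S/N ratio on invented data.
from math import log10

def sn_larger_the_better(ys):
    """SN = -10 * log10(mean(1/y^2)); higher means more robust to noise."""
    return -10 * log10(sum(1 / y**2 for y in ys) / len(ys))

# Each row: responses of one control-factor setting under several noise runs.
settings = {
    "A1B1": [20.0, 21.0, 19.5],
    "A1B2": [25.0, 8.0, 30.0],   # sometimes high, but collapses under noise
}
best = max(settings, key=lambda s: sn_larger_the_better(settings[s]))
print(best)  # → A1B1
```

The S/N ratio penalises the occasional low response of A1B2, so the steadier setting wins: exactly the "least sensitive to noise factors" criterion described in the abstract.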
APA, Harvard, Vancouver, ISO, and other styles
39

Yang, Yi. "Injection molding control : from process to quality /." View abstract or full-text, 2004. http://library.ust.hk/cgi/db/thesis.pl?CENG%202004%20YANG.

Full text
Abstract:
Thesis (Ph. D.)--Hong Kong University of Science and Technology, 2004.
Includes bibliographical references (leaves 218-244). Also available in electronic version. Access restricted to campus users.
APA, Harvard, Vancouver, ISO, and other styles
40

Wang, Wei. "WebRTC Quality Control in Contextual Communication Systems." Thesis, KTH, Radio Systems Laboratory (RS Lab), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232704.

Full text
Abstract:
Audio and video communication is a universal task with a long history of technologies. Recent examples of these technologies include Skype video calling, Apple's FaceTime, and Google Hangouts. Today, these services offer everyday users the ability to have an interactive conference with both audio and video streams. However, many of these solutions depend on extra plugins or applications installed on the user's personal computer or mobile device. Some of them are also subject to licensing, introducing a huge barrier for developers and restraining new companies from entering this area. The aim of Web Real-Time Communications (WebRTC) is to provide direct access to multimedia streams in the browser, thus making it possible to create rich media applications using web technology without the need for plugins or for developers to pay technology license fees. Ericsson develops communication solutions targeting professional and business users. With the increasing possibilities to gather data (via cloud-based applications) about the quality experienced by users in their video conferences, new demands are placed on the infrastructure to handle this data. Additionally, there is the question of how these stats should be utilized to automatically control the quality of service (QoS) in WebRTC communication systems. The thesis project deployed a WebRTC quality control service with methods of data processing and modeling to assess the perceived video quality of the ongoing session, and further to produce appropriate actions to remedy poor quality. Lastly, after evaluation on the Ericsson contextual test platform, the project verified that two of the stats parameters for assessing QoS (network delay and packet loss percentage) have a negative effect on the perceived video quality, though to different degrees.
Moreover, the available bandwidth turned out to be an important factor, which should be added as an additional stats parameter to improve the performance of a WebRTC quality control service.
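The assess-then-remedy loop this abstract describes can be sketched as a function of the two stats parameters it validates (network delay and packet loss). The thresholds, penalty weights, and remedy actions below are invented for illustration; the thesis builds its model from measured data, not these constants.

```python
# Sketch of mapping two getStats-style indicators to a quality action.
# The weights and thresholds are invented, not from the thesis.
def assess(delay_ms, packet_loss_pct):
    """Map network delay and packet loss to a coarse MOS-like score in [1, 5]."""
    score = 5.0
    score -= min(2.0, delay_ms / 200.0)        # delay penalty, capped
    score -= min(3.0, packet_loss_pct * 0.4)   # loss hurts more per unit
    return max(1.0, score)

def remedy(score):
    """Pick a corrective action for the session based on the score."""
    if score >= 4.0:
        return "keep current encoding"
    if score >= 2.5:
        return "lower video resolution"
    return "drop video, audio only"

print(remedy(assess(delay_ms=80, packet_loss_pct=0.5)))   # → keep current encoding
print(remedy(assess(delay_ms=400, packet_loss_pct=6.0)))  # → drop video, audio only
```

As the abstract notes, available bandwidth would be a natural third input to `assess` in a fuller model.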
APA, Harvard, Vancouver, ISO, and other styles
41

He, Baosheng. "New Bayesian methods for quality control applications." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6133.

Full text
Abstract:
In quality control applications, the most basic tasks are monitoring and fault diagnosis. Monitoring results determine if diagnosis is required and, conversely, diagnostic results aid better monitoring design. Quality monitoring and fault diagnosis are closely related but also differ significantly: essentially, monitoring focuses on online changepoint detection, whilst the primary objective of diagnosis is to identify fault root causes as an offline method. Several critical problems arise in the research of quality control: firstly, whether process monitoring is able to distinguish systematic or assignable faults from occasional deviations; secondly, how to diagnose faults with coupled root causes in complex manufacturing systems; thirdly, whether the changepoint and root causes of faults can be diagnosed simultaneously. In Chapter 2, we propose a novel Bayesian statistical process control method for count data in the presence of outliers. That is, we discuss how to discern out-of-control status from temporary abnormal process behaviors in practice, which current SPC methodologies are incapable of. In this work, process states are modeled as latent variables and inferred by a sequential Monte Carlo method. The idea of Rao-Blackwellization is employed in the approach to control detection error and computational cost. Another contribution of this work is that our method possesses self-starting characteristics, which makes it a more robust SPC tool for discrete data. Sensitivity analysis on monitoring parameter settings is also implemented to provide practical guidelines. In Chapter 3, we study the diagnosis of dimensional faults in manufacturing. A novel Bayesian variable-selection-oriented diagnostic framework is proposed.
Dimensional fault sources are not explicitly measurable; instead, they are connected with dimensional measurements by a generalized linear mixed-effect model, based on which we further construct a hierarchical quality-fault model to conduct Bayesian inference. A reversible-jump Markov chain Monte Carlo algorithm is developed to estimate the approximate posterior probability of fault patterns. This diagnostic procedure is superior to previous studies since no numeric regularization is required for decision making. The proposed Bayesian diagnosis can further lean towards sparse fault patterns by choosing suitable priors, in order to handle the challenge posed by the diagnosability of faults. Our work considers diagnosability in building dimensional diagnostic methodologies, and we explain that the diagnostic result is trustworthy for most manufacturing systems in practice. A convergence analysis is also implemented, considering the trans-dimensional nature of the diagnostic method. In Chapter 4 of the thesis, we consider the diagnosis of multivariate linear profile models, where linear profiles are assumed to be piece-wise constant. We propose an integrated Bayesian diagnostic method to answer two problems: firstly, whether and when the process has shifted, and secondly, in which pattern the shift occurs. The method can be applied to both Phase I and Phase II needs. For Phase I diagnosis, the method is implemented with no knowledge of in-control profiles, whereas in Phase II diagnosis, the method only requires partial observations. To identify exactly which profile components deviate from their nominal values, the variability of the profile components is marginalized out through a fully Bayesian approach. To address the computational difficulty, we implement a Monte Carlo method that alternates inspection between the space of changepoint positions and the space of fault patterns. The diagnostic method is applicable under multiple scenarios.
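The Chapter 2 idea of Bayesian count monitoring that refuses to absorb outliers can be illustrated with a closed-form conjugate toy: a Gamma prior on a Poisson defect rate gives a negative-binomial posterior predictive, and counts with very low predictive probability raise an alarm instead of updating the belief. The prior, threshold, and data are invented, and the thesis uses sequential Monte Carlo over latent process states rather than this closed form.

```python
# Toy conjugate Gamma-Poisson sketch of outlier-aware Bayesian monitoring.
# Constants are invented; the thesis uses sequential Monte Carlo instead.
from math import comb

def neg_binom_pred(y, a, b):
    """Posterior predictive P(Y=y): negative binomial induced by a
    Gamma(a, b) rate prior on a Poisson count (a, b positive integers here)."""
    p = b / (b + 1.0)
    return comb(y + a - 1, y) * (p ** a) * ((1 - p) ** y)

a, b = 4, 1                     # prior belief: defect rate around a/b = 4
alarms = []
for t, y in enumerate([3, 5, 4, 15, 4]):
    if neg_binom_pred(y, a, b) < 0.005:  # count implausible under belief
        alarms.append(t)                 # alarm; do not absorb the outlier
    else:
        a, b = a + y, b + 1              # conjugate Bayesian update
print(alarms)  # → [3]
```

Refusing the update on an alarm is what lets the monitor distinguish a temporary aberration from a genuine shift, the distinction the chapter targets.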
APA, Harvard, Vancouver, ISO, and other styles
42

Sasset, Linda. "Protein quality control: from ribosome to proteasome." Doctoral thesis, Università degli studi di Trieste, 2014. http://hdl.handle.net/10077/10118.

Full text
Abstract:
2012/2013
The 2A peptide is a short viral peptide of 18-23 amino acids present in some picornaviruses. These RNA viruses encode a polyprotein that is later processed to produce all viral proteins; 2A mediates the primary cleavage between the capsidic and the replicative proteins. It works as an autonomous element during translation, capable of mediating self-processing at its C-terminus: it modifies the ribosome activity, promoting hydrolysis of the ester bond between the nascent peptide and the tRNA rather than formation of the peptide bond, resulting in premature release of the upstream protein and continuation of translation from the in-frame downstream Pro codon. We call this property “non-conventional translation termination”, in contrast to the “conventional translation termination” occurring at a STOP codon and mediated by Release Factors 1 and 3. For this activity, both the amino acids Asn-Pro-Gly at the C-terminus of the 2A peptide and the Pro at the N-terminus of the downstream protein are essential. Owing to its self-processing at the C-terminus, in mammalian cells 2A can be used to produce two proteins from a single open reading frame (ORF). In the first part of the work we tested two different 2A peptides (one derived from the Foot-and-Mouth Disease Virus (F2A) and one from the Porcine Teschovirus (T2A)) to look for differences in their activity in terms of non-conventional translation termination and re-initiation of translation from the downstream Pro. Both peptides were very efficient in terminating translation, as we observed only a very small amount of fusion protein. However, re-initiation from the downstream Pro was not complete: the downstream protein was produced in a lower amount than the upstream one. We also determined that F2A worked better than T2A, allowing synthesis of the downstream protein with higher efficiency. 
These results indicate that the F2A peptide is a powerful system for the co-expression of proteins in vivo. We exploited this technology for the production of recombinant antibodies: both in vitro and in vivo, the antibody light and heavy chains were produced, and the complete antibody was properly folded. In the second part of the work we investigated a particular feature of the 2A peptide: when it was positioned immediately upstream of a STOP codon, expression of the upstream protein was heavily impaired. We found that the compromised expression was due to the amino acid sequence of the peptide and not to its nucleotide sequence, and that it was a consequence of strong ribosome stalling at the Gly-STOP codon boundary, totally dependent on the last thirteen 2A residues and on the presence of Asn-Pro-Gly as C-terminal amino acids. As a consequence of the stalling, a ribosome-mediated quality control is activated, inducing ubiquitination and proteasome degradation of the small amount of protein produced. This 2A regulatory activity on translation was effective on both membrane-bound and free ribosomes. For membrane-bound ribosomes, we found that members of the retrotranslocation pathway, such as the AAA-ATPase p97 and the deubiquitinase YOD1, were involved, suggesting a quality control mechanism that senses the presence of stalled ribosomes on the cytosolic side of the ER membrane and signals to effectors on the luminal side to induce ERAD (Endoplasmic Reticulum Associated Degradation). It is widely known that one of the main players in ERAD is the AAA-ATPase p97/VCP: its ATPase activity is generally thought to be required for retrotranslocation and proteasomal degradation, and its loss causes retention of ERAD substrates in the ER lumen. Consequently, p97 ATPase activity is considered essential for the extraction of proteins from the ER membranes. 
To study the role of p97 in ERAD, we used a method recently developed in our laboratory (Petris et al., 2011), based on the specific in vivo mono-biotinylation, by the cytosolic biotin ligase BirA, of protein substrates tagged with the 15-amino-acid Biotin Acceptor Peptide BAP (GLNDIFEAQKIEWH). This method allows a precise discrimination between proteins located in the endoplasmic reticulum (non-biotinylated) and in the cytoplasm (biotinylated). We analysed the effect of: 1) a dominant negative form of p97 (p97QQ), 2) a p97-specific siRNA, and 3) the p97 chemical inhibitor DBeQ, on three different ERAD model substrates: NS1, the Null Hong Kong mutant of α1-antitrypsin (NHK-α1AT), and Tetherin. Our results clearly challenge the predominant view of p97 as an energy provider for the extraction of unfolded proteins from a putative retrotranslocon in an ATP-dependent manner. We found that p97 activity is not involved in the retrotranslocation of these proteins from the ER to the cytosol, but rather in the segregation of proteins already on the cytosolic side and in the recruitment of PNGase. Similar results were obtained by analysing the effect of a dominant negative form of the p97-associated deubiquitinase YOD1 (YOD1 C160S).
XXVI Cycle
APA, Harvard, Vancouver, ISO, and other styles
43

Binny, Diana. "Radiotherapy quality assurance using statistical process control." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/130738/1/Diana_Binny_Thesis.pdf.

Full text
Abstract:
The work presented in this thesis was a step forward in applying statistics to the important problem of monitoring machine performance and quantifying optimal treatment quality assurance in radiotherapy. This research investigated the use of an analytical decision-making tool known as Statistical Process Control (SPC), which employs statistical means to measure, monitor and identify random and systematic errors in a process based on observed behaviour. Several treatment machine and planning system parameters were investigated, and a method of calculating SPC-based tolerances to achieve optimal treatment goals was highlighted.
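The core SPC idea mentioned in this abstract can be illustrated with an individuals (X) control chart: limits are derived from baseline behaviour, and later measurements outside them are flagged as likely systematic errors rather than random variation. The sketch below is a minimal, generic illustration, not code or data from the thesis; all values are hypothetical.

```python
def control_limits(baseline):
    """Return (centre, lcl, ucl) for an individuals (X) chart.

    Limits use the average moving range: centre +/- 2.66 * MR-bar,
    where 2.66 is the standard constant for n=2 moving ranges.
    """
    n = len(baseline)
    centre = sum(baseline) / n
    moving_ranges = [abs(baseline[i] - baseline[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

def flag_out_of_control(values, limits):
    """Indices of monitored values falling outside the control limits."""
    _, lcl, ucl = limits
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Hypothetical baseline measurements of a machine parameter
baseline = [100.1, 99.8, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1]
limits = control_limits(baseline)

# New measurements: the third point drifts beyond the control limits
monitored = [100.0, 100.4, 102.5]
print(flag_out_of_control(monitored, limits))  # → [2]
```

Points flagged this way indicate a special (systematic) cause worth investigating, while in-limit scatter is treated as ordinary process noise.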
APA, Harvard, Vancouver, ISO, and other styles
44

Akkinepally, Radha. "Quality control and quality assurance of hot mix asphalt construction in Delaware." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file 2.68Mb, 136p, 2005. http://wwwlib.umi.com/dissertations/fullcit/1428173.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

McGrew, Don. "A review of sensory quality control and quality assurance for alcoholic beverages." Kansas State University, 2011. http://hdl.handle.net/2097/9160.

Full text
Abstract:
Master of Science
Food Science
Delores H. Chambers
Tools are available, through various reference books, to develop a purposeful sensory quality program. Some companies already have a strong sensory program in place; others may require a cultural change to facilitate the implementation. This paper indicates some of the challenges to be overcome, covers some current quality control (QC) sensory practices, and addresses the advantages and disadvantages of expert tasters. Some specific issues regarding sensory evaluation of alcoholic beverages are discussed, and critical factors in production are reviewed, with discussion of the potential for off-taint development.
APA, Harvard, Vancouver, ISO, and other styles
46

Twala, Luphumlo Rodney. "An investigation into the impact of quality management systems on General Motors suppliers." Thesis, Nelson Mandela Metropolitan University, 2012. http://hdl.handle.net/10948/d1020927.

Full text
Abstract:
Acceptable product quality is one of the central pillars of customer satisfaction, which is key to customer retention and business success. The quest to improve and maintain good product quality is as old as mankind itself. In the beginning, quality involved selecting edible plants; it later evolved into craftsmanship and, eventually, mass production. Over time, various individuals and institutions made valuable contributions to the quality control methods, tools and techniques as we know them today. The International Organisation for Standardisation initiated quality management standards that provide requirements, specifications and guidelines which can be used to ensure materials and products meet a certain quality level. These quality management systems were adopted by the global automotive organisations and tailored to suit their requirements. The result was the development of ISO/TS 16949, which is a requirement for all direct manufacturing suppliers to automotive original equipment manufacturers (OEMs) such as BMW, Toyota and GM. Some OEMs specify additional requirements for their supplier base; in the case of General Motors, Quality Systems Basics (QSB) is a mandatory quality management system requirement. QSB is designed by GM to help suppliers reduce product defects, improve internal efficiencies and improve supply chain processes. A quantitative approach was chosen, which utilised an explorative and descriptive survey questionnaire in order to complete the research study. The study shows that the majority of the respondents believed that the implementation of QSB has had positive implications for their manufacturing process and supply value chain.
APA, Harvard, Vancouver, ISO, and other styles
47

Bayley, Luke (Accounting, Australian School of Business, UNSW). "Aspects of accounting quality." Awarded by: University of New South Wales, Accounting, 2007. http://handle.unsw.edu.au/1959.4/40476.

Full text
Abstract:
Accounting numbers are not only the products of peripheral economic events but, by and large, can be consciously influenced by the effects of calculated business decisions and the selective application of alternative reporting procedures. In academic parlance, the term accounting quality, or lack thereof, is often used to describe the extent to which these convoluting influences create a disparity between economic fundamentals and their numerical portrayal. This doctoral thesis speaks to three aspects of accounting quality: (i) Earnings Thresholds: A Re-Examination of the Role of Earnings Management; (ii) Earnings Manipulation and the Investigation of 'Red Flag' Accounting Ratios; and (iii) An Empirical Analysis of Standard and Poor's (S&P's) Core Earnings metric. Each topic is outlined in a separate research paper.
APA, Harvard, Vancouver, ISO, and other styles
48

Chatelain, Pierre. "Quality-driven control of a robotized ultrasound probe." Thesis, Rennes 1, 2016. http://www.theses.fr/2016REN1S092/document.

Full text
Abstract:
The robotic guidance of an ultrasound probe has been extensively studied as a way to assist sonographers in performing an exam. In particular, ultrasound-based visual servoing methods have been developed to fulfill various tasks, such as compensating for physiological motion, maintaining the visibility of an anatomic target during teleoperation, or tracking a surgical instrument. However, due to the specific nature of ultrasound images, guaranteeing good image quality during the procedure remains an unaddressed challenge. This thesis deals with the control of ultrasound image quality for a robot-held ultrasound probe. The ultrasound signal quality within the image is represented by a confidence map, which is used to design a servo control law for optimizing the placement of the ultrasound probe. A control fusion is also proposed to optimize the acoustic window for a specific anatomical target tracked in the ultrasound images. The method is illustrated in a teleoperation scenario, where control is shared between the automatic controller and a human operator.
APA, Harvard, Vancouver, ISO, and other styles
49

Meurée, Cédric. "Arterial spin labelling : quality control and super-resolution." Thesis, Rennes 1, 2019. http://www.theses.fr/2019REN1S016/document.

Full text
Abstract:
Arterial spin labelling (ASL) is a brain perfusion magnetic resonance imaging technique. The objective of this thesis was first to standardize ASL acquisitions in the context of multicenter neuroimaging studies. A quality control procedure was then proposed, and the capacity of existing algorithms to correct for distortions in ASL images was evaluated. Super-resolution methods, developed and adapted to single and multi-TI ASL data in the context of this thesis, are then described and validated on simulated data, on images acquired from healthy subjects, and on patients imaged for suspected brain tumors.
APA, Harvard, Vancouver, ISO, and other styles
50

Duncalf, A. J. "Quality assurance: An examination of the way that British manufacturing companies manage their 'product quality'." Thesis, University of Manchester, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.377047.

Full text
APA, Harvard, Vancouver, ISO, and other styles