
Dissertations / Theses on the topic 'Pattern quality'


Consult the top 50 dissertations / theses for your research on the topic 'Pattern quality.'


1

Hammar, Karl. "Towards an Ontology Design Pattern Quality Model." Licentiate thesis, Linköpings universitet, Institutionen för datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-93370.

Full text
Abstract:
The use of semantic technologies, and Semantic Web ontologies in particular, has enabled many recent developments in information integration, search engines, and reasoning over formalised knowledge. Ontology Design Patterns have been proposed to be useful in simplifying the development of Semantic Web ontologies by codifying and reusing modelling best practices. This thesis investigates the quality of Ontology Design Patterns. The main contribution of the thesis is a theoretically grounded and partially empirically evaluated quality model for such patterns, including a set of quality characteristics, indicators, measurement methods and recommendations. The quality model is based on established theory on information system quality, conceptual model quality, and ontology evaluation. It has been tested in a case study setting and in two experiments. The main findings of this thesis are that the quality of Ontology Design Patterns can be identified, formalised and measured, and furthermore, that these qualities interact in such a way that ontology engineers using patterns need to make tradeoffs regarding which qualities they wish to prioritise. The developed model may aid them in making these choices. This work has been supported by Jönköping University.
APA, Harvard, Vancouver, ISO, and other styles
2

Olson, Daren. "Teaching Patterns: A Pattern Language for Improving the Quality of Instruction in Higher Education Settings." DigitalCommons@USU, 2008. https://digitalcommons.usu.edu/etd/51.

Full text
Abstract:
One method for improving the appeal of instruction is found in Christopher Alexander’s work on architectural design patterns. In this qualitative research study, student comments on teacher/course evaluation forms were analyzed to generate six instructional design patterns. The teacher enthusiasm pattern encourages teachers to show (a) increased scholarship and enthusiasm towards the subject matter, (b) genuine concern and enthusiasm towards the students, and (c) mastery of and enthusiasm towards the act of teaching. The balanced curriculum pattern recommends that teachers (a) determine the appropriate depth or breadth of subject matter and communicate it to the students, (b) create a balanced schedule of activities, assignments, and tests, and (c) provide a variety of subject matter topics, instructional strategies, and media delivery technologies. The clear and appropriate assessments pattern directs teachers to (a) communicate the learning objectives related to each assessment, (b) ensure assessment methods are appropriate measures of the objectives, and (c) use fair criteria in grading and administering the assessments. The authentic connections pattern asks teachers to (a) help students understand the connections between the subject matter content and the world of work, (b) promote interpersonal connections between students through instruction and group work, as well as facilitate teacher-student connections by dealing with students honestly and fairly, and (c) encourage students to look at connections that go beyond workplace application and help students become better people. The flow of time pattern recommends that teachers (a) help students plan out their schedules for various time periods, and (b) synchronize the flow of instructional events with the flow of events occurring in the students’ personal lives. Finally, the negotiation and cooperation pattern encourages teachers to apply the processes of negotiation and cooperation to solve problems related to (a) the students’ lack of a sense of freedom, power, or control, (b) the conflict within the students or within the social order of the class, and (c) the general absence of a self-supporting, self-maintaining, and generating quality in the instruction. These six instructional design patterns may be used by teachers to increase the appeal of instruction in higher education settings.
APA, Harvard, Vancouver, ISO, and other styles
3

Millman, Michael Peter. "Computer vision for yarn quality inspection." Thesis, Loughborough University, 2000. https://dspace.lboro.ac.uk/2134/34196.

Full text
Abstract:
Structural parameters that determine yarn quality include evenness, hairiness and twist. This thesis applies machine vision techniques to yarn inspection, to determine these parameters in a non-contact manner. Due to the increased costs of such a solution over conventional sensors, the thesis takes a wide look at, and where necessary develops, the potential uses of machine vision for several key aspects of yarn inspection at both low and high speed configurations. Initially, the optimum optical / imaging conditions for yarn imaging are determined by investigating the various factors which degrade a yarn image. The depth of field requirement for imaging yarns is analysed, and various solutions are discussed critically including apodisation, wave front encoding and mechanical guidance. A solution using glass plate guides is proposed, and tested in prototype. The plates enable the correct hair lengths to be seen in the image for long hairs, and also prevent damaging effects on the hairiness definition due to yarn vibration and yarn rotation. The optical system parameters and resolution limits of the yarn image when using guide plates are derived and optimised. The thesis then looks at methods of enhancing the yarn image, using various illumination methods, and incoherent and coherent dark-field imaging.
APA, Harvard, Vancouver, ISO, and other styles
4

Duan, Hejun. "Quality control and pattern recognition in gas chromatography mass spectrometry." Thesis, University of Bristol, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.492569.

Full text
Abstract:
This thesis presents quality control monitoring analysis and original pattern recognition studies involving coupled chromatography. A number of methods were developed for quality control analysis, which is extremely important in the biological studies described in this thesis.
APA, Harvard, Vancouver, ISO, and other styles
5

Savvides, Vasos E. "Perceptual models in speech quality assessment and coding." Thesis, Loughborough University, 1988. https://dspace.lboro.ac.uk/2134/36273.

Full text
Abstract:
The ever-increasing demand for good communications/toll-quality speech has created a renewed interest in the perceptual impact of rate compression. Two general areas are investigated in this work, namely speech quality assessment and speech coding. In the field of speech quality assessment, a model is developed which simulates the processing stages of the peripheral auditory system. At the output of the model a "running" auditory spectrum is obtained. This represents the auditory (spectral) equivalent of any acoustic sound such as speech. Auditory spectra from coded speech segments serve as inputs to a second model. This model simulates the information centre in the brain which performs the speech quality assessment.
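To make the idea of a "running" auditory spectrum concrete, here is a minimal sketch using an assumed mel filterbank as a stand-in for the peripheral auditory filtering stage; the file name and parameters are assumptions, not the model developed in the thesis.

```python
# Minimal sketch of a running auditory-style spectrum (illustrative only, not
# the thesis model): a mel filterbank stands in for peripheral auditory filtering.
import librosa

y, sr = librosa.load("coded_speech.wav", sr=16000)   # hypothetical input file
S = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=512,
                                   hop_length=160, n_mels=40)
auditory_spectrum = librosa.power_to_db(S)            # dB-like compression per frame
print(auditory_spectrum.shape)                        # (n_mels, n_frames)
```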
APA, Harvard, Vancouver, ISO, and other styles
6

Tan, Kwee Teck. "Objective picture quality measurement for MPEG-2 coded video." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.324249.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cantell, Gillian Diane. "Measurement of image quality in nuclear medicine and radiology." Thesis, University of Aberdeen, 1995. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU078704.

Full text
Abstract:
The imaging process can be thought of as the acquisition of data and the processing and display of data. The image quality of the acquired data is assessed using objective methods. The spatial transfer characteristic was measured using the MTF, and the noise properties assessed using Wiener spectra for gamma camera and film-screen systems. An overall measure of image quality, noise equivalent quanta, can then be calculated. The image quality of the displayed data is assessed using subjective methods. Contrast detail test objects have been used for film-screen systems and forced choice experiments for nuclear medicine data. The Wiener spectrum noise measurement has been investigated as a measure of uniformity. Simulated and gamma camera flood images were produced. Observer tests were carried out to give a contrast at which the non-uniform flood images could be distinguished from the uniform flood images. Wiener spectra were produced and single number indices derived. Statistical tests were performed to determine the contrast at which the uniform and non-uniform Wiener spectra can be distinguished. Results showed that Wiener spectra measurements can be used as a measure of uniformity under certain conditions. The application of resolution and noise measurements to the evaluation of film-screen systems and radiographic techniques has been considered. The results follow the trends presented in the literature. Provided that the scanning equipment is available, tests on film-screen systems are practical to perform and are an important addition to other evaluation tests. Results show that the ideal observer approach of measuring resolution, noise and hence noise equivalent quanta is a practical method of assessing image quality in a hospital environment.
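As an illustration of the noise measurements mentioned above, a minimal sketch of one common way to estimate a noise Wiener spectrum from a uniform flood image; the flood image here is simulated, and this is not the exact procedure used in the thesis.

```python
# Illustrative estimate of a 2-D noise power (Wiener) spectrum from a uniform
# flood image; the flood image here is simulated Poisson noise.
import numpy as np

rng = np.random.default_rng(0)
flood = rng.poisson(lam=1000, size=(256, 256)).astype(float)   # simulated flood
noise = flood - flood.mean()                                    # remove the DC (mean) term
nps = np.abs(np.fft.fftshift(np.fft.fft2(noise)))**2 / noise.size
# A single-number uniformity index could be, e.g., the total low-frequency power:
low_freq_power = nps[118:138, 118:138].sum()
print(low_freq_power)
```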
APA, Harvard, Vancouver, ISO, and other styles
8

Buzzard, Raymond Karl. "A prolog implementation of pattern search to optimize software quality assurance." Thesis, Monterey, California. Naval Postgraduate School, 1990. http://hdl.handle.net/10945/30680.

Full text
Abstract:
Approved for public release, distribution is unlimited
Quality Assurance (QA) is a critical factor in the development of successful software systems. Through the use of various QA tools, project managers can ensure that a desired level of performance and reliability is built into the system. However, these tools are not without cost. Project managers must weigh all QA costs and benefits for each development environment before establishing an allocation strategy. The development of a system dynamics model has provided project managers with an automated tool that accurately replicates a project's dynamic behavior. This model can be used to determine the optimal quality assurance distribution pattern over a given project's life cycle. The objective of this thesis was to enhance a prototype expert system module that interacts with the system dynamics model for determining QA effort allocation schemes. The new module uses a pattern search algorithm to derive an optimal distribution scheme from a given set of project parameters. This system not only resolves all limitations discovered in the prototype model but also achieves significant reductions in total project cost.
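For readers unfamiliar with pattern search, a minimal sketch of a Hooke-Jeeves style coordinate search follows; the cost function is a made-up stand-in for the system dynamics model, and this is not the thesis's Prolog implementation.

```python
# Toy coordinate pattern search (Hooke-Jeeves style): probe each coordinate,
# keep improving moves, shrink the step when no move helps.
import numpy as np

def project_cost(qa_alloc):
    # Hypothetical stand-in for the system-dynamics cost of a QA allocation.
    target = np.array([0.3, 0.5, 0.2])
    return float(np.sum((qa_alloc - target) ** 2))

def pattern_search(x0, step=0.1, tol=1e-4):
    x = np.array(x0, float)
    fx = project_cost(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = project_cost(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2.0
    return x, fx

print(pattern_search([0.0, 0.0, 0.0]))
```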
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Min. "Pattern recognition methodology for network-based diagnostics of power quality problems /." Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/6099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Nanyam, Yasasvy. "Hyperspectral Imaging for Nondestructive Measurement of Food Quality." OpenSIUC, 2010. https://opensiuc.lib.siu.edu/theses/334.

Full text
Abstract:
This thesis focuses on developing a nondestructive strategy for measuring the quality of food using hyperspectral imaging. The specific focus is to develop a classification methodology for detecting bruised/unbruised areas in hyperspectral images of fruits such as strawberries through the classification of pixels containing the edible portion of the fruit. A multiband segmentation algorithm is formulated to generate a mask for extracting the edible pixels from each band in a hypercube. A key feature of the segmentation algorithm is that it makes no prior assumptions for selecting the bands involved in the segmentation. Consequently, different bands may be selected for different hypercubes to accommodate the intra-hypercube variations. Gaussian univariate classifiers are implemented to classify the bruised-unbruised pixels in each band and it is shown that many band classifiers yield 100% classification accuracies. Furthermore, it is shown that the bands that contain the most useful discriminatory information for classifying bruised-unbruised pixels can be identified from the classification results. The strategy developed in this study will facilitate the design of fruit sorting systems using NIR cameras with selected bands.
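A minimal sketch of a per-band Gaussian univariate classifier of the kind described, using synthetic stand-in pixel values rather than the thesis data:

```python
# Per-band univariate Gaussian classifier: fit one Gaussian to bruised and one
# to unbruised pixel intensities in a single band, then label new pixels by
# the higher class likelihood. Data here are synthetic stand-ins.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
bruised = rng.normal(0.35, 0.05, 500)      # assumed reflectance samples
unbruised = rng.normal(0.55, 0.05, 500)

params = {"bruised": (bruised.mean(), bruised.std()),
          "unbruised": (unbruised.mean(), unbruised.std())}

def classify(pixel_value):
    scores = {c: norm.pdf(pixel_value, mu, sd) for c, (mu, sd) in params.items()}
    return max(scores, key=scores.get)

print(classify(0.4), classify(0.6))
```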
APA, Harvard, Vancouver, ISO, and other styles
11

Li, Cui. "Image quality assessment using algorithmic and machine learning techniques." Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources. Restricted: no access until June 2, 2014, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=26521.

Full text
Abstract:
Thesis (Ph.D.)--Aberdeen University, 2009.
With: An image quality metric based in corner, edge and symmetry maps / Li Cui, Alastair R. Allen. With: An image quality metric based on a colour appearance model / Li Cui and Alastair R. Allen. ACIVS / J. Blanc-Talon et al. eds. 2008 LNCS 5259, 696-707. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
12

Vespa, Lucas John. "PATTERN ENCODING ALGORITHMS AND INFORMATION MODELING METRICS FOR NETWORK QUALITY OF SERVICE." OpenSIUC, 2011. https://opensiuc.lib.siu.edu/dissertations/333.

Full text
Abstract:
Networks are becoming increasingly complex, making network quality of service (QoS) an ongoing and difficult problem. Important QoS challenges include ensuring availability of services, privacy and accuracy of data and preventing data loss. These challenges can be addressed by providing effective security mechanisms that meet the scalability requirements of modern networks, environments for testing these mechanisms and providing metrics to measure and increase information quality especially in resource limited networks such as sensor networks. To meet these goals, we present several topics. First, to address the scalability problems of deep packet inspection (DPI) in network intrusion detection systems (NIDS), we theoretically characterize DFA in order to develop scalable pattern matching engines. Second, to keep up with the multi-gigabit rates of current and future networks and to increase DPI performance, we exploit the inter-stream parallelism of network traffic and the parallel processing capabilities of graphics processing units to create a multi-gigabit GPU based deep packet inspection engine. Third, we address the state explosion problem of multi-stride DFA and exploit intra-stream parallelism to achieve multiple Gb/s speeds of DPI on a single processor. Fourth, to evaluate large scale deployment of DPI we develop a software test-bed for evaluating worm containment systems. Lastly, we develop information quality metrics for sensor networks, and use these metrics to schedule sensor data collection and increase the quality of information.
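As background for the DFA-based deep packet inspection mentioned above, a tiny single-pattern matching automaton (KMP style) is sketched below; the multi-pattern, multi-stride and GPU engines developed in the dissertation are far more involved.

```python
# Build a KMP-style DFA for one byte pattern and stream payload bytes through
# it; a real DPI engine compiles many signatures into one automaton.
def build_dfa(pattern: bytes):
    dfa = [dict() for _ in range(len(pattern) + 1)]
    for state in range(len(pattern) + 1):
        for byte in set(pattern):
            # longest prefix of `pattern` that is a suffix of matched-text + byte
            k = min(len(pattern), state + 1)
            while k > 0 and pattern[:k] != (pattern[:state] + bytes([byte]))[-k:]:
                k -= 1
            dfa[state][byte] = k
    return dfa

def scan(payload: bytes, pattern: bytes) -> bool:
    dfa, state = build_dfa(pattern), 0
    for b in payload:
        state = dfa[state].get(b, 0)   # bytes outside the pattern reset to state 0
        if state == len(pattern):
            return True
    return False

print(scan(b"GET /index.php?cmd=exec", b"cmd=exec"))  # True
```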
APA, Harvard, Vancouver, ISO, and other styles
13

Zeng, Yongqin. "Image segmentation algorithms incorporating perceptual quality factors for region-based image compression." Thesis, Imperial College London, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313514.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Pahnke, Cornelia. "Animated systems engineering : a new approach to high quality groupware application specification and development." Thesis, University of Wolverhampton, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.275282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Tchan, Jack Soning. "Development of an image analysis system to produce a standardised assessment of print quality." Thesis, Open University, 1998. http://oro.open.ac.uk/57912/.

Full text
Abstract:
A method has been developed using an image analysis system that simulates human print quality perception. Previous work in the area of print quality assessment has only produced methods that measure individual print quality variables, or assess small parts of an image. The image analysis system developed in this investigation is different from the previous work because it analyses the combined effects of different variables using neural network technology. In addition, measurements from an entire image can be obtained and the system can assess images irrespective of their shape. The image analysis system hardware consists of a monochrome CCD camera, a Matrox image acquisition board and a 200 MHz Pentium computer. A data pre-processing program was developed using Visual Basic version 5 to process the image data from the camera. The processed data was fed into a neural network so that empirical models of print quality could be formulated. The neural network code originated from the Matlab neural network toolbox. Backpropagation and radial basis neural network functions were used in the investigation. The hardware and software of the image analysis system were tested for non-impact printing techniques. Images of a square, a circle and text characters with dimensions of 1 cm or less were used as test images for the image analysis system. It was established that it was possible to identify the different printing processes that produced the simple shapes and text characters using the image analysis system. This was achieved by training the neural network using pre-processed image data. This produced multi-dimensional mathematical models that were used to classify the different printing processes. The classification of the different printing processes involved the objective measurement of print quality variables. Different printing processes can produce print that differs in print quality when assessed by observers. Therefore the successful classification of the printing processes demonstrated that the image analysis system could, in some cases, simulate human print quality perception. To consolidate on the preceding printing process identification result, a simulation of print quality perception was made. A neural network was trained using observer assessments of a simple pictorial image of a face. These face images were produced using a variety of different non-impact printing techniques. The neural network model was used to predict the outcomes of a further set of assessments of face images by the same observer. The accuracy of the predictions was 23 out of 24 for both the backpropagation and radial basis function neural network functions used in the test. The investigation also produced two possible practical applications for the system. Firstly, it was shown that the system has the potential to be used as a machine that can objectively assess the print quality from photocopiers. Secondly, it was demonstrated that the system might be used for forensic work, since it can identify different printing processes.
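A minimal sketch of the general approach of feeding image features to a small neural network classifier; the features, data and scikit-learn model are assumptions, whereas the thesis used its own pre-processing program and Matlab backpropagation/radial basis networks.

```python
# Illustrative sketch (not the thesis system): simple image features feed a
# small neural network that predicts an observer's quality class.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Pretend features per printed sample: [edge raggedness, mean density, density variance]
X = rng.random((120, 3))
y = (X[:, 0] + 0.5 * X[:, 2] < 0.8).astype(int)   # synthetic "acceptable" label

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:100], y[:100])
print("held-out accuracy:", model.score(X[100:], y[100:]))
```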
APA, Harvard, Vancouver, ISO, and other styles
16

Beaton, Duncan. "Integration of data description and quality information using metadata for spatial data and spatial information systems." Thesis, University of Newcastle Upon Tyne, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.321263.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Cheng, Xuemei. "Hyperspectral imaging and pattern recognition technologies for real time fruit safety and quality inspection." College Park, Md. : University of Maryland, 2004. http://hdl.handle.net/1903/2154.

Full text
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2004.
Thesis research directed by: Biological Resources Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
18

Zhu, Bin. "Novel statistical pattern recognition and 3D machine vision technologies for automated food quality inspection." College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8912.

Full text
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2008.
Thesis research directed by: Fischell Dept. of Bioengineering . Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
19

Hague, Darren S. "Neural networks for image data compression : improving image quality for auto-associative feed-forward image compression networks." Thesis, Brunel University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.262478.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Li, Jun. "Image texture decomposition and application in food quality analysis /." free to MU campus, to others for purchase, 2001. http://wwwlib.umi.com/cr/mo/fullcit?p3036842.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Lee, Hyun Jong. "THE CONSUMPTION AND SALES PATTERN OF UGLY APPLES IN SOUTH KOREA." UKnowledge, 2018. https://uknowledge.uky.edu/agecon_etds/70.

Full text
Abstract:
Approximately half of all wasted food is fruits and vegetables. One major cause of food waste is abnormal aesthetics: food may be discarded even if it is just as delicious as its normal counterpart. Food with a non-standard appearance (hereafter called ugly food) can be rejected by markets. To reduce such waste, ugly food campaigns, which were developed in Europe and spread throughout the world, advocate for the consumption of ugly food. To study the problem of ugly food waste, this thesis examines ugly apples, since apples are the most common, representative, and readily accessible fruit. The objective of this thesis is to suggest marketing strategies and actions to facilitate the consumption and sales of ugly apples that can be expanded to other ugly fruits and vegetables. The data used for analysis are obtained from the Rural Development Administration in Korea. The findings of the thesis indicate that younger people and lower-income households are more likely to purchase ugly apples from online markets and non-stores such as food trucks and traditional markets, compared with mega-scale discount stores. When advertising ugly apples, food quality should be emphasized rather than price.
APA, Harvard, Vancouver, ISO, and other styles
22

Awan, Zafar Iqbal, and Abdul Azim. "Network Emulation, Pattern Based Traffic Shaping and KauNET Evaluation." Thesis, Blekinge Tekniska Högskola, Avdelningen för telekommunikationssystem, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-6022.

Full text
Abstract:
Quality of Service is a major factor for a successful business in modern and future network services. A minimum level of service is assured, including Quality of Experience for modern real-time communication, which relates user satisfaction to perceived service quality. Traffic engineering can be applied to provide better services and to maintain or enhance user satisfaction through reactive and preventive traffic control mechanisms. Preventive traffic control can be more effective at managing network resources through admission control, scheduling, policing and traffic shaping mechanisms, maintaining a minimum service level before it gets worse and affects user perception. Accuracy, dynamicity, uniformity and reproducibility are objectives of extensive research in network traffic. Real-time tests, simulation and network emulation are applied to test uniformity, accuracy, reproducibility and dynamicity. Network emulation is performed over an experimental network to test real-time applications, protocols and traffic parameters. DummyNet is a network emulator and traffic shaper which allows nondeterministic placement of packet losses, delays and bandwidth changes. The KauNet shaper is a network emulator which creates traffic patterns and applies these patterns for exact deterministic placement of bit errors, packet losses, delay changes and bandwidth changes. An evaluation of KauNet with different patterns for packet losses, delay changes and bandwidth changes in an emulated environment is part of this work. The main motivation for this work is to check the possibility of delaying and dropping the packets of a transfer/session in the same way as happened before (during the observation period). This goal is achieved to some extent using KauNet, but some issues with pattern repetition still need to be solved to get better results. The idea of history- and trace-based traffic shaping using KauNet is given to make this possibility a reality.
APA, Harvard, Vancouver, ISO, and other styles
23

Guo, Yuanjing M. S. "DASH Intervention Effects on Home Food Environment and Diet Quality among Adolescents with Pre-hypertension and Hypertension." University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1470045434.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Alligood, Ronald R. II. "The Life Pattern of People with Spinal Cord Injury." VCU Scholars Compass, 2006. https://scholarscompass.vcu.edu/etd/1294.

Full text
Abstract:
The aim of this study was to answer the research question: "What is the Life Pattern of the Person with Spinal Cord Injury?" The unitary appreciative inquiry design, conceptualized through Rogers' (1986) science of unitary human beings, provided an approach for understanding the phenomenon in the context of human wholeness. The data, obtained through the methodology of unitary appreciative inquiry, led to the development of individual synopses for each of the participants. Once the synopses were completed, a composite pattern profile was constructed by the researcher that was indicative of the life pattern of people with spinal cord injury. The participants in the study validated the synopsis and pattern profile as accurate representations of their experience with spinal cord injury. This qualitative study, which comprised eight people who had undergone a spinal cord injury more than two years prior to the study, discovered three shared pattern manifestations: depersonalization, loss, and hopelessness. Although each person within this inquiry had a very good physical outcome concerning their spinal cord injury, the participants were not pleased with their current state of being. The pattern of despair, which was validated by the participants, was manifested through a profound sense of depersonalization, loss, and hopelessness.
APA, Harvard, Vancouver, ISO, and other styles
25

Mikhailov, Andrey, Kungaba Cedric Pefok, and Adnan Yousaf. "The pattern of customer complaint behaviour in public transportation :." Thesis, Karlstads universitet, Fakulteten för ekonomi, kommunikation och IT, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-6473.

Full text
Abstract:
Service providers must understand that they have to provide customer-perceived value if they want to stay in business. One of the best ways to determine customer-perceived value is to encourage customer complaints. This will make it easier to identify areas of the service process which the consumer believes must be improved. The ultimate goal of our thesis is to identify and establish the patterns of customer complaint behaviour in public transportation, which is a part of the service sector. If patterns are identified, it will be much easier to encourage and predict customers' abilities and willingness to complain during a service process. Hence, service providers will be able to create an environment that can encourage and facilitate customer complaint processes. In this regard, service providers will obtain more information that will enable them to improve the quality of their services in order to provide customer-perceived value. In addition, because services are produced at the same time as the customer is present, customers have more opportunities to see failures. Therefore, it is vital to make it easy for customers to complain as soon as they perceive these failures, so that they leave the service environment satisfied. Thus, understanding the pattern of customer complaint behaviour will make this process easier. By pattern, we mean sequence, and therefore there must be factors that influence this sequence. Our thesis focuses on three main factors: cost, contextual resources and customer's competence, which influence the ability, willingness and extent to which customers will engage in a complaint process. Therefore, this thesis focuses on the following: What is the pattern of customer complaint behaviour in public transportation, and how do cost, contextual resources and customer's competence impact this pattern? However, we shall also mention other external factors that may influence the pattern of customer complaint behaviour, such as market structure and service characteristics. It is imperative to understand customer complaint behaviour in services because, through customer complaints, customers' quality expectations can be determined and met. Studies reveal that, although complaint channels may exist, some customers still do not complain. In our survey, only 21.6% of respondents who encountered a service failure actually complained, implying that 78.4% of the respondents who encountered an unfavourable service experience did not complain. What could be the reasons that customers who encounter problems do not complain, although they would want to? Above all, if there is something to be learnt from customer complaint behaviour, we think that it should be the patterns of customer complaint behaviour. This is because if patterns can be identified, then the right channels can be put in place by service providers in order to encourage and facilitate the complaint process. This will enable much information to be obtained from the customers and then used to make improvements in the service offerings and processes. In this regard service quality and customer satisfaction can be increased.
This will lead to customer retention and higher profits for the company, as well as prevent negative word-of-mouth. In this thesis, we identified patterns of customer complaint behaviour in service, with a focus on public transportation, by using data from the passengers of the public bus companies of Karlstad city and the intercity bus company (SWEBUS) as bases of our research. In our questionnaire we asked customers to indicate the strength of preference for a complaint channel they would use in order to make a complaint to the bus company in the event of a negative service experience. The results were ranked in order to determine the pattern of customer complaint behaviour in public transportation. We approached this topic by revealing the importance of understanding customer complaint behaviour and using this knowledge to improve service development. We proceeded by emphasising the importance of viewing customer complaint behaviour from the perspective of service-dominant logic.
The Service and Market Oriented Transport Research Group
APA, Harvard, Vancouver, ISO, and other styles
26

Mariappan, Paramananthan. "Quality of bladder cancer surgery : improving outcomes." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/31261.

Full text
Abstract:
Background: At the time of diagnosis, approximately 75% of all bladder cancers are Non-Muscle Invasive Bladder Cancers (NMIBC) - the standard treatment for these cancers is a Transurethral Resection of the Bladder Tumour (TURBT). Although the vast majority of these cancers are not life-threatening, they have a high risk of recurrence (and progression, particularly in higher risk NMIBC), despite the use of adjuvant intravesical chemotherapy. Consequently, patients are kept on long-term cystoscopic surveillance with endoscopic removal if recurrences are detected - this impacts on patients' quality of life and contributes to the high cost for the healthcare provider. Aims: The fundamental aim of this series of clinical studies, spanning 12 years, was to identify and implement means of improving the efficiency of both processing and operating on patients with NMIBC, not only to reduce recurrence but also to reduce the duration of follow-up and repeat operations. It was an evolutionary process in which the findings of the preceding studies formed the basis of the subsequent one - while the aims of the individual studies were different, there was a clear link to the essential principles, thus forming a coherent collection of studies. Methods and results: The project was carried out in 3 phases (with 2 or 3 main studies in each phase, augmented by 1 to 2 linked studies - making the entire submission for PhD by publications a series of 12 studies, to date): Phase 1 (5 studies in this phase): The aim was to demonstrate the natural history of non-invasive bladder cancer and identify sub-categories of patients who could be discharged from surveillance at 5 years. This was initially achieved by evaluating a prospectively maintained cohort of non-invasive bladder cancer patients diagnosed between 1978 and 1984 at the Western General Hospital, Edinburgh. This study identified the importance of the recurrence rate at the first follow-up cystoscopy (RRFFC) as an essential prognostic marker. This finding was further validated using 2 separate cohorts from a different Centre (the Royal Infirmary, Edinburgh) managed in the 80s and the 90s, respectively. The data confirmed that over the decades, recurrence patterns do change, possibly as a result of differing techniques and improvements in optics and instruments; however, what remained the same was the prognostic value of the RRFFC. Phase 2 (3 studies in this phase): The early recurrence was deemed to be the result of tumours missed or left behind at the initial TURBT, i.e. a marker of quality. However, the RRFFC was only known 3 months after the initial surgery. Since the RRFFC was such an important prognostic factor, the aim of this phase was to determine the surgical factors contributing to the quality of TURBT and subsequently implement changes to the principles of carrying out the surgery to improve this quality. This was achieved by prospective collection of information regarding all patients undergoing TURBT for new bladder cancers, recording the tumour features, surgeon experience, whether the resection was deemed to have been complete or not, and the pathological results. We identified that the detrusor muscle in the resected specimen and the experience of the surgeon were independent determinants of TURBT quality. This finding was validated in a further study using cohorts from another time period and another Centre - this allowed me to develop the concept of Good Quality White Light TURBT (GQ-WLTURBT) as the benchmark for white light TURBT.
Phase 3 (4 studies in this phase): Photodynamic Diagnosis assisted TURBT (PDD-TURBT) was demonstrated in randomised controlled trials as a technique that reduces recurrences in NMIBC. In the absence of evidence with this technique in the 'real life' setting, or of comparisons with a standardised, benchmarked white light TURBT technique, we performed a prospective controlled study comparing PDD-TURBT and GQ-WLTURBT, evaluating early and delayed recurrence rates in 2 separate studies. I also performed a multicentre UK study on the outcomes with PDD-TURBT and collaborated with other experts in Europe in producing a review article on Photodynamic Diagnosis and the cost effectiveness of this technique. Summary: This coherent series of studies has contributed to knowledge in bladder cancer surgery by, among other things: (a) mapping the individual patient natural history of non-invasive bladder cancer; (b) confirming the importance of early recurrence as a strong prognostic indicator; (c) identifying predictors of this early recurrence and the quality of TURBT; (d) introducing the concept of the benchmark Good Quality White Light TURBT and (e) demonstrating the benefits of photodynamic diagnosis within a 'real life' setting.
APA, Harvard, Vancouver, ISO, and other styles
27

Harjumaa, L. (Lasse). "Improving the software inspection process with patterns." Doctoral thesis, University of Oulu, 2005. http://urn.fi/urn:isbn:9514278941.

Full text
Abstract:
Abstract The quality of a software product depends largely on the quality of the process that is used to develop it. In small software companies, the development process may be informal or even ad hoc, which causes uncertainty and variation in the product quality. However, quality issues are as important in small companies as in their larger counterparts. To sustain their dynamics and competitiveness, small organizations need to concentrate on the most effective quality assurance methods. Software inspection is a proven method for improving product quality and it provides a very cost-effective way for small companies to improve their development processes. This study introduces a framework for adjusting the inspection process for the organization's specific needs and evaluating its capabilities. The main focus of this work, however, is on refining and improving the inspection process. The improvement is guided by concrete instructions that are documented as process patterns. The pattern approach has already been used successfully in several other areas of software engineering. Patterns aim at capturing the best practices of software development and transferring this knowledge between people or organizations. The framework for inspection process capability originates from the literature relating to different types of peer review methods and experiments with flexible and tool-supported inspections in small companies. Furthermore, generic process improvement models are studied to find a feasible structure for the framework. As a result of the analysis, the i3 capability model is introduced. The feasibility of the model has been investigated in real-life software organizations carrying out inspections. After the capability evaluation, the inspection process can be upgraded with the aid of improvement patterns, which provide structured and easy-to-follow guidelines for implementing improvements. An initial list of patterns, describing solutions to the most common problems confronted in the establishment of inspections, is extracted from related inspection research and an industrial experiment. The contributions of this study are, first, the new view of the inspection process, based on the fundamental activities that are performed during an inspection instead of a series of stages, as it is usually presented. An activity-based process description enables tailoring of the process for organization-specific needs and its targeted improvement. Second, the study introduces a practical, lightweight method for implementing the improvement. Patterns are especially suitable in companies where resources are limited and full-scale improvement programmes cannot be initiated. Furthermore, the generic process improvement models do not provide detailed information on how improvements should be carried out, and the pattern approach represents a promising method for that. Third, the inspection process currently does not have a very significant role in generic software process improvement models; this study helps in outlining the importance of inspections. A similar approach could be applied to other software subprocesses to enable their evaluation and improvement.
APA, Harvard, Vancouver, ISO, and other styles
28

Woxblom, Lotta. "Warp of sawn timber of Norway spruce in relation to end-user requirements : quality, sawing pattern and economic aspects /." Uppsala : Swedish Univ. of Agricultural Sciences (Sveriges lantbruksuniv.), 1999. http://epsilon.slu.se/avh/1999/91-576-5860-9.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Schweizer, Peter E. "Influences of Watershed Land Cover Pattern on Water Quality and Biotic Integrity of Coastal Plain Streams in Mississippi, USA." View abstract, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:3339508.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Hönel, Sebastian. "Temporal data analysis facilitating recognition of enhanced patterns." Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-51864.

Full text
Abstract:
Assessing the source code quality of software objectively requires a well-defined model. Due to the distinct nature of each and every project, the definition of such a model is specific to the underlying type of paradigms used. A definer can pick metrics from standard norms to define measurements for qualitative assessment. Software projects develop over time, and a wide variety of refactorings is applied to the code, which makes the process temporal. In this thesis the temporal model was enhanced using methods known from financial markets and further evaluated using artificial neural networks, with the goal of improving the prediction precision by learning from more detailed patterns. The research also investigated whether the combination of technical analysis and machine learning is viable and how to blend them. An in-depth selection of applicable instruments and algorithms and extensive experiments were run to approximate answers. It was found that enhanced patterns are of value for further processing by neural networks. Technical analysis, however, was not able to improve the results, although it is assumed that it can for an appropriately sized problem set.
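A minimal sketch of the general idea, assuming an exponential moving average as the technical-analysis indicator and a small scikit-learn network; the metric series, window sizes and model are illustrative stand-ins, not the thesis setup.

```python
# Illustrative only: enhance a software-metric time series with an indicator
# borrowed from technical analysis (an exponential moving average) and let a
# small network predict the next value.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
metric = np.cumsum(rng.normal(0, 1, 300))          # e.g., a complexity metric over commits

def ema(series, span=10):
    alpha, out = 2 / (span + 1), [series[0]]
    for v in series[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return np.array(out)

enhanced = np.column_stack([metric, ema(metric)])  # raw value + smoothed pattern
X = np.array([enhanced[i - 5:i].ravel() for i in range(5, len(metric) - 1)])
y = metric[5:-1]                                   # next raw value after each window
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0).fit(X, y)
print("R^2 on training window:", round(model.score(X, y), 3))
```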
APA, Harvard, Vancouver, ISO, and other styles
31

Constantine, Sherry Lynette. "The Influence of Habitat Quality on the Community Structure, Distribution Pattern, Condition, and Growth of Coral Reef Fish: A Case Study of Grunts (Haemulidae) from Antigua B.W.I, A Small Island System." Scholarly Repository, 2008. http://scholarlyrepository.miami.edu/oa_dissertations/136.

Full text
Abstract:
The goal of this research was to determine the relative quality of near shore marine areas by investigating their influence on Haemulidae community structure, distribution pattern, condition, and growth. Habitat was defined at the small spatial scale of individual habitat types such as seagrass beds, mangroves and coral reefs, and at the broader spatial scale of the interconnection of these individual habitat types within a mosaic (IHM). Ten spatial, biotic and abiotic parameters (percentage coverage of sand, mangroves, hard substrate, and seagrass, turbidity, pH, salinity, temperature, average depth, and predator density) were investigated. These environmental characteristics acted as proxies for the quality of IHMs. The major findings of the research were: (1) IHMs and discrete habitat types in tropical marine systems are not always equal in quality. Further, the highest quality IHMs/discrete habitat types have the critical resources whether spatial, abiotic or biotic, at the optimum levels needed by organisms to carry out their critical life functions; (2) IHMs of the highest quality contain all the discrete habitat types needed by organisms to carry out their life processes in a spatial arrangement that maximizes energy savings; (3) IHMs can be of high quality in the absence of one habitat type, if this habitat type is replaced by another that can take on its ecological role; and (4) the percentage cover of hard substratum and seagrass, temperature, and predator density have a big impact on Haemulidae distribution pattern, community structure, condition and growth. In addition, this research highlighted some of many characteristics of benthic habitats such as type and configuration that should be included in the design of Marine Protected Areas for the effective management of fisheries resources. Effective Marine Protected Areas should have (1) large overall area with benthic habitat types of high quality; (2) spatial configurations with short distances (corridors) between habitat types; (3) spatial arrangements that place all individual habitat types in connection with all other habitat types so that energy expenditure in moving among habitat types is reduced; (4) habitats with high structural complexity; and (5) the inclusion of all the habitat types needed by focal organisms to carry out their life processes, or surrogate habitat types that can take on the role of ones that are absent.
APA, Harvard, Vancouver, ISO, and other styles
32

Mohan, Deepak. "Real-time detection of grip length deviation for fastening operations: a Mahalanobis-Taguchi system (MTS) based approach." Diss., Rolla, Mo. : University of Missouri-Rolla, 2007. http://scholarsmine.mst.edu/thesis/pdf/DeepakMohanThesisFinal_09007dcc80410b1d.pdf.

Full text
Abstract:
Thesis (M.S.)--University of Missouri--Rolla, 2007.
Vita. The entire thesis text is included in file. Title from title screen of thesis/dissertation PDF file (viewed October 24, 2007) Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
33

Hammar, Karl. "Content Ontology Design Patterns : Qualities, Methods, and Tools." Doctoral thesis, Linköpings universitet, Interaktiva och kognitiva system, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-139584.

Full text
Abstract:
Ontologies are formal knowledge models that describe concepts and relationships and enable data integration, information search, and reasoning. Ontology Design Patterns (ODPs) are reusable solutions intended to simplify ontology development and support the use of semantic technologies by ontology engineers. ODPs document and package good modelling practices for reuse, ideally enabling inexperienced ontologists to construct high-quality ontologies. Although ODPs are already used for development, there are still remaining challenges that have not been addressed in the literature. These research gaps include a lack of knowledge about (1) which ODP features are important for ontology engineering, (2) less experienced developers' preferences and barriers for employing ODP tooling, and (3) the suitability of the eXtreme Design (XD) ODP usage methodology in non-academic contexts. This dissertation aims to close these gaps by combining quantitative and qualitative methods, primarily based on five ontology engineering projects involving inexperienced ontologists. A series of ontology engineering workshops and surveys provided data about developer preferences regarding ODP features, ODP usage methodology, and ODP tooling needs. Other data sources are ontologies and ODPs published on the web, which have been studied in detail. To evaluate tooling improvements, experimental approaches provide data from comparison of new tools and techniques against established alternatives. The analysis of the gathered data resulted in a set of measurable quality indicators that cover aspects of ODP documentation, formal representation or axiomatisation, and usage by ontologists. These indicators highlight quality trade-offs: for instance, between ODP Learnability and Reusability, or between Functional Suitability and Performance Efficiency. Furthermore, the results demonstrate a need for ODP tools that support three novel property specialisation strategies, and highlight the preference of inexperienced developers for template-based ODP instantiation---neither of which are supported in prior tooling. The studies also resulted in improvements to ODP search engines based on ODP-specific attributes. Finally, the analysis shows that XD should include guidance for the developer roles and responsibilities in ontology engineering projects, suggestions on how to reuse existing ontology resources, and approaches for adapting XD to project-specific contexts.
APA, Harvard, Vancouver, ISO, and other styles
34

Luca, Matthieu. "Quality Timber Strength Grading : A prediction of strength using scanned surface grain data and FE-analyses." Thesis, Linnéuniversitetet, Institutionen för teknik, TEK, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-14037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Rao, Raghunandan M. "Perspectives of Jamming, Mitigation and Pattern Adaptation of OFDM Pilot Signals for the Evolution of Wireless Networks." Thesis, Virginia Tech, 2016. http://hdl.handle.net/10919/77485.

Full text
Abstract:
Wireless communication networks have evolved continuously over the last four decades in order to meet the traffic and security requirements due to the ever-increasing amount of traffic. However, this increase is projected to be massive for the fifth generation of wireless networks (5G), with a targeted capacity enhancement of 1000× w.r.t. 4G networks. This enhanced capacity is possible through a combination of major approaches: (a) an overhaul of some parts of current 4G and (b) elimination of its overhead and redundancies. In this work we focus on OFDM reference signals or pilot tones, which are used for channel estimation, link adaptation and other crucial functions in Long-Term Evolution (LTE). We investigate two aspects of pilot signals pertaining to their evolution: (a) the impact of targeted interference on pilots and its mitigation and (b) adaptation of pilot patterns to match the channel conditions of the user. We develop theoretical models that accurately quantify the performance degradation at the user's receiver in the presence of a multi-tone pilot jammer. We develop and evaluate mitigation algorithms to mitigate power-constrained multi-tone pilot jammers in SISO- and full rank spatial multiplexing MIMO-OFDM systems. Our results show that the channel estimation performance can be restored even in the presence of a strong pilot jammer. We also show that full rank spatial multiplexing in the presence of a synchronized pilot jammer (transmitting on pilot locations only) is possible when the channel is flat between two pilot locations in either time or frequency. We also present experimental results of multi-tone broadcast pilot jamming (jamming of the Cell Specific Reference Signal) in the LTE downlink. Our results show that full-band jamming of pilots needs 5 dB less power than jamming the entire downlink signal in order to cause Denial of Service (DoS) to the users. In addition to this, we have identified and demonstrated a previously unreported issue with LTE termed 'Channel Quality Indicator (CQI) Spoofing'. In this scenario, the attacker tricks the user terminal into thinking that the channel quality is good by transmitting interference only on the data locations, while deliberately avoiding the pilots. This jamming strategy leverages the dependence of the adaptive modulation and coding (AMC) schemes on the CQI estimate in LTE. Lastly, we investigate the idea of pilot pattern adaptation for SISO- and spatial multiplexing MIMO-OFDM systems. We present a generic heuristic algorithm to predict the optimal pilot spacing and power in a nonstationary doubly selective channel (channel fading in both time and frequency). The algorithm fits estimated channel statistics to stored codebook channel profiles and uses them to maximize the upper bound on the constrained capacity. We demonstrate up to a 30% improvement in ergodic capacity using our algorithm and describe ways to minimize feedback requirements while adapting pilot patterns in multi-band carrier aggregation systems. We conclude this work by identifying scenarios where pilot adaptation can be implemented in current wireless networks and provide some guidelines to adapt pilots for 5G.
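To illustrate why pilot spacing matters for channel estimation, a toy numpy sketch of least-squares estimation at pilot subcarriers with interpolation in between is given below; all parameters are illustrative and this is not one of the thesis algorithms.

```python
# Toy OFDM pilot example: LS channel estimates at pilot subcarriers, linear
# interpolation in between. Denser pilots track a more frequency-selective
# channel at the cost of overhead.
import numpy as np

rng = np.random.default_rng(3)
n_sc, spacing = 64, 8                        # subcarriers, pilot spacing
h = np.fft.fft(rng.normal(size=4) + 1j * rng.normal(size=4), n_sc)  # true channel
pilots = np.arange(0, n_sc, spacing)
tx = np.ones(n_sc, complex)                  # known pilot symbols (BPSK "1")
rx = h * tx + 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))

h_ls = rx[pilots] / tx[pilots]               # LS estimate at pilot positions
h_hat = (np.interp(np.arange(n_sc), pilots, h_ls.real)
         + 1j * np.interp(np.arange(n_sc), pilots, h_ls.imag))
print("mean estimation error:", np.mean(np.abs(h - h_hat)))
```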
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
36

Liu, Jia. "Heterogeneous Sensor Data based Online Quality Assurance for Advanced Manufacturing using Spatiotemporal Modeling." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/78722.

Full text
Abstract:
Online quality assurance is crucial for elevating product quality and boosting process productivity in advanced manufacturing. However, the inherent complexity of advanced manufacturing, including nonlinear process dynamics, multiple process attributes, and low signal/noise ratio, poses severe challenges for both maintaining stable process operations and establishing efficacious online quality assurance schemes. To address these challenges, four different advanced manufacturing processes, namely, fused filament fabrication (FFF), binder jetting, chemical mechanical planarization (CMP), and the slicing process in wafer production, are investigated in this dissertation for applications of online quality assurance, with utilization of various sensors, such as thermocouples, infrared temperature sensors, accelerometers, etc. The overarching goal of this dissertation is to develop innovative integrated methodologies tailored for these individual manufacturing processes but addressing their common challenges to achieve satisfying performance in online quality assurance based on heterogeneous sensor data. Specifically, three new methodologies are created and validated using actual sensor data, namely, (1) Real-time process monitoring methods using Dirichlet process (DP) mixture model for timely detection of process changes and identification of different process states for FFF and CMP. The proposed methodology is capable of tackling non-Gaussian data from heterogeneous sensors in these advanced manufacturing processes for successful online quality assurance. (2) Spatial Dirichlet process (SDP) for modeling complex multimodal wafer thickness profiles and exploring their clustering effects. The SDP-based statistical control scheme can effectively detect out-of-control wafers and achieve wafer thickness quality assurance for the slicing process with high accuracy. (3) Augmented spatiotemporal log Gaussian Cox process (AST-LGCP) quantifying the spatiotemporal evolution of porosity in binder jetting parts, capable of predicting high-risk areas on consecutive layers. This work fills the long-standing research gap of lacking rigorous layer-wise porosity quantification for parts made by additive manufacturing (AM), and provides the basis for facilitating corrective actions for product quality improvements in a prognostic way. These developed methodologies surmount some common challenges of advanced manufacturing which paralyze traditional methods in online quality assurance, and embody key components for implementing effective online quality assurance with various sensor data. There is a promising potential to extend them to other manufacturing processes in the future.
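A minimal sketch of the Dirichlet process mixture idea, using scikit-learn's truncated variational implementation on synthetic sensor features; this is an assumption-laden stand-in, not the dissertation's models.

```python
# Illustrative sketch: a truncated Dirichlet-process Gaussian mixture groups
# sensor feature vectors into process states without fixing the number of
# states in advance. The sensor features below are synthetic stand-ins.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(4)
stable = rng.normal([0.0, 0.0], 0.1, size=(200, 2))     # e.g., vibration features
drifted = rng.normal([0.8, 0.5], 0.1, size=(50, 2))     # an abnormal process state
X = np.vstack([stable, drifted])

dpgmm = BayesianGaussianMixture(
    n_components=10,                                    # truncation level
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)
states = dpgmm.predict(X)
print("states found:", np.unique(states))               # a new state flags a process change
```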
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
37

A-iyeh, Enoch. "Voronoi tessellation quality: applications in digital image analysis." A-iyeh E., Peters, J.F, Proximal Groupoid Patterns in Digital Images, Computing Research Repository: Computer Vision and Pattern Recognition, 2016, 2017. http://hdl.handle.net/1993/32055.

Full text
Abstract:
A measure of the quality of Voronoi tessellations resulting from various mesh generators founded on feature-driven models is introduced in this work. A planar tessellation covers an image with polygons of various shapes and sizes. Tessellations have potential utility due to their geometry and the opportunity to derive useful information from them for object recognition, image processing and classification. Problem domains including images are generally feature-endowed, non-random domains. Generators modeled otherwise may easily guarantee quality of meshes but certainly bear no reference to features of the meshed problem domain. They are therefore unsuitable in point pattern identification, characterization and subsequently the study of meshed regions. We therefore found generators on features of the problem domain. This provides a basis for element quality studies and improvement based on quality criteria. The resulting polygonal meshes tessellating an n-dimensional digital image into convex regions are of varying element qualities. Given several types of mesh generating sets, a measure of overall solution quality is introduced to determine their effectiveness. Given a tessellation of general and mixed shapes, this presents a challenge in quality improvement. The Centroidal Voronoi Tessellation (CVT) technique is developed for quality improvement and guarantees of mixed, general-shaped elements and to preserve the validity of the tessellations. Mesh quality indicators and entropies introduced are useful for pattern studies, analysis, recognition and assessing information. Computed features of tessellated spaces are explored for image information content assessment and cell processing to expose detail using information theoretic methods. Tessellated spaces also furnish information on pattern structure and organization through their quality distributions. Mathematical and theoretical results obtained from these spaces help in understanding Voronoi diagrams as well as for their successful applications. Voronoi diagrams expose neighbourhood relations between pattern units. Given this realization, the foundation of near sets is developed for further applications.
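For context, a small sketch of Lloyd's algorithm, the standard route to a centroidal Voronoi tessellation (CVT); the generators here are random points, whereas the thesis derives feature-driven generators.

```python
# Lloyd iteration sketch: assign pixels to the nearest generator, then move
# each generator to the centroid of its region; repeating drives the Voronoi
# tessellation toward a centroidal one (CVT).
import numpy as np

rng = np.random.default_rng(5)
h, w, k = 64, 64, 12
pixels = np.array([(y, x) for y in range(h) for x in range(w)], float)
gens = rng.uniform([0, 0], [h, w], size=(k, 2))          # random generators (illustrative)

for _ in range(20):
    d = np.linalg.norm(pixels[:, None, :] - gens[None, :, :], axis=2)
    label = d.argmin(axis=1)                             # Voronoi assignment
    for j in range(k):
        member = pixels[label == j]
        if len(member):
            gens[j] = member.mean(axis=0)                # move generator to centroid
print(np.round(gens, 1))
```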
February 2017
APA, Harvard, Vancouver, ISO, and other styles
38

Otim, Stephen O. "Simplified fixed pattern noise correction and image display for high dynamic range CMOS logarithmic imagers." Thesis, University of Oxford, 2007. http://ora.ox.ac.uk/objects/uuid:6a8cbdbf-ef5c-473f-a22e-76e1f8a2603b.

Full text
Abstract:
Biologically inspired logarithmic CMOS sensors offer high dynamic range imaging capabilities without the difficulties faced by linear imagers. By compressing dynamic range while encoding contrast information, they mimic the human visual system's response to photo stimuli in fewer bits than those used in linear sensors. Despite this prospect, logarithmic sensors suffer poor image quality due to illumination-dependent fixed pattern noise (FPN), which can make individual pixels appear up to 100 times brighter or darker. This thesis is primarily concerned with alleviating FPN in logarithmic imagers in a simple and convenient way, while taking a system-level view of its origin, distribution and effect on the quality of monochrome and colour images after FPN correction. Using properties of the human visual system, the errors arising from FPN are characterised in a perceptually significant manner through a newly proposed error measure. Logarithmic operation over a wide dynamic range is first characterised using a new model, y_ij = a_j + b_j ln(exp(sqrt(c_j + d_j x_i)) - 1), where y_ij is the response of pixel j to a light stimulus x_i and a_j, b_j, c_j and d_j are pixel-dependent parameters. Using the proposed correction procedure, pixel data from a monochromatic sensor array are FPN-corrected to approximately 4% error over 5 decades of illumination, even after digitisation: an accuracy equivalent to four times the human eye's ability to just notice an illumination difference against a uniform background. To evaluate how this error affects colour, the possibility of indiscernible residual colour error after FPN correction is analytically explored using a standard set of Munsell colours. After simulating the simple FPN correction procedure, colour quality is analysed using the Delta E76 perceptual metric to check for perceptual discrepancies in image colour. It is shown that, after quantisation, the FPN correction process yields 1-2 Delta E76 error units over approximately 5 decades of illumination, colour quality being imperceptibly uniform in this range. Finally, tone-mapping techniques, required to compress high dynamic range images onto the low range of standard screens, rely predominantly on logarithmic operations during brightness compression. A new Logr'Gb' colour representation is presented in this thesis, significantly reducing computational complexity while encoding contrast information. Using a well-known tone-mapping technique, images represented in this new format are shown to maintain colour accuracy when the green colour channel, instead of the traditional luminance channel, is compressed to the standard display range. The trade-off between colour accuracy and computation in this tone-mapping approach is also demonstrated, offering a low-cost alternative for applications with low display specifications.
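The per-pixel model quoted in the abstract lends itself to ordinary nonlinear least squares. The sketch below, which is only an illustration and not Otim's calibration procedure, fits y = a + b ln(exp(sqrt(c + d x)) - 1) to synthetic calibration data for one pixel and then inverts the fitted model to recover the FPN-corrected log-illumination; the parameter values, noise level and bounds are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def response(x, a, b, c, d):
    # y = a + b * ln( exp( sqrt(c + d*x) ) - 1 )
    return a + b * np.log(np.expm1(np.sqrt(c + d * x)))

# Hypothetical calibration data: one pixel's digitised response over 5 decades.
x_cal = np.logspace(0, 5, 40)
true_params = (0.1, 0.9, 0.5, 2e-3)
y_cal = response(x_cal, *true_params) + np.random.default_rng(0).normal(0, 0.005, x_cal.size)

params, _ = curve_fit(
    response, x_cal, y_cal,
    p0=(0.0, 1.0, 1.0, 1e-3),
    bounds=([-10.0, 1e-3, 1e-6, 1e-9], [10.0, 10.0, 10.0, 1e-2]),
)

def correct(y, a, b, c, d):
    s = np.log1p(np.exp((y - a) / b))   # recovers sqrt(c + d*x)
    return np.log((s ** 2 - c) / d)     # FPN-corrected signal, i.e. ln(x)

print(np.round(params, 4))
print(correct(y_cal[10:13], *params))   # should approximate np.log(x_cal[10:13])
```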
APA, Harvard, Vancouver, ISO, and other styles
39

Carmo, Gisele Tomaz do. "Estudo fonético qualitativo da fala e do canto no teatro popular em São Paulo." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/8/8139/tde-28022019-182333/.

Full text
Abstract:
The objective of this work is to compare the formant pattern of acted speech with singing in popular theatre in São Paulo, based on the studies of Raposo de Medeiros (2002) and Sundberg (2015). Acoustic phonetics was chosen as the frame of reference both for the method and for the analysis of the acoustic aspects investigated. As for the method, the first step was to select the song "Enchente", from the play "Hospital da gente", which belongs to the repertoire of the Grupo Clariô de Teatro. Data were then collected with a 33-year-old professional actress, who was asked to sing and to speak the text of the song as if she were on stage. The speech produced by the actress showed two distinct characteristics: at some moments it was performed in a shouted manner and, at other moments, in a non-shouted manner, which we call normal speech. The shouted speech caught our attention and consequently aroused our interest in observing this aspect of voice quality in our data. After recording, preparing and segmenting the data, we measured and qualitatively compared the first three formants, F1, F2 and F3, as well as the fundamental frequency, F0, of the Brazilian Portuguese vowels in stressed position, in their most stable portion. With the emergence of the shouted speech data, the corpus needed to be extended with a third recording condition, neutral speech, which could not be recorded with the actress for scheduling reasons; these data were therefore collected from the author of this research and used as a reference in the voice quality analyses. The qualitative analysis showed that the spoken vowels vary considerably in their production, resulting in very different values within the same vowel, for example among the tokens of [e]. In singing, the vowels [a], [] and [] showed more concentrated values, while the remaining sung vowels, the high vowels, tended to show more dispersed values even when sung. The actress's voice quality varies throughout the text, but the occurrences of shouted speech have an elevated F1, one of the characteristics described in the literature for this type of speech. This dissertation seeks to bring academic work in linguistics closer to the artistic movement of the São Paulo periphery, showing artists how acoustic phonetics can assist them in their compositions by clarifying how the speech production process works.
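For readers curious how formant values such as F1-F3 can be extracted computationally, the sketch below shows a generic LPC-based estimate for a single vowel segment; it is not the procedure used in the dissertation, and the file name, sampling rate and analysis settings are placeholders.

```python
import numpy as np
import librosa

# Hypothetical pre-cut vowel segment from the recorded speech or singing.
y, sr = librosa.load("vowel_segment.wav", sr=10000)
y = np.append(y[0], y[1:] - 0.97 * y[:-1])      # pre-emphasis
y = y * np.hamming(len(y))                      # analysis window

a = librosa.lpc(y, order=int(2 + sr / 1000))    # rule-of-thumb LPC order
roots = [r for r in np.roots(a) if np.imag(r) >= 0]
freqs = sorted(np.angle(roots) * sr / (2 * np.pi))
print("F1, F2, F3 estimates (Hz):", [f for f in freqs if f > 90][:3])

# F0 can be estimated separately over the same segment, e.g. with the YIN method.
f0 = librosa.yin(y, fmin=75, fmax=500, sr=sr, frame_length=1024)
print("median F0 (Hz):", float(np.median(f0)))
```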
APA, Harvard, Vancouver, ISO, and other styles
40

Gorgulho, Bartira Mendes. "Diferenças e similaridades na qualidade da refeição do Brasil e Reino Unido: que lições podemos aprender?" Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/6/6138/tde-19102016-143657/.

Full text
Abstract:
Introduction. Although individuals consume foods combined and structured into meals, most studies still focus on nutrients or on single foods. Furthermore, comparing countries at different stages of the nutritional and epidemiological transition can provide relevant information for the prevention of obesity and non-communicable chronic diseases. Objective. To characterise and compare the nutritional quality of the main meal consumed by adults living in Brazil and in the United Kingdom. Subjects and methods. The first stage of the study consisted of a systematic review of the literature, which supported the next step, the development of the Main Meal Quality Index. Data from the dietary surveys "Inquérito Nacional de Alimentação - INA/POF 2008/09" and "National Diet and Nutrition Survey - NDNS" were used to analyse and compare meal quality. Two different approaches were used to identify and evaluate the main meal: (1) a hybrid approach, describing meal composition by means of a classification decision tree; and (2) a hypothesis-driven approach, applying the Main Meal Quality Index. Multiple regression models were then fitted to identify associated sociodemographic factors. Results. Considering time of consumption and energy contribution, the eating events defined as the main meal were lunch in Brazil and dinner in the United Kingdom. The Brazilian main meal (58 points) had better nutritional quality, with a greater share of fibre and carbohydrates and lower total and saturated fat content and energy density. However, the main meal consumed in the United Kingdom (54 points) contained more fruits and vegetables. Culinary ingredients, such as rice and beans, were classified by the algorithm as characteristic components of the Brazilian meal, while fast-food items, such as chips, sandwiches and sugary drinks, were classified as British meals. In Brazil, the final score was positively associated with age and negatively associated with gender, energy intake, nutritional status and family income, whereas in the United Kingdom the index was associated only with age (positively). Conclusion. Although the main meal consumed in Brazil shows better quality and composition than that of the United Kingdom, the meals consumed in both countries fall short of recommendations.
APA, Harvard, Vancouver, ISO, and other styles
41

Maillot, Pierre. "Nouvelles méthodes pour l'évaluation, l'évolution et l'interrogation des bases du Web des données." Thesis, Angers, 2015. http://www.theses.fr/2015ANGE0007/document.

Full text
Abstract:
The Web of Data provides an environment for sharing and publishing data in a form that can be exploited both by humans and by machines. To this end, the RDF framework formats data as elementary statements of the form (subject, relation, object), called triples. Bases of the Web of Data, known as RDF bases, are sets of triples. In an RDF base, the ontology, i.e. the structural data, organises the description of the factual data. The number and size of bases on the Web of Data have grown steadily since its creation in 2001, and this growth has accelerated since the emergence of the Linked Data movement in 2008, which encourages the sharing and interlinking of bases publicly accessible on the Internet. These bases cover varied domains such as encyclopaedic (e.g. Wikipedia), governmental and bibliographic data. The data in these bases are used and updated by communities of users linked by a common domain of interest, yet this community-driven exploitation is supported by tools that are not mature enough to diagnose the content of a base or to query the bases of the Web of Data together. This thesis proposes three methods to support the development of these bases, both factual and ontological, and to improve their querying. We first propose a method for evaluating the quality of modifications to factual data when a contributor performs an update. We then propose a method that eases the examination of a base by highlighting groups of factual data in conflict with the ontology, so that the expert guiding the evolution of the base can modify either the ontology or the data. Finally, we propose a querying method for a distributed environment that queries only the bases likely to provide an answer.
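To make the second contribution more concrete, the sketch below shows one simple way such conflicts can be surfaced: flagging triples whose subject is typed, but not with the class declared as the property's rdfs:domain. It is a generic rdflib illustration (with subclass reasoning deliberately omitted), not Maillot's algorithm, and the file name is a placeholder.

```python
from rdflib import Graph, RDF, RDFS

g = Graph()
g.parse("dataset.ttl", format="turtle")   # hypothetical base holding ontology + facts

violations = []
for prop, domain in g.subject_objects(RDFS.domain):
    for subject, _ in g.subject_objects(prop):
        types = set(g.objects(subject, RDF.type))
        # Flag the fact if the subject is typed but never with the declared domain
        # (no subclass inference in this toy check).
        if types and domain not in types:
            violations.append((subject, prop, domain))

for s, p, d in violations[:10]:
    print(f"{s} used with {p} but is not typed as {d}")
```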
APA, Harvard, Vancouver, ISO, and other styles
42

Mehmood, Kashif. "Conception des Systèmes d'Information : une approche centrée sur les Patrons de Gestion de la Qualité." Phd thesis, Conservatoire national des arts et metiers - CNAM, 2010. http://tel.archives-ouvertes.fr/tel-00922995.

Full text
Abstract:
Conceptual models (CMs) play a crucial role: they serve as the basis for the entire development process of an information system (IS) and as a means of communication both within the development team and with users during the early validation stages. Their quality therefore plays a decisive role in the success of the final system. Studies have shown that most of the changes an IS undergoes concern missing or deficient functionality. Since the definition of this functionality falls to the analysis and design phase, whose deliverables are the CMs, it is essential for a design method to ensure the quality of the CMs it produces. Our approach targets the problems related to conceptual modelling quality by proposing a solution integrated into the development process, which has the advantage of being complete since it addresses both the measurement of quality and its improvement. The proposal covers the following aspects. i. Formulation of quality criteria, first by federating existing work on CM quality; indeed, one of the gaps observed in the field is the absence of consensus on concepts and their definitions. This work was validated by an empirical study; it also made it possible to identify the aspects not covered by the literature and to complete them by proposing new concepts or by refining those whose definitions were incomplete. ii. Definition of a concept, the quality pattern, to capture good practice in measuring and improving CM quality. A quality pattern helps an IS designer identify the quality criteria applicable to a specification and then guides them progressively through measuring and improving its quality, whereas most existing approaches focus on measuring quality and neglect the means of correcting it. The definition of this concept is motivated by the difficulty and the high degree of expertise that quality management requires, especially at the conceptual level where the finished software is not yet available, and by the diversity of applicable quality concepts (criteria and metrics). iii. Formulation of a quality-oriented method including concepts, guides and techniques for defining the desired quality concepts, measuring them, and improving the quality of the CMs. The method takes as its entry point the quality requirement formulated by the designer, who is then guided flexibly through the choice of suitable quality criteria, the measurement itself, and the proposal of recommendations helping to improve the quality of the initial CM in accordance with the stated requirement. iv. Development of a prototype, "CM-Quality", which implements the proposed method and thus provides tool support for its application. Finally, two experiments were conducted: the first aimed to validate and retain the quality concepts used; the second aimed to validate the proposed quality-guided design method.
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Yihong. "Détection des défauts internes et externes des noix en coque : application du traitement d'images et de la reconnaissance de formes au contrôle de qualité." Grenoble 1, 1993. http://www.theses.fr/1993GRE10098.

Full text
Abstract:
The sorting of in-shell walnuts aims to eliminate all those that present external or internal defects. Automating it increases productivity and improves the quality of the walnuts sold. This automation relies on a machine-vision inspection system made of two parts: one for acquiring internal and external images of the walnuts, and one for qualifying walnut quality from these images, which calls on image processing and pattern recognition techniques. In this study, such a quality control system was implemented in software, following a linear model with four elements: a sensor, an image processing block, a feature extractor and a classifier. An optical camera and X-ray radiography were chosen as the sensors for acquiring the external and internal images, respectively. Based on these images, the image processing and pattern recognition algorithms for the other three elements of the model were developed for walnut classification. The whole system was implemented on a 486-25 PC in the C language. Although the system was designed for a well-defined application, the automatic sorting of walnuts, a study of its generality shows that it can also find other applications in industrial, medical and agricultural domains, among others.
APA, Harvard, Vancouver, ISO, and other styles
44

Caujolle, Mathieu. "Identification et caractérisation des perturbations affectant les réseaux électriques HTA." Phd thesis, Supélec, 2011. http://tel.archives-ouvertes.fr/tel-00650911.

Full text
Abstract:
The recognition of disturbances occurring on medium-voltage (HTA) distribution networks is an essential issue for industrial customers as well as for the network operator. This thesis work developed an automatic identification system. It relies on segmentation methods that decompose the transient and steady-state regimes of the disturbances precisely and efficiently, using linear Kalman or anti-harmonic filters to extract the transient regimes. Harmonic variations and the presence of nearby transients are handled by means of adaptive thresholds, and a posteriori delay-correction methods improve the accuracy of the decomposition. Indicators adapted to the dynamics of the analysed operating regimes are used to characterise the disturbances; being relatively insensitive to segmentation errors and harmonic disturbances, they allow a reliable description of the disturbance phases. Two types of decision systems were also studied: expert systems and SVM classifiers. These systems were built from a large base of simulated disturbances, and their performance was evaluated on a base of real disturbances: they efficiently determine the type and direction of the observed disturbances (average recognition rate > 98%).
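As a small illustration of the SVM side of the decision stage, the sketch below trains a support vector classifier on made-up features of simulated disturbances; the feature definitions, class labels and data are entirely hypothetical and do not reproduce the thesis's feature indicators.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical training set: per-event features (dip depth, duration in s,
# harmonic distortion), with labels giving the event type.
X = np.vstack([
    rng.normal([0.6, 0.15, 0.02], 0.05, size=(300, 3)),   # voltage dips
    rng.normal([0.1, 0.80, 0.02], 0.05, size=(300, 3)),   # sustained interruptions
    rng.normal([0.2, 0.10, 0.12], 0.05, size=(300, 3)),   # harmonic disturbances
])
y = np.repeat(["dip", "interruption", "harmonic"], 300)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```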
APA, Harvard, Vancouver, ISO, and other styles
45

Othman, Nadia. "Fusion techniques for iris recognition in degraded sequences." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLL003/document.

Full text
Abstract:
Among the many biometric modalities, the iris is considered highly reliable, with a remarkably low error rate. This excellent performance is obtained by controlling the quality of the captured images and by imposing constraints on users, such as standing still at a close, fixed distance from the camera. In many real-world applications, however, such as access control and airport boarding, these constraints are no longer suitable, and the resulting iris images suffer from diverse degradations (lack of resolution, artefacts) that negatively affect recognition rates. One way to circumvent this is to exploit the redundancy arising from the availability of several images of the same eye in the recorded sequence. This thesis therefore focuses on how to fuse the information available in the sequence in order to improve performance. Diverse fusion schemes have been proposed in the literature, but they agree that the quality of the images used in the fusion process is a crucial factor for its success. Several quality factors must be considered, and various methods have been proposed for quantifying them; these quality measures are generally combined into a single, global value, yet there is no universal combination scheme and some a priori knowledge has to be inserted, which is not a trivial task. To address these limitations, we propose a new way of measuring and integrating quality measures in an image-fusion scheme based on a super-resolution approach. This strategy tackles two common issues in iris recognition: the lack of resolution and the presence of various artefacts in the captured iris images. The first part of the doctoral work elaborates a relevant quality metric able to quantify the quality of iris images locally. The measure relies on a Gaussian mixture model estimation of the distribution of clean iris texture. Its interest lies in 1) its simplicity, 2) the fact that its computation does not require identifying in advance the types of degradation that can occur in the image, 3) its uniqueness, which avoids computing several quality metrics and an associated combination rule, and 4) its ability to measure intrinsic quality and, in particular, to detect segmentation errors of the iris region. In the second part of the thesis, two novel quality-based fusion schemes are proposed. First, the metric is used as a global quality measure in two ways: as a selection tool for detecting the best images of the sequence, and as a pixel-level weighting factor in the super-resolution scheme so that good-quality images contribute more; in the latter case, the contribution of each image to the final fused image depends only on its overall quality. Second, taking advantage of the local character of the quality measure, an original fusion scheme based on local, pixel-level weighting is proposed, which takes into account the fact that degradations can differ from one part of the iris image to another. Regions free from occlusions thus contribute more to the reconstruction of the fused image than regions with artefacts, so the resulting image is of better quality and can ensure better recognition performance.
The effectiveness of the proposed approaches is demonstrated on several commonly used databases: MBGC, Casia-Iris-Thousand and QFIRE at three different distances (5, 7 and 11 feet). The improvements brought by the super-resolution, the global quality and the local quality in the fusion process are investigated separately. The results show an important improvement brought by the use of the global quality, which is further increased by using the local quality.
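The local weighting idea can be illustrated with a few lines of NumPy: each pixel of the fused image is a weighted average of the registered frames, with weights given by a local quality map. The data below are synthetic, and the surrounding super-resolution machinery of the thesis is not reproduced.

```python
import numpy as np

def fuse_with_local_quality(images, quality_maps, eps=1e-8):
    """Weighted average where each pixel's weight is its local quality score."""
    images = np.asarray(images, dtype=float)          # (n_frames, H, W)
    weights = np.asarray(quality_maps, dtype=float)   # same shape, higher = better
    return (weights * images).sum(axis=0) / (weights.sum(axis=0) + eps)

rng = np.random.default_rng(0)
frames = rng.uniform(0, 1, size=(5, 64, 64))          # registered frames (toy data)
quality = rng.uniform(0.1, 1.0, size=(5, 64, 64))     # e.g. GMM-based texture quality
quality[2, :, :32] = 0.0                              # simulate an occluded half-frame
fused = fuse_with_local_quality(frames, quality)
print(fused.shape, float(fused.mean()))
```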
APA, Harvard, Vancouver, ISO, and other styles
46

Olsén, Christina. "Towards Automatic Image Analysis for Computerised Mammography." Doctoral thesis, Umeå University, Computing Science, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1657.

Full text
Abstract:

Mammographic screening is an effective way to detect breast cancer. Early detection of breast cancer depends to a high degree on the adequacy of the mammogram from which the diagnosis is made. Today, most of the analysis of the mammogram is performed by radiologists. Computer-aided diagnosis (CAD) systems have been proposed as an aid to increase the efficiency and effectiveness of the screening procedure by automatically indicating abnormalities in the mammograms. However, in order for a CAD system to be stable and efficient, the input images need to be adequate. Criteria for adequacy include: high resolution, low image noise and high image contrast. Additionally, the breast needs to be adequately positioned and compressed to properly visualise the entire breast and especially the glandular tissue.

This thesis addresses questions regarding the automatic determination of mammogram adequacy, with the focus on breast positioning and segmentation evaluation. The goal, and thus the major technical challenge, is to develop algorithms that support fully automatic quality checks. The relevant quality criteria are discussed in Chapter 2. The aim of this discussion is to compile a comprehensive list of necessary criteria that a system for automatic assessment of mammographic adequacy must satisfy. Chapter 3 gives an overview of research performed in computer-aided analysis of mammograms. It also provides basic knowledge about the image analysis involved in the research area of computerized mammography in general, and in the papers of this thesis in particular. Chapter 4, in contrast, describes basic knowledge about segmentation evaluation, which is a highly important component in image analysis. Papers I–IV propose algorithms for measuring the quality of a mammogram according to certain criteria and address problems related to them. A method for automatic analysis of the shape of the pectoralis muscle is presented in Paper I. Paper II proposes a fully automatic method for extracting the breast border. A geometric assumption used by radiologists is that the nipple is located at the point on the breast border that is furthest away from the pectoralis muscle. This assumption is investigated in Paper III, and a method for automatically restricting the search area is proposed. There has been an increasing need for an automated segmentation algorithm for extracting the glandular tissue, where the majority of breast cancers occur. In Paper IV, a novel approach for solving this problem in a robust and accurate way is proposed. Paper V discusses the challenges involved in evaluating the quality of segmentation algorithms based on ground truths provided by an expert panel. A method to relate ground truths provided by several experts to each other, in order to establish levels of agreement, is proposed. Furthermore, this work is used to develop an algorithm that combines an ensemble of markings into one surrogate ground truth.
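A minimal stand-in for the idea of combining an ensemble of expert markings into one surrogate ground truth is a per-pixel majority vote over binary masks, sketched below; the thesis's agreement-level analysis is considerably richer than this, and the masks here are synthetic.

```python
import numpy as np

def surrogate_ground_truth(expert_masks):
    """Majority vote across binary expert segmentations of the same image."""
    stack = np.asarray(expert_masks, dtype=bool)     # (n_experts, H, W)
    votes = stack.sum(axis=0)
    return votes >= (stack.shape[0] / 2.0)           # ties count as foreground

rng = np.random.default_rng(0)
masks = rng.random((5, 128, 128)) > 0.4              # five hypothetical expert masks
consensus = surrogate_ground_truth(masks)
print("foreground fraction:", consensus.mean())
```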

APA, Harvard, Vancouver, ISO, and other styles
47

Vullaganti, Anoop. "Mechanical Parameter Characterization of Thin Polymer Films Using Digital Image Correlation." Thesis, Blekinge Tekniska Högskola, Institutionen för maskinteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-21653.

Full text
Abstract:
Mechanical parameter characterization of very thin polymer films using digital image correlation (DIC) is performed in this work. DIC is now widely used in construction, the food industry and aviation, yet despite its advantages over other conventional methods, users still face difficulties when analysing thin polymer films such as low-density polyethylene (LDPE) and polyethylene terephthalate (PET). Spray paints must be applied to obtain a good-quality stochastic speckle pattern, but the paint itself may temper the material properties of the thin film. This research work therefore investigates the effect of several spray paints on the material response of thin polymer film, and shows how to achieve good surface traction, what the effect of time is, and which type of spray should be used for the DIC analysis. Finally, it studies how specimen width affects wrinkling, a common phenomenon when testing thin polymer films, and identifies an appropriate width for reducing wrinkles in thin polymer films.
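One widely used indicator of speckle pattern quality in the DIC literature is the mean intensity gradient (MIG); the sketch below computes it for a pattern image. It is offered as background only, since the abstract does not state which metric the thesis uses, and the test images are synthetic.

```python
import numpy as np

def mean_intensity_gradient(image):
    """MIG: average gradient magnitude over the speckle image (higher = sharper pattern)."""
    gy, gx = np.gradient(image.astype(float))
    return np.mean(np.sqrt(gx ** 2 + gy ** 2))

rng = np.random.default_rng(0)
crisp = rng.random((256, 256)).round()                  # harsh black/white speckles
blurred = 0.5 * (crisp + np.roll(crisp, 1, axis=0))     # crude low-contrast pattern
print(mean_intensity_gradient(crisp), mean_intensity_gradient(blurred))
```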
APA, Harvard, Vancouver, ISO, and other styles
48

Lane, Andrea Marie. "Beverage Patterns and Diet Quality in US Children." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1492169912205993.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Duarte, Denise Helena Silva. "Padrões de ocupação do solo e microclimas urbanos na região de clima tropical continental." Universidade de São Paulo, 2000. http://www.teses.usp.br/teses/disponiveis/16/16131/tde-18072006-182858/.

Full text
Abstract:
The subject of this thesis is the urban microclimate of Brazilian cities in the continental tropical climate region. Starting from the principle that there is a correlation between urban microclimates and land occupation, the objective is to measure numerically the correlation between air temperature and variables that are familiar to urban planning and that can be regulated by municipal legislation, in order to guide the measures needed to ameliorate the harsh urban climate of the region. Qualitative and quantitative descriptions of the urban variables involved are provided, together with measurements of air temperature and humidity taken at different times of day in the two main seasons. The results show that the variables related to built density (site coverage and floor-area ratio) are positively correlated with air temperature and reflect a greater influence of built density at night, which agrees with the existing theory. Trees and water, on the other hand, are negatively correlated with air temperature at all times of day. Finally, an index based on the urban variables used is proposed, aiming to support future studies in determining the ideal proportion among built density, trees and water in urban environments.
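The kind of correlation the study reports can be computed directly from paired observations; the sketch below does so with fabricated placeholder numbers and variable names, not the thesis's measurements.

```python
import pandas as pd

# Hypothetical plot-level observations: evening air temperature vs. two urban variables.
df = pd.DataFrame({
    "air_temp_21h": [28.1, 27.4, 29.0, 26.8, 28.6, 27.9],   # degrees C
    "built_density": [2.1, 1.5, 2.8, 1.1, 2.4, 1.9],        # floor-area ratio
    "tree_cover": [0.10, 0.25, 0.05, 0.35, 0.08, 0.18],     # fraction of plot area
})

# Pearson correlation of each variable with air temperature.
print(df.corr(method="pearson")["air_temp_21h"])
```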
APA, Harvard, Vancouver, ISO, and other styles
50

Yogeswaran, Arjun. "3D Surface Analysis for the Automated Detection of Deformations on Automotive Panels." Thèse, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/19992.

Full text
Abstract:
This thesis examines an automated method to detect surface deformations on automotive panels for the purpose of quality control along a manufacturing assembly line. Automation in the automotive manufacturing industry is becoming more prominent, but quality control is still largely performed by human workers. Quality control is important in the context of automotive body panels because deformations can occur along the assembly line from causes such as inadequate handling of parts or tools around a vehicle during assembly, rack storage, and shipping from subcontractors. These defects are currently identified and marked before panels are either rectified or discarded. This work attempts to develop an automated system to detect deformations, alleviating the dependence on human workers in quality control and improving performance by increasing speed and accuracy. Some techniques make use of an ideal CAD model acting as a master work, and panels scanned on the assembly line are compared to this model to determine the location of deformations. This thesis presents a solution for detecting deformations of various scales without a master work. It also focuses on automated analysis requiring minimal, intuitive operator-set parameters, and provides the ability to classify the deformations as dings, which are deformations that protrude from the surface, or dents, which are depressions into the surface. A complete automated deformation detection system is proposed, comprised of a feature extraction module, a segmentation module, and a classification module, which outputs the locations of deformations when provided with the 3D mesh of an automotive panel. Two feature extraction techniques are proposed. The first is a general feature extraction technique for 3D meshes using octrees for multi-resolution analysis, which evaluates the amount of surface variation to locate deformations. The second is specifically designed for the purpose of deformation detection, and analyzes multi-resolution cross-sections of a 3D mesh to locate deformations based on their estimated size. The performance of the proposed automated deformation detection system, and of all its sub-modules, is tested on a set of meshes which represent differing characteristics of deformations in surface panels, including deformations of different scales. Noisy, low-resolution meshes are captured from a 3D acquisition, while artificial meshes are generated to simulate ideal acquisition conditions. The proposed system shows accurate results in both ideal and non-ideal situations, in the presence of noise and complex surface curvature, by extracting only the deformations of interest and accurately classifying them as dings or dents.
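One ingredient the abstract describes, measuring local surface variation to highlight candidate deformations, can be approximated on a point cloud with PCA over k-nearest-neighbour patches, as sketched below; this is not the thesis's octree implementation, and the toy panel data are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(points, k=30):
    """Per-point variation = smallest eigenvalue / sum of eigenvalues of the local covariance."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    variation = np.empty(len(points))
    for i, neighbours in enumerate(idx):
        patch = points[neighbours] - points[neighbours].mean(axis=0)
        eigvals = np.linalg.eigvalsh(patch.T @ patch)   # ascending order
        variation[i] = eigvals[0] / (eigvals.sum() + 1e-12)
    return variation   # near 0 on flat panel regions, larger on dings/dents

# Toy panel: a flat plate with a small Gaussian bump standing in for a ding.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(4000, 2))
z = 0.05 * np.exp(-((xy[:, 0] - 0.3) ** 2 + (xy[:, 1] + 0.2) ** 2) / 0.01)
cloud = np.column_stack([xy, z])
v = surface_variation(cloud)
print("flagged points:", int((v > v.mean() + 3 * v.std()).sum()))
```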
APA, Harvard, Vancouver, ISO, and other styles