Dissertations on the topic "Visualization of the processing process"

To view other types of publications on this topic, follow the link: Visualization of the processing process.


Explore the top 50 dissertations for research on the topic "Visualization of the processing process".

Next to every work in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read the online abstract of the work, if these are available in the metadata.

Browse dissertations on a wide variety of disciplines and compile your bibliography correctly.

1

Gomes, Ricardo Rafael Baptista. "Long-term biosignals visualization and processing." Master's thesis, Faculdade de Ciências e Tecnologia, 2011. http://hdl.handle.net/10362/7979.

Abstract:
Thesis submitted in the fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Long-term biosignal acquisitions are an important source of information about a patient's state and its evolution. However, long-term biosignal monitoring involves managing extremely large datasets, which makes signal visualization and processing a complex task. To overcome these problems, a new data structure to manage long-term biosignals was developed. Based on this new data structure, dedicated tools for long-term biosignal visualization and processing were implemented. A multilevel visualization tool for any type of biosignal, based on subsampling, is presented, focused on four representative signal parameters (mean, maximum, minimum and standard deviation). The visualization tool provides an overview of the entire signal and a more detailed view of the specific parts we want to highlight, allowing a user-friendly interaction that makes signal exploration easier. The "map" and "reduce" concept is also applied to long-term biosignal processing, and a processing tool (ECG peak detection) was adapted for long-term biosignals. To test the developed algorithm, long-term biosignal acquisitions (approximately 8 hours each) were carried out. The visualization tool proved faster than standard methods, allowing fast navigation over the different visualization levels of a biosignal. The developed processing algorithm detected the peaks of long-term ECG signals in less time than the non-parallel algorithm. The general-purpose nature of the new data structure and visualization tool, and the speed improvement in signal processing introduced by these algorithms, make them powerful tools for long-term biosignal visualization and processing.
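The multilevel, subsampling-based overview described in this abstract reduces each block of samples to the four summary parameters listed. A minimal sketch of that per-block reduction (function and variable names are illustrative, not the thesis code):

```python
import math

def summarize_blocks(signal, block_size):
    """Reduce a long signal to per-block (mean, min, max, std) tuples.

    One pass like this per zoom level yields a coarse overview that is
    cheap to draw; finer levels simply use smaller block sizes.
    """
    summaries = []
    for start in range(0, len(signal), block_size):
        block = signal[start:start + block_size]
        mean = sum(block) / len(block)
        var = sum((x - mean) ** 2 for x in block) / len(block)
        summaries.append((mean, min(block), max(block), math.sqrt(var)))
    return summaries

# A toy "long" signal: a slow ramp. Each summary point stands in for 4 samples.
overview = summarize_blocks(list(range(16)), block_size=4)
```

Drawing `overview` instead of the raw samples is what makes the whole-signal view fast; zooming into a region switches to a level with a smaller `block_size`.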
2

Reach, Andrew McCaleb. "Smooth Interactive Visualization." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/78848.

Abstract:
Information visualization is a powerful tool for understanding large datasets. However, many commonly used techniques in information visualization are not C^1 smooth, i.e. when represented as a function, they are either discontinuous or have a discontinuous first derivative. For example, histograms are a non-smooth visualization of density. Not only are histograms non-smooth visually, but they are also non-smooth over their parameter space, as they change abruptly in response to smooth changes of bin width or bin offset. For large data visualization, histograms are commonly used in place of smooth alternatives, such as kernel density plots, because histograms can be constructed from data cubes, allowing them to be built quickly for large datasets. Another example of a non-smooth technique in information visualization is the commonly used transition approach to animation. Although transitions are designed to create smooth animations, the transition technique produces animations that have velocity discontinuities if the target is changed before the transition has finished. The smooth and efficient zooming and panning technique shares this problem: the animations produced are smooth while in flight, but they have velocity discontinuities at the beginning and end of the animation, as well as when interrupted. This dissertation applies ideas from signal processing to construct smooth alternatives to these non-smooth techniques. To visualize density for large datasets, we propose BLOCs, a smooth alternative to data cubes that allows kernel density plots to be constructed quickly for large datasets after an initial preprocessing step. To create animations that remain smooth even when interrupted, we present LTI animation, a technique based on linear time-invariant (LTI) filters.
To create zooming and panning animations that are smooth, even when interrupted, we generalize signal processing systems to Riemannian manifolds, resulting in smooth, efficient, and interruptible animations.
Ph. D.
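The LTI-filter idea behind interruptible animation can be illustrated with a critically damped spring, a second-order LTI system: because position and velocity are both filter state, retargeting mid-flight changes only the acceleration, so the motion stays velocity-continuous. A sketch with invented parameters, not code from the dissertation:

```python
def step_spring(pos, vel, target, omega, dt):
    """One semi-implicit Euler step of a critically damped spring.
    Retargeting changes only the acceleration term, never pos or vel,
    so an interrupted animation has no velocity discontinuity."""
    acc = omega * omega * (target - pos) - 2.0 * omega * vel
    vel += acc * dt
    pos += vel * dt
    return pos, vel

pos, vel = 0.0, 0.0
target = 100.0
trajectory = []
for i in range(200):
    if i == 50:           # interruption: the user re-aims mid-animation
        target = 20.0
    pos, vel = step_spring(pos, vel, target, omega=8.0, dt=0.01)
    trajectory.append((pos, vel))
```

Contrast this with a restarted transition, which would reset the velocity to zero at the interruption and produce the visible "hitch" the abstract describes.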
3

MacDonald, Darren T. "Image segment processing for analysis and visualization." Thesis, University of Ottawa (Canada), 2008. http://hdl.handle.net/10393/27641.

Abstract:
This thesis is a study of the probabilistic relationship between objects in an image and image appearance. We give a hierarchical, probabilistic criterion for the Bayesian segmentation of photographic images. We validate the segmentation against the Berkeley Segmentation Data Set, where human subjects were asked to partition digital images into segments each representing a 'distinguished thing'. We show that there exists a strong dependency between the hierarchical segmentation criterion, based on our assumptions about the visual appearance of objects, and the distribution of ground truth data. That is, if two pixels have similar visual properties then they will often have the same ground truth state. Segmentation accuracy is quantified by measuring the information cross-entropy between the ground truth probability distribution and an estimate obtained from the segmentation. We consider the proposed method for estimating joint ground truth probability to be an important tool for future image analysis and visualization work.
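The cross-entropy measure of segmentation accuracy described above can be written down directly. A small sketch with made-up distributions, not the thesis implementation:

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i): the expected code length for the
    ground-truth distribution p under the segmentation's estimate q.
    It is minimized, and equal to the entropy of p, only when q = p."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# Hypothetical ground-truth probability that two pixels share a segment,
# versus a perfect estimate and an uninformative one.
truth = [0.7, 0.3]
perfect = cross_entropy(truth, [0.7, 0.3])
uninformed = cross_entropy(truth, [0.5, 0.5])
```

The gap between `uninformed` and `perfect` quantifies how much ground-truth information the segmentation-based estimate fails to capture.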
4

Таран, Євгеній Сергійович. "Оправка розточна збірна" [Assembled boring mandrel]. Bachelor's thesis, КПІ ім. Ігоря Сікорського, 2019. https://ela.kpi.ua/handle/123456789/31456.

Abstract:
The purpose of the diploma project is the design of an assembled boring mandrel, used for machining internal and external cylindrical surfaces, drilling holes, turning internal grooves and facing. The tool is equipped with replaceable carbide inserts securely clamped in the tool body. The project also analyses the designs of tools for machining various types of surfaces, presents a working drawing and a 3D model of the tool, develops the manufacturing technology and calculates the cutting parameters, selects and designs a fixture for milling the seating surfaces for the carbide insert, and presents a control program for the CNC machine together with a visualization of the machining process.
5

Wikström, Anders. "A Design Process Based on Visualization." Licentiate thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-10395.

Abstract:
Today's global market is tough, and the competition between companies demands new ways of developing products and services. The current challenge for the design research community is to provide designers with a wider range of methods and tools to support specific activities within the design process and to improve its overall coordination. It is uncertain whether problem-solving alone can be the tool for developing even simple products or services, as it leaves us less open to the variety of opportunities that arise in the process. When approaching a problem, the cognitive mindset demanded to solve it differs from that required when searching for opportunities to create something completely new. This emphasizes the importance of design thinking, using tools and methods for Human Centered Design (HCD), empathy, and intuition, along with the use of visualization. By focusing on the creative process and the use of sketches or models, this research explores the possibility of developing tools and methods for conducting New Product Development (NPD) projects in a more efficient and effective way. It provides a prescriptive model for using visualization in NPD. This model is the result of clarifying the problem, understanding the factors that affect it, and developing support and a solution for implementing the model in industry. The model serves as a supportive tool for project teams and as guidelines for management.
6

Marokkey, Sajan Raphael. "Digital techniques for dynamic visualization in photomechanics." Thesis, Hong Kong : University of Hong Kong, 1995. http://sunzi.lib.hku.hk/hkuto/record.jsp?B14670896.

7

Hicks, Jeremy L. "Visualization of polymer processing at the continuum level." Connect to this title online, 2006. http://etd.lib.clemson.edu/documents/1171902589/.

8

Zhao, Hongyan. "A visualization tool to support Online Analytical Processing." [Gainesville, Fla.] : University of Florida, 2002. http://purl.fcla.edu/fcla/etd/UFE0000622.

9

Al, Beayeyz Alaa. "The Effect of IT Process Support, Process Visualization and Process Characteristics on Process Outcomes." Thesis, University of North Texas, 2013. https://digital.library.unt.edu/ark:/67531/metadc407777/.

Abstract:
Business process re-engineering (part of the Business Process Management domain) is among the top three concerns of Information Technology (IT) leaders and is deemed to be one of many important IT leveraging opportunities. Two major challenges have been identified in relation to BPM and the use of IT. The first challenge is related to involving business process participants in process improvement initiatives using BPM systems. BPM technologies are considered to be primarily targeted for developers and not BPM users, and the need to engage process participants into process improvement initiatives is not addressed, contributing to the business-IT gap. The second challenge is related to potential de-skilling of knowledge workers when knowledge-intensive processes are automated and process knowledge resides in IT, rather than human process participants. The two identified challenges are not separate issues. Process participants need to be knowledgeable about the process in order to actively contribute to BPM initiatives, and the loss of process knowledge as a result of passive use of automated systems may further threaten their participation in process improvement. In response to the call for more research on the individual impacts of business process initiatives, the purpose of this dissertation study is to understand the relationship between IT configurations (particularly process support and process visualization), process characteristics and individual level process outcomes, such as task performance and process knowledge. In the development of the research model we rely on organizational knowledge creation literature and scaffolding in Vygotsky’s Zone of Proximal Development, business process modeling and workflow automation research, as well as research on the influence of IT on individual performance. The theoretical model is tested empirically in experimental settings using a series of two studies. 
In both studies participants were asked to complete tasks as part of a business process using different versions of a mock-up information system. Together, the studies evaluate the effect of IT process support, process visualization and process complexity on process participant performance and process knowledge. The results of the studies show the significant influence of IT process support on individual process outcomes. The studies indicate that task performance does increase, but at the cost of users' process knowledge. Process visualization, however, is shown to enhance users' process knowledge in the absence of formal process training, while having no negative impact on task performance. The key contribution of this research is that it suggests a practical way to counteract potential negative effects of IT process automation by converting the use of the information system into a learning experience, where the IT itself acts as a scaffold for the acquisition of process knowledge. The results have practical implications for the design of workflow automation systems, as well as for process training.
10

Park, Joonam. "A visualization system for nonlinear frame analysis." Thesis, Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/19172.

11

Huang, Shiping. "Exploratory visualization of data with variable quality." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-01115-225546/.

12

Knežević, Jovana. "Schedule visualization and analysis for halide image processing language." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/85431.

Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 43).
Image processing applications require high-performance software implementations in order to handle large input data and to run efficiently on smaller mobile devices. Halide is a language and compiler for optimizing image processing pipelines. Halide introduces a separation between the algorithm, the logic of the program, and the schedule, the order of execution. This thesis focuses on providing an interactive GUI for visual analysis of Halide schedules. It creates a visualization of the order of execution and provides tools for analyzing three important aspects of image processing schedules: redundancy, locality and parallelism. The tool is designed for Halide programmers who want to gain a better understanding of scheduling in Halide and receive guidance for schedule optimizations.
by Jovana Knezevic.
M. Eng.
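The algorithm/schedule separation that this thesis visualizes can be illustrated without Halide itself: the same two-stage pipeline evaluated under two different schedules produces identical output but different amounts of redundant work, which is exactly the redundancy/locality trade-off the tool analyzes. A toy Python sketch, deliberately not Halide's API:

```python
def blur_x(image, x, y, calls):
    """First pipeline stage; `calls` counts evaluations so the
    redundancy introduced by a schedule becomes measurable."""
    calls[0] += 1
    return (image[y][x] + image[y][x + 1]) / 2

def run_inline(image, w, h):
    """Schedule 1: recompute blur_x at every use. Good locality, but
    each interior blur_x value is computed twice (redundant work)."""
    calls = [0]
    out = [[(blur_x(image, x, y, calls) + blur_x(image, x, y + 1, calls)) / 2
            for x in range(w)] for y in range(h)]
    return out, calls[0]

def run_root(image, w, h):
    """Schedule 2: compute blur_x once into a buffer. No redundancy,
    but the whole intermediate must be stored (worse locality)."""
    calls = [0]
    buf = [[blur_x(image, x, y, calls) for x in range(w)] for y in range(h + 1)]
    out = [[(buf[y][x] + buf[y + 1][x]) / 2 for x in range(w)] for y in range(h)]
    return out, calls[0]

image = [[float(x + y) for x in range(5)] for y in range(5)]
out_inline, n_inline = run_inline(image, 4, 4)
out_root, n_root = run_root(image, 4, 4)
```

Both schedules compute the same image; only the evaluation-count differs, which is the kind of fact a schedule visualization makes visible at a glance.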
13

Najim, S. A. "Faithful visualization and dimensionality reduction on graphics processing unit." Thesis, Bangor University, 2014. https://research.bangor.ac.uk/portal/en/theses/faithful-visualization-and-dimensionality-reduction-on-graphics-processing-unit(527800f6-191c-4257-98d1-7909a1ab9ead).html.

14

Mattasantharam, R. (Rubini). "3D web visualization of continuous integration big data." Master's thesis, University of Oulu, 2018. http://urn.fi/URN:NBN:fi:oulu-201812063239.

Abstract:
Continuous Integration (CI) is a practice used to automate the software build and its tests for every code integration to a shared repository. CI runs thousands of test scripts every day in a software organization. Every test produces data, such as test result logs (errors, warnings, performance measurements) and build metrics. This data volume tends to grow at unprecedented rates for the builds produced in the CI system, and the amount of integrated test result data grows over time. Visualizing and manipulating real-time, dynamic data is a challenge for organizations. 2D visualization of big data is in active use in the software industry. Although 2D visualization has numerous advantages, this study focuses on the 3D representation of CI big data and its advantages over 2D visualization. Interactivity with the data and the system, and accessibility of the data anytime, anywhere, are two important requirements for the system to be usable. Thus, the study focused on creating a 3D user interface to visualize CI system data in a 3D web environment. Three-dimensional user interfaces have been studied by many researchers, who have identified various advantages of 3D visualization along with various interaction techniques, and have described how such systems are useful in real-world 3D applications. However, the usability of 3D user interfaces in visualization has not yet reached a desirable level, especially in the software industry, due to its complex data. The purpose of this thesis is to explore the use of 3D data visualization to help the CI system users of a beneficiary organization interpret and explore CI system data. The study focuses on designing and creating a 3D user interface to provide a more effective and usable system for CI data exploration. Design science research is chosen as a suitable research method to conduct the study.

This study identifies the advantages of applying 3D visualization to software system data and then explores how 3D visualization could help users explore the software data through visualization and its features. The results of the study reveal that 3D visualization helps the beneficiary organization to view and compare multiple datasets in a single screen space, and to see both a holistic view of large datasets and focused details of multiple datasets of various categories in a single screen space. The results also suggest that 3D visualization helps the beneficiary organization's CI team to represent big data better in 3D than in 2D.
15

Braunisch, Jan. "Language Independent Speech Visualization." Thesis, Linköpings universitet, Reglerteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-71040.

Abstract:
A speech visualization system is proposed that could be used by a deaf person for understanding speech. Several novel techniques are proposed, including: (1) minimizing spectral leakage in the Fourier transform by using a variable-length window; (2) making use of the fact that there is no spectral leakage in order to calculate how much of the energy of the speech signal is due to its periodic component vs. its non-periodic component; (3) modelling the mouth and lips as a band-pass filter and estimating the central frequency and bandwidth of this filter in order to assign colours to unvoiced speech sounds.
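Techniques (1) and (2) can be sketched together: when the analysis window spans a whole number of pitch periods, a rectangular-window DFT has no spectral leakage, so the periodic component's energy lands exactly in the harmonic bins and the remainder is aperiodic. A minimal illustration with invented parameters, not the thesis code:

```python
import cmath
import math
import random

def harmonic_energy_ratio(x, period):
    """DFT over an integer number of pitch periods. With no leakage,
    the periodic part of the signal lives entirely in the harmonic bins
    (multiples of len(x) // period), so their share of the total
    spectral energy estimates how 'voiced' the frame is."""
    n = len(x)
    assert n % period == 0          # window = whole number of periods
    spectrum = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)) for k in range(n)]
    energy = [abs(c) ** 2 for c in spectrum]
    step = n // period              # harmonic spacing in bins
    harmonic = sum(energy[k] for k in range(0, n, step))
    return harmonic / sum(energy)

period = 16
tone = [math.sin(2 * math.pi * t / period) for t in range(4 * period)]
pure = harmonic_energy_ratio(tone, period)       # ~1.0 for a pure tone

random.seed(0)
noisy = [v + random.uniform(-0.5, 0.5) for v in tone]
mixed = harmonic_energy_ratio(noisy, period)     # below 1.0: noise leaks everywhere
```

In a real system the period would come from a pitch estimator, and the window length would be adjusted per frame; that adjustment is exactly the variable-length-window idea of technique (1).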
16

Thakar, Aniruddha. "Visualization feedback from informal specifications." Thesis, This resource online, 1993. http://scholar.lib.vt.edu/theses/available/etd-03242009-040810/.

17

Cai, Bo. "Scattered Data Visualization Using GPU." University of Akron / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=akron1428077896.

18

Kitasaka, Takayuki, Kensaku Mori, and Yasuhito Suenaga. "New Paradigm of Medical Image Processing - Visualization, Detection, and Navigation-." INTELLIGENT MEDIA INTEGRATION NAGOYA UNIVERSITY / COE, 2005. http://hdl.handle.net/2237/10458.

19

Renjifo, Carlos A. "Exploration, processing and visualization of physiological signals from the ICU." Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/33350.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005.
Includes bibliographical references (p. 119-120).
This report studies physiological signals measured from patients in the Intensive Care Unit (ICU). The signals explored include heart rate, arterial blood pressure, pulmonary artery pressure, and central venous pressure measurements. Following an introduction to these signals, several methods are proposed for visualizing the data using time and frequency domain techniques. By way of a patient case study we motivate a novel method for data clustering based on the singular value decomposition and present some potential applications based on this method for use within the ICU setting.
by Carlos A. Renjifo.
M.Eng.
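The SVD-based clustering idea mentioned in this abstract can be sketched with a power iteration in place of a full SVD; the thesis's actual features and method are not reproduced here, and all names and data are illustrative:

```python
import math

def leading_singular_vector(rows, iters=200):
    """Power iteration on X^T X yields the first right singular vector
    of X: the direction of greatest variation across the recordings."""
    dim = len(rows[0])
    v = [1.0] * dim
    for _ in range(iters):
        xv = [sum(r[j] * v[j] for j in range(dim)) for r in rows]   # X v
        w = [sum(rows[i][j] * xv[i] for i in range(len(rows)))      # X^T (X v)
             for j in range(dim)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v

def cluster_by_projection(rows):
    """Split recordings into two clusters by the sign of their
    projection onto the leading singular direction."""
    v = leading_singular_vector(rows)
    return [0 if sum(a * b for a, b in zip(r, v)) >= 0 else 1 for r in rows]

# Two synthetic "patient state" groups lying along opposite directions.
group_a = [[5 + 0.1 * i, 5.0, 0.0] for i in range(3)]
group_b = [[-5 - 0.1 * i, -5.0, 0.0] for i in range(3)]
labels = cluster_by_projection(group_a + group_b)
```

Projecting onto the leading singular vectors compresses correlated physiological signals into a few coordinates, which is what makes a simple sign or threshold rule enough to separate regimes.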
20

Narayanan, Shruthi (Shruthi P. ). "Real-time processing and visualization of intensive care unit data." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119537.

Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 83).
Intensive care unit (ICU) patients undergo detailed monitoring so that copious information regarding their condition is available to support clinical decision-making. Full utilization of the data depends heavily on its quantity, quality and manner of presentation to the physician at the bedside of a patient. In this thesis, we implemented a visualization system to aid ICU clinicians in collecting, processing, and displaying available ICU data. Our goals for the system are: to be able to receive large quantities of patient data from various sources, to compute complex functions over the data that are able to quantify an ICU patient's condition, to plot the data using a clean and interactive interface, and to be capable of live plot updates upon receiving new data. We made significant headway toward our goals, and we succeeded in creating a highly adaptable visualization system that future developers and users will be able to customize.
by Shruthi Narayanan.
M. Eng.
21

Elshahali, Mai Hassan Ahmed Ali. "Real-Time Processing and Visualization of 3D Time-Variant Datasets." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/56582.

Abstract:
Scientific visualization is primarily concerned with the visual presentation of three-dimensional phenomena in domains like medicine, meteorology, astrophysics, etc. The emphasis in scientific visualization research has been on the efficient rendering of measured or simulated data points, surfaces, volumes, and a time component to convey the dynamic nature of the studied phenomena. With the explosive growth in the size of the data, interactive visualization of scientific data becomes a real challenge. In recent years, the graphics community has witnessed tremendous improvements in the performance capabilities of graphics processing units (GPUs), and advances in GPU-accelerated rendering have enabled data exploration at interactive rates. Nevertheless, the majority of techniques rely on the assumption that a true three-dimensional geometric model capturing the physical phenomena of interest is available and ready for visualization. Unfortunately, this assumption does not hold true in many scientific domains, in which measurements are obtained from a given scanning modality at sparsely located intervals in both space and time. This calls for the fusion of data collected from multiple sources in order to fill the gaps and tell the story behind the data. For years, data fusion has relied on machine learning techniques to combine data from multiple modalities, reconstruct missing information, and track features of interest through time. However, these techniques fall short for datasets with large spatio-temporal gaps. This realization has led researchers in the data fusion domain to acknowledge the importance of human-in-the-loop methods, where human expertise plays a major role in data reconstruction. This PhD research focuses on developing visualization and interaction techniques aimed at addressing some of the challenges that experts face when analyzing the spatio-temporal behavior of physical phenomena.
Given a number of datasets obtained from different measurement modalities and from simulation, we propose a generalized framework that can guide research in the field of multi-sensor data fusion and visualization. We advocate the use of GPU parallelism in our developed techniques in order to emphasize interaction as a key component in the successful exploration and analysis of multi-sourced data sets. The goal is to allow the user to create a mental model that captures their understanding of the spatio-temporal behavior of features of interest; one which they can test against real data measurements to verify their model. This model creation and verification is an iterative process in which the user interacts with the visualization, explores and builds an understanding of what occurred in the data, then tests this understanding against real-world measurements and improves it. We developed a system as a reference implementation of the proposed framework. Reconstructed data is rendered in a way that completes the users' cognitive model, which encodes their understanding of the phenomena in question with a high degree of accuracy. We tested the usability of the system and evaluated its support for this cognitive model construction process. Once an acceptable model is constructed, it is fed back to the system in the form of a reference dataset, which our framework uses to guide the real-time tracking of measurement data. Our results show that interactive exploration tasks enable the construction of this cognitive model and reference set, and that real-time interaction is achievable during the exploration, reconstruction, and enhancement of multi-modal time-variant three-dimensional data, by designing and implementing advanced GPU-based visualization techniques.
Ph. D.
22

McGraw, Tim E. "Denoising, segmentation and visualization of diffusion weighted MRI." [Gainesville, Fla.] : University of Florida, 2005. http://purl.fcla.edu/fcla/etd/UFE0011618.

23

Chung, David H. S. "High-dimensional glyph-based visualization and interactive techniques." Thesis, Swansea University, 2014. https://cronfa.swan.ac.uk/Record/cronfa42276.

Abstract:
The advancement of modern technology and scientific measurement has led to datasets growing in both size and complexity, exposing the need for more efficient and effective ways of visualizing and analysing data. Despite the amount of progress in visualization methods, high-dimensional data still poses a number of significant challenges, both in the technical ability to realise such a mapping and in how accurately it is interpreted. The different data sources and characteristics arising from a wide range of scientific domains, as well as specific design requirements, constantly create new challenges for visualization research. This thesis presents several contributions to the field of glyph-based visualization. Glyphs are parametrised objects which encode one or more data values in their appearance (also referred to as visual channels), such as their size, colour, shape, and position. They have been widely used to convey information visually, and are especially well suited for displaying complex, multi-faceted datasets. Their major strength is the ability to depict patterns of data in the context of a spatial relationship, where multi-dimensional trends can often be perceived more easily. Our research is set in the broad scope of multi-dimensional visualization, addressing several aspects of glyph-based techniques, including visual design, perception, placement, interaction, and applications. In particular, this thesis presents a comprehensive study of one interaction technique, namely sorting, for supporting various analytical tasks.
We have outlined the concepts of glyph-based sorting, identified a set of design criteria for sorting interactions, designed and prototyped a user interface for sorting multivariate glyphs, developed a visual analytics technique to support sorting, conducted an empirical study on the perceptual orderability of visual channels used in glyph design, and applied glyph-based sorting to event visualization in sports applications. The content of this thesis is organised into two parts. Part I provides an overview of the basic concepts of glyph-based visualization, before describing the state of the art in this field. We then present a collection of novel glyph-based approaches to address challenges created by real-world applications. These are detailed in Part II. Our first approach involves designing glyphs to depict the composition of multiple error-sensitivity fields. This work addresses the problem of single-camera positioning, using both 2D and 3D methods to support camera configuration based on various constraints in the context of a real-world environment. Our second approach presents glyphs to visualize actions and events "at a glance". We discuss the relative merits of metaphoric glyphs, in comparison to other types of glyph design, for the particular problem of real-time sports analysis. As a result of this research, we delivered a visualization application, MatchPad, on a tablet computer. It successfully helped coaching staff and team analysts to examine actions and events in detail whilst maintaining a clear overview of the match, and assisted their decision making during matches. Abstract shortened by ProQuest.
24

Peng, Wei. "Clutter-based dimension reordering in multi-dimensional data visualization." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-01115-222940.

25

Maloney, Ross J. "Assisting reading and analysis of text documents by visualization." PhD thesis, Murdoch University, 2005. http://researchrepository.murdoch.edu.au/177/.

Abstract:
The research reported here examined the use of computer generated graphics as a means to assist humans to analyse text documents which have not been subject to markup. The approach taken was to survey available visualization techniques in a broad selection of disciplines including applications to text documents, group those techniques using a taxonomy proposed in this research, then develop a selection of techniques that assist the text analysis objective. Development of the selected techniques from their fundamental basis, through their visualization, to their demonstration in application, comprises most of the body of this research. A scientific orientation employing measurements, combined with visual depiction and explanation of the technique with limited mathematics, is used as opposed to fully utilising any one of those resulting techniques for performing complete text document analysis. Visualization techniques which apply directly to the text and those which exploit measurements produced by associated techniques are considered. Both approaches employ visualization to assist the human viewer to discover patterns which are then used in the analysis of the document. In the measurement case, this requires consideration of data with dimensions greater than three, which imposes a visualization difficulty. Several techniques for overcoming this problem are proposed. Word frequencies, Zipf considerations, parallel coordinates, colour maps, Cusum plots, and fractal dimensions are some of the techniques considered. One direct application of visualization to text documents is to assist reading of that document by de-emphasising selected words by fading them on the display from which they are read. Three word selection techniques are proposed for the automatic selection of which words to use. An experiment is reported which used such word fading techniques. It indicated that some readers do have improved reading speed under such conditions, but others do not. 
The experimental design enabled the separation of that group which did decrease reading times from the remaining readers who did not. Measurements of comprehension errors made under different types of word fading were shown not to increase beyond those obtained under normal reading conditions. A visualization based on categorising the words in a text document is proposed, which contrasts with visualizations of measurements based on counts. The result is a visual impression of the word composition, and the evolution of that composition within that document. The text documents used to demonstrate these techniques include English novels and short stories, emails, and a series of eighteenth-century newspaper articles known as the Federalist Papers. This range of documents was needed because not all analysis techniques are applicable to all types of documents. This research proposes that an interactive use of the techniques on hand, in a non-prescribed order, can yield useful results in a document analysis. An example of this is author attribution, i.e. assigning authorship of documents via patterns characteristic of an individual's writing style. Different visual techniques can be used to explore the patterns of writing in given text documents. A software toolkit as a platform for implementing the proposed interactive analysis of text documents is described. How the techniques could be integrated into such a toolkit is outlined. A prototype of software to implement such a toolkit is included in this research. Issues relating to implementation of each technique used are also outlined.
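The word-frequency selection step described in this abstract can be sketched in a few lines: rank words by frequency (the Zipf-style reasoning mentioned above) and pick the most common ones as fading candidates. This is an illustrative reconstruction, not the thesis software; the sample text and `top_n` cutoff are invented for demonstration.

```python
# Illustrative sketch: select the most frequent words in a text as
# candidates for de-emphasis (fading), following a Zipf-style ranking.
from collections import Counter

def fade_candidates(text, top_n=3):
    """Return the top_n most frequent words, lowercased."""
    words = text.lower().split()
    counts = Counter(words)
    return [w for w, _ in counts.most_common(top_n)]

sample = "the cat sat on the mat and the dog sat too"
print(fade_candidates(sample))  # most frequent words come first
```

A real system would also normalise punctuation and possibly restrict selection to function words, as the abstract's three selection techniques suggest.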
Citation styles: APA, Harvard, Vancouver, ISO, etc.
26

Kolb, Jens [Verfasser]. "Abstraction, visualization, and evolution of process models / Jens Kolb." Ulm : Universität Ulm. Fakultät für Ingenieurwissenschaften und Informatik, 2015. http://d-nb.info/1076083889/34.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
27

HUANG, HEXIANG. "APPLICATION OF VISUALIZATION IN URBAN PLANNING DECISION-MAKING PROCESS." University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1100979099.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
28

Wisby, C. "Real-time digital imaging techniques for flow visualization." Thesis, University of Surrey, 1989. http://epubs.surrey.ac.uk/848586/.

Full text of the source
Abstract:
A real-time digital imaging technique has been applied to smoke flow visualized turbulent flows to provide statistical data concerning bluff body wakes. The 'digital imaging technique' has been successfully applied to the wake of a two-dimensional flat plate, circular cylinder and a jet in a crossflow configuration. A detailed study of the two-dimensional flat plate model involved comparative hot-wire and pressure measurements combined with data from previously published experimental investigations. The results obtained included intermittency measurements, vortex shedding spectral analyses (autocorrelations), spatial correlations, wake interface statistics and turbulence data. In the majority of cases, the digital imaging technique was found to provide excellent quantitative detail whilst also offering some unique wake interface statistics. The experiments conducted on the circular cylinder model revealed details of secondary vortex shedding and their base-bleed dependence, whilst the jet in a crossflow configuration enabled the imaging technique to be applied to a complex, three-dimensional flow model. The resulting iso-intermittency contour map was produced expediently, and within an experimental period far shorter than could be expected for single-location probe measurements. In addition to the above-outlined quantitative technique, real-time digital imaging was also applied more qualitatively to the study of dynamic stall on an aerofoil and to the enhancement of high-speed vapour-screen visualizations, both techniques offering the possibility for enhanced quantitative flow studies in future investigations. Finally, true-colour video digitisation has been exploited in a preliminary study of the quantification of global surface shear stress values using liquid crystal technology. Although in its infancy, the realisation of an experimental procedure along such lines would be of immense benefit to experimental aerodynamic research.
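The intermittency measurement mentioned in this abstract lends itself to a compact sketch: threshold a pixel's grey level over successive video frames (smoke present / absent) and take the fraction of frames above threshold as the intermittency factor. The threshold and grey-level series below are hypothetical, not data from the thesis.

```python
# Hedged sketch of an intermittency factor at one image location:
# fraction of video frames in which the grey level exceeds a threshold.

def intermittency(samples, threshold):
    flags = [1 if s > threshold else 0 for s in samples]
    return sum(flags) / len(flags)

pixel_series = [10, 200, 220, 15, 230, 12, 240, 250]  # grey levels over time
print(intermittency(pixel_series, threshold=128))
```

Repeating this per pixel over a whole image sequence would yield the kind of iso-intermittency contour map the abstract describes.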
Citation styles: APA, Harvard, Vancouver, ISO, etc.
29

Brattain, Laura. "Enhanced Ultrasound Visualization for Procedure Guidance." Thesis, Harvard University, 2014. http://dissertations.umi.com/gsas.harvard:11649.

Full text of the source
Abstract:
Intra-cardiac procedures often involve fast-moving anatomic structures with large spatial extent and high geometrical complexity. Real-time visualization of the moving structures and instrument-tissue contact is crucial to the success of these procedures. Real-time 3D ultrasound is a promising modality for procedure guidance as it offers improved spatial orientation information relative to 2D ultrasound. Imaging rates at 30 fps enable good visualization of instrument-tissue interactions, far faster than the volumetric imaging alternatives (MR/CT). Unlike fluoroscopy, 3D ultrasound also allows better contrast of soft tissues, and avoids the use of ionizing radiation.
Engineering and Applied Sciences
Citation styles: APA, Harvard, Vancouver, ISO, etc.
30

Ho, Sai-chuen, and 何世全. "Process roaming." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B38983424.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
31

Monkevich, James Matthew. "Analysis of Aperture Radiation Using Computer Visualization and Image-Processing Techniques." Thesis, Virginia Tech, 1998. http://hdl.handle.net/10919/36735.

Full text of the source
Abstract:
In order to accurately describe the behavior of an antenna, one needs to understand the radiation mechanisms that govern its operation. One way to gain such an insight is to view the fields and currents present on a radiating structure. Unfortunately, in close proximity to an antenna, empirical techniques fail because the measurement probe alters the operation of the radiating structure. Computational methods offer a solution to this problem. By simulating the operation of an antenna, one can obtain electromagnetic field data near (or even internal to) a radiating structure. However, these computationally intensive techniques often generate extremely large data sets that cannot be adequately interpreted using traditional graphical approaches.

A visualization capability is developed that allows an analysis of the above-mentioned data sets. With this technique, the data is viewed from a unique, global perspective. This format is well suited for analytical investigations as well as debugging during modeling and simulation. An illustrative example is provided in the context of a rectangular microstrip patch antenna. A comparison is performed between the visualized data and the theory of operation for the microstrip patch in order to demonstrate that radiation mechanisms can be obtained visually.

An additional analysis tool is developed using Gabor filters and image-processing techniques. This tool allows one to detect and filter electromagnetic waves propagating with different velocities (both speed and direction). By doing so, each mode of an antenna can be analyzed independently. The fields of a multi-moded, open-ended rectangular waveguide are analyzed in order to demonstrate the effectiveness of these techniques.
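As a rough illustration of the Gabor-filter analysis described above, the sketch below builds a 1-D Gabor kernel (a Gaussian-windowed cosine) and shows that it responds far more strongly to a wave at its tuned wavelength than to one at a different wavelength, which is the mechanism that lets each mode be analyzed independently. All parameters are invented for demonstration; the thesis applies 2-D filters to simulated field data.

```python
# Illustrative 1-D Gabor kernel and zero-lag correlation response.
import math

def gabor_kernel(size, wavelength, sigma):
    half = size // 2
    return [math.exp(-(x * x) / (2 * sigma * sigma)) *
            math.cos(2 * math.pi * x / wavelength)
            for x in range(-half, half + 1)]

def response(signal, kernel):
    # correlation at zero lag
    return sum(s * k for s, k in zip(signal, kernel))

k = gabor_kernel(size=21, wavelength=8.0, sigma=4.0)
matched = [math.cos(2 * math.pi * x / 8.0) for x in range(-10, 11)]
other = [math.cos(2 * math.pi * x / 3.0) for x in range(-10, 11)]
print(response(matched, k) > abs(response(other, k)))  # matched wave dominates
```

In 2-D, orienting the kernel in space-time additionally selects propagation direction and speed, not just wavelength.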
Master of Science

Citation styles: APA, Harvard, Vancouver, ISO, etc.
32

Koglin, Ryan W. "Efficient Image Processing Techniques for Enhanced Visualization of Brain Tumor Margins." University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1415835138.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
33

Lakshmanan, Kris. "Quantitative computer image processing of color particle markers in flow visualization /." The Ohio State University, 1986. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487265143146735.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
34

Shih, Po-Wen. "Efficient visualization of large datasets using parallel processing and visibility computation /." The Ohio State University, 1999. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488187763835562.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
35

Lilja, Johan. "[18F]Flutemetamol PET image processing, visualization and quantification targeting clinical routine." Doctoral thesis, Uppsala universitet, Radiologi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-317688.

Full text of the source
Abstract:
Alzheimer’s disease (AD) is the leading cause of dementia and is alone responsible for 60-70% of all cases of dementia. Though AD shares clinical symptoms with other types of dementia, its hallmarks are the abundance of extracellular depositions of β-amyloid (Aβ) plaques, intracellular neurofibrillary tangles of hyperphosphorylated tau proteins and synaptic depletion. The onset of the physiological hallmarks may precede clinical symptoms by a decade or more, and once clinical symptoms occur it may be difficult to separate AD from other types of dementia based on clinical symptoms alone. Since the introduction of radiolabeled Aβ tracer substances for positron emission tomography (PET) imaging it is possible to image the Aβ depositions in-vivo, strengthening the confidence in the diagnosis. Because the accumulation of Aβ may occur years before the first clinical symptoms are shown, and may even have reached a plateau by then, Aβ PET imaging may not be feasible for monitoring disease progression. However, a negative scan may be used to rule out AD as the underlying cause of the clinical symptoms. It may also be used as a predictor to evaluate the risk of developing AD in patients with mild cognitive impairment (MCI), as well as for monitoring potential effects of anti-amyloid drugs. Though currently validated for dichotomous visual assessment only, there is evidence to suggest that quantification of Aβ PET images may reduce inter-reader variability and aid in the monitoring of treatment effects from anti-amyloid drugs. The aim of this thesis was to refine existing methods and develop new ones for processing, quantification and visualization of Aβ PET images to aid in the diagnosis and monitoring of potential treatment of AD in clinical routine. Specifically, the focus of this thesis has been to find a way to fully automatically quantify and visualize a patient’s Aβ PET image in such a way that it is presented uniformly and shows how it relates to what is considered normal.
To achieve this aim, three components were developed and evaluated: registration algorithms that register a patient’s Aβ PET image to a common stereotactic space while avoiding the bias of different uptake patterns in Aβ- and Aβ+ images; a suitable region atlas; and a 3-dimensional stereotactic surface projections (3D SSP) method capable of projecting cortical activity onto the surface of a 3D model of the brain without sampling white matter. The material for development and testing comprised 724 individual amyloid PET brain images from six distinct cohorts, ranging from healthy volunteers to definite AD. The new methods could be implemented in a fully automated workflow and were found to be highly accurate when tested against Standards of Truth, such as regional uptake defined from PET images co-registered to magnetic resonance images, post-mortem histopathology, and the visual consensus diagnosis of imaging experts.
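The kind of regional quantification this abstract refers to is commonly summarised as a standardised uptake value ratio (SUVR): mean uptake over cortical regions divided by uptake in a reference region. The sketch below is a generic illustration with invented region names and values, not the thesis pipeline.

```python
# Hedged sketch of a composite SUVR: average cortical uptake normalised
# by a reference region. Region names and values are hypothetical.

def suvr(regional_uptake, reference_region):
    ref = regional_uptake[reference_region]
    cortical = [v for k, v in regional_uptake.items() if k != reference_region]
    return (sum(cortical) / len(cortical)) / ref

scan = {"frontal": 1.8, "parietal": 1.6, "temporal": 1.7, "cerebellum": 1.0}
print(round(suvr(scan, "cerebellum"), 2))  # composite SUVR
```

The hard part the thesis addresses, registration into a common stereotactic space without Aβ-status bias, happens before any such regional averaging is meaningful.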
Citation styles: APA, Harvard, Vancouver, ISO, etc.
36

Patro, Anilkumar G. "Pixel oriented visualization in XmdvTool." Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0907104-084847/.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
37

Lloyd, Natasha. "Clutter measurement and reduction for enhanced information visualization." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-011206-232808/.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
38

Huang, Xiaodi (xhuang@turing.une.edu.au). "Filtering, clustering and dynamic layout for graph visualization." Swinburne University of Technology, 2004. http://adt.lib.swin.edu.au./public/adt-VSWT20050428.111554.

Full text of the source
Abstract:
Graph visualization plays an increasingly important role in software engineering and information systems. Examples include UML, E-R diagrams, database structures, visual programming, web visualization, network protocols, molecular structures, genome diagrams, and social structures. Many classical algorithms for graph visualization have already been developed over the past decades. However, these algorithms face difficulties in practice, such as overlapping nodes, large graph layout, and dynamic graph layout. In order to solve these problems, this research aims to systematically address both algorithmic and approach issues related to a novel framework that describes the process of graph visualization applications. At the same time, all the proposed algorithms and approaches can be applied to other situations as well. First of all, a framework for graph visualization is described, along with a generic approach to the graphical representation of a relational information source. As important parts of this framework, two main approaches, Filtering and Clustering, are then investigated in detail to deal effectively with large graph layouts. In order to filter 'noise' or less important nodes in a given graph, two new methods are proposed: computing importance scores of nodes, called NodeRank, and then controlling the appearance of nodes in a layout by ranking them. Two novel algorithms for clustering graphs, KNN and SKM, are developed to reduce visual complexity. Identifying seed nodes as initial members of clusters, both algorithms make use of either the k-nearest neighbour search or a novel node similarity matrix to seek groups of nodes with the most affinities or similarities among them. Such groups of relatively highly connected nodes are then replaced with abstract nodes to form a coarse graph with reduced dimensions. An approach called MMD to the layout of clustered graphs is provided using a multiple-window, multiple-level display.
As for dynamic graph layout, a new approach to removing overlapping nodes, called the Force-Transfer algorithm, is developed that greatly improves on the classical Force-Scan algorithm. To demonstrate the performance of the proposed algorithms and approaches, the framework has been implemented in a prototype called PGD. A number of experiments as well as a case study have been carried out.
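The overlap-removal idea behind the Force-Scan/Force-Transfer family mentioned in this abstract can be illustrated with a minimal 1-D sketch: scan nodes in order and push each node right by exactly the amount it overlaps its predecessor. The real algorithms operate in 2-D on node rectangles; this reduction is for illustration only.

```python
# Illustrative 1-D overlap removal: nodes are (x, width) boxes sorted by x;
# each node is shifted right by its overlap with the previously placed node.

def remove_overlaps(nodes):
    """nodes: list of (x, width) sorted by x; returns new x positions."""
    placed = [nodes[0]]
    for x, w in nodes[1:]:
        px, pw = placed[-1]
        shift = max(0, (px + pw) - x)  # overlap with previous node
        placed.append((x + shift, w))
    return [x for x, _ in placed]

print(remove_overlaps([(0, 4), (2, 3), (9, 2)]))  # second node pushed right
```

A 2-D version must also decide whether to transfer each overlap horizontally or vertically, which is where the published algorithms differ.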
Citation styles: APA, Harvard, Vancouver, ISO, etc.
39

Kraemer, Eileen T. "A framework, tools, and methodology for the visualization of parallel and distributed systems." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/9214.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
40

San, Pedro Martín Javier de. "Structure discovery techniques for circuit design and process model visualization." Doctoral thesis, Universitat Politècnica de Catalunya, 2017. http://hdl.handle.net/10803/461196.

Full text of the source
Abstract:
Graphs are one of the most used abstractions in many knowledge fields because of the ease and flexibility with which graphs can represent relationships between objects. The pervasiveness of graphs in many disciplines means that huge amounts of data are available in graph form, allowing many opportunities for the extraction of useful structure from these graphs in order to produce insight into the data. In this thesis we introduce a series of techniques to resolve well-known challenges in the areas of digital circuit design and process mining. The underlying idea that ties all the approaches together is discovering structures in graphs. We show how many problems of practical importance in these areas can be solved utilizing both common and novel structure mining approaches. In the area of digital circuit design, this thesis proposes automatically discovering frequent, repetitive structures in a circuit netlist in order to improve the quality of physical planning. These structures can be used during floorplanning to produce regular designs, which are known to be highly efficient and economical. At the same time, detecting these repeating structures can exponentially reduce the total design time. The second focus of this thesis is in the area of the visualization of process models. Process mining is a recent area of research which centers on studying the behavior of real-life systems and their interactions with the environment. Complicated process models, however, hamper this goal. By discovering the important structures in these models, we propose a series of methods that can derive visualization-friendly process models with minimal loss in accuracy. In addition, and combining the areas of circuit design and process mining, this thesis opens the area of specification mining in asynchronous circuits. Instead of the usual design flow, which involves synthesizing circuits from specifications, our proposal discovers specifications from implemented circuits.
This area allows for many opportunities for verification and re-synthesis of asynchronous circuits. The proposed methods have been tested using real-life benchmarks, and the quality of the results compared to the state-of-the-art.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
41

Liao, Dezhi. "PHYSICALLY-BASED VISUALIZATION OF RESIDENTIAL BUILDING DAMAGE PROCESS IN HURRICANE." Doctoral diss., University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4185.

Full text of the source
Abstract:
This research provides realistic techniques to visualize the process of damage to residential buildings caused by hurricane-force winds. Three methods are implemented to make the visualization useful for educating the public about mitigation measures for their homes. First, the underlying physics uses Quick Collision Response Calculation, an iterative method which can tune the accuracy and the performance of calculating collision response between building components. Secondly, the damage process is designed as a Time-scalable Process. By attaching a damage time tag to each building component, the visualization process is treated as a geometry animation, allowing users to navigate within the visualization. The detached building components move in response to the wind force, which is calculated using qualitative rather than quantitative techniques. The results are acceptable for instructional systems but not for engineering analysis. Quick Damage Prediction is achieved by using a database query instead of a Monte-Carlo simulation. The database is based on HAZUS® engineering analysis data, which gives it validity. A reasoning mechanism based on the definition of overall building damage in HAZUS® is used to determine the damage state of selected building components, including roof cover, roof sheathing, walls, openings and roof-wall connections. Environmental aspects of the simulated environment, such as ocean, trees, cloud and rain, are integrated into a scene-graph based graphics engine. Based on the graphics engine and the physics engine, a procedural modeling method is used to efficiently render residential buildings. The resulting program, Hurricane!, is an instructional program for public education useful in schools and museum exhibits.
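The time-scalable damage process described in this abstract reduces, at its core, to a query over damage-time tags: rendering the scene at time t means selecting the components whose tag has already passed. Component names and times below are hypothetical, purely to illustrate the mechanism.

```python
# Illustrative sketch of damage-time tags: which building components are
# detached when the animation is scrubbed to time t (seconds, hypothetical).

damage_tags = {"roof_cover": 12.0, "roof_sheathing": 20.5,
               "window_1": 8.0, "wall_panel": 35.0}

def detached_at(t, tags):
    return sorted(name for name, t_dmg in tags.items() if t_dmg <= t)

print(detached_at(21.0, damage_tags))
```

Because the query is stateless in t, the user can scrub forwards or backwards freely, which is what makes the process "time-scalable".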
Ph.D.
Other
Sciences
Modeling and Simulation PhD
Citation styles: APA, Harvard, Vancouver, ISO, etc.
42

Effinger, Philip [Verfasser], and Michael [Akademischer Betreuer] Kaufmann. "Visualization of Business Process Models / Philip Effinger ; Betreuer: Michael Kaufmann." Tübingen : Universitätsbibliothek Tübingen, 2013. http://d-nb.info/1162844078/34.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
43

Kazantzi, Vasiliki. "Novel visualization and algebraic techniques for sustainable development through property integration." Texas A&M University, 2006. http://hdl.handle.net/1969.1/4930.

Full text of the source
Abstract:
The process industries are characterized by the significant consumption of fresh resources. This is a critical issue, which calls for an effective strategy towards more sustainable operations. One approach that favors sustainability and resource conservation is material recycle and/or reuse. In this regard, an integrated framework is an essential element in sustainable development. An effective reuse strategy must consider the process as a whole and develop plant-wide strategies. While the role of mass and energy integration has been acknowledged as a holistic basis for sustainable design, it is worth noting that there are many design problems that are driven by properties or functionalities of the streams and not by their chemical constituency. In this dissertation, the notion of componentless design, which was introduced by Shelley and El-Halwagi in 2000, was employed to identify optimal strategies for resource conservation, material substitution, and overall process integration. First, the focus was placed on the problem of identifying rigorous targets for material reuse in property-based applications by introducing a new property-based pinch analysis and visualization technique. Next, a non-iterative, property-based algebraic technique, which aims at determining rigorous targets of process performance in material-recycle networks, was developed. Further, a new property-based procedure for determining optimal process modifications on a property cluster diagram, to optimize the allocation of process resources and minimize waste discharge, was also discussed. In addition, material substitution strategies were considered for optimizing both the process and the fresh properties.
In this direction, a new process design and molecular synthesis methodology was developed by using the componentless property-cluster domain and Group Contribution Methods (GCM) as key tools in a generic framework and systematic approach to the problem of simultaneous process and molecular design.
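A core premise of the property integration framework this abstract builds on is that streams are tracked through property operators that mix linearly by flow fraction, psi(mix) = sum_i x_i * psi_i. The sketch below illustrates that mixing rule with invented flows and operator values; it is not the dissertation's clustering or targeting machinery.

```python
# Illustrative linear mixing rule for property operators:
# psi_mix = (sum of flow_i * psi_i) / total flow. Values are hypothetical.

def mixed_operator(flows, operators):
    total = sum(flows)
    return sum(f * p for f, p in zip(flows, operators)) / total

flows = [10.0, 30.0]          # stream flowrates
operators = [2.0, 6.0]        # property-operator values of each stream
print(mixed_operator(flows, operators))
```

Pinch-style targeting then works with these operator values (or clusters derived from them) instead of component compositions.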
Citation styles: APA, Harvard, Vancouver, ISO, etc.
44

Henry, Sam. "Indirect Relatedness, Evaluation, and Visualization for Literature Based Discovery." VCU Scholars Compass, 2019. https://scholarscompass.vcu.edu/etd/5855.

Full text of the source
Abstract:
The exponential growth of scientific literature is creating an increased need for systems to process and assimilate knowledge contained within text. Literature Based Discovery (LBD) is a well established field that seeks to synthesize new knowledge from existing literature, but it has remained primarily in the theoretical realm rather than in real-world application. This lack of real-world adoption is due in part to the difficulty of LBD, but also due to several solvable problems present in LBD today. Of these problems, the ones in most critical need of improvement are: (1) the over-generation of knowledge by LBD systems, (2) a lack of meaningful evaluation standards, and (3) the difficulty interpreting LBD output. We address each of these problems by: (1) developing indirect relatedness measures for ranking and filtering LBD hypotheses; (2) developing a representative evaluation dataset and applying meaningful evaluation methods to individual components of LBD; (3) developing an interactive visualization system that allows a user to explore LBD output in its entirety. In addressing these problems, we make several contributions, most importantly: (1) state of the art results for estimating direct semantic relatedness, (2) development of set association measures, (3) development of indirect association measures, (4) development of a standard LBD evaluation dataset, (5) division of LBD into discrete components with well defined evaluation methods, (6) development of automatic functional group discovery, and (7) integration of indirect relatedness measures and automatic functional group discovery into a comprehensive LBD visualization system. Our results inform future development of LBD systems, and contribute to creating more effective LBD systems.
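The indirect-relatedness idea at the heart of LBD is often introduced through Swanson's A-B-C pattern: if A co-occurs with B, and B with C, but A never directly with C, then A-C is a candidate hypothesis. The sketch below scores candidates by counting shared linking terms, using the classic fish-oil/Raynaud's example as toy data; the dissertation develops far more refined indirect relatedness measures than this count.

```python
# Illustrative A-B-C discovery: score each indirect term C by the number of
# linking B-terms shared with A. Co-occurrence data here is a toy example.

def candidate_links(cooccur, a):
    direct = cooccur[a]
    scores = {}
    for b in direct:
        for c in cooccur.get(b, ()):
            if c != a and c not in direct:
                scores[c] = scores.get(c, 0) + 1
    return scores

cooccur = {"fish_oil": {"blood_viscosity", "platelet_aggregation"},
           "blood_viscosity": {"raynauds"},
           "platelet_aggregation": {"raynauds"}}
print(candidate_links(cooccur, "fish_oil"))  # raynauds linked via two B-terms
```

Ranking and filtering these over-generated candidates, rather than enumerating them, is exactly the problem the first contribution above addresses.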
Citation styles: APA, Harvard, Vancouver, ISO, etc.
45

Tang, Xianmin. "Low damage processing and process characterization." W&M ScholarWorks, 2000. https://scholarworks.wm.edu/etd/1539623979.

Full text of the source
Abstract:
Two novel plasma sources (one neutral source and one pulsed inductively coupled plasma source) and ashing process characterization were investigated. The primary goal was to characterize these source properties and develop corresponding applications. The study includes process damage assessment with these two sources and another continuous wave (13.56 MHz) plasma source. A global average simulation of the pulsed discharges was also included. The transient plasma density and electron temperature from the double probe analysis were compared with single Langmuir probe results with sheath displacement corrections in pulsed discharges (200 Hz--10 kHz). The equivalent resistance method can be used effectively to analyze these double probe data. The transient behaviors of the plasma density and electron temperature are in accord with the model of the discharge. The hyper-thermal neutral source based on surface reflection neutralization techniques was shown to provide enough fast neutrals for ashing applications. The surface roughness of the post-cleaned wafer was less than 10 Å. Ex-situ and in-situ measurements yield typical removal rates of about 10 Å/s without stream collimation. The removal rates at increasing pressures show a trade-off between creating a higher-density plasma, which leads to a large initial neutral flux, and the attenuation of neutrals due to collisions. Changing the reflector plate changes the neutral energy without changing the discharge composition. A novel technique combining momentum and heat flux measurements shows that the neutral stream energy is 3--6 eV and the neutral flux is on the order of 10^15 cm^-2 s^-1. The etch rates derived from the measured neutral flux and energy values are in good agreement with the experimental rates. Quasi-static capacitance-voltage measurements demonstrate that the low-energy neutral source induces much less damage than other plasma sources.
Most of the neutral process damage is caused by UV photons escaping from the plasma source zone. The process-induced damage varies with the reflector bias and rf power.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
46

Brun, Anders. "Manifolds in Image Science and Visualization." Doctoral thesis, Linköping : Department of Biomedical Engineering, Linköpings universitet, 2007. http://www.bibl.liu.se/liupubl/disp/disp2008/tek1157s.pdf.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
47

Carter, Caleb. "High Resolution Visualization of Large Scientific Data Sets Using Tiled Display." Fogler Library, University of Maine, 2007. http://www.library.umaine.edu/theses/pdf/CarterC2007.pdf.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
48

Cidambi, Indu. "Digital signal processing and micromorphological visualization of crystal lattices of thin films." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1996. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/MQ30894.pdf.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
49

Antle, Alissa N. "Interactive visualization tools for spatial data & metadata." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0010/NQ56495.pdf.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
50

Choi, Yi-king, and 蔡綺瓊. "Computer visualization techniques in surgical planning for pedicle screw insertion." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31224234.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.