Dissertations / Theses on the topic 'Process control Dynamics Data processing'

Consult the top 37 dissertations / theses for your research on the topic 'Process control Dynamics Data processing.'


1

Alici, Semra. "Dynamic data reconciliation using process simulation software and model identification tools." Access restricted to users with UT Austin EID. Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3025133.

2

Morgan, Clifford Owen. "Development of computer aided analysis and design software for studying dynamic process operability." Thesis, Georgia Institute of Technology, 1986. http://hdl.handle.net/1853/10187.

3

Sabri, Dina O. "Process control using an optomux control board." Virtual Press, 1987. http://liblink.bsu.edu/uhtbin/catkey/484759.

Abstract:
In this thesis, process control concepts were used to develop software that could be adapted to a real-world situation. The software was used to control a simple temperature-regulating experiment, which demonstrated the use of OPTOMUX analog and digital input/output devices in controlling a process. The goal of the experiment was to use the input/output devices to hold the temperature of the box within specified tolerances for a designated period of time. To make optimal use of the equipment and achieve optimal control, a mathematical model was derived to predict the behavior of the process under control. The pattern observed while the temperature was increasing toward room temperature closely resembled an exponential function; for temperatures above room temperature, the curve approximated a square root function. The pattern followed when decreasing the temperature was exponential throughout. The time required to collect all the significant data was two hours when increasing the temperature and one hour when decreasing it; beyond these limits the temperature remained essentially constant. The maximum temperature that could be reached was six degrees above room temperature, and the minimum two degrees below room temperature.
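The heating and cooling patterns described in this abstract are the signature of a first-order thermal process. A minimal sketch in Python, with a made-up ambient temperature and time constant (the thesis's actual coefficients are not given here):

```python
import math

def first_order_response(t, t_env, t_start, tau):
    """Exponential approach to the ambient temperature t_env, the curve
    shape the abstract reports for the heating and cooling experiments."""
    return t_env + (t_start - t_env) * math.exp(-t / tau)

# Illustrative numbers only: the thesis reports a range of +6/-2 degrees
# around room temperature and settling times of one to two hours.
room = 22.0                # assumed room temperature, degrees C
t_start = room + 6.0       # box heated six degrees above ambient
tau = 30.0                 # assumed time constant, minutes

# Cooling back toward room temperature over two hours.
temps = [first_order_response(t, room, t_start, tau) for t in (0, 30, 60, 120)]
```

After four assumed time constants (two hours here) the remaining offset is a few percent of the initial six degrees, consistent with the observation that the temperature is essentially constant beyond the stated time limits.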
4

Tian, Yu-Chu. "Dynamics analysis and integrated design of real-time control systems." School of Electrical and Information Engineering, 2009. http://hdl.handle.net/2123/5743.

Abstract:
Doctor of Philosophy (PhD)
Real-time control systems are widely deployed in many applications. Theory and practice for the design and deployment of real-time control systems have evolved significantly. From the design perspective, control strategy development has been the focus of the research in the control community. In order to develop good control strategies, process modelling and analysis have been investigated for decades, and stability analysis and model-based control have been heavily studied in the literature. From the implementation perspective, real-time control systems require timeliness and predictable timing behaviour in addition to logical correctness, and a real-time control system may behave very differently with different software implementations of the control strategies on a digital controller, which typically has limited computing resources. Most current research activities on software implementations concentrate on various scheduling methodologies to ensure the schedulability of multiple control tasks in constrained environments. Recently, more and more real-time control systems are implemented over data networks, leading to increasing interest worldwide in the design and implementation of networked control systems (NCS). Major research activities in NCS include control-oriented and scheduling-oriented investigations. In spite of significant progress in the research and development of real-time control systems, major difficulties exist in the state of the art. A key issue is the lack of integrated design for control development and its software implementation. For control design, the model-based control technique, the current focus of control research, does not work when a good process model is not available or is too complicated for control design. 
For control implementation on digital controllers running multiple tasks, the system schedulability is essential but is not enough; the ultimate objective of satisfactory quality-of-control (QoC) performance has not been addressed directly. For networked control, the majority of the control-oriented investigations are based on two unrealistic assumptions about the network induced delay. The scheduling-oriented research focuses on schedulability and does not directly link to the overall QoC of the system. General solutions with direct QoC consideration from the network perspective to the challenging problems of network delay and packet dropout in NCS have not been found in the literature. This thesis addresses the design and implementation of real-time control systems with regard to dynamics analysis and integrated design. Three related areas have been investigated, namely control development for controllers, control implementation and scheduling on controllers, and real-time control in networked environments. Seven research problems are identified from these areas for investigation in this thesis, and accordingly seven major contributions have been claimed. Timing behaviour, quality of control, and integrated design for real-time control systems are highlighted throughout this thesis. In control design, a model-free control technique, pattern predictive control, is developed for complex reactive distillation processes. Alleviating the requirement of accurate process models, the developed control technique integrates pattern recognition, fuzzy logic, non-linear transformation, and predictive control into a unified framework to solve complex problems. Characterising the QoC indirectly with control latency and jitter, scheduling strategies for multiple control tasks are proposed to minimise the latency and/or jitter. 
Also, a hierarchical, QoC driven, and event-triggering feedback scheduling architecture is developed with plug-ins of either the earliest-deadline-first or fixed priority scheduling. Linking to the QoC directly, the architecture minimises the use of computing resources without sacrifice of the system QoC. It considers the control requirements, but does not rely on the control design. For real-time NCS, the dynamics of the network delay are analysed first, and the nonuniform distribution and multi-fractal nature of the delay are revealed. These results do not support two fundamental assumptions used in existing NCS literature. Then, considering the control requirements, solutions are provided to the challenging NCS problems from the network perspective. To compensate for the network delay, a real-time queuing protocol is developed to smooth out the time-varying delay and thus to achieve more predictable behaviour of packet transmissions. For control packet dropout, simple yet effective compensators are proposed. Finally, combining the queuing protocol, the packet loss compensation, the configuration of the worst-case communication delay, and the control design, an integrated design framework is developed for real-time NCS. With this framework, the network delay is limited to within a single control period, leading to simplified system analysis and improved QoC.
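The delay-compensation idea, smoothing a time-varying network delay into a predictable one, can be sketched as a playout buffer that releases every packet exactly at its worst-case deadline. This is an illustrative simplification with invented numbers, not the queuing protocol developed in the thesis:

```python
def smooth_delays(packets, worst_case):
    """Release each packet exactly `worst_case` time units after it was
    sent, so the controller sees a constant delay instead of jitter.
    Packets arriving after their deadline are treated as lost (None)."""
    releases = []
    for sent_at, network_delay in packets:
        arrived_at = sent_at + network_delay
        deadline = sent_at + worst_case
        releases.append(deadline if arrived_at <= deadline else None)
    return releases

# Control packets sent every 10 ms with jittery network delays (ms).
packets = [(0, 2.5), (10, 7.9), (20, 4.1), (30, 9.6)]
released = smooth_delays(packets, worst_case=8.0)
```

Every delivered packet now reaches the controller with the same effective delay (8 ms here), at the cost of discarding the late fourth packet, which a dropout compensator would then have to cover.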
5

Tian, Yu-Chu. "Dynamics analysis and integrated design of real-time control systems." Connect to full text, 2008. http://ses.library.usyd.edu.au/handle/2123/5743.

Abstract:
Thesis (Ph. D.)--University of Sydney, 2009.
Title from title screen (viewed November 30, 2009). Submitted in fulfilment of the requirements for the degree of Doctor of Philosophy to the School of Electrical and Information Engineering in the Faculty of Engineering & Information Technologies. Degree awarded 2009; thesis submitted 2008. Includes bibliographical references. Also available in print form.
6

Koeppen, Kyle Bruce. "Virtual access hydraulics experiment for system dynamics and control education." Thesis, Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/15906.

7

黎浩然 and Ho-yin Albert Lai. "Artificial intelligence based thermal comfort control with CFD modelling." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B3122278X.

8

Lakshmanan, Nithya M. "Estimation and control of nonlinear batch processes using multiple linear models." Thesis, Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/11835.

9

Jones, Patricia Marie. "Constructing and validating a model-based operator's associate for supervisory control." Thesis, Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/24274.

10

Davenport, George Andrew 1965. "A process control system for biomass liquefaction." Thesis, The University of Arizona, 1989. http://hdl.handle.net/10150/558114.

11

Singh, Rahul. "A model to integrate Data Mining and On-line Analytical Processing: with application to Real Time Process Control." VCU Scholars Compass, 1999. https://scholarscompass.vcu.edu/etd/5521.

Abstract:
Since the widespread use of computers in business and industry, much research has been done on the design of computer systems to support decision making. Decision support systems help decision makers solve unstructured decision problems by providing tools to understand and analyze those problems. Artificial intelligence is concerned with creating computer systems that perform tasks that would require intelligence if performed by humans, and much research has focused on using artificial intelligence to provide intelligent decision support. Knowledge discovery from databases centers on data mining algorithms that discover novel and potentially useful information in the large volumes of data that are ubiquitous in contemporary business organizations. Data mining deals with large volumes of data and tries to develop multiple views that the decision maker can use to study this multi-dimensional data. On-line analytical processing (OLAP) provides a mechanism that supports multiple views of multi-dimensional data to facilitate efficient analysis. Together, these two techniques provide a powerful mechanism for analyzing large quantities of data to aid decision making. This research develops a model for the real-time process control of a large manufacturing process using an integrated approach of data mining and on-line analytical processing. Data mining is used to develop models of the process based on the large volumes of process data. The purpose is to provide prediction and explanatory capability based on the models of the data and to allow efficient generation of multiple views of the data, so as to support analysis on multiple levels.
Artificial neural networks provide a mechanism for predicting the behavior of nonlinear systems, while decision trees provide a mechanism for explaining the states of a system given a set of inputs and outputs. OLAP is used to generate multidimensional views of the data and to support analysis based on models developed by data mining. The architecture and implementation of the model for real-time process control based on the integration of data mining and OLAP are presented in detail. The model is validated using actual process data, by comparing results obtained from the integrated system, an OLAP-only system, and expert opinion; the results of this verification are presented, together with a discussion of the validation, some limitations of this research, and possible future research directions.
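The OLAP side of the integration, generating multiple aggregate views of multi-dimensional process data, can be sketched with a toy roll-up. The records, dimensions, and measure below are invented for illustration; the thesis's actual schema and system are not reproduced:

```python
from collections import defaultdict

# Toy process records: (machine, shift, yield_pct).
records = [
    ("M1", "day", 96.0),
    ("M1", "night", 88.5),
    ("M2", "day", 94.2),
    ("M2", "night", 95.1),
]

def rollup(records, dims):
    """OLAP-style roll-up: average yield grouped by the chosen dimensions,
    one of the 'multiple views' a decision maker can request."""
    index = {"machine": 0, "shift": 1}
    groups = defaultdict(list)
    for rec in records:
        key = tuple(rec[index[d]] for d in dims)
        groups[key].append(rec[2])
    return {k: sum(v) / len(v) for k, v in groups.items()}

by_shift = rollup(records, ["shift"])      # view across shifts
by_machine = rollup(records, ["machine"])  # view across machines
```

The same records answer different analysis questions depending on the grouping dimensions, which is the mechanism a data-mining model can then explain or predict against.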
12

Storoshchuk, Orest Lev; Poehlman, William Frederick Skipper. "Model based synchronization of monitoring and control systems." McMaster only, 2003.

13

Palix, Nicolas, Julia L. Lawall, Gaël Thomas, and Gilles Muller. "How Often do Experts Make Mistakes?" Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4132/.

Abstract:
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such software projects furthermore include many internal APIs that developers must understand and use properly. According to the intended purpose of these APIs, they are more or less frequently used, and used by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as the use of these has been shown to be highly error prone in previous work. We study defect rates and developer expertise, to consider e.g., whether widely used APIs are more defect prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
14

Gogyan, Anahit. "Generation and interfacing of single-photon light with matter and control of ultrafast atomic dynamics for quantum information processing." Phd thesis, Université de Bourgogne, 2010. http://tel.archives-ouvertes.fr/tel-00534488.

Abstract:
We develop a robust and realistic mechanism for the generation of indistinguishable single-photon (SP) pulses with identical frequency and polarization. They are produced on demand from a coupled double-Raman atom-cavity system driven by a sequence of laser pump pulses. This scheme features high efficiency, the ability to produce a sequence of narrow-band SP pulses with a delay determined only by the pump repetition rate, and the simplicity of a system free from complications such as repumping and environmental dephasing. We propose and analyze a simple scheme of parametric frequency conversion for optical quantum information in cold atomic ensembles. Its remarkable properties are minimal losses and distortion of the pulse shape, and the persistence of quantum coherence and entanglement. Efficient conversion of frequency between different spectral regions is shown, and a method for the generation of frequency-entangled single-photon states is discussed. We suggest a robust and simple mechanism for the coherent excitation of molecules or atoms into a superposition of pre-selected states by a train of femtosecond laser pulses combined with a narrow-band coupling field. The theory of quantum beating in the generation of ultraviolet radiation via four-wave mixing in pump-probe experiments is developed; the results are in good agreement with experimental data observed in Rb vapor when the laser phase fluctuations are significant.
15

Fan, Yang, Hidehiko Masuhara, Tomoyuki Aotani, Flemming Nielson, and Hanne Riis Nielson. "AspectKE*: Security aspects with program analysis for distributed systems." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4136/.

Abstract:
Enforcing security policies in distributed systems is difficult, particularly when a system contains untrusted components. We designed AspectKE*, a distributed AOP language based on a tuple space, to tackle this issue. In AspectKE*, aspects can enforce access control policies that depend on the future behavior of running processes. One of the key language features is the predicates and functions that extract results of static program analysis, which are useful for defining security aspects that must know about a program's future behavior. AspectKE* also provides a novel variable-binding mechanism for pointcuts, so that pointcuts can uniformly specify join points based on both static and dynamic information about the program. Our implementation strategy performs the fundamental static analysis at load time so as to keep runtime overhead minimal. We implemented a compiler for AspectKE* and demonstrate its usefulness through a security aspect for a distributed chat system.
16

Austin, Andrew. "Process Capability in a Computer Integrated Manufacturing Cell." TopSCHOLAR®, 2014. http://digitalcommons.wku.edu/theses/1322.

Abstract:
With the rise of automation in traditional manufacturing processes, more companies are integrating computer integrated manufacturing (CIM) cells on their production floors. Through CIM cell integration, companies can reduce process time and increase production. One problem created by CIM cell automation is the dependency of sequential steps on one another: variation introduced by a previous step increases the probability of a process error downstream. One way to eliminate this dependency is an in-process measuring device, such as a Renishaw spindle probe used in conjunction with a computer numerical control (CNC) milling machine. Western Kentucky University (WKU) utilizes a CIM cell in the Senator Mitch McConnell Advanced Manufacturing and Robotics laboratory. The laboratory is located in the Architectural and Manufacturing Sciences department and gives students the opportunity to learn how automated systems can be integrated. The CIM cell consists of three Mitsubishi six-axis robots, a Haas Mini Mill, a Haas GT-10 lathe, an AXYZ, Inc. CNC router table, a 120-watt laser engraver, an Automated Storage and Retrieval System (ASRS), a material handling conveyor, and a vision station. The CIM cell functions throughout the curriculum as a means for applied learning and research. The researcher used this CIM cell to determine whether an in-process measuring device, such as the Renishaw spindle probe, could improve process capability by eliminating compounding variation.
The researcher discovered that, through the use of a Renishaw 40-2 spindle probe used in conjunction with a CNC Haas Mini Mill, process capability has the potential to be improved in a CIM cell by accounting for compounding variation present in the process.
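Process capability itself is a standard SPC calculation; a short sketch of the Cpk index the study turns on. The part dimensions and specification limits below are hypothetical, not measurements from the WKU cell:

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index Cpk: the distance from the process mean
    to the nearer specification limit, in units of three standard
    deviations. Cpk >= 1.33 is a common benchmark for a capable process."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical milled-feature diameters (mm) against a 10.00 +/- 0.05 spec.
diameters = [10.01, 9.99, 10.02, 10.00, 9.98, 10.01, 10.00, 9.99]
capability = round(cpk(diameters, lsl=9.95, usl=10.05), 2)
```

In-process probing aims to pull the process mean back on target between operations, which raises the min() term and therefore Cpk.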
17

Rezaie, Reza. "Gaussian Conditionally Markov Sequences: Theory with Application." ScholarWorks@UNO, 2019. https://scholarworks.uno.edu/td/2679.

Abstract:
Markov processes have been widely studied and used for modeling problems. A Markov process has two main components (i.e., an evolution law and an initial distribution). Markov processes are not suitable for modeling some problems, for example, the problem of predicting a trajectory with a known destination. Such a problem has three main components: an origin, an evolution law, and a destination. The conditionally Markov (CM) process is a powerful mathematical tool for generalizing the Markov process. One class of CM processes, called $CM_L$, fits the above components of trajectories with a destination. The CM process combines the Markov property and conditioning. The CM process has various classes that are more general and powerful than the Markov process, are useful for modeling various problems, and possess many Markov-like attractive properties. Reciprocal processes were introduced in connection to a problem in quantum mechanics and have been studied for years. But the existing viewpoint for studying reciprocal processes is not revealing and may lead to complicated results which are not necessarily easy to apply. We define and study various classes of Gaussian CM sequences, obtain their models and characterizations, study their relationships, demonstrate their applications, and provide general guidelines for applying Gaussian CM sequences. We develop various results about Gaussian CM sequences to provide a foundation and tools for general application of Gaussian CM sequences including trajectory modeling and prediction. We initiate the CM viewpoint to study reciprocal processes, demonstrate its significance, obtain simple and easy to apply results for Gaussian reciprocal sequences, and recommend studying reciprocal processes from the CM viewpoint. For example, we present a relationship between CM and reciprocal processes that provides a foundation for studying reciprocal processes from the CM viewpoint. 
Then, we obtain a model for nonsingular Gaussian reciprocal sequences with white dynamic noise, which is easy to apply. Also, this model is extended to the case of singular sequences and its application is demonstrated. A model for singular sequences has not been possible for years based on the existing viewpoint for studying reciprocal processes. This demonstrates the significance of studying reciprocal processes from the CM viewpoint.
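The simplest concrete example of a destination-conditioned Gaussian sequence is a discrete Brownian bridge, a classic reciprocal (and hence conditionally Markov) process. The sketch below is a generic illustration of the origin/evolution/destination structure, not a model taken from the dissertation:

```python
import random

def brownian_bridge(n, x0, xn, sigma=1.0, seed=0):
    """Sample a Gaussian sequence of length n + 1 pinned at an origin x0
    and a destination xn. Sequentially, x_k depends on x_{k-1} and on the
    endpoint xn: the Markov property holds conditioned on the destination."""
    rng = random.Random(seed)
    x = [x0]
    for k in range(1, n):
        remaining = n - k
        # Conditional mean drifts toward the destination; the conditional
        # variance shrinks as the destination gets closer.
        mean = x[-1] + (xn - x[-1]) / (remaining + 1)
        var = sigma ** 2 * remaining / (remaining + 1)
        x.append(rng.gauss(mean, var ** 0.5))
    x.append(xn)
    return x

path = brownian_bridge(8, 0.0, 10.0, seed=1)
```

This is the "trajectory with a known destination" setting in miniature: an ordinary Markov random walk would ignore xn, while the bridge steers every intermediate sample toward it.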
18

Amarilla, Rosemara Santos Deniz. "Identificação e análise dos processos de negócio de empresas de pequeno porte do setor da construção civil." Universidade Tecnológica Federal do Paraná, 2013. http://repositorio.utfpr.edu.br/jspui/handle/1/586.

Abstract:
Capes
This study aims to identify and comparatively analyze the main business processes of small companies in the buildings subsector of the civil construction industry. A qualitative research method was used, with multiple case studies as the main technique. Five companies located in the city of Curitiba, Paraná, participated in the study. Data were collected through semi-structured interviews, document analysis, and direct observation. From the information obtained, the business processes of each company were mapped graphically using BPMN notation. The study showed that the processes and activities of companies in this sector share common characteristics, thus facilitating the standardization of best practices. Further case studies should reveal additional common aspects that could support the elaboration of a reference model offering specific guidance on how business processes can be managed in organizations of the buildings subsector.
19

Andrade, Elzimar. "Gerenciamento de processos para melhoria da eficiência na administração pública: estrutura de referência para a UTFPR." Universidade Tecnológica Federal do Paraná, 2017. http://repositorio.utfpr.edu.br/jspui/handle/1/2565.

Abstract:
This research proposes a reference structure for implementing Business Process Management (BPM) suited to the particularities of a public higher-education institution with the characteristics of the Federal University of Technology – Paraná (UTFPR). Public and private organizations show growing interest in managing their processes to obtain more efficient and effective results; in the public sector, BPM has been used to make processes less bureaucratic and more agile, with better use of resources. The public sector has characteristics distinct from those of the private sector, demanding special attention to the limitations encountered when adopting approaches that, although validated in the business environment, still require deeper study when applied to government organizations. This is applied research conducted in two stages, using questionnaires as the data-collection instrument. The first stage targeted the 62 Federal Institutions of Higher Education (IFES) linked to the Ministry of Education (MEC), seeking an overview of process management in the field; the second stage targeted UTFPR itself, seeking to identify the factors that induce or restrict BPM implementation and the characteristics of the institution. In addition, a bibliometric survey was conducted to identify publications on BPM in the public sector as a reference.
From the results of these research stages and a conceptual BPM model based on the approach of the Association of Business Process Management Professionals (ABPMP Brasil) and on Burlton's (2001) model, the reference structure for implementing BPM at UTFPR was established, considering, besides the institution's specificities, the available resources, Information Technology (IT) infrastructure, organizational culture, and people, among other aspects. The research confirmed that, despite the restrictions to which public organizations are subject, BPM is a form of process management applicable to an organization with the characteristics of UTFPR, and that there is a form of implementation that accommodates these specificities in pursuit of improved efficiency. As limitations, the survey of Brazilian IFES could not obtain data on the impact of BPM on the results and performance of the surveyed institutions, since the vast majority are still in the implementation phase, nor could it assess the real impact of the process offices in the IFES that adopted this structure. In conclusion, besides the reference structure proposed for UTFPR, the work contributes to the study area of Planning and Public Policies.
20

Urban, Marek. "Návrh zavěšení kol Formule Student." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2020. http://www.nusl.cz/ntk/nusl-417557.

Abstract:
This thesis deals with the design of the suspension kinematics of both axles. Based on analyses of driving data, multi-body simulations in Adams Car, simulations in Matlab, and analytical calculations in Mathcad, a number of changes are proposed to improve the handling of a Formula Student car; these changes are then implemented in the CAD model of the car. Each change to the axle kinematics is based on an analysis of the specific problem it aims to solve. One issue is the packaging of the rear suspension and spring-damper system, where the goal is to reduce mass, the height of the centre of gravity, and the moment of inertia. Another is the front-wheel geometry, where the goal is to improve tyre utilization and reduce steering forces. The thesis also covers elastokinematic simulations of the rear axle, including the design of a measuring rig. In the final part, the influence of the implemented changes and of the elastokinematics on the steady-state vehicle dynamics is examined using the MM method, simulated with a full-vehicle model in Adams Car and post-processed in Matlab.
21

Vestin, Albin, and Gustav Strandberg. "Evaluation of Target Tracking Using Multiple Sensors and Non-Causal Algorithms." Thesis, Linköpings universitet, Reglerteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-160020.

Full text
Abstract:
Today, the main research field for the automotive industry is to find solutions for active safety. In order to perceive the surrounding environment, tracking nearby traffic objects plays an important role. Validation of the tracking performance is often done in staged traffic scenarios, where additional sensors, mounted on the vehicles, are used to obtain their true positions and velocities. The difficulty of evaluating the tracking performance complicates its development. An alternative approach studied in this thesis, is to record sequences and use non-causal algorithms, such as smoothing, instead of filtering to estimate the true target states. With this method, validation data for online, causal, target tracking algorithms can be obtained for all traffic scenarios without the need of extra sensors. We investigate how non-causal algorithms affect the target tracking performance using multiple sensors and dynamic models of different complexity. This is done to evaluate real-time methods against estimates obtained from non-causal filtering. Two different measurement units, a monocular camera and a LIDAR sensor, and two dynamic models are evaluated and compared using both causal and non-causal methods. The system is tested in two single object scenarios where ground truth is available and in three multi object scenarios without ground truth. Results from the two single object scenarios shows that tracking using only a monocular camera performs poorly since it is unable to measure the distance to objects. Here, a complementary LIDAR sensor improves the tracking performance significantly. The dynamic models are shown to have a small impact on the tracking performance, while the non-causal application gives a distinct improvement when tracking objects at large distances. Since the sequence can be reversed, the non-causal estimates are propagated from more certain states when the target is closer to the ego vehicle.
For multiple object tracking, we find that correct associations between measurements and tracks are crucial for improving the tracking performance with non-causal algorithms.
APA, Harvard, Vancouver, ISO, and other styles
22

Scarlato, Michele. "Sicurezza di rete, analisi del traffico e monitoraggio." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/3223/.

Full text
Abstract:
The work was divided into three macro-areas. The first concerns a theoretical analysis of how intrusions work, of which software tools are used to carry them out, and of how to protect against them (using the devices generically known as firewalls). The second macro-area analyses an intrusion from outside against sensitive servers on a LAN. This analysis is carried out on files captured by the two network interfaces configured in promiscuous mode on a probe inside the LAN. Two interfaces are used in order to connect to two LAN segments with different subnet masks. The attack is analysed with various software tools. A third part of the work can in fact be identified: the part where the files captured by the two interfaces are analysed, first with software for full-content data, such as Wireshark, then with software for session data, processed with Argus, and finally with statistical data processed with Ntop. The penultimate chapter, before the conclusions, covers the installation of Nagios and its configuration for monitoring, via plugins, the remaining disk space on a remote agent machine and the MySQL and DNS services. Naturally, Nagios can be configured to monitor any type of service offered on the network.
APA, Harvard, Vancouver, ISO, and other styles
23

Ramlal, Jasmeer. "Model predictive control of hybrid systems." Thesis, 2002. http://hdl.handle.net/10413/4808.

Full text
Abstract:
Hybrid systems combine the continuous behavior evolution specified by differential equations with discontinuous changes specified by discrete event logic. In the processing industry, these systems can usually be identified by their dependence on discrete decisions regarding their operation. There is therefore a challenge in process control to automate these decisions. A model predictive control (MPC) strategy was proposed and verified for the control of hybrid systems. More specifically, the dynamic matrix control (DMC) framework commonly used in industry for the control of continuous variables was modified to deal with mixed integer variables, which are necessary for the modelling and control of hybrid systems. The algorithm was designed and commissioned in a closed control loop comprising a SCADA system and an optimiser (GAMS). GAMS (General Algebraic Modelling System) is an optimisation package that is able to solve for integer/continuous variables given a model of the system and an appropriate objective function. Online and offline closed loop tests were undertaken on a benchmark interacting tank system and a heating/cooling circuit. The algorithm was also applied to an industrial problem requiring the optimal sequencing of coal locks in real time. To complete the research concerning controller design for hybrid behavior, an investigation was undertaken regarding systems that have different modes of operation due to physicochemical (inherent) discontinuities, e.g. a tank with a discontinuous cross-sectional area, fitted with an overflow. The findings from the online tests and offline simulations reveal that the proposed algorithm, with some system-specific modification, was able to control each of the four hybrid systems under investigation.
Depending on which hybrid system was being controlled, the mixed integer predictive controller (MIPC), obtained by modifying the DMC algorithm to include integer variables, was employed to initiate selections and switchings and to determine sequences. Control of the interacting tank system focused on the optimal selection of operating positions for process inputs. The algorithm was shown to retain the usual features of DMC (i.e. tuning and dealing with multivariable interaction). For a system with multiple modes of operation, i.e. the heating/cooling circuit, the algorithm was able to switch the mode of operation in order to meet operating objectives. The MPC strategy was used to good effect in having the algorithm sequence the operation of several coal locks. In this instance, the controller maintained system variables within certain operating constraints. Furthermore, soft constraints were proposed and used to promote operation close to operating constraints without the danger of computational failure due to constraint violations. For systems with inherent discontinuities, an MPC strategy was proposed that predicted trajectories which crossed discontinuities. Convolution models were found to be inappropriate in this instance, and state space equations describing the dynamics of the system were used instead.
Thesis (M.Sc.Eng.)-University of Natal, Durban, 2002.
APA, Harvard, Vancouver, ISO, and other styles
24

Zhang, Yang 1980. "Improved methods in statistical and first principles modeling for batch process control and monitoring." 2008. http://hdl.handle.net/2152/17920.

Full text
Abstract:
This dissertation presents several methods for improving statistical and first principles modeling capabilities, with an emphasis on nonlinear, unsteady state batch processes. Batch process online monitoring is chosen as a main research area here due to its importance from both theoretical and practical points of view. Theoretical background and recent developments of PCA/PLS-based online monitoring methodologies are reviewed, along with fault detection metrics and algorithm variations for different applications. The available commercial software packages are also evaluated based on their corresponding application areas. A detailed Multiway PCA based batch online monitoring procedure is used as the starting point for further improvements. The issue of dynamic batch profile synchronization is addressed. By converting synchronization into a dynamic optimization problem, Dynamic Time Warping (DTW) and Derivative DTW (DDTW) show the best performance by far. To deal with the singularity point and numerical derivative estimation problems of DTW and DDTW in the presence of noise, a robust DDTW algorithm is proposed by combining a Savitzky-Golay filter with the DDTW algorithm. A comparative analysis of robust DDTW and available methods is performed on simulated and real chemical plant data. As traditional Multiway PCA-based (MPCA) methods consider batch monitoring in a static fashion (failing to consider time dependencies between and within process variables over time), an EWMA filtered Hybrid-wise unfolding MPCA (E-HMPCA) is proposed that considers batch dynamics in the model and reduces the number of Type I and II errors in online monitoring. Chemical and biochemical batch examples are used to compare the E-HMPCA algorithm with traditional methods. First principles model development is known to be time consuming. In order to increase modeling efficiency, dynamic Design of Experiments (DOE) is introduced for Differential Algebraic Equation (DAE) system parameter estimation.
A new criterion is proposed by combining PCA and parameter sensitivity analysis (P-optimal criterion). The new criterion reduces, under certain assumptions, to several available criteria and is suitable for designing experiments to improve the estimation of specific parameter sets. Furthermore, the criterion systematically decomposes a complex system into small pieces according to PCA. Two engineering examples (one batch, one continuous) are used to illustrate the idea and the algorithm.
APA, Harvard, Vancouver, ISO, and other styles
25

Kang, Alan Montzy. "A real-time expert system shell for process control." Thesis, 1990. https://hdl.handle.net/10539/25920.

Full text
Abstract:
A dissertation submitted to the Faculty of Engineering, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Master of Science in Engineering
A multi-layered expert system shell that specifically addresses real-time issues is designed and implemented. The architecture of this expert system shell supports the concepts of parallelism, concurrent computation and competitive reasoning in that it allows several alternatives to be explored simultaneously. An inference engine driven by a hybrid of forward and backward chaining methods is used to achieve real-time response, and certainty factors are used for uncertainty management. Real-time responsiveness is improved by allowing the coexistence of procedural and declarative knowledge within the same system. A test bed that was set up in order to investigate the performance of the implemented shell is described. It was found in the performance analysis that the proposed system meets the real-time requirements as specified in this research.
Andrew Chakane 2018
APA, Harvard, Vancouver, ISO, and other styles
26

Handrigan, Paul. "Distributed systems, hardware-in-the-loop simulation, and applications in control systems /." 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
27

Randelhoff, Mark Charles. "A reconfigurable distributed process control environment for a network of PC's using Ada and NetBIOS." Thesis, 1992. http://hdl.handle.net/10413/6901.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Gebbie, Ian. "A framework for a real-time knowledge based system." Thesis, 1993. https://hdl.handle.net/10539/25059.

Full text
Abstract:
A dissertation submitted to the Faculty of Engineering, University of the Witwatersrand, in fulfilment of the requirements of the degree of Master of Science Engineering
A framework designed to contain and manage the use of knowledge in a real-time knowledge based system for high level control of an industrial process is presented. A prototype of the framework is designed and implemented on a static object-orientated shell. Knowledge is stored in objects and in forward chaining rules. The knowledge has a well defined structure, making it easy to create and manage. Rules are used to recognize conditions and propose control objectives. The framework uses the knowledge to determine variables that, if altered, will meet the objectives. Control actions are then found to implement changes to these variables. The use of explicit control objectives makes it possible to determine if an action worked as intended and if its use is suitable for the present conditions. This enables a learning mechanism to be applied in the expert system. The prototype operated adequately, but the knowledge required to drive the system was found to be very detailed and awkward to create.
APA, Harvard, Vancouver, ISO, and other styles
29

(6442592), Jaeyoung Kim. "A Study on a High Precision Magnetic Levitation Transport System for Carrying Organic Light-Emitting Diode Displays." Thesis, 2019.

Find full text
Abstract:

High precision magnetic levitation control methodologies for the manufacture of Organic light-emitting diode (OLED) displays are designed, manipulated, and experimentally validated in this thesis. OLED displays have many advantages over conventional display technologies, including being thinner and lighter and offering lower power consumption, higher resolution, and greater brightness. However, OLED displays require tightly controlled environmental conditions during manufacturing, without the introduction of vibration or contamination. For this reason, magnetic levitation is used to transport the displays, attached to a carrier, during the manufacturing process. This thesis addresses several critical problems related to improving the levitation control performance of the carrier's motion during the manufacturing process.

Attractive magnetic levitation requires measurement of the airgap between the carrier and the levitation electromagnets. An algorithm for modeling the gap sensor installation errors was developed and subsequently used for controller development. A levitation-only controller served as the starting point for the optimal state feedback controller-observer compensator developed in this study. This optimal state feedback controller-observer compensator allows the carrier to be passed between support fixtures without the introduction of vibration. This controller was designed, and its levitation control performance confirmed, with both simulation and experimental validation. To improve the levitation control of the carrier's motion, a second order notch filter and a first order low pass filter are designed to minimize the mechanical resonance and the noise from the gap sensor, respectively. To reduce sudden changes in the levitation forces owing to the discrete allocation of the levitation electromagnets, a section control algorithm is developed, such that the sum of the levitation forces equals the weight of the carrier and the sum of the moments along the propulsion axis equals zero.

Using the developed control strategies, the peak to peak variation of the carrier’s motion at a standstill was 50 µm. The same motion at low speed (30 mm/s) was 250 µm, while at high speed (300 mm/s) it was 430 µm. The relative improvement in levitation control performance of the optimal state feedback controller-observer compensator over the levitation-only controller was a peak to peak attenuation of 50 µm at low speed and 270 µm at high speed. Most significantly, with the optimal state feedback controller-observer compensator the carrier could be passed from support fixture to support fixture, i.e., through the deadzone, without mechanical contact or vibration that would inhibit other manufacturing processes.

Through comparative simulation and experimental validation, the proposed control strategies were shown to improve the levitation control performance of the carrier under uncertain disturbances and sensor installation errors, and they are expected to enable the manufacture of OLED displays with high productivity and a low defect rate.

APA, Harvard, Vancouver, ISO, and other styles
30

Husvu, Munyaradzi. "Business process improvements and innovations in support service processes and the effective measurement of their impact on the performance of manufacturing firms in South Africa." Thesis, 2017. http://hdl.handle.net/10539/23452.

Full text
Abstract:
A research report submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, in fulfillment of the requirements for the degree of Masters in Engineering, 2017
Manufacturing companies have challenges implementing business process improvements and innovations (BPI) on support service processes effectively and find it difficult to measure the impact of such interventions on the overall performance of the organisation. Measurement of the impact of BPIs on the overall performance of manufacturing companies is problematic due to the inadequacy of BPI metrics for support services. Furthermore, there are no universally accepted frameworks available for measuring the impact of improvements to support service processes on the performance of manufacturing companies. While there are frameworks available for performance measurement in general, they are not specific to measurement of the impact of BPIs in manufacturing support service processes. An initial exploratory study, based on an online survey of 50 companies that would typically conduct BPI or were known to the researcher to have conducted BPIs recently, was conducted to explore the nature of BPIs in manufacturing support service processes in South Africa. A second, longer online survey was then conducted with 1000 respondents in manufacturing companies selected through expert sampling to further explore the nature and impact of BPIs in manufacturing support service processes, considering the selection of support service processes, the types and number of support service processes, as well as BPI traditions and methodologies in use within manufacturing companies. In addition, four companies were selected for in-depth case studies in which ten projects were analysed by applying within-case and cross-case analysis. The results of the surveys, the case studies and a revisit to the case companies were used to refine successive iterations of a theoretical framework initially developed from the literature.
The framework provides a set of guidelines and actions for manufacturing companies to conduct BPIs on support service processes effectively, and a basis from which the impact of improvements in manufacturing support service processes on manufacturing companies can be measured, by specifying the measurement areas to consider and a set of high-level measures to use as indicators. Finally, the framework was checked for completeness using recommended criteria derived from the literature and was found to be complete and suitable, as it met all the criteria for good measurement systems defined in the literature sources used in this study.
MT 2017
APA, Harvard, Vancouver, ISO, and other styles
31

Park, Seong Cheol. "Indianapolis emergency medical service and the Indiana Network for Patient Care : evaluating the patient match process." Thesis, 2014. http://hdl.handle.net/1805/3808.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
In 2009, Indianapolis Emergency Medical Service (I-EMS, formerly Wishard Ambulance Service) launched an electronic medical record system within their ambulances and started to exchange patient data with the Indiana Network for Patient Care (INPC). This unique system allows EMS personnel in an ambulance to obtain important medical information from the incident scene, prior to the patient’s arrival at the accepting hospital. In this retrospective cohort study, we found EMS personnel made 3,021 patient data requests (14%) of 21,215 EMS transports during a one-year period, with a “success” match rate of 46% and a match “failure” rate of 17%. The three major factors causing match “failure” were (1) ZIP code, 55%; (2) Patient Name, 22%; and (3) Birth Date, 12%. This study shows that the ZIP code is not a robust identifier in the patient identification process, and that non-ZIP-code identifiers may be a better choice, given inaccuracies in and changes to the ZIP code in a patient’s record.
APA, Harvard, Vancouver, ISO, and other styles
32

Abdulla, Mohammed Shahid. "Simulation Based Algorithms For Markov Decision Process And Stochastic Optimization." Thesis, 2008. http://hdl.handle.net/2005/812.

Full text
Abstract:
In Chapter 2, we propose several two-timescale simulation-based actor-critic algorithms for the solution of infinite horizon Markov Decision Processes (MDPs) with finite state-space under the average cost criterion. On the slower timescale, all the algorithms perform a gradient search over corresponding policy spaces using two different Simultaneous Perturbation Stochastic Approximation (SPSA) gradient estimates. On the faster timescale, the differential cost function corresponding to a given stationary policy is updated and averaged for enhanced performance. A proof of convergence to a locally optimal policy is presented. Next, a memory efficient implementation using a feature-vector representation of the state-space and TD(0) learning along the faster timescale is discussed. A three-timescale simulation based algorithm for the solution of infinite horizon discounted-cost MDPs via the Value Iteration approach is also proposed. An approximation of the Dynamic Programming operator T is applied to the value function iterates. A sketch of convergence explaining the dynamics of the algorithm using associated ODEs is presented. Numerical experiments on rate based flow control on a bottleneck node using a continuous-time queueing model are presented using the proposed algorithms. Next, in Chapter 3, we develop three simulation-based algorithms for finite-horizon MDPs (FH-MDPs). The first algorithm is developed for finite state and compact action spaces while the other two are for finite state and finite action spaces. Convergence analysis is briefly sketched. We then concentrate on methods to mitigate the curse of dimensionality that affects FH-MDPs severely, as there is one probability transition matrix per stage. Two parametrized actor-critic algorithms for FH-MDPs with compact action sets are proposed, the ‘critic’ in both algorithms learning the policy gradient. We show w.p.1 convergence to a set satisfying the necessary conditions for constrained optima.
Further, a third algorithm for stochastic control of stopping time processes is presented. Numerical experiments with the proposed finite-horizon algorithms are shown for a problem of flow control in communication networks. Towards stochastic optimization, in Chapter 4, we propose five algorithms which are variants of SPSA. The original one-measurement SPSA uses an estimate of the gradient of the objective function L containing an additional bias term not seen in two-measurement SPSA. We propose a one-measurement algorithm that eliminates this bias and has asymptotic convergence properties that make for easier comparison with two-measurement SPSA. The algorithm, under certain conditions, outperforms both forms of SPSA with the only overhead being the storage of a single measurement. We also propose a similar algorithm that uses perturbations obtained from normalized Hadamard matrices. The convergence w.p.1 of both algorithms is established. We extend measurement reuse to design three second-order SPSA algorithms, sketch the convergence analysis and present simulation results on an illustrative minimization problem. We then propose several stochastic approximation implementations for related algorithms in flow control of communication networks, beginning with a discrete-time implementation of Kelly’s primal flow-control algorithm. Convergence with probability 1 is shown, even in the presence of communication delays and stochastic effects seen in link congestion indications. Two relevant enhancements are then pursued: a) an implementation of the primal algorithm using second-order information, and b) an implementation where edge-routers rectify misbehaving flows. Also, discrete-time implementations of Kelly’s dual algorithm and primal-dual algorithm are proposed. Simulation results a) verifying the proposed algorithms and b) comparing stability properties with an algorithm in the literature are presented.
APA, Harvard, Vancouver, ISO, and other styles
33

Khatib, Akram Ghassan. "Evaluation of performance of an air handling unit using wireless monitoring system and modeling." Thesis, 2014. http://hdl.handle.net/1805/5943.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Heating, ventilation, and air conditioning (HVAC) is the technology responsible for maintaining temperature levels and air quality in buildings to certain standards. In a commercial setting, HVAC systems accounted for more than 50% of the total energy cost of the building in 2013 [13]. New control methods are always being developed to improve the effectiveness and efficiency of the system. These control methods include model predictive control (MPC), evolutionary algorithms (EA), evolutionary programming (EP), and proportional-integral-derivative (PID) controllers. Such control tools are used on new HVAC systems to ensure the ultimate efficiency and the comfort of occupants. However, there is a need for a system that can monitor the energy performance of the HVAC system and ensure that it is operating optimally and controlled as expected. In this thesis, an air handling unit (AHU) of an HVAC system was modeled to analyze its performance using real data collected from an operating AHU through a wireless monitoring system. The purpose was to monitor the AHU's performance, analyze its key parameters to identify flaws, and evaluate the energy waste. This system will provide maintenance personnel with the key information they need to act to increase energy efficiency. The mechanical model was experimentally validated first. Then a baseline operating condition was established. Finally, the system was evaluated under extreme weather conditions. The AHU's subsystem performance, energy consumption, and potential wastes were monitored and quantified. The developed system was able to constantly monitor the system and report to the maintenance personnel the information they need. It can be used to identify energy savings opportunities due to controls malfunction. Implementation of this system will provide the system's key performance indicators, offer feedback for adjustment of control strategies, and identify potential savings. 
To further verify the capabilities of the model, a case study was performed on an air handling unit on campus over a three-month monitoring period. According to the mechanical model, a total of 63,455 kWh can potentially be saved on the unit by adjusting controls. In addition, the mechanical model was able to identify other energy savings opportunities due to set point changes that may result in a total of 77,141 kWh.
APA, Harvard, Vancouver, ISO, and other styles
34

(9293561), Rih-Teng Wu. "Development and Application of Big Data Analytics and Artificial Intelligence for Structural Health Monitoring and Metamaterial Design." Thesis, 2020.

Find full text
Abstract:

Recent advances in sensor technologies and data acquisition platforms have led to the era of Big Data. The rapid growth of artificial intelligence (AI), computing power and machine learning (ML) algorithms allows Big Data to be processed within affordable time constraints. This opens abundant opportunities to develop novel and efficient approaches to enhance the sustainability and resilience of Smart Cities. This work, starting with a review of the state-of-the-art data fusion and ML techniques, focuses on the development of advanced solutions for structural health monitoring (SHM) and metamaterial design and discovery strategies. A deep convolutional neural network (CNN) based approach that is more robust against noisy data is proposed to perform structural response estimation and system identification. To efficiently detect surface defects using mobile devices with limited training data, an approach that incorporates network pruning into transfer learning is introduced for crack and corrosion detection. For metamaterial design, a reinforcement learning (RL) based and a neural network based approach are proposed to reduce the computation efforts for the design of periodic and non-periodic metamaterials, respectively. Lastly, a physics-constrained deep auto-encoder (DAE) based approach is proposed to design the geometry of wave scatterers that satisfy user-defined downstream acoustic 2D wave fields. The robustness of the proposed approaches as well as their limitations are demonstrated and discussed through experimental data and/or numerical simulations. A roadmap for future works that may benefit the SHM and material design research communities is presented at the end of this dissertation.


APA, Harvard, Vancouver, ISO, and other styles
35

Yesmunt, Garrett Scot. "Design, analysis, and simulation of a humanoid robotic arm applied to catching." Thesis, 2014. http://hdl.handle.net/1805/5610.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
There have been many endeavors to design humanoid robots that have human characteristics such as dexterity, autonomy and intelligence. Humanoid robots are intended to cooperate with humans and perform useful work that humans can perform. The main advantage of humanoid robots over other machines is that they are flexible and multi-purpose. In this thesis, a human-like robotic arm is designed and used in a task which is typically performed by humans, namely, catching a ball. The robotic arm was designed to closely resemble a human arm, based on anthropometric studies. Rigid multibody dynamics software was used to create a virtual model of the robotic arm, perform experiments, and collect data. The inverse kinematics of the robotic arm was solved using a Newton-Raphson numerical method with a numerically calculated Jacobian. The system was validated by testing its ability to find a kinematic solution for the catch position and successfully catch the ball within the robot's workspace. The tests were conducted by throwing the ball such that its path intersects different target points within the robot's workspace. The method used for determining the catch location consists of finding the intersection of the ball's trajectory with a virtual catch plane. The hand orientation was set so that the normal vector to the palm of the hand is parallel to the trajectory of the ball at the intersection point and a vector perpendicular to this normal vector remains in a constant orientation during the catch. It was found that this catch orientation approach was reliable within a 0.35 x 0.4 meter window in the robot's workspace. For all tests within this window, the robotic arm successfully caught and dropped the ball in a bin. Also, for the tests within this window, the maximum position and orientation (Euler angle) tracking errors were 13.6 mm and 4.3 degrees, respectively. The average position and orientation tracking errors were 3.5 mm and 0.3 degrees, respectively.
The work presented in this study can be applied to humanoid robots in industrial assembly lines and hazardous environment recovery tasks, amongst other applications.
APA, Harvard, Vancouver, ISO, and other styles
36

Wilson, Derek Alan. "A Dredging Knowledge-Base Expert System for Pipeline Dredges with Comparison to Field Data." 2010. http://hdl.handle.net/1969.1/ETD-TAMU-2010-12-8653.

Full text
Abstract:
A Pipeline Analytical Program and Dredging Knowledge-Base Expert-System (DKBES) determines a pipeline dredge's production and resulting cost and schedule. Pipeline dredge engineering presents a complex and dynamic process necessary to maintain navigable waterways. Dredge engineers use pipeline engineering and slurry transport principles to determine the production rate of a pipeline dredge system. Engineers then use cost engineering factors to determine the expense of the dredge project. Previous work in engineering incorporated an object-oriented expert-system to determine cost and scheduling of mid-rise building construction where data objects represent the fundamental elements of the construction process within the program execution. A previously developed dredge cost estimating spreadsheet program which uses hydraulic engineering and slurry transport principles determines the performance metrics of a dredge pump and pipeline system. This study focuses on combining hydraulic analysis with the functionality of an expert-system to determine the performance metrics of a dredge pump and pipeline system and its resulting schedule. Field data from the U.S. Army Corps of Engineers pipeline dredge, Goetz, and several contract daily dredge reports show how accurately the DKBES can predict pipeline dredge production. Real-time dredge instrumentation data from the Goetz compares the accuracy of the Pipeline Analytical Program to actual dredge operation. Comparison of the Pipeline Analytical Program to pipeline daily dredge reports shows how accurately the Pipeline Analytical Program can predict a dredge project's schedule over several months. Both of these comparisons determine the accuracy and validity of the Pipeline Analytical Program and DKBES as they calculate the performance metrics of the pipeline dredge project.
The results of the study determined that the Pipeline Analytical Program compared closely to the Goetz field data where only pump and pipeline hydraulics affected the dredge production. Results from the dredge projects determined the Pipeline Analytical Program underestimated actual long-term dredge production. Study results identified key similarities and differences between the DKBES and spreadsheet program in terms of cost and scheduling. The study then draws conclusions based on these findings and offers recommendations for further use.
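The hydraulic production calculation at the heart of the analysis above can be illustrated with a minimal sketch: in-situ solids throughput as pipe cross-sectional area times mean slurry velocity times volumetric solids concentration. The diameter, velocity, and concentration below are placeholder values for illustration, not figures from the Goetz field data or the DKBES.

```python
import math

def solids_production(pipe_diameter_m, slurry_velocity_mps, cv):
    """In-situ solids throughput (m^3/h) of a pipeline dredge:
    pipe area x mean slurry velocity x volumetric solids concentration.
    All inputs here are illustrative placeholders."""
    area = math.pi * pipe_diameter_m ** 2 / 4.0  # pipe cross-section (m^2)
    return area * slurry_velocity_mps * cv * 3600.0  # m^3/s -> m^3/h

# Example: 0.6 m pipe, 4.5 m/s slurry velocity, 15% solids by volume
print(round(solids_production(0.6, 4.5, 0.15), 1))  # ~687.1 m^3/h
```

A full dredge model, as the thesis describes, would couple this with pump curves and pipeline friction losses to find the operating velocity rather than assuming it.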
APA, Harvard, Vancouver, ISO, and other styles
37

Strelet, Eugeniu. "Análise multivariada de imagem para monitorização avançada de processos e produtos." Master's thesis, 2018. http://hdl.handle.net/10316/86689.

Full text
Abstract:
Integrated Master's dissertation in Chemical Engineering presented to the Faculdade de Ciências e Tecnologia
This dissertation addresses a topic of growing importance in Chemical Engineering: image analysis and processing for process and product control. The essential objective is to extract the relevant information from images and use it to monitor a set of parameters of a given process and/or product, making it possible to implement multivariate statistical process control (MSPC) or to supervise product quality in real time. In this dissertation the collected images are processed with suitable methodologies to obtain the information relevant to these ends. The methodologies used include: Principal Component Analysis (PCA); Partial Least Squares (PLS); classification methodologies; classical image analysis and processing; Wavelet Transform Analysis (WTA); and control charts (namely Hotelling's T2 and Q). The work is divided into three parts, each corresponding to a distinct objective, and thus the strategies and methodologies are functionally distinct. The objectives are: (i) to build a model that monitors, from electrical tomography, the behaviour of a parameter (the interface height of two immiscible liquids in a pipe) without having to reconstruct the image in question; (ii) to develop an algorithm for simultaneous colour and shape analysis; (iii) to develop an algorithm for multivariate statistical process control based on simultaneous texture and colour analysis and processing. To reach the first objective, PLS regression was used after a preliminary PCA of the data.
This made it possible to explore the system in question (two immiscible liquids, oil and water, in a pipe) and to build a model that monitors the central interface height with good accuracy and robustness. The second objective was achieved by developing the Object Detection and Classification (DCO) algorithm, which uses classical image analysis and processing to detect and separate objects from the background and to extract features relevant to their later identification; it also combines classification approaches with PCA to determine the colour and shape of the objects in question. Finally, a recently proposed strategy combining WTA and MSPC, Multiscale and Multivariate Image Analysis, was tested for real-time monitoring of textured products. With this algorithm it was possible to detect all faults (in size, shape, and colour) of the process simulated in this work, thus meeting the third objective. At the end of this work, it was clear that images are a rich source of information. Digital image processing and analysis, combined with the strategies and methodologies explored in this dissertation, proved to be a combination with great potential for Chemical Engineering, namely in process and/or product control.
In this dissertation, strategies are presented that allow useful information to be extracted from images in real time. That information can then be used to monitor a set of product or process parameters and to implement image-based Multivariate Statistical Process Control. The main goal of Multivariate Image Analysis for Process and Product Monitoring is to "squeeze" the images to obtain, by applying proper methodologies, the information required for process monitoring and control. The methodologies used are: Principal Component Analysis; Partial Least Squares; classifiers; classical digital image processing; wavelet transforms; and control charts (e.g. Hotelling's T2 and Q). This work is divided into three parts, each corresponding to a different goal: (i) to derive a model for monitoring the interface height using Electrical Tomography; (ii) to develop an algorithm that identifies objects using spectral and shape information extracted simultaneously from images; (iii) to implement on-line, for the first time, a recently proposed algorithm for image-based process monitoring called Multiscale and Multivariate Image Analysis (MSMIA). To reach the first goal, a PLS regression modelling approach was developed after a PCA analysis; with this approach it was possible to build a model that monitors the interface height with good accuracy and robustness. The second goal was reached by building an algorithm that combines classical digital image processing with Multivariate Image Analysis approaches and is able to detect the colour and shape of objects. Within the scope of the third goal, Multiscale and Multivariate Image Analysis was implemented on-line and tested; with this strategy it was possible to detect all simulated process faults (size, shape, and colour). With this work, it was possible to demonstrate that images are a rich source of information.
Digital image processing allied with advanced data analysis and modelling is a powerful combination for Chemical Engineering, namely for process and/or product control.
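The MSPC machinery named in the abstract above (PCA followed by Hotelling's T2 and Q charts) can be sketched as follows. The synthetic feature matrix and the number of retained components are assumptions for illustration only, not the dissertation's image data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference ("in-control") data: rows = observations (e.g. images),
# columns = extracted features (e.g. wavelet or colour statistics).
# Synthetic stand-in for real image features.
X = rng.normal(size=(200, 6))
mu, sigma = X.mean(axis=0), X.std(axis=0, ddof=1)
Xs = (X - mu) / sigma

# PCA via SVD; keep k principal components.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                        # loadings (6 x k)
lam = (S[:k] ** 2) / (len(X) - 1)   # variance of each retained component

def t2_and_q(x_new):
    """Hotelling's T^2 and Q (squared prediction error) for one new sample."""
    xs = (x_new - mu) / sigma
    t = xs @ P                      # scores in the PCA subspace
    t2 = np.sum(t ** 2 / lam)       # distance within the model plane
    resid = xs - t @ P.T            # part not explained by the PCA model
    q = np.sum(resid ** 2)          # distance off the model plane
    return t2, q

t2, q = t2_and_q(X[0])
print(t2, q)  # both non-negative; compared to control limits in practice
```

In a monitoring application, control limits for T2 and Q would be estimated from the reference data, and new images whose statistics exceed either limit would be flagged as faults.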
APA, Harvard, Vancouver, ISO, and other styles
