To view the other types of publications on this topic, follow the link: Automated factories.

Dissertations on the topic "Automated factories"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a type of source:

Consult the top 50 dissertations for research on the topic "Automated factories".

Next to every entry in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read an online abstract of the work, provided the relevant details are available in the metadata.

Browse dissertations from a wide range of disciplines and compile your bibliography correctly.

1

Winck, Ryder Christian. „Fabric control for feeding into an automated sewing machine“. Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28205.

2

Musleh, Maath. „Visual Analysis of Industrial Multivariate Time-Series Data : Effective Solution to Maximise Insights from Blow Moulding Machine Sensory Data“. Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-105253.

Annotation:
Developments in the field of data analytics provide a boost for small-sized factories. These factories are eager to take full advantage of the potential insights in the remotely collected data to minimise cost and maximise quality and profit. This project aims to process, cluster and visualise sensory data of a blow moulding machine in a plastic production factory. In collaboration with Lean Automation, we aim to develop a data visualisation solution to enable decision-makers in a plastic factory to improve their production process. We will investigate three different aspects of the solution: methods for processing multivariate time-series data, clustering approaches for the collected sensory data, and visualisation techniques that maximise production process insights. We use a formative evaluation method to develop a solution that meets partners' requirements and best practices within the field. Through building the MTSI dashboard tool, we hope to answer questions on optimal techniques to represent, cluster and visualise multivariate time series data.
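To make the clustering step described in this abstract concrete, here is a minimal Python sketch of one common approach (windowing the multivariate series and clustering the windows with k-means); the window length, cluster count, and synthetic data are illustrative assumptions, not details taken from the thesis.

```python
# Minimal sketch: cluster fixed-length windows of multivariate sensor data.
# All variable names and parameters are illustrative, not taken from the thesis.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def cluster_sensor_windows(series: np.ndarray, window: int = 60, n_clusters: int = 4):
    """series: array of shape (n_samples, n_sensors) from one machine."""
    # Split the multivariate series into non-overlapping windows.
    n_windows = series.shape[0] // window
    windows = series[: n_windows * window].reshape(n_windows, window * series.shape[1])
    # Standardise features so no single sensor dominates the distance metric.
    features = StandardScaler().fit_transform(windows)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)
    return labels

# Example with synthetic data standing in for blow moulding machine sensors.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(600, 5))          # 600 time steps, 5 sensors
    print(cluster_sensor_windows(data)[:10])  # cluster label per window
```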
3

Kalibjian, J. R., T. J. Voss and J. J. Yio. „Automated Application of Calibration Factors on Telemetered Data“. International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608881.

Annotation:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
A long-standing problem in telemetry post-processing is the application of correct calibration factors to telemetered data generated on a system which has had a history of hardware changes. These calibration problems are most acute when old test data is being examined and there is uncertainty as to the hardware configuration at the time of the test. In this paper a mechanism for introducing a high degree of reliability into the application of calibration factors is described, in an implementation done for Brilliant Pebbles Flight Experiment Three (FE-3).
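The core idea summarised above is selecting the calibration factors that were in force for the hardware configuration at test time and applying them to the raw data. A minimal sketch of that lookup-and-apply step follows; the data model, field names, and linear gain/offset form are assumptions made for illustration, not taken from the paper.

```python
# Sketch: choose calibration factors based on the hardware configuration
# history, then apply them to raw telemetered counts. Hypothetical data model.
from bisect import bisect_right
from datetime import date

# Calibration history per channel: (effective_date, gain, offset), sorted by date.
CAL_HISTORY = {
    "channel_1": [
        (date(1991, 1, 1), 0.50, -2.0),   # original hardware
        (date(1992, 6, 15), 0.48, -1.7),  # after a sensor swap
    ],
}

def factors_at(channel, test_date):
    """Return the (gain, offset) in force on test_date for this channel."""
    history = CAL_HISTORY[channel]
    dates = [effective for effective, _, _ in history]
    idx = bisect_right(dates, test_date) - 1
    if idx < 0:
        raise ValueError("no calibration on record before the test date")
    _, gain, offset = history[idx]
    return gain, offset

def calibrate(channel, test_date, raw_counts):
    gain, offset = factors_at(channel, test_date)
    return [gain * c + offset for c in raw_counts]

print(calibrate("channel_1", date(1993, 10, 26), [100.0, 102.5]))
```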
4

Holden, Jeffrey. „A STUDY OF SEMI-AUTOMATED TRACING“. DigitalCommons@CalPoly, 2011. https://digitalcommons.calpoly.edu/theses/574.

Annotation:
Requirements tracing is crucial for software engineering practices including change analysis, regression testing, and reverse engineering. The requirements tracing process produces a requirements traceability matrix (TM) which links high- and low-level document elements. Manually generating a TM is laborious, time consuming, and error-prone. Due to these challenges, TMs are often neglected. Automated information retrieval (IR) techniques are used with some efficiency. However, in mission- or safety-critical systems a human analyst is required to vet the candidate TM. This introduces semi-automated requirements tracing, where IR methods present a candidate TM and a human analyst validates it, producing a final TM. In semi-automated tracing the focus becomes the quality of the final TM. This thesis expands upon the research of Cuddeback et al. by examining how human analysts interact with candidate TMs. We conduct two experiments, one using an automated tracing tool and the other using manual validation. We conduct formal statistical analysis to determine the key factors impacting the analyst's tracing performance. Additionally, we conduct a pilot study investigating how analysts interact with TMs generated by automated IR methods. Our research statistically confirms the finding of Cuddeback et al. that the strongest impact on analyst performance is the initial TM quality. Finally, we show evidence that applying local filters to IR results produces the best candidate TMs.
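For readers unfamiliar with IR-based tracing, the candidate-TM generation step can be sketched as below, using TF-IDF and cosine similarity over made-up requirement text. This is a generic illustration, not the tooling used by Cuddeback et al. or in this thesis; the threshold filter stands in loosely for the "local filters" mentioned above.

```python
# Sketch: build a candidate traceability matrix by scoring every
# high-level/low-level requirement pair with TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

high_level = {
    "HL1": "The system shall encrypt all stored user credentials.",
    "HL2": "The system shall log every failed login attempt.",
}
low_level = {
    "LL1": "User credentials are hashed before being stored in the database.",
    "LL2": "Each failed login attempt is appended to the audit log.",
}

def candidate_tm(high, low, threshold=0.05):
    vec = TfidfVectorizer(stop_words="english")
    docs = list(high.values()) + list(low.values())
    tfidf = vec.fit_transform(docs)
    sims = cosine_similarity(tfidf[: len(high)], tfidf[len(high):])
    links = {}
    for i, h_id in enumerate(high):
        # Keep only candidate links above the threshold (a top-k filter
        # per high-level element would be another "local" filtering choice).
        links[h_id] = [l_id for j, l_id in enumerate(low) if sims[i, j] >= threshold]
    return links

print(candidate_tm(high_level, low_level))
```

A human analyst would then vet each candidate link, which is exactly the semi-automated step whose quality this thesis studies.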
5

Bass, Ellen J. „Human-automated judgment learning : a research paradigm based on interpersonal learning to investigate human interaction with automated judgments of hazards“. Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/25498.

6

Louw, Tyron Linton. „The human factors of transitions in highly automated driving“. Thesis, University of Leeds, 2017. http://etheses.whiterose.ac.uk/17148/.

Annotation:
The aim of this research was to investigate the nature of the out-of-the-loop (OoTL) phenomenon in highly automated driving (HAD), and its effect on driver behaviour before, during, and after the transition from automated to manual control. The work addressed questions relating to how automation affects drivers' (i) performance in transition situations requiring control- and tactical-level responses, (ii) their behaviour in automation compared to in manual driving, (iii-iv) their visual attention distribution before and during the transition, as well as (v) their perceptual-motor performance after resuming control. A series of experiments were developed to take drivers progressively further OoTL for short periods during HAD, by varying drivers' secondary task engagement and the amount of visual information from the system and environment available to them. Once the manipulations ended, drivers were invited to determine a need to resume control in critical and non-critical vehicle following situations. Results showed that, overall, drivers looked around more during HAD, compared to manual driving, and had poorer vehicle control in critical transition situations. Generally, the further OoTL drivers were during HAD, the more dispersed their visual attention. However, within three seconds of the manipulations ending, the differences between the conditions resolved, and in many cases, this was before drivers resumed control. Differences between the OoTL manipulations emerged once again in terms of the timing of drivers' initial response (take-over time) in critical events, where the further OoTL drivers were the longer it took them to resume control, but there was no difference in the quality of the subsequent vehicle control. Results suggest that any information presented to drivers during automation should be placed near the centre of the road and that kinematically early avoidance response may be more important for safety than short take-over times. This thesis concludes with a general conceptualisation of the relationship between a number of driver and vehicle/environment factors that influence driver performance in the transition.
7

Machado, Kayla L. „Management factors affecting calf growth and health“. Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/76914.

Annotation:
Two calf feeding trends are emerging in the dairy industry in the United States. Large herds often find it economical to feed pasteurized waste milk, while smaller herds are embracing technological advancements by utilizing automated calf milk feeders. Housing of calves varies depending on feeding mechanism. Calves fed using autofeeders are grouped together, but large herds often find it more labor efficient to house calves individually in elevated wooden crates or polyethylene hutches. Two studies were conducted. The objective of the first field study was to evaluate the influence of diet and housing type on growth and morbidity in 84 Holstein heifer calves in a 2 by 2 factorial experimental design. Calves were housed in either polyethylene hutches or elevated wooden crates with slatted floors. Diets consisted of pasteurized waste milk or the same waste milk supplemented to provide approximately 454 g of milk replacer solids containing 25% protein and 10% fat (LOL Balancer). Calves were randomly placed in 1 of 4 treatment groups 48 h after birth and monitored until weaning (~60 d of age). Body weights and hip heights were measured at time of enrollment and weaning. Milk samples of pasteurized waste milk were obtained five times weekly to measure standard bacteriological plate count, fat, protein and total solids content. All calves were fed 3.3 L of liquid diet via bottle at 0730 and 1530 h. Calves were monitored daily for respiratory and digestive illness and treated according to established protocols. Pasteurized waste milk contained 332,171 ± 733,487 cfu/mL, 3.51 ± 0.59% fat, 3.13 ± 0.30% protein, and 11.64 ± 1.05% total solids. Housing (P = 0.02) and diet (P = 0.01) affected weight gain, but there was no interaction. Least squares average daily gains for crates and hutches were 0.52 ± 0.024 and 0.59 ± 0.024 kg/d. Least squares average daily gains for the waste milk and balancer diets were 0.52 ± 0.024 and 0.60 ± 0.024 kg/d, respectively. Housing or diet did not affect hip height growth/d (0.196 ± 0.007 cm). Health of the calves was not affected by diet or housing. Supplementing waste milk with balancer or housing calves in hutches resulted in higher weight gain. The objective of the second study was to evaluate management, sanitation, and consistency of liquid delivered to calves via automated feeders. Ten herds in Virginia and North Carolina with sophisticated (Förster-Technik, Germany) and basic (Biotic Industries Inc., TN, USA) machines completed a 60-question survey concerning calf and autofeeder management. Duplicate milk replacer samples were obtained to measure sanitation, dry matter, and temperature of milk in the autofeeder at the time of the survey. Six dairies from the original 10 were visited monthly for 3 mo for continued evaluation of sanitation, dry matter, and temperature of milk replacer from the autofeeder. Seven herds utilizing basic machines had a mean SPC of 6,925,000 ± 7,371,000 cfu/ml. The mean dry matter and temperature readings were 12.0 ± 2.1 Brix and 38.8 ± 6.7 °C, respectively. Three dairies that used sophisticated autofeeders had a mean SPC of 1,339,000 ± 2,203,000 cfu/ml. Mean dry matter and temperature readings were 10.37 ± 1.68 Brix and 38.6 ± 6.76 °C, respectively. Dairies were also categorized based on management strategies. Producers that purchased autofeeders to manipulate feeding rates, refocus labor to sanitation and the care and well-being of calves, or for technological advancements were successful at rearing calves via autofeeders.
Dairy producers who purchased an autofeeder to explore feeding options were not as successful because proper time and management was not dedicated to care of calves or to maintenance of the autofeeder.
Master of Science
8

Sanchez, Julian. „Factors that affect trust and reliance on an automated aid“. Diss., Available online, Georgia Institute of Technology, 2006, 2006. http://etd.gatech.edu/theses/available/etd-03302006-115459/.

Annotation:
Thesis (Ph. D.)--Psychology, Georgia Institute of Technology, 2006.
Ute Fischer, Committee Member ; Jerry R. Duncan, Committee Member ; Gregory Corso, Committee Member ; Wendy A. Rogers, Committee Member ; Arthur D. Fisk, Committee Chair.
9

Lee, Lisa Meredith. „Factors affecting accuracy ratings of an automated adolescent MMPI report /“. Norfolk, Va. : Lee, 1989.

10

Barg-Walkow, Laura Hillary. „Understanding the role of expectations on human responses to an automated system“. Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/52909.

Annotation:
As automation becomes increasingly ubiquitous, it is important to know how differences in introducing automated systems will affect human-automation interactions. There are two main ways of introducing the expected reliability of an automated system to users: explicitly telling operators what to expect or giving operators experience using the system. This study systematically investigated the effect of expectation format initially and over time on: 1) perceptions of reliability and system usage, and 2) human responses to automation (e.g., compliance, reliance, and overall dependence). Initially, there was an effect of expected level for explicit statement groups, whereas there was no effect of expected level for initial exposure groups. Over time, explicit statement groups had more stable perceptions of system reliability than the initial exposure groups. In general, perceived reliability did not converge to actual system reliability (75%) by the end of the study. Additionally, perceived reliability had a weak, but positive relationship with actual system use, whereas perceptions of system use (e.g., perceived dependence) had a strong, but negative relationship with actual system use. Outside of initial effects seen with perceived reliability, there were few initial differences between expectation formats. Almost all groups tended to initially comply more than rely, with the exception of the initial exposure – lower-than group. Over time, level of expectation for initial exposure groups influenced reliance. There were no differences between expectation groups on compliance and dependence over time. In general, dependence and compliance increased or stayed the same as time using the system increased. This pattern was also seen with reliance, with the exception of the initial exposure – higher-than group decreasing reliance over time. Results from this study have implications for both theory and practice. The research findings both support and augment the existing conceptual model of automation. A better understanding of the differential effects of expectation format and introduced level of expectations can lead to introductions of automated systems that are best suited to the system's goals, ultimately improving system performance.
11

Ekman, Matthew J. „Automated photoelastic determination of fracture parameters for bimaterial interface cracks“. Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/10421.

Annotation:
This thesis details an experimental study on the determination of the fracture parameters for a crack located at the interface between two dissimilar materials using the method of photoelasticity. The interface is potentially an inherent weak spot of any composite material, structure or adhesively bonded joint. Accurate description of the state of stress at the crack tip is required for strength prediction. The concept of the complex stress intensity factor is used to characterise the elastic crack tip stress field for an interface crack. Complex stress intensity factors and their moduli have been measured experimentally for standard bimaterial crack geometries using the well established technique of photoelasticity. Bimaterial specimens comprising aluminium alloy and epoxy resin components were used. This creates a large material mismatch at the interface and allows data to be collected from the epoxy component of the specimen using transmission photoelasticity. An automated full field photoelastic technique was developed to significantly reduce the data collection time. The technique comprises elements from the approaches of three wavelength and phase stepping photoelasticity and is a significant improvement on techniques previously available. Stress intensity factors were determined by fitting a theoretical stress field solution for the bimaterial crack to the experimental data. A computational routine automatically selects the region of best fit between the experimental data and the theoretical solution. This data is then used to determine the complex stress intensity factor and its modulus value. In order to provide a robust fit between the experimental data and the theoretical field solution a weighting function was incorporated into the routine. The measured bimaterial stress intensity factors are compared with those determined experimentally for equivalent homogeneous specimens made from epoxy resin. The differences between the two are then discussed. The experimental results agree with the well known concept that tension and shear effects are inherently coupled at the crack tip. However, the effects of changing the load angle with respect to the interface also demonstrate that some contrasts exist with known numerical solutions.
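As a generic illustration of the weighted fitting procedure described above (fitting a theoretical field to experimental data with a weighting function), the following sketch fits a placeholder singular field to synthetic data. The model function and weights are stand-ins, not the bimaterial crack-tip solution or the routine used in the thesis.

```python
# Generic sketch: weighted least-squares fit of a parameterised model to
# experimental field data. The model is a placeholder, not the thesis's field.
import numpy as np
from scipy.optimize import least_squares

def model(params, r, theta):
    k1, k2 = params
    # Placeholder field: a simple r^(-1/2) singular term with angular mixing.
    return (k1 * np.cos(theta / 2) + k2 * np.sin(theta / 2)) / np.sqrt(r)

def weighted_residuals(params, r, theta, observed, weights):
    return weights * (model(params, r, theta) - observed)

# Synthetic "experimental" data standing in for photoelastic fringe values.
rng = np.random.default_rng(1)
r = rng.uniform(0.5, 3.0, 200)
theta = rng.uniform(-np.pi, np.pi, 200)
observed = model([1.2, 0.4], r, theta) + rng.normal(0, 0.02, 200)
weights = np.sqrt(r)  # e.g. down-weight points nearest the crack tip

fit = least_squares(weighted_residuals, x0=[1.0, 0.0],
                    args=(r, theta, observed, weights))
print("fitted parameters:", fit.x)
```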
12

McBride, Sara E. „The effect of workload and age on compliance with and reliance on an automated system“. Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33886.

Annotation:
Automation provides the opportunity for many tasks to be done more effectively and with greater safety. However, these benefits are unlikely to be attained if an automated system is designed without the human user in mind. Many characteristics of the human and automation, such as trust and reliability, have been rigorously examined in the literature in an attempt to move towards a comprehensive understanding of the interaction between human and machine. However, workload has primarily been examined solely as an outcome variable, rather than as a predictor of compliance, reliance, and performance. This study was designed to gain a deeper understanding of whether workload experienced by human operators influences compliance with and reliance on an automated warehouse management system, as well as to assess whether age-related differences exist in this interaction. As workload increased, performance on the Receiving Packages task decreased among younger and older adults. Although younger adults also experienced a negative effect of workload on Dispatching Trucks performance, older adults did not demonstrate a significant effect. The compliance data showed that as workload increased, younger adults complied with the automation to a greater degree, and this was true regardless of whether the automation was correct or incorrect. Older adults did not demonstrate a reliable effect of workload on compliance behavior. Regarding reliance behavior, as workload increased, reliance on the automation increased, but this effect was only observed among older adults. Again, this was true regardless of whether the automation was correct or incorrect. The finding that individuals may be more likely to comply with or rely on faulty automation if they are in a high workload state compared to a low workload state suggests that an operator's ability to detect automation errors may be compromised in high workload situations. Overall, younger adults outperformed older adults on the task. Additionally, older adults complied with the system more than younger adults when the system erred, which may have contributed to their poorer performance. When older adults verified the instructions given by the automation, they spent longer doing so than younger adults, suggesting that older adults may experience a greater cost of verification. Further, older adults reported higher workload and greater trust in the system than younger adults, but both age groups perceived the reliability of the system quite accurately. Understanding how workload and age influence automation use has implications for the way in which individuals are trained to interact with complex systems, as well as the situations in which automation implementation is determined to be appropriate.
13

Charron, Rhona. „The influence of different degrees of assistance in automated intelligent tutoring /“. Thesis, McGill University, 1989. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=61997.

14

Watts, Robert Michael. „Development and evaluation of an automated path planning aid“. Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33839.

Annotation:
In the event of an onboard emergency, air transport pilots are remarkably adept at safely landing their aircraft. However, the tasks of selecting an alternate landing site and developing a safe path to land are very difficult in the high workload, high stress environment of a cockpit during an emergency. The purpose of this research was to develop an automated path planning aid which would assist the pilot in the completion of these tasks. A prototype was developed to test this concept experimentally. The experiment was also intended to gather further information about how pilots think about and accomplish this task, as well as the best ways to assist them. In order to better understand the priorities and processes pilots use when dealing with emergency planning, a survey of airline pilots was conducted. The results of this survey highlighted the fact that each emergency is unique and has its own set of factors which are critically important. One factor which is important in many emergencies is the need to land quickly. The survey responses indicated that one of the most important characteristics of a useful tool is that it should provide pertinent information in an easy-to-use manner, and should not divert too much attention from the pilots' other tasks. A number of design goals drove the development of the prototype aid. First, the aid was to work within current aircraft, without requiring substantial redesign of the cockpit. Second, the aid was to help improve pilots' performance without increasing their workload. Finally, the aid was designed to assist pilots in obtaining and processing critical information which influences the site selection and path development tasks. One variation of the aid included a filter dial which allowed pilots to quickly reduce the number of options considered; another variation of the aid did not include such a dial. These two variations of the aid were tested in order to assess the impact of the addition of the filter dial to the system. Though many of the results did not prove to be statistically significant, they suggest that the addition of a filter dial improved the quality of the selected landing site; however, it also increased the time required for the selection. The results were obtained in both familiar and unfamiliar emergencies. The dial was shown to improve the time to complete the task in the case of unfamiliar emergencies. The experiment also compared an optimal ranking system to a non-optimal system, for which results showed no significant difference between the two. This may imply that while pilots did not tend to over-rely on the ranking system, under-reliance may need to be addressed by training and a better understanding of the factors which impact the rankings. The participants found that the aid facilitates quick and easy access to critical information. The aid was also useful for processing this information by filtering out options which were inappropriate for a given scenario through the use of the filter dial. The participants also made recommendations about possible improvements which could be made to the system, such as better filter settings which are more similar to the way that pilots think about their options.
15

TSE, Man Kei. „Evaluation of an anaesthesia automated record keeping system : a human factors approach“. Digital Commons @ Lingnan University, 2018. https://commons.ln.edu.hk/otd/40.

Annotation:
Anaesthesia Information Management System (AIMS) is an automated record keeping system that imports and stores a patient's vital signs information from a physiological monitor in real time. However, only a handful of studies have examined the effect of automated record keeping systems on anaesthetists' cognitive performance. Therefore, the current thesis aims to evaluate AIMS in terms of anaesthetists' attitudes (Study 1) and its effect on their cognitive performance (Study 2). Study 1, a questionnaire study, examined anaesthetists' trust and acceptance of AIMS. Forty-two anaesthetists at Tuen Mun Hospital (TMH) and Po Oi Hospital (POH) completed a self-report questionnaire. Results found that anaesthetists generally adopted a positive attitude toward AIMS. They exhibited a high level of trust and acceptance of AIMS. Also, they perceived AIMS as highly useful and relevant to their job. Study 2, a simulation study, compared AIMS with manual record keeping in terms of anaesthetists' vigilance, situation awareness (SA) and mental workload. Twenty anaesthetists at TMH were randomly assigned to two conditions: (1) AIMS and (2) Manual. Each participant received a 45-minute scenario in a full-scale simulation. Participants were asked to take over a case of general anaesthesia and perform record keeping. Results showed that AIMS did not impair anaesthetists' vigilance and SA. In addition, it reduced anaesthetists' mental workload and enabled them to spend less time on the record keeping task. The current thesis provides an evaluation of AIMS using a human factors approach. It contributes to the understanding of the effect of AIMS on anaesthetists in terms of attitude and cognitive performance. Based on the evaluation, we generate some recommendations for designers and hospitals to address the limitations of AIMS in interface design and to increase anaesthetists' acceptance of AIMS.
16

Mayer, Andrew K. „Manipulation of user expectancies effects on reliance, compliance, and trust using an automated system /“. Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22633.

17

Banks, Victoria A. „Human factors considerations in the design and development of highly automated driving systems“. Thesis, University of Southampton, 2016. https://eprints.soton.ac.uk/397266/.

Annotation:
Increasing levels of automation within the driving task have seen the driver's role change from an active operator to one of a passive monitor. However, systems design has been plagued by criticism for failing to acknowledge the new role of the driver within the system network. To further our understanding of the driver's role within an automated driving system, the theory of Distributed Cognition was adopted. Distributed Cognition provides a useful framework for the investigation of task partitioning between multiple system agents. A novel Systems Design Framework has been developed as part of this thesis that utilises both qualitative and quantitative research methodologies within the Distributed Cognition paradigm. The framework is divided into two phases: the first phase requires an understanding of how individual system agents function, in order to create models that show how these components share information using Operator Sequence Diagrams, whilst empirical methods were used to validate these models in the second phase (e.g. Verbal Protocol Analysis and Network Analysis). These extension methodologies were useful in highlighting a number of design weaknesses, beyond the modelled technological components, that required modification to improve overall system design. The Systems Design Framework has been successfully applied to provide Systems Engineers with a foundation to design and conduct research into the human factors implications of different levels of automation within driving.
18

El-Dabaja, Sarah S. „Drivers of "Driverless" Vehicles: A Human Factors Study of Connected and Automated Vehicle Technologies“. Ohio University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1576670482075765.

19

Naujoks, Frederik, Sebastian Hergeth, Katharina Wiedemann, Nadja Schömig, Yannick Forster and Andreas Keinath. „Test procedure for evaluating the human-machine interface of vehicles with automated driving systems“. Taylor & Francis, 2019. https://publish.fid-move.qucosa.de/id/qucosa%3A72242.

Annotation:
Objective: The human–machine interface (HMI) is a crucial part of every automated driving system (ADS). In the near future, it is likely that—depending on the operational design domain (ODD)—different levels of automation will be available within the same vehicle. The capabilities of a given automation level as well as the operator’s responsibilities must be communicated in an appropriate way. To date, however, there are no agreed-upon evaluation methods that can be used by human factors practitioners as well as researchers to test this. Methods: We developed an iterative test procedure that can be applied during the product development cycle of ADS. The test procedure is specifically designed to evaluate whether minimum requirements as proposed in NHTSA’s automated vehicle policy are met. Results: The proposed evaluation protocol includes (a) a method to identify relevant use cases for testing on the basis of all theoretically possible steady states and mode transitions of a given ADS; (b) an expert-based heuristic assessment to evaluate whether the HMI complies with applicable norms, standards, and best practices; and (c) an empirical evaluation of ADS HMIs using a standardized design for user studies and performance metrics. Conclusions: Each can be used as a stand-alone method or in combination to generate objective, reliable, and valid evaluations of HMIs, focusing on whether they meet minimum requirements. However, we also emphasize that other evaluation aspects such as controllability, misuse, and acceptance are not within the scope of the evaluation protocol.
20

Dadashi, Nastaran. „Human factors of future rail intelligent infrastructure“. Thesis, University of Nottingham, 2012. http://eprints.nottingham.ac.uk/13157/.

Annotation:
The introduction of highly reliable sensors and remote condition monitoring equipment will change the form and functionality of maintenance and engineering systems within many infrastructure sectors. Process, transport and infrastructure companies are increasingly looking to intelligent infrastructure to increase reliability and decrease costs in the future, but such systems will present many new (and some old) human factor challenges. As the first substantial piece of human factors work examining future railway intelligent infrastructure, this thesis has an overall goal to establish a human factors knowledge base regarding intelligent infrastructure systems, as used in tomorrow’s railway but also in many other sectors and industries. An in-depth interview study with senior railway specialists involved with intelligent infrastructure allowed the development and verification of a framework which explains the functions, activities and data processing stages involved. The framework includes a consideration of future roles and activities involved with intelligent infrastructure, their sequence and the most relevant human factor issues associated with them, especially the provision of the right information in the right quantity and form to the right people. In a substantial fieldwork study, a combination of qualitative and quantitative methods was employed to facilitate an understanding of alarm handling and fault finding in railway electrical control and maintenance control domains. These functions had been previously determined to be of immediate relevance to work systems in the future intelligent infrastructure. Participants in these studies were real railway operators as it was important to capture users’ cognition in their work settings. Methods used included direct observation, debriefs and retrospective protocols and knowledge elicitation. Analyses of alarm handling and fault finding within real-life work settings facilitated a comprehensive understanding of the use of artefacts, alarm and fault initiated activities, along with sources of difficulty and coping strategies in these complex work settings. The main source of difficulty was found to be information deficiency (excessive or insufficient information). Each role requires different levels and amounts of information, a key to good design of future intelligent infrastructure. The findings from the field studies led to hypotheses about the impact of presenting various levels of information on the performance of operators for different stages of alarm handling. A laboratory study subsequently confirmed these hypotheses. The research findings have led to the development of guidance for developers and the rail industry to create a more effective railway intelligent infrastructure system and have also enhanced human factors understanding of alarm handling activities in electrical control.
21

Mehrmand, Arash. „A Factorial Experiment on Scalability of Search-based Software Testing“. Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4224.

Annotation:
Software testing is an expensive process, which is vital in the industry. Constructing the test data accounts for the major cost in software testing, and knowing which method to use in order to generate the test data is very important. This paper discusses the performance of search-based algorithms (preferably genetic algorithms) versus random testing in software test-data generation. A factorial experiment is designed so that we have more than one factor for each experiment we make. Although much research has been done in the area of automated software testing, this research differs from all of it because of the sample programs (SUTs) which are used. Since the program generation is automatic as well, Grammatical Evolution is used to guide the program generation. The programs are not goal based, but generated according to the grammar we provide, with different levels of complexity. The genetic algorithm is first applied to the programs; then we apply random testing. Based on the results which come up, this paper recommends one method to use for software testing, if the SUT has the same conditions as we had in this study. The SUTs are not like the sample programs provided by other studies, since they are generated using a grammar.
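As a rough illustration of the comparison being made, the sketch below evolves integer test inputs toward an invented hard-to-hit branch with a tiny genetic algorithm and contrasts that with purely random sampling. The SUT, fitness function, and parameters are invented for the example and are not the thesis's actual setup.

```python
# Sketch: search-based test-data generation (a tiny genetic algorithm)
# versus random testing, on an invented system under test (SUT).
import random

def sut_branch_hit(x: int) -> bool:
    # Hypothetical SUT: the "interesting" branch needs x very close to 4242.
    return abs(x - 4242) <= 1

def fitness(x: int) -> float:
    # Branch-distance style fitness: smaller is better.
    return abs(x - 4242)

def genetic_search(pop_size=20, generations=100, lo=0, hi=1_000_000):
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    evaluations = 0
    for _ in range(generations):
        pop.sort(key=fitness)
        evaluations += pop_size
        if sut_branch_hit(pop[0]):
            return pop[0], evaluations
        parents = pop[: pop_size // 2]
        # Offspring: average two parents (crossover) plus a small mutation.
        pop = parents + [
            (random.choice(parents) + random.choice(parents)) // 2
            + random.randint(-10, 10)
            for _ in range(pop_size - len(parents))
        ]
    return None, evaluations

def random_search(budget=2000, lo=0, hi=1_000_000):
    for i in range(1, budget + 1):
        x = random.randint(lo, hi)
        if sut_branch_hit(x):
            return x, i
    return None, budget

print("GA:", genetic_search())
print("Random:", random_search())
```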
22

Baber, Christopher. „The human factors of automatic speech recognition in control room systems“. Thesis, Aston University, 1990. http://publications.aston.ac.uk/10839/.

Annotation:
This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means of human-computer interaction in specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. Some solutions to the problems of stress are given. From a series of studies, it is concluded that the interaction be designed to capitalise upon the tendency of operators to use short, succinct, and task-specific styles of speech. From studies comparing different types of feedback, it is concluded that operators be given screen-based feedback rather than auditory feedback. Feedback will take two forms: the use of the ASR device will require recognition feedback, which will be best supplied using text; the performance of a process control task will require feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command repetition practices, rather than through error handling dialogues. This thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for their reduction. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to superior performance compared with instructions. Thus a relatively cheap and very efficient form of operator training can be supplied by demonstration from experienced ASR operators.
23

Evans, Elizabeth. „Is there a role for top-down factors in 'automatic' imitation?“ Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/is-there-a-role-for-topdown-factors-in-automatic-imitation(74785fa8-411a-44e6-bcfb-249adb8dd0a6).html.

Annotation:
The passive perception of irrelevant actions can facilitate or interfere with the execution of one’s own actions, known as ‘automatic imitation’ (AI). For example, when one is pressing down on a button, reaction times (RTs) are faster when observing a finger depression (compatible action) and slower whilst observing a finger lift (incompatible action). This phenomenon has been attributed to the mirror neuron system and is thought to represent a laboratory model of spontaneous motor mimicry which facilitates social interactions. AI is typically reduced or absent when the observed movement is produced by a non-human agent. However, previous findings suggest that the magnitude of this ‘human bias’ can be modulated by top-down factors, such as attention and prior instructions regarding whether the movement was produced by a human or non-human agent. This thesis aimed to further examine the role of attention and belief regarding stimulus agency in automatic imitation. Participants were required to perform a pre-specified key press or release response to a diffuse yellow flash go signal. This response was either compatible or incompatible with the finger or object movement, which was presented simultaneously. AI was measured by subtracting compatible from incompatible RTs to calculate the compatibility effect. Experiments 1a, 1b, 2 and 7 focused on exploring the role of attention in AI. Experiment 1a revealed that the human bias is dependent on when the go signal occurs. AI was greater for the finger stimulus relative to the object stimulus when the go signal occurred during the movement, but not after the movement. It is suggested that attention to the movement is reduced when the go signal occurs after the movement. This implies that the human bias in AI is dependent on attention being directed towards the movement. Experiments 1b and 2 indicated that AI was removed if a visual dual task was added, but that AI remained and was greater when an auditory dual task was added. This indicates that AI was removed when the visual dual task competed for cognitive resources with action observation. The facilitation of AI when an auditory dual task was added suggests that the additional cognitive load may have occupied cognitive resources required for the inhibition of imitation. These findings highlight that AI is susceptible to attentional load, implying that AI is not a strongly automatic process. Experiment 7 explored whether the spread of attention modulates the magnitude of AI by comparing a ‘diffuse’ go signal to a ‘focused’ go signal which directed attention to the stimulus movement. Significantly larger AI effects were produced for the group of participants who saw the focused flash first, indicating that focusing attention on the spatial location of the movement increased AI, and furthermore that initially observing the focused flash ‘trained’ participants to pay attention to the stimulus movement in the diffuse flash condition. Experiments 3 and 4 examined why AI effects for non-human stimuli are more likely to be significant when trials are presented in separate blocks (e.g. human vs. non-human stimuli) as opposed to randomly mixed trials. It was hypothesised that this pattern of previous results could be due to less attention being drawn to stimulus differences when stimuli are presented separately as opposed to mixed with a block of trials. 
However, in both experiments, AI effects were present for the object stimulus in the group of participants who observed the block of finger trials first. This suggests that the prior observation of the finger movement caused a carry-over of human agency to the object stimulus. Experiments 5, 6, 8 and 9 directly explored the role of belief regarding stimulus agency in AI by instructing participants that the object movement was generated by a human finger movement. Experiments 5, 6 and 8 provided preliminary evidence that AI is affected by belief instructions, but the effects were weak or confounded by spatial stimulus-response compatibility (SRC) effects (i.e. compatibility effects based on spatial correspondence of the stimulus and response location). Experiment 9 was designed to differentiate imitative compatibility from SRC effects, thus providing a pure measure of imitative compatibility. Imitative compatibility was present for the object stimulus after the belief manipulation. This demonstrates that a human belief regarding stimulus agency of the object modulated imitative compatibility effects due to the top-down knowledge that the movement was human generated, and not due to increased attention and SRC effects. The presented work has provided multiple lines of evidence which demonstrate that so-called ‘automatic’ imitation effects are strongly susceptible to top-down influences, including attention and belief regarding stimulus agency. The current work could be used to evaluate top-down modulation of imitation in autistic populations, as it has been proposed that top-down modulation of the automatic imitation pathway may be atypical in autism.
24

Sedin, Engla Maria Helena. „Semi-automated immunohistochemical staining of the VEGF-A-protein for clinical use and the identification in NHG-graded breast carcinoma“. Thesis, Uppsala universitet, Institutionen för kvinnors och barns hälsa, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-237106.

Annotation:
Angiogenesis has a crucial influence on tumour development, and identification of microvessels in malignant breast cancer tissue is an indicator of worse prognosis. Angiogenesis is partially governed by the family of vascular endothelial growth factors (VEGF) and their receptors, of which the VEGF-A protein seems to be the most important factor. The aims of this work were first to establish a method for immunohistological (IH) staining of the VEGF-A protein for clinical use, and then to label and evaluate the expression of this protein in 31 Nottingham Histology Graded (NHG I-III) breast carcinomas. Formaldehyde-fixed tissues from invasive breast neoplasms and control tissues were labelled with monoclonal antibodies against the VEGF-A and CD31 proteins using a semi-automated IH system from Ventana BenchMark. Positively stained vessels were counted from digital copies of microscopic pictures and related to mm² of tissue. A method of IH labelling of the VEGF-A protein was successfully established before staining of the breast tissue, and in 19 of the 31 breast cancers vessels were counted for both antibodies. The VEGF-A antibody stained 2.7 ± 2.3 (mean ± SD) vessels/mm² and the CD31 antibody stained 27.3 ± 19.3 vessels/mm² in the breast carcinoma tissue. The percentages of VEGF-A-stained vessels in relation to CD31-stained vessels were 7.6% in the NHG-I (n=3), 7.8% in the NHG-II (n=10) and 15.0% in the NHG-III group (n=6). The results demonstrate that increased NHG grade and lower differentiation can be associated with a higher percentage of vessels expressing the VEGF-A protein. The result could not be statistically confirmed because of the small number of stained breast cancers, and additional investigations are recommended before clinical use.
25

Kambli, Sachin. „Continuous and automated real-time bridge health monitor & dissemination of structural rating factors via the WWW“. Cincinnati, Ohio : University of Cincinnati, 2007. http://www.ohiolink.edu/etd/view.cgi?ucin1179158524.

26

KONDURY, SHIRISHA. „CONTINUOUS AND AUTOMATED TRAFFIC MONITOR FOR IMMEDIATE IDENTIFICATION AND STATISTICAL HISTORY OF INFLUENCE LINE AND RATING FACTORS“. University of Cincinnati / OhioLINK, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=ucin996765671.

27

KAMBLI, SACHIN. „CONTINUOUS AND AUTOMATED REAL-TIME BRIDGE HEALTH MONITOR & DISSEMINATION OF STRUCTURAL RATING FACTORS VIA THE WWW“. University of Cincinnati / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1179158524.

28

Psotti, Andrea. „Design of a new Human Machine Interface for bar feeding automatic machines towards Industry 4.0 smart factories“. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2016.

Annotation:
A bar feeder is an automatic machine usually installed downstream of a machine tool; its task is to supply the lathe with a constant, continuous flow of material to be machined. This machine is extremely important within a turning shop, since it is responsible for the autonomy of the machining process; for this reason the main characteristics of a bar feeder are stability (errors and problems must occur rarely), robustness (the machine must be able to tolerate both electrical and mechanical malfunctions), and, finally, a feeder must always be correctly interfaced with the machine tool it supplies. IEMCA is an Italian company that has designed and manufactured automatic bar feeders since 1961. The final objective is to formulate a proposal for a new human-machine interface to be installed on and sold with the new machines produced by IEMCA, taking the concepts of Industry 4.0 into account. The design process will start with a technical examination of the existing platform, analysing its operating principle. Extensive market research will then be carried out in the Italian, German and US markets in order to understand the strengths and weaknesses of the current interface and of other human-machine interfaces for bar feeders available on the market, investigating possible future developments arising from market demands in the machine tool industry and studying the structure and characteristics of Industry 4.0. Finally, a software prototype of the human-machine interface will be created, incorporating all the aspects and functionalities that emerged from the research and from the stages of the design process. If the objectives are achieved, the new HMI solution will be installed and sold on all new machines produced by IEMCA, that is, on all bar feeders sold worldwide.
29

Gabrecht, Katharina M. „Human factors of semi-autonomous robots for urban search and rescue“. Thesis, University of Nottingham, 2016. http://eprints.nottingham.ac.uk/35458/.

Annotation:
During major disasters or other emergencies, Urban Search and Rescue (USAR) teams are responsible for extricating casualties safely from collapsed urban structures. The rescue work is dangerous due to possible further collapse, fire, dust or electricity hazards. Sometimes the necessary precautions and checks can last several hours before rescuers are safe to start the search for survivors. Remote controlled rescue robots provide the opportunity to support human rescuers to search the site for trapped casualties while they remain in a safe place. The research reported in this thesis aimed to understand how robot behaviour and interface design can be applied to utilise the benefits of robot autonomy and how to inform future human-robot collaborative systems. The data was analysed in the context of USAR missions when using semi-autonomous remote controlled robot systems. The research focussed on the influence of robot feedback, robot reliability, task complexity, and transparency. The influence of these factors on trust, workload, and performance was examined. The overall goal of the research was to make the life of rescuers safer and enhance their performance to help others in distress. Data obtained from the studies conducted for this thesis showed that semi-autonomous robot reliability is still the most dominant factor influencing trust, workload, and team performance. A robot with explanatory feedback was perceived as more competent, more efficient and less malfunctioning. The explanatory feedback was perceived as a clearer type of communication compared to concise robot feedback. Higher levels of robot transparency were perceived as more trustworthy. However, single items on the trust questionnaire were manipulated and further investigation is necessary. However, neither explanatory feedback from the robot nor robot transparency increased team performance or mediated workload levels. Task complexity mainly influenced human-robot team performance and the participants' control allocation strategy. Participants allowed the robot to find more targets and missed more robot errors in the high complexity conditions compared to the low task complexity conditions. Participants found more targets manually in the low complexity tasks. In addition, the research showed that recording the observed robot performance (the performance of the robot that was witnessed by the participant) can help to identify the cause of contradicting results: participants might not have noticed some of the robot's mistakes and therefore they were not able to distinguish between the robot reliability levels. Furthermore, the research provided a foundation of knowledge regarding the real world application of USAR in the United Kingdom. This included collecting knowledge via an autoethnographic approach about working processes, command structures, currently used technical equipment, and attitudes of rescuers towards robots. Also, recommendations about robot behaviour and interface design were collected throughout the research. However, recommendations made in the thesis include consideration of the overall outcome (mission performance) and the perceived usefulness of the system in order to support the uptake of the technology in real world applications. In addition, autonomous features might not be appropriate in all USAR applications.
When semi-autonomous robot trials were compared to entirely manual operation, only the robot with an average of 97% reliability significantly increased the team performance and reduced the time needed to complete the USAR scenario compared to the manually operated robot. Unfortunately, such high robot success levels do not exist to date. This research has contributed to our understanding of the factors influencing human-robot collaboration in USAR operations, and provided guidance for the next generation of autonomous robots.
30

Dean, Timothy J. Physical Environmental & Mathematical Sciences Australian Defence Force Academy UNSW. „Development and evaluation of automated radar systems for monitoring and characterising echoes from insect targets“. Awarded by: University of New South Wales - Australian Defence Force Academy. School of Physical, Environmental and Mathematical Sciences, 2007. http://handle.unsw.edu.au/1959.4/38667.

Annotation:
This thesis describes the construction of a mobile Insect Monitoring Radar (IMR) and investigations of: the reliability of IMRs for observing insect migration in inland Australia; possible biases in IMR migration estimates; the relation between an insect's size and its radar properties; radar discrimination between insect species; the effect of weather on the migrations of Australian plague locusts and of moths; the scale of these migrations; and where IMRs are best located. The principles of entomological radar design, and the main features of insect migration in inland Australia, are reviewed. The main procedures used in the study are: calculation of radar performance and of insect radar cross sections (RCSs); reanalysis of a laboratory RCS dataset; statistical analysis of a four-year dataset of IMR and weather observations; and a field campaign using both of the two existing fixed IMRs and the new mobile unit. Statistical techniques used include correlation, multiple regression, discriminant analysis, and principal components analysis. The original results of this work include design details of the mobile IMR, extension of radar performance calculations to IMRs and evaluation of flight speed biases, a holistic approach to IMR design, the relation of insect RCS magnitudes and polarization patterns to morphological variables, an estimate of the accuracy of the retrieved parameters, evaluations of three approaches (one-parameter, theory-based, and a novel two-stage method) to target identification, and verification of inferred target identities using results from nearby light traps. Possible sites for future IMRs are identified. The major conclusions are that: a mobile IMR can be built with a performance equal to that of a fixed IMR but at half the cost; significant biases in the signal processing results arise from insect speed; locusts and moths can be distinguished if all RCS parameters are used; IMRs can be designed to match particular requirements; weather has a significant effect on insect migration, the best single predictor of insect numbers being temperature; moonlight has no effect; the spatial correlation of migration properties falls to 50% at a separation of 300 km; and migrating insects can be carried by the wind for 500 km in a single night.
31

Lanthier, Paul. „Aspects ergodiques et algébriques des automates cellulaires“. Thesis, Normandie, 2020. http://www.theses.fr/2020NORMR034.

Annotation:
The first part of this manuscript falls within the framework of probability theory, and is devoted to the study of filtrations generated by some cellular automata. We study two versions of an algebraic automaton acting on configurations whose states take values in a finite Abelian group: one is deterministic, and consists in adding the states of two consecutive cells, and the second is a random perturbation of the first one. From these automata, random Markovian processes are constructed and the filtrations generated by these processes are studied. Using the I-cosiness criterion, we show that the two filtrations are standard in the sense developed by Vershik. However, cellular automata have the particularity of commuting with the coordinate shift operator. In this thesis, we introduce a new classification of the filtrations called "dynamic" which takes into account the action of this transformation. Filtrations are no longer defined on probability spaces but on dynamical systems, and are in this case "factor" filtrations: each sigma-algebra is invariant by the dynamics of the system. The counterpart of standardity from the dynamic point of view is studied. This creates a necessary criterion for dynamic standardity called "dynamic I-cosiness". The question of whether the dynamic I-cosiness is sufficient remains open, but a first result in this direction is given, showing that a strengthened version of the dynamic I-cosiness leads to dynamic standardity. By establishing that it does not satisfy the criterion of dynamic I-cosiness, it is proved that the factor filtration generated by the deterministic automaton is not dynamically standard, and therefore that the dynamic classification of the filtrations differs from the classification developed by Vershik. The probabilistic automaton depends on an error parameter, and it is shown by a percolation argument that the factor filtration generated by this automaton is dynamically standard for large enough values of this parameter. It is conjectured that it will not be dynamically standard for very small values of this parameter. The second part of this manuscript, more algebraic, has its origin in a musical problem, linked to the calculation of intervals in a periodic melodic line. The work presented here continues the research of the Romanian composer Anatol Vieru and of Moreno Andreatta and Dan Vuza, but in an original way from the point of view of cellular automata. We study the action on periodic sequences of two algebraic cellular automata, one of which is identical to that of the first part. The questions on the characterization of reducible and reproducible sequences as well as the associated times have been deepened and improved for these two automata. The calculation of preimages and images via the two automata was explained. The question of the evolution of the periods was treated with the creation of a tool called "characteristic" which allows to describe and control the evolution of the period in negative times. Simulations show that the evolution of the periods when the preimages are drawn at random follows an almost regular pattern, and the explanation of this phenomenon remains an open question. The mathematical results of this second part have been used in the "Automaton" module of a free composing software called "UPISketch ». This module allows a composer to create melodic lines by iterating images or taking successive preimages of a starting melodic line
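To make the construction above concrete, here is a minimal Python sketch of the deterministic automaton (each cell becomes the sum, modulo n, of its own state and its right neighbour's) together with one plausible random perturbation; the exact noise model and the bi-infinite index set used in the thesis are not reproduced here, so the perturbation and the parameter names should be read as illustrative assumptions.

```python
import random

def deterministic_step(config, n):
    """One step of the deterministic automaton: cell i becomes
    (config[i] + config[i+1]) mod n, indices taken cyclically so a
    periodic configuration stays periodic."""
    size = len(config)
    return [(config[i] + config[(i + 1) % size]) % n for i in range(size)]

def perturbed_step(config, n, eps):
    """Assumed random perturbation: apply the deterministic step, then
    independently corrupt each cell with probability eps by adding a
    uniformly chosen non-zero element of Z/nZ."""
    out = deterministic_step(config, n)
    return [(x + random.randrange(1, n)) % n if random.random() < eps else x
            for x in out]

# Example over Z/12Z (12 pitch classes), starting from a short periodic motif.
cfg = [0, 4, 7, 0, 4, 7, 0, 4]
for _ in range(3):
    cfg = deterministic_step(cfg, 12)
    print(cfg)
```

Because the update at cell i only involves cells i and i+1, this map visibly commutes with the coordinate shift, which is the property the dynamic classification of filtrations is built on.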
APA, Harvard, Vancouver, ISO und andere Zitierweisen
32

Dickenson, Adrian C. „Repetition and interference effects in spatial stimulus - response compatibility : automatic and strategic factors“. Thesis, Birkbeck (University of London), 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.314357.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
33

Forsberg, Anne-Mari. „Factors affecting cow behaviour in a barn equipped with an automatic milking system /“. Uppsala : Department of Animal Nutrition and Management, Swedish University of Agricultural Sciences, 2008. http://epsilon.slu.se/11200991.pdf.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
34

Rudolph, Frederick M. „Human performance during automation : the interaction between automation, system information, and information display in a simulated flying task“. Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/36207.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
35

Dielmann, Alfred. „Automatic recognition of multiparty human interactions using dynamic Bayesian networks“. Thesis, University of Edinburgh, 2009. http://hdl.handle.net/1842/4022.

Der volle Inhalt der Quelle
Annotation:
Relating statistical machine learning approaches to the automatic analysis of multiparty communicative events, such as meetings, is an ambitious research area. We have investigated automatic meeting segmentation both in terms of "Meeting Actions" and "Dialogue Acts". Dialogue acts model the discourse structure at a fine-grained level, highlighting individual speaker intentions. Group meeting actions describe the same process at a coarse level, highlighting interactions between different meeting participants and showing overall group intentions. A framework based on probabilistic graphical models such as dynamic Bayesian networks (DBNs) has been investigated for both tasks. Our first set of experiments is concerned with the segmentation and structuring of meetings (recorded using multiple cameras and microphones) into sequences of group meeting actions such as monologue, discussion and presentation. We outline four families of multimodal features based on speaker turns, lexical transcription, prosody, and visual motion that are extracted from the raw audio and video recordings. We relate these low-level multimodal features to complex group behaviours, proposing a multi-stream modelling framework based on dynamic Bayesian networks. Later experiments are concerned with the automatic recognition of Dialogue Acts (DAs) in multiparty conversational speech. We present a joint generative approach based on a switching DBN for DA recognition in which segmentation and classification of DAs are carried out in parallel. This approach models a set of features, related to lexical content and prosody, and incorporates a weighted interpolated factored language model. In conjunction with this joint generative model, we have also investigated the use of a discriminative approach, based on conditional random fields, to perform a reclassification of the segmented DAs. The DBN-based approach yielded significant improvements when applied both to the meeting action and the dialogue act recognition task. On both tasks, the DBN framework provided an effective factorisation of the state-space and a flexible infrastructure able to integrate a heterogeneous set of resources such as continuous and discrete multimodal features, and statistical language models. Although our experiments have principally targeted multiparty meetings, the features, models, and methodologies developed in this thesis can be employed for a wide range of applications. Moreover, both group meeting actions and DAs offer valuable insights about the current conversational context, providing valuable cues and features for several related research areas such as speaker addressing and focus-of-attention modelling, automatic speech recognition and understanding, and topic and decision detection.
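The "weighted interpolated" language model mentioned above can be illustrated in isolation by a linear interpolation of component models; the sketch below uses invented probabilities and toy component models, not the factored model actually trained in the thesis.

```python
def interpolate(models, weights):
    """Linear interpolation of component language models:
    P(w | h) = sum_i weights[i] * P_i(w | h).
    Each model maps a history string to a dict of word probabilities."""
    assert abs(sum(weights) - 1.0) < 1e-9
    def prob(word, history):
        return sum(w * m.get(history, {}).get(word, 0.0)
                   for w, m in zip(weights, models))
    return prob

# Toy component models with invented probabilities, for illustration only.
word_lm  = {"how are": {"you": 0.6, "we": 0.4}}
class_lm = {"how are": {"you": 0.5, "we": 0.5}}
p = interpolate([word_lm, class_lm], [0.7, 0.3])
print(p("you", "how are"))  # 0.7*0.6 + 0.3*0.5 = 0.57
```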
APA, Harvard, Vancouver, ISO und andere Zitierweisen
36

Hopkins, Deborah Ann. „Factors affecting adoption of automated teller machines, direct deposit of paychecks and partial direct deposit to savings where available“. Connect to resource, 1986. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1214411921.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
37

Wibowo, Gatot Morti Chavalit Wongse-ek Manus Mongkolsuk. „Factors affecting image quality and entrance skin exposure when using automatic exposure control (AEC) /“. Abstract, 2004. http://mulinet3.li.mahidol.ac.th/thesis/2547/cd370/4537449.pdf.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
38

Wredle, Ewa. „Automatic milking and grazing : factors and stimuli affecting cow motivation to visit the milking unit /“. Uppsala : Dept. of Animal Nutrition and Management, Swedish University of Agricultural Sciences, 2005. http://epsilon.slu.se/2005116.pdf.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
39

Moosavi, Syed Shakeeb Hassan. „The significance of behavioural (non-automatic) factors in the ventilatory response to exercise in man“. Thesis, Imperial College London, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.267486.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
40

Al-Mannai, Bader Darwish. „A practical decision support tool for the design of automated manufacturing systems : incorporating human factors alongside other considerations in the design“. Thesis, Cranfield University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.424077.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
41

Garau, Giulia. „Speaker normalisation for large vocabulary multiparty conversational speech recognition“. Thesis, University of Edinburgh, 2009. http://hdl.handle.net/1842/3983.

Der volle Inhalt der Quelle
Annotation:
One of the main problems faced by automatic speech recognition is the variability of the testing conditions. This is due both to the acoustic conditions (different transmission channels, recording devices, noises etc.) and to the variability of speech across different speakers (i.e. different accents, coarticulation of phonemes and different vocal tract characteristics). Vocal tract length normalisation (VTLN) aims at normalising the acoustic signal, making it independent of the vocal tract length. This is done by a speaker-specific warping of the frequency axis, parameterised through a warping factor. In this thesis the application of VTLN to multiparty conversational speech was investigated, focusing on the meeting domain. This is a challenging task, showing great variability of the speech acoustics both across different speakers and across time for a given speaker. VTL, the distance between the lips and the glottis, varies over time. We observed that the warping factors estimated using Maximum Likelihood seem to be context-dependent, appearing to be influenced by the current conversational partner and correlated with the behaviour of formant positions and the pitch. This is because VTL also influences the frequency of vibration of the vocal cords and thus the pitch. In this thesis we also investigated pitch-adaptive acoustic features with the goal of further improving the speaker normalisation provided by VTLN. We explored the use of acoustic features obtained using a pitch-adaptive analysis in combination with conventional features such as Mel frequency cepstral coefficients. These spectral representations were combined both at the acoustic feature level using heteroscedastic linear discriminant analysis (HLDA), and at the system level using ROVER. We evaluated this approach on a challenging large vocabulary speech recognition task: multiparty meeting transcription. We found that VTLN benefits the most from pitch-adaptive features. Our experiments also suggested that combining conventional and pitch-adaptive acoustic features using HLDA results in a consistent, significant decrease in the word error rate across all the tasks. Combining at the system level using ROVER resulted in a further significant improvement. Further experiments compared the use of a pitch-adaptive spectral representation with the adoption of a smoothed spectrogram for the extraction of cepstral coefficients. It was found that pitch-adaptive spectral analysis, providing a representation which is less affected by pitch artefacts (especially for high-pitched speakers), delivers features with improved speaker independence. Furthermore, this has also been shown to be advantageous when HLDA is applied. The combination of a pitch-adaptive spectral representation and VTLN-based speaker normalisation in the context of LVCSR for multiparty conversational speech led to more speaker-independent acoustic models, improving the overall recognition performance.
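The warping-factor parameterisation described above is often realised as a piecewise-linear warp of the frequency axis; the sketch below assumes that common textbook variant (breakpoint at 85% of the Nyquist frequency, Nyquist preserved), which may differ in detail from the implementation used in the thesis.

```python
def vtln_warp(freq, alpha, f_nyquist, f_break_ratio=0.85):
    """Piecewise-linear VTLN warping of a single frequency (a common
    textbook variant, assumed here for illustration; toolkits differ on
    whether alpha multiplies or divides, and on the breakpoint).

    Below the breakpoint the axis is scaled by alpha; above it a linear
    segment maps the remaining band so the Nyquist frequency is preserved.
    For warp factors far from 1 the breakpoint would have to be lowered
    to keep the mapping monotonic."""
    f_break = f_break_ratio * f_nyquist
    if freq <= f_break:
        return alpha * freq
    slope = (f_nyquist - alpha * f_break) / (f_nyquist - f_break)
    return alpha * f_break + slope * (freq - f_break)

# Example: warp a few Mel filterbank centre frequencies for one speaker.
print([round(vtln_warp(f, 1.1, 8000.0)) for f in (500.0, 3000.0, 7500.0)])
```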
APA, Harvard, Vancouver, ISO und andere Zitierweisen
42

Edling, Laura. „Factors Affecting The Adoption Of Automated Wood Pellet Heating Systems In The Northeastern Us And Implications For The Transition To Renewable Energy“. ScholarWorks @ UVM, 2020. https://scholarworks.uvm.edu/graddis/1177.

Der volle Inhalt der Quelle
Annotation:
Public and private incentive programs have encouraged conversions to high-efficiency, low-emissions wood heating systems as a strategy to promote renewable energy and support local economies in the Northeastern US. Despite these efforts, the adoption of these systems remains slow. The study that is the subject of this dissertation examines several social, economic, policy and environmental factors that affect the decisions of individuals and small-scale institutions (local businesses and community facilities) to transition to automated wood pellet boilers and furnaces (AWPH) utilizing local fuel sources. Because of the complexity and risk associated with conversion, the transition to these systems can help further both a practical and a theoretical understanding of the global transition to non-fossil fuel technologies. Chapter One of this dissertation examines this notion in more detail and spells out the research questions of this study. Chapter Two delves into the research methods and their implications for other studies of energy transitions. These methods include interviews with 60 consumers, technology and fuel suppliers, and NGO and state agency personnel. These provided in-depth qualitative data, which are complemented by a four-state survey (New Hampshire, Vermont, New York, and Maine) of adopters and informed non-adopters of AWPH systems (n=690; 38% response rate). Interview and survey questions, as well as the subsequent coding, were developed using diffusion of innovation theory and the multi-level perspective on sociotechnical transitions, as well as through collaboration with industry experts and research partners. Chapters Three and Four offer a discussion of the results and their implications. Specifically, Chapter Three examines the complex system actors, elements, and interactions that are part of the transition from fossil fuel technology to AWPH. Chapter Four focuses on the data surrounding state and private programs that encourage the use of AWPH and the implications that these data have for effective climate mitigation and energy policy. Data show that AWPH consumers, who should be considered "early adopters" due to the small number of AWPH adopters in the region, are largely value-driven but are also concerned about upfront costs and the lack of available technical support and fuel delivery options. Both environmental values (e.g. the desire to find an alternative to fossil fuels, concern for air quality and belief in climate change) and social values (e.g. support for the local economy and the wood products industry) influenced consumer decisions, especially when fuel oil prices were low. Financial incentives, which are offered by all four states in the study region, were highly influential, but the additional decision support offered by a non-profit (e.g. site visits, informational workshops, local print media) was rated highly by consumers where it was available. These additional supports, as well as the community-based nature of the non-profit program, enabled a broader range of people (lower income, more risk averse) to choose AWPH and created more efficiency in the supply chain. This approach created a reinforcing feedback loop between a broader set of early adopters of AWPH, normalization of AWPH technology and its associated infrastructure, and increased levels of technical support and fuel availability.
These findings suggest that efforts to increase adoption of renewable technologies that use locally harvested fuels should take a community-based and system-wide approach, targeting both consumer and supplier motivations and barriers.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
43

Albaret, Claude. „Automated system for Monte Carlo determination of cutout factors of arbitrarily shaped electron beams and experimental verification of Monte Carlo calculated dose distributions“. Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81259.

Der volle Inhalt der Quelle
Annotation:
Dose predictions by Monte Carlo (MC) techniques could alleviate the measurement load required in linac commissioning and clinical radiotherapy practice, where small or irregular electron fields are routinely encountered. In particular, this study focused on the MC calculation of cutout factors for clinical electron beams. An MC model of a Varian CL2300C/D linac was built and validated for all electron energies and applicators. An MC user code for the simulation of irregular cutouts was then developed and validated. Supported by a home-developed graphical user interface, it determines in situ cutout factors and depth dose curves for arbitrarily shaped electron fields and collects phase space data. Overall, the agreement between simulations and measurements was excellent for fields larger than 2 cm.
The MC model was also used to calculate dose distributions with the fast MC code XVMC in CT images of phantoms of clinical interest. These dose distributions were compared to dose calculations performed by the pencil-beam-based treatment planning system CadPlan and verified against measurements. Good agreement between calculations and measurements was achieved with both systems for phantoms containing 1-dimensional heterogeneities, provided the CT images were of sufficient quality. In phantoms with 3-dimensional heterogeneities, however, CadPlan appeared unable to predict the dose accurately, whereas MC provided a more satisfactory dose distribution, despite some local discrepancies.
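As a side note on the quantity being automated here, a cutout factor is essentially a ratio of central-axis doses for the shaped field and the reference open-applicator field; the sketch below shows that ratio with first-order propagation of the Monte Carlo statistical uncertainties, using made-up numbers rather than anything from the thesis.

```python
import math

def cutout_factor(dose_cutout, var_cutout, dose_ref, var_ref):
    """Cutout factor as the ratio of the central-axis dose for the shaped
    (cutout) field to that of the reference open-applicator field, with
    first-order propagation of the Monte Carlo statistical uncertainties.
    Clinical protocols fix the exact measurement depth and normalisation;
    this is only an illustrative definition."""
    factor = dose_cutout / dose_ref
    rel_unc = math.sqrt(var_cutout / dose_cutout ** 2 + var_ref / dose_ref ** 2)
    return factor, factor * rel_unc

# Made-up tallies (dose per incident particle and variances), not thesis data.
f, sigma = cutout_factor(2.91e-13, (3.0e-16) ** 2, 3.02e-13, (3.0e-16) ** 2)
print(f"cutout factor = {f:.3f} +/- {sigma:.3f}")
```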
APA, Harvard, Vancouver, ISO und andere Zitierweisen
44

Wilkins, Annekathrin. „Factors influencing the dispersal of Pseudomonas fluorescens NZI7 by Caenorhabditis elegans“. Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:6bf58183-f197-490d-86d4-633ae8d46c06.

Der volle Inhalt der Quelle
Annotation:
Caenorhabditis elegans is a natural predator of the mushroom pathogen Pseudomonas fluorescens NZI7. The bacterial mechanisms for reducing predation by the nematode through the secretion of secondary metabolites have been described, but not yet fully explored. The behaviour of nematodes is influenced by the different factors produced by the pseudomonads. In this thesis we develop a range of assays to link the behaviour of C. elegans to these factors to identify their role in bacteria-nematode interactions. We show that these factors play two distinct roles: they may either repel nematodes, or harm them. This permits the classification of mutants of P. fl. NZI7 lacking these factors as either attractive, edible or both. Many studies of C. elegans behaviour have demonstrated that the nematode can distinguish between different food sources. Our results show two distinct types of response: chemotaxis drives the response to attractive or repellent stimuli, and nematodes also show a choice behaviour that is independent of chemotaxis. This choice behaviour is determined by bacterial edibility and requires nematodes to come into contact with the bacteria. This contact is the foundation of the bacterial dispersal by nematodes. By making use of the luminescence property of the available bacterial mutants, we demonstrate an intimate link between the behaviour of C. elegans and the success with which bacteria are disseminated: if nematodes are induced to regularly leave a bacterial colony, whether through their genotype or the low edibility of the food, then they will spread bacteria effectively. Throughout this thesis, we use computational simulations based on a hybrid cellular automaton model to represent the nematode-bacteria interactions. These simulations recreate the observed behaviour of the system, thus they help to confirm our hypotheses and establish the fundamental aspects of the interactions between the two species.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
45

Carmody, Meghan A. „Task-Dependent Effects of Automation: The Role of Internal Models in Performance, Workload, and Situational Awareness in a Semi-Automated Cockpit“. Ft. Belvoir Defense Technical Information Center, 1994. http://handle.dtic.mil/100.2/ADA292538.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
46

Trask, Simon J. „Systems and Safety Engineering in Hybrid-Electric and Semi-Autonomous Vehicles“. The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1555521147257702.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
47

Aminjikarai, Vedagiri Srinivasa Babu. „An Automated Dynamic Fracture Procedure and a Continuum Damage Mechanics Based Model for Finite Element Simulations of Delamination Failure in Laminated Composites“. University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1242963775.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
48

Teitelbaum, Aryeh Roberto. „Arts'Codes: A New Methodology for the Development of Real-Time Embedded Applications for Control Systems“. RMIT University. Accounting and Law, 2007. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20071219.094115.

Der volle Inhalt der Quelle
Annotation:
Embedded real-time applications have to allow interaction between the control computer and the controlled environment. Controlling the environment requires, in particular, taking into account its time constraints and critical logical conditions. One of the main programmer efforts in real-time application development is to trace incoming events and to perform reactions based on the current system status, according to the application requirements. All of this has to be handled even though external events may arrive in the middle of a critical reaction and disturb it. This problem involves two difficulties: the cognitive effort to perceive the problem and consequently express the solution, and the correct translation of this solution into code. Two requirements were defined in this research in order to achieve high-quality performance: clearness in the design and robustness in the execution. In this work the author proposes a methodology and a tool for real-time application development that uses an innovative form of design based on natural-cognitive research. This design method has clear compilation rules to produce lightweight object-oriented code suitable for embedded platforms. These compilation rules introduce implicit security and synchronization elements into the code to support robust execution. In this methodology, clear development phases were defined, using a high degree of reuse and even polymorphism, which were emphasized in the research. Several existing ideas were improved, adapted and synthesized together with the author's innovation, creating the Arts'Codes method for real-time application development. The work includes cognitive evaluations that support the naturalness of the design. The Arts'Codes method proposes a natural VPL (Visual Programming Language) for real-time applications, based on hierarchic components. This VPL is built on a minimum of diagrams: one for the static architecture and one for the dynamic behaviour, with a similar restricted notation at all levels. These two diagrams (static architecture and dynamic behaviour) are interleaved in a unified view. The method was implemented by building a suitable graphic editor, which automatically compiles the application diagrams into light and robust object-oriented code (based on parallel automata FSMs), and by building a compact execution software platform. Furthermore, the parallel automata FSMs are automatically translated into PTL temporal formulae defining the goals and behaviours of the components, making it possible to prove a priori that the components' behaviours are consistent with their goals. The execution platform is based on a restricted implementation of the synchrony hypothesis and on a powerful model of execution: the parallel automata FSM. These parallel automata describe the dynamic behaviours of the components and also allow run-time exception handling. In addition, the research proposes a tri-processor execution hardware platform, which supports hybrid synchronous/multi-threaded execution. This method contributes to versatile, clear and robust real-time application development.
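The parallel-automata execution model can be pictured with a small sketch: several finite-state machines receive the same external event and react synchronously. The class names, states and events below are hypothetical illustrations, not the Arts'Codes implementation.

```python
class Automaton:
    """A small finite-state machine: `transitions` maps (state, event)
    to (next_state, action)."""
    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        self.transitions = transitions

    def step(self, event):
        # Unhandled events leave the automaton in its current state.
        next_state, action = self.transitions.get((self.state, event),
                                                  (self.state, None))
        self.state = next_state
        if action:
            action()

def synchronous_step(automata, event):
    """Dispatch one external event to all automata 'at once', mimicking the
    synchrony hypothesis: the reaction is treated as instantaneous and no
    new event is accepted until every automaton has reacted."""
    for a in automata:
        a.step(event)

# Example: a heater controller and a watchdog reacting to the same events.
heater = Automaton("heater", "off", {
    ("off", "too_cold"): ("on",  lambda: print("heater: on")),
    ("on",  "too_hot"):  ("off", lambda: print("heater: off")),
})
watchdog = Automaton("watchdog", "idle", {
    ("idle", "too_hot"): ("alarm", lambda: print("watchdog: alarm")),
})
for ev in ["too_cold", "too_hot"]:
    synchronous_step([heater, watchdog], ev)
```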
APA, Harvard, Vancouver, ISO und andere Zitierweisen
49

Bozkurt, Halil. „Modeling of Socio-Economic Factors and Adverse Events In an Active War Theater By Using a Cellular Automata Simulation Approach“. Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5771.

Der volle Inhalt der Quelle
Annotation:
The Department of Defense (DoD) implemented the Human Social Cultural and Behavior (HSCB) program to meet the need to develop the capability to understand, predict and shape human behavior among different cultures by developing a knowledge base, building models, and creating training capacity. This capability will allow decision makers to subordinate kinetic operations and promote non-kinetic operations to govern economic programs better, in order to initiate efforts and development that address the grievances of those displeased by adverse events. These non-kinetic operations include rebuilding indigenous institutions' bottom-up economic activity and constructing necessary infrastructure, since success in non-kinetic operations depends on understanding and using the social and cultural landscape. This study aims to support decision makers by building a computational model to understand economic factors and their effect on adverse events. In this dissertation, the analysis demonstrates that the use of cellular automata makes several significant contributions: it supports decision makers allocating development funds to stabilize regions with higher adverse event risks, and it helps them better understand the complex socio-economic interactions with adverse events. The analysis was performed on a set of spatial data representing social and economic factors. In studying behavior using cellular automata, cells in the same neighborhood synchronously interact with each other to determine their next states, and small changes in one iteration may yield complex formations of adverse event risk after several iterations. The cellular automata modeling methodology for social and economic analysis in this research was designed at two implementation levels: macro and micro. At the macro level, the modeling framework integrates population, social, and economic sub-systems. The macro level allows the model to use regionalized representations, while the micro-level analyses help to understand why the events have occurred. Macro-level subsystems support cellular automata rules to generate accurate predictions. The prediction capability of cellular automata is used to model the micro-level interactions between individual actors, which are represented by adverse events. The results of this dissertation demonstrate that the cellular automata model is capable of evaluating socio-economic influences that result in changes in adverse events and of identifying the location, time and impact of these events. Secondly, this research indicates that the socio-economic influences have different levels of impact on adverse events, defined by the number of people killed, wounded or hijacked. Thirdly, this research shows that the socio-economic influences and adverse events that occurred in a given district have impacts on adverse events that occur in neighboring districts. The cellular automata modeling approach can be used to enhance the capability to understand and use human, social and behavioral factors by generating what-if scenarios to determine the impact of different infrastructure development projects and to predict adverse events. Lastly, adverse events that could occur in upcoming years can be predicted, allowing decision makers to deter these events or plan accordingly if they do occur.
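The neighborhood interaction described above can be illustrated with a toy synchronous update on a grid of risk scores; the weights, the 4-neighbourhood and the wrap-around boundary in this sketch are assumptions for illustration, not the dissertation's calibrated rules.

```python
import numpy as np

def next_risk(grid, w_self=0.6, w_neigh=0.4):
    """One synchronous cellular-automaton update on a grid of adverse-event
    risk scores in [0, 1]: each cell blends its own score with the mean of
    its 4-neighbourhood. Weights, neighbourhood and the wrap-around
    boundary (np.roll) are illustrative assumptions."""
    neigh_mean = (np.roll(grid, 1, axis=0) + np.roll(grid, -1, axis=0) +
                  np.roll(grid, 1, axis=1) + np.roll(grid, -1, axis=1)) / 4.0
    return np.clip(w_self * grid + w_neigh * neigh_mean, 0.0, 1.0)

# Example: a single high-risk district diffuses risk into its neighbours.
risk = np.zeros((5, 5))
risk[2, 2] = 1.0
for _ in range(3):
    risk = next_risk(risk)
print(np.round(risk, 2))
```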
APA, Harvard, Vancouver, ISO und andere Zitierweisen
50

Bäckström, Madeleine, und Nicklas Silversved. „Digitalizing the workplace: improving internal processes using digital services : A process improvement by digitalization, emphasizing chosen quality factors“. Thesis, Linköpings universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177003.

Der volle Inhalt der Quelle
Annotation:
In recent years, the number of digital services and tools available has increased rapidly. When companies want to digitalize their business, they have the opportunity to browse a large number of existing platforms and applications available on the market to find a good match for their specific needs. However, when a company wishes to digitalize a work task that already has a well-established workflow, problems may arise. Because of this, a tailored digital solution may in some cases be the better-suited option, rather than the ones available on the market. The intention of this work was to investigate the challenges that companies face in relation to digitalization of the workplace in general, and the challenges of a company’s expense management process in particular. As an example of how workplace digitalization can take place, a collaboration with a forest industry company was conducted. An evaluation of their analog, internal expense management process was carried out, and the challenges identified were assessed with respect to chosen quality factors. The evaluation and the identified digitalization challenges constituted the basis for a process mapping and a digital solution aiming to improve the company’s expense management process. The resulting work emphasizes how a digital solution can be tailored with simple means within a limited time frame, taking specific needs and existing challenges into account in order to digitalize the workplace. In addition, the work presents the challenges that exist within the concept of digitalizing the workplace and within expense management, and how quality factors can be used in combination with a process improvement in order to relieve and eliminate them.
APA, Harvard, Vancouver, ISO und andere Zitierweisen