Dissertations / Theses on the topic 'Object decision'

To see the other types of publications on this topic, follow the link: Object decision.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Object decision.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Li, Xiaobo, 1976-. "Decision support systems using object-oriented technology." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/86539.

2

Olafsson, Björgvin. "Partially Observable Markov Decision Processes for Faster Object Recognition." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-198632.

Abstract:
Object recognition in the real world is a major challenge in the field of computer vision. Given the potentially enormous size of the search space, it is essential to be able to make intelligent decisions about where in the visual field to obtain information, so as to reduce the computational resources needed. In this report, a POMDP (Partially Observable Markov Decision Process) learning framework, using a policy gradient method and information rewards as a training signal, has been implemented and used to train fixation policies that aim to maximize the information gathered in each fixation. The purpose of such policies is to make object recognition faster by reducing the number of fixations needed. The trained policies are evaluated by simulation and by comparison with several fixed policies. Finally, it is shown that it is possible to use the framework to train policies that outperform the fixed policies for certain observation models.
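As a rough, hypothetical illustration of the idea of an information reward (not code from the thesis, whose observation models and policy-gradient training are more involved), the following sketch scores candidate fixations by their expected reduction in entropy over a discrete belief about object hypotheses; the observation models and names are invented for the example:

    import numpy as np

    def entropy(p):
        """Shannon entropy of a discrete belief (in nats)."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def information_reward(belief, likelihoods):
        """Expected reduction in entropy from one fixation.

        belief:      (K,) prior over K object hypotheses
        likelihoods: (Z, K) p(observation z | hypothesis k) for this fixation
        """
        expected_posterior_entropy = 0.0
        for z in range(likelihoods.shape[0]):
            joint = likelihoods[z] * belief          # unnormalised posterior
            p_z = joint.sum()                        # marginal prob. of observation z
            if p_z > 0:
                expected_posterior_entropy += p_z * entropy(joint / p_z)
        return entropy(belief) - expected_posterior_entropy

    # Toy use: pick the fixation with the largest expected information gain.
    belief = np.array([0.5, 0.3, 0.2])
    fixations = {
        "left":  np.array([[0.8, 0.2, 0.2], [0.2, 0.8, 0.8]]),  # hypothetical observation models
        "right": np.array([[0.5, 0.5, 0.1], [0.5, 0.5, 0.9]]),
    }
    best = max(fixations, key=lambda f: information_reward(belief, fixations[f]))
    print(best)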
3

Lantz, Fredrik. "Terrain Object recognition and Context Fusion for Decision Support." Licentiate thesis, Linköping : Department of Computer and Information Science, Linköpings universitet, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11926.

4

Lo, Chih-Chung. "Using effect size in information fusion for identifying object presence and object quality." Free to MU campus, to others for purchase, 1996. http://wwwlib.umi.com/cr/mo/fullcit?p9823330.

5

MAPELLI, CRISTINA. "Object decision: the interaction of semantic information and structural description system." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/44696.

Abstract:
In the neuropsychological literature, visual object recognition is deemed to be subserved by structural information that is functionally independent from semantic memory. According to this view, the perceptual representation of the stimulus is compared with representations of objects previously seen and stored in the so-called “Structural Description System”, or SDS; when matching between representations occurs, the corresponding conceptual knowledge is activated. Recently, the functional independence of SDS has been challenged based on the evidence that patients affected by Semantic Dementia, a neurodegenerative condition characterized by a selective disruption of semantic memory with sparing of the early stages of object recognition, are impaired at Object Decision tasks. In the present dissertation, we aimed at investigating whether object recognition could be achieved on the basis of structural features alone, or whether semantic knowledge about the object is also involved. To this aim we created two different Object Decision tasks and administered them to healthy young subjects, elderly normal controls and neurodegenerative patients. Our findings suggest that conceptual knowledge plays a crucial role in visual object recognition, and that semantic memory and the SDS are highly interactive.
6

Golder, P. A. "CORD : an object model supporting statistical summary information for management decision making." Thesis, Aston University, 1997. http://publications.aston.ac.uk/7972/.

Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed which is richer in appropriate modelling abstractions than current Object Models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model COST (Collections of Objects Statistical Table) has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor COSTed which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
7

Sandino, Mora Juan David. "Autonomous decision-making for UAVs operating under environmental and object detection uncertainty." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/232513/1/Juan%20David_Sandino%20Mora_Thesis.pdf.

Abstract:
This study established a framework that increases cognitive levels in small UAVs (or drones), enabling autonomous navigation in partially observable environments. The UAV system was validated under search and rescue by locating victims last seen inside cluttered buildings and in bushlands. This framework improved the decision-making skills of the drone to collect more accurate statistics of detected victims. This study assists validation processes of detected objects in real-time when data is complex to interpret for UAV pilots and reduces human bias on scouting strategies.
8

Liu, Dingfei. "An object-oriented approach to structuring multicriteria decision support in natural resource management problems." Doctoral thesis, University of Cape Town, 2001. http://hdl.handle.net/11427/4384.

Abstract:
Includes bibliographical references.
The undertaking of MCDM (Multicriteria Decision Making) and the development of DSSs (Decision Support Systems) tend to be complex and inefficient, leading to low productivity in decision analysis and DSSs. To address this, this study has developed an approach based on object orientation for MCDM and DSS modelling, with the emphasis on natural resource management. The object-oriented approach provides a philosophy to model decision analysis and DSSs in a uniform way, as shown by the diagrams presented in this study. The solving of natural resource management decision problems, the MCDM decision-making procedure and decision-making activities are modelled in an object-oriented way. The macro decision analysis system, its DSS, the decision problem, the decision context, and the entities in the decision-making procedure are represented as "objects". The object-oriented representation of decision analysis also constitutes the basis for the analysis of DSSs.
9

Tao, Cheng. "Decision-Making Support by a Value-Driven Design Model." Thesis, Blekinge Tekniska Högskola, Institutionen för maskinteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-11881.

Abstract:
This thesis analyses the use of value models as boundary objects to support decision making during conceptual design of Product-Service Systems. Compared to requirements-based models, value models are claimed to enhance understanding of the design problems and customer needs, as well as to help the design team in creating more value-adding solutions. The work of this thesis was to prepare, conduct and analyse a series of design experiments, which are based on the continuous observation of designers’ verbalized design considerations. Protocol analysis was conducted to investigate how value models perform as boundary objects in design, in comparison with requirements-based models. The time spent on each activity in the protocol was used as the main proxy in the experiment. Data triangulation was ensured by the use of a questionnaire that was answered by all participants. Both methods revealed that in the preliminary phase, value models are more effective than requirements-based models in conveying intuitive value-related information, assessing intangible value aspects, and encouraging discussions on value concerns.
10

Ehtiati, Tina. "Strongly coupled Bayesian models for interacting object and scene classification processes." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102975.

Abstract:
In this thesis, we present a strongly coupled data fusion architecture within a Bayesian framework for modeling the bi-directional influences between the scene and object classification mechanisms. A number of psychophysical studies provide experimental evidence that the object and the scene perception mechanisms are not functionally separate in the human visual system. Object recognition facilitates the recognition of the scene background and also knowledge of the scene context facilitates the recognition of the individual objects in the scene. The evidence indicating a bi-directional exchange between the two processes has motivated us to build a computational model where object and scene classification proceed in an interdependent manner, while no hierarchical relationship is imposed between the two processes. We propose a strongly coupled data fusion model for implementing the feedback relationship between the scene and object classification processes. We present novel schemes for modifying the Bayesian solutions for the scene and object classification tasks which allow data fusion between the two modules based on the constraining of the priors or the likelihoods. We have implemented and tested the two proposed models using a database of natural images created for this purpose. The Receiver Operator Curves (ROC) depicting the scene classification performance of the likelihood coupling and the prior coupling models show that scene classification performance improves significantly in both models as a result of the strong coupling of the scene and object modules.
ROC curves depicting the scene classification performance of the two models also show that the likelihood coupling model achieves a higher detection rate compared to the prior coupling model. We have also computed the average rise times of the models' outputs as a measure of comparing the speed of the two models. The results show that the likelihood coupling model outputs have a shorter rise time. Based on these experimental findings one can conclude that imposing constraints on the likelihood models provides better solutions to the scene classification problems compared to imposing constraints on the prior models.
We have also proposed an attentional feature modulation scheme, which consists of tuning the input image responses to the bank of Gabor filters based on the scene class probabilities estimated by the model and the energy profiles of the Gabor filters for different scene categories. Experimental results based on combining the attentional feature tuning scheme with the likelihood coupling and the prior coupling methods show a significant improvement in the scene classification performances of both models.
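The exact formulation used in the thesis is not reproduced in this abstract; the sketch below only illustrates, with invented names and numbers, what constraining the prior can mean in a Bayesian coupling of the object and scene modules:

    import numpy as np

    def normalize(p):
        return p / p.sum()

    def scene_posterior_prior_coupled(scene_likelihood, scene_prior,
                                      object_posterior, compatibility):
        """Scene classification with the prior constrained by the object module.

        scene_likelihood: (S,) p(scene features | scene s)
        scene_prior:      (S,) baseline prior over scenes
        object_posterior: (O,) current belief over object classes
        compatibility:    (S, O) how compatible object o is with scene s
        """
        # Re-weight the scene prior by how well each scene explains the objects
        # the object module currently believes in, then apply Bayes' rule.
        coupled_prior = normalize(scene_prior * (compatibility @ object_posterior))
        return normalize(scene_likelihood * coupled_prior)

    # Toy example: the object module strongly believes a "car" is present,
    # which pulls the scene belief towards "street".
    scene_lik   = np.array([0.4, 0.6])            # [street, beach]
    scene_prior = np.array([0.5, 0.5])
    obj_post    = np.array([0.9, 0.1])            # [car, palm tree]
    compat      = np.array([[0.9, 0.1],           # street vs {car, palm tree}
                            [0.2, 0.8]])          # beach  vs {car, palm tree}
    print(scene_posterior_prior_coupled(scene_lik, scene_prior, obj_post, compat))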
11

Vo, Ba Tuong. "Random finite sets in Multi-object filtering." University of Western Australia. School of Electrical, Electronic and Computer Engineering, 2008. http://theses.library.uwa.edu.au/adt-WU2009.0045.

Abstract:
[Truncated abstract] The multi-object filtering problem is a logical and fundamental generalization of the ubiquitous single-object vector filtering problem. Multi-object filtering essentially concerns the joint detection and estimation of the unknown and time-varying number of objects present, and the dynamic state of each of these objects, given a sequence of observation sets. This problem is intrinsically challenging because, given an observation set, there is no knowledge of which object generated which measurement, if any, and the detected measurements are indistinguishable from false alarms. Multi-object filtering poses significant technical challenges, and is indeed an established area of research, with many applications in both military and commercial realms. The new and emerging approach to multi-object filtering is based on the formal theory of random finite sets, and is a natural, elegant and rigorous framework for the theory of multiobject filtering, originally proposed by Mahler. In contrast to traditional approaches, the random finite set framework is completely free of explicit data associations. The random finite set framework is adopted in this dissertation as the basis for a principled and comprehensive study of multi-object filtering. The premise of this framework is that the collection of object states and measurements at any time are treated namely as random finite sets. A random finite set is simply a finite-set-valued random variable, i.e. a random variable which is random in both the number of elements and the values of the elements themselves. Consequently, formulating the multiobject filtering problem using random finite set models precisely encapsulates the essence of the multi-object filtering problem, and enables the development of principled solutions therein. '...' The performance of the proposed algorithm is demonstrated in simulated scenarios, and shown at least in simulation to dramatically outperform traditional single-object filtering in clutter approaches. The second key contribution is a mathematically principled derivation and practical implementation of a novel algorithm for multi-object Bayesian filtering, based on moment approximations to the posterior density of the random finite set state. The performance of the proposed algorithm is also demonstrated in practical scenarios, and shown to considerably outperform traditional multi-object filtering approaches. The third key contribution is a mathematically principled derivation and practical implementation of a novel algorithm for multi-object Bayesian filtering, based on functional approximations to the posterior density of the random finite set state. The performance of the proposed algorithm is compared with the previous, and shown to appreciably outperform the previous in certain classes of situations. The final key contribution is the definition of a consistent and efficiently computable metric for multi-object performance evaluation. It is shown that the finite set theoretic state space formulation permits a mathematically rigorous and physically intuitive construct for measuring the estimation error of a multi-object filter, in the form of a metric. This metric is used to evaluate and compare the multi-object filtering algorithms developed in this dissertation.
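The dissertation's performance metric is not spelled out in this truncated abstract; purely as a hedged illustration of what a finite-set error measure can look like, the sketch below computes an OSPA-style distance (cut-off c, order p) between an estimated and a true set of object states using an optimal assignment; parameters and data are illustrative only:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def ospa(X, Y, c=10.0, p=1.0):
        """OSPA-style error between two finite sets of object states.

        X: (m, d) estimated states, Y: (n, d) true states,
        c: cut-off distance, p: order of the metric.
        """
        m, n = len(X), len(Y)
        if m == 0 and n == 0:
            return 0.0
        if m == 0 or n == 0:
            return c
        if m > n:                       # make X the smaller set
            X, Y, m, n = Y, X, n, m
        # Pairwise distances, capped at the cut-off c.
        D = np.minimum(np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1), c)
        row, col = linear_sum_assignment(D ** p)            # optimal sub-pattern assignment
        cost = (D[row, col] ** p).sum() + (c ** p) * (n - m)  # localisation + cardinality penalty
        return (cost / n) ** (1.0 / p)

    # Toy use: two estimated objects vs three true objects in 2-D.
    est = np.array([[0.0, 0.0], [5.0, 5.0]])
    true = np.array([[0.2, -0.1], [5.5, 4.8], [20.0, 20.0]])
    print(ospa(est, true, c=10.0, p=1.0))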
12

Esterhuyse, Jacques. "The use of object oriented systems development methodologies in data warehouse development / J. Esterhuyse." Thesis, North-West University, 2008. http://hdl.handle.net/10394/3661.

Abstract:
Research has shown that data warehouses potentially offer great investment opportunities to business. To benefit from this, business needs to invest large sums of money. Such investments are very risky, as no guarantee of the success of these ventures can be given. Object-oriented development has proved successful for developing operational systems in industry. This study researches object-oriented techniques to discover whether these techniques could be used successfully in data warehousing. A literature study focuses on the definition of an information systems development methodology and defines the components of such methodology. A further literature study on four popular object-oriented methodologies determines the commonalities of these methodologies. In conclusion, a literature study on data warehouse methodologies is done to discover the phases and techniques used in developing data warehouses. Based on the literature, a method is proposed to build a data warehouse harnessing object-oriented phases and techniques. The proposed method is applied as an interpretive experiment, followed by an evaluation of the data warehouse implemented.
Thesis (M.Sc. (Computer Science))--North-West University, Potchefstroom Campus, 2009.
13

Glover, Arren John. "Developing grounded representations for robots through the principles of sensorimotor coordination." Thesis, Queensland University of Technology, 2014. https://eprints.qut.edu.au/71763/1/Arren_Glover_Thesis.pdf.

Abstract:
Robots currently recognise and use objects through algorithms that are hand-coded or specifically trained. Such robots can operate in known, structured environments but cannot learn to recognise or use novel objects as they appear. This thesis demonstrates that a robot can develop meaningful object representations by learning the fundamental relationship between action and change in sensory state; the robot learns sensorimotor coordination. Methods based on Markov Decision Processes are experimentally validated on a mobile robot capable of gripping objects, and it is found that object recognition and manipulation can be learnt as an emergent property of sensorimotor coordination.
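The thesis's own algorithms are not given in the abstract; purely as a generic illustration of MDP-based learning of action–sensation contingencies, here is a minimal tabular Q-learning sketch with a hypothetical toy environment (states, actions and rewards are invented for the example):

    import random
    from collections import defaultdict

    ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1

    def q_learning(env, actions, episodes=500):
        Q = defaultdict(float)                       # Q[(state, action)]
        for _ in range(episodes):
            state = env.reset()
            done = False
            while not done:
                if random.random() < EPSILON:        # explore
                    action = random.choice(actions)
                else:                                # exploit current estimate
                    action = max(actions, key=lambda a: Q[(state, a)])
                next_state, reward, done = env.step(action)
                best_next = 0.0 if done else max(Q[(next_state, a)] for a in actions)
                # Temporal-difference update towards the bootstrapped target.
                Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
                state = next_state
        return Q

    class ToyReachEnv:
        """Hypothetical 1-D world: move left/right to position 3, then 'grip'."""
        def reset(self):
            self.pos = 0
            return self.pos
        def step(self, action):
            if action == "grip":
                return self.pos, (1.0 if self.pos == 3 else -0.1), True
            self.pos = max(0, min(4, self.pos + (1 if action == "right" else -1)))
            return self.pos, -0.01, False

    Q = q_learning(ToyReachEnv(), ["left", "right", "grip"])
    print(max(["left", "right", "grip"], key=lambda a: Q[(3, a)]))  # learned action at the target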
14

McPhail, Jonathan C. "Deciding on a pattern, towards a distributed decision aid that supports the selection of an object-oriented software pattern." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ57788.pdf.

15

Hamidi, Behzad. "A BIM-based Object-oriented Data Model to Support Sustainable Demolition Waste Management Decision Making at End-of-Life." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/73429.

Abstract:
Sustainable demolition waste management is rarely practiced within the construction industry. This is mainly due to the fact that the decision-making process for sustainable demolition waste management is a very resource-demanding and time-consuming task in terms of data collection and data management. The decision-making process includes multiple analyses of possible demolition waste management alternatives from economic, environmental, and social perspectives. Such analyses require waste managers to capture and manage huge amounts of data scattered within fragmented data sources at the end-of-life of a building. The process of capturing and managing this information for the building end-of-life would be time-consuming and costly. Therefore, the waste managers are reluctant to pursue sustainable demolition waste management practices in order to prevent potential delays and incurred costs. This research identified information that is required to conduct sustainable demolition waste management analyses. The identified information was then classified based on information sources. An object-oriented data model (OODM) was proposed to allow the waste managers to more efficiently store and manage the information at the end-of-life phase. Furthermore, a sustainable demolition waste management prototype application was developed to demonstrate how the required information is captured from different sources of data, stored within OODM classes, and retrieved from the integrated database. Finally, the proposed OODM was verified in terms of its scope, flexibility, and implementability. The goal of the research is to offer a method for storing and managing end-of-life information in an efficient and effective manner to support sustainable demolition waste management decision making. To achieve the goal, this dissertation outlines the objectives of the research, the methodologies used in developing the object-oriented data model, conclusions, limitations, and potential future research work.
Ph. D.
16

McPhail, Jonathan C. (Jonathan Cheynne). "Deciding on a pattern: towards a distributed decision aid that supports the selection of an object-oriented software pattern." Carleton University dissertation, Computer Science. Ottawa, 2000.

17

Rudolph, Melanie M. "National freight transport planning : towards a Strategic Planning Extranet Decision Support System (SPEDSS)." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/7080.

Abstract:
This thesis provides a 'proof-of-concept' prototype and a design architecture for an Object Oriented (OO) database towards the development of a Decision Support System (DSS) for the national freight transport planning problem. Both governments and industry require a Strategic Planning Extranet Decision Support System (SPEDSS) for their effective management of the national Freight Transport Networks (FTN). This thesis addresses the three key problems for the development of a SPEDSS to facilitate national strategic freight planning: 1) the scope and scale of data available and required; 2) the scope and scale of existing models; and 3) the construction of the software. The research approach taken embodies systems thinking and includes the use of: Object Oriented Analysis and Design (OOA/D) for problem encapsulation and database design; an artificial neural network (and proposed rule extraction) for knowledge acquisition of the United States FTN data set; and an iterative Object Oriented (OO) software design for the development of a 'proof-of-concept' prototype. The research findings demonstrate that an OO approach, along with the use of OO methodologies and technologies coupled with artificial neural networks (ANNs), offers a robust and flexible methodology for the analysis of the FTN problem domain and the design architecture of an Extranet-based SPEDSS. The objectives of this research were to: 1) identify and analyse current problems and proposed solutions facing industry and governments in strategic transportation planning; 2) determine the functional requirements of an FTN SPEDSS; 3) perform a feasibility analysis for building an FTN SPEDSS 'proof-of-concept' prototype and (OO) database design; 4) develop a methodology for a national 'internet-enabled' SPEDSS model and database; 5) construct a 'proof-of-concept' prototype for a SPEDSS encapsulating identified user requirements; 6) develop a methodology to resolve the issue of the scale of data and data knowledge acquisition which would act as the 'intelligence' within a SPEDSS; 7) implement the data methodology using Artificial Neural Networks (ANNs) towards its validation; and 8) make recommendations for national freight transportation strategic planning and the further research required to fulfil the needs of governments and industry. This thesis includes: an OO database design for the encapsulation of the FTN; an 'internet-enabled' Dynamic Modelling Methodology (DMM) for the virtual modelling of the FTNs; a Unified Modelling Language (UML) 'proof-of-concept' prototype; and conclusions and recommendations for further collaborative research.
18

Lin, Chung-Ching. "Detecting and tracking moving objects from a moving platform." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/49014.

Abstract:
Detecting and tracking moving objects are important topics in computer vision research. Classical methods perform well in applications with steady cameras. However, these techniques are not suitable for applications with moving cameras, because the unconstrained nature of realistic environments and sudden camera movements make cues to object positions rather fickle. A major difficulty is that every pixel moves and new background keeps appearing when a handheld or car-mounted camera moves. In this dissertation, a novel estimation method for camera motion parameters is discussed first. Based on the estimated camera motion parameters, two detection algorithms are developed using Bayes' rule and belief propagation. Next, an MCMC-based feature-guided particle filtering method is presented to track detected moving objects. In addition, two detection algorithms that do not use camera motion parameters are discussed; these two approaches require no pre-defined class or model to be trained in advance. The experimental results demonstrate robust detection and tracking performance across object sizes and positions.
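As a generic illustration only (not the MCMC-based, feature-guided variant developed in the dissertation), a minimal bootstrap particle filter step for tracking a 1-D object position might look like this:

    import numpy as np

    def particle_filter_step(particles, weights, measurement,
                             motion_std=1.0, meas_std=2.0):
        """One bootstrap particle-filter step for a 1-D object position."""
        # Predict: propagate particles through a random-walk motion model.
        particles = particles + np.random.normal(0.0, motion_std, size=particles.shape)
        # Update: re-weight by a Gaussian measurement likelihood.
        weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_std) ** 2)
        weights /= weights.sum()
        # Resample (multinomial) to avoid weight degeneracy.
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

    particles = np.random.uniform(-10, 10, size=500)
    weights = np.full(500, 1.0 / 500)
    for z in [0.5, 1.2, 2.1, 2.9]:                 # simulated measurements
        particles, weights = particle_filter_step(particles, weights, z)
    print(particles.mean())                        # rough estimate of the tracked position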
19

Wang, Yue. "Decision-Making for Search and Classification using Multiple Autonomous Vehicles over Large-Scale Domains." Digital WPI, 2011. https://digitalcommons.wpi.edu/etd-dissertations/87.

Abstract:
This dissertation focuses on real-time decision-making for large-scale domain search and object classification using Multiple Autonomous Vehicles (MAV). In recent years, MAV systems have attracted considerable attention and have been widely utilized. Of particular interest is their application to search and classification under limited sensory capabilities. Since search requires sensor mobility and classification requires a sensor to stay within the vicinity of an object, search and classification are two competing tasks. Therefore, there is a need to develop real-time sensor allocation decision-making strategies to guarantee task accomplishment. These decisions are especially crucial when the domain is much larger than the field-of-view of a sensor, or when the number of objects to be found and classified is much larger than that of available sensors. In this work, the search problem is formulated as a coverage control problem, which aims at collecting enough data at every point within the domain to construct an awareness map. The object classification problem seeks to satisfactorily categorize the property of each found object of interest. The decision-making strategies include both sensor allocation decisions and vehicle motion control. The awareness-, Bayesian-, and risk-based decision-making strategies are developed in sequence. The awareness-based approach is developed under a deterministic framework, while the latter two are developed under a probabilistic framework where uncertainty in sensor measurement is taken into account. The risk-based decision-making strategy also analyzes the effect of measurement cost. It is further extended to an integrated detection and estimation problem with applications in optimal sensor management. Simulation-based studies are performed to confirm the effectiveness of the proposed algorithms.
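As a toy illustration of the coverage idea only (the dissertation's awareness-, Bayesian- and risk-based strategies are more elaborate), the sketch below maintains a grid "awareness map" that rises inside the sensor's field of view and greedily sends the sensor towards the least-covered cell; all names and parameters are invented:

    import numpy as np

    def update_awareness(awareness, sensor_pos, fov=2.0, gain=0.3):
        """Raise awareness inside the sensor's field of view (illustrative model).

        awareness: 2-D array in [0, 1]; 1 means 'enough data collected here'.
        """
        h, w = awareness.shape
        ys, xs = np.mgrid[0:h, 0:w]
        inside = (xs - sensor_pos[0]) ** 2 + (ys - sensor_pos[1]) ** 2 <= fov ** 2
        # First-order update: awareness approaches 1 where the sensor looks.
        awareness[inside] += gain * (1.0 - awareness[inside])
        return awareness

    def next_waypoint(awareness):
        """Greedy decision: head for the least-aware cell (ignores travel cost)."""
        return np.unravel_index(np.argmin(awareness), awareness.shape)[::-1]  # (x, y)

    awareness = np.zeros((20, 20))
    pos = (0, 0)
    for _ in range(50):
        awareness = update_awareness(awareness, pos)
        pos = next_waypoint(awareness)
    print(awareness.mean())   # average awareness over the domain after 50 decisions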
20

Benda, Ondřej. "Návrh rozhodovacích stromů na základě evolučních algoritmů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-219457.

Abstract:
This master's thesis deals with two algorithms for data-stream mining: the Very Fast Decision Tree (VFDT) and the Concept-adapting Very Fast Decision Tree (CVFDT). The principle of classification with a decision tree is explained, and the basic idea behind the construction of the Hoeffding Tree, which is the foundation of the VFDT and CVFDT algorithms, is described; these algorithms are then analysed in more detail. The thesis further deals with the design of a Genetic Programming (GP) algorithm, which is used to build a classifier of image data. The resulting classifier is used as an alternative way of classifying objects in an image within the Viola-Jones framework. The implementation of the algorithms, written in Java, is discussed. The GP algorithm is integrated into the "Image Processing Extension" library of the RapidMiner program. The VFDT and CVFDT algorithms are tested on synthetic and real text data. The GP algorithm is tested on image-data classification, and the resulting classifier is then evaluated on face detection in images.
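The core split test behind VFDT-style trees can be sketched in a few lines; the following is an illustrative reconstruction of the Hoeffding-bound criterion (with a conventional tie-break threshold), not the thesis's Java implementation:

    import math

    def hoeffding_bound(value_range, delta, n):
        """Hoeffding bound: with probability 1 - delta, the true mean of a random
        variable with the given range lies within this epsilon of the empirical
        mean of n observations."""
        return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

    def should_split(gain_best, gain_second, n, delta=1e-7, value_range=1.0, tie_tau=0.05):
        """VFDT-style decision: split a leaf once enough examples have been seen.

        gain_best / gain_second: information gain of the two best attributes
        estimated from the n examples accumulated at the leaf.
        """
        eps = hoeffding_bound(value_range, delta, n)
        # Split if the best attribute is reliably better than the runner-up,
        # or if the bound is so tight that the choice no longer matters (tie break).
        return (gain_best - gain_second > eps) or (eps < tie_tau)

    print(should_split(0.30, 0.22, n=200))    # False: too few examples yet
    print(should_split(0.30, 0.22, n=5000))   # True: the gap now exceeds the bound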
21

Hutchinson, Craig Alan. "Multiscale Modelling as an Aid to Decision Making in the Dairy Industry." Thesis, University of Canterbury. Chemical and Process Engineering, 2006. http://hdl.handle.net/10092/2146.

Abstract:
This work presents the first known attempt to model the dairy business from a multiscale modelling perspective. The multiscale nature of the dairy industry is examined with emphasis on those key decision making and process scales involved in production. Decision making scales identified range from the investor level to the plant operator level, and encompass business, production, plant, and operational levels. The model considers scales from the production manager to the unit operation scale. The cheese making process is used to demonstrate scale identification in the context of the important phenomena and other natural levels of scrutiny of interest to decision makers. This work was a first step in the establishment of a multiscale system model capable of delivering information for process troubleshooting, scheduling, process and business optimization, and process control decision-making for the dairy industry. Here, only material transfer throughout a process, use of raw materials, and production of manufactured product is modelled. However, an implementation pathway for adding other models (such as the precipitation of milk protein which forms curd) to the system model is proposed. The software implementation of the dairy industry multiscale model presented here tests the validity of the proposed: • object model (object and collection classes) used to model unit operations and integrate them into a process, • mechanisms for modelling material and energy streams, • method to create simulations over variable time horizons. The model was implemented using object oriented programming (OOP) methods in conjunction with technologies such as Visual Basic .NET and CAPE-OPEN. An OOP object model is presented which successfully enabled the construction of a multiscale model of the cheese making process. Material content, unit operation, and raw milk supply models were integrated into the multiscale model. The model is capable of performing simulations over variable time horizons, from 1 second, to multiple years. Mechanisms for modelling material streams, connecting unit operations, and controlling unit operation behaviour were implemented. Simple unit operations such as pumps and storage silos along with more complex unit operations, such as a cheese vat batch, were modelled. Despite some simplifications to the model of the cheese making process, the simulations successfully reproduced the major features expected from the process and its constituent unit operations. Decision making information for process operators, plant managers, production managers, and the dairy business manager can be produced from the data generated. The multiscale model can be made more sophisticated by extending the functionality of existing objects, and incorporating other scale partial models. However, increasing the number of reported variables by even a small number can quickly increase the data processing and storage demands of the model. A unit operation’s operational state of existence at any point of time was proposed as a mechanism for integrating and recalculating lower scale partial models. This mechanism was successfully tested using a unit operation’s material content model and is presented here as a new concept in multiscale modelling. The proposed modelling structure can be extended to include any number of partial models and any number of scales.
22

Vařeková, Petra. "Analýza a vyhodnocení umísťování staveb dle stavebního zákona v mikroregionu Litovelsko." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2016. http://www.nusl.cz/ntk/nusl-241297.

Abstract:
The subject of this thesis is to analyse and evaluate the placement of buildings under the Building Act in the Litovelsko microregion. The theoretical part covers the individual divisions of the Building Act, including zoning classification. The practical part analyses specific examples: a family house in the village of Ješov, the appropriate placement of a building on a plot in the village of Hradečná and, as the last example, the placement of a building on a site with dilapidated buildings in the village of Kovářov. Finally, the work evaluates appropriate solutions for the placement of these buildings.
23

Sheehan, Jennifer Karr. "Intangible Qualities of Rare Books: Toward a Decision-Making Framework for Preservation Management in Rare Book Collections, Based Upon the Concept of the Book as Object." Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5213/.

Abstract:
For rare book collections, a considerable challenge is involved in evaluating collection materials in terms of their inherent value, which includes the textual and intangible information the materials provide for the collection's users. Preservation management in rare book collections is a complex and costly process. As digitization and other technological advances in surrogate technology have provided new forms of representation, new dilemmas in weighing the rare book's inherently valuable characteristics against the possibly lesser financial costs of surrogates have arisen. No model has been in wide use to guide preservation management decisions. An initial iteration of such a model is developed, based on a Delphi-like iterative questioning of a group of experts in the field of rare books. The results are used to synthesize a preservation management framework for rare book collections, and a small-scale test of the framework has been completed through two independent analyses of five rare books in a functioning collection. Utilizing a standardized template for making preservation decisions offers a variety of benefits. Preservation decisions may include prioritizing action upon the authentic objects, or developing and maintaining surrogates in lieu of retaining costly original collection materials. The framework constructed in this study provides a method for reducing the subjectivity of preservation decision-making and facilitating the development of a standard of practice for preservation management within rare book collections.
24

Al-Shrouf, Lou'i [Verfasser], Dirk [Akademischer Betreuer] Söffker, and Claus-Peter [Akademischer Betreuer] Fritzen. "Development and Implementation of a Reliable Decision Fusion and Pattern Recognition System for Object Detection and Condition Monitoring / Loui Al-Shrouf. Gutachter: Dirk Söffker ; Claus-Peter Fritzen." Duisburg, 2014. http://d-nb.info/1057837237/34.

25

Tahara, Creusa Sayuri. "Arquitetura para integração de métodos para apoiar a decisão em formação de células de manufatura." Universidade de São Paulo, 2001. http://www.teses.usp.br/teses/disponiveis/18/18135/tde-12042017-165200/.

Abstract:
Cellular manufacturing occupies an important place among production systems: it is used in a significant share of world-class manufacturing companies and is the object of constant research. One of the fundamental aspects of applying this production system is solving the cell formation problem. Over the last three decades many algorithms have been proposed to aid this decision; even so, they have not been applied intensively in the day-to-day routine of companies, which in most cases design their cells based on common sense and experience. Some of the factors that contribute to this are that the data necessary to solve the problem are not available, and that each algorithm addresses particular situations, requiring a specialist capable of choosing or developing the algorithms and applying them according to the specific characteristics of the problem at hand. This work proposes a framework that seeks to integrate the various cell formation algorithms, allowing them to be applied to the same set of data. The purpose is to make the largest possible number of algorithms available to company professionals in a simple way, without the need for expensive data collection, following a procedure in which different solutions to a problem can be analysed. The framework is composed of a procedure, an object-oriented model and a decision support system for cell formation. The results obtained from its application show that the proposed framework is viable, with great potential for further development in future research.
26

Regmi, Hem Kanta. "A Real-Time Computational Decision Support System for Compounded Sterile Preparations using Image Processing and Artificial Neural Networks." University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1469113622.

27

Ruiz, Luis Fernando Chimelo. "Uma abordagem de classificação da cobertura da terra em imagens obtidas por veículo aéreo não tripulado." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/111857.

Abstract:
Non-metric cameras attached to Unmanned Aerial Vehicles (UAVs) enable the collection of images with high spatial and temporal resolution, and the cost of operating and maintaining this equipment is low. Land cover classification from these images is hampered by the high spectral variability of the targets and by the large volume of data generated. These obstacles are addressed using Object-Based Image Analysis (OBIA) and data mining algorithms. One algorithm used in OBIA is the Decision Tree (DT), a technique that allows both the selection of the most informative attributes and the classification of the segmented regions. New DT techniques have been developed that include functions for selecting attributes and for improving the classification; one example is the C5.0 algorithm, which has a data reduction (winnowing) function and a boosting function. In this context, this work aims to (i) evaluate the region-growing segmentation method on images with very high spatial resolution, (ii) determine the most important predictive attributes for discriminating the classes, and (iii) evaluate the classification of the regions with respect to the attribute selection (winnow) and boosting (trials) parameters of the C5.0 algorithm. Image segmentation was performed in the Spring program, and the regions generated by the segmentation were classified with the C5.0 decision tree model available in R. The region-growing segmentation showed high correspondence with the regions delineated by the expert, with Reference Bounded Segments Booster (RBSB) values close to 0. The most important attributes in building the decision tree models were the ratio between the green and blue bands (r_v_a) and the Digital Elevation Model (DEM). Increasing the boosting (trials) parameter did not improve classification accuracy, whereas the winnow parameter reduced the number of predictive attributes without statistically significant losses in classification accuracy. This work thus contributes to land cover classification in images collected by UAVs, since algorithms were developed to automate the OBIA workflow and to evaluate the classification of regions with respect to the boosting (trials) and attribute selection (winnow) functions of the C5.0 decision tree classifier.
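As a rough analogue of the classification step only (a CART tree from scikit-learn stands in for C5.0 here, since boosting 'trials' and attribute winnowing are C5.0-specific options with no direct equivalent), a hypothetical attribute table of segmented regions could be classified as follows:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical attribute table: one row per segmented region.
    # Columns: mean green/blue band ratio, mean DEM height, region area.
    X = np.array([
        [1.40, 102.0, 250.0],   # grass
        [1.35,  99.0, 300.0],   # grass
        [0.90, 101.0, 180.0],   # bare soil
        [0.85, 100.0, 220.0],   # bare soil
        [1.10, 110.0,  60.0],   # building roof
        [1.05, 111.0,  75.0],   # building roof
    ])
    y = np.array(["grass", "grass", "soil", "soil", "roof", "roof"])

    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(clf.predict([[1.30, 100.5, 270.0]]))      # -> likely 'grass'
    print(dict(zip(["g/b ratio", "DEM", "area"], clf.feature_importances_)))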
28

Mangeot, Christine. "Copilote : configuration et planification integrees en langage objet pour la teleoperation." Paris 6, 1987. http://www.theses.fr/1987PA066508.

Abstract:
The evolution of computer-assisted teleoperation requires the introduction of tools that assist the operator in decision-making and planning tasks. The assistance needs and the solutions proposed by artificial intelligence are evaluated.
29

Andersson, Maria, Fredrik Gustafsson, Louis St-Laurent, and Donald Prevost. "Recognition of Anomalous Motion Patterns in Urban Surveillance." Linköpings universitet, Reglerteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-93983.

Abstract:
We investigate unsupervised K-means clustering and the semi-supervised hidden Markov model (HMM) to automatically detect anomalous motion patterns in groups of people (crowds). Anomalous motion patterns are typically people merging into a dense group, followed by disturbances or threatening situations within the group. The application of K-means clustering and the HMM is illustrated with datasets from four surveillance scenarios. The results indicate that by investigating the group of people systematically with different K values and analysing cluster density, cluster quality and changes in cluster shape, we can automatically detect anomalous motion patterns. The results correspond well with the events in the datasets. They also indicate that very accurate detection of the people in the dense group is not necessary: the clustering and HMM results remain much the same even with some increased uncertainty in the detections.

Funding agencies: Vinnova (Swedish Governmental Agency for Innovation Systems) under the VINNMER program.
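As a hedged, self-contained illustration of the clustering side of this approach (the datasets, K values and thresholds used in the paper are not reproduced), one could flag frames in which most detected people collapse into one tight cluster:

    import numpy as np
    from sklearn.cluster import KMeans

    def densest_cluster_stats(positions, k=3):
        """Cluster detected person positions and report the largest cluster.

        A crude proxy for 'people merging into a dense group': when the largest
        cluster's spread shrinks while its share of people grows, flag the frame.
        """
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(positions)
        stats = []
        for c in range(k):
            members = positions[km.labels_ == c]
            spread = members.std(axis=0).mean()            # rough cluster size
            stats.append((len(members) / len(positions), spread))
        share, spread = max(stats)                          # cluster holding most people
        return share, spread

    # Toy frame: 30 people, 20 of whom have converged into a tight group.
    rng = np.random.default_rng(0)
    dense = rng.normal([5, 5], 0.3, size=(20, 2))
    scattered = rng.uniform(0, 20, size=(10, 2))
    share, spread = densest_cluster_stats(np.vstack([dense, scattered]))
    print(f"largest cluster holds {share:.0%} of people, spread ~ {spread:.2f}")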

30

Aguilar, Lleyda David. "Sensorimotor decision-making with moving objects." Doctoral thesis, Universitat de Barcelona, 2017. http://hdl.handle.net/10803/461673.

Abstract:
Moving is essential for us to survive, and on countless occasions we move in response to visual information. However, this process is characterized as uncertain, given the variability present both at the sensory and motor stages. A crucial question, then, is how to deal with this uncertainty in order for our actions to lead to the best possible outcomes. Statistical decision theory (SDT) is a normative framework that establishes how people should make decisions in the presence of uncertainty. This theory identifies the optimal action as that which maximizes the expected reward (outcome) of the situation. Movement planning can be reformulated in terms of SDT, so that the focus is placed on the decisional component. Some experimental work making use of this theoretical approach has concluded that humans are optimal movement planners, while other work has identified situations where suboptimality arises. However, sensorimotor decision-making within SDT has commonly overlooked scenarios involving interaction with moving objects. At the same time, the work devoted to moving objects has not focused on the decisional aspect. The present thesis aims at bridging both fields, with each of our three studies trying to answer different questions. Given the spatiotemporal nature of situations with moving objects, we can plan our actions by relying on both temporal and spatial cues provided by the object. In Study I we investigated whether exploiting one type of these visual cues more than the other led to a better performance, as defined by the reward given after each action. In our task we presented a target, which could vary in speed and motion time, approaching a line. Participants responded to stop the target and were rewarded according to its proximity to the line. Responding after the target crossed the line was penalized. We discovered that those participants planning their responses based on time-based motion cues had a better performance than those monitoring the target's changing spatial position. This was due to the former approach circumventing a limitation imposed by the resolution of the visual system. We also found that viewing the object for longer favored time-based responses, as mediated by longer integration time. Finally, we used existing SDT models to obtain a reference of optimality, but we argue that these models are of limited use for interpreting our data. Study II built on our previous findings to explore whether the use of temporal cues could be learnt. We took our previous paradigm and adapted it so that reward was manipulated after each task in order to foster exploiting temporal information. There was no evidence for learning taking place, since participants using temporal cues did so from the start of the experiment. Whether other reward manipulations can shape the use of certain cues, and why some people naturally tend to make more use of temporal information, still remain elusive. Study III deepened our knowledge of which variability people consider when planning their responses. We hypothesized that the reason why people are suboptimal (as defined by SDT) in many situations is that they represent only their measurement variability, roughly equivalent to the execution noise, while excluding the variability created by sudden changes in their planning. We took previous data and used a Kalman filter to extract each participant's measurement variability. We then used it to compute SDT-derived optimal responses, and discovered that they explained our data well, giving support to our hypothesis.
We also found evidence for participants using the information provided by reward both to avoid being penalized and to choose the point at which to stabilize their responses. Taken together, our experimental work presents interaction with moving objects as a complex set of situations where different kinds of information guide our response planning: firstly, visual cues of different origins; secondly, our own variability, coming from many sources, some of which may not be taken into account; and finally, the outcomes associated with each action.
Moving is essential for our survival, and on countless occasions we move in response to visual information. However, this process is uncertain, given the variability present at both the sensory and motor stages. A crucial question, then, is how to manage this uncertainty so that our actions lead to the best possible consequences. Statistical decision theory (SDT) is a normative framework that establishes how people should make decisions in the presence of uncertainty; it identifies the optimal action as the one that maximizes the expected reward (understood as outcome) of the situation. Movement planning can be reformulated in terms of SDT so as to emphasize the decisional component. Experimental work using this theoretical approach has concluded that humans are optimal movement planners, while other work has identified situations where suboptimality arises. However, sensorimotor decision-making within SDT has usually ignored scenarios that require interaction with moving objects, while the work devoted to moving objects has not focused on the decisional aspect. This thesis aims to bring the two fields together, with each of our three studies trying to answer different questions. Study I found that planning our decisions using temporal information led to better performance than using spatial information, and that this was facilitated by viewing the object for longer; we also criticized the limitations of certain SDT models for interpreting our data. Study II attempted to promote the use of temporal information, although learning could not be fostered. Finally, Study III found that the reason people are suboptimal in many situations is that they represent only their measurement variability, roughly equivalent to execution noise, while excluding the variability created by sudden changes in response planning. We also found that participants used the information provided by reward both to avoid being penalized and to choose the point at which to stabilize their responses.
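As an illustrative sketch of the SDT-style computation described above (with an invented payoff function, not the one used in the experiments), the optimal aim point under Gaussian execution noise can be found by maximizing a Monte-Carlo estimate of expected gain:

    import numpy as np

    def expected_gain(aim, sigma, n=100_000, line=0.0, penalty=-5.0, rng=None):
        """Monte-Carlo expected gain for aiming at `aim` (a point before the line).

        Stops beyond the line are penalised; otherwise the reward grows as the
        stop point gets closer to the line. Gaussian execution noise with SD sigma.
        """
        rng = rng or np.random.default_rng(0)
        stops = rng.normal(aim, sigma, size=n)       # realised stop positions
        gains = np.where(stops > line, penalty, 1.0 / (1.0 + (line - stops)))
        return gains.mean()

    sigma = 0.5
    aims = np.linspace(-3.0, 0.0, 61)                # candidate aim points before the line
    best = aims[np.argmax([expected_gain(a, sigma) for a in aims])]
    print(f"optimal aim point ~ {best:.2f} (i.e. {-best:.2f} units short of the line)")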
31

Zavalina, Viktoriia. "Identifikace objektů v obraze." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2014. http://www.nusl.cz/ntk/nusl-220364.

Abstract:
This master's thesis deals with methods for detecting objects in images. It contains theoretical, practical and experimental parts. The theoretical part describes image representation, image preprocessing methods, and methods for the detection and identification of objects. The practical part contains a description of the created programs and of the algorithms used in them. The application was created in MATLAB and offers an intuitive graphical user interface and three different methods for the detection and identification of objects in an image. The experimental part contains test results for the implemented program.
32

Kovalov, Ievgen. "Context-sensitive Points-To Analysis : Comparing precision and scalability." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-18225.

Abstract:
Points-to analysis is a static program analysis that tries to predict the dynamic behavior of programs without running them. It computes reference information by approximating, for each pointer in the program, the set of objects to which it could point at runtime. In order to justify new analysis techniques, they need to be compared to the state of the art with respect to accuracy and efficiency. One of the main parameters influencing precision in points-to analysis is context sensitivity, which analyses each method separately for the different contexts in which it is called. The problem raised by adding this property to points-to analysis is decreased scalability and increased memory consumption during the analysis. The goal of this thesis is to present a comparison of the precision and scalability of context-sensitive and context-insensitive analysis using three different points-to analysis techniques (Spark, Paddle, P2SSA) produced by two research groups. This comparison provides the basic trade-offs between scalability on the one hand and efficiency and accuracy on the other. The work builds on previous research in this field and investigates and implements several specific metrics covering each type of analysis, with and without context sensitivity, for Spark, Paddle and P2SSA. These three approaches to points-to analysis demonstrate the achievements of different research groups, and a common output format makes it possible to choose the most efficient type of analysis for a particular purpose.
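Spark, Paddle and P2SSA are full-scale analyses for Java and are not reproduced here; purely as a toy illustration of what a context-insensitive points-to analysis computes, and of why context sensitivity matters, consider this minimal constraint solver:

    from collections import defaultdict

    def points_to(allocs, copies):
        """Tiny context-insensitive, flow-insensitive points-to analysis.

        allocs: list of (var, object_label) for statements like `p = new A()`
        copies: list of (dst, src) for statements like `p = q`
        Returns a points-to set for every variable (a fixed point of the
        subset constraints pts(dst) >= pts(src)).
        """
        pts = defaultdict(set)
        for var, obj in allocs:
            pts[var].add(obj)
        changed = True
        while changed:                      # iterate until no set grows any more
            changed = False
            for dst, src in copies:
                before = len(pts[dst])
                pts[dst] |= pts[src]
                changed |= len(pts[dst]) != before
        return dict(pts)

    # id(x) { return x; }  called twice:  a = id(new A());  b = id(new B());
    allocs = [("o1", "A"), ("o2", "B")]
    copies = [("x", "o1"), ("x", "o2"),     # both calls flow into id's parameter x
              ("a", "x"), ("b", "x")]       # and its return value flows back out
    print(points_to(allocs, copies))
    # Context-insensitively, both 'a' and 'b' appear to point to {A, B};
    # a context-sensitive analysis would keep the two call sites apart.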
APA, Harvard, Vancouver, ISO, and other styles
33

Morgan, Beatriz Fátima. "Tecnologias contábeis, decisões coletivas e gestão de risco nas relações de suprimento de gás liquefeito de petróleo." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/12/12136/tde-18072012-170633/.

Full text
Abstract:
Esta pesquisa discute como as tecnologias contábeis contribuem para construir decisões coletivas para gerenciamento de risco operacional ao longo das relações de suprimento. A construção das decisões é vista como um processo, e o coletivo formado por humanos e não-humanos. As tecnologias contábeis são abordadas como inscrições e objetos delimitantes. As inscrições mediam as ações à distância e simplificam objetos complexos tornando-os entidades separadas que atuam como objetos delimitantes (LATOUR, 1987; LAW, 1986; STAR; GRIESEMER, 1989). O conceito de risco na literatura apresenta ambiguidades. A forma de gerenciá-lo é problematizada na literatura contábil em relações inter-organizacionais. O estudo de campo foi conduzido nas relações de suprimento de gás liquefeito de petróleo que incluem a companhia petrolífera, os fornecedores de transporte e a companhia distribuidora. As informações foram obtidas sob a perspectiva deste último, por meio de etnografia que incluiu entrevistas, observações e shadowing de objetos e pessoas, e submetidos à análise narrativa permeada pela análise desconstrutiva. Inicialmente foram identificadas as ameaças que podem ocasionar falta de produto em unidades de produção da distribuidora. Nem todas as ameaças observadas são passíveis de se tornar objeto de gerenciamento de risco. Para isso, elas precisam estar inscritas e acumuladas na área responsável pelo suprimento que atua como um centro de cálculo. Porém, muitas estão interligadas com outros fatores e apresentam efeitos inesperados nas dimensões de espaço e tempo. As tecnologias de contabilidade padronizadas, tais como, planejamentos de longo prazo e orçamentos, quando usadas isoladamente não tem força suficiente para mobilizar ações que levem à redução do risco. Por outro lado, os conflitos gerados quando ambas são combinadas impelem para a busca de outras informações que resultem em um número mais preciso. Nas relações estudadas, os riscos acentuam-se no curto prazo em contraste com o longo prazo conforme preconizado pela literatura. As tecnologias construídas na prática exercem força para mobilizar decisões imediatas. Apesar de, contratualmente, a parceria para o fornecimento de gás ser firmada por duas companhias, em que de um lado está o fornecedor e do outro a distribuidora, o que se tem no dia-a-dia são múltiplas relações construídas entre as unidades de produção, refinarias e a área de suprimentos. Desta forma, inscrições, como a ordem de compra, são capazes de redefinir as fronteiras. Transladam para agir como instrumentos de gerenciamento de risco operacional. A confiança manifesta-se como um quase-objeto (MOURITSEN; THRANE, 2006) que ganha existência na circulação de informações entre as partes. Com isso, o risco de falta de produto se apresenta como um ,,fantasma\" que poderá ser materializado dependendo do fluxo de informações. Este estudo contribui empiricamente por mostrar os riscos envolvidos nas relações de suprimento de gás liquefeito de petróleo ao longo do território brasileiro, e como a contabilidade contribui para gerenciá-los. Além disso, estende o conhecimento relativo à forma como ocorre o fluxo de informações no cenário inter-organizacional, bem como a adoção de práticas híbridas para o alcance de ações coletivas que gerenciam os riscos de suprimentos e constroem relações. Teoricamente, o estudo contribui na discussão de conceito de risco e como a contabilidade está associada com este conceito.
This research discusses how accounting technologies contribute to constructing collective decisions for operational risk management across supply relationships. Decision construction is seen as a process, and the collective is formed by humans and non-humans. Accounting technologies are approached as inscriptions and boundary objects. Inscriptions mediate action at a distance and simplify complex objects, turning them into separate entities that act as boundary objects (LATOUR, 1987; LAW, 1986; STAR; GRIESEMER, 1989). The concept of risk in the literature is ambiguous, and the way it is managed in inter-organizational relationships is problematized in the accounting literature. The field study was conducted in liquefied petroleum gas supply relations involving the oil company, the transport suppliers and the distribution company. The information was obtained from the perspective of the latter, through an ethnography that included interviews, observations and the shadowing of objects and people, and was subjected to narrative analysis permeated by deconstructive analysis. We first identified the threats that can cause product shortages in the distribution company's production units. Not all of the threats observed can become objects of risk management: to do so, they must be inscribed and accumulated in the area responsible for supply, which acts as a centre of calculation. Many, however, are intertwined with other factors and have unexpected effects across the dimensions of space and time. Standard accounting technologies, such as long-term planning and budgeting, do not have enough force when used alone to mobilize actions that reduce risk. On the other hand, the conflicts generated when the two are combined propel the search for other information that results in a more precise figure. In the relations studied, risks are accentuated in the short term rather than the long term, in contrast to what the literature suggests, and the technologies built in practice exert force to mobilize immediate decisions. Although the gas supply partnership is contractually signed by two companies, the supplier on one side and the distributor on the other, day-to-day activity consists of multiple relations built among production units, refineries and the supply area. In this way, inscriptions such as the purchase order are able to redefine the boundaries; they are translated to act as instruments of operational risk management. Trust manifests itself as a quasi-object (MOURITSEN; THRANE, 2006) which comes into existence as information circulates between the parties. The risk of product shortage thus presents itself as a "ghost" that may materialize depending on the information flows. This study contributes empirically by showing the risks involved in liquefied petroleum gas supply relations across the Brazilian territory and how accounting helps to manage them. Furthermore, it extends knowledge of how information flows occur in an inter-organizational setting, as well as of the adoption of hybrid practices to achieve collective actions that manage supply risks and build relations. Theoretically, the study contributes to the discussion of the concept of risk and of how accounting is associated with this concept.
APA, Harvard, Vancouver, ISO, and other styles
34

Riseth, Asbjørn Nilsen. "Algorithms for decision making." Thesis, University of Oxford, 2018. http://ora.ox.ac.uk/objects/uuid:a66e17dc-a626-4226-82f6-2d716f8690fd.

Full text
Abstract:
We investigate algorithms for different steps in the decision making process, focusing on systems where we are uncertain about the outcomes but can quantify how probable they are using random variables. Any decision one makes in such a situation leads to a distribution of outcomes and requires a way to evaluate a decision. The standard approach is to marginalise the distribution of outcomes into a single number that tries in some way to summarise the value of each decision. After selecting a marginalisation approach, mathematicians and decision makers focus their analysis on the marginalised value but ignore the distribution. We argue that we should also be investigating the implications of the chosen mathematical approach for the whole distribution of outcomes. We illustrate the effect different mathematical formulations have on the distribution with one-stage and sequential decision problems. We show that different ways to marginalise the distributions can result in very similar decisions, but each way has a different complexity and computational cost. It is often computationally intractable to approximate optimal decisions to high precision, and much research goes into developing algorithms that are suboptimal in the marginalised sense but work within the available computational budget. When the performance of these algorithms is evaluated, they are mainly judged on the marginalised values; however, comparing performance using the full distribution provides interesting information. We provide numerical examples from dynamic pricing applications where the suboptimal algorithm results in higher profit than the optimal algorithm in more than half of the realisations, which is paid for with a more significant underperformance in the remaining realisations. All the problems discussed in this thesis lead to continuous optimisation problems. We develop a new algorithm that can be used on top of existing optimisation algorithms to reduce the cost of approximating solutions. The algorithm is tested on a range of optimisation problems and is shown to be competitive with existing methods.
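The point about marginalisation can be illustrated with a small simulation: two decisions whose outcome distributions rank differently depending on the criterion, where the option that is worse in expectation nevertheless wins in more than half of the realisations. The distributions below are invented and are not the thesis's dynamic pricing model.

    import numpy as np

    # Two invented profit distributions: A is steady, B is usually a bit better but
    # occasionally much worse. A has the higher mean, yet B beats A in most runs.
    rng = np.random.default_rng(0)
    n = 100_000
    profit_a = rng.normal(10.0, 1.0, n)
    profit_b = np.where(rng.random(n) < 0.6,
                        rng.normal(12.0, 1.0, n),    # good branch
                        rng.normal(4.0, 1.0, n))     # bad branch

    cvar = lambda x, q=0.1: np.sort(x)[: int(q * len(x))].mean()   # mean of the worst 10%
    print("mean profit     A=%.2f  B=%.2f" % (profit_a.mean(), profit_b.mean()))
    print("P(B beats A)    %.2f" % np.mean(profit_b > profit_a))
    print("worst-10%% mean  A=%.2f  B=%.2f" % (cvar(profit_a), cvar(profit_b)))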
APA, Harvard, Vancouver, ISO, and other styles
35

McInerney, Robert E. "Decision making under uncertainty." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:a34e87ad-8330-42df-8ba6-d55f10529331.

Full text
Abstract:
Operating and interacting in an environment requires the ability to manage uncertainty and to choose definite courses of action. In this thesis we look to Bayesian probability theory as the means to achieve the former, and find that through rigorous application of the rules it prescribes we can, in theory, solve problems of decision making under uncertainty. Unfortunately such methodology is intractable in real-world problems, and thus approximation of one form or another is inevitable. Many techniques make use of heuristic procedures for managing uncertainty. We note that such methods suffer from unreliable performance and rely on the specification of ad-hoc variables. Performance is often judged according to long-term asymptotic measures, which we believe ignore the most complex and relevant parts of the problem domain. We therefore look to develop principled approximate methods that preserve the meaning of Bayesian theory but operate with the scalability of heuristics. We start by looking at function approximation in continuous state and action spaces using Gaussian Processes. We develop a novel family of covariance functions which allow tractable inference methods to accommodate some of the uncertainty lost by not following full Bayesian inference. We also investigate the exploration versus exploitation trade-off in the context of the Multi-Armed Bandit, and demonstrate that principled approximations behave close to optimal behaviour and perform significantly better than heuristics on a range of experimental test beds.
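As an illustration of the exploration-versus-exploitation point, the sketch below compares a principled Bayesian strategy (Thompson sampling with Beta posteriors) against a simple heuristic (epsilon-greedy) on a Bernoulli multi-armed bandit. The specific algorithms and settings are our own example and are not claimed to be those used in the thesis.

    import numpy as np

    # Three-armed Bernoulli bandit; compare Thompson sampling with epsilon-greedy.
    rng = np.random.default_rng(1)
    true_p = np.array([0.3, 0.5, 0.7])
    T = 5000

    def run(policy):
        wins, losses, total = np.ones(3), np.ones(3), 0.0   # Beta(1,1) pseudo-counts
        for _ in range(T):
            if policy == "thompson":
                arm = int(np.argmax(rng.beta(wins, losses)))     # sample from posteriors
            else:                                                # epsilon-greedy heuristic
                arm = (int(rng.integers(3)) if rng.random() < 0.1
                       else int(np.argmax(wins / (wins + losses))))
            r = float(rng.random() < true_p[arm])
            wins[arm] += r
            losses[arm] += 1 - r
            total += r
        return true_p.max() * T - total                          # regret against the best arm

    # Thompson sampling typically incurs much lower regret here than the heuristic.
    print("regret  thompson=%.0f  eps-greedy=%.0f" % (run("thompson"), run("greedy")))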
APA, Harvard, Vancouver, ISO, and other styles
36

Brockie, Steven A. J. "A co-ordinated business object approach for supporting tactical level management decisions." Thesis, Aston University, 2003. http://publications.aston.ac.uk/10611/.

Full text
Abstract:
The proliferation of data throughout the strategic, tactical and operational areas of many organisations has created a need to present the decision maker with structured information that is appropriate for the tasks at hand. Despite this abundance of data, however, managers at all levels of the organisation commonly encounter a condition of 'information overload' that results in a paucity of the correct information. This thesis focuses on the tactical domain within the organisation and the information needs of the management who reside at this level. It argues that the link between decision making at the tactical level and low-level transaction processing data should be a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. To achieve this, the Co-ordinated Business Object Model (CBOM) was created. It is a two-tier framework: the first tier models data using four interacting object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through its graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. To validate the CBOM, three verification tests were completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information used to support tactical-level decision making.
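A purely hypothetical sketch of how the first tier's four object models (processes, activities, resources, actors) might be represented is given below; the class and attribute names are guesses for illustration, not the thesis's actual CBOM or the CBOSS code.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Actor:
        name: str

    @dataclass
    class Resource:
        name: str
        capacity: float

    @dataclass
    class Activity:
        name: str
        performed_by: Actor
        uses: List[Resource] = field(default_factory=list)
        duration_hours: float = 0.0

    @dataclass
    class Process:
        name: str
        activities: List[Activity] = field(default_factory=list)

        def total_duration(self) -> float:
            # one possible second-tier style query: summarise the captured data
            return sum(a.duration_hours for a in self.activities)

    clerk = Actor("order clerk")
    erp = Resource("ERP system", capacity=1.0)
    p = Process("order handling", [Activity("enter order", clerk, [erp], 0.5),
                                   Activity("confirm order", clerk, [erp], 0.25)])
    print(p.total_duration())   # 0.75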
APA, Harvard, Vancouver, ISO, and other styles
37

Boldt, Annika. "Metacognition in decision making." Thesis, University of Oxford, 2015. http://ora.ox.ac.uk/objects/uuid:5d9b2036-cc42-4515-b40e-97bb3ddb1d78.

Full text
Abstract:
Humans effortlessly and accurately judge their subjective probability of being correct in a given decision, leading to the view that metacognition is integral to decision making. This thesis reports a series of experiments assessing people's confidence and error-detection judgements. These different types of metacognitive judgements are highly similar with regard to their methodology, but have been studied largely separately. I provide data indicating that these judgements are fundamentally linked and that they rely on shared cognitive and neural mechanisms. As a first step towards such a joint account of confidence and error detection, I present simulations from a computational model built on the notion that these judgements arise from the same underlying processes. I next focus on how metacognitive signals are utilised to enhance cognitive control by means of a modulation of information seeking. I report data from a study in which participants received performance feedback, testing the hypothesis that participants focus more on feedback when they are uncertain whether they were correct in the current trial, whilst ignoring feedback when they are certain of their accuracy. A final question addressed in this thesis asks which information contributes internally to the formation of metacognitive judgements, given that it remains a challenge for most models of confidence to explain the precise mechanisms by which confidence reflects accuracy, the circumstances under which this correlation is reduced, and the role of other influences, such as the inherent reliability of a source of evidence. The results reported here suggest that multiple variables, such as response time and the reliability of evidence, play a role in the generation of metacognitive judgements. Inter-individual differences in the utilisation of these cues to confidence are tested. Taken together, my results suggest that metacognition is crucially involved in decision making and cognitive control.
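A minimal toy model of the kind alluded to here treats confidence as the posterior probability that a two-alternative decision is correct given noisy evidence. The parameters below are arbitrary and the model is far simpler than the simulations reported in the thesis.

    import numpy as np
    from scipy.stats import norm

    # Confidence as the posterior probability of being correct in a two-alternative
    # decision with Gaussian evidence (equal priors; all numbers are illustrative).
    rng = np.random.default_rng(2)
    mu, sigma = 1.0, 1.5                  # evidence mean for the true alternative, noise SD

    def trial():
        x = rng.normal(mu, sigma)         # noisy evidence; the true alternative is "A"
        choice_a = x > 0
        like_a, like_b = norm.pdf(x, mu, sigma), norm.pdf(x, -mu, sigma)
        confidence = max(like_a, like_b) / (like_a + like_b)   # P(chosen option correct | x)
        return choice_a, confidence

    correct, conf = map(np.array, zip(*(trial() for _ in range(20000))))
    # confidence tracks accuracy: high-confidence trials are correct more often
    print("accuracy when conf < 0.7: %.2f   when conf >= 0.9: %.2f"
          % (correct[conf < 0.7].mean(), correct[conf >= 0.9].mean()))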
APA, Harvard, Vancouver, ISO, and other styles
38

Zellentin, Alexa Birgit. "Neutrality in political decision making." Thesis, University of Oxford, 2009. http://ora.ox.ac.uk/objects/uuid:d9e8cb98-6ca2-4184-9fc4-98a206499e43.

Full text
Abstract:
Liberal neutrality, as understood in current legal and political debates, has two underlying intuitions and therefore two distinct elements. On the one hand it refers to the intuition that there are matters the state has no business getting involved in (the hands-off element). On the other hand it is motivated by the idea that the state ought to treat citizens as equals and show equal respect for their different conceptions of the good life (the equality element). This thesis defends this two-fold understanding of neutrality with reference to Rawls' conception of society as a fair system of cooperation and the idea of citizens as free and equal persons. In particular, the idea that citizens are to be treated as free justifies the hands-off element and argues that the state must be involved in nothing but justice. In the context of political decision making this requires the state to be justificatorily neutral. Treating citizens as equals requires the state to grant its citizens equal political rights and also to ensure that these rights have "fair value." Given the danger that cultural bias undermines the equal standing of citizens, the state has to ensure procedures of political decision making that are able to take citizens' different conceptions into account. Treating citizens as free and equal therefore requires that the state ban all considerations of the good from the justification of state action, while at the same time taking these considerations into account when deliberating how regulations are to be implemented.
APA, Harvard, Vancouver, ISO, and other styles
39

Simpson, Edwin Daniel. "Combined decision making with multiple agents." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:f5c9770b-a1c9-4872-b0dc-1bfa28c11a7f.

Full text
Abstract:
In a wide range of applications, decisions must be made by combining information from multiple agents with varying levels of trust and expertise. For example, citizen science involves large numbers of human volunteers with differing skills, while disaster management requires aggregating information from multiple people and devices to make timely decisions. This thesis introduces efficient and scalable Bayesian inference for decision combination, allowing us to fuse the responses of multiple agents in large, real-world problems and account for the agents’ unreliability in a principled manner. As the behaviour of individual agents can change significantly, for example if agents move in a physical space or learn to perform an analysis task, this work proposes a novel combination method that accounts for these time variations in a fully Bayesian manner using a dynamic generalised linear model. This approach can also be used to augment agents’ responses with continuous feature data, thus permitting decision-making when agents’ responses are in limited supply. Working with information inferred using the proposed Bayesian techniques, an information-theoretic approach is developed for choosing optimal pairs of tasks and agents. This approach is demonstrated by an algorithm that maintains a trustworthy pool of workers and enables efficient learning by selecting informative tasks. The novel methods developed here are compared theoretically and empirically to a range of existing decision combination methods, using both simulated and real data. The results show that the methodology proposed in this thesis improves accuracy and computational efficiency over alternative approaches, and allows for insights to be determined into the behavioural groupings of agents.
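The basic idea of Bayesian decision combination can be sketched with a static, naive-Bayes fusion in which each agent's reliability is summarised by a confusion matrix. The thesis's dynamic, fully Bayesian model is considerably richer; the example below only conveys the general shape and uses invented numbers.

    import numpy as np

    # confusion[a][t, r] = P(agent a reports r | true label t); all values invented.
    prior = np.array([0.5, 0.5])                        # P(true label = 0 or 1)
    confusion = [np.array([[0.9, 0.1], [0.2, 0.8]]),    # fairly reliable agent
                 np.array([[0.6, 0.4], [0.4, 0.6]]),    # weak agent
                 np.array([[0.55, 0.45], [0.5, 0.5]])]  # nearly uninformative agent

    def combine(reports):
        post = prior.copy()
        for conf_mat, r in zip(confusion, reports):
            post *= conf_mat[:, r]                      # multiply in each agent's likelihood
        return post / post.sum()

    print(combine([1, 1, 0]))   # the reliable agent's report dominates the weaker ones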
APA, Harvard, Vancouver, ISO, and other styles
40

Lindstam, Tim, and Anton Svensson. "Behavior Based Artificial Intelligence in a Village Environment." Thesis, Malmö högskola, Fakulteten för teknik och samhälle (TS), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20522.

Full text
Abstract:
Autonomous agents, also known as AI agents, are staples of modern video games. They take on many roles, from quest-givers in role-playing games to opposing forces in action and shooter games. Crafting an AI that is easy to create yet behaves in a humanlike and believable way has always been a challenge for the games industry, and open-world games have in several cases ended up with AI systems that limit agents to simple movement patterns. In this thesis, a type of AI system more commonly used in simulation games such as The Sims series is implemented in an environment that could plausibly appear in an open-world game. After the implementation, a set of tests was performed with a group of testers; when asked to compare the experience with other games, a majority of them found this implementation to feel more lifelike and realistic.
APA, Harvard, Vancouver, ISO, and other styles
41

Agli, Hamza. "Raisonnement incertain pour les règles métier." Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066129/document.

Full text
Abstract:
Nous étudions dans cette thèse la gestion des incertitudes au sein des systèmes à base de règles métier orientés objet (Object-Oriented Business Rules Management Systems ou OO-BRMS) et nous nous intersessions à des approches probabilistes. Afin de faciliter la modélisation des distributions de probabilités dans ces systèmes, nous proposons d'utiliser les modèles probabilistes relationnels (Probabilistic Relational Models ou PRM), qui sont une extension orientée objet des réseaux bayésiens. Lors de l'exploitation des OO-BRMS, les requêtes adressées aux PRM sont nombreuses et les réponses doivent être calculées rapidement. Pour cela, nous proposons, dans la première partie de cette thèse, un nouvel algorithme tirant parti de deux spécificités des OO-BRMS. Premièrement, les requêtes de ces derniers s'adressent seulement à une sous partie de leur base. Par conséquent, les probabilités à calculer ne concernent que des sous-ensembles de toutes les variables aléatoires des PRM. Deuxièmement, les requêtes successives diffèrent peu les unes des autres. Notre algorithme exploite ces deux spécificités afin d'optimiser les calculs. Nous prouvons mathématiquement que notre approche fournit des résultats exacts et montrons son efficacité par des résultats expérimentaux. Lors de la deuxième partie, nous établissons des principes généraux permettant d'étendre les OO-BRMS pour garantir une meilleure inter-operabilité avec les PRM. Nous appliquons ensuite notre approche au cas d'IBM Operational Decisions Manager (ODM) dans le cadre d'un prototype développé, que nous décrivons de manière générale. Enfin, nous présentons des techniques avancées permettant de compiler des expressions du langage technique d'ODM pour faciliter leur exploitation par le moteur probabiliste des PRM
In this thesis, we address the issue of uncertainty in Object-Oriented Business Rules Management Systems (OO-BRMSs). To achieve this aim, we rely on Probabilistic Relational Models (PRMs). These are an object-oriented extension of Bayesian Networks that can be exploited to efficiently model probability distributions in OO-BRMSs. Queries in an OO-BRMS are numerous and the PRM is requested very frequently, so it must provide rapid answers. For this reason, we propose, in the first part of this thesis, a new algorithm that exploits two specificities of OO-BRMSs and optimizes the probabilistic inference accordingly. First, OO-BRMS queries affect only a subset of their base; hence, the probabilities of interest concern only a subset of the PRM's random variables. Second, successive requests differ only slightly from each other. We prove theoretically the correctness of the proposed algorithm and highlight its efficiency through experimental tests. In the second part, we establish some principles for probabilistic OO-BRMSs and describe an approach to couple them with PRMs. We then apply the approach to IBM Operational Decision Manager (ODM), one of the state-of-the-art OO-BRMSs, and provide a general overview of the resulting prototype. Finally, we discuss advanced techniques for compiling elements of ODM's technical language into instructions that are exploitable by the PRM probabilistic engine.
APA, Harvard, Vancouver, ISO, and other styles
42

Klimenka, Filip. "Econometric methods for implementing decision functions." Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:3ae3ceb6-ea4d-4233-9396-2c9e15189657.

Full text
Abstract:
This thesis develops econometric methods for implementing data-based decisions. Decisions are viewed as functions of parameters which are estimated from the data. Standard methods focus on providing precise estimates of parameters, ignoring the intention to use them in decisions. My thesis focuses on designing methods that minimize the expected error in decision functions. The first chapter develops model averaging estimators in multiple regressions that minimize the mean squared error (MSE) of a chosen decision function. Our motivating example is implementing a portfolio choice rule that depends on variables included in the assets' returns specification. We characterize the asymptotic MSE of decision functions based on different models and then describe model-selection and averaging estimators that enable improvements in the MSE. The performance of our method is demonstrated with extensive simulations and empirical applications to futures data. The second chapter describes the risk improvements for model averaging using two models. This type of averaging is known as shrinkage. Since the risk improvement is over a function of parameters, this shrinkage is referred to as focused shrinkage. The estimator is a weighted average between unrestricted and restricted models. The latter is a minimum distance estimator and requires selecting a projection matrix. The risk improvement of our shrinkage estimator over maximum likelihood for arbitrary projection matrices is derived. I then show, in an application to portfolio choice, that for a specific choice of projection matrix this improvement can be substantial. The third chapter considers an application of the focused shrinkage estimator to the Global Minimum Variance (GMV) portfolio. Implementing the GMV portfolio requires estimating a covariance matrix, and the literature has offered several estimators. Focused shrinkage is particularly suitable here because it can be used to directly minimize the MSE of the GMV portfolio. We illustrate the benefits of our estimator by conducting extensive simulations and empirical applications.
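The generic shape of a shrinkage estimator, a weighted average of an unrestricted sample estimate and a restricted target judged by the decision it feeds, can be sketched for the GMV portfolio as below. This is not the focused shrinkage estimator developed in the thesis, only an illustration of the idea under made-up data.

    import numpy as np

    # Shrink the sample covariance toward a diagonal target and score each candidate
    # estimator by the variance of the GMV portfolio it produces under the true
    # covariance. All data are simulated and purely illustrative.
    rng = np.random.default_rng(3)
    p, n = 10, 60
    A = rng.normal(size=(p, p))
    true_cov = A @ A.T / p + np.eye(p)                 # arbitrary "true" covariance

    def gmv_weights(cov):
        w = np.linalg.solve(cov, np.ones(p))
        return w / w.sum()

    returns = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
    sample = np.cov(returns, rowvar=False)             # unrestricted estimate
    target = np.diag(np.diag(sample))                  # restricted (diagonal) estimate

    for lam in (0.0, 0.5, 1.0):                        # 0 = sample only, 1 = target only
        w = gmv_weights((1 - lam) * sample + lam * target)
        print("lambda=%.1f  portfolio variance under true covariance: %.3f"
              % (lam, w @ true_cov @ w))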
APA, Harvard, Vancouver, ISO, and other styles
43

Fuchs, Béatrice. "Représentation des connaissances pour le raisonnement à partir de cas : le système ROCADE." Saint-Etienne, 1997. http://www.theses.fr/1997STET4017.

Full text
Abstract:
PADIM aims to support both the design of supervision systems and the operator in the task of supervising a complex industrial system. The supervision system is a computing environment that collects data and brings it together to depict the current situation on the interface screens. The operator must choose the dashboards best suited to managing the current situation and adapt them as the situation evolves. Decision support relies on case-based reasoning (CBR) to capitalise on and reuse operators' experience. Building a CBR system in the field of industrial supervision rests on the acquisition and representation of knowledge of different kinds, with the aim of providing a precise framework for developing artificial intelligence systems based on CBR. The contribution of this research is twofold. At the knowledge level, a model of CBR was developed to describe precisely the functionalities of CBR systems in terms of tasks. It identifies the knowledge required by the reasoning tasks, the knowledge produced, the knowledge models used, and the inference mechanisms involved. This model serves as a guide to ease knowledge acquisition and modelling, and is designed to be applicable to many application domains and to a variety of cognitive tasks. The model captures both the invariant and the specific functionalities of CBR systems, and several examples of CBR systems are studied using the task model in order to illustrate its capabilities. At the symbol level, an environment for the development of CBR systems was built. The ROCADE system is an object-based knowledge representation system written in Objective-C in the NeXTSTEP environment. ROCADE has reasoning capabilities such as inheritance, matching and classification. It eases the elaboration of knowledge by collecting information from the surrounding information system, in particular the supervision system in the PADIM application. The aim of this work remains the integration of the two aspects, support for acquisition and modelling on the one hand and implementation in a development environment for CBR systems on the other, so as to provide a support for developing artificial intelligence systems that use case-based reasoning. A complete design-support application in industrial supervision, using the DESIGNER system, has already been built.
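For readers unfamiliar with case-based reasoning, the retrieval step can be conveyed with a toy similarity match between the current situation and stored cases. The sketch below is purely illustrative and bears no relation to the actual ROCADE implementation or the PADIM application.

    # Cases are stored as attribute dictionaries; the closest past case is matched
    # against the current situation and its solution (here, a dashboard) is reused.
    cases = [
        {"alarm_rate": 0.9, "load": 0.8, "dashboard": "pressure overview"},
        {"alarm_rate": 0.1, "load": 0.3, "dashboard": "routine monitoring"},
        {"alarm_rate": 0.7, "load": 0.95, "dashboard": "load balancing"},
    ]

    def similarity(case, situation):
        # simple inverse-distance similarity over the shared numeric attributes
        keys = [k for k in situation if isinstance(case.get(k), (int, float))]
        dist = sum((case[k] - situation[k]) ** 2 for k in keys) ** 0.5
        return 1.0 / (1.0 + dist)

    current = {"alarm_rate": 0.8, "load": 0.85}
    best = max(cases, key=lambda c: similarity(c, current))
    print("suggested dashboard:", best["dashboard"])   # reuses the most similar case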
APA, Harvard, Vancouver, ISO, and other styles
44

Kaufman, Maike Jennifer. "Local decision-making in multi-agent systems." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:e8ebc360-906d-4733-9f24-f3ed98735e89.

Full text
Abstract:
This thesis presents a new approach to local decision-making in multi-agent systems with varying amounts of communication. Here, local decision-making refers to action choices which are made in a decentralized fashion by individual agents based on the information which is locally available to them. The work described here is set within the multi-agent decision process framework. Unreliable, faulty or stochastic communication patterns present a challenge to these settings which usually rely on precomputed, centralised solutions to control individual action choices. Various approximate algorithms for local decision-making are developed for scenarios with and without sequentiality. The construction of these techniques is based strongly on methods of Bayesian inference. Their performance is tested on synthetic benchmark scenarios and compared to that of a more conservative approach which guarantees coordinated action choices as well as a completely decentralized solution. In addition, the method is applied to a surveillance task based on real-world data. These simulation results show that the algorithms presented here can outperform more traditional approaches in many settings and provide a means for flexible, scalable decision-making in systems with varying information exchange between agents.
APA, Harvard, Vancouver, ISO, and other styles
45

Kolling, Nils Stephen. "Decision making, the frontal lobes and foraging behaviour." Thesis, University of Oxford, 2015. http://ora.ox.ac.uk/objects/uuid:ea509f5e-dca4-44e5-9f3f-f7d6550e5b45.

Full text
Abstract:
The aim of this thesis was to understand the function of the frontal lobes during types of decisions that have thus far been mostly neglected in cognitive neuroscience. Namely, I sought to understand how decisions are made when comparisons are not about a simple set of concrete options presented, but rather require a comparison between one specific encounter and a sense of the value of the current environment (Chapters 2-3). Additionally, I wanted to understand how decisions between concrete options can be contextualized by the current environment, allowing considerations about changing environmental constraints to factor into the decision making process (Chapters 4-5). Finally, I wanted to test how the potential for future behaviours within an environment affects people's decisions (Chapter 6). In other words, how do people construct prospective value when it requires a sense of their own future behaviours? All this work was informed by concepts and models originating from optimal foraging theory, which seeks to understand animal behaviour using computational models of different ecological types of choices. Thus, this thesis offers a perspective on the neural mechanisms underlying human decision making that relates them to common problems faced by animals, and presumably humans, in ecological environments (Chapters 1 and 7). As optimal foraging theory assumes that solving these problems efficiently is highly relevant for survival, it is possible that neural structures evolved in ways that particularly accommodate the solution of these problems. Different prefrontal structures might therefore be dedicated to unique ways of solving ecological kinds of decision problems. My thesis as a whole gives some evidence for such a perspective, as dACC and vmPFC were repeatedly identified as constituting unique systems for evaluation according to different reference frames. Their competition within a wider network of areas appeared to ultimately drive decisions under changing contexts. In the future, a better understanding of the changing interactions between these prefrontal areas, which generate more complex and adaptive behaviours, will be crucial for understanding more natural choice behaviour. For this, temporally resolved neural measurements as well as causal interference will be essential.
APA, Harvard, Vancouver, ISO, and other styles
46

Bang, Dan. "On confidence in individual and group decision-making." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:e86852b9-d167-44bb-9e0f-add2183bf1f1.

Full text
Abstract:
This thesis is about the human ability to share and combine representations of the uncertainty associated with individual beliefs - an ability which is called metacognition and facilitates effective cooperation. We distinguish between two metacognitive representations: an implicit confidence variable for oneself and an explicit confidence report for sharing with others. Using visual psychophysics and computational modelling, we address the issues of optimality and flexibility in the formation and the utilisation of these representations. We show that people can compute the confidence variable in an optimal manner (the probability that a given belief is correct as per Bayesian inference). Further, we show that the mapping of this variable onto a confidence report can vary flexibly - with people adjusting their reports according to the history of reports given and feedback obtained. This optimality and flexibility is important for effective cooperation. Being a probability, the optimal confidence variable can be compared across people. However, to facilitate this comparison, people must adapt their confidence reports to each other and develop a common metric for reporting the probability that their belief is correct. We show that people solve this communication problem sub-optimally; they match each other's mean confidence and confidence distributions, regardless of whether they are equally likely to be correct or not. In addition, we show that, while people can take into account differences in underlying competence to some extent, they fail to do so adequately; they exhibit an equality bias, weighting their partner's beliefs as if they were as good or as bad as their own, regardless of true differences in their underlying competence. More generally, our results pose a problem for our current understanding of metacognition which assumes that confidence reports are stable over time. In addition, our results show that confidence reports are socially malleable, and thus raise the possibility that well-known biases, such as overconfidence, might reflect particular norms for social interaction.
APA, Harvard, Vancouver, ISO, and other styles
47

Schuck-Paim, Cynthia. "The starling as a rational decision maker." Thesis, University of Oxford, 2003. http://ora.ox.ac.uk/objects/uuid:75f83ea4-01d6-4920-a7d8-0728b88da7da.

Full text
Abstract:
A central question in behavioural and evolutionary ecology is to understand how animals make decisions between, for instance, potential mates, nesting sites, foraging patches and territories. Normative models of choice usually predict preferences between alternatives by computing their value according to some criterion and then identifying the alternative with greatest value. An important consequence of this procedure is captured in the economic concept of rationality, defined through a number of principles that are necessary for the existence of a scale of value upon which organisms base their choices. Violations of rationality are nonetheless well documented in psychological and economic studies of human choice and consumer behaviour, and have forced a reinterpretation of much of the existing data and models. Although largely unexplored in the study of animal decision-making, the systematic observation of irrationality would similarly pose serious challenges for functional approaches to behaviour. In this thesis I explore the possibility that violations of rational axioms may also be found in animal choices, using the European starling (Sturnus vulgaris) as a model species. My objectives were threefold. Firstly, I investigated the prevalence of rationality across distinct foraging paradigms, in situations involving multialternative choices, structured choice sets, choices between alternatives described by multiple attributes and risk-sensitive decisions. In a number of distinct experiments, the preferences of the starlings were consistent and stable across contexts, conforming to basic rational principles such as transitivity and regularity. A second objective was to explore possible factors underlying reported violations of rational axioms by animals. Amongst potential mechanisms, I review and examine the implications of the use of hierarchical and higher order choice rules, as well as the presence of constraints on the perception of rewards. Finally, I examine the likely effect of contextual changes on an organism's state, and consequently choice behaviour, and experimentally confirm the expectation that state-dependence in foraging preferences can underlie the observation of seemingly irrational behaviour. Altogether, my results suggest that, rather than being a common phenomenon, breaches of rationality in animals might be restricted to specific sets of parameters and conditions. They also emphasize the importance of considering the potential multitude of factors underlying violations of rationality in animal choices, and suggest that students of economic rationality in animal behaviour should also view preferences as a dynamic, state-dependent measure.
APA, Harvard, Vancouver, ISO, and other styles
48

Flack, Andrea. "Collective decision-making in homing pigeon navigation." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:55ca08f4-404d-4897-ac80-5c832f984c24.

Full text
Abstract:
This thesis focuses on conflict resolution and collective decision-making in co-navigating pigeons, Columba livia. These birds have a remarkable homing ability and frequently fly in flocks. Group navigation demands that group members reach consensus on which path to follow, but the mechanisms by which they do so remain largely unexplored. Pigeons are particularly suitable for studying these mechanisms, due to their sociality and the fact that their possession of information can easily be altered and quantified. I present the results of a series of experiments that manipulated the experience of homing pigeons in various ways so as to observe the effect of information they had previously gathered on their group behaviour. Key findings were: Previous navigational experience contributes to the establishment of leader-follower relationships. The larger the difference in experience between two co-navigating pigeons, the higher the likelihood the more experienced bird will emerge as leader. Shared homing experience through repeated joint flights can allow two pigeons to develop into a “behavioural unit”. They form spatial sub-groups when flying with less familiar birds, and perform a similar transition between compromise- and leadership-dominated flights as single birds, although they are more likely to accept compromise routes. Such previous association histories between birds can thus affect collective decision-making in larger flocks. There is a trade-off between the amount of spatial information handled and the efficiency with which such information can be applied during homing. Leading/following behaviour is influenced by the recency of the route memories. Leadership hierarchies in pigeon flocks appear resistant to changes in the navigational knowledge of a subset of their members, at least when these changes are relatively small in magnitude. The stability of the hierarchical structure might be beneficial during decision-making. Mathematical modelling suggests that underlying hierarchical social structures can increase navigational accuracy. Hierarchically organised groups with the smallest number of strong connections achieve highest accuracy. Group leader-follower dynamics resemble the underlying social structure.
APA, Harvard, Vancouver, ISO, and other styles
49

Lin, Chia-Shu. "Decision-making in the context of pain." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:3ba80922-3629-4958-a62b-7ebf75871bbf.

Full text
Abstract:
Clinical and behavioural evidence has shown that the threat value of pain biases decisions about whether a stimulus is perceived as painful or not and, if so, how intense the sensation is. This thesis aims to investigate the neural mechanisms underlying the effect of perceived threat on perceptual decisions about pain. The first study investigates the neural mechanisms underlying the effect of threat on the decision about the quality of the sensation, i.e., whether it is perceived as painful or not. The perception of pain (relative to no pain) was associated with activation in the anterior insula as well as increased connectivity between this region and the mid-cingulate cortex (MCC). Activity in the MCC was correlated with the threat-related bias toward perceiving pain. In the second study, probabilistic tractography was performed with diffusion tensor imaging to investigate the structural connectivity between subdivisions of the insula and other pain-related regions. Additional analyses revealed that the structural connectivity between the anterior insula and the MCC, and between the posterior insula and somatosensory cortices, is positively correlated with the threat-related bias toward pain. In the third study, a multivariate pattern analysis (MVPA) was performed to investigate whether pain can be decoded from functional neuroimaging data acquired during the anticipation and the receipt of pain. The results show that pain can be predicted by the pattern of neural activity in the right anterior insula during anticipation and stimulation. The fourth study investigated the effect of uncertainty about stimulation intensity, as a form of threat, on the perceived intensity of pain. Uncertainty was found to be associated with increased activation in the anterior insula. Overall, these findings suggest that a neural network consisting of the anterior insula and the MCC plays a key role in decisions about the quality and the quantity of nociceptive sensation. Results from the MVPA support the notion that perceptual decisions are encoded by a distributed network of brain regions. The variability in anatomical connections between these regions may account for individual differences in susceptibility to a threat-mediated bias toward pain.
APA, Harvard, Vancouver, ISO, and other styles
50

Hunt, Laurence T. "Modelling human decision under risk and uncertainty." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:244ce799-7397-4698-8dac-c8ca5d0b3e28.

Full text
Abstract:
Humans are unique in their ability to flexibly and rapidly adapt their behaviour and select courses of action that lead to future reward. Several ‘component processes’ must be implemented by the human brain in order to facilitate this behaviour. This thesis examines two such components; (i) the neural substrates supporting action selection during value- guided choice using magnetoencephalography (MEG), and (ii) learning the value of environmental stimuli and other people’s actions using functional magnetic resonance imaging (fMRI). In both situations, it is helpful to formally model the underlying component process, as this generates predictions of trial-to-trial variability in the signal from a brain region involved in its implementation. In the case of value-guided action selection, a biophysically realistic implementation of a drift diffusion model is used. Using this model, it is predicted that there are specific times and frequency bands at which correlates of value are seen. Firstly, there are correlates of the overall value of the two presented options, and secondly the difference in value between the options. Both correlates should be observed in the local field potential, which is closely related to the signal measured using MEG. Importantly, the content of these predictions is quite distinct from the function of the model circuit, which is to transform inputs relating to the value of each option into a categorical decision. In the case of social learning, the same reinforcement learning model is used to track both the value of two stimuli that the subject can choose between, and the advice of a confederate who is playing alongside them. As the confederate advice is actually delivered by a computer, it is possible to keep prediction error and learning rate terms for stimuli and advice orthogonal to one another, and so look for neural correlates of both social and non-social learning in the same fMRI data. Correlates of intentional inference are found in a network of brain regions previously implicated in social cognition, notably the dorsomedial prefrontal cortex, the right temporoparietal junction, and the anterior cingulate gyrus.
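The reinforcement learning component described here can be sketched with two formally identical Rescorla-Wagner updates, one tracking stimulus value and one tracking the estimated fidelity of the confederate's advice. Learning rates, task parameters and the advice-scoring rule below are our own illustrative choices, not the model fits reported in the thesis.

    import numpy as np

    # Toy two-armed task with an advisor of unknown fidelity; one prediction error
    # updates the chosen stimulus's value, another updates trust in the advice.
    rng = np.random.default_rng(4)
    alpha = 0.2
    v_stim = np.array([0.5, 0.5])         # value estimates for the two stimuli
    v_advice = 0.5                        # estimated probability that advice is good
    p_reward = np.array([0.7, 0.3])       # true, unknown reward probabilities
    advisor_accuracy = 0.8                # true, unknown fidelity of the advice

    for t in range(500):
        advice = int(np.argmax(p_reward)) if rng.random() < advisor_accuracy \
                 else int(np.argmin(p_reward))
        # epsilon-greedy choice based only on the learned stimulus values
        choice = int(rng.integers(2)) if rng.random() < 0.1 else int(np.argmax(v_stim))
        reward = float(rng.random() < p_reward[choice])

        v_stim[choice] += alpha * (reward - v_stim[choice])          # stimulus prediction error
        advice_was_good = float((choice == advice) == bool(reward))  # advice judged from the outcome
        v_advice += alpha * (advice_was_good - v_advice)             # advice prediction error

    print("stimulus values:", np.round(v_stim, 2), " advice fidelity:", round(v_advice, 2))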
APA, Harvard, Vancouver, ISO, and other styles
