Academic literature on the topic 'Inference Without Moments'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Inference Without Moments.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Inference Without Moments":

1

Zhang, Zhiyi. "Several Basic Elements of Entropic Statistics." Entropy 25, no. 7 (July 13, 2023): 1060. http://dx.doi.org/10.3390/e25071060.

Abstract:
Inspired by the development in modern data science, a shift is increasingly visible in the foundation of statistical inference, away from a real space, where random variables reside, toward a nonmetrized and nonordinal alphabet, where more general random elements reside. While statistical inferences based on random variables are theoretically well supported in the rich literature of probability and statistics, inferences on alphabets, mostly by way of various entropies and their estimation, are less systematically supported in theory. Without the familiar notions of neighborhood, real or complex moments, tails, et cetera, associated with random variables, probability and statistics based on random elements on alphabets need more attention to foster a sound framework for rigorous development of entropy-based statistical exercises. In this article, several basic elements of entropic statistics are introduced and discussed, including notions of general entropies, entropic sample spaces, entropic distributions, entropic statistics, entropic multinomial distributions, entropic moments, and entropic basis, among other entropic objects. In particular, an entropic-moment-generating function is defined and it is shown to uniquely characterize the underlying distribution in entropic perspective, and, hence, all entropies. An entropic version of the Glivenko–Cantelli convergence theorem is also established.
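As a concrete illustration of inference on an alphabet without moments, here is a minimal sketch (not code from the article; the sample and function name are invented) of the classical plug-in entropy estimator, which uses only empirical letter frequencies and no metric, ordering, or moment structure on the alphabet:

```python
from collections import Counter
import math

def plugin_entropy(sample):
    """Plug-in (empirical) Shannon entropy of a sample from a countable
    alphabet. Only letter frequencies are used: no ordering, metric, or
    real/complex moments of the letters themselves."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# The letters need no numeric structure at all.
sample = ["red", "red", "blue", "green", "red", "blue"]
h = plugin_entropy(sample)
```

For the sample above the estimate is the usual negative sum of p-hat log p-hat with empirical frequencies (1/2, 1/3, 1/6); the refined entropic objects (entropic moments, entropic moment-generating functions) are the subject of the article itself.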
2

Chernozhukov, Victor, Whitney K. Newey, and Andres Santos. "Constrained Conditional Moment Restriction Models." Econometrica 91, no. 2 (2023): 709–36. http://dx.doi.org/10.3982/ecta13830.

Abstract:
Shape restrictions have played a central role in economics as both testable implications of theory and sufficient conditions for obtaining informative counterfactual predictions. In this paper, we provide a general procedure for inference under shape restrictions in identified and partially identified models defined by conditional moment restrictions. Our test statistics and proposed inference methods are based on the minimum of the generalized method of moments (GMM) objective function with and without shape restrictions. Uniformly valid critical values are obtained through a bootstrap procedure that approximates a subset of the true local parameter space. In an empirical analysis of the effect of childbearing on female labor supply, we show that employing shape restrictions in linear instrumental variables (IV) models can lead to shorter confidence regions for both local and average treatment effects. Other applications we discuss include inference for the variability of quantile IV treatment effects and for bounds on average equivalent variation in a demand model with general heterogeneity.
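The core construction, comparing the restricted and unrestricted minima of a GMM objective, can be sketched in a toy just-identified IV model (a hypothetical illustration, not the authors' procedure; their uniformly valid critical values come from a dedicated bootstrap):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                   # instrument
x = z + 0.5 * rng.normal(size=n)         # endogenous regressor
y = 0.8 * x + rng.normal(size=n)         # true coefficient 0.8 satisfies b >= 0

def gmm_objective(b):
    g = np.mean(z * (y - b * x))         # sample moment E[z(y - xb)]
    return n * g ** 2                    # identity-weighted GMM criterion

grid = np.linspace(-2.0, 2.0, 4001)
q = np.array([gmm_objective(b) for b in grid])
q_unres = q.min()                        # unrestricted minimum
q_res = q[grid >= 0.0].min()             # minimum under the shape restriction b >= 0
test_stat = q_res - q_unres
```

Because the restricted parameter set is a subset of the unrestricted one, the statistic is nonnegative by construction; the hard part, which the paper addresses, is obtaining critical values that remain valid uniformly near the boundary of the restriction.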
3

Montiel Olea, José Luis, Mikkel Plagborg-Møller, and Eric Qian. "SVAR Identification from Higher Moments: Has the Simultaneous Causality Problem Been Solved?" AEA Papers and Proceedings 112 (May 1, 2022): 481–85. http://dx.doi.org/10.1257/pandp.20221047.

Abstract:
Two recent strands of the structural vector autoregression literature use higher moments for identification, exploiting either non-Gaussianity or heteroskedasticity. These approaches achieve point identification without exclusion or sign restrictions. We review this work critically and contrast its goals with the separate research program that has pushed for macroeconometrics to rely more heavily on credible economic restrictions. Identification from higher moments imposes stronger assumptions on the shock process than second-order methods do. We recommend that these assumptions be tested. Since inference from higher moments places high demands on a finite sample, weak identification issues should be given priority by applied users.
4

Vendeville, Nathalie, Nathalie Blanc, and Claire Brechet. "A Drawing Task to Assess Emotion Inference in Language-Impaired Children." Journal of Speech, Language, and Hearing Research 58, no. 5 (October 2015): 1563–69. http://dx.doi.org/10.1044/2015_jslhr-l-14-0343.

Abstract:
Purpose: Studies investigating the ability of children with language impairment (LI) to infer emotions rely on verbal responses (which can be challenging for these children) and/or the selection of a card representing an emotion (which limits the response range). In contrast, a drawing task might allow a broad spectrum of responses without involving language. This study used a drawing task to compare the ability to make emotional inferences in children with and without LI. Method: Twenty-two children with LI and 22 typically developing children ages 6 to 10 years were assessed in school during 3 sessions. They were asked to listen to audio stories. At specific moments, the experimenter stopped the recording and asked children to complete the drawing of a face to depict the emotion felt by the story's character. Three adult study-blind judges were subsequently asked to evaluate the expressiveness of the drawings. Results: Children with LI had more difficulty than typically developing children making emotional inferences. Children with LI also made more errors of different valence than their typically developing peers. Conclusion: Our findings confirm that children with LI show difficulty in producing emotional inferences, even when performing a drawing task, a relatively language-free response mode.
5

Montano Herrera, Liliana, Tobias Eilert, I.-Ting Ho, Milena Matysik, Michael Laussegger, Ralph Guderlei, Bernhard Schrantz, Alexander Jung, Erich Bluhmki, and Jens Smiatek. "Holistic Process Models: A Bayesian Predictive Ensemble Method for Single and Coupled Unit Operation Models." Processes 10, no. 4 (March 29, 2022): 662. http://dx.doi.org/10.3390/pr10040662.

Abstract:
The coupling of individual models in terms of end-to-end calculations for unit operations in manufacturing processes is a challenging task. We present a probability distribution-based approach for the combined outcomes of parametric and non-parametric models. With this so-called Bayesian predictive ensemble, the statistical moments such as mean value and standard deviation can be accurately computed without any further approximation. It is shown that the ensemble of different model predictions leads to an uninformed prior distribution, which can be transformed into a predictive posterior distribution using Bayesian inference and numerical Markov Chain Monte Carlo calculations. We demonstrate the advantages of our method using several numerical examples. Our approach is not restricted to certain unit operations, and can also be used for the more robust interpretation and assessment of model predictions in general.
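The exact-moment property of a predictive ensemble can be seen in a small sketch (illustrative only; the function and values are invented): for a mixture of component predictive distributions, the mean and standard deviation follow from each component's first two moments with no further approximation:

```python
import math

def ensemble_moments(means, sds, weights=None):
    """Exact mean and standard deviation of a weighted mixture of
    component predictive distributions. No sampling is needed: the
    mixture's second moment is the weighted average of each component's
    second moment (variance plus squared mean)."""
    k = len(means)
    if weights is None:
        weights = [1.0 / k] * k          # equally weighted ensemble
    mean = sum(w * mu for w, mu in zip(weights, means))
    second = sum(w * (sd ** 2 + mu ** 2)
                 for w, mu, sd in zip(weights, means, sds))
    return mean, math.sqrt(second - mean ** 2)

m, s = ensemble_moments([1.0, 3.0], [0.5, 0.5])
```

The mixture of two equally weighted components with means 1 and 3 has mean 2 and variance 1.25: the spread between the component means contributes to the ensemble variance, which is exactly the behaviour the abstract describes.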
6

Andersen, Torben G. "SIMULATION-BASED ECONOMETRIC METHODS." Econometric Theory 16, no. 1 (February 2000): 131–38. http://dx.doi.org/10.1017/s0266466600001080.

Abstract:
The accessibility of high-performance computing power has always influenced theoretical and applied econometrics. Gouriéroux and Monfort begin their recent offering, Simulation-Based Econometric Methods, with a stylized three-stage classification of the history of statistical econometrics. In the first stage, lasting through the 1960's, models and estimation methods were designed to produce closed-form expressions for the estimators. This spurred thorough investigation of the standard linear model, linear simultaneous equations with the associated instrumental variable techniques, and maximum likelihood estimation within the exponential family. During the 1970's and 1980's the development of powerful numerical optimization routines led to the exploration of procedures without closed-form solutions for the estimators. During this period the general theory of nonlinear statistical inference was developed, and nonlinear micro models such as limited dependent variable models and nonlinear time series models, e.g., ARCH, were explored. The associated estimation principles included maximum likelihood (beyond the exponential family), pseudo-maximum likelihood, nonlinear least squares, and generalized method of moments. Finally, the third stage considers problems without a tractable analytic criterion function. Such problems almost invariably arise from the need to evaluate high-dimensional integrals. The idea is to circumvent the associated numerical problems by a simulation-based approach. The main requirement is therefore that the model may be simulated given the parameters and the exogenous variables. The approach delivers simulated counterparts to standard estimation procedures and has inspired the development of entirely new procedures based on the principle of indirect inference.
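The third-stage idea, estimating by matching simulated moments to observed moments when the criterion function itself is intractable, can be sketched as follows (a toy Gaussian example with an invented setup; real applications involve models that can only be simulated, not evaluated in closed form):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=2000)      # "observed" data, true mean 2
data_moments = np.array([data.mean(), data.var()])

def simulate(theta, n=2000, seed=123):
    """A model that can be simulated given its parameter (here, the mean
    of a unit-variance Gaussian). Common random numbers (fixed seed) keep
    the criterion smooth in theta."""
    r = np.random.default_rng(seed)
    return r.normal(loc=theta, scale=1.0, size=n)

def smm_distance(theta):
    sim = simulate(theta)
    sim_moments = np.array([sim.mean(), sim.var()])
    diff = sim_moments - data_moments
    return diff @ diff                                 # identity weighting matrix

grid = np.linspace(0.0, 4.0, 401)
theta_hat = grid[np.argmin([smm_distance(t) for t in grid])]
```

This is the simulated method of moments in its crudest form; indirect inference replaces the raw moments with the parameters of an auxiliary model, but the match-simulated-to-observed logic is the same.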
7

Chernozhukov, Victor, Denis Chetverikov, and Kengo Kato. "Inference on Causal and Structural Parameters using Many Moment Inequalities." Review of Economic Studies 86, no. 5 (November 16, 2018): 1867–900. http://dx.doi.org/10.1093/restud/rdy065.

Abstract:
This article considers the problem of testing many moment inequalities where the number of moment inequalities, denoted by $p$, is possibly much larger than the sample size $n$. There is a variety of economic applications where solving this problem makes it possible to carry out inference on causal and structural parameters; a notable example is the market structure model of Ciliberto and Tamer (2009), where $p=2^{m+1}$ with $m$ being the number of firms that could possibly enter the market. We consider the test statistic given by the maximum of $p$ Studentized (or $t$-type) inequality-specific statistics, and analyse various ways to compute critical values for the test statistic. Specifically, we consider critical values based upon (1) the union bound combined with a moderate deviation inequality for self-normalized sums, (2) the multiplier and empirical bootstraps, and (3) two-step and three-step variants of (1) and (2) that incorporate the selection of uninformative inequalities that are far from binding and a novel selection of weakly informative inequalities that are potentially binding but do not provide first-order information. We prove the validity of these methods, showing that under mild conditions, they lead to tests whose error in size decreases polynomially in $n$ while allowing $p$ to be much larger than $n$; indeed, $p$ can be of order $\exp (n^{c})$ for some $c > 0$. Importantly, all these results hold without any restriction on the correlation structure between the $p$ Studentized statistics, and also hold uniformly with respect to suitably large classes of underlying distributions. Moreover, in the online supplement, we show the validity of a test based on the block multiplier bootstrap in the case of dependent data under some general mixing conditions.
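A minimal sketch of the paper's basic ingredients, the max of Studentized statistics and a multiplier-bootstrap critical value, might look as follows (an invented small-scale example; the paper's theory covers $p$ far larger than $n$ and adds moderate-deviation bounds and inequality-selection refinements):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 50
X = rng.normal(loc=-0.1, scale=1.0, size=(n, p))   # H0: E[X_j] <= 0 for all j (true here)

mu = X.mean(axis=0)
sd = X.std(axis=0, ddof=1)
stat = np.sqrt(n) * np.max(mu / sd)                # max of p Studentized statistics

B = 1000
boot = np.empty(B)
centered = (X - mu) / sd                           # Studentized, recentered data
for b in range(B):
    e = rng.normal(size=n)                         # Gaussian multipliers
    boot[b] = np.sqrt(n) * np.max(e @ centered / n)
crit = np.quantile(boot, 0.95)                     # multiplier-bootstrap critical value
reject = stat > crit
```

Recentering by the sample means is what lets the bootstrap mimic the null distribution even when some inequalities are slack; the paper's two- and three-step procedures sharpen this by discarding inequalities that are clearly far from binding.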
8

Andrews, Donald W. K., and Patrik Guggenberger. "VALIDITY OF SUBSAMPLING AND “PLUG-IN ASYMPTOTIC” INFERENCE FOR PARAMETERS DEFINED BY MOMENT INEQUALITIES." Econometric Theory 25, no. 3 (June 2009): 669–709. http://dx.doi.org/10.1017/s0266466608090257.

Abstract:
This paper considers inference for parameters defined by moment inequalities and equalities. The parameters need not be identified. For a specified class of test statistics, this paper establishes the uniform asymptotic validity of subsampling, m out of n bootstrap, and "plug-in asymptotic" tests and confidence intervals for such parameters. Establishing uniform asymptotic validity is crucial in moment inequality problems because the pointwise asymptotic distributions of the test statistics of interest have discontinuities as functions of the true distribution that generates the observations. The size results are quite general because they hold without specifying the particular form of the moment conditions; only 2 + δ finite moments are required. The results allow for independent and identically distributed (i.i.d.) and dependent observations and for preliminary consistent estimation of identified parameters.
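Subsampling itself is simple to sketch (a hypothetical example, not the paper's moment inequality setting): approximate the sampling distribution of a root from many subsamples drawn without replacement, then invert its quantiles into a confidence interval:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = rng.standard_t(df=3, size=n)        # heavy tails: only low-order moments are finite
theta_hat = x.mean()

m = 100                                  # subsample size: m -> inf with m/n -> 0
B = 2000
stats = np.empty(B)
for b in range(B):
    sub = rng.choice(x, size=m, replace=False)     # subsampling, without replacement
    stats[b] = np.sqrt(m) * (sub.mean() - theta_hat)

lo, hi = np.quantile(stats, [0.025, 0.975])
ci = (theta_hat - hi / np.sqrt(n), theta_hat - lo / np.sqrt(n))
```

The appeal in non-regular problems is that subsampling only requires the root to have *some* limit distribution, not a limit that is continuous in the data-generating process, which is exactly what fails pointwise in moment inequality models.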
9

Matsushita, Yukitoshi, and Taisuke Otsu. "LIKELIHOOD INFERENCE ON SEMIPARAMETRIC MODELS WITH GENERATED REGRESSORS." Econometric Theory 36, no. 4 (November 25, 2019): 626–57. http://dx.doi.org/10.1017/s026646661900029x.

Abstract:
Hahn and Ridder (2013, Econometrica 81, 315–340) formulated influence functions of semiparametric three-step estimators where generated regressors are computed in the first step. This class of estimators covers several important examples for empirical analysis, such as production function estimators by Olley and Pakes (1996, Econometrica 64, 1263–1297) and propensity score matching estimators for treatment effects by Heckman, Ichimura, and Todd (1998, Review of Economic Studies 65, 261–294). The present article studies a nonparametric likelihood-based inference method for the parameters in such three-step estimation problems. In particular, we apply the general empirical likelihood theory of Bravo, Escanciano, and van Keilegom (2018, Annals of Statistics, forthcoming) to modify semiparametric moment functions to account for influences from plug-in estimates into the above important setup, and show that the resulting likelihood ratio statistic becomes asymptotically pivotal without undersmoothing in the first and second step nonparametric estimates.
10

Adams, Kelsey L., and Philip M. Grove. "The Effect of Transient Location on the Resolution of Bistable Visual and Audiovisual Motion Sequences." Perception 47, no. 9 (July 20, 2018): 927–42. http://dx.doi.org/10.1177/0301006618788796.

Abstract:
We examined the attention and inference accounts of audiovisual perception using the stream/bounce display, a visual stimulus wherein two identical objects move toward each other, completely superimpose, then move apart. This display has two candidate percepts: stream past each other or bounce off each other. Presented without additional visual or auditory transients, the motion sequence tends to yield the streaming percept, but when coupled with a tone or flash at the point of coincidence, the response bias flips toward bouncing. We explored two competing accounts of this effect: the attentional hypothesis and the inference hypothesis. Participants watched a series of motion sequences where a transient, when present, occurred at the moment of coincidence either colocalised with the motion sequence (congruent presentation) or on the opposite side of the display (incongruent presentation). Assuming the spotlight or zoom lens metaphor, an attentional account predicts that incongruent presentations should be associated with a higher percentage of bouncing responses than congruent presentations, while the inferential account predicts the opposite effect. No effect was found for tone-only trials. However, in trials containing a visual transient, results showed higher proportions of bounce responses within congruent over incongruent presentations, favouring the inference hypothesis over a spotlight or zoom lens attentional account.

Dissertations / Theses on the topic "Inference Without Moments":

1

Kandji, Baye Matar. "Stochastic recurrent equations : structure, statistical inference, and financial applications." Electronic Thesis or Diss., Institut polytechnique de Paris, 2023. http://www.theses.fr/2023IPPAG004.

Abstract:
We are interested in the theoretical properties of stochastic recurrent equations (SRE) and their applications in finance. These models are widely used in econometrics, including financial econometrics, to describe the dynamics of various processes such as the volatility of financial returns. However, the probability structure and statistical properties of these models are still not well understood, especially when the model is considered in infinite dimensions or driven by non-independent processes. These two features lead to significant difficulties in the theoretical study of these models. In this context, we explore the existence of stationary solutions and the statistical and probabilistic properties of these solutions. We establish new properties of the trajectory of the stationary solution of SREs, which we use to study the asymptotic properties of the quasi-maximum likelihood estimator (QMLE) of GARCH-type (generalized autoregressive conditional heteroskedasticity) conditional volatility models. In particular, we study the stationarity and statistical inference of semi-strong GARCH(p,q) models, in which the innovation process is not necessarily independent. We establish the consistency of the QMLE of semi-strong GARCH models without the commonly used assumption that the stationary distribution admits a small-order moment. In addition, we are interested in two-factor volatility GARCH models (GARCH-MIDAS), which combine a long-run and a short-run volatility component. These models were recently introduced by Engle et al. (2013) and have the particularity of admitting stationary solutions with heavy-tailed distributions. They are now widely used, but their statistical properties have received little attention so far. We show the consistency and asymptotic normality of the QMLE of GARCH-MIDAS models and provide various test procedures to evaluate the presence of long-run volatility in these models.
We also illustrate our results with simulations and applications to real financial data. Finally, we extend a result of Kesten (1975) on the growth rate of additive sequences to superadditive processes. From this result, we derive generalizations of the contraction property of random matrices to products of stochastic operators. We use these results to establish necessary and sufficient conditions for the existence of stationary solutions of the affine SRE with positive coefficients in the space of continuous functions. This class of models includes most conditional volatility models, including functional GARCH models.
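The QMLE criterion at the heart of the thesis can be sketched for a GARCH(1,1) (an illustrative implementation with invented parameter values; the thesis establishes its asymptotics under much weaker assumptions, e.g., non-independent innovations and no moment condition on the stationary law):

```python
import math
import random

def garch11_qml_loglik(returns, omega, alpha, beta):
    """Gaussian quasi log-likelihood of a GARCH(1,1): the criterion behind
    the QMLE. Gaussianity of the innovations is only used to form the
    objective, not assumed to hold."""
    n = len(returns)
    s2 = sum(r * r for r in returns) / n             # initialize at sample variance
    ll = 0.0
    for r in returns:
        ll -= 0.5 * (math.log(2 * math.pi * s2) + r * r / s2)
        s2 = omega + alpha * r * r + beta * s2       # volatility recursion
    return ll

# Simulate a strong GARCH(1,1) path for illustration.
random.seed(0)
omega, alpha, beta = 0.1, 0.1, 0.8
s2 = omega / (1.0 - alpha - beta)                    # start at stationary variance
returns = []
for _ in range(2000):
    returns.append(math.sqrt(s2) * random.gauss(0.0, 1.0))
    s2 = omega + alpha * returns[-1] ** 2 + beta * s2

ll_true = garch11_qml_loglik(returns, 0.1, 0.1, 0.8)
ll_wrong = garch11_qml_loglik(returns, 0.1, 0.5, 0.1)
```

Maximizing this criterion over (omega, alpha, beta) gives the QMLE; the quasi-likelihood at the true parameters should dominate badly misspecified ones, as the comparison above illustrates.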

Book chapters on the topic "Inference Without Moments":

1

Roy, Abhishek Ghosh, and Naba Kumar Peyada. "Aircraft Aerodynamic Parameter Estimation Using Intelligent Estimation Algorithms." In Nature-Inspired Algorithms for Big Data Frameworks, 276–88. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-5852-1.ch011.

Abstract:
The application of an adaptive neuro-fuzzy inference system (ANFIS)-based particle swarm optimization (PSO) algorithm to the problem of aerodynamic modeling and optimal parameter estimation for aircraft is addressed in this chapter. The ANFIS-based PSO optimizer constitutes an aircraft model in a restricted sense, capable of predicting generalized force and moment coefficients from measured motion and control variables only, without the formal requirement of conventional variables or their time derivatives. It has been shown that such an approximate model can be used to extract equivalent stability and control derivatives of a rigid aircraft.
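PSO itself is easy to sketch; the following minimal one-dimensional optimizer (an invented example, not the chapter's ANFIS hybrid) recovers a hypothetical moment-coefficient slope by minimizing squared prediction error:

```python
import random

def pso_minimize(f, lo, hi, n_particles=30, iters=200, seed=0):
    """Minimal 1-D particle swarm optimizer: each particle is pulled
    toward its own best position and the swarm's best position."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best = pos[:]                              # per-particle best positions
    best_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: best_f[i])
    gbest, gbest_f = best[g], best_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (best[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            fi = f(pos[i])
            if fi < best_f[i]:
                best[i], best_f[i] = pos[i], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i], fi
    return gbest

# Hypothetical task: recover a slope of 2.3 from deterministically perturbed samples.
data = [(a, 2.3 * a + 0.01 * ((a * 7) % 3 - 1)) for a in [0.1 * k for k in range(20)]]
slope = pso_minimize(lambda b: sum((y - b * a) ** 2 for a, y in data), -10.0, 10.0)
```

In the chapter's setting, the decision variables would be the many ANFIS membership and consequent parameters rather than a single slope, but the update rule is the same.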
2

Masuta, Hiroyuki, Tatsuo Motoyoshi, Kei Sawai, Ken'ichi Koyanagi, Toru Oshima, and Hun-Ok Lim. "Direct Perception and Action Decision for Unknown Object Grasping." In Rapid Automation, 1372–87. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8060-7.ch064.

Abstract:
This paper discusses the direct perception of an unknown object and the action decision to grasp an unknown object using a depth sensor for social robots. Conventional methods estimate accurate physical parameters when a robot wants to grasp an unknown object. In contrast, we propose a perceptual system based on the invariant concept in ecological psychology, which perceives the information relevant to the action of the robot. First, we proposed a plane-detection-based approach for perceiving an unknown object. In this paper, we propose the sensation of grasping, which is expressed using the inertia tensor and applied with fuzzy inference using the relation between the principal moments of inertia. The sensation of grasping supports the decision for the grasping action directly, without inferring physical values such as size, posture, and shape. As experimental results, we show that the sensation of grasping expresses, in one parameter, the relative position and posture between the robot and the object and the embodiment of the robot arm. We also verify the validity of the action decision based on the sensation of grasping.
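The geometric quantity the method builds on, the principal moments of inertia of the sensed object, can be sketched with NumPy (an invented point-mass example; the chapter's fuzzy rules over these moments are not reproduced here):

```python
import numpy as np

def principal_moments(points, masses):
    """Principal moments of inertia of a point-mass cloud (e.g. an object's
    depth-sensor point cloud): sorted eigenvalues of the inertia tensor
    taken about the centre of mass."""
    p = np.asarray(points, dtype=float)
    m = np.asarray(masses, dtype=float)
    c = (m[:, None] * p).sum(axis=0) / m.sum()       # centre of mass
    q = p - c
    r2 = (q ** 2).sum(axis=1)
    inertia = sum(mi * (ri2 * np.eye(3) - np.outer(qi, qi))
                  for mi, ri2, qi in zip(m, r2, q))
    return np.sort(np.linalg.eigvalsh(inertia))

# A rod of two unit masses along the x-axis: one principal moment vanishes.
pm = principal_moments([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]], [1.0, 1.0])
```

Ratios between the three principal moments summarize the object's elongation and flatness, which is the kind of single-glance shape information a fuzzy grasping rule can consume.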
3

"existing code correlating a whistle with the information that now is the moment to attack. The information is obvious enough: it is the only information that A could conceivably have intended to make manifest in the circumstances. Could not the repetition of such a situation lead to the development of a code? Imagine that the two prisoners, caught again, find themselves in the same predicament: again a whistle, again an escape, and again they are caught. The next time, prisoner B, who has not realised that both guards are distracted, hears prisoner A whistle: this time, fortunately, B does not have to infer what the whistle is intended to make manifest: he knows. The whistle has become a signal associated by an underlying code to the message ‘Let us overpower our guards now!’ Inferential theorists might be tempted to see language as a whole as having developed in this way: to see conventional meanings as growing out of natural inferences. This is reminiscent of the story of how Rockefeller became a millionaire. One day, when he was young and very poor, Rockefeller found a one-cent coin in the street. He bought an apple, polished it, sold it for two cents, bought two apples, polished them, sold them for four cents . . . After one month he bought a cart, after two years he was about to buy a grocery store, when he inherited the fortune of his millionaire uncle. We will never know how far hominid efforts at conventionalising inference might have gone towards establishing a full-fledged human language. The fact is that the development of human languages was made possible by a specialised biological endowment. Whatever the origin of the language or code employed, a piece of coded behaviour may be used ostensively – that is, to provide two layers of information: a basic layer of information, which may be about anything at all, and a second layer consisting of the information that the first layer of information has been intentionally made manifest. 
When a coded signal, or any other arbitrary piece of behaviour, is used ostensively, the evidence displayed bears directly on the individual’s intention, and only indirectly on the basic layer of information that she intends to make manifest. We are now, of course, dealing with standard cases of Gricean communication. Is there a dividing line between instances of ostension which one would be more inclined to describe as ‘showing something’, and clear cases of communication where the communicator unquestionably ‘means something’? One of Grice’s main concerns was to draw such a line: to distinguish what he called ‘natural meaning’ – smoke meaning fire, clouds meaning rain, and so on – from ‘non-natural meaning’: the word ‘fire’ meaning fire, Peter’s utterance meaning that it will rain, and so on. Essential to this distinction was the third type of communicator’s intention Grice mentioned in his analysis: a true communicator intends the recognition of his informative intention to function as at least part of the audience’s reason for fulfilling that intention. In other words, the first, basic, layer of information must not be entirely recoverable without reference to the second. What we have tried to show so far in this section is that there are not two distinct and well-defined classes, but a continuum of cases of ostension ranging from ‘showing’, where strong direct evidence for the basic layer of information is provided, to ‘saying that’, where all the evidence is indirect. Even in our very." In Pragmatics and Discourse, 158. Routledge, 2005. http://dx.doi.org/10.4324/9780203994597-29.


Conference papers on the topic "Inference Without Moments":

1

Garita, Francesco, Hans Yu, and Matthew P. Juniper. "Assimilation of Experimental Data to Create a Quantitatively-Accurate Reduced Order Thermoacoustic Model." In ASME Turbo Expo 2020: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/gt2020-14929.

Abstract:
We combine a thermoacoustic experiment with a thermoacoustic reduced order model using Bayesian inference to accurately learn the parameters of the model, rendering it predictive. The experiment is a vertical Rijke tube containing an electric heater. The heater drives a base flow via natural convection, and thermoacoustic oscillations via velocity-driven heat release fluctuations. The decay rates and frequencies of these oscillations are measured every few seconds by acoustically forcing the system via a loudspeaker placed at the bottom of the tube. More than 320,000 temperature measurements are used to compute the state and parameters of the base flow model using the Ensemble Kalman Filter. A wave-based network model is then used to describe the acoustics inside the tube. We balance momentum and energy at the boundary between two adjacent elements, and model the viscous and thermal dissipation mechanisms in the boundary layer and at the heater and thermocouple locations. Finally, we tune the parameters of two different thermoacoustic models on an experimental dataset that comprises more than 40,000 experiments. This study shows that, with thorough Bayesian inference, a qualitative model can become quantitatively accurate, without overfitting, as long as it contains the most influential physical phenomena.
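A stochastic ensemble Kalman update, the workhorse behind the state-and-parameter estimation described above, can be sketched for a single scalar parameter (an invented toy; the paper assimilates hundreds of thousands of temperature measurements into a base-flow model):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """One stochastic-EnKF analysis step: each ensemble member is nudged
    toward a perturbed copy of the observation using sample covariances,
    so parameters carried in the state vector are learned from data."""
    n_ens = ensemble.shape[1]
    Hx = obs_op @ ensemble                              # predicted observations
    x_mean = ensemble.mean(axis=1, keepdims=True)
    y_mean = Hx.mean(axis=1, keepdims=True)
    Pxy = (ensemble - x_mean) @ (Hx - y_mean).T / (n_ens - 1)
    Pyy = (Hx - y_mean) @ (Hx - y_mean).T / (n_ens - 1) + obs_var * np.eye(1)
    K = Pxy @ np.linalg.inv(Pyy)                        # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=(1, n_ens))
    return ensemble + K @ (perturbed - Hx)

rng = np.random.default_rng(4)
ens = rng.normal(0.0, 2.0, size=(1, 200))               # prior ensemble for one parameter
H = np.eye(1)
for _ in range(20):                                     # assimilate repeated measurements
    y = 1.5 + rng.normal(0.0, 0.1)                      # noisy observations of truth 1.5
    ens = enkf_update(ens, y, H, 0.1 ** 2, rng)
post_mean = ens.mean()
```

With each assimilation the ensemble tightens around the value consistent with the data, which is how a qualitative model's unknown parameters become quantitatively constrained.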
2

Parasuraman, S., V. Ganapathy, and Bijan Shirinzadeh. "Behavior Based Robot Navigation: Resolving Behavior Conflicts Using Fuzzy Inference System." In ASME 7th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2004. http://dx.doi.org/10.1115/esda2004-58172.

Abstract:
Conflict resolution is the control decision process that must take place when several fuzzy behavior rules fire at once. In a behavior-based robot navigation system, control of the robot is shared between a set of perception-action units, called behaviors. In other words, behavior selection is the way an agent selects the most appropriate or most relevant next action to take at a particular moment, when facing a particular problem. Based on selective sensory information, each behavior produces an immediate reaction to control the robot with respect to a particular objective, i.e., a narrow aspect of the robot’s overall task such as obstacle avoidance or goal seeking. Behaviors with different and possibly incommensurable objectives may produce conflicting actions that are seemingly irreconcilable. The main issue in the design of behavior-based robot control systems is the formulation of an effective mechanism to coordinate the behaviors’ activities without conflicts during navigation. This paper presents techniques to design the behaviors and resolve behavior conflicts, based on the Situation Context of Applicability (SCA) of the environments.
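A common fuzzy conflict-resolution scheme, weighting each behavior's command by the degree to which its triggering situation applies, can be sketched as follows (membership shapes and steering values are invented, not the paper's SCA design):

```python
def near(d, full=0.5, zero=2.0):
    """Membership of 'obstacle is near': 1 below `full` metres,
    fading linearly to 0 at `zero` metres."""
    if d <= full:
        return 1.0
    if d >= zero:
        return 0.0
    return (zero - d) / (zero - full)

def blend(obstacle_distance):
    """Defuzzify two conflicting behaviors (avoid-obstacle vs. seek-goal)
    with a membership-weighted average of their steering commands."""
    w_avoid = near(obstacle_distance)
    w_goal = 1.0 - w_avoid
    steer_avoid, steer_goal = 0.8, -0.2   # hypothetical commands (radians)
    return w_avoid * steer_avoid + w_goal * steer_goal
```

At intermediate distances both behaviors contribute, so the robot transitions smoothly between avoiding and goal seeking instead of switching abruptly between irreconcilable commands.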
