To see the other types of publications on this topic, follow the link: Perceptual interactions.

Journal articles on the topic 'Perceptual interactions'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Perceptual interactions.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Basirat, Anahita, Jean-Luc Schwartz, and Marc Sato. "Perceptuo-motor interactions in the perceptual organization of speech: evidence from the verbal transformation effect." Philosophical Transactions of the Royal Society B: Biological Sciences 367, no. 1591 (April 5, 2012): 965–76. http://dx.doi.org/10.1098/rstb.2011.0374.

Abstract:
The verbal transformation effect (VTE) refers to perceptual switches while listening to a speech sound repeated rapidly and continuously. It is a specific case of perceptual multistability providing a rich paradigm for studying the processes underlying the perceptual organization of speech. While the VTE has been mainly considered as a purely auditory effect, this paper presents a review of recent behavioural and neuroimaging studies investigating the role of perceptuo-motor interactions in the effect. Behavioural data show that articulatory constraints and visual information from the speaker's articulatory gestures can influence verbal transformations. In line with these data, functional magnetic resonance imaging and intracranial electroencephalography studies demonstrate that articulatory-based representations play a key role in the emergence and the stabilization of speech percepts during a verbal transformation task. Overall, these results suggest that perceptuo (multisensory)-motor processes are involved in the perceptual organization of speech and the formation of speech perceptual objects.
2

Kramer, Peter, Ivilin Stoianov, Carlo Umiltà, and Marco Zorzi. "Interactions between perceptual and numerical space." Psychonomic Bulletin & Review 18, no. 4 (May 12, 2011): 722–28. http://dx.doi.org/10.3758/s13423-011-0104-y.

3

Yarrow, Kielan, Patrick Haggard, and John C. Rothwell. "Vibrotactile–Auditory Interactions are Post-Perceptual." Perception 37, no. 7 (January 2008): 1114–30. http://dx.doi.org/10.1068/p5824.

4

Cook, R., C. Aichelburg, P. Shah, and A. Johnston. "Perceptual interactions between dynamic facial features." Journal of Vision 14, no. 10 (August 22, 2014): 564. http://dx.doi.org/10.1167/14.10.564.

5

Polat, Uri, and Dov Sagi. "The architecture of perceptual spatial interactions." Vision Research 34, no. 1 (January 1994): 73–78. http://dx.doi.org/10.1016/0042-6989(94)90258-5.

6

Holly, Jan E. "Vestibular coriolis effect differences modeled with three-dimensional linear-angular interactions." Journal of Vestibular Research 14, no. 6 (December 1, 2004): 443–60. http://dx.doi.org/10.3233/ves-2004-14603.

Abstract:
The vestibular coriolis (or "cross-coupling") effect is traditionally explained by cross-coupled angular vectors, which, however, do not explain the differences in perceptual disturbance under different acceleration conditions. For example, during head roll tilt in a rotating chair, the magnitude of perceptual disturbance is affected by a number of factors, including acceleration or deceleration of the chair rotation or a zero-g environment. Therefore, it has been suggested that linear-angular interactions play a role. The present research investigated whether these perceptual differences and others involving linear coriolis accelerations could be explained under one common framework: the laws of motion in three dimensions, which include all linear-angular interactions among all six components of motion (three angular and three linear). The results show that the three-dimensional laws of motion predict the differences in perceptual disturbance. No special properties of the vestibular system or nervous system are required. In addition, simulations were performed with angular, linear, and tilt time constants inserted into the model, giving the same predictions. Three-dimensional graphics were used to highlight the manner in which linear-angular interaction causes perceptual disturbance, and a crucial component is the Stretch Factor, which measures the "unexpected" linear component.
7

Reed, Catherine L. "Perceptual Dependence for Shape and Texture during Haptic Processing." Perception 23, no. 3 (March 1994): 349–66. http://dx.doi.org/10.1068/p230349.

Abstract:
Perceptual dependence—the existence of perceptual interactions between the component dimensions of the same stimulus—was investigated for shape and texture during haptic processing. The haptic system combines tactual and kinesthetic information. Previous research has demonstrated that haptic exploration influences the extent to which object properties are integrated. Conditions designed to promote and impede the integration of shape and texture were compared. Perceptual independence was assessed by the use of a speeded-classification paradigm and quantitative tests developed by Ashby and Maddox. Results indicate that shape and texture are perceptually dependent for both conditions. Hand-movement analyses show simultaneous exploration for both dimensions. The tendency to process dimensions dependently is discussed in terms of a limited-capacity model of haptic-information processing.
8

Rockwell, Patricia, David B. Buller, and Judee K. Burgoon. "Measurement of deceptive voices: Comparing acoustic and perceptual data." Applied Psycholinguistics 18, no. 4 (October 1997): 471–84. http://dx.doi.org/10.1017/s0142716400010948.

Abstract:
This study compared vocal features of deception that can be measured by acoustic equipment with vocal features of deception that can be measured perceptually by human coders. As deception researchers have traditionally measured vocal behavior with either acoustic or perceptual methods (but not both), it is uncertain what correspondence, if any, exists between these methods. This study attempted to determine the degree of this correspondence. Deceptive interactions from an earlier study (Burgoon, Buller, Ebesu, & Rockwell, 1994) were used to conduct a detailed analysis of the vocal features of deceptive speech. The vocal samples were analyzed perceptually and acoustically. Results indicated moderate correlations between some acoustic and perceptual variables; neither measurement type, however, proved conclusively superior to the other in discriminating between truth and deception.
9

Bernstein, Ira H., Victor Bissonnette, and Kenneth R. Welch. "Perceptual and response interactions in semantic priming." Perception & Psychophysics 48, no. 6 (November 1990): 525–34. http://dx.doi.org/10.3758/bf03211598.

10

Valdez, A. B., and E. L. Amazeen. "Sensory and perceptual interactions in weight perception." Perception & Psychophysics 70, no. 4 (May 1, 2008): 647–57. http://dx.doi.org/10.3758/pp.70.4.647.

11

Krumhansl, Carol L., and Paul Iverson. "Perceptual interactions between musical pitch and timbre." Journal of Experimental Psychology: Human Perception and Performance 18, no. 3 (1992): 739–51. http://dx.doi.org/10.1037/0096-1523.18.3.739.

12

Auvray, Malika, Charles Lenay, and John Stewart. "Perceptual interactions in a minimalist virtual environment." New Ideas in Psychology 27, no. 1 (April 2009): 32–47. http://dx.doi.org/10.1016/j.newideapsych.2007.12.002.

13

Poletto, C. J., and C. L. Van Doren. "Perceptual interactions between electrocutaneous loudness and pitch." IEEE Transactions on Rehabilitation Engineering 3, no. 4 (1995): 334–42. http://dx.doi.org/10.1109/86.481973.

14

Polat, Uri. "Functional architecture of long-range perceptual interactions." Spatial Vision 12, no. 2 (1999): 143–62. http://dx.doi.org/10.1163/156856899x00094.

15

Laing, D. G. "Perceptual odour interactions and objective mixture analysis." Food Quality and Preference 5, no. 1-2 (January 1994): 75–80. http://dx.doi.org/10.1016/0950-3293(94)90010-8.

16

Conrad, Verena, Marco Pino Vitello, and Uta Noppeney. "Interactions between apparent motion rivalry in vision and touch." Seeing and Perceiving 25 (2012): 26–27. http://dx.doi.org/10.1163/187847612x646497.

Abstract:
Introduction: In multistable perception, the brain alternates between several perceptual explanations of ambiguous sensory signals. Recent studies have demonstrated crossmodal interactions between ambiguous and unambiguous signals. However it is currently unknown whether multiple bistable processes can interact across the senses (Conrad et al., 2010; Pressnitzer and Hupe, 2006). Using the apparent motion quartet in vision and touch, this study investigated whether bistable perceptual processes for vision and touch are independent or influence each other when powerful cues of congruency are provided to facilitate visuotactile integration (Conrad et al., in press). Methods: When two visual flashes and/or tactile vibration pulses are presented alternately along the two diagonals of the rectangle, subjects’ percept vacillates between vertical and horizontal apparent motion in the visual and/or tactile modalities (Carter et al., 2008). Observers were presented with unisensory (visual/tactile), visuotactile spatially congruent and incongruent apparent motion quartets and reported their visual or tactile percepts. Results: Congruent stimulation induced pronounced visuotactile interactions as indicated by increased dominance times and %-bias for the percept already dominant under unisensory stimulation. Yet, the temporal dynamics did not converge for congruent stimulation. It depended also on subjects’ attentional focus and was generally slower for tactile than visual reports. Conclusion: Our results support Bayesian approaches to perceptual inference, where the probability of a perceptual interpretation is determined by combining a modality-specific prior with incoming visual and/or tactile evidence. Under congruent stimulation, joint evidence from both senses decelerates the rivalry dynamics by stabilizing the more likely perceptual interpretation. Importantly, the perceptual stabilization was specific to spatiotemporally congruent visuotactile stimulation indicating multisensory rather than cognitive bias mechanisms.
17

Tenenbaum, Joshua B., and William T. Freeman. "Separating Style and Content with Bilinear Models." Neural Computation 12, no. 6 (June 1, 2000): 1247–83. http://dx.doi.org/10.1162/089976600300015349.

Abstract:
Perceptual systems routinely separate “content” from “style,” classifying familiar words spoken in an unfamiliar accent, identifying a font or handwriting style across letters, or recognizing a familiar face or object seen under unfamiliar viewing conditions. Yet a general and tractable computational model of this ability to untangle the underlying factors of perceptual observations remains elusive (Hofstadter, 1985). Existing factor models (Mardia, Kent, & Bibby, 1979; Hinton & Zemel, 1994; Ghahramani, 1995; Bell & Sejnowski, 1995; Hinton, Dayan, Frey, & Neal, 1995; Dayan, Hinton, Neal, & Zemel, 1995; Hinton & Ghahramani, 1997) are either insufficiently rich to capture the complex interactions of perceptually meaningful factors such as phoneme and speaker accent or letter and font, or do not allow efficient learning algorithms. We present a general framework for learning to solve two-factor tasks using bilinear models, which provide sufficiently expressive representations of factor interactions but can nonetheless be fit to data using efficient algorithms based on the singular value decomposition and expectation-maximization. We report promising results on three different tasks in three different perceptual domains: spoken vowel classification with a benchmark multi-speaker database, extrapolation of fonts to unseen letters, and translation of faces to novel illuminants.
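To make the fitting procedure mentioned in this abstract concrete, here is a minimal, hypothetical sketch of an asymmetric bilinear model (observation ≈ style matrix × content vector) fitted by truncated SVD, in the spirit of the approach described; the toy data, array shapes and variable names are assumptions for illustration only, not the authors' code.

import numpy as np

# Toy data: Y[s, c] is a d-dimensional observation of content class c rendered in style s.
# Real applications would use class means of actual observations (e.g. vowels per speaker).
rng = np.random.default_rng(0)
S, C, d, J = 4, 5, 10, 3                  # styles, contents, observation dim, model dim (assumed)

Y = rng.normal(size=(S, C, d))            # random placeholder; real data would have low-rank structure

# Stack styles along the observation axis: rows index (style, dimension), columns index content.
Y_stacked = Y.transpose(0, 2, 1).reshape(S * d, C)

# Asymmetric bilinear model y_sc ≈ A_s @ b_c, fitted with a truncated SVD.
U, sv, Vt = np.linalg.svd(Y_stacked, full_matrices=False)
A = (U[:, :J] * sv[:J]).reshape(S, d, J)  # style-specific linear maps A_s
B = Vt[:J, :]                             # content vectors b_c, one column per content class

# Reconstruction check: approximate each observation from its style map and content vector.
Y_hat = np.einsum('sdj,jc->scd', A, B)
err = np.linalg.norm(Y - Y_hat) / np.linalg.norm(Y)
print(f"relative reconstruction error with {J} content dimensions: {err:.3f}")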
18

Stanford, Terrence R., and Emilio Salinas. "Urgent Decision Making: Resolving Visuomotor Interactions at High Temporal Resolution." Annual Review of Vision Science 7, no. 1 (September 15, 2021): 323–48. http://dx.doi.org/10.1146/annurev-vision-100419-103842.

Abstract:
Measuring when exactly perceptual decisions are made is crucial for defining how the activation of specific neurons contributes to behavior. However, in traditional, nonurgent visuomotor tasks, the uncertainty of this temporal measurement is very large. This is a problem not only for delimiting the capacity of perception, but also for correctly interpreting the functional roles ascribed to choice-related neuronal responses. In this article, we review psychophysical, neurophysiological, and modeling work based on urgent visuomotor tasks in which this temporal uncertainty can be effectively overcome. The cornerstone of this work is a novel behavioral metric that describes the evolution of the subject's perceptual judgment moment by moment, allowing us to resolve numerous perceptual events that unfold within a few tens of milliseconds. In this framework, the neural distinction between perceptual evaluation and motor selection processes becomes particularly clear, as the conclusion of one is not contingent on that of the other.
19

Ferreira, Vicente, Arancha de-la-Fuente-Blanco, and María-Pilar Sáenz-Navajas. "A New Classification of Perceptual Interactions between Odorants to Interpret Complex Aroma Systems. Application to Model Wine Aroma." Foods 10, no. 7 (July 14, 2021): 1627. http://dx.doi.org/10.3390/foods10071627.

Abstract:
Although perceptual interactions are usually mentioned and blamed for the difficulties in understanding the relationship between odorant composition and aromatic sensory properties, they are poorly defined and categorised. Furthermore, old classifications refer mainly to effects on the odour intensity of the mixture of dissimilar non-blending odours and do not consider odour blending, which is one of the most relevant and influential perceptual interactions. Beginning with the results from classical studies about odour interaction, a new and simple systematic is proposed in which odour interactions are classified into four categories: competitive, cooperative, destructive and creative. The first categories are most frequent and display a mild level of interaction, being characterised mostly by analytical processing. The last two are less frequent and activate (or deactivate) configurational processes of object recognition with deep effects on the quality and intensity of the perception. These interactions can be systematically applied to interpret the formation of sensory descriptors from the odorant composition, suggesting that qualitatively the system works. However, there is a lack of quantitative data to work with odour intensities reliably, and a pressing need to systematise the effects of creative interactions.
20

Aso, Kenji, Takashi Hanakawa, Toshihiko Aso, and Hidenao Fukuyama. "Cerebro-cerebellar Interactions Underlying Temporal Information Processing." Journal of Cognitive Neuroscience 22, no. 12 (December 2010): 2913–25. http://dx.doi.org/10.1162/jocn.2010.21429.

Abstract:
The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.
21

Parks, Nathan A., Matthew R. Hilimire, and Paul M. Corballis. "Steady-state Signatures of Visual Perceptual Load, Multimodal Distractor Filtering, and Neural Competition." Journal of Cognitive Neuroscience 23, no. 5 (May 2011): 1113–24. http://dx.doi.org/10.1162/jocn.2010.21460.

Abstract:
The perceptual load theory of attention posits that attentional selection occurs early in processing when a task is perceptually demanding but occurs late in processing otherwise. We used a frequency-tagged steady-state evoked potential paradigm to investigate the modality specificity of perceptual load-induced distractor filtering and the nature of neural-competitive interactions between task and distractor stimuli. EEG data were recorded while participants monitored a stream of stimuli occurring in rapid serial visual presentation (RSVP) for the appearance of previously assigned targets. Perceptual load was manipulated by assigning targets that were identifiable by color alone (low load) or by the conjunction of color and orientation (high load). The RSVP task was performed alone and in the presence of task-irrelevant visual and auditory distractors. The RSVP stimuli, visual distractors, and auditory distractors were “tagged” by modulating each at a unique frequency (2.5, 8.5, and 40.0 Hz, respectively), which allowed each to be analyzed separately in the frequency domain. We report three important findings regarding the neural mechanisms of perceptual load. First, we replicated previous findings of within-modality distractor filtering and demonstrated a reduction in visual distractor signals with high perceptual load. Second, auditory steady-state distractor signals were unaffected by manipulations of visual perceptual load, consistent with the idea that perceptual load-induced distractor filtering is modality specific. Third, analysis of task-related signals revealed that visual distractors competed with task stimuli for representation and that increased perceptual load appeared to resolve this competition in favor of the task stimulus.
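As a rough illustration of the frequency-tagging logic described in this abstract, the following hypothetical Python sketch separates responses tagged at 2.5, 8.5 and 40 Hz in the amplitude spectrum of a single synthetic signal; the sampling rate, duration and amplitudes are invented, and this is not the study's analysis code.

import numpy as np

fs = 500.0                               # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)             # 10 s of synthetic "EEG"
tags = {"RSVP stream": 2.5, "visual distractor": 8.5, "auditory distractor": 40.0}

# Synthetic signal: one sinusoid per tagged stimulus, plus broadband noise.
amps = [2.0, 1.0, 0.5]
eeg = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(amps, tags.values()))
eeg = eeg + np.random.default_rng(1).normal(scale=1.0, size=t.size)

# Amplitude spectrum: each tagged response appears as a peak at its driving frequency.
spec = 2 * np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for name, f in tags.items():
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f"{name} ({f} Hz): amplitude ≈ {spec[idx]:.2f}")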
22

Vogt, Stefan, and Heiko Hecht. "Multi-level sensorimotor interactions." Behavioral and Brain Sciences 24, no. 5 (October 2001): 906–7. http://dx.doi.org/10.1017/s0140525x01480101.

Abstract:
We share the authors' general approach to the study of perception and action, but rather than singling out a particular level of “late perceptual” and “early motor” processing for sensorimotor interactions, we argue that these can arise at multiple levels during action preparation and execution. Recent data on action-perception transfer are used to illustrate this perspective.
23

Kramer, Peter, Ivilin Stoianov, Carlo Umiltà, and Marco Zorzi. "Erratum to: Interactions between perceptual and numerical space." Psychonomic Bulletin & Review 18, no. 5 (September 7, 2011): 1029. http://dx.doi.org/10.3758/s13423-011-0149-y.

24

Wu, Bing, Roberta L. Klatzky, and Ralph L. Hollis. "Force, Torque, and Stiffness: Interactions in Perceptual Discrimination." IEEE Transactions on Haptics 4, no. 3 (July 2011): 221–28. http://dx.doi.org/10.1109/toh.2011.3.

25

Klorfeld-Auslender, Shira, and Nitzan Censor. "Visual-oculomotor interactions facilitate consolidation of perceptual learning." Journal of Vision 19, no. 6 (June 11, 2019): 11. http://dx.doi.org/10.1167/19.6.11.

26

Motoyoshi, I. "The role of spatial interactions in perceptual synchrony." Journal of Vision 4, no. 5 (May 1, 2004): 1. http://dx.doi.org/10.1167/4.5.1.

27

Delwiche, Jeannine. "The impact of perceptual interactions on perceived flavor." Food Quality and Preference 15, no. 2 (March 2004): 137–46. http://dx.doi.org/10.1016/s0950-3293(03)00041-7.

28

Frijters, Jan E. R., and Hendrik N. J. Schifferstein. "Perceptual interactions in mixtures containing bitter tasting substances." Physiology & Behavior 56, no. 6 (December 1994): 1243–49. http://dx.doi.org/10.1016/0031-9384(94)90372-7.

29

Li, Qiong, and Ying-Yi Hong. "Intergroup Perceptual Accuracy Predicts Real-Life Intergroup Interactions." Group Processes & Intergroup Relations 4, no. 4 (October 2001): 341–54. http://dx.doi.org/10.1177/1368430201004004004.

30

Landry, M., and J. Ristic. "The influence of attentional interactions on perceptual processing." Journal of Vision 12, no. 9 (August 10, 2012): 673. http://dx.doi.org/10.1167/12.9.673.

31

Torralbo, Ana, and Diane M. Beck. "Perceptual-Load-Induced Selection as a Result of Local Competitive Interactions in Visual Cortex." Psychological Science 19, no. 10 (October 2008): 1045–50. http://dx.doi.org/10.1111/j.1467-9280.2008.02197.x.

Abstract:
A growing literature suggests that the degree to which distracting information can be ignored depends on the perceptual load of the task, or the extent to which the task exhausts perceptual capacity. However, there is currently no a priori definition of what constitutes high or low perceptual load. We propose that interactions among cells in visual cortex that represent nearby stimuli determine the perceptual load of a task, and that manipulations designed to modulate these competitive spatial interactions should modulate distractor processing. We found that either spatially separating the task-relevant items in a display or placing the target and nontargets in different visual fields increased interference from a distractor that was to be ignored. These data are consistent with the idea that the ability to ignore such distracting information results in part from the need to actively resolve competitive interactions in visual cortex, and is not the consequence of an exhausted capacity per se.
32

van der Smagt, Maarten J., Christian Wehrhahn, and Thomas D. Albright. "Contextual Masking of Oriented Lines: Interactions Between Surface Segmentation Cues." Journal of Neurophysiology 94, no. 1 (July 2005): 576–89. http://dx.doi.org/10.1152/jn.00366.2004.

Abstract:
The ability of human observers to detect and discriminate a single feature of a visual image deteriorates markedly when the targeted feature is surrounded by others of a similar kind. This perceptual masking is mirrored by the suppressive effects of surround stimulation on the responses of neurons in primary visual cortex (area V1). Both perceptual and neuronal masking effects are partially relieved, however, if the targeted image feature is distinguished from surrounding features along some dimension, such as contour orientation. Masking relief is likely to play an important role in perceptual segmentation of complex images. Because dissimilar surfaces usually differ along multiple feature dimensions, we tested the possibility that those differences may influence segmentation in an invariant manner. As expected, we found that the presence of surrounding features resulted in perceptual masking and neuronal response suppression in area V1, but that either orientation or contrast polarity differences between the target and surrounding features was sufficient to partially relieve these effects. Simultaneous differences along both dimensions, however, yielded no greater relief from masking than did either difference alone. Although the averaged neuronal effects of orientation polarity cues were thus invariant, the time course over which these effects emerged after each stimulus appearance was different for the two cues. These findings refine our understanding of the functions of nonclassical receptive fields, and they support a key role for V1 neurons in surface segmentation.
33

Zhou, Youmei, Kevin Thwaites, and Daixin Dai. "Gender Differences of the Elderly on the Social and Perceptual Dimensional Interaction in Neighbourhood Park: A Case Study of Beijing." SPACE International Journal of Conference Proceedings 2, no. 1 (July 25, 2022): 10–18. http://dx.doi.org/10.51596/sijocp.v2i1.35.

Abstract:
Neighbourhood parks are an essential activity venue for contemporary older adults and potentially impact their health. Most research has focused on the physical and visual dimensions of the park, with some studies demonstrating gender differences in visual preferences. Still, there is a lack of detailed research on the social and perceptual dimensions of male and female older people, and a lack of research on psycho-emotional satisfaction and on mechanisms to alleviate the psychological isolation of older people of different genders. In this study, a mixed-methods design was used to investigate the multi-dimensional interaction of older people using a neighbourhood park, with a sample of 418 participants from the Chaoyang district of Beijing. The integrated methods included questionnaires, semi-structured interviews and observation; Goodman and Kruskal's gamma analysis and a grounded-theory qualitative method were applied, using SPSS and NVivo, to analyse how elements of neighbourhood parks relate to psychological outcomes. A conceptual framework of multi-layered social interactions, and of the influence of different social interactions on the perceptual and psychological dimensions, was developed, along with the gender differences. The study found that the social, physical-space and perceptual dimensions of neighbourhood parks all contribute to alleviating loneliness among older people, that site-based self-regulation also plays an important role, and that low-level social interactions generally affect older people. In contrast, mid-level social interactions have a significant effect on perceptual and psychological benefits, with a synergistic increase in impact, and older women show a stronger need for social activities and self-expression than men. The findings provide a theoretical basis for multidimensional spatial mediation to enhance the well-being of older people and offer new ideas for comprehensive, multidimensional and multi-level ageing-optimisation research.
34

Clark, Logan D., and Sara L. Riggs. "Investigating the Use of Movement Kinematics to Assess Perceptual Ambiguity in Virtual Reality." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, no. 1 (November 2019): 2318–22. http://dx.doi.org/10.1177/1071181319631156.

Abstract:
With the emergence of 3D direct-selection interfaces in virtual reality (VR) displays, specialized metrics may be needed to assess the efficiency of these complex interactions. Kinematic movement trajectory analysis, an established technique in experimental psychology, may provide a useful framework for quantifying meaningful patterns of user interaction using the 3D position data recorded by most VR displays. We explored this possibility by investigating the effectiveness of a kinematic approach for identifying ambiguous interface elements in VR by detecting movements initiated toward incorrect target locations (termed “misfires”). Twenty-three participants selected 96 target objects of varying perceptual ambiguity presented in a simple VR environment with limited depth cues. Movement trajectories were recorded and evaluated using a-priori criteria to identify discontinuities consistent with misfire movements. Misfires were observed on 31.6% of trials and occurred significantly more often for movements to perceptually ambiguous targets. The findings of this study suggest that kinematic measures may be useful for quantifying patterns of user interaction in direct-selection VR interfaces.
35

Raja, Vicente. "From metaphor to theory: the role of resonance in perceptual learning." Adaptive Behavior 27, no. 6 (June 10, 2019): 405–21. http://dx.doi.org/10.1177/1059712319854350.

Abstract:
Unlike dominant cognitivist theories that take perceptual learning to be a process of enriching sensory stimulation with previous knowledge, ecological psychologists take it to be an enhancement in the detection of already rich perceptual information. The difference between beginners and experts is that the latter detect better information to support their task goals. While the study of perceptual learning in terms of perceptual information and perceiver–environment interactions is common in the ecological literature, ecological psychology still lacks a story regarding the way perceptual information is detected by perceptual systems and the plasticity of such detection in learning events. In this article, I propose the ecological notion of resonance—along with biophysical resonance, non-linear resonance, and metastability—as a plausible foundation to account for the process of detection of perceptual information both in perceptual events and in events of perceptual learning.
36

Thellman, Sam, and Tom Ziemke. "The Perceptual Belief Problem." ACM Transactions on Human-Robot Interaction 10, no. 3 (July 2021): 1–15. http://dx.doi.org/10.1145/3461781.

Abstract:
The explainability of robotic systems depends on people’s ability to reliably attribute perceptual beliefs to robots, i.e., what robots know (or believe) about objects and events in the world based on their perception. However, the perceptual systems of robots are not necessarily well understood by the majority of people interacting with them. In this article, we explain why this is a significant, difficult, and unique problem in social robotics. The inability to judge what a robot knows (and does not know) about the physical environment it shares with people gives rise to a host of communicative and interactive issues, including difficulties to communicate about objects or adapt to events in the environment. The challenge faced by social robotics researchers or designers who want to facilitate appropriate attributions of perceptual beliefs to robots is to shape human–robot interactions so that people understand what robots know about objects and events in the environment. To meet this challenge, we argue, it is necessary to advance our knowledge of when and why people form incorrect or inadequate mental models of robots’ perceptual and cognitive mechanisms. We outline a general approach to studying this empirically and discuss potential solutions to the problem.
37

Meese, Tim S., and Mark A. Georgeson. "Spatial Filter Combination in Human Pattern Vision: Channel Interactions Revealed by Adaptation." Perception 25, no. 3 (March 1996): 255–77. http://dx.doi.org/10.1068/p250255.

Abstract:
Above threshold, two superimposed sinusoidal gratings of the same spatial frequency (eg 1 cycle deg−1), of equal moderate contrast (eg C1 = C2 = 6%), and with orientations of ±45°, usually look like a compound structure containing vertical and horizontal edges (ie a blurred checkerboard). These feature orientations are very different from the dominant filter orientations in a wavelet-type (eg simple-cell) transform of the stimulus, and so present a serious challenge to conventional models of orientation coding based on labelled linear filters. Previous experiments on perceived structure in static plaids have led to the view that the outputs of tuned spatial filters are combined in a stimulus-dependent way, before features such as edges are extracted. Here an adaptation paradigm was used to investigate the cross-channel interactions that appear to underlie the spatial-filter-combination process. Reported are two aftereffects of selective adaptation: (i) adaptation to a 1 cycle deg−1 plaid whose component orientations are intermediate to those in a 1 cycle deg−1 test plaid ‘breaks’ perceptual combination of the components in the test plaid; (ii) adapting to a 3 cycles deg−1 plaid whose component orientations match those in a 1 cycle deg−1 test plaid facilitates perceptual combination of the components in the test plaid. The results are taken as evidence that spatial channels remote from those most responsive to a test plaid play a crucial role in determining whether the test plaid segments or coheres perceptually.
38

Serniclaes, Willy. "On the invariance of speech percepts." ZAS Papers in Linguistics 40 (January 1, 2005): 177–94. http://dx.doi.org/10.21248/zaspil.40.2005.265.

Abstract:
A fundamental question in the study of speech is about the invariance of the ultimate percepts, or features. The present paper gives an overview of the noninvariance problem and offers some hints towards a solution. Examination of various data on place and voicing perception suggests the following points. Features correspond to natural boundaries between sounds, which are included in the infant's predispositions for speech perception. Adult percepts arise from couplings and contextual interactions between features. Both couplings and interactions contribute to invariance. But this is at the expense of profound qualitative changes in perceptual boundaries implying that features are neither independently nor invariantly perceived. The question then is to understand the principles which guide feature couplings and interactions during perceptual development. The answer might reside in the fact that: (1) adult boundaries converge to a single point of the perceptual space, suggesting a context-free central reference; (2) this point corresponds to the neutral vocoïd, suggesting the reference is related to production; (3) at this point perceptual boundaries correspond to the natural ones, suggesting the reference is anchored in predispositions for feature perception. In sum, perceptual invariance seems to be grounded on a radial representation of the vocal tract around a singular point at which boundaries are context-free, natural and coincide with the neutral vocoïd.
39

Curran, William, Colin W. G. Clifford, and Christopher P. Benton. "The hierarchy of directional interactions in visual motion processing." Proceedings of the Royal Society B: Biological Sciences 276, no. 1655 (September 30, 2008): 263–68. http://dx.doi.org/10.1098/rspb.2008.1065.

Abstract:
It is well known that context influences our perception of visual motion direction. For example, spatial and temporal context manipulations can be used to induce two well-known motion illusions: direction repulsion and the direction after-effect (DAE). Both result in inaccurate perception of direction when a moving pattern is either superimposed on (direction repulsion), or presented following adaptation to (DAE), another pattern moving in a different direction. Remarkable similarities in tuning characteristics suggest that common processes underlie the two illusions. What is not clear, however, is whether the processes driving the two illusions are expressions of the same or different neural substrates. Here we report two experiments demonstrating that direction repulsion and the DAE are, in fact, expressions of different neural substrates. Our strategy was to use each of the illusions to create a distorted perceptual representation upon which the mechanisms generating the other illusion could potentially operate. We found that the processes mediating direction repulsion did indeed access the distorted perceptual representation induced by the DAE. Conversely, the DAE was unaffected by direction repulsion. Thus parallels in perceptual phenomenology do not necessarily imply common neural substrates. Our results also demonstrate that the neural processes driving the DAE occur at an earlier stage of motion processing than those underlying direction repulsion.
40

McMains, Stephanie A., and Sabine Kastner. "Defining the Units of Competition: Influences of Perceptual Organization on Competitive Interactions in Human Visual Cortex." Journal of Cognitive Neuroscience 22, no. 11 (November 2010): 2417–26. http://dx.doi.org/10.1162/jocn.2009.21391.

Abstract:
Multiple stimuli that are present simultaneously in the visual field compete for neural representation. At the same time, however, multiple stimuli in cluttered scenes also undergo perceptual organization according to certain rules originally defined by the Gestalt psychologists such as similarity or proximity, thereby segmenting scenes into candidate objects. How can these two seemingly orthogonal neural processes that occur early in the visual processing stream be reconciled? One possibility is that competition occurs among perceptual groups rather than at the level of elements within a group. We probed this idea using fMRI by assessing competitive interactions across visual cortex in displays containing varying degrees of perceptual organization or perceptual grouping (Grp). In strong Grp displays, elements were arranged such that either an illusory figure or a group of collinear elements were present, whereas in weak Grp displays the same elements were arranged randomly. Competitive interactions among stimuli were overcome throughout early visual cortex and V4, when elements were grouped regardless of Grp type. Our findings suggest that context-dependent grouping mechanisms and competitive interactions are linked to provide a bottom–up bias toward candidate objects in cluttered scenes.
41

Crommett, Lexi E., Deeksha Madala, and Jeffrey M. Yau. "Multisensory perceptual interactions between higher-order temporal frequency signals." Journal of Experimental Psychology: General 148, no. 7 (July 2019): 1124–37. http://dx.doi.org/10.1037/xge0000513.

42

Viviani, Paolo, and Natale Stucchi. "Biological movements look uniform: Evidence of motor-perceptual interactions." Journal of Experimental Psychology: Human Perception and Performance 18, no. 3 (1992): 603–23. http://dx.doi.org/10.1037/0096-1523.18.3.603.

43

Murray, Scott O., Paul Schrater, and Daniel Kersten. "Perceptual grouping and the interactions between visual cortical areas." Neural Networks 17, no. 5-6 (June 2004): 695–705. http://dx.doi.org/10.1016/j.neunet.2004.03.010.

44

Wexler, Bruce E., Gary Schwartz, Stephen Warrenburg, Mark Servis, and Irene Tarlatzis. "Effects of emotion on perceptual asymmetry: Interactions with personality." Neuropsychologia 24, no. 5 (January 1986): 699–710. http://dx.doi.org/10.1016/0028-3932(86)90009-6.

45

Wilschut, Anna, Jan Theeuwes, and Christian N. L. Olivers. "Early perceptual interactions shape the time course of cueing." Acta Psychologica 144, no. 1 (September 2013): 40–50. http://dx.doi.org/10.1016/j.actpsy.2013.04.020.

46

Atanasova, Boriana, Thierry Thomas-Danguin, Dominique Langlois, Sophie Nicklaus, and Patrick Etievant. "Perceptual interactions between fruity and woody notes of wine." Flavour and Fragrance Journal 19, no. 6 (2004): 476–82. http://dx.doi.org/10.1002/ffj.1474.

47

Ristic, Jelena, Clara Colombatto, and Luowei Yan. "From dyads to crowds: Perceptual unity of group interactions." Journal of Vision 23, no. 9 (August 1, 2023): 5336. http://dx.doi.org/10.1167/jov.23.9.5336.

48

Fudin, Robert. "Re-Examination of Evidence Questions Christman's (1989) Report of Moderate Experimental Support for the Visual Spatial Frequency Hypothesis." Perceptual and Motor Skills 80, no. 3 (June 1995): 955–62. http://dx.doi.org/10.2466/pms.1995.80.3.955.

Abstract:
The visual spatial frequency hypothesis contends that perceptual characteristics of stimulus arrays can affect the magnitude and direction of hemispheric asymmetries in laterality experiments. In a 1989 literature review, Christman reported that 45 of 79 experimental comparisons yielded significant interactions for side of hemispheric advantage x perceptual characteristic which supported the visual spatial frequency hypothesis, a level of support he characterized as moderate. Re-examination of those 45 outcomes shows that in 20 of them either a significant interaction for side of hemispheric advantage x perceptual characteristic was not found or, if it was, the particulars do not agree fully with predictions of the visual spatial frequency hypothesis as presented by Christman in the 1989 paper. These findings suggest that experimental support for the visual spatial frequency hypothesis is weak, not moderate as characterized by Christman.
49

Wu, Chaozhong, Wenhui Chu, Hui Zhang, and Türker Özkan. "Interactions between Driving Skills on Aggressive Driving: Study among Chinese Drivers." Transportation Research Record: Journal of the Transportation Research Board 2672, no. 31 (October 14, 2018): 10–20. http://dx.doi.org/10.1177/0361198118755683.

Abstract:
Aggressive driving has attracted significant attention recently with the increase in related road traffic collisions occurring in China. This study aims to investigate the effect of driving skills on aggressive driving behaviors and traffic accidents to find implications for traffic safety improvement in China. A total of 735 Chinese drivers were recruited to complete a self-reported survey including demographic information, the translated Driver Skill Inventory (DSI), and Driver Aggression Indicator Scale (DAIS). Exploratory factor analysis was first conducted to investigate the factor structures of DSI and DAIS among Chinese drivers. Unlike the two-factor solution (i.e., perceptual-motor and safety skills) found in other studies, the current study result revealed a three-factor solution (i.e., perceptual-motor, safety, and emotional control skills) of DSI. Then, the interaction between DSI factors on DAIS factors, demographic variables, and the number of self-reported traffic accidents and offenses was tested by using moderated regression methods. The results revealed the interaction between perceptual-motor skills and safety skills on aggressive warnings committed by drivers themselves. The interactive effect between safety skills and emotional control skills on perceived aggressive warnings was also found. The results suggested that higher ratings of safety skills are essential for buffering the effect of high-level perceptual-motor skills and emotional control skills on aggressive driving in China. In conclusion, policy makers should be interested in understanding the effect of Chinese drivers’ skills on the aggression drivers committed and conceived in traffic. Successful intervention strategies should include all skill factors in the driver training contents.
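For readers unfamiliar with the moderated-regression approach mentioned in this abstract, the hypothetical sketch below tests an interaction (product) term between two skill scores on an aggression outcome using simulated data; the variable names, coefficients and sample values are invented, and this is not the study's data or code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-ins for DSI skill factors and a DAIS outcome (hypothetical names).
rng = np.random.default_rng(2)
n = 735
df = pd.DataFrame({
    "perceptual_motor": rng.normal(size=n),
    "safety": rng.normal(size=n),
})
df["aggressive_warnings"] = (0.3 * df["perceptual_motor"] - 0.2 * df["safety"]
                             - 0.25 * df["perceptual_motor"] * df["safety"]
                             + rng.normal(scale=0.5, size=n))

# Moderated regression: 'a * b' in the formula expands to both main effects plus their
# interaction, so the product term tests whether safety skills buffer the perceptual-motor effect.
model = smf.ols("aggressive_warnings ~ perceptual_motor * safety", data=df).fit()
print(model.summary().tables[1])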
50

Green, Patrick R., and Frank E. Pollick. "Recognising actions." Behavioral and Brain Sciences 25, no. 1 (February 2002): 106–7. http://dx.doi.org/10.1017/s0140525x02330024.

Abstract:
The ability to recognise the actions of conspecifics from displays of biological motion is an essential perceptual capacity. Physiological and psychological evidence suggest that the visual processing of biological motion involves close interaction between the dorsal and ventral systems. Norman's strong emphasis on the functional differences between these systems may impede understanding of their interactions.