Cite this paper as:
S. F. Corbin, C.H. Moore, T. Davis, K. Shockley and T. Lorenz (2023). Multiscale kinematics of action intention. Journal of Multiscale Neuroscience 2(1), 192-203. https://doi.org/10.56280/1565382896
Human motion contains rich contextual information about not only action, but action intention. In two experiments, we investigated whether the multiscale kinematic information that differentiates intentional actions is the same information to which observers attend when asked to observe an actor’s intended movement. To do so, we first recorded an actor’s movement kinematics while the actor performed four different intentional sit-to-stand actions. We then analyzed the differences in movement kinematics, using principal components analysis and multinomial regression to identify the joints that contributed to differentiating the actions. Observers were then shown point-light displays of these movements and completed a forced-choice task in which they selected the action the actor intended to complete; they were highly accurate at this task. We hypothesized that if the perceptual information used to perceive action intention corresponds to the kinematic information that differentiates among the four possible actions, then observers’ gaze should center more on the joints identified in the movement analysis. This hypothesis was supported, suggesting that the joint kinematics that differentiate possible actions are the same joint kinematics to which observers attend in order to successfully differentiate movement intentions in others.
Keywords: Kinematic information, embodiment, point-light displays, action intention, intention recognition, biological motion
Conflict of Interest
The authors declare no conflict of interest.
This article belongs to the Special Issue honoring
Prof. Michael J. Spivey, author of "The Continuity of Mind",
Department of Cognitive and Information Sciences,
University of California, Merced, USA
Copyright: © 2023 The Author(s). Published by Neural Press.
This is an open access article distributed under the terms and conditions of the CC BY 4.0 license.