Adaptivity of End Effector Motor Control Under Different Sensory Conditions: Experiments with Humans in Virtual Reality and Robotic Applications
by Jaime Leonardo Maldonado Cañon, Thorsten Kluss, Christoph Zetzsche
Abstract:
The investigation of human perception and movement kinematics during manipulation tasks provides insights that can be applied in the design of robotic systems in order to perform human-like manipulations in different contexts and with different performance requirements. In this paper we investigate control in a motor task, in which a tool is moved vertically until it touches a support surface. We evaluate how acoustic and haptic sensory information generated at the moment of contact modulates the kinematic parameters of the movement. Experimental results show differences in the achieved motor control precision and adaptation rate across conditions. We describe how the experimental results can be used in robotics applications in the fields of unsupervised learning, supervised learning from human demonstrators and teleoperations.
Reference:
Adaptivity of End Effector Motor Control Under Different Sensory Conditions: Experiments with Humans in Virtual Reality and Robotic Applications (Jaime Leonardo Maldonado Cañon, Thorsten Kluss, Christoph Zetzsche), In Frontiers in Robotics and AI, 2019.
Bibtex Entry:
@Article{maldonado_etal_2019,
  author   = {Maldonado Cañon, Jaime Leonardo and Kluss, Thorsten and Zetzsche, Christoph},
  title    = {{A}daptivity of {E}nd {E}ffector {M}otor {C}ontrol {U}nder {D}ifferent {S}ensory {C}onditions: {E}xperiments with {H}umans in {V}irtual {R}eality and {R}obotic {A}pplications},
  journal  = {Frontiers in Robotics and AI},
  url      = {https://www.frontiersin.org/articles/10.3389/frobt.2019.00063/abstract},
  doi      = {10.3389/frobt.2019.00063},
  year     = {2019},
  abstract = {The investigation of human perception and movement kinematics during manipulation tasks provides insights that can be applied in the design of robotic systems in order to perform human-like manipulations in different contexts and with different performance requirements. In this paper we investigate control in a motor task, in which a tool is moved vertically until it touches a support surface. We evaluate how acoustic and haptic sensory information generated at the moment of contact modulates the kinematic parameters of the movement. Experimental results show differences in the achieved motor control precision and adaptation rate across conditions. We describe how the experimental results can be used in robotics applications in the fields of unsupervised learning, supervised learning from human demonstrators and teleoperations.},
}