
Decoding Human Movement Intention from Wearable Sensors

  • Dalhousie University CS Auditorium (#127), Goldberg Computer Science Building, 6050 University Avenue, Halifax, Canada
Speaker: Xianta Jiang, University of Alberta

Abstract: 
Understanding human movement is essential for the better design of human-machine interfaces. Researchers have made great efforts to develop low-cost, easy-to-use wearable sensors and algorithms to achieve this goal. In this talk, I will present my previous work in this field, including movement intention detection and the recognition of perception and cognition using bio-signals collected by wearable sensors. First, I will talk about the detection of arm and leg movement intention using force myography, electromyography, and inertial measurement units. Applications of this line of work include human-machine interaction, prosthetic limb (robotic arm) control, and rehabilitation and assistive devices. Second, I will discuss how to measure the visual perception and cognition of surgeons using eye-tracking technology. The results can help us assess mental workload and the team cognition of a surgical team, and help detect moments of performance difficulty in AR/VR environments.

Bio:
Dr. Xianta Jiang received his Ph.D. in Computer Science from Simon Fraser University (2015). He then held a postdoctoral fellowship with the MENVA research group in Engineering Science at Simon Fraser University. Dr. Jiang currently works as a senior research associate in the Surgical Simulation Research Lab (SSRL) in the Department of Surgery at the University of Alberta (U of A). His research interests are in human perception, motor control, and cognition assessment, with applications in human-machine interaction, teleoperation, robotic manipulation, and healthcare (surgery and rehabilitation).
