Multimodal Brain Computer Interface for Human-Robot Interaction
We are sorry, this position has been filled.
This project focuses on using a novel suite of sensors (Electroencephalography (EEG), eye-tracking, pupillary size, and functional Near-Infrared Spectroscopy (fNIRS)) to improve current Human-Robot Interaction (HRI) systems. These sensing modalities can reinforce and complement one another, and when used together they can address a major shortcoming of current BCIs: determining the user's state or situational awareness. The sensing suite will be used to navigate a mobile robot using brain control alone. The student will be responsible for acquiring sensor data and integrating it with robotic path planning algorithms to allow the robot to navigate safely.
Lab: Columbia Robotics Lab
Direct Supervisor: Peter Allen
Position Dates: 6/1/2019 - 8/15/2019
Hours per Week: 20
Paid Position: Yes
Number of positions: 1
Qualifications: Knowledge of the Robot Operating System (ROS) is required for this position.