This project focuses on using a novel suite of sensors (Electroencephalography (EEG), eye tracking, pupillary size, and functional Near Infrared Spectroscopy (fNIRS)) to improve current human-robot interaction (HRI) systems. These sensing modalities reinforce and complement one another, and when used together they can address a major shortcoming of current brain-computer interfaces (BCIs): determining the user's state or situational awareness. The sensing suite will be used to navigate a mobile robot using brain control alone. The student will be responsible for acquiring the sensor data and integrating it with robotic path planning algorithms to allow the robot to navigate safely.
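As a rough illustration of how multimodal user-state estimates might gate an EEG-decoded command before it reaches a path planner, consider the minimal Python sketch below. All names here (UserState, BrainCommand, fuse, plan_velocity) and the thresholds are hypothetical placeholders, not part of the project's actual pipeline or any specific BCI or robotics framework; a real system would use trained decoders and a full planning stack.

```python
# Hypothetical sketch: gate an EEG-decoded steering command by a user-state
# estimate derived from pupil size / eye tracking and fNIRS, then map the
# result to a simple velocity command standing in for a path planner.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class UserState:
    attention: float   # 0..1, e.g. from pupil diameter and fixation stability (illustrative)
    workload: float    # 0..1, e.g. from fNIRS prefrontal oxygenation (illustrative)

@dataclass
class BrainCommand:
    direction: str     # "left", "right", or "forward" decoded from EEG
    confidence: float  # decoder posterior probability, 0..1

def fuse(cmd: BrainCommand, state: UserState,
         min_confidence: float = 0.6, min_attention: float = 0.4) -> Optional[str]:
    """Accept the EEG command only when the decoder is confident and the
    complementary sensors suggest the user is attending to the task."""
    if cmd.confidence < min_confidence or state.attention < min_attention:
        return None        # ambiguous: keep the robot on its current plan
    if state.workload > 0.8:
        return "stop"      # apparent overload: fall back to a safe stop
    return cmd.direction

def plan_velocity(direction: Optional[str], obstacle_ahead: bool) -> Tuple[float, float]:
    """Map a fused command to (linear, angular) velocity, with a simple
    obstacle check standing in for a full path-planning algorithm."""
    if direction is None or direction == "stop" or obstacle_ahead:
        return 0.0, 0.0
    turns = {"left": 0.5, "right": -0.5, "forward": 0.0}
    return 0.3, turns[direction]

if __name__ == "__main__":
    cmd = BrainCommand(direction="left", confidence=0.72)
    state = UserState(attention=0.65, workload=0.35)
    print(plan_velocity(fuse(cmd, state), obstacle_ahead=False))  # -> (0.3, 0.5)
```

The design choice illustrated is simply that the non-EEG modalities act as a confidence gate on the brain-derived command, which is one way the sensors can complement each other as described above.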