As virtual reality becomes more ubiquitous, it is increasingly used for exposure therapy and cognitive rehabilitation, with successful results.
Is it possible to extend the benefit of this technology to the whole spectrum of users with physical disabilities?
Users with extreme limb motor disabilities cannot use conventional modes of user input like keyboards or joysticks.
Can we devise methods to allow immobilised patients to freely interact in such virtual environments?
Beyond that, could such an interaction method have further applications, such as reducing social isolation by immersing users in social virtual spaces populated with other avatars?
Year Apr 2017 – May 2017
Scope Virtual reality, AI, EEG
BrainShift is a web-based virtual reality experiment that lets users immerse themselves in VR using electroencephalogram (EEG) brainwaves and eye blinks. It serves as a quick and robust technique that allows physically disabled users to conveniently navigate and interact in virtual reality (VR) using their brainwaves, by combining a high-data-throughput input modality with a stationary EEG activation signal.
Both input modalities remain usable even with the most extreme physical disabilities, such as locked-in syndrome. The interaction paradigm requires only a functioning brain and eye movement to enable immersive VR.
The user is fitted with an EEG headset that monitors their concentration levels and eye blinks. A reticle in the subject's field of view follows their gaze direction with the help of an eye tracker. A pathfinding algorithm then autonomously navigates the subject to any visible spatial point in the environment: the subject simply targets the reticle crosshair at that point and executes a concentrated blink, i.e. a blink performed while the measured concentration level is high.
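The concentrated-blink trigger described above can be sketched as follows. This is a minimal, hypothetical illustration (the class and parameter names, the normalized 0–1 concentration scale, and the coincidence window are all assumptions, not the project's actual code): navigation fires only when a blink arrives shortly after the EEG concentration level crossed a threshold.

```typescript
// Hypothetical sketch: fire a navigation request only when a blink
// coincides with recently sustained concentration. The normalized
// concentration scale (0..1) and the 500 ms window are assumed values.

type Point = { x: number; y: number };

class ConcentratedBlinkTrigger {
  private lastHighConcentrationMs = -Infinity;

  constructor(
    private threshold = 0.7, // assumed normalized EEG concentration level
    private windowMs = 500   // blink must land within this window
  ) {}

  // Called for every EEG sample from the headset stream.
  onConcentrationSample(level: number, timestampMs: number): void {
    if (level >= this.threshold) {
      this.lastHighConcentrationMs = timestampMs;
    }
  }

  // Called when the headset reports a blink. Returns the gaze target
  // to hand to the pathfinding algorithm, or null if the blink was
  // not "concentrated".
  onBlink(gazeTarget: Point, timestampMs: number): Point | null {
    const concentrated =
      timestampMs - this.lastHighConcentrationMs <= this.windowMs;
    return concentrated ? gazeTarget : null;
  }
}
```

Gating on recent concentration rather than an instantaneous sample makes the trigger robust to the natural dip in EEG readings during the blink itself.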
The user can also interact with objects in the environment using other eye-based input modalities. Timed gaze input can activate levers or toggle object states, while left and right eye blinks serve as distinct triggers.
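The timed gaze input above is commonly implemented as a dwell timer: an object activates once the gaze has rested on it for a fixed duration. A minimal sketch, under assumed names and a 1-second dwell time (neither taken from the project's code):

```typescript
// Hypothetical sketch of dwell-based gaze activation: an object fires
// once the gaze has rested on it continuously for `dwellMs`.

class DwellActivator {
  private gazeStartMs: number | null = null;
  private currentId: string | null = null;

  constructor(private dwellMs = 1000) {} // assumed dwell time

  // Called on every gaze sample with the id of the object currently
  // under the reticle (or null for empty space). Returns the id of an
  // activated object, or null.
  onGaze(objectId: string | null, timestampMs: number): string | null {
    if (objectId !== this.currentId) {
      // Gaze moved to a new target: restart the dwell timer.
      this.currentId = objectId;
      this.gazeStartMs = objectId === null ? null : timestampMs;
      return null;
    }
    if (
      objectId !== null &&
      this.gazeStartMs !== null &&
      timestampMs - this.gazeStartMs >= this.dwellMs
    ) {
      this.gazeStartMs = null; // fire once per dwell, not every frame
      return objectId;
    }
    return null;
  }
}
```

Left and right blinks would bypass this timer entirely and dispatch directly as two separate discrete triggers.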