Are Brain Interfaces the Next Stage of Wearables?
Is this sensor-fusion delusion, or will mind control really go mainstream?
London-based innovation studio This Place has created a brain-computer interface app for Google Glass that enables the user to snap and share photos using thought control. The app couples Glass with an electroencephalogram (EEG) reader made by NeuroSky. Such low-cost, off-the-shelf EEG readers can be found online for less than $80. This is no longer exotic hardware.
The application, named MindRDR, requires the user to concentrate on a virtual horizontal line to snap the image, and then to concentrate again to share it through social networks. Using a brain-computer interface this way saves only marginal effort, but it makes for an interesting technology demo. Mobile industry visionaries are evaluating the long-term potential of combining input from EEG readers with data from other sensors. Context-aware platforms could begin to adjust interfaces according to a user's moods and attitudes, so that content matches the user's state of mind.
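The interaction described above, sustained concentration firing one action, then the next, can be sketched as a simple threshold trigger. This is a hypothetical illustration, not MindRDR's actual code: the attention score (0-100, in the style of consumer EEG headsets), the threshold, and the dwell count are all assumptions.

```python
# Hypothetical sketch of a MindRDR-style trigger loop.
# Assumption: the headset reports an "attention" score from 0 to 100;
# a sustained run of high readings is treated as deliberate focus.

THRESHOLD = 70  # attention level counted as deliberate concentration
DWELL = 3       # consecutive high readings required to fire an action

def run_trigger_loop(readings, actions):
    """Consume attention readings; fire the next queued action after
    each sustained burst of concentration."""
    fired = []
    streak = 0
    pending = list(actions)
    for level in readings:
        streak = streak + 1 if level >= THRESHOLD else 0
        if streak >= DWELL and pending:
            fired.append(pending.pop(0))
            streak = 0  # require a fresh burst for the next action
    return fired

# Simulated session: focus to snap, relax, focus again to share.
readings = [40, 75, 80, 85, 30, 20, 72, 78, 90]
print(run_trigger_loop(readings, ["snap_photo", "share_photo"]))
# → ['snap_photo', 'share_photo']
```

Resetting the streak after each trigger is what forces the user to relax and refocus between snapping and sharing, rather than firing both actions from one long burst of concentration.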
Consumer-grade brain-computer headsets have been available for more than five years, and several industries are finding real uses for these accessories. Neurogaming has become a genuine subset of the gaming business, and brain fitness trackers are being evaluated by educators as a way to train the mind. Data about stress, focus, interest, excitement and facial expressions could be incorporated into the overall user experience of applications.
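As a minimal sketch of how such metrics might feed a user experience, the snippet below maps headset scores to a presentation mode. The metric names, score ranges, and thresholds are assumptions for illustration, not any vendor's actual API.

```python
# Hypothetical sketch: adapt the interface to the user's mental state.
# Assumption: a headset exposes 0-100 "focus" and "stress" scores.

def choose_ui_mode(focus, stress):
    """Pick a presentation mode from headset metrics."""
    if stress > 70:
        return "minimal"   # overloaded user: strip the UI to essentials
    if focus > 60:
        return "detailed"  # engaged user: show richer content
    return "standard"

print(choose_ui_mode(focus=80, stress=20))  # → detailed
print(choose_ui_mode(focus=30, stress=85))  # → minimal
```

In practice the raw scores are noisy, so a real application would smooth them over time before switching modes; this sketch only shows the mapping itself.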
There has been some amazing research into using such interfaces to improve the lives of mobility-impaired individuals, and the trickle-down effect from this work will permeate consumer electronics products for years to come. Mind control could begin to complement touch and voice input in innovative ways, from cars to smartphones to wearables.
The current technology has real limitations, so expectations shouldn't get too far ahead of reality. Nonetheless, it's exciting to think about its long-term potential.