Here Come the Moodables

Emotions Become Part of Contextual Computing


The age of “mood-put” is coming. Electroencephalography (EEG) sensors that detect brainwave activity for human–computer interaction are reaching the mainstream, sensors on the skin can detect stress levels, and cameras can recognize facial movements. Even usage patterns can reveal a user’s mood. Thoughts and feelings could be used as input in direct and subtle ways.

In an interview with the BBC, Dell’s chief research officer Jai Menon said the company could release mood-detecting applications by 2017. Dell researchers are using low-cost headsets by NeuroSky to collect brainwave activity for this project. Such consumer-grade EEG headsets can now be found online for as low as $60.
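As an illustration of what that data stream looks like, the sketch below reads attention and meditation values from such a headset over NeuroSky’s published ThinkGear serial protocol. It is a minimal sketch assuming a pyserial connection; the port name is hypothetical, and real code would need more robust error handling.

```python
# Minimal sketch: reading eSense values from a NeuroSky headset via the
# ThinkGear serial protocol. PORT is a hypothetical Bluetooth serial port.
import serial  # pyserial

PORT = "/dev/rfcomm0"
BAUD = 57600

def read_values(conn):
    """Yield (code, value) pairs parsed from ThinkGear packets."""
    while True:
        # Every packet starts with two 0xAA sync bytes.
        if conn.read(1) != b"\xaa" or conn.read(1) != b"\xaa":
            continue
        length = conn.read(1)[0]
        if length > 169:                      # invalid payload length
            continue
        payload = conn.read(length)
        checksum = conn.read(1)[0]
        if (~sum(payload)) & 0xFF != checksum:
            continue                          # corrupted packet; skip it
        i = 0
        while i < len(payload):
            code = payload[i]
            if code >= 0x80:                  # multi-byte value follows
                vlen = payload[i + 1]
                yield code, payload[i + 2:i + 2 + vlen]
                i += 2 + vlen
            else:                             # single-byte value
                yield code, payload[i + 1]
                i += 2

with serial.Serial(PORT, BAUD) as conn:       # blocking reads by default
    for code, value in read_values(conn):
        if code == 0x04:
            print("attention:", value)        # eSense attention, 0-100
        elif code == 0x05:
            print("meditation:", value)       # eSense meditation, 0-100
```

An application could treat these 0–100 readings much like any other sensor input, mapping sustained low attention, say, to a gentler pace in a game.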

This is nothing particularly new: most major consumer electronics and IT companies have been researching brainwave sensors for a decade or more. However, wearables are becoming mainstream products, users’ bodies might soon be littered with sensors of all sorts, and the trend could well move to people’s heads next. Some big questions remain: can the devices be accurate enough to be truly useful, and would users allow their thoughts and feelings to become part of the usability equation?

Some companies are combining other signals from the body to detect mood, such as facial expressions and skin conductance. Webcams could, for example, track a user’s facial movements to adjust the difficulty of gameplay. Fitness trackers and smartwatches could collect information on the wearer’s skin moisture to detect levels of excitement and stress. A user’s psychological and physiological states could be determined using sensors on the wrist and fingers, and some wearables already use electrocardiogram (ECG) sensors to ascertain stress levels.
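To make the idea concrete, here is one deliberately simple way such signals might be fused into a coarse mood label. The feature names, weights and thresholds are hypothetical assumptions for illustration, not any vendor’s actual algorithm.

```python
# Illustrative sketch: fusing body signals into a coarse mood estimate.
# All weights and thresholds below are made-up assumptions.
from dataclasses import dataclass

@dataclass
class BodySignals:
    smile_intensity: float   # 0..1, from webcam facial tracking
    skin_conductance: float  # microsiemens, from a wrist sensor
    heart_rate: float        # beats per minute, from an ECG sensor

def estimate_mood(s: BodySignals) -> str:
    # Arousal: how "activated" the wearer is (stress or excitement).
    arousal = (0.6 * min(s.skin_conductance / 10.0, 1.0)
               + 0.4 * min(max((s.heart_rate - 60) / 60.0, 0.0), 1.0))
    # Valence: positive vs. negative affect, here read from the face alone.
    valence = s.smile_intensity
    if arousal > 0.5:
        return "excited" if valence > 0.5 else "stressed"
    return "content" if valence > 0.5 else "calm"

print(estimate_mood(BodySignals(0.8, 7.2, 95)))  # -> excited
```

Real systems would learn such mappings from labelled data rather than hand-set them, but the shape of the problem, several noisy signals reduced to a small set of mood states, is the same.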

Medical technologies with long track records are finding their way into low-cost consumer wearables and becoming part of contextual computing. A person’s surroundings can be determined by sensors such as GPS chipsets, accelerometers and microphones, and a user’s mood could be added to the mix to create highly relevant, situation-customised experiences.
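A sketch of what such a context record might look like, with mood folded in as one field among the rest; the fields and the rules below are illustrative assumptions:

```python
# Sketch: a context record that adds mood to conventional context signals.
from dataclasses import dataclass

@dataclass
class Context:
    location: str     # e.g. reverse-geocoded from the GPS chipset
    in_motion: bool   # derived from the accelerometer
    noisy: bool       # derived from the microphone
    mood: str         # from the mood sensors discussed above

def pick_experience(ctx: Context) -> str:
    # Hypothetical rules; a real system would be far richer.
    if ctx.mood == "stressed" and ctx.in_motion:
        return "queue a calming playlist, hold non-urgent notifications"
    if ctx.mood == "bored" and not ctx.noisy:
        return "suggest a podcast or a short game"
    return "default behaviour"

print(pick_experience(Context("train", True, True, "stressed")))
```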

Microsoft’s MoodScope project doesn’t just rely on sensors to understand a person’s current mood: its researchers are developing mood-inference algorithms based on a smartphone owner’s usage patterns. Software on the phone could infer whether the user is happy, annoyed, stressed, calm or bored and adjust its output accordingly.
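The published MoodScope work fitted statistical models to usage logs against self-reported mood; the sketch below shows the general idea with a toy least-squares fit. The features and numbers are invented for illustration, and this is not Microsoft’s model.

```python
# Toy sketch in the spirit of MoodScope: predict a mood score from
# daily usage features via least squares. All data here is invented.
import numpy as np

# Columns: calls made, SMS sent, minutes in social apps, distinct apps used.
usage = np.array([
    [12, 30, 45, 9],
    [ 2,  4, 10, 3],
    [ 8, 22, 60, 7],
    [ 1,  2,  5, 2],
], dtype=float)
# Self-reported mood scores (1 = unhappy .. 5 = happy) for the same days.
mood = np.array([4.5, 2.0, 4.0, 1.5])

# Least-squares fit with a bias term appended to the features.
X = np.hstack([usage, np.ones((len(usage), 1))])
weights, *_ = np.linalg.lstsq(X, mood, rcond=None)

today = np.array([6, 15, 30, 5, 1], dtype=float)  # today's usage + bias
print("predicted mood score: %.1f" % (today @ weights))
```

Once trained on a stretch of a user’s own labels, such a model needs no extra hardware at all, which is what makes the usage-pattern approach attractive.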

Is mood sensing the next big thing? Reported accuracy has ranged from 50% to 90%, which is only marginally useful in many cases, but we can expect it to improve. However, privacy issues are already a concern. Google, for example, holds patents for mood-collecting wearables. Perhaps a wearer’s reactions to a car advertisement could be gauged via a smartwatch and mood ring. Other usage scenarios could include adjusting lighting or music based on a user’s current attitude. Mood-put is certainly causing some stress, but it’s a trend worth monitoring. Lifeloggers should be excited.