When AI met EQ

Affective computing may be a multi-billion-dollar business, but much of how it works remains obscure.

'The essential role of emotion in both human cognition and perception ... indicates that affective computers should not only provide better performance in assisting humans, but also might enhance computers' abilities to make decisions.'

It was way back in 1995 that Professor Rosalind W Picard made that statement, in Technical Report No. 321 from the Perceptual Computing wing of the Massachusetts Institute of Technology (MIT). The paper's alternative, more user-friendly title, Affective Computing, now commonly describes a branch of computer science that creates user interfaces with a more intuitive touch.

What was back then blue-sky thinking has today transformed into tangible software and hardware - and an industry with significant economic value. A 2017 report from MarketsandMarkets tipped the global affective computing sector to be worth some $54bn (£42.6bn) by 2021. Key factors cited in that forecast included increasing adoption of wearable tech, flourishing industry partnerships and ecosystems, and (perhaps a warning) an absence of governing bodies and regulations.

To define affective computing, let's start with what it is not. Picard wrote: 'I am not proposing ... the business of building "emotional computers".' And that still holds true. As Slawomir J Nasuto, professor of cybernetics at the University of Reading, told Forward: 'It's about creating systems that can infer emotional states from humans, via various means. These may include speech, facial expressions, posture, biological signals and/or electroencephalogram [EEG].'

Having gathered information, the system 'must recognise, or interpret, what the emotion may be. And then it does something in response to that information. That "doing" bit will typically involve the adjustment of something perceptible to the human subject, such as a software program's display or even a mechanical artefact, such as a robot'.

Emotional components

Core types of hardware deployed throughout the affective computing sector include:

  • cameras and other scanning and imaging devices that monitor and track facial expressions, eye movements, gestures and postures;
  • adhesive body sensors that read EEGs and 'galvanic skin response' (subtle shifts in the skin's electrical activity);
  • high-end audio equipment that records variances and textures in users' voices;
  • virtual reality (VR) gear, such as head-mounted displays (HMDs) that enable users to take part in simulated experiences; and
  • 'haptic' technologies that deliver to the user particular stimuli - vibrations, say - to simulate physical effects in virtual settings. Haptics have long been used in gaming control pads to suggest the feeling of driving fast or using heavy weapons, and they are now being worked into gloves and other garments used in conjunction with HMDs in the course of VR events.

Making sense of the data produced by these hardware types is an array of sophisticated software applications in the fields of artificial intelligence (AI) and machine learning. Those programs run the complex algorithms that enable the systems of which they are a part to interpret, and act on, users' emotional cues.

Some of the most notable innovators in affective computing are active in the retail space. They include AI company Realeyes, which works with several big-name brands, such as Coca-Cola, Expedia, Mars, AT&T and LG, helping them measure, optimise and compare the effectiveness of their content.

Realeyes has taught computers to measure viewers' emotions and attention levels using webcams. It can show brands' content to panels of consenting consumers all around the world and let advertisers know exactly how audiences feel about a campaign and whether they are paying attention or not. Marketers get an overall score based on attention and emotional engagement, enabling them to compare multiple assets instantly by industry or competitor, or benchmark them against previous campaigns. They are also shown the moments where emotions and attention were at their highest and lowest, enabling them to make edits to maximise engagement.
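As a rough illustration of how frame-by-frame estimates might be rolled up into the kind of scores marketers see, the Python sketch below averages per-frame attention and emotion readings into an overall engagement figure and flags the strongest and weakest moments. The field names, weighting and scoring rule are entirely hypothetical; Realeyes does not publish its method.

```python
from dataclasses import dataclass

@dataclass
class FrameEstimate:
    timestamp: float          # seconds into the video
    attention: float          # 0.0-1.0, probability the viewer is looking at the screen
    emotion_intensity: float  # 0.0-1.0, strength of the dominant detected emotion

def engagement_score(frames: list[FrameEstimate]) -> float:
    """Overall score: mean of attention and emotional intensity across sampled frames."""
    if not frames:
        return 0.0
    per_frame = [(f.attention + f.emotion_intensity) / 2 for f in frames]
    return sum(per_frame) / len(per_frame)

def peak_and_trough(frames: list[FrameEstimate]) -> tuple[float, float]:
    """Timestamps where combined engagement was highest and lowest."""
    key = lambda f: (f.attention + f.emotion_intensity) / 2
    return max(frames, key=key).timestamp, min(frames, key=key).timestamp
```

In practice the per-frame values would come from a webcam-based classifier, and the aggregation would be benchmarked against comparable campaigns rather than read as an absolute number.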

Microsoft's Human Understanding and Empathy team is working on various projects with the aim of integrating affective computing into its products. These include developing a multimodal emotion-sensing platform.

A number of academic research hubs are also working on affective computing projects, driven by some of the leading thinkers on the subject. Notably, iBUG (the Intelligent Behaviour Understanding Group) at Imperial College London is dedicated to detailed machine analysis of human behaviour through multimodal methods - in other words, extracting data from a variety of sources, such as facial, vocal and gesture-based cues, and aggregating it. The group is led by affective computing authority Professor Maja Pantic.
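One common way of aggregating such multimodal cues is 'late fusion': each modality gets its own classifier, and their probability estimates are merged afterwards. The sketch below shows that idea in minimal Python; the emotion labels, weights and numbers are illustrative assumptions, not iBUG's actual models.

```python
# Late-fusion sketch: each modality-specific model emits a probability
# distribution over the same emotion labels; the distributions are combined
# with a weighted average. Labels and weights are illustrative only.

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def fuse(predictions: dict[str, list[float]],
         weights: dict[str, float]) -> list[float]:
    """Weighted average of per-modality probability distributions."""
    fused = [0.0] * len(EMOTIONS)
    total = sum(weights[m] for m in predictions)
    for modality, probs in predictions.items():
        w = weights[modality] / total
        fused = [f + w * p for f, p in zip(fused, probs)]
    return fused

# Example: the face and voice models disagree; fusion tempers both views.
face =  [0.70, 0.05, 0.05, 0.10, 0.10]
voice = [0.30, 0.10, 0.10, 0.20, 0.30]
print(fuse({"face": face, "voice": voice}, {"face": 0.6, "voice": 0.4}))
```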

Intimate relationship

At the University of Sydney, the Wellbeing Technology Lab was first established under the name 'the Positive Computing Lab', a reflection of the title of a 2014 book by its founders, Professor Rafael Calvo and Dorian Peters, both leading lights in the field of user experience. In the book, they look at how technology is becoming more physically and psychologically intimate, and argue that it must therefore include functions that embody concepts of mindfulness, empathy and compassion.

Among the Lab's research partners today are the Black Dog Institute, an Australian mental-health clinical services group, and ReachOut, a similar body that aims to help young people. More broadly, there is considerable interest among affective computing experts and creatives in harnessing the relevant technologies for a range of social benefits.

For Nasuto, it is easy to imagine how readily affective-computing technologies could be integrated with public-sector infrastructures to fulfil a social purpose.

One example is education, says Nasuto, where schools may, for instance, use some form of computerised tutoring. This may work more effectively alongside technology that recognises the mental state of pupils - stress or attention arousal, say - states that signify whether the learners are struggling, interested or simply bored. If, on the basis of that input, the system can adjust the difficulty of the problem, the style of explanation or the pace of delivery, it could keep the students engaged and maximise the teaching's impact.
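A control loop of that kind can be sketched very simply. The Python fragment below uses a made-up state vocabulary and adjustment rules - it is not any particular product's logic - to show how an inferred learner state might drive difficulty and pacing.

```python
# Illustrative affect-aware tutoring loop (hypothetical API). The inferred
# learner state nudges difficulty and pacing up or down.

def adjust_lesson(state: str, difficulty: int, pace: float) -> tuple[int, float]:
    """Return new (difficulty, pace) given an inferred learner state."""
    if state == "stressed":            # learner is struggling: ease off
        return max(1, difficulty - 1), pace * 0.8
    if state == "bored":               # learner is under-challenged: push harder
        return difficulty + 1, pace * 1.2
    return difficulty, pace            # "engaged": leave settings alone

# One pass of the loop for a learner the classifier judges to be bored.
print(adjust_lesson("bored", difficulty=3, pace=1.0))   # -> (4, 1.2)
```

In a real system the state would come from a classifier over the kinds of signals described earlier, and the adjustments would be considerably more nuanced.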

Nasuto also sees potential in the use of affective computing in 'social robots' for care homes. He and colleagues have been working on 'a brain-computer interface for affective-state modulation with music' and a system that is designed to recognise a subject's emotional state - from EEG and other physiological signals - and then modify a music stream emanating from a sound source. 'An algorithm generates the music on the fly, and we're looking at how the stream could react to, and then change, the subject's affective state. We are interested in the possibility of using a system like this to treat emotional disorders in people in residential care. It could be a form of therapy on its own or an adjunct to other therapies.'
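To make the closed loop concrete, here is a deliberately simplified Python sketch of the general idea Nasuto describes: estimate an affective state from physiological features, then steer the generated music toward a target state. The band-power ratio, the proportional step and the tempo/loudness mapping are all assumptions for illustration; they are not the Reading group's published method.

```python
# Toy closed loop: infer arousal from EEG band power, then move music
# parameters a step toward a target arousal level. All mappings are
# illustrative assumptions.

def estimate_arousal(eeg_band_power: dict[str, float]) -> float:
    """Crude arousal proxy: beta/alpha power ratio, clipped to [0, 1]."""
    ratio = eeg_band_power["beta"] / max(eeg_band_power["alpha"], 1e-6)
    return min(ratio / 2.0, 1.0)

def music_parameters(arousal: float, target_arousal: float) -> dict[str, float]:
    """Nudge tempo and loudness toward the target arousal level."""
    step = 0.3 * (target_arousal - arousal)        # simple proportional control
    return {
        "tempo_bpm": 90 + 60 * (arousal + step),   # calmer state -> slower music
        "loudness":  0.4 + 0.5 * (arousal + step),
    }

# One iteration: a stressed listener (high arousal) guided toward a calmer state.
bands = {"alpha": 0.8, "beta": 1.9}
print(music_parameters(estimate_arousal(bands), target_arousal=0.3))
```

A working system would, of course, close the loop continuously, re-estimating the state as the music changes rather than taking a single reading.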

Certain neurological disorders are associated with emotional disturbances, he continues - for example, people with various forms of dementia often develop depression. In order to create a therapy to mitigate memory loss, you must address the patient's emotional state too. If you don't, even a brilliant therapy may not work optimally because the subject is agitated or withdrawn and can't engage. 'So addressing the emotional side by using music to put them in a state that's more receptive to the cognitive therapy may have a synergistic effect,' says Nasuto.

Nasuto has explored how a similar system could be deployed in intensive-care wards. 'When people undergo operations,' he says, 'particularly after traumatic events, they are highly stressed. And post-operative medications have side effects that often cause cognitive impairment. So a non-pharmacological intervention, such as music, may help to reduce the levels of anxiety that the patient is suffering. That, in turn, may enable the clinicians to lower the doses of medications that the patient is receiving.'

The ethics question

As to the ethical and legal dimensions of affective computing, Nasuto admits these technologies create new challenges: 'Affective computing is a tool - and any tool can be used for either good or nefarious purposes. What is specific here is the pervasiveness of affective computing via online connectivity and the emergence of cheaper and more networked sensing technologies. Together, they will open the way for the collection of unprecedented volumes of data on the human state.

'Perhaps in the future, EEG signals will be used for financial authentication - so there are data-protection issues there. And it's no stretch of the imagination to envisage that one day your TV set may have a camera and microphone that can pick up not only what you're doing and saying, but also how you're doing and saying it. That data could be used to gauge your reaction to shows and commercials, so it could be monetised by the media industry. These are all new ethical horizons that we will have to make sense of.'

Richard Johnson also discussed affective computing in IEEE Spectrum's biomedical engineering blog.