We live in a world where we interact with technology via screens and user interfaces, and where 'true machine learning' is becoming a widely deployable, out-of-the-box client solution for mainstream use. We have universal connectivity through the 'Internet of Things', and the technology it brings has slipped seemingly effortlessly into daily use to improve our lives.
Brand-to-user – it's getting intimate.
And it's happening at speed. It's bringing new challenges to artificial intelligence and user-experience designers: the next-step experience – the removal of the user interface, where user-machine correspondence happens outside of a screen, through voice, text, and ultimately, emotional intuition – is now a reality.
A machine talks, and we want more.
Research shows that when machines talk, users start to expect more. Users are looking for relationships, and looking towards artificial intelligence in intimate terms: when machines talk, humans naturally want them to provide the same connection that comes from a friend, a confidante, even a parent. Designers are bringing emotion and personalities to machines. It's emotions that will make the difference, from data-driven interactions to deep emotional intelligence guided experiences.
Siri, Ellie and Karim.
It's a telling iPhone fact that many male users keep their ever-faithful Siri set to a female voice – many have posed probing questions about her personal life, yet Siri remains ever-present and always responds, politely and unfazed. Users are welcoming and enjoying relationships with AI machines. Moreover, machines that attempt to react like a human – to diagnose and interpret emotions, give advice, trigger actions to change emotions, or provide services that alter users' emotions – are seen as machines with real benefit. It's a positive and voluntary user reaction: the user feels comfortable and agrees to allow the service to manipulate their emotions.
It's becoming known as 'machine therapy', and there are already some surprising applications, going way beyond seeing how far you can probe Siri's private life. In the US, the armed services use virtual mental-health professionals to provide virtual psychotherapy treatments – 'Ellie' is helping to treat soldiers with post-traumatic stress disorder, while 'Karim' is helping Syrian refugees overcome trauma. The 'virtual coach' is also rapidly becoming mainstream. Microsoft introduced its chatbot Xiaoice through China's WeChat app in 2014 to sweet-talk and seduce its users. Three days after launch, it had been added to conversations 1.5 million times. Xiaoice now has emotional best-friend exchanges with 40 million users, who have told it "I love you" over 100 million times since its launch.
• • •
“Designers are bringing emotion and personalities to machines. It’s emotions that will make the difference, from data-driven interactions to deep emotional intelligence guided experiences.”
• • •
Ethics, respect and trust.
As with any relationship, trust is paramount. And when it comes to brand building, nothing has changed – how brands craft their value propositions, brand personalities, and communication strategies still plays a crucial role in maintaining and deepening brand affinity and building trust. When entering into emotional relationships, a breach of trust will never be forgiven. Brands need to tread carefully when entering the world of emotional experiences – the stakes are sky-high, and the fall-out from proven emotional abuse will be devastating.
It's about the ethics of artificial intelligence – ethics that should be continually considered and strictly governed. We are in the early stages of AI exploration, and brands are starting to test and develop emotional indicators in the broadest sense: from benign experimentation, where humans receive a chocolate when the machine detects a smile, to blatant brand bombing, where a TV advert hijacks every AI machine in a home to advertise fast food. But users and their expectations are already shifting to demand more of artificial intelligence than it is capable of giving. We will soon forget what it was like to frown or sigh at our device; instead, the same device will soon be telling us, 'Sorry… you didn't like that, did you?'
Prototyping and experiential testing will help designers and brands reach the appropriate level of intimacy with users – and it's the experiments that fail that will determine the rules for maintaining this newly established emotional trust. On the way to the new human-machine interaction paradigm, the biggest hurdle might not be the creation of emotionally intelligent machines, but the effort to find humans emotionally intelligent enough to build them.