During her talk at Web2day, Noémie Lecorps shares some tips on how to make emotional design a central asset in the development of artificial intelligence.

For Noémie Lecorps, the main objective of design is to make it easier for users to make decisions. We are entering an era of personal growth in which everyone is seeking well-being and a better understanding of themselves. The way users consume products and services is becoming increasingly responsible: it serves an intention, a quest for meaning, and leaves aside the unnecessary or superficial.

Noémie Lecorps regularly lists the objects she uses every day and those she does not. She applies the same process to the applications on her phone. The central question she asks to declutter her services: does this save her time? This obsession with optimizing one’s time is also reflected in the raison d’être of today’s assistants: to free up users’ time while being useful to them.

AIs still too inhuman

But with the arrival of Alexa, Siri and the like, are we really witnessing an optimal experience of artificial intelligence at the service of users? Noémie Lecorps points out the flaws that are still very visible in these AIs today: an Apple Watch that invites you to get up and exercise while you are in a meeting (and it knows it, since it has access to your calendar); or a Google Home that asks you which light it should turn on at 90%. “No one asks for that,” says the designer, “and it is that kind of detail that breaks the relationship between human and machine.” While today’s information is certainly personalized, it is not adapted to real life. That calls for more emotional intelligence in products and services.

How can we improve the effectiveness of these intelligences? Noémie Lecorps emphasizes interpersonal and intrapersonal intelligence, which together form emotional intelligence. From a design point of view, it is important to define what that means: it is the ability to analyze, recognize and understand one’s own emotions and those of others. In emotional design, we talk a lot about empathy. The user experience revolves around this idea, and designers need to shape interactions that are both functional and emotional. Noémie Lecorps therefore shares some good practices for understanding the emotional impact of what the designer will create for humans.

Designing the basic cases and personalizing the experience

In practice, it is about starting to think about and test things as artificial intelligence technology evolves. Noémie Lecorps takes the example of Vince Speelman and the nine states of design. When designing a product or an experience, you need to know how to design nine basic cases (a hypothetical code sketch follows this list):

  • Nothing: when a user opens your service for the first time, what information should they be given to start interacting with it?
  • Loading: ensure that the loading of the service is subtle and discreet. Facebook, for example, uses the skeleton of an empty post.
  • None: when the content is loaded but empty, it is an opportunity to get the user to act (“Do this”) or reward them (“Good job, you’re all set”)
  • One: on an input field, this can be the interaction after the first keystroke. In a list, this can be when it contains a single item.
  • Some: your data is loaded, there are entries, and the user is familiar with them.
  • Too many: too many results, too many characters, too many queries… Think about this case to lighten the content.
  • Incorrect: how to positively inform the user of an error
  • Correct: how to positively inform the user of the success of a task
  • Done: notify the user when the task is completed and especially when it is validated.
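
Purely as an illustration of this checklist (not something presented in the talk), the nine states can be modelled as a discriminated union so that a screen is forced to handle every case explicitly; all type and field names below are hypothetical.

```typescript
// Hypothetical sketch: modelling Vince Speelman's nine states of design
// as a discriminated union, so the UI must handle every case explicitly.

type ViewState<T> =
  | { kind: "nothing" }                             // first visit: explain how to start
  | { kind: "loading" }                             // show a discreet skeleton
  | { kind: "none" }                                // loaded but empty: prompt or reward
  | { kind: "one"; item: T }                        // a single entry
  | { kind: "some"; items: T[] }                    // the familiar, nominal case
  | { kind: "tooMany"; items: T[]; limit: number }  // lighten or paginate the content
  | { kind: "incorrect"; message: string }          // positively worded error
  | { kind: "correct"; message: string }            // positively worded success
  | { kind: "done" };                               // task completed and validated

// Example renderer: the switch is exhaustive, so forgetting one of the
// nine states becomes a compile-time error rather than a design oversight.
function render(state: ViewState<string>): string {
  switch (state.kind) {
    case "nothing":   return "Welcome! Add your first item to get started.";
    case "loading":   return "…"; // skeleton placeholder
    case "none":      return "Nothing here yet. Do this to begin.";
    case "one":       return `Your item: ${state.item}`;
    case "some":      return state.items.join(", ");
    case "tooMany":   return state.items.slice(0, state.limit).join(", ") + "…";
    case "incorrect": return `Almost there: ${state.message}`;
    case "correct":   return `Good job: ${state.message}`;
    case "done":      return "All set, the task is validated.";
  }
}
```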

But that is not enough. When interacting with an AI, you must also respect the principles of trust, control over the interaction, feedback on actions, visibility into what is done with the user’s data, as well as the ability to make mistakes and go back. You therefore need to know how to start from a simple use case and personalize the experience.
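
As a rough, hypothetical sketch, these principles can be turned into a simple review checklist applied to each interaction; the interface and function names are assumptions, not part of the talk.

```typescript
// Hypothetical checklist for reviewing an AI interaction against the
// principles mentioned above; structure and names are illustrative.

interface InteractionContractCheck {
  buildsTrust: boolean;             // does the interaction earn the user's trust?
  userControlsInteraction: boolean; // can the user steer or interrupt it?
  givesFeedbackOnActions: boolean;  // is every action acknowledged?
  dataUsageIsVisible: boolean;      // is it clear what is done with the data?
  mistakesAreReversible: boolean;   // can the user make an error and go back?
}

// An interaction passes the review only if every principle is respected.
function passesReview(check: InteractionContractCheck): boolean {
  return Object.values(check).every(Boolean);
}

// Example: a voice command that cannot be undone fails the review.
const lightsCommand: InteractionContractCheck = {
  buildsTrust: true,
  userControlsInteraction: true,
  givesFeedbackOnActions: true,
  dataUsageIsVisible: true,
  mistakesAreReversible: false,
};
console.log(passesReview(lightsCommand)); // false
```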

Then comes the handling of error cases and of frustration or disappointment. “When you talk to someone, it is rare to be disappointed by what the person says to you. A question is always followed by an answer relevant to the subject. However, we often experience this kind of disappointment when interacting with a robot. Emotional intelligence allows us to work on the design of an AI so that we no longer feel this frustration,” says Noémie Lecorps.

Define a relational model or interaction contract

It is therefore necessary to detect emotions, interpret them, and then act according to the emotions of your users. It is also important to define the personality of your AI: the character you expect of it, its temperament (what is innate, how it reacts to an emotion), its behaviour (what it does, how it interacts with others) and its attitude (its psychological stance). It is possible to run personality tests to define what is expected of an AI so that it inspires confidence and makes people want to interact with it.
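
A minimal sketch of that detect-interpret-act loop, assuming a text-based assistant; every name and rule below is illustrative rather than taken from the talk.

```typescript
// Illustrative pipeline: detect the user's emotion, then act in a way
// that stays consistent with the assistant's defined personality.

type Emotion = "joy" | "frustration" | "sadness" | "neutral";

interface Personality {
  character: string;    // the character expected of the assistant
  temperament: string;  // what is innate, how it reacts to an emotion
  behaviour: string;    // what it does, how it interacts with others
  attitude: string;     // its psychological stance
}

// Detection would normally rely on richer signals (text, voice, context);
// here it is stubbed with simple keyword rules for the sake of the sketch.
function detectEmotion(userMessage: string): Emotion {
  if (/thanks|great|perfect/i.test(userMessage)) return "joy";
  if (/again|not working|useless/i.test(userMessage)) return "frustration";
  return "neutral";
}

// The reply is shaped by both the detected emotion and the personality.
function actOn(emotion: Emotion, personality: Personality): string {
  switch (emotion) {
    case "frustration":
      return personality.temperament.includes("calm")
        ? "I understand, let's fix this together."
        : "Sorry about that, let me try again.";
    case "joy":
      return "Glad that worked! Anything else?";
    case "sadness":
      return "Take your time, I'm here whenever you need me.";
    default:
      return "How can I help?";
  }
}

// Example usage with a hypothetical personality definition.
const personality: Personality = {
  character: "helpful guide",
  temperament: "calm under pressure",
  behaviour: "asks before acting",
  attitude: "optimistic",
};
console.log(actOn(detectEmotion("It's not working again"), personality));
```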

Then comes the definition of the AI’s emotional model: the character you give it, its tone of voice (dynamic, compassionate, accompanying, etc.), its vocabulary, and its degree of proximity to the user.
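
A minimal sketch of what such an emotional model could look like as plain configuration, again assuming a text-based assistant; all fields and values are illustrative.

```typescript
// Hypothetical configuration describing an AI's emotional model:
// its character, tone of voice, vocabulary and degree of proximity to the user.

type ToneOfVoice = "dynamic" | "compassionate" | "accompanying";
type Proximity = "formal" | "friendly" | "close";

interface EmotionalModel {
  character: string;     // the character given to the AI
  tone: ToneOfVoice;     // how it sounds
  vocabulary: string[];  // words and phrasings it favours
  proximity: Proximity;  // how close it gets to the user
}

// Example: a compassionate assistant that stays friendly but not intrusive.
const assistantModel: EmotionalModel = {
  character: "calm and encouraging",
  tone: "compassionate",
  vocabulary: ["let's", "together", "no rush"],
  proximity: "friendly",
};

// The model can then drive how a raw system message is worded.
function phrase(message: string, model: EmotionalModel): string {
  const opener = model.proximity === "formal" ? "Please note:" : "Just so you know,";
  return `${opener} ${message}`;
}

console.log(phrase("your backup finished successfully.", assistantModel));
```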

It should also be kept in mind that a human being’s memory does not work like an AI’s. An AI will retain everything, whereas humans sometimes need to forget in order to move forward in their lives. “If an AI remembers every failure, every negative action, it won’t be motivating to move forward,” says Noémie Lecorps. A typical example is Facebook resurfacing images from 8 years ago, without worrying about the emotional impact they will have on the user, whether the memories are positive or negative. It is therefore necessary to take the impact of particular circumstances into account; there are very few truly general situations. The idea is thus to design an experience that accompanies the user, one whose purpose is to help while remaining aware of their emotions.
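
Purely as an illustrative sketch (not something described in the talk), a memory-resurfacing policy could weigh a memory’s estimated emotional impact before bringing it back to the user:

```typescript
// Hypothetical policy: only resurface a memory when its estimated
// emotional valence is positive enough, instead of replaying everything.

interface Memory {
  date: Date;
  description: string;
  valence: number; // estimated emotional impact, from -1 (painful) to +1 (happy)
}

function memoriesToResurface(memories: Memory[], threshold = 0.3): Memory[] {
  return memories.filter((m) => m.valence >= threshold);
}

// Example: the painful memory is simply not pushed back to the user.
const memories: Memory[] = [
  { date: new Date("2015-06-01"), description: "Trip with friends", valence: 0.8 },
  { date: new Date("2016-02-10"), description: "Job rejection", valence: -0.7 },
];
console.log(memoriesToResurface(memories).map((m) => m.description)); // ["Trip with friends"]
```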