Launched in June 2014, Pepper is the first social humanoid robot capable of understanding and reacting to human emotions. Well equipped with sensors and a high-level interface for communicating with those around him, Pepper analyses facial expressions and voice tones using the latest advances in voice and emotion recognition.
Currently, Pepper is available in Japan. Already used in SoftBank and Nescafé stores, among others, Pepper can welcome customers, drive foot traffic, give information about products and services, and even collect data. Pepper is engaging, surprising and, above all, the first “emotional” robot. He was not designed for an industrial function but rather to be a true companion for daily life, with an initial focus on affection.
Pepper has the ability to interpret basic expressions of emotion on the human face: a smile, a frown, or a look of surprise, anger or sadness. He also knows how to interpret the intonation of the voice and the context of words, as well as nonverbal cues such as the tilt of the head. Combining these interpretations allows Pepper to determine whether the person in front of him is in a happy or sad mood, rated on a scale between the two states. The goal is for Pepper to genuinely understand a person’s mood and adapt his reactions to fit it. Pepper also has an additional channel for sharing: the tablet placed on his chest. This tablet can display extra information to enrich the interaction and reflects the world inside Pepper. Pepper has a colossal database of questions and answers in Japanese, English and French. Users can choose between three different tones of voice: playful, neutral or didactic.
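The mood estimation described above can be pictured as fusing several perception channels into one score on a sad-to-happy scale. The sketch below is purely illustrative: the function names, channels and weights are assumptions for explanation, not Pepper's actual software or the NAOqi API.

```python
# Hypothetical sketch: fusing emotion cues (face, voice, posture) into a
# single valence score on a sad (-1.0) to happy (+1.0) scale, as the text
# describes. All names and weights here are illustrative assumptions.

def fuse_valence(face_score, voice_score, posture_score,
                 weights=(0.5, 0.3, 0.2)):
    """Combine per-channel scores in [-1.0, 1.0] into one weighted valence."""
    scores = (face_score, voice_score, posture_score)
    valence = sum(w * s for w, s in zip(weights, scores))
    # Clamp to the happy/sad scale mentioned in the text.
    return max(-1.0, min(1.0, valence))

def mood_label(valence, threshold=0.2):
    """Map a valence score to a coarse mood, with a neutral band."""
    if valence > threshold:
        return "happy"
    if valence < -threshold:
        return "sad"
    return "neutral"
```

For example, a clear smile with a flat voice and slightly slumped posture would still yield a net positive valence, so the robot would treat the person as happy and pick a matching reaction.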