Pepper
SoftBank Robotics

Humanoid robot for customer engagement and interaction

Contact for pricing

Pepper is a semi-humanoid social robot designed by SoftBank Robotics for customer service, retail, and hospitality environments. Standing 1.21 meters tall with an expressive tablet interface on its chest, Pepper can recognize faces, detect emotions, and engage in natural conversations with humans. The robot is widely deployed in stores, banks, hospitals, and exhibition spaces to provide information, entertainment, and assistance.

Released: 2014

Overview

Pepper is SoftBank Robotics' flagship humanoid robot, designed to bring social interaction and customer engagement to commercial environments. Launched in 2014 in Japan and expanding globally in subsequent years, Pepper was marketed as the world's first social robot capable of recognizing human emotions and adapting its behavior accordingly. With its friendly appearance, expressive gestures, and interactive touchscreen display, Pepper has become an iconic presence in retail stores, banks, airports, and healthcare facilities worldwide.

The robot stands 1.21 meters tall and moves on an omnidirectional wheeled base, allowing smooth navigation through crowded spaces. Pepper's design prioritizes approachability and engagement, featuring a white humanoid upper body with articulated arms and hands, a tablet interface positioned on its chest for interactive content, and an expressive head capable of various movements. This combination of physical presence and digital interface enables Pepper to serve as both a greeter and information kiosk.

Pepper represents a significant milestone in human-robot interaction, focusing on emotional intelligence rather than task-specific automation. Its ability to detect and respond to human emotions through facial recognition and voice analysis creates more natural and engaging interactions, making it particularly effective in customer-facing roles where human connection matters.

Key Features

  • Emotion Recognition: Advanced AI analyzes facial expressions, voice tone, and body language to detect and respond to human emotions in real-time
  • Natural Interaction: 20 degrees of freedom enable natural gestures and movements, with articulated arms and hands for expressive communication
  • Multi-Sensory Perception: Equipped with 3D camera, dual HD cameras, four directional microphones, touch sensors, sonar, and laser sensors for comprehensive environmental awareness
  • Interactive Touchscreen: 10.1-inch tablet interface on chest provides visual content, forms, and interactive applications to supplement verbal communication
  • Multilingual Capabilities: Supports multiple languages and can be programmed for various conversational scenarios and cultural contexts
  • Cloud Connectivity: WiFi and Ethernet connectivity enable cloud-based AI services, remote management, and continuous software updates
  • Long Battery Life: Up to 12 hours of autonomous operation on a single charge, suitable for full-day commercial deployment
  • Open SDK Platform: NAOqi framework allows developers to create custom applications, behaviors, and integrations for specific business needs
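To make the emotion-adaptive behavior in the feature list concrete, here is a minimal hypothetical sketch of how a custom Pepper application might map a detected emotion to an animation and an utterance. The emotion labels, animation paths, and response phrases are illustrative assumptions, not part of SoftBank's actual API.

```python
# Hypothetical emotion-to-behavior dispatch for a Pepper application.
# Animation paths and phrases are illustrative, not SoftBank's actual content.

EMOTION_RESPONSES = {
    "happy":   ("animations/Stand/Gestures/Enthusiastic_4",
                "Great to see you smiling! How can I help?"),
    "sad":     ("animations/Stand/Gestures/Reassuring_1",
                "Is there anything I can do for you?"),
    "neutral": ("animations/Stand/Gestures/Hey_1",
                "Hello! How can I assist you today?"),
}

def choose_response(emotion):
    """Return an (animation, utterance) pair for a detected emotion,
    falling back to the neutral greeting for unrecognized labels."""
    return EMOTION_RESPONSES.get(emotion, EMOTION_RESPONSES["neutral"])
```

In a real deployment the returned pair would be passed to the robot's animation and speech services; the table-driven design makes it easy to localize phrases or add emotions without touching control logic.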

Applications

Pepper has been deployed across diverse industries globally, with particularly strong adoption in retail, hospitality, healthcare, and education sectors. In retail environments, Pepper serves as a brand ambassador, product advisor, and queue management assistant, engaging customers with personalized recommendations and entertainment while collecting valuable interaction data. Banks and financial institutions use Pepper to greet customers, provide basic account information, and guide visitors to appropriate services. In healthcare settings, the robot assists with patient check-in, wayfinding, and provides companionship in elderly care facilities.

The education sector has embraced Pepper as a teaching assistant and learning companion, particularly for STEM education and programming instruction. Hotels and airports deploy Pepper for guest services, multilingual information delivery, and self-service check-in processes. Conference centers and exhibitions use the robot to attract visitors, deliver presentations, and create memorable interactive experiences. The robot's versatility and programmability have led to creative applications ranging from autism therapy support to corporate reception assistance, demonstrating its adaptability across various customer engagement scenarios.

Technical Highlights

Pepper's emotion recognition system represents a significant achievement in affective computing, combining computer vision, voice analysis, and machine learning algorithms to assess human emotional states. The robot's perception system integrates multiple sensor modalities including a 3D depth camera for spatial awareness, dual HD cameras for face and gesture recognition, four directional microphones for sound localization and speech recognition, and various proximity sensors for safe navigation. This sensor fusion enables Pepper to maintain situational awareness while simultaneously tracking and engaging with multiple individuals.
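The fusion idea described above can be illustrated with a minimal sketch: per-modality emotion scores (for example, from face and voice analysis) are combined as a weighted average before selecting the dominant emotion. The weights, labels, and late-fusion scheme here are assumptions for illustration, not Pepper's actual model.

```python
# Minimal sketch of late sensor fusion for emotion estimation.
# Modality weights and emotion labels are illustrative assumptions.

def fuse_emotions(face_scores, voice_scores, face_weight=0.6, voice_weight=0.4):
    """Combine per-modality emotion score dicts into one estimate via a
    weighted average, then return (most likely emotion, fused scores)."""
    labels = set(face_scores) | set(voice_scores)
    fused = {
        label: face_weight * face_scores.get(label, 0.0)
             + voice_weight * voice_scores.get(label, 0.0)
        for label in labels
    }
    return max(fused, key=fused.get), fused

# Example: the face channel is confident the person is happy,
# the voice channel leans neutral; the weighted fusion favors "happy".
best, scores = fuse_emotions({"happy": 0.7, "neutral": 0.3},
                             {"happy": 0.4, "neutral": 0.6})
```

Late fusion like this keeps each modality's classifier independent, so a noisy channel (e.g., voice in a loud store) can simply be down-weighted.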

The robot's omnidirectional wheeled base provides smooth, stable movement at speeds up to 3 km/h, with advanced obstacle avoidance and autonomous navigation capabilities. Pepper's 20 degrees of freedom are strategically distributed across its body, including mobile head joints, articulated shoulders, elbows, wrists, and hands, plus hip joints that enable expressive upper-body movements. The NAOqi operating system and development framework provide extensive APIs and tools for customization, supporting Python, C++, Java, and JavaScript, with integration capabilities for popular cloud AI services and business systems. This technical foundation has established Pepper as a versatile platform for social robotics research and commercial applications.
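As a sketch of what a NAOqi-based Python application looks like, the snippet below composes a greeting in a pure helper and then speaks it through the robot's text-to-speech service. The `ALProxy`, `ALTextToSpeech`, `setLanguage`, and `say` names follow the NAOqi API; the robot IP, greeting phrases, and helper functions are assumptions for illustration, and the robot-facing call requires the NAOqi Python SDK and a reachable Pepper.

```python
def build_greeting(name, language="English"):
    """Pure helper: compose the utterance Pepper will speak.
    Kept separate from the robot connection so it is easy to test."""
    greetings = {"English": "Hello", "French": "Bonjour", "Japanese": "Konnichiwa"}
    return "{}, {}! Welcome!".format(greetings.get(language, "Hello"), name)

def greet_visitor(robot_ip, name, language="English", port=9559):
    """Connect to Pepper's text-to-speech service and speak a greeting.
    Requires the NAOqi Python SDK and a robot reachable at robot_ip."""
    from naoqi import ALProxy  # imported here so build_greeting works without the SDK
    tts = ALProxy("ALTextToSpeech", robot_ip, port)
    tts.setLanguage(language)
    tts.say(build_greeting(name, language))
```

Separating the text-composition logic from the proxy call is a common pattern for NAOqi apps: the business logic can be unit-tested off-robot, while the thin service layer is exercised on hardware or in SoftBank's simulator.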
