Robbie Roybut - Exploring AR Experiences
Robbie at Invalides, Paris
Robbie is a semi-autonomous robotic platform, now in its fourth iteration, with the mission of exploring new ways to create augmented reality experiences. Designed to navigate both interior and exterior environments, Robbie captured 360-degree video that we then edited with our custom AR annotation software to enrich the experience. Robbie was deployed at trade shows in Germany and the US, as well as on many tours around Paris. We hoped to establish relationships with museums and other cultural sites in the city and begin building a catalog of educational AR experiences that people could enjoy from home.
Hardware: What makes the robot tick
Robbie is composed of several systems working in tandem to provide a smooth navigation and recording experience. We designed Robbie from the ground up, and each iteration taught us something we used to improve the next.
- Raspberry Pi
  - Primary brains
  - Accepts Bluetooth connections for control, exposing GATT services/characteristics
  - Communicates with the other hardware via USB serial
- MPU9250
  - 9-axis IMU; contributes to navigation
- Rotary encoders (motors)
  - Record how far each wheel has turned; contributes to navigation
- Ultrasonic collision prevention
  - Detects approaching surfaces to prevent crashes
- Motor driver
  - Provides a safe interface between the low-voltage control system and the higher-voltage motor system
- Teensy
  - 32-bit ARM microcontroller
  - Interfaces with the MPU9250 and other sensors
  - Performs sensor fusion to give Robbie a sense of direction and orientation
  - Automatically course corrects to compensate for slippage, slopes, etc. (see the sketch after this list)
  - Provides dynamic power to the wheels to overcome obstacles such as curbs
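To make the course-correction bullet concrete, here's a minimal sketch of the idea in C++. This isn't Robbie's actual firmware; `fusedHeading`, `setMotorPower`, and the gain value are placeholders I've invented for illustration.

```cpp
// Hypothetical helpers, named for illustration (not Robbie's real API).
float fusedHeading();                        // degrees, from sensor fusion
void  setMotorPower(int motor, float power); // -1.0..1.0 to the motor driver
const int LEFT_MOTOR = 0, RIGHT_MOTOR = 1;

const float kP = 0.02f; // proportional gain; tuned per surface

// Wrap a heading error into [-180, 180] so Robbie turns the short way.
float headingError(float target, float current) {
  float err = target - current;
  while (err > 180.0f)  err -= 360.0f;
  while (err < -180.0f) err += 360.0f;
  return err;
}

// Trim per-wheel power based on the fused heading to hold a straight line.
void driveStraight(float targetHeading, float basePower) {
  float correction = kP * headingError(targetHeading, fusedHeading());
  setMotorPower(LEFT_MOTOR,  basePower - correction);
  setMotorPower(RIGHT_MOTOR, basePower + correction);
}
```

The same feedback loop is what lets the robot shrug off slippage and slopes: any drift shows up as heading error, and the wheels are trimmed until it disappears.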
Software: What makes the robot tock
- Raspberry Pi
  - Node.js with Johnny-Five and Bleno
    - Johnny-Five gave us convenient interfaces to the Teensy
    - Bleno provided the Bluetooth GATT API
- Teensy
  - Firmware written in C++ (see the sketch after this list)
  - Used I2C for MPU9250 communication
  - Hardware interrupts for rotary encoders
- iOS: RobBoss iPhone app
  - Controlled Robbie via Bluetooth
- Desktop: RobBoss Electron app
  - Used Noble to control Robbie via Bluetooth
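To illustrate the firmware half of the stack, here's a stripped-down Teensy-style sketch of the two bullets above: an I2C register read from the MPU9250 using the standard Arduino Wire library, and a hardware interrupt counting encoder edges. The I2C address and register come from the MPU9250 datasheet; the pin number is a placeholder, and this is my reconstruction rather than Robbie's real firmware.

```cpp
#include <Wire.h>

// MPU9250 constants from the datasheet; the encoder pin is a placeholder.
const uint8_t MPU9250_ADDR  = 0x68; // default I2C address (AD0 low)
const uint8_t ACCEL_XOUT_H  = 0x3B; // first sensor-data register
const int     ENCODER_PIN_A = 2;

volatile long encoderTicks = 0;

// Hardware interrupt: count encoder edges (a full quadrature decoder
// would also read channel B to recover direction).
void onEncoderEdge() { encoderTicks++; }

void setup() {
  Wire.begin();
  pinMode(ENCODER_PIN_A, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ENCODER_PIN_A), onEncoderEdge, CHANGE);
}

// Read the raw X-axis accelerometer value over I2C (two bytes, big-endian).
int16_t readAccelX() {
  Wire.beginTransmission(MPU9250_ADDR);
  Wire.write(ACCEL_XOUT_H);
  Wire.endTransmission(false); // repeated start keeps the bus
  Wire.requestFrom(MPU9250_ADDR, (uint8_t)2);
  return (int16_t)((Wire.read() << 8) | Wire.read());
}

void loop() {
  int16_t ax = readAccelX();
  // ...feed ax (and the other axes) into sensor fusion, and
  // encoderTicks into odometry.
  (void)ax;
}
```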
Lessons Learned
Friction, or how I learned to test on carpet
I had an amusing (thankfully recoverable!) experience leading up to Robbie V3's first deployment at a major event. V3 had much larger wheels, both for stability over minor surface imperfections and to tackle larger obstacles on outdoor excursions. I had been testing on hardwood floors and other relatively smooth surfaces, where the larger wheels performed exactly as I'd hoped. And then I drove onto carpet. Larger wheels require much more torque to spin, and the difference in friction meant that the power curves I had configured for turns produced a stuttery mess. Thankfully I discovered my oversight with enough time to adjust the power curves and wheels (the event floor was carpeted!), and V3 attended without any embarrassing turns.
Don't assume that your test conditions are representative of the real world, and be prepared to make adjustments as conditions change!
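For the curious, the fix amounted to raising the floor of the power curve so the motors always receive enough torque to break static friction on a given surface. A hypothetical illustration in C++ (the struct, names, and numbers are mine, not Robbie's actual tuning):

```cpp
#include <cmath>

// Hypothetical power-curve configuration, invented for illustration.
struct PowerCurve {
  float minPower; // floor needed to overcome static friction
  float maxPower; // cap for the motor driver
};

// Hardwood needs little torque to break the wheels free; carpet needs
// much more, so its floor sits far higher.
const PowerCurve HARDWOOD = {0.10f, 1.0f};
const PowerCurve CARPET   = {0.35f, 1.0f};

// Map a requested turn rate in [-1, 1] to a motor power that never
// dips below the surface's stall threshold.
float turnPower(float turnRate, const PowerCurve& curve) {
  if (turnRate == 0.0f) return 0.0f;
  float magnitude = curve.minPower +
      (curve.maxPower - curve.minPower) * std::fabs(turnRate);
  return std::copysign(magnitude, turnRate);
}
```

With a floor tuned for hardwood, a gentle turn on carpet commands less power than the wheels need to keep moving, and they stall and lurch instead of sweeping smoothly.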
People Love Goofy
We tested different outfits for Robbie because we didn't want to alarm people as it drove around. We would either send it out naked, circuits and wires visible to anyone who looked, or send it clad in "clean" paneling. Invariably, the more polished Robbie looked, the less people wanted to be around it. But when we sent it out battery flapping in the wind, people loved it; some even took selfies with the derpy robot.
Don't be afraid to give your products personality. People relate to, and are far more comfortable with, things that look silly than with sterile design-by-committee facades.
Effective AR requires (very) high resolutions
When we began work on Robbie, the 360 cameras on the market were limited to 1080p per hemisphere. Shortly after, 4K-per-hemisphere cameras arrived. They made for a much more agreeable experience, but even with that increase the footage was grainy when projected into a 360 AR view. Despite having what we believed to be a compelling use case, we were unsatisfied with the quality of the output and knew we would have to wait for future generations of hardware to get closer to realizing our complete vision. Now that the technology has had several years to mature, I'm curious whether it can match our original hopes.
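Some rough arithmetic (my numbers, not project measurements) shows why even 4K felt grainy: spreading one hemisphere's horizontal pixels across 180 degrees of view leaves far fewer pixels per degree than the roughly 60 px/degree that 20/20 vision can resolve.

```cpp
#include <cstdio>

int main() {
  // Horizontal pixels per hemisphere, spread across 180 degrees of view.
  const float pxPerDeg1080 = 1920.0f / 180.0f; // ~10.7 px/deg
  const float pxPerDeg4k   = 3840.0f / 180.0f; // ~21.3 px/deg
  // 20/20 human vision resolves roughly 60 px/deg, so even the 4K
  // footage delivered about a third of that density.
  std::printf("1080p: %.1f px/deg, 4K: %.1f px/deg\n",
              pxPerDeg1080, pxPerDeg4k);
}
```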
Sometimes ideas take time to bake, and you need to wait for other factors to come into play before you can fully execute a vision.