A ten-week project in partnership with Dan Taylor
This project began with a simple observation about commuting in cities: people typically can't get all the way to where they want to go. Because of this, we've seen a surge in personal transportation devices that let people park farther away and ride the rest of the way to work quickly. These modes of transport are great at carrying people, but not anything else.
We found it odd that there hasn't been much exploration into carrying a user's things. There are plenty of examples of this in industry: Amazon uses robots in its warehouses, and hospitals are beginning to use automated robots that carry all sorts of things to patients and staff. We wondered whether there was space for a device that could help consumers perform similar tasks.
We envisioned a platform that you could stand on and ride, but that could also follow you around while it carried things for you. From the start we saw it as a development platform that others could design and build for. We chose grocery shopping as a use case because it is very relatable and represents a difficult challenge for a device like this: narrow aisles, complicated backgrounds, and oblivious shoppers would all need to be accounted for.
The first test was simple. We followed Ashkon around while he shopped, acting as a robotic shopping cart.
From this test we discovered some valuable insights about how we should move forward. Most importantly, Ashkon thought the experience was much better because his hands were free.
However, he didn't like how the unit followed directly behind him and he wanted to be able to control when it followed him. He also wondered how large this would have to be to hold all of the technology a robot like this would need.
To ensure that the fundamental premise of this device was valid, we spent some time packaging the internal components into a reasonable volume.
We then added some ideas about user interaction and tested them in our studio.
Once we settled on a set of behaviors that seemed to work in studio, we took the prototype into a busy store at dinnertime with a person who was unfamiliar with the project. We had him go about his shopping while I performed robot duties.
We learned some valuable things from this test. Overall, the remote and kick interactions were successful: they made sense to the user, and they worked well in a store. We still had issues with the connection between the base and the bag, the unit's behavior still wasn't as natural as we wanted, and the shape of the bag didn't make sense for most groceries.
As we tested new designs and behaviors, we discovered that this was more of a UX problem than anything else. We spent the most time trying to nail down the behavior of the unit. Should it follow, or should a person push it virtually? Should there be an app to control it, or should you just kick it? To answer these questions we wrote out behavior flows like the one here. Eventually we nailed down a set of behaviors that users responded well to.
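Behavior flows like these can be thought of as a small state machine. Below is a minimal sketch in Python of how the kick and remote interactions from our testing might drive mode changes; the specific mode names, commands, and transitions are illustrative assumptions, not the final behavior set.

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()    # parked, ignoring the user
    FOLLOW = auto()  # tracking and trailing the user
    PUSHED = auto()  # the user "virtually pushes" the unit ahead of them

class CartBehavior:
    """Hypothetical state machine for the unit's following behavior."""

    def __init__(self):
        self.mode = Mode.IDLE

    def on_kick(self):
        # A kick toggles following: stop if following, otherwise start.
        self.mode = Mode.IDLE if self.mode is Mode.FOLLOW else Mode.FOLLOW

    def on_remote(self, command):
        # Remote commands map directly to modes; unknown commands are ignored.
        transitions = {"follow": Mode.FOLLOW, "push": Mode.PUSHED, "stop": Mode.IDLE}
        if command in transitions:
            self.mode = transitions[command]
```

Writing the flows this way made it easy to argue about edge cases, e.g. what a kick should mean while the unit is being virtually pushed.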
Sketching form development.
CAD form development.
Should it be omnidirectional or directional?
Rectilinear shapes held the most groceries.
Omnidirectional wheels printing.
Appearance model made from CNC-machined MDF.