Discovering how shoppers feel about robots. Research within retail stores in the United States revealed a baseline for how people feel about robots in that environment. As it turns out, people can be excited about a robot and simultaneously fear being replaced by it. This research informed the attributes and persona used in the design of the next generation of robot.
I was the researcher and designer for this project done at Bossa Nova Robotics in 2015.
How do you make someone like a robot? Bossa Nova uses a robot to collect data in an environment that anyone can be in: a big box retail store (soon to be revealed). People can be mean to robots, so the human-robot interaction experience has to be designed so that people allow the robot to operate autonomously around them. To design this experience, we had to understand how the current generation of robots fared with shoppers.
“This is like wigging me out, is this a scientist thing?” “[I]t looks like it’s going to be taking people’s jobs. That’s what technology does these days.” “Is it following us?” “It’s a computer on wheels, it’s cool.”
It’s not difficult to get a shopper to express her opinion. It’s a different story, however, when you ask for that opinion while her child is running through the store, she’s hunting for specific WIC-eligible products, and a robot is in her way.
Research within a big box retail chain in middle America is difficult to quantify. There are many ways to categorize how someone thinks, feels, and moves. We wanted to know the exact internal state of someone when they first encountered the robot: did they feel repulsed? Were they scared? Or maybe they thought it was like real-life Star Wars? This information may be easier to collect behind closed doors in a controlled lab environment, but the multitude of variables within a retail store can wildly affect someone’s mood — the constant beeping of the checkout lanes, the brightness of the white fluorescent lights, and other customers crowding the aisles. Now throw into the mix a snail-paced robot impeding a shopper’s goal of grabbing those Cheetos off the shelf, and you can get some interesting reactions.
We went through four iterations of data collection methods, beginning with a detailed survey of shoppers’ glances and minute facial expressions when encountering the robot, and ending with something much simpler: the single question, “How does it make you feel?” Through the process we realized that strangers are not afraid to be honest, and most of the information we needed was in the answer to that single question. Asking one question also let us collect much more data while handling the complex interactions of multiple large families moving and talking around the robot.
We discerned how people felt about our robot through data collected across 3 different brands, in 4 different stores, in 2 states, and 223 human-robot interactions. We put each quote on an individual sticky note and created an affinity diagram, allowing the themes of feelings and concerns to bubble up. Specifics of what we discovered cannot be shared, but our discovery did allow us to create a persona for our next generation of robot — one that hopefully embodies the attributes shoppers want to perceive. Among other things, the robot needs to be predictable and mindful, but also appear busy.
Designing a human-robot interaction experience can be difficult, especially when design is new to an organization. There are so many variables that affect how someone perceives a robot. It’s not only how a robot looks, but also how that robot moves and communicates its intentions and feelings to the people around it. Each of these groupings of characteristics is created by a different team within the company, and those teams have their own individual goals. So I talk to mechanical and electrical engineering about design opportunities in how the robot looks, and I talk to the robot software team about triggers that can be developed to let the robot respond to things within its environment. I also talk with business about the impact that politeness can have on the speed of data collection.
I created a method that allows high-level design goals to be applied throughout the work of all teams, while still remaining true to the target design persona. Human-robot interaction can be broken down into three areas of focus: physicality, behavior, and explicit communication. More information on these specific areas can be found in other projects on my site.