Visual Product Scanner


FLOW (Visual Product Scanner)

Customer Goal

Find one or more products on Amazon using my phone without typing a search query.

Role

Lead Designer

Key Deliverables

  • Sketches
  • Wireframe concepts
  • UX specification
  • Contributions to research design documents
  • Final visuals (assets)
  • Implementation bug reports
  • Ideation for the future

Impact

  • Amazon’s Flow feature for the iPhone shopping application was released in early February 2014
  • Increased usage of the feature over “Snap It”, which it replaced
  • Increased conversion
  • The feature was covered by numerous online publications, including Wired, The Verge, Fast Company, and The Wall Street Journal, among many others. My favorite quote, possibly of all time regarding anything I’ve designed, is from Fast Company, which said “The experience feels like sorcery.”

Process (Iterative)

The initial direction from the business was to extend barcode scanning to include product recognition, so the early concepts I’d created bore a resemblance to that experience, retaining at least the horizontal red line. That work was shelved halfway through design and development, however. Months later it was taken off the shelf, this time for quick implementation in the iPhone shopping application (I’d designed the Android experience with iOS and Windows Phone in mind).

I discussed the feature with other teams, designers and researchers to find any existing, related data, especially user research.

I then worked closely with the product manager to understand and help define the updated product requirements and use cases. We could do something new, but we didn’t have much time to experiment. Working closely with the lead developer to understand what the technology could or could not do was also instrumental. In collaborating, the three of us learned a lot about the constraints and generated ideas together around many of them.

Connecting with the research team early in the process allowed us to line up at least one study (we’d run a couple when designing for the Android app). Because I had forged a good relationship with the research team, and because I had the skills and experience to take on much of the study prep work, we were ultimately able to conduct more than a single study. I worked with the researcher to define the tasks, generate ideas for testing in the lab, and lead discussion in the observation room.

As research was happening, I moved the project from wireframes into comps and visual design to help stakeholders, the product manager, and the development teams see what they were working toward, and to arm them with assets, rough at first, so we could iterate toward the final product.

Actively working with the lead developer every day helped us refine the interactions and motion design, as well as identify and resolve several bugs.

Lessons Learned/Constraints

  • It’s more important to solve the simple use case(s) first and then layer in complexity, if needed, over time, rather than trying to solve everything at once
  • Micro-interactions are important, but for some features they can make or break the experience
  • Extending the barcode experience, or even leveraging anything resembling an existing experience in the market, still may not be the right metaphor for customers
  • You can never seem to have enough user research and data (ok, I already knew this one)

Additional Context

A separate team within Amazon, called A9, had created – and still owned – the visual recognition technology. I worked with them in late 2010 when designing Amazon’s barcode scanning experience within the iPhone, Android and Windows Phone applications. In late 2011, that team released a stand-alone app called “Flow”. It had some interesting ideas about product recognition and allowed Amazon to start collecting data.

Fast forward to early 2013. Given my experience designing the barcode scanner in late 2010, I was asked to design the new continuous scan feature for the shopping apps, starting with Android. Some of the problems with the stand-alone Flow app’s experience were:

  • Recognized objects floated on screen and occluded the view, making it hard to scan multiple objects in a row
  • The history used tiny thumbnail images that were hard to recognize and understand
  • Recognition was triggered repeatedly for the same object, leading to a feeling of not being in control (one way to suppress repeats is sketched after this list)
  • Recognition continued in the background while the user may have been focused on the item just recognized, again leading to a feeling of not being in control
  • The scanning interface was filled with frenetic, blinking dots
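
To make the repeated-recognition problem concrete, here is a minimal sketch, in Swift, of one way to suppress repeats: ignore a product that was already recognized within a short cool-down window. This is a hypothetical illustration, not the shipped implementation; the RecognitionDebouncer name, the five-second cool-down, and the product IDs are all assumptions.

    import Foundation

    // Hypothetical sketch: suppress repeated recognition events by ignoring
    // a product ID seen again within a short cool-down window. Not Amazon's
    // shipped implementation; names and values are illustrative only.
    final class RecognitionDebouncer {
        private var lastSeen: [String: Date] = [:]  // product ID -> last time surfaced
        private let cooldown: TimeInterval

        init(cooldown: TimeInterval = 5.0) {
            self.cooldown = cooldown
        }

        // Returns true if this recognition should be surfaced to the user,
        // false if the same product was surfaced too recently.
        func shouldSurface(productID: String, at now: Date = Date()) -> Bool {
            if let previous = lastSeen[productID],
               now.timeIntervalSince(previous) < cooldown {
                return false  // repeat of a recently recognized object; stay quiet
            }
            lastSeen[productID] = now
            return true
        }
    }

    // Usage: route every raw recognition event through the debouncer.
    let debouncer = RecognitionDebouncer()
    print(debouncer.shouldSurface(productID: "B00EXAMPLE"))  // true: first sighting
    print(debouncer.shouldSurface(productID: "B00EXAMPLE"))  // false: repeat within cool-down

The design goal such a guard serves is the one described above: giving the customer, rather than the recognizer, the sense of being in control.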

I did not believe the existing Flow experience was the best one for Amazon’s mobile customers, so I set out to redesign it.