Live Feedback Sessions


Customer Goal (Internal)

Leverage existing resources in new ways to gather more feedback from the learners.

Role

Lead researcher (solo), script writer, producer, and director

Key Deliverables

  • Project plan and research questions
  • Recruiting
  • Script and slides/images for overlay video during broadcast
  • Rehearsals
  • Reports

Impact

  • Added a novel, lower-cost user research method to the “toolbox” at Rosetta Stone: similar to a focus group, but with less discussion and no professional moderator
  • Deepened the team’s understanding of participants’ desire for explicit grammar instruction as a supplement to Rosetta Stone’s immersive method

Process

As the only dedicated researcher for the Product team, I was always looking for ways to maximize ROI, in terms of time and expense, in return for useful, relevant data from learners. After helping launch Rosetta Stone Live Lessons, I was familiar with many of the coaches (tutors), with Vimeo streaming, and with what it took to prepare a session and go live. During a conversation with the PM and the Senior Design Manager, we wondered whether the Live Lessons format could be leveraged as a user research method.

I had in mind a simpler, less interactive form of focus group: a coach would serve as moderator (not a trained focus-group moderator), and the group text chat associated with the live video would host a limited discussion. It was potentially a very cost-effective method, though we expected the data to be less rich. Our initial questions were along the lines of: Is this even a viable method? Will people participate? Do multiple-choice or open-ended questions work better? I drafted the questions, helped create the slides that would appear alongside the coach/moderator, and worked with the producer, coach, and a chat moderator to rehearse and then go live.

Overall, it was a success! People participated, there was no confusion, the coaches easily followed (and enhanced) the script, and the open-ended test question proved easy for participants to answer and, once I exported the text chat transcript, easy for me to parse and analyze. Since the first Live Feedback session was successful, we planned a second, this time focusing equally on exploring the format and on research questions about learners’ expectations for explicit grammar instruction (Rosetta Stone relies on an immersive methodology for language instruction and offers little in the way of explicit grammar lists and tools).
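The transcript analysis step can be sketched in a few lines of Python. This is a minimal, hypothetical example, assuming a simple “HH:MM:SS  username: message” export layout (the actual Vimeo export format may differ); the goal is just to group each participant’s answers for later coding.

```python
import re
from collections import defaultdict

def parse_chat_transcript(text):
    """Parse chat lines into (timestamp, user, message) tuples.

    Assumes a hypothetical 'HH:MM:SS  username: message' line format,
    not any specific platform's documented export layout.
    """
    pattern = re.compile(r"^(\d{2}:\d{2}:\d{2})\s+([^:]+):\s+(.*)$")
    messages = []
    for line in text.splitlines():
        m = pattern.match(line.strip())
        if m:
            messages.append(m.groups())
    return messages

def responses_by_user(messages):
    """Group each participant's messages for later qualitative coding."""
    grouped = defaultdict(list)
    for _, user, msg in messages:
        grouped[user].append(msg)
    return dict(grouped)

# Illustrative transcript snippet (invented data)
sample = """\
00:01:05  ana: B
00:01:09  raj: I would like more grammar tips
00:02:30  ana: Verb tables would help me review
"""
print(responses_by_user(parse_chat_transcript(sample)))
```

From here, each participant’s grouped responses can be matched against the question schedule by timestamp and coded by hand, which is roughly how the open-ended answers were analyzed.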

I wanted to push the limits further in the second round: inviting more participants, sending a brief survey up front so its results could be shown during the Live Session as a discussion prompt, and asking open-ended questions almost exclusively, all while collecting valuable qualitative data about participants’ thoughts and expectations regarding explicit grammar instruction. I worked with the same small team to create, produce, and go live with the new set of questions. Everything went as smoothly as the first time, except that more people dropped out at the last minute due to conflicts and time-of-day issues. Unlike a typical research study with multiple slots over multiple days, this was a single session on a single day, scheduled at a time convenient for us so we could leverage members of the Coaching team. On the plus side, the open-ended questions yielded a richer data set, and the graphs from the pre-session survey helped generate more discussion.

After discussing the session, writing up a report, and sharing the findings, we deemed the second session a success as well, and made plans to conduct Live Feedback sessions quarterly. We had leveraged resources we already had in new ways to create a new, inexpensive channel for collecting learner feedback for the Product organization.

Lessons Learned/Constraints

  • Because it was a live event, getting more than 10 people across time zones to attend was a substantial challenge; many dropped out or didn’t show up, despite confirming and completing the paperwork.
  • The medium lent itself best to questions about the medium itself: the Live streaming feature, on-demand videos, and tutoring. Perhaps the format encouraged a particular mental model, or perhaps I simply hadn’t yet figured out the best way to ask other kinds of questions.
  • Compensation likely needed to be increased to encourage attendance.