answerlab collaboration: usability testing


problem

I was head of research for two product families, and it was impossible for a single researcher to provide the testing cadence required to get all of our products up to speed. To accelerate the process, we established a working relationship with AnswerLab, an independent research agency with researchers in New York and Los Angeles.

goal

To combine my efforts as the in-house lead researcher with those of the researchers at AnswerLab in order to triage all of our product offerings. My job was to manage AnswerLab and provide deep generative and iterative research support on-site; AnswerLab's job was to handle recruiting, script writing, walking users through our prototypes, and producing summary reports.


phase 1: establishing testing cadence


  • I connected with the project managers to propose a core structure for active testing weeks:
    • Day 1: Initial usability testing
    • Day 2: Free for iteration
    • Day 3: Second day of usability testing to gauge the success of improvements
  • Each product team prioritized its projects and estimated its need for testing.
  • Design management was asked to gauge whether the designers could produce prototypes quickly enough to keep pace with frequent testing.
  • Research and design management were brought into a research meeting to discuss realistic testing frequency and possible issues related to a high-speed process.
  • I discussed possible variability with AnswerLab and confirmed that we would have the flexibility to slow a given schedule down if needed.
  • A final meeting was held to determine the best path forward.
  • Solution: 3 weeks of usability testing every month. Each week consisted of testing on Tuesday and Thursday, with Wednesday reserved for iteration.


phase 2: establishing round timelines


  • Each round of testing would last approximately 6 weeks and include 4 active testing days, i.e. 2 per testing week. On each active testing day, 6 participants would each sit for a 1-hour session.
  • AnswerLab sent several versions of the timeline to cover us for the next 6 months.
  • I coordinated with product and management to determine which timeline agreements would work for our product families and our UX team.


phase 3: populating the schedule


  • I worked with both product teams to determine which projects would go into the first three rounds of testing, and when.
  • Time slots were populated on our research planning Confluence page.


phase 4: scheduling meetings according to the timeline


  • Kick-off meetings were scheduled first. Preliminary and final walkthroughs, recruiting checks, and summary meetings were arranged later. I made sure to include the key attendees in each meeting, specific to each project and each round, i.e. POs, PMs, designers, relevant managers, and me as the researcher.
  • I sent AnswerLab a list of key attendees prior to each meeting and AnswerLab sent out the calendar invites and WebEx links.
  • As scheduling needs shifted, e.g. if one of AnswerLab's lead researchers or one of our POs was unavailable, we adjusted meeting times collaboratively.


phase 5: preparation for testing


  • Kick-off: As the in-house researcher, I evaluated each product and conducted generative research as needed. Leading up to our kick-off with AnswerLab, we held workshops, journey-mapping exercises, and meetings. In preparation for kick-off, I used the personas I had created to define recruiting criteria. Stakeholder interviews and group meetings allowed me to create a full project brief and a preliminary task flow, which I shared with AnswerLab.
  • Recruiting: AnswerLab either recruited directly or created participant lists from a pull of our client database. Participants were recruited locally when we were doing on-site testing and from all over the country when we wanted AnswerLab to run sessions remotely from their own lab.
  • Preliminary Walkthrough: The designer showed their work to AnswerLab's researchers. I coordinated with AnswerLab's researchers to clarify ideal task flows and to strategize around the scenarios we wanted to lay out for our participants.
  • Final Walkthrough: Prior to testing, we iterated on prototypes according to what we wanted to learn from testing, the task flows that matched those goals, and the type of artifact that met that need. The final prototype was shown to the researchers at AnswerLab a few days before testing, and InVision links were shared.
  • Usability Testing Day 1: AnswerLab conducted usability testing, and the designers and I joined remotely via WebEx. POs, PMs, and managers joined as available. I transcribed 2-3 pages of quotations and task-related data per participant.
  • Iteration Day 2: I worked with our designers, content strategist, and product to coordinate iteration on elements of the interface that did not test well with our first 6 participants.
  • Usability Testing Day 3: AnswerLab conducted usability testing, and the designers and I joined remotely, along with anyone else who was relevant to the project and available. I transcribed per participant, as I did on Day 1, and delivered next-step suggestions to design and product.


phase 6: summary reports


  • Ten Key Takeaways: The day after testing finished for each project, AnswerLab produced a list of ten key takeaways.
  • Final Report: AnswerLab provided a final report that covered each full round, i.e. 4 days of usability testing with 12 participants.


phase 7: repeat!


  • Each product would get another week of testing approximately 3 months later.
  • Eventually this process would slow down and an in-house researcher would take over all responsibilities from AnswerLab.