LearnUp: Testing for the win
Role: Product Design
LearnUp offers job applicants pre-job training during the application process. Our mission is to reduce the skills gap for jobseekers while increasing retention for employers. This article describes a project to redesign LearnUp's opt-in flow.
Understanding the Problem
LearnUp works by integrating into a company's application process via a modal that appears after a candidate clicks to apply for a job. However, the modal we were using for the initial LearnUp opt-in wasn't converting well.
Only about 20% of users who clicked 'apply' were opting in to LearnUp. Since this step sat at the top of a long funnel, we really needed to improve our conversion rate.
Fortunately, we had some clues about what could be going wrong. A few user interviews quickly revealed that many users felt our modal looked like spam: they didn't trust clicking into it, and they weren't sure what LearnUp was.
Testing for the win
Using the styleguide we had developed with LearnUp, we quickly mocked up five different designs, each using a different combination of value props, the client's logo, and an email input, the three levers we believed could affect conversion. But which modal was the right one?
To find out, we divided the tests into two rounds: one with an email input and one without. In the first round we would test which worked better: value props, the client logo, or both. In the second round we'd test the first-round winner against its email-input counterpart. So, for example, if value props won out over the client logo, we'd then test value props against value props with an email input. If the email-input version performed the same or better, email input won.
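The "same or better" rule above can be sketched as a simple significance check. This is a minimal illustration, not the tooling we actually used; the variant names and traffic numbers are made up, and the test is a standard two-proportion z-test built from the Python standard library.

```python
# Hypothetical sketch of the round-2 decision rule: promote the
# email-input variant unless it performs significantly worse than
# the round-1 winner. All counts below are illustrative.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value for a two-sided test.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Round-1 winner (A) vs. its email-input counterpart (B), made-up data.
z, p = two_proportion_z_test(conv_a=200, n_a=1000, conv_b=230, n_b=1000)

b_at_least_as_good = (230 / 1000) >= (200 / 1000)
significantly_different = p < 0.05
# "Same or better" wins: B is promoted unless it is significantly worse.
email_input_wins = b_at_least_as_good or not significantly_different
```

With these made-up numbers the email-input variant converts at least as well, so under the rule described above it would win the second round.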
Involving the team
Because we were working with a team that had never had a designer before, we made an effort to involve other departments in the design and engineering process. This can be a great way to increase communication between departments and demystify the design process. We decided to try something new: A/B Testing Bingo! Everyone took bets on which modal would win, and the winner got an exciting "mystery prize". People got pretty excited…
And the winner is…
Once we found the first-round winner, we tested it against its email-input counterpart in the second round. It turned out the second-round candidate was the winner. Best of all, we increased our sign-up conversion from 20% to 42%, a big win!
What would we do differently...
Doing two rounds of testing took more time than we would have liked. As a small, agile startup, we couldn't afford long stretches validating every assumption, so we had to be careful about what we tested. In the future, we would start with a single round of testing and extend to a second round only if the results were inconclusive.