Break into Tech, Skillcrush's flagship course program, needed validation
The Break into Tech Course Package is a bundle of many classes and features, which makes its value difficult to describe succinctly. It offers 12+ classes, flexibility, customization, and exclusive perks like 1-on-1 sessions with instructors.
Our target customers are specifically womxn and BIPOC folks who want to change careers into tech.
Previously, we had not done much validation beyond the kind you get from conversions: we created a product, and it sold well! To grow and scale, we needed to validate the value proposition and make sure we were positioning this product in a way that made sense to our specific customers.
We had already seen some pain points bubble up with our Break into Tech sales page, and we had feedback that some new customers were confused about what they had access to and which features they had. I joined the team to help tackle some of this product debt and set in motion plans for the product's future.
As a product designer, I led qualitative research and presented findings to other teams. I also pair-designed our new positioning for sales pages.
We identified 4 main outcomes we wanted to see as a result of this work.
There was a lot to sort out, so we decided to co-create and validate one piece at a time, then build on each piece as the foundation for the next.
For us, the pieces were:
A: Validate the values our audience seeks in a large product that helps them change to a tech career
B: Validate the intersection of those audience values with what our product offers
C: Design and validate a sales page that communicates our value proposition clearly
D: Validate that our buyer journey aligns with what we are offering
Specifically, our approach was to test and validate A, then A+B, then A+B+C, and so on.
Value testing with users via card sorting to identify what they seek in a large course product
To find out what our prospective students valued in a large course package, we ran some card-sorting-style activities. I led a few sessions where participants filled in blank cards with their criteria for choosing an online coding school, then ranked those criteria from most to least important. Similar themes kept bubbling up, and we saw overlap with our existing offering. We then moved to pre-labeled cards, having participants sort them by what was most important to them. We identified a few highly desired values and proceeded with that knowledge.
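To see which criteria rise to the top across sessions, ranked card-sort results can be aggregated with a small script. This is only a minimal sketch of that idea; the criteria names and rankings below are invented placeholders, not our actual research data.

```python
from collections import defaultdict

# Hypothetical ranked card-sort results: each participant's criteria,
# ordered from most important (first) to least important (last).
sessions = [
    ["mentor access", "flexible schedule", "career support", "price"],
    ["career support", "flexible schedule", "price", "mentor access"],
    ["flexible schedule", "career support", "mentor access", "price"],
]

# Collect the rank position of each criterion across all sessions.
positions = defaultdict(list)
for ranking in sessions:
    for position, criterion in enumerate(ranking, start=1):
        positions[criterion].append(position)

# Lower average rank = more important to participants overall.
averages = {c: sum(ranks) / len(ranks) for c, ranks in positions.items()}
for criterion, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{criterion}: average rank {avg:.2f}")
```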
Running the Rose, Thorn, Bud exercise internally to get our thoughts and assumptions out there
It was important to find out what kinds of internal assumptions we were holding on to, so I facilitated a Rose, Thorn, Bud exercise with stakeholders from multiple teams. Our goal was to identify what we felt was working (rose), what was not working (thorn), and where there was opportunity (bud). This was helpful in seeing the perspectives of other teams who had different interactions with our customers.
Testing our assumptions about career-focused positioning vs. skill-focused positioning
We used competitive analysis to identify opportunity areas and noticed that many of our competitors took a skills-heavy approach. We had always used a career-focused approach, so we compared the two with participants to see if there was a preference. We learned how skill-focused and career-focused positioning come across to our prospective customers. We got good validation for our language and voice in positioning outcomes around careers, and, surprisingly, we learned that more detail about skills also plays an important role in the decision-making process. In true discovery fashion, there is no one right answer. :)
Blind competitor testing
We wanted our prospective customers to compare our product against competitors in a way that would produce less bias, so we stripped out all visuals and moved each value proposition into a plain Google Doc. Participants viewed each document, in alternating order across sessions, and compared and contrasted them. We asked which one they preferred, as well as how much they would expect each to cost.
Reflective survey of brand-new students
My assumption was that our brand-new students (customers) could give us valuable insight; as recent converters, their purchasing motivations in particular could give us more confidence. We ran a survey at the start of the course, in Orientation, that asked some discovery questions. We positioned it to students as a self-reflective exercise and asked questions that helped us understand what their goals were in purchasing our product.
Journey mapping the student experience
We wanted to understand a little more about how our product was delivering on the values our prospective customers desired, so we created a few journey maps. One takeaway from this research was that our system of organizing content by phases was not helping students reach their goals or use high-value features like career counseling.
Testing our new positioning early in the funnel, outside of the main sales page (coding camp)
We wanted to test our assumptions about our new value proposition, but cautiously. We placed the test in one of our funnels where we could measure its success before launching it on the main site. We designed a new page, ran a few rounds of user interviews, and launched the winner in a split test against our existing page. The results showed a 5.23% conversion lift over our control (the existing page), and we were able to move forward.
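For context on how a lift figure like that is read, here is a minimal sketch of the arithmetic behind a split test: the relative lift between two conversion rates, plus a rough two-proportion z-test as a sanity check. The visitor and conversion counts are invented placeholders, not our actual traffic numbers.

```python
from math import sqrt

# Hypothetical split-test counts; not real traffic numbers.
control_visitors, control_conversions = 5000, 400   # existing page
variant_visitors, variant_conversions = 5000, 421   # new positioning page

control_rate = control_conversions / control_visitors
variant_rate = variant_conversions / variant_visitors

# Relative lift: how much better the variant converts vs. the control.
lift = (variant_rate - control_rate) / control_rate
print(f"Relative conversion lift: {lift:.2%}")

# Rough two-proportion z-test to check the difference isn't just noise.
pooled = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
standard_error = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors))
z_score = (variant_rate - control_rate) / standard_error
print(f"z-score: {z_score:.2f}  (|z| > 1.96 is roughly significant at 95%)")
```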
Breaking the work down and testing it in pieces was really helpful for tackling a project this complex. We followed the guidance of Teresa Torres in her blog post about co-creation. A side benefit of this project was iterating on our discovery interviews and increasing our show rate. I wrote about that experience here.