Following our initial redesign, I led the UX for HubSpot.com as we went through rapid iterations that would multiply the site's traffic, conversions and revenue. This is a look at how our small multi-disciplinary team iteratively built one of the most-visited SaaS sites on the web.
Over the course of the year, we ran multiple small iterations followed by a large refactoring iteration in Q3. Here’s how that looked when laid out on a timeline.
This case study presents a meticulous implementation of my personal UX process, and an opportunity to see how it is practically applied. I believe in taking an approach to design that is deeply rooted in both theory and practice, and we can see those worlds collide here.
All iterations were driven by measurable primary and secondary goals, and supplemented by unique one-off goals.
We measured conversion not by clicks or sign-ups, but by activated and retained users. A predictive analytics system assisted in this.
We took an objective approach to design, testing and measuring everything from usability (SUS) to aesthetics (UPT).
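The SUS half of that measurement follows a well-defined formula: ten 1-5 responses, with odd-numbered (positively worded) items contributing their score minus one and even-numbered (negatively worded) items contributing five minus their score, the sum scaled by 2.5 to a 0-100 range. A minimal sketch of the standard calculation (not HubSpot's internal tooling):

```python
def sus_score(responses: list[int]) -> float:
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items are positively worded and contribute (score - 1);
    even-numbered items are negatively worded and contribute (5 - score).
    The summed contributions are multiplied by 2.5 for a 0-100 scale.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5
```

A neutral respondent (all 3s) scores exactly 50, which is one reason SUS results are easy to benchmark across tests.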
I believe in a goal-driven and objective design process, where decisions can be justified using research and KPIs rather than opinion and intuition alone. These goals acted as “north stars” that set a clear context and aligned everyone on the project toward the same outcome. They suffocate ego, bureaucracy, and competing interests.
Every task was stack ranked against these goals, using the criteria of potential, importance, and ease. This kept our team focused on the best use of our resources, and gave us a clear idea for what should be included in each iteration.
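A scoring scheme like this can be sketched in a few lines. The 1-10 scales, the equal weighting, and the example tasks below are my illustrative assumptions, not the exact rubric we used:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    potential: int   # expected impact on goal metrics (1-10, assumed scale)
    importance: int  # alignment with iteration goals (1-10, assumed scale)
    ease: int        # inverse of effort and risk (1-10, assumed scale)

    @property
    def score(self) -> float:
        # Equal-weighted average; a weighted average works just as well
        # when one criterion should dominate.
        return (self.potential + self.importance + self.ease) / 3

def stack_rank(tasks: list[Task]) -> list[Task]:
    """Order the backlog so the best use of resources sits at the top."""
    return sorted(tasks, key=lambda t: t.score, reverse=True)

backlog = [
    Task("Simplify Get Started flow", 9, 9, 6),
    Task("Refresh footer design", 3, 4, 8),
    Task("New chat experience", 8, 7, 5),
]
```

Ranking the hypothetical backlog above surfaces the conversion-flow work first, which mirrors how a potential/importance/ease rubric keeps high-leverage tasks from being crowded out by easy ones.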
We'd source opportunities from the core team, stakeholders within the business, and ongoing performance analysis. They were all accompanied by data and cross-referenced with goals established in our MSPOT. These were used to build the business case, flesh out the iteration, and set it in motion.
To support this, we used a hybrid data model tracking users across HubSpot Analytics, Google Analytics 360, and Amplitude. We didn’t make guesses; we attacked the best opportunities.
Following the iteration outline, a JIRA epic and Kanban board were created. Relevant Slack channels and InVision projects were opened. A Google Drive folder and an Abstract project were established for versioning (and assets, if a Design System update was to be made). Finally, a shell wiki post and case study were formed in preparation for post-mortem documentation, both internally and externally.
Once the iteration was established, research would become the primary focus. This was both an ongoing and an iteration-dependent process.
With over 2,000 pages across the site, detailed conversion flow audits helped identify areas where the experience could be reduced and simplified.
Simple flows had a 2-3x higher conversion rate than their complex counterparts. A new, standardized “Get Started” flow helped users convert in 3 steps from anywhere.
By looking at individual user journeys, we uncovered issues where users were converting through the wrong flows, getting lost in the site, and abandoning products they had signed up for.
In one such case, a new conversion and sign-in interface was created. It detects known users even if they aren’t signed in and offers to continue with their account instead.
Using a composite of data sourced from interviews, user tests, and surveys, we kept a pulse on how prospects preferred to learn about new software products. In one instance, users indicated that they wanted to watch a video or read through a website, and that chat was least favored. At the same time, our conversion data consistently showed that chat had a high close rate.
After considering both quantitative and qualitative data sets, and reconciling areas where they contradicted each other, we developed solutions that prioritized video-based and story-based learning, as well as a new chat experience. Users didn’t dislike chat in general; they disliked our chat. These videos increased conversion by 300%, and chat volume increased by 38%.
User interviews and surveys helped uncover the messaging that resonated most with prospects, which we later cross-referenced with copy vs. copy testing data.
Copy was centered around product-descriptive, “all-in-one”, and “growth” themes. In copy vs. copy tests, changes to messaging improved conversion by 20%.
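Deciding whether a copy variant's lift is real or noise comes down to a standard two-proportion test. The sketch below uses only the Python standard library, and the sample sizes are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates.

    Returns the z statistic and its two-sided p-value under the
    pooled-proportion normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: control converts 500/10,000 (5%),
# variant converts 600/10,000 (6%) -- a 20% relative lift.
z, p = two_proportion_z(500, 10_000, 600, 10_000)
```

At these (assumed) volumes the lift clears the conventional p < 0.05 bar comfortably; at a fraction of the traffic, the same 20% lift would not, which is why we let experiments run to significance rather than calling them early.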
An average of 10 experiments were launched on the site every week. The learnings from these experiments continually impacted our incremental growth metrics.
After KPIs were tracked and proven out for each experiment, the learnings were applied across the site. This simple scaling effort bled into design, research, and code.
Key users and stakeholders were interviewed with each iteration. Behavioral flags in the product and website identified the right users and pulled them into interviews, while internally, we worked with key stakeholders and interviewed across the business. Results were then compiled into a document with key themes and overlapping interests.
In one of our iterations, a key theme was cognitive load. Users and stakeholders alike felt that the design was too complex; a total overload of information. By running a cognitive load tapping test, we were able to confirm this, and we drastically simplified the design as a result. Follow-up testing showed users performing over 20% better.
User testing and session recordings were regularly run to observe organic user behaviors, check the health of the experience, and qualitatively test against hypotheses.
Results were compiled into feedback docs and stack ranked quantitatively in Rainbow Spreadsheets. These were then mapped to action items for design and development.
Heatmaps, Scrollmaps, and Color Blindness Simulations were used to evaluate the effectiveness and accessibility of the design, as well as how users interacted with it.
Pre- and post-launch heatmaps helped predict design performance. On the left is a heatmap of a pre-launch static PNG; on the right, the post-launch live design.
As fresh research was completed and used to jumpstart a new iteration, I placed it into a kickoff slide deck that I would eventually present to key stakeholders and decision makers. From this, we'd agree on the direction of the iteration, the timeline, and the team needed to carry it out. We didn't iterate for the sake of iterating, so this part of the process helped us evaluate each iteration on its merits.
Research determines content, and content determines design. In that order. This is what drove our research-first and copy-first approach to design.
In-depth messaging research was matched with specific verbiage and used to construct full copy docs, which went through several rounds of feedback and iteration. As the copy began to solidify, it would be placed into a rough structure to visualize how the story would be told. During this time, design exploration was also under way, working in conjunction with copy.
Even when substantial research is present, we still need to test our concept and determine how much actual potential exists for the iteration that we’re proposing. This can lead to pivots or affirmations; sometimes both.
In the example to the right, we tested several hypotheses around aesthetics and messaging, many of which broke our brand conventions. I produced this design and we had it live in 24 hours. It converted 20% better than the control, even as a quick experiment. This indicated that our hypotheses were aimed in the right direction.
Increase in conversion rate.
That justified an iteration.
I maintained several InVision Boards with elements that I and the team pulled from around the web, concepted in meetings, or drew during a moment of inspiration. We regularly implemented these elements as tests and enhancements. Importantly, each core element was quantitatively and qualitatively validated before implementation, no matter how subjective it may have seemed.
As the design direction was established, we began creating the structure and hierarchy of the design, as well as the aesthetic treatments.
Full user flows and unique interactions were prototyped for sharing internally and testing with users. In the example to the left, I prototyped a concept that focused our home page on user personas and how the software contributed to their daily work. Persona photos would intermittently fade to videos of the individual working. This was something that was best communicated through a prototype, and it helped sell the idea to key stakeholders.
All elements were integrated into HubSpot’s new global design system, Canvas, and distributed via our UI Library and a master Sketch file. There, working examples were displayed with code snippets and instructions for design and use. This system was updated with each iteration, encompassed all design at HubSpot, and removed the need for one-off design assets. As a result, design was more ubiquitous, consistent, scalable, and efficient.
Our iterations more than doubled the global conversion rate of HubSpot.com.
Compression & service workers slashed our load times.
These are HubSpot’s highest-value conversions, making them a litmus test for the quality of the conversion mix.
By improving HubSpot's chat bots and chat experience, this became a conversion with a high close rate.
With dynamic numbers, we targeted high intent customers.
This resulted in millions of dollars in new ARR for HubSpot, tied directly to our iterations.
HubSpot Analytics, Google Analytics 360 and Amplitude were used to collect data, which was then compiled in Google BigQuery and used to populate custom visualizations that were built in Google Data Studio. This allowed us to see data across the entire site and product, and focus exclusively on metrics and questions that were relevant to our work. Using a predictive analytics model that was laid on top of this data, we were able to measure conversion not by clicks or submissions, but by user retention and revenue. This was tracked across more than 90 unique conversion journeys throughout the site.
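The key idea in that measurement model is that a "conversion" only counts once the user activates and retains. A minimal sketch of that definition is below; the session thresholds, field names, and example records are all hypothetical stand-ins for the real predictive model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Signup:
    user_id: str
    signup_date: date
    sessions_first_14d: int   # product sessions in the first two weeks (assumed window)
    active_on_day_30: bool    # still using the product a month later (assumed check)

def activated_conversion_rate(visitors: int, signups: list[Signup],
                              min_sessions: int = 3) -> float:
    """Conversion counted as activated-and-retained users, not raw signups."""
    activated = [
        s for s in signups
        if s.sessions_first_14d >= min_sessions and s.active_on_day_30
    ]
    return len(activated) / visitors

signups = [
    Signup("u1", date(2018, 3, 1), 5, True),   # activated and retained
    Signup("u2", date(2018, 3, 1), 1, False),  # signed up, never activated
    Signup("u3", date(2018, 3, 2), 4, False),  # activated, then churned
]
rate = activated_conversion_rate(visitors=1_000, signups=signups)
```

Under this definition, only one of the three hypothetical signups counts, which is exactly the gap between click-based and retention-based conversion that the predictive model was built to close.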
All 90+ conversion flows in Google Sheets, after being compiled in Google BigQuery.
Snapshot from our weekly meeting, covering key metrics and experiments.
The lessons learned here stretch far beyond the walls of HubSpot. In this speech that I gave to a room of students and startup founders, I go into even greater depth on this project, our process and what we learned.
Perhaps this is the hyper-critical designer in me, but I was about 70% happy with the design at launch. Considering where we were in the first iteration versus where we landed, I was proud of what we achieved. The site became one of the highest performing in the SaaS industry. But I still saw more potential in it. Aesthetics, usability, copy, code; there were many areas that stood to be improved. We put the roadmap in place to do exactly that, and it's continuing to roll out today.
Previously, our iterations went over scope and occasionally required that we push deadlines. I cited this as an area for improvement in the case study for our first iteration, and following that, we fully conquered the issue. We managed to get project management down to a science, leveraging systems built in JIRA, InVision, and Google to flesh out and track each project. We became maniacally focused on goals and objectives, which helped us to eliminate scope creep and stay true to the deadline. With each iteration, we improved; a huge benefit of an iterative process.
Serving as the UX Lead on the site, as part of a small 12 person SWAT team tasked with building HubSpot.com, I quickly learned the value of focus, compromise, and leadership by influence. All 1,600 employees at HubSpot had a direct stake in the site. The ability to manage those relationships and turn insight into action allowed me to scale my work and get more of the company behind UX as a practice, beyond just HubSpot.com. This was a strong indirect benefit of this project.
We relentlessly took an objective approach with these designs, and despite that being a painstaking effort, we saw it pay strong dividends with each iteration. I don’t credit our doubled and tripled conversion rates to chance or genius, but to a deep study and understanding of our users and product. This went all the way down to aesthetics, where we didn’t debate over personal preferences; rather, we tested aesthetic directions with actual users and quantified the best aesthetics. From my point of view, the fostering of this approach was one of the biggest wins for this project.
These iterations were built on the foundation of the first redesign that I took part in at HubSpot - back when we were still establishing our UX process, and had only a fraction of the team and site traffic. Take a journey back to our roots.
Brought in as the team’s first UX Designer, I established our process and led HubSpot.com through its first major redesign.
A revolutionary idea and a tight timeframe would result in this critical redesign of our fledgling startup's website and product.