
Growing HubSpot

Following our initial redesign, I led the UX for the site as we went through rapid iterations that would multiply its traffic, conversions, and revenue. This is a look at how our small multi-disciplinary team iteratively built one of the most-visited SaaS sites on the web.

10M+ visitors per month
35K+ enterprise customers
2K+ unique pages
100K+ leads per month
90+ countries
$3.1B market cap

Straightforward goals

All iterations were driven by measurable primary and secondary goals, and supplemented by unique one-off goals.

Primary goal


We measured conversion not by clicks or sign-ups, but by activated and retained users. A predictive analytics system assisted in this.

Secondary goal


We took an objective approach to design, testing and measuring everything from usability (SUS) to aesthetics (UPT).

Setting measurable goals

I believe in a goal-driven and objective design process, where decisions can be justified using research and KPIs rather than just opinion and intuition. These goals acted as "north stars" that set a clear context and aligned everyone on the project toward the same outcome. They suffocate ego, bureaucracy, and competing interests.

Task prioritization

Every task was stack-ranked against these goals, using the criteria of potential, importance, and ease. This kept our team focused on the best use of our resources, and gave us a clear idea of what should be included in each iteration.
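As a minimal sketch of how a potential/importance/ease ranking can be scored (the task names, scales, and equal weighting here are illustrative assumptions, not HubSpot's actual model):

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    potential: int   # 1-10: how much room for improvement exists
    importance: int  # 1-10: how valuable the affected pages/traffic are
    ease: int        # 1-10: how cheap the change is to ship

    @property
    def score(self) -> float:
        # Unweighted average of the three criteria
        return (self.potential + self.importance + self.ease) / 3

# Hypothetical backlog entries
tasks = [
    Task("Simplify pricing page", potential=9, importance=8, ease=4),
    Task("New footer links", potential=3, importance=2, ease=9),
    Task("Standardize Get Started flow", potential=8, importance=9, ease=6),
]

# Highest-scoring tasks make the cut for the next iteration
for t in sorted(tasks, key=lambda t: t.score, reverse=True):
    print(f"{t.name}: {t.score:.1f}")
```

In practice the weighting and scales would be tuned to the team's goals; the point is that every candidate task gets a comparable number.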

Chapter One


Iteration planning

We'd source opportunities from the core team, stakeholders within the business, and ongoing performance analysis. They were all accompanied by data and cross-referenced with goals established in our MSPOT. These were used to build the business case, flesh out the iteration, and set it in motion.

To support this, we used a hybrid data model tracking users across HubSpot Analytics, Google Analytics 360, and Amplitude. We didn’t make guesses; we attacked the best opportunities.

Logging and communication

Following the iteration outline, a JIRA epic and Kanban board were created. Relevant Slack channels and InVision projects were opened. A Google Drive folder and an Abstract project were established for versioning (and assets, if a Design System update was to be made). Finally, a shell wiki post and case study were formed in preparation for post-mortem documentation, both internally and externally.

Core research

Once the iteration was established, research would become the primary focus. This was both an ongoing and an iteration-dependent process.

Conversion Flows

With over 2,000 pages across the site, detailed conversion flow audits helped identify areas where the experience could be reduced and simplified.

Simple flows had a 2-3x higher conversion rate than their complex counterparts. A new, standardized “Get Started” flow helped users convert in 3 steps from anywhere.

User Journeys

By looking at individual user journeys, we uncovered issues where users were converting on the wrong offers, getting lost in the site, and abandoning products that they had signed up for.

In one such case, a new conversion and sign-in interface was created. It detects known users even if they aren’t signed in and offers to continue with their account instead.

Learning Behaviors

Using a composite of data sourced from interviews, user tests, and surveys, we kept a pulse on how prospects preferred to learn about new software products. In one instance, users indicated that they wanted to watch a video or read through a website, and that chat was least favored. At the same time, our conversion data consistently showed that chat had a high close rate.

After considering both quantitative and qualitative data sets, and reconciling areas where they contradicted each other, we developed solutions that prioritized video-based and story-based learning, as well as a new chat experience. Users didn’t dislike chat in general; they disliked our chat. These videos increased conversion by 300%, and chat volume increased by 38%.

Conversion Motivators

User interviews and surveys helped uncover the messaging that resonated most with prospects, which was later cross-referenced with copy vs. copy testing data.

Copy was centered around product-descriptive, “all-in-one”, and “growth” themes. In copy vs. copy tests, changes to messaging improved conversion by 20%.


An average of 10 experiments were launched on the site every week. The learnings from these experiments continually impacted our incremental growth metrics.

After KPIs were tracked and proven out for each experiment, the learnings were applied across the site. This simple scaling effort bled into design, research, and code.
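"Proving out" an experiment typically means confirming that a lift is statistically real before scaling it. A standard two-proportion z-test is one way to do that; this is a generic statistics sketch with invented numbers, not HubSpot's actual tooling:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: variant lifts conversion from 4.0% to 4.8%
z, p = two_proportion_z(400, 10_000, 480, 10_000)
print(f"z={z:.2f}, p={p:.4f}")
```

With roughly 10 experiments a week, a consistent significance bar like this is what keeps "winners" from being noise.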

User and Stakeholder Interviews

Key users and stakeholders were interviewed with each iteration. Behavioral flags in the product and website identified the right users and pulled them into interviews, while internally, we worked with key stakeholders and interviewed across the business. Results were then compiled into a document with key themes and overlapping interests.

In one of our iterations, a key theme was cognitive load. Users and stakeholders alike felt that the design was too complex; a total overload of information. By running a cognitive load tapping test, we were able to confirm this, and we drastically simplified the design as a result. Follow-up testing showed users performing over 20% better.

User Testing

User testing and session recordings were regularly run to observe organic user behaviors, check the health of the experience, and qualitatively test against hypotheses.

Results were compiled into feedback docs and stack ranked quantitatively in Rainbow Spreadsheets. These were then mapped to action items for design and development.


Heatmaps, Scrollmaps, and Color Blindness Simulations were used to evaluate the effectiveness and accessibility of the design, as well as how users interacted with it.

Pre- and post-launch heatmaps helped predict design performance. On the left is a heatmap of a pre-launch static PNG. On the right is the post-launch live design.

How research was compiled and shared

As fresh research was completed and used to jumpstart a new iteration, I placed it into a kickoff slide deck that I would eventually present to key stakeholders and decision makers. From this, we'd agree on the direction of the iteration, the timeline, and the team needed to carry it out. We didn't iterate for the sake of iterating, so this part of the process helped us evaluate each iteration.

Leading with copy

Research determines content, and content determines design. In that order. This is what drove our research-first and copy-first approach to design.

In-depth messaging research was matched with specific verbiage and used to construct full copy docs, which went through several rounds of feedback and iteration. As the copy began to solidify, it would be placed into a rough structure to visualize how the story would be told. During this time, design exploration was also under way, working in conjunction with copy.

Chapter Two


Exploratory design

Even when substantial research is present, we still need to test our concept and determine how much actual potential exists for the iteration that we’re proposing. This can lead to pivots or affirmations; sometimes both.

In the example to the right, we tested several hypotheses around aesthetics and messaging, many of which broke our brand conventions. I produced this design and we had it live in 24 hours. It converted 20% better than the control, even as a quick experiment. This indicated that our hypotheses were aimed in the right direction.


Increase in conversion rate.
That justified an iteration.

Testing and validating elements

I maintained several InVision Boards with elements that I and the team pulled from around the web, concepted in meetings, or drew during a moment of inspiration. We regularly implemented these elements as tests and enhancements. Importantly, each core element was quantitatively and qualitatively validated before implementation, no matter how subjective it may have seemed.

Deep shadows
JTBD messaging
Long copy

Wireframes & Mockups

As the design direction was established, we began creating the structure and hierarchy of the design, as well as the aesthetic treatments.


Full user flows and unique interactions were prototyped for sharing internally and testing with users. In the example to the left, I prototyped a concept that focused our home page on user personas and how the software contributed to their daily work. Persona photos would intermittently fade to videos of the individual working. This was something that was best communicated through a prototype, and it helped sell the idea to key stakeholders.

Integrating into the Design System

All elements were integrated into HubSpot’s new global design system, Canvas, and distributed via our UI Library and a master Sketch file. There, working examples were displayed with code snippets and instructions for design and use. This system was updated with each iteration, it encompassed all design at HubSpot, and it removed the need for design assets. As a result, design was more ubiquitous, consistent, scalable, and efficient.

Global design system
Dedicated & dynamic
Chapter Three


Increase in


Our iterations more than doubled the site's global conversion rate.

Decrease in


Compression & service workers slashed our load times.

Increase in

Demo requests

These are HubSpot’s highest value conversion, making them a litmus test for the quality of the conversion mix.

Increase in

Sales chats

By improving HubSpot's chat bots and chat experience, this became a conversion with a high close rate.

Increase in


With dynamic numbers, we targeted high-intent customers.

Increase in

Retained customers

This resulted in millions of dollars in new ARR for HubSpot, tied directly to our iterations.

Data collection and analysis

HubSpot Analytics, Google Analytics 360 and Amplitude were used to collect data, which was then compiled in Google BigQuery and used to populate custom visualizations that were built in Google Data Studio. This allowed us to see data across the entire site and product, and focus exclusively on metrics and questions that were relevant to our work. Using a predictive analytics model that was laid on top of this data, we were able to measure conversion not by clicks or submissions, but by user retention and revenue. This was tracked across more than 90 unique conversion journeys throughout the site.
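The predictive model itself isn't described in detail here, but the core idea of measuring conversion by retention rather than clicks can be sketched simply. The journey names, fields, and numbers below are invented for illustration; the point is that ranking by raw submissions and ranking by retained users can disagree:

```python
# Illustrative data: each journey's funnel, from form submission
# through activation to 90-day retention. All values are invented.
journeys = [
    {"journey": "get-started", "submitted": 1_000, "retained_90d": 410},
    {"journey": "demo-request", "submitted": 300, "retained_90d": 190},
    {"journey": "ebook-download", "submitted": 5_000, "retained_90d": 150},
]

# Click-based view: rank by raw form submissions
by_volume = sorted(journeys, key=lambda j: j["submitted"], reverse=True)

# Retention-based view: rank by users still active after 90 days
by_retention = sorted(journeys, key=lambda j: j["retained_90d"], reverse=True)

for j in by_retention:
    quality = j["retained_90d"] / j["submitted"]
    print(f'{j["journey"]}: {j["retained_90d"]} retained ({quality:.0%} of submissions)')
```

Under a click-based metric the ebook journey looks like the winner; under a retention-based metric it is the weakest, which is exactly the kind of distortion the retention-first measurement avoids.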

All 90+ conversion flows in Google Sheets, after being compiled in Google BigQuery.

Snapshot from our weekly meeting, covering key metrics and experiments.

Beyond the case study

The lessons learned here stretch far beyond the walls of HubSpot. In this speech that I gave to a room of students and startup founders, I go into even greater depth on this project, our process and what we learned.

It was an honor to lecture at Harvard University about this project and our unique approach to UX.

What I learned

Perhaps this is the hyper-critical designer in me, but I was about 70% happy with the design at launch. Considering where we were in the first iteration, versus where we landed, I was proud of what we achieved. The site became one of the highest performing in the SaaS industry. But I still saw more potential in it. Aesthetics, usability, copy, code; there were many areas that stood to be improved. We put the roadmap in place to do exactly that, and it's continuing to roll out today.

Previously, our iterations went over scope and occasionally required that we push deadlines. I cited this as an area for improvement in the case study for our first iteration, and following that, we fully conquered the issue. We managed to get project management down to a science, leveraging systems built in JIRA, InVision, and Google to flesh out and track each project. We became maniacally focused on goals and objectives, which helped us to eliminate scope creep and stay true to the deadline. With each iteration, we improved; a huge benefit of an iterative process.

Serving as the UX Lead on the site, as part of a small 12-person SWAT team tasked with building it, I quickly learned the value of focus, compromise, and leadership by influence. All 1,600 employees at HubSpot had a direct stake in the site. The ability to manage those relationships and turn insight into action allowed me to scale my work and get more of the company behind UX as a practice, beyond just our team. This was a strong indirect benefit of this project.

We relentlessly took an objective approach with these designs, and despite that being a painstaking effort, we saw it pay strong dividends with each iteration. I don’t credit our doubled and tripled conversion rates to chance or genius, but to a deep study and understanding of our users and product. This went all the way down to aesthetics, where we didn’t debate over personal preferences; rather, we tested aesthetic directions with actual users and quantified the best aesthetics. From my point of view, the fostering of this approach was one of the biggest wins for this project.

Chapter four

Final Designs

Back to the beginning

These iterations were built on the foundation of the first redesign that I took part in at HubSpot - back when we were still establishing our UX process, and had only a fraction of the team and site traffic. Take a journey back to our roots.
