Brought in as the team’s first UX Designer, I oversaw the design for HubSpot.com as we established our process and went through our first major iteration, which would lead to an ongoing iterative cadence.
1M+ users per month
5K+ leads per month
1K+ unique pages
8K+ paid users
$1.2B market cap
HubSpot had gone from a single-product domestic company to a multi-product international company, so new products and departments needed to be represented. Conversion and usability opportunities needed to be acted upon. The design aesthetic needed to be updated to fit HubSpot’s new style guide. And it all needed to happen in time for release at HubSpot’s annual industry event, INBOUND.
I led this project, supported by a team of three: a Visual Designer, a Developer, and a PM. I was solely responsible for User Experience and personally created the deliverables that you will see in this case study, excluding visual design and development. I saw the project through from the kickoff meeting to launch and through multiple rounds of iteration afterward.
I began by digging into our historical data, surfacing user pain points and barriers to conversion. Users commonly went from the home page directly to the pricing page (pre-disqualifying themselves from the product), the FAQ, or a search. Despite its length, the home page lacked critical information.
Massive amounts of data were available in HubSpot, Google Analytics, and Mixpanel. The main challenge was sorting through the data and finding meaningful patterns.
I used heat maps and scroll maps spanning several years of versions up to the present. This helped me understand engagement on both current and previous versions of the site, and the effects that specific changes had. At a high level, users weren’t engaging with critical elements, and fewer than 25% would scroll.
These heat maps and scroll maps were run with 25,000 users each, supplying 467,308 unique data points that were analyzed in this project.
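The scroll-depth finding above boils down to a simple aggregate. As a minimal sketch (purely illustrative, not the heat-mapping tool’s actual API), the share of sessions scrolling past a given point can be computed like this:

```typescript
// Illustrative only: given each session's maximum scroll depth as a
// fraction of total page height, compute the share of sessions that
// scrolled past a given threshold (e.g. 0.5 = halfway down the page).
function shareScrolledPast(maxDepths: number[], threshold: number): number {
  if (maxDepths.length === 0) return 0;
  const reached = maxDepths.filter((d) => d >= threshold).length;
  return reached / maxDepths.length;
}
```

Run over tens of thousands of sessions, a metric like this is what surfaces the “less than 25% scroll” pattern.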
Using anonymous session recordings, I was able to reproduce many of the behaviors found in the Analytics Review, Heat Mapping, and User Testing. This validated hypotheses and identified areas of opportunity for the design.
These live, anonymous, and undetected recordings represented a reliable hybrid qualitative and quantitative data set.
Qualitative data collection was extremely important in this project, and as a result, user testing was conducted at virtually every phase. Tests were arranged in small, focused sessions aimed at iteration and incremental improvement.
This project affected multiple areas of the organization, which at times had conflicting interests. I conducted interviews with representatives from Product, Marketing, Sales, Services, Support, and Executive Leadership to understand each group’s unique requirements and concerns for the design.
I collected a lot of internal feedback, cross-referenced it with user feedback, and prioritized design changes.
10% of the traffic to the home page of HubSpot.com came from existing HubSpot customers logging into their portals or grabbing important resources. They were very high-priority users who weren’t generally accounted for by the KPIs set for the site. To ensure that they were factored into the design, I conducted a set of interviews with customers and used them as the basis for customer-specific dynamic content.
Interviews were conversational and tailored to each customer.
As a relatively well-known product, HubSpot receives a good amount of unsolicited feedback. It can arrive in many forms: full-blown 2,200-word teardowns, support and sales calls, tweets, direct emails, and everything in between. I have always prioritized unsolicited feedback for its genuine nature and passionate roots.
Users regularly share unsolicited feedback for HubSpot’s web properties. This is critical to our research process.
The process of homing in on the new design was very iterative. To understand what resonated most with the audience, and subsequently qualify or disqualify different elements from the design, I ran tests with different combinations of isolated design changes and observed how users responded.
Once we had a solid direction, I produced multiple variations of wireframes and put them in front of users and internal stakeholders for testing and feedback. This helped me narrow the design down to three major variations, which I used to establish a single design framework and move into visual design.
We experimented with a wide array of designs, eventually landing on an atypical, grid-first structure.
During the visual design phase, I worked directly with Anna, one of our awesome Visual Designers. She used a mix of clean lines, strong typography, bold colors, unique imagery, and an atypical grid structure to elevate the design to a new level for HubSpot.
Wireframes translated almost perfectly into the final product, demonstrating the strength of the early research and design work. See it for yourself with this side-by-side comparison.
A living design
This design needed to be scalable, flexible, and alive. This is one of the primary reasons why we went with such a grid-based, modular structure. It would scale well across devices, content could easily be changed or moved around, and key sections could be updated using a new editorial calendar that we would be introducing.
Several editorial sections inspired by stakeholder interviews were built into the design and coded so that marketers could edit them easily.
In the past, designs were very static and felt as if they went straight from the PSD to code. So in this iteration, we used interactions to bring the page to life, draw users into the content, and confirm user actions. A video player that began playing on mouseover, hover states that revealed images or new colors, and advanced CSS animations all played a part.
We shot a unique set of photos intended to fit perfectly into the atypical grid structure. The off-hover state showed an out-of-focus section of the photo, which expanded into the right grid element on hover, revealing the full photo and additional information. This turned out to be a distinctive element of the design.
Thoughtful repurposing of space was very important in this design and something that users specifically noted in testing.
As a way to deliver highly relevant content to each user and solve for specific user groups like existing customers, we developed dynamic content sections that would adapt based on the user. These were then tested within the context of personas and purchasing stages.
Earlier in the project, interviews with existing customers identified personalized content as a major opportunity.
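The logic behind dynamic sections like these is essentially segment-based content selection. A minimal sketch, assuming a hypothetical visitor model (the variant names, fields, and copy below are illustrative, not HubSpot’s actual implementation):

```typescript
// Hypothetical visitor model: segment detection (e.g. via an auth cookie
// or lifecycle-stage property) drives which hero content is rendered.
type Visitor = {
  isCustomer: boolean;
  lifecycleStage?: "subscriber" | "lead" | "opportunity" | "customer";
};

type HeroContent = { headline: string; cta: string };

// Illustrative content variants keyed by segment.
const VARIANTS: Record<string, HeroContent> = {
  customer: { headline: "Welcome back", cta: "Log in to your portal" },
  lead:     { headline: "See HubSpot in action", cta: "Get a demo" },
  default:  { headline: "Grow your business", cta: "Start a free trial" },
};

function pickHeroContent(visitor: Visitor): HeroContent {
  if (visitor.isCustomer) return VARIANTS.customer;
  if (visitor.lifecycleStage === "lead" || visitor.lifecycleStage === "opportunity") {
    return VARIANTS.lead;
  }
  return VARIANTS.default;
}
```

The value of structuring it this way is that each segment’s content can be tested independently against that segment’s goals, rather than optimizing one page for everyone.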
Over 19% of the U.S. population has a disability, so accessibility and device compatibility were crucial across all of HubSpot.com. The design needed to perform on a range of devices through a fully responsive layout and stable code, and that code needed to hold up well with assistive technologies like screen readers.
Using an array of accessibility-focused tools, I tested the design for multiple forms of color blindness and compatibility with screen readers.
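One of the checks that accessibility tools like these automate is text contrast. The formulas below are the standard WCAG 2.x relative-luminance and contrast-ratio definitions (the sketch is mine; the tooling we used ran checks of this kind automatically):

```typescript
// WCAG 2.x: convert an sRGB channel (0-255) to its linearized value.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] color.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal body text.
const passesAA = (ratio: number) => ratio >= 4.5;
```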
Internationalization and Localization
As a global company, HubSpot required an equally global site. In addition to English, the new pages would also be translated into Spanish, Portuguese, German, French, and Japanese, with additional languages to come in the future. This was an important factor in the design, as translations can take up to 100% more space than their English counterparts.
Annabeth, the Developer on this project, did an incredible job of putting together clean code that was compatible across all devices. I worked with her and Anna, the Visual Designer, to execute all interactions properly, ensure that the design translated well to the code, and QA test along the way.
As with all of our projects, the site was built and hosted in the HubSpot software itself, using the HubSpot CMS.
The design needed to display consistently across browsers and devices (including some that were utilizing outdated technology). Using BrowserStack, we emulated the site on real devices. Knowing that our users were 73% in Chrome, 12% in Safari, 9% in Firefox, 4% in IE, and 1% in Edge, we prioritized fixes according to audience size and criticality.
The design was tested across devices and resolutions in multiple versions of Chrome, Safari, Firefox, Internet Explorer, Edge, Opera, and Yandex.
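Prioritizing fixes “according to audience size and criticality” can be made concrete by weighting each bug by the traffic share it affects. A simplified sketch (the browser shares are from our analytics; the scoring scheme itself is illustrative):

```typescript
// Browser traffic shares from analytics, as fractions of total traffic.
const BROWSER_SHARE: Record<string, number> = {
  chrome: 0.73, safari: 0.12, firefox: 0.09, ie: 0.04, edge: 0.01,
};

// Each bug gets a 1-3 severity score (3 = blocks a critical path).
type Bug = { browser: string; severity: 1 | 2 | 3; description: string };

// Sort bugs by (audience share x severity), highest impact first.
function prioritize(bugs: Bug[]): Bug[] {
  const score = (b: Bug) => (BROWSER_SHARE[b.browser] ?? 0) * b.severity;
  return [...bugs].sort((a, b) => score(b) - score(a));
}
```

Under this weighting, even a cosmetic Chrome glitch can outrank a severe Internet Explorer bug, which matched how we actually triaged.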
In an awesome moment, we launched the new pages back stage at INBOUND, right as the new products and features were announced by our co-founders on stage.
Each page affected in the iteration had a set of Key Performance Indicators that I monitored and tested against. I paid especially close attention to Conversion Rate, Submission Rate, Drop-Off Rate, Event Triggers that we built into the design, Goal Completion, Navigation Summary (Origin page and Destination page), and even specific Search Queries.
Google Analytics provided a wealth of historical data that could be compared in Behavior Flows and Navigation Summaries.
Every Call-To-Action button in the design was coded as a HubSpot CTA, meaning that the HubSpot platform would track views, clicks, and submissions for every button. This data was then rolled up into user profiles and larger reports that helped us gauge how the design was performing.
CTAs could be tracked throughout a user’s lifecycle, helping us understand where and when they were converting.
Through multiple rounds of heat maps with 25,000 users each, I was able to observe how users were engaging with the design, where engagement had increased, and where it had dropped off. We saw major engagement increases in critical CTAs, navigation elements (like the software tab), and application links (like the free trial conversion link).
Heat maps helped to show where the design was performing well and identify opportunities for iterative tests.
User testing and recording
With the design live, I recorded users on the site and ran additional rounds of user testing. This gave us ideas for elements to test and ultimately incorporate into future iterations of the design. These findings were easily paired with quantitative data and used to prioritize the most important items for the next iteration.
Screen recordings were used to immediately assess the design. In this case, I was watching how an existing customer used the site.
It was extremely rewarding to work on a project that generated such measurable impact for both HubSpot and its users. I discuss this more in depth and give a unique peek into the launch experience on an episode of the UX and Growth podcast, which I co-host.
What I learned
This was a very high impact project with a hard deadline. A lot of things went right, but as with any project, things also went wrong. I like to reflect on the successes and failures of every project so that I can learn from them and apply those learnings in the future. In this project, I really learned that my small, agile team works great together. We were under a ton of pressure and the output required from each individual was extremely high. Even so, we pulled through together and shipped the product on time.
Despite hitting our deadlines, we could have made it easier on ourselves by setting a more explicit scope and continually refocusing our stakeholders on the project goals. As the project neared completion, the scope increased dramatically with new ideas and “nice-to-haves” that weren’t always aligned with our original KPIs. This put a lot of pressure on us to deliver additional work that may not have even served our goals, all by the original deadline.
Because we were rushing to complete a project that had experienced some scope creep, we sacrificed some quality and even neglected to implement several tracking elements before the launch (like custom events and CTA coding). At first, we just weren’t tracking everything that we should have been. This led to a data gap and a delay in our analysis. We should have devised a stronger launch and analysis plan well before the project even neared completion.
Perhaps the most rewarding part of this project was applying the full Lean UX process that I implemented at HubSpot and seeing how big an impact it had on the team and the end product. The team was efficient and collaborated well (with reportedly less stress), users were involved throughout the entire process, and we produced a product that performed well. In the months following, we could really see how this project transformed our design process.
And that was just the beginning
After this project, we continued iterating and improving the site. I’m documenting that in a live case study that I update with each iteration. We’ve come a long way since this launch; you should see it.