Structuring Small Experiments for Big Results

March 13, 2019
by
The C2 Group

User testing doesn't need to cost an arm and a leg to deliver the big results organizations long for. C2's UX Designer Nicholas Fuller shares the steps for constructing smaller usability tests to optimize and improve any digital experience.

UX & Design
Digital Marketing

Usability testing can be a complex affair. Lengthy interviews, multi-camera setups, eye-tracking equipment; the list goes on. While these techniques all have their value and place, they can demand more time or budget than many teams can spare.

Thankfully, usability testing doesn't always need to be a full-blown research study to make meaningful change. When looking to quickly optimize a facet of an experience, we follow a fairly simple sequence of steps to achieve impactful outcomes.

The following steps have been promising in achieving big results:

Isolate a specific problem

Define and isolate a specific problem to solve for. Clearly defined metrics and goals are a great place to start in identifying a specific problem. Maybe your commerce site sees a consistent drop-off just before checkout, or a marketing landing page does a poor job at lead generation. Identify where you are through available data and define a goal for where you want to be.

Build a deeper understanding

To empathize with users, we need to do our best to understand who they are and what their goals are. It's critical that the target audience is clear. Customer Journey Maps and User Personas are great ways to understand your audience, but if nothing that polished is available, a defined audience segmentation can guide us. Once the audience is defined, I highly suggest building a rough Use Flow Map. It helps clarify the steps users go through along the expected journey toward the outcome we want them to reach.

An example of a Use Flow Map for a commerce campaign page, highlighting identified sources of traffic and key decision points. Use Flows can be everything from fancy flow charts to sketches on napkins.

Building a Use Flow Map simplifies and visualizes the steps users go through in a manner that disregards aesthetics and user interface. Instead, we're focusing solely on the process users can take and the decisions that have to be made. With a basic understanding of who the users are and what their journey looks like, we can turn to analytics to get a dose of reality.
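For teams that prefer text to napkin sketches, a Use Flow can even be jotted down as a tiny directed graph. Here is a minimal sketch; the entry points and steps are hypothetical stand-ins for a commerce campaign page, not a prescribed structure:

```python
# Sketch: a Use Flow Map as a directed graph. The entry points and
# decision steps below are hypothetical examples.

flow = {
    "Email campaign": ["Campaign page"],
    "Paid search": ["Campaign page"],
    "Campaign page": ["Product detail", "Leave site"],
    "Product detail": ["Add to cart", "Leave site"],
    "Add to cart": ["Checkout"],
}

def paths_to(graph, start, goal, path=None):
    """Enumerate every path from `start` to `goal` in the flow."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    results = []
    for nxt in graph.get(start, []):
        results.extend(paths_to(graph, nxt, goal, path))
    return results

for p in paths_to(flow, "Email campaign", "Checkout"):
    print(" -> ".join(p))
```

Enumerating the paths to a goal makes it easy to spot which decision points every successful journey must pass through.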

Gather applicable data

Tools like Hotjar, CrazyEgg, and Google Analytics give organizations greater insight into user engagement. Google Analytics is helpful in identifying what content users are engaging with, while a more visual analytics tool like Hotjar does a better job of showing how that content is being engaged with.

Google Analytics allows for easy tracking and review of your site's traffic performance, including when users visit, what they're visiting, and how long they stay. We can use Google Analytics to compare our envisioned Use Flow Map with the real user report. For example, Google Analytics offers a higher-level look at User Flow, allowing businesses to see which funnels of pages users commonly travel through. If you're interested in grittier details, building funnels and setting goals are a great way to make the most of your Google Analytics account.
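Google Analytics computes funnel drop-off for you, but the arithmetic behind those reports is simple enough to sketch. The step names and visit counts below are hypothetical, standing in for the numbers an analytics tool would report:

```python
# Sketch: computing step-to-step drop-off for a checkout funnel.
# Step names and user counts are hypothetical example data.

def funnel_dropoff(steps):
    """Given ordered (step_name, user_count) pairs, return the
    percentage of users lost at each transition."""
    losses = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        lost_pct = 100.0 * (count_a - count_b) / count_a
        losses.append((f"{name_a} -> {name_b}", round(lost_pct, 1)))
    return losses

funnel = [
    ("Landing page", 1000),
    ("Product page", 620),
    ("Cart", 300),
    ("Checkout", 90),
]

for transition, pct in funnel_dropoff(funnel):
    print(f"{transition}: {pct}% drop-off")
```

The transition with the steepest percentage loss is usually the first candidate for the "isolate a specific problem" step above.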

Above is an example of what a Users Flow chart may look like in a site's Google Analytics.

Hotjar, on the other hand, helps identify how content is being engaged with. With the ability to record user sessions, view heatmaps of aggregate clicks, movement, and scroll depth, and track funnels of page activity, Hotjar gives a better look at how users actually engage with a website. For example, the below heatmap shows that user mouse activity and scroll activity do not adequately reach the call-to-action form on this landing page.

The map on the left shows how desktop users navigate by clicking on the page. The map on the right details scroll depth for desktop users, captured with Hotjar's visual analytics.
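Scroll-depth maps like the one above boil down to a simple aggregation over per-user scroll data. A minimal sketch, assuming we already have each user's maximum scroll depth as a fraction of page height (the sample depths and the CTA position are hypothetical):

```python
# Sketch: estimating how many users scroll far enough to see a
# call-to-action. Depths are per-user maximums on a 0.0-1.0 scale;
# the sample values and the 0.75 CTA position are hypothetical.

def pct_reaching(depths, position):
    """Percent of users whose max scroll depth reaches `position`."""
    reached = sum(1 for d in depths if d >= position)
    return 100.0 * reached / len(depths)

max_depths = [0.2, 0.35, 0.5, 0.5, 0.6, 0.8, 0.9, 1.0]
cta_position = 0.75  # form sits three-quarters of the way down

print(f"{pct_reaching(max_depths, cta_position):.1f}% of users reach the form")
```

If only a small fraction of users ever reach the form's position, that is the kind of evidence that motivates the repositioning hypothesis discussed next.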

Ideate, Iterate, and Validate

Apply what we now know to convert those observations into improved outcomes. From what we've gathered, we can hypothesize several potential improvements. In our example, the fix might be as obvious and easy as moving the desired call-to-action up on the page. We can test our hypothesis with some A/B testing, using different usability tools or publishing the change and recording a new set of data to analyze. Not every solution is a "one-time" fix, though. Some changes have the potential to make a problem worse or even introduce new risks or issues. This is where user validation becomes important.
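A quick way to judge whether an A/B result reflects a real lift rather than noise is a two-proportion z-test. Here is a minimal sketch using Python's standard library; the visitor and conversion counts are hypothetical, not results from the page discussed here:

```python
from math import sqrt
from statistics import NormalDist

# Sketch: two-proportion z-test for comparing conversion rates of
# an A variant and a B variant. Counts below are hypothetical.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for rates conv_a/n_a vs conv_b/n_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# A: original page, B: form moved up, hero CTA removed
z, p = two_proportion_z(conv_a=30, n_a=1000, conv_b=55, n_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real lift
```

Most A/B testing tools run an equivalent check behind the scenes; doing it by hand is mainly useful when you publish a change and compare two recorded data sets yourself.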

In our example, we hypothesize that the form is not getting attention because it is too far down on the page, and users are interacting with the button on the hero area instead. Our proposed solution is to remove the distracting button and reposition the form up higher on the page.

Before (left): there is a CTA button in the hero banner and the form sits low on the page. After (right): the CTA button is removed from the hero banner and the form is moved up near the average fold.

To validate, we spin up a usability test against a small sample of panel testers with a testing service such as Usertesting.com, UsabilityHub, or UserCrowd. Clearly define your audience and what you want to test users against. As shown below, we set the demographics to match our target audience and publish a sample of each UI iteration to validate any performance gains or losses.

Our selected demographics for UsabilityHub user testing.

These test sessions are remote, unmoderated, and based on a single click.

Our results below are shown as a heatmap of clicks and the average time to click:

We can see that by making these page adjustments, more users follow our preferred path of action and the average time to complete the task goes down.

Our hypothesis is partially correct: moving the form up gained it some traction, but users still gravitated somewhat toward the navigation. As an added bonus, it appears our edited solution performs a little faster.

Based on the data we have now, it is safe to say we could improve upon the incumbent solution just by making this change. If your timeline requires it, run with what you have. If you have time to iterate, this is where we would re-test with a stripped-down navigation (Basic Landing Page 101). Either way, implement the tested solution, continue to track conversions, and observe user interaction through analytics.

Larger research studies can often surface a multitude of issues that go far beyond client expectations and budgets. Smaller-footprint studies bound by known goals can prove to be immediately actionable. In summary: isolate a problem, define a goal, gather and interpret relevant data, ideate solutions, test iteratively, validate, and implement and optimize.

Rinse and repeat, as necessary.