Which Metrics Best Measure UX Success? Colorado Product Designers Weigh In.

by Taylor Karg
February 23, 2021

Good design work is informed by a blend of qualitative and quantitative methods for collecting data — but that data means little if a product team can’t turn the resulting insights into the best possible product for users.

Recently, Built In Colorado asked three local designers from StackHawk, Artifact Uprising and ShapeShift about which metrics product designers should use to measure the success — or failure — of their designs and how they translate those insights into innovation.

 

Aaron White
Head of Design and User Experience • StackHawk

Because security software company StackHawk’s UX metrics are directly tied to its business objectives, it uses a blend of data to measure user experience. Head of Design and User Experience Aaron White said that to gather qualitative data, he uses a hypothesis-driven model for focused user interactions, which in turn, helps him better understand the patterns he sees in the quantitative data.

 

What UX metrics have you found to be most important for your business and product? 

The most important UX metrics have to be directly tied to business objectives. In my current role at StackHawk, our business objectives are tied to optimizing “time-to-value” and growing user adoption of our application security testing platform. My metrics on the UX side have to roll into those larger goals.

We pay close attention to the percentage of users that sign up successfully and complete a security scan within their initial session. We also look at user health metrics, such as scans per account, completed scans, scan errors and scan frequency. All of these data points help inform various product decisions.
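The funnel and health metrics described above can be sketched from raw product events. The snippet below is a minimal, hypothetical illustration — the event names and account IDs are invented for the example, not StackHawk’s actual schema:

```python
from collections import defaultdict

# Hypothetical event log: (account_id, event_name) pairs.
events = [
    ("a1", "signup"), ("a1", "scan_completed"),
    ("a2", "signup"), ("a2", "scan_error"),
    ("a3", "signup"), ("a3", "scan_completed"), ("a3", "scan_completed"),
]

# Group events by account.
by_account = defaultdict(list)
for account, event in events:
    by_account[account].append(event)

# Sign-up-to-scan conversion: share of sign-ups that completed a scan.
signups = sum(1 for evs in by_account.values() if "signup" in evs)
converted = sum(1 for evs in by_account.values() if "scan_completed" in evs)
conversion = converted / signups

# User-health metric: completed scans per account.
scans_per_account = {
    acct: evs.count("scan_completed") for acct, evs in by_account.items()
}
```

In this toy data, two of three sign-ups complete a scan, so `conversion` is about 0.67 — the kind of number a team would then try to move by removing onboarding friction.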

Since our product recently hit the “general availability” milestone, we don’t have access to huge sets of statistically significant data. Instead, we do our best to recognize patterns, engage with users where we can, make hypotheses about what is happening and roll out functionality we think will address users’ hurdles.
 

“Quantitative and qualitative data work hand-in-hand and I have to be careful that I am not overly reliant on one type over the other.”


How do you balance the quantitative data you’re collecting with qualitative data you’re getting directly from users? 

Exciting users has always been the most fulfilling part of UX for me. Quantitative metrics are great indicators of what is happening, but qualitative data is the only way to go the next layer deeper into why things are happening.

But getting the qualitative data can be tricky. To gather the most valuable qualitative insights, I have to go into user conversations with clearly defined hypotheses that I try to disprove. Finding people that will give you their honest thoughts instead of what they think you want to hear can be tricky, but having that feedback loop is invaluable.

By using a hypothesis-driven model for focused user interactions, I am able to better understand why I am seeing patterns in quantitative metrics. Quantitative and qualitative data work hand-in-hand and I have to be careful that I am not overly reliant on one type over the other.

 

How do you translate the data you collect into action?

After making the product generally available, we studied how many users successfully made it from sign up to scan in their initial session and saw room to improve this metric. We considered persona types, gathered user feedback and looked at the product logs to determine what was a blocker to successful scans.

Our product is more developer-first than most security products on the market. So while a developer could go from sign up to scan in about eight minutes, different personas were struggling to complete the scan. Some lacked familiarity with how their app worked (things like authentication) and others wanted to scan a sample application before their own systems. 

To provide a better onboarding experience and support a non-developer evaluation case, we found a publicly available application that all users could scan to take friction out of the user flow. We let users decide if they wanted to scan the publicly available option or their own private app.

Once we implemented this, we saw a greater percentage of users being able to successfully complete their first scan. We are dedicated to providing the best onboarding experience for all users and were thrilled to see this improvement.

 

Daniel Alt
Senior Product Designer • Artifact Uprising

As an e-commerce platform, Artifact Uprising values the qualitative data its users provide via phone calls, interviews and emails. Senior Product Designer Daniel Alt said that it allows him and his team to dig deep into what their customers are thinking while using the platform, which then allows them to run additional A/B testing to see which subtle adjustments will improve their experience.

 

What UX metrics have you found to be most important for your business and product? 

As an e-commerce shop whose products require user personalization, we try to make it as seamless as possible for users to build their product. We track how far customers make it down the funnel (or the build process), conversion, bounce rate, scroll percentages, net promoter scores (NPS) and more. One of the metrics we consistently look to is interactions. We are constantly running A/B interaction tests to discover what resonates with our customers.
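A common way to decide whether an A/B interaction test actually moved a metric like add-to-cart conversion is a two-proportion z-test. The sketch below is a generic, assumed approach with made-up sample numbers — not Artifact Uprising’s actual tooling or data:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: variant B lifts add-to-cart from 10% to 12%
# across 5,000 visitors per arm.
z = two_proportion_z(conv_a=500, n_a=5000, conv_b=600, n_b=5000)
significant = abs(z) > 1.96  # roughly 95% confidence, two-sided
```

With these illustrative numbers the lift clears the significance bar; a smaller sample or a subtler design tweak often would not, which is one reason teams fall back on qualitative signals when the quantitative result is flat.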
 

“If we change a design and it doesn’t impact the quantitative data, we default to the qualitative side of the data.”


How do you balance the quantitative data you’re collecting with qualitative data you’re getting directly from users? 

We really value our qualitative data at Artifact Uprising. We have an amazing customer base that is enthusiastic about sharing their experience through calls, interviews and emails. This allows us to dig a bit deeper into what our customers are thinking. That being said, running A/B tests allows us to make subtle adjustments to improve the customer experience. One of our general rules is that if we change a design and it doesn’t impact the quantitative data, we default to the qualitative side of the data.

 

How do you translate the data you collect into action?

Product detail pages (PDP) are at the core of e-commerce and we do a lot of experimentation in that space. One of the early tests we ran at Artifact Uprising was a small design improvement test focusing on minor typography, spacing and layout on the PDP. It stemmed from qualitative data we got through our ongoing pulse checks with customers. A pain point we heard many times during these conversations was that our site had issues with readability, which led us to make design improvements to specifically solve this issue. We were able to prove the value of the updates by measuring funnel progress, which in our case was a user adding a product to their cart. Ultimately, the test proved successful and we were able to use the learnings to make similar readability improvements across the site. 

 

Bethany
Senior UI/UX Manager • ShapeShift

At crypto platform ShapeShift, usability informs its UX strategy. To measure their success in making their design as user-friendly as possible, the UX team looks at feature adoption rates and whether they were able to decrease the amount of time the user spends trying to understand the feature or task they’re engaging with, Senior UI/UX Manager Bethany said.

 

What UX metrics have you found to be most important for your business and product? 

Our biggest focus in 2020 was offering customers a better user experience to promote growth. User engagement is a major focus for us, and we primarily measure this with daily active users. Usability is an extremely important piece of building and maintaining user engagement.

Historically, ShapeShift’s main product offering was focused more on the experienced crypto-user. Recently, we’ve expanded our focus and adjusted our product to make it easy for new users to get started with crypto. To measure our success with making things easy, we look at feature adoption rates, decreasing the amount of time someone spends trying to understand the feature or task they’re engaging with and reducing user error.
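Feature adoption rate and time-to-understand can both be reduced to simple aggregates over usage logs. The following is a minimal sketch with invented user IDs and timings, purely to illustrate the two measurements described above:

```python
from statistics import median

# Hypothetical logs for one day; names and values are illustrative only.
daily_active_users = {"u1", "u2", "u3", "u4", "u5"}
used_new_feature = {"u2", "u4"}

# Adoption rate: share of active users who touched the new feature.
adoption_rate = len(used_new_feature & daily_active_users) / len(daily_active_users)

# Time-to-understand proxy: seconds from opening a feature to first
# successful completion, per user; the median resists outliers.
times_to_first_success_s = [42, 75, 30, 120, 55]
typical_time = median(times_to_first_success_s)
```

Tracking how `adoption_rate` rises and `typical_time` falls across releases gives a concrete read on whether a feature is getting easier for new users.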

 

How do you balance the quantitative data you’re collecting with qualitative data you’re getting directly from users? 

During the design process, we do our best to empathize with our users and rely heavily on qualitative data that is gathered through multiple rounds of prototyping with usability testing. This allows us to challenge our assumptions and ensure that we are building the right features and products. Once a feature or new product is launched, we monitor relevant analytics to see where we might need improvements or modifications — measuring what’s happening with quantitative data and looking at the how and why with the qualitative data.

We work a lot with our marketing team to implement experiments or A/B tests if we want to improve upon a metric. Sometimes, when the analytics fall short, we may end up redesigning specific aspects of a feature. Our user base is also pretty vocal and we get feedback from them in multiple ways, including our online feedback forum, surveys and social media. Recently, many of the features our users are requesting align with projects we’re already considering in our roadmap, which helps validate our product decisions. We’re always striving to create the best user experience we can, while striking an optimal balance between users’ needs and our goals as a business.

 

“We do our best to empathize with our users and rely heavily on qualitative data that is gathered through multiple rounds of prototyping with usability testing.”

 

How do you translate the data you collect into action? 

We’re always turning data into action. Most recently, our data drove us to make significant changes to when and how we require users to verify their identities (a regulatory requirement for some activities we offer), a process known in the industry as “Know Your Customer” (KYC). 

We’ve modified this process several times with the goal of increasing conversion. We changed the user flows, incorporated optical character recognition (OCR) technologies to automate data entry and rewarded users who completed the process. While conversion improved with each iteration, there were still pain points, especially for our international users. Ultimately, when we looked at the whole picture, the quantitative and qualitative data showed us that no matter how much we improve the process, our users just don’t want to provide the level of personal information that KYC requires.

Therefore, we’ve been working on features and solutions for our products that don’t require our users to go through KYC. For example, we recently launched a new trading experience leveraging decentralized exchanges (DEXs) that doesn’t require our users to complete KYC at all. We’re already seeing positive results and we’re excited to continue on this path.
