Editor’s note: We’ve asked a handful of design leaders to respond to prompts each week. This week’s prompt was “What key metrics do you use to track your design’s effectiveness?” Read responses from Chris Thelwell and Jeff Gothelf below, and check out Nir Eyal’s answer here.
What you measure determines what you achieve
The UX designers at Envato are embedded within agile development teams, which means design has inherited its methods for measuring performance from our agile practices. However, alongside our engineering and product management colleagues, we are working to change this: we believe it is also really important to measure the effectiveness of our work.
How you measure a team shapes how that team behaves and how well it performs. A team will naturally optimize themselves around achieving the targets they set.
I believe this is where most agile practices go wrong.
“A team will naturally optimize themselves around achieving the targets they set.”
Agile is all about the delivery of value into the hands of the customer. “What’s wrong with that?” I hear you ask. Well, the problem lies in the fact that people focus only on measuring the delivery part. They ignore the value part or the impact that value has on the customer.
For example, Scrum measures velocity, burndown, and story points. Kanban measures cycle time and flow. All of these measures celebrate the act of shipping software to the customer, so teams will naturally optimize themselves around shipping. High-fives all around.
When you measure delivery, you expect deliverables
With the whole team focused on shipping, the pressure on the designers is huge. Designers are expected to deliver the deliverables of design—wireframes, mockups, prototypes, and spec documents. And they can’t let the development team slow down. This is where most agile design teams get stuck feeding the development beast.
“We’re stuck in delivery mode because all we ever measure is shipping.”
Jeff Gothelf, author of Lean UX, has been reminding us for the last 5 years to “get out of the deliverables business.” Yet we’re still doing it. We’re stuck in delivery mode because all we ever measure is shipping.
Changing the way we measure
At Envato we still value shipping—it’s still really important. We measure how fast we ship working software into the hands of our customers, as well as the design team’s ability to support our developers in doing so.
Image from Inside Design: Envato.
But, taking a data-led approach, we’re now measuring the impact that the value we ship has on our customers. The teams are starting to measure the key business metrics that matter, and they focus on making a difference to those metrics in every two-week sprint. A success factor or hypothesis is decided at the start of an initiative; it defines when we are done, not just when the feature ships.
For example:
- We measure the number of people who successfully sign up against those who drop out. We’ve seen significant improvements since we designed new ways to sign up, such as signing up during checkout, and reduced the number of steps.
- We look for an increase in conversion as we iterate on the way we communicate the benefits of our items—such as support.
- We are gradually reducing our page load times. We discovered a link between slow page loads and high exit rates, and we believe this results in fewer purchases on those pages. (A rough sketch of how metrics like these might be computed follows this list.)
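As an illustration only, here is a minimal Python sketch of how a sign-up conversion rate and a simple load-time versus exit-rate comparison could be computed from session data. The event fields, the 2-second threshold, and all numbers are hypothetical; nothing here reflects Envato’s actual analytics pipeline.

```python
# Hypothetical sketch: sign-up conversion and exit rate by page speed.
from statistics import mean

# Hypothetical event log: one dict per session (field names are made up).
sessions = [
    {"started_signup": True,  "completed_signup": True,  "load_ms": 900,  "exited": False},
    {"started_signup": True,  "completed_signup": False, "load_ms": 3200, "exited": True},
    {"started_signup": False, "completed_signup": False, "load_ms": 2800, "exited": True},
    {"started_signup": True,  "completed_signup": True,  "load_ms": 1100, "exited": False},
]

# Conversion: of the people who started signing up, how many finished?
started = [s for s in sessions if s["started_signup"]]
completed = [s for s in started if s["completed_signup"]]
print(f"Sign-up conversion: {len(completed) / len(started):.0%}")

# Exit rate for fast vs. slow pages (the 2-second split is arbitrary).
fast = [s for s in sessions if s["load_ms"] < 2000]
slow = [s for s in sessions if s["load_ms"] >= 2000]
print(f"Exit rate (fast pages): {mean(s['exited'] for s in fast):.0%}")
print(f"Exit rate (slow pages): {mean(s['exited'] for s in slow):.0%}")
```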
Because our designers are embedded in the development teams, they share the same metric-driven goal. So everybody on the team—not just the designer—works on improving the impact our designs have on the customer.
Tracking the effectiveness of your design
The metrics you choose to determine your design’s effectiveness vary based on each business problem you’re solving. Additionally, these metrics should be the same ones that determine the effectiveness of your product and engineering choices.
In other words, you can’t split out ‘design metrics’ from other kinds of product metrics.
The experience of using your product or service is holistic. Your users don’t think in terms of your org chart. The choices they make within your product reflect the sum total of design, product, content, engineering, marketing, and everyone else involved in making that product a reality.
“You can’t split out ‘design metrics’ from other kinds of product metrics.”
This is why it’s critical at the beginning of an initiative to determine what outcomes you’d like to achieve. Outcomes are measurable changes in customer behavior, and, perhaps most importantly, they’re objective. This is a huge benefit because it provides the product team (not just the design team) evidence for the efficacy of their choices.
Image from Inside Design: Buzzfeed.
Did these new changes improve customer success? If so, let’s optimize them. If they didn’t, let’s roll them back, learn why they didn’t work, and try again. These outcomes are related to the specific problem the team is solving. It’s therefore impossible to put forth a defined list of “design metrics” that would work across any project.
When it comes to the team’s effectiveness, it’s risky and difficult to carve out a specific discipline’s performance and measure it objectively. The best approach here is to run regular, cross-functional retrospectives with the entire product team. These exercises allow the team to assess what went well over the last iteration, what could have gone better, and who will take the necessary steps to improve specific items.
“Run regular, cross-functional retrospectives with the entire product team.”
While there’s no specific metric that comes out of these sessions, teams can sample the general morale of their colleagues with an anonymous poll at every retrospective. This result can be plotted over time to see if team morale—for the entire team, not just design—is going up or down and what’s driving that change.
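To make the idea concrete, here is a minimal sketch of plotting that morale poll over time. It assumes anonymous scores on a 1–5 scale collected at each retrospective; the scale, sprint names, and numbers are all hypothetical, not something prescribed in this piece.

```python
# Hypothetical sketch: average anonymous retro morale scores per sprint
# and whether the trend is moving up or down.
retro_scores = {
    "Sprint 12": [4, 3, 5, 4, 4],
    "Sprint 13": [3, 3, 4, 2, 4],
    "Sprint 14": [4, 4, 5, 4, 3],
}

previous = None
for sprint, scores in retro_scores.items():
    avg = sum(scores) / len(scores)
    if previous is None:
        direction = ""
    else:
        direction = " (up)" if avg > previous else " (down)" if avg < previous else " (flat)"
    print(f"{sprint}: average morale {avg:.1f}{direction}")
    previous = avg
```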
Join the conversation
Write your own response to the prompt “What key metrics do you use to track your design’s effectiveness?” on Medium, and submit it to our publication.
by Jeff Gothelf
Jeff is a lean thinking and design evangelist, spreading the gospel of great team collaboration, product innovation, and evidence-based decision making. Jeff co-founded Neo Innovation after leading UX design teams at TheLadders and Web Trends. He’s the co-author (with Josh Seiden) of Lean UX: Applying Lean Principles to Improve User Experience and Sense and Respond, out later this year.
Chris Thelwell has been a digital product designer in both the UK and Australia for many years, juggling award-winning F1 projects, cool Google Chrome apps and the occasional European football championship. An outcome-focused design leader, Chris specializes in disrupting markets, creating innovative new digital products, and building high-performing design teams in Agile software delivery environments within large enterprises, startups, and agencies.