Using NPS as an engineering metric gets product owners into hot water.

Before we dive into why, let’s look at what Net Promoter Score was designed for. NPS measures customer loyalty by asking a single question: “How likely are you to recommend us to a friend?”
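
For reference, the standard NPS calculation buckets 0–10 responses into promoters (9–10), detractors (0–6), and passives (7–8), then subtracts the detractor percentage from the promoter percentage. A minimal sketch in Python:

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count toward
    the total number of responses but neither add nor subtract.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# (5 - 2) / 10 * 100 = +30
print(nps([10, 10, 9, 9, 9, 8, 8, 7, 6, 3]))  # → 30
```

Note that the score can range from −100 (all detractors) to +100 (all promoters), and that passives are invisible in the result.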

It’s used to gather customers’ pains and joys so software teams can sell, market, and develop a product according to customers’ needs. But imagine if every software tool you used sent you regular NPS surveys. Response rates would suffer, and the metric would slowly lose value over time.

A good NPS score is seen in software communities as the benchmark for a product-first company and regularly features on executive and IPO reports. It’s so common that over two-thirds of the Fortune 1000 claim to use it.

General NPS benchmarks


Many engineering and service delivery managers also treat a high NPS score as a marker of success, and may add NPS to their reporting to support the engineering team’s efforts.

We go into which metrics you should be measuring to keep software quality high here; in brief:

  • Users affected by bugs
  • Median application response time
  • P99 application response time
  • Resolved bugs vs. new bugs
  • Bugs per 1000 lines of code
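
As a rough illustration (the sample data and numbers below are made up), most of these metrics reduce to simple aggregations over your telemetry:

```python
import math

def percentile(values, p):
    """Nearest-rank percentile (p in 0-100) over a list of samples."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical response-time samples in milliseconds; note the two slow outliers.
response_times_ms = [120, 95, 110, 3400, 101, 99, 130, 125, 98, 2100]

median = percentile(response_times_ms, 50)   # typical user experience
p99 = percentile(response_times_ms, 99)      # worst-case tail experience

# Hypothetical: 42 open bugs across a 180,000-line codebase.
bugs_per_kloc = 1000 * 42 / 180_000
```

The gap between the median and the P99 is exactly why both appear in the list: the median hides tail latency that a minority of users feel on every request.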

All engineering teams need a well-rounded approach to metrics that includes both internal and customer-facing numbers, but we often lean on NPS because it’s so easy to measure with dedicated tools.

While NPS is valuable, it can be misleading when applied to software quality: plenty of teams celebrate a good NPS score even though their software is slow and buggy.


If your software quality is high, and you use NPS as a guide to customer satisfaction, you’d expect the two to rise in step, but they won’t. NPS doesn’t tell you anything about the quality of the code you’re delivering to customers; it’s a loyalty metric.

Here are a few other problems:

  • Because NPS is a difference score, it lacks a meaningful scale of measurement. The business impact of increasing NPS scores has yet to be proven.

  • If you spend too much time chasing higher NPS scores, you’ll likely fall into a habit of building features rather than fixing cruft and technical debt.

  • High internal quality reduces the cost of future features: putting time into writing good code actually reduces cost. The business case for spending time on tech debt might be stronger than the case for building a new feature.

  • NPS also reinforces a culture of celebrating the past, which is not helpful in steering engineering direction. Retroactive metrics leave little space for innovation, and the bigger problem comes when too much weight is placed on NPS as a driver for allocating engineering resources.
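
The first point above can be made concrete. Because NPS is a difference score, two very different response distributions can produce an identical number (the survey samples here are hypothetical):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# A polarized customer base: 70% promoters, 30% detractors.
polarized = [10] * 7 + [2] * 3

# A lukewarm one: 40% promoters, 60% passives, no detractors.
lukewarm = [9] * 4 + [7] * 6

# Both land on exactly the same score.
assert nps(polarized) == nps(lukewarm) == 40
```

A team looking only at the headline number can’t tell which of these situations it is in, even though they call for very different engineering responses.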

NPS is flawed, but it is still a valuable metric. We just need to think about it a little differently.

Think of NPS as a high-level health check

“I do keep an eye on the NPS Slack channel and watch for comments around speed and performance, but I take them with a pinch of salt. I am more concerned with quality at the code level, like how many users are affected by bugs.”

Rachel, Delivery Manager at Raygun

Forget the idea that NPS is a good indicator of software quality. It’s not helpful in this context, and using it to determine engineering direction will likely lead to frustration. You’ll end up celebrating the past instead of driving your engineering team forward. Your boss will never be happy with the fluctuating scores.

Instead, look at NPS as a high-level health check. It gives your customers a voice, and as an engineering lead, it’s your job to listen. NPS won’t give you a view on software quality, but it will keep you honest. It’s the one software metric that is often public and can be affected across all teams, from customer success to marketing.

Software teams can use NPS without leaning on the overall score as an indicator of software quality. While NPS might be your current best indicator of a customer’s overall experience, support it with better engineering metrics, and your reports will stand out.

“We’re all facing the same problem at the end of the day. Everybody is trying to find metrics to track, and there isn’t a clear answer to what we should be tracking. But that doesn’t mean that we shouldn’t make an effort in tracking those. So we’re finding out what we don’t know. We need to discover new metrics … and what’s best for our customers.”

Zheng Li, Director of Product at Raygun

We asked prolific tech leaders from The Warehouse Group, Montoux LTD, Trade Me, and Sharesies how they use metrics to measure software quality. Here’s what they said:


Better engineering metrics are necessary because, used properly, they support software that is fast and doesn’t crash, and they show you how users are using your software and what their experience is.

Further reading on NPS for engineering teams