A team that I coached received this mandate: “The CTO’s measure of success is that all teams increase velocity by at least 50 percent by Sprint six.”
It took perhaps ten seconds before one of the Development Team members was grinning from ear to ear. Even the team manager saw the flaw in the directive he’d just passed on. “Okay, so in Sprint five we just need to make all our five-point stories thirteen-point stories, right?” joked one of the team members.
Welcome to Goodhart’s law in action: “When a measure becomes a target, it ceases to be a good measure.”
Goodhart’s law also applies to people metrics. Setting individual performance goals can dampen morale, produce false data, and even set teammate against teammate. Instead of promoting collaboration, it leads to unhealthy intra-team and cross-team competition, as well as the gaming of metrics to look “better.”
With the transparency of Agile and the granularity of team-based metrics, it is extremely important to be responsible in how you use your measurements. There are five principles I follow when dealing with metrics:
- Measure the project, the teams, and the adoption separately
- Start collecting metrics early and often
- Stay focused
- Be consistent
- And, most importantly, measure responsibly.
1. Measure the Project, the Teams, and the Adoption Separately
If you try to use the same measurements to track the health of the project/product as you do the health of the team, you are not gaining efficiency; you are blurring two different signals. Separating your measurement of the product, team, and Agile adoption may look like a lack of focus. However, just because a project is going really well does not mean the teams working on the project are in good shape, or that your Agile adoption is sustainable.
In my old command-and-control project management days, I used to say that any project manager could ship a product on time, on budget, and in scope—once. Your project can be in awesome shape, only to be in trouble when half the team quits after the launch. Keep your project, team and Agile adoption measurements separate from each other. Doing so gives you a total picture of what is going on, and failure to do so runs the risk of corrupting all your data and wasting all your measurement efforts.
2. Collect Your Agile Metrics Early and Often
Start collecting Agile metrics from the very first Sprint. If possible, capture how your teams performed before the Agile adoption started. If you are already six Sprints into an Agile adoption before you start collecting data, you’ve lost the opportunity to see where you were before you started the adoption. This makes it even harder to see where you are going.
When I began to coach a twelve-team organization, they were already three months into a rolling Agile adoption. I was lucky that they had been entering data into their Agile lifecycle tool. I continued to collect data as I observed the teams, so by the time I was three months in, I had a really good picture of each team. This gave me a place to work from as I dove into engaging with the teams. Already having trends and areas to focus on meant I was able to get some fast wins.
3. Stay Focused
Our brains really only work well with about three to five items at a time. If you’re tracking ten different metrics, odds are that five of them are not getting the proper attention or being tracked correctly. Too many metrics can also mean that one part of the organization might be paying attention to metrics 1-4 while another is looking at 7-10 and no one even cares about 5 and 6.
One of the pillars of good Agile and lean development is focus. We need to do the same thing with our measurements. Measure what matters the most to the organization or team. Improve what is being measured. Then shift focus to measure the next important thing to the organization or team.
I recommend setting up three separate types of metrics: team health-based metrics, product-tracking metrics to monitor outcome, and Agile health metrics to track the adoption.
4. Keep Your Agile Metrics Consistent
Consistency isn’t so much of an issue with the Agile projects themselves as it is with the tools we use to track measurements. There are two problems in this regard. First, physical task boards can become cluttered and out of sync if not carefully tended. Unfortunately, online tracking tools usually exacerbate these problems instead of helping. This is because the customizability of online tools often results in different features being used in different ways from team to team.
Second, when your teams cross over multiple managers or organizational structures, tracking metrics across the company can quickly become unwieldy. Again, online tools can exacerbate the problem. This is especially true when using any of the larger, enterprise-scale Agile life-cycle tools that allow easy customization, e.g., Jira, Rally, VersionOne, etc.
As an example of consistency in practice, I once worked with teams using both Scrum and Kanban. We wanted a consistent view of metrics to identify trends, not to measure teams against teams. We created artificial iterations solely for tracking the team metrics: the Kanban teams reported metrics in monthly increments, the Scrum teams in two-week increments, and the metrics charts normalized everything to a weekly rate. If a Kanban team had four escaped defects per iteration and a Scrum team had two per iteration, then the organization was trending at one escaped defect per team per week.
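The normalization above is simple arithmetic: divide each team's count by its iteration length in weeks. A minimal sketch, using hypothetical team names and the numbers from the example:

```python
# Normalize metrics from teams with different iteration lengths to a
# common per-week rate. Team names and figures are hypothetical,
# mirroring the Kanban/Scrum example above.

def defects_per_week(escaped_defects: int, iteration_weeks: int) -> float:
    """Convert an iteration's escaped-defect count to a weekly rate."""
    return escaped_defects / iteration_weeks

teams = [
    {"name": "Kanban A", "defects": 4, "iteration_weeks": 4},  # monthly cadence
    {"name": "Scrum B",  "defects": 2, "iteration_weeks": 2},  # two-week Sprints
]

rates = [defects_per_week(t["defects"], t["iteration_weeks"]) for t in teams]
org_trend = sum(rates) / len(rates)
print(org_trend)  # 1.0 escaped defect per team per week
```

Because every team's numbers land on the same weekly scale, the organization-wide trend line stays meaningful without comparing team against team.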
As for which metrics to actually use, I recommend common, agreed-upon measures for all teams and organizations. My recommendations for specific Agile metrics are cycle time, escaped defect rate, planned-to-done ratio, and a team happiness metric. Even outside of Agile organizations, these four metrics are easily understood. For product metrics, focus on value-based measures; for the adoption, use large-scale retrospectives or a good Agile assessment tool.
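Three of those four team metrics can be computed directly from Sprint data. A minimal sketch, assuming a hypothetical Sprint record (the field names are illustrative, not from any particular tool):

```python
from datetime import date

# Hypothetical Sprint record; field names are assumptions for illustration.
sprint = {
    "stories": [
        {"started": date(2017, 5, 1), "done": date(2017, 5, 4)},
        {"started": date(2017, 5, 2), "done": date(2017, 5, 9)},
    ],
    "planned_stories": 5,
    "completed_stories": 4,
    "defects_escaped": 1,
}

# Cycle time: average days from a story being started to being done.
cycle_times = [(s["done"] - s["started"]).days for s in sprint["stories"]]
avg_cycle_time = sum(cycle_times) / len(cycle_times)  # 5.0 days

# Planned-to-done ratio: how much of the Sprint commitment actually finished.
planned_to_done = sprint["completed_stories"] / sprint["planned_stories"]  # 0.8

# Escaped defect rate: defects found after release, per completed story.
escaped_defect_rate = sprint["defects_escaped"] / sprint["completed_stories"]  # 0.25
```

Team happiness, the fourth metric, is gathered by asking the team (e.g., a one-to-five survey each Sprint) rather than computed from tool data.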
5. Measure Responsibly
“Where there is great power there is great responsibility.”
Now to the most important principle: measure responsibly.
With all this data, using the data responsibly becomes essential. Here are some key points to remember:
- Don’t measure team against team: Comparing one team’s health metrics against another’s doesn’t work, and it isn’t valuable. Measure product outcomes instead. And never compare velocity!
- Don’t react too quickly: It is not uncommon for the first few Sprints of any new team or project to miss the mark. This is a normal part of any iterative cycle; we learn with every iteration. Look for trends across three or more Sprints.
- Use supporting metrics: Use metrics that support one another, so that if one metric is gamed, another metric will show what’s going on, e.g., if the team crashes the cycle time at the cost of quality, escaped defects will go up.
- For products, use customer-focused measures: When measuring product success, stick to customer-focused metrics such as revenue, cost savings, planned-to-done business value, Net Promoter Score, or Fit for Purpose.
- Use data, not estimates, to forecast: Don’t use estimates to forecast project timelines. Use empirical forecasting (e.g., “yesterday’s weather”) or statistical forecasting based on historical throughput.
- Use data to look forward: Most metrics are a lagging indicator. Use them to find better ways to do things in the future, not to punish past performance.
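One common form of statistical forecasting resamples a team's historical weekly throughput to simulate many possible futures (a Monte Carlo approach). A minimal sketch, with a hypothetical throughput history and backlog size:

```python
import random

# Statistical ("Monte Carlo") forecasting from throughput data rather
# than estimates. The throughput history and backlog are hypothetical.
history = [3, 5, 2, 4, 6, 3, 4]   # items finished in each past week
backlog = 40                       # items remaining
trials = 10_000

random.seed(42)  # deterministic runs for the example

def weeks_to_finish(history, backlog):
    """Simulate one possible future by resampling past weekly throughput."""
    remaining, weeks = backlog, 0
    while remaining > 0:
        remaining -= random.choice(history)
        weeks += 1
    return weeks

outcomes = sorted(weeks_to_finish(history, backlog) for _ in range(trials))
p85 = outcomes[int(trials * 0.85)]  # 85th-percentile completion time
print(f"85% of simulations finish within {p85} weeks")
```

Reporting a percentile ("85% of simulations finish within N weeks") keeps the forecast honest about uncertainty, which a single point estimate cannot do.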
Simply having metrics in place is not enough. You need a plan for how you will use them, and you need to constantly check to make sure you haven’t mixed up or polluted how you collect, monitor, or use that data.
In sum, applying these five principles leads to success when selecting and using Agile team metrics: Measure the project, teams, and adoption separately to get a total picture of what is going on. Collect your metrics early and often so you have more information about where you are going. Track only a handful of metrics so you stay focused. Use the same metrics consistently across the organization to make the data meaningful. Lastly, use the data you’ve collected in a responsible manner.
Note: An earlier version of this article first appeared in its original form in Agile Connections in May of 2017.