Say-Do Metrics: Avoiding Agile Banditry in Your Organization


In Agile environments, there’s often a temptation to rely on metrics that seem to offer clarity and control over a project’s progress. One such metric is the “say-do” metric, which measures what a team says they will do versus what they actually accomplish. While this may appear useful on the surface, it’s often a slippery slope that leads to vanity metrics, reduced psychological safety, and, ultimately, a focus on outputs rather than outcomes.

In this post, we’ll explore the dangers of say-do metrics, why focusing on outcomes is critical, and how to steer clear of what I call Agile banditry. Let’s dive in.

What Are Say-Do Metrics?

Say-do metrics compare a team’s estimated project outcomes with the actual results. A common example is tracking original estimates vs. actual hours worked on a project. On the surface, this seems like a valuable way to measure performance. However, as I’ve seen time and again, say-do metrics are ripe for manipulation, leading teams down the wrong path.

A Real-Life Example of Misleading Metrics

Let me share an example from an organization I worked with many years ago. The head of the PMO (Project Management Office) proudly showed me a presentation of metrics sent to leadership. The data compared original estimates with actuals for five ongoing projects, each in the thousands of hours range.

  • All five projects were within a 20% margin between estimated and actual time.

  • Three projects were within a 15% deviation.

  • Two projects boasted a margin of 10% or less.

This initially sounds impressive, right? But then I noticed something odd: one project had an original estimate of 5,232 hours and an actual time spent of… 5,232 hours.

Wait, what? 🧐

When Metrics Become Vanity Metrics

I pointed out this highly suspicious match between estimated and actual hours. The head of the PMO sheepishly admitted that, to avoid the leadership’s wrath after last year’s budget issues, they allowed project managers to submit change requests that adjusted original estimates. By the time the final report was due, they had managed to align their estimates perfectly with the actuals. This data was going to be presented to leadership, and key funding decisions would be made based on it.

Here’s the problem: this data wasn’t reflective of reality. It was complete fiction. These kinds of vanity metrics can have disastrous consequences when leadership uses them to make funding or project prioritization decisions.

Key Lessons from Misleading Metrics:

  • Vanity metrics paint an unrealistic picture to keep leadership happy.

  • They create a false sense of security and trust in the data.

  • They divert focus from what really matters – delivering value to customers.

The Impact on Psychological Safety

Monitoring say-do metrics not only skews the data but also undermines psychological safety within the team. When teams feel pressured to match their estimates to the actuals, they’re incentivized to manipulate data rather than report the truth.

Here’s what happens:

  • Teams start gaming the system to avoid negative feedback.

  • Leadership becomes disconnected from the true challenges on the ground.

  • The focus shifts from outcomes to merely ticking off boxes.

This lack of transparency and manipulation of data fosters a culture of fear. If you’re not delivering exactly what you promised, you’re seen as a failure, even if you delivered something far more valuable.

🚫 Avoid Agile Banditry

To prevent this, stop relying on say-do metrics. Focus on fostering transparency, creating psychological safety, and delivering outcomes that matter, not just outputs that look good on paper.

Outputs vs. Outcomes: What’s the Difference?

Focusing on Outputs

Outputs are the tangible results of work completed, often measured by the number of tasks, features, or deliverables produced. In a traditional Agile environment, this might mean delivering a specific number of features or completing a certain amount of work within a sprint.

However, focusing too much on outputs can lead to situations where teams prioritize quantity over quality. This often happens when metrics like original estimates vs. actuals take center stage.

Focusing on Outcomes

Outcomes, on the other hand, are the value derived from the work. It’s not just about completing ten features; it’s about delivering nine that are incredibly valuable to the end-user or stakeholder. When we shift our focus from outputs to outcomes, we measure success based on the impact of our work, not just the completion of it.

Let me share an example from my time at an MVP Summit at Microsoft. One year, our group of MVPs spent days determining the five most valuable features that we believed Microsoft should develop for Azure DevOps.

The following year, Brian Harry, a key figure at Microsoft, took the stage and said something that initially surprised us: “Of the five things you said were the best things we could build, we built none of them.”

That could have been devastating – after all, we’d spent a significant amount of time and effort figuring out those five features. But then he explained that what they did build was far more valuable to customers. They had focused on outcomes, not just ticking off our list of recommendations. And in the end, they blew us away with the value they delivered.

🔥 Key takeaway:

  • Prioritize outcomes over outputs.

  • Focus on what delivers the most value, not just what was originally promised.

How to Shift Focus to Outcomes

If your organization is stuck in an output-focused mindset, it’s time to make a shift. Here are some steps to help move the focus from say-do metrics to outcomes:

1. Encourage Transparency

Create an environment where teams feel safe sharing the real challenges and roadblocks they’re facing. Open and honest communication leads to better problem-solving.

2. Measure Value Delivered

Instead of measuring tasks completed or hours worked, focus on the value delivered to the end user. Are the features being delivered making a positive impact?

3. Incorporate Flexibility

Allow teams the flexibility to adapt to changing circumstances. Agile is all about responding to change. Strictly adhering to say-do metrics leaves little room for the iteration and improvement that Agile thrives on.

4. Continuous Improvement

Encourage teams to focus on continuous improvement rather than just hitting arbitrary targets. Celebrate the learning and adaptations that come from not getting things perfect the first time.

Say-Do Metrics: A Bandit’s Tool

At the end of the day, say-do metrics are a tool of Agile banditry. They allow organizations to create a veneer of success while covering up the true picture. And when leadership makes decisions based on these vanity metrics, the consequences can be dire.

What You Should Do Instead:

  • Shift your focus from outputs to outcomes.

  • Foster psychological safety within your teams.

  • Measure value delivered, not tasks completed.

  • Embrace transparency and open communication.

If your organization is grappling with Agile banditry and misleading metrics, my team at Naked Agility can help. We specialize in helping teams and organizations get back on track, focusing on delivering real value rather than playing games with data.

Video Transcript

One of the agile practices that puts us clearly and explicitly in the category of agile banditry is using say-do metrics: metrics where you look at what people said they were going to do and then at what they actually did. My favourite example is from an organisation I worked with many years ago. The head of the PMO brought me into his office to show me some of the metrics they presented up to leadership. For each of the projects they had underway, and I think there were five, they had an original estimate versus actuals. That’s the quintessential say-do metric, and these projects were in the thousands of hours, around 5,000 hours each.

Something I noticed right away when he showed me this presentation, which was intended for leadership, was that all five projects were within 20%. But if you know your industry research and your statistics, you’ll probably know from the Standish Group’s Chaos Report that around 65% of the features we think we’re going to build change during the life of a project. So how could there possibly be only a 20% deviation between the amount of time we think something will take and the amount of time it actually takes? That was my first clue that something was up.

My second clue was that three of the projects were within 15% and two were within 10%. And then I noticed one project with an original estimate of 5,232 hours and actuals of 5,232 hours. I turned the graph around, pointed at it and said, “But this is BS. How could these numbers possibly be the same?” He started with “Well…”, and you know something bad is coming when somebody opens an answer like that. The previous year, they had produced the report with what had actually happened and given it to leadership, and leadership had a tantrum, threw their toys out of the pram, shouted at them, and made their lives a misery. So this year they changed the system and allowed project managers to submit change requests against their original estimates. That particular project manager had got his change request in just before the report deadline, so the original estimate had been changed, more than once, to make the graphs look the way they did.

So they were going to present this to leadership, and leadership was potentially going to make decisions about what to fund and what not to fund based on this kind of data, data that is complete fiction. You don’t want that in an organisation. If you monitor say-do metrics, and there are many other types, though the most common is original estimate versus actuals, you reduce psychological safety. The PMO of a big organisation didn’t feel safe enough to show leadership the real data; they had to create vanity metrics to keep them happy.

We also tend to focus on output rather than outcome. “You said you were going to deliver ten things, but you only delivered nine. You suck!” Whereas in actual fact, we delivered nine of the most valuable things. Isn’t it better to deliver nine things that are more valuable than the ten that were asked for? Or even zero of the things that were asked for? I remember being at an MVP Summit at Microsoft, where they invited us to come and talk about the product. The year before, we’d done a whole bunch of analysis on the most valuable features that we as MVPs believed Azure DevOps, I think it was still called Team Foundation Service at the time, needed as a product. We spent something like five days figuring that out and came up with a list of the top five things we believed all of our customers needed.

The next year, when Brian Harry got up to do the keynote that kicked the Summit off, the very first thing he said was: “Here’s the list of the five things you said were the best things we could build, and I’ll tell you now, we’ve built none of them. Here’s what we did build.” Instead of building the things we thought were most valuable, he went off and totally blew our socks off with the things he did build, which were far more valuable to our customers. If we’d held them to the say-do metric, they delivered nothing that we asked for, a verdict that would completely miss the value they created.

So don’t focus on output over outcome. Focus on the outcomes, on what it is you’re trying to achieve. The overemphasis on estimation that say-do metrics create is central to the negative feeling people have when you monitor them this way. Just stop doing say-do metrics. Don’t be a bandit; give up on the agile banditry of say-do metrics. If you’re being ambushed by agile bandits in your organisation, my team at Naked Agility can help you, or help you find a consultant or expert who can. Set up a no-obligation consultation using the links below, and don’t forget to like and subscribe if you enjoyed this video.

Tags: Metrics and Learning, People and Process, Agile Project Management, Psychological Safety, Agile Philosophy, Organisational Agility, Software Development, Pragmatic Thinking, Agile Product Management, Transparency and Accountability

Connect with Martin Hinshelwood

If you've made it this far, it's worth connecting with our principal consultant and coach, Martin Hinshelwood, for a 30-minute 'ask me anything' call.
