Lesson 14: The Two-Column Report That Shows if Your Process Changes Actually Work

Everybody says you can’t improve what you don’t measure. That’s true, but here’s what most people forget: you have to measure the right things. That’s the real trick. Let me walk you through how just two columns in a simple report helped settle a messy debate about a process change and turned it into a standard everyone followed.

There is this nagging question that comes up in almost every operational improvement project, right after the initial buzz fades and the results aren’t in yet: Is this actually working? Sounds straightforward. In reality, it’s weirdly difficult to answer. Not because the data doesn’t exist, but because people usually track the wrong data. Teams log outputs – tickets closed, features shipped, calls handled – but hardly ever connect the process change to the outcome. They don’t really check if what they changed is actually driving the result they care about.

When that happens, good changes get lost. Everything feels logical and right, but if leaders can’t see a clear connection between what you changed and what improved, the change stays vulnerable. One exec gets skeptical, one reorg happens, and poof – gone.

It’s not about collecting more data. It’s about finding the right pairing, and you can see if you’re onto something in just a week.

The Principle: Measure Outcomes and Adherence Side by Side

If you’ve been following this series, you probably have two things going that most companies keep totally separate. First, you’ve got your “north star” signal – the metric you picked back in lesson 6 to track whether you’re delivering value to customers. Second, there’s the process change – the boundary checklist you put together in lesson 11, tested in lesson 12, and rolled out into the workflow.

Most leaders watch these metrics independently, if they track both at all. You get a weekly review of the signal, maybe, and adherence to the checklist is usually assumed rather than monitored. No one really checks whether the two are linked.

But assuming the link is obvious doesn’t cut it. Most places have some healthy skepticism about process changes – and they should. Evidence, not assumption, decides whether a new idea actually sticks or fades away.

So stick your two metrics into one report. Track them together for a week. Then look: is there a correlation that tells you the process change is helping, or isn’t there? Either way, you actually learn something.

A Real Example: The One Week That Changed Minds

Let’s go through how this really played out.

We’d piloted our checklist – ran two sprints, had measurable results, made a logical case for scaling it up. Leadership was intrigued but unconvinced. The results were positive, sure, but they pointed out that one pilot over two sprints wasn’t proof. Maybe it was a fluke.

Instead of arguing, we tracked both metrics for a week across more workflows. Simple two-column report: Column A was “time to first value” – the north star, capturing how fast customers reached their first use of the core product after signup. Column B was checklist adherence – the percentage of handoffs where the three-item checklist was actually used.

After a week, we compared results. When checklist adherence hit 85% or higher, the signal improved by 15% over weeks with poor adherence (below 50%). In plain English: when most teams used the checklist, customers got value faster. When they skipped it, results stalled.
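
If you want to see the arithmetic behind a comparison like that, here’s a minimal sketch in Python. The numbers are made up for illustration (the real weekly figures aren’t reproduced here), and remember the signal is time to first value, so lower is better – “improved by 15%” means the high-adherence weeks were about 15% faster:

  # Hypothetical weekly rows -- illustrative numbers, not the actual report.
  # "ttfv_hours" = time to first value in hours (lower is better).
  weeks = [
      {"adherence": 0.92, "ttfv_hours": 34},
      {"adherence": 0.88, "ttfv_hours": 36},
      {"adherence": 0.45, "ttfv_hours": 42},
      {"adherence": 0.38, "ttfv_hours": 40},
  ]

  high = [w["ttfv_hours"] for w in weeks if w["adherence"] >= 0.85]
  low = [w["ttfv_hours"] for w in weeks if w["adherence"] < 0.50]

  avg_high = sum(high) / len(high)              # 35.0 hours
  avg_low = sum(low) / len(low)                 # 41.0 hours
  improvement = (avg_low - avg_high) / avg_low  # about 0.15

  print(f"High-adherence weeks were {improvement:.0%} faster to first value")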

That 15% wasn’t proof, but it was a clear trend. We presented it as correlation, not causation. It was enough to shift the conversation from “maybe we think it works” to “we’ve got real evidence – let’s run a proper experiment to make sure.” Leadership signed off and formalized the checklist.

How to Set Up Your Two-Column Weekly Report

You don’t need a fancy dashboard. A shared spreadsheet, updated weekly, is totally fine.

Column A: Your signal value. This is whatever metric you decided on – note the weekly number and whether it’s trending up, down, or staying flat.

Column B: Checklist adherence. For every handoff, record whether the checklist was completed – yes or no. At the end of the week, calculate the adherence rate: completed handoffs divided by total handoffs.
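
If you’re keeping the log in a script instead of a spreadsheet, the calculation is one line. A minimal sketch, assuming each handoff is recorded as a yes/no boolean (the variable names are mine, not from the report):

  # One entry per handoff: True means the three-item checklist was completed.
  # Hypothetical log for a single week.
  handoffs = [True, True, False, True, True, True, False, True]

  adherence_rate = sum(handoffs) / len(handoffs)       # 6/8 = 0.75
  print(f"Adherence this week: {adherence_rate:.0%}")  # Adherence this week: 75%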

And here’s the question you ask: When adherence is strong – say above 80% – do you see the signal improve? When adherence drops, does the signal drop too?

If the correlation sticks for several weeks, you’ve got directional evidence that your process change is making a difference. That’s enough to justify testing it further, and honestly, enough to make a case for scaling it up. If there’s no correlation, that’s useful, too. Maybe the checklist isn’t what moves the needle for that metric, or you picked the wrong signal. Either way, knowing early saves you a ton of headaches.
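
If you want one number that summarizes whether the pattern is holding, Python’s standard library can compute a Pearson correlation over the weekly pairs (statistics.correlation, available since Python 3.10). One caveat: with a “lower is better” signal like time to first value, a helpful checklist shows up as a negative correlation. These values are invented for illustration:

  from statistics import correlation  # Python 3.10+

  # One pair per week: (adherence rate, time to first value in hours).
  # Hypothetical values covering several weeks of the report.
  adherence = [0.92, 0.88, 0.45, 0.38, 0.81, 0.55]
  ttfv_hours = [34, 36, 42, 40, 35, 39]

  r = correlation(adherence, ttfv_hours)
  print(f"Pearson r = {r:.2f}")  # about -0.93: higher adherence, faster value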

A few practical tips:

  • Track for at least a week before jumping to conclusions; two or three weeks are better.
  • Don’t oversell correlation as proof. Present it honestly – as evidence, not a final answer.
  • Keep the report visible. When everyone reviews it together each week, you build shared understanding, and adherence usually goes up without anyone having to nag.

The Honest Lesson

When your data looks good, it’s tempting to treat it as proof and rush to scale your new process. Resist that urge. Correlation feels convincing, but jumping the gun might lead you to make decisions for the wrong reasons – you’ll never know what really drove the improvement and you won’t be able to repeat it elsewhere.

Treat correlation as what it is: a pattern that deserves further investigation. If it persists, set up a real experiment to establish causation.

The best improvement-minded leaders aren’t the ones who act fastest on incomplete data. They’re the ones who know exactly how much confidence each piece of evidence deserves before making a move.

Track the signal. Track adherence. Look for the correlation. Then be honest about what you’re seeing and what it means. That’s how you turn evidence into real operational knowledge.

Take a look at whatever process change you’re running now. Do you actually know if sticking to the new process moves the outcome you care about? If not, a two-column setup takes half an hour – and the insight you’ll get in a week is probably worth way more than that.

Next time, I’ll dig into how to turn that kind of correlation into a proper experiment. See you then.