Trust Part 2: Capacity Based Funding Experiment

Contents
  1. The problems with project-oriented funding
  2. Experiment 1 — the Innovators
  3. Criteria for transparency and trust
  4. Where Next?

What’s in a name? A rose by any other name would smell as sweet … unless you call it stinkweed. We have struggled to find a name for this funding thing that works for everyone:

  • ‘Smart Funding’ — this was a bad idea; any name starting with ‘smart’ insults stakeholders by implying their current process is ‘stupid’.
  • ‘Persistent Funding’ — we’re still working to get this one out of the vocabulary. Finance (and some teams) hear this as a blank cheque with no transparency of the value delivered.
  • ‘Product Oriented Funding’ — this is what we had been calling it for the last year or so, and it was OK except in those parts of the business where ‘Product’ can only mean the financial products we provide to Members, like Mortgages and Current Accounts.

What we mean is the funding approach suitable for long-lived agile product teams, constrained by team capacity rather than the size of one’s chequebook. Here’s our story in detail for the benefit of any on a similar path.

Our approach is based on the principles of trust and transparency as outlined in Trust Part 1: Governance and Bureaucracy.

The problems with project-oriented funding

Much has been written on the topic of project-oriented funding and the issues it causes. For example, see the summary here: Lean Budgets — Scaled Agile Framework. This is our experience.

Our starting point, in the years up to 2020, was project-based budgeting tightly coupled to our annual budget cycle. This created a long-standing annual boom-bust cycle with the following impacts:

  • Loss of domain knowledge every time a short-lived project team was disbanded.
  • Seasonal skills bottlenecks, with many waterfall projects mobilised in Q1 all competing for business analysts and architects or trying to squeeze through the procurement process.
  • Release bottlenecks piling up in Q4 (January to March for us), compounded by Christmas and end-of-financial-year release constraints.
  • Technical debt accumulating as projects rushed to close while running out of money and were forced to hand over to operations with various caveats and workarounds.
  • Cross-charging to fund dependencies on other teams, which became a paperwork industry we called ‘home and away’ funding.
  • No supporting capacity for benefits realisation, the project team being long gone by the time some of the lagging benefits should be realised.
  • A quarterly funding drawdown process at the portfolio level, used to try to mitigate some of the above by at least ensuring transparency of the work and how funding would be used. Delivery teams, however, found this process disempowering.

With no desire to jump in one step to an assumed solution, we took an evolutionary approach using structured experiments.

Experiment 1 — the Innovators

For some time our leading agile teams, particularly digital teams, had been doing their best to operate a funding model based on stable long-lived teams. This resulted in awkwardness between their ways of working and our traditional portfolio governance processes that were designed for project-oriented delivery.

Hack: A ‘Product’ wrapped inside a traditional ‘Programme’. Some of our digital teams figured out that going through the traditional governance process for funding a Programme enabled a stable budget. They just deliberately made the scope of a programme business case the same as the scope of their product vision.

A breakthrough came when our Everyday Banking team was willing to ‘open up the books’ on their ways of working and provide hard evidence on the benefits of a more stable model. This was not an experiment to change something. The stable resourcing and funding model the team had been running was well established. The experiment was really to try to explain it to everyone and to prove why it is not an undisciplined ‘Agile’ process. Here are the lessons we learned.

Lesson 1: The ability to utilise capacity-based funding is 100% dependent on actually having stable cross-functional teams. You can’t be running a set of short-lived project teams and expect to have stable capacity and funding.

This slide helped everyone understand the link between stable teams and funding

Lesson 2: Quarterly funding drawdown for such a large portfolio doesn’t really enable tactical steering of the portfolio. The reality is that ramping team capacity up or down is a very disruptive process that cannot be executed in a short period on quarter boundaries. It may be required where there is a real mid-year cost challenge or a disruptive event like the pandemic that forces portfolio level re-planning but it’s not something that happens every quarter.

Lesson 3: We learned that the actual value of quarterly governance at the whole portfolio level is to provide a level of transparency that informs longer-term investment decisions and occasional point interventions, rather than centralised quarterly decision making.

Yes, I’m aware that Lessons 2 and 3 sound like heresy to some consultants who have been promising, as a benefit of SAFe’s lean portfolio management, the ability to easily pivot cash investment priorities every quarter. Please be careful what you promise, and read on.

Lesson 4: The business agility benefit of quarterly planning happens within the product teams, where they have the freedom to pivot or swap out priorities within their current budget and capacity constraints. Note that this leaves a question, to pick up later, about why, how and when to adjust the investment appetite for a product-oriented team.

Lesson 5: For any change to our funding model, all stakeholders need a shared understanding and a willingness to collaborate and come on a journey of discovery. This was possible because we focussed on empirical facts in our experiments rather than holding opinions without evidence. We’re pleased to say this collaboration has been key to progress.

Criteria for transparency and trust

The key outcome of this experiment was to define a set of practices that demonstrate good self-discipline and transparency by product teams. These criteria were not intended to define everything that could be considered good practice. Rather, they were specifically designed to enable the governance functions to agree that teams with these practices could be trusted with lighter weight funding governance.

And what’s in it for the teams? Simply that teams would no longer be required to request funding drawdown as part of their quarterly update to the investment committee. This change was minor in terms of paperwork, though major progress for our principle of ‘accountable freedom’ and a cause for celebration by our first team, Everyday Banking.

Here are the criteria we used throughout 2021 for product-oriented teams to demonstrate the self-discipline required to earn the trust they deserve.

Resourcing strategy: The team resource sheet (.xls of course) would show that teams are defined as small teams with stable team members rather than short term or part-time allocations across multiple ‘projects’.

Financial control: The monthly financial forecast and actual cost for the team would be reliable, with a narrow margin of inaccuracy. This naturally follows when the team structure is in fact stable per their resource strategy.
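To make the idea of a ‘narrow margin of inaccuracy’ concrete, here is a minimal sketch of a forecast-versus-actual check. The 5% tolerance, the data shape and the function name are all assumptions for illustration; the article does not prescribe a specific threshold or tooling.

```python
# Illustrative sketch only: the tolerance and figures below are assumed,
# not the actual criteria or numbers used by any real team.

def within_forecast_tolerance(forecast, actual, tolerance=0.05):
    """Return True if each month's actual cost stays within the given
    fractional margin of its forecast (e.g. 5%)."""
    return all(abs(a - f) <= tolerance * f
               for f, a in zip(forecast, actual))

# A stable team's run-rate barely moves month to month:
forecast = [100_000, 100_000, 102_000]
actual = [98_500, 101_000, 103_000]
print(within_forecast_tolerance(forecast, actual))  # True
```

The point of the check is the second sentence of the criterion: when the team structure really is stable, passing it requires no effort at all.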

Strategic alignment and transparency: Teams would be reasonably mature in their use of OKRs for quarterly planning and the management of their backlogs in terms of epics and stories. In particular, stakeholders were interested to see how the Epics were linked up to the OKRs and the OKRs up to multi-year Outcomes to show traceability up to enterprise strategy. At Nationwide we nickname this traceability ‘the Golden Thread’ as explained by Nick Brown in his fantastic article.
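The ‘Golden Thread’ is essentially a traceability chain, so a simple way to picture it is a lookup from epics to OKRs to multi-year outcomes, with a check for broken links. The data model and identifiers here are invented for illustration and do not reflect Nationwide’s actual tooling.

```python
# Illustrative sketch of Golden Thread traceability: epic -> OKR -> outcome.
# All names and the data model are assumptions for demonstration purposes.

epics = {"EP-1": "OKR-A", "EP-2": "OKR-B"}    # epic -> OKR it supports
okrs = {"OKR-A": "OUT-1", "OKR-B": "OUT-1"}   # OKR -> multi-year outcome
outcomes = {"OUT-1"}                          # enterprise strategy outcomes

def unthreaded_epics(epics, okrs, outcomes):
    """Return epics whose chain up to an enterprise outcome is broken."""
    return [epic for epic, okr in epics.items()
            if okrs.get(okr) not in outcomes]

print(unthreaded_epics(epics, okrs, outcomes))  # [] -> every epic is threaded
```

An empty result means every epic in the backlog traces up to strategy; anything returned is work a stakeholder would reasonably question.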

Iterative release of value: Teams would be releasing into production regularly. Based on the premise that product teams have the freedom to pivot priorities within capacity, it’s no good pretending that’s possible if it takes multiple quarters to get to a release. That would just be water-scrum-fall.

Kudos to these people for getting us that far: Dan and Carlo in Everyday Banking for being brutally transparent. Matt and James from Finance who examined the experiment and evidence carefully with readiness for change. Nitish and Harriet in the strategy office for helping us communicate the evidence from the experiment that would stand up to challenge. Jane in my Lean Portfolio team for doing the thorough work to collate and present the facts. Everyone in the Strategic Investment Committee for engaging with the topic and buying into the approach of using deliberate experimentation to learn and to evolve the model.

Where Next?

Of course we now wanted to ‘scale up’ the approach. The next part in the series will relate the next rounds of experiments and learnings through 2021. We still had these questions to answer:

  • If I just fund team capacity, what am I getting for my money?
  • How does product team funding dial up or down to meet strategic needs?
  • What if we decide to use a 3rd party for a temporary boost to capacity?
  • What about non-resource costs?

Other posts in this series:
Trust Part 1: Governance and Bureaucracy
Coming soon: Trust Part 3: Capacity Based Funding at Scale
Coming later: Trust Part 4: Value Stream Funding