Why data training programmes often fail – and what to do before you commission one

The conversation usually starts with a symptom.

Reports that take too long and nobody trusts. Decisions made without data, or with the wrong data. A board that asks for better insight and receives dashboards nobody can interpret. Staff who have access to data but lack confidence in using it. A sense, somewhere in the organisation, that data should be doing more than it is.

The instinct that follows is understandable, and almost universal: let’s do some training.

Training feels like the right response because it’s concrete, it’s deliverable, and it maps onto a familiar model of how capability gaps get addressed. Identify the gap, provide the learning, measure the completion rate, mark the problem as addressed.

The difficulty is that data capability gaps are often not primarily a training problem. And training programmes that are designed before the actual gap is properly understood tend to miss the mark — sometimes visibly, sometimes in subtler ways that only become apparent later.


The diagnosis problem

The most common mistake in commissioning data training is moving straight from symptom to solution without stopping to understand the cause.

A report that takes too long might indicate that staff lack the technical skills to produce it more efficiently. But it might equally indicate that the underlying data is poorly structured, that the definitions used in the report are contested and require manual reconciliation each time, or that the process for requesting and approving the report is where the time is actually going. Training won’t fix any of those things.

Decisions being made without data might indicate low data confidence or capability. But it might indicate that the data available doesn’t actually answer the questions decision-makers are asking, that the data arrives too late to be useful, or that the relationship between the data team and the leadership team doesn’t support the kind of dialogue that data-informed decision-making requires. Training on how to read a dashboard won’t fix those things either.

The point isn’t that training is never the right answer. It often is, at least in part. The point is that training designed around an assumed problem rather than a diagnosed one is unlikely to address the actual cause – which means the symptoms will persist, the investment will have limited return, and the next conversation will start with the same frustration as the last one.


What a proper diagnosis looks like

Understanding a data capability gap properly requires looking at the organisation from several angles simultaneously.

It requires talking to the people who own the data and understanding how it flows – where it comes from, how it’s transformed, where the ownership and accountability sit, and where the informal workarounds and manual interventions are hiding.

It requires talking to the people who use data in their decision-making – not just the analysts, but the leaders and managers who receive data and are expected to act on it – and understanding where their confidence breaks down and why.

It requires looking at whether the organisation’s data architecture is fit for the purposes it’s being put to – whether definitions are consistent, whether systems are integrated in ways that support rather than complicate data use, and whether the data people are working with actually reflects the questions they’re trying to answer.

And it requires an honest assessment of where the organisation is across the full range of data capability – not just technical skills, but data literacy, data culture, governance, and the quality and accessibility of the data itself. Capability gaps that look like skills problems are often governance problems, or architecture problems, or culture problems wearing a skills problem’s clothing.

Why this matters for training design

The diagnostic step doesn’t just identify the right problem to address. It also fundamentally shapes what good training looks like if training is indeed the right response.

An organisation where the primary gap is in data confidence at leadership level needs something very different from an organisation where the gap is in the technical skills of analysts. An organisation where the underlying data is inconsistently defined needs to address those definitions before training staff to use dashboards built on them. An organisation where the culture around data is risk-averse and where staff are reluctant to engage with data because mistakes feel high-stakes needs a different intervention entirely from one where staff are enthusiastic but undertrained.

Training that doesn’t account for these differences is training that’s been designed for an imaginary organisation rather than the actual one.


What to do instead

The practical implication is straightforward: before commissioning a training programme, invest in a proper assessment of where the organisation actually is and what the primary capability gaps actually are.

That assessment doesn’t need to be lengthy or expensive. Done well, it’s a focused exercise — conversations with the right people, structured around the right questions, synthesised into a clear picture of current capability and where the most significant gaps and opportunities lie. The output should be a working tool that informs decisions about where to invest, in what order, and in what form — not a lengthy report that sits in a drawer.

The Data Fluency MOT is designed to be exactly that. A structured, independent assessment of data capability across five dimensions, completed efficiently and delivered as something an organisation can actually act on.

It won’t always conclude that training is the wrong answer. Often it will conclude that training is the right answer, and will clarify precisely what kind of training, for whom, addressing what. But starting with the assessment rather than the solution means that whatever comes next is built on an accurate understanding of the organisation — which is the only foundation on which data capability genuinely improves.

Find out more about the Data Fluency MOT →