When the Data Futures Programme was conceived, it promised a more streamlined, timely, and efficient approach to collecting student data. The vision was a single, unified dataset collected three times a year, replacing outdated processes and enabling real-time insights for regulators.
Yet, in 2025, we are left with none of these benefits. Instead, Data Futures has increased workload, introduced instability, and created sector-wide confusion, prompting experienced staff to leave the sector. An independent review of the programme, commissioned after multiple delays and widespread institutional struggles, makes one thing clear: this programme has been a disappointment – at least in its current form.
The Office for Students (OfS), along with other statutory customers, has been quick to demand compliance but slow to acknowledge the realities faced by data professionals on the ground. Institutions were expected to adapt to shifting goalposts, with frequent changes to data specifications, quality rules, and submission deadlines. The review highlights that regulatory pressure led institutions to overstate their readiness, fearing additional scrutiny from the OfS if they admitted difficulties.
For larger universities, these challenges were frustrating but often absorbed by dedicated data teams. For small providers, however, the impact was far greater—sometimes existential.
Small Providers
Small HE providers are expected to meet the same regulatory and compliance standards as large ones, but with significantly fewer resources. The OfS review revealed several sector-wide challenges, but these were felt most acutely in smaller institutions:
- Increased Data Burden – The first year of Data Futures required the collection of far more data items than before, meaning institutions had to gather more information from students and extensively rework their internal systems.
- System Limitations – Many small providers rely on off-the-shelf student records systems, which were not always equipped to handle rapidly changing data requirements.
- Quality Rule Chaos – The constant changes to the quality assurance rules meant that late-stage errors skyrocketed, requiring additional hours of data cleaning and validation. This was particularly problematic for small teams, where data professionals were often also responsible for other tasks, such as enrolment and assessment boards.
- Regulatory Pressure – Stories circulated in the higher education data community of providers receiving harsh warnings from the OfS over missed deadlines. The fear of falling foul of regulation led some institutions to downplay their struggles in official surveys, creating an inaccurate picture of sector readiness.
- Key-Person Dependency – In many small providers, one person was responsible for managing the data return. While this offered agility, it also created single points of failure—as seen when data leads left their institutions, leaving knowledge gaps that could jeopardise future compliance.
These challenges were not inevitable—they were the result of poor programme governance, unrealistic regulatory expectations, and a failure to listen to institutions’ concerns.
How We Fix This: A Smarter Approach to Data Collection
The good news is that these problems are not insurmountable. The review provides a roadmap for improvement, but small providers need additional targeted support. Here’s what must change:
1. Locking Down Requirements Sooner
The frequent last-minute changes to data specifications and quality rules caused avoidable disruption. The report recommends that return requirements be locked down at least 18 months before implementation, ensuring institutions have time to prepare.
2. Fixing the Quality Assurance System
Institutions spent far too much time fixing errors due to poorly implemented quality rules. A more stable, transparent system is needed, with real-world testing before rules go live.
3. Acknowledging the Pressures on Data Professionals
The OfS must recognise that data staff are doing their best under difficult conditions. Regulatory approaches should focus on collaboration rather than punishment—especially for smaller providers that lack the resources of larger universities.
4. Investing in Small Provider Support
Many small providers cannot afford dedicated data teams or expensive system upgrades. Targeted funding, sector-wide collaboration, and better communication from regulators can help ensure smaller institutions are not left behind in future data reforms.
The Way Forward
Data Futures was meant to reduce burden, not increase it. It was meant to improve data quality, not undermine it. For small providers, the stakes are even higher. If the OfS is serious about creating a fair and effective data landscape, it must start by listening to those on the ground. That means recognising the real challenges faced by institutions—particularly those without the luxury of large teams and in-house developers—and creating a system that is workable for all.
Only then can Data Futures truly deliver on its original promise.