From Prescribed to Principled: Why Assessment Organisations Are Now Being Asked to Make the Big Decisions


For years, apprenticeship assessment plans told assessment organisations exactly what to do. They specified the assessment methods, prescribed the logistics, dictated how resits should run, and laid out operational details right down to sequencing and evidence expectations.


That era is over.


With Skills England’s reformed assessment model and Ofqual’s new regulatory framework, apprenticeship assessment plans are becoming shorter, higher-level and principles-based. Instead of tightly prescribing delivery, they set out the outcomes that must be assessed and leave assessment organisations (AOs) to design the systems that make that happen.


This represents a profound shift for the sector. Rather than simply delivering what the plan dictates, AOs must now design, justify and operate assessment models that meet the regulatory expectations while still working in practice for apprentices and providers.


But far from being a constraint, this shift creates an opportunity to rethink how apprenticeship assessment works.

 

Assessment plans set the destination — not the route

Under the new model, assessment plans focus on what apprentices must demonstrate, not the operational detail of how assessments run.


Typically, plans now define things like:

  • the assessment outcomes

  • mandatory knowledge and skills

  • performance descriptors

  • a mandatory assessment method and the selection of optional assessment methods

  • gateway-to-completion expectations


What they no longer tend to specify are the practical mechanics that make assessment happen day to day. For example, plans are far less likely to prescribe:

  • resit rules

  • booking or cancellation windows

  • sequencing between assessments

  • detailed marking models

  • operational logistics or scheduling


Those decisions now sit with the AO. In other words, AOs are no longer just implementing a system: they are designing it.


 

From executors to designers

This shift is reinforced by Ofqual’s new guidance for apprenticeship assessment qualifications. The regulator sets the expectations that AOs must meet, but it does not prescribe the operational model.


Previously, the assessment plan itself would have included many specific requirements and details. Now, the responsibility for meeting these expectations rests squarely with the assessment organisation.


In practice, that means AOs must design and justify how their assessment systems ensure things like:

  • appropriate levels of synoptic assessment

  • the proportion of assessment marked by the AO

  • effective employer engagement

  • robust management of conflicts of interest

  • consistency across pathways or assessment variants

  • controlled transitions between versions of assessment plans


This approach gives AOs greater autonomy, but it also places greater emphasis on their ability to interpret the guidance and translate it into assessment systems that are fair, robust and workable in practice.

 

Turning reform into an opportunity

At Accelerate People, we’re approaching this change as a rare opportunity to challenge some of the inherited assumptions of traditional apprenticeship assessment design.


Rather than simply replicating existing EPA models, we’re reviewing how assessment works in practice and identifying where reform allows us to improve the experience for apprentices, providers and employers.


Change takes time and buy-in. We’re examining every part of the process and asking key questions like “what issues do we currently have with this?” and “how can we improve the apprentice’s experience here?” It is difficult to get out of the EPA mindset, so we constantly remind ourselves that just because something is currently done a particular way doesn’t mean we have to keep it that way: we are the designers.



What could the changes look like?

The changes aren’t restricted to the assessment structure; we’re looking at the operational logistics and the rules that go with the assessments. This will include things like:

  • how assessment availability is structured

  • booking service levels and timelines

  • cancellation and rescheduling arrangements

  • internal quality assurance processes

  • workflows supporting external quality assurance

  • resit arrangements

  • evidence types for assessments


With greater flexibility, we can focus on solutions that prioritise apprentice experience whilst keeping in mind fairness, consistency and efficiency.

As these decisions take shape, we’ll also develop clear guidance to help apprentices and training providers navigate the new approach. This will include practical information on assessment expectations, booking processes, evidence requirements and resit arrangements, ensuring that everyone involved understands how the system works and what is required at each stage. Our aim is not just to redesign assessment internally, but to make those changes transparent, accessible and easy to work with in practice.

 

What we’re choosing to keep

The reforms give us space to take stock and celebrate what already works well; we won’t change for change’s sake. We pride ourselves on our market-leading SLAs, so naturally we will be keeping those! Other things we’ll be keeping include:

  • Our responsive customer service

  • Clear provider guidance and support

  • Robust, defensible marking models

  • Strong, proactive quality assurance

  • Transparent booking processes

  • Our excellent tech platforms – adapting these quickly to suit any changes we implement

 

The message for the sector: AOs are no longer just delivering the model — we are the model


This is the real shift.


AOs have moved from following instructions to making decisions that shape the entire apprentice experience. That’s a responsibility, but it’s also a privilege. We finally have the freedom to design assessment approaches that:

  • Reflect real occupational practice

  • Support centres more effectively

  • Remove unnecessary burden

  • Improve apprentice experience

  • Strengthen public trust in apprenticeship assessment


The next year will define what high-quality apprenticeship assessment looks like in the reformed world. We are already doing the work: reviewing, questioning, refining and redesigning.


But this is not something that should be done in isolation. As the new model develops, we’re committed to listening to feedback from apprentices, training providers and employers, and working closely with our partners to shape approaches that work in practice.


When AOs step into this design role with transparency, collaboration and quality at the centre, and when the sector works together to refine what good looks like, the system will ultimately be stronger for everyone.


Talk to us about assessment reforms.


Jo M- assessment

 
