15th Oct 2025
Up Learn’s A Level Maths Improvements
The course is now exponentially better!
This year, our goal has been to make Up Learn easy to use and valuable for teachers and students alike. We’ve heard your feedback on how to make maths better. Here’s what we’ve done…
In a nutshell:
- New diagnostic quizzes make it much easier to find the content you need and skip the content you don't
- Sections are now shorter and more uniform in length
- Marking accuracy is significantly improved – it's now highly unlikely that students will be denied marks they deserve
Here’s the detail:
1. Diagnostic quizzes
It is now much easier to tick off content you don’t need and find the content you do need
All of our pure maths content is now covered by new diagnostic quizzes. These accurately diagnose what students know, tick off content they don’t need, and prescribe content to fill gaps.

We no longer give students one 'all-or-nothing' diagnostic question per concept. Instead, we have several questions per concept. These quizzes were all written with great care by senior members of the team, and we think they represent the best questions the maths course has to offer.

This new approach allows us to assess knowledge dynamically.
First, students are served a random question from the pool.
Then, rather than saying ‘they 100% know this if they got it correct’, or ‘they 100% don’t know this if they got it wrong’, we update a probability that they understand the concept.
This update takes into account two things:
- the probability of guessing that question correctly if you don't understand the concept
- the probability of slipping up on that question even if you do understand the concept
For example, a correct answer to a question that is very hard to guess will increase the probability much more than a correct answer to a two-option multiple-choice question.
This probability always determines the next step. If the probability of the student knowing the concept is high, we move them onto something else.

If it is sufficiently low, we recommend content to fill the knowledge gap.

If it’s still unclear, we ask them another question.
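The update described above resembles a standard Bayesian mastery update, as used in knowledge-tracing models. Here is a minimal sketch in Python; the guess/slip parameters, thresholds, and function names are illustrative assumptions, not Up Learn's actual implementation.

```python
def update_mastery(p_know: float, correct: bool,
                   p_guess: float, p_slip: float) -> float:
    """Bayes' rule update of the probability the student knows the concept.

    p_guess: chance of answering correctly without understanding the concept
    p_slip:  chance of answering incorrectly despite understanding it
    """
    if correct:
        # P(correct | know) = 1 - p_slip; P(correct | don't know) = p_guess
        numerator = p_know * (1 - p_slip)
        evidence = numerator + (1 - p_know) * p_guess
    else:
        # P(wrong | know) = p_slip; P(wrong | don't know) = 1 - p_guess
        numerator = p_know * p_slip
        evidence = numerator + (1 - p_know) * (1 - p_guess)
    return numerator / evidence


def next_step(p_know: float, high: float = 0.95, low: float = 0.20) -> str:
    """Decide what to do next from the current mastery estimate.

    The thresholds are illustrative, not Up Learn's real values.
    """
    if p_know >= high:
        return "tick off concept"
    if p_know <= low:
        return "recommend content"
    return "ask another question"
```

Starting from a 50/50 prior, a correct answer to a hard-to-guess question (say p_guess = 0.05) lifts the estimate to roughly 0.95, while a correct answer to a two-option multiple-choice question (p_guess = 0.5) lifts it only to about 0.64 – matching the intuition that hard-to-guess questions are stronger evidence.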
These quizzes are much less punitive than our old 'all-or-nothing' ones. Since we could only ask one question, we would sometimes hold the bar for skipping content unreasonably high, thereby directing students to content they probably didn't need and making the course longer.
Our old quizzes also gave only one opportunity to tick off content. But since we now have several questions per concept, these new quizzes are reusable. A student can come back two weeks later, take the diagnostic quiz again, see different questions, and still tick off content. This, too, makes it easier for students to progress through the course.
2. Shorter sections
Section lengths are more uniform and we’ve removed some content
We first developed our maths course between 2017 and 2020. We designed it as a course for students to use independently. We weren't yet imagining that, one day, teachers would be assigning sections for students to complete on a regular cadence. One consequence is that the lengths of our sections varied drastically.
This year, we have reorganised several parts of the course. The sections that we found students took the longest to complete have been split into two separate sections. In some cases, we have deleted content that we felt, on reflection, was overkill. We’ve also tried to make our titles clearer. For example, ‘Differentiating more functions’ (22 lessons, 11 quizzes) became:
➤ ‘Differentiating exponential and trigonometric functions’ (13 lessons, 7 quizzes)
➤ ‘Differentiating sin(x) and cos(x) from first principles’ (9 lessons, 4 quizzes)
There are still a small number of sections with a high number of activities. But these are ones where we found that, in practice, students usually tick off a significant amount of the content through our new diagnostic quizzes.
3. Better marking accuracy
It’s now extremely rare to be denied a mark you should have been given in a maths question
It is harder than you might think to create a 100% accurate maths marking engine! We use an engine called MathLive. It is accurate in the vast majority of cases. But until recently, it had a few elusive bugs. These would sometimes cause correct answers to be marked incorrect, which is a very frustrating experience – especially after spending a while on a question.
We spent a while investigating and classifying the errors it would make. For example, we realised it would often fail to identify equivalence when expressions contained factorials, natural logarithms, and fractional powers.
Every bug we identified is now fixed, thanks to a combination of lobbying the engine's developers to fix bugs on their end and changing the way we implement the engine ourselves.
Since then, we have also implemented another safety net: AI marking. Whenever the engine deems an answer incorrect, it is sent to an AI reasoning model to double-check. This catches the occasional mistakes the engine still makes.
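The safety net is, in essence, a simple fallback: only answers the engine rejects are escalated to the AI model. A minimal sketch of that control flow, using stand-in stub functions (the real system calls MathLive and an AI reasoning model; these stubs are purely illustrative):

```python
def engine_marks_correct(answer: str, expected: str) -> bool:
    # Stand-in for the symbolic marking engine: a naive string comparison,
    # which (like the engine's old bugs) misses equivalent forms.
    return answer.replace(" ", "") == expected.replace(" ", "")


def ai_model_agrees(answer: str, expected: str) -> bool:
    # Stand-in for a call to an AI reasoning model; here we fake
    # equivalence checking with a tiny lookup of known-equivalent forms.
    equivalents = {("1/2", "0.5"), ("0.5", "1/2")}
    return (answer, expected) in equivalents


def mark_answer(answer: str, expected: str) -> bool:
    """Two-stage marking: engine first, AI double-check only on rejection."""
    if engine_marks_correct(answer, expected):
        return True
    # Safety net: only engine-rejected answers incur the cost of an AI call.
    return ai_model_agrees(answer, expected)
```

Running every answer through the engine first keeps the common path fast and cheap; the AI model is consulted only for the small fraction of answers the engine rejects.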
Find out more
For Teachers & Schools
Want to see how our updated Maths course could support your students?
👉Book a quick call with our team.
For Students & Parents
Looking for the best support with the A Level Maths specification?
👉Explore our A Level Maths course or book a quick call
MAKE AN ENQUIRY