Every year, year 6 pupils take key stage 2 (KS2) national curriculum tests, known as SATs, to measure school performance and to make sure individual pupils have the support that they need when they move into secondary school.
This year, the assessments took place from 9 to 12 May and covered subjects including English grammar, punctuation and spelling, reading, and maths.
But were the assessments any harder this year than in previous years? Here, we explain everything you need to know, from how the tests are developed to why some pupils may have found them more difficult than others.
Why do we need SATs and why are they important?
KS2 tests and assessments are an essential part of ensuring that all pupils master the basics of reading, writing and maths to prepare them for secondary school.
They help teachers and parents understand how pupils are performing against the age-related expectations outlined in the National Curriculum and enable them to identify where pupils need more support.
They also allow us to hold schools to account, to ensure that they support all pupils, regardless of background or prior attainment, to make sufficient progress.
Are this year’s English reading SATs harder than normal? Was the reading test too advanced for children aged 10 to 11?
Every year, we use a range of established processes to ensure the tests are appropriate and fair. This includes reviews by teachers, curriculum and inclusion experts and other education professionals through three rounds of expert review panels to make sure they’re the right difficulty level.
It takes three years to develop each test. During this process, the texts and questions are rigorously trialled twice with a nationally representative sample of year 6 pupils. In the second, technical trial, more than a thousand pupils see each question.
Evidence from these processes indicated that the tests were of similar difficulty to previous years. As a result, we are confident the test was set to an appropriate level of difficulty.
Will the difficulty of the test be represented in the results?
Yes, the difficulty of a test is reflected in SATs results. All SATs papers are marked externally and are given a raw mark and a scaled score.
The raw mark refers to the number of marks that pupils achieve by answering questions on a particular test. For example, a pupil may score 30 out of a possible 40 available marks in a paper.
The raw mark that pupils need to achieve to meet the expected standard changes every year to reflect the overall difficulty of the test. It will be lower if the paper is found to be more difficult and higher if it's deemed to be easier.
To help compare results from one year to the next, a scaled score is also calculated for each paper taken by a pupil. It's calculated using a technical process that takes into account the difficulty of the questions. Scaled scores range up to a maximum of 120, and a scaled score of 100 or more means a pupil has met the expected standard.
The Standards and Testing Agency (STA) – the independent body responsible for developing the assessments – will go through the usual marking and scoring process for the 2023 papers to make sure the difficulty of the tests is reflected in the raw marks needed to meet the expected standard and in the scaled scores.
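As a rough sketch of the principle, the conversion works like a lookup from raw mark to scaled score. The table below is entirely hypothetical (the real conversion tables are published by the STA after each year's marking and change with the difficulty of the paper), but it shows how the raw-mark threshold for the expected standard can move while the scaled-score threshold of 100 stays fixed:

```python
# Illustrative sketch only: this conversion table is invented for the example.
# The real raw-to-scaled tables are published by the STA after marking and
# differ each year depending on the difficulty of the paper.
CONVERSION_TABLE = {
    20: 95, 21: 96, 22: 97, 23: 98, 24: 99,
    25: 100,  # in this hypothetical table, 25 raw marks meets the standard
    26: 101, 27: 102, 28: 103,
}

EXPECTED_STANDARD = 100  # a scaled score of 100 represents the expected standard


def scaled_score(raw_mark: int) -> int:
    """Look up the scaled score for a raw mark (hypothetical table)."""
    return CONVERSION_TABLE[raw_mark]


def meets_expected_standard(raw_mark: int) -> bool:
    """True if the pupil's scaled score is at or above 100."""
    return scaled_score(raw_mark) >= EXPECTED_STANDARD


print(scaled_score(23))             # 98
print(meets_expected_standard(25))  # True
```

If a harder paper were set, the published table would simply attach the scaled score of 100 to a lower raw mark, so pupils are not disadvantaged by year-to-year differences in difficulty.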
Did children only have 34 seconds to answer each question?
Some news reports have suggested that this year, pupils were given only 34 seconds to answer each question in the English reading paper.
There are many factors that influence difficulty, so it's misleading to calculate an average time per question. Each question is designed differently to test the range of pupils' abilities: some can be answered quickly, while others naturally require more time.
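To see why a per-question average can mislead, here is a small sketch with invented timings (none of these figures come from the actual test): the average makes every question look the same, while the spread shows that some questions take many times longer than others.

```python
# Hypothetical timings (in seconds) a pupil might spend on ten questions.
# The figures are invented for illustration; real questions range from
# quick retrieval questions to longer inference questions.
question_times = [10, 12, 15, 15, 20, 25, 40, 60, 90, 120]

total_time = sum(question_times)            # 407 seconds in total
average = total_time / len(question_times)  # 40.7 seconds per question

# The average hides the variation: the slowest question here takes
# twelve times as long as the fastest.
print(f"average: {average:.1f}s, "
      f"fastest: {min(question_times)}s, slowest: {max(question_times)}s")
```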
Are the topics in the assessment sufficiently engaging?
Two large-scale trials with year 6 pupils allowed us to gather feedback from test administrators, teachers and, most importantly, pupils.
The majority of pupils who took part in the 2022 trial indicated that they enjoyed reading the texts, as did the teachers who were involved in the expert review process.
When deciding which texts are used in a final test, we always consider pupil engagement as part of the test development process.
The content of the texts was not culturally recognisable for some pupils – why haven’t equality and diversity been recognised?
Inclusivity, equality and diversity are all key considerations when developing the tests.
Our expert review panels include a range of experts including serving teachers working at schools from across the country and with pupils from a range of diverse backgrounds.
Sometimes panellists do conclude that a particular text is not culturally inclusive or has potentially stereotypical characters; where that is the case, we do not use the text.
That was not the case for the texts in the 2023 test. For the first text, the panel thought the characters were culturally inclusive and that, even though the context was camping in a countryside setting, it would not disadvantage pupils who had no direct experience of that environment.
For the second text, there was general agreement that the subject matter was interesting and should engage pupils.
The third text was also well-received by panellists who felt that, although the text was challenging, it was engaging and well-written. They thought the text might appeal more to boys but that it was not gender-biased.