Schoolzone blog: GCSE performance: your magic number



13 May 2015



What will your GCSE results show for next year? It's difficult to tell, because the Five A* to C data took a dip, nationally, last year. Can we work out a scaling factor that would correct for the dip? Something that schools could apply to last year's GCSE performance to see whether it was on track with the national improvement? That's what we fell to wondering today.

Obviously, schools could look to see whether they were above or below the national average, but were they sufficiently above it (for example) given their previous rate of progress? In other words, were they improving faster or slower than the national picture, compared with previous years? The national change makes this difficult for schools to judge.

There are many problems with trying to work out a magic number. For example, looking at individual schools, we can see that those which approach 100% tend to drop out of the figures as they move to IGCSEs (and the like), so the averages fall. Cohorts change, schools move towards more EBacc subjects, and so on.

But let's look beyond such things at the raw data for student performance: ignoring individual school performances and the fluctuations just mentioned, and instead seeing how pupils performed as a body.

The chart for student performance from 1996 to 2014 looks like this:

Source: DfE performance data

If the system hadn't changed in 2014, what would the chart have looked like? More importantly, for each school, what would its performance have looked like?

There are two ways to look at it:

1. The DfE's own calculation, based on the previous methods, shows that the 2014 average would have been 56.8%, rather than the recorded 53.4% (remember this is the average across students, not schools). This is based on taking out data for vocational qualifications (the Wolf effect) and for early entries.

2. Continuing the trend of improvement, the 2014 average would have been 59.0%.

The DfE calculation (1) isn't a perfectly reliable method because it's based on different numbers and skews the data according to whether schools had offered vocational qualifications, played the early entry game and so on.

Method 2 exaggerates the perception of the 2014 decline because it doesn't allow for the corrections made in method 1.
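The two corrections imply simple scaling factors, which you can check directly from the national figures above (a quick sketch; the rounding to two decimal places is ours):

```python
recorded_2014 = 53.4    # national average actually recorded (%)
dfe_adjusted = 56.8     # method 1: DfE's recalculation under the old rules
trend_projected = 59.0  # method 2: continuation of the pre-2014 trend

# Dividing each "what it would have been" figure by the recorded
# figure gives the scaling factor for that method.
factor_method1 = dfe_adjusted / recorded_2014    # approx. 1.06
factor_method2 = trend_projected / recorded_2014  # approx. 1.10

print(round(factor_method1, 2))
print(round(factor_method2, 2))
```

These are the 1.06 and 1.10 multipliers used in the next section.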


Correct your 2014 results

So, if you want to know what your school performance should have been like last year and maybe should be like this year, here's how to go about it.

Decide whether you fit best with method 1 (your school entered a large number of vocational qualifications, used many retakes and so on) or method 2 (it didn't).

Method 1: Multiply your 2014 Five A* to C (inc E&M) score by: 1.06

Method 2: Multiply your 2014 Five A* to C (inc E&M) score by: 1.10

This is an estimate of what your 2014 results would have looked like if the DfE hadn't shifted the goalposts.
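As a worked example, the correction above can be sketched in a few lines of Python (the 60% school here is hypothetical, purely for illustration):

```python
def corrected_score(score_2014, heavy_vocational):
    """Estimate a school's 2014 Five A*-C (inc E&M) score under the old rules.

    Scaling factors from the national 2014 figures:
    method 1 (vocational/early-entry adjustment) -> 1.06
    method 2 (trend continuation)                -> 1.10
    """
    factor = 1.06 if heavy_vocational else 1.10
    return round(score_2014 * factor, 1)

# Hypothetical school: recorded 60.0%, with no heavy vocational entry,
# so method 2 applies.
print(corrected_score(60.0, heavy_vocational=False))  # 66.0
```

So a school that recorded 60% would have been on roughly 66% if the goalposts hadn't moved.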


To predict your corrected 2015 results

Of course, you'll have your own local school knowledge, but based on national comparisons and trends, the scaling factor to apply to your 2014 data is 1.10 (method 2, since we don't know what the DfE will use for method 1, and since results had reached a plateau last year).


Is any of this worth anything?

We don't know.

But try it and see what you think - it's interesting, if nothing else. And it makes your results seem better - the question is, are they better enough?





Schoolzone                                               Contact us
Court Mews 268 London Rd                 01242262906
Cheltenham GL52 6HS


© Schoolzone Ltd 2015. All rights reserved.