Standing up to shoddy exam marking

Perhaps it is just the nervousness caused by the news that a Head recently lost his job after poor exam results, but I for one am delighted to see HMC once more taking up the cause of inconsistent exam marking. It is always tricky for a school to stand alone on this issue; complaining looks like excuse-making, rather like a football manager who always blames the referee, the linesman, the half-time oranges or the weather for his own shortcomings.

Yet I know that we at Wycliffe are not alone in being mystified by some of this year’s results. It’s not that our outcomes were poor; they weren’t, but they were often bizarre. Two students whom I taught for two years contrasted hugely in ability, and, having seen their scripts, I can confirm that this contrast was reflected in their answers. One had been a nailed-on A* pupil from the outset; the other, with significant educational needs, did well to produce any work worthy of a grade D in two years. Yet they were marked just four points apart in their A2 exam – quite inexplicably.

Heads of a number of departments, reporting to me on annual performance, have been left speechless. Now, there are times I’d welcome that, but, for the good of our pupils, we have to understand how examiners are reaching assessment decisions – and some very experienced, very successful curriculum leaders simply cannot explain some of the marking, or some of the re-marking.

There are some subjects in which reviews of marking have led to wildly fluctuating outcomes; in one economics paper, one pupil’s marks increased by more than twenty, while another pupil’s marks on the same paper dropped by five. And we have noticed that with another exam board every request for a re-mark, across a range of subjects, has left the outcome unchanged. Most of these subjects were essay-based. Is that really credible?

Now bear with me through the news that my subject happens to be Media. The arguments over that subject’s integrity are for another time, place or bar but, for now, let me reassure you that as an examiner for the AS paper, and having been around the subject for many years, I thought I knew what was required. But it seems that, without warning, case studies recommended by the board itself two years ago have suddenly been deemed unacceptable. The same case studies produced outstanding marks last year, but not this year. There was nothing in notices from the subject officer, on the website or in replies to my queries that explains why. This, it seems, is the reason for my contrasting students’ relative fortunes. As an examiner, I also know that reaching agreement at standardisation meetings is exceptionally hard and that no two markers ever come up with identical marks. So why is it that so many reviews of marking result in no change from this board?

In other subjects too, interpretation of mark schemes has apparently changed without warning; History, English, Theatre Studies and PE have all seen marking guidelines applied inexplicably, at odds with what has happened in previous years. Some moderation of coursework has changed too, similarly without warning and often at the expense of a student’s final grade.

Whilst we start to look again at alternative boards that can offer some stability and certainty for our students - and in whose examining we can have confidence - can I urge you to respond to HMC’s most recent call for feedback so that, as a sector, we stand together? Sometimes the referee does get it wrong after all.