Just say no to junk data: Assessment at Michaela

I was reading @hfletcherwood’s blog post describing his dream assessment system:

1) Abolish whole-school data reporting on anything less than an annual basis.

Whether from teachers to school leaders, from school to parent, you name it, get rid of it. It is only on an annual basis that we can conduct assessments thorough enough to provide valid inferences about students’ overall understanding of a subject, and where we can pursue reliability through moderation. (If you doubt the former point, ask language teachers trying to mark assessments of speaking, reading, writing and listening on a termly basis.) Valid and reliable data would provide excellent justification for significant interventions for students. This is necessary to create the time to pursue point 2 properly:

2) Devolve regular assessment to teachers and departments

I’m not arguing for less assessment; I’m arguing for frequent, useful assessment. Levels do not help a teacher or a head of department (if they did, it would be possible to explain how to support a student at ‘level 5c’ in history without any further information).

What teachers need to assess on a regular basis is students’ knowledge of individual concepts and ideas, and their capacity to use that knowledge; this kind of analysis must happen at a departmental level.  Question-level analysis in departments provides usable insights: if 80% of students in Class A answered a question about the Blitz well, but only 40% in Class B did so, it seems highly likely that the teacher of Class A has done something her colleague would benefit from learning about.  Departments can create short-term solutions (Teacher Y spends a few minutes reteaching the Blitz using Teacher X’s approach) alongside longer-term ones (reconsidering the unit plans). ‘Intervention’ would be more frequent, and more useful than on a whole-school level.

This is almost precisely what we do at Michaela. We examine twice a year rather than annually, but bar that, Harry pretty much just summarised our assessment policy in two paragraphs.

Multiple-choice quizzes

Pupils take weekly multiple-choice quizzes online. These are automatically marked, with question-level analysis that looks like this:

[Screenshot: question-level analysis of a weekly multiple-choice quiz]

In fortnightly line management meetings, we discuss the implications of this data. How is the class doing overall? Do I need to reteach a topic? Stretch the class further? How are individual pupils doing? What can I do to support those who are struggling? How have the historically lowest performing pupils done recently? What’s worked well and what hasn’t? Why?
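The class-comparison questions above boil down to a simple aggregation: per-question success rates per class, plus a flag when one class has dramatically outperformed another on the same question (as in the 80%-versus-40% Blitz example). Here is a minimal sketch of that idea in Python; the data shape, function names, and threshold are my own assumptions, not a description of the actual system:

```python
# Hypothetical sketch of question-level analysis across classes.
# The input format and names are illustrative assumptions.

from collections import defaultdict

def facility_by_class(responses):
    """Return {class_name: {question_id: fraction correct}}.

    `responses` is an iterable of (class_name, question_id, correct) tuples,
    one per pupil per question.
    """
    totals = defaultdict(lambda: defaultdict(int))
    rights = defaultdict(lambda: defaultdict(int))
    for cls, q, correct in responses:
        totals[cls][q] += 1
        if correct:
            rights[cls][q] += 1
    return {
        cls: {q: rights[cls][q] / totals[cls][q] for q in totals[cls]}
        for cls in totals
    }

def reteach_flags(facility, threshold=0.3):
    """Flag questions where one class beats another by at least `threshold`.

    Returns (question_id, better_class, gap) tuples, pointing at the
    teacher whose approach might be worth sharing.
    """
    flags = []
    classes = sorted(facility)
    for i, a in enumerate(classes):
        for b in classes[i + 1:]:
            for q in facility[a].keys() & facility[b].keys():
                gap = facility[a][q] - facility[b][q]
                if abs(gap) >= threshold:
                    better = a if gap > 0 else b
                    flags.append((q, better, abs(gap)))
    return flags
```

With the Blitz example (8 of 10 correct in Class A, 4 of 10 in Class B), `reteach_flags` would surface the question with Class A as the class worth learning from.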

Teachers can also set pure recap quizzes, testing content from throughout the year, by picking and mixing old questions.

You can tag questions; set time limits; set a maximum number of retakes; upload from Word or Excel; embed TeX, images, videos…

You can choose between free-text entry, single-answer multiple choice, and multiple-correct multiple choice. The latter can really add rigour, as can carefully planned distractors. Here’s a random assortment I just screenshotted:

[Screenshots: example multiple-choice questions with planned distractors]

Biannual exams

The other component of the assessment system is the biannual exams. These have two parts: a multiple-choice paper and a traditional written exam. The multiple-choice paper tests a random selection of knowledge from the entire domain – all the content pupils have studied since beginning at Michaela. The written papers are synoptic wherever possible, drawing on knowledge we expect pupils to have retained from previous terms and years. We’re not in the business of tacitly accepting that you’ll forget everything from Year x when you’re in Year x + 1.

Pupils are given a separate percentage score for each part. Internal and external moderation calibrates what each percentage represents. We do not touch levels! Levels are not part of the vocabulary at Michaela. This is not a ladder system where you go up if you’re improving: pupils who stay on the same percentage over their school career are making expected progress. A pupil scoring 80% in Year 11 is doing so on a harder, broader range of knowledge than the test they scored 80% on in Year 7.

How it’s going

This assessment system is what I was trying to shoe-horn in as an individual teacher behind my classroom door in my old school, except without the shoe-horning.

Last year, I was attaching meaningless levels to half-termly assessments, spending an age marking and doing data entry for junk data that would sit on SIMS and not tell anyone anything actionable. In my classroom, I would play about with apps like QuickKey and use websites like diagnosticquestions.com to get a useful picture of my classes’ learning. I was a lone ranger. In terms of a Venn diagram, we didn’t have any overlap:

[Diagram: Venn diagram of whole-school data and useful classroom assessment, with no overlap]

Now the data is worth looking at. What I was hiding behind closed doors is now whole-school policy. The workload of meaningless marking and data entry has been felled. Our Venn diagram becomes one lovely overlapping circle:

[Diagram: the same Venn diagram as a single overlapping circle]


  1. We had a few teething issues with the technology, though nothing major.
  2. It’s different. Our system is not challenging to understand, but if you’re looking for levels and a nice linear progression with numbers going up, you’ll have to try a little harder. I won’t apologise for that. If you’re not data literate enough to understand that something staying at 80% can be progress, then you should not have any role in judging my performance or my school’s performance.
  3. It’s spoiled me for any other school: I couldn’t go back to an assessment system that didn’t make sense.

I see so many people blogging about the right ideas on assessment. I see so many people doing the right thing behind their classroom door. School leaders now need to capitalise on that and make whole school systems that make sense. Drop the junk data, work with your teachers. Increase the Venn diagram overlap.

Further reading

Michael Fordham on Assessment after levels: don’t reinvent a square wheel

Joe Kirby on Life after levels

Daisy Christodoulou on assessment

17 thoughts on “Just say no to junk data: Assessment at Michaela”

  1. philiprolt

    Really interesting solution to something that is a real problem at the moment. As a head I want to be able to track progress but with the freedom to devise an assessment process that supports learning there must be better ways than the old system. This is a great example of an assessment system that rises to the ‘after the levels’ challenge.

  2. mrbenney

    Firstly, many thanks for blogging this Bodil. It is absolutely fascinating to see the different approaches being taken to assessment and tracking. There is lots here that I like. The fortnightly multiple choice questions (as long as they are well designed which they certainly seem to be) clearly give you valuable data on what the pupils can and can’t do. I really like how that data is used in meetings and these questions: “How is the class doing overall? Do I need to reteach a topic? Stretch the class further? How are individual pupils doing? What can I do to support those who are struggling? How have the historically lowest performing pupils done recently? What’s worked well and what hasn’t? Why?” show the data is used to inform teaching rather than just ploughing through the scheme of work. Departmental meetings can too often be filled with administrative matters rather than teaching and learning.
    Two questions: 1) How easy has it been to “standardise” the different exams so that 80% in Year 7 and 80% in Year 8 genuinely show progression?
    2) Unrelated, but what sort of marking do you do in books (or do the multiple-choice quizzes replace the need)?
    Perhaps the real challenge at Michaela will be to continue to use data at this level for individuals as well as classes as the school gets bigger.
    Thanks again for blogging Bodil. You’ve given lots of food for thought.

    1. Adam Porter

      The integration with book marking is something I am interested in. Does it correlate and in what way?

      I’m a huge fan of the system you describe Bodil, and it seems to tackle most if not all of the typical holes in assessment practice.

      My only further query is regarding how this form of assessment contributes to any form of outcome tracking (especially at KS4), for any audience, parents, leadership…

      You have already said that “We do not touch levels!” Does this mean that there are never any predicted grades, or that individual mock papers and controlled assessments are graded in isolation as a separate assessment practice?

      It seems ideal to have faith that an assessment approach such as this focusing on actual content would result in the best possible student outcomes but this does seem like a tough sell to many who are used to (albeit often arbitrary) grade based outcome predictions.

  3. ingotian

    Similar to what we have with the Computer baseline testing project. Two multiple-choice tests per year. 60,000 did the first one, so that provides data to contextualise school performance against national performance. An online formative assessment system lets pupils provide evidence and self- and peer-assess their work, which the teacher can confirm (or deny). No paper, and it’s all free. The second progress test in the baseline series of 6 will be ready on Monday. Six tests over three years will give a good idea of individual and department progress through KS3. All number crunching is done centrally and fed back to schools. Job done, and minimal admin.

    1. Bodil Post author

      We use a custom-made system, but it will be opened up to other schools in time. It’s called Blue, made by a company called Genesis VM.


  9. Mike O'Donovan

    Hi Bodil,

    I’m an old colleague of Jonny Porter’s back in Birmingham and I’ve been spreading your blog among colleagues here, especially this piece on assessment as it definitely speaks volumes to me.

    I’ve just trialled the QuickKey software – wow – what a revolutionary app this could prove to be! I’m going to see how Year 10s deal with a multiple-choice quiz tomorrow.

    Looking at the practicalities of developing the quizzes, have they all been developed to fit with the scheme of work so all classes of the same ability do the same test? If so, do teachers supplement these quizzes with their own based on their own classes’ needs?


  10. david jones

    Thanks for sharing your ideas, Bodil – we are on a similar journey and I shared your post with my colleagues. Our science leader asked me whether it would be possible to see your weekly multiple-choice science questions, and which software you are using for setting them up.

    We will share anything we have back. The questions we asked colleagues on our inset day about reducing unnecessary data and reducing workload are linked here: http://blog.meolscophighschool.co.uk/?p=2402 It’s a start!

    Many thanks

