Data. It’s one of those things that stirs a range of emotions in teachers and leaders all across the country: anger, hate, loathing and utter bamboozlement in some cases. I was somebody who thought they would never understand data to a decent level, let alone be able to use it as an effective tool for departmental leadership. The common cry is “but data is a flawed representation of my classroom”, or “it doesn’t show the whole picture”, or even “data is a load of nonsense”. While I agree with the middle statement, I think that when used correctly, data can be one of the most important and useful sources for improving departments and classroom teaching, as long as we ask the right questions.
To emphasise, I don’t think data paints a full picture, nor do I believe that it is used effectively unless taken with background context. Having said that, it does have many uses. To dismiss data outright because it doesn’t look the way we want it to, or because it asks difficult questions of us, would be damaging and naïve. Hopefully I can give a few ways of using it here.
Firstly, have you used it across departments? This is only really useful at GCSE, A-Level and whenever a full mock is set, because these provide the most accurate data to compare across departments. Unless somebody finds some miraculous way of making all tests across subjects in each year exactly the same level of difficulty and rigour, data in Key Stage 3, and indeed any data before full mocks are sat, needs to be taken with a pinch of salt. At those points it is excellent for comparing your above/on/below ratios and for comparisons between classes within a subject, but looking at raw subject progress is problematic. At GCSE, at A-Level and in mocks, however, it can be used to see how you are doing compared to other departments with a greater degree of confidence.
Let’s first consider mock exams, as that is something many of us will currently be undertaking, or will be shortly after the Christmas break. Teachers and Heads of Department who use data well never stop asking questions of it. Ask yourself: have you used realistic boundaries? How similar are the results to last year? If they have changed, why? If your progress scores have changed significantly (up or down by 0.3 or more), does that match where you feel students are at? If, for example, the grades are higher than you would have expected, this could mean the students have been revising more, new strategies have been effectively implemented, or marking is more generous than last year. Follow it up and find out why, in a non-accusatory and entirely supportive way. Get some tests marked by your staff and yourself and compare them to samples you have from last year: is the marking consistent? If not, why not? This could be a valuable training opportunity across the department that your usual standardisation of marking may not have picked up. Ask the students: has something been working or not working this year?
Then of course we have the real thing: GCSEs and A-Levels. I’m naturally quite competitive, so I always look at the progress scores of my department compared to the other departments in the school. It’s a realistic way of checking your impact. What have you done this year, and has it worked or not? It’s entirely possible to be delighted at the success of other departments while also wanting to go one step further and beat them. I want to beat their progress scores in the summer, but only because my results have gone up, not because theirs have dropped (they won’t; the heads of department and teaching staff are great across the board).
The questions to ask here are: if we have the same students, are ours performing as well as they are in other subjects? Avoid the temptation to make excuses here, saying other subjects get more curriculum time or whatever the case may be; look at what you can impact or change directly. There will always be a degree of difference between subjects, but it should never be overly substantial. If students in a certain ability group are underperforming in your subject but not in others, that’s a departmental training need. Speak to the HoDs in your school who don’t seem to have that issue and see why they think that is. Steal ideas and inspiration from them whenever you can. We’re all in it for the students at the end of the day. Are they doing revision/homework/exam prep/intervention differently from you? If so, STEAL IT!
Our history results weren’t great in our first year, but in our second year they were superb. There was of course a range of factors, but ultimately I spent a lot of time looking at what the successful departments were doing and working hard to emulate their success. It has worked for the last two years now, with significantly positive progress scores.
This is going to be the first of several entries on data, as I’ve written a lot on this already today! In later instalments I’m going to look at using data to inform department leadership and classroom teaching. I hope this has been in some way useful.
Questions to ask of mock data:
Have you used realistic boundaries?
How similar are the results to last year?
If they have changed, why?
Is marking accurate? How do you know?
If your progress scores have changed significantly (up or down by 0.3 or more), does that match where you feel students are at?
Ask the students, has something been working or not working this year?
What have we changed this year? Has it had an impact?
Of official exams:
Are we performing better or worse than other departments?
Why may that be?
Have some policies trialled this year clearly worked or failed to work? Why?
Can we steal policies or ideas from high-performing departments?
Do they do revision/homework/exam prep/intervention differently than we do? What’s the impact?
Are certain groups (PP, SEN, HA, MA, LA, EAL) underperforming in our subject? Why?
How can I use the information from these exams to assess strengths and weaknesses in the department? (later blog entry to address this)