Monday, 20 June 2016

Drop the Data Drop

If there is one phrase that exposes the true nature of data in a school it is 'data drop'. That termly data collection window, which opens 3 to 6 times per year and causes fear, consternation and resentment amongst teachers. The data drop is indicative of a system that revolves around an accountability cycle - to provide data for governors, LAs, Ofsted - rather than the needs of teachers, for the benefit of learning. And if teaching and learning are not placed at the heart of the system, then the system's heart is not in the right place. Any system that involves collecting assessment data every 6 to 12 weeks in order to produce graphs, predict outcomes and monitor teacher performance is coming at the challenge of school improvement from entirely the wrong end and there are two key issues with such an approach.

First, the data is likely to be of low quality. Teachers will see the data collection cycle as an exercise in itself, disconnected from learning in the classroom, and possibly even intrusive. Teachers, disenfranchised from the data process, will come to dislike and distrust it. Where assessment data is used for purposes of performance management, teachers are likely to give the benefit of the doubt rather than err on the side of caution, and data ceases to be a true reflection of pupils' learning. It reduces assessment to a tick-box exercise. I recently spoke to a Year 2 teacher who had spent over 6 hours ticking off numerous objectives in a tracking system to ensure the data was in by the last day of term. Six hours that would have been better spent doing something more useful like, well, planning lessons or relaxing in front of the telly. The teacher rightly resented that time wasted in front of a laptop. And such resentment, combined with the pressure of having to enter assessments onto the system within a narrow time window, may result in teachers using the block filling option available in many tracking systems just to get the job done. What is the likelihood of robust data when such shortcuts are used?

The second issue is that the data is nearly always out of date. When assessment data is only entered onto a system at the end of a term, it is only going to be current, and thus useful, every 6 to 12 weeks, depending on when your data drops are. The rest of the time it is out of date. And if it's always out of date then it's not data that can be acted upon with any degree of confidence. In fact, by the time you come to act upon it, it may be too late, all of which raises the question: who is the data for? I regularly visit schools where I am shown data issued with caveats relating to its currency, and an invitation to come back early next term.

The solution is to change to ongoing recording of assessment as and when evidence is seen. Obviously this has to be carefully considered so as not to increase workload - i.e. assessing against a few key objectives and trusting teacher judgement - but when done properly it can turn the system into a positive tool for teaching and learning. When regularly updated - recording assessment as and when learning happens - the system becomes a live, real-time record of pupils' learning, of their strengths and weaknesses and the gaps that need to be addressed. It reflects the current state of play as opposed to where pupils were at the end of last term. Headteachers are therefore assured that any reports they run are up to date; and anyone using the system in the classroom can be confident that it presents an accurate picture of pupils' learning.

To ensure the system is truly fit for purpose there has to be a culture shift towards acceptance that the numbers can go down as well as up, that sometimes pupils understand things and then they fall behind and need support. If the prevailing culture is that such dips are unacceptable then teachers will not record them and, again, the system is disconnected from reality.

The data drop speaks volumes about the attitude towards data in a school: it is all about accountability and top-down scrutiny. But we need to be very careful and understand that some of the practices we put in place, supposedly for the purposes of school improvement, can actually be counterproductive to the intended aims. The irony of accountability-driven approaches is that they can be a risk to pupils' learning and therefore can have a negative impact on results. Precisely what they were implemented to avoid.

So, my advice: drop the data drop. Move towards continual recording of assessment as and when learning happens and ensure your systems are built around the needs of teachers, that they are first and foremost tools for teaching and learning. In return you will have live, accurate, reliable data whenever you need it, data that can be acted upon to benefit pupils, and a data culture that is more about learning and less about the needs of external agencies and performance management.

Everyone wins.


  1. Your last few posts have been really useful and make lots of sense James. Thank you. It is a long battle for teachers to break the habits and distrust of tracking information, used to beat almost everyone with at some stage I am sure. We are being moderated tomorrow and the Key Stage 1 staff have spent ridiculous amounts of time gathering evidence based on confused messages from the DfE and the LA. This despite repeated assurances from SLT that the workload was not necessary and overly burdensome for them. The teachers' response being that they wanted to make sure they had enough rather than too little evidence for the moderators. A sad state of things that seems to be quite widespread at the moment. This will change next year I hope as people become more confident in what they are assessing. Keep banging the drum for dropping data collections at fixed points.

  2. Thanks for comment. Really pleased to hear my posts are making some sense and getting through to people. We need to keep chipping away at the nonsense until it just crumbles away. Feel like we're getting somewhere finally.