GUEST BLOG: The Numbers Game

I once had a Yr13 class that got exceptional results: 100% A-C. I was really pleased, especially because getting 100% A-C at Physics A-level in a pretty deprived area of Wales is quite an achievement! But were my results really all that exceptional? Well, if you consider that the class in question was made up of only 2 students, things begin to appear in a different light.
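
To put some rough numbers on that, here is a quick back-of-the-envelope sketch (in Python, with class sizes invented purely for illustration) of how much a single pupil’s grade swings a headline percentage:

    # How many percentage points does one pupil's grade move the headline "% A-C" figure?
    for class_size in (2, 5, 10, 30):
        swing = 100 / class_size  # percentage points contributed by each pupil
        print(f"Class of {class_size}: one pupil is worth {swing:.0f} percentage points")

With a class of 2, each pupil is worth 50 percentage points, so a single grade D would have turned my “100% A-C” into 50%; in a class of 30, the same pupil moves the figure by barely 3 points.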

The problem is that, these days, schools, Senior Leaders, Governors, Local Authorities, Ofsted and Estyn, etc. are all too concerned with numbers and percentages. But does focusing too much on percentages and tracking risk distracting us from what is really fundamental in teaching, i.e. the learners?

I am concerned that schools invest large amounts of money in annual subscriptions to tracking and targeting tools that promise to make teachers’ lives easier and to provide a formative assessment platform that will improve students’ performance and allow teachers to demonstrate progress. First of all, for assessment to be formative it needs to be shared with the learner, yet pupils are rarely mentioned by the providers of tracking systems. The data collected when teachers input assessments into these systems might be used to generate pupil profiles, which is indeed really useful, but only if that information is shared constructively with the learners themselves, in a way they can understand and use to improve their skills. However, the main use of these systems seems to be to please the Senior Leadership Team, which can then monitor and micromanage the performance of individual teachers.

Something else suggested by some providers of tracking systems is the ability to identify groups of learners with common skill profiles to support planning and differentiation in the classroom. Perhaps someone should tell these providers that educational research suggests differentiation is more effective when it is done through feedback rather than by giving different content to groups of students of different “ability”. The Sutton Trust report What makes great teaching? lists “grouping students by ability” among the common practices that are not supported by evidence, yet we are told that maniacally tracking learners’ assessment data will help us sort them into groups of “common skill profiles”. But there is more! The Education Endowment Foundation, in its Teaching and Learning Toolkit, states that the evidence suggests setting or streaming pupils has a negative impact on their learning. In contrast, the same page suggests that peer tutoring (which is quite the opposite of setting and streaming) has an average positive effect of approximately five additional months’ progress.

But it is easy to point the finger at school leaders when, in fairness, they face pressure from so many directions to perform and to keep meticulous records of their students’ attainment; leading a school has become a numbers game rather than a focus on the learner as an individual. Ofsted and Estyn inspectors spend most of their school visits analysing and discussing data these days, and it is quite likely that a good number of teachers don’t get observed at all, so this tendency to concentrate on data, tracking and monitoring is ingrained at every level of education. Don’t get me wrong, I am all for the effective use of assessment data for diagnostic purposes, but it has to be meaningful, not just a box-ticking exercise.

Let me present a scenario that I suspect could happen to some headteachers. The scene is set in a primary school in Wales. The data presented is made up, but it is based on a real set of data I once came across, so the patterns reflect a real Yr5 group of learners. The names of pupils and any other names quoted are entirely made up too.

So, the Challenge Adviser arrives at the school and sits comfortably in the headteacher’s office. For those outside Wales: Challenge Advisers are employed by the Consortia to challenge schools on their performance. The name would suggest that they have to find grounds to challenge a school on something, which is a rather interesting concept.

Anyway, imagine a completely data-driven individual who perhaps has no background in science or maths (and there are plenty of senior leaders in that category too); the conversation could easily go like this.

Challenge Adviser: Good morning Mr Jones. I was very pleased to see you’ve used the Diagnostic Tool to analyse your results on the procedural numeracy test.

Headteacher: Yes, it is a handy tool, isn’t it?

Challenge Adviser: Yes, yes… But let’s look at some of your data then. See here, in your current Yr5 the girls are performing really well! Look at those percentages! Look at those tables. However, your boys don’t seem to have done that well, have they?

Headteacher: It would appear so, but…

Challenge Adviser: The thing that concerns me most, Mr Jones, is that I don’t see in your School Improvement Plan any targeted intervention to tackle your, should we say, boys problem!

Headteacher: Well, as I was trying to say…

Challenge Adviser: I tell you what… We have this excellent course on “Engaging boys with Numeracy” that would be perfect for your boys and would also be very suitable as part of your SIP! It will cost you £300 per delegate and I recommend you send as many of your staff as possible. And don’t worry, you can pay for it with your School Effectiveness Grant money!

Headteacher: Hang on, let’s look more closely at the data. Why are you challenging me so much?

Challenge Adviser: Hello? Challenge Adviser? It is my job to challenge… and I rather like it, I might add… eheh

Headteacher: Rolling his eyes in despair. Ok, let’s get back to the first page of that diagnostic tool. See? There are only three girls in this class and three of the boys perform similarly to them, so I believe these percentages are not a fair comparison. Perhaps we should look at the performance of girls across the whole school?

Challenge Adviser: Look Mr Jones, I have been quite patient with you so far. The data speaks clearly and I want to see evidence that your school is tackling this problem. So show me the evidence!!!

Headteacher: Talking slowly and stammering in disbelief. But… I… Just… Have!
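
Behind the headteacher’s exasperation is some very simple arithmetic. Sticking with invented figures, in the same spirit as the made-up data in the scenario, a short sketch shows why comparing three girls with a whole cohort of boys tells you very little:

    # All figures below are hypothetical, invented to mirror the pattern in the scenario above.
    girls = [82, 78, 75]                          # only three girls, all above the threshold
    boys = [81, 77, 74, 62, 58, 55, 51, 49, 45]   # three boys match the girls, the rest trail off
    threshold = 70                                # an invented "expected standard" score

    def pct_at_standard(scores, threshold):
        # Percentage of pupils scoring at or above the threshold.
        return 100 * sum(score >= threshold for score in scores) / len(scores)

    print(f"Girls at standard: {pct_at_standard(girls, threshold):.0f}%")  # 100%
    print(f"Boys at standard: {pct_at_standard(boys, threshold):.0f}%")    # 33%

The headline gap of nearly 70 percentage points is largely an artefact of the group sizes: each girl moves her group’s figure by 33 points, each boy by about 11, and the top three boys score within a few marks of the girls. That is exactly why the headteacher wants to look at the girls’ performance across the whole school before drawing any conclusions.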

John Hattie, in his excellent and very well-researched book Visible Learning for Teachers, argues that “School leaders need to stop creating schools that attempt to lock in prior achievement and experiences (such as by using tracking), and instead be evidence-informed about the talents and growth of all students…”. He goes on to say: “She (Weinstein, 2002) also demonstrated that many institutional practices (such as tracking or streaming) can lead to beliefs that preclude many opportunities to learn: Expectancy processes do not reside solely ‘in the minds of teachers’ but instead are built into the very fabric of our institutions and our society.”

So, by all means use data for diagnostic purposes, but use it intelligently and make sure you do not introduce unconscious bias about your learners’ ability, which will most likely lead to skewed expectations of them.

This blog post was written by our guest blogger Alessio Bernardelli, Founding Director of CollaboratEd. CollaboratEd is an award-winning provider of professional development training for teachers and schools. You can contact Alessio and CollaboratEd at CollaboratEd1@gmail.com or on Twitter at @Collaborat_Ed.
