11 Sep '13

School league tables: how we are helping parents make sense of the data

The Open Public Services Network aims to make the vast amount of information on schools more useful to the public

It is no surprise that parents struggle to make sense of data about schools: there is just so much of it. School league tables are among the most frequently viewed official datasets, but few of us use them to make decisions.

The Open Public Services Network aims to make the vast amounts of data now available about education, healthcare, policing and social care more useful to the people who rely on those services. Too often, the information makes sense to managers or professionals, but leaves the general public confused.

Our first project has been to look at schools and ask the question: how well does the information available to parents and children help them understand the education provided by a school? How might it be improved? We brought together a group of experts to consider this. The results of their deliberations can be found here.

One thing became clear early in the discussions – there were some large gaps. For example, the "culture of learning" within a school was seen as crucial, but the available information gave limited insight into it. There was a desire to know much more about the views of parents, staff and pupils.

Understanding the curriculum – the range and types of subjects taught – was also seen as important. Here we were able to make better progress as, although it was not available in the right format, relevant data did exist. We were able to identify clear differences, even at GCSE level, in the range of subjects taught in different schools. Whether your interest is history or science, art or business studies, you might want to look for a school where lots of children take these exams or at least be aware if very few pupils study the subjects.

Exam performance is important, but interpreting the data is controversial. We looked at a number of ways to make it easier. First, we wanted to separate out significant differences from random variation. Our analysis suggested that for many schools, differences in performance were very likely random, but in some cases the results probably were worth paying attention to. In a similar vein, we were able to distinguish between schools that achieved consistent results over time and those that were more variable.
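The noise-versus-signal distinction described above can be sketched with a simple check on a school's pass rate: for a small cohort, a few percentage points either way sits well within sampling noise. This is a minimal illustration only, not the group's actual method; the function name, the 1.96 threshold and the example national rate are assumptions.

```python
import math

def classify_school(passes, entries, national_rate, z_crit=1.96):
    """Band a school's pass rate against a national rate.

    Uses a two-sided z-test on a binomial proportion (a normal
    approximation): if the school's observed rate falls within the
    sampling noise expected for a cohort of this size, the
    difference is treated as likely random.
    """
    rate = passes / entries
    # Standard error of the observed proportion if the school
    # really performed at the national rate
    se = math.sqrt(national_rate * (1 - national_rate) / entries)
    z = (rate - national_rate) / se
    if z > z_crit:
        return "above expectation"
    if z < -z_crit:
        return "below expectation"
    return "difference likely random"
```

With an illustrative national rate of 59%, a small school passing 36 of 60 pupils (60%) is indistinguishable from the national figure, while a larger school passing 140 of 200 (70%) is not – the same idea, at sketch level, as separating significant differences from random variation.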

In each case we put schools into groups or "bands" depending on how they performed. This approach is useful because bands can be represented in simple ways that allow easy comparison – graphically or with labels. Most people are more comfortable with information presented this way than they are with numbers. The Guardian has used words such as "excellent" or "variable" to describe the different groups. This is as we hoped the data would be used but I should make clear that the particular labels and presentation used by the Guardian have been their editorial choices, not the decisions of the OPSN group.

Throughout the process, the expert group was very aware that information relevant to one child would be completely different to the information of interest to another. To demonstrate the point, we produced information about subjects at different levels of performance. So in the data displayed by the Guardian, one person can search for a school that does lots of science and gets many A grades while another can identify the school most likely to get them the five Cs they need in the subjects they wish to pursue.

The most contentious area was the use of so-called "value-added" or "progress" measures. These attempt, with varying degrees of sophistication, to calculate the difference the school has made between the results that you would expect a child to get and those actually achieved. On balance – but certainly without unanimity – it was felt that, while imperfect, this was important contextual information. We focused on identifying where differences in these measures were most likely to be important – where the evidence suggests the school is making a big impact given its intake or the opposite. This data has been used as the basis of the "school impact" ratings in the Guardian tables.
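At their simplest, progress measures reduce to the arithmetic below: the gap between achieved and predicted results, averaged over a cohort. Real measures build the predictions from prior attainment and intake with varying sophistication; this sketch simply assumes the predictions are given.

```python
def value_added(actual_scores, expected_scores):
    """Average gap between achieved and predicted results.

    A positive value suggests pupils did better than predicted,
    a negative one worse. The prediction model itself – the hard
    and contested part – is not reproduced here.
    """
    diffs = [a - e for a, e in zip(actual_scores, expected_scores)]
    return sum(diffs) / len(diffs)
```

A cohort scoring [6, 5, 7] against predictions of [5, 5, 6] yields a positive value-added of about 0.67 – on this toy scale, two-thirds of a grade above expectation.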

There are, of course, thousands of different ways of presenting data about schools, all of which have strengths and weaknesses depending on your needs. Our aim is purely to encourage new approaches, in the belief that a better understanding of the services available to people opens up greater opportunities for them to achieve their goals.

The complete dataset that was produced in the course of our work can be downloaded from the RSA website here along with the report.

Roger Taylor is chair of the Open Public Services Network

theguardian.com © 2013 Guardian News and Media Limited or its affiliated companies. All rights reserved.
