Chris Woodhead and his unreliable facts

The Guardian, September 1999

They really don’t know. The world of education is rather like the world of the Fortean Times, the journal of strange phenomena. A few years ago, the magazine totted up the number of reports which they had been sent from around the world that year describing bizarre and inexplicable events. They compared the total to the number of similar reports which they had received the previous year involving various freaks and flying saucers and concluded that the world had become 3.5% weirder.

The difference, however, between the world of education and the Fortean Times is that the journal of strange phenomena has its tongue in its cheek. The civil servants, academics and pundits who inform the most important domestic debate in the country may produce findings which have no statistical validity at all, and yet they present them with a straight face, claiming to have uncovered the truth, when the alarming reality very often is that they really don’t know.

This summer, for example, the chief inspector of schools, Chris Woodhead, wrote an acidic essay in the Independent, flaying ‘elitist liberals’ who disagreed with the education secretary’s views on homework. In support of his position, Mr Woodhead wrote: “When I became chief inspector five years ago, 20-30% of lessons were routinely judged by inspectors to be unsatisfactory or poor… Now, more than 50% of lessons taught to 7-11-year-olds are routinely judged to be good.” Point made. But look again.

There are several problems here. The smallest problem is that five years ago, when Mr Woodhead took over, Ofsted had gathered such a small amount of data from its new work in primary schools that it published no figures at all for them in its annual report. Ofsted’s chief press officer confirms that the database simply was not big enough to produce any reliable results.

If Mr Woodhead has not simply invented the figures, he may be ignoring the paucity of data and relying on a crude table from the time which does show a failure rate of the kind he reports. If that is his source, he fails to tell his readers that this crude table also suggested that more than 40% of the relevant lessons were rated good or very good. More important, however, is the fact that the entire basis of the comparison is invalid because in the last five years, the methods of inspection, the criteria of measurement and the number of grades have all changed.

The hopelessness of making this kind of comparison was put powerfully by one education expert who was apparently frustrated at the abuse of Ofsted evidence: “In 1996, the framework for inspection was revised and significant changes were made to the way in which lessons are graded by inspectors and the way in which standards achieved by pupils are judged. In addition, there were some changes in the criteria used by inspectors to judge other aspects of the quality and efficiency of schools. While this has improved the quality of inspection, it makes precise comparisons with previous years impossible.” Impossible? The man who wrote that was the same Chris Woodhead, in his 1996/7 annual report. Fortean indeed.

Despite the impossibility of making these year-on-year comparisons, Chris Woodhead and government ministers regularly use – and abuse – Ofsted data to do so. If it is beguiling for the public, it is infuriating for Ofsted’s inspectors, who have queued up (in private) to complain. “There is a real shortage of hard data,” according to one of them. “We cannot give a hard answer. All we can do is to take a view. The government is being told stupid things.”

For example, Tory ministers, supported by Mr Woodhead, seized on figures which appeared to show that smaller class sizes did nothing to yield better academic results. This was a powerful statistic since it struck at the heart of left-wing criticism and could save the Treasury a huge bill for extra teachers. Unfortunately, the figures are Fortean from start to finish. In the current market in education, struggling schools lose pupils and so they have smaller classes. But they were struggling in the first place, because they had so many disadvantaged children. Those who leave tend to be the bright, middle class children whose parents can afford to move house and who can negotiate the appeals procedure. And so they leave behind them a school which has small classes, populated by children who do badly in exams. On the other side of the tracks, the successful school fills up its classrooms with bright children, who turn in good results. Did Mr Woodhead not understand that? As one of his Ofsted inspectors told the Guardian: “The correlation is entirely bogus.”

Mr Woodhead has launched scathing attacks on teachers who run their classes in small groups. In support of his claim that ‘whole class teaching’ generates better results and is now used far more frequently than in the past, he likes to refer to Ofsted’s own unpublished data and to the work of Maurice Galton at Leicester University. However, when pressed, Ofsted told us in writing that they had no such data to support the claim, and Maurice Galton said he did not agree with Mr Woodhead’s position. He says that no one really understands what makes a good teacher and that, contrary to Mr Woodhead’s claim of an increase in whole class teaching: “What teachers do with children has not changed – only what is taught has changed. The whole of 20 years’ effort has not changed anything of the fundamental structure in the classroom.”

Fortean statistics are riddled like woodworm through the structure of the debate on education. Sometimes, it is a question of clashing statistics, as with the squabble this summer about whether primary school children should do half an hour’s homework each evening. David Blunkett said they should, but researchers at Durham University said better results were achieved by children who did less homework. Blunkett’s people poured scorn on them. The Durham people replied that Mr Blunkett had done no research. Mr Blunkett’s people said they had and that it was the Durham people who had not done their homework properly. The truth is that they really don’t know.

More often, it is a matter of misreading the statistics that are available. Mr Woodhead claims there are 15,000 incompetent teachers. Some of his inspectors disagree. “I write the reports he’s reading,” said one. “He can say that n% of lessons are judged to be less than satisfactory. That does not entitle him to say that n% of teachers are incompetent. That’s a very different thing. A competent teacher can have an unsatisfactory lesson. Plus it is subjective; this data we provide is impressionistic. And it shouldn’t be used for the drawing of these bogus, politically acceptable tabloid slogans.”

Ofsted justifies its own role by pointing out that since it started work, in 1992/3, there has been a steady rise in the number of pupils scoring at least five GCSE passes at grades A to C. But wait. Look at Scotland, where Ofsted has no role: the comprehensive schools there have seen their exam results rising just as fast and, lately, even faster. Or look at Northern Ireland: no comprehensives, no Ofsted, but the same annual rise in results. Or consider the fact that the exam results of 16-year-olds in England started rising in 1980, more than 10 years before Ofsted started work. What does it mean? That some other factor is driving up the performance of children – more pressure to find work, more access to information, better teaching even? Or is it just that the examiners keep moving the goal posts to make it easier for the students to score? They really don’t know.

In a lecture in February last year, Mr Woodhead told his audience: “Down on the bedrock, the crusade for higher standards involves a clash of ideologies which will be resolved not by the intrusion of political will but by the exercise of a quality which is in rather shorter supply: intellectual clarity.” Just so.