Ofsted, the schools inspectorate for England, have today released a report on mathematics education, entitled “Mathematics: Made to Measure” (freely downloadable as a PDF, lower down the linked page). Given that I encounter a lot of teachers in my work, I feel duty-bound to point out that Ofsted are not generally popular among UK teachers. Directly or indirectly, Ofsted are encouraging a growth in bureaucracy and harmful over-critiquing of teaching, and the new head of Ofsted is not exactly trying to get teachers on side; that’s not to mention the satisfactory-is-not-satisfactory rebranding. However, Ofsted reports make for good reading: they are the result of large-scale observation of schools, and provide an overview of the effectiveness of teaching methods that is not otherwise available.

The Ofsted report is about maths, not computing, which goes against the title of the blog — but I still hold an interest in maths, and maths is similar enough to computing that there are potentially useful lessons in the report for computing too.

## Teaching

I was particularly interested to skim the section on teaching, which begins on page 20. This section shows what Ofsted believe made for good teaching:

In the very best schools, all lessons had a clear focus on thinking and understanding… Whole-class teaching was dynamic with pupils collaborating extensively with each other. It challenged them to think for themselves, for instance by suggesting how to tackle a new problem or comparing alternative approaches. Critically, pupils were directly engaged in mathematics for a substantial portion of each lesson. As a result, they had time to develop a high degree of competence and to tackle challenging, varied questions and problems that helped to deepen their understanding.

[paragraphs 51 and 52]

So that is what Ofsted believe worked well. Here’s what they were not so keen on:

A common feature of the satisfactory teaching observed was the use of examples followed by practice with many similar questions. This allowed consolidation of a skill or technique but did not develop problem-solving skills or understanding of concepts. The teachers typically demonstrated a standard method, giving tips to pupils on how to avoid making mistakes and, sometimes, ‘rules’ and mnemonics to help them commit the methods to memory. Many of their questions concerned factual recall so that pupils’ ‘explanations’ often consisted of restating the method rather than justifying their answers. [paragraph 54]

So this is fragile knowledge, engendered by rote learning and by practice solely on similar questions. One pupil describes how this shallow knowledge disappeared under the pressure of an exam:

An able pupil summed this up: ‘You need to understand and not just do it. You think you know how to do it but you get to an exam and you can’t. You realise that nobody’s told you why it works and why you do what you do, so you can’t remember it.’

(I wonder what the exams are like: do they have questions that can mostly be answered by using the rules and mnemonics, or do they ask questions that require the sorts of problem-solving that the earlier quote describes?)

## ICT

The report also mentions the use of ICT (i.e. computers) in maths. It turns out this hasn’t spread very far at all:

Carefully chosen practical activities and resources, including computer software, have two principal benefits: they aid conceptual understanding and make learning more interesting. Too few of the schools used these resources well… More generally, the potential of ICT to enhance learning in mathematics continues to be underdeveloped. [paragraphs 62 and 65]

But some systems were used successfully:

In a Year 6 class, pairs of pupils used computers to draw acute and obtuse angles. The software allowed them to draw an estimate for a given angle, for example, 170°, after which it told them what angle they had created, and allowed further improved angles to be drawn. This aided pupils’ conceptualisation of angles of different sizes. [paragraph 65]

There are various systems, such as GeoGebra, DrGeo and so on, which aid this sort of teaching, and they seem to be a sensible use of ICT.
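The feedback loop in that Year 6 activity is simple enough to sketch. Here is a minimal, text-based Python version of the idea (the function name and the wording of the messages are my own invention, not taken from GeoGebra or any other package): the pupil aims for a target angle, the program reports the angle actually drawn, and the pupil refines their estimate.

```python
def angle_feedback(target, drawn):
    """Compare a drawn angle (in degrees) to the target and describe the error."""
    error = drawn - target
    if error == 0:
        return f"Spot on: that is exactly {target} degrees."
    direction = "too wide" if error > 0 else "too narrow"
    return f"You drew {drawn} degrees: {abs(error)} degrees {direction} of {target}."

# A pupil converging on 170 degrees over three attempts:
for attempt in [150, 175, 170]:
    print(angle_feedback(170, attempt))
```

The pedagogical point is in the loop, not the arithmetic: the pupil gets immediate, quantified feedback on each estimate, which is exactly what the report credits with aiding pupils’ conceptualisation of angle size.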

## Too Little Problem-Solving

I recently wrote about problem-solving being a distinctive benefit of computer science. Ofsted contend that maths should also feature problem-solving, but that the opportunity is often missed:

Similarly, in most of the secondary schools visited, problem solving typically followed the acquisition of a new skill. For instance, having practised many routine questions on calculating missing lengths in given right-angled triangles using Pythagoras’ Theorem, pupils eventually move on to solving problems for example about ladders leaning against walls, before perhaps tackling a two-step problem such as calculating the area of an equilateral triangle of given side length. A distinguishing feature of the good teaching was that all the pupils in the class tackled a wide variety of problems. [paragraph 70]

Important weaknesses in the curriculum remain. Inspectors continue to be concerned about the lack of emphasis on ‘using and applying mathematics’… In the very best schools, ‘using and applying mathematics’ was integrated into day-to-day teaching. For example, new topics were introduced by presenting a suitable problem and inviting pupils to use their existing knowledge in innovative ways. More generally, the lack of emphasis on using and applying mathematics remained a weakness that is persistent. [paragraphs 100 and 103]

This lack of emphasis on application may relate to this problem:

It remains a concern that secondary pupils seemed so readily to accept the view that learning mathematics is important but dull. [paragraph 42]

This seems like an opportune moment to plug my other blog again, which provides examples of applying maths in games: exactly the sort of useful application of these techniques that the report finds lacking. Nice to know I’m at least thinking along the right kind of lines!

Meanwhile, the report goes on to have a dig at the curriculum designers (curriculum design being, I believe, the responsibility of the Department for Education, which is separate from Ofsted) for not listening last time round:

These criticisms are not new. Much the same was said in the previous report. In many secondary schools at that time, pupils’ experience of using and applying mathematics was largely restricted to GCSE coursework tasks. Since then, coursework has been abolished in GCSE mathematics, with the result that the schemes of work in some of the schools gave teachers no guidance at all on teaching pupils to use and apply mathematics. [paragraph 104]

## General Teaching

There are a few other observations on teaching, including a mention of the importance of observing the students:

Teachers generally circulated to observe pupils as they worked independently or in groups. This is an important improvement since the previous survey. It has become relatively rare to see secondary teachers rooted to the front of the class or primary teachers remaining entirely with one focus group while the other groups work on set tasks. The more effective teachers used the information that they gathered through monitoring pupils’ progress and understanding to support their learning and to adapt the lesson to meet emerging needs. Crucially, they made quick sweeps of the class to check on every pupil before deciding where and how to intervene. [paragraph 81]

Also, the usefulness of starting by establishing what students already know:

Skilled teachers made good use of starter or introductory activities to establish how much pupils already knew about a topic, using the information to tailor their subsequent teaching… This type of assessment requires a high degree of skill: it was done consistently well in few of the schools. [paragraphs 82 and 83]

## Example Design

I found this example interesting:

One question was ‘What is the first number in the sequence 3n+1?’ A pupil correctly answered ‘4’ and the teacher praised him and moved on to the next question. However, this answer could be derived incorrectly by ignoring the n in the expression 3n+1 and simply adding 3+1. The teacher did not check that the pupil knew to substitute 1 for n… When asked to find the value of an expression such as 3n+1, pupils who do not understand algebra often ignore the variable n and just calculate using the numbers, 3+1 in this case. This means that the correctly calculated answer for the first term is indistinguishable from the incorrect, as will always be the case for the first term of such a sequence. Asking the pupil how he worked out the answer, and going on to check the value of the next few terms would ascertain whether the pupil understood fully. [paragraph 84]

This reminds me a bit of designing test cases. The fastest way to avoid the problem would seem to me to be to design the question so that the correct answer could not also be derived by ignoring the n. As the quoted passage notes, the first term of such a sequence can never do this (for 3n+1, both the correct substitution and the mistaken 3+1 give 4), so the question needs to target a later term: the second term of 3n+1 is 7, whereas ignoring the n still gives 4.
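The analogy to test cases can be made concrete. In this sketch (the function names are my own), the pupil’s ignore-the-n mistake plays the role of a buggy implementation of a sequence a·n + b, and a question is a good “test case” only if it can tell the bug apart from the correct method:

```python
def correct_term(a, b, n):
    """The n-th term of the sequence a*n + b, substituting n properly."""
    return a * n + b

def ignore_n_term(a, b, n):
    """The common mistake: ignore the variable and just combine the numbers."""
    return a + b

def question_detects_bug(a, b, n):
    """A question exposes the mistake only if the buggy and correct answers differ."""
    return correct_term(a, b, n) != ignore_n_term(a, b, n)

# For 3n + 1, asking about the first term cannot expose the mistake
# (both methods give 4), but asking about the second term can (7 vs 4):
print(question_detects_bug(3, 1, 1))  # False
print(question_detects_bug(3, 1, 2))  # True
```

Viewed this way, the teacher in the report accepted a passing result from a test case with no discriminating power; checking the next few terms, as the report suggests, is simply running better tests.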

## Summary

There are more sections in the report which are interesting but which I’ve left out to avoid this taking all day (e.g. the section on marking). I found the report interesting from a maths perspective, but also from a computing perspective. Some of the recommendations made in the report already seem familiar from teaching computing: the focus on problem-solving, and group work with teacher observation. But do we stand to suffer from some of the same problems mentioned in the report, such as a focus on fragile rote learning at the expense of deeper understanding?

I’ll end with a rather troubling quote from the report about pupil progression in mathematics:

Children’s varying pre-school experiences of mathematics mean they start school with different levels of knowledge of number and shape. For too many pupils, this gap is never overcome: their attainment at 16 years can largely be predicted by their attainment at age 11, and this can be tracked back to the knowledge and skills they have acquired by age 7. [Key Findings, page 8]