Much of the argument for computing education (at least in the UK) has centred on the benefits of the abstract skills that computing can confer. Often bundled under the buzzword “computational thinking”, these include algorithm construction, debugging, mental models of execution and so on. A key question is: how can we best instil these abstract reasoning skills?
One place to find answers to this may be the work of Daniel Willingham, an education researcher who is quite popular among UK teacher-bloggers. I’ve just finished Willingham’s 2009 book “Why Don’t Students Like School?”. The title is a bit of a misnomer; essentially, the book presents very readable arguments about education backed by findings from cognitive psychology. This post is my attempt to map those findings onto computing education.
I don’t think anyone is pushing the teaching of programming so that we can get children to learn Java syntax. We’re not aiming for basic knowledge, we’re aiming for conceptual understanding and abstract skills. The only question is how we can teach those skills. Willingham offers his argument:
“Factual knowledge must precede skill” [page 25]
Willingham offers a series of convincing arguments [chapter 2] for why factual knowledge is necessary for critical thinking. One argument is that this knowledge reduces working memory load, freeing up capacity for higher-order thinking. If young pupils are still struggling to read, they won’t have much room left for further thinking about a text. This is not too surprising, but it backs up what we observe with a cognitive explanation. Transferring this finding to programming, students need to master some of the basic facts of programming — general syntax, the meaning of constructs (e.g. if statements) — to free up working memory for thinking about constructing algorithms.
Another reason for why factual knowledge must come first is that it provides a domain in which to think. For example, history aims to teach critical thinking and analysis skills, but Willingham argues that these cannot be developed in a vacuum — the students first need to learn the details about a given historical period or conflict before they can start engaging in critical thinking about that period. Doing this for several different historical events will start to build up the abstract, transferable skill. Thinking of this in terms of computing, it seems to me that if we want to teach computational thinking, the obvious domain in which to acquire these skills is programming. We should not fear getting bogged down in programming; it is not a swamp, but rather the foundation for further skills.
This blog post mentions a teacher-trainer who suggests “teachers shouldn’t be taught programming via a specific programming language such as Python but be taught the key concepts behind programming principles in a not language context”. While the key concepts are the end goal, what Willingham suggests is that you can’t just teach the key concepts directly. As the blog author begins to suspect, you need a programming language in which to learn the concepts.
In chapter 4 of the book, Willingham discusses ways to build abstract skills, the answer being essentially to offer students many concrete examples from which they can then build an abstract principle. However, in computing, the concrete/abstract distinction can get a bit more complicated — computing being a discipline particularly centred on abstraction. Read the following tasks/concepts and try to classify them as concrete or abstract:
- Write a program that given a list of sales totals for twelve months, produces the yearly total.
- `int total = 0; for (Integer i : list) total += i;`
- Write a loop that processes each element in turn, adding each element to the total, then return the total.
- Write a program that produces the sum of a given list of integers.
- A fold is a higher-order function that combines the elements of an ordered sequence into a single result, using a given combining operation.
There is clearly some sort of scale, and it’s not immediately obvious which of these are concrete. Is the only concrete example the first one? There’s clearly a lot of abstraction to be found in computing, but should we stay away from it at first, and remain as close to the concrete as possible?
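To make that scale tangible, here is a sketch of my own (not from the book) showing the same sum-of-monthly-sales task written at two points on it in Java: the concrete explicit loop from the list above, and the more abstract version expressed as a fold via the stream API’s `reduce`:

```java
import java.util.List;

public class SalesTotal {
    // Concrete level: an explicit loop over the monthly totals.
    static int totalLoop(List<Integer> monthlySales) {
        int total = 0;
        for (Integer i : monthlySales) total += i;
        return total;
    }

    // More abstract level: the same sum expressed as a fold (reduce),
    // with 0 as the starting value and addition as the combining operation.
    static int totalFold(List<Integer> monthlySales) {
        return monthlySales.stream().reduce(0, Integer::sum);
    }

    public static void main(String[] args) {
        List<Integer> sales = List.of(3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8);
        System.out.println(totalLoop(sales)); // prints 52
        System.out.println(totalFold(sales)); // prints 52
    }
}
```

Both versions compute the same result; the difference is how much of the mechanism (the accumulator variable, the iteration order) is visible to the learner, which is exactly the dimension the list above varies.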
I’m not going to write a full review of Willingham’s book, but I’d say it’s worth reading if you’re interested in education. It is very readable and pulls off that rare trick where everything it says seems so obvious that you’re not sure whether it was pointless reading it, or whether it has so smoothly convinced you of its arguments that you didn’t even notice it happening.
Computing faces many of the same challenges as any other school subject: students are in danger of learning shallow knowledge, deep knowledge will take time and effort, not all of the students will achieve deep knowledge, many students will get bored in lessons and many may fall behind. These should not be seen as dangers specific to computing or to programming, but rather the standard (very tough!) challenges of school education. Willingham’s book offers some food for thought, but there are few definitive answers in education, only guiding principles. I am convinced of this much, though: programming is still absolutely central to the teaching of computing.