I’ve recently seen discussion of self-taught vs taught programmers, with assertions that one is better than the other: that self-taught programmers are generally better, or that formal teaching makes a noticeable difference. In this post I want to examine some of the issues around this self-taught vs taught distinction: why self-taught seems positive, but also some of the problems it brings with it.
The whole taught/self-taught distinction is of course a false dichotomy — try to label the self-taught and taught programmers among the following hypothetical programmers:
- Anna started coding at age ten, continued to program heavily in her spare time until she studied computer science at university at eighteen, and became a software developer thereafter.
- Bill started coding at age thirteen, developed his own apps at seventeen, and became a self-employed app developer at eighteen, never going to university.
- Cecilia began coding at university during one module of her physics degree, and continued to program in her subsequent job in data analysis.
- Dave studied computer science at university at eighteen, having done no programming before, and became a software developer thereafter.
- Emma began coding at age eighteen on a computer science degree, left and became a project manager, but switched to be a software developer ten years after her degree, having to re-learn almost all of the skills that she had forgotten.
So, what distinguishes the taught from the self-taught? Bill and Dave are obvious cases, but it’s not clear-cut for the rest. Many people, especially in computing (because it is often not offered at schools, at least in the UK), are a mix of taught, self-taught, and learned-on-the-job. Perhaps a useful definition of self-taught, at least for this post, is “learned to program before undertaking formal study, if any”.
So if your ultimate motivation is to find the best programmer, which of the people above would you pick? I think I’d instinctively favour Anna or Bill — and these are the more self-taught programmers. But there are two enormous confounds when considering taught vs self-taught programmers, especially when looking at hiring around the age of 18–21.
The first confound is enthusiasm for the subject. Those who are exposed to programming at a young age and like it often continue to program in their spare time (but see discussion of gender below), until they get a chance to study the subject properly — typically at university. So often when we think of programmers who taught themselves before eighteen, we are thinking of those who really enjoy programming. It’s no surprise if these programmers turn out to be better than, say, someone who takes computer science at university without knowing what programming is like and who turns out not to enjoy it very much.
The second confound is experience. If you’re looking at someone who is self-taught before their degree vs someone who only has a degree, there is a difference in experience, which is particularly telling when hiring recent graduates. Anna in our examples above has eight years more programming experience than Dave. I think this is a problem overall: those who didn’t get exposure to programming before university are forever playing catch-up to those who have been programming for years beforehand. I’ve seen some evidence of this being quite disheartening at the beginning of a degree: someone turns up to university and starts trying to understand the concept of a variable, and they end up pair-programming in their first class with someone who can solve the problems in thirty seconds because they’ve been programming for years. I suspect this comparison effect may disadvantage those who are not self-taught.
(I’m mainly focusing here on how someone who is self-taught and has a degree compares to someone who only has a degree. The discussion of whether a degree is worthwhile is a separate argument.)
There are a couple of unfortunate issues around self-taught programmers, and the dual benefits of experience and demonstrated enthusiasm that they enjoy. In particular, I believe that the vast majority of self-taught programming teens are male. I don’t see this discussed much, but Emma Mulqueeny mentions it in her Guardian article:
Teaching yourself something that should really be covered as a part of lessons is a bit like doing extra homework – why, ask many teens, would anyone do that? There is no way the majority of hormonally challenged, desperate-to-find-their-place-in-the-world teenage girls would risk ridicule or isolation by doing such a thing – let alone be open and proud about it. (Boys of the same age have different social challenges and do not measure their societal worth so much by peer review.)
I’m not sure I completely buy this extra-homework argument. But self-teaching programming is typically a solitary experience, and I think it may well be more acceptable to peers and parents for boys to engage in a solitary non-social activity than for girls to do so. The point is: if self-taught programmers get a leg up through experience and demonstrated enthusiasm, I believe girls are missing out on these benefits, and this, along with negative preconceptions of programming, is part of the reason for the gender gap in programming.
My suggested solution to the issues surrounding taught vs self-taught programmers is simple: more computing in schools! This would take away the experience confound to some extent, and would hopefully also even up the gender balance by exposing all teens to computing (thus removing false preconceptions, especially the negative stereotypes held by girls) and allowing them to learn programming without this hypothesised teen-girl stigma of self-teaching. And how early? Well, Mulqueeny makes the case for an early start to address the gender issues:
So, make one simple change: teach programming in Year 5 and thereafter make it a relevant and necessary part of the curriculum. Then you’ll see the girls.
(For non-UK readers: Year 5 is ages 9–10.)