Download our updated Women in Data report, which features four new profiles of women across the European Union. You can also pick up a copy at Strata + Hadoop World London, where Alice Zheng will lead a session on Deploying Machine Learning in Production.
Lately, there has been a slew of media coverage about the Imposter Syndrome. Many columnists, bloggers, and public speakers have spoken or written about their own struggles with it. And original psychological research has found that two out of every five successful people consider themselves frauds.
I’m certainly no stranger to the sinking feeling of being out of place. During college and graduate school, it often seemed like everyone else around me was sailing through to the finish line, while I alone lumbered under the weight of programming projects and mathematical proofs. This led to an ongoing self-debate about my choice of major and profession. One day, I noticed myself reading the same sentence over and over again in a textbook; my eyes were looking at the text, but my mind was saying, “Why aren’t you getting this yet? It’s so simple. Everybody else gets it. What’s wrong with you?”
When I look back upon those years, I have two thoughts: 1. That was hard. 2. What a waste of perfectly good brain cells! I could have done so many cool things if I had not spent all that time doubting myself.
But one can’t simply snap out of the Imposter Syndrome. It has a variety of causes, and it’s sticky. I was brought up with the idea of holding myself to a high standard and measuring my own progress against others’ achievements. Falling short of expectations is supposed to be a great motivator for action…or is it?
In practice, measuring one’s own worth against someone else’s achievements can hinder progress more than it helps. It is a flawed method. I have a mathematical analogy for this: when we compare our position against others’, we are comparing the current values of two functions. But where a function is headed is determined by its derivatives. The first derivative measures the speed of change, the second derivative measures how quickly that speed picks up, and so on. How much we can achieve tomorrow is determined not just by where we are today, but by how fast we are learning, changing, and adapting. The rate of change matters far more than a static snapshot of the current position. And yet we fall into the trap of letting the static snapshots define us.
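To put toy numbers on the analogy (the starting levels and growth rates here are made up purely for illustration): someone who starts behind but improves faster will, given enough time, overtake someone who starts ahead but improves slowly.

```python
# Two hypothetical learners: A starts ahead but improves 1% per week,
# while B starts at half A's level but improves 5% per week.
a, b = 100.0, 50.0
weeks = 0
while b <= a:
    a *= 1.01  # A's skill compounds slowly
    b *= 1.05  # B's skill compounds quickly
    weeks += 1

print(weeks)  # B overtakes A in 18 weeks
```

The snapshot at week zero says A is twice as good as B; the derivatives say B wins in a few months.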
Computer science is a discipline where the rate of change is particularly important. For one thing, it’s a fast-moving and relatively young field. New things are always being invented. Everyone in the field is continually learning new skills in order to keep up. What one knows today may become obsolete tomorrow. Those who stop learning, stop being relevant.
Even more fundamentally, software programming is about tinkering, and tinkering involves failures. This is why the hacker mentality is so prevalent. We learn by doing, and failing, and re-doing. We learn about good designs by iterating over initial bad designs. We work on pet projects where we have no idea what we are doing, but that teach us new skills. Eventually, we take on bigger, real projects.
Perhaps this is the crux of my position: I’ve noticed a cautiousness and an aversion to failure in myself and many others. I find myself wanting to wrap my mind around a project and perfectly understand its ins and outs before I feel comfortable diving in. I want to get it right the first time. Few things make me feel more powerless and incompetent than a screen full of cryptic build errors and stack traces, and part of me wants to avoid it as much as I can.
The thing is, everything about computers is imperfect, from software to hardware, from design to implementation. Everything up and down the stack breaks. The ecosystem is complicated. Components interact with each other in weird ways. When something breaks, fixing it sometimes requires knowing how different components interact with each other; other times it requires superior Googling skills. The only way to learn the system is to break it and fix it. It is impossible to wrap the mind around the stack in one day: application, compiler, network, operating system, client, server, hardware, etc. And one certainly can’t grok it by standing on the outside as an observer.
Further, many computer science programs try to teach their students computing concepts on the first go: recursion, references, data structures, semaphores, locks, etc. These are beautiful, important concepts. But by themselves they are abstract and inaccessible, and they don’t teach students how to succeed in real software engineering projects. In the courses I took, programming projects made up a large share of the work, but they were included as a way of illustrating abstract concepts; one still needed to wade through the concepts to pass the course. In my view, the ordering should be reversed, especially for beginners: hands-on practice with programming projects should be the primary mode of teaching, with concepts and theory playing a secondary, supporting role. It should be made clear to students that one needn’t master all the concepts before writing a kickass program.
In some ways, all of us in this field are imposters. No one knows everything. The only way to progress is to dive in and start doing. Let us not measure ourselves against others, or focus on how much we don’t yet know. Let us measure ourselves by how much we’ve learned since last week, and how far we’ve come. Let us learn through playing and failing. The imposter syndrome can be a great teacher. It teaches us to love our failures and keep going.