Last week, I admitted that one of my key programming experiences had been with a typewriter. I wasn’t programming the typewriter: I just stopped programming. A few people liked this, others wondered about it, and (since it’s August) it seems worth explaining.
I started programming in seventh grade, on a ZX81. It took years for me to figure out what I was really doing. (In sixth grade, I’d bounced off a guide that showed
A=A+1. I barely knew algebra, but I knew that was impossible.)
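For anyone who never hit this wall themselves: the confusion comes from reading `=` as algebra's equality when, in BASIC and most languages since, it means assignment. A minimal sketch in Python of what that line actually does:

```python
# In most programming languages, "=" is assignment, not equality:
# it copies the value on the right into the name on the left.
a = 5
a = a + 1  # not a false equation; a now holds 6
print(a)   # prints 6
```

To someone who knows only algebra, "A = A + 1" has no solution; to the interpreter, it is simply an instruction to increment.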
A few years later, I’d mostly mastered the many corners of Applesoft BASIC, memorized more of the Beagle Brothers chart than was wise, and done basic work with 6502 assembly. I knew how DOS 3.3 worked, how hi-res graphics was organized, and much more I don’t currently use. (I did get out of the house sometimes, though.)
After a summer program on media arts, I gave up programming. My spaghetti code had tied itself in knots around me, and I was especially tired of debugging. I kept finding myself switching from complex approaches I coded myself to simpler things I could make work through a GUI on an Amiga, and it felt like I’d hit a wall.
For the next two years, my last two of high school and much of my first semester of college, I only used my computer for occasional games. Mostly, it stayed off and in the corner. I bought a Brother AX-20 electric typewriter, got used to writing in drafts, and stayed away from programming completely.
Something weird happened. Before, I’d really enjoyed feeling my brain work like code, accepting limited inputs, processing things as logically as possible, and yielding reliable results. That “Return” key in my head had been a powerful accent on my thoughts. Weirdest, though, was feeling the change happen over about the first six months. I could feel myself becoming less formally logical, somehow, but it was comfortable.
Starting with computers at an early age may have made it easier for them to shape the way I think, and stopping when I did makes it harder to separate what was me leaving the computer from what was just me in high school. However, when I get deep into computers again, I feel the old habits come back. (Except, fortunately, the Return key.) My impatience for illogical behavior climbs, my expectations about clean input make me reject too many things, and I just want to focus on a beautifully ordered flow of information going by.
Noticing those habits, and realizing how they change the way I interact with the world, makes it easier for me to step back. It also makes it easier for me to deal with people so deep in that worldview that they don’t
step back very often. Too often I encounter people who have mistaken the logic of the computer for the logic of the world, the way things should be. If we can just get enough into our computers, everything will work neatly…
Stepping back and feeling those changes has made it much easier for me to deal with people inside and outside of the programming flow. At a simple level, it makes it easier to do things like solicit and work with requirements for a project—taking requirements seriously usually means there will be extra work created by the mismatch of what people want and what computers are good at doing.
For me personally, feeling this divide directly has made it much easier for me to write and edit tutorials and references meant for readers of many types. I have sympathy for people on both sides of this, and want them to be able to cross these boundaries at will. That means figuring out how to explain things created by people deep in the programming logic to people who aren’t. It’s not as obvious as those on either side of the divide would like.
If you can, step back. Take some breaks from programming and computers, especially extended breaks. Spend a little time figuring out how you change when you’re programming, and when you’ve been away from the computer for a while. Maybe it won’t be as dramatic as it was for me, but I suspect you’ll feel things change. I doubt it will hurt your programming in the long run, and it may well help you at the critical task of applying computer capabilities in a human world.
Have you felt these changes in the way you think? Or am I overstating it?