Thu, Nov 24, 2005

Nat Torkington

Burn In 8: Robert Lefkowitz (the R0ml)

This is the ninth entry in the O'Reilly Radar series about how alpha geeks got into computers. Robert Lefkowitz has driven IT on Wall Street, managed massive open source installations, and now works for a startup doing startupy things. He's a hilarious presenter and has often spoken at OSCON.

Robert Lefkowitz's Story

Between my junior and senior years in high school -- that would have been in 1969 -- I was admitted to an NSF summer program at Manhattan College in New York -- in which we studied Linear Algebra, Atomic Physics, and Computer Programming. Computer Programming consisted of writing programs in Fortran II -- and walking the punched card decks down to the data center window, where I believe they were run on a CDC machine. My project was a program to compute the inverse of a large matrix. Every run would result in pages of gobbledygook being printed out until the data center operators canceled the job. They told me to stop submitting jobs until I fixed the problem. Unfortunately, I couldn't figure out what was wrong. My instructor was less than helpful -- he looked over the program and opined: "Looks good to me. Beats me why it does that." No joy there.

That year -- the Math department at the high school acquired an Olivetti Programma 101 (http://www.science.uva.nl/faculteit/museum/Programma101.html). My project there was to write a program to calculate pi. Other students assigned this problem wrote the naive algorithm -- after several days of chunking away, it converged to 3 digits of accuracy. I took a different approach -- and used the arctangent algorithm. Unfortunately, the program wouldn't fit in the available memory. However, I recall that the D, E, and F registers could be split -- half the bytes being usable for instructions -- which halved the precision of the calculations. Even with this trick, I came out a few bytes short -- and had to spend several days looking for optimizations to shave those few bytes so that the program could run. When I finally figured it out -- it converged to 7 digits within seconds -- that being the maximum precision possible with the split registers.
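
He doesn't say which arctangent identity he used -- Machin's 1706 formula is the classic choice -- but a quick sketch in Python (standing in for the Programma 101 code) shows why the approach wins so decisively over the naive series:

    # A minimal sketch, assuming Machin's formula as the arctangent
    # approach; the naive algorithm is taken to be the Leibniz series.
    from math import pi

    def arctan_series(x, terms):
        """Taylor series: arctan(x) = x - x^3/3 + x^5/5 - ..."""
        total, power = 0.0, x
        for n in range(terms):
            total += (-1) ** n * power / (2 * n + 1)
            power *= x * x
        return total

    # Naive: pi = 4 * arctan(1) -- a thousand terms for ~3 digits.
    naive = 4 * arctan_series(1.0, 1000)

    # Machin: pi = 16*arctan(1/5) - 4*arctan(1/239) -- a handful of
    # terms already give better than 7 digits.
    machin = 16 * arctan_series(1 / 5, 6) - 4 * arctan_series(1 / 239, 2)

    print(naive, abs(naive - pi))      # 3.140... error ~1e-3
    print(machin, abs(machin - pi))    # 3.1415926... error ~1e-9

Trading a slowly converging series at x = 1 for rapidly converging series at small x is the whole trick: each term of the arctan(1/5) series shrinks by a factor of 25.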

In 1970, I started at MIT -- planning to become a nuclear engineer. However, my freshman seminar was with Professor Fredkin -- who ran Project MAC. Part of the seminar involved a briefcase-sized kit of digital components -- we could implement various algorithms by wiring them together appropriately. Projects included things like a random music generator and a seven-day alarm clock.

At one point, as my interest in the Japanese game of Go increased, I was discussing it with Professor Fredkin -- and the question of complexity (versus chess) came up. I had seen estimates of the number of possible chess games -- in the 10^70 to 10^120 range -- and it was clear that the number of possible games in Go (barring symmetry) was 361 factorial. Fredkin swiveled his chair -- typed a factorial program into Multics LISP, and ran (fac 361) -- which immediately filled the screen with digits. Then he did it again, but asked for the log (base 10) to get the order of magnitude. I was impressed. Even now -- 35 years later -- I use this simple test to assess the expressive power of a programming language -- most fail. Only within the last few years have languages (other than LISP) appeared which can solve this problem on the first try.
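
The test is essentially a check for native arbitrary-precision integers: 361! is an exact 769-digit number, so any language whose integers silently overflow or round off to floats fails on the first try. A sketch of what that session looks like in Python -- one of the newer languages that passes -- with the function name fac taken from the story and the rest illustrative:

    # Fredkin's test: does (fac 361) just work?
    import math

    def fac(n):
        return 1 if n < 2 else n * fac(n - 1)

    x = fac(361)
    print(x)                 # an exact integer -- fills the screen
    print(len(str(x)))       # 769 digits
    print(math.log10(x))     # order of magnitude: roughly 768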

I changed my major from nuclear physics to computer programming -- and wound up taking a number of classes in computational theory and computer language design and boolean algebra and the like. However, there were no classes offered by the computer science department which involved using a computer. They all involved what was called "blackboard evaluation" -- proving the correctness of various algorithms. The exception was the course on operating system design taught by John Donovan -- in which we learned PL/I and had to write a lexical analyzer, parser, and interpreter for a small subset of PL/I -- done with punched cards and run on an IBM System/360.

But the event that was probably most responsible for my eventual career as a programmer (although I didn't know it at the time) was the introductory course in electrical engineering (6.01) taught by Professor Paul Penfield. Penfield gave us all APL accounts on the IBM mainframe -- and also homework assignments involving circuits with so many components that the solutions could not be computed by hand -- we had to learn enough APL to solve the problems.
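
The assignments themselves aren't described, but circuit problems of that size typically reduce to a system of linear node equations, G.v = i -- exactly the kind of array problem APL dispatches in a line or two. A hypothetical sketch in Python with NumPy, with the circuit values invented for illustration:

    # Nodal analysis: solve G.v = i for the node voltages of a small
    # resistor network. Three nodes are easy by hand; a few hundred
    # are not -- which was the point of the assignments.
    import numpy as np

    G = np.array([[ 0.5, -0.2, -0.1],   # conductance matrix (siemens)
                  [-0.2,  0.7, -0.3],
                  [-0.1, -0.3,  0.6]])
    i = np.array([1.0, 0.0, -0.5])      # injected currents (amperes)

    v = np.linalg.solve(G, i)           # node voltages (volts)
    print(v)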

When I started looking for work in 1974 -- I interviewed at IBM Sterling Forest -- and they asked if I knew APL. "Sure," I replied -- I had, after all, learned a bit a few years earlier in order to solve some circuit problems. They scheduled a follow-up tech interview -- so I ran out, bought the Principles of Operation for APL -- and crammed for a week to actually learn the language. I got the job at IBM as an APL programmer. It was a summer job -- but my first full-time job was at an APL timesharing company -- and eventually, in 1981, I wound up at Morgan Stanley -- as they had a huge commitment to the APL language and environment.

One last note -- in 1975 -- when I started at TCC (the timesharing company), we had the opportunity to examine the IBM 5100 -- a desktop computer that IBM sold (six or seven years before the thing we call the IBM PC came out). It was an all-in-one unit. It didn't have a disk drive -- it had a cassette tape interface. The machine had a rocker switch -- it could be booted in either APL or BASIC -- those being the two standard languages for small computers. We, of course, were interested in the APL. It was a faithful replica of the System/370 APL that ran on the mainframe -- down to all the obscure bugs we knew of. But of course, we all agreed that it made no sense to have a desktop computer -- after all, a large data center that one could access via the network gave one all the computing power one needed -- without the hassle of backups and other maintenance. Besides, there wasn't anything you could do on the desktop machine that you couldn't do on the mainframe. And the 5100 was very expensive. Eventually, of course, things changed.


Comments: 2

  james governor [11.24.05 06:48 AM]

love this feature to bits. but what the heck were you thinking posting them all at once? you get past the fourth or fifth story and it's like enough already. i appreciate i can come back. but you might want to think about leavening this content with a little something else.

  james governor [11.24.05 06:51 AM]

having said that it needs leavening - i am also wondering whether you might not eventually generate a blook based on this content. get all the alpha geeks in there, and you'd be cooking with gas.

we all remember our first experiences with computers and it's really nice to see other people's epiphanies.

computing as a spiritual act. and that is true even for gamma geeks.
