Our methodology

Nat Torkington | Thu, Dec 20, 2007

Thomas Lord posted an interesting question in a comment on one of my recent posts: "I have a question about how the 'Radar' works. Are you tracking Erlang? Or following the broader trend around the pi calculus? Is Erlang interesting to you as a technological idea? Or as a particular product?" I realized we haven't talked much about what we actually do here, so I thought today I'd take the time to do just that.

Dale and Tim have great noses for the future, and were right in the thick of things with the commercialization of Unix, the popular uptake of the Internet, Open Source, Peer-to-Peer, and Web 2.0. Their methodology is pretty simple: in the hacks, research, and startups of alpha-geeks, we can often catch early glimpses of what will later be mainstream products or trends. So when Dale saw Pei Wei working on an X11 viewer for this thing called the "World Wide Web", he thought "everyone can use this". The O'Reilly Radar is an attempt to scale this beyond Tim and Dale.

So we see trends like Web 2.0, the growing need for concurrency, ubiquitous machine learning, and the importance of operations. We look to see what alpha geeks are doing in those spaces, find the bits that resonate, hold them up and say "this is what the future holds". We do this on the Radar blog, in Release 2.0, in research reports, in conferences like Velocity, and in the talks we give.

O'Reilly's business model is obviously predicated on this kind of future thinking—typical animal books take nine months to hit the shelves, and it's hard to launch a conference with a shorter lead time (unless you're Dave McClure!). You might think we'd keep the best ideas for ourselves and publish the rest, but we don't. We hope the rest of O'Reilly listens to what we say, but we don't run the conferences or books departments ourselves. As Marc Hedlund is fond of quoting, "Don't worry about people stealing your ideas. If your ideas are any good, you'll have to shove them down people's throats".

Our process isn't scientific research, where you come up with a hypothesis and then conduct experiments to disprove that hypothesis. What would an experiment to disprove the hypothesis "concurrency is moving from a niche to the mainstream but it's still largely an unsolved problem" look like? We try to quantify trends wherever we can, but at its heart this is an attempt to train and employ our instincts.

So Erlang and Haskell are interesting to us because we see alpha-geeks learning and playing with them, and because they have a "we make parallel code easier" story that fits the trend we see of people struggling to figure out how to take advantage of multicore systems. We look for data points like "Amazon built SimpleDB in Erlang" that would confirm the hypothesis "Erlang can be used in the mainstream", and we also look for failures that might disprove that hypothesis. Such a failure might be "we built this in Erlang but couldn't keep a team together to run it, so we had it rewritten in C++". In this mindset, Yahoo! Store is a failure for Lisp and not a success (sorry, Paul!).
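For the curious, the "parallel code easier" story fits in a few lines. Here's a minimal sketch in Haskell (a toy of my own, not anyone's production code; it assumes GHC with the parallel package, compiled with -threaded and run with +RTS -N): the par and pseq combinators let you ask the runtime to evaluate the two recursive calls on separate cores, with no threads or locks in sight.

    import Control.Parallel (par, pseq)

    -- Naive Fibonacci with the two recursive calls sparked in parallel.
    -- "par" says "you may evaluate x on another core"; "pseq" forces y
    -- before the addition so the two evaluations actually overlap.
    pfib :: Int -> Integer
    pfib n
      | n < 2     = fromIntegral n
      | otherwise = x `par` (y `pseq` (x + y))
      where
        x = pfib (n - 1)
        y = pfib (n - 2)

    main :: IO ()
    main = print (pfib 30)

It's a trivial example, but it's exactly the kind of "look how little ceremony that took" demo that gets alpha-geeks excited about these languages.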

That's why we don't try to break every story. We leave that to our friends at ReadWriteWeb and TechCrunch. As Tim's said on many occasions, we "amplify the faint signals of the alpha-geeks." It's fun and we get to meet interesting people and think about the way things should be instead of the way they are. Hope that answers your question, Tom!


tags: backstory

Comments: 7

  dave mcclure [12.20.07 04:51 AM]

in other words: hang out with geeks, drink beer / eat grub, and at the end of the evening, watch what they crap out the other end... and then some of that shit ends up turning into gold ;)

(great post nat)

- dave mcclure

ps - this time, we're taking a *leisurely* 5 months to put together the next Graphing Social Patterns West conference:
http://conferences.oreilly.com/gspwest

  gnat [12.20.07 01:26 PM]

@dave: sssh, it only *looks* like an expense account with email! Really it's MUCH MORE than that :-)

  Hoàng Đức Hiếu [12.20.07 04:36 PM]

Graham can still refer to Greenspun's Tenth Rule when telling the story of Viaweb, though; see the first footnote of Beating the Averages: http://www.paulgraham.com/avg.html

  Thomas Lord [12.20.07 10:33 PM]

Well, shucks! Thanks, man!

One thing I note about how Radar works is that, whether it's a dust-up over codes of conduct or a question like mine that happens to tickle the right rib, the editorial thinking is thoughtfully responsive (though not often so darn direct!). Flattery should get me nowhere, though, so, to the topic:

Maybe this is the best way to put it, to keep things interesting -- I'll tell you about one of my failings as an engineer (because then I'll say "it's not just me").

I find it damn hard to talk honestly and straightforwardly and informatively and not misleadingly (all at the same time) about software. It's hard to make accurate statements: vocabulary is lacking, and when engineers talk about hard practical problems among themselves there's an awful lot of pointing, grunting, and gesticulating: pretty hard to digest into an executive summary.

As a result, our industry operates on the basis of a lot of polite fictions.

An example of such a fiction is the superstition of language choice. You mention a "data point" example of a hypothetical project that started in Erlang but floundered until rewritten in C++. Heck, maybe there's even a little temporal cluster of those stories with 10 near-simultaneous failures in 10 unrelated firms. Clearly there's some non-random event being reported in the echo chamber, but what does it actually mean?

Taking a "data point" like that: remind me again why the programming language change is the main thing to look at? Did the manager change? Staff turnover? Requirements redefinition? Methodology shift? Brain-drain of key Erlang programmers? Flood of cheap C++ programmers? Or, the upstream Erlang implementation supplier neglected an important platform? Or, is that the ability to create an inline virtual private overloaded operator turned out to have much more value than the ability to pass a channel over a channel?

How did the language choice become the story when it's one of the least likely suspects? Perhaps because it's an inanimate object, ready at hand to deflect blame towards?

But then filter those anecdotes through futurists: they rightly note trends, but the interpretation of those trends by the consumers of the futurists is pretty far removed from where we started. Like, for example, the theory that programming language choices matter much gains strength and creates new markets just on the basis of the rumor.

So, I find myself sort of stuck always wanting to say an ambiguous and self-contradictory thing to O'Reilly: You aren't predicting the future so much as you're creating it; I'm certain your way of slicing through the tech for a perspective is distorting (e.g., when the degrees of freedom around Web 2.0 privacy issues come up); and at the same time, and to be absolutely clear about this: WTF (pardon me, but...) WTF do *I* know any better.

Really, I can poke holes in what I infer to be more or less the model of the technological landscape that you promote. And some of those flaws I see I really do want to point to, because of how I think their promotion plays out in the markets. But... WTF do *I* know any better: I'm at best extremely tentative in any positive suggestions of a better way to organize understanding and guess at the future and so forth. It's not like I have a better idea than "what you guys do" -- just some good ideas about the need to morph things a bit and maybe do things to better effect. Somehow.

But, anyway, the simple journalistic refinement of my question in light of your elaboration is to ask how it is that language choice came to be the key element of the data-point stories you mention. Why language choice when so many other variables are in play?

I understand the answer to be, in part, "well, blame the alpha-geeks, we're just reporting what we can gather from them." Well, yes. It is very hard to talk about software honestly. Fair cop.

-t

  Thomas Lord [12.20.07 11:29 PM]

That was off the cuff (on purpose) but I want to clarify one thing that might be unclear because of its off-the-cuffness. When I say I find Radar's editorial voice (so to speak -- it's not a monolith at all) "responsive" I don't mean simply to issues I poke my head in on. Not at all. I mean O'Reilly *is* a kind of organic implementation (best kind!) of Web 2.0 principles, and they are responsive to an audience who they really do go out and meet. Again, flattery should get me nowhere, but... you guys being in the conference business seems from this distance to be something you do good things with -- there's some "depth" to your perspective that validates it -- the benefits of lots of well-deserved good will.

So, I'll shut up now, since I'm embarrassing both of us :-)

-t

  Tim O'Reilly [12.22.07 08:37 AM]

Tom --

To your point about how naming something shapes our perceptions and thus helps to create reality, I have to confess that that's an explicit goal of Radar. We do definitely report on what we see, but there's a kind of advocacy too, to help people see what we see.

Take open source. One of the things that led me to convene the meeting that came to be called "the open source summit" was that I saw that the current "free software" story left important bits out. Because their story was focused on the GNU project, all they ever talked about were Linux, GCC, Emacs, and the various GNU tools, ignoring amazing (and already far more mainstream) tools like BIND, Sendmail, and Apache, which were more aligned with the Berkeley Unix tradition. I explicitly set out to change the story. Ditto when everyone thought P2P was just about file sharing. We saw that it was the first sign of the internet as a new kind of platform, and began telling that story (which eventually ripened into the Web 2.0 story). See my essay Remaking the Peer to Peer Meme for an account of this aspect of how we work.

But it isn't just advocacy. There's a long linguistic tradition that notes how language shapes thought. I wrote a bit about that in the introduction to one of our direct mail catalogs back in 2000:

"Benjamin Whorf's famous book Language, Thought and Reality explores the way that the words we use shape the way we think, and even the reality we perceive. Walk through a meadow with a botanist or a farmer, and she will not only be able to name the dozen different types of grasses that make up a typical mix, but she will see them. Most of us go through life without words for many of the phenomena that surround us, and as a result, our ability to perceive them, think about them, and ultimately manipulate them is greatly reduced. In many ways, the history of civilization is the history of language. At bottom, language is a technology that helps us to perceive and manipulate reality."

(I went on from there to talk about Perl. Here is the entire piece:
http://tim.oreilly.com/articles/perl.html )

I'm also very fond of George Soros' formulation of a related idea, which he calls "reflexive knowledge," or "things that are neither true or false but become so because people believe them." He points out that many of the most important human phenomena -- history, politics, markets -- are dominated by this kind of reflexive activity.

Or take my favorite poet, Wallace Stevens. He wrote extensively about the relationship of thought and reality, and concluded that perhaps what we seek for as "fact" or "truth" should instead be considered as an aesthetic competition to construct worlds that each other can believe in.

As to the specific point about choice of language, it is in fact one of those "reflexive knowledge" choices that is part advocacy, part accident, and part adaptive response to a particular technological environment and effectiveness in that environment (e.g., Ruby (on Rails) is better adapted to building web applications than Fortran is).

  Thomas Lord [12.22.07 12:25 PM]

I like that. So, one way to summarize an aspect of it is to say that "creating reality" through things like journalism and publishing is unavoidable, necessary, and indeed primary. Thus, it is no criticism to say "that is what you are doing -- creating rather than reporting a reality" because, in reality, no journalists, conference organizers, or publishers could do differently -- it's incoherent to suggest so.

It's good, I think, to be able to reflect on those facts in a first-order way -- to begin to have words for them -- in the public conversation. Not that the facts were deliberately hidden before and are now, here, magically revealed -- just that evolution in self-awareness (even of a group, like radar + audience) can be productive.

It isn't a criticism to say "You're creating reality within our industry!" but it can be a legitimate criticism to say "The particular reality you are creating is problematic in the following ways...."

So, on programming language issues... I have a hypothesis about the dynamics here.

Programming languages are one of the agreed upon units of modularity. That is, people have pretty universally shared expectations about at least the rough shape of the API to a programming language implementation and they are semi-commodities: you can swap them in and out, in some sense.

Because they can be commodified, and because each implementation tends to be the kind of program that a small team can maintain with only modest resources, a lot of trade arises around them. Books, conference sessions, support contracts, sourceforge categories, etc.

Now, it's objectively true that in careful experiments you can prove that some languages are better than others for some things (your Fortran for web programming is a fine example). But in this enticing but somewhat crowded market you get lots of people in the business of selling their particular language, and the cumulative cacophony amounts to a pretty gross exaggeration and distortion of what's at stake in programming language choices.

We're swapping the wrong kinds of stories, in this part of the echo chamber, in my opinion, based on what I remember of the industry in the late 1980s.

We swap stories about how Python adoption waxes or wanes. We swap stories about how to get started quickly with RoR, whatever. These are nothing like the stories I remember from engineering school where the best tales were of systems simply and robustly built, analyzable, flexible, and elegant. The design of a new programming language for some component was, as much as anything, likely to be just a "subsection" in the overall description.

Methodology, systems thinking, however you want to put it... those things you can find in the original Unix tradition (or the Lisp Machine tradition) -- those skills are harder to commodify, harder to sell, harder to find words for, and harder to create reality around.

They are much more important, though.

-t
