Tue, Dec 19, 2006

Tim O'Reilly

Second Life, IBM, and the Cell Processor

In response to Steven Shankland's article about IBM starting a "Second Life business unit", Maris Fogel made some interesting observations on Dave Farber's IP list about the connection between IBM's Cell processor and the new Second Life unit:

A very interesting development. I believe that IBM can see a ways into the future here, and that they are using an early market position to their advantage.

My thought is this: In the upcoming years IBM will see their revolutionary Cell processor chip in hundreds of millions of households around the world via game console systems from a variety of manufacturers. The realism of Second Life's graphics is rather low at the moment, but IBM's chip can easily overcome that. Virtual realms as per Second Life also require very high bandwidth to deliver the experience to the end-user, but if we look around the world, very high-speed broadband is becoming more available all the time.

So we have life-like real-time graphics rendering (via the Cell chip), climbing broadband Internet speeds (Japan?, Korea?), available on relatively inexpensive consumer devices, in a hundred million homes world-wide, within 3-4 years...

No wonder IBM thinks that virtual realms will take off in the future.

There are a few pieces that I do not know about, such as the human-machine interface technologies, but I suspect that if the Cell processor lives up to its claims then processing real-time video input from a digital camera will be simple, and that will allow for the display of facial expressions and body language (consumer cameras are at 10 megapixels now, and new models will focus on video capture quality because that is primarily how the younger generation uses them; convenient, eh?). At this rate an entire virtual industry could take off right under our noses :)

In a followup to the list, Amy Wohl disagreed:

It is provocative to think of IBM in the consumer virtual worlds business, but I keep remembering that IBM is no longer in ANY consumer businesses, by its own choice. Unless they are planning to change that strategy, I would think that this venture into the future is on another path.

Note that the IBM spokesman talks about "training, conferences and commerce" in the Shankland article -- B2B notions, I think.

On Irving Wladawsky-Berger's own site, he describes the Shankland interview and adds some comments of his own. Here there are references to "societal" possibilities for IBM and virtual reality -- the blog notes healthcare, training, and education among them.

Certainly IBM can always choose to go off in another direction -- $90 billion companies have deep pockets and lots of choices. But I think IBM's venture into virtual reality is about placing itself in a leadership position in what it thinks could be a new way of doing business. Of course, it is planning to learn about the opportunities as it tries out the new technology, and this may move it into more consumer-oriented markets.

Of course, both Maris and Amy could be right. Advanced processors will make Second Life-like services more, well, lifelike, and the virtual reality of the future will be much richer and more immersive than today's services. And that is irrespective of whether IBM actually wants to be in consumer businesses or not (which it likely doesn't, as Amy observes).

P.S. Irving's blog entry is well worth a read. He concludes:

It is important to remember that ten years ago many found the early Web-based applications rather weird and faddish, and not something that would likely take hold in business. Indeed some of the e-businesses that were created at the height of the dot-com frenzy were silly, and the marketplace dealt with them accordingly. But, I think that most will agree that the Internet and e-business have had a revolutionary impact on the world.

I am hoping that something similar will happen once again. I suspect that, over time, it will.

Me too.


Comments: 11

  rektide [12.19.06 05:31 PM]

ah a wonderful case of people prognosticating out of their rear ends.

the game servers themselves are fairly constrained at the moment. these run scripts which are actually compiled into mono (*cough* .net *cough*) programs. there are already utilization problems here, and plenty of potential for wonder technologies to come by and change everything. in this case though, it would require hacking mono to let it use the cell's SPEs. i haven't looked at the gcc-cell toolchain, but i suspect this would be an extremely long adventure that IBM would have to front, since off-the-shelf cpus cost about as much as crackerjacks, leaving little incentive for SL to switch on their own impetus.
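
For context, a minimal host-side sketch of what driving an SPE looks like with the Cell SDK's libspe2. The spu_kernel handle here is hypothetical -- it stands in for a separately compiled SPU ELF image -- and a Mono port would have to generate and manage something like this rather than simply JITting PowerPC code:

    /* PPE-side sketch: load and run one SPU program via libspe2.
     * Assumes the Cell SDK toolchain; "spu_kernel" is a hypothetical
     * SPU ELF image built with spu-gcc and embedded at link time. */
    #include <stdio.h>
    #include <libspe2.h>

    extern spe_program_handle_t spu_kernel;

    int main(void)
    {
        spe_context_ptr_t ctx = spe_context_create(0, NULL);
        if (ctx == NULL) {
            perror("spe_context_create");
            return 1;
        }
        if (spe_program_load(ctx, &spu_kernel) != 0) {
            perror("spe_program_load");
            return 1;
        }

        unsigned int entry = SPE_DEFAULT_ENTRY;
        spe_stop_info_t stop;
        /* Blocks until the SPU program halts; real code would run one
         * thread per SPE and feed it work via mailboxes or DMA lists. */
        spe_context_run(ctx, &entry, 0, NULL, NULL, &stop);

        spe_context_destroy(ctx);
        return 0;
    }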

as for game clients, cell is hardly going to provide much help in advancing their graphics state. cell is not a graphics chip. the ps3 has some nvidia kit for that. it can offload some of the vector math from the 3d processor, but cell alone is never going to bring you portable virtual worlds. last, i have no idea what cell's power consumption is, but i don't think it was meant for your cellphone.

what these um people are really talking about is DAMMIT (ati/amd) who are building opengl accelerated mobile cores. ti made some gurglings about opengl acceleration on the omap3, but i suspect it'll be like the cell; just some offloading of a little vector math. there's a lot of dedicated 3d accel hardware you just cannot fake on the cpu. otoh, ati's been building mobile processors for years and AMD is trying to create something they can combine on the cpu. nvidia's doing the same thing. they should both succeed marvelously.

the only thing the article talks about that's not bunk is processors like cell being used for scene capture. math like edge & motion detection should just scream on cell, it's far more about small local datasets. otoh, the kinds of integration we'll be seeing on the new low-clock ati/nvidia kit should blow 6-7 SPEs away, and not suffer from the data locality issues of cell.
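
To make the "small local datasets" point concrete: edge detection is just a tiny convolution over each pixel's neighborhood, so a tile of the image fits comfortably in an SPE's 256 KB local store and vectorizes well. A plain scalar sketch (no SPE intrinsics) of a 3x3 Sobel pass:

    /* Scalar sketch of a 3x3 Sobel edge-magnitude pass over a w x h
     * grayscale tile. Each output pixel needs only its 3x3 neighborhood,
     * which is why tiles stream nicely through a small local store. */
    #include <stdlib.h>

    void sobel_tile(const unsigned char *src, unsigned char *dst, int w, int h)
    {
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                int gx = -src[(y-1)*w + (x-1)] + src[(y-1)*w + (x+1)]
                         - 2*src[y*w + (x-1)]  + 2*src[y*w + (x+1)]
                         - src[(y+1)*w + (x-1)] + src[(y+1)*w + (x+1)];
                int gy = -src[(y-1)*w + (x-1)] - 2*src[(y-1)*w + x] - src[(y-1)*w + (x+1)]
                         + src[(y+1)*w + (x-1)] + 2*src[(y+1)*w + x] + src[(y+1)*w + (x+1)];
                int mag = abs(gx) + abs(gy);
                dst[y*w + x] = (unsigned char)(mag > 255 ? 255 : mag);
            }
        }
    }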

the interesting thing about opengl ES acceleration is not that it will provide us virtual worlds. that's just one side effect. it's that SoC opengl provides a standard way of accessing hardware accelerated graphics, which is required for any kind of displaying at all (2d & 3d) and is something that has never existed on embedded platforms (as a standard). we've been at the mercy of integrators and their particular J2ME and microsoft implementations to access hardware acceleration. in counterpoint, in a couple years opengl es will be on every single SoC with graphics output and will provide the perfect launching point for throwing open source code on everything. that's why Xgl was started.
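
What "a standard way of accessing hardware accelerated graphics" buys you, concretely, is that the setup code stops being vendor-specific. A hedged sketch of the portable EGL/OpenGL ES bring-up (init_gles is a made-up helper name; the native window handle passed in is the one platform-specific piece the integrator still supplies, and error checking is omitted):

    /* Sketch: portable EGL boilerplate for getting a hardware-accelerated
     * OpenGL ES 1.x context on an embedded SoC. Only the native window
     * handle is platform-specific; the rest is the same everywhere. */
    #include <EGL/egl.h>
    #include <GLES/gl.h>

    EGLContext init_gles(EGLNativeWindowType win, EGLDisplay *out_dpy, EGLSurface *out_surf)
    {
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        eglInitialize(dpy, NULL, NULL);

        static const EGLint attribs[] = {
            EGL_RED_SIZE, 5, EGL_GREEN_SIZE, 6, EGL_BLUE_SIZE, 5,
            EGL_NONE
        };
        EGLConfig cfg;
        EGLint n;
        eglChooseConfig(dpy, attribs, &cfg, 1, &n);

        EGLSurface surf = eglCreateWindowSurface(dpy, cfg, win, NULL);
        EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
        eglMakeCurrent(dpy, surf, surf, ctx);

        /* From here on, plain GL ES calls hit whatever the SoC provides. */
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        eglSwapBuffers(dpy, surf);

        *out_dpy = dpy;
        *out_surf = surf;
        return ctx;
    }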

the fight to watch here is DAMMIT&NV v. MS, which is all the funnier because MS has no idea they're fighting & no idea the stakes. it's a crying friggin shame it's taken this long for people to even start talking about integrated opengl es on core, because it will single-handedly liberate electronics from integrators & platforms. look closely and you'll see a back current of people ranting about this for many a year now.

  Taran Rampersad [12.19.06 10:14 PM]

Hmm. People are so quick to make up IBM's mind, and it seems that IBM isn't. IBM has openly said more than once that it is doing R&D - of course, people read what they wish, and wish what they read.

IBM is about businesses. When you think IBM, you don't think about games. You think business. And they have said (again, right in front of everyone) that they are looking to provide virtual worlds as a service to their clients.

But wait. They did the Circuit City sim in SecondLife (which I have yet to visit; it doesn't seem all that important). While IBM is scoring heavy cool points with some bloggers, it is apparent from their invite-only policies and other aspects that they are courting the media and trying to find the media that they want - with a focus on weblogs. They have been immersing their own people in SecondLife much like they did with that operating system... what is it called... umm... oh - Linux, that's it.

They see virtual worlds as part of the future, just as they played Heisenberg with Linux. They see themselves as part of the future. And so... there they are, prepping for whatever comes next and letting their own research happen behind closed doors.

The Open Cell processor? Sure. I can see that, and more - given that a virtual world is a LAMP cluster.

I also see something else IBM is known for. Patents.

  steve [12.20.06 05:09 AM]

rektide knows his stuff. It's a pity that well known people have blogs where they post crap of varying degrees, while the engineers remain buried in back rooms and largely ignored.

This is part of the reason that I complain bitterly about the ignorance of CEOs... CEOs are an amalgam of the areas needed to keep a company running, but the technology they talk about so much is the thing they know least about.

In many ways, most company CEOs are like the senator who talks about the Internet as a series of tubes. When they open their mouths and talk about the future of the Net, they're really just parroting what someone told them to say, and you can only hope that the analogy isn't tubularly bad.

I don't read Dave Farber's list, because there's no feedback. Dave posts something he finds interesting, and I'm supposed to swallow that crap without any response because Dave doesn't find me interesting.

No thanks. This is the Internet. It works both ways.

  LKM [12.20.06 07:16 AM]

Rektide is exactly right. Cell is not a graphics chip. These people have no idea what they're talking about (and by "these people," I mean both the point and the counter-point in this post - it's quite funny to see people who have no clue trying to refute people who have no clue). Seems like they're just repeating marketing crap and trying to sell it for insight. They're wrong.

  rektide [12.20.06 02:07 PM]

steve & LKM; it's criminal how quiet the engineers are. i don't know if most engineers are just too enraptured in their own designs to vocalize anything, if they want to guard the secrets like some monastic cult protecting secret arcanum, or what, but for some reason the actual technocrats are highly underrepresented on the net.

a lot of it is noise. usenet is horrible because comp.arch.embedded is actually comp.arch.embedded.troubleshooting. no one wants to deal with that bogosity. blogs are a wonderful forum because you're not beholden to anyone, but finding the good techie ones is really difficult.

  rektide [12.20.06 02:13 PM]

Taran on IBM:
"And so... there they are, prepping for whatever comes next and letting their own research happen behind closed doors."

Reminds me of the google situation. It's sinfully less engineering oriented, but i have commentary on Google trying to be closed doors over at this megathread. Everyone's still trying to build products in a world that's transcending this content/material modus operandi and moving towards participatory. very few closed-door research projects can really create that participatory inertia.

What's up with Shirky posting on Gawker^BValleywag? how uncharacteristically unparticipatory of him.

  Tim O'Reilly [12.22.06 08:25 AM]

Steve --

Tis true, alas, that I, like most CEOs, have only a superficial knowledge of many subjects. But that's the beauty of a blog like this. I hear something that sounds provocative, and I share it with a community that knows more than I do. They help educate me and the rest of my readers. Not all of us are deep into an understanding of the specifics of processors. That doesn't mean we're ignorant. We just know other stuff.

As to Farber being a one-way list -- that's only partly true. It's a moderated list, and Farber often publishes a number of follow ups. In fact, I just sent him rektide's comments, and I'm hoping he publishes them.

As far as "the internet works both ways," you're right, it does. But that doesn't mean that it's all completely unmoderated flow. That's why so many people have deserted usenet. And frankly, civility helps keep the two-way flow working. In fact, it's one of the foundations of the internet :-) As Jon Postel wrote in RFC 761: "TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others."

It's possible that if Farber doesn't post your comments, it could be a matter of tone. He also does cut off discussion after a certain number of followups.

  steve [12.23.06 04:37 PM]

I wasn't targeting you as a CEO of O'Reilly when I made my comments about CEOs of companies. I'm talking about general media coverage where CEOs are interviewed and their opinions are published as the definitive coverage.

I haven't tried to post to Dave Farber's list, and simply don't read it because the format of the exchange doesn't appeal to me. I know what you mean about tone and civility as I've had similar comments before, but it seems to be an artifact of my writing style rather than intent.

What I'm getting at is that the people who really know the technology are the engineers buried in the back rooms. When I'm fixing up a project, I don't talk to the managers because they're really just not relevant.

Before the Internet was popular, the technical types were running everything just fine. Now it's the suit brigade again, and what we built is no longer ours... pointy-haired Dilbert types talk about tubes, and disastrously stupid ideas aren't shot down in short order.

Why must it be the case that the decision making process keeps being put into the hands of those least equipped to understand and execute? This wasn't what the Internet promised at the start.

PS. As an aside, I'm not sure how to converse via a web site, as I have to make a deliberate effort to check previous articles.

  Tim O'Reilly [12.24.06 02:21 PM]

Steve, it's true that a web site conversation is slower than, say, face to face, IM, email or usenet, but it doesn't mean you can't have a good conversation. Remember that some of the great all-time conversations were carried out in paper letters, sometimes not received for months and not replied to for years.

  csven [12.26.06 02:06 PM]

I don't see IBM's involvement in Second Life as being an end in itself. Second Life, as noted, has issues. I suspect instead that they see it (and other platforms currently available) as a training ground for a potential merging of PLM systems with virtual worlds. This jibes with their business-centric vision and wraps up quite a bit more than simply virtual worlds... it includes production.

  steve [01.12.07 06:38 AM]

OK, let's see how far back these conversations can go. :)

As you know, I turned up complaining about Web 2.0. I actually had two complaints, the second of which came out in the wrong place and at the wrong time. I really wasn't targeting you as CEO in the above series, so I apologise for that.

My second complaint is with media coverage of technical events. When I go through news reports, interviews are always with the CEO of such and such company, where they trot out a sales line to the reporter interviewing them.

The things in these media reports are so factually wrong that I feel a pain in my stomach as my life drains away. The reporter fails to ask the right questions, the CEO pushes for some desirable point, and no basic technical validation of the story has occurred.

Since this is an era of network effects, isn't it time to stop using journalists to ask technical questions? Isn't it time to stop talking to the wrong people when you're trying to find out about a new technology?

As I mentioned above, when I go in to fix a project, the people from management are the last ones I talk to as they live in a fantasy construct... and you need to talk to the peons underneath to find out the true state of the project.

Is there some way of using a people person as the focal point of a group of boffins, who will be able to bypass buzzwords and drive to the real point underneath? Is it even possible? Would the wisdom of the crowd be real or cracked?

Is it possible for a news report to simply ignore the CEO of the company and go talk to the technicians instead? Look at the fantastic books by Simon Singh and James Gleick: they talk directly to physicists, theorists, and mathematicians, distilling the result into material that most people would find interesting.

Can that process, which is rather laborious, be networked and optimised to a routine, so that all technical reports are of that standard? That material will always come from the technical people in the company, not the CEO or the marketing types.

If nothing else, I hope this explains what I was getting at, rather than the frustrated splurt prior.
