Tue, Mar 7, 2006

Nat Torkington

ETech: Jeff Han

Jeff Han gave an amazing demo at ETech, showing a multi-point touch-sensitive display. Here's a transcript, but you'll probably have to wait for the video to really get the full power of his creation.

Jeff Han at ETech, 7 Mar 2006

Consulting research scientist at NYU's department of Computer Science. This stuff is literally just coming out of the lab right now. You're amongst the first to see it out of the lab. I think this is going to change the way we interact with computers.

Rear-projected drafting table, equipped with multitouch sensors. ATMs, smart whiteboards, etc. can only register one point of contact at a time. A multitouch sensor lets you register multiple touch points, use all fingers, both hands. Multitouch itself isn't a new concept. People played around with multitouch in the '80s, but this is very low cost, very high resolution, and very important.

The technology isn't the really exciting thing, more the interactions you can do on top of it once you're given this precise information. For instance, can have a nice fluid simulation running. Induce a vortex here with one hand, inject fluid with another. Device is pressure sensitive, can use a clicker instead of a hand. Can invent simple gestures.

This application is neat, developed in the lab. Started as a screen saver, but hacked so it's multitouch enabled. Can use both fingers to play with the lava. Take two balls, merge them, inject heat into the system, pull them apart. This obviously can't be done with single-point interaction, whether touch screen or mouse.

It does the right thing, there's no interface. Can do exactly what you'd expect if this were a real thing. Inherently multiuser. Rael, come up and help me out. I can work in an area over here, and he can be playing with another area at the same time. It immediately enables multiple users to interact with a shared display, the interface simply disappears.

Here's a lightbox app. Dragging photos around. Two fingers at once, I can start zooming, rotating, all in one really seamless motion. It's neat because it's exactly what you expect would happen if you grabbed this virtual photo here. All very seamless and fluid.
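The two-finger zoom-and-rotate gesture described above comes down to simple geometry: the scale factor is the ratio of the distance between the fingers now versus before, and the rotation is the change in angle of the line joining them. Here's a minimal sketch of that standard math (not Han's actual code, just an illustration):

```python
import math

def pinch_transform(p1_old, p2_old, p1_new, p2_new):
    """Derive the zoom factor and rotation angle implied by two
    touch points tracked from one frame to the next."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # Scale: how much the fingers spread apart or pinched together.
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    # Rotation: how much the line between the fingers turned.
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation
```

Applying the resulting scale and rotation about the gesture's midpoint is what makes the photo track the fingers so seamlessly.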

Someone who's new to computing culture can use this. Could be important as we introduce computers to a whole new group of people. I cringe at the $100 laptop with its WIMP interface.

Really simple and elegant technique for detecting touch points: light scattered by the deformation caused by a touch on the screen.
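Once a camera sees that scattered light as bright spots, turning a frame into touch points is a clustering problem: group bright pixels into blobs and take each blob's centroid. A hypothetical sketch of that step with a plain threshold and flood fill (Han's real pipeline is surely more refined):

```python
def find_touch_points(frame, threshold=128):
    """Given a grayscale camera frame (list of rows of pixel values),
    return the (x, y) centroid of each bright blob above threshold."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    points = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill this blob of connected bright pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    py, px = stack.pop()
                    pixels.append((py, px))
                    for ny, nx in ((py + 1, px), (py - 1, px),
                                   (py, px + 1), (py, px - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid of the blob is the touch point.
                mean_y = sum(p[0] for p in pixels) / len(pixels)
                mean_x = sum(p[1] for p in pixels) / len(pixels)
                points.append((mean_x, mean_y))
    return points
```

Tracking those centroids frame to frame is what gives the interactions their precision.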

Kinaesthetic memory, the visual memory where you left things. Ability to quickly zoom, get a bigger work area if you run out of space, etc. changes things. More of an infinite desktop than standard fixed area.

Now, of course, can do the same thing with videos as with photos. All 186 channels of TW cable.

Inevitably there'll be comparisons with Minority Report. Minority Report and other gestural interfaces aren't touch based. Can't differentiate between slight hover and actual touch. Disconcerting to user if they have action happen without tactile feedback. I argue that touch is more intuitive than gross gestural things. Also gestural is very imprecise.

Ability to zoom in and out quickly lets you find new ways to explore information. What's interesting is that we're excited about potential for this in information visualization applications. Can easily drill down or get bigger picture. Having a lot of fun exploring what we can do with it.

Another application we put together is mapping. This is NASA WorldWind, like Google Earth but open source. We hacked it up to use the two-fingered gestural interface to zoom in. Can change datasets in NASA WorldWind. They also collect pseudocolour data, to make a hypertext map interface. [Demo stalls, restarts] Three-dimensional information, so how do you navigate in that dimension? Use three points to define an axis of tilt. Could be the right or wrong interface, but it's an example of the kind of possibilities once you think outside the box.

Virtual keyboard, rescalable. No reason to conform to physical devices. Brings promise of a truly dynamic user interface, possibility to minimize RSI. Probably not the right thing to do, to launch in and emulate things from the real world. But lot of possibilities, we're really excited.

Lots of entertainment applications, multiuser with many people playing in parallel. Here's a simple drawing tool. Can add constraints, and have multiple constraints, to make a really easy virtual puppeteer tool. Lot of math under the surface to do what's physically plausible (algorithm published last year at SIGGRAPH).

[demo reel]



Comments: 8

  Robert Brewer [03.07.06 04:13 PM]

FWIW, this isn't "just coming out of the lab". Applied Minds is already selling multi-point map tables by the truckload (and has been for many months now). http://www.wired.com/news/technology/0,1282,67951,00.html

  Tim O'Reilly [03.08.06 10:32 AM]

Robert -- I don't think Applied Minds is selling "boatloads" - at $100K per, these are very much a specialty item. I believe Jeff's technology will make this kind of display much more affordable. It's also much more of a generalized platform, rather than a special-purpose device. In addition, I believe that the level of fine-grained control is quite a bit better.

Not to take anything away from Danny and Bran -- the original map table is pretty amazing.

  tarcisio pirotta [03.26.06 12:11 AM]

Hi, this touch screen is so interesting. I've been following the project. Does someone know how the capture system works? Are the LED frame and camera infrared, or is the light scattered by the deformation caused by touch on the screen enough to track the points? Isn't there interference if the light is white and some of the things displayed are too?

  dan quixote [10.15.06 07:41 AM]

jeff uses "frustrated total internal reflection": the IR light stays inside the acrylic plate except where that is touched. he describes the technique neatly in his ACM paper "multi-touch sensing through frustrated total internal reflection"

  Fauzia Imam [10.21.06 01:29 PM]

Astonishing, Amazing, Stunning. Jeff you are marvellous. Keep it up

  homer jay [01.09.07 11:02 PM]

So, now that the Apple iPhone is upon us with its multi-touch interface and 200 patents, was Jeff consulting with Apple on this?

  marge [01.30.07 05:39 PM]

his webpage says to "stay tuned"

  Ken [06.02.07 05:52 AM]

Now I see all of his project inside a fancy coffee table that Microsoft claims it has been developing secretly since 2001, which uses EXACTLY this technology (IR light and cameras and all). I haven't seen a single place where Microsoft credits this guy for anything in its development. How come? Can they simply say "no one knew about this but we had it since '01" and let that phrase claim as theirs all the development that happened after '01? Gee, I'll say I was secretly developing Vista since 1992...
