Tracking the signal of emerging technologies

The first NASA IT Summit featured deep views into the future.

Last week the words of science fiction writer William Gibson ran rampant over the Twitter back channel at the inaugural NASA IT Summit when a speaker quoted his observation that “The future is here. It’s just not evenly distributed yet.” That’s a familiar idea to readers of the O’Reilly Radar, given its focus on picking up the weak signals that provide insight into what’s coming next. So what does the future of technology hold for humanity and space flight? I’ve been reading the fiction of Jules Verne, Isaac Asimov, David Brin, Neal Stephenson, Bruce Sterling and many other great authors since I was a boy, and thinking and dreaming of what’s to come. I’m not alone in that; Tim O’Reilly is also dreaming of augmented reality fiction these days.

Last week I interviewed NASA’s CIO and CTO at the NASA IT Summit about some of that fiction made real. We discussed open source, cloud computing, virtualization, and Climate@Home, a distributed supercomputer for climate modeling. Those all represent substantive, current implementations of enterprise IT that enable the agency to support mission-critical systems. (If you haven’t read about the state of space IT, it’s worth circling back.)

Three speakers at the Summit offered perspectives on emerging technologies that were compelling enough to report on:

  • Lewis Shepherd, former senior technology officer at the Defense Intelligence Agency
  • David Cearley, Gartner VP
  • Vint Cerf, “father of the Internet”

You can watch Cerf speak in the embedded video below. (As a bonus, Jack Blitch’s presentation on Disney’s “Imagineers” follows.) For more on the technologies they discuss, and Shepherd’s insight into a “revolution in scientific computing,” read on.

Building an Internet in space

Even a cursory look at the NASA IT Summit Agenda reveals the breadth of topics discussed. You could find workshops on everything from infrastructure to interactivity, security in the cloud to open government, space medicine to ITIL, as well as social media and virtual worlds. The moment that was clearly a highlight for many attendees, however, came when Vint Cerf talked about the evolution of the Internet. His perspective on building resilient IT systems that last clearly resonated with this crowd, especially his description of the mission as “a term of art.” Cerf said that “designing communications and architectures must be from a multi-mission point of view.” This has particular relevance for an agency that builds IT systems for space, where maintenance isn’t a matter of a stroll to the server room.

Cerf’s talk was similar to the one he delivered at “Palantir Night Live” earlier this summer, which you can watch on YouTube or read about from Rob Pegoraro at the Washington Post.

Cerf highlighted the more than 1.8 billion people on the IP network worldwide at the end of 2009, as well as the 4.5 billion mobile devices that are increasingly stressing it. “The growth in the global Internet has almost exhausted IPv4 address space,” he said. “And that’s my fault.” Time for everyone to learn IPv6.
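
What does getting ready look like at the socket level? Here’s a minimal Python sketch, assuming nothing beyond the standard library; the hostname is only a placeholder, and real migrations involve far more than client code:

```python
import socket

# getaddrinfo returns IPv6 (AF_INET6) results alongside IPv4 (AF_INET)
# when the host and your resolver support both.
for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
        "example.com", 80, type=socket.SOCK_STREAM):
    label = "IPv6" if family == socket.AF_INET6 else "IPv4"
    print(label, sockaddr[0])

def connect(host, port):
    """Dual-stack client: try each returned address in order, so the
    same code works whether the network offers IPv6, IPv4, or both."""
    last_error = None
    for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        try:
            sock = socket.socket(family, socktype, proto)
            sock.connect(sockaddr)
            return sock
        except OSError as err:
            last_error = err
    raise last_error or OSError("no addresses returned")
```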

Looking ahead to the future growth of the Internet, Cerf noted both the coming influx of Asian users and the addition of non-Latin characters, including Cyrillic, Chinese, and Arabic. “If your systems are unprepared to deal with non-Latin character sets, you need to correct that deficiency,” he said.
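
At the code level, that readiness starts with treating text as Unicode end to end. A small Python illustration; the hostnames are placeholders, and note that Python’s built-in idna codec implements the older IDNA 2003 rules (the third-party idna package tracks the newer spec):

```python
# Internationalized domain names travel over DNS as plain ASCII,
# encoded with Punycode under the "xn--" prefix.
unicode_host = "bücher.example"
ascii_host = unicode_host.encode("idna")
print(ascii_host)                  # b'xn--bcher-kva.example'
print(ascii_host.decode("idna"))   # round-trips to 'bücher.example'

# Body text should stay Unicode inside the program, with an explicit
# encoding only at the byte boundary (files, sockets, databases).
greeting = "مرحبا"                   # Arabic
payload = greeting.encode("utf-8")  # bytes for the wire
print(payload.decode("utf-8"))
```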

Cerf also considered the growth of the “Real World Web” as computers are increasingly embedded in “human space.” In the past, humans have adapted to computer interfaces, he said, but computers are increasingly adapting to human interfaces, operating by speech, vision, touch and gestures.

Cerf pointed to the continued development of Google Goggles, an app that allows Android users to take a picture of an object and send it to Google to find out what it is. As CNET reported yesterday, Goggles is headed to iPhones this year. Cerf elicited chuckles from the audience when describing the potential for his wife’s cochlear implant to be reprogrammed with TCP/IP, thereby allowing her to ask questions over a VoIP network, essentially putting her on the Internet. To date, as far as we know, she is not online.

Cerf also described the growing “Internet of Things.” That network will include an InterPlaNetary Internet, or IPN, said Cerf. Work has been going forward on the IPN since 1998, including the development of more fault-tolerant networking that stores and forwards packets as connections become available in a “variably delayed and disrupted environment.”

“TCP/IP is not going to work,” he said, “as the distance between planets is literally astronomical. TCP doesn’t do well with that. The other problem is celestial motion, with planets rotating. We haven’t figured out how to stop that.”

The “Bundle Protocol” is the key to an interplanetary Internet, said Cerf. The open source, publicly available Bundle Protocol was first tested in space on the UK-DMC satellite in 2008. This method allows three to five times more data throughput than standard TCP/IP, addressing the challenge of packetized communications by storing data at each hop until it can be forwarded. Cerf said we’ll need more sensors in space, including self-documenting instruments for metadata and calibration, in order to improve remote networking capabilities. “I’m deeply concerned that we don’t know how to do many of these things,” he observed.
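
The real protocol (specified in RFC 5050) goes well beyond a blog post, but a toy Python sketch can convey the store-and-forward idea that separates delay-tolerant networking from TCP’s end-to-end handshake. Everything here, node names and payloads alike, is invented for illustration:

```python
import time
from collections import deque

class DTNNode:
    """Toy store-and-forward relay, not an implementation of the real
    Bundle Protocol (RFC 5050)."""

    def __init__(self, name):
        self.name = name
        self.stored = deque()   # bundles waiting for a contact window

    def receive(self, bundle):
        # Accept and hold the bundle even if the onward link is down,
        # rather than giving up after timeouts the way TCP would over
        # interplanetary distances.
        self.stored.append({"payload": bundle, "stored_at": time.time()})

    def contact(self, next_hop):
        # A contact window opens (say, an orbiter rises above the
        # horizon): forward everything we have been holding.
        while self.stored:
            next_hop.receive(self.stored.popleft()["payload"])

relay = DTNNode("mars-orbiter")
ground = DTNNode("earth-station")
relay.receive("science-data-1")   # no link to Earth yet; stored
relay.receive("science-data-2")
relay.contact(ground)             # window opens; both bundles hop forward
print(len(ground.stored))         # 2
```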

Cerf also expressed concern about the lack of standards for cloud computing, suggesting that “we need a virtual cloud to allow more interoperability.”

Government 2.0 and the Revolution in Scientific Computing

Lewis Shepherd, former senior technology officer at the Defense Intelligence Agency and current Director of Microsoft’s Institute for Advanced Technology in Governments, focused his talk on whether humanity is on the cusp of a fourth research paradigm as the “scale and expansion of storage and computational power continues unabated.”

Shepherd put that prediction in the context of the evolution of science from the experimental to the theoretical to the computational. Over time, scientists have moved from describing natural phenomena and formulating laws like Newton’s to simulating complex phenomena, a shift he symbolized by comparing lens-based microscopes to electron microscopes, and one that has allowed scientists to run nuclear simulations.

Shepherd now sees the emergence of a fourth paradigm, or “eScience,” where a set of tools and technologies support data federation and collaboration to address the explosion of exabytes of data. As an example he referenced imagery of the Pleiades star cluster from the Digitized Sky Survey synthesized within the WorldWide Telescope.

“When data becomes ubiquitous, when we become immersed in a sea of data, what are the implications?” asked Shepherd. “We need to be able to derive meaning and information that wasn’t predicted when the data sets were constructed. No longer will we have to be constrained by databases that are purpose-built for a system that we design with a certain set of requirements. We can do free-form science against unconstrained sets of data, or modeling on the fly because of the power of the cloud.”
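
As a toy illustration of querying data that wasn’t purpose-built for the question, here are a few lines of Python asking something of schemaless records that nobody anticipated when they were written; every field name below is invented for the sketch:

```python
import json
from collections import Counter

# Schemaless observations: free-form JSON objects rather than rows in a
# purpose-built table. These records are made up for the example.
raw = [
    '{"instrument": "spectrometer", "target": "Pleiades", "snr": 41.2}',
    '{"instrument": "imager", "target": "Pleiades", "exposure_s": 30}',
    '{"instrument": "imager", "target": "M42", "exposure_s": 12}',
]
records = [json.loads(line) for line in raw]

# An ad-hoc question no schema designer planned for: which instruments
# have observed each target, and how often?
by_pair = Counter((r["target"], r["instrument"]) for r in records)
for (target, instrument), count in sorted(by_pair.items()):
    print(target, instrument, count)
```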

His presentation from the event is embedded below.

In particular, Shepherd looked at the growth of cloud computing and data ubiquity as an enabler for collaboration and distributed research worldwide. In the past, the difficulty of replicating scientific experiments was a hindrance. He doesn’t see that as a fundamental truth anymore. Another liberating factor, in his view, is the evolution of programming into modeling.

“Many of the new programming tools are not just visual but hyper-visual, with drag and drop modeling. Consider that in the context of continuous networking,” he said. “Always-on systems offer you the ability to program against data sets in the cloud, where you can see the emergence of real-time interactive simulations.”

What could this allow? “NASA can design systems that appear to be far simpler than the computation going on behind the scenes,” he suggested. “This could enable pervasive, accurate, and timely modeling of reality.”

Much of this revolution is enabled by open data protocols and open data sets, posited Shepherd, including a growing set of interactions — government-to-government, government-to-citizen, citizen-to-citizen — that are leading to the evolution of so-called “citizen science.” Shepherd referenced the Be A Martian Project, where the NASA Jet Propulsion Laboratory crowdsourced images from Mars.

He was less optimistic about the position of the United States in research and development, including basic science. Even with President Obama’s inaugural promise to put science back in its rightful place, and some $24 billion in new spending in the Recovery Act, Shepherd placed total research and development as a percentage of GDP at only 0.8%.

“If we don’t perform fundamental research and development here, it can be performed elsewhere,” said Shepherd. “If we don’t productize here, technology will be productized elsewhere. Some areas are more important than others; there are some areas we would not like to see an overseas label on. The creation of NASA itself was based on that. Remember Sputnik?”

His observations paralleled those made this Monday at the Aspen Forum by Intel CEO Paul Otellini, who sees the U.S. facing a looming tech decline.

“Government has the ability to recognize long time lines,” said Shepherd, “and then make long-term investment decisions on funding of basic science.” The incorporation of Web 2.0 into government, a trend evidenced in the upcoming Gov 2.0 Summit, is crucial for revealing that potential. “We should be thinking of tech tools that would underlie Gov 3.0 or Gov 4.0,” he said, “like the simulation of data science and investment in STEM education.”

Gartner’s Top Strategic Technologies

Every year, Gartner releases its list of the top 10 strategic technologies and trends. Its picks for 2010 included cloud computing, mobile applications (Cearley used the term “apptrepreneurship” to describe the mobile application economy powered by the iTunes and Android marketplaces, a useful coinage I wanted to pass along), flash memory, activity monitoring for security, social computing, pod-based data centers, green IT, client computing, advanced analytics, and virtualization for availability. Important trends all, and choices that have been borne out since the analysis was issued last October.

What caught my eye at the NASA IT Summit were other emerging technologies, several of which showed up on Gartner’s list of emerging technologies in 2008. Several of these are more likely to be familiar to fans of science fiction than to data center operators, though to be fair I’ve found there tends to be considerable crossover between the two.

Context-aware Computing
There’s been a lot of hype around the “real-time Web” over the past two years. What’s coming next is the so-called “right-time Web,” where users can find information or access services when and where they need them. This trend is enabled by the emergence of pervasive connectivity, smartphones, and the cloud.

“It will be collaborative, predictive, real-time, and embedded,” said Cearley, “adding to everyday human beings’ daily processes.” He also pointed to projects using Hadoop, the open source implementation of MapReduce that Mike Loukides wrote about in What is Data Science? Context-aware computing that features a thin client, perhaps a tablet, powered by massive stores of data and predictive analytics could change the way we work, live, and play. By 2015-2020 there will be a “much more robust context-delivery architecture,” Cearley said. “We’ll need a structured way of bringing together information, including APIs.”
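
To make the Hadoop reference concrete, here’s the canonical MapReduce example, word count, as a single-process Python sketch. A real Hadoop job distributes the map and reduce phases across a cluster; this only shows the shape of the computation:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(document):
    # Emit a (key, value) pair per word, Hadoop-mapper style.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Hadoop sorts and groups map output by key before reducing.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)),
                              key=itemgetter(0)):
        yield (key, sum(count for _, count in group))

docs = ["the future is here", "the future is not evenly distributed"]
mapped = [kv for doc in docs for kv in map_phase(doc)]
print(dict(reduce_phase(mapped)))
# {'distributed': 1, 'evenly': 1, 'future': 2, 'here': 1, 'is': 2, ...}
```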

Real World Web
Our experiences in the physical world are increasingly integrated with virtual layers and glyphs, a phenomenon that blogger Chris Brogan described in 2008 in his Secrets of the Annotated World. Cyberspace is disappearing into everyday experience. That unification is enabled by geotagging, QR codes, RFID chips, and sensor networks. There’s a good chance many more of us will be shopping with QR codes or making our own maps in real time soon.
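
Generating one of those codes is already nearly a one-liner. A quick sketch using the third-party Python qrcode package (pip install qrcode[pil]), with a placeholder URL:

```python
import qrcode

# Encode a URL as a QR code image; any camera-equipped phone with a
# reader app can scan the result and jump straight to the link.
img = qrcode.make("http://radar.oreilly.com/")
img.save("radar-qr.png")
```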

Augmented Reality
Context-aware computing and the Real World Web both relate to the emergence of augmented reality, which has the potential to put names to faces and much more. Augmented reality can “put information in context at the point of interaction,” said Cearley, “including emerging wearable and ‘glanceable’ interfaces. There’s a large, long-term opportunity. In the long term, there’s a ‘human augmentation’ trend.”

Features currently available in most mobile devices, such as GPS, cellphone cameras, and accelerometers, have started to make augmented reality available to cutting-edge users. For instance, the ARMAR project shows the potential of augmented reality for learning, and augmented reality without the phone is on its way. For a practical guide to augmented reality, look back to 2008 on Radar. Nokia served up a video last year that shows what AR glasses might offer:

Future User Interfaces
While the success of the iPad has many people thinking about touchscreens, Cearley went far beyond touch, pointing to emerging gestural interfaces like the SixthSense wearable computer at MIT. “Consider the Z-factor,” he suggested, “or computing in three dimensions.” Cearley pointed out that there’s also a lot happening in the development of 3D design tools, and he wouldn’t count virtual worlds out, though they currently sit “deep in the trough of disillusionment.” According to Cearley, the problem with current virtual worlds is that they’re “mired in a proprietary model, versus an open, standards-driven approach.” For a vision of a “spatial operating system” that’s familiar to people who have seen “Minority Report,” watch the video of g-speak from oblong below:

Fluid User Interface
This idea focuses on taking the user beyond interacting with information through a touchscreen or gesture-based system and into contextual user interfaces, where an ensemble of technologies allows a human to experience emotionally aware interactions. “Some are implemented in toys and games now,” said Cearley, “with sensors and controls.” The model would include interactions across multiple devices, up to and including a mind-computer interface. As Cearley put it, “The environment is the computer.” For a glimpse into that future, consider the following video from the H+ Summit at Harvard’s Science Center with Heather Knight, social roboticist and founder of marilynmonrobot.com:


User Experience Platforms
Cearley contended that user experience design is more important than a user experience platform. While UXPs aren’t a market yet, Cearley said he anticipated news of their emergence later in 2010. For more on the importance and context of user experience, check out UX Week, which is happening in San Francisco as I write this. A conceptual video of “Mag+” is embedded below:

Mag+ from Bonnier on Vimeo.

3D Printing
If you’re not following the path of make-offs and DIY indie innovations, 3D printing may be novel. In 2010, the 3D printing revolution is well underway at places like MakerBot Industries. In the future, DARPA’s programmable matter program could go even further, said Cearley, though there will need to be breakthroughs in materials science. You can watch a MakerBot in action below:

Mobile Robotics Driving Mobile Infrastructure
I experienced a vision of this future myself at the NASA IT Summit when I interviewed NASA’s CTO using a telerobot. Cearley sees many applications coming for this technology, from mobile video conferencing to healthcare and telemedicine. A video from the University of Louisville shows how that future is developing:

Fabric Computing
Cearley’s final emerging technology, fabric computing, is truly straight out of science fiction. Storage and networking could be distributed through a garment or shelter, along with displays or interfaces. A Stanford lecture on “computational textiles” is embedded below:
