Space IT, the final frontier

Exploring open source cloud computing, virtualization and Climate@Home at NASA's first IT Summit.

When people think of NASA and information technology in 2010, issues like the future of manned space flight, the aging space shuttle fleet or progress on the International Space Station may come to mind. What casual observers miss is how NASA is steadily modernizing those systems, including developing open source cloud computing, virtualization, advanced robotics, deep space communications and collaborative social software, both behind the firewall and in the public eye.

NASA has also earned top marks for its open government initiatives from both the White House and an independent auditor. That focus is in line with the agency’s mission statement, adopted in February 2006, to “pioneer the future in space exploration, scientific discovery and aeronautics research,” and it was on display this week at the first NASA IT Summit in Washington, D.C.

The first NASA IT Summit featured speeches, streaming video, discussions about government and innovation, and a lively Twitter back channel. Plenty of my colleagues in the technology journalism world were on hand to capture insights from NASA’s initial sally into the technology conference fray. Headlines from their coverage offered insight into the flavor of the event and the focus of its keynoters.

Below you’ll find my interviews with NASA CTO for IT Chris Kemp (my first interview conducted via telerobot) and NASA CIO Linda Cureton.

NASA CIO and CTO on cloud computing, virtualization and Climate@Home

During the second day of the summit, I interviewed Linda Cureton on some of the key IT initiatives that NASA is pursuing. In particular, I wondered whether NASA’s open source cloud computing technology, Nebula, could be used as a platform for other agencies. “The original problem was that NASA was not in the business of providing IT services,” said Cureton. “We are in the business of being innovative. To create that capability for elsewhere in government is difficult, from that perspective, yet it’s something that the government needs.”

Cureton described Nebula as similar to other spinoffs, where NASA develops a technology and provides it elsewhere in government. “We released the foundation of Nebula into the open source domain so that people in other agencies can take it and use it,” she said. “The other major benefit is that once something is in that public domain, the contributions from others — crowdsourcing, so to speak — will improve it.”

Current cost savings at NASA aren’t rooted in the cloud, however. They’re coming from data center consolidation and virtualization. “NASA is decentralized,” said Cureton, “so we’re seeing people find ways to consolidate and save money in many ways. The major drivers of the virtualization that has been done are space and the desire to modernize, and to ensure a user experience that could replicate having their own resources to do things without having their own server.”

Cureton observed that because of the decentralization of the agency, energy savings may not always be a driver. “Since low-hanging fruit from virtualization may have been plucked, that’s where facilities managers now want to measure,” she said. “From what I’ve learned, over the past year and a half, there’s been a lot of virtualization.” For instance, the NASA Enterprise Application Competency Center (NEACC) has achieved floor space reductions from data center consolidation approaching a 12 to 1 ratio, running 337 virtual machines on 36 physical servers.

That’s also meant a power reduction ratio of 6 to 1, which feeds into the focus on green technology in many IT organizations. For instance, as I reported last year, a green data center is enabling virtualization growth for Congress. Cureton emphasized the importance of metering and monitoring in this area. “If you can’t measure it, you can’t improve it. You need more knowledge about what you can do, like with virtualization technologies. In looking at our refresh strategy, we’re looking at green requirements, just as you might with a car. There are also cultural challenges. If you don’t pay the electrical bill, you care about different issues.”
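Those NEACC figures invite a quick sanity check: 337 workloads on 36 hosts is roughly a 9 to 1 consolidation, yet power dropped only 6 to 1, which implies each remaining host draws more than an average box it replaced. A short sketch of that arithmetic (the one-box-per-workload baseline and the per-host comparison are inferred from the quoted ratios, not NASA figures):

```python
# A quick check of what the NEACC figures quoted above imply:
# 337 workloads consolidated onto 36 physical hosts, with a
# reported 6-to-1 power reduction.

old_hosts = 337        # assumes roughly one physical box per workload before
new_hosts = 36
power_reduction = 6.0  # reported ratio

consolidation = old_hosts / new_hosts            # ~9.4 workloads per host
# If power dropped only 6x while host count dropped ~9.4x, each remaining
# host must draw more than an average box it replaced:
relative_draw = consolidation / power_reduction  # ~1.6x

print(f"{consolidation:.1f} workloads per host")
print(f"each consolidated host draws ~{relative_draw:.2f}x a retired box")
```

That gap between the consolidation ratio and the power ratio is exactly why Cureton stresses metering: without measurement, the extra per-host draw is invisible.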

Does she put any stock in Energy Star ratings for servers? “Yes,” said Cureton, whose biography includes a stint at the Department of Energy. “It means something. It’s data that can be taken into account, along with other things. If you buy a single sports car, you might not care about MPG. If you’re buying a fleet of cars, you will care. People who buy at scale will care about Energy Star.”

More perspective on Nebula and OpenStack

Cureton hopes agencies take Nebula code and deploy it, especially given continued concerns in government about so-called public clouds. “The things that slow people down with the public cloud include IT security and things of that nature,” she said. “Once an agency understands Nebula, the model can address a lot of risks and concerns the agency might have. If you’re not ready for the Amazon model, it might be a good choice to get your feet wet. The best choice is to start with lower security-class data. When you look at large, transactional databases, I’m not sure that’s ready for cloud yet.”

As my telerobotic interview with Chris Kemp revealed (see below), there have now been “hundreds of contributions” to the Nebula code that “taxpayers didn’t have to pay for.” If you missed the news, Rackspace, NASA and several other major tech players announced OpenStack at OSCON this summer. OpenStack “enables any organization to create and offer cloud computing capabilities using open source technology running on standard hardware.” You can watch video of Rackspace’s Lew Moorman talking about an open cloud on YouTube.

There will, however, be integration challenges for adding Nebula code to enterprise systems until the collaboration matures. “You have to realize Nebula code is in production,” said Kemp in an additional interview. “The OpenStack guys basically took Nebula code as the seed for the computing part. For storage, users are able to rapidly contribute Rackspace file code. Together, there eventually will be a whole environment. People are able to check out that code right now in the Nebula environment, but there’s a difference between that and a mature implementation.”

Kemp pointed out that both of these code bases have been taken out of large production systems. “It would be irresponsible to call it mature,” he said. “The community needs to test it on all types of hardware and configurations, building infrastructures with specific security scenarios and hardware scenarios. We expect it to be ‘1.0 caliber’ by the fall.”

The bottom line, however, is that IT organizations that want to participate can use these components to turn commodity hardware into scalable, extensible cloud environments, running the same code that currently serves tens of thousands of customers and large government projects. All of the code for OpenStack is freely available under the Apache 2.0 license. NASA itself has committed to using OpenStack to power its cloud platforms, though Kemp cautioned that NASA is “not endorsing OpenStack, but is endorsing large groups of developers working on the code.”

What Kemp anticipated evolving late this year is a “hybrid EC2,” referring to Amazon’s cloud environment. “Amazon is not selling an EC2 appliance or S3 appliance,” he said. “If you’re building a large government- or science-class, NASA-class cloud environment, this is intended to make all of the necessary computing infrastructure available to you. If we could build that kind of infrastructure with off-the-shelf components, we would have.”

The manner of the interview with Kemp at the IT Summit also was a powerful demonstration of how NASA is experimenting with telepresence and robotics. Due to his status as a proud new father, Kemp was unable to join in person. Using an Anybot, Kemp was able to talk to dozens of co-workers and collaborators at the summit from his home in California. Watching them talk recalled William Gibson’s famous quote: “The future is here. It’s just not evenly distributed yet.”


Crowdsourcing climate models with Climate@Home

SETI@Home, which crowdsources the search for extraterrestrial intelligence, is a project familiar to many computer users. Now NASA plans to extend that distributed processing model worldwide to help determine the accuracy of the models scientists will use to predict climate change. NASA describes the project as “unprecedented in scope.” Climate@Home is a strategic partnership between NASA’s Earth Science Division and the Office of the CIO, which Cureton heads. As with SETI@Home, participants won’t need special training. They’ll just need a laptop or desktop computer and a client application that runs in the background.

Effectively, NASA will be creating a virtual supercomputing network instead of building or re-purposing a supercomputer, which consumes immense amounts of energy. That means the project will have a much lower carbon footprint than it would otherwise, which is desirable on a number of levels. The Climate@Home initiative is modeled after a similar project coordinated by the Oxford e-Research Centre. Cureton talks about the project in the video below. She also comments (briefly) on the “Be A Martian” project at the Jet Propulsion Laboratory, which enlists citizen scientists to explore Mars and have fun by sorting through images of the red planet.
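The client model described above follows the familiar volunteer-computing loop: fetch a work unit from the project, crunch it in the background, and report the result. Here is a minimal Python sketch of that pattern; every name, number and behavior is invented for illustration, and the real Climate@Home client, like SETI@Home’s, would be a native background application rather than a script like this.

```python
# Sketch of the volunteer-computing loop Climate@Home describes:
# fetch a work unit, run it in the background, report the result.
# All functions here are hypothetical stand-ins, not the real client.

import random
import time


def fetch_work_unit():
    """Pretend to download a parameter set for one climate-model run."""
    return {"id": random.randint(1, 10**6), "perturbation": random.random()}


def run_model(unit):
    """Stand-in for the simulation itself: burn a little CPU, return a result."""
    time.sleep(0.01)  # a real model run would take hours or days
    return {"id": unit["id"], "score": unit["perturbation"] ** 2}


def report(result):
    """A real client would send this back to the project server."""
    print(f"work unit {result['id']} done, score={result['score']:.4f}")


def main(iterations=3):
    # The real loop runs indefinitely at low priority; a fixed count keeps
    # this sketch finite.
    for _ in range(iterations):
        report(run_model(fetch_work_unit()))


if __name__ == "__main__":
    main()
```

The appeal of the model is that the coordination cost stays on the server: each participant’s machine only ever sees one independent work unit at a time, which is what lets the “virtual supercomputer” scale to volunteers worldwide.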

Federal CIO on smarter spending

The final day of the summit featured a short, clear speech from federal CIO Vivek Kundra, where he challenged the federal government to spend less on IT. Video is embedded below:

Note: Presentations at the Summit, from the grandfather of the Internet, Vint Cerf, the futurism of David W. Cearley, VP and Gartner Fellow, and the analysis of Microsoft’s Lewis Shepherd, all provided provocative views of what’s to come in technology. Look for a post on their insights next week.


The efficiencies and issues surrounding government’s use of technology will be explored at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation.
