
August 20 2010

Space IT, the final frontier

When people think of NASA and information technology in 2010, issues like the future of manned space flight, the aging space shuttle fleet or progress on the International Space Station may come to mind. What casual observers may miss is how steadily NASA is modernizing its information technology, with work that spans open source cloud computing, virtualization, advanced robotics, deep space communications and collaborative social software, both behind the firewall and in the public eye.

NASA has also earned top marks for its open government initiatives from both the White House and an independent auditor. That focus is in line with the agency's mission statement, adopted in February 2006, to "pioneer the future in space exploration, scientific discovery and aeronautics research," and it was on display this week at the first NASA IT Summit in Washington, D.C.

The first NASA IT Summit featured keynotes, streaming video, discussions of government and innovation, and a lively Twitter back channel. Plenty of my colleagues in the technology journalism world were on hand to capture insights from NASA's initial sally into the technology conference fray, and their headlines offered a good sense of the event's flavor and the focus of its keynoters.

Below you'll find my interviews with NASA CTO for IT Chris Kemp (my first interview conducted via telerobot) and NASA CIO Linda Cureton.



NASA CIO and CTO on cloud computing, virtualization and Climate@Home


During the second day of the summit, I interviewed Linda Cureton about some of the key IT initiatives NASA is pursuing. In particular, I wondered whether NASA's open source cloud computing technology, Nebula, could be used as a platform for other agencies. "The original problem was that NASA was not in the business of providing IT services," said Cureton. "We are in the business of being innovative. To create that capability for elsewhere in government is difficult, from that perspective, yet it's something that the government needs."

Cureton described Nebula as similar to other spinoffs, where NASA develops a technology and provides it elsewhere in government. "We released the foundation of Nebula into the open source domain so that people in other agencies can take it and use it," she said. "The other major benefit is that once something is in that public domain, the contributions from others -- crowdsourcing, so to speak -- will improve it."

NASA's current cost savings aren't rooted in the cloud, however. They're coming from data center consolidation and virtualization. "NASA is decentralized," said Cureton, "so we're seeing that people are finding ways to consolidate and save money in many ways. The major drivers of the virtualization that has been done are space and the desire to modernize, and to ensure a user experience that replicates having their own resources to do things without having their own server."

Cureton observed that because of the decentralization of the agency, energy savings may not always be a driver. "Since low-hanging fruit from virtualization may have been plucked, that's where facilities managers now want to measure," she said. "From what I've learned, over the past year and a half, there's been a lot of virtualization." For instance, the NASA Enterprise Application Competency Center (NEACC) has achieved floor space reductions from data center consolidation approaching a 12 to 1 ratio, with 36 physical servers now hosting 337 virtual machines.

That's also meant a power reduction ratio of 6 to 1, which feeds into the focus on green technology in many IT organizations. For instance, as I reported last year, a green data center is enabling virtualization growth for Congress. Cureton emphasized the importance of metering and monitoring in this area. "If you can't measure it, you can't improve it. You need more knowledge about what you can do, like with virtualization technologies. In looking at our refresh strategy, we're looking at green requirements, just as you might with a car. There are also cultural challenges. If you don't pay the electrical bill, you care about different issues."
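To make those ratios concrete, here's a back-of-the-envelope sketch using the NEACC figures above; the per-server wattage is a hypothetical placeholder, not a NASA number.

    # Back-of-the-envelope arithmetic from the NEACC consolidation figures above.
    # The per-server power draw is a hypothetical placeholder, not NASA data.
    physical_hosts = 36      # physical servers remaining after consolidation
    virtual_machines = 337   # virtual machines running on those hosts

    vm_density = virtual_machines / physical_hosts
    print(f"VM density: {vm_density:.1f} virtual machines per physical host")

    # If each workload had kept its own box, roughly this many servers would still be drawing power.
    servers_avoided = virtual_machines - physical_hosts
    watts_per_server = 400   # hypothetical average draw per server
    print(f"Power avoided: ~{servers_avoided * watts_per_server / 1000:.0f} kW")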

Does she put any stock in EnergyStar ratings for servers? "Yes," said Cureton, whose biography includes a stint at the Department of Energy. "It means something. It's data that can be taken into account, along with other things. If you buy a single sports car, you might not care about MPG. If you're buying a fleet of cars, you will care. People who buy at scale will care about EnergyStar."

More perspective on Nebula and OpenStack

Cureton hopes agencies will take the Nebula code and deploy it, especially given continued concerns in government about so-called public clouds. "The things that slow people down with the public cloud include IT security and things of that nature," she said. "Once an agency understands Nebula, the model can address a lot of the risks and concerns the agency might have. If you're not ready for the Amazon model, it might be a good choice to get your feet wet. The best choice is to start with lower security-class data. When you look at large, transactional databases, I'm not sure that's ready for cloud yet."

As my telerobotic interview with Chris Kemp revealed (see below), there have now been "hundreds of contributions" to the Nebula code that "taxpayers didn't have to pay for." If you missed the news, Rackspace, NASA and several other major tech players announced OpenStack at OSCON this summer. OpenStack "enables any organization to create and offer cloud computing capabilities using open source technology running on standard hardware." You can watch video of Rackspace's Lew Moorman talking about an open cloud on YouTube.

There will, however, be integration challenges in adding Nebula code to enterprise systems until the collaboration matures. "You have to realize Nebula code is in production," said Kemp in an additional interview. "The OpenStack guys basically took Nebula code as the seed for the computing part. For storage, users are able to rapidly contribute Rackspace's file code. Together, there eventually will be a whole environment. People are able to check out that code right now in the Nebula environment, but there's a difference between that and a mature implementation."

Kemp pointed out that both of these code bases have been taken out of large production systems. "It would be irresponsible to call it mature," he said. "The community needs to test it on all types of hardware and configurations, building infrastructures with specific security scenarios and hardware scenarios. We expect it to be '1.0 caliber' by the fall."

The bottom line, however, is that IT organizations that want to participate can use these components to turn commodity hardware into scalable, extensible cloud environments, running the same code currently in production serving tens of thousands of customers and large government projects. All of the code for OpenStack is freely available under the Apache 2.0 license. NASA itself has committed to using OpenStack to power its cloud platforms, though Kemp cautioned that NASA is "not endorsing OpenStack, but is endorsing large groups of developers working on the code."

What Kemp anticipated evolving late this year is a "hybrid EC2," referring to Amazon's cloud environment. "Amazon is not selling an EC2 appliance or an S3 appliance," he said. "If you're building a large government-class, science-class, NASA-class cloud environment, this is intended to make all of the necessary computing infrastructure available to you. If we could have built that kind of infrastructure with off-the-shelf components, we would have."
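Early Nova releases (the Nebula-derived compute piece of OpenStack) shipped an EC2-compatible API, which is what makes the "hybrid EC2" idea plausible: the same client tooling can, in principle, point at a private endpoint. Below is a minimal sketch using the boto library; the endpoint, credentials and image ID are hypothetical placeholders, and defaults varied across early releases.

    # Minimal sketch: driving an EC2-compatible private cloud endpoint with boto.
    # Endpoint, credentials and image ID are hypothetical placeholders.
    from boto.ec2.connection import EC2Connection
    from boto.ec2.regioninfo import RegionInfo

    region = RegionInfo(name="nova", endpoint="cloud.example.gov")  # hypothetical endpoint
    conn = EC2Connection(
        aws_access_key_id="ACCESS_KEY",      # issued by the private cloud, not Amazon
        aws_secret_access_key="SECRET_KEY",
        region=region,
        is_secure=False,
        port=8773,                           # commonly used port for the EC2-compatible API
        path="/services/Cloud",
    )

    # List available machine images, then boot one small instance.
    for image in conn.get_all_images():
        print(image.id, image.location)

    reservation = conn.run_instances("ami-00000001", instance_type="m1.small")
    print("Launched:", reservation.instances[0].id)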

The manner of the interview with Kemp at the IT Summit was itself a powerful demonstration of how NASA is experimenting with telepresence and robotics. A proud new father, Kemp was unable to join in person. Using an Anybot, he was able to talk to dozens of co-workers and collaborators at the summit from his home in California. Watching those conversations recalled William Gibson's famous quote: "The future is here. It's just not evenly distributed yet."

Climate@Home

Crowdsourcing the search for aliens through the SETI@Home initiative is a familiar idea to many computer users. Now, NASA plans to extend that distributed processing model to help determine the accuracy of the models scientists will use to predict climate change. NASA describes the project as "unprecedented in scope." Climate@Home is a strategic partnership between NASA's Earth Science Division and the Office of the CIO, which Cureton heads. As with SETI@Home, participants won't need special training. They'll just need a laptop or desktop computer and to download a client that runs in the background.

Effectively, NASA will be creating a virtual supercomputing network instead of building or repurposing a supercomputer, which consumes immense amounts of energy. That means the project will have a much lower carbon footprint than it would otherwise, which is desirable on a number of levels. The Climate@Home initiative is modeled after a similar project coordinated by the Oxford e-Research Centre called ClimatePrediction.net. Cureton talks about the project in the video below. She also comments briefly on the "Be A Martian" project at the Jet Propulsion Laboratory, which enlists citizen scientists in exploring Mars and having fun by sorting through images of the red planet.
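The volunteer-computing pattern behind SETI@Home, ClimatePrediction.net and, presumably, Climate@Home is simple at its core: a background client fetches a work unit, crunches it, and reports the result. The sketch below is purely illustrative; the server URL, endpoints and work-unit format are invented and have nothing to do with NASA's actual client.

    # Purely illustrative volunteer-computing loop; endpoints and formats are invented.
    import json
    import time
    import urllib.request

    SERVER = "https://climate.example.gov/api"   # hypothetical project server

    def fetch_work_unit():
        with urllib.request.urlopen(f"{SERVER}/work") as resp:
            return json.load(resp)               # e.g. {"id": ..., "parameters": {...}}

    def run_model(parameters):
        # Stand-in for a real climate-model run, which would take hours of CPU time.
        time.sleep(1)
        return {"mean_temperature_anomaly": 0.42}  # placeholder result

    def report(unit_id, result):
        payload = json.dumps({"id": unit_id, "result": result}).encode()
        req = urllib.request.Request(f"{SERVER}/result", data=payload,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    while True:
        unit = fetch_work_unit()
        report(unit["id"], run_model(unit["parameters"]))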

Federal CIO on smarter spending

The final day of the summit featured a short, clear speech from federal CIO Vivek Kundra, in which he challenged the federal government to spend less on IT. Video is embedded below:


Note: Presentations at the summit, from Vint Cerf, the grandfather of the Internet, to the futurism of David W. Cearley, VP and Gartner fellow, and the analysis of Microsoft's Lewis Shepherd, all provided provocative views of what's to come in technology. Look for a post on their insights next week.




The efficiencies and issues surrounding government's use of technology will be explored at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation.

July 12 2010

Crowdsourcing the search for aliens

Research from the Search for Extra-Terrestrial Intelligence (SETI) project is well known to most technologists because the SETI@Home initiative was one of the first widely adopted distributed computing applications.

Although decades of listening and analyzing radio signals have yet to yield proof of alien intelligence, the pursuit has resulted in significant advances in signal processing technology, as well as serendipitous discoveries in radio astronomy. Now Jill Tarter, director of the Center for SETI Research at the SETI Institute, wants to take the distributed analysis of radio signals to the next level. Tarter, a speaker at the upcoming OSCON conference, discusses her new initiatives in the following Q&A.

How is your new project different from existing distributed computing projects, such as SETI@Home?

Jill Tarter: SETI@Home came on the scene a decade ago, and it was brilliant and revolutionary. It put distributed computing on the map with such a sexy application. But in the end, it's been service computing. You could execute the SETI searches that were made available to you, but you couldn't make them any better or change them.

We'd like to take the next step and invite all of the smart people in the world who don't work for Berkeley or for the SETI Institute to use the new Allen Telescope to look for signals that nobody's been able to look for before, because we haven't had our own telescope and we haven't had the computing power.

At the moment, we're swamped with data. We can't process it all in real-time. Ten years from now, Moore's law will allow us to catch up. Ten years after that, we'll probably be data-starved.
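That timeline is easy to sanity-check: assuming processing power per dollar doubles roughly every two years (the doubling period is an assumption, and estimates vary), capacity grows about 32-fold in a decade and roughly 1,000-fold in two.

    # Rough sanity check of the "catch up in ten years" estimate.
    # Assumes compute per dollar doubles every two years; that period is an assumption.
    doubling_period_years = 2.0
    for years in (10, 20):
        growth = 2 ** (years / doubling_period_years)
        print(f"After {years} years: roughly {growth:,.0f}x today's processing capacity")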

Our study has typically been done by analyzing the data in near real-time with things we've invented and custom-built over the years. We're about to change that by going into a cluster environment for the first time, and not building any accelerator cards or any special purpose hardware. That means anybody can help us write software to make this better. We're trying to get our code cleaned up enough to publish as open source and then let anybody do what they want with it.

Once a week we capture about eight hours of data that we're putting in the cloud. Thus far, people have been downloading the big datasets, which is a bummer, and operating on them in their own environment, using their own analysis tools, looking for different things. What we want to do, and what I'm hoping to demonstrate at OSCON, is release a new API that we've co-developed with a startup called Cloudant that will allow people to compile and debug their code locally and then upload it and operate using EC2 resources that Amazon has provided.
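To give a flavor of the workflow Tarter describes, here is the kind of narrowband-signal search a participant might prototype and debug locally on a small chunk of data before uploading it to run at scale. It is purely illustrative and is not the actual SETI Institute or Cloudant API.

    # Illustrative only: a narrowband-signal search a participant might prototype
    # locally before uploading it to run against the full dataset.
    import numpy as np

    def find_narrowband_candidates(samples, sample_rate_hz, threshold_sigma=6.0):
        """Return frequencies whose power sticks well above the noise floor."""
        spectrum = np.abs(np.fft.rfft(samples)) ** 2
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
        noise = np.median(spectrum)
        sigma = np.std(spectrum)
        return freqs[spectrum > noise + threshold_sigma * sigma]

    # Debug locally on synthetic data: noise plus one injected tone at 1.25 kHz.
    rate = 8192
    t = np.arange(rate) / rate
    chunk = np.random.randn(rate) + 0.5 * np.sin(2 * np.pi * 1250 * t)
    print(find_narrowband_candidates(chunk, rate))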

How can people who aren't math wonks get involved?

JT: For people who don't have black belts in digital signal processing, we want to take regions of the spectrum that are overloaded with signals and get those out and have them visualized in different ways against different basis vectors. We'd like to see if people can use their pattern recognition capabilities to look or maybe listen; to tease out patterns in the noise that we don't know about.

That'll be a big challenge, and there will be a lot of matching of visual patterns that are real or imagined by the observer against known patterns of interference. That can involve a lot more of the world. Perhaps we can make it into a game.
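One natural way to put that data in front of human pattern-spotters is a waterfall plot, with time on one axis and frequency on the other, where interference or a drifting narrowband tone shows up as structure the eye picks out quickly. A minimal matplotlib sketch on synthetic data:

    # Minimal waterfall (spectrogram) sketch on synthetic data, of the sort a
    # human pattern-spotter might scan for drifting narrowband signals.
    import numpy as np
    import matplotlib.pyplot as plt

    rate = 8192
    t = np.arange(30 * rate) / rate                         # 30 seconds of "observation"
    drifting_tone = np.sin(2 * np.pi * (1000 + 5 * t) * t)  # tone drifting upward in frequency
    signal = np.random.randn(t.size) + 0.3 * drifting_tone

    plt.specgram(signal, NFFT=1024, Fs=rate, noverlap=512)
    plt.xlabel("Time (s)")
    plt.ylabel("Frequency (Hz)")
    plt.title("Synthetic waterfall: can you spot the drifting tone?")
    plt.show()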

How is the Allen Telescope different from traditional radio telescopes such as the Very Large Array?

JT: The Allen Telescope is the first of what we call Large Number of Small Dishes (LNSD) arrays, a new way of building telescopes. It's a radio interferometer. That isn't new; we've had interferometers since the '70s. But by creating the equivalent of a large telescope out of lots of small pieces, using consumer technologies wherever possible and putting the complexity into computing, we've changed the paradigm and brought the cost down. I hope that we'll use it to change the world by detecting evidence of another technology or by discovering some new astrophysical phenomenon that no one has yet thought of.
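The economics rest on simple geometry: collecting area grows with the square of dish diameter, so many small dishes can add up to one big one. The figures below, roughly the Allen Telescope Array's initial 42 dishes of about 6.1 meters each, are quoted from public descriptions of the array and used only for illustration.

    # Rough equivalent-aperture arithmetic for a Large Number of Small Dishes array.
    # Dish count and diameter are approximate figures for the ATA's first phase.
    import math

    n_dishes = 42          # dishes in the initial Allen Telescope Array build
    dish_diameter_m = 6.1  # approximate diameter of each dish

    total_area = n_dishes * math.pi * (dish_diameter_m / 2) ** 2
    equivalent_diameter = 2 * math.sqrt(total_area / math.pi)
    print(f"Total collecting area: {total_area:.0f} m^2")
    print(f"Equivalent single dish: ~{equivalent_diameter:.0f} m across")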

The Drake equation is a well-known estimate of the number of intelligent alien civilizations that might exist in our galaxy. Does the recent discovery of planets around other stars shift the equation?

JT: Yes, in the sense that we're reducing the error bars on the fraction of sun-like stars that have planets. Within a couple of years, thanks to the Kepler mission, I think we'll have found the first Earth analog. That's going to make a big difference in people's perceptions about life elsewhere. So far, the planetary systems that we're finding look a bit strange when compared to ours. But when we actually find analogs, people will begin to say, "Wow, maybe there are other technological civilizations out there."

We're also moving from the other direction. We're starting to give microbes the respect they deserve, and we're getting blown away by the capabilities of extremophiles. Millions of years of evolution have made these things perfect for living and growing in battery acid and in all kinds of extreme conditions. So what we're also doing is broadening the potentially habitable real estate in the universe. It might not actually have to be quite such a Goldilocks "just right" planet for life to originate and evolve into something that's technological, although not humanlike. I think there's a real estate boom going on out there.
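For readers who haven't seen it, the Drake equation Tarter refers to is just a product of factors: the rate of star formation, the fraction of stars with planets (the term Kepler-era discoveries are tightening), the number of habitable planets per system, and the odds of life, intelligence, technology and longevity. The values below are arbitrary placeholders meant only to show the structure, not estimates from this interview.

    # The Drake equation: N = R* x fp x ne x fl x fi x fc x L.
    # All parameter values are arbitrary placeholders, not estimates from the article.
    def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
        """Estimated number of detectable civilizations in the galaxy."""
        return R_star * f_p * n_e * f_l * f_i * f_c * L

    # f_p (fraction of stars with planets) is the term new planet discoveries constrain.
    N = drake(R_star=7,     # new stars formed per year in the galaxy (placeholder)
              f_p=0.5,      # fraction of stars with planets (placeholder)
              n_e=2,        # potentially habitable planets per such system (placeholder)
              f_l=0.3,      # fraction of those where life arises (placeholder)
              f_i=0.1,      # ... that develop intelligence (placeholder)
              f_c=0.1,      # ... that develop detectable technology (placeholder)
              L=10000)      # years a civilization remains detectable (placeholder)
    print(f"Civilizations we might detect: ~{N:.0f}")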


OSCON will be held July 19-23 in Portland, Ore. Radar readers can save 20% on registration with the discount code OS10RAD.
