
July 26 2013

Podcast: quantum computing with Pete Worden and Bob Lee


Hangar One at Moffett Field, built for the US Navy’s early dirigible program. Photo via Wikipedia.

At Sci Foo Camp a few weeks ago we recorded a conversation with Pete Worden, director of NASA’s Ames Research Center, and Bob Lee, CTO of Square. Among our topics on this wide-ranging podcast: quantum computing, which Ames is pursuing in partnership with Google; fraud prevention; and the remarkable Hangar One, built to accommodate dirigible aircraft at Moffett Field (the former Navy base that’s now home to Ames).

On this recording from O’Reilly: Jon Bruner, Jim Stogdill and Renee DiResta. Subscribe to the O’Reilly Radar Podcast through iTunes, SoundCloud, or directly through our podcast’s RSS feed.

January 31 2013

NASA launches second International Space Apps Challenge

From April 20 to 21, over Earth Day weekend, the second International Space Apps Challenge will invite developers on all seven continents to the bridge to contribute code to NASA projects.


Given longstanding concerns about the sustainability of apps contests, I was curious about NASA’s thinking behind launching this challenge. When I asked NASA’s open government team about the work, I immediately heard back from Nick Skytland (@Skytland), who heads up NASA’s open innovation team.

“The International Space Apps Challenge was a different approach from other federal government ‘app contests’ held before,” replied Skytland, via email.

“Instead of incentivizing technology development through open data and a prize purse, we sought to create a unique platform for international technological cooperation through a weekend-long event hosted in multiple locations across the world. We didn’t just focus on developing software apps, but actually included open hardware, citizen science, and data visualization as well.”

Aspects of that answer will please many open data advocates, like Clay Johnson or David Eaves. When Eaves recently looked at apps contests, in the context of his work on Open Data Day (coming up on February 23rd), he emphasized the importance of events that build community and applications that meet the needs of citizens or respond to business demand.

The rest of my email interview with Skytland follows.

Why is the International Space Apps Challenge worth doing again?

Nick Skytland: We see the International Space Apps Challenge event as a valuable platform for the Agency because it:

  • Creates new technologies and approaches that can solve some of the key challenges of space exploration, as well as making current efforts more cost-effective.
  • Uses open data and technology to address global needs to improve life on Earth and in space.
  • Demonstrates our commitment to the principles of the Open Government Partnership in a concrete way.

What were the results from the first challenge?

Nick Skytland: More than 100 unique open-source solutions were developed in less than 48 hours.

There were 6 winning apps, but the real “result” of the challenge was a 2,000+ person community engaged in and excited about space exploration, ready to apply that experience to challenges identified by the agency at relatively low cost and on a short timeline.

How does this challenge contribute to NASA’s mission?

Nick Skytland: There were many direct benefits. The first International Space Apps Challenge offered seven challenges specific to satellite hardware and payloads, including submissions from at least two commercial organizations. These challenges received multiple solutions in the areas of satellite tracking, suborbital payloads, command and control systems, and leveraging commercial smartphone technology for orbital remote sensing.

Additionally, a large focus of the Space Apps Challenge is on citizen innovation in the commercial space sector, lowering the cost and barriers to space so that it becomes easier to enter the market. By focusing on citizen entrepreneurship, Space Apps enables NASA to be deeply involved with the quickly emerging space startup culture. The event was extremely helpful in encouraging the collection and dissemination of space-derived data.

As you know, we have amazing open data. Space Apps is a key opportunity for us to continue to open new data sources and invite citizens to use them. Space Apps also encouraged the development of new technologies and new industries, like the space-based 3D printing industry and open-source ROVs (remotely operated submersibles for underwater exploration).

How much of the code from more than 200 “solutions” is still in use?

Nick Skytland: We didn’t track this last time around, but almost all (if not all) of the code is still available online, many of the projects continued on well after the event, and some teams continue to work on their projects today. The best example of this is the Pineapple Project, which participated in numerous other hackathons after the 2012 International Space Apps Challenge and just recently was accepted into the Geeks Without Borders accelerator program.

Of the 71 challenges that were offered last year, a low percentage were NASA challenges — about 13, if I recall correctly. There are many reasons for this, mostly that cultural adoption of open government philosophies within government is just slow. What last year did for us is lay the groundwork. Now we have much more buy-in and interest in what can be done. This year, our challenges from NASA are much more mission-focused and relevant to needs program managers have within the agency.

Additionally, many of the externally submitted challenges we have received come from other agencies that are interested in using Space Apps as a platform to address their needs. Most notably, we recently worked with the Peace Corps on the Innovation Challenge they offered at RHoK in December 2012, with great results.

The International Space Apps Challenge was a way for us not only to move forward technology development, drawing on the talents and initiative of bright-minded developers, engineers, and technologists, but also a platform to actually engage people who have a passion and desire to make an immediate impact on the world.

What’s new in 2013?

Nick Skytland: Our goal for this year is to improve the platform, create an even better engagement experience, and focus the collective talents of people around the world on developing technological solutions that are relevant and immediately useful.

We have a high level of internal buy-in at NASA and a lot of participation outside NASA, from both other government organizations and local leads in many new locations. Fortunately, this means we can focus our efforts on making this a meaningful event, and we are well ahead of the curve in terms of planning to do this.

To date, 44 locations have confirmed their participation and we have six spots remaining, although four of these are reserved as placeholders for cities we are pursuing. We have 50 challenge ideas already drafted for the event, 25 of which come directly from NASA. We will be releasing the entire list of challenges around March 15th on spaceappschallenge.org.

We have 55 organizations so far that are supporting the event, including seven other U.S. government organizations and international agencies. Embassies or consulates are either directly leading or hosting the events in Monterrey, Krakow, Sofia, Jakarta, Santa Cruz, Rome, London and Auckland.

 

August 08 2012

Curiosity rover: why does sci-fi always look more marvellous than reality? | Jonathan Jones

These ordinary looking views of Mars sent by Nasa's rover are beautiful and moving precisely because they are so ordinary

The landscape of Mars glows in a dust-rich sunset. The sky is yellow. The rocks are red. It is a place of – literally – unearthly beauty. But have we already ruined it? In the week that Nasa landed its latest robot explorer Curiosity on the surface of Mars, this picture reveals the wreckage of earlier landers cluttering up the Martian desert, reducing its pristine strangeness to a dumping ground of human space dreams. How typical of the earthlings to make a wasteland of Mars.

No, wait, I misread the caption. This is not a picture taken by Curiosity in its first week on Mars. It is a digitally created image by artist Kelly Richardson. It imagines what Mars might look like in 200 years if we keep sending probes there. It is, in other words, science fiction.

Why does science fiction always look more marvellous than the real landscapes of alien worlds? The pictures that have so far come from Curiosity are nothing like as grabbing as this fantastic image. The first photograph it sent showed a skewed vista of dust and heat with just the misty outline of a horizon. Nasa had to patch it into previous images of the planet to make sense of it. It's all very well scientists saying these first pictures from Curiosity are the most beautiful things they have ever seen – the red planet is far more spectacular in art and other fantastic images.

Richardson is in a very long line of artists who have pictured Mars. Long, long ago, Mars was a god. Botticelli's painting Venus and Mars depicts the god of war lulled to sleep and invokes the magical influence of his planet.

This might seem like ancient baloney but it is no more far-fetched than the Mars of sci-fi. A lurid painting of Martians disporting themselves under the planet's glorious sky in a landscape of pyramids, towers and blue canals epitomises the image of Mars that was dreamed up in 20th-century science fiction before Viking, the first unmanned Nasa lander, started to reveal Martian realities in 1976. Mars was for a long time the favourite planet for imagined alien life. It seemed utterly alien and the "canals" visible on its surface from Earth were held to be the work of some grand civilisation. Even today, science fiction images of Mars outdo mere reality. A 2008 Doctor Who special pictured Mars as the home of a base where the first human explorers are attacked by watery beings from below. A base – there's always a base. Bases are so much more glamorous than unmanned computerised buggies with cameras on front.

Enough. The scientists are right of course. The comparative dullness of Curiosity's first pictures from Mars is the point (and their vagueness will be forgotten when it starts sending back high-definition images). These ordinary looking views of Mars are beautiful and moving precisely because they are so ordinary.

The ordinariness of Mars is its magic. It looks like a red desert on Earth because it is the mirror of Earth – as are all planets everywhere. Everything in the universe is made of the same elements, according to the same physical laws. The discovery that nothing in space is truly "alien" and every object out there (or rather out here – we're just another thing in space) started when Galileo aimed his telescope at the moon. From one point of view the history of astronomy and space exploration is the story of how the universe became banal. But this banality is more glorious than any imaginary spectacle of an alien world where little green men drive motorboats up and down their glittering canals.






August 06 2012

The miracle of a thumbnail image from Mars

Last night, I stayed up late to watch the NASA livestream of the Curiosity rover landing. It seems to have been an unmitigated success: each step of the entry and landing process, even that crazy sky-crane maneuver, was performed flawlessly.

Although there were tearful hugs and high-fives and all manner of cheering when “Touchdown!” was called, the wonderment built to a real climax when the first thumbnail image came through. It was small, in black and white, and showed the Martian horizon in the background, with the wheel of the rover in the foreground.

Shortly thereafter, a slightly larger version was displayed: still black and white, but with enough resolution to show dust on the glass. A second one followed a few minutes later, showing the rover’s shadow on the ground. Cue the “pics or it didn’t happen” jokes, as well as the rapid proliferation of Photoshopped spoofs.

Image from the Curiosity rover on Mars
One of the first images from the Curiosity rover.


In our micro-culture of the moment, obsessed with photo sharing and images, this tiny thumbnail still seemed like a miracle (albeit a required one). A picture really is worth a whole lot of words. But have you ever stopped to think about what it takes to plan for that from Mars?

We take for granted being able to snap a great-looking picture and send it wirelessly to almost anywhere we want with the tap of a few icons, but transmitting images back from another planet is a complicated process.

I couldn’t help but think about the images that came back from the Phoenix lander in 2008, and the excellent chapter J.M. Hughes, principal software engineer for the imaging software on Phoenix, wrote in Beautiful Data:

The challenge was to devise a way to download the image data from each of the cameras, store the data in a pre-allocated memory location, process the data to remove known pixel defects, crop and/or scale the images, perform any commanded compression, and then slice-and-dice it all up into packets for hand-off to the main computer’s downlink manager task for transmission back to Earth.

And all of this must be done carefully, sparingly, in order to conserve resources. As Hughes put it, “A spacecraft is an exercise in applied minimalism: just enough to do the job and no more.”
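For readers who want a concrete picture of that pipeline, here is a minimal Python sketch of the same sequence of steps. It is purely illustrative: the function names, defect map, and packet size are invented for the example and are not drawn from the actual Phoenix flight software.

```python
# Illustrative sketch of the image-handling steps Hughes describes:
# clean known pixel defects, crop, (optionally compress), and packetize
# for downlink. All names and sizes here are hypothetical.

MAX_PACKET_BYTES = 1024               # assumed downlink packet payload size
KNOWN_BAD_PIXELS = {(0, 0), (5, 7)}   # assumed defect map for one camera


def remove_defects(image, bad_pixels):
    """Replace each known bad pixel with the value of a neighboring pixel."""
    fixed = [row[:] for row in image]
    for r, c in bad_pixels:
        if r < len(fixed) and c < len(fixed[r]):
            fixed[r][c] = fixed[r][c - 1] if c > 0 else fixed[r][c + 1]
    return fixed


def crop(image, rows, cols):
    """Crop the image to the commanded size."""
    return [row[:cols] for row in image[:rows]]


def packetize(data, packet_size):
    """Slice the byte stream into fixed-size packets for the downlink manager."""
    return [data[i:i + packet_size] for i in range(0, len(data), packet_size)]


def process_image(image):
    cleaned = remove_defects(image, KNOWN_BAD_PIXELS)
    cropped = crop(cleaned, rows=8, cols=8)
    raw = bytes(v & 0xFF for row in cropped for v in row)  # no compression in this sketch
    return packetize(raw, MAX_PACKET_BYTES)


if __name__ == "__main__":
    frame = [[r * 16 + c for c in range(16)] for r in range(16)]
    packets = process_image(frame)
    print(f"{len(packets)} packet(s) ready for hand-off to the downlink manager")
```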

In honor of the Curiosity’s inspiring success, we’re making Hughes’ chapter available here. Reading about some of the design trade-offs required in building and successfully deploying the imaging software on a Mars spacecraft makes Curiosity’s achievement all the more amazing.

Images from the Curiosity rover can be found here.

Curiosity image: NASA/JPL-Caltech

Related:

August 03 2012

They promised us flying cars

We may be living in the future, but it hasn’t entirely worked out how we were promised. I remember the predictions clearly: the 21st century was supposed to be full of self-driving cars, personal communicators, replicators and private space ships.

Except, of course, all that has come true. Google just got the first license to drive their cars entirely autonomously on public highways. Apple came along with the iPhone and changed everything. Three-dimensional printers have come out of the laboratories and into the home. And in a few short years, and from a standing start, Elon Musk and SpaceX have achieved what might otherwise have been thought impossible: late last year, SpaceX launched a spacecraft and returned it to Earth safely. Then they launched another, successfully docked it with the International Space Station, and then again returned it to Earth.

The SpaceX Dragon capsule is grappled and berthed to the Earth-facing port of the International Space Station’s Harmony module at 12:02 p.m. EDT, May 25, 2012. Credit: NASA/SpaceX


Right now there is a generation of high-tech tinkerers breaking the seals on proprietary technology and prototyping new ideas, which is leading to a rapid growth in innovation. The members of this generation, who are building open hardware instead of writing open software, seem to have come out of nowhere. Except, of course, they haven’t. Promised a future they couldn’t have, they’ve started to build it. The only difference between them and Elon Musk, Jeff Bezos, Sergey Brin, Larry Page and Steve Jobs is that those guys got to build bigger toys than the rest of us.

The dotcom billionaires are regular geeks just like us. They might be the best of us, or sometimes just the luckiest, but they grew up with the same dreams, and they’ve finally given up waiting for governments to build the future they were promised when they were kids. They’re going to build it for themselves.

The thing that’s driving the Maker movement is the same thing that’s driving bigger shifts, like the next space race. Unlike the old space race, pushed by national pride and the hope that we could run fast enough in place so that we didn’t have to start a nuclear war, this new space race is being driven by personal pride, ambition and childhood dreams.

But there are some who don’t see what’s happening, and they’re about to miss out. Case in point: a lot of big businesses are confused by the open hardware movement. They don’t understand it, don’t think it’s worth their while to make exceptions and cater to it. Even the so-called “smart money” doesn’t seem to get it. I’ve heard moderately successful venture capitalists from the Valley say that they “… don’t do hardware.” Those guys are about to lose their shirts.

Makers are geeks like you and me who have decided to go ahead and build the future themselves because the big corporations and the major governments have so singularly failed to do it for us. Is it any surprise that dotcom billionaires are doing the same? Is it any surprise that the future we build is going to look a lot like the future we were promised and not so much like the future we were heading toward?

Related:

February 16 2012

Developer Week in Review: NASA says goodbye to big iron

It looks like I'm going to have a life-changing decision to make in the next few weeks, one that will be shared by millions of people around the world. At risk, the balance in my bank account.

I refer, of course, to whether I'll pony up the cash to upgrade my iPad 2 to a 3, once Apple actually tells us what the iPad 3 will have in it. Unless it cooks gourmet dinners and transports you to other planets, my best guess is that I won't. For one thing, we're also facing the release of the iPhone 5 later in the year, and I make it a policy only to do one Apple fan-boy "upgrade the expensive toy you just bought last year" purchase a year. For another, it looks like the 3 is going to be a faster version of the 2 with a Retina display, and I just can't see it being enough of a delta in features to make it worth the cost.

If I'm going to upgrade either device, I need cash in the bank, so time to earn my keep with this week's news.

HAL is crestfallen ...

We arrive at a bit of a milestone this week, as NASA says goodbye to the last piece of big iron left in its data processing infrastructure. With the retirement of the last IBM Z9, NASA finishes its mission to boldly go where most of the rest of the high tech world had already gone years ago. I especially liked the shout-out to old-school programmers in JCL at the end of NASA's blog post marking the occasion.

NASA, like many organizations running life-critical applications, has to take a very conservative approach to hardware upgrades, because failure is not an option. The computers installed into NASA space vehicles and probes are notorious for being generations behind the current state of the art, because of the long lead times to get them spec'd out and installed. Obviously, no mainframe flies into space, for reasons of weight and space if nothing else. You can see the same kind of excruciatingly slow hardware progress at agencies like the FAA, which can take a human generation to upgrade to a new air traffic control system.

For now, let us bid farewell to the brave Z9, last of its kind at NASA. It would be nice to fantasize that it was responsible for some intricate detail of manned space flight, but the reality is that it evidently ran business applications. Even so, if you don't pay the engineers and vendors, they don't work, so it did play its own sort of role in the exploration of the universe.

Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20

Monty Redmond's Visual Python

Visual Studio, like Eclipse and Xcode, provides IDE support for a huge swath of the developer community. While it's still common to find old-schoolers who use Emacs or vi to grind out code, most programmers these days end up using an IDE to take advantage of the debugging and integrated documentation features they provide.

Eclipse is well-known for the wide variety of languages and platforms it supports, but it's easy to forget that Microsoft is making a concerted effort to open up Visual Studio to a wider developer audience as well. One sign of this is the version 1.1 release of Python Tools for Visual Studio, which has just come out. This toolkit is notable for another reason, too: it's one of the projects coming out of Microsoft's Codeplex open source initiative.

I know I'm not alone in having been skeptical of Microsoft's recent warming to open source. It's easy to see it as yet another "embrace, extend and extinguish" play. But at a certain point, you have to say that if it walks and talks like a mule, it may in fact be a mule after all. While I don't expect to see the Windows XP source code being donated to Apache anytime soon, it does appear that Microsoft is making an honest effort to leverage the power of the open source model where it makes sense. That's a huge change from the company's previous "open source is communism" stance. As with most things, time will tell if this is the real deal.

I guess we'll find out what happens when you cross the streams ...

Open source developers have a reputation for bringing a passion, sometimes at an obsessive level, to the projects they work on. But even they would find themselves challenged to keep up with the frenzied level of creative mania displayed by bronies, adult fans of the new My Little Pony reboot. So what happens when you combine the two forces of open source and the brony herd? Wonder Twin developer powers activate!

"PonyKart" is a "Mario Kart"-style game set in the "My Little Pony: Friendship is Magic" universe. It's being developed by a group of brony developers over on SourceForge. It's still in the early days, but the initial videos they've released are impressive.

There's a reason you don't see a lot of open source games with this level of complexity; it's a fairly massive undertaking and is usually only within the resources of major game houses. There is a very capable Linux "MarioKart" clone out there, but consider that the "PonyKart" folks have only been in operation since July of last year, compared to the six years of development that have gone into "SuperTuxKart" so far, and you can get a feel for the awesome power that can be brought to bear when two committed movements overlap. To be fair, there are more tools available now — such as physics engines — than when "SuperTuxKart" started development, but the "PonyKart" effort is still striking. Imagine what could happen if we could get the Gleeks interested in video editing software ...

Tying in another theme often harped upon in these pages, the reason PonyKart can happen at all is that Hasbro has gone out of its way to apply a light hand as far as their intellectual property is concerned. Rather than wrapping a death-grip around the My Little Pony characters, Hasbro has let fans pretty much run wild with them (including the inevitable Rule 34 stuff). The company has wisely decided to let the fans churn up a meme-storm, while it sits back and counts the profits from toy sales. Are you listening, RIAA and MPAA? You could do much better by cooperating with your fan base, rather than persecuting them.

Of course, "PonyKart" could still lose momentum and die. There's a big difference between a long-term effort and horsing around for a few months (see what I did there?). But given the evidence to date, I wouldn't count this nag out of the race yet.

(Obligatory full disclosure: Your humble chronicler is a member of the herd, although not involved in the "PonyKart" project.)

Got news?

Please send tips and leads here.

Related:

October 23 2011

Snaps from space: Astronauts' photographs go on sale

Vintage prints of pictures taken by Neil Armstrong, Buzz Aldrin and others to be auctioned by Bloomsbury in London

They often used Hasselblad cameras from Sweden modified only by the addition of a bigger button to press, but then taking pictures when you are an astronaut in a bulky, pressurised suit is clearly tricky.

Many of the astronauts' early space photographs have become extremely famous, more for their otherworldly beauty than their scientific value.

And now some are to appear in the UK's first dedicated sale of vintage Nasa photographs.

Bloomsbury Auctions in London has announced details of the first specialist sale of images showing how man came to land on the moon.

"We are thrilled," said Sarah Wheeler, Bloomsbury's photographs specialist.

"What we are offering are historic artefacts – rare, iconic, vintage photographs taken by the astronauts themselves and printed within days of their return to Earth and very different from today's downloadable images."

More than 280 photographs, with estimated values ranging from £200 to £10,000, will be auctioned. They have been collected over decades by Frenchman Victor Martin-Malburet, who has exhibited them in Paris and Saint-Etienne.

Some of the most striking images in the collection are of Ed White's spacewalk in 1965, part of the Gemini 4 mission.

White was the first American to walk in space. His walk was photographed by fellow astronaut James McDivitt – who was looking out of the craft without really being able to see what he was shooting at.

"He was remarkably successful considering he couldn't really frame the pictures," said a Bloomsbury spokesman.

Other highlights include the first view of Earth from the moon, taken on 23 August 1966 and shown publicly on 10 September.

It is a grainy image but the technological feat of making it happen at all should not be underestimated – the pictures were taken by an unmanned satellite which also developed them and sent them back to Earth as radio waves.

There are also images of a Gemini 12 spacewalk by Buzz Aldrin in November 1966 including one taken by the astronaut himself – using his modified Hasselblad with the big button – which Bloomsbury has billed as the "first self-portrait in space".

One of the most recognisable images is Earthrise, taken by William Anders on Christmas Eve in 1968 from Apollo 8.

Anders explained that they had spent all their time on Earth studying the moon and when they got there, they could see a fragile and delicate-looking Earth.

"I was immediately almost overcome by the thought that we came all this way to the moon, and yet the most significant thing we're seeing is our own home planet, the Earth."

And of course there is Apollo 11 – the mission that landed the first men on the moon – and photographs by Aldrin of his footprints.

Because flight leader Neil Armstrong was often taking the photographs, there are not many pictures of him. But there is the famous image Armstrong took of Aldrin in which he is reflected in Aldrin's gold-plated visor.

The photographs are all vintage prints – made soon after the event depicted. The more expensive ones are the large-format prints that were often presented to scientists or dignitaries. The sale takes place on 3 November.




April 07 2011

Open source is mission critical for NASA

Last week's inaugural NASA Open Source Summit in Mountain View, Calif., was aimed at identifying the agency's issues surrounding open source tools and using the power of the web to collect policy recommendations.

The Summit organizers convened thinkers and leaders in open source technology from around the country, including representatives from RedHat, GitHub, Google, the Department of Defense, Mozilla, IBM and from within NASA itself. If you want to review the play by play, I liveblogged the NASA Open Source Summit at Govfresh.

In a separate report, eWeek picked up on the difficult licensing and export issues open source presents to NASA. The Register focused on the talk that Google's open source and public sector engineering manager Chris DiBona gave, where he emphasized that open source won't kill you. Red Hat's OpenSource.com report noted the overall gist of the event: make openness the default.

Over the course of two days at the NASA Ames center, I spoke with many of the presenters and attendees about what they shared or learned. Below, Chris Wanstrath (@defunkt), one of the co-founders of GitHub, spoke at length about open source at NASA.

Geek lesson: "I guess this shouldn't be surprising, but we deal a lot at GitHub with collaboration in the software development process," Wanstrath said. "It seems like the big thing here [at the Summit] is a higher level of that. Licensing, process issues, getting the sign-off to just put your code out into the wild. We sort of deal in the wild and here, they're just trying to get to that point."

Gil Yehuda (@gyehuda), former Forrester analyst and current director of open source at Yahoo, spoke frankly about the pain points NASA is experiencing in the open source arena:

Key takeaway: Open source policies "should be a service, not a barrier," Yehuda said. There are "ridiculously brilliant people here, working on the most amazing things, and yet they're encumbered by processes."

Three NASA Open Source Summit presentations of note

You can find all of the presentations delivered at the Summit on SlideShare. Three are particularly interesting, although all are worth browsing.

Terry Fong of the Intelligent Robotics Group shared his view of open source at NASA from a practitioner's perspective (the full presentation is posted here).

Key stat: Fong said it takes six months for code to be released as open source.

David Wheeler (presentation available here), from the Institute for Defense Analyses at the Defense Department, offered a cogent analysis of the NASA Open Source Agreement (NOSA).

Fundamental issue: NOSA is incompatible with other open source licenses. Open source experts at the Summit strongly recommended that NASA consider GPL, BSD or another open source license.

Chris Mattmann, from the NASA Jet Propulsion Laboratory, shared an open source strategy for NASA that will be useful for any government agency that's interested in open source or building communities (his full presentation is posted here). Mattmann may be familiar to regular Radar readers: last year he spoke with me about how NASA open source technology is leading to better medical decisions.

Key insight: The measure of a healthy community is its responsiveness. Mattmann has considerable insight into what it takes to build healthy open source communities: his OODT project is one of the first to work within the Apache Incubator.

NASA and open government

For insight into the open government aspect of the Summit, I talked with Nick Skytland (@skytland), the director of the NASA open government initiative. "I think the general message has been sent that open source is a valuable tool for NASA to consider in the future, and I think we're going down that path," he said.

Along with talking about some of the tools and platforms that NASA uses, Skytland described what open government means in context:

We have technology, we have policy, which is what this event is really all about, but fundamentally, open government is about culture — changing culture to embrace technology, getting permission to experiment and try things that maybe we hadn't considered before. To improve the way we do business and also to contribute to our mission.

NASA as an example for collaboration and open source

This Summit, which combined virtual and in-person participation, is a worthy example of collaboration around improving government. Lucas Cioffi, chairman of the Open Forum Foundation, talked about how the organizers brought different groups together:

This was a highly participatory event — not just collecting tweets, not just a typical chat, but actually engaging people in deeper conversation. We used the very interesting phone conference tool called Maestro Conference. It took the dozens and dozens of people that were interested in deeper discussion and broke them out in smaller discussions. Most importantly, we blended the in-person and remote participation together using Google Docs so that people who were not present were still in the same conversation. They were building on each other's ideas, commenting on each other's ideas.

Whether it's a huge federal institution or small town council, embracing the spirit of open government means officials have to move beyond what information technology by itself can accomplish. That's a tall order. It requires a cultural change. To put it another way, open government is a mindset.

That's one of the reasons why NASA getting open source right will make a difference to governments everywhere. If the agency can find a way to connect its vast amounts of data and scientific expertise to more people, both NASA and citizens will benefit. When the White House open sourced its IT dashboard code last week, it created a footpath for other agencies to follow when they release code. NASA's policy development and open source efforts could offer similar guidance.



Related:


March 01 2011

The NASA Make Challenge

I'm excited to announce the launch of the first NASA Make Challenge: Experimental Science Kits for Space.

Last year, I met with Lynn Harper and Daniel Rasky of the Space Portal at NASA Ames to talk about ideas for a DIY space issue of Make, which became Make Volume 24. In that same conversation, we talked about the role that makers could play in space exploration. I recall Lynn saying that we needed "not hundreds of experiments going into space but hundreds of thousands of experiments." There is so much we don't know; so much we could learn, she added, if we simply had more experiments testing what happens in microgravity. The Space Portal team recognized that makers were an untapped resource, ready and willing to take on that kind of challenge. Makers just needed an open door.

The Space Portal is now known as the Emerging Commercialization Space Office (ECSO), its new name as an official NASA office. Rasky, an inventor who developed the heat shield used by SpaceX, is the Director of ECSO. We have collaborated with this new office and the Teachers in Space program to create the NASA Make Challenge, which benefits education as well as the space program.

Our first challenge is to develop inexpensive science kits that can be built in a classroom and then sent on-board suborbital flights to conduct experiments. The experiments must fit within a CubeSat, a 10 cm x 10 cm x 10 cm module. It's an opportunity to use off-the-shelf technology to design project kits that students can build and then see actually get into space. Imagine — Arduinos in space. (There has already been one Arduino sent into space.)
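To make that size constraint concrete, here is a small, hypothetical Python check of whether a kit design fits a single CubeSat unit. The 1.33 kg figure is the commonly cited 1U mass limit, and the kit dimensions are made up for the example rather than taken from the contest rules, which are still under review.

```python
# Hypothetical sanity check of a classroom kit against a 1U CubeSat envelope.
# These numbers are only for illustration, not official contest rules.

CUBESAT_EDGE_CM = 10.0   # 10 cm x 10 cm x 10 cm module
CUBESAT_MASS_KG = 1.33   # commonly cited 1U mass limit


def fits_cubesat(length_cm, width_cm, height_cm, mass_kg):
    dims_ok = all(d <= CUBESAT_EDGE_CM for d in (length_cm, width_cm, height_cm))
    return dims_ok and mass_kg <= CUBESAT_MASS_KG


if __name__ == "__main__":
    # An Arduino plus sensors and batteries, roughly estimated.
    print(fits_cubesat(9.5, 9.5, 8.0, 0.9))   # True: fits
    print(fits_cubesat(12.0, 9.5, 8.0, 0.9))  # False: too long for the module
```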

The Teachers in Space program, located in Plano, TX, will work with teachers across the country to build the first set of kits and make the arrangements for the experiments to fly on an unmanned suborbital vehicle in late summer. Later on, these teachers will work with students to build experimental kits for future flights.

If you are fascinated by space, it's a great time for you to be able to do something as a maker and make a real contribution. Makers can participate in a new kind of space program, one that expands beyond NASA to include commercial space collaboration.

Visit the NASA Make Challenge page on Makezine.com for information on how to participate in the program and sign up for a mailing list to get more information. (The rules are still under review.) The deadline for experimental science kit submissions will be April 30, 2011. The winner of the NASA Make Challenge will be honored at this year's Bay Area Maker Faire, and the winning kit project will be featured in the magazine.



Related Links


November 03 2010

Open sourcing space

The current issue of MAKE magazine might be called SPACE: The DIY Frontier. We look at how a variety of makers are exploring space and doing work that once required large budgets and a team of NASA scientists. Now with off-the-shelf components and your own initiative, you can build and launch your own satellite or weather balloon. This science-oriented issue of MAKE shows you how to roll your own space program. My introduction to the issue is posted below.


With one exception, Silicon Valley lacks monumental landmarks that signify its importance as a world capital of technology innovation. That exception is Hangar One at Moffett Field in Mountain View, Calif., which is the home of NASA Ames Research Center.

Hangar One stands out like a Great Pyramid visible from Highway 101. Built to house airships called dirigibles or zeppelins, Hangar One opened in 1933. The floor inside this freestanding structure covers eight acres, and its enormous clamshell doors were designed for the passage of these airships. The hangar reputedly creates its own climate inside, bringing rain unexpectedly to parties that were organized there, back before it was closed.

Today, the future of this historic structure depends on NASA and various groups debating whether to restore it or tear it down. (Its walls are covered with siding that contains asbestos and PCBs.) Those who would preserve it recognize its power as a cultural symbol. While the days of airships are mostly gone — Airship Ventures now runs zeppelin tours from Moffett Field — Hangar One remains an inspiration.

Inspiration was a by-product of the space race in the United States. Many, like me, thought of themselves as part of the space program, following the Mercury and Apollo missions, even if our role was limited to watching TV. The goal of a moon landing inspired young men and women to become scientists and engineers. They entered NASA with great enthusiasm to become part of something as big as they could imagine.



Many had satisfactory careers inside NASA, while others grew frustrated as NASA became a slow-moving bureaucracy. Increasingly, NASA made it harder (and more costly) to do anything. So, like the age of dirigibles, the U.S. space program that I grew up with is gone, and like Hangar One, its future is uncertain. Yet our fascination with space is not.



One cause for hope is that the future of space exploration doesn't depend solely on NASA. Bruce Pittman, who works in the Space Portal group at NASA Ames, calls this future "Space 2.0." If Space 1.0 was a "one-size-fits-all" approach with NASA controlling everything, Pittman says, then Space 2.0 depends upon "massive participation," harnessing enthusiasm and expertise in places around the globe.



Space 2.0 represents the open sourcing of space exploration, a new model that could lead to faster, cheaper ways to develop space technologies.



It's also a call for makers to participate in research and development. Just as we're seeing amateurs play a role in astronomy and other fields, amateurs will be undertaking projects in support of a next-generation space program. For example, Lynn Harper of NASA Ames points out that the commercialization of space will mean a huge increase in suborbital flights, and a growing field of research in microgravity. She says this research requires "not just hundreds of experiments to send into space, but hundreds of thousands."



In this "DIY Space" issue of MAKE, you'll meet all kinds of makers, some inside NASA but many more outside the agency. We look at how to build your own homebrew satellites that take payloads into near-space and even into orbit. We show you how to build fast, cheap gadgets to analyze galactic spectra or eavesdrop on the space station. We also look at a variety of space-related projects seeking the participation of makers like you, from smartphone satellites to lunar mining robots.

For his report "Rocket Men," Charles Platt interviewed the makers of a new private space industry. He also visited the Mojave Air and Space Port, where individuals and small companies set up to do space research. Spaceport manager Stuart Witt says, "I offer people the freedom to experiment." That's all you really need. The future, if you're so inspired, is up to you.



You can find MAKE: DIY Space Projects on newsstands and at Maker SHED.



Related:


August 26 2010

Tracking the signal of emerging technologies

Last week the words of science fiction writer William Gibson ran rampant over the Twitter back channel at the inaugural NASA IT Summit when a speaker quoted his observation that "The future is here. It's just not evenly distributed yet." That's a familiar idea to readers of the O'Reilly Radar, given its focus on picking up the weak signals that provide insight into what's coming next. So what does the future of technology hold for humanity and space flight? I've been reading the fiction of Jules Verne, Isaac Asimov, David Brin, Neal Stephenson, Bruce Sterling and many other great authors since I was a boy, and thinking and dreaming of what's to come. I'm not alone in that; Tim O'Reilly is also dreaming of augmented reality fiction these days.

Last week I interviewed NASA's CIO and CTO at the NASA IT Summit about some of that fiction made real. We discussed open source, cloud computing, virtualization, and Climate@Home, a distributed supercomputer for climate modeling. Those all represent substantive, current implementations of enterprise IT that enable the agency to support mission-critical systems. (If you haven't read about the state of space IT, it's worth circling back.)

Three speakers at the Summit offered perspectives on emerging technologies that were compelling enough to report on:

  • Former DISA CTO Lewis Shepherd
  • Gartner VP David Cearley
  • Father of the Internet Vint Cerf

You can watch Cerf speak in the embedded video below. (As a bonus, Jack Blitch's presentation on Disney's "Imagineers" follows.) For more on the technologies they discuss, and Shepherd's insight into a "revolution in scientific computing," read on.


Building an Internet in space

Even a cursory look at the NASA IT Summit Agenda reveals the breadth of topics discussed. You could find workshops on everything from infrastructure to interactivity, security in the cloud to open government, space medicine to ITIL, as well as social media and virtual worlds. The moment that was clearly a highlight for many attendees, however, came when Vint Cerf talked about the evolution of the Internet. His perspective on building resilient IT systems that last clearly resonated with this crowd, especially his description of the mission as "a term of art." Cerf said that "designing communications and architectures must be from a multi-mission point of view." This has particular relevance for an agency that builds IT systems for space, where maintenance isn't a matter of a stroll to the server room.

Cerf's talk was similar to the one he delivered at "Palantir Night Live" earlier this summer, which you can watch on YouTube or read about from Rob Pegoraro at the Washington Post.

Cerf highlighted the more than 1.8 billion people on the IP network worldwide at the end of 2009, as well as the 4.5 billion mobile devices that are increasingly stressing it. "The growth in the global Internet has almost exhausted IPv4 address space," he said. "And that's my fault." Time for everyone to learn IPv6.
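The scale gap Cerf is describing is easy to see with Python's standard ipaddress module; a quick sketch comparing the two address spaces:

```python
# Compare the size of the IPv4 and IPv6 address spaces.
import ipaddress

ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4 addresses: {ipv4_total:,}")    # about 4.3 billion
print(f"IPv6 addresses: {ipv6_total:.2e}")  # about 3.4 x 10**38
print(f"IPv6 space is {ipv6_total // ipv4_total:,} times larger")
```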

Looking ahead to the future growth of the Internet, Cerf noted both the coming influx of Asian users and the addition of non-Latin characters, including Cyrillic, Chinese, and Arabic. "If your systems are unprepared to deal with non-Latin character sets, you need to correct that deficiency," he said.

Cerf also considered the growth of the "Real World Web" as computers are increasingly embedded in "human space." In the past, humans have adapted to computer interfaces, he said, but computers are increasingly adapting to human interfaces, operating by speech, vision, touch and gestures.

Cerf pointed to the continued development of Google Goggles, an app that allows Android users to take a picture of an object and send it to Google to find out what it is. As CNET reported yesterday, Goggles is headed to iPhones this year. Cerf elicited chuckles from the audience when describing the potential for his wife's cochlear implant to be reprogrammed with TCP/IP, thereby allowing her to ask questions over a VoIP network, essentially putting his wife on the Internet. To date, as far as we know, she is not online.

Cerf also described the growing "Internet of Things." That network will include an InterPlaNetary Internet, said Cerf, or IPN. Work has been going forward on the IPN since 1998, including the development of more fault-tolerant networking that stores and forwards packets as connections become available in a "variably delayed and disrupted environment."

"TCP/IP is not going to work," he said, "as the distance between planets is literally astronomical. TCP doesn't do well with that. The other problem is celestial motion, with planets rotating. We haven't figured out how to stop that."

The "Bundle Protocol" is the key to an interplanetary Internet, said Cerf. The open source, publicly available Bundle protocol was first tested in space on the UK-DMC satellite in 2008. This method allows three to five times more data throughput than standard TCP/IP, addressing the challenge of packetized communications by hopping and storing the data. Cerf said we'll need more sensors in space, including self-documenting instruments for meta-data and calibration, in order to improve remote networking capabilities. "I'm deeply concerned that we don't know how to do many of these things," he observed.

Another issue Cerf raised is the lack of standards for cloud interoperability. "We need a virtual cloud to allow more interoperability."

Government 2.0 and the Revolution in Scientific Computing

Lewis Shepherd, former CTO at the Defense Information Systems Agency and current Director of Microsoft’s Institute for Advanced Technology in Governments, focused his talk on whether humanity is on the cusp of a fourth research paradigm as the "scale and expansion of storage and computational power continues unabated."

Shepherd put that prediction in the context of the evolution of science from experimental to theoretical to computational. Over time, scientists have moved beyond describing natural phenomena or Newton's Laws to simulating complex phenomena, an ability symbolized by comparing the use of lens-based microscopes to electron microscopes. This has allowed scientists to create nuclear simulations.

Shepherd now sees the emergence of a fourth paradigm, or "eScience," where a set of tools and technologies support data federation and collaboration to address the explosion of exabytes of data. As an example he referenced imagery of the Pleiades star cluster from the Digitized Sky Survey synthesized within the WorldWide Telescope.

"When data becomes ubiquitous, when we become immersed in a sea of data, what are the implications?" asked Shepherd. "We need to be able to derive meaning and information that wasn't predicted when the data sets were constructed. No longer will we have to be constrained by databases that are purpose-built for a system that we design with a certain set of requirements. We can do free-form science against unconstrained sets of data, or modeling on the fly because of the power of the cloud."

His presentation from the event is embedded below.

In particular, Shepherd looked at the growth of cloud computing and data ubiquity as an enabler for collaboration and distributed research worldwide. In the past, the difficulty of replicating scientific experiments was a hindrance. He doesn't see that as a fundamental truth anymore. Another liberating factor, in his view, is the evolution of programming into modeling.

"Many of the new programming tools are not just visual but hyper-visual, with drag and drop modeling. Consider that in the context of continuous networking," he said. "Always-on systems offer you the ability to program against data sets in the cloud, where you can see the emergence of real-time interactive simulations."

What could this allow? "NASA can design systems that appear to be far simpler than the computation going on behind the scenes," he suggested. "This could enable pervasive, accurate, and timely modeling of reality."

Much of this revolution is enabled by open data protocols and open data sets, posited Shepherd, including a growing set of interactions -- government-to-government, government-to-citizen, citizen-to-citizen -- that are leading to the evolution of so-called "citizen science." Shepherd referenced the Be A Martian Project, where the NASA Jet Propulsion Laboratory crowdsourced the analysis of images from Mars.

He was less optimistic about the position of the United States in research and development, including basic science. Even with President Obama's promise to put science back in its rightful place during his inaugural address, and some $24 billion in new spending in the Recovery Act, Shepherd placed total research and development as a percentage of GDP at only 0.8%.

"If we don't perform fundamental research and development here, it can be performed elsewhere," said Shepherd. "If we don't productize here, technology will be productized elsewhere. Some areas are more important than others; there are some areas we would not like to see an overseas label on. The creation of NASA was based on that. Remember Sputnik?" His observations were in parallel with those made by Intel CEO Paul Otellini at the Aspen Forum this Monday, who sees the U.S. facing a looming tech decline.

"Government has the ability to recognize long time lines," said Shepherd, "and then make long term investment decisions on funding of basic science." The inclusion of Web 2.0 into government, a trend evidenced in the upcoming Gov 2.0 Summit, is crucial for revealing that potential. "We should be thinking of tech tools that would underlay Gov 3.0 or Gov 4.0," he said, "like the simulation of data science and investment in STEM education."

Gartner's Top Strategic Technologies

Every year, Gartner releases its list of the top 10 strategic technologies and trends. Their picks for 2010 included cloud computing, mobile applications (Cearley used the term apptrepreneurship to describe the mobile application economy that is powered by the iTunes and Android marketplaces, a useful coinage I wanted to pass along), flash memory, activity monitoring for security, social computing, pod-based data centers, green IT, client computing, advanced analytics, and virtualization for availability. Important trends all, and choices that have been borne out since the analysis was issued last October.

What caught my eye at the NASA IT Summit were other emerging technologies, several of which showed up on Gartner's list of emerging technologies in 2008. Several of these are more likely to be familiar to fellow fans of science fiction than to data center operators, though to be fair I've found that there tends to be considerable crossover between the two.

Context-aware Computing
There's been a lot of hype around the "real-time Web" over the past two years. What's coming next is the so-called "right-time Web," where users can find information or access services when and where they need them. This trend is enabled by the emergence of pervasive connectivity, smartphones, and the cloud.

"It will be collaborative, predictive, real-time, and embedded," said Cearley, "adding to everyday human beings' daily processes." He also pointed to projects using Hadoop, the open source implementation of MapReduce that Mike Loukides wrote about in What is Data Science? Context-aware computing that features a thin client, perhaps a tablet, powered by massive stores of data and predictive analytics could change the way we work, live, and play. By 2015-2020 there will be a "much more robust context-delivery architecture," Cearley said. "We'll need a structured way of bringing together information, including APIs."

Real World Web
Our experiences in the physical world are increasingly integrated with virtual layers and glyphs, a phenomenon that blogger Chris Brogan described in 2008 in his Secrets of the Annotated World. Cyberspace is disappearing into everyday experience. That unification is enabled by geotagging, QR codes, RFID chips, and sensor networks. There's a good chance many more of us will be shopping with QR codes or making our own maps in real-time soon.

Augmented Reality
Context-aware computing and the Real World Web both relate to the emergence of augmented reality, which has the potential to put names to faces and much more. Augmented reality can "put information in context at the point of interaction," said Cearley, "including emerging wearable and 'glanceable' interfaces. There's a large, long term opportunity. In the long term, there's a 'human augmentation' trend."

Features currently available in most mobile devices, such as GPS, cellphone cameras, and accelerometers, have started to make augmented reality available to cutting edge users. For instance, the ARMAR project shows the potential of augmented reality for learning, and augmented reality without the phone is on its way. For a practical guide to augmented reality, look back to 2008 on Radar. Nokia served up a video last year that shows what AR glasses might offer:

Future User Interfaces
While the success of the iPad has many people thinking about touchscreens, Cearley went far beyond touch, pointing to emerging gestural interfaces like the SixthSense wearable computer at MIT. "Consider the Z-factor," he suggested, "or computing in three dimensions." Cearley pointed out that there's also a lot happening in the development of 3D design tools, and he wouldn't count virtual worlds out, though they're mired "deep in the trough of disillusionment." According to Cearley, the problem with current virtual worlds is that they're "mired in a proprietary model, versus an open, standards-driven approach." For a vision of a "spatial operating system" that's familiar to people who have seen "Minority Report," watch the video of g-speak from oblong below:

Fluid User Interface
This idea focuses on taking the user beyond interacting with information through a touchscreen or gesture-based system and into contextual user interfaces, where an ensemble of technologies allows a human to experience emotionally aware interactions. "Some are implemented in toys and games now," said Cearley, "with sensors and controls." The model would include interactions across multiple devices, including building out a mind-computer interface. "The environment is the computer." For a glimpse into that future, consider the following video from the H+ Summit at Harvard's Science Center with Heather Knight, social roboticist and founder of marilynmonrobot.com:


User Experience Platforms
Cearley contended that user experience design is more important than a user experience platform. While a UXP isn't a market yet, Cearley said that he anticipated news of its emergence later in 2010. For more on the importance and context of user experience, check out UX Week, which is happening as I write this in San Francisco. A conceptual video of "Mag+" is embedded below:

Mag+ from Bonnier on Vimeo.

3D Printing
If you're not following the path of make-offs and DIY indie innovations, 3D printing may be novel. In 2010, the 3D printing revolution is well underway at places like MakerBot industries. In the future, DARPA's programmable matter program could go even further, said Cearley, though there will need to be breakthroughs in materials science. You can watch a MakerBot in action below:

Mobile robotics driving mobile infrastructure
I experienced a vision of this future myself at the NASA IT Summit when I interviewed NASA's CTO using a telerobot. Cearley observed many applications coming for this technology, from mobile video conferencing to applications in healthcare and telemedicine. A video from the University of Louisville shows how that future is developing:

Fabric Computing
Cearley's final emerging technology, fabric computing, is truly straight out of science fiction. Storage and networking could be distributed through a garment or shelter, along with displays or interfaces. A Stanford lecture on "computational textiles" is embedded below:

August 20 2010

Space IT, the final frontier

When people think of NASA and information technology in 2010, issues like the future of manned space flight, the aging space shuttle fleet or progress on the International Space Station may come to mind. What casual observers miss is how NASA is steadily modernizing those systems, including developing open source cloud computing, virtualization, advanced robotics, deep space communications and collaborative social software, both behind the firewall and in the public eye.

NASA has also earned top marks for its open government initiatives from both the White House and an independent auditor. That focus is in-line with the agency's mission statement, adopted in February 2006, to "pioneer the future in space exploration, scientific discovery and aeronautics research," and it was on display this week at the first NASA IT Summit in Washington, D.C.

The first NASA IT Summit featured speeches, streaming video, discussions about government and innovation, and a lively Twitter back channel. Plenty of my colleagues in the technology journalism world were on hand to capture insights from NASA's initial sally into the technology conference fray, and their headlines offer insight into the flavor of the event and the focus of its keynoters.

Below you'll find my interviews with NASA CTO for IT Chris Kemp (my first interview conducted via telerobot) and NASA CIO Linda Cureton.



NASA CIO and CTO on cloud computing, virtualization and Climate@Home


During the second day of the summit, I interviewed Linda Cureton on some of the key IT initiatives that NASA is pursuing. In particular, I wondered whether NASA's open source cloud computing technology, Nebula, could be used as a platform for other agencies. "The original problem was that NASA was not in the business of providing IT services," said Cureton. "We are in the business of being innovative. To create that capability for elsewhere in government is difficult, from that perspective, yet it's something that the government needs."

Cureton described Nebula as similar to other spinoffs, where NASA develops a technology and provides it elsewhere in government. "We released the foundation of Nebula into the open source domain so that people in other agencies can take it and use it," she said. "The other major benefit is that once something is in that public domain, the contributions from others -- crowdsourcing, so to speak -- will improve it."

Current cost savings at NASA aren't rooted in the cloud, however. They're coming from data center consolidation and virtualization. "NASA is decentralized," said Cureton, "so we're seeing people finding ways to consolidate and save money in many ways. The major drivers of the virtualization that has been done are space and the desire to modernize, and to ensure a user experience that could replicate having their own resources to do things without having their own server."

Cureton observed that because of the decentralization of the agency, energy savings may not always be a driver. "Since low-hanging fruit from virtualization may have been plucked, that's where facilities managers now want to measure," she said. "From what I've learned, over the past year and a half, there's been a lot of virtualization." For instance, the NASA Enterprise Application Competency Center (NEACC) has achieved floor space reductions from data center consolidation approaching a 12 to 1 ratio, now running 337 virtual machines on 36 physical servers.

That's also meant a power reduction ratio of 6 to 1, which feeds into the focus on green technology in many IT organizations. For instance, as I reported last year, a green data center is enabling virtualization growth for Congress. Cureton emphasized the importance of metering and monitoring in this area. "If you can't measure it, you can't improve it. You need more knowledge about what you can do, like with virtualization technologies. In looking at our refresh strategy, we're looking at green requirements, just as you might with a car. There's also cultural challenges. If you don't pay the electrical bill, you care about different issues."
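The NEACC figures lend themselves to a quick back-of-the-envelope check. Here's a minimal sketch using only the numbers reported above; it simply restates the reported ratios as VM density and percentage savings, and assumes nothing about the underlying hardware.

    # Back-of-the-envelope view of the NEACC consolidation figures cited above.
    physical_servers = 36
    virtual_machines = 337
    floor_space_ratio = 12   # reported as approaching 12 to 1
    power_ratio = 6          # reported as 6 to 1

    print(f"VM density: {virtual_machines / physical_servers:.1f} VMs per physical host")
    print(f"Floor space reclaimed: {1 - 1 / floor_space_ratio:.0%}")
    print(f"Power reduction: {1 - 1 / power_ratio:.0%}")

At roughly nine virtual machines per host, the reported ratios are plausible, which is exactly the kind of sanity check Cureton's "if you can't measure it, you can't improve it" point argues for.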

Does she put any stock in EnergyStar ratings for servers? "Yes," said Cureton, whose biography includes a stint at the Department of Energy. "It means something. It's data that can be taken into account, along with other things. If you buy a single sports car, you might not care about MPG. If you're buying a fleet of cars, you will care. People who buy at scale will care about EnergyStar."

More perspective on Nebula and OpenStack

Cureton hopes agencies take Nebula code and deploy it, especially given continued concerns in government about so-called public clouds. "The things that slow people down with the public cloud include IT security and things of that nature," she said. "Once an agency understands Nebula, the model can address a lot of risks and concerns the agency might have. If you're not ready for the Amazon model, it might be a good choice to get your feet wet. The best choice is to start with lower security-class data. When you look at large, transactional databases, I'm not sure that's ready for cloud yet."

As my telerobotic interview with Chris Kemp revealed (see below), there have now been "hundreds of contributions" to the Nebula code that "taxpayers didn't have to pay for." If you missed the news, Rackspace, NASA and several other major tech players announced OpenStack at OSCON this summer. OpenStack "enables any organization to create and offer cloud computing capabilities using open source technology running on standard hardware." You can watch video of Rackspace's Lew Moorman talking about an open cloud on YouTube.

There will, however, be integration challenges for adding Nebula code to enterprise systems until the collaboration matures. "You have to realize Nebula code is in production," said Kemp in an additional interview. "The OpenStack guys basically took Nebula code as seed for the computing part. For storage, users are able to rapidly contribute Rackspace file code. Together, there eventually will be a whole environment. People are able to check out that code right now in the Nebula environment, but there's a difference between that and a mature implementation."

Kemp pointed out that both of these code bases have been taken out of large production systems. "It would be irresponsible to call it mature," he said. "The community needs to test it on all types of hardware and configurations, building infrastructures with specific security scenarios and hardware scenarios. We expect it to be '1.0 caliber' by the fall."

The bottom line, however, is that IT organizations that want to participate can use these components to turn commodity hardware into scalable, extensible cloud environments, built on the same code currently in production serving tens of thousands of customers and large government projects. All of the code for OpenStack is freely available under the Apache 2.0 license. NASA itself has committed to use OpenStack to power its cloud platforms, though Kemp cautioned that NASA is "not endorsing OpenStack, but is endorsing large groups of developers working on the code."

What Kemp anticipated evolving late this year is a "hybrid EC2," referring to Amazon's cloud environment. "Amazon is not selling an EC2 appliance or an S3 appliance," he said. "If you're building a large government- or science-class, NASA-class cloud environment, this is intended to make all of the necessary computing infrastructure available to you. If you could build that kind of infrastructure with off-the-shelf components, we would have."
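Kemp's "hybrid EC2" framing has a practical consequence worth spelling out: because the compute layer can speak an EC2-compatible API, existing Amazon tooling can often be pointed at a private endpoint instead. The sketch below uses the boto library to illustrate the idea; the host, port, path, credentials and image ID are placeholders, not details of any real NASA or OpenStack deployment.

    import boto
    from boto.ec2.regioninfo import RegionInfo

    # Placeholder endpoint for a private, EC2-compatible cloud.
    region = RegionInfo(name="private", endpoint="cloud.example.gov")
    conn = boto.connect_ec2(
        aws_access_key_id="YOUR-ACCESS-KEY",
        aws_secret_access_key="YOUR-SECRET-KEY",
        is_secure=False,
        region=region,
        port=8773,                  # assumed port for the private endpoint
        path="/services/Cloud",     # assumed path for the private endpoint
    )

    # The same calls that work against Amazon EC2 work against the private cloud.
    for image in conn.get_all_images():
        print(image.id, image.location)

    reservation = conn.run_instances(image_id="ami-00000001", instance_type="m1.small")
    print("Launched:", reservation.instances[0].id)

The appeal of the hybrid model is that scripts like this don't care whether the endpoint is Amazon's or an agency's own hardware.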

The manner of the interview with Kemp at the IT Summit also was a powerful demonstration of how NASA is experimenting with telepresence and robotics. Due to his status as a proud new father, Kemp was unable to join in person. Using an Anybot, Kemp was able to talk to dozens of co-workers and collaborators at the summit from his home in California. Watching them talk recalled William Gibson's famous quote: "The future is here. It's just not evenly distributed yet."

Climate@Home

Crowdsourcing the search for aliens through the SETI@Home initiative is well known to many computer users. Now, NASA plans to extend that distributed processing model worldwide to help determine the accuracy of models that scientists will use to predict climate change. NASA describes the project as "unprecedented in scope." Climate@Home is a strategic partnership between NASA's Earth Science Division and the Office of the CIO, which Cureton heads. As with SETI@Home, participants won't need special training. They'll just need a laptop or desktop and a downloaded client that runs in the background.

Effectively, NASA will be creating a virtual supercomputing network instead of building or re-purposing a supercomputer, which consumes immense amounts of energy. That means that the project will feature a much lower carbon footprint than it would otherwise, which is desirable on a number of levels. The Climate@Home initiative is modeled after a similar project coordinated by the Oxford e-Research Center called ClimatePrediction.net. Cureton talks about the project in the video below. She also comments (briefly) on the "Be A Martian" project at the Jet Propulsion Laboratory, which enlists citizen scientists in exploring Mars and having fun by sorting through images of the red planet.
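The client-side model Climate@Home borrows from SETI@Home is easy to sketch, even though NASA's actual client wasn't public at the time of writing: fetch a work unit, run a slice of the model at low priority, send the result back, repeat. Everything below is invented for illustration; the stubs stand in for the project's real server calls and climate model.

    import random
    import time

    def fetch_work_unit():
        """Stub for the server call that would hand out a slice of the climate model."""
        return {"id": random.randint(1, 10**6), "years": 10, "perturbation": random.random()}

    def run_model_slice(unit):
        """Stand-in for the actual model run; here it just burns a little CPU."""
        total = 0.0
        for step in range(unit["years"] * 1000):
            total += (step * unit["perturbation"]) % 7
        return {"id": unit["id"], "checksum": total}

    def submit_result(result):
        """Stub for uploading the finished work unit back to the project."""
        print("Would upload result for work unit", result["id"])

    # Main volunteer-computing loop: fetch, compute in the background, report, repeat.
    for _ in range(3):
        unit = fetch_work_unit()
        submit_result(run_model_slice(unit))
        time.sleep(0.1)  # a real client would throttle itself and yield to the user

Multiply a loop like that across tens of thousands of volunteer machines and the "virtual supercomputer" NASA describes starts to take shape.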

Federal CIO on smarter spending

The final day of the summit featured a short, clear speech from federal CIO Vivek Kundra, where he challenged the federal government to spend less on IT. Video is embedded below:


Note: Presentations at the Summit from the grandfather of the Internet, Vint Cerf; futurist David W. Cearley, VP and Gartner Fellow; and Microsoft's Lewis Shepherd all provided provocative views of what's to come in technology. Look for a post on their insights next week.



The efficiencies and issues surrounding government's use of technology will be explored at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation.

August 10 2010

Data as a climate change agent

Amid the varied hopes for open data and open government, enabling better data-driven decisions in both the private and public sectors ranks high. One of the existential challenges for humanity will be addressing climate change, particularly in countries where scientific resources are scant or even non-existent.

In February, the Obama administration proposed a new climate service at the National Oceanic and Atmospheric Administration (NOAA), the agency that provides the nation's weather information. Earlier this summer, the Center for Strategic and International Studies (CSIS) published new research, "Earth Observation for Climate Change," and hosted a forum on leveraging climate data services to manage climate change. The video from the forum is embedded below:

"The vision is an informed society anticipating and responding to climate and its impacts," said Thomas R. Karl, director of the National Climatic Data Center at NOAA.

The dawn of climate data services

According to the CSIS report, the September 2009 meeting of the World Climate Conference agreed to establish the Global Framework for Climate Services (GFCS) to connect research to policy making. The framework has four components:

  1. Observation and monitoring
  2. Research and modeling
  3. Climate services information system
  4. A user interface (UI) program

When combined, that information system and UI would constitute a "World Climate Service System." According to the report:

"The goal of the new service will be to better inform decision makers (particularly in less-developed nations) by supplying data and analyses on climate change. When it is finally implemented, the World Climate Service System will provide climate and earth observations, models and forecasts to provide critical climate data to governments and other users around the world."

For instance, Karl pointed to the expansion of Devils Lake in North Dakota over recent decades. "The governor and mayor have asked for help," he said. "They have to make investments in roads and bridges. How should we do that? What can we say about the causes?"

Climate data services could serve a number of societal needs, said Karl, including better understanding of coastal inundation, changes in storm intensity, wave heights, drought conditions, or how extreme events might change as the climate warms. There are also ancillary benefits to addressing societal challenges, said Karl, including identifying infrastructure issues or gaps in core capabilities, which would benefit the energy, transportation, agriculture and health sectors. NOAA's Office of Program Planning and Integration is soliciting input on its proposals, including through an IdeaScale instance.

Policy makers will need several categories of data to make better decisions, according to the report:

  • Trend data, like changes in forest size, gases in the atmosphere, or ocean currents.
  • Regional data, to identify specific issues in a smaller geographic area.
  • Effects assessment data, to measure the efficacy of mitigation or adaptation policies.
  • Compliance data, to monitor progress in a given agreement or treaty.
  • Planning data, to provide information that insurance companies, urban planners, corporations and others need to reduce risk or uncertainty.

Space policy and carbon-sensing satellites

The CSIS report also put a focus on space policy, specifically a "shortage of satellites actually designed and in orbit to measure climate change." Satellites have been used to measure pollution, or in the case of NASA's ICESat satellite, provide laser altimetry to measure the rate of melting in the Arctic ice. The crash of NASA's Orbiting Carbon Observatory (OCO) satellite in 2009 "left the world essentially bereft of the ability to make precise measurements to assess emissions reductions effort," according to the CSIS report. That means climate scientists are relying on Japan's GOSAT, the European Space Agency's SCIAMACHY and Canada's CanX-2 satellites until OCO-2 gets into orbit in February 2013. All of these current systems lack the advanced sensors or monitoring capabilities scientists desire to assess changes in the carbon cycle. The number of earth observation instruments has actually declined in recent years, as shown in the figure below:

number-earth-observation-instruments.jpg

Visualizing Climate Data

Many citizens will not find the NOAA Climate Services Interactive Map a terrific interface for gaining insight, although scientists can find datasets there relatively easily. Fortunately, NOAA has launched a prototype of a climate services portal, Climate.gov, which is a vast improvement on more traditional .gov websites. The site includes an online magazine, access to climate data and services, a section on understanding climate science, education resources and news.

climate-gov-screenshot.jpg

The Climate.gov portal appears to be a rare beast in government IT: a public prototype. Some govies might even call it a government 2.0 beta.

Opening data for innovation

Will open climate data be available to civic developers and commercial concerns to build businesses upon?

"We invest a lot in terms of making data available with an open data policy, so everyone can see what everybody is doing" said Dr. Jack Kaye, associate director for research and analysis in NASA's Earth Science Division. "The sheer volume of data and complexity of it makes it a challenge for less sophisticated users. One of the challenges is to create tools that will facilitate the less knowledgeable user."

That's one area where Climate.gov is currently succeeding: it provides a clear "climate dashboard" that shows the progression of climate change since the late 19th century. As any IT executive knows, however, a dashboard is only as useful as the data feeding it. That's a concern that's been highlighted in the Climategate controversy over the past year. Despite that concern, there is reason for optimism in seeing open climate data published online, where it can be exposed to more transparent vetting.
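The dashboard itself is, at heart, a set of long-term trend lines. Here is a minimal sketch of the kind of computation behind such a display; the annual temperature anomalies are synthetic, generated in the script rather than pulled from any real NOAA or NASA dataset.

    import random

    # Synthetic annual temperature anomalies (degrees C), for illustration only.
    random.seed(42)
    years = list(range(1880, 2011))
    anomalies = [0.006 * (y - 1880) - 0.3 + random.gauss(0, 0.1) for y in years]

    # Ordinary least-squares slope: the trend a climate dashboard would report.
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies)) / \
            sum((x - mean_x) ** 2 for x in years)

    print(f"Warming trend in the synthetic series: {slope * 100:.2f} C per century")

Swap the synthetic list for a published anomaly series and the same few lines yield the headline number a dashboard displays, which is why the provenance and vetting of the underlying data matter so much.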

The story of how weather data feeds information services and news outlets may be well worn in the Gov 2.0 dialogue, though most citizens don't think about the NOAA data underpinning serious decisions on business, travel or recreation.

Thanks to "infovegan" Clay Johnson, the history of how weather data was opened is clearer today. The history of GPS shows the innovation and value spawned by the release of global positioning system data. As Time reported last year, a market-research firm estimated the global GPS< market will total $75 billion by 2013.

Earlier this spring, the United States released community health information to provision healthcare apps and drive better policy. Now, scientists and policy makers will explore the potential for climate data services to inform citizens and government, enabling both to make better decisions for communities and businesses alike.


The Gov 2.0 Summit will be held Sept. 7-8 in Washington, D.C. Learn more and request an invitation.

September 09 2008

TERRA 445: Coronal Mass Ejections

Coronal Mass Ejections seeks to explain a highly technical phenomenon through compelling interviews with top scientists and amazing graphics depicting our solar system. The film also makes the point that science doesn't have all the answers, and that this is OK, and in fact a good thing.
