
March 29 2011

Four short links: 29 March 2011

  1. Serve -- American Express mobile payments play. Money on mobiles has huge potential; look for others to bang around here before the right answer is found. (via Mike Olson)
  2. Move Mayonnaise and Ketchup (YouTube) -- I don't know why you'd want to move mayonnaise and ketchup intact, but this is the machine for it. (via Russell Brown)
  3. Duplicates Detection with ElasticSearch (Andre Zmievski) -- duplicate detection (or de-duping) is one of the most unappreciated problems that the developers of certain types of applications face sooner or later. The applications I’m talking about share two main characteristics: item collection and some sort of social aspect.
  4. Ceaser -- tool for making CSS easing animations. (via Josh Clark)
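The de-duplication problem in link 3 can be sketched without ElasticSearch at all. A minimal, illustrative Python sketch (all names here are invented for the example, not taken from Zmievski's post) that groups items by a normalized content hash, so exact and whitespace/case-variant duplicates land in the same bucket:

```python
import hashlib
from collections import defaultdict

def find_duplicates(items):
    """Group strings by a normalized content hash.

    Normalization here is deliberately simple (lowercase, collapse
    whitespace); real de-duping would also strip markup, use shingles,
    or fuzzy-match. Groups with more than one member are duplicates.
    """
    groups = defaultdict(list)
    for item in items:
        normalized = " ".join(item.lower().split())
        digest = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
        groups[digest].append(item)
    return [group for group in groups.values() if len(group) > 1]

posts = ["Hello  World", "hello world", "Something else"]
print(find_duplicates(posts))  # [['Hello  World', 'hello world']]
```

ElasticSearch's role in the original article is to make the lookup side of this scale: instead of an in-memory dict, the hash (or a fuzzier signature) is indexed and queried at insert time.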

March 24 2011

Four short links: 24 March 2011

  1. Digital Subscription Prices -- the NY Times in context. Aie.
  2. Trinity -- Microsoft Research graph database. (via Hacker News)
  3. Data Science Toolkit -- prepackaged EC2 image of most useful data tools. (via Pete Warden)
  4. Snappy -- Google's open sourced compression library, as used in BigTable and MapReduce. The emphasis is on speed at the cost of compression ratio (output is 20-100% bigger than zlib's).
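Snappy itself is a C++ library (Python bindings exist as the third-party python-snappy package). The same speed-versus-ratio tradeoff it embodies can be illustrated with the standard library alone by sweeping zlib's compression levels, where lower levels are faster but produce larger output:

```python
import time
import zlib

# Repetitive payload: compresses well, so ratio differences are visible.
data = b"example payload for compression benchmarking " * 10000

for level in (1, 6, 9):  # 1 = fastest, 9 = best ratio
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed_ms = (time.perf_counter() - start) * 1000
    ratio = len(compressed) / len(data)
    print(f"level {level}: {ratio:.2%} of original size, {elapsed_ms:.2f} ms")
```

Snappy sits at the far "fast" end of this spectrum: it gives up more ratio than even zlib level 1 in exchange for much higher throughput, which is the right trade for hot paths like BigTable's.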

August 06 2010

Four short links: 6 August 2010

  1. AWS: Forget the Revenue, Did You See the Margins? (RedMonk) -- According to UBS, Amazon Web Services gross margins for the years 2006 through 2014 are 47%, 48%, 48%, 49%, 49%, 50%, 50.5%, 51%, 53%. (these are analyst projections, so take them with a grain of salt, but those are some sweet margins if they're even close to accurate)
  2. Science Pipes -- an environment in which students, educators, citizens, resource managers, and scientists can create and share analyses and visualizations of biodiversity data. It is built to support inquiry-based learning, allowing analysis results and visualizations to be dynamically incorporated into web sites (e.g. blogs) for dissemination and consumption beyond SciencePipes.org itself. (via mikeloukides on Twitter)
  3. ScraperWiki Source Code -- AGPL-licensed source to the ScraperWiki, a tool for data storage, cleaning, search, visualization, and export.
  4. Doc split -- a command-line utility and Ruby library for splitting apart documents into their component parts: searchable UTF-8 plain text via OCR if necessary, page images or thumbnails in any format, PDFs, single pages, and document metadata (title, author, number of pages...)

May 13 2010

White House moves Recovery.gov to Amazon's cloud

Earlier today in a blog post on WhiteHouse.gov, federal CIO Vivek Kundra announced that Recovery.gov would be moving to the cloud. The Recovery Accountability and Transparency Board's primary contractor, Smartronix, chose Amazon's Elastic Compute Cloud (EC2) to host the site. NASA has used EC2 for testing, but this will be the first time a government website -- a ".gov" -- has been hosted on Amazon's EC2. Kundra estimated the savings to the operational budget to run Recovery.gov at approximately $750,000, with $334,000 coming in 2010 alone.

"This is a production system," said Kundra, during a press briefing today. "That's a critical difference from other agencies that have been testing or piloting. We don't have data that's sensitive in nature or vital to national security here."

The recovery board plans to redirect more than $1 million in computer hardware and software that were being used to host Recovery.gov to fraud oversight operations. It's a move that Earl Devaney, chairman of the recovery board, said will help identify fraud, waste and abuse in the recovery program.

Devaney said that after a town hall, Smartronix evaluated the cloud computing options and decided to go with Amazon's public offering. "In terms of competition, part of what we're trying to introduce is Darwinian pressure in federal IT, including new entrants in government that haven't been there before, such as Amazon," said Kundra.

He also said this represents one of the "first bricks in the foundation that we're laying" throughout the federal government, in terms of cloud computing. Kundra pointed to an upcoming forum on cloud computing on May 20 where many of the issues that the National Institute of Standards and Technology has been working on will be addressed, including the relationship between the public and private sector.

Kundra said that the move to Amazon's public cloud went through tough cybersecurity vetting. "The board and vendor went through a rigorous process, in terms of FISMA checks and balances," said Kundra.

The federal government is actively developing cloud computing case studies, like the use of Salesforce.com's public cloud by the Department of Health and Human Services (HHS). "Case studies are being used for the broader FedRAMP program," he said. "Look at the Department of Interior: The CIO is considering moving 80,000 emails to the cloud. Look at the investments made at GSA or a recent RFI [Request for Information] around email. Across federal government, you're seeing a number of agencies putting in a plan."

Why move to the cloud now, rather than at the start? "We had concerns about data, pulling it off with hundreds of thousands of feeds," said Devaney. "We found it prudent to do it the old-fashioned way first, then move to the cloud."

On May 20, Kundra said the federal government will release data center numbers as part of a larger project to consolidate IT infrastructure. Based upon those findings, further migration to cloud computing may be forthcoming.
