
November 11 2013

Four short links: 11 November 2013

  1. Living Light — 3D printed cephalopods filled with bioluminescent bacteria. PAGING CORY DOCTOROW, YOUR ORGASMATRON HAS ARRIVED. (via Sci Blogs)
  2. Repacking Lego Batteries with a CNC Mill — check out the video. Patrick programmed a CNC machine to drill out the rivets holding the Mindstorms battery pack together. Coding away a repetitive task like this is gorgeous to see at every scale. We don’t have to teach our kids a particular programming language, but they should know how to automate cruft.
  3. My Thoughts on Google+ (YouTube) — when your fans make hatey videos like this one protesting Google putting the pig of Google Plus onto the lipstick that was YouTube, you are Doin’ It Wrong.
  4. Presto: Interacting with Petabytes of Data at Facebook — a distributed SQL query engine optimized for ad-hoc analysis at interactive speed. It supports standard ANSI SQL, including complex queries, aggregations, joins, and window functions. For details, see the Facebook post about its launch. (A small sketch of such a query appears below.)
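For a feel of what that looks like in practice, here is a minimal sketch. It runs the same flavor of ANSI SQL against SQLite (3.25 or later, which also implements window functions) rather than a Presto cluster; the table, columns, and data are all invented:

```python
# Illustrative only: SQLite stands in for Presto here. The window function
# ranks each user's pages by view count without collapsing rows the way a
# GROUP BY would.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page_views (user_id INTEGER, page TEXT, views INTEGER);
    INSERT INTO page_views VALUES
        (1, 'home', 10), (1, 'photos', 4), (2, 'home', 7), (2, 'about', 2);
""")

query = """
    SELECT user_id, page, views,
           RANK() OVER (PARTITION BY user_id ORDER BY views DESC) AS rnk
    FROM page_views
"""
for row in conn.execute(query):
    print(row)  # e.g. (1, 'home', 10, 1)
```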

October 21 2013

The biocoding revolution

What is biocoding? For those of you who have been following the biotechnology industry, you’ll have heard of the rapid advances in genome sequencing. Our ability to read the language of life has advanced dramatically, but only recently have we been able to start writing the language of life at scale.

The first large-scale biocoding success came in 2010, when Craig Venter (one of my scientific heroes) wrote the genome of an entirely synthetic organism, booted it up and created de novo life. Venter’s new book, Life at the Speed of Light, discusses the creation of that first synthetic life form. In the book and in video interviews, Venter talks about the importance of ensuring the accuracy of the DNA code they designed. One small deletion of a base (one of the four letters that make up the biological equivalent of 1s and 0s) resulted in a reading-frame shift that left them with gibberish genomes, a mistake they were able to find and correct. One of the most amusing parts of Venter’s work is that the team encoded sequences in the DNA to represent each letter of the English alphabet. Their watermark included the names of their collaborators, famous quotes, an explanation of the coding system used, and a URL for those who crack the code written in the DNA. Welcome to the future — and let me know if you crack the code!
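Venter’s reading-frame anecdote is easy to demonstrate in code. Here is a toy Python sketch, not anything from his team: the five-codon “gene” is invented, though the codon meanings are real, and it shows how deleting a single base scrambles every codon downstream:

```python
# DNA is read in triplets (codons), so a one-base deletion shifts the
# reading frame and garbles everything after the deletion point.
CODON_TABLE = {
    "ATG": "Met", "GCC": "Ala", "AAA": "Lys", "GAA": "Glu",
    "TGA": "STOP",
}

def translate(dna):
    # Read complete triplets; trailing leftover bases are dropped.
    codons = [dna[i:i + 3] for i in range(0, len(dna) - 2, 3)]
    return [CODON_TABLE.get(c, "???") for c in codons]

gene = "ATGGCCAAAGAATGA"
print(translate(gene))    # ['Met', 'Ala', 'Lys', 'Glu', 'STOP']

# Delete one base (the first 'G' of 'GCC'): every downstream codon shifts.
broken = gene[:3] + gene[4:]
print(translate(broken))  # ['Met', '???', '???', '???'] -- gibberish
```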

Biocoding is just the beginning of the rise of the true biohackers. This is a community of several thousand people, with skill sets ranging from self-taught software hackers to biology postdocs who are impatient with the structure of traditional lab work. Biohackers want to tinker; do fun science; and, in the process, accelerate the pace of biotech innovation. There are plenty of differences between writing computer code and writing code in the building blocks of life, but the important thing is that it can be done and is being done now by citizen scientists working both from shared biohacker labs (like Biocurious, Genspace, and Counter Culture Labs) and at home (for example, Cathal Garvey, who works out of a spare bedroom in his mother’s home). Drew Endy’s short video about Engineering Biology gives a great overview of what we can accomplish when we start programming the genetic code. One of his projects is genetically encoded data storage — but it’s not just about replacing dry silicon with wet carbon; it’s about what can happen when you can do computing in an environment where you couldn’t possibly place silicon: inside a living cell.
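Genetically encoded data storage rests on a simple observation: with four bases, each base carries two bits, so arbitrary bytes map directly onto DNA. Here is a minimal sketch of that mapping. It is not Endy’s (or Church’s) actual scheme, which would add error correction and avoid synthesis-unfriendly runs of repeated bases:

```python
# Toy byte<->DNA codec: two bits per base, no error correction.
BASES = "ACGT"  # A=00, C=01, G=10, T=11

def encode(data: bytes) -> str:
    # Emit four bases per byte, most significant bits first.
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def decode(dna: str) -> bytes:
    bits = [BASES.index(ch) for ch in dna]
    return bytes(
        (bits[i] << 6) | (bits[i + 1] << 4) | (bits[i + 2] << 2) | bits[i + 3]
        for i in range(0, len(bits), 4)
    )

seq = encode(b"hello, world")
print(seq)  # 'CGGACGCC...' -- four bases per byte
assert decode(seq) == b"hello, world"
```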

Biotech is the wet nanotech we’ve been waiting for. It’s a little less logical and a lot buggier than we’d like, but we now have the tools to write DNA, insert this code into a cell, reboot the cell and make those cells produce custom-designed proteins and substances, and engineer biology. The potential for synthetic biology and biotechnology is vast. The biocoding era will be as transformative as the computer era, and we all have an opportunity to create the future together.

Biocoder is a new O’Reilly quarterly newsletter chronicling the rise of DIY bio, synthetic bio, biohackers, Grinders, and the new innovations being developed at the edges of the biotech industry. Check out Biocoder and download it for free.

January 24 2013

Four short links: 24 January 2013

  1. Google’s Driverless Car is Worth Trillions (Forbes) — Much of the reporting about Google’s driverless car has mistakenly focused on its science-fiction feel. [...] In fact, the driverless car has broad implications for society, for the economy and for individual businesses. Just in the U.S., the car puts up for grab some $2 trillion a year in revenue and even more market cap. It creates business opportunities that dwarf Google’s current search-based business and unleashes existential challenges to market leaders across numerous industries, including car makers, auto insurers, energy companies and others that share in car-related revenue.
  2. DIY BioPrinter (Instructables) — Think of it as 3D printing, but with squishier ingredients! How to piggyback on inkjet printer technology to print with your own biomaterials. It’s an exciting time for biohackery: FOO Ewan Birney is kicking ass and taking names; he was just involved in a project storing and retrieving data from DNA.
  3. Parsley — an open-sourced form-validation library in JavaScript.
  4. ADAMS — an open-sourced workflow tool for machine learning, from the excellent people at Waikato who brought you WEKA. ADAMS = Advanced Data mining And Machine learning System.

December 18 2012

DARPA and Defense Department look to a more open source future

As the United States military marches further into the age of networked warfare, data networks and the mobile platforms to distribute and access them will become even more important.

This fall, the (retired) eighth Vice Chairman of the Joint Chiefs of Staff described a potential future of the military that’s founded not only in open source thinking, but in next-generation user interfaces and biohacking straight out of science fiction. If even some of the strategic thinking he described at this year’s Military Open Source Conference in D.C. is applied to how the technology that supports the next generation of war fighters is built, dramatic evolutionary changes could cascade down the entire supply chain of one of the world’s biggest organizations.

In his remarks, James E. “Hoss” Cartwright, a four-star general who retired from the United States Marine Corps in August 2011, outlined a strategic need to make military technology more modular, based upon open standards and adaptable on the battlegrounds of the future.

Cartwright, the first holder of the Harold Brown Chair in Defense Policy Studies for the Center for Strategic & International Studies, a member of the Defense Policy Board Advisory Committee, and an adviser to several corporate entities in the defense industry, is well placed to have an informed and influential opinion.

Over the course of his talk at the Military Open Source Conference, Cartwright outlined how open source software models could be applied to hardware, making vehicles into adaptable platforms for different missions, not vertically integrated programs that can take a decade or longer to design, build or change.

Given the scope of the Pentagon’s current capabilities and DARPA’s research, potential ethics concerns abound, from drone warfare to sentient robotics to targeted genetic plagues to brain scanning to biohacking.

In that context, it was notable that Cartwright ranked ethical qualms about secrecy, privacy, and big data above those raised by biohacking.

The issue that Cartwright said bothered him the most, however, was big data. “There are really no secrets out there,” he said. By exposing data to a larger dataset, it’s possible to correlate real identities. (That’s the so-called “Mosaic Effect.”)

That’s what’s now happening with network intrusions from other countries, he said, which leads to genuine national security headaches. Cartwright noted that while the federal government has huge classification protocols, they’re nearly all discoverable if you know how to correlate the information. Even correlations in anonymized data can lead to the discovery of true identities.
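The mechanics are easy to see in miniature. In this toy sketch, with entirely invented data, neither dataset alone names anyone, yet a join on shared quasi-identifiers re-identifies every record:

```python
# The "Mosaic Effect" in miniature: correlate an anonymized dataset with a
# public one on shared attributes (ZIP code and birth year) to recover names.
anonymized_health = [
    {"zip": "20301", "birth_year": 1970, "diagnosis": "hypertension"},
    {"zip": "94103", "birth_year": 1985, "diagnosis": "asthma"},
]
public_roster = [
    {"name": "A. Smith", "zip": "20301", "birth_year": 1970},
    {"name": "B. Jones", "zip": "94103", "birth_year": 1985},
]

for record in anonymized_health:
    matches = [
        p["name"] for p in public_roster
        if (p["zip"], p["birth_year"]) == (record["zip"], record["birth_year"])
    ]
    if len(matches) == 1:  # a unique match re-identifies the record
        print(matches[0], "->", record["diagnosis"])
```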

Big data concerns aside, Cartwright highlighted a strategic need for the U.S. Department of Defense to address these risks and develop improved man-machine interfaces, from touch screens for unmanned vehicles and weapons systems to prosthetics for veterans.

Making these changes won’t happen overnight. The relevant time scale is many years, if not decades. It’s far from easy to turn an aircraft carrier and its battle group around, much less to shift the U.S. Department of Defense’s approach to procuring and using technology. That said, hearing a retired four-star general articulate this kind of strategic thinking has stayed with me.

Biohacking on the battlefield

At the outset of his remarks, General Cartwright shared an anecdote involving former Defense Secretary Robert Gates, in which they asked a sergeant at a base in Savannah, Ga., what he thought of mobile.

The sergeant said that he loved it. He would rather leave his rifle behind than a military-enabled smartphone. “I can call any help I need with it, it always works, and I don’t have to go to school for it,” Cartwright said, recounting the sergeant’s response.

The sergeant’s comments point to a serious issue for the 2.5 million people who have to fight, Cartwright reflected.

“We’re asking them to go into a ‘Star Wars’ bar on a regular basis,” he said, pointing to the language challenges soldiers face abroad. “We don’t know whether they’re saying something that will get them shot or hugged.”

Improved interfaces for mobile devices are, however, just the tip of the iceberg for improvements to the connection of soldiers to vehicles and weapons systems. Radical advances in storage, processing power and robotics are also offering new opportunities to help wounded warriors.

The cutting edge now is fully mechanized, battery-powered, wireless prosthetics, said Cartwright, pointing to research in San Diego. Related research successfully enabled a soldier at Walter Reed Hospital who had lost three limbs to move a prosthetic limb using only a brain-enabled chip.

“What we found was that, as soon as they put a chip on, phantom pain went away,” said Cartwright. “The FDA just licensed one.”

Bionic limbs have been around for years, but in 2012, amputees are climbing skyscrapers and moving bionic limbs using direct neural control.

Programmable soldiers?

The event horizon Cartwright described in his comments at the conference, however, went far beyond prosthetics into a tale of potential augmentation straight out of the annals of science fiction.

The retired general described an experiment in which a mouse ran a maze with a computer chip wired into its brain. After the researchers transferred that chip to another mouse that had never seen the course, the second mouse could run the maze. Such software-driven activities, if they were to be successfully improved and tested, could have profound implications for soldiers.

For instance, could you take a recruit, give him basic skills, then add a chip and upgrade him from basic to full rifleman?

“It takes 66 repetitions to get a habit,” said Cartwright. “It takes tens of thousands for Olympic quality. What if you can take that down?”

Even if these kinds of experiments aren’t deployed for humans any time soon, other technologies are already far along in development.

When you hit about age 55, said Cartwright, ocular nerves and auditory nerves start to degenerate. He described ongoing experiments with wired interfaces for human ocular nerves, in which developers were writing code to interpret visual stimuli for the brain.

“We’re now at the point where you can see good forms purely with programming,” he said.

Even if this level of biohacking doesn’t make its way into use by disabled veterans next year, the need for a combination of datasets, programming, man-machine interfaces, and biological research to augment the capabilities of current war fighters is, in Cartwright’s assessment, increasingly important.

“We’re getting to the point where, absent something like this, most of our systems require you to be an engineer to run them,” he said. “We need to improve the machine interface so that anyone can use it. That’s as important as the capability itself.”

Part of developing new capabilities for warfare using the brain, however, will need to be securing those interfaces against hackers. Interfaces and access points to “wetware” will need to be hardened, just like hardware and software.

If you can’t hack it, don’t pack it

One huge challenge that the armed services are facing today, Cartwright said, is adapting code in response to what soldiers are actually encountering in the field.

We can’t send issues back and have people quickly rewrite code, he said, which presents significant problems. To put it another way, the DoD wants the armed forces to be able to “write as they fight.”

Cartwright described a pilot program in which contractors and grad students were sent into the field so they could understand the problems they were working against and reduce the time needed to write code to address them. The results were promising: they didn’t lose any technical staff, and turnaround time for patches dropped drastically once they were able to get inside the decision cycle.

Only programmers in the field can teach analytic algorithms to determine the difference between an ambush and a drug deal, said Cartwright.

“You can’t do that unless you know how to dig for data and understand context,” he said. “That’s the turnaround time that we needed to stay inside an adversary’s decision loop.”

That’s particularly relevant for networked warfare. According to Cartwright, new software works in the “cyberfight” in Afghanistan for about 9-14 days before it needs minor changes — but new systems take years to build. Top leadership in the military tends to think a problem on the battlefield means an entirely new platform is needed, he said, but you’re looking at 14 years to build a new kind of truck.

Open source military hardware?

What needs to change is the incentive structure for the people building and designing the “platforms of record” of the future, said Cartwright. That means designing programs and apps for problems we actually have, versus developing something that doesn’t get into the field for 10-15 years — and if you guess wrong about who an adversary will be, that sends you into a modification cycle of at least three years.

Open source methods, by way of contrast, can give the military the ability to change software in weeks and months, not years, said Cartwright. In that context, he indicated that the Pentagon is looking at how they can move from tightly, singularly integrated programs in the direction of more open platforms and open standards, where war fighters can add or get capabilities with modularity and at a speed measured in weeks and months, not months and years.

During the question and answer period that followed his remarks, Cartwright followed up on his comments on open source. Cartwright said that the Pentagon would like to get to the point where platforms are a conveyance for the needs soldiers have, with infrastructure set up in such a way that things can be switched out.

Notably, he said that in the past few years of the financial crisis, defense technology manufacturers that are agnostic to platform are faring far better. “They’re building code — sensors, activities — and others are not,” he said, “and if one or two programs are canceled, they’re in trouble.”

Cartwright asserted that military service acquisition people have started to understand the value of flexible technology that soldiers can quickly configure for a fight.

To scale that across the entire military, he said, they must adopt more common standards across all services. Eventually, that would mean “displays, chipsets, anybody in this room can write code against, depending upon what the customer wants.”

Cartwright said he’d like to see today’s model of open source extended to military software and hardware.

“We’re thinking about a future where everyone’s garage can be a sweat house for the military,” he said, playing to his audience of military open source conferees.

Making these changes, however, won’t be easy or fast. “We’re still an industrial nation at heart,” he said. “We’re trying to get over that.”

In response to a follow-up question, he was frank about the time it may take to shift the thinking of some acquisition officers on open source and modularity.

“We’ve been working to make it look like it’s being fixed,” he said, “but we may need to wait for people to age out.”

Image Credits: Wikipedia and mil-oss.org

November 30 2012

To eat or be eaten?

One of Marc Andreessen’s many accomplishments was the seminal essay “Why Software is Eating the World.” In it, the creator of Mosaic and Netscape argues for his investment thesis: everything is becoming software. Music and movies led the way, Skype makes the phone company obsolete, and even companies like FedEx and Walmart are all about software: their core competitive advantage isn’t driving trucks or hiring part-time employees, it’s the software they’ve developed for managing their logistics.

I’m not going to argue (much) with Marc, because he’s mostly right. But I’ve also been wondering why, when I look at the software world, I get bored fairly quickly. Yeah, yeah, another language that compiles to the JVM. Yeah, yeah, the JavaScript framework of the day. Yeah, yeah, another new component in the Hadoop ecosystem. Seen it. Been there. Done that. In the past 20 years, haven’t we gained more than the ability to use sophisticated JavaScript to display ads based on a real-time prediction of the user’s next purchase?

When I look at what excites me, I see a much bigger world than just software. I’ve already argued that biology is in the process of exploding, and the biological revolution could be even bigger than the computer revolution. I’m increasingly interested in hardware and gadgetry, which I used to ignore almost completely. And we’re following the “Internet of Things” (and in particular, the “Internet of Very Big Things”) very closely. I’m not saying that software is irrelevant or uninteresting. I firmly believe that software will be a component of every (well, almost every) important new technology. But what grabs me these days isn’t software as a thing in itself, but software as a component of some larger system. The software may be what makes it work, but it’s not about the software.

A dozen or so years ago, people were talking about Internet-enabled refrigerators, a trend which (perhaps fortunately) never caught on. But it led to an interesting exercise: thinking of the dumbest device in your home and imagining what could happen if it were intelligent and network-enabled. My furnace, for example: shortly after buying our house, we had the furnace repairman over seven times during the month of November. Rather than waiting for me to notice that the house was getting cold at 2 AM, it would have been nice for a “smart furnace” to notify the repairman and say, “I’m broken, and here’s what’s probably wrong.” (The Nest doesn’t do that, but with a software update it probably could.)
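The logic such a furnace would need is almost trivially simple: if heat is being called for but the temperature isn’t rising, something is wrong, so phone home. A hypothetical sketch, in which the sensor, burner, and alert hooks are stand-ins rather than any real thermostat API:

```python
# Hypothetical "smart furnace" watchdog. read_temperature, burner_is_on, and
# send_alert are injected stand-ins; the thresholds are invented.
import time

FAULT_WINDOW_MINUTES = 30
MIN_EXPECTED_RISE_C = 1.0

def monitor(read_temperature, burner_is_on, send_alert):
    baseline = read_temperature()
    while True:
        time.sleep(FAULT_WINDOW_MINUTES * 60)
        current = read_temperature()
        # Heat demanded, but the house isn't warming: assume a fault.
        if burner_is_on() and current - baseline < MIN_EXPECTED_RISE_C:
            send_alert(
                "Furnace fault suspected: heat called for "
                f"{FAULT_WINDOW_MINUTES} min, temperature stuck at {current:.1f} C"
            )
        baseline = current
```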

The combination of low-cost, small-footprint computing (the BeagleBone, Raspberry Pi, and the Arduino), along with simple manufacturing (3D printing and CNC machines) and inexpensive sensors (for $150, the Kinect packages a set of sensors that until recently would easily have cost $10,000), means that it’s possible to build smart devices that are much smaller and more capable than anything we could have built back when we were talking about smart refrigerators. We’ve seen Internet-enabled scales and robotic vacuum cleaners, and more is on the way.

At the other end of the scale, GE’s “Unleashing the Industrial Internet” event had a fully instrumented network-capable jet engine on stage, with dozens of sensors delivering realtime data about the engine’s performance. That data can be used for everything from performance optimization to detecting problems. In a panel, Tim O’Reilly asked Matt Reilly of Accenture “do you want more Silicon Valley on your turf?” and his immediate reply was “absolutely.”

Even in biology: synthetic biology is basically nothing more than programming with DNA, using a programming language that we don’t yet understand and for which there is still no “definitive guide.” We’re only beginning to get to the point where we can reliably program and build “living software,” but we are certainly going to get there. And the consequences will be profound, as George Church has pointed out.

I’m not convinced that software is going to eat everything. I don’t see us living in a completely virtual world, mediated completely by browsers and dashboards. But I do see everything eating software: software will be a component of everything we do or buy, from our clothing to our food. Why is the FitBit a separate device? Why not integrate it into your shoes? Can we imagine cookies that incorporate proteins engineered to become unappealing when we’ve eaten too much? Yes, we can, though we may not be happy about that. Seriously, I’ve had discussions about genetically engineered food that would diagnose diseases and turn different colors in your digestive tract to indicate cancer and other conditions. (You can guess how you read the results.)

Andreessen is certainly right in his fundamental argument that software has disrupted, and will continue to disrupt, just about every industry on the planet. He pointed to health care and education as the next industries to be disrupted; and we’re certainly seeing that, with Coursera and Udacity in education, and conferences like StrataRx in health care. We just need to push his conclusion farther. Is a robotic car a case of software eating the driver, or of the automobile eating software? You tell me. At the Industrial Internet event, Andreessen was quoted as saying “We only invest in hardware/software hybrids that would collapse if you pulled the software out.” Is an autonomous car something that would collapse if you pulled the software out? The car is still drivable. In any case, my “what’s the dumbest device in the house” exercise is way too limiting. When are we going to build something that we can’t now imagine, that isn’t simply an upgrade of what we already have? What would it mean for our walls and floors, or our plumbing, to be intelligent? At the other extreme, when will we build devices where we don’t even notice that they’ve “eaten” software? Again, Matt Reilly: “It will be about flights that are on time, luggage that doesn’t get lost.”

In the last few months, I’ve seen a number of articles on the future of venture investment. Some argue that it’s too easy and inexpensive to look for “the next Netscape,” and as a consequence, big ambitious projects are being starved. It’s hard for me to accept that. Yes, there’s a certain amount of herd thinking in venture capital, but investors also know that when everyone is doing the same thing, they aren’t going to make any money. Fred Wilson has argued that momentum is moving from consumer Internet to enterprise software, certainly a market that is ripe for disruption. But as much as I’d like to see Oracle disrupted, that still isn’t ambitious enough.

Innovation will find the funds that it needs (and it isn’t supposed to be easy). With both SpaceX and Tesla Motors, Elon Musk has proven that it’s possible for the right entrepreneur to take insane risks and make headway. Of course, neither has “succeeded,” in the sense of a lucrative IPO or buyout. That’s not the point either, since being an entrepreneur is all about risking failure. Neither SpaceX nor Tesla is a Facebook-like “consumer web” startup, nor even an enterprise software startup or an education startup. They’re not “software” at all, though they’ve both certainly eaten a lot of software to get where they are. And that leads to the most important question:

What’s the next big thing that’s going to eat software?


November 09 2012

George Church and the potential of synthetic biology

A few weeks ago, I explained why I thought biohacking was one of the most important new trends in technology. If I didn’t convince you, Derek Jacoby’s review (below) of George Church’s new book, Regenesis, will. Church is no stranger to big ideas: big ideas on the scale of sending humans to Mars. (The moon? That’s so done.) And unlike most people with big ideas, Church has an uncanny track record at making his ideas reality. Biohacking has been not so quietly gaining momentum for several years now. If there’s one book that can turn this movement into a full-blown revolution, this is it. — Mike Loukides


George Church and Ed Regis pull off an exciting and speculative romp through the field of synthetic biology and where it could take us in the not-too-distant future. If anyone of less eminence than Church had written this book, half of this review would need to be spent defending the realism of the possibilities; but given his track record, if he suggests something is possible, it’s worth thinking about.

The possibilities are mind-blowing — breeding organisms immune to all viruses, recreating extinct species, creating humans immune to cancer. We’re entering an age in which our ability to re-make the world around us is limited only by our imaginations and our good judgment. Regenesis addresses this as well, for instance proposing mechanisms to create synthetic organisms that are incapable of interacting with natural ones.

Although the book is aimed at a non-technical general audience, the science is explained in excellent detail and is well-referenced for further study.

As the book documents, we’re in the middle of an exponential increase in genomics capabilities that dwarfs even the pace of change in the computer industry. In such a rapidly changing field, if you can imagine a plausible technical approach to a problem, no matter how difficult or cumbersome it may be, it’s likely to become easy soon.

To give an example of an idea long discussed in science fiction, the book addresses re-creating extinct species. Surprisingly, there is already a successful example of this having occurred! The Pyrenean ibex, or bucardo, is a type of mountain goat that went extinct in 1999. But before the last ibex died, researchers scraped a few tissue cells from the ear of the last surviving ibex. They were able to induce the skin cells to become stem cells, and then in a process called interspecies nuclear transfer cloning they were able to fuse those stem cells with de-nucleated donor goat eggs, implant the eggs into domestic goats, and successfully birth a living ibex. By extension, the book examines the implications of reviving the wooly mammoth, or even neanderthals.

Similar detailed examples and discussions take the reader through the potential of synthetic biology to transform fuel production, food production, waste processing, medicine, and even engineering of the human genome to produce Homo evolutis. Church’s background is in directed evolution — he invented many of the most powerful techniques for rapidly evolving portions of a genome to possess specified characteristics. To hear the inventor of such a powerful technology explore its ramifications is a real treat. Society will be exploring the issues raised in this book for many years — how to take advantage of the ability to re-engineer life while protecting against the risks that such a powerful technology must bring.

Refreshingly, in Church’s view, protecting against those risks need not exclude amateurs and citizen scientists. Regenesis proposes a licensing scheme, but one much more akin to a driver’s license than a formidable hurdle, and suggests a model in which a combination of engineering techniques and basic shared procedures is sufficient to protect against any reasonable threats to safety while still ensuring the widest possible access to the technology.

Regenesis provides an accessible and engaging introduction to the revolutionary potential of synthetic biology and should be of interest to both experts and a general science audience.


October 03 2012

Biohacking: The next great wave of innovation

I’ve been following synthetic biology for the past year or so, and we’re about to see some big changes. Synthetic bio seems to be now where the computer industry was in the late 1970s: still nascent, but about to explode. The hacker culture that drove the development of the personal computer, and that continues to drive technical progress, is forming anew among biohackers.

Computers certainly existed in the ’60s and ’70s, but they were rare, and operated by “professionals” rather than enthusiasts. But an important change took place in the mid-’70s: computing became the domain of amateurs and hobbyists. I read recently that the personal computer revolution started when Steve Wozniak built his own computer in 1975. That’s not quite true, though. Woz was certainly a key player, but he was also part of a club. More important, Silicon Valley’s Homebrew Computer Club wasn’t the only one. At roughly the same time, a friend of mine was building his own computer in a dorm room. And hundreds of people, scattered throughout the U.S. and the rest of the world, were doing the same thing. The revolution wasn’t the result of one person: it was the result of many, all moving in the same direction.

Biohacking has the same kind of momentum. It is breaking out of the confines of academia and research laboratories. There are two significant biohacking hackerspaces in the U.S., GenSpace in New York and BioCurious in California, and more are getting started. Making glowing bacteria (the biological equivalent of “Hello, World!”) is on the curriculum in high school AP bio classes. iGEM is an annual competition to build “biological robots.” A grassroots biohacking community is developing, much as it did in computing. That community is transforming biology from a purely professional activity, requiring lab coats, expensive equipment, and other accoutrements, to something that hobbyists and artists can do.

As part of this transformation, the community is navigating the transition from extremely low-level tools to higher-level constructs that are easier to work with. When I first learned to program on a PDP-8, you had to start the computer by loading a sequence of 13 binary numbers through switches on the front panel. Early microcomputers weren’t much better, but by the time of the first Apples, things had changed. DNA is similar to machine language (except it’s in base four, rather than binary), and in principle hacking DNA isn’t much different from hacking machine code. But synthetic biologists are currently working on the notion of “standard biological parts,” or genetic sequences that enable a cell to perform certain standardized tasks. Standardized parts will give practitioners the ability to work in a “higher-level language.” In short, synthetic biology is going through the same transition in usability that computing saw in the ’70s and ’80s.
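To make the analogy concrete, here is a hedged sketch of what that higher-level layer amounts to: named, characterized parts composed like library calls. The part names echo the BioBricks naming style, but the sequences are placeholders, not real registry entries:

```python
# Sketch of "standard biological parts": compose a construct from named parts
# instead of writing raw bases. All sequences below are placeholders.
PARTS = {
    "BBa_promoter":   "TTGACA...TATAAT",  # constitutive promoter (placeholder)
    "BBa_rbs":        "AGGAGG",           # ribosome binding site (placeholder)
    "BBa_gfp":        "ATGCGT...TAA",     # GFP coding sequence (placeholder)
    "BBa_terminator": "GCGGCC...TTTTTT",  # transcription terminator (placeholder)
}

def assemble(*part_names):
    """Concatenate parts into one construct -- the 'higher-level language'."""
    return "".join(PARTS[name] for name in part_names)

# The biological "Hello, World!": a constitutively expressed fluorescent gene.
glow_circuit = assemble("BBa_promoter", "BBa_rbs", "BBa_gfp", "BBa_terminator")
print(glow_circuit)
```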

Alongside this increase in usability, we’re seeing a drop in price, just as in the computer market. Computers cost serious money in the early ’70s, but the price plummeted, in part because of hobbyists: seminal machines like the Apple II, the TRS-80, and the early Macintosh would never have existed had they not served the needs of hobbyists. Right now, setting up a biology lab is expensive, but we’re seeing the price drop quickly as biohackers figure out clever ways to make inexpensive tools, such as the DremelFuge, and learn how to scrounge for used equipment.

And we’re also seeing an explosion in entrepreneurial activity. Just as the Homebrew Computer Club and other garage hackers led to Apple and Microsoft, the biohacker culture is full of similarly ambitious startups, working out of hackerspaces. It’s entirely possible that the next great wave of entrepreneurs will be biologists, not programmers.

What are the goals of synthetic biology? There are plenty of problems, from the industrial to the medical, that need to be solved. Drew Endy told me how one of the first results from synthetic biology, the creation of soap that would be effective in cold water, reduced the energy requirements of the U.S. by 10%. The holy grail in biofuels is bacteria that can digest cellulose (essentially, the leaves and stems of any plant) and produce biodiesel. That seems achievable. Can we create bacteria that would live in a diabetic’s intestines and produce insulin? Certainly.

But industrial applications aren’t the most interesting problems waiting to be solved. Endy is concerned that, if synthetic bio is dominated by a corporate agenda, it will cease to be “weird,” and won’t ask the more interesting questions. One Synthetic Aesthetics project made cheeses from microbes that were cultured from the bodies of people in the synthetic biology community. Christian Bök has inserted poetry into a microbe’s DNA. These are the projects we’ll miss if the agenda of synthetic biology is defined by business interests. And these are, in many ways, the most important projects, the ones that will teach us more about how biology works, and the ones that will teach us more about our own creativity.

The last 40 years of computing have proven what a hacker culture can accomplish. We’re about to see that again, this time in biology. And, while we have no idea what the results will be, it’s safe to predict that the coming revolution in biology will radically change the way we live — at least as radically as the computer revolution. It’s going to be an interesting and exciting ride.

