October 03 2012

Biohacking: The next great wave of innovation

I’ve been following synthetic biology for the past year or so, and we’re about to see some big changes. Synthetic bio now seems to be where the computer industry was in the late 1970s: still nascent, but about to explode. The hacker culture that drove the development of the personal computer, and that continues to drive technical progress, is forming anew among biohackers.

Computers certainly existed in the ’60s and ’70s, but they were rare, and operated by “professionals” rather than enthusiasts. But an important change took place in the mid-’70s: computing became the domain of amateurs and hobbyists. I read recently that the personal computer revolution started when Steve Wozniak built his own computer in 1975. That’s not quite true, though. Woz was certainly a key player, but he was also part of a club. More important, Silicon Valley’s Homebrew Computer Club wasn’t the only one. At roughly the same time, a friend of mine was building his own computer in a dorm room. And hundreds of people, scattered throughout the U.S. and the rest of the world, were doing the same thing. The revolution wasn’t the result of one person: it was the result of many, all moving in the same direction.

Biohacking has the same kind of momentum. It is breaking out of the confines of academia and research laboratories. There are two significant biohacking hackerspaces in the U.S., Genspace in New York and BioCurious in California, and more are getting started. Making glowing bacteria (the biological equivalent of “Hello, World!”) is on the curriculum in high school AP bio classes. iGEM is an annual competition to build “biological robots.” A grassroots biohacking community is developing, much as it did in computing. That community is transforming biology from a purely professional activity, requiring lab coats, expensive equipment, and other accoutrements, to something that hobbyists and artists can do.

As part of this transformation, the community is navigating the transition from extremely low-level tools to higher-level constructs that are easier to work with. When I first learned to program on a PDP-8, you had to start the computer by loading a sequence of 13 binary numbers through switches on the front panel. Early microcomputers weren’t much better, but by the time of the first Apples, things had changed. DNA is similar to machine language (except it’s in base four, rather than binary), and in principle hacking DNA isn’t much different from hacking machine code. But synthetic biologists are currently working on the notion of “standard biological parts,” or genetic sequences that enable a cell to perform certain standardized tasks. Standardized parts will give practitioners the ability to work in a “higher level language.” In short, synthetic biology is going through the same transition in usability that computing saw in the ’70s and ’80s.

Alongside this increase in usability, we’re seeing a drop in price, just as in the computer market. Computers cost serious money in the early ’70s, but the price plummeted, in part because of hobbyists: seminal machines like the Apple II, the TRS-80, and the early Macintosh would never have existed if not to serve the needs of hobbyists. Right now, setting up a biology lab is expensive; but we’re seeing the price drop quickly, as biohackers figure out clever ways to make inexpensive tools, such as the DremelFuge, and learn how to scrounge for used equipment.

And we’re also seeing an explosion in entrepreneurial activity. Just as the Homebrew Computer Club and other garage hackers led to Apple and Microsoft, the biohacker culture is full of similarly ambitious startups, working out of hackerspaces. It’s entirely possible that the next great wave of entrepreneurs will be biologists, not programmers.

What are the goals of synthetic biology? There are plenty of problems, from the industrial to the medical, that need to be solved. Drew Endy told me how one of the first results from synthetic biology, the creation of soap that would be effective in cold water, reduced the energy requirements of the U.S. by 10%. The holy grail in biofuels is bacteria that can digest cellulose (essentially, the leaves and stems of any plant) and produce biodiesel. That seems achievable. Can we create bacteria that would live in a diabetic’s intestines and produce insulin? Certainly.

But industrial applications aren’t the most interesting problems waiting to be solved. Endy is concerned that, if synthetic bio is dominated by a corporate agenda, it will cease to be “weird,” and won’t ask the more interesting questions. One Synthetic Aesthetics project made cheeses from microbes that were cultured from the bodies of people in the synthetic biology community. Christian Bök has inserted poetry into a microbe’s DNA. These are the projects we’ll miss if the agenda of synthetic biology is defined by business interests. And these are, in many ways, the most important projects, the ones that will teach us more about how biology works, and the ones that will teach us more about our own creativity.

The last 40 years of computing have proven what a hacker culture can accomplish. We’re about to see that again, this time in biology. And, while we have no idea what the results will be, it’s safe to predict that the coming revolution in biology will radically change the way we live — at least as radically as the computer revolution. It’s going to be an interesting and exciting ride.

June 20 2012

Alan Turing: the short, brilliant life and tragic death of an enigma

Codebreaker and mathematician Alan Turing's legacy comes to life in a Science Museum exhibition

A German Enigma coding machine on loan from Mick Jagger, and a 1950 computer that was once the fastest in the world despite having less calculating power than a smartphone, are among the star objects in a new exhibition at the Science Museum devoted to the short, brilliant life and tragic death of the scientist Alan Turing.

"We are in geek heaven," his nephew Sir John Turing said, surrounded by pieces of computing history which are sacred relics to Turing's admirers, including a computer-controlled tortoise that had enchanted the scientist when he saw it at the museum in the 1951 Festival of Britain. "This exhibition is a great tribute to a very remarkable man," Turing said.

"My father was in awe of him, the word genius was often used in speaking of him in the family," he said, "but he also spoke of his eccentricity, of how he cycled to work at Bletchley wearing a gas mask to control his hayfever so the local people he passed dreaded that a gas attack was imminent."

The exhibition, marking the centenary of Turing's birth, tackles both the traumatic personal life and the brilliant science of the man who was a key member of the codebreaking team at Bletchley Park, and devised the Turing Test, which is still the measure of artificial intelligence.

Turing was gay, and in 1952 while working at Manchester University, where he had a relationship with a technician called Arnold Murray, he was arrested and charged with gross indecency. He escaped prison only by agreeing to chemical castration through a year's doses of oestrogen – which curator David Rooney said had a devastating effect on him, mentally and physically. In 1954 he was found dead in his bed, a half-eaten apple on the table beside him – according to legend, laced with the cyanide which killed him.

His mother insisted that his death was accidental, part of an experiment to silver-plate a spoon – he had previously gold-plated another piece of cutlery by stripping the gold from a pocket watch – with the chemicals found in a pot on the stove. However, the coroner's report, also on display, is unequivocal: Turing had consumed the equivalent of a wine glass of poison, and the form records bleakly that "the brain smelled of bitter almonds".

The death is wreathed with conspiracy theories, but Rooney's explanation for the apple is pragmatic: not an obsession with the poisoned apple in the Disney film of Snow White, as some have claimed, but a very intelligent man who had it ready to bite into to counteract the appalling taste of the cyanide.

His nephew said both the prosecution and death were devastating for the family, but they were delighted by the formal public apology offered in 2009 by then prime minister Gordon Brown.

The campaign for a posthumous pardon is more problematic, he said, speaking as a senior partner at the law firm Clifford Chance.

"So many people were condemned properly under the then law for offences which we now see entirely differently. One would not wish to think that Turing won a pardon merely because he is famous, that might be just a step too far. But the suggestion that there might be some reparation by having him appear on the back of a bank note – that might indeed be good."

The exhibition includes the only surviving parts of one of the 200 bombe machines which ran day and night decoding German messages at sites around the country, each weighing a ton and all broken up for scrap after the war. The components were borrowed from the government intelligence centre at GCHQ after tortuous negotiations. Although visitors will not realise it, a short interview filmed at GCHQ is even more exceptional, the only film for public viewing ever permitted inside the Cheltenham complex.

By 1950, when the Pilot ACE computer, on which Turing did key development work, was finally running at the National Physical Laboratory, he had moved to Manchester, impatient at the slow pace of work in the postwar public sector. It is displayed beside a panel of tattered metal, part of a Comet, the first civilian passenger jet, which exploded over the Mediterranean killing all on board: the computer ran the millions of calculations to work out why.

Rooney says the exhibition is also intended to destroy the impression of Turing as a solitary boffin: it includes many of the people he worked with, who regarded him with awe and affection. When he came to see the computer tortoises in 1951 – they responded to light and scuttled back home when the bulb was switched on in their hutches – he also managed to break a game-playing computer by recognising the work of a protégé and cracking the algorithm on the spot: the computer flashed both "you've won" and "you've lost" messages at him, and then shut itself down in a sulk.

In an interview filmed for the exhibition his last researcher, Professor Bernard Richards of Manchester University, the man he was due to meet on the day of his death, says: "Turing struck me as a genius. He was on a higher plane."

Codebreaker – Alan Turing's life and legacy, free at the Science Museum, London, until June 2013.


May 19 2012

Finally, mystery of the famous faces of art may be revealed

Californian university project will use facial recognition software to identify subjects of paintings

A Californian university has won funding to use advanced facial recognition technology to try to solve the mysteries of some of the world's most famous works of art.

Professor Conrad Rudolph said the idea for the experiment came from watching news and detective shows such as CSI which had a constant theme of using advanced computers to recognise unknown faces from murder victims to wanted criminals.

Rudolph, professor of medieval art history at the University of California at Riverside, realised he might be able to apply that cutting-edge forensic science to some of the oldest mysteries in art: identifying the real people in paintings such as Vermeer's Girl with a Pearl Earring, Hals's The Laughing Cavalier or thousands of other portraits and busts where the identity of the subject has been lost. Work on the project should begin within a month or so.

Police and forensic scientists can use facial recognition software that identifies individuals by measuring certain key features. For example, it might measure the distance between someone's eyes or the gap between their mouth and their nose. In real life such measurements should be almost as unique as a fingerprint. Rudolph is hoping that the same might be true of portraiture, whether it is a sculpted bust or a painting.
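
The article gives no detail of the software Rudolph's team will use, but as a rough, hypothetical sketch of the kind of measurement it describes – key distances turned into scale-free ratios so that two images of different sizes can be compared – something like the following captures the idea (the landmark coordinates are invented purely for illustration):

```python
import math

def feature_vector(landmarks):
    """Turn raw landmark points (in pixels) into scale-free ratios of key distances."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    eye_gap = dist(landmarks["left_eye"], landmarks["right_eye"])
    # Dividing by the gap between the eyes removes the effect of image size.
    return [
        dist(landmarks["nose_tip"], landmarks["mouth"]) / eye_gap,
        dist(landmarks["left_eye"], landmarks["mouth"]) / eye_gap,
    ]

def difference(face_a, face_b):
    """Smaller values mean the two sets of measurements are more alike."""
    return math.dist(feature_vector(face_a), feature_vector(face_b))

# Hypothetical measurements from a portrait and a death mask of the same sitter,
# photographed at different scales.
portrait = {"left_eye": (120, 95), "right_eye": (180, 96), "nose_tip": (150, 140), "mouth": (151, 175)}
death_mask = {"left_eye": (60, 48), "right_eye": (90, 48), "nose_tip": (75, 70), "mouth": (76, 88)}
print(difference(portrait, death_mask))  # a small number despite the different image sizes
```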

To start with, his team will use facial recognition software on death masks of known individuals and then compare them to busts and portraits. If the software can find a match where Rudolph and his team know one exists, then it shows the technique works and can be used on unknown subjects to see if it can match them up with known identities.

The identities of the subjects of some of the most famous pictures in the world are unknown, including Girl with a Pearl Earring, the 17th-century portrait that inspired a film starring Scarlett Johansson. The Imagined Lives exhibition now running at London's National Portrait Gallery features portraits of 14 unknown subjects. Many of those paintings were once thought to be of historical figures such as Elizabeth I, but the identities are now disputed. The truth behind several paintings of Shakespeare – such as the Chandos portrait and the Cobbe portrait – has also been much disputed. It is possible facial recognition software could help solve these mysteries.

To be identified, the subject of a portrait would need to be matched to the identity of another named person in a separate picture. But Rudolph has some tricks up his sleeve. He believes that another forensic technique – whereby an "ageing" programme is run on a subject – could also help solve art mysteries. In fighting crime the software is usually used to produce "adult" pictures of children who have been missing for many years. But it could see if the Girl with a Pearl Earring had been painted again as a much older woman, whose identity might be known.

Away from the high-profile cases there is a legion of other unknown subjects that might be more easily identified. In many works from before the 19th century, wealthy patrons often inserted themselves, their families or friends and business associates into crowd scenes.

Facial recognition technology could be used to identify some of these people from already known works and thus provide insight into personal, political and business relationships of the day. In other cases families in wealthy homes commissioned busts of relatives that were often sold when estates went bankrupt or families declined.

The new technique could identify many of these people by linking the busts to known portraits. "These are historical documents and they can teach us things. Works of art can show us political connections or business links. It opens up a whole new window into the past," Rudolph said.

In order to transfer the process to analysing faces in works of art, some technical issues will need to be overcome. Portraits are in two dimensions and are also an artistic interpretation rather than a definitive likeness. In some cases, the painter might have simply not been very accurate, or attempted to flatter a subject, which would make recognition more difficult.

"It is different using this on art rather than an actual human," said Rudolph, "But we are trying to test the limits of the technology now and then who knows what advances may happen in the future? This is a fast-moving field."


March 26 2012

Grow a sunflower to solve unfinished Alan Turing experiment

Manchester Science Festival sows the seeds of a very bright idea to honour the computer genius in his centenary year

If ever there was a man for bright ideas, it was Alan Turing, and he would have loved this.

The whole of Manchester is being invited to plant sunflowers as part of the current centenary celebrations of his birth; and not just as a sentimental gesture.

Fittingly in the tradition of the great computer scientist, whose vital role in World War II's Enigma code-cracking was overshadowed by his public disgrace for having gay sex, the event is practical and scientific. The Museum of Science and Industry and partners, including Manchester University where Turing made extraordinary strides in computer development after the war, are trying to conclude an experiment which he left unfinished.

Fascinated by numerical sequences and geometric patterns, Turing speculated that both the petals and densely packed seedheads of sunflowers include striking examples of the Fibonacci number series – a mathematical phenomenon which others have explained far more clearly than I could ever manage. When he was prosecuted in 1952, humiliated and put on a primitive course of hormone treatment, or chemical castration, this project joined many others in gathering dust.

Here's the sequence: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89…. Can you work out the next number?
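
(Spoiler: each number is simply the sum of the two before it, so the next one is 144.) For the curious, here is a minimal Python sketch – nothing to do with the project's own software – that generates the series:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, ..."""
    sequence = []
    a, b = 0, 1
    for _ in range(n):
        sequence.append(a)   # record the current number
        a, b = b, a + b      # each new number is the sum of the previous two
    return sequence

print(fibonacci(13))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
```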

Although he had been awarded an OBE, the significance of Turing's wartime work was unknown to his colleagues at Manchester University or the public at large. His death in 1954 from cyanide poisoning has been widely assumed to have been suicide although this was never officially confirmed.

His interest in Fibonacci numbers in sunflowers, and other plants, stemmed in part from his own observations and partly from his knowledge of the history of science. The excellent Turing Centenary website has a lovely drawing of him by his mother, opting out of a hockey match at school and in the words of her pencilled caption: 'watching the daisies grow.' Most daisies have 34, 55 or 89 petals – the 9th, 10th, and 11th numbers in the Fibonacci series.

Turing knew about Leonardo da Vinci's interest in the subject and acknowledged the work of a Dutch scientist, J C Schoute, who studied the patterns on 319 sunflower heads just before the Second World War. That was cited in a paper Turing wrote in 1951 about patterns and sequences in biology which he also enjoyed testing on his fledgling computers.

Then it all ended. So the Turing Sunflower Project is taking it up, with a database which will be thousands strong. Professor Jonathan Swinton, visiting professor in computational systems at Oxford University, says that the numerology could be important to understanding how plants grow. He says:

Other scientists believe that Turing's explanation of why this happens in sunflowers is along the right lines but we need to test this out on a big dataset, so the more people who can grow sunflowers, the more robust the experiment.


The project's manager Erinma Ochu says:

We hope to provide the missing evidence to test Turing's little-known theories about Fibonacci numbers in sunflowers. It would be a fitting celebration of the work of Alan Turing.


The results of the experiment will be a highlight of Manchester Science Festival in October. Details of how to register for seeds, follow the project's progress on Twitter and read the festival blog are available from the organisers.


January 24 2012

Robot cleaners and the Museum of Me: Intel's vision of the future

Intel's best-known project might be gimmicky, but its new collaboration with the Royal College of Art is full of daring

Over the last decade or so, the burgeoning culture industry has spawned museums at such a rate that it seems no small town or minor artist will be left unrepresented. Now, social media has taken that logic to its absurd conclusion: it is not just minor artists who will get their own museum, we all will. Or so the creators of the Museum of Me would have us believe. Launched last year, and last week named the FWA (Favourite Website awards) site of the year, the Museum of Me turns your Facebook profile into a virtual exhibition. It sounds cheesy (and it is), but the fact that it already has more than 850,000 "likes" confirms that you can't underestimate the public's self-obsession.

The site, designed by Japanese agency Projector, takes the 19th-century concept of the museum as edifying repository and turns it into a characteristically 21st-century memorial to the self. Entering this generically deconstructivist building, what you get is a fly-through animation of a series of galleries, with pictures of you and your friends on the walls. There is a random selection of status updates jumbled on screens, and then a final sequence that implies, erroneously, that you are merely a composite of your social network. A soaring soundtrack turns the sentimentality dial to max. The experience is a cross between a photo album, a phonebook and a funeral. Not until the very end do you realise that it was all just an ad: "Intel Core i5. Visibly Smart".

The Museum of Me is a deft piece of marketing by microchip maker Intel. Given the opportunity to see how your life looks splashed on a museum's walls, you'd have to be the uncurious type not to have a peek. You can see why it went viral. But Intel doesn't sell directly to consumers, so what does it get out of this? Brand awareness, clearly, but also an opportunity to demonstrate that it is the purveyor of new experiences. And that's where it gets interesting: the Museum of Me may be a disposable gimmick, but Intel spends a good deal of time imagining what the future of our everyday experiences will look like. It has to. Making a microchip takes between three and seven years. Chips can't be designed to run gadgets we already own, or to satisfy observable consumer behaviour: they have to be designed for a market that doesn't yet exist.

"Our job is to think five years ahead, or beyond," says Wendy March, senior designer at Intel's department of interaction and experience research. "Technology changes so rapidly, and what's next on the horizon is sometimes closer than you think." As a result, Intel sponsors some of the most speculative research in design today. Working with design schools across the world, it sets students the task of dreaming up future scenarios – no matter how implausible they might seem.

One school the company has a longstanding relationship with is London's Royal College of Art. In recent years it has sponsored research by the college's design interactions department into such topics as the future of money and the use of robots in the domestic environment. In a cashless society, what rituals would we devise to make money tangible? How would we communicate with our robots? One student envisaged a "swab-bot" that roams the house doing hygiene tests and leaving you notes about your unsatisfactory cleanliness. "It's not about, 'Here's an idea, let's make that.' It's more about expanding our thinking," says March.

The RCA group's current research is into the future of social computing. This isn't just about social media and our insatiable appetite for sharing our personal lives. Social computing also allows asthma sufferers, for instance, to share information about air quality and their medication use, revealing patterns that will help improve their future treatment. "We're accumulating more and more data – but what do we do with it?" says March. "How do we stop it going into the digital equivalent of the cupboard under the stairs?"

Students at the RCA are finding various uses for it. One has designed an app that plays a soundtrack related to the crime figures for different areas of London, giving you an atmospheric sense of how safe you are, statistically, as you walk through the city. Another has documented all the posters at the Occupy site so that they can be shared digitally when they disappear (the British Library is interested in making it part of its collection). Other ideas are more speculative: for instance, turning social housing blocks into human supercomputers or hive minds, gathering the so-called wisdom of crowds.

This kind of research is not about plugging a gap in the market, but about enabling students to think beyond the narrowness of tech products. "It's useful because it shows the students there is another way of working with industry that's not about products," says Tony Dunne, the RCA's professor of design interactions. "Instead they can be involved upstream, even challenging a company's own ideas, using story-telling, speculation and social observation." With a company like Intel this is particularly interesting: as computing becomes ubiquitous, microprocessors are not just for gadgets but are increasingly woven into the fabric of everyday life. As William Gibson put it a decade ago, "I very much doubt that our grandchildren will understand the distinction between that which is a computer and that which isn't."


October 29 2011

Dennis Ritchie's legacy of elegantly useful tools

On Sunday, 10/30 we're celebrating Dennis Ritchie Day. Help spread the word: #DennisRitchieDay

Shortly after Dennis Ritchie died, J.D. Long (@cmastication) tweeted perhaps the perfect comment on Ritchie's life: "Dennis Ritchie was the architect whose chapel ceiling Steve Jobs painted." There aren't many who remember the simplicity and the elegance of the Unix system that Jobs made beautiful, and even fewer who remember the complexity and sheer awfulness of what went before: the world of IBM's S/360 mainframes, JCL, and DEC's RSX-11.

Much of what was important about the history of Unix is still in OS X, but under the surface. It would have been almost inconceivable for Apple to switch from the PowerPC architecture to the Intel architecture if Unix hadn't been written in C (and its successors), and hadn't been designed to be portable to multiple hardware platforms. Unix was the first operating system designed for portability. Portability isn't noticeable to the consumer, but it was crucial to Apple's long-term strategy.

OS X applications have become all-consuming things: you can easily pop from email to iTunes to Preview and back again. It's easy to forget one key component of the original Unix philosophy: simple tools that did one thing, did it well, and could be connected to each other with pipes (Doug McIlroy's invention). But simple tools that date back to the earliest days of Unix still live on, and are still elegantly useful.
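
As a small illustration of that philosophy (my sketch, not anything from Ritchie's own code), here is the classic "count the files in a directory" pipeline, wired up explicitly from Python so the plumbing is visible – `ls` writes into a pipe, `wc -l` reads from it:

```python
import subprocess

# Connect two small tools with a pipe: `ls` lists the files,
# `wc -l` counts the lines it receives.
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()  # so `ls` gets SIGPIPE if `wc` exits early
output, _ = wc.communicate()
print(output.decode().strip())
```

In a shell, of course, the same thing is just `ls | wc -l`; the point is that each tool stays small and single-purpose, and the pipe does the composition.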

Dennis Ritchie once said "UNIX is basically a simple operating system, but you have to be a genius to understand the simplicity." It's true. And we need more geniuses who share his spirit.

October 26 2011

Dennis Ritchie Day

Sunday, October 16 was declared Steve Jobs Day by California's Governor Brown. I admire Brown for taking a step to recognize Jobs' extraordinary contributions, but I couldn't help but be struck by Rob Pike's comments on the death of Dennis Ritchie a few weeks after Steve Jobs. Pike wrote:

I was warmly surprised to see how many people responded to my Google+ post about Dennis Ritchie's untimely passing. His influence on the technical community was vast, and it's gratifying to see it recognized. When Steve Jobs died there was a wide lament — and well-deserved it was — but it's worth noting that the resurgence of Apple depended a great deal on Dennis' work with C and Unix.

The C programming language is quite old now, but still active and still very much in use. The Unix and Linux (and Mac OS X and I think even Windows) kernels are all C programs. The web browsers and major web servers are all in C or C++, and almost all of the rest of the Internet ecosystem is in C or a C-derived language (C++, Java), or a language whose implementation is in C or a C-derived language (Python, Ruby, etc.). C is also a common implementation language for network firmware. And on and on.

And that's just C.

Dennis was also half of the team that created Unix (the other half being Ken Thompson), which in some form or other (I include Linux) runs all the machines at Google's data centers and probably at most other server farms. Most web servers run above Unix kernels; most non-Microsoft web browsers run above Unix kernels in some form, even in many phones.

And speaking of phones, the software that runs the phone network is largely written in C.

But wait, there's more.

In the late 1970s, Dennis joined with Steve Johnson to port Unix to the Interdata. From this remove it's hard to see how radical the idea of a portable operating system was; back then OSes were mostly written in assembly language and were tightly coupled, both technically and by marketing, to specific computer brands. Unix, in the unusual (although not unique) position of being written in a "high-level language," could be made to run on a machine other than the PDP-11. Dennis and Steve seized the opportunity, and by the early 1980s, Unix had been ported by the not-yet-so-called open source community to essentially every mini-computer out there. That meant that if I wrote my program in C, it could run on almost every mini-computer out there. All of a sudden, the coupling between hardware and operating system was broken. Unix was the great equalizer, the driving force of the Nerd Spring that liberated programming from the grip of hardware manufacturers.

The hardware didn't matter any more, since it all ran Unix. And since it didn't matter, hardware fought with other hardware for dominance; the software was a given. Windows obviously played a role in the rise of the x86, but the Unix folks just capitalized on that. Cheap hardware meant cheap Unix installations; we all won. All that network development that started in the mid-80s happened on Unix, because that was the environment where the stuff that really mattered was done. If Unix hadn't been ported to the Interdata, the Internet, if it even existed, would be a very different place today.

I read in an obituary of Steve Jobs that Tim Berners-Lee did the first WWW development on a NeXT box, created by Jobs' company at the time. Well, you know what operating system ran on NeXT's, and what language.

For myself, I can attest that there would be no O'Reilly Media without Ritchie's work. It was Unix that created the fertile ground for our early publishing activities; it was Unix's culture of collaborative development and architecture of participation that was the deepest tap root of what became the open source software movement, and not coincidentally, much of the architecture of the Internet as well. These are the technologies I built my business around. Anyone who has built their software or business with knowledge from O'Reilly books or conferences can trace their heritage back to Ritchie and his compatriots.

I don't have the convening power of a Governor Brown, but for those of us around the world who care, I hereby declare this Sunday, October 30 to be Dennis Ritchie Day! Let's remember the contributions of this computing pioneer.

P.S. Help spread the word. Use the hashtag #DennisRitchieDay on Twitter and Google+

"Revolution in the Valley," revisited

It's one thing to look back on a project with the power of hindsight and recognize that project's impact. It's quite another to realize a project will be profound as it's still playing out.

Andy Hertzfeld had that rare experience when he was working on the original Macintosh. Hertzfeld and his colleagues knew the first Mac could reshape the boundaries of computing, and that knowledge proved to be an important motivator for the team.

Over the years, Hertzfeld has chronicled many of the stories surrounding the Macintosh's development through his website, Folklore.org, and in his 2004 book, "Revolution in the Valley." With the book making its paperback debut and the work of Steve Jobs and Apple fresh in people's minds, I checked in with Hertzfeld to revisit a few of those past stories and discuss the long-term legacy of the Macintosh.

Our interview follows.

What traits did people on the Macintosh team share?

Andy Hertzfeld: Almost everyone on the early Macintosh team was young, super smart, artistic and idealistic. We all shared a passionate conviction that personal computers would change the world for the better.

At what point during the Macintosh's development did you realize this was a special project?

Andy Hertzfeld: We knew it from the very beginning; that's why we were so excited about it. We knew that we had a chance to unlock the incredible potential of affordable personal computers for the average individual for the very first time. That was incredibly exciting and motivating.

Between the book and your website, did the story of the Macintosh take on a "Rashomon"-like quality for you?

Andy Hertzfeld: I was hoping for more of that than we actually achieved — there haven't been enough stories by varied authors for that to come to the forefront. The most "Rashomon"-like experience has been in the comment section of the website, where people have corrected some of my mistakes and sometimes described an alternate version that's different than my recollections.

How did the Macintosh project change after Steve Jobs got involved?

Andy Hertzfeld: It became real. Before Steve, the Macintosh was a small research project that was unlikely to ever ship; after Steve, it was the future of Apple. He infused the team with a fierce urgency to ship as soon as possible, if not sooner.

There's a story about how after Jobs brought you onto the Macintosh team, he finalized the move by yanking your Apple II's power cord out of its socket. Was that an unusual exchange, or did Jobs always act with that kind of boldness?

Andy Hertzfeld: Well, not always, but often. Steve was more audacious than anyone else I ever encountered.

In the book, you wrote that as soon as you saw the early Mac logic board you knew you had to work on the project. What about that board caught your attention?

Andy Hertzfeld: It was the tightness and cleverness of the design. It was a clear descendant of the brilliant work that Steve Wozniak did for the Apple II, but taken to the next level.

Was the logic board the gravitational center for the project?

Andy Hertzfeld: I would say that Burrell Smith's logic board was the seed crystal of brilliance that drew everyone else to the project, but it wasn't the gravitational center. That would have to be the user interface software. My nomination for the gravitational center is Bill Atkinson's QuickDraw graphics library.

Where does the graphical user interface (GUI) rank among tech milestones? Is it on the same level as the printing press or the Internet?

Andy Hertzfeld: It's hard to compare innovations from different eras. I think both the printing press and the Internet were more important than the GUI, but the Internet couldn't have reached critical mass without the GUI enabling ordinary people to enjoy using computers.

In the book's introduction you wrote that the team "failed" because computers are still difficult. Has computer complexity improved at all?

Andy Hertzfeld: I think things have improved significantly since I wrote that in 2004, mainly because of the iPhone and iPad, which eliminated lots of the complexity plaguing desktop computing. That said, I think we still have a ways to go, but newer developments like Siri are very promising.

What do you think the long-term legacy of the Macintosh will be?

Andy Hertzfeld: The Macintosh was the first affordable computer that ordinary people could tolerate using. I hope we're remembered for making computing delightful and fun.

Would a current Apple fan have anything in common with a Macintosh-era Apple fan?

Andy Hertzfeld: Sure, a love for elegant design and bold innovation. They both think differently.

This interview was edited and condensed.

October 06 2011

Why do some people really hate Apple? | Charles Arthur

Few companies inspire such strong emotions, but is it Apple's profile, design or technology that pushes those buttons?

You don't have to go far on the web or even in everyday life to find people happy to say it: they hate Steve Jobs and all he stood for, and those who buy things from Apple – the "sheeple", in an oft-used phrase – are simply buying stuff for no reason other than its marketing, or advertising. Apple, they say, is a giant con trick.

Why do they care? Because, says Don Norman, an expert in how we react emotionally to design, buying or using products that engage our emotions strongly will inevitably alienate those who don't share those emotions – and just as strongly. Norman, formerly vice-president of the Advanced Technology Group at Apple, is co-founder of the Nielsen/Norman Group, which studies usability. He's also an author of books including Emotional Design and his latest, called Living With Complexity.

Apple, he says, excels at generating strong positive emotional reactions from those who use its products. The iPhone was a classic example with its revolutionary touchscreen control – which wasn't the first, but was the best: "Touch is a very important sense; a lot of human emotion is built around touching objects, other people, touching things," says Norman. "I think that we've lost something really big when we went to the abstraction of a computer with a mouse and a keyboard, it wasn't real, and the telephone was the same, it was this bunch of menus and people got lost in the menus and buttons to push and it felt like a piece of technology.

"Whereas the iPhone felt like a piece of delight. It really is neat to go from one page to the other not by pushing a button but by swiping your hand across the page." He adds: "The correct word is intimacy; it is more intimate. Think of it not as a swipe, think of it as a caress."

But just as physics sees an equal and opposite reaction to every action, so strong emotions engender adverse emotions in response. Take this comment by Aaron Holesgrove of OzTechNews about the iPad: "Actually, the iPad succeeds because it enables you to read websites whilst sitting on the toilet and play casual games in bed. It's a toy. You can't eliminate complexity when there was never any complexity in the first place – Apple went and threw a 10in screen on the iPod Touch and iPhone and called them the iPad and iPad 3G, respectively." Critics say Apple's products don't have as many features; their technical specifications aren't comparable to the leading-edge ones; they're more expensive. In short, you're being ripped off. And what's more, Apple is exploiting workers in China who build the products.

By contrast, ask someone about other comparable products out there – Amazon's new Kindle Fire, RIM's PlayBook, HP's TouchPad – and you'll get indifference, even if the prices are the same, or they're made in the same Chinese factories as Apple uses.

Norman says that the reaction – both the love and the hate – comes from Apple's designs. "This is important. It's something that I have trouble convincing companies of: great design will really convert people, but it will also put off other people. So you have to be willing to offend people; to make things that you know a lot of people are going to hate."

Apple's focus on design, which is principally expressed through the objects it sells – the iPods, iMacs, MacBooks, iPhones, iPads – drives those extreme reactions, he says. (And it's notable that nobody ever complained about Pixar's products – even though Jobs was chief executive there too.)

Part of why people like the devices so much is that they can personalise them: "The iPhone, being your mobile phone, is part of you, like the iPod is but even more so, because you're carrying everything around, not just your music but also your contacts and the ability to contact people – because people have observed that mobile phones are a very personal item."

By contrast, other companies that try to cater for and please everyone are guaranteed to fall short – and so won't excite emotion. "Many people try to make a product that everybody will love; Microsoft is a good example," he explains. "If you make a product that everybody loves – you do all your market surveys, and when people don't like something about it you change it – you end up with a bland product that everybody will accept but nobody truly loves."

Apple isn't like that, he says. "Apple says 'We're not going to even worry about it. We're going to make something that we ourselves love. We just assume that anything that we really love, lots and lots of people will love. And if other people really dislike it and hate it, so what. Tough on them.'"

But what about the criticism of the lack of specifications? When the iPod was still a hot seller, before the iPhone, I asked Phil Schiller, then as now Apple's head of marketing, about the lack of extras such as FM tuners and voice recorders – which rivals did offer, even though their products made no headway in the market.

Schiller put it simply: extras like FM radio were "a technology in search of a customer". He explained: "We're very careful about the technologies we bring to our products. Just because there's a new technology doesn't mean you should put it in your product. Just because our competitors have put it in their product – because they need something to compete with us, because they're losing on everything else – doesn't mean we should put it in the product.

"We should put new features in a product because it makes sense for our customers to have that feature, and because a significant percentage of our customers will want that feature. Otherwise, not. Remember, all these features cost money, space and most importantly power, and power is a really big deal."

At Apple, the executives' view is that "a lot of product suffer from featureitis": that it's easier to try to sell a checklist than selling a better product that does what customers really need to do. As one explained it to me: "We try to be very careful not to get caught up in a 'list of features war'; we try to focus just on what makes a great product for the customers, what do they really want to do, and focus on that like no one else. If we think some features aren't that great, and don't really work that well, and involve trade-offs that customers won't want, we just don't do it. We don't just have a checklist on the side of a box."

It may be significant that the strongest criticism of Apple tends to come from those most engaged with the nuts and bolts of technology. Apple's staff have probably got used to having their products called toys by now. As long as they keep selling, though, they'll keep ignoring the critics in favour of the fans – which will, of course, inflame emotions on both sides even more.


How Steve Jobs put the seduction into technology

Apple reshaped the personal computer from a wobbly, Professor Branestawm-like contraption into a kind of digital jewellery

I wrote this just a few weeks ago when Steve Jobs announced he was quitting Apple:

"The Macintosh turned out so well," Jobs, who resigned as the CEO of Apple last night, once told the New York Times, "because the people working on it were musicians, artists, poets and historians who also happened to be excellent computer scientists."
And the people who bought the first Apple Mac computers were often architects, designers and journalists. One way or another, Steve Jobs and Steve Wozniak, the creators of the Apple Macintosh computers in the 1970s, came up with a line of products that – though clunky at first – had great appeal and continue to excite those engaged in design and the media – those who were best placed to sow the Apple seed.

Steve Jobs died on Wednesday. I'm writing this on an Apple desktop computer. When I rush off to work on another story today, my personal Apple MacBook will come with me. I also have an iPad, a Christmas present that has insinuated itself into my working and private life. With all these Apples about, it can be like living in a digital orchard.

As someone who still loves his Olivetti Lettera and who has learned to come to terms with the digital world slowly and cautiously, Apple has eased the transition. It's not just that the technology is "user friendly" to writers and millions engaged in what are known as the "creative industries", but that the physical design of Apple gizmos is seductive.

The iPad is like some magic tablet that comes alive and glows as if a genie has answered at least some of your wishes. Invented by Jobs and his team and styled by Apple's Jonathan Ive, it is one of those products that I like to imagine transporting back in time and showing our equally inventive ancestors as they built a pyramid or engineered a Gothic cathedral. Look what we've learned to do!

There would, of course, be one major snag. With no electricity, cables or satellites, let alone service providers and all the rest of the digital panoply, the iPad's screen would remain resolutely dark, its crisp and gleaming plastic and metal case holding little interest for the architects of the Great Pyramid of Cheops or Salisbury Cathedral.

So Apples are very much objects of our time, so much so that each has been superseded by the next at a speed that might suggest a policy of built-in obsolescence. It's not that, although any company wants to sell us its next product or go out of business. It's more a case of design and technology moving on rapidly. And, in Jobs's case, of making Apple products indispensable in the way a wristwatch, handbag or wallet are to so very many millions of people.

Jobs, with incisive assistance from his design team, reshaped the personal computer from a wobbly, Professor Branestawm-like contraption lashed together at the back of a garage, or from an early Moog synthesiser lookalike, into a kind of digital jewellery.

Machines that, when he was growing up, were the stuff of men in white coats poring over punched paper tape and whirring, tape-recorder style reels in sealed, air-conditioned rooms are, thanks to Jobs, sleek hand-held devices that slip into handbags – wallets in the next couple of years, no doubt.

The very first Apple computer went on sale in 1976, its digital gubbins protected by a wooden casing. That was just a generation ago, and yet the latest Apple designs make it look as though it might have been a tool used by medieval masons.

George Stephenson did not invent the steam railway locomotive, but when he and his son, Robert, shaped Rocket in 1829 – a pretty canary yellow and white design – they made this revolutionary machine aesthetically and emotionally acceptable to a largely suspicious and sceptical public.

Jobs has done much the same thing with Apple and the personal computer. There is, of course, something almost touching about the fact that most of these gleaming, seductive 21st-century devices are charged, when plugged into walls, by electricity generated by the mighty stationary steam engines we know as power stations.

Not everything under the digital sun is new, but Jobs knew how to make it shine into our offices, our homes and our private lives.


Steve Jobs, Apple co-founder, dies at 56

The mastermind behind an empire that has revolutionised personal computing, telephony and music, dies in California

Steve Jobs, billionaire co-founder of Apple and the mastermind behind an empire of products that revolutionised computing, telephony and the music industry, has died in California at the age of 56.

Jobs stepped down in August as chief executive of the company he helped set up in 1976, citing illness. He had been battling an unusual form of pancreatic cancer, and had received a liver transplant in 2009.

Jobs wrote in his letter of resignation: "I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple's CEO, I would be the first to let you know. Unfortunately, that day has come."

Apple released a statement paying tribute: "Steve's brilliance, passion and energy were the source of countless innovations that enrich and improve all of our lives … The world is immeasurably better because of Steve."

Bill Gates, the former chief executive of Microsoft, said in a statement that he was "truly saddened to learn of Steve Jobs's death". He added: "The world rarely sees someone who has had the profound impact Steve has had, the effects of which will be felt for many generations to come.

"For those of us lucky enough to get to work with him, it's been an insanely great honour. I will miss Steve immensely."

He is survived by his wife, Laurene, and four children. In a statement his family said Jobs "died peacefully today surrounded by his family … We know many of you will mourn with us, and we ask that you respect our privacy during our time of grief".

Jobs was one of the pioneers of Silicon Valley and helped establish the region's claim as the global centre of technology. He founded Apple with his childhood friend Steve Wozniak, and the two marketed what was considered the world's first personal computer, the Apple II.

He was ousted in a bitter boardroom battle in 1985, a move that he later claimed was the best thing that could have happened to him. Jobs went on to buy Pixar, the company behind some of the biggest animated hits in cinema history including Toy Story, Cars and Finding Nemo.

He returned to Apple 11 years later when it was being written off by rivals. What followed was one of the most remarkable comebacks in business history.

Apple was briefly the most valuable company in the world earlier this year, knocking oil giant Exxon Mobil off the top spot. The company produces $65.2bn a year in revenue compared with $7.1bn in its business year ending September 1997.

Starting with his brightly coloured iMacs, Jobs went on to launch hit after hit that transformed personal computing.

Then came the success of the iPod, which revolutionised the music industry, leading to a collapse in CD sales and making Jobs one of the most powerful voices in an industry he loved.

His firm was named in homage to the Beatles' record label, Apple. But the borrowing was permitted on the basis that the computing firm would stay out of music. After the success of the iPod the two Apples became engaged in a lengthy legal battle which finally ended last year when the Beatles allowed iTunes to start selling their back catalogue.

Jobs's remarkable capacity to spot what people wanted next came without the aid of market research or focus groups.

"For something this complicated, it's really hard to design products by focus groups," he once said. "A lot of times, people don't know what they want until you show it to them."

Jobs initially hid his illness but his startling weight loss started to unnerve his investors. He took a six-month medical leave of absence in 2009, during which he received a liver transplant, and another medical leave of absence in mid-January before stepping down as chief executive in August.

Jobs leaves an estimated $8.3bn, but he often dismissed others' interest in his wealth. "Being the richest man in the cemetery doesn't matter to me … Going to bed at night saying we've done something wonderful … that's what matters to me."


August 25 2011

Steve Jobs: iDesigned your life

iMacs, iPods, iPads – the Apple CEO took us from beige plastic to sophisticated and desirable design in every sphere of our lives

"The Macintosh turned out so well", Steve Jobs – who resigned as CEO of Apple last night – once told the New York Times, "because the people working on it were musicians, artists, poets and historians who also happened to be excellent computer scientists." And the people who bought the first Apple Mac computers were often architects, designers and journalists. One way or another, Steve Jobs and Steve Wozniak, creators of the Apple Macintosh computers in the 1970s, came up with a line of products that – though clunky at first – had great appeal, and continue to excite those engaged in design and the media; those who were best placed to sow the Apple seed.

The very first Apple computer to go on sale in 1976, in a wooden casing, had a lashed-together look that hinted strongly at its roots in a Californian garage. From a purely aesthetic point of view, it might have come from an old Buck Rogers comic book. When Apple II emerged a year later, boasting colour graphics and a plastic case, these revolutionary computers – compact, easy to understand and use, and entertaining – began to sell in larger numbers.

But the real revolution in easy-to-use desktop computer design was the Macintosh 128K, launched in January 1984. It featured a mouse, a separate keyboard and a tiny screen with graphic commands that even an exhausted Fleet Street journalist could adapt to. And yet despite their ingenuity, and the revolutionary impact they had on millions of working lives, no one could call early Apple products things of beauty. We used them to produce early issues of Blueprint magazine, a monthly devoted to architecture, fashion and design, yet they seemed lacklustre compared to many of the gleaming 80s designs we published. But everyone was fascinated by Jobs and Wozniak, these awkward ambassadors for a new era in design and media.

One of Jobs's greatest contributions to design was the promotion of Jonathan Ive, the brilliant young British designer, to senior vice president of industrial design at Apple Inc in 1998. Jobs had been away from Apple for some years – creating Pixar and thus Toy Story in the interim – yet when he came back, he teamed up with Ive to create a range of hugely appealing products. The first was the colourful iMac of 1998, a bold attempt to break away from the dull world of beige and grey plastic computer cases. With its oddball marriage of boiled sweet colours and transparent plastics, the iMac was certainly eye-catching, and it also sold – two million in the first 12 months.

But Jobs and Ive really got into their stride in 2001 with the iPod MP3 player, a small, minimalist design that evoked the work of the legendary German designer Dieter Rams, who had done so much since the 1950s to make Braun products, from record players to electric shavers, sell in prodigious quantities worldwide. The iPhone (2007) and iPad three years later have seen the Jobs-Ive design partnership come to fruition. These lightweight yet well-made, jewel-like objects, with their crystal-clear screens, finally imbued the design of computers and digital gizmos with a seductive quality. Once seen and touched, sales were made. Packaging and advertising were all of a piece with these sleek new products, as are the latest Apple showrooms – as much clubs as shops for Apple customers.

The minimalist quality – that has worked so well aesthetically and commercially in recent years – is what Jobs had been seeking all along. An unostentatious man, he has worked over four decades to fuse the complexities of computer operations with an ease of use and finally a gracefulness and beauty that must have seemed not so much out of the question or improbable in the mid-1970s, but irrelevant. What mattered then was to make new technology work for everyone, and like the first steam locomotives, aircraft, typewriters or telephones, Jobs's first designs seem archaic today. His contribution to both technology and design has been enormous. Amazing, really, how quickly those artless wooden and glum plastic boxes have become – with a little help from friends and colleagues – objects of modern desire.


guardian.co.uk © Guardian News & Media Limited 2011


April 13 2010

Grumpy old men, the "Inmates" and margins

As the iPad descends upon us, it is fair to ask, "Is this the beginning of the end, or the end of the beginning?" Depending upon whom you ask, the conclusions widely vary.

For example, RealNetworks' Rob Glaser forcefully argues that Apple's vertically integrated model "Must be stopped." He cautions: "If that's the way the industry plays out -- and there are a couple of vertical stovepipes that are closed -- A: we will have a much slower pace of innovation than we've ever had and B: there will be a tremendous loss in terms of value creation versus it being more horizontal."

Meanwhile, science fiction writer, blogger and tech activist, Cory Doctorow, recently made waves when he asserted in Why I won't buy an iPad (and think you shouldn't, either) that, "If you can't open it, you don't own it. Screws not glue." He concluded:

The real issue isn't the capabilities of the piece of plastic you unwrap today, but the technical and social infrastructure that accompanies it. If you want to live in the creative universe where anyone with a cool idea can make it and give it to you to run on your hardware, the iPad isn't for you. If you want to live in the fair world where you get to keep (or give away) the stuff you buy, the iPad isn't for you. If you want to write code for a platform where the only thing that determines whether you're going to succeed with it is whether your audience loves it, the iPad isn't for you.

And don't even get me started on the legions who dismiss Apple's end-to-end approach with an "Apple's Evil" slap, or more stridently, paint the story as "destined" to play out as things did in the PC Wars, with arrogant Apple racing to an early lead, only to get its head handed to it in the end.

I won't spend a lot of time bringing to the fore the masses that see the Apple model in more favorable terms, as the numbers speak for themselves across just about any metric that matters:

  • 85 million iPhones/iPod Touches/iPads sold
  • 185,000 applications built
  • 100,000 developer ecosystem
  • 4 billion application downloads
  • 15 billion iTunes media sold
  • JD Power Award for Customer Satisfaction
  • Ungodly operating margins/cash flow

So how to reconcile the animus with the market's clear directional momentum? Read on ...

Progress and grumpy old men

The late Herb Caen, the legendary columnist of the San Francisco Chronicle, once wrote a piece about the worsening state of San Francisco and, in particular, one of its main arteries, Market Street.

In it, he lamented how this thoroughfare was always under construction, how the city's charms and enduring traditions were getting swept aside by outsiders, and how the place was becoming less and less hospitable to locals and long-timers, forcing Caen to wonder if, perhaps, San Francisco's best days were behind it.

Ah, but Caen was setting us up for an unexpected uppercut: at the tail end of the piece, he reveals (I am paraphrasing), "Would it surprise you to know that I wrote this piece way back in 1954?"

Caen's point was that then, as now, every generation sees its own as the Real Generation with the Right Approach, when in truth progress just moves forward.

Hence, the locals of San Francisco, circa 1954, saw a city losing sight of its traditions and therefore, its magic. In truth, the city was just moving forward with the times.

Thus, it was unsurprising that, 30 years later, locals would reach the exact same conclusion: that the "good old days" were those of their own particular generation.

I would argue that Glaser, Doctorow and a number of others (Daring Fireball's John Gruber covers some of the other disenchanted in an excellent piece, The Kids Are All Right) are simply guilty of confusing their truth with The Truth, a not-so-subtle way of saying, "My Way or the Highway."

As an aside: while I have heard plenty of grumpy old men lamenting the continuing rise of the Apple approach and its dark implications, I have yet to hear a single female prognosticator confuse such attributes with real-world unfavorable outcomes. Perhaps it's because women don't long for the "good old days" of Stone Age tools, techno-babble, impersonal computing and the like.

Personally, my first computer was a TRS-80, so I understand the nostalgia for being able to tinker down at the level of schematics and assembly code; just the same, I prefer to apply my muscles judiciously to higher-level problems rather than lower-level ones.

Hence, what I give up in terms of absolute flexibility, I gain in not having to worry about hardware abstractions, infinite form-factors, middleware, glue code, software distribution, marketplace and monetization.

To me, that is a more than acceptable trade-off, inasmuch as you would be hard-pressed to argue that the model is less democratic or even less web-friendly: while Apple is clearly trying to create the best native experience possible, it has unquestionably also created the best mobile web experience, championed HTML5 and driven WebKit adoption.

Nonetheless, the yin and yang of openness versus integration raises a fundamental question that underscores the simmering industry battle between Apple and Google.

Do we really need more inmates?

There are two bookends that gave me a grammar and narrative for thinking about software (and hardware) development and design. The first is "The Mythical Man-Month" by Fred Brooks, and the second is "The Inmates are Running the Asylum" by Alan Cooper.

In "Inmates," Cooper makes the argument that too often the development process is driven by techies building the types of products that they would like to use, as opposed to really understanding the aspirations and outcome goals of their target user, let alone who that target user even is.

Worse, they often compensate for this blind spot by building products that address all use cases, including edge cases, and by designing an interaction model that is a composite of that blob of functionality.

The end result is products that are confusing, needlessly complex and that address all theoretical problems from a checkbox perspective, but few real problems from a specific-outcome perspective.

Keep this in mind next time you are comparing the Apple product that seems to be "missing" certain features relative to the cheaper alternative on the other shelf. Nothing is free when it comes to product design decisions.

Margins, and who keeps which piece of what dollar

It's worth revisiting Rob Glaser's earlier comment about "stopping Apple," as it underscores the real reason many want to stop Apple.

Back in the days of the PC, the rise of Microsoft and Intel led to a horizontally organized industry. Microsoft and Intel kept the highest margin dollars for themselves, and could expand into adjacent segments as they saw fit. They also left a number of chunks of the hardware, software and infrastructure stack to third parties.

This type of loose coupling worked because the PC was essentially a homogeneous platform, and expectations of user experience were such that daily system crashes, recurrent performance lags and numbingly complex "enterprise" software were considered the rule, not the exception.

Now, of course, the two industry standard-bearers of the Post-PC Era, Apple and Google, have addressed the challenges of old very differently: Google by embracing simpler, loosely coupled (read: horizontally focused) cloud-facing solutions, and Apple by embracing vertically integrated, complete product solutions that marry hardware, software, services, developers and a marketplace.

But make no bones about it: the real tempest here is who keeps the high-margin dollars.

In the case of Google, they are happy to allow any and all to plug into their search and advertising gravy train, so long as they can disrupt any and all incumbent segments ripe to be broken up by their model.

In the case of Apple, they see user experience and control of same as central to their value proposition and "govern" accordingly.

Whether you see one as more open, closed, virtuous or evil depends upon your personal preference about user experience and choice, not to mention your particular economic self-interest.

But that is a post for another time.


April 07 2010

The iPad and computing's middle ground

For quite a while now, cafes have been filled with laptops and people fighting over power outlets. More recently, those same coffee shops have added a crowd of people waiting for their drinks, nearly all hunched over their iPhones. Mobile devices take computing to new places.

I have to wonder where we'll see iPads a few months or years from now. I bet some of the places they'll show up aren't yet obvious.

One simple example: Phil Schiller's demo of the iWork spreadsheet app, Numbers, in the iPad launch keynote showed a spreadsheet tracking a local soccer team. It's a great demo. Would you carry a laptop around a soccer field? Would you want to track game stats on an iPhone while shouting encouragement to the players? Neither of those quite works, but the iPad, replacing the coach's old clipboard, could easily make that environment a better one for computing. It's a middle ground where the phone is too small and the laptop too big.

I've long loved Don Norman's book, "The Invisible Computer," which talks about computing as an embedded aspect of everyday devices. The iPad isn't that; it does, though, move the ways we can use computers and networks closer to activities that so far have been difficult to reach. The computer is still visible, but it is also much more everyday than it was.

I'm excited about the iPad and what people might create for it. Apple does a fantastic job -- better than anyone -- at providing development kits that make developers' work look beautiful. They model the best experiences in the apps they ship, and provide tools that allow any motivated developer to make similarly beautiful experiences for their own apps. The form of the iPad is one big change, but the examples Apple is setting, in iBooks and iWork and the rest, invite people to create. I love that.

The uproar around the iPhone and iPad restrictions and patent enforcement issues is real and I sympathize with both positions. But to get to the future we have to imagine it, we have to see it made real. How many tech companies and entrepreneurs talked endlessly and nearly fruitlessly (no pun intended) about the mobile web and tablet computing before the iPhone and the iPad came along? Apple is great at giving up just enough freedom to settle complaints (witness the evolution of DRM in iTunes), and I suspect that will happen again here. Regardless of how it plays out, I think we're seeing an expansion of the way we use and think about computing.
