
March 15 2012

Foxconn and Ford, Emerson and Jobs

This post originally appeared on The Question Concerning Technology. It's republished with permission.

To borrow a line from Chuck Berry, it goes to show you never can tell.

I embarked this week on a bit of historical research, thinking I might find some connections between the factory workers of the digital era and those of the industrial era. Along the way I found myself confronting deep questions about the relationship between technology and spirit.

As most people know, there's been a raft of publicity lately about the conditions that prevail in the mega-factories of Foxconn, the Taiwan-based company that produces many of the digital devices we love so well. Even as Foxconn was denying that its workers are mistreated, the company announced it was raising their salaries by as much as 25 percent, its third announced pay increase in the past two years. Overtime hours are also being reduced.

No doubt these adjustments are aimed in part at repairing some of the damage to Foxconn's public image, and to the public images of its clients, notably Apple. A dozen or so employee suicides in rapid succession tend to attract critical scrutiny.

That's not the whole story, however. Several reports also point out that Foxconn is at pains to stem the high rates of employee turnover in its factories, turnover that suggests the company may not always be able to depend on the vast, pliant pool of migrant labor that's fueled its explosive growth so far.

All this struck me as having some interesting parallels with the evolution of labor policies in the factories of an earlier breakthrough technology, the automobile.

In 1913 Henry Ford introduced the moving assembly line at his Highland Park factory in Michigan, revolutionizing the process of mass production. The following year he revolutionized his company's relationship with its workers by introducing the Five Dollar Day, a pay rate that more than doubled the average employee's salary. He also cut back the standard shift from nine to eight hours.

There were strings attached, including requirements that Ford's standards of cleanliness and sobriety be met at home as well as at work. Nonetheless, for the legions of mostly immigrant workers who besieged the employment office at Highland Park, the Five Dollar Day redefined what it meant to earn a living wage.

Like the pay raises at Foxconn, the Five Dollar Day was aimed at reducing unacceptable rates of employee turnover. The profits Ford was realizing with his production efficiencies were being eaten up by the cost of replacing 370 percent of his workforce a year. Workers hadn't yet grown accustomed to the grinding routine of the assembly line; absenteeism was also rampant. The Five Dollar Day effectively encouraged employees to show up, and to stick around.

Whether by luck or by design, the Five Dollar Day also established one of the foundational principles of modern consumerism: Pay employees enough so that they can afford to buy the products they produce. This, too, is part of what's happening in China. Foxconn employees want to own iPads and iPhones as well as make them. Economists and environmentalists are having fun contemplating the implications of a shift in individual buying power in China today analogous to that unleashed in America in 1914.

This was pretty much what I expected to find when I started looking into the history of the Five Dollar Day. What I didn't expect to find was that Henry Ford's institution of that policy may have been inspired, at least in part, by the Sage of Concord, Ralph Waldo Emerson.

It's an irony of history that a man who loved nature as much as Henry Ford would have so much to do with its destruction. According to biographer Robert Lacey, Ford was a great admirer of the naturalist John Burroughs. He gave Burroughs a Model T in hopes of persuading him that cars, by providing people with means to escape the pestilent cities, would promote rather than undermine the cause of conservation. Burroughs presumably was unconvinced, but he did manage to infuse Ford with his passion for Emerson.

Lacey says the Five Dollar Day reflects in particular the ideas expressed in Emerson's essay, "Compensation." Ford often gave copies to friends, and a close associate said it "comes nearer to stating his creed than anything else." It's not hard to see why, given that Ford was a billionaire who believed in reincarnation, and who sometimes said he belonged with "the Buddhist crowd."

"Compensation" distinctly demonstrates the degree to which Emerson's transcendentalism resonates with Eastern religions. "The true doctrine of omnipresence," he says in one passage,

"is that God reappears with all his parts in every moss and cobweb. The value of the universe contrives to throw itself into every point. If the good is there, so is the evil; if the affinity, so the repulsion; if the force, so the limitation."

In another passage he adds, "The soul is.

"Under all this running sea of circumstance, whose waters ebb and flow with perfect balance, lies the aboriginal abyss of real Being. Essence, or God, is not a relation or a part, but the whole. Being is the vast affirmative, excluding negation, self-balanced, and swallowing up all relations, parts and times within itself."

As we used to say in the '60s, far out.

We know that Steve Jobs was well acquainted with the principles of Zen Buddhism and Hindu mysticism. With the works of Emerson, probably not so much. There's no mention of Emerson in Walter Isaacson's biography of Jobs, or in several other books on the history of Apple I've read. Jobs wasn't known as a reader (neither was Ford), and I'd guess that "Compensation" would have tried his patience. It's as abstruse and as silly in spots as Emerson's other essays, and as wordy. Still, one imagines that if Jobs had read it, he would have recognized its affirmation of some of the cosmic truths he held dear.

Basically "Compensation" is a meditation on what in Eastern terms would be called karma and the interplay between the yin and the yang. The gist of the message is that no one, in the end, gets away with anything. "A perfect equity," Emerson says, "adjusts its balance in all parts of life ...

"Our action is overmastered and characterized above our will by the law of nature. We aim at a petty end quite aside from the public good, but our act arranges itself by irresistible magnetism in line with the poles of the world."

The subject of work comes up only sporadically, but those mentions are telling. "Human labor," Emerson says, "through all its forms,

"from the sharpening of a stake to the construction of a city or an epic, is one immense illustration of the perfect compensation of the universe. The absolute balance of Give and Take, the doctrine that every thing has its price – and if that price is not paid, not that thing but something else is obtained, and that it is impossible to get anything without its price – is not less sublime in the columns of a leger than in the budgets of states, in the laws of light and darkness, in all the action and reaction of nature."

Robert Lacey cites this passage as suggestive of Ford's realization that he wasn't enjoying the advantages he could have enjoyed from his assembly line because he wasn't paying heed to the absolute balance of Give and Take. He wasn't paying the price.

This isn't to say that reading Emerson suddenly turned Ford into some gooey-eyed idealist. Many scholars argue that the Five Dollar Day was less about sharing the wealth than about gaining control of an unruly workforce. Ford himself described the policy as "one of the finest cost-cutting moves we ever made," but he also insisted he'd rather make 15,000 families happy than make 20 or 30 millionaires.

In any event, the Five Dollar Day accomplished its mission, and helped ignite the engine of consumerism that defines, as much as anything, the American character to this day. In that sense Steve Jobs most assuredly carried Ford's legacy into the 21st century.

It's impossible to say how Jobs would have responded to the controversies regarding Foxconn that continued to escalate after his death. In a June 2011 interview, two months before he stepped down as Apple's CEO, Jobs said he was deeply troubled by Foxconn's employee suicides, but insisted that Apple was doing "one of the best jobs in our industry and maybe in any industry" of monitoring the working conditions in its supply chain. Even if that's true, Apple's critics argue that doing "one of the best jobs in our industry" doesn't necessarily mean the company is doing enough.

There's not much evidence, in Isaacson's biography at least, that during his lifetime Jobs spent a lot of time thinking about the people who assembled his products. There's endless talk about purity of design and the seamless integration of hardware and software, but no substantive discussion of workers, factories, or China. Foxconn isn't mentioned at all. I think it's fair to conclude that Jobs was far more focused on what it feels like to use the iPod, the iPad, and the Mac than on what it feels like to make them. His talent lay in empathizing with his customers, not with his factory workers.

It would be unfair to expect Jobs to have been all things to all people. Like everyone else, he had his strengths and his weaknesses. Still, it's regrettable that a man who believed so strongly in the holistic integrity of Apple's products, inside and out, seems to have paid relatively little attention to the human beings who literally bring those products into the world.

In his better moments Jobs had to have realized, if he allowed himself to think about it, that there's an inherent karmic imbalance in the production of Apple's products. The devices he shepherded so carefully to market promise to open paths of individual freedom and creativity. That's why he believed they made the world a better place, and that's why we love them. The revelations about the working conditions at Foxconn remind us that individual freedom and creativity are not the values that prevail on the assembly line.

As consumers, most of us give far less thought to what it's like to work on the line than Steve Jobs probably did. Our indifference ignores Emerson's absolute law of Give and Take. "Treat men as pawns and ninepins and you shall suffer as well as they," he said. "If you leave out their heart, you shall lose your own."

Photo: Old Five Dollar Bill - 1934 by Kevin Krejci, on Flickr


February 13 2012

Apple's iTV and the implications of what Steve said

If I accept conventional wisdom, Apple is getting into the TV-making business because:

  1. The living room is the last consumer segment that Apple has yet to completely remake in its image.
  2. Apple creates new markets where none exist, and it isn't satisfied with merely improving upon existing ones.
  3. Steve Jobs allegedly said that he'd cracked the code for creating an integrated TV set.
  4. If the iPad is really "just" a big iPod Touch, and has already sold 55 million units, then a TV that is "just" a big iPad could do gonzo business.
  5. The business of making TVs is broken, and Apple has to fix it.
  6. Cable and satellite providers are evil, and Apple has to liberate consumers.
  7. Tim Cook "needs" a hit.

As I stated in my last post following Apple's gaudy earnings numbers, I don't accept conventional wisdom because conventional wisdom is dead! Apple killed it.

Most fundamentally, all assumptions about Apple seem to stem from a misunderstanding of how differently Apple thinks and operates from everyone else.

For starters, Apple doesn't chase markets just because they're there. Nor does it get sucked into market share battles just so it can say it sold the most units (see: iOS vs. Android).

Further, neither the aggrandizement of the CEO's ego nor the altruistic care-taking of the consumer drives Apple's product strategy.

Rather, Apple pursues markets purely and vigorously based upon a simple logic: Does the company believe that its integrated hardware + software + service approach can be applied in a leveraged fashion to create a differentiated offering that delights consumers, appeals to the masses, and can be sold at high margins at a predictable run rate?

If the answer is "yes," then game on. If the answer is "no," then leave it as a hobby (such as the current Apple TV), or avoid the market altogether.

This is the backdrop for understanding the rumors about Apple building a new-fangled television set. Rumors and whispers notwithstanding, in the words of Dr. Hannibal Lecter, the obvious question is:

"Of each particular thing, ask: What is it in itself? What is its nature?"

[Graphic: Apple TV matrix. Top layer = iOS devices; middle layer = core device functions; bottom layer = noteworthy hardware subsystems.]

In the case of a serious living room play, if you check out the above graphic, what stands out most about the Apple TV in its current incarnation is its lack of apps, web, and communications support. These elements are the three biggest game changers that propelled the iPhone, iPod Touch and iPad beyond the impressive media foundation that marked the pre-iOS iPod.

What is also lacking is the mainstream television programming (HBO, ESPN, ABC) that the typical consumer demands. A 'purdy' new TV doesn't remedy that problem, now does it?

But remember, Apple is long removed from its anti-establishment days, when the incumbent had to fail for the company to succeed. Hence, the rebirth of the Mac was predicated on getting into bed with Microsoft; the rise of the iPod was predicated on getting into bed with the music industry; and the rise of the iPhone was predicated on getting into bed with mobile carriers.

When framed that way, who hasn't Apple gotten into bed with yet that it needs to win over to succeed in a mainstream way?

You guessed it: the cable and satellite providers. Why? Because as noted venture capitalist Bill Gurley sagely pointed out, "When it Comes to Television Content, Affiliate Fees Make the World Go 'Round."

In other words, for an Apple TV to be free-flowing with first-tier TV content in the same way that an iPod flows with first-tier music, Apple will need DIRECTV and/or Comcast to bless it.

ESPN, after all, earns $4.69 a month in affiliate fees for each and every cable subscriber household; across the roughly 100 million homes that carry the channel, that works out to more than $5 billion a year. Apple's good friend, Disney, owns ESPN, ABC, Disney Channel and a slew of other channels. Disney simply isn't going to throw billions of dollars away in affiliate fees just so they can help Apple. All of the major TV content players view the world similarly.

So where does that get you when you connect the dots? I'll tell you where it doesn't get you ... to a television-like device that:

  1. Is priced 2-4X the cost of an iPad.
  2. Has sales cycles of one device every 5-10 years.
  3. Has bad margins.
  4. Has a serviceable form factor that for many people is good enough. (Apple challenges industries where the baseline experience is terrible. Television hardware wouldn't seem to qualify.)

Conversely, what if you could buy a set-top box that plugged into your modern, big-screen TV, and:

  1. It just worked.
  2. Had every channel you currently get on cable.
  3. You could run those same channels as apps on your other iOS devices.
  4. Your TV could be controlled by any of those same iOS devices.
  5. You could upgrade to the newest version of the set-top box every 2-3 years (on a carrier-subsidized basis).
Who wouldn't buy this device? And why wouldn't the cable and satellite providers be all over this? After all, does anyone seriously like their set-top box?

As a sanity check, a carrier subsidy on a sub-$500 device is meaningful, whereas a carrier subsidy on a $1,500+ device like a TV set is nothing.

Wait! But didn't Steve Jobs say that he'd like to make an integrated TV set?

Even if he did say that, do you really think that in his final official act as Apple spokesman, Jobs would telegraph to the world his company's grand intentions in the living room?


January 20 2012

Developer Week in Review: Early thoughts on iBooks Author

One down, two to go, Patriots-wise. Thankfully, this week's game is on Sunday, so it doesn't conflict with my son's 17th birthday on Saturday. They grow up so quickly; I can remember him playing with his Comfy Keyboard, and now he's writing C code for robots.

A few thoughts on iBooks Author and Apple's textbook move

Thursday's announcement of Apple's new iBooks Author package isn't developer news per se, but it will have an impact on the developer community in several ways, so I thought I'd drop in a few initial thoughts before jumping into the meat of the WIR.

Most directly, it is another insidious lock-in that Apple is wrapping inside a candy-covered package. Since books produced with the tool can only be viewed in full on iOS devices, textbooks and other material produced with iBooks Author will not be available (at least in the snazzy new interactive form) on Kindles or other ereaders. If Apple wanted to play fair, it would make the new iBooks format an open standard. Of course, that would cost Apple its cut of the royalties, as well as the all-important control of the user experience that Steve Jobs instilled as a core value in the company.

On a different level, this could radically change the textbook and publishing industry. It will make it easier to keep textbooks up to date and start to loosen the least-common-denominator stranglehold that huge school districts have on the textbook creation process. On the other hand, I can see a day when pressure from interest groups results in nine different textbooks being used in the same class, one of which ignores evolution, one of which emphasizes the role of Antarctic-Americans in U.S. history, etc.

It's also another step in the disintermediation of publishing since the cost of getting your book out to the world just dropped to zero (not counting proofreading, indexing, editing, marketing, and all the other nice things a traditional publisher does for a writer). I wonder if Apple is going to enforce the same puritanical standards on iBooks as they do on apps. What are they going to do when someone submits a My Little Pony / Silent Hill crossover fanfic as an iBook?

Another item off my bucket list

I've been to Australia. I've had an animal cover book published. And now I've been called a moron (collectively) by Richard Stallman.

The occasion was the previously mentioned panel on the legacy of Steve Jobs, in which I participated this past weekend. As could have been expected, Stallman started by describing Jobs as someone the world would have been better off without. He spent the rest of the hour defending the position that it doesn't matter how unusable the free alternative to a proprietary platform is, only that it's free. When we disagreed, he shouted us down as "morons."

As I've mentioned before, that position makes a few invalid assumptions. One is that people's lives will be better if they use a crappy free software package over well-polished commercial products. In reality, the perils of commercial software that Stallman demonizes so consistently are largely hypothetical, whereas the usability issues of most consumer-facing free software are very real. For the 99.999% of people who aren't software professionals, the important factor is whether the darn thing works, not if they can swap out an internal module.

The other false premise at play here is that companies are Snidely Whiplash wanna-bes that go out of their way to oppress the masses. Stallman, to his credit as a savvy propagandist, has co-opted the slogans of the Occupy Wall Street movement, referring to the 1% frequently. The reality is that when companies try to pull shady stunts, especially in the software industry, they usually get caught and have to face the music. Remember the furor over Apple's allegedly accidental recording of location data on the iPhone? Stallman's dystopian future, where corporations use proprietary platforms as a tool of subjugation, has pretty much failed to materialize every time it's actually been tried on the ground. I'm not saying corporations are angels, or even that they have the consumer's best interests in mind; it's just that they aren't run by demonic beings that eat babies and plot the enslavement of humanity.

Achievement unlocked: Erased user's hard drive

Sometimes life as a software engineer may seem like a game, but Microsoft evidently wants to turn it into a real one. The company has announced a new plug-in for Visual Studio that lets you earn achievements for coding practices and other developer-related activities.

Most of them are tongue in cheek, but I'm terrified that we may start seeing these achievements in live production code as developers compete to earn them all. Among the more fear-inspiring (a hypothetical offender is sketched after the list):

  • "Write 20 single letter class-level variables in one file. Kudos to you for being cryptic!"
  • "Write a single line of 300 characters long. Who needs carriage returns?"
  • "More than 10 overloads of a method. You could go with this or you could go with that."

Got news?

Please send tips and leads here.


December 20 2011

The price of greatness: Three takeaways from the biography of Steve Jobs

As the first Christmas approaches without Apple founder Steve Jobs, it's worth pausing for a moment to appreciate what he has left behind.

In addition to an astoundingly healthy business with $80 billion in the bank, recent analysis by Andy Zaky of Bullish Cross suggests that in the current holiday quarter, Apple will record its largest earnings blowout ever.

This is on top of unparalleled customer loyalty and brand recognition, not to mention a potent halo effect generated by Apple's iPhone, iPad and Mac products.

Yet, according to analyst Zaky, Apple remains the most undervalued large cap stock in America. It's almost as if Apple is saving "one more thing" for the holidays; this one, a stocking-stuffer for investors.

I bring this last point up because the notion of Apple still being undervalued (and under-appreciated), despite the accomplishments, accolades and attention, suggests something about the human condition; namely, that when faced with an exceedingly bright and brilliant light, our minds naturally filter it down a bit.

But true greatness, the kind realized by Jobs in his life, and by Edison, Disney and Ford before him, is best appreciated without filters, for it is something that is experienced perhaps only once in a generation.

With that in mind, I want to share three takeaways from Walter Isaacson's biography of Jobs that spotlight both the greatness of the man and the price that greatness demands.

"The flu game"

In the annals of professional sports, there is perhaps no individual performance more emblematic of greatness in action than "the flu game" in the 1997 NBA Finals, when a flu-ridden Michael Jordan overcame a stomach virus that had rendered him weak and dehydrated to score 38 points and lead his Chicago Bulls to a 90-88 victory over the Utah Jazz in Game 5. They won the series in six games.

That one man could overcome, no, ignore, failing health to will his team to victory is both a defining example of Michael Jordan's greatness as a basketball player and no different from how Jordan approached every game he played.

I thought about this a lot in reading Jobs' bio, inasmuch as one of the key takeaways (for me) from the book was how Apple's rise from the ashes was largely accomplished with its leader fighting not a flu, but cancer, and not for one game, but for eight years.

We all know about Jobs' battles with cancer, his forced leaves of absence, and the fact that he was never quite physically restored to the cherub-like state that he embodied when he first returned to Apple in 1997.

But the book lays bare, painfully so, something that all of us grokked and groped at from the shadows but could never truly "know" because it wasn't public: from the moment he got sick in 2003 to when he died in October of this year, Jobs was never fully healthy again.

Quite the opposite, in fact. He was fighting a continuous battle with his body, and a metastasizing cancer, yet still led his team to a series of triumphs that have no equal in the annals of business.

What Steve Jobs accomplished after cancer

  1. iTunes Store ramp
  2. iPhone
  3. iOS + App Store
  4. iPad

During this period, Apple stock surged more than 3,000%, and for Jobs personally, it was only his second greatest financial achievement; he would realize far greater personal wealth leading Pixar's evolution from a failing tech provider for the film business into Disney 2.0.

[Chart: Apple's stock performance]

Just as Jordan's flu game is simultaneously emblematic and par for the course of his greatness, so too was Jobs' leadership of Apple during his period of sickness.

The man known for reality distortion and an unwavering, uncompromising pursuit of the insanely great ignored his own personal suffering, paying the ultimate price to achieve greatness. More than any other nugget from the Steve Jobs bio, this coarsely ground truth should serve as a reminder the next time we wonder why there are so few great leaders, and even fewer great companies.

Yeah, but he was a jerk

Those who seek to dismiss or marginalize the accomplishments of Jobs tend to focus on one of three things.

Either they diminish his accomplishments as a modern-day Edison since Jobs wasn't an engineer, or they give props to Jobs' marketing savvy as a backhanded way of diminishing the realness of what he built.

Or, they point out that he was a narcissistic jerk who took credit for the accomplishments of others, was controlling, belligerent, and probably not the prototypical role model of the family man (home for dinner, mowing the lawn on the weekends).

I'd like to focus on this last point, as it is simply irrelevant to the field of play in which Jobs made his mark.

Few of us know or care if Michael Jordan is a nice guy, whether Walt Disney remembered the names of his workers' kids, or if Thomas Edison petted his dog. Case in point: Henry Ford held anti-Semitic views, but that doesn't mute the impact that Ford had on the field of play that is the automotive industry.

In Jobs' case, we have already established how fully the man led by example; how unparalleled the financial results his company generates are; and the deep, emotional bond that Apple products engender with users. But, also know that Jobs built a corporate culture defined by longevity, loyalty, depth, purpose and intellectual honesty — but above all, peak performance.

In other words, in the field of play that is creating enduring companies that build products that "make a dent in the universe" (a Jobs axiom), whether the leader is warm, fuzzy and personally likable is mostly orthogonal to the outcomes that he manifests.

Sweating the details

So, we've established that Jobs led by example, making the ultimate sacrifice so that his vision, his purpose in life, could be realized.

And we've noted that whatever personal peculiarities adorned the man, they didn't tarnish his accomplishments one iota.

In closing, I'd note how Jobs' manifestation of these attributes translated into the type of leader who plugged himself into an entire category of granular decisions that most CEOs would delegate "on principle." Yet it's darn near impossible to imagine an un-Jobsian leader delivering the wealth of transformational products that Apple has created.

One such example explored in the book is the specific materials and production processes that Apple uses in building its products. Such is the story of Gorilla Glass, the exceptionally lightweight, damage-resistant glass that came to anchor the screen of the iPhone.

How Gorilla Glass came to be is classic Jobs.

Internally, the iPhone team was driven by a realization that the centerpiece of a touch-driven phone was the display, not a composite of screen, casing and keyboard.

Armed with this clarity, Jobs drove the Apple team to rethink the form of the device around its display centricity. But, of course, this raised the question of the integrity and durability of the display material being used.

While conventional wisdom initially drove the company toward plastic screens, as the iPod had used, Jobs focused on the elegance and substantive nature of glass.

Having gotten wind from an old friend that Corning Glass was doing some amazing things with chemically fortified glass, Jobs, in typical fashion, tracked down Corning's CEO, who told him about a chemical process that had actually originated in the 1960s but had never found an appropriate commercial application.

Convinced that he had found the right answer, Jobs challenged Corning's CEO to commit to both the capacity and timeline needed to achieve the scale Apple required to meet the iPhone launch deadline.

It was a game-changing solution for an unproven new device, built on a process that had never been commercialized prior to that point. And it worked!

There are similar stories in the book about the advent of multitouch, Apple's embrace of intricate metal fabrication processes, mass-purchasing of pinpoint lasers and the internal prototyping culture that instructed what became the Apple Stores.

Beyond showcasing the many incredible qualities of Jobs, all of this serves to underscore that having a simple product line — in terms of having very few products — is very different than having a simple product strategy. With scarcity comes focus, and with focus comes precision.

A final thought

There are many of us who consider ourselves to be entrepreneurs, inventors, and startup guys and gals, but I think this quote from Jobs captures the essential point: there are no shortcuts to greatness. Greatness is dedication. It's a demand, and it's a detail. Or, as Jobs said:

I hate it when people call themselves entrepreneurs when what they're really trying to do is launch a startup and then sell or go public so they can cash in and move on. They're unwilling to do the work it takes to build a real company, which is the hardest work in business.

Amen. Somewhere in the universe, there is a hole where the light of Steve Jobs still shines through.

Photo of Steve Jobs from Apple Press Info.


November 21 2011

Four short links: 21 November 2011

  1. Steve Jobs in Early NeXT Days (YouTube) -- documentary footage of the early retreats at NeXT, where Jobs talks about plans and priorities. Very interesting to watch this knowing how the story ends. I'm astonished by how well Jobs spoke, even then, and delighted by the glimpses of impatience and dismissiveness. I wonder where the raw footage went. (via The Next Web)
  2. Cotton Candy Prototype -- an Android-running computer on a USB stick. Plug it in, use the software on the stick to talk to the onboard OS, and you're off. The ease of carrying your systems and data with you like this is the only long-term challenge I can see to the convenience of cloud storage of your digital life. For more details see Laptop Mag.
  3. Clayton Christensen on Short-Sighted Pursuit of Profits (Forbes) -- love this quote from an overseas semiconductor manufacturer: You Americans measure profitability by a ratio. There's a problem with that. No banks accept deposits denominated in ratios.
  4. Ford Just Became a Software Company (Information Week) -- Ford is shipping memory sticks with software upgrades to the touchscreen computer in its cars. This is the future of manufacturing: your physical products will need software, which will force your business to have software competencies you haven't begun to dream of. Business opportunity?

November 14 2011

Steve Jobs, the Unabomber, and America's love/hate relationship with technology

As the extraordinary tide of tributes to the life and work of Steve Jobs poured in these past few weeks, I couldn't help wondering how Ted Kaczynski was taking the news.

Kaczynski, aka the Unabomber, is serving a life sentence in a Colorado prison for conducting a murderous terror campaign he'd hoped would overthrow the kingdom of technology. There can be no more dramatic testimony to the failure of that campaign than the orgy of eulogies accorded Jobs.

Still, beneath their obvious differences, there's a connection between Kaczynski and Jobs, not between them personally but between the archetypes they've come to represent.

The emotional reactions to Jobs' passing made it abundantly clear that for many of us he'd come to symbolize the hopeful, life-affirming potential of the technical arts, in the process buttressing our faith in technology as a vehicle of human progress.

Kaczynski, by contrast, seemed a creature who'd emerged from the depths of our subconscious, a malignant manifestation of our fears that technology is not our friend but our enemy, and that our enemy is gaining the upper hand. Several commentators argued that Kaczynski disturbed us in part because we share a measure of his fear, and of his anger. Robert Wright wrote in Time magazine that "there's a little bit of the Unabomber in all of us." Daniel J. Kevles made essentially the same point in The New Yorker; his essay appeared under the headline, "E Pluribus Unabomber." Alston Chase, in his biography of Kaczynski, suggested that the Unabomber Manifesto articulated in hyperbolic terms the same sort of earth-friendly sentiments that embrace organic vegetables, camping, and the Prius. Minus the violence, Chase said, the Manifesto represented "nothing less than the American creed."

That's a vast overstatement, I think, but it does speak to the incongruity I'm driving at here. Jobs and Kaczynski represent the extreme poles of a deep-seated ambivalence in our attitudes toward technology. That ambivalence has been a part of American history, and part of the American psyche, since the beginning.

Thomas Jefferson set the pattern. Jefferson argued passionately for a national economy based on the wholesome integrity of the family farm. Dependence on manufactures, he wrote, "begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the designs of ambition." But Jefferson also installed a host of inventions at Monticello and marveled at the wonders of industrial power in England. He loved nature but found it impossible to resist the fruits of abundance and power technology offered. Jefferson's oscillations on technology, said the historian Leo Marx, represent "decisive contradictions in our culture and in ourselves."

The same contradictions flavored the nation's pursuit of Manifest Destiny. The dominant theme was that technology was the spearhead of civilization, the essential tool for taming the savage frontier. At the same time, a less confident undercurrent whispered that the possibilities of human freedom were vanishing even as the glories of nature were being despoiled. Contemporary accounts quoted by Henry Nash Smith demonstrate how both perspectives were projected onto the personality of Daniel Boone, who was alternately portrayed as "the angelic Spirit of Enterprise," paving the way for decency and prosperity, or as a paragon of lonely rectitude, moving ever westward, ahead of the madding crowd. "I had not been two years at the licks [in Missouri]," Boone was said to have complained, "before a d--d Yankee came, and settled down within a hundred miles of me!!"

This discordant medley of enthusiasm and regret would subsequently be echoed in the frontier novels of James Fenimore Cooper and the Westerns of John Ford. In both, a self-reliant frontiersman typically served as a bridge between wild nature and community, often demonstrating that for all the gains civilization brought, something noble and pure was being lost. Still later the same sorts of tensions would appear in the public images of Thomas Edison and Henry Ford, who were revered not only for their achievements in technology, but also for having managed to turn the trick of becoming rich and famous while retaining the homespun virtues of small town boys.

Steve Jobs and Ted Kaczynski — I'm talking about the individuals now, not the archetypes — were both products of the 1960s counterculture, and the spectacular divergence of their subsequent careers testifies to the depth of the counterculture's bifurcated views on technology. Yes, the '60s were a time of getting real and getting back to the land, but they were also an era of changing consciousness with the help of high-powered sound systems and LSD. Whether Kaczynski ever dropped acid I don't know, but he certainly dropped out. And although his Manifesto showed that he was filled with hatred for much of what the '60s stood for, it's also true that his views on technology were shaped by some of the counterculture's favorite intellectuals, Jacques Ellul and Herbert Marcuse among them.

Jobs regularly cited LSD and the "Whole Earth Catalog" as seminal influences in his youth. Certainly, Stewart Brand's counterculture bible captured the era's eclecticism in regard to machines: readers regularly found woodstoves and potter's wheels featured alongside books on cybernetics and space stations. In that context, it makes perfect sense that, as Walter Isaacson's biography reveals, Jobs tried for nine months to treat his pancreatic cancer with fruit juices and herbal remedies before seeking out the most technologically advanced medical treatments he could find.

If Steve Jobs and Thomas Jefferson can be ambivalent about technology, I guess any of us can. That's where Ted Kaczynski took a more radical path, a path of madness. You can't separate good technologies from bad technologies, he said. Buying into the Internet and artificial intelligence means also buying into nuclear meltdowns, eugenics, and global warming. Technology aims inexorably in one direction only: totalitarianism, the eradication of nature and the subjugation of human beings.

Kaczynski's madness came not so much in the logic of that philosophy — similar views have been endorsed by plenty of respectable people, including Ellul and Marcuse — as it did in his insistence on trying to force everyone else to adhere to it. His contempt for compromise was deep. When it comes to technology, he scornfully said, people want to have their cake and eat it, too. To which generations of Americans have replied, "Who wouldn't?"

By that I mean that we lust for the gifts technology bestows while overlooking, as best we can, its degradations. We love the mobility our cars provide, but keeping them filled with gas has gotten us into all sorts of trouble, and suburban sprawl is a nightmare. I wouldn't want to give up my iPad, my Android, my Xbox, or my plasma TV, but the people who make them in China seem to be getting a pretty bad deal, and don't ask me where they end up when I throw them away.

Our Jobsian side smiles confidently and says, "Relax! Technology will provide us with solutions to all those problems — give it time." To which our Kaczynski side scowls and snarls that technology doesn't solve problems, it creates them. Trying to extricate ourselves with more machinery only serves to dig the hole we're in that much deeper.

Technological schizophrenia: It's an American tradition.

Photos: iPhone 4s via Apple; Kaczynski's cabin via Singularity Symposium


November 04 2011

Four short links: 4 November 2011

  1. Beethoven's Open Repository of Research (RocketHub) -- open repository funded in a Kickstarter-type way. First crowdfunding project I've given $$$ to.
  2. KeepOff (GitHub) -- open source project built around hacking KeepOn Interactive Dancing Robots. (via Chris Spurgeon)
  3. Steve Jobs One-on-One (ComputerWorld) -- interesting glimpse of the man himself in an oral history project recording made during the NeXT years. I don't need a computer to get a kid interested in that, to spend a week playing with gravity and trying to understand that and come up with reasons why. But you do need a person. You need a person. Especially with computers the way they are now. Computers are very reactive but they're not proactive; they are not agents, if you will. They are very reactive. What children need is something more proactive. They need a guide. They don't need an assistant.
  4. Bluetooth Violin Bow -- this is awesome in so many directions. Sensors EVERYWHERE! I wonder what hackable uses it has ...

October 26 2011

Developer Week in Review: These things always happen in threes

Fall is being coy this year in the Northeast. We've been having on and off spells of very mild, almost summer-like weather over the last few weeks. That trend seems to be finally ending, alas, as possible snow is forecast for the weekend in New Hampshire. As the old joke goes, if you don't like the weather here, just wait five minutes.

The fall also brings hunting to the area. The annual moose season just concluded (you need to enter a special lottery to get a moose permit), but deer season is just about to open. My son and I won't be participating this year, but we recently purchased the appropriate tools of the trade: a shotgun to hunt in southern NH (where you can't hunt deer with a rifle) and a Mosin Nagant 91/30 for the rest of the state. The latter is probably overkill, but my son saved up his pennies to buy it, being a student of both WWII and all things Soviet. Hopefully, he won't dislocate his shoulder firing it ...

Meanwhile, in the wider world ...

John McCarthy: 1927-2011

It's been a sad month for the computer industry, coming as it does on the heels of the deaths of Steve Jobs and Dennis Ritchie. Less well known, but equally influential, AI pioneer and LISP creator John McCarthy passed away on Sunday. McCarthy was involved in the creation of two of the preeminent AI research facilities in the world, at MIT and Stanford, and he is generally credited with coining the term "artificial intelligence."

LISP has had its periods of popularity, peaking in the 1980s, but it's never been a mainstream language in the way that C, FORTRAN, BASIC or Java have been. What people tend to forget is just how old LISP really is. Only FORTRAN, COBOL and ALGOL are older than LISP, which came on the scene in 1958. Many of the concepts we take for granted today, such as closures, first saw light in LISP. It also lives on in the hearts of Emacs and AutoCAD, among others, and LISP is the language used in much of the groundbreaking artificial intelligence work.
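If you've never bumped into the term, a closure is simply a function that remembers the environment it was created in. Here's a minimal sketch in Python, one of the many languages that inherited the idea LISP pioneered:

    def make_counter():
        count = 0  # local state captured by the inner function

        def increment():
            nonlocal count  # update the captured variable, not a new local
            count += 1
            return count

        return increment

    tally = make_counter()
    print(tally())  # 1
    print(tally())  # 2 -- the function carries its state between calls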

On a side note, when I first met my wife and told her I was involved in the AI field, she gave me a truly strange look. She had a BA in animal science, you see, and in that field "AI" stands for artificial insemination.


Someone finally admits the dirty truth about the GPL

If you listen to Richard Stallman, the GPL is all about being a coercive force that will eventually drive all software to be free (as in freedom). Those of us who watch such things have noticed that it has a paradoxical effect, however. Companies like MySQL (now Oracle) use it the same way that drug dealers offer free samples to new customers: "The first one's free, but you'll be back for more." In other words, they get you hooked by offering a GPL version, but cash in when you want to use their product for commercial purposes, because the GPL is too dangerous for most companies.

Now, Python developer Zed Shaw has brought the GPL's dirty little secret into the light of day. In a particularly NSFW rant, Shaw explains why he chooses to use the GPL these days. In short, it's because he's sick of developers at companies getting to be heroes by using his stuff and getting the glory. "I use the GPL to keep you honest. You now have to tell your bosses you're using my gear. And it will scare the piss out of them." He goes on to say that he's using the GPL as a stick to force companies to pay him to use his software.

This goes right to the very core of the debate about what free/open software should be about. Is it a tool to make all software free? Is it a way to allow "good" people (i.e., non-commercial users) to have access while punishing "bad" people (professional developers)? Personally, I'm thrilled that Southwest Airlines uses a Java library I created for another client years ago and open sourced, but evidently some people (especially those who aren't getting paid to maintain open-source projects by a day job) want to get paid for their efforts.

I find the logic a bit questionable. I don't see a lot of difference between a free software developer who holds corporate users' feet to the fire and a commercial software developer. Sure, it still allows hobbyists and educational users to use the software for free, but it's actually acting to discourage companies from getting involved in FL/OSS by encouraging the wrong model. When companies use open-source software in their products, they are more likely to contribute back to the project and to open source other non-critical code they produce. If they are paying a developer for it, they are much less likely to contribute back.

The Steve Jobs movie: I predict lots of people walking and talking

With the Steve Jobs biography currently sitting at the top of Amazon's bestseller list, Sony Pictures is wasting no time getting a film adaptation underway. The current buzz is that Aaron Sorkin, creator of "The West Wing" and winner of the Academy Award for his adaptation of "The Social Network," is on the short list to write the screenplay.

It would be interesting to see how Sorkin would tackle Jobs' story, full and complex as it is. One approach might be to leave out the '80s, already covered to some degree in "Pirates of Silicon Valley," and concentrate instead on his youth and the last 15 years of his life. One can only hope that the technological details are not hopelessly mangled in an attempt to make it accessible.

Got news?

Please send tips and leads here.


"Revolution in the Valley," revisited

It's one thing to look back on a project with the power of hindsight and recognize that project's impact. It's quite another to realize a project will be profound as it's still playing out.

Andy Hertzfeld had that rare experience when he was working on the original Macintosh. Hertzfeld and his colleagues knew the first Mac could reshape the boundaries of computing, and that knowledge proved to be an important motivator for the team.

Over the years, Hertzfeld has chronicled many of the stories surrounding the Macintosh's development through his website, Folklore.org, and in his 2004 book, "Revolution in the Valley." With the book making its paperback debut and the work of Steve Jobs and Apple fresh in people's minds, I checked in with Hertzfeld to revisit a few of those past stories and discuss the long-term legacy of the Macintosh.

Our interview follows.

What traits did people on the Macintosh team share?

Andy Hertzfeld: Almost everyone on the early Macintosh team was young, super smart, artistic and idealistic. We all shared a passionate conviction that personal computers would change the world for the better.

At what point during the Macintosh's development did you realize this was a special project?

Andy Hertzfeld: We knew it from the very beginning; that's why we were so excited about it. We knew that we had a chance to unlock the incredible potential of affordable personal computers for the average individual for the very first time. That was incredibly exciting and motivating.

Between the book and your website, did the story of the Macintosh take on a "Rashomon"-like quality for you?

Andy Hertzfeld: I was hoping for more of that than we actually achieved — there haven't been enough stories by varied authors for that to come to the forefront. The most "Rashomon"-like experience has been in the comment section of the website, where people have corrected some of my mistakes and sometimes described an alternate version that's different than my recollections.

How did the Macintosh project change after Steve Jobs got involved?

Andy Hertzfeld: It became real. Before Steve, the Macintosh was a small research project that was unlikely to ever ship; after Steve, it was the future of Apple. He infused the team with a fierce urgency to ship as soon as possible, if not sooner.

There's a story about how, after Jobs brought you onto the Macintosh team, he finalized the move by yanking your Apple II's power cord out of its socket. Was that an unusual exchange, or did Jobs always act with that kind of boldness?

Andy Hertzfeld: Well, not always, but often. Steve was more audacious than anyone else I ever encountered.

In the book, you wrote that as soon as you saw the early Mac logic board you knew you had to work on the project. What about that board caught your attention?

Andy Hertzfeld: It was the tightness and cleverness of the design. It was a clear descendant of the brilliant work that Steve Wozniak did for the Apple II, but taken to the next level.

Was the logic board the gravitational center for the project?

Andy Hertzfeld: I would say that Burrell Smith's logic board was the seed crystal of brilliance that drew everyone else to the project, but it wasn't the gravitational center. That would have to be the user interface software. My nomination for the gravitational center is Bill Atkinson's QuickDraw graphics library.


Where does the graphical user interface (GUI) rank among tech milestones? Is it on the same level as the printing press or the Internet?

Andy Hertzfeld: It's hard to compare innovations from different eras. I think both the printing press and the Internet were more important than the GUI, but the Internet couldn't have reached critical mass without the GUI enabling ordinary people to enjoy using computers.

In the book's introduction you wrote that the team "failed" because computers are still difficult. Has computer complexity improved at all?

Andy Hertzfeld: I think things have improved significantly since I wrote that in 2004, mainly because of the iPhone and iPad, which eliminated lots of the complexity plaguing desktop computing. That said, I think we still have a ways to go, but newer developments like Siri are very promising.

What do you think the long-term legacy of the Macintosh will be?

Andy Hertzfeld: The Macintosh was the first affordable computer that ordinary people could tolerate using. I hope we're remembered for making computing delightful and fun.

Would a current Apple fan have anything in common with a Macintosh-era Apple fan?

Andy Hertzfeld: Sure, a love for elegant design and bold innovation. They both think differently.

This interview was edited and condensed.


October 24 2011

A focus on the stuff that matters most

This post originally appeared in Tim O'Reilly's Google+ feed.

This tweet by Steve Case (@stevecase) struck home for me, because in the aftermath of Steve Jobs' death I've been thinking a lot about O'Reilly, wanting to make sure that we streamline and focus on the stuff that matters most.

[Image: Steve Case's tweet about Steve Jobs]

Here's the money quote from the article Case mentioned:

"My passion has been to build an enduring company where people were motivated to make great products," Jobs told [biographer Walter] Isaacson. "[T]he products, not the profits, were the motivation. [John] Sculley flipped these priorities to where the goal was to make money. It's a subtle difference, but it ends up meaning everything."

Jobs went on to describe the legacy he hoped he would leave behind, "a company that will still stand for something a generation or two from now."

"That's what Walt Disney did," said Jobs, "and Hewlett and Packard, and the people who built Intel. They created a company to last, not just to make money. That's what I want Apple to be."

All of our greatest work at O'Reilly has been driven by passion and idealism. That includes our early forays into publishing, when we were a documentation consulting company to pay the bills but wrote documentation on the side for programs we used that didn't have any good manuals. It was those manuals, on topics that no existing tech publisher thought were important, that turned us into a tech publisher "who came out of nowhere."

In the early days of the web, we were so excited about it that Dale Dougherty wanted to create an online magazine to celebrate the people behind it. That morphed into GNN, the Global Network Navigator, the web's first portal and first commercial ad-supported site.

In the mid-'90s, realizing that no one was talking about the programs that were behind all our most successful books, I brought together a collection of free software leaders (many of whom had never met each other) to brainstorm a common story. That story redefined free software as open source, and the world hasn't been the same since. It also led to a new business for O'Reilly, as we launched our conference business to help bring visibility to these projects, which had no company marketing behind them.

Thinking deeply about open source and the internet got me thinking big ideas about the Internet as operating system, and the shift of influence from software to network effects in data as the key to future applications. I was following people who at the time seemed "crazy" — but they were just living in a future that hadn't arrived for the rest of the world yet. It was around this time that I formulated our company mission of "changing the world by spreading the knowledge of innovators."

In 2003, in the dark days after the dotcom bust, our company goal for the year was to reignite enthusiasm in the computer business. Two outcomes of that effort did just that: Sara Winge's creation of Foo Camp spawned a worldwide, grassroots movement of self-organizing "unconferences," and our Web 2.0 Conference told a big story about where the Internet was going and what distinguished the companies that survived the dotcom bust from those that preceded it.

In 2005, seeing the passion that was driving garage inventors to a new kind of hardware innovation, Dale once again wanted to launch a magazine to celebrate the passionate people behind the movement. This time, it was "Make:", and a year later, we launched Maker Faire as a companion event. Around 150,000 people attended Maker Faires last year, and the next generation of startups is emerging from the ferment of the movement that Dale named.

Meanwhile, through those dark years after the dotcom bust, we also did a lot of publishing just to keep the company afloat. (With a small data science team at O'Reilly, we built a set of analytical tools that helped us understand the untapped opportunities in computer book publishing. We realized that we were playing in only about 2/5 of the market; moving into other areas that we had never been drawn to helped pay the bills, but never sparked the kind of creativity that came from the areas we'd found by following our passion.)

It was at this time that I formulated an image that I've used many times since: profit in a business is like gas in a car. You don't want to run out of gas, but neither do you want to think that your road trip is a tour of gas stations.

When I think about the great persistence of Steve Jobs, there's a lesson for all of us in it.

What's so great about the Apple story is that Steve ended up making enormous amounts of money without making it a primary goal of the company. (Ditto Larry and Sergey at Google.) Contrast that with the folks who brought us the 2008 financial crisis, who were focused only on making money for themselves, while taking advantage of others in the process.

Making money through true value creation driven by the desire to make great things that last, and make the world a better place — that's the heart of what is best in capitalism. (See also the wonderful HBR blog post, "Steve Jobs and the Purpose of the Corporation." I also got a lot of perspective on this topic from Leander Kahney's book, "Inside Steve's Brain.")

October 09 2011

On the media reaction to the death of Steve Jobs

Socrates said at his trial that "the unexamined life is not worth living for a human being." In the days since the death of Steve Jobs, his life and legacy have been the subject of conversations around the globe, with thousands of articles, broadcasts, tweets and updates, and more to come this week in the pages of magazines whose publishers stopped the presses to put the co-founder of Apple on their covers. We've looked at his impact on commerce and his legacy here at Radar, too.

In the context of worldwide reactions to his impact on the arc of history, it matters to recognize the complexity of his life and to offer a balanced assessment of his legacy. With that in mind, O'Reilly editors have been exchanging frank reflections over email on the passing of one of the technology industry's iconic figures. While some of the comments are provocative, we thought it was an important conversation to share.


Mike Loukides initiated the thread: "I have to say that all those photos of people putting candles in front of Apple stores really creeps me out. I think hero worship is one of the most destructive things human culture has ever invented. He was a person. He had to die, sooner or later. It's sad and unfortunate that it was sooner. But you don't escape mortality by being able to hit a ball farther than someone else, or having better design sense than the next guy. And watching the devout go up humbly to the altar of the Apple store to leave their offerings--well, if that doesn't say something about what's wrong with this economy, I don't know what does."

Sarah Milstein responded: "Thanks for saying that, Mike. I respect people's sadness, but at the risk of saying something unpopular, I've also been astonished and dismayed by the intense adulation.

"I'm not big on hero worship to start with, and I find it particularly distressing in this situation, because his public work never seriously focused on identifying worthy public problems and inspiring people to tackle them. (You can argue that Apple's products have enabled other people to do good work and that the company has created lots of jobs; but there's still a meaningful distinction to be made between focusing on profits and focusing on social benefit. I mean, nobody says that oil companies are doing a lot of good work because their products allow aid workers to reach poor people around the globe.)

"You can't ask people not to have their feelings. But I've found Silicon Valley and the Internet quite alienating the past couple of days, and I appreciate a dissenting voice."

Mark Frauenfelder offered a straightforward reflection about this moment in history and what we're seeing in the reaction of millions around the globe to losing 'the Crazy one': "A lot of people want heroes in their life, and some of them picked Steve Jobs as their personal protagonist. It beats worshipping Ayn Rand, in my opinion."

By week's end, we saw more evidence of a backlash against the coverage of Jobs' passing emerge online. Mike and Sarah were not alone in their concerns or their perspective. Ryan Tate and Wade Roush explored the complexities of Jobs' life and legacy for Gawker and Xconomy, respectively. While Tech Review editor Jason Pontin objected that "no one's forgetting Steve's dark side," the preponderance of both mainstream and technology industry coverage has been strongly positive.

For my part, I participated in that coverage and publicly shared in the outpouring of personal stories over the course of Wednesday night. Late in the evening, I curated a collection of the most powerful reflections, videos and quotes about Steve Jobs that I'd found. On Friday, I switched my personal blog to a "Retro Mac OS" theme that matches the look BoingBoing took on Wednesday night. In doing so, I don't think I was guilty of hagiography or gross adulation. The tools he helped bring into being changed my life and continue to enrich it. Jobs told us "how to live before you die" in a 2005 commencement speech at Stanford University, urging us not to "let the noise of others' opinions drown out your own inner voice." He gave us inspiration to write our own melodies, to insist on seeing them made, whether that vision was wrought in gleaming glass and aluminum, drawn in artful pixels or published, echoing Gutenberg's first revolution. Thinking back, my first computer was an Apple II+. In 1985, I wrote a story on it. In 1995, I made my first Web site on a Mac. In 2011, I share my world on an iPhone.

There's much more to consider, however, and some of that story has been well told by one of my college classmates, Mike Daisey, who has produced an extraordinary one-man show about Jobs that he's been performing over the past year.

Daisey wrote an op-ed in Friday's New York Times that engaged further with the complexities of Jobs' life and legacy with characteristic eloquence:

Apple’s rise to power in our time directly paralleled the transformation of global manufacturing. As recently as 10 years ago Apple’s computers were assembled in the United States, but today they are built in southern China under appalling labor conditions. Apple, like the vast majority of the electronics industry, skirts labor laws by subcontracting all its manufacturing to companies like Foxconn, a firm made infamous for suicides at its plants, a worker dying after working a 34-hour shift, widespread beatings, and a willingness to do whatever it takes to meet high quotas set by tech companies like Apple.

I have traveled to southern China and interviewed workers employed in the production of electronics. I spoke with a man whose right hand was permanently curled into a claw from being smashed in a metal press at Foxconn, where he worked assembling Apple laptops and iPads. I showed him my iPad, and he gasped because he’d never seen one turned on. He stroked the screen and marveled at the icons sliding back and forth, the Apple attention to detail in every pixel. He told my translator, “It’s a kind of magic.”

Mr. Jobs’s magic has its costs. We can admire the design perfection and business acumen while acknowledging the truth: with Apple’s immense resources at his command he could have revolutionized the industry to make devices more humanely and more openly, and chose not to. If we view him unsparingly, without nostalgia, we would see a great man whose genius in design, showmanship and stewardship of the tech world will not be seen again in our lifetime. We would also see a man who in the end failed to “think different,” in the deepest way, about the human needs of both his users and his workers.

It’s a high bar, but Jobs always believed passionately in brutal honesty, and the truth is rarely kind. With his death, the serious work to do the things he has failed to do will fall to all of us: the rebels, the misfits, the crazy ones who think they can change the world.

Ken Jones remembered Daisey's show well: "I saw his show on Steve Jobs last year in Portland OR and it was extremely moving. He blends his admiration for Jobs and Apple products with a journalistic account of his visit to the Foxconn factory in Shenzhen, China, where these products are assembled under heartbreakingly deplorable conditions. You'll never think the same way about your iPhone or iPad again."

Tim O'Reilly commented on the work of Steve Jobs when asked by the New York Times. "They were doing a 'will we ever see his like again' kind of story, and I had to say, 'of course we will.'"

Specifically, Tim told the Times that “I don’t want to take anything away from the guy, he was brilliant and uncompromising and wonderful, but there’s a level of adulation that goes beyond what is merited. There will be revolutions and revolutionaries to come.”

Tim explained more via email: "I also posited the difference if, instead of dying at the top of his game, Jobs had died in five years, with, for instance, everyone saying 'he did it again' - did something world-changing, but held it too tightly and was beaten by a commodity play just like with the original Apple II and Mac. I've also often wondered whether, in some devolved future, people like Elvis and Princess Diana would end up as saints, with miracles attributed to them..."

Mike Loukides responded: "Two thoughts have been running through my head. One is that, after Princess Diana's death, when everyone was gushing over her charity work, someone said "On the landscape of suffering, Diana was a tourist. Mother Theresa was a monument." On the landscape of design, Jobs was certainly a monument, but we would do well to remember that there are other landscapes that are possibly more worth our attention.

"Another was a couple of reports on other, less wealthy people who died the same day. One was Reverend Fred L. Shuttlesworth, who led the NAACP in Alabama through the toughest years of the civil rights movement; another was Derrick Bell, a law professor who quit Harvard until it tenured a black woman, and who was responsible for shaping much of our anti-discrimination law.

"And a third (bonus) is from Neil Gaiman's American Gods. Towards the end, there's a battle between the new gods (wealth, media, power, all that) and the old gods. Mr. Wednesday/Wodin is the lead, but Czernobog, a very obscure Russian deity with some resemblance to Thor, plays an important role. Wednesday says something about Media, and Czernobog says "Media? I think I know her. She's that Greek chick, right?" Elvis and Diana as saints, indeed."

Sarah Milstein: "Of course, lots of news events could have eclipsed the deaths of Bell and Shuttlesworth. And debating relative worth is going to get us nowhere. But I have been sorry to see them get relatively little airtime for having done heroic and inspiring things."

For those unfamiliar, Rev. Fred L. Shuttlesworth was a Baptist minister who helped lead the civil rights movement in Birmingham, Alabama. "All we've got to do is to keep marching," he said. And they did. As a result, their children live in a country more closely aligned with the promise given at its founding that "all men are created equal."

Derrick Bell also passed away on Wednesday. The Harvard Law School professor and civil rights activist told students to "speak up, stand out." All three men now share October 5, 2011, as their place in United States history books.

Sara Winge offered a coda to the conversation, sharing a reflection on science and art: "I think many people had intense reactions to Steve Jobs' death because he was an artist. Most of us hunger for beauty. It nourishes us in a way nothing else does. He created beautiful things that were also practical and provided a lovely-to-use portal to an amazing new world of our own creativity. Now that he's gone, his very precise, very personal aesthetic will never infuse an Apple creation again. That feels like a loss to me.

"That reaction doesn't have a thing to do with his business acumen, his personal abrasiveness, or his worth as a human relative to more saintly folks. It's really about how his creative spirit inspired people, and how important (and emotional) that is to many of them."

October 07 2011

Publishing News: Betting on the Nobel Prize

Here's what caught my attention in this week's publishing news.

Insults aside, the Nobel Prize for Literature kept the bookies busy

There was much ado running up to the announcement of who would win the Nobel Prize for Literature this week. The American literature community lost a bit of hope that an American author would win the prize — which hasn't happened since 1993 — when Horace Engdahl, the Nobel Academy permanent secretary, said "[t]he US is too isolated, too insular. They don't translate enough and don't really participate in the big dialogue of literature. That ignorance is restraining."

Engdahl's comment didn't seem to affect the betting line, though. On Wednesday, Time reported Philip Roth's odds at 16/1 and Bob Dylan at "the astounding odds of 5/1." Bob Dylan? In an interview for the Time report, Alex Donohue of Ladbrokes, a British-based gambling company, explained the Dylan phenomenon:

So we introduced Bob Dylan at 100/1. We put him in because we thought that maybe he'd have a chance and a few dedicated Bob Dylan fans might want to bet, but [we assumed] that no one would take him seriously. But now, obviously there's been a massive gamble and we've taken bets from all over the world — Sweden, Japan, Canada, all of Europe — on Bob Dylan. People out there betting just can't get enough and they keep backing him.

How did gamblers make out on the winner, Tomas Tranströmer? The 80-year-old Swedish poet came in on Wednesday with 7/1 odds.

TOC Frankfurt 2011 — Being held on Tuesday, Oct. 11, 2011, TOC Frankfurt will feature a full day of cutting-edge keynotes and panel discussions by key figures in the worlds of publishing and technology.

Save 100€ off the regular admission price with code TOC2011OR

Steve Jobs bio to hit shelves ahead of schedule

The biggest news in any industry this week was the death of Steve Jobs. In response, Simon & Schuster moved up the release date of Jobs' authorized biography to October 24. Pre-sales of the book increased 42,000% upon his death. The biography's author Walter Isaacson said Jobs, during the final interview for the book, told him he authorized the book because of his kids:

I wanted my kids to know me. I wasn't always there for them and I wanted them to know why and to understand what I did.

Additionally, ShelfAwareness pulled together a nice list of recent titles on Jobs and noted the upcoming "I, Steve: Steve Jobs in His Own Words," due out November 15.

The potential power of free Kindles

Amazon lit up the digital publishing world last week with its launch of a $79 Kindle. Breaking below the $100 entry barrier is a big deal and will arguably be the turning point for the ereader. Mathew Ingram over at GigaOM took it a step further and asked what will happen when the Kindle is free. He said a free ereader might open the door much wider for content like Kindle Singles, and it could be lucrative for authors:

These not-quite-books can be written and uploaded by anyone, and offered at whatever price point an author decides: as little as 99 cents, or even free. Offering a free — or ad-supported — Kindle would presumably just provide even more of an avenue for these kinds of books to reach readers, and that in turn could (theoretically at least) make it possible for more writers to make a living from their writing.

Ingram followed with a nice argument in favor of the less-expensive-book model and made some interesting suggestions, such as "[offering] a subscription to an author, so I can automatically get whatever he or she writes." That's one of those ideas that seems so obvious, you wonder why it hasn't happened yet.

Photo: Alfred Nobel by Zero grey, on Wikimedia Commons


Related:


  • Let's imagine Steve Jobs is President of the United States
  • Commerce Weekly: How Steve Jobs changed the way we buy
  • Publishing News: Amazon vs barrier to entry
  • More Publishing Week in Review coverage

    October 06 2011

    Commerce Weekly: How Steve Jobs changed the way we buy

    We're changing the name of this blog from ePayments Week to Commerce Weekly to better reflect the wider scope of our coverage — not just payment, but communication and transaction technologies along the entire commercial value chain.

    With that in mind, here's what caught my eye this week.

    Steve Jobs' commercial legacy

    First generation iPod

    It's difficult to write about anything else today, with the entire tech and creative universe mourning the loss of an uncompromising genius. Much has already been published about the ways that Steve Jobs changed how we work and interact with computers. Less has been written about how he changed the way we shop and buy. Here are three thoughts on that.

    The iPod and the iTunes store. As Jobs said before introducing the iPhone in 2007, the iPod "didn't just change the way we listen to music. It changed the entire music industry." Its pairing with the iTunes store actually went further, creating the first simple, sustainable platform for purchasing and downloading all kinds of digital media, including TV shows, movies, books, college lectures, and more. As of June 2011, iTunes had 225 million accounts, through which more than 15 billion songs had been sold, making it the world's number one music store. Apple extended the model to software with the App Store, which has distributed more than 14 billion apps in three years.

    The iPhone and in-app purchases. Although there were smartphones before the introduction of the iPhone in January 2007, finding and installing new applications for them wasn't easy. The iPhone changed that, making it simple to download and install new apps and opening the landscape for mobile app developers. By doing so, it broadened the opportunity for consumers to make purchases inside mobile apps. In-app purchases have helped make the freemium model (free to install, paid for with subsequent purchases inside the application) the dominant one for mobile apps, on iOS and other mobile platforms.

    Apple Store in New York City
    The Apple Store at 59th Street and Fifth Avenue, New York City. Via Fletcher6, Wikimedia Commons.

    The Apple Store. Apple opened its first physical retail stores in 2001, just as other computer makers were closing theirs. But Apple's innovations — cutting-edge architectural design, the Genius Bar, iPhone and iPad checkout — made its stores a destination for Apple fans and the curious alike. Ten years on, Apple has 357 stores across the world.

    Even all this was a small part of Jobs' legacy. I'd like to think the best part of what he gave us — even better than all the cool toys — was a shining, successful example of what's possible when you don't compromise your vision. He demonstrated to two generations of creative geeks what's possible when you commit yourself to making a thing work the way it really should. That's a rare feat in a world where too many things don't.

    X.commerce Innovate Developer Conference — Technology is changing the way people shop and driving the on- and off-line commerce market to $8T by 2013. Enter the new X.commerce, an end-to-end commerce platform backed by eBay, PayPal, and Magento, and enhanced by a rich ecosystem of developers and partners. Be there for the X.commerce Innovate Developer Conference on Oct 12 - 13, 2011 in San Francisco.

    Register with code OREILLYUSERS for $200 off

    eBay CEO: We won't compete with our customers

    Ahead of eBay's Innovate conference, Robert Scoble talked with eBay's chief executive John Donahoe about the changes underway in retail, mobile, and social commerce. Donahoe predicted that rapidly evolving technology will drive "more changes in the way consumers pay and shop in the next three years than we've seen in the last 15 to 20."

    Scoble has posted the interview on YouTube. Among the highlights:

    • Donahoe positioned eBay's commerce ecosystem as a merchant-friendly alternative to Amazon: "We provide all the tools to help third-party developers create businesses for merchants, and we will never compete with [merchants]."
    • There are 500,000 developers working with Magento (the open-source ecommerce platform that eBay purchased earlier this year) and, according to Forrester, that work has generated more than $1 billion in revenue for them.
    • Mobile is a big opportunity because "people don't want to enter a credit card number into a mobile device. It's cumbersome," and they don't believe it's secure.
    • Katie Mitic, who leads Facebook's platform and marketing efforts, is joining eBay's board. Donahoe positioned this as a significant gesture as eBay tries to work with Facebook to figure out the social shopping connection.
    • eBay is increasingly global: of the $60 billion in volume last year on eBay, 55% came from the U.S. and 45% happened outside the U.S. What's more, 20% of eBay's transactions cross borders. "So, $5 billion worth of goods was exported out of the U.S. on eBay."
    • eBay will remain platform- and operating-system agnostic. "We've lost the hubris of thinking we're going to decide for them. Our consumers will tell us where we need to go."

    There were a few notable gaps where Donahoe was honest about not having the answers.

    • On China: Although some Chinese sellers use eBay and PayPal for transactions with customers outside of the country, foreign companies can't tap the enormous market in transactions within the country. He expects PayPal to partner with a Chinese bank or other financial service in the next few years.
    • On social commerce: While eBay is beginning to see elements of social entering the shopping experience, there's still no clarity on what the social shopping experience means. Is it Facebook coming to eBay, or eBay merchandise selling through Facebook (or both)?

    Got news?

    News tips and suggestions are always welcome, so please send them along.


    If you're interested in learning more about the commerce space, check out PayPal X DevZone, a collaboration between O'Reilly and PayPal.


    iPod Photo via Wikimedia Commons.


    August 26 2011

    Top Stories: August 22-26, 2011

    Here's a look at the top stories published across O'Reilly sites this week.


    Ruminations on the legacy of Steve Jobs
    Apple, under Steve Jobs, has always had an unrelenting zeal to bring humanity to the center of the ring. Mark Sigal argues that it's this pursuit of humanity that may be Jobs' greatest innovation.
    The nexus of data, art and science is where the interesting stuff happens
    Jer Thorp, data artist in residence at the New York Times, discusses his work at the Times and how aesthetics shape our understanding of data.
    Inside Google+: The virtuous circle of data and doing right by users
    Data liberation and user experience emerged as core themes during a recent discussion between Tim O'Reilly and Google+ VP of Product Bradley Horowitz.
    Five things Android needs to address on the enterprise side
    Android has the foundation to support enterprise use, but there's a handful of missing pieces that need to be addressed if it's going to fully catch on in the corporate world.
    The Daily Dot wants to tell the web's story with social data journalism
    The newly launched Daily Dot is trying an experiment in community journalism, where the community is the Internet. To support their goal, they're applying the lens of data journalism to the social web.





    Strata Conference New York 2011, being held Sept. 22-23, covers the latest and best tools and technologies for data science — from gathering, cleaning, analyzing, and storing data to communicating data intelligence effectively. Save 30% on registration with the code STN11RAD.

    August 25 2011

    Ruminations on the legacy of Steve Jobs

    Steve Jobs"It's better to die on your feet than to live on your knees." — Neil Young

    "That day has come." Four simple words that signaled that Steve Jobs felt compelled to step down as CEO of Apple, the company he founded, then lost, then saw ridiculed and written off, only to lead its rebirth and rise to new heights.

    It's an incredible story of prevailing (read: dominating) over seemingly insurmountable odds. A story that has no peer in technology, or any other industry, for that matter.

    That is why even though this moment was long anticipated, and while I know that Steve isn't gone (and hopefully won't be anytime soon), yesterday's announcement nonetheless feels like a "Kennedy" or "Lennon" moment, where you'll remember "where you were when ..."

    I say this having seen first-hand the genuine, profound sadness of multitudes of people, both online and on the street, most of whom (obviously) have never met the man.

    Why is this? I think that we all recognize greatness, and appreciate the focus, care, creativity, and original vision that it takes to achieve it.

    The realization that one man sits at the junction point of cataclysmic disruptions in personal computing (Apple II/Mac), music (iPod + iTunes), mobile computing (iPhone + iOS), movies (Pixar) and post-PC computing (iPad) is breathtaking in its majesty. A legacy with no equal.

    The intersection of technology and liberal arts

    Apple Store in New York City

    In an era where entrepreneurialism is too often defined by incrementalism and pursuit of the exit strategy, Jobs' Apple was always defined by true husbandry of a vision, and the long, often thankless, pursuit of excellence and customer delight that goes with it.

    Ironically, though, Jobs' greatest innovation may actually be as basic as "bringing humanity back into the center of the ring," to borrow a phrase from Joe Strummer of the seminal rock band, The Clash.

    Consider Jobs' own words at the launch of the iPad back in January 2010:

    The reason we've been able to create products like this is because we've tried to be at the intersection of technology and liberal arts. We make things that are easy to use, fun to use — they really fit the users.

    If this seems intuitive, and it should be, consider the modus operandi that preceded it. Before Apple, the hard truth was that the "inmates ran the asylum," in that products were typically designed by engineers to satisfy their own needs, as opposed to those of the actual consumers of the products.

    Moreover, products were designed and marketed according to their "speeds and feeds," checklists of attributes over well-chiseled, highly-crafted outcomes. And it didn't really matter if at each step along the value chain the consumer was disrespected and disregarded.

    Ponder for a moment the predecessor to the Apple Store, CompUSA, and what that experience was like versus the new bar for customer service being set by Apple.

    Or, think about the constraints on enjoying music and other media before the iPod, or the pathetic state of mobile phones before the iPhone.

    Skeptics and haters alike can credibly say that Apple did not create these categories, but recognize that it took a visionary like Steve Jobs to build a new technology value chain around the consumer and make it actually work. To give birth to an entirely new platform play. To free the user from the hard boundaries of WIMP computing. To bring design and user interaction models into the modern age. And to magically collapse the once-impenetrable boundaries between computing, communications, media, Internet, and gaming.

    Even today, the legacy MP3 device category is utterly dominated by Apple's iPod, despite every would-be competitor knowing exactly what Apple's strategy is in this domain.

    To do this in segment after segment, launch after launch, takes true conviction and a bit of chutzpah. But then again, Apple, under Jobs, has never been a company that embraced or felt beholden to conventional wisdom (see "Apple's segmentation strategy, and the folly of conventional wisdom").

    iPad as the signature moment in a brilliant career

    iPad 2

    Time and again, investors, competitors and industry pundits have dismissed Apple, most recently when the company launched the iPad. Then, the conventional wisdom was that Apple "blew it" or that it was "just a big iPod Touch," nothing landmark.

    Truth be told, such dismissals are probably the barometer by which Steve Jobs knows that he's played the winning hand.

    I wrote in 2010, in anticipation of the iPad launch:

    The best way to think about the iPad is as the device that inspired Steve Jobs to create the iPhone and the iPod Touch. It's the vaunted 3.0 vision of a 1.0 deliverable that began its public life when the first generation of iPhone launched only two-and-a-half years ago ... it is a product that is deeply personal to Steve Jobs, and I believe the final signature on an amazing career. I expect the product to deliver.

    Well, it did deliver, and 30 million iPads later, the ascent of post-PC computing seems irrevocable as a result.

    The moral of the story, in considering the wonder and beauty of Steven P. Jobs, is thus twofold.

    One is that most companies wouldn't even have chanced cannibalizing a cash cow product like the iPod Touch (or the iPhone) to create a new product in an unproven category like tablet devices.

    Not Apple, where sacred cows are ground up and served for lunch as standard operating procedure.

    Two is that the mastery required to create a wholly new category of device that could be dismissed as "just a big iPod Touch" takes a very rare bird. Namely, one that pursues non-linear strategies requiring high leverage, deep integration and even higher orchestration.


    Exactly the type of complexity that only Jobs and company could make look ridiculously, deceptively simple.

    In his honor, may we all be willing to "Think Different" in the days, weeks and months ahead. That's the best way to pay tribute to a legacy that will stand the test of time.

    Apple Store and Steve Jobs photos from Apple Press Info.




    Four short links: 25 August 2011

    1. Steve Jobs's Best Quotes (WSJ Blogs) -- Playboy: "We were warned about you: Before this interview began, someone said we were 'about to be snowed by the best.'" Jobs: [Smiling] "We're just enthusiastic about what we do." (via Kevin Rose)
    2. The Tao of Programming -- The Tao gave birth to machine language. Machine language gave birth to the assembler. The assembler gave birth to the compiler. Now there are ten thousand languages. Each language has its purpose, however humble. Each language expresses the Yin and Yang of software. Each language has its place within the Tao. But do not program in COBOL if you can avoid it. (via Chip Salzenberg)
    3. In Defense of Distraction (NY Magazine) -- a long, thoughtful piece about attention. The polymath economist Herbert A. Simon wrote maybe the most concise possible description of our modern struggle: "What information consumes is rather obvious: It consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it." (via BoingBoing)
    4. 31 Days of Canvas Tutorials -- a pointer to 31 tutorials on the HTML5 Canvas.

    May 19 2011

    Four short links: 19 May 2011

    1. Right to Access the Internet -- a survey of different countries' rights to access the Internet.
    2. Peace Through Statistics -- three ex-Yugoslavian statisticians nominated for Nobel Peace Prize. In war-torn and impoverished countries, statistics provides a welcome arena in which science runs independent of ethnicity and religion. With so few resources, many countries are graduating few, if any, PhDs in statistical sciences. These statisticians collaboratively began a campaign to collect together the basics underlying statistics and statistics education, with the hope of increasing access to statistical ideas, knowledge and training around the world.
    3. Vintage Steve Jobs (YouTube) -- he's launching the "Think Different" campaign, but it's a great reminder of what a powerful speaker he is and a look at how he thinks about marketing.
    4. Anatomy of a Fake Quotation (The Atlantic) -- deconstructing how the words of a 24 year old English teacher in Japan sped around the world, attributed to Martin Luther King.

    January 26 2011

    Developer Week in Review

    The weather outside is frightful (at least here in the northeast United States), but the news is so delightful. Note: Delightfulness may vary. This statement contains forward-looking statements, and should not be considered an indication of future delightedness.

    The Donald returns ... no, not that Donald

    The Art of Computer Programming

    The seminal study of programming techniques has long been considered to be Donald Knuth's "The Art of Computer Programming." Unfortunately, new volumes in the series have been coming out slower than year-old ketchup out of a bottle.

    It was news of some note, therefore, when it was announced this week that Volume 4A of his magnum opus (which covers Combinatorial Algorithms, Part 1, for the curious) was available for purchase. Given that it's been 38 years between the original volume 3 and the new 4A, and seven volumes are planned in total, we may all be programming via thought waves by the time the series is completed.

    It's time to play musical CEOs

    There are shakeups afoot at both the second and seventh largest U.S. companies this week (that would be Apple and Google, respectively). Steve Jobs' on-again, off-again, on-again stewardship of Apple appears to be off again, as he heads off for something medical, leaving COO Tim Cook at the helm.

    Meanwhile, over at Google, Eric Schmidt announced he's changing his role from CEO to executive chairman, with Larry Page running the show as CEO now.

    The generally held assumption at this point is that both companies are mature enough and large enough that they don't depend on direct guidance from the top to stay on mission, but only time will tell (especially for Apple) if the long-term philosophy survives a succession.

    Repent, for the end is near

    You've been hearing about it for years, that the IPv4 pool is going, going ... well now it's evidently gone. Or at least it will be in February, according to the latest IANA projections. That's when the last address blocks are due to be handed out, and then the biker gangs take over and Tina Turner starts hoarding subnets.

    In spite of the persistent warnings that the day was coming, IPv6 service is all but non-existent in the United States, and many of the IPv6 software stacks in products and operating systems have never really been put to the test under real-world conditions. We may have avoided Y2K, only to get slammed by Y2K11 ...
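
    The only real way to know whether an IPv6 stack works is to exercise it. As a minimal sketch (ours, not anything from the IANA projections above), the following Python checks whether a host publishes an AAAA record and accepts a TCP connection over IPv6; the hostname and port are illustrative assumptions.

        # Minimal IPv6 reachability probe (illustrative sketch).
        # "www.example.com" and port 80 are placeholder assumptions.
        import socket

        def reachable_over_ipv6(host="www.example.com", port=80, timeout=5.0):
            """Return True if host resolves to an IPv6 address that accepts TCP."""
            try:
                # Restricting getaddrinfo to AF_INET6 yields only AAAA results.
                infos = socket.getaddrinfo(host, port, socket.AF_INET6,
                                           socket.SOCK_STREAM)
            except socket.gaierror:
                return False  # no AAAA record, or the resolver lacks IPv6 support
            for family, socktype, proto, _canonname, sockaddr in infos:
                try:
                    with socket.socket(family, socktype, proto) as s:
                        s.settimeout(timeout)
                        s.connect(sockaddr)  # completes a TCP handshake over IPv6
                        return True
                except OSError:
                    continue  # this address failed; try the next one
            return False

        if __name__ == "__main__":
            print(reachable_over_ipv6())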

    Assuming that the web is still operating next week, we'll be here reporting on any other apocalyptic news that may occur. Suggestions are always welcome, so please send tips or news here.




    May 24 2010

    Google vs Apple: Google doesn't need to win

    So now we're apparently in a Google versus Apple fight to the death. Google open-sources VP8 (now WebM), and Steve Jobs immediately throws cold water on it. Apple got their share of scorn at Google's I/O conference. Google thinks they have mobile/cloud/desktop integration nailed--and if what they demo'd last week actually works in FroYo (Android 2.2), they probably do.


    But the notion that this is a "fight to the death" is a bit bizarre, even though it's been portrayed that way, and by none other than Steve Jobs, who earlier this year said that Google is out to kill the iPhone. If it is a battle, the terms are uneven.


    My own disclosures: I'm definitely a Google fan. And I'm also an iPad-equipped Apple fan, though I am also very unhappy with the closedness of the Apple platform, and the way they treat their developers. But they make beautiful hardware, and they really understand "it just works."


    Vic Gundotra nailed it in his second keynote at Google I/O. When he was starting at Google, he asked Andy Rubin why Android was important; why did the world need one more mobile platform? Andy's answer was that Google had a very dismal future if they didn't address mobile; we'd end up with one platform, controlled by one vendor, and one carrier. It was a wise and prophetic answer. If I've pieced the chronology together correctly, this would be about the time the iPhone was coming online. And the iPhone is a great device--great, but ultimately closed.


    Apple makes hardware, and the more hardware they sell, the more money they make. So Apple clearly wins if they sell iPhones to everyone--the more iPhones (and iPads), the more they win. There would be nothing better for them than driving the other smartphone manufacturers out of the market. (They don't seem to be interested in low-end, low profit margin phones, but that's another story.) So what it takes for Apple to win is clear: dominance of the smartphone market.


    Google's stakes are different. They don't make money from selling phones, and they even abandoned their retail Nexus One store with very little pain. They don't make money from licensing software either, as far as I know. Google makes money from selling ads. And the more ads they sell, the happier they are. Apple is fighting for market share in cellphones; Google is fighting for market share in ad placement.


    This asymmetry is very important. Google does not have to dominate the smartphone business; they just have to make sure that there's an environment in which the business of selling ads thrives. While Apple wants to dominate smartphones, Google undeniably dominates online ad sales--and they clearly see ad placement on mobile as a huge opportunity. Conversely, failure to dominate mobile ad sales would be disastrous. At best, it would limit their potential; at worst, if we're heading for the end of the "desktop/laptop era", it could seriously threaten their core business.


    Making money selling mobile ads requires that Google keep the smartphone market open, plural, competitive. As long as there are multiple smartphones in the market, content developers will be driven towards open standards like HTML5. Developers will build richer and richer HTML content for the phones--and Google will thrive in its core business, placing ads on HTML pages. Google doesn't need to "win"; they just need to "not lose", to keep the game open, and to drive open technologies to the next level where they can compete successfully. In the long run, a closed system can only thrive if it's the only player in the game. If we've learned one thing from the growth of the Internet, it's that open standards that can be implemented by many vendors trump closed systems, and enable the kind of competition that drives out monopolies.


    Just as an athlete will inevitably perform better when he's relaxed and not worried about losing, Google's big advantage in the smartphone wars may well be precisely that they don't need to win. Googlers are justifiably proud that US Android sales have snuck ahead of iPhone sales. Of course, that's 50-odd phones available for all US carriers, versus two iPhone models available only from AT&T. And when the iPhone 4 comes out, Apple will certainly see a big burst of sales. But that's not what's really important to Google; all they need to do is keep the game open, for themselves, Palm/HP, RIM, and the other smartphone vendors--and to establish the kinds of standards that enable a competitive market to thrive.


    There is a real threat to Apple, though; just because Google doesn't need to win smartphone dominance doesn't mean they wouldn't like to. And in the wake of their FroYo demos at I/O, that seems increasingly likely. Dan Lyons (Fake Steve Jobs) makes a lot of good points in his Newsweek blog:


    • Google's technology is way ahead of anything Apple is offering, or likely to offer. Streaming music from your desktop is only one example. Google, not Apple, is offering what customers want.

    • Apple's response to Google's claim that they are shipping more phones was "so what, we have more market share." Lyons says he's heard that before, it's the song of a company that's losing and in denial. I've heard it too. Lyons is right.

    • It's easy to think that Apple fell apart in the late 80s and early 90s because a clueless Pepsi exec booted Jobs and took over. But the real story, if you're old enough to remember, is that Jobs mismanaged the company after a series of stellar technical triumphs. History appears to be repeating itself.



    I am genuinely sad about this; Apple is a great innovative company. There's no reason they can't do everything Google is doing. Analyzing each player's strengths: Apple really understands user experience and design. They have a lock on that. Google really understands cloud computing and connectivity. However, it will probably be easier for Apple to get up to speed on the connectivity issues than for Google to get Apple's design sensibility. Nothing Google is adding to Android is fundamentally that difficult, and Apple has no shortage of engineering talent.


    But--and this is important--Apple will not be able to take Google on in the areas of connectivity and cloud computing as long as they insist on a closed platform. Not because Google's FroYo features can't be implemented on a closed platform, but because it just wouldn't occur to you to do so. Furthermore, you can only go so far telling customers that you know what's best for them. I hate Flash almost as much as Steve Jobs does, but you know what? If I were building a platform, supporting Flash would be a requirement. Flash is everywhere. Getting tied up in a pointless fight with Adobe is silly. Vic's daughter is right: she wants the toy that can run her favorite online games. That's going to be an Android phone, not an iPad or an iPhone. Apple is insisting on playing the game in a way that they can only lose.


    Having said that, why is Apple so interested in HTML5? Why are they supporting it with almost as much energy as Google? I think Steve Jobs really understands that HTML5 is the "right thing" for the future of the web. Apple is not going to drop native applications. But Jobs has always had an uncanny sense of when things are done right.


    Although Google doesn't need to "win" the battle with Apple, Apple's hysteria, along with its insistence on fighting the wrong battles, means that Google has a decent chance of winning. HTML5 may be Apple's last chance to change their ways, and make decisions that aren't dictated by their desire to control the platform. If they don't, they will lose, and that would be tragic, both for Apple and for users.
