
June 25 2012

William Gibson got some of it right

"The sky above the port was the color of television tuned to a dead channel."

Thus begins "Neuromancer," one of the most influential works of science fiction ever written. William Gibson's vision of a dystopic future, where corporations have become the new governments and freelance hackers jack into the net with immersive computer systems, set the tone for the cyberpunk movement. Unfortunately, we still don't have our "deck" to jack into the net; we're still using the same (if highly upgraded) flat displays, keyboards and mice that we did in the '80s.

What we do have are the negative aspects of the novel. For a while, it looked like cyberwarfare was going to be mostly theoretical, and that the largest threats to network security were going to come from individual black-hat hackers. But then groups such as the Russian mafia got into the game, and then nation-states started using cyberwarfare as a tool of sabotage and espionage, and now corporations are resorting to reprisal attacks against entities that attack them. The net is now an active war zone, where hardware comes pre-installed with spook-authored malware designed to destroy centrifuges.

The other half of the Gibson dystopia, the rise of corporations as pseudo-governments, has occurred as well. SOPA, ACTA, PIPA, the DMCA, and friends are all legislation directly authored or heavily influenced by powerful industry lobbies, with the goal of making governments the enforcement arms of businesses. The FBI spends a significant amount of its time pursuing copyright and trademark violations. The recent Supreme Court ruling that corporations are people, too, could have come right out of the pages of "Neuromancer."

The fact that the technological future of "Neuromancer" has failed to come to pass speaks to the evolutionary nature of computer innovation. A direct brain interface is probably still decades (if not generations) away. But the fact that the societal and political future forecast in "Neuromancer" struck so close to home is a sad commentary on human nature. If you assume the worst, you stand a good chance of being right.

What's most interesting is that he totally blew the call on where the battle lines would be drawn. In Gibson's universe, corporations are fighting each other for trade secrets, with highly skilled software assassins dancing elegant battles against elaborately constructed firewalls. In the real world, the defenders are hopelessly outgunned, fighting from atop fragile software platforms while illiterate script kiddies fire off salvo after salvo of brute-force attacks. And rather than priceless technology blueprints, the booty that companies are trying to protect is mundane: credit card numbers, music and movies.

Also, in "Neuromancer," the battle is largely invisible, with the average person on the street unaware of the carnage occurring electronically around them. By contrast, the general public is painfully aware of how vulnerable modern computer systems are to abuse, and pretty much anyone who uses the net regularly can tell you about DMCA takedowns and the perils of SOPA. In short, Gibson may have been right about the net becoming an online warzone, but he failed badly to identify the what and why of the war.

The real question is, where does our version of dystopic web-life go from here? There appear to be two diverging paths, neither one very palatable. At one extreme, groups such as Anonymous can make the web so unsafe to use that no one dares to use it for anything. On the other, governments and corporations make it safe for themselves, at the cost of our personal liberties and privacies. Or, we could continue to muddle along somewhere in the middle, which may be the best outcome we can hope for.

June 01 2012

Developer Week in Review: The overhead of insecure infrastructure

I'm experiencing a slow death by pollen this week, which has prompted me to ponder some of the larger issues of life. In particular, I was struck by the news that an FPGA chip widely used in military applications has an easily exploitable back door.

There is open discussion at the moment about whether this was a deliberate attempt by a certain foreign government (*cough* China *cough*) to gain access to sensitive data and possibly engage in Stuxnet-like mischief, or just normal carelessness on the part of chip designers who left a debugging path open and available. Either way, there's a lot of hardware out there walking around with its fly down, so to speak.

As developers, we put a lot of time and effort into trying to block the acts of people with bad intent. At my day job, we have security "ninjas" on each team that take special training and devote a fair amount of their time to keeping up with the latest exploits and remediations. Web developers constantly have to guard against perils such as cross-site scripting and SQL injection hacks. Mobile developers need to make sure their remote endpoints are secure and provide appropriate authentication.
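
To see why SQL injection keeps web developers up at night, consider how a query gets built. The sketch below (in C, with a made-up `users` table and no real database attached) contrasts pasting user input straight into the query text with the parameterized form, where the statement's structure is fixed and the value travels separately:

```c
#include <stdio.h>
#include <string.h>

/* Naive query construction: user input is pasted directly into the SQL
   text, so the input can rewrite the statement itself. */
void build_query_naive(char *out, size_t outlen, const char *user_input) {
    snprintf(out, outlen,
             "SELECT * FROM users WHERE name = '%s';", user_input);
}

/* Parameterized style: the query text never changes; the value is bound
   to the '?' placeholder by the database driver and can never alter the
   statement's structure. (Shown as a shape only; a real driver would
   bind it via something like sqlite3_bind_text.) */
const char *parameterized_query(void) {
    return "SELECT * FROM users WHERE name = ?;";
}
```

Feeding the naive version a value like `' OR '1'='1` turns a simple lookup into a query that matches every row, which is the whole attack; with binding, the input reaches the database as data, never as SQL.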

The thing is, we shouldn't have to. The underlying platforms and infrastructures we develop on top of should take care of all of this, and leave us free to innovate and create the next insanely great thing. The fact that we have to spend so much of our time building fences rather than erecting skyscrapers is a sign of how badly this basic need has been left unmet.

So why is the development biome so underprotected? I think there are several factors. The first is fragmentation. It's easier to guard one big army base than 1,000 small ones. In the same way, the more languages, operating systems and packages there are in the wild, the more times you have to reinvent the wheel. Rather than focus on making a small number of them absolutely bulletproof (and applying constant vigilance to them), we jump on the flavor of the day, regardless of how much or how little effort has been put into reducing the exposed security footprint of our new toy.

The fact that we have independent, massive efforts devoted to securing the base operating systems for MacOS, Windows, Linux, BSD, etc., is nothing short of a crime against the development community. Pretty it up any way that suits you with a user interface, but at this point in the lifecycle of operating systems, there should be only a single, rock-solid operating system that the whole world uses. It is only because of greed, pettiness and bickering that we have multiple fragile operating systems, all forgetting to lock their car before they go out to dinner.

Languages are a bit more complex, because there is a genuine need for different languages to match different styles of development and application needs. But, again, the language space is polluted with far too many "me-too" wannabes that distract from the goal of making the developer's security workload as low as possible. The next time you hear about a site that gets pwned by a buffer overrun exploit, don't think "stupid developers!", think "stupid industry!" Any language that allows a developer to leave themselves vulnerable to that kind of attack is a bad language, period!
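
As a concrete illustration of the buffer-overrun class of bug, here's a minimal C sketch (buffer size and inputs are arbitrary): the unchecked idiom is left as a comment, since actually running it is undefined behavior, while the bounds-checked version degrades to truncation instead of memory corruption.

```c
#include <stdio.h>
#include <string.h>

#define BUF_LEN 8

/* Classic overrun: strcpy() copies until it hits a NUL terminator, with
   no idea how big the destination is. A 30-byte name written into an
   8-byte buffer tramples whatever lives next to it in memory. Shown as
   a comment because actually running it is undefined behavior:

       char buf[BUF_LEN];
       strcpy(buf, attacker_controlled_name);    <-- overflow
*/

/* Bounds-checked copy: snprintf() never writes more than the size it is
   given and always NUL-terminates, so the worst case is a truncated
   string, not a corrupted stack. Returns the resulting length. */
size_t safe_copy(char *dst, size_t dstlen, const char *src) {
    snprintf(dst, dstlen, "%s", src);
    return strlen(dst);
}
```

A language or standard library that only offered the second shape would make the buffer-overrun pwnings the paragraph above complains about impossible by construction.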

The other major factor in why things are so bad is that, evidently, we don't care. If developers refused to develop on operating systems or languages that didn't supply unattackable foundations, companies such as Apple and Microsoft (and communities such as the Linux kernel devs) would get the message in short order. Instead, we head out to conferences like WWDC eager for the latest bells and whistles, while nary a moment is spent thinking about how the security of the OS could be improved.

Personally, I'm tired of wasting time playing mall security guard rather than Great Artist. Think how much more amazing code could have been written in a world where security was a must-have in the infrastructure we build on, rather than in the code we develop. Instead, we spend endless time in code reviews, following best practices, and otherwise cleaning up after our security-challenged operating systems, languages and platforms. Last weekend, we honored (at least in the U.S.) those who have given their lives to physically secure our country. Maybe it's time to demand that those who secure our network and computing infrastructures do as good a job ...

OSCON 2012 — Join the world's open source pioneers, builders, and innovators July 16-20 in Portland, Oregon. Learn about open development, challenge your assumptions, and fire up your brain.

Save 20% on registration with the code RADAR20




May 25 2012

Developer Week in Review: Oracle's big bet fails to pay off

I've been taking the opportunity this week to do some spring office cleaning. Unfortunately, I clean my home office infrequently enough that at a certain point, cleaning it becomes more an exercise in archeology than organization. There's nothing like finding a six-month-old check you never deposited to encourage more frequent cleaning.

The same can be said for code, of course. It's far too easy to let crufty code build up in an application, and then be faced with the mother of all refactoring efforts six months down the road, when your code finally reaches a critical mass of flaky behavior. It's worth the effort to continually refactor and improve your code, assuming you can convince your product management that quality is as important as new features.

Android is almost out of the woods

It wouldn't be a Week in Review without the latest in Godzilla vs. Gamera (er, Oracle vs. Google). Things aren't looking too sunny for Oracle at the moment, as the jury in the case just threw out all the patent-related claims in the lawsuit. This doesn't leave Oracle with much left on the plate, as the case now boils down to the question of whether the Java APIs are copyrightable. That's a matter the jury is deadlocked on.

Like all things legal, this is going to drag on for years as there are appeals and retrials and the like. But for the moment, it appears that Android is out of the woods, at least as far as the use of Java is concerned. Of course, there's still all those pesky International Trade Commission issues keeping many Android handsets waiting at the border, but that's a battle for another day ...

Scripters of the world, rejoice!

For Perl developers, a point release of the language is a major event, as it only occurs roughly once a year. This year's edition has just been released, and Perl 5.16 packs a ton of improvements (nearly 600,000 lines' worth!).

Since Perl is such a mature language, most of the changes are incremental. Probably the most significant are further enhancements to Unicode support. Nonetheless, there should be something useful in there for the serious Perl developer.

FreeBSD bids GCC farewell

As the licensing on the GCC compiler has become increasingly restrictive, some of us have been wondering when the fallout would start. Wait no longer: The FreeBSD team has ditched GCC for the more BSD-friendly licensing of Clang.

GCC has spent decades as the compiler of choice for just about everything, but recent changes in the GPL have made it less attractive to use, especially in commercial development. With the Apple-sponsored Clang compiler now seen as a viable (and perhaps even superior) alternative, with a much less restrictive license, the Free Software Foundation may need to decide if it would rather stand on principle, or avoid becoming marginalized.

Got news?

Please send tips and leads here.



May 04 2012

Developer Week in Review: Are APIs intellectual property?

Returning after a brief hiatus due to my annual spring head cold, welcome back to your weekly dose of all things programming. Last week, I was attending the Genomes, Environments and Traits conference (I'm a participant in the Personal Genome Project), when I got notified that WWDC registration had opened up. I ended up having to type in my credit card information on my iPhone while listening to the project organizers discuss what they were doing with the saliva I had sent them. The conference itself was very interesting (although I was coming down with the aforementioned cold, so I wasn't at the top of my game). The cost to sequence a genome is plummeting — it's approaching $1,000 a pop — and it has the potential to totally revolutionize how we think about health care.

It's also an interesting example of big data, but not how we normally think about it. An individual genome isn't all that big in the scheme of things (it's about 3GB uncompressed per genome), but there are huge computational challenges involved in relating individual variations in the genome to phenotype variations (in other words, figuring out what variations are responsible for traits or diseases).
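
A toy example, with invented data, of the kind of counting involved: for each variant site, you tally how often carrying the variant co-occurs with showing the trait. Repeat that across millions of sites and thousands of genomes and you see where the computational cost lives.

```c
#include <stddef.h>

/* Toy association count: given, for n individuals, whether each carries
   a variant (1/0) and whether each shows a trait (1/0), tally the 2x2
   contingency-table cell "carrier AND trait". A genome-wide scan does
   this sort of counting at millions of variant sites, which is why an
   individually modest 3GB genome becomes a big-data problem in bulk. */
size_t carriers_with_trait(const int *carrier, const int *trait, size_t n) {
    size_t count = 0;
    for (size_t i = 0; i < n; i++)
        if (carrier[i] && trait[i])
            count++;
    return count;
}
```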

While all the West Coast developers who slept through the WWDC registration period lick their wounds, here's the rest of the news.

APIs are copyrightable, unless they aren't?

These days, I feel like you need to consider a minor in law to go with your computer science degree. In the latest news from the front, we have conflicting opinions regarding the status of APIs. On the one hand, the judge in the Oracle versus Google lawsuit has instructed the jury they should assume that APIs are copyrightable. As the linked article discusses, this could have ominous implications for any third-party re-implementation of a programming language or other software that is not open source.

Over in Europe, however, a new ruling has stated that programming languages and computer functionality are not copyrightable. So, depending on which side of the ocean you live on, APIs are either open season, or off limits. No word yet as to the legal status of APIs on the Falkland Islands ...

Fluent Conference: JavaScript & Beyond — Explore the changing worlds of JavaScript & HTML5 at the O'Reilly Fluent Conference (May 29 - 31 in San Francisco, Calif.).

Save 20% on registration with the code RADAR20

Code to make your head hurt

For those of you who like to celebrate the perversities of life, it's hard to beat the International Obfuscated C Code Contest, which just released its 2011 winners. For your viewing pleasure, we have programs that compute pi, chart histograms, and even judge other programs for obfuscation, all written in a manner that will have code reviewers running to the toilet with terminal bouts of nausea.
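
For a (very tame) taste of the genre, here are two hypothetical versions of the same function — neither is an actual contest entry, just an illustration of how far the same logic can drift from readability:

```c
/* "Obfuscated": everything crammed into the for-loop header, no body. */
int o(int n){int r=1;for(;n;r*=n--);return r;}

/* The readable equivalent a code reviewer would hope to see. */
int factorial(int n) {
    int result = 1;
    for (int i = 2; i <= n; i++)
        result *= i;
    return result;
}
```

Both compute the same factorial; real IOCCC winners take this many, many steps further.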

And speaking of C ...

We tend to focus a lot of attention on emerging languages, partially because many of them have novel features, and partially because the grass is always greener in a different language. It's instructive to step back sometimes and take a look at what people are actually using. The latest TIOBE Programming Community Index, which gauges the popularity of languages based on search engine results, has a new top dog, and it's our old friend C. In fact, when you factor in C#, C++ and Objective-C, the C-related languages pretty much own the category. Java has now fallen to second position, and you have to go all the way down to sixth place to find a scripting language, PHP.

Notably, the hot new languages, like Erlang and Scala, don't even make the top 20, and it only takes half a percentage point to get onto that list. As much as we like the new darlings on the block, the old veterans are still where most of the action (and money) is.



April 13 2012

Developer Week in Review: Everyone can program?

I'm devoting this week's edition of the WIR to a single news item. Sometimes something gets stuck in my craw, and I have to cough it out or choke on it (hopefully none of you are reading this over lunch ...).

Yet another attempt to create programming for dummies ...

Just today, I came across a news article discussing a recent Apple patent application for a technology to allow "non-programmers" to create iOS applications.

This seems to be the holy grail of software design, to get those pesky overpaid software developers out of the loop and let end-users create their own software. I'll return to the particulars of the Apple application in a moment, but first I want to discuss the more general myth, because it is a myth, that there's some magic bullet that could let lay people create applications.

The underlying misunderstanding is that it is something technical that stands between "Joe Sixpack" and the software of his dreams. The line of reasoning goes that because languages are hard to understand and require specialized knowledge, there's a heavy learning curve before a newcomer can be productive. In reality, the particulars of a specific platform are largely irrelevant to whether a skilled software engineer can be productive in it, though there are certainly languages and operating systems that are easier to code for than others. The real difference between a productive engineer and a slow one lies in how good the engineer is at thinking about software, not in C or Java or VB.

Almost without exception, the software engineers I talk to think it's insane that an employer would rather hire someone with two years of total experience, all of it in a specific language, than someone with 10 years of experience in a variety of languages, all other factors being equal. When I think about a problem, I don't think in Java or Objective-C; I think in algorithms and data structures. Then, once I understand the problem, I implement it in whatever language is appropriate.

I believe that a lot of the attitude one sees toward software engineering — that it's an "easy" profession that "anyone" could do if it weren't for the "obfuscated" technology — comes from the fact that it's a relatively well-paid profession that doesn't require a post-graduate degree. I match or out-earn most people slaving away with doctorates in the sciences, yet I have only a lowly bachelor's, and not even a BS. "Clearly," we must be making things artificially hard so we can preserve our fat incomes.

In a sense, they are right, in that it doesn't take huge amounts of book learning to be a great programmer. What it takes is an innate sense of how to break apart problems and see the issues and pitfalls that might reach out to bite you. It also takes a certain logical bent of mind that allows you to get to the root of the invariable problems that are going to occur.

Really good software engineers are like great musicians. They have practiced their craft, because nothing comes for free, but they also have a spark of something great inside them to begin with that makes them special. And the analogy is especially apt because while there are always tools being created to make it easier for "anyone" to create music, it still takes a special talent to make great music.

Which brings us back to Apple's patent. Like most DWIM (do what I mean) technologies for programming, it handles a very specific and fairly trivial set of applications, mainly designed for things like promoting a restaurant. No one is going to be writing "Angry Birds" using it. Calling it a tool to let anyone program is like saying that Dreamweaver lets anyone create a complex ecommerce site integrated into a back-end inventory management system.

The world is always going to need skilled software engineers, at least until we create artificial intelligences capable of developing their own software based on vague end-user requirements. So, we're good to go for at least the next 10 years.


April 06 2012

Developer Week in Review: When giant corporations collide

The days of the April Fools' web joke are over, or should be. It's gotten too old, too institutionalized, and it's so widespread these days that serious news can slip through the cracks because everyone assumes it's a joke. If people want to pull hoaxes, they should pick a random day in the middle of the summer and do it then; they'll get much more bang for the buck because no one will be expecting it. I used to like a good fake article as much as anyone, back in the days when they would be buried somewhere in the pages of a magazine's April edition, but now it's just lame. Be assured, all the items in this edition of Developer Week in Review are 100% prank-free and were supervised by the American Humane Association.

Gentlemen, start your lawyers!

Like a large radioactive reptile, the lawsuit between Oracle and Google over the improper use of Java has been sleeping quietly in a courtroom in San Jose. But now, the slumbering monster is about to awake, potentially leaving a trail of broken companies scattered from California to Asia. After all attempts to broker a settlement between Larry's House of Java and the People's Autonomous Car and Search Engine Company failed, the judge involved has ordered the two parties to start sharpening their long-knives, in an unusually candid opinion.

It's hard to overestimate the potential impact that a ruling against Google could have on the smartphone industry. If Google were required to remove Java from Android phones, Android would essentially become useless, because the entire stack that Android apps use is built on top of Java. More likely, Google would be required to shell out a significant license fee to Oracle, which, added to the ones it already pays to Microsoft and (potentially) Apple, could make Android phones less and less profitable for the handset makers who actually end up paying the fees. Of course, given the glacial pace at which these proceedings move, Android may have moved on by the time any such judgment actually comes down ...

Linux has a friend in ... Redmond?

In the past few weeks, we've made several references to Microsoft's increasing support of the open source model, and this week brought even more evidence of the sea change out of Washington state. For a technology that Steve Ballmer once described as akin to cancer, Linux is certainly getting a lot of love from Microsoft these days. The software behemoth is now in the top 20 corporate contributors to the Linux Kernel, committing more than 1% of all new lines of code last year.

It is worth bearing in mind that most of that code is in support of Microsoft technologies, such as Hyper-V, but even still, it's clear that Microsoft doesn't treat Linux like an ill-behaved street urchin anymore.

The art of game cheats

I'm not much (if anything) of a game programmer; I've always gravitated more to the web side of the force. But I certainly play my share of games. I'm currently racing my 17-year-old to level 80 in "Call of Duty: MW3" on the Wii (I'm [MLP]TwilightSparkle if you want to ally with a mediocre player who likes the Akimbo FMG9 a bit too much for his own good ...). If you play enough multiplayer, you'll eventually come to recognize the players who have an almost psychic knowledge of where everyone is. They're the ones who always seem to come around the corner already sighted in on you. You know, the cheaters ...

Now, one game developer has stepped forward to explain some of the hacks that cheats use to become Chuck Norris clones and how they are implemented. Even if you are never going to get within 1,000 yards of a z-buffer, it's worth reading to see just how easily games can be tweaked to give unethical players an unbeatable edge.


March 29 2012

Developer Week in Review: Google I/O's ticket window opens and shuts in record time

This week, I'd like to take a moment to thank the good folks over at Parkland Medical Center, who took pity on the retching, sweat-covered soul who appeared on the doorstep of their emergency room last Friday morning. They swiftly (well, after 15 eternal minutes in the waiting room, which is pretty swift for a walk-in to an ER) got him hooked up to an IV and introduced him to the two God-given holy fluids of morphine and Dilaudid. On a totally unrelated note, I'd like to proudly announce the birth of a healthy 3mm kidney stone at 5PM last Friday. Donations to its college fund can be made ...

Extending the trend line doesn't look good

Last year, Google I/O sold out in under an hour. This year, it only took 20 minutes. If we extend the trend line out a few years, the only people who will be able to get in will be those with access to microsecond-responsive stock market trading programs and hyper-tuned eBay auction-sniping software.

At least, however, Google fans have some clue when the registration opens for their conference. Those of us still waiting for Apple's WWDC conference know it will have to open for registration soon, but the exact date and time is a mystery. Thankfully, the multi-thousand dollar registration fee tends to make WWDC a bit slower to fill up, but it will still be a race for those who require authorization from their management to go (some of us get authorization months in advance, specifically for this reason).

If there's a solution to this classic supply versus demand problem, I can't see it. Regional conferences reduce the benefit of getting all the developers together in one place and would have the companies sending their development staff to the four corners of the world. Maybe Apple and Google need to start renting out football stadiums instead of conference facilities.


More pigs spotted airborne

For those who have been taking a skeptical view of Microsoft's avowed embrace of the open source movement, there's more reason to believe it's genuine. This week, Microsoft released a whole crop of its .NET technology to its CodePlex open source repository, and the company did it under the hyper-liberal Apache 2.0 license rather than something proprietary and restrictive. In addition, Microsoft has started using the developer-friendly git source control system — another attempt to make itself more compatible with the open source community as a whole.

Of course, releasing portions of its proprietary environment as open source is still an attempt to get people to use Microsoft's technology as a whole, including Visual Studio, but the more it puts out there under licenses that include patent grants, the more possible it is to incorporate compatibility with Microsoft products in non-Microsoft platforms and products.

Was the cake made out of 0xDEADBEEF?

It's practically unimaginable today, but when the gcc compiler was first released 25 years ago this week, the only way to compile your code was to pay your hardware vendor for a proprietary compiler package, sometimes costing tens of thousands of dollars.

In the intervening years, the shining star of the Free Software Foundation (FSF) has become the go-to (excuse the expression) compiler for most modern compiled languages, available on and for just about every hardware platform you can think of. It doesn't have the death-grip on the industry it once did, with Apple among others moving to LLVM, but it was the first and, for a long time, the best compiler money couldn't buy. You may argue with the current philosophy of the FSF, but give it due props for opening up programming to the world by making free tools available to anyone who wanted them.


March 22 2012

Developer Week in Review: The mysterious Google I/O machine

We're in the countdown days to the two big annual developer conferences (not counting OSCON, of course ...). Google I/O will open registration on March 27th, and if history is any guide, WWDC should also start (and end) signups around the same week. So, get your credit cards warmed up and ready. Last year, both conferences sold out in less than a day (Google I/O in under an hour!).

And speaking of Google I/O


Just what is the purpose of the Rube Goldberg-esque physical puzzle that has gone up on the Google I/O website? Does it have something to do with a puzzle that potential attendees will need to solve to register? Will attendees be flung around from session to session by giant pendulums? Is it all just a cool demo of Chrome? And does it have anything to do with ancient Mayan prophecies?

In any event, it's a fun (if simple) game, worth a few moments of your time, but unlikely to absorb more than 15 minutes of your attention. Now, if they added achievements and a Zombie mode, that might be something.


So much for sandboxing

Reports of a successful exploit against the Chrome sandbox appeared recently, and now word has broken that a new Java exploit not only breaks out of the sandbox, but manages to install itself into system memory, where it can mess around with privileged processes. Worse, unlike the Chrome exploit, which was reported to Google and never seen in the wild, this new Java hack is being actively distributed on popular Russian news sites.

Since the entire point of a sandbox is to keep malicious code from getting access to system resources, it is truly disheartening to see how frequently sandboxes are being penetrated these days. If there's one piece of code that needs to be rock-solid, it's the bit that keeps the bad guys from doing bad things. That it fails so often indicates either that developers aren't doing a good job, or that it's a really hard problem and it may be time to rethink sandboxing as a valid security approach.
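
Conceptually, a sandbox is just default-deny mediation: untrusted code may request only operations on an explicit allow-list, and everything else is refused. A minimal sketch of that idea (the operation names are invented for illustration; real sandboxes mediate actual syscalls, which is vastly harder to get right):

```c
#include <string.h>

/* Hypothetical allow-list: the only operations sandboxed code may ask
   for. Anything not on the list is denied by default. */
static const char *allowed_ops[] = { "read_tmp", "draw", "net_fetch" };

int sandbox_permits(const char *op) {
    for (size_t i = 0; i < sizeof allowed_ops / sizeof *allowed_ops; i++)
        if (strcmp(op, allowed_ops[i]) == 0)
            return 1;   /* explicitly allowed */
    return 0;           /* default deny */
}
```

The exploits above succeed not by getting past this kind of check, but by finding a path around the checker entirely — which is why the mediation layer itself has to be rock-solid.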

Go is almost a Go

For those who have been eagerly awaiting Google's attempt to reinvent the wheel (er, new programming language), Go, the wait is almost over, as RC1 has just hit the street. According to the developers, this is very close to what the final 1.0 release will look like. If you've been waiting for a stable version of Go to kick the tires on, now is probably the time.

As with most new programming languages, I am maintaining a healthy degree of skepticism as to the long-term viability of Go. This is not because of any inherent faults of the language, but because of the institutional inertia that new languages have to fight to gain acceptance. Whether Google's influence will be enough to get Go ensconced in the pantheon of mainstream languages is yet to be seen.


March 15 2012

Developer Week in Review: When game development met Kickstarter

Happy day after Pi Day, everyone (except all you Tau fanatics ...). If you happen to live in Louisville, drop by the FedEx facility there and say "hi" to my new iPad. It's been sitting there since last Friday, waiting for the magic hour to take the final leg of its voyage, so that everyone's iPads arrive on the same day (unless you happen to live in Vietnam, evidently ...). My upgraded Apple TV unit is allegedly arriving today, a day early. That's me, single-handedly helping to drive Apple's stock price over $700.

Disintermediation, thy name is Kickstarter

Double Fine Adventure Kickstarter campaign

Kickstarter has gained a reputation for letting small ventures crowdsource their funding, providing an alternative to venture and bank investments for new products and projects. But with a few notable exceptions, it's been fairly small-scale stuff, typically between $10,000 and $100,000 of total funding.

Meanwhile, independent game designers have been hampered by the large costs associated with creating products that can compete with the big players such as EA. With costs for even a relatively simple game running into the millions, there was no practical way to fund great ideas without giving up artistic control to the megacorps.

Now, several game developers have decided that game funding and Kickstarter are two great tastes that taste great together. Crowdfunding for small software projects is old hat for Kickstarter, but the scale it's now reaching is rather breathtaking.

It started with the folks over at "Double Fine Adventure" (which includes the talent behind the well-known "Monkey Island" series of games), who set up a Kickstarter project with a $400,000 goal. That money was intended to fund development of a new point-and-click adventure game. To say that it was successful is truly an understatement: The project ended up with $3.3 million in funding.

Brian Fargo, who was executive producer for the hit games "Wasteland" and "Fallout," evidently liked what he saw. He's following the same model for "Wasteland 2." It's already blown past its stated goal of $900,000 (just over a million dollars had been pledged when I wrote this). With 32 days to go in the pledge period, it's almost certain to exceed even the $1.5 million level that will let the team create both Windows and OS X versions.

This disintermediated payment model is very exciting, both for software developers who might have a big idea that needs big funds, and potentially for many other areas of creative endeavor. Your favorite show just got canceled? Fund it yourselves! In the meantime, hopefully we'll see more exciting independent games find the budgets they need to become reality.


And speaking of crowdsourcing

This week marks the end of an era, as Encyclopedia Britannica announced that it will no longer issue a print version of its product (digital products will continue). For people of my age, Britannica was the go-to source when grinding out those high school term papers (along with another dinosaur, the Reader's Guide to Periodical Literature).

What did in the EB was, of course, Wikipedia. For all its warts, there was just no way that a massive tome (both physically and financially) was going to survive in the long term, when a much more up-to-date and comprehensive source was available for free. The Britannica's 120,000 articles just couldn't compete with Wikipedia's nearly three million, especially when the cutting-edge articles in the 2010 EB edition covered such breaking news as the Human Genome Project (completed in, wait for it, 2003).

Purists will bemoan the death of an authoritative, expert-edited research source, but the reality is that expert-curated sources (such as journals) are proving to be as subject to bias and error as crowdsourced ones. I hear horror stories from my wife about how hard it is to get a journal article accepted if it goes against the conventional wisdom, especially since the people reviewing the articles are usually the ones who have the most to lose if it turns out they were wrong. Of course, crowdsourced reference material can suffer from the opposite problem, letting fringe theories creep in around the edges.

In the end, what won the war for Wikipedia (apart from price and convenience) is the sheer volume of information available. Sure, a complete list of the characters appearing in "Firefly" may not end up being crucial to your kid's next senior essay, but life is more than just papers. Wikipedia rules because it has not only the meaty articles, but also the ones you need on a day-to-day basis.


March 08 2012

Developer Week in Review: The new iPad and the big meh

The wacky climatology continues here in New England. We got half a foot of snow last week and it's 65 degrees today. Combine that with the unseasonable tornados Friday in the Midwest and South, and the icebox Europe has been suffering under, and you want to quote Bill Murray from "Ghostbusters": "Human sacrifice, dogs and cats living together ... mass hysteria!"

And speaking of mass hysteria ...

Apple announces new products. World yawns.

Somewhere on a sleepy little ice-covered moon in a far-off galaxy, 12-eyed alien sloths watched the live-blogging of the Apple iPad reveal yesterday, so over-hyped are Apple's product announcements these days. The big surprise this time was that, well, there were no big surprises. A combination of leaks from companies in Apple's supply chain and good guesswork meant that we'd known for days what was likely to be announced, and the rumors were pretty much on the money: a 4-core iPad with 2x display resolution and a better camera, on LTE, plus an upgraded Apple TV unit. It seems the days are gone when Apple's master pitchmen could pull something genuinely novel out of the hat with no advance warning. I can remember the stunned applause when the iPhone was first unveiled. The "new" iPad's announcement was more like "yeah, OK, cool."

That having been said, the new iPad is going to blur the laptop/tablet lines even further, as a combination of more processor power and a higher-resolution display is letting more and more advanced applications make the transition to a tablet form factor. For developers, this is going to mean abandoning the mouse and keyboard as the primary way of doing things in user interface design, even for products that were traditionally thought of as "desktop applications" (such as CAD).

And yes, for the record, I bought a 64GB LTE model (black). If you're looking to trade in your old iPad, Amazon seems to be giving the best offers at the moment for used ones.

Great moments in patent extortion, the series!

Steve Jobs famously vowed that he'd destroy Android, but recent reports indicate that Apple has decided it would rather make profits, not war. Apple is reportedly offering to back off patent litigation against handset vendors in return for a $15/unit license fee. If you combine that with Microsoft's $10/unit fee, that means that $25 of every Android sold is going to companies that directly compete against the platform.

What a great business model! Buy our product, or don't buy it, but either way we'll make money on the deal. Mind you, I'm sure Apple and Microsoft clear far more than $15 and $10 respectively in profit off each iPhone and Windows phone they sell, so they'd still rather you buy one of theirs. Still, if you can't beat 'em, tax 'em!

Welcome Raspberry Pi

The Raspberry Pi is finally here and shipping. Not surprisingly, the $35 single-board Linux computer immediately sold out. However, there's evidently a robust supply chain in place because I was able to purchase a unit for delivery in just a few weeks.

For my money, the big loser in all this is going to be the Arduino, which is cute but underpowered and hard to develop for. Given that the Pi is cheaper than most Arduinos and offers networking, HDMI and USB, plus an easier-to-use Linux OS, I can see a lot of developers deciding to drop Arduino in favor of it. It will run happily on four AA batteries and has GPIO ports available, so you could even use it in your favorite autonomous flying vehicle autopilot application if you wanted.


February 23 2012

Developer Week in Review: Flash marginalization continues

I got a rude reminder of how dependent we've grown on ubiquitous telecommunications, as AT&T decided to take a sick day, cell phone service-wise. The outage only lasted an hour or so, but I suddenly found myself on the road with no way to call into a scheduled scrum standup (can it be a standup when you're sitting in your car?) and no way to email the team to let them know what was going on.

Total outages have been pretty rare, but it wouldn't take much from a solar storm perspective to knock everything offline, something I wrote about several years ago. Try to imagine modern society with no power, telecommunications or GPS navigation for a few days, and losing cell service for an hour gets put into its proper perspective.

Now that I'm back at home with a nice reliable fiber connection, I can give you the news of the week.

Tux can only flash people wearing Chrome now

As was reported previously, Adobe is starting to gracefully put Flash out to pasture in favor of HTML5. The deathwatch took another step forward this week, with Adobe announcing that only Chrome will be able to run Flash under Linux in the future.

One could argue that Linux never was much of a market for Flash anyway, but following on the heels of the announcement regarding mobile support, it should be clear that Flash is on the way out. Flash was once considered the last best hope for seamless integration across desktop and mobile platforms, held back only by Apple's intransigence. Now, all eyes are on HTML5.


Getting laid off doesn't sound so bad now, does it?

In the "developed world," software professionals spend a lot of time worried about intellectual property, career viability, privacy issues, and the like — our version of "first world problems." Once in a while, however, we get harsh reminders of the kind of real problems that can face a software developer in less-friendly circumstances.

Such is the case of Saeed Malekpour, an Iranian-born engineer and Canadian resident, who is currently facing a death sentence in Iran, accused of creating a pornographic network. According to most sources, the only thing that Malekpour actually did was to create a program that could be used to upload photos to websites, and that code had been incorporated into pornographic websites without his knowledge.

Malekpour confessed to running a pornographic network after a year in custody, a time during which his supporters claim he was frequently tortured. What is certain is that very soon, if nothing is done, he will be executed, likely by hanging.

It's easy to write this off as a symptom of extremist ideology, but it should also serve as a wake-up call to open source and freelance developers who never plan to venture outside so-called "developed" countries. It is far too easy to imagine some hapless developer being dragged off to an undisclosed location because his or her software was found on the laptop of a jihadist. The problem with writing software is that you never know who may end up using it.

Putting Apple's labor issues in perspective

I just watched the "Nightline" report on Apple's production facilities, run by Foxconn in China. I'm sure that there's lots of righteous outrage afoot about the low wages (starting at $1.80 an hour) and cramped living conditions at the facility. I thought it was worth putting things in perspective, however.

To make it clear at the outset, I'm not in any way an apologist for China's government or social system. But I suspect you could find lots of people living in the U.S. willing to work for that wage, provided with lodging for $17 a month and a meal that costs about an hour's wage. As the report pointed out, the suicide rate at Foxconn, at 1.7 suicides per 100,000, is actually below the average in China. For comparison, U.S. police officers experience 18 suicides per 100,000. And lest we become too indignant about factory accidents at the Foxconn facilities that killed more than two dozen in the past few years, we should remember that the U.S. doesn't have a shining record in this regard either.

The point I'm making is that Apple makes an easy target because of its size and because some people want to make trouble for the company whenever they can. However, if we're going to attack Apple, let's do it for the right reasons. By most accounts, Apple is doing a much better job ensuring worker rights and safety than the industry as a whole.


Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20


February 16 2012

Developer Week in Review: NASA says goodbye to big iron

It looks like I'm going to have a life-changing decision to make in the next few weeks, one that will be shared by millions of people around the world. At risk, the balance in my bank account.

I refer, of course, to whether I'll pony up the cash to upgrade my iPad 2 to a 3, once Apple actually tells us what the iPad 3 will have in it. Unless it cooks gourmet dinners and transports you to other planets, my best guess is that I won't. For one thing, we're also facing the release of the iPhone 5 later in the year, and I make it a policy only to do one Apple fan-boy "upgrade the expensive toy you just bought last year" purchase a year. For another, it looks like the 3 is going to be a faster version of the 2 with a Retina display, and I just can't see it being enough of a delta in features to make it worth the cost.

If I'm going to upgrade either device, I need cash in the bank, so time to earn my keep with this week's news.

HAL is crestfallen ...

We arrive at a bit of a milestone this week, as NASA says goodbye to the last piece of big iron left in its data processing infrastructure. With the retirement of the last IBM Z9, NASA finishes its mission to boldly go where most of the rest of the high tech world had already gone years ago. I especially liked the shout-out to old-school programmers in JCL at the end of NASA's blog post marking the occasion.

NASA, like many organizations running life-critical applications, has to take a very conservative approach to hardware upgrades, because failure is not an option. The computers installed into NASA space vehicles and probes are notorious for being generations behind the current state of the art, because of the long lead times to get them spec'd out and installed. Obviously, no mainframe flies into space, for reasons of weight and space if nothing else. You can see the same kind of excruciatingly slow hardware progress at agencies like the FAA, which can take a human generation to upgrade to a new air traffic control system.

For now, let us bid farewell to the brave Z9, last of its kind at NASA. It would be nice to fantasize that it was responsible for some intricate detail of manned space flight, but the reality is that it evidently ran business applications. Even so, if you don't pay the engineers and vendors, they don't work, so it did play its own sort of role in the exploration of the universe.


Monty Redmond's Visual Python

Visual Studio, like Eclipse and Xcode, provides IDE support for a huge swath of the developer community. While it's still common to find old-schoolers who use Emacs or vi to grind out code, most programmers these days end up using an IDE to take advantage of the debugging and integrated documentation features they provide.

Eclipse is well-known for the wide variety of languages and platforms it supports, but it's easy to forget that Microsoft is making a concerted effort to open up Visual Studio to a wider developer audience as well. One sign of this is the version 1.1 release of Python Tools for Visual Studio, which has just come out. This toolkit is notable for another reason, too: it's one of the projects coming out of Microsoft's CodePlex open source initiative.

I know I'm not alone in having been skeptical of Microsoft's recent warming to open source. It's easy to see it as yet another "embrace, extend and extinguish" play. But at a certain point, you have to say that if it walks and talks like a mule, it may in fact be a mule after all. While I don't expect to see the Windows XP source code being donated to Apache anytime soon, it does appear that Microsoft is making an honest effort to leverage the power of the open source model where it makes sense. That's a huge change from the company's previous "open source is communism" stance. As with most things, time will tell if this is the real deal.

I guess we'll find out what happens when you cross the streams ...

Open source developers have a reputation for bringing a passion, sometimes at an obsessive level, to the projects they work on. But even they would find themselves challenged to keep up with the frenzied level of creative mania displayed by bronies, adult fans of the new My Little Pony reboot. So what happens when you combine the two forces of open source and the brony herd? Wonder Twin developer powers activate!

"PonyKart" is a "Mario Kart"-style game set in the "My Little Pony: Friendship is Magic" universe. It's being developed by a group of brony developers over on SourceForge. It's still in the early days, but the initial videos they've released are impressive.

There's a reason you don't see a lot of open source games with this level of complexity; it's a fairly massive undertaking and is usually only within the resources of major game houses. There is a very capable Linux "MarioKart" clone out there, but consider that the "PonyKart" folks have only been in operation since July of last year, compared to the six years of development that have gone into "SuperTuxKart" so far, and you can get a feel for the awesome power that can be brought to bear when two committed movements overlap. To be fair, there are more tools available now, such as physics engines, than when "SuperTuxKart" started development, but the "PonyKart" effort is still striking. Imagine what could happen if we could get the Gleeks interested in video editing software ...

Tying in another theme often harped upon in these pages, the reason "PonyKart" can happen at all is that Hasbro has gone out of its way to apply a light hand as far as its intellectual property is concerned. Rather than wrapping a death-grip around the My Little Pony characters, Hasbro has let fans pretty much run wild with them (including the inevitable Rule 34 stuff). The company has wisely decided to let the fans churn up a meme-storm, while it sits back and counts the profits from toy sales. Are you listening, RIAA and MPAA? You could do much better by cooperating with your fan base, rather than persecuting them.

Of course, "PonyKart" could still lose momentum and die. There's a big difference between a long-term effort and horsing around for a few months (see what I did there?). But given the evidence to date, I wouldn't count this nag out of the race yet.

(Obligatory full disclosure: Your humble chronicler is a member of the herd, although not involved in the "PonyKart" project.)


February 10 2012

Developer Week in Review: A pause to consider patents

This week, as I do occasionally, I want to focus on one specific topic.

For regular readers, the topic of technology innovation and patents is nothing new; it's a problem that is frequently covered in this space. But this week, there were two important occurrences in the world of intellectual property that highlight just what a mess we've gotten ourselves into.

The first is an unexpected turn of events down in scenic Tyler, Texas, home of the world's most plaintiff-friendly patent infringement juries. How friendly? Well, biased enough that Eolas relocated its corporate HQ to Tyler just to be close to the courts. Eolas, as you may recall, is the pesky gadfly that's been buzzing around giants such as Microsoft for years, claiming broad patents over, well, the entire Internet. Rather than continuing a costly court battle it might lose in the end, the House of Redmond settled, and a host of other high-tech cash cows followed suit.

As Eolas continued to threaten to sue the pants off everyone, a ragtag group of plucky companies including Adobe Systems, Google, Yahoo, Apple, eBay and Amazon.com said enough is enough. And this week in Tyler, following testimony by luminaries such as Sir Tim Berners-Lee, a jury agreed, invalidating the infamous '906 patent (US 5,838,906).

You'd think that this would make Google, one of the main defendants, a big hero and confirm its status as Not Evil. But in the very same week, Google refused to budge on its licensing requirements for patents it acquired from Motorola, patents that are required for any company that wants to play in the 3G cell phone space.

When a standard is adopted by governmental bodies (such as the FCC) or standards-setting bodies like IEEE, it should ideally be free of any intellectual property restraints. After all, that's the purpose of a standard: to provide a common framework that competing companies can use to produce interoperable products. Standards such as GSM and CDMA are why you can use your iPhone in Europe (if you're rich).

The problem is, most modern standards come with a bundle of patents attached to them. In the 3G space, Google (through the Motorola acquisition) and Samsung own a lot of them. As part of the standard-making process, these companies are supposed to agree to offer use of the patents under Fair, Reasonable and Non-Discriminatory (FRAND) license terms. The idea is that all companies using the standard pay the same license fees to the patent holders, so no one gets an advantage. The problem is, who decides what is Fair and Reasonable?

This is especially pernicious when the company licensing the patent is also a competitor in the space. Obviously, Samsung doesn't pay itself a license fee to use its patent, so it doesn't matter how expensive it makes the fee, as long as Samsung doesn't incur the wrath of the standard-setting body. In the case of Motorola/Google, the license fee is set at 2.25% of the total selling price of the phone (which would come to around $13.50 on a $600 iPhone). Apple, et al., are screaming to the moon that that kind of licensing is not in the spirit of FRAND, but it's up to groups such as the European standards body, ETSI, to determine if the patent holders are really playing fair.

Of course, Google has fallen victim to the same issues. Although it doesn't pay the piper directly, phone manufacturers using Android end up reportedly paying $5 per phone to Microsoft to avoid patent issues. It's worth noting, however, that at least Microsoft is using software-related patents that it claims Android infringes, not patents directly related to the underlying standards used by the phone.

There's a simple solution to this problem, of course, which is not to allow patent-encumbered technologies to become standards. The software world has (mostly) been free of this kind of nonsense, and it's a good thing. Can you imagine having to pay a license fee to use SOAP, or AJAX? The worrisome thing is that this could become the model used for software patents, and it would basically kill smaller companies trying to be innovative.

Oh, and before you count Eolas out of the game, remember that this is just a single trial it lost. It can try again with another jury and another set of companies. Unless the USPTO invalidates the underlying patent, Eolas is still out there, waiting to strike.



January 27 2012

Developer Week in Review: Sometimes, form does need to follow function

It was 56 degrees in Boston on Tuesday. It wasn't a record (you need to go back to 1999 for that, when it hit 62), but it definitely is another page in what has been a very, very bizarre winter (so far, the largest snowfall occurred back on Halloween, for example). Call it climate change, call it elves, call it sunspot variations, but whatever you call it, call it weird.

Meanwhile, while we wait for the great Northeast Football War to commence, a few notes on the week's events.

Sometimes, you need a button

I suspect that somewhere, once a day, a journalist is taking a pair of 20-sided dice and rolling on a table called "What product Apple might work on next." The latest incarnation of this madness is a rumor that Apple might enter the smart remote control market with a touchscreen product.

The problem is, there are already touchscreen apps for the iPhone and iPad that talk to remote control widgets. And they suck. As much as Apple hates buttons and clutter, remote controls need buttons, or at least a few. The problem is kinesthetic: many of the things we do with a remote control involve looking up at the screen while using the remote, such as skipping through commercials. Touch screens, by their nature, don't provide tactile feedback, which means you need to look down to see what you're pushing.

This is a powerful reminder that as much as we want cool interfaces and minimal design aesthetics, sometimes it's more important that the darn thing does what we want it to do. The Apple crew has (to date) been great at paring devices down to their essential functionality, but it may meet its match in the remote.

Maybe Apple will come up with a work-around for this. One answer would be to have a duplicate of what's on the TV appear on the remote, so that you could see what you were doing while pushing buttons. But that would require DVR, Blu-ray and cable companies to adopt a universal way to get the video streaming to the controller. Of course, they could make it only work with the Apple TV (and rumored new Apple televisions), but that would be vendor lock-in, and Apple never does that ...

Time to invest in disk drive companies

Should you have any doubts that Big Brother is watching more and more, Australia is now proposing that telcos and ISPs be required to retain data about all emails and phone calls made in the country, and make it available to law enforcement officials. Apart from the privacy issues, think about the data management nightmare this would create: it's not just a month or a year of records that providers would be required to retain, but all records in perpetuity (or until the policy is overturned). This means that providers will need to figure out how to store this data in a way that will allow it to be accessed decades into the future.

Like SOPA and PIPA, this is an example of legislators writing checks that the providers have to pay. Add in the U.S. Patent Office, and you have a grand collection of bureaucrats and politicians trying to regulate technologies that they understand not a whit. Maybe it's time for all the technically adept of the world to form their own country, but I fear civil war would break out the first time they had to decide if Greedo shot first.

Open source heart code

Software operating in life-critical environments, from aircraft to medical devices, is nothing new. Unlike "Angry Birds," however, bugs in this kind of software come with a high price tag. Just this year, there were disturbing reports of hacks that allowed third parties to override the dosage delivered by insulin pumps.

Now, one lawyer has stepped forward to demand access to the software that drives the pacemaker that was to be implanted in her. GNOME Foundation director Karen Sandler is spearheading a campaign to make the source code of implantable devices open so that it can be inspected for vulnerabilities and bugs.

As more software is embedded into high-risk devices (such as the autonomous vehicles Google is getting ready to deploy or software for voting machines), the potential for accidental (or intentional) disasters grows. How does society weigh the intellectual property rights of the manufacturers against the right of the public to ensure that they are safe?



January 20 2012

Developer Week in Review: Early thoughts on iBooks Author

One down, two to go, Patriots-wise. Thankfully, this week's game is on Sunday, so it doesn't conflict with my son's 17th birthday on Saturday. They grow up so quickly; I can remember him playing with his Comfy Keyboard, now he's writing C code for robots.

A few thoughts on iBooks Author and Apple's textbook move

Thursday's announcement of Apple's new iBooks Author package isn't developer news per se, but I thought I'd drop in a few initial thoughts before jumping into the meat of the WIR, because it will have an impact on the community in several ways.

Most directly, it is another insidious lock-in that Apple is wrapping inside a candy-covered package. Since iBooks produced with the tool can only be viewed in full on iOS devices, textbooks and other material produced with iBooks Author will not be available (at least in the snazzy new interactive form) on Kindles or other ereaders. If Apple wanted to play fair, it would make the new iBooks format an open standard. Of course, that would cost Apple its cut of the royalties, as well as the all-important control of the user experience that Steve Jobs instilled as a core value in the company.

On a different level, this could radically change the textbook and publishing industry. It will make it easier to keep textbooks up to date and start to loosen the least-common-denominator stranglehold that huge school districts have on the textbook creation process. On the other hand, I can see a day when pressure from interest groups results in nine different textbooks being used in the same class, one of which ignores evolution, one of which emphasizes the role of Antarctic-Americans in U.S. history, etc.

It's also another step in the disintermediation of publishing since the cost of getting your book out to the world just dropped to zero (not counting proofreading, indexing, editing, marketing, and all the other nice things a traditional publisher does for a writer). I wonder if Apple is going to enforce the same puritanical standards on iBooks as they do on apps. What are they going to do when someone submits a My Little Pony / Silent Hill crossover fanfic as an iBook?

Another item off my bucket list

I've been to Australia. I've had an animal cover book published. And now I've been called a moron (collectively) by Richard Stallman.

The occasion was the previously mentioned panel on the legacy of Steve Jobs, in which I participated this past weekend. As could have been expected, Stallman started in by describing Jobs as someone the world would have been better off without. He spent the rest of the hour defending the position that it doesn't matter how unusable the free alternative to a proprietary platform is, only that it's free. When we disagreed, he shouted us down as "morons."

As I've mentioned before, that position makes a few invalid assumptions. One is that people's lives will be better if they use a crappy free software package rather than a well-polished commercial product. In reality, the perils of commercial software that Stallman demonizes so consistently are largely hypothetical, whereas the usability issues of most consumer-facing free software are very real. For the 99.999% of people who aren't software professionals, the important factor is whether the darn thing works, not whether they can swap out an internal module.

The other false premise at play here is that companies are Snidely Whiplash wannabes that go out of their way to oppress the masses. Stallman, to his credit as a savvy propagandist, has co-opted the slogans of the Occupy Wall Street movement, referring to the 1% frequently. The reality is that when companies try to pull shady stunts, especially in the software industry, they usually get caught and have to face the music. Remember the furor over Apple's allegedly accidental recording of location data on the iPhone? Stallman's dystopian future, where corporations use proprietary platforms as a tool of subjugation, has pretty much failed every time it's actually been tried on the ground. I'm not saying corporations are angels, or even that they have the consumer's best interests in mind; it's just that they aren't run by demonic beings that eat babies and plot the enslavement of humanity.

Achievement unlocked: Erased user's hard drive

Sometimes life as a software engineer may seem like a game, but Microsoft evidently wants to turn it into a real one. The company has announced a new plug-in for Visual Studio that lets you earn achievements for coding practices and other developer-related activities.

Most of them are tongue in cheek, but I'm terrified that we may start seeing these achievements in live production code as developers compete to earn them all. Among the more fear-inspiring:

  • "Write 20 single letter class-level variables in one file. Kudos to you for being cryptic!"
  • "Write a single line of 300 characters long. Who needs carriage returns?"
  • "More than 10 overloads of a method. You could go with this or you could go with that."
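Lest anyone be tempted, here's roughly what chasing that first achievement looks like in practice. This is a contrived JavaScript example of my own devising, not code from the plug-in:

```javascript
// Contrived example of the "cryptic single-letter variables" school
// of achievement hunting (mine, not Microsoft's):
function f(a, b, c) { var d = a * b, e = d + c; return e; }

// The same logic when nobody is chasing a badge:
function scaledSum(width, height, offset) {
  var area = width * height;
  return area + offset;
}
```

Both compute the same thing; only one will survive a code review.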
Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20

Got news?

Please send tips and leads here.

Related:

January 13 2012

Developer Week in Review: A big moment for Kinect?

Hope everyone is having a good year, so far. We're just getting our first snow of the season up here in New England (Snowtober not included...). Alas, I shan't be able to watch the Patriots and Broncos gird themselves for epic battle this Saturday (except after the fact on TiVo), as I'll be speaking that evening at the Arisia SF Convention in downtown Boston. I'll be participating on a panel discussing the legacy of Steve Jobs, and since one of the other panelists is Richard Stallman, it should make for a lively discussion.

Kinect for Windows makes it a good time to be a chiropractor

Say what you will about Microsoft, but its Kinect user input system has been a hot item since it was first released for the Xbox 360. The Kinect has also been a hacker's favorite, as researchers and makers alike have repurposed it for all sorts of body-tracking applications.

Come February, Microsoft will be releasing the first version of the Kinect specifically designed for Windows PCs, complete with a free SDK and runtime. This means that Windows developers can now start designing games and applications that use gestures and body positioning. A future full of "Minority Report"-style user interfaces can't be far away. And with people having to writhe and contort to use their computers, a 15-minute warm up and stretch will become mandatory company policy across the world.

Of more immediate interest: Will the hardware be open enough for folks to create non-Windows SDKs? I suspect a lot of Linux and Mac developers would love to play with a Kinect, and if Microsoft is smart, they'll take the money and smile.

A patent for those half-days

Like mobile phone litigation, software patent abuses are such a frequent occurrence that if I chose to chronicle them all, there would be no room left every week to discuss anything else. But every once in a while, a patent of such mind-altering "well, duh!" magnitude is granted that it must be acknowledged.

Enter the current subject: IBM's recently granted patent for a system that notifies people who try to email you if you're on vacation. But wait, you respond, just about every email system in existence lets you set yourself on vacation and send an auto-response to anyone who emails you. Ah, you fool, but can it handle the case where you only take a half day off? That's what this patent covers.
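The logic itself is hardly rocket science. Here's a minimal sketch of a half-day-aware auto-responder in JavaScript; the schedule format and function names are my own invention, not anything from IBM's patent:

```javascript
// Hypothetical half-day-aware vacation responder. The schedule maps
// dates to the portion of the day taken off; an auto-reply only fires
// if the message arrives during that window.
const vacationSchedule = {
  "2012-01-16": "morning",  // half day: out until noon
  "2012-01-17": "full",     // the classic all-day case
};

function shouldAutoReply(schedule, arrival) {
  const day = arrival.toISOString().slice(0, 10);
  const off = schedule[day];
  if (!off) return false;            // not on vacation at all
  if (off === "full") return true;   // any ordinary responder handles this
  const beforeNoon = arrival.getUTCHours() < 12;
  return off === "morning" ? beforeNoon : !beforeNoon;
}
```

A message arriving at 9 a.m. on the 16th draws the auto-reply; one arriving at 2 p.m. the same day reaches a human. That, apparently, is the patentable part.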

If NYC crashes with a null pointer exception, we'll know why

It may be more PR than promise, but New York City Mayor Michael Bloomberg has pledged to learn coding, as part of Codecademy's Code Year project.

Between Codecademy, the Khan Academy, and the free courseware now being offered by prestigious institutions such as MIT and Stanford, there have never been more resources available to the average person who wants to learn software engineering. The question is, how will the corporate world react to a cadre of self-taught developers? We often hear there's a shortage of engineering talent in the U.S., but will companies hire newbie coders who learned it all online?

January 05 2012

Developer Week in Review: 2012 preview edition

Baby New Year has opened his eyes, and he sees a bright future for the developer community. Of course, newborn babies can't focus beyond a few inches, so I'd take that with a grain of salt. Some of us are a little longer in the tooth, so this week, I'll try to peer out into the months ahead and take my best guess as to what we can expect in 2012. You can come back in December and laugh hysterically at my predictions.

It's all about the mobile

Let's get the obvious out of the way first. The intellectual property litigation mayhem that we saw in 2011 will continue unabated in the new year. Now that several vendors have implemented the nuclear option by suing their competitors, the fun and games can only get more intense as companies use local judicial systems and trade organizations as a way to keep competing products out of markets.

On the Android front, Ice Cream Sandwich (ICS) is starting to show up on handsets, but depressingly few if you're an Android developer hoping to use the new features of the release. There's no word if there will be a follow-on to ICS anytime soon, which is probably a good thing, given how far behind handset makers are in getting recent releases onto their shipping products.

Fans of iOS can look forward to at least one new iPhone and iPad (if not more) in 2012, as well as iOS 6. We'll probably see the end of life for the iPhone 3 family, since only the 3GS made it onto the iOS 5 supported list, and another year will have passed. Rumors abound that there will be an integrated TV option for iOS as well; whether it will allow apps to be installed is a question mark at the moment. Siri on your TV could be fairly awesome; imagine just saying, "Record all new Patriots games" and having it happen.

The BlackBerry appears to be singing its swan song, while those pesky J2ME feature phones continue to own much of the low-end cell phone market. The biggest unknown this year is whether the Windows Phone platform will finally gain significant traction. Nokia and Microsoft are spending a boatload of money to promote it. They have the resources to buy market share if they want, and recent reviews of new Windows Phone devices have actually been pretty positive. The question would be, who would Microsoft steal market share from: Apple, Android, or the low-end phones?

Clouds are gathering on the horizon

Much as the Internet rapidly gained mindshare in the early '90s, the cloud has now become the hot new concept that the general public grasps, at least in principle. What exactly the cloud is tends to depend on who you talk to, but the general idea of moving desktop applications to HTML5-based web applications is a done deal at this point.

The one big wrench in the plan could come from the legislative branches of the world. The more they pass SOPA-like laws, the more people are going to worry about how easily they could lose access to their private data if they move it to the cloud. It was bad enough when you had to trust Google not to be evil; with elected representatives, evil is almost a given.

The increasing move to the cloud is only going to heat up demand for developers who know HTML5, jQuery, PHP, and other web-based technologies. At least in the short run, it's going to be a good time to be a web developer.

Offshoring loses its cachet

The stampede to move development jobs overseas seems to have encountered a roadblock, and many U.S. companies appear to be rethinking the economics of outsourcing projects. Some startups are trying new and innovative (and potentially insane) schemes to work around U.S. labor laws, and while this is unlikely to bring back the go-go days of the late '90s — when developers were courted like rock stars — it may perhaps stem the hemorrhaging of skilled jobs overseas. The challenge for the U.S. will be to produce enough high-tech workers to fill all those returning jobs, especially as more and more high school students rethink the economics of going to college.

December 16 2011

Developer Week in Review: HP sets webOS free

Hard to believe there are only 15 days left in 2011; the year has flown by. Next week, I'll be putting out the much-anticipated Developer Year in Review, highlighting the ups and downs of the industry over the last 12 months. But for the moment, enjoy these pre-holiday tidbits:

HP gets into the spirit of the season

Evidently, Meg Whitman was visited by three ghosts recently, because she opened her window last week and shouted for the boy downstairs to run to the butcher and buy the big goose in the window so it could be delivered to Bob Cratchit's house. Except in this case, the goose was the source code to webOS, and the lucky recipient was the open source community.

It's certainly a magnanimous gesture on the part of HP, and it's likely to lead to any number of interesting spin-off projects. It will also provide an interesting contrast to the current open-source tablet darling, Android. Exactly who will administer the project and which license it will be released under is still uncertain. Hopefully, it will be a relatively permissive license so it can freely cross-pollinate.

For HP, this is definitely making the best of a bad situation. As readers may recall, I've harped on several occasions about how Oracle has been shedding many of the assets it acquired when it purchased Sun. But as far as throwing away money goes, Oracle is bush-league compared to HP. It's taken less than two years for HP to relegate the $1.2 billion it paid for Palm to the "capital losses" column of its tax return.

And speaking of Oracle ...

Anyone who has ever been involved in negotiations for an outside vendor to deliver a software solution knows that it's an inexact science, at best. There always turn out to be requirements that were missed or technical complications that surface during deployment, and customers are usually (reluctantly) willing to pay the piper because they have already committed to the solution.

Montclair State University evidently decided to try plan B when Oracle went over budget and missed deadlines on the university's new ERP system. They are taking Mr. Ellison's yacht-funding enterprise to federal court, accusing Oracle of rigging the demo and trying (in the words of the university) to extort money by threatening not to complete the work unless paid millions more in fees.

It may be dicey to figure out if Montclair understated its requirements or if Oracle low-balled the bid since I've yet to see a requirements spec for a fixed-price contract that was worth the paper it was written on. Oracle can at least take comfort from the fact that Montclair doesn't have a law school, so there won't be any pro bono faculty members on the legal team.

On the other hand, T&M has its perils, too

Some companies prefer to bid contracts as time and materials (T&M), rather than fixed price. This is a good deal for the contractor because it won't get caught underfunded if things turn out to be complicated. For the customer, it offers the benefit of being able to pull the plug if things aren't working out or to add and remove requirements without having to renegotiate. The downside for the contractor is that it can't profit from finishing early.

Of course, this all assumes that the contractor is actually working on the project. In a recent case, your tax dollars (for all you American readers) were going to pay someone to watch movies, hang out in bars, and ride roller coasters. California-based Aerospace Corp just paid the Department of Justice a nice round $2.5 million to settle allegations that not only was it billing time for an employee who was moonlighting at another firm, but that he spent his days at leisure while billing both firms.

Incredibly, this went on for five years, despite such stunts as billing for more than 24 hours of work in a single day. You almost have to admire the chutzpah of Mr. William Grayson Hunter, who also inflated his high school diploma into a doctorate from Oxford. He also managed to die of natural causes before the long arm of the law could bring him to justice, presumably with a smile on his face and a Six Flags hat on his head.
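Catching the most blatant version of the scam, billing more than 24 hours in a day, is about a dozen lines of code. A hypothetical timesheet sanity check (the data layout is mine, not anything Aerospace Corp actually ran) might look like:

```javascript
// Hypothetical sanity check: flag any day whose total billed hours
// across all entries exceed the number of hours a day actually has.
function impossibleDays(entries) {
  const totals = {};
  for (const { date, hours } of entries) {
    totals[date] = (totals[date] || 0) + hours;
  }
  return Object.keys(totals).filter((date) => totals[date] > 24);
}

// 14 hours billed to one firm plus 12 to another: a 26-hour day.
impossibleDays([
  { date: "2008-03-03", hours: 14 },
  { date: "2008-03-03", hours: 12 },
]); // → ["2008-03-03"]
```

That nobody ran anything like it for five years is the truly impressive part.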

December 09 2011

Developer Week in Review: Developers are our most important asset?

I'm about halfway through the Steve Jobs biography (currently in the middle of the NeXT years), and I am continually struck by the dissonance of Jobs' incredible insight about some things and total blindness to others. It reminds me of an observation I used to make about some people: There's a reason Wisdom and Intelligence were two different stats in Dungeons & Dragons.

Not all of us are born to ascend to such lofty heights of fame, but it's nice to know that ...

Developers: Not just nameless cogs in the machine?

People who make a living making software have had reason to be nervous over the last decade. The trend seems to have been a race to the bottom of the salary scale, with generic talent in developing countries valued on par with highly experienced developers at home. Even when companies were willing to acknowledge that an external developer might not be as productive as an experienced one who has been working for the company for many years, the argument was made that since you could hire four or five foreign developers for the price of the stateside talent, the economics still made sense.

Now, an interesting essay in Forbes makes the argument that really good developers aren't twice or three times as valuable as the average, but 10 times or more. The essay's author, Venkatesh Rao, puts forward the proposition that smart companies such as Google recognize the value of retaining their high-end talent and deliberately shower them with lavish perks to keep them from straying. Rao argues that because developers require so little support infrastructure (as opposed to a biologist, for example, who needs staff and a lab), highly productive software engineers have a direct multiplier effect on the bottom line.
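The arithmetic behind that argument is worth spelling out. Using purely illustrative numbers (none of them from the essay itself): if one star developer costs as much as five average hires but produces 10x the output, the "five for the price of one" pitch inverts.

```javascript
// Illustrative numbers only: one "10x" developer versus five average
// developers hired for the same total payroll.
const avgSalary = 80000;            // hypothetical average salary
const avgOutput = 1;                // normalize average productivity to 1

const teamOfFive = { cost: 5 * avgSalary, output: 5 * avgOutput };
const oneStar = { cost: 5 * avgSalary, output: 10 * avgOutput };

// Identical payroll, but the star produces twice the team's output.
console.log(oneStar.output / teamOfFive.output); // prints 2
```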

I've been noticing the beginnings of a backlash against the offshoring fad, as, it seems, have others. The question, of course, is whether average companies are able to look past the immediate cash-flow factor and evaluate the value of their staffs based on things such as the end quality of the product and (to quote Mr. Jobs) how insanely great the results are.

Data centers and the boonies economy

The news has been alight recently with stories of large companies (Apple, Google, Facebook, Microsoft) setting up massive data centers in out-of-the-way locations. The reasons are several, and together they form a perfect storm of motivations for putting data centers in rural locations.

First off, data centers require a lot of square footage, and a million square feet in the greater San Francisco Bay Area is obviously a lot more expensive than in the wilds of Washington state. Construction costs are likely to be lower as well. Tax rates tend to be lower, too, and small country towns are more likely to offer tax incentives to bring in jobs.

Second, data centers don't really require a lot of high-end staffing, beyond a resident engineer or two to keep things humming. Security and maintenance staff are unskilled positions that can be filled as easily in the Ozarks as in downtown New York. And once again, those workers are likely to cost much less, because the cost of living in rural locations is so low.

Many rural locations also are close to low-cost power generation sources, such as dams. Since electricity is a major cost in data center budgets, getting your power locally can take a large bite out of operating budgets. In addition, there is a belt of climate that runs through areas such as Colorado that offers the ability to take advantage of open air cooling, rather than having to run costly air conditioning all the time.

There's also something to be said for geographic diversity. If The Big One ever hits San Francisco, it will take down pretty much any data center in the affected area. Scattering your eggs into multiple baskets is common sense.

The reason that this works for data centers, and less so for things such as manufacturing, is that all a data center needs to function is good connectivity. With so much dark fiber strung across the country, it's not that expensive to bring multiple "fat pipes" to even the most remote locations, especially when you factor in all the savings. Data centers don't need good rail infrastructure, highways, geographic centrality, or any of the other factors that drive location decisions for physical manufacturing.

Your mobile news roundup

Ignoring for the moment the continual cacophony of lawsuits and counter-suits that seem to be business as usual these days, there were actually some recent news items of note in the mobile space.

There's no publicly available API for developers to add their own mojo to Apple's Siri, but that hasn't stopped enterprising hackers from discovering a way to do a man-in-the-middle intercept and add their own functionality. Before anyone gets in a tizzy, it only works if the mobile user explicitly opts in by installing a self-signed SSL certificate so that HTTPS connections to the hacked Siri proxy succeed. It's not something that could be done behind your back. Once in place, you can insert your own Siri functionality by writing code in Ruby on the proxy server. I've tried it, and it's surprisingly easy to make Siri jump through hoops. Apple will probably close this loophole soon, but for the moment it offers a tantalizing look at how powerful general access to a voice interface could be. The word is that Apple is hiring two engineers to work on APIs for Siri. Evidently, Apple sees the potential, too.
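The actual proxy takes its plugins in Ruby, but the core idea (match the phrase Siri recognized, synthesize your own response) is simple enough to sketch in a few lines of JavaScript. Everything below is illustrative, not the project's real API:

```javascript
// Illustrative plugin registry in the style of the Siri proxy hack:
// match the recognized phrase and return a synthesized response,
// or fall through to Apple's servers untouched.
const plugins = [];

function listen(pattern, handler) {
  plugins.push({ pattern, handler });
}

function handlePhrase(phrase) {
  for (const { pattern, handler } of plugins) {
    const match = phrase.match(pattern);
    if (match) return handler(match);
  }
  return null; // no plugin claimed it; pass through to Apple
}

// A custom command Apple never shipped:
listen(/^turn (on|off) the lights$/i, (m) => `Okay, lights ${m[1]}.`);
```

Once the proxy sits in the middle, `handlePhrase("Turn on the lights")` answers locally, and everything else flows through to Cupertino as usual.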

In Android-land, Google announced that the 10 billionth app had been downloaded from the Android Market. By comparison, the 15 billionth iTunes app purchase occurred this summer, as announced at WWDC. There's no question that Android's velocity is greater than iOS's, due largely to the huge number of phones running Android now, most of which are significantly cheaper than an iPhone. Is it time to wonder if the iPhone is going to end up being the Betamax of phones, eventually done in by a flood of inexpensive competitors? Or will it end up being more like the MacBook, the choice of anyone who can afford one?

In Redmond, Microsoft is ramping up a developer infrastructure for its Windows 8 platform, and the company has decided to follow the app store model as well. It appears that Microsoft will be offering better terms for developers, something that shouldn't be surprising, given Microsoft's legendary loyalty to those who choose to follow the way of Windows.

November 30 2011

Developer Week in Review: Siri is the talk of the town

After a one-week hiatus, during which research was undertaken in waistline enhancement via the consumption of starch and protein materials, we're back to see what's been happening in the non-turkey-related fields.

Imitation is the sincerest form of flattery

It's an interesting time for the voice-enabled smartphone field. On the one hand, some industry pundits with vested interests are claiming that people don't want to talk to their phones and don't want them to be assistants. Perhaps they have forgotten that the original smartphones were offshoots of the PDA market, and that PDA doesn't stand for "public display of affection" in this case.

At the other extreme, we have Microsoft stating that Apple's Siri is just a knock-off of Windows Tellme, a claim that has been placed into question by several head-to-head comparisons of features.

Of most interest to the developer community are reports that the latest iOS beta release contains additional hooks to allow applications to integrate into Siri's voice recognition functionality. I talked about the possibility that Apple would be expanding the use of Siri into third-party apps a few weeks ago, and the new features in the beta seem to confirm that voice is going to be made available as a feature throughout applications. This would be a real game changer, in everything from games to GPS applications on the iOS platform.

Computer science for the masses

Two interesting pieces of news this time around on the educational front. In the higher-learning arena, Stanford is expanding its free online computer science courseware with several new classes, including one on machine learning. Although you can't earn a free degree this way, you can get computer-graded test results to go along with the recorded lectures. This material will be very useful, even to grizzled old veterans such as myself, who may have a hole or two in their theoretical underpinnings. For a bright high school student who has exhausted his or her school's CS offerings, it could also serve as a next step.

Meanwhile, in the UK, the government seems to be moving toward having all students learn the basics of programming. I worry about this, on two fronts. First, it is unclear if the majority of students really need to learn software engineering or would benefit from it. Force-feeding coding skills into students who may not have the aptitude or proclivity to want to learn them seems unwise to me and is likely to slow down the students who might actually have a desire to learn the subject. Second, I have my doubts that a government-designed software engineering curriculum would actually be any good.

Is there anything JavaScript can't do?

JavaScript is often derided by "serious" computer professionals as a poorly designed toy language unfit for "real" software engineering. Yet those who spend time using it know that you can produce some impressive results with it.

For example, there is now a JavaScript implementation of the OpenPGP message specification, which would allow JavaScript code to send and receive encrypted messages. And if you really want to go out on a limb, you could always develop a Java Virtual Machine byte code interpreter written entirely in JavaScript (somewhere, James Gosling is crying ...).
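That byte code interpreter is less absurd than it sounds; the heart of any stack-based virtual machine is just a loop over opcodes. Here's a toy stack machine in JavaScript to show the shape of the problem (the opcodes are made up, and real JVM bytecode is vastly richer):

```javascript
// Toy stack machine: the skeleton of what a JVM-in-JavaScript has to
// do, minus classes, garbage collection, and a couple hundred opcodes.
function run(bytecode) {
  const stack = [];
  for (const [op, arg] of bytecode) {
    switch (op) {
      case "push": stack.push(arg); break;
      case "add":  stack.push(stack.pop() + stack.pop()); break;
      case "mul":  stack.push(stack.pop() * stack.pop()); break;
      default: throw new Error("unknown opcode: " + op);
    }
  }
  return stack.pop();
}

// (2 + 3) * 4
run([["push", 2], ["push", 3], ["add"], ["push", 4], ["mul"]]); // → 20
```

Everything else in a real JVM, from the class loader to the garbage collector, is elaboration on that loop, which is exactly why somebody could build one in a browser.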

There's no question that JavaScript has its weak points, but its near-ubiquity makes it an incredibly useful spanner to carry around in your tool belt. Developers, sneer at your own risk. Like cockroaches, JavaScript may be around well after some more traditional languages have turned to dust.
