
April 05 2012

Editorial Radar with Mike Loukides & Mike Hendrickson

Mike Loukides and Mike Hendrickson, two of O'Reilly Media's editors, sat down recently to talk about what's on their editorial radars. Mike and Mike have almost 50 years of combined technical book publishing experience and I always enjoy listening to their insight.

In this session, they discuss what they see in the tech space including:

  • How 3D printing and personal manufacturing will revolutionize the way business is conducted in the U.S. [Discussed at the 00:43 mark]
  • The rise of mobile and device sensors, and how intelligence will be added to all sorts of devices. [Discussed at the 02:15 mark]
  • A clear winner in today's code space: JavaScript. With Node.js, D3, and HTML5, JavaScript is stepping up to the plate. [Discussed at the 04:12 mark]
  • A discussion of the best first language for teaching programming, and how we need to provide learners with instruction for the things they want to do. [Discussed at the 06:03 mark]

You can view the entire interview in the following video.

Next month, Mike and Mike will be talking about functional languages.

Fluent Conference: JavaScript & Beyond — Explore the changing worlds of JavaScript & HTML5 at the O'Reilly Fluent Conference (May 29 - 31 in San Francisco, Calif.).

Save 20% on registration with the code RADAR20

December 22 2011

Four short links: 22 December 2011

  1. Fuzzy String Matching in Python (Streamhacker) -- useful if you're to have any hope against the swelling dark forces powered by illiteracy and touchscreen keyboards.
  2. The Business of Illegal Data (Strata Conference) -- fascinating presentation on criminal use of big data. "The more data you produce, the happier criminals are to receive and use it. Big data is big business for organized crime, which represents 15% of GDP."
  3. Isarithmic Maps -- an alternative to choropleths for geodata visualization.
  4. Server-Side Javascript Injection (PDF) -- a Blackhat talk about exploiting backend vulnerabilities with techniques learned from attacking Javascript frontends. Both this paper and the accompanying talk will discuss security vulnerabilities that can arise when software developers create applications or modules for use with JavaScript-based server applications such as NoSQL database engines or Node.js web servers. In the worst-case scenario, an attacker can exploit these vulnerabilities to upload and execute arbitrary binary files on the server machine, effectively granting him full control over the server.

December 19 2011

Four short links: 19 December 2011

  1. The History of Version Control (Francis Irving) -- concise history of the key advances in managing source code versions. Worth it just for the delicious apposition of "history" and "version control".
  2. BrowserID -- Mozilla's authentication solution. BrowserID aims to provide a secure way of proving your identity to servers across the Internet, without having to create separate usernames and passwords each time. Instead of a new username, it uses your email address as your identity which allows it to be decentralized since anyone can send you an email verification message. It's currently implemented via JavaScript but hopefully it will be built into the browser in the future. (via Nelson Minar)
  3. A Look Inside Mobile Design Patterns -- Sample chapter on how different apps handle invitations, from a new [O'Reilly-published, huzzah!] book on mobile design patterns. (via David Kaneda)
  4. Node Toolbox -- concise compendium of resources for node.js development.

December 13 2011

Four short links: 13 December 2011

  1. Newton's Notebooks Digitised -- wonderful for historians, professional and amateur. I love (a) his handwriting; (b) the pages full of long division that remind us what an amazing time-saver the calculator and then the computer was; (c) the use of "yn" for "then" (the y is actually a thorn, pronounced "th", and it's from this that we get "ye", actually pronounced "the"). All that and chromatic separation of light, the inverse square law, and alchemical mysteries.
  2. Creative Commons Kicks Off 4.0 Round -- public discussion process around issues that will lead to a new version of the CC licenses.
  3. Shred -- an HTTP client library for node.js. (via Javascript Weekly)
  4. Holding Back the Age of Data (Redmonk) -- Absent a market with well understood licensing and distribution mechanisms, each data negotiation - whether the subject is attribution, exclusivity, license, price or all of the above - is a one off. Very good essay into the evolution of a mature software industry into an immature data industry.

November 28 2011

Four short links: 28 November 2011

  1. Twine (Kickstarter) -- modular sensors with connectivity, programmable in If This Then That style. (via TechCrunch)
  2. Small Sample Sizes Lead to High Margins of Error -- a reminder that all the stats in the world won't help you when you don't have enough data to meaningfully analyse.
  3. Yahoo! Cocktails -- somehow I missed this announcement of a Javascript front-and-back-end dev environment from Yahoo!, which they say will be open sourced 1Q2012. Until then it's PRware, but I like that people are continuing to find new ways to improve the experience of building web applications. A Jobsian sense of elegance, ease, and perfection does not underlie the current web development experience.
  4. UK Govt To Help Businesses Fight Cybercrime (Guardian) -- I view this as a good thing, even though the conspiracy nut in me says that it's a step along the path that ends with the spy agency committing cybercrime to assist businesses.

July 28 2011

Four short links: 28 July 2011

  1. 23andMe Disproves Its Own Business Model -- a hostile article talking about how there's little predictive power in genetics for diabetes and Parkinson's so what's the point of buying a 23andMe subscription? The wider issue is that, as we've known for a while, mapping out your genome only helps with a few clearcut conditions. For most medical things that we care about, environment is critical too--but that doesn't mean that personalized genomics won't help us better target therapies.
  2. jsftp -- lightweight implementation of FTP client protocol for NodeJS. (via Sergi Mansilla)
  3. Really Bad Workshops -- PDF eBook with rock-solid advice for everyone who runs a workshop.
  4. PigEditor (GitHub) -- Eclipse plugin for those working with Pig and Hadoop. (via Josh Patterson)

July 13 2011

Four short links: 13 July 2011

  1. Freebase in Node.js (github) -- handy library for interacting with Freebase from node code. (via Rob McKinnon)
  2. Formalize -- CSS library to provide a standard style for form elements. (via Emma Jane Hogbin)
  3. Suggesting More Friends Using the Implicit Social Graph (PDF) -- Google paper on the algorithm behind Friend Suggest. Related: Katango. (via Big Data)
  4. Dyslexia -- a typeface for dyslexics. (via Richard Soderberg)

July 08 2011

Top stories: July 4-8, 2011

Here's a look at the top stories published across O'Reilly sites this week.


Seven reasons you should use Java again
To mark the launch of Java 7, here's seven reasons why Java is worth your time and worth another look.
What is Node.js?
Learning Node might take a little effort, but it's going to pay off. Why? Because you're afforded solutions to your web application problems that require only JavaScript to solve.
3 Android predictions: In your home, in your clothes, in your car
"Learning Android" author Marko Gargenta believes Android will soon be a fixture in our homes, in our clothes and in our vehicles. Here he explains why and how this will happen.
Into the wild and back again
Burnt out from years of school and tech work, Ryo Chijiiwa quit his job and moved off the grid. In this interview, Chijiiwa talks about how solitude and time in the wilderness has changed his perspective on work and life.
Data journalism, data tools, and the newsroom stack
The MIT Civic Media conference and 2011 Knight News Challenge winners made it clear that data journalism and data tools will play key roles in the future of media and open government.




OSCON Java 2011, being held July 25-27 in Portland, Ore., is focused on open source technologies that make up the Java ecosystem. Save 20% on registration with the code OS11RAD


July 06 2011

What is Node.js?

Node.js. It's the latest in a long line of "Are you cool enough to use me?" programming languages, APIs, and toolkits. In that sense, it lands squarely in the tradition of Rails, and Ajax, and Hadoop, and even to some degree iPhone programming and HTML5. Go to a big technical conference, and you'll almost certainly find a few talks on Node.js, although most will fly far over the head of the common mortal programmer.

Dig a little deeper, and you'll hear that Node.js (or, as many more briefly call it, simply "Node") is a server-side solution for JavaScript, and in particular, for receiving and responding to HTTP requests. If that doesn't completely boggle your mind, then by the time the conversation heats up with discussion of ports, sockets, and threads, you'll tend to glaze over. Is this really JavaScript? In fact, why in the world would anyone want to run JavaScript outside of a browser, let alone on a server?

The good news is that you're hearing (and thinking) about the right things. Node really is concerned with network programming and server-side request/response processing. The bad news is that like Rails, Ajax, and Hadoop before it, there's precious little clear information available. There will be, in time — as there now is for these other "cool" frameworks that have matured — but why wait for a book or tutorial when you might be able to use Node today, and dramatically improve the maintainability of your code and even the ease with which you bring on programmers?

A warning to the Node experts out there

Node is like most technologies that are new to the masses, but old hat to the experienced few: it's opaque and weird to most but completely usable for a small group. The result is that if you've never worked with Node, you're going to need to start with some pretty basic server-side scripts. Take your time making sure you know what's going on, because while this is JavaScript, it's not operating like the client-side JavaScript you're used to. In fact, you're going to have to twist your JavaScript brain around event loops and waiting and even a bit of network theory.

Unfortunately, this means that if you've been working and playing with Node for a year or two, much of this article is going to seem pedestrian and overly simplistic. You'll look for things like using Node on the client, heavy theory discussions of evented I/O and reactor patterns, and npm. The reality is that while all of that is interesting (and advances Node to some pretty epic status), it's incomprehensible to someone just starting out. Given that, maybe you should pass this piece on to your co-workers who don't know Node, and then, once they're buying into Node's usefulness, start to bring them along on the more advanced Node use cases.

Node Day at OSCON — This year's OSCON features a day-long dive into Node on Tuesday, July 26. Join experts and users from the Node community to discuss best practices and future developments, and survey the ever-growing number of Node frameworks and plugins.

Save 20% on registration with the code OS11RAD

Node: A few basic examples

First things first: you need to realize that Node is intended to be used for running standalone JavaScript programs. This isn't a file referenced by a piece of HTML and running in a browser. It's a file sitting on a file system, executed by the Node program, running as what amounts to a daemon, listening on a particular port.


Skipping hello world


The classic example here is "Hello World," detailed on the Node website. Almost everyone starts with Hello World, though, so check that out on your own, and skip straight to something a lot more interesting: a server that can send static files, not just a single line of text:

var sys = require("sys"),
    http = require("http"),
    url = require("url"),
    path = require("path"),
    fs = require("fs");

http.createServer(function(request, response) {
  var uri = url.parse(request.url).pathname;
  var filename = path.join(process.cwd(), uri);
  path.exists(filename, function(exists) {
    if(!exists) {
      response.writeHead(404, {"Content-Type": "text/plain"});
      response.end("404 Not Found\n");
      return;
    }

    fs.readFile(filename, "binary", function(err, file) {
      if(err) {
        response.writeHead(500, {"Content-Type": "text/plain"});
        response.end(err + "\n");
        return;
      }

      response.writeHead(200);
      response.end(file, "binary");
    });
  });
}).listen(8080);

console.log("Server running at http://localhost:8080/");

Thanks much to Mike Amundsen for the pointer to similar code. This particular example was posted by Devon Govett on the Nettuts+ training blog, although it's been updated for the current version of Node in a number of places. Devon's entire tutorial post is actually a great companion piece on getting up to speed on Node once you have a handle on the basics.

If you're new to Node, type this code into a text file and save the file as NodeFileServer.js. Then head out to the Node website and download Node or check it out from the git repository. You'll need to build the code from source; if you're new to Unix, make, and configure, then check out the online build instructions for help.

Node runs JavaScript, but isn't JavaScript

Don't worry that you've put aside NodeFileServer.js for a moment; you'll come back to it and more JavaScript shortly. For now, soak in the realization that you've just run through the classic Unix configuration and build process:

./configure
make
make install

That should come with another realization: Node itself isn't JavaScript. Node is a program for running JavaScript, but isn't JavaScript itself. In fact, Node is a C program. Do a directory listing on the Node/src directory and you'll see something like this:

[Screenshot: an ls listing of the Node src directory.]

For all of you thinking that JavaScript is a poor language in which to be writing server-side tools, you're half right. Yes, JavaScript is not equipped to deal with operating system-level sockets and network connectivity. But Node isn't written in JavaScript; it's written in C, a language perfectly capable of doing the grunt work and heavy lifting required for networking. JavaScript is perfectly capable of sending instructions to a C program that can be carried out in the dungeons of your OS. In fact, JavaScript is far more accessible than C to most programmers — something worth noting now, and that will come up again and again in the reasons for looking seriously at Node.

The primary usage of Node further reflects that while Node works with JavaScript, it isn't itself JavaScript. You run it from the command line:

$ export PATH=$HOME/local/Node/bin:$PATH
$ cd ~/examples
$ node NodeFileServer.js
Server running at http://localhost:8080/

And there you have it. While there's a lot more to be said about that status line, and about what's really going on at port 8080, the big news here is that Node is a program that you feed JavaScript. What Node then does with that JavaScript isn't worth much ink; to some degree, just accept that what it does, it does. This frees you up to write JavaScript, not worry about learning C. Heck, a big appeal of Node is that you can actually write a server without worrying about C. That's the point.

Interacting with a "Node server"

Make sure you still have your NodeFileServer.js code running via Node. Then you can hit your local machine on port 8080 and see this unremarkable output.

[Screenshot: the file server's output in a browser.]

Yes, this is about as mundane as you can get. Well, that is, until you realize that you've actually written a file server in about 20 lines of code. The output you see — the actual code of the script you wrote — isn't canned in the script itself. It's being served from the file system. Throw an image into the same directory, and simply add the name of the image to the end of your URL, like http://localhost:8080/my_image.png:

[Screenshot: the image served up by Node in a browser.]

Node happily serves this binary image up. That's pretty remarkable when you again consider the brevity of the code. On top of that, think about how hard it would normally be to write your own server code, let alone code that handles multiple simultaneous requests. (That's a hint; open up four, five, or 10 browsers and hit the server.) The beauty of Node is that you can write entirely simple and mundane JavaScript to get these results.

A quick line-by-line primer

There's a lot more to talk about around Node than in the actual code that runs a server. Still, it's worth taking a blisteringly fast cruise through NodeFileServer.js before moving on. Take another look at the code:

var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, "127.0.0.1");
console.log('Server running at http://127.0.0.1:1337/');

First, you have a call to a function called require(). The use of require() has been a long-standing request by programmers. You can actually find this mentioned in some of the discussions on JavaScript modularity, as well as germane to CommonJS, and a pretty cool implementation by O'Reilly author David Flanagan from 2009. In other words, require() may be new to you, but it isn't an untested, careless piece of Node code. It's core to using modular JavaScript, and something of which Node takes heavy advantage.

Then, the resulting http variable is used to create a server. That server is handed a function block to run when it's contacted. This particular function ignores the request completely and just writes out a response, in text/plain, saying simply "Hello World\n". Pretty straightforward stuff.

In fact, this lays out the standard pattern for Node usage:

  1. Define the type of interaction and get a variable for working with that interaction (via require()).
  2. Create a new server (via createServer()).
  3. Hand that server a function for handling requests.
    • The request handling function should include a request ...
    • ... and a response.
  4. Tell the server to start handling requests on a specific port and IP (via listen).

Lost in translation

The ease with which you can get a server coded in JavaScript (regardless of whether the actual code-running facility is C or anything else) still begs the question: should you write a server in JavaScript? To really get a handle on the answer to this question, consider a pretty typical use case.

The JSON round trip

You've got a typical web application, HTML front-end with CSS styling, and JavaScript for the validation and communication with a server. And because you're up on the interactive web, you're using Ajax and not relying purely on a form's POST to get data to and from the server. If this is you, then you're probably comfortable with JSON, too, as that's the almost de facto means of sending data across the wire these days.

So you've got an Ajax request that says, for example, "give me more information about some particular guitar on an online auction site." That request gets thrown across the network to a PHP program running on a server somewhere. The PHP server has to send a lot of information back to the JavaScript requestor, and it's got to send that information in some format that JavaScript can unpack. So the return information is bundled up into an array, which can then be converted to JSON, sort of like this:

$itemGuitar = array(
  'id' => 'itemGuitar',
  'description' => 'Pete Townshend once played this guitar while his own axe ' .
                   'was in the shop having bits of drumkit removed from it.',
  'price' => 5695.99,
  'urls' => array('http://www.thewho.com', 'http://en.wikipedia.com/wiki/Pete_Townshend')
);

$output = json_encode($itemGuitar);
print($output);

Back on the client, the JavaScript gets this chunk of information, which has changed slightly because of JSON and transmission. The client basically gets something like this:

{
  "id": "itemGuitar",
  "description": "Pete Townshend once played this guitar...",
  "price": 5695.99,
  "urls": ["http://www.thewho.com", "http://en.wikipedia.com/wiki/Pete_Townshend"]
}

This is pretty standard fare. Then, it's easy to convert this text "thing" into an object in JavaScript. You just call eval(), like this:

var itemDetails = eval('(' + jsonDataString + ')');

The result is a nice JavaScript object with properties that match up to the JSON array-like structure. Of course, since the jsonDataString usually is returned from a server, you're more likely to see code like this:

var itemDetails = eval('(' + request.responseText + ')');

This is the typical JSON round trip. But there are problems here ... big, big problems.

Subtlety and nuance destroy solid code

First, there's a major problem in that this sort of code relies heavily on a translator. In this case, the translator is the JSON interpreter and related code, and there are in fact two dependencies: a JSON interpreter for JavaScript, in the form of what eval() does with the response text, and the JSON interpreter for PHP. As of PHP 5.2.0, that interpreter is included with PHP, but it's still essentially an external dependency, separate from the core of PHP.

Now, this isn't a rant about translation itself. There's nothing to suggest that there are problems in taking, say, an "l" and turning it into an "i", or something that's item 1 in an array and reporting it as being item 2 in an array. There's a lot of testing that occurs before JSON tools are ever released to ensure that what gets reported is correct, and accurate round tripping from a client to a server and back again are possible. Lots and lots and lots of testing is involved ...

And that is in fact a problem.

The dependency of JavaScript and PHP (and C and Lisp and Clojure and Eiffel and ... well, see the figure below for all the JSON toolkits floating around for a ton of different languages) on a toolkit is a huge issue. In other words, the problem isn't the translation but the translator. While programming languages evolve slowly, the uses to which those languages are applied are growing quickly. The result is that JSON is being put to use in areas of complexity that simply didn't exist, or went untouched, even a few months ago. And with each new iteration (each new depth of recursion and combination of data types), it's possible that an area is discovered that the translator doesn't support.

[Figure: a selection of JSON toolkits for many different languages.]

That's not in itself bad. In fact, it argues for the popularity of JSON that it's constantly put to new use. But with the "new" comes the "does it support the new?" So JSON has to evolve from time to time, and that means testing, and retesting, and release on tons of platforms. You, the programmer, may have to rearrange your data; or wait on a release to support your needs; or hack at JSON yourself. Again, many of these are just the so-called costs of programming.

But imagine you could ditch the translation — and therefore the translator — altogether. Imagine you could write, not JSON round tripping, but JavaScript end to end.

That's the promise of Node. All the text you've just read — about PHP including JSON in 5.2.0 but not before, about arrays becoming objects, about data being configured in new ways and requiring new things from JSON — it all goes away when you have JavaScript sending data and receiving and responding to that data.

eval() in JavaScript is (Potentially) the Devil

As if that's not enough reason to look seriously at Node, there's the pesky issue of running eval() on a string. It's long been accepted that eval() is dangerous stuff. It runs code that you can only see as textual data; it's the equivalent of that "Run Your SQL by typing it in here" unvalidated text box, open to SQL injection and malicious intent. It's quite possible that every time eval() is passed in a string, a puppy somewhere in the Midwest shivers and a mom on the Eastern Seaboard stubs her toe and curses. It's that precarious. There's plenty to read about online, and it's not worth going into in detail here. Just Google "eval JavaScript evil" or "eval JavaScript injection" to get a good taste of the issues.

Still, Node without any context doesn't allow you to avoid eval(), so there are potentially still shivering puppies out there. However, Node used as it's intended absolutely gets you around the typical eval() problems. Node is often called evented JavaScript or evented I/O, and that little word — "evented" — is hugely important. But to get a hold of what evented really means, and why it gets you out of the dangers of eval(), you've got to understand not just how JSON is typically round tripped in applications, but how the very structure of applications on the web are typically architected.

Today's web is a big-event web

Typical web forms are "big-event" submitters. In other words, lots of data entry and selection happens — a user fills out text boxes, selects choices from combo boxes, selects items from a list, and so on — and then all of that information is submitted to a server. There's a single "big event" from the programming perspective: the submission of all that form data, usually through a POST. That's pretty much how the web operated, pre-Ajax.

Sending lots of data at one time

With Ajax, there is a little more of what's called evented programming. There are more events that trigger interaction with the server. The classic case is the entry of a zip code, and then a resulting call to the server to get a city and state. With Ajax and the XmlHttpRequest, tons of data didn't have to be gobbed up and thrown to the server all at once. However, that doesn't change the reality that the web is still mostly a big-event place. Ajax is used far more often to achieve interesting visuals, do quick validations, and submit forms without leaving a page than it is to create truly evented web pages. So even though a form isn't submitting a big gob of information with a POST, an Ajax request is doing the same thing.

Honestly, that's only partly the fault of less-than-creative Ajax programmers. Every time you send off a request, no matter how small, there's a lot of network traffic going on. A server has to respond to that request, usually with a new process in its own thread. So if you really move to an evented model, where you might have 10 or 15 individual micro-requests going from a single page to a server, you're going to have 10 or 15 threads (maybe fewer, depending on how threads are pooled and how quickly they're reclaimed on the server) firing up. Now multiply that by 1,000 or 10,000 or 1,000,000 copies of a given page floating around ... and you could have chaos. Network slowdown. System crashes.

The result is that, in most cases, the Web needs to be, at a minimum, a medium-event place. The result of this concession is that server-side programs aren't sending back tiny responses to very small and focused requests. They're sending back multiple bits of data, and that requires JSON, and then you're back to the eval() problem. The problem is eval(), sure, but the problem is also — from a certain perspective, at least — the nature of the web and threading and HTTP traffic between a web page and a server-side program responding to that request.

(Some of you more advanced JavaScript folks are screaming at this point, because you know better than to use eval(). Instead, you're using something like JSON.parse() instead of eval(). And there are also some compelling arguments for careful usage of eval(). These are things worth screaming about. Still, just see how many questions there are surrounding eval() on sites like Stack Overflow and you'll realize that most folks don't use eval() correctly or safely. It's a problem, because there are lots of intermediate programmers who just aren't aware of the issues around eval().)

Sending a little data at all times

Node brings a different approach to the party: it seeks to move you and your web applications to an evented model, or if you like, a "small event" model. In other words, instead of sending a few requests with lots of data, you should be sending tons of requests, on lots of events, with tiny bits of data, or requests that need a response with only a tiny bit of data. In some cases, you have to almost recall your GUI programming. (All the Java Swing folks can finally use their pent-up GUI knowledge.) So a user enters their first and last name, and while they're moving to the next box, a request is already requesting validation of just that name against existing names. The same is true for zip codes, and addresses, and phone numbers. There's a constant stream of requesting and responding happening, tied to almost every conceivable event on a page.

So what's the difference? Why is this possible with Node, and aren't the same threading issues present here? Well, no, they're not. Node's own site explains its philosophy best:

Node's goal is to provide an easy way to build scalable network programs. In the "hello world" web server example ... many client connections can be handled concurrently. Node tells the operating system (through epoll, kqueue, /dev/poll, or select) that it should be notified when a new connection is made, and then it goes to sleep. If someone new connects, then it executes the callback. Each connection is only a small heap allocation.

Node has no blocking, no threads competing for the same resource (Node is happy to just let things happen however they happen), and nothing that has to start up upon request. Node just sits around waiting (quite literally; unused Node responders are sleeping). When a request comes in, it's handled. This results in very fast code, without requiring uber-programmers to write the server-side behavior.

Yes, chaos can ensue

It's worth pointing out that this model does allow all the problems of any non-blocking system to come into play: one process (not thread) writing to a data store while another grabs just-invalidated data; intrusions into what amounts to a transaction; and so on. But realize that the majority of event-based programming on a web form is read-only! How often are you actually modifying data in a micro-request? Very rarely. Instead, there's constant validation, data lookup, and querying going on. In these cases, it's better to just fire away with the requests. The database itself may add some locking, but in general, good databases will handle this much more efficiently than server-side code anyway; and they'll certainly handle things better than an operating system spinning threads up and down for a generic "a web response came in" process.

Additionally, Node does have plans to allow for process forking, and the HTML5 Web Workers API is the engine that will probably make this feature go. Even so, if you move to an evented model for your web application, you'll want threading in less than one out of 100 situations. The biggest changes, then, are in how you think about your web applications, and how often you send and receive data from a server, rather than in how Node works.

In the right place at the right time

There's another web pattern at work here, and it's probably far more important than whether you use Node or not, and how evented your web applications are. It's simply this: use different solutions for different problems. Even better, use the right solution for a particular problem, regardless of whether that's the solution you've been using for all your other problems.

The inertia of familiarity

There's a certain inertia in not just web design, but all of programming. That inertia can be stated axiomatically like this: the more you learn, use, and become good at a certain approach or technique or language, the more likely you are to use that approach/technique/language widely. It's one of those principles that sounds good until you dig deeply. Yes, it's good to learn a language or toolkit well, and to employ it widely. But this inertia often causes you to use a tool because you know it, rather than because it's the right tool.

Look at Ajax, something already discussed. Initially, Ajax provided a solid approach to sending quick requests, without form submissions, to a server. Now it's become a drop-in replacement for all form submissions. That's taking a technology, learning it, applying it, and then eventually over-applying it. There's still a solid place for form submissions — when a form needs to be submitted! As simple as it sounds, there are tens of thousands of web applications submitting forms with Ajax, just because the lead web developer is up on Ajax.

In the same vein, it's possible to get excited about Node — probably because you buy into all the splendid and wise observations you've been reading — and then use it everywhere. Suddenly, you're replacing all your PHP and Perl back-ends with Node. The result? A mess. In fact, you'll be forced to have several web forms do just what Node isn't meant for: submit big chunks of data to JavaScript on the server via Node, and force that JavaScript to either send back a chunk of JSON that's got to be parsed or eval()ed, or send back a full-blown HTML page or an HTTP redirect.

But that's simply not what Node is best at. It's great at micro-requests; at evented I/O. Use Node for quick communication between a web page and a server. Use form submissions to send big chunks of data to the server. Use PHP and Perl to do heavy database lifting and generate dynamic HTML pages. Use Node to provide a means for server-side JavaScript to run and handle small requests. Throw in Rails and Spring and servlets and whatever else you need. But make your decisions based upon the problem you're solving, rather than what you happen to know best at the time.

Node's promise of simplicity

There's one last note worth making. When you take this broad approach to programming, you'll often find that you're not having to go as deeply into each toolkit, API, and framework you use. By using your tools for what they're best at, you don't need to be able to staple with hammers or measure with drills. Using tools for their intended purpose typically means you use the core capabilities more. So while you're creating generalists — programmers that know lots of things — you are also reducing the need for specialists — programmers that know one or two things really, really well. Of course, every pointy-haired boss also realizes that those specialists are really, really expensive and hard to find.

Learning Node might take a little effort, but it's going to pay off. Why? Because you're afforded solutions to your web application problems that require only JavaScript to solve. That means your existing JavaScript expertise comes into play. And when you do need to use PHP or Perl — because it's the right solution for a particular problem — you don't need a PHP or Perl guru. You need to know the basics, and those needs can be expanded when the problem requires expansion. You stretch at the behest of new problems, rather than stretching poor solutions thin.

Your biggest challenge is the continual move to a web that is made up of smaller pieces, talking more often, and the combination of what can seem like a dizzying array of technologies. However, taking the core features of 100 technologies is always going to serve you better than taking 100% of one technology and trying to solve 100 problems. Node and evented I/O isn't a solution to every problem, but it sure is a solution to some important problems.


OSCON 2011, coming up later this month, will feature an entire day dedicated to Node. Learn more about Node Day and save 20% on registration with the code OS11RAD.



June 10 2011

Radar's top stories: June 6-10, 2011

Here's a look at the top stories published on Radar this week.


Facebook's face recognition strategy may be just the ticket
Facebook's face recognition may provide a great strategy for cutting the Gordian Knot on this thorny privacy problem.
Why a JavaScript hater thinks everyone needs to learn JavaScript in the next year
If you've avoided JavaScript, this is the year to learn it. And if you don't, you risk being left behind.
The secrets of Node's success
What is it about Node.js that makes it interesting to developers? The key factors are performance, timing, and focusing on a real problem that wasn't easily solved with other server-side dynamic languages.
Algorithms are the new medical tests
Predictive Medical Technologies' system uses real-time, intensive care unit monitoring data to predict cardiac arrest and other health events. CEO Bryan Hughes discusses the system and the application of diagnostic data.
Google Correlate: Your data, Google's computing power
Google Correlate is a new tool in Google Labs that lets you upload state- or time-based data to see what search trends most correlate with that information. Here's a look at how it works and what you can do with it.





OSCON Java 2011, being held July 25-27 in Portland, Ore., is focused on open source technologies that make up the Java ecosystem. Save 20% on registration with the code OS11RAD


June 09 2011

JavaScript spread to the edges and became permanent in the process

At some point, when nobody was looking, JavaScript evolved from humble beginnings to become an important and full-fledged development language.

James Duncan (@jamesaduncan), the chief architect at Joyent, is one of the people who's now putting JavaScript in unexpected places. In his case, it's using JavaScript as a web server development language through the Node.js platform.

In the following interview, Duncan shares his thoughts on JavaScript's growth and how we came to depend so heavily on the language.

For a lot of us who have been in the industry for a while, JavaScript has always seemed like a toy language. When did that change?

James Duncan: The big change started when Google launched Maps. All of a sudden, there was this highly-interactive, highly-usable, fun-to-play with web interface. It just blew away almost every other map provider's offering, and it was all constructed in JavaScript. Everyone sat up and took notice.

What's happening now is that JavaScript is installed on millions and millions of edges of our network. If you think of every web-accessing device as an edge on the network, then JavaScript is in all of them. Every web browser has it, every smartphone has it.

In the same way that C has become permanent at the systems level, with JavaScript, what you've got is a situation where it's on so many edges of the network, it really can never be eliminated. For better or worse, JavaScript will be there in 15 years.

Save 50% - JavaScript Ebooks and Videos

JavaScript is everywhere: servers, rich web client libraries, HTML5, databases, even JavaScript-based languages. If you've avoided JavaScript, this is the year to learn it. And if you don't, you risk being left behind. Whatever your level, we have you covered:

Introductory  /  Intermediate  /  Advanced

One week only—offer expires 14 June. Use discount code HALFD in the shopping cart. Buy now and SAVE.



There is still a lingering perception that JavaScript isn't a first-class programming language, though. Some vendors that use it for scripting in their products pitch it as an easier language to use.


James Duncan: I've long called the JavaScript programming community the "dark development team in the sky" because JavaScript requires a skill set that no one admits to having, but everyone actually has. I can use this same positioning to explain why it's important: You walk into an enterprise and say, "Why are you doing all of this high-ceremony .NET/Objective C-style development? It's just not necessary." Because you can just do it in JavaScript, you already have the skills. That makes it an easier sell.

I don't think JavaScript is any less capable than any other language, per se. It's just that what goes into that JavaScript stack is very much defined by what it's embedded into. Because it originates in the browser, we think of it as being easy and unsophisticated, but if you dive down, it absolutely is sophisticated, capable and powerful. It's just that our mindset — and certainly the corporate mindset — has been that JavaScript is only used for making pop-ups in web pages.

In terms of the browser, some of us remember the nightmares of cross-browser JavaScript compatibility issues. Are they still around? And is HTML5 going to be the next iteration of the same problems?

James Duncan: You're right, the browser environment is a nightmare. That's becoming less true just because of the nature of libraries like jQuery and Prototype. Those libraries have improved circumstances so that, for the most part, if someone's writing code in jQuery and really sticking to jQuery, they're probably not going to have many problems anymore. Or if they do run into issues, they're either impossible to fix because something is fundamentally broken or the fix is trivial. There doesn't seem to be much in between.

HTML5 is fascinating. If you look at things like WebSocket and the Web Worker specs, they seem to be implemented pretty uniformly from browser to browser. So I think it's getting better. The browser vendors have learned their lesson: Trying to fight the standard isn't sensible. Even Microsoft with IE has gotten much better with this — they've got a long way to go, but they're much better than they were.

We're currently seeing somewhat of an arms race regarding JavaScript execution speed. Is there an end in sight?

James Duncan: The cool thing is that we're just at the tip of the iceberg. There's no doubt we're going to see huge performance improvements, which we've already seen in part over the last year. It's going to plateau eventually, but I think there is still room for it to get faster and faster. We've had decades of compiler research going into making the fastest or most optimizing C compiler possible. That sort of research spend has not happened for dynamic languages in non-academic environments before, with the possible exception of Smalltalk, but it's happening now around JavaScript. So you're still going to see performance improvements.

A hot topic in computing is parallel programming in languages such as Erlang. Is JavaScript going to join the party?

James Duncan: Node, in some ways, has a similar view: it's all asynchronous programming, all asynchronous I/O. Where it steps away from what Erlang is doing is that it explicitly says, "You shall run as one process." You get one CPU to play with. What we're going to see inside the Node space is some capability to have messaging between CPU cores and, therefore, Node processes running on different CPU cores. I don't know if anyone's going to go to the same extreme as Erlang, but some of the things Node is doing are similar.

What would you like to see changed in JavaScript?

James Duncan: For me, the two points are: One, don't break the web, and two, it would be great to see some sort of concrete class system. That's not to say I dislike the prototype-based object orientation within JavaScript, but I think a lot of people get confused by its capabilities. Rather than understanding it as sort of a liberating thing, they see it as a limiting thing. Putting some syntactic sugar around classes would go a long way toward easing people into the language from other sources.

This interview was edited and condensed.

Associated photo on home and category pages: Node Globe Pics by Dan Zen, on Flickr


Learn about Node.js this July at the O'Reilly Open Source Convention in Portland, Oregon. For a comprehensive orientation to Node, attend Monday's Node tutorials and then join us all day Tuesday at Node Day for total immersion in the people and technologies making up the Node ecosystem. Save 20% on registration with the code OS11RAD




June 08 2011

The secrets of Node's success

In the short time since its initial release in late 2009, Node.js has captured the interest of thousands of experienced developers, grown a package manager and a corpus of interesting modules and applications, and even spawned a number of startups.

What is it about this technology that makes it interesting to developers? And why has it succeeded while other server-side JavaScript implementations linger in obscurity or fail altogether?

The key factors are performance, timing, and focusing on a real problem that wasn't easily solved with other server-side dynamic languages.

Browser wars and JavaScript performance

In the early 2000s, AJAX web development was coming into its own and placing increasing demands on browsers' JavaScript engines. New JavaScript libraries such as YUI, Dojo and jQuery were allowing developers to do much more with web user interface (UI), creating a user experience for web applications that mimicked the behavior of desktop applications.

As JavaScript libraries and websites became more complex and users started to notice poor performance in their browsers, browser developers started to focus seriously on their JavaScript engines.

The race for faster JavaScript engines heated up in September 2008 when Google released Chrome and the Chromium source code. The engine behind it was V8 and it outperformed all others. This helped spur the developers of Firefox, Safari, Opera and Internet Explorer to improve JavaScript performance in their browsers and it opened a new front in the browser wars.

Technically speaking, V8 takes a slightly novel approach to improving performance. Certain JavaScript objects are dynamically compiled directly into native machine code before execution based on a predictive analysis of the code.

This, along with a new approach to property access and a more efficient garbage collection system enabled Chrome to initially post significantly faster benchmarks than other browsers.


The other browsers responded with improved or completely rewritten JavaScript engines that matched or exceeded V8's benchmarks. These optimizations are still going on, and Google's V8 is benefiting from the healthy, often technically brilliant, competition. Compared to the interpreters for server-side dynamic languages like Ruby, Python, PHP and Perl, JavaScript now has several efficient and incredibly fast runtimes.

Ryan Dahl, creator of Node.js, chose the V8 engine for Node. This has an additional benefit for a server-side implementation.

The predictive optimization of JavaScript works fairly well in the Chrome browser, but it is much more effective for server applications where the same chunks of code tend to be run multiple times. V8 is able to refine its optimizations and soon ends up with very efficient cached machine code.

Node has an additional performance advantage (a big one) that is not directly tied to V8, but we'll get to that in a bit.



The rehabilitation of JavaScript


JavaScript was once widely regarded as an awful hack of a language. Many programmers still feel this way, but the prejudice is starting to fade, mostly because there is a growing body of good code that shows off the language.

One person who has done much to pinpoint JavaScript's technical weak points is Douglas Crockford. Fortunately, instead of stopping there, he has also created JSLint and written "JavaScript: The Good Parts" to help developers write better code while avoiding most of the "bad parts" of the language. In his presentations and posts, one of his core assertions is that:

... despite JavaScript's astonishing shortcomings, deep down, in its core, it got something very right. When you peel away the cruft, there is an expressive and powerful programming language there. That language is being used well in many Ajax libraries to manage and augment the DOM, producing an application platform for interactive applications delivered as web pages. Ajax has become popular because JavaScript works. It works surprisingly well.

Without getting into the details of which parts are good or bad, we have seen in the past few years that professional developers have come to realize that JavaScript is not going away. Many developers have gotten on with the task of building complex, well-designed applications and libraries. There are still problems with JavaScript and with its specification, but programmers are now much less likely to dismiss it out of hand.

Previous server-side JavaScript frameworks had a much harder time overcoming the negative mindset about the language. By the time Node arrived, JavaScript had overcome most of its image problem.

Node.js solves a real problem

Wikipedia has a fairly comprehensive "Comparison of server-side JavaScript solutions". Node is in there, but most of the others listed are not nearly so well known. The use of the term "solutions" is interesting, as most of these projects are solutions to problems that have already been solved by other languages.

Python, Java, Ruby, PHP, Perl and others are all still extremely good choices for most types of dynamic web applications. They talk to databases, crunch numbers, validate data, and parse templates. They are high-level languages, and there are several MVC frameworks for each of them for quick web app creation. Node is sometimes touted as the next Ruby on Rails, but this is a bad comparison and misses the point of what Node is for.

Node is not trying to solve the same problems as Rails, and it's not competing head-on with any of the other languages or frameworks in the areas where they do well. It was made for, and is most successful at, solving a special set of problems with modern web applications. What can it do that these other languages cannot?

It turns out that what JavaScript can do is the flip side of something it can't do: blocking I/O.

Evented I/O

JavaScript itself can't actually read or write to the filesystem. This ability was omitted from the language because it wasn't necessary for its job in the browser, so Node was able to start from scratch with an I/O system based on event loops.

Node is all about "evented I/O," but what does that actually mean?

To those of us who are either not programmers or are not familiar with event loops, an analogy might help.

You're in a grocery store with a list of items to buy. You wheel your cart around the store, pick up one item at a time, put it in your cart, then take the cart through the checkout. You can optimize this slightly by fetching the items in a sane order, but you can't go get the milk while you're waiting at the deli counter.

If you're in a hurry, you might start thinking of crazy ways to speed up the process. You could enlist a number of other shoppers with shopping carts and send each out to buy a single item. This would create bottlenecks in narrow aisles and a huge traffic jam at the checkout.

This is clearly an insane way to solve the issue because it throws more shopping carts and cash registers at the problem than needed.

Programming languages that block on I/O often try to solve similar problems by spawning additional threads or processes (cf. Apache, sendmail). This can be expensive in terms of memory usage, and an analysis of Python's Global Interpreter Lock shows just how expensive the traffic jam can be in terms of CPU utilization.

JavaScript and Node use event loops and callbacks to approach the problem differently.

Returning to the shopping example: If you had a group of kids along with you on your shopping trip, you could send each off to get a single item and return them to the cart. When they've all returned, you can proceed through the checkout. The time taken in fetching items would be the maximum time for retrieving a single item (the kid who had to wait at the deli counter), rather than the sum (picking up the items in sequence). Using runners for the small, simple task of fetching items is a more efficient way of parallelizing the problem than sending out full-fledged shoppers and carts.
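The kids-with-a-shopping-list pattern can be sketched as a small helper. The function name and shape here are illustrative, not part of Node's API: start every fetch at once, and fire a single callback when the last one reports back.

```javascript
// Start all fetches in parallel; call done() once, when every callback
// has fired. Total wait is roughly the slowest fetch, not the sum.
function fetchAll(items, fetchOne, done) {
  const results = [];
  let remaining = items.length;
  items.forEach((item, i) => {
    fetchOne(item, (result) => {
      results[i] = result;              // keep results in list order
      if (--remaining === 0) done(results);
    });
  });
}

// Example: each "kid" takes a random amount of time to return.
fetchAll(['milk', 'bread', 'deli'], (item, cb) => {
  setTimeout(() => cb(item + ' (fetched)'), Math.random() * 100);
}, (basket) => console.log('checkout:', basket));
```

The deli item may come back last, but the whole trip still finishes as soon as it does, not after every item has been fetched in sequence.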

It's not a perfect analogy by any means, but more succinct and accurate descriptions involve code or pseudo-code. Ryan Dahl's initial presentation at JSConf 2009 used the following example:

  var result = db.query("select..");
  // use result

Here the database query blocks the program from doing anything else until the query is returned, whereas in an event loop:

  db.query("select..", function (result) {
  // use result
  });

... the program can continue doing other things while it waits for the query to complete and invoke its callback.

Node provides non-blocking libraries for database, file and network access. Since I/O is not a fundamental part of JavaScript, nothing had to be taken away to add them. Python's Twisted and Ruby's Event Machine have to work around some basic language components in order to get similar evented behavior.

So, in addition to the performance wins Node gets "for free" by using the V8 JavaScript engine, the event loop model itself allows Node servers to handle massive concurrency in network connections very efficiently. It often approaches the benchmarks achieved by high-performance reverse proxies like Nginx (which is also based on an event loop).

Sharing code between the browser and server

Using the same language on the server that you're already using in the browser has been the promise of many server-side JavaScript (SSJS) systems. The idea of "one language to rule them all" is appealing to developers who have been bounced from one language to another as each new technology emerges.

Aptana Jaxer, one of the better known recent SSJS implementations, hoped to engage AJAX developers with this model. In principle, it's a good idea. You can use the same libraries to validate data on both sides, call server-side JavaScript functions directly from the browser and pre-construct HTML DOM on the server with browser UI libraries to speed up the initial page load.

Jaxer did not see anywhere near the uptake that Node has, and this could be partly because of timing. To many, Jaxer looked like a reimplementation of ASP. It also didn't have Node's performance benchmarks, and it didn't focus on the issue of blocking I/O. Despite its interesting possibilities, Jaxer didn't reach the tipping point of forming a vibrant community.

Critical mass for Node.js

Most new technologies die not because of their lack of merit, but because of obscurity. A new language or framework needs a large enough pool of users and core developers in order for it to be sustainable. Large marketing campaigns and the backing of a big company can push a programming language or technology framework into the mainstream, but sometimes it happens from the ground up.

Node generated excitement on its first release because programmers find it interesting and powerful. It's hard to find negative comments from those who are using it.

Because of the instant appeal, there were enough early adopters to start a vibrant community and a large number of projects, many of which are open source. These applications, 96 and counting linked from the Node wiki, show off many of the amazing things Node can do. They provide developers with example code and inspiration.

The wiki also lists 86 companies and startups using Node. Though it's not an exhaustive list, and many of the companies listed are quite small, there are at least two significant players.

Joyent is the corporate home of Node. It employs Ryan Dahl (Node's creator), Isaac Schlueter (creator of NPM), and other Node contributors. Joyent owns the Node trademark and copyright, and the company recently launched No.de, a hosting service for Node applications. This gives Node a stable, funded base of development resources and a spokesperson for the project in the corporate world.

The other big player is HP. Shortly after HP acquired Palm, Palm's webOS mobile operating system added Node. This was a smart move for HP, and was very well received by the webOS community:

If you think about it, Node delivers a services platform for the cloud, so is there a way that we could work together? We got together with Ryan Dahl of Node to try this out, and it turns out that Node works fantastically well on a mobile device! Major kudos should go to the V8 team for creating a great VM and to Ryan for writing efficient code that scaled down from the cloud to the device. -- Dion Almaer

Putting Node on a mobile device turns the idea of server-side JavaScript on its head, but why not? With Node the JavaScript engine is small enough, the code is portable enough, and the programming model is light and asynchronous - a perfect combination for a mobile device. It's possible that this, rather than a Rails-like MVC framework or a content management system, will be what propels Node into ubiquity.

Node is not the "next" anything

Node is something new, and that's why programmers are interested in it.

Node has changed our mental image of what a server can be. It doesn't have to be running on a high-performance blade in an air-conditioned co-lo serving millions of requests and gigabytes of data. It can be in your pocket synchronizing your contacts whenever it finds Wi-Fi. It can be caching a web application for faster, local access. It can be a peer-to-peer web server. And it can be a number of things we haven't even thought of yet.

April 26 2011

Open source tools look to make mapping easier

The rapid evolution of tools for mapping open data is an important trend for the intersection of data, new media, citizens and society. Whether it's mapping issues, mapping broadband access or mapping crisis data, geospatial technology is giving citizens and policy makers alike new insight into the world we inhabit. Below, earthquake data is mapped in Japan.

Earlier today, Washington-based Development Seed launched cloud-based hosting for files created with their map design suite, MapBox.

"We are trying to radically lower the barrier of entry to map making for organizations and activists," said Eric Gundersen, the founder of Development Seed.

Media organizations and nonprofits have been making good use of Development Seed's tools. MapBox was used to tell stories with World Bank data. The Department of Education broadband maps were designed with Development Seed's open source TileMill tool and are hosted in the cloud. The Chicago Tribune also used TileMill to map population change using open data from the United States census.

Maps from the MapBox suite can be customized as interactive embeds, enabling media, nonprofits and government entities to share a given story far beyond a single static web page. For instance, the map below was made using open data in Baltimore that was released by the city earlier this year:


"This isn't about picking one person's API," said Gundersen. "This is working with anyone's API. It's your data. It's your tiles. If we do this right, we're about to have a lot of good GIS folks who will be able to make better web maps. There's a lot of locked up data that could be shared."

Making maps faster with Node.js

After making its mark in open source development with Drupal, Development Seed is now focusing on Node.js.

Why? Speed matters. "Data projects really are custom and they need more of a framework that focuses on speed," said Gundersen. "That's what Node.js delivers."

Node.js is a relatively recent addition to the development world that has seen high-profile adoption at Google and Yahoo. The framework was created by Ryan Dahl (@ryah). Jolie O'Dell covered what's hot about Node.js this March, focusing on its utility for real-time web apps.

(If you're interested in getting up and running with Node.js, O'Reilly has a preview of an upcoming book on the framework.)



March 04 2011

Four short links: 4 March 2011

  1. JSARToolKit -- Javascript port of the Flash AR Toolkit. I'm intrigued because the iPad2 has rear-facing camera and gyroscopes up the wazoo, and (of course) no Flash. (via Mike Shaver on Twitter)
  2. Android Patterns -- set of design patterns for Android apps. (via Josh Clark on Twitter)
  3. Preview of Up and Running with Node.js (O'Reilly) -- Tom Hughes-Croucher's new book in preview form. Just sorting out commenting now. (via Tom on Twitter)
  4. #Blue Opens for Business -- a web app that gets your text messages. You can reply, and there's an API to give other apps read/write access. Signs the text message is finally becoming a consumer platform.

February 22 2011

Four short links: 22 February 2011

  1. Cluster (github) -- Node.JS multi-core server manager with plugins support. Hot restarts, and other goodness. (via The Change Log via Javascript Weekly)
  2. Nokia Culture Will Out (Adam Greenfield) -- Except that, as realized by Nokia, this is precisely what failed to happen. I experienced, in fact, neither a frisson of elegant futurism nor a blasé presentiment of everyday life at midcentury. I was given an NFC phone, and told to tap it against the item I wanted from the vending machine. This is what happened next: the vending machine teeped, and the phone teeped, and six or seven seconds later a notification popped up on its screen. It was an incoming text message, which had been sent by the vending machine at the moment I tapped my phone against it. I had to respond “Y” to this text to complete the transaction. The experience was clumsy and joyless and not in any conceivable way an improvement over pumping coins into the soda machine just the way I did quarters into Defender at the age of twelve.
  3. NextGen Education and Research Robotics -- virtual conference on robotics in education.
  4. Homemade Arduino Printer (Instructables) -- made with an Arduino, two dead CD/DVD drives and a marker pen. Clever hack! (via MindKits on Twitter)

January 25 2011

Four short links: 25 January 2011

  1. node.io -- distributed node.js-based scraper system.
  2. Joystick-It -- adhesive joystick for the iPad. Compare the Fling analogue joystick. Tactile accessories for the iPad—hot new product category or futile attempt to make a stripped-down demi-computer into an aftermarked pimped-out hackomatic? (via Aza Raskin on Twitter)
  3. Programmed for Love (Chronicle of Higher Education) -- Sherry Turkle sees the danger in social hardware emulating emotion. Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver. (via BoingBoing)
  4. Asking the Right Questions (Expert Labs) -- Andy Baio compiled a list of how Q&A sites like StackOverflow, Quora, Yahoo! Answers, etc. steer people towards asking questions whose answers will improve the site (and away from flamage, chitchat, etc.). The secret sauce to social software is the invisible walls that steer people towards productive behaviour.
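The node.io entry above is just a pointer, but the core of any scraper is the same two steps: fetch a page, then extract structure from it. A minimal sketch of the extraction half in plain JavaScript (node.io's own API isn't shown; the hardcoded HTML and the regex approach are illustrative only — a real scraper would fetch over HTTP and use a proper HTML parser):

```javascript
// Sample input standing in for a fetched page.
var html = '<a href="http://example.com/a">A</a> ' +
           '<a href="http://example.com/b">B</a>';

// Pull every href target out of raw HTML with a global regex.
function extractLinks(source) {
  var links = [];
  var re = /href="([^"]+)"/g;
  var match;
  while ((match = re.exec(source)) !== null) {
    links.push(match[1]); // capture group 1 is the URL itself
  }
  return links;
}

console.log(extractLinks(html)); // two URLs: .../a and .../b
```

Distributing this is then mostly a queueing problem: hand lists of URLs to worker processes, each running the same fetch-and-extract loop.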

January 21 2011

Four short links: 21 January 2011

  1. Proof-of-Concept Android Trojan Captures Spoken Credit-Card Numbers -- Soundminer sits in the background and waits for a call to be placed [...] the application listens out for the user entering credit card information or a PIN and silently records the information, performing the necessary analysis to turn it from a sound recording into a number. Very clever use of sensors for evil! (via Slashdot)
  2. Cloud9 IDE -- open source IDE for node.js. I'm using it as I learn node.js, and it's sweet as, bro.
  3. The Quantified Self Conference -- May 28-29 in Mountain View. (via Pete Warden)
  4. Bram Cohen Demos P2P Streaming -- the creator of BitTorrent is winding up to release a streaming protocol that is also P2P. (via Hacker News)

October 05 2010

Four short links: 5 October 2010

  1. Nooski Mouse Trap -- I have one, it is fantastic. This man built a better mouse trap. Now please beat a path to his door.
  2. Introduction to Node.js (video) -- Two weeks ago, Yahoo! hosted a BayJax meetup dedicated to NodeJS (since the meetup coincided with Cinco de Mayo, we named it ‘Cinco de Node’). Ryan Dahl, the creator of NodeJS, gave a talk on the project and was very kind to let us record his presentation for YUI Theater. (via anselm on Twitter)
  3. Living With a Computer (Atlantic Monthly) -- a 1979 blast from the past about what it was like to get your first computer. So much of this article remains as true today as it was then: upgrade fever, impatience, more dependencies, etc. Yet another hazard is that recommending the right computer is a little like recommending the "right" religion. People tend to like the system they've ended up with. The most important point about computers, more so than about religions, is that the difference between a good one and a bad one is tiny compared with the difference between having one and not. (via pomeranian99 on Twitter)
  4. Why You Can't Have a Receipt for Your Taxes (Clay Johnson) -- In the end, this is because our dollars are not packets.
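A habit Ryan Dahl's talk covers that is worth seeing in miniature is Node's error-first callback convention: every async API hands its callback an error (or null) as the first argument, and a result second. A hedged sketch (the `divide` function is invented for illustration; real Node APIs such as `fs.readFile` follow the same shape):

```javascript
// Node-style error-first callback: callback(err, result).
function divide(a, b, callback) {
  if (b === 0) {
    // Signal failure through the callback, never by throwing.
    return callback(new Error('division by zero'));
  }
  callback(null, a / b);
}

divide(10, 2, function (err, result) {
  if (err) throw err;
  console.log(result); // prints 5
});
```

The convention keeps error handling explicit at every call site, which is why it became the backbone of Node's standard library.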

July 29 2010

Four short links: 29 July 2010

  1. How to Raise Funds for Non-Profits (Joi Ito) -- One organization sent a message to all of their donors during the Haiti crisis asking them to give to an NGO that they had vetted. They didn't ask for any money for themselves. This had a hugely positive effect and the donors' trust in the group increased. Wallets aren't zero sum.
  2. legislation.gov.uk -- very elegant legislation system for the UK. Check out the annual analysis, for example. (via rchards on Twitter)
  3. The Great WebKit Comparison Table -- So far I’ve tested 14 different mobile WebKits, and they are all slightly different. You can find the details below. (via Andrew Savikas)
  4. Node and Scaling in the Small vs Scaling in the Large (al3x) -- In a system of no significant scale, basically anything works. The power of today’s hardware is such that, for example, you can build a web application that supports thousands of users using one of the slowest available programming languages, brutally inefficient datastore access and storage patterns, zero caching, no sensible distribution of work, no attention to locality, etc. etc. Basically, you can apply every available anti-pattern and still come out the other end with a workable system, simply because the hardware can move faster than your bad decision-making.
