
January 21 2014

Decision making under uncertainty

The 2014 Edge Annual Question (EAQ) is out. This year, the question posed to the contributors is: What scientific idea is ready for retirement?

As usual with the EAQ, it provokes thought and promotes discussion. I have only read through a fraction of the responses so far, but I think it is important to highlight a few Edge contributors who answered with a common, and in my opinion a very important and timely, theme. The responses that initially caught my attention came from Laurence Smith (UCLA), Gavin Schmidt (NASA), Giulio Boccaletti (The Nature Conservancy) and Danny Hillis (Applied Minds). Had I been asked this question, my contribution for idea retirement would likely align most closely with these four responses: Smith and Boccaletti want to see the same idea disappear — stationarity; Schmidt's response focused on the abolition of simple answers; and Hillis wants to do away with cause-and-effect.

In the age of big data, from a decision-making standpoint, all of these responses address the complex nature of interconnected scientific topics and the search for one-size-fits-all answers. The conclusions should all point toward what science is supposed to do in the first place: to generate knowledge. Of course with every experiment there is the objective of finding an answer to a specific question, but each experiment, if performed properly, should fundamentally serve to generate more questions, not answers. When newly minted PhDs successfully defend their dissertations, for a brief moment in time they are the world's experts on their particular subjects. By the next day, that may no longer hold true. This is progress and should be embraced. If we can apply the findings of a study to address a specific problem, great. But science should be a humbling endeavor — each day we should realize how much we actually don't know.

As more and more universities crank out graduates under the data-science rubric, I hope the curriculum stresses that while machine learning and advanced algorithms can uncover new, useful and novel patterns contained within large datasets, those same tools and techniques can also be trained to determine when false positives might lead down dark data alleys. This is all part of a proper lens through which to view scientific risk management. Taking a complex adaptive systems approach to data analysis will better prepare decision makers to identify tipping points and non-stationarity, while providing a foundation to continuously challenge assumptions and, at the same time, to embrace the notion of complexity, shifting baselines, and ambiguity.

June 06 2012

IPv6 day and the state of the edge

IPv6 enters into permanent operation today and we'll finally have all the addresses we need. Unfortunately the old system with its baked-in scarcity — operating like a tireless gravitational force — has already had a few decades to deform the architecture of the Internet in important and perhaps irreversible ways.

I got a notice from Apple reminding me that my MobileMe hosting is going away on June 30. I'm lazy when it comes to certain things and at one point or another iWeb and MobileMe seemed like a simple way to get a personal web page out there. I just wanted a bit of publicly searchable state to clarify who I am (as differentiated from that other Jim Stogdill on the web) that wasn't mediated, moderated, monetized, and walled off by Facebook or some other Austro-Hungarian Central Power of the web. A little place I could call my own.

Really, this is a stupid problem to have. In the last month those pages have had fewer than 100 visits and I could have served them all from a low wattage pluggable computer stashed in a closet without it breaking a sweat. But the Internet doesn't work that way, or not as easily as it should. And at least one of the reasons is its history of address scarcity.
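The "low wattage computer in a closet" setup really is that simple on the software side. A minimal sketch, assuming Python on the closet machine — serving a handful of static pages takes a few lines; the hard part, as the rest of this post argues, is being reachable at all:

```python
# Minimal static file server -- roughly the closet-machine setup the
# post imagines. Serves files from the current working directory.
from http.server import HTTPServer, SimpleHTTPRequestHandler


def make_server(port: int = 8080) -> HTTPServer:
    """Return a server for static files in the current directory.

    Binds on all interfaces so the closet machine is reachable from
    the local network; reachability from the wider Internet is the
    part that address scarcity makes hard.
    """
    return HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)


# On the closet machine you would run:
#   make_server().serve_forever()
```

A hundred visits a month wouldn't make this break a sweat; the bottleneck was never the serving.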

I attended the "Internet Everywhere" panel at the World Science Festival over the weekend. Maybe the most interesting bit was when Neil Gershenfeld forcefully reminded us that the Internet was never intended to be just a bitnet. He was thanking Vint Cerf for making stateful edges a core design principle of the original Internet. Distributed state meant that adding nodes also added capability and that ownership and power stayed distributed as the Net grew. Maybe it's a sign of where we are now that the man he was thanking works for the web's other Central Power these days.

Unfortunately that chronic shortage of addresses contracted the web, shifting the definition of "edge" from the device you are looking at to the ISP it's connected to. That redefinition from Internet host to mere remote client means that I have to go through the minor hassle of re-hosting my four little pages of HTML instead of happily forgetting that they're in my closet.

I've long been vexed by the asymmetry inherent in DHCP-enabled second class citizenship and I remember the first time I tried to build a permanently addressable home on the web. It was a bunch of years ago and I had my eye on a used Cobalt Qube on eBay. I figured I'd use it as a web server and blog host etc. But like I said before, sometimes I'm lazy, and a fixed IP address was too expensive and (at least at the time) Dynamic DNS was enough of a hurdle for me to say "to hell with it."

Any geek will tell you that it can be done, that I'm making a mountain out of a molehill, and that it's not even that hard. "Pay extra for a fixed and registered IP address or use Dynamic DNS." But IP address scarcity made it just hard and expensive enough to ensure that edge hosting didn't become the norm. I'm not commenting on whether it's possible (it is), but whether it's the low-energy state for the broad population of netizens.
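For the curious, the Dynamic DNS workaround amounts to periodically telling a DNS provider what address your ISP last handed you. A minimal sketch in Python — every hostname and endpoint below is a placeholder, since real dyndns2-style providers each have their own URL and auth scheme:

```python
# Sketch of the Dynamic DNS workaround: keep a DNS name pointed at
# whatever address the ISP currently assigns. The endpoint below is
# hypothetical; substitute your provider's actual update URL and auth.
import urllib.request

UPDATE_ENDPOINT = "https://dns.example.com/update"  # hypothetical


def update_url(hostname: str, ip: str) -> str:
    # dyndns2-style update requests are just parameterized GETs.
    return f"{UPDATE_ENDPOINT}?hostname={hostname}&myip={ip}"


def push_update(hostname: str, ip: str) -> None:
    # A cron job on the closet machine would call this periodically.
    urllib.request.urlopen(update_url(hostname, ip))
```

Which is to say: doable, but one more moving part than "plug it in and forget about it."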

Address scarcity contributed to a strange attractor that deformed the logic of the Internet at scale and helped guarantee that the cloud would become the primary architecture. When Vint and his colleagues chose that 32-bit address space they thought they were making a simple engineering tradeoff based on a seemingly predictable future. But it turns out they were adding a bit of dark matter to our Internet cosmos, perhaps just enough to shift the whole thing from open and expanding to closed and collapsing. Address scarcity added to the gravitational force of centralization and control.
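The scale of that tradeoff is easy to put in numbers. IPv4's 32-bit addresses versus IPv6's 128-bit addresses, as quick arithmetic:

```python
# Compare the IPv4 (32-bit) and IPv6 (128-bit) address spaces.
ipv4_addresses = 2 ** 32    # the "seemingly predictable future"
ipv6_addresses = 2 ** 128   # enough to stop worrying about scarcity

print(f"IPv4 total: {ipv4_addresses:,}")   # 4,294,967,296
print(f"IPv6 is 2**96 (~{ipv6_addresses // ipv4_addresses:.2e}) times larger")
```

About 4.3 billion addresses seemed like plenty in the late 1970s; it's fewer than one per person alive today, before counting a single server, phone, or router.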

On the other hand, if we had had IPv6 from the very beginning, maybe a whole lot more of us would be hosting our blogs, photos, videos, and pretty much everything else right there in a DMZ on our home routers. In that world services like YouTube might have needed to be no more than curation overlays and CDNs for popular content. Sort of a commercially provided BitTorrent index for the stuff we hosted from our closets.

What else might we have built with such an infrastructure? The cloud gives us a sandbox to build applications in, but it also sandboxes our sense of what is even possible. How many startups don't start from the unexamined assumption of cloud hosting today? Why HealthVault? Why not a device that I keep in my house that is completely under my control for that kind of personal information? I could even put it in my safe deposit box if I didn't have any doctor's appointments.

Maybe security concerns and natural economies of scale would have made centralization and "the cloud" inevitable outcomes without any help from address scarcity. But as our universe continues to collapse into a few very highly capitalized Central Powers I find myself hoping that IPv6 will take away at least some of the gravitational force that is pulling it in on itself.

