
February 26 2013

Man and His Stories

Jonathan Gottschall on the evolutionary background of storytelling.
Review by Katja Mellmann (25.02.2013)
of Jonathan Gottschall: The storytelling animal. How stories make us human.
Houghton Mifflin Harcourt, Boston, 2012.

[...]

Nevertheless, Gottschall's account rests throughout on a thorough knowledge of the relevant research fields, and at no point does the author permit himself a 'popular-science' simplification or distortion in the pejorative sense. The book's scholarly apparatus, organised as endnotes and a bibliography, also makes the study accessible to a specialist audience, and in the main text, too, important individual research positions are named and presented in an appropriate manner. Admittedly, the belletristic chapter headings and the long ornamental passages make it harder for colleagues in the field to get quick access to the points that are central to the current discussion. For that reason, the book's contents are recapitulated here above all with a view to these points of connection for evolutionary literary studies.

The first chapter spells out the book's subject. Gottschall is interested not only in why Homo sapiens tells stories at all, but also in why storytelling occupies such a central place in human culture. This 'why', however, turns out to be more of a 'that': the following chapters are devoted less to a sustained answer to the why-question than to an extensive demonstration that human life is indeed shaped by 'stories' in many respects. Gottschall does regularly ask about the 'functions' of the behaviours he describes, but he explicitly does not want this understood as an answer to the question of each behaviour's biological function, that is, of its evolutionary origin. Which of the observable behavioural tendencies can count as 'adaptations' in the evolutionary-biological sense, as responses to a specific selection pressure, and which are more likely to have the status of evolutionary by-products, Gottschall leaves open; he does, however, discuss the issue repeatedly and briefly lays out the opposing research positions together with their arguments.

This cautious reticence spares him, among other things, from committing to a particular order in which the cognitive abilities involved might have evolved. The aim of his account is therefore not to portray the genesis of storytelling as a human trait; rather, he undertakes an analytically instructive paralleling of various human behaviours with regard to their significance for the human capacity for 'storytelling'. The behaviours placed in parallel in this way include, in particular, children's play (chapters 2-3), dreaming (chapter 4), and literary 'narration' (in a broad, genre-spanning sense of 'poetry', of 'fiction' in general).

[...]


October 26 2012

TERRA 720: Atom

Atom is a short, animated film about the ‘life’ of Atom X. From the Big Bang to the emergence of life on Earth and beyond, this film tells a rather brief story of, well, everything.

January 04 2012

World-first hybrid shark found off Australia | uk.news.yahoo.com

Scientists said on Tuesday that they had discovered the world's first hybrid sharks in Australian waters, a potential sign the predators were adapting to cope with climate change.

 --------------------------------------

 // quotation by oAnth:

 [...]

 

The Australian black-tip is slightly smaller than its common cousin and can only live in tropical waters, but its hybrid offspring have been found 2,000 kilometres down the coast, in cooler seas.

 

It means the Australian black-tip could be adapting to ensure its survival as sea temperatures change because of global warming.

 

"If it hybridises with the common species it can effectively shift its range further south into cooler waters, so the effect of this hybridising is a range expansion," Morgan said.

"It's enabled a species restricted to the tropics to move into temperate waters."

 

Climate change and human fishing are some of the potential triggers being investigated by the team, with further genetic mapping also planned to examine whether it was an ancient process just discovered or a more recent phenomenon.

 [...]

 

- original Url: http://uk.news.yahoo.com/world-first-hybrid-shark-found-off-australia-070347259.html



October 13 2011

Stone Age painting kits found in cave

Bone and stone tools were apparently used for crushing pigments and mixing them in the shells of giant sea snails

The oldest known painting kits, used 100,000 years ago in the stone age, have been unearthed in a cave in South Africa.

Two sets of implements for preparing red and yellow ochres to decorate animal skins, body parts or perhaps cave walls were excavated at the Blombos cave on the Southern Cape near the Indian Ocean.

The stone and bone tools for crushing, mixing and applying the pigments were uncovered alongside the shells of giant sea snails that had been used as primitive mixing pots. The snails are indigenous to South African waters.

Other bones, including the shoulder blade of a seal, were among the ingredients for making the pigments. The bones were probably heated in a fire and the marrow fat used as a binder for the paint.

Along with ancient flakes of charcoal, researchers found a "high water mark" on the shells' inner wall, evidence that an unknown liquid, probably urine or water, was added to make the paint more fluid.

The remarkable discovery, reported in the journal Science, throws light on the capabilities and rituals of Homo sapiens who occupied the cave from at least 140,000 years ago. The cave's entrance was blocked by sand 70,000 years ago.

"This is the first known instance for deliberate planning, production and curation of a compound," Christopher Henshilwood at the University of Bergen told Science, adding that the finding also marked the first known use of containers. "It's early chemistry. It casts a whole new light on early Homo sapiens and tells us they were probably a lot more intelligent than we think, and capable of carrying out quite sophisticated acts at least 40,000 to 50,000 years before any other known example of this kind of basic chemistry," he added.

One of the toolkits, which was found next to a pile of different instruments, was more complex and particularly well preserved, with its intact shell coated with red pigment. A second shell, found close by, was broken, but its grinding stone was coated with red and yellow pigments, suggesting it had been used more than once.

Henshilwood's team said the tools were evidence for an "ochre-processing workshop" run by early humans, who gathered the colourful mineral oxides from sites about 20 miles away.

Piecing together the process from the instruments they found, Henshilwood said the artists used small quartzite cobbles to hammer and grind the ochres into a powder, which was then poured into the shell and mixed with charcoal, burnt and broken bone, and the unidentified liquid.

One of the artists' kits came with a slender bone from the front leg of a dog or wolf. One end of the bone had been dipped in ochre, leading the scientists to conclude it was used as a primitive paintbrush.

"You could use this type of mixture to prepare animal skins, to put on as body paint, or to paint on the walls of the cave, but it is difficult to be sure how it was used," said Francesco d'Errico, a study co-author at the University of Bordeaux. "The discovery is a paradox because we now know much better how the pigment was made than what it is used for."

Tiny grooves at the bottom of the shells may be scratch marks caused by sand grains when the artist mixed the paint with a finger. "From time to time they were scratching the bottom when their finger was moving some of these little grains," said d'Errico.

The team has unearthed other artefacts from early humans at the cave. In 2004, it uncovered a collection of 75,000-year-old decorative shell beads at Blombos cave, some of which had been painted with ochre.

"Twenty thousand years after these painting kits were left behind, humans at Blombos were certainly using pigments for symbolic purposes. It is clear they knew all the sources for these red and yellow pigments. This was a tradition for them," said d'Errico.




May 30 2011

The Species Problem

These days the term 'species' is thrown around as if it were a definite unit, but the realities of defining a species are much trickier, and more contentious, than they seem. Biologists have argued over the definition of a species since the dawn of the discipline, but have yet to come up with a single species concept that works for all types of life. This film explores current concepts used across the various fields of biology and covers some of the specific problems they encounter.

March 29 2011

The ecology of risk

"Nothing in biology makes sense except in the light of evolution."

This statement, the title of an essay by evolutionary biologist T.G. Dobzhansky, was published in "American Biology Teacher" in 1973. It rightly asserts that evolution is the cornerstone of any meaningful dialogue in the biological sciences, and it underscores how important ecological theory is for understanding how biological systems behave.

We can extend this ecological theme of interconnectedness to modern financial and commercial activity, where we can just as easily state: "Nothing in economics makes sense, except in the light of ecology." Large-scale events that have disrupted energy, agricultural and material supply chains in recent months underscore the importance of viewing the world through a spatial lens.

With each passing decade, as new manufacturing and production origins have come online, the barriers to entry that for many years had prevented the functioning of a true global economy have slowly been dissolving. No industry or country can now be considered immune to the financial fallout stemming from supply disruptions. While globalization has most certainly provided much of the world with many benefits, from cheap, reliable energy, food and telecommunications, to an ever-expanding universe of choices to enrich our lives culturally and materially, the increasing number of choices is accompanied by a more fragile economy, susceptible to perturbations.

So while we continue to enjoy the aforementioned benefits of a connected economy, we need to recognize and appreciate the potential magnitude of the underlying risks. The global raw material supply chain is now spread very thinly. So thinly, in fact, that even unconfirmed news of a potential threat to the supply of a particular material sends ripples through the markets, affecting in near real time both the price of the material in question and the equity valuations of firms in the exposed industries.

[Image: effectiveness graph, from Gary Horvitz]

Darwin meets the Fed

As such, perhaps it is now time to take a more serious ecological or geographical approach to risk as it pertains to global commerce. Nassim Taleb repeatedly argues that the financial system needs to robustify. Using nature as a metaphor, Taleb maintains that mother nature is redundant, and that the financial system should employ similar redundancy to avoid blowing up. Multiple layers of protection are needed to absorb shocks. While this redundancy can limit the swings to the upside (much to the chagrin of many fund managers), it also spreads out the risk when negative events inevitably arise, minimizing the disastrous consequences.

Theoretical biologist (and former managing director at Deutsche Bank) George Sugihara also calls for ecological principles to be included in more comprehensive risk management and mitigation strategies. Drawing on his own research into fishery management and collapse, Sugihara suggests that early identification of tipping points, rather than attempts to model irrational investor behavior, may be a more effective way to alert economists to potentially significant market corrections. We might even reach back to some of Darwin's early ideas, taking into account the influence of selective pressures and the nonlinear nature of the responses they evoke.
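To make the tipping-point idea concrete, here is a minimal sketch of the kind of early-warning statistics ecologists watch, assuming nothing more than a daily price series; the 60-day window and the synthetic data are illustrative assumptions, not anything taken from Sugihara's actual models. Rising rolling variance and lag-1 autocorrelation ("critical slowing down") are the classic warning signs:

```python
# Toy early-warning indicators in the spirit of ecological tipping-point
# detection: rolling variance and lag-1 autocorrelation of returns.
# The window length and the synthetic price series are assumptions made
# purely for illustration.
import numpy as np

def early_warning_signals(prices, window=60):
    """Rolling variance and lag-1 autocorrelation of log returns."""
    returns = np.diff(np.log(prices))
    variances, autocorrs = [], []
    for end in range(window, len(returns) + 1):
        chunk = returns[end - window:end]
        variances.append(chunk.var())
        autocorrs.append(np.corrcoef(chunk[:-1], chunk[1:])[0, 1])
    return np.array(variances), np.array(autocorrs)

# Synthetic example: a random walk whose noise grows over time,
# standing in for a market drifting toward instability.
rng = np.random.default_rng(0)
shocks = rng.normal(0.0, np.linspace(0.01, 0.05, 1000))
prices = 100 * np.exp(np.cumsum(shocks))

var, ac1 = early_warning_signals(prices)
print("rolling variance, start vs end:", var[0], var[-1])
print("lag-1 autocorrelation, start vs end:", ac1[0], ac1[-1])
```

On the synthetic series the rolling variance climbs steadily; with real market data you would feed in actual closing prices and watch both indicators together.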

Natural capital

These are just a few of the many instances where modern economic analysis and the goal of financial stability can benefit from approaches grounded in the natural sciences. The theme has certainly been around for a long time, but I don't feel it has ever been taken seriously outside of academia. For risk managers striving to protect natural or financial capital (or both), appreciating not only the connections within a system but, more importantly, the geographical opportunities and constraints underlying them is the first step in robustifying markets. So instead of taking theories of what prudent investors would or should do according to standard economic theory and structuring positions to capitalize on those assumptions (efficient markets are actually a fallacy), build a robust strategy grounded in ecological principles: it will survive when a market-moving event hits, and over the long term it has a better chance of providing a healthy return.

While the risk manager may not be able to prevent the initial shock to the market, he or she may be able to better anticipate the potential effects and, in the process, construct a proper hedge against the consequences. On paper, the job of the modern financial risk manager is to preserve capital, striving to maintain positive returns while minimizing drawdown. In practice, however, the risk manager is as much a speculator as the head trader. To compound the risk of a collapse, most fund managers are watching their radar for the same potential threats. When an event surfaces and catches the market by surprise (e.g., the Australian floods, the Argentine drought, civil unrest across the MENA region, spikes in oil price volatility, the Japan earthquake), almost everyone loses. Will an approach that is partially resistant to such shocks limit the upside, curbing the potential bonus of a fund manager? Absolutely. But it will also ensure that the fund manager is still around to enjoy the capital they are paid to preserve and grow.

Only a truly diverse portfolio can be considered stable or robust. As in large-scale agriculture, monoculture may work for a while, but eventually, diversity wins.
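The statistical backbone of that claim is simple and easy to check: for uncorrelated assets of equal volatility, an equally weighted portfolio's standard deviation falls roughly as one over the square root of the number of holdings. A quick sketch with made-up numbers (purely illustrative, not a model of any real market):

```python
# Why diversity stabilises a portfolio: with n uncorrelated assets of equal
# volatility sigma, the equally weighted portfolio has standard deviation
# roughly sigma / sqrt(n). All figures below are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
sigma, n_days = 0.02, 10_000          # 2% daily volatility, long sample

for n_assets in (1, 5, 25, 100):
    returns = rng.normal(0.0, sigma, size=(n_days, n_assets))
    portfolio = returns.mean(axis=1)  # equal weighting across assets
    print(f"{n_assets:>3} assets: std = {portfolio.std():.4f} "
          f"(theory: {sigma / np.sqrt(n_assets):.4f})")
```

In real markets correlations spike in a crisis, which is exactly why the geographic and ecological view of where risks connect matters more than a naive count of holdings.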



February 16 2011

How early man used his head

Skulls unearthed in a Somerset cave were skilfully fashioned into cups with the rest of the bodies probably being cannibalised

A macabre collection of bone cups made from human skulls, unearthed in a Somerset cave, is the oldest of its kind, researchers believe.

The extraordinary vessels are the handiwork of early modern humans, who used stone tools to prepare and finish the containers around 14,700 years ago, as the last ice age was drawing to a close.

The three cups, made from the skulls of two adults and one three-year-old child, were dug up several decades ago, alongside the cracked and cut-marked remains of animal and human bones at Gough's Cave in Cheddar Gorge, south-west England. They have now been re-examined using new techniques.

The human bones show clear signs of butchery, implying that the bodies were stripped for meat and crushed for marrow before the heads were severed and turned into crockery.

There is no suggestion that the cups are trophies made from the remains of dead enemies. It is more likely that making skull cups was a traditional craft and their original owners died naturally.

"It would probably take a half day to prepare a skull cup," said Silvia Bello, the palaeontologist who led the study at the Natural History Museum in London. "Defleshing the skull was a skilled and lengthy business."

Researchers said it was impossible to know how skull cups were used, but historically they have held food, blood or wine. Some are still used today in Hindu and Buddhist rituals. "To us they can still seem a little strange," said Bello. "I wouldn't have my cereal in one."

Writing in the journal PLoS ONE, the scientists describe revisiting excavated remains from the cave, including a skull cup unearthed in 1987 by Chris Stringer, head of human origins research at the museum. Detailed examination of 37 skull fragments and four pieces of jaw using a 3D microscope revealed a common pattern of hard strikes followed by more finessed stone tool work that turned a freshly decapitated head into a functional cup or bowl.

"This is the first time we've understood how this material was processed, and the fact that the skulls were not just cut and butchered, but were shaped in a purposeful way," said Stringer.

The discarded human bones had the same cut and saw marks found on butchered animal bones at the site, and some were cracked open or crushed, as was done with animal bones to expose nutritious marrow. Only the skulls seem to have been treated with special care. The cuts and dents show they were scrupulously cleaned of any soft tissues soon after death.

"They systematically shaped the skulls to make them into cups. They scalped them to remove the hair, they removed the eyeballs and ears, they knocked off the faces, then removed the jaws and chiseled away the edges to make the rims nice and even. They did a pretty thorough job,' Stringer said.

The smaller cup, made from the child's skull, would have leaked because the cranial bones had not fully fused together, but the larger two might have carried food or around two pints of liquid.

"We assume it was some kind of ritual treatment. If there's not much food around they may have eaten their dead to survive. Perhaps they did this to honour the dead, to celebrate their lives," Stringer added.

The cave dwellers were among the first humans to return to Britain at the end of the last ice age. The island was unpopulated and almost completely under ice 20,000 years ago, but as the climate warmed, plants and animals moved across Doggerland, a now submerged land bridge that linked Britain to mainland Europe. Where food went, early humans followed and brought art, craft and toolmaking skills with them.

The ages of the remains at Gough's Cave suggest it was home to humans for at least 100 years. The cave is well-sheltered and, with skin flaps over the entrance, would have made a cosy abode, Stringer said. The residents were ideally placed to hunt passing deer and wild boar, while up on the Mendip Hills roamed reindeer and horses.

In the 1900s, several hundred tonnes of soil were removed from the cave to open it up as a tourist attraction, a move that may have destroyed priceless ancient remains. The skull cup and other bones unearthed in 1987 survived only because they were lodged behind a large rock.

In 1903, field researchers working in the cave's entrance uncovered Cheddar Man, the oldest complete skeleton in Britain at more than 9,000 years old. A painting of a mammoth was found on the wall in 2007. Other artefacts from the site include an exquisitely carved mammoth ivory spearhead.

A precise replica of one of the skull cups, complete with cut marks, will go on display at the Natural History Museum in London from 1 March for three months.




November 17 2010

A Darwinian Theory of Beauty

I love this TED talk by Denis Dutton of Arts & Letters Daily fame. He uses evolution to explain beauty, and it's illustrated in the same style (by the same person) as the RSA talks. I'm particularly drawn to the scientific approach to art, a crossover that positively reeks of the kind of cross-discipline thinking that I encounter at Sci Foo. I'm putting together the program for Kiwi Foo, and I hope to have the same fertile intersection of ideas.

September 07 2010

Natural selection

Charles Darwin never patronised his audience but presented his evidence modestly; Richard Dawkins, on the other hand, lacks the patience to let natural history speak for itself

Charles Darwin was not a clever man. Well, clearly he was a very clever man. But he was not self-consciously clever: he never talked down to his readers. His masterpiece, On the Origin of Species, is a modest book. It begins with evidence – and down-to-earth, homely evidence at that. Even though Darwin's encounter with the island species of the Galapagos and other exotic discoveries on his voyage aboard HMS Beagle was so important to his intellectual evolution, he starts his great work with observations about domestic British breeds. Similarly, in The Descent of Man he offers copious anecdotes about his study of primates in London Zoo (he wasn't above teasing the animals).

Darwin is the finest fruit of English empiricism. His modest presentation of evidence contrasts, I am sorry to say, with the rhetorical stridency of Richard Dawkins. Visit the famous atheist's website and you will see two causes being pushed. Dawkins is campaigning with other secular stars against the pope's visit to Britain. Meanwhile he is touring the paperback of his book The Greatest Show On Earth: The Evidence for Evolution. The trouble with this book is that it lacks Darwin's empirical style. Where the Victorian writer presented masses of evidence, and let his astonishing, earth-shattering theory emerge from common-sense observations of nature, Dawkins lacks the patience, at this point in his career, to let natural history speak for itself. He has become the mirror image of the theological dogmatists he despises.

He just can't separate science from the debate he has got into with religious people. "Debate" is too kind a word. In a debate you are trying to convince your opponents, but the new atheists have closed off the grey area in which, for a long time in the west, science and religion co-existed. In The Greatest Show On Earth, Dawkins tries momentarily to backtrack, pointing out that all educated bishops believe in evolution. But he is soon back to the realm of dogma, asking himself why it took so long to come across the reality of evolution. This is clearly a historical question, although it may not be a good historical question (why did it take so long to discover the iPad? Well, first you had to invent the wheel ...). No sooner does he ask this question than Dawkins replies, in effect – and I am only slightly caricaturing – that it was because people were a bit thick. He offers no intellectual history of how Darwin's big idea was born from centuries of natural science, or of how the religious Victorians created an intellectual atmosphere in which such a leap in the dark could be contemplated.

Nor does he offer what is surely needed – a blow-by-blow introduction to evolution that starts calmly from the visible evidence all around us. In a telling aside, he is dismissive about the fossil Ida, which he cannot resist telling his readers was massively overhyped. Missing link? You'd have to be an idiot to think that, he grumps ... I am not defending the publicity for this fossil, but it typifies the self-regard of the public atheist that when an accessible, immediate, exciting piece of visual evidence for The Descent of Man enters the mainstream, his reaction is to sneer. He doesn't actually want to persuade, he just wants to be the cleverest kid in the class. Which Darwin never was.




April 11 2010

Stephen Hawking: Humans are "Entering a Stage of Self-Designed Evolution" - www.dailygalaxy.com - 20100329

[...] In the last ten thousand years the human species has been in [...] "an external transmission phase," where the internal record of information, handed down to succeeding generations in DNA, has not changed significantly. "But the external record, in books, and other long lasting forms of storage [...] has grown enormously. [...] [they] would use the term, evolution, only for the internally transmitted genetic material, and would object to it being applied to information handed down externally. [...] our human brains "with which we process this information have evolved only on the Darwinian time scale, of hundreds of thousands of years. [...] But we are now entering a new phase [...] "self designed evolution," in which we will be able to change and improve our DNA. "At first [...] these changes will be confined to the repair of genetic defects [...] I am sure that during the next century, people will discover how to modify both intelligence, and instincts like aggression."

January 04 2010

Skinner Box? There's an App for That

If you are reading this post it means that after countless misfires, I finally kept my attention focused long enough to finish it. That may seem like no big deal, a mere trifling effort, but I'm basking in the moment. In fact, I'll probably tweet it.

It didn't start out to be about digital Skinner boxes. It was a Radar backchannel email about the infamous Web 2.0 Expo Twitterfall incident. I got all curmudgeonly and ranted about continuous partial attention, Twitter as a snark amplifier, and the "Ignite'ification" of conferences (with apologies to Brady). In short, I demonstrated myself unfit to contribute to a blog called Radar.

I swear I'm not a Luddite. I'm not moving to Florida to bitch about the government full time and I'm not in some remote shack banging this out on an ancient Underwood. However, I guess I count myself among the skeptics when it comes to the unmitigated goodness of progress. Or at least its distant cousin, trendiness.

Anyway, I sent the email, inexplicably Jesse said "post!", and I tried reworking it. I still am. This piece has been grinding away like sand in my cerebral gears since, and along the way it has become about something else.

In The Anthologist, Nicholson Baker describes writing poetry as the process of starting with a story and building a poem around it. I try to do that with photography and build pictures around narrative and metaphor. After the work takes shape the story is carved back out and what remains hints at the story's existence, like a smoke ring without the mouth.

He says it better: "If you listen to them, the stories and fragments of your stories you hear can sometimes slide right into your poem and twirl around in it. Then later you cut out the story and the poem has a mysterious feeling of charged emptiness, like the dog after the operation." Don't worry about the dog, it lived and it isn't relevant. My point is that this post isn't about the Twitterfall fail story, that was just a catalyst. The inchoate uneasiness still twirling around in here is what's left of it.

This all began with these lingering questions: "Why are we conference attendees paying good money, traveling long distances, and sitting for hours in chairs narrower than our shoulders only to stare at our laptops? Why do we go to all that trouble and then spend the time Twittering and wall posting on the overwhelmed conference wifi? Or, more specifically, why are we so fascinated with our own 140 character banalities pouring down the stage curtain that we ignore, or worse, mob up on, the speakers that drew us there in the first place?"

As I kept working away on what has become this overlong post, the question eventually turned into, "why the hell can't I finish this?" This has become the post about distraction that I've been too distracted to complete. It's also about ADHD and the digital skinner box that makes it worse, narcissism's mirror, network collectivism and the opt-in borg, and an entropic counter-argument for plugging in anyway. So, here goes...


My name is Jim, and I'm a digital stimulusaholic

A few weeks ago I was watching TV from across the room in the airport and I couldn't hear the sound. The missing soundtrack made the cuts more obvious, so I timed them. They averaged about 1.5 seconds, ranging from roughly a quarter of a second to at most three. The standard deviation was pretty tight, but there was plenty of random jitter and the next cut was always a surprise. Even during the shortest clips the camera zoomed or panned (or both). Everything was always in motion, like a drunk filming dancers. Even though I've known this was the trend for a while, it surprised me. Without the dialog to provide continuity it was disconcerting and vertigo inducing.
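The timing exercise is easy to reproduce, by the way: note the moment of each cut, difference the timestamps, and look at the mean and spread. A throwaway sketch with invented timestamps (not my actual airport data):

```python
# Back-of-the-envelope shot-length statistics: given the times (in seconds)
# at which cuts occur, compute the mean, spread and range of shot durations.
# The timestamps below are invented for illustration.
import statistics

cut_times = [0.0, 1.4, 2.6, 4.5, 5.3, 7.2, 8.4, 10.9, 12.1, 13.4]
durations = [b - a for a, b in zip(cut_times, cut_times[1:])]

print("mean shot length:", round(statistics.mean(durations), 2), "s")
print("std deviation:   ", round(statistics.stdev(durations), 2), "s")
print("shortest / longest:", round(min(durations), 2), "/", round(max(durations), 2), "s")
```

With the invented numbers above the mean lands near a second and a half, which is roughly what I clocked.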

In his book In the Blink of an Eye, Walter Murch describes movie editing as a process akin to adding eye blinks where they naturally belong, so that film works for the brain like a dream. It's a lovely book, by the way. I think these frenetic transitions alter how we experience that film-induced dream state, and for me at least they can make a film feel like a nightmare during exam week. Unfortunately, much of my daily experience mirrors this new cultural reality.

Before I discovered Twitter I used to joke that coffee was at the root of a personal productivity paradox. I could drink it and stay alert while wearing a path in the carpet back and forth to the men's room. Or I could stay stimulant free and sleep at my desk. That was a joke, but the information-sphere that we live in now is like that. I can either drink liberally from the fire hose and stimulate my intellect with quick-cutting trends, discoveries, and memes; but struggle to focus. Or I can sign off, deactivate, and opt out. Then focus blissfully and completely on the rapidly aging and increasingly entropic contents of my brain, but maybe finish stuff. Stuff of rapidly declining relevance.

We have such incredible access to information, I just wish it wasn't so burdened with this payload of distraction. Also, I wish my brain wasn't being trained to need these constant microbursts of stimulation.

Email was the first electronic medium to raise my clock speed, and also my first digital distraction problem. After some "ding, you have mail," I turned off the blackberry notification buzz, added rationing to my kit bag of coping strategies, and kept on concentrating. Then RSS came along and it was like memetic crystal meth. The pursuit of novelty in super-concentrated form delivered like the office coffee service. Plus, no one had to worry about all that behind-the-counter pseudoephedrine run around. "Hey, read as much as you want, no houses were blown up in Indiana to make your brain buzz."

It was a RUSH to know all this stuff, and know it soonest; but it came like a flood. That un-read counter was HARD to keep to zero and there was always one more blog to add. Read one interesting post and be stuck with them forever. In time keeping up with my RSS reader came to be like Lucy in the chocolate factory with the conveyor belt streaming by. From my vantage point today, RSS seems quaint. The good old days. I gave it up for good last year when I finally bought an iPhone and tapped Twitter straight into the vein. Yeah, I went real time.

Now I can get a hit at every stop light. Between previews at the movies. Waiting for the next course at a restaurant. While you are talking to me on a conference call (it's your fault, be interesting). When you look down at dinner to check yours. Last thing before I go to sleep. The moment I wake up. Sitting at a bar. Walking home. While opening presents on Christmas morning (don't judge me, you did it too). In between the sentences of this paragraph.

I am perfectly informed (I will know it before it hits the New York Times home page) and I'm utterly distracted.

Here are just a few of the things I learned yesterday while I was working on this post. Scientists are tracking malaria with cell phone data, there is an open source GSM base station project, I need to bend over (and touch my toes) more, WWII 8th Air Force bomber crews had brass ones (seriously, read this pdf), Erik Prince is probably graymailing the CIA, and electric motorcycles seem to be on the verge of being popular.

So here I am at the nexus of ADHD, AMS*, and digital Narcissism. I'm in a Skinner box alright, but I don't smack the bar and wait for pellets, I tweet into the void and listen for echoes. There it is now, that sweet sweet tweet of instant 140 char affirmation. Feels good. RT means validation. I think I'm developing a Pavlovian response to the @ symbol that borders on the sexual.

And I remember to give RT love too. Even if the tweet didn't really grab me as much as I let on. After all, you have to grease the machine to keep its pellet chute clear. Give to get. I won't RT cheeky_geeky though, gotta draw the line somewhere. No preferential attachment help from this tweeter. Better to RT the ones that really need it; they'll be more grateful and they'll come through later when I'm jonesing hard for 140 characters of meaningful interaction.

And Twitterfall! I've only experienced it once, but holy shit, it's a Skinner Box spitting champagne truffles. It's real time plus real place. Back channel my ass, this is narcissism's mirror mirror on the wall, who's the twitteringest mofo of them all? And it's big. Don't have to wait for the echo, I can see it right there! And so can everyone else. A perfect cybernetic feedback loop of self. A self licking ice cream cone of the mind. I didn't know it till I experienced Twitterfall, but ASCII echo isn't enough. We're still flesh with pumping hearts after all and we want to feel the response. Listen to them shift in their seats as my last twitticism wends its way down the wall. Slow down you bastards, let it hang there a bit, it was a good one. Hear that? Yeah, they saw it.

This brave new inter-networked, socially-mediated, post-industrial, cybernetically-interwoven world is an integrated web of Pavlovian stimulus and response and I'm barking at the bell. Turns out, this isn't a Skinner Box. No, "box" is too confining for this metaphor. This is a fully networked, digitally rendered, voluntarily joined Skinner Borg. It doesn't embed itself in us, we embed ourselves in it. It's Clockwork Orange, self served.

For the last couple of years I've jacked in to this increasing bit rate of downloadable intellectual breadth and I've traded away the slow conscious depth of my previous life. And you know what? Now I'm losing my self. I used to be a free standing independent cerebral cortex. My own self. But not any more. Now I'm a dumb node in some uber-net's basal ganglia. Tweet, twitch, brief repose; repeat. My autonomic nervous system is plugged in, in charge, and interrupt ready while the gray wrinkly stuff is white knuckled from holding on.


The singularity is here, and it's us... also it's dumb, snarky, and in love with itself.

Everyone is worried that the singularity will be smart, I'm worried that it will be dumb, with a high clock speed. Any dumb ass can beat you at chess if it gets ten moves to your one. In fact, what if the singularity already happened, we are its neurons, and it's no smarter than a C. elegans worm? Worse, after the Twitterfall incident, I'm worried about what it will do when it discovers its motor neural pathways.

The human brain is brilliance derived from dumb nerves. Out of those many billions of simple connections came our Threshold of Reflection and everything that followed. But consciousness is going meta and we're being superseded by a borg-like singularity; intelligence turned upside down. Smart nodes suborning ourselves to a barely conscious #fail-obsessed network. It's dumb as a worm, fast as a photo multiplier tube, and ready to rage on at the slightest provocation. If you're on stage (or build a flawed product, or ever ever mention politics), watch out.

We don't plan to go mob rules any more than a single transistor on your computer intends to download porn. We participate in localized stimulus and response. Macro digital collectivism from local interaction. Macro sentiment from local pellet bar smacking.

We're pre-implant so I plug into the Skinner Borg with fingers and eyes that are low bandwidth synapses. When I try to unplug (or when I'm forced to in an airplane at altitude), my fingers tingle and I feel it still out there. I'm a stimulus seeking bundle of nerves. I experience the missing network like a phantom limb.

So where's this going? Like I said, I'm not a Luddite but I'm no Pollyanna Digitopian either. Age of spiritual machines? Whatever. Show me spiritual people. When the first machine or machine-assisted meta-consciousness arrives on the scene, it's going to be less like the little brother that you played Battleship with and more like a dumb digital version of poor Joe from Johnny Got His Gun. Barely sentient but isolated from sensation. Do we think that a fully formed functional consciousness is going to spring to life the first time sufficient processing power is there to enable it? I'm not worried about it replicating and taking over the world, I'm worried about it going completely bat shit crazy and stumbling around breaking stuff in an impotent rage.


My Dilemma, Evolution, and Entropy

All this talk of borgs, singularities, and addiction doesn't address my very real and right now dilemma. The world is changing and we all have to keep up. Mainlining memes is AWESOME for that, but at what cost? It's a bargain that I'm trying not to see as Faustian.

We don't have parallel ports so we have to choose. Lots of bite sized pellets or slow down and go deep? Frenetic pursuit of the novel or quiet concentration? Can I stay plugged in without giving up my ability to focus? I don't want to be a donor synapse to the worm and I don't want to have to intravenously drip Adderall to cope.

At root, this is a question of breadth vs. depth and finding the right balance. This conversation was started by a conference. Organizers have to choose too, and they base their choices on what they think we prefer. Do we want to listen to Sandy Pentland for an hour and come away with a nuanced understanding of his work on honest signals, or would we rather have six twitter-overlaid ten minute overviews in the same hour? Are we looking for knowledge? Or suggestions of what to investigate more deeply later (assuming we can find some "later" to work with)? Can we sit still for an hour even if we want to?

We humans and our organizations are open dissipative systems evolving memetically in the information realm and genetically on intergenerational time scales. Living organisms beat entropy by remaining in a perpetual state of disequilibrium - they absorb energy from their environment and exhaust disorder back into it. The greater their disequilibrium, the more energy that is required to maintain an internally ordered state, but paradoxically, the more adaptive they are to changing surroundings.

If we work in a domain in flux we require a higher rate of information consumption to maintain our ability to adapt while maintaining an internally ordered state. The U.S. Army is experiencing this now as it tries to adapt to the difference between a Fulda Gap standoff and the current counter insurgency mission. Moving from a period of relative stasis to a tighter evolutionary time scale, it's adapt or lose. As a learning organization its emphasis has to shift from transmitting and conforming with existing knowledge, to consuming and processing new.

The pursuit of novelty in this context isn't just fun, it is the foundation for a stocked library of adaptive schemata that support intellectual evolution. Since memes and the seeds of synthesis can come in compact packages, a broad, fast, and shallow headspace can work in times of rapid change. This isn't just an argument for fast paced conferences with lots of breadth, but it also explains why twitter, RSS feeds, and broad weakly-connected social networks (e.g. Foo) are so valuable. It's also one of the arguments that I make in enterprises like the DoD why they should promote rather than discourage social media use.

However, I don't think the evolutionary / entropic argument is the only one in play. The cultural and cognitive domains are relevant too, and speaking personally, I feel like I'm bumping hard up against some relevant limits. My memetic needs are increasing faster than genetic barriers can evolve. Obviously, in the cultural domain we are becoming more accustomed to fast paced transitions and partial attention. However, anecdotally it seems like I'm not the only one wondering about the impact of all this stuff. Early signals are popping up in the strangest places. When I attended the Gov 2.0 Summit more than one participant commented that the fast paced format was intellectually exhausting.

By nature I'm an abstainer more than a moderator. It's hard for me to limit a behavior by doing just a little bit of it. Just check the empty quart ice cream container in my trash can if you doubt me. So, frankly, I am stumped on what to do. I simply don't know how to proceed in a way that will keep the information flow going but in a manner that doesn't damage my ability to produce work of my own.


Which Singularity?

In the early years of the last century the Dadaists observed America's technological progress from their Parisian perch and recoiled artistically from the dehumanizing eruptions of concrete and steel in the machine age capital of Manhattan. Paintings like Picabia's Universal Prostitution were comments on how our culture (and perhaps selves) seemed to be merging with machines. Having observed the early machine age's ascendance first hand, Duchamp would have understood our uneasy fascination with the singularity.

I'm trapped in a cybernetic feedback loop. That much is clear. However, these loops operate at different scales in both time and space and maybe Twitter itself solves at least one larger conundrum. As we join the Skinner Borg in droves, and our ability to concentrate is compromised at scale, who is going to develop the technologies that evolve the worm?

When astrophysicists talk about black holes, the singularity proper is the point at the centre; the image I actually have in mind is the horizon around it, that surface, some distance out from the hole itself, where gravity just balances light's ability to escape. Along that surface, light just hangs there in perpetual stop motion.
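For a non-rotating black hole that surface sits at the Schwarzschild radius, the distance at which escape velocity reaches the speed of light, roughly three kilometres for every solar mass that fell in:

\[
r_s = \frac{2GM}{c^2} \approx 2.95\ \mathrm{km} \times \frac{M}{M_\odot}
\]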

The very technology that makes our collective integration possible also distracts us from advancing it. In equilibrium, distraction and ambition square off at the singular point of failed progress. If the next generation of Moores, Joys, and Kurzweils is half as distracted as I am, we are going to find ourselves frozen right here, nodes in a wormy borg that never becomes a butterfly (yeah, I know, worms don't become butterflies, but I'm desperate to finish...). Anyway, maybe Twitter is just God's way of making sure we never manage to finish creating our future oppressor.

p.s. There really is an app for that.

*AMS = Afraid to miss something

September 24 2008

TERRA 447: Malice in Wonderland PREVIEW

Sometimes, unexpected guests can very quickly become pests. But what happens when you ask them to leave? Join the Red Queen and Alice on a live-action, animated journey down the rabbit hole, as they are caught in an evolutionary arms race between parasite and host.

Science films often portray their content in a very pedestrian way. This one set out to challenge that approach by presenting a theory about parasite/host co-evolution (by Matt Ridley) in a visually extraordinary manner, using Lewis Carroll's imagery. This short is aimed at a general, educated audience, and was produced as part of an ongoing MFA in Science & Natural History Filmmaking at Montana State University in Bozeman, Montana.