November 08 2013

‘Focus,’ by Daniel Goleman - NYTimes.com
http://www.nytimes.com/2013/11/03/books/review/focus-by-daniel-goleman.html

Nicholas Carr reviews "Focus," Daniel Goleman's new book, which reminds us that #attention is not a switch flipped between concentration and distraction. Attention is far more varied than that. Concentrate too hard and we become perfectly inattentive. Our attention emerges from the interplay of two different parts of our #brain: the lower brain, which works outside our awareness, monitoring the signals coming from our senses and acting as an alert system. (...)

#cognition #neuroscience


August 25 2011

Four short links: 25 August 2011

  1. Steve Jobs's Best Quotes (WSJ Blogs) -- Playboy: We were warned about you: before this interview began, someone said we were "about to be snowed by the best." [Smiling] "We're just enthusiastic about what we do." (via Kevin Rose)
  2. The Tao of Programming -- The Tao gave birth to machine language. Machine language gave birth to the assembler. The assembler gave birth to the compiler. Now there are ten thousand languages. Each language has its purpose, however humble. Each language expresses the Yin and Yang of software. Each language has its place within the Tao. But do not program in COBOL if you can avoid it. (via Chip Salzenberg)
  3. In Defense of Distraction (NY Magazine) -- a long, thoughtful piece about attention. The polymath economist Herbert A. Simon wrote maybe the most concise possible description of our modern struggle: "What information consumes is rather obvious: It consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it." (via BoingBoing)
  4. 31 Days of Canvas Tutorials -- a pointer to 31 tutorials on the HTML5 Canvas.

November 22 2010

Why "Delivering Happiness" is a must read

At Web 2.0 Summit last week, Tony Hsieh explained to the audience that when it came to phone support at Zappos, he had opted to use the phone as a branding tool rather than to focus on expense minimization or revenue maximization (upselling). Zappos is thriving. In his tenure as CEO of Zappos, Tony has made many decisions that might fly in the face of conventional advice, and in his book "Delivering Happiness," readers gain wisdom from his stories and processes.

For more than 40 years, we've escalated our obsession with productivity. From efficiency experts at Disney in the 1960s peering over the shoulders of animators, to being accessible 24/7 today, checking email and answering the phone in the bathroom, we have worshipped at the altar of output, efficiency and accessibility. Our productivity-obsessed society manages time and not attention.

At many companies, software developers are rewarded for knocking out the list of features and cranking toward the release date with no emphasis on the quality of the feature being checked. Does it work? Check. Tests ran? Check. Is anyone asking if it's contributing to the excellence of the product or service? Is there another way, a path to quality results and profitability that is not productivity-obsessed?

One might venture a guess that Apple has found that path. Readers of "Delivering Happiness" realize that this alternate path is the secret sauce at Zappos.

Our productivity-obsessed society is in the more, faster club. Sometimes that's a good thing. Sometimes it's not. Which is why I started a discussion about post-productivity computing and an era characterized by post-productivity values.

Post-productivity does not mean unproductive. It does mean, let's take the best and leave the rest, as our burned-out selves, our burned-out workforce and our burned-out economy take steps to move into a thriving and prosperous 21st century. This era of engagement is post-productivity because the motivations and metrics mine, in the best ways, our human assets (positive emotions, positive relationships, meaning, engagement) as well as profitability. I use the term post-productivity primarily as a reminder that a productivity-obsessed approach to an era of engagement takes us right back into the murky swamp.

This is about a new mindset. Tony Hsieh figured it out, and last year 25,000 people applied for the 250 job openings at Zappos. Applicants are enthusiastic to be part of an era-of-engagement, post-productivity company -- selling shoes online, being part of the Zappos team.

Like many of you, I've worked in companies that pit employees against each other by rating on a curve. Productivity, which could be contagious, instead, becomes a zero sum game. The brilliant and insightful Stanford professor Carol Dweck, author of "Mindset," would view this as a management process with a fixed mindset orientation. Hsieh manages Zappos for what Dweck calls a growth mindset, a mindset that welcomes challenges, embraces exuberant learning, and experiences failure as part of positive forward motion.

Once you start reading "Delivering Happiness" you'll feel an irresistible pull to email Zappos for a copy of their "Culture Book." This crowd-sourced, edited collection bathes the reader in stories of a corporate culture characterized by trust, productivity, joy, and profitability. "Delivering Happiness" is an inspired and inspiring must-read for our journey into an era of engagement.



November 01 2010

The economics of gaining attention

A fascinating article at the Daily Beast chronicled an attempt to reverse-engineer the Facebook news feed. It sought to answer questions about how Facebook chooses whom and what to display on your news feed page.

Not surprisingly, Facebook makes assumptions based on behavior to ensure that it propagates people and information with the highest likelihood of gaining attention or engagement. For example, individuals whose profiles are “stalked” by others show up disproportionately in news feeds because Facebook assumes they must be stalked for good reason. They must be interesting.
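
The article can only infer the feed's logic from the outside, but the general shape of the idea -- score candidate stories by how likely they are to earn attention, then surface the top of the list -- is easy to sketch. The signals, weights, and function names below are invented for illustration; this is not Facebook's actual ranking code.

    # Hypothetical sketch of engagement-based feed ranking (not Facebook's real algorithm).
    from dataclasses import dataclass

    @dataclass
    class Story:
        author: str
        author_profile_views: int   # how often others "stalk" this author's profile
        viewer_past_clicks: int     # how often this viewer engaged with the author before
        comments: int
        likes: int

    def engagement_score(story: Story) -> float:
        # Invented weights: heavily viewed profiles and familiar authors rank higher.
        return (0.5 * story.author_profile_views
                + 2.0 * story.viewer_past_clicks
                + 1.0 * story.comments
                + 0.5 * story.likes)

    def rank_feed(stories: list[Story], limit: int = 10) -> list[Story]:
        # Show the stories most likely to gain attention or engagement.
        return sorted(stories, key=engagement_score, reverse=True)[:limit]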


As Facebook becomes an increasingly vital part of how businesses connect with customers, the algorithms determining who gets attention will become increasingly important. They shape business communications and behavior.

We now have a long history of content being written to accommodate the rules of search engines -- particularly Google. We research keywords and then ensure they are placed at the front of our headlines and titles. We reorganize content into staccato bursts of bullet points and subtitles, and so on. Optimization of this kind now dominates all professional content production on the web and shapes our experience as consumers of that content.

As our social, economic and political lives are increasingly mediated through a few consolidated technologies such as Facebook and Google, software exerts a profound influence on the way we engage with one another. The natural, sociological secrets of how to gain attention are being codified. In turn, this creates a normative effect on how we behave. We conform to the rules embedded in the code.

We have always written lead lines with an eye to attracting readers, but there are two aspects here that are new:

  1. The widespread incorporation of scientific rigor into the exercise. The Huffington Post, for example, A/B tests its own headlines and runs with the winner (a minimal sketch of such a test follows this list).
  2. The uniformity of the resulting norms. We are conforming to a few dominant algorithms.
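
Here is what a bare-bones headline test might look like. The traffic numbers, the 95% threshold, and the use of a two-proportion z-test are assumptions for illustration; they say nothing about the Huffington Post's actual tooling.

    # Hypothetical headline A/B test: split traffic, compare click-through rates.
    from math import sqrt

    def ctr_z_score(clicks_a, views_a, clicks_b, views_b):
        """Two-proportion z statistic for headline A's CTR vs headline B's."""
        p_a, p_b = clicks_a / views_a, clicks_b / views_b
        p_pool = (clicks_a + clicks_b) / (views_a + views_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
        return (p_a - p_b) / se

    # Illustrative numbers: headline A got 120 clicks on 5,000 views, B got 90 on 5,000.
    z = ctr_z_score(120, 5000, 90, 5000)
    if z > 1.96:
        print("run with headline A")    # A's CTR is significantly higher at the 95% level
    elif z < -1.96:
        print("run with headline B")
    else:
        print("keep testing")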

Gaining attention in this world becomes as much about the science of standing out as the art of being outstanding. And every link forged is a form of currency exchange where the market favors the heavyweights.

While I don't doubt that we will see a continued wellspring of creativity emerge from an open web, these algorithms themselves represent a bias toward those who decipher the code. Doing so requires resources that favor the large over the small, and the organization over the individual. There is nothing new to this progression, but it does run counter to the heroic individual archetype (the lone blogger, the basement video show broadcast around the world, etc.) that the web often celebrates as its own unique progeny.



October 28 2010

Four short links: 28 October 2010

  1. Exploring Computational Thinking (Google) -- educational materials to help teachers get students thinking about recognizing patterns, decomposing problems, and so on.
  2. TimeMap -- Javascript library to display time series datasets on a map.
  3. Feedly -- RSS feeds + twitter + other sites into a single magazine format.
  4. Attention and Information -- what appears to us as “too much information” could just be the freedom from necessity. The biggest change ebooks have made in my life is that now book reading is as stressful and frenetic as RSS reading, because there's as much of an oversupply of books-I'd-like-to-read as there is of web-pages-I'd-like-to-read. My problem isn't over-supply of material, it's a shortage of urgency that would otherwise force me to make the hard decisions about "no, don't add this to the pile, it's not important enough to waste my time with". Instead, I have 1990s books on management that looked like maybe I might learn something .... (via Clay Shirky on Twitter)

June 27 2010

A New Era of Post-Productivity Computing?

Glenn Fisher recently posted on software that disables bits of the computer to make us more productive and to minimize distractions. Programs like Freedom, Isolator, RescueTime, LeechBlock, Turn Off the Lights and others were mentioned, with more coverage going to Freedom, a tool that blocks distractions. Freedom users can choose to disable Internet access and/or local network access. Users claim that software like Freedom makes them more productive by blocking tempting distractions.
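
None of those tools document their internals in Glenn's post, but the basic move -- making distracting sites unreachable for a fixed stretch of time -- fits in a few lines. The hosts-file approach, the site list, and the 25-minute default below are assumptions for illustration, not how Freedom or LeechBlock actually work.

    # Minimal sketch of a distraction blocker: point distracting domains at localhost
    # via /etc/hosts for a fixed period, then restore. Needs root; illustration only.
    import time

    HOSTS = "/etc/hosts"
    MARKER = "# focus-block"
    SITES = ["twitter.com", "www.facebook.com", "news.ycombinator.com"]

    def block(minutes):
        # Append blocking entries so the listed sites resolve to localhost.
        with open(HOSTS, "a") as f:
            for site in SITES:
                f.write(f"127.0.0.1 {site} {MARKER}\n")
        try:
            time.sleep(minutes * 60)            # stay "free" for the chosen stretch
        finally:
            # Remove only the lines we added, restoring the original hosts file.
            with open(HOSTS) as f:
                kept = [line for line in f if MARKER not in line]
            with open(HOSTS, "w") as f:
                f.writelines(kept)

    if __name__ == "__main__":
        block(25)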

I’m not opposed to using technologies to support us in reclaiming our attention. But I prefer passive, ambient, non-invasive technologies over parental ones. Consider the Toyota Prius. The Prius doesn’t stop in the middle of a highway and say, “Listen to me, Mr. Irresponsible Driver, you’re using too much gas and this car isn’t going to move another inch until you commit to fix that.” Instead, a display engages us in a playful way and our body implicitly learns to shift to use less gas.

Glenn was kind enough to call me for a comment as he prepared his post. We talked about email apnea, continuous partial attention, and how, while software that locks out distractions is a great first step, our ultimate opportunity is to evolve our relationship with personal technologies.

Personal technologies today are prosthetics for our minds.

In our current relationship with technology, we bring our bodies, but our minds rule. “Don’t stop now, you’re on a roll. Yes, pick up that phone call, you can still answer these six emails. Follow Twitter while working on PowerPoint, why not?” Our minds push, demand, coax, and cajole. “No break yet, we’re not done. No dinner until this draft is done.” Our tyrannical minds conspire with enabling technologies and our bodies do their best to hang on for the wild ride.

With technologies like Freedom, we re-assign the role of tyrant to the technology. The technology dictates to the mind. The mind dictates to the body. Meanwhile, the body that senses and feels, that turns out to offer more wisdom than the finest mind could even imagine, is ignored.

At the heart of compromised attention is compromised breathing. Breathing and attention are commutative. Athletes, dancers, and musicians are among those who don’t have email apnea. Optimal breathing contributes to regulating our autonomic nervous system and it’s in this regulated state that our cognition and memory, social and emotional intelligence, and even innovative thinking can be fueled.

Our opportunity is to create personal technologies that are prosthetics for our beings. Conscious Computing. It’s post-productivity, post-communication era computing. Personal technologies that enhance our lives.

Thirty years ago, personal computing technologies created a revolution in personal productivity, supporting a value on self-expression, output and efficiency. The personal communications technology era that followed the era of personal productivity amplified accessibility and responsiveness. Personal technologies have served us well as prosthetics for the mind, in service of thinking and doing.

Scientists like Antonio Damasio, Daniel Siegel, and Daniel Goleman are showing us that aspects of our intelligence come from sensing and feeling, and that our bodies offer a kind of wisdom.

Here at #Foo10, Sara has just pointed out that, for the first time she can remember, people are sitting in sessions, taking notes on notepads, laptops closed. Laptops are out of sight. It feels different. That’s another option. We can use technology to help enable Conscious Computing, or we can find it on our own, through attending to how we feel.

How do we usher in an era of Conscious Computing? What tools, technologies, and techniques will it take for personal technologies to become prosthetics of our full human potential?

January 04 2010

Skinner Box? There's an App for That

If you are reading this post it means that after countless misfires, I finally kept my attention focused long enough to finish it. That may seem like no big deal, a mere trifling effort, but I'm basking in the moment. In fact, I'll probably tweet it.

It didn't start out to be about digital Skinner boxes. It was a Radar backchannel email about the infamous Web 2.0 Expo Twitterfall incident. I got all curmudgeonly and ranted about continuous partial attention, Twitter as a snark amplifier, and the "Ignite'ification" of conferences (with apologies to Brady). In short, I demonstrated myself unfit to contribute to a blog called Radar.

I swear I'm not a Luddite. I'm not moving to Florida to bitch about the government full time and I'm not in some remote shack banging this out on an ancient Underwood. However, I guess I count myself among the skeptics when it comes to the unmitigated goodness of progress. Or at least its distant cousin, trendiness.

Anyway, I sent the email, inexplicably Jesse said "post!", and I tried reworking it. I still am. This piece has been grinding away like sand in my cerebral gears since, and along the way it has become about something else.

In The Anthologist, Nicholson Baker describes writing poetry as the process of starting with a story and building a poem around it. I try to do that with photography and build pictures around narrative and metaphor. After the work takes shape the story is carved back out and what remains hints at the story's existence, like a smoke ring without the mouth.

He says it better: "If you listen to them, the stories and fragments of your stories you hear can sometimes slide right into your poem and twirl around in it. Then later you cut out the story and the poem has a mysterious feeling of charged emptiness, like the dog after the operation." Don't worry about the dog, it lived and it isn't relevant. My point is that this post isn't about the Twitterfall fail story, that was just a catalyst. The inchoate uneasiness still twirling around in here is what's left of it.

This all began with these lingering questions: "Why are we conference attendees paying good money, traveling long distances, and sitting for hours in chairs narrower than our shoulders only to stare at our laptops? Why do we go to all that trouble and then spend the time Twittering and wall posting on the overwhelmed conference wifi? Or, more specifically, why are we so fascinated with our own 140 character banalities pouring down the stage curtain that we ignore, or worse, mob up on, the speakers that drew us there in the first place?"

As I kept working away on what has become this overlong post, the question eventually turned into, "why the hell can't I finish this?" This has become the post about distraction that I've been too distracted to complete. It's also about ADHD and the digital Skinner box that makes it worse, narcissism's mirror, network collectivism and the opt-in borg, and an entropic counter-argument for plugging in anyway. So, here goes...


My name is Jim, and I'm a digital stimulusaholic

A few weeks ago I was watching TV from across the room in the airport and couldn't hear the sound. The missing soundtrack made the cuts more obvious, so I timed them. They averaged about 1.5 seconds, ranging from about a quarter of a second to at most three. The standard deviation was pretty tight, but there was plenty of random jitter and the next cut was always a surprise. Even during the shortest clips the camera zoomed or panned (or both). Everything was always in motion, like a drunk filming dancers. Even though I've known this was the trend for a while, it surprised me. Without the dialog to provide continuity it was disconcerting and vertigo inducing.
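
The arithmetic behind those numbers is nothing fancy; the timings below are invented stand-ins for the cuts I clocked off the muted screen.

    # Invented shot-length timings (seconds), roughly like those clocked at the airport.
    from statistics import mean, stdev

    cuts = [1.2, 0.8, 1.6, 2.9, 1.4, 0.3, 1.7, 1.1, 2.2, 1.5]

    print(f"mean shot length: {mean(cuts):.2f}s")   # about 1.5s
    print(f"std deviation:    {stdev(cuts):.2f}s")  # fairly tight spread
    print(f"range: {min(cuts)}s to {max(cuts)}s")   # from a fraction of a second up to ~3s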

In his book In the Blink of an Eye, Walter Murch describes movie editing as a process akin to adding eye blinks where they naturally belong, so that film works for the brain like a dream. It's a lovely book, by the way. I think these frenetic transitions alter how we experience that film-induced dream state, and for me at least, they can make a film feel like a nightmare during exam week. Unfortunately, much of my daily experience mirrors this new cultural reality.

Before I discovered Twitter I used to joke that coffee was at the root of a personal productivity paradox. I could drink it and stay alert while wearing a path in the carpet back and forth to the men's room. Or I could stay stimulant free and sleep at my desk. That was a joke, but the information-sphere that we live in now is like that. I can either drink liberally from the fire hose and stimulate my intellect with quick-cutting trends, discoveries, and memes; but struggle to focus. Or I can sign off, deactivate, and opt out. Then focus blissfully and completely on the rapidly aging and increasingly entropic contents of my brain, but maybe finish stuff. Stuff of rapidly declining relevance.

We have such incredible access to information; I just wish it wasn't so burdened with this payload of distraction. Also, I wish my brain wasn't being trained to need these constant microbursts of stimulation.

Email was the first electronic medium to raise my clock speed, and also my first digital distraction problem. After some "ding, you have mail," I turned off the BlackBerry notification buzz, added rationing to my kit bag of coping strategies, and kept on concentrating. Then RSS came along and it was like memetic crystal meth. The pursuit of novelty in super-concentrated form, delivered like the office coffee service. Plus, no one had to worry about all that behind-the-counter pseudoephedrine runaround. "Hey, read as much as you want, no houses were blown up in Indiana to make your brain buzz."

It was a RUSH to know all this stuff, and know it soonest; but it came like a flood. That unread counter was HARD to keep at zero and there was always one more blog to add. Read one interesting post and you're stuck with that blog forever. In time, keeping up with my RSS reader came to be like Lucy in the chocolate factory with the conveyor belt streaming by. From my vantage point today, RSS seems quaint. The good old days. I gave it up for good last year when I finally bought an iPhone and tapped Twitter straight into the vein. Yeah, I went real time.

Now I can get a hit at every stop light. Between previews at the movies. Waiting for the next course at a restaurant. While you are talking to me on a conference call (it's your fault, be interesting). When you look down at dinner to check yours. Last thing before I go to sleep. The moment I wake up. Sitting at a bar. Walking home. While opening presents on Christmas morning (don't judge me, you did it too). In between the sentences of this paragraph.

I am perfectly informed (I will know it before it hits the New York Times home page) and I'm utterly distracted.

Here are just a few of the things I learned yesterday while I was working on this post. Scientists are tracking malaria with cell phone data, there is an open source GSM base station project, I need to bend over (and touch my toes) more, WWII 8th Air Force bomber crews had brass ones (seriously, read this pdf), Erik Prince is probably graymailing the CIA, and electric motorcycles seem to be on the verge of being popular.

So here I am at the nexus of ADHD, AMS*, and digital Narcissism. I'm in a Skinner box alright, but I don't smack the bar and wait for pellets, I tweet into the void and listen for echoes. There it is now, that sweet sweet tweet of instant 140 char affirmation. Feels good. RT means validation. I think I'm developing a Pavlovian response to the @ symbol that borders on the sexual.

And I remember to give RT love too. Even if the tweet didn't really grab me as much as I let on. After all, you have to grease the machine to keep its pellet chute clear. Give to get. I won't RT cheeky_geeky though, gotta draw the line somewhere. No preferential attachment help from this tweeter. Better to RT the ones that really need it; they'll be more grateful and they'll come through later when I'm jonesing hard for 140 characters of meaningful interaction.

And Twitterfall! I've only experienced it once, but holy shit, it's a Skinner Box spitting champagne truffles. It's real time plus real place. Back channel my ass, this is narcissism's mirror mirror on the wall, who's the twitteringest mofo of them all? And it's big. Don't have to wait for the echo, I can see it right there! And so can everyone else. A perfect cybernetic feedback loop of self. A self licking ice cream cone of the mind. I didn't know it till I experienced Twitterfall, but ASCII echo isn't enough. We're still flesh with pumping hearts after all and we want to feel the response. Listen to them shift in their seats as my last twitticism wends its way down the wall. Slow down you bastards, let it hang there a bit, it was a good one. Hear that? Yeah, they saw it.

This brave new inter-networked, socially-mediated, post-industrial, cybernetically-interwoven world is an integrated web of Pavlovian stimulus and response and I'm barking at the bell. Turns out, this isn't a Skinner Box. No, "box" is too confining for this metaphor. This is a fully networked, digitally rendered, voluntarily joined Skinner Borg. It doesn't embed itself in us, we embed ourselves in it. It's Clockwork Orange, self served.

For the last couple of years I've jacked in to this increasing bit rate of downloadable intellectual breadth and I've traded away the slow conscious depth of my previous life. And you know what? Now I'm losing my self. I used to be a free standing independent cerebral cortex. My own self. But not any more. Now I'm a dumb node in some uber-net's basal ganglia. Tweet, twitch, brief repose; repeat. My autonomic nervous system is plugged in, in charge, and interrupt ready while the gray wrinkly stuff is white knuckled from holding on.


The singularity is here, and it's us... also it's dumb, snarky, and in love with itself.

Everyone is worried that the singularity will be smart, I'm worried that it will be dumb, with a high clock speed. Any dumb ass can beat you at chess if it gets ten moves to your one. In fact, what if the singularity already happened, we are its neurons, and it's no smarter than a C. elegans worm? Worse, after the Twitterfall incident, I'm worried about what it will do when it discovers its motor neural pathways.

The human brain is brilliance derived from dumb nerves. Out of those many billions of simple connections came our Threshold of Reflection and everything that followed. But consciousness is going meta and we're being superseded by a borg-like singularity; intelligence turned upside down. Smart nodes suborning ourselves to a barely conscious #fail-obsessed network. It's dumb as a worm, fast as a photo multiplier tube, and ready to rage on at the slightest provocation. If you're on stage (or build a flawed product, or ever ever mention politics), watch out.

We don't plan to go mob rules any more than a single transistor on your computer intends to download porn. We participate in localized stimulus and response. Macro digital collectivism from local interaction. Macro sentiment from local pellet bar smacking.

We're pre-implant so I plug into the Skinner Borg with fingers and eyes that are low bandwidth synapses. When I try to unplug (or when I'm forced to in an airplane at altitude), my fingers tingle and I feel it still out there. I'm a stimulus seeking bundle of nerves. I experience the missing network like a phantom limb.

So where's this going? Like I said, I'm not a Luddite but I'm no Pollyanna Digitopian either. Age of spiritual machines? Whatever. Show me spiritual people. When the first machine or machine-assisted meta-consciousness arrives on the scene, it's going to be less like the little brother that you played Battleship with and more like a dumb digital version of poor Joe from Johnny Got His Gun. Barely sentient but isolated from sensation. Do we think that a fully formed functional consciousness is going to spring to life the first time sufficient processing power is there to enable it? I'm not worried about it replicating and taking over the world, I'm worried about it going completely bat shit crazy and stumbling around breaking stuff in an impotent rage.


My Dilemma, Evolution, and Entropy

All this talk of borgs, singularities, and addiction doesn't address my very real and right now dilemma. The world is changing and we all have to keep up. Mainlining memes is AWESOME for that, but at what cost? It's a bargain that I'm trying not to see as Faustian.

We don't have parallel ports so we have to choose. Lots of bite sized pellets or slow down and go deep? Frenetic pursuit of the novel or quiet concentration? Can I stay plugged in without giving up my ability to focus? I don't want to be a donor synapse to the worm and I don't want to have to intravenously drip Adderall to cope.

At root, this is a question of breadth vs. depth and finding the right balance. This conversation was started by a conference. Organizers have to choose too, and they base their choices on what they think we prefer. Do we want to listen to Sandy Pentland for an hour and come away with a nuanced understanding of his work on honest signals, or would we rather have six twitter-overlaid ten minute overviews in the same hour? Are we looking for knowledge? Or suggestions of what to investigate more deeply later (assuming we can find some "later" to work with)? Can we sit still for an hour even if we want to?

We humans and our organizations are open dissipative systems, evolving memetically in the information realm and genetically on intergenerational time scales. Living organisms beat entropy by remaining in a perpetual state of disequilibrium: they absorb energy from their environment and exhaust disorder back into it. The greater their disequilibrium, the more energy is required to maintain an internally ordered state, but paradoxically, the more adaptive they are to changing surroundings.

If we work in a domain in flux, we require a higher rate of information consumption to maintain our ability to adapt while preserving an internally ordered state. The U.S. Army is experiencing this now as it tries to adapt to the difference between a Fulda Gap standoff and the current counterinsurgency mission. Moving from a period of relative stasis to a tighter evolutionary time scale, it's adapt or lose. As a learning organization, its emphasis has to shift from transmitting and conforming with existing knowledge to consuming and processing the new.

The pursuit of novelty in this context isn't just fun; it is the foundation for a stocked library of adaptive schemata that support intellectual evolution. Since memes and the seeds of synthesis can come in compact packages, a broad, fast, and shallow headspace can work in times of rapid change. This isn't just an argument for fast-paced conferences with lots of breadth; it also explains why Twitter, RSS feeds, and broad, weakly connected social networks (e.g. Foo) are so valuable. It's also one of the arguments I make to enterprises like the DoD for why they should promote, rather than discourage, social media use.

However, I don't think the evolutionary/entropic argument is the only one in play. The cultural and cognitive domains are relevant too, and speaking personally, I feel like I'm bumping hard up against some relevant limits. My memetic needs are increasing faster than genetic barriers can evolve. Obviously, in the cultural domain we are becoming more accustomed to fast-paced transitions and partial attention. However, anecdotally it seems like I'm not the only one wondering about the impact of all this stuff. Early signals are popping up in the strangest places. When I attended the Gov 2.0 Summit, more than one participant commented that the fast-paced format was intellectually exhausting.

By nature I'm an abstainer more than a moderator. It's hard for me to limit a behavior by doing just a little bit of it. Just check the empty quart ice cream container in my trash can if you doubt me. So, frankly, I am stumped on what to do. I simply don't know how to proceed in a way that will keep the information flow going but in a manner that doesn't damage my ability to produce work of my own.


Which Singularity?

In the early years of the last century the Dadaists observed America's technological progress from their Parisian perch and recoiled artistically from the dehumanizing eruptions of concrete and steel in the machine age capital of Manhattan. Paintings like Picabia's Universal Prostitution were comments on how our culture (and perhaps selves) seemed to be merging with machines. Having observed the early machine age's ascendance first hand, Duchamp would have understood our uneasy fascination with the singularity.

I'm trapped in a cybernetic feedback loop. That much is clear. However, these loops operate at different scales in both time and space and maybe Twitter itself solves at least one larger conundrum. As we join the Skinner Borg in droves, and our ability to concentrate is compromised at scale, who is going to develop the technologies that evolve the worm?

When astrophysicists describe that edge of a black hole, they mean the event horizon: the surface where gravity just balances the ability of light to escape. Along that sphere, some distance out from the singularity at the center, light just hangs there in perpetual stop motion.
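
As an aside, a back-of-the-envelope Newtonian argument happens to give the right radius for that sphere: set the escape velocity equal to the speed of light and solve,

    \frac{1}{2} c^2 = \frac{GM}{r_s} \quad\Longrightarrow\quad r_s = \frac{2GM}{c^2}

which is the Schwarzschild radius of a non-rotating black hole of mass M.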

The very technology that makes our collective integration possible also distracts us from advancing it. In equilibrium, distraction and ambition square off at the singular point of failed progress. If the next generation of Moores, Joys, and Kurzweils are half as distracted as I am, we are going to find ourselves frozen right here, nodes in a wormy borg that never becomes a butterfly. (Yeah, I know, worms don't become butterflies, but I'm desperate to finish...) Anyway, maybe Twitter is just God's way of making sure we never manage to finish creating our future oppressor.

p.s. There really is an app for that.

*AMS = Afraid to miss something
