
August 07 2013

Predicting the future: Strata 2014 hot topics

Conferences like Strata are planned a year in advance. The logistics and coordination required for an event of this magnitude take a lot of planning, but they also take a decent amount of prediction: Strata needs to skate to where the puck is going.

While Strata New York + Hadoop World 2013 is still a few months away, we’re already guessing at what next year’s Santa Clara event will hold. Recently, the team got together to identify some of the hot topics in big data, ubiquitous computing, and new interfaces. We selected eleven big topics for deeper investigation.

  • Deep learning
  • Time-series data
  • The big data “app stack”
  • Cultural barriers to change
  • Design patterns
  • Laggards and Luddites
  • The convergence of two databases
  • The other stacks
  • Mobile data
  • The analytic life-cycle
  • Data anthropology

Here’s a bit more detail on each of them.

Deep learning

Teaching machines to think has been a dream/nightmare of scientists for a long time. Rather than teaching a machine explicitly, or using Watson-like statistics to figure out the best answer from a mass of data, Deep Learning uses simpler, core ideas and then builds upon them — much as a baby learns sounds, then words, then sentences.

It’s been applied to problems like vision (find an edge, then a shape, then an object) and better voice recognition. But advances in processing and algorithms are making it increasingly attractive for a large number of challenges. A Deep Learning model “copes” better with things its creators can’t foresee, or genuinely new situations. A recent MIT Technology Review article said these approaches improved image recognition by 70%, and improved Android voice recognition by 25%. But 80% of the benefits come from additional computing power, not algorithms, so this is stuff that’s only become possible with the advent of cheap, on-demand, highly parallel processing.
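
To make the idea of simple features first and more complex features on top concrete, here is a minimal sketch (not any production system) of a tiny two-layer network learning XOR with numpy. XOR is the classic function a single layer cannot represent but a stacked network can, and every parameter here is illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: a single linear layer cannot learn it, but a network with one hidden
    # layer can, by first building intermediate features and then combining them.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = rng.normal(size=(2, 4))   # input -> hidden "features"
    b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1))   # hidden features -> prediction
    b2 = np.zeros((1, 1))

    lr = 0.5
    for step in range(5000):
        hidden = sigmoid(X @ W1 + b1)       # layer 1: learned intermediate features
        pred = sigmoid(hidden @ W2 + b2)    # layer 2: built on top of those features

        # backpropagation with a cross-entropy loss
        d_out = pred - y
        dW2 = hidden.T @ d_out
        db2 = d_out.sum(axis=0, keepdims=True)
        d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
        dW1 = X.T @ d_hidden
        db1 = d_hidden.sum(axis=0, keepdims=True)

        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1

    print(np.round(pred, 2))   # typically converges toward [[0], [1], [1], [0]]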

The main drivers of this approach are big companies like Google (which acquired DNNResearch), IBM and Microsoft. There are also startups in the machine learning space like Vicarious and Grok (née Numenta).

Deep Learning isn’t without its critics. Something learned in a moment of pain or danger might not be true later on, so the system needs to unlearn a conclusion — or at least reduce its certainty. What’s more, certain things might only be true after a sequence of events: once we’ve seen a person put a ball in a box and close the lid, we know there is a ball in the box, but a picture of the box afterward wouldn’t reveal this. The inability to take time into account is one of the criticisms Grok founder Jeff Hawkins levels at Deep Learning.

There’s some good debate, and real progress in AI and machine learning, as a result of the new computing systems that make these models possible. They’ll likely supplant the expert systems (yes/no decision trees) used in many industries, which have fundamental flaws. Ben Goldacre described this problem at Strata in 2012: almost every patient who displays the symptoms of a rare disease instead has two much more common diseases with those symptoms.

This is also why House is a terrible doctor show.

In 2014, much of the data science content of Strata will focus on making machines smarter, and much of this will come from abundant back-end processing paired with advances in vision, sensemaking, and context.

Time-series data

Data is often structured according to the way it will be used.

  • To data designers, a graph is a mathematical structure that describes how pairs of objects relate to one another. This is why Facebook’s search tool is called Graph Search. To work with large numbers of relationships, we use a graph database that organizes everything in it according to how it’s related to everything else. This makes it easy to find things that are linked to one another, like routers in a network or friends at a company, even with millions of connections. As a result, it’s often at the core of a social network’s application stack. Companies and projects like Neo4j, Titan, and Vertex build them.
  • On the other hand, a relational database keeps several tables of data (your name; a product purchase) and then links them by a common thread (such as the credit card used to buy the product, or the name of the person to whom it belongs). When most traditional enterprise IT people say “database,” they mean a relational database (RDBMS). The RDBMS has been so successful it’s supplanted most other forms of data storage.

(As a sidenote, at the core of the RDBMS is a “join,” an operation that links two tables. Much of the excitement around NoSQL databases was in fact about doing away with the join, which — though powerful — significantly restricts how quickly and efficiently an RDBMS can process large amounts of data. Ironically, the dominant language for querying many of these NoSQL databases through tools like Impala is now SQL. If the NoSQL movement had instead been called NoJoin, things might have been much more clear.)
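
For readers who have not lived inside an RDBMS, here is a minimal sketch of that join using the sqlite3 module that ships with Python; the tables and values are invented purely for illustration.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE customers (card_id TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE purchases (card_id TEXT, product TEXT);
        INSERT INTO customers VALUES ('4111', 'Alice'), ('4222', 'Bob');
        INSERT INTO purchases VALUES ('4111', 'thermostat'),
                                     ('4111', 'inhaler'),
                                     ('4222', 'paperback');
    """)

    # The join: stitch the two tables together on their common thread, the card number.
    rows = db.execute("""
        SELECT customers.name, purchases.product
        FROM purchases
        JOIN customers ON customers.card_id = purchases.card_id
    """).fetchall()

    print(rows)   # e.g. [('Alice', 'thermostat'), ('Alice', 'inhaler'), ('Bob', 'paperback')]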


Book Spiral – Seattle Central Library by brewbooks, on Flickr

Data systems are often optimized for a specific use.

  • Think of a coin-sorting machine — it’s really good at organizing many coins of a limited variety (nickels, dimes, pennies, etc.).
  • Now think of a library — it’s really good at handling a huge diversity of books, often with only one or two copies of each, but it isn’t very fast.

Databases are the same: a graph database is built differently from a relational database; an analytical database (used to explore and report on data) is different from an operational one (used in production).

Most of the data in your life — from your Facebook feed to your bank statement — has one common element: time. Time is the primary key of the universe.

Since time is often the common thread in data, optimizing databases and processing systems to be really, really good at handling data over time is a huge benefit for many applications, particularly those that try to find correlations between seemingly different data — does the temperature on your Nest thermostat correlate with an increase in asthma inhaler use? Black Swans aside, time is also useful when trying to predict the future from the past.
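
As a rough sketch of what being good at time buys you, here is how that thermostat-versus-inhaler question might look in pandas; the readings are simulated and the column names are made up.

    import numpy as np
    import pandas as pd

    # Simulated thermostat readings every five minutes, plus a few inhaler-use events.
    idx = pd.date_range("2014-01-01", periods=24 * 12, freq="5min")
    temp = pd.Series(20 + 0.1 * np.random.randn(len(idx)).cumsum(),
                     index=idx, name="temp_c")
    puffs = pd.Series(1, index=pd.to_datetime([
        "2014-01-01 06:12", "2014-01-01 06:40", "2014-01-01 19:05",
    ]), name="puffs")

    # Time is the common key: put both series onto the same hourly grid, then compare.
    temp_hourly = temp.resample("1h").mean()
    puffs_hourly = puffs.resample("1h").sum().reindex(temp_hourly.index, fill_value=0)

    print(temp_hourly.corr(puffs_hourly))   # a correlation, not a cause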

Time Series data is at the root of life-logging and the Quantified Self movement, and will be critical for the Internet of Things. It’s a natural way to organize things which, as humans, we fundamentally understand. Time series databases have a long history, and there’s a lot of effort underway to modernize them as well as the analytical tools that crunch the data they contain, so we think time-series data deserves deeper study in 2014.

The Big Data app stack

We think we’re about to see the rise of application suites for big data. Consider the following evolution:

  1. On a mainframe, the hardware, operating system, and applications were often indistinguishable.
  2. Much of the growth of consumer PCs happened because of the separation of these pieces — companies like Intel and Phoenix made the hardware; Microsoft and Red Hat made the OS; and developers like WordPerfect, Lotus, and DBase made the applications.
  3. Eventually, we figured out what the PC was “for” and it acquired a core set of applications without which, it seems, a PC wouldn’t be useful. Those are generally described as “office suites,” and while there was once a rivalry for them, today, they’ve been subsumed by OS makers (Apple, Microsoft, Open Source) while those that didn’t have an OS withered on the vine (Corel).
  4. As we moved onto the web, the same thing started to happen — email, social network, blog, and calendar seemed to be useful online applications now that we were all connected, and the big portal makers like Google, Sina, Yahoo, Naver, and Facebook made “suites” of these things. So, too, did the smartphone platforms, from PalmPilot to Blackberry to Apple and Android.
  5. Today’s private cloud platforms are like yesterday’s operating systems, with OpenStack, CloudPlatform, VMWare, Eucalyptus, and a few others competing based on their compatibility with public clouds, hardware, and applications. Clouds are just going through this transition to apps, and we’re learning that their “app suite” includes things like virtual desktops, disaster recovery, on-demand storage — and of course, big data.

Okay, enough history lesson.

We’re seeing similar patterns emerge in big data. But it’s harder to see what the application suite is before it happens. In 2014, we think we’ll be asking ourselves, what’s the Microsoft Office of Big Data? We can make some guesses:

  • Predicting the future
  • Deciding what people or things are related to other people or things
  • Helping to power augmented reality tools like Google Glass with smart context
  • Making recommendations by guessing what products will appeal to which customers
  • Optimizing bottlenecks in supply chains or processes
  • Identifying health risks or anomalies worthy of investigation

Companies like Wibidata are trying to figure this out — and getting backed by investors with deep pockets. Just as most of the interesting stories about operating systems were the apps that ran on them, and the stories about clouds are things like big data, so the good stories about big data are the “office suites” atop it. Put another way, we don’t know yet what big data is for, but I suspect that in 2014 we’ll start to find out.

Cultural barriers to data-driven change

Every time I talk with companies about data, they love the concept but fail on the execution. There are a number of reasons for this:

  • Incumbency. Yesterday’s leaders were those who could convince others to act in the absence of information. Tomorrow’s leaders are those who can ask the right questions. This means there is a lot of resistance from yesterday’s leaders (think Moneyball).
  • Lack of empowerment. I recently ate a meal in the Pittsburgh airport, and my bill came with a purple pen. I’m now wondering if I tipped differently because of that. What ink colour maximizes per-cover revenues in an airport restaurant? (Admittedly, I’m a bit obsessive.) But there’s no reason someone couldn’t run that experiment, and increase revenues. Are they empowered to do so? How would they capture the data? What would they deem a success? These are cultural and organizational questions that need to be tackled by the company if it is to become data-driven.
  • Risk aversion. Steve Blank says a startup is an organization designed to search for a scalable, repeatable business model. Here’s a corollary: a big company is one designed to perpetuate a scalable, repeatable business model. Change is not in its DNA — predictability is. Since the days of Daniel McCallum, organizational charts and processes fundamentally reinforce the current way of doing things. It often takes a crisis (such as German jet planes in World War Two or Netscape’s attack on Microsoft) to evoke a response (the Lockheed Martin Skunk Works or a free web browser).
  • Improper understanding. Correlation is not causality — there is a correlation between ice cream and drowning, but that doesn’t mean we should ban ice cream. Both are caused by summertime. We should hire more lifeguards (and stock up on ice cream!) in the summer. Yet many people don’t distinguish between correlation and causality. As a species, humans are wired to find patterns everywhere because a false positive (turning when we hear a rustle in the bushes, only to find there’s nothing there) is less dangerous than a false negative (not turning and getting eaten by a sabre-toothed tiger).
  • Focus on the wrong data. Lean Analytics urges founders to be more data-driven and less self-delusional. But when I recently spoke with executives from DHL’s innovation group, they said that innovation in a big company requires a wilful disregard for data. That’s because the preponderance of data in a big company reinforces the status quo; nascent, disruptive ideas don’t stand a chance. Big organizations have all the evidence they need to keep doing what they have always done.

There are plenty of other reasons why big organizations have a hard time embracing data. Companies like IBM, CGI, and Accenture are minting money trying to help incumbent organizations be the next Netflix and not the next Blockbuster.

What’s more, the advent of clouds, social media, and tools like PayPal or the App Store has destroyed many of the barriers to entry on which big companies rely. As Quentin Hardy pointed out in a recent article, fewer and fewer big firms stick around for the long haul.

Design patterns

As any conference matures, we move into best practices. In architecture, these manifest as proven patterns — snippets of recipes people can re-use. Just as a baker knows how to make an icing from fat and sugar — and can adjust it to make myriad variations — so, too, can an architect use a particular architecture to build a known, working component or service.

As Mike Loukides points out, a design pattern is even more abstract than a recipe. It’s like saying, “sweet bread with topping,” which can then be instantiated in any number of different kinds of cake recipes. So, we have a design pattern for “highly available storage” and then rely on proven architectural recipes such as load-balancing, geographic redundancy, and eventual consistency to achieve it.

Such recipes are well understood in computing, and they eventually become standards and appliances. We have a “scale-out” architecture for web computing, where many cheap computers can handle a task, as an Application Delivery Controller (a load balancer) “sprays” traffic across those machines. It’s common wisdom today. But once, it was innovative. Same thing with password recovery mechanisms and hundreds of other building blocks.
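
The spray-traffic-across-cheap-machines recipe is simple enough to sketch; the backend addresses below are placeholders, and a real ADC adds health checks, session stickiness, TLS termination, and much more.

    import itertools

    # A pool of interchangeable, cheap machines behind one front door.
    backends = itertools.cycle(["10.0.0.11", "10.0.0.12", "10.0.0.13"])

    def route(request):
        # Round-robin: each incoming request goes to the next machine in the pool.
        return next(backends), request

    for req in ["GET /", "GET /search", "GET /cart", "GET /"]:
        print(route(req))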

We’ll see these building blocks emerge for data systems that meet specific needs. For example, a new technology called homomorphic encryption allows us to analyze data while it is still encrypted, without actually seeing the data. That would, for example, allow us to measure the spread of a disease without violating the privacy of the individual patients. (We had a presenter talk about this at DDBD in Santa Clara.) This will eventually become a vital ingredient in a recipe for “data where privacy is maintained.” There will be other recipes optimized for speed, or resiliency, or cost, all in service of the “highly available storage” pattern.
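
Real homomorphic encryption is far more involved, but a toy additive scheme (deliberately insecure, purely to show the shape of the idea) illustrates how an analyst can total up ciphertexts without ever seeing the underlying counts.

    import secrets

    MODULUS = 2**61 - 1   # an arbitrary large modulus for this toy

    def encrypt(value):
        # Each value is masked with a random pad; the pad stays with the data owner.
        pad = secrets.randbelow(MODULUS)
        return (value + pad) % MODULUS, pad

    def decrypt(ciphertext, pad):
        return (ciphertext - pad) % MODULUS

    # Two clinics encrypt their local case counts.
    c1, pad1 = encrypt(17)
    c2, pad2 = encrypt(25)

    # The analyst sums the ciphertexts without learning either count...
    total_ciphertext = (c1 + c2) % MODULUS

    # ...and only the pad holders, combining their pads, can recover the aggregate.
    print(decrypt(total_ciphertext, (pad1 + pad2) % MODULUS))   # 42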

This is how we move beyond vendors. Just as a scale-out web infrastructure can have an ADC from Radware, Citrix, F5, Riverbed, Cisco, and others (with the same pattern), we’ll see design patterns for big data with components that could come from Cloudera, Hortonworks, IBM, Intel, MapR, Oracle, Microsoft, Google, Amazon, Rackspace, Teradata, and hundreds of others.

Note that many vendors who want to sell “software suites” will hate this. Just as stereo vendors tried to sell all-in-one audio systems, which ultimately weren’t very good, many of today’s commercial providers want to sell turnkey systems that don’t allow the replacement of components. Design patterns and the architectures on which they rely are anathema to these closed systems — and are often where standards tracks emerge. 2014 is when that debate will begin in earnest in big data.

Laggards and Luddites

Certain industries are inherently risk-averse, or not technological. But that changes fast. A few years ago, I was helping a company called FarmsReach connect restaurants to local farmers and turn the public market into a supply chain hub. We spent a ton of effort building a fax gateway because farmers didn’t have mobile phones, and ultimately, the company pivoted to focus on building networks between farmers.

Today, however, farmers are adopting tech quickly, and they rely on things like GPS-based tractor routing and seed sowing (known as “Satellite Farming”) to get the most from their fields.

As the cost of big data drops and the ease of use increases, we’ll see it applied in many other places. Consider, for example, a city that can’t handle waste disposal. Traditionally, the city would buy more garbage trucks and hire more garbage collectors. But now, it can analyze routing and find places to optimize collection. Unfortunately, this requires increased tracking of workers — something the unions will resist very vocally. We already saw this in education, where efforts to track students were shut down by teachers’ unions.

In 2014, big data will be crossing the chasm, welcoming late adopters and critics to the conversation. It’ll mean broadening the scope of the discussion — and addressing newfound skepticism — at Strata.

Convergence of two databases

If you’re running a data-driven product today, you typically have two parallel systems.

  • One’s in production. If you’re an online retailer, this is where the shopping cart and its contents live, or where the user’s shipping address is stored.
  • The other’s used for analysis. An online retailer might make queries to find out what someone bought in order to handle a customer complaint or generate a report to see which products are selling best.

Analytical technology comes from companies like Teradata, IBM (from the Cognos acquisition), Oracle (from the Hyperion acquisition), SAP, and the independent MicroStrategy, among many others. They use words like “Data Warehouse” to describe these products, and they’ve been making them for decades. Data analysts work with them, running queries and sending reports to corporate bosses. A standalone analytical data warehouse is commonly accepted wisdom in enterprise IT.

But those data warehouses are getting faster and faster. Rather than running a report and getting it a day later, analysts can explore the data in real time — re-sorting it by some dimension, filtering it in some way, and drilling down. This is often called pivoting, and if you’ve ever used a Pivot Table in Excel you know what it’s like. In data warehouses, however, we’re dealing with millions of rows.
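
For anyone who has only pivoted inside Excel, here is a small sketch of the same operation in pandas; the order log is invented, but the reshape is exactly the re-sort, filter, and drill-down move described above.

    import pandas as pd

    orders = pd.DataFrame({
        "month":   ["Jan", "Jan", "Feb", "Feb", "Feb"],
        "region":  ["East", "West", "East", "East", "West"],
        "revenue": [120, 80, 200, 90, 150],
    })

    # Rows per month, columns per region, revenue summed in each cell: the same
    # pivot an analyst does in Excel, but scriptable over millions of rows.
    report = orders.pivot_table(index="month", columns="region",
                                values="revenue", aggfunc="sum", fill_value=0)
    print(report)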

At the same time, operational databases are getting faster and sneakier. Traditionally, a database is the bottleneck in an application because it doesn’t handle concurrency well. If a record is being changed in the database by one person, it’s locked so nobody else can touch it. If I am editing a Word document, it makes sense to lock it so someone else doesn’t edit it — after all, what would we do with the changes we’d both made?

But that model wouldn’t work for Facebook or Twitter. Imagine a world where, while you’re updating your status, none of your friends can refresh their feeds.

We’ve found ways to fix this. When several people edit a Google Doc at once, for instance, each of their changes is made as a series of small transactions. The document doesn’t really exist — instead, it’s a series of transactional updates, assembled to look like a document. Similarly, when you post something to Facebook, those changes eventually find their way to your friends. The same is true on Twitter or Google+.
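
Here is a minimal sketch (names invented) of that idea: the document is nothing but an append-only log of small updates, and readers assemble the current state by replaying it, which is why a reader can briefly lag the writers.

    import threading

    class EventLog:
        """A 'document' that exists only as a list of small transactional updates."""

        def __init__(self):
            self._events = []              # append-only list of (key, value) updates
            self._lock = threading.Lock()

        def append(self, key, value):
            # Writers hold the lock only long enough to append one tiny record,
            # so nobody ever locks "the whole document" against everyone else.
            with self._lock:
                self._events.append((key, value))

        def snapshot(self):
            # Readers rebuild state by replaying updates in order. Updates that
            # arrive after the replay starts show up next time: eventual consistency.
            state = {}
            for key, value in list(self._events):
                state[key] = value
            return state

    log = EventLog()
    log.append("status", "Heading to Strata")
    log.append("status", "Back home")
    print(log.snapshot())                  # {'status': 'Back home'}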

These kinds of eventually consistent approaches make concurrent editing possible. They aren’t really new, either: your bank statement is eventually consistent, and when you check it online, the bottom of the statement tells you that the balance is only valid up until a period in the past and new transactions may take a while to post. Here’s what mine says:

Transactions from today are reflected in your balance, but may not be displayed on this page if you recently updated your bankbook, if a paper statement was issued, or if a transaction is backdated. These transactions will appear in your history the following business day.

Clearly, if eventual consistency is good enough for my bank account, it’s good enough for some forms of enterprise data.

So, we have analytical databases getting real-time fast and operational databases increasingly able to do things concurrently without affecting production systems. Which raises the question: why do we have two databases?

This is a massive, controversial issue worth billions of dollars. Take, for example, EMC, which recently merged its Greenplum acquisition into Pivotal. Pivotal’s marketing (“help customers build, deploy, scale, and analyze at an unprecedented velocity”) points at this convergence, which may happen as organizations move their applications into cloud environments (which is partly why Pivotal includes Cloud Foundry, which VMWare acquired).

The change will probably create some huge industry consolidation in the coming years (think Oracle buying Teradata, then selling a unified operational/analytical database). There are plenty of reasons it’s a bad idea, and plenty of reasons why it’s a good one. We think this will be a hot topic in 2014.

Cassandra and the other stacks

Big data has been synonymous with Hadoop. The break-out success of the Hadoop ecosystem has been astonishing, but it does other stacks a disservice. There are plenty of other robust data architectures that have furiously enthusiastic tribes behind them. Cassandra, for example, was created by Facebook, released into the wild, and tamed by Reddit to allow the site to scale to millions of daily visitors atop Amazon with only a handful of employees. MongoDB is another great example, and there are dozens more.

Some of these stacks got wrapped around the axle of the NoSQL debate, which, as I mentioned, might have been better framed as NoJoin. But we’re past that now, and there are strong case studies for many of the stacks. There are also proven affinities between a particular stack (such as Cassandra) and a particular cloud (such as Amazon Web Services) because of their various heritages.

In 2014, we’ll be discussing more abstract topics and regarding every one of these stacks as a tool in a good toolbox.

Mobile data

By next year, there will be more mobile phones in the world than there are humans, over one billion of them “smart.” They are the closest thing we have to a tag for people. Whether it’s measuring mall traffic from shoppers or projecting the source of malaria outbreaks in Africa, mobile data is big. One carrier recently released mobile data from the Ivory Coast to researchers.

Just as Time Series data has structure, so does geographic data, much of which lives in Strata’s Connected World track. Mobile data is a precursor to the Internet of Everything, and it’s certainly one of the most prolific structured data sources in the world.

I think concentrating on mobility is critical for another reason, too. The large systems created to handle traffic for the nearly 1,000 carriers in the world are big, fast, and rock solid. An AT&T 5ESS switch, or one of the large-scale Operational Support Systems, simply does not fall over.

Other than DNS, the Internet doesn’t really have this kind of industrial-grade system for managing billions of devices, each of which can connect to the others with just a single address. That is astonishing scale, and we tend to ignore it as “plumbing.” In 2014, the control systems for the Internet of Everything are as likely to come from Big Iron made by Ericsson as they are to come from some Web 2.0 titan.

The analytic life-cycle

The book The Theory That Would Not Die begins with a quote from John Maynard Keynes: “When the facts change, I change my opinion. What do you do, sir?” As this New York Times review of the book observes, “If you are not thinking like a Bayesian, perhaps you should be.”

Bayes’ theorem says that beliefs must be updated based on new evidence — and in an information-saturated world, new evidence arrives constantly, which means the cycle turns quickly. To many readers, this is nothing more than explaining the scientific method. But there are plenty of people who weren’t weaned on experimentation and continuous learning — and even those with a background in science make dumb mistakes, as the Boy Or Girl Paradox handily demonstrates.
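
As a quick worked example of that updating, here is the rare-disease arithmetic redone as a Bayes calculation; the numbers are invented for illustration.

    # P(disease | positive test) via Bayes' theorem, with made-up but plausible numbers.
    prior = 0.001               # 1 in 1,000 people actually have the rare disease
    sensitivity = 0.99          # P(positive | disease)
    false_positive_rate = 0.05  # P(positive | no disease)

    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    posterior = (sensitivity * prior) / p_positive

    print(round(posterior, 3))  # ~0.019: even a positive result leaves the disease unlikely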

Ben Lorica, O’Reilly’s chief scientist (and owner of the enviable Twitter handle @BigData) recently wrote about the lifecycle of data analysis. I wrote another piece on the Lean Analytics cycle with Avinash Kaushik a few months ago. In both cases, it’s an iterative process of hypothesis-forming, experimentation, data collection, and readjustment.

In 2014, we’ll be spending more time looking at the whole cycle of data analysis, including collection, storage, interpretation, and the practice of asking good questions informed by new evidence.

Data anthropology

Data seldom tells the whole story. After flooding in Haiti, mobile phone data suggested people weren’t leaving one affected area for a safe haven. Researchers concluded that they were all sick with cholera and couldn’t move. But by interviewing people on the ground, aid workers found out the real problem was that flooding had destroyed the roads, making it hard to leave.

As this example shows, there’s no substitute for context. In Lean Analytics, we say “Instincts are experiments. Data is proof.” For some reason this resonated hugely and is one of the most favorited/highlighted passages in the book. People want a blend of human and machine, of soft, squishy qualitative data alongside cold, hard quantitative data. We joke that in the early stages of a startup, your only metric should be “how many people have I spoken with?” It’s too early to start counting.

In Ash Maurya’s Running Lean, there’s a lot said about customer development. Learning how to conduct good interviews that don’t lead the witness and measuring the cultural factors that can pollute data is hugely difficult. In The Righteous Mind, Jonathan Haidt says all university research is WEIRD: Western, Educated, Industrialized, Rich, and Democratic. That’s because test subjects are most often undergraduates, who fit this bill. To prove his assertion, Haidt replicated studies done on campus at a McDonald’s a few miles away, with vastly different results.

At the first Strata New York, I actually left the main room one morning to go write a blog post. I was so overcome by the examples of data errors — from bad collection, to bad analysis, to wilfully ignoring the results of good data — that it seemed to me we weren’t paying attention to the right things. If “Data is the new Oil,” then its supply chain is a controversial XL pipeline with woefully few people looking for leaks and faults. Anthropology can fix this, tying quantitative assumptions to verification.

Nobody has championed data anthropology as much as O’Reilly’s own Roger Magoulas, who joined Jon Bruner and Jim Stogdill for a podcast on the subject recently.

So, data anthropology can ensure good data collection, provide essential context to data, and check that the resulting knowledge is producing the intended results. That’s why it’s on our list of hot topics for 2014.

Photo: Book Spiral – Seattle Central Library by brewbooks, on Flickr

August 24 2012

Building conference programs: it’s about the attendee

I’ve chaired computer industry conferences for ten years now. First for IDEAlliance (XML Europe, XTech), and recently with O’Reilly Media (OSCON, Strata). Over the years I have tried to balance three factors as I select talks: proposal quality, important new work, and practical value of the knowledge to the attendees.

As the competition for speaking slots at both Strata and OSCON reaches intense levels, I wanted to articulate these factors, and the principles I use when compiling conference programs.

How the program is made

My guiding principle in putting a program together is value to the attendees. They’re why we do this. By putting out quality content and speakers, we attract thinking, interested attendees. In turn, our sponsors get a much better quality of conversation and customer contact through their presence at the event.

Here's the process in a nutshell: proposals are invited through a public call for participation, and reviewers, drawn from the industry's community of experts, grade and comment on each proposal. My co-chairs and I use this feedback, along with editorial judgement, to compile the final schedule. For keynotes, and a small number of breakout sessions, we augment the review process by inviting talks we think are important for the program.

Sponsors and the schedule

No company's industry position (sponsor or not) earns it an easy ride through the proposal vetting process. That would undermine the trust the attendees place in the chairs and program committee. In the past, I've been tough with even the very biggest companies, and it's not been an easy process for me or them. It is, however, essential, and I am grateful to those companies and others who respect the process.

A conference schedule decided by sponsor-driven talk placement will soon fail to meet the needs of the attendees, because nobody will reliably have attendee interests at heart.

Sponsors have a vital part to play. Their support enables the event to exist in the first place, and they enable many aspects that make the conference experience great. Sponsor companies are a core part of the technical and business ecosystem on which the conferences are based, and many sponsor employees contribute great technical talks that are part of the editorial tracks.

Attendees make real buying decisions and want to hear from sponsors about their products. O’Reilly conferences have a place for product talks, and those talks are marked as sponsored. Because of our ethos of good content, and our practice of advising sponsors on their talks, most of these talks are also excellent, and on a par with the quality of the rest of the program.

We can always do better

Sometimes, we miss the mark. With hundreds of proposals and a fast-moving field, building a program is data-informed, but still an art. As conference chairs we decide an editorial strategy, which sometimes means great talks are left out because they don’t fit the story we think the attendees will value most.

I recognize and lament that it’s not always a pleasant process. Rejection stings, and with over 1,000 proposals it’s an unfortunate fact that something like 850 talks won’t get through.

Recently, with so much competition in the big data industry, I’ve received more heat than usual about program choices. Some proposers know how to create great presentations, and others don’t. If we select more from those that offer great presentations, it’s with the audience in mind, not bias. I’m always willing to offer advice to help people do better the next time around.

Tips for success

With the call for papers for Strata 2013 coming out, I’d like to ensure we do a good job and have the best options for our attendees. Somewhat masochistically, I want my job in deciding the program to be harder than ever because of the quality of incoming proposals.

I think it’s worth spelling out a few guidelines that will help proposers, especially those from industry vendors:

  • Read the submission guidelines three times over, and take them to heart
  • Think about the audience and the problems they're trying to solve
  • Talk to us. The chairs and program committee are listed on the web site and we can offer guidance (though no guarantees) before you submit your proposal
  • Submit your proposal on time

I’ll close by reiterating my recommendation to communicate. We have a process designed to maximize the quality of the conference. That process isn’t perfect, and we can and do make mistakes. To minimize mistakes, you can help. As a chair, I prefer to cultivate ongoing relationships with people in the field. Talk to me and brief me about your technology before the conference program is set, not afterwards. It would be unfair for me to do anything to disadvantage those proposers who followed the guidelines and submitted on time.

Finally, thank you. For reading this far, and helping me, my co-chairs, and the program committee in creating the best program possible. We value each and every submission, whether it makes the schedule or not.

I am always happy to offer advice. You can reach me at edd@oreilly.com.

November 11 2011

Confessions of a not-so-public speaker

One of Web 2.0 Summit 2011's memorable moments came early, when program chair John Battelle was gently but earnestly admonished by anthropologist Genevieve Bell for not having more women on stage that day. Cue lots of applause from the audience. John rejoined that he wouldn't discuss the number of women who had turned him down.

Part of my job here at O'Reilly is to encourage women, people of color, and other folks often underrepresented at tech conferences to be speakers at our events. I can really empathize with John: I've been turned down a lot, too. During that moment at Web 2.0 Summit, I wondered how many of the women applauding Genevieve's comment were regular tech conference speakers themselves. It's one thing to say we need role models and a very different thing to actually be one.

And that's exactly the intersection I find myself standing in now.

I worked in fundraising for many years, and it wasn't until I became a donor myself that I truly understood how to overcome the challenges of getting people to open their wallets — not to mention understand how good it feels to give to an important cause. Similarly, I know I won't be able to be a true agent for diversity in our speaker rosters until I step up and become a public speaker myself.

You'd think it'd be easier being in the conference organizing biz, but for me, it's the opposite. The quality of speakers I usually see — engaging, humorous, knowledgeable, and at one with their slide decks — can be a bit intimidating. While I don't think I'll be a speaker at Web 2.0 Summit any time soon, the biggest issue is just taking those first steps toward the speaker side of the street.

So, I've resolved to start my speaking journey. Some people are naturals on stage, and others, like me, need some encouragement. Make that a lot of encouragement. I've been fortunate to have two accomplished speakers cheering me on: entrepreneur and writer Jessica Faye Carter and investment book author Cathleen Rittereiser. They're helping me put together an action plan for becoming a public speaker.

In the hopes that it inspires more than just me, I'd like to share their excellent advice more broadly — below you'll find five tips for launching your own public speaking effort.

Join an online speaking organization — LinkedIn and MeetUp are rife with speaking groups; SpeakerMatch and Speakerfile are two fairly new social networking sites.

Join a speaking group in real life — Toastmasters and the National Speakers Association (NSA) are two of the largest and most active. NSA's online magazine has great resources for speakers.

Read — Dale Carnegie's "The Quick and Easy Way to Effective Speaking" still gets high marks today. Take a look at "Confessions of a Public Speaker," "The Confident Speaker," and "Slide:ology." [Disclosure: "Confessions of a Public Speaker" and "Slide:ology" are O'Reilly titles.]

Start low-key — User group meetings and Ignite events are usually supportive places to get your feet wet. Scott Berkun's Why You Should Speak (at Ignite) presentation (embedded below) is an inspirational and succinct primer for newbies, and it helps answer the pesky what-the-hell-do-I-talk-about question.

Team up — Take the stage with a more experienced speaker. Even if you just push the button on the slide clicker, you're still putting yourself in front of an audience.

Come along with me, won't you? Even if you're not part of an "underrepresented group." It's good for our careers; the communities we represent; the causes we espouse; and hey, I've heard it can be fun, too.

I'd love to hear from you. How did you get started speaking? What are your suggestions and resources for honing preso chops? What do you get out of speaking in public? If you're an event organizer, what steps are you taking to diversify your participants? If you're a regular on the conference circuit, what do you do to mentor and encourage others to take the podium?

Please share your advice and ideas in the comments area.

Associated photo on home and category pages: 224/365 Mic by thebarrowboy, on Flickr. Photo at top of post: Empty Stage by Max Wolfe, on Flickr.


November 04 2011

Top Stories: October 31-November 4, 2011

Here's a look at the top stories published across O'Reilly sites this week.

How I automated my writing career
You scale content businesses by increasing the number of people who create the content ... or so conventional wisdom says. Learn how a former author is using software to simulate and expand human-quality writing.

What does privacy mean in an age of big data?
Ironclad digital privacy isn't realistic, argues "Privacy and Big Data" co-author Terence Craig. What we need instead are laws and commitments founded on transparency.

If your data practices were made public, would you be nervous?
Solon Barocas, a doctoral student at New York University, discusses consumer perceptions of data mining and how companies and data scientists can shape data mining's reputation.

Five ways to improve publishing conferences
Keynotes and panel discussions may not be the best way to program conferences. What if organizers instead structured events more like a great curriculum?


Anthropology extracts the true nature of tech
Genevieve Bell, director of interaction and experience research at Intel, talks about how anthropology can inform business decisions and product design.


Tools of Change for Publishing, being held February 13-15 in New York, is where the publishing and tech industries converge. Register to attend TOC 2012.

November 02 2011

Five ways to improve publishing conferences

This is part of an ongoing series related to Peter Meyers' project "Breaking the Page: Transforming Books and the Reading Experience." We'll be featuring additional material in the weeks ahead. (Note: This post originally appeared on A New Kind of Book. It's republished with permission.)

Ever suffer from "conference head"? It's that feeling, after a couple dozen speeches and panels, where you wonder: wow, what did I learn from all that talking?

Having just returned from Books in Browsers (BiB), I had a tweet from Liza Daly (@liza) stuck in my head: "Much better to have talks as a series of refinements or rebuttals vs. 50 people telling us that the digital revolution is 'here'."

Liza Daly tweet

It got me thinking: is the standard conference format — solo talks plus panel discussions — the best way to "program" a one- or two-day get together? What if organizers structured events more like a great class?

A few quick caveats before I answer: I have never designed or chaired a conference myself, and I offer up these thoughts from the perspective of a frequent attendee and with a huge helping of humility — I can only imagine the time and energy that goes into actually putting one of these shows on. This post was spurred by my time spent at the immensely rewarding BiB, but my ideas here are less a review of that gathering and more about how to make all speaker-heavy conferences more useful. Finally, as for what this topic has to do with digital book design issues: it's tangential, to be sure, but since you can't swing a dead cat these days without hitting a conference on publishing, it felt worthwhile to share what I hope are constructive suggestions.

First, a quick roundup of key problems:

Problem: Presentation overlap

This is where multiple speakers give, more or less, the same presentation. Or even if the talks aren't exactly identical, it's the feeling you get when, say, speakers #2, #5, #8, and #11 all talk about how "social reading" is gonna change digital books. Even when organizers do a good job of keeping people from doing "brochure talks" (here's a big problem & here's how my company will solve it), you still end up watching multiple people block out their own version of a framing story that often ends up sounding pretty similar: publishing is undergoing a Gutenberg-sized revolution; readers are suffering from info overload; it's hard to discover what to read; etc.

Problem: I learned what?

What's tough in most conferences is pattern-spotting and takeaway extraction. What's missing are the epiphanies a great teacher gets her students to notice by the end of a class or semester: a sense kids get that they now know more about the topic than when they began. Facing a barrage of speakers who often stray from the descriptions they've submitted (guilty, I plead), the audience can sometimes find it hard to pinpoint what, exactly, they've learned. Is it possible that what conferences need most are good editors to prune, shape, and synthesize all the valuable ideas that speakers (and attendees) share? More on that idea in a moment.

Problem: Format monotony

One speech followed by another speech followed by another speech. Have coffee. Repeat. Even when everyone's top notch, the sheer uniformity of sitting through multiple slide-powered talks is hell on our brain's need for diversity.

Having sketched out what I see as the three big problems, here's my crack at some solutions worth exploring:

Solution 1. Organizer as curriculum developer

More than just articulating a theme and curating a speaker list, the organizer would need to devise a "curriculum" — one that doesn't dilly dally too much with basics and yet spends enough time tackling fundamentals so attendees would really feel like they'd gained a new appreciation for issues they thought they already understood. This would clearly entail a substantial amount of speaker management. Organizers would need a degree of cooperation that some presenters might be unwilling to commit to; for example, they'd have to agree in advance to sticking to their assigned topics. As someone who strayed at least partially from the blurb I pitched to the BiB program committee, I know first hand how tempting eleventh-hour inspiration can be.

The event I have in mind would resemble something like a learner's journey — from gentle introduction to the articulation of big challenges; then onto intermediate-level matters; and finally, culminating in some niche topics suitable for those with a master's level understanding. (I did think, by the way, that Brian O'Leary's call at the end of BiB for industry-wide cooperation was a pitch-perfect example of the kind of topic well-suited to wrap up a conference.)

Solution 2. Diverse activities

Rather than a non-stop sequence of solo presentations, I'm picturing a varied program of events woven around traditional talks: a moderator, mic in hand, working her way around the audience posing questions, eliciting answers, and drawing out connections; group activities (split into groups of five, and take 10 minutes to design a product you'd buy); team debates; the presentation of pre-made content (like documentary shorts), website tours, and narrated app slideshows. The idea here is to keep attendees engaged by giving them lots of different ways to consider the material under review.

Solution 3. Note-takers & synthesizers

The first idea here is for a conference to provide a note-taker (skilled in the art of sussing out key points — kinda like the bloggers The New York Times uses to report on live events). Freed from the distractions of writing, attendees could focus more on what speakers are saying. Even better, what if, once or twice a day, an emcee-type got up on stage and distilled out big themes and takeaways? What if these nuggets were posted in a highly visible spot (off- and online) to give everyone a persistent sense of lessons learned or emergent themes?

Solution 4. Workshop-style critiques

Hugely controversial and potentially disastrous territory I'm entering here, but I'm brainstorming, okay? What if someone — respectful, inquisitive, skilled in the art of asking illuminating questions — was up on stage with the speakers and, following their talks, engaged them in a Q&A. This, of course, is what post-speech question time is meant for, but many audience members are too shy, reluctant to challenge, etc. I do want to make sure I'm clear here: I'm not suggesting we grill speakers gotcha-style. I am looking for a way to get people to address the toughest challenges they face and make a strong case about why their solutions and ideas are compelling.

Solution 5. More content

Boy, for an industry built around authors, it's amazing how little time they get at our events. I'm not just talking about storytellers. I'm also thinking of how-to explainers, idea-weavers, cookbook chefs, photographers. Is there a way to get more of these people up on stage — not just talking about their fears in this new era of publishing, but actually sharing what they create to remind everyone of why consumers buy books in the first place?

Webcast: Digital Bookmaking Tools Roundup #2 — Back by popular demand, in a second look at Digital Bookmaking Tools, author and book futurist Pete Meyers explores the existing options for creating digital books.

Join us on Thursday, November 10, 2011, at 10 am PT
Register for this free webcast

Photo: Empty new museum auditorium by ol slambert, on Flickr


July 24 2011

Sexual Harassment at Technical Conferences: A Big No-No

We've been contacted recently about issues of sexual harassment at technical conferences, including at Oscon, which starts tomorrow in Portland. At O'Reilly we take those issues very seriously. While we're still trying to understand exactly what might have happened at Oscon or other O'Reilly conferences in the past, it's become clear that this is a real, long-standing issue in the technical community. And we do know this: we don't condone or support harassment or offensive behavior, at our conferences or anywhere. It's counter to our company values. More importantly, it's counter to our values as human beings.

We’re voicing our strong, unequivocal support of appropriate behavior by all participants at technical events, including Oscon and other O'Reilly conferences. We invite you to help us make Oscon a place that is welcoming and respectful to all participants, so everyone can focus on the conference itself, and the great networking and community richness that can happen when we get together in person.

One issue that has come up at some technical conferences is sexual or racist comments or images in slides. This is not appropriate. Speakers and exhibitors should use good judgment; if we hear complaints and we think they are warranted, you may not be invited back.

Even more alarmingly, we’ve heard accounts of female attendees having to put up with stalking, offensive comments, and unwanted sexual advances. I’d like to borrow a line from the Flickr Community Guidelines, which use the term Creepiness as follows: “You know that guy. Don’t be that guy.” If we hear that you are that guy, we will investigate, and you may be asked to leave.

Please bring any concerns to the immediate attention of the event staff, or contact our VP of Conferences, Gina Blaber at gina@oreilly.com. We thank our attendees for their help in keeping the event welcoming, respectful, and friendly to all participants.

P.S. We are going to adapt this blog post into a "Code of Conduct" that will become part of the web site registration materials for all of our conferences.

May 17 2011

Putting conference distractions to good use

Conference presenters are increasingly faced with audiences that are dividing time between in-person presentations and web updates. Two presenters at SXSW 2010 noticed the growing trend and developed an app to harness that distraction.

Tim Meaney (@timothymeaney), partner at Arc90, and Christopher Fahey (@chrisfahey), founding partner at Behavior Design, launched the Donahue app shortly before SXSW 2011 in March. In a recent interview, they discussed how the app helps presenters and audiences stay connected and keep the conversation going.

Our interview follows.


How does the Donahue app work?

Tim Meaney: First, for a highly technical answer, we've posted a blog with a full technical walk-through of how we architected and built the app. For a more general description, Donahue is a presentation tool built upon the premise that certain conference presentations are best delivered in conversational format. The app allows the presenter to construct their points as a series of portable ideas, delivered through Donahue into a number of views:

  • The Presenter View of the point — For display in the room, this view is akin to a PowerPoint slide. We took care to remember that not everyone in the room will have a laptop or wish to view the "supplemental" experience of the talk.
  • The Participant View of the point — This view allows for easy interaction with the presenter's point. Donahue puts these points directly out there with the presenter's name and avatar attached. In the Participant View, anyone in the audience — in-person and web-based — can reply to the points or tweet to their network. This reduces the friction around the presenter's ideas, and allows the points to flow freely through the audience into a larger network.
  • The idea or point is also directly tweeted, from the presenter. This creates another opportunity for ideas delivered in a talk to reach others.

Christopher Fahey: From the moment Arc90's Rich Ziade thought that Twitter could be Donahue's engine, we knew that Donahue would have to be able to work for users who didn't want to (or could not) use Donahue. Users who are only on Twitter can engage with Donahue using standard Twitter functions, like hashtags and retweets.

Another view is the Projector View. We knew Donahue would have to work for people who wanted to experience the conference in the conventional way, sitting in the room sans laptop, phone, or tablet. The Projector View takes the speaker's tweets and any related media (like a photo) and displays them in a simplified view, suitable for projection on a screen.

Tim Meaney: Donahue also "works" by acknowledging that the audience wants to have a conversation. It's pretty standard today that the audience tweets during a talk, and then hours later the presenter uploads their slide deck to SlideShare, and then later elaborates their thesis or ideas in a blog post. With Donahue, that wall between audience and presenter, and the abstraction of a slide deck, is removed. The content and ideas are immediately shared, and the audience can immediately begin discussing them. People insist upon discussion, and instead of fighting that trend — "please close your laptops" — we went the other way and joined the conversation.

Web 2.0 Summit, being held October 17-19 in San Francisco, will examine "The Data Frame" — focusing on the impact of data in today's networked economy.

Save $300 on registration with the code RADAR

How should conferences evolve? What needs to improve?

Tim Meaney: It's hard to make a general prescription for all conferences, but we do believe that conference presentations, like all other forms of media, are being impacted by "the conversation revolution." And much like all other forms of media, bringing the benefits of the conversation directly into the conference will suit presenters, organizers, and attendees alike. Those benefits are engaged participants, frictionless sharing of ideas, better learning through discourse, and building new connections among all participants. It's very likely that conferences will begin to better design for conversations — the audience is demanding it.

Christopher Fahey: Speakers also need to ask themselves a few questions:

  • What can I get from this audience? — Can the speaker improve his or her own ideas by really hearing audience reactions and feedback? How? The best feedback is likely to pop into the audience's heads during the talk itself. How can speakers harness that?
  • What can this audience do with my ideas? — If the ideas are any good, the speaker should desire and expect those ideas to grow, spread, and evolve immediately. Again, this might happen in real time.

Complexity is important, too. In our talk at SXSW, I mentioned that most conferences are not theoretical physics and that most audiences can understand everything that a speaker is saying without devoting their full attention. Two days after getting back from SXSW, I went to a theoretical physics lecture — and I was right: theoretical physics is far more complex than web design or project management or search engine optimization. I tried to tweet during the lecture, and when I looked back up I was completely lost. I couldn't keep up with the speaker if I allowed myself even a moment's distraction.

But what I learned from that experience was this: Even a complex topic should permit audiences to let their minds wander. You just can't come to understand and master a complex topic through listening to a lecture alone. Learners need to read and study at their own pace. Conferences and lectures augment and inspire those materials. But most of all, conferences should connect both speakers and audiences with the subject matter and with each other. This enables learning by empowering people to pay attention together, think about ideas together, and most importantly talk about them in the same energized moment.

This interview was edited and condensed.





March 24 2011

Would I attend my own conference?

When you’re deciding whether to attend a conference, and you’re checking out the website, what do you consider? Most likely, you’ll look at the program, searching for names you know and session titles that describe compelling topics.

If you’re like me—some of you are and some of you aren’t—you’ll also look for diversity among the speakers. If every speaker is a man, or if everyone is white, or both, I know this isn’t an event for me. I don’t need to hear more of the same prominent voices, and I don’t get much value out of an environment that takes a narrow, old-school view on who’s worth listening to.

Because some of you aren’t like me in your choices, there are profitable conferences with speaker rosters that look like roll call for the signers of the Constitution. But conferences that want to be taken seriously by people who take other kinds of people seriously need more diversity among the speakers to thrive. And conference organizers, whose goals often include highlighting new ideas, cannot simply recycle the same short list of well-known speakers from show to show.

Which is a funny thing for me to say. Because I’m co-chair, along with Brady Forrest, for Web 2.0 Expo, a large, semi-annual tech conference that starts on Monday and is among the shows co-produced by O'Reilly and UBM TechWeb that have been pilloried in the past for our speaker line-ups—particularly for not having enough women. While the last outcry came before I had this job, these sorts of discussions are cyclical, and the shows I’ve organized could reasonably be targets of such criticism.

What gives? And what can we do about it?

First, let's put some data behind the idea that men are overrepresented as conference speakers. For the three Web 2.0 Expos I’ve organized, our speaker rosters have comprised 25 to 30 percent women. That’s a near-triumph, considering that only 10 percent of the people who apply to speak are women, and the vast, vast majority of well-known businesspeople in tech—the ones a lot of you look for when considering a conference pass—are men (more on that in a minute). But it’s far short of, say, a 50/50 split.

Further, I’m dismayed to report that when it comes to the percentage of women speakers, our not-stellar numbers are among the best in tech conferences. TechCrunch Disrupt’s NY 2010 show had fewer than 10% women speakers. Our sister show, Web 2.0 Summit—which is programmed by people other than Brady and me—had just around 10% women speakers in 2010. Twitter’s 2010 Chirp conference had one woman speaker listed on their site; Facebook’s F8 conference managed two or so. The Future of Web Apps October 2010 show clocked in at 14% women speakers. The Bloomberg Empowered Entrepreneur Summit, which focuses on tech and takes place next month, has zero women entrepreneurs on the roster. Of course, conferences that focus on women, like BlogHer, have close to 100% female slates. But as a rule, general tech conferences don’t get near half.

In a way, this isn’t a big surprise. It’s well-documented that women are underrepresented in the tech sector (if you're not already up to speed, start with "Out of the Loop in Silicon Valley" by Claire Cain Miller, and do not miss "The Men and No Women of Web 2.0 Boards" by Kara Swisher). And it’s also well-documented that across sectors, women are underrepresented in senior roles—i.e., the sorts of positions that are likely to have stories to share at conferences. So, yeah, the population of female speakers we can draw on is smaller than the population of male speakers. But Expo generally has just 150 - 250 speakers total per show (and most conferences have fewer). Why can’t we find 75 - 125 women speakers?

There are two primary ways that conferences get speakers, and we use both methods. 1) You put out a public call for speakers (sometimes known as a call for proposals, or a call for papers, or whatever); you get a slew of applicants; you accept some of them. 2) You brainstorm a list of people you’d like to have speak; you reach out to many of them; some of them accept.

Here’s where these methods go wrong: 1) About 10 percent of the public applicants will be women, even if you ask women to apply. 2) The brainstorming, which requires that you know of the speakers already, produces even worse results: 5 percent on a good day. (Another conference organizer has described the second process like this: “Who should we have this year?” A long list of well-known people gets suggested. Somebody notices there aren’t any women on the list. “Ok, what women should we ask?” “We had Caterina Fake last year, but Carol Bartz and Sheryl Sandberg might be free.” “Right, who else?” Longish pause. “I wonder if Ev Williams or Biz Stone is available.”)

This is where we have a chance to change things.

In a recent post, “Designers, Women and Hostility in Open Source", Gina Trapani argues that to boost the participation of women in open source projects, the projects need to organize differently than they often do. Her recommendations, based on her own experiences as an organizer, include things like welcoming and mentoring new participants, recognizing valuable contributions that aren’t just code, and, indeed, valuing things other than the code. Note that she does not recommend that women participants behave differently in order to gain status.

We have a similar opportunity to rethink conference rosters. Let’s take the call for proposals method of finding speakers. When people call out a show for having a paltry percentage of women in the lineup, the traditional response is to explain (or complain) that very few women applied and to then call on more women to enter in the proposal system. Colleagues of mine, people I respect deeply, have gone this route.

But it doesn’t work. While, obviously, some women will apply to speak, the overwhelming evidence is that most will not. In a post last year, Clay Shirky lamented that his female students were far less likely than his male students to sing their own praises and ask for things that would benefit them, like recommendations. His suggestion? That the women act more like the men. While I generally enjoy agreeing deeply with Clay, he—like the conference organizers calling on more women to apply—has missed a key point. If your system of finding worthy students or speakers to promote is to have them come to you and ask, but a solid body of research shows that women won’t do so, you’ve institutionalized a gap.

Better instead, as Gina recommends, to change your system. For conference organizers, that means not just opening up a public call for proposals and asking Women 2.0 and Girls in Tech to tell their friends, but also seeking out and inviting individual women. That may sound inefficient, and it is time-consuming. But if your supposedly efficient public-call system isn’t yielding the desired results, then it’s simply failing efficiently.

We’ve gotten fairly good results at Expo reaching out to individual women. Key to this success is that we aren’t looking to put women on stage because they’re women; we’re seeking out great speakers whom we may have overlooked because they’re women. So it’s not uncommon that I’ll hear about a woman who might have a good presentation to give, but when I talk with her, it’s clear she’s not a fit for our show. I don’t shoehorn in those women; I move on and find others who are right for us.

To improve our efficiency, I enlist help reaching individuals. For instance, at Expo, we generally prefer single speakers or co-presenters to panels. But when somebody proposes an intriguing panel to us, I ask the organizer to include at least one woman with appropriate expertise. I’ve had dozens of these conversations. Almost always, the organizer’s response is, “Oh, right, hadn’t thought of that, good idea. A would be great, or we could ask B if she’s free.” Only once has the response been, “I won’t be able to find anybody.”

In addition to panel organizers, I plant the seed with founders, CEOs and other senior businesspeople. When I meet them (male or female), and we get to chatting about conferences, I ask them to consider actively supporting their female employees as speakers. For the CEOs, that might mean brainstorming with the employees on conferences they could reach out to and topics they could propose, giving them time to write the proposals and travel to shows, and maybe offering really good speaker training. While I can’t yet track results for this mini-initiative, I’ve been surprised to find that when I make these suggestions, businesspeople most often look as if a light bulb has gone on: “Right, yes. I can do that stuff—and I want to.”

So you can supplement the call-for-proposals method with a raft of invitations (and bolster that with help from CEOs). But where do you find the women to invite? And what do you do about the brainstorming-notable-people method? In both cases, the hurdle is that accomplished women are, more often than not, less prominent (because, y’know, they don’t speak at conferences as frequently).

If I were to ask you: Who are the ten biggest names among web CEOs? Feel free to include hardware and software companies. And also: Who are the ten biggest names among web entrepreneurs? Feel free to include people from your first list. Your lists, like mine, would include few or no women. So now, even if I change the question to ask you: Who’s doing interesting work we might want to highlight? Well, now your brain is primed to remember the men you came up with a minute ago. And you’re all set to overlook a slew of compelling speakers.

This is where lists are really key. I simply keep lists of C-level women in tech, women entrepreneurs, women VCs, women tech journalists, women consultants and so on. And I find women to add by keeping a close eye on everyone else’s lists, conferences, books, blog comments and tweets—and then, often, seeking out video of these women to get a sense of whether they’re good speakers. (Incidentally, I am, for various reasons, skeptical of those “Top Women in Tech” and “Female Entrepreneurs to Watch” lists. But I have to admit that when they prompt your team to remember specific women in the brainstorming process, or when they help you find interesting women you wouldn’t otherwise have known about, they’re useful.) No question, when we’re trying to move the needle on our percentage of women speakers, being able to consult these lists gives us a fighting chance.

Maintaining these lists takes work. But y’know what? That’s part of our jobs. And it leads to a world in which I might just be interested in attending my own conference.

Among the things I haven’t tackled in this post: Why are women less likely to propose themselves as speakers (and what can we do about that)? Does an increase in great female speakers affect attendee satisfaction or measurably improve the bottom line? Are there ways that matter in which women speakers are different to work with than men? If there’s interest, I’ll consider a follow-up piece.

Also in this post, I’ve focused on female speakers. But tech conferences—including my shows—could benefit from efforts to increase the diversity of speaker rosters along other vectors, including race, age, physical ability and other factors that influence experience, perception and understanding. What else can we do to improve our line-ups? Thoughtful, constructive ideas welcome.

September 14 2010

USA: “Science Blogging” Goes Global, Gains Respect

By David Wescott

A global online community of scientists has recently emerged as an influential and important contributor to worldwide journalism about science. Its members have grown more sophisticated in their communications and are now catching the attention of journalists who were previously dismissive of citizen media about science.

In August, a trio of science bloggers launched Scienceblogging.org, a comprehensive aggregator of content from science blogs across the globe. The site features links to content from mainstream science media as well as independent networks of bloggers. It currently features 57 feeds that include several hundred blogs. While the content is predominantly American, the aggregator also features networks from China, Brazil, Germany, New Zealand, Belgium, Canada, and France. The aggregator was built by the creators of the annual ScienceOnline conference: Bora Zivkovic, Anton Zuiker, and Dave Munger.

ScienceOnline2010, billed as “the fourth annual conference on science and the Web,” took place from January 14 to January 17 at Sigma Xi in Research Triangle Park, North Carolina. The conference has grown in size and popularity each year, and the 2010 conference attracted science bloggers from ten countries. In a display of increased acceptance and legitimacy, the January conference featured presentations and attendance from journalists representing several mainstream media organizations, such as Reuters, BBC, and The New York Times. A similar conference with different organizers, Science Online London, took place in September this year.

Science blogging is by no means a new phenomenon. Bora Zivkovic says he has seen the science blogging community grow and thrive over the past decade, earning more than a few converts from mainstream media.

“As journalists lost jobs, they took up blogging,” says Zivkovic, noting that professional media are also increasingly creating blogs on their own websites.

In this video interview by bkthrough on YouTube, Zivkovic comments on how ScienceOnline has grown and how the conference agenda has broadened from merely discussing blogging to covering other forms of online activity as well.

Interestingly, one of the major discussion topics of the conference was the lapses in accuracy sometimes made by mainstream journalists when reporting on science issues. Several scientists and science bloggers expressed frustration with the constraints mainstream journalism places on science-based communication, such as the need to summarize complex issues into 20-second “sound bites.” Another important topic was the need for scientists to become better communicators and try to make their work more accessible and understandable to a wider audience.

Many science bloggers are actively trying to expand their community beyond academia. Darlene Cavalier, an advocate for science literacy and the author of a blog called Science Cheerleader, is also the founder of a website called Science for Citizens that aims to be a resource for people who want to participate in “citizen science” - projects run by professional researchers that leverage volunteers to help with research tasks such as data collection or computation. Professional researchers can also promote their projects and recruit volunteers at the site. While the site is currently available only in English, Cavalier says projects are open to people in all countries and she hopes to offer the site in other languages in the future.

The ScienceOnline organizers have announced plans to hold an even bigger conference in 2011 in Research Triangle Park. Meanwhile, Zivkovic sees the skepticism that mainstream journalists once held toward bloggers starting to change a bit.

“The curmudgeon journalists who write ‘you'll miss us when we're gone' Luddite pieces cannot write about blogs that way any more,” he says. “They are now reserving their snark, often using the exactly same old cliches they used about blogs five years ago, to denigrate Twitter.”

[Disclosure: I work for a company that served as a sponsor for the ScienceOnline 2010 conference]

December 31 2009

Four short links: 31 December 2009

  1. Botnets and the Global Infection Rate (PDF) -- fascinating insights into botnets, control tools, and business models.
  2. Atlassian Uses OpenSocial for Internal Integration -- they use it inside their firewall to build a better dashboard. OpenSocial defines two concepts--an API for defining and working with social data (profiles, attributes, relationships) and a specification for gadgets. OpenSocial's fundamental promise was interoperability--write an application once and host it in multiple social networks. Sound familiar? That's what we wanted to do with our own products. (A rough sketch of the data API appears after this list.)
  3. Professional Conference Video with Semi-Professional Equipment -- How to make a great video of yourself giving a presentation, without having a cameraman to track you on stage. (I tried to tell my wife that I had semi-professional equipment, by the way, and it took a quarter of an hour for her to stop laughing.)
  4. Thoughts to Speech -- In a test on a stroke victim in his 20s who was able to think but not move, electrodes and a small FM transmitter were implanted between the speech and motor centres of his brain. Neurites grew into the electrodes, and the signals sent to them are broadcast by the transmitter to an external receiver. From there, a desktop computer runs software to figure out which muscles the patient is trying to move, and then makes the corresponding sound. It requires training, but it is an exciting breakthrough in brain-computer connection.
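To make the OpenSocial item above a bit more concrete, here is a minimal, hypothetical sketch of what the social-data side of the API looks like from inside a gadget. It is written as TypeScript only so the container-supplied `opensocial` and `gadgets` globals can be declared; the element id and function names are my own placeholders, not anything taken from the Atlassian write-up.

```typescript
// Hypothetical OpenSocial gadget script: fetch the viewer's profile and
// friends via the social-data API, then render their display names.
// `opensocial` and `gadgets` are provided by the hosting container at
// runtime; they are declared here only so the sketch compiles as TypeScript.
declare const opensocial: any;
declare const gadgets: any;

function fetchViewerAndFriends(): void {
  const req = opensocial.newDataRequest();
  // Profiles and attributes: request the current viewer's person object.
  req.add(req.newFetchPersonRequest(opensocial.IdSpec.PersonId.VIEWER), "viewer");
  // Relationships: request the viewer's friends as a collection.
  req.add(
    req.newFetchPeopleRequest(
      opensocial.newIdSpec({ userId: "VIEWER", groupId: "FRIENDS" })
    ),
    "friends"
  );
  req.send(renderPeople);
}

function renderPeople(response: any): void {
  const viewer = response.get("viewer").getData();
  const friends = response.get("friends").getData();
  let html = "Hello, " + viewer.getDisplayName() + "<br/>";
  friends.each((friend: any) => {
    html += friend.getDisplayName() + "<br/>";
  });
  // "content" is a placeholder element id inside the gadget's markup.
  const el = document.getElementById("content");
  if (el) {
    el.innerHTML = html;
  }
}

// Containers invoke registered handlers once the gadget finishes loading.
gadgets.util.registerOnLoadHandler(fetchViewerAndFriends);
```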
