
August 29 2012

Follow up on big data and civil rights

A few weeks ago, I wrote a post about big data and civil rights, which seems to have hit a nerve. It was posted on Solve for Interesting and here on Radar, and then folks like Boing Boing picked it up.

I haven’t had this kind of response to a post before (well, I’ve had responses, such as the comments to this piece for GigaOm five years ago, but they haven’t been nearly as thoughtful).

Some of the best posts have really added to the conversation. Here’s a list of those I suggest for further reading and discussion:

Nobody notices offers they don’t get

On Oxford’s Practical Ethics blog, Anders Sandberg argues that transparency and reciprocal knowledge about how data is being used will be essential. Anders captured the core of my concerns in a single paragraph, saying what I wanted to far better than I could:

… nobody notices offers they do not get. And if these absent opportunities start following certain social patterns (for example not offering them to certain races, genders or sexual preferences) they can have a deep civil rights effect

To me, this is a key issue, and it responds eloquently to some of the comments on the original post. Harry Chamberlain commented:

However, what would you say to the criticism that you are seeing lions in the darkness? In other words, the risk of abuse certainly exists, but until we see a clear case of big data enabling and fueling discrimination, how do we know there is a real threat worth fighting?

I think that this is precisely the point: you can’t see the lions in the darkness, because you’re not aware of the ways in which you’re being disadvantaged. If whites get an offer of 20% off, but minorities don’t, that’s basically a 20% price hike on minorities — but it’s just marketing, so apparently it’s okay.

Context is everything

Mary Ludloff of Patternbuilders asks, “When does someone else’s problem become ours?” Mary is a presenter at Strata, and an expert on digital privacy. She has a very pragmatic take on things. One point Mary makes is that all this analysis is about prediction — we’re taking a ton of data and making a prediction about you:

The issue with data, particularly personal data, is this: context is everything. And if you are not able to personally question me, you are guessing the context.

If we (mistakenly) predict something, and act on it, we may have wronged someone. Mary makes clear that this is thoughtcrime — arresting someone because their behavior looked like that of a terrorist, or pedophile, or thief. Firing someone because their email patterns suggested they weren’t going to make their sales quota. That’s the injustice.

This is actually about negative rights, which Wikipedia describes as:

Rights considered negative rights may include civil and political rights such as freedom of speech, private property, freedom from violent crime, freedom of worship, habeas corpus, a fair trial, freedom from slavery.

Most philosophers agree that negative rights outweigh positive ones (i.e., I have a right to fresh air more than you have a right to smoke around me). So our negative right (to be left unaffected by your predictions) outweighs your positive one. As analytics comes closer and closer to predicting actual behavior, we need to remember the lesson of negative rights.

Big data is the new printing press

Lori Witzel compares the advent of big data to the creation of the printing press, pointing out — somewhat optimistically — that once books were plentiful, it was hard to control the spread of information. She has a good point — we’re looking at things from this side of the big data singularity:

And as the cost of Big Data and Big Data Analytics drops, I predict we’ll see a similar dispersion of technology, and similar destabilizations to societies where these technologies are deployed.

There’s a chance that we’ll democratize access to information so much that it’ll be the corporations, not the consumers, that are forced to change.

While you slept last night

TIBCO’s Chris Taylor, standing in for Kashmir Hill at Forbes, paints a dystopian picture of video-as-data, and just how much tracking we’ll face in the future:

This makes laughable the idea of an implanted chip as the way to monitor a population. We’ve implanted that chip in our phones, and in video, and in nearly every way we interact with the world. Even paranoids are right sometimes.

I had a wide-ranging chat with Chris last week. We’re sure to spend more time on this in the future.

The veil of ignorance

The idea for the original post came from a conversation I had with some civil rights activists in Atlanta a few months ago, who hadn’t thought about the subject. They (or their parents) walked with Martin Luther King, Jr. But to them big data was “just tech.” That bothered me, because unless we think of these issues in the context of society and philosophy, bad things will happen to good people.

Perhaps the best tool for thinking about these ethical issues is the Veil of Ignorance. It’s a philosophical exercise for deciding social issues that goes like this:

  1. Imagine you don’t know where you will be in the society you’re creating. You could be a criminal, a monarch, a merchant, a pauper, an invalid.
  2. Now design the best society you can.

Simple, right? When we’re looking at legislation for big data, this is a good place to start. We should set privacy, transparency, and use policies without knowing whether we’re ruling or oppressed, straight or gay, rich or poor.

This post originally appeared on Solve for Interesting. This version has been lightly edited. Photo: crystal ball ii by mararie, on Flickr

Strata Conference + Hadoop World — The O’Reilly Strata Conference, being held Oct. 23-25 in New York City, explores the changes brought to technology and business by big data, data science, and pervasive computing. This year, Strata has joined forces with Hadoop World.

Save 20% on registration with the code RADAR20


August 01 2012

Big data is our generation’s civil rights issue, and we don’t know it

Data doesn’t invade people’s lives. Lack of control over how it’s used does.

What’s really driving so-called big data isn’t the volume of information. It turns out big data doesn’t have to be all that big. Rather, it’s about a reconsideration of the fundamental economics of analyzing data.

For decades, there’s been a fundamental tension between three attributes of databases. You can have the data fast; you can have it big; or you can have it varied. The catch is, you can’t have all three at once.

The big data trifecta.

I’d first heard this as the “three V’s of data”: Volume, Variety, and Velocity. Traditionally, getting two was easy but getting three was very, very, very expensive.

The advent of clouds, platforms like Hadoop, and the inexorable march of Moore’s Law mean that analyzing data is now trivially inexpensive. And when things become so cheap that they’re practically free, big changes happen — just look at the advent of steam power, or the copying of digital music, or the rise of home printing. Abundance replaces scarcity, and we invent new business models.

In the old, data-is-scarce model, companies had to decide what to collect first, and then collect it. A traditional enterprise data warehouse might have tracked sales of widgets by color, region, and size. This act of deciding what to store and how to store it is called designing the schema, and in many ways, it’s the moment where someone decides what the data is about. It’s the instant of context.

That needs repeating:

You decide what data is about the moment you define its schema.

With the new, data-is-abundant model, we collect first and ask questions later. The schema comes after the collection. Indeed, big data success stories like Splunk, Palantir, and others are prized because of their ability to make sense of content well after it’s been collected — sometimes called a schema-less query. This means we collect information long before we decide what it’s for.

And this is a dangerous thing.
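To make the contrast concrete, here is a minimal sketch, assuming a toy Python example with SQLite and JSON (the tools and data are illustrative, not anything from the original post). The first half fixes the schema, and with it the context, before a single row arrives; the second half hoards raw events and only imposes a meaning on them later.

    # Schema-on-write: the table definition decides up front what the data
    # is "about" (widgets sold by color, region and size, nothing else).
    import json
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE widget_sales (color TEXT, region TEXT, size TEXT, units INTEGER)")
    db.execute("INSERT INTO widget_sales VALUES ('red', 'EMEA', 'large', 42)")

    # Schema-on-read: collect raw events first, decide what they mean later.
    raw_events = [
        '{"user": "alice", "song": "Track A", "ts": 1346200000}',
        '{"user": "alice", "postcode": "EC1", "card": "gold", "ts": 1346200060}',
    ]

    # Months later, a "schema-less query" imposes a brand-new context on the
    # same events, building a profile the data was never collected for.
    profile = {}
    for event in (json.loads(e) for e in raw_events):
        profile.setdefault(event["user"], {}).update(
            {k: v for k, v in event.items() if k not in ("user", "ts")}
        )
    print(profile)  # {'alice': {'song': 'Track A', 'postcode': 'EC1', 'card': 'gold'}}

The second pattern is exactly what makes these tools so powerful, and exactly why the question of what the data may be used for can no longer be settled at collection time.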

When bank managers tried to restrict loans to residents of certain areas (a practice known as redlining), Congress stepped in to stop it (with the Fair Housing Act of 1968). They were able to legislate against discrimination, making it illegal to change loan policy based on someone’s race.

Home Owners’ Loan Corporation map showing redlining of “hazardous” districts in 1936.


“Personalization” is another word for discrimination. We’re not discriminating if we tailor things to you based on what we know about you — right? That’s just better service.

In one case, American Express used purchase history to adjust credit limits based on where a customer shopped, despite his excellent credit history:

Johnson says his jaw dropped when he read one of the reasons American Express gave for lowering his credit limit: “Other customers who have used their card at establishments where you recently shopped have a poor repayment history with American Express.”

Some of the things white men liked in 2010, according to OKCupid.

We’re seeing the start of this slippery slope everywhere from tailored credit-card limits like this one to car insurance based on driver profiles. In this regard, big data is a civil rights issue, but it’s one that society in general is ill-equipped to deal with.

We’re great at using taste to predict things about people. OKCupid’s 2010 blog post “The Real Stuff White People Like” showed just how easily we can use information to guess at race. It’s a real eye-opener (and the guys who wrote it didn’t include everything they learned — some of it was a bit too controversial). They simply looked at the words one group used that others rarely did. The result was a list of “trigger” words for a particular race or gender.

Now run this backwards. If I know you like these things, or see you mention them in blog posts, on Facebook, or in tweets, then there’s a good chance I know your gender and your race, and maybe even your religion and your sexual orientation. And that I can personalize my marketing efforts towards you.

That makes it a civil rights issue.

If I collect information on the music you listen to, you might assume I will use that data in order to suggest new songs, or share it with your friends. But instead, I could use it to guess at your racial background. And then I could use that data to deny you a loan.
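As a toy illustration of how little machinery that guess takes, here is a sketch in Python. The groups, word lists and threshold are all invented for the example, and the real OKCupid analysis was far more careful, but the shape is the same: find the distinguishing words, then run the lookup in reverse over someone’s public text.

    from collections import Counter

    # Step 1 (forward): find words one group uses far more often than another.
    group_a_posts = ["camping craft beer hiking", "hiking camping dogs"]
    group_b_posts = ["sneakers mixtape basketball", "mixtape sneakers dogs"]

    def word_freqs(posts):
        counts = Counter(word for post in posts for word in post.split())
        total = sum(counts.values())
        return {word: count / total for word, count in counts.items()}

    freq_a, freq_b = word_freqs(group_a_posts), word_freqs(group_b_posts)
    trigger_a = {w for w, f in freq_a.items() if f > 3 * freq_b.get(w, 0.001)}
    trigger_b = {w for w, f in freq_b.items() if f > 3 * freq_a.get(w, 0.001)}

    # Step 2 (backwards): given someone's public writing, score it against each
    # trigger list and guess which group they belong to.
    def guess(text, triggers):
        words = set(text.split())
        scores = {name: len(words & trig) for name, trig in triggers.items()}
        return max(scores, key=scores.get), scores

    print(guess("just went hiking then had a craft beer",
                {"group_a": trigger_a, "group_b": trigger_b}))
    # ('group_a', {'group_a': 3, 'group_b': 0})

Swap the made-up groups for demographic labels and the sample posts for a scraped profile, and those dozen lines are the reversal described above.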

Want another example? Check out Private Data In Public Ways, something I wrote a few months ago after seeing a talk at Big Data London, which discusses how publicly available last name information can be used to generate racial boundary maps:

Screen from the Mapping London project.
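The mechanics behind a map like that can be uncomfortably simple. Here is a hypothetical sketch; the surname-to-origin lookup table and the register entries are invented, and real projects use far richer name classifiers, but the joining and aggregating is essentially this:

    from collections import Counter, defaultdict

    # Invented lookup table; real work uses large surname-origin classifiers.
    surname_origin = {"smith": "British", "nowak": "Polish", "patel": "Indian"}

    # Invented (surname, postcode) pairs standing in for an open register.
    electoral_roll = [
        ("smith", "E1"), ("patel", "E1"), ("patel", "E1"),
        ("nowak", "W3"), ("smith", "W3"),
    ]

    # Join each resident against the lookup and aggregate by postcode.
    by_postcode = defaultdict(Counter)
    for surname, postcode in electoral_roll:
        by_postcode[postcode][surname_origin.get(surname, "Unknown")] += 1

    for postcode, counts in sorted(by_postcode.items()):
        print(postcode, dict(counts))
    # E1 {'British': 1, 'Indian': 2}
    # W3 {'Polish': 1, 'British': 1}

Feed those per-postcode counts into any mapping tool and you have a boundary map built entirely from public data.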


This TED talk by Malte Spitz does a great job of explaining the challenges of tracking citizens today, and he speculates about whether the Berlin Wall would ever have come down if the Stasi had had access to phone records the way today’s governments do.

So how do we regulate the way data is used?

The only way to deal with this properly is to somehow link what the data is with how it can be used. I might, for example, say that my musical tastes should be used for song recommendation, but not for banking decisions.
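Here is a minimal sketch of what that linkage could look like in code. The names and the policy check are illustrative assumptions, not an existing framework: every record carries the purposes its owner consented to, and every read has to declare a purpose.

    from dataclasses import dataclass, field

    @dataclass
    class TaggedRecord:
        owner: str
        value: object
        allowed_purposes: frozenset = field(default_factory=frozenset)

    class PurposeError(PermissionError):
        """Raised when data is requested for a purpose the owner never allowed."""

    def read(record: TaggedRecord, purpose: str):
        # Hand over the value only if this purpose was consented to.
        if purpose not in record.allowed_purposes:
            raise PurposeError(f"{record.owner}'s data may not be used for {purpose!r}")
        return record.value

    music_taste = TaggedRecord(
        owner="alice",
        value=["jazz", "hip-hop"],
        allowed_purposes=frozenset({"song_recommendation"}),
    )

    print(read(music_taste, "song_recommendation"))  # ['jazz', 'hip-hop']
    read(music_taste, "credit_scoring")              # raises PurposeError

The data structure is the easy part. The hard part is making anyone honor the tag once the data has been copied, which is exactly the trade-off the next paragraph turns to.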

Tying data to permissions can be done through encryption, which is slow, riddled with DRM, burdensome, hard to implement, and bad for innovation. Or it can be done through legislation, which has about as much chance of success as regulating spam: it feels great, but it’s damned hard to enforce.

There are brilliant examples of how a quantified society can improve the way we live, love, work, and play. Big data helps detect disease outbreaks, improve how students learn, reveal political partisanship, and save hundreds of millions of dollars for commuters — to pick just four examples. These are benefits we simply can’t ignore as we try to survive on a planet bursting with people and shaken by climate and energy crises.

But governments need to balance their reliance on data against checks on how that reliance erodes privacy and creates civil and moral issues we haven’t thought through. It’s something that most of the electorate isn’t thinking about, and yet it affects every purchase they make.

This should be fun.

This post originally appeared on Solve for Interesting. This version has been lightly edited.

Strata Conference + Hadoop World — The O’Reilly Strata Conference, being held Oct. 23-25 in New York City, explores the changes brought to technology and business by big data, data science, and pervasive computing. This year, Strata has joined forces with Hadoop World.

Save 20% on registration with the code RADAR20


May 29 2010

Burkhard Hirsch on his 80th birthday

In the weekend edition of the Süddeutsche Zeitung, Heribert Prantl has written a fine tribute to the FDP politician Burkhard Hirsch, whom I have for quite some time now perceived only as a civil rights advocate, on the occasion of his 80th birthday. Admittedly, Hirsch has long since stopped celebrating his successes on the political stage as a member of the party that calls itself liberal. Yet he may rightly be called a liberal, one who stands up for civil rights steadfastly and without wavering. As complainant or counsel, he successfully fought the “großer Lauschangriff” (large-scale acoustic surveillance of homes), the Aviation Security Act (Luftsicherheitsgesetz) and data retention before the Federal Constitutional Court. He is, as Prantl writes, an advocate of the law.

A quotation from a piece Hirsch wrote for DIE ZEIT a few years ago sums up his consistent rule-of-law stance:

“The protection of privacy and of individual liberties is not some cranky individualism. It is part of human dignity. These are central values of the Basic Law. The integrating power of our constitution lies in this quality of freedom, not in the most seamless possible application of police powers. Neither the vanity nor the naivety of some citizens changes that, citizens who bellow their private lives into their phones for all to hear or willingly spread them out on TV talk shows. Nor does the clear conscience of the many citizens who deny their private lives by claiming they have nothing to hide. In reality, they believe they will be spared suspicion. They want to pay for more security for themselves with other people’s freedom. That is political bill-dodging.” (Wehret dem bitteren Ende! – Die Politik verliert im Kampf gegen innere Feinde jedes Maß, DIE ZEIT 10/2005)

How I wish there were more people like Burkhard Hirsch.

April 16 2010

ACTA draft to be officially published

After months of secrecy surrounding the controversial ACTA agreement, the US government has now announced that a current draft text will be officially published on April 21. The draft versions seen so far reveal, at the very least, an effort to regulate the Internet in favor of protecting intellectual property in ways that go well beyond existing rules.

The massive criticism from civil rights organizations and the EU Parliament’s demand for more transparency may well be among the reasons the decision to publish was taken.


Event announcement: Civil rights after the digital revolution

On April 29 and 30, 2010, the ReH..Mo research center is holding a symposium at the University of Passau on the topic “Bürgerrechte nach der digitalen Revolution – Freiheit – Sicherheit – Gleichgültigkeit?” (civil rights after the digital revolution: freedom, security, indifference?). The question strikes me as well put, which is why I am curious to hear the answers from the speakers and the audience. The event will also have a Twitter wall, and in the panel discussion I am moderating it will be possible to ask questions via Twitter.

On this occasion, I would also like to commend the ReH..Mo blog, which reports almost daily on news from the fields of IT law and e-justice.

February 13 2010


About EFF - Electronic Frontier Foundation

 

From the Internet to the iPod, technologies are transforming our society and empowering us as speakers, citizens, creators, and consumers. When our freedoms in the networked world come under attack, the Electronic Frontier Foundation (EFF) is the first line of defense. EFF broke new ground when it was founded in 1990 — well before the Internet was on most people's radar — and continues to confront cutting-edge issues defending free speech, privacy, innovation, and consumer rights today. From the beginning, EFF has championed the public interest in every critical battle affecting digital rights.

Blending the expertise of lawyers, policy analysts, activists, and technologists, EFF achieves significant victories on behalf of consumers and the general public. EFF fights for freedom primarily in the courts, bringing and defending lawsuits even when that means taking on the US government or large corporations. By mobilizing more than 50,000 concerned citizens through our Action Center, EFF beats back bad legislation. In addition to advising policymakers, EFF educates the press and public.

EFF is a donor-funded nonprofit and depends on your support to continue successfully defending your digital rights. Litigation is particularly expensive; because two-thirds of our budget comes from individual donors, every contribution is critical to helping EFF fight — and win — more cases.

Additional information:

Annual Reports

EFF's Annual Reports are publicly available:

The EFF was the first civil rights organization for the net, and it has been fighting for data protection and free speech since its founding. Now it is celebrating its twentieth birthday.

Civil rights associations have a long tradition in the USA. Whenever civil liberties are curtailed on a larger scale anywhere in the country, a lawyer from the liberal American Civil Liberties Union (ACLU) is usually not far away. In the spring of 1990, the IT experts Mitch Kapor, John Gilmore and John Perry Barlow figured that such an institution was also needed for the field of computer technology. Barlow had just had a bad experience with an FBI house search; the investigators had accused him of stealing the source code of a computer system.

Barlow wrote about the incident in the early San Francisco online community The WELL, which prompted Kapor and Gilmore to propose founding an association to defend civil rights in the newly emerging computer networks as well. The Electronic Frontier Foundation, EFF for short, was born within a very short time. It began by defending Barlow and other hackers, and quickly raised substantial sums from industry figures such as Apple co-founder Steve Wozniak. The project grew and grew, making use of the early networking possibilities.
Today one can say that the founding of the EFF 20 years ago was downright prophetic. While in 1990 authorities and governments took relatively little interest in what went on in the data networks, in the years that followed they tried, step by step, to bring the nascent Internet under control.


[...]
— Quote: TAZ, 2010-02-12

20 Jahre EFF: An der elektronischen Front (20 years of EFF: on the electronic front)
