May 09 2013

Where will software and hardware meet?

I’m a sucker for a good plant tour, and I had a really good one last week when Jim Stogdill and I visited K. Venkatesh Prasad at Ford Motor in Dearborn, Mich. I gave a seminar and we talked at length about Ford’s OpenXC program and its approach to building software platforms.

The highlight of the visit was seeing the scale of Ford’s operation, and particularly the scale of its research and development organization. Prasad’s building is a half-mile into Ford’s vast research and engineering campus. It’s an endless grid of wet labs like you’d see at a university: test tubes and robots all over the place; separate labs for adhesives, textiles, vibration dampening; machines for evaluating what’s in reach for different-sized people.

Prasad explained that much of the R&D that goes into a car is conducted at suppliers–Ford might ask its steel supplier to come up with a lighter, stronger alloy, for instance–but Ford is responsible for integrative research: figuring out how to, say, bond its foam insulation onto that new alloy.

In our more fevered moments, we on the software side of things tend to foresee every problem being reduced to a generic software problem, solvable with brute-force computing and standard machinery. In that interpretation, a theoretical Google car operating system–one that would drive the car and provide Web-based services to passengers–could commoditize the mechanical aspects of the automobile. If you’re not driving, you don’t care much about how the car handles; you just want a comfortable seat, functional air conditioning, and Web connectivity for entertainment. A panel in the dashboard becomes the only substantive point of interaction between a car and its owner, and if every car is running Google’s software in that panel, then there’s not much left to distinguish different makes and models.

When’s the last time you heard much of a debate on Dell laptops versus HP? As long as it’s running the software you want, and meets minimum criteria for performance and physical quality, there’s not much to distinguish laptop makers for the vast majority of users. The exception, perhaps, is Apple, which consumers do distinguish from other laptop makers for both its high-quality hardware and its unique software.

That’s how I start to think after a few days in Mountain View. A trip to Detroit pushes me in the other direction: the mechanical aspects of cars are enormously complex. Even incremental changes take vast re-engineering efforts. Changing the shape of a door sill to make a car easier to get into means changing a car’s aesthetics, its frame, the sheet metal that gets stamped to make it, the wires and sensors embedded in it, and the assembly process that puts it together. Everything from structural integrity to user experience needs to be carefully checked before a thousand replicates start driving out of Ford’s plants every day.

So, when it comes to value added, where will the balance between software and machines emerge? Software companies and industrial firms might both try to shift the balance by controlling the interfaces between software and machines: if OpenXC can demonstrate that it’s a better way to interact with Ford cars than any other interface, Ford will retain an advantage.
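
To make that interface question concrete, here is a minimal sketch of what reading vehicle data through an OpenXC-style interface might look like. It assumes the simple JSON name/value message format that OpenXC publishes; the signal values are illustrative, not taken from an actual Ford vehicle.

    import json

    # OpenXC-style messages: each reading is a small JSON object with a
    # signal name and a value. (Illustrative data, not a real vehicle trace.)
    raw_messages = [
        '{"name": "vehicle_speed", "value": 72.4}',
        '{"name": "engine_speed", "value": 2310}',
        '{"name": "steering_wheel_angle", "value": -12.5}',
    ]

    def parse_messages(lines):
        """Decode newline-delimited JSON messages into (name, value) pairs."""
        for line in lines:
            msg = json.loads(line)
            yield msg["name"], msg["value"]

    # Any application -- a dashboard, a fuel-economy coach, a fleet logger --
    # can consume the same stream without caring how the car produced it.
    for name, value in parse_messages(raw_messages):
        print(f"{name}: {value}")

Whoever controls that boundary, and how open it is, goes a long way toward deciding where the value accumulates.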

As physical things get networked and instrumented, software can make up a larger proportion of their value. I’m not sure exactly where that balance will arise, but I have a hard time believing in complete commoditization of the machines beneath the software.

See our free research report on the industrial internet for an overview of the ways that software and machines are coming together.

February 21 2013

Investigating the growth and influence of professional Makers

The growth of the Maker movement has been nothing if not amazing. We’ve had more than 100,000 people at Maker Faire in San Francisco, and more than 50,000 at the New York event, with mini-Maker Faires in many other cities. Arduino is almost a household word, along with Raspberry Pi. Now that O’Reilly has spun out Maker Media as an independent company, we look forward to the continued success of these events; they’re signs of an important cultural shift, a rejection of a prefabricated, shrink-wrapped and bubble-wrapped economy that hasn’t served us well. The Maker movement has proven that there are many people who want the joy of creating, whether it’s a crystal radio, a custom head for a Pez dispenser, or glowing E. coli.

But the Maker movement is not just about hobbyists. We’ve seen a lot in print about the re-shoring of American manufacturing, the return of the manufacturing jobs that had been exported to China and the Far East over the past few decades. One of the questions we’re asking at O’Reilly is what the Maker movement has to do with the return of manufacturing. If the return of manufacturing just means lots of low-level industrial jobs, paying barely more than minimum wage and under near-slavery conditions, that doesn’t sound desirable. That also doesn’t sound possible, at least to me: whatever else one might say about the cost of doing business in the U.S., North America just doesn’t have the sheer concentrations of people needed to make a Foxconn.

Of course, many of the writers who’ve noted the return of manufacturing have also noted that it’s returning in a highly automated way: instead of people running around a warehouse, you’ll have Kiva robots doing the running. Instead of skilled machinists operating milling machines, you’ll have highly automated, computer-controlled machines with a small number of humans to test the parts and make sure they’re operating properly. This vision is more plausible — even likely — but while it promises continued employment for the engineers who make the robots, it certainly doesn’t solve any problems in the labor market.

But just as small business has long been the cornerstone of the U.S. economy, one wonders whether small manufacturing, driven by “professional Makers,” could be the foundation for a resurgence of manufacturing in the U.S. A number of innovations have made this shift conceivable. One of the most important is the ease with which Makers can raise money to get a business started. Thanks to Kickstarter, initial funding for a small business is a lot easier to come by than it used to be. Kickstarter isn’t alone; IndieGoGo, Selfstarter, and many others also enable Makers to raise money without running the venture capital gauntlet.

There’s also been an amazing drop in the cost of tooling. Not long ago, 3D printers, laser cutters, and computer-controlled milling machines were tools that enthusiasts could only dream about. Now you can get a 3D printer for a few hundred dollars, and a laser cutter for a couple of thousand. If you don’t want to own your own 3D printer, they’re starting to appear in storefronts and copy shops. Online fabrication services exist for everything from printed circuit boards to DNA. You design what you want online, click a button, and a few weeks later, a batch of PC boards, or 3D printed parts, or plasmids with custom DNA, arrive. This isn’t new, but it’s becoming easier all the time. Autodesk has apps for your iPad that let you design for a 3D printer; you can easily send the design to the copy shop or library for production.

In the 20th-century economy, one barrier to starting a new business was establishing a sales channel. That’s another problem that’s been solved recently: there are new outlets and sales channels that specialize in micro-manufacturing. Etsy is the best known; Tindie is a newer entry that caters to electronics; and I believe we will see many more online marketplaces specializing in small manufacturers.

There’s more at stake in re-invigorating small manufacturing than just adding to the economy. Several years ago, I was in a meeting with Bunnie Huang, founder of Chumby, where he said that the United States had lost the engineering skills needed to do manufacturing. The engineers needed to do product development, to take a raw design and figure out how to produce it, no longer existed in the U.S., at least not in sufficient numbers to support a manufacturing economy. As manufacturing had gone offshore, so had the people who knew how to do it. A product like the iPhone isn’t manufactured in China because it’s cheaper; it’s manufactured in China because that kind of manufacturing just can’t be done in the United States. Part of rebooting American manufacturing, then, is home-growing the product engineering and development smarts that we’ve lost over the years; and professional Making, Makers turning their ideas and passions into products, is necessary to re-develop the talent and experience that are in short supply.

If you’re a professional maker, we’d like to hear your story. What kind of a business are you running? Do you have, or foresee having, employees? What kind of an impact has your business had on your community? I’ve seen too many small towns going to ruin around an abandoned factory. The people with the skills are still there, but the jobs left years ago. Can the Maker movement make an appreciable change in local economies? And if small numbers of makers can contribute to a local economy, what can the entire movement do for the national economy?

We’re waiting for your answers.

January 18 2013

RFP-EZ: Making it easier for small companies to bid on government contracts

A few years ago, when I was doing the research that led to my work in open government, I had a conversation with Aneesh Chopra, later the first federal CTO but, at the time, the Secretary of Technology for the Commonwealth of Virginia. I remember him telling me about the frustration of being in government, knowing that you could go to someone down the street to build a website in a week, but still having to put the job through procurement, a process taking nine months and resulting in a website costing ten times or more what it would have cost if he’d just been able to hire someone on the open market.

Much of the difficulty stems from stringent legal regulations that make it difficult for companies to compete and do business with government. (Like so many government regulations, these rules were designed with good intentions after scandals involving government officials steering contracts to their friends, but need to be simplified and updated for current circumstances.) The regulations are so complex that often, the people who do business with the federal government are more specialized in understanding that regulation than they are in the technology they’re providing. As a result, there are specialized intermediaries whose sole business is bidding on government jobs, and then subcontracting them to people who can actually do the work.

The problem has been compounded by the fact that many things that were once hard and expensive are now easy and cheap. But government rules make it hard to adopt cutting-edge technology.

That’s why I’m excited to see the Small Business Administration launch RFP-EZ as part of the White House’s Presidential Innovation Fellows program. It’s a small step towards getting the door open — towards making it easier for new businesses to sell to government. RFP-EZ simplifies both the process for small companies to bid on government jobs and the process for government officials to post their requests. Hopefully it will increase government’s access to technology, increase competition in the federal space, and lower prices.

This is a huge opportunity for web developers and other commercial technology providers. Government is the largest buyer on the planet, and your potential to work on stuff that matters is unparalleled when you’re working with the platform of government. When government and private industry work together to solve problems, amazing things can happen. RFP-EZ is a step in that direction.

If you’re a startup or consulting firm that has a desire to make a difference, and a desire for revenue, I’d encourage you to check out what RFP-EZ has to offer. There are a few projects awaiting bids now, and from what I hear, more on the way. (This is still an experiment, and successful outcomes will lead to more jobs being posted.) If you’ve got a solution to the problems that are posted, take a step towards working on stuff that matters at scale.

I have another reason for urging innovative companies to participate. This project is an experiment. Take a look at the Federal Register notice about the project. It’s a pilot that has a clear start and end date. They’re using the pilot to gather data, learn from it, and iterate. They’ve given themselves room to succeed and permission to fail.  I’d like to see government do more of both. Your participation will encourage that response.

December 18 2012

Interoperating the industrial Internet

One of the most interesting points made at GE’s “Unleashing the Industrial Internet” event was GE CEO Jeff Immelt’s statement that only 10% of the value of Internet-enabled products is in the connectivity layer; the remaining 90% is in the applications that are built on top of that layer. These applications enable decision support, optimize large-scale systems (systems “above the level of a single device,” to use Tim O’Reilly’s phrase), and empower consumers.

Given the jet engine that was sitting on stage, it’s worth seeing how far these ideas can be pushed. Optimizing a jet engine is no small deal; Immelt said that the engine gained an extra 5-10% efficiency through software, and that adds up to real money. The next stage is optimizing the entire aircraft; that’s certainly something GE and its business partners are looking into. But we can push even harder: optimize the entire airport (don’t you hate it when you’re stuck on a jet waiting for one of those trucks to push you back from the gate?). Optimize the entire air traffic system across the worldwide network of airports. This is where we’ll find the real gains in productivity and efficiency.

So it’s worth asking about the preconditions for those kinds of gains. It’s not computational power: when you come right down to it, there aren’t that many airports, and there aren’t that many flights in the air at once. Something like 10,000 flights are airborne worldwide at any given moment, and in these days of big data and big distributed systems, that’s not a terribly large number. It’s not our ability to write software; there would certainly be some tough problems to solve, but certainly nothing as difficult as, say, searching the entire web and returning results in under a second.
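
A rough back-of-envelope check makes the point. The per-flight update rate and record size below are illustrative assumptions, not figures from any airline or air traffic system:

    # Back-of-envelope estimate of the data rate from tracking every
    # airborne flight at once. All inputs are illustrative assumptions.
    flights_in_air = 10_000          # roughly the worldwide figure cited above
    updates_per_second = 10          # assumed position/status reports per flight
    bytes_per_update = 500           # assumed size of one report

    throughput = flights_in_air * updates_per_second * bytes_per_update
    print(f"{throughput / 1e6:.0f} MB/s")             # 50 MB/s
    print(f"{throughput * 86400 / 1e12:.2f} TB/day")  # 4.32 TB/day

A few terabytes a day is squarely within the range that ordinary distributed data systems already handle.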

But there is one important prerequisite for software that runs above the level of a single machine, and that’s interoperability. That’s something the inventors of the Internet understood early on; nothing became a standard unless at least two independent, interoperable implementations existed. The Interop conference didn’t start as a trade show; it started as a technical exercise where everyone brought their experimental hardware and software and worked on it until it played well together.

If we’re going to build useful applications on top of the industrial Internet, we must ensure, from the start, that the components we’re talking about interoperate. It’s not just a matter of putting HTTP everywhere. Devices need common, interoperable data representations. And that problem can’t be solved just by invoking XML: several years of sad experience has proven that it’s certainly possible to be proprietary under the aegis of “open” XML standards.
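
As a sketch of what a common representation buys you, suppose two engine vendors report the same physical quantities under different field names and units (the vendor payloads here are invented for illustration). A thin adapter layer maps both into one shared record, and everything above it, dashboards, analytics, maintenance planning, gets written once:

    from dataclasses import dataclass

    @dataclass
    class EngineReading:
        """Shared, vendor-neutral representation of one engine reading."""
        engine_id: str
        temperature_c: float
        fuel_flow_kg_per_h: float

    # Hypothetical vendor payloads: same physics, different names and units.
    def from_vendor_a(payload: dict) -> EngineReading:
        return EngineReading(
            engine_id=payload["esn"],
            temperature_c=payload["egt_c"],
            fuel_flow_kg_per_h=payload["ff_kgh"],
        )

    def from_vendor_b(payload: dict) -> EngineReading:
        return EngineReading(
            engine_id=payload["serial"],
            temperature_c=(payload["egt_f"] - 32) * 5 / 9,     # Fahrenheit to Celsius
            fuel_flow_kg_per_h=payload["fuel_lb_h"] * 0.4536,  # pounds to kilograms
        )

    readings = [
        from_vendor_a({"esn": "A-1001", "egt_c": 612.0, "ff_kgh": 2400.0}),
        from_vendor_b({"serial": "B-77", "egt_f": 1130.0, "fuel_lb_h": 5300.0}),
    ]

    # One dashboard query across every vendor's equipment.
    hottest = max(readings, key=lambda r: r.temperature_c)
    print(hottest)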

It’s a hard problem, in part because it’s not simply technical. It’s also a problem of business culture, and the desire to extract as much monetary value from your particular system as possible. We see the consumer Internet devolving into a set of walled gardens, with interoperable protocols but license agreements that prevent you from moving data from one garden into another. Can the industrial Internet do better? It takes a leap of faith to imagine manufacturers of industrial equipment practicing interoperability, at least in part because so many manufacturers have already developed their own protocols and data representations in isolation. But that’s what our situation demands. Should a GE jet engine interoperate with a jet engine from Pratt & Whitney? What would that mean, and what efficiencies in maintenance and operations would it yield? I’m sure that any airline would love a single dashboard that would show the status of all its equipment, regardless of vendor. Should a Boeing aircraft interoperate with Airbus and Bombardier in a system to exchange in-flight data about weather and other conditions? What if their flight computers were in constant communication with each other? What would that enable? Leaving aviation briefly: self-driving cars have the potential to be much safer than human-driven cars, but they become astronomically safer if your Toyota can exchange data directly with the BMW coming in the opposite direction. (“Oh, you intend to turn left here? Your turn signal is out, by the way.”)

Extracting as much value as possible from a walled garden is a false optimization. It may lead you to a local maximum in profitability, but it leaves the biggest gains, the 90% that Immelt talked about in his keynote, behind. Tim O’Reilly has talked about the “clothesline paradox”: if you dry your clothes on a clothesline, the money you save doesn’t disappear from the economy, even though it disappears from the electric company’s bottom line. The economics of walled gardens is the clothesline paradox’s evil twin. Building a walled garden may increase local profitability, but it prevents the larger gains, Immelt’s 90% gains in productivity, from ever coming into existence. They never reach the economy.

Can the industrial Internet succeed in breaking down walled gardens, whether they arise from business culture, legacy technology, or some other source? That’s a hard problem. But it’s the problem the industrial Internet must solve if it is to succeed.


This is a post in our industrial Internet series, an ongoing exploration of big machines and big data. The series is produced as part of a collaboration between O’Reilly and GE.

October 11 2012

Investigating the industrial Internet

Consumer networks have revolutionized the way companies understand and reach their customers, making possible intricate measurement and accurate prediction at every step of every transaction. The same revolution is underway in our infrastructure, where new generations of sensor-laden power plants, cars and medical devices will generate vast quantities of data that could bring about improvements in quality, reliability and cost. Big machines will enter the modern era of big data, where they’ll be subject to constant analysis and optimization.

We’ve teamed up with General Electric to explore the industrial Internet and convene a series of conversations that we hope will accelerate its development. GE’s strong presence in many industries has given it a great deal of insight into the ways that industrial data might be gathered, distributed and linked together.

Linking together big smart devices into a true industrial Internet presents enormous challenges: standards need to be developed with the full engagement of the technology industry. Software innovators will need to develop tools that can handle vast quantities of sensor data under tight security constraints, sharing information that can improve the performance of systems that have many operators — without leaking anything important to malicious groups.

Launching the industrial Internet will require big investment on the part of those who will operate each of its nodes, so in addition to looking at the concept’s technical aspects, we’ll also explore its promise as a business revolution, in applications that are practical and already in use (like remote operation of mining equipment) as well as ones that are promising but largely conceptual (like mobile health and big data in diagnostics).

GE won’t be the only voice in this conversation: other companies have developed their own visions for the industrial Internet and we’ll be exploring those as well, looking for commonalities and engaging as many voices as we can from our neutral place in the technology industry.

The promise of the industrial Internet is that it will bring intelligence to industries that are hugely capital-intensive and create broad value that all of the industrial Internet’s participants will share. We’ll look for stories that illustrate that future.

July 21 2012

Overfocus on tech skills could exclude the best candidates for jobs

At the second RailsConf, David Heinemeier Hansson told the audience about a recruiter trying to hire someone with “5 years of experience with Ruby on Rails.” DHH told him, “Sorry; I’ve only got 4 years.” We all laughed (I don’t think there’s anyone in the technical world who hasn’t dealt with a clueless recruiter), but little did we know this was the shape of things to come.

Last week, a startup in a relatively specialized area advertised a new engineering position for which it expected job candidates to have used its API. That raised a few eyebrows, not least because it’s a sad commentary on the current jobs situation.

On one hand, we have high unemployment. But on the other hand, at least in the computing industry, there’s no shortage of jobs. I know many companies that are hiring, and all of them say they can’t find the people they want. I’m only familiar with the computer industry, which is often out of sync with the rest of the economy. Certainly, in Silicon Valley, where you can’t throw a stone without hitting a newly funded startup, we’d expect a chronic shortage of software developers. But a quick Google search will show you that the complaint is widespread: you’ll see the “lack of qualified applicants” refrain in trucking, nursing, manufacturing, teaching, and everywhere else you look.

Is the problem that there are no qualified people? Or is the problem with the qualifications themselves?

There certainly have been structural changes in the economy, for better or for worse: many jobs have been shipped offshore or eliminated through automation. And employers are trying to move back onshore some jobs for which the skills no longer exist in the U.S. workforce. But I don’t believe that’s the whole story. A number of recent articles have suggested that the problem with jobs isn’t the workforce, it’s the employers: companies that are only willing to hire people who will drop perfectly into the position that’s open. Hence, a startup requiring that applicants have already developed code using its API.

It goes further: many employers are apparently using automated rejection services which (among other things) don’t give applicants the opportunity to make their case; there’s no human involved. There’s just a resume or an application form matched against a list of requirements that may be grossly out of touch with reality, generated by an HR department that probably doesn’t understand what it’s looking for, and that will never talk to the candidates it rejects.

I suppose it’s a natural extension of data science to think that hiring can be automated. In the future, perhaps it will be. Even without automated application processing, it’s altogether too easy for an administrative assistant to match resumes against a checklist of “requirements” and turn everyone down: especially easy when the stack of resumes is deep. If there are lots of applications, and nobody fits the requirements, it must be the applicants’ fault, right? But at this point, rigidly matching candidates against inflexible job requirements isn’t a way to go forward.
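
A toy sketch shows how blunt that kind of screening is. The “requirements” and the resume below are invented; the point is that a keyword checklist rejects a strong candidate who simply describes the same experience in different words:

    # Naive automated screening: reject any resume missing a required keyword.
    REQUIRED_KEYWORDS = {"ruby on rails", "postgresql", "our-startup-api"}

    def passes_screen(resume_text: str) -> bool:
        text = resume_text.lower()
        return all(keyword in text for keyword in REQUIRED_KEYWORDS)

    strong_candidate = """
    Ten years building web applications in Rails and Django, designed
    relational schemas on Postgres at scale, shipped public REST APIs.
    """

    # Fails the checklist even though the experience clearly overlaps,
    # because "Postgres" != "postgresql" and nobody outside the company
    # has used its proprietary API yet.
    print(passes_screen(strong_candidate))  # False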

Even for a senior position, if a startup is only willing to hire people who have already used its API, it is needlessly narrowing its applicant pool to a very small group. The candidates who survive may know the API already, but what else do they know? Are the best candidates in that group?

A senior position is likely to require a broad range of knowledge and experience, including software architecture, development methodologies, programming languages and frameworks. You don’t want to exclude most of the candidates by imposing extraneous requirements, even if those requirements make superficial sense. Does the requirement that candidates have worked with the API seem logical to an unseasoned executive or non-technical HR person? Yes, but it’s as wrong as you can get, even for a startup that expects new hires to hit the ground running.

The reports about dropping enrollments in computer science programs could give some justification to the claim that there’s a shortage of good software developers. But the ranks of software developers have never been filled by people with computer science degrees. In the early 80s, a friend of mine (a successful software developer) lamented that he was probably the last person to get a job in computing without a CS degree.

At the time, that seemed plausible, but in retrospect, it was completely wrong. I still see many people who build successful careers after dropping out of college, not completing high school, or majoring in something completely unrelated to computing. I don’t believe that they are the exceptions, nor should they be. The best way to become a top-notch software developer may well be to do a challenging programming-intensive degree program in some other discipline. But if the current trend towards overly specific job requirements and automated rejections continues, my friend will be proven correct, just about 30 years early.

A data science skills gap?

What about new areas like “data science”, where there’s a projected shortage of 1.5 million “managers and analysts”?

Well, there will most certainly be a shortage if you limit yourself to people who have some kind of degree in data science, or a data science certification. (There are some degree programs, and no certifications that I’m aware of, though the related fields of statistics and business intelligence are lousy with certifications.) If you’re a pointy-haired boss who needs a degree or a certificate to tell you that a potential hire knows something in an area where you’re incompetent, you’re going to see a huge shortage of talent.

But as DJ Patil said in “Building Data Science Teams,” the best data scientists are not statisticians; they come from a wide range of scientific disciplines, including (but not limited to) physics, biology, medicine, and meteorology. Data science teams are full of physicists. The chief scientist of Kaggle, Jeremy Howard, has a degree in philosophy. The key job requirement in data science (as it is in many technical fields) isn’t demonstrated expertise in some narrow set of tools, but curiosity, flexibility, and willingness to learn. And the key obligation of the employer is to give its new hires the tools they need to succeed.

At this year’s Velocity conference, Jay Parikh talked about Facebook’s boot camp for bringing new engineers up to speed (this segment starts at about 3:30). New hires are expected to produce shippable code in the first week. There’s no question that they’re expected to come up to speed fast. But what struck me is that boot camp is a six-week program (plus a couple of additional weeks if you’re hired into operations) designed to surround new hires with the help they need to be successful. That includes mentors who help them work with the code base, review their code, integrate them into Facebook culture, and more. They aren’t expected to “hit the ground running.” They’re expected to get up to speed fast, and they’re given a lot of help to do so successfully.

Facebook has high standards for whom they hire, but boot camp demonstrates that they understand that successful hiring isn’t about finding the perfect applicant: it’s about what happens after the new employee shows up.

Last Saturday, I had coffee with Nathan Milford, U.S. operations manager for Outbrain. We discussed these issues, along with synthetic biology, hardware hacking, and many other subjects. He said, “When I’m hiring someone, I look for an applicant who fits the culture, who is bright, and who is excited and wants to learn. That’s it. I’m not going to require that they come with prior experience in every component of our stack. Anyone who wants to learn can pick that up on the job.”

That’s the attitude we clearly need if we’re going to make progress.

February 22 2012

Data for the public good

Can data save the world? Not on its own. As an age of technology-fueled transparency, open innovation and big data dawns around the world, the success of new policy won't depend on any single chief information officer, chief executive or brilliant developer. Data for the public good will be driven by a distributed community of media, nonprofits, academics and civic advocates focused on better outcomes, more informed communities and the new news, in whatever form it is delivered.

Advocates, watchdogs and government officials now have new tools for data journalism and open government. Globally, there's a wave of transparency that will wash over every industry and government, from finance to healthcare to crime.

In that context, open government is about much more than open data — just look at the issues that flow around the #opengov hashtag on Twitter, including the nature of identity, privacy, security, procurement, culture, cloud computing, civic engagement, participatory democracy, corruption, civic entrepreneurship and transparency.

If we accept the premise that Gov 2.0 is a potent combination of open government, mobile, open data, social media, collective intelligence and connectivity, the lessons of the past year suggest that a tidal wave of technology-fueled change is still building worldwide.

The Economist's support for open government data remains salient today:

"Public access to government figures is certain to release economic value and encourage entrepreneurship. That has already happened with weather data and with America's GPS satellite-navigation system that was opened for full commercial use a decade ago. And many firms make a good living out of searching for or repackaging patent filings."

As Clive Thompson reported at Wired last year, public sector data can help fuel jobs, and "shoving more public data into the commons could kick-start billions in economic activity." In the transportation sector, for instance, transit data is open government fuel for economic growth.

There is a tremendous amount of work ahead in building upon the foundations that civil society has constructed over decades. If you want a deep look at what the work of digitizing data really looks like, read Carl Malamud's interview with Slashdot on opening government data.

Data for the public good, however, goes far beyond government's own actions. In many cases, it will happen despite government action — or, often, inaction — as civic developers, data scientists and clinicians pioneer better analysis, visualization and feedback loops.

For every civic startup or regulation, there's a backstory that often involves a broad number of stakeholders. Governments have to commit to opening themselves up but will, in many cases, need external expertise or even funding to do so. Citizens, industry and developers have to show up to use the data, demonstrating that there's not only demand, but also skill outside of government to put open data to work in service of accountability, citizen utility and economic opportunity. Galvanizing the co-creation of civic services, policies or apps isn't easy, but tapping the potential of the civic surplus has attracted the attention of governments around the world.

There are many challenges to overcome before that vision comes to pass. For one, data quality and access remain poor. Socrata's open data study identified progress, but also pointed to a clear need for improvement: only 30% of developers surveyed said that government data was available, and of the data that was available, 50% was unusable.

Open data will not be a silver bullet to all of society's ills, but an increasing number of states are assembling platforms and stimulating an app economy.

Results-oriented mayors like Rahm Emanuel and Mike Bloomberg are committing to opening government data in Chicago and New York City, respectively.

Following are examples of where data for the public good is already having an impact upon the world we live in, along with some ideas about what lies ahead.

Financial good

Anyone looking for civic entrepreneurship will be hard pressed to find a better recent example than BrightScope. The efforts of Mike and Ryan Alfred are in line with traditional entrepreneurship: identifying an opportunity in a market that no one else has created value around, building a team to capitalize on it, and then investing years of hard work to execute on that vision. In the process, BrightScope has made government data about the financial industry more usable, searchable and open to the public.

Due to the efforts of these two entrepreneurs and their California-based startup, anyone who wants to learn more about financial advisers before tapping one to manage their assets can do so online.

Prior to BrightScope, the adviser data was locked up at the Securities and Exchange Commission (SEC) and the Financial Industry Regulatory Authority (FINRA).

"Ryan and I knew this data was there because we were advisers," said BrightScope co-founder Mike Alfred in a 2011 interview. "We knew data had been filed, but it wasn't clear what was being done with it. We'd never seen it liberated from the government databases."

While they knew the public data existed and had their idea years ago, Alfred said it didn't happen because they "weren't in the mindset of being data entrepreneurs" yet. "By going after 401(k) first, we could build the capacity to process large amounts of data," Alfred said. "We could take that data and present it on the web in a way that would be usable to the consumer."

Notably, the government data that BrightScope has gathered on financial advisers goes further than a given profile page. Over time, as search engines like Google and Bing index the information, the data has become searchable in places consumers are actually looking for it. That's aligned with one of the laws for open data that Tim O'Reilly has been sharing for years: Don't make people find data. Make data find the people.

As agencies adapt to new business relationships, consumers are starting to see increased access to government data. Now, more data that the nation's regulatory agencies collected on behalf of the public can be searched and understood by the public. Open data can improve lives, not least through adding more transparency into a financial sector that desperately needs more of it. This kind of data transparency will give the best financial advisers the advantage they deserve and make it much harder for your Aunt Betty to choose someone with a history of financial malpractice.

The next phase of financial data for good will use big data analysis and algorithmic consumer advice tools, or "choice engines," to help people make better decisions. The vast majority of consumers are unlikely to ever look directly at raw datasets themselves. Instead, they'll use mobile applications, search engines and social recommendations to make smarter choices.

There are already early examples of such services emerging. Billshrink, for example, lets consumers get personalized recommendations for a cheaper cell phone plan based on calling histories. Mint makes specific recommendations on how a citizen can save money based upon data analysis of the accounts added. Moreover, much of the innovation in this area is enabled by the ability of entrepreneurs and developers to go directly to data aggregation intermediaries like Yodlee or CashEdge to license the data.
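
The mechanics behind a "choice engine" can be as simple as scoring published offers against a consumer's own usage data. Here is a minimal sketch; the plans, prices and calling history are invented, and real services like Billshrink obviously do far more:

    # Minimal "choice engine": given a consumer's own usage data and a set of
    # published plans, recommend the cheapest option. All figures are invented.
    plans = [
        {"name": "Plan A", "monthly_fee": 40.0, "included_minutes": 300, "per_extra_minute": 0.25},
        {"name": "Plan B", "monthly_fee": 55.0, "included_minutes": 800, "per_extra_minute": 0.10},
        {"name": "Plan C", "monthly_fee": 70.0, "included_minutes": 10_000, "per_extra_minute": 0.0},
    ]

    monthly_minutes_used = 650   # taken from the consumer's own calling history

    def monthly_cost(plan, minutes):
        extra = max(0, minutes - plan["included_minutes"])
        return plan["monthly_fee"] + extra * plan["per_extra_minute"]

    best = min(plans, key=lambda p: monthly_cost(p, monthly_minutes_used))
    print(best["name"], monthly_cost(best, monthly_minutes_used))  # Plan B 55.0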

Transit data as economic fuel

Transit data continues to be one of the richest and most dynamic areas for co-creation of services. Around the United States and beyond, there has been a blossoming of innovation in the city transit sector, driven by the passion of citizens and fueled by the release of real-time transit data by city governments.

Francisca Rojas, research director at the Harvard Kennedy School's Transparency Policy Project, has investigated the dynamics behind the disclosure of data by transit agencies in the United States, which she calls one of the most successful implementations of open government. "In just a few years, a rich community has developed around this data, with visionary champions for disclosure inside transit agencies collaborating with eager software developers to deliver multiple ways for riders to access real-time information about transit," wrote Rojas.

The Massachusetts Bay Transportation Authority (MBTA) learned from Portland, Oregon's TriMet that open data is better. "This was the best thing the MBTA had done in its history," said Laurel Ruma, O'Reilly's director of talent and a long-time resident of greater Boston, in her 2010 Ignite talk on real-time transit data. The MBTA's move to make real-time data available and support it has spawned a new ecosystem of mobile applications, many of which are featured at MBTA.com.

There are now 44 different consumer-facing applications for the TriMet system. Chicago, Washington and New York City also have a growing ecosystem of applications.

As more sensors go online in smarter cities, tracking traffic patterns will enable public administrators to optimize routes, schedules and capacity, driving efficiency and a better allocation of resources.

Transparency and civic goods

As John Wonderlich, policy director at the Sunlight Foundation, observed last year, access to legislative data brings citizens closer to their representatives. "When developers and programmers have better access to the data of Congress, they can better build the databases and tools that let the rest of us connect with the legislature."

That's the promise of the Sunlight Foundation's work, in general: technology-fueled transparency will help fight corruption and fraud and reveal the influence behind policies. That work is guided by data generated, scraped and aggregated from government and regulatory bodies. The Sunlight Foundation has been focused on opening up Congress through technology since the organization was founded. Some of its efforts culminated recently with the publication of a live XML feed for the House floor and a transparency portal for House legislative documents.

There are other horizons for transparency through open government data, which broadly refers to public sector records that have been made available to citizens. For a canonical resource on what makes such releases truly "open," consult the "8 Principles of Open Government Data."

For instance, while gerrymandering has been part of American civic life since the birth of the republic, one of the best policy innovations of 2011 may offer hope for improving the redistricting process. DistrictBuilder, an open-source tool created by the Public Mapping Project, allows anyone to easily create legal districts.

"During the last year, thousands of members of the public have participated in online redistricting and have created hundreds of valid public plans," said Micah Altman, senior research scientist at Harvard University's Institute for Quantitative Social Science, via email last year.

"In substantial part, this is due to the project's effort and software. This year represents a huge increase in participation compared to previous rounds of redistricting — for example, the number of plans produced and shared by members of the public this year is roughly 100 times the number of plans submitted by the public in the last round of redistricting 10 years ago," Altman said. "Furthermore, the extensive news coverage has helped make a whole new set of people aware of the issue and has reframed it as a problem that citizens can actively participate in to solve, rather than simply complain about."

Principles for data in the public good

As a result of digital technology, our collective public memory can now be shared and expanded upon daily. In a recent lecture on public data for public good at Code for America, Michal Migurski of Stamen Design made the point that part of the global financial crisis came through a crisis in public knowledge, citing "The Destruction of Economic Facts," by Hernando de Soto.

To create virtuous feedback loops that amplify the signals citizens, regulators, executives and elected leaders need in order to make better decisions amid a flood of information, data providers and infomediaries will need to embrace the key principles that Migurski's lecture outlined.

First, "data drives demand," wrote Tim O'Reilly, who attended the lecture and distilled Migurski's insights. "When Stamen launched crimespotting.org, it made people aware that the data existed. It was there, but until they put visualization front and center, it might as well not have been."

Second, "public demand drives better data," wrote O'Reilly. "Crimespotting led Oakland to improve their data publishing practices. The stability of the data and publishing on the web made it possible to have this data addressable with public links. There's an 'official version,' and that version is public, rather than hidden."

Third, "version control adds dimension to data," wrote O'Reilly. "Part of what matters so much when open source, the web, and open data meet government is that practices that developers take for granted become part of the way the public gets access to data. Rather than static snapshots, there's a sense that you can expect to move through time with the data."

The case for open data

Accountability and transparency are important civic goods, but adopting open data requires grounded arguments for a city chief financial officer to support these initiatives. When it comes to making a business case for open data, John Tolva, the chief technology officer for Chicago, identified four areas that support the investment in open government:

  1. Trust — "Open data can build or rebuild trust in the people we serve," Tolva said. "That pays dividends over time."
  2. Accountability of the work force — "We've built a performance dashboard with KPIs [key performance indicators] that track where the city directly touches a resident."
  3. Business building — "Weather apps, transit apps ... that's the easy stuff," he said. "Companies built on reading vital signs of the human body could be reading the vital signs of the city."
  4. Urban analytics — "Brett [Goldstein] established probability curves for violent crime. Now we're trying to do that elsewhere, uncovering cost savings, intervention points, and efficiencies."

New York City is also using data internally. The city is doing things like applying predictive analytics to building code violations and housing data to try to understand where potential fire risks might exist.

"The thing that's really exciting to me, better than internal data, of course, is open data," said New York City chief digital officer Rachel Sterne during her talk at Strata New York 2011. "This, I think, is where we really start to reach the potential of New York City becoming a platform like some of the bigger commercial platforms and open data platforms. How can New York City, with the enormous amount of data and resources we have, think of itself the same way Facebook has an API ecosystem or Twitter does? This can enable us to produce a more user-centric experience of government. It democratizes the exchange of information and services. If someone wants to do a better job than we are in communicating something, it's all out there. It empowers citizens to collaboratively create solutions. It's not just the consumption but the co-production of government services and democracy."

The promise of data journalism

The ascendance of data journalism in media and government will continue to gather force in the years ahead.

Journalists and citizens are confronted by unprecedented amounts of data and an expanded number of news sources, including a social web populated by our friends, family and colleagues. Newsrooms, the traditional hosts for information gathering and dissemination, are now part of a flattened environment for news. Developments often break first on social networks, and that information is then curated by a combination of professionals and amateurs. News is then analyzed and synthesized into contextualized journalism.

Data is being scraped by journalists, generated from citizen reporting, or gleaned from massive information dumps — such as with the Guardian's formidable data journalism, as detailed in a recent ebook. ScraperWiki, a favorite tool of civic coders at Code for America and elsewhere, enables anyone to collect, store and publish public data. As we grapple with the consumption challenges presented by this deluge of data, new publishing platforms are also empowering us to gather, refine, analyze and share data ourselves, turning it into information.

There are a growing number of data journalism efforts around the world, from New York Times interactive features to the award-winning investigative work of ProPublica. Here are just a few promising examples:

  • Spending Stories, from the Open Knowledge Foundation, is designed to add context to news stories based upon government data by connecting stories to the data used.
  • Poderopedia is trying to bring more transparency to Chile, using data visualizations that draw upon a database of editorial and crowdsourced data.
  • The State Decoded is working to make the law more user-friendly.
  • Public Laboratory is a tool kit and online community for grassroots data gathering and research that builds upon the success of Grassroots Mapping.
  • Internews and its local partner Nai Mediawatch launched a new website that shows incidents of violence against journalists in Afghanistan.

Open aid and development

The World Bank has been taking unprecedented steps to make its data more open and usable to everyone. The data.worldbank.org website that launched in September 2010 was designed to make the bank's open data easier to use. In the months since, more than 100 applications have been built using the data.

"Up until very recently, there was almost no way to figure out where a development project was," said Aleem Walji, practice manager for innovation and technology at the World Bank Institute, in an interview last year. "That was true for all donors, including us. You could go into a data bank, find a project ID, download a 100-page document, and somewhere it might mention it. To look at it all on a country level was impossible. That's exactly the kind of organization-centric search that's possible now with extracted information on a map, mashed up with indicators. All of a sudden, donors and recipients can both look at relationships."

Open data efforts are not limited to development. More data-driven transparency in aid spending is also going online. Last year, the United States Agency for International Development (USAID) launched a public engagement effort to raise awareness about the devastating famine in the Horn of Africa. The FWD campaign includes a combination of open data, mapping and citizen engagement.

"Frankly, it's the first foray the agency is taking into open government, open data, and citizen engagement online," said Haley Van Dyck, director of digital strategy at USAID, in an interview last year.

"We recognize there is a lot more to do on this front, but are happy to start moving the ball forward. This campaign is different than anything USAID has done in the past. It is based on informing, engaging, and connecting with the American people to partner with us on these dire but solvable problems. We want to change not only the way USAID communicates with the American public, but also the way we share information."

USAID built and embedded interactive maps on the FWD site. The agency created the maps with open source mapping tools and published the datasets it used to make these maps on data.gov. All are available to the public and media to download and embed as well.

Publishing maps and the open data that drives them online at the same time is a significant step forward for a government agency, and it sets a worthy bar for other efforts to meet in the future. USAID accomplished this by migrating its data to an open, machine-readable format.

"In the past, we released our data in inaccessible formats — mostly PDFs — that are often unable to be used effectively," said Van Dyck. "USAID is one of the premier data collectors in the international development space. We want to start making that data open, making that data sharable, and using that data to tell stories about the crisis and the work we are doing on the ground in an interactive way."

Crisis data and emergency response

Unprecedented levels of connectivity now exist around the world. According to a 2011 survey from the Pew Internet & American Life Project, more than 50% of American adults use social networks, 35% have smartphones, and 78% are connected to the Internet. When combined, those factors mean that we now see earthquake tweets spread faster than the seismic waves themselves. Networked publics can now share the effects of disasters in real time, providing officials with unprecedented insight into what's happening. Citizens act as sensors in the midst of the storm, creating an ad hoc system of networked accountability through data.

The growth of an Internet of Things is an important evolution. What we saw during Hurricane Irene in 2011 was the increasing importance of an Internet of people, where citizens act as sensors during an emergency. Emergency management practitioners and first responders have woken up to the potential of using social data for enhanced situational awareness and resource allocation.

An historic emergency social data summit in Washington in 2010 highlighted how relevant this area has become. And last year's hearing in the United States Senate on the role of social media in emergency management was "a turning point in Gov 2.0," said Brian Humphrey of the Los Angeles Fire Department.

The Red Cross has been at the forefront of using social data in a time of need. That's not entirely by choice, given that news of disasters has consistently broken first on Twitter. The challenge is for the men and women entrusted with coordinating response to identify signals in the noise.

First responders and crisis managers are using a growing suite of tools for gathering information and sharing crucial messages internally and with the public. Structured social data and geospatial mapping suggest one direction where these tools are evolving in the field.

A web application from ESRI deployed during historic floods in Australia demonstrated how crowdsourced social intelligence provided by Ushahidi can enable emergency social data to be integrated into crisis response in a meaningful way.

The Australian flooding web app includes the ability to toggle layers from OpenStreetMap, satellite imagery, and topography, and then filter by time or report type. By adding structured social data, the web app provides geospatial information system (GIS) operators with valuable situational awareness that goes beyond standard reporting, including the locations of property damage, roads affected, hazards, evacuations and power outages.

Long before the floods or the Red Cross joined Twitter, however, Brian Humphrey of the Los Angeles Fire Department (LAFD) was already online, listening. "The biggest gap directly involves response agencies and the Red Cross," said Humphrey, who currently serves as the LAFD's public affairs officer. "Through social media, we're trying to narrow that gap between response and recovery to offer real-time relief."

After the devastating 2010 earthquake in Haiti, the evolution of volunteers working collaboratively online also offered a glimpse into the potential of citizen-generated data. CrisisCommons has acted as a sort of "geeks without borders." Around the world, developers, GIS engineers, online media professionals and volunteers collaborated on information technology projects to support disaster relief for post-earthquake Haiti, mapping streets on OpenStreetMap and collecting crisis data on Ushahidi.

Healthcare

What happens when patients find out how good their doctors really are? That was the question that Harvard Medical School professor Dr. Atul Gawande asked in the New Yorker, nearly a decade ago.

The narrative he told in that essay makes the history of quality improvement in medicine compelling, connecting it to the creation of a data registry at the Cystic Fibrosis Foundation in the 1950s. As Gawande detailed, that data was privately held. After it became open, life expectancy for cystic fibrosis patients tripled.

In 2012, the new hope is in big data, where techniques for finding meaning in the huge amounts of unstructured data generated by healthcare diagnostics offer immense promise.

The trouble, say medical experts, is that data availability and quality remain significant pain points that are holding back existing programs.

There are, literally, bright spots that suggest what's possible. Dr. Gawande's 2011 essay, which considered whether "hotspotting" using health data could help lower medical costs by giving the neediest patients better care, offered another perspective on the issue. Early outcomes made the approach look compelling. As Dr. Gawande detailed, when a Medicare demonstration program offered medical institutions payments that financed the coordination of care for its most chronically expensive beneficiaries, hospital stays and trips to the emergency rooms dropped more than 15% over the course of three years. A test program adopting a similar approach in Atlantic City saw a 25% drop in costs.

Through sharing data and knowledge, and then creating a system to convert ideas into practice, clinicians in the ImproveCareNow network were able to improve the remission rate for Crohn's disease from 49% to 67% without the introduction of new drugs.

In Britain, researchers found that the outcomes for adult cardiac patients improved after the publication of information on death rates. With the release of meaningful new open government data about performance and outcomes from the British national healthcare system, similar improvements may be on the way.

"I do believe we are at the beginning of a revolutionary moment in health care, when patients and clinicians collect and share data, working together to create more effective health care systems," said Susannah Fox, associate director for digital strategy at the Pew Internet & American Life Project, in an interview in January. Fox's research has documented the social life of health information, the concept of peer-to-peer healthcare, and the role of the Internet among people living with chronic disease.

In the past few years, entrepreneurs, developers and government agencies have been collaboratively exploring the power of open data to improve health. In the United States, the open data story in healthcare is evolving quickly, from new mobile apps that lead to better health decisions to data spurring changes in care at the U.S. Department of Veterans Affairs.

Since he entered public service, Todd Park, the first chief technology officer of the U.S. Department of Health and Human Services (HHS), has focused on unleashing the power of open data to improve health. If you aren't familiar with this story, read the Atlantic's feature article that explores Park's efforts to revolutionize the healthcare industry through better use of data.

Park has focused on releasing data at Health.Data.Gov. In a speech to a Hacks and Hackers meetup in New York City in 2011, Park emphasized that HHS wasn't just releasing new data: "[We're] also making existing data truly accessible or usable," he said, taking "stuff that's in a book or on a website and turning it into machine-readable data or an API."

Park said it's still quite early in the project and that the work isn't just about data — it's about how and where it's used. "Data by itself isn't useful. You don't go and download data and slather data on yourself and get healed," he said. "Data is useful when it's integrated with other stuff that does useful jobs for doctors, patients and consumers."
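
As a sketch of what "turning it into machine-readable data or an API" can mean in practice, here is a minimal example that serves a small dataset as JSON over HTTP. It uses only the Python standard library; the hospital records are invented placeholders standing in for data that once lived only in a book or PDF:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Placeholder records standing in for data that once lived in a PDF or book.
    HOSPITALS = [
        {"name": "Example General", "city": "Springfield", "beds": 220},
        {"name": "Sample Regional", "city": "Riverton", "beds": 95},
    ]

    class DataHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/hospitals":
                body = json.dumps(HOSPITALS).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        # Any app, map, or spreadsheet can now read the same data programmatically.
        HTTPServer(("localhost", 8000), DataHandler).serve_forever()

The point is less the plumbing than the shift Park describes: once the data is addressable and machine-readable, other people can build the useful things on top of it.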

What lies ahead

There are four trends that warrant special attention as we look to the future of data for public good: civic network effects, smart disclosure, personal data assets and hybridized public-private data.

Civic network effects

Community is a key ingredient in successful open government data initiatives. It's not enough to simply release data and hope that venture capitalists and developers magically become aware of the opportunity to put it to work. Marketing open government data is what repeatedly brought federal Chief Technology Officer Aneesh Chopra and Park out to Silicon Valley, New York City and other business and tech hubs.

Despite the addition of topical communities to Data.gov, conferences and new media efforts, government's attempts to act as an "impatient convener" can only go so far. Civic developer and startup communities, from BuzzData to Socrata to newer efforts like Max Ogden's DataCouch, are building a distributed ecosystem that can help that community grow.

Smart disclosure

There are enormous economic and civic opportunities in the "smart disclosure" of personal data, whereby a private company or government institution provides a person with access to his or her own data in open formats. As Cass Sunstein, administrator of the White House Office of Information and Regulatory Affairs, puts it, smart disclosure "refers to the timely release of complex information and data in standardized, machine-readable formats in ways that enable consumers to make informed decisions."

For instance, the quarterly financial statements of the top public companies in the world are now available online through the Securities and Exchange Commission.

Why does it matter? The interactions of citizens with companies or government entities generate a huge amount of economically valuable data. If consumers and regulators had access to that data, they could use it to make better choices about everything from finance to healthcare to real estate, in much the same way that web applications like Hipmunk and Zillow already let consumers make more informed decisions.
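As a rough illustration of the kind of choice smart disclosure enables, here is a short sketch that compares service plans against a customer's own usage history, assuming both arrive as simple machine-readable CSV files. The file names, column names and flat pricing model are hypothetical; real disclosures will be messier.

    # Hypothetical smart-disclosure example: given a machine-readable export of
    # a customer's own usage plus a machine-readable price sheet, find the plan
    # that would have been cheapest. Columns and pricing model are made up.
    import csv

    def cheapest_plan(usage_csv, plans_csv):
        # usage.csv: month,units   |   plans.csv: plan,fixed_fee,price_per_unit
        with open(usage_csv, newline="") as f:
            total_units = sum(float(row["units"]) for row in csv.DictReader(f))
        with open(plans_csv, newline="") as f:
            plans = list(csv.DictReader(f))
        costs = {
            p["plan"]: float(p["fixed_fee"]) + float(p["price_per_unit"]) * total_units
            for p in plans
        }
        return min(costs.items(), key=lambda kv: kv[1])

    if __name__ == "__main__":
        plan, cost = cheapest_plan("usage.csv", "plans.csv")
        print(f"Cheapest plan for your actual usage: {plan} (${cost:.2f})")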

Personal data assets

When a trend makes it to the World Economic Forum (WEF) in Davos, it's generally evidence that the trend is gathering steam. A WEF report titled "Personal Data: The Emergence of a New Asset Class" suggests that 2012 will be the year when citizens start thinking more about data ownership, whether that data is generated by private companies or the public sector.

"Increasing the control that individuals have over the manner in which their personal data is collected, managed and shared will spur a host of new services and applications," wrote the paper's authors. "As some put it, personal data will be the new 'oil' — a valuable resource of the 21st century. It will emerge as a new asset class touching all aspects of society."

The idea of data as a currency is still in its infancy, as Strata Conference chair Edd Dumbill has emphasized. The Locker Project, which provides people with the ability to move their own data around, is one of many approaches.

The growth of the Quantified Self movement and of online communities like PatientsLikeMe and 23andMe points in the same direction. In the U.S. federal government, the Blue Button initiative, which enables veterans to download personal health data, has now spread to all federal employees and earned adoption at Aetna and Kaiser Permanente.

In early 2012, a Green Button was launched to unleash energy data in the same way. Venture capitalist Fred Wilson called the Green Button an "OAuth for energy data."

Wilson wrote:

"It is a simple standard that the utilities can implement on one side and web/mobile developers can implement on the other side. And the result is a ton of information sharing about energy consumption and, in all likelihood, energy savings that result from more informed consumers."

Hybridized public-private data

Free or low-cost online tools are empowering citizens to do more than donate money or blood: now they can donate time and expertise, or even act as sensors. In the United States, we saw a leading edge of this phenomenon in the Gulf of Mexico, where Oil Reporter, an open source oil spill reporting app, provided a prototype for data collection via smartphone. In Japan, an analogous effort called Safecast grew and matured in the wake of the nuclear disaster that resulted from the massive earthquake and subsequent tsunami of 2011.

Open source software and citizens acting as sensors have steadily been integrated into journalism over the past few years, most dramatically in the videos and pictures uploaded after the 2009 Iran election and during 2011's Arab Spring.

Citizen science looks like the next frontier. Safecast combines open data collected by citizen scientists with academic, NGO and open government data (where available), and then makes it all widely accessible. Similar projects, in which public and experimental data are blended, are beginning to emerge.

Public data is a public good

Despite the myriad challenges presented by legitimate concerns about privacy, security, intellectual property and liability, the promise of more informed citizens is significant. McKinsey's 2011 report dubbed big data the next frontier for innovation, with billions of dollars of economic value yet to be created. When that innovation is applied on behalf of the public good, whether in city planning, transit, healthcare, government accountability or situational awareness, those benefits will be amplified.

We're entering the feedback economy, where dynamic feedback loops between customers and corporations, partners and providers, citizens and governments, or regulators and companies can drive both greater efficiency and leaner, smarter government.

The exabyte age will bring with it the twin challenges of information overload and overconsumption, both of which will require organizations of all sizes to use the emerging toolboxes for filtering, analysis and action. To create public good from public goods — the public sector data that governments collect, the private sector data that is being collected and the social data that we generate ourselves — we will need to collectively forge new compacts that honor existing laws and visionary agreements that enable the new data science to put the data to work.

Photo: NYTimes: 365/360 - 1984 (in color) by blprnt_van, on Flickr

January 03 2012

Le Blog d'Olivier Berruyer sur les crises actuelles > Dwight Eisenhower's farewell address, 1961 | blog in FR - video in EN


Here is President Eisenhower's famous farewell speech on the "military-industrial complex"; all one has to do today is replace that phrase with "banking-financial complex"...

 

------------------

// oAnth - original www-site: http://www.les-crises.fr/eisenhower-1961/


October 08 2011

Brazil: Different Perspectives on Steve Jobs' Work

As the world mourns the death of Steve Jobs, the Brazilian cartoonist Carlos Latuff illustrates another side of the entrepreneur. So does Rodrigo Savazoni, who writes [pt] on the blog Trezentos that Jobs was the number one enemy of collaboration.

October 06 2011

China: The Runaway Bosses of Wenzhou City

Wenzhou, a city in southeastern Zhejiang province, has been a prosperous foreign trading port since the early 19th century. Its citizens have migrated all over the world, and small and medium-sized enterprises (SMEs) have flourished in the city since Deng Xiaoping's open-door policy of the late 1970s. The term "Wenzhou model" has been used to describe the unregulated free-market economic development in the region.

Recently a scandal broke out surrounding Wenzhou government officials who were involved in illegal loan sharking; it is estimated that billions of RMB have been funneled out of the city's banks into the underground loan market and on to local SMEs.

Luggage factory, Wenzhou, China. Image by Malcolm M on Flickr (CC BY 2.0).

However, as a result of the economic downturn, many of these companies have been unable to repay their debts. So far more than 26 Wenzhou entrepreneurs have fled [zh] the country, and the local government is pleading with the central government for help. In the past few days, 'runaway bosses' has become a trending term on various social media sites, and many believe the Wenzhou bosses do not deserve any sympathy.

Young entrepreneur Song Zude comments [zh] on the situation on the Sina Weibo forum:

温州炒房团,该死!房价高涨主要原因是政府哄抬地价、狂追GDP,但温州炒房团也是罪人之一,十多年来,温州人在全国疯狂炒房,扰乱了当地房价,苦了当地百姓!许多温州老板不专心做实业,卷入炒房、虚拟经济、民间高利借贷,最终债务累累,“温跑跑”满世界逃,时候已到,恶有恶报!

The property speculators in Wenzhou do not deserve any sympathy. The main reason for high property prices is the government's policy of lifting land prices and chasing GDP, but the property speculators in Wenzhou are also to blame. In the past decade, Wenzhou people have been pushing up property prices all over the country, disrupting local property markets and creating suffering for local residents. Many Wenzhou bosses are not focused on their businesses; they are involved in property speculation, the virtual economy and loan sharking, and are now deep in debt. "Runaway Wenzhou people" are all over the world; now it is time for them to pay.

There are more than 300 comments in Song's discussion thread, many expressing their anger towards Wenzhou speculators:

-浩米- 温州炒房,潮州炒地,银行炒息,房都没买,先比人轮奸一轮!(今天 09:39)

-浩米-: Wenzhou speculates on property, Chaozhou speculates on land, and the banks speculate on interest. Those who haven't even bought a home have to go through all these gang rapes first.

dengdaiyanguilai 就是啊!现在政府要救他们,那他们以前把全国人民的钱都挣完了,现在要救 ,以前的钱他们退、吐出来吗? (今天 08:03)

dengdaiyanguilai: That's right! They have already extracted all the people's money; before the government bails them out, will they return it?

靖江枫 温州籍老板们,国家受自然灾害袭击时你们在干什么?你们喝着拉菲红酒,开的是豪华跑车,抱的美女明星,而大部分钞票都靠投机倒把得来,被高利贷逼走的人有几个没参与这些勾当?(今天 03:15)

靖江枫: Wenzhou bosses, what were you doing when the country was hit by natural disasters? You were drinking Lafite [an expensive French red wine], driving luxury sports cars and hugging beautiful movie stars, while most of your money came from speculation and profiteering. Of those driven out by loan-shark debts, how many have not engaged in these dealings?

丹东晓程 温州炒房团倒霉之日,就是贫农下中农幸福之时。

丹东晓程: The doomsday of Wenzhou property speculators is a happy day for the poor rural villagers.

本莫愁 炒高国内的,又跑去迪拜炒!结果赶上迪拜房地产业大跌,好多人栽里头!温八子们,拿着在国内炒房赚的钱,全跑国外去给人送钱,真害人不浅!(10月5日 14:59)

本莫愁: They pushed up property prices on the mainland, then went to Dubai to speculate there. In the end Dubai property prices crashed and many buried themselves in it. The eight richest in Wenzhou took the money they earned speculating on property in China and handed it over to foreigners. What harm they have done!

VanTssei 既然ZF创造了一个能叫你稳赚的垄断市场,有资本为什么不赚?本来炒房团就是ZF为了有房市资金介入而鼓励下的产物,炒房先驱而已。真心的,祖德就时事针砭时弊和对娱乐圈冷嘲热讽就算了,别扯太多你的副业。(10月5日 14:44)

VanTssei: Since the government has created a sure-win, monopolized market, it is natural for those with capital to make money from it. The property speculators are themselves a product of government encouragement to draw capital into the housing market; Wenzhou people are just the pioneers. Honestly, Zude, just stick to skewering current affairs and mocking the entertainment world; don't drag in your sidelines so much.

Niu Wenxin, a finance commentator at state-run China Central Television, disagrees with the public sentiment and blames the United States (US) [zh] for the problem:

当中国不能继续紧缩货币之时,美国开始用最强力的手段逼人民币升值。什么意思?我们必须人民币升值就紧缩货币是同义词。去看看温州的情况吧,那就是紧缩的后果。但现在有人存心转移国人视线,把温州问题的原因引向温州老板的投机,贪婪、暴利等。这是混淆是非,是迫使中国经济走向死亡的手段!

Just when China can no longer continue its tight monetary policy, the US starts using the strongest measures to force the RMB to appreciate. What does that mean? For us, RMB appreciation and monetary tightening amount to the same thing. Look at Wenzhou: that is the consequence of tightening. Yet some people are deliberately diverting attention, blaming the Wenzhou problem on the speculation and greed of the Wenzhou bosses. That confuses right and wrong, and it is a way of driving the Chinese economy to its death.

In Niu's discussion thread:

poliwang 现在世界经济增长在放缓,国际大宗商品价格连续暴跌,中国的出口将会面临严峻的冲击,中国如果不马上停止紧缩政策而转向刺激内需,中国经济很可能出现200年时的断崖式滑坡和硬着陆!我们现在首要的问题是要多创造就业,避免出现中小企业大规模倒闭和大量的失业! (10月5日 07:55)

poliwang: Economic growth worldwide is slowing and the prices of major commodities keep tumbling, so China's exports will face a severe shock. If China does not stop its tight monetary policy immediately and turn to stimulating domestic demand, the economy could see a cliff-like slide and a hard landing. Our first priority now is to create more jobs and to prevent large-scale closures of SMEs and mass unemployment.

清蒸_小老虎 温州地下金融资金链断裂的根源其实是金融系统的低效和利率的扭曲。再加上腐败和无效监管,于是形成了事实上的庞氏骗局,最终必然走向崩盘。信贷紧缩只是提早刺破了这个泡泡。毕竟当全民放贷开始出现时,我们应该知道离盛宴结束已经不远了。(10月5日 00:38)

清蒸_小老虎: The collapse of the underground lending chain is really the result of an inefficient financial system and distorted interest rates. Add corruption and ineffective regulation, and what formed was in effect a Ponzi scheme that was bound to collapse. The credit squeeze merely burst the bubble a little earlier. After all, once the whole population started making loans, we should have known the feast was nearly over.

October 03 2011

Cape Verde: Fishing Agreement with European Union

Cape Verdean journalist Odair Varela, on his blog, offers a critical analysis [pt] of the new agreement between Cape Verde's government and the European Union on the exploitation of the country's vast fishing resources.

Colombia: Mining Debate Continues as New Minister is Appointed

Among the many riches of Colombia are gold, silver, emeralds, coal and nickel deposits; however, safety concerns have led many to question the management of the country's mining industry, not only because of the tragedies [es] that have occurred in the mines themselves, but also because of environmental [es] concerns, possible problems of sustainability [es] and the lack of economic, political and social benefits [es].

President Juan Manuel Santos nominated Mauricio Cárdenas Santa María as Minister of Mining and Energy, aiming to initiate [es] a significant restructuring [es] of the Ministry. However, reactions on social networks, blogs and YouTube continue to flood the Internet.

Most of Colombia's gold is produced from alluvial operations, by local artisanal miners. Image by Alzbeta Jungrova, copyright Demotix (26/11/09).

Reactions to new minister

La Silla Vacía [es] reports on the new minister:

Fue director de Planeación Nacional durante el gobierno de Andrés Pastrana y Ministro de Transporte, donde se vio envuelto en el escándalo de Dragacol por una conciliación laboral realizada de manera irregular.

He was the director of National Planning during Andrés Pastrana's administration and Minister of Transportation, where he became embroiled in the Dragacol scandal over a labour settlement carried out in an irregular manner.

Under Mauricio Cárdenas, in 1998 the Ministry of Transportation signed a fraudulent contract worth 26,000 million pesos (US$12.5 million) with the Dredging and Construction Company of Colombia and the Caribbean S.A. (Dragacol) [es] for a series of dredging operations. The group MUMA (@Mumismo) is now questioning the new minister about this case:

PAÍS SIN MEMORIA… Miren quién es el nuevo Ministro de Minas y Energía: “Se cae Dragacol” http://bit.ly/pi9EaR

THIS COUNTRY HAS NO MEMORY… Look who's the new Minister of Mining and Energy: “Dragacol collapses” http://bit.ly/pi9EaR [es]

Effects of mining

Looking at mining from another angle, @TuiterosBoyaca shares a link from the newspaper El Tiempo [es] on the latest news [es] from the department of Boyacá, where on September 21, 2011, a build-up of gas produced an explosion that caused the coal mine 'El Diamante 3' to cave in, leaving seven dead and several injured:

Tras 52 horas, rescatan cuerpos de los siete mineros en Socha (Boyacá) http://bit.ly/qIg1Pw

After 52 hours, the bodies of seven miners in Socha (Boyacá) have been rescued http://bit.ly/qIg1Pw

The following video from August 28, 2010, which has been shared by thousands of netizens, resonates today given the ongoing issues related to mining. 'Open-pit mining contamination in Colombia' was produced out of the concern of various Colombian television actors and actresses. The video includes their opinions in favour of environmental conservation and their positions against contamination; they explain that the chemicals used to extract gold affect the air, the rivers, the local population and the wider ecosystem:

The University of El Rosario in Bogotá (@urosarionews) wonders [es] about the consequences of mining for the environment and links to an article published on July 19, 2011, by Sala de Prensa [es]:

En Yanacocha, Perú, la minería de oro a cielo abierto afectó la naturaleza. ¿Podría suceder esto en Colombia? http://bit.ly/qnTaP1

In Yanacocha, Peru, open-pit gold mining affected nature. Could this happen in Colombia? http://bit.ly/qnTaP1 [es]

The blog El Salmón [es] also expresses worry over sustainability and the environment:

Ahora, a la lenta muerte del nevado del Tolima pretenden sumarle la amenaza de destrucción de los ecosistemas de la cuenca y un futuro incierto para miles de familias que deberán salir de sus fincas para abrirle paso a la locomotora minera…

Now, to the slow death of the Nevado del Tolima they intend to add the threat of destroying the ecosystems of the basin, and an uncertain future for the thousands of families who will have to leave their farms to make way for the mining locomotive…

Activist's murder

Finally, the murder of Father Reinel Restrepo, an activist who led opposition to mining in Marmato, Caldas, where the multinational Gran Colombia Gold extracts gold, has raised several concerns. The parish priest opposed the development of a gold-extraction mega-project in his parish, and had been warned that his stance would cost him his life.

Walter Rengifo (@wrengifoc) makes reference [es] to the possible causes of the death of Father Reinel Restrepo:

El párroco de Marmato caldas denuncio días antes, que temía por su vida por protestar en contra de un mega proyecto de extracción de oro y la reubicación del pueblo para lograr sus fines.

Days before, the parish priest of Marmato, Caldas, declared that he feared for his life for protesting against a gold-extraction mega-project and the relocation of the town to achieve its ends.

Meanwhile, Samir Ammar (@marmatovive) shares a link [es] to the blog Marmato Vive, where a letter was published citing the webpage of the Project for Accompaniment and Solidarity with Colombia as its source:

MARMATO VIVE: COMUNICADO DE ORGANIZACIONES CANADIENSES TRAS EL ASESINATO DEL PADRE JOSÉ REINEL RESTR http://t.co/eVJDtEjs

MARMATO VIVE: COMMUNIQUE FROM CANADIAN ORGANISATIONS ABOUT FATHER JOSE REINEL RESTREPO'S MURDER http://t.co/eVJDtEjs

The blog No to mining [es], citing Noticias UNO [es] as a source, reports:

Reynel Restrepo venía acompañando desde hace dos años por los derechos de los mineros de Marmato, quienes se oponen a que multinacionales del oro los desplacen del lugar. El representante de la junta cívica de Marmato dijo que el padre Restrepo le había dicho que en los últimos días había recibido presión de la Gran Colombia Gold. La multinacional ha dicho que su trabajo no atenta contra la comunidad y en un comunicado que expidió ayer señala que lamentaba la muerte del sacerdote.

Reynel Restrepo had been working for two years in support of the rights of the miners of Marmato, who oppose being displaced by the gold multinationals. The representative of Marmato's civic committee said that Father Restrepo had told him he had come under pressure from Gran Colombia Gold in recent days. The multinational has said that its work poses no threat to the community, and in a communique issued yesterday it said it regretted the death of the priest.

The same entry included a video [es] with testimonies from the inhabitants of Marmato and declarations from the priest.

Panama: The Struggle Against Mining

Joao Q in Mediocerrado [es] wonders “What happened to the anti-mining struggle in Panama?”, and attempts to bring the issue of mining back to the public discussion with a post on the subject.

September 19 2011

Nigeria: Does Terrorism Pay Better Than Farming?

Activista blogger David Habba in Nigeria struck up a conversation with a student at the University of Agriculture in Makurdi who no longer feels financially motivated to enter the agricultural sector: "Someone must grow the food and who says it must be me?"

September 16 2011

Bermuda: Term Limits Backlash

Vexed Bermoothes says of the government's imposition of work permit term limits: “This – combined with the general pissiness of the PLP towards expatriates and international business – led to a massive exodus from the island”; Politics.bm adds: “Their signature policies have hurt Bermudians. Time to fess up.”

September 11 2011

South Korea: Google Raided over Alleged Antitrust Violations

Google's South Korea offices have been raided once again, after the country's largest search portal operators claimed that Google unfairly discouraged competition by limiting search engine options on Android handsets. The South Korean regulator took up the complaint and raided Google's Seoul offices on September 6, 2011, reigniting numerous debates online about whether the decision was fair and what motives lie behind such harsh action.

It is the third time Google Korea has been raided. The first raid came in August 2010, over the large amount of traffic information Google accidentally collected during its Street View project. The second came in May this year, after allegations that AdMob, Google's mobile advertising unit, had gathered personal location data without permission.

Image of Google and Naver Apps on phone, by Lee Yoo Eun (CC-BY-2.0)

This time, South Korea's two largest search engine companies, Naver (NHN Corp) and Daum Communications, filed a complaint with the Korean Fair Trade Commission (KFTC) in April 2011, arguing that the Android mobile operating system unfairly hinders competition by setting Google's search engine as the default option, making it difficult for users to switch to other portals. They also argued that Google continues to discourage competitors by delaying OS certification for phone manufacturers.

Even among Koreans who regularly use the two local portals, the companies' argument was widely seen as irrational, or even ungrateful. Kang Min-soo (@bombshots) tweeted [ko] that it is only natural for Google search to be pre-loaded on Android, and that Naver and Daum should be thankful they are even allowed onto the device.

구글압수수색 들어갔다고. 안드로이드에 구글기본탑재가 불공정거래? 윈도우에 IE 기본탑재는? iOS 검색기본이 구글인거는? 그나마 무료로 재워줬더니 주인보고 안방에서 나오라는 꼴.

So they raided Google. They claimed that pre-loading Google on Android phones was unfair […] What about [Internet Explorer] installation on Windows? Or about Google being the basic search engine on the [iPhone] iOS? This situation is like a landlord who allowed a guest to stay at his place for free, but is on the verge of being kicked out from the master bedroom by that same guest.

J.S. Park (@unclecow) tweeted [ko]:

경쟁력없는 것도 부끄러운 일인터인데 법이 이러쿵 저러쿵 하는 걸 보니 안쓰러울뿐이다. 기분 나쁘면 네이버OS,다음OS만들어서 네이버폰,다음폰 만들어~ 너희들도 네이버폰이랑 다음폰 만들면 구글검색 이용 못하게 할꺼면서.

It is already shameful that they [the Korean companies] lack competitiveness, but watching them brazenly ramble on about this or that legal provision is almost pitiful. If you don't like the situation, then create your own OS, a Naver OS or a Daum OS, and make your own cell phones, a Naver Phone and a Daum Phone. And if you did, you would never allow Google search to be used on your devices.

Many also accused the Korean regulator of being inconsistent in judging antitrust violations. Blogger 어설프군YB pointed out [ko] that while Naver enjoys a dominant position in the search market despite numerous antitrust allegations, it was Google, with only about a 15 percent market share, that was raided.

한국에서 독과점 형태로 모든 온라인 사업에 주도적 역할을 하고 있는 상황에서 […] 여러 면에서 산업 발전을 저해하는 정책을 펼치고 있습니다. 특히나 네이버는 검색시 자사 DB 즉, 네이버가 구축한 블로그, 카페등의 검색 비율을 높임으로 인해 70% 이상의 검색을 독점하고 있는데도 이와 관련해서는 조사를 진행하지 않고 구글에게 문제를 제기한다는 것자체게 어처구니 없는 접근이 아닐까 생각됩니다.

These major Korean portal sites have formed an oligopoly and are playing major roles in almost every web business. But they block development by enforcing their monopolistic policies. And in the case of Naver, they manipulate the search results by directing users to first visit Naver blogs and Naver community sites [before they visit other platforms] keeping people within their network. This manipulation enabled Naver to hold on to more than a 70 percent share of the Korean search market. Questioning Google before they ever investigate Naver is an incomprehensible approach to solving antitrust issues.

Worried that this series of raids may chill freedom of speech online, activists railed against the government's decision. The international digital rights organization Electronic Frontier Foundation (EFF) in the United States even sent an open letter to the Korean Communications Standards Commission (see Jillian York's commentary). Several bloggers who tried to read between the lines raised suspicions that this is a government strategy to 'tame' Google, which has refused to follow certain local laws that may restrict online freedom of expression or undermine people's privacy.

Blogger ‘Photography is Power' wrote [ko] about Google's ongoing conflict with the Korean government that has continued over several years.

수사할것이 있으면 해야죠. 그런데 이 구글 압수수색을 곱지 않게 보는 네티든들이 많습니다. […] 네이버나 다음입장에서는 국내 실정법을 따른 것 이겠지만  자사의 고객 정보를 경찰 영장 하나로 고객에게 당신 정보 경찰에 열람토록 하겠습니다 라는 통보도 없기 그냥 보여줍니다.[…] 2년전 4월  국내는 유튜브 실명제 도입으로 뜨거웠습니다.  동영상도 실명제를 통해서 올려야 한다는 것이죠. 한국법이 그러니 너희도 따르라고 압박을 했습니다. 그러자 구글코리아는 꼼수를 냅니다. 한국지역을 설정하면 업로드를 못하게 막아 놓았죠. 하지만 지역설정을 한국 이외로 하면 올릴 수 있습니다.[…] 구글코리아는 표현의 자유를 억압하는 실명제를 거부했습니다.

If there really is something to investigate, then we should investigate Google. But quite a few net users have cast suspicious glances at this raid. […] In the case of Daum and Naver, the police have full access to their customer information whenever a warrant is issued; they can see all the information without the customer even being notified. The companies can simply argue that they are following local laws. […] Two years ago, in April 2009, YouTube's real-name verification system heated up online debates in South Korea. Back then, the Korean regulator insisted that people had to upload videos under their real names, and pressured foreign websites to follow this law. Then Google Korea found a loophole: it blocked people from uploading videos if their country setting was South Korea, but simply by switching the setting to another country, one could still upload a video from South Korea [under any name]. […] Google Korea rejected the real-name verification system because it suppresses freedom of expression.

The blogger also noted that Google Korea had seemed "lackluster" in recent months, after a localization project failed to materialize and it lost a close partnership with Daum. But he still expressed support for Google, describing it as "the only company that can break down the language barrier", and suggested Google should streamline its translation services in order to survive in the hostile Korean online ecosystem.

September 06 2011

Brazil: Amazon Defender Under Threats Requests Protection

The Brazilian rainforest defender Raimundo Belmiro is urging the authorities to protect him after receiving death threats "for his activism against the destruction of the Amazon jungle". The message is spreading across both the Brazilian and the international blogosphere.

August 31 2011

North Korean Airline Uses Facebook

Marc Perton wrote on the Consumerist blog about the North Korean airline 'Koryo' and its use of social media.
