
July 20 2011

02mydafsoup-01

[...]

The problem is that this concentration of power in the hands of a few creates problems for resilience and availability. "From an engineering standpoint, the downsides to this are the same things you get with monoculture in agriculture," says Labovitz. Ecosystems without genetic variation are the most vulnerable to being wiped out by a single virus. Similarly, as more of us depend on ever fewer sources for content, and get locked into proprietary technologies, we will become more susceptible to potentially catastrophic single points of failure.

That problem will only intensify with the ascendancy of the cloud, one of the biggest internet innovations of the past few years. The cloud is the nebulous collection of servers in distant locations that increasingly store our data and provide crucial services. It started with web mail services like Hotmail, which let you store your email on central servers rather than on the computer in front of you. The concept quickly spread. Last month, Apple announced the iCloud, a free service that will store all your music, photos, email, books and other data - and even apps - for seamless access via any Apple device, be that an iPhone, iPad or MacBook laptop.

Some companies have moved their entire IT departments into the cloud. Indeed, there are companies that barely exist outside the cloud: in addition to backing up data, Amazon lets internet-based companies rent space on its servers.

The cloud could generate exactly the single points of failure that the internet's robust architecture was supposed to prevent. And when those points fail, they may fail spectacularly. During an outage of Amazon's cloud service in April, when the company's servers went dark, entire companies briefly blinked out of existence. Cloud services also raise security concerns. "One big issue with being connected to the cloud is that a lot of information is in different places and shared," says Labovitz. "You no longer have one castle to protect. It's a much more distributed architecture, and a connected one. You just need one weak link."

[...]

Welcome to the age of the splinternet - tech - 20 July 2011 - New Scientist
Reposted bykrekk krekk
02mydafsoup-01

[...]

#1 Technological control. Protocols, hardware and software are mostly US-designed. If, overnight, a couple of players such as Apple and Microsoft decide that Flash sucks, their gravitational field acts upon everything else (they might be right, technically speaking, for web video, but many Flash-based multimedia productions become useless, like providing glasses that won't read old books…). The same goes for hardware designs (microchips, graphic components), operating systems and even HTML norms (even though the W3C, the World Wide Web Consortium, is supposed to be an international organization).

#2 Commercial control. As the internet becomes more applications-oriented, this control over hardware and OS designs and suppliers influences the availability of content. The perfect example is the Apple ecosystem (iPhone, iPod and iPad devices + iTunes + applications). Wanting to focus on its lucrative domestic market, and for alleged production reasons, Apple decided to postpone the release of the iPad outside the US by a couple of months.
Fine. But in doing so, it blocked access to the iPad App Store and all its related content. To use my own [admittedly grey-market] iPad, I managed to switch from a France-based iTunes account to a US one (you must have a billing address there). Then, a new world of content and applications materialized before my eyes. All the applications I had been prevented from grabbing for my iPhone suddenly became available; so did recent movies (to rent or to purchase), TV series, documentaries… and books.

#3 Regulatory control. Apple is not the only one to territorialize its system (although it does so with great zeal). Country blocking — i.e. the ability to implement regional restrictions through country-code top-level domains — is in fact dictated by complex country-to-country copyright agreements.

[...]

Balkanizing the Web | mondaynote.com 2010-05-02
Reposted bykrekk krekk
02mydafsoup-01
[...]

‘When it comes to Cloud Computing, the relationship between the service provider and the customer is by nature asymmetrical’, he says. ‘The former has thousands if not millions of customers and limited liability; in case of litigation, it will have entire control over the elements of proof. As for the customer, he bears the risk of having his service interrupted, his data lost or corrupted — when not retained by the supplier, or accessed by third parties and government agencies.’


[...]

The CVML partner then laid out six critical elements to be implemented in European legislation. These would legally supersede US contractual terms and, as a result, better protect European customers.

1 / Transparency. Guillaume Seligmann suggests a set of standard indicators pertaining to service availability, backup arrangements and pricing, as in the banking industry, for instance. In Europe, a bank must provide a borrower with the full extent of his commitments when underwriting a loan. (Some economists say this provision played a significant role in containing the credit bubble that devastated the US economy.)
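To make the idea of a standard availability indicator concrete, here is a minimal sketch of how such a figure could be computed from logged outages. The formula and names are illustrative assumptions, not anything Seligmann specifies:

```python
# Hypothetical sketch: one standardized indicator could be a monthly
# availability percentage, derived from logged outage durations.
# Names and the 30-day month are illustrative assumptions.

def monthly_availability(outage_minutes, days_in_month=30):
    """Return availability as a percentage of total minutes in the month."""
    total_minutes = days_in_month * 24 * 60
    downtime = sum(outage_minutes)
    return 100.0 * (total_minutes - downtime) / total_minutes

# Two outages of 90 and 54 minutes in a 30-day month:
# 43200 total minutes, 144 minutes of downtime.
availability = monthly_availability([90, 54])
print(round(availability, 2))  # 99.67
```

Published side by side across providers, even a number this simple would let customers compare services the way borrowers compare loan terms.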

2 / Incident notifications. Today, unless he is directly affected, the customer learns about outages from specialized media, rarely through a detailed notification from the service provider. Again, says Seligmann, the Cloud operator should be obliged to report in greater detail all incidents, as well as the steps taken to contain the damage. This would allow the customer to take all measures required to protect his business operations.

3 / Data restitution. On this crucial matter, most contracts remain vague. In many instances, the customer wanting to terminate his contract and get back his precious data will receive a large dump of raw data, sometimes in the provider's proprietary format. ‘That's unacceptable’, says the attorney. ‘The customer should have the absolute guarantee that, at any moment of his choosing, he will have the right to get the latest backed-up version of his data, presented in a standard format immediately usable by another provider. By no means can data be held hostage in the event of a lawsuit’.
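The "standard, immediately usable format" the attorney demands could be as plain as newline-delimited JSON, which any competing provider can parse. A hedged sketch (record fields are invented for illustration):

```python
# Illustrative only: export customer records in a provider-neutral
# format (JSON Lines) instead of a proprietary dump. Field names
# are invented for the example.
import json

def export_records(records):
    """Serialize customer records to newline-delimited JSON."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

backup = export_records([
    {"id": 1, "type": "invoice", "total": 119.0},
    {"id": 2, "type": "contact", "name": "Jane"},
])
print(backup.splitlines()[0])
```

The point is not the format itself but that a guaranteed, documented export leaves the switching cost with the provider rather than the customer.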

4 / Control and certification. Foreign-headquartered companies, themselves renting facilities in other countries, create a chain fraught with serious hazards. The only way to mitigate the risks is to give customers the ability to monitor at all times the facility hosting their data. That is probably not easy to implement, for confidentiality and security reasons. At the very least, says Guillaume Seligmann, any Cloud provider should be certified by a third-party entity, in the same way many industries (energy, transportation, banking) get certifications and ratings from specialized agencies – think about how critical such provisions are for airlines or nuclear power plants.

5 / Governing laws. The idea is to avoid the usual clause: “For any dispute, the parties consent to personal jurisdiction in, and the exclusive venue of, the courts of Santa Clara County, California”. To many European companies, this sounds like preemptive surrender. According to Seligmann’s proposal, the end-user should have the option to take his case before his own national court and the local judge should have the power to order really effective remedies. This is the only way to make the prospect of litigation a realistic one.

6 / Enforceability. The credibility of the points stated above depends on their ability to supersede and to render ineffective conflicting contractual terms imposed by the service provider. In that respect, the European Union is well armed to impose such constraints, as it already did on personal data protection. In the US, imposing the same rules might be a different story.

The overall issue of regulating the cloud is far from anecdotal. Within a few years, we can bet the bulk of our hard drives – individual as well as collective ones – will be in other people's hands: Amazon's S3 storage service now stores 339 billion objects – twice last year's volume.
We'll gain in terms of convenience and efficiency. We should also gain in security.

[END]
Catching the Cloud | mondaynote.com 2011-07-17
Reposted bykrekk krekk

July 19 2011

Google+ is the social backbone

The launch of Google+ is the beginning of a fundamental change on the web. A change that will tear down silos, empower users and create opportunities to take software and collaboration to new levels.

Social features will become pervasive, and fundamental to our interaction with networked services. Collaboration from within applications will be as natural to us as searching for answers on the web is today.

It's not just about Google vs Facebook

Much attention has focused on Google+ as a Facebook competitor, but to view the system solely within that context is short-sighted. The consequences of the launch of Google+ are wider-reaching, more exciting and undoubtedly more controversial.

Google+ is the rapidly growing seed of a web-wide social backbone, and the catalyst for the ultimate uniting of the social graph. All it will take on Google's part is a step of openness to bring about such a commoditization of the social layer. This would not only be egalitarian, but would also be the most effective competitive measure against Facebook.

As web search connects people to documents across the web, the social backbone connects people to each other directly, across the full span of web-wide activity. (For the avoidance of doubt, I take "web" to include networked phone and tablet applications, even if the web use is invisible to the user.)

Search removed the need to remember domain names and URLs. It's a superior way to locate content. The social backbone will relieve our need to manage email addresses and save us laborious "friending" and permission-granting activity — in addition to providing other common services such as notification and sharing.

Though Google+ is the work of one company, there are good reasons to herald it as the start of a commodity social layer for the Internet. Google decided to make Google+ part of the web rather than a walled garden. There is good reason to think that represents an inclination to openness and interoperation, as I explain below.


It's time for the social layer to become a commodity

We're now several years into the era of social networks. Companies have come and gone, trying to capture the social graph and exploit it. Well-intentioned but doomed grass-roots initiatives have waxed and waned. Facebook has won the platform game, becoming the dominant owner of our social attention, albeit mostly limited to non-workplace applications.

What does this activity in social software mean? Clearly, social features are important to us as users of computers. We like to identify our friends, share with them, and meet with them. And it's not just friends. We want to identify co-workers, family, sales prospects, interesting celebrities.

Currently, we have all these groups siloed. Because we have many different contexts and levels of intimacy with people in these groups, we're inclined to use different systems to interact with them. Facebook for gaming, friends and family. LinkedIn for customers, recruiters, sales prospects. Twitter for friends and celebrities. And so on into specialist communities: Instagram and Flickr, Yammer or Salesforce Chatter for co-workers.

The situation is reminiscent of electronic mail before it became standardized. Differing semi-interoperable systems, many as walled gardens. Business plans predicated on somehow "owning" the social graph. The social software scene is filled with systems that assume a closed world, making them more easily managed as businesses, but ultimately providing for an uncomfortable interface with the reality of user need.

An interoperable email system created widespread benefit, and permitted many ecosystems to emerge on top of it, both formal and ad-hoc. Email reduced distance and time between people, enabling rapid iteration of ideas, collaboration and community formation. For example, it's hard to imagine the open source revolution without email.

When the social layer becomes a standard facility, available to any application, we'll release ourselves into a world of enhanced diversity, productivity and creative opportunity. Though we don't labor as much under the constraints of distance or time as we did before email, we are confined by boundaries of data silos. Our information is owned by others, we cannot readily share what is ours, and collaboration is still mostly boxed by the confines of an application's ability.

A social backbone would also be a boost for diversity. Communities of interest would be enabled by the ready availability of social networking, without the heavy lifting of creating the community or the risk of disapproval or censorship from a controlling enterprise.

The effect of email interoperability didn't just stop at enabling communication: it was a catalyst for standards in document formats and richer collaboration. The social backbone won't just make it easier to handle permissions, identity and sharing, but will naturally exert pressure for further interoperation between applications. Once their identity is united across applications, users will expect their data to travel as well.

We see already a leaning toward this interoperability: the use of Twitter, Facebook and Google as sign-on mechanisms across websites and games, attempts to federate and intermingle social software, cloud-based identity and wallet services.
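At the heart of those sign-on mechanisms is delegated identity: a provider vouches for a user, and a relying site verifies the assertion rather than managing its own passwords. A deliberately toy sketch of the principle, using a shared-secret signature (real protocols such as OpenID and OAuth are considerably richer; every name here is an assumption):

```python
# Toy sketch of delegated sign-on: the identity provider signs an
# assertion ("this user authenticated at this time") with a secret
# shared with the relying site, which then verifies the signature.
# Illustrative only; not a substitute for OpenID/OAuth.
import hashlib
import hmac

SECRET = b"shared-secret-between-provider-and-site"  # illustrative

def sign_assertion(user_id: str, timestamp: int) -> str:
    msg = f"{user_id}:{timestamp}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify_assertion(user_id: str, timestamp: int, signature: str) -> bool:
    expected = sign_assertion(user_id, timestamp)
    return hmac.compare_digest(expected, signature)

sig = sign_assertion("jane@example.org", 1311120000)
print(verify_assertion("jane@example.org", 1311120000, sig))  # True
```

Once many sites accept assertions like these from a common layer, the "friending and permission-granting" labor described above only has to happen once.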

What a social backbone would do

As users, what can we expect a social backbone to do for us? The point is to help computers serve us better. We naturally work in contexts that involve not only documents and information, but groups of people. When working with others, the faster and higher bandwidth the communication, the better.

To give some examples, consider workplace collaboration. Today's groupware solutions are closed worlds. It's impractical for them to encompass either a particularly flexible social model, or a rich enough variety of applications and content, so they support a restricted set of processes. A social backbone could make groupware out of every application. For the future Photoshop, iMovie and Excel, it adds the equivalent power of calling someone over and saying "Hey, what about this?"

Or think about people you interact with. When you're with someone, everything you're currently doing with them is important. Let's say you're working with your friend Jane on the school's PTA fundraiser, and her kids and yours play together. Drag Jane into your PTA and Playdates circles. Drop a letter to parents into the PTA circle, and your calendar's free/busy info into Playdates.

Now you're sharing information both of you need. Next Thursday you see Jane at school. While you're chatting, naturally the topic of playdates and the PTA come up. You bring up Jane on your phone, and there are links right there to the letter you're writing, and some suggested dates for mutually free time.
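The circle-based sharing in that example can be sketched with a tiny data model (the names come from the scenario above; the model itself is my invention, not how Google+ is implemented):

```python
# A sketch of circle-based sharing: circles are named groups of
# people, content is shared with circles, and what a person can see
# is the union over the circles they belong to. Purely illustrative.

class Circles:
    def __init__(self):
        self.circles = {}   # circle name -> set of people
        self.shared = {}    # circle name -> list of shared items

    def add(self, circle, person):
        self.circles.setdefault(circle, set()).add(person)

    def share(self, circle, item):
        self.shared.setdefault(circle, []).append(item)

    def visible_to(self, person):
        """Everything shared with any circle the person belongs to."""
        return [item
                for circle, members in self.circles.items()
                if person in members
                for item in self.shared.get(circle, [])]

c = Circles()
c.add("PTA", "Jane")
c.add("Playdates", "Jane")
c.share("PTA", "letter-to-parents.odt")
c.share("Playdates", "free/busy calendar")
print(c.visible_to("Jane"))  # both shared items
```

The asymmetry matters: you choose what each circle sees, and Jane needs no setup at all to see it.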

Teaching computer systems about who we know lets them make better guesses as to what we need to know, and when. My examples are merely simple increases in convenience. The history of computing frequently shows that once a platform is opened up, the creative achievements of others far exceed those dreamed of by the platform's progenitors.

The social backbone democratizes social software: developers are freed from the limitations of walled gardens, and the power to control what you do with your friends and colleagues is returned to you, the user.

Social backbone services

Which services will the social backbone provide? We can extract these from those provided by today's web and social software applications:

  • Identity — authenticating you as a user, and storing information about you
  • Sharing — access rights over content
  • Notification — informing users of changes to content or contacts' content
  • Annotation — commenting on content
  • Communication — direct interaction among members of the system
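The five services above can be sketched as one minimal, entirely hypothetical in-memory "backbone" API that any application could call; every name and signature here is an assumption for illustration:

```python
# Hypothetical sketch of the five backbone services as one small
# in-memory API. Real implementations would be distributed services.

class SocialBackbone:
    def __init__(self):
        self.users, self.grants = {}, {}
        self.inbox, self.notes = {}, {}

    # Identity: register and store information about a user
    def register(self, user_id, profile):
        self.users[user_id] = profile

    # Sharing: access rights over content
    def grant(self, content_id, user_id, right):
        self.grants.setdefault(content_id, {})[user_id] = right

    def can(self, content_id, user_id, right):
        return self.grants.get(content_id, {}).get(user_id) == right

    # Notification / Communication: deliver an event to a user's feed
    def notify(self, user_id, event):
        self.inbox.setdefault(user_id, []).append(event)

    # Annotation: comment on content
    def annotate(self, content_id, user_id, text):
        self.notes.setdefault(content_id, []).append((user_id, text))

b = SocialBackbone()
b.register("jane", {"name": "Jane"})
b.grant("doc-1", "jane", "edit")
print(b.can("doc-1", "jane", "edit"))  # True
```

The value of such an interface lies less in any one method than in its being common: every application gets the same identity, sharing and notification semantics for free.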

These facilities are not new requirements. Each of them has been met in differing ways by existing services. Google and Amazon serve as identity brokers with a reasonable degree of assurance, as do Twitter and Facebook, albeit with a lesser degree of trust.

A host of web services address sharing of content, though mostly focused on sharing the read permission, rather than the edit permission. Notification originated with email, graduated through RSS, and is now a major part of Twitter's significance, as well as a fundamental feature of Facebook. Annotation is as old as the web, embodied by the hyperlink, but has been most usefully realized through blogging, Disqus, Twitter and Facebook commenting. Communication between users has been around as long as multi-user operating systems, but is most usefully implemented today in Facebook chat and instant messaging, where ad-hoc groups can easily be formed.

Why not Facebook?

Unfortunately, each of today's answers to providing these social facilities is limited by its implementation. Facebook provides the most rounded complement of social features, so it's reasonable to ask why Facebook itself can't provide the social backbone for the Internet.

Facebook's chief flaw is that it is a closed platform. Facebook does not want to be the web. It would like to draw web citizens into itself, so it plays on the web, but on terms that leave no room for doubt about where the power lies. Content items in Facebook do not have a URI, so by definition they can never be part of the broader web. If you want to use Facebook's social layer, you must be part of, and subject to, the Facebook platform.

Additionally, there are issues with the symmetry of Facebook's friending model: it just doesn't model real life situations. Even the term "friend" doesn't allow for the nuance that a capable web-wide social backbone needs.

This is not to set up a Facebook vs Google+ discussion, but to highlight that Facebook doesn't meet the needs of a global social backbone.

Why Google+?

Why is Google+ the genesis of a social backbone? The simple answer is that it's the first system to combine a flexible enough social model with a widespread user base, and a company for which exclusive ownership of the social graph isn't essential to its business.

Google also has the power to bootstrap Google+ as a social backbone: the integration of Google+ into Google's own web applications would be a powerful proving ground and advertisement for the concept.

Yet one company alone should not have the power to manage identity for everyone. A workable and safe social backbone must support competition and choice, while still retaining the benefits of the network. Email interoperability was created not by the domination of one system, but by standards for communication.

To achieve a web-wide effect, Google+ needs more openness and interoperability, which it does not yet have. The features offered by the upcoming Google+ API will give us a strong indication of Google's attitude towards control and interoperability.

There is some substantial evidence that Google would support an open and interoperable social backbone:

  • Google's prominence as a supporter of the open web, which is crucial to its business.
  • The early inclination to interoperation of Google+: public content items have a URI, fallback to email is supported for contacts who are not Google+ members.
  • Google is loudly trumpeting their Data Liberation Front, committed to giving users full access to their own data.
  • Google has been involved in the creation of, or has supported, early stage technologies that address portions of the social backbone, including OAuth, OpenID, OpenSocial, PubSubHubbub.
  • Google displays an openness to federation with interoperating systems, evinced most keenly by Joseph Smarr, the engineer behind the Google+ Circles model. The ill-fated Google Wave incorporated federation.
  • The most open system possible would best benefit Google's mission in organizing the world's information, and their business in targeting relevant advertising.

Toward the social backbone

Computers ought to serve us and provide us with means of expression.

A common, expressive and interoperable social backbone will help users and software developers alike. Liberated from information silos and the repeated labor of curating friends and acquaintances, we will be able to collaborate more freely. Applications will be better able to serve us as individuals, not as an abstract class of "users".

The road to the social backbone must be carefully trodden, with privacy a major issue. There is a tough trade-off between providing usable systems and those with enough nuance to sufficiently meet our models of collaboration and sharing.

Obstacles notwithstanding, Google+ represents the promise of a next generation of social software. Incorporating learnings from previous failures, a smattering of innovation, and a close attention to user need, it is already a success.

It requires only one further step of openness to take the Google+ product into the beginnings of a social backbone. By taking that step, Google will contribute as much to humanity as it has with search.

Edd Dumbill is the chair of O'Reilly's Strata and OSCON conferences. Find him here on Google+.


(Google's Joseph Smarr, a member of the Google+ team, will discuss the future of the social web at OSCON.)

Reposted byRK RK

July 10 2011

02mydafsoup-01

Browsersuite - The SeaMonkey® Project


SeaMonkey 2.2 is based on Firefox 5
The browser suite SeaMonkey has been released in version 2.2 and is based on Firefox 5. The SeaMonkey makers thereby follow Firefox's short release intervals. [...]






July 03 2011

02mydafsoup-01

MediaElement.js - HTML5 video player and audio player with Flash and Silverlight shims | open source


HTML5 <video> and <audio> made easy.
One file. Any browser. Same UI.

June 09 2011

02mydafsoup-01

Piwik - Web analytics - Open source

  
Piwik is a downloadable, open source (GPL licensed), real-time web analytics software program. It provides you with detailed reports on your website visitors: the search engines and keywords they used, the language they speak, your popular pages… and so much more.
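Under the hood, Piwik records visits through a plain HTTP tracking endpoint; a tracking request might be assembled like this. The endpoint and parameter names follow Piwik's tracking API as I understand it at the time of writing, so treat them as an assumption rather than a reference:

```python
# Sketch: building a Piwik tracking request URL. Parameter names
# (idsite, rec, url, action_name) are my reading of Piwik's HTTP
# tracking API; verify against the official docs before relying on it.
from urllib.parse import urlencode

def tracking_url(base, idsite, page_url, action_name):
    params = {"idsite": idsite, "rec": 1,
              "url": page_url, "action_name": action_name}
    return f"{base}/piwik.php?{urlencode(params)}"

print(tracking_url("https://stats.example.org", 1,
                   "https://example.org/about", "About"))
```

Because tracking is just an HTTP request to your own server, the data never has to leave your infrastructure, which is the main draw over hosted analytics.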

June 05 2011

02mydafsoup-01

The Making of Diaspora - IEEE Spectrum - 2011-06

Armed with Google technologies, four young coders are planting the seeds for the post-Facebook future
By Ariel Bleicher  /  June 2011

on scoop.it - oAnth - permalink

June 02 2011

02mydafsoup-01

skype-open-source

Skype protocol reverse engineered, source available for download

May 29 2011

02mydafsoup-01
Facebook is building a browser - ur1.ca/4auzd | via scripting.com 2011-05-29

----------------------------------------------------------------

// oAnth - Certainly highly restrictive (in the IT-telcos' newspeak, 'secure'), continuing and implementing the #eG8 criteria à la Sarko - imho the alarm bells are ringing: it fuels speculation as to why the Mozilla Foundation was not invited to join the panel of 'immortels' selected by the Sarko government
oAnth - via diaspora | 2011-05-29

May 28 2011

demo | Crocodoc

This Crocodoc site has impressive document manipulation technology. It converts PDF to HTML5, which the user can annotate and then download as a modified PDF.
Reposted fromeob eob

May 16 2011

02mydafsoup-01
Using biometrics to replace memorized access credentials is a very, very bad idea:

You can't change your biometrics at will.

Having your biometrics stored somewhere allows for abuse of that data.

Biometric access control doesn't allow for a distress signal or plausible-deniability access. Imagine being forced into a retina scan vs. being forced to enter a password. In the password case, one could have a distress password which gives access only to noncritical information/resources and/or issues a distress signal/action.

Biometric access control means that if someone wants some resource badly enough, physical violence may be directed at you.
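The distress-password idea from the post can be sketched in a few lines: alongside the real password, the user registers a duress password that unlocks a decoy view and silently raises an alarm. Entirely illustrative (and the plain SHA-256 hashing is demo-only; a real system would use a proper password KDF):

```python
# Sketch of a duress password: the "wrong-but-registered" password
# grants decoy access and triggers a silent alarm. Illustrative only.
import hashlib

def _h(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()  # demo only; use a real KDF

VAULT = {"real": _h("correct horse"), "duress": _h("battery staple")}

def login(password, alarm):
    digest = _h(password)
    if digest == VAULT["real"]:
        return "full-access"
    if digest == VAULT["duress"]:
        alarm()                 # silent distress signal
        return "decoy-access"   # only noncritical data visible
    return "denied"

alerts = []
print(login("battery staple", lambda: alerts.append("distress")))  # decoy-access
```

A retina scan offers no equivalent: there is no second retina to present under duress, which is exactly the post's point.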

--------------------------------------------------

// oAnth - an answer on

Iris recognition scanner eliminates passwords


via http://02mydafsoup-01.soup.io/post/132167965/Iris-recognition-scanner-eliminates-passwords
Reposted fromdatenwolf datenwolf

May 12 2011

New Google Analytics - Overview Reports Overview

This is part of our series of posts highlighting the new Google Analytics. The new version of Google Analytics is currently available in beta to all Analytics users. And follow Google Analytics on Twitter for the latest updates.

This week we’re going a bit meta with an overview of the new Overview reports in the new Google Analytics. Overview reports were part of the old version of Analytics, of course, but we’ve made some changes to help your analysis.

Anatomy of the Overview Report
Each overview report consists of three sections. There's a timeline graph, some aggregate metrics, and a set of reports.



What's inside each of these sections depends on which report you're looking at. For example, the Visitor Overview shows a graph of visits and metrics like New vs. Returning visitors, while the Content Overview shows metrics like pageviews and average time on page.

The Graph
We’ve made a few changes to the graphs in the new Google Analytics, and we'll share them here. You can now make adjustments to the graphs you see in Google Analytics from the buttons on the top right of the graph:
  • Switch a graph between Line Chart and Motion Chart
  • Graph different metrics: Select from the dropdown or the scorecard
  • Compare two metrics: Graph an additional metric for comparison

  • Graph By: Change the graph between Monthly, Weekly, Daily, and even Hourly for some reports


Reports
The bottom section of an overview report lets you look through a subset of the reports available in that section. You can flip through these reports to decide where you want to start your analysis. In the Traffic Sources Overview, we can start by looking at a report of Keywords.



From here we can go view the full report or look at another report, like Referral Sources:



Intelligence Overview
Google Analytics Intelligence automatically searches your website traffic for anomalies. When it finds something out of the ordinary, it surfaces this as an alert. You can also set up your own alerts by defining custom alerts.
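"Searching your traffic for anomalies" boils down to flagging days that deviate strongly from recent history. A toy version of such an alert, using a z-score over a trailing window (the threshold is invented and GA's actual models are not public, so this only illustrates the concept):

```python
# Toy anomaly alert: flag a value that lies more than `threshold`
# standard deviations from the mean of recent history. The 3-sigma
# threshold is an illustrative assumption, not GA's algorithm.
from statistics import mean, stdev

def is_anomaly(history, today, threshold=3.0):
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu  # flat history: any change is anomalous
    return abs(today - mu) / sigma > threshold

visits = [1040, 980, 1010, 995, 1025, 1005, 990]
print(is_anomaly(visits, 2400))  # sudden traffic spike -> True
print(is_anomaly(visits, 1000))  # ordinary day -> False
```

Custom alerts generalize the same idea: you pick the metric, the comparison window and the threshold yourself.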

Now you can feel like the president of the principality of Analytica with your very own Intelligence Overview report.



The Intelligence Overview report shows you all of your automatic alerts (daily, weekly, and monthly) at a glance. From the Intelligence Overview, you can click on Details to see a graph of the alert and go directly into the GA report. You can also add or review an annotation right from the pop-up graph.


I hope you enjoyed this overview of Overview Reports. Please continue to send us feedback on the new Google Analytics. Stay tuned for next week’s installment in New Google Analytics series.

Posted by Trevor Claiborne, Google Analytics Team
Reposted fromdarinrmcclure darinrmcclure

May 10 2011

In the past few years, there has been massive growth in new and exciting cheap or free website usability testing tools, so here's my list of 24 tools you may need to use from time to time. Gone are the days of using expensive recruitment firms, labs and massive amounts of time to create, deploy and report on usability tests. By using these usability testing tools and others like them, you have, for the first time, a complete set of tools designed to tackle almost any usability research job. From recruiting real users (with tools such as Ethnio) to conducting live one-on-one remote moderated tests (UserVue) to analyzing the results of usability changes with A/B testing (Google Website Optimizer), there is a plethora of useful and usable tools for usability testing.
24 Usability Testing Tools | Useful Usability
Reposted fromjoshdamon joshdamon

May 01 2011

02mydafsoup-01
Ars Technica locked out of #Facebook account due to infringement complaint. Community policing is prone to abuse. https://eff.org/r.47u

---------------------------------------------------------------
// by arstechnica.com - started 2011-04-29, with several updates:

Facebook shoots first, ignores questions later; account lock-out attack works (Update X)

Twitter / EFF: Ars Technica locked out of ... | 2011-05-01
