
January 27 2014

The new bot on the block

Fukushima changed robotics. More precisely, it changed the way the Japanese view robotics. And given the historic preeminence of the Japanese in robotic technology, that shift is resonating through the entire sector.

Before the catastrophic earthquake and tsunami of 2011, the Japanese were focused on “companion” robots, says Rodney Brooks, a former Panasonic Professor of Robotics at MIT, a co-founder and former chief technology officer of iRobot, and the founder, chairman and CTO of Rethink Robotics. The goal, says Brooks, was making robots that were analogues of human beings — constructs that could engage with people on a meaningful, emotional level. Cuteness was emphasized: a cybernetic, if much smarter, equivalent of Hello Kitty seemed to be the paradigm.

But the multiple core meltdown at the Fukushima Daiichi nuclear complex following the 2011 tsunami changed that focus abruptly.

“Fukushima was a wake-up call for them,” says Brooks. “They needed robots that could do real work in highly radioactive environments, and instead they had robots that were focused on singing and dancing. I was with iRobot then, and they asked us for some help. They realized they needed to make a shift, and now they’re focusing on more pragmatic designs.”

Pragmatism was always the guiding principle for Brooks and his companies, and is currently manifest in Baxter, Rethink’s flagship product. Baxter is a breakthrough production robot for a number of reasons. Equipped with two articulated arms, it can perform a multitude of tasks. It requires no application code to start up, and no expensive software to function. No specialists are required to program it; workers with minimal technical background can “teach” the robot right on the production line through a graphical user interface and arm manipulation. Also, Baxter requires no cage — human laborers can work safely alongside it on the assembly line.
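
To make the “teach, don’t program” idea concrete, here is a minimal sketch of demonstration-based training in Python. It is not Rethink’s software; the class, its methods, and the pose tuples are invented for illustration, and a real arm would read its joint encoders and interpolate motion in hardware.

```python
# A minimal sketch (not Rethink's actual software) of teach-by-demonstration:
# an operator guides the arm by hand to a series of poses, each pose is
# recorded, and the robot later replays the recorded trajectory.

import time


class TeachableArm:
    """Toy stand-in for a compliant robot arm with readable joint angles."""

    def __init__(self):
        self.waypoints = []

    def read_joint_angles(self):
        # A real system would query the arm's encoders; this placeholder
        # just returns a fixed seven-joint pose.
        return (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0)

    def record_waypoint(self):
        # Called when the operator presses "record" after guiding the arm
        # by hand to the desired pose.
        self.waypoints.append(self.read_joint_angles())

    def move_to(self, pose, duration=1.0):
        # Real hardware would interpolate joint angles toward `pose`.
        time.sleep(duration)

    def replay(self):
        # Replaying the taught waypoints is the entire "program".
        for pose in self.waypoints:
            self.move_to(pose)


arm = TeachableArm()
arm.record_waypoint()   # operator holds the gripper over the parts bin
arm.record_waypoint()   # operator holds the gripper over the conveyor
arm.replay()            # the robot repeats the taught motion on the line
```

The point is that the “program” is nothing more than the recorded poses, which is why a line worker can create it by physically guiding the arm rather than writing application code.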

Moreover, it is cheap: about $25,000 per unit. It is thus the robotic equivalent of the Model T, and like the Model T, Baxter and its subsequent iterations will impose sweeping changes in the way people live and work.

“We’re at the point with production robots where we were with mobile robots in the late 1980s and early 1990s,” says Brooks. “The advances are accelerating dramatically.”

What’s the biggest selling point for this new breed of robot? Brooks sums it up in a single word: dignity.

“The era of cheap labor for factory line work is coming to a close, and that’s a good thing,” he says. “It’s grueling, and it can be dangerous. It strips people of their sense of worth. China is moving beyond the human factory line — as people there become more prosperous and educated, they aspire to more meaningful work. Robots like Baxter will take up the slack out of necessity.”

And not just for the assemblage of widgets and gizmos. Baxter-like robots will become essential in the health sector, opines Brooks — particularly in elder care. As the Baby Boom piglet continues its course through the demographic python, the need for attendants is outstripping supply. No wonder: the work is low-paid and demanding. Robots can fill this breach, says Brooks, doing everything from preparing and delivering meals to shuttling laundry, changing bedpans and mopping floors.

“Again, the basic issue is dignity,” Brooks said. “Robots can free people from the more menial and onerous aspects of elder care, and they can deliver an extremely high level of service, providing better quality of life for seniors.”

Ultimately, robots could be more app than hardware: the sexy operating system on Joaquin Phoenix’s mobile device in the recent film “Her” may not be far off the mark. Basically, you’ll carry a “robot app” on your smartphone. The phone can be docked to a compatible mechanism — say, a lawn mower, or car, or humanoid mannequin — resulting in an autonomous device ready to trim your greensward, chauffeur you to the opera, or mix your Mojitos.

YDreams Robotics, a company co-founded by Brooks protégé Artur Arsenio, is actively pursuing this line of research.

“It’s just a very efficient way of marketing robots to mass consumers,” says Arsenio. “Smartphones basically have everything you need, including cameras and sensors, to turn mere things into robots.”

YDreams has its first product coming out in April: a lamp. It’s a very fine if utterly unaware desk lamp on its own, says Artur, but when you connect it to a smartphone loaded with the requisite app, it can do everything from intelligently adjusting lighting to gauging your emotional state.

“It uses its sensors to interface socially,” Artur says. “It can determine how you feel by your facial expressions and voice. In a video conference, it can tell you how other participants are feeling. Or if it senses you’re sad, it may Facebook your girlfriend that you need cheering up.”
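
Strip away the marketing and Arsenio is describing a sense-decide-act loop. Here is a hedged sketch of that loop; the mood classifier, the lighting call, and the messaging hook are invented placeholders rather than YDreams Robotics code, and the device handles are assumed to come from the companion smartphone app.

```python
# A hedged sketch of the lamp's sense-decide-act loop. The mood estimator,
# lighting adjustment, and messaging hook below are invented placeholders,
# not YDreams Robotics code.

def estimate_mood(face_image, voice_clip):
    """Placeholder classifier returning 'happy', 'neutral', or 'sad'.
    A real version would run facial-expression and voice-prosody models."""
    return "neutral"


def adjust_lighting(lamp, mood):
    # Warm the light for a low mood, otherwise keep ordinary task lighting.
    lamp.set_color_temperature(2700 if mood == "sad" else 4000)


def notify_contact(contact, message):
    # Stand-in for whatever messaging channel the companion app uses.
    print(f"to {contact}: {message}")


def lamp_step(lamp, camera, microphone, contact):
    """One pass of the loop: sense, classify, act."""
    mood = estimate_mood(camera.capture(), microphone.record(seconds=3))
    adjust_lighting(lamp, mood)
    if mood == "sad":
        notify_contact(contact, "Your friend could use some cheering up.")
```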

Yikes. That may be a bit more interaction than you want from a desk lamp, but get used to it. Robots could intrude in ways that may seem a little off-putting at first — but that’s a marker of any new technology. Moreover, says Paul Saffo, a consulting professor at Stanford’s School of Engineering and a technology forecaster of repute, the highest use of robots won’t be doing old things better. It will be doing new things, things that haven’t been done before, things that weren’t possible before the development of key technology.

“Whenever we have new tech, we invariably try to use it to do old things in a new way — like paving cow paths,” says Saffo. “But the sooner we get over that — the sooner we look beyond the cow paths — the better off we’ll be. Right now, a lot of the thinking is, ‘Let’s have robots drive our cars, and look like people, and be physical objects.’”

But the most important robots working today don’t have physical embodiments, says Saffo — think of them as ether-bots, if you will. Your credit application? It’s a disembodied robot that gets first crack at that. And the same goes for your resume when you apply for a job.

In short, robots already are embedded in our lives in ways we don’t think of as “robotic.” This trend will only accelerate. At a certain point, things may start feeling a little — well Singularity-ish. Not to worry — it’s highly unlikely Skynet will rain nuclear missiles down on us anytime soon. But the melding of robotic technology with dumb things nevertheless presents some profound challenges — mainly because robots and humans react on disparate time scales.

“The real questions now are authority and accountability,” says Saffo. “In other words, we have to figure out how to balance the autonomy systems need to function with the control we need to ensure safety.”

Saffo cites modern passenger planes like the Airbus A330 as an example.

“Essentially they’re flying robots,” he says. “And they fly beautifully, conserving fuel to the optimal degree and so forth. But the design limits are so tight — if they go too fast, they can fall apart; if they go too slow, they stall. And when something goes wrong, the pilot has perhaps 50 kilometers to respond. At typical speeds, that doesn’t add up to much reaction time.”

Saffo noted that the crash of Air France Flight 447 in the mid-Atlantic in 2009 involved an Airbus A330. Investigations revealed the likely cause was turbulence complicated by the icing up of the plane’s speed sensors. This caused the autopilot to disengage, and the plane began to roll. The pilots had insufficient time to compensate, and the aircraft slammed into the water at 107 knots.

“The pilot figured out what was wrong — but it was 20 seconds too late,” says Saffo. “To me, it shows we need to devote real effort to defining boundary parameters on autonomous systems. We have to communicate with our robots better. Ideally, we want a human being constantly monitoring the system, so he or she can intervene when necessary. And we need to establish parameters that make intervention even possible.”
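
Saffo’s “boundary parameters” amount to something like the toy monitor below: watch redundant sensors, and the moment they disagree or the aircraft nears a limit, raise an explicit alert and hand authority back to the human. The thresholds here are invented for illustration and bear no relation to real avionics.

```python
# A toy illustration (not avionics software) of boundary monitoring:
# flag unreliable data when redundant sensors disagree, and alert the
# human when the estimate approaches a design limit.

def airspeed_unreliable(sensors_knots, max_disagreement=15.0):
    """Treat the data as suspect if redundant airspeed sensors diverge."""
    return max(sensors_knots) - min(sensors_knots) > max_disagreement


def monitor_step(sensors_knots, stall_speed=120.0, overspeed=350.0):
    if airspeed_unreliable(sensors_knots):
        return "ALERT: airspeed unreliable, pilot has control"
    airspeed = sum(sensors_knots) / len(sensors_knots)
    if airspeed < stall_speed:
        return "ALERT: approaching stall boundary"
    if airspeed > overspeed:
        return "ALERT: approaching structural limit"
    return "autopilot engaged"


# Iced-up pitot tubes show up as disagreement between redundant sensors.
print(monitor_step([270.0, 272.0, 268.0]))   # autopilot engaged
print(monitor_step([270.0, 180.0, 268.0]))   # ALERT: airspeed unreliable, ...
```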

Rod Brooks will be speaking at the upcoming Solid Conference in May. If you are interested in robotics and other aspects of the convergence of physical and digital worlds, subscribe to the free Solid Newsletter.

January 24 2014

Four short links: 24 January 2014

  1. What Every Computer Scientist Should Know About Floating-Point Arithmetic — in short, “it will hurt you.” (A short demonstration follows this list.)
  2. Ori — a distributed file system built for offline operation that empowers the user with control over synchronization operations and conflict resolution. We provide history through lightweight snapshots and allow users to verify the history has not been tampered with. Through the use of replication, instances can be resilient and recover damaged data from other nodes.
  3. RoboEarth — a Cloud Robotics infrastructure, which includes everything needed to close the loop from robot to the cloud and back to the robot. RoboEarth’s World-Wide-Web style database stores knowledge generated by humans – and robots – in a machine-readable format. Data stored in the RoboEarth knowledge base include software components, maps for navigation (e.g., object locations, world models), task knowledge (e.g., action recipes, manipulation strategies), and object recognition models (e.g., images, object models).
  4. Mother — domestic sensors and an app with an appallingly presumptuous name. (Also, wasn’t “Mother” the name of the ship computer in Alien?) (via BoingBoing)
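
The floating-point warning in item 1 takes only a few lines of Python to demonstrate: binary floating point cannot represent 0.1 exactly, so naive decimal arithmetic and exact equality tests misbehave.

```python
print(0.1 + 0.2)               # 0.30000000000000004
print(0.1 + 0.2 == 0.3)        # False

# Compare with a tolerance instead of exact equality.
import math
print(math.isclose(0.1 + 0.2, 0.3))        # True

# Or use decimal arithmetic when exact decimal behavior matters.
from decimal import Decimal
print(Decimal("0.1") + Decimal("0.2"))     # 0.3
```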

December 18 2013

Democratizing technology and the road to empowerment

Advancements in technology are making what once was relegated only to highly educated scientists, engineers and developers accessible to — and affordable for — the mainstream. This democratization of technology and the empowerment it affords was an underlying thread through many of the stories at this year’s Business Innovation Factory (BIF) summit. From allowing hobbyists and makers to innovate and develop on an advanced level to enabling individuals to take control of their personal health data to using space suits to help children with cerebral palsy, technological advancements are beginning to empower — and enrich — at scale.

With the rise of quantified self, for example, people have begun amassing personal data based on their activities and behaviors. Some argue that QS doesn’t go quite far enough and that a more complete story can be told by incorporating emotional data, our sense of experience. While it’s empowering in many ways to be able to collect and control all this personal big data, what to do with this onslaught of information and how to process it remains a question for many.

Alexander Tsiaras, who founded theVisualMD, argued in his talk at BIF9 that “story gives a soul to the data,” and that it’s time to change the paradigm, to start using technology to create ecosystems to empower people to understand what’s going on inside their bodies as a result of their behaviors.

Using visualization and interactive media, personal big data — medical records, test results, lab reports, diagnoses, and exercise and eating habits, for instance — are deconstructed, as Tsiaras explained, to “demystify” the data: “The beauty of visualization is that it speaks to everyone,” he said. From stories to explain test results to stories to help patients visualize the processes going on inside their bodies when they eat particular foods or when they exercise, people are able to turn their personal big data into stories, whether to better understand a chronic condition or to understand how their behaviors play into prevention. “This is the most important thing,” Tsiaras argued, “the moment you take control, that empowerment is huge.”

Arguably, one of the most democratizing and empowering of technological innovations is 3D printing — innovators can now manufacture the products they conceive, even at scale. BIF storyteller Ping Fu emphasized the potential of the technology through a powerful personal story of how her past experiences led her to computer science — a breakthrough, she explained, that changed her life personally and professionally, leading her to co-found 3D printing and design company Geomagic. Fu defined innovation as “imagination applied” and shared examples of innovations in 3D printing, including the Smithsonian’s plan to scan and print artifacts from its collection (which can now be achieved by individuals at home), custom prosthetics designed as mirror images of actual limbs, and the digital preservation of UNESCO World Heritage sites. Fu stressed that the technology should not be viewed as a platform for printing tchotchkes, that real-world, useful products are being produced.

This argument was further supported by storyteller Easton LaChapelle, a 17-year-old high school student who has used 3D printing technology in coordination with advancements in (and some creativity with) engineering materials to create a robotic hand that’s wirelessly controlled by a robotic glove — complete with haptic feedback — and a 3D-printed brain-wave-controlled robotic arm.
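
The glove-to-hand link LaChapelle describes has a simple general shape: read flex-sensor values from the glove, map each to a finger servo angle, and ship the angles over a wireless link. The sketch below illustrates that shape only; the sensor ranges, packet format, and function names are invented and are not LaChapelle’s firmware.

```python
# A rough sketch of mapping glove flex-sensor readings to finger servo
# angles and sending them to the hand over a wireless serial link.
# All names and value ranges are invented for illustration.

def flex_to_angle(raw, raw_min=200, raw_max=800, angle_min=0.0, angle_max=180.0):
    """Linearly map a raw flex-sensor reading to a servo angle in degrees."""
    raw = min(max(raw, raw_min), raw_max)             # clamp noisy readings
    span = (raw - raw_min) / (raw_max - raw_min)
    return angle_min + span * (angle_max - angle_min)


def send_to_hand(radio, angles):
    # Stand-in for an XBee/Bluetooth packet carrying one byte per finger.
    radio.write(bytes(int(a) for a in angles))


def glove_step(glove, radio):
    raw_readings = glove.read_flex_sensors()          # five values, one per finger
    angles = [flex_to_angle(r) for r in raw_readings]
    send_to_hand(radio, angles)
```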

Affordable access to 3D printing, LaChapelle said, was key to his ability to move forward with his designs, and he noted that 3D printing is a driving force for innovation: “I can design something in my room and hit print, and within an hour, it’s in front of me; that alone is really fascinating, that you’re able to design something and have it physically in front of you; it’s remarkable in today’s world — it’s a whole evolving technology.”

Advancements in technology aren’t only empowering LaChapelle to create innovative designs in robotics; they’re empowering him to help humanity, and in turn, empowering humanity. He explained during his story:

“When I was at the science fair in Colorado, I had the first generation of the arm there for a public viewing, and a 7-year-old girl came up to me. She had a prosthetic limb from the elbow to the fingertip, with one motion — open-close — and one sensor. That alone was $80,000. That was really the moment that touched my heart, the “ah-ha” moment, that I could take what I’m already doing, transfer it directly to prosthetics, and potentially make people’s lives better.”

The final iteration of the arm is completely 3D printed, making it lightweight — from fingertip to shoulder, it’s in line to weigh less than five pounds — and cost about $400 to produce. “The third generation arm is the final [iteration of the arm] and the point where it can change people’s lives,” said LaChapelle. He’s currently working on developing exoskeleton legs for a friend who was paralyzed in a car accident, technology that he’ll make available to anyone with paralysis, MS or any other condition that impairs movement: “I want to solve this. My approach is to give them something they can afford and something they can use easily.”

You can watch LaChapelle’s inspiring talk in the following video:

In a similar vein, storyteller Dava Newman, professor of Aeronautics and Astronautics and Engineering Systems at MIT, explained how her team uses advances in materials technology to develop exoskeleton space suits to address particular issues astronauts experience in space, such as combating bone density loss, and increasing mobility and flexibility (see the gravity loading countermeasure suit, which is also used to help children with cerebral palsy perform daily activities).

Suits are also designed with high-precision EGaIn sensors to help provide muscular protection and to measure hot spots and pressure while astronauts are training, in hopes of preventing shoulder injuries, for instance. The ultimate goal is developing the BioSuit, which uses electrospun materials, dielectric elastomers and shape memory alloys to provide a skin-tight, pressurized but flexible suit environment for astronauts. Newman stressed that the most important part of our work as scientists, designers, researchers, artists, mathematicians, etc., is to take care of one another:

“…200 miles, 400 kilometers — Boston to New York — that’s where we’re living in space now; it’s low Earth orbit, and it’s fantastic and it’s great, but it’s been 40 years since we’ve been to another planetary body. I think, with all my great students and all the great dreamers in the world, we’ll get to the Moon, we’ll get to Mars, and we’re going for the search of life. It’ll be humans and rovers and robots all working together. The scientific benefit will be great, but the most important thing is that we learn about ourselves — it’s in the reflection and thinking about who we are and who humanity is.”

As technology advances and empowers more and more people to create, build and produce more and more innovative products and experiences, a discussion has begun as to the responsibility we have as engineers, scientists, designers, etc., to consider the implications and ramifications of our work, to use our designs for the benefit of humanity, and to further social progress in a positive direction. It’s clear through these and other storytellers from BIF9 that doing so results in a more enriching, empowering experience for everyone.

Robots will remain forever in the future

(Note: this post first appeared on Forbes; this lightly edited version is re-posted here with permission.)

We’ve watched the rising interest in robotics for the past few years. It may have started with the birth of FIRST Robotics competitions, continued with iRobot and the Roomba, and more recently with Google’s driverless cars. But in the last few weeks, there has been a big change. Suddenly, everybody’s talking about robots and robotics.

It might have been Jeff Bezos’ remark about using autonomous drones to deliver products by air. It’s a cool idea, though I think it’s far-fetched. And drone delivery isn’t Amazon’s first venture into robotics: a year and a half ago, they bought Kiva Systems, which builds robots that Amazon uses in their massive warehouses. (Personally, I think package delivery by drone is unlikely for many, many reasons, but that’s another story, and certainly no reason for Amazon not to play with delivery in their labs.)

But what really lit the fire was Google’s acquisition of Boston Dynamics, a DARPA contractor that makes some of the most impressive mobile robots anywhere. It’s hard to watch their videos without falling in love with what their robots can do. Or becoming very scared. Or both. And, of course, Boston Dynamics isn’t a one-time buy. It’s the most recent in a series of eight robotics acquisitions, and I’d bet that it’s not the last in the series.

Google is clearly doing something big, but what? Unlike Jeff Bezos, Larry Page and Sergey Brin haven’t started talking about delivering packages with drones or anything like that. Neither has Andy Rubin, who is running the new robotics division. The NSA probably knows, to Google’s chagrin, but we won’t know until they’re ready to tell us. Google has launched a number of insanely ambitious “moon shot” projects recently; I suspect this is another.

Whatever is coming from Google, we’ll certainly see even greater integration of robots into everyday life. Those robots will quickly become so much a part of our lives that we’ll cease to think of them as robots; they’ll just be the things we live with. At O’Reilly’s Foo Camp in 2012, Jason Huggins, creator of this Angry Birds-playing bot, remarked that robots are always part of the future. Little bits of that future break off and become part of the present, but when that happens, those bits cease to be “robots.” In 1945, a modern dishwasher would have been a miracle, as exotic as the space-age appliances in The Jetsons. But now, it’s just a dishwasher, and we’re trying to think of ways to make it more intelligent and network-enabled. Dishwashers, refrigerators, vacuums, stoves: is the mythical Internet-enabled refrigerator that orders milk when you’re running low a robot? What about a voice-controlled baking machine, where you walk up and tell it what kind of bread you want? Will we think of these as robots?

I doubt it. Much has been made of Google’s autonomous vehicles. Impressive as they are, autonomous robots are nowhere near as interesting as assistive robots, robots that assist humans in some difficult task. Driving around town is one thing, but BMW already has automatic parallel parking. But do we call these “robotic cars”? What about anti-lock brakes and other forms of computer-assisted driving that have been around for years? A modern airliner essentially flies itself from one airport to another, but do we see a Boeing 777 as a “robot”? We prefer not to, perhaps because we cherish the illusion that a human pilot is doing the flying. Robots are everywhere already; we’ve just trained ourselves not to see them.

We can get some more ideas about what the future holds by thinking about some of Google’s statements in other contexts. Last April, Slate reported that Google was obsessed with building the Star Trek computer. It’s a wonderful article that contains real insight into the way Google thinks about technology. Search is all about context: it’s not about the two or three words you type into the browser; it’s about understanding what you’re looking for, understanding your language rather than an arcane query language. The Star Trek computer does that; it anticipates what Kirk wants and answers his questions, even if they’re ill-formed or ambiguous. Let’s assume that Google can build that kind of search engine. Once you have the Star Trek computer doing your searches, the next step is obvious: don’t just do a search; get me the stuff I want. Find me my keys. Put the groceries away. Water the plants. I don’t think robotic helpers like these are as far off as they seem; most of the technologies we need to build them already exist. And while it may take a supercomputer to recognize that a carton of eggs is a carton of eggs, that supercomputer is only an Internet connection away.
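
The “supercomputer is only an Internet connection away” point is architecturally simple: capture a frame locally, defer recognition to a remote service, and act on the answer. A hedged sketch follows; the endpoint URL and response format are hypothetical, and it assumes the third-party requests package.

```python
# Offloading object recognition to a cloud service. The URL and the
# response schema below are hypothetical placeholders.

import requests

RECOGNITION_URL = "https://recognition.example.com/classify"   # hypothetical


def identify_object(image_bytes):
    """Send a camera frame to a remote model and return its best label."""
    response = requests.post(
        RECOGNITION_URL,
        files={"image": ("frame.jpg", image_bytes, "image/jpeg")},
        timeout=5.0,
    )
    response.raise_for_status()
    return response.json()["label"]        # e.g. "carton of eggs"


def put_away(item_label):
    # Dispatch to whatever manipulation routine handles that object class.
    print(f"putting away: {item_label}")
```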

But will we recognize these devices as robots once they’ve been around for a year or two? Or will they be “the finder,” “the unpacker,” “the gardener,” while robots remain implausibly futuristic? The latter, I think. Garden-variety voice search, whether it’s Google’s on Android or Apple’s Siri, is an amazing application of artificial intelligence, but these days, it’s just something that phones do.

I have no doubt that Google’s robotics team is working on something amazing and mind-blowing. Should they succeed, and should that success become a product, though, whatever they do will almost certainly fade into the woodwork and become part of normal, everyday reality. And robots will remain forever in the future. We might have found Rosie, the Jetsons’ robotic maid, impressive. But the Jetsons didn’t.

November 27 2013

The Amazon whisperer, invisible interfaces, FDA vs 23andMe, and robots usher in a new political order

The Radar team does a lot of sharing in the backchannel. Here’s a look at a selection of stories and innovative people and companies from around the web that have caught our recent attention. Have an interesting tidbit to contribute to the conversation? Send me an email or ping me on Twitter.

  • The edges of connected realities — Steve Mason’s TEDxSF talk, in which he discusses the evolution of connected environments and quotes Yves Behar: “The interface of the future is invisible.” (Jenn Webb, via Jim Stogdill, via Rachel Kalmar) Mason’s talk is a must-watch, so I’ll just provide direct access:
  • Why the FDA is targeting Google-backed 23andMe: Unnecessary MRIs, mastectomies — Christina Farr wrote: “One San Francisco-based neurologist, who asked to remain anonymous, told me that some of her healthiest patients — all 23andMe customers — have begun demanding unnecessary and expensive MRI tests for Alzheimer’s disease. ’23andMe’s test is creating chaos with people in their 20s and 30s,’ she said. ‘They generate havoc and walk away.’” (Via Jim Stogdill)
  • A Letter I Will Probably Send to the FDA — From the other side of the FDA vs 23andMe spat, Dr. Scott Alexander wrote: “…23andMe has raised awareness of genetics among the general population and given them questions and concerns, usually appropriate, which they can discuss with their doctor. Their doctor can then follow up on these concerns. Such followup may involve reassurance, confirmation with other genetic testing, confirmation through other diagnostic modalities, or referral to another professional such as a genetic counselor. In my experience personal genomic results do not unilaterally determine a course of treatment, but may influence an ambiguous clinical picture in one direction or the other, or be a useful factor when deciding between otherwise equipotent medications. Banning the entire field of personal genomics in one fell swoop would eliminate a useful diagnostic tool from everyone except a few very wealthy patients.” (Via Mike Loukides)
  • Chaim Pikarski, the Amazon whisperer — Jason Feifer wrote: “[Pikarski] has an entire team of people who read reviews on Amazon, looking for moments when people say, ‘I wish this speaker were rechargeable.’ Pikarski then makes a rechargeable version. … This is the heart of C&A: Each buyer has a specialty — beach products, cellular accessories, and so on. Their job is to scour the web to learn all the features people wish a product had, and hire a manufacturer, often in China, to make the desired version.” (A toy sketch of this review-mining approach follows the list.) (Via Tim O’Reilly, via Kevin Slavin)
  • The Robots Are Here — Tyler Cowen wrote: “The rise of smart machines — technologies that encompass everything from artificial intelligence to industrial robots to the smartphones in our pockets — is changing how we live, work and play. Less acknowledged, perhaps, is what all this technological change portends: nothing short of a new political order. The productivity gains, the medical advances, the workplace reorganizations and the myriad other upheavals that will define the coming automation age will create new economic winners and losers; it will reorient our demographics; and undoubtedly, it will transform what we demand from our government.” (Via Jenn Webb)
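
The Pikarski item above describes what is essentially a text-mining loop over customer reviews. Here is the promised toy sketch of that approach; the sample reviews are invented, and a real pipeline would pull review text via scraping or an API.

```python
# Scan review text for "I wish ..." phrasing and tally the requested
# features; the most common unmet wish suggests the next product variant.

import re
from collections import Counter

WISH_PATTERN = re.compile(r"\bI wish (?:this|it) (.+?)[.!]", re.IGNORECASE)

reviews = [
    "Great sound. I wish this speaker were rechargeable.",
    "I wish it had a carrying case. Otherwise solid.",
    "I wish this speaker were rechargeable! Batteries are a pain.",
]

wish_counts = Counter()
for review in reviews:
    for wish in WISH_PATTERN.findall(review):
        wish_counts[wish.lower()] += 1

print(wish_counts.most_common(1))   # [('speaker were rechargeable', 2)]
```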


November 20 2013

IoT meets agriculture, Intellistreets, immersive opera, and high-tech pencils

The Radar team does a lot of sharing in the backchannel. Here’s a look at a selection of stories and innovation highlights from around the web that have caught our recent attention. Have an interesting tidbit to contribute to the conversation? Join the discussion in the comments section, send me an email or ping me on Twitter.

  • Pencil — FiftyThree’s new Pencil tool is a super cool gadget that exhibits interesting technology from a software meets hardware perspective. (Via Mike Loukides)
  • littleBits Synth Kit — littleBits and Korg have teamed up to bring DIY to music with a modular synthesizer kit. From John Paul Titlow on Fast Company: “Hobbyists have been doing this for many years, either through prepackaged kits or off-the-shelf components. The difference here is that no soldering or wiring is required. Each circuit piece’s magnetized and color-coded end makes them effortlessly easy to snap together and pull apart … Notably, the components of the kit are compatible with other littleBits modules, so it’s possible to build a synthesizer that integrates with other types of sensors, lights, and whatever else littleBits cooks up down the line.” (Via Jim Stogdill)
  • Why Curated Experiences Are The New Future Of Marketing — From immersive opera to extreme kidnapping to a bungee-jumping car, companies are learning that the key to their customers’ attention (and wallets) lies in the experience. From Krisztina “Z” Holly on Forbes: “There is something about a moment in time that can’t be replicated, an experience that is your very own, an adventure with others that is deeply personal and memorable. It is something that can’t be achieved by a high-budget celebrity endorsement or a large ad buy.” (Via Sara Winge)
  • The Internet of things, stalk by stalk — IoT meets agriculture with Air Tractor planes that can change fertilizer mixtures in-flight and the Rosphere farm robot that monitors crops. From Paul Rako on GigaOm: “Sensors on the robot could monitor each and every stalk of corn. Those robots can communicate with each other over a mesh network. A mesh network is like a chat room for gadgets. They identify themselves and their capabilities, and are then a shared resource.” (A minimal sketch of that announce-and-discover idea follows this list.) (Via Jenn Webb)
  • What happens in Vegas DOESN’T stay in Vegas — Vegas’ new wireless street light system, Intellistreets, can record video and audio. From the Daily Mail: “The wireless, LED lighting, computer-operated lights are not only capable of illuminating streets, they can also play music, interact with pedestrians and are equipped with video screens, which can display police alerts, weather alerts and traffic information. The high tech lights can also stream live video of activity in the surrounding area. … These new street lights, being rolled out with the aid of government funding, are also capable of recording video and audio.” (Via Tim O’Reilly)
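
The “chat room for gadgets” analogy in the agriculture item maps onto a simple announce-and-discover registry, sketched below. Real mesh stacks handle radio links, routing, and node failure in ways this toy ignores, and the node names and capabilities are invented.

```python
# A toy registry in the spirit of "gadgets announce themselves and their
# capabilities, then become a shared resource". Not a real mesh protocol.

class MeshRegistry:
    def __init__(self):
        self.nodes = {}

    def announce(self, node_id, capabilities):
        """A node joins the mesh and advertises what it can do."""
        self.nodes[node_id] = set(capabilities)

    def find(self, capability):
        """Return the node ids that offer a given capability."""
        return [n for n, caps in self.nodes.items() if capability in caps]


mesh = MeshRegistry()
mesh.announce("rosphere-1", {"soil-moisture", "camera"})
mesh.announce("air-tractor-1", {"fertilizer-mix", "gps"})
mesh.announce("base-station", {"uplink"})

print(mesh.find("soil-moisture"))   # ['rosphere-1']
```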

Additionally, at the recent Techonomy conference, Tim O’Reilly and Max Levchin discussed “Innovation and the Coming Shape of Social Transformation” — have a look:

November 05 2012

Four short links: 5 November 2012

  1. The Psychology of Everything (YouTube) — illustrating some of the most fundamental elements of human nature through case studies about compassion, racism, and sex. (via Mind Hacks)
  2. Reports of Exempt Organizations (Public Resource) — This service provides bulk access to 6,461,326 filings of exempt organizations to the Internal Revenue Service. Each month, we process DVDs from the IRS for Private Foundations (Type PF), Exempt Organizations (Type EO), and filings by both of those kinds of organizations detailing unrelated business income (Type T). The IRS should be making this publicly available on the Internet, but instead it has fallen to Carl Malamud to make it happen. (via BoingBoing)
  3. Chris Anderson Leaves for Drone Co (Venturebeat) — Editor-in-chief of Wired leaves to run his UAV/robotics company 3D Robotics.
  4. pysqli (GitHub) — Python SQL injection framework; it provides dedicated bricks that can be used to build advanced exploits or easily extended/improved to fit the case.

October 16 2012

Industrial Internet links

Here’s a broad look at a few recent items of interest related to the industrial Internet — the world of smart, connected, big machines.

Smarter Robots, With No Wage Demands (Bloomberg Businessweek) — By building more intelligence into robots, Rethink Robotics figures it can get them into jobs where work has historically been too irregular or too small-scale for automation. That could mean more manufacturing stays in American factories, though perhaps with fewer workers.

The Great Railway Caper (O’Reilly Strata EU) — Today’s railroads rely heavily on the industrial Internet to optimize locomotive operations and maintain their very valuable physical plant. Some of them were pioneers in big networked machines. Part of Sprint originated as the Southern Pacific Railroad Network of Intelligent Telecommunications, which used the SP’s rights-of-way to transmit microwave and fiber optic signals. But in the 1950s, computing in railways was primitive (as it was just about everywhere else, too). John Graham-Cumming relayed this engaging story of network optimization in 1955 at our Strata Conference in London two weeks ago.

The Quiet Comfort of the Internet of Things (O’Reilly Strata EU) — Alexandra Deschamps-Sonsino presents a quirky counterpoint to the Internet of big things; what you might call the Internet of very small things. She leads Designswarm and founded Good Night Lamp, which produces a family of Internet-connected lamps that let friends and family communicate domestic milestones, like bedtime. Her Strata keynote explains a bit of her work and approach.

Solar panel control systems vulnerable to hacks, feds warn (Ars Technica) — A good reminder that the industrial Internet can be vulnerable to the same sorts of attacks as the rest of the Internet if it’s not built out properly — in this case, run-of-the-mill SQL injection.

The industrial Internet series is produced as part of a collaboration between O’Reilly and GE.


March 02 2012

Four short links: 2 March 2012

  1. Interview: Hanno Sander on Robotics (Circuit Cellar) -- this is what Mindstorms wants to be when it grows up. AAA++ for teaching kids. Hanno is a Kiwi Foo Camper.
  2. Context Needed: Benchmarks -- Benchmarks fall into a few common traps because of under-reporting in context and lack of detail in results. The typical benchmark report doesn't reveal the benchmark's goal, full details of the hardware and software used, how the results were edited if at all, how to reproduce the results, detailed reporting on the system's performance during the test, and an interpretation and explanation of the results. (via Jesse Robbins)
  3. Morris.js (GitHub) -- a lightweight library that uses jQuery and Raphaël to make drawing time-series graphs easy.
  4. Bret Victor: Inventing on Principle (Vimeo) -- the first 20m has amazing demos of a coding environment with realtime feedback. Must see this! (via Sacha Judd)

November 04 2011

Four short links: 4 November 2011

  1. Beethoven's Open Repository of Research (RocketHub) -- open repository funded in a Kickstarter-type way. First crowdfunding project I've given $$$ to.
  2. KeepOff (GitHub) -- open source project built around hacking KeepOn Interactive Dancing Robots. (via Chris Spurgeon)
  3. Steve Jobs One-on-One (ComputerWorld) -- interesting glimpse of the man himself in an oral history project recording made during the NeXT years. I don't need a computer to get a kid interested in that, to spend a week playing with gravity and trying to understand that and come up with reasons why. But you do need a person. You need a person. Especially with computers the way they are now. Computers are very reactive but they're not proactive; they are not agents, if you will. They are very reactive. What children need is something more proactive. They need a guide. They don't need an assistant.
  4. Bluetooth Violin Bow -- this is awesome in so many directions. Sensors EVERYWHERE! I wonder what hackable uses it has ...


August 22 2011

Four short links: 22 August 2011

  1. Cities in Fact and Fiction: An Interview with William Gibson (Scientific American) -- Paris, as much as I love Paris, feels to me as though it's long since been "cooked." Its brand consists of what it is, and that can be embellished but not changed. A lack of availability of inexpensive shop-rentals is one very easily read warning sign of overcooking. I wish Manhattan condo towers could be required to have street frontage consisting of capsule micro-shops. The affordable retail slots would guarantee the rich folks upstairs interesting things to buy, interesting services, interesting food and drink, and constant market-driven turnover of same, while keeping the streetscape vital and allowing the city to do so many of the things cities do best. London, after the Olympic redo, will have fewer affordable retail slots, I imagine. (via Keith Bolland)
  2. Bootstrap -- HTML toolkit from Twitter, includes base CSS and HTML for typography, forms, buttons, tables, grids, navigation, and more. Open sourced (Apache v2 license).
  3. Extra Headers for Browser Security -- I hadn't realized there were all these new headers to avoid XSS and other attacks. Can you recommend a good introduction to these new headers? (via Nelson Minar)
  4. Swarmanoid -- award-winning robotics demo of heterogeneous, dynamically connected, small autonomous robots that provide services to each other to accomplish a larger goal. (via Mike Yalden)

May 06 2011

2 makers, 2 robots, 2 visions

If you haven't been to a Maker Faire, it can be hard to describe the vast and diverse array of exhibitors and events, and no set of interviews can do the event itself justice. But recent conversations with makers Scott Bergquist and Ian Bernstein do offer a decent representation of what Maker Faire is fundamentally about.

What follows are the stories of two very different robots, both of which will be in attendance (in some form) at the upcoming Maker Faire Bay Area.

The errand bot

Bergquist will be showing the Driverless Errand Car, a design concept for an autonomous vehicle system that delivers goods and runs errands. Bergquist admits that his idea is well ahead of the technologies required, but he wants people to start thinking about how society could benefit if fuel-efficient robotic vehicles were running errands without requiring human interaction. As he likes to put it, "Why drive a 4,000-pound vehicle to get a quarter-pounder at McDonalds?"

In Bergquist's vision, these robots will do all the scut-work errands, like getting groceries or dropping a tax form off at the post office. Clearly, this would require massive infrastructure changes, such as having food distribution centers that could do order fulfillment when a 'bot arrived. But Bergquist believes it could be done economically, because the 'bots could travel longer distances to get their orders (since time isn't a factor), which in turn would allow for centralized warehousing.

"Within my area, I can think of four Safeway stores that are within a 10-mile radius," Bergquist said. "Instead, there could be just one store, which would be inconvenient to a human shopper because everybody would have to, from all directions, come over to that one store. But for a humanless, driverless device, it wouldn't mind waiting in line to get its order. If it took it 40 minutes of waiting in line, it really wouldn't make any difference to you. It wouldn't make any difference to you if it made five trips a week instead of once a week. It wouldn't feel put out by having to stop at one distribution center to pick up canned goods and then driving further to another distribution center to pick up fresh vegetables, and then driving to another facility to pick up bread."

Another unique feature of Bergquist's design is that the vehicles run on spring power. They would be wound up before leaving the house, and the scheme would work because the 'bots would be so much lighter than a conventional vehicle.

"My [proposed] car is rather small," Bergquist said. "It's a little under four-feet long and a little under two-feet wide. It's about a meter tall. Without going into a lot of detail, for such a light vehicle, for such a limited amount of optional range, a spring-powered car could be a real no-fuel vehicle that would be extremely cheap because you wouldn't have an electric motor; you wouldn't have an electric battery."

Maker Faire Bay Area will be held May 21-22 in San Mateo, Calif. Event details, exhibitor profiles, and ticket information can be found at the Maker Faire site.

A basic ball with complicated guts

Ian Bernstein and his business partners will be bringing Sphero to Maker Faire. The robot, which is about the size of a baseball, rolls around, has LEDs that can turn the 'bot thousands of different colors, is loaded with sensors, and can be controlled and programmed from iPhones, Androids and PCs.

Bernstein has no grand society-changing plans for Sphero; he just sees it as a cool platform for new applications. "We've had a lot of people come to us, and some of the ideas are educational," Bernstein said. "Training centers for college or companies and also K-12 — they thought this would be kind of a cool way to teach programming. Like Arduino, you can see other projects people have made."

Although a robotic sphere may seem simple, Bernstein says that it's fiendishly hard to develop the underlying control software. "It's incredibly complicated to build a ball that moves in this fashion," he said. "We're literally using missile-guidance-type control systems in there. We've consulted with people that do NASA projects. What people see is just a simple ball. So it's minimalistic, but it's complicated on the engineering side." (Here's a video of Sphero in action.)

Bernstein and his associates plan to have Sphero available for sale by the end of the year, while Bergquist's errand-running cars are more science fiction than product at the moment. You'll be able to meet these bots' creators, and get a sense for how the robots work, at this month's Maker Faire.
