
Your Toaster on the Internet


Billions of people have access to the Internet. Whether a significant proportion of them do anything productive with this tremendous resource is open to debate; many prefer simply to post pictures of their breakfasts or of themselves, or to watch the latest viral video hit.

Despite all these humans clogging up the Tubes of the Internets, most traffic along the information superhighway is in fact not even human. Over 60 percent of all activity comes from computer systems, such as web crawlers, botnets, and, increasingly, industrial control systems, ranging from security and monitoring devices to in-home devices such as your thermostat, refrigerator, smart TV, smart toilet and toaster. So, soon Google will know what you eat and when, and your fridge will tell you what you should eat (or not) based on what it knows of your body mass index (BMI) from your bathroom scales.

Jokes aside, the Internet of Things (IoT) promises to herald an even more significant information revolution over the coming decades as all our devices and machines, from home to farm to factory, are connected and inter-connected.

From Ars Technica:

If you believe what the likes of LG and Samsung have been promoting this week at CES, everything will soon be smart. We’ll be able to send messages to our washing machines, run apps on our fridges, and have TVs as powerful as computers. It may be too late to resist this movement, with smart TVs already firmly entrenched in the mid-to-high end market, but resist it we should. That’s because the “Internet of things” stands a really good chance of turning into the “Internet of unmaintained, insecure, and dangerously hackable things.”

These devices will inevitably be abandoned by their manufacturers, and the result will be lots of “smart” functionality—fridges that know what we buy and when, TVs that know what shows we watch—all connected to the Internet 24/7, all completely insecure.

While the value of smart watches or washing machines isn’t entirely clear, at least some smart devices—I think most notably phones and TVs—make sense. The utility of the smartphone, an Internet-connected computer that fits in your pocket, is obvious. The growth of streaming media services means that your antenna or cable box is no longer the sole source of televisual programming, so TVs that can directly use these streaming services similarly have some appeal.

But these smart features make the devices substantially more complex. Your smart TV is not really a TV so much as an all-in-one computer that runs Android, WebOS, or some custom operating system of the manufacturer’s invention. And where once it was purely a device for receiving data over a coax cable, it’s now equipped with bidirectional networking interfaces, exposing the Internet to the TV and the TV to the Internet.

The result is a whole lot of exposure to security problems. Even if we assume that these devices ship with no known flaws—a questionable assumption in and of itself if SOHO routers are anything to judge by—a few months or years down the line, that will no longer be the case. Flaws and insecurities will be uncovered, and the software components of these smart devices will need to be updated to address those problems. They’ll need these updates for the lifetime of the device, too. Old software is routinely vulnerable to newly discovered flaws, so there’s no point in any reasonable timeframe at which it’s OK to stop updating the software.

In addition to security, there’s also a question of utility. Netflix and Hulu may be hot today, but that may not be the case in five years’ time. New services will arrive; old ones will die out. Even if the service lineup remains the same, its underlying technology is unlikely to be static. In the future, Netflix, for example, might want to deprecate old APIs and replace them with new ones; Netflix apps will need to be updated to accommodate the changes. I can envision changes such as replacing the H.264 codec with H.265 (for reduced bandwidth and/or improved picture quality), which would similarly require updated software.

To remain useful, app platforms need up-to-date apps. As such, for your smart device to remain safe, secure, and valuable, it needs a lifetime of software fixes and updates.

A history of non-existent updates

Herein lies the problem, because if there’s one thing that companies like Samsung have demonstrated in the past, it’s a total unwillingness to provide a lifetime of software fixes and updates. Even smartphones, which are generally assumed to have a two-year lifecycle (with replacements driven by cheap or “free” contract-subsidized pricing), rarely receive updates for the full two years (Apple’s iPhone being the one notable exception).

A typical smartphone bought today will remain useful and usable for at least three years, but its system software support will tend to dry up after just 18 months.

This isn’t surprising, of course. Samsung doesn’t make any money from making your two-year-old phone better. Samsung makes its money when you buy a new Samsung phone. Improving the old phones with software updates would cost money, and that tends to limit sales of new phones. For Samsung, it’s lose-lose.

Our fridges, cars, and TVs are not even on a two-year replacement cycle. Even if you do replace your TV after it’s a couple years old, you probably won’t throw the old one away. It will just migrate from the living room to the master bedroom, and then from the master bedroom to the kids’ room. Likewise, it’s rare that a three-year-old car is simply consigned to the scrap heap. It’s given away or sold off for a second, third, or fourth “life” as someone else’s primary vehicle. Your fridge and washing machine will probably be kept until they blow up or you move houses.

These are all durable goods, kept for the long term without any equivalent to the smartphone carrier subsidy to promote premature replacement. If they’re going to be smart, software-powered devices, they’re going to need software lifecycles that are appropriate to their longevity.

That costs money, it requires a commitment to providing support, and it does little or nothing to promote sales of the latest and greatest devices. In the software world, there are companies that provide this level of support—the Microsofts and IBMs of the world—but it tends to be restricted to companies that have at least one eye on the enterprise market. In the consumer space, you’re doing well if you’re getting updates and support five years down the line. Consumer software fixes a decade later are rare, especially if there’s no system of subscriptions or other recurring payments to monetize the updates.

Of course, the companies building all these products have the perfect solution. Just replace all our stuff every 18-24 months. Fridge no longer getting updated? Not a problem. Just chuck out the still perfectly good fridge you have and buy a new one. This is, after all, the model that they already depend on for smartphones. Of course, it’s not really appropriate even to smartphones (a mid/high-end phone bought today will be just fine in three years), much less to stuff that will work well for 10 years.

These devices will be abandoned by their manufacturers, and it’s inevitable that they will be abandoned long before they cease to be useful.

Superficially, this might seem to be no big deal. Sure, your TV might be insecure, but your NAT router will probably provide adequate protection, and while it wouldn’t be tremendously surprising to find that it has some passwords for online services or other personal information on it, TVs are sufficiently diverse that people are unlikely to expend too much effort targeting specific models.

Read the entire story here.

Image: A classically styled chrome two-slot automatic electric toaster. Courtesy of Wikipedia.

Technology and Employment

Technology is altering all of our lives. Often it is a positive influence, offering its users tremendous benefits, from time-saving to life-extension. However, the relationship between technology and our employment is more complex, and usually detrimental.

Many traditional forms of employment have already disappeared thanks to our technological tools, and many other jobs have changed beyond recognition, requiring new skills and knowledge. And this may be just the beginning.

From Technology Review:

Given his calm and reasoned academic demeanor, it is easy to miss just how provocative Erik Brynjolfsson’s contention really is. Brynjolfsson, a professor at the MIT Sloan School of Management, and his collaborator and coauthor Andrew McAfee have been arguing for the last year and a half that impressive advances in computer technology—from improved industrial robotics to automated translation services—are largely behind the sluggish employment growth of the last 10 to 15 years. Even more ominous for workers, the MIT academics foresee dismal prospects for many types of jobs as these powerful new technologies are increasingly adopted not only in manufacturing, clerical, and retail work but in professions such as law, financial services, education, and medicine.

That robots, automation, and software can replace people might seem obvious to anyone who’s worked in automotive manufacturing or as a travel agent. But Brynjolfsson and McAfee’s claim is more troubling and controversial. They believe that rapid technological change has been destroying jobs faster than it is creating them, contributing to the stagnation of median income and the growth of inequality in the United States. And, they suspect, something similar is happening in other technologically advanced countries.

Perhaps the most damning piece of evidence, according to Brynjolfsson, is a chart that only an economist could love. In economics, productivity—the amount of economic value created for a given unit of input, such as an hour of labor—is a crucial indicator of growth and wealth creation. It is a measure of progress. On the chart Brynjolfsson likes to show, separate lines represent productivity and total employment in the United States. For years after World War II, the two lines closely tracked each other, with increases in jobs corresponding to increases in productivity. The pattern is clear: as businesses generated more value from their workers, the country as a whole became richer, which fueled more economic activity and created even more jobs. Then, beginning in 2000, the lines diverge; productivity continues to rise robustly, but employment suddenly wilts. By 2011, a significant gap appears between the two lines, showing economic growth with no parallel increase in job creation. Brynjolfsson and McAfee call it the “great decoupling.” And Brynjolfsson says he is confident that technology is behind both the healthy growth in productivity and the weak growth in jobs.

It’s a startling assertion because it threatens the faith that many economists place in technological progress. Brynjolfsson and McAfee still believe that technology boosts productivity and makes societies wealthier, but they think that it can also have a dark side: technological progress is eliminating the need for many types of jobs and leaving the typical worker worse off than before. Brynjolfsson can point to a second chart indicating that median income is failing to rise even as the gross domestic product soars. “It’s the great paradox of our era,” he says. “Productivity is at record levels, innovation has never been faster, and yet at the same time, we have a falling median income and we have fewer jobs. People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.”

Brynjolfsson and McAfee are not Luddites. Indeed, they are sometimes accused of being too optimistic about the extent and speed of recent digital advances. Brynjolfsson says they began writing Race Against the Machine, the 2011 book in which they laid out much of their argument, because they wanted to explain the economic benefits of these new technologies (Brynjolfsson spent much of the 1990s sniffing out evidence that information technology was boosting rates of productivity). But it became clear to them that the same technologies making many jobs safer, easier, and more productive were also reducing the demand for many types of human workers.

Anecdotal evidence that digital technologies threaten jobs is, of course, everywhere. Robots and advanced automation have been common in many types of manufacturing for decades. In the United States and China, the world’s manufacturing powerhouses, fewer people work in manufacturing today than in 1997, thanks at least in part to automation. Modern automotive plants, many of which were transformed by industrial robotics in the 1980s, routinely use machines that autonomously weld and paint body parts—tasks that were once handled by humans. Most recently, industrial robots like Rethink Robotics’ Baxter (see “The Blue-Collar Robot,” May/June 2013), more flexible and far cheaper than their predecessors, have been introduced to perform simple jobs for small manufacturers in a variety of sectors. The website of a Silicon Valley startup called Industrial Perception features a video of the robot it has designed for use in warehouses picking up and throwing boxes like a bored elephant. And such sensations as Google’s driverless car suggest what automation might be able to accomplish someday soon.

A less dramatic change, but one with a potentially far larger impact on employment, is taking place in clerical work and professional services. Technologies like the Web, artificial intelligence, big data, and improved analytics—all made possible by the ever increasing availability of cheap computing power and storage capacity—are automating many routine tasks. Countless traditional white-collar jobs, such as many in the post office and in customer service, have disappeared. W. Brian Arthur, a visiting researcher at the Xerox Palo Alto Research Center’s intelligence systems lab and a former economics professor at Stanford University, calls it the “autonomous economy.” It’s far more subtle than the idea of robots and automation doing human jobs, he says: it involves “digital processes talking to other digital processes and creating new processes,” enabling us to do many things with fewer people and making yet other human jobs obsolete.

It is this onslaught of digital processes, says Arthur, that primarily explains how productivity has grown without a significant increase in human labor. And, he says, “digital versions of human intelligence” are increasingly replacing even those jobs once thought to require people. “It will change every profession in ways we have barely seen yet,” he warns.

Read the entire article here.

Image: Industrial robots. Courtesy of Techjournal.