Tag Archives: cloud computing

Father of Distributed Computing

Distributed computing is a foundational element of most modern-day computing. It paved the way for processing to be shared across multiple computers and, nowadays, within the cloud. Most technology companies, including IBM, Google, Amazon, and Facebook, use distributed computing to provide highly scalable and reliable computing power for their systems and services. Yet Bill Gates did not invent distributed computing, nor did Steve Jobs. In fact, it was pioneered in the mid-1970s by an unsung hero of computer science, Leslie Lamport. Now aged 73, Lamport was recognized with this year’s Turing Award.

From Technology Review:

This year’s winner of the Turing Award—often referred to as the Nobel Prize of computing—was announced today as Leslie Lamport, a computer scientist whose research made possible the development of the large, networked computer systems that power, among other things, today’s cloud and Web services. The Association for Computing Machinery grants the award annually, with an associated prize of $250,000.

Lamport, now 73 and a researcher with Microsoft, was recognized for a series of major breakthroughs that began in the 1970s. He devised algorithms that make it possible for software to function reliably even if it is running on a collection of independent computers or components that suffer from delays in communication or sometimes fail altogether.

That work, within a field now known as distributed computing, remains crucial to the sprawling data centers used by Internet giants, and is also involved in coördinating the multiple cores of modern processors in computers and mobile devices. Lamport talked to MIT Technology Review’s Tom Simonite about why his ideas have lasted.

Why is distributed computing important?

Distribution is not something that you just do, saying “Let’s distribute things.” The question is “How do you get it to behave coherently?”

My Byzantine Generals work [on making software fault-tolerant, in 1980] came about because I went to SRI and had a contract to build a reliable prototype computer for flying airplanes for NASA. That used multiple computers that could fail, and so there you have a distributed system. Today there are computers in Palo Alto and Beijing and other places, and we want to use them together, so we build distributed systems. Computers with multiple processors inside are also distributed systems.

We no longer use computers like those you worked with in the 1970s and ’80s. Why have your distributed-computing algorithms survived?

Some areas have had enormous changes, but the aspect of things I was looking at, the fundamental notions of synchronization, are the same.

Running multiple processes on a single computer is very different from a set of different computers talking over a relatively slow network, for example. [But] when you’re trying to reason mathematically about their correctness, there’s no fundamental difference between the two systems.

I [developed] Paxos [in 1989] because people at DEC [Digital Equipment Corporation] were building a distributed file system. The Paxos algorithm is very widely used now. Look inside of Bing or Google or Amazon—where they’ve got rooms full of computers, they’ll probably be running an instance of Paxos.

More recently, you have worked on ways to improve how software is built. What’s wrong with how it’s done now?

People seem to equate programming with coding, and that’s a problem. Before you code, you should understand what you’re doing. If you don’t write down what you’re doing, you don’t know whether you understand it, and you probably don’t if the first thing you write down is code. If you’re trying to build a bridge or house without a blueprint—what we call a specification—it’s not going to be very pretty or reliable. That’s how most code is written. Every time you’ve cursed your computer, you’re cursing someone who wrote a program without thinking about it in advance.

There’s something about the culture of software that has impeded the use of specification. We have a wonderful way of describing things precisely that’s been developed over the last couple of millennia, called mathematics. I think that’s what we should be using as a way of thinking about what we build.
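
For readers curious about the Paxos algorithm Lamport mentions above, here is a minimal, illustrative sketch in Python of the single-decree roles (one proposer, several acceptors). All class and function names are hypothetical, and the sketch omits networking, persistence, learners, and retries, so treat it as a teaching aid under those assumptions rather than a faithful or production consensus implementation.

```python
# Illustrative single-decree Paxos sketch (hypothetical names; no networking,
# persistence, or failure handling). A value is chosen once a majority of
# acceptors accept a proposal carrying the same proposal number.

class Acceptor:
    def __init__(self):
        self.promised = -1          # highest proposal number promised
        self.accepted_num = -1      # highest proposal number accepted
        self.accepted_val = None    # value of that accepted proposal

    def prepare(self, n):
        """Phase 1b: promise to ignore proposals numbered below n."""
        if n > self.promised:
            self.promised = n
            return True, self.accepted_num, self.accepted_val
        return False, None, None

    def accept(self, n, value):
        """Phase 2b: accept the proposal unless a higher number was promised."""
        if n >= self.promised:
            self.promised = n
            self.accepted_num = n
            self.accepted_val = value
            return True
        return False


def propose(acceptors, n, value):
    """Phases 1a/2a: a proposer tries to get `value` chosen under number n."""
    majority = len(acceptors) // 2 + 1

    # Phase 1: collect promises; give up if a majority does not respond.
    promises = [a.prepare(n) for a in acceptors]
    granted = [(num, val) for ok, num, val in promises if ok]
    if len(granted) < majority:
        return None  # retry later with a higher proposal number

    # If any acceptor already accepted a value, that value must be proposed.
    prior_num, prior_val = max(granted, key=lambda p: p[0])
    if prior_val is not None:
        value = prior_val

    # Phase 2: ask the acceptors to accept the (possibly adopted) value.
    accepted = sum(a.accept(n, value) for a in acceptors)
    return value if accepted >= majority else None


if __name__ == "__main__":
    acceptors = [Acceptor() for _ in range(5)]
    print(propose(acceptors, n=1, value="write block to replica 3"))
```

The point of the two phases is the one Lamport alludes to: even with independent machines that can stall or fail, a majority quorum and monotonically increasing proposal numbers let the group agree on a single value.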

Read the entire story here.

Image: Leslie Lamport, 2005. Courtesy of Wikipedia.

What Did You Have for Breakfast Yesterday? Ask Google

Memory is, well, so 1990s. Who needs it when we have Google, Siri, and any number of services to answer and recall everything we’ve ever perceived, wished to remember, or wanted to know? Will our personal memories become another shared service served up from the “cloud”?

From the Wilson Quarterly:

In an age when most information is just a few keystrokes away, it’s natural to wonder: Is Google weakening our powers of memory? According to psychologists Betsy Sparrow of Columbia University, Jenny Liu of the University of Wisconsin, Madison, and Daniel M. Wegner of Harvard, the Internet has not so much diminished intelligent recall as tweaked it.

The trio’s research shows what most computer users can tell you anecdotally: When you know you have the Internet at hand, your memory relaxes. In one of their experiments, 46 Harvard undergraduates were asked to answer 32 trivia questions on computers. After each one, they took a quick Stroop test, in which they were shown words printed in different colors and then asked to name the color of each word. They took more time to name the colors of Internet-related words, such as modem and browser. According to Stroop test conventions, this is because the words were related to something else that they were already thinking about—yes, they wanted to fire up Google to answer those tricky trivia questions.

In another experiment, the authors uncovered evidence suggesting that access to computers plays a fundamental role in what people choose to commit to their God-given hard drive. Subjects were instructed to type 40 trivia-like statements into a dialog box. Half were told that the computer would erase the information and half that it would be saved. Afterward, when asked to recall the statements, the students who were told their typing would be erased remembered much more. Lacking a computer backup, they apparently committed more to memory.

Read the entire article here.

Brokering the Cloud

Computer hardware reached (or plummeted to, depending upon your viewpoint) the level of commodity a while ago. And of course, operating system platforms, software, and applications have followed suit more recently: think Platform as a Service (PaaS) and Software as a Service (SaaS). So it should come as no surprise to see new services arise that try to match supply and demand, and profit in the process. Welcome to the “cloud brokerage”.

From MIT Technology Review:

Cloud computing has already made accessing computer power more efficient. Instead of buying computers, companies can now run websites or software by leasing time at data centers run by vendors like Amazon or Microsoft. The idea behind cloud brokerages is to take the efficiency of cloud computing a step further by creating a global marketplace where computing capacity can be bought and sold at auction.

Such markets offer steeply discounted rates, and they may also offer financial benefits to companies running cloud data centers, some of which are flush with excess capacity. “The more utilized you are as a [cloud services] provider … the faster return on investment you’ll realize on your hardware,” says Reuven Cohen, founder of Enomaly, a Toronto-based firm that last February launched SpotCloud, cloud computing’s first online spot market.

On SpotCloud, computing power can be bought and sold like coffee, soybeans, or any other commodity. But it’s caveat emptor for buyers, since unlike purchasing computer time with Microsoft, buying on SpotCloud doesn’t offer many contractual guarantees. There is no assurance computers won’t suffer an outage, and sellers can even opt to conceal their identity in a blind auction, so buyers don’t always know whether they’re purchasing capacity from an established vendor or a fly-by-night startup.

Read more here.

Image courtesy of MIT Technology Review.