Amazon.com Inc. is making a big bet, but it's not on selling books, CDs or holiday gifts. Instead, it wants to sell you all the processing power you can eat. Rather than competing with your local bookstore, it's taking on the likes of IBM, Hewlett-Packard Co. and Sun Microsystems Inc.
Amazon's recently released Elastic Compute Cloud (which it calls EC2 and is still in beta) for the first time brings grid computing and utility computing to the masses -- the ability to buy server power in the same way you now buy electricity or water.
In essence, you pay 10 cents per virtual server per hour, plus bandwidth costs, and you can do whatever you want with that power. While it's not quite as simple as turning on your water tap, it's the same basic idea. You pay only for the processing power you use, and how much you use is entirely in your control.
IBM, HP and Sun already sell computing power on demand, but they sell primarily to large corporations, and on a very big scale. Amazon, on the other hand, sells to small and medium-size businesses, as well as to large corporations, and does it via unique technology that builds on previously released Amazon middleware services.
Not everyone agrees that the same company that offers 40% off best-sellers should try to become a big-time IT provider. But Amazon has always believed that books were only an entree into selling far more sophisticated goods and services. Can it succeed? We'll take a look inside the technology, then talk to the Amazon executives in charge of the service, which may give some hint as to whether it will pay off.
How it works
Let's start off with a look at what the system is, how it works and a brief history of it. EC2 is not, in fact, the first of this type of service that Amazon has launched; it's an outgrowth of an existing platform called Amazon Web Services. Back in March 2006, Amazon released its Simple Storage Service (S3), online metered storage that costs 15 cents per gigabyte per month of storage used, plus 20 cents per gigabyte of data transferred. It uses standard Representational State Transfer and Simple Object Access Protocol interfaces.
In July 2006, Amazon followed with the Simple Queuing Service (SQS), a scalable hosted queue that stores messages as they travel between computers. It's designed to let developers easily move data between distributed application components, while ensuring that messages aren't lost.
It can be used to transfer messages even when individual components aren't currently available -- once a component becomes available, the queue delivers the waiting messages to it. Again, it's a metered model; costs are 10 cents per 1,000 messages sent, and 20 cents per gigabyte of data transferred. Like S3, it uses REST and SOAP interfaces.
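SQS itself is a hosted service reached over those REST or SOAP interfaces, but the decoupling pattern it provides can be sketched locally with an in-memory queue. This is a stand-in to show the idea, not the SQS API:

```python
import queue

# The producer enqueues work even though no consumer is running yet;
# the message waits in the queue instead of being lost.
q = queue.Queue()
q.put({"order_id": 42, "action": "ship"})

# Later, when the consumer component comes online, it drains the queue.
msg = q.get()
q.task_done()
```

In the hosted version, the queue lives on Amazon's infrastructure, so producer and consumer can run on entirely different machines and at different times.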
In both instances, the technology wasn't developed from scratch. Instead, Amazon used its own internal infrastructure and technologies and made them available to developers.
EC2 continues in that tradition. Put in the simplest terms, Amazon rents out virtual servers, which it calls instances, hosted on grids in its data centers. Each instance has the approximate power of a server with a 1.7GHz Xeon processor, 1.75GB of RAM, a 160GB hard drive and a 250Mbit/sec. Internet connection that can communicate in bursts of up to 1Gbit/sec.
You pay 10 cents per hour for each instance, plus 20 cents per gigabyte of data transfer. You can also combine it with S3 and pay 15 cents per gigabyte per month for storage. In the future, Amazon will likely roll out other tiers of instances, with more powerful instances costing more per hour.
This is a big change from most hosted models, in which you typically pay based on a maximum or planned capacity, plus fees for added redundancy. In the Amazon model, you pay only for what you actually use.
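With the rates quoted above, a month's bill is simple arithmetic. A quick sketch, using the prices as published (which Amazon could of course change):

```python
def ec2_monthly_cost(instance_hours, gb_transferred, storage_gb=0):
    """Estimate a monthly EC2 bill from the 2006 published rates:
    $0.10 per instance-hour, $0.20 per GB transferred,
    $0.15 per GB-month of optional S3 storage."""
    return (instance_hours * 0.10
            + gb_transferred * 0.20
            + storage_gb * 0.15)

# One instance running around the clock for a 30-day month,
# with 50GB of data transfer: 720 hours at a dime each, plus transfer.
cost = ec2_monthly_cost(30 * 24, 50)
```

For that example the bill comes to $82 -- and if the instance runs only a few hours a day, the bill shrinks proportionally, which is the point of the metered model.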
To use the service, you create a server image (called an Amazon Machine Image, or AMI), based on an Amazon spec. Ultimately, the server image will be able to have whatever operating system, applications, configuration, log-ins and security that you want; at the moment, it supports only the Linux kernel. Amazon also offers prebuilt AMIs, so you don't have to configure one from scratch.
To use EC2, you upload the AMI, then invoke it and use it via an Amazon API. That virtual server can do anything you want -- power a database, speed downloads, power search or host a Web site, for example. You treat the virtual servers just as if they were your own servers.
Users can have multiple AMIs, and those AMIs can cooperate with one another in the same way that servers can. So, for example, you could build a three-tiered application with three different AMIs. One tier could be a Web server using Apache, a second tier could handle the application logic and the third tier could be the database.
While there are clear benefits for small business, larger corporations have signed on, too. For example, Microsoft Corp. has used the service to speed up software downloads, and Linden Lab has used it to help handle downloads of its Second Life online virtual world.
Where EC2 is headed
One major question raised by EC2 has nothing to do with technology, and everything to do with business: Has Amazon made a blunder by venturing outside of its core competency? After all, selling the latest best-seller and holiday gifts is one thing; trying to be a major league IT provider is something else entirely.
But Amazon execs don't see things that way. In fact, they maintain EC2 and similar services are at the heart of the Amazon business plan.
"Amazon is fundamentally a technology company; we've spent more than one and a half billion dollars investing in technology and content," says Adam Selipsky, vice president of product management and developer relations at Amazon Web Services. "We began by retailing books, but it was never in our business plan to stay with that."
Selipsky says that Amazon's first major move into expanding its platform beyond books and basic retail came in 2000, when the company opened its platform to third-party merchants, who were able to sell their products on Amazon.
In 2002, the third wave began, he says, when Amazon launched the Amazon E-Commerce Service, which allows developers to create applications that hook into Amazon's database, retrieve and display product information, and build customer shopping carts.
Out of that grew Amazon's Web Services initiatives, including S3, SQS and EC2.
"The Web Service initiatives let us pass on the engineering expertise we've acquired through the years, and the sometimes painful lessons we've learned building a Web-scale business," Selipsky explains. He adds that Amazon will continue to add other services for developers and businesses, although he would not be specific about what future services might be launched.
What Amazon Cloud means for grid computing
EC2 is one of the more innovative uses of grid computing and middleware, but it is far from the only one, and will certainly not be the last. Grid computing has been hyped for several years, but has yet to live up to that hype.
Robert Rosenberg, president at analyst firm Insight Research Corp., has been tracking grid computing for at least four years, and says, "There's some progress in grid computing, but we hoped that it would be further along than it is by now."
Rosenberg says that a lack of widely accepted standards and the complexities of grid programming have held back grid computing to date. But a service like EC2, he believes, may spur wider use of grids because of its simplicity and pricing that even small businesses can afford.
He estimates that $1.6 billion will be spent on grid computing in 2006, which will rise to $24 billion by 2011. Amazon, clearly, wants a piece of that pie. It's still not clear yet, though, whether the company will be able to replicate its e-retailing success with EC2 and other services aimed at the IT crowd.
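That projection implies a striking growth rate; a quick back-of-the-envelope check of the compound annual growth it assumes:

```python
# Rosenberg's estimates: $1.6B in 2006 growing to $24B by 2011
start, end, years = 1.6, 24.0, 5

# Compound annual growth rate implied by those two figures
cagr = (end / start) ** (1 / years) - 1  # roughly 0.72, i.e. ~72% a year
```

A market compounding at roughly 72% a year would be remarkable by any standard, which helps explain why Amazon wants in early.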