October 25, 2010
Cloud computing (which is many things to many different people) has really begun to take shape in 2010. Initially, amid all the hype and misunderstanding, the primary selling point of cloud computing was providing unlimited computing scale without any capital investment.
I think over the last few months we have seen the selling point become more about providing computing power as a utility, with pay-for-use pricing much like electricity. It is a subtle difference, but an important one nonetheless. You can still have near-unlimited scale, but I think cloud computing vendors have begun to realise that the vast majority of potential customers won't turn into the next Facebook, Twitter or whatever.
Now the focus is on the ability to deploy computing power on an as-needed basis, and it is a powerful argument once you start to dig into how it works. Over the last few months I have been playing around with Amazon Web Services (AWS), which is really a phenomenal infrastructure. I initially looked at it about 2 or 3 years ago, when it was all driven by fiddly command-line tools, and it was just a bit more hassle than it was worth unless you were a super computing geek!
With Amazon Web Services you can create a new server instance in under 30 seconds, with a starting cost of $0.12 per hour for EU-based Windows servers (actually hosted in Dublin, Ireland). There are a number of pricing structures available, but the main point is that with AWS you are given the ability to create your own servers in 4 global datacentres. So if you have a need for it you can deploy a server to run on the east or west coast of the US, in Europe (Dublin) or in Singapore, with more options likely to become available in time.
You can manage all of this yourself via an easy-to-use web interface. You essentially have your own global datacentres with zero upfront cost. I couldn't even begin to figure out how much that would cost if I were buying physical servers, finding datacentres to host them in, or dealing with different vendors in different countries who provide traditional server hosting. It is utility-based computing in its purest form, with all the flexibility that provides.
You may, for example, use AWS to host a web application on a single server instance. Let's say you now want to do some testing or development on it. Well, all you need to do is launch a new server instance based on a copy of your live instance. That will take about 30 seconds and then you are all set. If you spend 8 hours working on it, that will cost you about $1 (8 hours at $0.12 per hour). Terminate that test instance at the end of the day and that's the only cost you will incur.
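To make the arithmetic concrete, here is a minimal sketch of that cost calculation in Python. It assumes the $0.12/hour EU Windows rate quoted above, and that AWS bills each partial hour as a whole instance-hour (my understanding of current EC2 on-demand billing); the function name is my own, not anything from AWS.

```python
import math

# Assumed on-demand rate for an EU-based Windows instance (quoted above).
HOURLY_RATE_USD = 0.12

def instance_cost(hours):
    """Rough cost in USD of one on-demand instance, assuming
    partial hours are billed as whole instance-hours."""
    billed_hours = math.ceil(hours)
    return round(billed_hours * HOURLY_RATE_USD, 2)

print(instance_cost(8))    # an 8-hour dev session -> 0.96
print(instance_cost(8.5))  # 8.5 hours billed as 9 -> 1.08
```

An eight-hour development day on a throwaway test instance really does come in at under a dollar, which is the whole utility-computing argument in one number.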
AWS has many other services beyond just providing server instances that I haven't even mentioned, but I highly recommend you investigate them. Even just recently we have seen other vendors like Dediserv and Digiweb start to offer their own cloud computing infrastructure. They have a long way to go to catch up with AWS, which has at least 5 years of technical growth and competence behind it (November 2005 was the earliest post I could find on Amazon's official AWS blog).
Cloud computing is still maturing, there is no doubt about that. Interoperability amongst vendors and legal issues still need to be worked out, or at least understood. However, there will come a time when we have to make a valid business case for buying a physical on-premise server versus a cloud server. After all, you would have a hard time today convincing someone to buy a generator rather than plug into the nearest wall socket.