Virtualization Technology News and Information
CloudSigma: 2012 Cloud Predictions - Expanding Flexibility


What do Virtualization and Cloud executives think about 2012? Find out in this series exclusive.

2012 Cloud Predictions - Expanding Flexibility

Contributed Article by Robert Jenkins, CTO of CloudSigma

"Flexibility" is a growing buzzword in the tech space and is increasingly important as technology and innovation continue to skyrocket. Just as DVDs moved to streaming and books moved to e-readers, so too must cloud computing move to a more flexible model for users, putting them in the driver's seat. While it's true that the cloud is actually the more flexible model when compared with physical servers and SANs, there is still tremendous room for improvement in terms of cloud flexibility, especially for public cloud infrastructures.

Traditional public cloud infrastructures have gone as far as they can with their current feature-set and capabilities. With the infrastructure as a service (IaaS) market widening, customers are demanding better price/performance, greater flexibility and more control. For instance, companies requiring high-performance computing, heavy streaming or data processing simply aren't good fits in traditional 'bounded' clouds. This is precisely why flexibility and control will be a major focus in 2012 for IaaS offerings, particularly in terms of resources, deployments and user control.

Flexible Resources

Traditional cloud infrastructure providers, including Amazon and Rackspace, offer resources in bundled packages. On the surface, it makes sense. Every company migrating to the cloud will need sufficient CPU, RAM and storage, so why not package them together and price them as one charge? Sounds simple, right? Wrong. In reality, this creates an extremely inefficient and inflexible system for the customer because it does not allow the resources to scale freely to match individual needs or demand.

For example, suppose a provider bundles CPU and RAM together at a fixed 2:1 RAM-to-CPU ratio. A customer that needs very little RAM but significant CPU resources would be forced to buy bundle after bundle until they reached their desired CPU capacity, doubling their RAM allocation at every step, no matter how unnecessary the extra memory may be. This creates wasted resources through significant over-provisioning.
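The waste is easy to quantify. As a minimal sketch, assume a hypothetical bundle of 1 CPU core plus 2 GB of RAM (the 2:1 ratio above) and a CPU-heavy workload; the figures are illustrative, not any real provider's pricing:

```python
import math

# Hypothetical bundle: 1 CPU core + 2 GB RAM (a fixed 2:1 RAM-to-CPU ratio).
BUNDLE_CPU = 1  # cores per bundle
BUNDLE_RAM = 2  # GB of RAM per bundle

def bundles_needed(cpu_required, ram_required):
    """Smallest number of identical bundles covering both requirements."""
    return max(math.ceil(cpu_required / BUNDLE_CPU),
               math.ceil(ram_required / BUNDLE_RAM))

# A CPU-heavy workload: 8 cores needed, but only 4 GB of RAM.
cpu_req, ram_req = 8, 4
n = bundles_needed(cpu_req, ram_req)
wasted_ram = n * BUNDLE_RAM - ram_req
print(n, wasted_ram)  # 8 bundles purchased, 12 GB of RAM wasted
```

Because the bundle count is driven entirely by the CPU requirement, the customer pays for 16 GB of RAM while using only 4 GB.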

Moving into 2012, many cloud providers should come to realize the efficiencies of unbundled resources, allowing customers to buy CPU, RAM and storage in the exact quantities they need. Not only is this significantly more effective, but it is also more cost-efficient. With unbundled resources, companies can purchase exactly what they need and avoid the waste of over-provisioning - on their infrastructure and their wallet. In a dynamic environment, unbundled resources allow for efficient, granular scaling. Indeed, some cloud providers extend resource flexibility a step further, charging by smaller time increments - even every 5 minutes, instead of hourly or monthly - or offering lower rates during nights or weekends when there's less volume.
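The effect of smaller billing increments is straightforward to model. The sketch below compares hourly and 5-minute rounding for the same job; the $0.12/hour rate and the 65-minute runtime are hypothetical:

```python
import math

def billed_cost(runtime_minutes, rate_per_hour, increment_minutes):
    """Cost when usage is rounded up to the provider's billing increment."""
    increments = math.ceil(runtime_minutes / increment_minutes)
    return increments * increment_minutes / 60 * rate_per_hour

# A 65-minute batch job at a hypothetical $0.12/hour rate:
hourly = billed_cost(65, 0.12, 60)   # rounds up to 2 full hours -> $0.24
five_min = billed_cost(65, 0.12, 5)  # rounds up to 65 minutes  -> $0.13
print(hourly, five_min)
```

For short or bursty workloads, the coarser the increment, the larger the share of the bill that goes to time the server never actually worked.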

Flexible Deployments

Currently, when moving to the cloud, many companies are forced to change their operating system or software to accommodate the provider's restrictions. But, why should enterprises have to change their preferred infrastructure when they move to a public cloud? To make the transition as seamless and effective as possible, shouldn't the cloud infrastructure at least match the flexibility and control offered by physical, hardware-based data centers?  

Good price/performance is achieved by creating a good 'fit' between the computing requirements of the customer and the specifications of the cloud servers. Giving public cloud customers the ability to tweak their infrastructure to better fit their particular computing profile leads to much greater computing efficiency and, therefore, much better price/performance. 

Giving users more flexibility in terms of their deployments will be something many providers will need to adopt in 2012. Lifting restrictions on operating systems and application deployments will give many enterprises not only the flexibility, but the confidence, to move away from their proprietary infrastructure and into the cloud. And, with Ovum predicting that North America's cloud market will continue to rise at a compound annual growth rate of 27.1 percent over the next five years, the demand for flexible deployment options is sure to grow as technology continues to advance.

With flexible deployment options, cloud providers essentially future-proof their IaaS offering as they will be able to adequately handle any new software, application or operating system that may emerge.  Without such flexibility, restrictive providers may soon become obsolete if they are unable to adapt.

Flexible Control

Companies adopting cloud computing are smart. Many have run their own IT infrastructure in-house successfully for the duration of their company's existence. So, who's to say they can't manage their infrastructure just as successfully in the cloud? As it stands, many public cloud IaaS providers closely manage companies' cloud deployments and data, locking them into an agreement that limits their access to their own data, or even charges them a fee to take it out of the provider's cloud.  

As with most IaaS providers, users' data is stored on virtual drives within the cloud, and companies can simply transfer data out over IP. However, this does not include entire drive images, which are a key component of achieving ubiquitous access and control. And, with flexibility demands on the rise, companies will naturally want access to their infrastructure, including the ability to move their data around as they see fit. With complete data portability, companies can rest assured that they have control over all of their data.

Ringing in 2012

In a recent study, Gartner predicted that the worldwide cloud IaaS market would grow from an estimated $3.7 billion in 2011 to $10.5 billion by 2014. This shows tremendous growth and promise for the cloud over the next few years. However, in order to sustain increasing adoption rates, there's a certain level of development that needs to accompany cloud innovation - namely, increased flexibility.  
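Gartner's figures imply a striking growth rate. Treating 2011 to 2014 as three compounding years, a back-of-envelope calculation gives the implied CAGR:

```python
# Implied compound annual growth rate from Gartner's projection:
# $3.7B (2011) -> $10.5B (2014), i.e. three compounding years.
start, end, years = 3.7, 10.5, 3
cagr = (end / start) ** (1 / years) - 1
print(round(cagr * 100, 1))  # roughly 41.6% per year
```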

The cloud was initially conceived to be a more flexible, scalable and accessible IT environment. In order to stay aligned with that model in 2012 and beyond, restrictive cloud providers will need to strip away their limitations on resources, deployments and user control. Achieving a completely customizable and flexible IaaS platform will be a necessity in order to remain a viable public cloud option and meet customer demand. As we ring in the New Year, it will be interesting to see how cloud providers address these needs and compete with the providers that already deliver this level of flexibility.


About the Author

Robert Jenkins is the co-founder and CTO of CloudSigma and is responsible for leading the technological innovation of the company's pure-cloud IaaS offering. Under Robert's direction, CloudSigma has established an unprecedentedly open, customer-centric approach to the public cloud.

Published Thursday, December 08, 2011 2:41 PM by David Marshall
