What do Virtualization and Cloud executives think about 2010? Find out in this VMblog.com series exclusive.
Contributed Article By Brian Duckering, Senior Manager, Endpoint Virtualization, Symantec
2010 Predictions for Endpoint Virtualization
I hope that I am only remembered for the predictions that actually come true. Fortunately, few bad predictions are remembered for very long - surely the old forecast of a world market for "maybe five computers" has faded from most memories. There is a lesson there, however: technology developed and well understood in one arena may have explosive and disruptive implications in many other arenas not originally considered. Virtualization is forging a similar path. Server virtualization is well understood and provides great value, but other forms of virtualization have been confusing, complex, misunderstood, and at times even more costly. I see 2010 as the year that a critical mass of comprehension causes a shift from IT business as usual to a new paradigm of automation and productivity. Well, perhaps the planning will start in 2010.
Even if it is just the planning that happens next year, here are a few things that I think will move us in that direction in 2010.
Management methods shift from system-based to user-based
Managing systems has always worked just fine. But it has become a lot more complicated and costly as users grow more mobile and less predictable, demanding that their workspaces follow them seamlessly from one device to another. For many, this has prompted a re-evaluation of what the purpose of IT actually is. The systems don't create value for companies - the users do. Yet the tools and methods predominantly deployed target devices, not people.
As this realization spreads in 2010, IT will actively look for technologies that are user-focused. Virtualization encourages this shift in focus by separating the information that matters from the hardware and backend infrastructure that has become somewhat irrelevant to the user. Virtualized and streamed applications, for example, can be up and running on any system in any location in seconds. This means that the resources a user needs to be productive can be configured on the fly, instead of ahead of time. We used to have to prepare each system with the maximum set of assets a user might need, because we couldn't predict what they would need or when. That is costly and time-consuming.
Virtualization allows a just-in-time computing approach, where IT prepares the minimum configuration common across the company, and productivity needs determine when and where resources are delivered. It's all about applying IT resources where they are needed, at the moment they are required - and not applying them where they are not - and it is easy to automate this. The potential savings in IT time and resources, including application license costs, are huge. Businesses seeking to reduce their planning, testing, and deployment cycles in 2010 will increasingly look to endpoint virtualization and the user-based management approach to meet their goals.
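The just-in-time, user-based idea can be sketched in a few lines of Python. All of the names here - users, applications, the entitlement table - are hypothetical, invented purely to contrast delivering apps at the moment a user needs them with pre-provisioning the maximum set on every machine:

```python
# Toy sketch of user-based, just-in-time application delivery.
# Entitlements are keyed to *users*, not to devices.

BASE_IMAGE = {"os-patches", "antivirus", "vpn-client"}  # minimum common config

ENTITLEMENTS = {  # hypothetical: apps each user is entitled to
    "alice": {"cad-suite", "office"},
    "bob": {"office", "erp-client"},
}

def apps_to_stream(user, already_installed):
    """Return only the apps this user needs that are not already present."""
    return ENTITLEMENTS.get(user, set()) - already_installed

# At login on any device, stream just what this user still needs:
needed = apps_to_stream("alice", BASE_IMAGE)
print(sorted(needed))  # -> ['cad-suite', 'office']
```

The point of the sketch is that the per-device footprint stays at the minimum common configuration, and everything beyond it is computed per user, on demand, which is what makes the approach easy to automate.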
Accepting infrastructure hybridization
Every new computing model that has come along has been proclaimed by some to be the next killer app that will replace all others; VDI is merely the latest. So far that has not come to pass. 2010 will be the year of general consensus and acceptance that each new model solves certain problems and best satisfies the needs of SOME, but not ALL, users. This means that companies will accept that they will have some combination of laptops, desktops, terminal servers, VDI, blades, etc., with each serving its purpose as the best balance of cost per user, performance, and connectivity for its target user group - and stop trying to fight that reality.
The corollary to this infrastructure hybridization is that IT will be looking for tools and technologies that bridge the models, help to integrate the technologies, and reduce the redundancies that occur when multiple IT groups tackle similar tasks with different technologies. The goal, of course, is to actually reduce costs as these great new virtual desktop technologies are adopted, not to increase them, as many companies have discovered. In the end, though, IT still wants to provide seamless productivity for its users, regardless of which technology is supporting them on the backend.
Using virtualization to improve security
In 2009, the big discussion was around securing virtual environments. The technology has progressed, but that discussion is far from over. It is not, however, what I'm talking about here. I'm talking about using virtualization to actually improve security. It is easy to see today that virtualization can help in far more use cases than server sprawl, and security will surely benefit from it as well.
Virtualization fundamentally provides a separation between one thing and another. Application virtualization, for example, can isolate what might be considered a high-risk application, such as a web browser, from the rest of the system so that threats encountered while using that application cannot affect anything else. We are already seeing benefits from this type of implementation. But beginning in 2010, we'll see DLP, traditional security programs, endpoint management, and endpoint virtualization begin to converge.
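As a rough illustration of the isolation principle - not any vendor's actual implementation - an application virtualization layer can redirect everything the isolated application writes into its own sandbox, so the base system never changes. The toy Python below models this as a copy-on-write key/value store; all names are invented for the example:

```python
# Toy copy-on-write layer illustrating application isolation.
# "base" stands in for the real system; the layer captures every
# write the isolated app makes, so the base is never modified.

class IsolationLayer:
    def __init__(self, base):
        self.base = base      # shared system state, treated as read-only
        self.layer = {}       # the isolated app's writes land here
        self.deleted = set()  # paths the app "deleted" (only in its view)

    def read(self, path):
        if path in self.deleted:
            raise FileNotFoundError(path)
        return self.layer.get(path, self.base.get(path))

    def write(self, path, data):
        self.layer[path] = data   # never touches self.base
        self.deleted.discard(path)

    def delete(self, path):
        self.layer.pop(path, None)
        self.deleted.add(path)    # hidden from the app, still in the base

base_system = {"hosts": "127.0.0.1 localhost"}
browser = IsolationLayer(base_system)
browser.write("hosts", "0.0.0.0 evil.example")  # a hijack attempt
print(browser.read("hosts"))   # the app sees only its own tampered copy
print(base_system["hosts"])    # -> 127.0.0.1 localhost (system unaffected)
```

Discarding the layer discards the threat along with it, which is why a compromised isolated application leaves the rest of the system intact.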
Consolidation around integrated solutions
Enterprises usually prefer integrated solutions to patching together technologies from multiple vendors. Although new technologies are almost always introduced and brought into production environments in patch-together mode, the maturing of technologies and the weariness of IT staffs eventually create a desire for end-to-end solutions. In 2010, virtualization will reach that point. Although nobody wants a closed system that cannot be extended, enterprises will want fewer point solutions and more unified offerings from single vendors - solutions that work right out of the box. Nobody would seriously consider going to the automotive parts store to buy all of the parts to build a car, and I think enterprises are now growing weary of taking that approach to building a complete virtualization solution.
So enterprises will look to fewer vendors to solve more of their problems. And this will be a great thing for endpoint virtualization, because with the valuable pieces being integrated by the larger vendors, it will be far easier to scale and ramp up adoption rates in 2010.
Weeding out impractical solutions
There have been many different features and solutions introduced into the virtualization space over the last few years, and that is great. Having now seen and investigated most of them, enterprises will in 2010 become more focused on the features that solve real problems and discard those that either don't fit well or even add greater challenges. Here's an example of a great virtualization idea: put an application on a USB key so that it can be carried around and run anywhere. Unfortunately, while this is pretty cool for an IT guy to experiment with, or for a low-value consumer app, enterprises will soon realize the management, security, and licensing challenges of this model in their controlled environments.
There are certainly other examples, and we look forward to the many innovative ideas still to come from the virtualization world. Ultimately, however, it will not be the creative virtualization engineers but the enterprise IT managers who decide which ideas have lasting value and which do not. In 2010, enterprises will get better at identifying which is which, and hopefully virtualization vendors will as well.
Windows 7 is a catalyst
It's been nearly 10 years since Windows XP came out, and it is by far the most common production OS in enterprises today, since most skipped Vista. So the introduction of Windows 7, which seems to have addressed many of the biggest complaints about Vista, will take center stage as one of the most common enterprise projects in 2010. With a project as large as a new operating system, enterprises see an opportunity to make even larger infrastructure changes at the same time. All of the "nice to have" projects that have been sidelined for higher-urgency items over the past several years will finally rise to the top of the pile as part of this bigger project.
Windows 7 will be the catalyst for even greater things. And with the growing recognition of endpoint virtualization as a change agent that could fundamentally enhance the way IT is done (if only it weren't so disruptive), this might be just what is needed to propel it from the testing labs into production. With the coming Windows 7 migrations, it is just possible that endpoint virtualization adoption will double in the next year or two - growth that previously would have taken 10 to 15 years. It is truly the opportunity for enterprises to establish virtualization as their new foundation.
So there you have it: six endpoint virtualization predictions that stand a far better chance of being correct than predicting how many computers the world will need. More importantly, regardless of the specific steps we see in 2010, or exactly how many implementations the world ultimately needs, the surest prediction is that endpoint virtualization, in its many forms, will continue to grow in popularity and prominence in enterprise infrastructures - and we have not yet seen all of the ways virtualization will improve our lives.
About the Author
Brian Duckering is Senior Manager responsible for Symantec's Endpoint Virtualization products and technologies. Symantec Endpoint Virtualization helps IT departments provide the end user with more efficient access to data while delivering a more flexible and productive computing experience. Duckering's experience includes development, sales engineering, product management and marketing, alliance management and technology evangelism. Duckering joined Symantec from AppStream, where he served as Senior Director of Products and Alliances. AppStream was acquired by Symantec in April of 2008.