What do virtualization executives think about 2009? A VMBlog.com Series Exclusive.
Contributed by Ken Berryman, Vice President, Endpoint Virtualization, Symantec
Endpoint Virtualization in 2009
In 2008, as enterprises of all sizes began to realize increasing benefits from endpoint virtualization, we saw a flurry of new companies dipping their toes into the endpoint virtualization pool. The result has been to move endpoint virtualization toward the destination that all successful technologies eventually reach: the mainstream. As a result, the demonstrable ROI of endpoint virtualization technology has steadily improved, and as market forces continue to drive ROI higher, the door to major endpoint virtualization advancements in 2009 is opening.
Moving towards a standards-based approach
The first major innovation to endpoint virtualization we will see in 2009 is the establishment of a standards-based approach. When a new technology such as endpoint virtualization is developed, it typically falls outside the arena of established standards—that is, after all, part of what makes it new. However, as the technology becomes more mainstream, the lack of standards becomes a major source of complexity. Oftentimes, vendors begin pitting their own standards against those of other vendors, leaving IT administrators with a confusing mess of incompatibilities to sort out on their own.
However, the major endpoint virtualization vendors are beginning to realize the need for a standards-based approach to accomplishing the many tasks of endpoint virtualization, such as the formatting of application packages, and most are moving in that direction. It is important that, as the industry moves toward a standardized approach, everything possible is done to comply with existing standards rather than create new ones. This may make the road to standardization a bit more arduous, but in the end it will keep complexity, and therefore total costs, to a minimum. A real movement toward standardization is also what will drive convergence between local computing and cloud-based services, allowing IT and end users to have a best-of-both-worlds model instead of being forced into a single, less than ideal, computing model.
A unified set of management tools is given priority
The second major advance we will see in 2009 is the development of a common set of management tools. One of the overall goals of endpoint virtualization has to be to simplify IT management—this is where the majority of endpoint cost resides today. The problem is that most management solutions that came to market before and during 2008 are only capable of managing either traditional or virtual environments, but not both. This being the case, in order to effectively manage their infrastructures, IT administrators who have implemented endpoint virtualization have had to use an assortment of tools to keep all the environments in their infrastructure—traditional, virtual and hybrid—in check.
Customers do not want to have multiple tools, using multiple management methods, to provide coverage of their full environment, physical and virtual. Ideally, customers should have one solution from one vendor to manage all aspects of endpoint computing, from the physical to the virtual and from the endpoint device to the data center back end.
At this point the industry cannot even lay claim to providing a partial solution to this management challenge. However, in 2009 we will see great advancements in this area. As the economy continues to struggle into the new year, multi-vendor infrastructures will remain the norm as IT seeks to extract every ounce of value and make the most of every dollar. These multi-vendor infrastructures will span a wide array of solution types, from terminal services and Software as a Service (SaaS) to Virtually Hosted Desktops (VHD), cloud computing and local computing. With all of these solution types existing alongside one another within the enterprise, demand for systems that can manage heterogeneous, multi-vendor, cross-platform environments will increase dramatically.
The path toward this unified management console is already being paved. Many companies are signing on to support non-traditional approaches from vendors, such as packaging solutions that create virtualized applications. Some vendors are including application virtualization, streaming and lightweight local virtual machine support in their larger management frameworks, in some cases by way of OEM agreements, while they determine their own long-term strategies.
Pilot programs take flight
Also in 2009, we will see the number of internal pilot programs increase. As countless new technologies around endpoint virtualization and general virtualization enablement continue to be brought to market, IT will much more aggressively explore new approaches to endpoint virtualization technology implementations.
Many IT managers are already building up internal lab environments to test new innovations and push requirements back to vendors. More than ever before, endpoint virtualization purchases in the coming year will be leveraged to drive enhancements back to the vendor community, moving point products toward enterprise-ready solutions. Based on current trends, it is hardly a stretch to imagine every enterprise running one or more internal endpoint virtualization pilot programs across multiple vendors, with use cases selected for their expected benefits in business productivity and cost savings. These pilots will then translate into large-scale implementations for those vendors that can demonstrate the benefits of taking advantage of this range of technologies.
Application virtualization will help IT reduce, reuse and recycle
Doing more with what organizations already have is not a new theme, but in 2009 we will see application virtualization take a much more prominent role in green initiatives. Projects like license compliance, a major endpoint virtualization driver, will transition from liability-reduction efforts into capital-reduction efforts. Solutions like application streaming will be used to create infrastructures perfectly adapted to real business needs, such as hotelling to reduce the number of machines and the office space required for a workforce of a given size. Just as server virtualization has allowed for datacenter consolidation and a reduction in hardware resources, endpoint virtualization will bring innovations to the physical endpoint and reduce the need for costly new datacenter builds to support ever-changing end-user computing needs. We will also see more use of application virtualization to allow applications that would typically have conflicted with each other on the same device to execute side by side without conflict, eliminating the need for expensive server-based sandboxes.
In addition, the maturing of virtualization management systems will bring more advanced automation, which will immediately reduce IT overhead for users at the endpoint. In 2009, IT will also place more value in bridging solutions and virtualization middleware to connect disparate underlying infrastructures so they can add new functionality and take full advantage of the technology they have already purchased and deployed.
In conclusion, 2009 will see endpoint virtualization continue to deliver on its promise to enable the information resources that businesses depend on to be protected more completely, managed more easily and controlled automatically—all with greater visibility, increased cost savings and more confidence. For that reason, 2009 will be the year that endpoint virtualization truly moves from the labs and into the mainstream.
About Ken Berryman
Ken Berryman is responsible for driving Symantec’s overall endpoint virtualization business, including software virtualization and streaming solutions. Symantec endpoint virtualization helps IT departments provide the end user with more efficient access to data while delivering a more flexible and productive computing experience. Endpoint virtualization will also integrate with other Symantec solutions to improve the security, storage, portability and management of customer information.
Berryman previously led product development for the NetBackup Product Platform within the Symantec Data Center Management Group and was responsible for the NetBackup, PureDisk, and Backup Reporter product lines.
Berryman joined Symantec from McKinsey & Company, where he was a partner in the Silicon Valley office. During his ten years at McKinsey, he led the North American Software Practice and was a well-known speaker at events such as Software and Enterprise 2007, SoftSummit, and SIIA's Enterprise conferences. Berryman served clients across the high-tech industry on a range of topics including strategy, operations, and sales effectiveness.
Berryman holds both a master’s and doctorate in physics from Stanford University as well as a bachelor’s in physics from Harvard University.