White Papers Search Results
HP, VMware & Liquidware Labs Simplify Desktop Transformation
This whitepaper provides an overview of the requirements and benefits of launching a virtual desktop project on a proven, enterprise-ready solution stack from HP, VMware, and Liquidware Labs. HP VirtualSystem CV2, with VMware View and Liquidware Labs ProfileUnity, offers a comprehensive virtual desktop solution stack with integrated User Virtualization Management and Dynamic Application Portability. By combining offerings from proven industry leaders in this end-to-end solution, customers can fast-track their desktop transformation.
Desktops and workspaces are transforming to virtual and cloud technologies at a lightning-fast pace. With the rapid adoption of Microsoft Windows 7 (and soon Windows 8), virtual desktop strategies, cloud storage, and virtual applications, a perfect storm is brewing that is driving organizations to adopt client virtualization now.

You need a plan, one that is complete and capable of guiding you through this key phase of your desktop transformation project. HP and Liquidware Labs offer a comprehensive User Virtualization Management and Dynamic Application Portability (DAP) solution that addresses the key requirements for your desktop transformation to a virtual desktop infrastructure (VDI).

User Virtualization and Dynamic Application Portability from HP and Liquidware Labs is integral to your VDI project by providing the following:

  • Dramatic savings in storage, licensing, and management costs with the use of robust and flexible persona management to leverage non-persistent desktops.
  • Instant productivity within seconds of logon, with automatic context-aware configurations that enable flexible desktop environments where users can log on to any desktop, physical or virtual.
  • Minimized golden image builds with the ultimate in personalization, breaking down the barriers to user adoption and fast-tracking productivity in the new environment with user- and department-installed applications.
Liquidware Labs Desktop Transformation Methodology
This whitepaper provides an overview of Liquidware Labs proprietary Desktop Transformation methodology of Assess, Design, Migrate and Validate and describes how our solutions are designed to assist users to move through a logical, phased approach for successful migrations of physical to virtual desktops.
Stratusphere Architectural Overview
In this paper, we outline the Architectural components and considerations for our Stratusphere FIT and Stratusphere UX products. This paper is intended for technical audiences who are already generally familiar with these solutions and the functionality they provide.

VDI FIT and VDI UX Key Metrics
As more organizations prepare for and deploy hosted virtual desktops, it has become clear that there is a need to measure compatibility and performance for virtual desktop infrastructure, both during planning and when gauging user experience after the move to virtual desktops. This whitepaper covers best practices and introduces the composite metrics VDI FIT in Stratusphere FIT and VDI UX in Stratusphere UX, which provide a framework for rating desktops as Good, Fair, or Poor.

As more organizations prepare for and deploy hosted virtual desktops, it has become clear that there is a need to support two related but critical phases. The first is to inventory and assess the physical desktop environment in order to create a baseline for the performance and quality of the user experience for the virtual desktop counterparts. When planning and preparing, organizations would like to know which desktops, users and applications are a good fit for desktop virtualization and which are not. The second phase is to track the virtual desktops in production in order to proactively identify when performance and user experience do not meet expectations, as well as to continue to refine and optimize the desktop image and infrastructure as changes are introduced into the environment.

Because virtual desktops live on shared systems, the layers of technology make it more complex to measure and classify fitness and user experience. But with increased industry knowledge and the emergence of best practices, plus new purpose-built products such as Liquidware Labs’ Stratusphere FIT and Stratusphere UX, it is now possible to more accurately measure and classify both fitness and user experience. This white paper covers these best practices and provides an introduction to the VDI FIT and VDI UX classification capabilities in Stratusphere FIT and Stratusphere UX.
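To make the Good/Fair/Poor framework concrete, here is a minimal sketch of how a composite fitness score might combine normalized per-resource metrics and map them onto that scale. The metric names, weights, and thresholds are illustrative assumptions for demonstration only; they are not the actual VDI FIT or VDI UX formulas.

```python
# A minimal sketch of a composite Good/Fair/Poor desktop rating. The metric
# names, weights, and thresholds below are illustrative assumptions for
# demonstration only -- they are not the actual VDI FIT or VDI UX formulas.

def composite_fitness(cpu, memory, disk_io, network, weights=None):
    """Combine normalized per-resource scores (0.0 = worst, 1.0 = best)
    into a single 0-1 composite value."""
    weights = weights or {"cpu": 0.30, "memory": 0.30,
                          "disk_io": 0.25, "network": 0.15}
    scores = {"cpu": cpu, "memory": memory,
              "disk_io": disk_io, "network": network}
    return sum(weights[k] * scores[k] for k in weights)

def classify(score, good=0.8, fair=0.6):
    """Map a composite score onto the Good/Fair/Poor scale."""
    if score >= good:
        return "Good"
    if score >= fair:
        return "Fair"
    return "Poor"

# Example: a desktop with strong CPU and network scores but weaker disk IO.
score = composite_fitness(cpu=0.9, memory=0.7, disk_io=0.6, network=0.95)
print(f"{score:.2f} -> {classify(score)}")  # prints "0.77 -> Fair"
```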

Liquidware Labs ProfileUnity and its Role as a Disaster Recovery Solution
As desktop operating systems become more and more complex, the need for a proper disaster recovery methodology on the desktop is increasingly crucial, whether the desktop is physical or virtual. In addition, many enterprise customers are leveraging additional technologies, including local hard drive encryption, persistent virtual images and non-persistent virtual images. This paper provides an overview of these issues and outlines how a disaster recovery (DR) plan, coupled with Liquidware Labs ProfileUnity, can address them.
Many corporations around the globe leverage Virtual Desktop Infrastructure (VDI) as a strategic, cost-effective methodology to deliver business continuity for user applications and data. Virtualization renders a physical computer made of metal, plastic and silica as a portable file that can be moved through a network from a data center to a disaster recovery (DR) site. Although this may sound easy, transferring virtual machine files can challenge corporate networks in a number of ways. Moving large amounts of data is a time-consuming process that may take days to complete, and once the archival process is complete, the data is effectively out of date or out of context.

In response, various strategies focus on leaving the bulk of the already-transferred data in place and replicating only the changes. Desktop infrastructure is particularly sensitive to synchronization, since applications must stay in sync to run properly, and desktops, applications and data change often. This has given birth to a whole new set of strategies and software unique to desktops to accomplish backups safely and effectively. Liquidware Labs’ ProfileUnity™ is a best-of-breed solution that provides a seamless end-user DR experience identical to the one at the home office.
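As a rough illustration of the change-only replication strategy described above, the sketch below hashes a disk image in fixed-size blocks and reports which blocks would need to be re-sent. The block size and hash-per-block scheme are assumptions for demonstration; shipping products typically track changed blocks at the hypervisor or storage layer instead.

```python
# A rough sketch of change-only replication, assuming a simple hash-per-block
# scheme over a flat disk-image file. Shipping products typically track
# changed blocks at the hypervisor or storage layer instead.

import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB replication granularity (assumed)

def block_digests(path):
    """Return a per-block content hash for each block of a disk image."""
    digests = []
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK_SIZE):
            digests.append(hashlib.sha256(chunk).hexdigest())
    return digests

def changed_blocks(previous, current):
    """List the block indexes that must be re-sent to the DR site."""
    changed = [i for i, (old, new) in enumerate(zip(previous, current))
               if old != new]
    # Blocks appended since the last cycle must be sent as well.
    changed.extend(range(len(previous), len(current)))
    return changed

# Only the returned block indexes cross the WAN each cycle, instead of
# re-sending the entire multi-gigabyte image.
```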
5 Fundamentals of Modern Data Protection
Some data protection software vendors will say that they are “agentless” because they can do an agentless backup. However, many of these vendors require agents for file-level restore, proper application backup, or to restore application data. My advice is to make sure that your data protection tool is able to address all backup and recovery scenarios without the need for an agent.
Legacy backup is costly, inefficient, and can force IT administrators to make risky compromises that impact critical business applications, data and resources. Read this NEW white paper to learn how Modern Data Protection capitalizes on the inherent benefits of virtualization to:
  • Increase your ability to meet RPOs and RTOs
  • Eliminate the need for complex and inefficient agents
  • Reduce operating costs and optimize resources
The Expert Guide to VMware Data Protection
Virtualization is a very general term for simulating a physical entity by using software. There are many different forms of virtualization that may be found in a data center, including server, network and storage virtualization. Server virtualization in particular involves many unique terms and concepts that are part of the technology behind it.
Virtualization is the most disruptive technology of the decade. Virtualization-enabled data protection and disaster recovery is especially disruptive because it allows IT to do things dramatically better at a fraction of the cost of what it would be in a physical data center.

Chapter 1: An Introduction to VMware Virtualization

Chapter 2: Backup and Recovery Methodologies

Chapter 3: Data Recovery in Virtual Environments

Chapter 4: Choosing the Right Backup Solution for VMware
CIO Guide to Virtual Server Data Protection
Server virtualization is changing the face of the modern data center. CIOs are looking for ways to virtualize more applications, faster, across the IT spectrum.
Server virtualization is changing the face of the modern data center. CIOs are looking for ways to virtualize more applications, faster across the IT spectrum. Selecting the right data protection solution that understands the new virtual environment is a critical success factor in the journey to cloud-based infrastructure. This guide looks at the key questions CIOs should be asking to ensure a successful virtual server data protection solution.
Five Fundamentals of Virtual Server Protection
The benefits of server virtualization are compelling and are driving the transition to large scale virtual server deployments.
The benefits of server virtualization are compelling and are driving the transition to large-scale virtual server deployments. From the cost savings realized through server consolidation to the business flexibility and agility inherent in emerging private and public cloud architectures, virtualization technologies are rapidly becoming a cornerstone of the modern data center. With Commvault's software, you can take full advantage of the developments in virtualization technology and enable private and public cloud data centers while continuing to meet all your data management, protection and retention needs. This whitepaper outlines the top 5 challenges to overcome in order to take advantage of the benefits of virtualization for your organization.
Server Capacity Defrag
This is not a paper on disk defrag. Although conceptually similar, it describes an entirely new approach to server optimization that performs a similar operation on the compute, memory and IO capacity of entire virtual and cloud environments.

Capacity defragmentation is a concept that is becoming increasingly important in the management of modern data centers. As virtualization increases its penetration into production environments, and as public and private clouds move to the forefront of the IT mindset, the ability to leverage this newfound agility while at the same time driving high efficiency (and low risk) is a real game changer. This white paper outlines how managers of IT environments can make the transition from old-school capacity management to new-school efficiency management.
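As a toy model of what "defragmenting" capacity might look like, the sketch below repacks VM demands onto hosts with a first-fit-decreasing heuristic so that free capacity is consolidated rather than scattered. The single capacity dimension and the greedy policy are simplifying assumptions; real efficiency-management tools weigh CPU, memory, and IO together.

```python
# A toy model of capacity defragmentation in one dimension. The single
# capacity unit and the first-fit-decreasing policy are simplifying
# assumptions; real tools weigh CPU, memory, and IO capacity together.

def defragment(vm_demands, host_capacity):
    """Repack VM demands onto as few hosts as possible (greedy FFD)."""
    hosts = []       # remaining free capacity per host
    placement = {}   # vm name -> host index
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(hosts):
            if demand <= free:          # reuse fragmented free capacity
                hosts[i] -= demand
                placement[vm] = i
                break
        else:                           # no host has room: claim a new one
            hosts.append(host_capacity - demand)
            placement[vm] = len(hosts) - 1
    return placement, hosts

placement, free = defragment({"db": 6, "web1": 3, "web2": 3, "batch": 4},
                             host_capacity=8)
print(placement)  # {'db': 0, 'batch': 1, 'web1': 1, 'web2': 2}
print(free)       # [2, 1, 5] -- leftover capacity per host
```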

Scale Computing’s hyperconverged system matches the needs of the SMB and mid-market
Scale Computing HC3 is cost effective, scalable and designed for installation and management by the IT generalist
Everyone has heard the buzz these days about hyper-converged systems – appliances with compute, storage and virtualization infrastructures built in. Hyper-converged infrastructure systems extend infrastructure convergence – the combination of compute, storage and networking resources in one compact box – and promise simplification by consolidating resources onto a commodity x86 server platform.
Why Parallel I/O & Moore's Law Enable Virtualization and SDDC to Achieve their Potential
Today’s demanding applications, especially within virtualized environments, require high performance from storage to keep up with the rate of data acquisition and unpredictable demands of enterprise workloads. In a world that requires near instant response times and increasingly faster access to data, the needs of business-critical tier 1 enterprise applications, such as databases including SQL, Oracle and SAP, have been largely unmet.

The major bottleneck holding back the industry is I/O performance. Current systems still rely on device-level optimizations tied to specific disk and flash technologies because they lack software optimizations that can fully harness the latest advances in more powerful server technologies, such as multicore architectures. As a result, they have not been able to keep pace with Moore’s Law.

Waiting on IO: The Straw That Broke Virtualization’s Back
In this paper, we will discuss DataCore’s underlying parallel architecture, how it evolved over the years and how it results in a markedly different way to address the craving for IOPS (input/output operations per second) in a software-defined world.
Despite the increasing horsepower of modern multi-core processors and the promise of virtualization, we’re seeing relatively little progress in the amount of concurrent work they accomplish. That’s why we’re having to buy a lot more virtualized servers than we expected.

On closer examination, we find the root cause to be IO-starved virtual machines (VMs), especially for heavy online transactional processing (OLTP) apps, databases and mainstream IO-intensive workloads. Plenty of compute power is at their disposal, but servers have a tough time fielding inputs and outputs. This gives rise to an odd phenomenon of stalled virtualized apps while many processor cores remain idle.

So how exactly do we crank up IOs to keep up with the computational appetite while shaving costs? This can best be achieved by parallel IO technology designed to process IO across many cores simultaneously, thereby putting those idle CPUs to work. Such technology has been developed by DataCore Software, a long-time master of parallelism in the field of storage virtualization.
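The following sketch illustrates the general parallel-IO idea: fanning independent IO requests out across a pool of workers so that otherwise-idle cores help service them. It is a generic illustration in Python, not DataCore's implementation.

```python
# A generic illustration of the parallel-IO idea: fan independent read
# requests out across a pool of workers so idle cores help service them.
# This is a schematic sketch, not DataCore's implementation.

import concurrent.futures
import os

def read_block(path, offset, size=4096):
    """Service one IO request: read `size` bytes at `offset`."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)

def serve_parallel(path, offsets, workers=os.cpu_count()):
    """Dispatch many independent reads concurrently instead of serially."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(read_block, path, off) for off in offsets]
        return [f.result() for f in futures]

# A serial loop handles one request at a time while cores sit idle;
# serve_parallel keeps many requests in flight, so throughput scales with
# available cores until the device itself saturates.
```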

VMware Data Replication Done Right
Until now the most common data replication technologies and methods essential to mission-critical BC/DR initiatives have been tied to the physical environment. Although they do work in the virtual environment, they aren’t optimized for it.

With the introduction of hypervisor-based replication, Zerto elevates BC/DR up the infrastructure stack where it belongs: in the virtualization layer.

Challenges:

  • If a data replication solution isn’t virtual-ready, management overhead could be more than doubled.
  • Customer data is always growing, so a company can find its information inventory expanding exponentially without a data replication solution that can keep pace.
  • Some replication methods remain firmly tied to a single vendor and hardware platform, limiting the organization’s ability to get the best solutions – and service – at the best price.

Benefits of Hypervisor-Based Replication:

  • Granularity - The ability to replicate any virtual entity at the correct level is critical. Zerto’s solution can replicate all virtual machines along with all of their metadata.
  • Scalability - Zerto’s hypervisor-based replication solution is software-based so it can be deployed and managed easily, no matter how fast the infrastructure expands.
  • Hardware-agnostic - Zerto’s data replication is hardware-agnostic, supporting all storage arrays, so organizations can replicate from anything to anything. This allows users to mix storage technologies such as SAN & NAS, and virtual disk types such as RDM & VMFS.
Introducing Cloud Disaster Recovery
Can mission-critical apps really be protected in the cloud?

Today, enterprises of all sizes are virtualizing their mission-critical applications, either within their own data center, or with an external cloud vendor. One key driver is to leverage the flexibility and agility virtualization offers to increase availability, business continuity and disaster recovery.

With the cloud becoming more of an option, enterprises of all sizes are looking to the cloud, be it public, hybrid or private, to become part of their BC/DR solution. However, these options do not always exist. Virtualization has created the opportunity, but there is still a significant technology gap: mission-critical applications can be effectively virtualized and managed, but the corresponding data cannot be effectively protected in a cloud environment.

Additional Challenges for Enterprises with the Cloud:

  • Multi-tenancy
  • Data protection & mobility
  • Lack of centralized management

Solutions with Zerto Virtual Replication:

  • Seamless integration with no environment change
  • Multi-site support
  • Hardware-agnostic replications
The 2016 State of Storage in Virtualization
ActualTech Media and Tegile recently teamed up to get a deeper understanding of what’s happening at the intersection of virtualization and storage. ActualTech polled over 1,000 IT professionals to learn about the top challenges they’re facing within their organization, and how they plan to use solutions like flash storage, cloud storage, and VMware Virtual Volumes (VVols) to solve those challenges.
Highlights from the survey:

  • Whether it's SQL Server, Oracle, or Big Data, organizations are increasingly virtualizing their most mission-critical workloads.
  • Although all-flash storage captures the headlines, only 3% of respondents say they run all flash. Nearly two-thirds of respondents use hybrid storage systems.
  • Just under one-quarter of flash users are using PCIe-based flash storage cards in their systems.
  • Fibre Channel remains the protocol of choice for virtual environments, while iSCSI comes in second.
  • 55% of respondents say they know little about VVols. Only 5% consider themselves well versed in the VMware technology.
  • 30% of respondents use some form of cloud storage.