White Papers Search Results
Showing 33 - 48 of 57 white papers, page 3 of 4.
The SysAdmin Guide to Azure Infrastructure as a Service
If you're used to on-premises infrastructure, cloud platforms can seem daunting. But they don't need to be. This eBook, written by veteran IT consultant and trainer Paul Schnackenburg, covers all aspects of setting up and maintaining a high-performing Azure IaaS environment, including: • VM sizing and deployment • Migration • Storage and networking • Security and identity • Infrastructure as code and more!

The cloud computing era is well and truly upon us, and knowing how to take advantage of this computing paradigm while maintaining security, manageability, and cost control is a vital skill for any IT professional in 2020 and beyond. And its importance is only growing.

In this eBook, we're going to focus on Infrastructure as a Service (IaaS) on Microsoft's Azure platform: learning how to create VMs, size them correctly, and manage storage, networking, and security, along with backup best practices. You'll also learn how to operate groups of VMs, deploy resources based on templates, manage security, and automate your infrastructure. And if you currently have VMs in your own datacenter and are looking to migrate to Azure, we'll show you how to do that, too.
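For a flavor of the automation topics covered, here is a minimal sketch (our illustration, not an excerpt from the eBook) that uses the Azure SDK for Python to enumerate the VMs in a subscription; the subscription ID is a placeholder you would substitute yourself:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    # Authenticates via environment variables, a managed identity, or an Azure CLI login.
    credential = DefaultAzureCredential()

    # Placeholder subscription ID; substitute your own.
    compute = ComputeManagementClient(credential, "00000000-0000-0000-0000-000000000000")

    # List every VM in the subscription with its size, a first step toward right-sizing.
    for vm in compute.virtual_machines.list_all():
        print(f"{vm.name}: {vm.hardware_profile.vm_size}")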

If you’re new to the cloud (or have experience with AWS/GCP but not Azure), this book will cover the basics as well as more advanced skills. Given how fast things change in the cloud, we’ll cover the why (as well as the how) so that as features and interfaces are updated, you’ll have the theoretical knowledge to effectively adapt and know how to proceed.

You'll benefit most from this book if you actively follow along with the tutorials. We'll introduce terms and definitions as we go; learning by doing has always been my preferred way to learn. If you don't have access to an Azure subscription, you can sign up for a free trial with Microsoft. This gives you 30 days to use $200 USD worth of Azure resources, along with 12 months of free resources. Note that most of these "12 months" services aren't related to IaaS VMs (apart from a few SSD-based virtual disks and a small VM that you can run for 750 hours a month, enough to keep one VM running around the clock, since a month has at most 744 hours), so be sure to get everything covered on the IaaS side before your trial expires. Another 25 services have free tiers "forever".

Now you know what’s in store, let’s get started!

Why Should Enterprises Move to a True Composable Infrastructure Solution?
IT infrastructure needs are constantly fluctuating in a world where powerful emerging software applications such as artificial intelligence can create, transform, and remodel markets in a few months or even weeks. While the public cloud is a flexible solution, it doesn't solve every data center need—especially when businesses need to physically control their data on premises. Download this report to see how composable infrastructure helps you deploy faster, effectively utilize existing hardware, rein in capital expenses, and more.

IT infrastructure needs are constantly fluctuating in a world where powerful emerging software applications such as artificial intelligence can create, transform, and remodel markets in a few months or even weeks. While the public cloud is a flexible solution, it doesn't solve every data center need—especially when businesses need to physically control their data on premises. This leads to overspend—purchasing servers and equipment to meet peak demand at all times. The result? Expensive equipment sitting idle during non-peak times.

For years, companies have wrestled with overspend and underutilization of equipment, but now businesses can reduce cap-ex and rein in operational expenditures for underused hardware with software-defined composable infrastructure. With a true composable infrastructure solution, businesses realize optimal performance of IT resources while improving business agility. In addition, composable infrastructure allows organizations to take better advantage of the most data-intensive applications on existing hardware while preparing for future, disaggregated growth.

Download this report to see how composable infrastructure helps you deploy faster, effectively utilize existing hardware, rein in capital expenses, and more.

Evaluator Group Report on Liqid Composable Infrastructure
In this report from Eric Slack, Senior Analyst at the Evaluator Group, learn how Liqid’s software-defined platform delivers comprehensive, multi-fabric composable infrastructure for the industry’s widest array of data center resources.
Composable infrastructure direct-connects compute and storage resources dynamically—using virtualized networking techniques controlled by software. Instead of physically constructing a server with specific internal devices (storage, NICs, GPUs or FPGAs), or cabling the appropriate device chassis to a server, composable infrastructure enables the virtual connection of these resources at the device level as needed, when needed.
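To make the composition idea concrete, here is a hypothetical Python sketch (all names invented for illustration; this is not Liqid's API) of a shared device pool from which a logical server is composed and to which devices are released when the workload ends:

    from dataclasses import dataclass, field

    @dataclass
    class Device:
        kind: str    # e.g. "GPU", "NVMe", "NIC", "FPGA"
        ident: str

    @dataclass
    class ResourcePool:
        """A shared pool of device-level resources on the fabric."""
        free: list = field(default_factory=list)

        def allocate(self, kind: str) -> Device:
            """Virtually attach a free device of the requested kind."""
            for dev in self.free:
                if dev.kind == kind:
                    self.free.remove(dev)
                    return dev
            raise RuntimeError(f"no free {kind} in the pool")

        def release(self, dev: Device) -> None:
            """Return a device to the pool for the next workload."""
            self.free.append(dev)

    # Compose a logical server by attaching devices as needed, when needed,
    # instead of physically cabling them to one chassis.
    pool = ResourcePool([Device("GPU", "gpu0"), Device("GPU", "gpu1"), Device("NVMe", "ssd0")])
    server = [pool.allocate("GPU"), pool.allocate("NVMe")]
    for dev in server:
        pool.release(dev)    # workload done; the hardware is reusable elsewhere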

Download this report from Eric Slack, Senior Analyst at the Evaluator Group to learn how Liqid’s software-defined platform delivers comprehensive, multi-fabric composable infrastructure for the industry’s widest array of data center resources.
Make the Move: Linux Desktops with Cloud Access Software
Gone are the days when hosting Linux desktops on-premises was the only way to ensure uncompromised customization, choice and control. You can host Linux desktops & applications remotely and virtualize them to further security, flexibility and performance. Learn why IT teams are virtualizing Linux.

Make the Move: Linux Remote Desktops Made Easy

Securely run Linux applications and desktops from the cloud or your data center.

Download this guide and learn...

  • Why organizations are virtualizing Linux desktops & applications
  • How different industries are leveraging remote Linux desktops & applications
  • What your organization can do to begin this journey


ESG - DataCore vFilO: Visibility and Control of Unstructured Data for the Modern, Digital Business
Organizations that want to succeed in the digital economy must contend with the cost and complexity introduced by the conventional segregation of multiple file system silos and separate object storage repositories. Fortunately, they can look to DataCore vFilO software for help. DataCore employs innovative techniques to combine diverse unstructured data resources to achieve unprecedented visibility, control, and flexibility.
DataCore’s new vFilO software shares important traits with its existing SANsymphony software-defined block storage platform. Both technologies are certainly enterprise class (highly agile, available, and performant). But each solution exhibits those traits in its own manner, taking the varying requirements for block, file, and object data into account. That’s important at a time when a lot of companies are maintaining hundreds to thousands of terabytes of unstructured data spread across many file servers, other NAS devices, and object storage repositories both onsite and in the cloud.

The addition of vFilO to its product portfolio will allow DataCore to position itself in a different, even more compelling way now. DataCore is able to offer a “one-two punch”—namely, one of the best block storage SDS solutions in SANsymphony, and now one of the best next-generation SDS solutions for file and object data in vFilO. Together, vFilO and SANsymphony will put DataCore in a really strong position to support any IT organization looking for better ways to overcome end-users’ file-sharing/access difficulties, keep hardware costs low … and maximize the value of corporate data to achieve success in a digital age.
ESG Showcase - DataCore vFilO: NAS Consolidation Means Freedom from Data Silos
File and object data are valuable tools that help organizations gain market insights, improve operations, and fuel revenue growth. However, success in utilizing all of that data depends on consolidating data silos. Replacing an existing infrastructure is often expensive and impractical, but DataCore vFilO software offers an intelligent, powerful option—an alternative, economically appealing way to consolidate and abstract existing storage into a single, efficient, capable ecosystem of readily-searchable data.

Companies have NAS systems all over the place—hardware-centric devices that make data difficult to migrate and leverage to support the business. It’s natural that companies would desire to consolidate those systems, and vFilO is a technology that could prove to be quite useful as an assimilation tool. Best of all, there’s no need to replace everything. A business can modernize its IT environment and finally achieve a unified view, plus gain more control and efficiency via the new “data layer” sitting on top of the hardware. When those old silos finally disappear, employees will discover they can find whatever information they need by examining and searching what appears to be one big catalog for a large pool of resources.

And for IT, the capacity-balancing capability should have especially strong appeal. With it, file and object data can shuffle around and be balanced for efficiency without IT or anyone needing to deal with silos. Today, too many organizations still perform capacity balancing work manually—putting some files on a different NAS system because the first one started running out of room. It’s time for those days to end. DataCore, with its 20-year history offering SANsymphony, is a vendor in a great position to deliver this new type of solution, one that essentially virtualizes NAS and object systems and even includes keyword search capabilities to help companies use their data to become stronger, more competitive, and more profitable.

ESG Showcase - Time to Better Leverage Your File and Object Data
The need to handle unstructured data according to business relevance is becoming urgent. Modern data-related demands have begun surpassing what traditional file and object storage architectures can achieve. Businesses today need an unstructured data storage environment that lets end-users easily and flexibly access the benefits of the file- and object-based information they need to do their jobs.
It’s time for a “Nirvana” between file and object: The marketplace has long been debating which of the two unstructured data type models is the “right one.” But data is data. If you remove the traditional constraints and challenges associated with those two particular technologies, it becomes possible to imagine a type of data service that supports both. That kind of solution would give an organization the benefits of a global multi-site namespace (i.e., object style), including the ability to address a piece of data regardless of where it is in a folder hierarchy or what device it’s stored on. It would offer the rich metadata that object storage is known for, and it would also deliver the performance and ease of use that file systems are known for.

DataCore says it is providing such a solution, called vFilO. To vFilO, data is just data: metadata is decoupled from, but tightly aligned with, the data, and data’s location is independent of data access needs or metadata location. It is a next-generation system built on a different paradigm, one that bridges the worlds of file and object with the benefits of both and the limitations of neither. It just might be the answer IT departments need. No longer required to pick between two separate systems, IT organizations can find great value in a consolidated, global, unified storage system that can deliver on the business’s needs for unstructured data.
The Time is Now for File Virtualization
DataCore’s vFilO is a distributed file and object storage virtualization solution that can consume storage from a variety of providers, including NFS or SMB file servers, most NAS systems, and S3 object storage systems, including S3-based public cloud providers. Once vFilO integrates these various storage systems into its environment, it presents users with a logical file system and abstracts it from the actual physical location of data.

DataCore vFilO is a top-tier file virtualization solution. Not only can it serve as a global file system, but IT can also add new NAS systems or file servers to the environment without having to remap users to the new hardware. vFilO supports live migration of data between the storage systems it has assimilated, and it leverages the capabilities of the global file system and the software's policy-driven data management to move older data to less expensive storage automatically, whether high-capacity NAS or an object storage system. vFilO also transparently moves data from NFS/SMB to object storage. If users need access to this data in the future, they access it like they always have; to them, the data has not moved.
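As a rough illustration of what this kind of age-based, policy-driven placement looks like in general (a toy Python sketch of the technique, not DataCore's implementation), consider:

    import os
    import shutil
    import time

    AGE_THRESHOLD = 90 * 24 * 3600   # seconds: files idle for 90 days are demoted

    def tier_old_files(fast_tier: str, cheap_tier: str) -> None:
        """Move files that have not been accessed recently from fast storage to a
        cheaper tier. A file virtualization layer would keep the logical path
        unchanged so users never notice; this sketch shows only the policy step."""
        now = time.time()
        for name in os.listdir(fast_tier):
            src = os.path.join(fast_tier, name)
            if os.path.isfile(src) and now - os.path.getatime(src) > AGE_THRESHOLD:
                shutil.move(src, os.path.join(cheap_tier, name))

    tier_old_files("/mnt/fast-nas", "/mnt/capacity-nas")   # hypothetical mount points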

The ROI of file virtualization is powerful, but the technology has struggled to gain adoption in the data center: file virtualization needs to be explained, and explaining it takes time. vFilO more than meets the requirements to qualify as a top-tier file virtualization solution, and DataCore has the advantage of over 10,000 customers that are much more likely to be receptive to the concept, since they have already embraced block storage virtualization with SANsymphony. Building on its customer base as a beachhead, DataCore can then expand file virtualization's reach to new customers who, because of the changing state of unstructured data, may finally be receptive to the concept. At the same time, these new file virtualization customers may be amenable to virtualizing block storage, which may open new doors for SANsymphony.

IDC: DataCore SDS: Enabling Speed of Business and Innovation with Next-Gen Building Blocks
DataCore solutions include and combine block, file, object, and HCI software offerings that enable the creation of a unified storage system which integrates functionalities such as data protection, replication, and storage/device management to eliminate complexity. They also converge primary and secondary storage environments to give a unified view, predictive analytics, and actionable insights. DataCore's newly engineered SDS architecture makes it a key player in the modern SDS solutions space.
The enterprise IT infrastructure market is undergoing a once-in-a-generation change due to ongoing digital transformation initiatives and the onslaught of applications and data. The need for speed, agility, and efficiency is pushing demand for modern datacenter technologies that can lower costs while providing new levels of scale, quality, and operational efficiency. This has driven strong demand for next-generation solutions such as software-defined storage/networking/compute, public cloud infrastructure as a service, flash-based storage systems, and hyperconverged infrastructure. Each of these solutions offers enterprise IT departments a way to rethink how they deploy, manage, consume, and refresh IT infrastructure. These solutions represent modern infrastructure that can deliver the performance and agility required for both existing virtualized workloads and next-generation applications — applications that are cloud-native, highly dynamic, and built using containers and microservices architectures.

As we enter the next phase of datacenter modernization, businesses need to leverage newer capabilities enabled by software-defined storage that help them eliminate management complexities, overcome data fragmentation and growth challenges, and become data-driven organizations to propel innovation. As enterprises embark on their core datacenter modernization initiatives with compelling technologies, they should evaluate enterprise-grade solutions that redefine storage and data architectures designed for the demands of the digital-native economy.

Digital transformation is a technology-based business strategy that is becoming increasingly imperative for success. However, unless infrastructure provisioning evolves to suit new application requirements, IT will not be viewed as a business enabler. IDC believes that those organizations that do not leverage proven technologies such as SDS to evolve their datacenters truly risk losing their competitive edge.
IDC: SaaS Backup and Recovery: Simplified Data Protection Without Compromise
Although the majority of organizations have a "cloud first" strategy, most also continue to manage onsite applications and the backup infrastructure associated with them. However, many are moving away from backup specialists and instead are leaving the task to virtual infrastructure administrators or other IT generalists. Metallic represents Commvault's direct entry into one of the fastest-growing segments of the data protection market. Its hallmarks are simplicity and flexibility of deployment.

Metallic is a new SaaS backup and recovery solution based on Commvault's data protection software suite, proven in the marketplace for more than 20 years. It is designed specifically for the needs of medium-scale enterprises but is architected to grow with them based on data growth, user growth, or other requirements. Metallic initially offers either monthly or annual subscriptions through reseller partners; it will be available through cloud service providers and managed service providers over time. The initial workload use cases for Metallic include virtual machine (VM), SQL Server, file server, MS Office 365, and endpoint device recovery support; the company expects to add more use cases and supported workloads as the solution evolves.

Metallic is designed to offer flexibility as one of the service's hallmarks. Aspects of this include:

  • On-demand infrastructure: Metallic manages the cloud-based infrastructure components and software for the backup environment, though the customer will still manage any of its own on-premise infrastructure. This environment will support on-premise, cloud, and hybrid workloads. IT organizations are relieved of the daily task of managing the infrastructure components and do not have to worry about upgrades, OS or firmware updates, and the like for the cloud infrastructure, freeing that time for other activities.
  • Preconfigured plans: Metallic offers preconfigured plans designed to have users up and running in approximately 15 minutes, eliminating the need for a proof-of-concept test. These preconfigured systems have Commvault best practices built into the design, or organizations can configure their own.
  • Partner-delivered services: Metallic plans to go to market with resellers that can offer a range of services on top of the basic solution's capabilities. These services will vary by provider and will give users a variety of choices when selecting a provider to match the services offered with the organization's needs.
  • "Bring your own storage": Among the flexible options of Metallic, including VM and file or SQL database use cases, users can deploy their own storage, either on-premise or in the cloud, while utilizing the backup/recovery services of Metallic. The company refers to this option as "SaaS Plus."
Confronting modern stealth
How did we go from train robberies to complex, multi-billion-dollar cybercrimes? The escalation in the sophistication of cybercriminal techniques, which overcome traditional cybersecurity and wreak havoc without leaving a trace, is dizzying. Explore the methods of defense created to counter these evasive attacks, then find out how Kaspersky's sandboxing, endpoint detection and response, and endpoint protection technologies can keep you secure—even if you lack the resources or talent.
Explore the dizzying escalation in the sophistication of cybercriminal techniques, which overcome traditional cybersecurity and wreak havoc without leaving a trace. Then discover the methods of defense created to stop these evasive attacks.

Problem:
Fileless threats challenge businesses with traditional endpoint solutions because there is no specific file to target. They might be stored in WMI subscriptions or the registry, or execute directly in memory without ever being saved to disk. These types of attacks are ten times more likely to succeed than file-based attacks.

Solution:
Kaspersky Endpoint Security for Business goes beyond file analysis to analyze behavior in your environment. While its behavioral detection technology runs continuous proactive machine learning processes, its exploit prevention technology blocks attempts by malware to exploit software vulnerabilities.
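To picture one of the hiding places mentioned above, here is a toy Python sketch (our illustration, unrelated to Kaspersky's products) that lists Windows autorun registry entries, one spot where fileless payloads are often staged:

    import winreg  # standard library, Windows only

    # Autorun keys are a classic persistence point for fileless attacks,
    # e.g. a value that launches an encoded PowerShell command at logon.
    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

    def autorun_entries():
        """Yield (name, command) pairs from the HKLM Run key."""
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, RUN_KEY) as key:
            index = 0
            while True:
                try:
                    name, value, _type = winreg.EnumValue(key, index)
                except OSError:      # no more values under this key
                    return
                yield name, str(value)
                index += 1

    for name, command in autorun_entries():
        print(f"{name} -> {command}")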

Problem:
The talent shortage is real. While cybercriminals are continuously adding to their skillsets, businesses either can't afford cybersecurity experts or have trouble recruiting and retaining them.

Solution:
Kaspersky Sandbox acts as a bridge between overwhelmed IT teams and industry-leading security analysis. It relieves pressure on IT by automatically blocking complex threats at the workstation level so that they can later be analyzed and dealt with properly.


Problem:
Advanced Persistent Threats (APTs) expand laterally from device to device and can put an organization in a constant state of attack.

Solution:
Endpoint Detection and Response (EDR) stops APTs in their tracks with a range of very specific capabilities, which can be grouped into two categories: visibility (visualizing all endpoints, context and intel) and analysis (analyzing multiple verdicts as a single incident).
    
Attack the latest threats with a holistic approach including tightly integrated solutions like Kaspersky Endpoint Detection and Response and Kaspersky Sandbox, which integrate seamlessly with Kaspersky Endpoint Protection for Business.
Active Directory: Best Practices for Administration and Backup
3 Ultimate Strategies for Ransomware Prevention
Let’s explore all the options that exist to prevent, locate, disarm and mitigate ransomware risks.

Ransomware is here to stay. As much as no one likes to hear that, we have to admit that it's no longer a question of 'if' but rather of 'when' and 'to what extent' your business might be affected.

Rather than hope it won’t get you, let’s explore all the options that exist to prevent, locate, disarm and mitigate ransomware risks.

We'll go over some essential strategies you can implement today and also review what comprehensive backup software can offer you, such as:

•    Simple things to implement right away: the 3-2-1 rule, air-gapped backups and access policies (see the sketch after this list)
•    Ransomware myths: what really should be taken seriously?
•    What today’s backup technologies can offer: in‑flight malware check, immutable backups
•    And more!
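As a taste of the first bullet, here is a minimal Python sketch (ours, not the white paper's) that checks a set of backup copies against the 3-2-1 rule, i.e., at least three copies on at least two different media types with at least one copy off-site:

    from dataclasses import dataclass

    @dataclass
    class BackupCopy:
        media: str       # e.g. "disk", "tape", "object-storage"
        offsite: bool

    def satisfies_3_2_1(copies: list) -> bool:
        """True when the copies meet the 3-2-1 rule: at least 3 copies,
        on at least 2 distinct media types, with at least 1 copy off-site."""
        return (len(copies) >= 3
                and len({c.media for c in copies}) >= 2
                and any(c.offsite for c in copies))

    print(satisfies_3_2_1([
        BackupCopy("disk", offsite=False),           # the production copy
        BackupCopy("tape", offsite=False),           # local tape backup
        BackupCopy("object-storage", offsite=True),  # off-site cloud copy
    ]))  # True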

10 Best Practices for VM Backup
Topics: VM, backup, Veeam
Proper protection for critical applications and implementing off‑site solutions are essential to keeping VM data available while minimizing data loss. Every business should have a backup strategy to guarantee virtual machines are properly protected and recoverable.

When backing up virtual machines, you should consider the applications running on the machine, how critical the machine is for business operations, and how fast you can get back up and running when a disaster occurs. Proper protection for critical applications and implementing off‑site solutions are essential to keeping VM data available while minimizing data loss. Every business should have a backup strategy to guarantee virtual machines are properly protected and recoverable.

Join this webinar to learn more about:

•    Factors affecting VM backups
•    Off‑site data protection strategies
•    Ensuring data is recoverable and secure

Top 10 Best Practices for VMware Backups
Topics: vSphere, backup, Veeam
Backup is the foundation for restores, so it is essential to have backups always available with the required speed. The “Top 10 Best Practices for vSphere Backups” white paper discusses best practices with Veeam Backup & Replication and VMware vSphere.

More and more companies have come to understand that server virtualization is the way to modern data safety. In 2019, VMware is still the market leader, and many Veeam customers use VMware vSphere as their preferred virtualization platform. But backup of virtual machines on vSphere is only one part of service Availability. Backup is the foundation for restores, so it is essential to have backups always available with the required speed. The "Top 10 Best Practices for vSphere Backups" white paper discusses best practices with Veeam Backup & Replication and VMware vSphere, such as:

•    Planning your data restore in advance
•    Keeping your backup software and backup tools up to date
•    Integrating storage-based snapshots into your Availability concept
•    And much more!

10 tips to make your daily backup easier
Topics: VM, backup, Veeam
Dive deeper into this educational technical session, where we showcase particular features that help backup admins get through their day-to-day routine with ease!

There are a lot of game-changing features in every release of Veeam Backup & Replication, whether it be native object storage integration, SAP HANA support or built-in backup testing. However, besides these obvious heavy-lifters, there are many other features that are specifically designed and constantly improved to make a backup administrator’s daily life easier.

Let's see if any of these sound familiar:

•    How do I create a quick, on-demand restore point without disrupting existing backup policies?
•    How do I archive and export just one single VM?
•    How do I exclude deleted file blocks from my backups?

If these questions are on your mind — great news! All that and more is covered by Veeam!
Dive deeper into this educational technical session, where we showcase particular features that help backup admins get through their day-to-day routine with ease!
