A2U, an IGEL Platinum Partner, recently experienced a situation where one of its large, regional healthcare clients was hit by a cyberattack. “Essentially, malware entered the client’s network via a computer and began replicating like wildfire,” recalls A2U Vice President of Sales, Robert Hammond.
During the cyberattack, a few hundred of the hospital’s PCs were affected. Among those were 30 endpoints within the finance department that the healthcare organization deemed mission critical due to the volume of daily transactions between patients, insurance companies, and state and county agencies for services rendered. “It was very painful from a business standpoint not to be able to conduct billing and receiving, not to mention payroll,” said Hammond.
Prior to this particular incident, A2U had received demo units of the IGEL UD Pocket, a revolutionary micro thin client that can transform x86-compatible PCs and laptops into IGEL OS-powered desktops.
“We had been having a discussion with this client about re-imaging their PCs, but their primary concern was maintaining the integrity of the data that was already on the hardware,” continued Hammond. “HIPAA and other regulations meant that they needed to preserve the data and keep it secure, and we thought that the IGEL UD Pocket could be the answer to this problem. We didn’t see why it wouldn’t work, but we needed to test our theory.”
When the malware attack hit, that opportunity came sooner rather than later for A2U. “We plugged the UD Pocket into one of the affected machines and were able to bypass the local hard drive, installing the Linux-based IGEL OS on the system without impacting existing data,” said Hammond. “It was like we had created a ‘Linux bubble’ that protected the machine, yet created an environment that allowed end users to quickly return to productivity.”
Working with the hospital’s IT team, it only took a few hours for A2U to get the entire finance department back online. “They were able to start billing the very next day,” added Hammond.
The primary goal of a multi-cloud data management strategy is to supply data, by copying or moving it, to the various multi-cloud use cases. A key enabler of this movement is data management software. In theory, data protection applications can perform both the copy and the move functions. A key consideration is how the multi-cloud data management experience is unified. In most cases, data protection applications ignore the native user experience of each cloud and use their own proprietary interface as the unifying entity, which increases complexity.
There are a variety of reasons organizations may want to leverage multiple clouds. The first use case is to use public cloud storage as a backup mirror to an on-premises data protection process. Using public cloud storage as a backup mirror enables the organization to automatically maintain an off-site copy of its data. It also sets up many of the more advanced use cases.
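The backup-mirror use case can be sketched in a few lines of Python. This is a minimal, vendor-neutral illustration that treats the off-site target as a second directory tree; a real deployment would write to a cloud object store through its SDK, and the function and path names here are assumptions for the example:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file so unchanged backups are not re-copied."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def mirror_backups(local: Path, offsite: Path) -> list[str]:
    """Copy any new or changed backup file to the off-site target.

    Returns the relative paths that were copied this run.
    """
    copied = []
    for src in local.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(local)
        dst = offsite / rel
        # Copy only when the off-site copy is missing or differs.
        if not dst.exists() or sha256(src) != sha256(dst):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(str(rel))
    return copied
```

Run on a schedule, a loop like this automatically keeps the off-site mirror current while skipping files that have not changed since the last pass.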
Another use case is using the cloud for disaster recovery: if the on-premises data center becomes unavailable, the organization can recover its protected data and run workloads on cloud infrastructure until the primary site is restored.
Another use case is “lift and shift,” in which the organization runs the application natively in the cloud. The initial steps are similar to a Dev/Test use case, but now the workload is storing unique data in the cloud.
Multi-cloud is now a reality for most organizations, and managing the movement of data between these clouds is critical.
DevOps: An unsuspecting target for the world’s most sophisticated cybercriminals
DevOps focuses on automated pipelines that help organizations improve business-impacting KPIs like time-to-market, product development speed, agility and more. In a world where less time means more money, putting code into production the same day it’s written is, well, a game changer. But with new opportunities come new challenges. Automated building of software that’s distributed by vendors straight into corporations worldwide leaves cybercriminals salivating over costly supply chain attacks.
So how does one combat supply chain attacks?
Many supply chain attacks can be prevented by deploying security to development infrastructure servers, routinely vetting containers, and running anti-malware tests on production artifacts. The problem is that traditional security products lack integration options, wasting time through fragmented automation, overcomplicated processes and limited visibility, all of which are taboo in DevOps environments.
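As a concrete example of vetting production artifacts, a pipeline gate can verify each artifact’s checksum against the digest recorded at build time, so tampering between build and release is caught automatically. This is a minimal sketch; the manifest structure and names are illustrative assumptions, not any particular vendor’s scheme:

```python
import hashlib
from pathlib import Path

def sha256_file(path: Path) -> str:
    """Digest a file in chunks so large artifacts stay memory-friendly."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_artifacts(manifest: dict[str, str], build_dir: Path) -> list[str]:
    """Compare each artifact against the digest recorded at build time.

    Returns the artifacts that are missing or do not match, i.e. the
    ones a release gate should refuse to ship.
    """
    failures = []
    for name, expected in manifest.items():
        artifact = build_dir / name
        if not artifact.is_file() or sha256_file(artifact) != expected:
            failures.append(name)
    return failures
```

Because the check is a single function call, it integrates into an automated pipeline through a short script rather than a manual review step.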
Cybercriminals exploit fundamental differences between the operational goals of those who maintain the development environment and those who work within it. That’s why it’s important to show unity and focus on a single strategic goal: delivering a safe product to partners and customers on time.

The protection-performance balance
A strong security foundation is crucial to stopping threats, but it won’t come from a single silver bullet. It takes the right multi-layered combination to deliver the right DevOps security-performance balance, bringing you closer to where you want to be.
Protect your automated pipeline using endpoint protection that’s fully effective in pre-filtering incidents before EDR (endpoint detection and response) comes into play. After all, the earlier threats can be countered automatically, the less the impact on resources. It’s important to focus on protection that’s powerful, accessible through an intuitive and well-documented interface, and easily integrated through scripts.
Protecting your data and ensuring its availability is one of your top priorities. Like a medieval castle, it is under attack from external and internal sources, you never know when or where the next assault will come from, and it must always be defended with built-in defense mechanisms. The prevalence of ransomware, along with the sharp increase in users working from home and on any device, adds further complexity and broadens the attack surface available to bad actors. So much so that your organization being hit with ransomware is almost unavoidable. While preventing attacks is important, you also need to prepare for the inevitable fallout of a ransomware incident.
Here are just a few data points from recent research around ransomware:
• Global ransomware damage costs are predicted to reach $20 billion (USD) by 2021.
• Ransomware is expected to attack a business every 11 seconds by the end of 2021.
• 75% of the world’s population (6 billion people) will be online by 2022.
• Phishing scams account for 90% of attacks.
• 55% of small businesses pay hackers the ransom.
• Ransomware costs are predicted to be 57x higher over a span of six years by 2021.
• New ransomware strains destroy backups, steal credentials, publicly expose victims, leak stolen data, and some even threaten the victims’ customers.
So how do you prepare? By making sure you’re recovery ready with a layered approach to securing your data. Two proven techniques for reducing the attack surface on your data are data isolation and air gapping. Hitachi Vantara and Commvault deliver this kind of protection with the combination of Hitachi Data Protection Suite (HDPS) and Hitachi Content Platform (HCP) which includes several layers and tools to protect and restore your data and applications from the edge of your business to the core data centers.
Managing the performance of Windows-based workloads can be a challenge. Whether on physical PCs or virtual desktops, the effort required to maintain, tune and optimize workspaces is endless. Operating system and application revisions, user-installed applications, security and bug patches, BIOS and driver updates, spyware, and multi-user operating systems all supply a continual flow of change that can disrupt expected performance. Add in the complexities introduced by virtual desktops and cloud architectures, and you have yet another endless source of performance instability. Keeping up with this churn, while meeting users’ zero tolerance for failures, is a chief worry for administrators.
To help address the need for uniform performance and optimization in light of constant change, Liquidware introduced the Process Optimization feature in its Stratusphere UX solution. The feature can automatically optimize CPU and memory even as system demands fluctuate, and it keeps “bad actor” applications or runaway processes from crippling the performance of users’ workspaces by prioritizing resources for actively used applications over idle or background processes.

Process Optimization requires no additional infrastructure. It is a simple, zero-impact feature included with Stratusphere UX that can be turned on for single machines, for groups, or globally. Launched with the check of a box, it can run from pre-built profiles that operate automatically, or administrators can manually specify the processes to raise, lower or terminate if that becomes necessary. This is a major benefit in hybrid, multi-platform environments that include physical, pool- or image-based virtual, and cloud workspaces, which are much more complex than single-delivery systems.

The Process Optimization feature was designed with security and reliability in mind. By default, it employs a “do no harm” provision affecting only normal and lower process priorities, under a relaxed policy. No processes are forced when the system denies access, ensuring that the system remains stable and in line with requirements.
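As a rough illustration of the prioritization idea (demoting idle or background work so foreground applications keep resources), here is a sketch using the POSIX nice mechanism. It is a simplified stand-in under assumed rules, not Liquidware’s actual implementation, and in that “do no harm” spirit it only ever lowers priority and skips processes it cannot touch:

```python
import os

# Clamp to normal-and-lower priorities only
# (a larger nice value means a lower priority).
MIN_NICE, MAX_NICE = 0, 19

def demote_background(pids: list[int], nice_value: int = 10) -> list[int]:
    """Lower the scheduling priority of background process IDs.

    Only demotes, never promotes (raising priority requires root),
    and quietly skips processes that have exited or that we lack
    permission to adjust. Returns the PIDs actually changed.
    """
    nice_value = max(MIN_NICE, min(MAX_NICE, nice_value))
    adjusted = []
    for pid in pids:
        try:
            if os.getpriority(os.PRIO_PROCESS, pid) < nice_value:
                os.setpriority(os.PRIO_PROCESS, pid, nice_value)
                adjusted.append(pid)
        except (PermissionError, ProcessLookupError):
            continue
    return adjusted
```

A monitoring agent could feed this function the PIDs it classifies as background (the classification rule itself is the hard part, and is out of scope for this sketch). The example is Unix-only; `os.setpriority` is not available on Windows.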