Written by Paul Anderson, FireMon's Vice President of Solutioning, GPC, and Elisa Lippincott, Director of Product Marketing at FireMon
Cloud's allure is undeniable - increasing IT agility and
accelerating time-to-market for innovative services are primary goals in almost
every modern enterprise. However, before diving right in, organizations need to
understand the entire process of building and launching new applications and
microservices. Identifying the manual change processes currently in place and
ensuring they can be automated in a cloud operations model is critical to
ensuring that security doesn't slow down innovation and that innovation doesn't
compromise security.
Ideally, a strong and secure public, private or hybrid cloud
environment would encompass a perfect union of centralized visibility, integration,
automation, orchestration and control that doesn't compromise critical IT
security and data privacy practices. But there are key challenges IT security and
operations teams must overcome to realize the benefits of a secure cloud model. Since
most enterprises use disparate security solutions complicated by fragmented
policies, manual processes and inconsistent controls, the complexity of moving
to public, private or hybrid cloud can slow business innovation, increase
operational overhead and hurt the bottom line.
Following is a list of important questions enterprises should
consider when moving to a hybrid cloud environment.
Q. What
are some of the challenges associated with moving to a hybrid environment?
Many organizations underestimate the level of complexity in hybrid-cloud
environments. Let's face it - managing security on-premises is hard enough, but
add in multiple cloud vendors and platform options, increased threat vectors
and attack surfaces, as well as new models like DevOps, Infrastructure as Code
and containers - and keeping up with everything can be almost impossible, even more
so when application release cycles move closer and closer to real-time
deployments. Despite this potential complexity, the majority of enterprises have
adopted a "cloud first" mantra. The goal is to enable application development
teams and business owners to release new services and innovate faster than
competitors. Additionally, the agility
of cloud allows organizations to respond to customers and market conditions in
real-time. Many of the customers we speak with recognize the importance of
change management, security and compliance, but don't want to
compromise application release schedules, miss application deployment SLAs back
to the business or end up being a bottleneck. Modernization and transformation
are now critical to delivering secure cloud services.
Q. What's
the best approach to enabling the business without moving all workloads to the
public cloud?
The best approach will ultimately depend on the organization, so
let's look at a real-world example from one of our customers. This customer
invested in and deployed a software-defined private cloud. With the goal of
enabling agile development without compromising security, they adopted micro-segmentation
as part of their security and networking strategy. They mapped out which web,
application and database servers should and shouldn't interact with each other,
and then implemented virtual distributed firewall rules to ensure that only assets
that were supposed to talk to each other could do so within their environment. After a
couple of months of testing, they moved production applications into the
environment and saw immediate results. Application release cycles were eight to 10
times faster than on their legacy data center infrastructure, and they were
able to meet release schedules and SLAs back to the business. After a period of
time, they were up to over 500 virtual machines (VMs) and had over 25 micro-segmented
security zones.
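The tier-to-tier mapping described above amounts to a default-deny allow-list: a flow is permitted only if it is explicitly mapped out. As a minimal illustrative sketch (the tier names and ports below are assumptions, not the customer's actual configuration), the logic behind such distributed firewall rules might look like:

```python
# Hypothetical micro-segmentation policy: which tiers may initiate
# connections to which others, and on which ports. Anything not listed
# here is denied by default.
ALLOWED_FLOWS = {
    ("web", "app"): {8080},   # web servers call the application tier
    ("app", "db"): {5432},    # application tier queries the database
}

def flow_permitted(src_tier: str, dst_tier: str, port: int) -> bool:
    """Default-deny: a flow is allowed only if explicitly listed."""
    return port in ALLOWED_FLOWS.get((src_tier, dst_tier), set())

print(flow_permitted("web", "app", 8080))  # True
print(flow_permitted("web", "db", 5432))   # False: web must not reach the database directly
```

The key design point is that the deny case needs no rules at all - segmentation falls out of the absence of an entry, which is what keeps a 25-zone environment auditable.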
Q. Does
that approach scale as the enterprise grows? Or is the honeymoon over?
Just like the start of most new relationships,
everything was perfect after the initial deployment. As the customer's
environment continued to grow, some of the issues that seemed minor at first
began to impact their business. Everything was great...until they noticed things
started to slow down and virtual machines weren't being deployed as quickly as
they were at the beginning. Complexity reared its ugly head as they realized
they had gone from managing a small number of firewalls in their new
environment to managing firewall rules for over 25 virtual distributed
firewalls. DevOps had automated the deployment of VMs and resource
provisioning, but every change created firewall change requests that fell into
the hands of their network operations team. The network operations team had to
manually create necessary access rules and then publish them into each relevant
distributed firewall, in addition to changes to the perimeter firewalls and
networking devices already in place. As new applications queued up to deploy,
previously deployed applications either changed, moved or spun down, which
meant the ticket queue for changes had more than doubled, not to mention the
added burden of investigating any rules that were blocking legitimate access.
So, you can guess what happened next. Deployment
release times began to slip, the number of unintended misconfigurations
increased and the return on their private cloud investment dropped
quickly. Not only did they not anticipate the level of complexity of their
private cloud deployment, they also overlooked the domino effect on the network
operations team. Since the team was responsible for the private cloud
environment in addition to their firewalls across the entire
organization, ticket responses in other environments were impacted and
timelines for other critical projects slipped.
Q. What can be done to keep the balance of accelerating the
business without compromising security in that perfect honeymoon stage?
Automation is a key component of maintaining that balance. Many
enterprises have a number of their processes automated, but they're not
integrated with each other, or, in the worst cases, manual processes are still king,
which opens the door to misconfigurations and human error. Some vendors have
automation included in their solutions, but chaining automation end-to-end
across multiple vendors that make up a private cloud is critical to maintaining
agility and security at scale. Our customer had some automation already in
place, with their DevOps team using automation tools as part of their private
cloud development process. They figured there had to be a way to connect with
those DevOps tools already in use to fully automate their firewall change
process and integrate it into their CI/CD (continuous integration and
continuous delivery) pipeline. They were able to centralize the design and
delivery of their firewall policies for every current (and future) VM in their
private cloud, in addition to any related access requests tied to their
physical perimeter firewalls. With full visibility and unified security policy
management across their hybrid environment, the DevOps team can deploy new
applications on time, and the network operations team can focus on more
strategic projects as their ticket queue is reduced to managing any outliers
that are not automatically pre-approved.
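One way to picture the automation described above is a pipeline step that derives candidate firewall rules directly from a VM's deployment metadata, so that a routine deployment never opens a manual ticket. This is a hypothetical sketch only - the tier templates, zone names and field names are assumptions for illustration, not FireMon's actual integration or APIs:

```python
def rules_for_vm(vm: dict) -> list[dict]:
    """Derive candidate firewall rules from a VM's deployment metadata,
    so the CI/CD pipeline can submit pre-approved access automatically
    instead of queuing a manual change request."""
    # One rule template per application tier (illustrative values).
    templates = {
        "web": [{"src": "any", "dst_port": 443, "action": "allow"}],
        "app": [{"src": "web-zone", "dst_port": 8080, "action": "allow"}],
        "db":  [{"src": "app-zone", "dst_port": 5432, "action": "allow"}],
    }
    # Stamp the VM's address onto each template rule for its tier;
    # an unknown tier yields no rules and falls back to manual review.
    return [dict(rule, dst=vm["ip"]) for rule in templates.get(vm["tier"], [])]

# Example: a pipeline step turns the VM spec into candidate rules.
vm = {"name": "app-042", "tier": "app", "ip": "10.1.2.42"}
print(rules_for_vm(vm))
```

Under this model, only requests that don't match a known template - the outliers mentioned above - land in the network operations queue for human review.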
Q. And the
hybrid environment lives happily ever after?
Despite the speed of enterprises moving
to the cloud, on-premises environments aren't going to disappear anytime soon. Public,
private and hybrid cloud all appear to be here to stay. None are perfect...you
have to work to keep them happy and in tip-top shape. The emergence of
DevSecOps shows that people and software tools in the environment are starting
to work across traditional silos to provide a fully automated and secure
process. You need all the pieces integrated and talking to each other, and the
ability to automate and orchestrate processes to eliminate misconfigurations
and ensure continuous compliance. Marrying the needs of your business with the
context and policies of your network assets can let you automatically determine
and enforce the necessary access and security policy across your existing
infrastructure. Ultimately, embedding security in the DevOps process will
provide you with the foundation for a strong and secure hybrid environment that
drives innovation at your speed of business.
##
About the Authors
As Vice President of
Solutioning for next-gen automation software, Paul Anderson spends his time
with customers and partners to understand the impact that trends including
DevOps, Micro-segmentation, and Cloud have on network and security operations.
While almost every enterprise organization is moving to a 'cloud' model
to increase agility and innovation, there are many paths to get there. As
DevSecOps roles emerge and network and security teams are now being asked to make
secure changes without slowing down release cycles, an automation strategy is
critical to enable agility, maintain security controls, avoid human error and
create repeatable processes to allow personnel to spend more time on innovative
projects and less time troubleshooting and fighting fires. Paul's background
lends him a unique understanding of the challenges that organizations face when
marrying application release schedules with infrastructure operations and
security. Prior to joining FireMon, Paul spent a handful of years on
the front lines of large Digital Transformation, private cloud and Data Center
Modernization projects with VCE & Dell EMC and previously four years in the
application delivery and security space at F5 Networks.
As the Director of Product Marketing, Elisa Lippincott is charged with
demonstrating the value proposition for FireMon across the enterprise cloud
security market. Elisa has a long resume in the cyber security industry, most
recently with Trend Micro's Network Defense team. Her security expertise spans
over 18 years covering intrusion prevention (IPS), next-generation firewall
(NGFW), threat intelligence, network access control (NAC), identity and access
management (IAM), security information and event management (SIEM), and cloud
security solutions.