Kong Inc., the leading API platform for modern architectures, today introduced
Kong Cloud. The new, fully managed service is the fastest way for large organizations to adopt the
Kong Enterprise
API platform at scale. It accelerates enterprises' digital
transformation by making it easier to build cloud-native
services and connect cloud and on-prem environments. Kong Cloud is
available immediately and can be deployed on AWS, Azure, Google Cloud or
any other cloud platform.
Kong Cloud takes advantage of Kong Enterprise's wide array of
management and monitoring capabilities to provide an end-to-end API
platform, including a developer portal, security features and API
traffic analytics, at cloud scale. With Kong Cloud, organizations get
zero-touch updates, immediate delivery of new product functionality, and
improved uptime and availability through localization of APIs across
regions. This helps enterprises accelerate their move to multi-cloud
strategies without sacrificing security or availability, or risking
vendor lock-in.
At the core of Kong Cloud is Kong's open source technology for
securing, connecting and orchestrating APIs, with more than 54 million
downloads and four years of use in production. Kong was built with
scalability, performance and extensibility in mind, so it readily adapts
to the demands of modern architectures such as containers,
microservices, serverless and service mesh, while fully supporting
services built on traditional systems. Kong also ties directly into the
Kubernetes lifecycle with the Kong Ingress Controller, providing a seamless integration point for traffic entering Kubernetes clusters.
"We designed Kong to support cloud native environments as well as
more traditional systems, making it the ideal API management platform
for supporting business-critical applications and digital transformation
initiatives," said Augusto Marietti, CEO and co-founder of Kong Inc.
"Kong Cloud gets organizations up and running on Kong quickly and
ensures they're benefiting from its most sophisticated features as soon
as they're available. What we've learned from running Kong at scale is
also being contributed to open source Kong, ensuring faster evolution of
the platform available to any developer, anywhere."