Cloud Endpoints is a distributed API management system. It provides an API console, hosting, logging, monitoring, and other features to help you create, share, maintain, and secure your APIs. This page provides an overview of Cloud Endpoints for OpenAPI; Cloud Endpoints also supports other types of API frameworks.
Cloud Endpoints uses the distributed Extensible Service Proxy (ESP) to provide low latency and high performance for serving even the most demanding APIs. The Extensible Service Proxy is a service proxy based on NGINX, so you can be confident that it scales as needed to handle simultaneous requests to your API. The proxy runs in its own Docker container for better isolation and scalability, and is distributed through Google Container Registry and Docker registries. It can be used with the Google App Engine flexible environment, Google Kubernetes Engine, Google Compute Engine, or Kubernetes.
Cloud Endpoints uses Google Service Infrastructure to manage APIs and report logs and metrics. Most Cloud APIs use this same infrastructure. You can manage and monitor your APIs using the Endpoints dashboard in the Google Cloud Platform Console.
Cloud Endpoints is optimized for the Docker container environment. You can host your API anywhere Docker is supported, so long as the host has Internet access to Google Cloud Platform. However, Cloud Endpoints provides an optimized workflow to run your APIs on the Google App Engine flexible environment, Google Kubernetes Engine, and Google Compute Engine.
Cloud Endpoints is language independent. You can build your API in any language and REST framework whose API surface can be described in an OpenAPI configuration file. To use Cloud Endpoints with OpenAPI, you:
Configure Endpoints: You describe the API surface and configure Cloud Endpoints features (such as API keys or authentication rules) in an OpenAPI configuration file.
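As a sketch, a minimal OpenAPI (Swagger 2.0) configuration for a hypothetical echo API might look like the following; the service name, project ID, and path are illustrative placeholders:

```yaml
swagger: "2.0"
info:
  title: Echo API
  version: "1.0.0"
# The host becomes the Endpoints service name ("my-project" is a placeholder).
host: echo-api.endpoints.my-project.cloud.goog
schemes:
  - https
paths:
  /echo:
    post:
      summary: Echo back the request body.
      operationId: echo
      responses:
        "200":
          description: Echoed message.
```

Endpoints-specific features such as API key requirements and authentication rules are layered onto this same file as additional security definitions.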
Deploy the Endpoints Configuration: After you have defined your API in an OpenAPI configuration file, you use the Cloud SDK to deploy it to Google Service Management, which Cloud Endpoints uses to manage your API. Now Cloud Endpoints knows all about your API and how to secure it.
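Concretely, the deployment step is a single Cloud SDK command; depending on your SDK version this is `gcloud endpoints services deploy` (or `gcloud service-management deploy` in older SDKs). The file name below is a placeholder:

```
# Deploy the OpenAPI configuration to Google Service Management.
gcloud endpoints services deploy openapi.yaml
```

On success, the command reports the service name and a configuration ID that the proxy uses at runtime.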
Deploy the API Backend: You deploy the Extensible Service Proxy and your API backend to a supported Google Cloud backend, such as Google Compute Engine. The Extensible Service Proxy coordinates with Cloud Endpoints backend services to secure and monitor your API at runtime.
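For example, on Kubernetes the proxy typically runs as a sidecar container in front of the backend. A minimal sketch, in which the image tags, service name, and ports are illustrative assumptions:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: echo-api
spec:
  replicas: 1
  selector:
    matchLabels:
      app: echo-api
  template:
    metadata:
      labels:
        app: echo-api
    spec:
      containers:
        # ESP sidecar: receives traffic, enforces the Endpoints
        # configuration, then proxies to the backend on localhost.
        - name: esp
          image: gcr.io/endpoints-release/endpoints-runtime:1
          args:
            - "--http_port=8081"
            - "--backend=127.0.0.1:8080"
            - "--service=echo-api.endpoints.my-project.cloud.goog"
            - "--rollout_strategy=managed"
        # The API backend itself, listening on port 8080.
        - name: echo-backend
          image: gcr.io/my-project/echo-backend:latest
```

Only the ESP port is exposed to callers; the backend is reached exclusively through the proxy.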
Cloud Endpoints lets you configure your API to require an API key for any call, and it validates the API key. You can also use the Google Cloud Platform Console to share your API with other developers so they can enable your API and generate API keys to call it.
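From the caller's side, the API key is typically passed as the `key` query parameter, which the proxy checks before the request reaches your backend. A minimal client-side sketch; the host and key are hypothetical placeholders:

```python
from urllib.parse import urlencode

# Hypothetical Endpoints host for illustration.
API_HOST = "https://echo-api.endpoints.my-project.cloud.goog"

def with_api_key(path: str, api_key: str) -> str:
    """Build a request URL carrying the API key as the `key` query
    parameter, which the Extensible Service Proxy validates."""
    return f"{API_HOST}{path}?{urlencode({'key': api_key})}"

url = with_api_key("/echo", "YOUR_API_KEY")
```

A request to `url` without a valid key is rejected by the proxy and never reaches the backend.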
For most API calls, there is a user on the other end of each call. While API keys indicate which app is making a call to your API, the authentication process determines which user is using that app.
Note that your API server still needs to decide what the authenticated user can do with your API.
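As an illustration of that division of labor: after the proxy authenticates a caller, it forwards the user's claims to the backend (ESP uses a base64url-encoded JSON header, `X-Endpoint-API-UserInfo`), and the backend applies its own authorization rules. A minimal sketch assuming that header format, with a hypothetical hard-coded admin rule:

```python
import base64
import json

def user_info_from_header(header_value: str) -> dict:
    """Decode the base64url-encoded JSON claims that the Extensible
    Service Proxy forwards in the X-Endpoint-API-UserInfo header."""
    # Restore any stripped base64 padding before decoding.
    padded = header_value + "=" * (-len(header_value) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

def can_delete(user: dict) -> bool:
    """Authorization stays the backend's job: a hypothetical rule
    allowing deletes only for one admin account."""
    return user.get("email") == "admin@example.com"

# Simulate the header ESP would forward for an authenticated caller.
claims = {"id": "12345", "email": "admin@example.com"}
header = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
user = user_info_from_header(header)
```

The proxy answers "which user is this?"; the backend still answers "what may this user do?".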
Extensible Service Proxy delivers security and insight in less than 1ms per call. Deploy your API automatically with Google App Engine and Google Kubernetes Engine, or add our proxy container to your Kubernetes deployment.
Use your favorite API framework and language, or choose our open source Cloud Endpoints Frameworks in Java or Python. Simply upload an OpenAPI specification and deploy our containerized proxy.