Apigee Edge enables you to quickly expose backend services as APIs. You do this by creating an API proxy that provides a facade for the backend service that you want to expose.

The API proxy decouples your backend service implementation from the API that developers consume. This shields developers from future changes to your backend services. As you update backend services, developers, insulated from those changes, can continue to call the API uninterrupted.

By exposing an API through Apigee Edge, you gain the ability to modify and monitor its behavior using out-of-the-box policies. These policies enable you to enhance your API with sophisticated features to control traffic, enhance performance, enforce security, and increase the utility of your APIs, without requiring you to write any code or modify any backend services. Extension policies enable you to implement custom logic in the form of JavaScript, Python, Java, and XSLT.

In this lab, you will use an out-of-the-box traffic management policy, Spike Arrest, to protect against traffic spikes. A Spike Arrest policy throttles the number of requests processed by an API proxy and sent to a backend, protecting against performance lags and downtime.

What you'll learn

What you'll need

How will you use this tutorial?

- Read it through only
- Read it and complete the exercises

How would you rate your experience with Apigee Edge?

- Novice
- Intermediate
- Proficient

Self-paced environment setup

If you already have an account, log in to your Apigee instance: https://apigee.com/edge

If you do not want to sign up right away and are looking for a short-lived temporary account, you can click this link to get login credentials.

If you want to sign up for your own Apigee trial account: https://login.apigee.com/sign_up

Create An API Spec

Select Develop → Specs in the side navigation menu.

Click +Spec, then click Import URL to add a new spec from an existing source.

Enter the spec details. Replace {your-initials} with your initials.

Verify the values and click Import. The spec is now imported into Apigee Edge and ready to use. You should see it in the list. For example:


Click {your-initials}_employee_api_spec in the list to access the OpenAPI spec editor and interactive documentation, which lists the API details and API resources.


Create an API Proxy

It's time to create an Apigee API proxy from the OpenAPI specification. Click Develop → API Proxies in the side navigation menu.

Click +Proxy. The Build a Proxy wizard opens.


Select Reverse proxy, then click Use OpenAPI below the reverse proxy option.


You should see a popup listing the available specs. Select {your-initials}_employee_api_spec and click Select.


You can see the selected OpenAPI spec URL below the Reverse Proxy option. Click Next to continue.


Enter the details in the proxy wizard. Replace {your-initials} with your initials.


Verify the values and click Next.

You can select or deselect the API proxy resources that are pre-filled from the OpenAPI spec. Select all and click Next.


Select Pass through (none) for authorization, which applies no security policy to the proxy. Click Next.


Keep the default Virtual Host configuration.


Ensure that only the test environment is selected for deployment, then click Build and Deploy.


Once the API proxy is built and deployed, click the link to view your proxy in the proxy editor.


Congratulations! You have now built a reverse proxy for an existing backend service. You should see the proxy Overview screen.


Test the API Proxy

Copy the URL for your API proxy.


Open the REST client in a new browser window. Paste the URL into the REST client and make a GET call.


You should see a success response.
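If you prefer scripting the check, the same GET call can be sketched in Python with the standard library. The proxy URL below is a placeholder; substitute the URL you copied from the proxy Overview screen.

```python
import urllib.request

# Placeholder URL -- replace with the proxy URL you copied above.
PROXY_URL = "http://your-org-test.apigee.net/your-initials-employee-api/employees"

# Build the GET request (constructing it does not hit the network).
req = urllib.request.Request(PROXY_URL, method="GET")
print(req.get_method(), req.full_url)

# Uncomment to actually send the request once your proxy is deployed:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```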

Click the Develop tab to access the API proxy development dashboard.


Click PreFlow under the default Proxy Endpoint, then click +Step at the top of the Request flow to attach a Spike Arrest policy.

Select the Spike Arrest policy and click Add to attach it to the proxy endpoint's PreFlow request.

Notice the Spike Arrest policy icon at the top of the request flow, which shows exactly where the policy is attached; the policy's XML configuration appears in the editor below.

Change the policy XML configuration to the code below, updating the rate to 12pm (12 requests per minute).

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<SpikeArrest async="false" continueOnError="false" enabled="true" name="Spike-Arrest-1">
    <DisplayName>Spike Arrest-1</DisplayName>
    <Rate>12pm</Rate>
</SpikeArrest>

Think of Spike Arrest as a way to generally protect against traffic spikes rather than as a way to limit traffic to a specific number of requests. Your APIs and backend can handle a certain amount of traffic, and the Spike Arrest policy helps you smooth traffic to the general amounts you want.

The runtime Spike Arrest behavior differs from what you might expect to see from the literal per-minute or per-second values you enter.

For example, say you enter a rate of 6pm (6 requests per minute). In testing, you might think you could send 6 requests in 1 second, as long as they came within a minute. But that's not how the policy enforces the setting. If you think about it, 6 requests inside a 1-second period could be considered a mini spike in some environments.

What actually happens, then? To prevent spike-like behavior, Spike Arrest smooths the number of full requests allowed by dividing your settings into smaller intervals:

Per-minute rates get smoothed into full requests allowed in intervals of seconds. For example, 6pm gets smoothed like this: 60 seconds (1 minute) / 6 = 10-second intervals, or 1 request allowed every 10 seconds. A second request inside of 10 seconds will fail. Also, a 7th request within a minute will fail.

Per-second rates get smoothed into full requests allowed in intervals of milliseconds. For example, 10ps gets smoothed like this: 1000 milliseconds (1 second) / 10 = 100-millisecond intervals, or 1 request allowed every 100 milliseconds. A second request inside of 100 ms will fail. Also, an 11th request within a second will fail.
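The smoothing arithmetic above can be sketched as a small helper. This is an illustration of the calculation, not Apigee's implementation; the function name is ours.

```python
def smoothing_interval(rate: str) -> float:
    """Return the minimum gap (in seconds) between allowed requests
    for a Spike Arrest rate string such as '6pm' or '10ps'."""
    if rate.endswith("pm"):
        # Per-minute rates divide 60 seconds into equal intervals.
        return 60.0 / int(rate[:-2])
    if rate.endswith("ps"):
        # Per-second rates divide 1 second into equal intervals.
        return 1.0 / int(rate[:-2])
    raise ValueError("rate must end in 'pm' or 'ps'")

print(smoothing_interval("6pm"))   # 10.0 -> one request every 10 seconds
print(smoothing_interval("10ps"))  # 0.1  -> one request every 100 milliseconds
print(smoothing_interval("12pm"))  # 5.0  -> the rate used in this lab
```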

Click Save to save the API proxy changes.

Congratulations! You have now helped protect your backend against denial-of-service attacks, performance lags, and downtime of target servers.

Let's test the updated API proxy using the Trace console. Click the Trace tab.


Click Start Trace Session to see the API proxy with Spike Arrest in action.


Click the Send button multiple times. You will see a 500 response code when the Spike Arrest policy kicks in to protect the target servers from the spike in traffic.


You might notice that the number of requests receiving a 200 response exceeds the configured Spike Arrest rate. This is because policies execute on multiple message processors, each of which maintains its own counter.
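Because each message processor counts independently, the worst-case number of requests that can succeed scales with the number of message processors. A minimal sketch of this arithmetic (the function name and the processor count of 2 are our assumptions for illustration):

```python
def effective_rate(configured_pm: int, message_processors: int) -> int:
    """Worst-case requests per minute that can receive a 200 response
    when each message processor enforces its own Spike Arrest counter."""
    return configured_pm * message_processors

# With this lab's 12pm rate and a hypothetical 2 message processors,
# up to 24 requests per minute could succeed.
print(effective_rate(12, 2))  # 24
```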

You can also use the Apigee REST client to test the Spike Arrest policy.


This time, when you send the request, it should show up in the trace of your API proxy.


That completes this hands-on lesson. In this simple lab, you learned how to protect target servers against denial-of-service attacks and traffic spikes.

You have deployed an API proxy to Apigee and protected your API from traffic spikes.

Learn More

Useful Apigee documentation links on traffic management and the Spike Arrest policy:


This work is licensed under a Creative Commons Attribution 2.0 Generic License.