January 21, 2021

What Is Rate Limiting?


Rate limiting is important for preventing malicious attacks on your APIs. That's why lack of resources and rate limiting is one of OWASP's top API security risks.


What Is Rate Limiting?

Rate limiting restricts the number of calls a user can make to an API within a set time frame. This helps control the load that's put on the system.

Rate limiting helps prevent a user from exhausting the system’s resources. Without rate limiting, it’s easier for a malicious party to overwhelm the system. This is done when the system is flooded with requests for information, thereby consuming memory, storage, and network capacity.

An API that utilizes rate limiting may throttle clients that attempt to make too many calls or temporarily block them altogether. Users who have been throttled may either have their requests denied or slowed down for a set time. This will allow legitimate requests to still be fulfilled without slowing down the entire application.
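To illustrate the idea, here is a minimal fixed-window limiter sketched in Python. The class and parameter names are hypothetical, not part of any Akana API; it simply shows the per-client counting that a throttling policy performs.

```python
import time
from collections import defaultdict

class FixedWindowRateLimiter:
    """Allow at most `limit` calls per client within each `window_seconds` window."""

    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        # client id -> [window start time, calls made in that window]
        self.counters = defaultdict(lambda: [0.0, 0])

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        window_start, count = self.counters[client_id]
        if now - window_start >= self.window:
            # The previous window has expired: start a fresh one.
            self.counters[client_id] = [now, 1]
            return True
        if count < self.limit:
            self.counters[client_id][1] = count + 1
            return True
        return False  # throttled: deny until the window resets
```

A gateway would call `allow()` on every incoming request and answer with HTTP 429 (Too Many Requests) whenever it returns `False`, so legitimate clients elsewhere keep getting served.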


Rate Limiting Use Cases

Brute Force Attacks

One of the most common use cases for rate limiting is to block brute force attacks. In a brute force attack, a hacker uses automation to send an endless stream of requests to an API, hoping that eventually one may be accepted. Limiting client access will slow down this attack. At the same time, the receiving system should notice the unexpectedly large number of failed requests and generate alerts so that further action may be taken.

In some cases, a user may accidentally cause a brute force attack: a bug may trigger repeated failed requests, leading the client to keep retrying. Rate limiting would force a temporary stop and allow for follow-up action.
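How a receiving system might notice an unexpectedly large number of failed requests can be sketched as a sliding-window failure monitor. This is an illustrative Python sketch; the class name and thresholds are made up for the example.

```python
from collections import defaultdict, deque

class FailedRequestMonitor:
    """Track failed requests per client and flag a possible brute force attack."""

    def __init__(self, max_failures, window_seconds):
        self.max_failures = max_failures
        self.window = window_seconds
        # client id -> timestamps of recent failures
        self.failures = defaultdict(deque)

    def record_failure(self, client_id, now):
        q = self.failures[client_id]
        q.append(now)
        # Drop failures that have fallen outside the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        # True means: raise an alert and consider blocking this client.
        return len(q) >= self.max_failures
```

In practice the `True` result would feed an alerting pipeline so that further action, such as blocking the client, can be taken.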

Denial of Service Attacks

Another use case is to prevent a Denial of Service (DoS) attack. DoS attacks occur when a user attempts to slow or shut down an application completely. The user may do this by flooding the API with requests. Multiple users may attempt to overwhelm the system in this way as well; this is known as a Distributed Denial of Service (DDoS) attack.


Rate Limiting Strategies With Akana

While rate limiting is a solution for stopping a flood of requests, there are several other security measures that can be put in place before an attack happens.

Akana offers multiple configurable policies for a customizable API security package. These policies verify client identity and reduce request overload.


Authentication

Protect your services by applying authentication to messages and making sure they come from trusted domains. This policy ensures the sending identity is authentic, known to the service, and authorized to invoke the service.

HTTP Security

For REST and SOAP APIs, there are several policies available.

HTTP Basic Authentication

This allows a client to provide credentials in the form of a username and password when making a request.
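The credential encoding behind Basic Authentication is simple enough to sketch. This hypothetical Python helper (not an Akana API) builds the RFC 7617 header value a client would send:

```python
import base64

def basic_auth_header(username, password):
    """Build the HTTP Basic Authorization header value (RFC 7617)."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"
```

Note that Base64 is an encoding, not encryption, which is why Basic Authentication should only be used over TLS.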

HTTP Digest Authentication

This allows a client to negotiate credentials using the HTTP protocol. It supersedes unencrypted use of Basic Authentication and makes it possible to establish a user identity securely without sending a password in plain text over the network.
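As a rough sketch of why no plain-text password crosses the network, here is the legacy RFC 2617 digest computation without the qop directive (modern servers use RFC 7616 with stronger hash algorithms; this is shown for illustration only, with hypothetical parameter names):

```python
import hashlib

def digest_response(username, realm, password, method, uri, nonce):
    """Compute the RFC 2617 Digest 'response' value (no qop, for brevity)."""
    # HA1 covers the credentials, HA2 covers the request being made.
    ha1 = hashlib.md5(f"{username}:{realm}:{password}".encode()).hexdigest()
    ha2 = hashlib.md5(f"{method}:{uri}".encode()).hexdigest()
    # The server, which knows the same inputs, recomputes and compares this value.
    return hashlib.md5(f"{ha1}:{nonce}:{ha2}".encode()).hexdigest()
```

The server issues a fresh nonce per challenge, so a captured response cannot simply be replayed later.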

SAML Bearer Token Authentication

This allows a client to provide credentials in the form of a SAML token that uses the Bearer subject confirmation method when making a request.

JWT Bearer Token Authentication

This allows a client to provide credentials in the form of a JWT Bearer token.
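A JWT carries its claims as a base64url-encoded JSON segment. This illustrative Python sketch decodes the claims for inspection only; a real gateway must also verify the token's signature and expiry before trusting them.

```python
import base64
import json

def decode_jwt_claims(jwt_token):
    """Decode the (unverified) claims segment of a JWT for inspection."""
    header_b64, payload_b64, _signature = jwt_token.split(".")
    # JWT segments use unpadded base64url; restore the padding before decoding.
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

The client sends the token in an `Authorization: Bearer <token>` header with each request.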

Client Certificate

This allows authentication based on an X.509 certificate provided by the client. The certificate can be taken from the Secure Sockets Layer (SSL) protocol context. This option requires that your client application use SSL to make a secure connection with the application server. The application server must enable mutual authentication for the SSL connection.

Cookie Authentication

This allows authentication by passing user credentials through HTTP cookies when making a request. The cookie is typically issued to clients by a third-party identity provider and Access Manager software for single sign-on purposes.

HTTP Caching

The API gateway can return a cached response rather than forwarding the request to the back end. This reduces the number of identical calls that reach the back-end system and frees up bandwidth for other users.
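The caching behavior can be sketched as a small time-to-live cache in Python. The names here are hypothetical, chosen only to show the hit/miss decision a gateway makes.

```python
import time

class ResponseCache:
    """Serve repeated identical requests from cache until the TTL expires."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # request key -> (expiry time, cached response)

    def get(self, key, fetch, now=None):
        now = time.monotonic() if now is None else now
        entry = self.store.get(key)
        if entry and entry[0] > now:
            return entry[1]  # cache hit: skip the back-end call entirely
        response = fetch()   # cache miss: forward the request to the back end
        self.store[key] = (now + self.ttl, response)
        return response
```

Only responses that are safe to share, typically for GET requests without user-specific data, should be cached this way.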


Throttling

As previously mentioned, throttling slows down the number of requests a user can make. This allows developers to monitor suspicious activity and take action, such as blocking an IP address, before a full DoS or DDoS attack can occur.

These policies are easily added through the developer portal to endpoints and work independently of each other. They ensure that only valid requests are granted. And to make sure that back-end systems can cope with the incoming messages, rate limiting may be applied to either individual clients or as an absolute limit on the API.
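The combination of a per-client quota and an absolute API-wide quota can be sketched as follows. This is an illustrative Python sketch with simplified window bookkeeping (an external timer would call `reset_window()` at each window boundary); it is not Akana's implementation.

```python
from collections import defaultdict

class DualRateLimit:
    """Deny a call if either the per-client quota or the API-wide quota is exhausted."""

    def __init__(self, per_client_limit, global_limit):
        self.per_client_limit = per_client_limit
        self.global_limit = global_limit
        self.client_counts = defaultdict(int)
        self.global_count = 0

    def allow(self, client_id):
        if self.global_count >= self.global_limit:
            return False  # absolute limit on the API reached
        if self.client_counts[client_id] >= self.per_client_limit:
            return False  # this individual client used up its quota
        self.client_counts[client_id] += 1
        self.global_count += 1
        return True

    def reset_window(self):
        """Call at the start of each new time window."""
        self.client_counts.clear()
        self.global_count = 0
```

The global cap is what protects the back end from aggregate load even when no single client exceeds its own quota.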

Additional Threat Prevention

While the policies above are mainly preventative measures, there are several options for stopping attacks such as DoS and DDoS while they are in progress. The Akana API gateway makes it easy to monitor activity in real time.

If there is suspicious activity happening, developers can quickly blacklist one or more IP addresses. This prevents those IP addresses from ever interacting with the API. The gateway can also be decommissioned — this will make services unavailable to users, but it will not affect the back-end system at all.


Get Started With Akana

See for yourself how Akana makes it easy to ensure API security, leverage rate limiting, and protect against the lack-of-resources risk.



