When building a new integration with a third-party API, there are so many different problems to solve! Idempotency, retries, exponential backoff, authentication… the list goes on and on! One of the most common constraints when integrating with an external API is rate limiting. In this post, we are going to show you a very simple pattern that we have found to effectively mitigate the risk of rate limiting without adding a lot of undue complexity on your own side of the integration.
Redis is great. You can do so much with it. Pub/sub, arbitrary data structures, caching… the list is endless! One of the most powerful aspects of Redis that often goes overlooked is its ability to execute Lua scripts directly within its own runtime. As a backend developer, this means you can perform complex operations inside of Redis atomically with a single command, rather than sending multiple commands and losing the guarantees that a transaction would otherwise provide.
One very obvious use case for such a feature is the implementation of distributed locks, a useful pattern for controlling or limiting access to a logical resource in a distributed system. A third-party API is simply another type of logical resource, so with a little bit of Lua we can effectively use Redis to "lock" access to this resource across all services on our backend once it has been called a certain number of times within a given timespan.
The logic for this in Lua is rather easy, given the "stdlib" of functionality that Redis exposes to Lua scripts:
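A minimal sketch of such a script, assuming a fixed-window limit (the key name, and the use of `ARGV[1]` for the call limit and `ARGV[2]` for the window length in seconds, are illustrative choices):

```lua
-- KEYS[1]: counter key for the resource, e.g. "ratelimit:some-api"
-- ARGV[1]: maximum calls allowed per window
-- ARGV[2]: window length in seconds
local count = redis.call("INCR", KEYS[1])
if count == 1 then
  -- First call in a fresh window: start the expiry clock so the
  -- counter resets automatically when the window elapses.
  redis.call("EXPIRE", KEYS[1], tonumber(ARGV[2]))
end
-- Allowed if we are still at or under the limit. Note that Redis
-- converts Lua's true to the integer 1 and false to a nil reply.
return count <= tonumber(ARGV[1])
```

Because the whole script executes atomically inside Redis, there is no race between the `INCR` and the `EXPIRE`, which is exactly the guarantee we would lose by issuing those commands separately from the client.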
Now, any of our backend services that needs to call a rate-limited third-party API can simply invoke this script via Redis's EVAL command. If the script returns true, the service can go ahead and safely make the request. If it returns false, the service should yield and try again later.
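From the service side, the invocation might look something like the sketch below, using the redis-py client. The `try_acquire` helper and the key naming scheme are hypothetical; the only real API used is `eval(script, numkeys, *keys_and_args)`, and note that Redis maps a Lua `true` to the integer `1` and a Lua `false` to a nil (Python `None`) reply:

```python
# Hypothetical wrapper around the fixed-window rate-limit script.
# Works with any redis-py-style client exposing eval().
RATE_LIMIT_SCRIPT = """
local count = redis.call("INCR", KEYS[1])
if count == 1 then
  redis.call("EXPIRE", KEYS[1], tonumber(ARGV[2]))
end
return count <= tonumber(ARGV[1])
"""


def try_acquire(client, resource, limit, window_seconds):
    """Return True if a call to `resource` is allowed right now."""
    reply = client.eval(
        RATE_LIMIT_SCRIPT,
        1,                          # number of KEYS
        f"ratelimit:{resource}",    # KEYS[1] (naming scheme is illustrative)
        limit,                      # ARGV[1]
        window_seconds,             # ARGV[2]
    )
    # Redis returns 1 for Lua true, None for Lua false.
    return bool(reply)
```

A calling service would then do something like `if try_acquire(r, "billing-api", 100, 60): make_request()`, and back off or re-queue the work when it returns False.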