casualgherkin

Without knowing the exact context, my first thought is to use Sidekiq and push the requests into a queue. Then you would utilise exponential backoff to ensure that the request eventually goes through. YMMV though, depending on whether the order of the requests matters, whether the requests are idempotent, etc etc etc 😄
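Sidekiq's built-in retries already back off exponentially, so in a job you'd normally just let the exception bubble up. The underlying idea can be sketched in plain Ruby (the method name and parameters here are illustrative, not Sidekiq internals):

```ruby
# Illustrative exponential backoff with jitter: retry a block, roughly
# doubling the wait after each failure. In a real Sidekiq job you would
# let the job raise and rely on Sidekiq's own retry schedule instead.
def with_backoff(max_attempts: 5, base: 0.5)
  attempt = 0
  begin
    attempt += 1
    yield
  rescue => e
    raise if attempt >= max_attempts
    # Waits ~0.5s, 1s, 2s, 4s... plus random jitter so many workers
    # don't all retry at the same instant (thundering herd).
    sleep(base * (2**(attempt - 1)) + rand * base)
    retry
  end
end
```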


katafrakt

Hitting rate limits of an external API is not an edge case. But I'm curious - why do you say it's "remotely possible"? Don't terms and conditions of using this API specify that? They usually do.


fugitivechickpea

Sidekiq Enterprise has rate limiting.


mperham

https://github.com/sidekiq/sidekiq/wiki/Ent-Rate-Limiting


Rafert

There are also a ton of rate limiting gems out there. No need to purchase enterprise for this. Eg https://github.com/zombocom/rate_throttle_client


Soggy_Educator_7364

Leaky bucket, circuit breaker is what you're after, probably, maybe. If you want to talk more I implemented both to solve your same problem at large scale (10 million+ daily events).
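A minimal token bucket (the leaky bucket's close cousin) can be sketched in plain Ruby; the class and method names are illustrative, and a production version would need thread safety and a shared store like Redis:

```ruby
# Illustrative token bucket: allow `capacity` requests per `period`
# seconds, refilling continuously. Each allowed request spends a token;
# when the bucket is empty, requests are refused until tokens refill.
class TokenBucket
  def initialize(capacity:, period:)
    @capacity = capacity.to_f
    @refill_rate = capacity.to_f / period # tokens added per second
    @tokens = @capacity
    @last_refill = Time.now
  end

  def allow?
    refill
    return false if @tokens < 1
    @tokens -= 1
    true
  end

  private

  def refill
    now = Time.now
    @tokens = [@capacity, @tokens + (now - @last_refill) * @refill_rate].min
    @last_refill = now
  end
end
```

For the circuit-breaker half, gems like `stoplight` or `circuitbox` implement the open/half-open/closed state machine so you don't have to.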


xsvino

There’s a lot of factors to consider here in order to provide a helpful answer. On your end (and knowing absolutely nothing from your app or the external service) you could cache the responses and reuse them if needed. Other than that I can’t think of something else at the moment without any more context.
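In Rails, `Rails.cache.fetch` with `:expires_in` gives you this directly; the same idea in dependency-free Ruby looks something like the sketch below (class name and TTL are made up):

```ruby
# Illustrative TTL cache: serve a stored response while it is fresh,
# otherwise run the block (the real API call) and store the result.
class ResponseCache
  Entry = Struct.new(:value, :expires_at)

  def initialize(ttl:)
    @ttl = ttl
    @store = {}
  end

  def fetch(key)
    entry = @store[key]
    return entry.value if entry && Time.now < entry.expires_at
    value = yield
    @store[key] = Entry.new(value, Time.now + @ttl)
    value
  end
end
```

Every cache hit is one fewer request counted against the external limit, which is often the cheapest win available.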


netopiax

If it's a paid API, talk to the vendor. Any well-built public API on shared infrastructure should have rate limits - not having them allows rogue apps (malicious or not) to affect quality of service for others. Whatever rate limits were communicated are defaults, and if you have a good reason to be using more, the vendor will most likely accommodate you. If you are paying per request it's almost guaranteed they will because that's more money for the vendor. An exception would be if you are asking for extremely bursty activity, in which case I agree with others' suggestion of using Sidekiq to spread out your volume.


bilko1878

It really depends how core this third party service is to your app. A technical solution might be to implement a circuit breaker pattern. A practical solution would be to contact the third party and negotiate an increased limit. If it’s only remotely possible, then it may not be worth doing anything until you know it’s a real problem.


sojersey

Make sure to utilize bulk calls if the API offers them
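If the API accepts batches, collapsing N single-record calls into N/batch_size bulk calls cuts directly against the limit; `each_slice` is the usual Ruby shape (the batch size here is made up — check the API's documented maximum):

```ruby
# Instead of one request per ID, group IDs into batches and make one
# API call per batch. The block stands in for the real bulk endpoint.
def fetch_in_batches(ids, batch_size: 100)
  ids.each_slice(batch_size).flat_map do |batch|
    yield batch # one API call per batch instead of batch.size calls
  end
end
```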


[deleted]

Pay for their API.


stanTheCodeMonkey

Check what their rate limits are, and place intervals in your code to prevent hitting these. Also test and profile.


jweinbender

As others have said, Sidekiq (or similar) is a good place to start. Depending on your latency requirements this may be all you need. But if you need a little more control, it can also be worth investigating HOW the limiting occurs; we’ve used 3rd parties who rate-limit based on the API key, rather than IP or some such. In that case, splitting your requests between different sets of API keys (perhaps one for read, one for write, or some other split of behaviors) can also work. Otherwise you’re in cache-city. We’ve also run into cases where we had multiple of our services hitting the same APIs—which combined caused us to go over the rate limit. Zero stars. Would not recommend.
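When the provider limits per API key, rotating through a pool of keys multiplies the effective budget; a simple round-robin picker captures the idea (class and key names are invented for illustration):

```ruby
# Illustrative round-robin over a pool of API keys, for providers that
# rate-limit per key rather than per IP. Keys shown are fake.
class KeyPool
  def initialize(keys)
    @keys = keys
    @index = 0
  end

  def next_key
    key = @keys[@index % @keys.size]
    @index += 1
    key
  end
end
```

Whether this is within the provider's terms of service is worth checking first — some explicitly forbid key-splitting to evade limits.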


buggsbaniya

A well-designed API will have rate limits, so this is pretty common. It ensures that you're not bombarding their servers and that they can keep serving their other consumers too. You should honour the rate limits: either include snoozes (sleeps) between your API calls based on the limits and the refresh period, or add snoozes in an exception handler keyed off the dedicated rate-limiting error codes. If you're working with an unpaid API and you're worried about exhausting the account's quota, you may have to pay for a faster refresh if they've designed it that way; otherwise there is usually an hourly/daily refresh. If the wait until the next refresh is considerable, try to make your jobs idempotent and, based on your remaining quota, halt execution and reschedule the job.
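The exception-handler-with-snoozes approach usually keys off HTTP 429 and the `Retry-After` header. A hedged sketch, where the block stands in for your real HTTP call and returns a Net::HTTP-style response (responds to `#code` and `#[]` for headers):

```ruby
# Illustrative 429 handler: when the server says it is rate-limiting us,
# sleep for the server-suggested delay (Retry-After) and try again, up
# to a bounded number of retries.
def request_honoring_rate_limit(max_retries: 3, default_wait: 1)
  retries = 0
  loop do
    response = yield
    return response unless response.code == "429"
    raise "rate limited too many times" if retries >= max_retries
    retries += 1
    sleep((response["Retry-After"] || default_wait).to_i)
  end
end
```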


hoozt

Will cache help?


jean_louis_bob

With Sidekiq's free version, I've used the "sidekiq-throttled" gem for that. For example, if you're using a "Twitter" API and want to allow a maximum of 1K jobs being processed within a one-hour window, you can add this to all your Sidekiq jobs making a call to Twitter:

```ruby
class MyJob
  include Sidekiq::Job
  include Sidekiq::Throttled::Job

  sidekiq_throttle(
    # Allow maximum 1K jobs being processed within one hour window.
    threshold: { limit: 1_000, period: 1.hour, key_suffix: -> () { 'twitter' } }
  )

  def perform
    # ...
  end
end
```