Starting around 12:45 UTC we began seeing serious performance degradation of the API: most (but not all) of our calls are either returning 504 Gateway Timeout errors or taking a very long time to execute.
Hello,
Are you still experiencing this? It should be working fine now.
Could you share a screenshot or video of the API call where it is throwing the error?
Thanks,
Ankit
We don't seem to be getting errors anymore, but the API is definitely much slower than normal.
Under high traffic the server can come under load, which can slow response times. We are aware of this and are consistently working to improve our API performance.
Apologies for the inconvenience.
Are there any plans to change how the API throttling works? I understand why rate limiting is necessary, but it is currently based on our Application ID, which means that as we add more customers they all have to share a fixed amount of bandwidth; this is the exact opposite of how an API should scale.
Most other APIs base their limits on the individual location.
If this were rate limiting you'd receive a 429, not a 504. See: https://docs.clover.com/docs/api-usage-rate-limits. Rate limiting is also based on token, which is essentially "per-site", correct? You're right that there is still an upper bound for the app.
You can petition developer relations to have your rates raised, but you'd want to make sure you've first considered the best practices here: https://docs.clover.com/docs/api-usage-rate-limits#best-practices-to-avoid-429-http-error-messages.
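For reference, a minimal sketch of the retry-with-backoff pattern those best practices describe, in Python (the URL, token, and retry count here are placeholders, not Clover-specific values):

```python
import time
import requests

def get_with_backoff(url, token, max_retries=5):
    """Retry a GET with exponential backoff when the API returns 429 (rate limited)."""
    delay = 1.0
    resp = None
    for _ in range(max_retries):
        resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
        if resp.status_code != 429:
            return resp
        # Honor Retry-After if the server sends it, otherwise back off exponentially.
        retry_after = resp.headers.get("Retry-After")
        time.sleep(float(retry_after) if retry_after else delay)
        delay *= 2
    return resp
```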
Hi there,
We are also seeing slower-than-usual API performance. Are there any baseline numbers for how long an item create/update should take? Is there any data we can gather that would help the Clover team determine whether there are bottlenecks?
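For example, here's a rough sketch of the kind of timing data we could collect on our side, in Python (the merchant ID, token, and item payload are placeholders, and it assumes the standard v3 inventory items endpoint):

```python
import time
import statistics
import requests

BASE = "https://api.clover.com/v3/merchants/{mid}/items"

def time_item_creates(mid, token, n=20):
    """Create n test items and report request latency stats in seconds."""
    url = BASE.format(mid=mid)
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    latencies = []
    for i in range(n):
        start = time.monotonic()
        resp = requests.post(url, headers=headers,
                             json={"name": f"latency-test-{i}", "price": 100})
        latencies.append(time.monotonic() - start)
        resp.raise_for_status()
    print(f"min={min(latencies):.2f}s  "
          f"median={statistics.median(latencies):.2f}s  "
          f"max={max(latencies):.2f}s")
```

Happy to run something like this against our sandbox and share the numbers if that would help.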
Much appreciated,
-Rares @ Shopventory