I just had an issue where my server got a huge number of hits to the /api/v1/status endpoint, which basically killed my PHP-FPM server (it hit the maximum number of workers, preventing people from connecting to any other sites in the same pool).
Does Cachet cache the response to this API call? It would be good to cache its response (even if only for a short period of time, like 30 seconds), so that a large number of hits don't cause any perf issues. If a lot of hits come in at the same time, they could all receive the cached response rather than having to recompute it + hit the DB.
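The idea above — many simultaneous hits sharing one cached response instead of each recomputing it — can be sketched in plain PHP. This is not Cachet's actual code; the `cachedStatus` helper, the `$computeStatus` callable, and the array-backed cache are all hypothetical stand-ins for whatever cache store (APCu, Redis, Laravel's `Cache` facade) a real deployment would use.

```php
<?php
// Minimal micro-caching sketch: recompute the status at most once per TTL
// window, and serve every other request from the cached copy.
function cachedStatus(callable $computeStatus, array &$cache, int $ttl = 30): array
{
    $now = time();

    // Cache hit: entry exists and has not expired yet -> skip the DB entirely.
    if (isset($cache['expires']) && $cache['expires'] > $now) {
        return $cache['value'];
    }

    // Cache miss: recompute once, then store with an expiry timestamp.
    $cache = [
        'value'   => $computeStatus(),
        'expires' => $now + $ttl,
    ];

    return $cache['value'];
}
```

Even a short TTL like 30 seconds caps the expensive work at one computation per window, no matter how many requests arrive in between.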
Thanks for the reply! In theory I could use Nginx FastCGI caching to cache the endpoint, but it seems like Cachet is sending a Cache-Control: no-cache header, making the content uncacheable. Is there an easy way to modify /api/v1/status to send a caching header? Then I could just configure Nginx to cache it.
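For what it's worth, Nginx can be told to cache the endpoint even while Cachet keeps sending `Cache-Control: no-cache`, by ignoring that header for this one location. A rough sketch, assuming a standard PHP-FPM setup — the zone name `STATUS`, cache path, and socket path are placeholders to adapt to the existing server block:

```nginx
# Shared-memory zone for the cached /api/v1/status responses.
fastcgi_cache_path /var/cache/nginx/status levels=1:2 keys_zone=STATUS:1m
                   max_size=10m inactive=60s;

server {
    # ... existing Cachet server block ...

    location = /api/v1/status {
        include fastcgi_params;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # match your PHP-FPM socket

        fastcgi_cache STATUS;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 30s;   # serve the cached copy for 30 seconds

        # Cachet's Cache-Control: no-cache would normally make the response
        # uncacheable; ignore it (and cookies) for this endpoint only.
        fastcgi_ignore_headers Cache-Control Expires Set-Cookie;

        # On a cache miss, let concurrent requests wait for a single upstream
        # fetch instead of stampeding PHP-FPM all at once.
        fastcgi_cache_lock on;
    }
}
```

`fastcgi_ignore_headers` is the key directive here: it sidesteps the no-cache header without having to modify Cachet itself, and `fastcgi_cache_lock` addresses the original worker-exhaustion scenario directly.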