DB pool exhaustion
azuyah
PRO
OP

23 days ago

Hello, we have had a lot of problems lately with our backend: lots of DB pool exhaustion and connection limit errors. We are fairly new to this and don't know how to solve it. Customers are being affected by this. Can you help? Thanks

Attachments

$30 Bounty

5 Replies

Railway
BOT

23 days ago

This thread has been marked as public for community involvement, as it does not contain any sensitive or personal information. Any further activity in this thread will be visible to everyone.

Status changed to Open by Railway, 23 days ago


haato

23 days ago

Hey, what's your average throughput, especially on the endpoints that make database queries? It would also be useful to know how many instances of your service are running; without some context it's hard to pinpoint the issue. That said, I'd recommend reviewing the DB connection pool size configuration: a limit of 10 connections seems very low, especially for something in production.
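One way to gather that kind of context from the app side is to log the connection pool's own status counters. A minimal sketch, assuming SQLAlchemy (which the traceback below confirms the app uses); the URL and sizes here are placeholders, not the thread's actual config:

```python
# Sketch: inspecting SQLAlchemy's connection pool at runtime.
# The URL and pool sizes are placeholders, not the real app's config.
from sqlalchemy import create_engine, text
from sqlalchemy.pool import QueuePool

engine = create_engine(
    "sqlite://",      # placeholder URL; point this at your real database
    poolclass=QueuePool,
    pool_size=10,     # the limit the error message in this thread mentions
    max_overflow=10,
)

with engine.connect() as conn:
    conn.execute(text("SELECT 1"))

# QueuePool.status() summarizes pool size, checked-out connections, and
# current overflow -- worth logging when requests start timing out.
print(engine.pool.status())
```

Logging this periodically (or on timeout errors) shows whether the pool is genuinely saturated or whether connections are being leaked and never returned.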


From the logs, it's app-side DB pool exhaustion: "sqlalchemy.exc.TimeoutError: QueuePool limit of size 10 overflow 10 reached, connection timed out". This means the app exhausted its SQLAlchemy connections, so new requests waited 30s and then failed; the 502s/restarts are secondary effects of the pool starvation. This is a pool/concurrency/query issue. What you can do: decrease the effective DB concurrency per replica, add PgBouncer if you're using Postgres, and make sure sessions are always closed and long-running queries are optimized.
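To make the "make sure sessions are always closed" part concrete, here is a minimal sketch assuming SQLAlchemy's ORM Session; the in-memory SQLite URL, the pool numbers, and the get_contacts name are all illustrative placeholders:

```python
# Sketch of the "always close sessions" advice: context managers guarantee
# the connection goes back to the pool even if a request raises.
# URL, pool sizes, and function name are placeholders for illustration.
from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session
from sqlalchemy.pool import QueuePool

engine = create_engine(
    "sqlite://",         # placeholder; the real app would point at Postgres
    poolclass=QueuePool,
    pool_size=5,         # smaller per-replica pool = lower effective DB concurrency
    max_overflow=2,
    pool_timeout=10,     # fail fast instead of queueing requests for 30s
    pool_pre_ping=True,  # discard dead connections instead of handing them out
)

def get_contacts():
    # `with` closes the session and releases its connection on every exit
    # path, which is what prevents the QueuePool from starving.
    with Session(engine) as session:
        return session.execute(text("SELECT 1")).scalar()
```

The key habit is that every request-scoped session lives inside a `with` block (or a framework dependency that does the same), so no code path can leak a checked-out connection.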



azuyah
PRO
OP

9 days ago

Hello, sorry for the late reply, I was expecting an email when I got a reply. I tried increasing the connections, but then I hit a 503 on Railway saying "Backend.max_conn reached".

As I said, we are very new to this, so I'm not sure how to proceed. We're more than happy to just increase the limits, but when I tried that before it seemed to slow the entire website down for some reason.

I attached a screenshot of the services we are running.

Attachments



haato

9 days ago

What was your throughput (requests per second) when you got the max_conn error? From your screenshot it looks like you're running only one instance of your backend, which could be why you're getting that error (way too many clients connecting to a single instance). Try increasing the number of replicas in your backend's horizontal scaling settings.



azuyah
PRO
OP

9 days ago

I don't know now, sorry. We have a DB and backend for our CRM: emailing lists, contacts, etc., basically everything regarding marketing. Then we have the core backend that handles users, file uploads, etc. But we have like 3 active users, so I doubt that's what's overloading it.

Do you mean literally making copies of the backend to spread the load?

