
We have recently been having issues with Postgres running out of connection slots, and after a lot of debugging and shrugging of shoulders we have pretty much tracked it down to the fact that we understood connection pools wrong.

We use Rails, Postgres, Unicorn, and Delayed Job. Are we correct to assume that the connection pool is process-specific, i.e. each process has its own pool of up to 10 (our connection pool limit) connections to the database?

And if there are no threads anywhere in the app, are we correct to assume that for the most part each process will use only 1 connection, since no one ever needs a second one?
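For context, our database.yml looks roughly like this (values are illustrative, not our exact config):

```yaml
# config/database.yml -- illustrative sketch
production:
  adapter: postgresql
  database: myapp_production
  pool: 10        # per-process cap: each Unicorn/DJ process builds its own pool
  timeout: 5000
```

The `pool: 10` is an upper bound per process, not a shared budget across processes.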

Based on these assumptions we tracked it down to the number of processes:

Web server: 4x Unicorn workers = 4 connections
Delayed Job: 3 servers x 30 processes = 90 connections

That's 94 connections, and a couple of connections for `rails console` sessions plus a couple of `rails runner` or rake tasks would explain why we were hitting the limit so often, right? It has been particularly frequent this week, after I converted a plain Ruby script into a `rails runner` script.
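The back-of-envelope math (with the process counts above, and one connection per single-threaded process) works out like this:

```ruby
# Pool size 10 is only an upper bound; a single-threaded process
# normally checks out just one connection from its pool.
unicorn_workers = 4            # web server
dj_processes    = 3 * 30       # 3 worker servers x 30 Delayed Job processes each

total = unicorn_workers + dj_processes
puts total                     # uncomfortably close to the 100-connection limit
```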

We are planning to increase the max from 100 to 200 or 250 to relieve the pressure, but is there a trivial way to implement inter-process connection pooling in Rails?

  • No, there is no trivial way to implement inter-process connection pooling in Rails, and I don't think you want that. You may just be worried because of the recent events. But think about when you want to scale: having separate resources for each process makes scaling trivial. Also, I think 10 connections for each Delayed Job process is a bit too much. Do you really have so many requests filling up the queue? Commented Nov 12, 2013 at 8:31
  • @Chandranshu it's 10 in the pool, but mathematically they are using only 1 each. I have 30 x 3 processes running Commented Nov 12, 2013 at 9:28
  • @Chandranshu scaling is the problem. If I need to scale up my workers, I need a lot of connections :( Commented Nov 12, 2013 at 9:29

1 Answer


You probably want to take a look at PgBouncer, a purpose-built PostgreSQL connection pooler. There are some notes on the PostgreSQL wiki as well, and it's packaged for most Linux distros.
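A minimal sketch of what that might look like (hostnames, database names, and pool sizes are illustrative assumptions, not recommendations):

```ini
; /etc/pgbouncer/pgbouncer.ini -- illustrative sketch
[databases]
myapp_production = host=127.0.0.1 port=5432 dbname=myapp_production

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session        ; safest default for Rails; transaction pooling
                           ; needs care with prepared statements
max_client_conn = 200
default_pool_size = 20
```

Rails then points database.yml at port 6432 instead of 5432, and PgBouncer multiplexes the ~94 client connections onto a much smaller number of actual server connections. Note that if you use `pool_mode = transaction`, Rails generally needs `prepared_statements: false` in database.yml, since prepared statements are per-server-connection.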


3 Comments

Hmm, looks interesting... perhaps we will need to look at this if we hit the problem again.
You almost certainly don't want 100 concurrent connections at the database end. Since you're on Rails and haven't come across PgBouncer, I'm guessing you're not running a 64-core box. A quick rule of thumb is no more than double the number of cores available. PostgreSQL will handle 100 concurrent queries just fine, but you probably won't like how long it takes for the last one to finish.
Yeah, I was thinking we should probably optimize, but right now we are on low load and most of the processes are background workers, i.e. delay is OK. (I realize just now that this means my web processes might get edged out...)
