Hi,

Suppose I am using OpenERP with a load balancer to handle concurrency.

How many users could one instance handle concurrently?

Am I right to assume that PostgreSQL would never run into performance problems, i.e. that the OpenERP application is the bottleneck?

Regards,

Author

Can gunicorn spawn workers across multiple VPSes? If not, I imagine this does not scale well?

Best Answer

DO NOT scale the number of workers to the number of clients you expect to have. Gunicorn should only need 4-12 worker processes to handle hundreds or thousands of requests per second.

Gunicorn relies on the operating system to provide all of the load balancing when handling requests. Generally we recommend (2 x $num_cores) + 1 as the number of workers to start off with. While not overly scientific, the formula is based on the assumption that for a given core, one worker will be reading or writing from the socket while the other worker is processing a request.
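As a rough sketch, a `gunicorn.conf.py` (gunicorn config files are plain Python) can compute that starting point from the core count; the bind address and port below are assumptions, not part of the recommendation:

```python
# gunicorn.conf.py, a minimal sketch of the (2 x num_cores) + 1 rule.
import multiprocessing

# Hypothetical bind address; 8069 is the usual OpenERP HTTP port.
bind = "0.0.0.0:8069"

# Rule-of-thumb starting point; tune under real load.
workers = multiprocessing.cpu_count() * 2 + 1
```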

Obviously, your particular hardware and application are going to affect the optimal number of workers. Our recommendation is to start with the above guess and tune it using the TTIN and TTOU signals while the application is under load.
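For illustration, TTIN and TTOU are sent to the gunicorn master process; a minimal sketch, assuming the master writes its PID to the pidfile path shown:

```python
# Sketch: grow or shrink a running gunicorn master's worker pool.
import os
import signal

# The pidfile path is an assumption; it should match your `pidfile` setting.
with open("/var/run/gunicorn.pid") as f:
    master_pid = int(f.read().strip())

os.kill(master_pid, signal.SIGTTIN)  # increment the worker count by one
os.kill(master_pid, signal.SIGTTOU)  # decrement the worker count by one
```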

Always remember, there is such a thing as too many workers. After a point, your worker processes will start thrashing system resources, decreasing the throughput of the entire system.

For more details about gunicorn-based workers, see the Gunicorn design documentation.

Best Answer

The recommended practice is to use `gunicorn` to handle concurrency. It runs several worker processes serving requests on the same port. The usual rule of thumb is `[number of cores] * 2 + 1` workers.
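To make that concrete, gunicorn needs a WSGI callable to hand to its workers. A minimal sketch of such an entry module follows; the import paths match OpenERP 7 and should be treated as assumptions for other versions, as should the config file path:

```python
# openerp_wsgi.py, a hypothetical WSGI entry point for gunicorn.
# Import paths follow OpenERP 7; newer Odoo releases differ.
import openerp

# Load the usual server configuration (the path is an assumption).
openerp.tools.config.parse_config(["--config=/etc/openerp/openerp-server.conf"])

# The WSGI callable that each gunicorn worker will serve.
application = openerp.service.wsgi_server.application
```

Launched as something like `gunicorn -c gunicorn.conf.py openerp_wsgi:application`, all workers then accept on the same port and the operating system distributes incoming connections among them.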

You still have to take care of your PostgreSQL configuration and tuning, but usually the bottleneck comes from the application server's ORM.

Author Best Answer

More information from OpenDays 2014:

http://www.slideshare.net/openobject/performance2014-35689113
