This question has been flagged
3 Replies
47240 Views

What should the configuration of an Odoo server be, given the following details?

  • 1200 total users

  • 500 concurrent users


What is the best configuration for the Odoo server?

  • Server Configuration ?

  • RAM Size ?

  • Hard Disk Size ?

  • Ubuntu version

Another query

Let's say I have an object called project.project in PostgreSQL, and it reaches its maximum number of records, for example 300,000,000. Now I cannot enter more data into that object.

What should we do then? Is there any functionality to archive the data from the project.project object?

Or will increasing the hard drive do the job? Or do we need to use another database?


Please help ?



Best Answer

With that level of user concurrency you need as much RAM and as many CPU cores as you can get. Let's say you have 32 GB and 10 CPU cores: you need to configure Odoo to use these resources as well as it can, using workers, which enable multiprocessing. Is the database server on the same machine or on another one? If it's on the same server, you need to set aside an amount of RAM, outside the Odoo resource calculations, for PostgreSQL memory tuning. I also recommend a pooling strategy in front of the database server, like PgBouncer, which will protect your PostgreSQL instance from running out of connection slots.
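To make the PgBouncer idea concrete, here is a minimal sketch of a pgbouncer.ini; the host, database name, and pool sizes are placeholder assumptions, not values from this thread, so tune them for your own load:

```ini
; pgbouncer.ini -- hypothetical example, adjust for your setup
[databases]
; route the "odoo" database through the pooler
odoo = host=127.0.0.1 port=5432 dbname=odoo

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; Odoo uses session-level features, so keep session pooling
pool_mode = session
; allow more client connections than real server slots
max_client_conn = 600
default_pool_size = 50
```

Odoo would then connect to port 6432 instead of 5432, and PgBouncer queues excess clients instead of letting PostgreSQL run out of connection slots.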

For resource configuration in Odoo, such as RAM and CPU cores, you need to determine how many workers you need (on other servers like gunicorn the formula is workers = cores * 2 + 1; Odoo is very similar, but workers = cores is also fine, and better than no workers at all) and what min/max amount of RAM you will assign to every worker (default is 640 MB). This is important; take an example:

workers = cores * 2 + 1, and cores is 10, then:

workers = 21

ram = 640 MB * 21 = 13440 MB ≈ 13 GB ... using the default per-worker RAM value
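The arithmetic above can be sketched as a small helper, using the gunicorn-style formula and the 640 MB default mentioned here (both are rules of thumb, not fixed Odoo requirements):

```python
# Worker/RAM sizing sketch based on the rule of thumb above.

def odoo_workers(cores: int) -> int:
    """Gunicorn-style rule of thumb: workers = cores * 2 + 1."""
    return cores * 2 + 1

def worker_ram_mb(workers: int, per_worker_mb: int = 640) -> int:
    """Total RAM budget for the workers, in MB (640 MB default per worker)."""
    return workers * per_worker_mb

cores = 10
workers = odoo_workers(cores)      # 21 workers for 10 cores
total_mb = worker_ram_mb(workers)  # 13440 MB, roughly 13 GB
print(workers, total_mb)
```

Plugging in different per-worker limits (e.g. 2048 MB on newer versions) shows quickly whether your worker count still fits in physical RAM.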

Another consideration: if you have a process in Odoo that consumes more RAM than the worker limit, you will get a MemoryError in that process. Then you need to increase the workers' RAM or optimize the process (usually by writing data that you hold in memory out to disk).
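These limits live in the Odoo config file; a sketch with example values (the byte counts below are my own illustration of a 640 MB soft / 768 MB hard limit, not defaults you must use):

```ini
; odoo.conf excerpt -- example values for 10 cores / 32 GB RAM
workers = 21
; soft limit: the worker is recycled after the request that exceeds it
limit_memory_soft = 671088640    ; 640 MB in bytes
; hard limit: the request is killed immediately (MemoryError)
limit_memory_hard = 805306368    ; 768 MB in bytes
; time limits also protect workers from runaway requests
limit_time_cpu = 60
limit_time_real = 120
```

Note that both memory limits are expressed in bytes, which is easy to get wrong by a factor of 1024.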

If you set up a load balancer for Odoo you need to take sessions into account and share session data between the Odoo instances. I answered a question here in the forum about doing that with Redis as the session store.
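For illustration, a minimal nginx load-balancer sketch (the backend addresses are placeholders; this assumes two Odoo instances behind one nginx):

```nginx
# nginx load-balancer sketch -- hostnames/ports are hypothetical
upstream odoo_backends {
    # ip_hash pins each client to one backend; with a shared
    # session store (e.g. Redis) you can drop it and balance freely
    ip_hash;
    server 10.0.0.11:8069;
    server 10.0.0.12:8069;
}

server {
    listen 80;
    location / {
        proxy_pass http://odoo_backends;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Without either ip_hash or a shared session store, a user logged in on one backend will appear logged out when a request lands on the other.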

These are my 2 cents.

Hope some of this info helps

Author

Hello Axel, thanks for sharing the in-depth details on load sharing! Indeed it is helpful, kudos.

In Odoo 8 the default memory consumed by a worker is 2048 MB, so the RAM calculation changes:
ram = 2048 MB * 7 = 14336 MB ≈ 14 GB


Best Answer

Hello,

Please refer to this:

http://www.slideshare.net/openobject/performance2014-35689113

Maybe it will be helpful to you.

Thanks


Author

Thanks, that was a helpful link.

Best Answer

As for sizing the server, you will definitely need some disk space, and RAM will be needed, but it should work on 16-32 GB just fine. The Ubuntu version is not really relevant, but use some newer release (I'd suggest the LTS versions, 12.04 or 14.04).

As for the second part of the question: in Postgres there is no such thing as a row-count or record limit.
The only real limit is disk space (check this link and see some BIG table testing).

As for the server config, you should consider a configuration that uses a load balancer, which is safer and cheaper to assemble than a high-end single server. Also consider virtualization and DB replication (in case your main DB server fails).

This is not a cookbook, it is only an opinion...

hope it helps a bit

Author

For Ubuntu we are going to use 14.04 LTS. I did research at the object level and found that a single object can take up to 32 TB of space. Even though 32 TB is big, is there any way of archiving the data? I know one approach, setting up a trigger and load-balancing to another PostgreSQL database, but then all my historical data would be gone! Yes, we are going to use a primary/secondary setup and disaster management for the production server.
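One common archiving pattern at the PostgreSQL level is to move old rows into a separate archive table. A hypothetical sketch (the archive table name and the 5-year cutoff are my own illustration; create_date is the timestamp column Odoo adds to its tables):

```sql
-- Hypothetical archiving sketch: move rows older than 5 years
-- into an archive table, inside one transaction.
BEGIN;

CREATE TABLE IF NOT EXISTS project_project_archive
    (LIKE project_project INCLUDING ALL);

WITH moved AS (
    DELETE FROM project_project
    WHERE create_date < now() - interval '5 years'
    RETURNING *
)
INSERT INTO project_project_archive SELECT * FROM moved;

COMMIT;
```

Be aware that foreign keys referencing the archived rows will block the DELETE, so in practice related tables (tasks, analytic lines, etc.) have to be archived together.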


Better to use a base backup (pg_basebackup) for the second server and a replication strategy like streaming replication.
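For reference, seeding the standby with pg_basebackup looks roughly like this; the hostname, user, and data directory are placeholders for your own environment:

```shell
# Take a base backup from the primary and write a standby-ready
# data directory; -X stream ships WAL during the backup and -R
# writes the recovery settings for streaming replication.
pg_basebackup -h primary.example.com -U replicator \
    -D /var/lib/postgresql/standby -X stream -R -P
```

Once the standby starts from that directory it connects back to the primary and keeps replaying WAL continuously, so the historical data is preserved rather than lost.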