Odoo Help

Andreas Brueckl
On 3/6/13, 2:32 PM

I use the following setup:

  1. Backup-Script /var/scripts/dump_db.sh

    ## OpenERP Backup
    ## Backup databases: openerpdb1, openerpdb2
    hostname=`hostname`
    # Stop OpenERP Server
    /etc/init.d/openerp-server stop
    # Dump DBs
    for db in openerpdb1 openerpdb2
    do
      date=`date +"%Y%m%d_%H%M%N"`
      filename="/var/pgdump/${hostname}_${db}_${date}.sql"
      pg_dump -E UTF-8 -p 5433 -F p -b -f $filename $db
      gzip $filename
    done
    # Start OpenERP Server
    /etc/init.d/openerp-server start
    exit 0
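Note that `gzip $filename` replaces the `.sql` file with a `.sql.gz` file; the compression is lossless, so `gunzip` restores the exact original dump. A scratch-file sketch (the file contents are a toy example):

```shell
tmp=$(mktemp -d)
echo "CREATE TABLE res_partner (id int);" > "$tmp/dump.sql"
gzip "$tmp/dump.sql"           # now only dump.sql.gz exists
gunzip "$tmp/dump.sql.gz"      # back to the original dump.sql
cat "$tmp/dump.sql"
rm -r "$tmp"
```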
  2. Housekeeping script /var/scripts/housekeeping.sh (deletes backups which are older than 30 days)

    logfile=/var/pgdump/housekeeping.log   # example path for the deletion log
    rm -f $logfile
    for file in `find /var/pgdump/ -mtime +30 -type f -name '*.sql.gz'`
    do
      echo "deleting: " $file >> $logfile
      rm $file
    done
    exit 0
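A quick way to sanity-check what the `find -mtime +30` expression will delete is to run it against a scratch directory instead of the real backup folder (paths below are throwaway examples; `touch -d` is GNU coreutils):

```shell
# Create a scratch dir with one "old" and one "new" fake backup
tmp=$(mktemp -d)
touch -d "40 days ago" "$tmp/old.sql.gz"   # backdate the mtime past the cutoff
touch "$tmp/new.sql.gz"
# Only the 40-day-old file matches the housekeeping expression
find "$tmp" -mtime +30 -type f -name '*.sql.gz'
rm -r "$tmp"
```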
  3. Create daily cronjobs in /etc/crontab. The backup runs daily at 1am and the housekeeping job runs daily at 5am.

    # m h dom mon dow user  command
    0 1 * * * postgres /var/scripts/dump_db.sh
    0 5 * * * postgres /var/scripts/housekeeping.sh

If you run your dump_db.sh as the postgres user, you don't need to use /root/.pgpass.

0 1 * * * postgres /var/scripts/dump_db.sh
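If you do need a .pgpass file for another user, its line format is fixed (the values below are examples, not from this setup):

```
# ~/.pgpass — one line per connection, fields are:
# hostname:port:database:username:password
localhost:5433:*:openerp:mysecret
```

PostgreSQL ignores the file unless its permissions are 0600 (`chmod 600 ~/.pgpass`).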

Ian Beardslee
on 3/7/13, 1:00 AM

Thank you, this simplifies the backup! I have updated my answer.

Andreas Brueckl
on 3/7/13, 3:30 AM

Thanks for the scripts and answer. Just a small thing I noticed: in point 3 you use /var/scripts, but in point 2 you place the scripts in /var/script.

on 4/22/13, 7:47 AM

I've updated the answer to fix the typo.

Tuxservices, Jeudy Nicolas
on 9/3/13, 7:57 PM

What should I use for 'hostname'? I am getting the following error:

    dump_db.sh: 1: dump_db.sh: ubuntupinn: not found
    Stopping openerp-server: openerp-server.
    pg_dump: [archiver (db)] connection to database "acctpinntest" failed: could not connect to server: No such file or directory
            Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5433"?
    gzip: /var/pgdump/_acctpinntest_20130906_1729547372172.sql: No such file or directory
    Starting openerp-server: openerp-server.

on 9/6/13, 9:24 PM

Any help on this script, please? It's working except for host name. We have used our Ubuntu host name and it doesn't work. Any help is appreciated, thanks!

on 9/9/13, 1:46 AM

Can someone assist with the 'hostname' - I posted the error we get with the above custom script, so I assume the issue is with the hostname. We are running the scripts with a crontab on the server itself. Any assistance would be greatly appreciated - Thanks!

on 9/13/13, 5:23 PM

What happens if you just run "echo hostname" at the command prompt? Note that that is hostname with the backticks NOT 'hostname' with the single quotes. Also is your database server the same as your application server? .. .. mutter mutter .. how does one show the backticks in this comment section?

Ian Beardslee
on 9/16/13, 3:07 AM

The full path is "/bin/hostname". This command just returns the value in "/etc/hostname". But you can also assign a custom value to the variable.
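To make the backtick confusion above concrete: backticks are shell command substitution, so the command's *output* (not the word "hostname" itself) ends up in the variable. The dump path below is just an illustrative example:

```shell
host=`hostname`        # backticks: runs the hostname command, captures its output
also_host=$(hostname)  # modern, equivalent syntax
echo "dump prefix would be: /var/pgdump/${host}_mydb.sql"   # mydb is an example name
```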

Andreas Brueckl
on 9/16/13, 5:09 AM

if I type echo hostname from ssh I get hostname

on 9/16/13, 6:28 PM

I tried the script with hostname as listed above and I still get errors. The OpenERP server stopped and restarted correctly, but the dump db code produced errors. I am on Ubuntu Server 12.10.

on 9/16/13, 6:35 PM

You need to type "echo <backtick>hostname<backtick>" where <backtick> is that sloping quote mark eg .. ` .. usually next to the 1 on the keyboard. This comment section doesn't seem to want to display that in with the what you need to type. Also is your database server the same as your application server?

Ian Beardslee
on 9/16/13, 11:56 PM

What should be given in place of hostname, openerpdb1 and openerpdb2? Should we use the database names we are currently using?

on 1/27/14, 10:30 AM

Sureka, yes. In the above script 'hostname' is picked up as part of the script, so you can leave that. But openerpdb1 and openerpdb2 are the names of the databases you want backed up; you can have as many there as you want.

Ian Beardslee
on 1/28/14, 7:35 PM

Hi, what does the 5433 stand for in "pg_dump -E UTF-8 -p 5433 -F p -b -f $filename $db"?

Patrick Yap
on 3/24/14, 6:42 AM

It is the pg_dump option "-p port" / "--port port": the TCP port (or local socket) to connect on, in case you are not using the default one (5432).

Tuxservices, Jeudy Nicolas
on 3/24/14, 9:36 AM

Thanks now I understand :D

Patrick Yap
on 3/24/14, 11:25 PM


Hi Andreas, every time I try to run the .sh file without cron (for testing) I get this error: pg_dump: [archiver (db)] connection to database "DATABASENAME" failed: FATAL: role "root" does not exist. Should I add a role in my postgres?

Patrick Yap
on 3/25/14, 10:55 PM

Has anybody tried restoring with a backup created this way? When I restore using the pg_restore commandline, the database is created, but in the database manager I cannot see a database. When I use the database manager webpage I get an error.

Stijn Staelens
on 9/22/14, 4:40 AM

@Stijn did you find a way? I'm facing the same problem

César Bustíos Benites
on 11/12/14, 12:14 PM
António Sequeira
On 6/6/13, 8:04 PM


Here is the script I made for a 24x7 running service with several medium-size databases


  • keeps a logfile
  • backs up multiple databases
  • does a vacuumdb before backing up
  • doesn't stop the openerp server, and
  • removes old files (more than 10 days old).

Simple and effective...

The script is executed via cron with

 00 01 * * * nice -19 /home/mbmaster/scripts/backupdb > /dev/null 2>&1

the "nice -19" is important to lower the priority of the backup
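A minimal check of what `nice -19` actually does: the child process runs at the weakest scheduling priority (niceness 19 on Linux), which `nice` with no arguments reports:

```shell
echo "current niceness: $(nice)"          # usually 0 in an interactive shell
echo "backup niceness:  $(nice -19 nice)" # the inner nice reports 19
```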

 # Backup script starts here.

 # Location of the backup logfile (path is an example).
 logfile="/home/mbmaster/backups/backup.log"

 # Location to place backups (path is an example).
 backup_dir="/home/mbmaster/backups"

 touch $logfile
 timeslot=`date +%d%m%y%H%M%S`
 databases=`psql -U postgres -q -c "\l" | awk '{ print $1}' | grep -vE '^\||^-|^List|^Name|template[0|1]|^\('`

 for i in $databases; do
   timeinfo=`date '+%T %x'`
   echo "Backup and Vacuum started at $timeinfo for time slot $timeslot on database: $i " >> $logfile
   /usr/bin/vacuumdb -z -U postgres $i >/dev/null 2>&1
   /usr/bin/pg_dump $i -U postgres | gzip > "$backup_dir/openerp-$i-$timeslot-database.gz"
   timeinfo=`date '+%T %x'`
   echo "Backup and Vacuum complete at $timeinfo for time slot $timeslot on database: $i " >> $logfile
 done

 # delete files more than 10 days old
 find $backup_dir/openerp* -mtime +10 -exec rm {} \;

Thanks for sharing.

Daniel Reis
on 9/2/13, 5:45 AM

What is the name of the script file? Is it backupdb.sh? Will this work on Ubuntu Server 12?

on 9/6/13, 9:25 PM

Has anybody tried restoring a backup created this way? When I restore using pg_restore odoo isn't "seeing" the database. When I go to the database manager webpage no databases are found. When I restore using the webpage I get an error.

Stijn Staelens
on 9/22/14, 4:37 AM

I noticed that when you zip the dump files, they can't be restored!

Rami Talat
on 11/18/14, 11:37 AM
Timo Goosen
On 8/11/14, 4:23 AM

Remember to test your backups otherwise they aren't backups.

Ivan Elizaryev
On 3/9/15, 7:51 AM

I'd like to share my solution with the community.

Key features:

  • It uses the python rotate-backups tool to apply a more complex strategy for deleting old backups. E.g. you keep 7 backups for the last 7 days, 4 backups for each week of the last 30 days, 12 backups for each month of the last year, etc.
  • It can back up the postgresql dump only, as well as a full dump (with the filestore)
  • It reads the odoo configuration file to get data_dir and the parameters to access the database

To use the script, install its dependencies:

pip install rotate-backups

Then download the script to /usr/local/bin and make it executable:

cd /usr/local/bin/

wget -q https://gist.githubusercontent.com/2abdd91d00dddc4e4fa4/raw/odoo-backup.py -O odoo-backup.py

chmod +x odoo-backup.py

Then add lines to /etc/crontab like these:

# m h dom mon dow user    command
11 6    * * *    ODOO_USER odoo-backup.py -d DATABASE_NAME -p /PATH/TO/BACKUPS -c /PATH/TO/ODOO.conf --no-save-filestore --daily 8 --weekly 0 --monthly 0 --yearly 0
12 4    * * 7    ODOO_USER odoo-backup.py -d DATABASE_NAME -p /PATH/TO/BACKUPS -c /PATH/TO/ODOO.conf 

To test the script, execute something like this:

sudo su - ODOO_USER -s /bin/bash -c "odoo-backup.py -d DATABASE_NAME -p /PATH/TO/BACKUPS -c /PATH/TO/ODOO.conf"


If you have a lot of products with images you can check out that question about moving images from database to filestore: https://www.odoo.com/forum/help-1/question/how-can-i-add-product-images-from-filestore-1238

I like your strategy for historical backups. Is your script still available? I couldn't seem to find it.

Tim Drinkwater
on 7/25/16, 6:38 PM

Greatly appreciated. Thank you.

Tim Drinkwater
on 7/26/16, 9:05 AM

Hello Ivan,

Looks like a nice solution; I will test it later today. Currently I am using auto_backup, but I like that your module can also save the filestore. Is it also possible to restore a backup with your module? I am experiencing problems restoring big backups with the Odoo Database Manager.

Demian Rihs
on 8/11/16, 12:07 PM

Demian, you probably have a restriction on uploaded file size on your nginx/apache. Try to upload directly to odoo by adding port 8069:


Ivan Elizaryev
on 8/12/16, 1:03 AM
Nicholas Riegel 2
On 3/6/13, 3:23 PM

If your OpenERP server is a Linux server, then here is a link to some beta bash scripts that may accomplish the backup portion.


Please note these scripts are released under the GNU GPL. If you improve them, please share the changes with me and others.

As far as doing a hot backup, a true "hot" backup is not possible with atomized databases (like PostgreSQL). You can dump the database to a dump file, then back up the dump file. While PostgreSQL is dumping the database, no operations can be performed, so it's best to do a database dump when OpenERP is not being used (i.e. 4 a.m. Sunday morning). If you set up a slave PostgreSQL server, then the slave server may be able to be dumped, but I do not know this for certain. The dump operation takes only a few seconds (of course this depends on the size of your database, the speed of your server, etc.).

If you have a master-slave(s) setup, you may be able to dump a slave database (slave databases just replicate what is in the master to support fast read operations; only the master can add, delete, update) while the master is able to devote resources to adds, deletes and updates.

The scripts above are only set up to backup a single or master PostgreSQL database. I haven't done any testing in an environment that has master slave(s).

I haven't experimented with incremental backups of a database dump files. I suppose backup tools like TAR and others could perform this since the database dump file is just a file.
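GNU tar can indeed do incremental archiving of a dump directory via listed-incremental snapshot files. A sketch on a scratch directory (all paths and filenames here are made-up examples):

```shell
tmp=$(mktemp -d)
mkdir "$tmp/dumps"
echo "full dump" > "$tmp/dumps/monday.sql"

# Level-0 (full) archive; the .snar snapshot file records what was stored
tar --listed-incremental="$tmp/state.snar" -czf "$tmp/level0.tar.gz" -C "$tmp" dumps

# A later dump appears; the next run with the same .snar stores only the new file
echo "new dump" > "$tmp/dumps/tuesday.sql"
tar --listed-incremental="$tmp/state.snar" -czf "$tmp/level1.tar.gz" -C "$tmp" dumps

tar -tzf "$tmp/level1.tar.gz"   # lists the dumps/ dir and only dumps/tuesday.sql
rm -r "$tmp"
```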

I hope this helps!

You can include the address, you just can't format it as a hyperlink. Edit the answer to add that and I'll format it for you.

Daniel Reis
on 3/6/13, 6:16 PM

Thanks Daniel. Actually you can take credit for one of the scripts, as I branched it from you!

ITpedia Solutions, LLC, Nicholas Riegel 2
on 3/7/13, 8:52 PM
Leonardo Donelli
On 7/6/14, 4:51 AM

Another good option is barman; see this nice presentation about it from the 2014 OpenDays:

  • Hot, full, differential and incremental backups
  • Supports multiple databases
  • Remote management
  • And a lot of other features. Again, see the presentation.
Dharmesh Rathod
On 6/7/13, 3:28 AM


There is a module available, "auto_backup", which takes backups of your database on a regular basis.

Email: info@acespritech.com | Skype: acespritech | Blog: acespritechblog.wordpress.com

Can't find it. Closest thing is this informative blog entry on backups, but no mention of a script or module ...

Daniel Reis
on 9/2/13, 5:44 AM

I had this working on V7; following an upgrade it stopped working. I eventually tracked the initial problem down to where the addon was installed, so one problem sorted. However, it now appears the module is only available for V6 (https://apps.openerp.com/apps/modules/6.0/auto_backup/) - although the app now runs, it doesn't back up! - will keep trying :(

on 2/23/15, 5:40 AM
Olek Omelchneko
On 9/13/16, 7:05 AM

I have tested a lot of tools, and now I use PostgreSQL-Backup: http://postgresql-backup.com/postgresql-blog/backup-tool It's a simple tool which creates remote PostgreSQL database backups, zips and encrypts them, and sends them to a folder, FTP, Dropbox, Box, Google Drive, MS OneDrive, Amazon S3 or Windows Azure Storage. It runs on a flexible schedule and sends email confirmations on job success or failure. It also backs up file folders, lets you view the results on the web, and more.

Gabriele Bartolini
On 2/3/15, 6:16 PM

Hi there,

version 1.4.0 of Barman, Backup and Recovery Manager for PostgreSQL, includes support for transparent file-level incremental backup, which can bring significant reductions in terms of disk space, backup time and network consumption.

Please look at the following resources for more information:

  • http://www.pgbarman.org/barman-1-4-0-released/
  • http://blog.2ndquadrant.com/incremental-backup-barman-1-4-0/ (a blog article I have written about this feature)
  • http://www.slideshare.net/openobject/odoo-disaster-recovery-with-barman (slides I presented at Odoo conference 2014 in Brussels)
  • https://www.youtube.com/watch?v=Ka-4R43XJFs (introductory video on Barman)

One important aspect of Barman is that it also implements declarative retention policies (by REDUNDANCY or RECOVERY WINDOW), as well as other features such as WAL compression. Most importantly, it allows you to recover to any point in time with just a simple command.

Thank you,



On 8/11/14, 5:19 AM

If your system is Ubuntu, you can install the autopostgresqlbackup package:

sudo apt-get update

sudo apt-get install autopostgresqlbackup

This is the best tool for me. It automatically creates hot backups: latest, daily, weekly and monthly, with rotation.


On 7/17/14, 4:55 AM


You can just install the auto_backup module from the extra-addons. This way you can configure your backups from inside OpenERP.

No need to stop your openerp-server; everything just keeps working while making a backup.

If you need some extra bells and whistles, you can still add a script or something.

Hm, there is no such auto_backup module in https://github.com/odoo/odoo-extra.

None of the branches has it

on 3/6/15, 5:31 AM

I use Backupninja: http://manpages.ubuntu.com/manpages/gutsy/man1/backupninja.1.html

Simple, and it works fine together with Duplicity (https://help.ubuntu.com/community/DuplicityBackupHowto) to provide offsite backups.

Nice tip! Thanks.

Daniel Reis
on 9/2/13, 5:45 AM
prasoon gupta
On 5/23/16, 1:58 AM

1. Here is a simple backup script:


cd /home/prasoon/p                          # path where you want to save your backup file

date=`date +"%Y%m%d_%H%M%N"`                # timestamp used in the backup filename

pg_dump admin | gzip > prasoon_$date.gz     # dump the "admin" database and compress it

Save this file as backup.sh inside /etc/cron.daily

and using crontab you will be able to take the backup.
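One caveat with /etc/cron.daily: on Debian/Ubuntu these jobs are launched by run-parts, which by default skips filenames containing a dot, so a file named backup.sh would silently never run. A sketch of installing it under a dot-free name (the name odoo-backup is just an example):

```shell
# Copy under a name run-parts will accept, and make it executable
sudo cp backup.sh /etc/cron.daily/odoo-backup
sudo chmod +x /etc/cron.daily/odoo-backup
# run-parts --test lists the jobs that would actually be executed
run-parts --test /etc/cron.daily
```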

2. After creating your script


crontab -e

1 * * * * /etc/cron.daily/backup.sh

Yeah, this is the right code, it is working...

kundan Verma
on 5/23/16, 2:28 AM
E.R. Spada II
On 10/7/15, 1:06 AM

Here is a better solution: you can set an extra SFTP location. I also added the local backup to a Dropbox folder, so now I have 3 backups - https://github.com/Yenthe666/auto_backup

Francesco OpenCode
On 3/6/13, 2:31 PM

You can insert a cron line where you can use pg_dump

How is this possible in Windows?

on 9/2/13, 5:47 AM
On 9/4/13, 6:07 AM

1 - Backup-Script /var/lib/postgresql/postgres_db_backup.sh


# Location to place backups.
backup_dir="/var/backups/postgres_db/"

# String to append to the name of the backup files
backup_date=`date +%Y-%m-%d_%H-%M`

# Number of days you want to keep copies of your databases
number_of_days=30

databases=`psql -l -t | cut -d'|' -f1 | sed -e 's/ //g' -e '/^$/d'`
for i in $databases; do
  if [ "$i" != "template0" ] && [ "$i" != "template1" ] && [ "$i" != "postgres" ]; then
    backupfile=$backup_dir$i\_$backup_date.sql.gz
    echo Dumping $i to $backupfile
    pg_dump $i | gzip > $backupfile
  fi
done
find $backup_dir -type f -prune -mtime +$number_of_days -exec rm -f {} \;
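The `psql -l -t | cut | sed` pipeline above just extracts the first column (the database name) from the listing; you can see the filtering in isolation by feeding it mock psql output (the database names below are examples):

```shell
# Mock of `psql -l -t` output (name | owner | encoding ...), fed through the same pipe
printf ' mydb      | odoo     | UTF8\n template0 | postgres | UTF8\n' \
  | cut -d'|' -f1 | sed -e 's/ //g' -e '/^$/d'
# -> mydb
#    template0   (which the if-test in the loop then skips)
```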

2 - open your terminal

  • su

  • su postgres

  • crontab -e


  • 45 */4 * * * /var/lib/postgresql/postgres_db_backup.sh
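For reference, the five leading crontab fields are minute, hour, day-of-month, month and day-of-week, so this entry fires at minute 45 of every fourth hour:

```
# m   h    dom mon dow   command
  45  */4  *   *   *     /var/lib/postgresql/postgres_db_backup.sh
# 45  -> at minute 45
# */4 -> every 4th hour (00:45, 04:45, 08:45, ...)
```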

45 */4 * * * /var/lib/postgresql/postgres_db_backup.sh - can you explain the figures? And how do I solve "permission denied"?

Patrick Yap
on 3/24/14, 10:30 PM

After creating the directory /var/backups/postgres_db/, do: chmod 777 postgres_db

on 3/26/14, 1:34 PM

Wonderful solution, thx.

Rami Talat
on 11/16/14, 5:43 PM

Restore failed! Please find a way to correct it.

Rami Talat
on 11/16/14, 8:19 PM
sepuuya merit
On 2/19/15, 8:29 AM


I have used the official answer, but when I test it this is the error I get; please help:

sudo sh /var/scripts/dump_db.sh
Stopping openerp-server: openerp-server.
pg_dump: [archiver (db)] connection to database "SPIDDAFRICA" failed: could not connect to server: No such file or directory
        Is the server running locally and accepting
        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5433"?
gzip: /var/pgdump/odoo_SPIDDAFRICA_20150219_0658056726971.sql: No such file or directory
pg_dump: [archiver (db)] connection to database "EMBARQ" failed: could not connect to server: No such file or directory
        Is the server running locally and accepting
        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5433"?
gzip: /var/pgdump/odoo_EMBARQ_20150219_0658104368481.sql: No such file or directory
pg_dump: [archiver (db)] connection to database "Opencare" failed: could not connect to server: No such file or directory
        Is the server running locally and accepting
        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5433"?
gzip: /var/pgdump/odoo_Opencare_20150219_0658180975532.sql: No such file or directory
Starting openerp-server: openerp-server.


On 2/7/15, 7:07 AM

We also solved this with a nightly cron job, which backs up the database every night and replaces older DB copies.

Asked: 3/6/13, 1:45 PM
Seen: 101752 times
Last updated: 12/30/17, 12:45 AM