

OperationalError: out of shared memory [Closed]

By patrick on 3/12/13, 6:38 AM · 3,063 views

The question has been closed by patrick on 04/09/2013 05:51:32

I am testing OpenERP v7 and did the following: I created an order with 28 lines (28 different products), then confirmed it. Going to the delivery order, I clicked "Check Availability", which generated the error above in my browser.

I also tried checking orders with fewer lines; the one I tried had 18 lines and worked without issues, except that it took some time (you get the screen telling you to wait a little).
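For reference, the "Check Availability" click can also be reproduced outside the browser over XML-RPC; a minimal sketch, with an assumed URL, credentials and picking id, which ends up in the same stock.picking action_assign call shown in the traceback below:

  # Minimal sketch only: URL, credentials and picking id are assumptions.
  import xmlrpclib  # Python 2, matching the server's Python 2.7 stack

  url = 'http://localhost:8069/xmlrpc'        # assumed server address
  db, user, pwd = 'test1', 'admin', 'admin'   # assumed credentials; 'test1' is the database name from the log below

  uid = xmlrpclib.ServerProxy(url + '/common').login(db, user, pwd)
  models = xmlrpclib.ServerProxy(url + '/object')

  picking_id = 1  # hypothetical delivery order (stock.picking) id
  # "Check Availability" calls action_assign on the picking, which tries to
  # reserve every move line of the order within a single transaction
  models.execute_kw(db, uid, pwd, 'stock.picking', 'action_assign', [[picking_id]])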

Traceback:

Client Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/http.py", line 203, in dispatch
    response["result"] = method(self, **self.params)
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/controllers/main.py", line 1078, in call_button
    action = self._call_kw(req, model, method, args, {})
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/controllers/main.py", line 1066, in _call_kw
    return getattr(req.session.model(model), method)(*args, **kwargs)
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/session.py", line 43, in proxy
    result = self.proxy.execute_kw(self.session._db, self.session._uid, self.session._password, self.model, method, args, kw)
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/session.py", line 31, in proxy_method
    result = self.session.send(self.service_name, method, *args)
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/session.py", line 104, in send
    raise xmlrpclib.Fault(openerp.tools.ustr(e), formatted_info)

Server Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/openerp/addons/web/session.py", line 90, in send
    return openerp.netsvc.dispatch_rpc(service_name, method, args)
  File "/usr/lib/pymodules/python2.7/openerp/netsvc.py", line 293, in dispatch_rpc
    result = ExportService.getService(service_name).dispatch(method, params)
  File "/usr/lib/pymodules/python2.7/openerp/service/web_services.py", line 618, in dispatch
    res = fn(db, uid, *params)
  File "/usr/lib/pymodules/python2.7/openerp/osv/osv.py", line 188, in execute_kw
    return self.execute(db, uid, obj, method, *args, **kw or {})
  File "/usr/lib/pymodules/python2.7/openerp/osv/osv.py", line 131, in wrapper
    return f(self, dbname, *args, **kwargs)
  File "/usr/lib/pymodules/python2.7/openerp/osv/osv.py", line 197, in execute
    res = self.execute_cr(cr, uid, obj, method, *args, **kw)
  File "/usr/lib/pymodules/python2.7/openerp/osv/osv.py", line 185, in execute_cr
    return getattr(object, method)(cr, uid, *args, **kw)
  File "/usr/lib/pymodules/python2.7/openerp/addons/stock/stock.py", line 769, in action_assign
    self.pool.get('stock.move').action_assign(cr, uid, move_ids)
  File "/usr/lib/pymodules/python2.7/openerp/addons/stock/stock.py", line 2098, in action_assign
    res = self.check_assign(cr, uid, todo)
  File "/usr/lib/pymodules/python2.7/openerp/addons/stock/stock.py", line 2166, in check_assign
    self.write(cr, uid, done, {'state': 'assigned'})
  File "/usr/lib/pymodules/python2.7/openerp/addons/stock/stock.py", line 1778, in write
    return super(stock_move, self).write(cr, uid, ids, vals, context=context)
  File "/usr/lib/pymodules/python2.7/openerp/osv/orm.py", line 4150, in write
    'where id IN %s', upd1 + [sub_ids])
  File "/usr/lib/pymodules/python2.7/openerp/sql_db.py", line 161, in wrapper
    return f(self, *args, **kwargs)
  File "/usr/lib/pymodules/python2.7/openerp/sql_db.py", line 226, in execute
    res = self._obj.execute(query, params)

OperationalError: out of shared memory
HINT: You might need to increase max_locks_per_transaction.

Checking the OpenERP log file, I see the following:

2013-03-12 09:01:35,508 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 10.0 x product 712, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:01:38,421 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 28.0 x product 151, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:01:47,176 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 13.0 x product 469, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:01:52,666 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 10.0 x product 326, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:01:54,012 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 8.0 x product 62, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:02:11,267 1688 WARNING test1 openerp.addons.stock.stock: Failed attempt to reserve 30.0 x product 722, likely due to another transaction already in progress. Next attempt is likely to work. Detailed error available at DEBUG level.

2013-03-12 09:02:11,283 1688 ERROR test1 openerp.sql_db: bad query: update stock_move set "state"='assigned',write_uid=1,write_date=(now() at time zone 'UTC') where id IN (1380, 1381, 1382, 1383, 1384, 1385, 1386, 1387, 1388, 1389, 1390, 1391, 1392, 1393, 1394, 1395, 1396, 1397, 1398, 1399, 1400)

Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/openerp/sql_db.py", line 226, in execute
    res = self._obj.execute(query, params)

OperationalError: out of shared memory

HINT: You might need to increase max_locks_per_transaction.

After rebooting I can do anything, as long as I do not try to check availability of that order. Running htop on the server (Ubuntu 12.04.1, in a VM), I see that the CPU usage of the PostgreSQL process goes sky high (90%+) while checking availability.

Searching the internet for the 'max_locks_per_transaction' hint, it looks like I have to change a PostgreSQL setting.
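For reference, that setting lives in postgresql.conf and takes effect only after a PostgreSQL restart. A minimal sketch, assuming the stock Ubuntu 12.04 / PostgreSQL 9.1 layout (the path and the value 128 are illustrative assumptions, not a recommendation from this thread):

  # /etc/postgresql/9.1/main/postgresql.conf  (assumed default location on Ubuntu 12.04)
  max_locks_per_transaction = 128    # PostgreSQL default is 64; higher values use more shared memory

  # then restart PostgreSQL, for example:
  # sudo service postgresql restart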

I wonder why a relatively small order with 28 lines gives this error. If you need more info, let me know.

patrick
On 4/9/13, 5:35 AM

After contacting OpenERP, they provided a patch to fix this issue. It is in the process of being merged into MAIN and TRUNK.

With the patch, checking availability of an order with more than 423 lines completes within 30 seconds, instead of the database crashing with just 28 lines.

So it will be fixed soon.

Felix Schubert
On 3/13/13, 7:02 PM

You should provide more information about your VM config. Normally the demo environment runs on the standard PostgreSQL settings without any problem.

VM: 1 GB RAM, Ubuntu 12.04.2, 64-bit.

I have not modified anything after installing OpenERP, so changed settings can be ruled out.

What other specs do you need in order to help?

I have imported about 1,400 products, with related quantities in the stock locations (about 1,800 entries). The import was done with a script using the OpenERP framework.
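For context, the import script itself is not included in this thread; a minimal sketch of what such an import could look like over the OpenERP 7 XML-RPC interface, with assumed URL, credentials, ids and field values:

  import xmlrpclib  # Python 2

  url = 'http://localhost:8069/xmlrpc'        # assumed server address
  db, user, pwd = 'test1', 'admin', 'admin'   # assumed credentials

  uid = xmlrpclib.ServerProxy(url + '/common').login(db, user, pwd)
  models = xmlrpclib.ServerProxy(url + '/object')

  # hypothetical product; the real script would loop over roughly 1,400 of these
  product_id = models.execute_kw(db, uid, pwd, 'product.product', 'create',
                                 [{'name': 'Demo product', 'default_code': 'DEMO-001'}])

  # one common way to set the quantity on hand per location is a stock.inventory
  # with one line per product/location pair (the ids below are assumptions)
  models.execute_kw(db, uid, pwd, 'stock.inventory', 'create', [{
      'name': 'Initial stock import',
      'inventory_line_id': [(0, 0, {
          'product_id': product_id,
          'product_uom': 1,      # assumed unit-of-measure id
          'location_id': 12,     # assumed stock location id
          'product_qty': 10.0,
      })],
  }])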

patrick
on 3/18/13, 5:27 AM
