
Sending emails from OpenERP 7 causes a 1 GB memory spike. The spike originates in this part of the code:

# Look up followers of this record that subscribed to the message subtype
fol_obj = self.pool.get("mail.followers")
fol_ids = fol_obj.search(cr, SUPERUSER_ID, [
    ('res_model', '=', message.model),
    ('res_id', '=', message.res_id),
    ('subtype_ids', 'in', message.subtype_id.id),
], context=context)

in /openerp-base/src/addons/mail/mail_message.py (the _notify method, around line 881).

Our mail_followers table contains around 3 000 000 rows. When the search method converts the code above into a SQL statement, at the point where it builds the WHERE clause it performs a string substitution of the form "WHERE ... IN (%s, %s, %s, ...)" with 3 000 000 %s placeholders, then substitutes a list of 3 million id elements to produce a single string, and finally executes that string. This causes a memory error on our system in /openerp-base/src/server/openerp/tools/misc.py, in the flatten() method (line 226).
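To make the scale of the problem concrete, here is a minimal stand-alone sketch (not OpenERP's actual code) of how a query string built from one %s placeholder per id grows linearly with the id list; with 3 000 000 ids the placeholder text alone is over 10 MB, before the parameter list and intermediate copies are counted:

```python
def build_in_clause(ids):
    # One "%s" placeholder per id, joined into a single SQL string,
    # mimicking how the ORM expands an 'in' domain into a WHERE clause.
    placeholders = ", ".join(["%s"] * len(ids))
    return "SELECT id FROM mail_followers WHERE id IN (%s)" % placeholders

small = build_in_clause(range(3))
print(small)  # SELECT id FROM mail_followers WHERE id IN (%s, %s, %s)

# With 3 000 000 ids, the query string alone is roughly 12 MB of text.
big = build_in_clause(range(3_000_000))
print(len(big))
```

The peak memory use is much larger than the string itself, because the id list, the placeholder list, and the substituted result all exist at once.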

A workaround is to use cr.execute() instead of the search method, but I am afraid this points to a bigger problem: on tables with more than a couple of million rows, search becomes a serious bottleneck.
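A rough illustration of the cr.execute() workaround, using Python's built-in sqlite3 as a stand-in for the OpenERP database cursor (the simplified mail_followers columns here are assumptions for the sketch; the real code would issue a similar parameterized query through cr.execute()):

```python
import sqlite3

# sqlite3 stands in for the OpenERP cursor "cr" in this sketch.
conn = sqlite3.connect(":memory:")
cr = conn.cursor()
cr.execute("CREATE TABLE mail_followers (id INTEGER, res_model TEXT, res_id INTEGER)")
cr.executemany(
    "INSERT INTO mail_followers VALUES (?, ?, ?)",
    [(1, "res.partner", 7), (2, "res.partner", 8), (3, "sale.order", 7)],
)

# A single parameterized query: only the two filter values are sent,
# never a multi-million-element id list, so the query string stays tiny
# regardless of how many rows the table holds.
cr.execute(
    "SELECT id FROM mail_followers WHERE res_model = ? AND res_id = ?",
    ("res.partner", 7),
)
fol_ids = [row[0] for row in cr.fetchall()]
print(fol_ids)  # [1]
```

The point of the workaround is that the filtering happens inside the database, instead of the ORM materializing every matching id into one giant SQL string in Python memory.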



+1 for the detailed testing and numbers. I'm curious to see reactions to this. Have you tried other search methods to improve this speed?

Author

No, just this one. For now I use a direct query to the database. But I have other big tables, like account move lines, with even more rows. That is a good part of why I need so much RAM to run easy reconcile (or any reconciliation, for that matter). I will be fixing this in the near future and will post more on this problem then.