This question has been flagged
2 Replies
475 Views

I'm trying to create a module to group accounting entries by analytical accounts in both the list view and the pivot view. This is my code:

With new databases, it works perfectly: I can group the accounting entries by analytic account and see the account names. However, on production databases with more than 10,000 entries, two things happen:

1. If I keep the `_auto_init` override, the module installs without problems, but grouping lumps everything under False, and the pivot view shows it as None.

2. If I remove the `_auto_init` override and rely on `store=True`, it tries to recompute every record until it can no longer cope: the service gets stuck, and the module doesn't even install or update.

Any recommendations? Thanks!

# -*- coding: utf-8 -*-
import json
import logging

from odoo import api, fields, models
from odoo.tools.sql import column_exists, create_column

_logger = logging.getLogger(__name__)


class AccountMoveLine(models.Model):
    _inherit = 'account.move.line'

    analytic_account_names = fields.Char(
        string="Cuentas Analíticas",
        compute="_compute_analytic_account_names",
        store=True,
        index=True,
        precompute=True,
    )

    def _auto_init(self):
        """Create the stored computed column analytic_account_names
        directly in SQL to avoid a MemoryError on large databases."""
        if not column_exists(self.env.cr, 'account_move_line', 'analytic_account_names'):
            create_column(self.env.cr, 'account_move_line', 'analytic_account_names', 'varchar')
        return super()._auto_init()

    @api.depends('analytic_distribution')
    def _compute_analytic_account_names(self):
        AnalyticAccount = self.env['account.analytic.account'].sudo()
        # Collect every analytic account id first, so we browse once
        # instead of once per line.
        all_ids = set()
        for line in self:
            all_ids.update(self._extract_analytic_ids(line.analytic_distribution))
        id_to_name = {acc.id: acc.name for acc in AnalyticAccount.browse(all_ids).exists()}
        for line in self:
            ids = self._extract_analytic_ids(line.analytic_distribution)
            names = [id_to_name.get(acc_id, 'Desconocido') for acc_id in ids]
            line.analytic_account_names = ', '.join(sorted(names)) if names else ''

    def _extract_analytic_ids(self, raw):
        """Return the set of analytic account ids referenced by an
        analytic_distribution value (dict or JSON string). Distribution
        keys may be composite, e.g. "12,15", so each key is split on
        commas before converting to int."""
        ids = set()
        if isinstance(raw, dict):
            keys = raw.keys()
        elif isinstance(raw, str):
            try:
                parsed = json.loads(raw)
                keys = parsed.keys() if isinstance(parsed, dict) else []
            except Exception:
                keys = raw.split(',') if ',' in raw else []
        else:
            keys = []

        for key in keys:
            for part in str(key).split(','):
                try:
                    ids.add(int(part.strip()))
                except (ValueError, TypeError):
                    continue

        return ids

Best Answer

I would try a step-wise approach in this particular scenario, where the table already holds many records.

If the problem turns out to be an execution-time or resource limit, increasing the limits in the config file (i.e. odoo.conf or .odoorc) may already help.
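For reference, those limits live in the `[options]` section of the config file. A sketch with example values only, to be tuned to your hardware:

```
[options]
; example values, not recommendations
limit_time_cpu = 3600        ; max CPU seconds per request/worker
limit_time_real = 7200       ; max wall-clock seconds per request/worker
limit_memory_hard = 4294967296  ; hard memory ceiling in bytes (4 GiB)
```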

The step-wise approach itself could solve your issue: first make the field stored only, without any computation. Then populate it, either from the CLI or via a button action (or a scheduled action that re-triggers itself until finished), so you have better control over batch sizes and thus how many records are updated in one go. Finally, apply the actual computation logic to the field. Since at that point nothing actually changes in the database, the module upgrade will probably go through.

Note: this is a theoretical description, not something I have tested recently in practice.
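The batching idea can be sketched in plain Python. Only `iter_batches` below is runnable as-is; the cron body in the comment is an assumption about how it could look in an Odoo scheduled action, with hypothetical method and field names:

```python
# Minimal sketch of bounded-batch processing: never touch all
# 10,000+ rows in a single transaction.
def iter_batches(ids, batch_size=1000):
    """Yield successive slices of ids, each at most batch_size long."""
    for start in range(0, len(ids), batch_size):
        yield ids[start:start + batch_size]

# Inside an Odoo scheduled action, the same idea would look roughly
# like this (hypothetical, adapt to your module):
#
#     def _cron_fill_analytic_names(self):
#         lines = self.search([('analytic_account_names', '=', False)],
#                             limit=1000)
#         if lines:
#             lines._compute_analytic_account_names()
#         # let the cron fire again while unfilled records remain
```

Keeping each run bounded (here, 1000 records) is what prevents the single giant recomputation that stalls the service during install.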

Another thing to try, if it fits your use case, would be to create an actual database view (as a sort of report view) that gathers and displays all the needed information.

Best Answer

Hi, 

The way the store=True parameter works is that it initially triggers _compute_analytic_account_names() and then saves the result (literally writing it to the database). Once the value is saved, @api.depends() only recomputes it when the analytic_distribution field changes; no change means no recomputation.

You might want to drop store=True if you would like the value to be computed automatically whenever users access it. However, with that many records it will probably affect how records load, because _compute_analytic_account_names() must run and fill analytic_account_names on the fly. In your case, it is still better to remove store=True.

And I don't think you need the _auto_init() method there; the @api.depends compute method alone will be fine.

Hope it helps,

Altela (altelasoftware.com)
