This question has been flagged
1 Reply
6075 Views

Hello,

I have a problem with an ir.cron task that runs through a bunch of assets and tries to post them automatically. The method runs perfectly with 700 depreciation lines, but when I take it to production, where there is a considerably larger number of assets and depreciation lines (558,747), the openerp-server just stops and crashes.

ids = self.search(cr, uid, [('state', '=', 'open')])

for asset in self.browse(cr, uid, ids):
    for asset_line in asset.depreciation_line_ids:
        if asset_line.move_check:

This code gets all my assets that are open, then tries to check a set of conditions and post them. When it reaches the asset_line.move_check line, it takes a long time and then the process is killed, all while keeping the server CPU at 100%.

Any ideas on how I can improve performance here?

Best Answer

Hi,

You can improve this using Python multiprocessing.

Thanks
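
A minimal sketch of how the multiprocessing idea could look: split the asset ids into chunks and hand each chunk to a worker process. Everything here (post_chunk, the chunk size of 1000, the 4 worker processes) is a placeholder, and in real OpenERP code each worker would have to open, commit and close its own database cursor, since a cr cannot be shared across processes.

from multiprocessing import Pool

def chunks(ids, size):
    # yield successive slices of ids with at most `size` elements each
    for start in range(0, len(ids), size):
        yield ids[start:start + size]

def post_chunk(asset_ids):
    # hypothetical worker: in OpenERP it would open its own cursor,
    # read/post the depreciation lines of these assets, then commit and close;
    # stubbed out here so the sketch stays runnable on its own
    return len(asset_ids)

if __name__ == '__main__':
    all_ids = list(range(1, 558748))  # stand-in for the ids returned by search()
    workers = Pool(processes=4)
    counts = workers.map(post_chunk, chunks(all_ids, 1000))
    workers.close()
    workers.join()
    print('processed %d assets' % sum(counts))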

Author

Using the multiprocessing module or threading? I was already taking that option into consideration; I just wanted to see if someone had run into something similar.

Use multiprocessing

Use read instead of browse for better performance: browse reads every field of the record from the table, while read fetches only the fields you ask for.

ids = self.search(cr, uid, [('state', '=', 'open')])
for asset in self.read(cr, uid, ids, ['depreciation_line_ids']):
    for asset_line in self.pool.get('account.asset.depreciation.line').read(cr, uid, asset['depreciation_line_ids'], ['move_check']):
        if asset_line['move_check']:
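
A slightly fuller, hypothetical version of the same read()-based loop, with each asset's line ids read in slices of 500 so that a very large asset cannot pull hundreds of thousands of rows into memory at once. The slicing and the line_obj shortcut are additions for illustration, and the body of the condition is left where the original snippet leaves off.

line_obj = self.pool.get('account.asset.depreciation.line')
ids = self.search(cr, uid, [('state', '=', 'open')])
for asset in self.read(cr, uid, ids, ['depreciation_line_ids']):
    line_ids = asset['depreciation_line_ids']
    # read move_check in slices instead of all lines of the asset at once
    for start in range(0, len(line_ids), 500):
        batch = line_ids[start:start + 500]
        for asset_line in line_obj.read(cr, uid, batch, ['move_check']):
            if asset_line['move_check']:
                pass  # the original condition checks and posting logic go here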
Author

Yeah, I am thinking this is the natural next step. I already got it working with browse, but performance is still slow and resource consumption is high. I have never really used read before, so I am going to have to rewrite my entire algorithm.
