
I need to fix an issue where a lot of products don't have reordering rules; I just need the default values on all of them. I wrote this server action to automatically create a reordering rule for any product that doesn't already have one:


products = env['product.product'].search([('type', '!=', 'consu')])

loop_count = len(products)
count = 0
to_create = []

for product in products:
    product._compute_nbr_reordering_rules()  # we have to do this manually here because it won't recalculate on its own
    if product.nbr_reordering_rules == 0:
        to_create.append({'product_id': product.id})
        count += 1

env['stock.warehouse.orderpoint'].create(to_create)

log('Created ' + str(count) + ' new reorder rules. Searched ' + str(loop_count) + ' records', level='info')


It works perfectly on my test database with only a few products. However, when I ran the same thing on a copy of the production data, it loaded for ~15 minutes (expected, given how many records there are and how much work the loop does), and then no records were created. I checked the log afterwards and this was the message:

Created 28622 new reorder rules. Searched 30883 records

So it looks like it at least tried to create all the records, but ultimately failed, since nothing ended up in the database. Is there anything I missed when working with such a large number of records?

Best Answer

Your run most likely hit a timeout and the whole transaction was rolled back, so process fewer records per run. Try this:

products = env['product.product'].search([('type', '!=', 'consu'), ('type', '!=', 'service')], limit=1000)

Also, as you know, a raw SQL query is faster than ORM methods like search() when dealing with that many records.
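For example, a query along these lines (just a sketch, assuming the standard product_product / product_template / stock_warehouse_orderpoint tables) could collect the ids of storable products that have no reordering rule yet, without loading every record through the ORM:

# Sketch only: ids of storable products with no reordering rule, via SQL
# instead of ORM search. Table names assume a standard Odoo schema.
env.cr.execute("""
    SELECT pp.id
    FROM product_product pp
    JOIN product_template pt ON pt.id = pp.product_tmpl_id
    WHERE pt.type NOT IN ('consu', 'service')
      AND NOT EXISTS (
          SELECT 1
          FROM stock_warehouse_orderpoint op
          WHERE op.product_id = pp.id
      )
""")
missing_ids = [row[0] for row in env.cr.fetchall()]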

Author

Limiting the number of records seems to work. However, if I run the action again it gives me the same 1000 records and doesn't create any more rules, since those were already handled last time. My fix was to search without a limit and then only create records for a subset of that data. I found I could safely create records in batches of 5000; more might be possible, but 5000 is good enough for my set of ~30k records. In case anyone is interested, this is my final code:

products = env['product.product'].search([('type', '!=', 'consu'), ('type', '!=', 'service')])

loop_count = len(products)
count = 0
ids = []
to_create = []

# Change start/end and run several times to get all records
batch_start = 10000
batch_end = 15000
if batch_end > loop_count:
    batch_end = loop_count

batch = products[batch_start:batch_end]

for product in batch:
    product._compute_nbr_reordering_rules()  # we have to do this manually here because it won't recalculate on its own
    if product.nbr_reordering_rules == 0:
        to_create.append({'product_id': product.id})
        count += 1
        ids.append(product.id)

env['stock.warehouse.orderpoint'].create(to_create)

log('Created ' + str(count) + ' new reorder rules. Searched ' + str(len(batch)) + ' records from ' + str(batch_start) + ' to ' + str(batch_end) + '\nProduct IDS: ' + str(ids), level='info')
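If editing batch_start/batch_end by hand ever gets tedious, a possible variation (just a sketch, untested, and assuming env.cr.commit() is allowed in your server-action context) is to loop over all the batches in one run and commit after each one, so the rules created by earlier batches are kept even if a later batch hits a time limit:

# Sketch of an automated version of the batching above (untested).
# Assumes env.cr.commit() is permitted here; committing after each batch
# means already-created rules survive even if a later batch fails.
BATCH_SIZE = 5000
products = env['product.product'].search([('type', '!=', 'consu'), ('type', '!=', 'service')])
total_created = 0

for start in range(0, len(products), BATCH_SIZE):
    to_create = []
    for product in products[start:start + BATCH_SIZE]:
        product._compute_nbr_reordering_rules()
        if product.nbr_reordering_rules == 0:
            to_create.append({'product_id': product.id})
    env['stock.warehouse.orderpoint'].create(to_create)
    env.cr.commit()  # persist this batch before moving on
    total_created += len(to_create)

log('Created ' + str(total_created) + ' new reorder rules in batches of ' + str(BATCH_SIZE), level='info')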