
Here's the scenario: let's say I install a module which changes the column type of a table. In this case the module changes the type of the date_order field in the sale_order table from date to datetime. Now, every time I restart OpenERP with the update-all-modules flag (-u all), it drops the original date_order column (of type date) from the table and adds a new date_order column of type datetime.
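
For concreteness, the kind of override that triggers this behaviour would look roughly like the sketch below. It is written against the old OpenERP 7 osv API, and the field label is only illustrative:

from openerp.osv import osv, fields

class sale_order(osv.osv):
    _inherit = 'sale.order'
    _columns = {
        # redefining an existing column with a different type; on every
        # "update all" OpenERP drops the old date column and adds a new
        # datetime column instead of altering it in place
        'date_order': fields.datetime('Date Ordered'),
    }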

Now, Postgres keeps a reference to every column of a table in the pg_attribute catalog, and these entries are not cleared even when you drop a column. Because of this, every drop-column/add-column cycle that occurs when you restart OpenERP increases the attribute count for the table, in this case sale_order. The problem is that Postgres has a hard limit of 1600 columns per table, which will be reached if you restart OpenERP about 800 times. At that point OpenERP fails to start.
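
You can check how close a table is to the limit by counting its pg_attribute entries, including the dropped ones. A quick sketch using psycopg2 (the connection parameters are placeholders; adjust them for your setup):

import psycopg2

# placeholder connection settings
conn = psycopg2.connect(dbname='your_openerp_db', user='openerp')
cur = conn.cursor()

# pg_attribute keeps one row per column ever created on the table;
# dropped columns stay behind with attisdropped = true, and the
# 1600-column limit applies to this total, not to the visible columns
cur.execute("""
    SELECT count(*) AS total_entries,
           sum(CASE WHEN attisdropped THEN 1 ELSE 0 END) AS dropped
      FROM pg_attribute
     WHERE attrelid = 'sale_order'::regclass
       AND attnum > 0
""")
print(cur.fetchone())

cur.close()
conn.close()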

And that is with just one column. If modules override multiple columns in the table, the limit will be reached even sooner.

So the question is: is there a well-known way to solve this problem with OpenERP?

My guess is that this is not a very rare scenario and some people may have already faced this problem. Are there any recommended solutions, conventions, or best practices?


HELP! This is also affecting me. Has anyone found a way to solve this problem???

Author Best Answer

This is how we solved the problem:

Let's say there are two modules, sale_order and my_sale_order, where my_sale_order extends the functionality of sale_order.

Suppose that in my_sale_order you want to change the column type of a field in sale_order, say, change the type of the column order_date from date to datetime. Instead of changing the type of order_date to datetime in my_sale_order, you can add a new column called order_datetime of type datetime in my_sale_order, and change all references in the UI markup (and any other references required for proper functioning) from order_date to order_datetime. You can also keep order_date and order_datetime in sync in your code.
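
A rough sketch of what that looks like in the extending module, using the old osv API. The field names follow the example above and the sync logic is only one possible approach:

from openerp.osv import osv, fields

class sale_order(osv.osv):
    _inherit = 'sale.order'

    _columns = {
        # new parallel column instead of retyping the existing one,
        # so the original order_date column is never dropped
        'order_datetime': fields.datetime('Order Date and Time'),
    }

    def write(self, cr, uid, ids, vals, context=None):
        # keep the original date column in sync with the new datetime one;
        # OpenERP datetime strings are 'YYYY-MM-DD HH:MM:SS', so the first
        # ten characters are the date part (create would need the same)
        if vals.get('order_datetime'):
            vals['order_date'] = vals['order_datetime'][:10]
        return super(sale_order, self).write(cr, uid, ids, vals, context=context)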

The rationale for not going with the db restore method suggested by Tim is that it is not a permanent solution: it requires periodic intervention by the support staff every time the issue reappears.

Best Answer

OK community. I have found a workaround that I DO NOT consider a viable option in the long run, but to help anyone in need:

This problem occurs because, over the course of time, enough drop/add cycles in Postgres will cause the table to hit the hard limit of 1600 columns (even if, when you look at the table in Postgres, you only see a handful of columns). When you drop a column, it is removed from the table, but the reference to it lives on and continues to count against the 1600-column limit.

My temporary solution:

pg_dump -t table_name_with_issue --inserts -f file_name_for_sql_output.sql database_name

Now you have the SQL needed to recreate the problematic table in file_name_for_sql_output.sql.

Enter the psql terminal: psql -d database_name
Drop the problematic table: drop table table_name_with_issue cascade;
Exit psql: Ctrl+D
Restore the table from the generated .sql file: psql -f file_name_for_sql_output.sql database_name

And with that, you should have a recreated table with its column count reset.
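
If this has to be repeated, the same steps can be scripted. A rough sketch in Python; database, table, and file names are placeholders, and note that CASCADE also drops dependent objects that a single-table dump will not recreate:

import subprocess

DB = 'database_name'                    # placeholder
TABLE = 'table_name_with_issue'         # placeholder
DUMP = 'file_name_for_sql_output.sql'   # placeholder

# 1. dump the problematic table as plain INSERT statements
subprocess.check_call(['pg_dump', '-t', TABLE, '--inserts', '-f', DUMP, DB])

# 2. drop the table; CASCADE also removes dependent views and foreign keys
subprocess.check_call(['psql', '-d', DB, '-c',
                       'DROP TABLE %s CASCADE;' % TABLE])

# 3. restore the table from the dump; the recreated table starts with
#    a fresh pg_attribute entry count
subprocess.check_call(['psql', '-f', DUMP, DB])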

This problem is pretty much unsolvable without an experienced database administrator, and because of that, I consider this a bug in Odoo.

Maybe someone else has a better workaround?

Author

This is how we solved the problem:
- make sure the column types are the same in both modules
- whenever you want to change a column type, add a new column in the sub-module to suit your needs (we added a new column of type datetime and left the original date column in sale_order intact)
- use this newly added column everywhere in your views

Could you elaborate on this a little more? I don't really understand the reasoning behind it.