3 Replies
5766 Views

Hello, I have a table with 998,232 records. In a form I created myself, when I choose the "Search" option on a many2one relation field, it takes more than 45 minutes to show the information. Is this normal? Is there any way to improve it?

Thanks


If you know your PostgreSQL configuration file, include it in your question. Speed also depends on your RAM and other hardware, so that information would be needed to answer your question properly. A good way to make the search faster is to tune PostgreSQL.

Author

It is not a PostgreSQL configuration problem: the same database takes 4 minutes to show the records when I use another program. Also, when I run the module, the tree view takes only 1 minute to show the first 80 records. It is the create form with the many2one field that takes 45 minutes to show the records.

Author Best Answer

After running several tests, I found the problem. In my case, on Ubuntu 13, I modified the kernel shared-memory parameters so that about 25% of RAM can be used. I added the following lines to /etc/sysctl.conf:

kernel.shmall = 524288
kernel.shmmax = 2147483648

(that is 2 GB of shared memory; my computer has 8 GB of RAM). Be careful with this, because the right values depend on how much RAM each machine has. The search now goes from 45 minutes down to 35 seconds. I am posting this solution in case someone has the same problem.
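The two values above are not arbitrary: shmmax is 25% of 8 GB in bytes, and shmall is that same amount counted in 4 KiB pages. A minimal sketch of the arithmetic, assuming 8 GB of RAM and a typical 4096-byte page size (check yours with `getconf PAGE_SIZE`):

```python
# Sanity-check the sysctl values from the answer.
# Assumptions: 8 GB of RAM, 4 KiB kernel page size - adjust for your machine.
ram_bytes = 8 * 1024**3          # 8 GB total RAM
page_size = 4096                 # typical Linux page size

shmmax = ram_bytes // 4          # 25% of RAM, in bytes
shmall = shmmax // page_size     # shmall is measured in pages, not bytes

print("kernel.shmmax =", shmmax)  # kernel.shmmax = 2147483648
print("kernel.shmall =", shmall)  # kernel.shmall = 524288
```

On a machine with a different amount of RAM, recompute both numbers with this formula rather than copying the values verbatim.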

Best Answer

I tried this fix and it works very well: https://code.launchpad.net/~openerp-dev/openerp-web/6.1-opw-574218-xal/+merge/10737

    === modified file 'addons/web/static/src/js/view_form.js'
    --- addons/web/static/src/js/view_form.js   2012-05-16 14:42:16 +0000
    +++ addons/web/static/src/js/view_form.js   2012-05-25 13:26:18 +0000
    @@ -2056,11 +2056,20 @@
                 if (values.length > self.limit) {
                     values = values.slice(0, self.limit);
                     values.push({label: _t("<em>   Search More...</em>"), action: function() {
    -                    dataset.name_search(search_val, self.build_domain(), 'ilike'
    -                    , false, function(data) {
    +                    if (search_val.length == 0) {
    +                        // search optimisation - in case user didn't enter any text we
    +                        // do not need to prefilter records; for big datasets (ex: more
    +                        // that 10.000 records) calling name_search() could be very very
    +                        // expensive!
                             self._change_int_value(null);
    -                        self._search_create_popup("search", data);
    -                    });
    +                        self._search_create_popup("search", undefined);
    +                    } else {
    +                        dataset.name_search(search_val, self.build_domain(), 'ilike'
    +                        , false, function(data) {
    +                            self._change_int_value(null);
    +                            self._search_create_popup("search", data);
    +                        });
    +                    }
                     }});
                 }
                 // quick create
Best Answer

Yes, there is a way to decrease the loading time. Look at the name_get method of that many2one field's model: its browse call tries to load all of the information at once, so retrieving 998,232 records takes that long. Remove the browse call from the name_get function and use the read method instead, listing only the fields you need to retrieve, such as name.
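The idea above can be sketched as follows. This is an illustration only, using the OpenERP 6.1-style method signature; the point is that browse materialises every column of every record, while read fetches only the columns you name:

```python
# Sketch: replacing browse() with read() in a name_get override.
# The signatures mimic the old OpenERP API; they are not a drop-in patch.

# Expensive variant: browse() loads every field of every record.
def name_get_slow(self, cr, uid, ids, context=None):
    return [(rec.id, rec.name)
            for rec in self.browse(cr, uid, ids, context=context)]

# Cheap variant: read() fetches only the 'name' column from the database.
def name_get_fast(self, cr, uid, ids, context=None):
    records = self.read(cr, uid, ids, ['name'], context=context)
    return [(r['id'], r['name']) for r in records]
```

On a table of ~1M rows the difference matters because read issues one narrow SQL query, whereas browse pulls in every stored field of the model.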
