Imports are slow. An import runs all of the server code that creating a new record through the standard interface would run; it just skips loading the pages. I'm guessing offhand you're dealing with a CSV with tens of thousands of records? To make those imports less painfully slow, you could use software like Talend to write directly into the Postgres database.
What I've been doing is writing Python scripts that parse the CSV, reformat the data/headers into something the database understands, and then run plain INSERT statements with psycopg2 (the Postgres Python library that OpenERP itself uses). That can get data into the database in seconds rather than hours, which means redoing the import to fix an error isn't so painful.
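A minimal sketch of that approach, assuming a simple CSV with no relational fields. The table name `res_partner`, the column mapping, and the connection details are illustrative placeholders; adapt them to your own database and export:

```python
import csv
import io


def rows_from_csv(text, mapping):
    """Parse a CSV export and rename its headers to the database
    column names given in `mapping` (a {csv_header: db_column} dict).
    Returns (columns, rows) ready for an INSERT statement."""
    reader = csv.DictReader(io.StringIO(text))
    columns = [mapping[h] for h in reader.fieldnames]
    rows = [tuple(rec[h] for h in reader.fieldnames) for rec in reader]
    return columns, rows


def insert_rows(conn, table, columns, rows):
    """Insert all prepared rows with a single executemany call.
    psycopg2 quotes the values; the table and column names come from
    our own mapping above, never from user input."""
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (
        table, ", ".join(columns), ", ".join(["%s"] * len(columns)))
    with conn.cursor() as cur:
        cur.executemany(sql, rows)
    conn.commit()


# Usage against a real server (connection details are placeholders):
#   import psycopg2
#   conn = psycopg2.connect(dbname="openerp", user="openerp")
#   with open("partners.csv") as f:
#       cols, rows = rows_from_csv(f.read(), {"Name": "name", "City": "city"})
#   insert_rows(conn, "res_partner", cols, rows)
```

Note that this bypasses the ORM entirely, so required fields, defaults, and `create()` overrides are your responsibility; it only makes sense for tables you understand well.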
An import handling around 10 records per second is normal; even in the worst case it shouldn't take longer than 1 second per record. If it does, something's wrong:
- try with a small sample of records first, just to confirm that the problem really is slowness and not a blocked import.
- have a look at the server log: there might be an error that is not being passed on to the user interface, but is blocking the import.
|Asked: 10/22/13, 5:18 PM|
|Last updated: 3/16/15, 8:10 AM|