I ran into a similar problem. In general, a large amount of data is not a problem by itself, but I had one specific action that was slow.
Here is how I tracked it down:
In Postgres, turn on SQL statement logging. I did this by setting "log_min_duration_statement = 0" in the postgresql.conf file; alternatively, you can set "log_statement = 'all'".
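For reference, the relevant lines in postgresql.conf look like this (you will need to reload or restart PostgreSQL for them to take effect; the file's location varies by installation):

```
# postgresql.conf
log_min_duration_statement = 0   # log every statement together with its run time
# log_statement = 'all'          # alternative: log every statement (no timings)
```

Note that "log_min_duration_statement = 0" is the more useful of the two for this kind of debugging, because it logs the duration of each statement, not just its text.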
Now log in as a user and try the action that was slow.
View the postgres log file to see what occurred.
The log file will show each SQL statement and how long it took. In my case, I found a SQL statement, backed by a SQL view, that was computing an account balance in a slow fashion because of the large amount of data loaded. I was able to modify the view to be more efficient.
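To illustrate the kind of fix this was: the table and view names below are made up, not the actual ones from my database, but the shape of the rewrite is the point — replace a per-row correlated subquery with a single join and GROUP BY so the aggregation happens in one pass:

```sql
-- Hypothetical names. The slow version of the view recomputed the
-- balance with a correlated subquery, once per account row:
--
--   CREATE VIEW account_balance AS
--     SELECT a.id,
--            (SELECT sum(m.amount)
--               FROM move_line m
--              WHERE m.account_id = a.id) AS balance
--       FROM account a;
--
-- Rewritten as a join + GROUP BY, the planner can aggregate in one pass:
CREATE OR REPLACE VIEW account_balance AS
SELECT a.id,
       coalesce(sum(m.amount), 0) AS balance
  FROM account a
  LEFT JOIN move_line m ON m.account_id = a.id
 GROUP BY a.id;
```

Your slow statement will almost certainly look different; the general lesson is to read the query the log shows you, run it through EXPLAIN, and restructure the view so the expensive work is done once rather than per row.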
Hope that helps.
|Asked: 5/15/15, 3:23 AM|
|Seen: 1240 times|
|Last updated: 7/3/16, 8:12 AM|