Discussion: Autovacuum settings while bulk loading data
Dear Experts,
I have about half a million to 1 million rows of data, populated into the Postgres database by a batch job (an SQL script consisting of PL/pgSQL functions and views).
I am using PostgreSQL 8.3.5 on a Windows 2003 64-bit machine.
It would be helpful if you could suggest appropriate autovacuum settings for handling this amount of data, as my current autovacuum settings are hanging the entire process.
As of now I have the autovacuum settings below in the postgresql.conf file.
#------------------------------------------------------------------------------
# AUTOVACUUM PARAMETERS
#------------------------------------------------------------------------------
autovacuum = on
log_autovacuum_min_duration = 0
autovacuum_max_workers = 5
autovacuum_naptime = 10min
autovacuum_vacuum_threshold = 1000
autovacuum_analyze_threshold = 500
autovacuum_vacuum_scale_factor = 0.2
autovacuum_analyze_scale_factor = 0.1
autovacuum_freeze_max_age = 200000000
#autovacuum_vacuum_cost_delay = 200
#autovacuum_vacuum_cost_limit = -1
#------------------------------------------------------------------------------
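For what it's worth, one per-table workaround on 8.3 (which predates the `ALTER TABLE ... SET (autovacuum_enabled = ...)` storage parameters of later releases) is to disable autovacuum for just the tables touched by the load, via the pg_autovacuum system catalog. A rough sketch, assuming a hypothetical table name "mytable" and 8.3's catalog layout:

```sql
-- Rough sketch: disable autovacuum for one table during the load.
-- 8.3 uses the pg_autovacuum system catalog for per-table settings;
-- "mytable" is a placeholder, and -1 means "use the global default".
INSERT INTO pg_autovacuum
    SELECT oid, false, -1, -1, -1, -1, -1, -1, -1, -1
    FROM pg_class WHERE relname = 'mytable';

-- After the batch job finishes, remove the override again:
DELETE FROM pg_autovacuum
    WHERE vacrelid = 'mytable'::regclass;
```

Check the column list against the 8.3 documentation before using this; the catalog was replaced by table storage parameters in 8.4.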
Please provide your suggestions.
Many thanks
On 10/06/10 11:47, Ambarish Bhattacharya wrote:
> It would be helpful if you can suggest me the appropriate Autovacuum settings for handling this large data as my autovacuum setting is hanging the entire process.

What do you mean by "hanging the entire process"?

--
Heikki Linnakangas
EnterpriseDB   http://www.enterprisedb.com
Please keep the mailing list CC'd, so that others can help.

On 10/06/10 15:30, Ambarish Bhattacharya wrote:
>> On 10/06/10 11:47, Ambarish Bhattacharya wrote:
>>> It would be helpful if you can suggest me the appropriate Autovacuum settings for handling this large data as my autovacuum setting is hanging the entire process.
>>
>> What do you mean by "hanging the entire process"?
>
> Hanging the entire process means: autovacuum and autoanalyze start, and after that there is no activity I can see in the Postgres log related to the bulk loading. When I check the Postgres processes in Task Manager, I can see a few postgres processes still running, which have to be killed from there; a normal shutdown is not possible in this case.

You'll have to provide a lot more details if you want people to help you. How do you bulk load the data? What kind of log messages do you normally get in the PostgreSQL log related to bulk loading?

Autovacuum or autoanalyze should not interfere with loading data, even if it runs simultaneously.

--
Heikki Linnakangas
EnterpriseDB   http://www.enterprisedb.com
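A common bulk-loading pattern that keeps autovacuum out of the critical path is to load everything with COPY in a single transaction and then run a manual VACUUM ANALYZE once the load completes. A minimal sketch (the table name and file path are hypothetical):

```sql
-- Minimal sketch of a bulk load: COPY is much faster than
-- row-by-row INSERTs, and doing it in one transaction produces
-- one batch of new rows rather than a stream of small changes.
-- "staging_data" and the file path are placeholders.
BEGIN;
COPY staging_data FROM 'C:/load/data.csv' WITH CSV;
COMMIT;

-- One manual pass afterwards is cheaper and more predictable than
-- letting autovacuum discover the new rows on its own schedule:
VACUUM ANALYZE staging_data;
```

With this pattern the autovacuum thresholds matter much less, because the planner statistics are refreshed explicitly right after the data lands.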