Discussion: Very slow searching in a table with more than 10 million records restored from a backup file...

Hi there,
    I have an application with a pgsql (PostgreSQL) database that contains a big table (more than 10 million records), running on Windows 2003. From time to time I need to install a new version of the application. Here is what I do: 1. back up the big table via pgAdmin III; 2. stop pgsql in the old version of the application; 3. install the new version of the application (pgsql is included and all tables stay the same as before); 4. restore the data (more than 10 million records) into the table from the backup file.
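For what it's worth, a minimal command-line equivalent of that backup and restore would look roughly like this (the user, database and table names below are only placeholders for whatever the application actually uses):

    REM dump only the big table, in PostgreSQL's custom archive format
    pg_dump -U postgres -Fc -t big_table -f big_table.backup mydb

    REM after installing the new version, load the table back from the archive
    pg_restore -U postgres -d mydb big_table.backup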
   After I restart the application, searching the table becomes very, very slow (much slower than searching in the old version). I don't know what is wrong. Does pgsql need time to "reindex" those 10 million records before searches are fast again?
  
Thanks for your suggestions in advance.
  
ouyang

Re: Very slow searching in a table with more than 10 million records restored from a backup file...

From:
Grzegorz Jaśkiewicz
Date:
On Fri, Jun 12, 2009 at 9:56 AM, zxo102 ouyang<zxo102@gmail.com> wrote:
> Hi there,
>     I have an application with a pgsql (PostgreSQL) database that contains
> a big table (more than 10 million records), running on Windows 2003. From
> time to time I need to install a new version of the application. Here is
> what I do: 1. back up the big table via pgAdmin III; 2. stop pgsql in the
> old version of the application; 3. install the new version of the
> application (pgsql is included and all tables stay the same as before);
> 4. restore the data (more than 10 million records) into the table from the
> backup file.
>    After I restart the application, searching the table becomes very, very
> slow (much slower than searching in the old version). I don't know what is
> wrong. Does pgsql need time to "reindex" those 10 million records before
> searches are fast again?

This is because you missed VACUUM ANALYZE in those steps; it should be run
right after the restore.
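For example, right after the restore finishes you could run something like this (user, database and table names are only placeholders):

    REM analyze just the restored table...
    psql -U postgres -d mydb -c "VACUUM ANALYZE big_table"

    REM ...or the whole database in one go
    psql -U postgres -d mydb -c "VACUUM ANALYZE"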


--
GJ

Grzegorz,

Thank you very much. I will do that. 
I have another question: if I do the following steps, will it "hurt" pgsql?
 step 1. stop pgsql in the old version of the application; the whole application is installed in c:/xbop and pgsql lives in c:/xbop/pgsql;
 step 2. rename c:/xbop to c:/xbop_old;
 step 3. install the new version in c:/xbop;
 step 4. copy the pgsql directory from c:/xbop_old/pgsql into c:/xbop.
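Roughly sketched as Windows commands (the service name pgsql-8.3 is only a guess; use whatever service name the bundled installer registered, and adjust the paths if they differ):

    REM step 1: stop the old installation's PostgreSQL service
    net stop pgsql-8.3

    REM step 2: keep the whole old tree around as a fallback
    ren c:\xbop xbop_old

    REM steps 3-4: after installing the new version into c:\xbop,
    REM copy the old pgsql directory over the freshly installed one
    xcopy /E /I /H /Y c:\xbop_old\pgsql c:\xbop\pgsql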

Since backing up and restoring the big table takes hours, this might be a good way for me if the above steps will not hurt pgsql's performance.

Any suggestions?

ouyang 


2009/6/12 Grzegorz Jaśkiewicz <gryzman@gmail.com>
On Fri, Jun 12, 2009 at 9:56 AM, zxo102 ouyang<zxo102@gmail.com> wrote:
> Hi there,
>     I have an application with a pgsql (PostgreSQL) database that contains
> a big table (more than 10 million records), running on Windows 2003. From
> time to time I need to install a new version of the application. Here is
> what I do: 1. back up the big table via pgAdmin III; 2. stop pgsql in the
> old version of the application; 3. install the new version of the
> application (pgsql is included and all tables stay the same as before);
> 4. restore the data (more than 10 million records) into the table from the
> backup file.
>    After I restart the application, searching the table becomes very, very
> slow (much slower than searching in the old version). I don't know what is
> wrong. Does pgsql need time to "reindex" those 10 million records before
> searches are fast again?

This is because you missed VACUUM ANALYZE in those steps; it should be run
right after the restore.


--
GJ

On Sat, 2009-06-13 at 09:31 +0800, zxo102 ouyang wrote:
> Grzegorz,
>
> Thank you very much. I will do that.
> I have another question: if I do the following steps, will it "hurt"
> pgsql?
>  step 1. stop pgsql in the old version of the application; the whole
> application is installed in c:/xbop and pgsql lives in c:/xbop/pgsql;
>  step 2. rename c:/xbop to c:/xbop_old;
>  step 3. install the new version in c:/xbop;
>  step 4. copy the pgsql directory from c:/xbop_old/pgsql into c:/xbop.
>
> Since backing up and restoring the big table takes hours, this might be
> a good way for me if the above steps will not hurt pgsql's performance.

You should be able to use the new install's PostgreSQL binaries with the
old data directory so long as the old and new versions of PostgreSQL are
the same major version, and have been built with compatible options. If
you're sure the old and new versions of PostgreSQL are compatible, stop
the old version of PostgreSQL, move the data directory aside, install
the program upgrade, then *COPY* (not move, just in case) the data
directory back.

You *MUST* dump and reload if you are going to upgrade from, say, 8.2 to
8.3, or 8.3 to 8.4. It is OK to upgrade without a dump and reload from,
say, 8.3.1 to 8.3.3.

In addition to making sure the versions are compatible, you must also
make sure the new copy of PostgreSQL was built with compatible options.
Both must have the same setting for integer datetimes, both must be
built with the same blocksize, etc. They should also use the same
versions of any procedural languages, such as PL/Perl or PL/Python.
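If it helps, a quick way to compare those settings on the old and the new installation is to query each server and check that the output matches (this assumes both are reasonably recent 8.x releases; the connection options are placeholders):

    REM major version, date/time storage and block size must all agree
    psql -U postgres -c "SELECT version()"
    psql -U postgres -c "SHOW integer_datetimes"
    psql -U postgres -c "SHOW block_size"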

If you want to be safe, dump and reload. If you're shipping updates to
users/customers who may not upgrade reliably with every version and
don't know how to make these decisions for themselves, dump and reload.

Remember, if the major version changes (e.g. upgrading from 8.2 to 8.3) or
the PostgreSQL build settings have changed, you MUST dump and reload.

--
Craig Ringer