Actually, there are 7 inserts that take place on
that table before it can talk to the unit that is
broadcasting to me again...
There is a unique constraint on (tstamp, cd_id), but
removing it didn't fix the speed issue...
I am at about 3,000,000 rows, give or take a few
thousand. My first take is that I agree with you that
3 million rows should not be an issue at insert time,
but at this point I have no clue what else it could
be... The rest of the database responds just fine;
it's only this table. I have also done a VACUUM
ANALYZE on the table in hopes that it would help...
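In case it helps anyone reproduce this, a quick way to check for triggers, rules, and foreign keys on the table and to time a single insert from psql might look like the following (a sketch only -- `big_table` is a placeholder name, and the inserted values are made up):

```sql
-- Placeholder table name; substitute the real one.
-- Show the table's indexes, constraints, triggers, and rules:
\d big_table

-- Or ask the system catalogs directly whether any triggers
-- (including foreign-key check triggers) are attached:
SELECT tgname
FROM pg_trigger
WHERE tgrelid = (SELECT oid FROM pg_class
                 WHERE relname = 'big_table');

-- Turn on per-statement timing in psql, then try one insert:
\timing
INSERT INTO big_table (tstamp, cd_id) VALUES (now(), 42);
```

If the catalog query turns up unexpected trigger names, that would point at where the insert time is going.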
On my original fix, is there any disadvantage to
having that many tables, other than \d becoming
almost useless?
--- Tom Lane <tgl@sss.pgh.pa.us> wrote:
> Felson <felson123@yahoo.com> writes:
> > I have a table that stores a HUGE volume of data
> > every day. I am now running into a problem where
> > when I try to insert data, the remote connection
> > times out because it takes too long... (1 minute)
>
> How much is HUGE? I'm having a really hard time
> believing that a simple insert could take > 1min
> regardless of table size ... are there perhaps
> triggers or rules or foreign-key references on this
> table that could be eating the time?
>
> regards, tom lane