I have a table that stores a huge volume of data every
day. I am now running into a problem where, when I try
to insert data, the remote connection times out
because it takes too long (about a minute).
The solution I was going to try is to break the table
up into one table per upload site, using
(tablename_siteid) as the table name. This could
result in there being possibly 1,000 or more tables
within about a year. Can this cause me any problems?
If so, what would be a better approach?
Current structure:
Table "channeldata"
Attribute | Type | Modifier
------------+-----------+-----------------------
id | integer | not null default nextval
cd_id | integer | default 0
s_id | integer | default 0
units | smallint | default 0
datareal | float8 |
dataalt | float8 |
dataoffset | float8 | default 0
tstamp | timestamp | default now()
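For reference, a sketch of the DDL behind the current table. The
sequence name is an assumption on my part, since psql's output
truncates the nextval() default:

```sql
-- Sketch of the existing table; 'channeldata_id_seq' is a guessed
-- sequence name, as the \d output cuts off the nextval default.
CREATE TABLE channeldata (
    id         integer   NOT NULL DEFAULT nextval('channeldata_id_seq'),
    cd_id      integer   DEFAULT 0,
    s_id       integer   DEFAULT 0,
    units      smallint  DEFAULT 0,
    datareal   float8,
    dataalt    float8,
    dataoffset float8    DEFAULT 0,
    tstamp     timestamp DEFAULT now()
);
```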
Proposed structure:
Table "channeldata_[s_id]"
Attribute | Type | Modifier
------------+-----------+-----------------------
id | integer | not null default nextval
cd_id | integer | default 0
units | smallint | default 0
datareal | float8 |
dataalt | float8 |
dataoffset | float8 | default 0
tstamp | timestamp | default now()
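To make the proposal concrete, each upload site would get its own
table named after its s_id. A sketch for a hypothetical site 42
(the site id and sequence name here are just example values):

```sql
-- Hypothetical per-site table for s_id = 42; because each site has
-- its own table, the s_id column itself is dropped from the schema.
CREATE TABLE channeldata_42 (
    id         integer   NOT NULL DEFAULT nextval('channeldata_42_id_seq'),
    cd_id      integer   DEFAULT 0,
    units      smallint  DEFAULT 0,
    datareal   float8,
    dataalt    float8,
    dataoffset float8    DEFAULT 0,
    tstamp     timestamp DEFAULT now()
);
```

A table like this would be created once per site, so a year of new
sites means a matching number of new tables and sequences.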