Hi,
Bit of an abstract question, I appreciate, but I thought I'd see what
people think. I have an anonymised dataset of the travel behaviour of
some people in a major city (I'd rather not go into details, if that's
OK). What I intend to do is work out where each of them is for every
minute of the day. So that's ~80,000 people x 1,440 minutes =
115,200,000 rows of data! A few questions:
1) Is PostgreSQL going to be able to cope with a table of this size? I
think so...
2) My columns will be something like
person_id integer,
person_timestamp timestamp,
person_location_geom geometry
Any thoughts on those, or on the column types? (There's a rough DDL
sketch after this list.)
3) I'll probably create a composite primary key on (person_id,
person_timestamp). Does that sound like a good idea?
4) Should I add some indexes to improve query performance?
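To make the questions concrete, here's a rough sketch of the DDL I have
in mind. The table name, the Point/4326 typmod and the GiST index are
just placeholder assumptions on my part, and it presumes PostGIS is
installed for the geometry type:

    -- Hypothetical table: one row per person per minute
    CREATE TABLE person_location (
        person_id            integer   NOT NULL,
        person_timestamp     timestamp NOT NULL,
        person_location_geom geometry(Point, 4326),  -- assumed SRID
        PRIMARY KEY (person_id, person_timestamp)
    );

    -- Assumed spatial index for location-based queries
    CREATE INDEX person_location_geom_idx
        ON person_location USING GIST (person_location_geom);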
Best wishes
James