> Now my problem is importing the data in bulk, and to keep the version
> of the record that has the longest interval value (the third field in
> the CSV below). Refer to the entries of 03/29 of the *.gs files. The
...
> Any advice/ideas as to the K.I.S.S. to use/implement insert/update
> instead of doing the select/delete search for duplicates?
One approach would be to load the CSV into a temporary table, add an
index to it on (date, time, interval_length), and then insert back into
the main table, keeping only the row with the longest interval for each
date/time pair:
INSERT INTO realtable
SELECT DISTINCT t1.*
  FROM temptable t1
  JOIN (SELECT date, time, MAX(interval_length) AS imax
          FROM temptable
         GROUP BY date, time
       ) m
    ON t1.date = m.date
   AND t1.time = m.time
   AND t1.interval_length = m.imax;
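For the bulk-load step itself, something along these lines should work;
the file name, CSV options, and the use of LIKE to copy realtable's
layout are assumptions, so adjust to your schema:

```sql
-- Stage the CSV in a temp table shaped like the target table.
-- TEMP tables disappear automatically at the end of the session.
CREATE TEMP TABLE temptable (LIKE realtable);

-- \copy is a psql meta-command: it reads the file client-side.
\copy temptable FROM 'data.csv' WITH (FORMAT csv)

-- Index the grouping columns so the aggregate and the join stay fast.
CREATE INDEX temptable_key ON temptable (date, time, interval_length);
```

After that, run the INSERT ... SELECT above in the same session, and
the duplicates never touch the real table.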
--
Greg Sabino Mullane greg@turnstep.com
End Point Corporation http://www.endpoint.com/
PGP Key: 0x14964AC8 201204032320
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8