Re: ResultSet memory usage
From | Timo Savola
Subject | Re: ResultSet memory usage
Date |
Msg-id | 1010765145.10350.9.camel@vorlon
In response to | Re: ResultSet memory usage ("Nick Fankhauser" <nickf@ontko.com>)
Responses | Re: ResultSet memory usage
List | pgsql-jdbc
> A possible workaround- If you only need to grab a few rows is there some way
> to make those rows float to the top using an "order by" & then apply "limit"
> so you don't have to deal with the huge ResultSet?

I'm using order by, but the point is that I can only make an educated guess for the limit parameter. And I can't calculate a "big enough" value.

I need to get the first N entries with duplicates removed based on one (or two) unique column(s). I can't use distinct, since I also need to select other columns that shouldn't be affected by "distinct".

I've thought about subselects, etc., but so far the best/cleanest approach I've come up with is to use a HashSet for the unique column values on the Java end. The downside is that I need to transfer a lot of unnecessary rows to the application, and with PostgreSQL that means all rows.

Timo
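The HashSet approach described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the row representation and key-column index are assumptions, and a real JDBC version would iterate a ResultSet and call getObject() on the unique column instead of working on an in-memory list.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DedupFirstN {
    // Return the first n rows whose value in the key column has not been
    // seen before. Rows are modeled as Object[] for illustration; with
    // JDBC you would loop over rs.next() and use rs.getObject(keyColumn).
    public static List<Object[]> firstNDistinct(List<Object[]> rows,
                                                int keyColumn, int n) {
        Set<Object> seen = new HashSet<>();
        List<Object[]> result = new ArrayList<>();
        for (Object[] row : rows) {
            if (result.size() >= n) {
                break; // collected enough unique entries
            }
            // Set.add() returns false when the value was already present,
            // so duplicate key values are skipped.
            if (seen.add(row[keyColumn])) {
                result.add(row);
            }
        }
        return result;
    }
}
```

The drawback noted above still applies: every row up to the point where N unique keys are found must cross the wire, and without cursor-based fetching the driver may materialize the whole result set first.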