Thread: Re: is postgres a good solution for billion record data.. what about mySQL?

Re: is postgres a good solution for billion record data.. what about mySQL?

From: Tawita Tererei
In addition to this, what about MySQL? How much data (how many records) can it manage?

regards

On Sun, Oct 25, 2009 at 3:32 AM, shahrzad khorrami <shahrzad.khorrami@gmail.com> wrote:
Is Postgres a good solution for billion-record data? Think of 300 KB of data inserted into the DB every minute; I'm coding with PHP.
What do you recommend for managing this data?

--
Shahrzad Khorrami



--
Mr Tererei A Aruee
IT Manager
MLPID

Re: is postgres a good solution for billion record data.. what about mySQL?

From: Raymond O'Donnell
On 24/10/2009 20:46, Tawita Tererei wrote:
> In addition to this, what about MySQL? How much data (how many
> records) can it manage?
>
> regards
>
> On Sun, Oct 25, 2009 at 3:32 AM, shahrzad khorrami <
> shahrzad.khorrami@gmail.com> wrote:
>
>> Is Postgres a good solution for billion-record data? Think of 300 KB
>> of data inserted into the DB every minute; I'm coding with PHP.
>> What do you recommend for managing this data?

I know that many people on this list manage very large databases with
PostgreSQL. I haven't done it myself, but I understand that with the
right hardware and good tuning, PG will happily deal with large volumes
of data; and 300 KB a minute isn't really very much by any standard.
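
(For scale: 300 KB/minute is 300 × 60 × 24 ≈ 432 MB/day, or roughly
150 GB a year before indexes, which is modest for a tuned setup.)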

You can get a few numbers here: http://www.postgresql.org/about/

Ray.

--
Raymond O'Donnell :: Galway :: Ireland
rod@iol.ie

Re: is postgres a good solution for billion record data.. what about mySQL?

From: Scott Marlowe
On Sat, Oct 24, 2009 at 1:46 PM, Tawita Tererei <nevita0305@hotmail.com> wrote:
> In addition to this, what about MySQL? How much data (how many
> records) can it manage?

That's really a question for the MySQL mailing lists / forums.  I do
know there's an artificial limit in the low billions of rows; you have
to create your table with a special option to get it to handle more.
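
A minimal sketch of what that looks like, assuming the option Scott
has in mind is MyISAM's MAX_ROWS / AVG_ROW_LENGTH creation hints
(table and column names here are illustrative, not an actual schema):

    -- MyISAM sizes its internal data pointer from these hints, raising
    -- the default cap on how many rows the table can hold.
    CREATE TABLE stats (
        id          BIGINT       NOT NULL,
        recorded_at DATETIME     NOT NULL,
        payload     VARCHAR(300)
    ) ENGINE = MyISAM
      MAX_ROWS = 10000000000
      AVG_ROW_LENGTH = 300;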

As for pgsql, we use one of our smaller db servers to keep track of
our stats.  It's got a 6-disk RAID-10 array of 2 TB 5400 RPM SATA
drives (i.e. not that fast, really) and we store about 2.5M rows a day
in it, so in 365 days we could see around 900M rows.  Each daily
partition takes about 30 seconds to seq scan; on our faster servers,
we can seq scan all 2.5M rows in about 10 seconds.

PostgreSQL can handle it, but don't expect good performance with a
single 5400 RPM SATA drive or anything.
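
A sketch of the daily-partition layout Scott describes, using the
inheritance-based partitioning PostgreSQL offered at the time
(declarative partitioning only arrived in version 10); table and
column names are illustrative, not Scott's actual schema:

    -- Parent table plus one child table per day; the CHECK constraint
    -- lets the planner skip other days via constraint exclusion.
    CREATE TABLE stats (
        recorded_at timestamptz NOT NULL,
        payload     text
    );

    CREATE TABLE stats_2009_10_24 (
        CHECK (recorded_at >= '2009-10-24' AND recorded_at < '2009-10-25')
    ) INHERITS (stats);

    -- A seq scan of one day's ~2.5M rows touches only that partition:
    --   SELECT count(*) FROM stats_2009_10_24;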

Re: is postgres a good solution for billion record data.. what about mySQL?

From: Lew
Raymond O'Donnell wrote:
> On 24/10/2009 20:46, Tawita Tererei wrote:
>> In addition to this, what about MySQL? How much data (how many
>> records) can it manage?
>>
>> regards
>>
>> On Sun, Oct 25, 2009 at 3:32 AM, shahrzad khorrami <
>> shahrzad.khorrami@gmail.com> wrote:
>>
>>> Is Postgres a good solution for billion-record data? Think of 300 KB
>>> of data inserted into the DB every minute; I'm coding with PHP.
>>> What do you recommend for managing this data?
>
> I know that many people on this list manage very large databases with
> PostgreSQL. I haven't done it myself, but I understand that with the
> right hardware and good tuning, PG will happily deal with large volumes
> of data; and 300 KB a minute isn't really very much by any standard.
>
> You can get a few numbers here: http://www.postgresql.org/about/

I know folks who've successfully run multi-terabyte databases on PG.

--
Lew