Discussion: Capacity Planning


Capacity Planning

From:
Jeff Keller
Date:
Hi All -

We are reviewing possible database and operating solutions for our company
and we are looking at running PostgreSQL on Linux.

Does PostgreSQL have the capability to handle the following requirements?
Is anyone successfully running an application with similar characteristics?
100 Gig Database with 600 concurrent users.
500,000,000 Record Reads per 12 Hour Business Day
200,000 Record Creates per 12 Hour Business Day
1,500,000 Record Updates per 12 Hour Business Day
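For a rough sense of scale, those daily totals can be converted to average per-second rates. This is a back-of-envelope sketch only; it assumes the load is spread evenly over the 12-hour window, which real traffic rarely is:

```python
# Back-of-envelope: convert per-business-day totals to average per-second rates.
# Assumes load is spread evenly over the 12-hour window (real traffic peaks higher).
SECONDS_PER_DAY = 12 * 3600  # 43,200 seconds in a 12-hour business day

reads_per_sec   = 500_000_000 / SECONDS_PER_DAY
creates_per_sec =     200_000 / SECONDS_PER_DAY
updates_per_sec =   1_500_000 / SECONDS_PER_DAY

print(f"reads:   {reads_per_sec:,.0f}/s")   # ~11,574/s
print(f"creates: {creates_per_sec:.1f}/s")  # ~4.6/s
print(f"updates: {updates_per_sec:.1f}/s")  # ~34.7/s
```

Sustained read rates above 10,000 per second are the part of this workload that dominates the hardware question.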

Thanks,
Jeff




Re: Capacity Planning

From:
Gaetano Mendola
Date:
Jeff Keller wrote:

> Hi All -
>
> We are reviewing possible database and operating solutions for our company
> and we are looking at running PostgreSQL on Linux.
>
> Does PostgreSQL have the capability to handle the following requirements?
> Is anyone successfully running an application with similar characteristics?
> 100 Gig Database with 600 concurrent users.
> 500,000,000 Record Reads per 12 Hour Business Day
> 200,000 Record Creates per 12 Hour Business Day
> 1,500,000 Record Updates per 12 Hour Business Day

Well, those are big numbers. What you need is, for sure,
big iron.

Tell us what you are planning to buy in order to support that load.

My actual experience is (roughly):
100 concurrent users
2,000,000 reads per 12 h
1,000,000 updates per 12 h
50,000 new records each day

As you can see, this scenario is far away from your needs,
but we are using only a two-processor Intel Xeon 2.8 GHz in
hyperthreading mode with a not-so-tuned RAID system and only
1 GB of RAM.

I think that with 8 processors, good Fibre Channel access to your
RAID, and a good amount of memory, you can easily reach those numbers.
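For reference, a hypothetical postgresql.conf sketch for a machine in that class (multiple CPUs, several GB of RAM, fast storage). The parameter names below come from PostgreSQL of that era; the values are illustrative starting points under those assumptions, not tuned recommendations:

```
# Illustrative starting points for a large-memory machine; tune for your workload.
shared_buffers = 50000          # ~400 MB of 8 kB pages for the buffer cache
effective_cache_size = 400000   # ~3 GB; hint about the size of the OS page cache
work_mem = 8192                 # 8 MB per sort/hash operation (called sort_mem in 7.4)
checkpoint_segments = 32        # spread out checkpoint I/O for a write-heavy load
wal_buffers = 64                # WAL buffering for many concurrent writers
max_connections = 700           # headroom above the 600 concurrent users
```

The right values depend heavily on total RAM, the write mix, and whether a connection pooler sits in front of the database.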


This is a challenging task to accomplish; do you need any help out there? ;-)


Regards
Gaetano Mendola










Re: Capacity Planning

From:
Jeff Keller
Date:
I had a typo in the first post.  The Record Reads per day should be
50,000,000, not 500 million.  My mistake.  One decimal place makes a huge
difference.  Our current app is Progress-based, running on an IBM p650
with 4 processors, and I suspect a similar load if we were to change apps and
databases.

Thanks,
Jeff







Re: Capacity Planning

From:
Gaetano Mendola
Date:
Jeff Keller wrote:

> I had a typo in the first post.  The Record Reads per day should be
> 50,000,000, not 500 Million.  My mistake.  One decimal place makes a huge
> difference.   Our current app is Progress based with running on an IBM p650
> with 4 processors and suspect a similar load if we were to changes apps and
> databases.

That's different, and completely affordable by an IBM p650. How much RAM?
What about your disks?
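With the corrected figure, the average read rate drops by an order of magnitude (a quick back-of-envelope check, again assuming the load is spread evenly over the business day):

```python
# Corrected workload: 50 million reads per 12-hour day instead of 500 million.
SECONDS_PER_DAY = 12 * 3600  # 43,200 seconds

reads_per_sec = 50_000_000 / SECONDS_PER_DAY
print(f"{reads_per_sec:,.0f} reads/s on average")  # ~1,157/s
```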

I forgot to say in my previous post that, with my numbers:

100 concurrent users
2,000,000 reads per 12 h
1,000,000 updates per 12 h
50,000 new records each day


the load average (the Unix one) is under 2.
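The 1-, 5-, and 15-minute load averages referred to here can be read on any Unix system; a minimal sketch:

```python
# Read the Unix load averages (1-, 5-, and 15-minute) for the local machine.
import os

one, five, fifteen = os.getloadavg()
print(f"load average: {one:.2f}, {five:.2f}, {fifteen:.2f}")
```

As a rule of thumb, a load average persistently above the number of CPUs suggests the machine is saturated.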



Regards
Gaetano Mendola