On 10/28/2016 08:44 AM, Warner, Gary, Jr wrote:
> I've recently been blessed to move one of my databases onto a huge IBM P8 computer. It's a PowerPC architecture
> with 20 8-way cores (so postgres SHOULD believe there are 160 cores available) and 1 TB of RAM.
>
> I've always done my postgres tuning with a copy of "pgtune" which says in the output:
>
> # WARNING
> # this tool not being optimal
> # for very high memory systems
>
> So . . . what would I want to do differently based on the fact that I have a "very high memory system"?
The most obvious changes you are going to want to make (depending on
your PostgreSQL version):

* A very high shared_buffers (in newer releases it is not uncommon to
have many, many GB of shared_buffers)
* Use that work_mem, baby. You have 1 TB available? Take your average
data set returned per query and make work_mem at least that -- keeping
in mind that work_mem is allocated per sort/hash operation, per
connection, so it multiplies quickly
* IIRC (and this may be old advice), maintenance_work_mem up to 4GB. As
I recall it won't effectively use more than that, but I could be wrong.
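A rough postgresql.conf sketch of the above (the numbers are
illustrative assumptions for a 1 TB box, not recommendations --
benchmark against your own workload):

    # illustrative starting points only, for a 1 TB RAM machine
    shared_buffers = 64GB           # "very high"; values this large deserve benchmarking
    work_mem = 1GB                  # per sort/hash operation, per connection
    maintenance_work_mem = 4GB      # used by VACUUM, CREATE INDEX, etc.
    effective_cache_size = 768GB    # planner hint: roughly RAM minus everything else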
Lastly, but most importantly: test, test, test.
JD
--
Command Prompt, Inc. http://the.postgres.company/
+1-503-667-4564
PostgreSQL Centered full stack support, consulting and development.
Everyone appreciates your honesty, until you are honest with them.
Unless otherwise stated, opinions are my own.