Re: Load 500 GB test data with Large objects and different types

From Holger Jakobs
Subject Re: Load 500 GB test data with Large objects and different types
Date
Msg-id B99F6B60-C662-4C5A-AAF2-A40057A4CDBA@jakobs.com
In response to Re: Load 500 GB test data with Large objects and different types  (Lucio Chiessi <lucio.chiessi@trustly.com>)
List pgsql-admin
Converting int values to timestamp is possible with PostgreSQL's functions.

The already mentioned generate_series() can generate timestamp values directly.

So you can choose.
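A minimal sketch of both options (the literal values are illustrative):

```sql
-- Option 1: convert an int (Unix epoch seconds) to a timestamp
SELECT to_timestamp(1676577764);

-- Option 2: let generate_series() produce timestamps directly,
-- here one row per hour over a full day
SELECT g AS ts
FROM generate_series('2023-01-01'::timestamp,
                     '2023-01-02'::timestamp,
                     interval '1 hour') AS g;
```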

What do you mean by large objects? Large binary values can be stored either in a bytea column or as large objects, which live separately from the table.
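The two storage styles can be sketched like this (table and column names are illustrative; convert_to() is just one way to produce test bytea data):

```sql
-- bytea: binary data stored inline in the table row
CREATE TABLE docs (id int PRIMARY KEY, body bytea);
INSERT INTO docs
VALUES (1, convert_to(repeat('x', 1024), 'UTF8'));  -- 1 kB of filler

-- Large object: stored separately, the table holds only an OID reference
CREATE TABLE docs_lo (id int PRIMARY KEY, body oid);
INSERT INTO docs_lo
VALUES (1, lo_from_bytea(0, convert_to(repeat('x', 1024), 'UTF8')));
```

Passing 0 to lo_from_bytea() lets the server assign the OID.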


On 16 February 2023 at 21:02:44 GMT+01:00, Lucio Chiessi <lucio.chiessi@trustly.com> wrote:
Hi Raj.  You can use the generate_series() function to create millions of rows and do an insert from this select.
I regularly use it to generate my test data.
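A minimal sketch of that pattern, generating a million rows of mixed-type data in one statement (table and column names are illustrative):

```sql
-- INSERT ... SELECT over generate_series() to bulk-create test rows
CREATE TABLE testdata (id int, label text, created_at timestamptz, amount numeric);

INSERT INTO testdata
SELECT g,
       md5(g::text),                        -- varied text values
       now() - (g || ' seconds')::interval, -- spread-out timestamps
       round((random() * 1000)::numeric, 2) -- random numeric amounts
FROM generate_series(1, 1000000) AS g;
```

Scaling the series bound up (and adding wider columns such as bytea filler) is how you grow the dataset toward a target size like 500 GB.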

Lucio Chiessi

Senior Database Administrator

Trustly, Inc.

M: +55 27 996360276




On Thu, Feb 16, 2023 at 3:46 PM Raj kumar <rajkumar820999@gmail.com> wrote:
Hi, 

What is the easiest/best way to load 500 GB of data for testing purposes with flexible data types?
1) Timestamp/datetime
2) Blob/large objects
3) Other different data types

I tried pgbench and sysbench, which only generate int and varchar types.

Thanks,
Raj Kumar Narendiran.


In the pgsql-admin list, by date:

Previous
From: Lucio Chiessi
Date:
Message: Re: Load 500 GB test data with Large objects and different types
Next
From: Karthik Krishnakumar
Date:
Message: Upgrading postgres quickly, without downtime.