Re: How To: A large [2D] matrix, 100,000+ rows/columns

From: Joe Conway
Subject: Re: How To: A large [2D] matrix, 100,000+ rows/columns
Date:
Msg-id: 626833f2-50e2-342c-e8ea-cb16b0d01785@joeconway.com
In reply to: How To: A large [2D] matrix, 100,000+ rows/columns  (Pat Trainor <pat.trainor@gmail.com>)
Responses: Re: How To: A large [2D] matrix, 100,000+ rows/columns
List: pgsql-general
On 6/8/23 22:17, Pat Trainor wrote:
> Imagine something akin to stocks, where you have a row for every stock, 
> and a column for every stock. Except where the same stock is the row & 
> col, a number is at each X-Y (row/column), and that is the big picture. 
> I need to have a very large matrix to maintain & query, and if not 
> (1,600 column limit), then how could such data be broken down to work?

  100,000 rows *
  100,000 columns *
  8 bytes (assuming float8)
= about 80 GB per matrix if I got the math correct.

Is this really a dense matrix or is it sparse? What kind of operations?

Does it really need to be stored as such or could it be stored as 
vectors that are converted to a matrix on the fly when needed?
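
For example, a sparse "triplet" layout is the usual way to store that in 
SQL, and any single row or column can be pulled back out as a vector. A 
minimal sketch (table and column names are invented for illustration):

  -- One row per stored cell; zero cells are simply omitted,
  -- which is where a sparse representation saves its space.
  CREATE TABLE matrix_cells (
      row_id int    NOT NULL,
      col_id int    NOT NULL,
      val    float8 NOT NULL,
      PRIMARY KEY (row_id, col_id)
  );

  -- Reassemble one matrix row as a float8[] on the fly.
  -- Note: the array holds only the stored cells, ordered by column.
  SELECT array_agg(val ORDER BY col_id) AS row_vector
    FROM matrix_cells
   WHERE row_id = 42;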

Seems like using Python or R makes more sense. Perhaps it would make 
sense to store the data in Postgres and use plpython or plr. But it is 
hard to say without more details.
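
If you went the plpython route, that on-the-fly conversion could live 
inside a function. A rough, untested sketch, assuming the matrix_cells 
table from above and the plpython3u extension:

  CREATE EXTENSION IF NOT EXISTS plpython3u;

  -- Dot product of two sparse matrix rows, computed in Python.
  CREATE OR REPLACE FUNCTION row_dot(a int, b int) RETURNS float8
  LANGUAGE plpython3u AS $$
      plan = plpy.prepare(
          "SELECT col_id, val FROM matrix_cells WHERE row_id = $1",
          ["int"])
      ra = {r["col_id"]: r["val"] for r in plpy.execute(plan, [a])}
      rb = {r["col_id"]: r["val"] for r in plpy.execute(plan, [b])}
      # Only columns present in both sparse rows contribute.
      return sum(v * rb[k] for k, v in ra.items() if k in rb)
  $$;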


-- 
Joe Conway
PostgreSQL Contributors Team
RDS Open Source Databases
Amazon Web Services: https://aws.amazon.com