Hi Martin,
On 01/02/2019 19.41, Martin Stöcker wrote:
> Hi Sven,
>
> in many cases I prefer "copy from" to import data. So I can create my
> test data via spreadsheet and csv.
Sure, that works, but it would be even more useful if the COPY command
actually read the first line of a CSV file and used it to find the
target columns in the table. It does not; the HEADER option merely
skips that line.
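For comparison, this is roughly what one has to write today (a sketch;
the file path /tmp/test.csv and its contents are made up for the
example):

```sql
-- HEADER only tells COPY to skip the first line of the file;
-- it does not match the header names against the table's columns,
-- so the column list still has to be spelled out by hand.
COPY test (i, t) FROM '/tmp/test.csv' WITH (FORMAT csv, HEADER);
```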
> Sometimes it is helpful to do some shell stuff to create it or import
> with psql.
Yes, that definitely works.
> But if you prefer to have column names and column data near to each
> other, why not using json?
>
> postgres=#create table test ( i integer, t text);
> postgres=# insert into test(select * from
> json_to_recordset('[{"i":1,"t":"foo"},{"i":"7","t":"bar"}]') as x(i
> int, t text));
That also works, but it requires naming all the columns and their types
in the "AS x" part, which makes the statement very verbose. My proposal
would not require the types to be stated and would only require the
column names in the "data part" of the statement.
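As an aside, json_populate_recordset gets part of the way there: it
derives the column names and types from an existing row type, so the
"AS x" list can be dropped (assuming the test table from your example):

```sql
-- null::test supplies the row type, so the column names and
-- types are inferred from the table definition instead of
-- being repeated in an AS (...) clause.
INSERT INTO test
SELECT * FROM json_populate_recordset(null::test,
    '[{"i":1,"t":"foo"},{"i":7,"t":"bar"}]');
```

It still requires an existing composite type to borrow from, of course,
which is exactly what a COPY-like syntax would not need.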
> INSERT 0 2
> postgres=# select * from test;
> i | t
> ---+-----
> 1 | foo
> 7 | bar
> (2 rows)
>
> Regards Martin
Thanks for your thoughts!
With kind regards,
Sven