> I'm working with some people who live and breathe Excel. I need to be able
> to move data back and forth between formats which make sense for Excel and
> for PostgreSQL. In some cases, this is just to accommodate what people are
> used to. In other cases, like statistical clustering, it's something that
> really has to be done.
>
> Here is a simplified example:
>
> I'm given data in Excel with one sheet each for a bunch of experiments. In
> each sheet, there are rows with different drugs at different doses and
> columns for each subject. The cells contain the response data. I wrote a
> Perl script which automates the process of extracting that data into a csv
> file which can be imported into a table like the following:
>
> create table results(
>     expt_no int references experiments(id),
>     subject int references subjects(id),
>     drug text references drugs(name),
>     dose numeric,
>     response numeric
> )
>
> Now, suppose I do some computation on the results in the database and want
> to export it back out in the same kind of format in which I received it
> (drugs and doses in rows and subjects in columns.) One method would be to
> Perl. I could use DBD::Pg and loop through a bunch of queries to build a
> two dimensional array and then spit that back out but is there a good way
> to do this just in SQL? Is there a better way than creating a temporary
> table for each subject and then joining all the temp tables?
You may want to look into the tablefunc contrib module. It contains a crosstab
function which will transpose rows and columns in the result set.
It may be slow, though.
HTH
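
As a rough sketch against the results table above, a crosstab query might look
something like this (the subject ids and the number of output columns are
assumptions -- crosstab requires you to spell out the output column list
yourself, one column per subject):

```sql
-- tablefunc must be installed first (run the contrib SQL script,
-- or on newer versions: CREATE EXTENSION tablefunc;)
SELECT *
FROM crosstab(
    -- source query: one row per (row label, category, value)
    'SELECT drug || '' @ '' || dose AS drug_and_dose, subject, response
       FROM results
      WHERE expt_no = 1
      ORDER BY 1, 2',
    -- category query: the subjects, in the order of the output columns
    'SELECT DISTINCT subject FROM results WHERE expt_no = 1 ORDER BY 1'
) AS ct(drug_and_dose text,
        subj_1 numeric,
        subj_2 numeric,
        subj_3 numeric);
```

The two-argument form is the safer one to use here: it matches values to
columns by category, so subjects with missing responses come out as NULLs
instead of shifting values into the wrong column. The result can then be
dumped with COPY ... TO ... WITH CSV and handed straight back to the Excel
users.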