Yes, it seems so. I isolated one of the tables that had the problem and ran a
test. First, I loaded it from the original file and saw the same thing: "invalid
command" followed by "out of memory". After I fixed those escape
characters, the table loaded successfully.
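For reference, this is roughly the kind of fix I mean (a minimal sketch; the filenames are placeholders, and the sed is naive -- it only handles \' and would need more work for \\, \n, etc.):

```shell
# MySQL dumps escape quotes as \' , but PostgreSQL 9.1 defaults to
# standard_conforming_strings = on, which expects the SQL-standard '' .
# 'dump.sql' stands in for the real dump file.
cat > dump.sql <<'EOF'
INSERT INTO t VALUES ('it\'s a test');
EOF

# Rewrite \' to '' before feeding the file to psql.
sed "s/\\\\'/''/g" dump.sql > dump_fixed.sql
cat dump_fixed.sql
# INSERT INTO t VALUES ('it''s a test');

# Then load it, e.g.: psql -f dump_fixed.sql mydb
```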
On 2011/8/30 10:14, Scott Marlowe wrote:
> 2011/8/29 Rural Hunter<ruralhunter@gmail.com>:
>> Hi all,
>> I'm a newbie here. I'm trying to test pgsql with my mysql data. If the
>> performance is good, I will migrate from mysql to pgsql.
>> I installed pgsql 9.1rc on my Ubuntu server. I'm trying to import a large
>> sql file dumped from mysql into pgsql with 'psql -f'. The file is around
>> 30G with bulk insert commands in it. It ran for several hours and then aborted
>> with an "out of memory" error. This is the tail of the log I got:
>> INSERT 0 280
>> INSERT 0 248
>> INSERT 0 210
>> INSERT 0 199
>> invalid command \n
>> out of memory
> I'd look at what's leading to the "invalid command \n" up there. I
> doubt the out of memory error is more than a symptom of that, i.e. some
> part of your inserts is improperly formatted and the machine is then
> trying to insert a row with what it thinks is a column of several
> gigabytes.
>
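
That matches what I saw: with backslash not treated as an escape, the \'
leaves the string literal open, so psql keeps consuming the following text as
one enormous value until it hits another quote. A rough illustration of the
quote parity (just counting quotes with awk -- not PostgreSQL's real parser):

```shell
# Two sample INSERTs; the first uses a MySQL-style \' escape.
cat > bad.sql <<'EOF'
INSERT INTO t VALUES ('it\'s one');
INSERT INTO t VALUES ('two');
EOF

# Count single quotes per line. Under standard_conforming_strings,
# backslash is not an escape, so an odd count means the string literal
# is still open and everything after it is swallowed as data.
awk -F"'" '{ print NR ": " (NF-1) " quotes" }' bad.sql
# 1: 3 quotes   (odd -> literal left open)
# 2: 2 quotes
```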