Discussion: Problems Importing table to pgadmin

Problems Importing table to pgadmin

From: metaresolve
Date:
Hi,

pgAdmin 4
Windows 10

I'm brand new and struggling. I was able to create a table with the CREATE
TABLE command and set up the columns. However, when I try to "import",
nothing happens at all. I run the import and hit OK, and nothing happens.
If I run SELECT * FROM [table] I get no rows back. I'm not sure what I'm doing
wrong.

This is the query I used to create the table:

CREATE TABLE cc_20161207 (
status varchar(1),
precinct_num varchar(4),
congress varchar(1),
assembly varchar(2),
senate varchar(2),
commission varchar(1),
education varchar(1),
regent varchar(2),
school varchar(1),
city varchar(3),
ward varchar(3),
township varchar(3),
first_name varchar(20),
middle_name varchar(20),
last_name varchar(30),
name_suffix varchar(3),
sex varchar(1),
party_reg varchar(3),
birth_year varchar(4),
phone_num varchar(12),
res_street_num varchar(6),
res_direction varchar(2),
res_street_name varchar(25),
res_address_type varchar(4),
res_unit varchar(20),
res_city varchar(40),
res_state varchar(2),
res_zip_code varchar(5),
mail_address varchar(32),
mail_city varchar(50),
mail_state varchar(2),
mail_zip_code varchar(5),
activity_date varchar(10),
registration_date varchar(10),
registration_num varchar(9),
language_pref varchar(10),
polling_code varchar(3),
confidential_flag varchar(1),
id_not_reqd varchar(1),
affidavit varchar(45),
election1 varchar(4),
vote_type1 varchar(1),
election2 varchar(4),
vote_type2 varchar(1),
election3 varchar(4),
vote_type3 varchar(1),
election4 varchar(4),
vote_type4 varchar(1),
election5 varchar(4),
vote_type5 varchar(1),
election6 varchar(4),
vote_type6 varchar(1),
election7 varchar(4),
vote_type7 varchar(1),
election8 varchar(4),
vote_type8 varchar(1),
election9 varchar(4),
vote_type9 varchar(1),
election10 varchar(4),
vote_type10 varchar(1),
election11 varchar(4),
vote_type11 varchar(1),
election12 varchar(4),
vote_type12 varchar(1),
election13 varchar(4),
vote_type13 varchar(1),
election14 varchar(4),
vote_type14 varchar(1),
election15 varchar(4),
vote_type15 varchar(1),
election16 varchar(4),
vote_type16 varchar(1),
election17 varchar(4),
vote_type17 varchar(1),
election18 varchar(4),
vote_type18 varchar(1),
election19 varchar(4),
vote_type19 varchar(1),
election20 varchar(4),
vote_type20 varchar(1)
    );

However, when I look at the table, it's got the OID fields in there. From
what I read, the default is set to off, so I don't understand why they're
being created.
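
If it helps, this is roughly how one could check whether the table actually has OIDs (just a sketch; both of these exist on the PostgreSQL versions that were current at the time):

-- current value of the server-wide default
SHOW default_with_oids;

-- whether this particular table was created with OIDs
SELECT relname, relhasoids
FROM pg_class
WHERE relname = 'cc_20161207';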

<http://postgresql.nabble.com/file/n5933807/30652992334_36a60a0fed.jpg>

I think this is the problem, because the main error I'm getting is 42601,
which, when I googled it, seemed to mean it was trying to import a table with
a different number of fields.

I tried to ALTER TABLE to drop the OID columns but no luck.
When I try to COPY cc_20161207 from "file" I get this error:

<http://postgresql.nabble.com/file/n5933807/30685290943_4a7c7136ce.jpg>

When I try to right-click on the table and import, I go through the dialog
and click "OK", but nothing ever happens.

Can someone help me with where I'm going wrong?
Thanks,
meta





Re: Problems Importing table to pgadmin

From: "David G. Johnston"
Date:
On Wed, Dec 7, 2016 at 4:13 PM, metaresolve <solvemetare@gmail.com> wrote:
However, when I look at the table it's got the OID fields in there. From
what I read, the default is set to off, so I don't understand why they're
creating them.

<http://postgresql.nabble.com/file/n5933807/30652992334_36a60a0fed.jpg>

The first 7 fields in the image are system fields that are always present but should never be visible.  I'd raise a complaint over on pgadmin-support@postgresql.org, or learn the psql utility.

Also,

On the second image you are using double quotes to delimit a string literal.  This is wrong.  PostgreSQL always uses single quotes for literal string values; double quotes are reserved for object identifiers (table names, column names, etc.).
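
For example (the file path below is only a placeholder for wherever your file actually lives):

-- Wrong: double quotes mean "identifier", so this is rejected with a syntax error
COPY cc_20161207 FROM "C:\data\cc_20161207.txt";

-- Right: single quotes mark a string literal
COPY cc_20161207 FROM 'C:\data\cc_20161207.txt';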

David J.

Re: Problems Importing table to pgadmin

From: metaresolve
Date:
Thank you! It was the double quotes. I did run into a permissions error
afterwards, but I solved it with a Google search.

Thanks,
meta





Re: Problems Importing table to pgadmin

From: John R Pierce
Date:
On 12/7/2016 3:28 PM, David G. Johnston wrote:
On the second image you are using double quotes to delimit a string literal.  This is wrong.  PostgreSQL always uses single quotes for literal string values; double quotes are reserved for object identifiers (table names, column names, etc.).


AND, COPY table FROM filename only works at the *SERVER* level, not at the user level, and can only be done by a database superuser.  The database server likely has no right to access files on your desktop.  In the psql shell, you can use \copy to do this; it actually does a COPY table FROM stdin, then the psql client reads your file and pipes it to that COPY command, so the file is read by the client, not the server.
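
A rough sketch of what that looks like (the path and the CSV options are placeholders for whatever your file really is):

-- run from inside psql; the file is read on the client machine
\copy cc_20161207 FROM 'C:\data\cc_20161207.txt' WITH (FORMAT csv, HEADER)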



-- 
john r pierce, recycling bits in santa cruz

Re: Problems Importing table to pgadmin

From: metaresolve
Date:
That's.... a little beyond me. Let me back up a sec and maybe you guys can
help.

I used to use Access to do my data crunching, matching, and cleaning at my
old job. I worked with a max of 600k records so Access could handle it. I
know, lame, but it's what I knew.

I was using Alteryx the past 8 months on files of 20M and higher. But, I got
laid off, so I don't have access to that software anymore.
My thought was to use PostgreSQL as a kind of more advanced Access that I
could use to crunch numbers on similarly. However, my file has 1.1M records
in it and pgAdmin seems to be choking on it.

How many records and relational tables can pgadmin/postgres actually handle?
I'm anticipating tests for my upcoming job hunts and I need something that
gives me query capability. I figured since I used to do it in Access I could
do it in SQL.

(BTW, the reason I picked PostgreSQL is that my friend from my old job uses it
and likes it. I thought it would be easier to get help from her.)

Any input?





Re: Problems Importing table to pgadmin

From: Adrian Klaver
Date:
On 12/07/2016 04:02 PM, metaresolve wrote:
> That's.... a little beyond me. Let me back up a sec and maybe you guys can
> help.
>
> I used to use Access to do my data crunching, matching, and cleaning at my
> old job. I worked with a max of 600k records so Access could handle it. I
> know, lame, but it's what I knew.
>
> I was using Alteryx the past 8 months on files of 20M and higher. But, I got
> laid off, so I don't have access to that software anymore.
> My thought was to use postgreSQL as a kind of more advanced Access that I
> could use to crunch numbers on similarly. However, My file has 1.1M records
> on it and pgadmin seems to be choking on it.

Define choking?

>
> How many records and relational tables can pgadmin/postgres actually handle?

https://www.postgresql.org/about/

So you have plenty of head room.

> I'm anticipating tests for my upcoming job hunts and I need something that
> gives me query capability. I figured since I used to do it in Access I could
> do it in SQL.
>
> (btw, the reason I picked postgreSQL is my friend from my old job uses it
> and likes it. i thought it would be easier to get help from her.)
>
> Any input?


--
Adrian Klaver
adrian.klaver@aklaver.com


Re: Problems Importing table to pgadmin

From: Tom Lane
Date:
Adrian Klaver <adrian.klaver@aklaver.com> writes:
> On 12/07/2016 04:02 PM, metaresolve wrote:
>> How many records and relational tables can pgadmin/postgres actually handle?

> https://www.postgresql.org/about/
> So you have plenty of head room.

Well, pgadmin and postgres are two different things.  pgadmin is mostly a
GUI, and AFAIK it's not really designed to perform well with query results
that are enormously larger than what you might want to scroll through
on-screen.

You might have better luck if you tell it to use COPY for bulk data
transfers.  I don't use it so I can't offer any detailed help, but
I believe that's possible.

            regards, tom lane


Re: Problems Importing table to pgadmin

From: metaresolve
Date:
Choking: I get the "Waiting for the query execution to complete" spinner circling
around for a while. I tried shutting it down and trying again, but it's still
freezing on the execution. But if the terabyte limits on that page are accurate,
I wonder why it's slowing down on this? Any thoughts?





Re: Problems Importing table to pgadmin

From: Adrian Klaver
Date:
On 12/07/2016 04:54 PM, metaresolve wrote:
> Choking: I get the "Waiting for the query execution to complete" circling
> around for a while. I tried shutting it down and trying again but it's still
> freezing on the execution. But if the TB are accurate, I wonder why it's
> slowing on this? Any thoughts?

As Tom said, pgAdmin is a GUI. If you are indeed trying to pull 1 million
rows into it, then it will be slow. Think 1 million rows x the number of
fields per row, with a GUI widget for each cell; it takes a lot of time to
build that. Try the Postgres command-line client, psql, and see what happens.
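
Something along these lines, for instance (the database and user names here are placeholders):

psql -U postgres -d your_database

-- then, at the psql prompt:
\timing on
SELECT count(*) FROM cc_20161207;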



--
Adrian Klaver
adrian.klaver@aklaver.com


Re: Problems Importing table to pgadmin

From: John R Pierce
Date:
On 12/7/2016 4:54 PM, metaresolve wrote:
> Choking: I get the "Waiting for the query execution to complete" circling
> around for a while. I tried shutting it down and trying again but it's still
> freezing on the execution. But if the TB are accurate, I wonder why it's
> slowing on this? Any thoughts?

Note that database write performance on a Windows desktop system that's
probably using one desktop hard drive is going to be a LOT slower than on a
proper dedicated database server.

Inserting lots of rows into a table that has multiple indexes is also
quite time consuming.  In fact, if you're starting with an empty
table, don't create ANY indexes until AFTER you populate the table with
data, and before doing the CREATE INDEX, make sure
'maintenance_work_mem' is something like 1GB.
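
Roughly, as a sketch (the file path and the index are made up for illustration):

-- 1. load into the empty, index-free table (from psql, so the file is read client-side)
\copy cc_20161207 FROM 'C:\data\cc_20161207.txt' WITH (FORMAT csv)

-- 2. give index builds plenty of memory for this session
SET maintenance_work_mem = '1GB';

-- 3. only now build whatever indexes you need
CREATE INDEX cc_20161207_last_name_idx ON cc_20161207 (last_name);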


--
john r pierce, recycling bits in santa cruz



Re: Problems Importing table to pgadmin

From: Adrian Klaver
Date:
On 12/07/2016 04:54 PM, metaresolve wrote:
> Choking: I get the "Waiting for the query execution to complete" circling
> around for a while. I tried shutting it down and trying again but it's still
> freezing on the execution. But if the TB are accurate, I wonder why it's
> slowing on this? Any thoughts?

Also, what is the query you are executing?



--
Adrian Klaver
adrian.klaver@aklaver.com


Re: Problems Importing table to pgadmin

From: John R Pierce
Date:
On 12/7/2016 4:02 PM, metaresolve wrote:
> I used to use Access to do my data crunching, matching, and cleaning at my
> old job. I worked with a max of 600k records so Access could handle it. I
> know, lame, but it's what I knew.

Access is really two completely different things bundled together.  One is the
rather weak "Jet" sort-of-relational database engine that implements a subset
of SQL.  The other is a database application development system oriented
around forms and reports.


> My thought was to use postgreSQL as a kind of more advanced Access that I
> could use to crunch numbers on similarly. However, My file has 1.1M records
> on it and pgadmin seems to be choking on it.

Postgres, on a properly scaled and tuned database server, can handle
billions of records.  Obviously, doing something silly like querying
all those billions at once will never be fast; that's a lot of data to marshal
and process.



--
john r pierce, recycling bits in santa cruz



Re: Problems Importing table to pgadmin

From: metaresolve
Date:
Uh, yeah, it was a SELECT * from cc_20161207;

I know, it was dumb. I didn't realize it would break it or at least run for
a while. I tend to do things in small steps, run a query, check my results,
then tweak.

You're right, I wouldn't want to be viewing that whole million. So I guess I
could just pull the top 10,000 rows to get a sense, then do my
aggregate GROUP BYs/etc. to get my final numbers? That's how I hacked Access
into being my data analysis program until I got to Alteryx. But I was also
never using files of 1M rows then.
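
Something like this, I'm guessing (just a sketch):

-- peek at a sample instead of pulling the whole table into the GUI
SELECT * FROM cc_20161207 LIMIT 10000;

-- then let the server do the heavy lifting and return only the summary
SELECT party_reg, count(*)
FROM cc_20161207
GROUP BY party_reg
ORDER BY count(*) DESC;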






Re: Problems Importing table to pgadmin

From: Adrian Klaver
Date:
On 12/07/2016 05:19 PM, metaresolve wrote:
> Uh, yeah, it was a SELECT * from cc_20161207;
>
> I know, it was dumb. I didn't realize it would break it or at least run for
> a while. I tend to do things in small steps, run a query, check my results,
> then tweak.
>
> You're right, I wouldn't want to be viewing those million. so I guess I
> could just be doing the top 10000 rows to get a sense, then be doing my
> aggregate group bys/etc to get my final numbers? That's how I hacked Access
> to be my data analysis program until I got to Alteryx. But I was also never
> using files of 1M then.
>

FYI pgAdmin, if I remember correctly, has a setting that limits the
maximum number of rows that it will fetch at a time.



--
Adrian Klaver
adrian.klaver@aklaver.com


Re: Problems Importing table to pgadmin

From: Melvin Davidson
Date:



With regard to John's comments:
>Postgres, on a properly scaled and tuned database server, can handle billions of records.
>Obviously, doing something silly like querying all billions at once will never be fast; that's a lot of data to marshal and process.

On a similar note, your table structure is not exactly optimized for a relational database.
I see no primary key to identify each row. In addition, the election & vote columns
belong in a separate table that is a child of cc_20161207, since they are repeating fields.

eg:
ALTER TABLE cc_20161207
  ADD COLUMN cc_20161207_pk serial;

UPDATE cc_20161207
  SET cc_20161207_pk = nextval('cc_20161207_pk_seq')
WHERE cc_20161207_pk IS NULL;
 
ALTER TABLE cc_20161207
  ALTER COLUMN cc_20161207_pk SET NOT NULL;
ALTER TABLE cc_20161207
  ADD CONSTRAINT cc_20161207 PRIMARY KEY (cc_20161207_pk); 

 
CREATE TABLE election_data (
cc_20161207_pk bigint NOT NULL,
election_id serial NOT NULL,
election varchar(4),
vote_type varchar(1),
CONSTRAINT election_data_pk PRIMARY KEY (cc_20161207_pk, election_id)
);
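
And once election_data exists, one possible way to move the repeating columns across (just a sketch; it assumes empty election slots are NULL, and the same pattern continues through election20/vote_type20):

INSERT INTO election_data (cc_20161207_pk, election, vote_type)
SELECT cc_20161207_pk, election1, vote_type1
  FROM cc_20161207 WHERE election1 IS NOT NULL
UNION ALL
SELECT cc_20161207_pk, election2, vote_type2
  FROM cc_20161207 WHERE election2 IS NOT NULL;
-- ...and so on for election3/vote_type3 through election20/vote_type20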

--
Melvin Davidson
I reserve the right to fantasize.  Whether or not you
wish to share my fantasy is entirely up to you.

Re: Problems Importing table to pgadmin

From: Metare Solve
Date:
Melvin: haha, yeah, it's a download from the Clark County, NV voterfile website. It's just the format they send out to people who request the file. I worked this summer doing QA on voterfile builds, so I'm familiar with the data. I thought it would be good stuff to start with.

But thank you for the code showing how to fix that flat file. I'm half-teaching myself SQL and building a database for some game data I have, with the plan to sit Tableau on top of it.

BTW, you guys are really nice and helpful. I'm definitely coming back; you're making me want to make some progress on that dashboard.
