On 08/20/2015 12:34 PM, Gregory Arenius wrote:
> I have a Pandas dataframe that I'm inserting into an SQL database. I'm
> using Psycopg2 directly to talk to the database, not SQLAlchemy, so I
> can't use Pandas' built-in to_sql function. Almost everything works as
> expected, except that numpy np.NaN values get converted to the text
> 'NaN' and inserted into the database. They really should be treated as
> SQL NULL values.
>
> So, I'm trying to write a custom adapter to convert np.NaN to SQL NULL,
> but everything I've tried results in the same 'NaN' strings being
> inserted into the database.
>
> The code I'm currently trying is:
>
> def adapt_nans(null):
>     a = adapt(None).getquoted()
>     return AsIs(a)
>
> register_adapter(np.NaN, adapt_nans)
>
> I've tried a number of variations along this theme but haven't had any luck.
How about having Pandas do the work? From here:
https://github.com/pydata/pandas/issues/1972
import pandas as pd
df = pd.DataFrame([
    dict(a=1, p=0),
    dict(a=2, m=10),
    dict(a=3, m=11, p=20),
    dict(a=4, m=12, p=21)
], columns=('a', 'm', 'p', 'x'))
In [4]: df
Out[4]:
   a   m   p   x
0  1 NaN   0 NaN
1  2  10 NaN NaN
2  3  11  20 NaN
3  4  12  21 NaN
In [5]: df.where(pd.notnull(df), None)
Out[5]:
   a     m     p     x
0  1  None     0  None
1  2    10  None  None
2  3    11    20  None
3  4    12    21  None
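To wire that into a psycopg2 insert, one sketch is to cast to object dtype first so the None placeholders survive (on numeric columns where() would otherwise coerce None straight back to NaN), then hand the plain Python rows to executemany(). The table name and column list here are hypothetical:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame([
    dict(a=1, p=0),
    dict(a=2, m=10),
    dict(a=3, m=11, p=20),
    dict(a=4, m=12, p=21)
], columns=('a', 'm', 'p', 'x'))

# Cast to object before where(): object columns can hold None, so the
# NaN -> None replacement isn't undone by numeric dtype coercion.
rows = df.astype(object).where(pd.notnull(df), None).values.tolist()

# Each row is now a plain list with None in place of NaN, which
# psycopg2 adapts to SQL NULL; with a live connection you would do:
# cur.executemany("INSERT INTO t (a, m, p, x) VALUES (%s, %s, %s, %s)", rows)
```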
>
> Cheers,
>
> Greg
>
--
Adrian Klaver
adrian.klaver@aklaver.com