BUG: error in handling a sqlalchemy type with arguments (instantiated type, not class) #9083
@pilotstew Thanks for the report! There appear to be a few problems:
@pilotstew You can solve it for now using this monkey-patch:
Can you try that? This should let you pass the instantiated type in dtype.
Worked perfectly!
@pilotstew Good to hear!
Actually, no, it didn't: in pandas all timestamps were UTC, but they were converted to the local timezone in postgres. Since they are all uniform and converted correctly it's not a huge issue; I can convert back to UTC on the read_sql.
OK, yes, that is what I expected and described in #9086. So this is another enhancement to tackle as part of 'real' timezone support!
I opened a couple of other issues that I encountered while looking into this:
Let's keep this issue for the actual error in handling an instantiated sqlalchemy type in dtype.
Hi, the same code above is causing errors again, albeit different ones:

import sqlalchemy
import pandas as pd

eng = sqlalchemy.create_engine("postgresql+psycopg2://******:********@localhost:5432/postgres")
times = ['201412120154', '201412110254']
df = pd.DataFrame()
df['time'] = pd.to_datetime(times, utc=True)
df.time.to_sql('test', eng, dtype={'time': sqlalchemy.TIMESTAMP}, if_exists='append')

With the traceback:

Traceback (most recent call last):
File "./test.py", line 10, in <module>
df.time.to_sql('test', eng, dtype={'time': sqlalchemy.TIMESTAMP}, if_exists='append')
File "/usr/local/lib/python2.7/dist-packages/pandas/core/generic.py", line 1165, in to_sql
chunksize=chunksize, dtype=dtype)
File "/usr/local/lib/python2.7/dist-packages/pandas/io/sql.py", line 571, in to_sql
chunksize=chunksize, dtype=dtype)
File "/usr/local/lib/python2.7/dist-packages/pandas/io/sql.py", line 1250, in to_sql
table.insert(chunksize)
File "/usr/local/lib/python2.7/dist-packages/pandas/io/sql.py", line 748, in insert
keys, data_list = self.insert_data()
File "/usr/local/lib/python2.7/dist-packages/pandas/io/sql.py", line 729, in insert_data
d = b.values.astype('M8[us]').astype(object)
File "/usr/local/lib/python2.7/dist-packages/pandas/tseries/index.py", line 842, in astype
raise ValueError('Cannot cast DatetimeIndex to dtype %s' % dtype)
ValueError: Cannot cast DatetimeIndex to dtype datetime64[us]

I'm using pandas version 0.18.1 and sqlalchemy version 1.0.13. It looks like pandas is trying to convert the datetimes into microsecond timestamps, but the cast fails because the column is timezone aware.

Cheers
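For illustration only (not part of the original thread), here is a small sketch of the cast from the failing line, d = b.values.astype('M8[us]').astype(object). It shows that the microsecond cast itself is fine once the timezone is dropped, which matches the diagnosis above; the timestamps are rewritten in ISO form, and the behaviour of the timezone-aware cast differs in later pandas versions.

```python
import pandas as pd

# Timezone-aware index, analogous to the 'time' column in the example above
# (same instants, written in ISO form so they parse on current pandas too).
idx = pd.to_datetime(['2014-12-12 01:54', '2014-12-11 02:54'], utc=True)

# With the timezone dropped, the cast from the failing line goes through:
naive = idx.tz_localize(None)
print(naive.values.astype('M8[us]').astype(object))

# In pandas 0.18.x, attempting the equivalent cast on the timezone-aware index
# is what raised "ValueError: Cannot cast DatetimeIndex to dtype datetime64[us]".
```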
Also not working in the latest build of pandas; the traceback is similar:

Traceback (most recent call last):
File "./test.py", line 10, in <module>
df.time.to_sql('test', eng, dtype={'time': sqlalchemy.TIMESTAMP}, if_exists='append')
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/core/generic.py", line 1166, in to_sql
chunksize=chunksize, dtype=dtype)
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/io/sql.py", line 571, in to_sql
chunksize=chunksize, dtype=dtype)
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/io/sql.py", line 1250, in to_sql
table.insert(chunksize)
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/io/sql.py", line 748, in insert
keys, data_list = self.insert_data()
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/io/sql.py", line 729, in insert_data
d = b.values.astype('M8[us]').astype(object)
File "/home/arun/pandas-env/local/lib/python2.7/site-packages/pandas/tseries/index.py", line 847, in astype
raise ValueError('Cannot cast DatetimeIndex to dtype %s' % dtype)
ValueError: Cannot cast DatetimeIndex to dtype datetime64[us]
@al626 This is indeed not yet working (but this is due to the timezone-aware datetime column, not due to the sqlalchemy type passed in dtype).
There is the general issue #9086 about time zones not being supported; however, that issue is not really up to date (the situation has changed because it is now possible to have timezone-aware datetime64 columns).
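As a stopgap until timezone-aware writes are supported, one possible workaround (a sketch only; the connection string, table name and ISO-formatted timestamps are placeholders, not taken from this thread) is to convert to UTC and strip the timezone before to_sql, then re-attach UTC after read_sql, much as described earlier in the thread.

```python
import pandas as pd
import sqlalchemy

# Placeholder connection string; adjust for your own database.
eng = sqlalchemy.create_engine("postgresql+psycopg2://user:password@localhost:5432/postgres")

df = pd.DataFrame({'time': pd.to_datetime(['2014-12-12 01:54', '2014-12-11 02:54'], utc=True)})

# Write: convert to UTC and drop the timezone so the values are plain datetime64[ns].
out = df.copy()
out['time'] = out['time'].dt.tz_convert('UTC').dt.tz_localize(None)
out.to_sql('test', eng, dtype={'time': sqlalchemy.TIMESTAMP}, if_exists='append')

# Read: re-attach the UTC timezone that was dropped on the way in.
back = pd.read_sql('SELECT * FROM test', eng)
back['time'] = pd.to_datetime(back['time']).dt.tz_localize('UTC')
```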
Thanks, I'll keep track of that.
Would there be any update on using timestamp data with timezone info?
I'm trying to use DataFrame().to_sql to write a timezone-aware dataframe series. Here is an example of my code:
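The exact snippet is not preserved here, so the following is a hedged reconstruction of the kind of call being described, based on the rest of the thread; the connection string, table name and ISO-formatted timestamps are placeholders, and the timezone-aware TIMESTAMP instance is an assumption drawn from the versions and types mentioned below.

```python
import pandas as pd
import sqlalchemy

# Placeholder connection string; adjust for your own database.
eng = sqlalchemy.create_engine("postgresql+psycopg2://user:password@localhost:5432/postgres")

df = pd.DataFrame({'time': pd.to_datetime(['2014-12-12 01:54', '2014-12-11 02:54'], utc=True)})

# A SQLAlchemy type *instance* with arguments (timezone=True), not the bare class:
df.time.to_sql('test', eng,
               dtype={'time': sqlalchemy.TIMESTAMP(timezone=True)},
               if_exists='append')
```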
The error I receive is:
The following code works but obviously results in a postgresql column that is not timezone aware.
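Again as a hedged sketch (the original snippet is not preserved here), the working variant presumably differs only in passing the bare type class, so no timezone argument is involved; the connection string, table name and timestamps are the same placeholders as above.

```python
import pandas as pd
import sqlalchemy

# Placeholder connection string; adjust for your own database.
eng = sqlalchemy.create_engine("postgresql+psycopg2://user:password@localhost:5432/postgres")

df = pd.DataFrame({'time': pd.to_datetime(['2014-12-12 01:54', '2014-12-11 02:54'], utc=True)})

# Bare TIMESTAMP class (no timezone argument), so the column is written without timezone info.
df.time.to_sql('test', eng, dtype={'time': sqlalchemy.TIMESTAMP}, if_exists='append')
```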
I'm using python 2.7, pandas 0.15.2, postgresql 9.3 and SQLAlchemy 0.9.7. The same issue also occurs with sqlalchemy.DATETIME(timezone=True) vs sqlalchemy.DATETIME.
Full Traceback: