Hi,
I have a private S3 bucket and created a presigned URI with this tool as follows:
./sign_s3_url.bash --bucket 'eae-data-test' --aws-access-key-id 'my-key' --aws-secret-access-key 'secret-key' --file-path 'hdf/customer' --region 'us-east-1' --minute-expire 10001
The above script created a link similar to the one below:
https://s3.amazonaws.com/eae-data-test/hdf/customer?AWSAccessKeyId=AK6&Expires=1533802548&Signature=MJ%2E%3D
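(For reference, I believe a similar presigned URL could also be generated with boto3; the sketch below is untested on my side and simply reuses the bucket, key, and expiry from the command above:)

import boto3

# Untested sketch: sign a URL for the same object with boto3. I am not
# certain whether sign_s3_url.bash signs for GET or PUT, so this signs GET.
s3 = boto3.client('s3', region_name='us-east-1')
presigned_url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'eae-data-test', 'Key': 'hdf/customer'},
    ExpiresIn=10001 * 60,  # --minute-expire 10001, converted to seconds
)
print(presigned_url)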
I then tried using this link in pandas as follows:
import pandas as pd
df = pd.read_sql('SELECT * FROM SQL-SERVER-TABLE', sql_server_pypyodbc_connection)
uri = 'https://s3.amazonaws.com/eae-data-test/hdf/customer?AWSAccessKeyId=AK6&Expires=1533802548&Signature=MJ%2E%3D'
df.to_hdf(uri, key='df')
The above 'to_hdf' call throws the error below:
File "/eae/env/dts/lib/python3.6/site-packages/pandas/core/generic.py", line 1471, in to_hdf
return pytables.to_hdf(path_or_buf, key, self, **kwargs)
File "/eae/env/dts/lib/python3.6/site-packages/pandas/io/pytables.py", line 280, in to_hdf
complib=complib) as store:
File "/eae/env/dts/lib/python3.6/site-packages/pandas/io/pytables.py", line 467, in init
self.open(mode=mode, **kwargs)
File "/eae/env/dts/lib/python3.6/site-packages/pandas/io/pytables.py", line 580, in open
self._handle = tables.open_file(self._path, self._mode, **kwargs)
File "/eae/env/dts/lib/python3.6/site-packages/tables/file.py", line 320, in open_file
return File(filename, mode, title, root_uep, filters, **kwargs)
File "/eae/env/dts/lib/python3.6/site-packages/tables/file.py", line 784, in init
self._g_new(filename, mode, **params)
File "tables/hdf5extension.pyx", line 371, in tables.hdf5extension.File._g_new
File "/eae/env/dts/lib/python3.6/site-packages/tables/utils.py", line 185, in check_file_access
check_file_access(filename, 'w')
File "/eae/env/dts/lib/python3.6/site-packages/tables/utils.py", line 175, in check_file_access
raise IOError("
%s
does not exist" % (parentname,))OSError:
https://s3.amazonaws.com/eae-data-test/hdf
does not existCould you please help me with this?
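In the meantime, the workaround I am considering (an untested sketch on my side, assuming boto3 is available, since the traceback suggests PyTables treats the path as a local filesystem path rather than a URL) is to write the HDF5 file locally first and then upload it to the private bucket:

import boto3

# Write the HDF5 file to a local path first; PyTables appears to check
# the local filesystem, so a URL cannot be used here.
local_path = '/tmp/customer.h5'
df.to_hdf(local_path, key='df')

# Then upload the finished file to the same bucket/key the URL points at.
s3 = boto3.client('s3', region_name='us-east-1')
s3.upload_file(local_path, 'eae-data-test', 'hdf/customer')

If there is a way to make to_hdf write directly to the presigned URL instead, that would be preferable.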