Update all Python 3.6 tests to 3.9 #180

Merged
42 changes: 19 additions & 23 deletions .circleci/config.yml
@@ -162,7 +162,7 @@ jobs:
pandas_version:
type: string
description: "Version of pandas to test against"
default: '<1'
default: '==1.3.5'
command:
type: string
description: "Command to run in records-mover venv"
@@ -254,7 +254,7 @@ jobs:
description: "Version of Python to test against"
pandas_version:
type: string
default: '>=1'
default: '==1.3.5'
db_name:
type: string
description: "Database to run inside"
@@ -313,15 +313,15 @@ jobs:
python_version:
type: string
description: "Version of python to test against"
default: '3.6'
default: '3.9'
docker:
- image: circleci/python:<<parameters.python_version>>
steps:
- checkout
- installvenv:
extras: <<parameters.extras>>
python_version: <<parameters.python_version>>
pandas_version: '<1'
pandas_version: '==1.3.5'
# requirements.txt includes twine and other release packages
include_dev_dependencies: true
- run:
@@ -353,12 +353,12 @@ jobs:
twine upload -r pypi dist/*
cli-extra-test:
docker:
- image: circleci/python:3.6
- image: circleci/python:3.9
steps:
- checkout
- installvenv:
extras: '[cli]'
python_version: '3.6'
python_version: '3.9'
pandas_version: ''
# we want this just like a user would install it, not with
# dev tools installed
@@ -383,9 +383,6 @@ workflows:
#
# https://devguide.python.org/devcycle/#end-of-life-branches
#
# That said, Python 3.5 and before don't support type
# annotations on variables, which we use, so right now Python
# 3.6 is the current minimum version tested against.
#
# https://app.asana.com/0/1128138765527694/1161072974798065
# - test:
@@ -420,7 +417,7 @@ workflows:
# - integration_test_with_dbs:
# name: vertica-no-s3-itest
# extras: '[vertica,itest]'
# python_version: "3.6"
# python_version: "3.9"
# command: |
# . venv/bin/activate
# export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -460,7 +457,7 @@ workflows:
# - integration_test_with_dbs:
# name: mysql-itest
# extras: '[mysql,itest]'
# python_version: "3.6"
# python_version: "3.9"
# # Using Pandas reproduced a bug that happened when we were
# # relying on Pandas:
# #
@@ -482,7 +479,7 @@ workflows:
- integration_test_with_dbs:
name: vertica-s3-itest
extras: '[vertica,aws,itest]'
python_version: "3.6"
python_version: "3.9"
command: |
. venv/bin/activate
export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -499,7 +496,7 @@ workflows:
- integration_test_with_dbs:
name: cli-1-itest
extras: '[cli,gsheets,vertica]'
python_version: "3.6"
python_version: "3.9"
command: |
. venv/bin/activate
export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -516,7 +513,7 @@ workflows:
- integration_test_with_dbs:
name: cli-2-itest
extras: '[cli,gsheets,vertica]'
python_version: "3.6"
python_version: "3.9"
command: |
. venv/bin/activate
export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -533,7 +530,7 @@ workflows:
- integration_test_with_dbs:
name: cli-3-itest
extras: '[cli,gsheets,vertica]'
python_version: "3.6"
python_version: "3.9"
command: |
. venv/bin/activate
export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -550,7 +547,7 @@ workflows:
- integration_test:
name: redshift-s3-itest
extras: '[redshift-binary,itest]'
python_version: "3.6"
python_version: "3.9"
db_name: demo-itest
requires:
- test-3.9
@@ -560,7 +557,7 @@ workflows:
- integration_test:
name: redshift-no-s3-itest
extras: '[redshift-binary,itest]'
python_version: "3.6"
python_version: "3.9"
db_name: demo-itest
include_s3_scratch_bucket: false
requires:
@@ -571,7 +568,7 @@ workflows:
- integration_test:
name: redshift-s3-itest-old-pandas
extras: '[redshift-binary,itest]'
python_version: "3.6"
python_version: "3.8"
Collaborator: why is this being changed to 3.8?
Collaborator: Ah, I'm guessing it's because this test specifically targets compatibility with an old version of pandas that may require an older Python version? The rest of these have been bumped to 3.9 as expected.
Contributor Author: you got it!
Contributor Author: Tried to find a way around this by letting it build the old version of pandas on 3.9, but that kept creating compounding errors.
pandas_version: "<1"
db_name: demo-itest
requires:
@@ -582,7 +579,7 @@ workflows:
- integration_test:
name: redshift-s3-itest-no-pandas
extras: '[redshift-binary,itest]'
python_version: "3.6"
python_version: "3.9"
pandas_version: ""
db_name: demo-itest
requires:
@@ -593,7 +590,7 @@ workflows:
# - integration_test:
# name: bigquery-no-gcs-itest
# extras: '[bigquery,itest]'
# python_version: "3.6"
# python_version: "3.9"
# db_name: bltoolsdevbq-bq_itest
# include_gcs_scratch_bucket: false
# requires:
@@ -604,7 +601,7 @@ workflows:
# - integration_test:
# name: bigquery-gcs-itest
# extras: '[bigquery,itest]'
# python_version: "3.6"
# python_version: "3.9"
# db_name: bltoolsdevbq-bq_itest
# requires:
# - redshift-s3-itest
@@ -614,7 +611,7 @@ workflows:
# - integration_test_with_dbs:
# name: tbl2tbl-itest
# extras: '[literally_every_single_database_binary,itest]'
# python_version: "3.6"
# python_version: "3.9"
# command: |
# . venv/bin/activate
# export PATH=${PATH}:${PWD}/tests/integration/bin:/opt/vertica/bin
@@ -635,7 +632,6 @@ workflows:
- deploy:
context: PyPI
requires:
# - test-3.6
# - test-3.7
- test-3.8
- test-3.9
2 changes: 1 addition & 1 deletion metrics/bigfiles_high_water_mark
@@ -1 +1 @@
1135
1137
2 changes: 1 addition & 1 deletion metrics/flake8_high_water_mark
@@ -1 +1 @@
166
167
2 changes: 1 addition & 1 deletion records_mover/records/targets/google_sheets.py
@@ -104,7 +104,7 @@ def _get_service(self) -> SheetsService:
def as_json_serializable(self, cell: Any) -> Any:
if isinstance(cell, np.generic):
# MyPy complains that this method does not exist
native = np.asscalar(cell) # type: ignore
native = cell.item()
else:
native = cell
if isinstance(cell, float) and math.isnan(native):
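The change above swaps `np.asscalar(cell)` for `cell.item()`. `np.asscalar` was deprecated in NumPy 1.16 and removed entirely in NumPy 1.23, so it is unavailable in the newer NumPy releases a Python 3.9 environment pulls in; calling `.item()` on a NumPy scalar returns the equivalent native Python object, which is what `np.asscalar` used to do. A minimal sketch of the conversion:

```python
import math

import numpy as np

# .item() converts a NumPy scalar to the closest native Python type,
# replacing the removed np.asscalar() helper.
assert np.int64(7).item() == 7
assert isinstance(np.int64(7).item(), int)

# Floats convert too, including NaN, which the code above checks for
# right after the conversion.
native = np.float64("nan").item()
assert isinstance(native, float) and math.isnan(native)
```

As a side benefit, `.item()` is a plain method on the scalar, so the `# type: ignore` that MyPy required for `np.asscalar` is no longer needed.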
2 changes: 2 additions & 0 deletions setup.py
@@ -126,6 +126,8 @@ def initialize_options(self) -> None:

itest_dependencies = [
'jsonschema', # needed for directory_validator.py
'pytz',
'wheel', # needed to support legacy 'setup.py install'
] + (
nose_dependencies +
# needed for records_database_fixture retrying drop/creates on
6 changes: 2 additions & 4 deletions tests/unit/records/targets/test_google_sheets.py
@@ -34,8 +34,7 @@ def test_move_from_dataframe_sheet_exists(self,
out = self.google_sheets.move_from_dataframes_source(mock_dfs_source,
mock_processing_instructions)
mock_df.to_records.assert_called_with(index=mock_dfs_source.include_index)
mock_json_encodable_datum = mock_np.asscalar.return_value
mock_np.asscalar.assert_called_with(1)
mock_json_encodable_datum = 1
mock_http = mock_httplib2.Http.return_value
mock_authed_http = mock_google_auth_httplib2.AuthorizedHttp.return_value
mock_google_auth_httplib2.AuthorizedHttp.\
@@ -83,8 +82,7 @@ def test_move_from_dataframe_sheet_new(self,
out = self.google_sheets.move_from_dataframes_source(mock_dfs_source,
mock_processing_instructions)
mock_df.to_records.assert_called_with(index=mock_dfs_source.include_index)
mock_json_encodable_datum = mock_np.asscalar.return_value
mock_np.asscalar.assert_called_with(1)
mock_json_encodable_datum = 1
mock_http = mock_httplib2.Http.return_value
mock_authed_http = mock_google_auth_httplib2.AuthorizedHttp.return_value
mock_google_auth_httplib2.AuthorizedHttp.\
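With `np.asscalar` gone from the target code, the tests above no longer stub NumPy's conversion: the datum flowing through the mocked Sheets client is just the native value itself, so `mock_json_encodable_datum = 1` replaces the old `mock_np.asscalar.return_value` plumbing. For context, the `as_json_serializable` helper under test reduces to roughly the following sketch, reconstructed from the visible diff; the NaN branch's replacement value is an assumption (`None`, since JSON has no NaN), because the diff cuts off after the `isnan` check:

```python
import math
from typing import Any

import numpy as np


def as_json_serializable(cell: Any) -> Any:
    # Convert NumPy scalars to native Python objects (was np.asscalar).
    if isinstance(cell, np.generic):
        native = cell.item()
    else:
        native = cell
    # Assumed handling: JSON cannot represent NaN, so map it to None.
    if isinstance(cell, float) and math.isnan(native):
        native = None
    return native


assert as_json_serializable(np.int64(5)) == 5
assert as_json_serializable(float("nan")) is None
```

Because the conversion is now a method call on the value rather than a module-level NumPy function, the mocked-`numpy` tests can assert on the value directly instead of on `mock_np.asscalar` call counts.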