pandas 2.0.0 brings some warnings and incompatible changes #2480

Closed
seisman opened this issue Apr 5, 2023 · 2 comments · Fixed by #2569
Labels: maintenance (Boring but important stuff for the core devs)

seisman (Member) commented Apr 5, 2023

pandas 2.0.0 was released on Apr 3, 2023, and brings some incompatible changes (see https://pandas.pydata.org/docs/whatsnew/v2.0.0.html for the full changelog).

Here are the new failures and warnings with pandas 2.0.0 (https://github.com/GenericMappingTools/pygmt/actions/runs/4610492623/jobs/8149020030?pr=2477):

=================================== FAILURES ===================================
______________ [doctest] pygmt.clib.conversion.array_to_datetime _______________
276 
277     Examples
278     --------
279     >>> import datetime
280     >>> # numpy.datetime64 array
281     >>> x = np.array(
282     ...     ["2010-06-01", "2011-06-01T12", "2012-01-01T12:34:56"],
283     ...     dtype="datetime64",
284     ... )
285     >>> array_to_datetime(x)
Differences (unified diff with -expected +actual):
    @@ -1,3 +1,3 @@
     DatetimeIndex(['2010-06-01 00:00:00', '2011-06-01 12:00:00',
                    '2012-01-01 12:34:56'],
    -              dtype='datetime64[ns]', freq=None)
    +              dtype='datetime64[s]', freq=None)

/home/runner/work/pygmt/pygmt/pygmt/clib/conversion.py:285: DocTestFailure
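
The doctest failure reflects pandas 2.0's support for non-nanosecond datetime resolutions: when the input numpy array has second resolution, `pd.to_datetime` now preserves it instead of upcasting to nanoseconds. A minimal sketch of the behavior change and one possible workaround (assuming `array_to_datetime` is a thin wrapper around `pd.to_datetime`, which the diff above suggests):

```python
import numpy as np
import pandas as pd

x = np.array(
    ["2010-06-01", "2011-06-01T12", "2012-01-01T12:34:56"],
    dtype="datetime64",  # numpy infers second resolution from the last string
)
result = pd.to_datetime(x)
print(result.dtype)  # pandas >= 2.0: datetime64[s]; pandas 1.x: datetime64[ns]

# Casting back to nanosecond resolution restores the old doctest output:
print(result.astype("datetime64[ns]").dtype)  # datetime64[ns]
```

Alternatively, the doctest expectation could simply be updated to accept `datetime64[s]` on pandas >= 2.0.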
=============================== warnings summary ===============================
pygmt/helpers/decorators.py::pygmt.helpers.decorators.kwargs_to_strings
  <doctest pygmt.helpers.decorators.kwargs_to_strings[13]>:3: UserWarning: Converting non-nanosecond precision datetime values to nanosecond precision. This behavior can eventually be relaxed in xarray, as it is an artifact from pandas which is now beginning to support non-nanosecond precision values. This warning is caused by passing non-nanosecond np.datetime64 or np.timedelta64 values to the DataArray or Variable constructor; it can be silenced by converting the values to nanosecond precision ahead of time.

pygmt/tests/test_x2sys_cross.py: 12 warnings
  /home/runner/work/pygmt/pygmt/pygmt/src/x2sys_cross.py:233: UserWarning: Could not infer format, so each element will be parsed individually, falling back to `dateutil`. To ensure parsing is consistent and as-expected, please specify a format.
    table = pd.read_csv(
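
The first warning above comes from xarray wrapping non-nanosecond datetime values; as the message says, it can be silenced by converting the values to nanosecond precision before building the DataArray. A minimal sketch, with a made-up array rather than the actual doctest values:

```python
import numpy as np
import xarray as xr

# Hypothetical second-resolution values standing in for the doctest input.
times = np.array(["2010-06-01", "2011-06-01T12"], dtype="datetime64[s]")

# Passing datetime64[s] directly to DataArray triggers the UserWarning above;
# converting to nanosecond precision ahead of time keeps it quiet.
da = xr.DataArray(times.astype("datetime64[ns]"), dims="time")
```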
seisman (Member, Author) commented Jun 9, 2023

> pygmt/tests/test_x2sys_cross.py: 12 warnings
>   /home/runner/work/pygmt/pygmt/pygmt/src/x2sys_cross.py:233: UserWarning: Could not infer format, so each element will be parsed individually, falling back to `dateutil`. To ensure parsing is consistent and as-expected, please specify a format.
>     table = pd.read_csv(

@weiji14 Do you have any idea how to fix the pandas 2.0.0 warning in x2sys_cross?

weiji14 (Member) commented Jun 9, 2023

> pygmt/tests/test_x2sys_cross.py: 12 warnings
>   /home/runner/work/pygmt/pygmt/pygmt/src/x2sys_cross.py:233: UserWarning: Could not infer format, so each element will be parsed individually, falling back to `dateutil`. To ensure parsing is consistent and as-expected, please specify a format.
>     table = pd.read_csv(
>
> @weiji14 Do you have any idea how to fix the pandas 2.0.0 warning in x2sys_cross?

Yep, we just need to use `pd.read_csv(..., date_format="ISO8601")`; I've started a PR at #2569.
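
For illustration, a minimal sketch of that fix (the CSV content and column names here are hypothetical, not the actual x2sys_cross output):

```python
import io
import pandas as pd

# Hypothetical two-column table standing in for the x2sys_cross output.
csv = io.StringIO("time,value\n2010-06-01T00:00:00,1\n2011-06-01T12:00:00,2\n")

# date_format="ISO8601" (new in pandas 2.0) tells read_csv exactly how to
# parse the date column, so it no longer falls back to dateutil per element.
table = pd.read_csv(csv, parse_dates=["time"], date_format="ISO8601")
print(table["time"].dtype)  # datetime64[ns]
```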
