
np.testing.assert_allclose does not match np.allclose #7726

Closed
MicahLyle opened this issue Jun 10, 2016 · 5 comments
MicahLyle commented Jun 10, 2016

Below is a reproduction of the bug, run in IPython. np.testing.assert_allclose does not match np.allclose: for the same inputs, np.allclose returns True but np.testing.assert_allclose raises an AssertionError.

Python 3.5.1 |Continuum Analytics, Inc.| (default, Dec  7 2015, 11:24:55)
IPython 4.2.0
numpy 1.11.0 py35_1
 ---------------------------------------------------------------------------------------------------------------------------------------
In [38]: a = np.array([6.938894e-18, -3.469447e-18, -3.469447e-18])

In [39]: b = np.zeros(3)

In [40]: np.allclose(a, b, rtol=1e-07)
Out[40]: True

In [41]: np.testing.assert_allclose(a, b, rtol=1e-07)
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-41-3be1e226d188> in <module>()
----> 1 np.testing.assert_allclose(a, b, rtol=1e-07)

/Users/mlyle/anaconda/envs/python3/lib/python3.5/site-packages/numpy/testing/utils.py in assert_allclose(actual, desired, rtol, atol, equal_nan, err_msg, verbose)
   1389     header = 'Not equal to tolerance rtol=%g, atol=%g' % (rtol, atol)
   1390     assert_array_compare(compare, actual, desired, err_msg=str(err_msg),
-> 1391                          verbose=verbose, header=header)
   1392
   1393 def assert_array_almost_equal_nulp(x, y, nulp=1):

/Users/mlyle/anaconda/envs/python3/lib/python3.5/site-packages/numpy/testing/utils.py in assert_array_compare(comparison, x, y, err_msg, verbose, header, precision)
    731                                 names=('x', 'y'), precision=precision)
    732             if not cond:
--> 733                 raise AssertionError(msg)
    734     except ValueError:
    735         import traceback

AssertionError:
Not equal to tolerance rtol=1e-07, atol=0

(mismatch 100.0%)
 x: array([  6.938894e-18,  -3.469447e-18,  -3.469447e-18])
 y: array([ 0.,  0.,  0.])
njsmith (Member) commented Jun 10, 2016

Yes, they have different defaults for the atol and rtol parameters.

There's a long discussion about this that you can probably find if you search... IIRC it got bogged down: the folks who use allclose said that changing its defaults to match assert_allclose would break their test suites, so we definitely shouldn't do that, and the folks who use assert_allclose said that changing its defaults to match allclose would break their test suites, so we definitely shouldn't do that either. Both sides have a point, I guess, but it is unfortunate :-(
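To make the defaults concrete: np.allclose uses rtol=1e-05, atol=1e-08, while np.testing.assert_allclose uses rtol=1e-07, atol=0. Both check |actual - desired| <= atol + rtol * |desired|, so with atol=0 and a zero desired array the tolerance is exactly zero. A minimal sketch using the arrays from the report:

```python
import numpy as np

a = np.array([6.938894e-18, -3.469447e-18, -3.469447e-18])
b = np.zeros(3)

# np.allclose defaults atol=1e-08; the tolerance |a - b| <= atol + rtol*|b|
# reduces to |a| <= 1e-08 here, which the tiny values in `a` satisfy.
print(np.allclose(a, b, rtol=1e-07))  # True

# np.testing.assert_allclose defaults atol=0; with b == 0 the tolerance
# is exactly zero, so any nonzero difference fails.
try:
    np.testing.assert_allclose(a, b, rtol=1e-07)
except AssertionError:
    print("assert_allclose raised AssertionError")

# Passing an explicit atol makes the two calls agree.
np.testing.assert_allclose(a, b, rtol=1e-07, atol=1e-08)  # passes
```

The practical workaround is simply to pass atol explicitly to whichever function you use, rather than relying on either set of defaults.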

rgommers (Member) commented:

Previous discussion: http://thread.gmane.org/gmane.comp.python.numeric.general/58235/

So this is a wontfix, closing the issue.

raaaaaymond commented:
OK I understand that you won't fix it, but at least please don't claim in the documentation that they're equivalent! I know the documentation has a disclaimer that says "note that allclose has different default values", but the sentence "The test is equivalent to allclose(actual, desired, rtol, atol)" is very confusing.

eric-wieser (Member) commented:

Can you suggest a less confusing version?

raaaaaymond commented:

Suggestion, maybe something like the following?

"Due to different default parameter values, its behaviour differs from that of allclose, but the two are otherwise functionally equivalent."
