Describe the bug
cudf::strings::detail::stod returns inf for a number very close to zero, such as 9.299999257686047e-0005603333574677677, and returns 0.0 for a very large number, such as 9.299999257686047e0005603333574677677.
Steps/Code to reproduce bug
cudf::strings::detail::stod (link) uses an int to accumulate the exponent value, so the exponent overflows and flips sign when the input exponent's magnitude exceeds INT_MAX.
For the first double string above, the exponent overflows from a negative value to a positive value, producing inf.
For the second double string, it overflows from a positive value to a negative value, producing 0.0.
So it returns wrong results.
It is unclear whether this behavior is intentional and matches some reference behavior.
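A minimal sketch of one possible fix, outside of cuDF: accumulate the exponent in a wider integer and clamp it once it is past the range where it can matter for a double (roughly ±308), so additional digits cannot wrap the sign. The function name `parse_exponent` and the clamp value are illustrative assumptions, not cuDF's actual implementation.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Hypothetical helper sketching the fix: use long long and clamp the
// accumulated exponent, so a very long exponent string such as
// "5603333574677677" cannot overflow and flip its sign (which is what
// happens when the digits are accumulated into a plain int).
long long parse_exponent(std::string const& s)
{
  std::size_t i = 0;
  bool negative = false;
  if (i < s.size() && (s[i] == '+' || s[i] == '-')) {
    negative = (s[i] == '-');
    ++i;
  }
  long long exp = 0;
  // Any exponent beyond double's range (~±308) yields inf or 0 anyway,
  // so clamping here preserves the final result while preventing overflow.
  constexpr long long exp_clamp = 100000;
  for (; i < s.size(); ++i) {
    if (exp < exp_clamp) { exp = exp * 10 + (s[i] - '0'); }
    // Once clamped, further digits cannot change the outcome.
  }
  return negative ? -exp : exp;
}
```

With this scheme the first string's exponent stays negative (so the value underflows toward 0.0) and the second stays positive (so it overflows to inf), matching the expected behavior.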
getJsonObject in spark-rapids-jni needs this function to perform floating-point number normalization, so we copied the function into spark-rapids-jni and fixed it there. However, we would like to call it from cuDF in the long term.