Describe the task
When reviewing some code I noticed that the CSV parser will throw an exception if the schema for a column is not nullable but the data in that column contains nulls. I got a little scared because we are not checking for that in our code. But when I went to reproduce the issue, Spark just marked all of the columns as nullable, despite my wishes and despite the CSV parsing code. I took a quick look at the Spark code to try to see where this switch was happening, and I could not find it. It would be good for us to make sure we understand what is happening and that we have all of the cases covered. This is not critical at all, just something I noticed that made me a bit concerned.
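
For reference, a minimal Scala sketch of the reproduction described above (the input path and column names are hypothetical). In my test, Spark prints every column as `nullable = true` even though the supplied schema marks them non-nullable:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object CsvNullableRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("csv-nullable-check")
      .master("local[*]")
      .getOrCreate()

    // Both columns explicitly marked non-nullable in the user-supplied schema.
    val schema = StructType(Seq(
      StructField("id", IntegerType, nullable = false),
      StructField("name", StringType, nullable = false)
    ))

    // Hypothetical input; the CSV may contain empty fields (i.e. nulls).
    val df = spark.read.schema(schema).csv("/tmp/example.csv")

    // Observed behavior: the printed schema shows nullable = true for every
    // column, so the non-nullable CSV parsing path never gets exercised.
    df.printSchema()

    spark.stop()
  }
}
```

One place worth checking is where Spark resolves file-based relations, since the user-supplied schema appears to be relaxed to nullable at that point, but that should be confirmed against the Spark source for the version we are running.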