Fix flaky memory usage test by guaranteeing array size. (#10114)
The test `test_dataframe.py::test_memory_usage_multi` is currently flaky. In theory it can fail for any value of the `rows` parameter, but in practice we only observe failures for the smaller value of 10. The reason is that the data for the `MultiIndex` is constructed by randomly sampling from an array of size 3, and for a sufficiently small sample (e.g. 10) the probability that the selection does not include all three values (e.g. a sample of `[0, 1, 1, 1, 0, 1, 1, 0, 0, 1]`) is not vanishingly small and occurs with observable frequency. The resulting `MultiIndex` then encodes the levels for that column with only two values, so the column occupies 8 fewer bytes (one 64-bit integer or float) than expected.

This PR changes that by always sampling without replacement from an array of the same length as the number of rows (see the sketch below). I could also have fixed this problem by fixing a random seed that ensures all the values are always sampled, but I made this change instead because 1) it more clearly conveys the intent, and 2) fixing a seed is a change that we should discuss and apply globally across all our tests.

Authors:
  - Vyas Ramasubramani (https://github.com/vyasr)

Approvers:
  - Bradley Dice (https://github.com/bdice)
  - Ashwin Srinath (https://github.com/shwina)

URL: #10114
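A minimal sketch of the problem and the fix, using plain NumPy/pandas rather than the actual cudf test code (the `rows` variable and array names here are illustrative):

```python
import numpy as np
import pandas as pd

rows = 10

# Before: sampling WITH replacement from a size-3 array can miss a level.
# For rows=10, a draw like [0, 1, 1, 1, 0, 1, 1, 0, 0, 1] never picks 2,
# so the resulting MultiIndex level stores only two distinct values and
# the reported memory usage comes up 8 bytes short of the expectation.
maybe_incomplete = np.random.choice(np.arange(3), rows)

# After: sampling WITHOUT replacement from an array of length `rows`
# guarantees every value appears exactly once, making the memory
# footprint of the MultiIndex level deterministic across runs.
always_complete = np.random.choice(np.arange(rows), rows, replace=False)

idx = pd.MultiIndex.from_arrays([always_complete, np.arange(rows)])
print(idx.memory_usage())  # stable, independent of the random draw
```

The same reasoning applies to the cudf test: once every sampled value is distinct, the size of the encoded levels no longer depends on which values happened to be drawn.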