There are Python integration tests that construct data sources and run SQL queries. Consider this window-function test:
https://github.com/NVIDIA/spark-rapids/blob/branch-0.2/integration_tests/src/main/python/window_function_test.py#L79

```python
def test_window_aggs_for_ranges(data_gen):
    df = with_cpu_session(
        lambda spark: gen_df(spark, data_gen, length=2048))
    df.createOrReplaceTempView("window_agg_table")
    assert_gpu_and_cpu_are_equal_collect(
        lambda spark: spark.sql(
            'select '
            ' sum(c) over '
            '   (partition by a order by cast(b as timestamp) asc '
            '      range between interval 1 day preceding and interval 1 day following) as sum_c_asc '
            # ...
            'from window_agg_table'))
```
It would be good to have a pytest wrapper that does the following:

- Takes a test input `data_gen` and a SQL string.
- Constructs an input DataFrame and a corresponding input table.
- Runs the supplied SQL on CPU and GPU, and compares results.

Should be a tiny utility.
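The wrapper described above might look roughly like the sketch below. This is a minimal, self-contained illustration of the control flow only: the function name `assert_gpu_and_cpu_are_equal_sql` and the `sessions` callables are assumptions, standing in for the real `with_cpu_session`/`with_gpu_session` helpers and the RAPIDS-enabled Spark configuration used by the integration-test harness.

```python
# Hypothetical sketch of the proposed wrapper.  The `sessions` callables
# are stand-ins: in the real suite they would be Spark sessions with the
# RAPIDS plugin disabled (CPU) and enabled (GPU).
def assert_gpu_and_cpu_are_equal_sql(build_df, table_name, sql, sessions):
    """Build the input table on each backend, run `sql`, and assert
    that all backends produce the same collected result."""
    results = [run(build_df, table_name, sql) for run in sessions]
    first = results[0]
    for other in results[1:]:
        assert other == first, \
            "CPU/GPU results differ: %r vs %r" % (first, other)
    return first

# Stand-in "session" used only to exercise the wrapper's shape: it
# ignores the SQL text and just sums the rows the builder produces.
def fake_session(build_df, table_name, sql):
    rows = build_df()
    return sum(rows)

total = assert_gpu_and_cpu_are_equal_sql(
    lambda: [1, 2, 3],   # stand-in for gen_df(spark, data_gen, length=...)
    "window_agg_table",
    "select sum(c) from window_agg_table",
    [fake_session, fake_session])
```

With a utility of this shape, the window-function test above would collapse to a single call passing the `data_gen` and the SQL string, with the temp-view registration handled inside the wrapper.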