Test failure: test_sql_execution #349

Closed
github-actions bot opened this issue Nov 19, 2024 · 0 comments · Fixed by #350
Labels
bug Something isn't working

❌ test_sql_execution: assert [(10103, 1011..., 10069), ...] == [(10282, 1017..., 10065), ...] (1.362s)
assert [(10103, 1011..., 10069), ...] == [(10282, 1017..., 10065), ...]
  
  At index 0 diff: (10103, 10110) != (10282, 10171)
  
  Full diff:
    [
        (
  -         10282,
  -         10171,
  -     ),
  -     (
  -         10110,
  ?            -
  +         10103,
  ?             +
            10110,
        ),
        (
  -         10103,
  ?           -
  +         10023,
  ?            +
            10023,
        ),
        (
  -         10022,
  -         10017,
  ?             -
  +         10001,
  ?            +
  +         10018,
        ),
        (
  +         10044,
  -         10110,
  ?             ^
  +         10111,
  ?             ^
  -         10282,
        ),
        (
  -         10009,
  ?           ^^
  +         10199,
  ?           ^^
  -         10065,
  ?            ^^
  +         10022,
  ?            ^^
        ),
        (
  -         10153,
  ?           ^^
  +         10023,
  ?           ^^
  -         10199,
  -     ),
  -     (
  -         10112,
            10069,
        ),
        (
  +         11371,
  -         10023,
  ?            ^
  +         10003,
  ?            ^
  -         10153,
        ),
        (
  +         11371,
  +         11201,
  +     ),
  +     (
  -         10012,
  ?             ^
  +         10014,
  ?             ^
  +         10023,
  +     ),
  +     (
            10003,
  +         11222,
        ),
    ]
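
For context, the failing assertion compares the `(pickup_zip, dropoff_zip)` tuples returned by the query against a hardcoded expected list, so any reordering or refresh of the `samples.nyctaxi.trips` sample data breaks it. Below is a minimal sketch of a shape-based check that tolerates such drift; it assumes an lsql `StatementExecutionBackend` with a `fetch` helper and a warehouse ID in the environment, which may not match the suite's actual fixtures:

```python
import os

from databricks.sdk import WorkspaceClient
from databricks.labs.lsql.backends import StatementExecutionBackend

# Illustrative only: a shape-based assertion instead of hardcoded row values.
# StatementExecutionBackend(ws, warehouse_id) is assumed to match the lsql API
# used by the test suite; adjust to the actual fixtures if it differs.
ws = WorkspaceClient()
backend = StatementExecutionBackend(ws, os.environ["DATABRICKS_WAREHOUSE_ID"])

rows = list(backend.fetch(
    "SELECT pickup_zip, dropoff_zip FROM samples.nyctaxi.trips LIMIT 10"
))

assert len(rows) == 10
for pickup_zip, dropoff_zip in rows:
    # Both columns are INT in the sample schema; check types, not exact values.
    assert isinstance(pickup_zip, int)
    assert isinstance(dropoff_zip, int)
```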
04:48 DEBUG [databricks.sdk] Loaded from environment
04:48 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
04:48 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
04:48 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
04:48 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw7] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python
04:48 DEBUG [databricks.sdk] Loaded from environment
04:48 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
04:48 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
04:48 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
04:48 INFO [databricks.sdk] Using Databricks Metadata Service authentication
04:48 DEBUG [databricks.labs.lsql.core] Executing SQL statement: SELECT pickup_zip, dropoff_zip FROM nyctaxi.trips LIMIT 10
04:48 DEBUG [databricks.sdk] POST /api/2.0/sql/statements/
> {
>   "catalog": "samples",
>   "format": "JSON_ARRAY",
>   "statement": "SELECT pickup_zip, dropoff_zip FROM nyctaxi.trips LIMIT 10",
>   "warehouse_id": "DATABRICKS_WAREHOUSE_ID"
> }
< 200 OK
< {
<   "manifest": {
<     "chunks": [
<       {
<         "chunk_index": 0,
<         "row_count": 10,
<         "row_offset": 0
<       }
<     ],
<     "format": "JSON_ARRAY",
<     "schema": {
<       "column_count": 2,
<       "columns": [
<         {
<           "name": "pickup_zip",
<           "position": 0,
<           "type_name": "INT",
<           "type_text": "INT"
<         },
<         "... (1 additional elements)"
<       ]
<     },
<     "total_chunk_count": 1,
<     "total_row_count": 10,
<     "truncated": false
<   },
<   "result": {
<     "chunk_index": 0,
<     "data_array": [
<       [
<         "10103",
<         "... (1 additional elements)"
<       ],
<       "... (9 additional elements)"
<     ],
<     "row_count": 10,
<     "row_offset": 0
<   },
<   "statement_id": "01efa631-7f60-1b86-bc24-57a108eb7f29",
<   "status": {
<     "state": "SUCCEEDED"
<   }
< }
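
For reference, the request/response pair captured above can be reproduced against the Statement Execution API with the plain Databricks SDK. A minimal sketch, assuming the warehouse ID is available in the `DATABRICKS_WAREHOUSE_ID` environment variable (the literal value shown in the log is a redacted placeholder):

```python
import os

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.sql import Format

# Auth is resolved from the environment, matching the "Loaded from environment"
# and metadata-service lines in the captured log above.
ws = WorkspaceClient()

response = ws.statement_execution.execute_statement(
    statement="SELECT pickup_zip, dropoff_zip FROM nyctaxi.trips LIMIT 10",
    warehouse_id=os.environ["DATABRICKS_WAREHOUSE_ID"],
    catalog="samples",
    format=Format.JSON_ARRAY,
)

# With JSON_ARRAY, every cell arrives as a string (see "data_array" above);
# the lsql core layer converts these back into typed rows.
print(response.status.state)
print(response.result.data_array)
```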

Running from nightly #6

github-actions bot added the bug (Something isn't working) label on Nov 19, 2024
JCZuurmond added a commit that referenced this issue Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0` ([#350](#350)). In this release, the library has been updated for compatibility with Databricks SDK version 0.38.0, addressing issues [#349](#349) to [#332](#332). The `create_dashboard` function now passes the `SDKDashboard` instance directly to the `ws.lakeview.create` and `ws.lakeview.update` methods, eliminating the dictionary conversions that were needed before (a rough sketch follows this list). A new SQL query, `NYC_TAXI_TRIPS_LIMITED`, has been introduced and `test_sql_execution` now uses it; the `test_sql_execution_partial` and `test_sql_execution_as_iterator` tests have been removed, and `test_fetch_one_works` now asserts the returned row's `pickup_zip` value. Together these updates keep the library compatible with the latest Databricks SDK, refactor the SQL-query-based tests, and improve test reliability.
* Specify the minimum required version of `databricks-sdk` as 0.37.0 ([#331](#331)). In this release, we have updated the minimum required version of the `databricks-sdk` package to 0.37.0, as specified in the project's `pyproject.toml` file. This update is necessary due to the modifications made in pull request [#320](#320), which constrains the `databricks-sdk` version to 0.37.x for compatible updates within the same minor version. This change also resolves issue [#330](#330). It is important to note that no new methods have been added or existing functionality changed in the codebase as part of this commit. Therefore, the impact on the existing functionality should be minimal and confined to the interaction with the `databricks-sdk` package.
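
As mentioned in the first bullet above, the `create_dashboard` change comes down to the Lakeview client in SDK 0.38.0 accepting the `Dashboard` dataclass directly rather than unpacked fields. A rough before/after sketch, not the repository's actual code, with placeholder field values:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard as SDKDashboard

ws = WorkspaceClient()
sdk_dashboard = SDKDashboard(
    display_name="Example dashboard",       # placeholder name
    serialized_dashboard='{"pages": []}',   # placeholder Lakeview JSON
)

# Before (SDK <= 0.37.x): fields went through a dict conversion, roughly
#   ws.lakeview.create(**sdk_dashboard.as_dict())
# After (SDK 0.38.0): the dataclass instance is passed straight through;
# ws.lakeview.update changed analogously to accept the dashboard object.
created = ws.lakeview.create(sdk_dashboard)
print(created.dashboard_id)
```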
JCZuurmond added a commit that referenced this issue Nov 19, 2024
* Changes to work with Databricks SDK `v0.38.0` ([#350](#350)). In this release, we upgraded the Databricks SDK from version 0.37.0 to 0.38.0 to stay compatible with the latest SDK and to address several issues. The update adapts the code to the new SDK version: `.as_dict()` calls are no longer needed when creating or updating dashboards, and an `sdk_dashboard` variable is used when interacting with the Databricks workspace. We also updated the dependencies: `databricks-labs-blueprint[yaml]` to version 0.4.2 or later and `sqlglot` to version 22.3.1 or later. The `test_core.py` file has been updated to address multiple issues ([#349](#349) to [#332](#332)) related to the Databricks SDK, and `test_dashboards.py` has been revised to work with the new SDK version. These changes improve the integration with Databricks Lakeview dashboards, simplify the code, and ensure compatibility with the latest SDK, resolving issues [#349](#349) to [#332](#332).
* Specify the minimum required version of `databricks-sdk` as 0.37.0 ([#331](#331)). In this release, we raised the minimum required version of the `databricks-sdk` package from 0.29.0 to 0.37.0 in `pyproject.toml` to ensure compatibility with the latest SDK. This change was necessitated by the updates made in [#320](#320). To accept any patch release of `databricks-sdk` with major and minor version 0.37, the dependency constraint now uses the `~=` operator (illustrated just below), resolving issue [#330](#330). These changes are intended to improve the compatibility and stability of the software.
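
The `~=` constraint mentioned above pins `databricks-sdk` to compatible 0.37.x releases. A small illustration of what `databricks-sdk~=0.37.0` accepts, using the third-party `packaging` library (purely illustrative, not part of this repository):

```python
from packaging.specifiers import SpecifierSet

spec = SpecifierSet("~=0.37.0")

print("0.37.1" in spec)  # True  - any 0.37.x patch release satisfies the pin
print("0.38.0" in spec)  # False - the next minor version is excluded
print("0.29.0" in spec)  # False - older releases no longer satisfy the constraint
```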