
Update databricks-sdk requirement from ~=0.16.0 to ~=0.17.0 #23

Merged
merged 1 commit into main from dependabot/pip/databricks-sdk-approx-eq-0.17.0 on Jan 17, 2024

Conversation

dependabot[bot]
Contributor

@dependabot commented on behalf of github on Jan 12, 2024

Updates the requirements on databricks-sdk to permit the latest version.
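
The ~= specifier here is PEP 440's compatible-release operator, so this bump moves the allowed range from >=0.16.0, <0.17.0 to >=0.17.0, <0.18.0. As a quick illustration (not part of the Dependabot-generated body), the `packaging` library can confirm what each pin permits:

```python
# Illustration only: what the old and new "compatible release" pins permit under PEP 440.
# Requires the `packaging` library: pip install packaging
from packaging.specifiers import SpecifierSet

old_pin = SpecifierSet("~=0.16.0")  # equivalent to >=0.16.0, <0.17.0
new_pin = SpecifierSet("~=0.17.0")  # equivalent to >=0.17.0, <0.18.0

print("0.17.0" in old_pin)  # False -- the old pin blocks this release, hence the bump
print("0.17.0" in new_pin)  # True
print("0.18.0" in new_pin)  # False -- a future 0.18.x would need another bump
```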

Release notes

Sourced from databricks-sdk's releases.

v0.17.0

  • Use covariant type for @retried(on=[...]) (#486).
  • Configure request timeout using existing parameter from Config (#489).
  • Make contents of __init__.py equal across projects (#488).
  • Update SDK to Latest OpenAPI Specification (#501).

Note: This release contains breaking changes; please see below for more details.

API Changes:

  • [Breaking] Changed list() method for w.tokens workspace-level service to return databricks.sdk.service.settings.ListPublicTokensResponse dataclass (a hedged migration sketch follows these release notes).
  • Changed list() method for w.external_locations workspace-level service to require request of databricks.sdk.service.catalog.ListExternalLocationsRequest dataclass and w.storage_credentials workspace-level service to require request of databricks.sdk.service.catalog.ListStorageCredentialsRequest dataclass.
  • Added next_page_token field for databricks.sdk.service.catalog.ListExternalLocationsResponse, databricks.sdk.service.catalog.ListFunctionsResponse, databricks.sdk.service.catalog.ListSchemasResponse and databricks.sdk.service.catalog.ListStorageCredentialsResponse.
  • Added max_results field for databricks.sdk.service.catalog.ListFunctionsRequest and databricks.sdk.service.catalog.ListSchemasRequest.
  • Added page_token field for databricks.sdk.service.catalog.ListFunctionsRequest and databricks.sdk.service.catalog.ListSchemasRequest.
  • Added omit_columns field for databricks.sdk.service.catalog.ListTablesRequest.
  • Added omit_properties field for databricks.sdk.service.catalog.ListTablesRequest.
  • Added init_scripts field for databricks.sdk.service.pipelines.PipelineCluster.
  • Added validate_only field for databricks.sdk.service.pipelines.StartUpdate and databricks.sdk.service.pipelines.UpdateInfo.
  • Changed create() method for w.dashboards workspace-level service. New request type is databricks.sdk.service.sql.DashboardPostContent dataclass.
  • Added update() method for w.dashboards workspace-level service.
  • Added http_headers field for databricks.sdk.service.sql.ExternalLink.
  • Added run_as_role field for databricks.sdk.service.sql.QueryEditContent.
  • Added package: databricks.sdk.service.dashboards and databricks.sdk.service.vectorsearch.
  • Added dataclass: databricks.sdk.service.catalog.ListExternalLocationsRequest, databricks.sdk.service.catalog.ListStorageCredentialsRequest, databricks.sdk.service.settings.ListPublicTokensResponse, databricks.sdk.service.sql.DashboardEditContent and databricks.sdk.service.sql.DashboardPostContent.
  • Removed dataclass: databricks.sdk.service.catalog.TableConstraintList and databricks.sdk.service.sql.CreateDashboardRequest.

OpenAPI SHA: 0e0d4cbe87193e36c73b8b2be3b0dd0f1b013e00, Date: 2024-01-10
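
The first two API changes above are the breaking ones for SDK consumers: w.tokens.list() now returns a response dataclass rather than yielding token objects, and the external-locations and storage-credentials list() calls take a request dataclass. Below is a minimal migration sketch for the tokens case; the token_infos and token_id attribute names are assumptions inferred from the dataclass naming, not something the notes state, so check them against the 0.17.0 SDK.

```python
# Migration sketch only -- not from the release notes. It assumes the new
# ListPublicTokensResponse dataclass exposes the tokens via a `token_infos`
# field (inferred from the dataclass name); verify against databricks-sdk 0.17.0.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Before 0.17.0 (roughly): w.tokens.list() could be iterated directly.
# for token in w.tokens.list():
#     print(token.token_id)

# With 0.17.0 the call returns the response dataclass instead, so the token
# list is presumably reached through a field on that object:
response = w.tokens.list()
for token in response.token_infos or []:
    print(token.token_id)
```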

Changelog

Sourced from databricks-sdk's changelog.

0.17.0

(Identical to the 0.17.0 release notes quoted above.)

0.16.0

  • Sort imports in service template (#479).
  • Add py.typed to support PEP-561 (#483).
  • Fixed bug in @retried when exception subtypes were not respected (#484).
  • Make WorkspaceClient and AccountClient more friendly with autospeccing (#480).

API Changes:

  • Added azure_workspace_info field for databricks.sdk.service.provisioning.Workspace.
  • Added databricks.sdk.service.provisioning.AzureWorkspaceInfo dataclass.
  • Changed update_config() method for w.serving_endpoints workspace-level service with new required argument order.
  • Changed served_entities field for databricks.sdk.service.serving.EndpointCoreConfigInput to no longer be required.
  • Changed create() method for a.account_ip_access_lists account-level service with new required argument order.
  • Changed replace() method for a.account_ip_access_lists account-level service with new required argument order.
  • Changed update() method for a.account_ip_access_lists account-level service with new required argument order.
  • Changed create() method for w.ip_access_lists workspace-level service with new required argument order.
  • Changed replace() method for w.ip_access_lists workspace-level service with new required argument order.
  • Changed update() method for w.ip_access_lists workspace-level service with new required argument order.
  • Changed ip_addresses field for databricks.sdk.service.settings.CreateIpAccessList to no longer be required.

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.16.0...v0.17.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot added the "dependencies" label (Pull requests that update a dependency file) on Jan 12, 2024
@nfx merged commit 6fde666 into main on Jan 17, 2024
1 of 7 checks passed
@dependabot deleted the dependabot/pip/databricks-sdk-approx-eq-0.17.0 branch on January 17, 2024 at 22:18