
Creating performance test to validate UCX at scale #607

Closed
wants to merge 24 commits

Conversation

@william-conti (Contributor) commented Nov 20, 2023

This pull request adds new Makefile targets for running performance tests, plus a new test suite under tests/performance: a conftest.py for fixture setup and a test_performance.py for the tests themselves. The tests create and configure a wide range of Databricks objects (groups, pipelines, jobs, experiments, models, pools, warehouses, clusters, policies, queries, alerts, scopes, dashboards, repos, directories, and notebooks) and then verify that each object was created and its permissions were assigned successfully. The suite measures the time taken to create and configure these objects and persists the results to a SQL database for further analysis.

  • Populate 1,000 groups in the workspace / account
  • Populate the objects in scope of group migration
  • Verify and identify any creation failures; at this stage nothing should fail, but any failure must be captured
  • Run the assessment / group migration
  • Compare workspace state before and after group migration
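The timing side of the steps above can be sketched with a small helper; the names (`timed`, `TimedResult`) are illustrative and not from the PR, and real tests would call the workspace client instead of the stubbed lambda:

```python
import time
from dataclasses import dataclass


@dataclass
class TimedResult:
    """One measurement: which step ran and how long it took."""
    label: str
    seconds: float


def timed(label, fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and return (result, TimedResult)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, TimedResult(label, time.perf_counter() - start)


# Example: time one (stubbed) object-creation step; the real suite would
# create groups, jobs, clusters, etc. and persist TimedResult rows to SQL.
created, measurement = timed("create_group", lambda: {"display_name": "perf-group"})
```

Each `TimedResult` row can then be written to the results database for the before/after comparison.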

@william-conti changed the title from "Creating test to test UCX at scale" to "Creating performance test to validate UCX at scale" Nov 20, 2023
@william-conti added the enhancement (New feature or request) and pr/do-not-merge (this pull request is not ready to merge) labels Nov 20, 2023

codecov bot commented Nov 20, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 82.82%. Comparing base (e1b7a27) to head (257dc3b).
Report is 367 commits behind head on main.

❗ Current head 257dc3b differs from the pull request's most recent head 5e246da. Consider uploading reports for commit 5e246da to get more accurate results.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #607      +/-   ##
==========================================
+ Coverage   78.60%   82.82%   +4.21%     
==========================================
  Files          41       35       -6     
  Lines        4272     3493     -779     
  Branches      800      650     -150     
==========================================
- Hits         3358     2893     -465     
+ Misses        711      454     -257     
+ Partials      203      146      -57     

☔ View full report in Codecov by Sentry.

CLAassistant commented Nov 27, 2023

CLA assistant check
All committers have signed the CLA.

.github/workflows/performance.yml (review comment outdated, resolved)
@@ -227,7 +227,7 @@ def run_workflow(self, step: str):
logger.debug(f"starting {step} job: {self._ws.config.host}#job/{job_id}")
job_run_waiter = self._ws.jobs.run_now(job_id)
try:
job_run_waiter.result()
job_run_waiter.result(timeout=datetime.timedelta(days=1))
Collaborator:
Make timeout a parameter of run_workflow(); setting a 1-day timeout here is a hidden disaster.
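A hedged sketch of that suggestion: expose the timeout as an explicit parameter with a visible default instead of hard-coding one day. Only `run_workflow`'s shape follows the diff above; `FakeJobs` and `FakeWaiter` are stand-ins for `self._ws.jobs` and the SDK's run waiter:

```python
import datetime


class FakeWaiter:
    """Stand-in for the SDK run waiter; records the timeout it was given."""

    def result(self, timeout=None):
        return {"state": "done", "timeout": timeout}


class FakeJobs:
    """Stand-in for self._ws.jobs in the real installer."""

    def run_now(self, job_id):
        return FakeWaiter()


class WorkflowRunner:
    def __init__(self, jobs):
        self._jobs = jobs

    def run_workflow(self, step: str,
                     timeout: datetime.timedelta = datetime.timedelta(minutes=20)):
        # The caller now chooses how long to wait; the default lives in the
        # signature rather than being buried in the method body.
        waiter = self._jobs.run_now(step)
        return waiter.result(timeout=timeout)
```

A performance test that expects long-running jobs can then pass a larger timeout explicitly, e.g. `run_workflow("assessment", timeout=datetime.timedelta(hours=6))`, while normal callers keep the conservative default.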

src/databricks/labs/ucx/mixins/fixtures.py (two review comments, outdated and resolved)
from databricks.labs.ucx.mixins.fixtures import * # noqa: F403

logging.getLogger("tests").setLevel("DEBUG")
logging.getLogger("databricks.labs.ucx").setLevel("DEBUG")
Collaborator:
If you're running this on a dedicated runner, it might make sense to route logs to a file as well.
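One way to act on that, as a sketch: attach a shared FileHandler to the same loggers the conftest configures. The file path and log format here are assumptions, not from the PR:

```python
import logging


def route_logs_to_file(path, level=logging.DEBUG):
    """Send DEBUG logs from the test loggers to a file so they survive
    a dedicated-runner session (path and format are illustrative)."""
    handler = logging.FileHandler(path)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
    for name in ("tests", "databricks.labs.ucx"):
        logger = logging.getLogger(name)
        logger.setLevel(level)
        logger.addHandler(handler)
    return handler
```

Calling this once from conftest.py keeps console output unchanged while preserving a full DEBUG trail on disk.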

pyproject.toml Outdated Show resolved Hide resolved
tests/performance/test_performance.py Show resolved Hide resolved
try_validate_secrets(persisted_rows, sql_backend, test_database, test_groups, ws)
validate_entitlements(sql_backend, test_database, ws)

assert [] == verificationErrors
Collaborator:
Log errors into a file as well; the IDE would have trouble rendering them.
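A minimal sketch of that suggestion; the helper name and file path are hypothetical, not from the PR:

```python
from pathlib import Path


def dump_verification_errors(errors, path="verification_errors.log"):
    """Write collected verification errors, one per line, so long lists
    can be inspected outside the IDE (name and path are assumptions)."""
    Path(path).write_text("\n".join(errors) + "\n" if errors else "")


# usage idea, just before the assertion in the test:
# dump_verification_errors(verificationErrors)
# assert [] == verificationErrors
```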

tests/performance/test_performance.py (review comment resolved)
Co-authored-by: Serge Smertin <[email protected]>
@nfx added the tech debt (chores and design flaws) label and removed the enhancement (New feature or request) label Apr 22, 2024
@nfx (Collaborator) commented Jul 2, 2024

Closing this PR, as its last commit was 7 months ago.

@nfx nfx closed this Jul 2, 2024
@nfx nfx deleted the feature/performance-test branch July 2, 2024 19:45
Labels
pr/do-not-merge (this pull request is not ready to merge), tech debt (chores and design flaws)
3 participants