Creating performance test to validate UCX at scale #607
Conversation
Codecov Report: all modified and coverable lines are covered by tests ✅

Additional details and impacted files:

    @@            Coverage Diff             @@
    ##             main     #607      +/-   ##
    ==========================================
    + Coverage   78.60%   82.82%   +4.21%
    ==========================================
      Files          41       35       -6
      Lines        4272     3493     -779
      Branches      800      650     -150
    ==========================================
    - Hits         3358     2893     -465
    + Misses        711      454     -257
    + Partials      203      146      -57
    @@ -227,7 +227,7 @@ def run_workflow(self, step: str):
             logger.debug(f"starting {step} job: {self._ws.config.host}#job/{job_id}")
             job_run_waiter = self._ws.jobs.run_now(job_id)
             try:
    -            job_run_waiter.result()
    +            job_run_waiter.result(timeout=datetime.timedelta(days=1))
Make the timeout a parameter of run_workflow(); hard-coding a 1-day timeout here is a hidden disaster.
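A minimal sketch of what the reviewer is asking for: the caller passes the timeout instead of the helper hard-coding one day. The `_FakeWaiter` class, the 20-minute default, and the simplified `run_workflow` signature are all assumptions for illustration; the real method lives on a class and calls `self._ws.jobs.run_now(job_id)`.

```python
import datetime


class _FakeWaiter:
    # Stand-in for the SDK's run waiter; the real code gets one back
    # from self._ws.jobs.run_now(job_id).
    def result(self, timeout=None):
        return timeout


def run_workflow(step, timeout=datetime.timedelta(minutes=20)):
    # Hypothetical refactor: the caller decides how long to wait,
    # instead of a 1-day default buried inside the helper.
    waiter = _FakeWaiter()
    return waiter.result(timeout=timeout)
```

A slow performance run can then opt into a long wait explicitly, e.g. `run_workflow("migrate-groups", timeout=datetime.timedelta(days=1))`, while ordinary callers keep a sane default.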
    from databricks.labs.ucx.mixins.fixtures import *  # noqa: F403

    logging.getLogger("tests").setLevel("DEBUG")
    logging.getLogger("databricks.labs.ucx").setLevel("DEBUG")
If you're running this on a dedicated runner, it might make sense to route logs to a file as well.
    try_validate_secrets(persisted_rows, sql_backend, test_database, test_groups, ws)
    validate_entitlements(sql_backend, test_database, ws)

    assert [] == verificationErrors
Log errors into a file as well; the IDE would have trouble rendering a long list in test output.
Co-authored-by: Serge Smertin <[email protected]>
Closing this PR, as its last commit was 7 months ago.
This pull request introduces new Makefile targets for running performance tests, and adds a new performance test suite under tests/performance, including a conftest.py for setting up fixtures and a test_performance.py for running the tests. The tests create and configure a wide range of Databricks objects (groups, pipelines, jobs, experiments, models, pools, warehouses, clusters, policies, queries, alerts, scopes, dashboards, repos, directories, and notebooks) and then verify that each object was created and its permissions assigned successfully. They measure the time taken to create and configure these objects and save the results in a SQL database for further analysis.
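The measure-and-persist pattern the description outlines could look roughly like this. The `TimedStep` row shape and the `timed` helper are assumptions for illustration, not the PR's actual code; in the real suite the rows would be written through the `sql_backend` fixture.

```python
import time
from dataclasses import dataclass


@dataclass
class TimedStep:
    # One row per setup step; rows like this could be persisted to a
    # SQL backend for later analysis, as the description says.
    label: str
    seconds: float


def timed(label, fn, *args, **kwargs):
    # Run one setup step (create a group, cluster, dashboard, ...) and
    # record how long it took, returning both the result and the timing row.
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, TimedStep(label, time.perf_counter() - start)
```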