Dynamo View Performance Benchmarks #9623

Merged: 13 commits merged into master from the PerformanceBenchmark branch on Apr 8, 2019

Conversation

@QilongTang (Contributor, Author) commented Apr 4, 2019:

Please Note:

  1. Before submitting the PR, please review How to Contribute to Dynamo
  2. Dynamo Team will meet 1x a month to review PRs found on Github (Issues will be handled separately)
  3. PRs will be reviewed from oldest to newest
  4. If a reviewed PR requires changes by the owner, the owner of the PR has 30 days to respond. If the PR has seen no activity by the next session, it will either be closed by the team or, depending on its utility, taken over by someone on the team
  5. PRs should use either Dynamo's default PR template or one of these other template options in order to be considered for review.
  6. PRs that do not have one of the Dynamo PR templates completely filled out with all declarations satisfied will not be reviewed by the Dynamo team.
  7. PRs made to the DynamoRevit repo will need to be cherry-picked into all the DynamoRevit Release branches that Dynamo supports. Contributors will be responsible for cherry-picking their reviewed commits to the other branches after a LGTM label is added to the PR.

Purpose

This PR:

  1. Introduce PerformanceTestHelper
  2. Introduce a DynamoView performance test base
  3. Re-organize the code base and add more comments to aid understanding
  4. Update the console app to run both Model and View test cases by default
  5. Add FastBenchmarkReleaseConfig as the default performance benchmark job. It sets DynamoMaxWarmupCount = 9 and DynamoMaxIterationCount = 9 to avoid cases where a benchmark runs for a very long time and hits the internal 100-iteration limit (see the sketch after this list)
  6. Add DebugInProcessConfig as an alternative config that developers can apply locally when debugging benchmark runs
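
For reference, here is a minimal sketch of what such a setup can look like on top of BenchmarkDotNet. This is only an illustration of the settings described above, not the code added in this PR: the stub benchmark classes, their method names, and the use of `AddJob` (older BenchmarkDotNet versions expose `Add` instead) are assumptions.

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Jobs;
using BenchmarkDotNet.Running;

// Illustrative config: cap warmup and measured iterations at 9 so a benchmark
// cannot keep iterating toward BenchmarkDotNet's much larger internal limit.
public class FastBenchmarkReleaseConfig : ManualConfig
{
    private const int DynamoMaxWarmupCount = 9;
    private const int DynamoMaxIterationCount = 9;

    public FastBenchmarkReleaseConfig()
    {
        AddJob(Job.Default
            .WithWarmupCount(DynamoMaxWarmupCount)
            .WithIterationCount(DynamoMaxIterationCount));
    }
}

// Stand-ins for the real Dynamo benchmark classes; the names and benchmark
// methods here are placeholders, not taken from the Dynamo code base.
public class ModelBenchmarks
{
    [Benchmark]
    public void OpenAndRunGraph() { /* open a .dyn file and run it headless */ }
}

public class ViewBenchmarks
{
    [Benchmark]
    public void OpenGraphInView() { /* open a .dyn file with the view attached */ }
}

public static class Program
{
    public static void Main()
    {
        var config = new FastBenchmarkReleaseConfig();

        // Run both suites by default, mirroring the console-app behavior
        // described in item 4 above.
        BenchmarkRunner.Run<ModelBenchmarks>(config);
        BenchmarkRunner.Run<ViewBenchmarks>(config);
    }
}
```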


Time consumption running view benchmarks with the current suite: [screenshot]

Time consumption running model benchmarks with the current suite: [screenshot]

Declarations

Check these if you believe they are true

  • The code base is in a better state after this PR
  • Is documented according to the standards
  • The level of testing this PR includes is appropriate
  • User facing strings, if any, are extracted into *.resx files
  • All tests pass using the self-service CI.
  • Snapshot of UI changes, if any.
  • Changes to the API follow Semantic Versioning, and are documented in the API Changes document.

Reviewers

@ColinDayOrg @scottmitchell @aparajit-pratap

FYIs

@DynamoDS/dynamo

1. introduce helper for grabbing config
2. introduce a second performance test case
3. make the console app entrance cleaner
@QilongTang added the WIP label Apr 4, 2019
@QilongTang (Contributor, Author) commented:
This PR is ready for review in terms of getting view execution time; I will investigate how to get render time.

@QilongTang added the PTAL (Please Take A Look 👀) label and removed the WIP label Apr 5, 2019
@QilongTang changed the title from "[WIP] Performance Benchmark Part II" to "Performance Benchmark Part II" Apr 5, 2019
@ColinDayOrg (Contributor) commented:
Other than a couple of comments about documentation and the empty class, LGTM.

@QilongTang (Contributor, Author) commented:
Addressed comments, please take a look @ColinDayOrg @aparajit-pratap

@aparajit-pratap (Contributor) commented:
@QilongTang just one last question about the reasoning behind the fast config settings for iteration count, warmup count, etc. Is there some heuristic you used to come up with these numbers?

@ColinDayOrg (Contributor) commented:
LGTM :shipit:

@QilongTang (Contributor, Author) commented:
@aparajit-pratap @ColinDayOrg These numbers are not fixed; we should tune them. I chose them for this first version because they keep the view benchmark runs under an hour and the model tests under 15 minutes, and the results I got are quite stable and as good as before. After we hook the benchmark up to CI, we should revisit these numbers and decide the best iteration and warmup counts. Maybe instead of static ints we should read them from some config (a rough sketch of that idea follows below), but that is something to consider in the future.
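
To illustrate the "read from some config" idea mentioned above (this is not code from the PR, and the environment-variable names are made up for the example), it could look roughly like this:

```csharp
using System;

internal static class BenchmarkSettings
{
    // Hypothetical helper: read a count from an environment variable and fall
    // back to the current hard-coded default when the variable is missing or
    // not a positive integer.
    internal static int ReadCount(string variableName, int defaultValue)
    {
        var raw = Environment.GetEnvironmentVariable(variableName);
        return int.TryParse(raw, out var value) && value > 0 ? value : defaultValue;
    }
}

public static class Example
{
    public static void Main()
    {
        // Illustrative variable names; defaults match the current 9/9 settings.
        int warmupCount = BenchmarkSettings.ReadCount("DYNAMO_BENCH_WARMUP_COUNT", 9);
        int iterationCount = BenchmarkSettings.ReadCount("DYNAMO_BENCH_ITERATION_COUNT", 9);
        Console.WriteLine($"warmup={warmupCount}, iterations={iterationCount}");
    }
}
```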

@QilongTang merged commit a356bb6 into master Apr 8, 2019
@QilongTang deleted the PerformanceBenchmark branch April 8, 2019 22:13
@QilongTang removed the PTAL (Please Take A Look 👀) label Apr 9, 2019
@QilongTang changed the title from "Performance Benchmark Part II" to "Dynamo View Performance Benchmarks" Apr 9, 2019