[Android] Use ms for number report #5362
Conversation
ns * 1e-6 = ms

Example output: 9.36 ms inference (mv2_xnnpack), and for LLM: 12.34 tps (mocked number).
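For reference, a self-contained Java sketch of the conversion, using hypothetical variable names (the actual change applies this to the mStatsDump timestamps, as shown in the diffs below):

// Hypothetical, self-contained illustration of the ns -> ms conversion.
public class NsToMsExample {
  public static void main(String[] args) {
    long loadStartNs = System.nanoTime();
    // ... model loading would happen here ...
    long loadEndNs = System.nanoTime();
    // Multiply the nanosecond delta by 1e-6 to get milliseconds;
    // using a double keeps sub-millisecond precision.
    double loadTimeMs = (loadEndNs - loadStartNs) * 1e-6;
    System.out.printf("model_load_time(ms): %.2f%n", loadTimeMs);
  }
}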
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5362
Note: Links to docs will display an error until the docs builds have been completed.
✅ No failures as of commit 11d53dd with merge base 034e098.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@kirklandsign has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@@ -106,14 +106,14 @@ public void onGenerationStopped() {
         new BenchmarkMetric(
             benchmarkModel,
             "model_load_time(ns)",
still ns?
@@ -106,14 +106,14 @@ public void onGenerationStopped() {
         new BenchmarkMetric(
             benchmarkModel,
             "model_load_time(ns)",
Nit: The metric value is now in (ms) instead of (ns), so its name needs to be updated accordingly.
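A minimal sketch of the rename the reviewer is asking for, reusing the BenchmarkMetric call shown in the diff (only the metric label and the converted value change; the other arguments are taken as-is from the snippet):

results.add(
    new BenchmarkMetric(
        benchmarkModel,
        "model_load_time(ms)", // label updated to match the new unit
        (mStatsDump.loadEnd - mStatsDump.loadStart) * 1e-6,
        0.0f));

The same rename would apply to "generate_time(ns)" → "generate_time(ms)" below.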
@@ -106,14 +106,14 @@ public void onGenerationStopped() {
         new BenchmarkMetric(
             benchmarkModel,
             "model_load_time(ns)",
-            mStatsDump.loadEnd - mStatsDump.loadStart,
+            (mStatsDump.loadEnd - mStatsDump.loadStart) * 1e-6,
             0.0f));
     // LLM generate time
     results.add(
         new BenchmarkMetric(
             benchmarkModel,
             "generate_time(ns)",
Same comment about the metric name
Stamped!
@@ -106,14 +106,14 @@ public void onGenerationStopped() {
         new BenchmarkMetric(
             benchmarkModel,
             "model_load_time(ns)",
-            mStatsDump.loadEnd - mStatsDump.loadStart,
+            (mStatsDump.loadEnd - mStatsDump.loadStart) * 1e-6,
             0.0f));
     // LLM generate time
     results.add(
         new BenchmarkMetric(
             benchmarkModel,
             "generate_time(ns)",
ditto
@kirklandsign has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@kirklandsign merged this pull request in 62024d8.