Add benchmark for filestream input #37317
Conversation
Now we can quickly compare performance metrics when we make changes to the filestream implementation without running the whole Filebeat.
❕ Build Aborted
💚 Build Succeeded
💚 Flaky test report: Tests succeeded.
LGTM
one optional nit/comment
connector, eventsDone := newTestPipeline(expEventCount)
done := make(chan struct{})
go func() {
	err := input.Run(context, connector)
Nit: right now the benchmark is capturing the time spent on the config, setting up the manager, etc. It might be worth using StopTimer and StartTimer to limit the reported time to the time spent in input.Run. I'm sure the setup time is roughly constant, so it's not a huge deal, but it might make it easier to see small changes.
(cherry picked from commit f2cf95c)
Awesome work, this will make a big impact, especially when we start running these benchmarks and reporting their status on a per-PR basis.
Checklist
- [ ] I have made corresponding changes to the documentation
- [ ] I have made corresponding changes to the default configuration files
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] I have added an entry in CHANGELOG.next.asciidoc or CHANGELOG-developer.next.asciidoc

How to test this PR locally
Default Filestream Configuration
On my machine I've got the following results:
Fingerprint File Identity
Related issues