feat(feedback): Emit outcomes for user feedback events #3026
Conversation
@cmanallen only other thing -- we'll want to emit an …
@cmanallen yes! Best to write an integration test here with all the outcomes you expect and work your way backward from there (we can assist with emitting the outcomes because it's not always obvious). What outcomes do you actually expect? User feedback is not sampled or filtered AFAIK. Will there be quotas or rate limits for user feedback? Or do you want to collect outcomes for invalid items?
Yes to both quotas and rate limits. We should record an outcome for invalid items.
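For context, a rate-limited outcome in Relay is a small record consumed from the outcomes topic. The sketch below shows the rough shape such a record might take for a dropped feedback event; the field names follow Relay's general outcome format, but the specific `reason` string and the data category value for user feedback are assumptions, not values confirmed in this thread.

```python
# Rough shape of a single outcome record, as an outcomes consumer might see it.
# The `outcome` code 2 conventionally means "rate limited"; the `reason` string
# and the `category` value used for user feedback are hypothetical placeholders.
example_rate_limited_outcome = {
    "timestamp": "2024-04-01T00:00:00Z",
    "org_id": 1,
    "project_id": 42,
    "key_id": 123,
    "outcome": 2,                   # rate limited
    "reason": "feedback_exceeded",  # hypothetical reason code from the quota
    "category": 14,                 # hypothetical data category for feedback
    "quantity": 1,                  # one dropped feedback event
}
```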
@cmanallen is this PR still relevant?
@jjbayer Added a failing test. We just want to emit an outcome when a feedback event is rate-limited.
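To make the intent concrete, here is a rough sketch of the kind of integration test being described, written in the style of Relay's Python integration tests. The fixture names (`mini_sentry`, `relay`, `outcomes_consumer`), the quota category name, and the reason code are assumptions drawn from the existing test suite and may not match the final test exactly.

```python
# Sketch only: fixture names, quota fields, and category names are assumptions
# based on Relay's existing integration tests, not a verbatim copy of the test.
from sentry_sdk.envelope import Envelope, Item, PayloadRef


def test_feedback_rate_limited_emits_outcome(mini_sentry, relay, outcomes_consumer):
    project_id = 42
    project_config = mini_sentry.add_full_project_config(project_id)

    # Configure a zero-limit quota so the feedback item is rate-limited immediately.
    project_config["config"]["quotas"] = [
        {
            "id": "feedback_quota",            # hypothetical quota id
            "categories": ["user_report_v2"],  # assumed category name for feedback
            "limit": 0,
            "reasonCode": "feedback_exceeded",
        }
    ]

    upstream = relay(mini_sentry)
    consumer = outcomes_consumer()

    # Minimal feedback envelope: item header type "feedback",
    # event payload type "userreportv2" (see the diff discussed below).
    event = {"type": "userreportv2", "contexts": {"feedback": {"message": "it broke"}}}
    envelope = Envelope()
    envelope.add_item(Item(payload=PayloadRef(json=event), type="feedback"))

    upstream.send_envelope(project_id, envelope)

    # Expect exactly one rate-limited outcome for the dropped feedback item.
    outcome = consumer.get_outcome()
    assert outcome["outcome"] == 2  # rate limited
    assert outcome["reason"] == "feedback_exceeded"
    assert outcome["quantity"] == 1
```

The idea is simply: configure a zero-limit quota for feedback, send one feedback envelope, and assert that exactly one rate-limited outcome shows up.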
@cmanallen Thanks! Unfortunately the rate-limiting code is a bit verbose, but you can look at this PR, specifically the files …
@jjbayer Added, but it's still failing. Where is the quantity value incremented?
```diff
 return {
-    "type": "feedback",
+    "type": "userreportv2",
```
This is the name of the event item type, because of `#[serde(rename_all = "lowercase")]` in relay-base-schema/src/events.rs (line 30 in 0608f08).
According to the coverage in test_feedback.py, this is the correct type: https://github.com/getsentry/relay/blob/master/tests/integration/test_feedback.py#L6
@cmanallen I fixed the integration test. I gave you bad advice on the …
@jjbayer My mistake, I didn't realize you had fixed the code on this branch. In my attempt to fix it on my outdated branch I must have made a typo, so you can ignore my earlier comments. I'm still a little confused about the event type. I'm not sure why …
I think the linked test works because it sends data to the …

relay/tests/integration/fixtures/__init__.py, lines 306 to 308 in 23cbd98
@jjbayer Makes sense. So this rate-limiting outcome will be emitted for requests to the /envelope endpoint?
@cmanallen it will work for the …
@jjbayer I'm unfamiliar with this product, but I'm pretty sure we use the envelope endpoint to submit feedback. As long as it works there with the …
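For illustration, a minimal feedback submission to the envelope endpoint might look like the sketch below. It assumes the envelope item header uses type `feedback` while the event payload's `type` field is `userreportv2` (matching the diff above); the host, project id, and public key are placeholders.

```python
# Illustrative only: RELAY_URL, PROJECT_ID, and PUBLIC_KEY are placeholders.
import json

import requests

RELAY_URL = "http://localhost:3000"
PROJECT_ID = 42
PUBLIC_KEY = "abc123"

event = {
    "type": "userreportv2",  # event payload type (lowercase serde name of UserReportV2)
    "contexts": {"feedback": {"message": "something looks off"}},
}
payload = json.dumps(event).encode()

# A minimal envelope: envelope header, item header, item payload.
envelope = b"\n".join(
    [
        json.dumps({"event_id": "9ec79c33ec9942ab8353589fcb2e04dc"}).encode(),
        json.dumps({"type": "feedback", "length": len(payload)}).encode(),
        payload,
        b"",
    ]
)

requests.post(
    f"{RELAY_URL}/api/{PROJECT_ID}/envelope/",
    data=envelope,
    headers={
        "Content-Type": "application/x-sentry-envelope",
        "X-Sentry-Auth": f"Sentry sentry_key={PUBLIC_KEY}, sentry_version=7",
    },
)
```

If a quota or rate limit applies, Relay rejects the item (typically with a 429 response) and, with this change, emits the corresponding rate-limited outcome.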
@jjbayer Is this ready to be approved, or is there more work to do?
* master:
  feat(metric-stats): Report cardinality to metric stats (#3360)
  release: 0.8.56
  fix(perfscore): Adds span op tag to perf score totals (#3326)
  ref(profiles): Return retention_days as part of the Kafka message (#3362)
  ref(filter): Add GTmetrix to the list of web crawlers (#3363)
  fix: Fix kafka topic default (#3350)
  ref(normalization): Remove duplicated normalization (#3355)
  feat(feedback): Emit outcomes for user feedback events (#3026)
  feat(cardinality): Implement cardinality reporting (#3342)
Adds outcomes for user feedback events.
Should there be test coverage for this? Will this handle failure outcomes? Are there other locations that need configuration?