
Make classification evaluation metrics work when there is field mapping type mismatch #53458

Merged (2 commits, Mar 16, 2020)

Conversation

@przemekwitek (Contributor) commented Mar 12, 2020

This PR makes the comparison used in evaluation metrics more lenient.
Instead of comparing raw field values, it now compares their string representations, so that e.g. an actual field value of "1" and a predicted field value of 1 are considered equal.

Relates #53485
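
For illustration, here is a minimal, self-contained sketch of the lenient-comparison idea described above. This is not the actual code from the PR; the class and method names (`LenientEquality`, `lenientlyEqual`) are hypothetical.

```java
import java.util.Objects;

/**
 * Hypothetical helper illustrating the lenient comparison: field values
 * are compared via their string representations, so a keyword-mapped
 * actual value "1" and an integer-mapped predicted value 1 match even
 * though their mapping types differ.
 */
final class LenientEquality {

    private LenientEquality() {}

    static boolean lenientlyEqual(Object actual, Object predicted) {
        if (actual == null || predicted == null) {
            // Fall back to reference/null-safe equality when either side is missing.
            return Objects.equals(actual, predicted);
        }
        return actual.toString().equals(predicted.toString());
    }

    public static void main(String[] args) {
        System.out.println(lenientlyEqual("1", 1));  // true: "1".equals("1")
        System.out.println(lenientlyEqual(1L, 1));   // true: both stringify to "1"
        System.out.println(lenientlyEqual("1", 2));  // false
    }
}
```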

@przemekwitek force-pushed the fix_overall_accuracy branch 2 times, most recently from b75cbbe to b05927c on March 12, 2020 14:32
@przemekwitek changed the title from "Make accuracy evaluation metric work when there is field mapping type mismatch" to "Make classification evaluation metrics work when there is field mapping type mismatch" on Mar 12, 2020
@przemekwitek removed the WIP label on Mar 12, 2020
@przemekwitek marked this pull request as ready for review on March 12, 2020 14:41
@elasticmachine (Collaborator) commented:

Pinging @elastic/ml-core (:ml)

@przemekwitek force-pushed the fix_overall_accuracy branch from b05927c to 8e66ea3 on March 16, 2020 08:47
@benwtrent (Member) left a comment:


Naming suggestion, but looks good.
