📎 Prettier Compatibility Metric #2555
Comments
I would like to give it a try.
Awesome. I assigned you the issue. Ping me if something is unclear or if you need some pointers. I recommend approaching this problem step by step (PR by PR): build out the tools first and test them as a CLI before approaching the CI.
I propose another formula to calculate the Prettier compatibility. There is no way to say which one is better in every case; it depends.

compatibility_per_file = match_lines / MAX(lines_file_1, lines_file_2)

We use the same formula to calculate compatibility_per_file, so we get:

- the result of the first compatibility formula: …
- the result of my compatibility formula: …

If someone resolves all the compatible issues of file b:

- the result of the first compatibility formula: …
- the result of my compatibility formula: …
- the compatibility diff of the first formula: …

I prefer to call the formula that @MichaReiser is …
My understanding is that you're proposing an alternative metric for the overall compatibility but keeping the same metric for a single file. The file based metric calculates the average of the per-file compatibilities, whereas the line based metric tries to measure how many lines in total are similar. That's why I would call these "file based" and "line based". In my view, both of these provide a valuable signal, and I would recommend implementing both to see which one works better to track our work. What do you think?
Agree.
File Based Average Prettier Similarity:

compatibility_per_file = matching_lines / MAX(lines_file_1, lines_file_2)
file_based_average_prettier_similarity = SUM(compatibility_per_file) / number_of_files

Line Based Average Prettier Similarity:

compatibility_per_file = matching_lines / MAX(lines_file_1, lines_file_2)
line_based_average_prettier_similarity = SUM(matching_lines) / SUM(MAX(lines_file_1, lines_file_2))
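For illustration, here is a minimal sketch of how the two aggregates could be computed from per-file counts; the `FileReport` struct and the function names are made up for this example and are not the actual report code:

```rust
/// Per-file counts obtained by diffing Rome's output against Prettier's output.
/// (Hypothetical type; the real report has its own representation.)
struct FileReport {
    /// Lines that are identical in both outputs.
    matching_lines: usize,
    /// MAX(lines_file_1, lines_file_2) for this file.
    max_lines: usize,
}

/// File based average: mean of the per-file compatibilities.
/// Assumes a non-empty slice with non-zero `max_lines` values.
fn file_based_average(reports: &[FileReport]) -> f64 {
    let sum: f64 = reports
        .iter()
        .map(|r| r.matching_lines as f64 / r.max_lines as f64)
        .sum();
    sum / reports.len() as f64
}

/// Line based average: total matching lines over the total of the per-file maxima.
fn line_based_average(reports: &[FileReport]) -> f64 {
    let matching: usize = reports.iter().map(|r| r.matching_lines).sum();
    let max: usize = reports.iter().map(|r| r.max_lines).sum();
    matching as f64 / max as f64
}
```

The two can diverge when file sizes vary a lot: the file based average weights every file equally, while the line based average gives large files proportionally more weight.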
I saw this PR was merged: #2574. What's missing now?
The metric is merged, but what would be nice to have is a CI job that comments with the current metric and compares it with main (ideally per file).
I am still working on the CI.
I have some concerns about a numerical metric. Testing out Rome on some small projects, I've noticed that large diffs can come from very small changes like trailing commas. On the flip side, some changes are small in terms of line diffs but produce formatted output that, at least in my view, is harder to read and less visually appealing. I don't want to discourage a numerical metric, but I do think we should take more factors into consideration when thinking about compatibility.
This metric is a tool that helps us approximate the Prettier compatibility; it isn't an exact representation. Nevertheless, it helps us measure whether we are moving in the right direction and gives us a rough understanding of how close we are. However, it doesn't mean that our ultimate goal is to reach 100% and that we should optimize for it at any cost; that would be a misuse of the metric. Regarding trailing commas: we should make sure that we compare apples with apples, meaning we should apply the same formatting options.
This issue is stale because it has been open 14 days with no activity. |
Description
Rome's goal is for our formatting to closely match Prettier's. However, it's currently difficult to know whether a PR improves compatibility or makes things worse.
Goal
Define a Prettier compatibility metric and provide means to compute the metric using the current Rome version.
Proposal
Percentage of lines that match Prettier's formatting, similar to git's similarity index
I'm not very proficient at math and the metric might be flawed. Please feel free to propose other metrics.
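To make the proposal concrete, here is a rough sketch of such a per-file measure; the function name is made up, and a real implementation would more likely reuse a proper diff algorithm than this simple multiset matching:

```rust
use std::collections::HashMap;

/// Rough, hypothetical line-similarity measure in the spirit of git's similarity
/// index: the share of lines the two formatted outputs have in common, relative
/// to the longer of the two outputs.
fn line_similarity(rome_output: &str, prettier_output: &str) -> f64 {
    // Count how often each line occurs in Prettier's output.
    let mut remaining: HashMap<&str, usize> = HashMap::new();
    let mut prettier_len = 0usize;
    for line in prettier_output.lines() {
        *remaining.entry(line).or_insert(0) += 1;
        prettier_len += 1;
    }

    // A line of Rome's output "matches" while Prettier's output still has an
    // unconsumed occurrence of the same line.
    let mut rome_len = 0usize;
    let mut matching_lines = 0usize;
    for line in rome_output.lines() {
        rome_len += 1;
        if let Some(count) = remaining.get_mut(line) {
            if *count > 0 {
                *count -= 1;
                matching_lines += 1;
            }
        }
    }

    let max_lines = rome_len.max(prettier_len);
    if max_lines == 0 {
        1.0 // two empty outputs are trivially identical
    } else {
        matching_lines as f64 / max_lines as f64
    }
}
```

Note that this counts equal lines as matches regardless of their position, which is cruder than what a real diff-based similarity index would do.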
Code Pointers
Our test runner already has an option to generate a report by setting the REPORT_PRETTIER env variable. See tools/crates/rome_js_formatter/tests/prettier_tests.rs, lines 270 to 297 in db61c4b.
It should be straightforward to …
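As an aside, here is a minimal sketch of what an env-variable-gated report hook can look like; this is not the code at the permalink above, and the output file name is an assumption:

```rust
use std::{env, fs, io};

/// Hypothetical hook a test runner could call after comparing Rome's output
/// with Prettier's: only write the report when REPORT_PRETTIER is set.
fn maybe_write_report(report: &str) -> io::Result<()> {
    if env::var_os("REPORT_PRETTIER").is_some() {
        // The file name is an assumption made for this sketch.
        fs::write("prettier-report.md", report)?;
    }
    Ok(())
}
```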
Stretch
A CI job that computes the metric for the PR and for main, and comments the two numbers together with the difference PR - main (the percentage by which the PR improved the compatibility).
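A minimal sketch, assuming the two metric values are already available as ratios, of how such a comparison comment could be rendered; the function and the message format are made up:

```rust
/// Hypothetical formatter for the CI comment comparing the metric on the PR
/// branch with the metric on main. Both values are ratios in the range 0.0..=1.0.
fn comparison_comment(pr_metric: f64, main_metric: f64) -> String {
    let diff = pr_metric - main_metric;
    format!(
        "Average Prettier similarity: {:.2}% on this PR vs {:.2}% on main ({:+.2}%)",
        pr_metric * 100.0,
        main_metric * 100.0,
        diff * 100.0
    )
}
```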