From 7a495b5024d0d3f0b8361f2d6424f38a967b5bf7 Mon Sep 17 00:00:00 2001
From: ArturoAmorQ
Date: Tue, 10 Oct 2023 11:51:16 +0200
Subject: [PATCH] Tweak

---
 python_scripts/trees_classification.py | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/python_scripts/trees_classification.py b/python_scripts/trees_classification.py
index 32ce9aa44..d2ca2ae08 100644
--- a/python_scripts/trees_classification.py
+++ b/python_scripts/trees_classification.py
@@ -241,12 +241,12 @@
 # In the next exercise, you will increase the tree depth to get an intuition on
 # how such parameter affects the space partitioning.
 #
-# Finally, we can try to visualize the output of predict_proba for a multiclass
-# problem using `DecisionBoundaryDisplay`, except that For a K-class problem,
-# you'll have K probability outputs for each data point. Visualizing all these
-# on a single plot can quickly become tricky to interpret. It is then common to
-# instead produce K separate plots, one for each class, in a one-vs-rest (or
-# one-vs-all) fashion.
+# Finally, we can try to visualize the output of `predict_proba` for a
+# multiclass problem using `DecisionBoundaryDisplay`. However, for a K-class
+# problem, you'll have K probability outputs for each data point. Visualizing
+# all these on a single plot can quickly become tricky to interpret. It is
+# then common to instead produce K separate plots, one for each class, in a
+# one-vs-rest (or one-vs-all) fashion.
 #
 # For example, in the plot below, the first column shows in red the certainty on
 # classifying a data point as belonging to the "Adelie" class. Notice that the
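
For reference, below the patch is a minimal sketch (not part of the commit) of the
one-vs-rest `predict_proba` visualization the reworded comment describes. It assumes
scikit-learn >= 1.4, where `DecisionBoundaryDisplay.from_estimator` accepts a
`class_of_interest` parameter together with `response_method="predict_proba"`. The
dataset path and the culmen column names are placeholders assumed from the rest of
the script and may need adjusting.

# Sketch: one predict_proba plot per class, in a one-vs-rest fashion.
# Assumptions: scikit-learn >= 1.4 (for `class_of_interest`), and the CSV path
# and column names below, which are placeholders rather than part of the patch.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.inspection import DecisionBoundaryDisplay
from sklearn.tree import DecisionTreeClassifier

penguins = pd.read_csv("../datasets/penguins_classification.csv")  # assumed path
culmen_columns = ["Culmen Length (mm)", "Culmen Depth (mm)"]  # assumed names
target_column = "Species"
data, target = penguins[culmen_columns], penguins[target_column]

tree = DecisionTreeClassifier(max_depth=2).fit(data, target)

# One subplot per class, each showing the predicted probability of that class
# against the rest (the one-vs-rest view described in the comment).
_, axs = plt.subplots(ncols=len(tree.classes_), figsize=(12, 4), sharey=True)
for ax, class_name in zip(axs, tree.classes_):
    DecisionBoundaryDisplay.from_estimator(
        tree,
        data,
        response_method="predict_proba",
        class_of_interest=class_name,
        cmap="Reds",
        ax=ax,
    )
    ax.scatter(
        data[culmen_columns[0]],
        data[culmen_columns[1]],
        c=(target == class_name),
        cmap="Greys",
        edgecolor="black",
        s=10,
    )
    ax.set_title(f"Probability of class {class_name}")
plt.show()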