util_pb/pb_color_map: use HSV cone distance as error #93
Conversation
I am not an expert on C, but I would have expected to be able to use #include <fixmath.h>, because the function prototypes seem to be declared in
Instead, I have to do #include <fix16_trig.c>, which doesn't seem correct. Am I misunderstanding something? Might there be an issue in the build system? Also, when using
Which I guess means these implementations are just too big to be used.
An easier way to debug this, instead of running on the hub, would be to create a new
It is probably due to the cache table. It looks like it can be disabled:

diff --git a/bricks/stm32/stm32.mk b/bricks/stm32/stm32.mk
index 2d002a59..4fdfc386 100644
--- a/bricks/stm32/stm32.mk
+++ b/bricks/stm32/stm32.mk
@@ -406,7 +406,7 @@ LWRB_SRC_C = lib/lwrb/src/lwrb/lwrb.c
# libfixmath
-COPT += -DFIXMATH_NO_CTYPE
+COPT += -DFIXMATH_NO_CTYPE -DFIXMATH_NO_CACHE -DFIXMATH_FAST_SIN
LIBFIXMATH_SRC_C = $(addprefix lib/libfixmath/libfixmath/,\
  fix16_sqrt.c \
We need to add the source file to the list of source files in the makefile:

diff --git a/bricks/stm32/stm32.mk b/bricks/stm32/stm32.mk
index 2d002a59..72f585cb 100644
--- a/bricks/stm32/stm32.mk
+++ b/bricks/stm32/stm32.mk
@@ -411,6 +411,7 @@ COPT += -DFIXMATH_NO_CTYPE
LIBFIXMATH_SRC_C = $(addprefix lib/libfixmath/libfixmath/,\
fix16_sqrt.c \
fix16_str.c \
+ fix16_trig.c \
fix16.c \
uint32.c \
)
Makes sense, will do!
This produces the following error:
This comes from
But already with
Yes. The maintainer has been quite responsive in the past, if you want to do this.
OK, I'll give it a shot |
This seems to work as of now, but I still want to move the function to pbio and add tests for it; I just need to find some time. I also want to do some testing to see whether it really improves on the previous method.
Thanks for submitting this. It looks nice! Which definition of the cone did you implement? It appears that cones are commonly associated with chroma (which we could also calculate). When saturation is used, it seems more common to work with a cylinder. Is it one of the following, and if so, which one? I've also been wondering whether we may still need two scaling factors to set the relative dimensions of the cylinder or cone, which would amount to relative weights similar to what we have today.
And after this is completed and merged, I think it's worth adding a little bit of (configurable) hysteresis as well. This can be done fairly trivially by requiring a minimal change in the cost function result to update the returned color. That makes it possible to reliably detect a change in color without jitter. We've used it here, but only for reflection. By applying this to the cost function, we may be able to get the same benefit for colors.
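A minimal sketch of that hysteresis idea in plain Python (hypothetical names; the firmware would do this in C): keep returning the previous color unless another candidate's cost improves on it by a configurable margin.

```python
def pick_color(costs, previous=None, margin=10):
    """Return the key in `costs` (color -> cost, lower is better) with
    the lowest cost, but keep `previous` unless the best candidate
    beats it by at least `margin`. This suppresses jitter when two
    colors have nearly equal cost."""
    best = min(costs, key=costs.get)
    if previous in costs and costs[best] > costs[previous] - margin:
        return previous  # improvement too small: stick with previous
    return best
```

With margin=0 this reduces to a plain argmin over the cost function.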
That's a good point. Using the cylinder would not be ideal, as it is degenerate for the color black, giving arbitrary distances between different HSV parameters for black. For this reason, I have opted for the cone (Fig 3b). While I was not aware of the distinction between chroma and saturation, I think I intuitively used chroma anyway, because that's the intuitive way to map it into the cone (radial coordinate = saturation * value = chroma). (Fun fact: the German Wikipedia page also misleadingly labels the radial coordinate of the HSV cone as saturation.) Of course, using the bicone (Fig 3a) might also be a valid approach; I might experiment with that as well.
One scaling factor suffices, since only the ratio between the HS plane and the V axis matters. I have this in mind in my implementation, but I need to figure out whether to calibrate it somehow or leave it to the user.
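For reference, the cone mapping described above can be sketched in plain Python (hypothetical helper names; the actual PR implements this in C with libfixmath). The radial coordinate is chroma (saturation × value), so all fully dark colors collapse to the cone's apex regardless of hue:

```python
import math

def cone_point(h, s, v):
    """Map HSV (h in degrees, s and v in 0..100) to a Cartesian point
    in the cone: radius = chroma = (s/100)*(v/100), height = v/100."""
    chroma = (s / 100.0) * (v / 100.0)
    angle = math.radians(h)
    return (chroma * math.cos(angle), chroma * math.sin(angle), v / 100.0)

def cone_cost(hsv_a, hsv_b, chroma_weight=1.0):
    """Squared Cartesian distance between two HSV colors in the cone.
    A single chroma_weight sets the ratio between the HS plane and
    the V axis, as discussed above."""
    xa, ya, za = cone_point(*hsv_a)
    xb, yb, zb = cone_point(*hsv_b)
    return ((chroma_weight * (xa - xb)) ** 2
            + (chroma_weight * (ya - yb)) ** 2
            + (za - zb) ** 2)
```

In contrast to a cylinder mapping (radius = saturation), two black readings with wildly different hues here always have distance zero.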
One other aspect I have been thinking about is the default colors. When testing this cost function with the standard colors, I found that it is more difficult to detect some standard LEGO colors. Especially standard LEGO green has quite a low value and thus is not very close to the pybricks
But beyond that, would it make sense to provide calibrated colors for standard LEGO colors?
If we can come up with values that work 80% of the time in a wide variety of lighting conditions, then yes, it could be worth it. But if users will always have to calibrate anyway, then it isn't worth it.
The problem here isn't so much LEGO colors or the color mapping cost function per se. The main thing is that we also use the default
In the current mapping, this provides a nice compromise, with the default colors (None, Red, Green, Blue, Yellow) being detected very well. In my tests, default Pybricks performance was a lot better than with the standard LEGO color mode. Even if we provided other colors, it's not really clear what they should be, since S and V are distance dependent. We could provide Color.BRICK_YELLOW_1CM, but this might be the point where users are better off providing their own colors. Notice also that the current cost function mapping isn't strictly symmetric: in the current version, we assign more meaning to the requested color than to the measured color. The reasoning for this should probably be documented a bit better. Separately from this, especially for green and yellow, see also the hack in
I might try to run the test at
Edit: found it: pybricks/pybricks-coverage#1 (comment)
Thanks for all the work and extensive analysis so far! I don't have a lot of time at the moment, but reviewing this is definitely on my todo list. Did you check out pybricks/support#116 and the following patch we are currently using? Maybe you're addressing this already, but if not, let's make sure we're not over-fitting the HSV mapping to this underlying RGB-to-HSV hack.

pybricks-micropython/pybricks/util_pb/pb_color_map.c Lines 22 to 45 in f69bd69
I did take a look at pybricks/support#116, but somehow assumed it only affected the hue (which I wasn't too concerned about here). I seem to have missed the fact that it also changes the value/saturation. I'll take a look. I'll squash some of the commits and write better commit messages once this PR approaches merging. Or do you prefer to keep this all as one commit?
If I disable these corrections:

pybricks-micropython/pybricks/util_pb/pb_color_map.c Lines 47 to 49 in f69bd69

I once again have problems with detecting GREEN rather than None, since the green value is just ~31. These lines are actually the same modification I use when mapping to the bicone radius, so basically the most recent commit applies this transformation twice.
I don't think we should be trying to make "green" LEGO bricks match
Yeah, I think specific calibrated colors would work very well, and the color you provided is pretty close to what I get from sensor.hsv() (with pybricks/support#116 enabled). That would mean introducing new default colors, e.g.
So perhaps before adjusting the HSV cost matching function (this PR), maybe we could first take a closer look at fixing the RGB to HSV conversion in pybricks/support#116. I.e., @dlech has made a carefully crafted
But our
Do you have a SPIKE Color Sensor, @Novakasa? If not, we can send you one.
I don't have any SPIKE components; it would be great if you have a spare
Maybe I'm missing something, but I believe the basic LEGO brick colors (including green) are all detected very well with the default settings. So did this occur specifically with the default cost function (master), with the modified cost function (this PR), or with both?
👍 I will reach out via email.
Small changes/typos/fixes can generally be squashed into the relevant main commit, but it is fine to have more than one commit for different pieces of code. For example, see this branch. Coincidentally, it already has a squashed version of some of your commits :)
I would also recommend trying out the PUPDevice class so you can access the original RGB values. Then you can test various conversions in Python without having to reflash the firmware every time. You'll also be able to either duplicate our HSV conversions or adjust them as needed. You could even take it a step further by having the sensor continuously print the RGB data and doing the processing on your computer as it comes in.
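As a sketch of that workflow (the 0..1024 raw channel scale is an assumption; the real sensor's range may differ), the standard conversion can be prototyped with Python's stdlib colorsys before porting anything to the firmware:

```python
import colorsys

def raw_rgb_to_hsv(r, g, b, scale=1024):
    """Convert raw sensor RGB readings (0..scale) to HSV in
    Pybricks-style units: hue 0..359 degrees, s and v in 0..100."""
    h, s, v = colorsys.rgb_to_hsv(r / scale, g / scale, b / scale)
    return round(h * 360) % 360, round(s * 100), round(v * 100)
```

Any custom correction (such as the hack discussed above) can then be applied on top of this and compared against live sensor data.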
In master, the default colors are detected very well in my testing. The way I see it, the color error function in master is fine-tuned to work quite well for detecting the default colors, but has some issues when trying to distinguish custom colors that can be closer to each other, especially with low value and saturation. If we chose to use more realistic colors for matching, that would make things a bit easier and we could probably have more default colors, but in any case the latest bicone approach seems quite robust to me, even with the "unrealistic" default colors.
@Novakasa, what is the status of this PR? Is it ready to be merged, or do you still want to make changes?
Since we'll no longer be using libfixmath for motors, this might be quite a sizable increase in build size. I would rather leave this open until we're definitely sure it's a big improvement for a large number of users. Here's perhaps a different idea: instead of changing the cost function in the firmware, how about we generalize it so users can provide their own and write it in MicroPython? For example:

def my_cost_function(sample: Color, detectable_color: Color):
    return 123456

sensor.detectable_colors(colors, cost_function=my_cost_function)

It seems like this would give all the existing advantages of the detectable-colors feature, while giving power users a flexible tool to make it suit their application.
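To illustrate, a non-trivial cost function under that proposed API might look like the following sketch. Colors are treated here as plain (h, s, v) tuples so it runs anywhere; the real callback would presumably receive Color objects instead:

```python
def hue_diff(h1, h2):
    """Smallest angular difference between two hues, in degrees."""
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)

def my_cost_function(sample, detectable_color):
    """Weight the hue difference by how saturated and bright both
    colors are, so dark/gray readings are compared mostly on s and v.
    Inputs are (h, s, v) tuples with h in degrees, s and v in 0..100."""
    h1, s1, v1 = sample
    h2, s2, v2 = detectable_color
    hue_weight = min(s1, s2) * min(v1, v2) / 10000  # 0..1
    return (hue_weight * hue_diff(h1, h2)
            + abs(s1 - s2)
            + abs(v1 - v2))
```

Because the hue term fades out as either color approaches black or gray, noisy hue readings on dark colors would not dominate the match.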
@dlech While this PR is close to ready, if
Another possibility would be to try to implement the HSV cone cost without libfixmath. I might give this a shot if I find the time. However, I would prefer not to try to make the function match both the default colors and the calibrated colors well at the same time, as that sacrifices some simplicity. We could either change the default colors to match against more realistic ones (and maybe map them back to Color.GREEN etc. before returning the color), or keep the original cost function by default and switch to the cone cost function when the user provides their own colors.
Sounds good. I'm in favor of adding a few predefined colors like
Closing this in favor of #104.
I am trying to implement the color error by calculating the Cartesian distance between the colors in the HSV cone. This should be less sensitive to fluctuating hue values for dark and unsaturated colors. This has been discussed further in pybricks/support#627.
This doesn't crash the hub, but it doesn't really seem to work yet; I still need to figure out how to debug on the hub.