
Teaching Pixy

I am having a hard time teaching Pixy to recognize objects under different light conditions. Whenever Pixy is moving around and it looks in the direction of a window, it seems to lose track of the objects it has learned.

For my purposes, I care very little what the video looks like. In Photoshop, if I amp up the saturation of a screen grab of the image, the brightly colored objects I want Pixy to recognize REALLY stand out regardless of the lighting conditions. Is there any way to tell Pixy to amp up the saturation of the video it is processing? Even if this slowed the processing down somewhat, I am not all that concerned: 10 fps is just as good as 50 fps for my purposes.

Thanks for the help.

Best,
Shannon

Hi Shannon,
The bright window is causing other parts of the image to be underexposed, and underexposure leads to poor hue response (as you already figured out). This is a challenging problem: maintaining good accuracy under lots of different lighting conditions, especially when a bright light source enters the frame. We’ll improve this, but in the meantime there are two things you can try:

  1. Decrease the “Min Saturation” parameter to make the signature more inclusive when objects become underexposed.

  2. Increase the “Brightness” parameter to improve the exposure of the underexposed parts of the image.

There’s more info here:
http://cmucam.org/projects/cmucam5/wiki/Some_Tips_on_Generating_Color_Signatures

Thanks!

I thought the Min Saturation parameter only had meaning when you were teaching the Pixy about an object. Is this not correct?

Also, I need to adjust these parameters from an Arduino-type board. Is there a way my program can determine whether the image is under- or overexposed so that it can alter the parameters?

Thanks,
Shannon

Hi Shannon,

You are correct: min saturation is used when teaching Pixy about objects. By lowering the min saturation and teaching Pixy with your object, you’ll still be able to detect it when a bright light source enters the frame, like light from a window. When the saturation of your object decreases due to underexposure, Pixy will still detect it because the model is more inclusive. Does this make sense?

Currently these parameters are not available in the Arduino library, although we’re working on it.
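For what it’s worth, here is a very rough sketch of what an Arduino-side adjustment loop might look like once a brightness call is exposed. The setBrightness() call below is hypothetical (it isn’t in the library yet, as noted above), and the “no blocks seen, so bump the brightness” heuristic is an assumption rather than anything Pixy reports directly:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

// Start somewhere in the middle of an assumed 0-255 brightness range.
uint8_t brightness = 128;

void setup()
{
  pixy.init();
}

void loop()
{
  // getBlocks() returns how many taught objects Pixy currently sees.
  uint16_t blocks = pixy.getBlocks();

  // Crude heuristic: if an object that should be in view disappears,
  // assume it is underexposed and nudge the brightness up; back off
  // slowly once detection returns. Pixy does not report exposure
  // directly, so this is only a guess at the cause.
  if (blocks == 0 && brightness <= 250)
    brightness += 5;
  else if (blocks > 0 && brightness > 0)
    brightness -= 1;

  // Hypothetical call -- substitute whatever brightness setter the
  // Arduino library eventually provides.
  pixy.setBrightness(brightness);

  delay(100);
}

The getBlocks() call and detection results are what the current Arduino library already provides; only the brightness setter is the missing piece.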

Let me know if you have any more questions.

Thanks!

Scott