
Gaps in object detection under static lighting, and is there a frame rate setting?

I am polling Pixy (firmware 1.0.2) over I2C, but I believe I see this very same issue when using PixyMon as well.

I can have good lighting and the object being detected mid-frame for a few seconds at a time.

But no matter what I try, I get periodic dropouts where no objects are detected, and then they come back a fraction of a second later in another sample. I am using 3-color stripes in color code (CC) mode, for object type 121 for example.
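For anyone following along, here is a minimal sketch of the kind of polling loop involved. It assumes the default Pixy I2C address 0x54 and the documented word format (0xaa55/0xaa56 sync, then checksum, signature, x, y, width, height, plus angle for color code blocks); the octal 0121 value is my reading of how CC signatures are packed, and checksum verification is covered in the followup further down.

#include <Wire.h>

const uint8_t PIXY_I2C_ADDR = 0x54;   // default Pixy I2C address (assumed unchanged)
const uint16_t SYNC_CC = 0xaa56;      // sync word that starts a color code (CC) block
const uint16_t TARGET_CC = 0121;      // "121" -- CC signatures appear to be packed as octal digits

// Read one 16-bit word from Pixy, least significant byte first.
uint16_t readWord() {
  Wire.requestFrom((uint8_t)PIXY_I2C_ADDR, (uint8_t)2);
  uint8_t lsb = Wire.read();
  uint8_t msb = Wire.read();
  return ((uint16_t)msb << 8) | lsb;
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  uint16_t w = readWord();
  if (w == 0x55aa) {                  // byte-misaligned: drop one byte to re-sync
    Wire.requestFrom((uint8_t)PIXY_I2C_ADDR, (uint8_t)1);
    Wire.read();
    return;
  }
  if (w != SYNC_CC)                   // keep scanning until a CC block starts
    return;

  uint16_t checksum = readWord();     // checksum word (see the checksum followup below)
  uint16_t sig    = readWord();
  uint16_t x      = readWord();
  uint16_t y      = readWord();
  uint16_t width  = readWord();
  uint16_t height = readWord();
  uint16_t angle  = readWord();
  (void)checksum; (void)angle;

  if (sig == TARGET_CC) {
    Serial.print("cc 121 at ");
    Serial.print(x);
    Serial.print(",");
    Serial.print(y);
    Serial.print(" size ");
    Serial.print(width);
    Serial.print("x");
    Serial.println(height);
  }
}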

What I have to do is have my system not report that the object is gone from the frame until I have not seen the object for up to 3 seconds. This is a workaround, but it is of course a real issue for me: I need to know that the object has disappeared from the frame within 500 ms worst case.
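The hold-off itself is nothing fancy; a minimal sketch of the idea (the 3000 ms constant and the seenThisSample flag are just illustrative, not anything from the Pixy API):

// Only report "object gone" after it has been missing continuously for HOLD_MS.
// seenThisSample is whatever your per-sample detection result is.
const unsigned long HOLD_MS = 3000;   // current hold-off; far longer than the 500 ms I actually want

bool objectPresentFiltered(bool seenThisSample) {
  static unsigned long lastSeen = 0;
  static bool present = false;

  unsigned long now = millis();
  if (seenThisSample) {
    lastSeen = now;
    present = true;
  } else if (present && (now - lastSeen) >= HOLD_MS) {
    present = false;                  // only now admit the object has left the frame
  }
  return present;
}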

I see this in PixyMon as well if you just bring up PixyMon and do NOT yet hit the ‘cook’ icon, so you see the object outlined as a white box on black. The box appears to flicker and completely disappears erratically, for intervals of 1/4 second or so.

This happens even if I tweak ‘brightness’ so that block detection is optimal in PixyMon using the ‘cook’ icon. I want to vary brightness so I can scan at several brightness levels and look for closer or farther-away objects, because I know brightness is a key and critical parameter.

I wonder if slowing the frame rate down to 10 frames per second, which is FINE for my needs, might help with this issue, but I cannot find a frame rate setting in PixyMon.

Again, I wonder if what I am seeing is that I sample Pixy right as it is updating its results; if proper queueing, or locking on the queue, is not done, maybe the blocks being reported come out as none in that case. I’m just throwing out ideas here.

I will be implementing checksum verification, as there is some chance this may be over-the-wire corruption, but since I see what looks like the same issue both in the flickering of Pixy’s 3-color light AND in PixyMon losing the detection box, I wonder if it is maybe a ‘real’ bug.

Any thoughts? Thanks, Mark

Followup: I have implemented checksum verification, so this issue is not related to bad ‘over the wire’ data.
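For anyone else doing the same: the check is just a 16-bit sum of the block’s payload words compared against the checksum word that follows the sync. A sketch, assuming the layout used by the official Arduino library, where color code blocks also include the angle word in the sum:

#include <stdint.h>

// Verify a Pixy block: the checksum word should equal the 16-bit sum of the
// payload words that follow it (signature, x, y, width, height, plus angle
// for a color code block).
bool blockChecksumOk(const uint16_t *payload, uint8_t numWords, uint16_t checksum) {
  uint16_t sum = 0;
  for (uint8_t i = 0; i < numWords; i++)
    sum += payload[i];               // wraps at 16 bits, matching the firmware's sum
  return sum == checksum;
}

// e.g. for a CC block: uint16_t p[6] = {sig, x, y, width, height, angle};
//      bool ok = blockChecksumOk(p, 6, checksum);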

Hi Mark,

How long does your system run before you see this issue occur? Also, what is your setup like (is Pixy moving around and changing scenes? Is it a dynamic environment? Are there many objects? etc.)? Just trying to get a feel for what might be happening. Thanks!

Scott

I get this issue with a small robot whose body I jack up so the wheels don’t touch anything; this lets me debug the code that feeds off of the Pixy object recognition to decide whether to move right, left, toward the object, and so on. The issue happens when I retrieve the objects myself over I2C, and I think it also happens when I connect directly using PixyMon, because I see periods of time when the fairly large pattern of my 3 colors of tape next to each other just disappears from being shown as a rectangle, then reappears.

If the rectangle for my single object disappears in PixyMon while the image itself still looks the same, is that how I would know that Pixy did not report any objects to PixyMon during the times no white rectangle shows up? That is my assumption, and it matches my own separate query results.

My main purpose with this post was to find out whether this is a current bug in firmware 1.0.2, or whether I had older firmware with this sort of defect; as far as I can tell you have not stated that there is anything newer than 1.0.2.

This effect does not happen all the time, and I suspect it is related to the rather touchy lighting conditions that I already know Pixy is required to operate within. This is a robot and it moves about, but I try to keep black behind my one object, which is a 3-bar color-coded object. I have found that use of ‘cc’ mode is critical: if I just have Pixy look for one color, that color is bound to be recognized somewhere in the background, or even in a shiny piece of metal the camera may see. Very tricky, and in fact most of my issues have come down squarely to trying to control lighting.

What I would love to see is some firmware support for finding very specific patterns, like simple bar codes, or say a square with an X through it, or maybe a circle with a dot or crosshairs in it. I think the whole approach of recognizing color saturation is just too prone to issues that require tight control to be reliable.

I will continue to experiment with parameters, and I am hoping that support for tweaking the ‘brightness’ register over I2C arrives someday, or that the work to make the firmware buildable with gcc completes so I can try to stumble through it myself. I will also try to find colors that are less of an issue for whatever reason, and will experiment with saturation and the other parameters in PixyMon to find what works best. I typically then disconnect PixyMon from USB and start my code that accesses the blocks over I2C.

So I cast a ‘vote’ your way for simple shape or bar-code recognition, although I am sure your list is already too long with other things, like the face recognition that I think has been planned.

Take care,
mark johnston

I posted this to a thread on the GCC port, but I worded it nicely, so here is an idea if you all are truly concerned with better detection, which I hear is the focus of the December release (so this is probably too late for that).

I feel this is important feature feedback that could GREATLY improve your detection ability under modestly variable lighting, which is in fact the ‘real world’ of movable-field-of-view robotics.

My focus is on 3-color bands being recognized over as broad a range of brightness as you can possibly handle. I was going to vary ‘brightness’ myself and sample at 3 levels, basically to get as many tries at finding the object as possible, but the lack of support for synchronizing and setting things via I2C has slowed that effort. The issue is that Pixy today is so highly sensitive to precise environmental (lighting) conditions that I have almost given up on it, so I applaud efforts to improve its detection.

The ‘dream’: it would be great if Pixy could internally rotate between a few brightness settings as it samples frames, to improve its detection. This lowers the effective frame rate, but it would be highly desirable from my point of view, and I am certain MANY Pixy customers would benefit. Maybe a 4-frame cycle covering a dynamic range of, say, 50%: the user sets a mode called ‘dynamic sample range’ (or a similar name) to 50%, and over the 4-frame cycle the 1st frame runs at 50% of the user’s brightness setting, the 2nd at 75%, the 3rd at 125%, and the 4th at 150%. I’m not really sure, but I feel this would GREATLY help cover a range of brightnesses at the expense of a slower effective frame rate. Of course, movement between frames means there could be multiple reported objects very close to each other, but that is the user’s problem to figure out based on his application. My issue is lack of detection unless brightness is just right; I have only one very specific color code to detect, so it is either there or not, and ‘not’ is a ‘bad’ thing and my main issue.
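As a sketch of the schedule I have in mind (setBrightness() here is a made-up placeholder, not an actual firmware call, and I am assuming an 8-bit brightness value):

#include <stdint.h>
#include <stdio.h>

// Placeholder for the firmware's real brightness/exposure control.
static void setBrightness(uint8_t value) {
  printf("brightness -> %u\n", (unsigned)value);
}

// Proposed 'dynamic sample range' mode at 50%: a 4-frame cycle running at
// 50%, 75%, 125% and 150% of the user's brightness setting, i.e. a +/-50%
// sweep around it, trading frame rate for more exposure attempts per object.
static void applyDynamicSampleRange(uint8_t userBrightness, unsigned frameIndex) {
  static const uint8_t schedulePercent[4] = {50, 75, 125, 150};
  unsigned scaled = (unsigned)userBrightness * schedulePercent[frameIndex % 4] / 100;
  if (scaled > 255)
    scaled = 255;                    // clamp to the assumed 8-bit range
  setBrightness((uint8_t)scaled);
}

int main(void) {
  // Example: a user setting of 80 sweeps 40, 60, 100, 120 over the 4-frame cycle.
  for (unsigned frame = 0; frame < 4; frame++)
    applyDynamicSampleRange(80, frame);
  return 0;
}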