We’ve built a small competition robot based on the Pixy1 and a BeagleBone Blue (a very nice combo for small robots, btw). The camera has a pan-servo.
The problem for us is that we want to use the pan-servo dynamically, turning the head while moving, but locating what we see requires knowing the exact camera angle for each frame.
If Pixy has already solved this, there’s no need to read on; just help me understand how.
I see 3 fundamental solutions:
- Place small colored “tags” on the robot within the bottom (or top) edge of the camera view; each frame will then implicitly contain directional information in the form of those colored “tags” in the blob list (a rough sketch of the angle math follows after this list).
- Create a servo model that estimates the servo’s position from the commands we send and its observed response. This may prove hard to get right… (a minimal model sketch follows below).
- Buy (or hack) a servo with position feedback (an extra wire from the servo’s internal potentiometer) and connect that wire to Pixy’s ADC input; then for each frame Pixy samples the actual servo position and sends it along with the frame data, perhaps embedded as a separate blob (a calibration sketch follows below).
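
For the first option, here’s a rough, untested sketch of how the pan angle could be recovered from the blob list. The tag signature number, the ~75° horizontal field of view, and the tag’s pixel position at center are all assumptions you’d calibrate on the actual robot:

```python
# Hypothetical sketch: recover the pan angle from a body-fixed "tag" blob.
# All constants below are assumptions to be calibrated on the real robot.

PIXY_X_MAX = 319            # Pixy1 reports block x in 0..319
HFOV_DEG = 75.0             # rough Pixy1 horizontal field of view
DEG_PER_PIXEL = HFOV_DEG / (PIXY_X_MAX + 1)
TAG_SIG = 7                 # hypothetical color signature assigned to the tag
TAG_X_CENTERED = 160        # tag's x position when the pan servo is centered

def pan_angle_from_blobs(blobs):
    """Estimate the pan angle in degrees, or None if the tag isn't visible.

    `blobs` is assumed to be a list of (signature, x, y, width, height)
    tuples parsed from the current Pixy frame.
    """
    for sig, x, y, w, h in blobs:
        if sig == TAG_SIG:
            # The tag is fixed to the robot body, so when the camera pans
            # right the tag drifts left in the image by the same angle.
            return (TAG_X_CENTERED - x) * DEG_PER_PIXEL
    return None
```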
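
For the second option, even a crude slew-rate model might be good enough as a starting point. This is a minimal sketch assuming the servo simply moves toward its last commanded position at a fixed maximum rate; the 400°/s figure is a guess from typical hobby-servo specs (60° in ~0.15 s, unloaded), so you’d measure your own servo under load:

```python
import time

SLEW_DEG_PER_S = 400.0      # assumed maximum servo speed; measure yours

class ServoModel:
    """Estimate where the servo actually is, given the commands we sent."""

    def __init__(self, initial_deg=0.0):
        self.estimate = initial_deg
        self.target = initial_deg
        self.last_time = None

    def command(self, target_deg, t=None):
        """Record a new commanded position at time t (seconds)."""
        self._advance(time.monotonic() if t is None else t)
        self.target = target_deg

    def position_at(self, t):
        """Return the estimated servo position at time t."""
        self._advance(t)
        return self.estimate

    def _advance(self, t):
        # Move the estimate toward the last target at the maximum slew rate.
        if self.last_time is not None:
            max_step = SLEW_DEG_PER_S * (t - self.last_time)
            delta = self.target - self.estimate
            self.estimate += max(-max_step, min(max_step, delta))
        self.last_time = t

# Usage: call model.command(angle) whenever a servo command goes out, and
# model.position_at(frame_timestamp) when a frame arrives.
```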
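
For the third option, once the potentiometer voltage is sampled (wherever the ADC lives), turning it into an angle is just a two-point linear calibration. The voltages and angles below are made-up example values; you’d record the real readings at two known servo positions:

```python
# Hypothetical calibration points, measured once at two known positions.
V_AT_MIN, ANGLE_MIN = 0.55, -90.0   # pot voltage at one end of travel
V_AT_MAX, ANGLE_MAX = 2.45, 90.0    # pot voltage at the other end

def angle_from_volts(v):
    """Linearly interpolate the servo angle from the pot voltage."""
    frac = (v - V_AT_MIN) / (V_AT_MAX - V_AT_MIN)
    return ANGLE_MIN + frac * (ANGLE_MAX - ANGLE_MIN)
```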
Anyone have any thoughts on this issue?
~Per