
Using Pixy as an optical flow sensor

Hi,

I want to try and use the Pixy as an optical flow sensor such as http://store.3drobotics.com/products/optical-flow-sensor or https://pixhawk.org/modules/px4flow

It will point straight down at the ground and report how much the camera has moved in the X and Y directions since it was last polled, by keeping track of salient points within the images.
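
The rough loop I have in mind looks something like the sketch below: an untested Python/OpenCV mock-up on a PC, since I don't yet know what the raw-frame API on the Pixy looks like. The class name and all the parameters are just placeholders.

```python
import cv2
import numpy as np

class FlowTracker:
    """Accumulate camera X/Y displacement (in pixels) between polls
    by tracking salient corner features from frame to frame."""

    def __init__(self):
        self.prev_gray = None
        self.dx = 0.0
        self.dy = 0.0

    def update(self, frame):
        """Feed each new camera frame; accumulates displacement."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if self.prev_gray is not None:
            # Find salient points in the previous frame...
            pts = cv2.goodFeaturesToTrack(self.prev_gray, maxCorners=100,
                                          qualityLevel=0.01, minDistance=8)
            if pts is not None:
                # ...and match them in the current frame with pyramidal Lucas-Kanade.
                nxt, status, _ = cv2.calcOpticalFlowPyrLK(
                    self.prev_gray, gray, pts, None)
                good = status.ravel() == 1
                if good.any():
                    d = (nxt[good] - pts[good]).reshape(-1, 2)
                    # Median of the per-feature motion is robust to a few mistracks.
                    self.dx += float(np.median(d[:, 0]))
                    self.dy += float(np.median(d[:, 1]))
        self.prev_gray = gray

    def poll(self):
        """Return (dx, dy) accumulated since the last poll, then reset."""
        out = (self.dx, self.dy)
        self.dx = self.dy = 0.0
        return out
```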

Has anyone tried this?
Does anyone know if this is possible?
How easy is it to access the images on the Pixy and program it with new firmware?

Thanks
Joe

Hi Joe,
Sounds cool! We’re going to release what we’re calling the “Firmware SDK” in the next month or so. But in the meantime we’re also going to post some information on how to make your own “cooked” mode in PixyMon to test your algorithm.

Do you have some experience with computer vision?

thanks!

Hi,

I have done a few computer vision modules at university and read a few papers that implement optical flow algorithms on different hardware.

I think that if I can access the raw image data I should be able to get this working, although the frame rate will limit the maximum speed at which the camera can move. I'm also unsure how well this will work when the camera is out of focus.
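
To put a very rough number on the frame-rate concern, here is a back-of-envelope calculation. Every figure in it is an assumption for illustration, not a Pixy spec:

```python
import math

# Back-of-envelope numbers; all of these are assumptions, not Pixy specs.
fps = 50            # frames per second available to the algorithm
width_px = 320      # horizontal resolution used for tracking
hfov_deg = 47.0     # assumed horizontal field of view
altitude_m = 1.0    # height above the ground
search_px = 16      # largest per-frame shift the matcher can handle

# Width of ground visible in one frame at this altitude.
ground_w_m = 2 * altitude_m * math.tan(math.radians(hfov_deg / 2))
m_per_px = ground_w_m / width_px

# Fastest ground speed before features jump out of the search window.
max_speed = search_px * m_per_px * fps
print(f"max trackable speed ~ {max_speed:.2f} m/s")
```

With these made-up numbers the ceiling comes out around 2 m/s, and it scales linearly with both frame rate and altitude, which is why the frame rate matters so much.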

Can you give me any more info on the firmware SDK and cooked mode?

Thanks
Joe

Hi Joe & Rich,

I’m also interested in using Pixy for a similar application, though I also need to track rotation in addition to X & Y translation.
If you want to collaborate on the project, just let me know.

Cheers,

Dean

Hey Joe, Rich & Dean

I too am working on a similar project and was looking into using Pixy as an optical flow sensor. My current project is very similar to yours, Joe. At the moment I am using the Pixy cam to identify landing targets for an APM quadrotor.

Dean, I have successfully translated both the X, Y position and the rotation/orientation angle of a 2-color CC pattern into yaw, pitch and roll commands.
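
In case it's useful, the core of it is just a proportional controller on the pattern's offset from the image centre. A heavily simplified sketch of the idea (not my actual code; the gains and frame size are placeholders):

```python
def target_to_commands(cx, cy, angle_deg, frame_w=320, frame_h=200,
                       gain_xy=0.5, gain_yaw=0.02):
    """Proportional mapping from the tracked pattern's image position and
    orientation to attitude corrections. Gains/frame size are placeholders."""
    # Normalised offset of the pattern from the image centre, range -1..1.
    ex = (cx - frame_w / 2.0) / (frame_w / 2.0)
    ey = (cy - frame_h / 2.0) / (frame_h / 2.0)
    roll = gain_xy * ex           # lateral drift   -> roll correction
    pitch = -gain_xy * ey         # fore/aft drift  -> pitch correction
    yaw = gain_yaw * angle_deg    # pattern rotation -> yaw correction
    return roll, pitch, yaw
```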

I wanted to use the optical flow capability to add another layer of stability when descending onto the landing pad, as well as to provide a reference when GPS-aided navigation is not possible.

The goal of the project is to demonstrate autonomous navigation and landing when GPS-aided navigation is not possible.

I'm currently using an APM 2.5 running ArduCopter 3.2, and interfacing through a Raspberry Pi running MAVProxy to send the vision-based commands to the APM.
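
On the Pi side, pymavlink (the library MAVProxy is built on) can push an OPTICAL_FLOW message straight to the autopilot. A minimal sketch; the serial port, baud rate and field scaling here are assumptions you would need to check against your own setup:

```python
import time
from pymavlink import mavutil

# Port and baud are placeholders for however your Pi is wired to the APM.
master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
master.wait_heartbeat()  # wait until the autopilot is talking

def send_flow(flow_x, flow_y, vx, vy, quality, ground_dist):
    """Push one OPTICAL_FLOW reading to the autopilot."""
    master.mav.optical_flow_send(
        int(time.time() * 1e6),    # time_usec
        0,                         # sensor_id
        int(flow_x), int(flow_y),  # raw flow around x/y (sensor units)
        vx, vy,                    # flow in m/s, compensated for tilt
        quality,                   # 0 = invalid, 255 = best
        ground_dist)               # distance to ground in m, negative if unknown

# e.g. send_flow(12, -3, 0.08, -0.02, 200, 1.5)
```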

How have your projects progressed?

Cheers,
James

So far I’ve just been reading up on visual odometry and general optical flow techniques. Since this isn’t my area of expertise, I need to do more research before I attempt to tackle anything real.

Unfortunately, everything I’ve read seems to indicate that my goals may be unrealistic. I’m trying to do odometry for a wheeled robot on carpet (an FRC competition robot), which means I don’t need to worry about altitude or lighting conditions, since I can control those (assume the camera will be aimed down inside a cone with uniform lighting). Most optical flow algorithms look for edges in order to build a list of trackable features, and match those up frame-to-frame. The carpet will be low contrast, with a uniform feature size (tufts) and no good lines to track.
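
One thing I plan to test, since a corner detector may find nothing usable on carpet, is whole-frame phase correlation, which matches the overall texture rather than individual features (it only recovers translation, so rotation would need something extra, e.g. a log-polar variant). A rough OpenCV sketch:

```python
import cv2
import numpy as np

def frame_shift(prev_gray, gray):
    """Estimate the (dx, dy) shift between two frames with phase
    correlation, which uses the whole texture rather than corners."""
    a = np.float32(prev_gray)
    b = np.float32(gray)
    # A Hanning window suppresses FFT edge artifacts.
    win = cv2.createHanningWindow(a.shape[::-1], cv2.CV_32F)
    (dx, dy), response = cv2.phaseCorrelate(a, b, win)
    return dx, dy, response  # response is a rough match confidence
```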

[My original plan was to use an optical mouse sensor, but none of them can track anywhere close to 12 ft/s, and none of them seem to report rotation. Also, they seem to be unobtainium these days, as both the Cypress and Avago product lines appear to have been discontinued.]

If the camera is aimed towards the horizon instead, I’ve got to contend with lots of fast-moving robots in the FOV, as well as spotlights and reflective surfaces (polycarbonate wall panels). I think the signal-to-noise ratio might be pretty hard to work with in that environment.

When I get some time, I plan to build a rig and film some reference videos I can use to test various flow algorithms to see if I can get anything close to what I’m attempting.
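
When I do, the evaluation harness will probably be something simple like the following, accumulating per-frame phase-correlation shifts over a clip (the filename is hypothetical):

```python
import cv2
import numpy as np

cap = cv2.VideoCapture('carpet_run.avi')   # hypothetical reference video
ok, frame = cap.read()
prev = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
total_dx = total_dy = 0.0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    (dx, dy), resp = cv2.phaseCorrelate(prev, gray)
    total_dx += dx
    total_dy += dy
    prev = gray

print(f"net displacement over the clip: ({total_dx:.1f}, {total_dy:.1f}) px")
```

Comparing that net displacement against the rig's measured ground truth should tell me pretty quickly whether carpet texture carries enough signal to be worth pursuing.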