So far I’ve just been reading up on visual odometry and general optical flow techniques. Since this isn’t my area of expertise, I need to do more research before I attempt to tackle anything real.
Unfortunately, everything I’ve read seems to indicate that my goals may be unrealistic. I’m trying to do odometry for a wheeled robot on carpet (an FRC competition robot), which means I don’t need to worry about altitude or lighting conditions, since I can control both (assume the camera is aimed straight down inside a cone with uniform lighting). Most optical flow algorithms look for edges in order to build a list of trackable features, then match those features up frame-to-frame. Carpet is low contrast, with a uniform feature size (tufts) and no good lines to track.
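To make that concern concrete, here’s a minimal sketch of the kind of feature-based pipeline most of the literature assumes, written against OpenCV in Python (the filename and parameter values are placeholders, not anything I’ve tested): detect Shi-Tomasi corners, track them with pyramidal Lucas-Kanade, and fit a rotation-plus-translation model between frames. If the carpet can’t produce stable corners, the chain falls apart at the very first step.

```python
# Sketch: can sparse features survive frame-to-frame on carpet?
# Assumes OpenCV (cv2) and footage from a downward-facing camera.
# "carpet_test.avi" is a placeholder filename.
import math
import cv2

cap = cv2.VideoCapture("carpet_test.avi")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Shi-Tomasi corners; low-contrast tufts may force qualityLevel way down.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        print("no trackable features this frame")
        prev_gray = gray
        continue

    # Pyramidal Lucas-Kanade: match the corners into the next frame.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                              winSize=(21, 21), maxLevel=3)
    good_old = pts[status.flatten() == 1]
    good_new = nxt[status.flatten() == 1]

    # Fit a rotation + translation (+ uniform scale) motion model.
    if len(good_new) >= 3:
        M, _ = cv2.estimateAffinePartial2D(good_old, good_new)
        if M is not None:
            dx, dy = M[0, 2], M[1, 2]
            dtheta = math.degrees(math.atan2(M[1, 0], M[0, 0]))
            print(f"tracked {len(good_new)} pts  dx={dx:.2f}px  "
                  f"dy={dy:.2f}px  dtheta={dtheta:.2f}deg")

    prev_gray = gray
```

The interesting number here is how many points survive the status check on real carpet footage; if it’s consistently near zero, sparse feature tracking is probably a dead end for this surface.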
[My original plan was to use an optical mouse sensor, but none of them can track anywhere close to 12 ft/s, and none of them seem to report rotation. They also seem to be unobtainium these days, as both the Cypress and Avago product lines appear to have been discontinued.]
If the camera is instead aimed towards the horizon, I’ve got to contend with lots of fast-moving robots in the FOV, as well as spotlights and reflective surfaces (polycarbonate wall panels). I suspect the signal-to-noise ratio will be pretty hard to work with in that environment.
When I get some time, I plan to build a rig and film some reference videos that I can use to test various flow algorithms and see whether I can get anywhere close to what I’m attempting.
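For that test harness, something like the following would do as a first pass: it runs OpenCV’s dense Farneback flow over a reference clip and logs the median per-frame pixel displacement, which doesn’t depend on finding distinct corners at all. Again, this is just a sketch; the clip name and parameter values are placeholders until the rig footage actually exists.

```python
# Sketch of the test harness: run dense Farneback flow over a reference clip
# and log the median per-frame pixel displacement (no corner detection needed).
# "rig_straight_line.avi" is a placeholder for eventual rig footage.
import cv2
import numpy as np

def median_flow(path):
    cap = cv2.VideoCapture(path)
    ok, prev = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense flow: one (dx, dy) vector per pixel.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            pyr_scale=0.5, levels=3,
                                            winsize=15, iterations=3,
                                            poly_n=5, poly_sigma=1.2, flags=0)
        dx = np.median(flow[..., 0])
        dy = np.median(flow[..., 1])
        print(f"frame {frame_idx}: median flow ({dx:.2f}, {dy:.2f}) px")
        prev_gray = gray
        frame_idx += 1

median_flow("rig_straight_line.avi")
```

Comparing this against the sparse pipeline above on the same clips should tell me fairly quickly whether the carpet texture carries enough information to work with at all.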