
Laser Light tracking demo code

Could I possibly get a copy of the Arduino code that was used for this demo?
http://www.cmucam.org/projects/cmucam5/wiki/Will_Pixy_tracksense_laser_light

I want to do something similar, but move the light toward an object, and I was curious how you are detecting non-signature objects so you can keep the laser light away from them.

Thank you

Hi Eric,

I’m attempting to find the code from this demo, but be warned: it may be from a very old version of Pixy and might not be suitable for use with Pixy in its current form. But if it seems to be of any help, we’ll be glad to release it.

Scott

Hi Eric,
The cat demo detected a shift in the position of the laser dot. Since the laser and the camera were in different planes, anything that interrupts the laser beam causes a parallax shift of the dot in the Pixy image. You can use the same principle to create a distance sensor; this is essentially what the Kinect does with structured lighting.

The program went like this:

while true:
  if laser light shifted (from previous position):
    move pan/tilt to a new random position
    continue

So it wasn’t moving the laser away so much as just moving the laser randomly when it sensed something interacting with the beam.
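
If it helps in the meantime, here’s a minimal Arduino sketch of that logic. It’s not the original demo code, just a reconstruction assuming the standard Pixy Arduino library, the laser dot trained as signature 1, and a shift threshold that’s a pure guess:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;
int16_t lastX = -1, lastY = -1;   // last known laser-dot position

void setup()
{
  pixy.init();
  randomSeed(analogRead(0));
}

void loop()
{
  uint16_t count = pixy.getBlocks();
  if (count == 0)
    return;

  // assume block 0 is the laser dot (trained as signature 1)
  int16_t x = pixy.blocks[0].x;
  int16_t y = pixy.blocks[0].y;

  // a jump of more than a few pixels means something broke the beam
  if (lastX >= 0 && (abs(x - lastX) > 5 || abs(y - lastY) > 5))
  {
    pixy.setServos(random(0, 1000), random(0, 1000)); // new random pan/tilt
    delay(500);    // let the servos settle
    lastX = -1;    // forget the old position; the dot legitimately moved
  }
  else
  {
    lastX = x;
    lastY = y;
  }
}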

You want to do something like sense motion and then move the laser toward the motion?

Thanks for the reply! That makes complete sense now that you’ve explained how it was done.

Basically yes. I want to sense an object that is in motion, or that otherwise isn’t normally there, and move the laser light to “target” it.

I can see how doing this for one of the signature items would be fairly easy: move the laser light signature to where the other signature object is. I was curious about non-signature blocks, or whether that information can be retrieved.

I am building an airsoft auto-sentry for fun. The laser light is going to be used to verify that there is a target lock, or just to look cool. I’m OK with it targeting learned signature items, but it would be cool if I could target anything that is moving or that isn’t normally in the FOV.

Thank you

Non-signature blocks: that’s something Pixy doesn’t understand. Motion detection, though, is easily within the realm of Pixy’s processing ability. It may be challenging to do what you describe, because when you move Pixy, motion is created in the image. So you’d need to do some kind of interleaving, like this:

1 detect motion
2 move to location of motion
3 stop
4 goto 1

Step 2 needs calibration— you need a good mapping between image coordinates and servo positions, which is doable but challenging!
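
To make the interleaving concrete, the Arduino side might be shaped like this. Note that detectMotion() is entirely hypothetical (Pixy doesn’t do motion detection yet, as I mention below), and the linear map() calls are a naive stand-in for the calibration I mentioned:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

// Hypothetical stand-in: Pixy has no motion-detection API, so this stub
// never fires. A real version would compare successive frames.
bool detectMotion(int16_t *x, int16_t *y)
{
  (void)x; (void)y;
  return false;
}

void setup()
{
  pixy.init();
}

void loop()
{
  int16_t x, y;
  if (detectMotion(&x, &y))                     // step 1: detect motion
  {
    // step 2: naive linear map from image coords (0-319, 0-199) to the
    // 0-1000 servo range; proper calibration would replace these numbers
    pixy.setServos(map(x, 0, 319, 1000, 0),
                   map(y, 0, 199, 0, 1000));
    delay(500);                                 // step 3: stop, let motion settle
  }                                             // step 4: back to step 1
}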

Motion detection/segmenting is on the list! (We don’t have a date though :frowning:)

Face detection might help! With face detection (and hue-based object detection) you can do something like this (a proportional control loop):

1 find face/object
2 move motors in the direction of the face/object at a speed proportional to the distance from the center of image
3 goto 1

The advantage of this is that there’s no interleaving; the camera will just track the face/object continuously. Just don’t shine the laser in people’s eyes!
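
A bare-bones version of that proportional loop, using the stock Pixy Arduino library and its pan/tilt servo outputs, might look like this. The gain of 1/2 is a guess to tune, and the signs depend on how the servos are mounted:

#include <SPI.h>
#include <Pixy.h>

#define CENTER_X 160   // half of the 320-pixel image width
#define CENTER_Y 100   // half of the 200-pixel image height

Pixy pixy;
int32_t pan = 500, tilt = 500;   // start at servo mid-range (0-1000)

void setup()
{
  pixy.init();
}

void loop()
{
  if (pixy.getBlocks())                           // step 1: find the object
  {
    // step 2: error = offset from image center; correction proportional to it
    int32_t errX = pixy.blocks[0].x - CENTER_X;
    int32_t errY = pixy.blocks[0].y - CENTER_Y;
    pan  = constrain(pan  - errX / 2, 0, 1000);   // flip the signs if it
    tilt = constrain(tilt + errY / 2, 0, 1000);   // tracks the wrong way
    pixy.setServos(pan, tilt);
  }                                               // step 3: back to step 1
}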

For now, Pixy will be fixed just below the tilt/pan mechanism, and the servos will have a fixed targeting area that corresponds to Pixy’s FOV. There will be a little bit of offset, but it should be trivial enough for engaging targets, and it should be fairly easy to calculate. The problem I’m about to tackle, as you mentioned, is mapping Pixy coordinates to servo pulse lengths.
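
My rough plan for that mapping is just the Arduino Servo library plus map(), something like the sketch below; the pins, pulse range, and image dimensions are placeholders until I calibrate against the actual targeting area:

#include <Servo.h>

Servo panServo, tiltServo;

void setup()
{
  panServo.attach(9);     // example pins
  tiltServo.attach(10);
}

// Map a Pixy block position (x: 0-319, y: 0-199) to servo pulse lengths.
// 1000-2000 us is a typical hobby-servo range; mine may differ.
void aimAt(int x, int y)
{
  panServo.writeMicroseconds(map(x, 0, 319, 1000, 2000));
  tiltServo.writeMicroseconds(map(y, 0, 199, 1000, 2000));
}

void loop()
{
  // e.g. aimAt(block.x, block.y) for whatever block Pixy reports
}

The idea is to measure the pulse lengths that point the gun at the corners of Pixy’s FOV and substitute those for the 1000/2000 endpoints.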

I am also playing with the idea of putting Pixy on the tilt/pan, coaxial with the airsoft gun, and chasing the learned signature object(s). That won’t involve tracking motion, for the reasons you explained; simply go to the position of the signature object(s). I’ll have to devise a way to prioritize signature objects when there are multiple; perhaps the one with the largest size.
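
For the largest-object priority, something like this helper is what I have in mind, since each block from the Pixy Arduino library carries width and height:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

// Return the index of the largest block by pixel area, or -1 if none.
// 'count' is the value returned by pixy.getBlocks().
int biggestBlock(uint16_t count)
{
  int best = -1;
  uint32_t bestArea = 0;
  for (uint16_t i = 0; i < count; i++)
  {
    uint32_t area = (uint32_t)pixy.blocks[i].width * pixy.blocks[i].height;
    if (area > bestArea)
    {
      bestArea = area;
      best = i;
    }
  }
  return best;
}

void setup() { pixy.init(); }

void loop()
{
  int target = biggestBlock(pixy.getBlocks());
  if (target >= 0)
  {
    // aim at pixy.blocks[target] here
  }
}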

The face detection idea is interesting. I didn’t plan on this involving any people/animals as targets. It will only be operational when I want to mess with it; I don’t want any surprises when I wake up groggy.

Thanks again!