
laser pointer tracking with Pixymon

Hey cmucam5 coders,

I am fairly new to taking files from other programs and using them within Arduino code…

My ultimate goal for a project I’m working on is this:

  1. I would like to have the cmucam5 follow a red laser dot.
  2. Then, for a random object of uniform color: when the red laser dot has been pointing at it for a few seconds, I would like the Pixymon program to set a signature for the uniformly colored object (much like clicking and dragging a square around the object in Pixymon with the mouse, but done automatically from the Arduino code).

First, is this possible by calling the Pixymon program from within the code (preferably Arduino code) and modifying Pixymon’s tasks/functions so it does what you want?

I am familiar with the Arduino, and I would like to integrate a wheeled robot into this project so it can find, track, and approach the object that the red laser dot is pointing at…

Any help/suggestions would be awesome!!!

Hi J,

Sounds like an interesting idea, although there are a couple of problems with it:

  • Unfortunately, you can’t yet train Pixy via Arduino; it must be done with Pixymon
  • If you trained Pixy on the colored object while the red laser was pointing at it, you’d also be training on the laser’s color, which would give you an inaccurate color description of the original object

What exactly do you mean by “modifying the tasks/functions of Pixymon”? I’m not sure I’m following you here.

Let us know if you have any other questions.

Scott

Hey Scott,

Thanks for the feedback. What I wanted to do was have the camera

  1. follow the red laser dot, and then
  2. recognize that the red dot has been “sitting” on a uniformly colored object for a few seconds.
  3. The red laser will then turn off after those few seconds, and then
  4. somehow, with Arduino or something else, recognize and “signaturize” the object that the red dot was “sitting” on.

I thought there would be a way to use commands to open and interact with the PixyMon program in order to train on a colored object.

Any suggestions would be great.

-J

Hi J,

I’m guessing this means you’ll be using the pan/tilt module?

So if you were able to train Pixy on the colored objects beforehand, this could work. Maybe while the red laser is on the object, you could detect which other blocks surround the laser-detected block. Or, when the laser-detected block disappears, you could trigger on whichever detected block is most dominant in the image.
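Just to make that second idea concrete, a rough Arduino sketch might look something like this — it assumes you’ve already trained the red laser as signature 1, and everything else is a placeholder to adapt:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const int LASER_SIG = 1; // assumption: the red laser dot was trained as signature 1

void setup()
{
  Serial.begin(9600);
  pixy.init();
}

void loop()
{
  uint16_t n = pixy.getBlocks();
  bool laserSeen = false;
  int biggest = -1; // index of the largest non-laser block

  for (uint16_t i = 0; i < n; i++)
  {
    if (pixy.blocks[i].signature == LASER_SIG)
      laserSeen = true;
    else if (biggest < 0 ||
             (long)pixy.blocks[i].width * pixy.blocks[i].height >
             (long)pixy.blocks[biggest].width * pixy.blocks[biggest].height)
      biggest = i;
  }

  // once the laser block disappears, report the most dominant remaining block
  if (!laserSeen && biggest >= 0)
  {
    Serial.print("Dominant block, signature ");
    Serial.println(pixy.blocks[biggest].signature);
  }
}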

Does this make sense? It’s possible that we’ll allow you to train Pixy programmatically in the near future, but nothing is in the works yet. Let me know if you have any other questions or ideas.

Scott

Hey Scott,

That does sound like a good idea, but I kinda wanted my cmucam5 (with the pan/tilt module) to “signaturize” a random object so that in the future (when I mount it on a vehicle) it can retrieve the random object I point the laser at, you know what I mean?

Is there any way I can have the cmucam5 open the Pixymon program after it recognizes that the laser dot has been sitting on the random object for, let’s say, 4 seconds, so I can manually “signaturize” the random object? Then I could have the mounted cmucam5 move toward the object. If so, which is the best way to code this: OpenCV, Arduino, or Matlab?

Something like this…

If (redlaserdot within 20x20 pixel boundary for 4 seconds)
    open Pixymon;
Else if (redlaserdot detected and moving)
    follow redlaserdot;
Else
    center cmucam5;

Thanks,
J

Yes, I know what you mean. I was just suggesting other possibilities since we don’t natively support automatic signature creation.

So you’re suggesting that you want to automatically open Pixymon on the desktop when the laser is detected for 4 seconds? And then you’d manually create the new signature? On the desktop you could interface with “libpixyusb”:https://github.com/charmedlabs/pixy/tree/master/src/host/libpixyusb and create a program to check if the laser is static for 4 seconds. Then the program could programmatically open Pixymon. Is that what you’re looking for?

Good luck.

Scott

Hey Scott,

Yes! That is exactly what I wanted to do! Is what you mentioned even possible to program? I am new to open-source projects like this; I have only used the Arduino to build a line-following robot in the past, nothing this complicated. How would I go about starting something like this using the cmucam5 (w/ pan/tilt module) and PixyMon? Should I be using the Arduino IDE, or something like OpenCV, to program this?

(copied from above):

If (redlaserdot within 20x20 pixel boundary for 4 seconds)
    open Pixymon;
Else if (redlaserdot detected and moving)
    follow redlaserdot;
Else
    center cmucam5;

Any help would be awesome,

-J

Hi,

You would probably want to start programming this in the Arduino IDE. It might be a bit difficult as a second project, so I’d suggest breaking your idea up into its different parts and working on each as a separate project at first, before integrating them.

So, first I’d suggest writing a program to follow the red dot with the pan/tilt module. We have some “example code”:https://github.com/charmedlabs/pixy/blob/master/src/host/arduino/libraries/Pixy/examples/pantilt/pantilt.ino that does this, which you can modify to your needs.
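If it helps, a stripped-down version of the same idea looks something like this — the real pantilt.ino uses a proper PD servo loop, so treat the fixed gain here (and treating the first block as the dot) as simplifications:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const int X_CENTER = 160; // Pixy block coordinates span a 320x200 frame
const int Y_CENTER = 100;

int32_t panPos  = PIXY_RCS_CENTER_POS; // servo positions, roughly 0..1000
int32_t tiltPos = PIXY_RCS_CENTER_POS;

void setup()
{
  pixy.init();
}

void loop()
{
  if (pixy.getBlocks())
  {
    // nudge the servos a little toward the first detected block each frame
    int32_t panError  = X_CENTER - pixy.blocks[0].x;
    int32_t tiltError = pixy.blocks[0].y - Y_CENTER;
    panPos  = constrain(panPos  + panError / 4, 0, 1000);
    tiltPos = constrain(tiltPos + tiltError / 4, 0, 1000);
    pixy.setServos(panPos, tiltPos);
  }
}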

On the desktop side, you can use libpixyusb to track the movement of the dot. You might want to add a timer that tracks how long it has been since the dot last moved, and use it to trigger the opening of Pixymon.

Good luck.

Scott

Hey Scott,

So what I did so far was use the RedBot kit and connect it to the Pixy (cmucam5) via the ICSP port.

All I want to do right now is have the cmucam5 turn the RedBot’s wheels so it can follow the object.

Something like…

say the target size of the object is 5x5 through the lens of the camera…

if the object looks bigger than 5x5 through the lens, then
    move wheels backward,
else if the object looks smaller, then
    move wheels forward,
else if the object is on the right side of the camera frame, then
    turn right until centered,
else if the object is on the left side of the camera frame, then
    turn left until centered,
else
    don’t do anything.

I know that the Serial Monitor in the Arduino IDE prints the object’s characteristics, like size and such, but how do I access those parameters so that I can write these if-conditions based on the distance and position of the object relative to the camera? Or what is the best way to go about this?

-J

J,
This is what I did to get the first learned color (checking if pixy.blocks[0].signature == 1).
Hope this helps.

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;          // create an instance of Pixy
bool found = false; // set when the learned object is seen
int object_x, object_y, object_width, object_height;

void setup()
{
  Serial.begin(9600);
  pixy.init(); // initialize the Pixy camera
}

void loop()
{
  char buf[32];
  uint16_t blocks = pixy.getBlocks(); // number of detected blocks
  if (blocks) // if at least one object is found
  {
    sprintf(buf, "Detected %d:\n", blocks);
    Serial.print(buf);
    if (pixy.blocks[0].signature == 1) // blocks[0] is the largest block; signature 1 is the first learned color
    {
      found = true;
      object_x = pixy.blocks[0].x;
      object_y = pixy.blocks[0].y;
      object_width = pixy.blocks[0].width;
      object_height = pixy.blocks[0].height;
    }
  }
}
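And to tie this back to your wheel logic, here is a rough sketch of how those fields could drive your if-conditions. The motor functions are hypothetical stubs — swap in whatever your RedBot motor code provides — and the thresholds are guesses to tune:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const int TARGET_WIDTH = 50; // desired apparent width in pixels; tune to taste
const int X_CENTER = 160;    // Pixy block coordinates span a 320x200 frame
const int DEADBAND = 20;     // how far off-center still counts as centered

// hypothetical stubs — replace with your RedBot motor calls
void driveForward()  { /* wheels forward */ }
void driveBackward() { /* wheels backward */ }
void turnLeft()      { /* pivot left */ }
void turnRight()     { /* pivot right */ }
void stopWheels()    { /* brake */ }

void setup()
{
  pixy.init();
}

void loop()
{
  uint16_t n = pixy.getBlocks();
  if (n && pixy.blocks[0].signature == 1) // signature 1 = the trained object
  {
    int w = pixy.blocks[0].width;
    int x = pixy.blocks[0].x;
    if (w > TARGET_WIDTH + 10)        driveBackward(); // looks too big: back up
    else if (w < TARGET_WIDTH - 10)   driveForward();  // looks too small: approach
    else if (x > X_CENTER + DEADBAND) turnRight();     // object right of center
    else if (x < X_CENTER - DEADBAND) turnLeft();      // object left of center
    else                              stopWheels();    // sized and centered
  }
  else
    stopWheels(); // nothing (or the wrong thing) in view
}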

Wow, thanks!!! This code was very helpful. Before, I had tried extracting the “changing variable data” (x, y) from the .c and .h library files, but the code you provided seems more efficient and is exactly what I need.

Thanks,
J