
Snapshot with libpixyusb?

It has been stated in the forum that the libpixyusb API allows one to obtain a still image from the camera, but I can’t find anything in the libpixyusb API reference documentation that indicates how that might be done. Can someone provide or post a simple example, along the lines of hello_pixy?

Second question: can one set the exposure time? If not, what are the meaning and effects of the two parameters (gain and compensation) in the set_exposure_compensation call in the libpixyusb API?

Thanks in advance! Jim Remington

Regarding the first question: I think it’s been discussed and it’s possible. I don’t have a ready-to-run example, but I’ll ask around.

Second question: gain is the analog gain for the exposure, and compensation is the exposure period. This is sort of an advanced feature to play with, though. You’ll need to disable AEC before you can change these values.
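In libpixyusb that looks roughly like the snippet below. I’m going from memory on the exact function names, so check them against your copy of pixy.h, and the gain/compensation values here are just placeholders:

#include <stdio.h>
#include "pixy.h"

int main(void)
{
  int ret = pixy_init();
  if (ret != 0)
  {
    pixy_error(ret);
    return ret;
  }

  // Auto exposure compensation has to be off first, otherwise Pixy
  // will keep overwriting whatever values you set.
  pixy_cam_set_auto_exposure_compensation(0);

  // gain = analog gain, compensation = exposure period
  // (the units aren't documented in the imager datasheet)
  pixy_cam_set_exposure_compensation(0x10, 0x200);

  pixy_close();
  return 0;
}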

thanks!

Thanks, Rich:

The ability to grab a frame is really, really basic for any camera module, and as a glance through the forum suggests, many other people would appreciate a simple example. Adding an appropriate function call to libpixyusb would vastly increase the potential for Pixy applications and would undoubtedly boost module sales.

Yes, set_exposure_compensation is an “advanced feature” but the ability to set an exposure (and to understand what that setting means) is essential for quantitative imaging applications. To evaluate the effects of changing those parameters, I need to be able to grab a frame. It would be terrific if you or someone else could provide example code!

Incidentally, I’m planning to use the Raspberry Pi for this and already have everything else working.

Cheers, Jim

Well… agreed… sorta.

Bear in mind that Pixy isn’t meant to be used as a camera – the image processing is meant to be done onboard and only the results are communicated. If you’re comparing Pixy to a webcam you’ll notice that webcams are cheaper and probably better at streaming video. Pixy’s usefulness is the “vision as a sensor” piece.

Can you describe your application? Maybe your application requires something special about Pixy (when compared to a webcam).

thanks!

I am a physicist interested in educational outreach. Right now I want to evaluate the Pixy’s potential for acquiring and quantitatively evaluating optical spectra, as described for this remarkably simple, useful and easy to make instrument: http://publiclab.org/wiki/spectrometer

The USB cameras described in those pages are incapable of exposure control, which makes it nearly impossible to quantitatively measure pollutants, etc. It appears that Pixy does offer exposure control, which would eliminate that difficulty. If I can make that work, the people at Public Lab would probably be very interested in buying and/or recommending purchase of the Pixy.

I agree that the Pixy is terrific for blob following! However, it doesn’t make sense to me to recommend that customers use the Pixy in only certain ways. It is to everyone’s advantage if people can think of new (or old) uses for it!

Best regards, Jim Remington

Rich:

Here is one way to take good advantage of the Pixy’s unique and powerful onboard processor in the application mentioned above. I would define a region of interest, for example a 1000x50-pixel rectangle that spans the spectrum image; the processor would then divide that region into uniform segments and return (say) 100 integrated intensities. Naturally, one would have to disable auto white balance and have defined exposures.

Furthermore, I would like to do dark background subtraction. Take an image with the input slit covered and subtract that from the spectrum image before integrating the intensities.

That way an Arduino or other tiny processor could be used to evaluate and/or compare spectra in the field. As it is now, such operations require a desktop or laptop computer or a smart phone running a substantial application.
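To make that concrete, here is a rough C sketch of the arithmetic I have in mind. The strip and dark arrays stand in for whatever a frame grab would return; the sizes and segment count are just the examples above:

#include <stdint.h>

#define STRIP_W 1000   // width of the region of interest (pixels)
#define STRIP_H 50     // height of the region of interest (pixels)
#define NSEG    100    // number of integrated intensities to report

// strip[] is the spectrum image, dark[] is the image taken with the
// input slit covered; both are STRIP_W*STRIP_H grayscale pixels in
// row-major order.
void integrate_spectrum(const uint8_t *strip, const uint8_t *dark, uint32_t out[NSEG])
{
  int seg_w = STRIP_W / NSEG;   // columns per segment

  for (int s = 0; s < NSEG; s++)
  {
    uint32_t sum = 0;
    for (int y = 0; y < STRIP_H; y++)
    {
      for (int x = s * seg_w; x < (s + 1) * seg_w; x++)
      {
        // pixel-by-pixel dark background subtraction, clamped at zero
        int v = (int)strip[y * STRIP_W + x] - (int)dark[y * STRIP_W + x];
        if (v > 0)
          sum += (uint32_t)v;
      }
    }
    out[s] = sum;
  }
}

An Arduino would only need to store the 100 sums, not the whole strip.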

Incidentally, the ready availability of interchangeable M12 objectives (with various focal lengths, possibly also with various bandpass filters) could be extremely useful for the proposed application.

I have other ideas for scientific image processing on a Pixy, too. What do you think?

Best regards, Jim

Hi Jim,
Thanks for the info— very cool!

So with Pixy you can disable auto-exposure and white-balance and set them to fixed values. Have you tried doing experiments in PixyMon? You can disable white balance, AEC, etc:
http://cmucam.org/projects/cmucam5/wiki/Camera_Pane

And you can grab frames:
http://cmucam.org/projects/cmucam5/wiki/How_to_Grab_a_Frame

It might be good enough for initial experiments… The specifics of the exposure controls, etc., aren’t well documented in the imager’s datasheet, which is frustrating for everyone. Adding to the frustration, the datasheet requires an NDA with Omnivision, so that’s pretty lame. Anyway, I love these kinds of applications; I’ll try to help as much as I can.

Hi, Rich:

My version of PixyMon (2.0.4) does not seem to offer access to the set_exposure_compensation parameters, so there is no obvious way to experiment with that feature. When I disable auto exposure compensation, the results seem hit or miss (sometimes I just get black frames).

I prefer to work with command line applications within linux, so if you could post a simple example of a frame grabber in C/C++, I can take it from there.

Thanks, Jim

When you disable AEC from PixyMon, Pixy saves the current exposure settings. Black frames sound like some kind of issue; if you have steps to reproduce, please share.

There are also the commands cam_setAEC, cam_getECV, and cam_setECV, which you can access from the command window (press stop, then type “help cam_setAEC”, etc.).
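The same commands are reachable from libpixyusb through pixy_command(). Something along these lines should work if you drop it into hello_pixy after pixy_init(); I haven’t run it, and the 0x01 type code for the 8-bit argument just mirrors the frame-grab examples floating around the forum:

int32_t response = 0;
int ret;

// turn auto exposure compensation off (argument: 1 = on, 0 = off)
ret = pixy_command("cam_setAEC", 0x01, 0, END_OUT_ARGS, &response, END_IN_ARGS);
printf("cam_setAEC returned %d response %d\n", ret, response);

// read back the current exposure compensation value (ECV)
ret = pixy_command("cam_getECV", END_OUT_ARGS, &response, END_IN_ARGS);
printf("cam_getECV returned %d ECV 0x%x\n", ret, (unsigned)response);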

Thanks. I’m trying to puzzle out the effects of changing ECV values and the results aren’t making much sense. cam_getECV might return a number like 104434, and changing that (with AEC off, using cam_setECV) in either direction by more than about 100 units or so generally results in dimmer images. Has anyone been able to make sense out of the ECV value?

I am running into dead help links. For example, the “here” link is dead in the following help response: “It is recommended to enable overexposure highlighting to prevent overexposure of the objects you wish to detect. There is more information here.”

Can these links be fixed?

Finally, is there a way to get/save frames from PixyMon in full native camera resolution?

Hi Jim,
The ECV value packs two numbers: the 8 least-significant bits are the analog gain and the 16 most-significant bits are the actual exposure value (although the units aren’t stated in the imager datasheet).

So for example:

cam_getECV
response: 104696 (0x198f8)
cam_setECV 0x2500f8
response: 0 (0x0)

Here I left the lower 8 bits alone. Playing around with these, the exposure and gain values seem very non-linear (logarithmic, I’m guessing). When you crank the analog gain up you can see the noise increase, so it follows intuition at least.
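In code, splitting and re-packing the value looks like this (the exposure field is treated as an opaque number, since the units aren’t documented):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
  uint32_t ecv = 0x198f8;              // e.g. what cam_getECV returned

  uint8_t  gain     = ecv & 0xff;      // low 8 bits: analog gain
  uint32_t exposure = ecv >> 8;        // upper bits: exposure value

  printf("gain = 0x%02x, exposure = 0x%x\n", gain, exposure);

  // pack a new exposure in front of the same gain before calling cam_setECV
  uint32_t new_ecv = (0x2500u << 8) | gain;   // gives 0x2500f8
  printf("new ECV = 0x%x\n", new_ecv);

  return 0;
}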

Regarding saving a full frame – Pixy doesn’t have enough contiguous free memory (only 128K) to grab a full frame.

Sorry – where was the dead link?

thanks!

Dead link: http://cmucam.org/cmucam5/wiki/Some_Tips_on_Generating_Color_Signatures_2

(On the page http://cmucam.org/projects/cmucam5/wiki/Signature_Tuning_Pane, at the bottom under “Camera brightness”, click “here”. There are other dead links like that one.)

Bummer about the limited memory and inability to save an entire frame. That makes it much, much more difficult to implement my ideas about internal processing of spectra, especially the dark background subtraction.

Thanks!

There’s some more background on Pixy and memory in this discussion:
http://cmucam.org/boards/9/topics/4451?r=4527#message-4527

So it sounds like you need to save a full-resolution frame to make this work?

Rich:

Thanks for the link to that thread! I missed it earlier and if I had known about it, I would not have had so many questions.

I might be able to get away with a strip of, say, 1280 x 40. However, dark background subtraction must be done on a pixel-by-pixel basis, so whether lower resolutions are useful depends on how they are grabbed.

When one grabs a 640x400 or 320x200, are the intermediate pixels skipped, averaged or something else?

What is the difference between the two modes in cam_getFrame (0 and 1, as in M0R0, M1R1)?

Thanks again for your help.

Right, so this is an undocumented/experimental feature. The MxRy naming means this:

Mx, or “mode x” where x is either:
0: full resolution mode (1280x800) at 25 frames/sec
1: quarter resolution mode (640x400) at 50 frames/sec – the imager bins/averages the pixels

Ry, or “resolution y” where y is either:
0: 1280x800 resolution no pixel binning/averaging
1: 640x400 resolution no pixel binning/averaging (only available in mode 1)
2: 320x200 resolution with pixel binning/averaging (also only available in mode 1)

So you’ll probably be interested in M0R0. Just make sure that width*height doesn’t exceed 73728 bytes (72K) — this is the video memory buffer.
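If it helps, here’s a trivial check you can run against any request before sending it: width*height against the 72K buffer, at one byte per pixel:

#include <stdint.h>
#include <stdio.h>

#define PIXY_FRAME_BUF 73728   // 72K video memory buffer

static int frame_fits(uint32_t width, uint32_t height)
{
  return width * height <= PIXY_FRAME_BUF;   // 1 byte per pixel
}

int main(void)
{
  printf("320 x 200:  %s\n", frame_fits(320, 200)  ? "fits" : "too big");  // 64000 bytes
  printf("1280 x 40:  %s\n", frame_fits(1280, 40)  ? "fits" : "too big");  // 51200 bytes
  printf("1280 x 800: %s\n", frame_fits(1280, 800) ? "fits" : "too big");  // 1024000 bytes
  return 0;
}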

thanks!

I’m making some progress starting with low-resolution frames, but Pixy is returning a frame of all zeros. Below is the code I’m using, cribbed from various forum posts. Is something else required other than initializing the Pixy? I’ve seen references to “start”/“stop” or “run” commands. I believe, but am not certain, that AWB and AEC are left at their defaults. PixyMon returns a fine image of my workbench, of course.

As always, help is much appreciated!

The current program output, plus an 8x8 pixel block from about the center of the frame, is as follows:

pi@raspberrypi:~/pixy/build/hello_pixy$ sudo ./hello_pixy
initialized Pixy - 0
getFrame return value 0 response 0
returned w 320 h 200 npix 64000
 average pixel value 0
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
#include <stdio.h>
#include <stdlib.h>
#include <signal.h>
#include <unistd.h>
#include <stdint.h>
#include "pixy.h"


void handle_SIGINT(int unused)
{
  // On CTRL+C - abort //

  printf("\nBye!\n");
  exit(0);
}

int main(int argc, char * argv[])
{
  int      index;
  int      blocks_copied;
  int      pixy_init_status;

  // Catch CTRL+C (SIGINT) signals //
  signal(SIGINT, handle_SIGINT);

  // Connect to Pixy //
  pixy_init_status = pixy_init();
  printf("initialized Pixy - %d\n", pixy_init_status);

  // Was there an error initializing pixy? //
  if(pixy_init_status != 0)
  {
    // Error initializing Pixy //
    printf("pixy_init(): ");
    pixy_error(pixy_init_status);
    return pixy_init_status;
  }


  // getFrame Example //
  {
    unsigned char pixels[72000]={0xff};

    int32_t response, fourcc;
    int8_t renderflags;
    int return_value, res;
    uint16_t width, height;
    uint32_t  numPixels;

    response = 0;
    return_value = pixy_command("cam_getFrame",  // String id for remote procedure
                                 0x01, 0x21,      // mode
                                 0x02,   0,        // xoffset
                                 0x02,   0,         // yoffset
                                 0x02, 320,       // width
                                 0x02, 200,       // height
                                 0,              // separator
                                 &response,      // pointer to mem address for return value
                                &fourcc,  //for some reason these 5 args are needed, contrary to the docs
                                &renderflags,
                                &width,
                                &height,
                                &numPixels,
                                 &pixels,        // pointer to mem address for returned frame
                                 0);

    fprintf(stderr,"getFrame return value %d response %d\n", return_value, response);
    printf("returned w %d h %d npix %d \n",width,height,numPixels);

// check success:
    if(return_value != 0) return return_value;


// display 8x8 block
   unsigned int i,j,ind,start;

   unsigned long avg=0;
   for(i=0; i<numPixels; i++) avg += pixels[i];
   avg = avg/numPixels;
   printf(" average pixel value %d \n",avg);

   start=100*320+160; //roughly in middle of frame
   for (i=0; i<8; i++) {
        for (j=0; j<8; j++) {
        ind=i*width+j+start;
        printf(" %02x",pixels[ind]);
        }
    printf("\n");
    }
    // Loop forever, sleeping 1/10 sec; exit with CTRL+C //
   while(1) usleep(100000);
  }
}

Hi Jim,
You should send the “stop” command (shown below) before grabbing frames, because Pixy is running its blob detection program by default and you’ll step on its internal frame grabs:

return_value = pixy_command("stop", END_OUT_ARGS, &chirp_response, END_IN_ARGS);

When you’re done you can start the program again:

return_value = pixy_command("start", END_OUT_ARGS, &chirp_response, END_IN_ARGS);

But this isn’t totally necessary because Pixy will resume the program when it loses connection with your program.

thanks!

Thanks for the hint! I added a stop, which appeared to succeed. However, I still get all zeros in the frame. The lens cap is off and PixyMon works. Snippet:

    return_value = pixy_command("stop", END_OUT_ARGS, &response, END_IN_ARGS);   
    printf(" STOP returned %d response %d\n", return_value, response);
    response = 0;
    return_value = pixy_command("cam_getFrame",  // String id for remote proced$

program output:

pi@raspberrypi:~/pixy/build/hello_pixy$ sudo ./hello_pixy
initialized Pixy - 0
 STOP returned 0 response 0
getFrame return value 0 response 0
returned w 320 h 200 npix 64000
 average pixel value 0
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00
 00 00 00 00 00 00 00 00

Ah, sorry, you need to change your line

unsigned char pixels[72000]={0xff};

to

unsigned char *pixels;

and then pass the pointer to this pointer into pixy_command (like you have now). Libpixyusb will just give you the pointer to the video frame memory to avoid the copy.
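So the cam_getFrame portion of your example becomes (same arguments as before; only the pixels declaration changes):

    unsigned char *pixels;   // no local buffer; libpixyusb hands back its own

    return_value = pixy_command("cam_getFrame",  // String id for remote procedure
                                 0x01, 0x21,     // mode
                                 0x02,   0,      // xoffset
                                 0x02,   0,      // yoffset
                                 0x02, 320,      // width
                                 0x02, 200,      // height
                                 0,              // separator
                                 &response,      // return value
                                 &fourcc,
                                 &renderflags,
                                 &width,
                                 &height,
                                 &numPixels,
                                 &pixels,        // receives a pointer into libpixyusb's frame memory
                                 0);

pixels then points at libpixyusb’s internal frame buffer rather than memory you own, which is why no copy happens.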

Great, that works fine! I assumed that the space I had allocated would be used.
Where and how does libpixyusb allocate frame memory? I looked through the source code, and can find nothing that is obviously related to cam_getFrame.