
Question regarding the libraries and the firmware

Hi everyone! First of all, I am extremely new to this programming thing, so maybe this question is the most stupid one you guys have heard in decades, but I need to ask it :slight_smile:

I would like to make some changes to the camera in order to recognize shapes (I think that is feasible, but since I have just some knowledge of C++ I don't know if that is realistic for me). I would like to work with Matlab for image processing, but since OpenCV cannot be uploaded to the camera I don't know if that is possible. Does anyone know anything about this?

I also wanted to know if I could call the functions in the reference (http://charmedlabs.github.io/pixy/pixy_8h.html#ad8ff0e513bae5acec82c6a7e0e21685f) from my Arduino in order to change the camera parameters automatically from the microcontroller (I will mount everything on a robot, and it is kind of stressful to have to connect the camera each time I want to change a lighting or signature parameter).

Sorry for the (probably) dumb questions, and I hope to hear some advice!

Thanks in advance

Roberto Laso

Hello Robert,

If you want to use Matlab, you can save raw images from Pixy and then use Matlab to process them.

http://cmucam.org/projects/cmucam5/wiki/How_to_Grab_a_Frame

There is some code there with examples of how to read png files into Matlab and process them.


Regarding your link (http://charmedlabs.github.io/pixy/pixy_8h.html#ad8ff0e513bae5acec82c6a7e0e21685f): that is for libpixyusb. The Arduino/serial protocol is deliberately simpler, to keep the communications library lightweight and easy to port.
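For example, with the standard Arduino Pixy library you can tweak a couple of parameters straight from your sketch. A minimal sketch of the idea; setBrightness() and setLED() are the calls I'm aware of, so double-check them against your library version:

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;  // talks to Pixy over the default SPI link

void setup()
{
  pixy.init();
  pixy.setBrightness(120);  // camera brightness, 0-255
  pixy.setLED(0, 255, 0);   // Pixy's front RGB LED: green
}

void loop()
{
  // normal block detection keeps working alongside the parameter calls
  uint16_t blocks = pixy.getBlocks();
  if (blocks)
  {
    // pixy.blocks[0].signature, .x, .y, .width, .height ...
  }
}
```

Note that teaching new signatures still has to happen in PixyMon; the serial protocol only exposes a few runtime parameters like these.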

Edward

Thanks for the answer Edward, that was helpful. I was thinking about using Matlab because I wanted to implement some changes in the firmware for traffic lights and signals and use that on-board. Is that realistic? Is there a way I could use Matlab for real-time applications on board?

Thanks in advance

Hi Robert.

Before diving into programming the Pixy firmware, I'd recommend implementing your algorithms in Pixymon. Then you can get a better feel for what you're working with as well: what kind of noise you're picking up, how much data you need to process, etc. Remember, you're limited by the LPC4330 CPU and 1MB of RAM on the Pixy, which can't handle everything you throw at it.

Programming Pixymon to do your bidding is WAY easier than programming the Pixy itself, in my opinion/experience anyway (not to mention that to debug your firmware code you'll probably have to get a debugger/emulator (I'm using an LPC-Link2) and solder a JTAG header onto your Pixy).

I'm not familiar with Matlab, but if you know, and thoroughly understand, the principles behind your algorithms, then programming them shouldn't be an impossible task :slight_smile:

I've modified Pixymon myself, and am currently trying to implement my algorithms in the Pixy firmware, but I haven't made much progress there yet.
See this thread for a short intro to my modified Pixymon and what I did with it: http://www.cmucam.org/boards/9/topics/5308 (bear in mind, I'm an amateur, not a charmedlabs dev).

I hope this was helpful. Good luck!

  • Sondre

PS: If you have some questions about how to modify pixymon, I may be able to help you, but I’m really busy nowadays, so I can’t promise a swift response :slight_smile:

Hi Sondre, that was really helpful. I also took a look at your videos (and I subscribed ;)) and they are cool; I find what you guys are doing really interesting, keep going! About modifying PixyMon: I wanted to use it on-board in the car, meaning that I would need a laptop, and that is not ergonomic, but… anyway, I wanted a little bit of guidance for making my own algorithms. Which tools did you use to modify PixyMon? How difficult would it be for an almost-newbie in programming?
Thanks Sondre, and congratulations again!

Thank you Robert! :slight_smile: Glad you like them!

Pixymon is developed with Qt Creator; I used the free version, which is more than sufficient. Here's a link: http://www.qt.io/download-open-source/
When installing, make sure to also install the MinGW compiler (you'll be prompted during the install).

The language used with Qt is C++, but pixymon uses the Qt libraries for its types (QString instead of std::string, QWidgets for UI elements, emit & signals for event handling and threading, etc.; the Qt library presumably exists to make cross-platform compiling easier). That's not important for now, just be aware of what variable types you're working with.
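Just to show what I mean by the "Qt flavor" of C++, here's a tiny made-up example (the Counter class and progress signal are my own invention, not anything from pixymon):

```cpp
#include <QCoreApplication>
#include <QDebug>
#include <QObject>
#include <QString>

// Made-up class showing QString and the signal/emit mechanism
class Counter : public QObject
{
    Q_OBJECT
public:
    void count(int to)
    {
        for (int i = 1; i <= to; ++i)
            emit progress(QString("at %1 of %2").arg(i).arg(to)); // QString, not std::string
    }
signals:
    void progress(const QString &msg); // body is generated by Qt's moc tool
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    Counter c;
    QObject::connect(&c, &Counter::progress,
                     [](const QString &m) { qDebug() << m; });
    c.count(3);
    return 0;
}

#include "main.moc" // needed when a Q_OBJECT class lives in main.cpp
```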

I will now quickly walk you through part of the Renderer class (renderer.cpp) from my modified pixymon; the line numbers specified will correspond to this file: https://github.com/Pravuz/Pixymon_modified/blob/master/host/pixymon/renderer.cpp
Note that modifying the renderer class is not an elegant solution (the elegant solution would be to make a new module based on monmodule.cpp, but never mind that for now).
If you're completely green in regards to programming, this might be a bit too advanced, and I would recommend checking out this beginner's intro to programming in C++ (https://www.youtube.com/playlist?list=PLF9B0522C7BC3C1C2) or some other C++ YouTube guide.

Beginning with the pixels of each image in the video stream: they are handled in the renderBA81 method (line 296).

Lines 336 and 337 are part of my algorithm, which again isn't too important for now, but you should compare my version of renderBA81 with the original
(https://github.com/charmedlabs/pixy/blob/master/src/host/pixymon/renderer.cpp, line 198).

Regardless, each pixel is handled individually within the nested for-loops. The first order of business is the interpolation (line 332), which is of no concern; it just has to happen because of how the data from Pixy's image sensor is laid out. After the interpolation, you're left with rgb values for the pixel, which I convert to greyscale (line 338) and use for comparison in order to detect a change (in other words: motion; lines 339 to 354).
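Stripped of all the pixymon plumbing, the gist of those lines is something like this (simplified from memory; the names and threshold handling are mine, not the literal code):

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Per-pixel motion test: convert the interpolated rgb to greyscale and
// compare against the same pixel in the previous frame.
bool isMotion(std::vector<uint8_t> &prevFrame, size_t idx,
              uint8_t r, uint8_t g, uint8_t b, uint8_t threshold)
{
    uint8_t grey = static_cast<uint8_t>((r + g + b) / 3); // cheap greyscale
    bool changed = std::abs(int(grey) - int(prevFrame[idx])) > threshold;
    prevFrame[idx] = grey; // this frame becomes "previous" for the next one
    return changed;
}
```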

The rgb values for each pixel (each an 8-bit integer) are then combined into one 32-bit integer (line 359) and added to the image that will be displayed in the program (via videowidget.cpp).
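In code, that combination is just shifting the three bytes into one word (0xAARRGGBB, with the alpha byte fixed at 0xff):

```cpp
#include <cstdint>

// Pack 8-bit R, G, B into one 32-bit pixel: 0xAARRGGBB
inline uint32_t packRGB(uint8_t r, uint8_t g, uint8_t b)
{
    return (0xffu << 24) | (uint32_t(r) << 16) | (uint32_t(g) << 8) | b;
}
```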

When all the pixels are processed, the image is emitted via the signal called 'image' on line 368. This signal is then picked up by the videowidget.cpp class (that connection is established on line 56).

Now, if you take a look at the header for the renderer class (https://github.com/Pravuz/Pixymon_modified/blob/master/host/pixymon/renderer.h), you can see that I've made a few signals (lines 82 to 85) and slots (which are used by the signals, lines 105 to 109).
What this essentially does is split the work up into different threads, so that the rest of my algorithm doesn't cause the video to lag. So the next part of my algorithm, which is to separate, filter and process the detected objects, is done in the background.
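The trick that makes this threading work is that Qt automatically queues a signal across threads when the sender and receiver live in different threads. Roughly like this (a made-up minimal example of the pattern, not my actual code):

```cpp
#include <QCoreApplication>
#include <QDebug>
#include <QObject>
#include <QThread>
#include <QTimer>

// Heavy work lives in an object that is moved to its own thread;
// slots invoked on it then run in that thread, not the GUI thread.
class Processor : public QObject
{
    Q_OBJECT
public slots:
    void process(int frameNumber)
    {
        // ...slow object separation / filtering would go here...
        qDebug() << "processed frame" << frameNumber
                 << "in" << QThread::currentThread();
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QThread worker;
    Processor p;
    p.moveToThread(&worker); // p's slots now execute in 'worker'
    worker.start();

    // Queued automatically because p lives in another thread; in pixymon
    // the renderer would instead emit a signal connected to this slot.
    QMetaObject::invokeMethod(&p, "process", Qt::QueuedConnection,
                              Q_ARG(int, 42));

    QTimer::singleShot(100, &app, &QCoreApplication::quit); // demo only
    int rc = app.exec();
    worker.quit();
    worker.wait();
    return rc;
}

#include "main.moc"
```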

Hopefully this short intro wasn’t too confusing or discouraging, but I’m happy to assist with what I can, as long as I have time :slight_smile:

Hi Sondre,

Congratulations on the program, it is awesome! I have been busy the last week, but I had time to go through your code (and through tutorials, because I am not that much of a C++ expert) to understand it, and, of course, I have several doubts:

Why do you convert the image into grayscale to compare with the threshold? Couldn't you do it directly with the rgb pixels? And why do you get rid of the ?

I am doing a project about the possibilities that the camera, together with Arduino, might have. The idea of implementing algorithms for edge recognition directly in Pixy is out of reach right now, since I am running out of time on my project, so I will try to implement it in PixyMon (just like you did), regardless of whether it is later suitable for Pixy, because I would get the data directly from PixyMon (is that possible?).

And I have an extra question for you, since you have probably been digging deeper into libpixyusb: I want to use the data from Pixy (not necessarily in real time) for image processing in Matlab. Right now I can only grab frames directly from PixyMon and load them into Matlab, but I wanted something more efficient. Maybe you have an idea of how I could get this data in C++ code for use in Matlab (image or video).

Congratulations again for your work!

Roberto Laso

Thanks :slight_smile:

I'm converting the image to grayscale because I want to be able to track objects of any color. If I were to compare the raw pixels without interpolating the Bayer-filtered data, each pixel would only pick up changes from an object of the same color as that pixel's filter. In other words, a red object would have gaps, and object separation would be a lot harder.
Also, instead of comparing the rgb values separately, converting to grey requires less memory to store the previous frame (one byte per pixel instead of three), as well as less computing power for each comparison, with near-equivalent accuracy.

Edge detection should be quite doable, I think. All you need to do with a matrix containing the image is convert the matrix with a Fast Fourier Transform (FFT), remove all 'frequencies' below a given threshold, and then convert back to a normal image using the inverse transform. What you're left with is an image that only has rgb/grey pixel values where an edge is.
(To make it more clear: if you instead were to remove all 'frequencies' ABOVE a certain threshold, you'd be left with a very blurry image with no edges.)
The threshold value is something you have to experiment with. This kind of transform is used a lot in image compression (JPEG uses the closely related DCT), and the FFT is relatively fast (unsurprising, considering the name, I suppose)…
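To make that concrete, here's a rough sketch of the frequency-domain filtering. I'm using a naive 2D DFT instead of a real FFT to keep the code short, so it's only practical for tiny images, but the high-pass logic is the same (all names here are my own):

```cpp
#include <algorithm>
#include <cmath>
#include <complex>
#include <vector>

using Spectrum = std::vector<std::vector<std::complex<double>>>;
const double PI = std::acos(-1.0);

// Naive 2D DFT (sign = -1) or inverse DFT (sign = +1). O(N^4), so demo
// images only; a real implementation would call an FFT library here.
Spectrum dft2d(const Spectrum &in, int sign)
{
    const int H = in.size(), W = in[0].size();
    Spectrum out(H, std::vector<std::complex<double>>(W));
    for (int u = 0; u < H; ++u)
        for (int v = 0; v < W; ++v) {
            std::complex<double> sum = 0.0;
            for (int y = 0; y < H; ++y)
                for (int x = 0; x < W; ++x) {
                    double ang = sign * 2.0 * PI *
                                 (double(u) * y / H + double(v) * x / W);
                    sum += in[y][x] * std::complex<double>(std::cos(ang), std::sin(ang));
                }
            out[u][v] = (sign > 0) ? sum / double(H * W) : sum;
        }
    return out;
}

// High-pass: zero every frequency closer to DC than 'cutoff' (high indices
// wrap around to negative frequencies), then transform back. The real part
// of the result is an image that is bright mainly along edges.
Spectrum highPassEdges(const Spectrum &img, double cutoff)
{
    Spectrum spec = dft2d(img, -1);
    const int H = spec.size(), W = spec[0].size();
    for (int u = 0; u < H; ++u)
        for (int v = 0; v < W; ++v) {
            int du = std::min(u, H - u); // wrapped distance from DC
            int dv = std::min(v, W - v);
            if (std::sqrt(double(du * du + dv * dv)) < cutoff)
                spec[u][v] = 0.0; // kill the low ('blurry') frequencies
        }
    return dft2d(spec, +1); // back to the spatial domain
}
```

Feed it the greyscale image as the real part of the input, and experiment with cutoff until only the edges survive.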

Getting the data output from pixymon should be doable, but I'm not quite sure how; Google is your friend here :slight_smile:

I haven't dug into libpixyusb actually, so I can't help you there, sorry :\

Hello Sondre, I have downloaded Qt Creator and the modified PixyMon source code from the link you provided.
But I am just starting to learn programming and I have no idea how to compile it on Windows. I have tried following some YouTube videos and other guides, but I was not able to get the result shown in the videos. So how can I test it at home? Could you please advise me?
I would also like to ask everyone the same question; if you have any ideas, please let me know.

Thank you.