Mindsensors sells a look-alike, but with an adapter instead of just the cable. They have their own EV3 blocks and also support RobotC. Should it be possible to use that library with the one from RobotShop?
Is it possible, with your EV3 block, to use the camera for lane keeping, i.e., to follow a black part of the road by detecting white lines to the left and right? It seems your EV3 block only returns the position of the largest block?
Because the LEGO block exists, it should be possible to use it with RobotC. All we need is a description of the protocol (I think it's I2C, but we need a description of the registers). I hope it will be described soon in the wiki.
The Mindsensors adapter and EV3 block are not associated with our LEGO version. Their EV3 block is (I assume) incompatible with our version.
Right now, line following with, say, a black line on a white background is not supported. Pixy currently uses color cues, but we get requests for this kind of algorithm (line following, line segmentation) every now and then. I'll log this as another vote. (Thanks!)
Regarding the protocol, we'll write up some docs on our LEGO I2C protocol in the coming weeks. In the meantime, the lego_getData() function is a good reference for the protocol:
If you look at the code you’ll see there are 4 different queries:
0x50: general mode
0x51-0x57: signature mode
0x58: get specified color code
0x60: get angle (applies only to color codes)
Thanks for the reply. Just to clarify: do you mean that line following is not supported at all with the Pixy, and that I should go with the NXTCam if that's what I need? It confuses me a little, since Mindsensors' EV3 block is said to include the line-tracking part (which was built for the NXTCam but is said to be fully compatible with Pixy). I have asked them the same question but haven't received a reply yet. Thanks in advance.
I.e., is it not possible to train the Pixy on the lines?