How to connect two image sensors to FX3 and stream UVC

Anonymous

I saw FX3 is capable of stereo vision ("3-D") via a demo video.  However, I couldn't find any material showing how to connect two image sensors to GPIF-II.  Is there any material like this available?  I am trying to quickly assess how to connect a TOF sensor (with PCLK, HD/VD, and 8-bit data) with a VGA RGB sensor to FX3. 

Attachments are accessible only for community members.

Hi,

Pre-requisites:

Go through the AN75779 and AN65974 application note firmware before looking into this code. Designing or modifying this firmware will be very hard without an understanding of how to stream video from a single image sensor. We merged the two application note firmwares to come up with a firmware that can stream video from three cameras. Note that this firmware was tested with a Lattice MachXO3L FPGA, an FX3, and three camera modules. The FPGA adds the UVC headers, receives video data from two parallel sensors and from one RGB MIPI sensor, and sends it to the FX3 slave FIFO GPIF interface. The FX3, in turn, sends the video data to three UVC interfaces.

You can use the code example attached to this post. It runs three UVC applications at a time: GPIF thread 0 is used for camera 1, thread 1 for camera 2, and threads 2 and 3 for camera 3.

For your application:

The idea is to use threads 0 and 1 to get video data from the first image sensor, and threads 2 and 3 to get video data from the second image sensor. Use two DMA channels instead of three, and use the DMA callback function 3 design as the callback for both of your DMA channels.
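The two-channel setup above can be sketched with the FX3 SDK's many-to-one DMA API, as in AN75779. This is an untested outline, not the attached firmware: the buffer size, buffer count, USB consumer sockets (endpoints 0x83/0x84), and the callback name are assumptions you would adapt to your own descriptor and GPIF state machine.

```c
/* Sketch only: two MANUAL_MANY_TO_ONE DMA channels, one per image sensor.
 * Requires the Cypress/Infineon FX3 SDK headers; socket and endpoint
 * assignments here are illustrative assumptions. */
#include "cyu3dma.h"
#include "cyu3pib.h"
#include "cyu3usb.h"
#include "cyu3utils.h"

static CyU3PDmaMultiChannel glChHandleSensor1;  /* GPIF threads 0 and 1 */
static CyU3PDmaMultiChannel glChHandleSensor2;  /* GPIF threads 2 and 3 */

/* One callback design (the "function 3" style) shared by both channels. */
extern void MyDmaCallback (CyU3PDmaMultiChannel *ch, CyU3PDmaCbType_t type,
                           CyU3PDmaCBInput_t *input);

void
CreateVideoDmaChannels (void)
{
    CyU3PDmaMultiChannelConfig_t dmaCfg;
    CyU3PMemSet ((uint8_t *)&dmaCfg, 0, sizeof (dmaCfg));

    dmaCfg.size          = 16384;        /* DMA buffer size (assumption)  */
    dmaCfg.count         = 4;            /* buffers per channel           */
    dmaCfg.validSckCount = 2;            /* two GPIF threads per sensor   */
    dmaCfg.dmaMode       = CY_U3P_DMA_MODE_BYTE;
    dmaCfg.notification  = CY_U3P_DMA_CB_PROD_EVENT;
    dmaCfg.cb            = MyDmaCallback;

    /* Sensor 1: GPIF sockets 0 and 1 -> USB BULK-IN endpoint 0x83. */
    dmaCfg.prodSckId[0] = CY_U3P_PIB_SOCKET_0;
    dmaCfg.prodSckId[1] = CY_U3P_PIB_SOCKET_1;
    dmaCfg.consSckId[0] = CY_U3P_UIB_SOCKET_CONS_3;
    CyU3PDmaMultiChannelCreate (&glChHandleSensor1,
            CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE, &dmaCfg);

    /* Sensor 2: GPIF sockets 2 and 3 -> USB BULK-IN endpoint 0x84. */
    dmaCfg.prodSckId[0] = CY_U3P_PIB_SOCKET_2;
    dmaCfg.prodSckId[1] = CY_U3P_PIB_SOCKET_3;
    dmaCfg.consSckId[0] = CY_U3P_UIB_SOCKET_CONS_4;
    CyU3PDmaMultiChannelCreate (&glChHandleSensor2,
            CY_U3P_DMA_TYPE_MANUAL_MANY_TO_ONE, &dmaCfg);
}
```

Each channel is then started with CyU3PDmaMultiChannelSetXfer once streaming begins, and the shared callback commits buffers manually (adding or checking UVC headers) exactly as the single-camera AN75779 callback does.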

Regards,

Savan
