Hello everyone,
I am using the FX3 to stream image data, and I referred to the AN75779 application note for the UVC interface.
I would like to know how a host application can receive RAW14 format data.
Questions:
- How is RAW14 format data transferred over USB (i.e., what does the USB packet structure look like)?
- How should I modify the AN75779 example source?
Diagram of my project's data path:
FPGA <=(GPIF)=> FX3 <=(UVC over USB)=> PC
Please let me know how to solve the problem.
Thanks for your help.
Best regards,
Ted Lee.
Hello Ted Lee,
Since the application you are working on is a UVC application, RAW14 data cannot be streamed directly through the UVC driver. However, you can stream the RAW14 video packaged as YUV (YUY2) format.
For this, the modifications needed are:
- The GPIF state machine should be modified to use a 16-bit bus width. Since the sensor output is 14 bits, the remaining 2 bits per sample will be non-image (padding) data, which is sampled along with the pixel and can later be removed on the host application side.
- In the descriptors, the video frame size should be set as shown below.
Assume that you are streaming 1920x1080 resolution at 30 fps in RAW 14 format with the following values:
a. Width in pixel : 1920
b. Height in pixel : 1080
c. Bits per pixel : 14
d. Frames per second : 30
e. Frame size : 1920 x 1080 x 14 bits
f. Bit rate : 1920 x 1080 x 14 x 30 bits per second
With this setting, the GPIF II interface is configured as a 16-bit parallel data interface for the 14-bit sensor output, so each pixel carries 2 bits of padding, which increases the frame size. The above calculations change as follows:
a. Width in pixel : 1920
b. Height in pixel : 1080
c. Bits per pixel : 16 (due to the 2 padding bits per pixel)
d. Frames per second : 30
e. Frame size : 1920 x 1080 x 16 bits
f. Bit rate : 1920 x 1080 x 16 x 30 bits per second
Similar changes (frame size and bit rate) also need to be made in the probe control structure.
Please let me know if you have any queries on this.
Regards,
Rashi