USB superspeed peripherals Forum Discussions
Hi,
I'm using the CYUSB3KIT-003 UVC project with an OV5640. The GPIF II interface is shown in 'gpif.png', and its state machine is unchanged.
The OV5640 I2C probe and initialization code was copied from my old CY7C68013A project, which works fine. The waveform is shown in 'ov5640.png'.
I get an incorrect waveform using the same I2C register configuration (see SensorInit.txt from my code).
Over I2C I configure the OV5640 to stream raw 640x480 Bayer data. The USB descriptors (cyfxuvcdscr.c) are shown in 'usb.txt'.
But I see a black image in the Windows Camera application.
I captured the USB communication with Wireshark; the 16 KB transfers contain all 0xFF values (see 'cap.png' and 'uartlog.txt').
According to "001-92220_AN75779_How_to_Implement_an_Image_Sensor_Interface_with_EZ-USB_FX3_in_a_USB_Video_Class_UVC_Framework_Chinese.pdf",
my CMOS sensor bus is 8 bits wide, so LD_DATA_COUNT and LD_ADDR_COUNT in the GPIF II state machine are set to 16367 (= 16368 / (8/8) - 1). Is this right?
Please help me to find out what is missing or mistaken.
Another thing that puzzles me: why can't the UVC project transfer the raw data captured from the GPIF ports directly to the USB host, even if the image is broken?
I think that if the raw data reached my application, I could debug it without being troubled by the FX3.
Thanks
Hi,
I am trying to install the FX3 support software for Linux on an i7-based laptop running Ubuntu 16.04. I have run into a number of issues that are illustrated below. I am looking for any information that may help me build and install the executables. I believe I have set up all the paths correctly, downloaded the files to support the 64-bit version, downloaded Qt4, etc.
Any assistance would be greatly appreciated. I have used and developed with the Windows-based software (Streamer) for FX3 without issue in the past.
Thanks,
Bill
I have attached a PDF with the full discussion and images from the compile attempts.
I have a project that needs to use a 24-bit GPIF configuration for a parallel data bus. At the same time, the I2C, SPI, and UART communication blocks are also needed. Since SPI and UART can't be used at the same time, 4 GPIOs will be used to implement bit-banged SPI. I have 2 questions regarding the 24-bit GPIF:
1. IO configuration block: should io_cfg.isDQ32Bit be set to CyTrue?
2. Available GPIOs for other control signals: are the GPIOs for CTL[0], CTL[1], ..., CTL[12], DQ[23], DQ[24], ..., DQ[31], I2S_CLK, I2S_SD, I2S_WS, and I2S_MCLK all available for controlling other external accessories?
Thanks,
Is there a good reason why SSRX+ and SSRX- seem to be swapped relative to SSTX+ and SSTX-?
If they were the right way round, it would be very convenient to route to a USB connector. Was this by design, by accident, or have I made a terrible mistake?
Hi All,
I have been using the firmware shown here: https://community.cypress.com/thread/16971?q=Streaming%20RAW%20image%20data%20using%20Cypress%20driver, heavily modified to interface with the OV5647 sensor and with the CyU3PMipicsiSetPhyTimeDelays call suggested in other posts, to stream RAW 10-bit data from the OV5647 to a Windows 10 app that converts the raw Bayer data to an RGB image displayed with OpenCV. In general the approach is working, but some major problems prevent it from being a production-ready solution. Others have noted the same issues in the original post.
I'll explain my Windows app for background. It is a 64-bit command-line app that runs two threads. The first thread (the collection thread) starts streaming, then performs continuous XferData calls (using CyUSB3.dll) to retrieve camera data into one of N frame-sized buffers, with N between 1 and 32. A second thread (the rendering thread) waits for a complete (filled) frame buffer, translates the Bayer data to an RGB image (an OpenCV Mat object), and calls imshow to display it. Simple concept. Logic ensures that the buffers are used sequentially and that a buffer's contents aren't overwritten before it is displayed. So, in the case of a single buffer, the collection thread collects a buffer then waits for the rendering thread to display it before collecting the next frame. These processes generally keep up with the sensor's 15 fps (5Mp) and 30 fps (1080p) rates. Rendering typically takes 5 to 30 ms, averaging about 20.
I have experimented with two camera resolutions, 5Mp and 1080p. 1080p works much better than 5Mp, but both have the same issue: after some amount of time, ranging randomly from a few frames to several minutes, the XferData call times out (regardless of how long or short I set the timeout). Sometimes the system recovers from the timeout; other times it doesn't (and the firmware must be reloaded to recover). My debugging shows that the rendering thread occasionally falls behind, causing the collection thread to delay frame transfer long enough that the firmware detects a DMA timeout and its timer resets the stream. When the system does recover from this timeout, the image randomly shifts (vertically and horizontally) in the display window.
My idea to fix this issue goes like this. Instead of implementing start and stop scan vendor commands (x99/x88), I would start the system with the sensor streaming. The DMA callback will simply toss the incoming MIPI data until it receives a frame grab command. At this point the next incoming MIPI frame is sent to the PC. After the grab frame is sent, firmware reverts to tossing the incoming MIPI data until the next grab command. This way the PC is ready to collect MIPI data when it is being sent so there should be no timeout issues.
I have almost succeeded in the first phase, rewriting the DMA callback routine to toss incoming data, *except* that callbacks stop after two MIPI frames are collected, and nothing I've done causes them to continue beyond the first two buffers. I've attached the DMA callback routine. You'll see I am using GPIO17 as an oscilloscope trigger to know when the DMA callback is active. I see one burst of GPIO17 toggles during the imager's first frame, then none.
I think the problem has something to do with buffer commit but can't find any good details on how this works in the API except for a couple of brief comments here and there. Can someone explain how the DMA handler "knows" a buffer is ready for processing (e.g. sending to USB) and any other pertinent details such as how to manipulate whatever variable controls it?
Thanks,
Scott
I've been planning on using the FX3 family to act as a USB host in order to transfer data between storage devices on an embedded system. However, I just noticed in the technical reference manual that it only supports USB 2.0 as a host, not USB 3.0.
Is there any way I can achieve USB 3.0 speeds using it to transfer data? If not, is there another product that will allow me to reach the necessary speed? USB 2.0 is simply too slow for the amount of data that needs to be moved.
Hello,
I have already bought an FX3 board and I am planning to add an external SD card connection to it, taking the schematic of the SD card connection on the FX3S as a reference for the circuit implementation, as well as the RAID 0 example from the code examples. However, I am not sure whether the SD/MMC controller in the FX3S block diagram is a hardware block or a software program. Will the FX3 work under this assumption if I add an external SD card adapter? Any advice on adding memory to the FX3 board?
I have implemented a VCOM CDC device in my application. The virtual COM port is based on the usbtouart example code from Cypress. My FX3 code implements a composite device with 2 interfaces. One has the 3 endpoints associated with the VCOM port, and the other has multiple endpoints associated with my bulk streaming application. There are 2 issues I am observing.
1) When the device is disconnected, the VCOM port persists in Device Manager. I would expect the hardware disconnect to cause the host to recognize that the device has left, so I am guessing something is not happening correctly in the disconnect routine in the FX3. Would someone please point me in the right direction?
2) This one is more complicated. I have been using the SuperSpeed Explorer Kit as my primary testbed, and in that case the VCOM port works as expected. I have another hardware set (a development board from Cesys with an FX3 and an FPGA together on board). This board brings up the VCOM port in Device Manager correctly, but I see no output in the terminal on my host PC. I am routing debug messages over this virtual COM port, so I cannot see any debug messages that may be occurring. I can route the debug messages through the physical UART RX/TX pins instead, but then I may be missing what is really causing the problem. Everything else about the application seems to work in both cases. Does anyone have ideas about what may differ between these two cases such that Device Manager enumerates the device (i.e. the driver works) but the VCOM port is active in one case and not the other? Would the clocking structure or the PIB init have anything to do with the VCOM port?
Thanks in advance.
I am running USB 3.0 communication tests between an FX3 and a host PC using the FX3 "synchronous Slave FIFO" sample design.
When I perform a BULK OUT from the C++ Streamer on the host PC, the error message "Xfer request rejected. NTSTATUS = c0000001" is displayed and communication stops. Can the cause be identified from this error code?
[Details]
I captured the GPIF II packet read-out waveform at the time of the BULK OUT error; it is shown in the attached image. The first packet read-out succeeds without problems, but the second read-out shows an abnormal waveform: FLAG_C is asserted before the full specified packet length (1024 bytes) has been read out, and FLAG_D, which should be asserted before FLAG_C, is never asserted.
From this result, I suspect that FLAG_C is asserted because a short packet is received at the end of the C++ Streamer's BULK OUT.
Is this assumption correct? Please explain the BULK OUT behavior of the C++ Streamer.
Also, even if the above is correct, I do not understand why FLAG_D, which should be asserted before FLAG_C, is not asserted.
Is this the correct FLAG behavior? Please explain how the FLAGs are supposed to behave.
- Sample firmware used for evaluation: the "SlaveFifoSync" firmware from AN65974
However, the firmware was partially modified for the evaluation:
- the LOOPBACK_SHRT_ZLP define disabled, the STREAM_IN_OUT define enabled
Macnica, Arai
How do I purchase the Aptina sensor board that works with the EZ-USB® FX3™ SuperSpeed Explorer Kit and the Aptina™ Image Sensor Interconnect Board for that kit?