PSoC™ 5, 3 & 1 Forum Discussions
IsLineChanged and USB_GetDTERate never change when setting the baud rate on the PC
I am building a USB-to-serial bridge as part of a project. To enter bootload mode on the downstream device, the UART baud rate needs to change on the fly. I was hoping to detect the baud change on the USB interface using the USBFS USB_GetDTERate() call (after detecting a change with IsLineChanged()). However, I am not seeing any changes when I change the baud rate of the serial terminal on the PC. Is this a driver issue on the PC? Do I need a different driver, or a different configuration of the USBFS component, to pass the baud rate changes through?
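For reference, the polling pattern I have in mind is sketched below. This is a minimal sketch assuming a USBFS CDC instance named `USB_`; on real hardware `USB_IsLineChanged()` and `USB_GetDTERate()` come from the PSoC Creator generated code, so here they are replaced by clearly marked stubs purely so the logic is self-contained. (The host only sends a new line coding if the device is bound to a CDC/serial driver on the PC side, so the driver binding is worth checking too.)

```c
#include <stdint.h>

/* Hypothetical stand-ins for the USBFS CDC API (instance name "USB_"
 * assumed). On hardware, USB_IsLineChanged() and USB_GetDTERate() are
 * generated by PSoC Creator; these stubs exist only so the detection
 * logic below is runnable as-is. */
#define USB_LINE_CODING_CHANGED 0x01u

static uint8_t  stub_line_changed = USB_LINE_CODING_CHANGED;
static uint32_t stub_dte_rate     = 57600u;

static uint8_t USB_IsLineChanged(void)
{
    uint8_t v = stub_line_changed;
    stub_line_changed = 0u;        /* change flag is cleared on read */
    return v;
}

static uint32_t USB_GetDTERate(void) { return stub_dte_rate; }

/* Poll for a host-side line-coding change; return the new rate, or 0
 * if nothing changed since the last poll. */
uint32_t poll_baud_change(void)
{
    if (USB_IsLineChanged() & USB_LINE_CODING_CHANGED) {
        return USB_GetDTERate();   /* reconfigure the downstream UART here */
    }
    return 0u;
}
```

In the main loop you would call `poll_baud_change()` each pass and, on a nonzero return, reprogram the UART clock divider for the new rate.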
Hi Infineon/Cypress Support,
I am developing an application that will program the EEPROM on a CY8C5267AXI-LP051. I previously developed a similar application that programmed flash on FM3, PSoC4, and PSoC6 devices using the flash driver included in the Peripheral Driver Library (PDL). https://www.cypress.com/design-guides/peripheral-driver-library-pdl-psoc-creator
I can't seem to find a similar driver for the PSoC5. Can you help?
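In case it helps frame the question: on PSoC 5LP the on-chip EEPROM is usually reached through the PSoC Creator EEPROM component rather than a PDL-style driver. The sketch below shows the write-and-verify flow I am after; the `EEPROM_` instance name is an assumption, and the component calls are stubbed with a RAM array here so the flow is runnable (on hardware, `EEPROM_Start()`, `EEPROM_WriteByte()`, and `EEPROM_ReadByte()` come from the generated source).

```c
#include <stdint.h>

/* Stubbed stand-ins for the PSoC Creator EEPROM component (instance
 * name "EEPROM_" assumed). A RAM array simulates the EEPROM so this
 * sketch compiles and runs anywhere. */
#define EEPROM_SIZE 2048u

static uint8_t eeprom_sim[EEPROM_SIZE];

static void EEPROM_Start(void) { /* enables the EEPROM block on hardware */ }

static int EEPROM_WriteByte(uint8_t value, uint16_t addr)
{
    if (addr >= EEPROM_SIZE) return -1;   /* error status on hardware */
    eeprom_sim[addr] = value;
    return 0;                             /* success */
}

static uint8_t EEPROM_ReadByte(uint16_t addr) { return eeprom_sim[addr]; }

/* Write a buffer to EEPROM and read it back to verify.
 * Returns 0 on success, -1 on write error, -2 on verify mismatch. */
int eeprom_program(const uint8_t *data, uint16_t len, uint16_t base)
{
    uint16_t i;
    EEPROM_Start();
    for (i = 0u; i < len; i++) {
        if (EEPROM_WriteByte(data[i], base + i) != 0) return -1;
    }
    for (i = 0u; i < len; i++) {
        if (EEPROM_ReadByte(base + i) != data[i]) return -2;
    }
    return 0;
}
```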
Hello there,
We have made a few designs based around the PSoC 5LP, and we can't buy the part anywhere!
What is your strategy?
We are planning a redesign (buy other MCUs and then redesign), but obviously, in both the short and the long term, this is not our preferred option.
I would like to hear from you!
Thank you!
Francesco
I have a brand new CY8CKIT-059 PSoC board I purchased from DigiKey this week. I installed PSoCProgrammerSetup_3.29.1_b4659_0.exe from the Cypress web site. I did install it to a different location (P:\Pro instead of C:\Program Files (x86)). I did a custom install - but left everything selected. The install seemed to go fine.
When I plug the board into my PC (known working USB port and cable), the blue light blinks as expected, but no windows USB recognition sounds. The same happens on a laptop - no device recognition sounds - that laptop does NOT have the software installed.
The only thing I did before plugging it in was solder on a connector to some of the data port pins - no excessive heat or anything like that.
So I plugged it into a Linux box and saw the following, so it doesn't seem to be dead:
[2866776.802948] usb usb6-port1: disabled by hub (EMI?), re-enabling...
[2866776.802956] usb 6-1: USB disconnect, device number 2
[2866777.210914] usb 6-1: new low-speed USB device number 3 using uhci_hcd
[2866777.388847] usb 6-1: New USB device found, idVendor=0566, idProduct=3107, bcdDevice= 1.00
[2866777.388849] usb 6-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0
[2866777.405432] input: HID 0566:3107 as /devices/pci0000:00/0000:00:1d.0/usb6/6-1/6-1:1.0/0003:0566:3107.0008/input/input15
[2866777.463185] hid-generic 0003:0566:3107.0008: input,hidraw0: USB HID v1.10 Keyboard [HID 0566:3107] on usb-0000:00:1d.0-1/input0
[2866777.476457] input: HID 0566:3107 Consumer Control as /devices/pci0000:00/0000:00:1d.0/usb6/6-1/6-1:1.1/0003:0566:3107.0009/input/input16
[2866777.535048] input: HID 0566:3107 System Control as /devices/pci0000:00/0000:00:1d.0/usb6/6-1/6-1:1.1/0003:0566:3107.0009/input/input17
[2866777.535182] hid-generic 0003:0566:3107.0009: input,hidraw1: USB HID v1.10 Device [HID 0566:3107] on usb-0000:00:1d.0-1/input1
I have seen info about uninstalling and reinstalling the drivers, but those instructions are all premised on having a device show up in Device Manager, and this one doesn't.
Thoughts?
(While this is "percolating" I will also uninstall the software and reinstall it.)
JRJ
Hi,
Problem description:
I am trying to debug two Cypress MCUs with two MiniProg3 devices at the same time, but I am having a hard time debugging two programs in parallel on one PC. For debugging, I connected a single Win10 PC to two MiniProg3 devices and opened two different projects (one in PSoC Creator 3.3 and the other in 4.1). When I try to 'Attach to target', one of the programs lets me choose the device to connect to.
However, in the other program, I have no device to choose; only after I connect the first device from the first program do I see a device in the second program. After connecting both devices, both programs run for a number of seconds, then one of them pops up an error message. (This is not the case when I debug only one of the programs at a time.)
Is it even possible to run two instances of PSoC Creator in debug mode in parallel on the same PC?
Additional info:
I work on a system with two different cards, each based on a Cypress MCU. The two cards communicate with each other.
1 - MCU versions:
PSoC Creator 3.3 with a CY8C5868AXI-LP032 MCU
PSoC Creator 4.1 with a CY8C5888LTI-LP097 MCU
2 - Device Center recognizes both MiniProg3 programmers
3 - Error I got from PSoC Creator 3.3 (when I run 4.1 first)
4 - Error I got from PSoC Creator 4.1 (when I run 3.3 first)
3.3: (error screenshot attached)
4.1: (error screenshot attached)
Note - after 4.1 starts running, 3.3 does not stop at the breakpoint, and after two seconds I get an error message from 4.1.
Hi
I am planning to perform square-wave voltammetry on a three-electrode system. In square-wave voltammetry, a symmetric square wave is superimposed on a staircase potential and applied to a stationary electrode. I have decided to use the PSoC 5LP board for this purpose. Can anybody provide feedback or suggestions? Is this board feasible for the experiment?
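To make the excitation waveform concrete, the potential applied in each half-cycle can be computed as below. This is a generic sketch of the staircase-plus-square-wave definition given above, not Infineon code; the parameter names and millivolt units are illustrative.

```c
#include <stdint.h>

/* Square-wave voltammetry excitation: a staircase that advances by
 * step_mV once per full cycle, with a symmetric square wave of
 * amplitude_mV superimposed (+A on the forward half-cycle, -A on the
 * reverse half-cycle). half_cycle counts 0, 1, 2, ... */
int32_t swv_potential_mV(uint32_t half_cycle,
                         int32_t start_mV,
                         int32_t step_mV,
                         int32_t amplitude_mV)
{
    int32_t staircase = start_mV + (int32_t)(half_cycle / 2u) * step_mV;
    int32_t square    = (half_cycle % 2u == 0u) ? amplitude_mV
                                                : -amplitude_mV;
    return staircase + square;
}
```

On a PSoC 5LP, each value from this function would be written to a DAC (e.g. the 8-bit VDAC or the DSM used as a DAC) at the square-wave frequency, with the ADC sampling the working-electrode current at the end of each half-cycle.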
Thanking you
I am using a PSoC CY8C58LP family microcontroller. I load the firmware file using KitProg4; after loading, the board works fine. Now I want to get the checksum of my loaded application. Kindly share the procedure for obtaining the firmware checksum.
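For comparison on the host side, the checksum style that Cypress programming tools report is, to my understanding, an arithmetic sum of the flash bytes truncated to 16 bits; the sketch below computes that over an image buffer. This is a hedged assumption - verify the exact algorithm against your programming tool's documentation before relying on it.

```c
#include <stdint.h>
#include <stddef.h>

/* 16-bit arithmetic checksum over a firmware image: the low 16 bits
 * of the sum of all bytes. Assumed to match the device checksum
 * reported by Cypress programming tools; confirm against the tool
 * documentation for your part. */
uint16_t flash_checksum(const uint8_t *image, size_t len)
{
    uint32_t sum = 0u;
    size_t i;
    for (i = 0; i < len; i++) {
        sum += image[i];
    }
    return (uint16_t)(sum & 0xFFFFu);
}
```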
I'm interested in the description in Section 4.2.5.3 (page 27) of the CY8CKIT-059 PSoC 5LP Prototyping Kit Guide (Doc. #: 001-96498 Rev. *G), as follows:
The KitProg board contains two dual-inline headers (J8 and J9). These headers are both 1x7-pin-headers, used to pull out several pins of PSoC 5LP to support advanced features like a low-speed oscilloscope and a low-speed digital logic analyzer.
I want to know how to implement these advanced features, the low-speed oscilloscope and the low-speed digital logic analyzer, on the CY8CKIT-059 KitProg.
Background info for my question:
I am an amateur/hobby circuit designer (self taught), so apologies in advance. Also sorry for the long post. I have many skills, but brevity is not one of them.
I've designed a pinball lighting control circuit board. When I originally spec'ed out my requirements, and searched for components, the PSoC 1 rose to the top of my list. Many may think the PSoC 1 is not the best choice for this application (and they're probably right), but at this point I'm too heavily invested in time and money to do anything different.
I'm using the PSoC 1 with 56 GPIO, configuring all 56 as LED outputs. My main goal is 100% achieved: I have a working solution that meets my original objectives of directly controlling all 56 outputs without matrixing, with minimal latency and decent PWM for LEDs (good enough for pinball, anyway, though a little less flicker would be a nice-to-have). Not only that, but my self-designed board is significantly cheaper than the commercial options available, so it's a win-win even if my PSoC 1 choice was ill-conceived.
These same LED outputs are also optionally used to control a separate Power Driver board, allowing me to turn on/off pinball solenoids just like turning on/off an LED, except I don't use PWM for those signals, just on/off.
Keep in mind that for my following questions, each of the 56 outputs needs to be individually controllable for on/off and PWM duty cycle.
It was only after I designed my pinball lighting solution that someone asked if I could use it to control servos, and at first I thought, sure, absolutely! But upon further inspection I don't think the PWM performance is high enough for accurate servo control, which brings me here.
Question 1: Hardware PWM vs. Software PWM?
I know the PSoC 1 can be configured with hardware PWM by using a digital block. But even with only 8-bit PWM, I have to dedicate a full digital block to it. And as best I can tell, I can only drive one GPIO individually with hardware PWM. I did see some options for using one PWM signal to drive multiple pins, but those pins seem to share a single PWM duty cycle, so this wouldn't let me individually control multiple GPIO from a single PWM digital block.
Am I correct? Is a software PWM solution the only option if I need 56 individually valued GPIO?
My chosen PSoC only has 4 digital blocks, so I quickly settled on software PWM, but I'm wondering if I misunderstand how best to leverage the PSoC 1's capabilities.
I would love it if there was a way to use a digital block to improve PWM performance for all 56 GPIO simultaneously, but while maintaining individual addressability.
Question 2: Software PWM Performance?
After significant code refinement, I have gotten my software PWM performance up to a staggering 8Hz (laugh), using 64 PWM levels (6-bit), while setting all 56 pins on each pass.
To provide a bit more detail: my software loops 64 times through the on/off settings for each PWM period, and completes 8 period loops per second (8 * 64 = 512 total loops per second). It is doing this for all 56 GPIO, so it processes on/off states at a rate of about 28,672 pin updates per second (512 * 56).
8 Hz is pretty low. Like I wrote above, this is good enough for basic pinball lighting and triggering solenoids on/off, but a bit slow for more advanced RGB lighting, and way too low for servos, which commonly need around 50 Hz. Also, servos need very fine pulse-duration control, often measured in fractions of a millisecond. My smallest duration right now is around 2 ms, far too long for servo control. Even if my basic loop were magically running 6x faster at 50 Hz, I don't think the duration control is fine enough for servos; that would require a hardware PWM implementation for adequate control.
So does the performance I'm achieving sound reasonable for a PSoC 1? Somehow I thought a 24 MHz processor would be a little faster. It seems my code takes about 837 cycles per pin update (24 MHz / 28,672 updates per second), which seems high to me.
The PSoC 1 Clocks and Global Resources documentation states that M8C assembly-language instructions take between 4 and 15 CPUCLK cycles to execute. I'm using C, not assembly, so I'm not sure how much overhead C adds. For each GPIO, I use the GPIO's target PWM level (0-64) to index a 65x65 constant array of predetermined 6-bit PWM values, then use GetState to check whether the new on/off state is different; if it is, I issue the LED_1_On or LED_1_Off Pragma command. That's it; the code is super simple, so I didn't think it would take so many CPU clocks to look up a boolean value in a 2-dimensional array and compare it with the current on/off state.
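For concreteness, the inner loop described above can be sketched in plain C as below. The per-pin `LED_xx_On()`/`LED_xx_Off()` calls are replaced by an instrumented stub (labeled as such) so the "write only on change" optimization is visible and the sketch runs anywhere; the duty-lookup table is reduced to a per-pin duty array.

```c
#include <stdint.h>

#define NUM_PINS   56u
#define PWM_LEVELS 64u   /* 6-bit software PWM */

/* Stub pin write standing in for the generated LED_xx_On()/_Off()
 * calls; it also counts writes so the cost of state changes can be
 * measured. */
static uint8_t  pin_state[NUM_PINS];
static uint32_t pin_writes;

static void pin_write(uint8_t pin, uint8_t on)
{
    pin_state[pin] = on;
    pin_writes++;
}

/* One full PWM period: 64 ticks; each pin is on while tick < duty. */
void pwm_period(const uint8_t duty[NUM_PINS])
{
    uint8_t tick, pin;
    for (tick = 0u; tick < PWM_LEVELS; tick++) {
        for (pin = 0u; pin < NUM_PINS; pin++) {
            uint8_t on = (tick < duty[pin]) ? 1u : 0u;
            if (on != pin_state[pin]) {
                pin_write(pin, on);   /* touch the GPIO only on a change */
            }
        }
    }
}
```

One observation this makes explicit: per period, each pin changes state at most twice (one rising edge, one falling edge), so the 64 x 56 comparisons dominate the cycle count, not the GPIO writes themselves.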
It also seems that if I scaled back my grand ambitions from 56 GPIO to just 1, at best this would only be 56x faster, essentially a 448 Hz software PWM solution for 1 GPIO.
Since I need 56 PWM LED outputs, I've never tested the digital block to see how fast that PWM solution performs, though I'd wager it's quite a bit faster. Still, 448 Hz for a single 6-bit software PWM coded GPIO seems pitiful.
Question 3: Best Practices for Accessing GPIO - Registers vs. Pragmas?
In my early test code, I was directly accessing the GPIO using the registers, like PRT4DR, thinking that would be fastest. But I've since changed to using Pragmas, like LED_01_Start, LED_01_On or LED_01_Off. From my tests, it seems performance is the same, and the Pragmas make for easier coding. Am I wrong?
Similarly, I was originally setting the GPIO ON or OFF on every pass, even if there was no change from the previous pass. I refined my code to only set the state if it had changed, and this seemed to boost performance. I started with an array of 56 booleans to track the On/Off states manually, but then started using the LED_01_GetState instead, and this Pragma seemed to have no adverse effect on performance. The only downside I discovered to using GetState is that it only reported correctly if I used the Pragmas for On and Off, otherwise it returned the wrong value if I set On/Off directly via registers.
Am I right that the Pragmas have no performance impact, and possibly even memory benefits, since I don't need to track an array of on/off states in my code?
Question 4: Can I Set All GPIO with 1 Command?
In my software PWM solution, I am stepping through each GPIO, one-by-one, to set their On/Off state (but only if it has changed). This seems really inefficient. In a worst case scenario of a 50% duty cycle, I'm flipping each GPIO's state with every pass, one-by-one.
It seems it would be more efficient if I could issue a single command with a 7-byte value representing the on/off state of all 56 GPIO.
I couldn't find any commands like this. Perhaps I'm overestimating the potential impact, as in my testing it seems that my code sets 1 GPIO at the same rate as all 56 GPIO, so perhaps toggling the GPIO state doesn't have much of a performance penalty. I guess this makes sense, as I would expect the PSoC to be highly optimized for setting the GPIO states. But if I could eliminate a loop through all 56 GPIO one-by-one, perhaps PWM frequency would improve due to code efficiency.
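On a PSoC 1, 56 GPIO span 7 ports of 8 pins, so the closest thing to a "one command" update is writing whole port data registers (PRT0DR..PRT6DR): one register write sets 8 pins at once, so 7 writes cover all 56 pins. The sketch below simulates the registers with a RAM array so the packing logic is runnable; on hardware the writes would go to the real PRTxDR registers, the pin-to-port mapping depends on your design, and direct DR writes bypass the Pragma GetState tracking mentioned above.

```c
#include <stdint.h>

#define NUM_PORTS 7u

static volatile uint8_t prt_dr[NUM_PORTS];   /* stands in for PRT0DR..PRT6DR */

/* states: 56 pins packed one bit each into 7 bytes, port order. */
void write_all_ports(const uint8_t states[NUM_PORTS])
{
    uint8_t p;
    for (p = 0u; p < NUM_PORTS; p++) {
        prt_dr[p] = states[p];   /* one register write sets 8 pins */
    }
}

/* Pack per-pin booleans into the 7-byte port image. */
void pack_states(const uint8_t on[56], uint8_t out[NUM_PORTS])
{
    uint8_t i;
    for (i = 0u; i < NUM_PORTS; i++) out[i] = 0u;
    for (i = 0u; i < 56u; i++) {
        if (on[i]) out[i / 8u] |= (uint8_t)(1u << (i % 8u));
    }
}
```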
Question 5: Best Global Resource Settings?
Performance is my main concern for this device. Low latency is #1, followed by higher PWM frequency. For that reason, I've pretty much maxed out the Global Resource settings (or, at least I think I have maxed them out). But I understand that some of these settings consume more power without any benefit to my device, so minimizing wasted power drain seems wise if possible.
Below is what I've configured, and my reasoning. Does this look right?
Power Setting is set to 5.0v / 24MHz - I believe this is the fastest option.
CPU_Clock is SysClk/1 - I figure this is what is controlling my code speed, and that this is the fastest clock.
Sleep_Time is 1_Hz - I never "sleep" in my code; it simply runs my simple loop forever as fast as possible, with no delays, so I'm not sure this has any impact for my purposes. Though perhaps I'm confused about how this affects performance, and it is slowing down my code by waking the CPU once per second!!!
VC1 is set to 16 - I don't see any performance difference vs. setting VC1 to 1
VC2 is set to 16 - I don't see any performance difference vs. setting VC2 to 1
VC3 is set to VC2/16 - I don't see any performance difference vs. setting it to 1. The documentation indicates I can set the divider lower than VC1/2, going as low as 256. Should I change this to 256 for lower power drain? Diminishing returns?
SysClk Source is set to Internal - I'm using the built-in IMO, not an external crystal.
SysClk*2 Disable is set to No - I don't think the SysClk doubler affects software PWM, as I don't think software can take advantage of the special 48 MHz clock timing available to the digital blocks, so I thought I should set this to Yes for more power savings. But every time I set it to Yes, the PSoC 1 failed to connect properly over USB. Perhaps that indicates a hardware issue, but leaving it set to No works just fine, even though that seems like the higher-performance setting.
Analog Power is set to All Off - I originally used the default of SC On/Ref Low. All 56 GPIO use the LED module configured Active High. Since I'm using digital I/O, my assessment is that these Analog Power settings don't apply to me, and the device seems to behave the same with All Off, which should have the least power drain.
Ref Mux is set to (Vdd/2)+/-BandGap - This too seems to be related to Analog Power for use in Analog Blocks, so I'm thinking this value has no impact at all for my design.
AGndBypass is set to Disable - This seems to be related to Analog Power's Ground, for use in Analog Blocks, so I think this has no impact on my design.
Op_Amp Bias is set to Low - Another analog setting that doesn't apply to my design?
A_Buff_Power is set to Low - And another analog setting that doesn't affect my design, right?
Trip Voltage [LVD] is set to 4.81V - The fact that my tests are operating without restart hiccups suggests that my USB 5 V power delivery is working correctly; otherwise I would expect even minor voltage sag to cause restarts, this being the most aggressive setting. Side note: this is the 2nd revision of my design. My 1st design used an external power source shared with the LEDs, and lighting more than a handful at once caused enough voltage sag to trigger a PSoC reset. I'm now using USB power for the PSoC and separate power planes for my LEDs, and I'm ecstatic that it's working well.
LVDThrottleBack is set to Disable - Combined with my take on the LVD behavior, I think this suggests that I am successfully running at 24MHz, and not scaling back to 12MHz due to low voltage.
Watchdog Enable is set to Disable - My understanding of the Watchdog is that it can check if the PSoC is hung, and restart it if it is non-responsive. Other than low software PWM performance, my board seems to be working flawlessly, even without the Watchdog enabled. I'm thinking that I might need to set this to Enabled for a production version of my design, to make it more reliable for end-users, but I'm not sure if that's the right way to think about this feature. But since I have a software loop that runs non-stop forever without sleep, I'm not sure that this even gives the Watchdog an opportunity to be effective.
Question 6 - What am I overlooking?
Sorry to be so greedy for input, but I thought it wise to ask the most general of questions - what am I completely overlooking?
All I need is for the PSoC 1 to individually address all 56 GPIO with a PWM signal, at the highest possible frequency, and with the most accurate duty cycle period duration.
Should I be using timers? Interrupts? Assembly instead of C?
I have a very refined loop that simply updates all GPIO as fast as possible with the current on/off state based upon predefined PWM values. The code occasionally checks whether the USB buffer has new PWM data to process (a tiny 56-byte record), a query I've varied from once per PWM period to 64 times per period with no discernible impact. I'm currently checking the USB buffers for data once every 2 PWM periods, approximately 256 checks per second, or one every 4 ms. This is to keep latency low for my pinball control software.
Because it is simply running a loop as fast as possible, I didn't see a need for timers or interrupts, even though these would be typical in a PWM solution. I do understand that my timer-less approach leads to some performance variability, though it seems consistent enough as the processing load is essentially the same with every loop.
You made it to the end!
If you actually read everything above, you're awesome! Thank you! Hopefully you have some good advice to share back my way...
If I need to post code, let me know.
-Paul
Hello,
Using PSoC Creator with a PSoC 3, I would like to implement a timer that counts the seconds elapsed since the device was turned on, saving this value in a numeric variable, such that the value is not reset by a shutdown (maybe by saving it in EEPROM?). After a restart, the counter should continue counting seconds from the last saved value.
In practice, it would be the total power-on time of the device, expressed in seconds.
Do you have any advice for me please? Is there any feature that already implements this functionality on PSoC 3?
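One common shape for this is sketched below: a 1 Hz tick increments the counter, and the value is committed to EEPROM only every `SAVE_PERIOD` seconds to limit write wear (EEPROM endurance is finite, so writing every second is usually avoided). The EEPROM access is stubbed with a RAM word here so the logic is self-contained; on a PSoC 3 the commit would go through the EEPROM component, and `SAVE_PERIOD` is an illustrative choice. Note the counter can lose up to `SAVE_PERIOD - 1` seconds on an abrupt power loss.

```c
#include <stdint.h>

#define SAVE_PERIOD 60u   /* commit once a minute; tune for wear vs. accuracy */

static uint32_t eeprom_word;   /* stand-in for 4 bytes of real EEPROM */
static uint32_t uptime_s;

static void     eeprom_save(uint32_t v) { eeprom_word = v; }
static uint32_t eeprom_load(void)       { return eeprom_word; }

/* Call once at boot: resume from the last committed value. */
void uptime_init(void)
{
    uptime_s = eeprom_load();
}

/* Call from a 1 Hz interrupt (e.g. a Timer or the RTC tick). */
void uptime_tick(void)
{
    uptime_s++;
    if ((uptime_s % SAVE_PERIOD) == 0u) {
        eeprom_save(uptime_s);   /* periodic commit to survive power loss */
    }
}

uint32_t uptime_get(void) { return uptime_s; }
```

If second-level accuracy across power loss matters, a brown-out detect interrupt that forces one final `eeprom_save()` on falling supply voltage is a common refinement.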
Thank you very much for your help
Greetings