
jsmith678x
Level 4

Hi,

Can I use third-party TFLite models and "model to C array" converters, for example the convert_bytes_to_c_source routine from the TensorFlow converter (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/python/util.py)?

In other words, are other TFLite files compatible with the PSoC 6 ML library? And can I use the TensorFlow converter to create the C array?

 

nin
Moderator

Hi @jsmith678x,

Yes, you can use third-party TFLite models and model-to-C-array converters with the PSoC 6 ML library, as long as the models are compatible with the library's requirements for input and output formats. The TensorFlow converter you mentioned can indeed be used to create a C array from a TFLite model.

However, note that the PSoC 6 ML library has its own set of supported models and tools, so check the library's documentation to make sure your chosen TFLite model and conversion tool are compatible with it.

Regarding the TensorFlow converter, it can generate a C array from a TFLite model for use with the PSoC 6 ML library. The convert_bytes_to_c_source routine produces a C source/header pair containing the serialized model data (the entire .tflite flatbuffer, including the weights) in the appropriate format for the library.
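
For illustration, here is a minimal Python sketch of that workflow. It is not taken from the user guide; the file names and the array name g_model are placeholders, and the routine's exact signature may differ between TensorFlow versions, so check tensorflow/lite/python/util.py in your installation.

# Minimal sketch: convert a .tflite file into a C source/header pair
# using TensorFlow's convert_bytes_to_c_source routine.
from tensorflow.lite.python.util import convert_bytes_to_c_source

with open("model.tflite", "rb") as f:
    model_bytes = f.read()  # raw serialized .tflite flatbuffer

# Returns the C source text (the byte array) and the matching header text.
source_text, header_text = convert_bytes_to_c_source(model_bytes, array_name="g_model")

with open("model_data.c", "w") as f:
    f.write(source_text)
with open("model_data.h", "w") as f:
    f.write(header_text)

The generated .c file defines the model as a constant byte array that you can compile into your firmware, with the .h file declaring it for your application code.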

Please refer to section 5.2 of the ModusToolbox™ Machine Learning user guide.

The TFLM library runs machine-learning models on Infineon microcontrollers and is available as a ModusToolbox™ asset; see the GitHub repository: https://github.com/infineon/ml-tflite-micro

You can add a dependency file (.mtb format) under the deps folder, or use the Library Manager to add it to your project; it is listed under Library > Machine Learning > ml-tflite-micro.
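
As an illustration only (the version tag and path below are placeholders; take the real line from the Library Manager or the asset's README), a deps/ml-tflite-micro.mtb file is a single line of the form:

https://github.com/Infineon/ml-tflite-micro#latest-v2.X#$$ASSET_REPO$$/ml-tflite-micro/latest

In a standard ModusToolbox make project, run make getlibs afterwards (or let the Library Manager update the project) so the asset is pulled into your application.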

The ModusToolbox™ Machine Learning (ML) Configurator adapts a pre-trained ML model to an Infineon target platform. It accepts the pre-trained model and generates an embedded model (as a library) that you can use along with your application code on the target device.

The ML Configurator lets you fit the pre-trained model of your choice to the target device with a set of optimization parameters and can emit the embedded model as a C header or binary file. The tool is provided both as a GUI and as a command-line tool with the same functionality; the GUI includes its own user guide (available from the Help menu), and the command-line tool offers a -h switch that lists the available options. The ML Configurator has two options for the inference engine:
• TFLM inference engine
• Infineon inference engine

Each option comes with a different set of configurations, as explained in the ModusToolbox™ Machine Learning Configurator user guide.


Best regards,
Nin
