
Converting custom tensorflow model to .kmodel #1299

mhueppe opened this issue Feb 25, 2025 · 3 comments
mhueppe commented Feb 25, 2025

Hey,
I want to convert a custom TensorFlow model into the .kmodel format to run it on the K210 chip.
For testing I wanted to start really simple, with one input neuron and one output neuron (the model is trained to return 1 if the input is > 0.5 and 0 otherwise).
However, after saving it in TFLite format I can't get it to convert using ncc. I currently use the command:

./ncc compile k210_model.tflite k210_model.kmodel --dataset X_train.npy --input-type uint8 --output-type uint8 --inference-type uint8

However, that does not seem to work: it crashes during quantization. How would I need to format the dataset for this to work properly?

The code is the following:

```python
import tensorflow as tf
import numpy as np

X_train = np.random.rand(10000, 1)  # 10,000 samples, 1 feature
y_train = (X_train > 0.5).astype(int)  # 1 if > 0.5, else 0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(1, activation='sigmoid')  # Output layer for binary classification
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

model.fit(X_train, y_train, epochs=100, batch_size=32, verbose=1)

test_samples = np.array([[0.3], [0.7]])
print(model.predict(test_samples))  # Expect outputs close to [0, 1]

np.save("mnt/data/X_train.npy", X_train)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```
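As an aside, since the ncc command above requests uint8 input and output, the TFLite side of the pipeline can also be quantized to full integer using a representative dataset. This is a minimal sketch assuming TensorFlow 2.x; the `representative_dataset` hook is TFLite's standard calibration mechanism, and the sample counts here are illustrative:

```python
import tensorflow as tf
import numpy as np

# Sketch: full-integer quantization of the same tiny Keras model,
# assuming TensorFlow 2.x. The representative dataset feeds calibration
# samples to the converter so it can measure activation ranges.
X_train = np.random.rand(1000, 1).astype(np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

def representative_dataset():
    # Yield a few hundred samples, each shaped like one model input.
    for sample in X_train[:200]:
        yield [sample.reshape(1, 1)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()  # serialized flatbuffer bytes
```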

The error I receive is the following:

```
1. Import graph...
2. Optimize Pass 1...
3. Optimize Pass 2...
4. Quantize...
  4.1. Add quantization checkpoints...
  4.2. Get activation ranges...
  Plan buffers...
  Fatal: Invalid dataset, should contain one file at least
```

The error persists even when I change the path, e.g. when giving only the directory instead of the file.

curioyang commented Feb 28, 2025

@mhueppe
`--dataset X_train.npy`
You can only give a directory of images, not an .npy file.

mhueppe commented Feb 28, 2025

So can you only run computer vision models on the K210? Additionally, for images, what would the directory structure look like? Is there any way to run a time-series model on it?

@curioyang

An images directory such as this one: `--dataset /path/to/MINIImageNet_20`

```
tree -L 2 MINIImageNet_20
MINIImageNet_20
├── ILSVRC2012_val_00035000.JPEG
├── ILSVRC2012_val_00035001.JPEG
├── ILSVRC2012_val_00035002.JPEG
├── ILSVRC2012_val_00035003.JPEG
├── ILSVRC2012_val_00035004.JPEG
├── ILSVRC2012_val_00035005.JPEG
├── ILSVRC2012_val_00035006.JPEG
├── ILSVRC2012_val_00035007.JPEG
├── ILSVRC2012_val_00035008.JPEG
├── ILSVRC2012_val_00035009.JPEG
├── ILSVRC2012_val_00035010.JPEG
├── ILSVRC2012_val_00035011.JPEG
├── ILSVRC2012_val_00035012.JPEG
├── ILSVRC2012_val_00035013.JPEG
├── ILSVRC2012_val_00035014.JPEG
├── ILSVRC2012_val_00035015.JPEG
├── ILSVRC2012_val_00035016.JPEG
├── ILSVRC2012_val_00035017.JPEG
├── ILSVRC2012_val_00035018.JPEG
└── ILSVRC2012_val_00035019.JPEG
```

It's difficult to run a time series model on K210.
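Since ncc wants a directory of files rather than a single .npy array, one workable approach is to split the calibration array into one file per sample. This is a hypothetical sketch (the `calibration_set` directory name and `sample_*.bin` naming are made up, and whether raw binary files are accepted depends on your ncc version — some releases expose a `--dataset-format raw` option; check `ncc --help`):

```python
import numpy as np
from pathlib import Path

# Sketch: write each calibration sample from an .npy array into its own
# raw .bin file, so the dataset directory contains many files instead of
# one .npy. File naming and directory name are illustrative.
X_train = np.random.rand(100, 1).astype(np.float32)

out_dir = Path("calibration_set")
out_dir.mkdir(exist_ok=True)

for i, sample in enumerate(X_train):
    # One little-endian float32 value per file, matching the 1-feature input.
    (out_dir / f"sample_{i:04d}.bin").write_bytes(sample.tobytes())

print(len(list(out_dir.glob("*.bin"))))  # number of calibration files written
```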
