
Tutorial 4 - ValueError: CrypTen does not support ONNX op Identity. #455

Open

ngkuru opened this issue Mar 13, 2023 · 6 comments

Comments

@ngkuru

ngkuru commented Mar 13, 2023

Hi, I am receiving ValueError: CrypTen does not support ONNX op Identity. when trying to run Tutorial 4's "Classifying Encrypted Data with Encrypted Model" section. I haven't made any changes to the code except adding %env CUDA_VISIBLE_DEVICES="" to bypass a CUDA initialization error. How do I get around this? Thank you.

Log is below.

Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ngkuru/crypten/crypten/mpc/context.py", line 30, in _launch
    return_value = func(*func_args, **func_kwargs)
  File "/tmp/ipykernel_27669/2664848552.py", line 14, in encrypt_model_and_data
    private_model = crypten.nn.from_pytorch(model, dummy_input)
  File "/home/ngkuru/crypten/crypten/nn/onnx_converter.py", line 58, in from_pytorch
    crypten_model = from_onnx(f)
  File "/home/ngkuru/crypten/crypten/nn/onnx_converter.py", line 47, in from_onnx
    return _to_crypten(onnx_model)
  File "/home/ngkuru/crypten/crypten/nn/onnx_converter.py", line 184, in _to_crypten
    crypten_class = _get_operator_class(node.op_type, attributes)
  File "/home/ngkuru/crypten/crypten/nn/onnx_converter.py", line 260, in _get_operator_class
    raise ValueError(f"CrypTen does not support ONNX op {node_op_type}.")
ValueError: CrypTen does not support ONNX op Identity.
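For reference, a minimal sketch of the step that fails, assuming CrypTen is installed and using a toy model in place of the tutorial's exact network:

import torch
import crypten

crypten.init()

# Toy stand-in for the tutorial's model; the architecture itself is not the issue.
model = torch.nn.Sequential(
    torch.nn.Linear(784, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
)
dummy_input = torch.empty(1, 784)

# Depending on the installed torch/onnx versions, the exported ONNX graph can
# contain Identity nodes, which this version of the converter rejects with the
# ValueError above.
private_model = crypten.nn.from_pytorch(model, dummy_input)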
@Adongua

Adongua commented Oct 23, 2023

Hi, have you solved this problem yet?

@WJHBLUESAPPHIRE

The same issue remains

@jtfields

I am receiving the same error when running Tutorial 4 on Google Colab - "ValueError: CrypTen does not support ONNX op Identity."

To run CrypTen on Python 3.10, I downgraded the following packages: !pip install torch>=1.7.0 torchvision>=0.9.1 omegaconf>=2.0.6 onnx>=1.7.0 pandas>=1.2.2 pyyaml>=5.3.1 tensorboard future scipy>=1.6.0

@Adongua

Adongua commented Jan 28, 2024

Run CrypTen on Python 3.9 with this requirements.txt:

torch == 1.8.1
torchvision == 0.9.1
omegaconf == 2.0.6
onnx == 1.10.0
pandas == 1.2.2
pyyaml == 5.3.1
tensorboard
future
scipy == 1.6.0
sklearn
crypten
numpy == 1.19.5

Then install with pip install -r requirements.txt.

Afterwards, in nn/onnx_converter.py of the installed crypten package, I changed _OPSET_VERSION = 17 to _OPSET_VERSION = 11, and the problem was resolved.
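If editing the installed file is inconvenient, a minimal sketch of an alternative (assuming _OPSET_VERSION is the module-level constant in crypten/nn/onnx_converter.py that is read at export time, as the edit above suggests) is to override it at runtime before converting:

import crypten
import crypten.nn.onnx_converter as onnx_converter

# Downgrade the ONNX opset used for export from 17 to 11 without editing the file.
onnx_converter._OPSET_VERSION = 11

crypten.init()
# model and dummy_input as defined in Tutorial 4
private_model = crypten.nn.from_pytorch(model, dummy_input)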

jtfields added a commit to jtfields/CrypTen that referenced this issue Jan 30, 2024
@meitianyu

Hi, I also encountered the same problem and tried @Adongua's fix above, but it still didn't work. Are there any other special steps?

I have installed the following packages on Python 3.9:
torch==1.8.1
torchvision==0.9.1
omegaconf==2.0.6
onnx==1.10.0
pandas==1.2.2
pyyaml==5.3.1
tensorboard
future
scipy==1.6.0
sklearn
crypten
numpy==1.19.5
and changed the value of _OPSET_VERSION to 11, but the output is still ValueError: Unsupported ONNX opset version: 17.

@IshanAryendu

Use this command inside your project directory to apply the changes: python3 setup.py install
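That usually means the copy of crypten being imported is not the one whose nn/onnx_converter.py was edited; reinstalling the package (as above) makes the edit take effect. A quick sanity check, as a sketch:

import crypten.nn.onnx_converter as onnx_converter

# After editing and reinstalling, this should print 11; if it still prints 17,
# Python is importing a different (stale) installation of crypten.
print(onnx_converter.__file__)
print(onnx_converter._OPSET_VERSION)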
