
From Hugging Face

Hugging Face Transformers
Optimum


✅ Step 1: Check ONNX Export Compatibility

Before exporting, check if your model is supported by the ONNX export pipeline:

🔗 List of supported models for ONNX export
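
If you prefer to check programmatically, Optimum also exposes a TasksManager helper that lists the ONNX tasks registered for a given model type. The snippet below is a minimal sketch; the helper's exact signature may differ between Optimum versions, so treat it as an assumption and fall back to the documentation link above if it does not match your installed release.

# Sketch (API may vary across Optimum versions): query the ONNX task registry
from optimum.exporters.tasks import TasksManager

# "distilbert" is the model_type field from the model's config.json
supported_tasks = TasksManager.get_supported_tasks_for_model_type("distilbert", exporter="onnx")
print(list(supported_tasks.keys()))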


πŸ› οΈ Step 2: Install Required Dependencies​

pip install optimum[onnxruntime] transformers
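
To confirm the installation succeeded (and to know which versions you are running when reporting issues), a quick check such as the one below can help. It assumes each package exposes a __version__ attribute, which is the case for recent releases.

# Print the installed versions of the relevant packages
import optimum, onnxruntime, transformers
print("optimum:", optimum.__version__)
print("onnxruntime:", onnxruntime.__version__)
print("transformers:", transformers.__version__)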

🚀 Step 3: Exporting the Model with Optimum

If your model is supported:

from optimum.exporters.onnx import main_export

# 1. Define the model to export
model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# 2. Export with either the CLI tool or the Python API

# CLI (recommended for most users; run in a shell):
#   optimum-cli export onnx \
#     --model distilbert-base-uncased-finetuned-sst-2-english \
#     --task sequence-classification \
#     --output onnx_output/

# Python API (for more control)
main_export(
    model_name_or_path=model_id,
    output="onnx_output/",
    task="sequence-classification",
)

This will create an onnx_output/ directory with the ONNX model and associated config files.
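
To verify the export, you can load the ONNX model back with Optimum's ONNX Runtime classes and run it through a regular Transformers pipeline. The sketch below assumes the tokenizer files were saved alongside the model in onnx_output/ (the CLI and main_export do this by default); if not, load the tokenizer from the original model_id instead. The example sentence is arbitrary.

# Reload the exported model with ONNX Runtime and run a quick inference check
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

ort_model = ORTModelForSequenceClassification.from_pretrained("onnx_output/")
tokenizer = AutoTokenizer.from_pretrained("onnx_output/")

classifier = pipeline("text-classification", model=ort_model, tokenizer=tokenizer)
print(classifier("This export worked!"))  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]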


❌ If Your Model Is Not Supported

If the model is not listed as compatible, you can attempt to manually export it using PyTorch's ONNX exporter (not always reliable for Transformer models):

import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("your-model")
model.eval()

# Example input: batch of 1 sequence with 16 random token ids
dummy_input = torch.randint(0, 1000, (1, 16))

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input_ids"],
    output_names=["output"],
    opset_version=11,
    dynamic_axes={"input_ids": {0: "batch_size"}, "output": {0: "batch_size"}},
)

⚠️ Warning: This may fail due to unsupported operations or architecture-specific details in Hugging Face models.
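
If the manual export does complete, it is worth sanity-checking the file before relying on it. The sketch below assumes the onnx and onnxruntime packages are installed; onnx.checker validates the graph structure, and an ONNX Runtime InferenceSession confirms the model actually runs on a dummy batch.

# Validate the manually exported graph, then run it once with ONNX Runtime
import numpy as np
import onnx
import onnxruntime as ort

onnx.checker.check_model(onnx.load("model.onnx"))  # raises if the graph is malformed

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
dummy = np.random.randint(0, 1000, size=(1, 16), dtype=np.int64)  # matches the export's dummy input
outputs = session.run(None, {"input_ids": dummy})
print([o.shape for o in outputs])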


🆘 Still Not Working?

If you encounter issues or your model isn't supported by either method:

👉 Submit a support request to our team. We'll help you export your model and guide you through the correct procedure based on your use case and model type.