## From Hugging Face

This guide covers exporting models built with the following libraries:

- Hugging Face Transformers
- Optimum
### Step 1: Check ONNX Export Compatibility
Before exporting, check if your model is supported by the ONNX export pipeline:
See the list of supported models for ONNX export in the Optimum documentation.
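You can also check programmatically. Recent Optimum releases expose a `TasksManager` helper for this; the exact method name and signature can vary between versions, so treat the following as a sketch:

```python
from optimum.exporters.tasks import TasksManager

# Look up the ONNX-exportable tasks for a given architecture type
# (e.g. "distilbert", "bert", "gpt2"). Raises if the type is unknown.
supported = TasksManager.get_supported_tasks_for_model_type(
    "distilbert", exporter="onnx"
)
print(list(supported))  # e.g. ["text-classification", "question-answering", ...]
```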
### Step 2: Install Required Dependencies
```bash
pip install "optimum[onnxruntime]" transformers
```
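As a quick sanity check that the packages and the ONNX Runtime backend are importable (a minimal sketch, not an official verification step):

```python
# These imports should succeed if the installation worked
import onnxruntime
from optimum.onnxruntime import ORTModelForSequenceClassification

print("onnxruntime version:", onnxruntime.__version__)
```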
### Step 3: Exporting the Model with Optimum
If your model is supported, you can export it either from the command line or from Python.

**CLI (recommended for most users):**

```bash
# Note: the output directory is a positional argument, not a flag.
optimum-cli export onnx \
  --model distilbert-base-uncased-finetuned-sst-2-english \
  --task text-classification \
  onnx_output/
```

**Python API (for more control):**

```python
from optimum.exporters.onnx import main_export

# Define your model
model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# Export to ONNX
main_export(
    model_name_or_path=model_id,
    output="onnx_output/",
    task="text-classification",
)
```

(Older Optimum releases named this task `sequence-classification`; current versions use `text-classification`.)
This will create an `onnx_output/` directory containing the ONNX model and its associated config files.
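Once exported, the model can be loaded back through Optimum's `ORTModel` classes and used like any Transformers model, for example via the standard `pipeline` API:

```python
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

# Load the ONNX model and tokenizer from the export directory
model = ORTModelForSequenceClassification.from_pretrained("onnx_output/")
tokenizer = AutoTokenizer.from_pretrained("onnx_output/")

# Run inference through the standard pipeline API
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("This export worked perfectly!"))
```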
### If Your Model Is Not Supported

If the model is not listed as compatible, you can attempt a manual export using PyTorch's built-in ONNX exporter (not always reliable for Transformer models):
```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("your-model")
model.eval()

# Dummy batch of token IDs: (batch_size=1, sequence_length=16)
dummy_input = torch.randint(0, 1000, (1, 16))

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input_ids"],
    output_names=["output"],
    opset_version=14,  # recent Transformers models often need opset >= 14
    dynamic_axes={
        "input_ids": {0: "batch_size", 1: "sequence_length"},
        "output": {0: "batch_size", 1: "sequence_length"},
    },
)
```
⚠️ **Warning:** This may fail due to unsupported operations or architecture-specific details in Hugging Face models.
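If a manual export does succeed, it is worth comparing the ONNX model's output against the original PyTorch model before trusting it. A minimal check, reusing `model` and `dummy_input` from the snippet above:

```python
import numpy as np
import onnxruntime as ort
import torch

# Run the same dummy input through both models and compare
session = ort.InferenceSession("model.onnx")
onnx_out = session.run(None, {"input_ids": dummy_input.numpy()})[0]

with torch.no_grad():
    torch_out = model(dummy_input)[0]

# Small numerical differences are expected; large ones indicate a broken export
print("max diff:", np.abs(onnx_out - torch_out.numpy()).max())
```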
### Still Not Working?
If you encounter issues or your model isn't supported by either method:

Submit a support request to our team. We'll help you export your model and guide you through the correct procedure based on your use case and model type.