Does Zuko allow exporting to ONNX? #45
-
Hello again, Is it possible to export Zuko flows to ONNX? If so, do you have an example? If it is not readily possible, do you have any idea how much effort it would take? I would be interested in trying that out. Best,
-
Hello @CaioDaumann, I have never tried, but I think it should be possible. Looking at https://pytorch.org/tutorials/beginner/onnx/export_simple_model_to_onnx_tutorial.html, you will probably need to wrap the flow as a "pure function", that is something that takes tensors as input and returns tensors as output. This is not the case of the `Flow` objects, which take a tensor as input and return a `Distribution`. A very thin wrapper module that takes both $c$ and $x$ as input and returns $\log p(x | c)$ is probably enough.

Also, I don't know how ONNX handles randomness, but sampling from the flow requires first sampling from the base distribution and then transforming it. It might be easier to only export the transform to ONNX. In any case, if you succeed, consider contributing a tutorial to the repo!
-
Hi @francois-rozet, @dpbigler, Getting back to this, I spent some time trying to export a zuko model to ONNX and came up with something like this:

```python
import torch
import torch.utils.data as data
import zuko
import numpy as np
import onnxruntime as ort


class WrappedNSF(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.flow = zuko.flows.NSF(features=2, transforms=3, hidden_features=(64, 64))

    def forward(self, x):
        return self.flow().transform(x)


def two_moons(n: int, sigma: float = 1e-1):
    theta = 2 * torch.pi * torch.rand(n)
    label = (theta > torch.pi).float()
    x = torch.stack((
        torch.cos(theta) + label - 1 / 2,
        torch.sin(theta) + label / 2 - 1 / 4,
    ), dim=-1)
    return torch.normal(x, sigma), label


samples, labels = two_moons(16384)
samples_tensor = samples.clone().detach()

trainset = data.TensorDataset(samples, labels)
trainloader = data.DataLoader(trainset, batch_size=64, shuffle=True)

model = WrappedNSF()
model.eval()

dummy_input = torch.randn(1, 2)
output = model(dummy_input)
print("Sample output:", output)

try:
    torch.onnx.export(
        model,                      # wrapped model instance
        dummy_input,                # model input (or a tuple for multiple inputs)
        "wrapped_flow_model.onnx",  # output ONNX file path
        export_params=True,         # store the trained parameter weights inside the model file
        opset_version=17,           # ONNX opset version to export the model to
        do_constant_folding=True,   # optimization: constant folding
        input_names=['input'],      # model's input names
        output_names=['output'],    # model's output names
        dynamic_axes={'input': {0: 'batch_size'},    # variable-length axes
                      'output': {0: 'batch_size'}},
    )
    print("Model exported successfully.")
except Exception as e:
    print("Failed to export model:", str(e))
```

But this returns the following error:
Any ideas here, or should I open an issue in PyTorch as the error message suggests?