Is there any documented way to export the trained Lux model to be loaded by Tensorflow/PyTorch or even C++? Is ONNX the correct way to go? If so, how can I export Lux model as ONNX?
I don't think there is currently any way to export models from Lux (or, more broadly, from Julia) to ONNX. The other direction is possible via https://github.com/DrChainsaw/ONNXNaiveNASflux.jl: load the ONNX file into a Flux model, then convert it with Lux.transform.
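A minimal sketch of that import direction (ONNX -> Flux -> Lux), assuming ONNXNaiveNASflux.jl's `load` function and the `Lux.transform` conversion mentioned above; the file name `"model.onnx"` is a placeholder:

```julia
using ONNXNaiveNASflux, Lux

# Load an ONNX graph as a Flux-compatible model (a NaiveNASflux CompGraph).
flux_model = ONNXNaiveNASflux.load("model.onnx")

# Convert the Flux model into a Lux layer; whether this works for a given
# graph depends on which layer types Lux.transform can handle.
lux_model = Lux.transform(flux_model)
```

This covers only importing into Lux, not the export path the original question asks about.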
I just scanned the source code of ONNXNaiveNASflux.jl, and it seems to support exporting a Flux model to ONNX in src/serialize/serialize.jl? Or am I missing something?