Performance of the model on mobile/browser devices #46

Open
10dimensions opened this issue Jan 26, 2021 · 6 comments

Comments

@10dimensions

I tried converting the .pt (PyTorch) model to both .onnx and tfjs formats, in order to deploy them in the browser as well as on a Node server (CPU only).

The inference times average around 1500-1700 ms per frame.

At the same time, the iOS example on fastdepth.github.io averages an excellent 40 fps.

Am I missing anything in my browser/CPU implementations? Is there any additional processing to be done?
Thanks
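
For reference, a minimal sketch of the ONNX export step (the model import, checkpoint path, and the 224x224 input size are assumptions, not values taken from this repo):

```python
# Minimal sketch of exporting a PyTorch model to ONNX.
# The checkpoint path, loading method, and 224x224 RGB input size are assumptions.
import torch

model = torch.load("fastdepth.pt", map_location="cpu")  # or build the model and load a state_dict
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # NCHW, single RGB frame

torch.onnx.export(
    model,
    dummy_input,
    "fastdepth.onnx",
    input_names=["input"],
    output_names=["depth"],
    opset_version=11,
    do_constant_folding=True,
)
```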

@martinjuhasz

@10dimensions how were you able to deploy this on something other than the TX2? How do you recompile the models for other platforms? I'd love to hear how this is done, as I'm currently failing at it.

@niharsalunke

Hi @martinjuhasz, I have set up the model on my PC without a TX2. Are you still interested in hearing more about it?

@10dimensions
Author

Hi @martinjuhasz

Yes, I did try to convert the .pth model to ONNX, and also to a tfjs graph (+ bins).

With TensorFlow.js (Node runtime) I was able to compile and run it from the CLI, but the frame rate was still very low, around 0.5 fps, as opposed to the expected 20+ fps.
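
To separate the cost of the model itself from TF.js runtime overhead, the exported ONNX model can be timed directly on CPU with onnxruntime. A minimal sketch, assuming the file name, input name, and 224x224 input shape:

```python
# Minimal sketch: rough CPU latency measurement of the exported ONNX model
# with onnxruntime. File name and input shape are assumptions.
import time
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("fastdepth.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

x = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Warm-up runs so that one-time initialization is not counted.
for _ in range(5):
    sess.run(None, {input_name: x})

n = 50
start = time.perf_counter()
for _ in range(n):
    sess.run(None, {input_name: x})
elapsed = (time.perf_counter() - start) / n
print(f"avg latency: {elapsed * 1000:.1f} ms (~{1.0 / elapsed:.1f} fps)")
```

If this baseline is already far slower than the target frame time, the exported graph itself (rather than the tfjs runtime) is the first place to look.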

@martinjuhasz

@10dimensions thanks for the info

@niharsalunke yeah, still interested!

@dpredie

dpredie commented Feb 28, 2022

@10dimensions can you share the onnx model?

@10dimensions
Author

@dpredie Nope, I don't have it at the moment. But PyTorch has built-in converters:
https://deci.ai/resources/blog/how-to-convert-a-pytorch-model-to-onnx/
