
Average Inference Time #27

Open
FuzhiYang opened this issue Jan 27, 2021 · 0 comments
Great job!
I have a question about the average inference time on BSD100, Urban100, and Manga109. In Table 6, for EDSR-baseline, EDSR, RDN, and IMDN, the average inference time decreases from BSD100 to Manga109, even though the image size increases from BSD100 to Manga109. That confuses me. Did I miss something about how this average inference time was measured?

In addition, I ran the "test_IMDN.py" script with the released "IMDN_x4.pth" on BSD100, Urban100, and Manga109 on a 2080 Ti. The average inference times were 9.14 ms, 14.37 ms, and 17.00 ms. Since a 2080 Ti should be faster than the Titan Xp mentioned in your paper, did I make a mistake somewhere in my measurement?
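For context, this is roughly how I averaged the per-image times (a minimal sketch, not the actual test_IMDN.py code; `run_model` is a placeholder for the network's forward pass, and on a CUDA backend `torch.cuda.synchronize()` would need to be called before each clock read so queued kernels are actually counted):

```python
import time

def average_inference_time_ms(run_model, images, warmup=5):
    """Average per-image wall-clock time in milliseconds.

    run_model: callable taking one image (stand-in for the model's
    forward pass). For GPU timing, a synchronization call should
    precede each time.perf_counter() read, otherwise asynchronous
    kernel launches make the measured times look too small.
    """
    # Warm-up runs so one-time costs (allocation, cuDNN autotuning)
    # are excluded from the average.
    for img in images[:warmup]:
        run_model(img)

    total = 0.0
    for img in images:
        start = time.perf_counter()
        run_model(img)
        total += time.perf_counter() - start
    return total / len(images) * 1000.0

# Toy usage with a dummy "model" that just sums pixel values.
dummy_images = [[float(i)] * 16 for i in range(10)]
avg_ms = average_inference_time_ms(sum, dummy_images)
print(avg_ms)
```

If the paper's numbers were taken without synchronization, or with a different warm-up policy, that alone could explain both the per-dataset ordering and the gap with my 2080 Ti numbers.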
