
Inference time #2

Open
skunkwerk opened this issue Jan 15, 2024 · 1 comment
@skunkwerk

Hi, thanks for publishing this work!
I couldn’t find any mention in your paper of the inference time for generating a new model from an image or text prompt. Do you know how long it takes, and which GPU you tested on?

@menyifang
Owner

For inference, the generative module runs in real time, and the two optimization modules take about 30 seconds. Going from an image or text prompt through the full pipeline also involves inversion and some mesh operations, so it takes around 3 minutes in total on an RTX 3090 GPU.
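
For anyone wanting to reproduce this breakdown, here is a minimal sketch of how one might time each stage. The stage functions below are hypothetical placeholders (not APIs from this repo); substitute the actual inversion, generation, and optimization calls.

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print the wall-clock time spent inside a pipeline stage."""
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.1f}s")

# Placeholder stages standing in for the real steps described above
# (inversion + mesh ops, generative module, optimization modules);
# replace each body with the repo's actual calls to get a real breakdown.
def run_inversion():
    time.sleep(0.1)

def run_generative_module():
    time.sleep(0.1)

def run_optimization_modules():
    time.sleep(0.1)

if __name__ == "__main__":
    with timed("inversion + mesh ops"):
        run_inversion()
    with timed("generative module"):
        run_generative_module()
    with timed("optimization modules"):
        run_optimization_modules()
    # Note: if a stage launches CUDA kernels asynchronously, call
    # torch.cuda.synchronize() before the timer is read so the
    # measurement reflects actual GPU work.
```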
