Load model directly from TensorFlow Hub #1122

Closed
AzizZayed opened this issue Jul 26, 2021 · 0 comments · Fixed by #1231
Labels
enhancement New feature or request

Comments

@AzizZayed (Contributor) commented Jul 26, 2021

Description

DJL currently does not support loading a model directly from TensorFlow Hub (TFHub) like this:

Criteria<int[], Image[]> criteria =
        Criteria.builder()
        ...
        .optModelUrls("https://tfhub.dev/deepmind/biggan-deep-256/1")
        ...
        .build();

The above code is the goal of this enhancement: being able to load a model directly from a TFHub link. We could also define a customized URL scheme, such as "tfhub://deepmind/biggan-deep-256/1" or "djl://tfhub/deepmind/biggan-deep-256/1", so that these URLs can be handled differently.
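As a rough sketch of how such a scheme could be handled (the helper below is hypothetical and not part of DJL; the mapping is based on the workaround described further down), the custom URL would simply be rewritten into TFHub's compressed-download URL:

// Hypothetical helper, not part of DJL: maps the proposed custom scheme to the
// URL that TFHub serves compressed archives from (see the curl workaround below).
static String toTfHubDownloadUrl(String url) {
    if (!url.startsWith("tfhub://")) {
        throw new IllegalArgumentException("Not a tfhub:// URL: " + url);
    }
    // "tfhub://deepmind/biggan-deep-256/1"
    //   -> "https://tfhub.dev/deepmind/biggan-deep-256/1?tf-hub-format=compressed"
    String path = url.substring("tfhub://".length());
    return "https://tfhub.dev/" + path + "?tf-hub-format=compressed";
}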

The current way to do it is a bit tricky:

  1. Go to the TFHub page, right-click the download button and copy the link address.
  2. In the terminal, run curl -v <link address>.
  3. Find the location header; it points to the real download link.

curl -v https://tfhub.dev/deepmind/biggan-deep-256/1?tf-hub-format=compressed

Response:

...
< location: https://storage.googleapis.com/tfhub-modules/deepmind/biggan-deep-256/1.tar.gz
...
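The same lookup can be done programmatically. Here is a minimal Java 11 sketch (the class name and error handling are illustrative, not part of DJL) that requests the compressed archive without following redirects and reads the location header, just like the curl command above:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public final class TfHubUrlResolver {

    /** Resolves a tfhub.dev model page URL to the direct .tar.gz download URL. */
    public static String resolveDownloadUrl(String tfhubUrl) throws Exception {
        // Do not follow the redirect so we can read the "location" header ourselves.
        HttpClient client = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NEVER)
                .build();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(tfhubUrl + "?tf-hub-format=compressed"))
                .GET()
                .build();
        HttpResponse<Void> response =
                client.send(request, HttpResponse.BodyHandlers.discarding());
        return response.headers()
                .firstValue("location")
                .orElseThrow(() -> new IllegalStateException("No location header in response"));
    }
}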

And now we can load the model with the given link:

Criteria<int[], Image[]> criteria =
        Criteria.builder()
        ...
        .optModelUrls("https://storage.googleapis.com/tfhub-modules/deepmind/biggan-deep-256/1.tar.gz")
        ...
        .build();
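For completeness, a hedged sketch of using the resulting criteria (this assumes the elided parts above are filled in with .setTypes(int[].class, Image[].class) and a translator suited to this model; the class id 207 is just an example input):

import ai.djl.inference.Predictor;
import ai.djl.modality.cv.Image;
import ai.djl.repository.zoo.ModelZoo;
import ai.djl.repository.zoo.ZooModel;

// Sketch only; the criteria must be completed with types and a translator first.
try (ZooModel<int[], Image[]> model = ModelZoo.loadModel(criteria);
     Predictor<int[], Image[]> predictor = model.newPredictor()) {
    Image[] generated = predictor.predict(new int[] {207}); // 207: example class id
}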

DJL beginners will deeply appreciate such a feature since it removes another layer of complexity when trying to run inference on a model quickly. I encountered this issue myself.

AzizZayed added the enhancement New feature or request label Jul 26, 2021
frankfliu added a commit to frankfliu/djl that referenced this issue Sep 18, 2021
Fixes deepjavalibrary#1122

Change-Id: Ia1a2fafc502cb07878ed23dea66f1914b8b3159a
frankfliu added a commit that referenced this issue Sep 19, 2021
Fixes #1122

Change-Id: Ia1a2fafc502cb07878ed23dea66f1914b8b3159a
Lokiiiiii pushed a commit to Lokiiiiii/djl that referenced this issue Oct 10, 2023
Co-authored-by: KexinFeng <fenkexin@amazon.com>