
implement tensorflow tensor handlers #421

Merged: 9 commits merged into bentoml:master on Dec 19, 2019
Conversation

@bojiang (Member) commented Dec 10, 2019


What changes were proposed in this pull request?

Implement TensorFlow tensor handlers, with automatic input-tensor transformation in the artifact.

Does this close any currently open issues?

no

How was this patch tested?

57% diff coverage; tested against TensorFlow 2 and 1.14.

@pep8speaks commented Dec 10, 2019

Hello @hrmthw, Thanks for updating this PR.

There are currently no PEP 8 issues detected in this PR. Cheers! 🍻

Comment last updated at 2019-12-19 02:42:48 UTC

@bojiang changed the title from "implement tensorflow tensor handlers in artifact" to "implement tensorflow tensor handlers(auto-transform in artifact)" on Dec 10, 2019
"""
output_format = request.headers.get("output", "json")
if output_format not in {"json", "str"}:
return make_response(
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

use raise BadInput instead? we recently made some changes to how we handle this type of error. Similarily below in L157
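The suggested pattern (raising a dedicated exception instead of building the error response inline, and letting a central handler translate it into an HTTP 400) might look like this minimal sketch. In BentoML the exception would come from bentoml.exceptions; a stand-in class is defined here so the sketch is self-contained, and the function name is illustrative.

```python
class BadInput(Exception):
    """Stand-in for bentoml.exceptions.BadInput (assumed import path)."""


def parse_output_format(headers):
    # Raise BadInput instead of returning an HTTP error response directly;
    # a framework-level error handler can then map it to a 400 response.
    output_format = headers.get("output", "json")
    if output_format not in {"json", "str"}:
        raise BadInput(f"Unsupported output format: {output_format!r}")
    return output_format
```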

Review comment from @parano (Member), Dec 14, 2019, on this diff excerpt:

    try:
        import tensorflow as tf
    except ImportError:
        raise ImportError(

use raise MissingDependencyException instead? You may need to rebase or merge master.
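A minimal sketch of that suggestion: wrap the import and re-raise as a framework-specific exception so callers get a consistent error type. The class below is a stand-in for bentoml.exceptions.MissingDependencyException, and the helper name is illustrative, not BentoML's actual code.

```python
import importlib


class MissingDependencyException(Exception):
    """Stand-in for bentoml.exceptions.MissingDependencyException."""


def import_or_raise(module_name, message):
    # Translate a plain ImportError into the framework-specific exception.
    try:
        return importlib.import_module(module_name)
    except ImportError:
        raise MissingDependencyException(message)


# e.g. tf = import_or_raise(
#     "tensorflow", "tensorflow package is required for this artifact")
```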

@bojiang (Member, Author) replied:

ok

@parano (Member) commented Dec 14, 2019

Looks like you still need to add TensorFlow to the test dependencies; tests are still failing on Travis CI because of this. Consider mocking tf to avoid actually adding the dependency.
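Mocking tf as suggested can be done with unittest.mock by patching sys.modules before the import, so tests run without TensorFlow installed. This is a generic sketch of the technique, not BentoML's actual test code.

```python
import sys
from unittest import mock

# Build a fake tensorflow module; attribute access returns MagicMocks,
# and we pin one return value so a test can make a concrete assertion.
fake_tf = mock.MagicMock()
fake_tf.constant.return_value = "fake-tensor"

with mock.patch.dict(sys.modules, {"tensorflow": fake_tf}):
    import tensorflow as tf  # resolves to fake_tf, no real install needed
    result = tf.constant([1, 2, 3])
# sys.modules is restored when the context manager exits
```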

@bojiang changed the title from "implement tensorflow tensor handlers(auto-transform in artifact)" to "[WIP]implement tensorflow tensor handlers(auto-transform in artifact)" on Dec 15, 2019
@codecov-io commented Dec 19, 2019

Codecov Report

Merging #421 into master will increase coverage by <.01%.
The diff coverage is 57.44%.


@@            Coverage Diff             @@
##           master     #421      +/-   ##
==========================================
+ Coverage   58.55%   58.56%   +<.01%     
==========================================
  Files          82       83       +1     
  Lines        5569     5705     +136     
==========================================
+ Hits         3261     3341      +80     
- Misses       2308     2364      +56
Impacted Files Coverage Δ
bentoml/artifact/tf_savedmodel_artifact.py 28.84% <13.95%> (-9.87%) ⬇️
bentoml/handlers/tensorflow_tensor_handler.py 67.24% <62.74%> (-5.49%) ⬇️
bentoml/handlers/utils.py 91.48% <91.48%> (ø)

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 23bac65...2222433.

@bojiang changed the title from "[WIP]implement tensorflow tensor handlers(auto-transform in artifact)" to "implement tensorflow tensor handlers" on Dec 19, 2019
@parano parano merged commit db783a0 into bentoml:master Dec 19, 2019
joshuacwnewton added a commit to MLH-Fellowship/BentoML that referenced this pull request Jul 24, 2020
Model service works before saving, but saving/loading TF1 models is
somewhat tricky. BentoML throws a warning about unsupported behavior.
See bentoml#421 for more information.
parano pushed a commit that referenced this pull request Jul 31, 2020
* Rename Tensorflow tests to Tensorflow_v2.2

* Add partially working TF1.14 integration test

Model service works before saving, but saving/loading TF1 models is
somewhat tricky. BentoML throws a warning about unsupported behavior.
See #421 for more information.

* Refactor TF2 artifact tests using fixtures

* Write test for TF2/Docker integration

* Change scope of tf2_svc_saved_dir fixture

* Fix bug for TF2.2 docker test (missing async)

* Fix incorrectly formatted data in API request

* Replace tmpdir with tmp_path_factory for scoping

tmpdir can't be used with module scope in a pytest fixture.

* Rename tf1 test objects to use '1' signifier

* Replace '__call__' with 'call' to fix bug

* Add prediction to build model layers before saving

Solves an issue where the model could not be serialized properly using .save().

* Fix request in TF2 docker test

Return format and input data format were incorrect for TF API.

* Use common test data across all TF2 tests

* Remove partially working TF1.14 test

* Run dev/format.sh and dev/lint.sh

* Rename TF bento_service_example to TF2

This is because there will be another bento service example for TF1.

* Remove _wait_until_ready and import instead

* Add notes about image/host funcs from conftest.py
aarnphm pushed a commit to aarnphm/BentoML that referenced this pull request Jul 29, 2022
* init tf_tensor_handler test

* implement tensorflow tensor http_handler

* implement tensorflow tensor cli_handler and lambda_handler

* move input tensor auto transforming into artifact

* fix tf_tensor_handler test

* add new arg 'method' to TensorflowTensorHandler

* add NestedDecoder; decode tensor handler output

* add warning about importing tf 1.x SavedModel

* style
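The "add NestedDecoder; decode tensor handler output" commit above describes recursively converting tensor outputs into JSON-serializable Python values. A hedged sketch of that idea (names and structure are illustrative, not BentoML's actual NestedDecoder):

```python
def decode_nested(obj):
    """Recursively turn tensor-like outputs into plain Python values."""
    if hasattr(obj, "numpy"):            # eager tf.Tensor exposes .numpy()
        obj = obj.numpy()
    if hasattr(obj, "tolist"):           # numpy array or scalar
        obj = obj.tolist()
    if isinstance(obj, bytes):           # tf string tensors yield bytes
        return obj.decode("utf-8")
    if isinstance(obj, dict):
        return {k: decode_nested(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [decode_nested(v) for v in obj]
    return obj
```

The sketch works on plain nested lists, tuples, dicts, and bytes even when TensorFlow is absent, which is why duck-typing (`hasattr`) is used instead of `isinstance(obj, tf.Tensor)`.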
aarnphm pushed a commit to aarnphm/BentoML that referenced this pull request Jul 29, 2022
…ml#937)
