implement tensorflow tensor handlers #421
Conversation
Hello @hrmthw, Thanks for updating this PR. There are currently no PEP 8 issues detected in this PR. Cheers! 🍻
Comment last updated at 2019-12-19 02:42:48 UTC
""" | ||
output_format = request.headers.get("output", "json") | ||
if output_format not in {"json", "str"}: | ||
return make_response( |
use raise BadInput
instead? we recently made some changes to how we handle this type of error. Similarly below in L157
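To illustrate the suggestion, here is a minimal sketch of the change. The import path bentoml.exceptions.BadInput is an assumption, and the surrounding handler code simply mirrors the fragment quoted above:

```python
# Sketch only: raise BadInput instead of hand-building an HTTP error response.
# The import path bentoml.exceptions.BadInput is assumed here; the surrounding
# handler code is illustrative context, not code from this PR.
from bentoml.exceptions import BadInput

output_format = request.headers.get("output", "json")
if output_format not in {"json", "str"}:
    raise BadInput(
        "Request output type must be 'json' or 'str' for this API"
    )
```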
try:
    import tensorflow as tf
except ImportError:
    raise ImportError(
use raise MissingDependencyException
instead? you may need to rebase or merge master
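A minimal sketch of what that might look like, assuming the exception is exposed as bentoml.exceptions.MissingDependencyException:

```python
# Sketch only: raise BentoML's MissingDependencyException rather than a bare
# ImportError when tensorflow is not installed. The import path below is an
# assumption based on the reviewer's suggestion.
from bentoml.exceptions import MissingDependencyException

try:
    import tensorflow as tf
except ImportError:
    raise MissingDependencyException(
        "tensorflow package is required to use TensorflowTensorHandler"
    )
```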
ok
looks like you still need to add TensorFlow to the test dependencies; tests are still failing on Travis CI because of this. Consider mocking tf to avoid actually adding the dependency.
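One possible way to mock tf in the test suite, sketched here with unittest.mock. The import path bentoml.handlers.TensorflowTensorHandler and the fake tensor behavior are assumptions, and the assertion is a placeholder:

```python
# Sketch of mocking TensorFlow so tests can run on CI without installing it.
# The handler import path and the fake tf behavior are assumptions, not code
# from this PR.
import sys
from unittest import mock


def test_tf_tensor_handler_without_tensorflow():
    fake_tf = mock.MagicMock()
    # Pretend tf.constant just returns its input, so the handler can pass
    # "tensors" through without real TensorFlow behavior.
    fake_tf.constant.side_effect = lambda value: value

    with mock.patch.dict(sys.modules, {"tensorflow": fake_tf}):
        from bentoml.handlers import TensorflowTensorHandler  # assumed path

        handler = TensorflowTensorHandler()
        assert handler is not None
```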
Codecov Report
@@            Coverage Diff             @@
##           master     #421      +/-   ##
==========================================
+ Coverage   58.55%   58.56%    +<.01%
==========================================
  Files          82       83        +1
  Lines        5569     5705      +136
==========================================
+ Hits         3261     3341       +80
- Misses       2308     2364       +56
Continue to review full report at Codecov.
* Rename Tensorflow tests to Tensorflow_v2.2
* Add partially working TF1.14 integration test. Model service works before saving, but saving/loading TF1 models is somewhat tricky. BentoML throws a warning about unsupported behavior. See #421 for more information.
* Refactor TF2 artifact tests using fixtures
* Write test for TF2/Docker integration
* Change scope of tf2_svc_saved_dir fixture
* Fix bug for TF2.2 docker test (missing async)
* Fix incorrectly formatted data in API request
* Replace tmpdir with tmp_path_factory for scoping. tmpdir can't be used with module scope in a pytest fixture.
* Rename tf1 test objects to use '1' signifier
* Replace '__call__' with 'call' to fix bug
* Add prediction to build model layers before saving. Solves issue where the model could not be serialized properly using .save().
* Fix request in TF2 docker test. Return format and input data format were incorrect for the TF API.
* Use common test data across all TF2 tests
* Remove partially working TF1.14 test
* Run dev/format.sh and dev/lint.sh
* Rename TF bento_service_example to TF2. This is because there will be another bento service example for TF1.
* Remove _wait_until_ready and import instead
* Add notes about image/host funcs from conftest.py
* init tf_tensor_handler test
* implement tensorflow tensor http_handler
* implement tensorflow tensor cli_handler and lambda_handler
* move input tensor auto transforming into artifact
* fix tf_tensor_handler test
* add new arg 'method' to TensorflowTensorHandler
* add NestedDecoder; decode tensor handler output
* add warning about importing tf 1.x SavedModel
* style
What changes were proposed in this pull request?
implement tensorflow tensor handlers (auto-transform in artifact)
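For context, a hedged usage sketch of the new handler in a BentoService, written against BentoML's 0.x decorator API. The import paths and the TensorflowSavedModelArtifact name are assumptions; only TensorflowTensorHandler itself is named in this PR:

```python
# Usage sketch only, not code from this PR. Import paths and the artifact
# class name are assumptions based on BentoML's 0.x API.
import bentoml
from bentoml.artifact import TensorflowSavedModelArtifact
from bentoml.handlers import TensorflowTensorHandler


@bentoml.env(pip_dependencies=["tensorflow"])
@bentoml.artifacts([TensorflowSavedModelArtifact("model")])
class TfPredictionService(bentoml.BentoService):
    @bentoml.api(TensorflowTensorHandler)
    def predict(self, tensor):
        # The incoming request payload is automatically converted to a
        # tf.Tensor (the "auto-transform in artifact" behavior described
        # above) before reaching the model.
        return self.artifacts.model(tensor)
```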
Does this close any currently open issues?
no
How was this patch tested?
Unit tests on TensorFlow 2 and 1.14; 57% coverage.