88 python 3 compatibility #94

Merged · 6 commits · Feb 15, 2018
2 changes: 1 addition & 1 deletion docs/getting_started/minikube.md
@@ -72,7 +72,7 @@ In this session, we show how to wrap the sklearn iris classifier in the [seldon-

3. Wrap your saved model using the core-python-wrapper docker image:
```bash
-docker run -v $(pwd):/model seldonio/core-python-wrapper:0.6 /model IrisClassifier 0.1 seldonio --force
+docker run -v $(pwd):/model seldonio/core-python-wrapper:0.7 /model IrisClassifier 0.1 seldonio --force
```

4. Build the docker image locally
4 changes: 2 additions & 2 deletions docs/wrappers/h2o.md
@@ -44,7 +44,7 @@ Detailed steps:
3. Run the python wrapping scripts, with the additional `--base-image` argument:

```bash
-docker run -v /path/to/your/model/folder:/model seldonio/core-python-wrapper:0.6 /model H2OModel 0.1 myrepo --base-image=H2OBase:1.0
+docker run -v /path/to/your/model/folder:/model seldonio/core-python-wrapper:0.7 /model H2OModel 0.1 myrepo --base-image=H2OBase:1.0
```

"0.1" is the version of the docker image that will be created. "myrepo" is the name of your dockerhub repository.
@@ -88,7 +88,7 @@ Here we give a step by step example in which we will train and save a [H2O model

```bash
cd ../../
-docker run -v models/h2o_example:my_model seldonio/core-python-wrapper:0.6 my_model H2OModel 0.1 myrepo --base-image=H2OBase:1.0
+docker run -v models/h2o_example:my_model seldonio/core-python-wrapper:0.7 my_model H2OModel 0.1 myrepo --base-image=H2OBase:1.0
```

This will create a docker image "seldonio/h2omodel:0.1", which is ready to be deployed in seldon-core.
16 changes: 8 additions & 8 deletions docs/wrappers/python.md
@@ -61,13 +61,13 @@ After you have copied the required files in your model folder, you run the Seldo
In order to make things even simpler (and because we love Docker!) we have dockerised the wrapper script so that you don't need to install anything on your machine to run it - except Docker.

```
-docker run -v /path/to/model/dir:/my_model seldonio/core-python-wrapper:0.6 /my_model MnistClassifier 0.1 seldonio
+docker run -v /path/to/model/dir:/my_model seldonio/core-python-wrapper:0.7 /my_model MnistClassifier 0.1 seldonio
```

Let's explain each piece of this command in more detail.


-``` docker run seldonio/core-python-wrapper:0.6 ``` : run the core-python-wrapper container (version 0.6)
+``` docker run seldonio/core-python-wrapper:0.7 ``` : run the core-python-wrapper container.

``` -v /path/to/model/dir:/my_model ``` : Tells docker to mount your local folder to /my_model in the container. This is used to access your files and generate the wrapped model files.

@@ -76,7 +76,7 @@ Let's explain each piece of this command in more detail.
For reference, here is the complete list of arguments that can be passed to the script.

```
-docker run -v /path:<model_path> seldonio/core-python-wrapper:0.6
+docker run -v /path:<model_path> seldonio/core-python-wrapper:0.7
<model_path>
<model_name>
<image_version>
@@ -97,18 +97,18 @@ Required:
* docker_repo: The name of your dockerhub repository. In our example seldonio.

Optional:
-* out_folder: The folder that will be created to contain the output files. Defaults to ./build
-* service_type: The type of Seldon Service API the model will use. Defaults to MODEL. Other options are ROUTER, COMBINER, TRANSFORMER, OUTPUT_TRANSFORMER
-* base_image: The docker image your docker container will inherit from. Defaults to python:2.
-* image_name: The name of your docker image. Defaults to model_name in lowercase
+* out-folder: The folder that will be created to contain the output files. Defaults to ./build
+* service-type: The type of Seldon Service API the model will use. Defaults to MODEL. Other options are ROUTER, COMBINER, TRANSFORMER, OUTPUT_TRANSFORMER
+* base-image: The docker image your docker container will inherit from. Defaults to python:2.
+* image-name: The name of your docker image. Defaults to model_name in lowercase
* force: When this flag is present, the build folder will be overwritten if it already exists. The wrapping is aborted by default.
* persistence: When this flag is present, the model will be made persistent, its state being saved at a regular interval on redis.
* grpc: When this flag is present, the model will expose a GRPC API rather than the default REST API

Note that you can access the command line help of the script by using the -h or --help argument as follows:

```
-docker run seldonio/core-python-wrapper:0.6 -h
+docker run seldonio/core-python-wrapper:0.7 -h
```

Note also that you could use the python script directly if you feel so inclined, but you would have to check out seldon-core and install some python libraries on your local machine - by using the docker image you don't have to worry about these dependencies.
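For readers following the wrapping steps above, here is a minimal sketch of the model file the wrapper expects. It assumes the usual seldon-core convention that the class name matches the file name and that the model exposes `predict(self, X, feature_names)` (the same signature seen in the MeanClassifier diff below); the body is an illustrative placeholder, not the real MNIST model:

```python
# MnistClassifier.py -- illustrative sketch, not the actual example model
import numpy as np

class MnistClassifier(object):
    def __init__(self):
        # Load weights or other artifacts here; this runs once at startup.
        pass

    def predict(self, X, feature_names):
        # X arrives as a 2-D numpy array, one row per instance.
        return np.zeros(X.shape[0])  # placeholder prediction
```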
7 changes: 4 additions & 3 deletions examples/models/mean_classifier/MeanClassifier.py
@@ -11,15 +11,16 @@ def __init__(self, intValue=0):
assert type(intValue) == int, "intValue parameters must be an integer"
self.int_value = intValue

print "Loading model here"
X = np.load(open("model.npy",'r'))
print("Loading model here")

X = np.load(open("model.npy",'rb'), encoding='latin1')
self.threshold_ = X.mean() + self.int_value

def _meaning(self, x):
return f(x.mean()-self.threshold_)

def predict(self, X, feature_names):
-print X
+print(X)
X = np.array(X)
assert len(X.shape) == 2, "Incorrect shape"

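A note on the `encoding='latin1'` change above: arrays pickled under Python 2 contain byte strings that Python 3's unpickler rejects by default, so the file must be opened in binary mode (`'rb'`) and `np.load` told how to decode them. A minimal sketch, assuming a `model.npy` written from Python 2 as in this example:

```python
import numpy as np

# Load an .npy file that was saved under Python 2.
# 'rb' is required because np.load reads binary data, and
# encoding='latin1' lets the unpickler decode Python 2 byte strings.
with open("model.npy", "rb") as f:
    X = np.load(f, encoding="latin1")
print(X.mean())
```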
2 changes: 0 additions & 2 deletions examples/models/mean_classifier/requirements.txt
@@ -1,4 +1,2 @@
numpy==1.11.2
scikit-learn==0.17.1
pandas==0.18.1
scipy==0.18.1
2 changes: 1 addition & 1 deletion wrappers-docker/Makefile
@@ -1,5 +1,5 @@
IMAGE_NAME=docker.io/seldonio/core-python-wrapper
-IMAGE_VERSION=0.6
+IMAGE_VERSION=0.7

SELDON_CORE_DIR=..

2 changes: 1 addition & 1 deletion wrappers/Makefile
@@ -2,5 +2,5 @@ update_protos:
cp ../proto/prediction.proto ./python/proto

build_protos:
-python -m grpc.tools.protoc -I./python/proto --python_out=./python/proto --grpc_python_out=./python/proto ./python/proto/prediction.proto
+python -m grpc.tools.protoc -I./python --python_out=./python --grpc_python_out=./python ./python/proto/prediction.proto

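The include-path change above is itself a Python 3 fix: generating with `-I./python` makes protoc treat `proto` as a package, so the generated `prediction_pb2_grpc.py` uses the absolute import `from proto import prediction_pb2` rather than the implicit relative import that Python 3 no longer allows. A sketch of the resulting usage, assuming `python/` is on `sys.path`, `python/proto/` contains an `__init__.py`, and `prediction.proto` defines the `SeldonMessage` message:

```python
# Works on both Python 2 and 3 once the protos are rebuilt as above.
from proto import prediction_pb2
from proto import prediction_pb2_grpc  # stub/servicer classes, used by tester.py

request = prediction_pb2.SeldonMessage()  # empty protobuf message
print(request.ByteSize())                 # 0 for an empty message
```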
2 changes: 1 addition & 1 deletion wrappers/python/microservice.py
@@ -180,6 +180,6 @@ def parse_parameters(parameters):
server.add_insecure_port("0.0.0.0:{}".format(port))
server.start()

print "GRPC Microservice Running on port {}".format(port)
print("GRPC Microservice Running on port {}".format(port))
while True:
time.sleep(1000)
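The sleep loop kept by this diff is the standard keep-alive idiom for grpcio of this era: `server.start()` does not block, and older grpcio releases have no `wait_for_termination()`. A minimal self-contained sketch; the servicer registration is elided as a comment because it depends on the generated code:

```python
import time
from concurrent import futures
import grpc

server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
# prediction_pb2_grpc.add_ModelServicer_to_server(servicer, server) would go here
server.add_insecure_port("0.0.0.0:5000")
server.start()  # non-blocking

print("GRPC Microservice Running on port {}".format(5000))
try:
    while True:
        time.sleep(1000)  # keep the main thread alive
except KeyboardInterrupt:
    server.stop(0)
```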
4 changes: 2 additions & 2 deletions wrappers/python/model_microservice.py
@@ -36,8 +36,8 @@ def get_rest_microservice(user_model,debug=False):
@app.errorhandler(SeldonMicroserviceException)
def handle_invalid_usage(error):
response = jsonify(error.to_dict())
print "ERROR:"
print error.to_dict()
print("ERROR:")
print(error.to_dict())
response.status_code = 400
return response

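For context, the handler touched here is the standard Flask pattern for turning a custom exception into a JSON error response. A self-contained sketch; the exception class and payload are illustrative stand-ins for `SeldonMicroserviceException`, not Seldon's actual definitions:

```python
from flask import Flask, jsonify

app = Flask(__name__)

class MicroserviceError(Exception):
    """Illustrative stand-in for SeldonMicroserviceException."""
    def __init__(self, message):
        super(MicroserviceError, self).__init__(message)
        self.message = message

    def to_dict(self):
        return {"status": {"info": self.message, "code": 400}}

@app.errorhandler(MicroserviceError)
def handle_invalid_usage(error):
    response = jsonify(error.to_dict())
    print("ERROR:")           # print-function form, as converted in this PR
    print(error.to_dict())
    response.status_code = 400
    return response
```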
6 changes: 3 additions & 3 deletions wrappers/python/outlier_detector_microservice.py
@@ -27,8 +27,8 @@ def get_rest_microservice(user_model,debug=False):
@app.errorhandler(SeldonMicroserviceException)
def handle_invalid_usage(error):
response = jsonify(error.to_dict())
print "ERROR:"
print error.to_dict()
print("ERROR:")
print(error.to_dict())
response.status_code = 400
return response

@@ -67,7 +67,7 @@ def TransformInput(self,request,context):
outlier_score = score(self.user_model,features,datadef.names)

request.meta.tags["outlierScore"] = outlier_score

return request

def get_grpc_server(user_model,debug=False):
7 changes: 6 additions & 1 deletion wrappers/python/persistence.py
@@ -1,7 +1,12 @@
import threading
import os
import time
-import cPickle as pickle
+try:
+    # python 2
+    import cPickle as pickle
+except ImportError:
+    # python 3
+    import pickle
import redis


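The try/except import above is the usual 2-to-3 compatibility shim: `cPickle` disappeared in Python 3, where the C implementation backs plain `pickle`. A brief sketch of the pattern in use; the payload is illustrative:

```python
try:
    import cPickle as pickle   # Python 2: C-accelerated pickle
except ImportError:
    import pickle              # Python 3: C implementation is built in

state = {"threshold": 0.5, "counts": [1, 2, 3]}  # illustrative payload
blob = pickle.dumps(state, protocol=2)  # protocol 2 is readable by both 2 and 3
print(pickle.loads(blob))
```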
1 change: 0 additions & 1 deletion wrappers/python/seldon_requirements.txt
@@ -1,6 +1,5 @@
numpy==1.11.2
pandas==0.18.1
-grpc==0.3.post19
grpcio==1.1.3
Flask==0.11.1
futures
4 changes: 2 additions & 2 deletions wrappers/python/transformer_microservice.py
@@ -48,8 +48,8 @@ def get_rest_microservice(user_model,debug=False):
@app.errorhandler(SeldonMicroserviceException)
def handle_invalid_usage(error):
response = jsonify(error.to_dict())
print "ERROR:"
print error.to_dict()
print("ERROR:")
print(error.to_dict())
response.status_code = 400
return response

2 changes: 1 addition & 1 deletion wrappers/python/wrap_model.py
@@ -15,7 +15,7 @@ def wrap_model(
**wrapping_arguments):
if os.path.isdir(build_folder):
if not force_erase:
print "Build folder already exists. To force erase, use --force argument"
print("Build folder already exists. To force erase, use --force argument")
exit(0)
else:
shutil.rmtree(build_folder)
20 changes: 10 additions & 10 deletions wrappers/tester.py
@@ -123,36 +123,36 @@ def run(args):
for i in range(args.n_requests):
batch = generate_batch(contract,args.batch_size)
if args.prnt:
-print '-'*40
-print "SENDING NEW REQUEST:"
+print('-'*40)
+print("SENDING NEW REQUEST:")

if not args.grpc:
REST_request = gen_REST_request(batch,features=feature_names,tensor=args.tensor)
if args.prnt:
-print REST_request
+print(REST_request)

response = requests.post(
REST_url,
data={"json":json.dumps(REST_request),"isDefault":True})
jresp = response.json()

if args.prnt:
print "RECEIVED RESPONSE:"
print jresp
print
print("RECEIVED RESPONSE:")
print(jresp)
print()
else:
GRPC_request = gen_GRPC_request(batch,features=feature_names,tensor=args.tensor)
if args.prnt:
-print GRPC_request
+print(GRPC_request)

channel = grpc.insecure_channel('{}:{}'.format(args.host,args.port))
stub = prediction_pb2_grpc.ModelStub(channel)
response = stub.Predict(GRPC_request)

if args.prnt:
print "RECEIVED RESPONSE:"
print response
print
print("RECEIVED RESPONSE:")
print(response)
print()


if __name__ == "__main__":
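Most of this PR is the mechanical print-statement-to-function conversion seen above. For code that must keep running on both interpreters during a migration, the `__future__` import makes the Python 3 form valid on Python 2.7 as well; a brief sketch, not part of this PR:

```python
from __future__ import print_function  # makes print a function on Python 2 too

print("RECEIVED RESPONSE:")  # identical behaviour on 2.7 and 3.x
print()                      # blank line on both; bare `print` is a SyntaxError on 3
```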