
inbox | next ➡️

ML Ops 1: I heard that you know programming...

Hi!

Our scientists have found a way to improve the search algorithms of the Ice-seeker Martian Rovers. It should help us reach sustainability a couple of months earlier! There is one catch - the new algorithm requires special hardware and can run only on our mainframe. So we need to set up an API through which the Rovers can submit Martian soil scans to the ML model and get the results back.

I heard that you know programming. Could you take a stab at implementing that API wrapper for us?

The model can be represented by this Python script (we'll plug in the real model later where time.sleep currently is). It reads a request from stdin, runs the computations, and writes the result back to stdout. The output may contain other text, but the result is always a JSON object on a single line, prefixed by result: .

#!/usr/bin/env python
import sys
import time
import json

def model_predict(data):
    # run dedicated hardware
    print("Start calculations")
    time.sleep(2)
    print("Done")
    # we have the result now
    result = { 'ice_found': True }
    return result

def main():
    print("Loading data from the stdin")

    request = json.load(sys.stdin)

    print("input: " + json.dumps(request))
    result = model_predict(request)
    print("result: " + json.dumps(result))

if __name__ == "__main__":
    main()

For example, to execute the model from the command line, you can run:

> echo "[1,2,42]" | python model1.py
Loading data from the stdin
input: [1, 2, 42]
Start calculations
Done
result: {"ice_found": true}

The data scientists would like an HTTP API that handles JSON requests on http://localhost:8080/predict by proxying them to the Python model. You can keep the model source code in the same folder as the HTTP API server.

I understand that this is not exactly what you came to Mars for, but unfortunately you are all we've got. The Colony depends on you.

Best regards, Colony Director

Task

Please make the Data Scientists of Mars happy!

You can use the language of your choice to implement the API that forwards incoming requests to the Python model. Naturally, the API should respond with whatever JSON result the model returns.
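
To give an idea of the expected shape, here is a minimal sketch using only Python's standard library (assuming the model script sits next to the server as model1.py; the handler name is a placeholder and error handling is left out):

#!/usr/bin/env python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        # Read the raw JSON request body sent by the Rover.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8")
        # Forward it to the model over stdin and capture its output.
        proc = subprocess.run(
            ["python", "model1.py"],
            input=body, capture_output=True, text=True, check=True,
        )
        # Only the line prefixed with "result: " carries the JSON answer.
        result = next(
            line[len("result: "):]
            for line in proc.stdout.splitlines()
            if line.startswith("result: ")
        )
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(result.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), PredictHandler).serve_forever()

With the server running, a Rover request can be simulated with curl -d "[1,2,42]" http://localhost:8080/predict.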

Should you have any questions, don't hesitate to ask them in the MLOps Community Slack. You could mention @abdullin to get a faster response.

Solutions

Please feel free to send PRs that add your solution to this list.

Next

✉️ 1 unread message