This assignment walks you through turning a simple Python application into something closer to production-ready.
The task list is fairly long, so if there's anything you're struggling with, you're welcome to skip ahead. We're not looking for items checked off a laundry list; we'd rather see how you work on the things you're already familiar with.
This assignment makes use of VS Code's remote development features, so that your development environment requires no toil to set up, and is consistent regardless of operating system or other dependencies.
You need only two components:
- A container runtime like Docker Desktop
- Visual Studio Code
- Using Nexar's repository as a template, create a new repository under your own GitHub account, then clone that repo into a local directory, e.g. `$HOME/nexar-assignment`. Using the GitHub CLI you can do this with a single command:

  ```sh
  cd $HOME
  gh repo create \
    nexar-assignment \
    --template https://github.com/getnexar/infra-eng-assignment.git \
    --private \
    --confirm
  ```

  (The `--private` and `--confirm` flags are optional.)
- Open your working copy in VS Code (e.g. `code $HOME/nexar-assignment` on macOS). A prompt should pop up asking you to open the directory in a container. Click 'Yes', let it build the dev container (it takes 1-3 minutes), and once it's done, you're good to go.
The `doc-search` application implements a simple search endpoint over a set of documents. More specifically, given a dataset of documents, where each document has a numeric identifier, the endpoint returns the IDs of all documents that contain ALL of the words in the `q` query parameter.

For example, if the web server is serving at http://localhost:8080, and the words `hello` and `world` both exist only in document 1, then the command `curl http://localhost:8080/?q=hello+world` should return:

```json
{
  "results": ["1"]
}
```
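To make the expected semantics concrete, here is a minimal sketch of the AND-over-an-inverted-index lookup described above. The names (`INDEX`, `search`) and the toy data are illustrative assumptions, not the actual `doc-search` implementation.

```python
# Illustrative only: a toy inverted index mapping each word to the set of
# document IDs that contain it. The real doc-search code may differ.
INDEX = {
    "hello": {"1"},
    "world": {"1", "2"},
}

def search(query: str) -> list[str]:
    """Return IDs of documents containing ALL words in the query."""
    words = query.split()
    if not words:
        return []
    # Intersect the posting sets of every word in the query.
    result = set(INDEX.get(words[0], set()))
    for word in words[1:]:
        result &= INDEX.get(word, set())
    return sorted(result)

print(search("hello world"))  # ['1']
```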
- Run unit tests: `cd doc-search/src/ && python -m unittest -b test_index`
- Build the container image: `docker build . -t doc-search`
- Run the app: `docker run -p 8080:8080 doc-search`
- Test the app: you can use `curl` to query it. For example, `curl http://localhost:8080/?q=hello+world` will return a JSON document with the IDs of all documents containing both `hello` and `world`.
The app currently has a `Dockerfile` included under `doc-search/`.

- Every commit to application code (`.py` files) results in a slow build of the container image. Modify the `Dockerfile` to make the build faster.
- How can you minimize the size of the resulting container image? Modify the `Dockerfile` or describe your solution. (One possible direction for both items is sketched below.)
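For illustration only, here is a hedged sketch of the kind of layering that speeds up rebuilds and keeps the image small. It assumes the app installs dependencies from a `requirements.txt` and has a `src/app.py`-style entrypoint; the actual file names under `doc-search/` may differ, so treat this as a direction rather than a drop-in replacement.

```dockerfile
# Sketch only -- file names (requirements.txt, src/, app.py) are assumptions.
# A small base image keeps the final image size down.
FROM python:3.11-slim

WORKDIR /app

# Copy and install dependencies first, so this layer is cached and rebuilt
# only when requirements.txt changes -- not on every .py edit.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code last; only this cheap layer is rebuilt on code changes.
COPY src/ ./src/

EXPOSE 8080
CMD ["python", "src/app.py"]
```

If the dependency install needs build tools, a multi-stage build goes further: compile wheels in a builder stage, then copy only the installed packages into a slim runtime stage.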
Here you will deploy the application to a local Minikube. If you're working in VS Code in a container, then Minikube is already installed in your environment.

- Implement a minimal Helm chart for this application.
- Deploy the chart to Minikube, under the `default` namespace.
- Verify that you can call the service from outside the cluster.
- We want Kubernetes to tolerate a slow start for our app. Implement this behavior in your chart. Bonus points if you can simulate a slow start and test your solution. (A hedged probe sketch follows this list.)
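To verify external access, `minikube service doc-search --url` or `kubectl port-forward svc/doc-search 8080:8080` are the usual options (the service name `doc-search` is an assumption; use whatever your chart creates). For the slow-start requirement, below is a hedged sketch of the kind of `startupProbe` a chart's deployment template might render; the names, image reference, and probe path are assumptions, and in a real chart these would come from `values.yaml`.

```yaml
# Sketch only -- names, image, and probe path are assumptions; a real Helm
# template would pull these from values.yaml.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: doc-search
spec:
  replicas: 1
  selector:
    matchLabels:
      app: doc-search
  template:
    metadata:
      labels:
        app: doc-search
    spec:
      containers:
        - name: doc-search
          image: doc-search:latest
          ports:
            - containerPort: 8080
          # Allow up to 30 * 5s = 150s for the app to start before Kubernetes
          # treats the container as failed.
          startupProbe:
            httpGet:
              path: /   # assumption: point this at a cheap endpoint your app serves
              port: 8080
            failureThreshold: 30
            periodSeconds: 5
          # Once started, keep checking that the app can serve traffic.
          readinessProbe:
            httpGet:
              path: /
              port: 8080
            periodSeconds: 10
```

To simulate a slow start, one simple option is to have the container sleep before the server begins listening, then confirm that Kubernetes keeps the pod un-Ready while it waits instead of restarting the container.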
- In the app's Python code, instrument latency of the `search/` endpoint, and expose a metrics HTTP endpoint on port `8000`. You may use any open-source library for this purpose.
- Add code and/or configuration that installs Prometheus onto the k8s cluster and configures it to scrape metrics from the app.
- Using a load generator like `hey`, generate some load on the app.
- Using the built-in web UI for Prometheus, chart the p50, p90, p99 latencies of `search/` requests over the load you generated before. (Sketches of one possible instrumentation approach and the corresponding queries follow this list.)
- (Bonus) Which other key metrics are important/useful to instrument in a web service like this? Add them as you see fit and show how you can query them in Prometheus.
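For the instrumentation item, one widely used option is the official `prometheus_client` library. A minimal sketch, using a stand-in handler and an assumed metric name (the real `doc-search` code is structured differently), could look like this:

```python
# Sketch only: the real doc-search handler is structured differently, and the
# metric name is an assumption.
import time

from prometheus_client import Histogram, start_http_server

# Latency histogram for search requests, in seconds.
SEARCH_LATENCY = Histogram(
    "search_request_duration_seconds",
    "Latency of search requests in seconds",
)

def do_search(query: str) -> list[str]:
    """Stand-in for the app's real search logic."""
    time.sleep(0.01)  # simulate work
    return ["1"]

@SEARCH_LATENCY.time()  # records each call's duration as a histogram observation
def handle_search(query: str) -> list[str]:
    return do_search(query)

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics on port 8000
    while True:
        handle_search("hello world")
```

With the app deployed and scraped, load can be generated with something like `hey -z 30s -c 10 'http://<service-url>/?q=hello+world'`, and the latency quantiles charted in the Prometheus UI with `histogram_quantile` over the histogram buckets (metric name as assumed above):

```promql
# p99 over the last 5 minutes; use 0.50 / 0.90 for p50 / p90.
histogram_quantile(0.99, sum(rate(search_request_duration_seconds_bucket[5m])) by (le))
```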
Good luck!