FastAPI wraps the Satellighte library to serve it as a RESTful API.
From the root directory of the repository, run the following:
pip install fastapi==0.74.1
pip install "uvicorn[standard]==0.17.5"
pip install python-multipart
python deployment/fastapi/service.py
To deploy with Docker instead, run the following from the root directory of the repository:
docker build -t satellighte-fastapi deployment/fastapi/
If a GPU is available, run:
docker run -d --name satellighte-service --rm -p 8080:8080 --gpus all satellighte-fastapi
If no GPU is available, run:
docker run -d --name satellighte-service --rm -p 8080:8080 satellighte-fastapi
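Once the service is up on port 8080, it can be exercised from Python. Below is a minimal sketch using only the standard library; the `/predict` endpoint path, the `file` form-field name, and the JSON response are assumptions (check `deployment/fastapi/service.py` for the actual route), while the multipart body builder is generic.

```python
import io
import json
import urllib.request
import uuid


def build_multipart(field: str, filename: str, data: bytes):
    """Build a multipart/form-data body and its Content-Type header value."""
    boundary = uuid.uuid4().hex
    header = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: application/octet-stream\r\n\r\n"
    )
    body = header.encode() + data + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"


if __name__ == "__main__":
    with open("src/eurosat_samples/AnnualCrop.jpg", "rb") as f:
        payload, content_type = build_multipart("file", "AnnualCrop.jpg", f.read())
    # NOTE: the endpoint path is a guess; see deployment/fastapi/service.py.
    req = urllib.request.Request(
        "http://localhost:8080/predict",
        data=payload,
        headers={"Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read()))
```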
ONNX Runtime inference can lead to faster customer experiences and lower costs.
From the root directory of the repository, run the following:
pip install onnx~=1.11.0
pip install onnxruntime~=1.10.0
python deployment/onnx/export.py
# python deployment/onnx/export.py --model_name mobilenetv2_default_eurosat --version 0
python deployment/onnx/runtime.py
# python deployment/onnx/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx -s src/eurosat_samples/AnnualCrop.jpg
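The runtime step boils down to running the session and mapping the raw output to a class. The postprocessing below is a hedged sketch of that mapping; the label order is copied from the `-l` example used later in this document and should be verified against your training configuration, and the `onnxruntime` call under the main guard assumes a single logits output (inspect `deployment/onnx/runtime.py` for the real flow).

```python
import math

# Assumed label order, taken from the -l flag example in this document.
LABELS = [
    "AnnualCrop", "PermanentCrop", "Forest", "HerbaceousVegetation", "Highway",
    "Industrial", "Pasture", "Residential", "River", "SeaLake",
]


def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def top_prediction(logits, labels=LABELS):
    """Map raw model output to (label, probability)."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]


if __name__ == "__main__":
    import numpy as np
    import onnxruntime as ort  # assumed usage; see deployment/onnx/runtime.py

    sess = ort.InferenceSession(
        "satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx"
    )
    inp = sess.get_inputs()[0]
    # Dummy input (dynamic dims set to 1); the real script feeds a preprocessed image.
    shape = [d if isinstance(d, int) else 1 for d in inp.shape]
    logits = sess.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})[0][0]
    print(top_prediction(list(logits)))
```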
Neural Magic's DeepSparse Engine is able to integrate into popular deep learning libraries allowing you to leverage DeepSparse for loading and deploying sparse models with ONNX.
Run the following from the root directory of the repository. The ONNX model is required; create it with the export steps above. Then:
pip install deepsparse~=1.0.2
python deployment/deepsparse/runtime.py
# python deployment/deepsparse/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx -s src/eurosat_samples/AnnualCrop.jpg
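A DeepSparse engine is compiled for a fixed batch size, so inputs have to be fed in matching chunks. The batching helper below is generic; the `compile_model` call under the main guard is an assumed usage of the deepsparse ~1.0 API, and the NCHW input shape is a guess (check the exported model and `deployment/deepsparse/runtime.py`).

```python
def batched(items, batch_size):
    """Group items into fixed-size batches, matching the engine's compiled batch size."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]


if __name__ == "__main__":
    import numpy as np
    from deepsparse import compile_model  # assumed API for deepsparse ~1.0

    engine = compile_model(
        "satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx",
        batch_size=1,
    )
    # Dummy NCHW images; the shape is an assumption about the exported model.
    images = [np.zeros((3, 64, 64), dtype=np.float32) for _ in range(3)]
    for batch in batched(images, 1):
        outputs = engine.run([np.stack(batch)])
        print(outputs[0].shape)
```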
TensorFlow is a free and open-source software library for machine learning and artificial intelligence.
From the root directory of the repository, run the following:
pip install onnx-tf~=1.10.0
pip install tensorflow~=2.9.1
pip install tensorflow-probability~=0.17.0
Run the following from the root directory of the repository. The ONNX model is required; create it with the export steps above. Then:
python deployment/tensorflow/export.py
# python deployment/tensorflow/export.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx
python deployment/tensorflow/runtime.py
# python deployment/tensorflow/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow -s src/eurosat_samples/AnnualCrop.jpg -l "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,Industrial,Pasture,Residential,River,SeaLake"
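The `-l` flag passes the class names as one comma-separated string, which the runtime has to split back into an ordered list. The parser below is a small generic sketch of that step; the `tf.saved_model.load` call and the `serving_default` signature key under the main guard are assumptions about the exported SavedModel (see `deployment/tensorflow/runtime.py` for the actual loading code).

```python
def parse_labels(label_arg):
    """Split the comma-separated -l argument into an ordered class-name list."""
    return [name.strip() for name in label_arg.split(",") if name.strip()]


if __name__ == "__main__":
    import tensorflow as tf  # assumed usage; see deployment/tensorflow/runtime.py

    labels = parse_labels(
        "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,"
        "Industrial,Pasture,Residential,River,SeaLake"
    )
    model = tf.saved_model.load(
        "satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow"
    )
    infer = model.signatures["serving_default"]  # signature key is an assumption
    print(len(labels), infer.structured_input_signature)
```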
TensorFlow Lite is a mobile library for deploying models on mobile, microcontrollers and other edge devices.
From the root directory of the repository, run the following:
pip install onnx-tf~=1.10.0
pip install tensorflow~=2.9.1
pip install tensorflow-probability~=0.17.0
Run the following from the root directory of the repository. The TensorFlow model is required; create it with the export steps above. Then:
python deployment/tensorflow_lite/export.py
# python deployment/tensorflow_lite/export.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow
python deployment/tensorflow_lite/runtime.py
# python deployment/tensorflow_lite/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow.tflite -s src/eurosat_samples/AnnualCrop.jpg -l "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,Industrial,Pasture,Residential,River,SeaLake"
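TFLite inference follows a fixed pattern: allocate tensors, set the input, invoke, read the output. The interpreter usage under the main guard is a hedged sketch of that pattern (the model is assumed to take a single input and produce a single score tensor); the `dequantize` helper implements the standard TFLite affine mapping, which is only needed if the exported model is quantized.

```python
def dequantize(values, scale, zero_point):
    """TFLite affine dequantization: real_value = scale * (quantized - zero_point)."""
    return [scale * (v - zero_point) for v in values]


if __name__ == "__main__":
    import numpy as np
    import tensorflow as tf  # assumed usage; see deployment/tensorflow_lite/runtime.py

    interpreter = tf.lite.Interpreter(
        model_path="satellighte/models/mobilenetv2_default_eurosat/v0/"
        "mobilenetv2_default_eurosat_tensorflow.tflite"
    )
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    # Dummy input with the model's declared shape and dtype;
    # the real script feeds a preprocessed image instead.
    interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
    interpreter.invoke()
    print(interpreter.get_tensor(out["index"])[0])
```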