How to Run TensorFlow Lite Models on macOS

This guide shows how to set up a TensorFlow Lite Runtime environment on a macOS device. We'll use Anaconda to create a Python virtual environment to install the TFLite Runtime in. It's easy!

Acknowledgement: Thanks go to Max Hancock for contributing this guide!

Step 1. Download and Install Anaconda

First, install Anaconda, which is a Python environment manager that greatly simplifies Python package management and deployment. Anaconda allows you to create Python virtual environments on your Mac without interfering with existing installations of Python. Go to the Anaconda Downloads page and click the Download button.

When the download finishes, open the downloaded .pkg installer and step through the installation wizard. Use the default install options.

Step 2. Set Up Virtual Environment and Directory

First, open the Terminal: open a Finder window, press Command + Shift + U to go to the Utilities folder, and then double-click Terminal. We'll create a folder called tflite1 directly in your home folder (under your username). You can use any other folder location you like; just modify the commands below to use the correct file paths. Create the folder and move into it by issuing the following commands in the terminal:

mkdir ~/tflite1
cd ~/tflite1

Next, create a Python 3.9 virtual environment by issuing:

conda create --name tflite1-env python=3.9

Enter "y" then "ENTER" when it asks if you want to proceed. Activate the environment and install the required packages by issuing the commands below. We'll install TensorFlow, OpenCV, and a downgraded version of protobuf. TensorFlow is a pretty big download (about 450MB), so it will take a while.

conda activate tflite1-env
pip install tensorflow
pip install opencv-python
pip uninstall protobuf
pip install protobuf==3.20.0
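
If you want to confirm the packages installed correctly, you can run a couple of quick version checks from the activated environment (an optional sanity check, not required for the rest of the guide):

python -c "import tensorflow as tf; print(tf.__version__)"
python -c "import cv2; print(cv2.__version__)"
python -c "import google.protobuf; print(google.protobuf.__version__)"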

Download the detection scripts from this repository by issuing:

curl https://raw.githubusercontent.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/master/TFLite_detection_image.py --output TFLite_detection_image.py
curl https://raw.githubusercontent.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/master/TFLite_detection_video.py --output TFLite_detection_video.py
curl https://raw.githubusercontent.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/master/TFLite_detection_webcam.py --output TFLite_detection_webcam.py
curl https://raw.githubusercontent.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/master/TFLite_detection_stream.py --output TFLite_detection_stream.py

Step 3. Move TFLite Model into Directory

Next, take the custom TFLite model that was trained and downloaded from the Colab notebook and move it into the ~/tflite1 directory. If you downloaded it from Colab, it should be in a file called custom_model_lite.zip. (If you haven't trained a model yet and just want to test one out, download my "change counter" model by clicking this Dropbox link.) Move that file into ~/tflite1. Once it's moved, unzip it using:

tar -xf custom_model_lite.zip

At this point, you should have a folder at ~/tflite1/custom_model_lite which contains at least a detect.tflite and labelmap.txt file.
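
If you'd like to verify the model loads correctly before running the full detection scripts, the short Python snippet below is one way to do it. It's an optional sanity check that assumes the folder layout described above (custom_model_lite/detect.tflite and custom_model_lite/labelmap.txt) and should be run from the tflite1 directory:

# Optional sanity check: load the TFLite model and inspect it
import tensorflow as tf

# Load the interpreter and allocate tensors for the detection model
interpreter = tf.lite.Interpreter(model_path="custom_model_lite/detect.tflite")
interpreter.allocate_tensors()

# Print the expected input shape (typically [1, height, width, 3])
input_details = interpreter.get_input_details()
print("Model input shape:", input_details[0]["shape"])

# Print the first few class names from the labelmap
with open("custom_model_lite/labelmap.txt") as f:
    labels = [line.strip() for line in f if line.strip()]
print("First labels:", labels[:5])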

Step 4. Run TensorFlow Lite Model!

Now, just call one of the detection scripts and point it at your model folder with the --modeldir option. For example, to run your custom_model_lite model on a webcam, issue:

python TFLite_detection_webcam.py --modeldir=custom_model_lite

A window will appear showing detection results drawn on the live webcam feed. Make sure to allow Terminal to access the webcam when macOS asks for permission. For more information on how to use the detection scripts, such as running them on an image, video, or web stream, please see Step 3 in the main README page.
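
Each detection script also accepts additional command-line options beyond --modeldir. You can list them by passing the standard --help flag to any of the scripts, for example:

python TFLite_detection_image.py --help
python TFLite_detection_video.py --help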

Have fun using TensorFlow Lite! Stay tuned for more examples on how to build cool applications around your model.