Neural Networks on Mobile Devices with TensorFlow Lite: A Tutorial

By SAGAR SHARMA

This is a practical, end-to-end guide to building a mobile application with TensorFlow Lite that classifies images from a dataset, which you can adapt for your own projects.

The application uses the live camera feed and classifies objects in real time. The TFLite application will be smaller, faster, and more accurate than one built with TensorFlow Mobile, because TFLite is designed specifically to run neural networks on mobile platforms.

We’ll use the MobileNet model as the base for retraining, which will keep the app small.

  1. Python 3.5 or higher — check with python3 -V
  2. TensorFlow 1.9 or higher — pip3 install --upgrade tensorflow

Also, open the terminal and type:

alias python=python3

Now python3 will run when you type the python command. This makes it easier to copy-paste the code in this tutorial without having to remember the 3 after typing python.

Let’s start by downloading the code from the tensorflow-for-poets GitHub repository. Open the command prompt in the directory where you want to download the folder and type:
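A sketch of the clone step, assuming the standard codelab repository at googlecodelabs/tensorflow-for-poets-2:

```shell
# Clone the codelab repository and enter it
git clone https://github.com/googlecodelabs/tensorflow-for-poets-2
cd tensorflow-for-poets-2
```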

This will download the files and create a new folder called tensorflow-for-poets-2 in your current directory.


FYI: You can change the name of the folder to your project name after downloading.

Info: The folder contains these sub-folders:
scripts — the machine learning code (.py files).
tf_files — will contain output files such as the model (graph.pb) and labels.txt.
android — the Android app projects for both TFMobile and TFLite.
ios — the iOS app project files (Xcode).

Let’s download a publicly available ~200MB dataset containing 5 different kinds of flowers to classify. Then extract flower_photos.tgz inside the tf_files folder, which will look something like this:
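A sketch of the download-and-extract step, assuming the flower dataset is hosted at download.tensorflow.org as in the official codelab:

```shell
# Download the ~200MB flower dataset and extract it into tf_files/
curl http://download.tensorflow.org/example_images/flower_photos.tgz \
  | tar xz -C tf_files
```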

tensorflow-for-poets-2 > tf_files > flower_photos
Info — The 5 different category folders are Rose, Daisy, Dandelion, Sunflower and Tulip.

Open the command prompt inside the “tensorflow-for-poets-2” folder and type:
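The retraining command looks roughly like the following; the exact flags (bottleneck and summaries directories, architecture string) are assumptions based on the scripts/retrain.py interface:

```shell
# Retrain the final layer of MobileNet on the flower photos
python -m scripts.retrain \
  --bottleneck_dir=tf_files/bottlenecks \
  --model_dir=tf_files/models/ \
  --summaries_dir=tf_files/training_summaries/mobilenet_1.0_224 \
  --output_graph=tf_files/retrained_graph.pb \
  --output_labels=tf_files/retrained_labels.txt \
  --architecture=mobilenet_1.0_224 \
  --image_dir=tf_files/flower_photos
```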


This will download the pre-trained frozen graph mobilenet_1.0_224 and create the retrained_graph.pb and retrained_labels.txt files in the tf_files folder.


Open another command prompt in the current directory and point TensorBoard at the summaries directory:
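Assuming the retrain script wrote its summaries under tf_files/training_summaries, the TensorBoard invocation would be:

```shell
# Point TensorBoard at the training summaries (serves on port 6006 by default)
tensorboard --logdir tf_files/training_summaries
```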


Now you can open port 6006 in your browser (http://localhost:6006) to see the results.

TensorBoard: cross entropy/loss (left) and accuracy (right); 🔴 training, 🔵 validation.

Validation accuracy is above 0.90 and validation loss is below 0.4.

I downloaded a random rose image from the Internet into the current folder and ran the label_image.py script to classify the --image using the --graph file.
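A sketch of that invocation; the image filename new_rose.jpg is an assumption:

```shell
# Classify a single image with the retrained graph and labels
python -m scripts.label_image \
  --graph=tf_files/retrained_graph.pb \
  --labels=tf_files/retrained_labels.txt \
  --image=new_rose.jpg
```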


The script labels the new_rose image as a rose with 99% confidence.

TOCO (the TensorFlow Lite Optimizing Converter) is used to convert the retrained graph to the .lite format. For more detail on the toco arguments, run toco --help.
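A sketch of the conversion, using the flag names from recent TOCO versions; the input/output array names and input shape are assumptions based on the MobileNet 224×224 retraining setup:

```shell
# Convert the retrained frozen graph to TensorFlow Lite format
toco \
  --graph_def_file=tf_files/retrained_graph.pb \
  --output_file=tf_files/optimized_graph.lite \
  --output_format=TFLITE \
  --input_shape=1,224,224,3 \
  --input_array=input \
  --output_array=final_result \
  --inference_type=FLOAT
```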

This will create an optimized_graph.lite file in your tf_files directory.

Info: In recent TOCO versions, --input_file has been renamed to --graph_def_file, and --input_format is not needed for the MobileNet graphs.
From here on, the tutorial is divided into two sections: iOS and Android.

Download Xcode

Install Xcode

Install Cocoapods

Install TFLite Cocoapod

Open the project with Xcode
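On macOS, the CocoaPods steps above would look something like this; the ios/tflite project directory and the workspace name are assumptions based on the repository layout:

```shell
# Install CocoaPods, then the TFLite pod for the example app
sudo gem install cocoapods
pod install --project-directory=ios/tflite/

# Open the generated workspace (not the .xcodeproj) in Xcode
open ios/tflite/tflite_camera_example.xcworkspace
```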

Press ▶️ in Xcode to run the app in the simulator.

First move the trained files into the assets folder of the application.

Replace the graph.lite file.

And then the labels.txt file.
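A sketch of the file replacement, assuming the iOS app keeps its model files in ios/tflite/data:

```shell
# Replace the demo model and labels with the retrained ones
cp tf_files/optimized_graph.lite ios/tflite/data/graph.lite
cp tf_files/retrained_labels.txt ios/tflite/data/labels.txt
```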

Now just click the ▶️ to open the simulator and drop images to see the results.

👏 👏 👏 Congratulations! Now you can apply the same method in your next gazillion-dollar app, enable doctors to work faster and better without expensive equipment in rural parts of the world, or just have fun. 👏👏 👏

There are two ways to do this: Android Studio and Bazel. I’ll be using Android Studio, since more people are familiar with it.

If you don’t have it installed already, download it from the official Android Studio website and install it.

To make sure everything is working correctly in Android Studio, let’s do a test run.

🔸 Open Android Studio and select “📁Open an existing Android Studio project”.

🔸 Go to android/tfmobile directory.

🔸 If everything works perfectly, click Build > Build APK.

Open the folder containing the app-debug.apk file by clicking “locate” in the notification.

Note: Turn on developer mode on your phone before installing the app.

First move the trained files into the assets folder of the application.

Replace the graph.lite file.

And then the labels.txt file.
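A sketch of the same replacement for the Android app, assuming its assets live under android/tflite/app/src/main/assets:

```shell
# Replace the demo model and labels in the Android app's assets
cp tf_files/optimized_graph.lite android/tflite/app/src/main/assets/graph.lite
cp tf_files/retrained_labels.txt android/tflite/app/src/main/assets/labels.txt
```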

Now click Build > Build APK.

Install the .apk file on your phone and watch the re-trained neural network detect objects.

👏 👏 👏 Congratulations! Now you can apply the same method in your next gazillion dollar app, enable doctors to work faster and better without expensive equipment in rural parts of the world, or just have fun. 👏👏 👏

If you hit a wall while implementing this post, reach out in the comments.

For further tutorials on how to use TensorFlow in mobile apps, follow me on Medium and Twitter to see similar posts.

Clap it! Share it! Follow Me!

Happy to be helpful. kudos…

Discuss this post on Hacker News and Reddit.