Train Inception with Custom Images on CPU

Train with your own Images with Tensorflow

SAGAR SHARMA
Towards Data Science


In my previous post, we saw how to do image recognition with TensorFlow using the Python API on a CPU, without any training. We used the Inception-v3 model, which Google has already trained on 1000 classes, but what if we want to do the same thing with our own images? We are going to use transfer learning, which lets us retrain only the final layer of the already trained Inception-v3 model on new categories. It takes around 30 minutes on a laptop, without any GPU. For more info refer to this link, or if you only want the Linux commands, go to this link.

We will be training the model on the flowers dataset, which is hosted by TensorFlow and free to download. Before we start, let’s do one test on a random dandelion image on my CPU.
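
(As a quick sketch, assuming the classify_image.py script from the previous post is still on disk, that test is just one command; the image filename here is only an example.)

python classify_image.py --image_file dandelion.jpg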

and the result is…

We will be using Python 3 and TensorFlow 1.4.

If your TensorFlow installation is not up to date, use the following command to upgrade it.

pip install --upgrade tensorflow
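
You can confirm the installed version from the command line:

python -c "import tensorflow as tf; print(tf.__version__)"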

Training on the dataset can be done in only 4 steps, which are as follows:

1. Download the tensorflow-for-poets-2 repository

Let’s start by making a new folder called Flowers_Tensorflow. Now open a command prompt there and type…

git clone https://github.com/googlecodelabs/tensorflow-for-poets-2

This will clone the “tensorflow-for-poets-2” folder from GitHub into your Flowers_Tensorflow folder.

This folder contains the scripts and tf_files folders, among other files.

2. Download the dataset

Go to this link and download the flower data. Then extract the “flower_photos” folder from the .tgz file we just downloaded and paste it into the tf_files folder. This folder contains 5 categories, daisy, dandelion, roses, sunflowers, and tulips, plus a LICENSE.txt file.
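
If you want to double-check that the dataset landed in the right place, a small Python sketch like this (the path assumes the tf_files layout above) lists each category with its image count:

import os

# Assumed location of the extracted dataset from the step above
data_dir = "tf_files/flower_photos"
for category in sorted(os.listdir(data_dir)):
    path = os.path.join(data_dir, category)
    if os.path.isdir(path):
        print(category, "-", len(os.listdir(path)), "images")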

3. Retrain the model

Open a command prompt in the “tensorflow-for-poets-2” folder and type

python scripts/retrain.py --output_graph=tf_files/retrained_graph.pb --output_labels=tf_files/retrained_labels.txt --image_dir=tf_files/flower_photos

Note: This is a single line. Just copy the whole command from above and paste it into the command prompt.

After pressing Enter, the program will start creating .txt files in C:/tmp/bottleneck/roses, C:/tmp/bottleneck/daisy, and so on. It will create around 7,300 bottleneck .txt files that look something like this.
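
Each of those files is simply a comma-separated list of the 2,048 activations that Inception-v3 produces for an image just before its final layer; that cached vector is the “bottleneck”. You can peek at one with a few lines of Python (the filename below is only a hypothetical example; use any file you see in your bottleneck folders):

# Read one cached bottleneck vector (filename is a hypothetical example)
with open("/tmp/bottleneck/roses/some_rose_photo.jpg_inception_v3.txt") as f:
    values = [float(x) for x in f.read().split(",")]
print(len(values))  # expected: 2048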

After that, it will start training and run for around 4,000 steps, like this…

While your computer is training on the new flower dataset, let me break down the command and explain what we just did.

The whole command can be divided into 4 parts:

Invoke/Run the retrain.py script.

python scripts/retrain.py

Make a new graph file in the tf_files folder (after training is completed).

--output_graph=tf_files/retrained_graph.pb

Make a new label file in the tf_files folder (after training is completed).

--output_labels=tf_files/retrained_labels.txt

Point towards the flower dataset directory.

--image_dir=tf_files/flower_photos

Note: You can add/change the arguments in the above command

Use a model other than Inception-v3, e.g. one of the MobileNet models:

--architecture mobilenet_1.0_224

Write summaries for TensorBoard:

--summaries_dir=tf_files/training_summaries/${write the architecture here}

Change the bottleneck directory:

--bottleneck_dir=tf_files/bottlenecks

Change the number of training steps:

--how_many_training_steps=500
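
Putting a few of those flags together, a customized run might look like the line below (the summaries path is just an illustration; drop --architecture if you want to stay with Inception-v3):

python scripts/retrain.py --bottleneck_dir=tf_files/bottlenecks --how_many_training_steps=500 --summaries_dir=tf_files/training_summaries/mobilenet_1.0_224 --architecture=mobilenet_1.0_224 --output_graph=tf_files/retrained_graph.pb --output_labels=tf_files/retrained_labels.txt --image_dir=tf_files/flower_photos

Once some summaries have been written, you can point TensorBoard at them and watch accuracy and cross entropy in your browser:

tensorboard --logdir tf_files/training_summaries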

4. Test the newly trained model

To test the model, just download any image, paste it into the “tensorflow-for-poets-2” folder, and type (where image.png is the name of your file):

python scripts/label_image.py --image image.png

But you might get an error.

To overcome this error, open the label_image.py file in the scripts folder.

Go to line 74 and change the values, or go to this link.

Change:

input_height = 224
input_width = 224
input_mean = 128
input_std = 128
input_layer = "input"

To:

input_height = 299
input_width = 299
input_mean = 0
input_std = 255
input_layer = "Mul"
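
If you are curious what label_image.py does with those values, here is a minimal, self-contained sketch of the same inference step (paths and the image filename follow the examples above; this is an illustration, not the script’s exact code):

import numpy as np
import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # Load the graph that retrain.py wrote to tf_files
    graph_def = tf.GraphDef()
    with tf.gfile.GFile("tf_files/retrained_graph.pb", "rb") as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name="")

    # Decode the test image, resize to 299x299, scale pixels to [0, 1]
    # (matching input_mean = 0, input_std = 255 from label_image.py above)
    image = tf.image.decode_png(tf.read_file("image.png"), channels=3)
    image = tf.expand_dims(tf.cast(image, tf.float32), 0)
    image = tf.image.resize_bilinear(image, [299, 299]) / 255.0

with tf.Session(graph=graph) as sess:
    img = sess.run(image)
    # "Mul" is Inception-v3's input tensor, "final_result" is the retrained layer
    results = sess.run(graph.get_tensor_by_name("final_result:0"),
                       feed_dict={graph.get_tensor_by_name("Mul:0"): img})

labels = [line.strip() for line in open("tf_files/retrained_labels.txt")]
for i in np.argsort(results[0])[::-1]:
    print(labels[i], results[0][i])

Running it should print the five flower labels ranked by confidence, just like label_image.py does.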

Now that we have made the changes, let’s try it on other flowers.

Daisy

Roses

Tulips

OK, let’s stop this. I’m getting nauseated from searching for so many flowers…

If you want to implement the same experiment on a mobile platform for live video, check out this link.

But clap for this one first!!!

I have tried to keep the article as accurate and easy to understand as possible. If you have any comments, suggestions, or questions, write them in the comments.

Follow me on Medium, Facebook, Twitter, LinkedIn, Google+, Quora to see similar posts.

Clap it! Share it! Follow Me!

Happy to be helpful. Kudos…
