Getting Started with TensorBoard in Four Lines

The benefits far outweigh the cost; it doesn't make sense not to add these lines to your code.

Introduction

I recently completed my first Computer Vision project making use of Convolutional Neural Networks (CNNs) built with Google’s machine learning library TensorFlow. Now if you know anything about me, I like cool things. Don’t get me wrong, CNNs are cool. They’re probably one of the coolest parts of data science and might even be cooler than maps (which is a big statement coming from me). But with all the material out there covering the ins and outs of CNNs, I’ve decided to focus on a small but important aspect of them.

Charts.

Charts are cool.

But wait!

Training a CNN can be an extremely resource-intensive task. This article doesn’t go into the details of building one, but before you add TensorBoard to the mix, you should make sure your resources are set up properly. Thankfully, I’m working on a fairly capable desktop now, but this same model would turn my poor laptop into an incredibly loud yet inefficient space heater. Using a GPU will be much easier on your machine and will also save time: I’m talking hours of training cut down to minutes or seconds. If you’re working on a laptop or haven’t configured TensorFlow to utilize your GPU, do yourself a favor and get comfortable using Google Colab. It’s a free resource that provides GPU-accelerated processing for all your machine learning needs! Here’s a great article to get you started.

Setting Up Google Colab for CNN Modeling

Now it doesn’t make any sense to spend all this time building a model if you don’t know how it’s performing. While building multiple iterations of the model, I found it difficult to keep track of the changes between each iteration and their effect on the outcome. I made charts to quickly display the accuracy and loss of my training and validation sets after fitting each model, but comparing them meant scrolling through dozens of lines of code and epoch results just to see each chart.

Sure, I could combine the charts, but then I’d have to decipher what each line means or arrange each as a subplot, and after a dozen or so iterations things can get a bit hectic.


Getting Started

What I found was TensorBoard. TensorBoard is a visualization tool built right into TensorFlow. I still have my charts in my notebook to see at a glance how my model performs as I’m making changes, but after all of the iterations, I can open up TensorBoard in my browser to see how they all compare to one another, wrapped up in a nice and easy UI.

Here’s what we do:

  • Import TensorBoard
  • Instantiate TensorBoard with a path and pass it as a callback
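Those two steps can be sketched as follows. The four lines in question are the import, the log name, the callback, and passing it to `fit()`; the toy model and random data here are just stand-ins for your own CNN, and the log name follows the convention described below.

```python
import time
import numpy as np
import tensorflow as tf

# 1) Import TensorBoard (the Keras callback).
from tensorflow.keras.callbacks import TensorBoard

# 2) Instantiate it with a log path and pass it to fit() as a callback.
NAME = f"adam-32-64-{int(time.time())}"
tensorboard = TensorBoard(log_dir=f"logs/{NAME}")

# A toy model just to show the wiring; substitute your own CNN here.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

rng = np.random.default_rng(0)
X = rng.random((64, 8))
y = rng.integers(0, 2, size=64)

# TensorBoard writes accuracy and loss scalars to logs/<NAME> each epoch.
model.fit(X, y, epochs=2, validation_split=0.25,
          callbacks=[tensorboard], verbose=0)
```

That's it — every subsequent call to `fit()` with the callback attached produces a new run in the `logs` folder for TensorBoard to pick up.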

A couple of things to note:

Logs

  • You can set the log path to wherever you’d like to save your TensorBoard logs. This path is important because it will be used when launching TensorBoard later.

Names

  • Descriptive naming isn’t necessary, but it is incredibly helpful. Initially, I only used date/time stamps as log names, and it was impossible to remember what the topology of the network was when I was looking at the results. The naming convention I decided to go with was the optimizer used, followed by the number of neurons in each layer. A name like "adam-32-64-128-256" represents a CNN using the "Adam" optimizer followed by 4 convolutional layers with 32, 64, 128, and 256 neurons, in that order. I included the timestamp afterward to avoid writing over my logs and to let me filter through them by date and time.
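A quick sketch of that naming convention (the variable names here are illustrative, not from any library):

```python
import time

optimizer = "adam"
neurons = [32, 64, 128, 256]  # neurons per convolutional layer, in order

# optimizer, then layer widths, then a timestamp so runs never collide
log_name = f"{optimizer}-{'-'.join(map(str, neurons))}-{int(time.time())}"
print(log_name)  # e.g. adam-32-64-128-256-<timestamp>
```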

Launch TensorBoard

So far we’ve talked about a lot of setup; now we get to the charts!

From your terminal of choice (here I’m using Anaconda Prompt), navigate to the directory that contains the folder with all of your saved logs and run the command.
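Assuming your logs landed in a folder named `logs` (use whatever path you passed to the callback), the launch command is:

```shell
tensorboard --logdir logs
```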

This launches TensorBoard locally and tells it where to read your log data from. After you hit enter, you’ll see some startup activity in the terminal.

Now, depending on how many logs you’ve created and what version you’re using, you may not see exactly the same output, but the important part is the link it prints — http://localhost:6006/. Copy this into your browser and you’ll be met with the TensorBoard dashboard.

Charts!

Not very exciting yet, right?

On the left side, there are checkboxes next to the name of each log. As you can see, I simply gave each model a number at first, which was a terrible way to go. I then switched to the naming convention I described above, which is better, but I still feel like there might be an even better way. If you have any ideas, let me know!

So what we can do now is click a few of those checkboxes on the left to start seeing the charts! You can also filter your logs if there’s only a certain subset that you want to view and compare.

Here I’m showing how you can filter through logs, expand a chart, adjust the axes and smoothing, and hover to view more information in the interactive chart. Another useful feature is zooming and panning through your chart.


Before we sign off, don’t forget to celebrate and dance for all of the accomplishments and progress you’ve made so far!

Recap

So there’s a nice quick-start version of TensorBoard. There are still a lot of other features available, including flowcharts, histograms, and time series analysis of your results, all of which we may get into another time. Even with only the features I’ve outlined, TensorBoard is incredibly useful for saving all of your logs and letting you review and compare them later. You get all your charts right at your fingertips, and charts are cool.
