How to use TensorBoard with TensorFlow 2.0 in Google Colaboratory?


Another post starts with you beautiful people!
It is quite a wonderful moment for me that many aspiring Data Scientists like you have connected with me through my Facebook page and have started their focused journey to become Data Scientists by following my book. If you have not, then I recommend at least visiting my last post here.

In two of my previous posts we have learnt about Keras and Colab. In this post I am going to share with you all that TensorFlow 2.0 has been released, and one interesting piece of news about this release is that our beloved deep learning library Keras is built into it. Yes! You heard it right. If you know Keras, then using the TensorFlow 2.0 library is quite easy for you. One of the interesting benefits of using the TensorFlow library is its visualization tool, known as TensorBoard. In this post we are going to learn how to use TensorFlow 2.0 with the MNIST dataset and then set up TensorBoard with Google Colaboratory.

Let's start this post by opening a new notebook in Google Colab and installing the TensorFlow 2.0 library as below-
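A minimal sketch of the install cell (the exact version pin is an assumption; any 2.0.x release should work):

    # Install TensorFlow 2.0 inside the Colab notebook
    !pip install tensorflow==2.0.0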

This will take a minute and will install the TensorFlow 2.0 library in your notebook. After installing the library, we can import it as below-
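A sketch of the import cell, with a version check to confirm that 2.0 is active:

    import tensorflow as tf
    from tensorflow import keras

    print(tf.__version__)  # should print 2.0.x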

Next, we will load the MNIST dataset from the TensorFlow Keras library, split it into training and test sets, and then normalize it like below-
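A sketch of that step, assuming the standard Keras MNIST loader and simple 0-1 scaling:

    # MNIST comes pre-split into training and test sets
    (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

    # Normalize pixel values from the 0-255 range down to 0-1
    x_train, x_test = x_train / 255.0, x_test / 255.0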

Now we will visualize the performance of our model in TensorBoard. For this, we will first use the keras.callbacks.TensorBoard function like below-
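A sketch of the callback definition; the log folder name and the flags follow the description in the next paragraph:

    # Write logs to ./log and turn on the extra visualizations
    tensorboard_callback = keras.callbacks.TensorBoard(
        log_dir='./log',
        histogram_freq=1,    # record weight/activation histograms every epoch
        write_graph=True,    # log the model graph
        write_images=True,   # log model weights as images
    )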
Here, see the parameters we are passing to the TensorBoard function. The ./log folder is the location where TensorFlow will create the logs, and the other parameters are boolean flags set to true, which allow us to visualize different types of properties in TensorBoard.

Next, we will define our sequential Keras model and then compile it like below-
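A sketch of a typical MNIST classifier for this step; the layer sizes are illustrative, not necessarily the original choice:

    # Simple feed-forward network: flatten the 28x28 image, one hidden layer, softmax output
    model = keras.models.Sequential([
        keras.layers.Flatten(input_shape=(28, 28)),
        keras.layers.Dense(512, activation='relu'),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(10, activation='softmax'),
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])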

After compiling the model, we will train and evaluate it like below-
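A sketch of the training and evaluation cell; the TensorBoard callback defined earlier is passed to fit() so the logs get written (the epoch count is an assumption):

    # Train with the TensorBoard callback so metrics are written to ./log
    model.fit(x_train, y_train,
              epochs=5,
              validation_data=(x_test, y_test),
              callbacks=[tensorboard_callback])

    # Evaluate on the held-out test set
    model.evaluate(x_test, y_test)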

Our model has an accuracy of 98%. Since TensorBoard runs on a server, we will now learn the tricky part: how to access a localhost server from our Colab notebook. For this purpose we will use ngrok, a cross-platform application that enables developers to expose a local development server to the Internet with minimal effort. We will download ngrok like below-
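A sketch of the download step, using the ngrok Linux binary commonly fetched in Colab tutorials (the download URL may have changed since this was written):

    # Download and unzip the ngrok binary inside the Colab VM
    !wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
    !unzip ngrok-stable-linux-amd64.zip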

After downloading and installing ngrok in Colab, we will fire up TensorBoard like below-
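A sketch of that cell: TensorBoard is started in the background against the same ./log folder, on its default port 6006:

    # Launch TensorBoard in the background, pointing it at the ./log folder
    LOG_DIR = './log'
    get_ipython().system_raw(
        'tensorboard --logdir {} --host 0.0.0.0 --port 6006 &'.format(LOG_DIR)
    )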

Now we will start the server with ngrok like below-
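A sketch of the tunnel cell: ngrok forwards port 6006, and its local API is queried to print the public URL:

    # Open an ngrok tunnel to TensorBoard's port 6006
    get_ipython().system_raw('./ngrok http 6006 &')

    # Ask ngrok's local API for the public URL of the tunnel
    !curl -s http://localhost:4040/api/tunnels | python3 -c "import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"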
Once your server is up, it will display the URL as in the above output cell. Copy this URL and access it in a browser. You will see your TensorBoard page like below-

See the beautiful histograms showing in your board. Now explore it more by clicking on the different tabs. Here I am viewing the Distributions tab, and the screen looks like below-

If you click on the first tab, Scalars, you will see the performance of our model like below-

You can also get the whole code as a notebook from this link.
That's it! With the above steps you are able to set up TensorBoard with TensorFlow 2.0 in your Colab notebook. We can use TensorBoard to visualize our TensorFlow graph, plot quantitative metrics about the execution of our graph, and show additional data like images that pass through it. This visualization tool is very flexible and useful, so don't wait and explore as much as you can. Till then, go chase your dreams, have an awesome day, make every second count, and see you later in my next post.










