Transfer Learning — Introduction (Part 2)

--

In the previous article, we discussed the layout of the Transfer Learning series. In this article, we will build a deeper intuition about Transfer Learning and install the libraries with which we will implement these concepts.

1. Library Installation

We will use the two frameworks mentioned below:

Keras: It is one of the most widely used open-source deep learning libraries. It can be imported in Python, and anyone can use it without knowing its internal workings. It was released under the MIT license and built on top of TensorFlow; with the release of TensorFlow 2.0, Google officially embedded Keras into TensorFlow as tf.keras.

Fig. 1 Keras (Source: blog.keras.io)

Installation:

pip install tensorflow
pip install keras

Version check:

import keras                 # import Keras
print(keras.__version__)     # prints the installed Keras version
import tensorflow as tf      # import TensorFlow
print(tf.__version__)        # prints the installed TensorFlow version

Keras: Link

Tensorflow: Link

PyTorch: It is one of the most widely used open-source deep learning libraries, developed by Facebook AI Research and based on the Torch library. It can be imported in Python, and anyone can use it without knowing its internal workings.

Fig. 2 PyTorch (Source: pytorch.org)

Installation:

pip install torch

Version check:

import torch                 # import PyTorch
print(torch.__version__)     # prints the installed PyTorch version

PyTorch: Link


2. Transfer Learning

In Artificial Intelligence, the difficulty of collecting huge datasets has been a persistent problem from the beginning and acted as a roadblock for decades in applying AI to different sectors. However, researchers found two different ways to counter this roadblock:

Synthetic data: Researchers developed models that can produce datasets similar to the original one by taking its statistics into consideration. Many libraries now exist that create synthetic datasets which can be used to train a model. This topic is out of scope for this series, so only a brief idea is given here (see the small sketch after this list).

Transfer Learning: This was one of the major breakthroughs worldwide, enabling researchers to transfer the features learned by an already trained model into their own model and fine-tune the new model on their own, usually small, dataset. In other words, researchers found a way to use pre-trained models to train new neural network architectures.
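
To make the synthetic-data idea slightly more concrete, here is a minimal sketch (an assumed illustration with made-up feature values, not a library we will use in this series) that draws new samples mimicking the mean and covariance of a small "original" dataset:

import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a small "original" dataset with two features (values are assumptions).
original = rng.normal(loc=[5.0, 2.0], scale=[1.0, 0.5], size=(100, 2))

mean = original.mean(axis=0)          # per-feature mean of the original data
cov = np.cov(original, rowvar=False)  # covariance between the features
synthetic = rng.multivariate_normal(mean, cov, size=1000)  # statistically similar samples
print(synthetic.shape)                # (1000, 2)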


Fig. 3 Transfer Learning: spanner turned into hammer (Source: thisiswhyimbroke)

Transfer Learning is a mechanism much like a spanner: the spanner was designed to turn bolts, but with a slight modification, i.e. replacing its head, it can be used as a hammer as well. Similarly, we can use a pre-trained model to build a new model in the following ways:

1. By freezing all the layer weights of the pre-trained model and adding a fully connected layer of our own.

Fig. 4 Transfer Learning model training (Source: learnopencv.com)

2. By using the architecture of a state-of-the-art model without its weights, with or without the default fully connected layers.

Fig. 5 Pre-Trained Model Architecture (Source: learnopencv.com)

3. By setting some layers of the pre-trained model to trainable and adding a custom fully connected layer.

4. By using the weights of the pre-trained model as an initializer for our custom model.

These are the four basic ways by which we can train a transfer learning model on our dataset; we will see each of them in detail in upcoming articles.
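
As a rough, hedged sketch (not the exact code of this series; the VGG16 base, the 10-class head, and the number of unfrozen layers are assumptions for illustration), the snippet below shows approach 1 (freeze everything and add our own head) and approach 3 (unfreeze the top few layers), with comments pointing to approaches 2 and 4:

import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Approach 1: load VGG16 pre-trained on ImageNet without its default
# fully connected layers, freeze it, and add our own classifier head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze all pre-trained layers

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),  # 10 classes is an assumed example
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Approach 2 would load only the architecture: VGG16(weights=None, include_top=False, ...).
# Approach 4 would keep weights="imagenet" but leave base.trainable = True from the start,
# so the pre-trained weights act only as an initializer.

# Approach 3: unfreeze only the last few layers of the base and fine-tune
# with a small learning rate so the pre-trained features are not destroyed.
base.trainable = True
for layer in base.layers[:-4]:
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])

After the partial unfreezing, the model would typically be trained for a few more epochs on the target dataset with the smaller learning rate.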

Advantages:

  1. Requires less data.
  2. Requires less time and compute to train.
  3. Improves the baseline performance of the model.

In the next article, we will look at some state-of-the-art pre-trained models along with their repositories. Stay tuned!

Need help? Consult with me on DDI :)

Special Thanks:

As we say, “A car is useless if it doesn’t have a good engine”; similarly, a student cannot go far without proper guidance and motivation. From the bottom of my heart, I would like to thank my Gurus as well as my idols, “Dr. P. Supraja” and “A. Helen Victoria”, who guided me throughout this journey. As Gurus, they have lit the best available path for me and motivated me whenever I encountered failure or a roadblock; without their support and motivation this would have been an impossible task for me.

References

PyTorch: Link

Keras: Link

TensorFlow: Link

If you have any query, feel free to contact me through any of the options mentioned below:

YouTube : Link

Website: www.rstiwari.com

Medium: https://tiwari11-rst.medium.com

Github Pages: https://happyman11.github.io/

Articles: https://laptrinhx.com/author/ravi-shekhar-tiwari/

Google Form: https://forms.gle/mhDYQKQJKtAKP78V7

