15 changes: 15 additions & 0 deletions Chapter11/MobileNetV2/withTransferLearning/README.md
@@ -0,0 +1,15 @@
What it does:

1. This Python program trains a cats-and-dogs classifier with MobileNetV2 in TensorFlow.

Dependencies:

1. TensorFlow must be installed on the machine running this program.

Things to check before running:

1. Check that you have given the correct location of your dataset file in Google Drive.
2. You must have access to the file in Google Drive.
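
If the dependencies are missing, they can be installed with pip (a sketch; the program also imports gdown, and the package names assume the standard PyPI distributions):

```shell
pip install tensorflow gdown
```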



@@ -0,0 +1,247 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "mobileNetV2_with_transfer_learning.ipynb",
"provenance": [],
"collapsed_sections": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "zkMBooK6itVx"
},
"source": [
        "# **Problem: MobileNetV2 in TensorFlow, trained on the cats and dogs dataset.**\n",
        "\n",
        "Python program to train a cats-and-dogs classifier using MobileNetV2 in TensorFlow.\n",
        "\n",
        "Run all the cells. The last cell prints the accuracy of the trained model.\n",
        "\n",
        "**Notes:**\n",
        "\n",
        "Check the following before running the program:\n",
        " 1. TensorFlow must be installed on the machine running this program.\n",
        " 2. Check that you have given the correct location of your dataset file.\n",
        " 3. You must have access to the file in Google Drive."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "d9v7TRDIkMmm"
},
"source": [
"# **Import Modules**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "pThT1tW3kPHo"
},
"source": [
"# Import tensorflow\n",
"import tensorflow as tf\n",
"\n",
"# Import keras\n",
"from tensorflow import keras\n",
"\n",
        "# Import preprocess_input for the ImageDataGenerator class\n",
        "from tensorflow.keras.applications.mobilenet_v2 import preprocess_input\n",
        "\n",
        "# Import the ImageDataGenerator class for data augmentation\n",
        "from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
"\n",
"# Import gdown module to download files from google drive\n",
"import gdown"
],
"execution_count": 3,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "_eQMxp_RqAXm"
},
"source": [
"# **Get the file location from google drive**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "1szy5fTHp-tZ"
},
"source": [
"# Please change the URL as needed (make sure you have the access to the file)\n",
"\n",
"url = 'https://drive.google.com/file/d/1fMHrqIY0QYEj9qFUFsDuF949Jo-UWzVX/view?usp=sharing'\n",
"\n",
"# Derive the file id from the URL\n",
"file_id = url.split('/')[-2]\n",
"\n",
        "# Derive the download url of the file\n",
"download_url = 'https://drive.google.com/uc?id=' + file_id\n",
"\n",
"# Give the location you want to save it in your local machine\n",
"file_location = r'cats_and_dogs.zip'\n",
"\n",
"# Download the file from drive to your local machine\n",
"gdown.download(download_url, file_location, quiet=False)\n"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "Jjtzusk3qj2X"
},
"source": [
"# **Unzip the zip dataset**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "6bY4IAylqnla"
},
"source": [
"!unzip /content/cats_and_dogs.zip -d \"/content/unzipped_folder/\""
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "K-d0fcsDkUB9"
},
"source": [
"# **Construct train and test datasets**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "v1JaL8hakbOu"
},
"source": [
        "train_path = r\"/content/unzipped_folder/training_set/training_set\"\n",
        "test_path = r\"/content/unzipped_folder/test_set/test_set\"\n",
"\n",
"# Give image size and shape\n",
"IMG_SIZE = 224\n",
"IMG_SHAPE = (IMG_SIZE, IMG_SIZE, 3)\n",
"\n",
        "# Go through the train directory to obtain the categories\n",
        "train_batches = ImageDataGenerator(preprocessing_function=preprocess_input).flow_from_directory(\n",
        "    train_path, target_size=(IMG_SIZE, IMG_SIZE), batch_size=24, class_mode='categorical')\n",
"\n",
        "# Go through the test directory to obtain the categories\n",
        "test_batches = ImageDataGenerator(preprocessing_function=preprocess_input).flow_from_directory(\n",
        "    test_path, target_size=(IMG_SIZE, IMG_SIZE), batch_size=24, class_mode='categorical')"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "YMnOflqRk2F0"
},
"source": [
"# **Derive the model using MobileNetV2**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "CQUevpmak6JH"
},
"source": [
"base_model = keras.applications.MobileNetV2(input_shape=IMG_SHAPE,include_top=False,weights='imagenet')\n",
"base_model.trainable = False\n",
"\n",
"feature_batch = base_model.output\n",
"global_average_layer = keras.layers.GlobalAveragePooling2D()\n",
"feature_batch_average = global_average_layer(feature_batch)\n",
"prediction_layer = keras.layers.Dense(2)\n",
"prediction_batch = prediction_layer(feature_batch_average)\n",
"\n",
"# Group layers into a keras model\n",
"model = keras.Sequential([\n",
" base_model,\n",
" global_average_layer,\n",
" prediction_layer\n",
"])"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "yC-rnheIk8nT"
},
"source": [
"# **Compile and train the model**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "GacazgGelBen"
},
"source": [
        "# Compile the model (the Dense(2) head above outputs logits for the\n",
        "# one-hot labels produced by class_mode='categorical')\n",
        "model.compile(optimizer=tf.keras.optimizers.RMSprop(),\n",
        "              loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),\n",
        "              metrics=['accuracy'])\n",
"\n",
"# Train the model\n",
"history = model.fit(train_batches,\n",
" steps_per_epoch=24,\n",
" epochs=1, #<-- increase for higher accuracy\n",
" validation_data=test_batches,\n",
" validation_steps=200)"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "8N5VvKeslDcL"
},
"source": [
"# **Get accuracy of the model**"
]
},
{
"cell_type": "code",
"metadata": {
"id": "bHLNH7W0lIcL"
},
"source": [
"# Get the accuracy\n",
        "loss, accuracy = model.evaluate(test_batches, steps=20)\n",
"\n",
"print('Accuracy :', accuracy)"
],
"execution_count": null,
"outputs": []
}
]
}
@@ -1,2 +1,111 @@
# TODO: Create a MobileNetV2 in tensorflow. Train on cats and dogs dataset. (with transfer learning)
# TODO: Code should be well commented.
'''Copyright (c) 2021 AIClub

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without
limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial
portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT
LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO
EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.'''

# Python program to train a cats-and-dogs classifier with MobileNetV2 in TensorFlow

# Import tensorflow
import tensorflow as tf

# Import keras
from tensorflow import keras

# Import preprocess_input for the ImageDataGenerator class
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

# Import the ImageDataGenerator class for data augmentation
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Import gdown module to download files from google drive
import gdown

# Import zip file module to open the zip file
from zipfile import ZipFile

#--------------------------------------------- Get the file location from google drive ----------------------------------------

# Please change the URL as needed (make sure you have the access to the file)

url = 'https://drive.google.com/file/d/1fMHrqIY0QYEj9qFUFsDuF949Jo-UWzVX/view?usp=sharing'

# Derive the file id from the URL
file_id = url.split('/')[-2]

# Derive the download url of the file
download_url = 'https://drive.google.com/uc?id=' + file_id

# Give the location you want to save it in your local machine
file_location = 'cats_and_dogs.zip'
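
# Hypothetical helper (not part of the original program): the id-extraction
# above can be wrapped in a reusable function, so any Drive "share" link can
# be converted to a direct download URL the same way.
def drive_share_to_download(share_url):
    """Convert a Google Drive 'share' URL into a gdown-compatible download URL."""
    fid = share_url.split('/')[-2]
    return 'https://drive.google.com/uc?id=' + fid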

#--------------------------------------------- Download and extract the zip file -----------------------------------------------------------

# Download the file from drive to your local machine
gdown.download(download_url, file_location)

# Open the downloaded zip file and extract its contents
# (extractall() returns None, so its result is not captured)
with ZipFile(file_location, "r") as zip_file:
    zip_file.extractall()

# Read train and test datasets
train_path = r"training_set/training_set"
test_path = r"test_set/test_set"

#--------------------------------------------- Begin the training operation using MobileNetV2 -----------------------------------

# Give image size and shape
IMG_SIZE = 224
IMG_SHAPE = (IMG_SIZE, IMG_SIZE, 3)

# Go through the train directory to obtain the categories
train_batches = ImageDataGenerator(preprocessing_function=preprocess_input).flow_from_directory(
    train_path, target_size=(IMG_SIZE, IMG_SIZE), batch_size=24, class_mode='categorical')

# Go through the test directory to obtain the categories
test_batches = ImageDataGenerator(preprocessing_function=preprocess_input).flow_from_directory(
    test_path, target_size=(IMG_SIZE, IMG_SIZE), batch_size=24, class_mode='categorical')

# Get the base model using MobileNetV2 pre trained model
base_model = keras.applications.MobileNetV2(input_shape=IMG_SHAPE,include_top=False,weights='imagenet')
base_model.trainable = False

feature_batch = base_model.output
global_average_layer = keras.layers.GlobalAveragePooling2D()
feature_batch_average = global_average_layer(feature_batch)
prediction_layer = keras.layers.Dense(2)
prediction_batch = prediction_layer(feature_batch_average)

# Group layers into a keras model
model = keras.Sequential([
base_model,
global_average_layer,
prediction_layer
])

# Compile the model (the Dense(2) head above outputs logits for the
# one-hot labels produced by class_mode='categorical')
model.compile(optimizer=tf.keras.optimizers.RMSprop(),
              loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Fit the training data
history = model.fit(train_batches,
steps_per_epoch=24,
epochs=1, #<-- increase for higher accuracy
validation_data=test_batches,
validation_steps=200)

# Get the accuracy
loss, accuracy = model.evaluate(test_batches, steps=20)

print('Accuracy :', accuracy)
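
# Optional sketch (not part of the original program): summarise the metrics
# recorded during training. The 'val_accuracy' key name follows the
# metrics=['accuracy'] setting above; best_epoch is a hypothetical helper,
# called as best_epoch(history.history) once several epochs have been run.
def best_epoch(history_dict, metric='val_accuracy'):
    """Return (1-based epoch, value) of the best recorded value for a metric."""
    values = history_dict.get(metric, [])
    if not values:
        return None, None
    i = max(range(len(values)), key=lambda i: values[i])
    return i + 1, values[i]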