
Training an Image Classification Model with TensorFlow in Amazon SageMaker


Introduction

In today’s digital age, where data is as vital as currency, the power of Machine Learning (ML) in transforming industries is undeniable. From self-driving cars to personalized medicine, ML models are at the heart of many technological breakthroughs. Among the various tools and frameworks available for ML, TensorFlow has emerged as a leader, renowned for its versatility and scalability. This article aims to demystify the process of training an Image Classification model using TensorFlow, specifically within the Amazon SageMaker environment. For beginners and intermediate enthusiasts alike, navigating through TensorFlow’s complexities and harnessing the power of SageMaker can seem daunting. However, with the right guidance, these tools can be invaluable assets in your ML toolkit.

With the rise of cloud computing, Amazon SageMaker has established itself as a potent platform for ML development, offering seamless integration with TensorFlow. This combination not only streamlines the model training process but also provides an efficient way to deploy and manage ML models. In this article, we will take a hands-on approach, walking you through each step of building and training an Image Classification model using TensorFlow in SageMaker. From setting up your environment to evaluating your model’s performance, we aim to provide a clear, step-by-step guide that bridges the gap between theory and practice. Whether you’re a student, a budding data scientist, or an IT professional looking to expand your ML skills, this article will serve as a practical guide to understanding and applying TensorFlow within the Amazon SageMaker ecosystem.

Here is an example architecture of what you can achieve by leveraging Amazon SageMaker and TensorFlow to build ML-powered software.

[Figure: Example architecture of an ML-powered application built with Amazon SageMaker and TensorFlow]

Setting Up the Environment

Before diving into the world of machine learning with TensorFlow and Amazon SageMaker, it’s crucial to establish a solid foundation. This section will guide you through the initial setup, ensuring you have all the necessary tools and environments ready for your journey into image classification.

Prerequisites

  • Amazon Web Services (AWS) Account: First and foremost, you need an AWS account. If you don’t have one, sign up at the AWS website. This account is your gateway to accessing various AWS services, including Amazon SageMaker.
  • Basic Knowledge of Python and Machine Learning: Familiarity with Python programming and fundamental machine learning concepts will be beneficial. This knowledge will help you understand and modify the code as needed.

Tools Overview

  • TensorFlow: TensorFlow is an open-source machine learning library developed by Google. It’s known for its flexibility and extensive feature set for building and training ML models.
  • Amazon SageMaker: SageMaker is a fully managed service that provides every developer and data scientist with the ability to build, train, and deploy machine learning models quickly. SageMaker simplifies the machine learning lifecycle and provides a high-level environment for your ML projects.
  • Python: The scripting language we will use for writing our training script and SageMaker notebook.

Environment Setup in SageMaker

Creating a SageMaker Domain and User

Navigate to the AWS Management Console and look for Amazon SageMaker. Once inside SageMaker, select Domains in the left navigation panel and click the Create domain button.

[Screenshot: SageMaker Domains page with the Create domain button]

After clicking the Create domain button, you can choose Set up for single user (Quick setup).


[Screenshot: Domain setup options, with Set up for single user (Quick setup) selected]

Accessing SageMaker Studio

In this tutorial, we will use SageMaker Studio, but you can follow along even from a local Jupyter environment.

If you want to use SageMaker Studio, then after setting up your domain and user, click Launch and select Studio. It will open in a new browser tab.

[Screenshot: Launching SageMaker Studio from the domain's user profile]

Inside SageMaker Studio, you will see something like this:

[Screenshot: SageMaker Studio home page]

Click the JupyterLab panel that you can see on your screen.

After that, you will create a new JupyterLab Space.

[Screenshot: Creating a new JupyterLab Space]

Once a new JupyterLab Space is created, click Run Space.

[Screenshot: The JupyterLab Space with the Run Space button]

It will take a while, but after that, you will be able to see and click the Open JupyterLab button.

[Screenshot: The Open JupyterLab button after the space starts]

Note: This tutorial will not cover the step-by-step process of setting up permissions for the IAM role that the notebook instance needs. You just need to make sure that the role has access to the S3 bucket, where we will store some data and model artifacts.

With your environment set up, you’re now ready to step into the world of machine learning with TensorFlow in Amazon SageMaker. The following sections will guide you through the process of building and training your image classification model, ensuring that you have a comprehensive understanding of both the theoretical and practical aspects of this exciting field.

Training Script

In this section, we dissect the training script train.py, which is the heart of our image classification model using TensorFlow in Amazon SageMaker. The script is well-structured, encapsulating various components from model definition to training logic. Let’s break down the key parts.

Importing Libraries and Setting Up Logging

from __future__ import print_function
import argparse
import gzip
import json
import logging
import os
import traceback
import numpy as np
import tensorflow as tf
from tensorflow.keras import Model
from tensorflow.keras.layers import (
    Conv2D, Dense, Flatten, Dropout, BatchNormalization, MaxPooling2D
)

logging.basicConfig(level=logging.DEBUG)

This section imports necessary libraries and sets up basic logging. TensorFlow and Keras are used for building the model, while other libraries support data handling and argument parsing.

Defining the Model: SmallConv

class SmallConv(Model):
    def __init__(self):
        super(SmallConv, self).__init__()
        # First Convolutional Block
        self.conv1 = Conv2D(32, 3, padding='same', activation='relu')
        self.bn1 = BatchNormalization()
        self.pool1 = MaxPooling2D()
        self.drop1 = Dropout(0.25)

        # Second Convolutional Block
        self.conv2 = Conv2D(64, 3, padding='same', activation='relu')
        self.bn2 = BatchNormalization()
        self.pool2 = MaxPooling2D()
        self.drop2 = Dropout(0.25)

        # Fully Connected Layer
        self.flatten = Flatten()
        self.d1 = Dense(128, activation='relu')
        self.drop3 = Dropout(0.5)
        self.d2 = Dense(10)  # Output layer for 10 classes

    def call(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.pool1(x)
        x = self.drop1(x)

        x = self.conv2(x)
        x = self.bn2(x)
        x = self.pool2(x)
        x = self.drop2(x)

        x = self.flatten(x)
        x = self.d1(x)
        x = self.drop3(x)
        return self.d2(x)

SmallConv is a custom model class inheriting from TensorFlow’s Model. It defines a simple convolutional neural network (CNN) suitable for image classification tasks. The CNN includes convolutional layers, batch normalization, dropout for regularization, and fully connected layers.
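
Before training, you can run a quick, optional sanity check to confirm the model wires up as expected. This snippet is not part of the training script; it simply pushes a random batch through the network and verifies the output shape:

import tensorflow as tf

model = SmallConv()
dummy = tf.random.normal((8, 28, 28, 1))  # a batch of 8 grayscale 28x28 images
logits = model(dummy)
print(logits.shape)  # expected: (8, 10), one logit per class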

Data Preprocessing Functions

def convert_to_numpy(data_dir, images_file, labels_file):
    # Byte string to numpy arrays conversion
    with gzip.open(os.path.join(data_dir, images_file), "rb") as f:
        images = np.frombuffer(f.read(), np.uint8, offset=16).reshape(-1, 28, 28)

    with gzip.open(os.path.join(data_dir, labels_file), "rb") as f:
        labels = np.frombuffer(f.read(), np.uint8, offset=8)

    return (images, labels)


def mnist_to_numpy(data_dir, train):
    # Load MNIST data into numpy array
    if train:
        images_file = "train-images-idx3-ubyte.gz"
        labels_file = "train-labels-idx1-ubyte.gz"
    else:
        images_file = "t10k-images-idx3-ubyte.gz"
        labels_file = "t10k-labels-idx1-ubyte.gz"

    return convert_to_numpy(data_dir, images_file, labels_file)

These functions handle the loading and conversion of MNIST dataset images and labels from compressed files into numpy arrays, which are then used for training and testing the model.
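
For example, assuming the four gzip files have been downloaded into a local data directory, loading the training split looks like this (the shapes are the standard MNIST dimensions):

# Load the 60,000 training images and labels from ./data
images, labels = mnist_to_numpy("data", train=True)
print(images.shape, labels.shape)  # (60000, 28, 28) (60000,)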

Normalization Function

def normalize(x, axis):
    eps = np.finfo(float).eps

    mean = np.mean(x, axis=axis, keepdims=True)
    std = np.std(x, axis=axis, keepdims=True) + eps
    return (x - mean) / std

Normalization is crucial in machine learning. This function normalizes the image data, ensuring the model receives data that’s on a similar scale, which is important for the training process.
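
Continuing the example above, each image can be standardized over its height and width axes, which is how a training script would typically apply this helper:

# Standardize each image independently over its spatial dimensions
x = normalize(images.astype(np.float32), axis=(1, 2))
print(x.mean(), x.std())  # close to 0 and 1, since each image is standardized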

Training Logic

def train(args):
    # Data loading, model instantiation, and training process
    ...

The train function embodies the core training logic. It loads the data, normalizes it, prepares data loaders, initializes the SmallConv model, and sets up the loss function and optimizer. It also contains the training loop, where the model is trained and evaluated on the dataset.
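
The full function is elided above, but a minimal sketch of this logic, assuming the helpers already defined in the script and a standard tf.GradientTape training loop (not necessarily the exact code used here), might look like this:

def train(args):
    # Load and normalize the MNIST data from the SageMaker channel directory
    x_train, y_train = mnist_to_numpy(args.train, train=True)
    x_train = normalize(x_train.astype(np.float32), axis=(1, 2))[..., np.newaxis]

    train_ds = (
        tf.data.Dataset.from_tensor_slices((x_train, y_train))
        .shuffle(10000)
        .batch(args.batch_size)
    )

    model = SmallConv()
    # from_logits=True because the output layer has no softmax activation
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
    optimizer = tf.keras.optimizers.Adam(
        learning_rate=args.learning_rate, beta_1=args.beta_1, beta_2=args.beta_2
    )

    for epoch in range(args.epochs):
        for images, labels in train_ds:
            with tf.GradientTape() as tape:
                logits = model(images, training=True)
                loss = loss_fn(labels, logits)
            grads = tape.gradient(loss, model.trainable_variables)
            optimizer.apply_gradients(zip(grads, model.trainable_variables))

    # Save in SavedModel format under a numeric version, as TF Serving expects
    model.save(os.path.join(args.model_dir, "000000001"))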

Argument Parsing

def parse_args():
    parser = argparse.ArgumentParser()

    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--epochs", type=int, default=1)
    parser.add_argument("--learning-rate", type=float, default=1e-3)
    parser.add_argument("--beta_1", type=float, default=0.9)
    parser.add_argument("--beta_2", type=float, default=0.999)

    # Environment variables given by the training image
    parser.add_argument("--model-dir", type=str,
default=os.environ["SM_MODEL_DIR"])
    parser.add_argument("--train", type=str,
default=os.environ["SM_CHANNEL_TRAINING"])
    parser.add_argument("--test", type=str,
default=os.environ["SM_CHANNEL_TESTING"])

    parser.add_argument("--current-host", type=str,
default=os.environ["SM_CURRENT_HOST"])
    parser.add_argument("--hosts", type=list,
default=json.loads(os.environ["SM_HOSTS"]))

    return parser.parse_args()

This function parses command-line arguments, which are vital for hyperparameter tuning and specifying paths for data and model saving. This flexibility is essential when training models across different environments.
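
Because the defaults come from environment variables, the script can be smoke-tested outside SageMaker by setting those variables yourself. A hypothetical local run (the paths here are illustrative only) could look like this:

import os
import subprocess

# Emulate the variables the SageMaker training container would set
os.environ.update({
    "SM_MODEL_DIR": "/tmp/model",
    "SM_CHANNEL_TRAINING": "/tmp/data",
    "SM_CHANNEL_TESTING": "/tmp/data",
    "SM_CURRENT_HOST": "algo-1",
    "SM_HOSTS": '["algo-1"]',
})
subprocess.run(["python", "train.py", "--batch-size", "32", "--epochs", "1"], check=True)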

Main Execution Check

if __name__ == "__main__":
    args = parse_args()
    train(args)

This conditional check ensures that the training script runs when executed directly. It parses the arguments and calls the train function with those arguments.

Each part of this script is designed to handle specific aspects of the machine-learning workflow in a modular and clear manner. Understanding each component is key to manipulating and customizing the training process according to specific requirements or datasets. In the next sections, we’ll delve deeper into how this script is utilized within the Amazon SageMaker environment to train and deploy our image classification model.

SageMaker Notebook Set Up

In this section, we will set up our notebook for training an image classification model using TensorFlow and SageMaker.

Initializing SageMaker and Role

import os
import json
import sagemaker
from sagemaker.tensorflow import TensorFlow
from sagemaker import get_execution_role

sess = sagemaker.Session()
role = get_execution_role()

output_path = "s3://<INSERT BUCKET NAME>/DEMO-tensorflow/mnist"

Here, we import the necessary libraries and initialize a SageMaker session. get_execution_role fetches the AWS Identity and Access Management (IAM) role attached to your SageMaker instance, which is essential for accessing AWS resources.

Configuring the TensorFlow Estimator

local_mode = False

if local_mode:
    instance_type = "local"
else:
    instance_type = "ml.c4.xlarge"

est = TensorFlow(
    entry_point="train.py",
    source_dir="code",
    role=role,
    framework_version="2.3.1",
    model_dir=False,
    py_version="py37",
    instance_type=instance_type,
    instance_count=1,
    volume_size=250,
    output_path=output_path,
    hyperparameters={
        "batch-size": 512,
        "epochs": 1,
        "learning-rate": 1e-3,
        "beta_1": 0.9,
        "beta_2": 0.999,
    },
)

This section sets up the TensorFlow estimator. The TensorFlow class from SageMaker’s Python SDK simplifies running TensorFlow scripts. We specify our training script (train.py), its directory, and various hyperparameters.

Note: This tutorial will not cover how to download the images and upload them to your S3 bucket. To follow along from this point, you can download the data from this public AWS S3 bucket: s3://sagemaker-example-files-prod-<REGION_NAME>.

Model Training

Once the SageMaker environment is set up and the data is ready, we proceed to train the model.

channels = {"training": loc, "testing": loc}
est.fit(inputs=channels)

The fit method of the estimator object begins the training job. Here, channels is a dictionary specifying the S3 paths to the training and testing data. This method abstracts away the complexities, making it easier to start training models with just a few lines of code.


After running this code block, you should see something similar to this:

[Screenshot: SageMaker training job logs streamed into the notebook output]

At the bottom of the output, you can see your model's metrics along with the training seconds and billable seconds of the training job.

[Screenshot: Training metrics, training seconds, and billable seconds at the end of the job output]

Model Output

After training, we can inspect the model artifacts, including the trained model and its parameters, which are saved to the specified S3 location.

tf_mnist_model_data = est.model_data
print("Model artifact saved at:\n", tf_mnist_model_data)

The model_data attribute of the estimator object provides the S3 path to these artifacts. This output is vital for deploying the model or for further analysis and evaluation.
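
If you want to inspect the artifact locally, the SageMaker SDK's S3 utilities can fetch it. A small illustrative example (the local folder name is arbitrary):

from sagemaker.s3 import S3Downloader

# Download the model.tar.gz artifact to a local folder for inspection
S3Downloader.download(tf_mnist_model_data, "model_artifact")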

Conclusion

Throughout this article, we have trained an image classification model using TensorFlow in Amazon SageMaker. Starting from the initial setup of our environment to dissecting the training script, and finally executing the model training in SageMaker, we’ve covered a comprehensive path that blends theoretical understanding with practical application.

Future Steps

Now that you have a trained model, the next logical step is to explore the deployment techniques of your model. Amazon SageMaker offers various deployment options, including real-time endpoints for instant predictions and batch transform for processing data in batches. Experimenting with these deployment strategies will give you a holistic view of the machine learning lifecycle. Additionally, you might want to delve into advanced model tuning to enhance your model’s performance or experiment with different datasets and neural network architectures.
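
As a starting point, deploying the estimator you just trained to a real-time endpoint takes only a couple of lines. Here is a minimal sketch; the instance type is illustrative, and remember to delete the endpoint afterward to avoid charges:

# Deploy the trained model behind a real-time HTTPS endpoint
predictor = est.deploy(initial_instance_count=1, instance_type="ml.m5.large")

# ... send prediction requests with predictor.predict(...), then clean up:
predictor.delete_endpoint()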

Explore Best Practices

As you continue your journey in machine learning, here are a few best practices to keep in mind:

  1. Data Quality: Always ensure the quality of your data. Better data beats fancier algorithms.
  2. Regularization Techniques: Experiment with different regularization techniques to avoid overfitting.
  3. Hyperparameter Tuning: Leverage SageMaker’s hyperparameter tuning capabilities to optimize your model’s performance.
  4. Stay Informed: The field of machine learning is ever-evolving. Stay updated with the latest research and tools.
  5. Experiment and Iterate: Don’t be afraid to try new approaches and learn from failures.

Remember, this article is just a starting point. Machine learning is a vast field with endless possibilities. The key to mastery is continuous learning and experimentation. Keep exploring, keep learning, and most importantly, enjoy the journey in this fascinating world of machine learning.

Resources:

https://docs.aws.amazon.com/sagemaker/

https://docs.aws.amazon.com/sagemaker/latest/dg/gs.html

https://docs.aws.amazon.com/sagemaker/latest/dg/image-classification-tensorflow.html


Written by: John Patrick Laurel

Pats is the Head of Data Science at a European short-stay real estate business group. He boasts a diverse skill set in the realm of data and AI, encompassing Machine Learning Engineering, Data Engineering, and Analytics. Additionally, he serves as a Data Science Mentor at Eskwelabs. Outside of work, he enjoys taking long walks and reading.
