NLP Online Explainability with SageMaker Clarify


This notebook’s CI test result for us-west-2 is as follows. CI test results in other regions can be found at the end of the notebook.

[us-west-2 CI badge]


  1. Introduction

  2. General Setup

    1. Install dependencies

    2. Import libraries

    3. Set configurations

    4. Create serializer and deserializer

    5. For visualization

  3. Prepare data

    1. Download data

    2. Load the dataset

    3. Data preparation for model training

    4. Upload the dataset

  4. Train and Deploy Hugging Face Model

    1. Train model with Hugging Face estimator

    2. Download the trained model files

    3. Prepare model container definition

  5. Create endpoint

    1. Create model

    2. Create endpoint config

    3. Create endpoint

  6. Invoke endpoint

    1. Single record request

    2. Single record request, no explanation

    3. Batch request, explain both

    4. Batch request with more records, explain some of the records

  7. Cleanup

Introduction

Amazon SageMaker Clarify helps improve your machine learning models by detecting potential bias and helping explain how these models make predictions. The fairness and explainability functionality provided by SageMaker Clarify takes a step towards enabling AWS customers to build trustworthy and understandable machine learning models.

SageMaker Clarify currently supports explainability for SageMaker models as an offline processing job. This example notebook showcases a new feature for explainability on a SageMaker real-time inference endpoint, a.k.a. Online Explainability.

This example notebook walks you through:
1. Key terms and concepts needed to understand SageMaker Clarify
2. Training a model on the Women’s E-Commerce Clothing Reviews dataset
3. Creating a model from the trained model artifacts, creating an endpoint configuration with the new SageMaker Clarify explainer configuration, and creating an endpoint using that explainer configuration
4. Invoking the endpoint with single and batch requests and different EnableExplanations queries (a preview sketch follows this list)
5. Explaining the importance of the various input features for the model’s decision
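As a preview of step 4, an invocation with the EnableExplanations parameter looks roughly like the following minimal sketch (the endpoint name and payload are illustrative placeholders, not the notebook’s own; the full walkthrough appears in the Invoke endpoint section):

import boto3

sagemaker_runtime = boto3.client("sagemaker-runtime")

response = sagemaker_runtime.invoke_endpoint(
    EndpointName="my-clarify-endpoint",  # hypothetical endpoint name
    ContentType="text/csv",
    Accept="application/json",
    Body='"I love this dress!"',
    # JMESPath boolean expression evaluated per record: records for which it
    # is true are explained by Clarify; the others receive predictions only.
    EnableExplanations="`true`",
)
result = response["Body"].read()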

General Setup

We recommend using the Python 3 (Data Science) kernel on SageMaker Studio or the conda_python3 kernel on a SageMaker Notebook Instance.

Install dependencies

Install the required dependencies: datasets[s3] and transformers are used for data preparation and training, and captum is used to visualize the feature attributions.

[1]:
! pip install -r requirements.txt --upgrade --quiet

Import libraries

[2]:
import boto3
import csv
import pandas as pd
import numpy as np
import pprint
import tarfile

from sagemaker.huggingface import HuggingFace
from datasets import Dataset
from datasets.filesystems import S3FileSystem
from captum.attr import visualization
from sklearn.model_selection import train_test_split
from sagemaker import get_execution_role, Session
from sagemaker.s3 import S3Uploader
from sagemaker.serializers import CSVSerializer
from sagemaker.deserializers import JSONDeserializer
from sagemaker.utils import unique_name_from_base

Set configurations

[3]:
boto3_session = boto3.session.Session()
sagemaker_client = boto3.client("sagemaker")
sagemaker_runtime_client = boto3.client("sagemaker-runtime")

# Initialize sagemaker session
sagemaker_session = Session(
    boto_session=boto3_session,
    sagemaker_client=sagemaker_client,
    sagemaker_runtime_client=sagemaker_runtime_client,
)

region = sagemaker_session.boto_region_name
print(f"Region: {region}")

role = get_execution_role()
print(f"Role: {role}")

prefix = unique_name_from_base("DEMO-NLP-Women-Clothing")

s3_bucket = sagemaker_session.default_bucket()
s3_prefix = f"sagemaker/{prefix}"
s3_key = f"s3://{s3_bucket}/{s3_prefix}"
print(f"Demo S3 key: {s3_key}")

model_name = f"{prefix}-model"
print(f"Demo model name: {model_name}")
endpoint_config_name = f"{prefix}-endpoint-config"
print(f"Demo endpoint config name: {endpoint_config_name}")
endpoint_name = f"{prefix}-endpoint"
print(f"Demo endpoint name: {endpoint_name}")

# SageMaker Clarify model directory name
model_path = "model/"

# Instance type for training and hosting
instance_type = "ml.m5.xlarge"
Region: us-west-2
Role: arn:aws:iam::000000000000:role/service-role/SMClarifySageMaker-ExecutionRole
Demo S3 key: s3://sagemaker-us-west-2-000000000000/sagemaker/DEMO-NLP-Women-Clothing-1687464029-bfff
Demo model name: DEMO-NLP-Women-Clothing-1687464029-bfff-model
Demo endpoint config name: DEMO-NLP-Women-Clothing-1687464029-bfff-endpoint-config
Demo endpoint name: DEMO-NLP-Women-Clothing-1687464029-bfff-endpoint

Create serializer and deserializer

A CSV serializer to serialize the test data to a string:

[4]:
csv_serializer = CSVSerializer()

A JSON deserializer to deserialize the invoke-endpoint response:

[5]:
json_deserializer = JSONDeserializer()
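As a quick, illustrative check, the serializer turns a record into a CSV string, and the deserializer parses a JSON response stream (the response body below is simulated, not a real endpoint response):

import io

payload = csv_serializer.serialize(["I love this dress!"])
print(payload)  # -> I love this dress!

# Simulate an endpoint response body (a bytes stream with a read() method).
fake_body = io.BytesIO(b'{"predictions": [{"label": "1", "score": 0.98}]}')
print(json_deserializer.deserialize(fake_body, content_type="application/json"))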

For visualization

Helper methods for visualization are implemented in the visualization_utils.py file.

[6]:
%run visualization_utils.py
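The contents of visualization_utils.py are not reproduced here; a helper built on captum’s visualization API might look roughly like this sketch (the function name and inputs are assumptions for illustration, not the file’s actual contents):

from captum.attr import visualization


def visualize_record(tokens, attributions, predicted_label, predicted_score):
    # Wrap one prediction and its per-token attributions in the record type
    # that captum's HTML renderer expects (arguments passed positionally).
    record = visualization.VisualizationDataRecord(
        attributions,        # per-token attribution scores
        predicted_score,     # model confidence for the predicted class
        predicted_label,     # predicted class
        predicted_label,     # true class, if known
        predicted_label,     # class the attributions explain
        sum(attributions),   # total attribution score
        tokens,              # the raw tokens
        None,                # convergence delta (unused here)
    )
    return visualization.visualize_text([record])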

Prepare data

Download data

Data Source: https://www.kaggle.com/nicapotato/womens-ecommerce-clothing-reviews/

The Women’s E-Commerce Clothing Reviews dataset has been made available under a Creative Commons Public Domain license. A copy of the dataset has been saved in a sample data Amazon S3 bucket. In this section, we walk through how to download the data and get started with building the ML workflow.

[7]:
s3 = boto3.client("s3")
s3.download_file(
    f"sagemaker-example-files-prod-{region}",
    "datasets/tabular/womens_clothing_ecommerce/Womens_Clothing_E-Commerce_Reviews.csv",
    "womens_clothing_reviews_dataset.csv",
)

Load the dataset

[8]:
df = pd.read_csv("womens_clothing_reviews_dataset.csv", index_col=[0])
df.head()
[8]:
Clothing ID Age Title Review Text Rating Recommended IND Positive Feedback Count Division Name Department Name Class Name
0 767 33 NaN Absolutely wonderful - silky and sexy and comf... 4 1 0 Initmates Intimate Intimates
1 1080 34 NaN Love this dress! it's sooo pretty. i happene... 5 1 4 General Dresses Dresses
2 1077 60 Some major design flaws I had such high hopes for this dress and reall... 3 0 0 General Dresses Dresses
3 1049 50 My favorite buy! I love, love, love this jumpsuit. it's fun, fl... 5 1 0 General Petite Bottoms Pants
4 847 47 Flattering shirt This shirt is very flattering to all due to th... 5 1 6 General Tops Blouses

Context

The Women’s Clothing E-Commerce dataset contains reviews written by customers. Because the dataset contains real commercial data, it has been anonymized, and any references to the company in the review text and body have been replaced with “retailer”.

Content

The dataset contains 23486 rows and 10 columns. Each row corresponds to a customer review.

The columns include:

  • Clothing ID: Integer Categorical variable that refers to the specific piece being reviewed.

  • Age: Positive Integer variable of the reviewer’s age.

  • Title: String variable for the title of the review.

  • Review Text: String variable for the review body.

  • Rating: Positive Ordinal Integer variable for the product score granted by the customer from 1 Worst, to 5 Best.

  • Recommended IND: Binary variable stating whether the customer recommends the product, where 1 is recommended and 0 is not recommended.

  • Positive Feedback Count: Positive Integer documenting the number of other customers who found this review positive.

  • Division Name: Categorical name of the product high level division.

  • Department Name: Categorical name of the product department name.

  • Class Name: Categorical name of the product class name.

Goal

Predict the sentiment of a review based on its text, and then explain the predictions using SageMaker Clarify.

Data preparation for model training

Target Variable Creation

Since the dataset does not contain a column that indicates the sentiment of the customer reviews, let’s create one. To do this, let’s assume that reviews with a Rating of 4 or higher indicate positive sentiment and reviews with a Rating of 2 or lower indicate negative sentiment. Let’s also assume that a Rating of 3 indicates neutral sentiment and exclude these rows from the dataset. Additionally, since we are going to predict the sentiment of a review from the Review Text column, let’s remove rows whose Review Text is empty.

[ ]:
def create_target_column(df, min_positive_score, max_negative_score):
    # Ratings strictly between the negative and positive cutoffs are neutral;
    # drop those rows entirely.
    neutral_values = [i for i in range(max_negative_score + 1, min_positive_score)]
    for neutral_value in neutral_values:
        df = df[df["Rating"] != neutral_value]
    # Ratings at or above the positive cutoff are labeled 1, the rest 0.
    df["Sentiment"] = df["Rating"] >= min_positive_score
    return df.replace({"Sentiment": {True: 1, False: 0}})


df = create_target_column(df, 4, 2)
# Drop rows with an empty Review Text, since the review text is our feature.
df = df[~df["Review Text"].isna()]

Train-Validation-Test splits

The most common approach for model evaluation is the train/validation/test split. Although this approach can be very effective in general, it can produce misleading results and potentially fail when used on classification problems with a severe class imbalance. Instead, the sampling must be stratified by the class label, as shown below. Stratification ensures that all classes are well represented across the train, validation, and test datasets.

[10]:
target = "Sentiment"
cols = "Review Text"

X = df[cols]
y = df[target]

# Data split: 11% (validation) of the 90% remaining after the test split is
# ~10% of the full dataset, resulting in an 80:10:10 split
test_dataset_size = 0.10
val_dataset_size = 0.11
RANDOM_STATE = 42

# Stratified train-val-test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=test_dataset_size, stratify=y, random_state=RANDOM_STATE
)
X_train, X_val, y_train, y_val = train_test_split(
    X_train,
    y_train,
    test_size=val_dataset_size,
    stratify=y_train,
    random_state=RANDOM_STATE,
)

print(
    "Dataset: train ",
    X_train.shape,
    y_train.shape,
    y_train.value_counts(dropna=False, normalize=True).to_dict(),
)
print(
    "Dataset: validation ",
    X_val.shape,
    y_val.shape,
    y_val.value_counts(dropna=False, normalize=True).to_dict(),
)
print(
    "Dataset: test ",
    X_test.shape,
    y_test.shape,
    y_test.value_counts(dropna=False, normalize=True).to_dict(),
)

# Combine the independent columns with the label
df_train = pd.concat([X_train, y_train], axis=1).reset_index(drop=True)
df_test = pd.concat([X_test, y_test], axis=1).reset_index(drop=True)
df_val = pd.concat([X_val, y_val], axis=1).reset_index(drop=True)
Dataset: train  (15874,) (15874,) {1: 0.8804334131283861, 0: 0.11956658687161396}
Dataset: validation  (1962,) (1962,) {1: 0.8802242609582059, 0: 0.11977573904179409}
Dataset: test  (1982,) (1982,) {1: 0.8804238143289607, 0: 0.11957618567103935}
[11]:
headers = df_test.columns.to_list()
feature_headers = headers[0]
label_header = headers[1]
print(f"Feature names: {feature_headers}")
print(f"Label name: {label_header}")
print(f"Test data (without label column):")
test_data = df_test.iloc[:, :1]
test_data
Feature names: Review Text
Label name: Sentiment
Test data (without label column):
[11]:
Review Text
0 I am 5'6", 130 lbs with an athletic body type ...
1 The design on the blue sweater is actually a d...
2 The colors are so much brighter than pictured....
3 A very versatile and cozy top. would look grea...
4 Just not cute. i don't know how else to descri...
... ...
1977 As soon as i opened the package, i knew that t...
1978 As the title suggests, i am very skeptical and...
1979 I love this dress. i'm 6' so it's a tad bit sh...
1980 I love the concept of this dress. i love the s...
1981 Bought this is the blue, which is actually a v...

1982 rows × 1 columns

We have split the dataset into train, test, and validation datasets. We use the train and validation datasets during the training process, and run Clarify on the test dataset.

Upload the dataset

Here, we upload the prepared datasets to S3 so that we can train the model with the Hugging Face Estimator.

[12]:
# The validation split is written as test.csv because the training script
# consumes it through the estimator's "test" channel.
df_train.to_csv("train.csv", index=False, header=False)
df_val.to_csv("test.csv", index=False, header=False)
[13]:
training_input_path = f"{s3_key}/train"
print(f"training input path: {training_input_path}")
val_input_path = f"{s3_key}/test"
print(f"validation input path: {val_input_path}")

train_uri = S3Uploader.upload("train.csv", training_input_path)
test_uri = S3Uploader.upload("test.csv", val_input_path)
training input path: s3://sagemaker-us-west-2-000000000000/sagemaker/DEMO-NLP-Women-Clothing-1687464029-bfff/train
validation input path: s3://sagemaker-us-west-2-000000000000/sagemaker/DEMO-NLP-Women-Clothing-1687464029-bfff/test

Train and Deploy Hugging Face Model

In this step of the workflow, we use the Hugging Face Estimator to load the pre-trained distilbert-base-uncased model and fine-tune it on our dataset.

Train model with Hugging Face estimator

The hyperparameters defined below are passed to the custom PyTorch code in scripts/train.py. The only required parameter is model_name. Other parameters, such as epochs and train_batch_size, have default values that can be overridden by setting their values here.

The training job requires a GPU instance type. Here, we use ml.g4dn.xlarge.

[14]:
training_input_path
[14]:
's3://sagemaker-us-west-2-000000000000/sagemaker/DEMO-NLP-Women-Clothing-1687464029-bfff/train'
[15]:
# Hyperparameters passed into the training job
hyperparameters = {
    "epochs": 1,
    "model_name": "distilbert-base-uncased",
    "train_file": "train.csv",
    "test_file": "test.csv",
}

huggingface_estimator = HuggingFace(
    entry_point="train.py",
    source_dir="scripts",
    instance_type="ml.g4dn.xlarge",
    instance_count=1,
    transformers_version="4.6.1",
    pytorch_version="1.7.1",
    py_version="py36",
    role=role,
    hyperparameters=hyperparameters,
    disable_profiler=True,
    debugger_hook_config=False,
)

# starting the train job with our uploaded datasets as input
huggingface_estimator.fit({"train": training_input_path, "test": val_input_path}, logs=True)
INFO:sagemaker.image_uris:image_uri is not presented, retrieving image_uri based on instance_type, framework etc.
INFO:sagemaker:Creating training-job with name: huggingface-pytorch-training-2023-06-22-20-00-31-761
Using provided s3_resource
2023-06-22 20:00:32 Starting - Starting the training job...
2023-06-22 20:00:47 Starting - Preparing the instances for training......
2023-06-22 20:01:51 Downloading - Downloading input data...
2023-06-22 20:02:16 Training - Downloading the training image...............
2023-06-22 20:04:42 Training - Training image download completed. Training in progress.bash: cannot set terminal process group (-1): Inappropriate ioctl for device
bash: no job control in this shell
2023-06-22 20:04:55,537 sagemaker-training-toolkit INFO     Imported framework sagemaker_pytorch_container.training
2023-06-22 20:04:55,567 sagemaker_pytorch_container.training INFO     Block until all host DNS lookups succeed.
2023-06-22 20:04:55,571 sagemaker_pytorch_container.training INFO     Invoking user training script.
2023-06-22 20:04:55,825 sagemaker-training-toolkit INFO     Invoking user script
Training Env:
{
    "additional_framework_parameters": {},
    "channel_input_dirs": {
        "test": "/opt/ml/input/data/test",
        "train": "/opt/ml/input/data/train"
    },
    "current_host": "algo-1",
    "framework_module": "sagemaker_pytorch_container.training:main",
    "hosts": [
        "algo-1"
    ],
    "hyperparameters": {
        "epochs": 1,
        "model_name": "distilbert-base-uncased",
        "test_file": "test.csv",
        "train_file": "train.csv"
    },
    "input_config_dir": "/opt/ml/input/config",
    "input_data_config": {
        "test": {
            "TrainingInputMode": "File",
            "S3DistributionType": "FullyReplicated",
            "RecordWrapperType": "None"
        },
        "train": {
            "TrainingInputMode": "File",
            "S3DistributionType": "FullyReplicated",
            "RecordWrapperType": "None"
        }
    },
    "input_dir": "/opt/ml/input",
    "is_master": true,
    "job_name": "huggingface-pytorch-training-2023-06-22-20-00-31-761",
    "log_level": 20,
    "master_hostname": "algo-1",
    "model_dir": "/opt/ml/model",
    "module_dir": "s3://sagemaker-us-west-2-000000000000/huggingface-pytorch-training-2023-06-22-20-00-31-761/source/sourcedir.tar.gz",
    "module_name": "train",
    "network_interface_name": "eth0",
    "num_cpus": 4,
    "num_gpus": 1,
    "output_data_dir": "/opt/ml/output/data",
    "output_dir": "/opt/ml/output",
    "output_intermediate_dir": "/opt/ml/output/intermediate",
    "resource_config": {
        "current_host": "algo-1",
        "current_instance_type": "ml.g4dn.xlarge",
        "current_group_name": "homogeneousCluster",
        "hosts": [
            "algo-1"
        ],
        "instance_groups": [
            {
                "instance_group_name": "homogeneousCluster",
                "instance_type": "ml.g4dn.xlarge",
                "hosts": [
                    "algo-1"
                ]
            }
        ],
        "network_interface_name": "eth0"
    },
    "user_entry_point": "train.py"
}
Environment variables:
SM_HOSTS=["algo-1"]
SM_NETWORK_INTERFACE_NAME=eth0
SM_HPS={"epochs":1,"model_name":"distilbert-base-uncased","test_file":"test.csv","train_file":"train.csv"}
SM_USER_ENTRY_POINT=train.py
SM_FRAMEWORK_PARAMS={}
SM_RESOURCE_CONFIG={"current_group_name":"homogeneousCluster","current_host":"algo-1","current_instance_type":"ml.g4dn.xlarge","hosts":["algo-1"],"instance_groups":[{"hosts":["algo-1"],"instance_group_name":"homogeneousCluster","instance_type":"ml.g4dn.xlarge"}],"network_interface_name":"eth0"}
SM_INPUT_DATA_CONFIG={"test":{"RecordWrapperType":"None","S3DistributionType":"FullyReplicated","TrainingInputMode":"File"},"train":{"RecordWrapperType":"None","S3DistributionType":"FullyReplicated","TrainingInputMode":"File"}}
SM_OUTPUT_DATA_DIR=/opt/ml/output/data
SM_CHANNELS=["test","train"]
SM_CURRENT_HOST=algo-1
SM_MODULE_NAME=train
SM_LOG_LEVEL=20
SM_FRAMEWORK_MODULE=sagemaker_pytorch_container.training:main
SM_INPUT_DIR=/opt/ml/input
SM_INPUT_CONFIG_DIR=/opt/ml/input/config
SM_OUTPUT_DIR=/opt/ml/output
SM_NUM_CPUS=4
SM_NUM_GPUS=1
SM_MODEL_DIR=/opt/ml/model
SM_MODULE_DIR=s3://sagemaker-us-west-2-000000000000/huggingface-pytorch-training-2023-06-22-20-00-31-761/source/sourcedir.tar.gz
SM_TRAINING_ENV={"additional_framework_parameters":{},"channel_input_dirs":{"test":"/opt/ml/input/data/test","train":"/opt/ml/input/data/train"},"current_host":"algo-1","framework_module":"sagemaker_pytorch_container.training:main","hosts":["algo-1"],"hyperparameters":{"epochs":1,"model_name":"distilbert-base-uncased","test_file":"test.csv","train_file":"train.csv"},"input_config_dir":"/opt/ml/input/config","input_data_config":{"test":{"RecordWrapperType":"None","S3DistributionType":"FullyReplicated","TrainingInputMode":"File"},"train":{"RecordWrapperType":"None","S3DistributionType":"FullyReplicated","TrainingInputMode":"File"}},"input_dir":"/opt/ml/input","is_master":true,"job_name":"huggingface-pytorch-training-2023-06-22-20-00-31-761","log_level":20,"master_hostname":"algo-1","model_dir":"/opt/ml/model","module_dir":"s3://sagemaker-us-west-2-000000000000/huggingface-pytorch-training-2023-06-22-20-00-31-761/source/sourcedir.tar.gz","module_name":"train","network_interface_name":"eth0","num_cpus":4,"num_gpus":1,"output_data_dir":"/opt/ml/output/data","output_dir":"/opt/ml/output","output_intermediate_dir":"/opt/ml/output/intermediate","resource_config":{"current_group_name":"homogeneousCluster","current_host":"algo-1","current_instance_type":"ml.g4dn.xlarge","hosts":["algo-1"],"instance_groups":[{"hosts":["algo-1"],"instance_group_name":"homogeneousCluster","instance_type":"ml.g4dn.xlarge"}],"network_interface_name":"eth0"},"user_entry_point":"train.py"}
SM_USER_ARGS=["--epochs","1","--model_name","distilbert-base-uncased","--test_file","test.csv","--train_file","train.csv"]
SM_OUTPUT_INTERMEDIATE_DIR=/opt/ml/output/intermediate
SM_CHANNEL_TEST=/opt/ml/input/data/test
SM_CHANNEL_TRAIN=/opt/ml/input/data/train
SM_HP_EPOCHS=1
SM_HP_MODEL_NAME=distilbert-base-uncased
SM_HP_TEST_FILE=test.csv
SM_HP_TRAIN_FILE=train.csv
PYTHONPATH=/opt/ml/code:/opt/conda/bin:/opt/conda/lib/python36.zip:/opt/conda/lib/python3.6:/opt/conda/lib/python3.6/lib-dynload:/opt/conda/lib/python3.6/site-packages
Invoking script with the following command:
/opt/conda/bin/python3.6 train.py --epochs 1 --model_name distilbert-base-uncased --test_file test.csv --train_file train.csv
2023-06-22 20:04:58,268 - datasets.builder - WARNING - Using custom data configuration default-dd9cff78fda6d41f
Downloading and preparing dataset csv/default (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /root/.cache/huggingface/datasets/csv/default-dd9cff78fda6d41f/0.0.0/2dc6629a9ff6b5697d82c25b73731dd440507a69cbce8b425db50b751e8fcfd0...
Dataset csv downloaded and prepared to /root/.cache/huggingface/datasets/csv/default-dd9cff78fda6d41f/0.0.0/2dc6629a9ff6b5697d82c25b73731dd440507a69cbce8b425db50b751e8fcfd0. Subsequent calls will reuse this data.
2023-06-22 20:04:58,638 - datasets.builder - WARNING - Using custom data configuration default-6e3ab9f4794794bf
Downloading and preparing dataset csv/default (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /root/.cache/huggingface/datasets/csv/default-6e3ab9f4794794bf/0.0.0/2dc6629a9ff6b5697d82c25b73731dd440507a69cbce8b425db50b751e8fcfd0...
Dataset csv downloaded and prepared to /root/.cache/huggingface/datasets/csv/default-6e3ab9f4794794bf/0.0.0/2dc6629a9ff6b5697d82c25b73731dd440507a69cbce8b425db50b751e8fcfd0. Subsequent calls will reuse this data.
2023-06-22 20:04:58,655 - __main__ - INFO -  loaded train_dataset length is: 15874
2023-06-22 20:04:58,655 - __main__ - INFO -  loaded test_dataset length is: 1962
2023-06-22 20:04:58,761 - filelock - INFO - Lock 140303718168840 acquired on /root/.cache/huggingface/transformers/23454919702d26495337f3da04d1655c7ee010d5ec9d77bdb9e399e00302c0a1.91b885ab15d631bf9cee9dc9d25ece0afd932f2f5130eba28f2055b2220c0333.lock
2023-06-22 20:04:58,875 - filelock - INFO - Lock 140303718168840 released on /root/.cache/huggingface/transformers/23454919702d26495337f3da04d1655c7ee010d5ec9d77bdb9e399e00302c0a1.91b885ab15d631bf9cee9dc9d25ece0afd932f2f5130eba28f2055b2220c0333.lock
2023-06-22 20:04:58,990 - filelock - INFO - Lock 140300681038760 acquired on /root/.cache/huggingface/transformers/0e1bbfda7f63a99bb52e3915dcf10c3c92122b827d92eb2d34ce94ee79ba486c.cf47717d443acbff3940da39f5ddd0b17179607321d46f2c0a5060d2264eefd0.lock
2023-06-22 20:04:59,236 - filelock - INFO - Lock 140300681038760 released on /root/.cache/huggingface/transformers/0e1bbfda7f63a99bb52e3915dcf10c3c92122b827d92eb2d34ce94ee79ba486c.cf47717d443acbff3940da39f5ddd0b17179607321d46f2c0a5060d2264eefd0.lock
2023-06-22 20:04:59,351 - filelock - INFO - Lock 140300681038760 acquired on /root/.cache/huggingface/transformers/75abb59d7a06f4f640158a9bfcde005264e59e8d566781ab1415b139d2e4c603.53241bddd84f83cd6f1881886465d84bbf4f27be795658add74bee2568ac4587.lock
2023-06-22 20:04:59,566 - filelock - INFO - Lock 140300681038760 released on /root/.cache/huggingface/transformers/75abb59d7a06f4f640158a9bfcde005264e59e8d566781ab1415b139d2e4c603.53241bddd84f83cd6f1881886465d84bbf4f27be795658add74bee2568ac4587.lock
2023-06-22 20:04:59,907 - filelock - INFO - Lock 140300681038760 acquired on /root/.cache/huggingface/transformers/8c8624b8ac8aa99c60c912161f8332de003484428c47906d7ff7eb7f73eecdbb.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
2023-06-22 20:05:00,018 - filelock - INFO - Lock 140300681038760 released on /root/.cache/huggingface/transformers/8c8624b8ac8aa99c60c912161f8332de003484428c47906d7ff7eb7f73eecdbb.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
2023-06-22 20:05:02,959 - filelock - INFO - Lock 140300647498416 acquired on /root/.cache/huggingface/transformers/9c169103d7e5a73936dd2b627e42851bec0831212b677c637033ee4bce9ab5ee.126183e36667471617ae2f0835fab707baa54b731f991507ebbb55ea85adb12a.lock
2023-06-22 20:05:06,178 - filelock - INFO - Lock 140300647498416 released on /root/.cache/huggingface/transformers/9c169103d7e5a73936dd2b627e42851bec0831212b677c637033ee4bce9ab5ee.126183e36667471617ae2f0835fab707baa54b731f991507ebbb55ea85adb12a.lock
Some weights of the model checkpoint at distilbert-base-uncased were not used when initializing DistilBertForSequenceClassification: ['vocab_projector.weight', 'vocab_transform.bias', 'vocab_layer_norm.bias', 'vocab_layer_norm.weight', 'vocab_transform.weight', 'vocab_projector.bias']
- This IS expected if you are initializing DistilBertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing DistilBertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of DistilBertForSequenceClassification were not initialized from the model checkpoint at distilbert-base-uncased and are newly initialized: ['classifier.bias', 'pre_classifier.weight', 'pre_classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2023-06-22 20:05:08.910 algo-1:27 INFO utils.py:27] RULE_JOB_STOP_SIGNAL_FILENAME: None
[2023-06-22 20:05:08.965 algo-1:27 INFO profiler_config_parser.py:102] Unable to find config at /opt/ml/input/config/profilerconfig.json. Profiler is disabled.
{'eval_loss': 0.14932341873645782, 'eval_accuracy': 0.936289500509684, 'eval_f1': 0.9643163003140165, 'eval_precision': 0.9510135135135135, 'eval_recall': 0.9779965257672264, 'eval_runtime': 10.8827, 'eval_samples_per_second': 180.287, 'epoch': 1.0}
{'train_runtime': 251.0795, 'train_samples_per_second': 1.979, 'epoch': 1.0}
***** Eval results *****
2023-06-22 20:09:32,262 sagemaker-training-toolkit INFO     Reporting training SUCCESS

2023-06-22 20:09:38 Uploading - Uploading generated training model
2023-06-22 20:10:04 Completed - Training job completed
Training seconds: 493
Billable seconds: 493

Download the trained model files

[16]:
! aws s3 cp {huggingface_estimator.model_data} model.tar.gz
! mkdir -p {model_path}
! tar -xvf model.tar.gz -C  {model_path}/
download: s3://sagemaker-us-west-2-000000000000/huggingface-pytorch-training-2023-06-22-20-00-31-761/output/model.tar.gz to ./model.tar.gz
tokenizer.json
training_args.bin
tokenizer_config.json
special_tokens_map.json
config.json
vocab.txt
pytorch_model.bin

Prepare model container definition

We are going to use the trained model files along with the HuggingFace Inference container to deploy the model to a SageMaker endpoint.

[17]:
with tarfile.open("hf_model.tar.gz", mode="w:gz") as archive:
    archive.add(model_path, recursive=True)
    archive.add("code/")
directory_name = s3_prefix.split("/")[-1]
zipped_model_path = sagemaker_session.upload_data(
    path="hf_model.tar.gz", key_prefix=directory_name + "/hf-model-sm"
)
zipped_model_path
[17]:
's3://sagemaker-us-west-2-000000000000/DEMO-NLP-Women-Clothing-1687464029-bfff/hf-model-sm/hf_model.tar.gz'
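
If needed, the archive contents can be sanity-checked before deployment. A quick, purely illustrative sketch using the tarfile module imported above:

# Print the archive members to confirm the model files and the code/ directory were packaged.
with tarfile.open("hf_model.tar.gz") as archive:
    for member in archive.getnames():
        print(member)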

Create a new model object and then update its model artifact and inference script. The model object will be used to create the SageMaker model.

[18]:
model = huggingface_estimator.create_model(name=model_name)
container_def = model.prepare_container_def(instance_type=instance_type)
container_def["ModelDataUrl"] = zipped_model_path
container_def["Environment"]["SAGEMAKER_PROGRAM"] = "inference.py"
pprint.pprint(container_def)
{'Environment': {'SAGEMAKER_CONTAINER_LOG_LEVEL': '20',
                 'SAGEMAKER_PROGRAM': 'inference.py',
                 'SAGEMAKER_REGION': 'us-west-2',
                 'SAGEMAKER_SUBMIT_DIRECTORY': ''},
 'Image': '763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-inference:1.7.1-transformers4.6.1-cpu-py36-ubuntu18.04',
 'ModelDataUrl': 's3://sagemaker-us-west-2-000000000000/DEMO-NLP-Women-Clothing-1687464029-bfff/hf-model-sm/hf_model.tar.gz'}

Create endpoint

Create model

The following parameters are required to create a SageMaker model:

  • ExecutionRoleArn: The ARN of the IAM role that Amazon SageMaker can assume to access the model artifacts and Docker images for deployment.

  • ModelName: The name of the SageMaker model.

  • PrimaryContainer: The location of the primary Docker image containing inference code, associated artifacts, and the custom environment map that the inference code uses when the model is deployed for predictions.

[19]:
sagemaker_client.create_model(
    ExecutionRoleArn=role,
    ModelName=model_name,
    PrimaryContainer=container_def,
)
print(f"Model created: {model_name}")
Model created: DEMO-NLP-Women-Clothing-1687464029-bfff-model

Create endpoint config

Create an endpoint configuration by calling the create_endpoint_config API. Here, supply the same model_name used in the create_model API call. create_endpoint_config now supports the additional parameter ClarifyExplainerConfig to enable the Clarify explainer. The SHAP baseline is mandatory; it can be provided either as inline baseline data (the ShapBaseline parameter) or as an S3 baseline file (the ShapBaselineUri parameter). The baseline dataset type must match the input dataset type, and baseline samples must include only features. For more details on baseline selection, please refer to this documentation.

Please see the API documentation for details on other config parameters.

Here we use a special token as the baseline.

[20]:
baseline = [["<UNK>"]]
print(f"SHAP baseline: {baseline}")
SHAP baseline: [['<UNK>']]
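
Alternatively, the baseline could be supplied as a file in S3 via the ShapBaselineUri parameter instead of inline data. A minimal sketch, assuming a hypothetical baseline.csv with the same single-feature layout:

# Hypothetical alternative: write the baseline to a CSV file and upload it to S3.
with open("baseline.csv", "w") as f:
    f.write(csv_serializer.serialize(baseline))

baseline_uri = S3Uploader.upload(
    "baseline.csv",
    f"s3://{sagemaker_session.default_bucket()}/{s3_prefix}/baseline",
)
# Then, in ShapBaselineConfig below, use {"ShapBaselineUri": baseline_uri}
# instead of {"ShapBaseline": ...}.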

The TextConfig is configured with sentence-level granularity (when granularity is sentence, each sentence is a feature, and we need a few sentences per review for a good visualization) and English as the language.

[21]:
sagemaker_client.create_endpoint_config(
    EndpointConfigName=endpoint_config_name,
    ProductionVariants=[
        {
            "VariantName": "TestVariant",
            "ModelName": model_name,
            "InitialInstanceCount": 1,
            "InstanceType": instance_type,
        }
    ],
    ExplainerConfig={
        "ClarifyExplainerConfig": {
            "InferenceConfig": {"FeatureTypes": ["text"]},
            "ShapConfig": {
                "ShapBaselineConfig": {"ShapBaseline": csv_serializer.serialize(baseline)},
                "TextConfig": {"Granularity": "sentence", "Language": "en"},
            },
        }
    },
)
[21]:
{'EndpointConfigArn': 'arn:aws:sagemaker:us-west-2:000000000000:endpoint-config/demo-nlp-women-clothing-1687464029-bfff-endpoint-config',
 'ResponseMetadata': {'RequestId': 'ad8d98fe-ac16-4227-b80b-570abb94ac58',
  'HTTPStatusCode': 200,
  'HTTPHeaders': {'x-amzn-requestid': 'ad8d98fe-ac16-4227-b80b-570abb94ac58',
   'content-type': 'application/x-amz-json-1.1',
   'content-length': '136',
   'date': 'Thu, 22 Jun 2023 20:10:54 GMT'},
  'RetryAttempts': 0}}

Create endpoint

Once you have your model and endpoint configuration ready, use the create_endpoint API to create your endpoint. The endpoint_name must be unique within an AWS Region in your AWS account. The create_endpoint API is synchronous and returns an immediate response with the endpoint in the Creating state.

[22]:
sagemaker_client.create_endpoint(
    EndpointName=endpoint_name,
    EndpointConfigName=endpoint_config_name,
)
[22]:
{'EndpointArn': 'arn:aws:sagemaker:us-west-2:000000000000:endpoint/demo-nlp-women-clothing-1687464029-bfff-endpoint',
 'ResponseMetadata': {'RequestId': 'd8f02d0a-b221-4dc3-8a58-317eb9077844',
  'HTTPStatusCode': 200,
  'HTTPHeaders': {'x-amzn-requestid': 'd8f02d0a-b221-4dc3-8a58-317eb9077844',
   'content-type': 'application/x-amz-json-1.1',
   'content-length': '116',
   'date': 'Thu, 22 Jun 2023 20:10:55 GMT'},
  'RetryAttempts': 0}}

Wait for the endpoint to be in “InService” state

[23]:
sagemaker_session.wait_for_endpoint(endpoint_name)
--!
[23]:
{'EndpointName': 'DEMO-NLP-Women-Clothing-1687464029-bfff-endpoint',
 'EndpointArn': 'arn:aws:sagemaker:us-west-2:000000000000:endpoint/demo-nlp-women-clothing-1687464029-bfff-endpoint',
 'EndpointConfigName': 'DEMO-NLP-Women-Clothing-1687464029-bfff-endpoint-config',
 'ProductionVariants': [{'VariantName': 'TestVariant',
   'DeployedImages': [{'SpecifiedImage': '763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-inference:1.7.1-transformers4.6.1-cpu-py36-ubuntu18.04',
     'ResolvedImage': '763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-inference@sha256:97cdf11484b82818b195579c7b5d8f16bc97d600ae352f47667e0587de7ae7f0',
     'ResolutionTime': datetime.datetime(2023, 6, 22, 20, 10, 56, 696000, tzinfo=tzlocal())}],
   'CurrentWeight': 1.0,
   'DesiredWeight': 1.0,
   'CurrentInstanceCount': 1,
   'DesiredInstanceCount': 1}],
 'EndpointStatus': 'InService',
 'CreationTime': datetime.datetime(2023, 6, 22, 20, 10, 56, 158000, tzinfo=tzlocal()),
 'LastModifiedTime': datetime.datetime(2023, 6, 22, 20, 13, 15, 436000, tzinfo=tzlocal()),
 'ExplainerConfig': {'ClarifyExplainerConfig': {'InferenceConfig': {'FeatureTypes': ['text']},
   'ShapConfig': {'ShapBaselineConfig': {'ShapBaseline': '<UNK>'},
    'TextConfig': {'Language': 'en', 'Granularity': 'sentence'}}}},
 'ResponseMetadata': {'RequestId': 'c3d17eaf-1043-4f7b-bf54-0c7361ab6693',
  'HTTPStatusCode': 200,
  'HTTPHeaders': {'x-amzn-requestid': 'c3d17eaf-1043-4f7b-bf54-0c7361ab6693',
   'content-type': 'application/x-amz-json-1.1',
   'content-length': '1068',
   'date': 'Thu, 22 Jun 2023 20:13:26 GMT'},
  'RetryAttempts': 0}}
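
For reference, the helper above is roughly equivalent to polling the describe_endpoint API until the status leaves the Creating state. A minimal sketch:

import time

# Poll the endpoint status; wait_for_endpoint does this (with progress output) for you.
status = "Creating"
while status == "Creating":
    time.sleep(30)
    status = sagemaker_client.describe_endpoint(EndpointName=endpoint_name)["EndpointStatus"]
print(f"Endpoint status: {status}")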

Invoke endpoint

There are expanding business needs and legislative regulations that require explanations of why a model made the decision it did. SageMaker Clarify uses SHAP to explain the contribution that each input feature makes to the final decision.

The Kernel SHAP algorithm requires a baseline (also known as a background dataset). The baseline can be provided either as an S3 URI to a baseline dataset file or as an in-place list of records; its dataset type must match the original request data type, and baseline records should include only features.

Below are several different combinations of endpoint invocation; call them one by one and visualize the explanations by running the subsequent cells.

Single record request

Put only one record in the request body, and then send the request to the endpoint to get its predictions and explanations.

[24]:
num_records = 1
[25]:
response = sagemaker_runtime_client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="text/csv",
    Accept="text/csv",
    Body=csv_serializer.serialize(test_data.iloc[:num_records, :].to_numpy()),
)
pprint.pprint(response)
{'Body': <botocore.response.StreamingBody object at 0x7fbc8bb4ffd0>,
 'ContentType': 'application/json',
 'InvokedProductionVariant': 'TestVariant',
 'ResponseMetadata': {'HTTPHeaders': {'connection': 'keep-alive',
                                      'content-length': '809',
                                      'content-type': 'application/json',
                                      'date': 'Thu, 22 Jun 2023 20:13:28 GMT',
                                      'x-amzn-invoked-production-variant': 'TestVariant',
                                      'x-amzn-requestid': '3acca534-1feb-42dc-b322-9f8d64f27e75'},
                      'HTTPStatusCode': 200,
                      'RequestId': '3acca534-1feb-42dc-b322-9f8d64f27e75',
                      'RetryAttempts': 0}}
[26]:
result = json_deserializer.deserialize(response["Body"], content_type=response["ContentType"])
pprint.pprint(result)
{'explanations': {'kernel_shap': [[{'attributions': [{'attribution': [0.06300842799999996],
                                                      'description': {'partial_text': 'I '
                                                                                      'am '
                                                                                      '5\'6", '
                                                                                      '130 '
                                                                                      'lbs '
                                                                                      'with '
                                                                                      'an '
                                                                                      'athletic '
                                                                                      'body '
                                                                                      'type '
                                                                                      'and '
                                                                                      'i '
                                                                                      'ordered '
                                                                                      'a '
                                                                                      'size '
                                                                                      'small.',
                                                                      'start_idx': 0}},
                                                     {'attribution': [-0.17682224599999996],
                                                      'description': {'partial_text': 'these '
                                                                                      'were '
                                                                                      'really '
                                                                                      'baggy '
                                                                                      'in '
                                                                                      'the '
                                                                                      'thigh/quadricep '
                                                                                      'area '
                                                                                      'and '
                                                                                      'made '
                                                                                      'my '
                                                                                      'thighs '
                                                                                      'look '
                                                                                      'bulky.',
                                                                      'start_idx': 74}},
                                                     {'attribution': [0.19618576600000012],
                                                      'description': {'partial_text': 'the '
                                                                                      'fabric '
                                                                                      'quality '
                                                                                      'is '
                                                                                      'very '
                                                                                      'nice '
                                                                                      'and '
                                                                                      'i '
                                                                                      'like '
                                                                                      'the '
                                                                                      'idea '
                                                                                      'of '
                                                                                      'them '
                                                                                      'for '
                                                                                      'curvier '
                                                                                      'body '
                                                                                      'types.',
                                                                      'start_idx': 157}},
                                                     {'attribution': [0.0976928519999999],
                                                      'description': {'partial_text': 'my '
                                                                                      'son '
                                                                                      'commented '
                                                                                      'that '
                                                                                      'they '
                                                                                      'looked '
                                                                                      'like '
                                                                                      'pajama '
                                                                                      'pants '
                                                                                      'and '
                                                                                      'i '
                                                                                      'agreed.',
                                                                      'start_idx': 241}}],
                                    'feature_type': 'text'}]]},
 'predictions': {'content_type': 'text/csv', 'data': '0.9713779\n'},
 'version': '1.0'}
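
Note that the predictions payload is itself CSV. If you want the raw scores rather than the visualization below, they can be parsed with a couple of lines, e.g.:

# Parse the CSV-formatted prediction scores into floats.
scores = [float(line) for line in result["predictions"]["data"].strip().split("\n")]
print(scores)  # [0.9713779]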
[27]:
visualize_result(result, df_test[label_header][:num_records])
Legend: Negative Neutral Positive
True Label | Predicted Label | Attribution Label | Attribution Score | Word Importance
0 | 1 (0.97) | True | 0.92 | I am 5'6", 130 lbs with an athletic body type and i ordered a size small. these were really baggy in the thigh/quadricep area and made my thighs look bulky. the fabric quality is very nice and i like the idea of them for curvier body types. my son commented that they looked like pajama pants and i agreed.

Single record request, no explanation

Use the EnableExplanations parameter to disable explanations for this request. The value is a JMESPath expression, so the boolean literal is written with backticks as `false`.

[28]:
num_records = 1
[29]:
response = sagemaker_runtime_client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="text/csv",
    Accept="text/csv",
    Body=csv_serializer.serialize(test_data.iloc[:num_records, :].to_numpy()),
    EnableExplanations="`false`",  # Do not provide explanations
)
pprint.pprint(response)
{'Body': <botocore.response.StreamingBody object at 0x7fbc8cd62440>,
 'ContentType': 'application/json',
 'InvokedProductionVariant': 'TestVariant',
 'ResponseMetadata': {'HTTPHeaders': {'connection': 'keep-alive',
                                      'content-length': '98',
                                      'content-type': 'application/json',
                                      'date': 'Thu, 22 Jun 2023 20:13:28 GMT',
                                      'x-amzn-invoked-production-variant': 'TestVariant',
                                      'x-amzn-requestid': '268abfc9-870c-4423-8e38-30a86add6a20'},
                      'HTTPStatusCode': 200,
                      'RequestId': '268abfc9-870c-4423-8e38-30a86add6a20',
                      'RetryAttempts': 0}}
[30]:
result = json_deserializer.deserialize(response["Body"], content_type=response["ContentType"])
pprint.pprint(result)
{'explanations': {},
 'predictions': {'content_type': 'text/csv', 'data': '0.9713779\n'},
 'version': '1.0'}
[31]:
visualize_result(result, df_test[label_header][:num_records])
No Clarify explanations for the record(s)

Batch request, explain both

Put two records in the request body, and then send the request to the endpoint to get their predictions and explanations.

[32]:
num_records = 2
[33]:
response = sagemaker_runtime_client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="text/csv",
    Accept="text/csv",
    Body=csv_serializer.serialize(test_data.iloc[:num_records, :].to_numpy()),
)
pprint.pprint(response)
{'Body': <botocore.response.StreamingBody object at 0x7fbc8bb4ee30>,
 'ContentType': 'application/json',
 'InvokedProductionVariant': 'TestVariant',
 'ResponseMetadata': {'HTTPHeaders': {'connection': 'keep-alive',
                                      'content-length': '1574',
                                      'content-type': 'application/json',
                                      'date': 'Thu, 22 Jun 2023 20:13:31 GMT',
                                      'x-amzn-invoked-production-variant': 'TestVariant',
                                      'x-amzn-requestid': '168b10a5-27d2-479a-a5cc-bb25d41e7030'},
                      'HTTPStatusCode': 200,
                      'RequestId': '168b10a5-27d2-479a-a5cc-bb25d41e7030',
                      'RetryAttempts': 0}}
[34]:
result = json_deserializer.deserialize(response["Body"], content_type=response["ContentType"])
pprint.pprint(result)
{'explanations': {'kernel_shap': [[{'attributions': [{'attribution': [0.06300842799999996],
                                                      'description': {'partial_text': 'I '
                                                                                      'am '
                                                                                      '5\'6", '
                                                                                      '130 '
                                                                                      'lbs '
                                                                                      'with '
                                                                                      'an '
                                                                                      'athletic '
                                                                                      'body '
                                                                                      'type '
                                                                                      'and '
                                                                                      'i '
                                                                                      'ordered '
                                                                                      'a '
                                                                                      'size '
                                                                                      'small.',
                                                                      'start_idx': 0}},
                                                     {'attribution': [-0.17682224599999996],
                                                      'description': {'partial_text': 'these '
                                                                                      'were '
                                                                                      'really '
                                                                                      'baggy '
                                                                                      'in '
                                                                                      'the '
                                                                                      'thigh/quadricep '
                                                                                      'area '
                                                                                      'and '
                                                                                      'made '
                                                                                      'my '
                                                                                      'thighs '
                                                                                      'look '
                                                                                      'bulky.',
                                                                      'start_idx': 74}},
                                                     {'attribution': [0.19618576600000012],
                                                      'description': {'partial_text': 'the '
                                                                                      'fabric '
                                                                                      'quality '
                                                                                      'is '
                                                                                      'very '
                                                                                      'nice '
                                                                                      'and '
                                                                                      'i '
                                                                                      'like '
                                                                                      'the '
                                                                                      'idea '
                                                                                      'of '
                                                                                      'them '
                                                                                      'for '
                                                                                      'curvier '
                                                                                      'body '
                                                                                      'types.',
                                                                      'start_idx': 157}},
                                                     {'attribution': [0.0976928519999999],
                                                      'description': {'partial_text': 'my '
                                                                                      'son '
                                                                                      'commented '
                                                                                      'that '
                                                                                      'they '
                                                                                      'looked '
                                                                                      'like '
                                                                                      'pajama '
                                                                                      'pants '
                                                                                      'and '
                                                                                      'i '
                                                                                      'agreed.',
                                                                      'start_idx': 241}}],
                                    'feature_type': 'text'}],
                                  [{'attributions': [{'attribution': [0.0625544215],
                                                      'description': {'partial_text': 'The '
                                                                                      'design '
                                                                                      'on '
                                                                                      'the '
                                                                                      'blue '
                                                                                      'sweater '
                                                                                      'is '
                                                                                      'actually '
                                                                                      'a '
                                                                                      'dark '
                                                                                      'navy '
                                                                                      '(not '
                                                                                      'black, '
                                                                                      'as '
                                                                                      'i '
                                                                                      'thought '
                                                                                      'it '
                                                                                      'was), '
                                                                                      'but '
                                                                                      'it '
                                                                                      'still '
                                                                                      'looks '
                                                                                      'beautiful '
                                                                                      'with '
                                                                                      'black '
                                                                                      'underneath.',
                                                                      'start_idx': 0}},
                                                     {'attribution': [0.0713343185],
                                                      'description': {'partial_text': 'rather '
                                                                                      'than '
                                                                                      'jeans '
                                                                                      'like '
                                                                                      "it's "
                                                                                      'shown, '
                                                                                      'the '
                                                                                      'v-neck '
                                                                                      'is '
                                                                                      'a '
                                                                                      'nice '
                                                                                      'change '
                                                                                      'from '
                                                                                      'other '
                                                                                      'cardigans '
                                                                                      'i '
                                                                                      'have '
                                                                                      '- '
                                                                                      'looks '
                                                                                      'great '
                                                                                      'with '
                                                                                      'a '
                                                                                      'cami '
                                                                                      'or '
                                                                                      'another '
                                                                                      'v-neck '
                                                                                      'underneath '
                                                                                      'it.',
                                                                      'start_idx': 141}},
                                                     {'attribution': [-0.007584672500000035],
                                                      'description': {'partial_text': 'soft, '
                                                                                      'nice '
                                                                                      'medium-weight '
                                                                                      'and '
                                                                                      'not '
                                                                                      'at '
                                                                                      'all '
                                                                                      'itchy.',
                                                                      'start_idx': 291}},
                                                     {'attribution': [0.0749991925],
                                                      'description': {'partial_text': 'happy '
                                                                                      'with '
                                                                                      'this '
                                                                                      'as '
                                                                                      'an '
                                                                                      'easy '
                                                                                      'everyday '
                                                                                      'sweater.',
                                                                      'start_idx': 338}}],
                                    'feature_type': 'text'}]]},
 'predictions': {'content_type': 'text/csv', 'data': '0.9713779\n0.99261636\n'},
 'version': '1.0'}
[35]:
visualize_result(result, df_test[label_header][:num_records])
Legend: Negative Neutral Positive
True Label | Predicted Label | Attribution Label | Attribution Score | Word Importance
0 | 1 (0.97) | True | 0.92 | I am 5'6", 130 lbs with an athletic body type and i ordered a size small. these were really baggy in the thigh/quadricep area and made my thighs look bulky. the fabric quality is very nice and i like the idea of them for curvier body types. my son commented that they looked like pajama pants and i agreed.
1 | 1 (0.99) | True | 2.68 | The design on the blue sweater is actually a dark navy (not black, as i thought it was), but it still looks beautiful with black underneath. rather than jeans like it's shown, the v-neck is a nice change from other cardigans i have - looks great with a cami or another v-neck underneath it. soft, nice medium-weight and not at all itchy. happy with this as an easy everyday sweater.

Batch request with more records, explain some of the records

Put a few more records in the request body, and then use an EnableExplanations expression to filter the records to be explained according to their predictions (the expression semantics are sketched after the response below).

[36]:
num_records = 4
[37]:
response = sagemaker_runtime_client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="text/csv",
    Accept="text/csv",
    Body=csv_serializer.serialize(test_data.iloc[:num_records, :].to_numpy()),
    EnableExplanations="[0]>`0.99`",  # Explain a record only when its prediction meets the condition
)
pprint.pprint(response)
{'Body': <botocore.response.StreamingBody object at 0x7fbc8bb4db70>,
 'ContentType': 'application/json',
 'InvokedProductionVariant': 'TestVariant',
 'ResponseMetadata': {'HTTPHeaders': {'connection': 'keep-alive',
                                      'content-length': '1340',
                                      'content-type': 'application/json',
                                      'date': 'Thu, 22 Jun 2023 20:13:33 GMT',
                                      'x-amzn-invoked-production-variant': 'TestVariant',
                                      'x-amzn-requestid': 'd3ddf9ed-042d-41de-ad26-71a77fc90d0b'},
                      'HTTPStatusCode': 200,
                      'RequestId': 'd3ddf9ed-042d-41de-ad26-71a77fc90d0b',
                      'RetryAttempts': 0}}
[38]:
result = json_deserializer.deserialize(response["Body"], content_type=response["ContentType"])
pprint.pprint(result)
{'explanations': {'kernel_shap': [None,
                                  [{'attributions': [{'attribution': [0.0625544215],
                                                      'description': {'partial_text': 'The '
                                                                                      'design '
                                                                                      'on '
                                                                                      'the '
                                                                                      'blue '
                                                                                      'sweater '
                                                                                      'is '
                                                                                      'actually '
                                                                                      'a '
                                                                                      'dark '
                                                                                      'navy '
                                                                                      '(not '
                                                                                      'black, '
                                                                                      'as '
                                                                                      'i '
                                                                                      'thought '
                                                                                      'it '
                                                                                      'was), '
                                                                                      'but '
                                                                                      'it '
                                                                                      'still '
                                                                                      'looks '
                                                                                      'beautiful '
                                                                                      'with '
                                                                                      'black '
                                                                                      'underneath.',
                                                                      'start_idx': 0}},
                                                     {'attribution': [0.0713343185],
                                                      'description': {'partial_text': 'rather '
                                                                                      'than '
                                                                                      'jeans '
                                                                                      'like '
                                                                                      "it's "
                                                                                      'shown, '
                                                                                      'the '
                                                                                      'v-neck '
                                                                                      'is '
                                                                                      'a '
                                                                                      'nice '
                                                                                      'change '
                                                                                      'from '
                                                                                      'other '
                                                                                      'cardigans '
                                                                                      'i '
                                                                                      'have '
                                                                                      '- '
                                                                                      'looks '
                                                                                      'great '
                                                                                      'with '
                                                                                      'a '
                                                                                      'cami '
                                                                                      'or '
                                                                                      'another '
                                                                                      'v-neck '
                                                                                      'underneath '
                                                                                      'it.',
                                                                      'start_idx': 141}},
                                                     {'attribution': [-0.007584672500000035],
                                                      'description': {'partial_text': 'soft, '
                                                                                      'nice '
                                                                                      'medium-weight '
                                                                                      'and '
                                                                                      'not '
                                                                                      'at '
                                                                                      'all '
                                                                                      'itchy.',
                                                                      'start_idx': 291}},
                                                     {'attribution': [0.0749991925],
                                                      'description': {'partial_text': 'happy '
                                                                                      'with '
                                                                                      'this '
                                                                                      'as '
                                                                                      'an '
                                                                                      'easy '
                                                                                      'everyday '
                                                                                      'sweater.',
                                                                      'start_idx': 338}}],
                                    'feature_type': 'text'}],
                                  None,
                                  [{'attributions': [{'attribution': [0.0685000183333333],
                                                      'description': {'partial_text': 'A '
                                                                                      'very '
                                                                                      'versatile '
                                                                                      'and '
                                                                                      'cozy '
                                                                                      'top.',
                                                                      'start_idx': 0}},
                                                     {'attribution': [0.06710124333333331],
                                                      'description': {'partial_text': 'would '
                                                                                      'look '
                                                                                      'great '
                                                                                      'dressed '
                                                                                      'up '
                                                                                      'or '
                                                                                      'down '
                                                                                      'for '
                                                                                      'a '
                                                                                      'casual '
                                                                                      'comfy '
                                                                                      'fall '
                                                                                      'day.',
                                                                      'start_idx': 31}},
                                                     {'attribution': [0.06915193833333336],
                                                      'description': {'partial_text': 'what '
                                                                                      'a '
                                                                                      'fun '
                                                                                      'piece '
                                                                                      'for '
                                                                                      'my '
                                                                                      'wardrobe!',
                                                                      'start_idx': 96}}],
                                    'feature_type': 'text'}]]},
 'predictions': {'content_type': 'text/csv',
                 'data': '0.9713779\n0.99261636\n0.29229787\n0.9960663\n'},
 'version': '1.0'}
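
The None entries in kernel_shap correspond to records whose predictions did not satisfy the expression. The filtering can be reproduced locally; a minimal sketch, assuming the standard jmespath package (the expression language used by EnableExplanations):

import jmespath

# Prediction scores from the response above, one list of model outputs per record.
predictions = [[0.9713779], [0.99261636], [0.29229787], [0.9960663]]

# Records evaluating to False get a None explanation in the response.
expression = jmespath.compile("[0] > `0.99`")
for record in predictions:
    print(record, "->", expression.search(record))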
[39]:
visualize_result(result, df_test[label_header][:num_records])
Legend: Negative Neutral Positive
True Label | Predicted Label | Attribution Label | Attribution Score | Word Importance
1 | 1 (0.99) | True | 2.68 | The design on the blue sweater is actually a dark navy (not black, as i thought it was), but it still looks beautiful with black underneath. rather than jeans like it's shown, the v-neck is a nice change from other cardigans i have - looks great with a cami or another v-neck underneath it. soft, nice medium-weight and not at all itchy. happy with this as an easy everyday sweater.
1 | 1 (1.00) | True | 2.96 | A very versatile and cozy top. would look great dressed up or down for a casual comfy fall day. what a fun piece for my wardrobe!

Cleanup

Finally, don’t forget to clean up the resources we set up and used for this demo!

[40]:
sagemaker_client.delete_endpoint(EndpointName=endpoint_name);
[41]:
sagemaker_client.delete_endpoint_config(EndpointConfigName=endpoint_config_name);
[42]:
sagemaker_client.delete_model(ModelName=model_name);

Notebook CI Test Results

This notebook was tested in multiple regions. The test results are as follows, except for us-west-2 which is shown at the top of the notebook.

[CI badge images failed to load for the following regions: us-east-1, us-east-2, us-west-1, ca-central-1, sa-east-1, eu-west-1, eu-west-2, eu-west-3, eu-central-1, eu-north-1, ap-southeast-1, ap-southeast-2, ap-northeast-1, ap-northeast-2, ap-south-1]