Training Compiler
Train deep learning (DL) models faster on scalable GPU instances managed by SageMaker.
Get started with SageMaker Training Compiler
PyTorch with Hugging Face Transformers
For single-node single-GPU training:
- Compile and Train a Hugging Face Transformers Trainer Model for Question Answering with the SQuAD dataset
- Compile and Train a Hugging Face Transformer BERT Model with the SST Dataset using SageMaker Training Compiler
- Compile and Train a Binary Classification Trainer Model with the SST2 Dataset for Single-Node Single-GPU Training
- Compile and Train a Vision Transformer Model on the Caltech-256 Dataset using a Single Node
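The single-node single-GPU examples above all enable the compiler the same way: the SageMaker Python SDK's Hugging Face estimator takes a `compiler_config` argument set to `TrainingCompilerConfig()`. The sketch below assembles the estimator keyword arguments as a plain dict so it runs without AWS credentials; the entry-point script name, role ARN, instance type, and version pair are hypothetical placeholders, not values from this page.

```python
# Sketch: keyword arguments for a SageMaker Hugging Face estimator with
# Training Compiler enabled. In a real job these would be passed as
#   sagemaker.huggingface.HuggingFace(**kwargs,
#                                     compiler_config=TrainingCompilerConfig())
# The entry point, role ARN, and framework versions are placeholders.

def training_compiler_estimator_kwargs(entry_point: str,
                                       instance_type: str = "ml.p3.2xlarge") -> dict:
    """Build estimator kwargs for a single-node single-GPU compiled training job."""
    return {
        "entry_point": entry_point,           # your training script
        "instance_count": 1,                  # single node
        "instance_type": instance_type,       # single-GPU instance (example)
        "role": "arn:aws:iam::111122223333:role/ExampleSageMakerRole",  # placeholder
        "transformers_version": "4.21",       # example version pair; check the
        "pytorch_version": "1.11",            # developer guide for supported combos
        "py_version": "py38",
    }

kwargs = training_compiler_estimator_kwargs("train_qa.py")
print(kwargs["instance_count"], kwargs["instance_type"])
```

Supported instance types and framework version pairs change over time; check the developer guide linked below before pinning versions.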
For single-node multi-GPU training:
For multi-node multi-GPU training:
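For the multi-GPU cases, the main difference from the single-GPU setup is the estimator's `distribution` parameter, which the SageMaker documentation enables for Training Compiler via a `pytorchxla` entry. A minimal sketch, again as plain dicts with no AWS calls; the instance type is illustrative:

```python
# Sketch: settings that distinguish multi-GPU Training Compiler jobs.
# Passed to the Hugging Face estimator alongside
# compiler_config=TrainingCompilerConfig(). Instance type is an example.

def multi_gpu_settings(instance_count: int) -> dict:
    """Settings for single-node (instance_count=1) or multi-node multi-GPU jobs."""
    return {
        "instance_count": instance_count,                   # >1 for multi-node
        "instance_type": "ml.p3.16xlarge",                  # multi-GPU instance (example)
        "distribution": {"pytorchxla": {"enabled": True}},  # enables distributed
                                                            # training with the compiler
    }

single_node = multi_gpu_settings(1)
multi_node = multi_gpu_settings(2)
print(single_node["distribution"]["pytorchxla"]["enabled"], multi_node["instance_count"])
```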
For more information, see Amazon SageMaker Training Compiler in the Amazon SageMaker Developer Guide.