This lab provides a hands-on introduction to finetuning foundational models with AWS SageMaker.
This lab relies heavily on a Jupyter notebook. If you are unfamiliar with Jupyter notebooks, please refer to the Jupyter Documentation for more information. The instructions in this README help you set up the environment, but the lab itself is contained in the domain-adaption-finetuning-gpt-j-6b.ipynb Jupyter notebook found in this repository.
- Lab Objectives
- Prerequisites
- Setting Up Your Environment
- Lab Instructions
- Credits and Acknowledgments
In this lab, you will learn how to:
- Deploy and interact with a SageMaker JumpStart foundational model.
- Finetune a foundational model using AWS SageMaker.
- Deploy a finetuned model to an endpoint.
- Make inferences using the deployed model.
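To give a feel for the inference step, the sketch below builds the kind of request payload a JumpStart text-generation endpoint commonly accepts. The model ID and payload field names here are illustrative assumptions; the lab notebook defines the exact values it uses.

```python
import json

# Illustrative model ID -- the actual ID is defined in the lab notebook (assumption).
MODEL_ID = "huggingface-textgeneration1-gpt-j-6b"

def build_payload(prompt: str, max_length: int = 64, temperature: float = 0.7) -> dict:
    """Build a text-generation request body in the shape commonly accepted
    by JumpStart text-generation endpoints (field names are assumptions)."""
    return {
        "text_inputs": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }

# Serialize the payload the way an endpoint invocation would.
body = json.dumps(build_payload("Amazon SageMaker is"))
print(body)

# Deploying and invoking requires the `sagemaker` SDK and AWS credentials;
# shown as comments here for reference only:
#   from sagemaker.jumpstart.model import JumpStartModel
#   model = JumpStartModel(model_id=MODEL_ID)
#   predictor = model.deploy()
#   predictor.predict(build_payload("Amazon SageMaker is"))
```

The notebook walks through the real deployment; this sketch is only meant to show the request/response plumbing you will be exercising.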
- An active AWS account with access to the Amazon SageMaker service.
- Basic knowledge of Python programming.
This lab uses a Jupyter Notebook to run the code. There are multiple options for running a Jupyter Notebook:
- For a fully-managed environment with rich AI/ML features, we'd recommend using SageMaker Studio. To get started quickly, you can refer to the instructions for domain quick setup.
- For a fully-managed but more basic experience, you could instead create a SageMaker Notebook Instance.
- If you prefer to use your existing (local or other) notebook environment, make sure it has credentials for calling AWS.
Using one of the notebook environments listed in the Choose a notebook environment section, create a new notebook instance.
Clone this repository to your notebook instance. This can be done by running the following command in a terminal:
```shell
git clone [email protected]:trek10inc/internal/machine-learning/generative-ai-curriculum.git
cd generative-ai-curriculum
```
You're now ready to explore the lab notebooks! Start with domain-adaption-finetuning-gpt-j-6b.ipynb for details on how to finetune foundational models using AWS SageMaker.
Open the PaaS/lab01-sagemaker-finetuning-foundational-model/domain-adaption-finetuning-gpt-j-6b.ipynb notebook in your notebook environment.
Follow the instructions in the notebook to complete the lab.
This notebook will walk you through deploying, finetuning, and invoking a foundational model with AWS SageMaker from Python.
This lab was heavily inspired by AWS's SageMaker Examples Repository.