{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Orchestrating Jobs, Model Registration, and Continuous Deployment with Amazon SageMaker\n",
"\n",
"Amazon SageMaker offers Machine Learning application developers and Machine Learning operations engineers the ability to orchestrate SageMaker jobs and author reproducible Machine Learning pipelines, deploy custom-build models for inference in real-time with low latency or offline inferences with Batch Transform, and track lineage of artifacts. You can institute sound operational practices in deploying and monitoring production workflows, deployment of model artifacts, and track artifact lineage through a simple interface, adhering to safety and best-practice paradigmsfor Machine Learning application development.\n",
"\n",
"The SageMaker Pipelines service supports a SageMaker Machine Learning Pipeline Domain Specific Language (DSL), which is a declarative Json specification. This DSL defines a Directed Acyclic Graph (DAG) of pipeline parameters and SageMaker job steps. The SageMaker Python Software Developer Kit (SDK) streamlines the generation of the pipeline DSL using constructs that are already familiar to engineers and scientists alike.\n",
"\n",
"The SageMaker Model Registry is where trained models are stored, versioned, and managed. Data Scientists and Machine Learning Engineers can compare model versions, approve models for deployment, and deploy models from different AWS accounts, all from a single Model Registry. SageMaker enables customers to follow the best practices with ML Ops and getting started right. Customers are able to standup a full ML Ops end-to-end system with a single API call."
]
},
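{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal sketch of how the Python SDK generates that JSON DSL (the pipeline name and parameter here are illustrative, and the snippet assumes an environment with AWS credentials and a default region, such as this notebook), a pipeline object can be serialized to its declarative definition like this:\n",
"\n",
"```python\n",
"import json\n",
"\n",
"from sagemaker.workflow.parameters import ParameterString\n",
"from sagemaker.workflow.pipeline import Pipeline\n",
"\n",
"# A hypothetical pipeline parameter with a default value.\n",
"input_data = ParameterString(name=\"InputDataUrl\", default_value=\"s3://example-bucket/data\")\n",
"\n",
"# Steps would normally be ProcessingStep/TrainingStep/etc. objects; an empty\n",
"# list keeps this sketch self-contained.\n",
"pipeline = Pipeline(name=\"ExamplePipeline\", parameters=[input_data], steps=[])\n",
"\n",
"# pipeline.definition() returns the JSON DSL (the DAG specification) as a string.\n",
"print(json.dumps(json.loads(pipeline.definition()), indent=2))\n",
"```"
]
},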
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## SageMaker Pipelines\n",
"\n",
"Amazon SageMaker Pipelines support the following activites:\n",
"\n",
"* Pipelines - A Directed Acyclic Graph of steps and conditions to orchestrate SageMaker jobs and resource creation.\n",
"* Processing Job steps - A simplified, managed experience on SageMaker to run data processing workloads, such as feature engineering, data validation, model evaluation, and model interpretation.\n",
"* Training Job steps - An iterative process that teaches a model to make predictions by presenting examples from a training dataset.\n",
"* Conditional step execution - Provides conditional execution of branches in a pipeline.\n",
"* Registering Models - Creates a model package resource in the Model Registry that can be used to create deployable models in Amazon SageMaker.\n",
"* Creating Model steps - Create a model for use in transform steps or later publication as an endpoint.\n",
"* Parameterized Pipeline executions - Allows pipeline executions to vary by supplied parameters.\n",
"* Transform Job steps - A batch transform to preprocess datasets to remove noise or bias that interferes with training or inference from your dataset, get inferences from large datasets, and run inference when you don't need a persistent endpoint."
]
},
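{
"cell_type": "markdown",
"metadata": {},
"source": [
"A rough sketch of how a couple of these steps are typically wired together with the Python SDK follows. The role ARN, image URI, bucket, and script name are placeholders rather than values from this project:\n",
"\n",
"```python\n",
"from sagemaker.estimator import Estimator\n",
"from sagemaker.inputs import TrainingInput\n",
"from sagemaker.processing import ProcessingOutput\n",
"from sagemaker.sklearn.processing import SKLearnProcessor\n",
"from sagemaker.workflow.steps import ProcessingStep, TrainingStep\n",
"\n",
"role = \"arn:aws:iam::123456789012:role/ExampleSageMakerRole\"  # placeholder\n",
"\n",
"# A Processing Job step that runs a feature-engineering script and exposes its\n",
"# output location as a property that downstream steps can reference.\n",
"processor = SKLearnProcessor(\n",
"    framework_version=\"1.0-1\", role=role, instance_type=\"ml.m5.xlarge\", instance_count=1\n",
")\n",
"step_process = ProcessingStep(\n",
"    name=\"Preprocess\",\n",
"    processor=processor,\n",
"    outputs=[ProcessingOutput(output_name=\"train\", source=\"/opt/ml/processing/train\")],\n",
"    code=\"preprocess.py\",  # placeholder script\n",
")\n",
"\n",
"# A Training Job step that consumes the processing step's output via its\n",
"# properties; this reference is what creates the DAG edge between the steps.\n",
"estimator = Estimator(\n",
"    image_uri=\"123456789012.dkr.ecr.us-east-1.amazonaws.com/example-image:latest\",  # placeholder\n",
"    role=role,\n",
"    instance_type=\"ml.m5.xlarge\",\n",
"    instance_count=1,\n",
"    output_path=\"s3://example-bucket/model\",  # placeholder\n",
")\n",
"step_train = TrainingStep(\n",
"    name=\"Train\",\n",
"    estimator=estimator,\n",
"    inputs={\n",
"        \"training\": TrainingInput(\n",
"            s3_data=step_process.properties.ProcessingOutputConfig.Outputs[\"train\"].S3Output.S3Uri\n",
"        )\n",
"    },\n",
")\n",
"```"
]
},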
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Layout of the SageMaker ModelBuild Project Template\n",
"\n",
"The template provides a starting point for bringing your SageMaker Pipeline development to production.\n",
"\n",
"```\n",
"|-- codebuild-buildspec.yml\n",
"|-- CONTRIBUTING.md\n",
"|-- pipelines\n",
"| |-- abalone\n",
"| | |-- evaluate.py\n",
"| | |-- __init__.py\n",
"| | |-- pipeline.py\n",
"| | `-- preprocess.py\n",
"| |-- get_pipeline_definition.py\n",
"| |-- __init__.py\n",
"| |-- run_pipeline.py\n",
"| |-- _utils.py\n",
"| `-- __version__.py\n",
"|-- README.md\n",
"|-- sagemaker-pipelines-project.ipynb\n",
"|-- setup.cfg\n",
"|-- setup.py\n",
"|-- tests\n",
"| `-- test_pipelines.py\n",
"`-- tox.ini\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A description of some of the artifacts is provided below:\n",
"
\n",
"Your codebuild execution instructions:\n",
"```\n",
"|-- codebuild-buildspec.yml\n",
"```\n",
"
\n",
"Your pipeline artifacts, which includes a pipeline module defining the required `get_pipeline` method that returns an instance of a SageMaker pipeline, a preprocessing script that is used in feature engineering, and a model evaluation script to measure the Mean Squared Error of the model that's trained by the pipeline:\n",
"\n",
"```\n",
"|-- pipelines\n",
"| |-- abalone\n",
"| | |-- evaluate.py\n",
"| | |-- __init__.py\n",
"| | |-- pipeline.py\n",
"| | `-- preprocess.py\n",
"\n",
"```\n",
"\n",
"For additional subfolders with code and/or artifacts needed by pipeline, they need to be packaged correctly by the `setup.py` file. For example, to package a `pipelines/source` folder,\n",
"\n",
"* Include a `__init__.py` file within the `source` folder.\n",
"* Add it to the `setup.py` file's `package_data` like so:\n",
"\n",
"```\n",
"...\n",
" packages=setuptools.find_packages(),\n",
" include_package_data=True,\n",
" package_data={\"pipelines.my_pipeline.src\": [\"*.txt\"]},\n",
" python_requires=\">=3.6\",\n",
" install_requires=required_packages,\n",
" extras_require=extras,\n",
"...\n",
"```\n",
"\n",
"
\n",
"Utility modules for getting pipeline definition jsons and running pipelines:\n",
"\n",
"```\n",
"|-- pipelines\n",
"| |-- get_pipeline_definition.py\n",
"| |-- __init__.py\n",
"| |-- run_pipeline.py\n",
"| |-- _utils.py\n",
"| `-- __version__.py\n",
"```\n",
"
\n",
"Python package artifacts:\n",
"```\n",
"|-- setup.cfg\n",
"|-- setup.py\n",
"```\n",
"
\n",
"A stubbed testing module for testing your pipeline as you develop:\n",
"```\n",
"|-- tests\n",
"| `-- test_pipelines.py\n",
"```\n",
"
\n",
"The `tox` testing framework configuration:\n",
"```\n",
"`-- tox.ini\n",
"```"
]
},
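{
"cell_type": "markdown",
"metadata": {},
"source": [
"The required `get_pipeline` entry point generally follows a signature like the one sketched below. This is a stub with placeholder defaults, not this template's actual implementation; the real module builds the parameters and the preprocessing, training, evaluation, condition, and registration steps before returning the pipeline:\n",
"\n",
"```python\n",
"# pipelines/<pipeline_name>/pipeline.py -- skeleton only\n",
"import sagemaker\n",
"from sagemaker.workflow.pipeline import Pipeline\n",
"from sagemaker.workflow.pipeline_context import PipelineSession\n",
"\n",
"\n",
"def get_pipeline(\n",
"    region,\n",
"    role=None,\n",
"    default_bucket=None,\n",
"    model_package_group_name=\"ExamplePackageGroup\",  # placeholder\n",
"    pipeline_name=\"ExamplePipeline\",                 # placeholder\n",
"    base_job_prefix=\"Example\",                       # placeholder\n",
"):\n",
"    \"\"\"Return a configured SageMaker Pipeline instance (stub).\"\"\"\n",
"    pipeline_session = PipelineSession(default_bucket=default_bucket)\n",
"    if role is None:\n",
"        role = sagemaker.get_execution_role()\n",
"\n",
"    # The real module defines pipeline parameters and job steps here, using\n",
"    # region/base_job_prefix to name resources and resolve images.\n",
"    parameters = []\n",
"    steps = []\n",
"\n",
"    return Pipeline(\n",
"        name=pipeline_name,\n",
"        parameters=parameters,\n",
"        steps=steps,\n",
"        sagemaker_session=pipeline_session,\n",
"    )\n",
"```"
]
},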
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### AutoReload"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The autoreload extension is already loaded. To reload it, use:\n",
" %reload_ext autoreload\n"
]
}
],
"source": [
"%load_ext autoreload\n",
"%autoreload 2"
]
},
{
"cell_type": "code",
"execution_count": 112,
"metadata": {
"collapsed": true,
"jupyter": {
"outputs_hidden": true
},
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Looking in indexes: https://pypi.org/simple, https://pip.repos.neuron.amazonaws.com\n",
"Requirement already satisfied: sagemaker in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (2.139.0)\n",
"Collecting sagemaker\n",
" Downloading sagemaker-2.140.1.tar.gz (684 kB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m684.5/684.5 kB\u001b[0m \u001b[31m36.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25h Preparing metadata (setup.py) ... \u001b[?25ldone\n",
"\u001b[?25hRequirement already satisfied: attrs<23,>=20.3.0 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (22.2.0)\n",
"Requirement already satisfied: boto3<2.0,>=1.26.28 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (1.26.71)\n",
"Requirement already satisfied: google-pasta in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (0.2.0)\n",
"Requirement already satisfied: numpy<2.0,>=1.9.0 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (1.23.5)\n",
"Requirement already satisfied: protobuf<4.0,>=3.1 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (3.20.2)\n",
"Requirement already satisfied: protobuf3-to-dict<1.0,>=0.1.5 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (0.1.5)\n",
"Requirement already satisfied: smdebug_rulesconfig==1.0.1 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (1.0.1)\n",
"Requirement already satisfied: importlib-metadata<5.0,>=1.4.0 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (4.13.0)\n",
"Requirement already satisfied: packaging>=20.0 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (21.3)\n",
"Requirement already satisfied: pandas in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (1.4.4)\n",
"Requirement already satisfied: pathos in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (0.3.0)\n",
"Requirement already satisfied: schema in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from sagemaker) (0.7.5)\n",
"Requirement already satisfied: s3transfer<0.7.0,>=0.6.0 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from boto3<2.0,>=1.26.28->sagemaker) (0.6.0)\n",
"Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from boto3<2.0,>=1.26.28->sagemaker) (1.0.1)\n",
"Requirement already satisfied: botocore<1.30.0,>=1.29.71 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from boto3<2.0,>=1.26.28->sagemaker) (1.29.71)\n",
"Requirement already satisfied: zipp>=0.5 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from importlib-metadata<5.0,>=1.4.0->sagemaker) (3.11.0)\n",
"Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from packaging>=20.0->sagemaker) (3.0.9)\n",
"Requirement already satisfied: six in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from protobuf3-to-dict<1.0,>=0.1.5->sagemaker) (1.16.0)\n",
"Requirement already satisfied: python-dateutil>=2.8.1 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pandas->sagemaker) (2.8.2)\n",
"Requirement already satisfied: pytz>=2020.1 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pandas->sagemaker) (2022.7)\n",
"Requirement already satisfied: dill>=0.3.6 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pathos->sagemaker) (0.3.6)\n",
"Requirement already satisfied: pox>=0.3.2 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pathos->sagemaker) (0.3.2)\n",
"Requirement already satisfied: multiprocess>=0.70.14 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pathos->sagemaker) (0.70.14)\n",
"Requirement already satisfied: ppft>=1.7.6.6 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from pathos->sagemaker) (1.7.6.6)\n",
"Requirement already satisfied: contextlib2>=0.5.5 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from schema->sagemaker) (21.6.0)\n",
"Requirement already satisfied: urllib3<1.27,>=1.25.4 in /home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages (from botocore<1.30.0,>=1.29.71->boto3<2.0,>=1.26.28->sagemaker) (1.26.15)\n",
"Building wheels for collected packages: sagemaker\n",
" Building wheel for sagemaker (setup.py) ... \u001b[?25ldone\n",
"\u001b[?25h Created wheel for sagemaker: filename=sagemaker-2.140.1-py2.py3-none-any.whl size=925381 sha256=002573160eaf373fe6ff12aea81a7c86bf95896f450128709288e22b1461d22c\n",
" Stored in directory: /home/ec2-user/.cache/pip/wheels/73/18/c5/3ad3801205b996b9ef1dafcdb5fd09d701fb9290c52a066b74\n",
"Successfully built sagemaker\n",
"Installing collected packages: sagemaker\n",
" Attempting uninstall: sagemaker\n",
" Found existing installation: sagemaker 2.139.0\n",
" Uninstalling sagemaker-2.139.0:\n",
" Successfully uninstalled sagemaker-2.139.0\n",
"Successfully installed sagemaker-2.140.1\n"
]
}
],
"source": [
"!pip install sagemaker --upgrade"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### A SageMaker Pipeline\n",
"\n",
"The pipeline that we create follows a typical Machine Learning Application pattern of pre-processing, training, evaluation, and conditional model registration and publication, if the quality of the model is sufficient.\n",
"\n",
"\n",
"\n",
"### Getting some constants\n",
"\n",
"We get some constants from the local execution environment."
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import boto3\n",
"import sagemaker\n",
"from utils.ssm import parameter_store\n",
"\n",
"region = boto3.Session().region_name\n",
"pm = parameter_store(region)\n",
"\n",
"# Change these to reflect your project/business name or if you want to separate ModelPackageGroup/Pipeline from the rest of your team\n",
"model_package_group_name = f\"NeMoASRModelPackageGroup-Example\"\n",
"#base_job_prefix = pm.get_params(key='PREFIX')\n",
"prefix = pm.get_params(key='PREFIX')\n",
"role = pm.get_params(key=\"-\".join([prefix, 'SAGEMAKER-ROLE-ARN'])) #sagemaker.get_execution_role()\n",
"default_bucket = pm.get_params(key=\"-\".join([prefix, 'BUCKET'])) # sagemaker.session.Session().default_bucket()\n"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"('nemo-asr',\n",
" 'sm-nemo-ramp',\n",
" 'arn:aws:iam::419974056037:role/service-role/AmazonSageMaker-ExecutionRole-20221206T163436')"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"prefix, default_bucket, role"
]
},
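{
"cell_type": "markdown",
"metadata": {},
"source": [
"`parameter_store` is a project-local helper imported from `utils/ssm.py`, not part of the SageMaker SDK. A minimal sketch of an SSM-backed helper exposing the `get_params` method used above is shown below; the implementation details (and the extra `put_params` method) are assumptions:\n",
"\n",
"```python\n",
"import boto3\n",
"\n",
"\n",
"class parameter_store:\n",
"    \"\"\"Thin wrapper around AWS Systems Manager Parameter Store (sketch only).\"\"\"\n",
"\n",
"    def __init__(self, region_name):\n",
"        self.ssm = boto3.client(\"ssm\", region_name=region_name)\n",
"\n",
"    def get_params(self, key, enc=False):\n",
"        # Retrieve a single parameter value by name; decrypt SecureStrings when asked.\n",
"        response = self.ssm.get_parameter(Name=key, WithDecryption=enc)\n",
"        return response[\"Parameter\"][\"Value\"]\n",
"\n",
"    def put_params(self, key, value, overwrite=True, enc=False):\n",
"        # Store a parameter as SecureString when enc=True, plain String otherwise.\n",
"        self.ssm.put_parameter(\n",
"            Name=key,\n",
"            Value=str(value),\n",
"            Type=\"SecureString\" if enc else \"String\",\n",
"            Overwrite=overwrite,\n",
"        )\n",
"```"
]
},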
{
"cell_type": "markdown",
"metadata": {
"tags": []
},
"source": [
"### Get the pipeline instance\n",
"\n",
"Here we get the pipeline instance from your pipeline module so that we can work with it."
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:botocore.credentials:Found credentials from IAM Role: BaseNotebookInstanceEc2InstanceRole\n",
"INFO:botocore.credentials:Found credentials from IAM Role: BaseNotebookInstanceEc2InstanceRole\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" \n",
"== Preprocessing Step ==\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Cloning into '/tmp/tmpe7t4ro8k'...\n",
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmpe7t4ro8k/./code/ to s3://sm-nemo-ramp/preprocessing/source/sourcedir.tar.gz\n",
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/preprocessing/source/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n",
" \n",
"Args: dict_items([('ProcessingResources', {'ClusterConfig': {'InstanceType': 'ml.m5.xlarge', 'InstanceCount': 1, 'VolumeSizeInGB': 30}}), ('AppSpecification', {'ImageUri': '419974056037.dkr.ecr.us-east-1.amazonaws.com/nemo-test-training', 'ContainerArguments': ['--proc_prefix', '/opt/ml/processing', '--train_mount_dir', '/opt/ml/input/data/training/', '--test_mount_dir', '/opt/ml/input/data/testing/'], 'ContainerEntrypoint': ['/bin/bash', '/opt/ml/processing/input/entrypoint/runproc.sh']}), ('RoleArn', 'arn:aws:iam::419974056037:role/service-role/AmazonSageMaker-ExecutionRole-20221206T163436'), ('ProcessingInputs', [{'InputName': 'input-data', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/nemo-asr/data', 'LocalPath': '/opt/ml/processing/input', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'code', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/preprocessing/source/sourcedir.tar.gz', 'LocalPath': '/opt/ml/processing/input/code/', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'entrypoint', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/preprocessing/source/runproc.sh', 'LocalPath': '/opt/ml/processing/input/entrypoint', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}]), ('ProcessingOutputConfig', {'Outputs': [{'OutputName': 'output-data', 'AppManaged': False, 'S3Output': {'S3Uri': Join(on='/', values=['s3://sm-nemo-ramp', 'NEMOASR-pipeline', 'preprocessing', 'output-data']), 'LocalPath': '/opt/ml/processing/output', 'S3UploadMode': 'EndOfJob'}}]})])\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Cloning into '/tmp/tmpqodvht2z'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" \n",
"== Preprocessing Step 2 ==\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:sagemaker.processing:Uploaded /tmp/tmpqodvht2z/./code/ to s3://sm-nemo-ramp/preprocessing-2/source/sourcedir.tar.gz\n",
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/preprocessing-2/source/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmpchz5oy95'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" \n",
"Args: dict_items([('ProcessingResources', {'ClusterConfig': {'InstanceType': 'ml.m5.xlarge', 'InstanceCount': 1, 'VolumeSizeInGB': 30}}), ('AppSpecification', {'ImageUri': '419974056037.dkr.ecr.us-east-1.amazonaws.com/nemo-test-training', 'ContainerArguments': ['--proc_prefix', '/opt/ml/processing', '--train_mount_dir', '/opt/ml/input/data/training/', '--test_mount_dir', '/opt/ml/input/data/testing/'], 'ContainerEntrypoint': ['/bin/bash', '/opt/ml/processing/input/entrypoint/runproc.sh']}), ('RoleArn', 'arn:aws:iam::419974056037:role/service-role/AmazonSageMaker-ExecutionRole-20221206T163436'), ('ProcessingInputs', [{'InputName': 'input-data', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/nemo-asr/data', 'LocalPath': '/opt/ml/processing/input', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'code', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/preprocessing-2/source/sourcedir.tar.gz', 'LocalPath': '/opt/ml/processing/input/code/', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'entrypoint', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/preprocessing-2/source/runproc.sh', 'LocalPath': '/opt/ml/processing/input/entrypoint', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}]), ('ProcessingOutputConfig', {'Outputs': [{'OutputName': 'output-data-2', 'AppManaged': False, 'S3Output': {'S3Uri': Join(on='/', values=['s3://sm-nemo-ramp', 'NEMOASR-pipeline', 'preprocessing', 'output-data-2']), 'LocalPath': '/opt/ml/processing/output', 'S3UploadMode': 'EndOfJob'}}]})])\n",
"here2\n",
"pretrain_s3_path s3://sm-nemo-ramp/nemo-asr/pretrained\n",
" \n",
"== Training Step ==\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmp1wp72ljo'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n",
" \n",
"Args: dict_items([('AlgorithmSpecification', {'TrainingInputMode': 'File', 'TrainingImage': '419974056037.dkr.ecr.us-east-1.amazonaws.com/nemo-test-training', 'MetricDefinitions': [{'Name': 'train_loss', 'Regex': 'loss=([0-9\\\\.]+)(e-?[[01][0-9])?'}, {'Name': 'wer', 'Regex': 'wer:([0-9\\\\.]+)(e-?[[01][0-9])?'}], 'EnableSageMakerMetricsTimeSeries': True}), ('OutputDataConfig', {'S3OutputPath': Join(on='/', values=['s3://sm-nemo-ramp', 'NEMOASR-pipeline', 'training', 'model-output'])}), ('StoppingCondition', {'MaxRuntimeInSeconds': 3600}), ('ResourceConfig', {'VolumeSizeInGB': 512, 'InstanceCount': 1, 'InstanceType': 'ml.g4dn.8xlarge'}), ('RoleArn', 'arn:aws:iam::419974056037:role/service-role/AmazonSageMaker-ExecutionRole-20221206T163436'), ('InputDataConfig', [{'DataSource': {'S3DataSource': {'S3DataType': 'S3Prefix', 'S3Uri': , 'S3DataDistributionType': 'FullyReplicated'}}, 'ChannelName': 'training'}, {'DataSource': {'S3DataSource': {'S3DataType': 'S3Prefix', 'S3Uri': , 'S3DataDistributionType': 'FullyReplicated'}}, 'ChannelName': 'testing'}, {'DataSource': {'S3DataSource': {'S3DataType': 'S3Prefix', 'S3Uri': 's3://sm-nemo-ramp/nemo-asr/pretrained', 'S3DataDistributionType': 'FullyReplicated'}}, 'ChannelName': 'pretrained'}]), ('HyperParameters', {'config-path': '\"conf\"', 'sagemaker_submit_directory': '\"s3://sm-nemo-ramp/NEMOASRtrain-exp/source/sourcedir.tar.gz\"', 'sagemaker_program': '\"speech_to_text_ctc.py\"', 'sagemaker_container_log_level': '20', 'sagemaker_region': '\"us-east-1\"'}), ('ExperimentConfig', {'TrialComponentDisplayName': 'NEMOASRtrain-exp'}), ('CheckpointConfig', {'S3Uri': Join(on='/', values=['s3://sm-nemo-ramp', 'NEMOASR-pipeline', 'training', 'ckpt'])}), ('ProfilerConfig', {'DisableProfiler': True})])\n",
" \n",
"== Evaluation Step ==\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmp1wp72ljo/./code/ to s3://sm-nemo-ramp/NEMOASReval-exp/source/sourcedir.tar.gz\n",
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASReval-exp/source/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n",
" \n",
"Args: dict_items([('ProcessingResources', {'ClusterConfig': {'InstanceType': 'ml.g4dn.8xlarge', 'InstanceCount': 1, 'VolumeSizeInGB': 30}}), ('AppSpecification', {'ImageUri': '419974056037.dkr.ecr.us-east-1.amazonaws.com/nemo-test-training', 'ContainerEntrypoint': ['/bin/bash', '/opt/ml/processing/input/entrypoint/runproc.sh']}), ('RoleArn', 'arn:aws:iam::419974056037:role/service-role/AmazonSageMaker-ExecutionRole-20221206T163436'), ('ProcessingInputs', [{'InputName': 'model_artifact', 'AppManaged': False, 'S3Input': {'S3Uri': , 'LocalPath': '/opt/ml/processing/model', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'test_manifest_file', 'AppManaged': False, 'S3Input': {'S3Uri': Join(on='/', values=[, 'an4', 'test_manifest.json']), 'LocalPath': '/opt/ml/processing/input/manifest', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'wav_dataset', 'AppManaged': False, 'S3Input': {'S3Uri': Join(on='/', values=[, 'an4', 'wav']), 'LocalPath': '/opt/ml/processing/input/wav', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'code', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/NEMOASReval-exp/source/sourcedir.tar.gz', 'LocalPath': '/opt/ml/processing/input/code/', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}, {'InputName': 'entrypoint', 'AppManaged': False, 'S3Input': {'S3Uri': 's3://sm-nemo-ramp/NEMOASReval-exp/source/runproc.sh', 'LocalPath': '/opt/ml/processing/input/entrypoint', 'S3DataType': 'S3Prefix', 'S3InputMode': 'File', 'S3DataDistributionType': 'FullyReplicated', 'S3CompressionType': 'None'}}]), ('ProcessingOutputConfig', {'Outputs': [{'OutputName': 'evaluation-metrics', 'AppManaged': False, 'S3Output': {'S3Uri': 's3://sm-nemo-ramp/NEMOASR-pipeline/evaluation/output/evaluation-metrics', 'LocalPath': '/opt/ml/processing/evaluation', 'S3UploadMode': 'EndOfJob'}}]}), ('Environment', {'MANIFEST_PATH': '/opt/ml/input/data/testing/an4/wav', 'WAV_PATH': '/opt/ml/processing/input/wav'}), ('ExperimentConfig', {'TrialComponentDisplayName': 'NEMOASReval-exp'})])\n",
" \n",
"== Registration Step ==\n"
]
}
],
"source": [
"from pipelines.nemo_asr.pipeline import get_pipeline\n",
"\n",
"pipeline = get_pipeline(\n",
" region=region,\n",
" role=role,\n",
" default_bucket=default_bucket,\n",
" model_package_group_name=model_package_group_name\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Submit the pipeline to SageMaker and start execution\n",
"\n",
"Let's submit our pipeline definition to the workflow service. The role passed in will be used by the workflow service to create all the jobs defined in the steps."
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Cloning into '/tmp/tmpidimjk9r'...\n",
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmpidimjk9r/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/1b17c9317e373407fb3440afde960082/sourcedir.tar.gz\n",
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/5b4247ccf769254fe9ac221423614408/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmppzkwxslu'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmppzkwxslu/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/1b17c9317e373407fb3440afde960082/sourcedir.tar.gz\n",
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/5b4247ccf769254fe9ac221423614408/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmpad_ymqr_'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"Cloning into '/tmp/tmp9gtw2nom'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmp9gtw2nom/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/96d962adedeb9da1b53022759a7b10f0/sourcedir.tar.gz\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/042c27441c10cac7d270da211efc863b/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"WARNING:sagemaker.workflow._utils:Popping out 'CertifyForMarketplace' from the pipeline definition since it will be overridden in pipeline execution time.\n",
"Cloning into '/tmp/tmpwo4p7e83'...\n",
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmpwo4p7e83/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/1b17c9317e373407fb3440afde960082/sourcedir.tar.gz\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/5b4247ccf769254fe9ac221423614408/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmpjgmmnagj'...\n",
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmpjgmmnagj/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/1b17c9317e373407fb3440afde960082/sourcedir.tar.gz\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/5b4247ccf769254fe9ac221423614408/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n",
"Cloning into '/tmp/tmp1np7y_t9'...\n",
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"Cloning into '/tmp/tmpa1pz_60p'...\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"remote: Counting objects: 20, done. \n",
"Already on 'main'\n",
"INFO:sagemaker.processing:Uploaded /tmp/tmpa1pz_60p/./code/ to s3://sm-nemo-ramp/NEMOASR-pipeline/code/96d962adedeb9da1b53022759a7b10f0/sourcedir.tar.gz\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Your branch is up to date with 'origin/main'.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:sagemaker.processing:runproc.sh uploaded to s3://sm-nemo-ramp/NEMOASR-pipeline/code/042c27441c10cac7d270da211efc863b/runproc.sh\n",
"/home/ec2-user/anaconda3/envs/pytorch_p39/lib/python3.9/site-packages/sagemaker/workflow/pipeline_context.py:258: UserWarning: Running within a PipelineSession, there will be No Wait, No Logs, and No Job being started.\n",
" warnings.warn(\n"
]
},
{
"data": {
"text/plain": [
"{'PipelineArn': 'arn:aws:sagemaker:us-east-1:419974056037:pipeline/nemoasr-pipeline',\n",
" 'ResponseMetadata': {'RequestId': '026c340b-e841-470c-85fc-bd5d8eb6890b',\n",
" 'HTTPStatusCode': 200,\n",
" 'HTTPHeaders': {'x-amzn-requestid': '026c340b-e841-470c-85fc-bd5d8eb6890b',\n",
" 'content-type': 'application/x-amz-json-1.1',\n",
" 'content-length': '84',\n",
" 'date': 'Wed, 22 Mar 2023 11:51:44 GMT'},\n",
" 'RetryAttempts': 0}}"
]
},
"execution_count": 34,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"pipeline.upsert(role_arn=role)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We'll start the pipeline, accepting all the default parameters.\n",
"\n",
"Values can also be passed into these pipeline parameters on starting of the pipeline, and will be covered later. "
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"execution = pipeline.start()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Pipeline Operations: examining and waiting for pipeline execution\n",
"\n",
"Now we describe execution instance and list the steps in the execution to find out more about the execution."
]
},
{
"cell_type": "code",
"execution_count": 50,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'PipelineArn': 'arn:aws:sagemaker:ap-northeast-2:419974056037:pipeline/nemoasrpipeline-example',\n",
" 'PipelineExecutionArn': 'arn:aws:sagemaker:ap-northeast-2:419974056037:pipeline/nemoasrpipeline-example/execution/mkx2o8vxos4u',\n",
" 'PipelineExecutionDisplayName': 'execution-1679288681411',\n",
" 'PipelineExecutionStatus': 'Executing',\n",
" 'PipelineExperimentConfig': {'ExperimentName': 'nemoasrpipeline-example',\n",
" 'TrialName': 'mkx2o8vxos4u'},\n",
" 'CreationTime': datetime.datetime(2023, 3, 20, 5, 4, 41, 350000, tzinfo=tzlocal()),\n",
" 'LastModifiedTime': datetime.datetime(2023, 3, 20, 5, 4, 41, 350000, tzinfo=tzlocal()),\n",
" 'CreatedBy': {},\n",
" 'LastModifiedBy': {},\n",
" 'ResponseMetadata': {'RequestId': '4a7dedca-7244-473e-9f25-8d7ef0757691',\n",
" 'HTTPStatusCode': 200,\n",
" 'HTTPHeaders': {'x-amzn-requestid': '4a7dedca-7244-473e-9f25-8d7ef0757691',\n",
" 'content-type': 'application/x-amz-json-1.1',\n",
" 'content-length': '518',\n",
" 'date': 'Mon, 20 Mar 2023 05:04:42 GMT'},\n",
" 'RetryAttempts': 0}}"
]
},
"execution_count": 50,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"execution.describe()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can wait for the execution by invoking `wait()` on the execution:"
]
},
{
"cell_type": "code",
"execution_count": 52,
"metadata": {
"tags": []
},
"outputs": [
{
"ename": "AttributeError",
"evalue": "'_PipelineExecution' object has no attribute 'log'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
"\u001b[0;32m/tmp/ipykernel_26785/915508597.py\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mexecution\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlog\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
"\u001b[0;31mAttributeError\u001b[0m: '_PipelineExecution' object has no attribute 'log'"
]
}
],
"source": [
"execution.wait()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can list the execution steps to check out the status and artifacts:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"execution.list_steps()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Parameterized Executions\n",
"\n",
"We can run additional executions of the pipeline specifying different pipeline parameters. The parameters argument is a dictionary whose names are the parameter names, and whose values are the primitive values to use as overrides of the defaults.\n",
"\n",
"Of particular note, based on the performance of the model, we may want to kick off another pipeline execution, but this time on a compute-optimized instance type and set the model approval status automatically be \"Approved\". This means that the model package version generated by the `RegisterModel` step will automatically be ready for deployment through CI/CD pipelines, such as with SageMaker Projects."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"execution = pipeline.start(\n",
" parameters=dict(\n",
" ProcessingInstanceType=\"ml.c5.xlarge\",\n",
" ModelApprovalStatus=\"Approved\",\n",
" )\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"execution.wait()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"execution.list_steps()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Config Writer"
]
},
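{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `config_handler` used in the following cells comes from the project's `pipelines/nemo_asr/config/config.py` and is not part of the SageMaker SDK. A minimal sketch of an INI-backed helper exposing the same methods (`set_value`, `get_all_info`, `write_value`) is shown below; the implementation is an assumption built around Python's `configparser`:\n",
"\n",
"```python\n",
"import os\n",
"from configparser import ConfigParser\n",
"\n",
"\n",
"class config_handler:\n",
"    \"\"\"INI-file config helper (sketch; method names mirror the usage below).\"\"\"\n",
"\n",
"    def __init__(self, strConfigPath=\"config-pipeline.ini\"):\n",
"        # Resolve and load the config file.\n",
"        self.config_path = os.path.abspath(strConfigPath)\n",
"        self.parser = ConfigParser()\n",
"        self.parser.read(self.config_path)\n",
"\n",
"    def get_value(self, section, key):\n",
"        return self.parser.get(section, key)\n",
"\n",
"    def set_value(self, section, key, value):\n",
"        # Create the section on first use, then set (or overwrite) the key.\n",
"        if not self.parser.has_section(section):\n",
"            self.parser.add_section(section)\n",
"        self.parser.set(section, key, str(value))\n",
"\n",
"    def get_all_info(self):\n",
"        # Print every section/key/value pair.\n",
"        print(\"====== config info. ======\")\n",
"        for section in self.parser.sections():\n",
"            for key, value in self.parser.items(section):\n",
"                print(f\"  {section}: {key}:{value}\")\n",
"        print(\"==========================\")\n",
"\n",
"    def write_value(self):\n",
"        # Persist any changes back to the INI file.\n",
"        with open(self.config_path, \"w\") as f:\n",
"            self.parser.write(f)\n",
"```"
]
},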
{
"cell_type": "code",
"execution_count": 97,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from pipelines.nemo_asr.config.config import config_handler"
]
},
{
"cell_type": "code",
"execution_count": 98,
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"strFilePath /home/ec2-user/SageMaker/nemo-on-sagemaker/pipelines/nemo_asr/config\n",
"====== config info. ======\n",
" COMMON: base_job_prefix:SM-NeMo-ASR-PIPELINE-\n",
" INPUT: input_data_s3_uri:s3://sm-nemo-bucket/data\n",
" PIPELINE: name:NeMoASRPipeline-Example\n",
" PIPELINE: enable_caching:False\n",
" PIPELINE: expire_after:T48H\n",
" PREPROCESSING: instance_type:ml.g4dn.xlarge\n",
" PREPROCESSING: instance_count:1\n",
" TRAINING: framework_version:None\n",
" TRAINING: py_version:None\n",
" TRAINING: instance_type:ml.p3.2xlarge\n",
" TRAINING: instance_count:1\n",
" TRAINING: experiment_name:train-exp\n",
" EVALUATION: instance_type:ml.g4dn.xlarge\n",
" EVALUATION: instance_count:1\n",
" EVALUATION: experiment_name:eval-exp\n",
" MODEL_REGISTER: model_package_group_name:NeMoASRModelPackageGroup-Example\n",
" MODEL_REGISTER: model_approval_status_default:PendingManualApproval\n",
" MODEL_REGISTER: inference_instances:[\"ml.p3.2xlarge\"]\n",
" MODEL_REGISTER: transform_instances:[\"ml.p3.2xlarge\"]\n",
" DEPLOY: processing_instance_type:ml.m5.xlarge\n",
" DEPLOY: processing_instance_count:1\n",
" DEPLOY: processing_framework_version:1.0-1\n",
" DEPLOY: instance_type:ml.g4dn.xlarge\n",
" DEPLOY: initial_instance_count:1\n",
" DEPLOY: model_server_workers:1\n",
" DEPLOY: framework_version:1.12.1\n",
" DEPLOY: py_version:py38\n",
"==========================\n"
]
}
],
"source": [
"pipeline_config = config_handler(strConfigPath=\"config-pipeline.ini\")"
]
},
{
"cell_type": "code",
"execution_count": 99,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"pipeline_config.set_value(\"CODE\", \"TEST\", \"AA\")"
]
},
{
"cell_type": "code",
"execution_count": 100,
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"====== config info. ======\n",
" COMMON: base_job_prefix:SM-NeMo-ASR-PIPELINE-\n",
" INPUT: input_data_s3_uri:s3://sm-nemo-bucket/data\n",
" PIPELINE: name:NeMoASRPipeline-Example\n",
" PIPELINE: enable_caching:False\n",
" PIPELINE: expire_after:T48H\n",
" PREPROCESSING: instance_type:ml.g4dn.xlarge\n",
" PREPROCESSING: instance_count:1\n",
" TRAINING: framework_version:None\n",
" TRAINING: py_version:None\n",
" TRAINING: instance_type:ml.p3.2xlarge\n",
" TRAINING: instance_count:1\n",
" TRAINING: experiment_name:train-exp\n",
" EVALUATION: instance_type:ml.g4dn.xlarge\n",
" EVALUATION: instance_count:1\n",
" EVALUATION: experiment_name:eval-exp\n",
" MODEL_REGISTER: model_package_group_name:NeMoASRModelPackageGroup-Example\n",
" MODEL_REGISTER: model_approval_status_default:PendingManualApproval\n",
" MODEL_REGISTER: inference_instances:[\"ml.p3.2xlarge\"]\n",
" MODEL_REGISTER: transform_instances:[\"ml.p3.2xlarge\"]\n",
" DEPLOY: processing_instance_type:ml.m5.xlarge\n",
" DEPLOY: processing_instance_count:1\n",
" DEPLOY: processing_framework_version:1.0-1\n",
" DEPLOY: instance_type:ml.g4dn.xlarge\n",
" DEPLOY: initial_instance_count:1\n",
" DEPLOY: model_server_workers:1\n",
" DEPLOY: framework_version:1.12.1\n",
" DEPLOY: py_version:py38\n",
" CODE: test:AA\n",
"==========================\n"
]
}
],
"source": [
"pipeline_config.get_all_info()"
]
},
{
"cell_type": "code",
"execution_count": 101,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"pipeline_config.write_value()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"instance_type": "ml.t3.medium",
"kernelspec": {
"display_name": "conda_pytorch_p39",
"language": "python",
"name": "conda_pytorch_p39"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.15"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
|