{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "kBxSda3I7lS6" }, "source": [ "# Deploy and perform inference on Model Package from AWS Marketplace \n", "\n", "This notebook provides you instructions on how to deploy and perform inference on model packages from AWS Marketplace Hugging Face Text Summarization model.\n", "\n", "This notebook is compatible only with those Text Summarization model packages which this notebook is linked to.\n", "\n", "#### Pre-requisites:\n", "1. **Note**: This notebook contains elements which render correctly in Jupyter interface. Open this notebook from an Amazon SageMaker Notebook Instance or Amazon SageMaker Studio.\n", "1. Ensure that IAM role used has **AmazonSageMakerFullAccess**\n", "1. To deploy this ML model successfully, ensure that:\n", " 1. Either your IAM role has these three permissions and you have authority to make AWS Marketplace subscriptions in the AWS account used: \n", " 1. **aws-marketplace:ViewSubscriptions**\n", " 1. **aws-marketplace:Unsubscribe**\n", " 1. **aws-marketplace:Subscribe** \n", " 2. or your AWS account has a subscription to this Text Summarization model. If so, skip step: [Subscribe to the model package](#1.-Subscribe-to-the-model-package)\n", "\n", "#### Contents:\n", "1. [Subscribe to the model package](#1.-Subscribe-to-the-model-package)\n", "2. [Create an endpoint and perform real-time inference](#2.-Create-an-endpoint-and-perform-real-time-inference)\n", " 1. [Create an endpoint](#A.-Create-an-endpoint)\n", " 2. [Create input payload](#B.-Create-input-payload)\n", " 3. [Perform real-time inference](#C.-Perform-real-time-inference)\n", " 4. [Delete the endpoint](#D.-Delete-the-endpoint)\n", "3. [Perform batch inference](#3.-Perform-batch-inference) \n", "4. [Clean-up](#4.-Clean-up)\n", " 1. [Delete the model](#A.-Delete-the-model)\n", " 2. [Unsubscribe to the listing (optional)](#B.-Unsubscribe-to-the-listing-(optional))\n", " \n", "\n", "#### Usage instructions\n", "You can run this notebook one cell at a time (By using Shift+Enter for running a cell).\n", "\n", "**Note** - This notebook requires you to follow instructions and specify values for parameters, as instructed." ] }, { "cell_type": "markdown", "metadata": { "id": "RxA4196d7lS-" }, "source": [ "### 1. Subscribe to the model package" ] }, { "cell_type": "markdown", "metadata": { "id": "SvaUgXDm7lS_" }, "source": [ "To subscribe to the model package:\n", "1. Open the model package listing page you opened this notebook for.\n", "1. On the AWS Marketplace listing, click on the **Continue to subscribe** button.\n", "1. On the **Subscribe to this software** page, review and click on **\"Accept Offer\"** if you and your organization agrees with EULA, pricing, and support terms. \n", "1. Once you click on **Continue to configuration button** and then choose a **region**, you will see a **Product Arn** displayed. This is the model package ARN that you need to specify while creating a deployable model using Boto3. Copy the ARN corresponding to your region and specify the same in the following cell." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "MhUrChdX7lTA" }, "outputs": [], "source": [ "model_package_arn='' " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "r7X-G0Pg7lTB" }, "outputs": [], "source": [ "import json \n", "from sagemaker import ModelPackage\n", "import sagemaker as sage\n", "from sagemaker import get_execution_role" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "uQNH6jQo7lTB" }, "outputs": [], "source": [ "role = get_execution_role()\n", "sagemaker_session = sage.Session()\n", "boto3 = sagemaker_session.boto_session\n", "bucket = sagemaker_session.default_bucket()\n", "region = sagemaker_session.boto_region_name\n", "\n", "s3 = boto3.client(\"s3\")\n", "runtime= boto3.client('runtime.sagemaker')" ] }, { "cell_type": "markdown", "metadata": { "id": "TK-hhnm37lTC" }, "source": [ "In next step, you would be deploying the model for real-time inference. For information on how real-time inference with Amazon SageMaker works, see [Documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-hosting.html)." ] }, { "cell_type": "markdown", "metadata": { "id": "08QHTP-97lTD" }, "source": [ "### 2. Create an endpoint and perform real-time inference" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "IvmmUZJR7lTD" }, "outputs": [], "source": [ "model_name='text-summarization-model'" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "I7ia4Yls7lTE" }, "outputs": [], "source": [ "#The Text Summarization model packages this notebook notebook is compatible with, support application/x-text as the \n", "#content-type.\n", "content_type='application/x-text'" ] }, { "cell_type": "markdown", "metadata": { "id": "QrSsZBuq7lTE" }, "source": [ "Review and update the compatible instance type for the model package in the following cell." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "ozqu1PmN7lTF" }, "outputs": [], "source": [ "real_time_inference_instance_type='ml.g4dn.xlarge'\n", "batch_transform_inference_instance_type='ml.p2.xlarge'" ] }, { "cell_type": "markdown", "metadata": { "id": "JD7OHrOv7lTF" }, "source": [ "#### A. Create an endpoint" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "rxUVGaU27lTF" }, "outputs": [], "source": [ "#create a deployable model from the model package.\n", "model = ModelPackage(role=role,\n", " model_package_arn=model_package_arn,\n", " sagemaker_session=sagemaker_session)\n", "\n", "#Deploy the model\n", "predictor = model.deploy(1, real_time_inference_instance_type, endpoint_name=model_name)" ] }, { "cell_type": "markdown", "metadata": { "id": "EsucIa3t7lTG" }, "source": [ "Once endpoint has been created, you would be able to perform real-time inference." ] }, { "cell_type": "markdown", "metadata": { "id": "RWD_x2dD7lTG" }, "source": [ "#### B. Prepare input file for performing real-time inference\n", "#### Let's put in some example input text. You can put in any English text, the model returns a summary of the input text." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Pyo11WdU7lTG" }, "outputs": [], "source": [ "input_text = \"The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey building, and the tallest structure in Paris. Its base is square, measuring 125 metres (410 ft) on each side. During its construction, the Eiffel Tower surpassed the Washington Monument to become the tallest man-made structure in the world, a title it held for 41 years until the Chrysler Building in New York City was finished in 1930. It was the first structure to reach a height of 300 metres. Due to the addition of a broadcasting aerial at the top of the tower in 1957, it is now taller than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters, the Eiffel Tower is the second tallest free-standing structure in France after the Millau Viaduct.\"" ] }, { "cell_type": "markdown", "metadata": { "id": "PqRpsWhr7lTH" }, "source": [ "#### C. Query endpoint that you have created" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "c8fb5hwo7lTH" }, "outputs": [], "source": [ "#perform_inference method performs inference on the endpoint and prints predictions.\n", "newline, bold, unbold = '\\n', '\\033[1m', '\\033[0m'\n", "def query_endpoint(encoded_text):\n", " endpoint_name = model_name\n", " response = runtime.invoke_endpoint(EndpointName=endpoint_name, ContentType=content_type, Body=encoded_text)\n", " model_predictions = json.loads(response['Body'].read())[0]['summary_text']\n", " return model_predictions\n", "\n", "model_predictions = query_endpoint(json.dumps(input_text).encode('utf-8'))\n", "print (f\"Inference:{newline}\"\n", " f\"input text: {input_text}{newline}\"\n", " f\"model prediction: {bold}{model_predictions}{unbold}{newline}\")" ] }, { "cell_type": "markdown", "metadata": { "id": "0TGv-iTc7lTH" }, "source": [ "#### D. Delete the endpoint" ] }, { "cell_type": "markdown", "metadata": { "id": "iQUDaYeb7lTI" }, "source": [ "Now that you have successfully performed a real-time inference, you do not need the endpoint any more. You can terminate the endpoint to avoid being charged." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "sSSBNSHZ7lTI" }, "outputs": [], "source": [ "model.sagemaker_session.delete_endpoint(model_name)\n", "model.sagemaker_session.delete_endpoint_config(model_name)" ] }, { "cell_type": "markdown", "metadata": { "id": "zY2dMChE7lTI" }, "source": [ "### 3. Perform batch inference" ] }, { "cell_type": "markdown", "metadata": { "id": "AoLkOST97lTI" }, "source": [ "In this section, you will perform batch inference using multiple input payloads together. If you are not familiar with batch transform, and want to learn more, see [How to run a batch transform job](https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-batch.html)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Q6bb2pQn7lTI" }, "outputs": [], "source": [ "#upload the batch-transform job input files to S3\n", "transform_input_key_prefix = 'text-summarization-model-transform-input'\n", "f = open(\"transform-input-data.txt\", \"w\")\n", "f.write(\"The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey building, and the tallest structure in Paris. Its base is square, measuring 125 metres (410 ft) on each side. During its construction, the Eiffel Tower surpassed the Washington Monument to become the tallest man-made structure in the world, a title it held for 41 years until the Chrysler Building in New York City was finished in 1930. It was the first structure to reach a height of 300 metres. Due to the addition of a broadcasting aerial at the top of the tower in 1957, it is now taller than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters, the Eiffel Tower is the second tallest free-standing structure in France after the Millau Viaduct.\")\n", "f.close()\n", "transform_input = sagemaker_session.upload_data(\"transform-input-data.txt\", key_prefix=transform_input_key_prefix) \n", "print(\"Transform input uploaded to \" + transform_input)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "qn14tG6j7lTJ" }, "outputs": [], "source": [ "#Run the batch-transform job\n", "transformer = model.transformer(1, batch_transform_inference_instance_type)\n", "transformer.transform(transform_input, content_type=content_type)\n", "transformer.wait()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "FxKppJgU7lTJ" }, "outputs": [], "source": [ "# output is available on following path\n", "transformer.output_path" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "A9mIvJ0P7lTJ" }, "outputs": [], "source": [ "output_bucket_name, output_path = transformer.output_path.replace(\"s3://\", \"\").split(\"/\", 1)\n", "obj = s3.get_object(Bucket=output_bucket_name, Key=output_path + '/transform-input-data.txt.out')\n", "batch_prediction = obj['Body'].read().decode('utf-8')\n", "\n", "# print out batch-transform job output\n", "print(batch_prediction)" ] }, { "cell_type": "markdown", "metadata": { "id": "4CfNfFAE7lTJ" }, "source": [ "### 4. Clean-up" ] }, { "cell_type": "markdown", "metadata": { "id": "iwOWVFvr7lTK" }, "source": [ "#### A. Delete the model" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "aTAvusox7lTK" }, "outputs": [], "source": [ "model.delete_model()" ] }, { "cell_type": "markdown", "metadata": { "id": "xm0fxGt47lTK" }, "source": [ "#### B. Unsubscribe to the listing (optional)" ] }, { "cell_type": "markdown", "metadata": { "id": "uYvD5OpZ7lTK" }, "source": [ "If you would like to unsubscribe to the model package, follow these steps. Before you cancel the subscription, ensure that you do not have any [deployable model](https://console.aws.amazon.com/sagemaker/home#/models) created from the model package or using the algorithm. Note - You can find this information by looking at the container name associated with the model. \n", "\n", "**Steps to unsubscribe to product from AWS Marketplace**:\n", "1. Navigate to __Machine Learning__ tab on [__Your Software subscriptions page__](https://aws.amazon.com/marketplace/ai/library?productType=ml&ref_=mlmp_gitdemo_indust)\n", "2. Locate the listing that you want to cancel the subscription for, and then choose __Cancel Subscription__ to cancel the subscription.\n", "\n" ] } ], "metadata": { "instance_type": "ml.t3.medium", "kernelspec": { "display_name": "conda_pytorch_latest_p36", "language": "python", "name": "conda_pytorch_latest_p36" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.13" }, "colab": { "name": "notebook.ipynb", "provenance": [], "collapsed_sections": [] } }, "nbformat": 4, "nbformat_minor": 0 }