{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Personalize Workshop Cleanup\n", "\n", "This notebook will walk through deleting all of the resources created. You should only need to perform these steps if you have deployed in your own AWS account and want to deprovision the resources. If you are participating in an AWS-led workshop, this process is likely not necessary.\n", "\n", "Resources have to deleted in a specific sequence to avoid dependency errors. In order, we will delete campaigns, solutions, event trackers, datasets, and the dataset group. In addition, we need to make sure that each resource type is fully deleted before moving on to the next resource type. We'll also delete the schemas for our datasets." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Import Dependencies\n", "\n", "To get started, let's import the dependencies we'll need for this notebook. We also have to retrieve Uid from a SageMaker notebook instance tag." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Import Dependencies\n", "import botocore\n", "import boto3\n", "import time\n", "import sys\n", "import json\n", "\n", "from packaging import version\n", "from botocore.exceptions import ClientError\n", "\n", "# Setup Clients\n", "personalize = boto3.client('personalize')\n", "ssm = boto3.client('ssm')\n", "iam = boto3.client(\"iam\")\n", "\n", "with open('/opt/ml/metadata/resource-metadata.json') as f:\n", " data = json.load(f)\n", "sagemaker = boto3.client('sagemaker')\n", "sagemakerResponce = sagemaker.list_tags(ResourceArn=data[\"ResourceArn\"])\n", "for tag in sagemakerResponce[\"Tags\"]:\n", " if tag['Key'] == 'Uid':\n", " Uid = tag['Value']\n", " break" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Since we require a newer version of the botocore library, upgrade the local version if necessary. Note that if the botocore is upgraded, you will need to restart the Jupyter notebook kernel and re-execute the cells from the top to resume. An assertion error is thrown as a reminder if a kernel restart is required." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Minimum version of botocore we need for this workshop.\n", "min_botocore_version = '1.16.24'\n", "\n", "if version.parse(botocore.__version__) < version.parse(min_botocore_version):\n", " print('Current version of botocore ({}) does not meet the minimum required version ({})'.format(botocore.__version__, min_botocore_version))\n", " print('Upgrading to latest pip and botocore...')\n", "\n", " !{sys.executable} -m pip install --upgrade pip\n", " !{sys.executable} -m pip install --upgrade --no-deps --force-reinstall botocore\n", " \n", " assert False, 'Restart the notebook kernel to pick up the latest botocore and begin at the top of this notebook'\n", "else:\n", " print('Version of botocore ({}) meets minimum requirement ({}) for this notebook'.format(botocore.__version__, min_botocore_version))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Load resource names saved in: 02_Training_Layer.ipynb" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%store -r" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Determine Solutions\n", "\n", "Next let's gather the solution ARNs for all solutions in our dataset group." 
{ "cell_type": "markdown", "metadata": {}, "source": [
 "## Determine Solutions\n",
 "\n",
 "Next, let's gather the solution ARNs for all solutions in our dataset group." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "solution_arns = []\n",
 "\n",
 "# A single page of 100 results is plenty for the handful of workshop solutions.\n",
 "solutions_response = personalize.list_solutions(datasetGroupArn = dataset_group_arn, maxResults = 100)\n",
 "if 'solutions' in solutions_response:\n",
 "    for solution in solutions_response['solutions']:\n",
 "        solution_arns.append(solution['solutionArn'])\n",
 "\n",
 "print('Solutions found: ' + str(solution_arns))" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Campaigns for Solutions\n",
 "\n",
 "For each solution, let's delete all of its associated campaigns." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "campaign_arns_deleted = []\n",
 "\n",
 "for solution_arn in solution_arns:\n",
 "    campaigns_response = personalize.list_campaigns(solutionArn = solution_arn, maxResults = 100)\n",
 "\n",
 "    if 'campaigns' in campaigns_response:\n",
 "        for campaign in campaigns_response['campaigns']:\n",
 "            print('Deleting campaign: ' + campaign['campaignArn'])\n",
 "\n",
 "            personalize.delete_campaign(campaignArn = campaign['campaignArn'])\n",
 "            campaign_arns_deleted.append(campaign['campaignArn'])\n",
 "\n",
 "if len(campaign_arns_deleted) > 0:\n",
 "    print('Done initiating deletes for {} campaign(s)'.format(len(campaign_arns_deleted)))\n",
 "else:\n",
 "    print('No campaigns to delete')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "### Wait for Campaigns to be Deleted" ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Waiting for {} campaigns to be fully deleted'.format(len(campaign_arns_deleted)))\n",
 "while len(campaign_arns_deleted) > 0:\n",
 "    try:\n",
 "        status = None\n",
 "        max_time = time.time() + 3*60*60 # 3 hours\n",
 "        while time.time() < max_time:\n",
 "            # Poll until the campaign disappears; a ResourceNotFoundException\n",
 "            # from describe_campaign signals that the delete has completed.\n",
 "            describe_campaign_response = personalize.describe_campaign(\n",
 "                campaignArn = campaign_arns_deleted[0]\n",
 "            )\n",
 "            status = describe_campaign_response[\"campaign\"][\"status\"]\n",
 "            print('Campaign {}: {}'.format(campaign_arns_deleted[0], status))\n",
 "            time.sleep(10)\n",
 "\n",
 "        print('Exceeded wait time for campaign {} to delete; skipping'.format(campaign_arns_deleted[0]))\n",
 "        campaign_arns_deleted.pop(0)\n",
 "\n",
 "    except ClientError as e:\n",
 "        error_code = e.response['Error']['Code']\n",
 "        if error_code == 'ResourceNotFoundException':\n",
 "            print('Campaign {} no longer exists'.format(campaign_arns_deleted[0]))\n",
 "            campaign_arns_deleted.pop(0)\n",
 "        else:\n",
 "            # Surface unexpected errors instead of silently retrying forever.\n",
 "            raise\n",
 "\n",
 "print('Done deleting campaigns')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Solutions\n",
 "\n",
 "With the campaigns fully deleted, we can now delete the solutions for our dataset group." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "solution_arns_deleted = []\n",
 "\n",
 "for solution_arn in solution_arns:\n",
 "    print('Deleting solution: ' + solution_arn)\n",
 "\n",
 "    personalize.delete_solution(solutionArn = solution_arn)\n",
 "    solution_arns_deleted.append(solution_arn)\n",
 "\n",
 "if len(solution_arns_deleted) > 0:\n",
 "    print('Done initiating deletes for {} solution(s)'.format(len(solution_arns_deleted)))\n",
 "else:\n",
 "    print('No solutions to delete')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "### Wait for Solutions to be Deleted\n",
 "\n",
 "Before we can move on, ensure the solutions have been fully deleted." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Waiting for {} solutions to be fully deleted'.format(len(solution_arns_deleted)))\n",
 "while len(solution_arns_deleted) > 0:\n",
 "    try:\n",
 "        status = None\n",
 "        max_time = time.time() + 3*60*60 # 3 hours\n",
 "        while time.time() < max_time:\n",
 "            # As with campaigns, a ResourceNotFoundException signals a completed delete.\n",
 "            describe_solution_response = personalize.describe_solution(\n",
 "                solutionArn = solution_arns_deleted[0]\n",
 "            )\n",
 "            status = describe_solution_response[\"solution\"][\"status\"]\n",
 "            print('Solution {}: {}'.format(solution_arns_deleted[0], status))\n",
 "            time.sleep(10)\n",
 "\n",
 "        print('Exceeded wait time for solution {} to delete; skipping'.format(solution_arns_deleted[0]))\n",
 "        solution_arns_deleted.pop(0)\n",
 "\n",
 "    except ClientError as e:\n",
 "        error_code = e.response['Error']['Code']\n",
 "        if error_code == 'ResourceNotFoundException':\n",
 "            print('Solution {} no longer exists'.format(solution_arns_deleted[0]))\n",
 "            solution_arns_deleted.pop(0)\n",
 "        else:\n",
 "            # Surface unexpected errors instead of silently retrying forever.\n",
 "            raise\n",
 "\n",
 "print('Done deleting solutions')" ] },
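{ "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Event Trackers\n",
 "\n",
 "The deletion order at the top of this notebook includes event trackers, which belong to the dataset group and should be removed before it. Whether any exist depends on which parts of the workshop you completed, so the cell below is a best-effort sweep: it lists any event trackers attached to the dataset group and deletes them, and simply reports nothing to do if none were created." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "# Best-effort sweep: remove any event trackers attached to the dataset group.\n",
 "event_tracker_arns_deleted = []\n",
 "\n",
 "event_trackers_response = personalize.list_event_trackers(datasetGroupArn = dataset_group_arn, maxResults = 100)\n",
 "if 'eventTrackers' in event_trackers_response:\n",
 "    for event_tracker in event_trackers_response['eventTrackers']:\n",
 "        print('Deleting event tracker: ' + event_tracker['eventTrackerArn'])\n",
 "\n",
 "        personalize.delete_event_tracker(eventTrackerArn = event_tracker['eventTrackerArn'])\n",
 "        event_tracker_arns_deleted.append(event_tracker['eventTrackerArn'])\n",
 "\n",
 "if len(event_tracker_arns_deleted) > 0:\n",
 "    print('Done initiating deletes for {} event tracker(s)'.format(len(event_tracker_arns_deleted)))\n",
 "else:\n",
 "    print('No event trackers to delete')" ] },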
{ "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Datasets\n",
 "\n",
 "Next, we can delete the datasets for our dataset group." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Deleting datasets for dataset group')\n",
 "dataset_arns_deleted = []\n",
 "dataset_paginator = personalize.get_paginator('list_datasets')\n",
 "\n",
 "for dataset_page in dataset_paginator.paginate(datasetGroupArn = dataset_group_arn):\n",
 "    for dataset in dataset_page['datasets']:\n",
 "        print('Deleting dataset {}'.format(dataset['datasetArn']))\n",
 "        personalize.delete_dataset(datasetArn = dataset['datasetArn'])\n",
 "        dataset_arns_deleted.append(dataset['datasetArn'])\n",
 "\n",
 "if len(dataset_arns_deleted) > 0:\n",
 "    print('Done initiating deletes for {} dataset(s)'.format(len(dataset_arns_deleted)))\n",
 "else:\n",
 "    print('No datasets to delete')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "### Wait for Datasets to be Deleted" ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Waiting for {} datasets to be fully deleted'.format(len(dataset_arns_deleted)))\n",
 "while len(dataset_arns_deleted) > 0:\n",
 "    try:\n",
 "        status = None\n",
 "        max_time = time.time() + 3*60*60 # 3 hours\n",
 "        while time.time() < max_time:\n",
 "            # As before, a ResourceNotFoundException signals a completed delete.\n",
 "            describe_dataset_response = personalize.describe_dataset(\n",
 "                datasetArn = dataset_arns_deleted[0]\n",
 "            )\n",
 "            status = describe_dataset_response[\"dataset\"][\"status\"]\n",
 "            print('Dataset {}: {}'.format(dataset_arns_deleted[0], status))\n",
 "            time.sleep(10)\n",
 "\n",
 "        print('Exceeded wait time for dataset {} to delete; skipping'.format(dataset_arns_deleted[0]))\n",
 "        dataset_arns_deleted.pop(0)\n",
 "\n",
 "    except ClientError as e:\n",
 "        error_code = e.response['Error']['Code']\n",
 "        if error_code == 'ResourceNotFoundException':\n",
 "            print('Dataset {} no longer exists'.format(dataset_arns_deleted[0]))\n",
 "            dataset_arns_deleted.pop(0)\n",
 "        else:\n",
 "            # Surface unexpected errors instead of silently retrying forever.\n",
 "            raise\n",
 "\n",
 "print('Done deleting datasets')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Dataset Group\n",
 "\n",
 "Finally, we can delete our dataset group." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Deleting dataset group')\n",
 "personalize.delete_dataset_group(datasetGroupArn = dataset_group_arn)" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "### Wait for Dataset Group to be Deleted" ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "print('Waiting for dataset group to be fully deleted')\n",
 "try:\n",
 "    status = None\n",
 "    max_time = time.time() + 3*60*60 # 3 hours\n",
 "    while time.time() < max_time:\n",
 "        describe_dataset_group_response = personalize.describe_dataset_group(\n",
 "            datasetGroupArn = dataset_group_arn\n",
 "        )\n",
 "        status = describe_dataset_group_response[\"datasetGroup\"][\"status\"]\n",
 "        print(\"Dataset group: {}\".format(status))\n",
 "        time.sleep(10)\n",
 "\n",
 "    print('Exceeded wait time for dataset group to delete')\n",
 "except ClientError as e:\n",
 "    error_code = e.response['Error']['Code']\n",
 "    if error_code == 'ResourceNotFoundException':\n",
 "        print(\"Dataset group fully deleted\")\n",
 "    else:\n",
 "        # Surface unexpected errors instead of masking them.\n",
 "        raise" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "## Delete Schemas\n",
 "\n",
 "We're almost done. The last step in cleaning up Personalize resources is to delete the schemas for our datasets. As a quick safety check, the next cell previews which schemas match the cleanup list before anything is deleted." ] },
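{ "cell_type": "markdown", "metadata": {}, "source": [
 "The schema names below are the ones this workshop creates; if you renamed your datasets, adjust the list to match. This optional preview uses the same list_schemas paginator as the delete cell that follows and only prints what would happen; nothing is deleted here." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "# Optional dry run: show which schemas in the account match the cleanup list.\n",
 "schemas_to_delete = [ 'adx-weather-schema-items', 'adx-weather-users', 'adx-weather-interactions' ]\n",
 "\n",
 "schema_paginator = personalize.get_paginator('list_schemas')\n",
 "for schema_page in schema_paginator.paginate():\n",
 "    for schema in schema_page['schemas']:\n",
 "        action = 'will delete' if schema['name'] in schemas_to_delete else 'will keep'\n",
 "        print('{}: {} ({})'.format(action, schema['name'], schema['schemaArn']))" ] },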
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
 "schemas_to_delete = [ 'adx-weather-schema-items', 'adx-weather-users', 'adx-weather-interactions' ]\n",
 "schema_paginator = personalize.get_paginator('list_schemas')\n",
 "for schema_page in schema_paginator.paginate():\n",
 "    for schema in schema_page['schemas']:\n",
 "        if schema['name'] in schemas_to_delete:\n",
 "            print('Deleting schema {}'.format(schema['schemaArn']))\n",
 "            personalize.delete_schema(schemaArn = schema['schemaArn'])\n",
 "\n",
 "print('Done deleting schemas')" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [
 "## Cleanup Complete\n",
 "\n",
 "All resources created by the Personalize workshop have been deleted." ] } ],
 "metadata": { "kernelspec": { "display_name": "conda_python3", "language": "python", "name": "conda_python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.13" } }, "nbformat": 4, "nbformat_minor": 4 }