{ "cells": [ { "cell_type": "markdown", "id": "fba4c02d", "metadata": {}, "source": [ "# Demo Notebook for MLCommons Integration\n", "\n", "#### [Download notebook](https://github.com/opensearch-project/opensearch-py-ml/blob/main/docs/source/examples/demo_ml_commons_integration.ipynb)\n", "\n", "\n", "This notebook provides a walkthrough guidance for users to invoke MLCommons apis to upload ml models to opensearch cluster\n", "\n", "Step 0: Import packages and set up client\n", "\n", "Step 1: Upload NLP model from local file to Opensearch cluster\n", "\n", "Step 2: Load Model\n", "\n", "Step 3: Get Task\n", "\n", "Step 4: Get Model\n", "\n", "Step 5: Generate Sentence Embedding\n", "\n", "Step 6: Unload Model\n", "\n", "Step 7: Delete Model\n", "\n" ] }, { "cell_type": "markdown", "source": [], "metadata": { "collapsed": false } }, { "cell_type": "markdown", "id": "7011727e", "metadata": {}, "source": [ "## Step 0: Import packages and set up client\n", "Install required packages for opensearch_py_ml.sentence_transformer_model\n", "Install `opensearchpy` and `opensearch-py-ml` through pypi\n" ] }, { "cell_type": "code", "execution_count": null, "id": "17a3e085", "metadata": { "pycharm": { "is_executing": true }, "scrolled": true }, "outputs": [], "source": [ "!pip install opensearch-py opensearch-py-ml" ] }, { "cell_type": "code", "execution_count": 7, "id": "f12096cd", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001B[33mDEPRECATION: Configuring installation scheme with distutils config files is deprecated and will no longer work in the near future. If you are using a Homebrew or Linuxbrew Python, please see discussion at https://github.com/Homebrew/homebrew-core/issues/76621\u001B[0m\u001B[33m\n", "\u001B[0mCollecting deprecated\n", " Downloading Deprecated-1.2.14-py2.py3-none-any.whl (9.6 kB)\n", "Requirement already satisfied: wrapt<2,>=1.10 in /usr/local/lib/python3.9/site-packages (from deprecated) (1.13.3)\n", "Installing collected packages: deprecated\n", "\u001B[33m DEPRECATION: Configuring installation scheme with distutils config files is deprecated and will no longer work in the near future. If you are using a Homebrew or Linuxbrew Python, please see discussion at https://github.com/Homebrew/homebrew-core/issues/76621\u001B[0m\u001B[33m\n", "\u001B[0m\u001B[33mDEPRECATION: Configuring installation scheme with distutils config files is deprecated and will no longer work in the near future. 
If you are using a Homebrew or Linuxbrew Python, please see discussion at https://github.com/Homebrew/homebrew-core/issues/76621\u001B[0m\u001B[33m\n", "\u001B[0mSuccessfully installed deprecated-1.2.14\n", "\n", "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m A new release of pip is available: \u001B[0m\u001B[31;49m23.1\u001B[0m\u001B[39;49m -> \u001B[0m\u001B[32;49m23.1.2\u001B[0m\n", "\u001B[1m[\u001B[0m\u001B[34;49mnotice\u001B[0m\u001B[1;39;49m]\u001B[0m\u001B[39;49m To update, run: \u001B[0m\u001B[32;49mpython3.9 -m pip install --upgrade pip\u001B[0m\n" ] } ], "source": [ "!pip install deprecated" ] }, { "cell_type": "code", "execution_count": 1, "id": "39146873", "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "import warnings\n", "warnings.filterwarnings('ignore', category=DeprecationWarning)\n", "warnings.filterwarnings(\"ignore\", message=\"Unverified HTTPS request\")\n", "from opensearchpy import OpenSearch" ] }, { "cell_type": "code", "execution_count": 2, "id": "5c85ae17", "metadata": {}, "outputs": [], "source": [ "CLUSTER_URL = 'https://localhost:9200'" ] }, { "cell_type": "code", "execution_count": 3, "id": "77442abf", "metadata": {}, "outputs": [], "source": [ "def get_os_client(cluster_url = CLUSTER_URL,\n", " username='admin',\n", " password='admin'):\n", " '''\n", " Get OpenSearch client\n", " :param cluster_url: cluster URL like https://ml-te-netwo-1s12ba42br23v-ff1736fa7db98ff2.elb.us-west-2.amazonaws.com:443\n", " :param username: username for HTTP basic authentication\n", " :param password: password for HTTP basic authentication\n", " :return: OpenSearch client\n", " '''\n", " client = OpenSearch(\n", " hosts=[cluster_url],\n", " http_auth=(username, password),\n", " verify_certs=False\n", " )\n", " return client " ] }, { "cell_type": "code", "execution_count": 4, "id": "89e1cb2a", "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.9/site-packages/opensearchpy/connection/http_urllib3.py:199: UserWarning: Connecting to https://localhost:9200 using SSL with verify_certs=False is insecure.\n", " warnings.warn(\n" ] } ], "source": [ "client = get_os_client()" ] }, { "cell_type": "markdown", "id": "4da9e0de", "metadata": {}, "source": [ "## Step 1: Upload NLP model from local file to OpenSearch cluster\n", "\n", "We can upload machine learning models to an OpenSearch cluster using the ML Commons register_model API. In this demo we will show how to upload a model.\n", "\n", "\n", "###### From OpenSearch 2.8, to register a model we need to have a model group. First we need to register a model group and then use the model group id to register a model. For registering a model group we can look at this doc: \n", "\n", "https://github.com/opensearch-project/ml-commons/blob/2.x/docs/model_access_control.md#registering-a-model-group\n", "\n", "In the following example, we have already created a model group and we use its group id to register a model; a minimal sketch of registering a model group is shown below.\n", "\n", "\n", "###### From OpenSearch 2.6, we introduced pre-trained models: https://opensearch.org/docs/latest/ml-commons-plugin/pretrained-models/\n", "\n", "\n", "* One thing to remember: if we don't have any ML node, then registering a model might throw an exception. In that case we need to update this setting: https://github.com/opensearch-project/ml-commons/blob/main/build.gradle#L46\n" ] },
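{ "cell_type": "markdown", "metadata": {}, "source": [ "The next cell is a minimal, optional sketch of how a model group could be registered from this client using the ML Commons model group API (`POST /_plugins/_ml/model_groups/_register`). The group name and description are illustrative placeholders; the `model_group_id` returned in the response is what we pass to the register calls below.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Minimal sketch: register a model group and capture its id.\n", "# The group name and description below are illustrative placeholders.\n", "model_group_payload = {\n", "    \"name\": \"demo_model_group\",\n", "    \"description\": \"Model group for the MLCommons integration demo\"\n", "}\n", "model_group_response = client.transport.perform_request(\n", "    \"POST\", \"/_plugins/_ml/model_groups/_register\", body=model_group_payload\n", ")\n", "print(model_group_response)  # the response is expected to contain a 'model_group_id'" ] },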
{ "cell_type": "code", "execution_count": 5, "id": "168a969e", "metadata": {}, "outputs": [], "source": [ "from opensearch_py_ml.ml_commons import MLCommonClient\n", "ml_client = MLCommonClient(client)" ] }, { "cell_type": "code", "execution_count": 30, "id": "3bfe1532", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model was registered successfully. Model Id: 7aF5sYgBZqn0fcHifav4\n" ] } ], "source": [ "\n", "model_id = ml_client.register_pretrained_model(model_name = \"huggingface/sentence-transformers/all-MiniLM-L12-v2\", model_version = \"1.0.1\", model_format = \"TORCH_SCRIPT\", model_group_id = \"d4hfsYgBFp6IJxCcqpwi\", deploy_model=False, wait_until_deployed=False)" ] }, { "cell_type": "markdown", "id": "65de517d", "metadata": {}, "source": [ "We can also upload a model from our own file system or from a URL. But to do that we need to update a couple of cluster settings:\n", "\n", "To register from a URL: plugins.ml_commons.allow_registering_model_via_url\n", "To register from the file system: plugins.ml_commons.allow_registering_model_via_local_file\n", "\n", " By default, both of these values are `False`; we need to update them to `True`. A minimal sketch of updating these settings is shown below.\n", "\n", "\n", "To demonstrate, we download the model zip file from this URL: https://github.com/opensearch-project/ml-commons/raw/2.x/ml-algorithms/src/test/resources/org/opensearch/ml/engine/algorithms/text_embedding/all-MiniLM-L6-v2_torchscript_sentence-transformer.zip?raw=true\n", "\n", "To upload a model to the cluster, we need a zip file containing a TorchScript file (.pt extension) and a tokenizer.json file (see the download above). We also need a JSON file defining the model configuration with the following request fields: \n", "\n", "https://opensearch.org/docs/latest/ml-commons-plugin/api/#request-fields" ] },
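{ "cell_type": "markdown", "metadata": {}, "source": [ "The next cell is a minimal sketch of how these two settings could be enabled from this client using the cluster settings API. Whether this is needed (or allowed) depends on your cluster; managed clusters may restrict these settings.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Minimal sketch: allow registering models from a URL or from the local file system.\n", "# These settings may be restricted on managed or production clusters.\n", "settings_body = {\n", "    \"persistent\": {\n", "        \"plugins.ml_commons.allow_registering_model_via_url\": True,\n", "        \"plugins.ml_commons.allow_registering_model_via_local_file\": True\n", "    }\n", "}\n", "print(client.cluster.put_settings(body=settings_body))" ] },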
{ "cell_type": "code", "execution_count": 6, "id": "c7b0ff7e", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Total number of chunks 10\n", "Sha1 value of the model file: 3ead6e8725322ff54ef9137c453132046098d7e6494945283b8fc980c9123675\n", "Model meta data was created successfully. Model Id: 4oh9sYgBFp6IJxCclpx2\n", "uploading chunk 1 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 2 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 3 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 4 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 5 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 6 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 7 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 8 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 9 of 10\n", "Model id: {'status': 'Uploaded'}\n", "uploading chunk 10 of 10\n", "Model id: {'status': 'Uploaded'}\n", "Model registered successfully\n", "Model deployed successfully\n" ] } ], "source": [ "\n", "model_path = '/Volumes/workplace/upload_content/sentence-transformers_all-MiniLM-L6-v2-1.0.0-torch_script.zip'\n", "model_config_path = '/Volumes/workplace/upload_content/all-MiniLM-L6-v2_torchscript.json'\n", "\n", "\"\"\"\n", "all-MiniLM-L6-v2_torchscript.json content:\n", "\n", "{\n", " \"name\": \"all-MiniLM-L6-v2\",\n", " \"version\": 1,\n", " \"model_format\": \"TORCH_SCRIPT\",\n", " \"model_config\": {\n", " \"model_type\": \"bert\",\n", " \"embedding_dimension\": 384,\n", " \"framework_type\": \"sentence_transformers\"\n", " }\n", "}\n", "\"\"\"\n", "\n", "\n", "model_id_file_system = ml_client.register_model(model_path, model_config_path, model_group_id = \"d4hfsYgBFp6IJxCcqpwi\", isVerbose=True)" ] }, { "cell_type": "markdown", "id": "9c3b7cbd", "metadata": {}, "source": [ "## Step 2: Load Model\n", "\n", "In the previous step we uploaded a model; in this walkthrough its model id is `7KFfsYgBZqn0fcHi8Ku0`. Now we will load this model into OpenSearch memory." ] }, { "cell_type": "code", "execution_count": 15, "id": "28e9310c", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Model deployed successfully\n", "{'model_id': '7KFfsYgBZqn0fcHi8Ku0', 'task_type': 'DEPLOY_MODEL', 'function_name': 'TEXT_EMBEDDING', 'state': 'FAILED', 'worker_node': ['QkNLom65QCiU1AwA4fRQHA', 'BJ0OvIWHTJuEJu2muBVRIA'], 'create_time': 1686603192569, 'last_update_time': 1686603193467, 'error': '{\"QkNLom65QCiU1AwA4fRQHA\":\"Duplicate deploy model task\",\"BJ0OvIWHTJuEJu2muBVRIA\":\"Duplicate deploy model task\"}', 'is_async': True}\n" ] } ], "source": [ "\n", "load_model_output = ml_client.deploy_model(\"7KFfsYgBZqn0fcHi8Ku0\")\n", "\n", "print(load_model_output)\n" ] },
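{ "cell_type": "markdown", "metadata": {}, "source": [ "Deploying a model is asynchronous. The next cell is a minimal, optional sketch of how we could wait for the deploy task to finish by polling `get_task_info`; the task id used here is a placeholder taken from a previous run, so substitute the task id returned by your own deploy call.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import time\n", "\n", "# Minimal sketch: poll a deploy task until it reaches a terminal state.\n", "# The task id below is a placeholder from a previous run of this notebook.\n", "task_id = \"kNuaoIUBqB81FWKimHoo\"\n", "\n", "for _ in range(30):\n", "    task = ml_client.get_task_info(task_id)\n", "    if task.get(\"state\") in (\"COMPLETED\", \"FAILED\"):\n", "        break\n", "    time.sleep(2)\n", "print(task.get(\"state\"))" ] },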
{ "cell_type": "markdown", "id": "34ee235b", "metadata": {}, "source": [ "## Step 3: Get Task\n", "\n", "When we invoke the load model API of the ML Commons plugin, a task gets created. We can see the task id (`j9uRoIUBqB81FWKi_Xqu`) in the previous output. Now we can get detailed information about the task using this task id." ] }, { "cell_type": "code", "execution_count": 31, "id": "44d6b1d2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'model_id': 'jtuRoIUBqB81FWKil3qA', 'task_type': 'LOAD_MODEL', 'function_name': 'TEXT_EMBEDDING', 'state': 'COMPLETED', 'worker_node': '56rNfEbPSG6p8ZZli59Zpg,Lncik04uQxe-cw3BC14wNA', 'create_time': 1673436764200, 'last_update_time': 1673436768619, 'is_async': True}\n" ] } ], "source": [ "\n", "task_info = ml_client.get_task_info(\"kNuaoIUBqB81FWKimHoo\")\n", "\n", "print(task_info)" ] }, { "cell_type": "markdown", "id": "97ed5665", "metadata": {}, "source": [ "## Step 4: Get Model\n", "\n", "Using the model id, we can also pull the model metadata from the OpenSearch cluster." ] }, { "cell_type": "code", "execution_count": 16, "id": "661c3f46", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'name': 'huggingface/sentence-transformers/all-MiniLM-L12-v2', 'model_group_id': 'd4hfsYgBFp6IJxCcqpwi', 'algorithm': 'TEXT_EMBEDDING', 'model_version': '1', 'model_format': 'TORCH_SCRIPT', 'model_state': 'DEPLOYED', 'model_content_size_in_bytes': 134568911, 'model_content_hash_value': 'f8012a4e6b5da1f556221a12160d080157039f077ab85a5f6b467a47247aad49', 'model_config': {'model_type': 'bert', 'embedding_dimension': 384, 'framework_type': 'SENTENCE_TRANSFORMERS', 'all_config': '{\"_name_or_path\":\"microsoft/MiniLM-L12-H384-uncased\",\"attention_probs_dropout_prob\":0.1,\"gradient_checkpointing\":false,\"hidden_act\":\"gelu\",\"hidden_dropout_prob\":0.1,\"hidden_size\":384,\"initializer_range\":0.02,\"intermediate_size\":1536,\"layer_norm_eps\":1e-12,\"max_position_embeddings\":512,\"model_type\":\"bert\",\"num_attention_heads\":12,\"num_hidden_layers\":12,\"pad_token_id\":0,\"position_embedding_type\":\"absolute\",\"transformers_version\":\"4.8.2\",\"type_vocab_size\":2,\"use_cache\":true,\"vocab_size\":30522}'}, 'created_time': 1686603034649, 'last_updated_time': 1686603197066, 'last_registered_time': 1686603047287, 'last_deployed_time': 1686603197066, 'total_chunks': 14, 'planning_worker_node_count': 2, 'current_worker_node_count': 2, 'planning_worker_nodes': ['QkNLom65QCiU1AwA4fRQHA', 'BJ0OvIWHTJuEJu2muBVRIA'], 'deploy_to_all_nodes': True}\n" ] } ], "source": [ "\n", "model_info = ml_client.get_model_info(\"7KFfsYgBZqn0fcHi8Ku0\")\n", "\n", "print(model_info)" ] }, { "cell_type": "markdown", "id": "3b22c708", "metadata": {}, "source": [ "## Step 5: Generate Sentence Embedding\n", "\n", "Now, using the model loaded in memory, we can generate embeddings for sentences. We can provide a list of sentences and get back one embedding per sentence. 
" ] }, { "cell_type": "code", "execution_count": 17, "id": "8cc5a796", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'inference_results': [{'output': [{'name': 'sentence_embedding', 'data_type': 'FLOAT32', 'shape': [384], 'data': [0.070045985, 0.094030164, 0.029469099, 0.006335383, -0.037177853, 0.0034696271, 0.06973787, -0.041374803, -0.05277958, -0.019993568, 0.049499072, 0.0443014, 0.05095634, -0.09186084, -0.039252527, -0.028518854, 0.018059185, -0.097130835, -0.03480089, 0.044088792, 0.025124501, -0.06829837, 0.021070546, 0.073358126, -0.016342998, 0.016885245, 0.0073821708, -0.06980089, 0.019172797, -0.12756695, -0.0028336751, 0.076620854, 0.010953987, 0.04055977, 0.047134332, -0.029655162, -0.025424464, -0.023706172, 0.015665784, -0.00028456873, -0.022526933, -0.073676914, 0.055472963, 0.01868282, 0.039403405, -0.024852037, 0.04160002, -0.0012200676, -0.012104933, -0.051197134, -0.074466705, -0.055452745, 0.0074861553, -0.019089255, -0.030097326, -0.026060734, -0.052988756, 0.067124605, 0.025931405, -0.026440043, -0.006570677, 0.055886537, -0.053474635, 0.007984726, 0.08091788, -0.03664718, -0.03190376, -0.073973835, -0.00654548, 0.03476009, -0.009686864, 0.016051935, -0.047839254, 0.0035186426, -9.659327e-05, -0.03731518, 0.005097011, 0.0046331016, 0.04831959, -0.037194345, -0.018304912, -0.06406546, 0.007127483, 0.0036731786, -0.037867643, 0.0525798, 0.022754213, -0.009397554, -0.0788709, -0.04558914, 0.026581936, -0.07401967, 0.0022031788, 0.027502475, -0.04612981, 0.02867453, 0.02812278, -0.06055298, 0.046777442, 0.19889696, 0.026731564, -0.007961404, -0.064086854, 0.03999534, -0.065743305, -0.01038975, 0.031175181, 0.0015405513, -0.04053732, -0.0026587155, 0.0201725, 0.083728634, 0.005884634, 0.032710858, -0.0015733105, -0.09610316, 0.002357033, 0.05984074, -0.16500355, 0.036792327, 0.12291777, -0.001806536, 0.043636028, -0.04189108, 0.009408711, -0.08278277, 0.12746945, 0.016939184, 0.012635965, -0.146165, 0.051131286, 0.0033628952, -0.022748712, 0.022569735, -0.037459757, 0.018744115, -0.006217754, 0.084697165, -0.006795305, -0.054143608, -0.057388783, 0.047126126, 0.016142208, 0.04754377, -0.062171612, -0.012241133, -0.08141003, -0.01191984, 0.016868386, 0.013309095, 0.0659549, 0.02784064, 0.013731603, -0.05787209, -0.026792012, 0.018663717, -0.050591003, -0.02040348, 0.047714904, -0.016063442, -0.10401469, 0.13376768, -0.030586442, 0.0010340372, 0.070359856, -0.013292282, -0.015457728, 0.048926532, -0.011410942, -0.052058127, 0.18224765, 0.0035982048, 0.02246424, -0.0030590945, -0.019338036, 0.00026652194, 0.022335682, 0.07469184, 0.02382239, 0.01046695, 0.03009087, -0.006465016, -0.051522683, 0.019980922, -0.03548732, 0.010370418, -0.08818134, -0.028292583, 0.074272536, 0.08402995, 0.035817318, -0.018850267, 0.031453982, 0.06855093, 0.014118732, -0.07007131, 0.008944237, 0.01816519, -0.07784689, -0.07069612, -0.012862535, -0.015138116, -0.03347607, -0.0068311035, -0.002638591, 0.036386125, 0.03143706, -0.08959042, 0.053561244, -0.08690694, -0.014790011, 0.055373162, -0.11275848, -0.038676344, 0.01777759, -0.0624037, 0.08309957, -0.058959704, 0.02146699, -0.0052596424, -0.03967568, -0.07005087, 0.0535649, 1.1060871e-32, 0.0013608177, 0.106175095, -0.05951191, -0.0037961602, 0.013185205, -0.046274826, 0.10418064, -0.0012341454, -0.0131508345, 0.018092377, 0.0054246127, 0.014041023, 0.036791388, 0.016286977, -0.09607988, 0.015155129, 0.023281448, 0.0831054, 0.00065585395, 0.0002872294, 0.049073815, 0.050753243, -0.0048564766, 
0.08584035, -0.049346138, 0.010288568, 0.08767223, -0.0668006, -0.02730152, -0.06031797, 0.08574493, 0.0017309934, 0.004164226, 0.13996643, -0.0064810724, -0.06346753, 0.106137946, -0.06602096, 0.007606502, 0.034102228, 0.015871577, 0.034030348, -0.06620869, 0.061993707, -0.016627932, -0.001730041, 0.025229787, -0.0031807858, 0.04901121, 0.00089508446, -0.039892998, 0.0016668648, -0.012777798, 0.015382705, 0.016362779, -0.02404705, -0.021434382, -0.09117077, -0.05373932, -0.018560624, 0.023284905, 0.0031982018, -0.0031510415, 0.10734427, 0.06998923, -0.0027732637, -0.0530556, 0.023547022, 0.025985928, -0.035036553, -0.058912452, -0.029204743, 0.008312955, -0.056485176, -0.014061077, 0.04225055, -0.08007323, -0.0091617135, -0.018469298, -0.04556874, -0.015968537, -0.014121485, 0.0036195326, 0.027108386, 0.03586916, -0.01750429, 0.10870282, -0.0083625065, 0.014823508, 0.04874035, 0.010967432, 0.018752282, -0.046599686, -0.05806444, -0.029174872, 4.6263226e-33, -0.03030781, -0.06679235, 0.0005264349, 0.045648444, 0.10910713, -0.016406672, -0.006509885, -0.16902378, 0.011055921, 0.036133043, 0.050960954, 0.028546343, -0.07644701, 0.08252433, -0.041281033, 0.0342127, -0.001389836, 0.0062635387, -0.04514362, -0.01706786, 0.07922916, -0.022634191, -0.06849333, -0.007873758, 0.030855168, 0.036613345, 0.01543812, 0.041089304, -0.05730519, 0.06317529, 0.08143371, 0.07131982, -0.037264705, -0.058889233, -0.049804617, -0.034103516, 0.06648858, -0.014449709, 0.022788258, 0.03571679, -0.03570635, 0.046074998, 0.0053664967, 0.024858603, 0.025428161, -0.010590862, -0.040552795, -0.06513859, -0.0003934997, -0.056372937, -0.05148246, 0.011264148, 0.06645999, 0.026652971, 0.07176414, 0.035537906, 0.03287273, -0.0017281885, -0.079337545, 0.049712148, 0.06649142, 0.06070372, -0.012733799, -0.0060011027]}]}, {'output': [{'name': 'sentence_embedding', 'data_type': 'FLOAT32', 'shape': [384], 'data': [0.07083514, 0.098972656, 0.023528868, 0.015548298, -0.03414622, 0.024773195, 0.060273148, -0.028832981, -0.09044978, -0.029630601, 0.044647824, 0.020695554, 0.0451399, -0.105178855, -0.03574795, -0.022011578, 0.02242477, -0.06875169, -0.041503854, 0.046235904, 0.025888842, -0.058203552, 0.0317196, 0.064302124, -0.030310726, 0.027002888, -0.0028196487, -0.044340227, 0.032432694, -0.11670581, 0.014379569, 0.06432164, -0.011770892, 0.032793347, 0.04387399, -0.038692925, -0.013582388, -0.026226582, 0.0076006465, -0.027217122, -0.034419663, -0.08012475, 0.05472637, 0.0074604633, 0.047486894, -0.025225092, 0.037482392, 0.0020973443, -0.0042126533, -0.057536595, -0.08447187, -0.043337554, 0.014264286, -0.023134142, -0.029479904, -0.03363044, -0.05907903, 0.06964192, 0.035337694, -0.022371998, -0.021929951, 0.062269177, -0.030993886, 0.0076106074, 0.07694969, -0.017912427, -0.044359725, -0.056732178, -0.015525767, 0.032636724, -0.009579165, 0.018426858, -0.054791622, 0.008909622, 0.0063452916, -0.0313364, 0.005820822, -0.008221157, 0.058385953, -0.047120888, -0.0052236924, -0.06515579, 0.0010107595, 0.020413563, -0.041749377, 0.044107597, 0.008662938, -0.011051114, -0.08871864, -0.04507809, 0.036184292, -0.07195325, 0.009203214, 0.023508972, -0.028275426, 0.028231027, 0.003911504, -0.070118375, 0.058573272, 0.20727791, 0.048356153, -0.0018830848, -0.066115424, 0.022906033, -0.061623204, -0.021237886, 0.018174624, -0.0050362684, -0.023906242, -0.002387967, 0.035721593, 0.10048729, -0.007958879, 0.044000257, 0.0103866365, -0.09361922, 0.011654698, 0.06738684, -0.1736049, 0.03254977, 0.13699506, -0.0092994645, 0.04818335, 
-0.050219133, 0.005542939, -0.10294832, 0.10766475, 0.015173319, 0.013554773, -0.13501169, 0.045765, 0.021902261, -0.033359535, 0.028230267, -0.04085567, 0.0077474224, -0.036401946, 0.092795655, -0.00096511975, -0.059694305, -0.035620503, 0.037294034, 0.0015283289, 0.034953, -0.04880616, -0.0059092464, -0.060103483, -0.008989362, 0.022295682, 0.022184573, 0.04979118, 0.049196165, 0.03790123, -0.053590335, -0.0077016405, 0.003188551, -0.056628674, -0.027927972, 0.060124118, -0.025649551, -0.09657911, 0.1299167, -0.032081775, -0.0060925665, 0.06273658, -0.025575282, -0.0069705267, 0.050369203, -0.025092889, -0.019692965, 0.17965682, 0.006651051, 0.024963867, -0.018071478, -0.016473701, -0.00877558, 0.04949933, 0.07836956, 0.03252581, -0.007377615, 0.03200573, -0.011032086, -0.028064458, 0.01763287, -0.026994107, 0.00808559, -0.070627235, -0.034461614, 0.05010755, 0.09331023, 0.049584378, -0.021610672, 0.027137306, 0.071931034, -0.013025755, -0.07407564, 0.020584432, 0.01143555, -0.10216391, -0.06688468, 0.019413434, -0.0076031247, -0.03699172, -0.03037342, 0.0014173934, 0.033982094, 0.028674062, -0.080678664, 0.037503026, -0.08438446, -0.00107288, 0.057899155, -0.113598086, -0.018308517, 0.011551393, -0.07234887, 0.09252413, -0.049695067, 0.014780652, -0.0075675673, -0.044519123, -0.061334103, 0.0589534, 8.414513e-33, 0.009771945, 0.12160786, -0.03935934, -0.010013683, 0.008984462, -0.048829008, 0.092219844, 0.0024215193, -0.008333239, 0.03851628, 0.012913554, 0.032042034, 0.04166485, 0.00095339265, -0.10857277, 0.024132853, 0.021791779, 0.10197005, 0.009454616, 0.02655147, 0.05246785, 0.06066869, -0.01103198, 0.08555522, -0.04832902, 0.015530094, 0.081544966, -0.07317735, -0.024714837, -0.054873053, 0.07664425, 0.00995868, -0.0126262475, 0.12942667, -0.020951333, -0.071839534, 0.112829536, -0.05635948, -0.00964097, 0.0416308, 0.037358228, 0.029693559, -0.06289643, 0.05327226, -0.012749386, 0.009228852, 0.019423483, 0.004775946, 0.036169004, -0.012142309, -0.019225916, -4.9336504e-05, -0.026529595, 0.012205288, 0.017824031, -0.015235648, -0.018118307, -0.08875225, -0.03372443, -0.020540647, -0.0050654826, -0.001371073, 0.0008696483, 0.11509085, 0.067525595, 0.0014781682, -0.05162725, 0.024478963, 0.02746541, -0.04733127, -0.071528465, -0.033279188, -0.007459125, -0.058763813, -0.04050985, 0.023988923, -0.0877261, -0.0031558173, -0.020018995, -0.041812886, -0.0050884373, -0.01390084, 0.02811135, 0.021810783, 0.013438191, -0.007385128, 0.100511186, -0.0144268125, -0.0046929154, 0.039197326, 0.002359752, 0.015907316, -0.035865612, -0.071099125, -0.01309764, 7.376516e-33, -0.022041496, -0.06963562, -0.022240836, 0.03717517, 0.09121255, -0.009112608, -0.014528304, -0.15407158, 0.005560728, 0.024977768, 0.052556068, 0.014630092, -0.08889645, 0.09537802, -0.03995325, 0.04229064, -0.024341475, 0.009678618, -0.048159223, -0.036069203, 0.059143994, -0.00944305, -0.062726334, 0.0018664591, 0.004131935, 0.03928479, 0.014626856, 0.025103925, -0.05313026, 0.054437604, 0.07312243, 0.059003882, -0.038209524, -0.085269175, -0.06061704, -0.042276274, 0.061298754, -0.016126659, 0.03648384, 0.0263596, -0.020908905, 0.048757087, -0.0046108705, 0.035508107, 0.028468246, -0.009864904, -0.034235284, -0.07204, -0.0005532754, -0.05454473, -0.04248278, 0.014105863, 0.081947, 0.040557176, 0.06266554, 0.040847275, 0.028606374, 0.025837993, -0.07722129, 0.039314467, 0.049470015, 0.05854979, -0.03244549, -0.02478489]}]}]}\n" ] } ], "source": [ "# Now using this model we can generate sentence embedding.\n", "\n", 
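"# generate_embedding takes the model id and a list of input sentences; the response is expected\n", "# to contain one 'sentence_embedding' vector (384 floats for this model) per input sentence.\n",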
"input_sentences = [\"Test sentence1\", \"Test sentence2\"]\n", "\n", "embedding_output = ml_client.generate_embedding(\"7KFfsYgBZqn0fcHi8Ku0\", input_sentences)\n", "\n", "print(embedding_output)\n" ] }, { "cell_type": "markdown", "id": "4a27702e", "metadata": {}, "source": [ "## Step 6: Unload Model\n", "\n", "After generating the embedding if we want we can unload the model from memory. `unload_model` method takes two input. \n", "\n", "1. model_id --> Which model we want to unload\n", "2. node_ids --> list of the nodes from where we want to unload the model.\n", "\n", "If we don't provide `node_ids` then method will unload model from all the nodes available like the following example." ] }, { "cell_type": "code", "execution_count": 18, "id": "9636c6bf", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'BJ0OvIWHTJuEJu2muBVRIA': {'stats': {'7KFfsYgBZqn0fcHi8Ku0': 'undeployed'}}, 'LnxY8AMlTpecPVBSiTOYWg': {'stats': {'7KFfsYgBZqn0fcHi8Ku0': 'not_found'}}, 'U6Y1_KIrRJuUbcu89tDIWg': {'stats': {'7KFfsYgBZqn0fcHi8Ku0': 'not_found'}}, 'QkNLom65QCiU1AwA4fRQHA': {'stats': {'7KFfsYgBZqn0fcHi8Ku0': 'undeployed'}}}\n" ] } ], "source": [ "\n", "undeploy_model_response = ml_client.undeploy_model(\"7KFfsYgBZqn0fcHi8Ku0\")\n", "\n", "print(undeploy_model_response)" ] }, { "cell_type": "markdown", "id": "ad3173f3", "metadata": {}, "source": [ "## Step 7: Delete Model\n", "\n", "We can also delete the model from the index using the model id." ] }, { "cell_type": "code", "execution_count": 19, "id": "001165fb", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'_index': '.plugins-ml-model', '_id': '7KFfsYgBZqn0fcHi8Ku0', '_version': 11, 'result': 'deleted', '_shards': {'total': 2, 'successful': 2, 'failed': 0}, '_seq_no': 24, '_primary_term': 1}\n" ] } ], "source": [ "\n", "delete_model_response = ml_client.delete_model(\"7KFfsYgBZqn0fcHi8Ku0\")\n", "\n", "print(delete_model_response)" ] }, { "cell_type": "code", "execution_count": null, "id": "afb60353", "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.16" } }, "nbformat": 4, "nbformat_minor": 5 }