Deploy your Notebook
NOTE: This notebook requires Kaptain SDK 1.3.x or later.
This is a continuation of the Multi-Cluster Tutorial use case example. Ensure you have successfully completed the steps in Build your Notebook in your source cluster.
What You Need
To run this notebook, ensure the notebook server in your target deployment cluster is configured similarly to the source cluster, as specified in the Prerequisites. This includes the following steps:

1. Create a Docker secret and an AWS credentials secret.
2. Create a PodDefault configuration referencing the created secrets.
3. Launch a Jupyter notebook server with that PodDefault configuration.
You will be able to open this notebook after launching the notebook server.
Ensure You Are Ready to Start
Before proceeding, verify that the notebook server was configured and launched correctly:
Ensure that the Docker secret is mounted. You should not see an error:

```sh
%%sh
ls -la ~/.docker/config.json
```
Output:

```text
lrwxrwxrwx 1 root istio 18 Oct 6 07:45 /home/kubeflow/.docker/config.json -> ..data/config.json
```
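If you prefer to verify from Python rather than a shell cell, the check below is a minimal sketch: it confirms the standard Docker config location exists and parses as JSON. The helper name is hypothetical, not part of the Kaptain SDK.

```python
import json
import os


def docker_config_present(path: str = "~/.docker/config.json") -> bool:
    """Return True if a Docker config file exists at `path` and parses as JSON.

    Hypothetical helper for sanity-checking the mounted secret; `os.path.isfile`
    follows the symlink created by the secret mount.
    """
    expanded = os.path.expanduser(path)
    if not os.path.isfile(expanded):
        return False
    try:
        with open(expanded) as fh:
            json.load(fh)
        return True
    except (OSError, ValueError):
        return False


print(docker_config_present())
```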
Verify that the AWS environment variables are set. You should see `AWS_ACCESS_KEY_ID`, `AWS_REGION`, and `AWS_SECRET_ACCESS_KEY`:

```sh
%%sh
set | egrep ^AWS_ | cut -f 1 -d '='
```
Output:

```text
AWS_ACCESS_KEY_ID
AWS_REGION
AWS_SECRET_ACCESS_KEY
```
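The same check can be done from Python. The sketch below lists the required variables from the expected output above; the function name is hypothetical and not part of the SDK.

```python
import os

# The three variables the S3ConfigurationProvider reads from the environment.
REQUIRED_AWS_VARS = ("AWS_ACCESS_KEY_ID", "AWS_REGION", "AWS_SECRET_ACCESS_KEY")


def missing_aws_vars(env=None):
    """Return the required AWS variables that are not set in `env`."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED_AWS_VARS if name not in env]


missing = missing_aws_vars()
if missing:
    print("Missing AWS variables:", ", ".join(missing))
else:
    print("All AWS variables are set.")
```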
Initialize Model Configuration
Before loading the model, provide configuration to the Kaptain SDK so it can access the Docker registry and S3:

```python
from kaptain.config import Config
from kaptain.platform.config.s3 import S3ConfigurationProvider
from kaptain.platform.config.docker import DockerConfigurationProvider
from kaptain.platform.model_util import ModelUtil

config = Config(
    docker_config_provider=DockerConfigurationProvider.default(),
    storage_config_provider=S3ConfigurationProvider.from_env(),
)
```
Load the stored model state from S3:

```python
# Replace the model_uri value with the URI obtained in the previous notebook
# when running `model.meta().saved_model_uri`.
model = Model.load_from_json(
    model_uri="s3://kaptain/models/dev/mnist/trained/b69dc6f6e3c246858cf43a1eba8be5f5/0001",
    config=config,
)
```
Output:

```text
[I 221006 08:39:36 model_util:67] Loading model state from s3://kaptain/models/dev/mnist/trained/b69dc6f6e3c246858cf43a1eba8be5f5/0001.
```
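Before loading, it can help to sanity-check that the URI copied from the previous notebook is a well-formed S3 URI. This is a minimal sketch using the standard library; the helper is hypothetical and not part of the Kaptain SDK.

```python
from urllib.parse import urlparse


def split_s3_uri(uri: str):
    """Split an s3:// URI into (bucket, key); raise ValueError if malformed."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3" or not parsed.netloc:
        raise ValueError(f"not a valid S3 URI: {uri!r}")
    return parsed.netloc, parsed.path.lstrip("/")


bucket, key = split_s3_uri(
    "s3://kaptain/models/dev/mnist/trained/b69dc6f6e3c246858cf43a1eba8be5f5/0001"
)
print(bucket, key)
```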
Deploy the model:

```python
model.deploy(cpu="1", memory="2G", replace=True)
```
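Once the deployment is up, you can send a test prediction to the serving endpoint. The sketch below builds a request in the standard KFServing/KServe V1 `{"instances": [...]}` format; the endpoint URL, model name, and input shape are assumptions you must adjust for your cluster.

```python
import json
from urllib import request


def build_predict_request(url: str, instances):
    """Build a POST request for a KFServing/KServe V1 `:predict` endpoint."""
    body = json.dumps({"instances": instances}).encode("utf-8")
    return request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )


# Hypothetical in-cluster endpoint; replace host and model name with your own.
req = build_predict_request(
    "http://mnist.default.svc.cluster.local/v1/models/mnist:predict",
    instances=[[0.0] * 784],  # assumed input: one flattened 28x28 image
)
# response = request.urlopen(req)  # run inside the cluster to get predictions
```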
Congratulations, you have completed this tutorial!