Prerequisites
Before running the tutorial, configure a Docker secret and an AWS credentials secret. AWS credentials are needed in both the source cluster (the one training the model) and the target cluster (the one to which you'll deploy the model).
ENV setup
Set the environment variable NAMESPACE to the namespace in which you will be training and/or deploying your model.
export NAMESPACE=user1
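To confirm the namespace exists before proceeding, an optional check (assuming the same target/kube.config used in the commands below) is:

kubectl --kubeconfig target/kube.config get namespace ${NAMESPACE}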
Create a Docker secret
Create a Docker configuration file, docker-config.json, with base64-encoded credentials.
Encode your credentials with the following command:
echo -n "<username>:<password>" | base64
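For example, with the hypothetical credentials jdoe and s3cr3t, the command prints a single base64 string; the -n flag keeps a trailing newline from being encoded:

echo -n "jdoe:s3cr3t" | base64
# amRvZTpzM2NyM3Q=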
Create the docker-config.json file with the following format:
{
  "auths": {
    "https://index.docker.io/v1/": {
      "auth": "<username:password in base64>"
    }
  }
}
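If you prefer, you can generate the file in one step with a sketch like the following, which combines the two commands above; replace the placeholders with your registry username and password before running:

cat > docker-config.json << EOF
{
  "auths": {
    "https://index.docker.io/v1/": {
      "auth": "$(echo -n "<username>:<password>" | base64)"
    }
  }
}
EOF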
Create a secret from your docker-config.json file:
kubectl --kubeconfig target/kube.config create secret generic docker-secret -n ${NAMESPACE} --from-file=docker-config.json
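To verify the secret was created with the expected key (the value itself is not displayed), you can optionally run:

kubectl --kubeconfig target/kube.config describe secret docker-secret -n ${NAMESPACE}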
Create an AWS secret
Set your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables:
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
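If the AWS CLI happens to be installed locally, you can optionally confirm the credentials are valid before storing them in the cluster; sts get-caller-identity is read-only:

AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID}" AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY}" \
  aws sts get-caller-identity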
Create a secret with the following command:
kubectl --kubeconfig target/kube.config create secret -n ${NAMESPACE} generic aws-credentials \
  --from-literal=AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID}" \
  --from-literal=AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY}" \
  --from-literal=AWS_REGION=us-west-2 \
  --from-literal=S3_ENDPOINT=https://s3.us-west-2.amazonaws.com
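As an optional sanity check, you can decode one of the non-sensitive keys back out of the secret:

kubectl --kubeconfig target/kube.config get secret aws-credentials -n ${NAMESPACE} \
  -o jsonpath='{.data.AWS_REGION}' | base64 --decode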
Create a PodDefault
Create a PodDefault resource that references the Docker and AWS secrets:
cat << EOF | kubectl apply -f -
apiVersion: "kubeflow.org/v1alpha1"
kind: PodDefault
metadata:
  name: inject-credentials
  namespace: ${NAMESPACE}
spec:
  selector:
    matchLabels:
      inject-credentials: "true"
  desc: "AWS and Docker credentials"
  volumeMounts:
    - name: docker-secret-volume
      mountPath: /home/kubeflow/.docker/
  volumes:
    - name: docker-secret-volume
      secret:
        secretName: docker-secret
  envFrom:
    - secretRef:
        name: aws-credentials
EOF
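You can confirm the PodDefault was created in your namespace before launching the notebook server:

kubectl get poddefault inject-credentials -n ${NAMESPACE}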
Launch your Notebook Server
1. Create a notebook server based on the mesosphere/kubeflow:...-jupyter-spark-3.3.0-tensorflow-2.x.x image.
2. Select AWS and Docker credentials under the Configuration section.
3. Select Launch > Connect.
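Once connected, you can open a terminal in the notebook server and verify that the PodDefault injected both sets of credentials; this minimal check only reports whether the AWS variables are set and that the Docker config was mounted:

env | grep -E '^AWS_(ACCESS_KEY_ID|SECRET_ACCESS_KEY|REGION)=' | sed 's/=.*/=<set>/'
ls /home/kubeflow/.docker/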
Upload the Tutorials and Run the Corresponding Notebook
Upload this tutorial to the notebook server and run the step that corresponds to your cluster: in the source cluster, run through Build-Notebook.ipynb; in the target deployment cluster, run through Deploy-Notebook.ipynb.