Delete EKS Cluster from CLI
Ensure that the KUBECONFIG environment variable is set to the self-managed cluster by running:
export KUBECONFIG={SELF_MANAGED_AWS_CLUSTER}.conf
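As an optional sanity check (not part of the documented steps), you can confirm that kubectl is now pointed at the self-managed cluster before proceeding:
# Show which context the current KUBECONFIG resolves to
kubectl config current-context
# Verify the cluster API server is reachable
kubectl get nodes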
Delete the EKS cluster
Follow these steps:
Ensure your AWS credentials are up to date. If you are using user profiles, then refresh the credentials using the command below. Otherwise, proceed to step 2.
dkp update bootstrap credentials aws
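If you want to confirm that the refreshed credentials are valid before continuing (an optional check, not part of the DKP workflow), the AWS CLI can report the identity currently in use:
# Print the AWS account and IAM identity backing the current credentials
aws sts get-caller-identity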
Delete the Kubernetes cluster and wait a few minutes:
Before deleting the cluster, dkp deletes all Services of type LoadBalancer on the cluster. Each Service is backed by an AWS Classic ELB, and deleting the Service deletes the ELB that backs it. To skip this step, use the flag --delete-kubernetes-resources=false (see the example after the command output below).
NOTE: Do not skip this step if the VPC is managed by DKP. When DKP deletes the cluster, it also deletes the VPC. If the VPC still contains any AWS Classic ELBs, AWS does not allow the VPC to be deleted, and DKP cannot delete the cluster.
dkp delete cluster --cluster-name=${CLUSTER_NAME}
✓ Deleting Services with type LoadBalancer for Cluster default/eks-example
✓ Deleting ClusterResourceSets for Cluster default/eks-example
✓ Deleting cluster resources
✓ Waiting for cluster to be fully deleted
Deleted default/eks-example cluster
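If you choose to skip the Service deletion described in the note above (only safe when the VPC is not managed by DKP), the command would look like this sketch, combining the documented command and flag:
# Skip deletion of LoadBalancer Services; only do this if the VPC is NOT managed by DKP
dkp delete cluster --cluster-name=${CLUSTER_NAME} --delete-kubernetes-resources=false
After the delete completes, you can optionally confirm that the Cluster API object is gone; this assumes kubectl still points at the self-managed cluster, as set earlier:
# The deleted cluster should no longer be listed
kubectl get clusters --all-namespaces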
Next Step:
Once your cluster is built with the Konvoy component of DKP for your infrastructure/environment, install the Kommander component of DKP to access your dashboard and continue customization.
Known Limitations
NOTE: Be aware of these limitations in the current release of DKP.
The DKP version used to create the workload cluster must match the DKP version used to delete the workload cluster.
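Because of this limitation, it can help to record the CLI version before creating and deleting workload clusters. A minimal check, assuming the dkp CLI's version subcommand is available in your release:
# Print the version of the dkp CLI used on this machine
dkp version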