Reputation: 19312
I have deployed a k8s cluster to AWS using kops. The process created a ~/.kube/config file with the following structure:
apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: <data_here>
  name: <cluster_name>
contexts:
- context:
    cluster: <cluster_name>
    user: <cluster_name>
  name: <cluster_name>
current-context: <cluster_name>
kind: Config
preferences: {}
users:
- name: <cluster_name>
  user:
    as-user-extra: {}
    client-certificate-data: <client_certificate_data>
    client-key-data: <client-key-data>
    password: <some-password>
    username: admin
- name: <cluster-name>-basic-auth
  user:
    as-user-extra: {}
    password: <some-password>
    username: admin
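As a quick sanity check (a sketch, assuming kubectl is installed and reading this config), the active context and credentials can be confirmed with:

kubectl config current-context
kubectl config view --minify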
When (after creating the dashboard) I run kubectl proxy and try to access the dashboard via localhost, I am prompted with the dashboard sign-in screen.
1: Why, when I point the file browser to the ~/.kube/config file created above, do I get "Authentication failed"?
2: When I provide the password included in the ~/.kube/config file, I do log in, but not as an administrator (e.g. I am unable to view compute resources on the pods).
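For reference, the access flow described above is roughly the following (a sketch; the URL assumes the dashboard is deployed in the kube-system namespace with the default service name):

kubectl proxy
# then, in a browser:
# http://localhost:8001/api/v1/namespaces/kube-system/services/https:kubernetes-dashboard:/proxy/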
Upvotes: 0
Views: 1620
Reputation: 388
You can grant admin access by applying this YAML manifest:
kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: dashboard
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: ServiceAccount
  name: kubernetes-dashboard
  namespace: kube-system
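A minimal way to apply and verify it, assuming the manifest is saved as dashboard-admin.yaml (a filename chosen here for illustration):

kubectl apply -f dashboard-admin.yaml
kubectl get clusterrolebinding dashboard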
This will allow access to the Kubernetes dashboard. Note that it grants open (cluster-admin) access; after applying it, you can simply skip the login on the sign-in page. More details can be found here: https://github.com/kubernetes/dashboard/wiki/Access-control
Upvotes: 1