Deployment of driver fails on GKE: Unknown user #67

Closed
saad-ali opened this issue Jul 16, 2018 · 6 comments

@saad-ali
Contributor

saad-ali commented Jul 16, 2018

I see the following when deploying to GKE:

2018-07-16 07:35:05.000 PDT
Unknown user "system:serviceaccount:default:csi-controller-sa"

But kubectl shows that the account exists:

$ kubectl get sa
NAME                SECRETS   AGE
csi-controller-sa   1         8m
csi-node-sa         1         8m
default             1         1h

I also see this error:

E  github.com/kubernetes-csi/external-provisioner/vendor/github.com/kubernetes-incubator/external-storage/lib/controller/controller.go:496: Failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:serviceaccount:default:csi-controller-sa" cannot list persistentvolumeclaims at the cluster scope: [clusterrole.rbac.authorization.k8s.io "system:csi-external-attacher" not found, clusterrole.rbac.authorization.k8s.io "system:csi-external-provisioner" not found]
E  Unknown user "system:serviceaccount:default:csi-controller-sa"

But kubectl shows the ClusterRoleBindings exist:

$ kubectl describe clusterrolebindings csi-controller-attacher-binding
Name:         csi-controller-attacher-binding
Labels:       <none>
Annotations:  kubectl.kubernetes.io/last-applied-configuration={"apiVersion":"rbac.authorization.k8s.io/v1","kind":"ClusterRoleBinding","metadata":{"annotations":{},"name":"csi-controller-attacher-binding","namespa...
Role:
  Kind:  ClusterRole
  Name:  system:csi-external-attacher
Subjects:
  Kind            Name               Namespace
  ----            ----               ---------
  ServiceAccount  csi-controller-sa  default
$ kubectl describe clusterrolebindings csi-controller-provisioner-binding
Name:         csi-controller-provisioner-binding
Labels:       <none>
Annotations:  kubectl.kubernetes.io/last-applied-configuration={"apiVersion":"rbac.authorization.k8s.io/v1","kind":"ClusterRoleBinding","metadata":{"annotations":{},"name":"csi-controller-provisioner-binding","name...
Role:
  Kind:  ClusterRole
  Name:  system:csi-external-provisioner
Subjects:
  Kind            Name               Namespace
  ----            ----               ---------
  ServiceAccount  csi-controller-sa  default
@saad-ali saad-ali changed the title Deployment of driver fails on GKE: Deployment of driver fails on GKE: Unknown user Jul 16, 2018
@davidz627
Contributor

I have been unable to reproduce this issue. What version of GKE are you using?

I find it very strange that the cluster shows the objects exist but the errors say they don't. At this point everything is done in the default namespace, so I don't see that being a problem right now.

I'm going to push an update to resolve #66. Once that is in, the driver should be plug-and-play with GKE.

@saad-ali
Contributor Author

Version:

$ kubectl version
Client Version: version.Info{Major:"", Minor:"", GitVersion:"v0.0.0-master+$Format:%h$", GitCommit:"$Format:%H$", GitTreeState:"not a git tree", BuildDate:"1970-01-01T00:00:00Z", GoVersion:"go1.8.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"9+", GitVersion:"v1.9.7-gke.3", GitCommit:"9b5b719c5f295c99de68ffb5b63101b0e0175376", GitTreeState:"clean", BuildDate:"2018-05-31T18:32:23Z", GoVersion:"go1.9.3b4", Compiler:"gc", Platform:"linux/amd64"}

Verified that #66 is resolved with the latest commit. But I'm still running into this, and Leonid is running into other issues.

@msau42
Contributor

msau42 commented Jul 17, 2018

Is your 1.9.7 cluster an alpha cluster? David and I have both tested against GKE 1.10.5.

@davidz627
Contributor

The default external component ClusterRoles were not introduced until 1.11; they were also cherry-picked back to 1.10.5. No 1.9 version has these ClusterRoles, so you will have to create them yourself if using a version older than 1.10.5.
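For anyone on an older cluster, a manifest for the missing ClusterRoles might look roughly like the sketch below. The rule lists here are illustrative assumptions only — copy the authoritative rules from the RBAC deployment manifests shipped in the kubernetes-csi external-attacher and external-provisioner repos for your release:

```yaml
# Sketch of the two ClusterRoles the bindings in this issue reference.
# The verbs/resources below are an assumption; use the official
# kubernetes-csi RBAC manifests as the source of truth.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: system:csi-external-attacher
rules:
  - apiGroups: [""]
    resources: ["persistentvolumes"]
    verbs: ["get", "list", "watch", "update"]
  - apiGroups: [""]
    resources: ["nodes"]
    verbs: ["get", "list", "watch"]
  - apiGroups: ["storage.k8s.io"]
    resources: ["volumeattachments"]
    verbs: ["get", "list", "watch", "update"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: system:csi-external-provisioner
rules:
  - apiGroups: [""]
    resources: ["persistentvolumes"]
    verbs: ["get", "list", "watch", "create", "delete"]
  - apiGroups: [""]
    resources: ["persistentvolumeclaims"]
    verbs: ["get", "list", "watch", "update"]
  - apiGroups: ["storage.k8s.io"]
    resources: ["storageclasses"]
    verbs: ["get", "list", "watch"]
  - apiGroups: [""]
    resources: ["events"]
    verbs: ["list", "watch", "create", "update", "patch"]
```

Applying this with kubectl apply -f would make the existing csi-controller-attacher-binding and csi-controller-provisioner-binding resolve to real roles on a pre-1.10.5 cluster.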

@davidz627
Contributor

I think running this on your cluster will come back empty: kubectl get clusterroles | grep csi

It's odd that creating a ClusterRoleBinding doesn't do any validation to make sure the referenced ClusterRoles actually exist, though.

@davidz627
Contributor

Closing this issue as we have no repro steps; feel free to reopen if you are still seeing it.
