Hi Wei/Jayanth,
Thanks for sharing the details. I was able to fetch the API keys and deploy
the driver as suggested by @vivek and the GH page.
I have now run into another issue: the cloudstack-csi-node pod goes into
CrashLoopBackOff. Some more information on this is below:
{"level":"error","ts":1708932622.5365772,"caller":"zap/options.go:212","msg":"finished
unary call with code
Internal","grpc.start_time":"2024-02-26T07:30:22Z","grpc.request.deadline":"2024-02-26T07:32:22Z","system":"grpc","span.kind":"server","grpc.service":"csi.v1.Node","grpc.method":"NodeGetInfo","error":"rpc
error: code = Internal desc = Get
\"http://10.1.10.2:8080/client/api?apiKey=k83H56KFdhFqpv7cXPU11nkwxPt8f2rXnm1WWVIRdeErqZr72Pzp7ySmricPWs7FQQuMmClznDhMz7uqnRD2wA&command=listVirtualMachines&id=cf4940eb-52a4-4205-b056-1575926cb488&response=json&signature=t4jdPVL7jqhGt5pWC0kjx%2Bxzr3o%3D\":<http://10.1.10.2:8080/client/api?apiKey=k83H56KFdhFqpv7cXPU11nkwxPt8f2rXnm1WWVIRdeErqZr72Pzp7ySmricPWs7FQQuMmClznDhMz7uqnRD2wA&command=listVirtualMachines&id=cf4940eb-52a4-4205-b056-1575926cb488&response=json&signature=t4jdPVL7jqhGt5pWC0kjx%2Bxzr3o%3D\%22:>
dial tcp 10.1.10.2:8080: connect: connection
refused","grpc.code":"Internal","grpc.time_ms":1.138,"stacktrace":"github.com/grpc-ecosystem/go-grpc-middleware/logging/zap.DefaultMessageProducer\n\t/home/runner/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/logging/zap/options.go:212\ngithub.com/grpc-ecosystem/go-grpc-middleware/logging/zap.UnaryServerInterceptor.func1\n\t/home/runner/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/logging/zap/server_interceptors.go:39\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1\n\t/home/runner/go/pkg/mod/google.golang.org/[email protected]/server.go:1183\ngithub.com/container-storage-interface/spec/lib/go/csi._Node_NodeGetInfo_Handler\n\t/home/runner/go/pkg/mod/github.com/container-storage-interface/[email protected]/lib/go/csi/csi.pb.go:7351\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\t/home/runner/go/pkg/mod/google.golang.org/[email protected]/server.go:1372\ngoogle.golang.org/grpc.(*Server).handleStream\n\t/home/runner/go/pkg/mod/google.golang.org/[email protected]/server.go:1783\ngoogle.golang.org/grpc.(*Server).serveStreams.func2.1\n\t/home/runner/go/pkg/mod/google.golang.org/[email protected]/server.go:1016"}
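The "connection refused" suggests the csi-node pod cannot reach the CloudStack
management server at 10.1.10.2:8080 (the endpoint in the failing request). As a
quick sanity check I probed that endpoint from the node; this is a minimal
sketch, assuming curl is available and the host/port match your setup:

```shell
# Probe the CloudStack management server endpoint seen in the error log.
# 10.1.10.2:8080 comes from the failing request URL; adjust for your setup.
MGMT_HOST=10.1.10.2
MGMT_PORT=8080

# curl exits 0 on any HTTP response (even 401 Unauthorized), so this only
# tests reachability; "connection refused" or a timeout gives non-zero.
if curl -s -o /dev/null --connect-timeout 5 "http://${MGMT_HOST}:${MGMT_PORT}/client/api"; then
    echo "management server reachable"
else
    echo "management server unreachable"
fi
```

If this prints "unreachable" from the node but works from the control plane,
the problem is network path/firewall rather than the driver configuration.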
kubectl get pods -A
NAMESPACE              NAME                                                    READY   STATUS             RESTARTS        AGE
default                example-pod                                             0/1     Pending            0               87m
kube-system            cloud-controller-manager-574bcb86c-vzp4m                1/1     Running            0               155m
kube-system            cloudstack-csi-controller-7f89c8cd47-ftgnf              5/5     Running            0               150m
kube-system            cloudstack-csi-controller-7f89c8cd47-j4s4z              5/5     Running            0               150m
kube-system            cloudstack-csi-controller-7f89c8cd47-ptvss              5/5     Running            0               150m
kube-system            cloudstack-csi-node-56hxg                               2/3     CrashLoopBackOff   34 (99s ago)    150m
kube-system            cloudstack-csi-node-98cf2                               2/3     CrashLoopBackOff   34 (39s ago)    150m
kube-system            coredns-5dd5756b68-5wwxk                                1/1     Running            0               4h17m
kube-system            coredns-5dd5756b68-mbpwt                                1/1     Running            0               4h17m
kube-system            etcd-kspot-app-control-18de3ee6b6f                      1/1     Running            0               4h17m
kube-system            kube-apiserver-kspot-app-control-18de3ee6b6f            1/1     Running            0               4h17m
kube-system            kube-controller-manager-kspot-app-control-18de3ee6b6f   1/1     Running            0               4h17m
kube-system            kube-proxy-56r4l                                        1/1     Running            0               4h17m
kube-system            kube-proxy-mf6cc                                        1/1     Running            0               4h17m
kube-system            kube-scheduler-kspot-app-control-18de3ee6b6f            1/1     Running            0               4h17m
kube-system            weave-net-59t9z                                         2/2     Running            1 (4h17m ago)   4h17m
kube-system            weave-net-7xvpp                                         2/2     Running            0               4h17m
kubernetes-dashboard   dashboard-metrics-scraper-5657497c4c-g89lq              1/1     Running            0               4h17m
kubernetes-dashboard   kubernetes-dashboard-5b749d9495-fqplb                   1/1     Running            0               4h17m
kubectl get csinode
NAME                            DRIVERS   AGE
kspot-app-control-18de3ee6b6f   0         4h23m
kspot-app-node-18de3eeb7b7      0         4h23m
kubectl describe csinode
Name: kspot-app-control-18de3ee6b6f
Labels: <none>
Annotations: storage.alpha.kubernetes.io/migrated-plugins:
kubernetes.io/aws-ebs,kubernetes.io/azure-disk,kubernetes.io/azure-file,kubernetes.io/cinder,kubernetes.io/gce-pd,kubernetes.io/vsphere-vo...
CreationTimestamp: Mon, 26 Feb 2024 05:42:57 +0000
Spec:
Events: <none>
Name: kspot-app-node-18de3eeb7b7
Labels: <none>
Annotations: storage.alpha.kubernetes.io/migrated-plugins:
kubernetes.io/aws-ebs,kubernetes.io/azure-disk,kubernetes.io/azure-file,kubernetes.io/cinder,kubernetes.io/gce-pd,kubernetes.io/vsphere-vo...
CreationTimestamp: Mon, 26 Feb 2024 05:43:12 +0000
Spec:
Events: <none>
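Separately, to rule out bad keys I tried reproducing the signature seen in the
failing URL out-of-band. As I understand it, CloudStack signs API requests by
sorting the query parameters by name, lowercasing the encoded query string,
and computing a base64-encoded HMAC-SHA1 over it with the secret key. A sketch
with dummy credentials (the helper name and keys here are illustrative only,
not my real keys):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def cloudstack_signature(params: dict, secret_key: str) -> str:
    """Sign a CloudStack API request: sort parameters by name, lowercase
    the encoded query string, HMAC-SHA1 it with the secret key, and
    base64-encode the digest."""
    query = "&".join(
        f"{key}={quote(str(value), safe='*')}"
        for key, value in sorted(params.items())
    )
    digest = hmac.new(
        secret_key.encode("utf-8"),
        query.lower().encode("utf-8"),
        hashlib.sha1,
    ).digest()
    return base64.b64encode(digest).decode("utf-8")

# Dummy credentials for illustration only -- not the real keys.
params = {
    "apiKey": "dummy-api-key",
    "command": "listVirtualMachines",
    "response": "json",
}
sig = cloudstack_signature(params, "dummy-secret-key")
print(sig)
```

With the real keys this produced a valid signature, which again points at
connectivity rather than the credentials themselves.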
Thanks and Regards,
Bharat Saini
From: Wei ZHOU <[email protected]>
Date: Monday, 26 February 2024 at 1:52 AM
To: [email protected] <[email protected]>
Subject: Re: CKS Storage Provisioner Info
+1
Or use the api key of "admin" user.
-Wei
On Sun, Feb 25, 2024 at 7:57 PM Jayanth Reddy <[email protected]>
wrote:
> Hello Bharat,
> With your login as "admin" user, you should be able to generate keys for
> any user. Please do the below
>
> 1. Go to "Accounts"
> 2. Select the account named "admin"
> 3. Scroll down and click "users"
> 4. Select the "admin-kubeadmin" user
> 5. Then select the button for generation of keys.
>
> Please let me know if that helps.
>
> Thanks,
> Jayanth
>