[magnum] Persistent Volume Claim with cinder backend

Lingxian Kong anlin.kong at gmail.com
Thu Jan 24 22:39:56 UTC 2019


Hi,

I don't think you can use `openstack.org/standalone-cinder` without setting
up the standalone Cinder external provisioner[1]. Although
kubernetes.io/cinder is deprecated, it still works, both with the in-tree
OpenStack provider and with openstack-cloud-controller-manager[2], which is
supported in the Stein dev cycle and has already been backported to Rocky.
Regardless of which you choose, CSI is the future (I'm going to add that
support in Magnum, too).
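To illustrate, your StorageClass should keep working if you simply switch the
provisioner back to the in-tree one (a sketch based on your quoted config
below; the type/availability values are your own and may differ per cloud):

```yaml
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: cinder
  annotations:
    storageclass.beta.kubernetes.io/is-default-class: "true"
# Deprecated but functional in-tree provisioner; no extra
# external provisioner deployment is required for this one.
provisioner: kubernetes.io/cinder
parameters:
  type: volumes_hdd           # Cinder volume type (cloud-specific)
  availability: cinderAZ_ceph # Cinder availability zone (cloud-specific)
```

Your PersistentVolumeClaim can stay exactly as it is, since it only references
the StorageClass by name.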

[1]:
https://github.com/kubernetes/cloud-provider-openstack/blob/f056677572b2635632abcc7dbde459cdfc4432b9/docs/using-cinder-standalone-provisioner.md
[2]:
https://review.openstack.org/#/q/6c61a1a949615f6dc1df36f3098cd97466ac7238

Cheers,
Lingxian Kong


On Thu, Jan 24, 2019 at 7:39 PM Christian Zunker
<christian.zunker at codecentric.cloud> wrote:

> Hi,
>
> we are running Magnum Rocky.
> I tried to create a persistent volume claim and got it working with
> provisioner: kubernetes.io/cinder
> But it failed with provisioner: openstack.org/standalone-cinder
>
> The docs state kubernetes.io/cinder is deprecated:
>
> https://kubernetes.io/docs/concepts/storage/storage-classes/#openstack-cinder
>
> Which one should be used in Rocky?
>
>
> This is our complete config for this case:
> apiVersion: storage.k8s.io/v1
> kind: StorageClass
> metadata:
>   name: cinder
>   annotations:
>     storageclass.beta.kubernetes.io/is-default-class: "true"
>   labels:
>     kubernetes.io/cluster-service: "true"
>     addonmanager.kubernetes.io/mode: EnsureExists
> provisioner: openstack.org/standalone-cinder
> parameters:
>   type: volumes_hdd
>   availability: cinderAZ_ceph
> ---
> kind: PersistentVolumeClaim
> apiVersion: v1
> metadata:
>   name: myclaim
> spec:
>   accessModes:
>     - ReadWriteOnce
>   resources:
>     requests:
>       storage: 42Gi
>   storageClassName: cinder
>
> regards
> Christian
>
>
>