[openstack-dev] [nova] Help needed in debugging issue - ClientException: Unable to update the attachment. (HTTP 500)

Matt Riedemann mriedemos at gmail.com
Wed Apr 25 14:10:52 UTC 2018


On 4/25/2018 3:32 AM, Sreeram Vancheeswaran wrote:
> Hi team!
> 
> We are currently facing an issue in our out-of-tree driver nova-dpm [1] 
> with nova and cinder on master, where instance launch in devstack is 
> failing due to communication/time-out issues between nova and cinder. 
> We are unable to get to the root cause of the issue and would appreciate 
> some hints/directions on how to debug it further.
> 
> --> From nova-compute service: BuildAbortException: Build of instance 
> aborted: Unable to update the attachment. (HTTP 500) from the 
> nova-compute server (detailed logs here [2]).
> 
> --> From cinder-volume service: ERROR oslo_messaging.rpc.server 
> VolumeAttachmentNotFound: Volume attachment could not be found with 
> filter: attachment_id = 266ef7e1-4735-40f1-b704-509472f565cb.
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server  (detailed logs here [3])
> 
> Debugging steps done so far:-
> 
>   * Compared the package versions of the current devstack under test
>     with the **last succeeding job in our CI system** (to be exact, it
>     was for the patches https://review.openstack.org/#/c/458514/ and
>     https://review.openstack.org/#/c/458820/). However, the versions of
>     packages such as sqlalchemy, os-brick and oslo.* are exactly the
>     same on both systems.
>   * We used git bisect to move the nova and cinder trees back to
>     versions from on or before the date of our last succeeding CI run,
>     but we could still reproduce the same error.
>   * Our guess is that the db save operation during the volume
>     attachment update is failing, but we are unable to trace/debug to
>     that point in the RPC call. Any suggestions on how to debug this
>     scenario would be really helpful.
>   * We are running devstack master on Ubuntu 16.04.4.
> 
> 
> References
> 
> [1] https://github.com/openstack/nova-dpm
> 
> 
> [2] Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.volume.cinder 
> [None req-751d4586-cd97-4a8f-9423-f2bc4b1f1269 service nova] Update 
> attachment failed for attachment 266ef7e1-4735-40f1-b704-509472f565cb. 
> Error: Unable to update the attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce) Code: 500: ClientException: 
> Unable to update the attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [None req-751d4586-cd97-4a8f-9423-f2bc4b1f1269 service nova] [instance: 
> d761da60-7bb1-415e-b5b9-eaaed124d6d2] Instance failed block device 
> setup: ClientException: Unable to update the attachment. (HTTP 500) 
> (Request-ID: req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] Traceback (most recent 
> call last):
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/compute/manager.py", line 1577, in _prep_block_device
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] 
> wait_func=self._await_block_device_map_created)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 828, in 
> attach_block_devices
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] _log_and_attach(device)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 825, in _log_and_attach
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] 
> bdm.attach(*attach_args, **attach_kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 46, in wrapped
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     ret_val = 
> method(obj, context, *args, **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 618, in attach
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     virt_driver, 
> do_driver_attach)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py", 
> line 274, in inner
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     return f(*args, 
> **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 615, in _do_locked_attach
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     
> self._do_attach(*args, **_kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 600, in _do_attach
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     do_driver_attach)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/virt/block_device.py", line 514, in _volume_attach
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] 
> self['mount_device'])['connection_info']
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/volume/cinder.py", line 291, in wrapper
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     res = method(self, 
> ctx, *args, **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/volume/cinder.py", line 327, in wrapper
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     res = method(self, 
> ctx, attachment_id, *args, **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/volume/cinder.py", line 736, in attachment_update
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     'code': getattr(ex, 
> 'code', None)})
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 
> 220, in __exit__
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     self.force_reraise()
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 
> 196, in force_reraise
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] six.reraise(self.type_, 
> self.value, self.tb)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/volume/cinder.py", line 726, in attachment_update
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     attachment_id, 
> _connector)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/cinderclient/v3/attachments.py", 
> line 67, in update
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     resp = 
> self._update('/attachments/%s' % id, body)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/cinderclient/base.py", line 344, 
> in _update
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     resp, body = 
> self.api.client.put(url, body=body, **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/cinderclient/client.py", line 
> 206, in put
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     return 
> self._cs_request(url, 'PUT', **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/cinderclient/client.py", line 
> 191, in _cs_request
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     return 
> self.request(url, method, **kwargs)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/cinderclient/client.py", line 
> 177, in request
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     raise 
> exceptions.from_response(resp, body)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] ClientException: Unable 
> to update the attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [None req-751d4586-cd97-4a8f-9423-f2bc4b1f1269 service nova] [instance: 
> d761da60-7bb1-415e-b5b9-eaaed124d6d2] Build of instance 
> d761da60-7bb1-415e-b5b9-eaaed124d6d2 aborted: Unable to update the 
> attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce): BuildAbortException: Build of 
> instance d761da60-7bb1-415e-b5b9-eaaed124d6d2 aborted: Unable to update 
> the attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] Traceback (most recent 
> call last):
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/compute/manager.py", line 1839, in 
> _do_build_and_run_instance
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     filter_properties, 
> request_spec)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/compute/manager.py", line 2052, in 
> _build_and_run_instance
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] bdms=block_device_mapping)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 
> 220, in __exit__
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     self.force_reraise()
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 
> 196, in force_reraise
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] six.reraise(self.type_, 
> self.value, self.tb)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/compute/manager.py", line 2004, in 
> _build_and_run_instance
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     
> block_device_mapping) as resources:
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]     return self.gen.next()
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]   File 
> "/opt/stack/nova/nova/compute/manager.py", line 2211, in _build_resources
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] reason=e.format_message())
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2] BuildAbortException: 
> Build of instance d761da60-7bb1-415e-b5b9-eaaed124d6d2 aborted: Unable 
> to update the attachment. (HTTP 500) (Request-ID: 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce)
> Apr 25 06:41:57 zos057 nova-compute[6190]: ERROR nova.compute.manager 
> [instance: d761da60-7bb1-415e-b5b9-eaaed124d6d2]
> 
> 
> [3] Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments [req-f9f3364b-4dd8-4195-a60a-2f0e44c1f2ea 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce admin admin] Unable to update 
> the attachment.: MessagingTimeout: Timed out waiting for a reply to 
> message ID fe836528e2ea43edabe8004845837f4f
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments Traceback (most recent call last):
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/opt/stack/cinder/cinder/api/v3/attachments.py", line 228, in update
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     connector))
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/opt/stack/cinder/cinder/volume/api.py", line 2158, in attachment_update
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     attachment_ref.id))
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File "/opt/stack/cinder/cinder/rpc.py", line 
> 187, in _wrapper
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     return f(self, *args, **kwargs)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/opt/stack/cinder/cinder/volume/rpcapi.py", line 442, in attachment_update
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     attachment_id=attachment_id)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", 
> line 174, in call
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     retry=self.retry)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", 
> line 131, in _send
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     timeout=timeout, retry=retry)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", 
> line 559, in send
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     retry=retry)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", 
> line 548, in _send
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     result = self._waiter.wait(msg_id, timeout)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", 
> line 440, in wait
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     message = self.waiters.get(msg_id, 
> timeout=timeout)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", 
> line 328, in get
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments     'to message ID %s' % msg_id)
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments MessagingTimeout: Timed out waiting for a 
> reply to message ID fe836528e2ea43edabe8004845837f4f
> Apr 25 06:41:57 zos057 devstack at c-api.service[11490]: ERROR 
> cinder.api.v3.attachments
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server [req-f9f3364b-4dd8-4195-a60a-2f0e44c1f2ea 
> req-550f6b9c-7f22-4dce-adbe-f6843e0aa3ce admin None] Exception during 
> message handling: VolumeAttachmentNotFound: Volume attachment could not 
> be found with filter: attachment_id = 266ef7e1-4735-40f1-b704-509472f565cb.
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server Traceback (most recent call last):
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", 
> line 163, in _process_incoming
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", 
> line 220, in dispatch
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, 
> ctxt, args)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", 
> line 190, in _do_dispatch
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     result = func(ctxt, **new_args)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/opt/stack/cinder/cinder/volume/manager.py", line 4378, in 
> attachment_update
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     connector)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/opt/stack/cinder/cinder/volume/manager.py", line 4349, in 
> _connection_create
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server self.db.volume_attachment_update(ctxt, 
> attachment.id, values)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File "/opt/stack/cinder/cinder/db/api.py", 
> line 365, in volume_attachment_update
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     return 
> IMPL.volume_attachment_update(context, attachment_id, values)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/opt/stack/cinder/cinder/db/sqlalchemy/api.py", line 182, in wrapper
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     return f(*args, **kwargs)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server   File 
> "/opt/stack/cinder/cinder/db/sqlalchemy/api.py", line 2674, in 
> volume_attachment_update
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server     filter='attachment_id = ' + attachment_id)
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server VolumeAttachmentNotFound: Volume attachment 
> could not be found with filter: attachment_id = 
> 266ef7e1-4735-40f1-b704-509472f565cb.
> Apr 25 06:42:47 zos057 cinder-volume[11475]: ERROR 
> oslo_messaging.rpc.server
> 
> 
> -- 
> ---------------------------------------------------------------------------------------------------
> Sreeram Vancheeswaran
> System z Firmware - Openstack Development
> IBM Systems & Technology Lab, Bangalore, India
> Phone:  +91 80 40660826     Mob: +91-9341411511
> Email :sreeram at linux.vnet.ibm.com
> 
> 
> 
> 

You're doing boot from volume, so nova-api should be creating the volume 
attachment record [1], and then nova-compute updates the attachment with 
the compute host connector, which also creates the export in the backend 
storage via cinder. For whatever reason, the attachment_id nova-compute 
is passing to cinder is not found, but I wouldn't know why.

You'll likely need to trace the request through the nova-api, 
nova-compute, cinder-api and cinder-volume logs, following 
266ef7e1-4735-40f1-b704-509472f565cb, which is the attachment ID. Like I 
said, nova-api creates it, stores it in the block_device_mappings table, 
and nova-compute references it later when actually attaching the volume 
to the instance on the compute host. The fact that you're getting down 
to cinder-volume does mean that when nova-compute called cinder-api to 
update the volume attachment, cinder-api found the attachment in the 
database; otherwise it would have returned a 404 to nova-compute. Maybe 
you're hitting some weird race?
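
If it helps, here's a rough, untested sketch of how you could check both 
sides of that from the databases on the devstack node. The connection 
URLs, credentials and the pymysql driver are assumptions about your 
setup (replace <db-password> with your DATABASE_PASSWORD); the 
volume_attachment and block_device_mappings tables/columns are the ones 
the new attach flow uses, so adjust as needed:

# Hedged sketch: check whether the attachment record exists on the cinder
# side and what nova recorded in block_device_mappings for it.
# Assumes SQLAlchemy + PyMySQL and default devstack database names.
from sqlalchemy import create_engine, text

ATTACHMENT_ID = '266ef7e1-4735-40f1-b704-509472f565cb'

cinder_db = create_engine('mysql+pymysql://root:<db-password>@127.0.0.1/cinder')
nova_db = create_engine('mysql+pymysql://root:<db-password>@127.0.0.1/nova_cell1')

with cinder_db.connect() as conn:
    rows = conn.execute(
        text("SELECT id, volume_id, attach_status, deleted "
             "FROM volume_attachment WHERE id = :aid"),
        {'aid': ATTACHMENT_ID}).fetchall()
    print('cinder volume_attachment rows:', rows)

with nova_db.connect() as conn:
    rows = conn.execute(
        text("SELECT instance_uuid, volume_id, attachment_id, deleted "
             "FROM block_device_mappings WHERE attachment_id = :aid"),
        {'aid': ATTACHMENT_ID}).fetchall()
    print('nova block_device_mappings rows:', rows)

If the cinder row exists but is marked deleted (or is gone) by the time 
cinder-volume processes the RPC, that would line up with the 
VolumeAttachmentNotFound you're seeing.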

It's also weird that cinder-api is hitting an RPC messaging timeout even 
though cinder-volume clearly failed; that failure should be raised back 
up to cinder-api and spewed back to the caller (nova-compute) as a 500 
error.
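
One more hedged suggestion while you're digging into that: the timeout 
cinder-api hit is oslo.messaging's rpc_response_timeout, which defaults 
to 60 seconds, and in [3] the cinder-volume failure is logged roughly 50 
seconds after cinder-api had already given up, so the attachment_update 
call may simply be outliving the API's patience (e.g. a slow export 
create on the backend). Temporarily bumping the timeout in cinder.conf 
on the API node would at least let the real error reach cinder-api 
instead of a timeout, something like:

[DEFAULT]
# default is 60 seconds; raise only temporarily while debugging
rpc_response_timeout = 300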

Also, I should probably confirm: are you booting from an existing 
volume, or from an image or volume snapshot where nova-compute creates 
the volume in Cinder and then attaches it to the server? If it's the 
latter, that flow doesn't yet create volume attachment records, which is 
what patch [2] is for.
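
To make that distinction concrete, here is a hedged illustration (the 
field names are the standard block_device_mapping_v2 ones, the values 
are placeholders) of what the two cases look like in the boot request:

# (a) boot from an existing volume: nova-api creates the volume attachment
#     record up front and nova-compute later updates it with the connector.
bdm_existing_volume = {
    'source_type': 'volume',
    'destination_type': 'volume',
    'uuid': '<existing-volume-id>',
    'boot_index': 0,
}

# (b) boot from an image (or snapshot) with a volume destination: nova-compute
#     asks cinder to create the volume during the build; this is the flow that
#     doesn't use attachment records yet and that patch [2] converts.
bdm_image_to_volume = {
    'source_type': 'image',
    'destination_type': 'volume',
    'uuid': '<image-id>',
    'volume_size': 10,
    'boot_index': 0,
}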

[1] 
https://github.com/openstack/nova/blob/0a642e2eee8d430ddcccf2947aedcca1a0a0b005/nova/compute/api.py#L3830
[2] https://review.openstack.org/#/c/541420/

-- 

Thanks,

Matt


