Re: [cinder] Error when creating backups from iscsi volume
On 11/03, Rishat Azizov wrote:
Hi, Gorka,
Thanks. I see multiple "multipath -f" calls. Logs in attachments.
Hi,

There are multiple things going on here:

1. There is a bug in os-brick, because disconnect_volume should not fail, since it is being called with force=True and ignore_errors=True. The issue is that this call [1] is not wrapped in the ExceptionChainer context manager, and it should not even be a flush call; it should be a call to "multipathd remove map $map" instead.

2. The way the multipath code is written [2][3], the error we see about "3624a93705842cfae35d7483200015fce is not a multipath device" can mean 2 different things: either it is not a multipath device, or an error happened. So we don't really know what happened without enabling more verbose multipathd log levels.

3. The "multipath -f" call should not be failing in the first place, because the failure is happening on disconnecting the source volume, which has no data buffered to be written and therefore no reason to fail the flush (unless it's using a friendly name). It could be that the first flush fails with a timeout (maybe because an extend operation is happening), but multipathd keeps trying to flush it in the background, and when it succeeds it removes the multipath device, which makes the following calls fail. If that's the case we would need to change the retry from automatic [4] to manual and check between calls whether the device has already been removed.

The first issue is definitely a bug, the 2nd one is something that could be changed in the deployment to try to get additional information on the failure, and the 3rd one could be a bug.

I'll see if I can find someone who wants to work on the 1st and 3rd points.

Cheers,
Gorka.

[1]: https://github.com/openstack/os-brick/blob/e15edf6c17449899ec8401c37482f7cb5...
[2]: https://github.com/opensvc/multipath-tools/blob/db4804bc7393f2482448bdd87013...
[3]: https://github.com/opensvc/multipath-tools/blob/db4804bc7393f2482448bdd87013...
[4]: https://github.com/openstack/os-brick/blob/e15edf6c17449899ec8401c37482f7cb5...
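(For illustration, a minimal self-contained sketch of points 1 and 3 above — not the actual os-brick patch: calling "multipathd remove map" inside os-brick's ExceptionChainer context, with a manual retry that checks between attempts whether multipathd already removed the map in the background. subprocess stands in for os-brick's privileged executor here, and detecting removal via /dev/mapper is an assumption made for the example.)

    import os
    import subprocess
    import time

    from os_brick import exception


    def remove_multipath_map(map_name, force=True, ignore_errors=True,
                             attempts=3, interval=5):
        # Point 1: collect the failure in an ExceptionChainer so that with
        # force=True it is recorded instead of propagating immediately.
        exc = exception.ExceptionChainer()
        with exc.context(force, 'Removing multipath map %s failed', map_name):
            for attempt in range(attempts):
                # Point 3: if a previous (timed-out) flush succeeded in the
                # background, multipathd already removed the map; treat that
                # as success instead of retrying and failing.
                if not os.path.exists('/dev/mapper/%s' % map_name):
                    return
                try:
                    subprocess.run(['multipathd', 'remove', 'map', map_name],
                                   check=True)
                    return
                except subprocess.CalledProcessError:
                    if attempt == attempts - 1:
                        raise  # caught by the chainer when force=True
                    time.sleep(interval)
        if exc and not ignore_errors:
            raise exc

With force=True and ignore_errors=True this mirrors the contract disconnect_volume is called with in the traceback above: the failure is logged and chained, but never propagated.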
Thu, Mar 9, 2023 at 15:55, Gorka Eguileor <geguileo@redhat.com>:
On 06/03, Rishat Azizov wrote:
Hi,
It works with smaller volumes.
multipath.conf attached to this email.
Cinder version - 18.2.0 Wallaby
Hi,
After giving it some thought I think I may know what is going on.
If you have DEBUG logs enabled in cinder-backup when it fails, how many calls to "multipath -f" from os-brick do you see in the cinder-backup log: only one, or more?
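(As an example, one way to count them — assuming the DEBUG exec traces go to the usual cinder-backup log file; the exact path varies by distribution:)

    grep -c "Running cmd (subprocess): multipath -f" /var/log/cinder/backup.log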
Cheers, Gorka.
Mon, Mar 6, 2023 at 17:35, Gorka Eguileor <geguileo@redhat.com>:
On 16/02, Rishat Azizov wrote:
Hello!
We have an error when creating backups from an iSCSI volume. It usually happens with large backups over 100GB.
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server [req-f6619913-6f96-4226-8d75-2da3fca722f1 23de1b92e7674cf59486f07ac75b886b a7585b47d1f143e9839c49b4e3bbe1b4 - - -] Exception during message handling: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Command: multipath -f 3624a93705842cfae35d7483200015ec6
Exit code: 1
Stdout: ''
Stderr: 'Feb 16 00:22:45 | 3624a93705842cfae35d7483200015ec6 is not a multipath device\n'
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/utils.py", line 890, in wrapper
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 410, in create_backup
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     volume_utils.update_backup_error(backup, str(err))
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     raise self.value
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 399, in create_backup
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     updates = self._run_backup(context, backup, volume)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 493, in _run_backup
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     ignore_errors=True)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 1066, in _detach_device
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     force=force, ignore_errors=ignore_errors)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 141, in trace_logging_wrapper
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 360, in inner
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 880, in disconnect_volume
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     is_disconnect_call=True)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 942, in _cleanup_connection
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     self._linuxscsi.flush_multipath_device(multipath_name)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py", line 382, in flush_multipath_device
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     root_helper=self._root_helper)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     result = self.__execute(*args, **kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 172, in execute
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return execute_root(*cmd, **kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 247, in _wrap
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     return self.channel.remote_call(name, args, kwargs)
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 224, in remote_call
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server     raise exc_type(*result[2])
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server Command: multipath -f 3624a93705842cfae35d7483200015ec6
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server Exit code: 1
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server Stdout: ''
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server Stderr: 'Feb 16 00:22:45 | 3624a93705842cfae35d7483200015ec6 is not a multipath device\n'
2023-02-16 00:23:25.298 1920 ERROR oslo_messaging.rpc.server
Could you please help with this error?
Hi,
Does it work for smaller volumes or does it also fail?
What are your defaults in your /etc/multipath.conf file?
What Cinder release are you using?
Cheers, Gorka.
defaults {
    user_friendly_names no
    find_multipaths yes
    enable_foreign "^$"
}

blacklist_exceptions {
    property "(SCSI_IDENT_|ID_WWN)"
}

blacklist {
}

devices {
    device {
        vendor "PURE"
        product "FlashArray"
        fast_io_fail_tmo 10
        path_grouping_policy "group_by_prio"
        failback "immediate"
        prio "alua"
        hardware_handler "1 alua"
        max_sectors_kb 4096
    }
}
2023-03-10 16:42:41.785 2878341 DEBUG cinder.backup.drivers.ceph [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Transferred chunk 1600 of 1600 (84387K/s) _transfer_data /usr/lib/python3.6/site-packages/cinder/backup/drivers/ceph.py:426 2023-03-10 16:42:42.107 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Getting metadata type 'volume-base-metadata' _save_vol_base_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:79 2023-03-10 16:42:42.139 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Completed fetching metadata type 'volume-base-metadata' _save_vol_base_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:98 2023-03-10 16:42:42.139 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Getting metadata type 'volume-metadata' _save_vol_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:109 2023-03-10 16:42:42.147 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] No metadata type 'volume-metadata' available _save_vol_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:123 2023-03-10 16:42:42.148 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Getting metadata type 'volume-glance-metadata' _save_vol_glance_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:132 2023-03-10 16:42:42.156 2878341 DEBUG cinder.backup.driver [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Completed fetching metadata type 'volume-glance-metadata' _save_vol_glance_meta /usr/lib/python3.6/site-packages/cinder/backup/driver.py:145 2023-03-10 16:42:42.157 2878341 DEBUG cinder.backup.drivers.ceph [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Backing up metadata for volume 7e9beead-01a6-488e-bc2f-47093a8cdbd9. _backup_metadata /usr/lib/python3.6/site-packages/cinder/backup/drivers/ceph.py:946 2023-03-10 16:42:42.251 2878341 DEBUG cinder.backup.drivers.ceph [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Backup '4f32e959-509d-49d9-9674-acc3776b4b6a' of volume 7e9beead-01a6-488e-bc2f-47093a8cdbd9 finished. 
backup /usr/lib/python3.6/site-packages/cinder/backup/drivers/ceph.py:1011 2023-03-10 16:42:42.252 2878341 DEBUG oslo_concurrency.processutils [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf chown 0 /dev/dm-5 execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:42:42.613 2878341 DEBUG oslo_concurrency.processutils [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf chown 0 /dev/dm-5" returned: 0 in 0.361s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:42:42.614 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] ==> disconnect_volume: call "{'args': (<os_brick.initiator.connectors.iscsi.ISCSIConnector object at 0x7fe6925c3ac8>, {'target_discovered': False, 'discard': True, 'target_luns': [1, 1, 1, 1], 'target_iqns': ['iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'], 'target_portals': ['10.224.18.46:3260', '10.224.18.47:3260', '10.224.18.48:3260', '10.224.18.49:3260'], 'wwn': '3624a93705842cfae35d7483200015fce', 'qos_specs': {'total_bytes_sec': '524288000', 'read_iops_sec': '12800', 'write_iops_sec': '6400'}, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, {'type': 'block', 'scsi_wwn': '3624a93705842cfae35d7483200015fce', 'path': '/dev/dm-5', 'multipath_id': '3624a93705842cfae35d7483200015fce'}), 'kwargs': {'force': True, 'ignore_errors': True}}" trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:150 2023-03-10 16:42:42.616 2878341 DEBUG oslo_concurrency.lockutils [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: waited 0.001s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:359 2023-03-10 16:42:42.616 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Getting connected devices for (ips,iqns,luns)=[('10.224.18.46:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 1), ('10.224.18.47:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 1), ('10.224.18.48:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 1), ('10.224.18.49:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1', 1)] _get_connection_devices /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:821 2023-03-10 16:42:42.618 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:42:42.631 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node" returned: 0 in 0.013s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:42:42.632 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, 
('10.224.18.46:3260,4294967295 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1\n10.224.18.47:3260,4294967295 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1\n10.224.18.48:3260,4294967295 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1\n10.224.18.49:3260,4294967295 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:42:42.634 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:42:42.647 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.013s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:42:42.647 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('tcp: [234] 10.224.18.46:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash)\ntcp: [235] 10.224.18.47:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash)\ntcp: [236] 10.224.18.49:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash)\ntcp: [237] 10.224.18.48:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash)\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:42:42.649 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('-m', 'session'): stdout=tcp: [234] 10.224.18.46:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [235] 10.224.18.47:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [236] 10.224.18.49:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [237] 10.224.18.48:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) stderr= _run_iscsiadm_bare /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1155 2023-03-10 16:42:42.649 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsi session list stdout=tcp: [234] 10.224.18.46:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [235] 10.224.18.47:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [236] 10.224.18.49:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) tcp: [237] 10.224.18.48:3260,4013 iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 (non-flash) stderr= _run_iscsi_session /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1144 2023-03-10 16:42:42.653 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Resulting device map defaultdict(<function ISCSIConnector._get_connection_devices.<locals>.<lambda> at 0x7fe6921bf268>, {('10.224.18.46:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'): ({'sdb'}, set()), ('10.224.18.47:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'): ({'sda'}, set()), ('10.224.18.48:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'): ({'sdc'}, set()), ('10.224.18.49:3260', 
'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'): ({'sdd'}, set())}) _get_connection_devices /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:853 2023-03-10 16:42:42.654 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Removing multipathed devices sdc, sdd, sda, sdb remove_connection /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:308 2023-03-10 16:42:42.655 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Flush multipath device 3624a93705842cfae35d7483200015fce flush_multipath_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:376 2023-03-10 16:42:42.657 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:42:46.675 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 4.019s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:42:46.676 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Retrying. execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:478 2023-03-10 16:42:46.676 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:42:46.685 2880468 DEBUG os_brick.privileged.rootwrap [-] Sleeping for 20 seconds on_execute /usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py:106 2023-03-10 16:42:55.994 2878341 DEBUG oslo_service.periodic_task [req-b7d7b4ae-b092-40bf-9105-97bede752389 - - - - -] Running periodic task BackupManager.publish_service_capabilities run_periodic_tasks /usr/lib/python3.6/site-packages/oslo_service/periodic_task.py:211 2023-03-10 16:42:55.995 2878341 DEBUG cinder.manager [req-b7d7b4ae-b092-40bf-9105-97bede752389 - - - - -] Notifying Schedulers of capabilities ... _publish_service_capabilities /usr/lib/python3.6/site-packages/cinder/manager.py:197 2023-03-10 16:43:06.707 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 20.031s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:06.707 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Retrying. 
execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:478 2023-03-10 16:43:06.707 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:06.714 2880468 DEBUG os_brick.privileged.rootwrap [-] Sleeping for 40 seconds on_execute /usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py:106 2023-03-10 16:43:46.753 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 40.046s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.754 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Not Retrying. execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:474 2023-03-10 16:43:46.754 2880468 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[140628271231200]: Unexpected error while running command. Command: multipath -f 3624a93705842cfae35d7483200015fce Exit code: 1 Stdout: '' Stderr: 'Mar 10 16:43:06 | 3624a93705842cfae35d7483200015fce is not a multipath device\n' _process_cmd /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:490 Traceback (most recent call last): File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 485, in _process_cmd ret = func(*f_args, **f_kwargs) File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 249, in _wrap return func(*args, **kwargs) File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 197, in execute_root return custom_execute(*cmd, shell=False, run_as_root=False, **kwargs) File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 146, in custom_execute on_completion=on_completion, *cmd, **kwargs) File "/usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py", line 441, in execute cmd=sanitized_cmd) oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. Command: multipath -f 3624a93705842cfae35d7483200015fce Exit code: 1 Stdout: '' Stderr: 'Mar 10 16:43:06 | 3624a93705842cfae35d7483200015fce is not a multipath device\n' 2023-03-10 16:43:46.757 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (5, 'oslo_concurrency.processutils.ProcessExecutionError', ('', 'Mar 10 16:43:06 | 3624a93705842cfae35d7483200015fce is not a multipath device\n', 1, 'multipath -f 3624a93705842cfae35d7483200015fce', None)) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.758 2878341 WARNING os_brick.exception [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Flushing 3624a93705842cfae35d7483200015fce failed: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. 
2023-03-10 16:43:46.761 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sdc execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.772 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sdc" returned: 0 in 0.011s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.772 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('ok\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.774 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Remove SCSI device /dev/sdc with /sys/block/sdc/device/delete remove_scsi_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:75 2023-03-10 16:43:46.774 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sdc/device/delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.814 2880468 DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sdc/device/delete" returned: 0 in 0.040s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.815 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('1', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.817 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sdd execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.828 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sdd" returned: 0 in 0.011s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.828 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('ok\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.829 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Remove SCSI device /dev/sdd with /sys/block/sdd/device/delete remove_scsi_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:75 2023-03-10 16:43:46.831 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sdd/device/delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.869 2880468 DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sdd/device/delete" returned: 0 in 0.039s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.870 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('1', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.872 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sda execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.883 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sda" returned: 0 in 0.011s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.884 2880468 DEBUG oslo.privsep.daemon [-] privsep: 
reply[140628271231200]: (4, ('ok\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.885 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Remove SCSI device /dev/sda with /sys/block/sda/device/delete remove_scsi_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:75 2023-03-10 16:43:46.887 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sda/device/delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.929 2880468 DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sda/device/delete" returned: 0 in 0.042s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.930 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('1', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.931 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sdb execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.942 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sdb" returned: 0 in 0.011s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.943 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('ok\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.944 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Remove SCSI device /dev/sdb with /sys/block/sdb/device/delete remove_scsi_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:75 2023-03-10 16:43:46.945 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sdb/device/delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.979 2880468 DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sdb/device/delete" returned: 0 in 0.034s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.979 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('1', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.980 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Checking to see if SCSI volumes sdc, sdd, sda, sdb have been removed. wait_for_volumes_removal /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:83 2023-03-10 16:43:46.981 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] SCSI volumes sdc, sdd, sda, sdb have been removed. 
wait_for_volumes_removal /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:92 2023-03-10 16:43:46.982 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Disconnecting from: [('10.224.18.46:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'), ('10.224.18.47:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'), ('10.224.18.48:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1'), ('10.224.18.49:3260', 'iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1')] _disconnect_connection /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1132 2023-03-10 16:43:46.983 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --op update -n node.startup -v manual execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:46.996 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --op update -n node.startup -v manual" returned: 0 in 0.013s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:46.996 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:46.997 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:46.998 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --logout execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.026 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --logout" returned: 0 in 0.028s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.027 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('Logging out of session [sid: 234, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.46,3260]\nLogout of [sid: 234, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.46,3260] successful.\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.028 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--logout',): stdout=Logging out of session [sid: 234, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.46,3260] Logout of [sid: 234, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.46,3260] successful. 
stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.029 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --op delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.041 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.46:3260 --op delete" returned: 0 in 0.012s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.041 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.042 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'delete'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.043 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --op update -n node.startup -v manual execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.055 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --op update -n node.startup -v manual" returned: 0 in 0.012s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.055 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.056 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.057 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --logout execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.084 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --logout" returned: 0 in 0.026s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.084 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('Logging out of session [sid: 235, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.47,3260]\nLogout of [sid: 235, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.47,3260] successful.\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.086 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--logout',): 
stdout=Logging out of session [sid: 235, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.47,3260] Logout of [sid: 235, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.47,3260] successful. stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.088 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --op delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.101 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.47:3260 --op delete" returned: 0 in 0.013s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.102 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.103 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'delete'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.104 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --op update -n node.startup -v manual execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.114 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --op update -n node.startup -v manual" returned: 0 in 0.010s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.114 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.115 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.116 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --logout execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.140 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --logout" returned: 0 in 0.024s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.140 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('Logging out of session [sid: 237, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.48,3260]\nLogout of [sid: 237, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.48,3260] successful.\n', '')) _call_back 
/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.141 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--logout',): stdout=Logging out of session [sid: 237, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.48,3260] Logout of [sid: 237, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.48,3260] successful. stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.142 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --op delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.153 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.48:3260 --op delete" returned: 0 in 0.010s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.153 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.154 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'delete'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.155 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --op update -n node.startup -v manual execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.165 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --op update -n node.startup -v manual" returned: 0 in 0.010s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.165 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.166 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.167 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --logout execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.189 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --logout" returned: 0 in 0.022s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.190 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('Logging out 
of session [sid: 236, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.49,3260]\nLogout of [sid: 236, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.49,3260] successful.\n', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.191 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--logout',): stdout=Logging out of session [sid: 236, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.49,3260] Logout of [sid: 236, target: iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1, portal: 10.224.18.49,3260] successful. stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.192 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --op delete execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.202 2880468 DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-06.com.purestorage:flasharray.55893eb505d1d2a1 -p 10.224.18.49:3260 --op delete" returned: 0 in 0.010s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.202 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (4, ('', '')) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511 2023-03-10 16:43:47.203 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] iscsiadm ('--op', 'delete'): stdout= stderr= _run_iscsiadm /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:1009 2023-03-10 16:43:47.204 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Flushing again multipath 3624a93705842cfae35d7483200015fce now that we removed the devices. _cleanup_connection /usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py:941 2023-03-10 16:43:47.204 2878341 DEBUG os_brick.initiator.linuxscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Flush multipath device 3624a93705842cfae35d7483200015fce flush_multipath_device /usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py:376 2023-03-10 16:43:47.205 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.215 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 0.010s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:43:47.216 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Retrying. 
execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:478 2023-03-10 16:43:47.216 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:43:47.221 2880468 DEBUG os_brick.privileged.rootwrap [-] Sleeping for 20 seconds on_execute /usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py:106 2023-03-10 16:43:55.994 2878341 DEBUG oslo_service.periodic_task [req-b7d7b4ae-b092-40bf-9105-97bede752389 - - - - -] Running periodic task BackupManager.publish_service_capabilities run_periodic_tasks /usr/lib/python3.6/site-packages/oslo_service/periodic_task.py:211 2023-03-10 16:43:55.995 2878341 DEBUG cinder.manager [req-b7d7b4ae-b092-40bf-9105-97bede752389 - - - - -] Notifying Schedulers of capabilities ... _publish_service_capabilities /usr/lib/python3.6/site-packages/cinder/manager.py:197 2023-03-10 16:44:07.242 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 20.026s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:44:07.242 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Retrying. execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:478 2023-03-10 16:44:07.242 2880468 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipath -f 3624a93705842cfae35d7483200015fce execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:384 2023-03-10 16:44:07.249 2880468 DEBUG os_brick.privileged.rootwrap [-] Sleeping for 40 seconds on_execute /usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py:106 2023-03-10 16:44:47.287 2880468 DEBUG oslo_concurrency.processutils [-] CMD "multipath -f 3624a93705842cfae35d7483200015fce" returned: 1 in 40.045s execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:423 2023-03-10 16:44:47.287 2880468 DEBUG oslo_concurrency.processutils [-] 'multipath -f 3624a93705842cfae35d7483200015fce' failed. Not Retrying. execute /usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py:474 2023-03-10 16:44:47.288 2880468 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[140628271231200]: Unexpected error while running command. Command: multipath -f 3624a93705842cfae35d7483200015fce Exit code: 1 Stdout: '' Stderr: 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n' _process_cmd /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:490 Traceback (most recent call last): File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 485, in _process_cmd ret = func(*f_args, **f_kwargs) File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 249, in _wrap return func(*args, **kwargs) File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 197, in execute_root return custom_execute(*cmd, shell=False, run_as_root=False, **kwargs) File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 146, in custom_execute on_completion=on_completion, *cmd, **kwargs) File "/usr/lib/python3.6/site-packages/oslo_concurrency/processutils.py", line 441, in execute cmd=sanitized_cmd) oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command. 
Command: multipath -f 3624a93705842cfae35d7483200015fce
Exit code: 1
Stdout: ''
Stderr: 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n'
2023-03-10 16:44:47.288 2880468 DEBUG oslo.privsep.daemon [-] privsep: reply[140628271231200]: (5, 'oslo_concurrency.processutils.ProcessExecutionError', ('', 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n', 1, 'multipath -f 3624a93705842cfae35d7483200015fce', None)) _call_back /usr/lib/python3.6/site-packages/oslo_privsep/daemon.py:511
2023-03-10 16:44:47.289 2878341 DEBUG oslo_concurrency.lockutils [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Lock "connect_volume" released by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: held 124.674s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:371
2023-03-10 16:44:47.289 2878341 DEBUG os_brick.initiator.connectors.iscsi [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] <== disconnect_volume: exception (124674ms) ProcessExecutionError('', 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n', 1, 'multipath -f 3624a93705842cfae35d7483200015fce', None) trace_logging_wrapper /usr/lib/python3.6/site-packages/os_brick/utils.py:160
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server [req-bdebfdef-daf3-4250-8594-9ed91adb7c00 f91779ad06064ebfbeeff54de535a6cd 8a676a415f9541c59705a373a36b0ec4 - - -] Exception during message handling: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Command: multipath -f 3624a93705842cfae35d7483200015fce
Exit code: 1
Stdout: ''
Stderr: 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n'
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/utils.py", line 890, in wrapper
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 410, in create_backup
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     volume_utils.update_backup_error(backup, str(err))
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     raise self.value
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 399, in create_backup
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     updates = self._run_backup(context, backup, volume)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 493, in _run_backup
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     ignore_errors=True)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/cinder/backup/manager.py", line 1066, in _detach_device
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     force=force, ignore_errors=ignore_errors)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 154, in trace_logging_wrapper
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 360, in inner
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 880, in disconnect_volume
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     is_disconnect_call=True)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 942, in _cleanup_connection
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     self._linuxscsi.flush_multipath_device(multipath_name)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/linuxscsi.py", line 382, in flush_multipath_device
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     root_helper=self._root_helper)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     result = self.__execute(*args, **kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 172, in execute
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     return execute_root(*cmd, **kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 247, in _wrap
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     return self.channel.remote_call(name, args, kwargs)
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 224, in remote_call
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server     raise exc_type(*result[2])
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server Command: multipath -f 3624a93705842cfae35d7483200015fce
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server Exit code: 1
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server Stdout: ''
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server Stderr: 'Mar 10 16:44:07 | 3624a93705842cfae35d7483200015fce is not a multipath device\n'
2023-03-10 16:44:47.314 2878341 ERROR oslo_messaging.rpc.server
Hi Gorka,

Thanks! I fixed the issue by adding the uxsock_timeout directive to the multipathd config:

uxsock_timeout 10000

I did this because in the multipathd logs I saw this error:

3624a93705842cfae35d7483200015fd8: map flushed
cli cmd 'del map 3624a93705842cfae35d7483200015fd8' timeout reached after 4.858561 secs

Now large disk backups work fine.

2. This happens because, despite the timeout of the first attempt and the exit code 1, the multipath device was in fact disconnected, so the next attempts failed with the "is not a multipath device" error, since the multipath device had already been removed.
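For anyone hitting the same problem, this is roughly what the change looks like in /etc/multipath.conf. A minimal sketch, assuming the directive goes in the defaults section; the value is in milliseconds, so 10000 gives multipathd CLI commands 10 seconds to complete:

    defaults {
        # Allow multipathd CLI commands such as 'del map' up to 10s
        # before the client side times out (merge with any existing
        # defaults section rather than adding a second one).
        uxsock_timeout 10000
    }

multipathd then needs to pick up the new configuration, e.g. with 'multipathd reconfigure'.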
Hi,

That's a nice workaround until we fix it upstream!!

Thanks for confirming my suspicions were right. This is the 3rd thing I mentioned was happening: the flush call failed, but it actually removed the device.

We'll proceed to fix the flushing code in master.

Cheers, Gorka.
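To make that failure mode concrete, here is a minimal sketch of the manual-retry idea, not the actual os-brick patch: re-check between flush attempts whether multipathd already removed the map in the background, and treat that as success. The helper name is hypothetical, and it assumes the map name under /dev/mapper matches the WWID (i.e. no friendly names):

    import os
    import subprocess
    import time

    def flush_multipath_device(wwid, attempts=3, interval=5):
        """Flush a multipath map, tolerating a flush that timed out
        but still removed the device in the background (sketch)."""
        dm_path = '/dev/mapper/%s' % wwid
        result = None
        for _ in range(attempts):
            if not os.path.exists(dm_path):
                # The map is already gone: an earlier flush that
                # "failed" with a timeout actually succeeded, so we
                # treat the removal as done instead of retrying.
                return
            result = subprocess.run(
                ['multipath', '-f', wwid],
                stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                universal_newlines=True)
            if result.returncode == 0:
                return
            time.sleep(interval)
        # Re-check once more before giving up, for the same reason.
        if os.path.exists(dm_path):
            raise RuntimeError('Failed to flush %s: %s'
                               % (wwid, result.stderr))

This is only the shape of the fix; the actual upstream change goes through os-brick's privsep and locking machinery, and is in the reviews linked below.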
Hi Gorka and Rishat,

As discussed with Gorka, I will be working on the issues reported. I've filed 2 bugs, for cases 1) and 3), since we aren't sure about case 2) yet.

*Bug 1*: https://bugs.launchpad.net/os-brick/+bug/2012251
*Fix 1*: https://review.opendev.org/c/openstack/os-brick/+/878045

*Bug 2*: https://bugs.launchpad.net/os-brick/+bug/2012352
*Fix 2*: https://review.opendev.org/c/openstack/os-brick/+/878242

I'm not 100% sure that the approach in *Fix 2* is the best way to do it, but it works with my test scenarios, and reviews are always appreciated.

Thanks
Rajat Dhasmana
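For readers following along, the gist of *Fix 1* is to stop letting the flush abort a forced disconnect. A rough sketch of the pattern, based on how os-brick's ExceptionChainer is used elsewhere in the connectors; the function name is hypothetical and this is not the actual patch (see the review above for the real code):

    from os_brick import exception

    def flush_with_chainer(linuxscsi, multipath_name, force,
                           ignore_errors):
        # Collect errors instead of raising immediately: with a truthy
        # 'force' the context manager records the failure and lets the
        # rest of the cleanup continue; otherwise it re-raises.
        exc = exception.ExceptionChainer()
        with exc.context(force, 'Flushing %s failed', multipath_name):
            linuxscsi.flush_multipath_device(multipath_name)
        # ExceptionChainer is truthy once it has caught something.
        if exc and not ignore_errors:
            raise exc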
participants (3)
- Gorka Eguileor
- Rajat Dhasmana
- Rishat Azizov