Unable to live/cold migrate instances that have a Ceph root disk and secondary NetApp storage devices | Wallaby
Swogat Pradhan
swogatpradhan22 at gmail.com
Mon Aug 21 09:44:58 UTC 2023
Hi,
I have an instance whose root disk (/dev/vda) is on Ceph, while the remaining 6 volumes (10 TB each) are on NetApp.
When I try to migrate the VM to another host, the migration fails with a "Volume device not found" error, as shown below:
2023-08-21 09:22:31.852 7 WARNING os_brick.initiator.connectors.iscsi [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] Couldn't find iSCSI nodes because iscsiadm err: iscsiadm: No records found
: os_brick.exception.VolumeDeviceNotFound: Volume device not found at .
2023-08-21 09:22:31.866 7 WARNING os_brick.initiator.connectors.iscsi [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] iscsiadm stderr output when getting sessions: iscsiadm: No active sessions.
: os_brick.exception.VolumeDeviceNotFound: Volume device not found at .
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] Exception during message handling: os_brick.exception.VolumeDeviceNotFound: Volume device not found at .
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 72, in wrapped
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     context, exc, binary)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self.value
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 63, in wrapped
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/utils.py", line 1437, in decorated_function
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 212, in decorated_function
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self.value
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 200, in decorated_function
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 8288, in pre_live_migration
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     bdm.save()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self.value
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 8249, in pre_live_migration
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     migrate_data)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 10343, in pre_live_migration
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self._connect_volume(context, connection_info, instance)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1808, in _connect_volume
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     vol_driver.connect_volume(connection_info, instance)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/volume/iscsi.py", line 64, in connect_volume
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     device_info = self.connector.connect_volume(connection_info['data'])
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 154, in trace_logging_wrapper
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 391, in inner
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 524, in connect_volume
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self._cleanup_connection(connection_properties, force=True)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     self.force_reraise()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self.value
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 518, in connect_volume
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return self._connect_single_volume(connection_properties)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 78, in _wrapper
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return r(f, *args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 409, in call
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     do = self.iter(retry_state=retry_state)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 368, in iter
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise retry_exc.reraise()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 186, in reraise
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self.last_attempt.result()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 425, in result
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     return self.__get_result()
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 384, in __get_result
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise self._exception
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 412, in call
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     result = fn(*args, **kwargs)
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 597, in _connect_single_volume
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server     raise exception.VolumeDeviceNotFound(device='')
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server os_brick.exception.VolumeDeviceNotFound: Volume device not found at .
2023-08-21 09:22:34.348 7 ERROR oslo_messaging.rpc.server
2023-08-21 09:22:34.678 7 INFO nova.virt.block_device [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] Attempting to driver detach volume a005a015-0bb2-498e-9068-ac14570a5f2e from mountpoint /dev/vda
2023-08-21 09:22:34.680 7 WARNING nova.virt.block_device [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] Detaching volume from unknown instance
2023-08-21 09:22:34.682 7 WARNING nova.virt.libvirt.driver [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] During detach_volume, instance disappeared.: nova.exception.InstanceNotFound: Instance 5520dbd7-8f12-4041-bd8c-65b4b55b0452 could not be found.
2023-08-21 09:22:34.707 7 INFO nova.virt.block_device [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] Attempting to driver detach volume 01b9b430-9cfd-43b1-b674-b6fb830ee511 from mountpoint /dev/vdb
2023-08-21 09:22:34.708 7 WARNING nova.virt.block_device [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] Detaching volume from unknown instance
2023-08-21 09:22:34.710 7 WARNING nova.virt.libvirt.driver [req-6ed9d181-fa42-4a4a-b55b-2d08352d4e22 9f20ac539d60ff27487ce83f160e3c29bbeea1017d28fb5159b17c9e77454345 265571cce0b54251b17ace88630b5e46 - 64989c3036764834a755ff2bd55b30b3 64989c3036764834a755ff2bd55b30b3] [instance: 5520dbd7-8f12-4041-bd8c-65b4b55b0452] During detach_volume, instance disappeared.: nova.exception.InstanceNotFound: Instance 5520dbd7-8f12-4041-bd8c-65b4b55b0452 could not be found.
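From the traceback, the failure is raised in pre_live_migration on the destination compute, when os-brick tries to attach the NetApp iSCSI volumes (the Ceph-backed root disk is not involved at that point), and iscsiadm reports no records and no active sessions. As a rough sketch, this is the kind of check I can run on the destination compute node; the portal address below is only a placeholder for our NetApp iSCSI LIF:

# Run on the destination compute host (placeholder portal IP, adjust to the real NetApp LIF)
iscsiadm -m session                                       # logged-in sessions; the log above shows "No active sessions"
iscsiadm -m node                                          # persisted target records; the log above shows "No records found"
iscsiadm -m discovery -t sendtargets -p 192.0.2.10:3260   # can this host discover targets on the NetApp portal at all?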
openstack server show output for the instance:
+-------------------------------------+---------------------------------------------------------------------------+
| Field                               | Value                                                                     |
+-------------------------------------+---------------------------------------------------------------------------+
| OS-DCF:diskConfig                   | AUTO |
| OS-EXT-AZ:availability_zone         | Watson |
| OS-EXT-SRV-ATTR:host                | overcloud-novacompute-0.bdxworld.com |
| OS-EXT-SRV-ATTR:hostname            | hkg2-cctv-archive-server-2 |
| OS-EXT-SRV-ATTR:hypervisor_hostname | overcloud-novacompute-0.bdxworld.com |
| OS-EXT-SRV-ATTR:instance_name       | instance-0000012f |
| OS-EXT-SRV-ATTR:kernel_id           | |
| OS-EXT-SRV-ATTR:launch_index        | 0 |
| OS-EXT-SRV-ATTR:ramdisk_id          | |
| OS-EXT-SRV-ATTR:reservation_id      | r-mdrfy13i |
| OS-EXT-SRV-ATTR:root_device_name    | /dev/vda |
| OS-EXT-SRV-ATTR:user_data           | None |
| OS-EXT-STS:power_state              | Running |
| OS-EXT-STS:task_state               | None |
| OS-EXT-STS:vm_state                 | active |
| OS-SRV-USG:launched_at              | 2023-01-05T09:49:54.000000 |
| OS-SRV-USG:terminated_at            | None |
| accessIPv4                          | |
| accessIPv6                          | |
| addresses                           | HKG2-CCTV=192.168.240.14 |
| config_drive                        | True |
| created                             | 2023-01-05T09:49:46Z |
| description                         | HKG2-CCTV-Archive-server-2 |
| flavor                              | disk='150', ephemeral='0', extra_specs.hw:cpu_cores='10', extra_specs.hw:cpu_sockets='2', original_name='CCTV-VM-Flavor', ram='16384', swap='0', vcpus='10' |
| hostId                              | 570b5ef66c44d3f80a1a498c9dbd8b9a2b690a1aee748bc9cb2cf6fb |
| host_status                         | UP |
| id                                  | 5520dbd7-8f12-4041-bd8c-65b4b55b0452 |
| image                               | N/A (booted from volume) |
| key_name                            | None |
| locked                              | False |
| locked_reason                       | None |
| name                                | HKG2-CCTV-Archive-server-2 |
| progress                            | 0 |
| project_id                          | 265571cce0b54251b17ace88630b5e46 |
| properties                          | |
| security_groups                     | name='CCTV services Incoming' |
|                                     | name='CCTV services Outgoing' |
| server_groups                       | [] |
| status                              | ACTIVE |
| tags                                | |
| trusted_image_certificates          | None |
| updated                             | 2023-08-21T09:22:35Z |
| user_id                             | ff69985eea81fe6f11e9caa3f242623cbddc7ce21cdd864c9585d793533acb9e |
| volumes_attached                    | delete_on_termination='False', id='a005a015-0bb2-498e-9068-ac14570a5f2e' |
|                                     | delete_on_termination='False', id='01b9b430-9cfd-43b1-b674-b6fb830ee511' |
|                                     | delete_on_termination='False', id='a385a416-1d37-4111-b7ae-f182a54aeaef' |
|                                     | delete_on_termination='False', id='95bafa69-b7cc-4d35-9002-723ca556a570' |
|                                     | delete_on_termination='False', id='b5091e83-2afb-457c-abf8-ed855156b59f' |
|                                     | delete_on_termination='False', id='0e9762bd-5229-433d-825e-6c7a77c6927c' |
|                                     | delete_on_termination='False', id='9a00e144-8755-48d5-8907-7fa499172371' |
+-------------------------------------+---------------------------------------------------------------------------+
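To double-check which Cinder backend each attached volume lives on, something like the following can be run with admin credentials (the first volume ID is the Ceph-backed root; the remaining six should map to the NetApp backend, whose name is specific to our deployment):

# With admin credentials; repeat for each ID listed under volumes_attached above.
# os-vol-host-attr:host should show the cinder backend hosting the volume.
openstack volume show a005a015-0bb2-498e-9068-ac14570a5f2e -c os-vol-host-attr:host -c status
openstack volume show 01b9b430-9cfd-43b1-b674-b6fb830ee511 -c os-vol-host-attr:host -c status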
Please suggest how I can migrate the instance.
With regards,
Swogat Pradhan