Hi Zufar,

Since you are trying to use Agent deploy, you need to use HTTPS instead of HTTP. Try running the services under HTTPS and see if the problem still happens.

On Sat, Jan 19, 2019 at 11:06, Zufar Dhiyaulhaq <zufar@onf-ambassador.org> wrote:
Hi,
I get an error when trying to create an instance on a bare-metal node. Below is my troubleshooting output. I don't know what is happening. Any suggestions?
*Ironic Error Log:*

2019-01-19 15:36:41.232 15780 ERROR ironic.drivers.modules.deploy_utils [req-200dac66-0995-41c4-8c8c-dff053d27e36 499299da0c284a4ba9214ea0d83867cc 62088a869020430392a4fb1a0c5d2863 - default default] Agent deploy supports only HTTP(S) URLs as instance_info['image_source'] or swift temporary URL. Either the specified URL is not a valid HTTP(S) URL or is not reachable for node 6c20755a-e36b-495a-98e1-a40f58e5ac3c. Error: Validation of image href secreturl failed, reason: Got HTTP code 404 instead of 200 in response to HEAD request.: ImageRefValidationFailed: Validation of image href secreturl failed, reason: Got HTTP code 404 instead of 200 in response to HEAD request.
2019-01-19 15:36:41.233 15780 ERROR ironic.conductor.manager [req-200dac66-0995-41c4-8c8c-dff053d27e36 499299da0c284a4ba9214ea0d83867cc 62088a869020430392a4fb1a0c5d2863 - default default] Error while preparing to deploy to node 6c20755a-e36b-495a-98e1-a40f58e5ac3c: Validation of image href secreturl failed, reason: Got HTTP code 404 instead of 200 in response to HEAD request.: ImageRefValidationFailed: Validation of image href secreturl failed, reason: Got HTTP code 404 instead of 200 in response to HEAD request.
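To see exactly which instance_info['image_source'] the conductor is validating, the node's instance_info can be dumped directly (a quick check against the failing node from the log above):

openstack baremetal node show 6c20755a-e36b-495a-98e1-a40f58e5ac3c \
    -f json -c instance_info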
*Nova Error Log:*

2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall [-] Fixed interval looping call 'nova.virt.ironic.driver.IronicDriver._wait_for_active' failed: InstanceDeployFailure: Failed to provision instance 3$
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall Traceback (most recent call last):
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/site-packages/oslo_service/loopingcall.py", line 137, in _run_loop
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall     result = func(*self.args, **self.kw)
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 505, in _wait_for_active
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall     raise exception.InstanceDeployFailure(msg)
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall InstanceDeployFailure: Failed to provision instance 38c276b1-b88a-4f4b-924b-8b52377f3145: Failed to prepare to deploy: Validation of image href secre$
2019-01-19 16:35:52.639 13355 ERROR oslo.service.loopingcall
2019-01-19 16:35:52.640 13355 ERROR nova.virt.ironic.driver [req-e11c3fcc-2066-49c6-b47b-0e3879840ad0 7ad46602ac42417a8c798c69cb3105e5 f3bb39ae2e0946e1bbf812bcde6e08a7 - default default] Error deploying instanc$
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [req-e11c3fcc-2066-49c6-b47b-0e3879840ad0 7ad46602ac42417a8c798c69cb3105e5 f3bb39ae2e0946e1bbf812bcde6e08a7 - default default] [instance: 38c276b1-b88a-4$
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145] Traceback (most recent call last):
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2252, in _build_resources
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     yield resources
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2032, in _build_and_run_instance
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     block_device_info=block_device_info)
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 1136, in spawn
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     'node': node_uuid})
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     self.force_reraise()
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     six.reraise(self.type_, self.value, self.tb)
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 1128, in spawn
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     timer.start(interval=CONF.ironic.api_retry_interval).wait()
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/eventlet/event.py", line 121, in wait
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     return hubs.get_hub().switch()
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/eventlet/hubs/hub.py", line 294, in switch
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     return self.greenlet.switch()
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/oslo_service/loopingcall.py", line 137, in _run_loop
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     result = func(*self.args, **self.kw)
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]   File "/usr/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 505, in _wait_for_active
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]     raise exception.InstanceDeployFailure(msg)
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145] InstanceDeployFailure: Failed to provision instance 38c276b1-b88a-4f4b-924b-8b52377f3145: Failed to prep$
2019-01-19 16:35:52.641 13355 ERROR nova.compute.manager [instance: 38c276b1-b88a-4f4b-924b-8b52377f3145]
*Ironic Configuration:*

[DEFAULT]
enabled_drivers=pxe_ipmitool
enabled_hardware_types = ipmi
log_dir=/var/log/ironic
transport_url=rabbit://guest:guest@10.60.60.10:5672/
auth_strategy=keystone
notification_driver = messaging

[conductor]
send_sensor_data = true
automated_clean=true

[swift]
region_name = RegionOne
project_domain_id = default
user_domain_id = default
project_name = services
password = IRONIC_PASSWORD
username = ironic
auth_url = http://10.60.60.10:5000/v3
auth_type = password

[pxe]
tftp_root=/tftpboot
tftp_server=10.60.60.10
ipxe_enabled=True
pxe_bootfile_name=undionly.kpxe
uefi_pxe_bootfile_name=ipxe.efi
pxe_config_template=$pybasedir/drivers/modules/ipxe_config.template
uefi_pxe_config_template=$pybasedir/drivers/modules/ipxe_config.template
pxe_append_params=coreos.autologin
#ipxe_use_swift=True

[agent]
image_download_source = http

[deploy]
http_root=/httpboot
http_url=http://10.60.60.10:8088

[service_catalog]
insecure = True
auth_uri=http://10.60.60.10:5000/v3
auth_type=password
auth_url=http://10.60.60.10:35357
project_domain_id = default
user_domain_id = default
project_name = services
username = ironic
password = IRONIC_PASSWORD
region_name = RegionOne

[database]
connection=mysql+pymysql://ironic:IRONIC_DBPASSWORD@10.60.60.10/ironic?charset=utf8

[keystone_authtoken]
auth_url=http://10.60.60.10:35357
www_authenticate_uri=http://10.60.60.10:5000
auth_type=password
username=ironic
password=IRONIC_PASSWORD
user_domain_name=Default
project_name=services
project_domain_name=Default

[neutron]
www_authenticate_uri=http://10.60.60.10:5000
auth_type=password
auth_url=http://10.60.60.10:35357
project_domain_name=Default
project_name=services
user_domain_name=Default
username=ironic
password=IRONIC_PASSWORD
cleaning_network = 461a6663-e015-4ecf-9076-d1b502c3db25
provisioning_network = 461a6663-e015-4ecf-9076-d1b502c3db25

[glance]
region_name = RegionOne
project_domain_id = default
user_domain_id = default
project_name = services
password = IRONIC_PASSWORD
username = ironic
auth_url = http://10.60.60.10:5000/v3
auth_type = password
temp_url_endpoint_type = swift
swift_endpoint_url = http://10.60.60.10:8080/v1/AUTH_%(tenant_id)s
swift_account = AUTH_f3bb39ae2e0946e1bbf812bcde6e08a7
swift_container = glance
swift_temp_url_key = secret
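As a cross-check of the [glance] temp URL settings above (a minimal sketch, assuming the python-swiftclient CLI is installed; <image-uuid> is a placeholder for the Glance image id of the deploy image), a temporary URL can be generated by hand with the same account, container and key, and then probed with the same kind of HEAD request the deploy driver issues:

# Sign a path under the configured account/container with the configured key
# (this is computed locally and does not contact Swift).
swift tempurl GET 3600 \
    /v1/AUTH_f3bb39ae2e0946e1bbf812bcde6e08a7/glance/<image-uuid> secret

# Prepend the endpoint from swift_endpoint_url and issue a HEAD request.
# 200 suggests the object and key are fine; 401 usually points at a key or
# signature mismatch, 404 at a wrong path or a missing object.
curl -I "http://10.60.60.10:8080<signed-path-from-previous-command>"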
*Temp-URL enable:*

[root@zu-controller0 ~(keystone_admin)]# openstack object store account show
+------------+---------------------------------------+
| Field      | Value                                 |
+------------+---------------------------------------+
| Account    | AUTH_f3bb39ae2e0946e1bbf812bcde6e08a7 |
| Bytes      | 996                                   |
| Containers | 1                                     |
| Objects    | 1                                     |
| properties | Temp-Url-Key='secret'                 |
+------------+---------------------------------------+
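For the generated URLs to validate, the Temp-Url-Key shown above has to stay identical to swift_temp_url_key in the [glance] section of ironic.conf; if the key is ever rotated, it needs to be updated in both places, e.g.:

# Set (or rotate) the account-level key; keep it in sync with
# [glance]swift_temp_url_key in ironic.conf.
swift post -m "Temp-URL-Key:secret"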
*Swift Endpoint:*

[root@zu-controller0 ~(keystone_admin)]# openstack endpoint list | grep swift
| 07e9d544a44241f5b317f651dce5f0a4 | RegionOne | swift | object-store | True | public   | http://10.60.60.10:8080/v1/AUTH_%(tenant_id)s |
| dadfd168384542b0933fe41df87d9dc8 | RegionOne | swift | object-store | True | internal | http://10.60.60.10:8080/v1/AUTH_%(tenant_id)s |
| e53aca9d357542868516d367a0bf13a6 | RegionOne | swift | object-store | True | admin    | http://10.60.60.10:8080/v1/AUTH_%(tenant_id)s |
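Because the HEAD request comes back with 404 rather than 401, it may also be worth confirming that the image object actually exists in the 'glance' container under that account (a quick check; the object name normally corresponds to the Glance image id of the deploy image):

openstack image list
openstack object list glance
# or, with the swift client:
swift list glance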
Best Regards,
Zufar Dhiyaulhaq
--
Att[]'s
Iury Gregory Melo Ferreira
MSc in Computer Science at UFCG
Part of the puppet-manager-core team in OpenStack
Software Engineer at Red Hat Czech
Social: https://www.linkedin.com/in/iurygregory
E-mail: iurygregory@gmail.com