Re: 【octavia】Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem

Michael Johnson johnsomor at gmail.com
Thu Apr 2 17:45:11 UTC 2020


Hi there.

For the first issue, try building your image for the Train release by
adding the "-g stable/train" parameter.
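A sketch of that rebuild, based on the command in the report below; -g pins the amphora-agent source to the stable/train branch so the image matches a Train control plane. The if-guard just makes the snippet a safe no-op when pasted outside the octavia checkout:

```shell
# Rebuild the amphora image against the Train branch (flags taken
# from the original report; adjust the output name to taste).
if [ -x ./diskimage-create.sh ]; then
  ./diskimage-create.sh \
      -i ubuntu -d bionic \
      -g stable/train \
      -r 123456 -s 5 \
      -o amphora-x64-haproxy-ubuntu-1804-train
fi
```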

For the other errors, those indicate that the Python package downloads
failed. Maybe the internet connection dropped during the image
build?
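If the link is flaky, raising pip's network timeout and retry count before rerunning the build can help. The values below are illustrative, and whether exported variables reach the pip runs inside the diskimage-builder chroot depends on the elements in use, so treat this as a best-effort mitigation:

```shell
# pip honors PIP_<OPTION> environment variables; these map to
# --timeout and --retries for every pip invocation that sees them.
export PIP_DEFAULT_TIMEOUT=120
export PIP_RETRIES=10
```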

Michael

On Wed, Apr 1, 2020 at 4:01 PM hao7.liu at midea.com <hao7.liu at midea.com> wrote:
>
> OS version: CentOS 7.6, Ubuntu 18.04
> OpenStack version: Train
> When I create an amphora image, I always get errors, such as:
>
> ./diskimage-create.sh -i ubuntu -d bionic -r 123456 -s 5 -o amphora-x64-haproxy-ubuntu-1804-0401
>
> 2020-04-01 05:34:13.189 | Ignoring actdiag: markers 'python_version == "3.7"' don't match your environment
> 2020-04-01 05:34:13.192 | Ignoring sphinxcontrib-applehelp: markers 'python_version == "3.6"' don't match your environment
> 2020-04-01 05:34:13.194 | Ignoring sphinxcontrib-applehelp: markers 'python_version == "3.7"' don't match your environment
> 2020-04-01 05:34:13.197 | Ignoring scikit-learn: markers 'python_version == "3.6"' don't match your environment
> 2020-04-01 05:34:13.199 | Ignoring scikit-learn: markers 'python_version == "3.7"' don't match your environment
> 2020-04-01 05:34:13.203 | Processing /opt/amphora-agent
> 2020-04-01 05:34:14.758 | ERROR: Package 'octavia' requires a different Python: 2.7.5 not in '>=3.6'
> 2020-04-01 05:34:14.823 | Unmount /tmp/dib_build.EjDukNCf/mnt/tmp/yum
> 2020-04-01 05:34:14.867 | Unmount /tmp/dib_build.EjDukNCf/mnt/tmp/pip
> 2020-04-01 05:34:14.887 | Unmount /tmp/dib_build.EjDukNCf/mnt/tmp/in_target.d
> 2020-04-01 05:34:14.915 | Unmount /tmp/dib_build.EjDukNCf/mnt/sys
> 2020-04-01 05:34:14.935 | Unmount /tmp/dib_build.EjDukNCf/mnt/proc
> 2020-04-01 05:34:14.963 | Unmount /tmp/dib_build.EjDukNCf/mnt/dev/pts
> 2020-04-01 05:34:14.991 | Unmount /tmp/dib_build.EjDukNCf/mnt/dev
> 2020-04-01 05:34:15.721 | INFO diskimage_builder.block_device.blockdevice [-] State already cleaned - no way to do anything here
> root at ip-172-31-53-210:/apps/octavia/diskimage-create#
>
>
>
> 2020-04-01 05:47:47.398 |       Successfully uninstalled pip-9.0.1
> 2020-04-01 05:47:48.444 | Successfully installed pip-20.0.2 setuptools-44.1.0 wheel-0.34.2
> 2020-04-01 05:47:51.309 | Collecting virtualenv
> 2020-04-01 05:47:51.966 |   Downloading virtualenv-20.0.15-py2.py3-none-any.whl (4.6 MB)
> 2020-04-01 05:49:29.260 | ERROR: Exception:
> 2020-04-01 05:49:29.261 | Traceback (most recent call last):
> 2020-04-01 05:49:29.261 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/urllib3/response.py", line 425, in _error_catcher
> 2020-04-01 05:49:29.261 |     yield
> 2020-04-01 05:49:29.261 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/urllib3/response.py", line 507, in read
> 2020-04-01 05:49:29.261 |     data = self._fp.read(amt) if not fp_closed else b""
> 2020-04-01 05:49:29.261 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/cachecontrol/filewrapper.py", line 62, in read
> 2020-04-01 05:49:29.261 |     data = self.__fp.read(amt)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/http/client.py", line 459, in read
> 2020-04-01 05:49:29.261 |     n = self.readinto(b)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/http/client.py", line 503, in readinto
> 2020-04-01 05:49:29.261 |     n = self.fp.readinto(b)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/socket.py", line 586, in readinto
> 2020-04-01 05:49:29.261 |     return self._sock.recv_into(b)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
> 2020-04-01 05:49:29.261 |     return self.read(nbytes, buffer)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/ssl.py", line 874, in read
> 2020-04-01 05:49:29.261 |     return self._sslobj.read(len, buffer)
> 2020-04-01 05:49:29.261 |   File "/usr/lib/python3.6/ssl.py", line 631, in read
> 2020-04-01 05:49:29.261 |     v = self._sslobj.read(len, buffer)
> 2020-04-01 05:49:29.261 | socket.timeout: The read operation timed out
> 2020-04-01 05:49:29.261 |
> 2020-04-01 05:49:29.261 | During handling of the above exception, another exception occurred:
> 2020-04-01 05:49:29.261 |
> 2020-04-01 05:49:29.261 | Traceback (most recent call last):
> 2020-04-01 05:49:29.261 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/cli/base_command.py", line 186, in _main
> 2020-04-01 05:49:29.261 |     status = self.run(options, args)
> 2020-04-01 05:49:29.261 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/commands/install.py", line 331, in run
> 2020-04-01 05:49:29.262 |     resolver.resolve(requirement_set)
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/legacy_resolve.py", line 177, in resolve
> 2020-04-01 05:49:29.262 |     discovered_reqs.extend(self._resolve_one(requirement_set, req))
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/legacy_resolve.py", line 333, in _resolve_one
> 2020-04-01 05:49:29.262 |     abstract_dist = self._get_abstract_dist_for(req_to_install)
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/legacy_resolve.py", line 282, in _get_abstract_dist_for
> 2020-04-01 05:49:29.262 |     abstract_dist = self.preparer.prepare_linked_requirement(req)
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/operations/prepare.py", line 482, in prepare_linked_requirement
> 2020-04-01 05:49:29.262 |     hashes=hashes,
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/operations/prepare.py", line 287, in unpack_url
> 2020-04-01 05:49:29.262 |     hashes=hashes,
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/operations/prepare.py", line 159, in unpack_http_url
> 2020-04-01 05:49:29.262 |     link, downloader, temp_dir.path, hashes
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/operations/prepare.py", line 303, in _download_http_url
> 2020-04-01 05:49:29.262 |     for chunk in download.chunks:
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/utils/ui.py", line 160, in iter
> 2020-04-01 05:49:29.262 |     for x in it:
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_internal/network/utils.py", line 39, in response_chunks
> 2020-04-01 05:49:29.262 |     decode_content=False,
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/urllib3/response.py", line 564, in stream
> 2020-04-01 05:49:29.262 |     data = self.read(amt=amt, decode_content=decode_content)
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/urllib3/response.py", line 529, in read
> 2020-04-01 05:49:29.262 |     raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> 2020-04-01 05:49:29.262 |   File "/usr/lib/python3.6/contextlib.py", line 99, in __exit__
> 2020-04-01 05:49:29.262 |     self.gen.throw(type, value, traceback)
> 2020-04-01 05:49:29.262 |   File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/urllib3/response.py", line 430, in _error_catcher
> 2020-04-01 05:49:29.262 |     raise ReadTimeoutError(self._pool, None, "Read timed out.")
> 2020-04-01 05:49:29.262 | pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
> 2020-04-01 05:49:29.424 | Unmount /tmp/dib_build.QPWJUysz/mnt/var/cache/apt/archives
> 2020-04-01 05:49:29.459 | Unmount /tmp/dib_build.QPWJUysz/mnt/tmp/pip
> 2020-04-01 05:49:29.490 | Unmount /tmp/dib_build.QPWJUysz/mnt/tmp/in_target.d
> 2020-04-01 05:49:29.522 | Unmount /tmp/dib_build.QPWJUysz/mnt/sys
> 2020-04-01 05:49:29.546 | Unmount /tmp/dib_build.QPWJUysz/mnt/proc
> 2020-04-01 05:49:29.573 | Unmount /tmp/dib_build.QPWJUysz/mnt/dev/pts
> 2020-04-01 05:49:29.607 | Unmount /tmp/dib_build.QPWJUysz/mnt/dev
> 2020-04-01 05:49:30.562 | INFO diskimage_builder.block_device.blockdevice [-] State already cleaned - no way to do anything here
>
> and many other errors.
>
> ________________________________
> hao7.liu at midea.com
>
>
> From: hao7.liu at midea.com
> Sent: 2020-04-01 14:48
> To: openstack-discuss at lists.openstack.org
> Subject: 【octavia】Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem
> OS version: CentOS 7.6
> OpenStack version: Train
> When I deployed OpenStack with Octavia and created a load balancer, the worker reported these errors:
>
> 2020-04-01 14:40:41.842 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7abe1523-7802-48ad-a7c1-1d2f8f32f706) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:41.865 164881 INFO octavia.controller.worker.v1.tasks.database_tasks [-] Created Amphora in DB with id 191958e3-2577-4a8a-a1ff-b8f048056b72
> 2020-04-01 14:40:41.869 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' (667607d7-6357-4bac-a498-725c370a2b34) transitioned into state 'SUCCESS' from state 'RUNNING' with result '191958e3-2577-4a8a-a1ff-b8f048056b72' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:183
> 2020-04-01 14:40:41.874 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7f312151-6f92-4ae7-9826-0fccc315ba43) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:41.927 164881 INFO octavia.certificates.generator.local [-] Signing a certificate request using OpenSSL locally.
> 2020-04-01 14:40:41.927 164881 INFO octavia.certificates.generator.local [-] Using CA Certificate from config.
> 2020-04-01 14:40:41.946 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7f312151-6f92-4ae7-9826-0fccc315ba43) transitioned into state 'FAILURE' from state 'RUNNING'
> 13 predecessors (most recent first):
>   Atom 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {}, 'provides': u'191958e3-2577-4a8a-a1ff-b8f048056b72'}
>   |__Flow 'BACKUP-octavia-create-amp-for-lb-subflow'
>      |__Atom 'BACKUP-octavia-get-amphora-for-lb-subflow-octavia-mapload-balancer-to-amphora' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'flavor': {u'loadbalancer_topology': u'ACTIVE_STANDBY'}, 'loadbalancer_id': u'd7ca9fb7-eda3-4a17-a615-c6d7f31d32d8'}, 'provides': None}
>         |__Flow 'BACKUP-octavia-get-amphora-for-lb-subflow'
>            |__Flow 'BACKUP-octavia-plug-net-subflow'
>               |__Flow 'octavia-create-loadbalancer-flow'
>                  |__Atom 'octavia.controller.worker.v1.tasks.network_tasks.GetSubnetFromVIP' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer': <octavia.common.data_models.LoadBalancer object at 0x7f93312afd90>}, 'provides': <octavia.network.data_models.Subnet object at 0x7f93312af550>}
>                     |__Atom 'octavia.controller.worker.v1.tasks.network_tasks.UpdateVIPSecurityGroup' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer': <octavia.common.data_models.LoadBalancer object at 0x7f93312afd90>}, 'provides': None}
>                        |__Atom 'octavia.controller.worker.v1.tasks.database_tasks.UpdateVIPAfterAllocation' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'vip': <octavia.common.data_models.Vip object at 0x7f9331b253d0>, 'loadbalancer_id': u'd7ca9fb7-eda3-4a17-a615-c6d7f31d32d8'}, 'provides': <octavia.common.data_models.LoadBalancer object at 0x7f93312afd90>}
>                           |__Atom 'octavia.controller.worker.v1.tasks.network_tasks.AllocateVIP' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer': <octavia.common.data_models.LoadBalancer object at 0x7f9332b66290>}, 'provides': <octavia.common.data_models.Vip object at 0x7f9331b253d0>}
>                              |__Atom 'reload-lb-before-allocate-vip' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': u'd7ca9fb7-eda3-4a17-a615-c6d7f31d32d8'}, 'provides': <octavia.common.data_models.LoadBalancer object at 0x7f9332b66290>}
>                                 |__Atom 'octavia.controller.worker.v1.tasks.lifecycle_tasks.LoadBalancerIDToErrorOnRevertTask' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': u'd7ca9fb7-eda3-4a17-a615-c6d7f31d32d8'}, 'provides': None}
>                                    |__Flow 'octavia-create-loadbalancer-flow': CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker Traceback (most recent call last):
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker     result = task.execute(**arguments)
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/v1/tasks/cert_task.py", line 47, in execute
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker     validity=CONF.certificates.cert_validity_time)
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 234, in generate_cert_key_pair
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker     cert = cls.sign_cert(csr, validity, **kwargs)
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 91, in sign_cert
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker     cls._validate_cert(ca_cert, ca_key, ca_key_pass)
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 53, in _validate_cert
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker     .format(CONF.certificates.ca_certificate)
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.
> 2020-04-01 14:40:41.946 164881 ERROR octavia.controller.worker.v1.controller_worker
> 2020-04-01 14:40:41.969 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7f312151-6f92-4ae7-9826-0fccc315ba43) transitioned into state 'REVERTING' from state 'FAILURE' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:41.972 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7f312151-6f92-4ae7-9826-0fccc315ba43) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:41.975 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' (667607d7-6357-4bac-a498-725c370a2b34) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:41.975 164881 WARNING octavia.controller.worker.v1.tasks.database_tasks [-] Reverting create amphora in DB for amp id 191958e3-2577-4a8a-a1ff-b8f048056b72
> 2020-04-01 14:40:41.992 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' (667607d7-6357-4bac-a498-725c370a2b34) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:41.995 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-get-amphora-for-lb-subflow-octavia-mapload-balancer-to-amphora' (97f157c5-8b35-476d-a3d9-586087ecf235) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:41.996 164881 WARNING octavia.controller.worker.v1.tasks.database_tasks [-] Reverting Amphora allocation for the load balancer d7ca9fb7-eda3-4a17-a615-c6d7f31d32d8 in the database.
> 2020-04-01 14:40:42.003 164881 INFO octavia.certificates.generator.local [-] Signing a certificate request using OpenSSL locally.
> 2020-04-01 14:40:42.003 164881 INFO octavia.certificates.generator.local [-] Using CA Certificate from config.
> 2020-04-01 14:40:42.005 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'BACKUP-octavia-get-amphora-for-lb-subflow-octavia-mapload-balancer-to-amphora' (97f157c5-8b35-476d-a3d9-586087ecf235) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.006 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7abe1523-7802-48ad-a7c1-1d2f8f32f706) transitioned into state 'FAILURE' from state 'RUNNING': CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker Traceback (most recent call last):
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker     result = task.execute(**arguments)
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/v1/tasks/cert_task.py", line 47, in execute
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker     validity=CONF.certificates.cert_validity_time)
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 234, in generate_cert_key_pair
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker     cert = cls.sign_cert(csr, validity, **kwargs)
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 91, in sign_cert
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker     cls._validate_cert(ca_cert, ca_key, ca_key_pass)
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python2.7/site-packages/octavia/certificates/generator/local.py", line 53, in _validate_cert
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker     .format(CONF.certificates.ca_certificate)
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.
> 2020-04-01 14:40:42.006 164881 ERROR octavia.controller.worker.v1.controller_worker
> 2020-04-01 14:40:42.013 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7abe1523-7802-48ad-a7c1-1d2f8f32f706) transitioned into state 'REVERTING' from state 'FAILURE' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.014 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-generate-serverpem' (7abe1523-7802-48ad-a7c1-1d2f8f32f706) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.017 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' (145e3ecd-816e-415e-90a4-b7b09ca09c60) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.018 164881 WARNING octavia.controller.worker.v1.tasks.database_tasks [-] Reverting create amphora in DB for amp id 1ecbc19a-2644-4f3a-a9fc-bf6ace1655e3
> 2020-04-01 14:40:42.034 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-create-amp-for-lb-subflow-octavia-create-amphora-indb' (145e3ecd-816e-415e-90a4-b7b09ca09c60) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.038 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-get-amphora-for-lb-subflow-octavia-mapload-balancer-to-amphora' (a17713f7-52df-4d3b-8cd2-5e592ce29a6a) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.038 164881 WARNING octavia.controller.worker.v1.tasks.database_tasks [-] Reverting Amphora allocation for the load balancer d7ca9fb7-eda3-4a17-a615-c6d7f31d32d8 in the database.
> 2020-04-01 14:40:42.047 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'MASTER-octavia-get-amphora-for-lb-subflow-octavia-mapload-balancer-to-amphora' (a17713f7-52df-4d3b-8cd2-5e592ce29a6a) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.052 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.GetSubnetFromVIP' (b6e38bf6-57d3-4b99-8226-486e16606d72) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.054 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.GetSubnetFromVIP' (b6e38bf6-57d3-4b99-8226-486e16606d72) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.059 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.UpdateVIPSecurityGroup' (47efda4a-4ab4-4618-ae0d-f0d145ca75b0) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.062 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.UpdateVIPSecurityGroup' (47efda4a-4ab4-4618-ae0d-f0d145ca75b0) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.066 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.database_tasks.UpdateVIPAfterAllocation' (e24fb53e-195e-401d-b300-a798503d1f97) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.068 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.database_tasks.UpdateVIPAfterAllocation' (e24fb53e-195e-401d-b300-a798503d1f97) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:42.073 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.AllocateVIP' (11bbd801-d889-4499-ab7d-768d81153939) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:42.073 164881 WARNING octavia.controller.worker.v1.tasks.network_tasks [-] Deallocating vip 172.20.250.184
> 2020-04-01 14:40:42.199 164881 INFO octavia.network.drivers.neutron.allowed_address_pairs [-] Removing security group b2430a12-2c07-4ca9-a381-3af79f702715 from port a52f2cfa-765b-4664-b4ad-c2a11dd870de
> 2020-04-01 14:40:43.189 164881 INFO octavia.network.drivers.neutron.allowed_address_pairs [-] Deleted security group b2430a12-2c07-4ca9-a381-3af79f702715
> 2020-04-01 14:40:43.994 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.network_tasks.AllocateVIP' (11bbd801-d889-4499-ab7d-768d81153939) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:43.999 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'reload-lb-before-allocate-vip' (01c2a7f3-9114-41f3-a2c0-42601b2b48f0) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:44.002 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'reload-lb-before-allocate-vip' (01c2a7f3-9114-41f3-a2c0-42601b2b48f0) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:44.007 164881 DEBUG octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.lifecycle_tasks.LoadBalancerIDToErrorOnRevertTask' (2339e5d5-e545-4f1d-9147-4f5a7b2f9ce9) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
> 2020-04-01 14:40:44.017 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.lifecycle_tasks.LoadBalancerIDToErrorOnRevertTask' (2339e5d5-e545-4f1d-9147-4f5a7b2f9ce9) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
> 2020-04-01 14:40:44.028 164881 WARNING octavia.controller.worker.v1.controller_worker [-] Flow 'octavia-create-loadbalancer-flow' (aab75b85-a8f1-486f-99e8-5c81e21aa3f3) transitioned into state 'REVERTED' from state 'RUNNING'
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server [-] Exception during message handling: WrappedFailure: WrappedFailure: [Failure: octavia.common.exceptions.CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem., Failure: octavia.common.exceptions.CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.]
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 274, in dispatch
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/queue/v1/endpoints.py", line 45, in create_load_balancer
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     self.worker.create_load_balancer(load_balancer_id, flavor)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 292, in wrapped_f
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     return self.call(f, *args, **kw)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 358, in call
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     do = self.iter(retry_state=retry_state)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 319, in iter
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     return fut.result()
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/concurrent/futures/_base.py", line 422, in result
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     return self.__get_result()
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 361, in call
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     result = fn(*args, **kwargs)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/v1/controller_worker.py", line 344, in create_load_balancer
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     create_lb_tf.run()
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 341, in reraise_if_any
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server     raise exc.WrappedFailure(failures)
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server WrappedFailure: WrappedFailure: [Failure: octavia.common.exceptions.CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem., Failure: octavia.common.exceptions.CertificateGenerationException: Could not sign the certificate request: Failed to load CA Certificate /etc/octavia/certs/server_ca.cert.pem.]
> 2020-04-01 14:40:44.029 164881 ERROR oslo_messaging.rpc.server
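The repeated failure above is octavia's local certificate generator being unable to load the file configured as `[certificates] ca_certificate`. Common causes are a wrong path in octavia.conf, permissions that block the octavia service user, or a file that is not valid PEM. A hypothetical standalone diagnostic for those preconditions (not octavia's own code; assumes the `openssl` CLI is installed):

```shell
# Check that the configured CA certificate exists, is readable by
# the current user, and parses as a PEM certificate.
check_ca_cert() {
  ca="$1"
  if [ ! -f "$ca" ]; then
    echo "missing: $ca" >&2; return 1
  fi
  if [ ! -r "$ca" ]; then
    echo "not readable by $(id -un): $ca" >&2; return 1
  fi
  # openssl exits non-zero if the file is not a valid certificate
  openssl x509 -in "$ca" -noout -subject -enddate
}

# Example: run on the controller as the octavia service user:
# check_ca_cert /etc/octavia/certs/server_ca.cert.pem
```

If the check passes as root but the worker still fails, re-run it as the octavia service user; a parent directory that is not traversable produces the same "Failed to load" error.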
>
>
> ________________________________
> hao7.liu at midea.com


