<div dir="ltr">Hello Yipei,<div><br></div><div>"<b><span style="font-size:12.8px">octavia.amphorae.backends.</span><wbr style="font-size:12.8px"><span style="font-size:12.8px">agent.api_server.listener [-] Failed to verify haproxy file: Command '['haproxy', '-c', '-L', 'NK20KVuD6oi5NrRP7KOVflM</span></b></div><div style="font-size:12.8px"><b>3MsQ', '-f', '/var/lib/octavia/bca2c985-<wbr>471a-4477-8217-92fa71d04cb7/<wbr>haproxy.cfg.new']' returned non-zero exit status 1</b>"</div><div style="font-size:12.8px"><br></div><div style="font-size:12.8px">Verification of haproxy cfg file is failing.</div><div style="font-size:12.8px"><br></div><div style="font-size:12.8px">You can create a dummy file from the haproxy template files(jinja2 files) and verify on any system with haproxy installed.</div><div style="font-size:12.8px"><b>haproxy -c -f "filename"</b></div><div style="font-size:12.8px"><br></div><div style="font-size:12.8px">Regards,</div><div style="font-size:12.8px">Ganpat</div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Jun 28, 2017 at 3:19 PM, Yipei Niu <span dir="ltr"><<a href="mailto:newypei@gmail.com" target="_blank">newypei@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Hi, Michael,<div><br></div><div>Thanks for your help. I have already created a load balancer successfully, but failed creating a listener. The detailed errors of amphora-agent and syslog in the amphora are as follows.</div><div><br></div><div>In amphora-agent.log:</div><div><br></div><div><div>[2017-06-28 08:54:12 +0000] [1209] [INFO] Starting gunicorn 19.7.0</div><div>[2017-06-28 08:54:13 +0000] [1209] [DEBUG] Arbiter booted</div><div>[2017-06-28 08:54:13 +0000] [1209] [INFO] Listening at: http://[::]:9443 (1209)</div><div>[2017-06-28 08:54:13 +0000] [1209] [INFO] Using worker: sync</div><div>[2017-06-28 08:54:13 +0000] [1209] [DEBUG] 1 workers</div><div>[2017-06-28 08:54:13 +0000] [1816] [INFO] Booting worker with pid: 1816</div><div>[2017-06-28 08:54:15 +0000] [1816] [DEBUG] POST /0.5/plug/vip/<a href="http://10.0.1.8" target="_blank">10.0.1.8</a></div><div>::ffff:192.168.0.12 - - [28/Jun/2017:08:54:59 +0000] "POST /0.5/plug/vip/<a href="http://10.0.1.8" target="_blank">10.0.1.8</a> HTTP/1.1" 202 78 "-" "Octavia HaProxy Rest Client/0.5 (<a href="https://wiki.openstack.org/wiki/Octavia" target="_blank">https://wiki.openstack.org/<wbr>wiki/Octavia</a>)"</div><div>[2017-06-28 08:59:18 +0000] [1816] [DEBUG] PUT /0.5/listeners/9ed4f0a5-6b1e-<wbr>4832-97cc-fb8d1518cbd4/<wbr>bca2c985-471a-4477-8217-<wbr>92fa71d04cb7/haproxy</div><div>::ffff:192.168.0.12 - - [28/Jun/2017:08:59:19 +0000] "PUT /0.5/listeners/9ed4f0a5-6b1e-<wbr>4832-97cc-fb8d1518cbd4/<wbr>bca2c985-471a-4477-8217-<wbr>92fa71d04cb7/haproxy HTTP/1.1" 400 414 "-" "Octavia HaProxy Rest Client/0.5 (<a href="https://wiki.openstack.org/wiki/Octavia" target="_blank">https://wiki.openstack.org/<wbr>wiki/Octavia</a>)"</div></div><div><br></div><div>In syslog:</div><div><br></div><div><div>Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-<wbr>97cc-fb8d1518cbd4 ec2: ##############################<wbr>##############################<wbr>#</div><div>Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-<wbr>97cc-fb8d1518cbd4 ec2: -----BEGIN SSH HOST KEY FINGERPRINTS-----</div><div>Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-<wbr>97cc-fb8d1518cbd4 ec2: 1024 SHA256:qDQcKq2Je/<wbr>CzlpPndccMf0aR0u/<wbr>KPJEEIAl4RraAgVc 
Regards,
Ganpat

On Wed, Jun 28, 2017 at 3:19 PM, Yipei Niu <newypei@gmail.com> wrote:
> Hi, Michael,
>
> Thanks for your help. I have already created a load balancer successfully, but failed to create a listener. The detailed errors from the amphora-agent log and the syslog in the amphora are as follows.
>
> In amphora-agent.log:
>
> [2017-06-28 08:54:12 +0000] [1209] [INFO] Starting gunicorn 19.7.0
> [2017-06-28 08:54:13 +0000] [1209] [DEBUG] Arbiter booted
> [2017-06-28 08:54:13 +0000] [1209] [INFO] Listening at: http://[::]:9443 (1209)
> [2017-06-28 08:54:13 +0000] [1209] [INFO] Using worker: sync
> [2017-06-28 08:54:13 +0000] [1209] [DEBUG] 1 workers
> [2017-06-28 08:54:13 +0000] [1816] [INFO] Booting worker with pid: 1816
> [2017-06-28 08:54:15 +0000] [1816] [DEBUG] POST /0.5/plug/vip/10.0.1.8
> ::ffff:192.168.0.12 - - [28/Jun/2017:08:54:59 +0000] "POST /0.5/plug/vip/10.0.1.8 HTTP/1.1" 202 78 "-" "Octavia HaProxy Rest Client/0.5 (https://wiki.openstack.org/wiki/Octavia)"
> [2017-06-28 08:59:18 +0000] [1816] [DEBUG] PUT /0.5/listeners/9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4/bca2c985-471a-4477-8217-92fa71d04cb7/haproxy
> ::ffff:192.168.0.12 - - [28/Jun/2017:08:59:19 +0000] "PUT /0.5/listeners/9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4/bca2c985-471a-4477-8217-92fa71d04cb7/haproxy HTTP/1.1" 400 414 "-" "Octavia HaProxy Rest Client/0.5 (https://wiki.openstack.org/wiki/Octavia)"
>
> In syslog:
>
> Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: #############################################################
> Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: -----BEGIN SSH HOST KEY FINGERPRINTS-----
> Jun 28 08:57:14 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: 1024 SHA256:qDQcKq2Je/CzlpPndccMf0aR0u/KPJEEIAl4RraAgVc root@amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 (DSA)
> Jun 28 08:57:15 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: 256 SHA256:n+5tCCdJwASMaD/kJ6fm0kVNvXDh4aO0si2Uls4MXkI root@amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 (ECDSA)
> Jun 28 08:57:15 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: 256 SHA256:7RWMBOW+QKzeolI6BDSpav9dVZuon58weIQJ9/peVxE root@amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 (ED25519)
> Jun 28 08:57:16 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: 2048 SHA256:9z+EcAAUyTENKJRctKCzPslK6Yf4c7s9R8sEflDITIU root@amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 (RSA)
> Jun 28 08:57:16 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: -----END SSH HOST KEY FINGERPRINTS-----
> Jun 28 08:57:16 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 ec2: #############################################################
> Jun 28 08:57:17 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 cloud-init[2092]: Cloud-init v. 0.7.9 running 'modules:final' at Wed, 28 Jun 2017 08:57:03 +0000. Up 713.82 seconds.
> Jun 28 08:57:17 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 cloud-init[2092]: Cloud-init v. 0.7.9 finished at Wed, 28 Jun 2017 08:57:16 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 727.30 seconds
> Jun 28 08:57:19 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd[1]: Started Execute cloud user/final scripts.
> Jun 28 08:57:19 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd[1]: Reached target Cloud-init target.
> Jun 28 08:57:19 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd[1]: Startup finished in 52.054s (kernel) + 11min 17.647s (userspace) = 12min 9.702s.
> Jun 28 08:59:19 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 amphora-agent[1209]: 2017-06-28 08:59:19.243 1816 ERROR octavia.amphorae.backends.agent.api_server.listener [-] Failed to verify haproxy file: Command '['haproxy', '-c', '-L', 'NK20KVuD6oi5NrRP7KOVflM3MsQ', '-f', '/var/lib/octavia/bca2c985-471a-4477-8217-92fa71d04cb7/haproxy.cfg.new']' returned non-zero exit status 1
> Jun 28 09:00:11 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd[1]: Starting Cleanup of Temporary Directories...
> Jun 28 09:00:12 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd-tmpfiles[3040]: [/usr/lib/tmpfiles.d/var.conf:14] Duplicate line for path "/var/log", ignoring.
> Jun 28 09:00:15 amphora-9ed4f0a5-6b1e-4832-97cc-fb8d1518cbd4 systemd[1]: Started Cleanup of Temporary Directories.
>
> Look forward to your valuable comments.
>
> Best regards,
> Yipei
>
> On Tue, Jun 27, 2017 at 2:33 PM, Yipei Niu <newypei@gmail.com> wrote:
>> Hi, Michael,
>>
>> Thanks a lot for your help, but I still have one question.
>>
>> In Octavia, once the controller worker fails to plug the VIP into the amphora, the amphora is deleted immediately, making it impossible to trace the error. How can I prevent Octavia from stopping and deleting the amphora?
>>
>> Best regards,
>> Yipei
>>
>> On Mon, Jun 26, 2017 at 11:21 AM, Yipei Niu <newypei@gmail.com> wrote:
>>> Hi, all,
>>>
>>> I am trying to create a load balancer in Octavia. The amphora boots successfully and can be reached via ICMP. However, Octavia fails to plug the VIP into the amphora through the amphora client API; the call returns a 500 status code, causing the following errors.
>>>
>>>  |__Flow 'octavia-create-loadbalancer-flow': InternalServerError: Internal Server Error
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker Traceback (most recent call last):
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     result = task.execute(**arguments)
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/opt/stack/octavia/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 240, in execute
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     amphorae_network_config)
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/opt/stack/octavia/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 219, in execute
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     amphora, loadbalancer, amphorae_network_config)
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/opt/stack/octavia/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 137, in post_vip_plug
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     net_info)
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/opt/stack/octavia/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 378, in plug_vip
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     return exc.check_exception(r)
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker   File "/opt/stack/octavia/octavia/amphorae/drivers/haproxy/exceptions.py", line 32, in check_exception
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker     raise responses[status_code]()
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker InternalServerError: Internal Server Error
>>> 2017-06-21 09:49:35.864 25411 ERROR octavia.controller.worker.controller_worker
>>>
>>> To investigate the problem, I logged in to the amphora and found an HTTP server process listening on port 9443, so I think the amphora API service is active.
>>> However, I do not know how to further investigate what error happens inside the amphora API service and how to solve it. Look forward to your valuable comments.
>>>
>>> Best regards,
>>> Yipei
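P.S. Regarding your earlier question about investigating errors inside the amphora API service: as the traceback above shows, the driver just maps the HTTP status code to an exception class ("raise responses[status_code]()" in exceptions.py), so the controller worker log can never tell you the real cause; those details stay in the amphora's own amphora-agent log. A paraphrased sketch of that mapping, reconstructed only from the traceback above (not the verbatim Octavia source):

    # Paraphrased sketch of octavia/amphorae/drivers/haproxy/exceptions.py,
    # reconstructed from the traceback; the real module maps more status
    # codes. It shows why the worker only sees a bare exception: the
    # response body is dropped and only the status code survives.

    class BadRequest(Exception):
        """Hypothetical stand-in for the 400 mapping."""

    class InternalServerError(Exception):
        """Hypothetical stand-in for the 500 mapping."""

    responses = {
        400: BadRequest,
        500: InternalServerError,
    }

    def check_exception(response):
        # As in the traceback: raise the class mapped to the status code.
        status_code = response.status_code
        if status_code in responses:
            raise responses[status_code]()
        return response

So when the agent returns 400 (as on your listener PUT above) or 500, the worker only learns the status code; check the amphora-agent log on the amphora itself for the underlying error.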
__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: OpenStack-dev-request@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev