<div dir="ltr">Hi,<div><br></div><div>I also hit the loopingcall error while running magnum 4.1.1 (ocata). It is tracked by this bug: <a href="https://bugs.launchpad.net/magnum/+bug/1666790">https://bugs.launchpad.net/magnum/+bug/1666790</a>. I cherry-picked the fix to ocata locally, but this needs to be done upstream as well.</div><div><br></div><div>I think the heat stack create timeout is unrelated to that issue, though. Try the following to debug it:</div><div>- Check the cluster's heat stack and its component resources.</div><div>- If they were created, SSH to the master and slave nodes and check that the systemd services are up and that cloud-init succeeded.</div><div><br></div><div>Regards,</div><div>Mark</div></div><div class="gmail_extra"><br><div class="gmail_quote">On 12 May 2017 at 05:57, KiYoun Sung <span dir="ltr"><<a href="mailto:kysung@devstack.co.kr" target="_blank">kysung@devstack.co.kr</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><span style="font-size:14px">Hello,</span><div style="font-size:14px">Magnum Team.<div><br></div><div>I installed magnum on OpenStack Ocata (deployed by Fuel 11.0).</div><div>I referred to this guide: <a href="https://docs.openstack.org/project-install-guide/container-infrastructure-management/ocata/install.html" target="_blank">https://docs.openstack.org/project-install-guide/container-infrastructure-management/ocata/install.html</a></div><div><br></div><div>Below is my installation information.</div><div><div>root@controller:~# dpkg -l | grep magnum</div><div>magnum-api                          4.1.0-0ubuntu1~cloud0                      all          OpenStack containers as a service</div><div>magnum-common                       4.1.0-0ubuntu1~cloud0                      all          OpenStack containers as a service - API server</div><div>magnum-conductor                    4.1.0-0ubuntu1~cloud0                      all          OpenStack containers as a service - conductor</div><div>python-magnum                       4.1.0-0ubuntu1~cloud0                      all          OpenStack containers as a service - Python library</div><div>python-magnumclient                 2.5.0-0ubuntu1~cloud0                      all          client library for Magnum API - Python 2.x</div></div><div><br></div><div>After installation,</div><div>I created a cluster template for Kubernetes like this.</div><div><span style="color:rgb(0,0,0);font-family:arial;font-size:13px;white-space:pre-wrap">(magnum cluster-template-create --name k8s-cluster-template \
                       --image fedora-atomic-latest \
                       --keypair testkey \
                       --external-network admin_floating_net \
                       --dns-nameserver 8.8.8.8 \
                       --flavor m1.small \
                       --docker-volume-size 5 \
                       --network-driver flannel \
                       --coe kubernetes  )</span><br></div><div><span style="color:rgb(0,0,0);font-family:arial;font-size:13px;white-space:pre-wrap"><br></span></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">and I created a cluster, </span></font></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">but the "magnum cluster-create" command failed.</span></font></div><div><span style="color:rgb(0,0,0);font-family:arial;font-size:13px;white-space:pre-wrap">(magnum cluster-create --name k8s-cluster \
                      --cluster-template k8s-cluster-template \
                      --node-count 1 \
                      --timeout 10   )</span><br></div><div><span style="color:rgb(0,0,0);font-family:arial;font-size:13px;white-space:pre-wrap"><br></span></div><div><span style="color:rgb(0,0,0);font-family:arial;font-size:13px;white-space:pre-wrap">After 10 minutes (the "--timeout 10" option),</span></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">creation failed, and the status was "CREATE_FAILED".</span></font></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap"><br></span></font></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">I executed the "openstack server list" command;</span></font></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">there is only a kube-master instance. </span></font></div><div><font color="#000000" face="arial"><span style="white-space:pre-wrap">(</span></font>root@controller:~# openstack server list</div><div><div>+--------------------------------------+---------------------------------------+--------+--------------------------------+----------------------+</div><div>| ID                                   | Name                                  | Status | Networks                       | Image Name           |</div><div>+--------------------------------------+---------------------------------------+--------+--------------------------------+----------------------+</div><div>| bf9c5097-74fd-4457-a8a2-4feae76d4111 | k8-i27fw72w5t-0-i6lg6mzpzrl6-kube-    | ACTIVE | private=10.0.0.9, 172.16.1.135 | fedora-atomic-latest |</div><div>|                                      | master-ekjrg2v6ztss                   |        |                                |                      |</div><div>+--------------------------------------+---------------------------------------+--------+--------------------------------+----------------------+
</div></div><div>)</div><div><br></div><div>I think the kube-master instance was created successfully.</div><div>I can connect to that instance,</div><div>and the Docker container is running normally.</div><div><br></div><div>Why did this command fail?</div><div><br></div><div>Here is my /var/log/magnum/magnum-conductor.log and /var/log/nova-all.log.</div><div>magnum-conductor.log has an ERROR.</div><div>===============================================</div><div><div>2017-05-12 04:05:00.684 756 ERROR magnum.common.keystone [req-e2d4ea12-ec7a-4865-9eda-d272cc43a827 - - - - -] Keystone API connection failed: no password, trust_id or token found.</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception [req-e2d4ea12-ec7a-4865-9eda-d272cc43a827 - - - - -] Exception in string format operation, kwargs: {'code': 500}</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception Traceback (most recent call last):</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception   File "/usr/lib/python2.7/dist-packages/magnum/common/exception.py", line 92, in __init__</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception     self.message = self.message % kwargs</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception KeyError: u'client'</div><div>2017-05-12 04:05:00.686 756 ERROR magnum.common.exception</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall [req-e2d4ea12-ec7a-4865-9eda-d272cc43a827 - - - - -] Fixed interval looping call 'magnum.service.periodic.ClusterUpdateJob.update_status' failed</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall Traceback (most recent call last):</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/oslo_service/loopingcall.py", line 137, in _run_loop</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     result = func(*self.args, **self.kw)</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/service/periodic.py", line 71, in update_status</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     cdriver.update_cluster_status(self.ctx, self.cluster)</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/drivers/heat/driver.py", line 80, in update_cluster_status</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     poller.poll_and_check()</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/drivers/heat/driver.py", line 169, in poll_and_check</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     stack = self.openstack_client.heat().stacks.get(</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/common/exception.py", line 59, in wrapped</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     return func(*args, **kw)</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/common/clients.py", line 94, in heat</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     region_name=region_name)</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/common/clients.py", line 45, in url_for</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     return self.keystone().session.get_endpoint(**kwargs)</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/common/keystone.py", line 59, in session</div><div>2017-05-12 04:05:00.687 756 ERROR 
oslo.service.loopingcall     auth = self._get_auth()</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall   File "/usr/lib/python2.7/dist-packages/magnum/common/keystone.py", line 98, in _get_auth</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall     raise exception.AuthorizationFailure()</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall AuthorizationFailure: %(client)s connection failed. %(message)s</div><div>2017-05-12 04:05:00.687 756 ERROR oslo.service.loopingcall</div><div>2017-05-12 04:10:11.634 756 ERROR magnum.drivers.heat.driver [req-7bf01756-8b81-4387-8512-e29cd2b71aab - - - - -] Cluster error, stack status: CREATE_FAILED, stack_id: c1bf3ab0-c869-458d-a16d-6f4ab8284046, reason: Timed out</div></div><div>===============================================<br></div><div><br></div><div>And heat-all.log has the same entries logged repeatedly.</div><div>===============================================<br></div><div><div>2017-05-12T04:09:58.514153+00:00 controller heat-engine: 2017-05-12 04:09:58.512 8813 DEBUG heat.engine.scheduler [req-90ba259d-abef-4d45-a87e-0c08cbd5a4ae -</div><div>- - - -] Task create from HeatWaitCondition "master_wait_condition" Stack "k8s-cluster-nyj64j3yjhjs-kube_masters-k6i27fw72w5t-0-i6lg6mzpzrl6" [57626786-4858-4</div><div>715-9c01-97ad6156d6f7] running step /usr/lib/python2.7/dist-packages/heat/engine/scheduler.py:215</div><div>2017-05-12T04:09:58.532625+00:00 controller heat-engine: 2017-05-12 04:09:58.531 8813 DEBUG heat.engine.scheduler [req-90ba259d-abef-4d45-a87e-0c08cbd5a4ae -</div><div>- - - -] Task create from HeatWaitCondition "master_wait_condition" Stack "k8s-cluster-nyj64j3yjhjs-kube_masters-k6i27fw72w5t-0-i6lg6mzpzrl6" [57626786-4858-4</div><div>715-9c01-97ad6156d6f7] sleeping _sleep 
/usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>156</div><div>2017-05-12T04:09:58.740335+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:58.738 8879 DEBUG heat.engine.scheduler [req-e0532cce-e310-4222-80d9-1<wbr>59c9aee6afa -</div><div>- - - -] Task create from TemplateResource "0" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t" [093664fa-7914-41a3-924e-7e570<wbr>6320301] running step</div><div>/usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>215</div><div>2017-05-12T04:09:58.746408+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:58.745 8879 DEBUG heat.engine.scheduler [req-e0532cce-e310-4222-80d9-1<wbr>59c9aee6afa -</div><div>- - - -] Task create from TemplateResource "0" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t" [093664fa-7914-41a3-924e-7e570<wbr>6320301] sleeping _sle</div><div>ep /usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>156</div><div>2017-05-12T04:09:58.980256+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:58.979 8847 DEBUG heat.engine.scheduler [req-fade682c-de27-4d9d-8756-6<wbr>7f6ebd5ffc4 -</div><div>- - - -] Task create from ResourceGroup "kube_masters" Stack "k8s-cluster-nyj64j3yjhjs" [c1bf3ab0-c869-458d-a16d-6f4ab<wbr>8284046] running step /usr/lib/python2.7</div><div>/dist-packages/heat/engine/sch<wbr>eduler.py:215</div><div>2017-05-12T04:09:58.986982+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:58.985 8847 DEBUG heat.engine.scheduler [req-fade682c-de27-4d9d-8756-6<wbr>7f6ebd5ffc4 -</div><div>- - - -] Task create from ResourceGroup "kube_masters" Stack "k8s-cluster-nyj64j3yjhjs" [c1bf3ab0-c869-458d-a16d-6f4ab<wbr>8284046] sleeping _sleep /usr/lib/python</div><div>2.7/dist-packages/heat/engine/<wbr>scheduler.py:156</div><div>2017-05-12T04:09:59.533940+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.532 8813 DEBUG heat.engine.scheduler [req-90ba259d-abef-4d45-a87e-0<wbr>c08cbd5a4ae -</div><div>- - - -] Task create from 
HeatWaitCondition "master_wait_condition" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t-0-i6lg6m<wbr>zpzrl6" [57626786-4858-4</div><div>715-9c01-97ad6156d6f7] running step /usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>215</div><div>2017-05-12T04:09:59.554131+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.553 8813 DEBUG heat.engine.scheduler [req-90ba259d-abef-4d45-a87e-0<wbr>c08cbd5a4ae -</div><div>- - - -] Task create from HeatWaitCondition "master_wait_condition" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t-0-i6lg6m<wbr>zpzrl6" [57626786-4858-4</div><div>715-9c01-97ad6156d6f7] sleeping _sleep /usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>156</div><div>2017-05-12T04:09:59.747987+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.746 8879 DEBUG heat.engine.scheduler [req-e0532cce-e310-4222-80d9-1<wbr>59c9aee6afa -</div><div>- - - -] Task create from TemplateResource "0" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t" [093664fa-7914-41a3-924e-7e570<wbr>6320301] running step</div><div>/usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>215</div><div>2017-05-12T04:09:59.764422+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.755 8879 DEBUG heat.engine.scheduler [req-e0532cce-e310-4222-80d9-1<wbr>59c9aee6afa -</div><div>- - - -] Task create from TemplateResource "0" Stack "k8s-cluster-nyj64j3yjhjs-kube<wbr>_masters-k6i27fw72w5t" [093664fa-7914-41a3-924e-7e570<wbr>6320301] sleeping _sle</div><div>ep /usr/lib/python2.7/dist-packag<wbr>es/heat/engine/scheduler.py:<wbr>156</div><div>2017-05-12T04:09:59.989512+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.987 8847 DEBUG heat.engine.scheduler [req-fade682c-de27-4d9d-8756-6<wbr>7f6ebd5ffc4 -</div><div>- - - -] Task create from ResourceGroup "kube_masters" Stack "k8s-cluster-nyj64j3yjhjs" [c1bf3ab0-c869-458d-a16d-6f4ab<wbr>8284046] running step 
/usr/lib/python2.7</div><div>/dist-packages/heat/engine/sch<wbr>eduler.py:215</div><div>2017-05-12T04:09:59.997531+00:<wbr>00 controller heat-engine: 2017-05-12 04:09:59.996 8847 DEBUG heat.engine.scheduler [req-fade682c-de27-4d9d-8756-6<wbr>7f6ebd5ffc4 -</div><div>- - - -] Task create from ResourceGroup "kube_masters" Stack "k8s-cluster-nyj64j3yjhjs" [c1bf3ab0-c869-458d-a16d-6f4ab<wbr>8284046] sleeping _sleep /usr/lib/python</div><div>2.7/dist-packages/heat/engine/<wbr>scheduler.py:156</div></div><div>...<br><div>==============================<wbr>=================<br></div><div></div></div><div><br></div><div>Thank you.<br></div><div>Best Regards.</div></div></div>
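The debugging steps suggested in the reply above can be sketched with the OpenStack CLI. This is a rough outline, assuming a recent python-heatclient is installed; the stack name and floating IP are taken from the outputs quoted in this thread, and the "fedora" login user is the default for fedora-atomic images — adjust all of them to your environment.

```shell
# Sketch of the suggested checks (names/addresses are from this thread;
# substitute your own).

# 1. Inspect the cluster's Heat stack and its component resources.
openstack stack list --nested
openstack stack resource list --nested-depth 2 k8s-cluster-nyj64j3yjhjs
openstack stack failures list k8s-cluster-nyj64j3yjhjs

# 2. If the nodes were created, SSH in and check that systemd services
#    are up and that cloud-init completed.
ssh fedora@172.16.1.135 'sudo systemctl --failed'
ssh fedora@172.16.1.135 'sudo journalctl -u cloud-init --no-pager | tail -n 50'
```

The "stack failures list" output typically points at the failed wait condition, which helps distinguish a Heat-side problem from cloud-init never signalling completion on the node.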
<br>______________________________<wbr>______________________________<wbr>______________<br>
OpenStack Development Mailing List (not for usage questions)<br>
Unsubscribe: <a href="mailto:OpenStack-dev-request@lists.openstack.org?subject:unsubscribe" rel="noreferrer" target="_blank">OpenStack-dev-request@lists.openstack.org?subject:unsubscribe</a><br>
<a href="http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev" rel="noreferrer" target="_blank">http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev</a><br>
<br></blockquote></div><br></div>