[Openstack] [Cinder] New volume status stuck at "Creating" after creation in Horizon

Ahmed Al-Mehdi ahmedalmehdi at gmail.com
Wed Dec 5 08:54:32 UTC 2012


I posted the cinder-scheduler log entries in my first post, but here they
are again.  They were generated right around the time I created the
volume.  I am trying to understand the error message "VolumeNotFound:
Volume 9dd360bf-9ef2-499f-ac6e-893abf5dc5ce could not be found".  Is this
error related to the volume group "cinder-volumes" or to the new volume I
just created?


2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.
amqp [-] received {u'_context_roles': [u'Member', u'admin'],
u'_context_request_id': u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367',
u'_context
_quota_class': None, u'args': {u'topic': u'cinder-volume', u'image_id':
None, u'snapshot_id': None, u'volume_id':
u'9dd360bf-9ef2-499f-ac6e-893abf5dc5ce'}, u'_context_auth_token':
'<SANITIZED>', u'_co
ntext_is_admin': False, u'_context_project_id':
u'70e5c14a28a14666a86e85b62ca6ae18', u'_context_timestamp':
u'2012-12-04T17:05:02.375789', u'_context_read_deleted': u'no',
u'_context_user_id': u'386d0
f02d6d045e7ba49d8edac7bb43f', u'method': u'create_volume',
u'_context_remote_address': u'10.176.20.102'} _safe_log
/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.amqp [-]
unpacked context: {'user_id': u'386d0f02d6d045e7ba49d8edac7bb43f', 'roles':
[u'Member', u'admin'], 'timestamp': u'2012-12-04T17:05:
02.375789', 'auth_token': '<SANITIZED>', 'remote_address':
u'10.176.20.102', 'quota_class': None, 'is_admin': False, 'request_id':
u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367', 'project_id': u'70e5c14a
28a14666a86e85b62ca6ae18', 'read_deleted': u'no'} _safe_log
/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
2012-12-04 09:05:02 23552 ERROR cinder.openstack.common.rpc.amqp [-]
Exception during message handling
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp Traceback
(most recent call last):
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/amqp.py",
line 276, in _process_data
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     rval =
self.proxy.dispatch(ctxt, version, method, **args)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/dispatcher.py",
line 145, in dispatch
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     return
getattr(proxyobj, method)(ctxt, **kwargs)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/scheduler/manager.py", line 98, in
_schedule
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
db.volume_update(context, volume_id, {'status': 'error'})
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/db/api.py", line 256, in
volume_update
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     return
IMPL.volume_update(context, volume_id, values)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 124,
in wrapper
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     return
f(*args, **kwargs)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 1071,
in volume_update
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
volume_ref = volume_get(context, volume_id, session=session)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 124,
in wrapper
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     return
f(*args, **kwargs)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
"/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 1014,
in volume_get
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     raise
exception.VolumeNotFound(volume_id=volume_id)
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
VolumeNotFound: Volume 9dd360bf-9ef2-499f-ac6e-893abf5dc5ce could not be
found.
2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp

Thank you,
Ahmed.



On Tue, Dec 4, 2012 at 11:10 PM, Huang Zhiteng <winston.d at gmail.com> wrote:

> Can you check the cinder scheduler log?
>
> On Wed, Dec 5, 2012 at 1:44 AM, Ahmed Al-Mehdi <ahmedalmehdi at gmail.com>
> wrote:
> > Hello,
> >
> > I set up a two-node OpenStack deployment: one controller node and one
> > compute node, using Quantum, Cinder, and KVM for virtualization.  I am
> > running into an issue creating a volume through Horizon, which I intend
> > to attach to a VM later on.  The volume's status in Horizon is stuck at
> > "Creating", and the output of "cinder list" shows nothing.
> >
> > As far as I can tell, the iSCSI service is set up properly.  I suspect
> > there is a communication issue between the OpenStack services.
> >
> > There are no entries in cinder-volume.log.
> >
> > However, cinder-scheduler.log has the following entry:
> >
> > [scheduler log trace snipped; identical to the trace quoted above]
> >
> >
> > Has anyone run into this issue?  Is there a cinder-* CLI command I can
> > run to get more information about the problem?
> > Any help would be much appreciated.
> >
> > Thank you,
> > Ahmed.
> >
> >
> > _______________________________________________
> > Mailing list: https://launchpad.net/~openstack
> > Post to     : openstack at lists.launchpad.net
> > Unsubscribe : https://launchpad.net/~openstack
> > More help   : https://help.launchpad.net/ListHelp
> >
>
>
>
> --
> Regards
> Huang Zhiteng
>