[Openstack] [SWIFT] 404 error during uploads
Kuo Hugo
tonytkdk at gmail.com
Tue Dec 24 07:19:01 UTC 2013
The "database is locked" stack trace is a known issue.

How fast did you push the 100 objects, and what is the hardware spec of the nodes?

You can check whether the container that returned the 404 actually exists by
using swift-get-nodes. For a log line like:

Dec 23 08:49:11 storage1 container-server 10.xx.xx.xxx - -
[23/Dec/2013:08:49:11 +0000] "HEAD /slot-8/11126/AUTH_swift/test1" 404 -
"tx92750eeef61640a3929f4-0052b7f907" "HEAD
http://swift:8080/v1/AUTH_swift/test1" "proxy-server 10178" 0.0002

run:

$ swift-get-nodes /etc/swift/container.ring.gz AUTH_swift test1

and check whether the container DB exists on the disks it points to. If it
does not, I would check the permissions and availability of the mount points
used by the Swift daemons.
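A rough sketch of those checks (the partition and device below come from the
log line above; substitute whatever swift-get-nodes actually reports for your
ring):

# On a node that has the rings installed:
$ swift-get-nodes /etc/swift/container.ring.gz AUTH_swift test1
# prints the partition, the storage nodes/devices that should hold the
# container DB, and ready-made curl/ssh commands to probe them

# On each storage node it lists:
$ find /srv/node/slot-8/containers/11126 -name '*.db'   # is the container DB on disk?
$ mount | grep /srv/node/slot-8                          # is the device actually mounted?
$ sudo -u swift ls /srv/node/slot-8                      # can the swift user read the mount point?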
Hope it helps.
2013/12/24 Stephen Wood <smwood4 at gmail.com>
> I'm also seeing this in the object-server logs:
>
> OperationalError: database is locked (txn: tx37d4e1807957447cac403-0052b89cbb)
>
> Dec 23 20:27:39 store01 container-server ERROR __call__ error with PUT /slot-3/11126/AUTH_swift/test1 :
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/dist-packages/swift/container/server.py", line 486, in __call__
>     res = method(req)
>   File "/usr/lib/python2.7/dist-packages/swift/common/utils.py", line 1870, in wrapped
>     return func(*a, **kw)
>   File "/usr/lib/python2.7/dist-packages/swift/common/utils.py", line 686, in _timing_stats
>     resp = func(ctrl, *args, **kwargs)
>   File "/usr/lib/python2.7/dist-packages/swift/container/server.py", line 262, in PUT
>     created = broker.is_deleted()
>   File "/usr/lib/python2.7/dist-packages/swift/container/backend.py", line 246, in is_deleted
>     with self.get() as conn:
>   File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
>     return self.gen.next()
>   File "/usr/lib/python2.7/dist-packages/swift/common/db.py", line 325, in get
>     self.possibly_quarantine(*sys.exc_info())
>   File "/usr/lib/python2.7/dist-packages/swift/common/db.py", line 323, in get
>     self.conn = get_db_connection(self.db_file, self.timeout)
>   File "/usr/lib/python2.7/dist-packages/swift/common/db.py", line 167, in get_db_connection
>     timeout=timeout)
> DatabaseConnectionError: DB connection error (/srv/node/slot-3/containers/11126/5ab/56edb8dfe26326806d33c3c73aeb65ab/56edb8dfe26326806d33c3c73aeb65ab.db, 25):
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/dist-packages/swift/common/db.py", line 159, in get_db_connection
>     cur.execute('PRAGMA synchronous = NORMAL')
> OperationalError: database is locked
> (txn: tx0171c51f618f4dbbbc712-0052b89cbb)
>
> I see this error even during successful uploads, which has me wondering
> whether something is broken in the container service on these hosts, or
> whether locks aren't being released properly.
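A quick way to tell whether that DB file is genuinely locked (rather than
unreadable) might be to poke it directly on the storage node. The path below
is the one from the traceback; adjust it to whatever your node actually has:

$ lsof /srv/node/slot-3/containers/11126/5ab/56edb8dfe26326806d33c3c73aeb65ab/56edb8dfe26326806d33c3c73aeb65ab.db
# shows which swift processes currently have the file open

$ sqlite3 /srv/node/slot-3/containers/11126/5ab/56edb8dfe26326806d33c3c73aeb65ab/56edb8dfe26326806d33c3c73aeb65ab.db \
    'SELECT count(*) FROM object;'
# a plain read: "Error: database is locked" means another writer is holding
# the lock; a number means the DB itself opens fine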
>
>
> On Mon, Dec 23, 2013 at 12:13 PM, Stephen Wood <smwood4 at gmail.com> wrote:
>
>> I've run into a very annoying problem with my Swift cluster and I'm
>> hoping somebody can help me out. Out of any given 100 uploads, I'll usually
>> see between 1% and 4% of the calls return a 404. The swift client reports
>> the following:
>>
>> Object PUT failed: http://swift:8080/v1/AUTH_swift/test1/042 404 Not
>> Found [first 60 chars of response] <html><h1>Not Found</h1><p>The resource
>> could not be found.<
>>
>> I'm uploading everything to a single container, and there are only 100
>> files in there, each less than 10 KB. For auth I'm using tempauth. The 404s
>> are totally random: sometimes I don't see any, sometimes I see them 5% of
>> the time.
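For what it's worth, a loop like the one below (the account/user/key are
placeholders for whatever you configured in tempauth) is usually enough to
reproduce this kind of intermittent 404 rate:

for i in $(seq -w 1 100); do
  echo "payload $i" > /tmp/obj-$i
  # upload each small object and flag any failure
  swift -A http://swift:8080/auth/v1.0 -U <account:user> -K <key> \
      upload test1 /tmp/obj-$i || echo "FAILED: obj-$i"
done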
>>
>> On the proxies, the error message looks like this:
>>
>> Dec 23 09:00:08 proxy01 proxy-server 10.xx.xx.xx 10.xxx.xxx.xxx
>> 23/Dec/2013/09/00/08 PUT /v1/AUTH_swift/test1/002 HTTP/1.0 404 - -
>> AUTH_tk275e94194c2447a787e57ec2574789f3 - 70 -
>> txf2066f1fa5ec403db9791-0052b7fb98 - 0.0367 - -
>>
>> On the storage servers I see this:
>>
>> Dec 23 08:49:11 storage1 container-server 10.xx.xx.xxx - -
>> [23/Dec/2013:08:49:11 +0000] "HEAD /slot-8/11126/AUTH_swift/test1" 404 -
>> "tx92750eeef61640a3929f4-0052b7f907" "HEAD
>> http://swift:8080/v1/AUTH_swift/test1" "proxy-server 10178" 0.0002
>>
>> Note that these are not necessarily from the same request; they're just
>> the messages that show up.
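One thing that might make correlation easier is grepping for the transaction
ID of a failed PUT on both the proxy and the storage nodes; every log line
belonging to the same request carries the same tx... id. For the failed PUT
above, something like this (the log path depends on your rsyslog setup):

$ grep 'txf2066f1fa5ec403db9791-0052b7fb98' /var/log/syslog
# run this on the proxy and on each storage node that swift-get-nodes lists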
>>
>> Here's what I have for my proxy-server.conf:
>>
>> [DEFAULT]
>> bind_port = 8080
>> workers = 8
>> user = swift
>> log_statsd_host = statsd
>> log_statsd_port = 8125
>> log_statsd_default_sample_rate = 1
>> log_statsd_metric_prefix = proxy01
>>
>> [pipeline:main]
>> pipeline = healthcheck proxy-logging cache swift3 tempauth proxy-logging
>> proxy-server
>>
>> [app:proxy-server]
>> use = egg:swift#proxy
>> allow_account_management = true
>> account_autocreate = true
>>
>> [filter:proxy-logging]
>> use = egg:swift#proxy_logging
>>
>> [filter:swift3]
>> use = egg:swift3#swift3
>>
>> [filter:tempauth]
>> use = egg:swift#tempauth
>> [some user] .reseller_admin
>>
>> [filter:healthcheck]
>> use = egg:swift#healthcheck
>>
>> [filter:cache]
>> use = egg:swift#memcache
>> memcache_servers = 10.xx.xx.xx:11211,10.xx.xx.xx:11211,10.xx.xx.xx:11211,
>> 10.xx.xx.xx:11211,10.xx.xx.xx:11211
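Since tempauth tokens and account/container info are cached in memcache, it
might also be worth confirming that every proxy can reach all five memcached
instances; one dead or unreachable server could make results flap. A rough
check from each proxy (substitute your real memcache IPs):

for m in <memcache-ip-1> <memcache-ip-2> <memcache-ip-3> <memcache-ip-4> <memcache-ip-5>; do
  # -z: just test the TCP connection, -w 2: two-second timeout
  nc -z -w 2 $m 11211 && echo "$m reachable" || echo "$m NOT reachable"
done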
>>
>> On the storage backends I'm using the following:
>>
>> object-server.conf
>>
>> [DEFAULT]
>> bind_ip = 10.xx.xx.xxx
>> workers = 16
>> log_facility = LOG_LOCAL4
>>
>> [pipeline:main]
>> pipeline = object-server
>>
>> [app:object-server]
>> use = egg:swift#object
>>
>> [object-replicator]
>>
>> [object-updater]
>>
>> [object-auditor]
>>
>> The container and account servers look identical to this.
>>
>> Any ideas?
>>
>>
>> --
>> Stephen Wood
>> www.heystephenwood.com
>>
>
>
>
> --
> Stephen Wood
> www.heystephenwood.com
>