[openstack-dev] UTF-8 required charset/encoding for openstack database?
rpodolyaka at mirantis.com
Mon Mar 10 18:08:24 UTC 2014
AFAIK, most OpenStack projects enforce tables to be created with the
encoding set to UTF-8 because MySQL has horrible defaults and would
use latin1 otherwise. PostgreSQL must default to the locale of a
system on which it's running. And, I think, most systems default to
Actually, I can't think of a reason, why would you want to use
anything else than UTF-8 for storing and exchanging of textual data.
I'd recommend to reconsider your encoding settings for PostgreSQL.
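To illustrate the point, a minimal pure-stdlib Python sketch of why the same character produces different bytes under latin1 and UTF-8; SQL_ASCII validates nothing, so either byte sequence can end up stored, and decoding with the wrong codec later fails:

```python
# "ò" (U+00F2) encodes differently under latin1 and UTF-8. A database
# using SQL_ASCII will happily store either byte sequence verbatim.
text = "\u00f2"                       # "ò"
latin1_bytes = text.encode("latin1")  # one byte: b'\xf2'
utf8_bytes = text.encode("utf-8")     # two bytes: b'\xc3\xb2'

assert latin1_bytes == b"\xf2"
assert utf8_bytes == b"\xc3\xb2"
assert latin1_bytes != utf8_bytes
```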
On Mon, Mar 10, 2014 at 10:24 AM, Chris Friesen
<chris.friesen at windriver.com> wrote:
> I'm using havana and recently we ran into an issue with heat related to
> character sets.
> In heat/db/sqlalchemy/api.py in user_creds_get() we call
> _decrypt() on an encrypted password stored in the database and then try to
> convert the result to unicode. Today we hit a case where this errored out
> with the following message:
> UnicodeDecodeError: 'utf8' codec can't decode byte 0xf2 in position 0:
> invalid continuation byte
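That traceback is easy to reproduce in plain Python: 0xf2 is a valid UTF-8 lead byte (it opens a four-byte sequence), but the bytes that follow must be continuation bytes in the 0x80-0xBF range. Latin1-encoded data violates that, producing exactly the error quoted above. A minimal sketch:

```python
# 0xf2 starts a four-byte UTF-8 sequence, so the decoder expects
# continuation bytes (0x80-0xBF) next. Plain ASCII bytes after it
# trigger "invalid continuation byte".
raw = b"\xf2abc"
try:
    raw.decode("utf-8")
except UnicodeDecodeError as exc:
    # 'utf-8' codec can't decode byte 0xf2 in position 0:
    # invalid continuation byte
    print(exc)
```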
> We're using postgres and currently all the databases are using SQL_ASCII as
> the charset.
> I see that in icehouse heat will complain if you're using mysql and not
> using UTF-8. There don't seem to be any checks for other databases.
> It looks like devstack creates most databases as UTF-8 but uses latin1 for
> nova/nova_bm/nova_cell. I assume this is because nova expects to migrate
> the db to UTF-8 later. Given that those migrations specify a character set
> only for mysql, when using postgres should we explicitly default to UTF-8
> for everything?
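The dialect split described above can be sketched with a hypothetical helper (the function name and structure are illustrative, not Heat's or Nova's actual migration code): a per-table charset clause only exists, and is only needed, on MySQL, while PostgreSQL fixes the encoding when the database is created.

```python
def create_table_ddl(dialect: str, table: str) -> str:
    """Hypothetical helper: build CREATE TABLE DDL for one dialect.

    Only MySQL supports a per-table character set clause; PostgreSQL
    sets the encoding once, at database creation time, e.g.
    CREATE DATABASE heat ENCODING 'UTF8'.
    """
    base = "CREATE TABLE {0} (id INTEGER PRIMARY KEY, data TEXT)".format(table)
    if dialect == "mysql":
        # Without this, MySQL falls back to its server default (latin1).
        return base + " DEFAULT CHARACTER SET utf8"
    return base


print(create_table_ddl("mysql", "user_creds"))
print(create_table_ddl("postgresql", "user_creds"))
```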
> OpenStack-dev mailing list
> OpenStack-dev at lists.openstack.org