[openstack-dev] [all] cyclomatic complexity check in flake8 & Python 3
Thomas Herve
therve at redhat.com
Tue Dec 12 14:04:03 UTC 2017
On Tue, Dec 12, 2017 at 2:08 PM, Thierry Carrez <thierry at openstack.org> wrote:
> Zane Bitter wrote:
>> We have around 60 repos using the 'mccabe' package to do cyclomatic
>> complexity checks in flake8 in their gate jobs.[1] Based on advice
>> received at the time[2], the version of mccabe that we use in the gate
>> is effectively pinned to 0.2.1 by hacking (the project, not... never
>> mind).[3]
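>>
>> For anyone unfamiliar with how this is wired up: the check is driven by
>> the flake8 configuration in each repo, usually a [flake8] section in
>> tox.ini. The threshold here is purely illustrative, not any particular
>> project's setting:
>>
>>     [flake8]
>>     # mccabe's C901 "too complex" check only runs when max-complexity
>>     # is set; functions scoring above the threshold fail the pep8 job
>>     max-complexity = 20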
>>
>> This is starting to cause problems, because version 0.2.1 does not work
>> with Python 3. By 'does not work' I don't mean that it refuses to run; I
>> mean that it returns completely different numbers. (As far as I can tell
>> they're always the same or lower.) tox creates the pep8 environment
>> using the version of python that tox itself is running in, so as distros
>> increasingly prefer to install packages using python3, people will
>> increasingly be running flake8 under python3, and they'll run into
>> situations like the one I had today where their pep8 tests pass locally
>> but fail in the gate (which still runs under python2). When the gate
>> switches to python3, as I assume it eventually must, we'll get bogus
>> results.
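>>
>> Concretely, the mismatch comes from tox.ini entries along these lines
>> (a simplified sketch, not Heat's exact configuration):
>>
>>     [testenv:pep8]
>>     commands = flake8 {posargs}
>>
>> With no basepython line, the pep8 env is built with whatever interpreter
>> tox itself runs under, so a python3-based tox produces python3 numbers
>> locally while the gate job still measures under python2. Pinning
>> basepython = python2.7 would paper over the mismatch, but only until the
>> gate itself moves to python3.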
>>
>> Unfortunately, moving to a later version is complicated by the fact that
>> 0.2.1 was... not good software. There is exactly one regression test.[4]
>> And if you're about to ask whether it was to check the trivial case that
>> the function:
>>
>> def foo():
>>     return None
>>
>> has a cyclomatic complexity of 1, then the joke is on you because it
>> actually returns 2 for that function.
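>>
>> For comparison, the textbook definition is one plus the number of
>> decision points (if, elif, for, while, and, or, except, ...), so a
>> straight-line function should score 1 and a single branch should score
>> 2. A rough sketch of what sane output ought to look like:
>>
>>     def foo():
>>         return None      # no branches -> complexity 1
>>
>>     def bar(x):
>>         if x:            # one decision point -> complexity 2
>>             return 1
>>         return 0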
>>
>> Later versions have more tests and appear to be consistent with each
>> other and across python versions, but they return completely different
>> results to 0.2.1. And the results are mostly higher, so if we bump the
>> requirements many of those 60 projects will break.
>>
>> (I could also go into details on how the numbers that later versions
>> report are also wrong in different ways, but I'll save it for another day.)
>>
>> Here's how the allegedly most complex function in Heat[5] shakes out,
>> for example:
>>
>> mccabe version   py27   py36
>> 0.2.1              14      6
>> 0.3.1              23     23
>> 0.6.1              23     23
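>>
>> One way to reproduce this kind of comparison, for anyone curious (the
>> file path is just a placeholder, not necessarily where that function
>> lives):
>>
>>     python2 -m flake8 --select=C901 --max-complexity=1 heat/engine/resource.py
>>     python3 -m flake8 --select=C901 --max-complexity=1 heat/engine/resource.py
>>
>> With --max-complexity=1 every function gets flagged, and the number in
>> each C901 message is the score mccabe computed under that interpreter.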
>>
>> I don't have a solution to suggest, but I burned most of my day digging
>> into this so I just wanted to let y'all know how screwed we are.
> Thanks for digging into this, Zane. I'd say we have two options, based
> on how useful running those tests is:
>
> 1/ If they are useful, we should bump the version, at the same time
> adjusting max-complexity per repo to match the most complex function
> (with some slack factor). Encourage more projects to use cyclomatic
> complexity checks and fix the biggest offenders using cycle goals.
>
> 2/ If they are not useful, just drop cyclomatic complexity tests overall
> rather than relying on wrong results and wasting CPU cycles.
Hugely in favor of the second option. Even if it were working properly, I
don't think it was ever a useful test for Heat. We end up either
bumping the max or moving a conditional into a separate function, which
doesn't solve anything.
--
Thomas