[openstack-dev] [qa] running heat & horizon unit tests on client changes (was [nova] request to tag novaclient 2.18.0)

Steve Baker sbaker at redhat.com
Wed Jul 23 13:51:22 UTC 2014


On 18/07/14 08:35, Matt Riedemann wrote:
>
>
> On 7/17/2014 5:48 PM, Steve Baker wrote:
>> On 18/07/14 00:44, Joe Gordon wrote:
>>>
>>>
>>>
>>> On Wed, Jul 16, 2014 at 11:28 PM, Steve Baker <sbaker at redhat.com> wrote:
>>>
>>>     On 12/07/14 09:25, Joe Gordon wrote:
>>>>
>>>>
>>>>
>>>>     On Fri, Jul 11, 2014 at 4:42 AM, Jeremy Stanley
>>>>     <fungi at yuggoth.org> wrote:
>>>>
>>>>         On 2014-07-11 11:21:19 +0200 (+0200), Matthias Runge wrote:
>>>>         > this broke horizon stable and master; heat stable is
>>>>         > affected as well.
>>>>         [...]
>>>>
>>>>         I guess this is a plea for applying something like the
>>>>         oslotest framework to client libraries so they get
>>>>         backward-compat jobs run against unit tests of all
>>>>         dependent/consuming software... branchless tempest already
>>>>         alleviates some of this, but not the case of changes in a
>>>>         library which will break unit/functional tests of another
>>>>         project.
>>>>
>>>>
>>>>     We actually do have some tests for backwards compatibility, and
>>>>     they all passed, presumably because both heat and horizon have
>>>>     poor integration tests.
>>>>
>>>>     We ran
>>>>
>>>>       * check-tempest-dsvm-full-havana
>>>>         <http://logs.openstack.org/66/94166/3/check/check-tempest-dsvm-full-havana/8e09faa>
>>>>         SUCCESS in 40m 47s (non-voting)
>>>>       * check-tempest-dsvm-neutron-havana
>>>>         <http://logs.openstack.org/66/94166/3/check/check-tempest-dsvm-neutron-havana/b4ad019>
>>>>         SUCCESS in 36m 17s (non-voting)
>>>>       * check-tempest-dsvm-full-icehouse
>>>>         <http://logs.openstack.org/66/94166/3/check/check-tempest-dsvm-full-icehouse/c0c62e5>
>>>>         SUCCESS in 53m 05s
>>>>       * check-tempest-dsvm-neutron-icehouse
>>>>         <http://logs.openstack.org/66/94166/3/check/check-tempest-dsvm-neutron-icehouse/a54aedb>
>>>>         SUCCESS in 57m 28s
>>>>
>>>>
>>>>     on the offending patches (https://review.openstack.org/#/c/94166/)
>>>>
>>>>     Infra patch that added these tests:
>>>>     https://review.openstack.org/#/c/80698/
>>>>
>>>>
>>>     Heat-proper would have continued working fine with novaclient
>>>     2.18.0. The regression was with raising novaclient exceptions,
>>>     which is only required in our unit tests. I saw this break coming
>>>     and switched to raising via from_response
>>>     https://review.openstack.org/#/c/97977/22/heat/tests/v1_1/fakes.py
>>>
>>>     Unit tests tend to deal with more internals of client libraries
>>>     just for mocking purposes, and there have been multiple breaks in
>>>     unit tests for heat and horizon when client libraries make
>>>     internal changes.
>>>
>>>     This could be avoided if the client gate jobs ran the unit tests
>>>     of the projects that consume them.
>>>
>>> That may work but isn't this exactly what integration testing is for?
>> If you mean tempest then no, this is different.
>>
>> Client projects have done a good job of keeping their public library
>> APIs stable. An exception type is public API, but the constructor for
>> raising that type is arguably more of a gray area, since only the
>> client library should be raising its own exceptions.
>>
>> However, heat and horizon unit tests need to raise client exceptions
>> to test their own error-condition handling, so exception constructors
>> could be considered public API, but only for unit-test mocking in
>> other projects.
>>
>> This problem couldn't have been caught in an integration test because
>> nothing outside the unit tests directly raises a client exception.
>>
>> There have been other breakages where internal client library changes
>> have broken the mocking in our unit tests (I recall a neutronclient
>> internal refactor).
>>
>> In many cases the cause may be inappropriate mocking in the unit tests,
>> but that is cold comfort when the gate breaks every time a client
>> library is released.
>>
>> Maybe we can just start with adding heat and horizon to the check jobs
>> of the clients they consume, but every project matched by the
>> following should also be considered:
>> grep "python-.*client" */requirements.txt
>>
>> This could give client libraries more confidence that internal changes
>> don't break anything, and allows them to fix mocking in other projects
>> before their changes land.
>>
>>
>>
>>
>
> I don't think we should have to change the gate jobs just so that
> other projects can test against the internals of the client libraries
> they depend on; that sounds like a flawed unit test design to me.
>
> Looking at
> https://review.openstack.org/#/c/97977/22/heat/tests/v1_1/fakes.py for
> example, why is a fake_exception needed to mock out novaclient's
> NotFound exception?  A better way is to have whatever is expected to
> raise the NotFound use mock with a side_effect that raises
> novaclient.exceptions.NotFound; mock then handles the spec being set
> on the mock, and you don't have to worry about the internal
> construction of the exception class in your unit tests.
>
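The side_effect approach described above can be sketched like this (a minimal sketch: `NotFound` and `ServerManager` are stand-ins for `novaclient.exceptions.NotFound` and the real servers manager, since the whole point is that the test never touches the exception's real constructor internals):

```python
from unittest import mock  # in 2014 this was the external "mock" package

class NotFound(Exception):
    """Stand-in for novaclient.exceptions.NotFound; the real constructor
    arguments differ between releases, which is what this thread is about."""

class ServerManager:
    """Stand-in for the novaclient servers manager."""
    def get(self, server_id):
        raise NotImplementedError("the real implementation calls the Nova API")

def server_exists(servers, server_id):
    """Code under test: handle a missing server gracefully."""
    try:
        servers.get(server_id)
    except NotFound:
        return False
    return True

# autospec keeps the mock honest about the manager's real interface;
# side_effect raises the exception without hand-building a fake response.
servers = mock.create_autospec(ServerManager, instance=True)
servers.get.side_effect = NotFound("server not found")
assert server_exists(servers, "1234-abcd") is False
```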
fakes.py is ancient and doesn't reflect how we write tests currently.
Our mox tests use assertRaises (less ancient) and our mock tests use
side_effect (current). However, all three approaches are in the same
situation: it is not possible to construct a NotFound object in a way
that works with both novaclient 2.17.0 and 2.18.0.
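To illustrate the shape of that break concretely (with hypothetical stand-in classes; the real novaclient constructor signatures differ, but the incompatibility works the same way): a fixture that calls a 2.17-style constructor positionally stops working once a 2.18-style constructor lands, and a small from_response-style factory puts the breakage in one place instead of in every test:

```python
class NotFound217(Exception):
    """Hypothetical 2.17-style constructor: NotFound(code, message=None)."""
    def __init__(self, code, message=None):
        super().__init__(message or "Not found")
        self.code = code

class NotFound218(Exception):
    """Hypothetical 2.18-style constructor: NotFound(message=None, **kwargs)."""
    http_status = 404
    def __init__(self, message=None, **kwargs):
        super().__init__(message or "Not found")

def fake_not_found(exc_class):
    """Factory in the spirit of from_response: one call site per fixture,
    so a constructor change is fixed here once, not in every test."""
    try:
        return exc_class(404, "fake: server not found")   # 2.17-style call
    except TypeError:
        return exc_class("fake: server not found")        # 2.18-style call

# The same fixture helper yields a raisable exception for both "releases".
assert isinstance(fake_not_found(NotFound217), NotFound217)
assert isinstance(fake_not_found(NotFound218), NotFound218)
```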

And we've just been hit by this again: our progress on getting heat
juno-2 released stopped when keystoneclient 0.10.0 was released, which
required an urgent fix [1].

This is the kind of change which would be discovered early if heat and
horizon unit tests ran against python-*client changes. I don't even
think these jobs need to be voting; they would give us enough early
warning that our mocking technique needs revisiting. They would also
give client developers an idea of the impact of their changes, and let
them consider whether they are actually changing public API.

[1] https://review.openstack.org/#/c/108875/1
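If such jobs were added, they would presumably live in the infra
project config. A rough sketch of what a zuul layout entry might look
like (job and project names here are hypothetical, not the actual
openstack-infra configuration):

```yaml
# Hypothetical zuul layout fragment: run heat and horizon unit-test jobs
# in the check pipeline of a client library. Job names are invented.
projects:
  - name: openstack/python-novaclient
    check:
      - check-heat-python27-with-proposed-novaclient
      - check-horizon-python27-with-proposed-novaclient
```

Non-voting at first, as suggested above, so a break warns rather than
blocks.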
