[openstack-dev] [Neutron] Scaling our testing needs
Armando M.
armamig at gmail.com
Thu Dec 3 02:19:34 UTC 2015
Hi Neutrinos,
I would like to share a proposal with you on how we could scale to meet our
ever-growing testing needs and, at the same time, provide guidance to the
developers who care about the quality of the work they produce and who want
to protect it from dreadful regressions!
In [1] you will see a Decision Tree: the objective is to guide us in
deciding on the best testing strategy for a feature, and on the type of
support to expect from our CI engine. In a nutshell, these are the basic
options we can choose from (a rough sketch of the first two follows the
list):
- Unit testing
- Functional testing
- API testing (*)
- Fullstack testing
- Scenario testing (**)
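To make the distinction concrete, here is a rough, hypothetical sketch of
how the first two options differ in practice. The helper and test classes
are made up for illustration (they are not Neutron's actual base classes),
and the functional test assumes a Linux host:

    # Toy example: the same helper exercised at two different levels.
    import subprocess
    import unittest
    from unittest import mock


    def device_exists(name):
        """Report whether a network device exists on the host."""
        return subprocess.call(
            ["ip", "link", "show", name],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0


    class DeviceExistsUnitTest(unittest.TestCase):
        """Unit level: fully in-memory; the system boundary is mocked."""

        def test_missing_device(self):
            with mock.patch("subprocess.call", return_value=1):
                self.assertFalse(device_exists("tap-fake"))


    class DeviceExistsFunctionalTest(unittest.TestCase):
        """Functional level: runs against the real host, no mocks."""

        def test_loopback_exists(self):
            self.assertTrue(device_exists("lo"))


    if __name__ == "__main__":
        unittest.main()

Fullstack and scenario testing extend the same idea further up the stack:
more real components, fewer fakes, and a higher cost per test.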
The bottom line is: we should tap into our existing infrastructure as much
as we can, to minimize the number of CI moving parts needed to test the
numerous features that Neutron has. Some work has to happen (as
highlighted below) before we can make this decision process as smooth as
possible. In particular:
(*) The API testing framework allows us to execute tempest-style API tests
against a live Neutron server. Assaf and I are working to address the
overlap between Tempest and this framework, and to ensure that there is a
clear demarcation between the tests that belong in Tempest and the tests
that belong in the Neutron codebase. More will follow in another thread, so
stay tuned. Separately, some work needs to go into this testing framework
to make the server load non-default extensions.
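For those unfamiliar with the framework, the gist is that such a test
drives neutron-server purely through its REST API. Below is a hand-wavy
sketch using python-neutronclient with made-up devstack-style credentials;
the real framework wires credentials, clients and cleanup through its own
base classes:

    import unittest

    from neutronclient.v2_0 import client as neutron_client


    class NetworkAPITest(unittest.TestCase):

        @classmethod
        def setUpClass(cls):
            # Hypothetical credentials; adjust for your environment.
            cls.client = neutron_client.Client(
                username="admin", password="secret",
                tenant_name="admin",
                auth_url="http://127.0.0.1:5000/v2.0")

        def test_create_show_delete_network(self):
            # Full create/show/delete round trip over the API.
            body = self.client.create_network(
                {"network": {"name": "api-test-net"}})
            net_id = body["network"]["id"]
            try:
                shown = self.client.show_network(net_id)
                self.assertEqual(
                    "api-test-net", shown["network"]["name"])
            finally:
                self.client.delete_network(net_id)


    if __name__ == "__main__":
        unittest.main()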
(**) This is something that Ihar started in [2]. I am still unclear on how
we intend to leverage this job while also keeping scenario-like tests for
hardcore Neutron features in the Neutron tree. Ihar is thinking QoS, I may
be thinking DPDK, someone else might be thinking of something different
yet again (don't get bogged down in the actual examples). We'll iterate on
[2], but I see it as an integral part of our end-to-end testing strategy.
More to follow.
As for new additional jobs that may come and go: I believe that we have to
revisit our approach to introducing experimental and non-voting jobs, how
we make them stable and ultimately run them as gate jobs, and, long term,
how we have the features they test supplant the ones they are meant to
supersede (think pecan, or DVR). I have an idea on how to address the
non-voting neglect conundrum, but I'll tackle it in another thread.
Finally, a word about projects that need testing with Neutron, advanced
services and beyond. The same way the neutron-lib initiative is tackling
the Python side of the Neutron codebase right now, we'll have to work to
identify the common pieces of functionality that will allow the testing
machinery to be reused across multiple projects, in a modular and
decoupled fashion, so that when Neutron-related projects want to integrate
with the Neutron CI engine, they do so using good patterns rather than bad
ones; I believe that [3] is an example of the latter, and that's why I
have been pushing back.
Feedback welcome, as I am sure I may have overlooked something.
Cheers,
Armando
[1] https://wiki.openstack.org/wiki/Network/Testing
[2] https://review.openstack.org/#/c/247697/
[3] https://review.openstack.org/#/c/242925/