[openstack-dev] [neutron-vnaas][neutron][qa] Functional/scenario testing for VPNaaS repo

Paul Michali pc at michali.net
Tue Feb 17 14:14:54 UTC 2015


Hi, we need some guidance and feedback on our testing needs for the VPNaaS
repo.

*Background...*

The VPNaaS reference implementation currently uses the open source
OpenSwan application, which provides the IPsec site-to-site connection
functionality. The OpenStack code essentially creates the configuration
files for this application and updates firewall rules for each connection.
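To illustrate what "creating the configuration files" amounts to, here is a minimal sketch of rendering an ipsec.conf-style connection section. The field names and the render_conn() helper are hypothetical; the real drivers use templates shipped with the repo.

```python
# Sketch of rendering one ipsec.conf-style site-to-site connection
# block from connection data. Template fields and helper name are
# invented for illustration, not the actual VPNaaS driver code.

CONN_TEMPLATE = """conn {name}
    left={left}
    leftsubnet={left_subnet}
    right={right}
    rightsubnet={right_subnet}
    authby=psk
    auto=start
"""

def render_conn(conn):
    """Render one site-to-site connection section from a dict."""
    return CONN_TEMPLATE.format(**conn)

if __name__ == "__main__":
    print(render_conn({
        "name": "site-to-site-1",
        "left": "172.24.4.226", "left_subnet": "10.1.0.0/24",
        "right": "172.24.4.227", "right_subnet": "10.2.0.0/24",
    }))
```

The real drivers render a full ipsec.conf plus an ipsec.secrets file per router, but the shape of the output is the same.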

A developer on the VPNaaS sub-team has been implementing a new driver that
uses the open source StrongSwan application (
https://review.openstack.org/#/c/144391/). This uses a different
configuration setup, requires installation on Ubuntu 14.04 (for example),
and requires disabling AppArmor enforcement for the charon and stroke
processes.

The intent here is to replace OpenSwan with StrongSwan as the reference
implementation in the future: it is a newer implementation, has more
features, and is supported on multiple operating systems, and as of Ubuntu
14.10 OpenSwan is being deprecated (it will no longer be installable).

Currently, there are only some API test cases in the Tempest repo for VPN.
There are no functional tests for VPNaaS and, in particular, no scenario
test that ensures the OpenSwan (and now StrongSwan) applications are
properly configured and can create and negotiate an end-to-end connection.


*Goals...*

The goal is to provide functional tests for the device drivers that are
used to control OpenSwan (and now StrongSwan). My guess here is that we
can verify that the right configuration files/directories are created, and
can check the status of the OpenSwan/StrongSwan process for different
operations.
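A functional check along those lines might look like the following sketch. The file names, directory layout, and helper are assumptions for illustration; a real test would point at the directory the driver actually writes.

```python
import os
import tempfile

# Sketch of a functional-style check: the driver is expected to have
# written a config directory; we verify the expected files exist and
# that ipsec.conf contains the directives we configured. File names
# and the helper are invented for illustration.

REQUIRED_FILES = ("ipsec.conf", "ipsec.secrets")

def verify_config_dir(config_dir, expected_directives):
    """Return a list of problems found in a generated config directory."""
    problems = []
    for fname in REQUIRED_FILES:
        if not os.path.isfile(os.path.join(config_dir, fname)):
            problems.append("missing file: %s" % fname)
    conf_path = os.path.join(config_dir, "ipsec.conf")
    if os.path.isfile(conf_path):
        with open(conf_path) as f:
            text = f.read()
        for directive in expected_directives:
            if directive not in text:
                problems.append("missing directive: %s" % directive)
    return problems

if __name__ == "__main__":
    # Simulate a driver-generated directory, then verify it.
    d = tempfile.mkdtemp()
    with open(os.path.join(d, "ipsec.conf"), "w") as f:
        f.write("conn site1\n    left=172.24.4.226\n")
    with open(os.path.join(d, "ipsec.secrets"), "w") as f:
        f.write(': PSK "secret"\n')
    print(verify_config_dir(d, ["left=172.24.4.226"]))  # prints []
```

The process-status half of the check would query the swan process (e.g. via its status command) the same way the driver does, but that needs a devstack host and so belongs in the dsvm-functional job.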

In addition, a scenario test is strongly desired (at least by me :) to
ensure that the feature indeed works (negotiating a connection and being
able to pass traffic between the two nodes).
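The "pass traffic" half of such a scenario test boils down to a reachability check between the two sides. As a stand-in sketch (the real tests ping or SSH across the negotiated tunnel; this just shows the shape of the check against a local listener):

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.

    A stand-in for the ping/SSH reachability check a real scenario
    test would perform across the IPsec tunnel.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Exercise the check against a throwaway local listener.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    print(can_reach("127.0.0.1", port))  # prints True
    srv.close()
```

In the real scenario test the two endpoints are VMs on subnets behind the two VPN routers, so a successful check demonstrates that the tunnel was negotiated and is carrying traffic.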

Personally, I'd like to see us get something in place for K-3, even if it
is limited in nature, as we've been 2+ releases without any
functional/scenario tests.

*Where we are today...*

As part of the StrongSwan driver implementation (
https://review.openstack.org/#/c/144391/), a functional test is being
developed. It currently checks the configuration files generated.

In addition, there are currently two implementations of a scenario test for
VPNaaS out for review (https://review.openstack.org/#/c/140072, and
https://review.openstack.org/#/c/153292/5) that developers have been
working on. Both of these are targeted for the Tempest repo. One does a
ping check and the other an SSH check.

I'm thinking of, but have not started, implementing functional tests for
the OpenSwan driver (if the community thinks this makes sense, given it
will be deprecated).

My understanding is that the Neutron tests in the Tempest repo are being
migrated into the Neutron repo, and a tempest library developed.


*Questions/guidance needed...*

With the scenario tests, there are several questions...

1) Is there a preference (from a Tempest standpoint) of one of the scenario
tests over the other (both do the same thing, just differently)?

2) Should an exception be made to the decision to not allow *aaS tests to
be added to Tempest? The two scenario test implementations mentioned above
were created for the Tempest repo (because they are based on two different
abandoned designs from 1/2014 and 7/2014). The tests could be migrated to
Neutron (and later the VPNaaS repo) as part of that migration process.

3) If not, when is it expected that the migration will be done (wondering
if this could make K-3)?

4) When will the tempest library be available, if we're waiting for the
migration to complete before using the test in the VPNaaS repo?

5) Instead of being based on Tempest, and waiting for migration, could the
scenario test be adapted to run in the existing functional test area of the
VPNaaS repo as a dsvm-functional test (and would it make sense to go that
route)?

6) Since the StrongSwan test has different setup requirements than
OpenSwan, will we need separate tempest jobs?


For functional tests (of the device drivers), there are several questions...

7) Because of the setup differences, the thought was to create two
functional jobs, one for StrongSwan and one for OpenSwan. Does that sound
right?

8) Should there be two root directories (tests/functional/openswan and
tests/functional/strongswan) or should there be one root (tests/functional)
using sub-directories and filters to select modules for the two jobs?

9) Would the latter scheme be better, in case there are tests that are
common to both implementations (and could be placed in tests/functional)?
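For the single-root scheme in questions 8/9, the two jobs could share one tree and select their modules with test regex filters. A hypothetical tox.ini sketch (the env names, paths, and regexes are assumptions, not the repo's actual config):

```ini
# Hypothetical fragment: one functional tree, two jobs selecting
# modules by regex; common tests live directly under tests/functional.
[testenv:dsvm-functional-openswan]
commands = python setup.py testr --testr-args='tests\.functional\.(common|openswan)'

[testenv:dsvm-functional-strongswan]
commands = python setup.py testr --testr-args='tests\.functional\.(common|strongswan)'
```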

10) The checking of the config files could (I think) be done without a
devstack environment. Should those checks be done in unit tests, or is it
better to keep all tests related to a specific driver in the functional
test area (testing the config file and querying the status of the process)?

General...

11) Are there other testing approaches that we are missing or should
consider (and that we should be doing first, to meet our goals)?


Essentially, the VPNaaS team is looking for guidance and direction on how
to meet the goals above (get some minimal functional and scenario tests in
place, in time for K-3), and limit back-tracking / rework as much as
possible.

Please advise. Thanks!


Paul Michali

IRC............ pc_m (irc.freenode.com)
Twitter....... @pmichali

