<div dir="ltr">Hi need some guidance and feedback on our needs for testing in the VPNaaS repo.<div><br></div><div><b>Background...</b></div><div><br></div><div>The VPNaaS reference implementation is currently using the open source OpenSwan application that provides the IPSec site-to-site connection functionality. The OpenStack code essentially creates the configuration files for this app, and updates firewall rules for a connection.</div><div><br></div><div>A developer on the VPNaaS sub-team has been implementing a new driver that uses the open source StrongSwan application (<a href="https://review.openstack.org/#/c/144391/">https://review.openstack.org/#/c/144391/</a>). This uses a different configuration setup, requires installation on Ubuntu 14.04 (for example), and disabling of AppArmor (to not enforce for the charon and stoke process).</div><div><br></div><div>The intent here is to replace OpenSwan, with StrongSwan as the reference implementation in the future, as it is a newer implementation, has more features, is supported on multiple operating systems now, and for Ubuntu 14.10, OpenSwan is beeing deprecated (no longer be installed).</div><div><br></div><div>Currently, there is only some API test cases in the Tempest repo for VPN. There are no functional tests for VPNaaS, and in particular, no scenario test that ensures that OpenSwan (and now StrongSwan) apps are properly configured and can create and negotiate an end to end connection.</div><div><br></div><div><br></div><div><b>Goals...</b></div><div><br></div><div>The goal is to provide functional tests for the device drivers, that are used to control the OpenSwan (and now StrongSwan). My guess here is that we can verify that the right configuation files/directories are created, and can check the status of the OpenSwan/StrongSwan process for different operations.</div><div><br></div><div>In addition a scenario test is strongly desired (at least by me :) to ensure that the feature indeed works (negotiating a connection and able to pass traffic between the two nodes).</div><div><br></div><div>Personally, I'd like to see us get something in place for K-3, even if it is limited in nature, as we've been 2+ releases without any functional/scenario tests.</div><div><br></div><div><b>Where we are today...</b></div><div><br></div><div>As part of the StrongSwan driver implementation (<a href="https://review.openstack.org/#/c/144391/">https://review.openstack.org/#/c/144391/</a>), a functional test is being developed. It currently checks the configuration files generated.</div><div><br></div><div>In addition, there are currently two implementations of a scenario test for VPNaaS out for review (<a href="https://review.openstack.org/#/c/140072">https://review.openstack.org/#/c/140072</a>, and <a href="https://review.openstack.org/#/c/153292/5">https://review.openstack.org/#/c/153292/5</a>) that developers have been working on. Both of these are targeted for the Tempest repo. 
I'm thinking of, but have not started, implementing functional tests for the OpenSwan driver (if the community thinks this makes sense, given that it will be deprecated).

My understanding is that the Neutron tests in the Tempest repo are being migrated into the Neutron repo, and that a Tempest library is being developed.

Questions/guidance needed...

With the scenario tests, there are several questions...

1) Is there a preference (from a Tempest standpoint) for one of the scenario tests over the other (both do the same thing, just differently)?

2) Should an exception be made to the decision to not allow *aaS tests to be added to Tempest? The two scenario test implementations mentioned above were created for the Tempest repo (they are based on two different abandoned designs from 1/2014 and 7/2014). The test could then be migrated to the Neutron (and later VPNaaS) repo as part of that migration process.

3) If not, when is the migration expected to be done (wondering if this could make K-3)?

4) When will the Tempest library be available, if we're waiting for the migration to complete before using the test in the VPNaaS repo?

5) Instead of being based on Tempest, and waiting for migration, could the scenario test be adapted to run in the existing functional test area of the VPNaaS repo as a dsvm-functional test (and would it make sense to go that route)?

6) Since the StrongSwan test has different setup requirements than OpenSwan, will we need separate Tempest jobs?

For functional tests (of the device drivers), there are several questions...

7) Because of the setup differences, the thought was to create two functional jobs, one for StrongSwan and one for OpenSwan. Does that sound right?

8) Should there be two root directories (tests/functional/openswan and tests/functional/strongswan), or should there be one root (tests/functional) using sub-directories and filters to select the modules for the two jobs?

9) Would the latter scheme be better, in case there are tests that are common to both implementations (and could be placed in tests/functional)?

10) The checking of the config files could (I think) be done without a devstack environment. Should those checks be done in unit tests, or is it better to keep all tests related to a specific driver in the functional test area (testing the config files and querying the status of the process)? A rough sketch of the kind of check I have in mind follows these questions.

General...

11) Are there other testing approaches that we are missing or should consider (and that we should be doing first, to meet our goals)?
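For what it's worth, here is the rough sketch of the config-file check referred to in question 10 (and in the Goals section). The directory and file names are assumptions modeled on typical *Swan layouts, not necessarily what the drivers actually render; a real test would point at whatever directory the driver under test writes into:

    # Minimal sketch: given the directory a driver rendered its config
    # into, verify the expected files exist and that the connection shows
    # up in ipsec.conf. CONF_DIR and the file names are assumptions, not
    # the driver's actual output layout.
    import os
    import unittest

    CONF_DIR = '/tmp/ipsec-test/etc'  # hypothetical; use the driver's real dir

    class TestIpsecConfigGeneration(unittest.TestCase):

        def test_expected_files_rendered(self):
            for name in ('ipsec.conf', 'ipsec.secrets'):
                path = os.path.join(CONF_DIR, name)
                self.assertTrue(os.path.exists(path),
                                '%s was not rendered' % path)

        def test_connection_defined(self):
            # Each site-to-site connection should appear as a 'conn'
            # section in the generated ipsec.conf.
            with open(os.path.join(CONF_DIR, 'ipsec.conf')) as f:
                self.assertIn('conn ', f.read())

Checks like these need no devstack environment; only the companion checks that query the status of a live charon/pluto process would need the dsvm-functional setup.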
Essentially, the VPNaaS team is looking for guidance and direction on how to meet the goals above (get some minimal functional and scenario tests in place, in time for K-3) and limit back-tracking/rework as much as possible.

Please advise. Thanks!

Paul Michali

IRC............ pc_m (irc.freenode.com)
Twitter....... @pmichali