Hi team,

A few weeks ago we enabled the horizon-integration-tests job [1]. It's a set of Selenium-based test cases that verify Horizon works as expected from the user's perspective. Like any new job, it runs in non-voting mode for now.

During the PTG, I had several conversations with project teams about how useful it would be to have such tests in each plugin, to verify that the plugin works correctly with the current Horizon version. We have about 30 plugins in the Plugin Registry [2]. Honestly, without any kind of testing in most of the plugins, we can't be sure that they work well with the current version of Horizon.

That's why we decided to implement smoke tests for plugins based on the Horizon integration tests framework. These tests should verify that a plugin is installed and that its pages can be opened in a browser. We will run these tests on the experimental queue and/or on a schedule on the Horizon gates, to verify that plugins are maintained and working properly.

My idea is to keep a list of 'tested' plugins, so we can add a 'Maintained' label to the Plugin Registry. Once these jobs become voting, we can add a 'Verified' label. I think the following schedule looks reasonable:

* Stein-Train release cycles - add non-voting jobs for each maintained plugin and introduce the "Maintained" label
* Train-U release cycles - make the stable jobs voting and introduce the "Verified" label in the Horizon Plugin Registry

I understand that some teams don't have enough resources to maintain integration tests, so I'm stepping up as a volunteer to introduce such tests and jobs for those projects. I have already published patches for the Vitrage and Heat [3] plugins and will do the same for the Ironic and Manila dashboards shortly. Any help or feedback is welcome :).

[1] https://review.openstack.org/#/c/580469/
[2] https://docs.openstack.org/horizon/latest/install/plugin-registry.html
[3] https://review.openstack.org/#/q/topic:horizon-integration-tests+(status:ope...)
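For teams wondering what the gate wiring looks like, a non-voting job on the Zuul side is roughly the following. This is a hedged sketch only: the job name, parent, and project names here are assumptions for illustration, not copied from the published patches.

```yaml
# Hypothetical Zuul config sketch; actual names and parents may differ.
- job:
    name: horizon-integration-tests-vitrage
    parent: horizon-integration-tests
    required-projects:
      - openstack/horizon
      - openstack/vitrage-dashboard

- project:
    check:
      jobs:
        - horizon-integration-tests-vitrage:
            voting: false
```

Once a job like this has been stable for a cycle, flipping it to voting is just dropping the `voting: false` line, which is what the Train-U step above amounts to.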
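To make the "plugin is installed and pages open" idea concrete, here is a minimal sketch of what such a smoke check boils down to. This is not the actual horizon-integration-tests framework code: the page paths and the helper name are hypothetical, and the real tests drive a browser through Selenium rather than doing plain HTTP fetches.

```python
# Hypothetical smoke-check sketch, NOT the horizon-integration-tests code.
# Idea: for each dashboard page a plugin registers, verify the page loads.
from urllib.parse import urljoin
from urllib.request import urlopen

# Example page paths a plugin might register; real paths depend on the plugin.
PLUGIN_PAGES = ["/project/vitrage/", "/project/stacks/"]

def smoke_check(base_url, pages, fetch=urlopen):
    """Return a list of (page, status) pairs for pages that failed to load.

    `fetch` is injectable so the check can be exercised without a live
    Horizon deployment; by default it performs a real HTTP request.
    """
    failures = []
    for page in pages:
        url = urljoin(base_url, page)
        try:
            status = fetch(url).status
        except Exception:
            status = None  # connection refused, DNS failure, etc.
        if status != 200:
            failures.append((page, status))
    return failures
```

A real run would be something like `smoke_check("http://my-horizon/dashboard/", PLUGIN_PAGES)` against a devstack deployment; an empty result means every page loaded.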
Regards,
Ivan Kolodyazhny,
http://blog.e0ne.info/