Hi,

I’m reaching out to give you an update on the second round of surveys that the OpenInfra Foundation community managers have been running. Our goal has been to understand the challenges that contributors and maintainers in OpenStack face, in order to improve the experience for both groups. To protect the anonymity of survey respondents, we are sharing anonymized and aggregated results reflecting feedback we received throughout OpenStack. As a next step we will reach out to project teams who received more than one survey response, starting with the team(s) that received the most, to share team-specific feedback and understand how we can help address the issues raised.

Highlights in trends comparing the first [1] and second rounds of survey analysis:

* Maintainer survey:
  + In the second round, we got 40% fewer responses, with 36% fewer teams participating.
  + 63% of the Flamingo OpenStack Maintainer Experience survey respondents filled out the Epoxy maintainer survey as well.
  + Overall, Flamingo maintainer survey respondents rated their experience more positively than what was shared for the Epoxy release cycle.
  + Getting review attention is still the biggest challenge.
  + In the second round we got responses from maintainers who don’t contribute code themselves.
  + Maintainers had growing challenges with finding time to review non-priority changes, reviewing changes that were submitted without blueprints or specs, and finding time for in-depth reviews.
  + While maintainers still had challenges with change owners becoming unresponsive, this was much less common than what was reported for Epoxy.
* Contributor survey:
  + Results of the Flamingo OpenStack Contributor Experience survey are mostly comparable with results from the Epoxy contributor survey.
  + The biggest difference was the mention of broken CI.
  + While review attention got fewer mentions, it is still the biggest issue contributors face.
  + Issues with review feedback quality, both shallow initial reviews and overly nitpicky feedback, also came up more in recent survey results compared to the previous round.

Below please find the broader analysis of the two surveys:

__MAINTAINER SURVEY__

We received 9 responses from 7 project teams, with only 1 team receiving more than one response.

We asked respondents to rate the below statements between 1 (very bad) and 5 (excellent), which averaged at the following:

- Code review - Changes you propose are reviewed in a timely manner: 4
- Code review - You receive actionable feedback from other reviewers: 4.44
- Code review - Automated test failures quickly direct you to problems in your changes: 4.33
- Contributor docs - It is comprehensive, covering processes and good practices: 4.33
- Contributor docs - It is up to date: 4

The survey question about how much maintainers rely on pointing contributors to the contributor documentation didn’t display its options correctly in this round, so we excluded it from the analysis.

We added a new question to find out what project teams are using to track their priorities. Please note that respondents were allowed to choose as many options as were applicable to them.
The below numbers are the percentages of respondents who relied on each option:

- Gerrit change topics - 100%
- Reporting in team meetings - 100%
- Reporting in IRC or on mailing lists - 89%
- Etherpad or similar collaborative document - 67%
- Custom review dashboards or queries - 67%
- Specification documents - 56%
- Review-Priority or similar Gerrit votes - 44%
- Launchpad Blueprints - 44%
- Gerrit hashtags - 33%
- Kanban lists - 11%

The below numbers are the percentages of project teams who relied on each option:

- Gerrit change topics - 100%
- Reporting in team meetings - 100%
- Reporting in IRC or on mailing lists - 86%
- Etherpad or similar collaborative document - 86%
- Custom review dashboards or queries - 86%
- Specification documents - 57%
- Review-Priority or similar Gerrit votes - 57%
- Launchpad Blueprints - 43%
- Gerrit hashtags - 29%
- Kanban lists - 14%

There is a small difference between the tools listed by individual maintainers and the overall tool usage reported per project team. This can imply that maintainers of the same team are unaware of tools being used by other maintainers, and that teams lack consensus on which tools and methods to use.

The survey also asked respondents to mark which listed issues they face while trying to land their own changes, as well as while trying to land other contributors’ changes. Please note that respondents were allowed to choose as many options as were applicable to them.

The below numbers are the percentages of respondents who faced each challenge:

- Landing their own changes:
  - Trouble getting review attention - 56%
  - Other - 22%:
    + Maintainers didn’t contribute code: 11%
    + No issues: 11%

People listed 4 additional challenges in the Epoxy round.

- Landing other people’s changes:
  - Change is lower priority than others - 56%
  - Change owner/stakeholders are unreachable to discuss change - 56%
  - Change is beyond expertise - 44%
  - Change owner can't/won't add missing pieces to the change - 44%
  - Change broken, doesn't pass CI - 44%
  - Change owner's response is slow - 33%
  - Change needs a BP/spec - 22%
  - Other (lack of time to review changes in depth) - 22%
  - The change isn't interesting to me - 11%

The below numbers are the percentages of project teams who faced each challenge:

- Landing their own changes:
  - I have trouble getting the attention of reviewers for my change - 57%
  - Other - 28%:
    + Maintainer did not contribute code or docs - 14%
    + No issues - 14%
- Landing other people’s changes:
  - Change is lower priority than others - 71%
  - Change owner/stakeholders are unreachable to discuss change - 57%
  - Change is beyond expertise - 57%
  - Change owner can't/won't add missing pieces to the change - 43%
  - Change broken, doesn't pass CI - 43%
  - Change owner's response is slow - 43%
  - Change needs a BP/spec - 29%
  - Other (lack of time to review changes in depth) - 29%
  - The change isn't interesting to me - 14%

The main challenge for maintainers in the Flamingo cycle was still getting review attention when contributing themselves, but this time around they didn’t report other issues. When trying to land other people’s changes, individuals faced the biggest challenges with review priorities and change owners becoming unreachable, while at the team level broken CI, slow change owner responses, and owners not being able to fix issues in their changes were also among the most common issues.
Issues which were mentioned more often in the Flamingo survey responses included changes being low priority, maintainers not having the expertise, and changes needing a blueprint or spec.

__CONTRIBUTOR SURVEY__

We received 31 responses from 17 project teams, with 29% of these teams receiving more than one response. 52% of respondents were new contributors in the Flamingo release cycle, which is twice as many as in the Epoxy survey responses.

We asked respondents to rate the below statements between 1 (very bad) and 5 (excellent), which averaged at the following:

- Contributor experience - Changes you propose are reviewed in a timely manner: 3.52
- Contributor experience - You receive actionable feedback from other reviewers: 4.39
- Contributor experience - Automated test failures quickly direct you to problems in your changes: 3.9
- Contributor documentation - You were able to find information about the processes the project team is using: 3.97
- Contributor documentation - It helped you to apply better practices throughout your contribution journey and achieve results faster: 4.16
- Contributor documentation - It is easy to discover: 3.68
- Contributor documentation - It is easy to navigate: 3.68

We added a new question to find out where contributors look for the priorities their project teams are tracking. Please note that respondents were allowed to choose as many options as were applicable to them.

The below numbers are the percentages of respondents who relied on each option:

- Checking in IRC or on mailing lists - 54.84%
- Checking Launchpad Blueprints - 48.39%
- Checking team meeting discussions or logs - 45.16%
- Checking Gerrit change topics - 41.94%
- Checking specification documents - 32.26%
- Checking Review-Priority or similar Gerrit votes - 25.81%
- Checking Gerrit hashtags - 22.58%
- I don't know how the team is tracking priorities - 16.13%
- Checking StoryBoard Stories - 12.90%
- Checking team's tracking etherpad or similar collaborative document - 12.90%
- I'm not sure if the team is tracking priorities - 9.68%
- Checking custom review dashboards or queries - 9.68%
- Other - 9.68%
- Checking Kanban lists - 0%

The below numbers are the percentages of project teams who relied on each option:

- Checking in IRC or on mailing lists - 64.71%
- Checking Launchpad Blueprints - 47.06%
- Checking team meeting discussions or logs - 58.82%
- Checking Gerrit change topics - 47.06%
- Checking specification documents - 35.29%
- Checking Review-Priority or similar Gerrit votes - 47.06%
- Checking Gerrit hashtags - 35.29%
- I don't know how the team is tracking priorities - 23.53%
- Checking StoryBoard Stories - 23.53%
- Checking team's tracking etherpad or similar collaborative document - 17.65%
- I'm not sure if the team is tracking priorities - 17.65%
- Checking custom review dashboards or queries - 17.65%
- Other - 17.65%
- Checking Kanban lists - 0%

Inconsistencies in team priority tracking can indicate a disconnect between where maintainers define, highlight, and follow team priorities and where contributors look for them. The biggest disconnect seems to be Gerrit change topics and team meetings, which all maintainers seem to rely on but only about half of the contributors use or follow.
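As an illustration of how a contributor could check the Gerrit-based tracking methods above, here is a minimal Python sketch that queries the public Gerrit REST API on review.opendev.org. The project name, topic, and hashtag in the queries are placeholder examples, not any team's actual conventions:

    import json
    import requests

    # Gerrit search operators matching tracking methods from the survey:
    # change topics, hashtags, and Review-Priority label votes. The
    # project, topic, and hashtag names below are placeholders.
    GERRIT_URL = "https://review.opendev.org/changes/"
    queries = [
        "status:open project:openstack/example topic:bp/example-feature",
        "status:open project:openstack/example hashtag:priority",
        "status:open project:openstack/example label:Review-Priority=1",
    ]

    for query in queries:
        resp = requests.get(GERRIT_URL, params={"q": query, "n": 10})
        resp.raise_for_status()
        # Gerrit prefixes JSON responses with ")]}'" to prevent XSSI,
        # so strip the first line before parsing.
        changes = json.loads(resp.text.split("\n", 1)[1])
        print(f"{query} -> {len(changes)} open change(s)")
        for change in changes:
            print(f"  {change['_number']}: {change['subject']}")

The same search strings also work directly in the Gerrit web UI search box, which is often all a contributor needs to find a team's priority changes.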
The survey also asked respondents to mark which listed issues they face while trying to land their own changes. Please note that respondents were allowed to choose as many options as were applicable to them.

The below numbers are the percentages of respondents who faced each challenge:

- Trouble getting review attention - 48%
- Other - 45%
- Unable to determine priorities - 13%
- Reviewers keep coming back with new requests - 11%
- Need more clarification on feedback to move forward - 10%
- Reviewer expectation or consensus shifts over time - 6%
- Expected to expand scope to address other project issues - 13%
- Reviewers disagree or give conflicting feedback - 3%
- Asked to make changes deviating from past consensus - 6%
- Change reviewed too thoroughly - 3%
- Shallow initial reviews - 6%
- No changes pass CI - 19%

The below numbers are the percentages of project teams who faced each challenge:

- Trouble getting review attention - 64%
- Other - 59%
- Unable to determine priorities - 18%
- Reviewers keep coming back with new requests - 11%
- Need more clarification on feedback to move forward - 6%
- Reviewer expectation or consensus shifts over time - 6%
- Expected to expand scope to address other project issues - 18%
- Reviewers disagree or give conflicting feedback - 6%
- Asked to make changes deviating from past consensus - 6%
- Change reviewed too thoroughly - 6%
- Shallow initial reviews - 12%
- No changes pass CI - 29%

While the official survey deadline has passed, we are still looking for feedback. If you missed filling out the survey(s), please provide your input now!

~ Please fill out this survey separately for every OpenStack project you contributed to during the Flamingo release: https://openinfrafoundation.formstack.com/forms/openstack_contributor_experi...
~ Please fill out this survey separately for every OpenStack project you were a core reviewer of during the Flamingo release: https://openinfrafoundation.formstack.com/forms/openstack_maintainer_experie...

Thanks,
Ildikó

[1] https://lists.openstack.org/archives/list/openstack-discuss@lists.openstack....

———
Ildikó Váncsa
Director of Community
OpenInfra Foundation