Upcoming metrics dashboard changes
Hi OpenStack community,

Since the OpenInfra Foundation joined the Linux Foundation, we have been looking into opportunities to take advantage of the frameworks and tool sets that the Linux Foundation offers, some of which you and your organizations are already familiar with.

While the dashboards are still being built, I would like to let you know that, as part of that effort, we will switch to LFX Insights[1] as the metrics dashboard for OpenInfra projects, including OpenStack, starting on __January 15, 2026.__

A couple of things to note:
- The new OpenStack metrics dashboard[2] is already available, but it is __still under active construction.__
- The set of metrics displayed on the new dashboard will not be identical to the metrics on the dashboard provided by Bitergia.
- LFX Insights does not display affiliation information for individuals who are listed on the dashboard as contributors, for privacy reasons. However, if you already have an LF account[3], you will be able to check and update your affiliation information. Or, you can create a new LF account[4].
- You can learn more about the dashboard in the LFX Insights documentation[5].
- If your organization is already an OpenInfra Foundation / Linux Foundation member, the LF Organization Dashboard[6] is also available for a more company-centric view.

The OpenStack Bitergia metrics dashboard[7] remains available __until January 14, 2026.__

Please let me know if you have any questions.

Best Regards,
Ildikó

[1] https://insights.linuxfoundation.org/
[2] https://insights.linuxfoundation.org/project/OpenStack/
[3] https://openprofile.dev/
[4] https://docs.linuxfoundation.org/lfx/sso/create-an-account
[5] https://insights.linuxfoundation.org/docs/introduction/what-is-insights/
[6] https://lfx.linuxfoundation.org/tools/organization-dashboard/
[7] https://openstack.biterg.io/

———
Ildikó Váncsa
Director of Community
OpenInfra Foundation
Hey Ildiko,

I know I'm a week early, so if this is getting cleaned up, that's awesome. As it is, though, the new dashboards look extremely anemic compared to the existing ones -- I've yet to find any of my use cases solved by the new dashboard. I feel like there must be some large piece of this puzzle I'm missing.

I have basically three things I use this data for:

- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top billing on the original stackalytics dashboard) and is an important aspect of evaluating existing and new reviewers for a project.
- The ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
- The ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos, not just "ironic" the repo) for either of the two use cases above.

Can you help me figure out how to perform these tasks on the new dashboard? I'm really getting the impression I must be missing some more advanced dig-down.

Thanks,
JayF

On 12/16/25 11:29 AM, Ildiko Vancsa wrote:
Hi OpenStack community,
Since the OpenInfra Foundation joined the Linux Foundation, we have been looking into opportunities to take advantage of the frameworks and tool sets that the Linux Foundation offers, some of which you and your organizations are already familiar with.
While the dashboards are still being built, I would like to let you know that, as part of that effort, we will switch to LFX Insights[1] as the metrics dashboard for OpenInfra projects, including OpenStack, starting on __January 15, 2026.__
A couple of things to note:
- The new OpenStack metrics dashboard[2] is already available, but it is __still under active construction.__
- The set of metrics displayed on the new dashboard will not be identical to the metrics on the dashboard provided by Bitergia.
- LFX Insights does not display affiliation information for individuals who are listed on the dashboard as contributors, for privacy reasons. However, if you already have an LF account[3], you will be able to check and update your affiliation information. Or, you can create a new LF account[4].
- You can learn more about the dashboard in the LFX Insights documentation[5].
- If your organization is already an OpenInfra Foundation / Linux Foundation member, the LF Organization Dashboard[6] is also available for a more company-centric view.
The OpenStack Bitergia metrics dashboard[7] remains available __until January 14, 2026.__
Please let me know if you have any questions.
Best Regards,
Ildikó
[1] https://insights.linuxfoundation.org/
[2] https://insights.linuxfoundation.org/project/OpenStack/
[3] https://openprofile.dev/
[4] https://docs.linuxfoundation.org/lfx/sso/create-an-account
[5] https://insights.linuxfoundation.org/docs/introduction/what-is-insights/
[6] https://lfx.linuxfoundation.org/tools/organization-dashboard/
[7] https://openstack.biterg.io/
———
Ildikó Váncsa
Director of Community
OpenInfra Foundation
Hi Jay,

Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.

Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.

I reflected on each of your questions in more detail below.

Best Regards,
Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top-billing on the original stackalytics dashboard) and is an important aspect for evaluation of existing and new reviewers for a project.
This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, 1, 2), the number of reviews, and the frequency of a person performing reviews. Is that accurate? While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. On the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, at the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you no longer want the filter applied, you can click 'All repositories' at the top of that drop-down list window to switch back to the view of the entire OpenStack project. Is that what you've been looking for?
[...]
Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.

For what it's worth, I agree with Jay: the info currently available in the dashboard is not very useful for anything I would have looked at in the old dashboards.

On 11/01/2026 22:11, Ildiko Vancsa wrote:
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top billing on the original stackalytics dashboard) and is an important aspect of evaluating existing and new reviewers for a project.

This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, 1, 2), the number of reviews, and the frequency of a person performing reviews. Is that accurate?
This is useful, but we also used to track disagreement, i.e. a negative comment where there is a +2, or a positive comment where there is a -2. The other gap we had, which used to cause people to game the system, was the fact that we didn't track when a review was left with Code-Review=0. By not being able to see the neutral comments, we could not see engagement in the discussion, and people would end up putting +/-1, making it a less strong signal. I'm not sure if that can be addressed in their system, but at a minimum we should try to restore the old behavior where we have review stats.
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
For me, I think it would be similar to Jay: the review stats and the ability to filter by project (a repo or set of projects, i.e. all of Ironic or all of Nova etc.), plus the ability to include/exclude companies/individuals and filter on date ranges, which is the content included in the Gerrit Approvals dashboard: https://openstack.biterg.io/app/dashboards?security_tenant=openstack#/view/95487340-6762-11e9-a198-67126215b112?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-90d,to:now))&_a=(description:'Gerrit%20Approvals%20panel%20by%20Bitergia',filters:!(('$state':(store:appState),meta:(alias:Bots,disabled:!f,index:gerrit,key:author_bot,negate:!t,params:(query:!t),type:phrase),query:(match:(author_bot:(query:!t,type:phrase)))),('$state':(store:appState),meta:(alias:!n,disabled:!f,index:gerrit,key:project,negate:!f,params:(query:nova),type:phrase),query:(match_phrase:(project:nova)))),fullScreenMode:!f,options:(darkTheme:!f,hidePanelTitles:!f,useMargins:!t),query:(language:lucene,query:''),timeRestore:!f,title:'Gerrit%20Approvals',viewMode:view)

I very rarely look at anything outside of that dashboard, but I do occasionally skim the others.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. On the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
That maybe has some of the info, but it is presented in a very unfriendly way in a tiny popup window, and you can't really tell what approvals vs merges vs comments actually are.

https://www.stackalytics.io/?module=ironic-group and https://www.stackalytics.io/report/contribution?module=ironic-group&project_type=openstack&days=30 are likely the two views I would have used most in Stackalytics.

https://openstack.biterg.io/app/dashboards#/view/95487340-6762-11e9-a198-67126215b112?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-90d,to:now))&_a=(description:'Gerrit%20Approvals%20panel%20by%20Bitergia',filters:!(('$state':(store:appState),meta:(alias:Bots,disabled:!f,index:gerrit,key:author_bot,negate:!t,params:(query:!t),type:phrase),query:(match:(author_bot:(query:!t,type:phrase)))),('$state':(store:appState),meta:(alias:!n,disabled:!f,index:gerrit,key:project,negate:!f,params:(query:ironic),type:phrase),query:(match_phrase:(project:ironic)))),fullScreenMode:!f,options:(darkTheme:!f,hidePanelTitles:!f,useMargins:!t),query:(language:lucene,query:''),timeRestore:!f,title:'Gerrit%20Approvals',viewMode:view) was the closest to that view in https://openstack.biterg.io.

Without seeing the distribution of a contributor's votes and how that aligned with the core team's reviews, knowing they did 100 reviews in 90 days tells me very little. I.e., if they only ever leave +1, and do so on patches that the core team -1 or -2, that tells me that they are not doing a detailed review, or are still learning and don't have the context to review the changes properly.

One metric that has historically been missing is +/-1 without comments. That's not always a red flag, as I sometimes forget to tick the box when I hit reply and go back and add my vote without a comment the second time. But again, taking the 100 reviews in 90 days example: if 90 of those +1s have no comment or explanation as to why, that basically tells me I should disregard reviews from that person, because they are not reviewing properly.
https://docs.openstack.org/project-team-guide/review-the-openstack-way.html
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, on the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you don't want the filter to be applied anymore, you can click on 'All repositories' on the top of that top-down list window to switch back to the view of entire OpenStack. Is that what you've been looking for?
That's a cumbersome way to do it, but I guess it might work if you have only a couple of repos in the governance of a project team. Both Stackalytics and Bitergia had a mapping from https://opendev.org/openstack/governance/src/branch/master/reference/project... and provided a single drop-down to select all of the Ironic projects in one go. So on Stackalytics each official project team had a "<team> Official" option in modules, and in Bitergia we had teams and repos as separate filters.

Jay may have a different perspective than me, but stackalytics.io still provides more useful info than the new dashboard, and the new dashboard is much less user friendly/flexible than either of the prior ones.

I wonder if there is an API we can pull from, like Stackalytics provided, so we can visualise things ourselves? I don't actually use these often; I mostly check in once or twice a cycle to see how engagement is changing, but it's not like I could not do my work without these.
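[Editorial note: whether LFX Insights exposes a pull API is an open question in this thread, but the underlying review data is already reachable through Gerrit's standard REST API on review.opendev.org, independent of any dashboard. A minimal sketch: fetch merged changes for a project with label detail and tally per-reviewer Code-Review votes. The `/changes/` endpoint, the `)]}'` XSSI prefix, and the `DETAILED_LABELS` option are standard Gerrit; the tally helper is purely illustrative, and treating value 0 as "listed without a vote" is an assumption about how you'd want to count neutral reviewers.]

```python
import json
import urllib.request

GERRIT = "https://review.opendev.org"

def parse_gerrit_json(raw: str):
    """Gerrit prefixes JSON responses with a ")]}'" line to defeat XSSI;
    strip it before decoding."""
    prefix = ")]}'"
    if raw.startswith(prefix):
        raw = raw[len(prefix):]
    return json.loads(raw)

def count_code_review_votes(changes):
    """Tally per-reviewer Code-Review votes from change records fetched with
    o=DETAILED_LABELS. A value of 0 covers reviewers listed without a vote."""
    tally = {}
    for change in changes:
        for vote in change.get("labels", {}).get("Code-Review", {}).get("all", []):
            name = vote.get("name", "unknown")
            value = vote.get("value", 0)
            tally.setdefault(name, {}).setdefault(value, 0)
            tally[name][value] += 1
    return tally

def fetch_changes(project: str, n: int = 100):
    """Fetch the n most recent merged changes for a project, with label detail."""
    url = (f"{GERRIT}/changes/?q=project:{project}+status:merged"
           f"&o=DETAILED_LABELS&n={n}")
    with urllib.request.urlopen(url) as resp:
        return parse_gerrit_json(resp.read().decode("utf-8"))
```

For example, `count_code_review_votes(fetch_changes("openstack/ironic"))` would give a per-reviewer breakdown of -2/-1/0/+1/+2 counts over the last 100 merged changes, which covers part of the reviewer-stats use case discussed above.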
[...]
Hi Sean,

Thank you for the detailed information on how you've been using the metrics dashboards, it is very helpful. Please allow me a little bit of time to go through the pointers you gave and get a better understanding of your use cases and needs.

In the meantime, I asked the LFX Insights folks if there's a way for our communities to retrieve data from the new dashboards, for a more personalized way of visualizing it. I'll get back to you as soon as I hear back from that team.

Thanks and Best Regards,
Ildikó
On Jan 12, 2026, at 05:01, Sean Mooney <smooney@redhat.com> wrote:
Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.

For what it's worth, I agree with Jay: the info currently available in the dashboard is not very useful for anything I would have looked at in the old dashboards.

On 11/01/2026 22:11, Ildiko Vancsa wrote:
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top billing on the original stackalytics dashboard) and is an important aspect of evaluating existing and new reviewers for a project.

This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, 1, 2), the number of reviews, and the frequency of a person performing reviews. Is that accurate?
This is useful, but we also used to track disagreement, i.e. a negative comment where there is a +2, or a positive comment where there is a -2.
The other gap we had, which used to cause people to game the system, was the fact that we didn't track when a review was left with Code-Review=0.
By not being able to see the neutral comments, we could not see engagement in the discussion, and people would end up putting +/-1, making it a less strong signal.
I'm not sure if that can be addressed in their system, but at a minimum we should try to restore the old behavior where we have review stats.
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
For me, I think it would be similar to Jay: the review stats and the ability to filter by project (a repo or set of projects, i.e. all of Ironic or all of Nova etc.), plus the ability to include/exclude companies/individuals and filter on date ranges,
which is the content included in the Gerrit Approvals dashboard.
I very rarely look at anything outside of that dashboard, but I do occasionally skim the others.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. On the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
That maybe has some of the info, but it is presented in a very unfriendly way in a tiny popup window, and you can't really tell what approvals vs merges vs comments actually are.
https://www.stackalytics.io/?module=ironic-group and https://www.stackalytics.io/report/contribution?module=ironic-group&project_type=openstack&days=30
are likely the two views I would have used most in Stackalytics.
The Gerrit Approvals dashboard was the closest to that view in https://openstack.biterg.io.
Without seeing the distribution of a contributor's votes and how that aligned with the core team's reviews, knowing they did 100 reviews in 90 days tells me very little.
I.e., if they only ever leave +1, and do so on patches that the core team -1 or -2, that tells me that they are not doing a detailed review, or are still learning and don't have the context to review the changes properly.
One metric that has historically been missing is +/-1 without comments. That's not always a red flag, as I sometimes forget to tick the box when I hit reply and go back and add my vote without a comment the second time. But again, taking the 100 reviews in 90 days example: if 90 of those +1s have no comment or explanation as to why, that basically tells me I should disregard reviews from that person, because they are not reviewing properly. https://docs.openstack.org/project-team-guide/review-the-openstack-way.html
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, on the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you don't want the filter to be applied anymore, you can click on 'All repositories' on the top of that top-down list window to switch back to the view of entire OpenStack. Is that what you've been looking for?
That's a cumbersome way to do it, but I guess it might work if you have only a couple of repos in the governance of a project team.
Both Stackalytics and Bitergia had a mapping from https://opendev.org/openstack/governance/src/branch/master/reference/project...
and provided a single drop-down to select all of the Ironic projects in one go. So on Stackalytics each official project team had a "<team> Official" option in modules, and in Bitergia we had teams and repos as separate filters.
Jay may have a different perspective than me, but stackalytics.io still provides more useful info than the new dashboard, and the new dashboard is much less user friendly/flexible than either of the prior ones.
I wonder if there is an API we can pull from, like Stackalytics provided, so we can visualise things ourselves?
I don't actually use these often; I mostly check in once or twice a cycle to see how engagement is changing, but it's not like I could not do my work without these.
[...]
On 1/11/26 2:11 PM, Ildiko Vancsa wrote:
Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top billing on the original stackalytics dashboard) and is an important aspect of evaluating existing and new reviewers for a project.

This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, 1, 2), the number of reviews, and the frequency of a person performing reviews. Is that accurate?
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
"Gerrit approvals" and "Git" -- both with significant filtering. I had a bookmark on biterg, for instance, which would show me the review stats for everyone on my team at G-Research, and I could easily modify it to show me all Ironic reviewers for a comparison. Similarly with git, I would use it to track contributions from members of my team and track health of a project over time through things like company participation diversity metrics.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. On the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
This is what I was looking for. 'All organizations' works OK, but affiliations are broken. I see both a "GR-OSS" (not an entity, just a marketing name for our group) and "G-Research". What's the process for correcting these affiliations? I'm not looking forward to asking my developers to take additional time to update a now-third place for their affiliation data.
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, at the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you no longer want the filter applied, you can click 'All repositories' at the top of that drop-down list window to switch back to the view of the entire OpenStack project. Is that what you've been looking for?
This is not sufficient. Ironic has nearly two dozen repos, and the majority do not contain the string "ironic" -- for instance, networking-baremetal, networking-generic-switch, or sushy. The previous tooling, when sorting for "Ironic", would find all projects under Ironic governance as declared here: https://opendev.org/openstack/governance/src/branch/master/reference/project...
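[Editorial note: the governance mapping Jay refers to lives in the governance repository's reference/projects.yaml, so a team-to-repos lookup can be derived directly from that file. A minimal sketch, assuming the file has been parsed into a dict (e.g. with `yaml.safe_load`) and assuming the layout team → deliverables → deliverable → repos; verify that layout against the file itself before relying on it.]

```python
def team_repos(projects: dict, team: str) -> list[str]:
    """Return every repo owned by a project team, given the parsed contents
    of the governance reference/projects.yaml.

    Assumed layout (hypothetical until checked against the real file):
    {team: {"deliverables": {name: {"repos": [...]}}}}
    """
    repos = []
    for deliverable in projects.get(team, {}).get("deliverables", {}).values():
        repos.extend(deliverable.get("repos", []))
    return sorted(repos)
```

The resulting list could then feed a repository filter in a dashboard, or a Gerrit query built from `project:` terms, giving the "all Ironic-responsible repos" view in one step.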
[...]
Hi Jay,

Thank you for sharing some more details about what you are looking for in the new metrics dashboard.

With regard to affiliations, in my understanding LFX Insights primarily uses individuals' Linux Foundation accounts to propagate affiliation data. In that sense, people will need to create/update their Linux Foundation profile to fix affiliation issues. As we've started migrating to more of the LF tooling, I hope that most folks already have an account and it just needs to be updated in some cases.

Please allow me a little time to check in with the LFX Insights team about the other items to understand what we can do, how, and on what timeline. I'll get back on this thread as soon as I have updates.

Thanks and Best Regards,
Ildikó
On Jan 13, 2026, at 08:57, Jay Faulkner <jay@gr-oss.io> wrote:
On 1/11/26 2:11 PM, Ildiko Vancsa wrote:
Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top billing on the original stackalytics dashboard) and is an important aspect of evaluating existing and new reviewers for a project.

This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, 1, 2), the number of reviews, and the frequency of a person performing reviews. Is that accurate?
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
"Gerrit approvals" and "Git" -- both with significant filtering. I had a bookmark on biterg, for instance, which would show me the review stats for everyone on my team at G-Research, and I could easily modify it to show me all Ironic reviewers for a comparison. Similarly with git, I would use it to track contributions from members of my team and track health of a project over time through things like company participation diversity metrics.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. On the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
This is what I was looking for. 'All organizations' works OK, but affiliations are broken. I see both a "GR-OSS" (not an entity, just a marketing name for our group) and "G-Research". What's the process for correcting these affiliations? I'm not looking forward to asking my developers to take additional time to update a now-third place for their affiliation data.
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, at the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you no longer want the filter applied, you can click 'All repositories' at the top of that drop-down list window to switch back to the view of the entire OpenStack project. Is that what you've been looking for?
This is not sufficient. Ironic has nearly two dozen repos, and the majority do not contain the string "ironic" -- for instance, networking-baremetal, networking-generic-switch, or sushy. The previous tooling, when sorting for "Ironic", would find all projects under Ironic governance as declared here: https://opendev.org/openstack/governance/src/branch/master/reference/project...
[...]
Hi Jay, Sean and All,

Thank you again for providing information about your use of OpenStack community metrics. In order to capture the community's needs, as well as those of individuals and companies, I created an etherpad with the feedback that has surfaced in various conversations so far: https://etherpad.opendev.org/p/lfx_insights_metrics_use_cases

I added use cases and wishlist items, as well as questions and comments, to the etherpad, along with updates where I had any. Please correct any notes in the etherpad that are not accurate, and add any missing use cases or further comments and questions.

I will use the etherpad to keep working with the LFX Insights team to understand what we can add to the dashboard and when, and how we can enable people to access raw data to use for more specialized needs. I will update the etherpad and get back on this thread as well when I have updates.

I really appreciate y'all's input and help in understanding how you've been finding value in the metrics dashboards the community has been using up until now. I also ask for your patience and understanding as we work through addressing everyone's needs with the new platform. Please let me know what other ways I can support the community throughout this transition period.

Thanks and Best Regards,
Ildikó
On Jan 13, 2026, at 18:37, Ildiko Vancsa <ildiko.vancsa@gmail.com> wrote:
Hi Jay,
Thank you for sharing some more details about what you are looking for in the new metrics dashboard.
With regard to affiliations, my understanding is that LFX Insights primarily uses individuals' Linux Foundation accounts to propagate affiliation data. In that sense, people will need to create or update their Linux Foundation profile to fix affiliation issues. As we've been starting to migrate to more of the LF tooling, I hope that most folks already have an account and it just needs to be updated in some cases.
Please allow me a little time to check in with the LFX Insights team about the other items to understand what we can do, how, and on what timeline. I'll get back on this thread as soon as I have updates.
Thanks and Best Regards, Ildikó
On Jan 13, 2026, at 08:57, Jay Faulkner <jay@gr-oss.io> wrote:
On 1/11/26 2:11 PM, Ildiko Vancsa wrote:
Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. It is very helpful in understanding what might still be missing from the new dashboard.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the gerrit dashboard in biterg (and got top-billing on the original stackalytics dashboard) and is an important aspect for evaluation of existing and new reviewers for a project. This looks like a use case that needs to be added to LFX Insights; I will need to open a GH Issue to cover it. First, I would like to make sure I understand correctly what you need. If I got that right, you've been looking at reviewer statistics such as review types (-2, -1, +1, +2), the number of reviews, and how frequently a person performs reviews. Is that accurate?
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
"Gerrit approvals" and "Git" -- both with significant filtering. I had a bookmark on biterg, for instance, which would show me the review stats for everyone on my team at G-Research, and I could easily modify it to show me all Ironic reviewers for a comparison. Similarly with git, I would use it to track contributions from members of my team and track health of a project over time through things like company participation diversity metrics.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing. I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. At the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
This is what I was looking for. 'All organizations' works OK, but affiliations are broken: I see both a "GR-OSS" (not an entity, just a marketing name for our group) and "G-Research". What's the process for correcting these affiliations? I'm not looking forward to asking my developers to take additional time to update a now-third place for their affiliation data.
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above. For this one, at the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos whose names include the project's name. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you no longer want the filter applied, you can click on 'All repositories' at the top of that drop-down list window to switch back to the view of the entire OpenStack project. Is that what you've been looking for?
This is not sufficient. Ironic has nearly two dozen repos, and the majority do not contain the string "ironic" -- for instance, networking-baremetal, networking-generic-switch, or sushy. The previous tooling, when filtering for "Ironic", would find all projects under Ironic governance as declared here: https://opendev.org/openstack/governance/src/branch/master/reference/project...
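As an aside for anyone wanting to reproduce the governance-based lookup Jay describes, the resolution from a project name to its governed repos can be sketched from the structure of the governance reference file (projects.yaml maps a project to deliverables, each listing its repos). This is a minimal illustrative sketch, not real tooling: the sample data below is a made-up excerpt, and the `repos_for_project` helper is hypothetical.

```python
# Hypothetical sketch of resolving a project name to every repo under its
# governance, mirroring the deliverables/repos layout of the OpenStack
# governance reference file. Sample data is illustrative, not the real file.

governance = {
    "ironic": {
        "deliverables": {
            "ironic": {"repos": ["openstack/ironic"]},
            "networking-baremetal": {"repos": ["openstack/networking-baremetal"]},
            "sushy": {"repos": ["openstack/sushy"]},
        }
    }
}

def repos_for_project(data, project):
    """Return every repo listed under a project's deliverables, sorted."""
    deliverables = data.get(project, {}).get("deliverables", {})
    return sorted(r for d in deliverables.values() for r in d.get("repos", []))

print(repos_for_project(governance, "ironic"))
```

In practice one would load the real projects.yaml from the governance repository instead of the inline sample, and feed the resulting repo list into whatever per-repo filter the dashboard exposes.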
[...]
participants (4): Ildiko Vancsa, Ildiko Vancsa, Jay Faulkner, Sean Mooney