Hi Jay,
Thank you so much for reviewing the new dashboard and sharing your use cases and questions. This is very helpful for understanding what might still be missing.
Historically, LFX Insights has been tailored more towards GitHub and the data that platform provides. The Gerrit-based dashboards, like the one for OpenStack, are a newer addition, and in that regard some of the metrics you're asking for are not (yet) available.

On 11/01/2026 22:11, Ildiko Vancsa wrote:

For what it's worth, I agree with Jay; the info currently available in the dashboard is not very useful for anything I would have looked at in the old dashboards.
I reflected on each of your questions in more detail below.
Best Regards, Ildikó
[...]
I have basically three things I use this data for:
- Individual review numbers, from Gerrit, indicating frequency and types of review -- preferably with the ability to compare between reviewers. This was from the Gerrit dashboard in biterg (and got top billing on the original Stackalytics dashboard) and is an important aspect for evaluation of existing and new reviewers for a project.

This looks like a use case that needs to be added to LFX Insights; I will need to open a GH issue to cover it. First, I would like to make sure my understanding of what you need is correct. If I got it right, you've been looking at reviewer statistics such as review types (-2, -1, +1, +2), the number of reviews, and the frequency of a person performing reviews. Is that accurate?
That is useful, but we also used to track disagreement, i.e. a negative comment where there is a +2, or a positive comment where there is a -2. The other gap we had, which used to cause people to game the system, was that we didn't track when a review was left with CodeReview=0. By not being able to see the neutral comments, we could not see engagement in the discussion, and people would end up putting +/-1 instead, making it a less strong signal. I'm not sure if that can be addressed in their system, but at a minimum we should try to restore the old behavior where we have review stats.
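In case it helps picture what restoring those stats would take: the raw data is all reachable from Gerrit's REST API, via a changes query with the o=DETAILED_LABELS option. A rough sketch follows; the host, the query, and the "disagreement" heuristic are my own assumptions for illustration, not anything the old dashboards necessarily did.

```python
import json
from collections import Counter, defaultdict
from urllib.request import urlopen

GERRIT = "https://review.opendev.org"  # assumed host for illustration


def fetch_changes(query, limit=100):
    """Fetch change records with per-reviewer Code-Review votes attached."""
    url = f"{GERRIT}/changes/?q={query}&o=DETAILED_LABELS&n={limit}"
    with urlopen(url) as resp:
        body = resp.read().decode()
    # Gerrit prefixes JSON responses with ")]}'" on its own line (XSSI guard).
    return json.loads(body.split("\n", 1)[1])


def vote_stats(changes):
    """Per-reviewer counts of each Code-Review value, 0 votes included,
    plus a simple 'disagreement' count: a negative vote on a change that
    someone else gave a +2 (my heuristic, not an official metric)."""
    stats = defaultdict(Counter)
    disagreements = Counter()
    for change in changes:
        votes = change.get("labels", {}).get("Code-Review", {}).get("all", [])
        values = [(v.get("name", "?"), v.get("value", 0)) for v in votes]
        for name, value in values:
            stats[name][value] += 1
        if any(value == 2 for _, value in values):
            for name, value in values:
                if value < 0:
                    disagreements[name] += 1
    return stats, disagreements
```

Something like `vote_stats(fetch_changes("project:openstack/ironic status:merged"))` would then surface the 0 votes and the disagreements that the current dashboard hides.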
While we will not be able to recreate the Bitergia dashboard in the new tool, it would be helpful for me to visualize how you've been using it. Can you share the list of widgets that you've been continuously relying on for reviewer evaluation?
For me, I think it would be similar to Jay: the review stats, and the ability to filter by project (a repo or a set of repos, i.e. all of Ironic or all of Nova, etc.), the ability to include/exclude companies/individuals, and filtering on date ranges, which is the content included in the Gerrit Approvals dashboard https://openstack.biterg.io/app/dashboards?security_tenant=openstack#/view/95487340-6762-11e9-a198-67126215b112?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-90d,to:now))&_a=(description:'Gerrit%20Approvals%20panel%20by%20Bitergia',filters:!(('$state':(store:appState),meta:(alias:Bots,disabled:!f,index:gerrit,key:author_bot,negate:!t,params:(query:!t),type:phrase),query:(match:(author_bot:(query:!t,type:phrase)))),('$state':(store:appState),meta:(alias:!n,disabled:!f,index:gerrit,key:project,negate:!f,params:(query:nova),type:phrase),query:(match_phrase:(project:nova)))),fullScreenMode:!f,options:(darkTheme:!f,hidePanelTitles:!f,useMargins:!t),query:(language:lucene,query:''),timeRestore:!f,title:'Gerrit%20Approvals',viewMode:view) I very rarely look at anything outside of that dashboard, but I do occasionally skim the others.
- Ability to easily compare contributions by company OUTSIDE of the top 5. The lack of metrics outside the top 5 is disappointing.
I assume you've been looking at the 'Organizations leaderboard' on the 'Contributors' tab. At the bottom of the widget there's an 'All organizations' label, which is a button. If you click it, it'll open up a sidebar that gives you the full list of contributing organizations. Is that what you've been looking for?
That maybe has some of the info, but it is presented in a very unfriendly way in a tiny popup window, and you can't really tell what approvals vs. merges vs. comments really are. https://www.stackalytics.io/?module=ironic-group and https://www.stackalytics.io/report/contribution?module=ironic-group&project_type=openstack&days=30 are likely the two views I would have used most in Stackalytics. https://openstack.biterg.io/app/dashboards#/view/95487340-6762-11e9-a198-67126215b112?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-90d,to:now))&_a=(description:'Gerrit%20Approvals%20panel%20by%20Bitergia',filters:!(('$state':(store:appState),meta:(alias:Bots,disabled:!f,index:gerrit,key:author_bot,negate:!t,params:(query:!t),type:phrase),query:(match:(author_bot:(query:!t,type:phrase)))),('$state':(store:appState),meta:(alias:!n,disabled:!f,index:gerrit,key:project,negate:!f,params:(query:ironic),type:phrase),query:(match_phrase:(project:ironic)))),fullScreenMode:!f,options:(darkTheme:!f,hidePanelTitles:!f,useMargins:!t),query:(language:lucene,query:''),timeRestore:!f,title:'Gerrit%20Approvals',viewMode:view) was the closest to that view in https://openstack.biterg.io. Without seeing the distribution of a contributor's votes, and how that aligned with the core team's reviews, knowing they did 100 reviews in 90 days tells me very little. I.e., if they only ever leave +1, and do so on patches that the core team -1s or -2s, that tells me that they are not doing a detailed review, or are still learning and don't have the context to review the changes properly. One metric that has historically been missing is +/-1 votes without comments. That's not always a red flag, as I sometimes forget to tick the box when I hit reply, and go back and add my vote without a comment the second time; but again, taking the 100-reviews-in-90-days example, if 90 of those +1s have no comment or explanation as to why, that basically tells me I should disregard reviews from that person, because they are not reviewing properly.
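On the "+1 without comments" metric: it should be derivable from the same Gerrit change data if the query also asks for o=MESSAGES. A hedged sketch; the message-format heuristic (a vote renders as a "Patch Set N: Code-Review+1" first line, with any reviewer comment after it) is my assumption about how Gerrit formats these, so it would need checking against real data.

```python
from collections import Counter


def uncommented_plus_ones(changes):
    """Per-reviewer count of +1 votes left without any explanation.

    Heuristic: in change JSON queried with o=MESSAGES, each review message
    starts with a vote line such as "Patch Set 2: Code-Review+1"; if nothing
    follows that first line, treat the +1 as uncommented.
    """
    counts = Counter()
    for change in changes:
        for msg in change.get("messages", []):
            first, _, rest = msg.get("message", "").partition("\n")
            if "Code-Review+1" in first and not rest.strip():
                counts[msg.get("author", {}).get("name", "?")] += 1
    return counts
```

Comparing this count to a reviewer's total +1s would give exactly the "90 of 100 +1s had no comment" signal described above.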
https://docs.openstack.org/project-team-guide/review-the-openstack-way.html
- Ability to filter by individual OpenStack *project* (e.g. all Ironic-responsible repos; not just "ironic" the repo) for either of the two use cases above.
For this one, at the top of the dashboard where it says 'OpenStack/All repositories', the 'All repositories' text is a drop-down list. If you click on it, it'll list all repositories that are tracked by the tool for OpenStack. In the search bar you can type in 'ironic' and it'll give you all repos where the project's name is included. There are checkboxes on the left side, where you can select any or all Ironic repos to view metrics for. When you don't want the filter to be applied anymore, you can click on 'All repositories' at the top of that drop-down list window to switch back to the view of the entire OpenStack. Is that what you've been looking for?
That's a cumbersome way to do it, but I guess it might work if you have only a couple of repos in the governance of a project team. Both Stackalytics and Bitergia had a mapping from https://opendev.org/openstack/governance/src/branch/master/reference/project... and provided a single drop-down to select all of the Ironic projects in one go. So on Stackalytics each official project team had a "<team> Official" option in modules, and in Bitergia we had teams and repos as separate filters. Jay may have a different perspective than me, but stackalytics.io still provides more useful info than the new dashboard, and the new dashboard is much less user-friendly/flexible than either of the prior ones. I wonder if there is an API we can pull from, like Stackalytics provided, so we can visualise things ourselves? I don't actually use these often; I mostly check in once or twice a cycle to see how engagement is changing, and it's not like I could not do my work without these.
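On the API question: the original Stackalytics exposed a small REST API under /api/1.0 (stats broken down by engineers, companies, modules, etc.), and I believe the .io fork kept it, though that is an assumption worth verifying with a quick curl before building any visualisation on top of it. A sketch of building the query URLs:

```python
from urllib.parse import urlencode

# Assumption: the stackalytics.io fork still serves the original /api/1.0 API.
BASE = "https://www.stackalytics.io/api/1.0"


def stats_url(kind, **params):
    """Build a Stackalytics stats URL, e.g. kind='engineers' or 'companies'.

    Parameter names (module, metric, release, ...) follow the original
    Stackalytics API; which of them the .io deployment still honors is
    something to confirm against the live service.
    """
    return f"{BASE}/stats/{kind}?{urlencode(sorted(params.items()))}"


# e.g. stats_url("engineers", module="ironic-group", metric="marks")
```

If that endpoint still answers, pulling per-team review marks once or twice a cycle would be a few lines of requests + matplotlib on top of this.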
[...]