[openstack-dev] [neutron] [third-party] Neutron 3rd Party CI status dashboard

Anita Kuno anteaya at anteaya.info
Mon Jun 30 19:08:53 UTC 2014


On 06/29/2014 07:59 PM, Anita Kuno wrote:
> On 06/29/2014 07:43 PM, Anita Kuno wrote:
>> On 06/29/2014 03:25 PM, Ilya Shakhat wrote:
>>> Hi!
>>>
>>> During the last couple of weeks there has been increasing demand for
>>> tracking 3rd-party CI statuses. We at Stackalytics decided to follow the
>>> trend and (with some inspiration from Salvatore's proposal) implemented a
>>> report that shows a summary of external CI status. The initial version is
>>> available for Neutron - http://stackalytics.com/report/ci/neutron/7
>>>
>>> The report shows a summary of all CI jobs during a specified period of
>>> time, including:
>>>  * stats of runs on merged patch sets:
>>>     - total number of runs
>>>     - success rate (ratio of successful runs to total runs)
>>>     - time of the latest run
>>>     - last test result
>>>  * stats for all patch sets (the same metrics as for merged ones)
>>>  * last test results for every merged patch set, grouped by day (useful to
>>> see how different CIs correlate with each other and how often they run)
>>>
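(For illustration only: a minimal sketch, in Python, of how the per-CI
summary described above might be computed. The run records, CI names, and
field layout below are hypothetical and are not the actual Stackalytics
data model.)

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical run records for runs on merged patch sets:
    # (ci_id, timestamp of the run, whether the run succeeded).
    runs = [
        ('cisco-ci',    datetime(2014, 6, 29, 10, 0), True),
        ('cisco-ci',    datetime(2014, 6, 29, 14, 0), False),
        ('midokura-ci', datetime(2014, 6, 30, 9, 0),  True),
    ]

    by_ci = defaultdict(list)
    for ci_id, ts, ok in runs:
        by_ci[ci_id].append((ts, ok))

    for ci_id, results in sorted(by_ci.items()):
        results.sort()  # order runs by timestamp
        total = len(results)
        success_rate = sum(1 for _, ok in results if ok) / float(total)
        last_ts, last_ok = results[-1]
        print('%s: %d runs, %.0f%% success, last run %s (%s)' % (
            ci_id, total, 100 * success_rate, last_ts.isoformat(),
            'SUCCESS' if last_ok else 'FAILURE'))
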
>>> By "merged patch set" the report means "the last patch set of a merged
>>> change request", so it is almost the same as the trunk code. The CI
>>> configuration is taken from DriverLog's default data
>>> <https://git.openstack.org/cgit/stackforge/driverlog/tree/etc/default_data.json>.
>>> The standard Stackalytics screen is also available for CIs -
>>> http://stackalytics.com/?metric=ci - including a vote breakdown and
>>> activity log.
>>>
>>> Since this is the first version, there are some open questions:
>>>  * Currently the report shows results per CI id, but some CIs run tests
>>> against multiple drivers, and this case is not supported. What would be
>>> more useful: stats per driver or per CI?
>>>  * Most CIs run tests when a patch set is posted, so even if a change
>>> request is merged within the selected time period, the corresponding CI
>>> results may be missing.
>>>  * Patterns for non-voting CIs need to be verified. For example, Cisco CI
>>> now runs 5 jobs, but the DriverLog data is still outdated.
>>>
>>> Thanks,
>>> Ilya
>>>
>>> 2014-06-16 17:52 GMT+04:00 Salvatore Orlando <sorlando at nicira.com>:
>>>
>>>>
>>>> However, it would be great if we could start devising a solution for
>>>> getting "health" reports from the various CI systems.
>>>> Such a report should include the following kinds of information:
>>>> - timestamp of the last run
>>>> - timestamp of the last vote (a system might start jobs which then get
>>>> aborted due to CI infra problems)
>>>> - % of successes vs failures (not sure how important that one is, but it
>>>> provides a metric for comparison with upstream Jenkins)
>>>> - % of disagreements with Jenkins (this might allow us to quickly spot
>>>> those CI systems which are randomly -1'ing patches)
>>>>
>>>> The percentage metrics might be computed over a 48-hour or 7-day
>>>> interval, or both.
>>>> Does this idea sound reasonable?
>>>>
>>>>
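(For illustration only: a rough sketch, in Python, of how the health
metrics proposed above could be computed over a fixed window. The vote
records, CI names, and the ci_health helper are hypothetical - not an
existing implementation - and the 'jenkins' votes are used as the
reference when counting disagreements.)

    from datetime import datetime, timedelta

    # Hypothetical vote records: (ci_id, change id, timestamp, +1/-1 vote).
    votes = [
        ('jenkins',  'I1', datetime(2014, 6, 29, 10, 0), +1),
        ('cisco-ci', 'I1', datetime(2014, 6, 29, 11, 0), -1),
        ('jenkins',  'I2', datetime(2014, 6, 30, 9, 0),  +1),
        ('cisco-ci', 'I2', datetime(2014, 6, 30, 10, 0), +1),
    ]

    def ci_health(ci_id, now, window=timedelta(days=7)):
        recent = [v for v in votes if v[2] >= now - window]
        own = [v for v in recent if v[0] == ci_id]
        if not own:
            return None
        # Reference votes from upstream Jenkins, keyed by change id.
        jenkins = dict((change, vote) for ci, change, ts, vote in recent
                       if ci == 'jenkins')
        overlap = [v for v in own if v[1] in jenkins]
        disagree = sum(1 for ci, change, ts, vote in overlap
                       if vote != jenkins[change])
        return {
            'last_vote': max(ts for ci, change, ts, vote in own),
            'success_pct': 100.0 * sum(1 for v in own if v[3] > 0) / len(own),
            'disagreement_pct': (100.0 * disagree / len(overlap)
                                 if overlap else 0.0),
        }

    print(ci_health('cisco-ci', now=datetime(2014, 6, 30, 12, 0)))
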
>>>
>>>
>>>
>> Hi Ilya:
>>
>> I look forward to hearing more about this dashboard and to ensuring that
>> you or someone else associated with it is available for questions at the
>> third-party meeting tomorrow:
>> https://wiki.openstack.org/wiki/Meetings/ThirdParty
>>
>> We missed you last week.
>>
>> Thanks Ilya,
>> Anita.
>>
> And one question I will have when we discuss this concerns the
> statement: "Green cell - tests ran successfully."
> 
> Currently we don't have community consensus around the use of the
> statement "tests ran successfully" for third-party CI systems.
> This is a statement, you will recall, that we discussed at the
> third-party meeting when we talked about DriverLog.
> 
> 18:46:02 <krtaylor> but what does CI tested really mean? just running
> tests? or tested to pass some level of requirements?
> http://eavesdrop.openstack.org/meetings/third_party/2014/third_party.2014-06-16-18.00.log.html
> 
> Using a statement to convey success before the community has agreed on a
> definition of what that statement means will create confusion in the
> community and frustration among folks, myself included, who are willing
> to have the conversation about what it means and who feel circumvented
> by this second use of a phrase that implies a settled meaning where none
> yet exists. Please participate in the conversations around defining
> phrases of success and failure for third-party systems, and point to the
> logs where consensus was reached prior to their use in future.
> 
> In addition to attending the third-party meeting, please attend the
> infra meeting or the QA meeting, and ideally the meetings of programs
> that interact with third-party CI systems, including nova, neutron, and
> cinder (if there are other programs interacting with third-party CI
> systems, please attend the third-party meeting so I know about you).
> 
> Thanks Ilya, I look forward to our future discussions,
> Anita.
> 
As an update, Ilya did attend the third-party meeting today; thank you, Ilya.

I am disappointed to realize that Ilya (or Stackalytics; I don't know
where this is coming from) is unwilling to stop making up definitions of
success for third-party CI systems and to allow the OpenStack community
to arrive at its own definition.

18:59:05 <anteaya> ilyashakhat__: I am asking if you are willing to stop
making up stackalytic definitions of success
18:59:10 <anteaya> for third party ci systems

http://eavesdrop.openstack.org/meetings/third_party/2014/third_party.2014-06-30-18.01.log.html

The definition of success for third-party CI system assessment lies with
the OpenStack community. It isn't a race; it is a collaborative effort.
I'm disappointed that this approach isn't embraced by all involved.

Anita.


