
Understanding the code review process in OpenStack



As a part of our tests with Kibana and Elasticsearch as frontends for our MetricsGrimoire databases, we’ve set up a dashboard for understanding the code review process in OpenStack (be sure to visit it with a large screen and a reasonable CPU, otherwise your experience may be a bit frustrating).

This dashboard includes information about all review processes (changesets) in OpenStack, using information obtained from their Gerrit instance. For each review, we have information such as the submitter (owner), the time it was first uploaded and accepted or abandoned, the number of patchsets (iterations) needed until it was accepted, and the time until it was merged or abandoned. With all of this we have prepared an interactive visualization that allows both to understand the big picture and to drill down looking for the details. Read on to learn about some of these details.
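For readers curious about how this kind of data could be stored, here is a minimal, hypothetical sketch of one review document indexed into Elasticsearch with the Python client. The index name (gerrit_reviews) and all field names and values are illustrative assumptions, not the actual schema behind the dashboard.

```python
from elasticsearch import Elasticsearch

# Connect to a local Elasticsearch instance (adjust the URL to your setup).
es = Elasticsearch(["http://localhost:9200"])

# A hypothetical document describing one Gerrit review (changeset).
# Field names and values are illustrative, not the dashboard's real schema.
review = {
    "changeset": 123456,              # Gerrit change number
    "project": "openstack/neutron",   # project the change belongs to
    "owner": "jane.doe",              # submitter (uploader) of the change
    "status": "MERGED",               # NEW, MERGED or ABANDONED
    "patchsets": 4,                   # iterations until merge or abandon
    "opened": "2015-08-03T10:12:00",  # first patchset uploaded
    "closed": "2015-08-05T16:40:00",  # merged or abandoned
    "duration_days": 2,               # closed - opened, rounded down to days
}

# Index the document so Kibana can visualize it
# (older client versions also require a doc_type argument).
es.index(index="gerrit_reviews", id=review["changeset"], body=review)
```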

[Note: this is our second post about our dashboards based on Kibana. If you’re interested, have a look at the first one, about OpenStack code contributions.]

As is usual in Kibana dashboards, you can click pretty much everywhere, and the dashboard will react by filtering according to your selections. The dashboard shows some general numbers and charts in its top part, such as the evolution of the number of reviews over time or the total number of submitters (uploaders). But the most interesting part is maybe below it: the evolution of the number of patchsets and the time needed to complete a review.
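The top charts are, in essence, aggregations over the indexed reviews. As a sketch of how they could be computed directly against Elasticsearch, the query below combines a date_histogram (reviews per month) with a cardinality aggregation (distinct submitters). The index and field names are the same assumptions as in the previous sketch, and the aggregation syntax follows the Elasticsearch versions of that era.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Reviews per month plus the number of distinct submitters,
# roughly what the top charts of the dashboard display.
query = {
    "size": 0,  # we only want aggregations, not the documents themselves
    "aggs": {
        "reviews_over_time": {
            "date_histogram": {"field": "opened", "interval": "month"}
        },
        "unique_submitters": {
            "cardinality": {"field": "owner"}
        },
    },
}

result = es.search(index="gerrit_reviews", body=query)
for bucket in result["aggregations"]["reviews_over_time"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
print("submitters:", result["aggregations"]["unique_submitters"]["value"])
```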

At the bottom of the dashboard we have included a table with the list of reviews (changesets) selected at any given moment, which includes links to them in Gerrit.

Just to show some of the possibilities of the dashboard, let’s imagine that we’re interested in how long reviews take until they are complete in Neutron, one of the largest (by development activity) projects in OpenStack. We can start by looking for the projects pie (on the right of the screenshot below).

By selecting Neutron in the pie, we filter all the data in the dashboard, which now shows only reviews (changesets) for that project.

We can already see how merged and abandoned reviews have evolved over time, and how the population of new reviews (not yet merged nor abandoned) comes mainly from the last two months, but with a fraction being as much as 5 months old. We can see as well how 75% of the changesets have fewer than 5 patchsets. However, this number is a bit misleading, since new and abandoned reviews are included in the count. Assuming we’re interested in the evolution of merged reviews, we look for the status pie and select only the merged ones. Now this part of the dashboard looks as follows.
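Selecting Neutron in the projects pie and the merged status in the status pie amounts to filtering the underlying data. As a rough sketch of an equivalent Elasticsearch query, the two selections could be expressed as term filters inside a bool query (field names and values are, again, assumptions):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Roughly what the two dashboard filters (project and status) translate to:
# a bool query with two term filters. Field names/values are assumptions.
filtered = {
    "query": {
        "bool": {
            "filter": [
                {"term": {"project": "openstack/neutron"}},
                {"term": {"status": "MERGED"}},
            ]
        }
    }
}

merged_neutron = es.search(index="gerrit_reviews", body=filtered)
print("merged Neutron reviews:", merged_neutron["hits"]["total"])
```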

New and abandoned reviews have disappeared, and you can notice the two green buttons at the top left of the screenshot, which are the proxies to interact with the two filters we have defined (Neutron project and merged status). Using them you can remove filters, invert them, and some other funny things. You can notice as well how the numbers are a bit different, although the number of patchsets per changeset has remained pretty stable.

But below these top charts and numbers, some interesting data is hiding.

Here we can see the number of changesets with 1, 2, 3, etc. patchsets (the table at the top left of the screenshot), and the duration of the review process (in days, in the bottom chart and numbers). This way we can learn how in Neutron 50% of the review processes finish with a merge in less than one day (durations are in days, rounding down, thus all durations between 0 and 24 hours are counted as 0 days), with 75% lasting less than three days (very impressive numbers, by the way).
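Those median and third-quartile figures are percentiles of the review duration. A percentiles aggregation over the assumed duration_days field, restricted to merged Neutron reviews, would produce the same kind of summary:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Percentiles of review duration (in days) for merged Neutron changesets.
# Index and field names are assumptions used throughout these sketches.
percentiles_query = {
    "size": 0,
    "query": {"bool": {"filter": [
        {"term": {"project": "openstack/neutron"}},
        {"term": {"status": "MERGED"}},
    ]}},
    "aggs": {
        "review_duration": {
            "percentiles": {"field": "duration_days", "percents": [50, 75, 95, 99]}
        }
    },
}

stats = es.search(index="gerrit_reviews", body=percentiles_query)
print(stats["aggregations"]["review_duration"]["values"])
```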

The chart on the right suggests that the 95th percentile (the 95% of changesets with the shortest durations) has been decreasing in duration during the last months. If we want to check some details behind this impression, we can just select the last months in that chart (by clicking and dragging).

We can see now how, for the four selected months, 95% of the changesets are merged in less than 7 days, and we can see how that number is decreasing over time just by hovering over the bars in the bottom chart. However, the duration for the slowest changesets (those between the 95th and 99th percentiles) is growing. If we’re interested in seeing what’s happening with them, we can scroll the top left table down to the third page, where we find the longest code reviews for these months.
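To check that month-by-month trend outside the dashboard, one could nest the same percentiles aggregation inside a date_histogram, so each monthly bucket reports its own 95th and 99th percentiles. As before, this is only a sketch with assumed field names:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# 95th and 99th percentiles of review duration per month of closing,
# for merged Neutron changesets (assumed index and field names).
trend_query = {
    "size": 0,
    "query": {"bool": {"filter": [
        {"term": {"project": "openstack/neutron"}},
        {"term": {"status": "MERGED"}},
    ]}},
    "aggs": {
        "per_month": {
            "date_histogram": {"field": "closed", "interval": "month"},
            "aggs": {
                "duration": {
                    "percentiles": {"field": "duration_days", "percents": [95, 99]}
                }
            },
        }
    },
}

trend = es.search(index="gerrit_reviews", body=trend_query)
for bucket in trend["aggregations"]["per_month"]["buckets"]:
    print(bucket["key_as_string"], bucket["duration"]["values"])
```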

If we’re curious about that changeset that lasted for 136 days, we just click on it and move to the bottom of the dashboard to see it.

We can now go to Gerrit, look for that changeset number 153946, and see the details in all their glory.
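The same details can also be retrieved programmatically from Gerrit’s REST API. Below is a small sketch against the OpenStack Gerrit instance as it was at the time of writing (the URL may have changed since); note that Gerrit prefixes its JSON responses with ")]}'", which has to be stripped before parsing.

```python
import json
import requests

# Fetch the details of change 153946 from the OpenStack Gerrit instance
# (URL as it was when this post was written; it may have moved since).
resp = requests.get("https://review.openstack.org/changes/153946/detail")

# Gerrit prepends ")]}'" to JSON responses to prevent XSSI; drop that first line.
change = json.loads(resp.text.split("\n", 1)[1])
print(change["project"], change["status"], change["subject"])
```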

Now, if we want to look at what happened to some other changesets, maybe in other periods or for other projects, we just go to the upper part of the dashboard, to remove (or interact with) those green filters, or to the top right, to select different dates.

Now that you have some idea of what can be done… come to the real thing and follow your curiosity about how the OpenStack code review process is working!

