Today we’re presenting a preview of our analysis of Liferay at the Liferay Symposium Spain. We’ve analyzed the public Liferay git and Jira repositories, which provide a good view of the public development activity around the project. Liferay Inc., the company, also maintains a separate ticketing system for its customers, which we have not analyzed. However, all changes to the code seem to be done in the public git repository, and a large part of the ticket-management activity (if not all of it) goes through the public Jira system at some point, even if it was initiated in the in-company tracking system.

In this preview, we’re including some charts and data similar to other analyses we’ve done in the past (such as the OpenStack Folsom report, or the report on Zentyal). But we’re also including some new material, such as charts on how long it takes to close tickets (for every month in the analyzed period), or the activity in the git repository by module (directory).
You can go straight to the preview of our report to browse the charts and data, or read on for some details about it in this post.

The analysis of the public ticketing system of the project (Jira, in the case of Liferay) shows how the community is filing tickets (including bug reports), and how the project is dealing with them. The chart for open tickets has the usual peaks around release time (bugs tend to be uncovered right after releases, when new pieces of code are tested by users), but they are not too high. Even more revealing is the chart of ticket openers, which shows that more than 150 people have been filing new tickets (in many cases bug reports) during the last months, and that this community has grown from about 90 at the beginning of the period.
With respect to tickets closed per month, the most remarkable feature is the peak in July 2010, when more than 1,000 tickets were closed, probably due to some project policy on closing old tickets. The community of developers closing tickets has also grown (in fact, it has roughly tripled) during the period of study.
We’ve also included a new analysis of how long it takes the project to close tickets. Measures of ticket closing are tricky, because the simple ones (such as mean time to close a ticket) are difficult to interpret. For example, a low mean closing delay may be read as a good thing, because tickets are closed quickly, but also as a bad thing, because it could mean that old tickets are never being closed (usually, any given project has a lot of open bugs, many of them old).
Therefore, we’ve opted for a slightly more complex measure: quantiles of the time to close tickets. Quantiles (or percentiles, for that matter) do not try to capture a “mean” closing time, but to find the boundaries for how long a certain fraction of the tickets remain open.

If you take, for example, the chart for quantile 0.99, it can be interpreted as follows: for each month in the chart, 99% of the tickets opened during that month, and later closed, were closed in less than the time drawn in the chart. In other words, for 1% of the tickets opened during that month, and later closed, it took more than the time shown in the chart to close them. For March 2011, therefore, 99% of the tickets opened then (and later closed) were closed in less than 382 days.
To interpret these charts, it is important to notice that they refer only to closed tickets (those still open cannot be measured, because we don’t yet know how long they will take to close). It is also important to notice that the older a period (a month, in this case), the more likely the value is to be high, since tickets opened in that period have had more time in which to be closed. For example, any bug filed 12 months ago has had 12 months to be closed, and therefore delays of up to 12 months are possible for that month. But bugs filed during the last month analyzed have not yet had more than a month to be fixed, and therefore values longer than one month are not possible for them.
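As a rough illustration, the measure above can be computed along these lines. This is a minimal sketch with made-up ticket dates and a simple nearest-rank quantile; the data layout and helper names are our own assumptions for the example, not the actual tooling behind the report:

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical sample tickets: (opened, closed) dates; closed is None if still open.
tickets = [
    ("2011-03-02", "2011-03-02"),
    ("2011-03-05", "2011-03-09"),
    ("2011-03-10", "2011-04-20"),
    ("2011-03-15", None),          # still open: excluded from the measure
    ("2011-04-01", "2011-04-03"),
    ("2011-04-02", "2012-01-10"),
]

def quantile(sorted_values, q):
    """Nearest-rank quantile of an already-sorted, non-empty list."""
    idx = min(int(q * len(sorted_values)), len(sorted_values) - 1)
    return sorted_values[idx]

# Group closing delays (in days) by the month the ticket was opened,
# keeping only tickets that were eventually closed.
delays_by_month = defaultdict(list)
for opened, closed in tickets:
    if closed is None:
        continue
    o = datetime.strptime(opened, "%Y-%m-%d")
    c = datetime.strptime(closed, "%Y-%m-%d")
    delays_by_month[o.strftime("%Y-%m")].append((c - o).days)

for month in sorted(delays_by_month):
    delays = sorted(delays_by_month[month])
    print(month, {q: quantile(delays, q) for q in (0.25, 0.50, 0.99)})
```

Note how the still-open ticket is simply skipped, and how a ticket opened near the end of the analyzed window can only ever show a short delay, which is exactly the bias discussed above.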
That said, it is remarkable that for this project, in almost all months, 25% of the tickets are closed in less than one day, and 50% of the tickets are closed in less than one week. Of course, closing the rest takes much longer (those remaining are usually the most difficult to close), but the overall picture seems quite reasonable.
Well, for the rest, just go to the report and see for yourself how the project is performing with respect to git activity (changes to the source code), per-module (directory) activity, or the general evolution of the project.
[Final note: all the numbers included in this study could still have some errors, but they have already gone through a number of validations, and are correct to the best of our knowledge. That said, remember you can always download the datasets and do a parallel analysis, if you’re interested.]