Software Release Practices – Survey Results
Over the past few weeks, we’ve gathered some interesting information about how software teams release their products, and reviewed what those current practices are producing. This work is part of a research project run in conjunction with Mitacs and BCIC, along with a student currently pursuing a Master’s degree at UBC.
The sample spans 18 respondents across a range of different companies. While small from a strict research perspective, we believe it still gives us an interesting range of responses to the questions asked and, if nothing else, raises awareness and generates interesting discussion.
We targeted individuals with a clear understanding of the release process in their organizations; in several cases, the original person contacted passed the survey along to colleagues better equipped to answer the questions.
The first question asked the participants to describe the drivers that affected the decision to release software.
As expected from our experience in the industry over the years, releasing product because the promised date has arrived came out as the strongest driver on average.
Looking at the data in more detail, though, reveals that every one of these drivers (except ‘loss of bonuses’) was the strongest for at least one of the organizations, and every single respondent indicated that the decision to release software is driven by a wide range of factors. It is a far more complex decision than the cynical view of “the date has arrived, so we have to ship”.
From here, we asked respondents to identify the evidence they gather and use as a means of determining if the product is ready to ship.
As before, all groups used a variety of evidence to determine the readiness of the product, primarily functional and code testing, but a few things stood out for us. A number of groups indicated that ‘team competence’ was their primary source of evidence, while no group leaned heavily on complexity metrics or static code analysis as evidence for release. Very few were driven by process compliance or standards adherence.
So far, both the drivers behind the decision to release and the evidence used to make it confirm that ‘when to release software’ is a complex, multi-faceted problem, which was one of the things we were looking for.
Now to the core of the survey: how well are we doing with this complex decision?
We asked about the frequency of issues that people have with releases, and found the following on average:
What this says is that, across the board, almost half of releases result in modest problems (or worse), and almost 25% of releases result in significant or very ugly issues.
Clearly, there is room for improvement in how we manage this difficult decision!
We asked a few more questions about the nature of these release issues, and this is where the data started to be, well, a bit less crisp. First, we asked about the impact of these different severities of release issues, and got this:
While the averages expressed here follow the trend that we expected, a few data sets bucked it. Even allowing for the muddiness of the data, we think it is tough to argue against the conclusion that when an issue is ‘very ugly’, the impact in terms of cost, morale, product quality and credibility goes up.
In a manner similar to the responses on the relative impact of release issues, the responses to the question ‘What preventive investment would be appropriate to avoid the post-release experiences?’ showed an increasing trend based on severity:
(Note that this is where the data was the most puzzling, as several respondents suggested a greater investment to prevent minor release problems.)
We looked at the data from several different angles. There was no clear correlation between the size of the organization and the type of pains they were experiencing, for example.
One interesting note comes when we combine the frequency and impact in these different categories, to find an overall exposure to the organization in each category:
What this tells us is that while we all fear ‘the big one’, those significant or very ugly problems that can ruin our day, the cumulative cost of the minor or modest issues we run into with our releases can be of the same magnitude over time. While we should certainly take action to prevent the big spills, anything we can do to reduce the smaller ones will help, too.
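To make the combination concrete, here is a minimal sketch of how frequency and impact can be multiplied into a per-category exposure figure. The numbers below are hypothetical placeholders for illustration, not the survey’s actual data:

```python
# Sketch of an exposure calculation: exposure = frequency x impact per
# severity category. All numbers here are hypothetical, NOT survey data.

# Fraction of releases falling into each severity category (assumed).
frequency = {"minor": 0.30, "modest": 0.25, "significant": 0.15, "very ugly": 0.08}

# Relative impact per incident on an arbitrary 1-10 scale (assumed).
impact = {"minor": 1, "modest": 3, "significant": 6, "very ugly": 10}

# Exposure: how much each category contributes to overall pain over time.
exposure = {cat: frequency[cat] * impact[cat] for cat in frequency}

for cat, exp in exposure.items():
    print(f"{cat:>12}: {exp:.2f}")

# Even with severe issues being far rarer, their exposure can be of the
# same magnitude as the cumulative exposure of the milder categories.
mild_total = exposure["minor"] + exposure["modest"]
severe_total = exposure["significant"] + exposure["very ugly"]
```

With these placeholder values, the mild categories total 1.05 and the severe ones 1.70: the same order of magnitude, which illustrates why reducing the frequent small issues matters alongside preventing the big spills.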
Overall, this informal, small slice of data tells us that software release problems are real and frequent. What it doesn’t really tell us is the true cost we are dealing with, in all its forms beyond the almighty dollar: delays, reduced product quality, lost credibility and disruption are all real and affect all of us.
We’re taking this to the next step to try to put a better measure on the cost of these release issues. What we are looking for is participation in a small focus group to better understand these challenges.
You need not have completed the survey above, though doing so might help put you in the same mindset as the other focus group participants.
If you are well acquainted with your organization’s release process and have an interest in discussing these issues with a possibility of contributing to improvements for your own organization and the industry in general, we would welcome your participation in a small, focussed session in the next few weeks to discuss the release decision in more detail.
If you don’t fit the above criteria but know someone who does, please feel free to pass this note along to them…
We are planning a 90-minute session around June 10 (depending on availability), likely at a location on the UBC campus. We’ll cover parking and refreshments for the session, and would like to better understand:
- the real costs of poorly planned releases
- your preferences for the shape of potential solutions to help mitigate these challenges
If you are interested in participating, please reply to this message, and we’ll use Doodle to find the best time and date for people to attend.