Wednesday, June 11, 2014

Dear developer, could you please do a quality assessment of how we use your software?

Quality assessments (QA) in software development are a way of ensuring that the software being developed lives up to the expected standards. The methods of assessment, the tools used and the criteria measured vary widely: some teams find checklists sufficient, while others need complex workflows. Whatever your method of assessment, the most important part is that the quality assessment is done by a different team than the one that developed the software.

At Epinova we have had structured quality assessments as part of our projects for a couple of years now. Whenever we're closing in on a deadline, a developer who has not been part of the project goes through the whole codebase to see if the quality is acceptable. So in our case, a quality assessment includes a quite extensive code review. It also contains criteria covering security, the setup of all environments, user experience, accessibility, performance and so on. Not to mention, we have several different types of quality assessments: backend, frontend and JavaScript. Each one is conducted as a deadline draws nearer. This means that before a deadline, the developers have set aside time to fix any issues found during the assessments, making sure any areas not living up to the standards are improved in time.

These quality assessments have become part of our project pipeline, and I think all the developers appreciate the feedback and the confirmation that their code meets the standards it should. I wish we had statistics on how the quality of our projects has improved since we introduced assessments to our project routines, but sadly we don't have any numbers to show off. I mean, how do you measure quality? Is quality represented by the number of bugs reported? The happiness of the customer? Or maybe the cyclomatic complexity of the codebase? It's very difficult to measure, but we're sure about one thing: the quality of our projects, however we measure it, has improved dramatically since we introduced quality assessments.
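Of the candidate metrics above, cyclomatic complexity is the one you can actually compute mechanically: it is 1 plus the number of decision points in a piece of code. As a rough sketch (in Python for illustration, since our actual assessments cover several languages), you could approximate it by walking a syntax tree and counting branching constructs:

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe's cyclomatic complexity for Python source:
    1 plus the number of decision points (ifs, loops, boolean
    operators, exception handlers)."""
    tree = ast.parse(source)
    decisions = (ast.If, ast.For, ast.While,
                 ast.ExceptHandler, ast.BoolOp, ast.IfExp)
    return 1 + sum(isinstance(node, decisions) for node in ast.walk(tree))

snippet = """
def classify(n):
    if n < 0:
        return "negative"
    for d in (2, 3, 5):
        if n % d == 0:
            return "divisible"
    return "other"
"""
# Two ifs and one for loop -> 1 + 3 = 4
print(cyclomatic_complexity(snippet))
```

A dedicated tool would handle more constructs (comprehension filters, `match` arms and so on), but even a crude count like this lets you track whether a number is trending up or down between assessments.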

Our quality assessment tool

I recently encountered a very different kind of quality assessment than the ones I've described above. A customer asked me to assess how they were using the software I had developed for them. Now, this is a very interesting scenario. As a developer, you make quite a lot of assumptions about how the customer will use your software. Note: the customer, in my world, is usually the editor and administrator of a website or commerce site, not the visitor. I can't count the number of times I've thought "Nah, the customer won't do that", only to realize that the customer in fact does the exact opposite of what I imagined. And this becomes very clear when you're doing a quality assessment of how the customer is using your software.

In this particular case, there was one feature that was completely untouched. Why was that? Was it too complicated for them to use? Or did they simply not need it?

Another feature was used a lot more than I had expected, and in some cases it served an entirely different purpose than planned. Was there any risk involved in doing this? And how had this happened?

These are all very interesting questions, and the answers are very useful for a developer trying to map out how this customer differs from the previous ones he has worked with. Needless to say, the report was also very useful for the customer, as they received pointers on how they could solve certain challenges or use the software more efficiently. So if you find yourself unsure of how a customer is using your software, look into the possibility of doing an assessment of the work they have put into it. My guess is you will be surprised!
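The first step of such an assessment can be surprisingly mechanical: tally feature usage from whatever audit trail or log the platform keeps, and compare it against the list of features you shipped. The log format and feature names below are made up for illustration, but the shape of the exercise looks something like this:

```python
from collections import Counter

# Hypothetical audit-log lines: who did what, and when.
# A real assessment would read the platform's own usage log.
log_lines = [
    "2014-06-02 editor1 action=PublishPage",
    "2014-06-03 editor2 action=PublishPage",
    "2014-06-03 editor2 action=ReorderBlocks",
    "2014-06-05 editor1 action=ReorderBlocks",
    "2014-06-09 editor1 action=PublishPage",
]

# Count how often each feature appears in the log.
usage = Counter(line.split("action=")[1] for line in log_lines)

# Every feature the developers shipped, including ones
# nobody may ever have touched.
shipped = {"PublishPage", "ReorderBlocks", "ABTesting"}
for feature in sorted(shipped):
    print(f"{feature}: used {usage.get(feature, 0)} times")
```

The interesting rows are the zeros (the untouched features) and the unexpectedly large counts (the features being used far more, or differently, than you assumed). The numbers don't answer the "why" questions above, but they tell you which conversations to have with the customer.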
