By Rory Canavan
After my last blog, I was approached to delve deeper into quality assurance around software asset management (SAM) processes and to offer my experience in this area. Here are some points I believe are worthy of note:
Effective Work-rate: Whatever quality assurance indicators a process includes, make sure that the effort of measuring them does not exceed the work required to carry out the process in the first place. Otherwise it can very quickly become the norm for a company to expend more resources on collecting statistics than on achieving the overall goal(s) of the process.
Automate wherever possible: The days of clipboards and note-taking for quality assurance are hopefully behind us; greater reliance on automation makes it possible to measure process success factors at times and in places where manual collection would be impractical or unwelcome.
Relevance: Quality assurance data collection should support the overall process objectives. For example, the number of requests handled in a “Request software title” process might be of some interest; of greater interest is the typical turnaround time for a request, from acknowledgement by the IT department to the time of installation (a sketch of one such measurement follows these points).
Exception handling: If a process is designed properly, it will be grounded in the real world and so recognise that not everything runs smoothly 100% of the time. Exception handling for when processes go wrong can therefore offer quick wins in determining which elements of a process to measure.
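To make the last two points concrete, here is a minimal Python sketch of measuring request turnaround while routing incomplete requests into exception handling. It assumes request records have been exported from your ITSM or SAM tool as a CSV; the file name, column names, and timestamp format are hypothetical assumptions, not prescriptions of any particular product.

```python
import csv
from datetime import datetime
from statistics import median

DATE_FMT = "%Y-%m-%d %H:%M"  # assumed timestamp format in the export

def turnaround_hours(row):
    """Hours from IT acknowledgement to installation for one request."""
    acknowledged = datetime.strptime(row["acknowledged_at"], DATE_FMT)
    installed = datetime.strptime(row["installed_at"], DATE_FMT)
    return (installed - acknowledged).total_seconds() / 3600

completed, exceptions = [], 0
with open("software_requests.csv", newline="") as f:  # hypothetical export file
    for row in csv.DictReader(f):
        if row.get("installed_at"):   # request reached installation
            completed.append(turnaround_hours(row))
        else:                         # open or failed: feed the exception-handling path
            exceptions += 1

print(f"Completed: {len(completed)}, exceptions: {exceptions}")
if completed:
    print(f"Median turnaround: {median(completed):.1f} hours")
```

The point is not the tooling but the principle: the metric (median turnaround) maps directly to the process objective, and requests that fall outside the happy path are counted rather than silently ignored.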
The SMART acronym has been well used in operations management, but it applies here as well.
Any quality assurance objectives should be:
Specific: Don’t be ambiguous with your quality control objectives – “The purpose of this function step is to reinforce the correct choice of license type in the creation of a license record in FlexNet Manager Platform” is much clearer than “I’m just checking that Stephen is doing his job properly”.
Measurable: How people feel or think about a given situation or product is the preserve of politics; facts and figures are very difficult to dispute, and these should be easy enough to select in the world of software asset management. For example, specify the frequency of hardware and software inventory data collection and measure the percentage coverage of your IT estate (the sketch after this list shows one way to do this).
Attributable: The task of a process and/or function-step should always be assigned to a specific individual. This is not to apportion blame, but to ensure that if a training need emerges, a QA assessment can identify the right individual – or reveal that it is the process itself that needs improving.
Realistic: Any objective asked of an individual should be within their skill-set, and any objective asked of a process should be within the achievable scope of the system used to carry it out.
Time-based: In business, we can’t wait indefinitely for the completion of a process. Thresholds are therefore required to assess whether a given run of a process has succeeded or failed based on its time to completion – the sketch below includes such a check.
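As promised above, here is a minimal sketch of a measurable, time-based check: percentage inventory coverage of the estate against a collection-frequency target, and a pass/fail verdict for a single request against a turnaround threshold. The device names, the seven-day window, and the two-day threshold are illustrative assumptions only – substitute the targets your own SAM programme has agreed.

```python
from datetime import datetime, timedelta

INVENTORY_WINDOW = timedelta(days=7)  # assumed target: every device inventoried weekly
TURNAROUND_SLA = timedelta(days=2)    # assumed time-based threshold for requests
NOW = datetime(2024, 3, 8)            # fixed "now" so the example is reproducible

# Hypothetical estate: device -> last successful inventory scan (None = never seen).
last_seen = {
    "PC-001": datetime(2024, 3, 6),
    "PC-002": None,
    "PC-003": datetime(2024, 2, 1),
    "SRV-001": datetime(2024, 3, 7),
}

# Measurable: share of the estate inventoried within the agreed window.
covered = [d for d, ts in last_seen.items() if ts and NOW - ts <= INVENTORY_WINDOW]
print(f"Inventory coverage: {100 * len(covered) / len(last_seen):.0f}%")  # 50%

# Time-based: pass/fail one request against the turnaround threshold.
acknowledged = datetime(2024, 3, 4, 9, 0)
installed = datetime(2024, 3, 5, 16, 30)
verdict = "within threshold" if installed - acknowledged <= TURNAROUND_SLA else "breached"
print(f"Turnaround {installed - acknowledged}: {verdict}")
```

Because both checks reduce to a number and a threshold, they are attributable and disputable in the way the SMART points above demand, rather than resting on opinion.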
Derivations of the SMART acronym abound; however, these are the principles I attempt to adhere to when creating points of assessment for quality assurance in the realm of software asset management.
I would wrap up by saying that quality assurance should have objectives in its own right, and should be culturally embedded within a company as part of the overall standards by which that organization is run. Quality assurance does not exist to stand over employees’ shoulders and make them fearful of doing their jobs, but to encourage and promote continuous improvement and idea generation.
To learn more about best practice software asset management processes, download our whitepaper, NCC Guideline for IT Management: Software Asset and License Management Best Practice, or view our on-demand webinar, How to get started on a SAM Program.