When new applications are introduced to users, there's inevitably variation in the rate at which they are adopted. By adoption we mean the level of familiarity, proficiency, and engagement users develop with the application.
The term “User Experience” (UX) also comes to mind; this is shaped largely by the design and overall approach of the application, but also by the way in which it was deployed.
For example, two organisations decide to implement the same CRM system, but the routes they take are completely different. Organisation one embarks on a comprehensive analysis of its users' needs and factors in the imperative of clean, accurate data. A formal discovery process highlights the need for some custom workflows to be incorporated into the delivered solution. Organisation two opts for a more accelerated deployment using an out-of-the-box configuration. They decide that their users are quite smart and plan to cover specific processes with user training; they also plan to implement a set of policies and procedures to ensure that all users are fully aware of the need for accurate information.
We visit both companies three months after deployment to get a measure of the differences, interviewing a random set of users as well as the application sponsor/owner. Organisation one reports that their users had initial difficulties with the new user interface compared to the legacy system it replaced, largely because of tight data-verification rules; those rules add some complexity, but they ensure the information collected by the new system is accurate and clean. The investment in workflow cost them an additional eight weeks and a rather hefty fee, but users are very pleased with the way key business processes are presented to them by the application. In short, there's clarity and little need to resort to training manuals, help desks or colleagues for assistance.
Predictably, it was a different story at Organisation two. Like their peers, they embarked on comprehensive user training on the new system. However, in addition to having to get to grips with a different UI and approach, the users also needed to learn how to execute their specific processes using the new system. Put another way, they needed to perform a tacit, “manual workflow” using their newly acquired knowledge and skills. Although the steps were defined in the training documents and quick-guides, users still needed to refer to colleagues and support teams. In addition, very few users had fully digested the need to conform to specific data conventions and accuracy levels; needless to say, after a while the data quality degraded to a point where upstream processes were affected.
This example is meant to highlight a number of factors:
- Irrespective of the application, there's a strong case for well-documented processes and a definitive “single source of truth”. It's also imperative that data hygiene standards are established for applications, especially those that feed other processes and systems.
- When embarking on new applications, some process discovery is needed; this is the case even if the user journeys and flows seem obvious, because one day an individual will misunderstand them. Worse still, they may incorrectly advise other users.
- Application Usage Analytics is a powerful tool, especially for complex or large-scale deployments; it's essential to understand the shape and scope of individual user journeys within the application. This analysis will highlight realistic baseline metrics for processes and tasks, as well as the outliers (a minimal sketch of this kind of analysis follows this list).
- Remember to segment your user community into logical groups: e.g. expert users, experienced users, new users, remote users, and so on. Performance and usage metrics may vary between these groups.
- Use the gathered information to identify candidate tasks for improvement; workflows and automation routines can then be considered, built and tested. Running Usage Analytics on the test group will provide essential before-and-after metrics.
- The saying “we achieved the outcome despite the way we got there” doesn't work for business processes and applications; yes, the outcome is vital, but the cost in terms of user productivity and employee morale is likely to be unsustainable.
- There are a number of techniques and methods that can measure business outcomes; one example is OKRs (Objectives and Key Results), just one approach that may work in your organisation.
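To make the analytics points above more concrete, here's a minimal sketch, in Python, of the kind of analysis a usage-analytics export could support: baseline task-duration metrics per user segment, a crude outlier flag, and a before-and-after comparison once a workflow or automation has been trialled with a test group. The event records, field names, segments and thresholds are illustrative assumptions, not the output of any particular Digital Adoption Platform.

```python
from statistics import mean, median

# Hypothetical usage-analytics events: one record per completed task.
# Field names, segments and values are illustrative assumptions.
events = [
    {"segment": "new",         "task": "create_opportunity", "duration_s": 310, "phase": "before"},
    {"segment": "new",         "task": "create_opportunity", "duration_s": 295, "phase": "before"},
    {"segment": "experienced", "task": "create_opportunity", "duration_s": 140, "phase": "before"},
    {"segment": "experienced", "task": "create_opportunity", "duration_s": 150, "phase": "before"},
    {"segment": "experienced", "task": "create_opportunity", "duration_s": 900, "phase": "before"},
    {"segment": "new",         "task": "create_opportunity", "duration_s": 180, "phase": "after"},
    {"segment": "experienced", "task": "create_opportunity", "duration_s": 120, "phase": "after"},
]

def baseline(durations):
    """Baseline metrics plus a crude outlier rule (over 3x the median)."""
    med = median(durations)
    return {
        "n": len(durations),
        "mean_s": round(mean(durations), 1),
        "median_s": med,
        "outliers": [d for d in durations if d > 3 * med],
    }

# Segment the user community and compare the "before" baseline with the
# "after" metrics gathered from the test group once a workflow is in place.
for segment in sorted({e["segment"] for e in events}):
    for phase in ("before", "after"):
        durations = [e["duration_s"] for e in events
                     if e["segment"] == segment and e["phase"] == phase]
        if durations:
            print(segment, phase, baseline(durations))
```

In practice the events would come from your analytics tooling rather than a hard-coded list, and the outlier threshold should be tuned to suit your own data and task mix.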
While up-front discovery and analysis prior to deployment is clearly the best approach, it's never too late to address application shortcomings that affect users' interactions. The capabilities provided by Digital Adoption Platforms, such as in-application help, guidance, and training as well as workflows and automations, can transform adoption rates of hitherto difficult-to-use applications.