I’ve often heard stories from my operations research colleagues about fantastic results obtained from advanced analytics models they have built, but the models were never used in the business. Typically, the lack of acceptance of their models was due to a lack of understanding of what would be required for successful integration of those models into normal business operations.
What they missed was the need for applying organizational change management principles in order to gain broader acceptance of advanced analytics techniques that are used for decision support. I collaborated with longtime colleagues Zahir Balaporia of Schneider (now at FICO), Karl Kempf of Intel, John Milne of Clarkson University (formerly at IBM), and Rahul Saxena of Cobot Systems (formerly of Cisco)—who have decades of experience in the development and deployment of analytic methods for improved decision making—on a paper about this topic. Following is an adaptation of our paper and the related presentation I have given at INFORMS events.
For the purpose of this discussion, we like to classify analytics projects into one of three buckets:
- A one-time analysis leading to a strategic decision
- Analytics that are used to drive repeated decisions with humans in the loop
- Analytics that are deeply embedded in automated processes
Each of these project types requires change management principles to be applied if the project is to have maximum impact.
What do we mean by change management? Change management is an approach to transitioning individuals, teams and organizations to a desired future state. Acclaimed change practitioner Daryl Conner developed a model, viewable here, of “how and when people become committed to major new organizational requirements.” Understanding these stages of commitment is very helpful when launching a project, so that the various stakeholders are prepared for the change management journey. As described by Conner, stakeholders should be moved through three phases of the process: a preparation phase, an acceptance phase, and a commitment phase. The ultimate goal is to gain commitment to the analytics project.
For analytics professionals, a key skill is business problem (question) framing, as described in the INFORMS Job Task Analysis, available at the Certified Analytics Professional website (registration required). This skill requires understanding a business problem and determining whether it is amenable to an analytics solution. Applying this skill when launching a project is a key part of the preparation phase of Conner’s change management model.
A project should commence by conducting a solution discovery process. Three steps in this process contribute to the preparation phase in change management.
1. Obtain or receive the problem statement and usability requirements. Collect answers to questions such as:
- What is the current situation?
- What is your vision of the ideal situation?
- What actions would help to bridge the gap?
- How will success be measured?
- Is there a baseline measure available today that can be used for comparison?
- What is the cost of doing nothing?
2. Identify the stakeholders. Get answers to these questions:
- Who are the users of a potential solution?
- Who inside and outside the organization will be impacted by the potential solution?
- Which people from different parts of the organization need to be involved in the construction of a potential solution? These could be end users, IT executives, database architects, and user interface designers.
- What business processes will be affected by the potential solution, and how?
3. Obtain stakeholder agreement on the problem statement. Without agreement at the beginning from all of the various stakeholders as to the problem being addressed, there will be challenges in moving to the acceptance and commitment phases of the change management process.
As the project proceeds and users gain awareness, address user concerns—worry about change or worry about the status quo—to transition from a fear of analytics to comfort with analytics. Here are some proven techniques to overcome potential objections and fears:
- Gently describe the risks of openly resisting change.
- Explore and resolve mixed messages in management’s support of the initiative, such as conflicts arising from measuring users via different KPIs. For example, management may be looking at metrics at a global level that are inconsistent with the performance metrics for individual employees.
- Recognize that a reduced workload for some people might lead to layoffs, and understand who would be laid off: those who embrace the project, or those who resist it? Address the fear that analytics is replacing the human expert. To dispel the perception of job risk, point out that the proposed solution will allow users to explore different alternatives under different assumptions.
It is better to tell users, “The system will help you make decisions faster” instead of “The system will make better decisions for you.” Furthermore, design the system with controls that let users guide the decision process. For systems where users will interact with the recommendations made by the analytics, create controls that allow users to override those recommendations. Over time, users will come to trust the system’s recommendations. Make sure the system logs these overrides, as that data can indicate how to improve the system.
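As a minimal illustration of what such override logging might look like (the SQLite storage, table, and field names are assumptions for this sketch, not part of our paper), consider:

```python
import sqlite3
from datetime import datetime, timezone

# Minimal sketch: record each time a user overrides a recommendation,
# so the override data can later be analyzed to improve the model.
# Table and field names are illustrative, not from the original paper.

def init_override_log(conn: sqlite3.Connection) -> None:
    conn.execute("""
        CREATE TABLE IF NOT EXISTS overrides (
            logged_at     TEXT,   -- UTC timestamp of the override
            user_id       TEXT,   -- who overrode the recommendation
            decision_id   TEXT,   -- which decision instance was affected
            recommended   TEXT,   -- what the system suggested
            chosen        TEXT,   -- what the user actually did
            stated_reason TEXT    -- optional free-text reason
        )
    """)

def log_override(conn, user_id, decision_id, recommended, chosen, reason=""):
    conn.execute(
        "INSERT INTO overrides VALUES (?, ?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(),
         user_id, decision_id, recommended, chosen, reason),
    )
    conn.commit()

# Example usage (hypothetical values):
# conn = sqlite3.connect("decision_support.db")
# init_override_log(conn)
# log_override(conn, "planner_17", "order-4521", "ship via rail",
#              "ship via truck", "customer requested expedited delivery")
```

Even a simple log like this lets the team later ask which recommendations are overridden most often, by whom, and why.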
There are several ways to help users move toward the acceptance phase of the change management process. During the solution discovery process, understand the breadth and depth of the business problem: the target organization must transfer information via interviews, and it is important to listen and learn. Lowering resistance entails listening to users’ common objections, such as “This can’t work” and “We tried that before and it did not work.” Identify roadblocks, understand how the current project differs from prior attempts, convert an objector into a vocal advocate, and ask for users’ advice.
To move towards acceptance, it is advisable to progress with “digestible chunks.” Create a Graphical User Interface (GUI) of the current process, with connections to data and illustrations of current decisions. This will allow users to get comfortable with a new system. Use a “Crawl, Walk, Run” strategy. Try to adapt Agile software development practices (described in many places including the Agile Alliance website) for the analytics project—create targets for progress in short 2-week intervals; allow users to review progress and understand issues as they arise, in a step-by-step manner.
To move towards the commitment phase, incremental deployment of a solution is a best practice. Increase the scope across appropriate business dimensions: start with a single product, single business unit, and/or a single geography; then expand the scope incrementally. Success stories that demonstrate incremental deployment include the following examples:
- Inventory optimization solutions that proceed by gaining adoption with one product line, then expanding to multiple products.
- One notable success is the retail price optimization project at InterContinental Hotels Group (the 2011 INFORMS Franz Edelman Prize finalist, referenced here), where the team started with a few hotels, then added 100 properties per month.
- More recently, for its ORION (On-Road Integrated Optimization and Navigation) project (the 2016 Edelman winner, referenced here), the UPS team started by deploying the solution at one distribution center, added a few more distribution centers over time, gained commitment from those initial successes, and increased the rate of deployment as the business adopted the change.
It is critical to continually measure the benefits of the analytics. During solution discovery, define measurable metrics that will demonstrate success. During deployment, report on these measures. (Don’t be afraid to change the metrics!) It may be necessary to convince the business that the old way of measuring the process needs to be revised, which is itself another change management project.
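For example, a deployment report might compare each agreed-upon metric against its pre-project baseline. The sketch below is illustrative only; the metric names and values are hypothetical:

```python
# Minimal sketch of reporting a deployment metric against the baseline
# agreed on during solution discovery. Values are illustrative only.

def percent_improvement(baseline: float, current: float,
                        lower_is_better: bool = False) -> float:
    """Return the relative improvement of `current` over `baseline`, in percent."""
    if baseline == 0:
        raise ValueError("Baseline must be non-zero to compute a ratio.")
    change = (current - baseline) / abs(baseline)
    return -change * 100 if lower_is_better else change * 100

# Hypothetical example: on-time delivery was 91.5% at baseline, 94.2% after deployment
print(f"On-time delivery: {percent_improvement(91.5, 94.2):+.1f}% vs. baseline")

# Hypothetical example: cost per order was $12.40 at baseline, $11.75 after deployment
print(f"Cost per order:   {percent_improvement(12.40, 11.75, lower_is_better=True):+.1f}% vs. baseline")
```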
After deployment, provide ongoing maintenance and support. Take to heart the observation, “Any decision support system left unattended for six months is replaced by a spreadsheet.” Work with the users to create feedback loops that measure effectiveness, and be sure to check in with them regularly. Make sure there is a process for regular recalibration of the models and verification of the assumptions made about the data.
Often, an analytics solution is deployed deep inside a software system and is a “black box” to the users. Remove the mystery of the black box by explaining what the system does in terms of the business application, not the algorithm. To gain acceptance that the analytics solution improves the business process, compare the new results to the baseline. Use data visualization and descriptive analytics techniques, such as reporting and drill-down investigation tools, and plan for them early in the project.
All analytics projects depend on data, and it is important to understand how data feeds the system. There are typically two sources of data: automatic feeds, which require continually checking the assumptions made about them, and user inputs, for which automatic alerts should be created for data that fall outside control limits. There can be multiple points of failure. For automatic data feeds, what if the network goes down? What if the data makes the model infeasible? Avoid the situation where users blame the analytics system when the root cause lies elsewhere; that blame can lead to a lack of adoption of the eventual system.
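As one illustration of alerting on user inputs that fall outside control limits, the sketch below assumes simple three-sigma limits computed from recent history; the thresholds, field names, and alerting mechanism are assumptions, not prescriptions from our paper:

```python
import statistics

# Minimal sketch of a control-limit check on user-entered data, assuming
# three-sigma limits computed from recent history. Thresholds and the
# example values are illustrative assumptions.

def control_limits(history: list[float], sigmas: float = 3.0):
    """Compute lower/upper control limits from historical values."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return mean - sigmas * stdev, mean + sigmas * stdev

def check_input(value: float, history: list[float]):
    """Return an alert message if the new value falls outside the limits."""
    lower, upper = control_limits(history)
    if not (lower <= value <= upper):
        return (f"ALERT: input {value} is outside control limits "
                f"[{lower:.2f}, {upper:.2f}]; confirm before the model runs.")
    return None

# Example usage with hypothetical weekly demand entries:
recent_demand = [120, 118, 125, 122, 119, 121, 124, 117]
alert = check_input(310, recent_demand)   # likely a data-entry error
if alert:
    print(alert)
```

A check like this flags suspect inputs before they reach the model, so a bad data point does not get blamed on the analytics.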
To ensure adoption of the analytics solution, understand the downstream effects of the system. What will the new business process be? People’s roles and responsibilities may change. Interactions may not be obvious (e.g. better targeted marketing to increase sales will require an increase in production capacity; predicting failures of equipment may require an increase in inventory of replacement parts). A successful implementation may open up unforeseen opportunities—in some cases, users may reject success because they want more capabilities.
Remember: analytics informs intuition; intuition informs analytics. Decision support systems provide either a single “best” solution, along with the ability for decision makers to perform what-if analysis, or a set of best or near-best solutions with reports to slice, dice, and compare them so that the user makes the final choice. Giving the user that power increases the likelihood of adoption.
In conclusion, change management requires listening, understanding, planning and communication to move people through the three phases of preparation, acceptance and commitment. Successful advanced analytics projects require good change management so that the resulting solutions are accepted within the business. Using these techniques should lead to fewer stories about analytics solutions that were never adopted by the business.