A Framework for Embedding Decision Intelligence into your Organization
How to bring Decision Intelligence to life
Why “Decision Intelligence”?
There has been growing interest in Decision Intelligence (DI) for some time now, and the proposed imperatives for adoption are diverse, ranging from principled and altruistic drivers [1] to much-needed approaches to technology enablement that automate the action-to-outcome process at scale [2], and everything in between.
From my perspective, there is a range of reasons to explore this domain: a recognition that investing in data to “make better decisions” is too vague; a desire to improve decision culture and to mitigate the risks inherent in unstructured or ad hoc decision-making based principally on heuristics; and a desire to see DI as a unifying discipline, bringing together much-needed influence from a variety of social sciences, quantitative methods, and business concepts. Clearly, there is a diversity of drivers for adoption, presumably dependent on your individual or organizational context.
While this may not come as a surprise to those of us working at the intersection of analytics and business strategy, there are also a range of very business-sounding reasons. While I intend to unpack these drivers in a future paper, at a high level, they generally fall into the categories of:
· Data and analytic investment optimization
· Optimized design and re-use of data, analytics, and decision products and artifacts
· Driving focus in decision-support activities to enable business strategy and oversight
Regardless of how you got here (these are all compelling arguments), if you’re at all like me, you’re asking yourself “ok… but what now?”
A diversity of views
Let’s start with the lack of DI consolidation. There are as many definitions for DI as there are reasons for adopting the field in your own business (just do a quick Google search). However, some thought leadership in this space has started to emerge around at least three different views of DI that share many common elements:
Dr. Lorien Pratt’s book Link, and the CDD
I enthusiastically awaited Lorien’s book [1] in 2019, hopeful that it would bring together all of the loosely collected concepts, descriptions, and interpretations of DI in one place. Lorien has been advocating for DI for over a decade and has emerged as one of the major consolidating voices in this field. I would highly encourage you to check out some of Lorien’s articles, blogs, podcasts, and videos to get a sense of her view of DI.
Her book presents a narrative-based summary of the major drivers and reasons for DI, presented alongside a few core concepts. Most notably, Lorien advocates for the use of a “Causal Decision Diagram”, or “CDD”.
The CDD is a valuable reductionist tool that allows a user to “Design” decisions by identifying three major decision components (actions, causal links, outcomes) and their cumulative effect on a desired outcome. Link walks through several examples and even presents methods for CDD design, highlighting the utility of CDDs as scaffolds for inserting data, information, analytics, or AI/ML models.
Google’s Dr. Cassie Kozyrkov, and her role as a “Chief Decision Scientist”
Another major voice in this field is Google’s Chief Decision Scientist — Dr. Cassie Kozyrkov. Apart from her impressive CV and background, Cassie is a very gifted speaker and colloquial writer who actively engages in the broader data science and business community, often leveraging humorous anecdotes or internet memes to bring her message to life. As with Lorien, I would encourage you to check out some of Cassie’s articles and videos.
To my knowledge, Cassie’s most sweeping conceptualization of DI is outlined in this article [3], which presents an artful blend of behavioral economics, psychology, data science, statistics, and decision science. I appreciate the breadth of focus, and how she simultaneously presents the conditions under which statistical rigor is demanded, while also outlining the role of hard and soft factors that influence decision-making under uncertainty. I think Cassie’s cross-discipline view of DI invites discourse on the role of bias, heuristics, and decision culture in modern data and analytics applications for decision-making.
“Engineering Decision Intelligence” model as mentioned in Gartner® research
For those of you following DI, you may have noted that the field appeared in the “Top Trends in Data and Analytics, 2022” [4], and again in their “Gartner® Hype Cycle™ for Analytics and Business Intelligence, 2021” [5].
This is clearly on their radar, and they have put together their own definition and framework for DI. I should point out as well that DI was represented during their virtual Data and Analytics Summit in May of last year.
Gartner defines Decision Intelligence as:
“a practical discipline used to improve decision making by explicitly understanding and engineering how decisions are made, outcomes evaluated, managed and improved by feedback.” [6]
To learn more, you can read the Gartner research reports (accessible only to Gartner subscribers).
What has been missing?
Notwithstanding the above, it has been very difficult to find an openly available source of guidance for those of us looking to embed DI into our businesses. While we know that DI is being leveraged inside organizations, there is currently no guide to doing so. Most of the published material in this space is either too theoretical or too closely tied to proprietary services or frameworks, and generally lacks a step-wise approach to “getting off the ground”.
This paper will not completely address this gap, but it takes an important first step by proposing an open-source framework that describes the major groupings of DI processes that can serve as a wire-frame for embedding DI into organizations of all types.
An integrated process framework for embedding Decision Intelligence
I propose here a framework that describes the major activities and enabling factors of a holistic decision intelligence operating model. This model, presented below, ties together concepts from several different DI definitions and decision sciences, and serves as a guide for the implementation of DI capabilities within organizations:
This framework is structured around two principal dimensions:
· Core Activities, which include sub-groupings for Decision Design, Decision Support, Decision Optimization, and Decision Review, outline the main activities of a DI operating model established within the organization;
· Enabling Factors (Organizational Capabilities and Decision Culture) describe the components of an organization that reinforce or enable the Core Activities.
The following sections describe each of these dimensions at a high-level and provide the basis for others in this field to build on this framework.
Core Activities — Pre-Decision
Decision Design
The first set of core activities occurs pre-decision and describes the process of “designing” a decision — that is, taking the first critical steps in providing structure to a decision so that all subsequent downstream steps can build on a sound foundation.
Decision Framing
Psychologists Daniel Kahneman and Amos Tversky define a decision frame as follows:
“… the decision-maker’s conception of the acts, outcomes, and contingencies associated with a particular choice.” [7]
This definition effectively sets the boundaries of the decision: what are my actions (levers), outcomes (desired or otherwise), and contingencies (risks and externalities)? Kahneman and Tversky further show that decision preference is highly dependent on framing, even when expected outcomes are identical.
Clearly, the process of appropriately framing a decision is key to not only defining appropriate decision scope, but also to ensuring that the desired decision process has been established. The frame, therefore, should at a minimum address questions like:
· What are my desired outcomes?
· How much am I willing to invest in obtaining these outcomes?
· What are my risks? Are they tolerable?
Indeed, you should answer these questions before you look at any data, because examining the data first can invalidate your answers — even if you don’t intend it to! [8]
A systematic approach to defining a decision frame as part of the broader set of core activities taken pre-decision is a critical component to anchoring your downstream decision processes.
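To make this concrete, a decision frame can be captured as a small, structured object before any data is touched. The sketch below (in Python, with entirely hypothetical field names and values) records the three questions above: desired outcomes, the investment ceiling, and risk tolerability.

```python
from dataclasses import dataclass

@dataclass
class DecisionFrame:
    """Boundaries of a decision, set before any data is examined."""
    desired_outcomes: list[str]   # what are my desired outcomes?
    max_investment: float         # how much am I willing to invest?
    risks: dict[str, bool]        # risk description -> is it tolerable?

    def is_viable(self) -> bool:
        """A frame is only workable if every identified risk is tolerable."""
        return all(self.risks.values())

frame = DecisionFrame(
    desired_outcomes=["increase quarterly sales by 5%"],
    max_investment=50_000.0,
    risks={"ad fatigue reduces engagement": True},
)
print(frame.is_viable())  # True
```

Writing the frame down in this form forces the questions to be answered explicitly and gives the downstream steps (mapping, modeling, retrospectives) a stable anchor.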
Decision Mapping
Dr. Lorien Pratt has done an excellent job of walking us through the utility of CDDs in her book Link and in related articles (as discussed above). This method is highly useful for capturing actions, causal links, and outcomes in a simple, graphical format that invites discussion, inclusiveness, transparency, and structure.
There are similar frameworks (e.g. the Government of Canada’s Logic Model approach [9], some interpretations of Kaplan and Norton’s Balanced Scorecard [10], etc.), however these are generally concerned with activities at the program/project level (e.g. inputs => activities => outputs), not with actions within a decision process.
The principal advantages of decision mapping are [1]:
· It reduces the amount of complexity that you must “hold in your head” by graphically representing causal relationships between actions, intermediaries, and desired outcomes
· It provides an opportunity for significantly broader participation in the decision-making process by “huddling” around CDDs
· It presents causal links that serve as points of integration for data and analytic products, metrics, information, predictive models, etc. and in this way orchestrate multiple inputs into a single decision
· It presents an opportunity to revisit the entire decision process for post-decision review and provides for the re-use of decision artifacts
The process of facilitating decision-mapping exercises is critical to enabling the adoption of structured decision-making via CDDs or related decision processes.
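A CDD is, at its core, a directed graph running from actions through causal links to outcomes. As a minimal sketch (in Python, with invented node names), even a plain dictionary can hold a mapped decision and enumerate its action-to-outcome pathways:

```python
# A minimal causal decision diagram as an adjacency mapping.
# Every key is a node; every edge is a causal link.
cdd = {
    "increase ad spend": ["brand awareness"],   # action -> intermediary
    "brand awareness": ["sales"],               # intermediary -> outcome
    "discount campaign": ["sales"],             # action -> outcome
}

def paths_to_outcome(diagram, start, outcome, path=None):
    """Enumerate every pathway from an action to a desired outcome."""
    path = (path or []) + [start]
    if start == outcome:
        return [path]
    found = []
    for nxt in diagram.get(start, []):
        found.extend(paths_to_outcome(diagram, nxt, outcome, path))
    return found

print(paths_to_outcome(cdd, "increase ad spend", "sales"))
# [['increase ad spend', 'brand awareness', 'sales']]
```

Each enumerated pathway is exactly the kind of artifact a group can huddle around, challenge, and later re-use.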
Decision Support
The activities clustered under “Decision Support” build off the “Decision Design” steps by evaluating designed decisions, and clearly identifying where there are points of integration for data and/or analytics. Equally, it invites an analysis of the decision’s design for the purpose of clearly identifying areas of uncertainty that may or may not be resolvable.
Data and Analytics Integration
A designed decision — for example, one presenting a series of action-to-outcome pathways — presents an opportunity to clearly identify where and how data or analytics can support downstream pre-decision processes (e.g. modeling), or how to determine whether an action will have a desired outcome.
The rationale is simple: if you have an action — causal link — outcome pathway, what you’re really looking at is a mathematical function. For example, if my action is to increase ad-spend, and my outcome or intermediary is an increase in sales, then there is a mathematical function that describes the relationship between ad-spend and sales.
In this way, for the purpose of working through a decision within the framework described above, we can attach specific data, metrics, analytics, predictive models, etc. to sub-components of a decision. Equally, we could consider blocks of sub-components (for example, clusters of action-link-outcome) that are known to be governed by a single aggregate relationship between actions and outcomes. Lorien Pratt’s book Link does a great job of presenting CDDs as a scaffold for integrating data and analytics.
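Continuing the ad-spend example, a causal link can literally be represented as a function and attached to its edge in the diagram. The coefficients below are invented for illustration, not estimated from real data:

```python
def sales_from_ad_spend(ad_spend: float) -> float:
    """Hypothetical link function: a $10,000 baseline plus $2.50 of
    sales for every ad dollar spent."""
    baseline, lift_per_dollar = 10_000.0, 2.5
    return baseline + lift_per_dollar * ad_spend

# Attach the function to the corresponding causal link in a mapped decision.
causal_links = {("ad spend", "sales"): sales_from_ad_spend}
print(causal_links[("ad spend", "sales")](5_000))  # 22500.0
```

In practice the function would come from a fitted statistical or ML model; the point is that the CDD tells you exactly which relationship the model needs to describe.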
Identifying Uncertainty
All decisions are taken under some degree of uncertainty — particularly in a modern business context and doubly so in a “VUCA” world [11]. Whether those sources of uncertainty are external to your organization, internal, or driven by inherent issues with data-quality, they present a challenge to the decision-maker. There are manifold approaches and “best practices” for taking decisions under uncertainty — ranging from business process perspectives [12] to statistically-driven approaches based on Expected Utility [13].
From a Decision Intelligence integration perspective, I think this part of the process is about looking at a designed decision (that is, a framed and mapped decision) and evaluating it for areas of uncertainty. Specifically, while it arose in a somewhat contentious political context, it is useful to evaluate uncertainty using a Rumsfeldian approach (known knowns, known unknowns, unknown unknowns), which has found supporters in other fields [14], [15].
A framed decision presents an opportunity to review the decision context at a high level and to identify where we have known knowns, known unknowns, etc. Equally, CDDs present an opportunity to sit down with each causal link and identify its uncertainty from a Rumsfeldian perspective (as above), but also (where data and analytics have been integrated) from a statistical perspective: What is the error? What does the quality of the data look like?
As it relates to this process framework, by proactively identifying areas of uncertainty, we are preparing ourselves to acknowledge the limitations of our decision process and giving ourselves a chance to identify systematically unresolvable uncertainty, or areas where we might reduce the level of uncertainty in our decision-making.
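One lightweight way to operationalize this is to tag each causal link with its Rumsfeldian category and then list the links whose uncertainty could still be reduced. The categories are from the source; the link names and assessments below are assumptions for illustration:

```python
from enum import Enum

class Uncertainty(Enum):
    KNOWN_KNOWN = "known known"          # quantified, trusted data
    KNOWN_UNKNOWN = "known unknown"      # identified gap we could measure
    UNKNOWN_UNKNOWN = "unknown unknown"  # acknowledged only as residual risk

# Hypothetical assessment of each causal link in a mapped decision.
link_uncertainty = {
    ("ad spend", "awareness"): Uncertainty.KNOWN_KNOWN,
    ("awareness", "sales"): Uncertainty.KNOWN_UNKNOWN,
}

def reducible(links):
    """Links whose uncertainty could shrink with more data or analysis."""
    return [link for link, u in links.items()
            if u is Uncertainty.KNOWN_UNKNOWN]

print(reducible(link_uncertainty))  # [('awareness', 'sales')]
```

The output distinguishes uncertainty we can act on from uncertainty we can only acknowledge, which is exactly the preparation this step calls for.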
Decision Optimization
A designed and enhanced decision gives us an opportunity to consider action optimization through modeling, and/or to agree on a decision approach that is most appropriate to the decision context and design. This set of processes addresses the final steps before a decision is taken.
Modeling
This quantitative step is possible where well-structured and mapped decisions present an opportunity to model aggregated action-to-outcome links within a CDD (or logic model, or similar). This potentially allows for the assessment of the aggregate impact of action(s) on decision outcomes (intended or otherwise). Where data (or analytics) don’t allow for a fully nuanced model to be developed, knowledge of an action’s gross impact on a particular outcome or intermediary (e.g. “more” or “less” of x outcome will result from this action) can still shed light on the aggregate impact of a series of actions on all outcomes.
It should be obvious that this is a complicated process that requires a significant investment of time and resources and may or may not be appropriate to the decision context. However, it is a potentially valuable step: where a quantitative decision model can be developed, actions can be optimized against targeted business outcomes.
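As a toy illustration of what such a quantitative model enables, the sketch below chains two invented link functions (action to intermediary to outcome) and grid-searches the action space for the best net outcome. Real link functions would be fitted from data; everything here is hypothetical:

```python
def awareness(ad_spend):
    """Action -> intermediary (hypothetical linear response)."""
    return 0.002 * ad_spend

def sales(awareness_level):
    """Intermediary -> outcome, with diminishing returns."""
    return 15_000 * awareness_level / (1 + awareness_level)

def net_outcome(ad_spend):
    """Aggregate impact of the action, net of its cost."""
    return sales(awareness(ad_spend)) - ad_spend

# Optimize the action against the targeted outcome by simple grid search.
best = max(range(0, 20_001, 500), key=net_outcome)
print(f"best ad spend: {best}, net outcome: {net_outcome(best):.0f}")
```

Even this crude search shows the value of the step: once action-to-outcome links are functions, “which action?” becomes an optimization problem rather than a debate.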
More broadly, we can also look at Decision Automation as a potential objective of a broader set of activities that seek to optimize an action-to-outcome link. O’Reilly has detailed the role that AI orchestration plays in decision automation and goes on to explain that an objective of AI orchestration should really be Decision Intelligence [2]!
Whether or not you choose to model a decision, it should be clear that organizations looking to model and optimize their decisions will need to draw on in-house quantitative experts (including data scientists) and/or engage external service providers to support the process.
Approach
While much is made of “data-driven” approaches to decision-making, the reality is that under many circumstances a purely data-driven approach is not optimal, or even desirable. At a recent Gartner® Data and Analytics Summit, Gartner held a few sessions focused on decision-making.
Specifically, Gartner analyst Gareth Herschel proposed that “the best decisions are blended” and that “any approach has strengths and weaknesses” [16].
I would extend this concept to mean that decision elements (e.g. action-to-outcome links in a CDD for example) can also be evaluated for the approach that is most suited to that particular element. For example, an instinctive (i.e. heuristic) approach to assessing whether or not an action will impact an outcome may be suitable where outcome sensitivity to that action is low, or where data are prohibitively expensive to gather. This is also a good way to tailor the approach to participants in the decision process (see above).
By evaluating a decision (or components of a decision) for the best approach, we’re also inviting a discussion of the role of bias in decision-making [17]. While not the subject of this paper, it’s a major force in organizational decision-making culture and process, and there are many deep references on this front to evaluate.
For the purpose of this framework, it is important to acknowledge that there are a range of approaches, and that part of your decision process should take into account the most appropriate blend of approaches for a particular decision context.
Core Activities — Post-Decision
Decision Review
So — you’ve taken all the steps you can to design, enhance, and optimize a decision — and now you’ve gone ahead and taken that decision. Specifically, an irrevocable allocation of resources has taken place [3], signalling that a decision has, in fact, been made and that you are no longer designing a decision process or assessing your available options. Even though we think we might be done, we can still leverage the structure that Decision Intelligence brings to decision-making by creating additional value from a retrospective look at our decision, and from the retention of decision artifacts.
Decision Retrospective
There is a well-established behavioral tendency to assess the quality of a decision by the quality of its outcome. This is known as the outcome bias, and it is a costly error that most of us are prone to make [18]. In fact, especially in an organizational context where resources are scarce and there is a temptation to move on quickly, outcome bias can mask important facts about the decision process that should inform how we take decisions going forward.
“Even the best decision has a high probability of being wrong. Even the most effective one eventually becomes obsolete”
Peter F. Drucker [19]
So, if we accept that Peter Drucker is correct, and that even the best decision can be wrong, then we must accept that a good decision can have a bad outcome. In other words, the quality of a decision should be evaluated on the merits of its process.
A decision retrospective is an opportunity to ask whether the decision process was sound, and in the event of a bad outcome, to determine whether there would have been something else we could have done. Note that even with a good outcome, the decision process may have been bad. A decision retrospective provides us with an opportunity to introduce continuous improvement into the decision-making framework.
Retention of Decision Artifacts
A well-documented decision — consisting of a clear frame, a clear design, potentially a decision map with links to data and analytics, lessons learned, etc. — provides a rich opportunity to retain and re-use all of these decision artifacts for future decisions. Similarly, sub-components (for example, any information that has been hypothesized and tested through an action-to-outcome causal link) can be re-used as sub-components of future decision maps.
While not all decisions will lend themselves to this type of knowledge retention, for those that do, there’s an opportunity to reduce the amount of future-work required to create similarly structured decisions.
The retention of decision artifacts — where possible — provides an opportunity for re-use, which can significantly improve the efficiency of applying DI to future decisions.
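In practice, retention can be as simple as serializing the decision’s artifacts (frame, map, retrospective notes) into a single record. The schema below is a hypothetical example, not a standard:

```python
import json

# A hypothetical decision record bundling frame, map, and retrospective.
decision_record = {
    "frame": {
        "desired_outcomes": ["increase quarterly sales by 5%"],
        "max_investment": 50_000,
    },
    "map": {"increase ad spend": ["awareness"], "awareness": ["sales"]},
    "retrospective": {"outcome": "target met", "process_sound": True},
}

# Retain the artifact...
with open("decision_record.json", "w") as f:
    json.dump(decision_record, f, indent=2)

# ...and re-use it later as a scaffold for a similarly structured decision.
with open("decision_record.json") as f:
    prior = json.load(f)
print(sorted(prior["map"]))  # ['awareness', 'increase ad spend']
```

Whatever the storage medium (files, a wiki, a decision registry), the payoff is the same: the next similar decision starts from a tested scaffold rather than a blank page.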
Enabling Factors
While the sections above describe processes that can be tailored to a decision intelligence implementation, they are unlikely to succeed without a corporate environment that enables the application of DI techniques.
Decision Culture
Decision-making authority is central to the delegation of accountability within organizations and is reinforced by a culture that supports a sound — and often participative — decision-making process. This culture is tied directly to an organization’s leadership and its willingness to build this culture [20]. In fact, it can be a competitive differentiator:
“The leaders at making and executing decisions nearly always turn in better financial results than their competitors. […] a critical attribute of a decision-focused company, the hardest to develop yet most essential for lasting success, is a culture that fosters great decision making and execution throughout the organization.” [21]
Bain and Company
It goes without saying, then, that an organization’s decision culture (values, desire for transparency in decision-making, culture of continuous improvement, etc.) is a critical enabling factor that supports a DI approach to decision-making.
Some (relatively) recent research from McKinsey highlighted the prevalence of decision-related issues, and further proposed an approach to adapting decision-making practices to the type of decision under consideration [22]. This is a great example of a proposal for a culture of decision-making that seeks to put emphasis on the decision process, and not just the outcome.
Organizational Capabilities
Finally, we can consider an organization’s capabilities, and their role in supporting a robust approach to decision-making that leverages DI principles. Here, we are discussing the “classic” three organizational dimensions — people, process, technology.
First, and most importantly, an approach to decision-making that is rooted in data or evidence requires that workforce skills are appropriately matched to this ambition. It follows, then, that alongside the skills required to undertake core work, a skills inventory of an enabling organization would include data-centric roles (analytics, data science, information management, etc.), and the skills necessary to support a technology infrastructure that enables data work.
Second, organizational processes can support the appetite for structured decision-making and review. In particular, governance, risk management, and program/project management provide a vehicle for building decision-supportive practices into corporate processes. As an example, a part of benefits attestation for program-related work can include a decision retrospective and analysis. Similarly, a mitigation for operational or strategic risk can include advocacy for a DI-related approach to strategic decisions. While this area is organization-specific, it provides the means to “hard-wire” DI or structured decision processes into the organization’s work.
Finally, an organization’s technological capabilities and infrastructure are critical to ensuring that data assets can be stored, transformed, visualized, and analyzed. These capabilities also enable the storage and retrieval of decision assets, and provide the collaboration tooling needed to support the facilitated discussions at the heart of decision intelligence.
Tailoring: how to use this framework
The application of all framework processes represents a significant investment of time and resources. As a result, a more appropriate application of this framework involves tailoring it to the decision at hand.
This is a core concept of project management, which provides an analogous scenario: the deliberate selection of a subset of all available processes, methods, etc. [23]
The concept is simple: not all decisions require all processes. Simpler or lower-impact decisions require a smaller subset, while more complex decisions potentially require more.
What comes next?
This article covers far too many areas in far too little depth. However, hopefully what it can provide is a set of interlocking processes within a simple framework that can help to embed some of the core DI concepts into a corporate context. It is my sincere hope that others looking to unite several of these ideas into a unified and consolidated DI field can draw from this work to help their thinking along — even if this framework is discarded/molded/adapted. Personally, I’ll be watching this field closely and looking at how we mature DI into a “production-ready” practice that meets the challenges of our shared future.
Acknowledgements
I would like to acknowledge the many incredibly smart people who are actively contributing to this space and building a community around decision intelligence as a discipline. In particular, I would like to acknowledge the contributions of Dr. Lorien Pratt, who has made great strides in trying to capture the scope and application of DI, and of Dr. Cassie Kozyrkov, who has used her position of authority in the tech sector to significantly elevate the profile of this emerging domain.
Note: all views presented here are my own
About the Author:
Erik is a leader in product development and analytics, with over 15 years of experience in the Canadian public sector. He’s interested in the application of digital and scientific principles to the everyday challenges faced by organizations.
Erik can be found here on LinkedIn, or reached directly at erik.balodis@gmail.com
References
[1] L. Pratt, Link: How Decision Intelligence Connects Data, Actions, and Outcomes for a Better World, Emerald Group Pub Ltd., 2019.
[2] O’Reilly Media, “AI Orchestration Enables Decision Intelligence,” Medium, 19 January 2021. [Online]. Available: https://medium.com/oreillymedia/ai-orchestration-enables-decision-intelligence-2a88d8306ac9. [Accessed 26th September 2021].
[3] C. Kozyrkov, “Introduction to Decision Intelligence,” Towards Data Science, 2 August 2019. [Online]. Available: https://towardsdatascience.com/introduction-to-decision-intelligence-5d147ddab767. [Accessed 26th September 2021].
[4] Gartner, “Top Trends in Data and Analytics, 2022", Rita Sallam, 11 March 2022. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
[5] Gartner, “Hype Cycle for Analytics and Business Intelligence, 2021”, Austin Kronz, Peter Krensky, 29 July 2021. GARTNER and HYPE CYCLE are a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission
[6] Gartner, “Decisions — They Are What You Make Them,” G. Herschel, Gartner Data and Analytics Summit, Virtual, 2021.
[7] A. Tversky and D. Kahneman, “The Framing of Decisions and the Psychology of Choice,” Science, vol. 211, no. 30, pp. 453–458, 1981.
[8] C. Kozyrkov, “The First Thing Great Decision Makers Do,” Harvard Business Review, 25 June 2019. [Online]. Available: https://hbr.org/2019/06/the-first-thing-great-decision-makers-do. [Accessed 26 September 2021].
[9] Government of Canada, “Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies,” [Online]. Available: https://www.canada.ca/en/treasury-board-secretariat/services/audit-evaluation/centre-excellence-evaluation/guide-developing-performance-measurement-strategies.html#LogicModel. [Accessed 26 September 2021].
[10] R. S. Kaplan and D. P. Norton, “The Balanced Scorecard — Measures that Drive Performance,” Harvard Business Review, January-February 1992.
[11] N. Bennet and G. J. Lemoine, “What VUCA Really Means for You,” Harvard Business Review, January-February 2014.
[12] A. Alexander, A. De Smet and L. Weiss, “Decision Making in Uncertain Times,” McKinsey & Company, 24 March 2020. [Online]. Available: https://www.mckinsey.com/business-functions/organization/our-insights/decision-making-in-uncertain-times. [Accessed 26 September 2021].
[13] J. Chen, “Expected Utility,” Investopedia, 7 May 2021. [Online]. Available: https://www.investopedia.com/terms/e/expectedutility.asp. [Accessed 26 August 2021].
[14] M. Shermer, “Rumsfeld’s Wisdom,” Scientific American, 1 September 2005. [Online]. Available: https://www.scientificamerican.com/article/rumsfelds-wisdom/. [Accessed 26 August 2021].
[15] A. Mantovani, “Known knowns, known unknowns, unknown unknowns, & Leadership,” Medium, 28 April 2020. [Online]. Available: https://medium.com/@andreamantovani/known-knowns-known-unknowns-unknown-unknowns-leadership-367f346b0953. [Accessed 26 August 2021].
[16] Gartner, “Decisions — They Are What You Make Them,” G. Herschel, Gartner Data and Analytics Summit, Virtual, 2021. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
[17] J. B. Soll, K. L. Milkman and J. W. Payne, “Outsmart Your Own Biases,” Harvard Business Review, pp. 64–71, May 2015.
[18] F. Gino, “What We Miss When We Judge a Decision by the Outcome,” Harvard Business Review, 2 September 2016. [Online]. Available: https://hbr.org/2016/09/what-we-miss-when-we-judge-a-decision-by-the-outcome. [Accessed 2021].
[19] P. F. Drucker, “The Effective Decision,” Harvard Business Review, 1967.
[20] D. Waller, “10 Steps to Creating a Data-Driven Culture,” Harvard Business Review, 6 Feb 2020. [Online]. Available: https://hbr.org/2020/02/10-steps-to-creating-a-data-driven-culture.
[21] M.W. Blenko, P. Rogers and P. Meehan, “Create a decision-focused culture,” Bain & Company, 15 Apr 2011. [Online]. Available: https://www.bain.com/insights/decision-insights-7-create-a-decision-focused-culture/.
[22] A. de Smet, G. Jost and L. Weiss, “Three keys to faster, better decisions,” McKinsey & Company, 1 May 2019. [Online]. Available: https://www.mckinsey.com/business-functions/people-and-organizational-performance/our-insights/three-keys-to-faster-better-decisions.
[23] S. Whitaker, “The Benefits of Tailoring: Making a Project Management Methodology Fit,” PMI White Paper, September 2014. [Online]. Available: https://www.pmi.org/learning/library/tailoring-benefits-project-management-methodology-11133.