
We want high-quality business analytics, a better "process of discovering meaningful and actionable insight in data", as Prof. Dursun Delen describes it in his book Prescriptive Analytics, because we want to make better decisions and get better at problem-solving by using mathematical/statistical models or machine learning/deep learning algorithms, technologies, tools, and practices.
There have been many attempts to define a rational human decision-making process, and the best known came from Herbert Alexander Simon (1916–2001), an American economist and political scientist who received the Turing Award in 1975 and the Nobel Prize in Economics in 1978.
For a successful outcome, the decision-making process should follow standardized, systematic, and logical steps. In 1977, H. A. Simon proposed a model that involves intelligence, design, choice, and implementation, with a form of feedback between each phase. Simon's model is still the most concise and complete framework for evidence-based, scientific decision-making.
Humans willingly settle for something less than the best, what Simon calls satisficing, because they behave with bounded rationality: they have a limited capacity for rational thinking, so the best solution they come up with may not be the real-world best solution. This is also another reason why many managers go with their gut feelings. My dear friend Emrah has a great book on this, The Myth of Experience, if you want to read more.
Briefly, the intelligence phase is about understanding the goals and objectives, and about problem identification, problem classification, and problem ownership. The output of this phase is a clear problem statement.
The second phase, design, is about model abstraction, model formulation, alternative generation, and criteria identification. This phase validates the problem statement and results in a set of alternatives to be considered.
The third phase, choice, is about evaluating the alternatives, ranking them, and selecting the best one. This is where sensitivity analysis is done to verify and test the chosen alternative, along with what-if analysis and planning the implementation; a toy sketch of this ranking-and-sensitivity step follows below.
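As an illustration only (my own sketch, not part of Simon's work), here is how the choice phase could look as a simple weighted-scoring model with a crude sensitivity check. The alternatives, criteria, scores, and weights are all hypothetical.

```python
# Toy weighted-scoring model for the "choice" phase (hypothetical data throughout).
import itertools

# Hypothetical alternatives scored against hypothetical criteria (0-10 scale).
scores = {
    "build in-house":    {"cost": 4, "speed": 3, "quality": 9},
    "buy off-the-shelf": {"cost": 7, "speed": 9, "quality": 6},
    "outsource":         {"cost": 6, "speed": 7, "quality": 7},
}
base_weights = {"cost": 0.5, "speed": 0.2, "quality": 0.3}

def rank(weights):
    """Rank alternatives by weighted score, best first."""
    totals = {alt: sum(weights[c] * s for c, s in crit.items())
              for alt, crit in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

print("Base ranking:", rank(base_weights))

# Crude sensitivity / what-if analysis: nudge each weight and see if the winner changes.
for criterion, delta in itertools.product(base_weights, (-0.1, 0.1)):
    what_if = dict(base_weights)
    what_if[criterion] = max(0.0, what_if[criterion] + delta)
    print(f"{criterion} {delta:+.1f} -> winner: {rank(what_if)[0]}")
```

If small changes to the weights flip the winner, the choice is fragile and deserves more scrutiny before moving to implementation.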
The fourth phase is implementation: doing what it takes, taking action, and understanding the impact. The results of the implementation feed into the next cycle.
Now that we have a better understanding of the human rationale behind decision-making, let's look at how the tools and technology have evolved over the years to help us make better decisions.
A Short History of the Evolution of Decision-Making Systems
Prior to the 1970s, Operational Research (OR) dominated the industry with heuristic methods such as simulation models. Then rule-based Expert Systems (ES) were introduced, and "if/then" statements regulated the intelligent decision algorithms.
In the 1980s, enterprises needed to unify the data from different departments into a single source of truth with a schema. Database Management Systems and ERP systems emerged, helping data to be integrated and cross-referenced. With easy access to data, basic static, on-demand reports gave an overall view of the system.
The 1990s brought data warehouses, with Prism as a pioneer; Inmon and Kimball published the go-to books for data warehousing, Building the Data Warehouse and The Data Warehouse Toolkit. Business executives became aware of their key performance indicators through decision support systems and were able to track results with the visual support of dashboards and scorecards.
The 2000s brought the Business Intelligence branding to DW-driven decision support systems, and the amount of data was now big enough to require "mining" it to "discover" the meaning behind the numbers. The increased demand and overhead of consistently patching and updating DW infrastructure benefited from the emerging IaaS movement. Software systems were also getting complicated and needed to be split up, with contracts defined for each communication; Service Oriented Architecture emerged to address these requirements.
In the 2010s, a variety of data sources, from smart meters to smart health monitors and social media feeds, overwhelmed the data ecosystem, and "Big Data" became a thing everyone was part of. Massively parallel computers gave way to Hadoop and its MapReduce programming model, complemented by built-in NoSQL query power.
The 2020s challenged the limitations of batch processing; real-time analytics is now at the core of every transaction, with a requirement for fully automated pipelines to shorten feedback loops and deliver value sooner.
Types of Analytics
The most basic type of analytics is Descriptive Analytics. This is about collecting data from the information systems designed to support the decision-making process and running reports from Data Warehouses, Data Lakes, and Data Lakehouses. Such reports look at correlations and causalities and attempt to answer "What happened?". They give insight, and the dashboards and monitoring systems display the metrics that enable the right decisions.
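To make "What happened?" concrete, here is a minimal sketch of a descriptive report: aggregating transactional records into the kind of summary a dashboard would display. The column names and figures are invented, and pandas is just one of many tools you could use for this.

```python
# Minimal descriptive-analytics sketch: summarise "What happened?" from raw records.
import pandas as pd

# Hypothetical sales records, as you might pull from a warehouse or lake.
sales = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC", "AMER"],
    "month":   ["2024-01", "2024-02", "2024-01", "2024-02", "2024-01"],
    "revenue": [120_000, 135_000, 90_000, 80_000, 150_000],
})

# A typical dashboard metric: total and average revenue per region.
report = (sales.groupby("region")["revenue"]
               .agg(total="sum", average="mean")
               .sort_values("total", ascending=False))
print(report)
```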
One level more complicated than just aggregating data and doing number crunching is Predictive Analytics, which answers "What will happen?" and comes up with recommendations. The power of predicting customer behaviour and demand, financial market movements, and stock prices is priceless. Data mining uses exploratory analysis with multi-dimensional cubes on high-performance infrastructure, and ML models use supervised learning (classification, regression, or time-series forecasting) to predict outcomes, or unsupervised learning (where you have no labelled outcomes) to cluster the information. Rolls-Royce has an Intelligent Engine Health Management (EHM) system that tracks engine health worldwide using the hundreds of terabytes of data generated by on-board sensors and live satellite feeds. It is a form of predictive monitoring that flags potential threats and recommends engine improvements. They have extended the system to achieve 14% better fuel efficiency in their aviation business.
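As a minimal sketch of the "What will happen?" layer, here is a supervised classifier trained on synthetic data standing in for, say, customer churn prediction. The features, labels, and model choice are all assumptions for illustration, not a prescription.

```python
# Minimal predictive-analytics sketch: supervised classification on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for, e.g., customer behaviour features and a churn label.
X, y = make_classification(n_samples=1_000, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# "What will happen?" for unseen records, plus a rough quality check.
predictions = model.predict(X_test)
print("Hold-out accuracy:", accuracy_score(y_test, predictions))
```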
Prescriptive Analytics uses sophisticated optimization, simulation, and heuristics-based decision-modelling techniques to answer "What should I do?", so that the system is capable of thinking on behalf of humans. These systems run reinforcement learning alongside applied statistics, operations research, machine learning, natural language processing, image processing, speech recognition, and signal processing. Kensho is one of the most significant examples, augmenting the ability to analyse the impact of real-world events on the financial markets, answer complex financial queries, and categorise events, work that would otherwise require an analyst earning an average salary of $350k-$500k.
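For "What should I do?", here is a minimal optimisation sketch: a small linear programme choosing how many units of two hypothetical products to make so that profit is maximised under capacity constraints. All numbers are invented; real prescriptive systems layer simulation, heuristics, and ML on top of this kind of optimisation core.

```python
# Minimal prescriptive-analytics sketch: a linear programme answering "What should I do?"
from scipy.optimize import linprog

# Maximise profit 40*A + 30*B (linprog minimises, so negate the objective).
objective = [-40, -30]

# Hypothetical capacity constraints.
lhs = [[2, 1],   # machine hours per unit of A, B
       [1, 3]]   # labour hours per unit of A, B
rhs = [100, 90]  # hours available

result = linprog(c=objective, A_ub=lhs, b_ub=rhs,
                 bounds=[(0, None), (0, None)], method="highs")
units_a, units_b = result.x
print(f"Make {units_a:.1f} units of A and {units_b:.1f} of B "
      f"for a profit of {-result.fun:.0f}")
```

The output is not a report or a forecast but a recommended action, which is exactly the line that separates prescriptive analytics from the other two types.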
I hope this clarifies the confusion around the types of analytics, with the historical background on decision-making systems and models, as well as the examples. With reinforcement systems in place, I reckon we need a new word for the automated version of Prescriptive Analytics ("Automated Analytics" sounds very cliché) so that we can draw the line between where the recommendation ends and where the action is taken. Just as we have trusted our satisficed decisions, maybe we can start trusting the systems that learn from their feedback loops…
If you have any questions, please leave a comment, and if you liked this, don't forget to follow me. Until the next one, take care!