
The Next Step in Machine Learning’s Evolution: Graph Neural Networks

If you have multi-dimensional networked datasets, consider a Graph Neural Network

Towards Data Science
Jul 20, 2021


The capacity to consistently attain enterprise value from mission-critical machine learning deployments hinges on at least one of three applications: classifying entities, predicting events, and understanding why events happened.

No matter which technique is used, whether supervised, unsupervised, or reinforcement learning, or even deep learning at scale, conventional machine learning has limitations when applied to these business problems.

It works well for many types of data, but struggles, and often fails outright, when applied to high-dimensionality networked datasets. These limits demand a new approach for social network research, recommendation engines, biology, chemistry, computer vision, and Natural Language Processing deployments in which context is pivotal.

Graph Neural Networks excel at predicting events, explaining them, and classifying entities at scale, delivering striking accuracy in these and other pragmatic deployments. Pairing their reasoning with semantic inferences creates additional knowledge for predicting events based on the multifaceted, contextualized relationships in high-dimensionality data.

Whether applied to anti-money laundering, disease prediction, or e-commerce product recommendations, they substantially improve machine learning's capability to perform the previously mentioned tasks, so enterprise users can capitalize on this technology.

Machine Learning’s Problem

The problem with typical machine learning deployments is that they only work well on Euclidean datasets: low-dimensionality data that is easily transformed into numbers. These numbers (called feature vectors) describe the features of anything from consumer behavior to the distance between the eyes and nose for facial recognition. Feature vectors underpin accurate predictions, for example detecting objects regardless of where they appear on the screen, how they are lit, or what size they are.
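To make that concrete, here is a minimal sketch of the "vectorize, then classify" workflow this paragraph describes. The facial-geometry feature names, measurements, and identity labels below are all hypothetical, chosen only to show the shape of the data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per face: [eye_distance_cm, eye_to_nose_cm, face_width_cm]
X = np.array([
    [6.2, 4.8, 14.1],
    [5.9, 4.5, 13.6],
    [6.8, 5.3, 15.0],
    [6.5, 5.1, 14.7],
])
y = np.array([0, 0, 1, 1])  # hypothetical identity labels

# Each example is a fixed-length feature vector, so a conventional
# classifier can be trained and applied directly.
clf = LogisticRegression().fit(X, y)
print(clf.predict([[6.4, 5.0, 14.5]]))  # classify a new feature vector
```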

However, this approach's effectiveness quickly declines on non-Euclidean datasets with more than two dimensions, in which the underlying context is vital to representing real-world entities for calculations. For a 'Know Your Customer' use case in finance, for example, it would be difficult to classify clients based on their friends and enemies in a network (and the friends and enemies of those people) because this data isn't easily vectorized. Each number in the vector would depend on other parts of the graph, making the approach too complex for conventional models, but not for Graph Neural Networks.
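The following toy example (invented names and risk scores, not the author's data) illustrates that entanglement: even a simple per-client feature already depends on relationships two hops away, so no fixed-length vector computed in isolation can capture it.

```python
# Made-up clients, base risk scores, and friendship edges.
risk = {"alice": 0.1, "bob": 0.7, "carol": 0.9, "dave": 0.2}
friends = {"alice": ["bob"], "bob": ["alice", "carol"], "carol": ["bob"], "dave": []}

def neighborhood_risk(person, hops=2):
    """Average risk over everyone reachable within `hops` steps."""
    seen, frontier = {person}, {person}
    for _ in range(hops):
        frontier = {f for p in frontier for f in friends[p]} - seen
        seen |= frontier
    return sum(risk[p] for p in seen) / len(seen)

for p in risk:
    print(p, round(neighborhood_risk(p), 2))
# Alice's score changes whenever Carol's does, two hops away: the "vector"
# for one node is entangled with the rest of the graph.
```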

Graph Neural Networks

In the preceding example, or any other in which the context of the situation (like the relationships between chemicals when developing pharmaceuticals) influences the data represented in machine learning models, Graph Neural Networks accomplish three tasks. They predict relationships by determining whether an entity should be linked to another entity (a person, business object, etc.), they classify what type each node is, and they pinpoint shapes or clusters in complicated graphs to detect subgraphs.

Most applications of these neural networks combine all three of these capabilities, although their predictive might is easily the most impressive in use cases like recommending products, services, or even answers to customer service questions. They’re also adept at foreseeing actions related to AML and other financial services regulations, or foretelling how people will react to different events or actions in social networks.
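All three capabilities rest on the same message-passing idea: each node repeatedly aggregates its neighbors' feature vectors to build a context-aware embedding. Below is a minimal, framework-free sketch of one such layer, in the normalized-adjacency style popularized by graph convolutional networks; the graph, features, and weights are toy values, not a trained model.

```python
import numpy as np

A = np.array([[0, 1, 1, 0],      # adjacency matrix of a small 4-node graph
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 8)        # initial node feature vectors (4 nodes, 8 features)
W = np.random.randn(8, 2)        # learnable weights, here mapping to 2 node classes

A_hat = A + np.eye(4)            # add self-loops so each node keeps its own features
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

# One layer: aggregate neighbor features, apply a linear transform, then ReLU.
H = np.maximum(0, A_norm @ X @ W)
print(H.shape)                   # (4, 2): one contextualized embedding per node
```

Stacking a few such layers and training the weights yields node embeddings that can feed a classifier (node classification), be compared pairwise (link prediction), or be clustered (subgraph detection).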

Scalable Contextualization

The final of these use cases is the most impressive, not just because of the predictive accuracy of Graph Neural Networks, but because of the scale of the context at which they achieve these results. One interesting application is to train these models on all the world events recorded in newspapers over the course of a year. This particular undertaking included 400,000 facts about 23,000 disparate entities (including countries, governments, institutions, and more) and over 250 event types, such as making statements, issuing attacks, and threatening or killing people.

Graph Neural Networks can analyze the context of this news data at scale, identify entities and their multifaceted relationships to one another, and, for example, issue accurate predictions about whom the president will threaten and from whom he'll seek assistance, based on everything that happened in the previous 10 months. Predictions are conditioned on the prior events and the desired timestamp.
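Once such a model is trained, querying it amounts to scoring candidate future events at a chosen timestamp. The sketch below is purely illustrative: the entities, event types, embeddings, and the scoring rule are placeholders standing in for whatever the trained network has learned, not the model described above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder embeddings that a trained model would normally supply.
entity_emb = {"president": rng.normal(size=16), "country_a": rng.normal(size=16),
              "country_b": rng.normal(size=16)}
event_emb = {"threaten": rng.normal(size=16), "request_aid": rng.normal(size=16)}

def score(subject, event, obj, t):
    # Toy scoring rule: compatibility of subject, event-type, and object
    # embeddings, lightly modulated by the requested timestamp.
    s, r, o = entity_emb[subject], event_emb[event], entity_emb[obj]
    return float((s * r) @ o) * np.cos(0.01 * t)

candidates = [("president", "threaten", "country_a"),
              ("president", "request_aid", "country_b")]
t = 300  # e.g. day-of-year timestamp for the prediction
print(max(candidates, key=lambda c: score(*c, t)))  # highest-scoring candidate event
```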

Impressive Implications

The implications of this approach can revolutionize enterprise applications of machine learning for mission-critical processes. Almost anything is possible, from rapidly assessing a patient's medical future by analyzing his or her medical past (and present), to predicting the time and place of equipment failure across a fleet of assets in the Industrial Internet. In marketing, for example, Graph Neural Networks can reveal which strategies are likely to produce the most conversions for specific customer segments, while offering the same advantages for sales and e-commerce applications.

Graph Neural Networks' unequaled understanding of the context between events is the blueprint for revealing what will happen in the future, with a contextualization and accuracy eclipsing that of conventional machine learning.



Dr. Jans Aasman is an expert in Cognitive Science and CEO of Franz Inc., an early innovator in AI and leading provider of Knowledge Graph solutions.