Thoughts and Theory
A unifying mathematical framework in which deep learning models and protocols can be defined and executed

Graphs are natural objects for modeling relations between entities, and they have been used successfully to model such relations in many applications, including social networks, physical systems, and scene understanding. Graphs are, however, limited in their modeling ability because they can only capture pairwise relationships. What if we want to model relationships between the relationships? We might also want a model that builds up a hierarchy of such relationships in an abstract fashion. This, for instance, shows up in the way we as humans think: the abstraction and analogy we are capable of require building relations among relations in a hierarchical manner. Furthermore, the importance of this perspective is evident in the unprecedented success of deep learning models, where complicated concepts are built from simpler ones in a hierarchical fashion.
Over the past few years, graph neural networks have emerged as a computational framework that facilitates neural network-type computations on graphs. Given the above perspective, it is natural to think about generalizing graph neural networks to networks that model higher-order relationships. From a practical standpoint, there are other objects on which one might want to perform deep learning-type computations: triangular and polygonal meshes, polyhedral objects, and so on. It would therefore be desirable to have a single mathematical framework that allows deep learning-type computations on all of these objects. The purpose of this article is to show that such a mathematical framework is possible and, more importantly, intuitive to understand. Specifically, this article introduces Cell Complex Neural Networks (CXN) [1], a general unifying training framework on cell complexes that vastly expands the domains upon which one can apply deep learning protocols. Cell complexes are mathematical constructions that generalize graphs, 3D meshes, simplicial complexes, and polygonal complexes. The training on a cell complex is defined in an entirely combinatorial fashion, allowing for intuitive manipulation, conceptualization, and implementation.
We first explain what cell complexes are and then move on to the definition of cell complex neural networks.
What are cell complexes?
A cell complex is a mathematical construct that is built by combining simple primitives called cells. These cells are attached together to form the final object, which is called a cell complex [2]. Examples of cell complexes are given in the following figure.

The most primitive cells are the nodes, also called 0-cells. 1-cells are the edges, 2-cells are the faces, and so on. Notice that a 2-cell may bound an arbitrary number of edges, a property that less general complexes, such as simplicial complexes, do not have. The exact mathematical definition of cell complexes is not important for this article, and this intuition suffices to understand the concept (see [2] for a more precise treatment of cell complexes). The most relevant fact for us is that cell complexes generalize graphs in the sense that they can model higher-order relationships via the higher-dimensional cells.
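As a concrete toy example (the encoding below is just one illustrative choice, not a standard data structure from the paper), a filled square can be written down as a small 2-dimensional cell complex:

```python
# A filled square as a 2-dimensional cell complex.
# The single 2-cell is bounded by four edges, something a simplicial
# complex (whose 2-cells are triangles) cannot express directly.
square_complex = {
    0: ["v0", "v1", "v2", "v3"],                                   # 0-cells: nodes
    1: [("v0", "v1"), ("v1", "v2"), ("v2", "v3"), ("v3", "v0")],   # 1-cells: edges
    2: [("v0", "v1", "v2", "v3")],                                 # one 2-cell: the square face
}
```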
How do we represent cell complexes?
Cell complexes can be represented using adjacency matrices. The following figure illustrates how to construct these matrices; notice that they generalize the adjacency matrices of graphs.
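To make this concrete, here is a minimal NumPy sketch of how the incidence and adjacency matrices of a toy 2-dimensional complex (a square split into two triangles) could be assembled. The encoding and function names are illustrative choices, not part of the CXN paper; the figure below shows the same construction graphically.

```python
import numpy as np

# A square split into two triangular 2-cells.
# Higher cells are encoded by the indices of the lower cells on their boundary.
nodes = [0, 1, 2, 3]                                  # 0-cells
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]      # 1-cells (pairs of node indices)
faces = [(0, 1, 4), (2, 3, 4)]                        # 2-cells (tuples of edge indices)

def boundary_matrix(higher_cells, num_lower):
    """B[i, j] = 1 if lower cell i lies on the boundary of higher cell j."""
    B = np.zeros((num_lower, len(higher_cells)), dtype=int)
    for j, cell in enumerate(higher_cells):
        for i in cell:
            B[i, j] = 1
    return B

B1 = boundary_matrix(edges, len(nodes))   # node-to-edge incidence
B2 = boundary_matrix(faces, len(edges))   # edge-to-face incidence

def adjacency_from_boundary(B):
    """Two k-cells are adjacent if they share at least one (k+1)-coface."""
    A = (B @ B.T > 0).astype(int)
    np.fill_diagonal(A, 0)
    return A

A0 = adjacency_from_boundary(B1)   # usual graph adjacency between nodes
A1 = adjacency_from_boundary(B2)   # adjacency between edges via shared faces
print(A0)
print(A1)
```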

The best way to understand cell complex neural networks is to see how they generalize message passing graph neural networks.
Graph Neural Networks
The most popular types of graph neural networks can be understood under the message passing model [3]. In the message passing model, we are given a graph G = (V, E) and an embedding vector for every node in the graph; sometimes we are also given an embedding vector for every edge in G. Given a desired depth L > 0 for the model, a message passing network updates the embedding of every node in the graph L times according to the following update relation [3]:
![Message passing scheme on graphs [3].](https://towardsdatascience.com/wp-content/uploads/2021/07/1jkMCebF31fHwg4CcYrg0Qg.png)
Here, the vector h is the feature vector available at node i. The superscript (k) on h indicates the update stage. The functions phi and alpha are trainable functions, such as MLPs, and the function E is a permutation-invariant differentiable function (for example, a sum). The vector eij is an embedding vector representing the edge between node i and node j, and it is sometimes included in the computations.
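A minimal sketch of one such node update, assuming a sum aggregator for E and small MLPs for phi and alpha (all names and tensor shapes below are illustrative, not taken from [3]), could look like this in PyTorch:

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One node update: h_i <- alpha(h_i, sum_j phi(h_i, h_j, e_ij))."""

    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.phi = nn.Sequential(                 # message function
            nn.Linear(2 * node_dim + edge_dim, hidden_dim), nn.ReLU())
        self.alpha = nn.Sequential(               # update function
            nn.Linear(node_dim + hidden_dim, node_dim), nn.ReLU())

    def forward(self, h, edge_index, e):
        # h: (num_nodes, node_dim), edge_index: (2, num_edges), e: (num_edges, edge_dim)
        src, dst = edge_index
        messages = self.phi(torch.cat([h[dst], h[src], e], dim=-1))
        agg = torch.zeros(h.size(0), messages.size(-1))
        agg.index_add_(0, dst, messages)          # permutation-invariant sum over neighbors
        return self.alpha(torch.cat([h, agg], dim=-1))

# Toy usage: 3 nodes, 2 directed messages (node 1 -> node 0 and node 2 -> node 0).
h = torch.randn(3, 8)
edge_index = torch.tensor([[1, 2], [0, 0]])       # row 0: sources, row 1: destinations
e = torch.randn(2, 4)
layer = MessagePassingLayer(node_dim=8, edge_dim=4, hidden_dim=16)
h_next = layer(h, edge_index, e)                  # (3, 8) updated node embeddings
```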
The same computation is also illustrated in the following figure:

Cell Complex Neural Networks
Now we are ready to introduce cell complex neural networks. A cell complex neural network utilizes the notion of message passing and uses the higher-order cells in the complex to perform these computations. The core idea is rather intuitive: in a graph, the edges are used to pass messages between nodes. Now suppose that we have a cell complex of dimension 2, such as a triangulated surface. In this case we have nodes, edges, and faces. We can apply the same message passing idea between edges and consider the bounding faces to be the tool that transports messages between edges. We explain the idea concretely via an example.
Adjacency message passing scheme (AMPS):
Consider the cell complex given on the left of the following figure.
![Two layers Cell Complex Neural Network (CXN). The computation is demonstrated with respect to the red target vertex. The information flow goes from lower cells to higher incident cells [1]. Image Source Simplicial Complex representation Learning](https://towardsdatascience.com/wp-content/uploads/2021/07/1eJSpGhPjbC0CI4sD3G17mQ.png)
In the beginning, the green target node takes messages from its surrounding nodes (the brown, the red, and the orange nodes) as well as from the edges used to send these messages (the yellow, the dark pink, and the light pink edges). This can be seen in Figure (a) above. This is exactly what we do in the graph neural network setting described above.
Now we move to the second layer of the cell complex network, where we want to update the vectors on the edges. In this case we consider the neighbors of the yellow edge to be the pink and the grey edges, because they all bound a common higher-dimensional face (the dark grey face). This is illustrated in Figure (b) above. Other edges and nodes are updated similarly.
You can think of the flow of information in this message passing scheme as shown in the following figure:

Information flows radially from lower-dimensional cells to higher ones, and as we go deeper in the network, we also get further away from the source node.
Mathematically, this can be described by the following simple equation:
![adjacency message passing scheme (AMPS) [1].](https://towardsdatascience.com/wp-content/uploads/2021/07/19I58MgggM4uo5EMey7MRyw.png)
Here $H_m$ is the matrix of embeddings of the m-dimensional cells in the complex, and the superscript on H indicates the update stage. The function M is the message propagation function, which depends on trainable weights θ. Note that the embeddings at stage (k) depend on the adjacency relations, the neighboring cells, and the cells of one dimension higher. This equation, while simple, is very general, and it subsumes all existing message passing graph neural networks.
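As a rough sketch of what an AMPS-style update for the cells of one fixed dimension m might look like in code, assuming the adjacent cell pairs and their shared cofaces have already been precomputed (the class and argument names are hypothetical, not from the reference implementation of [1]):

```python
import torch
import torch.nn as nn

class AMPSLayer(nn.Module):
    """One AMPS step for the cells of a fixed dimension m.

    Two m-cells are neighbors when they bound a common (m+1)-cell; the
    message from cell j to cell i travels through that coface c and may
    use its feature vector as well.
    """

    def __init__(self, cell_dim, coface_dim, hidden_dim):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(2 * cell_dim + coface_dim, hidden_dim), nn.ReLU())
        self.alpha = nn.Sequential(
            nn.Linear(cell_dim + hidden_dim, cell_dim), nn.ReLU())

    def forward(self, H_m, H_up, neighbor_index, coface_index):
        # H_m:  (num_m_cells, cell_dim)        features of the m-cells
        # H_up: (num_(m+1)_cells, coface_dim)  features of the (m+1)-cells
        # neighbor_index: (2, num_pairs)       pairs (i, j) of adjacent m-cells
        # coface_index:   (num_pairs,)         the shared (m+1)-cell of each pair
        i, j = neighbor_index
        msg = self.phi(torch.cat([H_m[i], H_m[j], H_up[coface_index]], dim=-1))
        agg = torch.zeros(H_m.size(0), msg.size(-1))
        agg.index_add_(0, i, msg)              # sum the messages arriving at each m-cell
        return self.alpha(torch.cat([H_m, agg], dim=-1))
```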
Coadjacency message passing scheme (CMPS):
Cell complexes are more complicated than graphs, and they admit other natural message passing schemes. In particular, instead of utilizing the adjacency relation to pass messages, we can utilize the co-adjacency relation. This is illustrated in the following figure:

CMPS can be used, for instance, when we want the computations to live on the faces (say, for mesh segmentation, where we want the faces to be labeled). Mathematically, this can be written as follows:
![Coadjacency message passing scheme (CMPS) [1].](https://towardsdatascience.com/wp-content/uploads/2021/07/1OY6HTGWKwBxZB3fOyv8Dpg.png)
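Structurally, a CMPS layer looks just like the AMPS sketch above, except that the neighbor pairs come from the co-adjacency relation and the messages travel through the shared lower-dimensional faces. Assuming an incidence matrix between (m-1)-cells and m-cells, the co-adjacency relation could be computed as in the following toy example (values chosen for illustration only):

```python
import numpy as np

# Co-adjacency: two m-cells are co-adjacent when they share a common (m-1)-face.
# B is the incidence matrix with rows indexing (m-1)-cells and columns indexing m-cells.
B = np.array([[1, 1, 0],    # face 0 bounds cells 0 and 1
              [0, 1, 1],    # face 1 bounds cells 1 and 2
              [1, 0, 1]])   # face 2 bounds cells 0 and 2

C = (B.T @ B > 0).astype(int)   # C[i, j] = 1 if m-cells i and j share a face
np.fill_diagonal(C, 0)
print(C)
```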
It is also illustrative to look at the way information flows using CMPS:

There are other natural message passing schemes that can be defined on cell complexes. We leave the details to our paper [1].
But why do we need neural networks with such generality?
There are multiple reasons why one might want to define deep learning on cell complexes. First, it is mathematically elegant to have a single deep learning model that subsumes all other models. Cell complexes are generalizations of graphs, which in turn generalize images. However, the gap between graphs and cell complexes is massive, and it contains many other complexes (e.g., simplicial complexes, polyhedral complexes, and ∆-complexes). All of these objects are important in practice, and sometimes it is desirable to work with one family over another. For instance, when working with partial differential equations on surfaces, it is often desirable to work with quad meshes. Second, it is also desirable to have an intuitive mathematical framework that other practitioners can reconstruct, implement, and perhaps customize for their own purposes. Message passing schemes on graphs are popular because they satisfy these conditions. Having these schemes generalized to more general objects means that all existing message passing-based models on graphs can be elegantly generalized to cell complexes using the general schemes introduced above.
More information about cell complex neural networks can be found in the paper [1]. In particular, more message passing schemes are discussed, as well as the case where the cell complex is oriented.
References:
[1] Mustafa Hajij, Kyle Istvan, and Ghada Zamzmi. Cell complex neural networks. In NeurIPS Workshop on Topological Data Analysis and Beyond, 2020.
[2] Allen Hatcher. Algebraic Topology. Cambridge University Press, 2002.
[3] Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., and Dahl, G. E. Neural message passing for quantum chemistry. In Proceedings of the International Conference on Machine Learning, 2017.