Dear Math, I Am Not Your Therapist, Solve Your Own Problems.

Or actually, I will build a neural network that will solve those problems for you.

Dalya Gartzman
5 min read · Dec 2, 2018


Imagine a website that is like a private math tutor. It can solve equations, and even guide you through the solution so you can solve the next equation yourself. Now imagine that one day you want to interact with your automatic tutor not only through equations, but also in your natural language.

Well, imagine no more. This website, Simplisico, exists. And in order for you to be able to interact with it in your natural language, one thing that needs to happen along the way is that Simplisico has to figure out on its own which equation you are actually trying to solve. Which is where I come in. In this post I will share with you how I created a POC (proof of concept) for the challenge of reading a textual math problem and extracting the underlying equation.

Don’t worry, we don’t really have 42 steps to cover, and you have less than 10 minutes of reading ahead of you.

Let’s Make a Plan!

Before we go into the details, let’s think about how we would approach this challenge in general.

Step (1) - when facing a textual math problem, make sure the problem is written in a language that we can read. I mean, if the question is in Basque and I don’t speak Basque, I would have to translate the question to a language I understand before I can even try to tackle the mathematical task.

Then, one way to solve the mathematical task itself is to:
(2) “hide” the numbers in the question behind parameters,
(3) understand the structure of the equation,
(4) insert the numbers in the correct places,
as the small worked example right after this list illustrates.
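For instance, steps (2)-(4) might transform a question like this (the question, the placeholder names, and the equation template are made up for illustration):

```python
question = "Dana has 3 apples and buys 5 more. How many apples does she have?"

# Step (2): "hide" the numbers behind parameters, keeping the actual values on the side.
masked  = "Dana has N0 apples and buys N1 more. How many apples does she have?"
numbers = [3, 5]

# Step (3): understand the structure of the equation (here, a sum of two terms).
template = "x + y"

# Step (4): insert the numbers into the correct places.
equation = "3 + 5"
```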

In the next sections, we will see how we perform steps (1)-(2) using existing tools, and how we build a neural network that achieves steps (3)-(4).

Preprocessing - Teaching Our Machine to Read

Let’s see how we translate the question to a language the machine understands, in two steps, using two existing tools:

First, I did something called Tokenization: I replaced the numbers in the question with variable names, separated the question into words, and removed punctuation. I did all that easily using the NLTK package.
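Here is a minimal sketch of what that could look like; the regular expression and the placeholder names (N0, N1, …) are choices I made for this sketch, not necessarily what the real pipeline uses:

```python
import re
from nltk.tokenize import word_tokenize

def tokenize_question(question):
    """Replace numbers with placeholders, split into words, drop punctuation."""
    numbers = []

    def mask(match):
        numbers.append(match.group())
        return f"N{len(numbers) - 1}"          # 3 -> N0, 5 -> N1, ...

    masked = re.sub(r"\d+(?:\.\d+)?", mask, question)
    tokens = [w.lower() for w in word_tokenize(masked) if w.isalnum()]
    return tokens, numbers

tokens, numbers = tokenize_question("Dana has 3 apples and buys 5 more.")
# tokens  -> ['dana', 'has', 'n0', 'apples', 'and', 'buys', 'n1', 'more']
# numbers -> ['3', '5']
```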

Then, I did something called Embedding, which means that I “translated” the words into vectors in a high-dimensional space, in a way that preserves their semantic and syntactic relationships. This was also a rather simple step, thanks to the Word2Vec algorithm in the Gensim package.
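A minimal sketch of that step, using the current Gensim API (older versions call the vector_size argument size; the hyperparameters here are just illustrative defaults):

```python
from gensim.models import Word2Vec

# `corpus` is a list of tokenized questions, e.g. the output of tokenize_question above.
corpus = [
    ["dana", "has", "n0", "apples", "and", "buys", "n1", "more"],
    ["how", "many", "trees", "are", "in", "the", "park"],
    # ... many more questions
]

# Each word becomes a vector in a 100-dimensional space.
model = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)

vector = model.wv["apples"]   # the 100-dimensional embedding of "apples"
```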

Let’s think of what we gained thus far. Let’s say my machine already knows how to solve questions where it is asked to count trees in a park. Now, if it also knows that steps in a solution relate to each other much like trees in a park do, then the first time I ask it to count steps in a solution, it will know to apply the same logic it already uses for counting trees.
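If the embedding has really captured that relationship, we can query the analogy directly. A hypothetical example with the model above (the output is made up; whether it actually comes out this way depends entirely on the training corpus):

```python
# "trees" is to "park" as "steps" is to ... ?
result = model.wv.most_similar(positive=["park", "steps"], negative=["trees"], topn=1)
print(result)   # hypothetically: [('solution', 0.83)]
```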

So now, actually, we kinda taught our machine to read.

Let’s continue to the main part, where we talk about the neural network that will extract the underlying equation.

Buzzword-Driven-Development

If you know a little bit about neural networks (if not, you can read this quick intro I wrote) then I can tell you straight away that the neural network I constructed for this task is A Sequence to Permutation Recurrent Neural Network With Long Short-Term Memory and Attention.

Damn that’s a mouthful.

Let’s take it in bit by bit.

First, the input is a textual math problem, hence, a sequence of words. And why would the output be a permutation? Because we expect every number the machine saw in the input question to appear in the output equation. So if we know what the structure of the equation is, we can reduce our problem to finding the correct permutation of the numbers from the input question.
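Here is a made-up example of what an input/output pair could look like under this framing (the question, the equation structure, and the slot names are purely illustrative):

```python
# Input: the tokenized, masked question, plus the numbers we kept on the side.
question = ["dana", "gave", "away", "n0", "of", "her", "n1", "apples",
            "how", "many", "are", "left"]
numbers = [3, 8]

# Suppose the known equation structure is: x - y
# Output: a permutation saying which input number fills which slot.
permutation = [1, 0]   # slot x <- numbers[1] = 8, slot y <- numbers[0] = 3

equation = f"{numbers[permutation[0]]} - {numbers[permutation[1]]}"   # "8 - 3"
```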

Now, Recurrent Neural Network (RNN) refers to a general architecture in which the network contains some sort of feedback loop. In our case, the network reads the question one word at a time (this is the Short-Term Memory part), but it also keeps track of the context in which each word was read (this is the Long-Term Memory part). In addition, let’s say that at the beginning of the question I added the sentence “I want to write a blog post on Medium”. Obviously this adds no information about the equation I want to solve, so I want the network to learn not to give too much Attention to these kinds of sentences.

The last missing piece is that we want to reintroduce the numbers we kept on the side earlier. We can do this by concatenating the input numbers with the output of the previous step, and flowing them through another layer of neurons.

All these aspects of the network’s architecture can be obtained using fairly basic features of the Keras package.
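To make that concrete, here is a minimal sketch of how such a network could be wired together with today's tf.keras API; the layer sizes, the self-attention layer, and the per-slot softmax outputs are simplifications of my own, not the exact architecture from the project:

```python
from tensorflow.keras import layers, Model

SEQ_LEN  = 40    # words per question (after padding)
EMB_DIM  = 100   # Word2Vec vector size
MAX_NUMS = 3     # how many numbers we keep on the side per question
N_SLOTS  = 3     # how many slots the equation template has

# Input 1: the question, already embedded with Word2Vec (a sequence of word vectors).
words = layers.Input(shape=(SEQ_LEN, EMB_DIM), name="embedded_question")
# Input 2: the numbers we hid during tokenization, reintroduced here.
numbers = layers.Input(shape=(MAX_NUMS,), name="question_numbers")

# LSTM: read the question word by word while keeping longer-range context.
hidden = layers.LSTM(128, return_sequences=True)(words)

# Attention: let the network weigh which words actually matter for the equation.
attended = layers.Attention()([hidden, hidden])
summary = layers.GlobalAveragePooling1D()(attended)

# Reintroduce the numbers and flow everything through one more layer of neurons.
merged = layers.Concatenate()([summary, numbers])
merged = layers.Dense(64, activation="relu")(merged)

# For each slot in the equation template, predict which input number fills it.
slots = [layers.Dense(MAX_NUMS, activation="softmax", name=f"slot_{i}")(merged)
         for i in range(N_SLOTS)]

model = Model(inputs=[words, numbers], outputs=slots)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```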


That’s All Folks

If you followed the steps thus far, voilà - you got yourself an algorithm that can read a textual math problem and extract the underlying equation!

And better yet, you now know one way to approach challenges involving text understanding in general.

Now you can ask yourself: what linguistic riddle do YOU want to solve? How can you use what you learned here to solve it?

These are truly exciting times to explore this fascinating domain, where increasingly complex tools become increasingly accessible, and the sky is no longer the limit to what you can achieve with a basic understanding of already existing mechanisms. The internet is full of Lego blocks. We just need to learn how to connect them.

[Watch this to learn more about my adventures working on this project.]
