
Skejul meetings with Deep Learning

Tired of trying to meet with someone and never finding a date or time? Me too.

The problem

It’s not easy to schedule a meeting when you have people all over the world, in different time zones, or even in the same room. We are all super busy, right?

Maybe you are not using your time correctly; maybe you have lots of meetings one day and zero on others. Wouldn’t it be nice to have a perfect calendar, with time to spare, and all your meetings and activities organized?

Guess what? Deep Learning to the rescue!

Skejul

http://skejul.com

I had the pleasure of meeting the CEO of Skejul (say it like "schedule"), Mr. Matthew Lamons, for a short explanation of the system and its backend.

Skejul is Artificial Intelligence (AI) that works on your behalf without requiring you to spend a lot of time setting preferences, declaring priorities, managing other people’s contact information, or maintaining information on the places you go and the things you like to do.

It also simplifies the complex process of coordinating activities involving people, places and things.

But how does it do it?

Sequence models in Deep Learning

What is a sequence?

In simple words, a sequence is a particular order in which related events, movements, or things follow each other. In the case of the Fibonacci sequence, those "things" are numbers, and they are related by a function that we call a recurrence relation: F(n) = F(n-1) + F(n-2), with F(1) = F(2) = 1.

If you are a programmer maybe you’ll find this Python snippet more useful:

def fib(n):
    if n == 1 or n == 2:
        return 1
    return fib(n - 1) + fib(n - 2)

This code is recursive. A recursion occurs when you define a problem in terms of itself.
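
For example, calling the function above for the first ten terms reproduces the sequence:

# First ten Fibonacci numbers produced by the recursive function above
print([fib(n) for n in range(1, 11)])  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]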

So that’s all cool with programming and mathematics. But what does it have to do with Deep Learning?

Deep Learning With Python, F. Chollet. Page 196.

In Deep Learning there’s a specific type of Neural Network (NN) that is able to process sequences by iterating through the sequence elements and maintaining a state containing information relative to what it has seen so far. It’s like they have a "memory". It’s a way of processing information incrementally as it comes.

This type of NN is called a Recurrent Neural Network (RNN): a neural network that has an internal loop, and whose state is reset between processing two different, independent sequences.
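
To make that idea concrete, here is a minimal sketch of the loop in plain NumPy (this is just the general idea, not Skejul’s model; the dimensions and the weight names Wx, Ws and b are illustrative assumptions):

import numpy as np

timesteps, input_dim, state_dim = 10, 4, 8

inputs = np.random.random((timesteps, input_dim))   # one input sequence
state_t = np.zeros(state_dim)                       # initial "memory"

# Random weights, just for illustration
Wx = np.random.random((state_dim, input_dim))
Ws = np.random.random((state_dim, state_dim))
b = np.random.random(state_dim)

outputs = []
for input_t in inputs:
    # Each output depends on the current input AND on what was seen before
    state_t = np.tanh(np.dot(Wx, input_t) + np.dot(Ws, state_t) + b)
    outputs.append(state_t)

# The state would be reset to zeros before processing an independent sequence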

RNN for time-series

So one way Skejul looks to solve the problem is by using RNNs (and other kinds of NN) for time-series. A time-series is data collected over many periods of time (weekly, daily, etc.).

It’s a sequential type of data where the elements are related through time. For example, stock prices, the prices of cryptocurrencies, and much more.

RNNs are particularly good for time-series prediction; by that I mean trying to predict the value of x at a point in time using the values of x before that single time point.
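
A common way of framing that as a learning problem (a sketch, not Skejul’s pipeline; make_windows, window_size and the synthetic prices series are just illustrative) is to cut the series into fixed-size windows and predict the value that follows each window:

import numpy as np

def make_windows(series, window_size):
    # Turn a 1-D series into (samples, window_size) inputs
    # and the value immediately after each window as the target
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])
        y.append(series[i + window_size])
    return np.array(X), np.array(y)

prices = np.sin(np.linspace(0, 20, 200))   # stand-in for e.g. daily prices
X, y = make_windows(prices, window_size=50)
print(X.shape, y.shape)                    # (150, 50) (150,)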

While RNNs are very useful for sequential data, they have some problems, like learning "long-term dependencies": think of the gap between the relevant information and the point where it is needed becoming very large.

To solve that, Hochreiter & Schmidhuber (1997) introduced Long Short-Term Memory networks (LSTMs), a special kind of RNN capable of learning long-term dependencies.

For more on LSTMs check this amazing post.

LSTM model in Keras (code adapted from here)

Don’t worry if you don’t get the code. It’s just to show how easy it is to build an LSTM in Keras. I’ll be doing introductory posts on Keras.

import time

from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, LSTM


def build_model(layers):
    # layers = [input_dim, sequence_length, hidden_units, output_dim]
    model = Sequential()

    # First LSTM layer returns the full sequence so it can feed the next one
    model.add(LSTM(
        layers[1],
        input_shape=(layers[1], layers[0]),
        return_sequences=True))
    model.add(Dropout(0.2))

    # Second LSTM layer returns only its last output
    model.add(LSTM(
        layers[2],
        return_sequences=False))
    model.add(Dropout(0.2))

    # Linear output layer for regressing the next value
    model.add(Dense(layers[3]))
    model.add(Activation("linear"))

    start = time.time()
    model.compile(loss="mse", optimizer="rmsprop")
    print("> Compilation Time : ", time.time() - start)
    return model

With this simple code you can create a sequential model in Keras :).
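
As a hypothetical usage example (the layer sizes and the X_train/y_train arrays below are assumptions, in the spirit of the time-series example this code was adapted from):

# layers = [input_dim, sequence_length, hidden_units, output_dim]
model = build_model([1, 50, 100, 1])

# X_train would have shape (samples, 50, 1) and y_train shape (samples, 1);
# training would then look something like:
# model.fit(X_train, y_train, batch_size=512, epochs=2, validation_split=0.05)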

Time-series and Skejul

So what Skejul does is use its own proprietary algorithms to make an educated guess about the best time and date for the creation of a meeting with any number of guests.

This algorithm, as Matthew told me, uses Deep Learning technologies, and they’ve been exploring ConvNets, Residual NNs and Deep Reinforcement Learning.

But what about the data? You may be thinking: how are they training their NN?

This is the fun part, you can be part of the private beta and the algorithm can learn from you too! You can sign up here.

A lot of what Skejul does is get data from your calendar: not only your meetings, but also your replies to them, when you accept them, when you decline them, and much more.

They will be launching the free version of the software around Spring of this year, and the commercial version around mid-Summer. There’s so much to come for Skejul, and new functionalities will be added incrementally, like a video platform to hold meetings inside the app, adding media to a meeting, and also a chat.

They claim that with their algorithms and platform, their Skejul Activity Forecasting Engine (SAFE)™ can give you 75x better suggestions for your meetings than uninformed guesses. They also already have pre-activated members from over 25 countries and from cities such as London, Mumbai, Mexico City, San Francisco, Singapore, Berlin, Rome, Buenos Aires, Dublin, Dubai, Hong Kong and many more!

Stay close for more info ;).

Summary

  • Scheduling meetings is not a trivial task, so let’s use Deep Learning and AI to solve it.
  • Deep Learning is a very important field with lots of applications; the one shown here is predicting values in sequence models.
  • A sequence is a particular order in which related events, movements, or things follow each other.
  • Recurrent Neural Networks (RNNs) are a type of neural network that has an internal loop and whose state is reset between processing two different, independent sequences. They possess something similar to a memory.
  • RNNs have problems learning "long-term dependencies", so LSTMs come to the rescue. LSTM (Long Short-Term Memory) networks solve this problem.
  • You can implement an LSTM model in Keras really easily and quickly.
  • Skejul is using RNN, LSTMs, ConvNets, Residual NN, Deep Reinforcement Learning and more to solve the problem of scheduling meetings in real time with different people around the globe.

For more information follow me on Linkedin:

Favio Vázquez | LinkedIn

