
NeutonAI: Introducing a Remarkably Effective No-Code AutoML Platform

Yes, even your grandma can build machine learning models now

Photo by Samia Liamani on Unsplash

Disclaimer: This is not a sponsored article. I don’t have any affiliation with Neuton or its creators. This article offers an unbiased overview of the AutoML solution, with the aim of making data science tools accessible to a broader audience.

It seems like every day there’s a new kid on the AutoML block. NeutonAI is one of those kids, and it makes you wonder why you would spend years learning data science if three clicks can get you 90% of the way there. For most of us, 90% is enough.

Nerd alert: NeutonAI has a patented neural network framework under the hood that works completely differently from, for example, TensorFlow. According to the creators, this is what distinguishes Neuton from other AutoML solutions, most of which rely on well-known algorithms. You can read more about it here.

Today you’ll see what this AutoML solution has to offer. You’ll also build and evaluate your first machine learning model on the platform with just a couple of clicks – no prior knowledge required.

The article is structured as follows:

  • Getting started with NeutonAI – Registration and setup
  • Creating a new AutoML project
  • Training machine learning models with NeutonAI
  • Evaluating machine learning models
  • Making predictions

Getting started with NeutonAI – Registration and setup

You’ll have to set up an account with Neuton and Google Cloud before building machine learning models. This section describes the process, but feel free to check their official Getting Started document if you get stuck.

You’ll want to head over to the Try Neuton for Free page on their website. It should look somewhat like this:

Image 1 – NeutonAI registration page (image by author)

You’ll have to fill in your name and phone details, and afterward, an option to connect your Google Cloud account will appear. It’s a mandatory step, so either connect your cloud account or create a new one.

The next step is to choose a plan, assuming you have a Google Cloud account ready. The Gravity plan is entirely free, so you’ll start with that one. A new Google Cloud account comes with $300 free credits, which might come in handy down the road:

Image 2 – Google Cloud setup (image by author)

You should leave all options at their default values and agree to the terms of use. After doing so, you’ll be able to finalize the Google Cloud project setup, which will redirect you back to the Neuton page.

Assuming everything went well, you’ll see a button to create a new project:

Image 3 – Neuton AI project homepage (image by author)

And that’s it – you’re ready to create your first AutoML project on the platform.


Creating a new AutoML project

A new modal window will pop up once you click on the Add Solution button. It will ask you to specify the project name and description. We’ll create a machine learning project based on the well-known Titanic dataset:

Image 4 – Creating a new project (1) (image by author)

The next step is the dataset. Neuton gives you the option to either upload your own dataset or select one of theirs. You’ll find many datasets in their library, Titanic included:

Image 5 – Creating a new project (2) (image by author)

After choosing the dataset, you’ll have to select a target variable (what you want to predict). The Survived column tells you whether a passenger survived the disaster, so that’s what you’ll want to select:

Image 6 – Creating a new project (3) (image by author)

And finally, you can click on the Start Training button. Leave every other option as-is. You should use the TinyML option only if you’re building models for microcontrollers – which you aren’t today.

Image 7 – Creating a new project (4) (image by author)

The training process will now start – grab yourself a cup of coffee while waiting.


Training machine learning models with NeutonAI

There isn’t a whole lot to the model training process. It’s already started, so all you have to do is sit and wait:

Image 8 – Training machine learning models (1) (image by author)

NeutonAI handles data processing and feature engineering for you. You don’t have to lift a finger. No more hours spent dealing with messy data – everything is dealt with automatically, provided your data is reasonably formatted.

Training a model for the Titanic dataset took around 5 minutes for me. Once done, you should see the following message:

Image 9 – Training machine learning models (2) (image by author)

And that’s all there is to it – you can now evaluate the model and make predictions on new data.


Evaluating machine learning models

Neuton is once again dead simple. An EDA dashboard is automatically created for you, and that’s what we’ll explore first.

Click on My Solutions – Analytics Tools. A menu like this should pop up:

Image 10 – Evaluating machine learning models (1) (image by author)

Neuton generates the entire dashboard after you click on the Exploratory Data Analysis option:

Image 11 – Evaluating machine learning models (2) (image by author)

You can also click on the Feature Importance Matrix option – here’s what it shows:

Image 12 – Evaluating machine learning models (3) (image by author)

You can also gain additional insights into training metrics by going over to the Training tab. This will show you a model quality diagram and a confusion matrix:

Image 13 – Evaluating machine learning models (4) (image by author)

There are plenty of convenient metrics for business users that help them understand whether the model is accurate enough. The only relevant metric I don’t see here is MAPE, but maybe it’ll be added in future releases. You can learn more about the model quality diagram here.
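As a back-of-the-envelope illustration of what such a dashboard reports, here’s how the standard binary classification metrics fall out of a confusion matrix. The counts below are made up for illustration – they’re not from the Titanic model:

```python
def binary_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Accuracy, precision, and recall from confusion-matrix counts."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,  # overall fraction of correct predictions
        "precision": tp / (tp + fp),    # how trustworthy a "survived" prediction is
        "recall": tp / (tp + fn),       # how many actual survivors were caught
    }

print(binary_metrics(tp=50, fp=10, fn=20, tn=120))
```

Tools like the model quality diagram essentially package numbers like these into a single visual verdict.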

To summarize, Neuton comes with many options for evaluating machine learning models. Of course, there’s more to the story, but it’s a topic for another time.

Finally, let’s see how you can make predictions for previously unseen data.


Making predictions

To start, open your solution and click on the pink button that says Enable under Prediction:

Image 14 – Making predictions (1) (image by author)

Doing so will enable both the web interface and the REST API, so you can choose a method that best suits your needs. Let’s cover the web interface first.

Web interface

The Web Prediction interface is useful when you have separate datasets for training and testing. For example, the Titanic dataset includes two CSV files. You’ve trained the model on the first one, and now it’s time to evaluate the performance on the second one.

That’s where the web interface shines.

The only gotcha is that the free version is limited to predicting 100 rows of a CSV file at a time, but you can split your file into parts and keep using the free version as long as you want.
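If you want to automate that workaround, a minimal sketch with pandas could look like this – the chunk size of 100 matches the free-tier limit described above, and the column name is just an example:

```python
import pandas as pd

def split_into_chunks(df: pd.DataFrame, chunk_size: int = 100) -> list[pd.DataFrame]:
    """Return consecutive slices of df, each with at most chunk_size rows."""
    return [df.iloc[i:i + chunk_size] for i in range(0, len(df), chunk_size)]

# Example: a 250-row frame splits into chunks of 100, 100, and 50 rows
df = pd.DataFrame({"PassengerId": range(250)})
chunks = split_into_chunks(df)
```

You would then save each chunk with `chunk.to_csv("part_0.csv", index=False)` and upload the parts one by one.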

Click on the Start button under Prediction – Web Prediction:

Image 15 – Making predictions (2) (image by author)

It will ask you to select the CSV file, so choose the one made available by Neuton:

Image 16 – Making predictions (3) (image by author)

Once finished, you can either download the predictions or view them online as a table:

Image 17 – Making predictions (4) (image by author)

Select the option to view the predicted data online. Here’s how it will look:

Image 18 – Making predictions (5) (image by author)

It’s useful that you also get the probabilities, so you can tweak the decision threshold manually if needed.
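To make that concrete, here’s a minimal sketch of applying a custom threshold to downloaded probabilities. The default cutoff of 0.5 is an assumption; you might lower or raise it depending on which kind of error is more costly for your use case:

```python
def apply_threshold(probabilities: list[float], threshold: float = 0.5) -> list[int]:
    """Convert predicted probabilities into 0/1 class labels."""
    return [1 if p >= threshold else 0 for p in probabilities]

# With the default 0.5 cutoff, 0.55 counts as class 1;
# raising the threshold to 0.6 flips that prediction to 0
labels_default = apply_threshold([0.2, 0.55, 0.8])
labels_strict = apply_threshold([0.2, 0.55, 0.8], threshold=0.6)
```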

Speaking of adjustments, you can click on any prediction row to open up the Model Interpreter. It allows you to change parameters, watch how the predictions change, and gain additional insight into the reasoning behind the model. Here’s how it looks:

Image 19 – Making predictions (6) (image by author)

The whole interpreter is a bit difficult to find. I hope it’ll be placed in a more visible location in future updates. You can read more about it here.

Let’s explore the REST API option next.

REST API

APIs are more or less a standard approach to putting machine learning models into production. The idea is to expose a model’s predictive functionality so anyone can access it, even people without technical knowledge. You can also build apps around REST APIs, so it’s easy to understand why they are a go-to approach in production.

Neuton does all of that for you.

You’re limited to 5000 API requests in the free version, but that will be more than enough for the needs of a small business or project.

Click on the Access instructions under Prediction – REST API Access. You’ll have all the details there:

Image 20 – Making predictions (7) (image by author)

Simple, isn’t it? It will save you hours if you don’t need something extra-specific. And most of the time, you don’t.
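To give a feel for what calling such an API involves, here’s a hypothetical sketch of assembling a prediction request in Python. The header names, token format, and passenger fields below are assumptions for illustration – copy the exact URL and request format from the Access instructions page of your own solution:

```python
import json

def build_prediction_request(passenger: dict, token: str) -> tuple[dict, str]:
    """Return (headers, JSON body) for a single-row prediction call.

    The Bearer auth scheme and JSON payload shape are assumed here;
    follow your solution's access instructions for the real format.
    """
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return headers, json.dumps(passenger)

# Hypothetical Titanic-style input row
headers, body = build_prediction_request(
    {"Pclass": 3, "Sex": "male", "Age": 22, "Fare": 7.25},
    token="your-token-here",
)
```

You would then send the request with something like `requests.post(url, data=body, headers=headers)`, where `url` is the endpoint shown in the access instructions.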


The verdict

You now know how to go from a dataset to a deployed machine learning model. The question remains – Is NeutonAI the right tool for the job?

It depends.

If you’re a data-science-first company providing tailored solutions based on SOTA research, the NeutonAI platform isn’t for you. No AutoML solution will make the cut. However, NeutonAI may still be interesting as a baseline, since, according to the creators, its patented neural framework automatically builds optimal models.

On the other hand, if you’re a business user who doesn’t need anything ultra-specific, Neuton might be just what you’re looking for. Even the most premium version will cost you less than the monthly salary of a mid-level data scientist, and it can definitely do more.

What are your thoughts? Have you tried NeutonAI? Feel free to drop your thoughts in the comment section.



