Geospatial Analytics & Insights

Carto vs Kepler.gl: Which tool to choose for location analytics?

A comparison of the geospatial tools for different use cases

Aditi Sinha
Towards Data Science
7 min read · May 1, 2020


A brief introduction to the tools

Carto and Kepler.gl are tools that companies use in their analyses to make decisions about on-ground operations. In this piece, we compare them on factors ranging from technology to target users.

Kepler.gl


Kepler.gl is a web-based platform to visualize large-scale location data. It was created and open-sourced by the visualization team at Uber with the mission to create industry-grade open source frameworks to supercharge big data.

Carto


Carto (formerly CartoDB) is an open-source location intelligence platform built on PostgreSQL and PostGIS. It is offered as a Software-as-a-Service (SaaS) cloud platform that provides GIS, web mapping, and spatial data science tools.

Locale.ai


At Locale.ai, we are building an “operational” analytics platform that uses location data for supply and operations teams at on-demand companies. Just as web analytics tools (Google Analytics, Mixpanel, CleverTap) provide analytics for web products using clickstream data, Locale helps you analyze and optimize your on-ground operations using geospatial data.

Comparison

The summary table below compares the three tools in a nutshell on different features and parameters. This section dives deeper into each of them.

Comparison Summary Table

1- Analysis & Visualizations

Kepler.gl

Kepler is one of the best geospatial visualization tools for exploratory data analysis (EDA). The Uber team has done thorough and detailed research on the kind of visualization features relevant to mobility companies.

One of its biggest drawbacks is that any processing of the data needs to happen outside the tool. Kepler.gl is, unfortunately, a standalone tool that takes data only in the form of CSV, JSON, and GeoJSON files.

These characteristics make it very painful when you are dealing with large amounts of data and need to keep downloading CSVs to plot in the tool.
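Since Kepler.gl only ingests flat files, that outside pre-processing usually lives in a short script. A minimal sketch (the record fields here are hypothetical) that wraps raw lat/lon rows into a GeoJSON FeatureCollection Kepler.gl can load:

```python
import json

# Hypothetical raw records, e.g. pulled from a database or an API.
records = [
    {"id": 1, "lat": 37.7749, "lon": -122.4194, "pickups": 12},
    {"id": 2, "lat": 37.8044, "lon": -122.2712, "pickups": 7},
]

def to_geojson(rows):
    """Wrap flat lat/lon rows in a GeoJSON FeatureCollection.

    GeoJSON coordinates are [longitude, latitude], in that order.
    """
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {k: v for k, v in r.items() if k not in ("lat", "lon")},
        }
        for r in rows
    ]
    return {"type": "FeatureCollection", "features": features}

# Dump to a file that can be dragged into Kepler.gl.
with open("pickups.geojson", "w") as f:
    json.dump(to_geojson(records), f)
```

Every extra attribute you keep in `properties` becomes a column Kepler.gl can color, size, or filter by.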

Carto

Carto’s visualization and analysis capabilities are much more horizontal, casting a wide net over the use cases they tackle as a company, and include different kinds of filters and widgets. They also ship several built-in analyses (pre-built functions that help you manipulate data) inside the product.

The downside here is that these analyses can only be run on one dataset at a time. Hence, the pre-processing also needs to happen outside the system, especially for more complicated datasets.
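In practice, that means multi-table work (say, attaching zone attributes to orders) is denormalized into a single dataset before upload. A small sketch, with entirely hypothetical column names:

```python
# Hypothetical datasets: since analyses run on one dataset at a time,
# joins like this are done in a script before uploading to the tool.
orders = [
    {"order_id": "A1", "zone_id": 3, "delivery_min": 28},
    {"order_id": "A2", "zone_id": 5, "delivery_min": 41},
]
zones = {3: {"zone_name": "Downtown"}, 5: {"zone_name": "Harbor"}}

def pre_join(orders, zones):
    """Denormalize zone attributes onto each order row (left join)."""
    return [{**o, **zones.get(o["zone_id"], {})} for o in orders]

joined = pre_join(orders, zones)
```

The flattened `joined` rows can then be exported as a single CSV and analyzed inside the tool.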

Locale

At Locale, we have tried to include the best of both worlds. We have taken inspiration from Kepler for visualizations and added analyses to the product, not as functions but as use cases. We also ingest all your data across different databases, formats, and systems, bring it into a single place, and model it so that these analyses become easy to carry out.


2- Target Personas

Kepler.gl

Kepler.gl mostly targets business users who don’t write code or SQL, thanks to its sheer simplicity. That said, we have seen a vast number of data scientists at companies such as Google and Airbnb adopt it for visualizations.

Carto

Carto is positioned as a location intelligence platform with an aptitude for data analysis and visualization that does not require prior GIS or development experience. Its UX is tailored to the needs of modern web developers and data scientists, while its visualizations can be used by business users to make decisions.

Locale

Our target users at Locale are business users as well: supply and operations teams, product teams, strategy teams, and marketing teams inside companies. We have found that they are the ones in maximum pain, often stuck with Excel sheets and still dependent on an analyst for BI reports or on developers for live dashboards.

3- Target Industries

Kepler.gl

As mentioned before, since Kepler was open-sourced by Uber, quite a lot of its features are most relevant to mobility companies. It is also used heavily by academics and journalists, since they don’t deal with large amounts of live-streaming data.

Carto

Since Carto acts more like a platform, they end up working with a wide variety of clients, ranging from governments to finance, healthcare, and logistics.

Locale

We think of our market as every company with moving supply and/or demand. This includes on-demand delivery, micro-mobility, logistics, supply chain, as well as workforce companies.


4- Use Cases

Kepler.gl

Kepler.gl is mainly used for geospatial visualizations. Some examples of maps that have been built on Kepler are:

  • California Earthquakes
  • New York Cab Rides
  • San Francisco Elevation Contours
  • New York City Population
  • San Francisco Street Tree Map

Carto

The use cases that Carto tackles are:

  • Site Planning: Driving site selection and investment decisions
  • Logistics Optimization: Optimizing supply chain design
  • Territory Optimization: Aligning sales and service territories
  • Geomarketing: Targeting users based on where they are & how they move
  • Mobility Planning: Better infrastructure decisions & reducing traffic

Locale

The use cases that we at Locale tackle are:

  • Lifecycle Analysis: Analysis of the user, rider or order journey to reduce drop-offs.
  • Supply-Demand Analysis: To analyze mismatch in demand (orders) and supply (riders) and reduce idle time.
  • Trip analysis: To analyze trips and movement to increase utilization.
  • Single Trip Analysis: To create profiles of a single user, bike or trip based on movement patterns.
  • Station Location Analysis: To analyze the performance of static (or fixed) entities.
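The supply-demand analysis above boils down to binning riders and orders into spatial cells and comparing counts. A toy sketch of that idea, using a simple rounding-based grid (real systems would typically use a hexagonal index such as H3, and the data here is hypothetical):

```python
from collections import Counter

def grid_cell(lat, lon, size=0.01):
    """Snap a (lat, lon) point to a roughly 1 km square grid cell."""
    return (round(lat / size) * size, round(lon / size) * size)

def mismatch(orders, riders, size=0.01):
    """Per-cell demand minus supply; positive means undersupplied."""
    demand = Counter(grid_cell(*o, size) for o in orders)
    supply = Counter(grid_cell(*r, size) for r in riders)
    return {cell: demand[cell] - supply[cell]
            for cell in demand.keys() | supply.keys()}
```

Cells with large positive values are where riders sit idle too rarely and orders wait too long; large negative values flag oversupply.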

5- Technology

Kepler.gl

Kepler.gl is built from four major suites: deck.gl, luma.gl, react-map-gl, and react-vis, which together make beautiful data-driven maps. It is built with deck.gl and uses WebGL (a JavaScript graphics API) to render large datasets quickly and efficiently.

Carto

Carto’s web app is called Builder, where users can manage data, run user-side analyses, and design custom maps. The CARTO Engine, a set of APIs and developer libraries, is built for creating custom maps and data visualization interfaces. The tool uses JavaScript extensively in the front-end web application, in the backend Rails-based APIs, and in its client libraries.

Locale

Locale.ai uses a wide range of powerful open-source tools to handle large-scale datasets on both the frontend and the backend. The frontend is powered by Uber’s deck.gl for high-performance visualizations, nebula.gl for additional editing capabilities, and Mapbox GL for rendering maps.

Unlike other platforms, Locale can ingest large amounts of data both in real time and on demand, to analyze and gain insights on the fly. The backend is powered by Python, PostgreSQL, and PostGIS for powerful data processing and geospatial operations.
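To make “geospatial operations” concrete: a primitive such backends lean on constantly is great-circle distance (what PostGIS exposes as `ST_Distance` over the `geography` type). A plain-Python haversine sketch of the same idea:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    # Haversine formula: a is the squared half-chord length.
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

In production this computation is pushed down into the database, where spatial indexes make it fast over millions of rows; the Python version is just the shape of the operation.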

Building it Internally

While working with some of the leading on-demand and micro-mobility companies, we observed that they end up hacking around open-source tools like Kepler itself or QGIS.

Sometimes developers build their own internal tools, but most of the time these are not well suited to the different audiences inside the company, and since they are not built in a scalable way, maintaining them often sucks up a lot of the developers’ bandwidth.

A lot of the time there is even repetition of effort, and the wheel keeps getting reinvented over and over again. As Jeff Lawson of Twilio said:

“It is easier than ever to build software but harder than ever to operate it”.

Why Choose Locale?

So, if you are a company that decides to build this internally, it would have to be built like a platform in itself (much like how Uber has done it), with the following characteristics:

  • Simple and intuitive user interface to carry out analyses, especially for business users
  • Scalable geospatial visualizations with actionability
  • ETL robust enough to handle streaming data as well as historical analysis to go back in time

This would require setting up a team of at least 6–7 people (front-end and data engineers, geospatial analysts, as well as data scientists). On top of that comes the added pain of waiting at least 6 months to build the platform and kickstart the analyses.

If you want to delve further, check our website out or get in touch with me on LinkedIn or Twitter.
