
How to build R Shiny apps that update themselves

Ditch that repetitive manual analysis. Relax and let R do all the work.

The R ecosystem is in a really good place right now. In particular, the infrastructure that allows people to interact with data through Shiny apps offers incredible potential to reduce the workload of a data analyst.

It is now possible to build Shiny apps that update themselves on a regular basis, pulling in refreshed data so that people are always looking at the most up-to-date analysis. With one up-front development sprint, an analyst can reduce their ongoing analytic workload on a particular topic by close to 100%.

What you will need

  1. A project where you are delivering reports in the same format on a regular basis, and where all the data is available in a database.
  2. R, R Markdown and R Shiny skills
  3. Access to a Linux server running R with Shiny Server installed, or, if you are really lucky, access to the RStudio Connect publishing platform, which makes this entire process much easier.

Step 1: Build your app against a local data source

Write your Shiny app so that it deploys successfully on your local machine, using a local data source in the form of a .csv, .feather, or .RData file, or even a local SQLite database. Ensure that it runs as expected.
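One way this step might look, as a minimal sketch (the file name and columns here are hypothetical), is an app that reads a local extract once at startup:

```r
library(shiny)
library(readr)

# Read the local extract once at startup; swap in arrow::read_feather()
# or load() for .feather / .RData sources
sales <- read_csv("local_data.csv")  # hypothetical file with region and amount columns

ui <- fluidPage(
  selectInput("region", "Region", choices = unique(sales$region)),
  plotOutput("trend")
)

server <- function(input, output, session) {
  output$trend <- renderPlot({
    subset_data <- sales[sales$region == input$region, ]
    plot(subset_data$amount, type = "l", xlab = "Observation", ylab = "Amount")
  })
}

shinyApp(ui, server)
```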

Step 2: Write an ETL script or document that can run on the server

Write a script in R that pulls the raw data from the remote database and transforms it into the form needed for your Shiny app. You will then need to write this data somewhere it can be accessed remotely by your Shiny app (a minimal sketch follows the list below). Options include:

  • Writing back to the source database in a new table (if you have permission to do so)
  • Writing into the file system of the Linux server that hosts your Shiny Server
  • If you are working with RStudio Connect, writing into its cache at /var/cache/data.
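A minimal ETL sketch along these lines, where the DSN, table names, and output path are all placeholders (/srv/shiny-server is Shiny Server's default site directory):

```r
library(DBI)
library(dplyr)

# Connect to the source database (the DSN is a placeholder)
con <- dbConnect(odbc::odbc(), dsn = "my_dsn")

# Pull the raw data and transform it into the shape the app needs
raw <- dbGetQuery(con, "SELECT * FROM raw_sales")

clean <- raw %>%
  filter(!is.na(amount)) %>%
  group_by(region) %>%
  summarise(amount = sum(amount), .groups = "drop")

# Option 1: write back to the source database in a new table
dbWriteTable(con, "shiny_sales", clean, overwrite = TRUE)

# Option 2: write into the file system that hosts your Shiny Server
saveRDS(clean, "/srv/shiny-server/myappname/data/clean_sales.rds")

dbDisconnect(con)
```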

Step 3: Deploy your script to your Linux server or on RStudio Connect

You’ll need to set up your database connections in R on your Linux server. Here is a good guide to this. Then you will need to edit your script to point to these connections.
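One common pattern, sketched here with placeholder values, is to keep credentials out of the script itself, for example in a config.yml file read by the config package:

```r
library(DBI)

# Pull connection details from config.yml so credentials stay out of
# source control (the "warehouse" entry and its fields are placeholders)
dw <- config::get("warehouse")

con <- dbConnect(
  RPostgres::Postgres(),  # assumes a Postgres source; use the driver for your database
  host     = dw$host,
  dbname   = dw$dbname,
  user     = dw$user,
  password = dw$password
)
```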

If you are writing your output to the Linux file system, you will need to ensure it is in a directory that the Shiny Server can access.

If you are operating in RStudio Connect, you can wrap your script in an R Markdown document and publish it with its source code, so that the code executes whenever the document is refreshed, performing a fresh ETL each time.
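A minimal wrapper might look like this, with a single chunk that simply sources the ETL script (the file name is hypothetical):

````markdown
---
title: "Sales ETL"
output: html_document
---

```{r etl}
# Re-run the ETL every time RStudio Connect renders this document
source("etl_script.R")  # hypothetical path to the script from Step 2
```
````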

Step 4: Deploy your Shiny app to the Shiny Server or to RStudio Connect

You’ll need to edit your app’s code to point to the remote data source, whether it’s a database or somewhere in the file system. Then you can copy the app files into a subdirectory of shiny-server on the Linux server. The URL of the app will be the URL of the Shiny Server followed by the subdirectory name, e.g. https://my.shiny.server/myappname.
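The only change to the app itself is where it loads its data from; continuing the hypothetical paths and table names from the Step 2 sketch:

```r
# Point the deployed app at the remote copy of the data
sales <- readRDS("/srv/shiny-server/myappname/data/clean_sales.rds")

# ...or, if the ETL wrote back to the database instead:
# con   <- DBI::dbConnect(odbc::odbc(), dsn = "my_dsn")
# sales <- DBI::dbGetQuery(con, "SELECT * FROM shiny_sales")
```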

If working in RStudio Connect, you simply publish as normal and customise the URL inside RStudio Connect.

Step 5: Set up auto refresh of data

If your script is sitting on your Linux server, you can use various scheduling tools to run it automatically at the desired interval. cron is a simple way to do this: just edit your crontab (for example with crontab -e) and add an appropriate line. For example, to set your script to run at 2am every Sunday, add a line as follows:

0 2 * * 0 Rscript /mypath/myscript
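One caveat: cron jobs run with a minimal PATH, so if the job fails silently, try giving the full path to Rscript (the location below is an assumption; check yours with `which Rscript`):

```
0 2 * * 0 /usr/local/bin/Rscript /mypath/myscript
```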

This will run your script automatically at the desired frequency, pulling the raw data and overwriting the files that your app accesses with new data.

If you are using RStudio Connect, you can schedule the R Markdown document containing your script to refresh at a set interval using the settings menu for the document.

So there you have it: an entirely automated app that refreshes itself. Think of all the time you could save by setting this up. You’re welcome.

You can find out more about Shiny Server and RStudio Connect here.

_Originally I was a Pure Mathematician, then I became a Psychometrician and a Data Scientist. I am passionate about applying the rigor of all those disciplines to complex people questions. I’m also a coding geek and a massive fan of Japanese RPGs. Find me on LinkedIn or on Twitter._

