Scheduled backup of WSL2

How to use Task Scheduler to schedule automatic backups of your WSL2 setup including data

Dr Martin McGovern PhD FIA
CodeX

--

Windows Subsystem for Linux is fantastic in that it offers essentially a full Linux kernel on a Windows operating system, with access both to data you store on the Windows side and to data stored within WSL itself.

Best practice is to store all the data you use with Linux inside the WSL installation rather than continually accessing it via the Windows C: drive. Keeping data within WSL gives faster read/write times and avoids the file and directory permission issues that can arise between Linux and Windows files.

However, the risk is that if your WSL installation gets corrupted for whatever reason, you lose all the data stored within it. To guard against this, I schedule an automatic backup of my entire WSL system to a tar file on a weekly basis.

This results in a single tar file containing my entire WSL installation and all its data, stored as a safe and secure backup. I can easily restore my full WSL installation, data included, from this file. As an added benefit it gives me portability: I can transfer my WSL setup to a flash drive and use it on another computer.

How to manually back up/restore/remove WSL2

To back up and restore a WSL2 setup we run the commands below in PowerShell.

Remove

Simply run the command below to remove a setup called Ubuntu-test:
wsl --unregister Ubuntu-test

Backup

To create a backup of the current setup simply run:
wsl --export distro_name file_name.tar

For example, to back up a WSL distribution you have named Ubuntu-18.04 to a file called ubuntu.tar, simply run:
wsl --export Ubuntu-18.04 ubuntu.tar

Restore

To restore a backup simply run:
wsl --import distro_name install_location file_name.tar

For example, to create a setup called Ubuntu-18.04 from a previous WSL backup file called ubuntu.tar, saving the setup in C:\Users\martin\ubuntu, run:
wsl --import Ubuntu-18.04 C:\Users\martin\ubuntu ubuntu.tar

If you want to match where Windows normally installs distributions by default, they generally live in their own folder under C:\Users\NAME\AppData\Local\Packages.

For example, to create a setup called ubuntu_test from a backup file called wsl_ubuntu_20.tar:
wsl --import ubuntu_test C:\Users\mmcgo\AppData\Local\Packages\ubuntu_test .\wsl_ubuntu_20.tar --version 2
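After an import, you can check that the new distribution is registered and running under WSL 2 before going any further:

```powershell
# List all installed distributions along with their state and WSL version.
wsl --list --verbose
```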

Correct the registry key for the restored setup

Note that this newly restored distribution will log in as root, so you need to change its registry key to the default user UID of 1000.

First open the registry editor by hitting Windows Button + R, typing regedit and hitting return. This should open up the registry. Next navigate to HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Lxss

Now look down the list of registry entries there and find the one that matches the name you gave your distribution. Then open its DefaultUid value and change it to 1000 in decimal (not hexadecimal, as this will not work).
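If you prefer to script this rather than edit the registry by hand, the same change can be made in PowerShell. This is a sketch, assuming your distribution is named ubuntu_test; each subkey under Lxss stores a DistributionName and a DefaultUid value:

```powershell
# Find the Lxss subkey whose DistributionName matches our distribution,
# then set DefaultUid to 1000 (decimal) so it logs in as the default user.
$lxss = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Lxss"
Get-ChildItem $lxss |
    Where-Object { (Get-ItemProperty $_.PSPath).DistributionName -eq "ubuntu_test" } |
    ForEach-Object { Set-ItemProperty $_.PSPath -Name DefaultUid -Value 1000 }
```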

Next time you launch your restored WSL setup it should start normally with your usual setup rather than as root.

Set up a regular backup on login for WSL2

To do this we use Task Scheduler, but first we need two scripts: a PowerShell script that runs the actual WSL backup command, and a CMD script that sets the correct execution policy (so Task Scheduler can execute the file) and then runs the PowerShell script.

PowerShell Script to create WSL backup

Create a new PowerShell script in any editor and call it wsl_backup.ps1. The script should contain the single line below, which backs up a distribution called Ubuntu-20.04 to a file called wsl_ubuntu_20.tar (overwriting the file if it already exists).

wsl --export Ubuntu-20.04 C:\Users\mmcgo\OneDrive\Desktop\wsl_setup\wsl_backups\wsl_ubuntu_20.tar
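If you would rather keep a rolling history of backups instead of overwriting a single file each week, a small variant of the script can stamp the file name with the date (the path here matches the example above; substitute your own):

```powershell
# Build a dated file name, e.g. wsl_ubuntu_20_2024-01-07.tar, and export to it.
$stamp = Get-Date -Format "yyyy-MM-dd"
wsl --export Ubuntu-20.04 "C:\Users\mmcgo\OneDrive\Desktop\wsl_setup\wsl_backups\wsl_ubuntu_20_$stamp.tar"
```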

CMD script to change execution policy and run PowerShell script

This script updates the execution policy so that Task Scheduler is allowed to run the PowerShell script you created in the previous step, and then runs that script, wsl_backup.ps1.

PowerShell -Command "Set-ExecutionPolicy Unrestricted"
PowerShell C:\Users\mmcgo\OneDrive\Desktop\wsl_setup\wsl_scripts\wsl_backup.ps1
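As an alternative, if you would rather not permanently loosen the machine-wide execution policy, PowerShell can bypass it for a single run. A one-line version of the CMD script could instead read:

```powershell
PowerShell -ExecutionPolicy Bypass -File C:\Users\mmcgo\OneDrive\Desktop\wsl_setup\wsl_scripts\wsl_backup.ps1
```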

Set up Task Scheduler to run the CMD script once a week

Now that we have the PowerShell and CMD scripts written, we just need to set up Task Scheduler to run the CMD script once a week.

First open Task Scheduler.

Next create a new task.

You will be presented with the options below.

First, in the General tab, fill in a short description as in the example below. Also select whether this task is for all users or only certain users; below I have selected just my profile.

Next, in the Triggers tab, create a new trigger as shown in the example below, choosing on-a-schedule and, in this case, the specific user for this task. Then specify the day and time; I chose Sunday each week at 4am. Remember that because it is set to run only when the user is logged on, if the task misses its scheduled time (for example because the computer is turned off) it will run the next time you turn the computer on after the scheduled time.

This should then show up in the list of triggers like below

Next, in the Actions tab, choose the start-a-program option and point it to the CMD file you created earlier.

This should then show up in the list of actions like below

Finally, in the Conditions tab, set the task to always run by deselecting the option to start the task only when the computer is on AC power, as shown below.

You can ignore the remaining tabs for this task. You should be all good to go and can test by rebooting.
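If you prefer the command line to the GUI, the same weekly task can be created with the built-in schtasks utility. This is a sketch assuming the CMD script is saved as wsl_backup.cmd in the wsl_scripts folder used earlier:

```powershell
# Create a weekly task named "WSL Backup" that runs Sundays at 4am.
schtasks /Create /SC WEEKLY /D SUN /ST 04:00 /TN "WSL Backup" /TR "C:\Users\mmcgo\OneDrive\Desktop\wsl_setup\wsl_scripts\wsl_backup.cmd"
```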

This backup process only takes a few minutes and means that, at worst, you will only lose changes made that day to files in your Linux distribution. It is important to back up like this regularly because WSL sits outside your OneDrive files and so will not be automatically backed up like other Windows files.

--

Dr Martin McGovern PhD FIA

I am a fully qualified Actuary and data science team leader focused on combining the best aspects of data science with actuarial science to drive innovation.