
Game Server Backup Script Example

There are many ways to back up your Game Server on top of our internal Daily Backups. Often you can do so using a system integrated into the server itself or a server-side mod made for the game. When that's not an option, or you want to ensure you have a remote backup for a 3-2-1 Backup Strategy, you can make use of a VPS and a backup script.
Backup software and scripts like this one run on a separate server from your Game Server and connect to it remotely, sending console commands to control the server state and then copying the server files to create the backup. Finding software or a script that integrates with what you want to back up can be tricky, so we've created a simple example script that works with our Game Control Panel API and can be run on any Linux VPS to create quick backups of any Game Server hosted with our Game Panel. It can also be used as a starting point for similar scripts targeting panels with similar APIs, such as Pterodactyl.

Server OS & distro

This script was built for and tested on Ubuntu 22.04, but should work on any Linux Distro with access to the following commands: curl, mkdir, cd, zip, find, and rm. The rclone software that we make use of in this script can be installed on any Linux distro.
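If you want to confirm your distro has everything the script needs before you start, a quick pre-flight check like this one (a sketch, assuming a Bash shell) will report anything that's missing:

```shell
#!/bin/bash
# Pre-flight check: report whether each command the backup script relies on is available.
for cmd in curl mkdir cd zip find rm rclone; do
    if command -v "$cmd" > /dev/null; then
        echo "found:   $cmd"
    else
        echo "missing: $cmd"
    fi
done
```

Anything reported as missing can be installed through your distro's package manager, and rclone through the install instructions linked below.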


Before we get into the script, we want to go over exactly what its goals are so that you can more easily understand the steps it takes.

Control the Game Server State

Our existing backups, and most external backups for that matter, are taken without control or knowledge of the server state. Using Minecraft as an example, if you back up a server while it's online, any chunks that are actively being saved at the time won't be backed up correctly. That's a semi-rare occurrence, since Minecraft saves on average every 5 minutes and backing up an individual server on our infrastructure takes far less time than that, but it is a possibility. With the ability to communicate with the server over the API, you can tell Minecraft to turn off saving using the /save-off command, guaranteeing that all of the chunks you back up will be error-free.
This issue can lead to an error called concurrent modification, a common problem in backups. If a file is written to at the same time as it's being copied, the copy will be malformed and unusable. Depending on how the saving and copying are done, it may even affect the original file. Our backups are done in a way that writing to a file as it's being backed up will only affect the copy, making at most that file in the backup unusable while leaving the original intact. Again, this is a fairly rare occurrence, and it most often becomes an issue with files or games that require constant saving, for example games, mods, or plugins that use SQLite databases or otherwise save more often than once a minute.

The ability to control the server state, either via the console or controlling the power state, is a big part of this script and why we make use of the API. It allows us to be sure that the data we want to back up is in a safe condition, and that the backup we make will be of a static server state and not of one in flux.

Incremental Data Transfer

When controlling the server state with the goal of preventing saves to the disk while making a backup, you can run into problems that cause your Game Server to crash or lag. These problems stem from the saves that would otherwise have happened accumulating in memory, either bogging down the server while it waits or causing it to freeze when it's finally time to save.
That makes the second goal of this script to minimize the amount of time we're actually interacting with the Game Server. We do this by keeping an uncompressed copy of the Game Server files alongside the script. We then sync only the changes made to the live Game Server files between each backup, minimizing the transfer time while still leaving us with a fully functional backup at the time of compression.

We've chosen to do this with a program called Rclone, which allows for this kind of synchronization over SFTP and many other cloud/network storages. For the basic version of the script, we just use Rclone to sync between the panel's SFTP address and a local destination, but you can also sync between SFTP and Google Drive/Cloud, an Amazon S3 bucket, or any other destination on this list.

Off-Site Storage and Data Isolation

A major part of backing up any system is keeping at least one backup off-site and isolated from the live files. The goal of storing the live data and the backups on separate servers is to prevent the accidental deletion of the backup directory through operator error, or encryption/deletion of the entire server's storage by an attacker or malware.
The ideal situation for a production environment is an air-gap on physical media, but for Game Servers, isolation between the backup server and the live server tends to be enough. The minimal standard we're aiming to meet is that the Game Server has no knowledge of the location of the backups, and if it's infected, it cannot affect previously generated backups. We do this by having the backup server contact the Game Server and by storing multiple sets of fully functional compressed server backups.

The Script

The example script itself consists of only a few parts, using Minecraft as the example target game:

  • The variables used to configure the script.
    • Most of these variables are used multiple times in the script, and they can all be modified. They all have short explanations of what they do and what needs to be placed in them.
  • A couple of mkdir commands that make the directories specified in the server_name variable. The second directory created is the server_name variable suffixed with the word Archives.
    • The root directory of the script, specified in the local_storage_dir variable, has to exist prior to you running the script!
  • A few curl commands that send the API requests to the panel to disable saving and notify users.
    • This repeats a few times with the payload changing for each Minecraft command.
  • An rclone command that syncs between an SFTP destination that's been configured with Rclone and a local directory.
  • A few more curl commands to enable saving and notify users now that the sync has been completed.
    • At this point, we are no longer interacting with the live server other than to notify users/admins of the current backup progress. This means that the server is no longer affected. This step normally only takes a few minutes, but varies depending on the amount of changes between backups.
    • If this step is taking too long, more frequent backups can help. If you don't have enough storage for additional full backups, you can copy just this part of the script to a new file and run this part more frequently. That will decrease the amount of changes that need to be synced every time.
  • A cd command that changes the working directory to our backup location.
    • This affects the resulting directory tree contained within the compressed file.
  • A zip command that compresses the backed up files.
    • Uses the current date and time as parameters to construct a file name.
    • This Server_$(date -d "today" +"%m%b-%d-%Y--%H-%M") would turn into this Server_10Oct-10-2023--18-01 on October 10th, 2023, at 18:01 local time.
  • A set of find and rm commands piped together that can locate and delete files older than 3 days by default.
    • You can change this by modifying the days_to_keep variable at the top of the script to a different integer.
  • A final curl command to notify users.
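To see exactly what that date format produces, you can test it on its own with a fixed timestamp instead of "today" (the server_name value here is just an example):

```shell
#!/bin/bash
# Demonstrate the backup file-name format with a fixed timestamp.
server_name="Server"    # example value; match this to your server_name variable
stamp=$(LC_ALL=C date -d "2023-10-10 18:01" +"%m%b-%d-%Y--%H-%M")
backup_name="${server_name}_${stamp}.zip"
echo "$backup_name"     # Server_10Oct-10-2023--18-01.zip
```

Swapping the format specifiers (%m, %b, and so on) changes the file name accordingly; see the date documentation page linked in the script's comments.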


In order to use the backup script you need to configure it. To start with, you need an API Token, the linked article shows you how to get one for the panel we use. Once you have an API Token and you've saved it somewhere, you'll need to set up and configure Rclone. In order to get the files from WinterNode, you'll need to use SFTP, and Rclone provides documentation on configuring an SFTP Destination here and one on installing it on Linux. Make note of the name you give your destination and use the credentials you get from the Using SFTP article.
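Once Rclone is installed, its interactive wizard walks you through creating the SFTP destination, and you can confirm the destination works before touching the script. The remote name winternode below is just an example; use whatever name you chose during configuration:

```shell
rclone config              # interactive wizard: choose "sftp", then enter the host, user, and password from the Using SFTP article
rclone listremotes         # the destination you created should appear here, e.g. "winternode:"
rclone lsd winternode:     # list the top-level folders of your server to verify the connection
```

If rclone lsd returns your server's folders, the destination is ready to use in the script.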

Now you need a directory on your Linux Server to store the compressed backups and the local copy of your server files. The directory tree should look something like this. We're using the /home/ directory as an example, but as long as you set the permissions correctly, you can have this directory anywhere on your server.

  • /home/ServerBackups/
    • ServerArchives/
    • Server/

After completing the above steps you should have four things: a directory in which to store the backup script and subfolders, your server's UUID, an API Token, and a configured SFTP Destination in Rclone. All of these can be configured using the variables at the top of the script, and can be further modified in the commands themselves, if needed.
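The directory tree shown above can be created in one go. This is a sketch using $HOME/ServerBackups in place of /home/ServerBackups and the default Server name; match both values to your own variables:

```shell
#!/bin/bash
# One-time setup: create the root folder plus the two subfolders the script expects.
local_storage_dir="$HOME/ServerBackups"    # match this to your local_storage_dir variable
server_name="Server"                       # match this to your server_name variable
mkdir -p "${local_storage_dir}/${server_name}Archives" "${local_storage_dir}/${server_name}"
ls "${local_storage_dir}"
```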


The script below is built for Minecraft, but the command payloads and curl commands can be swapped out for the commands, or payload types, required for other game servers. If you aren't sure what's required for the game you host, reach out to our team via Discord and we'll do our best to find out with you!



#This tells the shell to print each command as it runs, allowing you to follow the script's progress when you're checking that it works.
set -x

#A few variables that you can edit easily. These contain information about the server you're going to be backing up and the locations where your backup files will be stored.

#Replace <YOUR PANEL URL> with your panel's web address
#Replace <SERVER UUID GOES HERE> with your server UUID
server_UUID="https://<YOUR PANEL URL>/api/client/servers/<SERVER UUID GOES HERE>"
#Replace <API KEY GOES HERE> with your API Token
api_token="<API KEY GOES HERE>"
#Replace <REMOTE SFTP DESTINATION> with your configured Rclone SFTP Destination, the : character is necessary for it to be read as a remote destination instead of a local directory.
SFTP_destination="<REMOTE SFTP DESTINATION>:"

#Change /home/ServerBackups to match the full path of your directory
#This folder path needs to exist before you run the script!
local_storage_dir=/home/ServerBackups
#This is the name of the folder that your uncompressed files will be stored in.
#A second folder of the same name, but suffixed with the word "Archives", will also be created and will be used for the backups.
server_name=Server
#Change the number 3 to the number of days you want to keep backups for
days_to_keep=3

#Change ${server_name}_$(date -d "today" +"%m%b-%d-%Y--%H-%M") to match the file name you want
#-The $(date -d "today" +"%m%b-%d-%Y--%H-%M") part is what adds the date to the end of the file name. You can customize it with the parameters on the following page.
backup_name=${server_name}_$(date -d "today" +"%m%b-%d-%Y--%H-%M").zip

mkdir ${local_storage_dir}/${server_name}Archives/
mkdir ${local_storage_dir}/${server_name}/

#Can be removed or modified as needed, change the "say Backup Starting" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "say Backup Starting"}'

#Can be removed or modified as needed, change the "save-all" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "save-all"}'

#Can be removed or modified as needed, change the "save-off" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "save-off"}'

#This syncs the remote destination, in this case your server, and a local destination, the local copy of your server files
rclone sync $SFTP_destination ${local_storage_dir}/${server_name}/ --progress

#Can be removed or modified as needed, change the "save-on" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "save-on"}'

#Can be removed or modified as needed, change the "save-all" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "save-all"}'

#Can be removed or modified as needed, change the "say Backup Sync Complete" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "say Backup Sync Complete"}'

#This changes the working directory to be your local storage directory. Moving the working directory affects the file tree stored in the zip file, making it easier to work with when you decompress it.
cd "${local_storage_dir}" || exit 1
#This creates a zip file of the contents of the ${server_name} folder, located in the ${server_name} folder suffixed with the word "Archives". The name of the zip file will be generated from the $backup_name variable and will have the date and time suffixed to it.
zip -r ${local_storage_dir}/${server_name}Archives/$backup_name ./${server_name}

#This finds the backups older than the specified number of days and "pipes" them into an rm command, deleting them. By running this every time, we ensure that backups older than X amount of days are deleted.
find ${local_storage_dir}/${server_name}Archives/ -type f -mtime +${days_to_keep} -print0 | xargs -0 rm -f

#Can be removed or modified as needed, change the "say Backup Archive Complete" string to change the command sent to the server
curl "${server_UUID}/command" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"command": "say Backup Archive Complete"}'


With the script configured, you'll need to set its permissions to grant your user the ability to execute it, for example with chmod u+x. Then you can run the script by navigating to its directory and typing ./ followed by the script's file name into the console. Let it run through, and if there are any errors, double-check that you've replaced all of the variables with the correct values. It may take a while for the zip file to be compressed, so don't worry if it doesn't look like it's doing much; the script will exit if it has failed.

Scheduling with cron

After you've confirmed a successful run of the script, by checking that it synced your Game Server's files and generated a zip archive, you can add it to your Linux Server's CronTab. The provided CronTab link is a great jumping-off point for understanding the CronTab, but to summarize: you provide a full path to the location of your script and a shorthand for the times and dates you want it to run. Using the crontab -e command, the script will be run using your user's scope and permissions, and you can use this editor to generate the shorthand for the times and dates you want your script to run.
For example, if your script is located at /home/ServerBackups/backup.sh (using backup.sh as an example file name) and you want to back up every 6 hours, your cron entry would look like this: 0 0,6,12,18 * * * /home/ServerBackups/backup.sh
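Putting that together, a crontab entry that also logs the script's output for troubleshooting might look like this (backup.sh is a placeholder for your script's file name):

```shell
# Edit your user's crontab:
crontab -e

# Then add a line like this one: run at minute 0 of hours 0, 6, 12, and 18,
# appending the script's output to a log file so you can review each run.
0 0,6,12,18 * * * /home/ServerBackups/backup.sh >> /home/ServerBackups/backup.log 2>&1
```

Redirecting the output this way is optional, but it gives you a record to check when a scheduled run fails silently.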

Other types of connection

There are other ways to communicate with your server besides a game management panel like WISP or Pterodactyl.

RCON

RCON is a widely used TCP-based protocol for sending commands to game servers that run an RCON server and have its port open. In the context of our backup script, it can be used to send the commands we need to get the server into the state we want before we start the backup.

RCON is insecure

While it requires a password in order to send a command to the server, both the command and the password are sent in plain text. That means the password can easily be pulled out of the packet by anyone listening, opening your server up to an easy attack vector.
The solution to this is to either only use RCON from within the same machine or network, or to use something like an SSH Tunnel when connecting to an RCON Server over the public internet.
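For example, an SSH local port forward keeps the password off the wire by carrying the RCON traffic inside the encrypted SSH session. The host name here is an assumption, and 25575 is Minecraft's default RCON port; adjust both to match your setup:

```shell
# Forward local port 25575 to the RCON port on the game host, over SSH.
# While this runs, point your RCON client at localhost:25575.
ssh -N -L 25575:localhost:25575 your-user@your-game-host.example.com
```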

The easiest way to use RCON in the backup script is to use a tool like rcon-cli: put the rcon-cli binary in your backup script's folder, then run commands similar to the ones shown in their example section in place of the curl commands used in our example.
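As a sketch, the save-off step might look like the following with rcon-cli in place of a curl call. The host, port, password, and exact flag names are assumptions, so check your rcon-cli build's --help for its actual options:

```shell
# Send the same state-control command over RCON instead of the panel API.
./rcon-cli --host your-game-host.example.com --port 25575 --password "your-rcon-password" save-off
```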

Game Tips

Some games have oddities in how they save data or what commands are available. As we come across them, we'll be collecting that information here.


CoreProtect

If you want to back up your CoreProtect database, use the /co consumer command to stop CoreProtect from writing to the database while you've stopped the world from saving, then copy the CoreProtect database after you've turned saving back on. The order of operations should look like this:

/co consumer pause

-copy your world files

-turn world saving back on

-copy your CoreProtect database

/co consumer resume

Doing things in this order will ensure that the data in your CoreProtect database matches the state of your world as closely as possible while also minimizing the amount of time that the server's not saving to the disk. CoreProtect databases can get big, so you don't want to copy the database while saving is off.


CoreProtect will store the data it would otherwise have logged in memory until the consumer resumes, just like Minecraft does when saving is disabled. So the longer the consumer is off, the more RAM you'll be consuming, and this may result in an OOM crash if your database is big or your transfer speed is slow.

Palworld

Palworld seems to save frequently enough that any backup taken while the server is online has a significant chance of being corrupted. It also cannot temporarily disable saving to the disk as of this writing. So the best way to back up your Palworld server is to do so as part of a regular restart schedule using the Power action.

Not using the GCP?

If you aren't using the GCP, you'll probably need to use RCON, which can be enabled for Palworld using these steps.

The ${server_UUID} and $api_token variables will pull from the variables you set at the start of the script.

curl "${server_UUID}/power" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.wisp.v1+json" \
-H "Authorization: Bearer $api_token" \
-d '{"signal": "start"}'

You can replace start with stop to stop the server.

Reach Out!

Have Questions? Need Help? Feel free to reach out!

Join our Discord