I have a Linux machine (Arch/Debian) with only a CLI (Bash) and a LAMP server environment, with the following specs:
4 GB RAM
2 vCPU Cores
80 GB SSD
4 TB Transfer
40 Gbps Network In
4000 Mbps Network Out
I host two websites (WordPress and/or Drupal) on this system.
I want to back up and sync this system live to my Windows Subsystem for Linux (WSL). I don't want an hourly/daily/weekly cron job; instead, every change within the system's scope should be synced immediately to a directory in my WSL.
What I want is very much like Google's Backup & Sync, which I use on my Windows 10 / Ubuntu machines to prevent data loss in case a machine is unexpectedly ruined, or stolen and formatted (as opposed to data being manually deleted after a theft, in which case it would be lost in the remote syncing environment as well).
My question is: how can I do such live syncing (like Google's Backup & Sync) from Linux to WSL?
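The kind of behaviour I imagine would look roughly like the sketch below (just an illustration of the idea, assuming inotify-tools and rsync are available on the Linux side and the WSL side runs an SSH server; the function name, "user@wsl-host" and the paths are placeholders of mine, not tested code):

```shell
# Hypothetical sketch: re-run rsync whenever something changes under the
# watched directory. Assumes inotify-tools and rsync are installed, and that
# the WSL side is reachable over SSH; "user@wsl-host" and both paths are
# placeholders.
live_sync() {
    local src="$1" dest="$2"
    # Block until a change happens anywhere under $src, then mirror it.
    while inotifywait -r -e modify,create,delete,move -- "$src"; do
        rsync -az --delete --exclude='*/cache/*' "$src"/ "$dest"
    done
}

# Example invocation (placeholders):
# live_sync /var/www user@wsl-host:/home/user/backup
```

I don't know whether a loop like this (or a tool such as lsyncd, which wraps the same inotify + rsync idea) is the right approach here, hence the question.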
Side note
This is how I back up manually so far, and it's exactly the process I'd like to automate:
cib() {
    # Create an immediate ZIP backup (of both $drt and the DB);
    date="$(date +%F-%T)"
    mysqldump -u root -p --all-databases | zip "$drt/db-$date.zip" - # Note the hyphen: zip reads from stdin;
    zip -r "$drt/all_zipped-$date.zip" "$drt"/ -x "*/cache/*" "*/phpmyadmin/*"
    rm -f "$drt/db-$date.zip"
}