WPFleet includes a comprehensive automated backup system with scheduling, retention management, and notifications.
The backup system provides:
- Automated scheduling via cron container
- Manual backups for on-demand protection
- Configurable retention policies
- Multiple backup types (full, site, database)
- Notification integration for backup status
- Easy restoration process
- Configure in `.env`:

  ```bash
  # Enable automated backups
  BACKUP_ENABLED=true
  BACKUP_SCHEDULE="0 2 * * *"  # 2 AM daily

  # Retention settings
  BACKUP_RETENTION_DAYS=7      # Keep backups for 7 days
  BACKUP_RETENTION_WEEKLY=4    # Keep 4 weekly backups
  BACKUP_RETENTION_MONTHLY=3   # Keep 3 monthly backups
  ```

- Start the cron container:

  ```bash
  docker-compose up -d cron
  ```

- Verify backups are scheduled:

  ```bash
  docker logs wpfleet_cron
  ```

Create backups on-demand:
```bash
# Backup a specific site
./scripts/backup.sh site example.com

# Backup all sites
./scripts/backup.sh all

# Backup database only
./scripts/backup.sh database example.com

# Backup files only
./scripts/backup.sh files example.com
```

A full site backup backs up both database and files:

```bash
./scripts/backup.sh site example.com
```

This creates:

- `example.com_YYYYMMDD_HHMMSS.sql.gz` - Compressed database
- `example.com_YYYYMMDD_HHMMSS.tar.gz` - Compressed files

A database-only backup:

```bash
./scripts/backup.sh database example.com
```

creates `example.com_db_YYYYMMDD_HHMMSS.sql.gz`.

A files-only backup:

```bash
./scripts/backup.sh files example.com
```

creates `example.com_files_YYYYMMDD_HHMMSS.tar.gz`.
Backup all WordPress sites:

```bash
./scripts/backup.sh all
```

This creates separate backup files for each site.
Backups are stored in:

```
data/backups/
├── example.com/
│   ├── example.com_20231201_020000.sql.gz
│   ├── example.com_20231201_020000.tar.gz
│   ├── example.com_20231202_020000.sql.gz
│   └── example.com_20231202_020000.tar.gz
└── another-site.com/
    └── ...
```
Each site has its own subdirectory for organized backup management.
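Given this per-site layout, a small shell loop can show the most recent backup in each subdirectory. This is a convenience sketch, not part of WPFleet; the `latest_backup` helper name is illustrative:

```bash
# List the newest backup file in each per-site subdirectory.
# Assumes the data/backups/<site>/ layout shown above.
latest_backup() {
  local root="${1:-data/backups}"
  local dir newest
  for dir in "$root"/*/; do
    [ -d "$dir" ] || continue
    # ls -t sorts by modification time, newest first
    newest=$(ls -t "$dir" 2>/dev/null | head -n 1)
    [ -n "$newest" ] && printf '%s: %s\n' "$(basename "$dir")" "$newest"
  done
}
```

For example, `latest_backup data/backups` prints one `site: file` line per site, which is a quick sanity check that last night's backup actually ran.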
The backup schedule uses standard cron syntax:

```
* * * * *
│ │ │ │ │
│ │ │ │ └─ Day of week (0-7, 0 and 7 are Sunday)
│ │ │ └─── Month (1-12)
│ │ └───── Day of month (1-31)
│ └─────── Hour (0-23)
└───────── Minute (0-59)
```
```bash
# Daily at 2 AM
BACKUP_SCHEDULE="0 2 * * *"

# Every 12 hours
BACKUP_SCHEDULE="0 */12 * * *"

# Weekly on Sunday at 3 AM
BACKUP_SCHEDULE="0 3 * * 0"

# Monthly on the 1st at 4 AM
BACKUP_SCHEDULE="0 4 1 * *"

# Every 6 hours
BACKUP_SCHEDULE="0 */6 * * *"
```

Add custom scheduled tasks in `.env`:

```bash
CUSTOM_CRON_JOBS="0 3 * * 0 cd /wpfleet && ./scripts/backup.sh all"
```

Multiple jobs can be added with newlines:

```bash
CUSTOM_CRON_JOBS="0 3 * * 0 cd /wpfleet && ./scripts/backup.sh all
0 */6 * * * cd /wpfleet && ./scripts/quota-manager.sh monitor 80"
```

Configure retention in `.env`:

```bash
# Keep all backups for this many days
BACKUP_RETENTION_DAYS=7

# After RETENTION_DAYS, keep one weekly backup
BACKUP_RETENTION_WEEKLY=4

# After weekly retention, keep one monthly backup
BACKUP_RETENTION_MONTHLY=3
```

How it works:
- Daily retention: All backups are kept for `BACKUP_RETENTION_DAYS` days
- Weekly retention: After the daily period, keep one backup per week for `BACKUP_RETENTION_WEEKLY` weeks
- Monthly retention: After the weekly period, keep one backup per month for `BACKUP_RETENTION_MONTHLY` months
- Automatic cleanup: Runs based on `BACKUP_CLEANUP_SCHEDULE`
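As a rough illustration of the simplest tier, removing archives older than `BACKUP_RETENTION_DAYS` comes down to a `find -mtime` call. This is a sketch only; the bundled `backup-cleanup.sh` additionally keeps the weekly and monthly backups, which this snippet does not:

```bash
# Sketch of the daily retention tier only: delete backup archives
# older than BACKUP_RETENTION_DAYS. Does NOT implement the weekly
# or monthly tiers handled by backup-cleanup.sh.
prune_daily() {
  local root="${1:-data/backups}"
  local days="${BACKUP_RETENTION_DAYS:-7}"
  # -mtime +N matches files last modified more than N*24 hours ago
  find "$root" -type f -name '*.gz' -mtime +"$days" -print -delete
}
```

Note that `-print` comes before `-delete`, so each removed file is logged as it goes.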
Remove old backups manually:

```bash
./scripts/backup-cleanup.sh
```

This applies the retention policy to all backup directories.

Configure automated cleanup:

```bash
BACKUP_CLEANUP_ENABLED=true
BACKUP_CLEANUP_SCHEDULE="0 3 * * 0"  # 3 AM every Sunday
```

Use the import functionality to restore:
```bash
# Create new site from backups
./scripts/site-manager.sh add example.com --import-from

# When prompted, provide the backup files:
# Database: data/backups/example.com/example.com_20231201_020000.sql.gz
# Files: data/backups/example.com/example.com_20231201_020000.tar.gz
```

Restore a database backup directly:

```bash
./scripts/db-manager.sh import example.com data/backups/example.com/example.com_db_20231201_020000.sql.gz
```

Restore site files from an archive:

```bash
# Extract files to site directory
tar -xzf data/backups/example.com/example.com_files_20231201_020000.tar.gz -C data/wordpress/example.com/

# Fix permissions
docker exec wpfleet_frankenphp chown -R www-data:www-data /var/www/html/example.com
```

Check backup activity in the logs:

```bash
# Cron container logs
docker logs wpfleet_cron

# Backup-specific logs
tail -f data/logs/cron/backup.log

# Cleanup logs
tail -f data/logs/cron/cleanup.log
```

Inspect backups on disk:

```bash
# List all backups for a site
ls -lh data/backups/example.com/

# Find backups older than 30 days
find data/backups/ -name "*.gz" -mtime +30

# Check total backup size
du -sh data/backups/
```

Automatic notifications are sent for:
- Successful backups
- Failed backups
- Backup cleanup completion
- Low disk space warnings
Configure notifications as described in the Notifications Guide.
Use the AWS CLI in the cron container:

```bash
# Add to CUSTOM_CRON_JOBS
0 4 * * * aws s3 sync /wpfleet/data/backups/ s3://your-bucket/wpfleet-backups/
```

Copy backups to a remote server:

```bash
# Add to CUSTOM_CRON_JOBS
0 4 * * * scp -r /wpfleet/data/backups/ user@backup-server:/path/to/backups/
```

Sync backups to an external location:

```bash
# Add to CUSTOM_CRON_JOBS
0 4 * * * rsync -avz /wpfleet/data/backups/ user@backup-server:/path/to/backups/
```

- Regular testing: Periodically test backup restoration
- Multiple locations: Store backups in multiple locations
- Monitor disk space: Ensure adequate space for backups
- Verify backups: Check backup logs for errors
- Document procedures: Keep restoration procedures documented
- Automate offsite: Copy backups to external storage
- Retention balance: Balance storage costs with recovery needs
- Check cron container status:

  ```bash
  docker ps | grep cron
  ```

- Verify cron configuration:

  ```bash
  docker logs wpfleet_cron
  ```

- Check backup script permissions:

  ```bash
  ls -l scripts/backup.sh
  ```

- Check disk space:

  ```bash
  df -h
  ```

- Review error logs:

  ```bash
  tail -f data/logs/cron/backup.log
  ```

- Test manual backup:

  ```bash
  ./scripts/backup.sh site example.com
  ```

- Verify backup file integrity:

  ```bash
  # Test database file
  gunzip -t backup.sql.gz

  # Test archive file
  tar -tzf backup.tar.gz | head
  ```

- Check available disk space:

  ```bash
  df -h data/wordpress/
  ```

- Verify permissions after restore:

  ```bash
  docker exec wpfleet_frankenphp chown -R www-data:www-data /var/www/html/example.com
  ```
- Exclude unnecessary files:
  - Modify the backup script to exclude cache directories
  - Skip temporary files

- Compress differently:
  - Use faster compression (`gzip -1` instead of `gzip -9`)
  - Or use `pigz` for parallel compression

- Back up during low-traffic periods:
  - Schedule backups during off-peak hours
Typical compression ratios:
- Database: 10:1 to 20:1 compression
- Files: 2:1 to 5:1 compression (varies with media content)
Example: A 2GB site might create:
- Database: 100MB (compressed from 1GB)
- Files: 400MB (compressed from 1GB)
- Total: 500MB per backup
Calculate needed storage:

```
Storage = (Daily backups × Days) + (Weekly backups × Weeks) + (Monthly backups × Months)
Storage = (500 MB × 7) + (500 MB × 4) + (500 MB × 3)
Storage = 3.5 GB + 2 GB + 1.5 GB = 7 GB per site
```
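The same arithmetic is easy to script when experimenting with different retention settings. The 500 MB figure below is the example's per-backup size, not a WPFleet default:

```bash
# Estimate per-site backup storage from the retention settings.
# PER_BACKUP_MB is the example figure from above; substitute your own.
PER_BACKUP_MB=500
DAYS=7     # BACKUP_RETENTION_DAYS
WEEKS=4    # BACKUP_RETENTION_WEEKLY
MONTHS=3   # BACKUP_RETENTION_MONTHLY
TOTAL_MB=$(( PER_BACKUP_MB * (DAYS + WEEKS + MONTHS) ))
echo "Estimated storage per site: ${TOTAL_MB} MB"
# prints: Estimated storage per site: 7000 MB  (≈ 7 GB)
```

Multiply the result by the number of sites in the fleet to size the `data/backups/` volume.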