2 months ago
I set up daily backups for my Postgres, Mongo and Redis DBs.
However, I also store essential information in a storage bucket, and I'd like to set up a backup schedule for it as well.
What's the recommended way to do that?
2 Replies
2 months ago
railway doesn't have built-in backups for storage buckets sadly. here's what you should do:
create a simple service with a cron job that uses rclone or the aws cli to sync your railway bucket to another s3-compatible storage (backblaze b2, cloudflare r2, or regular aws s3 all work great).
grab your bucket credentials from railway's dashboard, set up the sync command, and schedule it to run daily just like your db backups. it's basically the same pattern you're already using for postgres/mongo/redis but for object storage.
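as a rough sketch, the sync script could look something like this — the SRC_*/DST_* variable names and bucket names are placeholders i made up, not anything railway defines, so fill them in from each provider's dashboard:

```shell
#!/bin/sh
# sketch only: mirror one s3-compatible bucket to another with rclone.
# SRC_*/DST_* env vars are placeholders — set them from your providers.
set -eu

SRC_BUCKET="${SRC_BUCKET:-my-railway-bucket}"
DST_BUCKET="${DST_BUCKET:-my-backup-bucket}"

# rclone reads remote config from RCLONE_CONFIG_<REMOTE>_<OPTION> env
# vars, so no rclone.conf file is needed inside the container
export RCLONE_CONFIG_SRC_TYPE=s3
export RCLONE_CONFIG_SRC_PROVIDER=Other
export RCLONE_CONFIG_SRC_ACCESS_KEY_ID="${SRC_ACCESS_KEY:-}"
export RCLONE_CONFIG_SRC_SECRET_ACCESS_KEY="${SRC_SECRET_KEY:-}"
export RCLONE_CONFIG_SRC_ENDPOINT="${SRC_ENDPOINT:-}"

export RCLONE_CONFIG_DST_TYPE=s3
export RCLONE_CONFIG_DST_PROVIDER=Other
export RCLONE_CONFIG_DST_ACCESS_KEY_ID="${DST_ACCESS_KEY:-}"
export RCLONE_CONFIG_DST_SECRET_ACCESS_KEY="${DST_SECRET_KEY:-}"
export RCLONE_CONFIG_DST_ENDPOINT="${DST_ENDPOINT:-}"

# one-way mirror: makes the destination identical to the source
CMD="rclone sync src:${SRC_BUCKET} dst:${DST_BUCKET} --checksum"

# DRY_RUN=1 (the default here) only prints the command so you can
# inspect it; set DRY_RUN=0 in the real scheduled job
if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "$CMD"
else
  $CMD
fi
```

note that rclone sync deletes files on the destination that no longer exist on the source — use rclone copy instead if you want additive-only backups.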
2 months ago
to add to what was said, railway actually has a native cron feature now so you don't need to keep a service running 24/7. you can set up a lightweight container that only spins up on schedule.
here's a quick approach: create a new service with a dockerfile that has rclone installed, add your source bucket creds and destination bucket creds as env vars, then set a cron schedule in the service settings (like 0 3 * * * for 3am daily). your entrypoint script just runs rclone sync source:bucket dest:bucket and exits. railway will spin it up on schedule, run the sync, then shut it down so you're not paying for idle time.
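to make that concrete, a minimal image sketch — the rclone/rclone base image and the backup.sh filename are my assumptions, not railway requirements:

```dockerfile
# lightweight backup image: rclone preinstalled, one script as entrypoint
FROM rclone/rclone:latest
COPY backup.sh /backup.sh
RUN chmod +x /backup.sh
# override the base image's default rclone entrypoint with our script,
# which runs the sync and exits so the cron-triggered container shuts down
ENTRYPOINT ["/backup.sh"]
```

with a cron schedule like 0 3 * * * set in the service settings, railway starts this container at 3am, the script runs, and the container exits until the next trigger.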
if you want something prebuilt, check the template marketplace for "backup" templates. there's one for postgres to r2 that uses rclone under the hood (https://railway.com/deploy/backup-postgres-to-r2) which you could adapt for bucket to bucket sync with minimal changes.