Tags: aws, cloud, backup, s3

Sync Folder to AWS S3

Cron job to sync local files to an AWS S3 bucket.

Using the AWS CLI, this job synchronizes a local directory with an S3 bucket. It only uploads new or modified files, making it an efficient way to back up user uploads, assets, or logs to the cloud automatically.

Cron Schedule

Every Hour: 0 * * * *
Runs at minute 0 of every hour (e.g., 1:00, 2:00, 3:00...).

Command to Run

Copy and paste this command into your crontab or automation script:

aws s3 sync /local/path s3://my-bucket/path
This command will run according to the cron schedule above.
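Putting the schedule and the command together, a full crontab entry might look like the sketch below. Note that cron runs with a minimal environment, so the absolute path to the `aws` binary, the log file location, and the directory and bucket names are all assumptions to adjust for your system (find your binary path with `which aws`):

```shell
# Hourly sync of a local directory to S3, with output appended to a log.
# /usr/local/bin/aws, /local/path, my-bucket, and the log path are placeholders.
0 * * * * /usr/local/bin/aws s3 sync /local/path s3://my-bucket/path >> /var/log/s3-sync.log 2>&1
```

Redirecting both stdout and stderr (`2>&1`) into a log file makes it easy to check later whether a sync succeeded or failed.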

Implementation Examples

Here are code examples for implementing this cron job in different programming languages:
Unix/Linux Crontab
0 * * * * /path/to/script.sh
Python (with schedule library)
schedule.every().hour.do(job)
Node.js (with node-cron)
cron.schedule('0 * * * *', () => {
  console.log('Running every hour');
});
Go (with robfig/cron)
c.AddFunc("0 * * * *", func() { fmt.Println("Run every hour") })
GitHub Actions Workflow
- cron: "0 * * * *"
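For anything beyond a one-liner, the crontab entries above can instead call a small wrapper script. This is a sketch only; the source directory, bucket, and exclude pattern are placeholders, not part of the recipe. `--delete` and `--exclude` are standard `aws s3 sync` flags:

```shell
#!/usr/bin/env bash
# backup-to-s3.sh -- wrapper around the sync for use from cron.
# All paths and names below are example values; replace with your own.
set -euo pipefail

SRC="/local/path"
DEST="s3://my-bucket/path"

# --delete removes objects from the bucket that no longer exist locally,
# keeping the bucket an exact mirror; drop it to retain deleted files.
# --exclude skips files matching the pattern (here, temporary files).
aws s3 sync "$SRC" "$DEST" \
  --delete \
  --exclude "*.tmp"
```

Make the script executable (`chmod +x backup-to-s3.sh`) and point the crontab entry at it, e.g. `0 * * * * /path/to/backup-to-s3.sh`.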

Related Cron Recipes

Automate PostgreSQL Backups Daily

How to schedule a daily PostgreSQL database backup using cron.
Tags: database, postgres, backup

Automate MySQL Backups Daily

Schedule a daily MySQL database dump using cron.
Tags: database, mysql, backup

Automate MongoDB Backups Daily

How to schedule a daily MongoDB backup using cron.
Tags: database, mongodb, backup

Automate Redis RDB Snapshots

Schedule a Redis background save (snapshot) via cron.
Tags: database, redis, cache

PostgreSQL Vacuum Analyze

Automate PostgreSQL database maintenance with VACUUM ANALYZE.
Tags: database, postgres, maintenance

Rsync Incremental Backup

Schedule secure, incremental backups using rsync.
Tags: linux, backup, network