Back Up Your Files! 10x Faster Uploads to AWS S3 (Tested at 2 Gbps)

Super Easy Guide to Backing Up Files to AWS S3 Deep Archive 📁

This guide covers how to upload both single files and entire folders to AWS S3 using the AWS CLI. We'll focus on sending them directly to the DEEP_ARCHIVE storage class for inexpensive, long-term backup.

Step 1: Prerequisites ⚙️

This guide assumes you have already installed and configured the AWS CLI. If you haven't, please complete our first guide:

➡️ How to Install & Configure the AWS CLI

It also assumes a basic understanding of AWS S3 (file storage) and that you have an S3 bucket ready.
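As a quick sanity check, you can confirm the CLI is installed and your credentials are working (the second command simply asks AWS who you are):

aws --version
aws sts get-caller-identity

If you still need a bucket, you can create one from the CLI too. Bucket names must be globally unique, so treat my-super-safe-bucket (the example name used throughout this guide) as a placeholder for your own:

aws s3 mb "s3://my-super-safe-bucket"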

Step 2: Optimize Upload Speed (Optional but Recommended) ⚡

For large files (e.g., videos or large archives), you can increase speed by tuning the CLI's multipart upload settings. The CLI automatically splits large files into parts and uploads several parts in parallel; raising the concurrency limit (the default is 10) lets it make better use of a fast connection.

To set the maximum number of parallel requests to 100, run this command:

aws configure set default.s3.max_concurrent_requests 100

This is a one-time configuration that will speed up all subsequent large uploads. 🚀

You can also set the multipart chunk size. A common setting is 64MB.

aws configure set default.s3.multipart_chunksize 64MB

This command instructs the CLI to break large files into 64MB pieces before uploading them in parallel.
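Both settings are saved to your ~/.aws/config file, and you can read them back at any time to confirm they took effect:

aws configure get default.s3.max_concurrent_requests
aws configure get default.s3.multipart_chunksize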

Step 3: Upload a Single File ⬆️

Use the aws s3 cp (copy) command to upload files.

The command requires a source path (what to copy) and a destination path (your s3:// bucket).

For example, to upload my-backup.zip from your Desktop to a bucket named my-super-safe-bucket and send it directly to DEEP_ARCHIVE, use the --storage-class flag:

aws s3 cp "C:\Users\YourName\Desktop\my-backup.zip" "s3://my-super-safe-bucket/" --storage-class DEEP_ARCHIVE

(On macOS, your path would look like /Users/YourName/Desktop/my-backup.zip; on Linux, /home/YourName/Desktop/my-backup.zip.)
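For reference, here is the equivalent macOS/Linux command, assuming the same example bucket and a file sitting on your Desktop:

aws s3 cp "$HOME/Desktop/my-backup.zip" "s3://my-super-safe-bucket/" --storage-class DEEP_ARCHIVE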

The file will be uploaded directly to the Deep Archive storage class. 🥶

Step 4: Upload an Entire Folder 📂

To back up an entire folder (e.g., "My Photos"), use the same cp command with the --recursive flag. This copies all contents, including subfolders.

Example path: C:\Users\YourName\Documents\MyPhotos.

To upload the entire folder and set all files within it to DEEP_ARCHIVE, use both the --recursive and --storage-class flags:

aws s3 cp "C:\Users\YourName\Documents\MyPhotos" "s3://my-super-safe-bucket/MyPhotos/" --recursive --storage-class DEEP_ARCHIVE

The command copies the folder and all its subfolders, applying the storage class to every file.
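If you'd like to preview exactly what will be transferred before committing, add the --dryrun flag; the CLI lists each copy operation it would perform without actually uploading anything:

aws s3 cp "C:\Users\YourName\Documents\MyPhotos" "s3://my-super-safe-bucket/MyPhotos/" --recursive --storage-class DEEP_ARCHIVE --dryrun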

Step 5: Verify the Upload ✅

You can verify the files were uploaded by listing the contents of your bucket from the CLI.

To list (ls) all files in your bucket recursively, run:

aws s3 ls "s3://my-super-safe-bucket/" --recursive

Your newly-uploaded files and folders will appear in the list. ✨
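For friendlier output, aws s3 ls also accepts --human-readable (sizes shown in KB/MB/GB instead of bytes) and --summarize (object count and total size at the end):

aws s3 ls "s3://my-super-safe-bucket/" --recursive --human-readable --summarize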

Alternatively, you can navigate to the S3 service in your AWS Console, open your bucket, and view the uploaded files. 😌

Step 6: A Note on Storage Classes 🗄️

This guide focuses on DEEP_ARCHIVE for the cheapest long-term backup. However, AWS offers several storage classes for different needs.

  • If you omit the --storage-class flag entirely, your files will be uploaded to the default STANDARD class, which is more expensive but allows for instant access.
  • Other common classes include STANDARD_IA (for infrequent access) and GLACIER_IR (Instant Retrieval), which balance cost and access speed.

For this guide's goal of cheap archival, DEEP_ARCHIVE is the correct choice. Just keep the trade-off in mind: objects in Deep Archive can't be downloaded instantly; you must first request a restore, which typically completes within 12 hours (or up to 48 hours with the cheaper bulk option).
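To confirm that an uploaded object really landed in Deep Archive, you can inspect its metadata with the lower-level s3api command (shown here for the my-backup.zip example from Step 3):

aws s3api head-object --bucket my-super-safe-bucket --key my-backup.zip

The JSON response includes a StorageClass field, which should read DEEP_ARCHIVE.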

Backup Complete! 😌

You've successfully learned how to use the aws s3 cp command to back up your important files and folders to S3, using the --recursive and --storage-class flags to ensure they are archived efficiently and cost-effectively.

🥂🥳 And that's it! You're now a pro at backing up your important files to S3 Deep Archive. 🎉
