How to Use the S3 Uploads by HumanMade WordPress Plugin

The following how-to was acquired at great headache-induced cost and required many full days of work for my feeble brain to comprehend all the moving parts. I’ve summarized my experiences with the S3 Uploads by HumanMade WordPress plugin in this “how to”, neat and tidy for your reading pleasure. First, some notes:

  • Here’s the S3 Uploads by HumanMade plugin.
  • Only use this plugin if you either want to use the WP-CLI for your S3 stuff or are a sadomasochist. If you fit into neither of those use cases, leave now, get a beer, and appreciate the headaches you’ve just avoided!
  • While, technically, you can use this plugin without WP-CLI or SSH access on your server, the support seems very sloppy for non-ssh/WP-CLI implementations, and I don’t recommend it (I tried this route first and failed miserably!).

On the plus side, if you are willing to join me on this journey, you just might learn some things, as I did:

  • Why would I ever use WP-CLI? How do I use it? Etc.
  • How to do some relatively-complex stuff with SSH
  • What are code dependencies, and how do I use them?
  • How can I use the command line for fun and profit?
  • How to upload to Amazon S3 (and related permissions craziness!).

Why S3 Uploads by HumanMade and not some other S3 plugin?

I chose this particular plugin for a number of boneheaded, well-meaning, and blissfully-walking-into-the-unknown reasons:

  1. I trust HumanMade.
  2. The idea of learning more about WP-CLI sparked my curiosity.
  3. My command line skills sit on the edge of serviceability, and I find some joy in getting stuff done on the command line.
  4. Fwiw, the only other S3 plugin with a large user base in the entire WordPress ecosystem seems to be WP Offload Media, but their pricing is ABSURD at scale! (e.g. if you manage a handful of sites, you’ll easily land in the $200/year category, and it’s likely cheaper to simply buy more server space). They do offer a free version of their plugin, though it’s likely not a good fit for a professional WordPress person.

Okay, let’s start the “how to” for S3 Uploads by HumanMade WP plugin!

Step 1: Create a new S3 user.

The way I understand it, you should treat IAM users (the users who get S3 access) like you would WordPress users – one admin per site, unless for some reason you need more than one. The S3 Uploads plugin has a function to create new IAM users, but I could not get it to work. So I went in and created a user the old-fashioned way via the IAM Users admin page. It’s rather straightforward – just make sure to copy/save your secret keys to a text file, as they will only be shown to you once!

This is a little confusing, so here are the steps you’ll complete in the Amazon User control panel (a.k.a. “IAM Manager” – and outlined below in detail):

  1. Create a new user.
  2. Create a new policy (a.k.a. “Permissions” – it’ll open a new window for this step).
  3. Add the policy you created in Step #2 to the user you created in Step #1 (in the original IAM window…confusing right?!).
  4. Download your IAM user policy for reference later (there will be a “download” button after you create the user).

Add the following permissions to the user, replacing “your_bucket_name” with the name of your Amazon S3 bucket (at minimum, the plugin needs to get, put, delete, and list objects):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:PutObject",
                    "s3:DeleteObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::your_bucket_name",
                    "arn:aws:s3:::your_bucket_name/*"
                ]
            }
        ]
    }
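If you like working at the command line, you can save that policy to a file and sanity-check the JSON before pasting it into the IAM console. This is just a sketch – the filename is my own choice, “your_bucket_name” is a placeholder, and the `python3 -m json.tool` call is only there to validate the JSON:

```shell
# Save the IAM policy to a file and sanity-check it before pasting
# it into the IAM console ("your_bucket_name" is a placeholder).
cat > s3-uploads-policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::your_bucket_name",
                "arn:aws:s3:::your_bucket_name/*"
            ]
        }
    ]
}
EOF

# python3 -m json.tool exits non-zero on malformed JSON
python3 -m json.tool s3-uploads-policy.json > /dev/null && echo "policy JSON is valid"
```

A missing comma or bracket in the IAM console produces an unhelpful error, so validating locally first saves a round trip.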

Step 2: Create an S3 bucket for this website.

You’ll also need to create the S3 bucket via your Amazon S3 console (a “bucket” is another word for “folder”). Make sure your bucket name does not have any dots in it, or else you’ll get an SSL error later (e.g. call your bucket “websitecom” rather than “website.com”). Here are the bucket permissions:

Step 2.X: Edit your S3 Bucket policies so media is public (e.g. visible on your website!)

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": [
                    "s3:GetObject"
                ],
                "Resource": [
                    "arn:aws:s3:::your_bucket_name/*"
                ]
            }
        ]
    }
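Back to the no-dots-in-the-bucket-name rule from a moment ago: the reason is Amazon’s wildcard SSL certificate for `*.s3.amazonaws.com`, which only covers one extra hostname label, so a bucket named “website.com” produces a hostname the certificate can’t match. A quick shell sketch (the function name is my own invention) to check a candidate name:

```shell
# Hypothetical helper: flag bucket names that contain dots, since
# https://<bucket>.s3.amazonaws.com breaks Amazon's wildcard SSL
# certificate when the bucket name adds extra dot-separated labels.
check_bucket_name() {
  case "$1" in
    *.*) echo "warning: '$1' contains dots - expect SSL errors" ;;
    *)   echo "ok: '$1'" ;;
  esac
}

check_bucket_name "websitecom"    # ok
check_bucket_name "website.com"   # warning
```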

During your bucket setup process, make sure to uncheck “Block all public access” – otherwise the public bucket policy above won’t take effect.

Step 3: Upload your photos to S3.

There are many ways to get files onto Amazon S3, and those uploading options are beyond the scope of this document (check out my article on migrating files from WordPress to Amazon S3). If you are stuck at this step, I recommend using Cyberduck, Filezilla, or a different S3-enabled FTP uploader of your choice to get your files to S3. In my case, I downloaded my entire WP “uploads” folder to my desktop, then uploaded it to S3 with Cyberduck. A friend told me they just go to the Amazon S3 console and upload using its GUI uploader.

To migrate your entire “uploads” directory to S3 using the Humanmade plugin, use the following WP-CLI command:

wp s3-uploads upload-directory /path/to/uploads/ uploads

So in my case, I typed:

wp s3-uploads upload-directory ./wp-content/uploads/ uploads

Or you can test a single file upload to ensure it goes to the right place with something like:

wp s3-uploads cp <from> <to>

So in my case, I typed:

wp s3-uploads cp ./test.txt s3://tobycryns/test.txt

Step 4: Add the S3 Uploads by Humanmade plugin code to wp-config.php on your WP site.

I got a bit confused at this part, so I’m going to give lots of description. In your wp-config.php file, you’ll see a line that says,

/* That's all, stop editing! Happy publishing. */
/** Absolute path to the WordPress directory. */
if ( ! defined( 'ABSPATH' ) ) {
	define( 'ABSPATH', dirname( __FILE__ ) . '/' );
}

Immediately after that, put in the S3 Uploads plugin settings as follows:

// BEGIN S3 Uploads settings
require_once __DIR__ . '/vendor/autoload.php';
define( 'S3_UPLOADS_BUCKET', 'your_s3_bucket_name_here/wp-content' );
define( 'S3_UPLOADS_REGION', 'your_s3_region_here' ); // the s3 bucket region only (excluding the rest of the URL) - mine was 'us-east-1'
// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', 'your_s3_key' );
define( 'S3_UPLOADS_SECRET', 'your_s3_secret' );
// Or if using IAM instance profiles, you can use the instance's credentials: (NOTE: I commented this out)
//define( 'S3_UPLOADS_USE_INSTANCE_PROFILE', true );
define( 'S3_UPLOADS_AUTOENABLE', true );
// END S3 Uploads settings

After that, just leave the normal stuff in there, as follows:

/** Sets up WordPress vars and included files. */
require_once ABSPATH . 'wp-settings.php';
@include_once('/var/lib/sec/wp-settings.php'); // Added by SiteGround WordPress management system

Here’s what my code in wp-config.php looks like (bucket name is “tobycryns”):

// BEGIN S3 Uploads settings
require_once __DIR__ . '/vendor/autoload.php';
define( 'S3_UPLOADS_BUCKET', 'tobycryns/wp-content' );
define( 'S3_UPLOADS_REGION', 'us-east-2' ); // the s3 bucket region only (excluding the rest of the URL)
//define( 'S3_UPLOADS_BUCKET_URL', '' );
// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', 'my_key' );
define( 'S3_UPLOADS_SECRET', 'my_secret_key' );
define( 'S3_UPLOADS_AUTOENABLE', true );
// END S3 Uploads settings

Step 5: Upload and install the plugin dependencies using composer.

If you have never used Composer before, this part might be confusing, so I’ll try to explain a bit. Inside this plugin is a bunch of code that relies on Amazon code libraries maintained by other people. Those other people are constantly patching that code, so it stays up to date. Running the following command downloads the most recent release of those Amazon libraries to your server. So go ahead and SSH into your server’s command line, navigate to the web root (“public_html” in my case), run the following command, and a few seconds later you’ll have the most up-to-date versions of the Amazon scripts on your server, ready to rock!

composer require humanmade/s3-uploads
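After that command finishes, Composer records the dependency in a composer.json file at your web root and downloads the code into a vendor/ folder (which is why wp-config.php loads vendor/autoload.php). The entry looks something like this – the version constraint is illustrative; yours will reflect whatever release Composer resolved:

```json
{
    "require": {
        "humanmade/s3-uploads": "^2.0"
    }
}
```

Running `composer update` later pulls in any newer compatible release of those Amazon libraries.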

Step 6: Run the WP-CLI “s3 uploads” commands.

From their Github page:

Via WP-CLI use the following command to activate the S3 Uploads by Humanmade plugin:

wp plugin activate s3-uploads

NOTE: If you get the “Warning: The ‘S3-Uploads’ plugin could not be found.” error, this means that the plugin folder in your wp-content/plugins/ directory is likely using capital letters, such as “S3-Uploads”. If this is the case, simply retype the command in WP-CLI with matching capitalization, as follows:

wp plugin activate S3-Uploads

The next thing you should do is verify your setup. You can do this using the verify command like so:

wp s3-uploads verify

Step 7: Fix old uploads so they load from S3

The S3 Uploads by Humanmade plugin says that it’s backwards-compatible with files uploaded prior to the plugin’s existence; however, I could not get old media to load on old pages, blog posts, etc. The simplest fix I’ve discovered is to add the following to .htaccess just below “RewriteEngine On”, replacing “your_bucket_name” with your bucket:

# BEGIN Utilize S3 bucket rather than localhost for "uploads" folder
RewriteRule ^wp-content/uploads/(.*)$ https://your_bucket_name.s3.amazonaws.com/wp-content/uploads/$1 [R,L]
# END Utilize S3 bucket
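To see what that rule actually does, here is the mapping it performs, sketched in shell (the bucket URL is a placeholder – substitute your own bucket and region):

```shell
# The rewrite maps a local uploads path onto the equivalent S3 URL.
# "your_bucket_name" is a placeholder for your actual bucket.
local_path="wp-content/uploads/2020/01/photo.jpg"
s3_url="https://your_bucket_name.s3.amazonaws.com/${local_path}"
echo "$s3_url"
```

So any request for an old locally-hosted upload gets redirected to the same path inside the bucket, which works because the bucket constant above stores uploads under wp-content/.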

Step 8: Add Cloudflare CDN

Don’t do the following unless you have Tylenol handy, as the instructions are a work in progress – i.e. they will break your site! Stop! Grab a tea/coffee/beer – you’ve made it this far! Still want to proceed? Ah, okay *sigh*…go ahead and break your site using Cloudflare to manage your S3 URLs…

By default, S3 Uploads will use the canonical S3 URLs for referencing the uploads, i.e. https://[bucket name].s3.amazonaws.com/[file path]. If you want to use another URL to serve the images from (for instance, if you wish to use S3 as an origin for Cloudflare), you should define S3_UPLOADS_BUCKET_URL in your wp-config.php:

// Define the base bucket URL (without trailing slash)
define( 'S3_UPLOADS_BUCKET_URL', 'https://your.origin.url.example/path' );

In my case, I set S3_UPLOADS_BUCKET_URL to my Cloudflare-proxied URL and created a Cloudflare CNAME record pointing at the bucket’s S3 endpoint.

Are there easier ways to do this? Absolutely! Are there better ways to do this? Probably!

Toby Cryns

