The following how-to was acquired at great headache-induced cost: it took many full days of work for my feeble brain to comprehend all the moving parts. I’ve summarized my experiences with the S3 Uploads by HumanMade WordPress plugin in this “how to”, neat and tidy for your reading pleasure. First, some notes:
- Here’s the S3 Uploads by HumanMade plugin.
- Only use this plugin if you either want to use the WP-CLI for your S3 stuff or are a sadomasochist. If you fit into neither of those use cases, leave now, get a beer, and appreciate the headaches you’ve just avoided!
- While you can technically use this plugin without WP-CLI or SSH access on your server, support for non-SSH/WP-CLI setups seems very sloppy, and I don’t recommend it (I tried this route first and failed miserably!).
On the plus side, if you are willing to join me on this journey, you just might learn some things, as I did:
- Why would I ever use WP-CLI? How do I use it? Etc.
- How to do some relatively-complex stuff with SSH
- What are code dependencies, and how do I use them?
- How can I use the command line for fun and profit?
- How to upload to Amazon S3 (and related permissions craziness!)
Why S3 Uploads by HumanMade and not some other S3 plugin?
I chose this particular plugin for a number of boneheaded, well-meaning, and blissfully-walking-into-the-unknown reasons:
- I trust HumanMade.
- The idea of learning more about WP-CLI sparked my curiosity.
- My command line skills sit on the edge of serviceability, and I find some joy in getting stuff done on the command line.
- FWIW, the only S3 plugin with a large user base in the entire WordPress ecosystem seems to be WP Offload Media, but their pricing is ABSURD at scale (e.g. if you manage a handful of sites, you’ll easily land in the $200/year category, and it’s likely cheaper to simply buy more server space). They do offer a free version of their plugin, but it’s likely not a good fit for a professional WordPress person.
Okay, let’s start the “how to” for S3 Uploads by HumanMade WP plugin!
Step 1: Create a new S3 user.
The way I understand it, you should treat IAM users like you would WordPress users – one per site, unless for some reason you need more than one. The S3 Uploads plugin has a WP-CLI command for creating new IAM users, but I could not get it to work. So I went in and created a user the old-fashioned way via the IAM admin page. It’s rather straightforward – just make sure to copy/save your secret access key to a text file, as it will only be shown to you once!
This is a little confusing, so here are the steps you’ll complete in the Amazon User control panel (a.k.a. “IAM Manager” – and outlined below in detail):
- Create a new user.
- Create a new policy (a.k.a. “Permissions” – it’ll open a new window for this step).
- Add the policy you created in Step #2 to the user you created in Step #1 (in the original IAM window…confusing right?!).
- Download your IAM user policy for reference later (there will be a “download” button after you create the user).
Add the following permissions to the user, replacing “your_bucket_name” with the name of your Amazon S3 bucket:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObjectAcl",
"s3:GetObject",
"s3:PutBucketAcl",
"s3:ListBucket",
"s3:DeleteObject",
"s3:GetBucketAcl",
"s3:GetBucketLocation",
"s3:PutObjectAcl"
],
"Resource": [
"arn:aws:s3:::your_bucket_name",
"arn:aws:s3:::your_bucket_name/*"
]
}
]
}
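If you’d rather skip clicking around the IAM console, here’s a hedged sketch of the same setup from the command line. It writes the policy above to a file (substituting your bucket name) and sanity-checks the JSON; the bucket name “websitecom” and user name “my-s3-user” are placeholders – swap in your own. The AWS CLI attach steps are shown as comments since they require configured AWS credentials.

```shell
# Write the IAM policy above to policy.json with the bucket name filled in.
BUCKET="websitecom"   # placeholder - your bucket name here (no dots!)
cat > policy.json <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject", "s3:GetObjectAcl", "s3:GetObject",
        "s3:PutBucketAcl", "s3:ListBucket", "s3:DeleteObject",
        "s3:GetBucketAcl", "s3:GetBucketLocation", "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::${BUCKET}",
        "arn:aws:s3:::${BUCKET}/*"
      ]
    }
  ]
}
EOF

# Sanity-check that the file is valid JSON before handing it to AWS.
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"

# If you have the AWS CLI configured, the console steps above become:
#   aws iam create-user --user-name my-s3-user
#   aws iam put-user-policy --user-name my-s3-user \
#       --policy-name s3-uploads-policy --policy-document file://policy.json
#   aws iam create-access-key --user-name my-s3-user   # secret shown only once!
```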
Step 2: Create an S3 bucket for this website.
You’ll also need to create the S3 bucket via your Amazon S3 console (roughly speaking, a “bucket” is S3’s top-level folder). Make sure your bucket name does not have any dots in it, or else you’ll get an SSL error later (e.g. call your bucket “websitecom” rather than “website.com”). Here are the bucket permissions:
Step 2.X: Edit your S3 Bucket policies so media is public (e.g. visible on your website!)
https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteAccessPermissionsReqd.html
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "PublicReadGetObject",
"Effect": "Allow",
"Principal": "*",
"Action": [
"s3:GetObject"
],
"Resource": [
"arn:aws:s3:::your_bucket_name/*"
]
}
]
}
During your bucket setup process, select these options:
[Screenshots: Amazon bucket creation, steps 1–4]
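Since a dot in the bucket name breaks the wildcard SSL certificate on `*.s3.amazonaws.com` (as noted above), here’s a tiny illustrative shell check – not part of the plugin – that you can run on a candidate name before creating the bucket:

```shell
# Screen a proposed bucket name: dots break SSL on *.s3.amazonaws.com.
check_bucket_name() {
  case "$1" in
    *.*) echo "BAD: '$1' contains a dot - SSL will fail" ;;
    *)   echo "OK: '$1' is SSL-safe" ;;
  esac
}

check_bucket_name "website.com"   # → BAD: 'website.com' contains a dot - SSL will fail
check_bucket_name "websitecom"    # → OK: 'websitecom' is SSL-safe
```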
Step 3: Upload your photos to S3.
There are many ways to get files onto Amazon S3, and those uploading options are beyond the scope of this document (check out my article on migrating files from WordPress to Amazon S3). If you are stuck at this step, I recommend using Cyberduck, Filezilla, or a different S3-enabled FTP uploader of your choice to get your files to S3. In my case, I downloaded my entire WP “uploads” folder to my desktop and then uploaded it to S3 with Cyberduck. A friend told me they just go to the Amazon S3 console and upload using its GUI uploader.
To migrate your entire “uploads” directory to S3 using the Humanmade plugin, use the following WP-CLI command:
wp s3-uploads upload-directory /path/to/uploads/ uploads
So in my case, I typed:
wp s3-uploads upload-directory ./wp-content/uploads/ uploads
Or you can test a single file upload to ensure it goes to the right place with something like:
wp s3-uploads cp <from> <to>
So in my case, I typed:
wp s3-uploads cp ./test.txt s3://music.cryns.com/wp-content/uploads/test.txt
Step 4: Add the S3 Uploads by Humanmade plugin code to wp-config.php on your WP site.
I got a bit confused at this part, so I’m going to give lots of description. In your wp-config.php file, you’ll see these lines:
/* That's all, stop editing! Happy publishing. */
/** Absolute path to the WordPress directory. */
if ( ! defined( 'ABSPATH' ) ) {
define( 'ABSPATH', dirname( __FILE__ ) . '/' );
}
Immediately after that, put in the S3 Uploads plugin settings as follows:
/*
BEGIN S3 via https://github.com/humanmade/S3-Uploads
*/
require_once __DIR__ . '/vendor/autoload.php';
define( 'S3_UPLOADS_BUCKET', 'your_s3_bucket_name_here/wp-content' );
define( 'S3_UPLOADS_REGION', 'your_s3_region_here' ); // the s3 bucket region (excluding the rest of the URL) - Mine was 'us-east-1'
// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', 'your_s3_key' );
define( 'S3_UPLOADS_SECRET', 'your_s3_secret' );
// Or if using IAM instance profiles, you can use the instance's credentials: (NOTE: I commented this out)
//define( 'S3_UPLOADS_USE_INSTANCE_PROFILE', true );
define( 'S3_UPLOADS_AUTOENABLE', true );
/*
END S3 via https://github.com/humanmade/S3-Uploads
*/
After that, just leave the normal stuff in there, as follows:
/** Sets up WordPress vars and included files. */
require_once ABSPATH . 'wp-settings.php';
@include_once('/var/lib/sec/wp-settings.php'); // Added by SiteGround WordPress management system
Here’s what my code in wp-config.php looks like for tobycryns.com (bucket name is “tobycryns”):
/*
BEGIN S3 via https://github.com/humanmade/S3-Uploads
*/
require_once __DIR__ . '/vendor/autoload.php';
define( 'S3_UPLOADS_BUCKET', 'tobycryns/wp-content' );
define( 'S3_UPLOADS_REGION', 'us-east-2' ); // the s3 bucket region (excluding the rest of the URL)
//define( 'S3_UPLOADS_BUCKET_URL', 'https://cdn.tobycryns.com/wp-content' );
// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', 'my_key' );
define( 'S3_UPLOADS_SECRET', 'my_secret_key' );
define( 'S3_UPLOADS_AUTOENABLE', true );
/*
END S3 via https://github.com/humanmade/S3-Uploads
*/
Step 5: Upload and install the plugin dependencies using composer.
If you have never used Composer before, this part might be confusing, so I’ll try to explain a bit. This plugin relies on Amazon code libraries (the AWS SDK) that are maintained by other people, who are constantly patching that code to keep it up to date. Running the following command downloads the most recent release of those libraries to your server. So go ahead and SSH into your server’s command line, navigate to the web root (“public_html” in my case), run the following command, and a few seconds later you’ll have the most up-to-date versions of those Amazon scripts on your server, ready to rock!
composer require humanmade/s3-uploads
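If you’re curious what that command actually does: it creates (or updates) a composer.json in your web root with a require entry roughly like the following, then downloads the plugin and its AWS SDK dependencies into a `vendor/` folder – which is exactly why the wp-config.php snippet above loads `vendor/autoload.php`. The version constraint shown here is illustrative; Composer will resolve the latest stable release for you.

```json
{
    "require": {
        "humanmade/s3-uploads": "^3.0"
    }
}
```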
Step 6: Run the WP-CLI “s3 uploads” commands.
From their GitHub page:
Via WP-CLI use the following command to activate the S3 Uploads by Humanmade plugin:
wp plugin activate s3-uploads
NOTE: If you get the “Warning: The ‘S3-Uploads’ plugin could not be found.” error, it likely means that the plugin folder inside wp-content/plugins/ uses capital letters, such as “S3-Uploads”. If this is the case, simply retype the statement in WP-CLI using the capital letters as follows:
wp plugin activate S3-Uploads
The next thing that you should do is verify your setup. You can do this using the verify command like so:
wp s3-uploads verify
Step 7: Fix old uploads so they load from S3
The S3 Uploads by Humanmade plugin says that it’s backwards-compatible with files uploaded prior to the plugin’s installation; however, I could not get old media to load on old pages, blog posts, etc. The simplest fix I’ve discovered is to add the following to .htaccess just below “RewriteEngine On”:
# BEGIN Utilize S3 bucket rather than localhost for "uploads" folder - Via https://tinyurl.com/s3redirect
RewriteRule ^wp-content/uploads/(.*)$ https://themightymo.s3.amazonaws.com/wp-content/uploads/$1 [R,L]
# END Utilize S3 bucket
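To see what that RewriteRule is doing, here’s an illustrative simulation of the URL mapping using sed (Apache performs the real redirect; this just mimics the rewrite for the “themightymo” bucket used in the rule above):

```shell
# Mimic the .htaccess RewriteRule: map a local uploads path to its S3 URL.
echo "wp-content/uploads/2023/01/photo.jpg" |
  sed -E 's#^wp-content/uploads/(.*)$#https://themightymo.s3.amazonaws.com/wp-content/uploads/\1#'
# → https://themightymo.s3.amazonaws.com/wp-content/uploads/2023/01/photo.jpg
```

Once you’ve confirmed the redirect works, you may want to change the `[R,L]` flags to `[R=301,L]` so browsers and search engines treat it as a permanent redirect.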
Step 8: Add Cloudflare CDN
Don’t do the following unless you have Tylenol handy, as the instructions are a work in progress – i.e. they will break your site! Stop! Grab a tea/coffee/beer – you’ve made it this far! Still want to proceed? Ah, okay, *sigh*… go ahead and break your site using Cloudflare to manage your S3 URLs…
By default, S3 Uploads will use the canonical S3 URLs for referencing the uploads, i.e. [bucket name].s3.amazonaws.com/uploads/[file path]. If you want to use another URL to serve the images from (for instance, if you wish to use S3 as an origin for Cloudflare), you should define S3_UPLOADS_BUCKET_URL in your wp-config.php:
// Define the base bucket URL (without trailing slash)
define( 'S3_UPLOADS_BUCKET_URL', 'https://your.origin.url.example/path' );
For example, in my case, I defined this as:
// Define the base bucket URL (without trailing slash)
define( 'S3_UPLOADS_BUCKET_URL', 'https://tobycryns.s3.amazonaws.com/wp-content' );
And my Cloudflare CNAME record was:
cdn → cdn.tobycryns.s3.us-east-2.amazonaws.com
Miscellaneous
- The Jetpack plugin caused me some issues with its Photon CDN – I’m not sure exactly what the issue was, but disconnecting and reconnecting Jetpack solved it!
- Kinsta’s S3 Bucket documentation
- Pagely’s S3 User documentation
- Amazon’s WP Offload Media Lite WP plugin documentation
- Check out my free “How to Migrate WordPress to Amazon S3 on the CHEAP!” guide.
Conclusion
Are there easier ways to do this? Absolutely! Are there better ways to do this? Probably!
2 Comments
Do you know how to test this on aws localstack?
Hi, Lenc. Unfortunately, I do not know how to test this on an aws localstack.