Create and send custom Mailchimp email campaigns from WordPress

There are lots of Mailchimp + WordPress integrations out there already. But when I wanted to create a customized, automated daily email campaign that would be generated from WordPress content (beyond just a listing of recent posts) and sent to a Mailchimp list, I couldn’t find anything that did just that.

For a little while I used Mailchimp’s ability to read RSS feeds and generate campaigns from them, and I built a custom RSS feed on my WordPress site whose latest entry contained the customized content I wanted populated into the *|RSSITEM:CONTENT_FULL|* variable and sent out to my subscribers.

But this was unreliable. Sometimes the message wouldn’t go out because of caching issues on my end, there were mysterious days when Mailchimp didn’t seem to check the feed at all, and other quirks cropped up. Even working with Mailchimp support, I couldn’t get things to a stable state. I also didn’t like that Mailchimp’s system forced me to pick a single top-of-the-hour time each day when the feed would be checked; if for some reason that check was missed, I had to reconfigure the whole campaign to get the message out.

I needed a better way.

The resulting method that’s been working for almost a year now is to initiate and send the Mailchimp campaign directly from within WordPress. I get full control of scheduling, message generation and formatting. I can re-run the campaign send if I need to. And it has worked reliably every day. Here’s how I set it up (with inspiration from this blog post by Isabel Castillo).
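
The full walkthrough follows, but here’s a rough, hedged sketch of the core idea: Mailchimp’s v3 API lets WordPress create, populate, and send a campaign in three HTTP requests. This is not the post’s exact code; the API key constant, $list_id, $html_content and the other names are placeholders.

<?php
// Hedged sketch, not the post's exact code: create, populate, and send a
// Mailchimp campaign from WordPress using the v3 API and WordPress's own
// HTTP functions. MY_MAILCHIMP_API_KEY, $list_id and $html_content are placeholders.
$api_key = MY_MAILCHIMP_API_KEY; // e.g. 'abc123...-us6'
$dc      = substr( $api_key, strpos( $api_key, '-' ) + 1 ); // data center suffix, e.g. 'us6'
$base    = "https://{$dc}.api.mailchimp.com/3.0";
$args    = array(
	'headers' => array(
		'Authorization' => 'Basic ' . base64_encode( 'user:' . $api_key ),
		'Content-Type'  => 'application/json',
	),
);

// 1. Create the campaign (error handling omitted for brevity).
$create = wp_remote_post( "{$base}/campaigns", array_merge( $args, array(
	'body' => wp_json_encode( array(
		'type'       => 'regular',
		'recipients' => array( 'list_id' => $list_id ),
		'settings'   => array(
			'subject_line' => 'Daily update',
			'from_name'    => 'My Site',
			'reply_to'     => 'me@example.com',
		),
	) ),
) ) );
$campaign_id = json_decode( wp_remote_retrieve_body( $create ) )->id;

// 2. Set the campaign content, generated from WordPress however you like.
wp_remote_request( "{$base}/campaigns/{$campaign_id}/content", array_merge( $args, array(
	'method' => 'PUT',
	'body'   => wp_json_encode( array( 'html' => $html_content ) ),
) ) );

// 3. Send it.
wp_remote_post( "{$base}/campaigns/{$campaign_id}/actions/send", $args );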

Continue reading Create and send custom Mailchimp email campaigns from WordPress

Testing my WordPress plugins in preparation for WordPress core releases

A couple times per year, WordPress plugin authors and owners get an email like this one:

WordPress 5.6 is imminent! Are your plugins ready?

You’re receiving this email because you have commit access or ownership of existing, open plugins hosted on WordPress.org. The next release of WordPress, 5.6, is scheduled for 08 December 2020.

We would like you to take this time to review your existing plugins and ensure their ongoing compatibility with WordPress. Once you’ve done so, you can update the readme “Tested up to:” value to 5.6. This information provides peace of mind to users and helps encourage them to update WordPress.

Here are the current “Tested up to:” values for each of your plugins:

The message goes on from there to list the plugins I’m responsible for and some notes and details about what’s new in the upcoming WordPress release.

In case it’s not clear, this is an important moment because the authors of tens of thousands of WordPress plugins are being asked to help ensure that when the many millions of WordPress sites out there upgrade to the upcoming release, those sites continue to look and function as their users expect. It’s an impressive example of how the WordPress developer community works together in the background to help sustain and grow the larger WordPress ecosystem.

For authors of widely used plugins, by the time this email goes out their plugin may already be fully ready, especially if they’ve been following or even contributing to the development of the new WordPress core release. Some plugin authors rightly have an extensive automated test suite in place to confirm that every part of their plugin’s functionality works against the latest beta or release candidate before the new version comes out.

Authors and maintainers of smaller plugins (like me) may not have the same infrastructure set up, and instead need to perform some manual testing of our plugins to ensure they’re ready.

So, here are the steps I follow every major release cycle to make sure my plugins have been tested and are ready for the new version.
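
As a preview of the kind of step involved, WP-CLI makes it easy to point a disposable copy of a site at pre-release builds. This is a hedged example: the version string is illustrative and assumes the release candidate package has been published.

# On a disposable test site, never production:
$ wp plugin install wordpress-beta-tester --activate
# ...or jump directly to a specific pre-release build:
$ wp core update --version=5.6-RC1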

Continue reading Testing my WordPress plugins in preparation for WordPress core releases

SMTP relay through Fastmail from Postfix on macOS Mojave

When my Mac laptop tries to send me email — the output of a cron job, for example — by default it ends up in a local mailbox file that I never check. I want the mail to get to my regular email account, but I don’t want it to relay that message through whatever random ISP I might be connected to at any given time, or over the open internet. It’ll likely fail, it’s not secure, and there are better ways.

Instead, I relay all outgoing mail sent from macOS through my email provider, Fastmail (affiliate link). This especially makes sense since most of the email from my OS is going straight to my inbox hosted at Fastmail.

The notion of relaying email through a specific provider is built into the Postfix mail transport agent that comes with macOS, so in theory it’s not a big deal to set up. In reality, I’ve found it to be a somewhat fragile configuration that rarely survives a macOS upgrade or a switch to a new computer. So I’ve come to document it pretty heavily for my own reference. I recently went through the process again, so I thought I’d write it up here in case it’s helpful to others.

Most of these steps are derived from this nice compilation of steps that applies to macOS Sierra through Mojave.

  1. Edit /etc/postfix/main.cf and add these lines to the end:
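
     As a hedged example (verify the hostname and options against Fastmail’s current documentation before relying on them), a typical Fastmail relay configuration adds lines like these, with the credentials themselves going into the referenced sasl_passwd file in a later step:

    relayhost = [smtp.fastmail.com]:587
    smtp_sasl_auth_enable = yes
    smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
    smtp_sasl_security_options = noanonymous
    smtp_use_tls = yes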

Continue reading SMTP relay through Fastmail from Postfix on macOS Mojave

What are AWS hosting costs using Laravel Vapor?

When I was researching tools and services for launching a SaaS app, I was pretty clear that I wanted to use Laravel Vapor to manage the Amazon Web Services deployment. The main mystery about that decision was what it would actually cost to have a Vapor-managed deployment on AWS for the size of my application and my expected usage levels.

I found a few articles and blog posts about that topic (the most helpful was Cost & Performance optimization in Laravel Vapor) but, as is the case with AWS hosting in general, there was no clear formula that would lead me to a precise monthly hosting cost for a brand new web application.

In hopes that it helps someone else in a similar situation, I’d like to add one more data point to the mix. Here are some details about what it’s costing me (so far) to host my Laravel-powered application on AWS as managed by Vapor.

Vapor itself is $39/month. That cost doesn’t change no matter how many projects you run through Vapor, so your per-project cost can go down over time if you plan to launch more than one. Some people have raised eyebrows at this baseline cost, but to anyone who has ever had to manage their own hosting server infrastructure and worry about upgrades, security issues, configuration management, and so on, it feels like a great deal. I wrote more about that in my other post on launching a SaaS business, but this sentiment remains true:

It felt like the magical world of cloud hosting that was always promised but never quite delivered had finally become reality. Even a month later I’m still constantly amazed by it. Huge kudos to the Vapor team.

Now, on to AWS itself. I’m currently paying approximately $1.00 per day for AWS services, and the monthly bill ends up being about $33.00.

Continue reading What are AWS hosting costs using Laravel Vapor?

Getting alerts about outdated Time Machine backups

Our household uses Apple’s Time Machine backup system as one of our backup methods. Various Macs are set to run their backups at night (we use TimeMachineEditor to schedule backups so they don’t take up local bandwidth or computing resources during the day) and everything goes to a Synology NAS (basically following the steps in this great guide from 9to5Mac).

But, Time Machine doesn’t necessarily complete a successful backup for every machine every night. Maybe the Mac is turned off or traveling away from home. Maybe the backup was interrupted for some reason. Maybe the backup destination isn’t on or working correctly. Most of these conditions can self-correct within a day or two.

But, I wanted to have a way to be notified if any given Mac had not successfully completed a backup after a few days so that I could take a look. If, say, three days go by without a successful backup, I start to get nervous. I also didn’t want to rely on any given Mac’s human owner/user having to notice this issue and remember to tell me about it.

I decided to handle this by having a local script on each machine send a daily backup status to a centralized database, accomplished through a new endpoint on my custom webhook server. The webhook call stores the status update locally in a database table. Another script runs daily to make sure every machine is current, and sends a warning if anyone is running behind.
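
Here’s a hedged sketch of the reporting side; the webhook URL and field names are illustrative rather than my exact code:

#!/bin/bash
# Report this Mac's most recent Time Machine backup to a central webhook.
# tmutil needs the backup destination to be reachable; if it isn't, report "none".
LATEST=$(tmutil latestbackup 2>/dev/null | xargs basename 2>/dev/null)
curl -s "https://example.com/webhook/backup-status" \
  --data-urlencode "host=$(hostname -s)" \
  --data-urlencode "latest_backup=${LATEST:-none}"

The receiving endpoint just stores the hostname, the reported backup name and a timestamp; the daily checker then flags any machine whose last successful report is more than a few days old.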

Here are the details in case it helps anyone else.

Continue reading Getting alerts about outdated Time Machine backups

Personal banking needs an API

My washer and dryer can tell my smart watch when they are done washing and drying. A voice assistant in my kitchen can update my grocery list. Documents in selected folders on my laptop can be synced to data centers around the world in an instant.

These things are possible through the use of APIs, which most every user-facing service, tool, device and ecosystem out there these days seems to understand are an essential part of their offering. APIs give users, developers and partners a way to build new things on top of the thing you already offer. They give people flexibility to integrate a service, tool or device into their lives in a way that makes sense for them. APIs help encourage wide adoption and extensibility.

The industry that seems to be far behind in offering powerful APIs to end users? Personal banking, and related billing systems for utilities and credit card companies.

When I want to check the current balance on my personal checking account, I have to follow a multi-step process in a web browser or mobile app.

When I want to get the latest PDF copy of a bill from my mobile phone carrier, it’s something like 10 clicks across three different websites.

When I want to initiate a bill payment on a credit card provider that doesn’t support automatic drafting from my bank, it’s a similarly long process.

The other day a rep from a utility told me I had to call them to request a form be mailed to me so I could fill it out and mail it back to them, just in order to set up automatic payments from a bank account.

And when I want to be notified about certain kinds of activity from these institutions, I have to log in to each one to go through their proprietary grid of checkboxes and verification methods to set up push alerts or text messages…if they offer notifications at all.

Indeed, accessing and working with my personal financial information is one of the most cumbersome, high-friction, analog things I do anymore. Personal banking and bill payment feel like swimming in mud compared to the light speed of most of the rest of the information economy.

Why is there so much friction in personal banking and financial transactions?

Continue reading Personal banking needs an API

How long does it take between when a plugin update is released and when auto-updates install it on your WordPress site?

Auto-updates for WordPress themes and plugins were released this year in WordPress version 5.5. They allow WordPress site owners to opt in to having new versions of plugins and themes installed automatically when they are released, without any intervention from the site owner.

If you use auto-updates, one question might be on your mind:

How long will it take between when the author of a plugin releases a new version and when that new version is installed on your WordPress site?

This question is vital for site owners and managers. Especially in scenarios when new plugin or theme versions contain critical security fixes, time is of the essence to avoid possible unauthorized access to your WordPress site.

To get to the answer, let’s first review how plugin and theme releases happen.

The Plugin and Theme Release Process

When a plugin or theme author is ready to make an update to their software, they upload those changes to the directory on WordPress.org. This is where the code for their theme or plugin is hosted publicly.

Most theme and plugin authors also indicate the release of non-trivial changes by increasing the version number associated with their plugin. Maybe it’s a small “point release” like going from version 1.1 to version 1.2, or maybe it’s a major release like going from version 3.0 to version 4.0. The change in version number lets everyone know that there’s new functionality and fixes available. It’s a convenient way to refer to how software has changed over time.

Once the updated software and new version number are live on WordPress.org, they are immediately in effect for new installations of that plugin or theme. Anyone downloading and installing a plugin or theme from the directory will now be using the latest code made available by the author.

But what about existing sites that already have that theme or plugin installed? How do they learn about the new changes and new version?

How WordPress Sites Discover Updates

You might think it happens through a “push notification” sent to your site from WordPress.org. But the WordPress.org systems would have to contact thousands or maybe millions of sites to tell them about an update to a single plugin. That’s just not practical.
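
The short, hedged version (the details follow in the full post): updates work on a pull model, where each site periodically asks WordPress.org what’s new. Here’s a minimal sketch of poking at that machinery from a mu-plugin or WP-CLI’s wp eval-file, assuming standard core behavior:

<?php
// WordPress core schedules the wp_update_plugins cron event twice daily.
// Its callback, wp_update_plugins(), sends the site's installed plugin list
// to api.wordpress.org and caches the answer in the update_plugins transient.

// When is the next scheduled plugin update check on this site?
if ( $next = wp_next_scheduled( 'wp_update_plugins' ) ) {
	echo 'Next check: ' . date( 'r', $next ) . PHP_EOL;
}

// Force a check right now, then list plugins with pending updates.
wp_update_plugins();
$updates = get_site_transient( 'update_plugins' );
foreach ( (array) ( $updates->response ?? array() ) as $plugin_file => $data ) {
	echo $plugin_file . ' -> version ' . $data->new_version . PHP_EOL;
}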

Continue reading How long does it take between when a plugin update is released and when auto-updates install it on your WordPress site?

How I built and launched a SaaS app in one month

In June of this year, I had an idea for a software-as-a-service (SaaS) application that I wanted to build and launch. On July 1st, I committed the first lines of code to a git repo for the app. After about 200 hours of development time, on August 10th I launched the first version of the application into the world, ready for users and subscribers.

The service is called WP Lookout and you can read more about the service itself and why I built it on wplookout.com. This post is about the process, tools and services I used to build and launch a SaaS application in what felt like a relatively short period of time.

Here’s the short list of resources I used along the way if you just want to explore them without further commentary:

  • Laravel for the application framework
  • Laravel Homestead for my local development environment
  • PhpStorm for software development
  • Laravel Spark for scaffolding the SaaS subscription and account management
  • Stripe for subscription payment processing
  • Laravel Nova for an administrative dashboard
  • Git and GitHub for tracking code development, to-do items and feature branches
  • Upwork for hiring and paying a software code reviewer and consultant
  • Slack for coordinating with my contractor
  • TermsFeed for generating privacy policy and terms of service documents
  • Amazon Web Services for application hosting
  • Laravel Vapor for managing AWS setup and deployment
  • WordPress for building the WP Lookout marketing website
  • Matomo for privacy-focused analytics
  • HelpScout for managing user support interactions
  • MailerLite for handling new user onboarding and marketing automation

I’m not going to go in-depth on all of these tools, as some of them are pretty simple and self-evident in their value. Others are just magical and deserving of some additional observations. Here are some things that stood out and what I learned along the way:

Continue reading How I built and launched a SaaS app in one month

Cloning a WP Engine site to a local VVV development environment

When doing custom WordPress theme and plugin development for a site that’s already launched, I find it is essential to get my local development environment set up as close as possible to the live environment where the site is hosted. This minimizes headaches and unexpected problems that come when making any updated code live.

Here are the steps I use to clone the database, theme, plugin and media/uploads from a site hosted on WP Engine into a locally-hosted development environment powered by VVV. This method allows me to develop and test new functionality against recent site content before I deploy it to a staging environment for final testing.

The usual warning applies: you should make sure you know what each of these steps is actually doing and adjust them to fit your specific setup. Some of these actions could result in data loss, so please use them at your own risk.

Okay, here we go:

  1. Make sure any custom theme and plugin code you’ll be working with is under source control, probably using these instructions from WP Engine, and up to date locally. Make sure that those directories are not accidentally included in the .gitignore file for your repo. (I handle this by default ignoring everything in, say, the plugins directory with /wp-content/plugins/* and then adding exclusion rules for each directory I’m working on, e.g. !/wp-content/plugins/my-custom-plugin*; see the short .gitignore sketch after this list.)
  2. Initiate a backup checkpoint on the environment you want to clone. If you’re okay using the slightly out of date one that WP Engine automatically makes every morning, that’s fine.
  3. Initiate a download of the backup checkpoint, choosing the “partial” option and checking “Entire database,” “Plugins” and “Themes.” Leave “Media uploads” and “Everything else” unchecked.
  4. Use the WP Engine SSH gateway to rsync the media folder from the WP Engine environment to your local environment. Here’s an example command for a “mystaging” environment… you will want to try your version with the dry-run -n flag first to make sure it is going to do what you want!
    rsync -n -rlD -vz --size-only --delete mystaging:sites/mystaging/wp-content/uploads/ /Users/chris/vvv/www/mysite/public_html/wp-content/uploads/
    

    This should delete any local media files not in the WP Engine environment, and copy everything else down that isn’t already there. It could take a while if it’s the first time you’re running it, but subsequent runs should only grab new uploads to the site.

  5. Unzip the downloaded backup checkpoint on your local system.
  6. Move the mysql.sql file into a directory that will be accessible from your VVV Vagrant shell: mv ~/Downloads/wp-content/mysql.sql /Users/chris/vvv/www/mysite/
  7. Update your local plugins directory from the downloaded backup. Again, please use the -n flag with your version of this command first to preview the results:
    rsync -rlD -vz --size-only --delete --exclude=query-monitor ~/Downloads/wp-content/plugins/ /Users/chris/vvv/www/mysite/public_html/wp-content/plugins/
    

    Note that this command also excludes the query-monitor plugin from deletion, since I have it installed locally but not in the WP Engine environment. That way it stays ready to activate without reinstalling.

  8. Repeat the above step for the themes directory if need be.
  9. Resolve any discrepancies: things git flags as changes needing to be committed, missing files, etc. There shouldn’t be any, but it’s worth checking.
  10. SSH into your VVV box: vagrant ssh
  11. Make sure your WP dev site and the WP Engine site are running the same version of WordPress core.
  12. Go to the directory where the site lives, e.g. cd /srv/www/mysite/public_html/.
  13. Clear out the existing database (again, assuming you don’t have any locally staged changes that you would care about): wp db reset.
  14. Import the database from the WP Engine environment: wp db import ../mysql.sql
  15. Update any content or configuration references to the site’s hostname, noting that you may want to run this with the --dry-run flag first: wp search-replace '//mystaging.com' '//mysite.test' --precise --recurse-objects --all-tables --skip-columns=guid.
  16. If you need to put any of your plugins in local development mode, now’s the time, e.g. wp config set JETPACK_DEV_DEBUG true --raw. And, if you were using a local development plugin like Query Monitor, reactivate it: wp plugin install query-monitor --activate
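
As mentioned in step 1, the .gitignore approach of ignoring everything in a directory and then un-ignoring your own work can be as simple as this (plugin name illustrative):

# Ignore all plugins by default...
/wp-content/plugins/*
# ...except the ones under active development
!/wp-content/plugins/my-custom-plugin*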

That’s it! In only 16 steps you should have a fully up-to-date copy of your site’s data available for local development and testing. Obviously much of the above could be scripted (with appropriate safeguards) to save time and reduce human error in the process.
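
For example, here’s a hedged sketch of scripting just the database refresh (steps 12 through 15), using the same illustrative paths and hostnames as above:

#!/bin/bash
# Hypothetical helper for the database refresh; run inside the VVV box.
set -euo pipefail

SITE_DIR=/srv/www/mysite/public_html
SQL_FILE=/srv/www/mysite/mysql.sql

cd "$SITE_DIR"
wp db reset --yes       # destructive: wipes the local database
wp db import "$SQL_FILE"
wp search-replace '//mystaging.com' '//mysite.test' \
	--precise --recurse-objects --all-tables --skip-columns=guid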

I hope you find this helpful. If you have improvements to suggest or your own fun methods of cloning environments, please share in the comments.

Generate an RSS feed from a Twitter user timeline

I needed to generate a valid RSS feed from a Twitter user’s timeline, but only for tweets that matched a certain pattern. Here’s how I did it using PHP.

First, I added a dependency on the TwitterOAuth library by Abraham Williams:

$ composer require abraham/twitteroauth

This library will handle all of my communication and authentication with Twitter’s API. Then I created a read-only app in my Twitter account and securely noted the four key authentication items I would need: the consumer API key and secret, and the access token and secret.

Now, I can quickly bring recent tweets from my target Twitter user into a PHP variable:

require "/path/to/vendor/autoload.php";
use Abraham\TwitterOAuth\TwitterOAuth;

$consumerKey       = "your_key_goes_here"; // Consumer Key
$consumerSecret    = "your_secret_goes_here"; // Consumer Secret
$accessToken       = "your_token_goes_here"; // Access Token
$accessTokenSecret = "your_token_secret_goes_here"; // Access Token Secret

$twitter_username    = 'wearrrichmond';

$connection = new TwitterOAuth( $consumerKey, $consumerSecret, $accessToken, $accessTokenSecret );

// Get the 10 most recent tweets from our target user, excluding replies and retweets
$statuses = $connection->get(
	'statuses/user_timeline',
	array(
		'count'           => 10,
		'exclude_replies' => true,
		'include_rts'     => false,
		'screen_name'     => $twitter_username,
		'tweet_mode'      => 'extended',
	)
);

My specific use case is that my local public school system doesn’t publish an RSS feed of news updates on its website, but it does tweet those updates with a fairly standard pattern: the headline of the announcement, possibly followed by an at-mention and/or image, and then a link back to a PDF file in a certain directory on their website. I wanted to capture these items for use on another site I created to aggregate local news headlines into one place, which mostly relies on the presence of RSS feeds.

So, I only want to use the tweets that match this pattern in the custom RSS feed. Here’s that snippet:

// For each tweet returned by the API, loop through them
foreach ( $statuses as $tweet ) {

	$permalink = '';
	$title     = '';

	// We only want tweets with URLs
	if ( ! empty( $tweet->entities->urls ) ) {

		// Look for a usable permalink that matches our desired URL pattern, and use the last (or maybe only) one
		foreach ( $tweet->entities->urls as $url ) {
			if ( false !== strpos( $url->expanded_url, 'rcs.k12.in.us/files' ) ) {
				$permalink = $url->expanded_url;
			}
		}

		// If we got a usable permalink, go ahead and fill out the rest of the RSS item
		if ( ! empty( $permalink ) ) {

			// Set the title value from the Tweet text
			$title = $tweet->full_text;

			// Remove links
			$title = preg_replace( '/\bhttp.*\b/', '', $title );

			// Remove at-mentions
			$title = preg_replace( '/\@\w+\b/', '', $title );

			// Remove whitespace at beginning and end
			$title = trim( $title );

			// TODO: Add the item to the feed here

		}
	}
}

Now we have just the tweets we want, ready to add to an RSS feed. We can use PHP’s built-in SimpleXMLElement class to build the feed. In my case, the final output is written to a file on disk, which another part of my workflow can then query.

Here’s the final result all put together:

<?php

/**
 * Generate an RSS feed from a Twitter user's timeline
 * Chris Hardie <chris@chrishardie.com>
 */

require "/path/to/vendor/autoload.php";
use Abraham\TwitterOAuth\TwitterOAuth;

$consumerKey       = "your_key_goes_here"; // Consumer Key
$consumerSecret    = "your_secret_goes_here"; // Consumer Secret
$accessToken       = "your_token_goes_here"; // Access Token
$accessTokenSecret = "your_token_secret_goes_here"; // Access Token Secret

$twitter_username    = 'wearrrichmond';
$rss_output_filename = '/path/to/www/rcs-twitter.rss';

$connection = new TwitterOAuth( $consumerKey, $consumerSecret, $accessToken, $accessTokenSecret );

// Get the 10 most recent tweets from our target user, excluding replies and retweets
$statuses = $connection->get(
	'statuses/user_timeline',
	array(
		'count'           => 10,
		'exclude_replies' => true,
		'include_rts'     => false,
		'screen_name'     => $twitter_username,
		'tweet_mode'      => 'extended',
	)
);

$xml = new SimpleXMLElement( '<rss/>' );
$xml->addAttribute( 'version', '2.0' );
$channel = $xml->addChild( 'channel' );

$channel->addChild( 'title', 'Richmond Community Schools' );
$channel->addChild( 'link', 'http://www.rcs.k12.in.us/' );
$channel->addChild( 'description', 'Richmond Community Schools' );
$channel->addChild( 'language', 'en-us' );

// For each tweet returned by the API, loop through them
foreach ( $statuses as $tweet ) {

	$permalink = '';
	$title     = '';

	// We only want tweets with URLs
	if ( ! empty( $tweet->entities->urls ) ) {

		// Look for a usable permalink that matches our desired URL pattern, and use the last (or maybe only) one
		foreach ( $tweet->entities->urls as $url ) {
			if ( false !== strpos( $url->expanded_url, 'rcs.k12.in.us/files' ) ) {
				$permalink = $url->expanded_url;
			}
		}

		// If we got a usable permalink, go ahead and fill out the rest of the RSS item
		if ( ! empty( $permalink ) ) {

			// Set the title value from the Tweet text
			$title = $tweet->full_text;

			// Remove links
			$title = preg_replace( '/\bhttp.*\b/', '', $title );

			// Remove at-mentions
			$title = preg_replace( '/\@\w+\b/', '', $title );

			// Remove whitespace at beginning and end
			$title = trim( $title );

			$item = $channel->addChild( 'item' );
			$item->addChild( 'link', $permalink );
			$item->addChild( 'pubDate', date( 'r', strtotime( $tweet->created_at ) ) );
			$item->addChild( 'title', $title );

			// For the description, include both the original Tweet text and a full link to the Tweet itself
			$item->addChild( 'description', $tweet->full_text . PHP_EOL . 'https://twitter.com/' . $twitter_username . '/status/' . $tweet->id_str . PHP_EOL );

		}
	}
}

$rss_file = fopen( $rss_output_filename, 'w' ) or die( "Unable to open $rss_output_filename!" );
fwrite( $rss_file, $xml->asXML() );
fclose( $rss_file );

exit;

Here’s the same thing as a gist on GitHub.

I set this script up to run via a cron job every hour, which gives me a regularly updated feed based on the Twitter account’s activity.
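
For reference, the crontab entry looks something like this (the script path and PHP binary location are illustrative):

0 * * * * /usr/bin/php /path/to/generate-twitter-rss.php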

Several ways this could be improved include:

  • Better escaping and sanitizing of the data that comes back from Twitter
  • Making the filtering of the Tweets more tolerant of changes in the target user’s Tweet structure
  • Genericizing the functionality to support querying multiple Twitter accounts and generating multiple corresponding output feeds
  • Fixing Twitter so that RSS feeds of user timelines are offered on the platform again

If you find this helpful or have a variation on this concept that you use, let me know in the comments!

Updated July 14, 2020 to add the use of “tweet_mode = extended” in the API connection and to replace the references to tweet text with “full_text”, since the Twitter API does not default to the longer 280-character version of tweets.