Tools and tech we’re using to publish a print, online newspaper

Wow, it’s been over a month since I took ownership of a print and online newspaper here in my community. There’s a lot to say about that experience and what I’ve learned so far. In this post, I’ll focus on the tools and technology we’re using to operate this business. Some of these were in place before I came in; some are new in the last month.

I’m sharing this because (a) I generally enjoy the topic of if/how tools can make life and business easier, and (b) I hope it could be useful to someone else publishing a newspaper or building a media organization.

Print Layout and Design

It’s Adobe Creative Cloud all the way, for better or worse: InDesign for newspaper layout, Photoshop for image editing. Given the way our staff is set up and how our weekly newspaper production process works, almost everyone touches the newspaper pages at some point, so the monthly license costs to cover all of that are somewhat ouch. If there were a viable alternative to InDesign, we’d probably switch to it.

Issue and Story Budget Planning

We’re using an Airtable base to record story ideas and plan our upcoming issues: it tracks which articles are going where, what state they’re in, and all the associated data that goes with them, such as photos, source info, and internal notes. It’s pretty great, and the real-time collaboration it makes possible is hard to beat. I think down the road we may move toward a custom Laravel-powered solution that allows for tighter integration of all of our business operations, but that’s a ways off.

Phone System

We’re using a self-hosted FreePBX (Asterisk) installation with the Sysadmin Pro and EndPoint Manager paid add-on modules. DigitalOcean had a 1-click installer in their marketplace that made it super fast to get going. We’re using VoIP.ms for our trunk lines, and they made porting in our DIDs very easy.

Having used Asterisk in a previous business, I was already familiar with its architecture and features, but FreePBX meant I could configure everything via a web interface instead of editing dialplan files – amazing. We have extensions, queues, interactive voice menus, voicemail speech-to-text transcription (using this tool) and more, and it sets up a nice foundation for future integration with other tools like our CRM data.

We’re using Yealink T31P and T33G VoIP phones, and so far CounterPath’s Bria Mobile has been the most compatible and feature-complete softphone for iOS that I’ve found.


Define, fire and listen for custom Laravel model events within a trait

It took me some time to figure out the right way to define, fire and listen for custom events for a Laravel model using a trait, so I’m writing down how I did it in case it’s helpful to others.

Let’s say you have a model Post that you set up as having a “draft” status by default, but eventually will have a status of “publish”. Let’s also say you want to make the act of publishing a post a custom model event that can be listened for in addition to the standard model events like “created” or “updated”. And let’s say you want to do all of this using a trait so that you can apply the same logic to another model in the future, such as comments on the post, without repeating yourself.

Here’s what my Post model might look like:

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Post extends Model
{
    //
}

Let’s create a Publishable trait that can be applied to this model:

<?php

namespace App\Models\Traits;

trait Publishable
{
    // Add to the list of observable events on any publishable model
    public function initializePublishable()
    {
        $this->addObservableEvents([
            'publishing',
            'published',
        ]);
    }

    // Create a publish method that we'll use to transition the 
    // status of any publishable model, and fire off the before/after events
    public function publish()
    {
        if (false === $this->fireModelEvent('publishing')) {
            return false;
        }
        $this->forceFill(['status' => 'publish'])->save();
        $this->fireModelEvent('published');
    }

    // Register the existence of the publishing model event
    public static function publishing($callback)
    {
        static::registerModelEvent('publishing', $callback);
    }

    // Register the existence of the published model event
    public static function published($callback)
    {
        static::registerModelEvent('published', $callback);
    }
}

This new trait can now be applied to the Post model:
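A minimal sketch of what that might look like (the listener closures here are illustrative; in a real app they would typically be registered in a service provider’s boot method):

```php
<?php

namespace App\Models;

use App\Models\Traits\Publishable;
use Illuminate\Database\Eloquent\Model;

class Post extends Model
{
    use Publishable;
}
```

Elsewhere, any code can hook into the new events, and a `publishing` listener can veto the transition by returning false:

```php
use App\Models\Post;

Post::publishing(function ($post) {
    // Returning false here cancels the publish() call
});

Post::published(function ($post) {
    // React to the post having been published
});
```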


Getting alerts about outdated Time Machine backups

Our household uses Apple’s Time Machine backup system as one of our backup methods. Various Macs are set to run their backups at night (we use TimeMachineEditor to schedule backups so they don’t take up local bandwidth or computing resources during the day) and everything goes to a Synology NAS (basically following the steps in this great guide from 9to5Mac).

But, Time Machine doesn’t necessarily complete a successful backup for every machine every night. Maybe the Mac is turned off or traveling away from home. Maybe the backup was interrupted for some reason. Maybe the backup destination isn’t on or working correctly. Most of these conditions can self-correct within a day or two.

But, I wanted to have a way to be notified if any given Mac had not successfully completed a backup after a few days so that I could take a look. If, say, three days go by without a successful backup, I start to get nervous. I also didn’t want to rely on any given Mac’s human owner/user having to notice this issue and remember to tell me about it.

I decided to handle this by having a local script on each machine send a daily backup status to a centralized database, accomplished through a new endpoint on my custom webhook server. The webhook call stores the status update locally in a database table. Another script runs daily to make sure every machine is current, and sends a warning if anyone is running behind.

Here are the details in case it helps anyone else.
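As a rough sketch of the server-side piece, the daily freshness check might look something like this (the machine name, threshold, and example timestamp are all placeholders; in my actual setup the last-success times come out of the database table the webhook writes to):

```shell
#!/bin/bash

# Warn when a machine's last successful backup is too old.
# The threshold and the example call below are placeholders.
MAX_AGE_DAYS=3

check_backup_age() {
  local machine="$1"
  local last_success_epoch="$2"
  local now days_behind
  now=$(date +%s)
  days_behind=$(( (now - last_success_epoch) / 86400 ))
  if [ "$days_behind" -ge "$MAX_AGE_DAYS" ]; then
    echo "WARNING: $machine last backed up $days_behind days ago"
  else
    echo "OK: $machine backed up within the last $MAX_AGE_DAYS days"
  fi
}

# Example: a machine whose last successful backup was five days ago
check_backup_age "living-room-imac" "$(( $(date +%s) - 5 * 86400 ))"
```

The real version loops over every machine in the status table, and any WARNING lines trigger the actual notification.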


Generate an RSS feed from a Twitter user timeline

I needed to generate a valid RSS feed from a Twitter user’s timeline, but only for tweets that matched a certain pattern. Here’s how I did it using PHP.

First, I added the dependency on the TwitterOAuth library by Abraham Williams:

$ composer require abraham/twitteroauth

This library will handle all of my communication and authentication with Twitter’s API. Then I created a read-only app in my Twitter account and securely noted the four authentication values I would need: the consumer API key and secret, and the access token and secret.

Now, I can quickly bring recent tweets from my target Twitter user into a PHP variable:

require "/path/to/vendor/autoload.php";
use Abraham\TwitterOAuth\TwitterOAuth;

$consumerKey       = "your_key_goes_here"; // Consumer Key
$consumerSecret    = "your_secret_goes_here"; // Consumer Secret
$accessToken       = "your_token_goes_here"; // Access Token
$accessTokenSecret = "your_token_secret_goes_here"; // Access Token Secret

$twitter_username    = 'wearrrichmond';

$connection = new TwitterOAuth( $consumerKey, $consumerSecret, $accessToken, $accessTokenSecret );

// Get the 10 most recent tweets from our target user, excluding replies and retweets
$statuses = $connection->get(
	'statuses/user_timeline',
	array(
		"count" => 10,
		"exclude_replies" => true,
		'include_rts' => false,
		'screen_name' => $twitter_username,
		'tweet_mode' => 'extended',
	)
);

My specific use case is that my local public school system doesn’t publish an RSS feed of news updates on its website, but it does tweet those updates in a fairly standard pattern: the headline of the announcement, possibly followed by an at-mention and/or image, plus a link back to a PDF file that lives in a certain directory on their website. I wanted to capture these items for use on another site I created to aggregate local news headlines into one place, and that site mostly relies on the presence of RSS feeds.

So, I only want to use the tweets that match this pattern in the custom RSS feed. Here’s that snippet:

// Loop through each tweet returned by the API
foreach ( $statuses as $tweet ) {

	$permalink = '';
	$title     = '';

	// We only want tweets with URLs
	if ( ! empty( $tweet->entities->urls ) ) {

		// Look for a usable permalink that matches our desired URL pattern, and use the last (or maybe only) one
		foreach ( $tweet->entities->urls as $url ) {

			if ( false !== strpos( $url->expanded_url, 'rcs.k12.in.us/files', 0 ) ) {
				$permalink = $url->expanded_url;

			}
		}

		// If we got a usable permalink, go ahead and fill out the rest of the RSS item
		if ( ! empty( $permalink ) ) {

			// Set the title value from the Tweet text
			$title = $tweet->full_text;

			// Remove links
			$title = preg_replace( '/\bhttp.*\b/', '', $title );

			// Remove at-mentions
			$title = preg_replace( '/\@\w+\b/', '', $title );

			// Remove whitespace at beginning and end
			$title = trim( $title );

			// TODO: Add the item to the feed here

		}
	}
}

Now we have just the tweets we want, ready to add to an RSS feed. We can use PHP’s built-in SimpleXML extension to build it. In my case, the final output is written to an output file, which another part of my workflow can then query.

Here’s the final result all put together:

<?php

/**
 * Generate an RSS feed from a Twitter user's timeline
 * Chris Hardie <chris@chrishardie.com>
 */

require "/path/to/vendor/autoload.php";
use Abraham\TwitterOAuth\TwitterOAuth;

$consumerKey       = "your_key_goes_here"; // Consumer Key
$consumerSecret    = "your_secret_goes_here"; // Consumer Secret
$accessToken       = "your_token_goes_here"; // Access Token
$accessTokenSecret = "your_token_secret_goes_here"; // Access Token Secret

$twitter_username    = 'wearrrichmond';
$rss_output_filename = '/path/to/www/rcs-twitter.rss';

$connection = new TwitterOAuth( $consumerKey, $consumerSecret, $accessToken, $accessTokenSecret );

// Get the 10 most recent tweets from our target user, excluding replies and retweets
$statuses = $connection->get(
	'statuses/user_timeline',
	array(
		"count" => 10,
		"exclude_replies" => true,
		'include_rts' => false,
		'screen_name' => $twitter_username,
		'tweet_mode' => 'extended',
	)
);

$xml = new SimpleXMLElement( '<rss/>' );
$xml->addAttribute( 'version', '2.0' );
$channel = $xml->addChild( 'channel' );

$channel->addChild( 'title', 'Richmond Community Schools' );
$channel->addChild( 'link', 'http://www.rcs.k12.in.us/' );
$channel->addChild( 'description', 'Richmond Community Schools' );
$channel->addChild( 'language', 'en-us' );

// Loop through each tweet returned by the API
foreach ( $statuses as $tweet ) {

	$permalink = '';
	$title     = '';

	// We only want tweets with URLs
	if ( ! empty( $tweet->entities->urls ) ) {

		// Look for a usable permalink that matches our desired URL pattern, and use the last (or maybe only) one
		foreach ( $tweet->entities->urls as $url ) {

			if ( false !== strpos( $url->expanded_url, 'rcs.k12.in.us/files', 0 ) ) {
				$permalink = $url->expanded_url;

			}
		}

		// If we got a usable permalink, go ahead and fill out the rest of the RSS item
		if ( ! empty( $permalink ) ) {

			// Set the title value from the Tweet text
			$title = $tweet->full_text;

			// Remove links
			$title = preg_replace( '/\bhttp.*\b/', '', $title );

			// Remove at-mentions
			$title = preg_replace( '/\@\w+\b/', '', $title );

			// Remove whitespace at beginning and end
			$title = trim( $title );

			$item = $channel->addChild( 'item' );
			$item->addChild( 'link', $permalink );
			$item->addChild( 'pubDate', date( 'r', strtotime( $tweet->created_at ) ) );
			$item->addChild( 'title', $title );

			// For the description, include both the original Tweet text and a full link to the Tweet itself
			$item->addChild( 'description', $tweet->full_text . PHP_EOL . 'https://twitter.com/' . $twitter_username . '/status/' . $tweet->id_str . PHP_EOL );

		}
	}
}

$rss_file = fopen( $rss_output_filename, 'w' ) or die( "Unable to open $rss_output_filename!" );
fwrite( $rss_file, $xml->asXML() );
fclose( $rss_file );

exit;

Here’s the same thing as a gist on GitHub.

I set this script up to run via cronjob every hour, which gives me a regularly updated feed based on the Twitter account’s activity.
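That crontab entry looks something like this (the PHP binary location and the script path are placeholders for wherever you’ve put the script):

```
0 * * * * /usr/bin/php /path/to/twitter-rss-feed.php
```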

Several ways this could be improved include:

  • Better escaping and sanitizing of the data that comes back from Twitter
  • Make the filtering of the Tweets more tolerant to changes in the target user’s Tweet structure
  • Genericizing the functionality to support querying multiple Twitter accounts and generating multiple corresponding output feeds
  • Fixing Twitter so that RSS feeds of user timelines are offered on the platform again

If you find this helpful or have a variation on this concept that you use, let me know in the comments!

Updated July 14, 2020 to add the use of “tweet_mode = extended” in the API connection and to replace the references to tweet text with “full_text”, as apparently the Twitter API does not return the full 280-character version of tweets by default.

Generic ‘send to Slack’ shell script

On any given server I maintain, I like to set up a generic “send a message to Slack” shell script that can be called from any other tool or service running on that machine. With it, I can log information of interest to a Slack channel for later reading or possible action.

Here’s what send-to-slack.sh usually looks like:

#!/bin/bash -e

message=$1

[ ! -z "$message" ] && curl -X POST -H 'Content-type: application/json' --data "{
              \"text\": \"${message}\"
      }" https://hooks.slack.com/services/12345/67890/abcdefghijklmnop

That last line includes the “incoming webhook URL” provided by Slack when you set up incoming webhooks, a feature included even in their most basic/free tier.

Running the script and sending a message to the channel is as simple as $ bash send-to-slack.sh 'My message goes here', and the message shows up in the channel just as you’d expect.

Once that’s in place and tested, I can call the script from wherever I want on that server. Other shell scripts. Custom WordPress functions. Cron jobs. And so on.

There are many other ways this could be customized or extended. It’s worth noting that this is not necessarily a fully secure way to do things if you have untrusted users who can control the input to the script and the message that gets output…please remember to sanitize your inputs and escape your outputs!
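On that note, here’s a sketch of one such improvement: building the JSON payload with jq instead of interpolating the message straight into a JSON string, so quotes or backslashes in the input can’t break (or inject into) the payload. This assumes jq is installed, and reuses the same placeholder webhook URL as above:

```shell
#!/bin/bash

# Variation on send-to-slack.sh that lets jq do the JSON escaping.
# Assumes jq is installed; the webhook URL is a placeholder.
webhook_url="https://hooks.slack.com/services/12345/67890/abcdefghijklmnop"

send_to_slack() {
  local message="$1"
  local payload
  [ -n "$message" ] || return 0
  # jq --arg safely escapes quotes, backslashes, newlines, etc.
  payload=$(jq -cn --arg text "$message" '{text: $text}')
  curl -X POST -H 'Content-type: application/json' \
      --data "$payload" "$webhook_url"
}

send_to_slack "${1:-}"
```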


Put all those email newsletters in an RSS feed

The other day someone told me that they think blogging is dead.

I tried to suppress the sounds of existential pain emanating from deep within my soul, but it still hurt.

Blogging is far from dead, but I also recognize that email newsletters are all the hotness right now when it comes to getting your written thoughts in front of someone else. And I recognize that if you want to follow some kinds of updates from some kinds of people or organizations, you’re going to have to do the email thing.

For a while, I used email filters to manage this issue, dutifully creating or updating them in my setup each time I cringe-fully subscribed to a new email newsletter list after searching in vain for an RSS feed subscribe button. Then I would let them all go into just the right email folder (or maybe even still my inbox) so I could read them when I was in the mood to read blog-posts-as-email-messages on a given subject.

Ugh.

I didn’t like that this approach still created a kind of additional email “to do” burden on me, leaving me with folders to sort, search through and clear out. Newsletter content is usually not actionable or time-sensitive. What I really wanted was to treat all those email newsletter messages like blog post headlines in a separate kind of reader app, available to be read at my leisure. YOU KNOW, LIKE AN RSS FEED READER.

So here’s my current setup:

  1. Newsletter emails go to a dedicated email alias configured at my mail provider; that’s the address I use to subscribe to lists.
  2. Those messages are forwarded to a Zapier-powered recipe that converts them into items on a custom-generated RSS feed.
  3. I subscribe to that RSS feed in my feed reader, Feedly.

Now I can browse the headlines when I want to, read some items and gloss over others, and my email inbox is no longer crowded with articles that aren’t necessarily actionable or time-sensitive for me.

A few things I could do to tweak this setup further:

  • Right now all of my email newsletters go into a single RSS feed. For better categorization and readability, I could break these out into individual feeds.
  • HTML-only emails (another annoying thing about the email newsletter age) don’t always translate well into the RSS feed format as supported by Zapier. I haven’t really explored a fix for this, but it hasn’t affected me much so far.

Also note that Zapier’s pricing structure is such that depending on the number of incoming messages you have, you might need to upgrade to a paid plan.

That’s it. My email inbox has benefitted greatly from this setup, and I hope yours will too.