SEO Tools for Automation and Scale That I Built During the Pandemic

I’ve spent my work-from-home time during the 2020-2021 pandemic developing tools that automate or scale SEO tasks and processes. By starting each day early and reclaiming the hours I would otherwise have spent getting ready, commuting and managing other responsibilities, I was able to build new scaled and automated SEO processes and solutions.

Right before the pandemic, I moved from Drum Agency (formerly BKV) to 360i. This meant going from managing half a dozen SEO clients and a small SEO staff to managing a smaller number of clients within a much larger, more fully staffed SEO practice, taking on only the leadership responsibilities I really love: SEO product development, marketing, evangelizing and a little mentoring on the side. This freed up time for me to dig deeper into client work and build solutions that I didn’t have time for in the past.

One challenge of working in an SEO agency is that you must spread your attention and efforts across multiple clients and limit the scope of work, while knowing you could dive deeper. SEO automation is one way to extend your scope within limited hours.

Like many other people, I also spent 2020 on a variety of self-taught DIY home improvement and repair projects. I got ambitious and dove into new areas of plumbing, electrical and exterior work, often climbing a tall ladder on a two-story house (ignoring any fear of heights) or exploring the crawlspace beneath it. Projects included applying an exterior brick mortar wash, installing a dishwasher, repairing a bathroom faucet, installing window box planters on second-floor windows, disassembling and repairing a skylight, sawing apart old plumbing to replace a valve, replacing the guts of a toilet, repairing a leaking bathtub, repairing motion lights hanging off the second-floor roof, and other less impressive-sounding projects.

But the focus of this article is the SEO automation tools that sprang up during 2020. Some of this work was inspired by Hamlet Batista, whom we sadly lost during the pandemic. He inspired me to begin learning Python; however, these projects used Knime, Data Studio, Excel, Google Sheets, Screaming Frog, Node.js with Lighthouse-Batch, Windows Task Scheduler, .bat files and whatever data sources each task called for.

Content Mapping at Scale

Project: Perform Content Journey Stage Audit of Client and Competitors

Identify the journey stage of all content across multiple websites for a competitive view of performance and gaps at each stage: awareness, engagement, consideration, decision and retention.

  • Problem: You may be tempted to look at URL directories and assume everything in a given directory falls into a particular journey stage. That works to an extent for some page types, like products, services, about and contact pages. However, resource areas and blogs are often the largest sections of a site, and they contain a mix of stages, especially awareness, engagement and consideration. Within a limited budget, it is not feasible to manually review and bucket each article into the correct journey stage.
  • Solution: Use Screaming Frog custom search to identify text and calls-to-action that signal a particular journey stage. This process was specific to each website: I had to manually review a sample of pages, blog posts and resources to identify phrases, such as calls-to-action, that were indicative of a particular journey stage. Then, in Excel, I reviewed the pages and QA’ed where they fell in the journey. Finally, an audit of content types lets us view audience, stage and content type together to see which types perform best and where the gaps are. A sketch of the same classification logic appears after the figure notes below.
Screaming Frog custom search settings, using regex in the content area, to identify and categorize pages into customer journey stages.
Methodology of defining where content falls in the customer journey.
Content volume, gaps and performance by audience, journey stage and content type (how-to, FAQ, infographic, listicle, case study, white paper, etc.) were among the final outputs used to inform the content roadmap.
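
For anyone reproducing the bucketing step outside of Screaming Frog, here is a minimal Python sketch of the same idea, assuming a crawl export CSV with Address and Content columns (the file and column names are placeholders, not Screaming Frog defaults). The signal phrases are illustrative only; each site needed its own list, found through the manual review described above.

    import csv
    import re

    # Illustrative signal phrases per stage; each site needs its own list,
    # discovered through manual review of pages, blog posts and resources.
    STAGE_PATTERNS = {
        "awareness": re.compile(r"what is|beginner guide|learn the basics", re.I),
        "engagement": re.compile(r"subscribe|follow us|related articles", re.I),
        "consideration": re.compile(r"compare|case study|pricing", re.I),
        "decision": re.compile(r"request a demo|free trial|contact sales|buy now", re.I),
        "retention": re.compile(r"support center|manage your account", re.I),
    }

    def classify(text):
        """Return the first stage whose signal pattern matches, else 'unmapped'."""
        for stage, pattern in STAGE_PATTERNS.items():
            if pattern.search(text):
                return stage
        return "unmapped"

    # internal_html.csv stands in for a crawl export with "Address" (URL)
    # and "Content" (extracted page text) columns.
    with open("internal_html.csv", newline="", encoding="utf-8") as src, \
         open("journey_stages.csv", "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(["url", "journey_stage"])
        for row in csv.DictReader(src):
            writer.writerow([row["Address"], classify(row.get("Content", ""))])

Screaming Frog applies the patterns at crawl time; a standalone script like this is handy for re-bucketing pages when the phrase lists change, without re-crawling the site.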

Automated Core Web Vitals Reporting

Project: Capture Lighthouse Core Web Vitals and Page Speed Reports in Weekly Batches

Run weekly automated Lighthouse reports on a sample of pages representing the main website templates in order to diagnose issues, identify areas of focus, track improvements and monitor for new problems.

  • Problem: Recurring reporting on Page Speed and Core Web Vitals across batches of pages was taking up valuable time.
  • Solution: I began by running Lighthouse-Batch reports using Node.js at the command prompt, with my configuration saved in a .bat file, including where to find the seed list of URLs and where to output the resulting JSON files. I then used Windows Task Scheduler to run the .bat file on a weekly basis. Lastly, I used Knime to read the JSON files, transform the data and save it to Google Sheets, where it powers my Data Studio report (a Python sketch of that transform appears after the figure notes below). The only problem was that the process ran on my own computer. Shout out to Tyson Nakayama for adapting the process to an Amazon Workspace, using the official Lighthouse API and saving the data to BigQuery in a dentsu-managed data lake. Tyson and I also worked together to create reports that show which scripts are causing problems site-wide, not just at the page level.
The original process was automated by running .bat files with Windows Task Scheduler on my local machine.
This is one of my earlier Knime workflows for pulling JSON data into Google Sheets for a Data Studio report. A .bat file, similar to the one above, would trigger the Knime workflow to run after Lighthouse-Batch had time to complete.
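
For readers replicating the Knime step in code, here is a minimal Python sketch of the same transform, assuming Lighthouse-Batch's default output folder and a recent Lighthouse report format (verify the folder path and metric keys against your own JSON files before relying on them).

    import csv
    import json
    from pathlib import Path

    REPORT_DIR = Path("report/lighthouse")  # assumed lighthouse-batch output folder

    # Lab metrics to flatten; audit IDs assume a recent Lighthouse report format
    METRICS = {
        "largest-contentful-paint": "lcp_ms",
        "cumulative-layout-shift": "cls",
        "total-blocking-time": "tbt_ms",  # lab stand-in for responsiveness
    }

    rows = []
    for report in REPORT_DIR.glob("*.json"):
        if report.name == "summary.json":  # skip the batch run summary
            continue
        data = json.loads(report.read_text(encoding="utf-8"))
        row = {
            "url": data.get("finalUrl") or data.get("requestedUrl"),
            "fetch_time": data.get("fetchTime"),
            "performance_score": data["categories"]["performance"]["score"],
        }
        for audit_id, column in METRICS.items():
            row[column] = data["audits"].get(audit_id, {}).get("numericValue")
        rows.append(row)

    # Flatten to one CSV row per page per run, ready to append to a sheet
    if rows:
        with open("cwv_weekly.csv", "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)

One way to approximate the site-wide script view mentioned above is to read each report's third-party-summary audit, which itemizes blocking time by script entity, and aggregate those items across pages.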

Influencer-PR SEO Integration Process

Project: [details coming soon]

  • Problem:
  • Solution:

Content Copy Priority Tracker

Project: [details coming soon]

  • Problem:
  • Solution:

