
Badass SEO: Automate Screaming Frog

#Guides

22 May 2019


When Jonathan asked me to write a second guest post for SEOButler (my first one was on why keyword rankings don’t matter), I knew I had to write something unique.

Over the past year or so, I’ve become obsessed with automation. Do more work in less time with less effort and make more money? Sign. Me. Up.

Without further ado, let’s talk about how you can automate Screaming Frog on both Macs and Windows machines…

“What?” I hear you ponder profusely about the puzzling prospect of an angry autonomous amphibian.

Well, for those not in the know…

Screaming Frog is web-crawling software. Set this baby loose on your website, and Screaming Frog will move from page to page, gathering various data that can be used for SEO and web development.

Constantly opening Screaming Frog, setting up your configuration, all that exporting and saving… it takes up a lot of time.

Or, you have your VAs or employees follow massive SOPs that look like:

  • Step 1: Open Screaming Frog
  • Step 2: Open Configuration
  • Step 10: Crawl the site
  • Step 25: Export this
  • Step 88: Export that
  • Etc, etc.

Bottom line, there’s a lot of room for error, and every time there’s a UI update, your SOP becomes out of date.

So, why not have a computer handle all of your crawls for you?

Quick Menu

Using the Command Line

Screaming Frog Automation for Mac Users

Screaming Frog Automation for Windows Users

Walk Before You Run

Before I get into how you can automate Screaming Frog as well as give you copy & paste commands for both Mac and Windows, let me quickly cover the basics.

Command Line

We’re going to be automating Screaming Frog by utilising the command-line interface. If you’re not familiar, the command line allows you to control your computer using a program like Terminal (Mac) or Command Prompt (Windows).

You can use the command-line to open files, folders, applications, connect to Wi-Fi, overload nuclear power plants, shut down your computer, ping servers, locate all devices on a network, and so much more.

If you want to take a dive into Screaming Frog’s documentation on their website, you can do that here. And if you’re a nerd like me and want to learn more about using the command-line, you can check out Code Academy.

Scheduled Crawls

We’re not going to go over scheduled crawls since they’re easy to set up. You can read Screaming Frog’s documentation about them here.

If you set up scheduled crawls, I recommend doing so on a Virtual Private Server (VPS).

Mac Users

This section is just for Mac OS X users. If you’re a Windows user, click here to go to the Windows section.

NB: All commands highlighted in black are formatted to be copy and pasted as single line commands into Terminal–ignore any line breaks!

Using Terminal

First, you need to understand how to get into your Terminal. This is where we’ll be running our commands.

Find Terminal by opening Spotlight ⌘ Cmd + Space and searching for “Terminal”. Then press return to open Terminal. It should look something like this (I’m using a dark theme, yours will likely be white).

You should already be there, but just to double check, use Terminal to navigate to your home directory by typing cd ~ into the prompt and pressing return.

Now, we’re going to open a command-line editor—which is basically, a text editor that’s used within Terminal. We’re going to use nano, and the file we’re going to open is our home directory’s .bash_profile.

This file is going to house the automation we build in a bit.

From your home directory, type nano .bash_profile into your prompt. This will open your .bash_profile using nano. It will look something like this (your file will be empty).

Now, we could edit this file directly within Terminal, but—it’s a real pain. Instead, we’re going to use a text editor.

If you already have an editor like Notepad++, Sublime, or Atom installed, you can use that. But, I’m going to show you how to do it using Mac’s built-in TextEdit program.

To open your .bash_profile with TextEdit, close your existing Terminal window that has .bash_profile open. Click “terminate” if prompted.

Open a new Terminal window, and from your home directory (cd ~), type the following to open .bash_profile in TextEdit:

open -e .bash_profile

The final preliminary concept to cover is the alias. An alias allows us to take an ugly chunk of shell script and shorten it. Think of it as an application shortcut. Instead of opening an application by locating its file directory path, you can use Spotlight Search to open it just by typing its name.

We’ll make our first alias in a second.

Open Screaming Frog

While keeping your .bash_profile open in TextEdit, go back to your Terminal window and enter the following command to open Screaming Frog (SF).

NB: This assumes that you installed Screaming Frog into the default folders.

/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher

This will open Screaming Frog as if you had launched the application file by clicking on it.

You can then close Screaming Frog, but keep the Terminal window open.

SF Alias

Rather than typing all of that out (or pasting it) every time, we can create an alias to enter all that for us using just a couple of characters.

Go back to the TextEdit window that has your .bash_profile file open and enter the following into the file.

alias sf="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher"

This creates an alias that you can call by entering “sf” (without quotes) into Terminal.

You can name the alias whatever you want. Just make sure there are no spaces!

Save your file.

In order for Terminal to recognise the new changes, we need to create a new Terminal window (there’s probably a more efficient way of doing this, but I have no idea).

Select Terminal and open a new window by pressing ⌘ Cmd + N. You can close the old Terminal window. For each edit made to the .bash_profile, you’ll have to repeat this step.
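If you’d rather not open a new window each time, running the following in the existing Terminal should also pick up the changes:

source ~/.bash_profile

Either way works; the steps below assume a new window.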

In the new Terminal window, type “sf” (without quotes) and press return. This should open Screaming Frog the same way it did when we typed the file path.

When using Screaming Frog in this manner, closing your Terminal window will also close Screaming Frog, so be careful!

Congrats, you just created your first application alias for Terminal!

From now on, I’ll refer to the .bash_profile TextEdit window as the bash profile and our home-directory Terminal window as just our Terminal.

Crawl A Site

If we want to both open Screaming Frog and start crawling a website right from our command line, then we can enter the following syntax into the Terminal:

sf --crawl <url>

If we wanted to crawl http://example.com/ then we’d enter:

sf --crawl http://example.com

Use whatever site you want. In all of my examples, I’ll use example.com as a placeholder for a website.

We could also make an alias in our bash profile so that we only need to enter the URL of the website we want to crawl.

Make sure to enter at least one line break from your previous alias!
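Here’s a minimal sketch of what that alias could look like. I’m calling it sf-c, since that’s the name the later examples in this section use, but any space-free name works:

alias sf-c="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher --crawl"

Save the file, open a new Terminal window, and sf-c http://example.com will do the same thing as the full command above.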

Run Screaming Frog Without A GUI

We’ve covered the basics, so let’s get into some of the fun stuff. Most of the time when you’re using Screaming Frog from the command line, you’re going to run it “headless.”

Headless means that the software can run without a Graphical User Interface (GUI). The GUI is the front-end interface you see when you launch Screaming Frog (and most other apps) from the Finder. Many of Screaming Frog’s command-line functions have to be run headless.

We’re going to add a couple of new functions to our next alias.

Instead of opening the GUI, we’re going to crawl a website headless while saving our crawl file inside of a time-stamped folder within another folder dedicated to Screaming Frog crawls.

Don’t worry, I’ll walk you through it.

These are the new “arguments” (the things that start with “--”) we’re going to use:

--headless

This argument allows us to run Screaming Frog headless.

--save-crawl

This argument will allow us to save the crawl.

NB: Saving a crawl can be a bit fidgety. I personally don’t use it anymore, as it sometimes saves a blank file. If I need a report, I export the file (tutorial below) instead of saving the crawl file.

--output-folder

This argument determines where the file will be saved.

--timestamped-output

This argument will save the file under a time-stamped folder. Since every crawl file is saved as crawl.seospider, this will prevent a conflict or being forced to overwrite the existing file.

Before we add the script, create a new folder on your desktop. I called mine “sf”. Make a note of what user you’re currently logged in as on your Mac.

If you’re not sure what it is, you can find your username in the Terminal prompt when you’re in your home directory: it’s the name just before the “$”.

On your bash profile, add a line break, and then paste the following command. You can name the alias whatever you want (just don’t use spaces).

alias sf-h="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher --headless --save-crawl --output-folder /users/{username}/desktop/sf --timestamped-output --crawl"

Replace {username} with your username (no braces). Note that the

 --crawl

argument is the last argument.

Save your bash profile, open a new Terminal window and type the following.

sf-h http://example.com

This will crawl the website and add a folder and file to your desktop within your sf folder, which will look something like this:

So far we’ve covered a basic crawl both with and without a GUI. It’s time to get into some more fun stuff.

Exporting Data

Aside from just doing some basic crawls, we can also export CSV files from our crawls automatically.

Exporting Tabs

By tabs, I mean everything under the overview panel.

If we wanted to crawl a site, and then export a CSV file of all the images that do not have alt text, we could use our “sf” alias and call:

sf --crawl <url> --output-folder /users/{username}/desktop/sf --export-tabs "Images:Missing Alt Text" --headless

This will add the CSV to our sf folder. If we also wanted to save the crawl.seospider file, then we can add the

 --save-crawl 

argument.
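For example, a minimal sketch of the same crawl with the crawl file saved as well:

sf --crawl <url> --output-folder /users/{username}/desktop/sf --export-tabs "Images:Missing Alt Text" --headless --save-crawl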

You can see the new syntax here is:

--export-tabs "tab-parent:tab-child"

If you wanted to have an alias for this:

alias sf-alt="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher --headless --output-folder /users/{username}/desktop/sf --timestamped-output --export-tabs 'Images:Missing Alt Text' --crawl"

By now, you should have a good understanding of how to turn a command into an alias. So as not to beat a dead horse, this will be the last time I explicitly show how to create an alias in this section.

You can export multiple tabs at once, as well.

Exporting Multiple Files At Once

Exporting multiple files at a time is quite simple.

You just need to separate each file type with a comma. This changes the syntax to:

'parent1:child1,parent2:child2,parent3:child3'

The parent:child syntax follows the drop-downs in the overview panel:

As far as I know, there’s no limit to the number of exports you can do at once.

With our newfound knowledge, we can export all missing title tags and meta descriptions at the same time!

sf-c <url> --timestamped-output --output-folder /users/{username}/desktop/sf --export-tabs "Page Titles:Missing,Meta Description:Missing" --headless


This should have exported 2 CSV files to your sf folder within a time-stamped folder.

Exporting Reports

Aside from exporting tabs from the summary panel, you can also export reports.

Exporting a report follows a similar syntax to the summary tabs we just did above. It also follows a parent:child syntax, but in the case of there being no child, only the top-level name is needed.

You can see this when you export the “Redirect & Canonical Chains” report:

sf-c <url> --timestamped-output --output-folder /users/{username}/desktop/sf --save-report "Redirect & Canonical Chains" --headless

Bulk Export

We can do the same exact thing using bulk exports.

sf-c <url> --timestamped-output --output-folder /users/{username}/desktop/sf --bulk-export "All Anchor Text" --headless

Create A Sitemap

If you’re using a CMS like WordPress, then you probably use a plugin that auto-generates a sitemap for you.

But, if you’re working on a flat site or a CMS that doesn’t create sitemaps for you, Screaming Frog allows you to create a sitemap from a crawl.

sf-c <url> --create-sitemap --output-folder /users/{username}/desktop/sf --headless

Configuration Files

There are a lot of cool things you can do with Screaming Frog’s configuration.

From crawling / not crawling certain pages, crawl speed, user agents, JavaScript rendering, and a lot more.

When you change the configuration, you can actually save that configuration as a config file for future crawls.

We can use a config file through our Terminal commands to also crawl and export data with more specific conditions.

Let’s do this to check if pages on a website have YouTube, Wistia, or Vimeo videos.

Create Config File

Before we can use a configuration file, we first need to create and save one to our folder.

For this, open Screaming Frog and go to Configuration >> Custom >> Search and add the following three lines to the filters.

src="https://www.youtube.com/"
src="https://player.vimeo.com/"
src="https://fast.wistia.com/"

Click “OK”

Then save the config file by going to File >> Configuration >> Save As

Name the file whatever you want (I chose video.seospiderconfig) and save it to your sf folder under a new folder named “config”.

Crawling With the Config File

Now, let’s crawl a site using our config file and export our three custom filters as CSV files.

sf-c <url> --config /users/{username}/desktop/sf/config/video.seospiderconfig --output-folder /users/{username}/desktop/sf --headless --export-tabs "Custom:Filter 1,Custom:Filter 2,Custom:Filter 3"

Crawling TXT Files

You can use “list” mode through your Terminal as well.

Create a .txt file with a list of URLs that you want to crawl. These can be from different websites or all one site. Save the .txt file within your sf folder.
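For example, the .txt file might contain nothing more than one full URL per line (placeholder URLs here):

http://example.com/
http://example.com/about/
http://example.com/contact/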

If you wanted to crawl a list of URLs to see which pages 4xx’d…

sf --export-tabs "Response Codes:Client Error (4xx)" --output-folder /users/{username}/desktop/sf --headless --crawl-list /users/{username}/desktop/sf/filename.txt

This concludes the Mac section! I hope this gives you a solid overview of how to automate Screaming Frog and how to use most of the commands available to you.

Automating Screaming Frog becomes even more powerful when you combine the examples above into one large crawl.

Imagine typing a few characters, walking away to make a cup of tea, and coming back to a dozen exported files, a sitemap, and a saved crawl of the website.
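As a rough sketch of what that combined alias could look like (sf-full is just my name for it; swap in whichever tabs, reports, and bulk exports you actually need, and replace {username} as before):

alias sf-full="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher --headless --save-crawl --timestamped-output --output-folder /users/{username}/desktop/sf --export-tabs 'Page Titles:Missing,Meta Description:Missing,Images:Missing Alt Text' --save-report 'Redirect & Canonical Chains' --bulk-export 'All Anchor Text' --create-sitemap --crawl"

Then sf-full http://example.com kicks the whole thing off.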

Windows Users

This section is just for Windows users. If you’re a Mac user, click here to go to the Mac section.

NB: All commands highlighted in black are formatted to be copy and pasted as single line commands into the Command Prompt–ignore any line breaks!

Using the Command Prompt

Before diving into everything, there are some things to get familiar with and some setting up to do.

First, let’s get familiar with the Command Prompt.

Use your search bar to search for “Command Prompt”.


Once found, open Command Prompt. It should look something like this:

We’re going to be running Screaming Frog through the use of shortcuts. You can think of these as mini-applications.

Instead of typing out 50 to 200 character-long commands or navigating through multiple commands, we can use shortcuts to… well, take a shortcut.

Let’s jump into creating our first one now.

We don’t want to clutter our desktop with a bunch of cmd.exe shortcuts, so we’re going to create a new folder within our user’s root directory to hold all of our shortcuts and configuration files.

Don’t worry, I’ll walk you through it.

Bring your Command Prompt back into view (if it wasn’t already). Type the following and press enter to open your users directory.

start .

This will open File Explorer.

Create a new folder here, and call it “sf-cmds” (without quotes).

Now, every time we want to run a command, we don’t want to go through the rigamarole of reopening this folder, going into sf-cmds, and then opening the cmd.exe shortcut that we need.

Instead, we’re going to create a shortcut to our shortcuts.

To facilitate this process, and the rest of what we’re going to do, you’re going to want to open a text editor. This can be Google Docs, Microsoft Word, or a program like Sublime.

You just need to be able to copy and paste what I show you here, and easily edit the text.

To create our shortcut, go to your desktop and right-click. Then choose new >> shortcut.

This will prompt you with a new window that looks like this:

Using your text editor, copy the following and change {username} to your username and then paste the command into the text box.

"C:\Windows\System32\cmd.exe" /k cd "C:\Users\{username}\sf-cmds"

Click “Next” and save the shortcut as “SF”. Now, you’ll have this icon on your desktop.

And, if you click that icon, it will open the Command Prompt and go right into the sf-cmds folder we created a minute ago.

Now that we have the basics down, let’s dive into automating Screaming Frog by using what we just learned.

Open Screaming Frog

Go back to your Command Prompt window or open a new one if you closed yours (it doesn’t matter if it’s through the program or the SF shortcut).

We need to figure out which folder Screaming Frog is installed in. Try the default 64-bit install location by entering the following into Command Prompt.

cd "C:\Program Files\Screaming Frog SEO Spider"

If you get the message “The system cannot find the path specified”, then try the 32-bit install location:

cd "C:\Program Files (x86)\Screaming Frog SEO Spider"

One of the two should work. You’ll notice that the active directory now reads as whichever path worked above.

From your current directory, enter ScreamingFrogSEOSpiderCli.exe to open Screaming Frog as if you had clicked on the application.

Congrats! You just opened Screaming Frog through the command-line.

But, that was way more work than just opening it the normal way. So, let’s create a command-line shortcut.

This may seem pointless, as we could just open Screaming Frog by clicking on the application, but we’ll need this shortcut later.

Close your Command Prompt and open the SF shortcut we created earlier. Open this directory in file explorer by entering start .

From this folder, create a new shortcut with the following:

"C:\Windows\System32\cmd.exe" /k cd "C:\Program Files (x86)\Screaming Frog SEO Spider" & ScreamingFrogSEOSpiderCli.exe

If your Screaming Frog folder is C:\Program Files\Screaming Frog SEO Spider, then change C:\Program Files (x86)\Screaming Frog SEO Spider to C:\Program Files\Screaming Frog SEO Spider in the shortcut.

Going forward, I’ll just show the (x86) path when creating shortcuts, but if your install is in the other folder, you’ll need to adjust the command for each example below.

Name this shortcut “open” (without quotes).

Now, close Command Prompt and File Explorer and go back to your desktop.

Run your SF shortcut and enter open.lnk

That should have opened Screaming Frog again.

What we did was open our SF shortcut which put us into our sf-cmds folder which contains our open shortcut. Then, we ran the open shortcut by entering its name followed by .lnk

Nothing too exciting, but we’re getting there.

The syntax for all of our shortcuts is:

"C:\Windows\System32\cmd.exe" /k

This starts the Command Prompt.

cd "C:\Program Files (x86)\Screaming Frog SEO Spider"

This moves the Command Prompt into the Screaming Frog install folder.

&

This separates our commands. The first one, moving into the correct folder, and the following to execute whatever command we want to use while in this folder.

ScreamingFrogSEOSpiderCli.exe

This is what we’re doing—in this case, just opening Screaming Frog.

Crawl A Site

Close Screaming Frog and your Command Prompt, then open SF again. This time, we’re going to append more commands to our shortcut file.

Enter

open.lnk --crawl <url>

Where <url> is the URL of the page you want to start crawling on. So, if you wanted to crawl http://example.com, then you’d enter:

open.lnk --crawl http://example.com

We can create a shortcut for this as well.

Create a new shortcut in your sf-cmds folder with:

"C:\Windows\System32\cmd.exe" /k cd "C:\Program Files (x86)\Screaming Frog SEO Spider" & ScreamingFrogSEOSpiderCli.exe --crawl

Name this shortcut “crawl” (without quotes). We’ll be using this a lot.

You’ll notice with all of the shortcuts we create,

--crawl 

will be the last argument. This is so that you only have to feed a URL into the shortcut to get it to work.

Close everything again and open your SF shortcut.

Now enter

crawl.lnk http://example.com

This will crawl from the specified URL.

Run Screaming Frog Without A GUI

We’ve covered the basics, so let’s get into some of the fun stuff. Most of the time when you’re using Screaming Frog from the command-line, you’re going to run it headless.

Headless means that the software can run without a Graphical User Interface (GUI). The GUI is the front-end interface you see when you launch Screaming Frog (and most other apps) from the Start menu. Many of Screaming Frog’s command-line functions have to be run headless.

We’re going to add a couple of new functions to our next shortcut.

Instead of opening the GUI, we’re going to crawl a website headless while saving our crawl file inside of a time-stamped folder within another folder dedicated to Screaming Frog crawls.

Don’t worry, I’ll walk you through it.

These are the new “arguments” (the things that start with “--”) we’re going to use:

--headless

This argument allows us to run Screaming Frog headless.

--save-crawl

This argument will allow us to save the crawl.

NB: I mentioned in the Mac section that I don’t save crawls due to how rarely it actually saves the data in the crawl file. This doesn’t appear to be an issue (at least, not that I’ve run into) on Windows.

--output-folder

This argument determines where the file will be saved.

--timestamped-output

This argument will save the file under a time-stamped folder. Every crawl file is saved as crawl.seospider, so this will prevent a conflict or being forced to overwrite the existing file.

Let’s create a new folder on our desktop called “SF-Files”. We’ll use this to store anything we export or save from Screaming Frog. We’re putting it on the desktop so that it’s easily accessible by whoever needs to grab the files.

Open your sf-cmds folder and add this monster of a command into a new shortcut and name that shortcut “save-crawl” (no quotes).

"C:\Windows\System32\cmd.exe" /k cd "C:\Program Files (x86)\Screaming Frog SEO Spider" & ScreamingFrogSEOSpiderCli.exe --headless --save-crawl --output-folder "C:\Users\{username}\Desktop\SF-Files" --timestamped-output --crawl

Remember to replace {username} with your username!

Close everything again and open your SF shortcut. Then enter save-crawl.lnk http://example.com

As explained above, this is going to crawl the given website without a GUI. Once that crawl is completed, it will save the crawl file in the SF-Files folder we created.

So far we’ve covered a basic crawl both with and without a GUI. It’s time to get into some more fun stuff.

Exporting Data

Aside from just doing some basic crawls, we can also export CSV files from our crawls automatically.

Exporting Tabs

By tabs, I mean everything under the overview panel.

Let’s say we wanted to crawl a website, and then export all of the images that are missing alt text.

Open your SF shortcut and enter:

crawl.lnk http://example.com --output-folder "C:\Users\{username}\Desktop\SF-Files" --export-tabs "Images:Missing Alt Text" --headless

This will export the CSV file to the SF-Files folder.

You can see the new syntax here is

--export-tabs "tab-parent:tab-child"

If we wanted to make a shortcut for this, we’d use:

"C:\Windows\System32\cmd.exe" /k cd "C:\Program Files (x86)\Screaming Frog SEO Spider" & ScreamingFrogSEOSpiderCli.exe --output-folder "C:\Users\{username}\Desktop\SF-Files" --export-tabs "Images:Missing Alt Text" --headless --crawl

By now, you should have a good understanding of how to turn a command into a shortcut. To not beat a dead horse, this will be the last time I explicitly show how to create a shortcut in this section, and all .lnk files are called while inside the sf-cmds folder.

You can export multiple tabs at once, as well.

Exporting Multiple Files At Once

Exporting multiple files at a time is quite simple.

You just need to separate each file type with a comma. This changes the syntax to:

'parent1:child1,parent2:child2,parent3:child3'

The parent:child syntax follows the drop-downs in the overview panel.

As far as I know, there’s no limit to the number of exports you can do at once.

With our newfound knowledge, we can export all missing title tags and meta descriptions at the same time!

crawl.lnk <url> --timestamped-output --output-folder "C:\Users\{username}\Desktop\SF-Files" --export-tabs "Page Titles:Missing,Meta Description:Missing" --headless

This should have exported 2 CSV files to your SF-Files folder within a time-stamped folder.

Exporting Reports

Aside from exporting tabs from the summary panel, you can also export reports.

Exporting a report follows a similar syntax to the summary tabs we just did above. It also follows a parent:child syntax, but in the case of there being no child, only the top-level name is needed.

You can see this when you export the “Redirect & Canonical Chains” report.

crawl.lnk <url> --timestamped-output --output-folder "C:\Users\{username}\Desktop\SF-Files" --save-report "Redirect & Canonical Chains" --headless 

Bulk Export

We can do the same exact thing using bulk exports.

crawl.lnk <url> --timestamped-output --output-folder "C:\Users\{username}\Desktop\SF-Files" --bulk-export  "All Anchor Text" --headless

Create A Sitemap

If you’re using a CMS like WordPress, then you probably use a plugin that auto-generates a sitemap for you.

But, if you’re working on a flat site or a CMS that doesn’t create sitemaps for you, Screaming Frog allows you to create a sitemap from a crawl.

crawl.lnk <url> --output-folder "C:\Users\{username}\Desktop\SF-Files" --headless --create-sitemap

This will export an XML file to your SF-Files folder.

Configuration Files

There’s a ton of cool things you can do with Screaming Frog’s configuration.

From crawling—or not crawling—certain pages, adjusting crawl speed, user agents, JavaScript rendering, and a lot more.

When you change the configuration, you can actually save that configuration as a config file for future crawls.

We can use a config file through our Command Prompt commands to also crawl and export files with more specific conditions.

Let’s do this to check if pages on a website have YouTube, Wistia, or Vimeo videos.

Create Config File

Before we can use a configuration file, we first need to create and save one to our folder.

For this, open Screaming Frog and go to Configuration >> Custom >> Search and add the following three lines to the filters.

src="https://www.youtube.com/
src="https://player.vimeo.com/
src="https://fast.wistia.com/

Click “OK”

Then save the config file by going to File >> Configuration >> Save As

Name the file whatever you want (I chose video.seospiderconfig) and save it to your sf-cmds folder under a new folder named “config”.

Crawling With The Config File

Now, let’s crawl a site using our config file and export our three custom filters as CSV files.

crawl.lnk <url> --config "C:\Users\{username}\sf-cmds\config\video.seospiderconfig" --output-folder "C:\Users\{username}\Desktop\SF-Files" --headless --export-tabs "Custom:Filter 1,Custom:Filter 2,Custom:Filter 3"

Crawling TXT Files

You can also use “list” mode through the command-line as well.

Create a .txt file with a list of URLs that you want to crawl. These can be from different websites or all one site. Save the .txt file on your desktop as filename.txt.
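For example, filename.txt might contain nothing more than one full URL per line (placeholder URLs here):

http://example.com/
http://example.com/about/
http://example.com/contact/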

If you wanted to crawl a list of URLs and export the pages that 4xx’d, use the open shortcut (the crawl shortcut ends in --crawl, which --crawl-list takes the place of here)…

open.lnk --export-tabs "Response Codes:Client Error (4xx)" --output-folder "C:\Users\{username}\Desktop\SF-Files" --headless --crawl-list "C:\Users\{username}\Desktop\filename.txt"

That concludes the Windows section! I hope this post gives you a solid overview on how to automate Screaming Frog and how to use most of the commands available to you.

Automating Screaming Frog becomes even more powerful when you combine the examples above into one large crawl.

Imagine typing a few characters, walking away to make a cup of tea, and coming back to a dozen exported files, a sitemap, and a saved crawl of the website.
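As a rough sketch of what that could look like using the crawl shortcut and the arguments covered above (swap in whichever tabs, reports, and bulk exports you actually need, and replace {username} as before):

crawl.lnk http://example.com --headless --save-crawl --timestamped-output --output-folder "C:\Users\{username}\Desktop\SF-Files" --export-tabs "Page Titles:Missing,Meta Description:Missing,Images:Missing Alt Text" --save-report "Redirect & Canonical Chains" --bulk-export "All Anchor Text" --create-sitemap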

Have Fun

When implemented correctly, automating Screaming Frog can save you a lot of time, make your SOPs simpler, and remove a lot of the margin for error in completing tasks.

I couldn’t cover everything that you can do with Screaming Frog through the command-line, and there are plenty of other 3rd party applications/tools you can integrate into this process that would be way too much to cover here—but I hope I sparked your creativity.

If you’ve got any killer Screaming Frog tips and tricks, I’d love to hear about them. SF is an awesomely powerful tool, and there’s always something new to learn. Please feel free to leave your tips in the comments, so everyone can benefit from your SF wisdom.

 

JAROD SPIEWAK
Marketer by day, frustrated programmer by night. Jarod is the Lead Strategist & Founder of Blue Dog Media, a digital agency helping service businesses make more money.

 

4 Comments

Roman

Does anybody know how to stop exporting empty .csv files when using the CLI arguments --export-tabs and --bulk-export?
Right now I have to open each .csv file to check whether it has data.

    Jarod Spiewak

    Hi Roman,

    I’m not 100% sure what you’re asking. Can you PM me on FB and I’ll take a look?

    Thanks,
    Jarod

Dominik

Is it possible to crawl a list of URLs placed in a .txt file and export the Crawl Overview of every crawl as XLSX instead of CSV?

I wrote an Excel macro, and automating the above-mentioned process would be a superb complement.

alias sf="/Applications/Screaming\ Frog\ SEO\ Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher"

sf --timestamped-output --output-folder /users/dominik/desktop/sf/ --save-report "Crawl Overview" --headless --crawl-list /users/dominik/desktop/sf/urls.txt

Would this actually work?

Thanks in advance!
