Prospecting with Power – Using Link Prospector with Screaming Frog for Brands

There are a lot of tools out there for link building, but few provide reports as actionable as Link Prospector. I’ve been using the product since shortly after its release, and I’m quick to champion it as my must-have tool.

Advanced Link Building Software

The tool uses advanced search queries to find actionable linking opportunities. Instead of scraping Google results for hours on end and using SEO Tools for Excel to pull PageRank, this tool can do all the work for you.

At first glance, it seems like a simple tool, but it takes skill to wield it properly.

To use Link Prospector effectively you need to think about content. Instead of prospecting for substandard links for your money terms, use this tool to build links to your resources and assets.

It’s much more effective to construct research phrases around user intent. Why would someone be looking for a collection of content similar to your client’s? What purpose does your client’s content serve to the user?

Recently, we’ve been working with a client in the children’s space and we’ve been having a lot of luck with Link Prospector – the client’s content provides parents with creative and entertaining activities for their kids. So instead of plugging in ~”kids activities”, we’ve been using research phrases that have intent and search volume.

We asked ourselves, “When do kids need to be occupied? What events make a parent have to search for something fun for their kids to do?”

It’s much easier to find high-quality link prospects when thinking about user intent. This exact report generated some very high-quality prospects for us, but we weren’t done there. Our client also has a plethora of lesson plans, and we knew there was a ton of opportunity there. Again, instead of prospecting for the head term, we targeted our link prospects by intent and need:

The last research phrase is still a little broad, but it’s still more targeted than just prospecting for head terms.

We had an issue, though: our client is a brand. They were already listed on a number of these sites. With so many prospects, hand-vetting was proving to be cumbersome – so we developed a solution.

The Solution

Link Prospector CSVs are huge, so we used Screaming Frog to compartmentalize the workload. Here’s the step-by-step:

1) Get the paths export from Link Prospector


2) Turn The Data Into A List


2.1) Open the CSV and delete row 1.

2.2) Delete all columns but the URLs (column A), and move that data to column B.

2.3) Add http:// in column C, one cell for each URL.

2.4) In column A enter =CONCATENATE(C1,B1) and drag the bottom-right corner of cell A1 to the bottom of the sheet.

2.5) Copy column A.

2.6) Paste column A back over the CONCATENATE formulas as values.

2.7) Delete all data but column A and save the CSV. You should be left with a single column of full URLs. (If you’d rather script this whole step, a sketch follows.)
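If spreadsheet gymnastics aren’t your thing, step 2 can be scripted instead. Here’s a minimal sketch in Python, assuming the paths export has a single header row and scheme-less URLs in the first column; the file names are hypothetical:

```python
# Minimal sketch of step 2: turn a Link Prospector paths export into
# a one-column list of crawlable URLs. File names are hypothetical.
import csv

with open("paths_export.csv", newline="") as src, \
        open("prospect_urls.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    next(reader)  # drop the header row (step 2.1)
    for row in reader:
        url = row[0].strip() if row else ""
        if not url:
            continue
        # Prepend the scheme so Screaming Frog can crawl the URL
        # (steps 2.3 through 2.6 in the spreadsheet version)
        if not url.startswith(("http://", "https://")):
            url = "http://" + url
        writer.writerow([url])
```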

3) Open Screaming Frog and change the Mode from Spider to List

4) Under Configuration select “Custom”

4.1) For Filter 1 select “Does Not Contain” and enter the client domain.

4.2) For Filter 2 select “Contains” and enter the client domain again.


5) Upload the CSV, run the report, and export the two custom reports

This will give you two different files to work through, and they should be approached in different ways. The first export contains sites where you need to build a relationship and convince the webmaster to add a link. The second custom export is just easy wins: sites that already link to your client may be linking to a page that is going through a 301 or, even worse, directing users to a 404. This export of live links is a golden opportunity for anchor text optimization and deep-page linking by providing webmasters with new and updated content.
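If you ever need to reproduce that split outside of Screaming Frog, the logic is simple enough to script. A minimal sketch, using requests and treating client.com and the file names as stand-ins for your own:

```python
# Minimal sketch of the Contains / Does Not Contain split.
# "client.com" and the file names are stand-ins for your own.
import csv
import requests

CLIENT_DOMAIN = "client.com"

outreach = []   # filter 1: pages that do not mention the client
easy_wins = []  # filter 2: pages that already mention the client

with open("prospect_urls.csv", newline="") as f:
    urls = [row[0] for row in csv.reader(f) if row]

for url in urls:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # skip prospects that fail to load
    (easy_wins if CLIENT_DOMAIN in html else outreach).append(url)

for name, rows in (("outreach.csv", outreach), ("easy_wins.csv", easy_wins)):
    with open(name, "w", newline="") as f:
        csv.writer(f).writerows([u] for u in rows)
```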

To prioritize your work, export both sheets and import them into a tool like BuzzStream, or use SEO Tools for Excel to pull PageRank for each prospect – highly recommended!

Now get prospecting!


This post originally appeared on the SEER Interactive blog.

Catching Up with your Local Competitors & Automating Citation Discovery

I am a link builder at heart; it’s what I love to do. On the flip side, I find citation building to be one of the most boring tasks in the industry. Link building is strategic and tactical – citation building is, well… not. That’s why this strategy was created: to take some of the monotony out of citation building.

Quick Intro: Citations consist of a NAP (Name, Address, Phone Number), and they are a key component in local search ranking factors. Citation building consists of going around to local directories and social networks and getting your NAP listed. Some good examples are Facebook Local Business or Place pages, Yelp!, FourSquare, SuperPages, BrownBook, Yahoo Local, Merchant Circle, etc.

Here’s how to quickly catch up to your competitors and automate finding their new local citations from here on out

First, figure out all the keywords you want to rank for that are triggering local search. I am going to use one of the coolest restaurants in the world for my example: Alinea.

Chicken Liver, Bacon, Caramelized Onion, Vin Santo via Yelp user Manda Amanda Bear B.

Here are the terms we are going for

Chicago Restaurant
Restaurant Chicago
Restaurant in Chicago
Chicago Fine Dining
Fine Dining Chicago

Note: Alinea’s Google Plus Local page does not reflect this strategy and they are currently at the bottom of the seven-pack, but we are still going to use them as an example anyway 🙂

Type the local search queries you are targeting into Google Places, scrape all the results that populate, and throw them into an Excel sheet:

Next, set your Google search results to show 100 per page. Search for your competitors’ NAPs individually and scrape all results. Make sure to include -site:competitor in your query, because a lot of restaurants have their NAP on every page of their site.
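With more than a couple of competitors, assembling those exact-match queries by hand gets tedious. Here’s a minimal sketch that builds them; the competitor details below are hypothetical placeholders:

```python
# Minimal sketch: build exact-match NAP queries that exclude each
# competitor's own site. The entries below are hypothetical.
competitors = [
    {
        "name": "Alinea",
        "address": "1723 N Halsted St",
        "phone": "(312) 867-0110",
        "domain": "alinearestaurant.com",
    },
]

for c in competitors:
    # Quote each NAP element for an exact match; -site: keeps the
    # competitor's own pages out of the results.
    query = '"{name}" "{address}" "{phone}" -site:{domain}'.format(**c)
    print(query)
```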

Put all of the NAP URLs in a separate Excel sheet:

Because some niche citation sources can be tricky to navigate when finding exactly where to register your citation/business, copy this entire data set into a new Excel sheet so you have your raw data to go back to if need be.

Remove everything after the TLD and de-duplicate the NAP Source column:
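This trim-and-dedupe step is also scriptable. A minimal sketch, assuming the scraped URLs sit in the first column of a hypothetical nap_sources.csv:

```python
# Minimal sketch: trim each NAP source URL to its root and
# de-duplicate. "nap_sources.csv" is a hypothetical file name.
import csv
from urllib.parse import urlparse

seen = set()
roots = []

with open("nap_sources.csv", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue
        parsed = urlparse(row[0].strip())
        root = f"{parsed.scheme}://{parsed.netloc}"  # drop path, query, etc.
        if parsed.netloc and root not in seen:
            seen.add(root)
            roots.append(root)

with open("citation_sources.csv", "w", newline="") as f:
    csv.writer(f).writerows([r] for r in roots)
```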

This is your new go-to list of citations to build. But before you start building citations, run the same process on your own business and remove all sources where you already have your NAP listed. Now you can start building citations. It’s important to make sure whoever does the actual citation building keeps meticulous records, because you will need to go back and check that all the NAPs created match the NAP on your Google Plus Local page. This is especially important if you hand this off to a freelancer.
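One way to keep that record-checking honest is to normalize each recorded NAP before comparing it to the canonical one on your Google Plus Local page. A minimal sketch, with hypothetical data:

```python
# Minimal sketch: flag recorded NAPs that don't match the canonical
# one, ignoring case, punctuation, and extra whitespace. The data
# below is hypothetical.
import re

def normalize(nap: str) -> str:
    # Strip punctuation and collapse whitespace so purely cosmetic
    # differences don't trigger false mismatches.
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", nap)).strip().lower()

canonical = "Alinea, 1723 N Halsted St, Chicago, IL 60614, (312) 867-0110"

# Hypothetical records kept by whoever built the citations.
built = {
    "yelp.com": "Alinea, 1723 N. Halsted St. Chicago IL 60614, (312) 867-0110",
    "superpages.com": "Alinea, 1723 Halsted St, Chicago IL, 312-867-0110",
}

for source, nap in built.items():
    status = "OK" if normalize(nap) == normalize(canonical) else "MISMATCH"
    print(f"{source}: {status}")
```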

Here’s how to automate finding their new local citations from here on out:

Set up Google Alerts for all competitor NAPs in your Google Reader:


NAPs need to be exact, but your competitors are bound to screw up from time to time – so you want to set up Google Alerts to account for their errors. Take just the address of your competitor and abbreviate anything that you can (example: 123 North Fake Street to 123 N Fake Street) and set it up in a separate Google Alert. Next, take their phone number and put all the normal variations into separate Google Alerts as well (example: (123) 456-7890 to 123 456 7890 and 123 456-7890 and 123.456.7890 and 1234567890). This may seem like overkill, but you never know what Google might miss. Now you can go through your Google Reader every few weeks and do all your citation building for your client in one clean sweep.
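Spinning out those phone-number formats is the kind of thing a short script handles well. A minimal sketch for a ten-digit US number:

```python
# Minimal sketch: generate the common formats of a US phone number
# so each one can go into its own Google Alert.
def phone_variations(digits: str) -> list[str]:
    """digits: a plain 10-digit string, e.g. "1234567890"."""
    area, prefix, line = digits[:3], digits[3:6], digits[6:]
    return [
        f"({area}) {prefix}-{line}",
        f"{area} {prefix} {line}",
        f"{area} {prefix}-{line}",
        f"{area}.{prefix}.{line}",
        digits,
    ]

for variation in phone_variations("1234567890"):
    print(variation)
```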

Getting the Competitive Edge

Keep a record of how many citations each competitor is building a month. Track it month over month and see if it stays consistent. A lot of SEO firms do a set amount of citations a month in a “local search package.” By figuring out how many citations your competitors are going to be building each month, you can stay one step ahead of them.


This post originally appeared on the SEER Interactive blog.

Using Copyscape for Easy Link Wins

As link builders, we rack our brains trying to figure out how to get “big wins,” and I am no different. Nothing gets me more amped than earning 20 linking root domains from high-domain-authority sites with some great link bait. However, sometimes we can get so focused on the big wins that we forget about the low-hanging fruit.

Recently I’ve been working on a site that provides a lot of fantastic resources for its readers. These resources are so fantastic that they have been duplicated and reposted all over the web. We’re not talking about scraper sites here; this content has been reposted on various respected and authoritative sites. I was searching for one of their resource pages by its title one day and found that numerous other websites were hosting the exact same content.

I’m sure some of you are thinking:

“Oh no! JH!! Your client has duplicated content across multiple domains! You need to contact those sites and have them take down all of that content now!!”

I understand where you are coming from, but my client created this content to help and inform the public. Other people found this content to be so valuable that they wanted to inform their own readers about it – the content is serving its intended purpose, and there is no way I am going to ask webmasters to take it down. Good content spreads; that’s the nature of the web.

To be an effective link builder you need to be able to put yourself in another webmaster’s shoes and break free from SEO tunnel vision. Most likely, a webmaster saw a cool or informative piece of content and wanted to share it with their readers – where’s the harm in that? If you were the original creator of that content and Google cached your version first, reposts aren’t going to hurt your site.

Finding The Prospects

There is no way I am going to spend my time contacting a bunch of random site owners begging them to take this content down. Instead, I am going to thank them for reposting the content… and then ask them for a link.


By using the premium version of Copyscape, a well-known plagiarism checker, you can find all the sites on the web that have duplicated even small portions of your content.

To streamline the process, I like to throw all the Copyscape results into Excel and then use Niels Bosma’s SEO Tools for Excel to grade by PageRank of the root domain. This lets you cut out most of the low-quality scraper sites that you don’t want links from anyway.
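If you want a first pass before Excel, here’s a minimal sketch that collapses a Copyscape export to unique root domains and counts duplicated pages per domain; the quality metric itself is left out, since you’ll pull PageRank (or whatever metric you prefer) separately. The file name is hypothetical:

```python
# Minimal sketch: collapse Copyscape result URLs to unique root
# domains so a quality metric can be looked up once per domain.
# "copyscape_results.csv" is a hypothetical file name.
import csv
from urllib.parse import urlparse

domains = {}

with open("copyscape_results.csv", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue
        host = urlparse(row[0].strip()).netloc
        if host:
            # Count how many duplicated pages sit on each domain.
            domains[host] = domains.get(host, 0) + 1

# Page count is only a rough proxy; swap in PageRank or another
# metric once you have it for each domain.
for host, pages in sorted(domains.items(), key=lambda kv: -kv[1]):
    print(f"{host}\t{pages} page(s)")
```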

Making Contact

I am a big fan of using the phone when it comes to building links, mainly because emails can be deleted, but phone calls during business hours generally get answered. If you can’t find any contact info on the site, do a WhoIs lookup. Yes, it’s an aggressive maneuver, but let’s not forget – these people have lifted content from your site and you have every right to contact them.

Here’s generally how the exchange goes:

JH: Hi, I’m John-Henry from SEER Interactive and I wanted to talk to you about some content on your site that was originally written by my client.

Webmaster: Uh… Oh, jeez, Ok. What’s up?

JH: First of all, thanks for putting that on your site, my client created that content for this exact purpose. We’re really glad you could find some use for it. 

Webmaster: Thanks, I really liked that post and I thought my readers would too!

JH: Awesome, glad you liked it. I was hoping you could do us a small favor. My client spent a good chunk of time researching all that information and putting together that resource. We would really appreciate it if you could put a link next to the content that says “Information provided by (client)”

Webmaster: Yeah sure, no problem!

I kid you not, it is that easy.

This Only Works If…

The one caveat about this tactic: it only works if you are consulting your clients to create truly great content. Mediocre content might get scraped, spun, respun, and posted all around the web, but truly great content that provides value will generally stay intact. This is an easy link win, and Copyscape is dirt cheap (5 cents a scan). If your client is getting their content reposted, hopefully this tactic can work for you too.


This post originally appeared on the SEER Interactive blog.