Negative SEO is a controversial topic in the SEO space. You hear about it a lot, but does it happen a lot? How easy is it to ruin a competitor’s rankings, really? Should you be worried? How do you know if someone is trying to knock you out of SERPs with negative SEO? Is there anything you can do to stay safe?
In this article, I’ll do my best to shed some light on the matter and answer these questions. But before we start…
Negative SEO is a set of activities aimed at decreasing a competitor’s site rankings in search engines’ results. These activities may include knowingly building spammy, unnatural links to the site, content scraping, and even hacking the site. We’ll look at the different kinds of negative SEO in a moment.
From day one, Google has done its best to identify unnatural links when evaluating web pages’ authority. Originally, links Google thought to be spammy were simply ignored – they did not pass any PageRank to the page they linked to. In response, SEOs would build and buy thousands (or sometimes millions) of links; those that passed equity were great, and the ones that didn’t… simply didn’t, without harming the site.
In April 2012, Google launched the Penguin update, meaning that for the first time, they would be taking strong punitive action against sites with manipulative links. This, of course, helped Google provide better, fairer search results. But it also gave birth to negative SEO through spammy link building, as you can see from Google Trends below.
Does Google have anything to say on the matter? They do acknowledge that negative SEO attacks happen, although not too often.
Now, let’s look at the different shapes negative SEO can take, and examine the way to stay safe from each.
As the name implies, negative off-page SEO targets the site without internally interfering with it in any way. Most commonly, it involves manipulating the site’s backlinks or duplicating its content externally.
Generally, a single spammy link (even if it’s sitewide) wouldn’t be able to shatter a site’s rankings. That’s why negative SEO typically involves links from a group of sites, or link farms.
A link farm is a hub of interconnected websites. Originally, these sites would link to each other to increase the link popularity of each site’s pages, and you could purchase links from them to increase your own site’s PageRank. One example of a link farm is a PBN (private blog network) – a network of sites created solely for link building and typically owned by one individual. Most PBNs are made up of expired domains, which means the sites have usually accumulated some backlinks and authority by the time they become part of a PBN.
In 2012, Google’s Penguin algorithm made it significantly harder for link farms to be effective — if Google spots backlinks that come from a link farm, it justly concludes that the linked-to site is involved in a link scheme, which naturally leads to a penalty. That’s how link farming went from a black-hat but effective tactic to a negative SEO technique.
To top things off, the attacker may also point lots of exact-match anchor text links at a ranking page and skew its anchor text ratio. These exact-match anchors may be completely unrelated to your industry; or, they may actually include your target keyword to make it look like you’re manipulating your own link profile.
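To make “anchor text ratio” concrete, here’s a minimal sketch of how you could compute the anchor distribution from an exported backlink list (the data below is made up for illustration):

```python
from collections import Counter

def anchor_distribution(backlinks):
    """Compute the share of each anchor text in a list of (url, anchor) backlinks."""
    counts = Counter(anchor.lower() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical export: a sudden flood of identical exact-match anchors
# is exactly the pattern a negative SEO attack creates.
backlinks = [
    ("https://blog.example.com/post", "WP Bacon"),
    ("https://spam1.example.net/page", "porn movie"),
    ("https://spam2.example.net/page", "porn movie"),
    ("https://spam3.example.net/page", "porn movie"),
]

dist = anchor_distribution(backlinks)
print(dist)  # "porn movie" accounts for 0.75 of all anchors, a red flag
```

A healthy profile tends to be dominated by branded and naked-URL anchors; one anchor suddenly taking over the distribution is worth investigating.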
One example of a real-world negative SEO attack through links is WP Bacon, a podcast site about WordPress. The site was attacked with thousands of links with the anchor text “porn movie”. Within about 10 days, WP Bacon fell 50+ spots for the majority of the keywords it ranked for in Google. This story has a happy ending though: the webmaster submitted a disavow file listing the spammy domains involved in the attack, and as the spam attack continued, kept disavowing the new domains they were receiving links from. Eventually, WP Bacon recovered for most of the search terms it was initially ranking for.
How to stay safe: Preventing a negative SEO attack isn’t entirely within your power, but spotting the attempt early enough to reverse it is totally doable. To do that, you need to regularly monitor your link profile’s growth. SEO SpyGlass, for example, gives you progress graphs for both the number of links in your profile and the number of referring domains (you’ll find them under the Summary dashboard). An unusual spike in either of those graphs is reason enough to look into the links your site has suddenly acquired.
The typical graphs would look something like this:
On the other hand, if you see something like this when you haven’t been actively building links, you may want to look into the backlinks that resulted in the spike:
To actually see the links that contributed to a recent spike, switch to the Backlinks dashboard and sort the links by Last Found Date in descending order (by clicking on the header of the column twice). This will make the newest links appear at the top of the list so that you can look into them.
If the spike appeared on the Linking Domains graph, do the same in the Linking Domains dashboard.
If you’ve no idea where the links are coming from, it’s useful to look at their Penalty Risk. It’s a pretty accurate metric to tell if the links are coming from link farms as it evaluates the domains’ IP addresses and looks at other linking domains in your profile that come from the same IP or C-class.
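The C-class check itself is simple: domains whose IPs share the same first three octets sit on the same /24 subnet, which is a classic link farm footprint (many “different” sites on one server). A rough sketch, assuming you’ve already resolved your linking domains to IP addresses (the domains and IPs below are invented):

```python
from collections import defaultdict

def group_by_c_class(domain_ips):
    """Group linking domains by the /24 (C-class) subnet of their IP address."""
    groups = defaultdict(list)
    for domain, ip in domain_ips.items():
        c_class = ".".join(ip.split(".")[:3])  # keep the first three octets
        groups[c_class].append(domain)
    return groups

# Hypothetical, pre-resolved domain-to-IP mapping
domain_ips = {
    "spam-blog-1.example": "203.0.113.10",
    "spam-blog-2.example": "203.0.113.55",
    "real-site.example":   "198.51.100.7",
}

for subnet, domains in group_by_c_class(domain_ips).items():
    if len(domains) > 1:
        print(subnet, domains)  # many linking domains on one /24 suggests a link farm
```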
To add the Penalty Risk column to your view, right-click the header of any column, select Penalty Risk, and click OK. Next, select those new suspicious backlinks you just discovered, and click Update Link Penalty Risk. In a few minutes, the column should be populated with values on a scale from 0 to 100.
If you click on the “info” button next to the Penalty Risk value for any link, you’ll see the list of factors that make it potentially risky.
Finally, if you do find some of the links are spammy, you can add them to a disavow file right in SEO SpyGlass. To do that, right-click the spammy backlink/linking domain and select Disavow (it typically makes more sense to disavow on the domain level, so make sure to select Entire domain under Disavow mode.) Do the same for all unnatural links or domains you spotted. Finally, go to Preferences > Disavow/Blacklist backlinks, review your disavow file, and export it once you’re happy with it.
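For reference, the disavow file Google expects is a plain text file with one entry per line: either a full URL or a domain prefixed with "domain:", with lines starting with # treated as comments. SEO SpyGlass generates this file for you, but a minimal example (the domains here are placeholders) looks like this:

```text
# Spammy domains from the link farm attack
domain:spam-network-1.example
domain:spam-network-2.example
# A single unnatural link we couldn't get removed
http://some-blog.example/spammy-page.html
```

Once exported, the file is uploaded through Google’s Disavow links tool in Search Console.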
Another negative SEO technique is fabricating duplicate content. It involves scraping your site’s content and copying it to other websites, often multiple times, sometimes even as part of the link farms discussed above.
You probably know that Google’s Panda update was designed, in part, to detect and fight content duplication. So when Google finds content that is duplicated across multiple sites, they will usually pick only one version to rank. You’d hope that Google is clever enough to identify the original source of the content, and in most cases they are – unless the scraped copy gets indexed before the original.
That’s why scrapers often automatically copy new content and repost it right away. If Google finds the “stolen” version first, it may de-rank your site, and rank the scraper site instead.
How to stay safe: There are a few great tools designed to help you stay safe from scrapers. Copyscape is one of them. All you need to do is enter the URL of your content to find out if there are any duplicates of it online.
There’s another neat hack you could use that doesn’t require a lot of extra effort if you already track how your content gets shared and linked to online. A social media and Web monitoring app like Awario lets you hit two birds with one stone here. If you use a tool like Awario, you probably tend to create alerts for your posts’ URLs and titles. To also search for scraped versions of your content, all you need to do is add another keyword — an extract from your post. Ideally, it should be a few sentences long. Surround the piece with double quotes to make sure you’re searching for an exact match. With this setup, the app is going to look for both mentions of your original article (like shares, links and such) and the scraped versions of the content found on other sites.
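If you’d rather spot-check a suspect page yourself, a crude but serviceable approach is word-shingle overlap: break both texts into overlapping n-word sequences and see what fraction of the original’s shingles the suspect page contains. A minimal sketch (the sample strings are placeholders):

```python
def shingles(text, n=5):
    """Split text into overlapping n-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(original, suspect, n=5):
    """Fraction of the original's shingles that also appear in the suspect text."""
    a, b = shingles(original, n), shingles(suspect, n)
    return len(a & b) / len(a) if a else 0.0

original = "negative seo is a set of activities aimed at decreasing rankings"
scraped = "negative seo is a set of activities aimed at decreasing rankings too"
print(overlap(original, scraped))  # 1.0: almost certainly a scraped copy
```

A high overlap (say, above 0.5) on a page you never syndicated to is a strong sign of scraping; genuinely independent articles on the same topic rarely share many 5-word sequences.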
If you do find scraped copies of your content, it’s a good idea to first contact the webmaster asking them to remove the piece (although you might suspect they’re not very likely to respond). If that’s not effective, you may want to report the scraper using Google’s copyright infringement report.
In local SEO, reviews mean a lot. An influx of negative ones isn’t just bad for your local rankings; it’s bad for business. But reviews are relatively easy to manipulate, and they may be the first thing a jealous competitor will try to do.
How to stay safe: Obviously, you need to keep an eye on your Google My Business listing and look through the new reviews your company gets. Fake reviews violate Google’s policy, according to which, one should never “post reviews on behalf of others or misrepresent your identity or connection with the place you’re reviewing”.
When you’re positive you’ve spotted a fake review, you can flag it for removal following these steps:
1. Navigate to Google Maps.
2. Search for your business using its name or address.
3. Select your business from the search results.
4. In the panel on the left, scroll to Review summary.
5. Under the average rating, click [number of] reviews.
6. Scroll to the review you’d like to flag and click the flag icon.
7. Complete the report form.
When they don’t know better, a desperate competitor may try to crash your site altogether (here is a real-life example). Mainly, this is achieved by forcefully crawling the site, causing heavy server load that can slow the site down or even take it offline. If search engines can’t access your site while it’s down, you’ll definitely lose some crawl budget; and if this happens a few times in a row… You guessed it – you might get de-ranked.
How to stay safe: If you notice that your site is becoming slower, or, worse, crashes altogether, a wise thing to do is contact your hosting company or webmaster — they should be able to tell you where the load is coming from. If you know a thing or two about server logs though, here are some detailed instructions on finding the villain crawlers in the logs and blocking them with robots.txt and .htaccess.
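As an illustration of what that blocking looks like in practice (the bot name and IP address below are made up), the robots.txt and .htaccess rules would be along these lines:

```apacheconf
# robots.txt: polite crawlers will obey this
User-agent: BadBot
Disallow: /

# .htaccess (Apache 2.2 syntax): for crawlers that ignore robots.txt,
# block the offending IP you found in your server logs
Order Allow,Deny
Allow from all
Deny from 203.0.113.42
```

Keep in mind that an abusive crawler is likely to ignore robots.txt entirely, so the server-level IP block is the rule that actually does the work.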
Clicks are a controversial signal in the SEO space; not everyone believes they are a ranking factor. But there are real-life experiments that clearly show that an unusually high click-through rate on a certain search result can boost its rankings, while an unusually low CTR can get a site de-ranked.
Bartosz Góralewicz actually saw this happen in a negative SEO attack on a client’s site. It looked like a CTR bot was programmed to search for the client’s main keywords and branded terms, then click and dwell on various results. It would then click on the client’s listing and quickly bounce back to the SERP. Eventually, the client’s site dropped in the SERPs.
How to stay safe: Make sure to carefully monitor your main keywords’ CTR in Google Search Console, under Search Traffic > Search Analytics. There, you’ll find both the stats on your site’s overall CTR across all keywords, and the click rates for individual keywords.
Negative on-page SEO attacks are much more difficult to implement. These involve hacking into your site and changing things around.
Here are the main SEO threats a hacker attack can pose.
You’d think you’d notice if someone changed your content around, but in reality, this tactic can be very subtle and difficult to spot. It involves adding spammy content (and links) to a website; the trick is, this content is often well hidden (e.g., under “display:none” in HTML), so you won’t see it unless you look in the code.
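As a naive illustration of what an audit looks for, here’s a sketch that scans raw HTML for links inside elements hidden with display:none. (A real tool would parse the DOM properly; regex-based scanning like this is fragile and only meant to show the idea.)

```python
import re

def find_hidden_links(html):
    """Find link targets inside elements hidden with display:none,
    a common pattern for injected spam links."""
    hidden_blocks = re.findall(
        r'<[^>]*style="[^"]*display:\s*none[^"]*"[^>]*>(.*?)</', html, re.S | re.I
    )
    links = []
    for block in hidden_blocks:
        links.extend(re.findall(r'href="([^"]+)"', block, re.I))
    return links

# Hypothetical hacked page: the spam link is invisible in the browser
hacked = ('<p>Welcome!</p>'
          '<div style="display:none">'
          '<a href="http://spam.example/pills">cheap pills</a></div>')
print(find_hidden_links(hacked))  # ['http://spam.example/pills']
```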
How to stay safe: Regular site audits with a tool like WebSite Auditor are the best way to continuously check your site for such threats. To run an audit, simply launch WebSite Auditor and create a project for your site. To re-run it for an existing project, use the Rebuild Project button. As long as you do this regularly, you should be able to spot subtle changes that could otherwise go unnoticed, such as a change in the number of outgoing links on the site.
To look into those links in detail, switch to the All Resources dashboard and check the External resources section. If you spot an unexpected increase in their count, look through the list on the right to see where those links point to, and the lower part of the screen for the pages they were found on.
If you’ve identified and eliminated an attack and need to clean up the mess it created, Custom Search is a great help. To use it, go to WebSite Auditor’s Pages dashboard and click Custom Search. Enter a piece of the content that was added to your pages during the attack (such as a keyword), and click Search. The tool will then find all instances of your query across your entire site.
A change in robots.txt is one simple alteration that could wreak havoc on your entire SEO strategy. A disallow rule is all it takes to tell Google to completely ignore your important pages or even the entire website.
There are multiple examples of this online, including this story: a client fired an SEO agency he wasn’t happy with, and the agency took revenge by adding a “Disallow: /” rule to the client’s site.
How to stay safe: Regular ranking checks will help you be the first to know should your site get de-indexed. With Rank Tracker, you can schedule automatic checks to occur daily or weekly. If your site suddenly drops from search engines’ results, you’ll see a Dropped note in the Difference column.
If this happens for a large number of keywords, it usually implies a penalty or de-indexation. If you suspect the latter, check the crawl stats in your Google Search Console account and take a look at your robots.txt.
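You can also sanity-check a robots.txt yourself; Python’s standard library can parse the rules and tell you whether a given page is crawlable. Here, the rules contain the sabotage described above, and the URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A sabotaged robots.txt: a single "Disallow: /" blocks the whole site
rules = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    print(url, "crawlable:", parser.can_fetch("Googlebot", url))
# Both print False: every page is blocked for every crawler
```

In a real check you’d point RobotFileParser at your live robots.txt URL (via set_url and read) and test your most important pages.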
A possible negative SEO scenario is someone modifying your pages to redirect to theirs. This isn’t a threat for most small businesses, but if your site enjoys high authority and link popularity, it could be someone’s sneaky way to increase their own site’s PageRank, or to simply redirect visitors to their site when they try to access yours.
For the site under attack, such redirects aren’t just a temporary inconvenience. If Google finds out about the redirect before you do, they can penalize the site for “redirecting to a malicious website”.
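Server-side redirects require access to your server configuration, but sneaky redirects are also commonly injected right into the page itself, for example as a meta refresh tag. A naive sketch of scanning a page for one (a thorough check should also cover JavaScript-based redirects):

```python
import re

def find_meta_refresh(html):
    """Return the target URL of a <meta http-equiv="refresh"> redirect, if any."""
    match = re.search(
        r'<meta[^>]+http-equiv=["\']refresh["\'][^>]+content=["\']\d+;\s*url=([^"\']+)',
        html, re.I,
    )
    return match.group(1) if match else None

# Hypothetical injected redirect sending your visitors elsewhere
page = '<head><meta http-equiv="refresh" content="0; url=http://attacker.example/"></head>'
print(find_meta_refresh(page))  # http://attacker.example/
```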
How to stay safe: See point 1 above. With WebSite Auditor, it should be pretty easy to see if any new redirects have been added to your site by looking at the Redirects section in your site audit. Make sure to run these site audits regularly, so that if any changes are made on your site, you are the first to know about them, not Google.
Even if the attacker has no negative SEO in mind, a hacker attack per se can hurt your SEO. Google wants to protect its users and takes a dim view of any site hosting malware (or linking to sites that do); that’s why, if they suspect a site has been hacked, they may de-rank it, or at the very least add a “this site may be hacked” line to its search listings.
Would you click on a result like that?
How to stay safe: Negative SEO aside, not getting hacked should be high on your list of priorities for obvious reasons. This topic deserves a post of its own, but you can find some great tips on stepping up your site’s security here and here.
Above, I’ve covered nine common negative SEO tactics and how you can protect yourself against them. But this list is not exhaustive: anything that can negatively affect your site’s reputation has the potential to be used against you. The main takeaway here is to keep a close eye on your organic traffic, rankings, and backlinks.
If you have your own tips or additions to the list, please let me know in the comments below!