2021-10-08

Avoiding Webspam: Essential Tips & Insight for Marketers

Earlier this year, Google released its 2020 webspam report. This report detailed everything the experts at Google are doing to combat spammy search results. Let’s go over some of the key points in this 2020 report and review marketing tips to ensure you’re not the next victim of webspam.

The 2020 Google Webspam Report

In a blog post released in April, Google broke down what it has been doing to fight spam in search results. They wasted no time patting themselves on the back, letting readers know that they've reduced search engine spam by about 80% over the last three years. Google also announced significant investments in artificial intelligence to reduce the amount of content that is either scraped from other websites or auto-generated by bots. In short, Google has built artificial intelligence to fight the artificial intelligence creating spam in the search engine. Finally, they announced that they need your help to stop webspam.

How to Avoid Website Hacking

One of the biggest sources of webspam is hacked websites. In the report, Google breaks down the best practices you should follow to avoid being hacked. There are three prominent reasons websites get hacked: compromised passwords, compromised security, and insecure or out-of-date plugins.

COMPROMISED PASSWORDS

The first reason websites get hacked is compromised passwords. If you're still using "123456" as your website password, change it! Try a password management system like LastPass or an equivalent tool to keep your passwords secure, and make your passwords long enough to safeguard against brute-force attacks. Finally, remember that a password can be compromised simply by sharing it with somebody who shouldn't have it.
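To make the brute-force point concrete, here's a minimal Python sketch of why length matters. The attacker's guess rate here is an assumption for illustration; real-world attack speeds vary widely depending on hardware and how the passwords are stored:

```python
def brute_force_years(length, alphabet_size=62, guesses_per_second=1e10):
    """Rough years needed to try every password of a given length.

    alphabet_size=62 assumes upper- and lowercase letters plus digits;
    guesses_per_second is an illustrative attacker speed, not a benchmark.
    """
    combinations = alphabet_size ** length
    seconds = combinations / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# Each extra character multiplies the attacker's work by the alphabet size.
for length in (8, 12, 16):
    print(f"{length} characters: ~{brute_force_years(length):.1e} years")
```

An 8-character password in this model falls in under a day, while each additional character multiplies the search space by 62, which is why length is the cheapest security upgrade you can make.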

COMPROMISED SECURITY

The second biggest reason websites get hacked is compromised security in the core technology that runs the website. For instance, if you're using a content management system like WordPress and you haven't updated WordPress core (or the version of PHP it runs on) to a current release, you're likely introducing a vulnerability into your website.

INSECURE PLUGINS 

The third biggest reason your website might get hacked is insecure plugins or other technologies used alongside your content management system. Perhaps you have an old scheduling plugin you love but haven't updated because the update would force you to rework your site's designs and styles. That outdated plugin creates a significant security vulnerability that could let malicious actors take over your website.

In summary: don't give your passwords to people who don't need them, use strong, unique passwords, and consider adopting a password management system so they can't be guessed or brute-forced. Keep all your core technologies, such as your content management system and plugins, up to date to avoid exposing your website to the common security vulnerabilities hackers take advantage of.
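If your site runs WordPress, the update checks above can be scripted. This is a sketch using WP-CLI (assumes WP-CLI is installed and the commands are run from the WordPress root; always test updates on a staging copy first):

```shell
# List available core updates, then apply them
wp core check-update
wp core update

# Show plugins with pending updates, then update everything
wp plugin list --update=available
wp plugin update --all

# Confirm core files haven't been tampered with
wp core verify-checksums
```

Running these on a schedule (or at least checking them regularly) closes the out-of-date core and plugin gaps that Google flags as the most common causes of hacked sites.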

How Google Filters Webspam from Search Results

In the next section of the report, Google provided an infographic explaining how its process filters webspam out of search results. Essentially, machines work at every stage of the pipeline to filter out content that looks non-original or auto-generated. Let's look at how they break this down:

There are three main layers of defense Google builds into this process:


STEP 1: ADDRESSING CRAWLED SPAM

First, as Google's crawlers discover new content on the web, machines filter and flag anything that looks like spam, catching both long-standing spam tactics and newer ways of auto-generating content.

STEP 2: PRE-INDEX CHECKS

The second layer of Google's defense checks content as it is indexed. These systems screen for spam, duplicate content, and auto-generated content before it is introduced into the index.

STEP 3: MANUAL CONTENT REVIEW TEAM

The third and final layer is a manual review team for content that has been indexed and is starting to rank. At this point in the process, Google reviews any content it deems suspicious, and its team may take manual actions to remove certain content from search results.

One way to find out whether your site has been hit by a manual action is Google Search Console: Google alerts verified site owners whenever a manual action is taken against their web property. If you think you've been unfairly targeted, there is a reconsideration process that allows you to request a reevaluation of your website.

Preventing Hackers with Web Management

In this report, Google outlined everything they are doing to combat webspam. They also put a lot of pressure on us as website managers to do our part in preventing hackers from taking over the internet. If you need help with your web services, including PPC and SEO, contact us to see how we can help.

I hope you found this helpful. And remember to always be optimizing.

Jeff Cooper

Founder & President, Saltbox Solutions

Copyright © 2024 Saltbox Group, LLC | All Rights Reserved.