In an announcement by Gary Illyes on the Google webmaster blog, Google clarified that spam reports are used only to improve Google’s spam detection algorithms. The mention of manual actions was removed from the Google webmaster guidelines because spam reports are no longer going to lead to a Google employee reviewing a site and manually penalizing it.

The difference is that when a spam report is submitted, Google will use it to figure out how to improve its search algorithm. A site reported this way won’t be dropped from the search results simply because it was reported. Google may use the information to update its algorithm, but it will not manually penalize any specific site.

What are manual actions? They are penalties that Google employees can assign to individual sites or pages that violate Google’s webmaster guidelines. Spam reports will not lead to manual actions.

This is what the guidelines said prior to this update:

“If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google’s search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.”

This is what the guidelines say now, after this update:

“If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, and will use the report for further improving our spam detection systems.”

There is more. Gary Illyes from Google wrote: “spam reports play a significant role: they help us understand where our automated spam detection systems may be missing coverage. Most of the time, it’s much more impactful for us to fix an underlying issue with our automated detection systems than it is to take manual action on a single URL or site.”

Source: Barry Schwartz