#16, 11-11-2018, 10:25
atom0s
Quote:
Originally Posted by Archer
As far as I remember, the robots.txt file is ignored when it comes to anti-malware scanning. At least it didn't help when I faced a similar problem.
This.

Google's Safe Browsing scanning ignores the robots.txt file and any other attempted blocks. You can stop Google from crawling and indexing your site entirely by blocking their bots' user agents, but then you lose ranking and indexing altogether.

Voting on VirusTotal doesn't fix anything with Google either. VT's verdict on the site isn't tied into Google; it's VT's own assessment, based on links scanned from the domain and on community voting. The most you'll achieve is getting VT to mark the site as safe, while the Google block stays in place.

The easiest way to clear the Google block is to password-protect the files. Other fixes involve custom handling based on the connecting client, such as checking for the user agent of Google's spider/analyzer and serving it a harmless file or a basic page instead, one that isn't an error of any kind. If you just return an error for those requests, you'll pile up crawl errors for the site in Google's webmaster dashboard, and Google will stop indexing the site until they're corrected.
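
For anyone who wants to go the user-agent route, here's a rough sketch of the idea using only Python's standard library; the crawler markers, port, and file name are placeholders I picked for illustration, not anything Google documents:

Code:
# Sketch: serve a harmless placeholder page to Google's crawlers and the
# real file to everyone else, keyed off the User-Agent header.
from http.server import BaseHTTPRequestHandler, HTTPServer

CRAWLER_MARKERS = ("googlebot", "google-safety")  # assumed substrings
REAL_FILE = "tool.zip"                            # hypothetical flagged download

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        if any(marker in ua for marker in CRAWLER_MARKERS):
            # Always answer crawlers with a normal 200 page so the webmaster
            # dashboard doesn't fill up with crawl errors.
            body = b"<html><body><p>Download page.</p></body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            # Regular visitors get the actual file.
            with open(REAL_FILE, "rb") as f:
                data = f.read()
            self.send_response(200)
            self.send_header("Content-Type", "application/zip")
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()

You can sanity-check which branch a crawler would hit with something like curl -A "Googlebot" http://localhost:8080/ before putting anything like this in front of the real download.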
__________________
Personal Projects Site: https://atom0s.com