Quote:
Originally Posted by Archer
As far as I remember, robots file is ignored when it comes to anti-malware scanning. At least it didn't help when I faced a similar problem.
This.
Google's Safe Browsing scanning ignores the robots.txt file and other attempted blocks. You can fully block Google from crawling/indexing your site by blocking their bots' user agent, but then you lose ranking and indexing altogether.
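For reference, a robots.txt that blocks Google's crawler entirely looks like this (this is the standard robots.txt syntax; as noted above, it removes you from the index and does nothing to stop the malware scan):

```
User-agent: Googlebot
Disallow: /
```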
Voting on VirusTotal doesn't fix anything with Google either. VT's verdict on a site is not tied into Google's systems; it's an independent assessment based on scanned links from the domain and community voting. At most you'll get VT to mark the site as safe, but the Google block will still be there.
The easiest method of fixing the Google block is password-protecting the files. Other methods involve custom behavior based on the connecting client, such as checking for Google's user agent (their spider and analyzer) and serving a fake file or a basic site/page that isn't an error of any kind. If you just error out the page, you'll rack up a ton of crawl errors for the site on the dashboard, and Google will stop indexing your site until they are corrected properly.
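The user-agent approach above can be sketched with Python's standard library. This is a minimal illustration, not a hardened implementation: the function name `is_google_crawler` and the token list are my own, and a real deployment should also verify crawler identity via reverse DNS rather than trusting the User-Agent header alone.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Tokens commonly found in Google crawler User-Agent strings (illustrative list).
GOOGLE_AGENT_TOKENS = ("Googlebot", "AdsBot-Google", "Google-InspectionTool")

def is_google_crawler(user_agent: str) -> bool:
    """Rough User-Agent sniff. A spoofable check; verify via reverse DNS in production."""
    return any(token in user_agent for token in GOOGLE_AGENT_TOKENS)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if is_google_crawler(ua):
            # Serve a plain 200 placeholder, never an error page, so the
            # dashboard doesn't fill up with crawl errors.
            body = b"<html><body>Nothing to see here.</body></html>"
        else:
            body = b"<html><body>Real content for regular visitors.</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run locally:
# HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
```

Note the key point from above: both branches return HTTP 200, so the crawler never sees an error status.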