11-11-2018, 10:25
atom0s
Originally Posted by Archer:
As far as I remember, the robots file is ignored when it comes to anti-malware scanning. At least it didn't help when I faced a similar problem.

Google's Safe Browsing scanner ignores robots.txt and any other attempted blocks. You can block Google from crawling/indexing your site entirely by denying their bot's user agent, but then you lose ranking and indexing altogether.
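For reference, the full crawl block described above (with the ranking/indexing cost that comes with it) is just a robots.txt rule; this is a generic sketch, not anything from the site in question:

```
User-agent: Googlebot
Disallow: /
```

Note that this only stops the crawler from indexing pages; as said above, it does not make the Safe Browsing flag go away.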

Voting on VirusTotal doesn't fix anything with Google either. VT's information about the site is not tied into Google; it's VT's own verdict, based on scanned links from the domain and community voting. The most you'll accomplish is making VT consider the site safe, but the Google block will still be there.

The easiest way to clear the Google block is password-protecting the files. Other fixes involve custom actions based on the connecting client, such as detecting Google's user agent for their spider/analyzer and serving a fake file or a basic page that isn't an error of any kind. If you just error out the page, you'll rack up a ton of site errors in the Search Console dashboard, and Google will stop indexing your site until they are corrected properly.