#16
Quote:
Google's 'Safe Search' engine stuff ignores the robot.txt file or other attempted blocks. You can full-on block the site from scanning/indexing your site via blocking their bots user agent but you lose ranking and indexing altogether then. Voting on VirusTotal doesn't fix anything with Google either. VT's info about the site is not tied into Google and is its own personal opinion based on scanned links from the domain or opinion based voting. Most you'll do is make VT assume the site is safe but the Google block is still going to be there. The easiest method of fixing the Google block is passwording the files. Other methods of fixing the issue involve custom actions based on the connecting client, such as looking for Google's user-agent for their spider and analyzer and just serving a fake file or a basic site/page that isn't an error of any kind. If you just error out the page, you'll get a ton of errors regarding the site on the dashboard that will cause Google to stop indexing your site until they are corrected properly.
__________________
Personal Projects Site: https://atom0s.com
#17
I don't understand why this topic has such a huge thread going on.
Any website with "questionable" material on it will get flagged. "Questionable material" includes cracks, methods for creating cracks or exploits, and a whole bunch of similar stuff. There is no way around it other than hiding the site's contents from unregistered users; that way even the spiders cannot access it.
#18
The problem was solved and has been gone for two weeks.
Thanks to Hmily and Aaron, and thanks to everyone who provided suggestions.
Similar Threads
| Thread | Thread Starter | Forum | Replies | Last Post |
| Site down? | nikre | General Discussion | 31 | 09-13-2018 19:24 |
| Unwanted code added while assembling on Olly | RaptorX | General Discussion | 3 | 02-18-2011 03:49 |