随心泛目录 Usage Notes, Part 2

Recently a friend got good results with 随心泛目录, bought a lot of domains to scale it up, and then his server crashed! Let me explain: the "泛" (wildcard) means broad coverage. If the program imposes no limits, it will generate an unlimited number of pages, and that consumes a great deal of server resources: memory, disk, and CPU.

There are several ways to handle this:
1: Block junk spiders

nginx rewrite (pseudo-static) rule:
if ($http_user_agent ~* (BLEXBot|o-http-cl|AhrefsBot|DataForSeoBot|Barkrowler|SemrushBot|semrush|DotBot|mj12bot|SM-G900P|xx-xx)) {
    return 503;
}
Apache:
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "ing|oogle|BLEXBot|AliApp|o-http-cl|AhrefsBot|DataForSeoBot|Barkrowler|SemrushBot|semrush|DotBot|mj12bot|SM-G900P|xx-xx" [NC]
RewriteRule !(^robots\.txt$) - [F]
IIS (web.config):
<rule name="noua" stopProcessing="true">
    <match url=".*" ignoreCase="false" />
    <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="(SemrushBot|o-http-cl|MegaIndex.ru|MauiBot|BLEXBot|acoonbot|ahrefsbot|alexa|toolbar|apachebench|applebot|asktbfxtv|chinasospider|compspybot|coolpadwebkit|crawldaddy|curl|digext|dotbot|easouspider|ec2linkfinder|edisterbot|elefent|exabot|ezooms|feeddemon|feedly|heritrix|httpclient|ichiro|indy|library|jaunty|java|jikespider|jorgee|lightdeckreports|bot|mail.ru|microsoft|url|control|mj12bot|msnbot-media|obot|perl|psbot|purebot|python|python-urllib|scrapy|seokicks-robot|siteexplorer|spbot|spiderman|swebot|swiftbot|teleport|teleportpro|turnitinbot|turnitinbot-agent|universalfeedparser|wangidspider|wbsearchbot|webdup|wget|wotbox|wsanalyzer|xbfmozilla|xenu|yandexbot|yottaa|yunguance|yyspider|zmeu|Yahoo! Slurp China|Yahoo! Slurp|msnbot|msnbot-media|ia_archiver|EasouSpider|JikeSpider|YandexBot|AhrefsBot|ezooms.bot)" />
    </conditions>
    <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />
</rule>
Important! The nginx and IIS rules above do not block the Google or Bing spiders (note that the Apache pattern does, via the `ing|oogle` entries). If you want to block or allow them, adjust those entries yourself, then add the rules to your rewrite config.
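Before deploying, it helps to sanity-check which User-Agent strings the blocklist actually catches. Here is a minimal shell sketch that tests a UA string against the same case-insensitive regex the nginx rule uses (the `check_ua` helper and sample UA strings are illustrative, not part of the original rules; placeholder entries like `xx-xx` are omitted):

```shell
#!/bin/sh
# Same alternation as the nginx rule above; grep -Ei gives case-insensitive
# extended-regex matching, mirroring nginx's ~* operator.
BLOCK_RE='BLEXBot|o-http-cl|AhrefsBot|DataForSeoBot|Barkrowler|SemrushBot|DotBot|mj12bot|SM-G900P'

# Hypothetical helper: prints whether a given User-Agent would be blocked.
check_ua() {
    if printf '%s' "$1" | grep -Eiq "$BLOCK_RE"; then
        echo "blocked: $1"
    else
        echo "allowed: $1"
    fi
}

check_ua "AhrefsBot/7.0"
check_ua "Googlebot/2.1"
```

Running this shows `AhrefsBot/7.0` being blocked while `Googlebot/2.1` passes through, confirming the note above that Google is not in this list.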

2: Upgrade the server's hardware
3: Analyze the site access logs under wwwlogs
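For step 3, a quick way to find which crawlers are eating resources is to count requests per User-Agent in the access log. A minimal sketch, assuming the default combined log format (the sample log lines and the `/tmp/sample_access.log` path are made up for illustration; point the `awk` at your real file under wwwlogs):

```shell
#!/bin/sh
# Create a small sample log in the nginx/Apache combined format.
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:01 +0800] "GET /a HTTP/1.1" 200 512 "-" "AhrefsBot/7.0"
1.2.3.4 - - [01/Jan/2024:00:00:02 +0800] "GET /b HTTP/1.1" 200 512 "-" "AhrefsBot/7.0"
5.6.7.8 - - [01/Jan/2024:00:00:03 +0800] "GET /c HTTP/1.1" 200 512 "-" "Googlebot/2.1"
EOF

# Split on double quotes: field 6 is the User-Agent in combined format.
# Count occurrences and sort descending so the heaviest crawler is on top.
awk -F'"' '{print $6}' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

The bots that dominate this output are the ones worth adding to the blocklist in step 1.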
