1. Nginx Hotlink Protection
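The post edits the vhost file without showing its contents, so here is a minimal sketch of what the hotlink-protection block might look like; the extension list and the *.test.com whitelist are assumptions:

location ~* ^.+\.(gif|jpg|jpeg|png|bmp|swf|flv|rar|zip|gz|bz2)$
{
    expires 7d;
    # "none" permits requests that carry no Referer header at all;
    # "blocked" permits referers that a proxy or firewall has stripped of http://
    valid_referers none blocked server_names *.test.com;
    if ($invalid_referer) {
        return 403;
    }
    access_log off;
}

With a block along those lines in place, the commands below edit the file, check the syntax, reload nginx, and test the whitelist with forged Referer headers (curl -e):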
#vi /usr/local/nginx/conf/vhost/test.com.conf
#/usr/local/nginx/sbin/nginx -t
#/usr/local/nginx/sbin/nginx -s reload
#curl -e "http://www.baidu.com/1.txt" -x127.0.0.1:80 -I test.com/1.gif //referer outside the whitelist: returns 403
#curl -e "http://www.test.com/1.txt" -x127.0.0.1:80 -I test.com/1.gif //whitelisted referer: returns 200

2. Nginx Access Control

Restrict access to the /admin/ directory by source IP (a sketch of the location block appears at the end of this post):

#vi /usr/local/nginx/conf/vhost/test.com.conf
#/usr/local/nginx/sbin/nginx -t
#/usr/local/nginx/sbin/nginx -s reload
#curl -x127.0.0.1:80 -I test.com/admin //the request comes from the allowed address 127.0.0.1, so access succeeds

Matching by regular expression

Deny PHP execution under the upload directory (see the sketch at the end of this post):

#vi /usr/local/nginx/conf/vhost/test.com.conf
#/usr/local/nginx/sbin/nginx -t
#/usr/local/nginx/sbin/nginx -s reload
#mkdir /data/wwwroot/test.com/upload //create the upload directory
#echo "1111" > /data/wwwroot/test.com/upload/1.php
#curl -x127.0.0.1:80 test.com/upload/1.php //rejected by the new rule

Restricting by user_agent

#vi /usr/local/nginx/conf/vhost/test.com.conf

Add a block like this to the vhost:

if ($http_user_agent ~ 'Spider/3.0|YoudaoBot|Tomato')
{
    return 403;
}

To make the match case-insensitive, add a * after the ~ operator, i.e. ~*.

#curl -A"Tomatoafdkdjk" -x127.0.0.1:80 test.com/upload/1.txt -I //the user_agent contains Tomato, so the request is denied

3. Nginx PHP Parsing Configuration

#vi /usr/local/nginx/conf/vhost/test.com.conf
#/usr/local/nginx/sbin/nginx -t
#/usr/local/nginx/sbin/nginx -s reload

If the path in fastcgi_pass unix:/tmp/php-fcgi.sock; is wrong, nginx returns a 502 error. When you hit a 502, first look at the error log, then check whether the fastcgi_pass unix:/tmp/php-fcgi.sock; path in the config file is correct; if it is not, look up the sock path defined in /usr/local/php-fpm/etc/php-fpm.conf.

#tail /usr/local/nginx/logs/nginx_error.log //check the error log

If /usr/local/php-fpm/etc/php-fpm.conf defines listen = 127.0.0.1:9000, then fastcgi_pass unix:/tmp/php-fcgi.sock; in /usr/local/nginx/conf/vhost/test.com.conf must be changed to fastcgi_pass 127.0.0.1:9000;.

To summarize the approach to 502 errors: first check that the nginx config /usr/local/nginx/conf/vhost/test.com.conf and the PHP config /usr/local/php-fpm/etc/php-fpm.conf agree with each other. If one is configured with a sock path, the other must use the same sock path; if one is configured with 127.0.0.1:9000, the other must likewise be 127.0.0.1:9000.

Also note that in /usr/local/nginx/conf/vhost/test.com.conf, the path in fastcgi_param SCRIPT_FILENAME /data/wwwroot/test.com$fastcgi_script_name; must correspond to root /data/wwwroot/test.com; in the same file (a sketch of the whole PHP location appears at the end of this post).

A 502 can also be caused by php-fpm exhausting its resources, leaving its worker processes stuck:

#vi /usr/local/php-fpm/etc/php-fpm.conf //see the pool sketch at the end of this post

4. Nginx Proxy

#cd /usr/local/nginx/conf/vhost/
#vi proxy.conf //a sketch of this file appears at the end of this post
#/usr/local/nginx/sbin/nginx -t
#/usr/local/nginx/sbin/nginx -s reload
#curl ask.apelearn.com/robots.txt //robots.txt is a rule list aimed at search-engine spiders
#curl -x127.0.0.1:80 ask.apelearn.com/robots.txt //going through 127.0.0.1 also returns it
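The config blocks behind the steps above are likewise never shown, so the sketches that follow are reconstructions under stated assumptions, not the author's exact files. For the /admin access control in section 2, an IP whitelist could look like this (the second address is purely illustrative):

location /admin/
{
    allow 127.0.0.1;
    allow 192.168.188.1;
    deny all;
}

Requests from any source not on the allow list receive 403.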
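For the regular-expression match in section 2, a block that refuses PHP requests under the upload directory might be (the image/ alternative is an assumption):

location ~ .*(upload|image)/.*\.php$
{
    deny all;
}

This is why curl -x127.0.0.1:80 test.com/upload/1.php is rejected even though 1.php exists on disk.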
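For section 3, the PHP location in the vhost ties together the two settings the troubleshooting notes revolve around, fastcgi_pass and SCRIPT_FILENAME; a sketch:

location ~ \.php$
{
    include fastcgi_params;
    # must agree with the "listen" line in /usr/local/php-fpm/etc/php-fpm.conf;
    # if php-fpm listens on TCP, use "fastcgi_pass 127.0.0.1:9000;" instead
    fastcgi_pass unix:/tmp/php-fcgi.sock;
    fastcgi_index index.php;
    # the directory here must match this vhost's "root /data/wwwroot/test.com;"
    fastcgi_param SCRIPT_FILENAME /data/wwwroot/test.com$fastcgi_script_name;
}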
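For the resource-exhaustion case at the end of section 3, the relevant knobs live in the pool section of /usr/local/php-fpm/etc/php-fpm.conf. The values below are illustrative starting points, not recommendations:

[www]
listen = /tmp/php-fcgi.sock
; when listening on a unix socket, nginx must be able to write to it,
; otherwise every PHP request fails with 502
listen.mode = 666
pm = dynamic
pm.max_children = 50
pm.start_servers = 20
pm.min_spare_servers = 5
pm.max_spare_servers = 35
pm.max_requests = 500

If the workers are routinely exhausted, raising pm.max_children (within available memory) is the usual first step.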
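For section 4, proxy.conf would define a virtual host that accepts requests for ask.apelearn.com and forwards them to the real server; the upstream IP below is a placeholder:

server
{
    listen 80;
    server_name ask.apelearn.com;

    location /
    {
        # address of the real ask.apelearn.com backend (placeholder)
        proxy_pass http://121.201.9.155/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

This is why the second curl works: the local nginx on 127.0.0.1 accepts the request and fetches the content from the upstream on the client's behalf.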