.htaccess - CodeIgniter site not working on live server

Can someone help me? My site was working fine on the staging server, but after moving it to the live server I am getting HTTP ERROR 500. My .htaccess file is:

<IfModule mod_rewrite.c>
RewriteEngine On
#RewriteBase /
# Removes access to the system folder by users.
# Additionally this will allow you to create a System.php controller,
# previously this would not have been possible.
# 'system' can be replaced if you have renamed your system folder.
RewriteCond %{REQUEST_URI} ^system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
# When ...Read more
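A frequent cause of a 500 after moving hosts is a RewriteBase or AllowOverride mismatch between servers. A minimal front-controller sketch, assuming index.php sits in the live server's document root (adjust RewriteBase if the site lives in a subdirectory):

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    # Send everything that is not an existing file or directory
    # through CodeIgniter's index.php.
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)$ index.php?/$1 [QSA,L]
</IfModule>
```

Checking the live server's error log usually pinpoints which directive triggers the 500.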

apache .htaccess redirect - clean url

I am working on a PHP redirect script which 302-redirects visitors to other sites when they access a redirect URL. The script gets a variable (id) from the URL and then redirects the visitor to the specific page. The URL structure is: example.com/redirect/index.php?id=test. At the moment all redirects work if I use "ugly" URLs, but I want to strip all unnecessary information out of the URL with .htaccess rewrites for better usability. Which .htaccess rewrite rules do I need to make the above URLs look like example.com/redirect/test? I am cu...Read more
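A sketch of a rule for a .htaccess file in the /redirect/ directory, assuming ids contain only word characters and hyphens; it maps /redirect/test internally to /redirect/index.php?id=test:

```apache
RewriteEngine On
RewriteBase /redirect/
# Leave real files (index.php itself) and directories alone.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([\w-]+)/?$ index.php?id=$1 [L,QSA]
```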

how to redirect a dynamic URL using .htaccess?

Is there any way to redirect this link:
http://example.com/?page=profile.php&id=100
to
http://example.com/profile.php?id=100
using .htaccess? By the way, I know that I can redirect from http://example.com/?page=profile.php to http://example.com/profile.php using a 301 redirect, but I am asking about the id: in the first link I use it with &, while in the second URL I use it with ?. Is there a way to make the redirect take the id parameter into consideration...Read more
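A sketch using mod_rewrite: RewriteRule only sees the URL path, so the query string has to be captured with a RewriteCond; %1 and %2 then refer to its groups:

```apache
RewriteEngine On
# Match /?page=profile.php&id=100 and capture the page and the id.
RewriteCond %{QUERY_STRING} ^page=([^&]+)&id=(\d+)$
RewriteRule ^$ /%1?id=%2 [R=301,L]
```

Because the substitution specifies its own query string, the original `page=...&id=...` query is dropped rather than appended.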

.htaccess - nginx configuration for specifying expiration

I am trying to improve my PageSpeed score. I have installed nginx and switched my sites to HTTP/2, but in the GTmetrix performance report my CSS, PNG and JS files have no expiration dates ("expiration not specified" under the "Leverage browser caching" recommendation). I think I have tried all possible ways to set this, in different variants. If there are other improvements to my nginx instructions or .htaccess file, feel free to post them.
nginx additional instructions in Plesk:
gzip on;
gzip_vary on;
gzip_min_length 1240;
gzip_proxied expired no-cache no-st...Read more
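A sketch for the Plesk "Additional nginx directives" box; the 30-day lifetime is an arbitrary choice:

```nginx
# Far-future caching headers for static assets.
location ~* \.(css|js|png|jpe?g|gif|ico|svg|woff2?)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```

If these file types are actually served by a proxied Apache backend rather than by nginx directly, the expiry headers may need to be set on that backend instead.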

.htaccess - How to disable HSTS header with HTTP?

I have inserted the following in the .htaccess of my site in order to be admitted to the HSTS preload list:

<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</IfModule>

The problem is that when I submit my site, I obtain:

Warning: Unnecessary HSTS header over HTTP. The HTTP page at http://fabriziorocca.it sends an HSTS header. This has no effect over HTTP, and should be removed.

At the moment I use the following in the .htaccess in order to switch from http to htt...Read more
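A sketch that sends the header only on HTTPS responses, using an expression condition (available in Apache 2.4.10 and later):

```apache
<IfModule mod_headers.c>
    # Plain-HTTP responses (i.e. the redirect) carry no HSTS header.
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" "expr=%{HTTPS} == 'on'"
</IfModule>
```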

.htaccess - HSTS Preload Submission

I want to pass all tests for HSTS preload. I currently have 2 errors that I need to solve.

First:
`http://example.com` (HTTP) should immediately redirect to `https://example.com` (HTTPS) before adding the www subdomain. Right now, the first redirect is to `https://www.example.com/`.

My .htaccess looks like this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
RewriteRule ^(.*)$ https://%{HTTP_HOST}/ [R=302,L]

Second:
Response error: No HSTS header is present on the response.

My .htaccess looks...Read more
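A sketch of the redirect ordering the preload checker expects: the first hop only upgrades the scheme on the same host, and the second adds www:

```apache
RewriteEngine On
# Hop 1: http://example.com/... -> https://example.com/...
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
# Hop 2: already on HTTPS, add the www subdomain.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```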

.htaccess - Warning: Unnecessary HSTS header over HTTP

I want to always use https:// and non-www URLs, so I used the following code in my .htaccess file, but I am getting a warning from https://hstspreload.org:

RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</IfModule>

The warning message is given below:

Warning: Unnecessary HSTS header over HTTPT...Read more
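A sketch of one common fix, assuming the server exports the HTTPS environment variable (mod_ssl does): the env= condition suppresses the header on plain-HTTP responses, which is exactly what the warning asks for:

```apache
<IfModule mod_headers.c>
    # Header is attached only when the HTTPS env var is set.
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
</IfModule>
```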

.htaccess - how to force https before www in htaccess

I am trying to finalize HSTS compliance; I am a Web guy, but this is over my head. My current .htaccess is:

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{ENV:HTTPS} !on [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R,L]
</IfModule>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS

I am using https://hstspreload.org/ to check compliance, and when I run this tool for my domain it returns: http://example.com (HTTP) should immediately redirect to https://exa...Read more
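A sketch splitting the single redirect in two, so that http://example.com first reaches https://example.com before www is added; the preload checker requires the scheme upgrade to stay on the same host:

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Scheme upgrade on the current host.
    RewriteCond %{HTTPS} !=on
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
    # Then canonicalize to www.
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</IfModule>
```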

.htaccess - Ban IPs from text file using htaccess

I read and understand how to block an IP using .htaccess:

order deny,allow
deny from 111.222.33.44
deny from 55.66.77.88
...
allow from all

But my list of blacklisted IPs includes thousands of IPs. I save all the IPs to a blacklist.txt file. Can I use .htaccess to call blacklist.txt and block all the IPs which are stored in this file? If so, how?...Read more
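A sketch under two assumptions: RewriteMap is only allowed in the server or virtual-host config (not in .htaccess), and the file must be reformatted as a key/value map with one `1.2.3.4 deny` entry per line (the map path below is mine):

```apache
RewriteEngine On
RewriteMap ipblock txt:/etc/apache2/blacklist.map
# Look the client address up in the map; default to "allow".
RewriteCond ${ipblock:%{REMOTE_ADDR}|allow} =deny
RewriteRule ^ - [F]
```

On shared hosting without access to the server config, the practical alternative is to generate thousands of `deny from` lines from blacklist.txt with a script and paste them into .htaccess.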

.htaccess - Access-Control-Allow-Origin Multiple Origin Domains?

Is there a way to allow multiple cross-domains using the Access-Control-Allow-Origin header? I'm aware of *, but it is too open; I really want to allow just a couple of domains. As an example, something like this:

Access-Control-Allow-Origin: http://domain1.example, http://domain2.example

I have tried the above, but it does not seem to work in Firefox. Is it possible to specify multiple domains, or am I stuck with just one?...Read more
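The header accepts a single origin per response, so the usual workaround is to echo the request's Origin back when it matches a whitelist. A sketch (the CORS_ORIGIN variable name is mine):

```apache
SetEnvIf Origin "^(https?://(domain1|domain2)\.example)$" CORS_ORIGIN=$1
Header set Access-Control-Allow-Origin "%{CORS_ORIGIN}e" env=CORS_ORIGIN
# Tell caches the response varies by requesting origin.
Header merge Vary Origin
```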

.htaccess - How to block access to specific URL?

I am getting thousands of bot visits every hour to one specific URL of my website: http://www.domain.com/.gtput/. I would like to block ALL traffic (human + bot) trying to access this URL. (This URL is not accessed by humans, though!) After a lot of googling, I found an answer that worked, from here --> Anyway to block visits to specific URLs, for eg via htaccess?. I am using the following code in the .htaccess file to block this URL:

<IfModule mod_alias.c>
Redirect 403 /.gtput/
</IfModule>

Is there a BETTER way to block ALL traffic from accessing that on...Read more
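A sketch of a mod_rewrite equivalent; the [F] flag answers 403 immediately, which is about as cheap as .htaccess processing gets:

```apache
RewriteEngine On
# 403 for /.gtput and anything beneath it.
RewriteRule ^\.gtput(/|$) - [F]
```

Either approach only saves the request handling; blocking the offending client IPs or user agents upstream (firewall or server config) saves more.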

Restrict access to webpage from single referrer domain using .htaccess

I want to restrict access to a website to only allow referrers from a single domain, and I can't get the .htaccess file to work correctly. Say I am referred from http://domainname.com - access will be allowed. Or from http://subdomain.domainname.com - access will be allowed. But any other referrer (or typing the URL in directly) will be blocked and directed to an Access Denied page. Code as follows (note I need to allow access from ANY referrer page on domainname.com):

RewriteEngine On
RewriteBase /
# allow these referers to passthrough
RewriteCond %{HTTP_REFERER} ^http://(protect|u...Read more
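A sketch that admits any page on domainname.com or its subdomains and rewrites everything else to a denial page (the access-denied.html name is mine). Note the Referer header is trivially forged, so this is a soft barrier at best:

```apache
RewriteEngine On
# Don't loop on the denial page itself.
RewriteCond %{REQUEST_URI} !^/access-denied\.html$
# Referer is NOT domainname.com or one of its subdomains.
RewriteCond %{HTTP_REFERER} !^https?://([^/]+\.)?domainname\.com(/|$) [NC]
RewriteRule ^ /access-denied.html [L]
```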

mod rewrite - IP banning via .htaccess not working

Liteserver on a shared-access host here. I'm trying to get rid of a lot of bots who waste my resources for nothing. I can successfully restrict access to some of them using a specific User-Agent, but I can't ban their IP addresses, together with those of a lot of Chinese ones which are constantly scanning my website. I am still seeing the AhrefsBot IP (5.10.83.44) in the access log, even though its IP is banned (see the last line of the .htaccess file). The rules should already be inherited by the /gallery subfolder.

5.10.83.44 - - [07/Sep/2013:00:56:42 +0200]...Read more
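A sketch covering both access-control syntaxes, since shared hosts differ in which generation of directives the server honors:

```apache
# Legacy (Apache 2.2-style) syntax:
<IfModule !mod_authz_core.c>
    Order allow,deny
    Allow from all
    Deny from 5.10.83.44
</IfModule>
# Apache 2.4 syntax:
<IfModule mod_authz_core.c>
    <RequireAll>
        Require all granted
        Require not ip 5.10.83.44
    </RequireAll>
</IfModule>
```

It is also worth checking whether the logged hits predate the ban, or arrive through a CDN/proxy layer that masks the real client address.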

.htaccess - cloudflare user-agent in htaccess

I've got .htaccess blocking all user agents, and I only allow the ones I need. How can I allow Cloudflare access without using (Mozilla)? This is what I've got; the Cloudflare user agent is:

Mozilla/5.0 (compatible; CloudFlare-AlwaysOnline/1.0; +http://www.cloudflare.com/always-online)

RewriteEngine on
AuthType Basic
AuthName "private"
AuthUserFile "/home/example/.htpasswds/public_html/exemple/passwd"
require valid-user
#-only allow-#
SetEnvIf User-Agent .0011 0011
Order deny,allow
Deny from all
Allow from env=0011
#-index only open f...Read more
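A sketch that keys on the distinctive "CloudFlare-AlwaysOnline" token instead of the generic "Mozilla" prefix (the allow_cf variable name is mine):

```apache
SetEnvIf User-Agent "CloudFlare-AlwaysOnline" allow_cf
Order deny,allow
Deny from all
# Existing user-agent allowances can sit alongside this one.
Allow from env=allow_cf
```

Since user-agent strings are easily spoofed, allowing Cloudflare's published IP ranges instead is a stricter option.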

.htaccess - How do I block IPs using htaccess

How can I stop these IP addresses from hitting my site (185...* and 93.127..)? I have tried these in .htaccess:

Order Allow,Deny
Deny from 185.162.0.0/15
Deny from 185.164.0.0/14
Deny from 185.168.0.0/13
Deny from 185.176.0.0/12
Deny from 185.192.0.0/11
Deny from 93.127.0.0/16
Allow from all

and bots from http://tab-studio.com/en/blocking-robots-on-your-page/, but I continue to be hit....Read more
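One possibility is that the server runs Apache 2.4 without mod_access_compat, in which case the old Order/Deny directives are not available. A sketch of the native 2.4 form:

```apache
<RequireAll>
    Require all granted
    Require not ip 185.162.0.0/15
    Require not ip 185.164.0.0/14
    Require not ip 185.168.0.0/13
    Require not ip 185.176.0.0/12
    Require not ip 185.192.0.0/11
    Require not ip 93.127.0.0/16
</RequireAll>
```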