7 htaccess file examples that work for SEO

The HTACCESS (hypertext access) file can deal, as it did for me, with some tricky SEO issues and improve your ranking in the search engines. For such a small text file it surely does some fantastic work for anyone who runs their site on an Apache server. I’m not the world’s most technically savvy website builder, but I do try to understand as much as I can about what the code I am inserting does and how it works.

HTML and CSS are not much of a problem, as I know what I am doing, but code like PHP, JavaScript and htaccess directives is still a bit of a mystery to me. Even though I use them on my website, the nitty-gritty of the code would take another leap in my understanding of websites. So I will try to explain what I know about this very helpful little file and give you some of the htaccess code that worked for me.

What is the HTACCESS file used for?

It has a whole myriad of uses, some very important and some not so much. What the htaccess file does is communicate with your server, giving it instructions as to what you want it to do when someone visits a page on your website.

When I first tested my website at Nibbler, I had a very low Server Behavior score. Nibbler informed me that I needed to use a 301 redirect or a canonical meta tag to force search engines to the correct version of my website, and that I had duplicate content because of this.

Search engines and visitors were visiting the different domains connected to my website, i.e. the www version, the naked (non-www) version, and also the index.html of both versions.

Another cause of duplicate content was pages returning with or without the trailing slash ( / ). There are lots of reasons for duplicate content, and I found the article Duplicate content: causes and solutions by Joost de Valk very helpful.

Along with the problem of duplicate content, there were URLs which were not search engine friendly, slow webpage loading times because there was no webpage compression, and an incorrect setting of cacheable resources.

All of these issues can result in a lower ranking and even penalization of your website by the search engines. When I learnt this it invoked a real horror in me!

The solution? The HTACCESS file.

First I had to do a little FAQ reading at my host to find out if I could use the htaccess file on their server. This is important, as the file can be seen as hacking by some hosts, so you must check that it’s allowed. In my case the answer was yes, and they supported the mod_rewrite function of the htaccess file, which enables you to rewrite URLs. As I was using one of their Windows servers, and the htaccess file works on Apache servers, I got in touch with the team and asked them to switch me over to an Apache server, which they did.

How do I create a htaccess file?

Creating the htaccess file is quite easy once you know how. First open a blank text file in an editor like Notepad. Then save the file exactly like this:

".htaccess"

including the (dot) and the inverted commas (which is important), all in lower case. This should produce a file which appears unreadable and uneditable. “So what good is that!” I hear you cry. To edit the htaccess file you will need website editing software similar to Adobe’s Dreamweaver. You could use Notepad or another text editor to edit this file, but it is a lot simpler to use the correct software.

IMPORTANT! You can cut and paste the code from this page into your htaccess file, but make sure there is no extra spacing between the lines when you paste it into your editor.

Solving The Canonical Website Address With HTACCESS

The first issue I resolved with htaccess was that of the canonical website address. This had not been much of an issue before Nibbler, as you can set the preferred domain using Google’s Webmaster Tools, which I had done, but that only works for Google and not for the other search engines like Yahoo and Bing.

Searching for the correct code for this took a bit of trial and error; there are different ways it can be done, and occasionally the code I found was incorrect or missing parts. I tried several bits of code, each time uploading it to my server via FTP (ASCII, not binary) and then testing it with Nibbler to see if it was working. Eventually I found the code that worked, which is below:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourwebsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourwebsite.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://www.yourwebsite.com/$1 [R=301,L]

The htaccess file has a syntax all of its own. You can get all the information from The Apache Software Foundation website, but in all honesty, unless you understand the technicalities of it all, it is very hard to make head or tail of. But when it works it works, and Nibbler likes it!
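If it helps, here is the same www redirect again with a comment over each line explaining what it does, as far as I understand it (yourwebsite.com is a stand-in for your own domain, so adjust it before use):

```apache
# Switch the rewrite engine on so the rules below are processed
RewriteEngine on
# If the host the visitor asked for is the naked domain (case-insensitive)...
RewriteCond %{HTTP_HOST} ^yourwebsite\.com$ [NC]
# ...send a permanent (301) redirect to the same path on the www version
RewriteRule ^(.*)$ http://www.yourwebsite.com/$1 [R=301,L]
# If the original request asked for an index.html page...
RewriteCond %{THE_REQUEST} ^.*/index\.html
# ...301-redirect it to the same URL without the index.html on the end
RewriteRule ^(.*)index\.html$ http://www.yourwebsite.com/$1 [R=301,L]
```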

The above code works if you want to redirect your visitors to the www version of your website; it also redirects the index.html pages to the correct canonical address. If you want it to redirect to your naked website address instead, then you put this:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.yourwebsite\.com$ [NC]
RewriteRule ^(.*)$ http://yourwebsite.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://yourwebsite.com/$1 [R=301,L]

There appears to be no SEO benefit to using www or non-www (naked) domains; as long as you use one or the other to avoid duplicate content, it’s up to you. But before choosing you might want to read this article regarding the issue: WWW vs non-WWW for your Canonical Domain URL – Which is Best and Why?

HTACCESS Code for Clean URLs

The next thing the htaccess file helped me with was clean URLs. I was informed, by none other than Nibbler, that my URLs had extensions (how awful) and that I must rewrite them to make them easier for my visitors to remember, more search engine friendly, and better for security. A clean URL serves a webpage without its extension, which in my case was the .html, as opposed to an ugly URL which keeps the extension on the end.

A Google search eventually turned up this code, which removes the webpage extension for HTML:

RewriteEngine on
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.html -f
RewriteRule ^(.*)$ $1.html [L]

This code also works for PHP: all you do is replace the html with php, and, to Nibbler’s delight, it works!

IMPORTANT! To enable this code to work, you have to remove the .html extension from the links within your webpage HTML, so links like this:

<a href="lee_allan_kane_photography.html">……</a>
<a href="wedding-photography.html">……</a>

should look like

<a href="lee_allan_kane_photography">……</a>
<a href="wedding-photography">……</a>

This means that if you build your website offline in a folder, as I do, then the links will stop working locally! This is the case for HTML, but I think the PHP guys build on a live server, so it wouldn’t make any difference to them.
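One thing worth knowing: the clean-URL code above only rewrites internally, so anyone who bookmarked or linked to an old .html address will still see it in their browser. Below is a sketch of a companion rule that 301-redirects those old .html requests to the clean URL. This one is not from my own file, so test it on your server first:

```apache
RewriteEngine on
# THE_REQUEST holds the browser's original request line (e.g. "GET /page.html HTTP/1.1"),
# not the internally rewritten one, so this will not loop with the clean-URL rule.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\ ]+)\.html[\ ?]
# 301-redirect /page.html to /page
RewriteRule ^ /%1 [R=301,L]
```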

Getting Rid Of The Trailing Slash with HTACCESS


After adding the code for clean URLs, I found that some pages had a trailing slash at the end of the URL and others did not. The trailing slash denotes a directory, and URLs without a trailing slash denote a file. In most cases both versions of these URLs will have the same content, which is a problem for search engines, as they do not know which one to serve and will see them as duplicate content.

There is htaccess code which you can implement in your file which will either remove the slash from all URLs or add it. It is your choice which one you use, and as long as you pick one or the other, again, search engines are happy, as it resolves the issue of duplicate content. Below is the code to remove the trailing slash:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ http://%{HTTP_HOST}/$1 [R=301,L]
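And if you prefer the other choice, here is a sketch of the opposite rule, which forces a trailing slash onto URLs that do not have one. It is a commonly used form but not one I have run myself, so test it before relying on it:

```apache
RewriteEngine On
# Do not touch requests for real files such as images or stylesheets
RewriteCond %{REQUEST_FILENAME} !-f
# 301-redirect any URL not ending in a slash to the same URL with one added
RewriteRule ^(.*[^/])$ http://%{HTTP_HOST}/$1/ [R=301,L]
```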


The HTACCESS code for GZIP


An issue that was brought up by SEOquake and Nibbler was that my webpages were not compressed with Gzip.

Gzip is compression and decompression server software that compresses your webpage when it is visited, enabling the page to load faster in the visitor’s browser. There is code you can put in the htaccess file that turns gzip on, so that when the visitor’s browser announces it can accept compressed pages, the server sends them compressed.

Most website hosts have this software on their servers and serve the websites hosted with them already compressed, so you don’t have to worry. Some have the software installed, but you have to activate it via the htaccess file. There are websites which can check this for you; just Google “gzip testing”.

Mine was the latter, and after adding the code to my growing htaccess file, hey presto! My site was compressed. And here is the gzip code for the htaccess file:

# compress text, HTML, JavaScript, CSS, and XML
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
# remove browser bugs
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent

This code compresses my HTML and CSS website, including the XML, RSS and JavaScript. The # denotes a comment, in the same way you comment an HTML or CSS document. It covers the main content types of my website, but I have no doubt that if you were to research a bit further there will be more types you can add to this, to cover other applications like PHP etc.
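One caution: if your host’s server does not have the mod_deflate or mod_headers modules enabled, those bare directives can themselves cause a 500 Internal Server Error. Wrapping them in IfModule blocks, as sketched below, makes Apache skip them instead of failing (this guard is my addition, not part of the code above):

```apache
<IfModule mod_deflate.c>
# Compression directives only run if mod_deflate is actually loaded
AddOutputFilterByType DEFLATE text/plain text/html text/xml text/css
AddOutputFilterByType DEFLATE application/xml application/xhtml+xml application/rss+xml
AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
<IfModule mod_headers.c>
# The Vary header only needs mod_headers
Header append Vary User-Agent
</IfModule>
```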

How To Set The Browser Cache With HTACCESS

Pingdom was another SEO tool I used, but one I think is bettered by Google’s PageSpeed Insights. Both tools test the speed of your webpage and produce a report detailing what it is on your page that is slowing it down when it loads in a visitor’s browser.

They suggested that I should leverage browser caching to reduce load time, enabling previous visitors to load resources from their own cache rather than over the network (that’s assuming they don’t clear their browser cache and are regular visitors to my site).

With the htaccess file you can set the lifetime of cacheable resources like images, HTML, CSS, JavaScript etc. Enter the code below and set the max-age (which is in seconds) to at least one week.

# 1 WEEK 1 DAY
<filesMatch "\.(jpg|jpeg|png|gif|swf|ico)$">
Header set Cache-Control "max-age=691200, public"
</filesMatch>
# 1 WEEK 1 DAY
<filesMatch "\.(xml|txt|js)$">
Header set Cache-Control "max-age=691200, proxy-revalidate"
</filesMatch>
# 1 WEEK 1 DAY
<filesMatch "\.(html|htm|css|php)$">
Header set Cache-Control "max-age=691200, private, proxy-revalidate"
</filesMatch>

All the above htaccess code dealt with my Server behaviour issues for SEO and gave me the maximum score of 10 on Nibbler.
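If your host has the mod_expires module enabled, there is also an alternative way to set resource lifetimes, which works out the caching headers for you from human-readable times. A sketch (the lifetimes are just examples, pick your own):

```apache
<IfModule mod_expires.c>
ExpiresActive On
# Cache images for a month
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
# Cache CSS and JavaScript for a week
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```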

More Information About The HTACCESS File And Some Extra Code

There are some things about htaccess code that can affect whether it works or not. When creating your htaccess file, do some research for yourself, as there are two important directives in getting the code to work, which are:

RewriteBase /


RewriteEngine On

These bits of code at the beginning tell the server to initiate the rewrite commands when required. Some servers require some extra code to go before these two directives, which looks like this:

Options +FollowSymLinks

If your htaccess file does not appear to be working at all with the code from this page, you should look into the above option, as some hosts’ servers have this switched on already, like mine, and some do not.

Uploading your htaccess file should be the same as uploading webpages: use ASCII (American Standard Code for Information Interchange) mode, which, as it says on the box, is pretty much the standard way of uploading text files.

I also read, and noticed myself when creating the .htaccess file, that the file may occasionally not save properly or become corrupt if commands are entered incorrectly. If that’s the case, just create another file and re-enter the code. I found that once I got the file and the code working properly, all was good.

Here is some more htaccess code which will come in handy.

HTACCESS Security


This piece of code is essential for securing your htaccess file. It disables access from the internet so no one can read, download or edit it via your website. This code differs from some you may find on other websites, as it offers heightened security by adding stronger pattern matching:

<Files ~ "^.*\.([Hh][Tt][Aa])">
order allow,deny
deny from all
satisfy all
</Files>

HTACCESS 301 Redirect

The htaccess 301 redirect is very useful if you have to change the name of a URL. This bit of code will redirect the old page to the new page:

Redirect 301 /old-webpage http://www.yourwebsite.com/new-webpage

(You may have to add the extension if you’re not using clean URLs.)
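If you have renamed a whole folder rather than a single page, RedirectMatch lets you use a pattern instead of writing one line per page. A sketch (the folder names are made up for illustration):

```apache
# 301-redirect everything under /old-gallery/ to the same path under /gallery/
RedirectMatch 301 ^/old-gallery/(.*)$ http://www.yourwebsite.com/gallery/$1
```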

Custom 404 Error Pages

You can also make a custom error page and direct 404 errors to the custom page with the code below:

ErrorDocument 404 /your-folder-name/your-error-page.html
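The same directive works for other error codes too; for example (the page names here are placeholders for your own files):

```apache
ErrorDocument 403 /your-folder-name/forbidden.html
ErrorDocument 500 /your-folder-name/server-error.html
```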

Setting The Charset With HTACCESS

You can set the charset for your site in the htaccess file and remove the charset meta tag from your webpages, which trims a little from every page and lets the server declare the encoding in its HTTP headers instead:

AddCharset UTF-8 .html

The only problem with this is that if a visitor reads your webpage offline, the character encoding will not be set.

In Conclusion

With the htaccess file, and all the other modifications made within my website’s pages as directed by SEOquake, Nibbler, Pingdom and PageSpeed Insights, in a short time I grew my visitors from around 70 to 300+. That does not seem a lot, but I have been monitoring my analytics and the figure has not dropped. So from what I can gather, around the 300 mark appears to be the upper limit JUST for making those modifications.

As I said at the start of the SEOquake article, one of the best SEO tools you have is good, regularly updated content. So I will grow my site with good content, which in theory should attract more visitors and give me a better ranking. And the better the ranking, the more work offers I should get in my field of photography. But there is still MUCH more you can do for your SEO, including backlinks and social networking, which I will cover in further articles.

38 Responses

  1. Great Post! I have been looking everywhere for common htaccess rules that just work, and yours do. I appreciated the time you took to give great detail about what each rule does and providing the correct ways to write them.

    I’m now using these on my site and everything works great. Can’t thank you enough for making everything so clear. Great Job!

  2. Wonderful post, I was searching for this kind of stuff to solve my problem. This is a complete guide to the .htaccess file. Now I’ve got it; anyway, thanks for the post. I also have many kinds of problem-solving blogs, check them out.

    Viktor –

  3. HI Lee Kane, good stuff. I almost broke my head in writing a .htaccess code for an E-commerce website and finally took three full days to complete my redirection process. I was trying to route index.php to home and finally succeeded after major complications occurred in the site.

  4. Hi,

    This post about .htaccess is very helpful and I use compression and the security code on my websites. Only one thing is not clear to me please can you explain me why the code 301 redirect to www. use the below code from your above post

    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^yourwebsite\$ [NC]

    Why use ^yourwebsite\$ instead of the ^$

    Other than that all above codes are very clear to me. Thank you for this very helpful .htaccess codes for SEO and Security for website.

  5. abe

    This small but great snippet is not working for me and I get Internal Server Error (#500). Any insights?

    # 1 WEEK 1 DAY
    <filesMatch "\.(jpg|jpeg|png|gif|swf|ico)$">
    Header set Cache-Control "max-age=691200, public"
    </filesMatch>
    # 1 WEEK 1 DAY
    <filesMatch "\.(xml|txt|js)$">
    Header set Cache-Control "max-age=691200, proxy-revalidate"
    </filesMatch>
    # 1 WEEK 1 DAY
    <filesMatch "\.(html|htm|css|php)$">
    Header set Cache-Control "max-age=691200, private, proxy-revalidate"
    </filesMatch>


  6. nice post. Thanks.

    To check whether my htaccess file is working or not, I added some junk text into the <IfModule block, but I am not getting a 500 internal server error.

    Does that mean that part is not working?

  7. J

    Hi – Good article. The only thing is that you are removing the trailing / however in terms of SEO, everything I read say that it’s imperative to keep it, plus apparently there are some performance issues too in terms of apache.

  8. Darren Gregory


    I host my wife’s website with Bluepark, who have said that Google doesn’t like out-of-stock products being redirected to category pages … Are they correct in any way? … I’ve recently made the site fully SSL and the traffic drop since moving to https has been marked, despite all 301s being set up …. Thanks.

  9. I owe you a big whopping debt of gratitude. I’ve been told more than once over the years by various websites (most recently Pingdom) to use caching, but never how to use it. I Googled it again tonight and found you. You wrote in language I can understand! You gave examples and explanations! My website’s fixed. I had to straighten the curly quotes in your Browser Cache example, but otherwise your code was perfect. Thank you!!
