Creating Usable, Search Engine Friendly URLs

August 10th, 2008 Posted in SEO

There are many reasons to use mod_rewrite to create informative, useful URLs for your website. Most dynamic websites use some form of PHP or ASP to pull data from the database, and oftentimes that data ends up in the URL as a query string. This is not only a potential security flaw; it also gives users and search engines alike a very uninformative destination for your website.

Rather than have your page selling a baseball bat live at a URL like http://www.example.com/product.php?id=32, it makes much more sense to have that URL be something like http://www.example.com/products/louisville-slugger-cobra/. While both of these URLs might take you to the same page, with the second one the user will know exactly what the page has on it when going through his history, or if he emails it to himself or writes it down.

As for its effect on Search Engine Optimization, URL rewriting is a key component. Not only do all major search engines take the keywords in the URL into account, rewriting also creates totally unique URLs for all the pages on your site, which are much more likely to be indexed by the search engines. In addition, SEO-friendly URLs take all the & and = characters out of the URL, which can trip up some search engines and break people's links; someone who would like to link to your article might be unable to because your URL's query string interferes with their outbound link code.

If those reasons aren't enough, the security risk of putting your variables and code out for the world to see makes a front-end attack on your website much more likely. Attackers can see your naming conventions, what type of data is in your database, and could theoretically learn that certain pages insert certain information using GET parameters. All in all, it's just bad form.

Enough about WHY you should use nice URLs, now here are the basics on HOW!

First, we must create or edit the .htaccess file in your web root and start it with the directive RewriteEngine On. This will turn the rewrite engine on for your entire site.

We then add rewrite rules. There are many very basic rewrite rules and some very complex ones. Here's how it works, starting with a very basic example.

RewriteEngine on
RewriteRule ^old\.html$ new.html

Though this is the simplest example possible, it may throw a few people off. The structure of the ‘old’ URL is the only difficult part in this RewriteRule. There are three special characters in there.

  • The caret, ^, signifies the start of a URL, under the current directory. This directory is whatever directory the .htaccess file is in. You'll start almost all matches with a caret.
  • The dollar sign, $, signifies the end of the string to be matched. You should add this in to stop your rules matching the first part of longer URLs.
  • The period or dot before the file extension is a special character in regular expressions, and would mean something special if we didn’t escape it with the backslash, which tells Apache to treat it as a normal character.
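Because mod_rewrite patterns are ordinary regular expressions, you can try them out away from Apache. As a quick sketch, Python's re module understands the same ^, $, and \. metacharacters, so we can test the pattern from the rule above:

```python
import re

# The same pattern used in the RewriteRule, anchored at both ends.
pattern = re.compile(r"^old\.html$")

print(bool(pattern.match("old.html")))      # True: the exact URL matches
print(bool(pattern.match("old.html.bak")))  # False: $ rejects longer URLs
print(bool(pattern.match("oldxhtml")))      # False: \. only matches a real dot
```

If you drop the $ and try the second string again, it matches — which is exactly the "first part of longer URLs" problem the dollar sign prevents.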

Basically, that rewrite rule will take visitors requesting old.html straight to new.html without their having to do anything. While moving pages is never recommended, this is very helpful if you need to.

Sometimes you do want your readers to know a redirect has occurred, and you can do this by forcing a new HTTP request for the new page. This will make the browser load the new page as if it were the page originally requested, and the location bar will change to show the URL of the new page. All you need to do is turn on the [R] flag by appending it to the rule:

RewriteRule ^old\.html$ new.html [R]

This is especially useful if you want bookmarkers to start using the new page. It's also a very important idea to remember as we go deeper into mod_rewrite, as we will use flags like this when rewriting URLs with regular expressions.
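One detail worth knowing: on its own, [R] sends a temporary (302) redirect. If the old page is gone for good, you can give the flag an explicit status code, which search engines treat as a signal to transfer the old URL's standing to the new one. A sketch:

```apache
# Permanent (301) redirect so browsers and search engines update to the new URL
RewriteRule ^old\.html$ new.html [R=301]
```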

If you are a true coder, you have used Regular Expressions numerous times on many levels. They are the meat of URL rewriting, and they are very important in creating our SEO-friendly URLs. If you haven't ever used Regular Expressions, this might not make a ton of sense, but you should still be able to find and use examples to help you solve even the trickiest URL rewrite problems.

Let's look at our bat example from above. We will start very basic.

So we have a script where the URL http://www.example.com/product.php?id=32 takes us to product ID 32, which happens to be our Louisville Slugger Cobra bat. We want to make that a little cleaner and safer.

RewriteRule ^products/([0-9][0-9])/$ /product.php?id=$1

If we just add this one line of code under RewriteEngine On, the same page will now load when we type http://www.example.com/products/32/. This is the most basic of rewrite rules and should help you see how regular expressions can be used.

This will match any URLs that start with ‘products/’, followed by any two digits, followed by a forward slash. For example, this rule will match a URL like products/12/ or products/99/, and rewrite it internally to the PHP page.

The parts in square brackets are called ranges. In this case we’re allowing anything in the range 0-9, which is any digit. Other ranges would be [A-Z], which is any uppercase letter; [a-z], any lowercase letter; and [A-Za-z], any letter in either case.
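Again, since these are plain regular expressions, you can check the two-digit pattern outside Apache. A small Python sketch of what the rule above does and does not match:

```python
import re

# Same pattern as the RewriteRule: exactly two digits between the slashes.
rule = re.compile(r"^products/([0-9][0-9])/$")

m = rule.match("products/32/")
print(m.group(1))                    # "32" -- the captured group, i.e. $1
print(rule.match("products/5/"))     # None: only one digit
print(rule.match("products/123/"))   # None: three digits
```

The parenthesized group is what $1 refers to on the right-hand side of the rule, so products/32/ becomes /product.php?id=32.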

Now, for a bit more complex example.

RewriteRule ^([_A-Za-z0-9-]+)/([_A-Za-z0-9-]+)/?$ /product.php?name=$2 [NC,L]

This code would match the category name with hyphens between words, then a /, then the product name with hyphens between words, replacing spaces. This one simple line turns the query-string URL for our Slugger Cobra into something like http://www.example.com/baseball-bats/louisville-slugger-cobra/.

You should be able to go through both the RewriteRule and the URLs and see how and why this works, especially using what we learned above about the special symbols.

Going through it quickly, the rule matches a category segment, a slash, and a product-name segment, each made up of one or more letters, digits, underscores, or hyphens (that's what the + after each bracketed range means), with an optional trailing slash. The second parenthesized group is captured as $2 and handed to product.php as the name parameter; [NC] makes the match case-insensitive and [L] tells Apache this is the last rule to apply for the request.
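To see the capturing in action, here is a Python sketch of the same pattern run against a hypothetical clean URL (the category and product names are made-up examples; re.IGNORECASE plays the role of [NC]):

```python
import re

# The pattern from the RewriteRule above.
rule = re.compile(r"^([_A-Za-z0-9-]+)/([_A-Za-z0-9-]+)/?$", re.IGNORECASE)

m = rule.match("baseball-bats/louisville-slugger-cobra/")
print(m.group(1))  # "baseball-bats": $1, the category (unused by this rule)
print(m.group(2))  # "louisville-slugger-cobra": $2, passed as the name param
```

Because neither range includes the forward slash, each group stops at the next /, which is how the URL splits cleanly into category and product.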

As a final example, here's a quick error-handling nugget.

ErrorDocument 404 /index.php

That will send any 404 errors to your index.php page, leaving the URL intact. ([R] is a RewriteRule flag, not an ErrorDocument one; if you instead give ErrorDocument a full URL, Apache sends a browser redirect and the visitor lands on index.php with the address bar changed.) This works for all error codes, and it's a great way to keep all your traffic, even when visitors mess something up!
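Putting the two behaviours side by side (example.com is a placeholder for your own domain):

```apache
# Serves /index.php in place, leaving the requested URL in the address bar
ErrorDocument 404 /index.php

# A full URL makes Apache send a redirect instead, so the address bar changes
ErrorDocument 404 http://www.example.com/index.php
```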

There are tons of uses for rewrite rules and much that can be done to help create safe, secure, optimized websites that are helpful for the user at the same time!
