Updated: 12/31/2017 by Computer Hope

.htaccess is a file stored in a directory, commonly on Unix and Linux variant operating systems, that grants or denies users or groups access rights to that directory. On a Unix or Linux system, this file should have its permissions set to 640 using chmod, be placed in the root public_html directory, and may look similar to the examples listed below.

Full .htaccess example and explanation

Below is a full breakdown of each of the major segments of a .htaccess file. Each of these segments can be incorporated into your .htaccess file, depending on what your account needs.


Be sure to test your web page after implementing any of the changes below. These changes can restrict or redirect your visitors in ways you may not have anticipated.


Lines that begin with # are comments, or nonexecutable statements. Also, many of the examples below use regular expressions to match characters or files in the URL string.

Set the default character set

#Set the charset for the pages
AddDefaultCharset UTF-8

In the first example, the character set of every page is set to UTF-8. Although this can be specified in a meta tag, setting it in .htaccess applies it to every document.
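
If you only want the setting for certain file types, mod_mime's AddCharset directive can be used instead of AddDefaultCharset; the extensions below are only an example.

#Set UTF-8 only for these example extensions
AddCharset UTF-8 .html .css .js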

Redirect matches found in URL

#Redirect M$soft and Hacking attempts
RedirectMatch (.*MSOffice)$ /error.htm
RedirectMatch (.*httpodbc\.dll)$ /error.htm
RedirectMatch (.*root\.exe)$ /error.htm
RedirectMatch (.*nt)$ /error.htm
RedirectMatch (.*comments\.php)$ /error.htm

In the above example, RedirectMatch sends any request matching one of the above patterns to the error.htm page. These requests could also be forwarded to a script that logs the matches or directs users elsewhere. In the first line, we match any URL ending in MSOffice and forward it to the error.htm page.
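
For example, instead of a static error page, the matched requests could be sent to a logging script. The file name log-attack.php below is hypothetical; substitute whatever script your server actually runs.

#Forward a suspicious request to a (hypothetical) logging script
RedirectMatch (.*cmd\.exe)$ /log-attack.php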

Redirect the user with a 410 error

#HTTP 410 gone - stop logging 404s for files we don't have
Redirect gone /crossdomain.xml
Redirect gone /labels.rdf

The next example returns a 410 error, which tells the user the page they're looking for is gone, is never going to return, and has no forwarding address. This type of response is a great way to handle requests for pages that don't exist on your server but are frequently requested and fill your error log with 404 errors.
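
The keyword gone is shorthand for the 410 status code; on recent Apache versions, the numeric code can also be given directly. The file name below is only an example.

#Same as "Redirect gone" but with the numeric status code
Redirect 410 /old-feed.xml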

Custom error document pages

#Error pages
ErrorDocument 400 /error.php?400
ErrorDocument 401 /error.php?401
ErrorDocument 403 /error.php?403
ErrorDocument 404 /error.php?404
ErrorDocument 405 /error.php?405
ErrorDocument 410 /error.php?410
ErrorDocument 500 /error.php?500
ErrorDocument 501 /error.php?501

In the above example, the listed HTTP errors are directed to a specific page, in this case a PHP script that shows the user an error message and logs the error for the webmaster. See our HTTP definition for a full listing of HTTP status codes if you need more than what is listed above. Your site may need nothing more than a custom 404 response.
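
The target doesn't have to be a script; a plain HTML page works as well. The file name below is only an example.

#Serve a static page for 404 errors
ErrorDocument 404 /notfound.html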

Create a 301 redirect

#HTTP 301 redirect computerhope.com to www.computerhope.com
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{HTTP_HOST} ^computerhope\.com [NC]
RewriteRule ^(.*)$ https://www.computerhope.com/$1 [L,R=301,NC]

In the above example, we created a 301 redirect that sends https://computerhope.com to https://www.computerhope.com. The redirection uses "L,R=301,NC" as its flags. The "L" is short for last and tells Apache to run no more rewrite rules, "R=301" performs the 301 redirect, and finally "NC" is short for no case and makes the rule case-insensitive. Creating this type of redirect helps prevent your web pages from getting listed multiple times in search engines and keeps everything consistent. We've also added the option to follow symlinks (symbolic links), because mod_rewrite needs symbolic links enabled to process rewrite rules in a .htaccess file.

RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
RewriteRule ^(.*) http://%1/$1 [R=301,L]

The above is another example of how to create a 301 redirect. In this example, we're redirecting any www address to the non-www address, so if implemented, https://www.computerhope.com would become https://computerhope.com. Here, we also used the wildcard pattern .* to capture the domain instead of specifying computerhope.com.
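
A 301 can also be created for a single moved page without mod_rewrite, using mod_alias. The page names below are only examples.

#Permanently redirect one moved page
Redirect 301 /oldpage.htm https://www.computerhope.com/newpage.htm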

Secure .htaccess file

# Secure htaccess file
<Files .htaccess>
order allow,deny
deny from all
</Files>

This next section creates a rule that prevents anyone from exploiting your server to view your .htaccess file and see the rules you have set up in it. These extra lines add additional protection to the .htaccess file.
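
The Order and Deny directives above use the older Apache 2.2 syntax. If your host runs Apache 2.4 or later, the equivalent rule looks like the following.

# Secure htaccess file (Apache 2.4 syntax)
<Files .htaccess>
Require all denied
</Files>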

Disable directory indexing

# disable directory browsing
Options -Indexes

In the above example, this security rule prevents anyone from browsing directories on your server. For example, if you have a directory called /files that does not contain an index.html file, that directory's file listing can be seen by anyone. If the directory holds sensitive files containing passwords or user data, anyone browsing that folder could view or save those files, which would be a security risk.
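
If you later want a listing in one specific directory (a public downloads folder, for example; the name here is only an illustration), a .htaccess file placed inside that directory can turn indexing back on for it alone.

# In /downloads/.htaccess only: allow a listing for this directory
Options +Indexes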

Make HTML files act as SSI files

#Allow files with chmod +x to be SSI files
XBitHack on

By turning on XBitHack, you can have any HTML file that has the executable permission set (e.g., with chmod +x) treated as an SSI file. This addition is useful for anyone who is running a website as static HTML files and needs one or more of those HTML pages to use SSI.
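
SSI processing is handled by mod_include, and on many servers the Includes option also needs to be enabled before XBitHack takes effect. If that's the case for your host, a line like the following is needed as well.

#Enable SSI processing (often required alongside XBitHack)
Options +Includes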

Enable website caching

# Month cache
<FilesMatch "\.(gif|jpg|jpeg|pdf|png|ico)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>

# Week cache
<FilesMatch "\.(js|css|ch|txt)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>

# Day cache
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=86400"
</FilesMatch>

In the above example, caching is set up to help improve how fast your pages load and to decrease the demand on your server. In the first block, image files and other static files are given a max-age of a month; in other words, once a visitor has loaded any of these file types, the browser will not request them from the server again for up to a month. Next, JavaScript and CSS files are given a max-age of a week. Finally, HTML files are limited to a day. These values can all be adjusted depending on how often you update these types of files.


The age is given in seconds. For example, there are 86,400 seconds in one day.
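
The Header directive is provided by the mod_headers module; if that module isn't loaded, these lines will trigger a server error. Wrapping them in an <IfModule> check, sketched below for the month cache, makes them apply only when the module is available.

<IfModule mod_headers.c>
<FilesMatch "\.(gif|jpg|jpeg|pdf|png|ico)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>
</IfModule>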

Deny visitors based on USER_AGENT

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]

There are dozens of tools and services that index your site looking for e-mail addresses or that copy your complete pages. Used improperly, these services can be a drain on your server, and they can also be used maliciously. If you notice these user agents in your visitor logs, they can be blocked with rules similar to the above.
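
User agent strings are not always capitalized consistently; adding the [NC] (no case) flag to a condition makes the match case-insensitive, as in this variation of the first condition above.

RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]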

Deny visitors based on IP address

Order Allow,Deny
Deny from 192.0.2.1
Allow from all

In the above example, these lines deny a single IP address from accessing your pages; the address 192.0.2.1 is only a placeholder for whatever address you want to block. Banning an IP using this method helps block anyone from that IP from doing anything to your website.
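
On Apache 2.4 and later, the same block can be written with the Require syntax; again, the address is only a placeholder.

<RequireAll>
Require all granted
Require not ip 192.0.2.1
</RequireAll>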

Creating a password protected directory

AuthUserFile /home/directory/.passfile
AuthGroupFile /dev/null
AuthName "Access For Valid Users"
AuthType Basic
<Limit GET>
require valid-user
</Limit>

The AuthUserFile directive points to the file containing the usernames and passwords of the users you want to grant access to the directory being protected.

To create a passfile, enter the following command at the prompt.

htpasswd -c .passfile username

After entering the above command, a prompt to enter the password for the username will appear.

The passfile should also be set to 640 permissions.
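
Assuming the file location used in the example above, the permissions can be set with chmod.

chmod 640 /home/directory/.passfile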

The above examples and information may not apply to all systems or setups. If you are unsure whether your ISP (Internet service provider) or web host supports setting access rights with .htaccess, or you cannot password protect a website, contact them for help.
