.htaccess

Updated: 10/17/2017 by Computer Hope

.htaccess is a configuration file stored in a directory, commonly on Unix and Linux variant operating systems, that grants or denies users or groups access rights to that directory. On a Unix or Linux system, this file should have its permissions set to 640 using chmod, be in the root public_html directory, and may look similar to the examples listed below.

Full .htaccess example and explanation

Below is a full breakdown of each of the major segments of a .htaccess file. Each of these segments can be incorporated into your own .htaccess file, depending on what your account needs.

Note: Be sure to test your web page after implementing any of the below changes. These changes can restrict or redirect your visitors in a way you may not have anticipated.

Tip: Lines that begin with # are comments or nonexecutable statements. Also, many of the examples below use regular expressions to help match characters or files in the URL string.

Set the default character setting
Redirect matches found in URL
Redirect the user with a 410 error
Custom error document pages
Create a 301 redirect
Secure .htaccess file
Disable directory indexing
Make HTML files act as SSI files
Enable website caching
Deny visitors based on USER_AGENT
Deny visitors based on IP address
Creating a password-protected directory

Set the default character setting

#Set the charset for the pages
AddDefaultCharset UTF-8

In this first example, the line sets the character encoding of each page to UTF-8. Although this can be specified in a meta tag, we recommend specifying it in the .htaccess file instead, so each file that loads is properly defined and every HTML page doesn't have to define the meta tag within it.
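
After making this change, you can verify the header is being sent from the command line, assuming curl is installed; replace the example URL with your own site.

#Check the Content-Type header returned by the server
curl -I https://www.example.com/ | grep -i content-type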

Redirect matches found in URL

#Redirect M$soft and Hacking attempts
RedirectMatch (.*MSOffice)$ /error.htm
RedirectMatch (.*httpodbc\.dll)$ /error.htm
RedirectMatch (.*root\.exe)$ /error.htm
RedirectMatch (.*nt)$ /error.htm
RedirectMatch (.*comments\.php)$ /error.htm

In the above example, RedirectMatch redirects any of the matched strings to the error.htm page. These lines could also point to a script that logs the matches or directs users more appropriately. The first line matches any URL ending in MSOffice and forwards it to the error.htm page.
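
For example, one of the patterns above could instead be pointed at a logging script; log-attack.php below is a hypothetical script name, not a real file on your server.

#Forward matched hacking attempts to a hypothetical logging script
RedirectMatch (.*root\.exe)$ /log-attack.php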

Redirect the user with a 410 error

#HTTP 410 - don't log requests for files we don't have
Redirect gone /crossdomain.xml
Redirect gone /labels.rdf

The next example responds to the user with a 410 error, which means the page they're looking for is gone, is never going to return, and has no forwarding address. This is a great way to handle requests for pages you do not have on your server but are frequently requested and causing 404 errors in your error log.
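
For example, if your error log shows repeated 404 errors for Apple touch icons you don't serve, the same technique applies:

#Stop requests for touch icons we don't serve
Redirect gone /apple-touch-icon.png
Redirect gone /apple-touch-icon-precomposed.png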

Custom error document pages

#Error pages
ErrorDocument 400 /error.php?400
ErrorDocument 401 /error.php?401
ErrorDocument 403 /error.php?403
ErrorDocument 404 /error.php?404
ErrorDocument 405 /error.php?405
ErrorDocument 410 /error.php?410
ErrorDocument 500 /error.php?500
ErrorDocument 501 /error.php?501

In the above example, any HTTP errors that are encountered are directed to a specific page, in this case a PHP script that gives the user a non-generic error and logs or reports the error to the webmaster. See our HTTP definition for a full listing of HTTP errors if you need more than what is listed above. Your own site may need nothing more than a custom 404 response.
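
If you don't want to maintain an error script at all, ErrorDocument also accepts a quoted text string instead of a URL; the message below is only an example.

#Serve a plain text message instead of an error page
ErrorDocument 404 "Sorry, that page could not be found."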

Create a 301 redirect

#HTTP 301 redirect computerhope.com to www.computerhope.com
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^computerhope\.com [NC]
RewriteRule ^(.*)$ https://www.computerhope.com/$1 [L,R=301,NC]

This next example creates a 301 redirect from https://computerhope.com to https://www.computerhope.com and uses "L,R=301,NC" as flags. The "L" is short for last and tells Apache to run no more rewrite rules, "R=301" performs the 301 redirect, and finally "NC" is short for no case and makes the rule case-insensitive. This is a great method to help prevent your web pages from getting listed multiple times in search engines and keeps everything consistent. We've also added the option to follow symlinks (symbolic links), which mod_rewrite needs enabled when run from a .htaccess file and which helps prevent errors if a file or directory is being linked to and is not an actual file or directory.

RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
RewriteRule ^(.*) http://%1/$1 [R=301,L]

This is another example of how to create a 301 redirect. In this example, we're directing any www address to a non-www address, so if implemented, http://www.computerhope.com would become http://computerhope.com. In this example, we also used the wildcard .* to capture the domain instead of specifying computerhope.com.
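
To verify either redirect is working, you can inspect the Location header from the command line, assuming curl is installed:

#Show where a URL redirects without following it
curl -sI http://www.computerhope.com/ | grep -i "^location"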

Secure .htaccess file

# Secure htaccess file
<Files .htaccess>
Order Allow,Deny
Deny from all
</Files>

In this next example, this section creates a rule that prevents anyone from using an exploit to view your .htaccess file and see the rules you have set up in it. This is great extra protection to add to any .htaccess file.
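
Note that Order and Deny are the Apache 2.2 syntax. On Apache 2.4 and later, the same protection is written with the Require directive:

# Secure htaccess file (Apache 2.4 syntax)
<Files .htaccess>
Require all denied
</Files>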

Disable directory indexing

# disable directory browsing
Options All -Indexes

This is another example of a security rule that should be added to your .htaccess to prevent anyone from browsing directories on your server. For example, if you have a directory called /files that does not contain an index.html file, that directory's file listing can be seen by anyone. If that directory holds sensitive files containing passwords or user data, the person browsing that folder could view or save any files in it, which would be a security risk.
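
Some hosts do not permit Options All in a .htaccess file. If the line above causes a 500 error, disabling only indexing is usually still allowed:

# disable directory browsing only
Options -Indexes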

Make HTML files act as SSI files

#Allow files with chmod +x to be SSI files
XBitHack on

By turning on XBitHack, you can make any HTML file that has executable permissions (e.g., set with chmod +x) be treated as an SSI file. This addition is useful for anyone who has been running a website as static HTML files and needs one or more of those HTML pages to use SSI.
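
For example, to enable SSI processing for a single page, assuming a file named page.html, run the following at the prompt:

chmod +x page.html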

Enable website caching

# Month cache
<FilesMatch "\.(gif|jpg|jpeg|pdf|png|ico)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>

# Week cache
<FilesMatch "\.(js|css|ch|txt)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>

# Day cache
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=43200"
</FilesMatch>

This next example sets up caching control, which can help increase the speed at which your pages load and decrease the demand on your server. Note that the Header directive requires the Apache mod_headers module. In the first block above, image files and other files that are not modified or updated frequently are set to a max-age of a month, which means if the visitor has already viewed such a file, it will not be loaded from the server again for at least a month. Next, files such as JavaScript and CSS files are set to a one-week max-age limit. Finally, the HTML files are set to a one-day limit. These values can all be adjusted depending on how often you update these types of files.
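
If mod_headers is not available on your server, the mod_expires module offers a similar result; the two image types below are only a sample.

# Month cache using mod_expires instead of mod_headers
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
</IfModule>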

Deny visitors based on USER_AGENT

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]

There are dozens of tools and services that index your site looking for e-mail addresses or copy your complete site. If used improperly, these services can be a drain on your server and can also be used maliciously. If you notice these user agents in your visitor logs, they can be blocked using rules similar to the above.
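
If mod_rewrite is not available, a similar block can be built with SetEnvIfNoCase and the access directives; HTTrack is used here as the example agent.

#Block a user agent without mod_rewrite
SetEnvIfNoCase User-Agent "HTTrack" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot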

Deny visitors based on IP address

Order Allow,Deny
Deny from 178.239.58.144
Allow from all

This next example denies an IP address access to your pages. This is helpful when someone is obviously trying to hack your server, leeching your bandwidth, or causing other problems.
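
The Deny directive also accepts a partial address or CIDR notation to block an entire range, and Apache 2.4 users would use Require instead; the addresses below reuse the example above.

#Block an entire range (Apache 2.2)
Deny from 178.239.58.0/24

#Apache 2.4 equivalent for a single address
<RequireAll>
Require all granted
Require not ip 178.239.58.144
</RequireAll>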

Creating a password-protected directory

AuthUserFile /home/directory/.passfile
AuthGroupFile /dev/null
AuthName "Access For Valid Users"
AuthType Basic

<Limit GET>
Require valid-user
</Limit>

AuthUserFile points to the file containing the usernames and passwords of the users you want to grant access to the directory being protected.

To create the passfile, enter the command below at the prompt, using the same path specified in AuthUserFile.

htpasswd -c /home/directory/.passfile username

After entering the above command, a prompt to enter and confirm the password for that username will appear.
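
To add more users to an existing passfile, omit the -c flag, which would otherwise create the file again and overwrite it; anotheruser below is a placeholder username.

htpasswd /home/directory/.passfile anotheruser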

The passfile should also be set to 640 permissions.
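
This can be done with chmod, as with the .htaccess file itself.

chmod 640 /home/directory/.passfile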

It is important to note that the above examples and information may not apply to all systems or setups. Therefore, if you are unsure whether your Internet Service Provider supports setting access rights using .htaccess, it is recommended you contact them if you are unable to set password protection on your website.

.htpasswd, gzip, SEO terms, Web design terms