Server Configuration

How to hide Apache information with ServerTokens and ServerSignature directives

In the default Apache configuration, the server sends an HTTP header disclosing the Apache version, loaded modules, operating system, and other details of the server. The HTTP response header “Server” displays all of these details. Attackers can use this information to exploit known vulnerabilities in Apache, the OS, or other modules you …
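As a minimal sketch, these two directives go in the main Apache configuration file (typically httpd.conf or apache2.conf; the exact path varies by distribution):

```apache
# Send only "Server: Apache" -- no version, OS, or module details
ServerTokens Prod

# Remove the server version footer from error pages and directory listings
ServerSignature Off
```

After changing these, reload Apache for the new header to take effect.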

How to hide PHP version in the HTTP Headers

In the default Apache/PHP configuration, the server sends an HTTP header disclosing which PHP version is running on the server. The HTTP response header “X-Powered-By” displays the version of PHP that is running. Attackers can use this information to exploit known vulnerabilities in the PHP version you …
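A minimal sketch of the usual fix, assuming you can edit php.ini (its location varies by distribution and PHP version):

```ini
; php.ini -- stop PHP from sending the X-Powered-By header
expose_php = Off
```

Restart the web server (or PHP-FPM) afterwards so the setting is picked up.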

Schedule cron jobs using crontab

Many times we need a process to run automatically at set times or at regular intervals. This can be done easily in Linux/Unix using cron jobs. In this post I’ll talk about cron and crontabs. Cron is a Unix utility which allows you to run tasks automatically at set times or at set intervals. A crontab is a file which contains the details of which task has to run and when. It is essentially a schedule of all cron jobs.
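As an illustration, each crontab entry has five time fields (minute, hour, day of month, month, day of week) followed by the command to run. The script paths below are hypothetical; edit your own crontab with `crontab -e`:

```
# m    h   dom mon dow   command
  0    2   *   *   *     /usr/local/bin/backup.sh     # every day at 02:00
  */15 *   *   *   *     /usr/local/bin/healthcheck.sh # every 15 minutes
```

A `*` means “every value” for that field, and `*/15` means “every 15th value”.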

Migrating servers using DNS TTL for minimum downtime

Suppose your site runs on old hardware and you want to migrate it to new, upgraded hardware. The move changes your site’s IP address, which in turn implies downtime for your site. You want to minimize the downtime caused by the IP address change. This can be easily achieved …
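The idea can be sketched with a BIND-style zone record (the hostname and addresses below are examples from the documentation ranges). Well ahead of the migration, lower the record’s TTL so resolvers cache the old address only briefly; at cutover, change the address:

```
; before the move: drop the TTL (e.g. from 86400 to 300 seconds)
www   300   IN   A   203.0.113.10    ; old server

; at cutover: switch the address; resolvers pick it up within ~5 minutes
www   300   IN   A   198.51.100.20   ; new server
```

Once traffic has fully moved, the TTL can be raised back to its normal value.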

How and when to use robots.txt file

We have all heard of crawlers and bots that crawl our sites to scrape content for various reasons, such as indexing in search engines, identifying content, or harvesting email addresses. All kinds of crawlers/bots crawl websites. While some are good and should be allowed access to our site, we might want to restrict …
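A minimal robots.txt sketch, served from the site root (the paths and the bot name are examples only; note that robots.txt is advisory and only well-behaved crawlers honor it):

```
# robots.txt -- placed at the root of the site, e.g. /robots.txt
User-agent: *
Disallow: /admin/

# Block one hypothetical misbehaving crawler entirely
User-agent: BadBot
Disallow: /
```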
