How and when to use robots.txt file

We have all heard of crawlers and bots that crawl our sites to scrape content for various reasons, such as indexing pages for search engines, identifying content, or harvesting email addresses. There are all kinds of crawlers/bots that crawl websites. While some are good and should be allowed access to our site, we might want to restrict …
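As a rough illustration of the kind of restrictions a robots.txt file can express, here is a minimal sketch (the paths and bot name are hypothetical, not taken from the article):

```
# Let all crawlers in, but keep them out of the admin area
User-agent: *
Disallow: /admin/

# Block one specific (hypothetical) bot entirely
User-agent: BadBot
Disallow: /

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is purely advisory: well-behaved crawlers honor it, but malicious scrapers can simply ignore it.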


Optimizing / Speeding up websites

In this article I will list a few things that can be done to speed up a website. I will give a brief description of each topic and links to more detailed material. I have used tools like Google’s PageSpeed, YSlow, and the Audits tab in Chrome Developer Tools.
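Two of the most common issues tools like PageSpeed and YSlow flag are missing text compression and missing cache headers. A minimal Apache `.htaccess` sketch addressing both (assuming `mod_deflate` and `mod_expires` are enabled on the server; file types shown are illustrative):

```
# Compress text-based responses before sending them (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Tell browsers to cache static assets for a month (requires mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
```

After changes like these, re-running the audit tools will show whether the compression and caching warnings have cleared.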