SEO (Search Engine Optimization) is not a science, in the sense that we cannot prescribe the same treatment to every website we optimize. It’s an art. One needs to see what an individual website has to offer and then optimize it to make that website unique in its industry. And this is where the secret of SEO lies.
However, there are a few rules that should be followed by every website that wants to be on the first page of search engines for its targeted keywords. Here, I have created an SEO checklist of things we should not do if we care about SEO at all. This list is based on my experience as a search engine optimizer. I’m going to modify this list as and when needed. So, watch this space!
X Do not use Frames
So, why would we use frames at all? They make it harder for search engines to crawl and index your content. You can check your website in the Lynx browser to see it roughly the way a search engine robot does.
X Do not create doorway pages
A doorway page is basically a single-page website where all the links point to another website that does the actual business. The idea is to buy multiple domains, create a one-page website on each, and then drive all the traffic to the main website. This is a waste of resources, because doing it can get all of your domains banned from search engines such as Google.
X Do not create multiple pages, subdomains, or domains with substantially duplicate content
Google warns us against this because it’s spamming. To make my point, let me give you an example. I had to optimize a website that had two different domain names pointing to the same server, so both served exactly the same content. After about a couple of months, both of these domains vanished from Google. We then changed the websites so that the content is unique for each domain name, and both of them are back in Google. I have tried it and got burnt, so I advise you not to do it.
Sometimes it does make sense to purchase multiple domains for branding purposes, and it’s not worth creating a unique website for each of them. In such cases, I suggest using a 301 (permanent) redirect from the extra domains to the main one. Do not use a 302 (temporary) redirect, as Google doesn’t like it.
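On Apache, a 301 redirect takes only a few lines of server configuration. Here is a minimal sketch (it assumes your server has mod_rewrite enabled, and `www.example.com` is a placeholder for your main domain):

```apache
# .htaccess on the extra, branding-only domain.
# Permanently (301) redirect every request to the main domain,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The `R=301` flag is what tells search engines the move is permanent, so they transfer the domain’s credit to the main site instead of indexing both.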
X Do not do keyword stuffing
Keyword stuffing is when we insert all possible keywords, or repeat keywords unnecessarily, in a tag or an attribute of a tag. Sometimes SEO practitioners will add a list of keywords to the content primarily for search engines and then hide it so that visitors can’t see it. This is not only poor netiquette; it is defined as spamming. Any text on a web page that visitors can’t see should be deleted. Also, note that search engines conduct semantic analysis, so adding keywords alone to your website is not going to make any difference.
X Do not use cloaking
Cloaking is when the server is configured to send one version of a web page to visitors and a different version to search engine robots. This is not only unethical but also a sure way of getting your website banned.
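To make the pattern easy to recognize (and avoid), here is a minimal sketch of what cloaking looks like on the server side. The `serve_page` function is hypothetical, not from any real framework:

```python
# What cloaking looks like in code -- shown only as an anti-pattern.
# The server sniffs the User-Agent header and returns keyword-stuffed
# copy to crawlers while human visitors see a different page entirely.
def serve_page(user_agent: str) -> str:
    if "Googlebot" in user_agent:
        # Crawler-only version: this is exactly what gets a site banned.
        return "<html>keywords, keywords, keywords ...</html>"
    # The page human visitors actually see.
    return "<html>The real page your visitors see</html>"
```

If the page a crawler receives differs from the page a visitor receives, that is cloaking, no matter how the switch is implemented.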
X Do not have a Flash-only or image-only website
Search engines cannot read images, and they are only just beginning to read Flash. So I wouldn’t suggest creating Flash-only websites, or websites with images only, if you want to be discovered by your potential clients through search engines.
X Do not steal text
It’s definitely possible to detect duplicate content. For instance, go to CopyScape.com and find out if someone is copying content from your website. Of course, if someone is copying content and giving you credit for it, then it’s perfectly fine. If not, you might want to contact them.
X Do not link to free for all link farms
Linking to a bad website can be seen as being associated with it, and your site can be penalized for it. Hence, avoid bad company. I usually check all the links on my clients’ websites every three to six months to ensure that we are not in a bad neighborhood.
X Do not purchase keywords in the address bar
It’s spam and is not going to help you in any way. Stay away from it.
X Do not use automatic submissions to search engines
I recommend staying away from any automatic submission or optimization tools. The manual effort is worth it; do not look for shortcuts.
X Do not exclude search engines in robots.txt
Recently, one of our clients’ websites was not getting indexed by search engines. We discovered that the robots.txt file was excluding all search engine robots from all the pages on the website. Now, why would anyone in their right mind do this? Writing a robots.txt file is a good idea to reduce the strain on search engines’ servers, but do not exclude search engines from your web pages. Here is a robots.txt validator that you might want to use before you upload this file to your server.
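You can sanity-check a robots.txt yourself before uploading it. Here is a small sketch using Python’s standard urllib.robotparser; the two rule sets and the example.com URL are placeholders, with the first showing the kind of blanket block that kept our client’s site out of the index:

```python
# Check whether a robots.txt locks crawlers out of the whole site,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# The kind of rules that accidentally block the entire site:
blocking_rules = """User-agent: *
Disallow: /
"""

# A sane alternative: allow crawling, but keep robots out of /admin/.
fixed_rules = """User-agent: *
Disallow: /admin/
"""

def can_google_crawl(rules: str, url: str = "http://example.com/") -> bool:
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch("Googlebot", url)

print(can_google_crawl(blocking_rules))  # False -- every page is off limits
print(can_google_crawl(fixed_rules))     # True  -- the site can be indexed
```

A check like this takes seconds and would have caught the problem before the site ever disappeared from the index.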
I hope this is helpful. If you have any comments regarding this checklist, please leave your comments and I will be more than happy to discuss it with you.