We often get questions from people wondering why their site isn't ranking or why it isn't indexed by search engines. Recently, I came across several sites with major errors that could easily have been fixed, if only the owners had known where to look. While some SEO mistakes are quite complex, here are some of the often overlooked "silly" mistakes. So take a look at these SEO mistakes – and how you can avoid making them yourself.

SEO Failure #1: Robots.txt Problems

The robots.txt file has a lot of power.
It tells search engine crawlers what to exclude from their indexes. In the past, I've seen sites forget to remove a single line of code from this file after a site redesign and drop their entire site from search results.
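If you want to see how little it takes, here is a minimal sketch using Python's built-in urllib.robotparser; the robots.txt content and the example.com URLs are made up for illustration:

from urllib.robotparser import RobotFileParser

# A leftover staging-era robots.txt (hypothetical content). The single
# "Disallow: /" line tells every crawler to stay out of the whole site.
leftover_robots = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(leftover_robots.splitlines())

# Ask whether Googlebot may fetch the homepage and a deeper page.
for url in ("https://example.com/", "https://example.com/roses/red-bouquet"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "allowed" if allowed else "blocked")

Both URLs come back as blocked: one forgotten line is enough to keep an entire site out of search results.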
So when a flower site had a problem, I started with one of the first checks I always do on a site – looking at the robots.txt file. I wanted to know if the site's robots.txt file was preventing search engines from indexing their content. But instead of the expected text file, I saw a page offering to deliver flowers to Robots.Txt.

SEO failure on a flower site

The site had no robots.txt, which is the first thing a bot looks for when crawling a site. That was their first mistake. But treating the filename as a delivery destination… really?

SEO Failure #2: Autogeneration Goes Wild

Second, the site automatically generated nonsensical content. It would probably offer to deliver flowers to Santa Claus or whatever other text ended up in the URL.