Apply Two Small Changes to Avoid Common SEO Pitfalls

If search engines can’t read the content on your website, they won’t show your pages in the search results. When you’ve taken the time to produce content for your website, you probably want it to be found and read by your target audience.

Since the vast majority of users rely on search engines when browsing the web, it's a good idea to make your content easy for search engines to read. In other words, your website should be 'search engine friendly': it should include the technical and structural enhancements that make it easier for search engines to read your pages. If search engines can't read your website, all other optimizations are a waste of time.

Technical changes to the website are just a small part of Search Engine Optimization (SEO), but they're a good place to start. Here are two ways to keep your website from tripping over itself:

Don’t let your pages compete against each other

If a search engine thinks that two or more pages on your website are about the same subject, it won't know which one to rank higher. This can happen when multiple pages share the same main heading or title, a problem known as duplication.

A helpful trick is to think of the title of a web page the same way you’d think of the title of a book: It should be unique while still suggesting what the page is about. If you saw two books on a shelf with the same title and author, you’d probably assume they’re the same book and just grab one or the other. Search engines do that too, so make sure that your titles and headlines are unique and descriptive of the content.
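For instance, suppose several pages on a site all carry the title "Products". Giving each page its own descriptive title tag removes the ambiguity (the page names and company name below are made up for illustration):

    <title>Products | Acme Inc.</title>
    <title>Garden Widgets | Acme Inc.</title>
    <title>Indoor Widgets | Acme Inc.</title>

The first title is fine on one page, but if it's reused everywhere, search engines can't tell the pages apart; the other two are unique and tell both users and search engines what each page contains.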

Don’t block your pages from being found

There are several techniques for preventing search engines from finding and indexing a page. These are very useful for keeping people from accidentally finding your unfinished website during development and testing, but people often forget to unblock the search engines when the page is finished. If search engines don't show your pages in the results, you lose a large percentage of the users who would otherwise find and use your website.

One way of limiting search engine access is to put a small text file named "robots.txt" at the root of your website containing rules about what search engines are allowed to read. Another way is a "robots" meta tag embedded in the page code that tells search engines whether they may read and index the page. Make sure the pages you want Google and other search engines to find are not restricted by robots.txt or a robots meta tag.
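As a sketch, a robots.txt that blocks all crawlers from an unfinished section could look like this (the "/staging/" path is only an example):

    User-agent: *
    Disallow: /staging/

And a robots meta tag that keeps a single page out of the index goes in the page's head section:

    <meta name="robots" content="noindex, nofollow">

When the page is ready to launch, remove the Disallow rule or the noindex directive so search engines can crawl and index it again.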

Kick off your search engine optimization by downloading this SEO checklist.











by Bo Vejgaard, October 3rd, 2013
