If Google or other search engines can’t read the content on your website, they won't show your pages in the search results.
When you've taken the time to produce content for your website, you probably want it to be found and read by your target audience. Since the vast majority of users rely on search engines when browsing the web, it's a good idea to help search engines read your content easily. Therefore, it's essential that your website is 'search engine friendly': built, technically and structurally, so that search engines can read it without trouble.
Technical changes to the website are just a small part of Search Engine Optimization (SEO), but they're a good place to start. If the search engines can't read your website, then all other optimizations are a waste of time.
With this and subsequent blog posts, we want to help you get started in making your website more easily readable for search engines, and thus easier for your customers to find. After reading this post, you should feel ready to start search engine optimizing your website.
So where do you begin?
Some of the most common ways that websites trip over themselves:
- The pages are competing among themselves for the search engines' attention. If a search engine thinks that two or more pages on your website are about the same subject, it won't know which to place higher in the rankings. This can happen if multiple pages have the same main heading or title (called the "title tag").
A helpful trick is to think of the title of a web page the same way you'd think of the title of a book: It should be unique while still suggesting what the page is about.
If you saw two books on a shelf with the same title and author, you'd probably assume they're the same book and just grab one or the other. Search engines do that too, so make sure that your titles and headlines are unique and descriptive of the content.
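To illustrate, here is what that looks like in the page code. The page names and titles below are made-up examples, but the principle is the same: each page's title tag should be unique and descriptive.

```html
<!-- Bad: two different pages with identical titles -->
<!-- products.html --> <title>Acme Inc.</title>
<!-- about.html -->    <title>Acme Inc.</title>

<!-- Better: each title is unique and describes what the page is about -->
<!-- products.html --> <title>Garden Tools and Supplies - Acme Inc.</title>
<!-- about.html -->    <title>About Our Company - Acme Inc.</title>
```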
- The pages are blocked from being found and indexed by search engines. There are several techniques for preventing search engines from finding a page. These are very useful for keeping people from accidentally finding your unfinished website during development and testing, but people often forget to unblock the search engines when the website is finished.
If search engines don't show your pages in the results, you lose a large percentage of users who would otherwise find and use your website.
One way of limiting search engine access is to put a small text file named "robots.txt" at the root of your website that contains rules on what search engines are allowed to read.
Another way is to have a meta-tag called "meta robots" embedded in the page code that tells search engines whether or not they can read and index the page. Make sure that the pages you want to be found by Google and other search engines are not restricted by robots.txt and meta robots.
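As a sketch, here is what the two blocking mechanisms can look like. The first robots.txt example blocks all crawlers from the entire site, a common leftover from development; the second allows crawling everywhere except a made-up /drafts/ section:

```text
# Blocks ALL crawlers from the ENTIRE site
User-agent: *
Disallow: /

# Allows crawling of everything except a hypothetical /drafts/ section
User-agent: *
Disallow: /drafts/
```

The meta robots tag goes in the page's head section; "noindex, nofollow" blocks indexing, while omitting the tag (or using "index, follow") allows it:

```html
<!-- Blocks this page from being indexed by search engines -->
<meta name="robots" content="noindex, nofollow">

<!-- Allows indexing (the same as having no robots meta tag at all) -->
<meta name="robots" content="index, follow">
```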
If you take the plunge into SEO, tell us how it goes in the comments below!
If you don't quite know how to get started, you can try our Siteimprove SEO tool, which, among other things, highlights the places where your website competes with itself.
You can also write to me, Bo Vejgaard, and I'd be happy to answer your questions.