General Guide to SEO

Basic Understanding

Most people understand Search Engine Optimization (SEO) as a way to increase the traffic to their site. A more accurate understanding of SEO is the process of optimizing your content so that it is easier to find by potential visitors who are looking for the information you have published or the service you provide. In other words, it is better to think of increased traffic as the result of well-performed content/website optimization, not as SEO itself.

When You Build Your Website

It is important to understand that nowadays SEO can genuinely be called a science, just like math or physics, for example. Since it would be almost impossible to cover every point that makes a website well optimized for search engines, we will provide some basic guidelines on how to improve your website, along with certain things that should be avoided during the website creation process.


Content and Title

Whatever design you choose for your website, the most important element will be the content itself. Providing high-quality, useful, complete and accurate information will make your website popular, and other webmasters will link to it and refer visitors to it, which is one of the key factors in site optimization.


Most crawlers have very sophisticated algorithms and can distinguish natural links from unnatural ones. Natural links to your website develop when other webmasters include links to your content or products in their articles or comments, rather than simply adding you to a blogroll, for example. If a page that a search engine considers “important” contains a natural referral link to your content, your page will most likely also be crawled and marked as “important”, provided your content is relevant to the topic of the original page.
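
For illustration, here is a minimal sketch (the URLs and anchor text are hypothetical) of the difference between a natural in-content link and a plain blogroll entry:

<!-- Natural link: placed inside relevant article text -->
<p>For step-by-step instructions, see this
<a href="http://www.example.com/how-to-install-wordpress">WordPress installation tutorial</a>.</p>

<!-- Blogroll entry: a bare sidebar link with no surrounding context -->
<li><a href="http://www.example.com/">example.com</a></li>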


Another important subject to consider while creating your content is the title of your pages and articles. Please note that we are not referring to your URLs and links here. You should carefully choose the titles of your posts, articles, pages and categories in order to make them more search engine friendly. When you are creating content on a subject, think about what words a potential visitor would type into a search engine when looking for this information, and try to include them in your title.


For example, if you are writing a tutorial about how to install WordPress, it would not be suitable to name it “Configuration, adjustment and setup of WP”. People who are looking for this information will use more common words to describe what they need, such as “How to install WordPress”. Think about the phrases users would use to find your pages and include them on your site – this is certainly a good SEO practice and will improve your website's visibility in the search engines.
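
As a minimal sketch (the title text is hypothetical), such a title belongs in the <title> tag of the page's HTML head:

<head>
  <!-- A descriptive title built from the words visitors actually search for -->
  <title>How to Install WordPress - A Step-by-Step Tutorial</title>
</head>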

In addition to your titles and keywords, another important part of your page is the meta tags. They are read by the search engines but are not displayed as part of your web page design. For more information regarding meta tags and meta tag optimization you may refer to this URL.
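
As a minimal sketch (the description and keyword values are hypothetical), meta tags are placed in the page's <head> section:

<head>
  <!-- Often shown by search engines as the snippet under your page title -->
  <meta name="description" content="A step-by-step tutorial on how to install WordPress.">
  <!-- Keywords describing the page content -->
  <meta name="keywords" content="WordPress, install, tutorial, blog">
</head>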

Issues with Dynamic Pages

People like eye candy; bots do not. Indeed, people do like colorful websites, Flash, AJAX, JavaScript and so on. However, you should know that content built with these technologies is quite difficult for crawlers to process because it is not plain text. Furthermore, most bots can only be referred to another page by a static text link, which means you should make sure that every page on your website is accessible from at least one plain text link on another page. This is a very good practice and ensures that all pages on your website will be crawled by the search engine bots. Generally, the best way to achieve this is to create a SiteMap of your website which can be easily accessed from a text link on your home page, as sketched below.
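
A minimal sketch of such a SiteMap page (the file name and page URLs are hypothetical), containing only plain text links that any bot can follow:

<!-- sitemap.html: a plain list of text links to every page on the site -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/articles/how-to-install-wordpress.html">How to Install WordPress</a></li>
</ul>

The home page then needs only a single text link pointing to it, for example <a href="/sitemap.html">SiteMap</a>.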


The easiest way to imagine how a search engine bot actually “sees” your website is to think of it as a text-only browser. If you are a Unix/Linux user, you can simply use a text browser such as Lynx (http://www.google.com/search?q=lynx+browser) from your shell. However, if you are a Windows user, you will need a text-only browser or a program that can render your website as text in order to get a general idea of how the bot “sees” your pages; one such tool is the Lynx Viewer web proxy at delorie.com.
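
For example, from a Unix/Linux shell, Lynx can dump any page as plain text (the domain is hypothetical):

lynx -dump http://www.yourdomain.com

Anything missing from this output – images, Flash, script-generated content – is likely invisible to the bot as well.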


Basically, this website works as a web proxy but provides only the text output. In order to use it, however, you should create a simple file called delorie.htm under your public_html directory. Be advised that it can be just an empty file – it is used only to verify that you are the owner of the website and not a bot abusing the proxy service.
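
Assuming you have shell access and a standard public_html document root, the file can be created with a single command:

touch ~/public_html/delorie.htm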


As soon as you have created the file, type your domain name and click on the “View Page” button. You will be shown a page with the text content of your website. Any information that cannot be seen via this proxy will most probably not be crawled by the search engine bots – this includes any Flash, images or other graphical/dynamic content.


It is important to mention, however, that some search engine bots recognize graphical and dynamic content such as Flash, AJAX, etc.


Possible Workaround

Still, it is hard to avoid dynamic pages when creating a modern-looking website, so you will need a workaround to ensure that the information is properly crawled. A possible solution is to create a text-only duplicate of your dynamic page that is readable by the search engine bot. This page can be included in your SiteMap, and in this way all of your information will be read by the bot, as sketched below.
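
A minimal sketch of the idea (all file names are hypothetical): keep the dynamic page and a text-only twin side by side, and link the text-only version from your SiteMap:

<!-- gallery.html: the dynamic, Flash/AJAX-based page shown to visitors -->
<!-- gallery-text.html: a plain-text duplicate of the same information -->

<!-- In sitemap.html, link the crawlable version: -->
<li><a href="/gallery-text.html">Gallery (text version)</a></li>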


Specifically for Google, you can also disallow the dynamic page in your robots.txt file in order to make sure that the Google bot will crawl the text-only page instead of the dynamic one. This can be done by placing the following lines in your robots.txt file:

# The rules below apply to Google's crawler only
User-agent: Googlebot
Disallow: /the-name-of-your-page
Disallow: /myExample.html

