On-page optimization refers to activities you perform on your own website to help it rank better in search engines (SEs). Correspondingly, off-page optimization refers to activities you perform outside your site, i.e. on other sites and offline.
This article deals strictly with the first part – on-page optimization.
Meta tags are text elements inserted into the HTML header of your site. There are three main ones: the title tag, the meta description and the meta keywords.
Optimise them as follows:
Titles. For each page of the site, put the main keyword in the title tag. You may also add a few secondary or sitewide keywords, as well as some descriptive words (so that the title reads well). But remember – the more words you use in the title, the more diluted the “weight” of each word becomes, and that weight is one of the most important ranking factors.
So compose your title wisely and use as few words as necessary to target the most important key phrases of the page.
Meta keywords. Write your meta keywords using the keywords most relevant to the page content (up to 50 words), separated by commas with no spaces.
Meta description. Write your meta description using the page's main keyword and a few secondary keywords, placing the main keyword closer to the beginning. The important thing here is that the meta description should be concise (15–20 words) and informative: it should reflect the real content of the page and entice a user who finds your site via a search engine to click on the link and visit it.
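Putting the three tags together, a page's HTML header might look like this (a minimal sketch – the “handmade leather wallets” key phrase and shop name are hypothetical placeholders):

```html
<head>
  <title>Handmade Leather Wallets – Custom Mens Wallets | Example Shop</title>
  <meta name="description" content="Handmade leather wallets crafted to order. Browse custom mens wallets in full-grain leather.">
  <meta name="keywords" content="handmade leather wallets,custom mens wallets,leather wallet,full-grain leather">
</head>
```

Note that the main key phrase leads both the title and the description, and the keywords list is comma-separated with no spaces.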
Observe the hierarchy of headings:
Use your whole title text, or just the main keyword, in the H1 heading at the very top of the page. Use major secondary keywords in H2 subheadings and less important secondary terms in H3 and H4 subheadings. Use the latter only if they benefit the readability of your content; otherwise restrict yourself to just H1 and H2.
If a keyword is used in several headings, vary it with synonyms. Search engines love that.
Employ the keywords from your title within the first couple of sentences immediately after the H1 heading. Use those and the secondary keywords throughout the text, but do it naturally, without over-stuffing, so that the text reads well. Also, use the main key phrases at the very end of the page, as close to the closing </body> tag as possible. Again, as with the headings, using synonyms or related phrases to back up your targeted keywords is highly beneficial.
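The heading hierarchy and keyword placement described above can be sketched as follows (the key phrases are hypothetical placeholders):

```html
<body>
  <h1>Handmade Leather Wallets</h1>
  <!-- main keyword appears in the opening sentences right after the H1 -->
  <p>Our handmade leather wallets are cut and stitched to order…</p>

  <h2>Custom Mens Wallets</h2>
  <p>…</p>

  <h3>Full-Grain Leather Billfolds</h3>
  <p>…</p>

  <!-- main key phrase repeated near the closing body tag -->
  <p>Order your handmade leather wallet today.</p>
</body>
```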
Place keywords prominently within paragraphs matching their headings.
Your content should be strictly relevant to the title of the page, as well as informative and captivating. The more time users spend on your site and the lower the bounce rate, the better your chances of ranking well in the SERPs for your chosen keywords.
Make Use of Optimized Images
It also helps rankings to place an image (JPG or GIF) close to the top of the page. Both the filename and the ALT attribute of the image should contain your main key phrase, plus one or two auxiliary words (only if necessary) – e.g., main-keyword.jpg with ALT text “main keyword auxiliary word(s)”.
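For example (the filename and ALT text are hypothetical):

```html
<img src="handmade-leather-wallets.jpg"
     alt="handmade leather wallets in full-grain leather" width="600" height="400">
```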
Navigation and Architecture
It matters to search engines how visitors travel through your site. In essence, every page, section and principal article should be easy to find. This is great both for visitors and for spiders, and thus is appreciated by SEs.
So add a sitemap and link to it from every page. In addition, structure your home page and your important landing pages in an SEO siloing format, i.e. the content of the page should work like a large sitemap, including links to all important parts of the site – section pages, categories, subcategories, and the most important articles.
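A siloed home page might present its links roughly like this (the section and category names are hypothetical):

```html
<ul>
  <li><a href="/leather-wallets/">Leather Wallets</a>
    <ul>
      <li><a href="/leather-wallets/mens-wallets/">Mens Wallets</a></li>
      <li><a href="/leather-wallets/bifold-wallet-guide.html">Bifold Wallet Guide</a></li>
    </ul>
  </li>
  <li><a href="/leather-belts/">Leather Belts</a></li>
</ul>
```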
Optimize File Names
Rather than giving abstract names to the HTML files that correspond to site pages, give them meaningful names like “this-page-main-key-phrase.html” or “this-page-title.html”, where the words in quotes are substituted with either the main key phrase or the title of that page. Use hyphens (not underscores) to connect the words.
Optimize Directories Structure
For every section and category page, make a separate directory and name it using the main key phrase for that page.
Put all dependent files into the corresponding directory and create a local index page with links to all internal pages of that directory. Every page, in turn, should link back to its directory's index page.
There should also be a main index page (the site home page) listing links to the index pages of all directories, and each directory's index page should link back to the main index page of the site.
Try to limit the number of directory levels to no more than two (e.g., yoursite.com/section/category/article1.html).
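Such a structure might look like this on disk (the directory and file names are hypothetical):

```
yoursite.com/
├── index.html                       (home page: links to every directory index)
├── leather-wallets/
│   ├── index.html                   (section index: links to its pages and back home)
│   └── mens-wallets/
│       ├── index.html               (category index)
│       └── bifold-wallet-guide.html
└── leather-belts/
    └── index.html
```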
Optimize Internal Linking
For every site page, all internal links to directories and other pages should use the title of the target page, its keywords, or the name of the article as their anchor text.
This facilitates the flow of both link weight and PageRank through the site and thus helps internal pages rank better for their targeted keywords.
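For example (the URLs and key phrases are hypothetical):

```html
<!-- weak, generic anchor text: -->
<a href="/leather-wallets/mens-wallets/">click here</a>

<!-- descriptive anchor text matching the target page's keywords: -->
<a href="/leather-wallets/mens-wallets/">mens leather wallets</a>
```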
Validate and Optimize Your Site’s Code
Put a valid Document Type Declaration (DTD) at the very top of every page. This is important, as the “DOCTYPE” tells an HTML validator which HTML version to use when checking the document’s syntax. The W3C maintains a recommended list of DTDs you can choose from.
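For instance, a page written in HTML 4.01 Strict would start with:

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
```

(For modern HTML5 pages the declaration is simply `<!DOCTYPE html>`.)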
Sites are prone to HTML errors and broken links, but spiders hate getting trapped or led nowhere. Dead links and errors indicate poor site maintenance or an out-of-date site, which reflects negatively on its ranking. So validate your HTML code using, e.g., the W3C Markup Validation Service (or similar), and fix all errors and broken links it finds.
When a page no longer exists, make sure a proper 404 error is returned. Do not substitute a 301 or 302 redirect for the 404: those, in fact, tell the spiders that the old page has not been removed but rather has a copy with duplicate content. Use Rex Swain’s HTTP Viewer to check that a proper 404 status is returned. Use 301 or 302 redirects only when the page still exists but has been moved (302 for a temporary move, 301 for a permanent one).
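On an Apache server, for example, error pages and redirects can be configured in .htaccess (the file paths here are hypothetical):

```apache
# Serve a custom error page while still returning the 404 status code
ErrorDocument 404 /404.html

# Page moved permanently -> 301; moved temporarily -> 302
Redirect 301 /old-wallet-page.html /leather-wallets/mens-wallets/
Redirect 302 /summer-sale.html /promotions/current.html
```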
Avoid URLs that have too many parameters, include session IDs, or contain other parameters that cause the same page to be served under different URLs. Best practice is to keep your URLs relatively short (fewer than 255 characters) and user-friendly by making them meaningful and relevant to the page content.
Add a robots.txt File
robots.txt is a text file placed in the root directory of your website that tells search engines not to index certain directories and pages of your site. In particular, if you have different URLs pointing to pages with identical or very similar content, a proper robots.txt can help you avoid being penalized for duplicate content.
Even if your site has no information you want to keep out of the index, I still highly recommend having at least the simplest robots.txt; otherwise, every time a spider crawls your site and requests the missing file, the server logs a 404 error, wasting your disk space.
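The simplest robots.txt, which allows crawling of the whole site, is just:

```
User-agent: *
Disallow:
```

To keep selected directories out of the index (the paths below are hypothetical), list them instead:

```
User-agent: *
Disallow: /printable/
Disallow: /search-results/
```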
Elements to Avoid
- Do not use frames, as frames effectively display multiple pages within a single page. If you want to link to a particular article inside a frame, you will not be able to, because the frame does not have a unique URL. For that reason, frames prevent your site from ranking well in search engines.
- Minimize the use of Flash, especially for internal navigation links and for presenting the majority of your text.