The essential elements of an audit

1- Blocking factors

a- Access to the site:
   – Analysis of access constraints:
         • Language detection issues (see the hreflang sketch after this list)
         • Multilingual sites
   – Access to the site after IP address detection
b- Links and content:
   – Content and links in JavaScript
   – Content and links in a menu
   – Content and links in Flash
   – Duplicate (similar) content
   – Quality of URLs
c- Server inaccessibility
d- 404 errors
e- Identical <title> and <meta> description tags across pages
f- Heavy use of link exchanges or purchased links
g- Page load time
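
For the language-detection and multilingual issues listed under a-, a common remedy is to expose each language version to crawlers explicitly instead of redirecting visitors according to the detected language or IP address. A minimal sketch, assuming hypothetical English and French versions of a page:

    <!-- In the <head> of every language version: declare all the alternates -->
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
    <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />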

2- Site structure and content optimization

This step consists of defining the theme of the website and offering useful, original content. Pages should therefore be structured by classifying subjects into categories and subcategories, and by creating unique, easy-to-read themed pages with semantic relations and relevant keywords. Links within the content should also be planned. What must be avoided is content that is invisible to users but visible to search engines.
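
As a rough sketch of such a structure (the category, file names and keywords below are purely illustrative), a single themed page could look like this:

    <!-- One unique themed page, filed under its category and subcategory -->
    <!-- Illustrative URL: /guitars/acoustic/choosing-an-acoustic-guitar.html -->
    <h1>Choosing an acoustic guitar</h1>
    <p>Short, readable paragraphs built around the page's keywords ...</p>
    <!-- Links planned inside the content, pointing to related themed pages -->
    <p>See also our guide to <a href="/guitars/acoustic/strings.html">acoustic guitar strings</a>.</p>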

3- Choosing keywords

Note that the use of keywords is tricky: in some cases, wordplay may or may not help the visibility of the website.
The following devices should be watched so that keyword optimization is not missed: polysemy (a word with several meanings); homophony (phonetically identical words); paronymy (substituting one word for a phonetically similar one); paronymy between two signifiers; parasynonymy (near-synonymy); antithesis (combining two antonyms); alliteration and assonance; plays mixing spoken and written forms.

4- Choice of page titles

Each page needs a unique, relevant title: brief but descriptive, clearly reflecting the theme of the page's content. The <title> assigned to each page must be unique, and, as always, keywords should be included in every title.
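
A minimal illustration, with a hypothetical page and shop name:

    <!-- One short, descriptive <title> per page, carrying that page's keywords -->
    <title>Acoustic guitar strings - buying guide | Example Shop</title>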

5- Linking: facilitating internal navigation

Internal linking allows the webmaster to highlight the different contents of the site and lets visitors find the desired content easily. A map of the various branches from the site root to the different pages therefore needs to be drawn up. The use of a breadcrumb trail is advised, and this is where two sitemaps are created:
– a hierarchical list of site pages for users
– an XML Sitemap file for search engines
At the same time, the menus must also be considered, since they carry links and point to the important pages of the site.
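
By way of illustration (all URLs are hypothetical), the two sitemaps could look as follows: a plain hierarchical HTML list for users, and a sitemap.xml file for search engines:

    <!-- Hierarchical list of site pages for users -->
    <ul>
      <li><a href="/guitars/">Guitars</a>
        <ul>
          <li><a href="/guitars/acoustic/">Acoustic guitars</a></li>
        </ul>
      </li>
    </ul>

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml submitted to search engines -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/guitars/acoustic/</loc>
      </url>
    </urlset>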

6- Using tags

– The <meta> « description » allows search engines to provide a summary of the page's content; it must be unique and relevant to the page, and it should of course contain the keywords.
– The <h1>, <h2>, … tags are used to display the headings of sections in your content. Headings should be explicit, help users identify the content, be short, and focus on a single keyword.
– Keywords in the content are placed in <b> or <strong> tags.
– It is important to review the use of the alternative tags such as noframes and noscript, and of the embed tag, because their use can sometimes interfere with or distort indexing.
– W3C code validation
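
A brief sketch of these tags on a hypothetical page:

    <!-- Unique description written for this page, containing its keywords -->
    <meta name="description" content="Buying guide for acoustic guitar strings: materials, gauges and prices." />
    <!-- Short, explicit headings, each built around a single keyword -->
    <h1>Acoustic guitar strings</h1>
    <h2>Choosing the right gauge</h2>
    <!-- Keywords highlighted in the body -->
    <p>Lighter <strong>guitar strings</strong> are easier on beginners' fingers.</p>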

7- Use of captions and images

Inserting images requires captions via the « alt » attribute; for search engines these captions matter more than the images themselves. Keywords should be included in the image title and in the « alt » attribute to help with SEO. Use the best-known formats (JPEG, GIF, PNG, …), and the file name of the image should reflect the content in which it appears.
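
For example (the file name and keywords are illustrative):

    <!-- Descriptive file name, plus « alt » and title attributes carrying the keywords -->
    <img src="/images/acoustic-guitar-strings.jpg"
         alt="Acoustic guitar strings, bronze wound"
         title="Acoustic guitar strings" />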

8- Improved URL structure

Use words that summarize the content so that the URL itself conveys information, and create a simple directory structure. A single version of the URL should lead to each page or document; duplicate content is resolved either with a 301 redirect or with the canonical URL (« rel=canonical »).
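
A sketch of the canonical declaration, assuming two hypothetical URLs that serve the same content (the 301 redirect, by contrast, is configured on the web server):

    <!-- Placed in the <head> of the duplicate URL, e.g. /guitars/acoustic/?sort=price,
         pointing search engines to the single preferred version -->
    <link rel="canonical" href="https://www.example.com/guitars/acoustic/" />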

9- Quality of anchor texts

Anchor texts are the texts placed between the <a href="…"> and </a> tags, where the user clicks:
– use descriptive text that gives a clear idea of the content of the target page
– keep the text short
– make links visually distinct from normal text
– plan internal and external links so as to facilitate site navigation
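
A brief contrast between a descriptive anchor and one to avoid (the target URL is hypothetical):

    <!-- Descriptive: tells users and engines what the target page is about -->
    <a href="/guitars/acoustic/strings.html">acoustic guitar strings guide</a>
    <!-- To avoid: generic anchor text that says nothing about the target -->
    <a href="/guitars/acoustic/strings.html">click here</a>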

10- Robots.txt

This file is very important. Placed in the root directory of the site, it controls search engine access to certain content through its directives (notably « Disallow »). Whether individual links are followed or not is handled separately, with the « nofollow » value of the robots meta tag or of a link's rel attribute.
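
A minimal robots.txt sketch (the blocked directories are illustrative), saved at the root of the site:

    # Keep crawlers out of private or low-value sections, allow everything else
    User-agent: *
    Disallow: /admin/
    Disallow: /search-results/
    # Optionally point crawlers to the XML Sitemap
    Sitemap: https://www.example.com/sitemap.xml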
