Search Engine Optimization (SEO): Full Guide

This guide to SEO is updated regularly by Jean-François Pillou, founder of CommentCaMarche, with examples of SEO optimizations implemented on the sites of the Figaro CCM Benchmark group (robots.txt, optimization for the Panda algorithm, etc.). It details all the conditions necessary for good SEO. Feel free to follow us on Twitter at @jeffpillou for a complete SEO news watch.

The term "SEO" (Search Engine Optimization) refers to the set of techniques for improving the visibility of a website, i.e. the positioning (ranking) of a site's pages high in the results pages for certain keywords.

The difficulty of the exercise lies not so much in promoting the site to search engines as in structuring the content and internal linking so as to provide a good user experience and move the pages up the results for previously chosen keywords.

Indeed, the majority of Internet users use search engines to find information and, to do so, query a search engine using keywords. It is therefore essential, before anything else, to focus on the content you want to offer in order to best meet users' expectations, and to identify the keywords they are likely to type!


In the user's shoes

SEO is above all a discipline aimed at producing pages that appeal to users. Beyond the technical aspects, it is particularly important to show empathy and put yourself in the user's place to understand how they navigate, what they look at, how they search, and what will make your page please them more than those of competing sites.

It is therefore important, first of all, to review a few definitions in order to understand how a search engine works.


SERP

The term SERP (Search Engine Results Page) refers to the search results as displayed after a query. It is essential to understand that, for the same search engine, the results may vary from one user to another: first depending on the settings chosen by the user (language, number of results per page), but also depending on the location (country, region) from which the query is made, the device used (mobile, tablet, desktop), and sometimes on the queries previously made by the user; finally, search engines regularly run A/B tests to try out different displays. As such, it is not uncommon for a site to disappear from the SERP for a given query for 24 to 48 hours and then reappear. This means you should wait at least 72 hours before worrying.

Just because you see yourself in first place does not mean you necessarily are. To get a result as close as possible to what most users see, it is advisable to disable query history or to browse in your browser's incognito mode.

Pages ranked in first position obviously get more visits than those in second position, and so on. The same goes for pages on the first results page compared with pages on the second. Thus, if a page is in 11th position (hence on the second page), it is well worth trying to optimize it to move it onto the first page and obtain a significant gain in unique visitors.

Keywords

SEO only makes sense with respect to keywords, that is, the words (queries) used by visitors to search. The first job is therefore to determine the keywords on which you want to position your site's pages. The keywords you have in mind do not always correspond to the keywords used by visitors, because visitors tend to use the shortest possible terms or to make spelling mistakes. There are tools for comparing the search volume of one keyword against another and for getting suggestions:
  • http://www.google.com/trends/?hl=fr


Finally, there are sites for finding the keywords of competing sites, such as SEMRush.com.

White hat / black hat SEO

When it comes to SEO, there are generally two schools of thought:
  • White hat SEO, referring to SEOs who strictly comply with the search engines' guidelines for webmasters, in the hope of obtaining sustainable rankings over time by playing by the rules of the game;
  • Black hat SEO, referring to SEOs who adopt techniques contrary to the search engines' guidelines in order to obtain a quick gain on pages with high monetization potential, but with a high risk of being downgraded. Black hat SEOs play cat and mouse with the search engines, which regularly adapt their algorithms to identify and downgrade sites that do not respect the guidelines. Techniques such as cloaking or content spinning are therefore considered dangerous and not recommended.


Gray hat SEO refers, by extension, to so-called borderline techniques, that is, techniques whose use is controversial in SEO communities.

Is my website listed?

To check whether your site is indexed by a search engine, simply type the following into its search field:
site:votresite.com

If your site is in the index, the engine should display a selection of pages it knows from the site, along with a number of results representing approximately the number of pages of your site it knows.

Submit website

Before talking about SEO, the first step is to make sure that the major search engines, especially Google (since it is the most used), identify the site and come to crawl it regularly. As such, "getting your site referenced" means nothing in particular, except that the pages you want to see ranked are present in the index of a search engine. To achieve this, you can:
  • either obtain links from sites that are themselves regularly indexed by the search engines, so that they discover the existence of yours;
  • or declare your site (or the pages of your site) via the interface of the major search engines;
  • or submit a file (called a sitemap) directly through the interface of the major search engines, to help them find the new pages to index.

Adding the site to search engines

To this end, there are online forms for submitting your website. Feel free to set up a web analytics solution such as Google Analytics or AT Internet (www.atinternet.com) to learn about the origin of your visitors, the pages visited, and much other useful information.

Google

Google is the leading search engine in France, with around 90% market share. The page for submitting a URL to Google is: https://www.google.com/webmasters/tools/submit-url.

Submission is completely free of charge, but indexing takes some time, which varies considerably depending on the period.

Bing

Getting indexed in Bing involves using its webmaster tools. Simply create an account and follow the procedure on the following page: http://www.bing.com/toolbox/webmaster

Yahoo

Yahoo now uses Bing as its search engine. The following page explains how to submit a new URL: https://fr.aide.yahoo.com/kb/SLN2217.html

Voila.fr

Voila is the engine used by the Orange.fr service. Although its market share is much smaller than Google's and Bing's, it is still worth being listed there. The address to register on Voila is: http://referencement.ke.voila.fr/index.php

Exalead

Exalead is an alternative French search engine. To submit a site to Exalead, simply use the following page: http://www.exalead.com/search/web/submit/

Free or paid?

SEO is not necessarily paid: search engines index the content of sites free of charge, and it is not possible to pay to rank one's site better.

The more other sites link to yours, and the better the quality of those links (that is, the better the reputation of the linking sites), the better your site will rank in the search engines for terms corresponding to those of your site. However, the methods to use are numerous and sometimes complex, and a simple mistake can have significant repercussions, which is why many companies call on SEO professionals to advise and assist them.

It is possible, however, to buy keywords on the search engines; this is then advertising space (called sponsored links), located around the so-called natural search results. This is known as SEM (Search Engine Marketing), as opposed to SEO (Search Engine Optimization).

On the other hand, since SEO is a broad concept requiring a great deal of experience and hiding many pitfalls, it may be advisable to call on an agency specializing in SEO, which will advise and assist you.

Specialized agencies can help you improve your ranking in search results. They can sometimes also offer to produce or update the site's content. Beware, however, of offers of the type "SEO in over 200 search engines", "listing in 1000+ directories", "guaranteed SEO" or "first place in a few days". SEO must remain natural, that is, it should be gradual.

Beware of automatic submission software. Some search engines reject such submissions outright (in most cases you must fill in the submission form yourself and leave your email address). In extreme cases, using such software to massively submit the pages of your site to many directories can be counterproductive and lead some engines to ban your site.

Optimizing SEO

For search engines, the unit of reference is the web page, so when designing the website you need to think in terms of structuring the pages, taking the advice below into account for each page.

In fact, most webmasters think of properly indexing the home page of their site but neglect the other pages, whereas it is usually the other pages that contain the most interesting content. It is therefore imperative to choose a title, a URL, and meta tags (etc.) suited to each of the site's pages.

There are a few site design techniques that make the SEO of a site's pages more effective:

  • original and interesting content,
  • a well-chosen title,
  • a suitable URL,
  • a page body readable by the engines,
  • META tags that accurately describe the page's content,
  • well-thought-out links,
  • ALT attributes describing the content of images.

Contents of the web page

Search engines seek above all to provide a quality service to their users by giving them the most relevant results for their search; so even before thinking about improving SEO, it is essential to focus on creating substantial and original content.

Original content does not mean content offered by no other site; that would be mission impossible. However, it is possible to treat a subject and give it added value by deepening certain points, organizing it in an original way, or connecting different pieces of information. Social networks are thus an excellent vehicle for promoting content and for identifying the interest readers take in your content.

Furthermore, still with a view to providing the best content to visitors, search engines attach importance to how up to date the information is. Updating the site regularly therefore helps increase the score the engine grants the site, or at the very least the frequency with which the crawler visits it.

Page title

The title is the element of choice for concisely describing the content of the page; it is notably the first element the visitor reads on the search engine's results page, so it is essential to give it particular attention. The title of a web page is declared in the page header, between the <title> and </title> tags.

The title must describe as precisely as possible, in 6 or 7 words at most, the content of the web page, and its total length should ideally not exceed sixty characters. Finally, it should ideally be unique within the site, so that the page is not considered duplicate content.

The title is all the more important because it is the information that appears in the search results, in users' bookmarks, in the browser's title bar and tabs, and in the history.

Given that European users read from left to right, it is advisable to place the words carrying the most meaning for the page on the left. Make sure in particular that each page of your site has a unique title, including pages with pagination; in the latter case, you can, for example, have the pages beyond page 1 include the page number in the title.
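For illustration, here is a minimal sketch of a unique, paginated title (the page subject and site name are hypothetical):

    <head>
      <title>Cleaning a string in PHP - Page 2 - ExampleSite</title>
    </head>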

URL of the page

Some search engines attach the utmost importance to the keywords present in the URL, especially the keywords present in the domain name. It is therefore advisable to give each file on the site a suitable name containing one or two keywords, rather than names like page1.html, page2.html, etc.

Kioskea uses a technique called URL rewriting to produce readable URLs containing the keywords of the page title. The dash is used as the separator: http://www.commentcamarche.net/faq/20265-php-nettoyer-une-chaine-de-caracteres
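URL rewriting of this kind is typically done with a web server rewrite rule. A minimal sketch, assuming Apache with mod_rewrite enabled and a hypothetical faq.php script taking an id parameter:

    # .htaccess: map /faq/20265-php-nettoyer-une-chaine-de-caracteres
    # to faq.php?id=20265 (the keywords only serve SEO and readability)
    RewriteEngine On
    RewriteRule ^faq/([0-9]+)-[a-z0-9-]+$ faq.php?id=$1 [L,QSA]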

Body of the page

To make the most of the content of each page, the page must be transparent (as opposed to opaque content such as Flash), that is, it must contain as much text as possible, indexable by the engines. The content of the page must above all be quality content addressed to visitors, but it can be improved by making sure that various keywords are present.

Frames are strongly discouraged, as they sometimes prevent the site from being indexed in good conditions.

META Tags

META tags are non-displayed tags inserted at the beginning of the HTML document to describe it in detail. Given the misuse of meta tags observed on a large number of websites, the engines now make less use of this information when indexing pages. The "keywords" meta tag has been officially abandoned by Google.

META description

The description meta tag is used to add a description describing the page, without displaying it to visitors (for example, terms in the plural, or even with deliberate spelling mistakes). This is usually the description (or part of it) that will appear in the SERP. It is advisable to use HTML encoding for accented characters and not to exceed about twenty keywords.
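For example (the description text here is purely illustrative):

    <meta name="description" content="Complete guide to SEO: crawling, indexing, page titles, META tags, netlinking and penalties." />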

META robots

The robots meta tag is particularly important, because it describes the page's behavior with regard to robots, in particular whether the page should be indexed or not and whether the robot is allowed to follow its links.

By default, the absence of a robots tag indicates that the robot may index the page and follow the links it contains.

The robots tag can have the following values:

  • index, follow: this declaration amounts to not putting a robots tag, since it is the default behavior;
  • noindex, follow: the robot must not index the page (but it may return regularly to check whether there are new links);
  • index, nofollow: the robot must not follow the links of the page (but it may index the page);
  • noindex, nofollow: the robot must no longer index the page, nor follow its links, which will result in a drastic drop in the frequency of robot visits to the page.


Here is an example of a robots tag:

    <meta name="robots" content="noindex,nofollow" />
Also note the existence of the following values, which can be combined with the previous ones:
  • noarchive: the robot must not offer users the cached version (notably the Google cache);
  • noodp: the robot must not use the description from DMOZ (Open Directory Project) by default.


It is possible to specifically target Google's crawler (Googlebot) by replacing the name robots with Googlebot (it is however advisable to use the standard tag in order to remain generic):

    <meta name="googlebot" content="noindex,nofollow" />
If a large number of pages must not be indexed by the search engines, it is better to block them via robots.txt, because the crawlers then waste no time crawling those pages and can focus all their energy on the useful pages.

On Kioskea, forum questions that have not yet received an answer are excluded from the search engines, but the engines can continue to crawl the pages to follow their links:

    <meta name="robots" content="noindex,follow" />

After a month, if the question still has no answer, the meta tag becomes the following, so that the engine forgets it:

    <meta name="robots" content="noindex,nofollow" />
Internal links

In order to give maximum visibility to each of your pages, it is advisable to establish internal links between your pages, to allow the crawlers to browse your entire tree structure. Thus it may be useful to create a page presenting the architecture of your site and containing pointers to each of your pages, as sketched below.
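A minimal sketch of such a page, as a simple HTML list (the paths are hypothetical):

    <ul>
      <li><a href="/guide-seo/">SEO guide</a></li>
      <li><a href="/faq/">FAQ</a></li>
      <li><a href="/forum/">Forum</a></li>
    </ul>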

By extension, this means that the site's navigation (main menu) must be designed to effectively give access to the pages with high SEO potential.

Netlinking

The term netlinking refers to obtaining external links pointing to your website, on the one hand because it increases the traffic and reputation of the site, and on the other hand because search engines take into account the number and quality of the links pointing to a site to assess its level of relevance (in Google's case, with its index called PageRank).

Nofollow links

By default, links are followed by search engines (in the absence of a robots meta tag with nofollow, or of a robots.txt file preventing the page from being indexed). However, it is possible to tell search engines not to follow certain links by using the nofollow attribute.

This is notably recommended if:
  • the link is the subject of a commercial agreement (paid links);
  • the link is added by untrusted users in contribution areas of the site (comments, reviews, forums, etc.).


On Kioskea, links posted by anonymous users, or by users who have not actively participated in the community (support forums), are nofollow links. Links posted by active users and contributors are normal links (called "dofollow").
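A link posted in such an area would thus look like this (the URL is hypothetical):

    <a href="http://www.example.com/" rel="nofollow">Example link</a>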

ALT attributes on images

The site's images are opaque to search engines, that is, the engines are not able to index their content, so it is advisable to put an ALT attribute on each image, describing its content. The ALT attribute is also essential for blind visitors navigating with Braille displays.

An example of an ALT attribute:

    <img src="..."
         width="140"
         height="40"
         border="0"
         alt="logo Kioskea">

It is also advisable to fill in a title attribute, in order to display a tooltip to the user describing the image.
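For example, combining the two attributes (the file name and texts are hypothetical):

    <img src="logo.gif" alt="logo Kioskea" title="Kioskea - back to the home page">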

Improve crawl

SEO begins with the crawl of your site by the search engines' robots. These are agents that browse websites looking for new pages to index or for pages to update. A crawler acts somewhat like a virtual visitor: it follows the links on your site in order to explore as many pages as possible. These robots can be identified in the server logs by the User-Agent HTTP header they send. Here are the user agents of the main search engines:

  • Googlebot (Google)
  • Bingbot (Bing)
  • Slurp (Yahoo)
  • Exabot (Exalead)
You should therefore make sure to interlink your pages intelligently, to allow the robots to access as many pages as possible, as quickly as possible.

To improve the indexing of your site, there are several methods:

robots.txt

It is possible, and desirable, to block the pages you do not want indexed using a robots.txt file, so that the crawlers can devote all their energy to the useful pages. Duplicated pages (for example, pages with parameters that are useless to robots) or pages of little interest to visitors coming from a search (the site's internal search results, etc.) should typically be blocked;

On Kioskea, the results of the internal search engine are explicitly excluded from indexing via the robots.txt file, so as not to present users arriving from a search engine with automatically generated results, in accordance with Google's guidelines.
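A minimal robots.txt sketch blocking internal search results (the /search/ path is hypothetical):

    User-agent: *
    Disallow: /search/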

Page load speed

It is important to improve page loading time, for example by using caching mechanisms: on the one hand because it improves the user experience and therefore visitor satisfaction, and on the other hand because search engines increasingly take this type of signal into account when positioning pages;

Sitemap

Creating a sitemap file gives the robots access to all your pages, or to the most recently updated pages, for indexing.
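A minimal sitemap sketch (the URL and date are purely illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page.html</loc>
        <lastmod>2017-11-01</lastmod>
      </url>
    </urlset>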

Social networks

More and more search engines take social sharing signals into account in their algorithms. Google Panda notably takes this criterion into account in determining whether a site is of good quality or not. In other words, encouraging social sharing limits the risk of being impacted by algorithms such as Panda.

On Kioskea, the pages contain asynchronous sharing buttons, so as not to slow the loading of the pages, as well as OpenGraph og:image meta tags to tell the social networks which image to display when a user shares a link.
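For example (the image URL is hypothetical):

    <meta property="og:image" content="http://www.example.com/images/share.png" />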

Mobile SEO

Ideally, you should have a mobile site built with responsive design; in this case, the page indexed for desktop computers and for mobile devices is the same, and only the display changes depending on the device.
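A responsive page typically declares a viewport in its header, for example:

    <meta name="viewport" content="width=device-width, initial-scale=1">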

If your mobile website is on a separate domain or subdomain, as is the case for Kioskea, simply redirect users automatically to the mobile site, making sure that each redirected page points to its equivalent on the mobile site. You must also make sure that the Googlebot-Mobile crawler is indeed treated as a mobile device!
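To help the engines match the two versions, Google documents bidirectional annotations between them; a sketch assuming a hypothetical m.example.com mobile subdomain:

    <!-- On the desktop page: -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page.html">
    <!-- On the corresponding mobile page: -->
    <link rel="canonical" href="http://www.example.com/page.html">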

Google has indicated that "mobile-friendly" pages receive an SEO boost over non-mobile-friendly pages in mobile search results. This boost is applied page by page and is re-evaluated continuously for each page, depending on whether or not it passes the test.

To go further: mobile SEO

Duplicate content

Wherever possible, you should create unique page titles across the whole site, because search engines such as Google tend to ignore duplicate content, that is, either many pages of the site having the same title, or pages of the site whose main content also exists elsewhere on the site or on other sites.

Duplicate content is natural to some extent, if only because we are led to quote, to report the remarks of public figures, or to cite official texts. However, too high a proportion of duplicate content on a site can lead to an algorithmic penalty, so it is advisable to block such content using a robots.txt file or a robots meta tag with the value "noindex".

Canonical tag

When search engines detect duplicate content, they keep only one page, according to their own algorithms, which can sometimes lead to errors. It is therefore advisable to include in pages containing duplicate content a Canonical tag pointing to the page to be kept. Here is the syntax:

    <link rel="canonical" href="http://www.commentcamarche.net/forum/" />
In general, it is advisable to include in your pages a Canonical tag with the URL of the current page. This notably limits the loss caused by useless parameters in the URL, such as http://www.commentcamarche.net/forum/?page=1 or http://www.commentcamarche.net/faq/?utm_source=email !

It also helps page indexing, because Google sometimes indexes your home page in both its http://www.commentcamarche.net/ and http://www.commentcamarche.net/index.php forms.

Penalties

There are generally two types of penalties:
  • Manual penalties, that is, those resulting from human action following a violation of the webmaster guidelines. They may involve unnatural (purchased) links, artificial content, misleading redirects, etc. Penalties for buying links are common and penalize the sites that sold links as well as those that bought them. These penalties can only be lifted after correcting the problem (which implies having identified it) and submitting a reconsideration request for the site via the dedicated form. The review of a website can take several weeks and does not necessarily lead to a recovery of position, or sometimes only a partial one;
  • Algorithmic penalties, that is, those resulting from no human action, usually linked to a set of factors that only the search engine knows. This is the case, for example, with Google Panda, the Google algorithm that downgrades so-called poor-quality sites, or Google Penguin, the algorithm targeting bad SEO practices. These penalties can only be lifted once the "signals" leading to the downgrade have been eliminated, at the next iteration of the algorithm.

Google algorithm

Google's algorithm is the set of instructions that allows Google to produce a results page in response to a query.

PageRank

Originally, the algorithm was based solely on the study of the links between web pages and relied on an index assigned to each page, called PageRank (PR). The principle is simple: the more incoming links a page has, the more its PageRank increases; and the higher a page's PageRank, the more it passes on to its outgoing links. By extension, we speak of the PageRank of a site to refer to the PageRank of its home page, since it is usually the page with the highest PageRank of all the site's pages.
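As a reminder, the formula given in the original PageRank paper, where d is a damping factor (usually 0.85), T1...Tn are the pages linking to page A, and C(T) is the number of outgoing links of page T:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )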

Optimization of the algorithm

Since PageRank, the algorithm has taken into account a large number of additional signals, including (non-exhaustive list):
  • the freshness of the information;
  • the mention of the author;
  • the time spent on the page and the reader's degree of engagement;
  • traffic sources other than SEO;
  • etc.


Google announces around 500 optimizations of the algorithm per year, that is, more than one change per day. The SERP can therefore vary significantly depending on the changes made by the Google teams.

Google Panda

Panda is the name given to the filter deployed by Google to combat poor-quality sites. The idea is to lower the position of sites whose content is judged to be of too low quality:
  • See Google Panda

Google Penguin

Google Penguin is a Google update that penalizes sites whose SEO optimization is judged excessive. This is the case, for example, of sites with too many links coming from sites regarded as "spammy". An abuse of links between pages dealing with disparate subjects also seems to be a factor that can lead to a penalty via the Google Penguin algorithm. Google has set up a form for disavowing links that could potentially harm a site's SEO (see the history of Google Penguin deployments).

Recovering from an algorithmic penalty

First, make sure that the loss of audience is indeed linked to an algorithm change. To do so, determine whether the drop coincides with a known deployment of Panda or Penguin. If it does, the two are probably linked. Note that a deployment can take several days or even weeks, which means that the drop is not necessarily sudden.

To recover from an algorithmic penalty, it is advisable to carry out a manual review of the site's main content pages, checking point by point whether the quality is up to standard and whether the content is unique. If the quality is insufficient, the content will need to be reworked to improve it, or else deindexed (or deleted). Regarding Google Penguin, examine the links pointing to the site in Google Webmaster Tools and make sure, on the one hand, that the links are natural and, on the other, that they are of good quality (coming from sites that do not look like spam).

Handy tools

  • Google Webmaster Tools
  • Bing Webmaster Tools
  • Google Trends
  • ÜberSuggest (keyword suggestions)

Twitter accounts of SEO consultants (France)

  • @abondance_com: Olivier Andrieu, founder of the site Abondance.com
  • @axenetwit: Sylvain Richard, SEO and founder of the blog specializing in SEO Axe-Net.fr
  • @largox: Virginie Clève, head of the Digital Marketing division at Radio France
  • @oseox: Aurélien Bardon, SEO consultant and founder of Oseox.fr
  • @renaudjoly: Renaud Joly, head of SEO at LaRedoute.fr
  • @rochdaniel: Daniel Roch, independent consultant and founder of the SEO blog SeoMix
  • @webrankinfo: Olivier Duffez, SEO consultant and founder of WebRankInfo.com
  • @Zorgloob: Éric Lebeau, founder of Zorgloob, a site specializing in Google news
  • @Mar1e: Marie Pourreyron, manager of Altiref, a website SEO agency
  • @dsampaolo: Didier Sampaolo, senior technical advisor and SEO specialist
  • @Polo_Seo: Paul Colombo, former head of SEO at Seloger.com and current head of SEO at Sarenza.com

Twitter accounts (USA)

  • @google: official Google account
  • @mattcutts: official account of Matt Cutts, head of the antispam team at Google
  • @sengineland: official account of SearchEngineLand, a leading American site specializing in SEO/SEM/PPC
  • @dannysullivan: official account of Danny Sullivan, editor of the site SearchEngineLand
  • @seobook: official account of the American site SEOBook, specializing in SEO
  • @SEOmoz: official account of SEOmoz, an American ranking-tracking tool
  • @googlewmc: official Google Webmaster Tools account, giving news of the tool's latest developments
