Wednesday, September 5, 2018

History and Evolution of SEO



WHAT IS SEO?

SEO stands for Search Engine Optimization. It is the process of improving the visibility of a web page in search engine results.



HISTORY OF SEO



In 1998, two PhD students at Stanford University, Larry Page and Sergey Brin, created Google as part of a research project. Initially Google was used by only a small group of people, while Yahoo was the famous and widely used search engine of the time.
On 11th September 2001, the World Trade Center was attacked, and this event was also an important turning point in the growth of Google. People searched for information about the World Trade Center on Google, but to their utter surprise no information was found. The news spread like wildfire and exposed one of Google's greatest drawbacks, so Google's leadership made a firm decision to earn people's trust.
The working of Google can be categorized into three processes:
1) Crawling
2) Caching
3) Indexing
Web crawlers (also known as spiders, internet bots, or Googlebots) are programs designed to crawl web pages. They scan web pages and try to understand their contents. Once the contents are understood, the crawler takes a snapshot and stores it under the appropriate category in Google's database. In the early days, however, website designers focused mainly on making web pages attractive, and such pages could not be crawled by Google.
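As a rough illustration of crawling and indexing (a sketch of the general idea, not Google's actual crawler), a minimal crawl-and-index loop in Python might look like this; the seed URL and the crude word-based index are invented for the example:

```python
# A minimal crawl-and-index sketch -- illustrates the idea, not Google's
# actual crawler. Requires: pip install requests beautifulsoup4
from collections import defaultdict
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

MAX_PAGES = 20                        # keep the sketch polite and finite
seed_urls = ["https://example.com"]   # illustrative starting point
index = defaultdict(set)              # word -> set of URLs (the "snapshot" store)

frontier, seen, crawled = list(seed_urls), set(seed_urls), 0
while frontier and crawled < MAX_PAGES:
    url = frontier.pop()
    try:
        html = requests.get(url, timeout=5).text
    except requests.RequestException:
        continue
    crawled += 1
    soup = BeautifulSoup(html, "html.parser")
    # "Understand" the page crudely: index its visible words.
    for word in soup.get_text().lower().split():
        index[word].add(url)
    # Follow links to discover new pages (the "spidering" step).
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"])
        if link not in seen:
            seen.add(link)
            frontier.append(link)

print(f"crawled {crawled} pages, indexed {len(index)} distinct words")
```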
In order to make websites more crawlable, Google decided to introduce a new optimization technique. At first this technique was meant to be kept secret, but Google later released it as a free PDF called the SEO Starter Guide, which began with the sentimental statement, "This document first began as an effort to help teams within Google." Using this guide, webmasters (the people who control websites) began optimizing their sites, making them more crawlable. Their focus thus shifted from attractiveness to optimization.
SEO can be categorized into two:
1) Black hat SEO and
2) White hat SEO
Black hat SEO covers unethical methods that violate search engine guidelines to increase the ranking of websites; meta stuffing and keyword stuffing are two such examples. White hat SEO covers the ethical, legitimate techniques for improving the ranking of websites.

EVOLUTION OF SEO



Over time, SEO became niche specific: a content-based technique for increasing the ranking of websites. In this technique a particular keyword is repeated in large quantities on the website to boost its ranking. This is known as keyword stuffing and is a black hat SEO practice. A similar technique is meta stuffing, in which meta tags are filled with the target keywords to improve ranking. When Google realized this, it stopped ranking based on meta tags in 2009.
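To make keyword stuffing concrete, here is a minimal Python sketch of the kind of keyword-density check a spam filter might perform; the 8% threshold is an invented number for illustration, not anything published by Google:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "cheap shoes cheap shoes buy cheap shoes today cheap shoes"
density = keyword_density(page, "cheap")
# An unusually high density (the 8% threshold here is made up for the
# example) is a classic sign of keyword stuffing.
print(f"density={density:.0%}, stuffed={density > 0.08}")
```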

Google then changed the metrics of its algorithm and moved from niche-specific to link-specific ranking. Now websites with a larger number of links ranked higher. For this reason, companies began to sell links and webmasters began to purchase them, resulting in low-quality websites.

On realizing this issue, Google again changed its algorithm's metrics and moved from link-specific to quality-link-specific ranking. Quality-link specific means that a website with a large number of links from quality websites will rank higher. To determine which links are quality links, Google introduced a new rating system known as PageRank, said to be determined using around 200 ranking factors. Earlier there were tools to find out the PageRank of websites, which increased the stress and pressure on webmasters. These tools are no longer available; PageRank scores are now hidden.

Twitter, the US government website, the AddThis share-button link, the Flash Player update link, etc. are examples of pages that had a PageRank of 10/10. A higher PageRank means the website enjoys a higher level of trust. Twitter earned a PageRank of 10 because an enormous number of URLs point to it, its links are published exclusively and widely, and it sees constant interaction.

Google emphasizes quality links because it only wants to provide the best-quality results to its users. Ranking is based on the concept of PageRank: the higher the PageRank, the higher the ranking. If two pages have the same PageRank, the content of the websites is considered in the ranking process. In that case the content and the topics of the links on the website must be related, which is called on-topic ranking.
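The core idea behind PageRank can be sketched in a few lines of Python. This is the classic power-iteration formulation from Page and Brin's original paper, not Google's production system, and the tiny four-page link graph is invented for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic power-iteration PageRank over a dict: page -> pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: share rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:                                 # pass rank along each outbound link
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Invented four-page web: links *from* trusted pages boost the target's rank.
web = {"twitter": ["blog"], "gov": ["blog"], "blog": ["shop"], "shop": []}
for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

In this toy graph, rank flows along links, so the pages that receive links from trusted pages end up with higher scores than the pages nobody links to.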

Naturally, websites with higher PageRank started selling their links, which became a huge headache for Google, and its algorithm could not control such ranking. To solve this problem, Google introduced a new colloquial term in SEO known as 'passing the juice'. Under this scheme, websites with a large number of outbound links but few inbound links from other websites receive a lower rank, and high-PageRank websites that sell their links automatically see their PageRank decrease.

Sometimes situations arise in which links must be given without passing on the website's equity, for example when giving references. For this problem, Google introduced a solution: the rel="nofollow" attribute, written in HTML as <a href="https://example.com" rel="nofollow">reference</a>.

Why was Google scared to make updates to its algorithm?

Google wanted to make many changes to its algorithm, but it did so with great fear. Why? To answer this question, we have to look back at its history. When Google hit fame, it started a program called AdWords. AdWords is a program in which advertisers pay Google to display their ads. It is a prepaid, pay-per-click (PPC) program: each time someone clicks an ad, Google gets paid, and ads are ranked using a quality score.

After AdWords, Google introduced AdSense. AdSense is a program in which owners of websites or blogs sign up to show Google's ads and make a profit. For each click, Google passes a small percentage of the earnings to the AdSense user as income.

In order to register for AdSense, the user must apply; after approval, the user receives a JavaScript snippet. Once it is pasted into the site, visitors can view the ads, and for each click the owner of the blog or website earns a reward. This is known as affiliate marketing.
There are mainly two types of AdWords networks:
1) Search Network and
2) Display Network.

In the Search Network, ads are displayed alongside search results, whereas in the Display Network, ads are displayed on blogs and websites.

Google then made an update, but this update led to the loss of big revenue, which in turn affected growth. So Google became afraid of making updates.
Web Quality Team
After 2008, Google created a special team to maintain quality, called the Web Quality Team, headed by Matt Cutts. In 2009, Google made huge quality changes because it now ran large products such as YouTube and Gmail. These updates were made to provide users with high-quality results. Google tried to transform its search engine into something far better, more like a personal assistant, adding features such as a calculator and IP lookup. It also made search more user friendly with Google Suggest, so that people would like and depend on Google more. Google can also act as a personalized search engine, storing details of previous searches and using them to tailor results; this feature is only available when users are logged in.
Data centers
In 2009, Google introduced data centers for storing details about user interaction. Cookies and browser log files are collected by Google and sent to these data centers, and this data is in turn used to rank websites. If a site is closed as soon as it is opened, Google understands that users do not like the site, and it decreases the site's rank.
Pogo sticking
Moz.com, the world's largest SEO community, introduced a concept called pogo sticking. If a person searches for a particular keyword, spends little time on one website, hits the back button, opens another website from the search results, and spends more time on this new website, the ranking of the new website will increase and that of the first will decrease.
Bounce rate
Another concept similar to pogo sticking is bounce rate. Bounce rate is the percentage of visitors who leave a website after viewing only one page, without engaging with its content. Bounce rate can be used to gauge the trustworthiness of a site. As loading time increases, the bounce rate also increases.
Exit Rate
Another concept often confused with bounce rate is exit rate: the percentage of users who leave for another website after having visited other pages of the site as well. In simple words, it measures from which page visitors exited.
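The difference between the two metrics is easiest to see in code. Below is a Python sketch over a toy session log; the sessions are invented, and real analytics tools differ in the fine print of these formulas:

```python
# Toy session log: each session is the ordered list of pages a visitor viewed.
# Commonly quoted definitions (analytics tools differ in details):
#   bounce rate            = single-page sessions / all sessions
#   exit rate of a page P  = sessions ending on P / total views of P
sessions = [
    ["home"],                       # a bounce: left after one page
    ["home", "pricing", "signup"],
    ["blog", "home"],
    ["pricing"],                    # another bounce
]

bounce_rate = sum(len(s) == 1 for s in sessions) / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")             # 50%

page = "pricing"
views = sum(s.count(page) for s in sessions)         # 2 views of "pricing"
exits = sum(s[-1] == page for s in sessions)         # 1 session ended there
print(f"exit rate of {page}: {exits / views:.0%}")   # 50%
```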

In 2010, when social media had come into wide use, Google changed its algorithm so that social media sharing affects ranking. This is known as the social media signal. More shares and likes result in a higher ranking. If the shares and likes are equal, the influential power of the person sharing the post is considered and ranking is done on that basis. Influential power is based on how people respond to that person's posts.

In 2011, Google made further updates to eliminate spamming webmasters. The main aim was to increase the quality of search results. The different updates are as follows:

1) Panda



This update was made in 2011, and its different versions were released over the following years. It was directed against content spamming. The different types of content spamming are:
1) Content duplication (plagiarism): content copy-pasted from other websites (a simple similarity check is sketched after this list).
2) Content spinning: the same content rewritten and repeated in different parts of the website in different ways.
3) Low-quality content: content with spelling and grammatical mistakes, which makes it low quality.
4) Thin pages: pages with very little content.
5) Keyword stuffing: the target keyword repeated in different ways throughout the content; a black hat SEO technique.
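As promised above, here is a small Python sketch of how duplicated or spun content can be flagged automatically, using word-shingle Jaccard similarity; this is a standard text-similarity method, not a documented Panda component, and the 0.5 threshold is invented for the example:

```python
def shingles(text: str, n: int = 3) -> set:
    """Set of n-word 'shingles' (overlapping word windows) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "our shop sells handmade leather shoes at fair prices"
copied   = "our shop sells handmade leather shoes at low prices"
score = jaccard(original, copied)
# 0.5 is an invented threshold for this sketch, not a Panda parameter.
print(f"similarity={score:.2f}, likely duplicate={score > 0.5}")
```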

After this update, Panda became a huge success, so Google released a large number of versions, reportedly around a hundred.
On 19th May 2014, another version, Panda 4.0, added a permanent filter to the main algorithm. When the crawler encounters spamming content, that content does not pass through the filter. This happens 95% of the time; in the remaining 5% of cases the crawler has not yet crawled the content, but once it does, the website will be removed.

2) Penguin
In 2012, the Penguin update was rolled out. This update was also a success, and various versions of it were released. It was made to target link spamming. The different types are:
1) Paid links: the selling and purchasing of links.
2) Link exchange: websites exchanging links with one another.
3) Link farming: building links through self-farming techniques, e.g. an automatic, possibly self-developed program.
4) Link schemes: link-referral websites used to increase links.
5) Comment spamming: adding links in comments; a black hat SEO technique.
6) Wiki spamming: webmasters take links from Wikipedia to increase their ranking, but Wikipedia added moderators to curb such spamming.
7) Guest blogging: if the number of guest posts rises beyond a threshold, Google uses Penguin to punish the site.

In September and October 2016, Penguin 4.0 was rolled out in two phases. This update made Penguin real time, meaning the punishment is applied immediately. It was not a huge success at first, but it has since become more effective.

3) Pigeon
This update was based on local ranking. In local ranking, targeting is based on local SEO: for example, if a business targets local customers, local SEO is done for it. If local SEO is not done, the website will not be ranked locally.
On December 10th, 2014, it was first implemented in the US and UK; phase 2 extended it to other countries.

4) Hummingbird


This update was based on semantic search results. Semantic search provides detailed, in-depth results, and the ranking is based on the feedback of users. Google wanted to provide useful, in-depth results on the basis of people's responses: a fact strongly accepted by people is strongly supported. The defect of this is that a wrong fact strongly supported by people will also be promoted by Google.
RankBrain, an advanced version of Hummingbird, was released in 2015. It uses artificial intelligence to assess queries and updates itself automatically. Artificial intelligence makes it easier to predict human behaviour and enables these automatic updates.
