Monday, September 10, 2018

Guide to Webmaster Tools


Webmaster Tools - Introduction

Anything can be shared on the internet, but how can you make sure that your website is listed in the search results among thousands of other websites? Google Search Console helps ensure your website is listed. It helps in monitoring visitors and also shows the keywords for which your pages appear most often in the results. By focusing on these keywords, the business can be improved. It also lists possible errors found while processing your sitemap and sends an email regarding the same.

Webmaster Tools (now known as Google Search Console) is a free service for webmasters introduced by Google. Webmasters use it to check index status and to optimize the visibility of their sites. The name was changed from Webmaster Tools to Google Search Console on May 20, 2015, and in January 2018 Google introduced a new version with many new features.
Google Search Console is useful to a wide variety of users in different areas: business owners with websites, SEO specialists, web developers, site administrators, and app developers.

Features:


Google Search Console was designed to help webmasters. It does so by providing the following tools:
  • Helps webmasters submit and maintain their sitemaps (lists of the pages of a website) and find errors in them.
  • Statistics about when Googlebot accesses a particular site can be viewed, and the crawl rate can be checked and adjusted as needed.
  • Helps maintain the robots.txt file and find pages that are accidentally blocked by it.
  • Shows the list of internal and external links which point to the site.
  • Lists the URLs which Googlebot has difficulty crawling, along with the error messages Googlebot encountered.
  • Shows the different keywords for which the site is listed, along with the click-through rates (CTRs).
  • Checks the website for security issues.

Why should we use Google Search Console?
  • It helps ensure that Google sees your content.
  • It enables you to submit your pages and have their content crawled.
  • You can also remove unwanted articles from the search results.
  • It helps you monitor spam issues.
  • It shows the keywords for which your site ranks highly.

How to create a Google search console account?

Step 1: Register with your Gmail ID.
Step 2: Paste the HTML verification code into the head section of your website.
Step 3: In Google Search Console, click Verify to verify your account.
With these three steps you can start using your Google Search Console account.
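As a sketch of Step 2: the verification code is a small meta tag placed inside the page's head section. The token value below is a made-up placeholder; Search Console generates the real one for your site:

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Website</title>
  <!-- Site verification tag pasted from Google Search Console.
       The content value below is a placeholder, not a real token. -->
  <meta name="google-site-verification" content="EXAMPLE-TOKEN-1234567890" />
</head>
<body>
  ...
</body>
</html>
```

Once this tag is live on the site, clicking Verify lets Google confirm that you control the domain.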

Google Webmaster Tools also allows you to have your content crawled by Googlebot: first fetch the page's URL and click on Request Indexing. Once it is indexed, a Complete status will be shown.
When you want another page on that website crawled, the remaining portion of that page's path needs to be added to the URL.








For more information on:
SEO- History and Evolution
On Page Optimization
                                                                                                                                     

Google analytics

                                         
Google Analytics is a free web analytics service provided by Google. It helps in keeping track of traffic. The service was launched by Google in November 2005, after they acquired Urchin. Google Analytics can be used for websites as well as apps.
Google Analytics helps with search engine optimization by showing the most-used keywords, the types of downloads, the devices used, and so on. Using Google Analytics, the health of a website as well as of a mobile app can be monitored. Issues users face while viewing the website can also be identified.

Types of traffic:

Traffic refers to the number of visits a page receives. There are five different types of traffic to a website. They are as follows:
1) Direct traffic: Traffic obtained when visitors type the URL of the website directly. These visits often come from product notices, offline advertisements, etc.
2) Organic traffic: Traffic obtained from search engine results, such as Google or Bing.
3) Social media traffic: Traffic obtained from shares and links on social media platforms.
4) Referral traffic: Visits that arrive through external links to the website.
5) Traffic from ads: Visits that come through ads, e.g. affiliate links, banner ads, etc.

What is its use?

1) Website traffic

It tracks who visited the website and which pages users stayed on longest. It helps you understand how viewers navigate through the pages and which page has the highest bounce rate.

2) Different conversion rate

It helps you understand the number of downloads, page views, and registrations for your site.

3) Real time feature

This enables users to see the number of viewers on the site at the present moment.

4) Traffic rate

The rate of organic traffic can be found by reviewing the organic keywords report. A visit with a high average visit time was more useful to that user than a visit with a low visit time.

5) It helps with the development and optimization of pages.

6) It identifies the obstacles that prevent visitors from accessing pages.

7) It tracks the nature of visits.

8) It helps identify the potential audience (people who actually have a chance of buying your product).

9) The location of viewers can be found.

10) The devices used by users can also be found.

11) User flow can be analysed, and Google Analytics lets you segment it in several different ways.

12) Note that the terms and conditions of the service differ from country to country.

Steps to set up Google Analytics account:

Step 1: Sign in, enter your username, blog URL, industry, etc., and agree to the terms and conditions.


Step 2: Paste the tracking code snippet (with your tracking ID) into every page of the website.


Step 3: The tag will collect data and send it to Google's servers for processing.

Step 4: Refresh Google analytics page and start tracking.
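As a sketch of Step 2: the tracking snippet Google Analytics provides (in its gtag.js form) is pasted into the head of every page. The tracking ID below is a placeholder; use the one from your own Analytics property:

```html
<!-- Google Analytics tracking snippet (gtag.js).
     UA-XXXXXXX-1 is a placeholder; replace it with your own tracking ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-1');
</script>
```

Each page view then sends a hit to Google's servers under that ID, which is what the reports in the Analytics dashboard are built from.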


   



Wednesday, September 5, 2018

On Page Optimization- A Complete Guide


In the modern world competition is very high, so getting a high rank is very difficult. Search engines are becoming smarter every day, and having good content is not the only factor which affects ranking. One technique which helps improve the ranking of your website relative to your top competitors' websites is on-page optimization.

What is on-page optimization? Why is it important?

On-page optimization is the set of techniques that improve the ranking of your website in the organic search results. Examples are optimizing the HTML tags and the keyword density.

Webmasters should know on-page optimization, as it improves not only the ranking but also the visibility of the website.

On-page optimization can be done in the head as well as the body section. The different types are as follows:


Head section

The head section of a webpage includes details which are not directly displayed on the page, such as the title tag.

The different types are:

a) Title Optimization

b)  Meta Tag Optimization



The URL is checked first. Its recommended character limit is 90. URLs should be written so as to convey some relevant meaning to the Googlebots.

a) Title optimization

A title is the most important optimization element because it marks the identity of the page or website. A title tag should be short but descriptive enough that users can identify the page's purpose. The title is the first thing the search engine shows and indexes, so it must be designed to attract the user's interest, which in turn improves ranking.

Important points to be noted are as follows:
  • Create a meaningful, descriptive title using the targeted keyword.
  • If headings contradict the title, the Google bots get confused, so use relevant titles.
  • The total character limit in the snippet is 55-60. (Snippet: the URL, description, and title of the page shown in the search results.)
  • The limit in the search results is 70 characters, with a pixel width of less than 512.
  • Titles should not be written entirely in capital letters, as the wider characters may exceed the pixel limit.
  • They should not be entirely in lowercase letters either, as that reduces the impression, which in turn reduces the CTR. CTR (click-through rate) is the percentage of clicks a result receives relative to its impressions.
  • If there is no title, Google checks for an H1 tag and displays it as the title. If there is no H1 tag it checks for H2, and so on. If no heading tags are found, it selects a title from the links in the webpage.
  • No spelling or grammatical mistakes are allowed.
  • Symbols must be written as HTML entity codes, because the bots may sometimes stall after encountering a raw symbol.
  • The SEO title should be unique and must not duplicate the title of another page. If two pages have the same title keyword, it becomes title duplication.
  • Keyword cannibalization refers to the situation where different pages on the same website compete for the same keyword. This must be avoided.
  • A title should have more than three words; very short titles provide less clarity.
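As a small sketch of these rules, here is a title tag with the targeted keyword placed first, within the 55-60 character snippet guideline. The site topic and wording are invented for illustration:

```html
<head>
  <!-- Targeted keyword "on-page optimization" appears first;
       total length stays within the ~55-60 character snippet limit. -->
  <title>On-Page Optimization Guide: Titles, Meta Tags and Links</title>
</head>
```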

b) Meta Tag Optimization

The meta description is the part below the title and the URL shown in the search results. The meta description should be written well, as it is considered a selling snippet: if a searcher finds it attractive, the chances of a click are high.


Some of the important points to be noted are:
  • Provide relevant meta descriptions, because Google may not display yours as-is if it is not relevant.
  • If no meta tag is provided, Google selects the most relevant sentence from the page as the meta content, which is usually the first sentence.
  • The Open Directory Project contains hundreds of thousands of business listings and is maintained by a community of volunteer editors. Google considers it an authoritative source and may use its descriptions.
  • The worst case is when Google finds no relevant information and picks some irrelevant text as the meta content, known as dumping of irrelevant words by Google.
  • Using rich snippets can improve the appearance of the result, which in turn improves the CTR. A rich snippet (e.g. breadcrumbs) is used to increase the attractiveness of the listing.
  • Event data snippets are used to show dates and events.
  • Sitelinks: when a brand name is searched for, its subpages are also displayed below the main result.
  • The character limit of a meta description is 155-160 characters for a page and 154-155 for a blog post, because the date of the blog post is displayed and those characters must be subtracted.
  • The pixel width of a meta description is 1024 pixels.
  • Meta descriptions have to be unique. No two websites, or pages within a website, should use the same meta description.
  • The meta description should not be too short: aim for at least 140 characters.
  • No grammatical mistakes are allowed.
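A sketch of a meta description tag within the 155-160 character guideline; the wording is invented for illustration:

```html
<head>
  <!-- Unique, relevant description of roughly 150 characters,
       written as a "selling snippet" to invite the click. -->
  <meta name="description"
        content="Learn on-page optimization step by step: write effective title tags, meta descriptions, headings, anchor text and image alt text to improve ranking." />
</head>
```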

Body section:

The HTML body tag marks the visible portion of the page. The different types of optimization are:

a) Optimization for important HTML Tags
b)  Link Optimization
c) Keyword Optimization
d) Image Optimization
e) Authorship Optimization

a) Optimization for important HTML Tags

While writing content it is very important to highlight different sections according to their importance. This helps attract viewers and catch their interest. The most commonly used HTML heading tags are H1, H2 and H3.

Some of the important points are as follows:
  • H1 is the most important heading, because it attracts the user and acts as an identification mark for the page. It must not be too lengthy.
  • H1 can be used once in a page, with approximately three words, and is usually given to the title.
  • Sandbox: a temporary holding database in which Google keeps confusing webpages. A page placed there gets stuck at some rank, and that rank is hard to change even after the website has been optimized.
  • H2 should not be used more than once.
  • An H2 that contradicts the title may lead to confusion, though this does not always happen.
  • Secondary keywords can be highlighted with H2 and H3. The content under them should also be relevant, which may help ranking.
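A sketch of how the heading tags above might be laid out on a page; the topic and keywords are invented for illustration:

```html
<body>
  <!-- One short H1 carrying the main keyword -->
  <h1>On-Page Optimization Guide</h1>

  <!-- A single H2 for the primary section, relevant to the H1 -->
  <h2>Title and Meta Tag Optimization</h2>

  <!-- H3s highlight secondary keywords within that section -->
  <h3>Title Optimization</h3>
  <p>...</p>
  <h3>Meta Tag Optimization</h3>
  <p>...</p>
</body>
```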

b) Link Optimization

Internal and external link optimization must be done on the website to provide better navigation for both search engines and users. Anchor text (the clickable text of a link) is code with a hyperlink that connects an image or text to another page or website.

Some important points are as follows:
  • Using anchor text increases the visibility of the linked page.
  • Links act as fuel for the Google bots.
  • Anchor text should be relevant: the more relevant the keyword, the better the ranking. Avoid using your focus keyword as anchor text in links to other websites, because Google treats those links as recommendations and the other website's ranking for that keyword will improve.
  • Sometimes equity can also be passed through NOFOLLOW links.
  • Penguin does not like misuse of internal links.
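A sketch of the two kinds of anchor text discussed above; the URLs and wording are invented for illustration:

```html
<!-- Internal link: descriptive, relevant anchor text -->
<p>Read our <a href="/on-page-optimization">on-page optimization guide</a> first.</p>

<!-- External link: avoid using your own focus keyword as the anchor -->
<p>Further details are in <a href="https://example.com/reference">this reference</a>.</p>
```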

c) Keyword Optimization

The content of the website must be arranged in a manner that suits both the viewers and the search engine. There must be a balance in the use of focused keywords so that it does not become keyword stuffing, which hurts the readability of the content.

Some points to be noted are:
  • Keyword density is the percentage of words in the content that are the target keyword. Keyword density must be moderate.
  • Old-school SEO had many myths, such as that keyword density affects ranking and that bold letters increase ranking. These are wrong: keyword density and bold letters do not affect ranking. Bold letters only make reading easier and keep the reader interested.
  • Long-tail keywords are longer, more specific keyword phrases.
  • If the focused keyword comes first, users get more clarity about the content.
  • The claim that ranking increases if focused keywords are shown in grey is a myth.
  • Ranking can be helped if the targeted keyword sits just before a full stop ( . ) or comma ( , ), because the Googlebots pause at these while scanning and the keyword gets more weight.
  • To prevent overuse of the focused keyword, use synonyms.
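The keyword density mentioned above can be written as a simple formula; the counts in the comment are invented for a worked example:

```latex
\text{Keyword density} = \frac{\text{occurrences of the target keyword}}{\text{total words in the content}} \times 100\%
% Example: 12 occurrences in an 800-word post
% gives (12 / 800) * 100 = 1.5% density.
```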

d) Image Optimization

Image optimization is one of the simplest optimization techniques. A Googlebot cannot interpret images the way it interprets text; it interprets them with the help of the following elements:
  • Alt text: alternate text (ALT text) describes an image and is shown when the image cannot be displayed (some browsers also show it on hover). It must be meaningful and short. The focused keyword can be used here, which in turn improves ranking.
  • File name: file names should be meaningful and can be similar to the alt text. The Googlebot checks the alt text and file name for identification; in the absence of alt text, the filename is the secondary identity and the content is assumed to relate to it.
  • The keywords present in the text around the image are also considered.

Some important points to be noted are:
  • If the focused keyword is given as the alt text and a hyperlink to another website is added to the image, equity is passed to that website and the ranking of our own website can decrease.
  • A hyperlink given on an image has more power than one given in text, as viewers are more attracted to images.
  • If the alt text and the image are unrelated, Google will decrease the ranking based on user interaction, since no user is likely to click through to that image.
  • If you change an image's previous file name, a redirection should be provided. Images should not be saved with spaces in their file names, because redirection will then not be possible.
  • Infographics: visual representations of information using images. They improve the understandability and attractiveness of the written content.
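A sketch of the alt text and file-name advice above; the file name and wording are invented for illustration:

```html
<!-- Meaningful, hyphenated file name (no spaces) and short,
     relevant alt text containing the focused keyword. -->
<img src="/images/on-page-optimization-checklist.png"
     alt="On-page optimization checklist" />
```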

e) Authorship optimization

In today's world everything can be found on the internet, so it is necessary to establish ownership of your content. Google Authorship enables you to do just that: authorship is established by linking your Google+ account with the content. This two-way connection helps you establish authorship and verify ownership.

The benefits of authorship optimization are as follows:
  • It establishes authorship of your content and protects it against plagiarism.
  • It helps improve editorial or authorship power, which improves the CTR and hence increases ranking.
  • A search result with authorship has more chance of being clicked.

Steps to establish authorship

Step 1: Create a Google+ account.

Step 2: Link the post to the account and then use rel="author" to establish authorship.
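A sketch of the rel="author" link from Step 2; the profile URL is a placeholder:

```html
<!-- Placed in the post's HTML; the Google+ profile URL is a placeholder. -->
<a href="https://plus.google.com/+ExampleAuthor" rel="author">Example Author</a>
```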

Search Console: a no-charge website for webmasters created by Google. It shows the indexing status of their websites and helps optimize their visibility.



History and Evolution of SEO



WHAT IS SEO?

SEO stands for search engine optimization. It is a process which affects the visibility of a web page in web search results.


HISTORY AND EVOLUTION OF SEO

HISTORY OF SEO



In 1998, two PhD students at Stanford University, Larry Page and Sergey Brin, created Google as part of their research project. Initially Google was used only by a small group, while another search engine, Yahoo, was the famous and widely used one.
On 11th September 2001, the World Trade Center was attacked, and this event was also an important turning point in the growth of Google. People searched for information about the World Trade Center on Google, but to their utter surprise no information was found. This news spread like wildfire and became one of Google's greatest setbacks. The Google authorities then took a firm decision to earn people's trust.
The working of Google can be categorized into three processes:
1) crawling
2) caching
3) indexing.
Web crawlers (also known as spiders, internet bots, or Google bots) are predefined programs for crawling web pages. A crawler scans a web page and tries to understand its contents. If the contents are understood, it takes a snapshot and stores it under the appropriate category in Google's database. But in the old days, website designers mainly focused on increasing the attractiveness of web pages, which could not be crawled by Google.
In order to make websites more crawlable, Google decided to introduce a new optimization technique. At first this technique was meant to be kept secret, but later Google released it as a free PDF called the SEO Starter Guide, which began with the sentimental statement that it 'first began as an effort to help teams within Google'. Using this guide, webmasters (the people who control websites) began optimizing their sites, making them more crawlable. So their focus shifted from pure attractiveness to optimization.
SEO can be categorized into two types:
1) Black hat SEO and
2) White hat SEO.
Black hat SEO covers unethical, illegal methods of increasing the ranking of websites; meta stuffing and keyword stuffing are two such examples. White hat SEO covers the ethical and legal techniques for improving the ranking of websites.

EVOLUTION OF SEO



Over time, SEO became niche specific, a content-based technique for increasing the ranking of websites: the particular keyword or content is placed in large quantities on the website to increase ranking. This is also known as keyword stuffing and is a black hat SEO practice. A similar technique is meta stuffing, in which meta tags are filled with the target keywords to improve ranking. When Google realized this, it stopped ranking based on meta tags, in 2009.

Google then changed the metrics of its algorithm and transformed from niche specific to link specific: websites with a larger number of links would rank higher. For this reason companies began to sell links and webmasters began to purchase them, resulting in low-quality websites.

On realizing this issue, Google again changed the metrics of its algorithm, from link specific to quality-link specific. Quality-link specific means that if a website has a large number of links from quality websites, its ranking will improve. To determine which links are quality links, Google introduced a new scoring system known as PageRank, determined from around 200 signals. Earlier there were tools to find out the PageRank of websites, which increased the stress and pressure on webmasters. These tools are no longer available; PageRank values are now hidden.

Twitter, US government websites, the AddThis share-button install link, the Flash Player update link, etc. are some examples with a PageRank of 10/10. A higher PageRank means the website has a higher level of trust. The reason Twitter has a PageRank of 10 is that a huge number of URLs are created there, it has exclusive published links, and it has more interactions.

Google emphasizes quality links because it wants to provide only the best quality results to its users. Ranking is based on the concept of PageRank: the higher the PageRank, the higher the ranking. If two pages have the same PageRank, the content of the websites is considered in the ranking process. In such a case the content and the topics of the links on the website must be related, which is called on-topic ranking.

Naturally, websites with higher PageRank started selling their links, which became a huge headache for Google, since its algorithm could not control such ranking. To solve this problem, Google introduced a new colloquial term in SEO known as 'passing the juice': websites with a large number of outgoing links but few incoming links from other websites get a lower rank, and high-PageRank websites which sell their links automatically see their PageRank decrease.

Sometimes situations arise in which links must be given without passing the equity of the website, for example when giving references. For this problem, Google introduced a solution:
rel="nofollow"
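A sketch of how the nofollow attribute is applied to a reference link; the URL is invented for illustration:

```html
<!-- The nofollow value tells Google not to pass equity ("juice")
     through this reference link. -->
<a href="https://example.com/cited-source" rel="nofollow">cited source</a>
```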

Why was Google scared to make updates to its algorithm?

Google wanted to make many changes to its algorithm, but it did so with great fear. Why? To answer this question, we have to look back at its history. When Google hit fame, it started a program called AdWords. AdWords is a program in which advertisers pay Google to show their ads: each time someone clicks an ad, Google gets paid. It is a prepaid pay-per-click (PPC) program.

After AdWords, Google introduced AdSense. AdSense is a program in which the owner of a website or blog signs up to show Google's ads and make some profit: for each click, Google passes a small percentage of the earnings as income to the AdSense user.

In order to register for AdSense, the user must apply, and after approval the user gets a JavaScript code. Once it is pasted into the site, visitors can view the ads, and for each click the owner of the blog or website gets a reward. This model is similar to affiliate marketing.
There are mainly two types of AdWords networks:
1) the Search Network and
2) the Display Network.

In the Search Network, ads are displayed alongside search results, whereas in the Display Network, ads are displayed on blogs and websites.

Google then made an update, but the update led to the loss of big revenue, and this in turn affected their growth. So they were afraid to make further updates.

Web Quality Team
After 2008, Google created a special team to maintain quality, called the web quality team, headed by Matt Cutts. In 2009 they made huge quality changes, as they had acquired or launched large products like YouTube and Gmail. These updates were done to provide users with high-quality results. They tried to transform their search engine into something far better, like a personal assistant, adding features such as a calculator and IP lookup. They also tried to make it more user friendly by adding Google Suggest, so that people would like and depend on Google more. Google can also act as a personalised search engine, in which Google stores details about previous searches and uses them for the user; this feature is only available when the user is logged in.
Data centers
In 2009, Google introduced data centers for storing details about user interaction. Cookies and log files from browsers are collected by Google and sent to these data centers, which Google in turn uses to rank websites. If a site is closed as soon as it is opened, Google understands that users do not like the site and decreases its rank.
Pogo sticking
Moz.com, the world's largest SEO community, introduced a concept called pogo sticking. If a person searches for a keyword, spends little time on one website, hits the back button, opens another website from the search list, and spends more time on that new website, the ranking of the new website will increase and that of the first website will decrease.
Bounce rate
Another concept similar to pogo sticking is bounce rate. Bounce rate is the percentage of visitors who spend little time on the website, leaving without even checking its content. Bounce rate can be used to gauge trust in the site. If the loading time increases, the bounce rate also increases.
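Bounce rate is usually computed as a simple percentage; the numbers in the comment are invented for a worked example:

```latex
\text{Bounce rate} = \frac{\text{single-page sessions}}{\text{total sessions}} \times 100\%
% Example: 300 visitors leave from the landing page
% out of 1000 total sessions: (300 / 1000) * 100 = 30%.
```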
Exit Rate
Another concept often confused with bounce rate is exit rate: the percentage of users who click away to another website after visiting other pages of the site. In simple words, it measures the page from which visitors exited.

In 2010, when social media came into wide use, Google changed the algorithm so that social media sharing affects ranking. This is known as the social media signal: more shares and likes result in a higher ranking. If the shares and likes are equal, the influential power of the person sharing the post is considered and ranking is done on that basis. Influential power is based on how people respond to that person's posts.

In 2011, Google made more updates to eliminate spamming webmasters. The main aim was to increase the quality of the search results. The different updates are as follows:

1. Panda



This update was made in 2011, but different versions of it were released through the year. The update targeted content spamming. The different types of content spamming are:
1) Content duplication or plagiarism: content copy-pasted from various websites.
2) Content spinning: the same content copy-pasted in different parts of the website in different ways.
3) Low-quality content: content with spelling and grammatical mistakes that make it low quality.
4) Thin pages: pages with very little content.
5) Keyword stuffing: the target keyword repeated in different ways throughout the content. It is a black hat SEO technique.

After its release, Panda became a huge success, so Google released a large number of versions, about a hundred.
On 19th May 2014, another version, Panda 4.0, added a permanent filter to the main algorithm. When the crawler encounters spamming content, that content will not pass through the filter. This happens 95% of the time; for the remaining 5%, the crawler has not yet crawled the content, but once it does, the website will be removed.

2) Penguin
In 2012, the Penguin update was released. This update was also a success and various versions of it were released. The update targeted link spamming. The different types are:
1) Paid links: the selling and purchasing of links.
2) Link exchange: the mutual exchange of links.
3) Link farming: building links through self-farming techniques, e.g. an automatic (possibly self-developed) program.
4) Link schemes: link-referral websites used to increase links.
5) Comment spamming: adding links in comments, a black hat SEO technique.
6) Wiki spamming: webmasters take links from Wikipedia to increase their ranking, but Wikipedia added moderators to prevent such spamming.
7) Guest blogging: if the number of guest-post links grows beyond a threshold value, Google uses Penguin to punish them.

In September and October 2016 (phase 1 and phase 2), the Penguin 4.0 update was released. It made Penguin real time, meaning the punishment is applied immediately. It wasn't a huge success at first, but it has since become more successful.

3) Pigeon
This update was based on local ranking. In local ranking, targeting is based on local SEO: for example, if a business is targeting local people, local SEO is done. If local SEO is not done, the website will not be ranked locally.
On December 10th, 2014, it was first implemented in the US and UK, with phase 2 following in other countries.

4) Humming bird


This update was based on semantic search results, which provide detailed, in-depth results. The ranking is based on the feedback of users: Google wanted to provide useful, in-depth results on the basis of people's responses, so a fact strongly accepted by people is supported strongly. The defect is that a wrong fact strongly supported by people will also be promoted by Google.
RankBrain is an advanced version of Hummingbird, released in 2015. It uses artificial intelligence to assess facts and updates itself automatically. With artificial intelligence it is easier to predict human behaviour, and updates happen automatically.