Monday, 29 April 2019

Google Analytics

Google Analytics

Google Analytics is a free web analytics service and a useful tool for SEO. It is available to anyone with a Google account and offers an easy, free way to track and analyse the visitors to your website. You may have hundreds or thousands of visitors, but if you know nothing about them, that traffic is meaningless. Google Analytics helps turn visitors into customers. Besides information about your visitors, it provides key insights into how your website is performing and what more you need to do to meet your goals. You can track everything related to your site, e.g. how much traffic it gets and where that traffic comes from, and you can also monitor social media activity, track mobile app traffic, identify trends, and more.

Traffic Sources



In Google Analytics, Traffic Sources is a report that gives an overview of the different kinds of sources that send traffic to your website.
There are different kinds of traffic sources:
  1. Direct Traffic Sources
  2. Organic Traffic Sources
  3. Social Media Traffic Sources
  4. Referral Traffic Sources
  5. Ad Traffic Sources
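As a rough sketch, the way a session might be bucketed into one of these channels can be approximated in Python. The engine and network lists and the classification logic below are simplified assumptions for illustration, not Google's actual rules:

```python
from urllib.parse import urlparse

# Hypothetical, abbreviated lists for illustration only.
SEARCH_ENGINES = {"www.google.com", "www.bing.com"}
SOCIAL_NETWORKS = {"www.facebook.com", "twitter.com"}

def classify_traffic(referrer: str, is_paid: bool = False) -> str:
    """Bucket a session into a traffic-source channel from its referrer."""
    if is_paid:
        return "Ad"          # visitor came through a paid advertisement
    if not referrer:
        return "Direct"      # no referring website could be determined
    host = urlparse(referrer).netloc
    if host in SEARCH_ENGINES:
        return "Organic"     # unpaid search engine result
    if host in SOCIAL_NETWORKS:
        return "Social"      # social network or social media platform
    return "Referral"        # a link from some other website

print(classify_traffic(""))                                      # Direct
print(classify_traffic("https://www.google.com/search?q=seo"))   # Organic
```

A referrer from any host not in the two lists falls through to "Referral", mirroring how a direct session is recorded only when no other source can be determined.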

Direct Traffic Sources

Direct traffic consists of visitors who reach your site by typing its URL straight into the browser, i.e. by accessing the site directly.
A session is recorded as direct whenever Google cannot determine another referring source or channel.
Direct traffic is therefore also described as visitors with no referring website.

Organic Traffic Source

It covers visits generated by unpaid search results, not by paid ads.
Visitors who are considered organic find your website through a search engine such as Google or Bing, so they are not referred by any other website.

Social Media Traffic Sources


Social media traffic means visitors who came through social media. In other words, it refers to traffic arriving at your website, mobile site, or mobile apps from social networks and other social media platforms. Social media traffic can come from both paid and unpaid sources.
For example, Facebook traffic can come from paid ads, shared posts from your pages, and maybe even posts from a group.

Referral Traffic Sources



Referral traffic means visitors who came through referral links; it is how Google Analytics reports visits that arrived from one page to another. When a visitor clicks a hyperlink that leads to another website, Google Analytics tracks it as a referral from one site to the other and counts it as referral traffic.

Ad Traffic Sources


Ad traffic is one of the best ways to get visitors to your web pages: these visitors arrive through advertisements.
YouTube is a good example and makes ad traffic easy to understand: we watch an ad, and tapping on it takes us to the advertised page.



Sunday, 21 April 2019

Google Search Console

GOOGLE SEARCH CONSOLE


Google Search Console (previously Google Webmaster Tools) is a no-charge web service by Google for webmasters. It allows webmasters to check indexing status and optimize visibility of their websites.
On May 20, 2015, Google rebranded Google Webmaster Tools as Google Search Console. In January 2018, Google introduced a new version of the Search Console with a refreshed user interface and other improvements.

Features
  • Search Analytics
  • HTML Improvements
  • Crawl Errors
  • Fetch as Google
  • Sitemaps & Robots.txt Tester

Search Analytics


In Search Console, the Search Analytics report shows how your site performs in Google search results, including the queries that found it, clicks, impressions, and average position. It complements Google Analytics, the web analytics service offered by Google that tracks and reports website traffic, currently a platform inside the Google Marketing Platform brand. Google launched that service in November 2005 after acquiring its developer, Urchin.
Google Analytics is the most widely used web analytics service on the web. It provides an SDK for gathering usage data from iOS and Android apps, known as Google Analytics for Mobile Apps.

HTML Improvements
The HTML Improvements report deals with how pages display on the search engine results page (SERP). If there are SEO issues, this feature helps identify them: it finds problems such as missing metadata and duplicate content that would otherwise have to be found by reading the pages. For example, if a meta description or title tag is missing, this feature of Google Search Console will spot it easily.

Crawl Errors
Crawl errors occur when a search engine tries to reach a page on your website but fails at it. Let’s shed some more light on crawling first. Crawling is the process where a search engine tries to visit every page of your website via a bot. A search engine bot finds a link to your website and starts to find all your public pages from there. The bot crawls the pages and indexes all the contents for use in Google, plus adds all the links on these pages to the pile of pages it still has to crawl. Your main goal as a website owner is to make sure the search engine bot can get to all pages on the site. Failing this process returns what we call crawl errors.
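The crawl-and-index loop described above can be sketched with a small in-memory site graph. The page names are hypothetical, and a crawl error is simply a linked page the bot fails to reach:

```python
from collections import deque

# Hypothetical site: each page maps to the links found on it.
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/blog/post-1", "/missing-page"],  # broken link -> crawl error
    "/blog/post-1": [],
}

def crawl(start: str):
    """Breadth-first crawl: index reachable pages, record crawl errors."""
    index, errors = [], []
    queue, seen = deque([start]), {start}
    while queue:
        page = queue.popleft()
        if page not in site:
            errors.append(page)      # the bot tried and failed: crawl error
            continue
        index.append(page)           # index the page's content
        for link in site[page]:      # add new links to the pile still to crawl
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index, errors

indexed, errors = crawl("/")
print(errors)  # ['/missing-page']
```

The bot starts from one link, follows every public link it finds, and anything it cannot fetch ends up in the error list, just as in the process described above.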
Fetch as Google
Fetch as Google is one of the main features of Google Search Console. It helps ensure that web pages are user-friendly. Google crawls every page before publishing or indexing it on the search engine results page, and this tool lets you verify how a URL is analysed. It allows us to ask Google's search engine bots whether a page can be indexed or not.


Sitemaps & Robots.txt Tester

The robots.txt file provides valuable data to the search systems scanning the web. Before examining the pages of your site, search robots check this file, which lets them scan more efficiently: you help search systems index the most important data on your site first. This only works, however, if robots.txt is configured correctly. A sitemap complements it by listing the pages of your site so search engines can discover them, and Search Console lets you submit sitemaps and test your robots.txt file.
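Python's standard library can check what a robots.txt file allows. A minimal sketch with a hypothetical robots.txt for an example site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block /admin/, allow everything else.
rules = """
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

This is the same verification a search robot performs before examining your pages: any URL under the disallowed path is skipped.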

Wednesday, 10 April 2019

SEO On-page Optimisation

ON PAGE OPTIMISATION




On-page optimization refers to all the measures that can be taken directly within the website in order to improve its position in the search rankings. On-page SEO is the practice of optimizing individual web pages in order to rank higher and earn more relevant traffic in search engines. Examples of this include measures to optimize the content or improve the meta description and title tags.

A snippet is a result Google shows to the user in the search results, and it is the basis of on-page SEO. The snippet is a single search result in a set of search results and generally consists of a title, a URL, and a description of the page. Search engines often use pieces of your content to fill in the parts that make up the snippet: it is the brief extract, or listing entry, shown when we search for anything.



To increase our ranking, we must add relevant content and improve the meta description and title.

SEO on-page tips

  • Page Title
  • URL Structure
  • Meta Description
  • Head Tags
  • Keyword Density
  • Image Optimisation
  • Internal Linking


A page title, also known as a title tag, is a short description of a web page that appears at the top of a browser window and in SERPs. It is an important element of an optimized SEO page, and it should include the page's keyword.

  •  It should grab the user's attention.
  •  Don't use uppercase characters for the complete title.
  •  Use uppercase for the first letter of each word of the title.
  •  It should focus on the keyword, with no spelling or grammar mistakes.
  •  Pixel width should be within 512 pixels.
  •  Normally 55-60 characters, up to a maximum of 70.
  •  Do not give the same title to more than one web page; keyword cannibalization can result if the same title is used on multiple pages.
  •  Use a minimum of 3 words.
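The length rules above can be checked mechanically. A small sketch (the limits used are the ones listed here, not official Google values):

```python
def check_title(title: str) -> list:
    """Return a list of problems with a page title, per the rules above."""
    problems = []
    if len(title) > 70:
        problems.append("over 70 characters")
    if len(title.split()) < 3:
        problems.append("fewer than 3 words")
    if title.isupper():
        problems.append("entirely uppercase")
    return problems

print(check_title("BUY NOW"))
# ['fewer than 3 words', 'entirely uppercase']
print(check_title("Free SEO Audit Checklist for Beginners"))  # []
```

The pixel-width rule is omitted because it depends on font rendering, which plain character counting cannot capture.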


A URL (or URL address) is a special form of individual address for a certain resource on the Internet. It can refer to a website, a particular document, or an image. The Internet user just needs to enter this address into the location bar to find the needed website, document, folder, or image.
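Python's `urllib.parse` shows the parts such an address is made of, using a hypothetical URL:

```python
from urllib.parse import urlparse

# Hypothetical URL broken into its components.
url = "https://www.example.com/blog/seo-tips?page=2#comments"
parts = urlparse(url)

print(parts.scheme)    # https
print(parts.netloc)    # www.example.com
print(parts.path)      # /blog/seo-tips
print(parts.query)     # page=2
print(parts.fragment)  # comments
```

For SEO, the path is the part worth optimising: a short, readable, keyword-bearing path like `/blog/seo-tips` is clearer to both users and search engines.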


The meta description is a snippet of up to about 155 characters – a tag in HTML – which summarizes a page's content. Search engines show the meta description in search results mostly when the searched-for phrase is within the description, so optimizing the meta description is crucial for on-page SEO.

  •  Do not copy from other web pages.
  •  Character length for a page's meta description: 155-160 characters.
  •  Character length for a post's meta description: within 155 characters.
  •  Maximum pixel width: 1024 pixels.
  •  It should be relevant to the web page.
  •  If the meta description is missing or not relevant, Google takes content from the web page as the meta description.
  •  To improve CTR, we can add marketing strategies (offers, discounts, etc.) in the meta description.
  •  No grammatical or spelling mistakes.
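As with titles, the length limits above lend themselves to a mechanical check. A sketch using the limits listed here (which are guidelines from this post, not official values):

```python
def check_meta_description(description: str, is_post: bool = False) -> list:
    """Flag meta-description length problems, per the limits listed above."""
    problems = []
    limit = 155 if is_post else 160   # posts get the tighter limit
    if len(description) > limit:
        problems.append(f"over {limit} characters")
    if not description:
        problems.append("missing (Google will pull text from the page instead)")
    return problems

print(check_meta_description(""))
# ['missing (Google will pull text from the page instead)']
print(check_meta_description("A short summary."))  # []
```

Relevance and grammar, of course, still need a human reader; only the length rules are automatable this simply.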




The HTML <head> tag represents the head section of the HTML document. The <head> element can contain other HTML tags that hold metadata, which provides information about the document such as its title, description, and keywords.

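The metadata in the head section can be read with Python's built-in HTML parser. A sketch using a hypothetical page head:

```python
from html.parser import HTMLParser

# Hypothetical page head containing the two key on-page SEO elements.
page = """<head>
<title>SEO Basics</title>
<meta name="description" content="An introduction to on-page SEO.">
</head>"""

class HeadReader(HTMLParser):
    """Extract the title and meta description from an HTML head."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

reader = HeadReader()
reader.feed(page)
print(reader.title)        # SEO Basics
print(reader.description)  # An introduction to on-page SEO.
```

This is essentially what a crawler does when it builds the snippet: it pulls the title and description out of the head metadata.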




Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page. In the context of search engine optimization, keyword density can be used to determine whether a web page is relevant to a specified keyword or keyword phrase.
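The percentage described above is straightforward to compute. A minimal sketch for single-word keywords (multi-word phrases would need a sliding window, which is omitted here):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100 * hits / len(words)

text = "seo tips help seo beginners learn seo basics"
print(keyword_density(text, "SEO"))  # 37.5
```

Here the keyword appears 3 times in 8 words, giving 37.5%. A real page should sit far lower; a density this high would look like the keyword stuffing search engines penalise.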





Image optimization is about reducing the file size of your images as much as possible without sacrificing quality so that your page load times remain low. It's also about image SEO. That is, getting your product images and decorative images to rank on Google and other image search engines.





Internal links are links that go from one page on a domain to a different page on the same domain. They are commonly used in the main navigation. These types of links are useful for three reasons: they allow users to navigate a website, they help establish an information hierarchy for the website, and they help spread link equity (ranking power) around the site.
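The internal-versus-external distinction comes down to comparing hosts. A sketch with hypothetical links:

```python
from urllib.parse import urlparse

def is_internal(link: str, domain: str) -> bool:
    """True if the link stays on the same domain (relative links count too)."""
    host = urlparse(link).netloc
    return host == "" or host == domain

links = ["/about", "https://www.example.com/blog", "https://other.com/page"]
internal = [link for link in links if is_internal(link, "www.example.com")]
print(internal)  # ['/about', 'https://www.example.com/blog']
```

Relative links like `/about` have no host of their own, so they are treated as internal; anything pointing at a different host is an external (referral-generating) link.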

Sunday, 7 April 2019

History And Evolution

History and Evolution of SEO



SEO, a practice dating back to around 1997, is the process of improving a website's ranking or listing so that it appears on the first page of search results.

Search engine optimisation

Search engine optimisation is the process of affecting the online visibility of a website in a web search engine's unpaid results. In general, the earlier (or higher ranked on the search results page), and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers.

PROCESS OF SEO
There are three methods for using this process

1 CRAWLING

2 CACHING

3 INDEXING

Crawling
It is like a scanning process used to analyse a website, take in complete information about the site, and store it in a database.

Caching
It is a process that makes it easy to identify a link we have searched before, through a snapshot of the page.

Indexing
This process is like a library: it arranges the whole database according to our search, so if we search for a school, the data stored in the school category is returned.

WEBMASTERS
A webmaster is the custodian of a website, the one who can make any changes to it. Without the webmaster team, Google cannot change anything on a website.

SEO PRACTICING
There are three types of SEO practising. They are

1) Black hat
2) White hat
3) Grey hat

BLACK HAT - A non-ethical method. It is used to achieve results by shortcuts, but this method is illegal in the SEO world.
WHITE HAT - This is the genuine method used all over the world to get results; it takes time, but the results are genuine.
GREY HAT - This method is simply a mix of black hat and white hat.

Keyword Stuffing
This is the practice of repeating a target keyword (such as "SEO Expert in India") over and over on a page. In the early days it was used all over the world, and the quality of the Internet suffered as a result.

Keyword Meta Tag
Google stopped using this method in 2009 because people were abusing it.

Niche specific
It was one of the oldest techniques Google used to list websites. When we searched for a particular keyword on Google, the websites listed in the topmost positions would be those containing the exact keyword the most times, so webmasters started packing more keywords into their pages to get listed. This is the keyword stuffing described above, and it was the drawback of this technique.

Link specific
Google then introduced ranking based on links received from other sites, for example through a link exchange such as <a href="http://www.sahil.com">sahilseo</a>.
Google later changed the algorithm so that only quality links confer ranking; without such links a website could not rank.
At that time only a few sites held the top positions (Twitter, US government sites, and the Flash Player download link were among the top 10 ranked sites).

Passing Juice
After all this, Google decided that if a site shares its links too freely it will lose ranking; this flow of ranking value through links is called passing juice. To stop a link from passing juice, mark it with the nofollow attribute, e.g. <a href="http://sahilseo.kerala" rel="nofollow">sahilseo</a>.

PAY PER CLICK - When a person clicks on an ad, Google deducts money from the advertiser, and the publisher receives a commission for the ad.
COST PER CLICK - Each keyword has a separate cost.

Personalised search results - introduced in 2009: what we searched for on Google previously is used to shape the results we see on the next use, as long as we haven't signed out.

Pogo Sticking - if a visitor doesn't spend much time on one site but spends more time on another, the second site will rise to the 1st position.
Around 2010 the whole system changed. To increase the ranking of a website, you needed likes and page shares through social media such as Facebook and Twitter; comments or shares from influential people also helped improve a page's ranking.


UPDATIONS DONE BY GOOGLE FOR SEO



In 2011 Google introduced the Google Panda update to target duplicate content (plagiarism) and low-quality content; spelling mistakes can also be caught by the Panda update, so overall page quality improves.
Content spinning refers to copying the contents of one website and posting the same content, in a different order, on another website.
  • PANDA 4.0: released on May 19, 2014. With this release the Panda update was included in the algorithm as a permanent filter.


  • PENGUIN: rolled out in April 2012 to better catch sites deemed to be spamming Google's search results, in particular those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings.

  • PENGUIN 4.0: introduced in September 2016. Copying links from other websites is prohibited; if a link is copied, the website is now punished in real time. Excessive guest blogging is also caught and punished.


  • PIGEON: launched in 2014 and based on local SEO; in other words, it relates to locally visible results.


  • HUMMINGBIRD: rolled out in 2013 and based on semantic search, so the results go deeper into the meaning of the query.






  • RANK BRAIN: introduced in 2015. It acts as artificial intelligence and works along the lines of the Hummingbird update.

  • PARKED DOMAIN: from 2012, people started registering domain names without attaching a visible website to them. Once a person takes a domain, others cannot take it, so the owner can update the name servers and publish a site later.

  • EXACT MATCH DOMAIN (EMD): focuses on keywords in the domain itself. If we register a particular keyword as a domain, other websites can't take that domain name.

  • PIRATE: the content, photos, videos, etc. shared by a website cannot be copied by another website without permission.

  • MOBILEGEDDON: launched on April 21, 2015. It is a mobile-friendliness update: pages should be readable without tapping or zooming.