Sunday, 21 April 2019

Google Search Console



Google Search Console (previously Google Webmaster Tools) is a free web service from Google for webmasters. It allows webmasters to check the indexing status of their sites and optimize their visibility in search results.
On May 20, 2015, Google rebranded Google Webmaster Tools as Google Search Console. In January 2018, Google introduced a new version of Search Console, with a refreshed user interface and other improvements.

Features
  • Search Analytics
  • HTML Improvements
  • Crawl Errors
  • Fetch as Google
  • Sitemaps & Robots.txt Tester

Search Analytics


The Search Analytics report shows how a site performs in Google Search, reporting clicks, impressions, click-through rate (CTR), and average position for the queries and pages that bring visitors to the site.
It should not be confused with Google Analytics, a separate web analytics service offered by Google that tracks and reports website traffic, currently as a platform inside the Google Marketing Platform brand. Google launched that service in November 2005 after acquiring its developer, Urchin. Google Analytics is the most widely used web analytics service on the web, and it provides an SDK for gathering usage data from iOS and Android apps, known as Google Analytics for Mobile Apps.
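As an illustration, the sketch below computes the same aggregate metrics the Search Analytics report shows (clicks, impressions, CTR, average position) from a per-query export. The rows are made-up sample data, not real report output, and the impression-weighted average is only an approximation of how the report aggregates position.

```python
# Aggregate Search Analytics-style metrics from a per-query export.
# The rows below are hypothetical sample data, not real report output.
rows = [
    # (query, clicks, impressions, position)
    ("search console",   120, 3000, 4.2),
    ("webmaster tools",   45, 1500, 7.8),
    ("crawl errors",      10,  800, 12.5),
]

total_clicks = sum(r[1] for r in rows)
total_impressions = sum(r[2] for r in rows)
# CTR is clicks divided by impressions.
ctr = total_clicks / total_impressions
# Average position, weighted by impressions.
avg_position = sum(r[2] * r[3] for r in rows) / total_impressions

print(f"clicks={total_clicks} impressions={total_impressions} "
      f"ctr={ctr:.2%} avg_position={avg_position:.1f}")
```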

HTML Improvements
The HTML Improvements report helps improve how pages display on the search engine results page (SERP). If there are SEO issues on a site, this feature helps identify them: it finds problems such as missing metadata and duplicate content. For example, if a title tag or meta description is missing, this feature of Google Search Console will surface the problem so it can be fixed.
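A rough local sketch of the same kind of check, using only Python's standard library: it flags a missing title tag or meta description in an HTML page. The sample page is made up for illustration.

```python
from html.parser import HTMLParser

class MetadataChecker(HTMLParser):
    """Flag a missing <title> or meta description, the kind of issue
    the HTML Improvements report also surfaces."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.has_description = True

# Made-up sample page: it has a title but no meta description.
page = "<html><head><title>My Page</title></head><body>Hi</body></html>"
checker = MetadataChecker()
checker.feed(page)

issues = []
if not checker.has_title:
    issues.append("missing title tag")
if not checker.has_description:
    issues.append("missing meta description")
print(issues)  # → ['missing meta description']
```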

Crawl Errors
Crawl errors occur when a search engine tries to reach a page on your website but fails. Let’s shed some more light on crawling first. Crawling is the process in which a search engine tries to visit every page of your website via a bot. The bot finds a link to your website and starts discovering all your public pages from there. It crawls the pages, indexes their contents for use in Google, and adds all the links on those pages to the pile of pages it still has to crawl. Your main goal as a website owner is to make sure the search engine bot can reach every page on the site; when it cannot, the result is what we call a crawl error.
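The crawl step described above, where the bot gathers all the links on a page to add to its pile of pages still to crawl, can be sketched with the standard library's HTMLParser. A real crawler would then fetch each link and record failures (404s, server errors) as crawl errors; the sample page here is made up.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from a page, as a crawler does when it
    adds a page's links to its queue of pages still to crawl."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Made-up sample page with two crawlable links.
page = """<html><body>
  <a href="/about">About</a>
  <a href="/contact">Contact</a>
  <a name="anchor-without-href">skip me</a>
</body></html>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['/about', '/contact']
```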
Fetch as Google
Fetch as Google is one of the main features of Google Search Console. It lets you submit a URL and see how Google's crawler fetches and renders the page, since Google crawls every page before indexing it on the search engine results page. In effect, this tool lets you ask Google's search bots whether a given page can be indexed or not.
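One signal that determines whether a fetched page can be indexed is a robots noindex meta tag. A minimal local check for that tag might look like the sketch below; the sample page is made up.

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detect a <meta name="robots" content="noindex"> tag, one of the
    reasons a fetched page cannot be indexed."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# Made-up sample page that opts out of indexing.
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
checker = NoindexChecker()
checker.feed(page)
print("blocked by noindex" if checker.noindex else "indexable")
# → blocked by noindex
```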


Sitemaps & Robot.txt Tester

The robots.txt file provides valuable data to the search systems scanning the web. Before examining the pages of your site, search robots check this file, which lets them crawl more efficiently: you help search engines index the most important data on your site first. But this only works if robots.txt is configured correctly, which is where the robots.txt Tester comes in: it checks your file and lets you test whether a given URL is blocked from Google's crawlers. The Sitemaps report complements this by letting you submit an XML sitemap listing the URLs you want Google to crawl.
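Python's standard library can evaluate robots.txt rules the same way a crawler does, which is handy for checking a file locally before testing it in Search Console. The robots.txt content and the example.com URLs below are made up.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt: block /private/ for all bots, advertise a sitemap.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask the same question the robots.txt Tester answers: is this URL blocked?
print(parser.can_fetch("*", "https://example.com/private/report"))  # → False
print(parser.can_fetch("*", "https://example.com/public/page"))     # → True
```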
