Submit Google Sitemap Directly To Robots.txt File Of Your Blogspot Blog
A sitemap holds all the posts of your blog, so every time you publish something new, your sitemap updates and Google picks up the change. You can simply submit your RSS feed link to Google Webmaster Tools, as shown in the screen below.
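The screenshot is not reproduced here. For a standard Blogspot blog, the feed link you would submit usually follows this pattern (the domain is this blog's own; treat the exact path as an assumption, since Blogger exposes several equivalent feed URLs):

```
http://www.getaheadindia.in/feeds/posts/default
```

Paste this into the sitemap submission box of your Webmaster Tools account.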
But here is the problem: the default Google XML sitemap file contains only the latest 26 posts. That means Google will not crawl any pages beyond the latest 26, and those older posts will not get indexed. You can fix this issue just by adding a few extra lines of code.
Simply submit the line above in the Google sitemap section and relax. This line tells the Google bot to index posts from post 1 up to post 500. If your blog has more than 500 posts, add one more line like the one below, and so on.
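The original lines are not reproduced here. A commonly used Blogger sitemap entry, assuming the standard atom feed parameters, looks like this (the second line covers posts 501 to 1000):

```
atom.xml?redirect=false&start-index=1&max-results=500
atom.xml?redirect=false&start-index=501&max-results=500
```

Each line is submitted as a separate sitemap; the start-index parameter moves the window forward by 500 posts each time.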
Besides that, you can also edit the Custom robots.txt section of your Blogger settings and directly paste the code below. This will help the Google bot find your complete sitemap every time it visits your website.
Your default robots.txt section will then be updated with the code below.
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.getaheadindia.in/feeds/posts/default?orderby=UPDATED
Check your Webmaster account under Health >> Blocked URLs.
You can simply replace this code with the below one.
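The replacement code is not reproduced here. A commonly used version, assuming the same blog URL and the paged atom feed described earlier, swaps the default feed sitemap for one that covers the first 500 posts:

```
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.getaheadindia.in/atom.xml?redirect=false&start-index=1&max-results=500
```

For blogs with more posts, repeat the Sitemap line with start-index=501, start-index=1001, and so on.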
Now add the Sitemap line multiple times, depending on your blog's total number of posts. This will tell the Google bot to crawl your entire blog on every visit, without any manual pinging.