Google can find and index pages on the internet on its own. However, since there are over 1 billion websites, it can take time for Google to notice your pages, especially if your website is new.
This article explores different methods to submit URLs to Google's index so that Google crawls and indexes your URLs quickly, helping you get returns on your SEO investments faster. We'll also discuss how you can submit a website to Google Search.
How to submit a URL to Google?
Submit an XML sitemap through Search Console
Submitting links to Google through an XML sitemap is one of the first things to do when launching your website. The XML sitemap helps Google find and index relevant pages on your website quickly.
If you are using a CMS, you probably already have a sitemap file on your website; you just need to find its URL. On WordPress, you can install an SEO plugin such as Yoast or Rank Math; the plugin will create a sitemap file for your website, which you can access at yourdomain.com/sitemap_index.xml. If you're using Wix, Shopify, or Squarespace, you can find your sitemap at yourdomain.com/sitemap.xml.
Sitemaps are usually stored in one of these two locations, so check them both. If you don't have a sitemap, you can create one manually for your website.
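If you want to generate a sitemap yourself, here is a minimal sketch using only Python's standard library. The URLs and file layout are illustrative placeholders, not pulled from any particular site:

```python
# Minimal sketch: build a sitemap.xml string with the standard library.
# The page URLs below are placeholder examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    # xml_declaration requires Python 3.8+
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

sitemap = build_sitemap([
    "https://yourdomain.com/",
    "https://yourdomain.com/about",
])
print(sitemap)
```

Save the output as sitemap.xml at your site root so it is reachable at yourdomain.com/sitemap.xml.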
Once you have created an XML sitemap, go to Google Search Console and select Sitemaps under Index from the left bar. Paste your sitemap URL in the box and click on Submit.
Google will check the sitemap file and crawl the URLs listed in the sitemap. You just need to keep the sitemap file updated with links you want Google to index; Google will check the sitemap file for new links frequently.
This is the best way to submit URLs for indexing, as you can get notified if Google encounters any issues when crawling the URLs. You’ll also get to see when each page was crawled last time.
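Before submitting, it can help to sanity-check that your sitemap parses and lists the URLs you expect. A small sketch, assuming a sitemap in the standard sitemaps.org format (the sample XML below is made up):

```python
# Sketch: parse a sitemap document and list the URLs it contains.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://yourdomain.com/blog/first-post</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```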
Request a crawl through Google Search Console’s URL inspection tool
If you have already submitted a sitemap and yet Google hasn’t indexed some of your pages, you can use the URL inspection tool in Google Search Console. The tool will tell you if the URL has already been discovered, crawled, and indexed by Google or if there are any errors preventing Google from indexing the page.
To submit your URL, go to Google Search Console and click on URL Inspection in the left bar. Type in your URL and hit Enter. The next page will give you the details on URL discovery, crawling, and indexing on Google. You can click on Request Indexing to get your URL indexed by Google.
Note that indexing can take anywhere from a few days to a few weeks, and submitting the same URL multiple times offers no benefit. If you have many pages to submit, this method is also inefficient, and you won't be able to track the status of your requests. Submitting a sitemap remains the ideal method, as all search engines and crawlers can access it.
Ping Google about a new or updated sitemap (deprecated)
Google used to offer a ping service to let the search engine know about new or updated sitemaps. When you pinged Google, it initiated a fresh crawl of the sitemap. The method was only relevant for new and updated sitemaps; there was no point in re-submitting an unchanged or already crawled sitemap this way. Note that Google deprecated the sitemap ping endpoint in June 2023 and it no longer triggers crawls, so submit your sitemap through Search Console instead.
Historically, to ping Google about a new or updated sitemap, you would navigate to google.com/ping?sitemap=yourdomain.com/sitemap.xml, replacing yourdomain.com/sitemap.xml with your own sitemap URL. When the ping succeeded, Google returned a Sitemap Notification Received page.
Why isn’t Google indexing my website or URL?
Google may fail to index your website or URL even after you submit a sitemap and use the URL inspection tool. This can happen for various reasons, and the best place to find the reason is Google Search Console's URL Inspection tool itself.
Here’s how to check if your URL is indexed or not:
Go to Google Search Console and open the URL inspection tool. Enter your URL and hit Enter. Google will return a page with URL inspection data.
If Google hasn't discovered the URL or your website yet, the URL inspection tool will report that the URL is not on Google.
But if the URL is known to Google and yet is not indexed, there may be errors on your website. You'll have to find and fix them, then resubmit your pages to get them indexed.
Here are the most common reasons why URLs don’t get indexed by Google:
1. You’ve blocked Googlebot in robots.txt.
A robots.txt file tells crawlers which URLs they may crawl and which they may not. If none of your website's pages are getting indexed by Google, it is a good idea to check your robots.txt file by going to yourdomain.com/robots.txt.
If your robots.txt has the following lines of code, you have blocked all crawlers from accessing your website.
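For reference, the standard robots.txt rule that blocks every crawler from the whole site looks like this:

```text
User-agent: *
Disallow: /
```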
If you have the following code, you have blocked Google from crawling your site.
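Similarly, the standard rule that blocks only Google's crawler while allowing others looks like this:

```text
User-agent: Googlebot
Disallow: /
```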
You can remove these lines of code and resubmit your sitemap on Google Search Console. To find out if your robots.txt is properly configured, read our expert guide on robots.txt.
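If you'd rather test programmatically, Python's standard-library robotparser can tell you whether a given robots.txt blocks Googlebot from a URL. A sketch with an illustrative robots.txt (the paths here are made up):

```python
# Sketch: check whether a robots.txt blocks Googlebot from specific URLs,
# using the standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://yourdomain.com/blog/post"))     # allowed
print(parser.can_fetch("Googlebot", "https://yourdomain.com/private/page"))  # blocked
```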
2. You’ve used the noindex meta tag on your pages.
A noindex meta tag tells search engines not to index the page. If your page carries the noindex meta tag, Google can still crawl the page, but it will exclude the page from its index.
Here’s what the noindex meta tag looks like:
<meta name="robots" content="noindex">
You can check your page source to see if it has the noindex tag. If all of your pages have a noindex tag, check your CMS settings to see if you have disabled indexing for all your website pages.
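If you have many pages to audit, you can scan each page's HTML for a robots noindex meta tag. A minimal sketch using Python's standard-library HTMLParser; the sample page below is made up:

```python
# Sketch: detect a <meta name="robots" content="noindex"> tag in page HTML.
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        name = (d.get("name") or "").lower()
        content = (d.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    """Return True if the HTML contains a robots noindex meta tag."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

page = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
print(has_noindex(page))  # True
```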
3. Your page content offers low value.
Google has over 1 billion websites to index, with thousands of web pages competing on every topic. Naturally, Google chooses not to index pages that add no value for searchers.
If you've made sure that no technical errors are preventing Google from indexing your URL, the likely culprit is low-value content. Generally, pages with low word counts and pages that are near duplicates are considered low value. You can rewrite and improve the content of such pages to get them indexed by Google.