Google can find, crawl, and index web pages automatically, even without a sitemap. This helps web admins get their pages found and indexed by Google without much effort. But it also means unwanted pages can get indexed, such as your staging site, duplicate content, or outdated content.
You can also run into situations where de-indexing is unavoidable, like when sensitive or confidential content from your website starts showing up on Google. And the worst case is when a hacker gains access to your website, creates several spammy pages, and they all get crawled by Google.
This article explores the different methods to remove URLs from Google and suggests the ideal method for each situation.
How can I remove a link from Google Search?
You can remove links from Google Search in five different ways, depending on the type of content you want to remove.
When the page needs to remain accessible for website visitors
Adding a noindex meta tag in the head section of the page is the best option here: you get to keep the page accessible for visitors, yet it won’t appear in Google SERPs. The noindex meta tag tells crawlers not to index the page, and Google respects it.
But if the page has some backlinks and is getting organic traffic, using this method will result in losing the organic traffic and the value of backlinks.
In such cases, you can use a canonical tag instead of noindex to tell Google to pass the traffic and authority to a page with content similar to the one you want removed. The canonical tag tells Google which version of a page to index, so pointing it at a similar page asks Google to index that page instead of the one you want to remove.
Note that the canonical tag is a hint and not a directive, so you must canonicalize to a page that is very similar to the one you want to remove from Google. Alternatively, you can 301 redirect the page to one with similar content, but that also means neither Google nor your website visitors will be able to access the original page.
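Here’s what the canonical tag looks like. The href value below is a placeholder; replace it with the similar page you want Google to index instead:

<link rel="canonical" href="https://yourdomain.com/similar-page/">

This goes in the head section of the page you want to remove from Google.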
Although the noindex meta tag tells Google not to index the page, it doesn’t make the page inaccessible to crawlers. Google and most other search engines respect robots meta tags, so this will keep the page out of their indexes for as long as the tag remains in place.
Here’s what the HTML noindex meta tag looks like:
<meta name="robots" content="noindex">
This code should go in the head section, i.e., between <head> and </head>, on the page you want to remove from Google search. If the page already has a robots meta tag with content="index", just change "index" to "noindex".
Note that since the page is already indexed, Google will remove the page from its index once it recrawls the page and finds the noindex meta tag.
When the page needs to be inaccessible for both visitors and Google
When the page doesn’t need to be accessible to either visitors or Google, you can handle it in two different ways depending on the situation. In both cases, you can also delete the page itself if you want.
The first method is to implement a 301 redirect to the most closely related page on your website. A 301 redirect is a permanent redirect that passes all link equity to the destination page. So this method is suitable when the page is getting organic traffic or has some backlinks.
However, redirecting to a relevant or closely related page is critical here as Google may consider it a soft 404 error otherwise.
The second method is to use a 410 status code, telling Google that the page is permanently removed. As you can guess, a 410 status code is a dead end and doesn’t pass link equity, so this method is suitable when the page isn’t getting any traffic and has no backlinks.
Google is usually very quick to remove links with a 410 status code from its index.
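If your site runs on Apache, both status codes can be set with a couple of lines in your .htaccess file. This is only a sketch; the paths and domain below are placeholders:

Redirect 301 /old-page/ https://yourdomain.com/related-page/
Redirect gone /removed-page/

The Redirect gone directive makes the server return a 410 status code for that path. On other servers (nginx, IIS, or via a CMS plugin), the equivalent settings live elsewhere, but the status codes behave the same way.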
To remove spam URLs or staging URLs immediately from Google Search
If your staging site pages got indexed, or if your website got hacked and the hacker added several spam pages, you have to make it clear to Google that those pages should not be displayed in search results at all. For this, you can use Google Search Console’s URL removal tool, provided you own or control the Google Search Console property of the website.
When you remove a URL through Google Search Console, Google will hide it from search results for 180 days and clear the cached version of the web page. This method doesn’t delete the URL from Google’s index; it only takes the URL off Google search results. So to completely remove it from Google, you need to delete the page from your website, use the robots noindex meta tag, or 301 redirect it to another page.
Here’s how you can ask Google to remove a link from search results:
Go to Google Search Console and select Removals under Index in the left bar. Click on the New Request button.

It’ll open a pop-up window where you can enter the page URL.

Enter the URL of the page you want to remove from search results and click on NEXT. Google will now hide the URL from search results for 180 days. You can also cancel the removal request anytime by accessing Removals in Google Search Console.
If you just want to clear the current snippet and the Google cached page but don’t want to remove it from Google, you can go to the Clear Cached URL tab in the pop-up window and enter the URL there.
● When you are dealing with spam URLs, remove them from Google through Search Console. Also, delete the spam pages from your website and return a 410 status code to make it clear to Google that those pages are gone forever. Additionally, create an XML sitemap containing all the spam URLs and submit it to Search Console; Google will discover the 410 status codes quickly when crawling the URLs in the sitemap. You can remove the sitemap once all the spam URLs are deindexed.
● When you are dealing with staging URLs that got indexed, you can use the URL removal tool and then 301 redirect the staging URLs to their production counterparts. This way, Google will consolidate the ranking signals to the production URLs. Alternatively, you can use a noindex tag if that fits your situation better. Make sure you don’t use robots.txt to disallow the staging site, as it can prevent Google from finding the 301 redirects or the noindex meta tag.
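For the spam cleanup, the temporary sitemap is just a plain XML file listing the removed URLs. The URLs below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/spam-page-1/</loc></url>
  <url><loc>https://yourdomain.com/spam-page-2/</loc></url>
</urlset>

Upload it to your server and submit it under Sitemaps in Search Console; once Google has recrawled the URLs and seen the 410s, you can delete the file.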
To remove confidential information from Google
When you want to remove URLs leading to confidential information, the ideal option is to store all those pages inside a password-protected directory on your server. You can use a login system or HTTP authentication to control access to the content. This is by far the most effective and secure way to keep private URLs out of Google.
Alternatively, you can use IP whitelisting to allow only specific IP addresses to access the pages. This way, a group of users you allow will be able to access the pages, and search engines will not be able to access and index them. This method is ideal for internal networks, member-only content, and for staging sites.
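As a sketch, assuming an Apache server, both approaches can be configured in an .htaccess file inside the protected directory. The password file path and IP range below are placeholders:

# Option 1: HTTP authentication
AuthType Basic
AuthName "Restricted area"
AuthUserFile /path/to/.htpasswd
Require valid-user

# Option 2: allow only specific IP addresses
# Require ip 203.0.113.0/24

Because the server refuses the request outright, Googlebot can never fetch the content, so there is nothing for it to index.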
Removing links from Google using robots.txt
You can also try to remove URLs from Google Search by disallowing them in your robots.txt file. However, this is not a foolproof method, so you cannot depend on the robots.txt file alone to get your links removed from Google.
A robots.txt disallow rule blocks crawling, not indexing. Reputable crawlers like Googlebot honor robots.txt, but a disallowed page can still get indexed (typically without a snippet) if other websites link to it. And since Google can’t crawl the page, it will never see a noindex meta tag you place there.
Here’s the robots.txt rule that asks all crawlers not to crawl URLs under yourdomain.com/do-not-index/:
User-agent: *
Disallow: /do-not-index/
To learn about creating, editing, and properly configuring a robots.txt file, we recommend reading our complete guide on robots.txt.