Discovered – Currently Not Indexed: How to Get Indexed by Google


There’s nothing more frustrating than creating valuable content on your website – only to realize Google isn’t indexing that page. You want your material to show up in relevant search results, and that will only happen if Google has crawled and indexed your website properly. 

Although there is no guarantee that Google will index your website and its individual pages, there are key reasons why it might not – and ways to encourage indexing. That’s what we want to talk about on the Foxxr blog today. 

First, let’s address four of the most common reasons why your page is discovered but not currently indexed by Google.

4 Reasons Your Content Isn’t Getting Indexed by Google (And How to Fix Them)

1. Orphaned Pages

Orphan pages report in Screaming Frog

Source: Screaming Frog

Does your website have pages that exist all by themselves? We call these “orphan” pages, and Google’s crawlers commonly overlook them. These pages have no other links on the website directing users to them, and they aren’t a part of the overarching website journey. 

Although standalone pages can be found by search engines, they stand a much better chance of being indexed if they are a part of a larger network. That’s why we always recommend that our clients prioritize internal website links. 

The more internal links you have pointing to a standalone page, the more likely Google is to deem that page valuable and index it. If you’re dealing with a page that isn’t indexed, try linking to it from the homepage and within other key web pages. Check out this tutorial on how to use the Screaming Frog SEO Spider to find orphan pages from three sources: XML sitemaps, Google Analytics, and Search Console.
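The underlying idea of that orphan-page check can be sketched in a few lines of Python: any URL that appears in your sitemap but is never the target of an internal link is an orphan candidate. The URLs below are placeholders; in practice, both sets would come from your XML sitemap and a crawl export.

```python
# Hypothetical URL sets -- in practice these come from your XML sitemap
# and a crawl of your site's internal links (e.g. a Screaming Frog export).
sitemap_urls = {
    "https://example.com/",
    "https://example.com/services",
    "https://example.com/blog/old-post",
}
internally_linked_urls = {
    "https://example.com/",
    "https://example.com/services",
}

# Orphan candidates: listed in the sitemap, but never linked from another page.
orphan_pages = sitemap_urls - internally_linked_urls
print(sorted(orphan_pages))  # ['https://example.com/blog/old-post']
```

Each URL that prints here is a page worth linking to from your homepage or other key pages.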

2. Robots.txt File

Another reason why Google might not index a page: it has a robots.txt block in place. 

Robots.txt files are useful when you want to instruct search engine crawlers on how to crawl your website. You can block certain pages from being crawled, prevent specific media files from appearing in search results, and essentially create “private” web content. 

However, robots.txt can backfire if you want a page to be indexed. You need to ensure that your blocks are being used properly, not preventing Google from crawling and indexing important content. 

If a plugin has configured your robots.txt file to disallow your entire site, Google will not be able to crawl it. Ensure that your robots.txt file does not contain the following lines:

User-agent: *

Disallow: /

If it does, that Disallow rule blocks crawlers from every page on the site, which causes serious problems for search engine indexing. 
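You can verify what your robots.txt rules actually allow without waiting on Google, using Python’s standard-library robots.txt parser. The rules and URL below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The problematic robots.txt from above: blocks every crawler from every page.
blocking_rules = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(blocking_rules.splitlines())

# Googlebot is refused access to the homepage -- nothing can be crawled.
print(parser.can_fetch("Googlebot", "https://example.com/"))  # False

# An empty Disallow line (or no rule at all) permits crawling instead.
open_rules = "User-agent: *\nDisallow:\n"
open_parser = RobotFileParser()
open_parser.parse(open_rules.splitlines())
print(open_parser.can_fetch("Googlebot", "https://example.com/"))  # True
```

Running a check like this against your live robots.txt is a quick way to confirm a plugin hasn’t silently locked crawlers out.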

3. Content Doesn’t Match With Searches


The truth is that Google indexes the content it deems useful. If no one is searching for the topic you’re writing about, then Google probably isn’t going to waste the time to crawl and index your particular blog post or webpage. 

So, if your page is not indexed, it could be that you’re writing about something Google deems irrelevant or unpopular in search results. This doesn’t mean the content can’t be found on the web or via social media. It just means it won’t appear in SERPs. 

In other words, if you want to get your page indexed, make sure it answers search queries. You need to create pages around what people are actively looking for online, not just the topics you deem appropriate or interesting. 

Furthermore, your content needs to actually add value to search results. If you’re just rehashing the same content that hundreds or thousands of other websites have already covered, then Google likely won’t deem it worth indexing. 

4. Slow-Loading Websites


Allow us to preface this section by stating that your website doesn’t have to be fast-loading in order to be indexed. However, Google is less likely to feature slow-loading websites at the top of its search results. 

Why? Well, Google doesn’t like to recommend pages that are hard to use and slow to load. Furthermore, a slow-loading page decreases the speed at which Google can crawl your site, which can lead to problems with proper indexing down the road. 

If your website doesn’t load quickly, consider upgrading to a faster server. If you’re worried about individual pages that aren’t indexed, run them through Google’s PageSpeed Insights tool. This can help you understand whether your problem lies with your server, bad connections, high payload sizes, or other loading issues. 

How to Ensure Your Pages Are Indexed by Google

✓ Write Great Content

This is arguably the most crucial thing you can do to increase your chances of getting Google to index your site. The search engine giant is always on the hunt for high-quality, meaningful, and accurate content to provide for searchers. 

If your content is subpar, why should Google bother crawling and indexing your pages? 

95 percent of B2B consumers trust businesses with great content

Source: visme.co

This doesn’t just mean producing content that is grammatically correct or error-free. It means producing content that answers searchers’ questions. Google’s ultimate job is to match searchers with answers, and it will likely index your page if it seems to answer an existing question. 

Furthermore, you need to write long-form content, not just brief little tidbits of information. Time and time again, our SEO experts have found that content needs to be over 1,000 words if it’s going to compete against other high-ranking sources. 

Have questions on creating SEO-friendly content? Don’t hesitate to reach out to the Foxxr content team for guidance and solutions. 

✓ Make Your Website User-Friendly


In the world of SEO, websites that are user-friendly are more likely to be search-engine friendly. Your website must be: 

  • Easy to navigate
  • Quick to load
  • Full of valuable information
  • Organized and not distracting 

If a regular, human user cannot find what they’re looking for on your website, why should Google bother trying to crawl and index your pages? Ensure that you’re doing everything in your power to make your website usable and easy to navigate. This will improve your bounce rates and increase your chances of proper indexing. 

✓ Update Your Meta Tags

Sometimes, meta tags are set to noindex,nofollow on your website’s backend. As a result, that page won’t be indexed and may not be crawled by Google in the future. This can happen if you’re using a plugin that blocks Google from seeing certain parts of your website. 

The solution: change meta tags that read noindex,nofollow to index,follow (or simply remove them, since indexing and following links is Google’s default behavior). This can be a time-consuming change to make if it applies to hundreds of pages, but it’s crucial if you need those pages to be indexed. 
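A quick way to audit pages for this problem is to scan their HTML for a robots meta tag, which Python’s standard-library HTML parser can do. The page snippet below is a made-up example of what a blocking plugin might leave behind:

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

# Hypothetical page head with a blocking directive left behind by a plugin.
html = '<head><meta name="robots" content="noindex,nofollow"></head>'
checker = RobotsMetaChecker()
checker.feed(html)

blocked = any("noindex" in d for d in checker.robots_directives)
print(blocked)  # True -- this page asks Google not to index it
```

Any page where this flags True needs its meta tag fixed before Google will index it.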

✓ Submit a Full Sitemap to Google


This sounds complicated, but trust us – it’s not as difficult as it sounds. 

Google is more likely to index your individual pages if it has a better idea of how they fit into your website as a whole. A tool like Yoast can generate a full XML sitemap for your site, which you can then submit through Google Search Console to invite Google to come and crawl your website. 
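If you’re curious what a sitemap actually contains, it’s just an XML list of your URLs in the sitemaps.org format. Here’s a minimal sketch using Python’s standard library (the URLs are placeholders; a tool like Yoast generates this for you automatically):

```python
import xml.etree.ElementTree as ET

# Hypothetical list of pages to include in the sitemap.
pages = ["https://example.com/", "https://example.com/services"]

# The standard sitemap namespace defined by the sitemaps.org protocol.
ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=ns)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page  # each <loc> holds one page URL

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Saved as sitemap.xml at your site’s root, a file like this is what you submit in Search Console.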

✓ Ask Google for a Page Recrawl

Did you know you can actually request that Google index your page? There’s a tool for that. 

If you wrote a blog post that you know is valuable, but that isn’t being indexed for some reason, consider asking Google to recrawl the URL through the URL Inspection tool in Google Search Console. The search engine sometimes just misses pages or doesn’t realize you’ve made changes. Requesting a recrawl is a good way to say, “Hey Google, why aren’t you indexing this page?”

Keep in mind that it may take days or even weeks for you to see changes in indexation after submitting a page. Requesting a recrawl multiple times won’t speed things up. 

The Wrap Up

There are many factors that play into your page’s chances of being indexed by Google. The more you pay attention to the tips we’ve listed above, the better the odds are of having Google crawl and index your best content. 

Need Help Troubleshooting Your Index Problems?

Even stellar content can be overlooked by Google – and that’s a problem for your business. If you’re not sure what to do about a page that is discovered but not indexed, it might be time to talk to a professional. 

At Foxxr, our goal is to get your brand directly in front of online searchers. We specialize in optimizing websites for search engines, and that includes assessing pages that aren’t indexing for various reasons. Our team will improve your keyword usage, help you cultivate excellent content, and grab the attention of Google’s powerful crawlers.  

For more information, don’t hesitate to call 727-379-2207 or reach out online. You don’t have to deal with all of your SEO headaches alone. 


Brian Childers

Brian Childers is the Founder and CEO of Foxxr Digital Marketing, a St. Petersburg, FL-based agency specializing in home services marketing. With a proven track record of success, Brian leads a team of digital marketing experts who empower home service businesses to achieve significant growth through targeted lead generation and revenue-boosting strategies.