Why Are My Pages Discovered But Not Indexed? Understanding & Fixing Common Indexing Issues

If you’re struggling with page indexing on your site, you’re not alone. This guide delves into why Google may discover your pages but leave them unindexed, along with strategies to fix this issue.

The “Indexing” Report in Google Search Console (GSC)

Google Search Console’s “Indexing” report provides insight into which pages Google has crawled and indexed, as well as those it hasn’t. This report highlights pages successfully indexed and flags issues on others, helping website managers understand why certain pages are not appearing in search results.

Is It a Problem if a Page Isn’t Indexed?

Not all unindexed pages indicate an issue. For example, site owners might deliberately exclude pages from search engines using the noindex tag or the robots.txt file to control which pages appear in search results. These intentional exclusions appear in the “Excluded” section of the report. However, pages excluded without any such directive may require attention, especially if they’re intended for search visibility.
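For reference, intentional exclusions are usually declared in a robots.txt file like the one sketched below (the path shown is a placeholder, not a recommendation):

```text
# robots.txt — tells crawlers not to crawl matching paths
User-agent: *
Disallow: /internal-search/
```

Alternatively, a page can stay crawlable but be kept out of the index with a `<meta name="robots" content="noindex">` tag in its `<head>`. Note the difference: robots.txt blocks crawling, while noindex blocks indexing, and Google can only see a noindex tag on a page it is allowed to crawl.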

Common Indexing Issues

Google does not index every URL it discovers, especially when it runs into errors or perceives low quality. Your main concern should be ensuring that the key pages you want indexed are accessible to Google. GSC flags issues such as:

  • Server Errors (5xx): Indicate server problems, such as HTTP 500 responses, that prevent Googlebot from crawling the page.
  • Soft 404 Errors: Occur when a page returns a success status but its content is so thin or error-like that Google treats it as a “not found” page rather than indexing it.

If these issues affect critical pages, they might indicate larger structural problems with the site.
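Soft 404s in particular can be screened for in your own crawl data. The sketch below is only illustrative: the word-count threshold and the “not found” phrases are assumptions, not Google’s actual criteria.

```python
# Sketch: flag pages that return HTTP 200 but look like error or empty pages.
# The threshold and phrase list are illustrative guesses, not Google's rules.

NOT_FOUND_PHRASES = ("page not found", "no longer available", "404")

def is_soft_404_candidate(status_code: int, page_text: str,
                          min_words: int = 50) -> bool:
    """Return True if a 200 response looks like it might be a soft 404."""
    if status_code != 200:
        return False  # real 404s/500s are reported separately in GSC
    text = page_text.lower()
    if any(phrase in text for phrase in NOT_FOUND_PHRASES):
        return True
    return len(text.split()) < min_words  # too little content to index

# Example: a 200 page whose body just says the item is gone
print(is_soft_404_candidate(200, "Sorry, this product is no longer available."))
```

Running a function like this over a site crawl can surface candidate pages to review before Google flags them.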

What Is “Discovered – Currently Not Indexed”?

This status means Google has found the URL but hasn’t yet crawled or indexed it. Google’s explanation is that its bots aim to crawl without overwhelming site servers. Because Googlebot competes with your site’s regular traffic, it limits its crawl rate to avoid overloading the server. Pages labeled “Discovered – currently not indexed” are typically waiting for crawl resources to free up, or may be skipped due to perceived quality concerns.

Causes for “Discovered – Currently Not Indexed”

Two main reasons account for this status:

  1. Server Constraints: Googlebot paces itself to prevent server strain.
  2. Page Quality Concerns: Google often bases quality assumptions on other pages across your site. If low-quality or duplicate pages are abundant, Google may deprioritize indexing new pages.

How to Fix “Discovered – Currently Not Indexed” Pages

Though no quick fix guarantees indexing, here are several steps to improve your chances:

1. Confirm Indexing Status

Search Google for site:yourdomain.com followed by the specific URL path. If the page appears in the results, it is indexed. Also check the “Last updated” date in GSC’s report, as the report data may lag behind the page’s actual status.

2. Assess Overall Site Quality

Poor site quality can impact indexation. Quality extends beyond content—Google also evaluates layout, design, images, page speed, and current SEO trends. To assess quality:

  • Conduct a website audit comparing your site to competitors.
  • Improve layout, speed, and image use to boost user experience.

3. Eliminate Duplicate Pages

Duplicate or low-value pages can negatively impact site indexation. Pages with multiple URLs, like those with trailing slashes (e.g., /contact-us vs. /contact-us/) or URL parameters (e.g., ?color=red), are viewed as separate pages by Google. Check for:

  • Duplicate content across URLs.
  • Parameterized URLs, common in eCommerce, that create near-duplicates by filtering options like color.
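One common remedy for URL variants is a canonical tag, which tells Google which version to treat as the primary one. A minimal example (the URL is a placeholder):

```html
<!-- In the <head> of every variant (/contact-us, /contact-us/, ?color=red, …) -->
<link rel="canonical" href="https://www.example.com/contact-us/">
</html-comment>
```

For trailing-slash duplicates, a server-side 301 redirect to one consistent form is an alternative worth considering, since it consolidates the variants entirely rather than just hinting at a preference.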

4. Convey Page Importance

If certain pages are intended for indexing, communicate their significance to Googlebot:

  • Add to XML Sitemap: Include essential pages in your sitemap to prioritize them.
  • Internal Links: Link to these pages from other high-traffic pages or navigation menus to show they’re important.
  • External Backlinks: Links from authoritative external sites can signal the page’s value. For example, a fashion blog backlink to a product page may indicate quality.
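For the sitemap step above, a minimal XML sitemap entry looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/important-page/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Once the file is live (commonly at /sitemap.xml), you can submit it in GSC under the “Sitemaps” section so Google knows where to find it.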

5. Submit URL for Crawling

After making improvements, use GSC’s “URL Inspection” tool to request crawling of the updated URLs. If a URL still appears under “Discovered – currently not indexed” after a few weeks, additional issues may still exist.

Summary

Improving your site for crawling and indexing involves quality audits, addressing technical issues, and signaling page importance. By prioritizing high-quality pages and reducing duplicates, you can improve your site’s crawlability and increase the likelihood that critical pages progress from “Discovered – currently not indexed” to “Indexed.” This systematic approach ultimately enhances your visibility in search results.