You've built your website, but it's not showing up on Google. Sound familiar? Google needs to discover, crawl, and index your pages before they appear in search results. Here's how to make it happen.
Why Submit Your Website to Google?
Google discovers pages by following links and crawling the web. But for new websites, this process can take weeks or even months. Manually submitting your site speeds up discovery significantly. Here's what happens when you submit:
- Faster discovery: Instead of waiting for Googlebot to find your site, you tell Google directly
- Priority crawling: Submitted URLs are typically crawled sooner than organically discovered ones
- Indexing confirmation: You can verify whether your pages are actually indexed
- Error detection: Google Search Console reveals crawl and indexing issues
Step 1: Set Up Google Search Console
Google Search Console (GSC) is the free tool Google provides for website owners to monitor search performance. It's the most important tool for getting indexed.
How to set up GSC:
1. Go to search.google.com/search-console and sign in with your Google account
2. Click "Add Property" and enter your website URL (use the Domain option for full coverage)
3. Verify ownership via DNS record, HTML file upload, HTML meta tag, or Google Analytics
4. Wait for verification to complete (DNS can take up to 48 hours)
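If you pick the HTML meta tag method, Search Console gives you a tag to paste into your homepage. The token below is a placeholder; use the exact tag Search Console generates for your property:

```html
<!-- Paste inside <head> on your homepage and keep it there after verification. -->
<!-- "YOUR_VERIFICATION_TOKEN" is a placeholder, not a real token. -->
<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
```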
Step 2: Submit Your Sitemap
A sitemap is an XML file that lists all the pages on your website. It helps Google understand your site structure and find all your important pages.
To submit your sitemap:
1. In Google Search Console, go to Sitemaps in the left menu
2. Enter your sitemap URL (e.g., https://yoursite.com/sitemap.xml)
3. Click Submit
4. Check the status; it should show "Success" after processing
Pro tip: Most modern frameworks (Next.js, WordPress, etc.) generate sitemaps automatically. Make sure yours is at /sitemap.xml and referenced in your robots.txt.
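If your framework doesn't generate a sitemap for you, producing one is straightforward. Here's a minimal sketch using Python's standard library; the URLs are placeholders, and a real sitemap would usually add fields like lastmod:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    # Sitemap elements live in this namespace per the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Placeholder URLs; list your real pages here.
sitemap = build_sitemap([
    "https://yoursite.com/",
    "https://yoursite.com/blog/first-post",
])
print(sitemap)
```

Serve the output at /sitemap.xml, and point crawlers to it from robots.txt with a Sitemap: line.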
Step 3: Use URL Inspection Tool
The URL Inspection tool lets you check individual URLs and request indexing for specific pages. This is the fastest way to get a new page indexed.
How to request indexing:
- Paste your page URL into the inspection bar at the top of Search Console
- Wait for the live test to complete
- Click "Request Indexing" to add it to Google's crawl queue
- Repeat for each important page (homepage, key landing pages, blog posts)
Step 4: Check Your robots.txt
Your robots.txt file tells crawlers which pages they can and cannot access. A misconfigured robots.txt is one of the most common reasons sites don't get indexed.
Warning: If your robots.txt contains Disallow: /, Google cannot crawl any page. Make sure important pages are allowed.
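You can sanity-check your rules locally with Python's standard-library robots.txt parser. Note that urllib.robotparser follows the original robots.txt convention rather than every nuance of Google's longest-match handling, so treat this as a quick check; the rules and URLs below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block the /admin/ area, allow everything else.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot falls back to the "*" group since no group names it directly.
print(parser.can_fetch("Googlebot", "https://yoursite.com/admin/settings"))  # blocked
print(parser.can_fetch("Googlebot", "https://yoursite.com/blog/my-post"))    # allowed
```

For an authoritative answer, use the robots.txt report in Search Console, which applies Google's own matching rules.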
Step 5: Monitor Indexing Status
After submitting, monitor your indexing status in Google Search Console. Go to Pages in the left menu to see:
- How many pages are indexed vs not indexed
- Reasons pages aren't being indexed (e.g., noindex tag, crawl errors, redirect loops)
- Whether Google can render your pages properly
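A stray noindex meta tag is a frequent culprit behind "not indexed" pages, and it's easy to scan for. A small sketch using Python's standard-library HTML parser; the helper name and sample HTML are illustrative:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Sets .noindex when the page carries <meta name="robots"> with noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        # Valueless attributes parse as None, so normalize to empty strings.
        attr = {k: (v or "") for k, v in attrs}
        if attr.get("name", "").lower() == "robots" and "noindex" in attr.get("content", "").lower():
            self.noindex = True

def has_noindex(html: str) -> bool:
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex

print(has_noindex('<head><meta name="robots" content="noindex, follow"></head>'))  # True
```

Run this over your page source (or fetch it with your HTTP client of choice) before requesting indexing, and remember that an X-Robots-Tag HTTP header can also carry noindex.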
Common Indexing Issues & Fixes
"Discovered — currently not indexed"
Google found the page but hasn't crawled it yet. Improve content quality and add internal links to signal importance.
"Crawled — currently not indexed"
Google crawled but chose not to index. This usually means low-quality or duplicate content. Improve the page or consolidate with similar content.
"Blocked by robots.txt"
Your robots.txt is preventing Google from crawling the page. Update the file to allow access.
"Page with redirect"
The URL redirects to another page. Make sure internal links point to the final destination URL.
Check If Your Site Is Indexable
Use BoostLogik's free SEO analyzer to check if your pages are crawlable, if your robots.txt is configured correctly, and if your meta tags allow indexing.