- Soft 404: Pages returning 200 but appearing empty to Google
- Discovered - currently not indexed: Google found pages but hasn't indexed them yet
- Duplicate without user-selected canonical: Missing canonical URLs
- Page with redirect: Inconsistent URL handling
File: src/app/blog/[slug]/page.tsx
Added generateStaticParams() function to pre-render all blog posts at build time:
```typescript
export async function generateStaticParams() {
  // Fetch all blog slugs and generate static pages at build time;
  // this ensures Google gets fully rendered HTML.
  // getAllPosts() stands in for the project's actual data-access helper.
  const posts = await getAllPosts();
  return posts.map((post) => ({ slug: post.slug }));
}
```

Benefits:
- Faster page loads
- Fully rendered HTML for crawlers
- Better SEO performance
- Eliminates "Soft 404" errors
Changed the fetch caching strategy from `cache: 'no-store'` to `next: { revalidate: 3600 }`:
- Blog posts are cached for 1 hour
- Reduces server load
- Faster page loads
- More reliable for crawlers
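The updated fetch call might look like this; it is a Next.js fragment, and the endpoint URL and response handling are illustrative, not taken from the source:

```typescript
// Cache the response and revalidate at most once per hour (3600 s)
const res = await fetch(`https://api.example.com/posts/${slug}`, {
  next: { revalidate: 3600 },
});
const post = await res.json();
```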
Added to all blog pages:
```typescript
robots: {
  index: true,
  follow: true,
  googleBot: {
    index: true,
    follow: true,
    'max-video-preview': -1,
    'max-image-preview': 'large',
    'max-snippet': -1,
  },
}
```

Canonical URLs:

```typescript
alternates: {
  canonical: blogUrl,
}
```

File: src/app/sitemap.ts
Created a new native Next.js sitemap that:
- Automatically generates XML sitemap
- Includes all static and dynamic pages
- Uses proper TypeScript types
- Updates automatically on build
Benefits:
- More reliable than custom route handler
- Better integration with Next.js
- Automatic revalidation
- Type-safe implementation
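A sketch of what `src/app/sitemap.ts` could contain; `getAllPosts()` is a hypothetical data-access helper, and the exact fields per entry are assumptions rather than details from the source:

```typescript
import type { MetadataRoute } from 'next';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  // Hypothetical helper that returns all blog posts with slug and updatedAt
  const posts = await getAllPosts();
  const blogEntries = posts.map((post) => ({
    url: `https://emrhngngr.tech/blog/${post.slug}`,
    lastModified: post.updatedAt,
  }));
  return [
    { url: 'https://emrhngngr.tech', lastModified: new Date() },
    ...blogEntries,
  ];
}
```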
File: next.config.ts
Added:
```typescript
trailingSlash: false, // Prevents duplicate URLs
output: 'standalone', // Better for production
async headers() { // SEO-friendly headers
  return [
    {
      source: '/:path*',
      headers: [
        { key: 'X-Robots-Tag', value: 'index, follow' }
      ],
    },
  ];
}
```

File: public/robots.txt
Changed:
- `Crawl-delay: 1` → `Crawl-delay: 0` (faster crawling)
- Ensured the sitemap URL is correct
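With those changes, the full robots.txt would look roughly like this; the User-agent and Allow lines are assumed, and only the crawl-delay value and sitemap URL come from the changes described above:

```
User-agent: *
Allow: /
Crawl-delay: 0

Sitemap: https://emrhngngr.tech/sitemap.xml
```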
File: src/app/sitemap.xml/route.ts
Enhanced error handling and data validation:
- Better error logging
- Filters out invalid entries
- Proper cache strategy
- More reliable data fetching
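The entry-filtering step can be sketched as a small pure helper; the `SitemapEntry` shape and the function name are assumptions for illustration, not taken from the source:

```typescript
type SitemapEntry = { url: string; lastModified?: string };

// Keep only entries with a well-formed absolute URL and, when present,
// a parseable lastModified date; log and drop everything else.
export function filterValidEntries(entries: SitemapEntry[]): SitemapEntry[] {
  return entries.filter((entry) => {
    try {
      new URL(entry.url); // throws on malformed URLs
    } catch {
      console.error(`Skipping invalid sitemap URL: ${entry.url}`);
      return false;
    }
    if (entry.lastModified && Number.isNaN(Date.parse(entry.lastModified))) {
      console.error(`Skipping entry with unparseable date: ${entry.url}`);
      return false;
    }
    return true;
  });
}
```

Filtering before rendering the XML keeps a single bad database row from invalidating the whole sitemap.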
```shell
npm run build
npm start
```

Visit: http://localhost:3000/sitemap.xml
Check that all blog URLs are present and valid.
Push changes to GitHub and deploy via Vercel.
- Go to Google Search Console
- Use URL Inspection tool
- Test the live URL for each blog post
- Click "Request Indexing"
- Go to "Sitemaps" in GSC
- Remove old sitemap if any
- Add: https://emrhngngr.tech/sitemap.xml
- Submit
- Go to "Page Indexing" report
- Click "Validate Fix" for each issue:
- Soft 404
- Discovered - currently not indexed
- Duplicate without canonical
- Page with redirect
For immediate results:
```
# Use Google's URL Inspection tool for each blog URL:
https://emrhngngr.tech/blog/your-blog-slug
```

- Sitemap recognized by Google
- Pages start appearing in "Discovered" state
- Validation process begins
- Pages move from "Discovered" to "Indexed"
- Soft 404 errors resolved
- Duplicate content issues fixed
- Full indexing of all blog posts
- Improved search rankings
- Better crawl efficiency
Add more internal links between blog posts to help Google discover and understand content relationships.
Ensure each blog post has:
- ✅ Unique title (under 60 characters)
- ✅ Meta description (under 160 characters)
- ✅ High-quality images with alt text
- ✅ Proper heading hierarchy (H1 → H2 → H3)
- ✅ At least 300 words of content
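The title and description limits above can be enforced with a small check at build or review time; the function name and exact messages are illustrative:

```typescript
// Flag titles over 60 characters and descriptions over 160 characters,
// matching the checklist limits above.
export function checkMeta(title: string, description: string): string[] {
  const issues: string[] = [];
  if (title.trim().length === 0) issues.push('title is empty');
  if (title.length > 60) issues.push(`title too long (${title.length} > 60)`);
  if (description.trim().length === 0) issues.push('description is empty');
  if (description.length > 160)
    issues.push(`description too long (${description.length} > 160)`);
  return issues;
}
```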
Keep your Lighthouse scores high:
- Performance: 100/100
- Accessibility: 100/100
- Best Practices: 100/100
- SEO: 100/100
Already implemented via metadata, but verify with regular monitoring in Google Search Console: Coverage → View details → Check for errors.
- Check robots.txt: https://emrhngngr.tech/robots.txt and ensure blog paths are allowed.
- Verify Sitemap: https://emrhngngr.tech/sitemap.xml; all blog URLs should be listed.
- Test URL with Google: Use the URL Inspection tool in GSC to see what Google sees.
- Check for Manual Actions: GSC → Security & Manual Actions → Manual Actions
- Review Server Logs: Check whether Googlebot is successfully crawling pages.
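Checking logs for Googlebot can be as simple as a grep; the log lines below are hypothetical samples written only to show the pattern (real access logs come from your host):

```shell
# Hypothetical access-log excerpt, written here only for the demo
cat > access.log <<'EOF'
66.249.66.1 - - [28/Oct/2025] "GET /blog/my-post HTTP/1.1" 200 "Googlebot/2.1 (+http://www.google.com/bot.html)"
203.0.113.5 - - [28/Oct/2025] "GET /about HTTP/1.1" 200 "Mozilla/5.0"
EOF

# Count successful (HTTP 200) Googlebot requests for blog pages
grep 'Googlebot' access.log | grep '/blog/' | grep -c ' 200 '
```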
- Sitemap submitted to GSC
- All blog URLs tested with URL Inspection
- Validation requested for all errors
- Monitoring "Page Indexing" report weekly
- Checking for new crawl errors
- Verifying organic search traffic increase
Track these in Google Search Console:
- Indexed pages: Should increase from current levels
- Impressions: Should grow as more pages get indexed
- Average position: Should improve over time
- Click-through rate: Should remain stable or improve
Implementation Date: October 28, 2025 Status: ✅ Completed Next Review: Check GSC in 7 days