Hi all... this is my first post. I've been pondering how to handle this for some time now, so I figured I'd ask...
I have an ungodly number of "not indexed" pages in Google Search Console that I'm trying to get removed from GSC. All of the URLs are duplicate parameterized URLs that provide zero benefit to the site and are not linked to from anywhere (anymore). For 8+ months, I've had a Cloudflare Worker script serve a simple 410 response to Googlebot for the URLs I want removed from GSC, but by my estimates it will take two years to clear them at the current pace. My site gets roughly 15k pages crawled per day, with spikes up to 40k/day, and I only have ~5k legit pages that are rarely updated.
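For context, here's a minimal sketch of the kind of Worker I mean; the parameter names are placeholders, not my actual ones, and the real script has more rules:

```ts
// Sketch of a Cloudflare Worker (module syntax) that answers 410 Gone for
// retired parameterized URLs and passes everything else through to the origin.
// "sort" and "sessionid" are example parameter names, not the real ones.

export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    const retiredParams = ["sort", "sessionid"]; // placeholder list

    const isRetired = retiredParams.some((p) => url.searchParams.has(p));

    if (isRetired) {
      // 410 signals to Googlebot that the page is permanently gone.
      return new Response("Gone", {
        status: 410,
        headers: { "Content-Type": "text/plain" },
      });
    }

    // All legit pages go to the origin untouched.
    return fetch(request);
  },
};
```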
My question is: does anyone have a suggestion on how to hasten the removal of these 410 URLs from GSC? All of these "404" errors in GSC are actually 410s, but GSC is still holding on to them 6+ months after seeing the initial 410.
I thought about generating a sitemap of the ~1,000 URLs that GSC reports Googlebot found 410s on in each update, but even at ~2,000 URLs/week this would still take over a year.
I could also generate sitemaps of all the old parameter URLs, but Googlebot has never seen the majority of them, so this would balloon the numbers in GSC to roughly 8x what they already are.
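If I go either sitemap route, the generation itself is trivial; something like this sketch, where the input file and output path are just placeholders:

```ts
// Minimal sketch: build a sitemap file from a list of dead (410) URLs so
// Googlebot recrawls them sooner. "dead-urls.txt" and "sitemap-410.xml"
// are hypothetical file names.

import { readFileSync, writeFileSync } from "node:fs";

const urls = readFileSync("dead-urls.txt", "utf8")
  .split("\n")
  .map((u) => u.trim())
  .filter(Boolean)
  .slice(0, 50000); // the sitemap protocol caps a single file at 50,000 URLs

const entries = urls
  .map((u) => `  <url><loc>${u}</loc></url>`)
  .join("\n");

const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;

writeFileSync("sitemap-410.xml", xml);
```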
I should also point out, since the photo doesn't show it: the pages ARE actually being removed from GSC, but I added the 410 response to even more URLs these past couple of months, so the count is only staying the same/increasing because of the new batch. Prior to this, the 404/410s were decreasing by 2-8k URLs/week.