This script abuses the Indexing API, which is intended for job postings and a few other specific purposes.
https://developers.google.com/search/apis/indexing-api/v3/quickstart

Use at your own risk.
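For context, this is roughly what such a script does under the hood. A minimal Python sketch using the google-auth library; the key file path and target URL are placeholders, and you'd need a service account added as an owner of the site in Search Console:

```python
# Sketch: authenticate with a service account and POST a URL notification
# to the Indexing API. KEY_FILE and the URL below are placeholders.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

KEY_FILE = "service-account.json"
SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES)
session = AuthorizedSession(credentials)

# Tell Google the URL was added or updated.
response = session.post(ENDPOINT, json={
    "url": "https://example.com/some-page",
    "type": "URL_UPDATED",
})
print(response.status_code, response.json())
```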
The annoying thing about this is that it will ruin this feature for everyone else. I, and many others, use it to request indexing of time-sensitive content.
Another easy way is to just tweet the URL, which works for me: pages usually get indexed less than an hour later. Google has access to tweets and the URLs in them.
What happened to the good ol' sitemap.xml?

You'll probably find an npm package with lots of dependencies that'll generate that sitemap for you if that's what you need...
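For reference, a sitemap.xml is small enough that you can generate one in a few lines with no dependencies at all. A rough sketch using only the Python standard library (the URL list is a placeholder):

```python
# Build a minimal sitemap.xml from a list of URLs.
import xml.etree.ElementTree as ET

urls = ["https://example.com/", "https://example.com/about"]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```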
From the Indexing API documentation:

> Currently, the Indexing API can only be used to crawl pages with either `JobPosting` or `BroadcastEvent` embedded in a `VideoObject`.

So this might come with the risk of the site you want to boost getting penalized by Google instead.
I recently launched a mini project and was shocked at how difficult it was, and how long it took, to get any of its pages properly indexed on Google.

It's almost as if Google is actively trying *not* to index anything as a way to reduce spam, by forcing the people who really care to jump through 100 hoops.

A great way to make sure the dark web stays dark.
I’ve seen a lot of indie startups lately that are basically selling faster Google indexing than you can get for free using Google Search Console. I guess they are probably using this feature under the hood.
"to get your site indexed" => a nonsense claim<p>+ this technique might make engines aware of your content, but doesn't guarantee indexation whatsoever.
? "what I've noticed"...Google only indexing if a site has backlinks or is submitted by owner. Uh..yeah, how else would google know about a new URL? C'mon. This just seems like the usual SEO obsession/grift with some 'secret' way to get things done. It's straightfwd these days. Are you saying none of the pages you're queuing up are linked to each other? Most cases they would be in some way right? So the spider will start indexing them all based on a top url submission or a few key urls. Do event/job board sites really need <i>all</i> of their pages to be indexed immediately?
So, Google stopped automating indexation because of spam, and humanity finds a new way to resume automation and propagate spam again. It seems Google is trapped in its own toxic game of search engine optimization.