Google uses different processes to crawl and index new pages. Usually these processes rely on natural link discovery, i.e. an existing website linking to your new page or updated content. Natural link discovery and indexation does take some time; however, there are several ways of speeding up the process. My favourite way of speeding up link discovery and indexation is social media.
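Another way to give Google a nudge, if you keep an XML sitemap, is to ping Google whenever the sitemap changes. Here is a minimal sketch in Python using Google's sitemap ping endpoint; the sitemap URL is just a placeholder, and a successful ping only prompts Google to re-read the sitemap, it doesn't guarantee crawling or indexing.

```python
import urllib.parse
import urllib.request

# Placeholder sitemap URL; swap in your own.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint: tells Google the sitemap has changed
# and is worth re-reading, which can speed up URL discovery.
PING = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(SITEMAP_URL)

with urllib.request.urlopen(PING) as response:
    # A 200 here only means the ping was received, not that anything
    # in the sitemap will be crawled or indexed.
    print(response.status, response.reason)
```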

Now, though, there is a new way of submitting specific URLs to Google. Remember the Fetch As Googlebot function under Diagnostics in Google Webmaster Tools? Well, that function has now been extended, allowing you to fetch a particular page as Googlebot and then submit it to Google straight afterwards.

After you fetch a URL as Googlebot, if the fetch is successful, you’ll see the option to submit that URL to Google’s index. When you submit a URL using this method, Googlebot will crawl the page, usually within a day. Google will then consider it for inclusion, but there is no guarantee that every URL submitted this way will be indexed; Google may still rely on its regular link discovery processes.
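One way to sanity-check the "usually within a day" claim on your own site is to watch your server access logs for a Googlebot request to the page you submitted. A rough sketch, assuming an Apache-style combined access log at a typical path and a made-up page path:

```python
# Assumptions for illustration: a combined-format access log at this
# path, and a placeholder path for the page you just submitted.
LOG_FILE = "/var/log/apache2/access.log"
SUBMITTED_PATH = "/new-page/"

googlebot_hits = []
with open(LOG_FILE) as log:
    for line in log:
        # In the combined log format the user agent is the last quoted
        # field, so a simple substring check on "Googlebot" is enough here.
        if SUBMITTED_PATH in line and "Googlebot" in line:
            googlebot_hits.append(line.strip())

if googlebot_hits:
    print(f"Googlebot fetched {SUBMITTED_PATH} {len(googlebot_hits)} time(s):")
    for hit in googlebot_hits:
        print(hit)
else:
    print(f"No Googlebot requests for {SUBMITTED_PATH} yet.")
```

If the submission worked, you should see at least one Googlebot line for that page within a day or so of submitting it.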

In its announcement, Google presents a few scenarios where you might want to use this functionality. I must admit these scenarios come up quite regularly, particularly the second one; read the excerpt below.

“This new functionality may help you in several situations: if you’ve just launched a new site, or added some key new pages, you can ask Googlebot to find and crawl them immediately rather than waiting for us to discover them naturally. You can also submit URLs that are already indexed in order to refresh them, say if you’ve updated some key content for the event you’re hosting this weekend and want to make sure we see it in time. It could also help if you’ve accidentally published information that you didn’t mean to, and want to update our cached version after you’ve removed the information from your site.”