Reputation: 1
I'm looking for a way to detect when Googlebot visits my website so that I can provide it with access to the full content, which is typically restricted to authenticated users. Currently, unauthenticated users only see limited content.
I've considered a few options, such as using the list of Google's published crawler IP addresses, but I know these can change frequently. I've also thought about checking the User-Agent string, but since it can be trivially spoofed, I'm concerned about its reliability. Additionally, performing a reverse DNS lookup on the server side for every request seems impractical and time-consuming.
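For context, here is a minimal sketch of the reverse-DNS approach I was considering (Python, using only the standard library). It verifies a visitor's IP by resolving its PTR record, checking the domain, and then forward-resolving the hostname to confirm it maps back to the same IP; the function name and domain suffixes are my own assumptions:

```python
import socket

def is_googlebot(ip_address):
    """Verify a claimed Googlebot visit via reverse + forward DNS.

    Returns True only if the IP's PTR record is a googlebot.com /
    google.com hostname AND that hostname resolves back to the same
    IP (the forward check prevents spoofed PTR records).
    """
    try:
        # Reverse DNS: look up the PTR record for the IP
        host, _, _ = socket.gethostbyaddr(ip_address)
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward confirmation: the hostname must resolve back to the IP
        _, _, addrs = socket.gethostbyname_ex(host)
    except socket.gaierror:
        return False
    return ip_address in addrs
```

In practice the result could be cached per IP so the lookups only happen once per crawler address, which might address the performance concern, but I'm hoping there's something simpler.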
Is there a more effective and reliable method to ensure that Googlebot can identify itself, perhaps by sending an authentication token? Any insights or suggestions would be greatly appreciated!
Upvotes: -2
Views: 34