
Add robots.txt to limit Google crawler to English documents #1940

Closed
bahmutov opened this issue Jul 31, 2019 · 3 comments · Fixed by #1942

@bahmutov
Contributor

Currently, because our translations of the English docs into Chinese and Japanese are only partially complete, a Google text search can find English text on a Japanese page, leading to less optimal results.

We should add a robots.txt to stop the Google crawler from indexing the non-English translations for now.
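A minimal sketch of what such a robots.txt could look like; the locale path prefixes (`/ja/`, `/zh-cn/`) are assumptions about the docs site's URL structure, not something confirmed in this thread:

```
# Sketch only: /ja/ and /zh-cn/ are assumed locale prefixes
# for the translated docs
User-agent: *
Disallow: /ja/
Disallow: /zh-cn/
```

Note that `Disallow` blocks crawling rather than indexing: a blocked URL can still appear in results if other pages link to it, so a `noindex` meta tag on the translated pages is the more reliable way to keep them out of the index.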

@jennifer-shehane
Member

I feel like implementing some of these best practices would be more ideal than completely excluding the translated files from search indexing: https://support.google.com/webmasters/answer/189077?hl=en&visit_id=637002396938016733-1976776371&rd=1
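The linked Google guidance centers on hreflang annotations, which tell the crawler which language versions of a page exist so each locale can rank for searches in its own language. A minimal sketch of what that could look like in each page's `<head>`; the domain and paths below are hypothetical examples, not the site's actual URLs:

```html
<!-- Sketch only: hypothetical URLs for one docs page in three locales -->
<link rel="alternate" hreflang="en" href="https://docs.example.io/guides/overview" />
<link rel="alternate" hreflang="ja" href="https://docs.example.io/ja/guides/overview" />
<link rel="alternate" hreflang="zh-CN" href="https://docs.example.io/zh-cn/guides/overview" />
<link rel="alternate" hreflang="x-default" href="https://docs.example.io/guides/overview" />
```

With annotations like these, Google could serve the Japanese page for Japanese-language searches while still indexing every translation, rather than dropping the translations from the index entirely.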

@bahmutov
Contributor Author

bahmutov commented Aug 1, 2019

Sure, implementing the SEO optimizations for translations could be step 2.

@jennifer-shehane
Member

Yeah, I was also convinced of this in the standup. I made a note of it in the translation optimizations issue #1686.
