Google Removes Robots.txt Guidance For Blocking Auto-Translated Pages
Google has removed its previous recommendation to block auto-translated pages using robots.txt, reflecting a shift in approach: quality matters more than how content is produced. This update aligns technical guidance with Google’s existing “scaled content abuse” policies, which evaluate content based on its value to users rather than on whether it was produced by automation.
Importantly, this is purely a documentation change; Google’s crawling behavior is unchanged. The advice now encourages webmasters to apply a meta robots noindex tag to low-quality translations instead of blanket robots.txt disallows, and to assess translation quality page by page. The takeaway: if your auto-translated pages offer real value, don’t hide them; suppress only those that fall short.
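To make the distinction concrete, here is a minimal sketch of the recommended approach: a meta robots noindex tag in the page’s head keeps a low-quality translation out of search results while still letting Googlebot crawl and evaluate the page.

```html
<!-- Sketch: suppress a single low-quality auto-translated page from
     search results. Googlebot can still crawl the page and see this
     directive, which is what makes per-page quality assessment possible. -->
<head>
  <meta name="robots" content="noindex">
</head>
```

Contrast that with the now-removed advice of a blanket robots.txt disallow, which blocks crawling entirely: a blocked page can never be evaluated, and Google cannot even see a noindex tag on it. The path below is hypothetical.

```
# Old approach (no longer recommended): block crawling of the whole section.
User-agent: Googlebot
Disallow: /es/auto-translated/
```

For non-HTML resources such as PDFs, the equivalent per-URL control is the `X-Robots-Tag: noindex` HTTP response header.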