Robots.txt is currently managed by Content Build. This will need to move to Next Build as part of the migration away from Content Build.
The contents of robots.txt should be revisited with the relevant teams and people to ensure it is up to date and still serving needs correctly.
Acceptance Criteria
The appropriate teams and people are consulted to determine the desired content of robots.txt
Robots.txt is implemented within Next Build per Next.js guidelines to produce the desired content
Robots.txt behaves appropriately in each environment
Supporting details
Interested people
Mikki Northeius
Frontend COP
Analytics team, potentially
Robots.txt in Next.js
Next.js provides documentation for implementing robots.txt. This would almost certainly be a TypeScript file that responds to the app environment so that it can provide the correct output per environment.
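As a rough illustration of that approach, the App Router convention is an app/robots.ts file that returns a MetadataRoute.Robots object. The sketch below is not the decided implementation: the APP_ENV check and the sitemap URL are placeholders, and the actual rules are still to be determined with the interested teams.

```typescript
// app/robots.ts — minimal sketch of an environment-aware robots file.
// Assumes a hypothetical APP_ENV variable distinguishes production from
// lower environments; the real build may key off a different variable.
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  const isProd = process.env.APP_ENV === 'prod' // placeholder check

  if (!isProd) {
    // Lower environments: block all crawling.
    return {
      rules: { userAgent: '*', disallow: '/' },
    }
  }

  // Production: allow crawling; specific Disallow rules to be confirmed
  // with the interested teams before implementation.
  return {
    rules: {
      userAgent: '*',
      allow: '/',
    },
    sitemap: 'https://www.va.gov/sitemap.xml', // assumption: production sitemap URL
  }
}
```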
Current robots.txt
The current robots.txt from Content Build is a text file. However, the file is processed by an Update Robots plugin that rewrites robots.txt to simply disallow crawling.
There are a number of Disallow rules that may no longer apply. Each of these should be revisited.
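For reference, a robots.txt that disallows all crawling (the behavior described for the Update Robots plugin above) conventionally looks like:

```
User-agent: *
Disallow: /
```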