# robots.txt

API Reference for the `robots.txt` file.
Add or generate a `robots.txt` file that matches the [Robots Exclusion Standard](https://en.wikipedia.org/wiki/Robots.txt) in the **root** of the `app` directory to tell search engine crawlers which URLs they can access on your site.
## Static `robots.txt`
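A minimal static file placed at `app/robots.txt`; the disallowed path and sitemap URL below are placeholders:

```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```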
## Generate a Robots file
Add a `robots.js` or `robots.ts` file that returns a [`Robots` object](#robots-object).
> **Good to know**: `robots.js` is a special Route Handler that is cached by default unless it uses a Dynamic API or dynamic config option.
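A minimal sketch of such a file, using the `MetadataRoute.Robots` type exported by `next`; the disallowed path and sitemap URL are placeholders:

```ts
// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/', // placeholder path
    },
    sitemap: 'https://acme.com/sitemap.xml', // placeholder URL
  }
}
```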
Output:
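Given the sketch above, the generated `robots.txt` would look roughly like:

```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```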
### Customizing specific user agents
You can customize how individual search engine bots crawl your site by passing an array of user agents to the `rules` property. For example:
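A sketch with per-bot rules; the bot names and paths below are illustrative:

```ts
// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        // Rules applied only to Googlebot
        userAgent: 'Googlebot',
        allow: ['/'],
        disallow: '/private/',
      },
      {
        // A single rule entry can target several user agents
        userAgent: ['Applebot', 'Bingbot'],
        disallow: ['/'],
      },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```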
Output:
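For the sketch above, each user agent gets its own block in the generated file, roughly:

```txt
User-Agent: Googlebot
Allow: /
Disallow: /private/

User-Agent: Applebot
Disallow: /

User-Agent: Bingbot
Disallow: /

Sitemap: https://acme.com/sitemap.xml
```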
### Robots object
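A sketch of the shape the returned object is expected to have, inferred from the examples above; the optional `crawlDelay` and `host` fields are assumptions here, so check the `MetadataRoute.Robots` type in your Next.js version for the authoritative definition:

```tsx
type Robots = {
  rules:
    | {
        userAgent?: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number // assumed optional field
      }
    | Array<{
        userAgent: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number // assumed optional field
      }>
  sitemap?: string | string[]
  host?: string // assumed optional field
}
```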
## Version History
| Version   | Changes              |
| --------- | -------------------- |
| `v13.3.0` | `robots` introduced. |