Create a properly formatted robots.txt file to control how search engine crawlers access your website. Block AI bots, protect admin areas, and manage crawl behavior.
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Allow: /
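To also block AI crawlers, as mentioned above, append per-bot groups to the same file. The sketch below uses the publicly documented user agents for OpenAI (GPTBot), Common Crawl (CCBot), and Google's AI training crawler (Google-Extended); verify current names against each vendor's documentation, since bots identify themselves by these tokens voluntarily.

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```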
Save this content as robots.txt in the root directory of your website. Crawlers only look for the file at the root URL (for example, https://yoursite.com/robots.txt); a robots.txt placed in a subdirectory is ignored.
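Before deploying, you can sanity-check the rules locally with Python's standard-library robots.txt parser. This is a minimal sketch; `example.com` and `SomeBot` are placeholders, and the rule text mirrors the file shown above.

```python
from urllib.robotparser import RobotFileParser

# The same rules as the robots.txt file above, held as a string
# so we can test them without fetching anything over the network.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler is blocked from the protected areas...
print(parser.can_fetch("SomeBot", "https://example.com/admin/settings"))
# ...but may fetch ordinary pages.
print(parser.can_fetch("SomeBot", "https://example.com/blog/post"))
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism, so admin and private areas still need real authentication.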