Create a Robots.txt Crawl Directive

Create a robots.txt file for https://example.com/ that blocks the /sample-works/ directory but allows crawling of the /sample-works/seo/ directory.
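A minimal sketch of the file such a prompt should produce, assuming the rules apply to all crawlers (User-agent: *) and that the crawler honors the Allow directive (Google and most major engines do, with the more specific path taking precedence):

```
# robots.txt for https://example.com/
User-agent: *

# Block the whole /sample-works/ directory...
Disallow: /sample-works/

# ...but permit the /sample-works/seo/ subdirectory,
# since the longer (more specific) path wins.
Allow: /sample-works/seo/
```

For the directives to take effect, the file must be served from the site root, i.e. at https://example.com/robots.txt.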
