robots.txt disallow syntax
I have dynamically generated pages that can have any name:
www.example.com/pages/
www.example.com/pages/dynamicpage1/
www.example.com/pages/dynamicpage2/
I need crawlers to crawl the first URL, but none of the dynamic pages:
User-agent: *
Disallow: /pages/*
The above blocks all three of the listed pages. How can I block all the dynamic pages but not the first URL?
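One possible ruleset, a sketch assuming the crawler honors the Allow directive and the `$` end-of-URL anchor (Google-style extensions that major crawlers such as Googlebot and Bingbot support, but which are not part of the original robots.txt standard):

```text
User-agent: *
Allow: /pages/$
Disallow: /pages/
```

The `$` anchors the Allow rule to the exact URL path `/pages/`, while the Disallow rule matches everything under that prefix; crawlers that follow Google's precedence rules apply the more specific (longer) matching rule, so `/pages/` stays crawlable and `/pages/dynamicpage1/` does not.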