robots.txt Disallow syntax

I have dynamically generated pages that can have any name.

www.example.com/pages/ 
www.example.com/pages/dynamicpage1/ 
www.example.com/pages/dynamicpage2/ 


I need crawlers to be able to crawl the first URL, but none of the dynamic pages:

User-agent: *
Disallow: /pages/*


The above blocks all three of the listed pages. How can I block all of the dynamic pages but not the first URL?


1 answer


I figured it out:

User-agent: *
Disallow: /pages/*/
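This works because the trailing slash in the pattern requires at least one more path segment after `/pages/`, so `/pages/` itself no longer matches. Note that `*` wildcards are not part of the original robots.txt standard; they are an extension honored by Googlebot and most major crawlers. Below is a minimal sketch of that wildcard matching in Python, just to illustrate the semantics: the `robots_pattern_matches` helper and the test paths are hypothetical, not part of any library.

import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check a URL path against a Googlebot-style robots.txt pattern.

    '*' matches any run of characters; '$' anchors the end of the path.
    Patterns are prefix matches unless anchored. This is an illustrative
    sketch, not a full robots.txt parser.
    """
    regex = "^"
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# '/pages/*/' matches the dynamic subpages but not /pages/ itself.
for path in ["/pages/", "/pages/dynamicpage1/", "/pages/dynamicpage2/"]:
    blocked = robots_pattern_matches("/pages/*/", path)
    print(f"{path}: {'blocked' if blocked else 'allowed'}")

Running this prints `allowed` for `/pages/` and `blocked` for both dynamic pages, which is exactly the behavior the rule above is meant to achieve.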




It happens all the time: you solve the problem within seconds of posting it.







