
0 votes
3.9k views
in Technique by (71.8m points)

robots.txt: allow a slug but disallow everything under it

I want to achieve this behavior:

Allow: /plans and Disallow: /plans/*

Crawl: www.example.com/plans

Do not crawl: anything under www.example.com/plans/

1 Answer

0 votes
by (71.8m points)

It would be:

Allow: /plans$
Disallow: /plans/

Entries are assumed to have a trailing wildcard, so /plans/ and /plans/* are the same thing. However, this also means that /plans on its own would match /plansandstuff. That can be dealt with by using $, which matches the end of the path.
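
In an actual robots.txt file these lines have to sit inside a User-agent group. A minimal sketch, assuming the rules should apply to every crawler, would be:

User-agent: *
Allow: /plans$
Disallow: /plans/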

See also: Robots.txt Specification

Keep in mind that the robots.txt file is advisory and not all crawlers pay attention to it.
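
To see that the two rules behave as intended, here is a small Python sketch (not from the original answer) that mimics Google-style path matching, where * matches any run of characters, a trailing $ anchors the pattern to the end of the path, and the longest matching rule wins:

import re

RULES = [("Allow", "/plans$"), ("Disallow", "/plans/")]

def rule_matches(pattern, path):
    # Translate a robots.txt pattern into a regex: "*" becomes ".*",
    # a trailing "$" anchors the match to the end of the path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "^" + "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if anchored:
        regex += "$"
    return re.search(regex, path) is not None

def allowed(path):
    # The longest matching pattern wins; Allow wins a length tie.
    matches = [(len(p), kind == "Allow") for kind, p in RULES if rule_matches(p, path)]
    if not matches:
        return True          # no rule matches -> crawling is allowed
    matches.sort(reverse=True)
    return matches[0][1]

print(allowed("/plans"))          # True  - the overview page stays crawlable
print(allowed("/plans/premium"))  # False - everything under /plans/ is blocked
print(allowed("/plansandstuff"))  # True  - no rule matches; the "$" keeps /plans$ from matching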

