Opened 5 years ago

Last modified 5 years ago

#4184 accepted defect (bug)

Add a robots.txt disallow-all rule to

Reported by: jonoaldersonwp
Owned by: Otto42
Milestone:
Priority: lowest
Component: Version Control
Keywords: seo

Description (last modified by jonoaldersonwp)

Replace the contents of with:

User-agent: *
Disallow: /
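As a sanity check on the proposed rule, here is a minimal sketch (not part of the ticket) showing how a standards-following crawler would interpret it, using Python's stdlib robots.txt parser; the paths passed to can_fetch are arbitrary examples:

```python
import urllib.robotparser

# Parse the proposed disallow-all rule directly from its two lines.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Under this rule, no path is crawlable for any user agent.
print(rp.can_fetch("*", "/trunk/"))          # False
print(rp.can_fetch("Googlebot", "/tags/"))   # False
```

Because the rule matches every user agent and disallows the root path, every URL on the host is excluded from crawling.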

Change History (4)

#1 @Otto42
5 years ago

  • Owner set to Otto42
  • Status changed from new to accepted

The easy way to fix this would be to add the robots.txt to the root SVN. I think I'm the only person with direct access to do that easily.

Is that an acceptable fix?

#2 @dd32
5 years ago

Committing a file won't work here: the robots.txt is served by Apache for all SVN hosts, and there's a similar generic one for the Trac hosts.

@jonoaldersonwp can you confirm what should happen for routes such as these, and whether they should differ from (That's just a cached CDN of

Giving them different content is possible; it's just easier to serve a single file for all SVNs.
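For illustration, a single shared file could be mapped onto every SVN vhost with an Apache alias. This is a hypothetical sketch, not the actual wordpress.org configuration; the file path and snippet placement are assumptions:

```apache
# Hypothetical: serve one shared robots.txt for all SVN vhosts.
# /etc/apache2/svn-robots.txt is an assumed path, not the real layout.
Alias /robots.txt /etc/apache2/svn-robots.txt
<Files "svn-robots.txt">
    Require all granted
</Files>
```

Placing a directive like this in a config fragment shared by the SVN vhosts is what makes a committed robots.txt in any one repository ineffective: Apache answers the request before SVN ever sees it.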

#3 @dd32
5 years ago

  • Component changed from General to Version Control

#4 @jonoaldersonwp
5 years ago

  • Description modified (diff)

Hmm. I think we can safely apply the same global disallow rule to each of these. Easy! :)

Note: See TracTickets for help on using tickets.