Making WordPress.org

Opened 6 years ago

Last modified 6 years ago

#4184 accepted defect (bug)

Add a robots.txt disallow all rule to ps.w.org

Reported by: jonoaldersonwp
Owned by: Otto42
Milestone: (none)
Priority: lowest
Component: Version Control
Keywords: seo
Cc: (none)

Description (last modified by jonoaldersonwp)

Replace the contents of https://ps.w.org/robots.txt with:

User-agent: *
Disallow: /
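As a sanity check, the two-line rule above disallows every path for every crawler. A quick sketch using Python's standard urllib.robotparser (the user agents and paths tested here are illustrative, not from the ticket):

```python
from urllib import robotparser

# The proposed contents of https://ps.w.org/robots.txt
rules = """\
User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Every path should be disallowed for every crawler.
for agent in ("Googlebot", "bingbot", "*"):
    for path in ("/", "/some/plugin/assets/icon.svg"):
        allowed = parser.can_fetch(agent, f"https://ps.w.org{path}")
        print(agent, path, allowed)
```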

Change History (4)

#1 @Otto42
6 years ago

  • Owner set to Otto42
  • Status changed from new to accepted

The easy way to fix this would be to add the robots.txt to the root SVN. I think I'm the only person with direct access to do that easily.

Is that an acceptable fix?

#2 @dd32
6 years ago

Committing a file won't work here: this robots.txt is served by Apache for all SVN hosts, and there's a similar generic one for the Trac hosts.

@jonoaldersonwp can you confirm what should happen for routes such as these, and whether they should differ from https://ps.w.org/robots.txt (that's just a CDN-cached copy of plugins.svn.wordpress.org)?
http://plugins.svn.wordpress.org/robots.txt
http://core.svn.wordpress.org/robots.txt
http://meta.svn.wordpress.org/robots.txt

Serving different content for each host is possible, but it's easier to maintain a single file for all SVN hosts.
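For illustration, one common way a single file ends up served for every SVN vhost is a shared Alias, which an individual vhost can override if it ever needs different rules. This is a hedged sketch only; the paths and vhost definitions below are assumptions, not the actual WordPress.org configuration:

```apache
# Shared default served for every SVN vhost (assumed path)
Alias /robots.txt /srv/svn-common/robots.txt

# A host needing different rules can shadow the alias in its own vhost
<VirtualHost *:80>
    ServerName plugins.svn.wordpress.org
    Alias /robots.txt /srv/svn-common/robots-plugins.txt
</VirtualHost>
```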

#3 @dd32
6 years ago

  • Component changed from General to Version Control

#4 @jonoaldersonwp
6 years ago

  • Description modified (diff)

Hmm. I think we can safely apply the same global disallow rule to each of these. Easy! :)
