Opened 6 years ago
Last modified 6 years ago
#4184 accepted defect (bug)
Add a robots.txt disallow all rule to ps.w.org
Reported by: | jonoaldersonwp | Owned by: | Otto42
---|---|---|---
Milestone: | | Priority: | lowest
Component: | Version Control | Keywords: | seo
Cc: | | |
Description
Replace the contents of https://ps.w.org/robots.txt with:
User-agent: *
Disallow: /
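A quick way to confirm that this two-line file really blocks every path for every crawler is to feed it to Python's standard-library `robotparser`. This is an illustrative check, not part of the ticket; the plugin URL used below is a made-up example path on ps.w.org.

```python
from urllib import robotparser

# The proposed disallow-all rules, parsed directly rather than
# fetched from https://ps.w.org/robots.txt over the network.
rules = """\
User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# With a blanket "Disallow: /", no path is fetchable by any user agent.
print(parser.can_fetch("*", "https://ps.w.org/example-plugin/"))         # False
print(parser.can_fetch("Googlebot", "https://ps.w.org/example-plugin/"))  # False
```

Both calls return `False`, i.e. well-behaved crawlers honoring the file will not index anything under the host.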
Change History (4)
#2 @ 6 years ago
Committing a file won't work here, because this is the robots.txt that Apache serves for all of the SVN hosts; there's a similar generic one for the Trac hosts.
@jonoaldersonwp, can you confirm what should happen for routes such as these, and whether they should differ from https://ps.w.org/robots.txt? (That host is just a CDN-cached front for plugins.svn.wordpress.org.)
http://plugins.svn.wordpress.org/robots.txt
http://core.svn.wordpress.org/robots.txt
http://meta.svn.wordpress.org/robots.txt
Giving them different content is possible; it's just easier to maintain a single file for all of the SVN hosts.
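The single-file approach described above could look something like the following Apache sketch. This is a hypothetical illustration only: the vhost names match the URLs in this ticket, but the file paths and the actual w.org server configuration are assumptions.

```apache
# Hypothetical sketch, not the real w.org config: map /robots.txt on
# every SVN vhost to one shared file on disk.
<VirtualHost *:80>
    ServerName plugins.svn.wordpress.org
    ServerAlias core.svn.wordpress.org meta.svn.wordpress.org

    # One shared robots.txt for all SVN hosts; per-host files would
    # instead need a separate Alias inside each vhost.
    Alias /robots.txt /var/www/shared/svn-robots.txt
</VirtualHost>
```

With a shared `Alias` like this, changing the policy for one host would mean splitting it out into its own vhost block, which is the extra maintenance cost the comment alludes to.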
The easy way to fix this would be to add the robots.txt to the root SVN. I think I'm the only person with direct access to do that easily.
Is that an acceptable fix?