Disallow expensive pages via robots.txt #3654

Merged

josephsnyder merged 1 commit into Kitware:master from williamjallen:add-robots-txt on Apr 16, 2026

Conversation

@williamjallen
Collaborator

Scrapers frequently target expensive pages such as `testOverview.php`, as well as pages that are not meaningful to index, such as `/tests/<id>` and `/builds/<id>/*`. This adds significant load to public-facing CDash instances. This change addresses the issue by requesting that bots not index these pages.
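A minimal sketch of what such a `robots.txt` could look like, based only on the paths mentioned above (the exact rules in the merged commit may differ):

```
# Ask all crawlers to skip expensive and low-value pages.
User-agent: *
Disallow: /testOverview.php
Disallow: /tests/
Disallow: /builds/
```

Note that `Disallow` is advisory: well-behaved crawlers honor it, but it does not block abusive scrapers, so it reduces load from compliant bots rather than enforcing access control.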
@josephsnyder josephsnyder added this pull request to the merge queue Apr 16, 2026
Merged via the queue into Kitware:master with commit 4a8cfae Apr 16, 2026
19 of 21 checks passed
@williamjallen williamjallen deleted the add-robots-txt branch April 16, 2026 13:12