# [Issue #3405] Setup API to respond to robots.txt requests (#3484)
## Summary
Fixes #3405 

### Time to review: __3 mins__

## Changes proposed
Add a very high-level robots.txt implementation for the API side.

## Additional Information
Content served by the route passes validation and enforces the intended crawling restrictions (only `/docs` is crawlable; everything else is disallowed):

![image](https://github.com/user-attachments/assets/cc0c0a42-09bb-498d-b43c-7f7e139b8dcc)
![image](https://github.com/user-attachments/assets/964a4d15-b6ab-455d-a71c-29d431c33f8a)
![image](https://github.com/user-attachments/assets/6a422ba8-6df9-46cd-9e29-f8da6e66ec27)
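For a quick offline check of the same rules, Python's standard-library parser can evaluate them directly. This is a sketch; `/some-route` is an illustrative path, not a real endpoint:

```python
# Offline sanity check of the served rules using the stdlib parser.
# "/some-route" below is illustrative, not an actual API route.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """
User-Agent: *
Allow: /docs
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

assert parser.can_fetch("*", "/docs")            # docs stay crawlable
assert not parser.can_fetch("*", "/some-route")  # everything else is blocked
```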
mdragon authored Jan 10, 2025 · 1 parent 6469c93 · commit 830931f
**api/src/app.py** (12 additions, 0 deletions)

```diff
@@ -60,6 +60,7 @@ def create_app() -> APIFlask:
     configure_app(app)
     register_blueprints(app)
     register_index(app)
+    register_robots_txt(app)
     register_search_client(app)
 
     auth_endpoint_config = AuthEndpointConfig()
@@ -163,3 +164,14 @@ def index() -> str:
 </body>
 </html>
 """
+
+
+def register_robots_txt(app: APIFlask) -> None:
+    @app.route("/robots.txt")
+    @app.doc(hide=True)
+    def robots() -> str:
+        return """
+User-Agent: *
+Allow: /docs
+Disallow: /
+"""
```
