Add X-Robots-Tag header to HTTP tunnels to prevent search engine indexing #5400

@gustavosbarreto

Description

Problem

Google Search Console has flagged *.shellhub.cloud (our HTTP tunneling domain) as containing "deceptive pages" due to user-hosted content being indexed by search engines.

Since ShellHub provides HTTP tunneling services (similar to ngrok), the content served through *.shellhub.cloud subdomains is controlled by end-users and can include login pages, admin panels, or other interfaces that trigger false positives in Google Safe Browsing.

Example affected URLs:

  • https://*.shellhub.cloud/auth/login
  • https://*.shellhub.cloud/admin/users

These are legitimate self-hosted applications (e.g., Immich) from our users, not phishing attempts.

Solution

Add the X-Robots-Tag: noindex, nofollow HTTP header to all responses served through HTTP tunnels on *.shellhub.cloud.

This will:

  1. Prevent search engines from indexing user-hosted content
  2. Avoid future false-positive Safe Browsing warnings
  3. Maintain clear separation between our control panel (cloud.shellhub.io) and user tunneling infrastructure

Implementation

The header should be added at the reverse proxy level for all HTTP tunnel responses:

X-Robots-Tag: noindex, nofollow

Additional Context

This is a common issue for HTTP tunneling providers. Similar services (ngrok, localtunnel, etc.) implement the same solution to prevent search engine crawlers from indexing user-generated content.

After implementation, we will submit a review request to Google Search Console explaining the nature of our service.
