Discoverability layer
World indexed
All Infa public pages can be discovered through search indexing, direct links, and the broader Infa discovery layer. Technical files stay machine-readable; users stay on clean, human-readable pages.
How it works
Public pages and public post URLs can be crawled and indexed. The sitemap helps search engines discover URLs faster.
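As a rough illustration of what the sitemap does, here is a minimal sketch that assembles a sitemap XML document from a list of public page URLs. The `build_sitemap` function and the example URLs are illustrative assumptions, not Infa's actual implementation or routes.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document listing public page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url  # the canonical address crawlers should fetch
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical public page and post URLs; real paths depend on the site.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/posts/hello-world",
])
print(sitemap)
```

Search engines would find these URLs eventually by following links; the sitemap just hands them the list up front.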
The robots file tells crawlers what they are allowed to read. The manifest helps browsers install the site as an app.
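To show how crawlers interpret the robots file, here is a small sketch using Python's standard `urllib.robotparser`. The policy below is an assumption for illustration: the `/private/` path is not an actual route on the site.

```python
from urllib.robotparser import RobotFileParser

# Sample policy: crawlers may read everything except a private path.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Public pages are crawlable; the disallowed path is not.
print(parser.can_fetch("*", "https://example.com/posts/hello"))   # True
print(parser.can_fetch("*", "https://example.com/private/notes")) # False
```

A well-behaved crawler checks this file before fetching anything else, which is how public pages stay indexed while restricted paths stay out of search results.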
Local context
Current local default: Columbus
If no zone is selected during posting, the experience stays local by default.
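The fallback described above can be sketched as a simple default: use the selected zone when one was chosen, otherwise fall back to the local default. The `resolve_zone` function and its parameter names are illustrative, not Infa's API.

```python
def resolve_zone(selected_zone=None, local_default="Columbus"):
    """Return the zone for a post: the explicit choice if given,
    otherwise the current local default."""
    return selected_zone if selected_zone else local_default

print(resolve_zone())          # no zone selected -> "Columbus"
print(resolve_zone("Austin"))  # an explicit zone wins
```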