Index Blocking
Define pages to prevent them from appearing in search.
After Dark uses the `noindex` robots meta directive to prevent search engines from indexing certain parts of your site. The directive appears in the HTML document `head` and looks like this:
```html
<meta name="robots" content="noindex">
```
Unlike `robots.txt` rules, meta directives are defined within the page content itself and unambiguously indicate which pages, if any, should be blocked from indexing, even if some of those pages appear in your site’s Sitemap.
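By contrast, a `robots.txt` rule only asks well-behaved crawlers not to fetch matching paths; it cannot keep an already-discovered URL out of the index. A minimal sketch, using a hypothetical path:

```text
# Asks crawlers not to fetch /legal/, but does not
# prevent those URLs from being indexed if linked elsewhere
User-agent: *
Disallow: /legal/
```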
To help you identify index-blocked pages, Fuzzy Search honors the very same meta directive exposed to search engines and excludes blocked pages from its own result listings. Therefore, if a page can be found in Fuzzy Search, that page may ultimately appear on a search engine results page.
Adjust index blocking per page using the `noindex` Front Matter setting:
```toml
noindex = true # set false or remove to unblock
```
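For instance, a complete Front Matter block for a hypothetical page might look like this; the title is illustrative only:

```toml
+++
title = "Cookie Policy" # hypothetical page
noindex = true # block this page from search indexing
+++
```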
Block entire sections using an `_index.md` file with the above setting:
```text
├── content
│   ├── legal
│   │   ├── _index.md
│   │   ├── terms.md
│   │   └── privacy.md
│   ├── post
```
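Given the tree above, `content/legal/_index.md` might contain the following; the title is an assumption:

```toml
+++
title = "Legal" # illustrative section title
noindex = true  # blocks the /legal/ section from indexing
+++
```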
By default the following page types are blocked automatically:

- Section listings automatically linked to from the Section Menu;
- Taxonomy pages such as `Category`, `Tag` and terms listings; and,
- If enabled, the Fuzzy Search page or any deep-linked result within.
Adjust the defaults using the `noindex_kinds` setting in Site Configuration. For example, to allow section pages to be indexed add the following to the config:
```toml
[params]
  noindex_kinds = [
    "taxonomy",
    "taxonomyTerm"
  ] # "section" omitted to unblock section pages
```
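Internally, logic along the following lines is enough to honor both the kind defaults and the per-page setting. This is a simplified sketch, not After Dark’s actual partial:

```go-html-template
{{/* Emit the directive when the page kind is blocked by default,
     or when the page opts in via noindex Front Matter. */}}
{{ if or .Params.noindex (in .Site.Params.noindex_kinds .Kind) }}
  <meta name="robots" content="noindex">
{{ end }}
```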
Learn about Robots Meta Directives on Moz and see how Google uses `noindex` in Block search indexing with 'noindex'.