2 changes: 1 addition & 1 deletion src/collections/posts/lastmod.json
@@ -23,5 +23,5 @@
 "src/routes/posts/(2025)/environment-variables-in-sveltekit/+page.md": "2025-11-05T14:06:22.000Z",
 "src/routes/posts/(2025)/it-aint-easy-to-move-a-side-project-off-big-tech/+page.md": "2025-11-07T20:24:14.000Z",
 "src/routes/posts/(2025)/managing-environment-variables-with-vercel/+page.md": "2025-11-12T20:58:01.000Z",
-"src/routes/posts/(2026)/the-privacy-friendly-video-doorbell-that-failed-in-cold-weather/+page.md": "2026-01-11T14:13:07.000Z"
+"src/routes/posts/(2026)/the-privacy-friendly-video-doorbell-with-flawed-person-detection/+page.md": "2026-01-17T15:14:53.000Z"
}
22 changes: 22 additions & 0 deletions src/routes/notes/(2026)/fix-your-robots-txt-file/+page.md
@@ -0,0 +1,22 @@
---
title: Fix your robots.txt file
description:
If Google does not index your site, it might be because you don't have a robots.txt file.
publishedDate: 2026-01-18
link: https://www.alanwsmith.com/en/37/wa/jz/s1/
---

This link popped up in my Bluesky feed. Long story short: Google won't index your website if it
does not serve a `robots.txt` file. That's a little odd, because this did not matter in the past,
and many small websites probably don't have a `robots.txt` file at all.

`robots.txt` is part of the
[Robots Exclusion Protocol](https://www.rfc-editor.org/rfc/rfc9309.html). But it is nothing more
than an unenforced code of conduct pinned to the entrance of your website: be nice, dear bots, or
else nothing will happen.
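
If all you want is to satisfy crawlers that expect the file to exist, a minimal `robots.txt` that
allows everything can be as short as this (the sitemap URL is a placeholder for your own domain):

```text
# Allow all crawlers to access the entire site
User-agent: *
Disallow:

# Optional, but helps crawlers discover your pages
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` line means "nothing is off limits", which per RFC 9309 is equivalent to having
no restrictions at all.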

If you plan to add a `robots.txt` file to your site,
[an earlier note](/notes/disallowing-ai-bots-in-robots-txt) explains how to discourage AI bots from
slurping up your content by denying them access in `robots.txt`. Again, this does not prevent bots
from accessing your site, but it makes the house rules clear and possibly legally enforceable.
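
As a sketch of what such house rules could look like, the following blocks two well-known AI
crawler user agents (`GPTBot` and `CCBot`) while leaving the site open to everyone else; treat the
exact list of bots as an example, not an exhaustive one:

```text
# Ask known AI crawlers to stay out entirely
User-agent: GPTBot
User-agent: CCBot
Disallow: /

# Everyone else may crawl everything
User-agent: *
Disallow:
```

Per RFC 9309, several `User-agent` lines can share one group of rules, and crawlers pick the most
specific group that matches their name, so the `*` group does not override the AI-bot group.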