Environment-Driven Robots.txt for the Hugo Static Site Generator

Posted on February 21, 2020

Sometimes when you deploy a project you need to block search engines. Maybe it's a development site, maybe it's not ready yet, or maybe you just don't want search engines crawling it.

Whatever the case may be, here is a simple way to handle it with environment variables in Hugo and Netlify. You won't have to think about it again. Just set it and forget it.

Hugo

Start off by adding the following to your config.toml in Hugo:

enableRobotsTXT = true

After this you'll need to add a robots.txt template. Create the file at the following location:

/layouts/robots.txt

Inside your robots.txt file, add the following code:

User-agent: *
Disallow: {{ if ne (getenv "APPENV") "production" }}/{{ end }}
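
With this in place, Hugo writes the Disallow: / rule whenever APPENV is anything other than production. If you want to check the output before pushing, here is a quick local sketch, assuming a Unix-like shell with Hugo installed (the staging value is just an example):

# Build with a production value and inspect the generated file
APPENV=production hugo
cat public/robots.txt

# Build with a non-production value to confirm crawling is blocked
APPENV=staging hugo
cat public/robots.txt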

Netlify

Now, head over to Netlify and, on your site, go to Settings > Build & Deploy > Environment.


Click Edit Variables, then New Variable.

Add the APPENV environment variable, as shown in the screenshot below:

[Screenshot: adding the APPENV environment variable in Netlify's Build & Deploy settings]
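
If you would rather keep this setting in your repository than in the UI, Netlify can also set environment variables per deploy context in a netlify.toml file. A minimal sketch, assuming the same APPENV variable name (the preview and staging values are just examples):

# netlify.toml: set APPENV per deploy context
[context.production.environment]
  APPENV = "production"

[context.deploy-preview.environment]
  APPENV = "preview"

[context.branch-deploy.environment]
  APPENV = "staging"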

After your next deploy, you can head to /robots.txt and see your new file. If the APPENV variable is set to anything other than production, you'll see:

User-agent: *
Disallow: /

If the APPENV variable is set to production, the file should read:

User-agent: *
Disallow:
