Robots.txt package

Feature Card

Description

It’d be great to have a @frontity/robots package where people can configure their robots.txt file from their Frontity settings.

User Stories

As a Frontity project developer
I want to configure my robots.txt using the settings
so that I can configure a different robots.txt for each site from the comfort of my settings

Possible solution

This could be the configuration for the package:

// frontity.settings.js
export default {
  // ...
  packages: [
    // ...
    {
      name: "@frontity/robots",
      state: {
        robots: {
          file: "User-agent: *\nDisallow: /some-url"
        }
      }
    }
  ]
};
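With that configuration, a request to /robots.txt would return the two lines encoded in the string:

User-agent: *
Disallow: /some-url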

Dependencies

We need server extensibility for this. I’ll update this post as soon as I write the feature request card for that.
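To illustrate the dependency, here is a minimal sketch of how the package could serve the file once server extensibility lands. The server entry point and the Koa-style middleware it receives are assumptions about a design that doesn’t exist yet, not the final API:

// Hypothetical @frontity/robots package (sketch only).
// The `server` hook and its ({ app, state }) signature are assumptions.
export default {
  state: {
    robots: {
      file: "User-agent: *\nDisallow: /some-url"
    }
  },
  server: ({ app, state }) => {
    // Koa-style middleware: answer /robots.txt with the configured
    // content and let every other request continue on to SSR.
    app.use(async (ctx, next) => {
      if (ctx.path === "/robots.txt") {
        ctx.type = "text/plain";
        ctx.body = state.robots.file;
      } else {
        await next();
      }
    });
  }
};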

Hi, @luisherranz! I’d really prefer to have more flexibility to write my own rules in robots.txt. Maybe it makes more sense for me to just copy a robots.txt file to the build directory.

Yeah, I added that feature last week: Use a robots.txt file in the root

Will using the frontity.settings.js file to add the content of your robots.txt file limit your flexibility? If so, could you explain how, so we can improve the design? :slight_smile:

The main problem with the robots file is that Frontity supports multiple sites, and people may need a different robots.txt for each site.
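Since frontity.settings.js already accepts an array of sites, each one could carry its own robots state. A sketch (the site names, match URLs, and rules below are just placeholders):

// frontity.settings.js — one robots.txt per site (sketch).
export default [
  {
    name: "site-one",
    match: ["https://site-one.com"],
    packages: [
      {
        name: "@frontity/robots",
        state: { robots: { file: "User-agent: *\nDisallow: /private" } }
      }
    ]
  },
  {
    name: "site-two",
    match: ["https://site-two.com"],
    packages: [
      {
        name: "@frontity/robots",
        state: { robots: { file: "User-agent: *\nAllow: /" } }
      }
    ]
  }
];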


Hi @luisherranz! :wink:
I thought about it more after sending the message :slight_smile: For me it makes sense to have robots.txt configured with frontity.settings.js because it makes it easier to create the file for multiple sites. :smiley:

Awesome, thanks for the feedback @christian.nascimento! Really appreciated :+1:
