Staging deploys should block search engines #76

@caesarsol

Description

The solution is easy: add a robots.txt to the build directory.
The problem is being extra sure we add it only in the staging environment, never in production!

We can't be sure about this, since the same yarn build command is used in both environments, and the NODE_ENV variable is apparently not set correctly (@marcofugaro told me so, but is that true?).
The best option seems to be something like:

[[ ! -v GENERATE_SOURCEMAP ]] && [[ -v CI ]] && printf 'User-agent: *\nDisallow: /\n' > build/robots.txt

(Note: printf rather than echo, because plain bash echo without -e writes the \n literally instead of a newline.)
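A minimal sketch of the guard as a standalone step, assuming staging is signalled explicitly (the function name, the "staging"/"production" argument, and the tmp_build directory are all hypothetical, introduced here for illustration; they are not part of the existing build):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper: writes a crawler-blocking robots.txt for staging
# builds, and removes any stale one for production builds.
write_robots() {
  local env="$1" dir="$2"
  mkdir -p "$dir"
  if [[ "$env" == "staging" ]]; then
    # printf interprets \n; plain echo (without -e) would write it literally.
    printf 'User-agent: *\nDisallow: /\n' > "$dir/robots.txt"
  else
    # Never ship a blocking robots.txt to production.
    rm -f "$dir/robots.txt"
  fi
}

write_robots staging tmp_build
cat tmp_build/robots.txt
```

Driving this from an explicit variable set only in the staging CI job would avoid inferring the environment from unrelated flags like GENERATE_SOURCEMAP.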
