What is the smartest way to handle robots.txt in Express?

Issue

I’m currently working on an application built with Express (Node.js), and I want to know the smartest way to serve a different robots.txt depending on the environment (development, production).

This is what I have right now, but I’m not convinced by the solution; it feels dirty:

app.get '/robots.txt', (req, res) ->
  res.set 'Content-Type', 'text/plain'
  if app.settings.env == 'production'
    res.send 'User-agent: *\nDisallow: /signin\nDisallow: /signup\nDisallow: /signout\nSitemap: /sitemap.xml'
  else
    res.send 'User-agent: *\nDisallow: /'

(NB: it is CoffeeScript)

There should be a better way. How would you do it?

Thank you.

Solution

Use a middleware function and register it first. That way robots.txt is handled before any session, cookieParser, etc. middleware ever runs:

app.use('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send('User-agent: *\nDisallow: /');
});
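For the ordering to matter, the robots.txt handler has to be registered before the heavier middleware. A minimal sketch of that layout (the express-session usage here is an assumption for illustration, not part of the original answer):

const express = require('express');
const session = require('express-session');

const app = express();

// Registered first: robots.txt requests are answered here and never
// reach the middleware below.
app.use('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send('User-agent: *\nDisallow: /');
});

// Session handling (and anything else) only runs for the remaining routes.
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.listen(3000);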

With Express 4, app.get handlers run in the order they are registered, just like middleware, so you can simply use that instead:

app.get('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send("User-agent: *\nDisallow: /");
});
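Neither snippet above handles the per-environment part of the question. One tidy option, sketched here as an assumption rather than taken from the answer, is to build the body once at startup from app.get('env') (which reads NODE_ENV and defaults to 'development'), so the route handler itself stays trivial:

// Decide the robots.txt body once, at startup.
// Note: the Sitemap directive formally expects an absolute URL;
// the relative path is kept from the question's original snippet.
var robotsTxt = app.get('env') === 'production'
    ? ['User-agent: *',
       'Disallow: /signin',
       'Disallow: /signup',
       'Disallow: /signout',
       'Sitemap: /sitemap.xml'].join('\n')
    : 'User-agent: *\nDisallow: /';

app.get('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send(robotsTxt);
});

Two static files (say robots.production.txt and robots.development.txt, hypothetical names) served via res.sendFile would also work, but precomputing the string avoids per-request branching and file I/O.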

Answered By – SystemParadox

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
