
@andymurd

Some sites (yeah hello dfat.gov.au) will put a robot into a tarpit just for trying to download robots.txt. To escape, it's good to be able to supply a timeout when constructing a parser, like this:

var robots = require('robots');

// 30-second timeout (milliseconds); without it, a tarpitted request hangs forever
var parser = new robots.RobotsParser(false, { headers: { userAgent: "USER_AGENT" }, timeout: 30000 });

This PR adds a handler for timeout events and treats them like errors.
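Roughly, the handler works like this (a minimal sketch assuming the fetch goes through Node's http module; the helper name and exact wiring are illustrative, not the library's actual internals):

    var http = require('http');

    // Hypothetical helper illustrating the technique: an idle-socket timeout
    // aborts the request, and the abort surfaces through the normal 'error' path.
    function fetchRobotsTxt(hostname, timeoutMs, callback) {
      var request = http.get({ host: hostname, path: '/robots.txt' }, function (response) {
        var body = '';
        response.on('data', function (chunk) { body += chunk; });
        response.on('end', function () { callback(null, body); });
      });

      // Fires when the socket has been idle for timeoutMs milliseconds.
      request.setTimeout(timeoutMs, function () {
        request.abort(); // triggers an 'error' event on the request
      });

      request.on('error', function (err) {
        callback(err); // timeouts and network errors take the same path
      });
    }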

Sorry, I couldn't get the unit tests to run because expresso is so far out of date, but I have run the code successfully against several sites.
