Not sure what "digital rights" it "manages"? I don't think it's unreasonable to suggest that the tool shouldn't be set up, out of the box, to DoS the sites it scrapes; that doesn't prevent anyone technical enough to know what they're doing from forking it and removing whatever limits are there by default. I can't see this as a "my computer should do what I want!" issue: if you don't like how the package works, change it or use another.
Indeed, DRM is a very different thing from adhering to standards like `robots.txt` by default (there could still be a documented option to ignore it).
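The "default on, documented opt-out" idea could look something like the sketch below, using Python's standard `urllib.robotparser`. The function name `allowed` and the `respect_robots` flag are hypothetical illustrations, not any real scraper's API:

```python
from urllib import robotparser
from urllib.parse import urlsplit

def allowed(url: str, user_agent: str = "*", respect_robots: bool = True) -> bool:
    """Return True if fetching `url` is permitted.

    Honors robots.txt by default; callers can explicitly opt out,
    which is the documented escape hatch rather than the default.
    """
    if not respect_robots:
        return True  # user deliberately overrode the default
    parts = urlsplit(url)
    rp = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    try:
        rp.read()
    except OSError:
        return True  # robots.txt unreachable: treat as allowed
    return rp.can_fetch(user_agent, url)
```

The point is that the restrictive behavior is a default, not a lock: the override is one keyword argument away, unlike DRM, which is designed to have no override at all.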
He was using DRM as a metaphor for restricted software.
And advocating that software should do whatever the user wants.
If the user is unaware of the harm the software can do, then adding robots.txt support by default is a win for everyone.
But if the user actively doesn't want it, then enforcing it becomes political, in the same way that DRM is political and anti-user.