This is a simple robots.txt tester for anyone interested in analyzing robots.txt files, especially SEOs.
Once you specify a set of paths or absolute URLs, it extracts all user-agents (and their respective rules) from the file, runs the checks for every (URL, user-agent) combination, and produces a DataFrame rendered in a Dash DataTable.
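If you want to see the underlying idea, here is a minimal sketch using Python's standard library `urllib.robotparser` and pandas (not the app's actual code; the URLs and user-agents are hypothetical): it checks every (URL, user-agent) combination against a robots.txt file and collects the results in a DataFrame.

```python
from urllib import robotparser

import pandas as pd

robots_url = "https://www.example.com/robots.txt"  # hypothetical robots.txt file
user_agents = ["Googlebot", "Bingbot", "*"]        # hypothetical user-agents to test
urls = [
    "https://www.example.com/",
    "https://www.example.com/private/page",
    "https://www.example.com/blog/post-1",
]

# Fetch and parse the robots.txt file
parser = robotparser.RobotFileParser()
parser.set_url(robots_url)
parser.read()

# Run the check for all (URL, user-agent) combinations
rows = [
    {"user_agent": ua, "url": url, "can_fetch": parser.can_fetch(ua, url)}
    for ua in user_agents
    for url in urls
]

df = pd.DataFrame(rows)
print(df)
```

The app does this kind of check in bulk and presents the resulting table interactively, so you can filter and sort by user-agent, URL, or result.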
It can be useful for testing at scale, before deploying a new robots.txt file, or before publishing a new set of pages.
Source code: https://github.com/eliasdabbas/robotstxt_app (visual app for testing URLs and user-agents blocked by robots.txt files)
Feedback and suggestions are more than welcome!