The robots.txt file defines which pages of a website search engines are allowed to crawl. Attackers can reference the robots.txt file in an HTTP request to obtain its contents, achieving a local file inclusion attack that results in information leakage. This rule checks for robots.txt in HTTP requests to prevent such attacks. This rule helps defend against A3: Injection of the OWASP Top 10 - 2021. Other reference: None
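As an illustration only (not the rule's actual implementation), a minimal sketch of such a check might inspect the decoded request path for robots.txt, including percent-encoded variants. The function name, pattern, and handling below are assumptions for demonstration purposes.

```python
import re
from urllib.parse import urlparse, unquote

# Hypothetical pattern: flag requests whose path resolves to robots.txt,
# case-insensitively, after percent-decoding.
ROBOTS_PATTERN = re.compile(r"(^|/)robots\.txt$", re.IGNORECASE)

def is_robots_txt_request(request_uri: str) -> bool:
    """Return True if the decoded request path targets robots.txt."""
    path = urlparse(request_uri).path
    # Decode percent-encoding so "/robots%2Etxt" is also detected.
    path = unquote(path)
    return bool(ROBOTS_PATTERN.search(path))

# Example usage: a request matching the pattern would be logged or blocked
# according to the action configured for the rule.
if __name__ == "__main__":
    for uri in ("/robots.txt", "/robots%2Etxt?x=1", "/index.html"):
        print(uri, "->", "matched" if is_robots_txt_request(uri) else "allowed")
```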