Many websites have a robots.txt, a plain-text file at the site root that tells search-engine crawlers which files and folders to ignore. Security.txt is a proposed standard that does the same for security policies.
"When security risks in web services are discovered by independent security researchers who understand the severity of the risk, they often lack the channels to disclose them properly. As a result, security issues may be left unreported. security.txt defines a standard to help organizations define the process for security researchers to disclose security vulnerabilities securely."
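As proposed, the file is a set of simple key-value lines served from a well-known path on the site (the draft suggests /.well-known/security.txt). A minimal example might look like the following, where the addresses and URLs are placeholders, not part of the spec itself:

```
Contact: mailto:security@example.com
Encryption: https://example.com/pgp-key.txt
Policy: https://example.com/security-policy
Preferred-Languages: en
```

A researcher who finds a vulnerability can then fetch that one file and know exactly whom to contact and how, rather than guessing at an email address or hunting for a disclosure page.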
I'm reminded of the Curator's Code, a proposed standard for crediting blog-post sources—a letter of the law so likely to be brought to bear against its spirit that everyone in the business of giving credit knew instinctively never to use it.