Amazon Web Services investigates Perplexity AI for bypassing robots.txt protocol on websites

By: Nastya Bobkova | 29.06.2024, 01:31

Amazon Web Services has launched an investigation into allegations that Perplexity AI's crawlers bypass the Robots Exclusion Protocol on websites.

Here's What We Know

The Robots Exclusion Protocol (robots.txt) is a web standard that site owners use to tell search engine crawlers and other bots which parts of a site they may access. It is an important mechanism for controlling access to content and protecting privacy.
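
For illustration, a well-behaved crawler is expected to check a site's robots.txt before fetching a page. The following minimal Python sketch shows how such a check typically works; the domain and user-agent name are placeholders, not details from the report:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical example: "example.com" and "ExampleBot" are placeholders.
robots = RobotFileParser("https://example.com/robots.txt")
robots.read()  # download and parse the site's robots.txt

user_agent = "ExampleBot"
url = "https://example.com/private/report.html"

# A compliant crawler only fetches the URL if robots.txt allows it.
if robots.can_fetch(user_agent, url):
    print(f"{user_agent} may crawl {url}")
else:
    print(f"{user_agent} is disallowed from crawling {url}")
```

The allegations against Perplexity AI amount to claiming its crawlers skip this kind of check.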

The accusation is that Perplexity AI crawlers hosted on Amazon's servers may not adhere to this standard, potentially allowing data or content to be collected from websites without the owners' permission.

This has come to the attention of AWS, which is now reviewing the allegations to determine whether they have merit and whether Perplexity AI's use of its infrastructure complies with its policies and terms of service.

Source: Wired