The bipartisan bill builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which created a similar right of action for victims of nonconsensually shared real (non-faked) explicit images. In a summary, the sponsors described the bill as a response to an "exponentially" growing volume of digitally manipulated explicit AI images, citing Taylor Swift's case as an example of how the fakes can be "used to exploit and harass women — particularly public figures, politicians, and celebrities."
Source: Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy