
"The earlier version could have applied broadly to nearly every AI-powered chatbot or search tool. The amended bill focuses more narrowly on so-called “AI companions”—conversational systems designed to simulate emotional or interpersonal interactions with users. That change does address some of the broadest concerns raised about the original proposal, though some questions about the bill's reach remain. Bottom line: the revised bill still creates serious problems for privacy, online speech, and parental choice."
"The new GUARD Act still requires companies offering AI companions to implement burdensome age-verification systems tied to users' real-world identities. Even parents who specifically want their teenagers to use these systems would still face significant hurdles. A family might decide that a conversational AI tool helps an isolated teenager practice social interaction, or engage in harmless creative roleplay. A parent deployed in the military might set up a persistent AI storyteller for a younger child."
"Under the revised bill, those users could still face mandatory age checks tied to sensitive personal or financial information before they or their children can use these services. The revised bill also leaves important definitions unclear while sharply increasing penalties for developers and companies that get those judgments wrong. Congress narrowed the GUARD Act. But it is still trying to solve a complicated social problem with vague legal standards, heavy liability, and privacy-invasive verification systems."
"The revised GUARD Act still requires companies offering AI companions to verify that users are adults through a “reasonable age verification” system. The bill allows a broader set of verification methods than the earlier version, but the core requirement remains: identity-linked checks before minors can access AI companion services. This approach can force families to provide sensitive information even for low-risk uses like creative roleplay or social practice."
The GUARD Act was narrowed after criticism. The earlier version could have applied to many AI-powered chatbots and search tools. The amended bill targets “AI companions,” conversational systems designed to simulate emotional or interpersonal interactions. The revision reduces some broad concerns, but privacy, online speech, and parental choice issues remain. Companies offering AI companions must implement “reasonable age verification” tied to users’ real-world identities. This can require sensitive personal or financial information even when parents want teenagers to use the tools. Definitions remain unclear, while penalties for developers and companies increase if judgments are wrong. The bill uses vague legal standards, heavy liability, and privacy-invasive verification systems to address a complex social problem.
Read at Electronic Frontier Foundation