HARRISBURG, PA – With artificial intelligence companionship apps rapidly growing in popularity, Representative Melissa Shusterman is preparing legislation to require new safety and transparency standards for AI programs that simulate personal or emotional relationships.
The Chester County Democrat said her bill responds to rising concerns about the “loneliness epidemic,” as roughly half of Americans report experiencing chronic loneliness. Some AI companionship platforms now report more than 20 million active users each month.
Shusterman’s proposal would require AI companion applications to include built-in safety features that activate during extended sessions, reminding users that they are interacting with a program rather than a person.
The legislation would also mandate safety protocols when users express suicidal thoughts or self-harm ideation, ensuring the system provides immediate crisis information and support.
Addressing risks of emotional dependency and addiction
Developers of AI companionship tools have marketed them as social and emotional supports for isolated users. However, experts warn that prolonged use can create dependency and distort perceptions of real human relationships.
Shusterman said the goal is not to restrict innovation but to protect users from psychological harm by promoting responsible AI design. “As AI becomes more human-like, we must safeguard mental health and ensure users understand they are communicating with a program, not a person,” she said in a memo announcing the legislation.
Promoting transparency and responsible AI practices
Under the proposal, AI companionship platforms operating in Pennsylvania would be required to:
- Notify users regularly that the AI is not a human being;
- Implement session time alerts when conversations continue for extended periods;
- Deploy automated safety responses if users express distress or suicidal ideation; and
- Maintain clear data policies about user interactions.
Supporters say the measure would establish Pennsylvania as one of the first states to regulate emotional AI applications, balancing technological innovation with consumer protection and ethical standards.
Key Points
- Rep. Melissa Shusterman plans legislation to regulate AI companionship apps amid growing mental health concerns.
- Bill would require time-use warnings, crisis-response features, and transparency about AI’s non-human identity.
- Measure aims to address emotional dependency risks while encouraging safe, responsible AI development.