OpenAI CEO Sam Altman has issued a warning to users who treat ChatGPT like a therapist or a close friend. According to Altman, conversations with ChatGPT are not protected under doctor-patient or attorney-client privilege and may be legally disclosed if requested by a court.
Millions of users, especially young people, are turning to ChatGPT to talk about personal issues, relationship struggles, or major life decisions. But in a recent appearance on Theo Von’s podcast, Altman pointed to a critical legal gap: “People are sharing the most personal things in their lives with ChatGPT. But right now, the legal protections that apply when you talk to a therapist or lawyer do not apply to ChatGPT.”
Altman emphasized that if a court requests access to user conversations, OpenAI could be legally compelled to provide them. He described this situation as “distorted” and advocated for extending to AI interactions the same confidentiality protections that apply to conversations with doctors, lawyers, and therapists.

This issue has become even more pressing as OpenAI is currently involved in a lawsuit with The New York Times, in which the court may demand user conversations as evidence. In an official statement, OpenAI called such a demand “overreaching” and warned that a ruling in favor of disclosure could open the door to similar requests in the future.
These developments have reignited debates over digital privacy, especially in the United States. Following the overturning of Roe v. Wade, many women began turning to more secure apps to protect their data. Likewise, users are now being urged to think more carefully about what they share with AI tools like ChatGPT.