OpenAI CEO Sam Altman has voiced concern over what he sees as growing and unhealthy dependence on ChatGPT, particularly among younger users.
Speaking at a Federal Reserve-hosted banking conference this week, Altman said, “People rely on ChatGPT too much. There’s young people who say things like, ‘I can’t make any decision in my life without telling ChatGPT everything that’s going on. It knows me, it knows my friends. I’m gonna do whatever it says.’ That feels really bad to me.”
He said this kind of over-reliance is especially common among young people. “Even if ChatGPT gives great advice, even if ChatGPT gives way better advice than any human therapist, something about collectively deciding we’re going to live our lives the way AI tells us feels bad and dangerous,” Altman added.
Survey finds half of teens trust AI advice
Altman’s remarks coincide with a recent survey by Common Sense Media, which found that 72 per cent of teenagers had used AI companions at least once. Conducted among 1,060 teens aged 13 to 17 during April and May, the survey also revealed that 52 per cent use such tools at least a few times per month.
Half of the respondents said they trust advice and information from their AI companion at least a little. Stronger trust ran higher among younger teens: 27 per cent of 13 to 14-year-olds expressed confidence in such advice, compared with 20 per cent of teens aged 15 to 17.
How different generations use ChatGPT
Altman had earlier shared insights into how users of different ages interact with ChatGPT. At the Sequoia Capital AI Ascent event, he said, “Gross oversimplification, but like, older people use ChatGPT as a Google replacement,” and added, “Maybe people in their 20s and 30s use it like a life advisor, something.” He went on to say, “And then, like, people in college use it as an operating system. They really do use it like an operating system. They have complex ways to set it up to connect it to a bunch of files, and they have fairly complex prompts memorised in their head or in something where they paste in and out.”
He further explained, “There’s this other thing where they don’t really make life decisions without asking ChatGPT what they should do. It has the full context on every person in their life and what they’ve talked about.”
Privacy concerns: ‘I get scared sometimes’
In a separate conversation on Theo Von’s podcast This Past Weekend, Altman revealed that he himself is wary of AI’s handling of personal data. “I get scared sometimes to use certain AI stuff, because I don’t know how much personal information I want to put in, because I don’t know who’s going to have it,” he said. This was in response to Von asking if AI development should be slowed down.
Altman also admitted that conversations with ChatGPT currently do not have the same legal protections as those with doctors, lawyers or therapists. “People talk about the most personal details in their lives to ChatGPT,” he said. “People use it, young people, especially, use it as a therapist, a life coach; having these relationship problems and asking ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”
He warned that under current legal frameworks, conversations with ChatGPT could be disclosed in court if ordered. “This could create a privacy concern for users in the case of a lawsuit,” Altman said, adding that OpenAI would be legally obliged to provide those records.
“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago,” he added.
Not a therapist yet
Altman’s warning may resonate with users who confide their emotional struggles in ChatGPT. But he urged restraint. “I think it makes sense to really want the privacy clarity before you use ChatGPT a lot, like the legal clarity.”
So while ChatGPT might feel like a trustworthy friend or counsellor, users should know that legally, it is not treated that way. Not yet.