ChatGPT Chats Could Be Used Against Users In Court
  • Crypto

  • July 27, 2025
  • Roubens Andy King

OpenAI could be legally required to produce sensitive information and documents that users share with its artificial intelligence chatbot, ChatGPT, warns CEO Sam Altman.

Altman described the privacy gap as a “huge issue” during an interview with podcaster Theo Von last week, noting that, unlike conversations with therapists, lawyers, or doctors, which are protected by legal privilege, conversations with ChatGPT currently have no such protection.

“And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s like legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”

He added that if you talk to ChatGPT about “your most sensitive stuff” and then there is a lawsuit, “we could be required to produce that.”

Altman’s comments come amid increasing use of AI for psychological support and for medical and financial advice.

“I think that’s very screwed up,” Altman said, adding that “we should have like the same concept of privacy for your conversations with AI that we do with a therapist or whatever.”

Sam Altman on This Past Weekend podcast. Source: YouTube

Lack of a legal framework for AI

Altman also stressed the need for a legal policy framework for AI, calling its absence a “huge issue.”

“That’s one of the reasons I get scared sometimes to use certain AI stuff because I don’t know how much personal information I want to put in, because I don’t know who’s going to have it.”

Related: OpenAI ignored experts when it released overly agreeable ChatGPT

He believes there should be the same concept of privacy for AI conversations as exists with therapists or doctors, and policymakers he has spoken with agree this needs to be resolved and requires quick action. 

Broader surveillance concerns 

Altman also expressed concerns about more surveillance coming from the accelerated adoption of AI globally.

“I am worried that the more AI in the world we have, the more surveillance the world is going to want,” he said, as governments will want to make sure people are not using the technology for terrorism or nefarious purposes. 

He said that for this reason, privacy did not have to be absolute, and he was “totally willing to compromise some privacy for collective safety,” but there was a caveat. 

“History is that the government takes that way too far, and I’m really nervous about that.”

Magazine: Growing numbers of users are taking LSD with ChatGPT: AI Eye