
"Privacy experts are already sounding the alarm about the potential harms of saying too much to your chatbot. The underlying concern is that no one is entirely sure how your personal information, whether sensitive or seemingly innocuous, could be used in the future."
"Some fear personal data could end up in a mass surveillance system or be used in other unforeseen ways that will ultimately harm or disadvantage you. That ambiguity, they argue, is reason enough for caution."
"One step you can take to make your ChatGPT experience more secure is to stop OpenAI from using your information to train its models. Security experts are voicing concern that if your data ends up in a model, it could one day be used in a way we can't even anticipate right now."
As ChatGPT becomes integral to daily life for many, users should reconsider what personal information they share with it. Privacy experts warn that even seemingly harmless details carry risk, since how this data will be used in the future remains uncertain; concerns include potential misuse in mass surveillance systems. To enhance privacy, users can opt out of having their data used for model training through their account settings. This proactive step helps mitigate the risks of personal data exposure.
Read at ZDNET