ChatGPT’s Digital Memory: When Your Old Life Resurfaces via AI
An investigation into AI data privacy has highlighted a concerning pattern: ChatGPT can surface specific personal information, such as old residential addresses and phone numbers. Although this data was historically available in public phone books, its inclusion in large language model training sets creates new risks for digital stalking and doxxing. The incident demonstrates that even outdated information becomes far easier to retrieve when AI automates the search, raising urgent questions about how developers filter private information from their massive training datasets.