Every time you ask a chatbot a question, you assume the conversation stays between you and the machine. That trust is now under pressure. An investigation by PCMag found that Profound, a New York analytics startup, is selling access to anonymised records of user prompts from major AI tools such as ChatGPT, Google Gemini and Anthropic's Claude. What users typed, often in private moments, is being packaged and sold to marketers looking to spot trending interests before they surface in search engines.
Profound’s product, named Prompt Volumes, aggregates chatbot data for commercial clients. While the company says all identifiable names and personal details are removed, privacy advocates warn the dataset still reveals deeply personal topics: medical, financial and relationship matters, questions users asked believing no one else was watching.
Marketing consultant Lee Dryburgh recently warned that browser extensions may secretly funnel chatbot conversations to third-party firms. “AI chats are not casual searches,” he wrote. “They’re confessions.” Profound replied with a cease-and-desist letter accusing him of brand damage, a reaction that drew even more scrutiny.
According to Profound, it never collects data directly from chatbots. Instead, it says it licenses “opt-in consumer panels” via established providers (e.g., a subsidiary of Semrush). That model is widely used in advertising analytics. But to privacy experts, such explanations sound too tidy. The Electronic Frontier Foundation (EFF) argues that even anonymised data can often be traced back to individuals when linked with demographics or regional tags.
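The EFF's linkage argument can be sketched in a few lines of Python. The idea: even with names stripped, a prompt record that keeps quasi-identifiers (region, age band, device) can be joined against an outside demographic list, and a unique match re-identifies the person. All names, fields and records below are invented for illustration; this is not Profound's actual data schema.

```python
# Hypothetical sketch of a linkage (re-identification) attack.
# An "anonymised" prompt record still carries quasi-identifiers that,
# matched against a separate public or commercial dataset, can narrow
# down to one individual. Every record here is fabricated.

anonymised_prompts = [
    {"region": "10115", "age_band": "30-39", "device": "iOS",
     "prompt_topic": "debt consolidation"},
    {"region": "10115", "age_band": "60-69", "device": "Android",
     "prompt_topic": "heart medication dosage"},
]

# A separate, non-secret dataset (marketing list, voter roll, ...).
demographics = [
    {"name": "A. Example", "region": "10115", "age_band": "30-39", "device": "iOS"},
    {"name": "B. Sample",  "region": "10115", "age_band": "60-69", "device": "Android"},
    {"name": "C. Person",  "region": "20095", "age_band": "30-39", "device": "iOS"},
]

QUASI_IDENTIFIERS = ("region", "age_band", "device")

def link(record, population):
    """Return every person whose quasi-identifiers match the record."""
    key = tuple(record[q] for q in QUASI_IDENTIFIERS)
    return [p for p in population
            if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]

for rec in anonymised_prompts:
    matches = link(rec, demographics)
    if len(matches) == 1:  # a unique match means the record is re-identified
        print(f"{matches[0]['name']} likely asked about {rec['prompt_topic']!r}")
```

In this toy population, both "anonymised" records match exactly one person, which is the core of the EFF's warning: anonymisation that leaves quasi-identifiers intact is only as strong as the rarity of those attribute combinations.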
Another concern: security researchers at Georgia Tech found several browser extensions in the Chrome Web Store that, with permissions to read website data, could extract full ChatGPT sessions, prompts and responses included. Some users may “opt in” to data sharing without realising what they’re agreeing to.
Profound defends itself, saying its data supply chain is legal and complies with privacy laws such as the GDPR and CCPA. Yet the opacity of how opt-in consent works makes it hard for users to know whether their prompts end up in these commercial panels.
What emerges is a quiet market built on people’s curiosity and trust. Chatbots have become safe spaces for users, who confide questions they might not ask elsewhere. Marketers now view these confessions as data points. The arrangement may be legal, but it brushes against the spirit of privacy protection.
The ethical issue shifts from who collects data to who interprets it and for what purpose. When questions about mental health or relationships become trend metrics for marketers, the line between research and exploitation grows thin.
Until transparency catches up, here’s the simple advice: treat your chatbot like an open forum, not a diary. Disable unnecessary extensions, use private mode, and assume someone, somewhere might be listening.