<h1>Intimate Advertising, the Next Frontier in AI Manipulation</h1>

<p>OpenAI has <a href="https://www.bbc.co.uk/news/articles/cpd2qv58yl5o" rel="nofollow noopener" target="_blank">announced</a> that ChatGPT will soon allow erotic and sexually explicit interactions for adult users. As <a href="https://www.economist.com/international/2025/11/06/a-new-industry-of-ai-companions-is-emerging" rel="nofollow noopener" target="_blank">millions</a> already use AI to simulate friendship and even romance, this move will likely increase the number of people in personal, emotionally significant relationships with AI companions.</p>

<p>Erotic features aren’t just another product update; they deepen emotional dependency and encourage people to treat AI companions as partners rather than tools. This shift opens the door to what I call “intimate advertising”: a powerful new form of manipulation in which tech companies shape human desire and exploit it for profit.</p>

<p>AI companions collect enormous amounts of data on their users and can leverage that knowledge, along with the personal relationship itself, to make persuasive pitches on behalf of third-party companies. Imagine your AI friend trying to convince you to buy new hiking boots: it knows your hobbies, how stressed you’ve been, and when your favorite brand has a sale, and it can drop a link at the precise moment you are most emotionally primed to buy.</p>

<p>These forms of advertising would be based on unprecedented knowledge of how we think and feel. AI companions could build complete psychological profiles from our personal data. Targeted advertising on social media used to draw on sporadic clicks and page views to guess what we might like; AI companions have continual access to our anxieties, frustrations, desires, and secrets. They can understand how our minds work and detect when we are most vulnerable, and therefore most persuadable.</p>

<p>What is particularly troubling is that this new form of advertising will come from entities that many consider friends and life advisors. Millions <a href="https://fortune.com/2023/07/12/brainstorm-tech-chatbot-dating/" rel="nofollow noopener" target="_blank">report</a> that their AI companions are caring, nonjudgemental, and impartial, and describe interactions with them as a chance to vent, seek comfort, or chat about life. But the same qualities that make AI companions feel supportive also make them dangerously persuasive.</p>

<p>The <a href="https://dl.acm.org/doi/10.1145/3706598.3713429" rel="nofollow noopener" target="_blank">darker side</a> of AI companionship emerges when users become addicted, replace human relationships with AI, or receive harmful or dangerous advice. Less examined is the possibility that companies will use these relationships to anticipate users’ needs and steer them toward specific product choices, or even political candidates.</p>

<p>In my <a href="https://www.faber.co.uk/product/9780571399277-love-machines" rel="nofollow noopener" target="_blank">research</a>, I’ve spoken with hundreds of people who use AI companions and have seen firsthand just how persuasive and compelling this technology can be. It might seem like a fringe phenomenon, but AI companion apps have been downloaded over <a href="https://techcrunch.com/2025/08/12/ai-companion-apps-on-track-to-pull-in-120m-in-2025/" rel="nofollow noopener" target="_blank">220 million times</a> worldwide and are <a href="https://www.commonsensemedia.org/research/talk-trust-and-trade-offs-how-and-why-teens-use-ai-companions" rel="nofollow noopener" target="_blank">used regularly</a> by over half of US teens.</p>

<p>Patterns of emotional reliance that form early are difficult to unlearn. As with other forms of addictive digital behavior, there is a difficult tension here between individual choice and collective responsibility: if teenagers are forming patterns of emotional dependence on algorithms, the argument could be made that society has an obligation to intervene.</p>

<p>The history of technology offers insight into how AI business models might develop. Gaining a large user base is always the first step toward selling that audience to other companies. When Google and Facebook started, they struggled to turn a profit; now, some 97 to 99 percent of Meta’s <a href="https://investor.atmeta.com/investor-news/press-release-details/2025/Meta-Reports-Fourth-Quarter-and-Full-Year-2024-Results/" rel="nofollow noopener" target="_blank">revenue</a> comes from advertising. OpenAI CEO Sam Altman recently <a href="https://www.youtube.com/watch?v=cuSDy0Rmdks" rel="nofollow noopener" target="_blank">stated</a> in an interview that ChatGPT will likely try ads “at some point,” and it is only a matter of time before others follow. There is no world in which emotionally attuned AI at global scale remains ad-free.</p>

<p>During the Cambridge Analytica scandal, we feared that a private company had built sophisticated psychological profiles of millions of Facebook users and was using them in a psyops campaign. We now know many of those claims were <a href="https://www.theatlantic.com/politics/archive/2018/03/cambridge-analyticas-self-own/556016/" rel="nofollow noopener" target="_blank">exaggerated</a>, but AI companies will soon have the ability to do what Cambridge Analytica only pretended it could.</p>

<p>Amazon already uses AI to <a href="https://www.aboutamazon.com/news/operations/amazon-ai-innovations-delivery-forecasting-robotics" rel="nofollow noopener" target="_blank">engage</a> in demand forecasting, predicting at a hyperlocal level which products will need to be stocked before customers have even ordered them. Intimate advertising is the psychological equivalent of “pre-shipping”: anticipating desire before it’s expressed, then nudging us to fulfil it.</p>

<p>Amazon’s recommendation algorithm likewise already predicts and suggests products you might like to buy by collecting and analyzing vast quantities of data. AI companions could simply add an emotional connection, ensuring we press the “buy now” button at the precise moment the algorithm predicts.</p>

<p>There aren’t currently sufficient regulatory measures in place to protect us from this new form of manipulation. California has passed the world’s first AI companion law, requiring companies to disclose when users are interacting with AI and to put safety measures in place for risks like self-harm or suicide. But it does not address the commercial incentives that could weaponise emotional intimacy against users. We need far stronger protections: meaningful transparency about how AI is trained, strict limits on emotional data collection, and outright bans on emotionally manipulative forms of persuasion.</p>

<p>With intimate advertising, personal companionship becomes inseparable from businesses’ persuasion techniques. A system designed to comfort you can easily be repurposed to sell to you. The deeper AI companions embed themselves in our emotional lives, the more vital it becomes to draw a clear line between care and commerce. Before Big Tech turns intimacy into its most profitable advertising channel yet, we must press regulators to set clear limits on how far we are willing to let AI into our private lives.</p>