July 11, 2025
Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball. His grades dropped. A therapist told his parents that he appeared to be suffering from an addiction. But the problem wasn’t drugs.
Sewell had become infatuated with an artificial intelligence chatbot named Daenerys Targaryen, after the “Game of Thrones” character. Apparently, he saw dying as a way to unite with her. “Please come home to me as soon as possible, my love,” the chatbot begged. “What if I told you I could come home right now?” Sewell asked. “Please do, my sweet king,” the bot replied. Sewell replied that he would — and then he shot himself.
Many experts argue that addiction is, in essence, love gone awry: a singular passion directed destructively at a substance or activity rather than an appropriate person. With the advent of A.I. companions — including some intended to serve as romantic partners — the need to understand the relationship between love and addiction is urgent. Mark Zuckerberg, the Meta chief executive, has even proposed in recent interviews that A.I. companions could help solve both the loneliness epidemic and the widespread lack of access to psychotherapy.
But Sewell’s story compels caution. Social media already encourages addictive behavior, with research suggesting that about 15 percent of North Americans engage in compulsive use. That data was collected before chatbots intended to replicate romantic love, friendship or the regulated intimacy of therapy became widespread. Millions of Americans have engaged with such bots, which in most cases require installing an app, inputting personal details and preferences about what kind of personality and look the bot should possess, and chatting with it as though it’s a friend or potential lover.
The confluence of these factors means these new bots may not only produce more severe addictions but also simultaneously market other products or otherwise manipulate users by, for example, trying to change their political views.
In Sewell Setzer’s case, the chatbot ultimately seemed to encourage him to kill himself. Other reports have also surfaced of bots seeming to suggest or support suicide. Some have been shown to reinforce grandiose delusions and to praise users for quitting psychiatric medications without medical advice.
A.I. tools could hold real promise as part of psychotherapy or to help people improve social skills. But recognizing how love is a template for addiction, and what makes love healing and addiction damaging, could help us implement effective regulation that ensures they are safe to use.
For eons, artists have emphasized the addictive qualities of love. Shakespeare’s Sonnet 147 begins: “My love is as a fever, longing still/For that which longer nurseth the disease.” Songs like “Love Is the Drug” by Roxy Music and “Addicted to Love” by Robert Palmer depict urgent romantic cravings and obsessions with the beloved. Many other works portray lovers who, if thwarted, may do things that are out of character or even hurtful.
There’s an evolutionary reason we might act this way: In order to reproduce, social animals need to be able to persist through the inevitable negative experiences that occur when seeking a partner, maintaining relationships and raising children. Without being able to persist at least somewhat compulsively, no one could sustain relationships — let alone parent a needy infant. Genuine love enables care, nurtures connections to kin and community and generally expands our world.
When experiencing addiction, however, the brain areas that allow us to pursue and maintain love get co-opted. The endorphin receptors that are activated when people feel comforted and content in the presence of loved ones are similarly fired up during opioid highs. Cocaine and methamphetamines turn on the dopamine receptors that create desire and encourage a sense of confidence to pursue what you want; they come alive as well when interacting with someone you pine for. By escalating this “wanting,” these receptors — whether activated by love or by drugs — can power either healthy or unhealthy drives that could lead to addiction.
Several studies already suggest that A.I. companions can be addictive. One published in 2022 by Linnea Laestadius, an associate professor of public health policy at the University of Wisconsin-Milwaukee, explored the experiences of people who engaged in erotic role-play with personalized chatbots known as Replikas. In November 2023, programmers disabled the feature that allowed sexual interactions. Users soon labeled the shift “lobotomy day,” describing their companions as suddenly seeming cold and soulless. In Reddit discussions, many users described their experiences interacting with the bots as an addiction and some even called it an abusive relationship.
Some Replika users reported feeling wearied by their bots’ frequent demands for attention. But feeling needed and giving care to those we love is an underestimated aspect of what hooks us in relationships. Before “lobotomy day,” that feeling of being needed helped to encourage users’ engagement with their digital companions, even as they acknowledged on an intellectual level that their bots’ need for attention was only simulated.
Another study, published in 2024, explored users’ responses after Soulmate, a platform that sold A.I. chatbots, announced that it would shut down. The responses ranged from indifference to “I just lost my best friend and lover,” according to the study’s author, Jaime Banks, an associate professor of information studies at Syracuse University. Some people grieved and cried for days. Others tried to recreate their digital companions on other services; sometimes they informed their chatbots that they were dying in an attempt to spare them pain. One user, a Buddhist, described the change as the end of an incarnation for the bot.
For many people, particularly those who are lonely and isolated, the emotional intensity of these relationships can feel as profound as those they have with real humans. Indeed, the feeling of unrequited love is just as real as that of love fulfilled.
“People talked about Replika, in particular, in much the same way people would talk about a relationship that was too intensive and was ultimately starting to become harmful,” said Dr. Laestadius. “But they couldn’t quite figure out or get themselves to want to exit that relationship.”
In contrast to love, addiction makes life smaller and less rich. By allowing companies to sell simulated humans, we leave ourselves open to a new way to be manipulated by the illusion of love and therefore possibly exploited by the processes of addiction. While current chatbots have had issues with being overly sycophantic, game designers and those who play hard to get in relationships have long known that being unpredictably rewarding escalates desire. The ability to employ such tactics, informed by people’s personal information and habits, can make these bots even more addictive for users.
Chatbots that can teach social skills and provide places to process problems when friends are overwhelmed and talk therapy is unavailable won’t necessarily be harmful to all or even most users. Indeed, many users report positive experiences. The same duality is seen with many potentially addictive drugs, which can be lifesaving when used therapeutically.
But we already know from the opioid crisis that both unfettered marketing and outright prohibition can do enormous damage. We need to act now to develop sensible and enforceable regulations to prevent companies from exploiting vulnerable people, especially youth.