Recently, more than 1,000 technology leaders and researchers, including Elon Musk, urged artificial intelligence labs to pause development of the most advanced systems, warning in an open letter that AI tools pose "profound risks to society and humanity." But legislation on AI regulation remains lacking, persuading the broader tech community to agree to a pause will be difficult, and the odds of prompting governments to act quickly are slim.
▲ Image source: CNBC
1. capability overhang

2. mitigate danger/risk

The Future of Life Institute is a nonprofit that seeks to mitigate risks associated with powerful technologies and counts the Musk Foundation as its biggest contributor.
(English excerpt from the Bloomberg UK website, March 29, 2023)

Specifically, the institute focuses on mitigating long-term "existential" risks to humanity such as super intelligent AI. Musk, who has expressed longtermist beliefs, donated $10 million to the institute in 2015.
(English excerpt from the Vice website, March 29, 2023)

3. pump the brakes: to pause, call a halt, hit the brakes

In an open letter Wednesday, Elon Musk and more than 1,000 other technology leaders urged developers to pump the brakes on new artificial technology they feel would pose a risk to society and humanity.
(English excerpt from the News 10 NBC website, March 30, 2023)

▲ Image source: Bloomberg

4. moratorium: a suspension or temporary halt

The letter asks AI labs "to immediately pause for at least six months the training of AI systems more powerful than GPT-4". If such a delay cannot be enacted quickly, governments should step in and institute a moratorium, it says.
(English excerpt from the BBC website, March 30, 2023)

A moratorium of six months or more would give the industry time to set safety standards for AI design and head off potential harms of the riskiest AI technologies, the proponents of a pause said.
(English excerpt from The Wall Street Journal website, March 29, 2023)

▲ Image source: CNN

5. upend professions; job market

These tools have also sparked questions around how AI can upend professions, enable students to cheat, and shift our relationship with technology.
(English excerpt from the CNN website, March 29, 2023)

The billionaire has long warned of the perils of unfettered AI development. He once said artificial intelligence is "far more dangerous" than nuclear warheads. His words have more gravity today, as the rise of ChatGPT threatens to upend the job market with more advanced, human-like writing.
(English excerpt from the CNBC website, March 6, 2023)

▲ Image source: Reuters

6. hiatus: a temporary gap, pause, or interruption

Even though the signatories say they're not calling for a complete pause on AI development in general, they add that experts should use the letter's proposed half-year hiatus to develop and implement shared safety protocols.
(English excerpt from the Cybernews website, March 29, 2023)

Lian Jye Su, an analyst at ABI Research, said the letter shows legitimate concerns among tech leaders over the unregulated usage of AI technologies. But he called parts of the petition "ridiculous", including the premise of asking for a hiatus in AI development beyond GPT-4.
(English excerpt from the CBS website, March 29, 2023)

Reviewed by: Wen Jing, Guo Jia