Encode, a nonprofit organization focused on making AI safe, has filed an amicus brief in support of Elon Musk's motion for a preliminary injunction to halt OpenAI's transition to a for-profit company. The brief, submitted to the U.S. District Court for the Northern District of California, argues that the conversion would "undermine" OpenAI's mission of developing and deploying transformative technology in a way that is safe and beneficial to the public.
OpenAI, founded in 2015 as a nonprofit research lab, has been planning to convert its existing for-profit entity into a Delaware Public Benefit Corporation (PBC). Musk, an early backer of the original nonprofit, filed for an injunction in November to halt the change, accusing OpenAI of abandoning its original philanthropic mission and engaging in anticompetitive practices. OpenAI has dismissed Musk's complaints as "baseless" and a case of sour grapes.
Encode's brief, supported by AI pioneer Geoffrey Hinton and UC Berkeley computer science professor Stuart Russell, argues that OpenAI's conversion would prioritize financial returns over safety and public benefit. Encode's founder and president, Sneha Revanur, said that "[t]he courts must intervene to ensure AI development serves the public interest" and that OpenAI's plans would "internalize the profits [of AI] but externalize the consequences to all of humanity."
The brief highlights several concerns, including that once OpenAI's restructuring is completed, the nonprofit board would no longer be able to cancel investors' equity if needed for safety.
OpenAI's plans have also drawn criticism from current and former employees, some of whom worry the company is prioritizing commercial products over safety. Miles Brundage, a longtime policy researcher who left OpenAI in October, has warned that the nonprofit risks becoming a "side thing" that gives the PBC license to operate as a "normal company" without addressing problematic areas.
The outcome of the legal battle carries significant implications for the AI industry. Meta, Facebook's parent company, has also backed efforts to block OpenAI's conversion, arguing that allowing the shift would have "seismic implications for Silicon Valley."
As the debate over AI safety and governance intensifies, Encode's brief underscores the stakes of keeping public interest and safety at the center of AI development.
Encode, founded in 2020, has been involved in AI policy efforts including California's SB 1047 and the White House's AI Bill of Rights. Its involvement in this case reflects its stated commitment to ensuring that AI development serves the public good.
The case is ongoing, with no clear timeline for a decision, but the court's ruling is likely to shape the future of AI development and its governance.