Encode, the nonprofit that co-sponsored California’s SB 1047 AI safety bill (ultimately vetoed), has requested permission to submit an amicus brief supporting Elon Musk’s effort to halt OpenAI’s conversion to a for-profit entity.
In its filing with the U.S. District Court for the Northern District of California, Encode’s legal team argued that OpenAI’s transition would jeopardize its mission to create transformative and safe technology for public benefit. The brief emphasized the importance of keeping advanced AI controlled by organizations prioritizing safety and public interest, rather than profit-driven entities.
OpenAI was initially established as a nonprofit in 2015 but adopted a hybrid structure to accommodate external investments, including significant backing from Microsoft. It now operates a capped-profit model under nonprofit oversight. However, OpenAI recently announced plans to restructure as a Delaware Public Benefit Corporation (PBC), transferring control from its nonprofit to the PBC, which will issue stock while maintaining its stated mission.
Elon Musk, an early OpenAI backer, asked a court in November to block the transition, accusing the organization of abandoning its original mission and of anticompetitive tactics that harm rivals such as his AI startup, xAI. OpenAI has dismissed Musk’s allegations as baseless.
Meta, Facebook’s parent company, has also opposed OpenAI’s restructuring, citing concerns about the broader implications for Silicon Valley. In a December letter to California’s attorney general, Meta highlighted the potential industry impact of allowing OpenAI’s shift.
Encode’s legal team argued that converting to a PBC would shift OpenAI’s priorities from safety to profit. Obligations the nonprofit currently honors, such as a pledge not to compete with safety-conscious AGI projects, could be diluted in a for-profit structure. Encode also warned that OpenAI’s nonprofit board would lose its authority to cancel investors’ equity for safety reasons.
Concerns about OpenAI’s priorities have grown, with some employees leaving due to fears that commercial interests are overshadowing safety. Former policy researcher Miles Brundage expressed worries about the nonprofit becoming secondary to the PBC’s operations.
Encode’s brief stressed the risks of relinquishing nonprofit control over transformative AI technology, stating that the move would harm public safety and trust. Founded in 2020 by high school student Sneha Revanur, Encode has contributed to various AI policy initiatives, including the White House’s AI Bill of Rights and President Biden’s executive order on AI.