OpenAI’s Advisory Board Pushes for Stronger Nonprofit Oversight in AI Development

July 21, 2025

OpenAI’s advisory board is calling for the organisation to keep its nonprofit roots at the forefront of its work. The group argues that when technology with such wide-reaching impacts is developed, it shouldn’t be controlled solely by corporate interests. Their recent report outlines a broader vision: a democratic, people-first approach to building artificial intelligence.

Daniel Zingale, head of OpenAI’s nonprofit commission and an advisor to three California governors, put it plainly: “We think it’s too important to entrust to any one sector, the private sector or even the government sector.” This sentiment emphasises the value of a ‘common sector’ that brings diverse community voices into the decision-making process.

The board’s suggestions aren’t legally binding, but they set a clear framework for how OpenAI should proceed. With voices like labour organiser Dolores Huerta involved, the call is for communities, especially those most affected by AI, to have a real say. It’s a welcome change if you’ve ever felt left in the dark by big tech.

OpenAI started as a nonprofit research lab in 2015 and later transitioned into a for-profit venture now valued at $300 billion—a move that has prompted regulatory scrutiny and legal challenges, including from early backer Elon Musk. Recently, OpenAI announced plans to reshape its structure into a public benefit corporation that balances shareholder interests with its social mission. The nonprofit arm will continue to hold shares, although the precise details are yet to be finalised.

Huerta’s message was clear: AI must serve as a blessing rather than a curse. The board envisions a nonprofit that not only builds cutting-edge technology but also includes and supports a wide range of voices. They stress that the nonprofit’s real impact will be measured by what it builds, whom it includes, and how true it stays to its mission over time.

In conversations with communities across California, many people expressed excitement about the promise of AI—but they also wanted clearer insights into its development and the decision-making behind it. “They know this is profoundly important; they simply want to understand what’s happening, how it’s developed, and who is making the key choices,” Zingale observed.

The commission took a deliberate approach, engaging directly with senior engineers rather than with CEO Sam Altman, to ground its recommendations in practical realities. One prominent proposal is for OpenAI to allocate significant resources to the nonprofit branch, which reported $23 million in assets for 2023. Beyond that, the board calls for measures to address economic imbalances, boost AI literacy, and open up governance to everyday people.

Ultimately, the board is urging OpenAI to be transparent and proactive in involving the public. By establishing dedicated funds (for example, to ease economic pressures in art, theatre, and health) and by keeping a human at the helm of the nonprofit, they believe OpenAI can truly reflect the needs of those it aims to serve.

If you’ve ever struggled to see where the decisions are made in tech, this call for clearer public oversight might feel especially reassuring.
