Key Figure Behind U.S. AI Safety Institute Steps Down

Elizabeth Kelly, the inaugural director of the U.S. AI Safety Institute, has resigned. Her departure stirs uncertainty about the institute's future direction under the Trump administration. During her tenure, Kelly forged significant partnerships with AI firms to enhance model safety evaluation.

The U.S. AI Safety Institute faces a pivotal moment as its first director, Elizabeth Kelly, has announced her resignation. Announcing the decision on LinkedIn, Kelly said she would step down on Wednesday, casting uncertainty over the institute's future direction under President Donald Trump.

Kelly took charge a year ago, guiding the institute in its foundational work of identifying and mitigating risks posed by advanced AI systems. Under her leadership, the institute struck agreements with AI companies including OpenAI and Anthropic that allowed it to test AI models before their public release. Kelly also expanded the institute's reach by building collaborations with AI safety organizations abroad.

Established during former President Joe Biden's administration, the institute operates within the U.S. Commerce Department's National Institute of Standards and Technology. Since taking office on January 20, President Trump has revoked Biden's 2023 executive order on AI, and his administration has not disclosed its plans for the institute, leaving its trajectory in question.

In her LinkedIn message, Kelly expressed optimism: "I am confident that AISI's future is bright and its mission remains vital to the future of AI innovation." Kelly did not respond to requests for further comment.

Published At: Feb. 7, 2025, 10:28 a.m.
Original Source: U.S. AI Safety Institute director leaves role (Author: Jeffrey Dastin)
Note: This publication was rewritten using AI. The content was based on the original source linked above.