Tony Kim
Mar 06, 2026 11:13
Claude Opus 4.6 discovered 14 high-severity Firefox bugs, nearly a fifth of all critical vulnerabilities fixed in 2025. Mozilla shipped fixes to hundreds of millions of users.
Anthropic’s Claude Opus 4.6 identified 22 security vulnerabilities in Mozilla Firefox over a two-week period, 14 of them classified as high-severity, representing nearly a fifth of all critical Firefox bugs remediated throughout 2025. The findings have already been patched in Firefox 148.0, protecting hundreds of millions of users.
The collaboration marks a significant milestone in AI-assisted security research. Within twenty minutes of initial exploration, Claude found a use-after-free vulnerability in Firefox’s JavaScript engine, a memory flaw that could allow attackers to inject malicious code. By the time Anthropic researchers validated and submitted that first bug, the AI had already flagged fifty more unique crashing inputs.
Speed That Human Researchers Can’t Match
Anthropic scanned nearly 6,000 C++ files and submitted 112 unique reports to Mozilla’s Bugzilla tracker. The company chose Firefox specifically because it is one of the most rigorously tested open-source projects in existence, making it a tougher benchmark than typical targets.
“Browser vulnerabilities are particularly dangerous because users routinely encounter untrusted content and depend on the browser to keep them safe,” Anthropic noted in their announcement. The JavaScript engine presented an especially critical attack surface because it processes external code every time someone browses the web.
Mozilla’s security team adapted their processes mid-collaboration, eventually encouraging Anthropic to submit findings in bulk without manually validating each one. Most issues shipped fixes in Firefox 148, with remaining patches coming in future releases.
The Exploitation Gap, for Now
This is where it gets uncomfortable. Anthropic also tested whether Claude could actually exploit the bugs it discovered. After spending roughly $4,000 in API credits across several hundred attempts, Opus 4.6 successfully developed working exploits in two cases: crude ones that only functioned in test environments with security protections disabled, but functional nonetheless.
The AI proved far better at finding vulnerabilities than weaponizing them. That is good news for defenders, but Anthropic isn’t sugarcoating the trajectory: “Looking at the rate of progress, it is unlikely that the gap between frontier models’ vulnerability discovery and exploitation abilities will last very long.”
What This Means for the Industry
The partnership comes amid Mozilla’s broader push to counter AI industry giants. In late January 2026, Mozilla announced plans to deploy roughly $1.4 billion through Mozilla Ventures to fund AI startups focused on safety and transparency, positioning itself as a “rebel alliance” against closed-source AI dominance. Mozilla Ventures has already backed over 55 companies since launching in 2022.
Anthropic, meanwhile, closed a $30 billion Series G round in February 2026 at a $380 billion valuation, giving it substantial resources to expand cybersecurity initiatives. The company has already used Claude to discover vulnerabilities in other major projects, including the Linux kernel.
For developers, the message is blunt: the window in which AI finds bugs faster than it exploits them won’t stay open indefinitely. Anthropic plans to expand its security work significantly, including direct outreach to open-source maintainers and a new Claude Code Security tool currently in limited preview. They are also hiring security researchers to scale these efforts.
Mozilla engineers have started experimenting with Claude internally for their own security testing, a telling sign of where browser security is headed.
Image source: Shutterstock