Why This House Race Terrifies the Tech Industry
A New York primary is exposing Silicon Valley’s resistance to meaningful AI oversight
After 34 years in Congress, Rep. Jerry Nadler’s decision not to seek re-election has created one of the most consequential open-seat races in the country. Safe, high-influence districts at the center of American finance and media rarely open—and when they do, national interests quickly converge.
The contest to replace Nadler has quickly clarified its core tension. This race is not primarily about taxes, healthcare, or housing. It is a referendum on the future of governance itself: specifically, whether Congress will meaningfully regulate advanced technology, or allow Silicon Valley to continue writing the rules.
Alex Bores, a New York State Assemblyman and former Palantir engineer running to replace Nadler, has drawn unusually aggressive opposition from AI and venture capital interests. Central to Bores’s campaign is his record on technology governance.
As the author of the RAISE (Responsible AI Safety and Education) Act, a state AI safety law that would require large AI developers to adopt and report safety plans, Bores has become a rare political figure willing to articulate specific guardrails for advanced technologies. That stance—one that mixes technical competence with accountability—has attracted scrutiny and opposition from well-funded industry groups who argue it could constrain innovation.
A new super PAC, Leading the Future, armed with a massive war chest and backed by executives who are simultaneously lobbying Washington for liability protections and public financial backstops, has begun running attack ads against Bores. The alignment is instructive. The same actors pushing for rapid AI deployment are also seeking to offload its legal and economic risks onto the public, privatizing gains while socializing potential failures.
This is not a debate about innovation versus regulation. It is a fight over who bears responsibility when powerful, opaque systems fail.
Congress is not short on opinions about technology; it is short on members who understand it. Most lawmakers rely on industry briefings or staff summaries to interpret systems that now shape labor markets, national security, healthcare, and democratic discourse. That structural knowledge gap has real consequences: weak oversight, regulatory capture, and policy that lags years behind technological reality.
Bores represents a departure from that model. Trained as a computer scientist and experienced in building large-scale software systems, he brings insider familiarity with how advanced technologies are developed, deployed, and, crucially, how they fail. That background does not make him anti-innovation. It makes him skeptical of an industry that insists its products are too complex to regulate but too important not to subsidize.
His legislative record reflects that posture. The RAISE Act applies baseline safety and disclosure requirements to the most powerful AI systems: those expensive enough to train and broad enough in application to pose systemic risk. Rather than banning technology or micromanaging development, it treats frontier AI the way regulators already treat aircraft, pharmaceuticals, and financial instruments: as products whose scale demands pre-deployment testing, risk assessment, and accountability.
This approach explains both Bores's appeal and the intensity of the opposition he faces. For years, the knowledge gap in Congress has functioned as a shield; for technology executives seeking maximum flexibility with minimal liability, a lawmaker who can interrogate technical claims rather than accept them at face value is a genuine threat.
What makes this race nationally relevant is the precedent it may set. A victory for Bores would signal that technical literacy is a political asset, potentially ushering in a new era of risk-aware governance. A defeat would send a grimmer message: that despite the public’s anxiety over AI, policy will remain the exclusive domain of those with the capital to purchase it.
The outcome of this primary will answer a fundamental question: Can Silicon Valley buy its way out of oversight, or will Congress finally acquire the competence to govern it?