The Broad Way

[ Sharp Mind · Sharp Blade · Sharp Spirit ]

2026-03-20//OPINION

The White House Wants to Regulate AI -- A Developer's Take

I'm sitting in Vitoria, Brazil, 5,000 miles from Washington D.C., reading about the White House preparing to send Congress a comprehensive national AI regulatory framework. And I can already feel it in my codebase.

The focus: child safety, creators' rights, federal vs. state regulation, and tighter export controls on AI chips. People are LITERALLY getting charged for diverting AI hardware to China.

On paper, reasonable. Protect kids. Protect artists. Prevent adversarial nations from getting cutting-edge compute. Who could object?

Me. Not to the goals -- to the inevitable execution.

WHY A BRAZILIAN DEVELOPER CARES ABOUT AMERICAN REGULATION

Your regulation is everyone's regulation. When the EU passed GDPR, every developer on Earth added cookie banners. When Apple changed App Store privacy rules, every mobile dev globally complied. When the US sanctions a chip manufacturer, the supply-chain ripples hit Sao Paulo the same week as San Jose.

Every major AI API, every foundation model, every GPU cloud is headquartered in the US. I didn't get a vote. I don't get a seat at the table. But I'll get the bill.

THE REAL PROBLEM ISN'T LAWMAKER IGNORANCE -- IT'S INCENTIVE MISALIGNMENT

Lawmakers optimize for electability. "We protected children from AI" is a visible win. "We created a nuanced framework balancing innovation with safety while preserving open-source ecosystem dynamics" is not a campaign slogan. The visible wins get teeth. The nuanced stuff gets asterisks.

Child safety will be strict. Good. Creators' rights will get some framework modeled on copyright law that was already outdated when Napster existed. But open-source provisions? Indie developer exemptions? Those will be vague, underfunded, and written by people who think "open source" means "free software."

EXPORT CONTROLS GENUINELY WORRY ME

A restricted chip supply means compute gets more expensive for everyone.
A startup in Belo Horizonte pays more because of a geopolitical chess match it has no part in. The compute gap between big and small widens. Google, Microsoft, Meta -- they have their own chip programs, partially insulated. The rest of us fight over whatever NVIDIA is allowed to sell.

And it pushes innovation underground. China won't stop; it will build parallel ecosystems. Fragmentation is the worst outcome for developers who just want to build things.

THE OPEN SOURCE TIME BOMB

Regulation tends to treat "AI" as a monolith. A framework designed for OpenAI's GPT doesn't make sense for a researcher fine-tuning a 7B model on a single GPU. Compliance requirements that are trivially easy for a company with a legal department become existential threats for open-source maintainers. If the US framework gets the open-source exemptions wrong, indie devs and small teams are dead in the water.

CREATORS' RIGHTS -- OR LIABILITY LAUNDERING

If an AI model was trained on copyrighted data, who owes what? Large providers will negotiate blanket licenses with major publishers. For the rest of us using APIs? We become liability laundering machines. Either API prices go up to cover legal exposure, or every indie dev needs to understand copyright law at a lawyer's level. Neither outcome is good.

MY PLAN

- Bet hard on open-source models. Self-hosted means the most control.
- Keep building for the Brazilian market first -- fallback options exist. Mistral is French, Qwen is Chinese.
- Watch the open-source exemptions like a hawk.
- Ship faster. Regulatory frameworks take years. Build now, while the rules are being written.

The AI horse has left the barn. The models are out there. The weights are on BitTorrent. The knowledge is in every CS department on the planet. You cannot uninvent transformers. Regulation can shape commercial deployment, create liability frameworks, fund safety research, punish bad actors. But it CANNOT stop a developer in Curitiba from downloading an open model and deploying it.

Some regulation IS necessary.
Deepfakes of children need criminal penalties yesterday. Autonomous weapons need hard limits. But the framework needs to be surgically precise, not a blunt instrument that catches every developer in the blast radius.

The question isn't whether regulation is coming. It's whether the people writing it understand the difference between OpenAI and a guy with a GPU in his apartment.

History says they won't.
The Broad Way | Kinho.dev