governments are providing you with safe, clean water, roadways, basic standards of food safety, protection from enemies, disaster prevention and relief, and social services beyond the feasible scope of charity. They may suck at many things, but existential risk (even if just civilizational shift owing to mass-produced disinformation) is a great place to point a government at.
The issue with those things is that we actually know how to measure success for most of them, and they notably work best when the politicians stay the hell out of the process other than setting broad direction.
There's no equivalent for AI research, and a lot of temptation to try and find simple metrics that stand in for "safe" - and it's appealing. "Water should be below X ppm of contaminant" is "easy" to establish. AI should be <blank> to be safe?
What is your definition of what makes an AI system dangerous or safe? How would this be measured or standardized? Not vague generalities, not "well, superintelligence should be regulated". A company owns 100,000 Nvidia GPUs. What are you implementing? What physical actions are the people enforcing this policy going to take? Someone, somewhere will be at a computer entering commands, or writing code, or looking at data. What will it be?
Exactly. I gave examples of actual tech projects undertaken by governments, so an abstract discussion of "government vs. private industry" is not really on point here.
You're attacking a straw man. Where governments do well is establishing a regulatory framework. They don't deliver the clean water. They (should) ensure that the water that's delivered meets a standard.
In every single city I've ever lived in or paid attention to, the water system has been directly operated by employees of the municipal government, not private companies and not even contractors. And the regional projects they buy from tend to be pure government, too.
No, you're attacking something without even reading it.
The "bipartisan coalition" in the Senate that is determined to spend $32 billion (just a down payment, for sure) is not planning to do what you want. They want to hand out money to a lot of universities, NGOs, and bureaucrats. In your terms, they want to deliver the clean water.
A "regulatory framework" would be something like the Dodd-Frank bill, or Sarbanes-Oxley. Maybe that's what you want, but it's not what Schumer is pushing.
Yes, something like SOX (although one can argue that SOX, which was born of Enron, didn't stop the GFC or Madoff).
I think there's also a place for government to perhaps 'accelerate' the market and encourage certain behaviours by creating financial incentives or disincentives. This can be in the form of grants etc. In that case, giving money to researchers in the domain would make sense, on the basis that they are more likely to understand what's technically required than a government public servant.
And to your point regarding bureaucrats: someone has to manage and oversee regulatory frameworks. Voluntary frameworks with no oversight at all are a fig leaf. We can argue about the number/cost, but we shouldn't be arguing about the need, assuming we agree on the need for regulatory frameworks in the first place.
Nobody except governments (plural) has the authority to regulate things. If you think regulation is required, there is no choice but governments to do it. Industries can only regulate themselves if given the power to do so by a government. Otherwise it's just a cartel agreeing not to push too hard under threat of actual regulation (and when one party thinks they can push harder, it falls apart).
As a real example, fully autonomous weapons will be developed and used. And if you don't want them to exist outside of black markets and pariah states, governments need to agree that the development, manufacture, and sale of them is a crime, like chemical weapons and land mines. Granted, I didn't say it would work well, but it's still the best tool we have.