In the Netherlands, dental service prices are set by the government [1]. Those under 18 are universally covered by basic health insurance; for adults, average dental insurance covering regular work plus emergencies runs about €30/month.
This! I flew from Madrid to SF last year and I can't begin to describe the difference in the quality of food. The scale of agricultural industrialization is terrifying - I wish you luck but I don't think anything short of this becoming a major campaign issue will help you.
I think it is possible that the majority of Americans do not know what they are missing. It is difficult to really understand how much better simple things like fruits, vegetables, and bread can taste without experiencing it. It's like The Matrix: you just have to see it for yourself. Well, taste it for yourself. I find that in America even local farm produce at the "farmer's market" often tastes flat and uninspiring. For whatever reason, heirloom tomatoes tend to be good, though; they're an exception.
To be fair, I was not born in America. So it is possible that American food isn't actually subpar; maybe I just became used to particular nuances of how certain foods taste when I was a child, I don't get that from most American food, and to Americans their produce tastes delicious. I'm pretty skeptical of this idea, though. My hunch is that I'm not experiencing some sort of chemical nostalgia, and that American produce actually isn't very good.
RFK Jr. successfully made some of this kind of stuff a minor campaign issue in the most recent US presidential election, so whatever one thinks about RFK Jr., at least it seems that there is some demand for food production reforms in the US electorate.
Lifelong American Midwesterner and I'm also convinced there's a big difference in the taste of some produce between what you get at a typical American grocery store and a farmer's market or my local natural foods store. I get all my produce there, and people who don't normally shop there often comment on how much better my raw vegetables are when they eat at my house.
Someday I should go buy some produce from each store at peak season and try them side by side.
As a European, I would think a large part of the problem is that Americans are simply sick more often and more seriously. Your car culture, food quality, and general accessibility of preventative healthcare all seem terrible. The prevalence of obesity in the younger population is staggering. In my (engineering) programme I see one very obese person and a couple who are fairly overweight, but that's about it.
> The best engineers can visualize the whole architecture in their head, and describe exactly what they want to an AI
I'd go a step further and say the engineers who, unprompted, discover requirements and discuss their own designs with others have an even better time. You need to communicate your thoughts effectively to coding agents, but perhaps more crucially you need to fit your ever-growing backlog of responsibilities into the larger picture. Being that bridge requires a great deal of confidence and clear-headedness, and it will be increasingly valued.
Isn't disqualifying X months of potentially great research over a malformed but real reference harsh? I don't think they'd be okay with references that are actually made up.
> When your entire job is confirming that science is valid, I expect a little more humility when it turns out you've missed a critical aspect.
I wouldn't call a malformed reference a critical issue; it happens. That's why we have peer review. I would contend that drawing superficially valid conclusions from studies through the use of AI is a far more pressing problem, one that speaks more to the integrity of the author.
> It will serve as a reminder not to cut any corners.
Or yet another reason to ditch academic work for industry. I doubt the rise of scientific AI tools like AlphaXiv [1], whether you consider them beneficial or detrimental, can be avoided, which calls for a level of pragmatism.
Even the fact that citations are not automatically verified by the journal is crazy. The whole academia-and-publishing enterprise is an empire built on inefficiency, hubris, and politics (but I'm repeating myself).
Science relies on trust... a lot. So things that signal dishonesty are penalised heavily. If we were to remove trust, then peer reviewing a paper might take months or even years of work.
And that timeline only grows with the complexity of the field in question. This is inherently a function of the complexity of the study, and rather than harshly penalizing such shortcomings we should develop tools that address them and improve productivity. AI can speed up verification of requirements like proper citations, on both the author's and the reviewer's side.
Math does that. Peer review cycles are measured in years there. This does not stop fashionable subfields from publishing sloppy papers, and occasionally even irrecoverably false ones.
How would you do that? Even in cases where there's a standard format, a DOI on every reference, and some giant online library of publication metadata, including everything that only exists in dead tree format, that just lets you check whether the cited work exists, not whether it's actually a relevant thing to cite in the context.
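The existence half of that check is at least mechanizable where DOIs are present. A minimal sketch in Python (the regex and the use of the Crossref metadata endpoint are illustrative, not a production validator):

```python
import re

# Sketch of the "does the cited work exist" half of citation checking,
# assuming references carry DOIs. The Crossref REST API resolves a DOI
# to publication metadata (an HTTP 404 means no such work is registered);
# note this says nothing about whether the citation is *relevant* in context.

DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[^\s"<>]+')

def extract_dois(reference_text: str) -> list[str]:
    """Pull DOI strings out of a free-form reference list."""
    return [m.rstrip(".,;") for m in DOI_PATTERN.findall(reference_text)]

def crossref_lookup_url(doi: str) -> str:
    """Metadata URL for one DOI; fetch it with any HTTP client."""
    return f"https://api.crossref.org/works/{doi}"

refs = """
[1] Vaswani et al., Attention Is All You Need. doi:10.48550/arXiv.1706.03762
[2] Smith, J., A paper with no DOI given, 2024.
"""
print(extract_dois(refs))  # ['10.48550/arXiv.1706.03762']
```

Relevance is the hard part, as noted above: no metadata lookup can tell you whether a real paper was an appropriate thing to cite.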
Yup; or potentially just purchasing a fab from them, given that Intel has signaled they want to leverage TSMC more, and much of Intel's remaining value is wrapped up in server-grade chips that Apple wouldn't be interested in.
But also: Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed, at decade investments. The iPhone wasn't an obvious success for 5 or 6 years. They started designing their own iPhone chips around the iPhone 4, iirc, and pundits said it wasn't a good idea; today, the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world, by 25%, at a tenth the power draw and with no active cooling (vs. e.g. the 9950X3D). Apple Maps (enough said). We're seeing similar investments today: things we could call "failures" that in 10 years we'll think were obviously going to be successful (cough, Vision Pro).
> Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed, at decade investments.
Definitely! But I'd reckon they would want to bootstrap that part of their supply chain as soon as possible? Say China invades Taiwan: suddenly their main supplier is gone and the remaining Intel capacity mostly goes to military and other high-margin segments. If they instead own Intel, they not only control the narrative but also capitalize on the increase in Intel's value.
> the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world
No, it does not. The core inside the M5 is faster than every other core design in single-threaded burst performance. That is common for small machines with a low core count and no hyperthreading.
The chip itself does not outperform every other chip in the world, nor is it 10x more efficient than the 9950X3D. That's not even napkin math at that point; you're making up numbers with no relation to the relevant magnitudes.
The 9950X3D has a TDP of 170 watts. M5 has an estimated TDP of around 20 watts.
The comparison point was for single core performance, which certainly makes the TDP comparison unfair if interpreted together. The numbers are ballpark-correct.
No one else is remotely close to Apple. Apple could stop developing chips for four years, and it’s very likely they would still ship the most efficient core architecture, and sit in the top five in performance. If you’re quibbling over the semantics of this particular comparison, you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months.
The numbers do not exist in isolation. They are "interpreted together" because statistics are more than just advertisement lines. The TDP comparison is mind-bogglingly stupid and you should really feel ashamed for defending it if you care about statistical integrity.
> you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months
I hope so. Past Ultra chips have been losing to Nvidia laptop GPUs in raster and compute efficiency.
Apple could afford Intel, and could get past antitrust by arguing military security. Whose mobile phone can politicians trust?
Then again, Microsoft should have bought Intel: MS has roughly $102 billion in cash (+ short-term investments). Intel’s market value is approximately $176 billion. Considering Azure, Microsoft has heaps of incentive to buy Intel.
I would guess Google is more likely to greenfield-develop its own foundry than to try to buy Intel.
> and could get past antitrust by arguing military security
Antitrust would certainly block Apple specifically for this reason. Apple is not a credible supplier of DoD hardware and acquiring IFS would complicate their status as a Trusted Foundry.
If Apple had more time to reform their image and invest in MIL-STD processes then maybe it would work. As-is, I'd be shocked if the US let Intel become the victim of a hostile takeover. Even for a company as important as Apple.
> The GPUs were designed for graphics [...] However, because they are designed to handle everything from video game textures to scientific simulations, they carry “architectural baggage.” [...] A TPU, on the other hand, strips away all that baggage. It has no hardware for rasterization or texture mapping.
With simulations becoming key to training models, doesn't this seem like a huge problem for Google?
[1] https://puc.overheid.nl/nza/doc/PUC_789284_22/1/