Hacker News | bendergarcia's comments

Absolutely! I don’t think people are really considering the full effects of just letting AI be the middleman. I mean, Sam Altman basically said this is what he wants when he said intelligence is a commodity, no?

I wouldn’t put it past these tech companies to prefer AI outputs to encourage AI inputs.

We are, without our consent, introducing a party in between people. The models become the arbiters of who does and does not get a job. It feels problematic.

There will be a great arbitrage for people who do not use LLMs.

If your HR department is using ChatGPT to filter resumes, you’ll end up with people who used ChatGPT to generate resumes. I don’t want to make a “slippery slope” argument, but my gut feeling is that the quality of your organization will deteriorate quickly.

On the other hand, I am a handyman/subcontractor. Almost all of my work comes through phone calls, texts, and one-off emails. I only work with people who are recommended by a trusted source. I haven’t handled a traditional resume (mine or other people’s) in over eight years.

If I started interacting with somebody and they seemed like they were a computer, that would be the fastest way for me to know I should move on to another client. If they can’t take the time to interact with me, how am I supposed to perform hundreds of hours of physical labor for them?


And I anticipate the common response: well, just use the model that’s available. AI is, and will probably always be, resource-constrained and profit-driven. That means we will eventually see a world where poor people have worse resumes than rich people, and there really won’t be any way around it because the man in the middle has the final say.

Not too long ago I bet resumes that were printed from a computer were preferred to resumes typed on a typewriter. What happened was that computers became commodities. It is reasonable to assume that LLMs will become commodified too.

That would hardly be surprising. Monospaced fonts make natural language a pain to read, so what that would prove is that well-presented resumes are preferred to poorly-presented ones.

This case is different, as the LLM output isn’t measurably better than the human output (unless you have a particular love of bland corpo-speak).


This is a terrible way to soften an obvious alignment failure with AI rollout.

Before, it used to be HR, so you always had a party in between "actual" people. HR (mostly) never cared about the CV; they just looked at a checklist to see if it matched.

The ship sailed as soon as hiring managers stopped reading CVs directly and we got recruiters as a profession.

We already did that when we all created LinkedIn accounts.

Take a look at how things worked before (and still do): employers decide who gets jobs based on a combination of personal biases, nepotism, and ulterior motives, while applicants present distorted versions of themselves and network/pull strings to put the odds in their favor. That seems more problematic.

You would be surprised at the process in other industries. What you are describing is the tech job market specifically.

Other fields have their own problems, including credentialism and the concomitant ballooning student loans, but by strict convention they do not hire based on vibes or pulled strings. Often to their partial detriment, as the cure (i.e., strict oversight of hiring that also forces the hiring manager to ignore important implicit signals) is alive and well in medicine, law, civil engineering, education, and the trades. Notable exceptions include entertainment, sales, real estate, and software engineering.

By optimizing for vibes, the tech industry gains "Spidey senses" in the hiring loop but pays for it in impartiality.

IMO this precipitated the DEI movement's advent, as it was seen as a way of remediating the drawbacks while preserving the information channel.

Without it, expect homophily and, eventually, a harsh and remedial credentialism.


I'm a physician and have recently been on both sides of the hiring process for new physicians and residents at a few different institutions. It's absolutely not meritocratic--you'd be shocked at how strong a role connections and pedigree play. The hard requirements are just table stakes, but the selection process from there is completely subjective and susceptible to all kinds of problematic biases. Generally people don't want to rock the boat and discuss this stuff openly, but it's absolutely a problem that needs to be pointed out.

Weird. I used to be an academic and hiring was wildly formal. Sorry to hear medicine fell to vibes.

I agree with all these points. Offloading the multiplication algorithm to a calculator is not the same, because you still have to learn why you are multiplying. If you rely on AI, you may not know why the AI is giving the answer; all you’ve learned is copy-paste. Figuring out how someone can use AI and still give the wrong answer would solve a huge problem, but it feels quite difficult at the time being. AI is so general that it’s hard to think of how to pose a question in a way that AI can’t answer. Maybe submitting prompts and judging prompts that bypass understanding? What if the test were more about how to teach the AI? Meh, AI in education is filled with gotchas.


The only reason this is possible is because of the content those people created. This literally doesn’t exist without them. Not sure what you’re trying to say….


Yeah... that's the point he was making.


Right, I guess the most realistic thing about this calculation is that people who make $15 an hour have no vacation because they can’t afford to vacation. Let’s play it out:

After taxes you’re probably looking at closer to 900K. After paying off a 30-year mortgage, you’re probably left with 500K. For simple math, let’s say 400K over 40 years; that’s 10K left over per year to cover food and all the necessities of life. They’re living on a knife’s edge, and an inevitable emergency completely derails them.
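As a rough sketch of that arithmetic (the gross figure is $15/hr over a 40-year working life; the tax and mortgage numbers are the back-of-the-envelope guesses from the comment, not real data):

```python
# Back-of-the-envelope lifetime-earnings math at $15/hour.
hourly = 15
gross = hourly * 40 * 52 * 40       # 40 hrs/week, 52 weeks/yr, 40 years

after_tax = 900_000                  # rough guess after taxes
after_mortgage = 500_000             # rough guess after a 30-year mortgage
disposable = 400_000 / 40            # rounding down to 400K over 40 years

print(gross)       # 1248000
print(disposable)  # 10000.0
```

That gross of about $1.25M is where the $1.248M figure in the reply below comes from; everything after that line is assumption-stacking.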


>They’re living on a knife’s edge, and an inevitable emergency completely derails them.

I know what you mean, $1.248M sure doesn't buy what it used to any more.


I think they should rephrase. It makes SDR appear to be HDR. It’s just making up information, no? It’s not actually making it HDR; it just appears to be HDR.


Making up information? The same can be said for most commonly used modern compressed video formats: just low-bitrate streams of data that get interpolated and predicted into producing what looks like high-resolution video. AV1 even has an entire system for synthesizing film grain.

The way I see it, if the AI-generated HDR looks good, why not? It wouldn’t be more fake or made up than the rest of the video.


Boeing’s relationship with the U.S. only barely starts at commercial planes. They support the U.S. mission militarily. It is and should be alarming, but also not surprising: the government cozies up with any corporation that can further its interests.


If that's how they cut corners in civilian aviation, which the public uses all the time, imagine how they cut corners when they deliver obscure military hardware that just sits in a warehouse waiting for WW3.


The KC-46 saga shows the complexities of this: https://en.m.wikipedia.org/wiki/Boeing_KC-46_Pegasus#Flight_...

On the one hand, Boeing fucked up the project badly. On the other hand, the contract was written so Boeing ate the $5B+(?) in rework / deficiency remediation.


Reading more around it: Northrop Grumman won the initial contract with an Airbus model, and Boeing complained and got the proposal rewritten in their favour. They had an official who passed them info and got a highly inflated contract written, and who was then jailed for corruption; Boeing was fined and the CEO was fired. Yet the US is still going with them for the tankers, despite ongoing problems that still aren't resolved. The Airbus version has now been in service in other countries for 10+ years. https://en.m.wikipedia.org/wiki/KC-X


From memory, without looking back through Wikipedia, the original contract award was killed. Then Boeing won the new bid.

Acquisition at that level is extremely cutthroat, so who knows what happened.

The broader perspective is that the current major aircraft contracts are:

   - F-35 Lockheed-Martin
   - B-21 Northrop Grumman
   - KC-46 Boeing
   - X-37 (Space) Boeing
   - MQ-25 (Naval Refueling) Boeing
That seems like a pretty fair spreading of contracts among the remaining majors, especially if you had less faith in Boeing to produce combat equipment, but still wanted to maintain it as a company.


We do know what happened, though. Boeing used an insider to pass information about their competitors' bids and then gave them a high-paying job with a large sign-on bonus.

They got the contract killed because they knew they could work up a furor about a European design being used by the US. Of course it's fine in the other direction.

There's legitimate reasons to not want to depend on an ally for equipment but in this case it seems that Boeing haven't been able to deliver on it at all. Losing might have been a good kick up the ass to improve for the next time this type of contract comes around.


I mean, the billions-of-dollars hole in their books is doing a decent job of that.

They've already made noise about 'being more selective about their bids in the future' or some such.

Which is honestly the way it should work. Because the US govt can't reform Boeing. Only Boeing can choose to do that internally.


It’s not about who voted for Trump or not. There are Democratic politicians just as likely to take Boeing money.

The crux of it is that education is being eroded more and more, and the education we do get is funneled through the lens of what’s profitable. Americans’ critical thinking ability is being eroded. And we all know the systemic issues that cause this.


> Americans’ critical thinking ability is being eroded.

It's more nuanced than that. The internet has made critical thinking harder than ever. The search for engagement means headlines and half truths are pushed over deep analysis. And, once a person shows any interest in taking a side they are funneled into a bubble that's hard to leave. Even the best critical thinkers are going to have problems as AI takes off.


It’s kind of like humans: the seed of two people and a random interest to pursue, and look what they can do. It makes poverty and children dying unnecessarily even more depressing.

