Hacker News

To me, the key quote is the simple "If you had told me in late 2022 I’d be saying these things 3 years later, I would’ve been pretty surprised." As someone with little exposure to the design industry, seeing how quickly AI could generate images, I'd been under the assumption that the AI takeover was already well underway there, so was surprised to learn that it's not.

If anything, that gives some comfort about the future of engineering job prospects. There's still room to worry ("yeah, but design is fundamentally human, while engineering is mostly technical and can be automated"), but I'm sure that, just as the design industry has realized, when we get to the point where AI should be taking over, we'll discover there's a lot of non-technical work that engineers do which AI cannot replace.

Basically, if replacing a workforce is the goal, AI image generators and code generators look like replacement technologies from afar, but when you look closer you realize they're "the right solution to the wrong problem" and don't really move the needle as true replacement tech. And maybe AI as a whole, by virtue of being artificial intelligence (as opposed to real common sense), is fundamentally an approach that "solves the wrong problem" as a replacement tech, even if AGI or even ASI gets created.



That quote stood out to me as well, but mostly because the 3 images shown by the author have nothing to do with product/interface/communications design.

I guess they're vaguely cool-looking images? If the author had used them to argue that "concept art" in games/movies was going to get upended by AI, there would be a point there. But as it stands, I find it very puzzling that someone who claims to teach design would use them as key examples of why design - a human process of coming up with specific solutions to fuzzy problems with arbitrary constraints - was headed in any particular direction.


I think there's some benefit of hindsight in that perspective, though. I can imagine how, at the time, you saw the advancement and it wasn't obvious what the barriers to an AI takeover were. It's similar to software now: plenty of SWEs have a nagging feeling about AI encroachment. But in all likelihood it'll eventually become clear that most SWE work involves coordinating with other teams, planning incremental delivery, running various testing and review phases, working with CS when users face issues, etc. The boundaries will be a lot clearer, and looking back, the current FUD over a better autocomplete will seem ridiculous. (At least I hope so!)



