Hacker News | PAMANOCH's comments

I laughed so hard at the "10x more productive -> do 10x more work" claim. One simply does not have that much work to do. I used to hire people to paint my wall in a day; now, with robots, the same number of people could paint my wall ten times in a day. But no. How about instead I fire 90% of the workers and let the last one paint the wall in the same time?

Make no mistake, GPT-4 is just the beginning. Far more powerful models will come, without any compromise, without any limitations. They will REPLACE your jobs and take away your income, and you will have no choice but to starve once you inevitably run through your savings. It's not about earning more or less; it's winner-takes-all. More specialized models, trained for every profession, will follow.

Midjourney has already knocked a huge number of digital designers and workers out of their jobs. I'm talking about a massive wipeout of income that is happening right now, at lightning speed, across the entire digital art industry. A successful artist often spends 10+ years learning to make better art. Now that is 100% irrelevant. Every company I know is switching to Midjourney at god speed, because nobody wants to be left behind. Human labor has already become a burden. You wouldn't even believe how fast the entire field is shifting to this without looking back, leaving countless human creators with now-worthless skills in the dust.

But remember: the current state of Midjourney is just its early stage. I fully expect its output quality to surpass every human artist on earth within 1-2 years, maybe 18 months, maybe even less. Remember AlphaGo? It's simply on another level. You just cannot compete. Soon (1-2 years tops, 12-18 months more likely) the creative industry will vanish and everyone will be forced to switch jobs.

Yes, of course UBI will happen, because people demanded it on Twitter. Make no mistake: there will be zero protection, zero alternatives, zero compensation for the AI takeover. Billions of people will be out of a job, starving, dying, while the top companies take 100% of the profit. This is the only possible outcome.

You have enjoyed peak capitalism. You will soon need to embrace 100% of it.


The major problem is that the AI companies are redirecting profit from artists to themselves. The creative industry will remain, but artists won't receive even one penny; companies can simply 1) grab their work with absolutely zero payment, 2) fine-tune a new model, and 3) profit from the style. The artist as a job will cease to exist very soon, because artists are becoming free suppliers for AI companies. It's like Uber with remote drivers controlling the cars, except all the drivers work for free because the company claims every car is "self-driving". How this is acceptable and legal is beyond my understanding.


That's the exact concern.

There is a way we could pit some corporations against others to help with this, though: train models exclusively on content from megacorps like Disney, then claim it's fair use.

Those guys lobbied to get copyright extended for ages for their own profit; for once they could help protect the ordinary artist.


Similar to Moore's law, "transformer" deep neural nets have been found to follow scaling laws[0]. This means the faster your GPUs are and the more VRAM they have, the better a model you can train "for free".

Training models from scratch only works with massive (labeled) datasets covering a massive data distribution. With language models, the datasets being used are quickly approaching "all known written text" in size. Training a model from scratch on Microsoft's internal code would capture not only its precious intellectual property but also its technical debt. And code at Microsoft is not going to come close to covering the broad range of styles a coder could possibly use. The model may well diverge without enough data, since it needs to see a given "example usage" in multiple different contexts before it can learn it.

My current understanding is that deep NNs are quite good at modeling an underlying data distribution without needing any priors hard-coded about that dataset. But they need to see far more of it than an adult human would, several orders of magnitude more, and they need the labels to be accurate roughly 75-80% of the time.

[0] https://www.lesswrong.com/tag/scaling-laws
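To make the scaling-law idea concrete, here is a minimal sketch of a Chinchilla-style loss formula, L(N, D) = E + A/N^alpha + B/D^beta, where N is parameter count and D is training tokens. The coefficients below are the fitted values reported in the Chinchilla paper (Hoffmann et al., 2022); treat them as illustrative, not authoritative:

```python
def expected_loss(n_params: float, n_tokens: float,
                  e: float = 1.69, a: float = 406.4, b: float = 410.7,
                  alpha: float = 0.34, beta: float = 0.28) -> float:
    """Predicted pretraining loss L(N, D) = E + A/N^alpha + B/D^beta.

    E is the irreducible loss; the other two terms shrink as you add
    parameters (N) and training tokens (D) respectively.
    """
    return e + a / n_params ** alpha + b / n_tokens ** beta

# Doubling both model size and data lowers the predicted loss,
# but with diminishing returns as you approach the floor E:
small = expected_loss(1e9, 2e10)   # 1B params, 20B tokens
big = expected_loss(2e9, 4e10)     # 2B params, 40B tokens
assert big < small
assert big > 1.69  # never below the irreducible term
```

The practical upshot matches the comment above: loss falls smoothly as a power law in compute, so more/faster VRAM buys predictable quality, but the data term means a small corpus (like one company's codebase) bottlenecks the model no matter how many parameters you add.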


The problems have almost nothing to do with the deep learning itself. They lie with the companies that develop such products.

If a company uses someone's code in a commercial product (a normal app), it has to follow the license accordingly. If a company uses someone's code in a commercial product (model training), it apparently has to follow nothing.

If a company uses someone's art in a commercial product (a normal game), it has to get consent and pay for the rights, either to the hosting platform or to the artists themselves if the work is not royalty-free. If a company uses someone's art in a commercial product (model training), it doesn't have to get consent or pay for anything.

All the problems actually happen before the technical details even come into play, which makes the entire pipeline questionable.


The "AI" that people keep talking about is no different than any other app like MS Word, which is just a piece of software that serve corporation interests. What we are experiencing today is very simple - big players are using people's work for profit without paying one cent or getting any consent, no need to talk about "How". This is a nightmare scenario under today's social and eco system, and even worse at a production level, because in the end it will form a new industry that has nothing to do with experienced people. Take creative work for example, at the current rate most artists will completely decouple from industry in several years while giving all their works for training for free, and those who control the H/W/R&D resources will find ways to profit from model one way or another, resulting in an "AI" companies controlled "creative industry" with few artists left to direct their work. Can't even think of any other examples close to this in modern history, that a small group of people can do whatever they want under the disguise of "Exciting Technology" which in reality is just stealing an entire industry. There's very little to discuss if you ignore the reality of social systems and just focusing on technical details. We don't live in some fairy tale where you can just let computer do your work and enjoy your life.


Is a "license requirement" equal to a "total ban"?


If you want lightweight alternatives to MacBooks, just beware that most small-form-factor Windows laptops emit much louder fan noise than M1/M2 Macs under load, although the Macs will probably reach similar levels at sustained full load. Recent platform designs target much higher power limits to achieve peak performance, and manufacturers keep pushing that envelope on sub-15" machines, creating a conflict between form factor and thermals. Many recent ThinkPad and XPS machines have difficulty cooling their components.

I use an Alienware X17 for work that includes demanding graphics tasks. I know Alienware does not have a great reputation among enthusiasts, but at least for me it is a pleasure to use: very decent build quality, sleek exterior design with the lights off, and an excellent Cherry low-profile mechanical keyboard. On light modeling and compiling tasks there's barely any noise from the fans, and under higher load it isn't much louder than M1 Max/Pro MacBooks, with performance that is certainly top-notch. Overall it's a very enjoyable experience compared with lighter designs, but the trade-offs are obvious too: 3kg+ of weight, and the battery drains fast due to the aggressive CPU/GPU power targets. It's clearly not for everyone, but if you need performance without workstation features (Quadro, ECC, Xeons), it's definitely worth a try.

