I don't think the data is the weakness.

We're using the Transformer architecture right now. There's no reason to think there won't be further discoveries in AI as impactful as "Attention Is All You Need".

We may be due for another "AI winter" where we don't see dramatic improvement across the board. We may not. Either way, LLMs built on the Transformer architecture may not reach human-level intelligence, but they _are_ useful, and they'll continue to be useful. In the 90s, even during the AI winter, we were able to use Bayesian classification for common tasks like email filtering. There's no reason we can't keep using Transformer-based LLMs for common purposes too. Content production alone makes them worthwhile.
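
For context on how lightweight that era's tooling was: a naive Bayes spam filter fits in a few lines. Here's a minimal sketch in Python (the toy corpus is invented for illustration; real filters of that era trained on thousands of messages and used many more features):

    import math
    from collections import Counter

    # Toy corpus -- these messages and labels are made up for illustration.
    spam = ["win money now", "cheap meds online", "win a free prize now"]
    ham = ["meeting at noon", "project update attached", "lunch tomorrow"]

    def word_counts(docs):
        counts = Counter()
        for doc in docs:
            counts.update(doc.split())
        return counts

    spam_counts, ham_counts = word_counts(spam), word_counts(ham)
    vocab = set(spam_counts) | set(ham_counts)

    def score(msg, counts, prior):
        # Log-probability under a Laplace-smoothed multinomial model,
        # so words unseen in training don't zero out the whole product.
        total = sum(counts.values())
        s = math.log(prior)
        for w in msg.split():
            s += math.log((counts[w] + 1) / (total + len(vocab)))
        return s

    def classify(msg):
        p_spam = len(spam) / (len(spam) + len(ham))
        spam_score = score(msg, spam_counts, p_spam)
        ham_score = score(msg, ham_counts, 1 - p_spam)
        return "spam" if spam_score > ham_score else "ham"

    print(classify("win a prize now"))           # -> spam
    print(classify("project meeting tomorrow"))  # -> ham

The point being: even a technique that simple stayed in production use for years. Transformer LLMs have a much higher floor of usefulness to fall back on.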

We don't _need_ AGI; it just seems like the direction we're heading as a species. If we don't get there, that's fine. No need to throw the baby out with the bathwater.


