
> the commoditization clock (how quickly open-source alternatives catch up)

I believe we are already there at least for the average person.

Using Ollama, I can run different LLMs locally that are good enough for what I want to do, and that's on a 32GB M1 laptop. No more having to pay someone to get results.
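For anyone curious, a minimal sketch of talking to a local Ollama instance over its REST API (model name and prompt are just examples; assumes Ollama is listening on its default port 11434):

```python
import json

def build_request(model: str, prompt: str) -> bytes:
    # Ollama's /api/generate endpoint takes a JSON body like this;
    # stream=False asks for a single response instead of chunks.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

body = build_request("llama3.2", "Explain commoditization in one sentence.")

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```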

For development, PyCharm Pro's latest LLM autocomplete is just short of writing everything for me.

I agree with you in relation to the enterprise.


