> the commoditization clock (how quickly open-source alternatives catch up)
I believe we are already there, at least for the average person.
Using Ollama I can run different LLMs locally that are good enough for what I want to do. That's on a 32GB M1 laptop. No more having to pay someone to get results.
For development, PyCharm Pro's latest LLM autocomplete is just short of writing everything for me.
I agree with you in relation to the enterprise.