The current trend for productionizing LLM-based applications is to write your own (really thin) wrappers around the actual LLM call. The majority of your code should be business logic independent of the LLM, anyway: information retrieval, user interface, response logging, and so on.
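A minimal sketch of that thin-wrapper pattern, with `call_model` as a stand-in for whatever provider SDK or local runtime you actually use (the function names and stub here are illustrative, not any particular library's API):

```python
# Thin-wrapper sketch: retrieval, logging, and formatting are plain
# business logic; only one line touches the model.

def call_model(prompt: str) -> str:
    # Stand-in for the real LLM call (cloud SDK, local model, etc.).
    return "stubbed response to: " + prompt

def answer(question: str, retrieve, log) -> str:
    context = retrieve(question)                  # information retrieval
    prompt = f"Context:\n{context}\n\nQuestion: {question}"
    response = call_model(prompt)                 # the only LLM-specific line
    log({"question": question, "response": response})  # response logging
    return response

# Usage with trivial stand-ins for the business-logic pieces:
history = []
print(answer("What is our refund policy?",
             retrieve=lambda q: "Refunds within 30 days.",
             log=history.append))
```

Swapping providers (or mocking the model in tests) then means replacing `call_model` and nothing else.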

In my experience with both cloud-based and local models, chaining LLM calls compounds errors: each step inherits and amplifies the mistakes of the one before it. For business applications, I would urge you to look at few-shot, single-query interaction patterns instead, treating the LLM as a new unit of compute.
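One way to picture the few-shot, single-query alternative: instead of a multi-step chain, pack a couple of worked examples into one prompt and make a single call. This is a sketch with made-up example data, not a prescribed prompt format:

```python
# Few-shot, single-query sketch: the worked examples ride along in
# one prompt, so there is only one model call and no error-compounding
# chain of intermediate outputs.
EXAMPLES = [
    ("Invoice #123 overdue", "billing"),
    ("Password reset not working", "account"),
]

def build_prompt(ticket: str) -> str:
    shots = "\n".join(f"Ticket: {t}\nLabel: {l}" for t, l in EXAMPLES)
    return (f"Classify the support ticket.\n\n"
            f"{shots}\n"
            f"Ticket: {ticket}\n"
            f"Label:")

print(build_prompt("Charged twice this month"))
```

The whole task, examples included, goes to the model once; there are no intermediate generations for errors to accumulate in.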


