
Although, tbf, some libraries are documented better than others.

Also, local LLMs with an agentic tool can be a lot of fun to quickly prototype things. Quality can be hit or miss.

Hopefully the work trickles down to local models long-term.



And you think an LLM can generate code to use an undocumented library? :D


Even documented libraries can be a struggle, especially if they are not particularly popular. I'm doing a project with WiFi/LoRa/MQTT on an ESP32. The WiFi code was fairly decent, but the MQTT and especially LoRa library code was nearly useless.


Sonnet 3.5 fails to generate basic Jetpack Compose library properties properly. Maybe if somebody tried really hard to scrape all the documentation and force-feed it, then it could work, but I don't know if there are examples of this. Like a general LLM, but with the complete Android/Kotlin documentation pushed into it to fix the synapses.


Of course, why wouldn't it? It's a generative model, not a lookup table. Show it the library headers, and it'll give you decent results.

Obviously, if the library or the code using it wasn't part of the training data, and you don't supply either in the context of your request, then it won't generate valid code for it. But that's not the LLM's fault.
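
A rough sketch of what "supply it in the context" can look like (Python; the include path is made up and call_llm is a hypothetical stand-in for whatever client or agent tooling you actually use):

    from pathlib import Path

    # Gather the library's public headers so the model sees the real API
    # instead of guessing from whatever happened to be in its training data.
    headers = "\n\n".join(
        p.read_text() for p in sorted(Path("libs/mylib/include").glob("*.h"))
    )

    prompt = (
        "Using only the API declared in these headers, write a function that "
        "initializes the driver and reads one sample:\n\n" + headers
    )

    # call_llm() is a placeholder for your actual LLM call.
    print(call_llm(prompt))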


> not a lookup table

You can imagine the classic attention mechanism as a lookup table, actually.

Transformers are layers and layers and layers of lookup tables.
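
A rough numpy sketch of that view (nothing rigorous, just illustrating the "soft lookup" reading of scaled dot-product attention):

    import numpy as np

    def attention(Q, K, V):
        # How well each query "matches" each key (scaled dot product).
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        # Softmax turns the scores into lookup weights that sum to 1 per query.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output row is a weighted blend of the values: a soft lookup.
        return weights @ V

With a one-hot weight row this degenerates into an exact table lookup; attention just makes the lookup soft and differentiable.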


If there are open source projects that use said library, then probably yes.


Unless they are not hosted on GitHub, in which case no :D



